Design and develop evaluations and evaluation plans.
Systems that collect information about the learner's knowledge and placement in the course are a central concern of user experience design. The Instructional Designer must always consider the user: the tutor must know the learner, and the system must be accessible. Every course must gain consent from the participants before collecting data. All evaluations of the learner must be unbiased. Content is vetted, and the environment is straightforward, open, and inviting.
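As a minimal sketch of that consent requirement, the Python snippet below (with hypothetical names and fields) stores evaluation data only for learners whose consent has been recorded; it is an illustration, not a prescribed implementation.

```python
from dataclasses import dataclass, field

@dataclass
class LearnerProfile:
    name: str
    consented: bool                          # consent recorded before any data collection
    placement_level: str = "unknown"
    quiz_scores: list[float] = field(default_factory=list)

def record_quiz_score(profile: LearnerProfile, score: float) -> None:
    """Store evaluation data only for learners who have given consent."""
    if profile.consented:
        profile.quiz_scores.append(score)

learner = LearnerProfile(name="Sam", consented=True)
record_quiz_score(learner, 0.8)
print(learner.quiz_scores)   # [0.8]
```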
Evaluation is built in when Instructional Designers make a practice of evaluating user experience data. Bounce rate is an easy data point to capture, and there is a surprising amount of detail about users in that one number, which makes it very useful for collecting meaningful site data. Knowing an individual was on a page for only 5 seconds is intriguing. This type of examination of learners' actions can be collected and used to learn about the design of the course and the behavior of the user.
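To make those data points concrete, here is a small sketch of how bounce rate and time-on-page can be computed from page-view events; the event log is made up, not pulled from a real platform.

```python
from datetime import datetime

# Each event: (user_id, page, timestamp). In practice these would come from
# the learning platform's analytics export; the values here are invented.
events = [
    ("u1", "/lesson-1", datetime(2017, 3, 1, 9, 0, 0)),
    ("u1", "/lesson-2", datetime(2017, 3, 1, 9, 0, 5)),   # 5 seconds on lesson 1
    ("u2", "/lesson-1", datetime(2017, 3, 1, 9, 2, 0)),   # single-page session
]

def sessions_by_user(events):
    """Group page views by user, ordered by time."""
    by_user = {}
    for user, page, ts in sorted(events, key=lambda e: (e[0], e[2])):
        by_user.setdefault(user, []).append((page, ts))
    return by_user

def bounce_rate(events) -> float:
    """Share of sessions that viewed only one page."""
    sessions = sessions_by_user(events)
    bounces = sum(1 for views in sessions.values() if len(views) == 1)
    return bounces / len(sessions)

def time_on_page(views):
    """Seconds between consecutive page views for one user."""
    return [(views[i][0], (views[i + 1][1] - views[i][1]).total_seconds())
            for i in range(len(views) - 1)]

print(bounce_rate(events))                              # 0.5
print(time_on_page(sessions_by_user(events)["u1"]))     # [('/lesson-1', 5.0)]
```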
In developing systems and plans to monitor learners, the Instructional Designer must use two kinds of research data to understand the overarching picture. Quantitative research data is expressed in numbers, perfect for measuring things. Qualitative research is the opposite: it is the stuff that gives the researcher all the feels, and it is used to make sense of the situation. Take a basketball team, for example. They use quantitative data in the analysis of games and players, but they also talk a lot about culture and how important it is to play as a team. While there may be quantitative data indicating that teams perform better when the culture is in place, how to build that culture is another dynamic. Sometimes it just has to feel right.
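As a small, hedged illustration (the numbers and interview codes below are made up), this snippet puts the two kinds of data side by side: a numeric average and a tally of coded interview themes.

```python
from statistics import mean
from collections import Counter

# Quantitative: numbers you can measure and compare directly.
quiz_scores = [72, 85, 90, 64, 88]
print("Average quiz score:", mean(quiz_scores))   # 79.8

# Qualitative: open-ended responses coded into themes by the researcher.
coded_interviews = ["felt supported by teammates", "unclear instructions",
                    "felt supported by teammates", "liked the group challenge"]
print(Counter(coded_interviews).most_common(1))   # the 'culture' signal numbers alone can miss
```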
Conduct a formative evaluation of the instructional situation.
First, let’s look at how far the instructional movement has come. Currently we are in the Performance Improvement Movement (2008-present), preceded by Instructional Strategies (1994-2008), the Systematic Design Process (1977-1994), Systematic Instructional Design (1950-1970), and the Visual Instruction Movement (1920-1940) at the beginning. Pretty cool, huh? All these years built on evaluation.
Digital learning systems build models of instruction on values, objectives, and motivation. Digital learning models have content, strategies, controls, messaging, representation, logic, and management. Instructional Designers build learning systems that try to understand a user’s conceptual framework by recognizing varied details about the user; such a system looks at skill and knowledge and transfers that data. Learning this way develops creative thinkers who appreciate a good challenge. People are lifelong learners now. The hope is that Instructional Design will teach more individuals to think critically, communicate effectively, and work collaboratively.
Apply appropriate qualitative and quantitative data collection methods.
As an Instructional Designer, the challenge is to stay on top of all the feedback and user data. Designers must know when to make moves, and clear realizations don’t always present themselves. One must use evaluation tools to measure and critique the positive and negative effects of learning systems. Labeling data is also important: the Instructional Designer must have the system identify levels of correctness in order to classify a user’s understanding. These learning states are generally labeled as learned, partially learned, unlearned, and misconceptions.
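A minimal sketch of that classification appears below; the cutoffs and the misconception flag are illustrative assumptions, since the text only names the four states.

```python
def classify_understanding(correct: int, attempted: int, has_misconception: bool) -> str:
    """Label a learner's state on one objective using the four states named above."""
    if has_misconception:
        return "misconception"        # confidently wrong answers flagged elsewhere
    if attempted == 0:
        return "unlearned"
    ratio = correct / attempted
    if ratio >= 0.9:                  # illustrative threshold, not a standard
        return "learned"
    if ratio > 0:
        return "partially learned"
    return "unlearned"

print(classify_understanding(correct=9, attempted=10, has_misconception=False))   # learned
print(classify_understanding(correct=3, attempted=10, has_misconception=False))   # partially learned
```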
I was in a group with two other students, and we put together a report that looked into the effects of preschool teachers holding a Master's degree. We wondered whether teachers with Master's degrees gave better instruction, resulting in better child academics. To research this we performed an inquiry, interviews, surveys, formative interview data, a usability test, and a full report. We found a few small differences, but for the most part the children performed the same with or without Master's-degree teachers. Master's-degree teachers used more technology; that was about it. The big kicker was consistency: preschoolers' performance suffered in places with poor attendance. Check the results from our findings by clicking the link below.
Construct valid and reliable data collection tools. Develop a communication, implementation, and evaluation plan. Collect, analyze, and summarize data.
The Instructional Designer must ask a few things before entering the analysis phase. How much time do I have? What resources do I have to work with? Who is in the position of control? What tools and techniques do I need to understand the mission of the company? What is needed from the partner? What are the beliefs and culture of the company? How can I obtain best practices from the experts? Where do I learn how to communicate and get acquainted with the working environment?
I tried to collect, analyze, and summarize data I received from my kids over the school year. Check out my paper Educational Analytics: Koby and Elijah 2016-17 by clicking the link below. In this paper, I use data analytics to look at test scores.
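The kind of summary that paper works toward might look like the sketch below; the scores are placeholders for illustration, not the actual numbers from the linked paper.

```python
from statistics import mean

# Placeholder scores (not the real data), one list of (month, score) per child.
scores = {
    "Koby":   [("Sep", 78), ("Jan", 84), ("May", 91)],
    "Elijah": [("Sep", 70), ("Jan", 75), ("May", 83)],
}

for child, results in scores.items():
    values = [score for _, score in results]
    print(f"{child}: average {mean(values):.1f}, growth {values[-1] - values[0]:+d} points")
```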
In another example of data collection strategies, I try to use big data collected from Twitter to communicate the trends of music festivals. Collecting data gives Instructional Designers the ability to implement a solid plan. The evaluation of the music festival scene using social analytics is a good example of what designers piece together to understand a trend happening among thousands of people coming together in the space of one week. Click the link below to see just what big data can tell Instructional Designers about the individual.
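A hedged sketch of that kind of trend count is shown below, assuming the tweets were already harvested into a CSV file; the filename, column name, and hashtags are made up, and no Twitter API calls are involved.

```python
import csv
from collections import Counter

# Hypothetical tags to track; in practice these would come from the study design.
FESTIVAL_TAGS = {"#coachella", "#bonnaroo", "#lollapalooza"}

def festival_mentions(path: str) -> Counter:
    """Count festival hashtags in a CSV of previously collected tweets (assumes a 'text' column)."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            words = row["text"].lower().split()
            counts.update(tag for tag in words if tag in FESTIVAL_TAGS)
    return counts

# Example call once a collected file exists:
# print(festival_mentions("festival_tweets.csv").most_common(3))
```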
Manage the evaluation and research processes.
Practitioner research in online learning is extremely helpful to the business for making design decisions about the learning model and providing adaptive user experiences. Practitioners have useful information: they are immersed in the day to day, and they are the ones who get direct feedback. A disadvantage of practitioner research is that designers sometimes hear what they don’t want to hear. This tends to make us want to drag our heels or react negatively. However, these situations only mean there is room to grow; there is a lot of learning that can happen from bad evaluations. Everyone will benefit from research that tackles real-world problems. Research tries to build a strong foundation, is not afraid to get dirty, and makes a positive difference.
In selecting problems suitable for research, I start to think about what I can and can’t do. To research real-world problems I must understand the problem, find workable solutions, work toward a solution, and evaluate the change.
Collect all the data you can! When collecting accurate content, first identify the main topic, subtopics, variables, theories, theorists, key topics, and so on. I typically pay attention to the details that interest me first, then I’ll find the overlapping sections. I manage content by judging relevance and noting the sources. I take lots of notes.
Generate evaluation and research reports and circulate to stakeholders.
To collect user data, Instructional Designers will develop a questionnaire. A good questionnaire asks the right questions and provides clear instructions. No one is excited to take a survey, so keeping it to the point and organized is a big part of acquiring accurate data. If necessary, the Instructional Designer may need to administer the survey in person. The Instructional Designer has open eyes, ears, and mind, aware of all that is going on around them.
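One minimal way to keep a questionnaire clear and easy to score is sketched below; the items and scale are illustrative, not a recommended instrument.

```python
# Each item carries its own prompt and a fixed response scale so answers stay comparable.
LIKERT = ["Strongly disagree", "Disagree", "Neutral", "Agree", "Strongly agree"]

questionnaire = [
    {"id": "q1", "prompt": "The course navigation was easy to follow.", "scale": LIKERT},
    {"id": "q2", "prompt": "The feedback I received helped me improve.", "scale": LIKERT},
]

def validate_response(item: dict, answer: str) -> bool:
    """Reject answers outside the published scale so the collected data stays clean."""
    return answer in item["scale"]

print(validate_response(questionnaire[0], "Agree"))   # True
```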
Provide a rationale for evaluation and research decisions.
If you’re looking at data, it must have been collected in some way, right? How did it get there? The researcher must process, think about, and understand the entire process of collecting data, actually doing it, making sense of the data. If the answer is in the data, you will recognize it in threads and patterns. The way data is collected and categorized is important to understanding it: the more accurate the details, the more reliable the data story becomes. I use journaling to collect data about my life. It’s a good way to capture the details that get lost when we tell ourselves the stories of who we are.
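As a small, hypothetical example of the tagging that surfaces those threads and patterns (the entries and tags are invented), a tally of journal tags already hints at a story:

```python
from collections import Counter

# Invented journal entries, each tagged when it was written.
journal = [
    {"date": "2017-02-01", "tags": ["family", "exercise"]},
    {"date": "2017-02-02", "tags": ["work", "exercise"]},
    {"date": "2017-02-03", "tags": ["exercise"]},
]

pattern = Counter(tag for entry in journal for tag in entry["tags"])
print(pattern.most_common())   # recurring threads, e.g. exercise shows up every day
```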
When presenting a report, there is always the likelihood that the presenter is putting some sort of spin on the data. Well, they are, but they can’t help it. A researcher can use methods to minimize personal bias; however, it will always be there. Think about all the opinions, beliefs, and considerations that must be made in this complex array of individualism. With that in mind, I think it’s a good idea to present both integrated and isolated data. This is a good practice because presenting data in its isolated form is very matter-of-fact: the reader can see the facts for what they are. On the flip side, data can be integrated by the presenter to highlight interesting points, whether they are true, exaggerated, or false; the presenter may have already curated the findings and made the associations for the reader. I think it is important to combine the two presentation strategies because readers are already used to seeing this in storytelling, with fiction and non-fiction. Using integrated and isolated data techniques gives readers the ability to make their own distinctions about the content.
I can only speak to my experiences because they are all I know to be true. Similarly, the Instructional Designer has to safeguard the fairness of the investigation by making sure to include everyone and to always remain honest. It is simple in my eyes: trust the ethics committee. They are there to ensure integrity in learning programs, promote responsibility for learners, protect the researcher, and protect against legal ramifications that might arise from unethical research. Basically, don’t go there. The Instructional Designer has to seek help early and often so nothing can sneak up on them.