The DOER Effect and H5P
One more post for the H5P mini-series I’ve been working on. You can find the others in this series: H5P and Pressbooks, a Brief Highlight Reel; Which H5P Type is Right for You?; and Designing Useful Feedback in H5P. Now on with the show.
The “doer effect” is an association between the number of online interactive practice activities students do and their learning outcomes that is not only statistically reliable but has much higher positive effects than other learning resources, such as watching videos or reading text. We also provide generalizability [sic] evidence across four different courses involving over 12,500 students that the learning effect of doing is about six times greater than that of reading. (Koedinger, McLaughlin, Jia, & Bier, 2016)
The research highlights a number of characteristics that point us in the direction of creating activities that make use of the DOER effect. Such activities:
- are aligned with learning objectives,
- are embedded in the course content,
- provide opportunities for students to test their understanding of concepts and to practice skills,
- take various formats (e.g., multiple-choice questions, interactive simulations, drop and drag, matching, and other options),
- target common student misconceptions or areas of confusion,
- deliver corrective feedback (e.g., correcting student misunderstandings when an incorrect answer is selected),
- deliver reinforcing feedback (e.g., explaining why correct answers are correct),
- deliver immediate feedback as-needed (e.g., when a selected answer is incorrect),
- deliver immediate feedback as-requested (e.g., in the form of a hint).
There is a worksheet available from BCcampus related to this, but I would like to take this opportunity to provide some comments and maybe a few examples along the way.
Are Aligned With Learning Objectives
This is the refrain you’ll hear from many, many instructional designers, so hopefully it’s no surprise to see it here again. Objectives are often viewed as something you’re forced to write without much sense of what they can help you do. Well-constructed objectives can streamline the creation of activities and assessments and support constructive alignment across your course and learning materials. On some projects I’ve worked on, the instructor hit a creative roadblock while writing assessment items, and the objectives allowed me, as an ID, to draft something to help them get unstuck. Let’s take a look at an example of an objective that became an H5P activity.
- List the three types of genocide as originally defined by Raphael Lemkin
- Identify which acts of genocide in Canada’s criminal code fall short of both the UN Convention on genocide and Lemkin’s original definition of genocide
Are Embedded in the Course Content
Normally in an LMS, if you want students to practice their knowledge-based skills, you use a quiz tool. The problem with this approach is that it’s the equivalent of having an active class discussion, sending your students down the hall to another room to try out their new skills and knowledge, and then having them walk back down the hall to rejoin the discussion. Embedding the practice items inline with the content lets students try things out without going anywhere else. When I worked on the ProComm OER, we provided practice opportunities at the end of major sections as well as at the end of each chapter. You can also use this embedding to create virtual proximity between the exercise and the material it relates to.
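For anyone wondering what this embedding looks like in practice: Pressbooks runs the H5P WordPress plugin, so a published activity drops inline with a one-line shortcode wherever it belongs in the chapter text (the id value below is made up for illustration; yours comes from your H5P content list):

```
A paragraph of chapter content explaining the concept...

[h5p id="12"]

...and the chapter text continues right after the activity,
so the learner never leaves the page to practice.
```

Because the activity sits in the flow of the prose, the “down the hall” trip disappears: the practice item is literally the next thing the learner’s eye lands on.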
Provide Opportunities for Students to Test Their Understanding of Concepts and to Practice Skills
This is a big one. The old saying goes, “practice makes perfect,” although a newer version is “perfect practice makes perfect.” Imagine trying to learn to ride a bike from only reading about it or watching a video. You might manage it, but without the practice of actually riding a bike you wouldn’t master the skill. In fact, watching video alone without practice can lead novices to a false sense of their own ability:
Six experiments (N = 2,225) reveal that repeatedly watching others can foster an illusion of skill acquisition. The more people merely watch others perform (without actually practicing themselves), the more they nonetheless believe they could perform the skill, too (Experiment 1). However, people’s actual abilities—from throwing darts and doing the moonwalk to playing an online game—do not improve after merely watching others, despite predictions to the contrary (Experiments 2–4). (Kardas, M. & O’Brien, E., 2018.)
The typical example of this kind of practice might be straightforward objective questions like defining terms, but my colleague Julie just reminded me of a set of activities that give short descriptions or scenarios where students have to identify which concept fits the description. In many cases this takes your learner from the remember or comprehension level of knowledge to application. Here’s an example from an early Ed Psyc project I worked on:
Take Various Formats
In a previous post I shared research finding that high performance on MCQ items predicted future performance on MCQs, not task completion more broadly. What does this mean with regard to having various formats? It means that if we change up the format of the question, students won’t get locked into receiving information in one format and responding in only one format.
A great example of this is Duolingo. I’ve been working on Finnish with the app since the language was released, and I’m finding it really interesting from an instructional design perspective. The app teaches without long-form reading, instead relying on active practice. It’s all translation-based tasks: “green” (vihreä), “That is a cute bunny” (tuo on söpö pupu). Sometimes it employs images or audio. It asks you to translate by selecting an image and a word, selecting words in order, filling in a blank, or writing the whole sentence. Duolingo scaffolds you from choosing words and shorter sentences to longer sentences that you need to type in manually.
There’s some critique to be done of the app, but from the “take various formats” perspective, Duolingo gets a lot right. For a bit more about Duolingo, Donald Clark writes 7 great things Duolingo teaches us about good online learning.
How does this apply to H5P? In line with an example I gave in a previous post, you can create a series of questions that are almost identical, addressing the same objective but changing the format: an MCQ version, a Fill in the Blanks version, a Drag the Text version, and so on.
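To make this concrete, H5P’s text-based authoring makes re-casting easy: Fill in the Blanks and Drag the Text both mark the target words with asterisks, so one source sentence yields two formats. A sketch using the Lemkin objective from earlier (the MCQ distractors here are mine, purely for illustration):

```
Multiple Choice (question plus alternatives, distractors illustrative):
  Which of these is one of Lemkin's three types of genocide?
    ( ) Economic   (x) Cultural   ( ) Judicial

Fill in the Blanks / Drag the Text (same objective, asterisk syntax):
  Lemkin originally defined *physical*, *biological*,
  and *cultural* genocide.
```

The second snippet works unchanged in either content type: Fill in the Blanks renders the starred words as typed blanks, while Drag the Text turns them into draggable word chips.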
A word of caution here: the format should be functional for the learner. Drag the Text is better than, say, Drag and Drop for a lot of things. Drag and Drop is fun to build, but it’s most effective where there’s a spatial element.
Target Common Student Misconceptions or Areas of Confusion
After you’ve taught a course a few times, you start to see patterns in students’ misunderstandings. This is a great opportunity to create assessment items and activities that play on these misconceptions to tease them out early in the course, allowing you to correct the errors. In the BCcampus post on feedback, they describe how to create alternatives that are plausible; this is one way to include common misconceptions in H5P activities. In the Kitchen Show episode, my colleague Julie will show off an example of building a whole activity around misconceptions with a Nurture vs. Nature activity.
Deliver Corrective Feedback
Correcting student misunderstandings when an incorrect answer is selected aligns with the point above. If there is a common error, you can use the feedback to point out what the problem is. This assumes, in the case of MCQs for example, that the alternatives are all plausible; if not, this type of feedback might have limited utility. Here’s an example:
Deliver Reinforcing Feedback
Explaining why correct answers are correct is perhaps an underutilized opportunity. When creating assessment items, I often turn this kind of thing back to the learner. I see “justify” or “explain” appended to plenty of objectives and exam items. You can couple an MCQ, where a student has to choose an answer, with an Essay content type (or another type) asking them to explain or justify why the correct answer is correct. This keeps the learner in an active state, and my hunch is it’s more useful than feedback built into the correct option of an MCQ alone.
Deliver Immediate Feedback As-Needed
When a selected answer is incorrect, the learner finds out it’s wrong early. This is why I like to enable the “check” button on content types like the Question Set. In H5P you have the option of completing, say, a 10-question set, but you can check along the way whether you answered correctly rather than having to “Finish” and then review all the questions. Coincidentally, Duolingo employs this design as well. You don’t have to check along the way and can wait until the end, but it’s handy to get that feedback immediately.
A while ago, David Wiley posted a video about feedback in which his son tried to throw three balls into a waste bin (or something like that). The video demonstrated the difference between no feedback (no shots succeeded), feedback after all three balls had been thrown (still not much success on the next attempt), and feedback after each shot. The last condition had the best success rate.
Deliver Immediate Feedback As-Requested
Hints! Back to the Duolingo comparison, one thing it does quite well is let you get a sneak peek at the translations in certain exercises. This can jog your memory a bit so that you don’t need the whole translation. Many H5P content types allow hints, such as Dialog Cards and MCQs. For example, in the Dialog Cards for plant identification, we provided not only a hint for where you should direct your attention to identify the plant, but also, on the “answer” side, an explanation of how the expert used that observation to ID the plant.
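In Fill in the Blanks (and Drag the Text), this kind of as-requested hint is authored inline: a colon after the answer inside the asterisks turns everything that follows into a tip the learner can reveal on demand. Borrowing the Duolingo bunny sentence from earlier as an illustration:

```
Tuo on söpö *pupu:A small, fluffy animal with long ears*.
```

The learner sees only the blank and a small tip icon; clicking it shows “A small, fluffy animal with long ears” without giving away the answer itself, which is exactly the sneak-peek behaviour described above.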
References
Kardas, M., & O’Brien, E. (2018). Easier seen than done: Merely watching others perform can foster an illusion of skill acquisition. Psychological Science.
Koedinger, K. R., McLaughlin, E. A., Jia, J. Z., & Bier, N. L. (2016, April). Is the doer effect a causal relationship? How can we tell and why it’s important. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (pp. 388–397).