Which H5P Type is Right for You?

August 14, 2020 | By JR

An example juxtaposition I used in the presentation What is Old is New Again: Patterns in Pedagogy and Educational Technology

This is the second in a series of reflections about how I’ve used H5P until now. The first post covered some Pressbooks projects and some H5P presentations I’ve given over the past few years. Today I’m thinking about the question:

What considerations go into choosing the H5P content types?

This is an interesting question. I’ve been using H5P for years, and to be honest I have not used every H5P type. In fact, I use a handful of content types repeatedly depending on the situation. This reminds me of a concept Andy Gibbons wrote about in instructional design: Design by Default. Andy’s observation was that every design project involves innumerable decisions, but designers don’t actively make all of those decisions every time. Instead, we fall back on patterns, recognizing where solutions fit certain problems based on those patterns. Elizabeth Boling noted that this observation has also been made elsewhere in the design literature:

“The argument here is that recognising design situations is one of those key skills. Seeing some kind of underlying pattern or theme that enables a designer to recognise this and make a connection with some precedent in the episodic memory.”

Lawson, B. (2004). https://www.sciencedirect.com/science/article/abs/pii/S0142694X04000328?casa_token=4Iz2D09r2_oAAAAA:Y4IopMK6XE0vTEs_tH4Y0Em-l7ZMdXEJEM-pg4gSyWR2wQPQSTfXLhT9AfwDS932wJbVxUFDiyPf

Some general items to consider include:

  • what are the learning objectives for the module/chapter?
  • what is not said or requires further explanation than is currently present?
  • are existing questions or activities available?
  • how would an interactive either support student learning or enhance the user experience?
  • is the activity helpful or just fun to make?
  • what is the balance between text and media in the proposed activity?

Designed by Default

This is not always the case, but it is a pattern in my own work. If, for example, an MCQ is the best fit for much of the content, then that is the type that will show up most frequently. The goal is not to use every content type in every project. The converse is also true: you shouldn’t shoehorn everything into an MCQ. So let’s begin there. Most often, if you receive a test bank from a publisher or with an open textbook, it will contain MCQs. They might actually be the most common type of assessment item in all of higher ed. When writing MCQs I generally look to the Vanderbilt U page detailing how to construct them effectively. This means that I almost never use True/False questions. What’s important to note here is that writing valid and reliable MCQs takes a lot more effort than it is often given credit for. One estimate I recall seeing (I think from the ETS) was that a single MCQ can take about one hour to construct. Generally you don’t see that in practice. Take from that what you will. Anyway, an example might look like this:

Pharmaceutical Calculation, J. Anderson

One content type that is generally recommended against, especially in the context of quizzes deployed through an LMS, is the fill-in-the-blank (FITB) question type. In the context of graded assessment items this content type often requires manual grading from the instructor because the system is inflexible with capitalization, spelling, etc. LMSs are often looking for exact matches on FITB questions. However, from a learning standpoint these are better question types than MCQs. MCQs tend to be selection tasks rather than tasks that increase retention and recall (Duchastel & Nungester, 1982). Typing in answers for active recall is what increases retention and recall (Kang et al., 2007; Jacoby, 1978; McDaniel et al., 1986; Hirshman & Bjork, 1988; Richland et al., 2005). With that in mind, if you’ve been provided a question, have you ever come across one like this:

Sociology Question, W. Little and R. McGivern
Instead, you could create a FITB question, which not only better supports retention and recall but also takes less time to write and less time to build in H5P.
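
To give a sense of why FITB is quick to build, here is a minimal sketch of the Fill in the Blanks authoring syntax with a hypothetical sentence (not a course question): asterisks mark a blank, a forward slash separates accepted alternatives, and a colon adds a hint.

    // Minimal sketch of Fill in the Blanks authoring text (hypothetical sentence).
    // *word* marks a blank, "/" separates accepted answers, ":" adds a learner hint.
    const blanksText: string =
      "Typing an answer from memory is a form of active *recall/retrieval:what testing promotes*, " +
      "which is what supports long-term retention.";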

That being said, there are places where an MCQ might still be quite useful. For example, when constructed well, the alternatives reflect common errors learners make, and you can then tailor the feedback in the MCQ to address each error, such as we did in the PHAR H5P activities.
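
To illustrate how that tailored feedback attaches to individual alternatives, here is a minimal sketch assuming the general shape of H5P Multiple Choice parameters; the field names approximate the content type’s semantics, and the calculation question is a hypothetical placeholder, not one of the PHAR items.

    // Sketch of Multiple Choice params with per-alternative feedback (field names approximate).
    const mcqParams = {
      question: "<p>A 0.9% w/v solution contains how many grams of solute per 100 mL?</p>",
      answers: [
        {
          text: "<div>0.9 g</div>",
          correct: true,
          tipsAndFeedback: {
            chosenFeedback: "<div>Correct: % w/v means grams of solute per 100 mL of solution.</div>",
          },
        },
        {
          text: "<div>9 g</div>",
          correct: false,
          tipsAndFeedback: {
            // Feedback aimed at the common error of reading % w/v as grams per litre.
            chosenFeedback: "<div>Check the units: 9 g is the amount per litre, not per 100 mL.</div>",
          },
        },
      ],
    };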

For longer-form questions, such as short answer, I typically leaned on the accordion content type. Here I would ask students to consider the question and then compare their answer to the sample answer provided by the subject matter expert. The limitation on accordion titles sometimes meant revising the questions, but this worked pretty well for most courses.

Urban Agriculture Question, G. Wood

Now H5P has an essay question type which may actually perform this function. I’ll take a closer look at these content types in the feedback post.

In one of the episodes of the Instructional Designers In Offices Drinking Coffee podcast, they mention drag-and-drop activities. After the MCQ, this seems to be one of the most popular quiz question types in elearning. But why? The hosts talk a bit about the line between what is interesting and useful for the learner and what is interesting to make as a designer. That seems to be why so many interactions that really shouldn’t be drag-and-drop end up that way: it’s an interesting content type to make. So when would I use the drag-and-drop content type? Particularly when there is a spatial element to the problem. For example, in Geology you might need to know where items appear in the cycle. Where would I not use drag-and-drop? Pretty much everywhere else. I see drag-and-drop used for selection and categorization problems, and generally there are better tools for that. One other consideration for drag-and-drop is screen size and whether the learner is on mobile; generally speaking, they don’t work especially well on mobile.

Geological Materials and Biogeochemical Cycles (by K. Panchuk)

One reason drag-and-drop gets selected is that different authoring tools will not necessarily have a matching content/question type, so it seems like a good option. With regard to H5P, we have the drag-the-text content type, which is what I tend to use for matching questions:
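
As a quick illustration (hypothetical statements, not Panchuk’s activity), the drag-the-text authoring syntax works much like fill in the blanks: words wrapped in asterisks become draggable, and a colon inside the asterisks can add a tip.

    // Sketch of drag-the-text authoring text; each line is a statement with a draggable word.
    const dragTextTask: string = [
      "Igneous rock forms when *magma:molten rock* cools and solidifies.",
      "Sedimentary rock forms when *sediment* is compacted and cemented.",
      "Metamorphic rock forms under intense heat and *pressure*.",
    ].join("\n");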

A word of caution around drag the text: consider the devices students will be using, and also how much space the activity as a whole takes up. More than once I’ve had to go back and separate out the question because it did not fit on the screen at once, which made it exceedingly difficult to use.

When the task is identifying something visually, find multiple hotspots is my go-to. One such example is from Geology, again:

Geology (by K. Panchuk)

One of my favourite, and one of the first, examples of this content type I saw in higher ed was what my colleague Ellen Watson cooked up for a dentistry course. They had x-rays of teeth, and students had to click on wherever they could see the abnormality they were asked to find. She created a pretty great set of images for students to practice with for a blended learning project years ago, and I believe it’s still in use today.

Previously, I only used the Interactive Video type for YouTube videos, until I was shown a way to use other video sources with H5P. Now, I don’t make every video interactive, but I am slowly adding interactive elements to them. The choice to make a video interactive is usually just triggered by the fact that there is a video there, but because that content type can have other H5P types embedded inside of it, you make decisions using all those earlier questions: what are the learning objectives, does this require further clarification, and so on.

Geology (by K. Panchuk)

Rapid Prototyping

Sometimes the pattern is not quite clear, the design-by-default habit is shaken off, and we engage with the creation of activities differently. One way of finding “the right” H5P content type is rapid prototyping. H5P is a simple enough tool that, with the seed of an idea, you can try out a bunch of different content types at relatively low cost. This provides a way to see how each content type will function, so you can focus not only on the learning design but also on the user experience.

One recent example of this involved the design features of language. The activity idea started along the lines of “Each scenario below depicts one of the 16 design features of language. After reading each scenario, click on the design feature that you think corresponds with it to see if you are correct. Some scenarios illustrate more than one design feature.” That leaves it open to a few different solutions. In a relatively short amount of time I whipped up five demos for the course author so that we had prototypes to talk through as we developed the rest of the activity. You can view those prototypes here. The purpose was not to have five golden ideas/samples, but prototypes that would prompt discussion. Sometimes you need to approach content type selection without absolute commitment. Not forcing your activity to be one content type or another gives you the freedom and flexibility to find the best one for you.

One Offs

I’ve written a few times about the use of digital flashcards for learning. One example I keep coming back to is their application in a Woody Landscape Plants course. This was a customized dialog card (H5P now has actual flashcards that were based on this customization). The basic idea was to present students with a random plant they had to identify. They could look at a hint to see where they should focus, then flip the card for the answer, where another “hint” reveals how the expert knew what the plant was. Cards answered correctly can be removed from the deck. This works great for more media-rich questions. A question set might be able to do something similar, but we liked the card format better.

SM 4-2 (2) Source: Grant Wood, Department of Plant Sciences, University of Saskatchewan.
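
For reference, here is a minimal sketch assuming the general shape of the stock Dialog Cards parameters; the field names are approximate, the plant entry is a hypothetical placeholder, and the actual Woody Landscape Plants deck was a customized version of this type.

    // Sketch of a Dialog Cards entry: front text (shown alongside a photo), back answer,
    // and tips on each side, with the back tip carrying the "how the expert knew" reveal.
    const plantCards = {
      dialogs: [
        {
          text: "Identify this plant.",
          answer: "Acer saccharum (sugar maple)", // hypothetical entry
          tips: {
            front: "Look closely at the leaf margins.",
            back: "The expert keyed on the smooth margins and rounded, U-shaped leaf sinuses.",
          },
        },
      ],
    };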

 

Every now and then an idea will spark from other things you see online. One such piece of inspiration was those quizzes that suggest a particular archetype, the “personality quiz”. Rather than sending students to one of those, we adapted one so students could see where they fell on the spectrum of the English School. This is later used for a follow-up activity.

English School Spectrum, M. Gaal

Though it isn’t used often, the Timeline is a good choice when you need exactly that: a timeline. You could actually make just about a whole chapter of a book with one of these, because they can be so content rich.

Timeline, J. Maier

Combining Content Types

Sometimes there is no single content type that will do everything you need it to. In those cases I’ve found that I can place different content types one after the other. For example, with the PHAR course there were limitations at the time of development around the MCQ content type, particularly around feedback (many of the H5P content types have slightly different rich text editors, which made things interesting for a calculations course at the time). The way we designed around that was to present the question, include basic feedback, and then follow it with an accordion containing the full solution (a bit like flipping to the back of your old-school math textbook).

Pharmaceutical Calculations, J. Anderson


Photo by Vladislav Babienko on Unsplash

Duchastel, P. C., & Nungester, R. J. (1982). Testing effects measured with alternate test forms. Journal of Educational Research, 75, 309-313.

Hirshman, E. L., & Bjork, R. A. (1988). The generation effect: Support for a two-factor theory. Journal of Experimental Psychology: Learning, Memory, & Cognition, 14, 484–494.

Jacoby, L. L. (1978). On interpreting the effects of repetition: Solving a problem versus remembering a solution. Journal of Verbal Learning and Verbal Behavior, 17, 649-667.

Kang, S. H. K., McDermott, K. B., & Roediger, H. L., III. (2007). Test format and corrective feedback modulate the effect of testing on long-term retention. European Journal of Cognitive Psychology, 19, 528-558.

McDaniel, M. A., Einstein, G. O., Dunay, P. K., & Cobb, R. (1986). Encoding difficulty and memory: Toward a unifying theory. Journal of Memory and Language, 25, 645-656.

Richland, L. E., Bjork, R. A., Finley, J. R., & Linn, M. C. (2005). Linking cognitive science to education: Generation and interleaving effects. In B. G. Bara, L. Barsalou, & M. Bucciarelli (Eds.), Proceedings of the twenty-seventh annual conference of the cognitive science society. Mahwah, NJ: Erlbaum.