UBC H5P Symposium

February 28, 2022 · By JR

Last week I virtually attended the UBC H5P Symposium. It’s the first conference-type thing I’ve done in 2022, and probably my first since spring 2021.

In this multi-day H5P symposium, we will introduce you to H5P, demonstrate its strengths and limitations, and highlight successful examples of implementation and course integration for various content types (e.g. multiple choice question sets, auto-graded essays, drag and drop, and interactive videos). We will also offer studio time with the support of experienced content creators to introduce you to the authoring platform and help you brainstorm how it could be used in your context.
– UBC CTLT

Before we began, I discovered UBC has a new H5P hub! It’s still only in its soft-launch phase, but it already feels a lot like eCampusOntario’s H5P Studio, so good job UBC!

Keynote – Importance of Formative Assessment

It was a rapid speech, so prepare for jot notes!

Retrieval Practice: Broad Benefits at Minimal Cost

  • Direct Effects of Retrieval
    • Put another way: “testing helps attenuate forgetting”
  • Indirect Effects of Retrieval
    • increases metamemory accuracy, i.e. students’ awareness of what they do and don’t remember well
    • in the study, students predicted how well they thought they would do, and those predictions were compared to how they actually performed
    • why is this important? metamemory affects study time allocation
    • retrieval increases study effectiveness: in the highlighting study, students focused on the items they got wrong during subsequent restudy periods
      • study-study-study performed worst, test-study-test best, and test-test-test middle
      • test-study-test reflects another study I saw where learning objectives were actually presented as a quiz instead of the standard boring affair (presenting “at the end of this module you will be able to…” on the first slide of the course)
    • potentiates new learning
    • may encourage more frequent study; exam-a-day example; basically the Duolingo effect; students in these studies indicated they studied more and kept up better in these classes.
    • reduces test anxiety (preliminary support)
  • Practical Recommendations
    • incentives needed?
      • voluntary vs. required quizzes: choose required (with a minimal grade attached; can be extra credit)
    • format?
      • use methods that require more retrieval effort, e.g. short answer over MCQ. That’s the assumption, backed by plenty of previous research. However, in McDaniel et al. (2012), performance levels from highest to lowest were short answer, MCQ, reading, no exposure, and McDaniel found SA and MCQ were not significantly different for classroom application (when you have two MCQ quizzes). This aligns with the advice Patti Shank provides in her book all about writing MCQs.
    • Short answers don’t require grading, just provide the feedback.
    • on “remembering facts” vs. “integration/higher-level thinking”: application questions also benefit performance on concept-type questions.
      • the Jensen article compared a course with retention-level questions to one with application questions; the latter outperformed the former on high-level final exam questions. More interesting still, the application-question course also outperformed on low-level final exam questions.

I asked:

A recent article in Nature highlights that “Despite benefiting more from interleaved practice, students tended to rate the technique as more difficult and incorrectly believed that they learned less from it.” Other studies, such as those by Bjork et al., reference students’ misperceptions of how much they learned after a session as well (I think this relates to your discussion of metamemory accuracy).

What strategies do you use to bridge the gap in perception and evidence of the impact on learning for students?

He responded:

Give them a demonstration so they can see for themselves that they learn more with retrieval. He runs the first article’s experiment with his own class, in class: ask which condition they think they learn more from, then run the recall quiz a week later. A poli-sci instructor noted they experienced pushback initially, but students came around.

A question about conditional release of content (e.g. take the quiz, then proceed):

McDaniel, “anything you can do that motivates students to do the quizzing, then you’re likely to see a positive impact.”

McDaniel changed the title from “quiz” to “learning opportunity 1, 2, 3…”

On correct answer feedback:

He did a recent study on the amount of feedback given and its effects on restudy: if you just show students how they did and provide the quiz for reference during restudy, they do just as well as with corrective feedback.

Project Showcase Roundtable

Use of H5P in Open Textbooks (Simon & Kayli)

The first project was for a chemistry course: eChIRP. Kayli created a book with a format of overview, content (with integrated questions), and practice (questions at the end). Each of the presentations fielded questions about how students perceived the practice opportunities and how they performed, which was refreshing to see. Students described the practice opportunities as “magic” (when made well).

Do students find H5P components helpful?

  • book is helpful in general, but challenge questions, practice questions, and interactive video tutorials are rated as most useful by students
  • 90K interactions with the book in a single day at peak, 10K/day avg
  • used xAPI to capture frequency of use (I was happy to see a practical application of H5P and xAPI here; a sketch of what that capture looks like follows this list)
  • Kayli did a study on interactive vs. non-interactive videos and found a statistically significant improvement with the interactive video (in student perception). Students preferred the interactive one.
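
Since xAPI came up as the mechanism for capturing frequency of use, here is a minimal sketch of what listening for those events can look like in a page script. It assumes the H5P core is already loaded on the page (which exposes H5P.externalDispatcher); the /collect endpoint is a placeholder of mine, not anything from the eChIRP project, and a real setup would forward statements to an LRS.

```typescript
// Minimal sketch: tally H5P usage by forwarding xAPI statements to a
// collection endpoint. H5P content emits xAPI events through
// H5P.externalDispatcher; the "/collect" URL below is a placeholder.
declare const H5P: {
  externalDispatcher: {
    on(
      name: "xAPI",
      cb: (event: { data: { statement: Record<string, unknown> } }) => void
    ): void;
  };
};

H5P.externalDispatcher.on("xAPI", (event) => {
  // A standard xAPI statement: actor, verb, object, result, timestamp, etc.
  const statement = event.data.statement;

  // Fire-and-forget POST; keepalive lets it finish if the user navigates away.
  void fetch("/collect", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(statement),
    keepalive: true,
  });
});
```

Counting the collected statements per content item server-side is then enough to produce frequency-of-use numbers like the ones reported above.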

Next, Simon discussed his psychology textbook (an OpenStax version). This was part of the most recent round of H5P and Pressbooks projects facilitated and funded by BCcampus. Questions were added by section (within a chapter, i.e. multiple per page). They made use of the Branching Scenario to provide a form of adaptive questioning (a wrong choice leads to remedial content), a departure from how I tend to explain and envision the use case for that content type, but it looks like an innovative approach. Interactive videos of the sleep lab put students in a position of authority. The video uses crossroads to jump around the video instead of basic questioning. Agamotto got a shout-out for its use with a visibly different concept (yellow dot, dark/light background).

Developing Writers (Brenna)

Brenna shared her work and her recent session, Developing Writers: Using H5P to Support Composition Practice. Brenna’s work is fantastic, and what I really appreciate is the use of H5P outside of quizzing. Using the Summary content type, students sort through the order of ideas; for example, she broke a paragraph into its component sentences for students to arrange in the correct order. She notes the Essay tool is “for writing summaries” as opposed to essays.

She uses the documentation tool for a thesis development exercise. Each step is guided, then at the end they review the thesis and are able to rate how they felt they did against their goals. The documentation tool is also used for assignment preparation.

She finds students tend to skip pre-writing and free-writing exercises in textbooks. The exercises she provides create space for that practice with tangible results, giving students a clear place to do the work right in the textbook.

Checklists get a shout-out here as well. She uses the MCQ tool (select all that apply) as a checklist, which reminds me of the checklists I wrote about on this blog not long ago.

Customizing H5P to Create Tapestry-Tool (Steven)

Here is an application taking advantage of the open-source nature of H5P: the tapestry-tool. You create a canvas for mapping, and every node in the tapestry is an H5P widget. So far, it’s been used in healthcare settings for professional development, focusing on learning health systems.

It can be used for collaborative mind-mapping, remote field trips, individual student projects and group projects, or instructor-student co-created content where the instructor creates the tapestry and students add nodes. Nodes can be moderated and you can lock nodes based on time or completion criteria.

  • TYDE project is using it
  • ACE-BC is using it
  • BC Support Unit about Patient Engagement Research

It’s currently a WordPress plugin. The MS Accessibility Team is helping improve accessibility to meet WCAG 2.0 guidelines.

Panel Discussion

7 panelists

What are your favourite H5P content types?

Summary, quiz, etc. come up frequently. The “coolest” one that gets a shout-out is the 360-degree tour. Timeline is another favourite; it can display events that are clustered together, offering a more “organic exploration” that is still a “curated exploration”.

A favourite thing about it is the ecosystem, the learn-one-learn-them-all kind of thing.

Branching Scenario gets a shout-out, as do interactive videos: guiding students through a question step by step, and also branching by using crossroads.

The most used is Question Set. MCQs, because that’s what students encounter most in a particular program, so they use MCQs to familiarize students with what they will encounter on exams.

What role does it play in your course?

Embedding activities right at the point where students are learning; quiz tools in the LMS send students away. The panelists referenced the keynote, saying they hope to increase metamemory accuracy to enhance students’ studying and provide opportunities to practice their skills.

How did you find the learning curve and why did you stick with it?

On why they stuck with it: students gave positive feedback; it removes a step compared to Canvas quizzes (sending students away to do a quiz); and students like that they’re not just reading content but also answering questions and getting immediate feedback.

What has it allowed you to do that other tools have not?

Finding widgets online and using them in your own course: reusability and sharing are a major feature. There was some mention of community building, but no clear example of what that meant.

Compared to other tools, the learning curve is gentle; it’s all in one spot (no need for 100 logins); and cost, stability (re: updates), quick authoring, and portability all came up. There was some comparison between H5P and other tools when a project hinges on whether content should be “secure”, and a comment about Articulate. I think the comment was that Articulate would be used where “secure” content is needed, but I’m not sure I understood exactly what they were getting at.

My question: what’s one thing you wish someone told you before jumping in?

One of the panelists had a great reply but I only managed to get this quote down: “it’s there for me to play around with. stop treating edtech as ‘this is the answer’ instead what will this do for you?”

Another panelist shared this gem which I often have to share as well: “be very clear about what you’re trying to achieve before you start, both for you and your students”.

This one came up a few times, which I thought was interesting: “using tags on items, especially when you get a large library”. The management of libraries came up a couple of times, as did every edtech’s favourite word, metadata.

When do you choose a different tool than H5P, and why?

The examples given were about when to use H5P vs. Canvas quizzes: H5P tended to be used for formative assessment, while Canvas quizzes were for graded exercises. Canvas quizzes were also preferred for math notation and math-related questions (e.g. formula questions) and for analytics and reporting.

What to do when you can’t build in feedback?

Not all H5P content types enable automatic feedback beyond right/wrong. Like me, the panelists ended up using a Column to combine the feedback-limited question with an Accordion for the detailed response.

Organizing an H5P library

tags, ownership & roles, consistent titles.

Accessibility

I was surprised by an answer on accessibility that wasn’t about WCAG: the panelist described the 360 tour as a way to bridge the distance between being remote and being ‘in the place’.

Of course, then came the WCAG-related responses: alt text, bandwidth, scripts for audio content, and the sharing of great work from Alan Levine over at the H5P Kitchen, specifically about accessibility.

What support did you have? Not all faculties have the same levels of support.

TLEF funding support seems really important for UBC. I can see that from the perspective of working across western Canada and on freelance projects as well. External grants, such as those from BCcampus or eCampusOntario, are just as important.

Although these were funded projects for the most part, one comment gives me some hope: 1-2 hours per week of meetings with an ID goes a LONG way. Even in non-funded projects, that suggests a grassroots effort could drive uptake of H5P in more courses.

How do you assess whether students are engaging with the H5P?

One panelist uses pre/mid/exit surveys (quantitative and qualitative) to ask how different tools impact learning. Another uses Canvas analytics to see the time spent on pages containing H5P and compares it to times in quizzes. However, as I and others noted on Twitter last week, time on page often doesn’t tell you what a person is actually doing. In Roll20, I’ve logged 224 hours in the campaign I’m currently running, and friend, I guarantee you it isn’t actually that many hours. Not even close.

Keynote References

Roediger III, H. L., & Karpicke, J. D. (2006). Test-enhanced learning: Taking memory tests improves long-term retention. Psychological Science, 17(3), 249-255.
Little, J. L., & McDaniel, M. A. (2015). Metamemory monitoring and control following retrieval practice for text. Memory & Cognition, 43(1), 85-98.
McDaniel, M. A., Bugg, J. M., Liu, Y., & Brick, J. (2015). When does the test-study-test sequence optimize learning and retention? Journal of Experimental Psychology: Applied, 21(4), 370.
Wissman, K. T., Rawson, K. A., & Pyc, M. A. (2011). The interim test effect: Testing prior material can facilitate the learning of new material. Psychonomic Bulletin & Review, 18(6), 1140-1147.
Leeming, F. C. (2002). The exam-a-day procedure improves performance in psychology classes. Teaching of Psychology, 29(3), 210-212.
Johnson, B. C., & Kiviniemi, M. T. (2009). The effect of online chapter quizzes on exam performance in an undergraduate social psychology course. Teaching of Psychology, 36(1), 33-37.
Agarwal, P. K., D’Antonio, L., Roediger III, H. L., McDermott, K. B., & McDaniel, M. A. (2014). Classroom-based programs of retrieval practice reduce middle school and high school students’ test anxiety. Journal of Applied Research in Memory and Cognition, 3(3), 131-139.
Trumbo, M. C., Leiting, K. A., McDaniel, M. A., & Hodge, G. K. (2016). Effects of reinforcement on test-enhanced learning in a large, diverse introductory college psychology course. Journal of Experimental Psychology: Applied, 22(2), 148.
McDaniel, M. A., Wildman, K. M., & Anderson, J. L. (2012). Using quizzes to enhance summative-assessment performance in a web-based class: An experimental study. Journal of Applied Research in Memory and Cognition, 1(1), 18-26.
McDaniel, M. A., Thomas, R. C., Agarwal, P. K., McDermott, K. B., & Roediger, H. L. (2013). Quizzing in middle-school science: Successful transfer performance on classroom exams. Applied Cognitive Psychology, 27(3), 360-372.
Jensen, J. L., McDaniel, M. A., Woodard, S. M., & Kummer, T. A. (2014). Teaching to the test… or testing to teach: Exams requiring higher order thinking skills encourage greater conceptual understanding. Educational Psychology Review, 26(2), 307-329.

Photo by Pixabay from Pexels