After a summer of feature releases, we’re focused on platform stability, bug fixes, and the development of our next significant feature: Course Readiness Reporting.
The new Course Readiness Report
Last month we previewed the development of the Course Readiness Report. In this month’s product update, I’d like to go into more detail about what the Course Readiness Report is designed to do and how it will help educators and ID teams save time in the course authoring process.
To recap, the Course Readiness Report is a summary report that looks at course organization and content elements — specifically alignment between content, activities, and assessments. These are core elements that we’ve identified as critical to get right when designing a course to generate effective and meaningful data.
Previously, users of Smart Author had to review these elements manually, which was tedious. More often, the course would be released to students, and they would be the first to notice errors in assessment items or learning objectives.
With this new, forthcoming feature, Smart Author users can run the Course Readiness Report and generate a list of possible errors in the course, such as missing media, poorly formed questions, questions with no correct answer, or questions with more than one correct answer. The report gives course development teams a succinct checklist to work through.
While the Readiness Report simplifies course authoring by spotlighting common errors in a course, the full list can also be overwhelming. To make the inventory of errors more digestible for users, we’ve added a summary at the top of each report.
The summary takes the form of a chart, listing all six areas of a course that can contain errors — process, Skill Graph, alignment, questions, accessibility, course structure — and shows how many errors each of those categories contains. A separate graphic visualizes the course design triangle and shows “at a glance” where misalignments may be based on the underlying course Skill Graph.
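To illustrate the idea behind the summary chart, here is a minimal sketch of tallying error records into the six categories. The data shape (category, message pairs) and the sample errors are assumptions for illustration only, not the report’s actual data model:

```python
from collections import Counter

CATEGORIES = ["process", "skill_graph", "alignment",
              "questions", "accessibility", "structure"]

def summarize(errors):
    """Tally (category, message) error records into per-category counts."""
    counts = Counter(category for category, _ in errors)
    return {category: counts.get(category, 0) for category in CATEGORIES}

# Hypothetical error records for illustration.
sample_errors = [
    ("questions", "Question 12 has no correct answer"),
    ("questions", "Question 7 has duplicate answer choices"),
    ("accessibility", "Image on page 3 is missing alt text"),
    ("alignment", "Objective LO-4 has no aligned assessment"),
]
```

A chart like the one in the report would then simply render these per-category counts.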
In the example below, this Readiness Report is indicating poor alignment between objectives and both assessments and practice. The assessments themselves are good (as indicated by the blue); however, there are issues with the practice items and objectives — possibly due to skills not being tagged correctly or assessments with missing distractors, etc.
Our goal with the Course Readiness Report is to make it easier for course authors concerned about particular errors to find them quickly. For example, if an author is worried that sections of the course aren’t accessible to screen readers, they can see at a glance how many accessibility errors exist. Then, if the author chooses to address them, they can scroll down to find the individual errors.
The six areas the Course Readiness Report reviews are:
- Process: Checks that you have removed temporary placeholder content or media elements that may have been inserted during the lesson authoring phase.
- Skill Graph: Checks for duplicative objectives, whether objectives have skills, and whether all questions are tagged to skills. This check provides a quick summary of the overall integrity of the course Skill Graph.
- Alignment: Ensures that practice elements are aligned to a learning objective and that every practice question is aligned to one or more summative assessments. This check also looks to see if every learning objective that’s been added to a course is used.
- Questions: Looks for common mistakes such as duplicate answers, missing distractors, missing right answers, and whether enough variety is being used in question types.
- Accessibility: Checks that images have alt text and that video content has transcripts. This check also flags alt text that is too short.
- Structure: Checks that an appropriate number of learning objectives are included on each page and that every page has at least one learning objective. This also checks to see if course BluePrints have empty sections.
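The question-level checks listed above can be sketched in a few lines of Python. This is a hypothetical illustration: the field names (`choices`, `correct`) and the three-distractor threshold are assumptions, not Smart Author’s actual data model or rules:

```python
def check_question(question):
    """Return a list of issues for one multiple-choice question.

    `question` is a hypothetical dict with 'choices' (list of answer
    texts) and 'correct' (list of indices of correct choices).
    """
    issues = []
    choices = question.get("choices", [])
    correct = question.get("correct", [])
    if len(set(choices)) < len(choices):
        issues.append("duplicate answer choices")
    if len(correct) == 0:
        issues.append("no correct answer")
    elif len(correct) > 1:
        issues.append("more than one correct answer")
    # Assumed rule: expect at least three incorrect choices (distractors).
    if len(choices) - len(correct) < 3:
        issues.append("missing distractors")
    return issues
```

Running a check like this over every question in a course is what turns scattered authoring mistakes into the succinct checklist the report provides.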
By running the Course Readiness Report, course authors can help the platform fulfill its most important purpose: ensuring that all formative assessments are tagged to skills and that those skills are tied to learning objectives on the Skill Graph.
Related reading: This Is How We Do It: Defining Personalized, Adaptive Learning
All of these connections are important for the adaptive learning engine to collect data and produce a predictive learning estimate for each student against every learning outcome. If a question is incorrectly tagged (or not tagged to a skill), the platform won’t know which skills have been mastered by which students. And if a student’s skills aren’t being correctly assessed, the learning engine will not be able to show an instructor which learning objectives have been met.
By looking at the summary, a course author can quickly see whether a course has alignment or Skill Graph errors and then efficiently address them.
As we shared in our last product update, the Course Readiness Report has only been given a limited release this summer. Once we fine-tune this feature, it will be made available to all our clients — likely by this fall.
That’s it for now, but keep an eye out for more improvements and fixes as the start of the school year approaches.
For more information on Acrobatiq’s platform or Smart Author, feel free to contact us. We’d love to hear from you.
And for more on the learning science, instructional design, and use cases behind adaptive learning, be sure to check out our library of white papers.