Anatomy of a SCORM Minutiae Mistake

I don’t know how many times I’ve said to someone on the phone, “SCORM is difficult, especially for the LMS provider.” There are many moving parts, countless interpretations, and vagaries in the specification itself. For the most part, we handle these things exceptionally well. Sometimes we make mistakes, and sometimes those mistakes can compound themselves.

The Source of Today’s Problem

In SCORM 1.2, mastery_score and lesson_status can interact with each other strangely. Frankly, the specification can be interpreted in two ways.

From Section 3.4.4, “The SCORM Run-Time Environment Data Model”, in the cmi.core.lesson_status section (henceforth called “The Narrow View”):

After setting the cmi.core.lesson_status to “completed”, the LMS should now check to see if a Master Score has been specified in the cmi.student_data_mastery_score, if supported, or the manifest that the SCO is a member of. If a Mastery Score is provided and the SCO did set the cmi.core.score.raw, the LMS shall compare the cmi.core.score.raw to the Mastery Score and set the cmi.core.lesson_status to either “passed” or “failed”. If no Mastery Score is provided, the LMS will leave the cmi.core.lesson_status as “completed”.

From Section 3.4.4, “The SCORM Run-Time Environment Data Model”, in the cmi.core.lesson_status section, incorporating text before and after “The Narrow View” (henceforth called “The Holistic View”):

Additional Behavior Requirements: If a SCO sets the cmi.core.lesson_status then there is no problem. However, the SCORM does not force the SCO to set the cmi.core.lesson_status. There is some additional requirements that must be adhered to successfully handle these cases:

  • Upon initial launch the LMS should set the cmi.core.lesson_status to “not attempted”.
  • Upon receiving the LMSFinish() call or the user navigates away, the LMS should set the cmi.core.lesson_status for the SCO to “completed”.
  • From above: After setting the cmi.core.lesson_status to “completed”, the LMS should now check to see if a Master Score has been specified in the cmi.student_data_mastery_score, if supported, or the manifest that the SCO is a member of. If a Mastery Score is provided and the SCO did set the cmi.core.score.raw, the LMS shall compare the cmi.core.score.raw to the Mastery Score and set the cmi.core.lesson_status to either “passed” or “failed”. If no Mastery Score is provided, the LMS will leave the cmi.core.lesson_status as “completed”.

Herein lies the big difference. The bullets are intended only for the cases in which the LMS has been forced to manage the status on its own. In a piece of content that sets its status (as we’ll discuss below), we believe the LMS is not supposed to intervene with regard to the Mastery Score.
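The difference between the two readings can be sketched as hypothetical LMS finish-handling logic. All of the names below are ours, not from the spec; SCORM 1.2 defines only the data model elements, so treat this as an illustration of the two guards, not an implementation.

```typescript
// Hypothetical sketch of an LMS's handling of a SCO at finish time
// under the two readings of Section 3.4.4. Names are illustrative.

interface ScoRuntime {
  lessonStatus: string;        // cmi.core.lesson_status
  scoSetStatus: boolean;       // did the SCO ever set lesson_status itself?
  scoreRaw: number | null;     // cmi.core.score.raw, null if never set
  masteryScore: number | null; // from cmi.student_data.mastery_score or the manifest
}

function onFinish(sco: ScoRuntime, holistic: boolean): string {
  // If the SCO never reported a status, the LMS must supply "completed".
  if (!sco.scoSetStatus) {
    sco.lessonStatus = "completed";
  }

  // Narrow view: always compare against the Mastery Score.
  // Holistic view: compare only when the LMS itself set the status above.
  const lmsOwnsStatus = holistic ? !sco.scoSetStatus : true;

  if (lmsOwnsStatus && sco.masteryScore !== null && sco.scoreRaw !== null) {
    sco.lessonStatus = sco.scoreRaw >= sco.masteryScore ? "passed" : "failed";
  }
  return sco.lessonStatus;
}
```

Under the narrow view, a SCO that reports “completed” with a raw score of 60 against a mastery score of 80 ends up “failed”; under the holistic view, the LMS leaves it “completed” because the SCO spoke for itself.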

What we did wrong, a while ago

In SCORM Engine 2007.1, we went with this logic, which maps to the “Narrow View”:

If cmi.core.lesson_status has been set and cmi.core.score.raw has been set, compare the Mastery Score to the cmi.core.score.raw and set the status to “passed” or “failed”.

Ultimately, as this status rolls up through the course, this logic tolerates content we believe is wrong, reporting “completion_status=complete and success_status=passed” or “completion_status=complete and success_status=failed” to the client LMS. Put another way, it cleans up the mistaken interpretation made by the content author. (It’s an understandable mistake.)

This seems OK at first blush, but then you start running into content that expects the other behavior. If you’re a content author who reads the spec holistically, and you’ve intentionally set a value for lesson_status, and the LMS overrides it, that’s pretty confusing. If the spec were totally clear on the subject, we would stand behind it. Given that the spec is ambiguous here, we can appreciate the author’s point of view.

So, we did what we do. We made accommodations.

How we accommodate different interpretations of the specification

We have long believed that the best way to have a highly compatible SCORM player is to accommodate different interpretations from content. This is a perfect example of why we do this, and it allows us to properly support content in a way that other LMSs and players just don’t.

From our release notes for 2008.1:

Mastery Score Overrides Lesson Status – In SCORM 1.2, there is a debate about when and if the LMS should override the lesson status reported by the SCO with a status determined by the reported score’s relation to the mastery score (i.e. if the reported score is 60 and the mastery score is 80, then should the LMS set the status to failed even though the SCO said the status should be passed?). This setting allows you to choose whether or not the LMS should override the status based on the score for this course.

Alright, this is great, right? Now we can have our cake and eat it too. (The fact that cake is gross will have to be another post.)

Every time we add a new package property like this one, we have to make a decision on the part of our clients. We have to decide what the default is. In some cases, this is easy stuff. When we’re tolerating departures from the standard, we simply go with the standard as the default. This is a tough one, though, because the spec is a bit ambiguous. In this situation, we go with what we believe is the correct interpretation of the standard.

In this case, we decided to opt for “false”, or, mastery score does not override status. We think that a content developer who’s smart enough to set his or her own status is also smart enough to retrieve the mastery score and compare against it if they want to. We’re erring on the holistic side of things here, and I still feel good about this decision.

I do not, however, feel good about our mistake.

The Mistake

We chose the default. We deployed the new version of the SCORM Engine. And we added the necessary columns as part of the upgrade script. In doing so, we used the default value.

Big Mistake. Big. Huge.


–Vivian, Pretty Woman

(Note, this is not a widespread problem. It’s isolated to content with an atypical interpretation, but it is very problematic for those courses. I just like to quote movies.)

Some of our clients have content that expected the LMS to make the comparison against the Mastery Score even though it had already set the status itself. This content had functioned without issue for some time, and upgrading to 2008.1 introduced a problem with it.

With the new default, though, this is what happens. A course could set cmi.core.lesson_status to “completed” and then report a cmi.core.score.raw that exceeds the Mastery Score they’ve provided. The content could assume that the LMS logic defined in Section 3.4.4 (the narrow view) would then change the lesson_status to “passed”. Because we’ve opted to go with the holistic approach by default, the status would in fact not be changed. This scenario, though, isn’t a big deal. The client LMS would still interpret this course as sufficiently completed and all would be well.

The mistake manifests itself, though, when the cmi.core.score.raw is less than the Mastery Score. In this situation, the status values would remain “completion_status=complete and success_status=unknown”. To the client LMS, this appears to be a course that is probably complete and has no testing, when in fact, it’s really a failed test.
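To make the gap concrete, here is an illustrative mapping from a SCORM 1.2 lesson_status to the 2004-style completion/success pair a client LMS might derive from it. The mapping is a simplification of our rollup, not a normative one.

```typescript
// Illustrative rollup from a SCORM 1.2 lesson_status to the
// completion/success pair a client LMS sees.
function rollupStatus(lessonStatus: string): { completion: string; success: string } {
  switch (lessonStatus) {
    case "passed":     return { completion: "complete",   success: "passed" };
    case "failed":     return { completion: "complete",   success: "failed" };
    case "completed":  return { completion: "complete",   success: "unknown" };
    case "incomplete": return { completion: "incomplete", success: "unknown" };
    default:           return { completion: "unknown",    success: "unknown" };
  }
}
```

A failed test that only ever reports “completed” therefore surfaces as complete/unknown, indistinguishable from a course that has no test at all.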

The conclusion? We have picked the right defaults for people going forward, but we probably should have set the defaults in the upgrade script to stick to the old behavior. (We have, in fact, gone back to the 2008.1 upgrade script and made this change for those of you who have yet to upgrade.)

Now What?

Well, we discovered this side effect just last night, and it obviously merits immediate action for some clients.

For those of you who ran against 2007.1 and have some concern that you may have courses that function like this, you can opt to revert to the old logic. If you’d like help doing just that, you can simply ask us for the queries to revert to that default. We’ll help you through that and we’ll help you examine any potential “false completions” that have happened since you deployed 2008.1.

If you’re building a new SCORM Engine integration, you can opt to go with our defaults. It is our experience that more content (including some from a big authoring tool vendor) benefits from our new default behavior. But that doesn’t mean it catches every scenario. This is something that you and we will continue to be on the lookout for. In fact, we’re going to see if there’s any sort of a heuristic that we could deploy successfully to handle this ourselves. (We’re not optimistic, but we’d like to catch this one without human intervention.)


  • Mike Rustici

    I need to step in here and make an important clarification. The statements and opinions included herein are the opinion of the author and do not represent the official views of Rustici Software.

    Rustici Software’s official position is that cake is a tasty delight. Additionally, having cake without being able to eat it is a big tease.

  • Thank you for taking the time to document this issue. The ability of the LMS to override the lesson_status by comparing score.raw against the mastery score has long been a point of pain and confusion for anybody developing for SCORM 1.2 and is just one of several areas where the specification is ambiguous at best.

    This type of issue highlights the need for specifications that are driven by the industry itself based upon real world experience.

  • Carlos Martin

    Thanks for the post, Tim. We work with several SCORM 1.2 content providers (some are experienced, some are not) and our recommendation has always been to send a lesson_status of complete or incomplete and let the LMS set the passed/failed status based on mastery_score.

    We do not use your SCORM player (yet) but it’s good to see that our “narrow view” contents will work fine.

  • Gordon Harding

    I am a little confused about multi-sco packages and the scoring. I would appreciate you adding your thoughts. I have three SCOs with the first two being content only and reporting completed/incomplete. The third SCO is my quiz with a pass/fail option.

    When I run it through your system the score is the quiz score divided by three and the status is complete.
    Is this the expected behaviour?

    When I look at the log file I see that the first two SCOs are reporting a raw score of 0 and a max score of 100. Would it make a difference if there was no raw score reported? My course builder does not offer that option or I would have tried it.

  • Gordon,

    Key distinction here: are we talking SCORM 1.2? Or SCORM 2004? And are you working in a Rustici supported SCORM implementation? Or someone else’s?

    I can help based on any answers from you, but the solutions are different in each case.

  • Gordon Harding

    I am working in SCORM 1.2 and the implementation is not Rustici but I did test with Rustici.
    I am not sure who is at fault here. I am of the opinion that the course should not send a score since the course is set as completed/incomplete.
The LMS might be taking the tack that since a score is offered, it must be a valid score.
    I don’t want to fight the wrong fight and attempt to change the mind of the wrong vendor. Not that I am likely to win. 😉

  • The short answer: SCORM 1.2 just doesn’t specify a behavior in this regard. It’s up to the LMS.

    I’m guessing the reason you’re seeing a score divided by three is that the LMS is averaging your scores across all of the SCOs (two of which aren’t reporting anything).

    In our stuff, we have the ability to say, “Rollup the score only from the last thing, as if it’s a post-test.” Most LMSs don’t have this, though.

    Does that help?
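    For what it’s worth, the two rollup behaviors I’m describing can be sketched like this (purely illustrative; real LMS rollup rules vary, and the names are mine):

    ```typescript
    // Illustrative score rollup across the SCOs in a package.
    // "average" folds in every SCO, treating unreported scores as 0
    // (which looks like what Gordon is seeing); "last" takes only the
    // final SCO's score, as if it were a post-test.
    function rollupScore(scores: (number | null)[], mode: "average" | "last"): number | null {
      if (scores.length === 0) return null;
      if (mode === "last") {
        return scores[scores.length - 1];
      }
      const values = scores.map(s => s ?? 0);
      return values.reduce((a, b) => a + b, 0) / values.length;
    }
    ```

    With two content SCOs reporting 0 and a quiz reporting 90, “average” yields 30 while “last” yields the 90 you’d expect from a post-test.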

  • Gordon Harding

    Thanks, that appears to be the case. The SCORM spec certainly is open to interpretation and much of the interpretation seems built on the vendor’s tools. I guess that is always the case. I am sure that some could argue convincingly against my interpretation.

    Since there is no consensus on how it should be interpreted I will attempt to find a third option. I thought that was likely anyway.

    All the best.

  • Yugndhar

    Please provide one example project for this problem.