The output of Project Tin Can is the Experience API (xAPI).

For the most current information about xAPI, please visit

Next Steps

The information on this page documents the Project Tin Can research project and is available for archival/historical purposes only.

Our prototypes have been online for a while, and you can tinker with them all you like.

What’s new? Well, the Tin Can API is now baked into all of our products. SCORM Engine lets you have an installed LRS, SCORM Driver lets your content generate Tin Can statements that are delivered to an LRS, and SCORM Cloud now contains a hosted LRS.

Working with our samples and our prototypes

The easiest way to see the Tin Can API in action today is to use some sample courses that we’ve created (or your own courses) to generate Tin Can statements to our public (beta) LRS, or to the SCORM Cloud LRS. Run these courses from your own computer or on your own web server, and start reporting “I did this” statements to SCORM Cloud or the beta LRS.
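At the wire level, an "I did this" statement is a small JSON document with an actor, a verb, and an object, which the content POSTs to the LRS's statements resource. The sketch below builds a minimal statement of that shape; the learner mbox, activity id, and names are placeholder assumptions, not real accounts or courses.

```python
import json
import uuid

# A minimal "I did this" xAPI statement (actor / verb / object).
# The mbox and activity id are illustrative placeholders.
statement = {
    "id": str(uuid.uuid4()),
    "actor": {
        "objectType": "Agent",
        "name": "Example Learner",
        "mbox": "mailto:learner@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "http://example.com/activities/sample-course",
        "definition": {"name": {"en-US": "Sample Course"}},
    },
}

# An LRS accepts this JSON via an HTTP POST to its statements endpoint;
# here we just serialize it to show the wire format.
print(json.dumps(statement, indent=2))
```

Delivering this to a real LRS additionally requires the LRS endpoint URL and credentials, which SCORM Cloud and the beta LRS each provide on signup.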

You also have the option of creating Tin Can Packages (similar to SCORM or AICC packages) and importing them directly into SCORM Cloud.
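A Tin Can package is a zip whose root contains a tincan.xml manifest describing the activities and the launch page, in place of SCORM's imsmanifest.xml. The fragment below is an illustrative sketch of that manifest; the activity id, names, and launch file are assumptions for the example.

```xml
<?xml version="1.0" encoding="utf-8"?>
<tincan xmlns="http://projecttincan.com/tincan.xsd">
  <activities>
    <!-- The activity id is a placeholder IRI identifying this course -->
    <activity id="http://example.com/activities/sample-course"
              type="http://adlnet.gov/expapi/activities/course">
      <name>Sample Course</name>
      <description lang="en-US">An example Tin Can package.</description>
      <!-- The page the LMS launches, relative to the zip root -->
      <launch lang="en-US">index.html</launch>
    </activity>
  </activities>
</tincan>
```

Zipping this manifest alongside the content files yields a package you can import directly into SCORM Cloud.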

Check out the prototypes to get started.