
Developing standards for digital learning: how can we work together to make progress?

Sometimes the less glamorous tasks are the ones that deliver the greatest value. I have recently spent some time on a modest redecorating project at home. As all the best YouTube videos tell us, preparation is the key to a good and lasting finish. This requires the messy and tedious work of stripping paint, filling gaps and holes, sanding surfaces, and removing dust and grease. The actual painting is the satisfying part (and takes a fraction of the total time). Progress is easy to detect. You can point at it. People may even notice you did it. I can see from my desk, as I type, a patch of poorly prepared window paint peeling away from last year’s project. I am cursing my lack of preparation. A redo beckons.

Having nursed the thought quietly for some years, I worry that a lack of consistency and agreement on standards in Learning and Development is one of those neglected, unglamorous efforts. We are painting on unknown surfaces.

What are industry standards for?

Standards solve a range of problems for businesses in defined industries. Often, they help customers and suppliers transact more easily. Industry cooperation can help the creation of shared value for participants. Many problems are common or have shared symptoms. Notable benefits include:

  • Managing risk and ensuring safety – food safety standards
  • Reduction in waste – standard unit sizing 
  • Product compatibility – the USB port
  • Reduction in harm – advertising standards
  • Interoperability of products and services – the shipping container
  • Simplification of contracts, agreements and obligations – service standards for retailers
  • Aligning measurement and metrics – audit standards, retail ratings
  • Measuring efficacy – energy efficiency ratings
  • Comparing services and projects/implementations – media measurement standards
  • Process improvement and benchmarking – media circulation data
  • Training and cross-training workforces in the industry – certification standards

How has digital L&D approached standards?

Standards are a hallmark of a mature industry: one where barriers are recognised and removed for the collective good, and clarity is sought in the interest of common understanding. Pre-digital (PD?), L&D made impressive progress with standards. SCORM is the most obvious example. A great deal of time and money was saved in the creation and management of online training, allowing customers to commission with confidence and suppliers to manage their production to a known output specification. A crucial step in the creation of the eLearning industry was taken.

The LMS as a means of managing online and classroom training has been a foundation since, as has the completion score as the standard unit of tracking learner behaviour. As time has passed, SCORM has become more of an anchor than a sail. Last updated in 2009, it has fallen far behind a digital environment requiring more flexibility and recognition of more than one format. xAPI was developed as the alternative standard, liberating us from a single content format (and a single source of content) to reflect the range of experiences which contribute to learning. That was in 2013. Flexibility has come with complexity for many, I suspect. Rendering activity in the ‘subject-verb-object’ format does not match the simplicity of the user behaviour and engagement measures digital platforms offer: clicks, opens, shares and the like are the digital norm.
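The ‘subject-verb-object’ shape is easiest to see in an example. Here is a minimal sketch of an xAPI statement in Python, using the actor/verb/object fields the specification defines. The learner, email address and course URL are invented for illustration; the verb ID is one of the standard verbs from the ADL registry.

```python
import json

# A minimal xAPI statement: actor (subject), verb, object.
# Actor and object details below are made-up examples;
# the verb ID is a standard one from the ADL registry.
statement = {
    "actor": {
        "mbox": "mailto:learner@example.com",
        "name": "Example Learner",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-GB": "completed"},
    },
    "object": {
        "id": "https://example.com/courses/fire-safety",
        "definition": {"name": {"en-GB": "Fire Safety Basics"}},
    },
}

# Statements are sent to a Learning Record Store as JSON documents.
print(json.dumps(statement, indent=2))
```

Compare this nested structure with a typical platform analytics event, which is usually a flat record of user, event type and timestamp: the expressiveness of xAPI is also its overhead.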

It is difficult to understand the extent to which xAPI has been adopted (this LinkedIn thread is a helpful summary of perspectives). Can status as a standard be claimed if widespread adoption is unclear? Importantly, the receptacle for xAPI statements, the Learning Record Store, brings data into the learning realm, where it can be analysed alongside more traditional learning data. There is a case to be made for travelling in the other direction and mixing with the warehouses, lakes and lakehouses (not my term, honest) of the business. If so, more business-ready definitions may be welcome.
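Travelling ‘in the other direction’ might look like flattening statements into the kind of row a business warehouse expects. This is a hypothetical sketch only: the output column names are invented, not part of any standard, and a real schema would be agreed with the data team.

```python
def flatten_statement(stmt: dict) -> dict:
    """Reduce a nested xAPI statement to a flat, warehouse-friendly row.

    Column names here are illustrative assumptions, not a standard.
    """
    return {
        "user": stmt["actor"].get("mbox", "unknown"),
        # Take the last path segment of the verb ID, e.g. "completed".
        "action": stmt["verb"]["id"].rsplit("/", 1)[-1],
        "item": stmt["object"]["id"],
        "timestamp": stmt.get("timestamp"),
    }

row = flatten_statement({
    "actor": {"mbox": "mailto:learner@example.com"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/completed"},
    "object": {"id": "https://example.com/courses/fire-safety"},
    "timestamp": "2023-09-01T10:00:00Z",
})
print(row)
```

A flat row like this sits comfortably next to the clicks, opens and shares the rest of the business already reports on.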

Measuring the usefulness of learning is a contentious topic. It is likely to remain a voyage of discovery, as scientific understanding guides what we know and defines what we mean by the L word. To be fair to Kirkpatrick, however, those four levels are an enduring model and whilst not a standard, are close to common practice.

Struggles to be clear on the effectiveness of what we do are a drag on the development of the industry. Stakeholders are not interested in obscure internal metrics, designers and developers are challenged to compare projects, and fallacious claims are still easy to make against wobbly assertions of engagement, for example. Will Thalheimer has made a powerful case for an alternative model. He has also clearly illustrated the effort and attention required. But do we need to make progress on all fronts at the same time?

What problems does L&D face from a lack of standards and benchmarks?

This is where the ‘important but not urgent’ debate comes to the fore. We are on something of a merry-go-round when we need to move forward in the same direction. History is instructive here.

The early days of the web were something of a wild west for commercial practices. Grandiose and unsubstantiated claims were constantly made about traffic to websites to attract advertisers. It was impossible to distinguish truth from hype. Meaningless claims of millions of “hits” were bandied about as proof of popularity. All industry players knew independent standards were needed to attract budgets away from TV and print media. The industry needed credibility. It needed collective action.

In 2000 (or so) I was part of the early conversations and committee meetings, on behalf of a nascent BBC Online, to establish standard measures for monthly traffic that could be verified and certified by trusted third parties. At the time, this was ABC Electronic, the internet part of the Audit Bureau of Circulations. We agreed, after some healthy debate, on standard measures for Unique Users and Page Impressions. Audited certificates were granted to website owners for validated monthly counts. A foundation stone in the development of the digital advertising market was laid and a currency was born.

There are echoes of this story in the digital development of L&D. Common problems outweigh our unique challenges. We apply similar systems, tools and content to the problems our organisations face. This industry, globally, is worth $200bn. How effective is that investment? The pressure to demonstrate the value of our budgets and resources is rising. Clarity and consistency are needed.

What kind of problems could we address?

From experience and research so far, I would summarise the challenges we face without standards and benchmarks as follows:

  • Comparing products and services – which content library, LXP, LMS implementation might suit me best? Where is the benchmark evidence to support that decision?
  • Comparing projects, programmes and initiatives – where is the benchmark evidence to analyse performance and manage future expectations?
  • Interoperability – Managing technologies and services together effectively (content providers and platforms, ease of matching content tags across systems)
  • Measuring and analysing effectiveness of learning services and activity: 
    • What does digital engagement look like?
    • How well-used are our products versus an industry norm?
    • Measuring and describing impact – what do consistent signals of efficacy look like?
    • What do we even mean by impact? There are a number of different approaches
  • Describing what we do in a clear and consistent manner to stakeholders, customers and investors – taking us beyond delivery of learning

No doubt there are other candidates. No standard will solve all of these problems, after all. But, some sensible measures will blow away a great deal of fog and could gather us around common concerns.

Where can we start and who should be involved?

Making some kind of collective first step is essential, I believe. These are industry challenges and need action across as many fronts as possible. In the spirit of starting somewhere, here are some suggestions, taking inspiration from other industries, as areas to explore:

  • Standard measures of user behaviour on learning products – a big step towards describing what we mean by learner engagement
  • Interoperability of systems, products and tools – ecosystem standards for handling data and content or shared API documentation
  • Tagging standards – a collective approach to taxonomy and managing content across technologies and platforms
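To make the tagging point concrete, here is a hypothetical sketch of the matching problem: each system labels the same content differently, and a shared taxonomy becomes the join key. All system names, tags and mappings below are invented for illustration.

```python
# Hypothetical vendor-specific tags mapped to one shared taxonomy term.
# Without an agreed taxonomy, every pair of systems needs its own mapping.
SHARED_TAXONOMY = {
    "lms": {
        "H&S-01": "health-and-safety",
        "LDR2": "leadership",
    },
    "content_library": {
        "safety/basics": "health-and-safety",
        "leading-teams": "leadership",
    },
}

def to_shared_tag(system: str, vendor_tag: str) -> str:
    """Translate a vendor-specific tag into the shared taxonomy term."""
    return SHARED_TAXONOMY[system].get(vendor_tag, "untagged")

# Content tagged differently in two systems resolves to the same term,
# so usage and effectiveness can be compared across platforms.
print(to_shared_tag("lms", "H&S-01"))                     # health-and-safety
print(to_shared_tag("content_library", "safety/basics"))  # health-and-safety
```

The point of a tagging standard is that this mapping table shrinks towards the identity: systems agree on the terms up front rather than translating after the fact.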

Voices are needed from leaders in L&D technology within organisations, from leaders in vendor, content and services businesses, and from industry analysts. If that sounds like you, and you want to discuss the problems standards can help us solve and how to organise that effort, we are hosting a forum event in November to make a start.

Sign up here to receive more information about the event and registration details.

Feature image: Imperial Standard Lengths, Greenwich.

