In the spirit of openness, this post shares contributions from those who have signed up for the Digital Learning Standards Forum so far. It summarises where interest is gathering (our hopes and fears, if you like) and therefore also outlines where our efforts will focus in pursuit of simple, clear and open approaches – defining the common ground we share. This is very much a 'rising tide lifts all boats' initiative. More voices are welcome and called for. There is a sign-up form below to join the Forum. So…to the themes:
1. Interoperability
The notion of a learning technology ecosystem has become a common conversational theme. Unlike natural ecosystems, though, the components of ours lack the organic, mutually beneficial and connected nature of the real thing. Managing components together is time-consuming, resource-intensive, and a source of frustration for customers and vendors alike. Many of us do not really manage ecosystems; we struggle with the act of knitting connections together. Despite this shared frustration, the industry has yet to agree on common means of managing the data and content at the heart of our services.
We have an opportunity to create the shipping container equivalent for L&D: a standard unit in which whatever we make can be shipped from point to point, regardless of what it contains, with clear expectations of what can be done when it arrives and how the content can be unpacked for the end consumer. It can move from mode to mode, across the world, with little friction and greater flexibility of purpose.
Sometimes it seems like every implementation is approached as if nothing has been done before. Interoperability standards will help to reduce the time and cost of working with multiple providers, and improve the accuracy and simplicity of implementation.
2. Benchmarking and metrics
Creating a currency to inform investment decisions is a common feature of many industries. Understanding and comparing the contributions and effectiveness of modalities, products and services using agreed, common metrics is a missing feature of the L&D landscape. We can support customers in making decisions about the relative value of solutions. Vendors can be supported by a familiar set of measures with which to describe their value. It will also help learning leaders make design choices and manage expectations. Consistent and familiar effectiveness measures add clarity and simplicity to what we tell our customers and stakeholders. The advertising industry is built on similar common digital ground across the web and mobile apps: campaigns can be designed, planned and bought across the full range of media vehicles.
Such benchmarks will not answer all effectiveness and impact questions, but they can lay a clear and familiar surface on which to formulate answers.
Sometimes it seems like every implementation is approached as if nothing has been done before. Defined benchmarks for product and service effectiveness can help guide decisions on which platforms and content, for example, to choose and how to apply them.
What might such measures be?
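By way of illustration only – none of the following are agreed industry measures, and every name and formula here is a hypothetical example invented for this sketch – a shared metrics vocabulary might start with simple, comparable quantities:

```python
# Hypothetical candidate metrics for comparing learning products.
# None of these names or formulas are agreed standards; they simply
# illustrate what a shared, comparable vocabulary could look like.

def completion_rate(completed: int, enrolled: int) -> float:
    """Share of enrolled learners who finished the experience."""
    return completed / enrolled if enrolled else 0.0

def assessment_uplift(pre_score: float, post_score: float) -> float:
    """Absolute improvement between pre- and post-assessment."""
    return post_score - pre_score

def cost_per_completion(total_cost: float, completed: int) -> float:
    """Spend divided by successful completions."""
    return total_cost / completed if completed else float("inf")

# Example: putting two (fictional) vendors on a common footing.
vendor_a = {
    "completion": completion_rate(420, 600),
    "uplift": assessment_uplift(54.0, 71.0),
    "cost": cost_per_completion(12_000, 420),
}
```

The point is less the formulas themselves than the agreement: a world in which every vendor and customer computing "completion rate" means exactly the same thing by it.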
3. Openness and open source

As with every industry, L&D capability is built on a software foundation. Software development has thrived on standard methods and approaches. In many areas, those standards and approaches are open for developers to use and apply, and to contribute enhancements to. Linux and MySQL were the engines of many of the tools we take for granted today. L&D is no stranger to open standards either, with SCORM and xAPI making significant, if not transformational, contributions. There is an excellent opportunity to build on this foundation for the new challenges and opportunities before us. Interoperability will need openness to thrive, as will metrics definitions. In a sense, openness should be a guiding principle of the other developments.
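To make the xAPI mention concrete: the specification's core unit is a simple actor–verb–object statement expressed as JSON, which is very much the "shipping container" described above. A minimal sketch in Python (the learner, email address, and activity here are illustrative placeholders, not from any real system):

```python
import json

# A minimal xAPI-style statement: the spec's core unit of interoperable
# learning data. The actor and activity values below are made up; the
# verb URI is one of the commonly used ADL verb identifiers.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Learner",
        "mbox": "mailto:learner@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "http://example.com/activities/intro-course",
        "definition": {"name": {"en-US": "Introduction Course"}},
    },
}

# Because the shape is standardised, any conformant Learning Record
# Store can accept it - that is what makes the data portable.
print(json.dumps(statement, indent=2))
```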
Sometimes it seems like every implementation is approached as if nothing has been done before. Openness can help to relieve this burden and reduce the cost of developing and implementing solutions.
4. Skills and taxonomy
In many ways, the skills challenge is the issue of our time for L&D. It is simultaneously the opportunity to prove our strategic worth and a threat to our traditional models. Juggling scale, personalisation, long-term development and data are properly difficult problems for learning services. The (very) high-level analysis shows the economic direction, but is hard to apply in an organisational context where local relevance is needed. Individual models and vendor tools need some serious knitting and managing in a customer context. And yet we are all pursuing similar outcomes with similar toolkits. What can we learn from these efforts to define common approaches?
Sometimes it seems like every implementation is approached as if nothing has been done before. The devil is very much in this detail, but how unique is that detail, really?
5. Data ethics
A topic about which there is heightened anxiety and concern. The power of our data is rendered clearer day by day. In L&D, brows are furrowing about employee monitoring and the risk/reward equation, as they are about AI systems and automation. This is a great summary of the opportunity of Responsible AI and how shared principles can guide development. How should the industry grapple with these issues? And where should that grappling take place – is there more value in joining with HR and broader people-data interests? We have to be evidence-based. We also have to engender trust in our services.
The opportunity here feels like the agreement of principles by which we gather, handle and take ownership of data. The risk is that the train is leaving without us.
6. Accessibility
So often paid lip service to, and so often an afterthought. As L&D reaches into new territories with new toolkits (AR and VR spring to mind), ensuring that design decisions are equally valuable to all audience members needs to be a pressing concern. Whether we lobby for our audience's needs in other forums and/or put our own L&D house in order is an important choice to make. It should not be taken unilaterally, though.
Sometimes it seems like every implementation is approached as if nothing has been done before. We have a responsibility to keep the bar at the highest level possible for all.
A personal observation
Further to that spirit of openness, I will also share some personal lessons from the last few weeks of this standards effort:
- The value of persuading people to face in the same direction might not be as obvious to all as it seems
- Setting out a common cause is challenged by people’s immediate priorities – busy times
- Networks are personal rather than organisational – working with others in the industry is in few people’s job descriptions
- Cooperation beyond organisation boundaries relies on longer term incentives
- There is a tendency to jump to the answer – shared definitions of problems will be part of the value of working together
- Few things of value happen quickly and with ease
Thanks again to Filtered for their support and encouragement to bring us this far.
Sign up to join the Digital Learning Standards Forum here:
Feature image courtesy of: https://negativespace.co/colorful-shipping-containers/