When tackling Data & Analytics, look for signals not for proof

I am almost perpetually bothered by the LPI's finding about the poverty of confidence and capability in Data and Analytics. At the risk of repetition: it should be a top three skill. So, it's a serious problem. It might be the most serious capability problem the profession has.

I will describe the source of my anxiety quickly and move on. In any digital endeavour, there are two foundations: data and connection. Without these, hopes of value are foolhardy. In 2019 and beyond, data is a prerequisite. So… what to do?

One important step, I think, is to stop seeking certainty and proof. The quest for ROI and proof of impact is looking like a red herring. In the messy world of work and workplaces, full of unpredictable people making mostly irrational choices, certainty is a phantom. Isolating the variables with enough clarity to assign impact to our learning stuff is notoriously difficult, and the conditions needed to run clean experiments are rare.

We are better off seeking good evidence to apply to our best judgement, and using that evidence to increase our confidence in the value of our choices. Once again, it is useful to look outside our professional boundaries at the digital marketing world. There is a sophisticated and still rapidly growing marketplace out there trading on the signals observed in user behaviour. These might be signals of our preferences, signals of our likelihood to purchase, signals of our affinity to groups and opinions and, worryingly, signals of our likely voting behaviour. They are predictive signals and, in combination, they can be very accurate. Over time they can add confidence to outcomes as well – that’s why the digital persuasion industry is so powerful.

Since its inception, Google search has used a variety of signals to calculate the most relevant results for a query. Needless to say, it is very good at it. Simple signals such as the presence of keywords in page content are supplemented by the volume of inbound links to that page. Those inbound-link signals are weighted by the subject matter of the linking sites, their own inbound links and so on. A host of other signals factor into the selection and positioning of links, with personal search patterns becoming increasingly accurate indicators. But not all signals are equal. Specialist links are worth more than generic content links. Commercial links might be less valuable than organic links. These might be called signals of expertise or authority. Signals of popularity might also help, assessed from the click-through rates to those links from similar queries. No proof of relevance is offered, but confidence is increased by interpreting the signals.
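To make the idea concrete, here is a toy sketch of combining weighted signals into a single relevance score. The signal names, weights and values are invented for illustration and bear no relation to Google's real ranking system:

```python
# Toy sketch only: signal names, weights and values are invented
# assumptions, not a description of any real search engine.

def score_page(signals, weights):
    """Combine normalised signal values (0-1) using per-signal weights."""
    return sum(weights[name] * value for name, value in signals.items())

weights = {
    "keyword_match": 0.3,  # query keywords present in page content
    "inbound_links": 0.4,  # volume/quality of inbound links
    "click_through": 0.3,  # click-through rate from similar queries
}

pages = {
    "specialist-article": {"keyword_match": 0.9, "inbound_links": 0.8, "click_through": 0.7},
    "generic-page": {"keyword_match": 0.9, "inbound_links": 0.3, "click_through": 0.4},
}

# Rank pages by combined score: higher means more confidence in relevance
ranked = sorted(pages, key=lambda p: score_page(pages[p], weights), reverse=True)
```

No single signal "proves" relevance here; the ranking simply reflects greater confidence when several weighted signals agree.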

Looking further at user intent, the marketing world scans for signals of brand preference, propensity to buy, and the effectiveness of advertising tactics and messaging. Analysing these signals in combination and over time increases clients’ confidence to buy those placements in their media plans. (In the direct-response segment, these trades are increasingly programmatic and automated to optimise outcomes.)

Clearly, these signals are matched with purchasing behaviour, sign-ups and other target behaviours, where possible, to assess commercial outcomes. The question is: what is the payoff? For us L&D types, this is where the analogy takes us into business data, and we must travel beyond our organisation boundaries to gather insights into the payoff. Learning data does not tell us enough – it never did, really – how could it? We need to develop our fluency in the commercial, operational and organisational data of our colleagues (back to that meaningful business-partnering challenge). Our own signals will only tell us so much.

Publishers and content owners (that’s us in the L&D digital world, I reckon) are reviewing signals of audience activity to get a picture of the value of what they make and where they place it. (Completion is rarely a consideration here, which is an important clue.) Frequency of visit is a good measure of utility, for example. Loyalty rates also help us understand usefulness. Signals of engagement can be gathered from duration-of-visit data, and pages or content items viewed per visit are useful in that regard too. These signals do not describe behaviour change itself, but they do build our confidence in the attention we are gathering around our stuff and how helpful it is to our users.
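As a minimal sketch, those audience-activity signals can be derived from nothing more than a visit log. The users, page counts and durations below are entirely hypothetical:

```python
# Deriving engagement signals from a raw visit log.
# All data here is invented for illustration.
from collections import defaultdict

visits = [  # (user, pages_viewed, duration_minutes)
    ("ana", 4, 6.0), ("ana", 2, 3.5), ("ben", 1, 0.5),
    ("ana", 5, 8.0), ("cal", 3, 4.0), ("cal", 2, 2.5),
]

# Count visits per user to spot returning (loyal) users
per_user = defaultdict(int)
for user, _, _ in visits:
    per_user[user] += 1

returning = sum(1 for count in per_user.values() if count > 1)
pages_total = sum(pages for _, pages, _ in visits)
minutes_total = sum(minutes for _, _, minutes in visits)

signals = {
    "visits_per_user": len(visits) / len(per_user),   # frequency of visit
    "return_rate": returning / len(per_user),         # loyalty rate
    "pages_per_visit": pages_total / len(visits),     # engagement depth
    "avg_duration_min": minutes_total / len(visits),  # attention per visit
}
```

Note there is no completion metric in sight: the signals describe attention and usefulness, not whether anyone "finished".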

We can continue to ask people what they think as well. In this mix of data sets, survey data is a clear signal. “How was it for you?” is a good candidate for a perennial signal of value. Mixing this direct feedback with other information creates a richer picture of response and activity than feedback surveys alone.
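One way to sketch that mixing of survey feedback with usage signals – where the resource names, signal values and weights are all invented assumptions, and the weighting itself is a judgement call, not a formula:

```python
# Hypothetical sketch: blending a "How was it for you?" survey score
# with usage signals into one composite value signal.

resources = {
    "onboarding-guide": {"survey_avg": 4.2, "return_rate": 0.65, "pages_per_visit": 3.1},
    "compliance-module": {"survey_avg": 3.1, "return_rate": 0.10, "pages_per_visit": 1.2},
}

def value_signal(r):
    # Normalise each signal to the 0-1 range before weighting;
    # the weights are assumptions chosen for illustration.
    return (0.5 * (r["survey_avg"] / 5)      # direct feedback
            + 0.3 * r["return_rate"]         # loyalty
            + 0.2 * min(r["pages_per_visit"] / 5, 1.0))  # engagement depth

scores = {name: value_signal(r) for name, r in resources.items()}
```

The composite score is not proof of impact, but it gives a richer basis for comparison than either the survey or the usage data alone.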

Moving closer to behaviour change, the usage signals on social platforms are becoming startlingly good predictors of behaviour. As data sets are combined and compared, sophisticated campaigns are influencing all sorts of nudged and persuaded outcomes. Experimentation and testing are adding to the confidence with which these signals are interpreted. Big bets are being placed on these outcomes, and our use of social media is becoming a vital source of prediction about what we might do. It can be scary territory, as we are discovering, yet these signals are the new currency of digital value.

None of this is new, I realise. What I am thinking, though, is that this signals-focused approach could help L&D out of its data-free trench. If we stop fretting about proof and start analysing signals to increase our confidence, we might well get further faster. It seems we might also be talking the language of business in doing so.
