
Big data needs big thinkers

People in Learning and Development love big data, or at least the concept of big data. It’s a perennial fixture of key trend lists, and we’re warned to ignore it at our peril. But there is risk involved in the L&D community viewing the collection of data as an end in and of itself.

The appeal of ‘mathiness’

It’s easy to be seduced by big data. The reason for widespread data fetishism is its sense of empirical authority, or what economist Paul Romer would call ‘mathiness’. The trouble with the ‘mathiness’ that comes from capturing reams of learner data is that it creates the false impression that you know more about learners than you actually do.

The false certainty that comes from amassing ever more exhaustive data can lead people to think that responsibility for decision-making can be handed over to mathematical formulas. This could narrow the scope for innovation in learning technology: organisations could end up judging success on the metrics that are easiest to measure, or on misguided inferences drawn from them.

And it’s very difficult to change course if you’ve already made a decision. In ‘The Black Swan: The Impact of the Highly Improbable’, Nassim Nicholas Taleb explained that ‘when you develop your opinions on the basis of weak evidence, you will have difficulty interpreting subsequent information that contradicts these opinions, even if the new information is obviously more accurate.’

A blind faith in the power of data is risky as it’s easy to make wrong inferences. An example of an over-reliance on data nearly leading to catastrophe took place in 1983, during the Cold War. The Soviet Union had developed the Oko nuclear early warning system to detect a US missile strike. Stanislav Petrov was the duty officer at the Oko command centre on September 26th 1983, when the system reported that the US had launched five missiles.

Petrov was instructed to order a nuclear response if this happened. But he didn’t. He let his own reasoning override the state-of-the-art system, and he was proved right in doing so. It turned out that the system had mistaken a rare alignment of sunlight reflecting off high-altitude clouds for missile launches. If Petrov had obeyed the data instead of thinking critically, the outcome would have been unimaginable.

The importance of human insight

While exercising caution is wise, this isn’t to say that data can’t tell us a huge amount about learners. Companies are using the information they gather on customers to do amazing things. Andrew Pole, a data analyst at US retail giant Target, was tasked with using data to persuade pregnant customers to make all their baby-related purchases at Target’s stores. The ability to identify pregnant shoppers is lucrative: people’s shopping habits are deep-rooted and only change during significant shifts in lifestyle, the arrival of a newborn being a prime example. If Target could get pregnant customers to buy baby goods at their stores, they’d do all their other shopping there too, out of convenience.

Pole was able to spot patterns emerging in pregnant women’s shopping habits. They switched to scent-free soap, and began buying vitamin supplements, cotton balls and hand sanitiser. Based on Pole’s model, Target began sending out leaflets promoting baby products to customers who fitted the profile.
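To make the idea concrete, here is a minimal sketch of how a profile-based score like this might work. The products, weights and threshold below are invented for illustration only; Pole’s actual model reportedly drew on around 25 products and far more sophisticated statistical weighting.

```python
# Toy illustration only -- not Target's actual model.
# Products, weights and the threshold are invented for this sketch.

PROFILE_WEIGHTS = {
    "scent-free soap": 2.0,
    "vitamin supplements": 1.5,
    "cotton balls": 1.0,
    "hand sanitiser": 1.0,
}
SEND_LEAFLET_THRESHOLD = 3.0  # hypothetical cut-off for mailing baby-product offers


def profile_score(recent_purchases: set) -> float:
    """Sum the weights of profile items found in a customer's recent purchases."""
    return sum(weight for item, weight in PROFILE_WEIGHTS.items()
               if item in recent_purchases)


def fits_profile(recent_purchases: set) -> bool:
    """Return True if the customer's purchases cross the (invented) threshold."""
    return profile_score(recent_purchases) >= SEND_LEAFLET_THRESHOLD


# Example: a customer who switches to scent-free soap and starts buying
# vitamin supplements scores 3.5 and would be sent the leaflet.
print(fits_profile({"scent-free soap", "vitamin supplements", "bread"}))  # True
```

The point of the sketch is simply that the model flags anyone whose purchases fit the pattern, with no awareness of context, which is exactly where the trouble began.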

A problem arose precisely because Pole’s model was effective. One day, an angry customer visited his local Target outlet demanding to speak to the manager. He wanted to know why his daughter was receiving coupons for baby products when she was still in high school, and accused Target of encouraging teenage pregnancy. The manager dutifully apologised, but received a call from the man a few days later. The customer explained that ‘there have been some activities in my house that I haven’t been completely aware of,’ and that his daughter was due in August.

This raised the question of what level of personalisation the general public would accept. People are increasingly concerned about their privacy, and Target realised that using data to predict women’s pregnancies could be slightly unsettling. Of course, the data alone couldn’t work this out – data can’t predict people’s emotional reactions.

Target needed a way of using Pole’s data without making customers feel that they were being spied on. So they included offers for lawnmowers and other irrelevant products amongst the real offers for baby goods. This created the illusion that the items were chosen at random. The approach worked – Target’s revenues grew from $44 billion to $67 billion, largely as a result of their focus on the baby market.

The key thing to take away from this is that the solution only became effective once contextual information was considered and humanity was injected into it. There’s no doubt that learning management systems should report on more than just completion data. But as Target recognised, without a keen understanding of psychology and skilled human intervention, data isn’t particularly useful.