
Cicero said ‘the face is a picture of the mind.’ So what if technology gave the Communications industry the means to measure facial emotive response to online content on a mass scale and so create a platform for true ‘sentiment’ analysis?

I pose this scenario because I read this week that Tesco is installing ‘hundreds of hi-tech screens’ in petrol station forecourts that use facial tracking and recognition to serve targeted ads to consumers. The system uses built-in cameras to detect the age and sex of customers in real time, and that data determines which ten-second advert is shown to them.

Naturally, the media is dissecting this every which way, with a fair few outlets referencing the film Minority Report – an excuse, I’m sure, to slip in a nice picture of Tom Cruise.

As a technology enthusiast, I started thinking about the other applications this technology could have, particularly in the field of Communications.

One of the challenges in this industry is demonstrating the value of Communications output to the business. Over the last decade, the industry has seen a major shift away from the use of EAVs – equivalent advertising values – and many PR institutions have outlawed them outright. Try slipping an EAV metric into an award submission these days and it’s an automatic zero on your scorecard.

And rightly so. Whilst it does have that convenient ‘£’ sign, EAV is old-fashioned and doesn’t even account for where much of the ‘communicating’ happens these days – online.

We have lots of new metrics for measuring the reach and impact of a piece of online coverage, but where reporting still stumbles for the most part, especially in terms of accuracy, is sentiment analysis.

Sentiment analysis, or opinion mining, is the extraction of subjective information from your source material, be that an editorial article, a blog post or a forum comment. This is most often represented as ‘Positive’, ‘Neutral’ or ‘Negative’ – hardly the best means of summing up the spectrum of human emotion.

Where this evaluation also falls down is in letting a computer interpret the subtle nuances of the English language. Put ‘bad’ into UrbanDictionary.com and the first definition is ‘good’, but it can, of course, also mean bad. Try posing that as a logical argument in computer code.
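To make that concrete, here’s a minimal sketch (in Python, purely for illustration) of the naive word-list scoring that underpins a lot of ‘Positive / Neutral / Negative’ reporting. The word lists and example sentences are invented rather than taken from any real monitoring tool, and they show exactly how slang like ‘bad’ fools the machine.

```python
import re

# Illustrative word lists only - not drawn from any real monitoring product.
POSITIVE_WORDS = {"good", "great", "love", "brilliant"}
NEGATIVE_WORDS = {"bad", "awful", "hate", "terrible"}

def classify(text: str) -> str:
    """Bucket a piece of text as Positive, Neutral or Negative by counting words."""
    words = re.findall(r"[a-z']+", text.lower())
    score = sum(w in POSITIVE_WORDS for w in words) - sum(w in NEGATIVE_WORDS for w in words)
    if score > 0:
        return "Positive"
    if score < 0:
        return "Negative"
    return "Neutral"

print(classify("What a brilliant stunt"))        # Positive
print(classify("This stunt is so bad"))          # Negative - even if the commenter meant 'good'
print(classify("This stunt is bad, I love it"))  # Neutral at best; a human reads it as praise
```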

When it comes to YouTube, a domain where many PROs operate and where good content is king, we’re stuck with even more polarised sentiment analysis. Viewers can give a video a thumbs up or a thumbs down. Not much insight there either.

Going into the Analytics, we can get a sense of engagement and appreciation from how much of the video viewers watched – or how little – and whether they went on to share it.
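For what it’s worth, that proxy is crude enough to sketch in a few lines. The function and its weighting below are hypothetical – they aren’t YouTube’s actual metrics – but they capture the idea: completion rate plus a bonus for a share.

```python
def engagement_score(seconds_watched: float, video_length: float, shared: bool) -> float:
    """Crude engagement proxy: share of the video watched, plus a bonus for sharing.

    The weighting is invented for illustration - it is not a real YouTube Analytics metric.
    """
    completion = min(seconds_watched / video_length, 1.0)
    return round(completion + (0.5 if shared else 0.0), 2)

print(engagement_score(seconds_watched=45, video_length=60, shared=True))   # 1.25
print(engagement_score(seconds_watched=5, video_length=60, shared=False))   # 0.08
```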

What we can’t get, however, is their true emotion whilst watching the video. Ideally there would be a way to sit with all these viewers whilst they watched and see how they react: see the smiles and the frowns, hear the laughter, and feel the abuse they throw at the screen because our ‘cool stunt’ suddenly became too much like an advert.

Here’s where this new technology could come into play.

Take a look at the screen you’re reading this blog post on. Have a scan around the perimeter of the display. If you’re reading this on an iPad, a laptop or even a new desktop PC, chances are there’s a little webcam staring back at you.

And as more of us use mobile devices to consume content, we’re going to be spending more and more time with our faces looking at increasingly sophisticated cameras, whilst they stare, unblinking, back at us. Cameras that aren’t being used 95% of the time, but could be switched on in the background to create the world’s largest focus group and monitor of facial emotion.

Linked to our YouTube history would be our facial responses. Every laugh, chuckle or smirk would be captured, evaluated and time-coded to the specific video.

PROs would be able to report back on the exact moment their video triggered the biggest positive audience response, as well as the parts which didn’t make the grade. What’s more, combined with the socio-demographic data already available from YouTube, they could see how different types of people respond in different ways, and target those people specifically in their next piece of content if they were identified as the ‘fan’ base in need of attention.
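As a thought experiment, that reporting could look something like the sketch below: time-coded emotion scores per viewer, averaged to find the moment of peak positive response and then cut by age band. The data structure and numbers are entirely hypothetical – no vendor’s feed looks exactly like this.

```python
from collections import defaultdict

# Hypothetical feed: (seconds into the video, viewer age band, positive-emotion score 0-1).
responses = [
    (5.0, "18-24", 0.2), (5.0, "35-44", 0.1),
    (12.0, "18-24", 0.9), (12.0, "35-44", 0.4),
    (20.0, "18-24", 0.3), (20.0, "35-44", 0.7),
]

# Average positive response at each timestamp, across all viewers.
by_time = defaultdict(list)
for timestamp, _age, score in responses:
    by_time[timestamp].append(score)
peak = max(by_time, key=lambda t: sum(by_time[t]) / len(by_time[t]))
print(f"Biggest positive reaction at {peak}s into the video")

# The same data cut by age band shows which audience the content landed with.
by_age = defaultdict(list)
for _timestamp, age, score in responses:
    by_age[age].append(score)
for age, scores in sorted(by_age.items()):
    print(age, round(sum(scores) / len(scores), 2))
```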

With a restrained approach that didn’t dive, Big Brother style, into the private space of every consumer, these cameras could be used to record, process and evaluate the emotive experience of online content on a massive scale.

There are a few companies pioneering this technology already. RealEyes and nViso are two key players, both offering solutions that turn ‘emotion into insight’, and both are getting big brands on board.

RealEyes, whose client list includes Sony, eBay and Nokia, promises automated access to the emotional responses of more than 7 million people across the world through regular webcams.

One of their studies, based on emotional analysis of over 100 viral videos, shows that where there is an increasing trend in happiness and surprise, the video performs better on view duration and engagement, and that when a story resonates with the audience, it attracts more views which convert into actions 10 times better.

In 2012, nViso’s 3D Facial Imaging technology was used in a ‘spy game’ created to promote the TV series Hunted. The campaign went viral, and in the first 10 weeks the emotional responses of 1.3 million players were analysed. In this example, it’s not so much the technology that impresses as the potential to analyse the emotional state of a massive audience at scale.

If YouTube were to embed this technology as part of the available metrics for watching a video on its site, brands could quickly gain emotional insight into its more than 1 billion monthly unique visitors.

So if facial recognition technology can be used to ‘improve’ the consumer experience through personalisation, as Tesco is rolling out, then, with a fair few caveats in terms of consumer privacy, why shouldn’t it also be applied to understand consumers better?

Who knows, the first person to patent ‘FaceLook’ might be the next dotcom billionaire.

Be Bold.
