Human Touch Is Most Vital Part of Tracking Social Media Metrics

Angie Jeffrey

The growth in the analysis of social media has been driven by the sheer volume of data it generates, allied to the need for greater scrutiny of the Internet. From within that greater scrutiny has emerged an equally compelling need: To get to the truth of how social media influences behavior and determines buying trends, voting intentions, reputation and tone. According to Clarabridge, which specializes in Intelligent Customer Experience Management (CEM), social media generates 2,500,000,000,000,000,000,000 bytes of data each day. Whew.

By any measure, that is a substantial volume of data that requires sifting, weighing and analyzing. Much of it, of course, is of little consequence and even less influence.

But therein lies a key challenge for the PR community: Identifying the real signals among all that swirling white noise and deriving actionable meaning from them.

With the availability, and increasing use, of sophisticated online analytics tools to handle these trillions of bytes of data, you can be forgiven for believing that automated metrics can actually make sense of it all and serve as a silver bullet.

After all, media analysis and measurement have moved on from the days when banks of readers monitored the newspapers and clips were cut by hand. Automation is the new order, rendering human intervention in media measurement a quaint, and unnecessary, throwback to the 20th century.

But there is a serious flaw in this argument. While computers excel at handling large volumes of data, they have never been able to make sense of it. Seventy years ago, the forerunners of today’s computers finally succeeded in cracking wartime ciphers such as Enigma. But even that breakthrough, which changed the course of World War II, still required a substantial dose of human intuition and pattern recognition.

Ironically, our industry has come full circle. Real-time, human analysis continues to provide the critical edge, but this time in understanding the sentiments and nuances that are interwoven into social media communication. Human insights remain at the heart of the process.

THE HUMAN EYE

At Salience Insight, our strategy is to use computers for what they are really good at: Crunching massive volumes of data very rapidly, and identifying coverage about people and topics. When detailed and accurate analysis is required, however, our investment in recruiting and training teams of skilled analysts pays dividends.

A prime example of this is our collaboration with Ketchum and its client Royal Philips from the Netherlands. Since 2009, we have delivered a service that provides effective, accurate and consistent measures of success for Philips’ diverse and global PR activities.

The measurement program is designed to drive business processes through measurement-driven planning and management. It is built on a unified global research platform that consolidates all content collection and the work of decentralized in-country coding teams: computers are used to automate the selection of coverage, while the analysis itself is undertaken by the human eye.

Only trained native-language analysts can make accurate judgments about the idiosyncrasies in the way language is used from country to country, and only human analysts can provide the level of accuracy needed to feed into Philips’ communications KPIs. In partnership with Ketchum, we have developed a compound metric that includes an analysis of the presence and alignment of messages to varying degrees of subtlety, as well as the type and sentiment of coverage.
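To make the mechanics concrete, here is a minimal sketch of how such a compound score might be computed. The weights, scales and field names are hypothetical assumptions for illustration only, not the actual metric developed with Ketchum for Philips.

```python
# Hypothetical sketch of a compound coverage metric. The weights, scales and
# field names are illustrative assumptions, not the actual Ketchum/Philips KPI.
from dataclasses import dataclass

@dataclass
class CodedItem:
    """One media item as coded by a human analyst."""
    message_present: bool   # was a key corporate message present at all?
    alignment: float        # 0.0 (off-message) to 1.0 (fully aligned)
    coverage_type: str      # e.g. "feature", "news", "passing mention"
    sentiment: float        # -1.0 (critical) to +1.0 (supportive)

# Illustrative weights for how heavily each type of coverage counts.
TYPE_WEIGHTS = {"feature": 1.0, "news": 0.7, "passing mention": 0.3}

def compound_score(item: CodedItem) -> float:
    """Combine message presence, alignment, coverage type and sentiment into a 0-100 score."""
    presence = 1.0 if item.message_present else 0.0
    type_weight = TYPE_WEIGHTS.get(item.coverage_type, 0.5)
    sentiment = (item.sentiment + 1.0) / 2.0  # shift from [-1, 1] to [0, 1]
    return 100.0 * type_weight * (0.5 * presence * item.alignment + 0.5 * sentiment)

if __name__ == "__main__":
    item = CodedItem(message_present=True, alignment=0.8, coverage_type="feature", sentiment=0.5)
    print(f"Compound score: {compound_score(item):.1f}")  # 77.5
```

The point of a single blended score is simply to let diverse markets and campaigns be compared on one consistent scale, with the human-coded inputs doing the heavy lifting.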

Now, this might imply that many automated systems are not yet fit for purpose. Certainly, past research has highlighted their inaccuracy relative to human analysis, but that does not preclude the use of such systems, nor does it mean they should be discredited. The real issue is to define more accurately the purpose for which computers are fit.

A HYBRID APPROACH

As it turns out, and in our experience, the most valuable automated data mining is based on a combination of human input and technological expertise. Automated systems are most likely to be successful when the outputs are framed by accurately selected inputs. Typically, these are the keywords used to tag, filter and classify the automated search results. Automated systems that work the best—by offering accurate and efficient data retrieval—are those that measure quantitative metrics and identify who is talking about your product or service.
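As a minimal sketch of what that framing can look like in practice, the snippet below tags and filters incoming posts against analyst-selected keywords. The keywords, categories and sample posts are invented purely for illustration.

```python
# Minimal sketch of keyword-driven tagging and filtering of automated search
# results. The keywords, categories and sample posts are invented for illustration.
from typing import Dict, List, Tuple

# Analyst-selected keywords frame what the automated system keeps and how it is classified.
KEYWORD_TAGS: Dict[str, List[str]] = {
    "product": ["acme widget", "widget pro"],
    "service": ["acme support", "helpdesk"],
    "competitor": ["rival corp"],
}

def tag_post(text: str) -> List[str]:
    """Return the categories whose keywords appear in a post (case-insensitive)."""
    lowered = text.lower()
    return [tag for tag, keywords in KEYWORD_TAGS.items()
            if any(keyword in lowered for keyword in keywords)]

def filter_posts(posts: List[str]) -> List[Tuple[str, List[str]]]:
    """Keep only the posts that match at least one analyst-selected keyword."""
    tagged = [(post, tag_post(post)) for post in posts]
    return [(post, tags) for post, tags in tagged if tags]

if __name__ == "__main__":
    sample = ["Loving my new Acme Widget!", "Nothing to do with the brand at all."]
    for post, tags in filter_posts(sample):
        print(tags, "->", post)  # ['product'] -> Loving my new Acme Widget!
```

The quality of the output here is only ever as good as the human judgment that went into choosing the keywords in the first place.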

Using automated outputs to understand the nuances of online and social media discussions and to analyze more complex themes is less effective. Why? Because automation cannot rely on company and product brand names for a large part of its data selection.

If the chart above had been generated by an automated tool, it would likely show most of this news coverage as neutral in tone. Adding human analysis, however, reveals that most of it is critical in nature and that only a small portion is factual or supportive. Furthermore, it illustrates how one peak of coverage results from another.
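One way to picture this hybrid division of labor is an automated first pass that codes only the clear-cut items and escalates anything ambiguous or ‘neutral’ to a trained analyst. The sketch below is a hypothetical workflow, not any specific vendor’s tool; its word lists and confidence threshold are invented for illustration.

```python
# Hypothetical sketch of a hybrid workflow: an automated first pass codes the
# clear-cut items, and anything ambiguous or "neutral" is escalated to a trained
# human analyst. The word lists and threshold are invented for illustration.
from typing import List, Tuple

POSITIVE_WORDS = {"great", "love", "recommend", "impressed"}
NEGATIVE_WORDS = {"fail", "broken", "disappointed", "refund"}

def automated_sentiment(text: str) -> Tuple[str, float]:
    """Crude lexicon-based score: returns a (label, confidence) pair."""
    words = set(text.lower().split())
    pos, neg = len(words & POSITIVE_WORDS), len(words & NEGATIVE_WORDS)
    if pos == neg:
        return "neutral", 0.0  # no clear signal either way
    label = "supportive" if pos > neg else "critical"
    return label, abs(pos - neg) / (pos + neg)

def triage(posts: List[str], threshold: float = 0.5) -> Tuple[list, list]:
    """Split posts into automatically coded items and a queue for human review."""
    auto_coded, human_queue = [], []
    for post in posts:
        label, confidence = automated_sentiment(post)
        if confidence >= threshold:
            auto_coded.append((post, label))
        else:
            human_queue.append(post)  # route to a trained analyst for coding
    return auto_coded, human_queue
```

In this kind of arrangement the machine does the volume work, and the human analysts are reserved for exactly the items where nuance matters most.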

‘NCIS’ FOR PR

We’re undertaking some interesting studies in the field of social media landscaping and discovery, aimed precisely at providing a “semantic” framework, against which clients can develop quantitative analytics. This is particularly valuable in non-English language environments, where automated analytical tools have an even less glittering track record for accuracy.

Elsewhere, The Social Media Conclave, a broad coalition of B2B and B2C companies, industry associations, and PR and social media agencies, published its first proposed interim standard last June.

The “Sources & Methods Transparency Table” is designed to address the challenges clients face in knowing “what’s inside” social media measurement reports from various agencies, research providers and software vendors.

This is an encouraging development, as it marks a recognition that so-called sentiment analysis in social media needs to be addressed in a measured and scientific way.

Other collaborative work extends to defining social media data collection in the area of ‘Reach and Impressions’, with the objective of calculating subsequent metrics consistently and, potentially, of establishing other standards.

These efforts are still a work in progress, however. This is hardly surprising, given that social media has shaped a new global communications phenomenon. But whether automated systems can ever replace the safety net provided by human input, and offer the same (or better) levels of insight, is a subject for another debate.

For the foreseeable future, we believe that critical insight will continue to rely on specially trained human analysts to generate the forensic detail that is increasingly demanded of social media measurement.

Human intervention is the critical last link in fitting the missing pieces into the evaluation jigsaw. We used to read newspapers; now we analyze data. PRN

(Giselle Bodie, CEO of Salience Insight, contributed to this article.)

CONTACT:

Angela Jeffrey is managing director U.S. for Salience Insight. She can be reached at [email protected].

This article appeared in the September 23 issue of PR News.