Cutting Through the Communications Data Wilderness: When Big Data Equals Big Answers


[Editor’s Note: In the first article (PRN, Nov. 7, 2017) of this five-part series, produced with partner PublicRelay, a media monitoring and analytics firm, we examined some of the challenges of measuring communications data so it can be turned into business insights. In this second article, we compile cases where communicators have moved beyond the fundamentals of measuring data to prove ROI and instead focused on using insights mined from data to change the way they operate their business.]

The Set-Up

Rich Silverman, Director, Communications, MUFG

The first case centers on MUFG, the largest bank in Japan and the world’s fifth largest. Yet it’s a brand that “needs quite a bit more enhancement, particularly in the U.S., to more effectively compete with global banking rivals,” says Rich Silverman, director, corporate communications. As such, the communications team’s goals have been to educate and inform audiences about MUFG and what it does.

MUFG uses a group of digital tools to inform its outbound strategy (proactive, reactive and opportunistic) and measure its success. “It’s not only about getting out a message, rather we must know if we’re reaching the right people, changing perception and seeing our brand mentioned in relevant online conversations,” Silverman says.

The Hurdle

In past years, MUFG had large delegations attend a major trade show. Team members were on panels, arranged to meet clients and hosted invitation-only events.

Insight from Data

Having its executives at the show proved useful, but working with data showed their presence wasn’t “influencing our broader brand awareness at a time when important audiences were watching,” Silverman says.

“Analyzing the data, we realized we were missing several key ingredients. First, our competitors extended their onsite presence by scheduling interviews with important media at the show.” In addition, while many MUFG executives were visible participants in the online conversation during the show, mentions of them weren’t originating from MUFG’s own digital properties. Instead, they were mentioned in posts, photos and graphics crafted by important journalists, analysts and other conference attendees.

MUFG knew of the importance of being seen online, and of being part of the dialogue. “So we went into the show with the idea that it wasn’t enough to just schedule media meetings, rather we decided to examine the data to see which reporters were leading the online conversation on a conference’s hashtag.” This way, MUFG was able to ensure it was not only mentioned in stories, but also was seen as an important part of the show. “The results of this change yielded major dividends for our brand,” Silverman says.

After the show, “We looked at individual-level data to gauge our success. Beyond a major uptick in brand mentions and other critical metrics, we also were able to share information with our stakeholders that can drive their business.” For example, MUFG let its internal clients know whether a prospect or client had seen an article in which they were mentioned.

Lessons Learned

Silverman considers this case “a true learning experience for us. It changed how we go to market with our communications and proved the connection between earned media and online conversation, reinforced the importance of a data-driven communications strategy and showed our work can directly influence our company’s business.”

Case 2

The next case also involves a trade show, CES, the gargantuan annual event for the global technology community. The show hosts more than 6,000 journalists who produce some 60,000 stories with potential impressions reaching in excess of 69 billion on show days alone.

The Issues

Jeff Joseph, SVP, Communications and Strategic Relationships, CTA

The scope of coverage presented challenges for the Consumer Technology Association (CTA), the show’s organizer, including how to discern the tone of media coverage and address negative stories, and how to monitor coverage and respond in real time, says Jeff Joseph, SVP, communications & strategic relationships, CTA.

Insight from Data

Partnering with PublicRelay, CTA assigned a tonality to each story using human review instead of a bot or other technology-based solution. “We believed the human review would do a better job of recognizing important nuances that a bot might miss,” Joseph says.

After each story had a tonality, CTA was able to compare overall tonality with competitive events. “We used this information to provide a data-based analysis of key media to demonstrate the overall value of our show to exhibitors and to drive corporate goals related to reducing negative coverage.”
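PublicRelay’s coding process isn’t detailed here, but the roll-up CTA describes, comparing overall tonality across events, can be sketched from human-coded labels. The function name, event names and story counts below are illustrative assumptions, not CTA’s actual data or method:

```python
from collections import Counter

def tone_summary(labels):
    """Summarize human-coded story tones into counts and a net-tone score.

    labels: list of per-story tone labels ("positive", "neutral", "negative").
    Net tone = (positive - negative) / total, a common roll-up metric.
    """
    counts = Counter(labels)
    total = len(labels)
    net = (counts["positive"] - counts["negative"]) / total if total else 0.0
    return {"total": total, **counts, "net_tone": round(net, 3)}

# Hypothetical coded coverage for two competing events
our_show = ["positive"] * 70 + ["neutral"] * 20 + ["negative"] * 10
rival_show = ["positive"] * 50 + ["neutral"] * 30 + ["negative"] * 20

print(tone_summary(our_show)["net_tone"])    # 0.6
print(tone_summary(rival_show)["net_tone"])  # 0.3
```

Comparing the net-tone scores event-to-event is the kind of data-based analysis CTA says it shared with exhibitors.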

A dashboard allowed CTA to monitor print and digital coverage in real time. At the show, communications teams could see the dashboard on a large screen inside their offices. A PublicRelay staffer provided daily briefings at the show.

Lessons

Tracking data, CTA was able to “identify coverage trends and adapt our messaging as necessary.” For example, CTA could see specific product segments were receiving more coverage than others. That trend prompted CTA “to move to provide more accessible information” about those products to media to help amplify the coverage. CTA also used data to provide reports to key exhibitors driving those trends and “adjusted messaging and promotion around other trends we wanted to push,” he says.

At show close, CTA shared data with key exhibitors and keynoters to help them track show value from a media engagement perspective.

These positive experiences led CTA to augment its use of data. For example, “We use tools to help us measure our coverage on policy issues,” Joseph says. It tracks message engagement and compares tone and quality of coverage vs. other associations. “We now include the data charts in our reports to internal senior staff and our board.”

Case 3

The next case comes from SAP, the multinational that produces enterprise software to manage business operations and customer relations.

The Issue

Jerry Nichols, Global Head, Marketing Performance Management, SAP

By 2020, SAP aspires to be a top 10 brand. To meet that goal, “It was imperative that we execute our 2016 strategy and meet our targets for key performance indicators (KPIs),” says Jerry Nichols, global head of marketing performance management.

A key component of the 2016 strategy was Run Simple, a global, omni-channel advertising campaign. The challenge was to provide “a readout of the market impact of Run Simple.”

To quantify Run Simple and optimize customer experience across paid, owned and earned media, Nichols and his team implemented a measurement approach aligned to the Barcelona Principles 2.0 and the customer journey.

Run Simple’s goal was to generate market awareness and demand for SAP’s technologies. A critical component was defining the customer journey in alignment with SAP’s paid, owned and earned media channel strategies and the campaign’s goals. Performance measures, associated sources and comparison types were identified across paid, owned and earned media channels. These included: digital metrics, social media monitoring, brand health, media coverage and demand generation performance.

Voice Recognition Software: The success CTA had with data at CES led it to measure policy issues. Above is a graphic example of share of voice. Source: CTA

Data Insights

It’s a shame that some communicators shy away from data because they perceive it as too complex. This case shows how wrong that attitude can be. One of the lessons of this case, Nichols says, is that “a collection of minor improvements [found via data] can culminate in” big results.

“We looked for outliers,” he says, measurements that were overly good or bad. For example, early in Run Simple, data showed a landing page had a poor click-through rate. That outlier, Nichols says, prompted analysis. The fix was simple: Graphics were changed slightly to better alert users to click through; results were significant.
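Nichols doesn’t describe SAP’s screening method; a minimal sketch of one common way to flag such outliers is a z-score screen over per-page click-through rates. The page names, rates and 1.5-sigma threshold below are illustrative assumptions:

```python
import statistics

def find_outliers(metrics, threshold=1.5):
    """Flag pages whose click-through rate sits far from the campaign mean.

    metrics: dict mapping page name -> click-through rate.
    Returns {page: z-score} for pages beyond threshold standard deviations.
    """
    rates = list(metrics.values())
    mean = statistics.mean(rates)
    stdev = statistics.stdev(rates)  # sample standard deviation
    return {page: round((rate - mean) / stdev, 2)
            for page, rate in metrics.items()
            if abs(rate - mean) > threshold * stdev}

# Hypothetical landing-page CTRs; one page clearly underperforms
ctr = {"page_a": 0.031, "page_b": 0.028, "page_c": 0.030,
       "page_d": 0.029, "page_e": 0.004}
print(find_outliers(ctr))  # flags only page_e, with a negative z-score
```

A page flagged this way is exactly the kind of “overly good or bad” measurement Nichols describes sending for deeper analysis.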

Case 4

The final case, like the first, involves a bank. The bank, whose name we agreed to omit, is a major financial institution; however, this case came from a time when it was seen as a contributor to the country’s financial crisis. Some were arguing that the bank should be shuttered.

The Hurdle

The bank knew it was working in various ways to resolve the country’s financial crisis, not to deepen it. Its communicators used data to tell positive stories strategically, says David Chamberlin, SVP, CCO, PNC Financial Services Group, who was associated with the unnamed bank described in this case study. “We needed to understand in a very granular way what was happening, literally on a daily basis, to our reputation,” he says. “We needed to understand the opportunities we knew about (when positive stories were written) and those we hadn’t seen.”

Use of Data

The bank’s communicators used a measurement tool to pull in countless stories daily, in near real time. Micro-campaigns were designed based on what this robust measurement effort yielded.

A key insight the bank gleaned from data: “There was a huge difference between talking to national and local media,” he says. “When we talked to local media our message was picked up 90% of the time. With national media, it was next to zero.”

The bank realized “local media was a much better plan for us and the brand. We would never have known this without the research and the data,” Chamberlin says.

“It wasn’t just a finger in the wind as to which paper or reporter we felt was most positive toward us.” When reporters were positive, he says, “The question became how do you work with him/her so that they have even more materials when they talk about you?” The bank decided not to spend much time courting negative reporters, he says. Instead it concentrated on positive reporters and those in the middle, who had positive and negative attitudes toward the bank.
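The bank’s tooling isn’t named; one simple way to operationalize that positive/middle/negative triage is to bucket reporters by the average tone of their coded coverage. The scoring scale, thresholds, minimum-story rule and reporter names below are illustrative assumptions, not the bank’s actual method:

```python
def tier_reporters(story_tones, min_stories=3):
    """Bucket reporters by average tone of their human-coded coverage.

    story_tones: dict mapping reporter -> list of per-story tone scores
                 (+1 positive, 0 neutral, -1 negative).
    Reporters with fewer than min_stories coded stories stay "unrated".
    """
    tiers = {"positive": [], "middle": [], "negative": [], "unrated": []}
    for reporter, tones in story_tones.items():
        if len(tones) < min_stories:
            tiers["unrated"].append(reporter)
            continue
        avg = sum(tones) / len(tones)
        if avg > 0.25:
            tiers["positive"].append(reporter)
        elif avg < -0.25:
            tiers["negative"].append(reporter)
        else:
            tiers["middle"].append(reporter)
    return tiers

# Hypothetical coded coverage per reporter
coverage = {"reporter_a": [1, 1, 0, 1], "reporter_b": [-1, -1, 0],
            "reporter_c": [1, -1, 0, 0], "reporter_d": [1]}
print(tier_reporters(coverage))
```

Under this sketch, outreach effort would go to the "positive" and "middle" buckets, mirroring the bank’s decision to skip courting consistently negative reporters.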

The bank also measured its spokespeople. The data showed a key executive was one of its most negative spokespeople. The reason: The executive was repeating negative questions in his responses, so his answers registered as negative. [The next article in this series will appear in the December 21 edition.]

CONTACT: david.chamberlin@pnc.com; JJoseph@cta.tech; jerry.nichols@sap.com; rsilverman@mufg.us.jp