Albert J. Barr
Using computers to analyze media coverage is useful, and at times remarkable. Technology has come a long way since I got into the media analysis business in 1984. Back then, I designed a system in which all of our research analysts used typewriters with paper forms. The fields being researched were printed in orange ink on 8 1/2 x 11 sheets of paper, and the data gleaned from each article was typed, of course, in black. There was enough room on one sheet of paper for three articles.
The challenge was how to get the text on a sheet of paper into a computer database. To save time and automate the process, thousands of sheets of paper were fed into a scanner. The scanner could read the typed data, but it could not see the orange ink that carried the field names and boxes for typing in the data (media name, favorability, issues, messages, etc.). I can remember clients visiting our offices in Washington, D.C., and being wowed by this creative, state-of-the-art idea as they watched the scanner input information from thousands of researched articles, thus replacing human typists for data entry.
Well, we've advanced light years since then. In the early days, you could count the number of media analysis companies in the U.S. on one hand. With the advent of the Internet and tremendous advances in computer hardware and software, this has become a large and highly competitive business.
At CARMA International, we believe in technology. Computers can handle huge amounts of information efficiently. There is no way that humans can keep up with that kind of pace.
However, while computers are fast and relatively accurate, they still can't pick up sarcasm and the many other nuances that appear in media coverage. This is why I believe there is still a strong need for human intervention, both in measuring and in interpreting what all this coverage means to the companies, governments and other organizations that need to know what's been said about them.
I believe, at least for now, there has to be some kind of compromise between using computers to digest millions of bits of information and humans to help analyze and interpret their meaning.
A good way to do this is to use the same approach that survey research firms have been using ever since they started polling. If you are using an automated service to analyze thousands, or even hundreds of thousands, of mentions in both traditional and social media, it still makes sense to get a good statistical sample from this base and have real people measure and interpret the sample. This way you can get dynamic results along with professional advice about what's being said, emerging trends and a much more accurate measure of media sentiment.
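As a rough illustration of the sampling approach described above (not CARMA's actual methodology; the numbers and function names here are hypothetical), one could draw a simple random sample of mentions for human coding and attach a margin of error to the sentiment the analysts measure:

```python
import math
import random

def sample_for_human_review(mentions, sample_size, seed=42):
    """Draw a simple random sample of media mentions for human coding."""
    random.seed(seed)
    return random.sample(mentions, min(sample_size, len(mentions)))

def sentiment_margin_of_error(favorable_count, sample_size, z=1.96):
    """95% margin of error for the share of favorable mentions
    (normal approximation to the binomial)."""
    p = favorable_count / sample_size
    return z * math.sqrt(p * (1 - p) / sample_size)

# Hypothetical base of 100,000 mentions; humans code a sample of 400.
mentions = [f"mention-{i}" for i in range(100_000)]
sample = sample_for_human_review(mentions, 400)

# Suppose analysts rate 260 of the 400 sampled mentions as favorable.
moe = sentiment_margin_of_error(260, 400)
print(f"Favorable share: 65% +/- {moe:.1%}")
```

The point of the sketch is that a few hundred human-coded mentions can bound the sentiment estimate for the whole base to within a few percentage points, which is why polling-style sampling makes the human layer affordable.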
George Fueschel, an IBM technician, coined the term "GIGO": garbage in, garbage out. The term calls attention to the fact that computers will process the most nonsensical input data (garbage in) and produce nonsensical output (garbage out). It was most popular in the early days of computing, but it applies even more today, when powerful computers can spew out mountains of erroneous information in a short time.
Quality control is our mantra at CARMA. I would strongly advise anyone using computers for media analysis to include a serious element of human intervention. It's insurance, so organizations don't waste precious funds on information that may prove to be of little to no use.
Albert J. Barr is the chairman and CEO of CARMA International. He has 40 years of experience in public relations, journalism and research. He founded CARMA in 1984 after foreseeing the value of media content analysis as a strategic and tactical measurement tool.