New MarketingProfs Post Up

I wrote a post for the MarketingProfs Blog last week about what’s happening in the world of media research, and have re-posted it here.  See the original at:

http://www.mpdailyfix.com/rethinking-normal-the-newest-in-marketing-research-from-the-arfs-annual-conference/

03.29.10

Rethinking Normal: The Newest in Marketing Research From the ARF’s Annual Conference

I attend a lot of marketing conferences where I hear over-excited pitch people telling me all about The New Thing that will Change Every Paradigm Forever.  So much over-enthusiasm can jade just about anyone, so it was with relief that I joined a much more sober group for their conference.  I spent the last few days at the Advertising Research Foundation’s (ARF) re:Think 2010 conference in New York City.  I found, however, that even here among the stodgiest of marketing researchers, there’s talk of … a paradigm shift.

In typically understated fashion, the theme of this conference is “The New Normal.”  Apparently, the world went and changed without consulting us.  Bob Barocci, president of the ARF, put it thus: “If we accept the new normal, we have to change the way we think, and that’s hard and scary.  We deny its existence, but can we deny its signals?”

Many of these signals of change are very real, but they aren’t paradigm shifts in and of themselves.  Some examples that came up during the conference: newer waves of immigrants are replacing older ones, consumer spending is down, and digital media fragmentation keeps making it harder for marketers to find captive mass audiences.  Most of this is old news.  What I find fascinating is how the entrepreneurs of marketing research are “thinking different” and responding to these shifts.

There tend to be two schools of thought on how to proceed in the new normal.  Big spenders and traditional media stalwarts keep pursuing the elusive single metric that can be used to dispassionately, automatically, and accurately sort the entire media universe into a range of good-to-bad investment choices.  Smaller, more cash-strapped marketers tend to take a very different approach, and simply find whatever metric suits them for the moment.  In a presentation for Decipher on guerrilla marketing measurement strategies, Kristin Luck encouraged those present to get as creative with their measurement as they are with their marketing.  She makes a convincing argument for using mobile to help track the spread and effectiveness of marketing materials by word-of-mouth.  Luck advocates tactics like survey widgets in mobile platforms, GPS tracking, pass-along pings, tracking mobile bar-code coupon redemption, and mobile downloads.  While researchers can spend all day arguing about the continued relevance of metrics like GRPs, it’s clearly important not to let that stop anyone from taking advantage of all the new metrics at our disposal.

Back in the days of analog media, researchers made do with sample data generated by research companies.  The complaint with research was more often that the data was too limited to reflect reality.  These days, census-level data tracks billions of people in real time, and now the problem is figuring out what to do with all this data.  Most analytics departments are simply overwhelmed with the amount of data coming in.  It was pointed out during a panel that Walmart collects data on 100 million transactions per hour.  Traditional research methods simply aren’t designed to handle this level of data.

Stepping up to this challenge in a rather innovative way, Discover Card uses text-mining software to analyze and quantify the content of its customer service calls, compare the frequency of certain kinds of complaints, and even match this against the incidence of chatter about the brand on blogs and social networks.  By converting speech to text and analyzing this huge source of data, Discover can get a much more accurate idea about whether a specific complaint is being generated by a few lone individuals, or if it truly is a widespread problem deserving of attention.
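To make the idea concrete, here’s a minimal sketch of what counting complaint themes across call transcripts might look like.  This is not Discover’s actual pipeline; the categories, keywords, and transcripts are all invented, and a real system would use proper text-mining rather than simple substring matching.

```python
from collections import Counter

# Invented complaint categories and keywords, purely for illustration.
CATEGORIES = {
    "billing": ["late fee", "overcharge", "billing error"],
    "rewards": ["cashback", "points", "reward"],
    "website": ["site down", "login", "password"],
}

def tag_transcripts(transcripts):
    """Count how often each complaint category shows up across transcripts."""
    counts = Counter()
    for text in transcripts:
        lowered = text.lower()
        for category, keywords in CATEGORIES.items():
            # Count each category at most once per call.
            if any(kw in lowered for kw in keywords):
                counts[category] += 1
    return counts

# Hypothetical (speech-to-text) call transcripts.
calls = [
    "I was hit with a late fee even though I paid on time.",
    "Your site down again? I could not even get to my account.",
    "The overcharge on my statement is clearly a billing error.",
]
print(tag_transcripts(calls))  # Counter({'billing': 2, 'website': 1})
```

Run the same tagging over blog posts or tweets, and the two frequency counts become directly comparable, which is the point of the comparison the post describes.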

Two great examples of how qualitative research is adapting to this new norm of overwhelming data came from CBS’s David Poltrack and Global Park’s Dan Coates.  Both have accepted the fact that “pure” research has its place, but a research program that incorporates itself into the data stream of everyday life makes a lot of sense.  Global Park creates online communities for companies that act as CRM tools to keep customers engaged, while simultaneously mining the communities for product insight and feedback on marketing.  Coates calls this approach “large-scale listening with accuracy”.  He is also quick to point out that it isn’t representative of the general population, but the feedback is authentic in large part because “the community goes on even when the research stops; 95% of communication is repetitive management, 5% is real research probing the community.”  He also points out that this approach can make research more enjoyable for the subject by making survey vehicles less boring.

CBS has taken this approach to a new extreme by opening a research facility in Las Vegas that’s as much about entertaining people as it is about research.  Las Vegas is unusual in that it’s possible to meet someone there who traveled from just about any town in the United States.  At their research facility, they invite the public in to watch new material, running 700+ focus groups, 365 days a year.  Poltrack calls this approach “immersion research.”

This combination of new streams of data, new methods of data collection, and more powerful processing tools means that marketing research, like media itself, is becoming a real-time endeavor.  As software takes up the challenge of collecting and processing data, researchers can focus more on finding insights and affecting business decisions.  Who knew research could be so exciting?

When the law is stupid, how do we make it smarter?

Governments tend to be pretty abstract until they interfere directly in one’s life (war, taxes, tickets).  Some recent government intervention on behalf of San Francisco’s finest got me thinking about how difficult it is for government regulation to keep up with the ever-increasing speed of technological development.

I first started using Google Maps on my BlackBerry Curve about three years ago.  I graduated to the iPhone 3G about two years ago, and bought a cradle for my car’s dashboard the first time I took a road trip.  I haven’t been lost since.  As I drive through the 3-D world, I watch my virtual blue-dot self progress through the 2-D world of Google Maps right along with me.  95% of the time, this system works well (the other 5% of the time, Google thinks I’m in a lake).  It never occurred to me that I might be breaking a law.

One sunny afternoon in November, I programmed my destination into Google Maps, pulled out of my spot, and was off on my way across town.  Shortly before I got to my destination, I found myself waiting in a long line of cars inching through a stop sign, so I pulled the iPhone out of the cradle to get a better look at the last turn on the map.  Upon doing this, a policewoman on a motorcycle rode up to my open window and told me to pull over.  Mystified, I asked what I had done.  She told me I was texting while driving.  I showed her the map; she noted on the ticket that I was “texting (map)” and told me to have a nice day.

I stopped her and asked, “Would you have pulled me over had I been looking at a paper map?”

“No.”

“Did it seem dangerous that I was looking at my iPhone while stopped in traffic?”

“No.  Look, it’s a screwy law.  I’m just telling everyone to fight it.”

“Huh?”

Today, three months later, I went in front of a judge (your tax dollars at work) to fight this thing.  Promptly at 3 p.m., a rag-tag bunch of people filed into traffic court B.  Ten minutes of droning rules later, the first item of business is: me.  They call my name, tell me I’m dismissed and to see the clerk, and I’m out.  Wham, bam, no fine, no argument, no case, and no explanation why.  I can only assume I’m not the only one who thinks the law isn’t keeping up with reality.

Now that it’s on my mind, I’m suddenly seeing examples of this trend everywhere.  Flipping through wired.com this afternoon I came across a story about how RealNetworks is giving up on an attempt to keep software that copies DVDs legal. Here are the good bits:
1. “Copying DVDs amounts to ‘theft,’ the MPAA’s general counsel, Daniel Mandil, said Wednesday.”
2. “It’s OK to copy music from CDs, for example, and place it in an iPod. Yet, it’s illegal to do the same with a DVD. When it comes to the DVD, there’s not even a question of fair use allowed under copyright law.”
3. Judge Patel, in her ruling in the RealNetworks case, said “while it may well be fair use for an individual consumer to store a backup copy of a personally owned DVD on that individual’s computer, a federal law has nonetheless made it illegal to manufacture or traffic in a device or tool that permits a consumer to make such copies.”

To be fair, both sides have totally valid points.  The law says that fair use is ok.  I can legally create a back-up copy for myself of media I purchase. However, it is illegal to write software or manufacture a product that is actually able to create a DVD copy, because media companies are afraid digital copies will lead to piracy.  So, to recap, it’s completely legal to have DVD copies, as long as you don’t actually attempt to copy a DVD.  Huh?

Intellectual and physical property needs to be protected from theft and carelessness.  I don’t want teenagers texting while driving any more than I want media companies to go out of business (at least not the ones I work for).  However, the combinations of bits of intellectual property, atoms of media devices, and user actions that create an illegal or careless act are too often confused by lawmakers.  It’s legitimately confusing how easily bits, atoms, and actions can be combined and reconfigured in any number of permutations, some legal, some not.

Is it possible that the law itself must adapt?  Law itself tends to be so binary (this is right, this is wrong) that it’s unable to handle complexity.  We see so many instances where academic disciplines combine forces to create entirely new cross-disciplinary methods of handling complex problems much better than any isolated school of thought could do on its own.  Perhaps it’s time for a cross-disciplinary team of mathematicians, scientists, and lawyers to get together and come up with a new kind of adaptive law.

Imagine a machine that calculates your fine for speeding using an algorithm that weights dozens of factors simultaneously…  Person A, going 60 mph in a 30 mph zone in the middle of the night with no other traffic = $5 fine.  Person B, going 90 in a 30 during the day with children present = $300 fine.  Person B, going 45 in a 30 while taking an injured friend to the hospital without causing injury to others = $10 rebate.  I could go on…
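The thought experiment above is easy enough to sketch as a toy scoring function.  Everything here is invented for illustration: the factor names, the weights, and the dollar amounts are assumptions, not a proposal for actual sentencing math.

```python
def speeding_fine(speed, limit, night=False, traffic=False,
                  children_present=False, medical_emergency=False,
                  caused_injury=False):
    """Toy fine calculator that weighs context, not just the speedometer.

    All weights and dollar amounts are made up for illustration.
    """
    over = max(speed - limit, 0)
    fine = over * 10.0                  # base rate: $10 per mph over the limit
    if night and not traffic:
        fine *= 0.1                     # empty road at night: minor offense
    if children_present:
        fine *= 3.0                     # endangering kids: major offense
    if medical_emergency and not caused_injury:
        fine = -10.0                    # harmless emergency run: small rebate
    return round(fine, 2)

# The three hypothetical drivers from the post, with these made-up weights:
print(speeding_fine(60, 30, night=True))             # 30.0
print(speeding_fine(90, 30, children_present=True))  # 1800.0
print(speeding_fine(45, 30, medical_emergency=True)) # -10.0
```

With these particular weights the dollar amounts differ from the ones in the post, but the shape of the idea is the same: the same number on the speedometer produces wildly different outcomes depending on context, which is exactly what a flat statute can’t do.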

I suppose this may be wishful thinking, but I honestly believe that at some point the complexity of law will have to catch up with the complexity of reality.  The alternative will simply be a lot of wasted time and a retardation of progress.  Who’s going to build the flying car of the future if it’s illegal to fly it?  On the other hand, do we really want teenagers texting while flying?  Yikes.