Google’s kind of been dropping the ball lately in my opinion. First the thud of Buzz landing, then the collapse of Wave, then the whole Net Neutrality thing, topped off with not hiring me for a cool job because I got a surprise D in Philosophy class in 1996 (not kidding, it knocked my GPA enough to make Google HR reject me 14 years later). However, I don’t hold a grudge, because every once in a while they do something so simple and smart I remember why Google is great. Behold, the “did you mean to send that without an attachment” pop-up. Awesome.
Whenever given an opening, I tend to launch into a speech about how I think online publishers need to start using a hybrid subscription revenue model with a tiered advertising model. In other words, a visitor to a website who chooses to remain anonymous gets lots and lots of ads, a visitor who offers enough information to allow for accurate targeting and is willing to interact with marketer content gets just a few carefully selected ads, and anyone who wants to opt out can simply pay a fee–maybe a day pass for 24-hour access or a slightly larger amount for a year of ad-free content. A lot of websites are already doing this (Pandora does a great job), but it’s hard for them to do it well because each site has to build it on its own, without much outside help. You have to get the credit card out every time you go to a new site and spend five minutes on each form. The other problem for the publisher is what to do with all that empty ad space in their static page templates–the technical challenge is presenting content equally attractively with or without the ads.
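To make the tiering concrete, here is a minimal sketch of the decision an ad server would make per page view. The tier names, ad counts, and thresholds are my own illustrative assumptions, not any real publisher’s numbers.

```python
# Hypothetical sketch of the hybrid tiered model described above.
# The ad counts per tier are invented for illustration only.

def ads_for_visitor(is_subscriber: bool, has_profile: bool) -> int:
    """Return how many ad slots to fill for one page view."""
    if is_subscriber:    # paid a day pass or annual fee: no ads at all
        return 0
    if has_profile:      # shared enough data for targeting: a few good ads
        return 2
    return 8             # anonymous: lots and lots of ads

print(ads_for_visitor(is_subscriber=False, has_profile=False))  # 8
```

The whole model reduces to a three-way branch like this; the hard part, as noted above, is the page template gracefully absorbing zero, two, or eight ad slots.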
I happen to be in the middle of re-reading an old favorite of mine for a book club, Robert Reich’s Supercapitalism, and it helped spark the idea that led to this post. If you haven’t read it, you should, and just for context, I’ll quote a bit of a blurb from the back: “Supercapitalism highlights a new kind of social conflict–between ourselves as consumers and investors and ourselves as democratic citizens.” Reich reminds me that the best systems internalize countervailing power. Call it a corporate ecosystem or whatever you want; the important takeaway is that any time you have a system where the regulated become the regulators, the whole system gets out of whack. The Yin swallows up the Yang. The way around this is to incentivize balance over competition for control.
Right now, there seems to be a battle between the proponents of the subscription model and proponents of the advertising model, and when I stepped back to think about this at the level of the system, it dawned on me that ad servers have no incentive to serve less advertising. They are fundamentally locked into an arms race of exposures because that’s how they make their money. So, wouldn’t it make sense for ad servers to also start managing subscription revenue? In theory, you could even bundle an ad-avoidance subscription across competing sites, as long as the same ad server is employed by both rival sites and is managing the revenue stream.
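One way such a bundled ad-avoidance subscription could work is a pro-rata split: the shared ad server divides one subscriber’s fee among participating sites in proportion to where that subscriber actually spent their page views. The split rule, site names, and fee here are all my assumptions, sketched for illustration.

```python
# Illustrative sketch: one bundled ad-free subscription, split pro rata
# across rival sites that share the same ad server. Numbers are made up.

def split_subscription(fee: float, pageviews: dict) -> dict:
    """Divide a subscriber's monthly fee among sites by share of page views."""
    total = sum(pageviews.values())
    return {site: fee * views / total for site, views in pageviews.items()}

month = split_subscription(10.00, {"siteA.com": 60, "siteB.com": 40})
print(month)  # {'siteA.com': 6.0, 'siteB.com': 4.0}
```

The point of the sketch is that the accounting is trivial once a neutral third party (the ad server) sees both sites’ traffic; the obstacle is business alignment, not technology.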
This would create a healthier system, I think. It would maximize revenue by improving the user experience. By standardizing the subscription process, it would probably also increase subscription rates, since customers would learn the behavior. When I say learned behavior, I think about how buying your first album on iTunes is a huge pain, but once you have your account set up and your iPod running, purchasing is pretty painless. On a similar train of thought, part of my problem with the iPad is that there isn’t much of a tiered pricing structure when buying content. I can get a subscription to Wired Magazine on the iPad for 99 cents/month, but I can’t get a free version with ads in it. Wouldn’t it make a ton of sense to have video and interactive iPad ads that are targeted precisely to me based on all the info Apple has on me? The music I buy and the podcasts I listen to ought to be plenty to inform some algorithm in an ad server what ads I’ll respond to.
Anyway, go read Supercapitalism and shoot me an email if you happen to come across an enterprise-level ad server that also acts as a plug-n-play subscription manager.
While doing research for various projects I keep running into an annoying quirk among B2B marketers. Every once in a while I’ll get an error message when I enter my Gmail address on a form because it’s not a “business” address. This is, however, the email address which I use exclusively for work-related forms, newsletters, and other digital odds and ends I want to keep track of.
In the case I show here, the error message reads “Please specify a valid business email address.” Gmail, and other free email providers, don’t count as valid business addresses. Now, could I take 5 minutes to create a tim at mcateeonmedia.com email address and get my download? Yes. Will I? Probably. That is, however, beside the point. From my perspective, this practice is just chasing away prospects and costing these companies money. Do they really think random people are downloading whitepapers for fun? I feel like it’s pretty safe to assume that if someone is going to the trouble of downloading your whitepaper, they have a legitimate reason for doing so, and it makes no sense to stop them. From a sales perspective, it does make sense to get contact info in exchange for the whitepaper.
I think what this really comes down to is an issue of data, and how much thought has gone into getting the right info about one’s web visitors via static or progressive profiling databases. Lazy analysts and sales people want their customers to make it easy to isolate a prospect as a unique individual with a static profile. Static profiles, however, aren’t reflective of life. They don’t easily accommodate things like job changes, multiple email addresses, multiple phone numbers, or in the case of women, name changes. I’ve had my Gmail account much longer than any one business email, so from a tracking and sales perspective, it’s actually a much better way to keep tabs on me.
The complicated fix is to re-engineer the way databases are built and maintained and improve automated de-duplication. I have, however, seen a much easier fix lately that I think B2B marketers might like. So, all you form makers and database wizards out there: in much the same way consumer sites let you log in with your Facebook account via Facebook Connect, why not try letting people use their LinkedIn profiles as a login key for downloads or registration? That way, you get their current and past job info, the URL to their profile never changes and is unique even across people with the same name, it saves them time, and it auto-updates as they switch jobs. Brilliant, I know. What I really can’t figure out is why LinkedIn isn’t more proactively pushing its widgets and APIs. Anyone know why? Perhaps I’ll make that my web development project for next week just to see how hard it is to implement on this site and report back.
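For the curious, such a login key would most likely start with an OAuth-style redirect: the form sends the visitor to LinkedIn, which sends back a code the site exchanges for profile data. The sketch below only builds the redirect URL; the endpoint URL and scope name follow the generic OAuth 2.0 pattern and are my assumptions, not a specific documented LinkedIn API.

```python
# Sketch of the first step of a "log in with LinkedIn" flow.
# AUTH_ENDPOINT and the scope value are assumptions for illustration,
# patterned on the standard OAuth 2.0 authorization-code grant.
from urllib.parse import urlencode

AUTH_ENDPOINT = "https://www.linkedin.com/oauth/v2/authorization"  # assumed

def linkedin_login_url(client_id: str, redirect_uri: str, state: str) -> str:
    """Build the URL the whitepaper form redirects visitors to."""
    params = {
        "response_type": "code",   # ask for an authorization code
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "state": state,            # anti-CSRF token, echoed back to us
        "scope": "r_liteprofile",  # assumed scope name for basic profile
    }
    return AUTH_ENDPOINT + "?" + urlencode(params)
```

After the visitor approves, the site would trade the returned code for the profile fields (name, current and past positions) instead of making them retype everything into a form.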
My email inbox, Twitter, and Facebook feeds have been filled with links to some provocative infographics the Wall Street Journal produced recently, which take a decidedly negative view of online tracking technology. Having been in the online advertising space for a decade and knowing a lot about how these companies work, what data they collect, and the privacy implications of it all, I can say from experience that all this fear-mongering about tracking and privacy protection is largely without merit and designed to scare people. When I read obviously loaded language like “marketers are spying on internet users” with no explanation of what this “spying” actually is, I have to wonder what the motive is for such blatantly biased journalism.
Media economics 101 says that you monetize content in one of two ways–get the reader to pay the publisher directly, or get the reader to pay the publisher indirectly by serving them ads that marketers pay for. It’s no secret that Rupert Murdoch, who bought the Wall Street Journal in 2007, wants to ditch the online advertising model and erect pay walls to monetize the content via subscriptions. Why? Because it’s easier and more lucrative, but only in the absence of free competition.
Now, what is the one thing that all of this tracking technology has in common? It all helps target the right ad to the right person at the right time, and in doing so makes every ad impression more valuable. This is the only way mass media producers can make money without drowning their audience in so many irrelevant ads that the content itself becomes unviewable. Imagine a sales person in a clothing store who walks up to an old, skinny, wealthy white woman and tries to sell her a huge pair of baggy men’s jeans. No sales person would do this, because they are able to “spy” on the potential customer and use that information to help them pick a more appropriate product. The best clothing sales people can tell your size just by looking at you. By “spying” on you, they add value. There is nothing nefarious about this.
The websites that need information about a user the most are the ones with the most diverse group of users. WSJ.com, as you can see in the data below, has no such need, because it indexes heavily on a very homogeneous group of rich, over-educated men who are most likely white or Asian. It’s like being a sales person in an expensive tie shop–you already know exactly who’s walking in the door. They can easily sell their ads at a premium without much additional targeting technology, and many of the ads on their free pages are house ads promoting the paid “Pro” version of the site.
Let’s look at what the previous owners had to say about the sale of the paper to Murdoch: “The Bancrofts worried about protecting the reputation of the Journal, the nation’s second-largest newspaper. They feared Mr. Murdoch would meddle in the paper’s editorial affairs and import the brand of sensationalist journalism found in some of his properties such as the New York Post.”
So what does the Wall Street Journal gain from fear-mongering? To answer that question, let me borrow a propaganda tactic often used by another Murdoch employee, Glenn Beck. I’m not saying that the fears of the Bancroft family are being realized, or that Murdoch is encouraging his editorial staff to undermine the very technology that may well save the democratic tradition of free news supported by advertising, or that the WSJ under Murdoch is cynically pushing propaganda for self-serving purposes…I’m not saying any of that, I’m just playing devil’s advocate, but I’d like it if Mr. Murdoch could prove to me that he’s not.
Most of my career has been spent, in one way or another, explaining to people that the way we all consume media is changing, and that if they listen to me I might be able to help them navigate the change successfully. I got paid a reasonable amount of money to do that, because what I had to say was a relatively scarce commodity in a marketplace filled with old-media types scoffing at the idea of real change. The short-sightedness of those folks provided me with a fantastic way to make a living, for a while. Nothing good lasts forever though, and I’ve been thinking a lot lately about what information will be scarce in the months and years ahead. As a researcher, marketer, and consultant, change is the rocket fuel of my career. To stay relevant, I need to figure out what the next big change will be that nobody saw coming, and then figure out what to do about it before it even occurs to anyone else to worry. I think I can do this, but I’d like to take a moment to reflect. Think of me like the guy on the corner holding the sign “the aliens are coming!” the day after the aliens actually came. I was right, but what to do now?
Watching old media stalwarts blunder their way into new media mistakes over the past few years (i.e., CNN and anything to do with Twitter) was like watching your grandpa get into hip-hop. Watching Ashton Kutcher try to leapfrog existing media companies with his company, Katalyst Films, I first had a sense that maybe the universe was starting to catch on (though Katalyst seems to have gone dormant since 2009).
Lately, it’s become common to see the ideas I’ve been advocating come to life. One example: I’ve been pushing for years the idea that online video ad servers should automatically match the length of ads to the length of content viewed in order to create an equitable exchange of content for ad viewing time. Back when my idea was still new, every video, whether a fifteen-second clip or a two-hour movie, had a 30-second ad spot in front of it, or nothing at all. Just recently, I’ve noticed that Hulu’s ad servers have finally gotten smart enough to serve less advertising with short clips, more advertising with long-form content, and even more ads to heavy consumers watching many shows in a row. I have no idea if they read my book on video marketing, and am guessing the smart people at Hulu probably came up with the idea independently. But the conclusion is the same: many of my radical ideas from the past are becoming the normal, business-as-usual present. Suddenly, we’re all used to the aliens among us.
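The proportional-ad-load idea above fits in a few lines. This is a toy version with an invented ratio (about three seconds of ads per minute of content) and an invented cap; it is not Hulu’s actual logic, just the shape of it.

```python
# Toy sketch of matching ad time to content time. The 5% ratio and
# the 120-second cap are illustrative assumptions, not Hulu's rules.

def ad_seconds(content_seconds: int, ratio: float = 0.05, cap: int = 120) -> int:
    """Scale ad time with content time: ~3 seconds of ads per minute."""
    return min(cap, round(content_seconds * ratio))

print(ad_seconds(15))    # 1 -- a short clip earns effectively no ad
print(ad_seconds(7200))  # 120 -- a two-hour movie hits the cap
```

The exchange becomes equitable by construction: the more content you take, the more ad time you give back, up to a ceiling that keeps heavy viewers from being buried.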
All of this really struck home this last Saturday night when I went to see Conan O’Brien, Patton Oswalt, and Andy Richter speak (and drink an amazing amount of wine) at the Herbst Theater in San Francisco. They were predictably fun and funny, but the thing that struck me was how much Conan and Patton talked about the internet, and Twitter, and the economics of media. Conan reiterated multiple times that the machine of media is broken, and at one point looked out at the crowd and said “It would not surprise me if three years from now one of you sitting out there is running the media, and it is totally different than it is today.”
Now, perhaps this is selfish or narcissistic, but I have to ask, if Conan O’Brien is now the guy standing up in front of Google employees and talking about how media is changing, how the heck am I supposed to compete with that? I guess I can’t. Conan, as much as I love you, you’re putting me out of a job. Please get back to hosting TV soon. Until then, I’m admitting defeat. I am, instead, now seeking the new scarcity. I’m setting out to uncover whatever it is that is so new that it still has the capacity to blow people’s minds. I’ve got some ideas. I’ve been kicking around a few concepts. But I’m sure not going to write about them here…yet. For all I know, Conan or Ashton are reading my blog (I’m just going to push this narcissism thing to the limit now that I’ve started it). But seriously:
(fast-forward to 15:22 if you’re short on time)
I’m back from my big European vacation and spending a fair amount of time looking for a new job, but when not wondering where the next paycheck is coming from, I’ve been working on a novel about hipsters. Having lived in the Lower East Side of Manhattan, Williamsburg in Brooklyn, and the Mission in San Francisco, I consider myself something of an expert on the odd phenomenon that is the American Hipster. I’ve been spending afternoons and evenings writing in trendy coffee shops and PBR-filled dive bars in order to better observe my subjects. I’m sure anyone taking a look at my disheveled appearance as I type away on my shiny Mac laptop is likely to consider me a hipster, yet neither I nor any of the tight-jeaned urbanites I write about would ever admit to being one. This is the odd paradox of the hipster.
While eavesdropping on a particularly pretentious mustachioed hipster, I overheard him first blast modern youth for making his band unprofitable by refusing to pay for music and only downloading it for free, then go on to vilify hulu.com for daring to insert commercials within the video content he viewed on the site. This hypocrisy got me wondering if such sloppy logic is the norm, and the more I look for it, the more I find it. In case it isn’t obvious, here’s my problem with this guy’s logic. He feels that the media he consumes should come at no cost (he won’t view ads to offset the cost of creation or pay a fee directly to the creators), yet the media he creates should be paid for (people who download his music should give him a cut). He has no qualms about using technology to circumvent advertising in his media, yet is infuriated when others do the same to him.
This self-defeating state of affairs has to end at some point if professional media is going to remain a viable source of income for those who choose to create it. The media industry, in particular the film industry, has made some pretty heavy-handed attempts at making the case that piracy equals theft, but they always seem hollow. I, personally, don’t feel guilty about ripping off some big film studio that’s been gouging me for years with $12-$14 movie tickets, or the music company that charged me $14 for a CD back when I was thirteen years old and they could still get away with it. When Lars Ulrich of Metallica, a multi-millionaire, goes up against Napster, of course I’m rooting for Napster.
However, when I go see a small act that I really like perform at some small venue, I nearly always buy the t-shirt, even if I just wind up giving it away as a gift. The reason I do this is that I know all the money I spend on that t-shirt is going right into the band’s pockets, not siphoned off by a greedy promoter, ticket-bastard (my pet name for Ticketmaster), or a record label that by design cares more about profit than producing good music.
Sipping my over-priced, hipster-approved Cafe Americano, I have an a-ha moment: what if we came up with a way to market “fair-trade” media the same way coffee sellers let us know the independent farmer down in Guatemala is getting his fair share? Some of the big players like iTunes or Hulu could easily find ways to be more transparent about how the money spent on media, or time spent with advertising, directly benefits the makers of the media content. It’s easy to steal from a faceless corporation, but it’s hard to steal from a person–especially a person you like. What if I could log into my iTunes account and see a summary of how the money I spent in the last year went to each of the artists I bought from? I think it might work.
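That yearly “fair-trade receipt” is just a grouped sum over purchase history. Here is a sketch; the band names, prices, and the 70% artist share are invented for illustration, since real label splits vary and aren’t public.

```python
# Sketch of the "fair-trade media" receipt: total what each artist
# earned from my purchases. The 70% artist share is an assumption.
from collections import defaultdict

def artist_summary(purchases, artist_share=0.70):
    """Sum each artist's estimated take from a list of (artist, price)."""
    totals = defaultdict(float)
    for artist, price in purchases:
        totals[artist] += price * artist_share
    return dict(totals)

receipt = artist_summary([("Band A", 9.99), ("Band B", 1.29), ("Band A", 1.29)])
print(receipt)  # Band A ~ $7.90, Band B ~ $0.90
```

The store already has every row of this table; the fair-trade move would simply be showing it to the customer.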
Watching the reaction to the iPad release in the buzz-o-sphere over the past few weeks, I consistently notice that reactions tend to fall into one of two camps–writers, critics, and opinion-makers are often underwhelmed by the device (myself included), while the general public fervently loves it.
I’m traveling in Europe right now where the iPad is due to be released May 28th, and while reading on my Amazon Kindle in an airport the other day, an English woman walked up and demanded to see my iPad. When I explained to her that it was merely a Kindle, she looked visibly disappointed–almost disgusted. I actually apologized. This woman, like many throughout the world, is enthralled with the idea of the iPad.
Over the weeks since the U.S. release of the iPad, all we internet writers have picked up on the public’s love-fest with the device and started to reevaluate the iPad’s worth, given that our initially tepid reactions seemed out of sync with the zeitgeist. This gradual about-face among the digital press has gotten me thinking. What follows is my hypothesis, some pure speculation, and probably some grandiose claims. Simply put, I think writers are annoyed that the iPad doesn’t have a keyboard*. Why they are annoyed, however, may be a bit more complex.
Forrester Research has created a social technographics ladder for classifying individuals on the spectrum of social media consumption vs. creation. Peter Burris from Forrester was kind enough to share some of their data. They found that whether you look at tech industry insiders or the general public, the majority of each group is active in social media, and nearly everyone who is active in social media is also considered a “Spectator”. A minority of each group falls into the “Creators” group, but those in the tech industry subset are much more likely than the average person to be “Creators”. I tell you this simply to lend some gravitas to the common-sense conclusion that “normal” people (the general population) are content to simply be consumers of media, while creation of media has become “normal” among those of us who make our money and live our lives online.
Back to the iPad and its detractors: the thing they all have in common is an annoyance with the fact that the iPad is essentially just a consumption device. All the billboards advertising the iPad, even in formerly Communist East Germany, show someone in repose, enjoying the sensuous presentation of content from the comfort of their couch, feet up and shoes off. Simply put, this is not a creator’s device, and what the creators of the world are finally starting to come around to seems obvious with a little distance–most people don’t use the internet the same way they do. Much more interesting is what this means for the internet as a medium.
For years now, we online media folk have been chanting the mantra “content is king,” yet so many times during the last 15 years the general public has turned on their computer, taken a look at the content currently available, gotten bored, and flipped on the TV instead. In the U.S., at least, it seems that tide is finally turning in favor of the net. It’s one thing to read an email, check the weather, and watch a short clip of an animal being cute. It is an entirely different thing to turn on your computer to read a book, watch a feature-length movie, or settle in for the night in front of Hulu on auto-play mode. With so much rich content now available online, the hardware we use is adapting, and will continue to adapt, to meet our needs. Assuming you’re part of the majority of consumers with no desire to contribute content, why get a desktop computer for your home when a Wii or an iPad will suffice? I think this perfectly logical evolution of form and function scares the hell out of online creators, and for very good reason.
If only a minority of internet users contribute as well as consume, the ability to contribute to the internet may get harder to come by. I think there is a fear that if you take the keyboards off our computers, the renaissance of digital creativity will dry up. In Cory Doctorow’s post on Boing Boing entitled “Why I won’t buy an iPad (and think you shouldn’t, either)” he comes right out and says “If you want to live in the creative universe where anyone with a cool idea can make it and give it to you to run on your hardware, the iPad isn’t for you.” I think he is exactly right, but the numbers show that most people don’t live in the creative universe. Most people just expect that universe to show up at their home, pre-packaged like sliced cheese, and ready to consume while zoning out on the couch. I think it’s futile to argue that we should all be whatever the rugged-individualist version of an internet user is (creating and consuming our home-made content). People are going to consume mass media because it’s easier. How, though, do we make sure that choice doesn’t dry up in a world without keyboards?
Unfortunately, I’m forced to just end this with more questions. Were tomorrow’s iPad buyers ever really using their old keyboards in the first place, or is the iPad just intended to be a plaything of the rich, like a vintage sports car that only comes out on the weekends? Only time will tell.
And finally, I leave you with… Tosh destroys an iPad
* Yeah, I know the iPad has an on-screen keyboard, but if you’ve ever tried to type anything longer than “lol, steve jobs!” on it you’ll quickly understand why I don’t classify it as such.
I admit, I am one of many people who have heard someone say “getting laid off was the best thing that ever happened to me; if that hadn’t happened, I never would have done…” and immediately thought to myself, “bullshit”. Hence, the irony of what I am about to write.
At 8 am last Thursday I took an unexpected call from a strange number, and was informed by an outsourced HR professional (a woman I’ve never actually spoken to before) that my job was being “eliminated for cost-saving reasons”. I’m not sure anything could have prepared me for this surreal moment. The focal point of my life, work that I spent months pouring my time and energy into, was suddenly devalued to $0 a mere two weeks from completion, never to be seen by anyone. I was shocked.
With the clarity of a few days spent thinking, I find I harbor no real resentment, but am legitimately disappointed that this work will never see the light of day only because somebody wanted to save a buck. Even more so, I feel bad for the study participants who gave their time in good faith and are now unlikely to ever get the data they were promised. I hope I am proved wrong.
As you may have guessed from the first sentence, however, I have no desire to dwell on the down-side. What followed that initial adrenaline was the cool feeling of liberation. I feel as though I’ve been training for this moment for the last ten years, and it comes far more naturally than I had anticipated.
The next day, I walked into the same home office, opened up the same computer, logged onto the same email, flipped through the same list of contacts, and everything was exactly as it had been except for one thing–I get to choose what I want to do. The first thing I did that morning was to delete my to-do list, and replace it with a want-to-do list: projects I would like to work on and the people I would like to work on them with.
My work life is so modular, it’s a very simple matter for me to just unplug one project or employer and plug in another. This is both the beauty and the curse of the information economy. When you can no longer count on a single client or employer for stability, you get good at finding stability on your own. Sure, it’s a hassle, but like any form of resistance, it makes you stronger.
I find that my fear is not of a loss in income, but of a loss in trust. I never imagined my employer would make me renege on my promises, which is why I’m making this small offer on this small blog as an attempt to make things right: if you ever took one of my surveys and are still interested in my findings and opinions, send me an email at firstname.lastname@example.org with some background info and whatever marketing challenges you’re struggling with, and I’ll do my best to give you any answers or advice I can, free of charge, for a limited time. Even if you didn’t participate in the research, you’re welcome to jump in on this too. Unfortunately I can’t share any of the great data I collected, but I can certainly allow what I’ve learned to inform my advice.
While a return to stability seems inevitable, I intend to enjoy this moment as long as it lasts. So many times over the last few years I’ve had to politely decline the opportunity to consult or try new things. Always, always, there was too much work to do. But now, finally, I’m open for business. Let’s talk.
I distinctly remember the “oh crap” moment I had back in 2002 the first time someone posted an embarrassing picture of me on Friendster. It used to be that you could safely go on vacation, do stupid things, and be relatively sure that your grandmother wouldn’t be watching a live broadcast of your stupidity via Facebook live update. The idea that one drunken indiscretion could have repercussions lasting a lifetime put a serious damper on everyone’s fun, and I can’t help but notice that younger generations have adopted a much more sober approach to public fun. They seem far more responsible, or at least guarded, than my generation or my parents’ generation was.
My hypothesis is that there is a distinct shift in attitudes toward internet privacy that is driven largely by age. I think that people who went through school prior to the invention of the social network are frustrated and feel that their privacy ought to be respected, but is not, while those who went through school after social networks caught on simply expect their exploits to be broadcast to the universe. To this younger cohort a world with privacy is like a world without nuclear weapons–sounds nice, but the very idea seems quaint.
What got me thinking about all this is that I recently broke up with my long-time girlfriend, and while I have de-friended quite a few people from my online social networks, for the first time I had to de-girlfriend someone. Throughout history, a marriage ceremony was essentially just a public statement that “this person is now off-limits!” In modern times, Facebook’s relationship status seems to serve pretty much the same purpose. With a simple click, we inform the universe that someone is now off-limits. It turns out that when you un-check that box, Facebook pops up a very odd bit of text, which is “your relationship will be canceled.” In comparison, “divorce” seems less serious. My relationship is…canceled? Yikes. I hit Save, and efficiently, effectively, immediately, the world is informed of my decision. I hate this fact.
Older generations brush off my discomfort and say that if I hate broadcasting this news I shouldn’t have added the relationship to my Facebook page in the first place. Try, however, explaining that to a 20-something girlfriend and her expectant circle of friends. In much the same way that my grandparents’ generation might say that someone isn’t really off the market until they have a ring on their finger, this younger generation seems to think nothing is really real until it’s on Facebook. I think the underlying assumption is that if you keep something private, you must be hiding something. Only those that live their lives in public are perceived to be honest.
Many people my age (mid-30ish) are breaking up with Facebook itself. A growing number of acquaintances have simply cut the cord in favor of getting back to cultivating “real” relationships. They no longer want to spend all their time creating a virtual version of themselves that somehow appears to have far more fun than they do in real life. While I understand the motivation, I’m not sure it’s a response that will serve them well in coming years. It’s probable that today’s college student will be my boss in a decade or two. No one expects to find themselves old, antiquated, and too stubborn to change, but eventually it’s probably going to happen.
This younger generation values a cultivated honesty in their digital personae, and as they come into power, there will come a time when the values of this generation will matter to us all. In 2025, when the next “market correction” hits and my 25-year-old manager scrolls through the pages of my Facebook Resume (in 3D holograph mode, of course), what do I want in there? Is it better to objectively, dispassionately record my real life? Should I spend all my time tweaking my Facebook updates with high-power-phrase-indexing SEO keywords and hide my flaws? Aside from the fact that I should probably date women closer to my own age, what do you think?
I wrote a post for the MarketingProfs Blog last week about what’s happening in the world of media research, and have re-posted it here. See the original at:
Rethinking Normal: The Newest in Marketing Research From the ARF’s Annual Conference
I attend a lot of marketing conferences where I hear over-excited pitch people telling me all about The New Thing that will Change Every Paradigm Forever. So much over-enthusiasm can jade just about anyone, so it was with relief that I joined a much more sober group for their conference. I spent the last few days at the Advertising Research Federation’s (ARF) re:Think 2010 conference taking place in New York City. I found, however, that even here among the stodgiest of marketing researchers, there’s talk of … a paradigm shift.
In typically understated fashion, the theme of this conference is “The New Normal.” Apparently, the world went and changed without consulting us. Bob Barocci, president of the ARF put it thus, “If we accept the new normal, we have to change the way we think, and that’s hard and scary. We deny its existence, but can we deny its signals?”
Many of these signals of change are very real, but they aren't paradigm shifts in and of themselves. Some examples that came up during the conference: newer waves of immigrants are replacing older ones, consumer spending is down, and digital media fragmentation keeps making it harder for marketers to find captive mass audiences. Most of this is old news. What I find fascinating is how the entrepreneurs of marketing research are "thinking different" and responding to these shifts.
There tend to be two schools of thought on how to proceed in the new normal. Big spenders and traditional media stalwarts keep pursuing the elusive single metric that can dispassionately, automatically, and accurately sort the entire media universe into a range of good-to-bad investment choices. Smaller, more cash-strapped marketers tend to take a very different approach, and simply use whatever metric suits the moment. In a presentation for Decipher on guerrilla marketing measurement strategies, Kristin Luck encouraged those present to get as creative with their measurement as they are with their marketing. She made a convincing argument for using mobile to help track the spread and effectiveness of marketing materials by word-of-mouth, advocating tactics like survey widgets in mobile platforms, GPS tracking, pass-along pings, tracking mobile bar code coupon redemption, and mobile downloads. While researchers can spend all day arguing about the continued relevance of metrics like GRPs, it's clearly important not to let that stop anyone from taking advantage of all the new metrics at our disposal.
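To make the "use whatever metric suits the moment" idea concrete, here's a minimal sketch of one of those guerrilla tactics: tallying mobile coupon redemptions by the channel that issued each code, so you can see which channel spreads best by word-of-mouth. The codes and channel names are invented for illustration, not from Luck's talk:

```python
from collections import Counter

# Hypothetical mapping: each unique coupon code was issued via one channel.
CODE_CHANNEL = {
    "QR-1001": "in-store poster",
    "QR-1002": "email blast",
    "QR-1003": "pass-along SMS",
}

def redemption_counts(redeemed_codes):
    """Count redemptions per distribution channel from a stream of scanned codes."""
    counts = Counter()
    for code in redeemed_codes:
        counts[CODE_CHANNEL.get(code, "unknown")] += 1
    return counts

scans = ["QR-1002", "QR-1003", "QR-1003", "QR-1001", "QR-1003"]
print(redemption_counts(scans))
# The pass-along SMS channel shows the most redemptions in this toy sample.
```

The point isn't the code itself, which any analyst could write in an afternoon; it's that a scrappy, purpose-built count like this can answer a campaign question without waiting for an industry-standard metric to exist.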
Back in the days of analog media, researchers made do with sample data generated by research companies. The complaint with research then was more often that the data was too limited to reflect reality. These days, census-level data tracks billions of people in real time, and now the problem is figuring out what to do with it all. Most analytics departments are simply overwhelmed with the amount of data coming in. It was pointed out during a panel that Walmart collects data on 100 million transactions per hour. Traditional research methods simply aren't designed to handle this level of data.
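To see why traditional methods break at that scale: you can't hold 100 million transactions an hour in memory and run a classic analysis on the whole batch. One standard answer is streaming aggregation, where each record updates a small running summary and is then discarded. A minimal sketch, with invented categories and amounts in cents:

```python
from collections import defaultdict

def stream_totals(transactions):
    """Aggregate an unbounded stream one record at a time.

    Only per-category running totals are kept, never the transactions
    themselves, so memory use is bounded by the number of categories
    no matter how many records flow through.
    """
    totals = defaultdict(int)
    count = 0
    for category, amount_cents in transactions:
        totals[category] += amount_cents
        count += 1
    return count, dict(totals)

sample = [("grocery", 4210), ("electronics", 19999), ("grocery", 875)]
n, totals = stream_totals(sample)
print(n, totals)  # 3 {'grocery': 5085, 'electronics': 19999}
```

In practice this pattern runs inside dedicated stream-processing systems rather than a Python loop, but the design choice is the same: trade the ability to re-examine raw records for constant-space, real-time summaries.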
Stepping up to this challenge in a rather innovative way, Discover Card uses text mining software to analyze and quantify the content of its customer service calls, comparing the frequency of certain kinds of complaints and even benchmarking this against the incidence of chatter about the brand on blogs and social networks. By converting speech to text and analyzing this huge source of data, Discover can get a much more accurate idea of whether a specific complaint is coming from a few lone individuals or truly is a widespread problem deserving of attention.
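A toy version of that kind of pipeline is easy to picture. Assuming the calls have already been run through speech-to-text, the simplest approach just counts how many transcripts mention each complaint category's keywords; running the same counter over blog posts gives the comparison Discover is after. The categories, keywords, and transcripts below are invented, and this is a sketch of the general technique, not Discover's actual system:

```python
import re
from collections import Counter

# Hypothetical complaint categories and trigger keywords.
CATEGORIES = {
    "billing": {"fee", "charge", "overcharged"},
    "card_declined": {"declined", "rejected"},
    "website": {"login", "password", "error"},
}

def categorize(texts):
    """Count how many texts mention keywords from each complaint category."""
    counts = Counter()
    for text in texts:
        words = set(re.findall(r"[a-z]+", text.lower()))
        for category, keywords in CATEGORIES.items():
            if words & keywords:  # any keyword present in this text
                counts[category] += 1
    return counts

call_transcripts = ["I was overcharged a fee last month", "The login gives an error"]
blog_posts = ["another surprise charge on my statement"]
print(categorize(call_transcripts))  # Counter({'billing': 1, 'website': 1})
print(categorize(blog_posts))        # Counter({'billing': 1})
```

Real systems use far more sophisticated text classification than keyword matching, but even this crude count is enough to tell a lone complainer from a trend: one category spiking in both the call data and the social data is the signal worth acting on.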
Two great examples of how qualitative research is adapting to this new norm of overwhelming data came from CBS's David Poltrack and Global Park's Dan Coates. Both accept that "pure" research has its place, but argue that a research program that embeds itself into the data stream of everyday life makes a lot of sense. Global Park creates online communities for companies that act as CRM tools to keep customers engaged, while simultaneously mining those communities for product insight and feedback on marketing. Coates calls this approach "large-scale listening with accuracy." He is quick to point out that it isn't representative of the general population, but the feedback is authentic in large part because "the community goes on even when the research stops; 95% of communication is repetitive management, 5% is real research probing the community." He also notes that this approach can make research more enjoyable for the subject by making survey vehicles less boring.
CBS has taken this approach to a new extreme by opening a research facility in Las Vegas that's as much about entertaining people as it is about research. Las Vegas is unusual in that you can meet someone there who traveled from just about any town in the United States. At the facility, CBS invites the public in to watch new material, running 700+ focus groups a year, 365 days a year. Poltrack calls this approach "immersion research."
This combination of new streams of data, new methods of data collection, and more powerful processing tools means that marketing research, like media itself, is becoming a real-time endeavor. As software takes up the challenge of collecting and processing data, researchers can focus more on finding insights and affecting business decisions. Who knew research could be so exciting?