Archive

Posts Tagged ‘marketing research’

Customer-First Marketing: A conversation with Wharton, MarketingSherpa, and MECLABS Institute

August 18th, 2017

One of my favorite music videos is “No Rain” by ‘90s band Blind Melon. In it, a young girl dressed in a bee costume roams around her town, clearly misunderstood by everybody she encounters.

Until …

One day …

… bee girl encounters an entire field full of people in bee costumes. She had clearly found her tribe.

I’ve seen that same delight when those engaged in customer-first marketing and customer-first science meet. And I certainly felt it myself getting to work with Catharine Hays for a few months on the Beyond Advertising: Creating Value Through All Email and Mobile Touchpoints webinar.

Hays is Executive Director of The Wharton Future of Advertising Program and recently interviewed me, along with Flint McGlaughlin, CEO and Managing Director of MECLABS Institute (the parent research organization of MarketingSherpa), on Marketing Matters, a show she co-hosts on the Business Radio channel on Sirius XM powered by the Wharton School.

If you’re a fellow traveler on the path of customer-first marketing and customer-first science, listen to the recording of the radio show below, or read the transcript below (I called out key concepts with bolded headlines to allow for easy skimming). I hope you feel that same delight of finding your tribe.

And if you do, feel free to let Flint or me know on Twitter (@FlintsNotes and @DanielBurstein), since we won’t be able to hear you shouting in agreement through your headphones or speakers.

Editor’s Note: The audio recording of this interview is no longer hosted on SoundCloud, but you can read the full transcript below.

(originally aired on Sirius XM Channel 111, Business Radio powered by The Wharton School)

We begin with a little background on MECLABS Institute and MarketingSherpa

Catharine Hays: You’re listening to Marketing Matters on Business Radio, powered by the Wharton School.

Welcome back. This is Marketing Matters on Sirius XM’s Business Radio 111. I’m Catharine Hays. I’m the Executive Director of the Wharton Future of Advertising program here. And we’re going to shift gears a little bit to welcome our next guests.

Really, the theme of the show today has been on customer-first marketing, really putting the customer at the front of your marketing and putting the individual, rather than thinking of them as a consumer. So, we spent the last hour really kind of honing in on the Hispanic market and with our last guest, talking about really seeing them from a cultural lens and how open or closed they are to cultural influences, new and old. So, that was pretty interesting.

So, what we’re going to do next is shift gears a little bit, but still have this theme but talk about it more broadly with two wonderful guests. First, we have Flint McGlaughlin. He’s the CEO and Managing Director of MECLABS Institute. Welcome, Flint.

Read more…

Email Personalization: 137% increase in open rate from personal note approach

November 27th, 2012

Here’s a behind-the-scenes look at a recent email marketing send to promote a MarketingSherpa webinar about social media, sponsored by Eloqua. I wanted to share it with you, because while it was quite simple to do, the results were pretty impressive.

Before we get into it, I want to stress that this was not intended to be a valid A/B split test (there is a validity threat that I’ll get to in a moment), so take the results with a grain of salt. However, it is a good example of sending different versions of an email to different segments of a list. For that reason, this is a tactic we do think is worth trying (and perhaps testing) with your own lists.
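To make the arithmetic concrete: a headline figure like a 137% increase in open rate is simply relative lift, and a quick significance check is what separates an interesting anecdote from a valid split test. Here is a minimal Python sketch of both calculations; the send and open counts are hypothetical placeholders chosen only to illustrate the math, not the actual figures from this campaign.

from math import sqrt, erfc

def open_rate_lift(sends_a, opens_a, sends_b, opens_b):
    """Relative lift in open rate of version B over version A,
    plus a pooled two-proportion z-test as a rough validity check."""
    rate_a = opens_a / sends_a
    rate_b = opens_b / sends_b
    lift = (rate_b - rate_a) / rate_a * 100  # percent increase

    # Pooled standard error assumes two independent, randomly split segments
    pooled = (opens_a + opens_b) / (sends_a + sends_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    z = (rate_b - rate_a) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided p-value under a standard normal
    return lift, p_value

# Hypothetical counts for illustration only
lift, p = open_rate_lift(sends_a=5000, opens_a=400, sends_b=5000, opens_b=948)
print(f"Open rate lift: {lift:.0f}%  (p = {p:.2g})")

Even a tiny p-value cannot rescue a comparison from a validity threat (for example, segments that were not comparable to begin with), which is why results like this one should be read as directional rather than conclusive.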


TEMPLATE VERSION

From: MarketingSherpa [reply@reply.marketingsherpa.com]

Subject Line: [Webinar] 4 steps to drive a measureable social strategy

 


Read more…

Social Media Marketing: Social login or traditional website registration?

January 12th, 2012

Janrain, a social Web user management platform provider, recently released its Social Identity study with the research conducted by Blue Research.

The study involved a final sample size of 616, with respondents recruited by email and screened to ensure they either purchased a product online within the past 30 days, or read articles or watched video from major media outlets in the past 30 days.

A key element of the survey was finding out how respondents felt about using a social login — Facebook, LinkedIn, Twitter, etc. — instead of having to register individually at multiple websites.

Some of the results were very interesting:

  • 86% of respondents reported being bothered by the need to create new accounts at websites and said they would actually change their behavior:

    – 54% might leave the site and not return
    – 26% would go to a different site if possible
    – 6% would simply leave or avoid the site
    – 14% would not complete the registration

  • 88% admitted to supplying incorrect information or leaving form fields incomplete (this result should come as no surprise to marketers). This figure is up from 76% in last year’s study.
  • 90% admitted to leaving a website if they couldn’t remember their login details rather than taking the time to recover their login information. This figure is up from 45% in 2010.

The study also found that even though website visitors are becoming more frustrated with traditional registration, they are becoming more open to using social identities for website registration.

In fact, 77% responded that social login is “a good solution that should be offered,” with 41% preferring social login over creating a new user account or using a guest account.

 


 

Among that 77%:

  • 78% of social login fans have posted a comment or message to their social networks about a product or service they liked or thought others should know more about
  • 83% reported being influenced to consider buying new products or services based on positive social media comments
  • 69% report positive reviews might increase their likelihood to purchase a product or service
  • 82% seek out, or avoid, companies based on social media reviews

 

That’s a lot of pretty numbers, but what do they mean for marketers?

To help put this research into a marketing context, I had the chance to interview Larry Drebes, CEO, Janrain. Here is the result of that interview:

  Read more…

Social Media Marketing: Tactics ranked by effectiveness, difficulty and usage

April 26th, 2011

I’ve been browsing the new MarketingSherpa 2011 Social Marketing Benchmark Report this week and soaking up the rich data. One of the first charts that struck me is a bubble chart on social marketing tactics.

First, I want to say, I love these bubble charts. They provide a three-dimensional view of the data on a given topic. Our researchers do a great job of packing them full of information without making them confusing.

This chart graphs the effectiveness, difficulty and popularity of each social media marketing tactic. You’ll notice a clear positive correlation between a tactic’s level of difficulty and its level of effectiveness.

Hard work pays off

For those of you who have not brushed up on your statistics lately (as I just did a moment ago), I will note that a positive correlation between two factors means that as one factor increases, the other factor increases as well. For example, there is a positive correlation between my consumption of ice cream and the temperature outside.
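If you want to put a number on that relationship, the Pearson correlation coefficient measures how strongly two series move together: +1 is a perfect positive relationship, 0 is none and -1 is perfectly inverse. Here is a small Python sketch using made-up tactic ratings for illustration, not the benchmark report’s actual data.

from statistics import mean, stdev

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))

# Made-up ratings: % of marketers calling each tactic "difficult" vs. "very effective"
difficulty    = [72, 65, 60, 48, 35, 20]
effectiveness = [38, 34, 30, 22, 15, 10]

print(f"r = {pearson_r(difficulty, effectiveness):.2f}")  # near +1 means a strong positive correlation

A value near +1, as in this made-up example, is what a clear positive correlation between difficulty and effectiveness looks like numerically.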

Looking at this chart, it’s clear that the most effective social marketing tactics are also the most difficult, and vice-versa. Blogger relations — the most effective tactic reported — is also the only tactic to break into the 70%-range in terms of marketers reporting it as “very” or “somewhat” difficult.

You’ll also see that the three most-effective tactics — blogging, SEO for social sites, and blogger relations — are known to require significant amounts of time and effort before results are shown.

Every tactic is somewhat effective

Take a look at the scale on this chart’s Y-axis (level of effectiveness). The listed percentages are the share of marketers who reported a tactic as “very” effective. They do not include the marketers who reported a tactic as “somewhat effective.”

Looking at the chart, you might guess that adding social sharing buttons to emails is a waste of time — but don’t be too quick to write this tactic off completely. Only 10% of social marketers reported it as “very effective,” but 55% rated it as “somewhat effective” (found deeper in the report). With a total of 65% of social marketers reporting at least some effectiveness, these buttons might be worth the small investment they require.

Also, since adding social sharing buttons to emails bottoms out the Y-axis here, every other tactic listed has more than 65% of social marketers reporting at least some effectiveness. Here are some examples:

  • Social sharing buttons on websites: 69% say at least “somewhat” effective
  • Advertising on social sites: 73%
  • Microblogging: 75%

Related resources:

MarketingSherpa 2011 Social Marketing Benchmark Report

Free Webinar: Best Practices for Improving Search and Social Marketing Integration

Marketing Research Chart: Using social media as a list-growth tactic

Inbound Marketing newsletter – Free Case Studies and How To Articles from MarketingSherpa’s reporters

Marketing Research: How asking your customers can mislead you

February 25th, 2011

In a recent blog post for our sister company MarketingExperiments, I shared my experiences at the fifth Design for Conversion Conference (DfC) in New York City. Today, I want to focus on a topic from Dr. Dan Goldstein’s presentation, and its relevance to usability and product testing for marketers — how focus group studies can effectively misrepresent true consumer preferences.

Asking you for your input on our Landing Page Optimization survey for the 2011 Benchmark Report has put the topic of surveys at the forefront of my thinking.

Calibration is not the whole story

The need to calibrate focus group data is well recognized by marketers and social scientists alike. The things marketers most want to know, such as “intent to purchase,” are especially susceptible to misleading results. When people are asked what they would do with their money in a hypothetical situation (especially when the product itself is not yet available), their answers are not always going to represent their actual behavior when they do face the opportunity to buy.

However, mere calibration (a difficult task in itself, requiring past studies on similar customer segments where you can compare survey responses to real behavior) is not enough. How we ask a question can influence not only the answer, but also the respondent’s subsequent behavior.

Dr. Goldstein pointed me to an article in Psychology Today by Art Markman, about research into how “asking kids whether they plan to use drugs in the near future might make them more likely to use drugs in the near future.” Markman recommends that parents pay attention to when such surveys are taken and talk to their children both before and after to ensure that the “question-behavior effect” does not make them more likely to engage in the behaviors highlighted in the surveys. The assumption is that if the respondent is aware of the question-behavior effect, the effect is less likely to work.

Question-Behavior Effect: The bad

If your marketing survey is focused on features that your product or service does not have—whether your competitors do or do not—then asking these negative questions may predispose your respondents against your product, without them even being aware of the suggestion. This is especially worrisome when you survey existing or past customers, or your prospects, about product improvements. Since you will be pointing out to them things that are wrong or missing, you run a good chance of decreasing their lifetime value (or lead quality, as the case may be).

Perhaps the survey taker should spend a little extra time explaining the question-behavior effect to the respondent before the interaction ends, also making sure that they discuss the product’s advantages and successes at the end of the survey. In short, end on a positive.

Question-Behavior Effect: The good

However, there is also a unique opportunity offered by the question-behavior effect: by asking the right questions, you can also elicit the behavior you want. This means being able to turn any touch point—especially an interactive one like a customer service call—into an influence opportunity.

I use the word “influence” intentionally. Dr. Goldstein pointed me to examples on commitment and consistency from Robert Cialdini’s book Influence: Science and Practice, such as a 1968 study conducted on people at the racetrack who became more confident about their horses’ chance of winning after placing their bets. Never mind how these researchers measured confidence—there are plenty of examples in the world of sales that support the same behavioral pattern.

“Once we make a choice or take a stand, we will [tend to] behave consistently with that commitment,” Cialdini writes. We want to feel justified in our decision. Back in college, when I studied International Relations, we called it “you stand where you sit”—the notion that an individual will adopt the politics and opinions of the office to which they are appointed.

So how does this apply to marketing? You need to examine all touch points between your company and your customers (or your audience), and make a deliberate effort to inject influence into these interactions. This doesn’t mean you should manipulate your customers—but it does mean that you shouldn’t miss an opportunity to remind them why you are the right choice. And if you’re taking a survey—remember that your questions can reshape the respondents’ behaviors.

P.S. From personal experience, do you think being asked a question has influenced your subsequent behavior? Please leave a comment below to share!

Related Resources

MarketingSherpa Landing Page Optimization Survey

Focus Groups Vs. Reality: Would you buy a product that doesn’t exist with pretend money you don’t have?

Marketing Research: Cold, hard cash versus focus groups

Marketing Research and Surveys: There are no secrets to online marketing success in this blog post

MarketingSherpa Members Library — Are Surveys Misleading? 7 Questions for Better Market Research

Marketing Research and Surveys: There are no secrets to online marketing success in this blog post

November 23rd, 2010

“Would you like to hear a secret? Do you promise not to tell?” John, Paul, George and Ringo knew how powerful secrets are, as does every Internet marketing “expert” who has ever written a blog post.

Well, I’m sorry, but MarketingSherpa and MarketingExperiments don’t have any secrets to share with you. The only effective strategy I’ve ever seen is hard work and experimentation. Not only do we not have secrets for you, we don’t really even have any answers. But, we can help you ask the right questions.

Question everything

“My mother made me a scientist without ever intending to. Every other Jewish mother in Brooklyn would ask her child after school, ‘So, did you learn anything today?’ But not my mother. ‘Izzy,’ she would say, ‘did you ask a good question today?’ That difference, asking good questions, made me become a scientist.”
– Nobel laureate Isidor Isaac Rabi, who discovered nuclear magnetic resonance

And do we ever raise those questions. Take a recent article by Senior Reporter Adam T. Sutton, Are Surveys Misleading? 7 Questions for Better Market Research. When Adam first showed me the article, I knew it would be a little controversial, so I pushed him a little harder than normal in the editing process. Look at the results, and I think you’ll agree that Adam delivered. (If not, I want to hear about it.)

I was a little surprised that the biggest challenge came from within my own company, though. MECLABS Director of Research, Sergio Balegno, questioned the article’s affront to online surveys. Sergio’s a smart guy, so when he says something I listen. And I think he’s right. Well, kinda…

When online surveys are effective

For the kind of surveys Sergio’s team conducts, I believe they can be very effective. I use his team’s research all the time when trying to decide what content would be most helpful for MarketingExperiments’ and MarketingSherpa’s audiences.

Chart: SEO tactics used by B2B marketers

The chart above, from a recent Chart of the Week email newsletter, asked B2B marketers about the SEO tactics they are currently using. Sergio and his team are not asking about a vaguely potential and highly personal decision somewhere down the road; they are simply asking which SEO tactics B2B marketers use, which are the most effective and which require the greatest level of effort. And here’s where you can learn from Sergio.

I believe surveys can be effective for:

  • Gaining insights into current actions
  • Deciphering opinions on specific subjects that the audience has a high level of knowledge about
  • Getting some new ideas (essentially, crowdsourcing)

When online surveys are not effective

“Would you buy a product that doesn’t exist with pretend money you don’t have?” Yeah, there’s the rub…

Online surveys do not accurately predict actual customer behavior. Or do they? Frankly, it’s just a shot in the dark. Your goal should be to truly gain knowledge about real-world situations that involve complex, often counterintuitive decision-making processes your subjects may not even understand themselves. Would a few questions on a Web page really help you gain that knowledge?

Online surveys are not effective when you’re trying to decipher:

  • Potential consumer actions (such as a purchase)
  • Potential B2B marketer purchase decisions very early in a sales cycle (too many variables)
  • Highly sensitive information (if you disagree with this statement, please share your past three sexual experiences in the comments section of this blog)
  • True sentiment on a complex topic that the survey respondent does not have expertise in. For example, 58 percent of Americans favor repeal of the new health care law, according to a recent Rasmussen Reports survey. Meanwhile, in a CBS/New York Times Poll, 41 percent of Americans favor repeal (stop and think about that for a second); and when people were actually told what features would be given up if the law is repealed, that number dropped to 25 percent.

Let’s do a little thought experiment, shall we? Write the answer to this question down on a piece of paper and bury it in your backyard… “How likely are you to buy each of the following in the next 12 months: regular mayonnaise, light mayonnaise, mayonnaise with olive oil, canola mayonnaise, low-fat mayonnaise?”

Now go leave yourself a reminder in Outlook for November 23, 2011, that says, “Dig up mayonnaise survey.” So, how accurate were you, Carnac the Magnificent?

Only you can discover the marketing tactics that work best for your company

OK, I was a little too fresh up there, sorry about that. But I’m trying to help you understand this simple point (to paraphrase MasterCard): there are some things in marketing you can only learn by observing actual behavior; for everything else, try an online survey.

If you can’t observe the information you seek to obtain and there is a strong likelihood that your subjects know the answer, then a survey could be very helpful. In the example chart above, you likely could not observe the SEO tactics of 935 marketers and see into their brains to determine the effectiveness and effort required. Those respondents also likely know what SEO tactics they used, how well they worked and how much effort they required.

However, when you’re looking at potential customer actions, don’t try to ask prospective customers to predict what they might do under fictional, hypothetical circumstances. From the number of times I’ve asked my wife why she bought those shoes, believe me when I say she likely doesn’t know the answer herself.

Instead, simply observe their actual actions. And you can do that with real-world, real-time online testing.
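For readers wondering what observing actual actions looks like mechanically, the core of online testing is assigning each visitor to an experience and recording what they actually do. The sketch below shows a generic, deterministic hash-based assignment in Python; it is an illustrative approach, not a description of MECLABS’ or MarketingSherpa’s testing methodology.

import hashlib

def assign_variant(visitor_id: str, experiment: str,
                   variants=("control", "treatment")) -> str:
    """Deterministically bucket a visitor into a test variant.

    The same visitor always gets the same variant for a given experiment,
    so the behavior you observe can be attributed to the experience they saw.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Usage: log the assigned variant next to each visitor's actual behavior,
# then compare conversion rates between the groups.
print(assign_variant("visitor-12345", "homepage-headline-test"))

Because assignment is deterministic, a returning visitor sees the same experience, and the behavior you record (clicks, signups, purchases) can be compared across variants instead of relying on what people say they would do.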

After all, that is the real goal of all the information we provide. Again, we don’t write about secrets to Internet marketing success on MarketingExperiments and MarketingSherpa, and very rarely even give you any answers.

But we do help you ask the right questions and then do the experimentation (and hard work) necessary to determine what works best for your organization.

Related resources

2011 Email Marketing Benchmark Report

2011 B2B Marketing Benchmark Report

Ask the Scientist: Price testing methods and practices

Anti-Crowdsourcing: On (not) getting marketing ideas from your customers