Questions get a bad rap. Everyone is so obsessed with answers. We fawn over people who are smart, who have all the answers.
You know who has all the answers? Robots have all the answers. Like Skynet. In the “Terminator” movies, the robots’ answer was to get rid of humans.
Answers have had their day.
We seem satisfied with almost any answer, often because we’re so relieved that someone was brave enough to offer one. This is what we have called leadership up to now. Making a decision is more important than making a good decision. Experts don’t need to research questions. They’re experts.
We are so obsessed with answers that we forget to examine the questions we ask.
“Can I have a fish?” vs. “Can you teach me to fish?”
The only questions we prize seem to have no answer, like, “WTH?”
When data was expensive, it was OK for our executives to simply deliver an answer, even though this degraded the answer to opinion. If you’re paid enough money, your opinion qualifies as an answer. But data has gotten cheap. It’s now easy to collect data to help us make decisions.
This leaves executives in a very awkward position. They were the ones with the answers. But in a world as complex as marketing and advertising, there are no answers. There are only pointers and clues — guideposts through the unpredictable workings of the human mind.
When we ignore our ability to collect data, we are ignoring these guideposts.
It’s time to fall in love with questions.
Questions come with their own coat hanger. Or is it half of a light bulb?
When you’re optimizing a campaign or website, you’ll find that the most successful outcomes come from generating questions.
Collecting questions
There are two primary ways of generating questions: exploratory and exploitative. In the course of optimizing a campaign or website, you will move between these two states frequently.
But move carefully. Each has its place.
I wonder…
When you start by saying, “I wonder,” you elicit exploratory questions. The questions that flow from “I wonder” are often qualitative in nature, rich in subjectivity.
Isn’t that exciting?
I wonder if the copy or the offer could be improved.
At some point, we will end our exploration and begin the next phase of questioning.
I think…
“I think” questions are exploitative. They usually come after researching exploratory questions.
“I think” questions lend themselves to data collection, as they can be disproved. This makes them more quantitative in nature.
“I think the world is flat.”
Avoid starting with ‘I think…’
As a marketer, you may find yourself jumping past your exploratory “I wonder” stage and right to “I think.” It is the plight of the oppressed and overworked marketer that the organization starts with “I think” questions.
Buy yourself some time. Do 15 minutes of meditation or a half-hour of yoga, and then ponder your “I wonder” questions before jumping to “I think.”
For example, the conversion optimization industry is maturing, and at one point, we thought it might be time for a more corporate (i.e., boring) approach for our website.
Our “quirky” website performed well for several years.
We started with an “I wonder” question, asking:
“I wonder if our quirky website should be more corporate?”
This exploratory question could easily have morphed into an exploitative, “I think we need to make the site look more corporate.” Many a redesign has been launched on so little thought.
This led to several exploratory questions:
“Do prospects prefer our competitors’ approach?”
“Could we ask our visitors if the site conveys professionalism and authority?”
“Does having a spokesperson (me) make us seem too small?”
Each of these questions can be researched quickly and cheaply.
We decided to collect some data on this. We ran a preference test of our home page and the home pages of our largest competitors using UsabilityHub. We asked 50 visitors, “Which of these companies would you feel most comfortable hiring to help improve the sales performance of your website, and why would you choose it?” This is primarily qualitative research, but it’s inexpensive and quick to collect.
We found out that only 2 percent of participants preferred our site over those of our four biggest competitors. Ouch.
We then created a more corporate-looking mockup of what our site might look like. We did the test again.
A more corporate look was preferred by user test participants.
This time, our site was preferred 16 percent of the time. That’s progress.
When we looked at the participants’ comments, we found that many felt the client logos on competing sites lent them credibility. This created an exploitative statement:
“I think that we should add colorful client logos to our home page.”
This led to the question:
“Will client logos on our home page make people more willing to spend money with us?”
We knew how to collect some data on this. So we did a mockup of our corporate-looking home page with logos right at the top and ran another preference test.
Even though users said they liked customer logos, they didn’t prefer this treatment of our home page.
This version was preferred by 10 percent of the new participants, a statistical tie with the previous 16 percent because of the small sample sizes (n=50). No magic bullet here.
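Curious how 16 percent and 10 percent can be a “statistical tie”? A quick two-proportion z-test shows it. Here’s a minimal sketch in Python, assuming n = 50 in each round, with the success counts back-calculated from the percentages above (16 percent of 50 is 8; 10 percent of 50 is 5):

```python
# Minimal two-proportion z-test: is 16% of 50 really different from 10% of 50?
import math

def two_proportion_z_test(successes_a, n_a, successes_b, n_b):
    """Return (z, two-sided p-value) for H0: the two proportions are equal."""
    p_a = successes_a / n_a
    p_b = successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

z, p = two_proportion_z_test(8, 50, 5, 50)
print(f"z = {z:.2f}, p = {p:.2f}")  # z ≈ 0.89, p ≈ 0.37 -- nowhere near significant
```

A p-value of roughly 0.37 means a gap this size would show up by chance more than a third of the time, so neither version can claim victory at these sample sizes.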
We then thought, “I think we should try a very different headline.”
An “over-the-top” headline didn’t change things statistically.
This didn’t significantly move the needle in a new preference test. We’ll try a few more headlines. And eventually, we’ll shift to a data collection method that can deliver real statistical significance: A/B testing.
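Why does statistical significance require an A/B test rather than another 50-person preference test? The standard sample-size formula for comparing two proportions gives a sense of scale. A minimal sketch, assuming we wanted to reliably detect a gap like the 16 percent vs. 10 percent above (5 percent significance level, 80 percent power; these parameters are illustrative, not our actual test plan):

```python
# Approximate sample size per variant to detect p1 vs. p2
# (normal-approximation formula for two proportions).
import math

def sample_size_per_variant(p1, p2):
    z_alpha = 1.96  # two-sided 5% significance level
    z_beta = 0.84   # 80% power
    p_bar = (p1 + p2) / 2
    return math.ceil(
        (z_alpha + z_beta) ** 2 * 2 * p_bar * (1 - p_bar) / (p1 - p2) ** 2
    )

print(sample_size_per_variant(0.16, 0.10))  # ≈ 493 per variant
```

Roughly 500 visitors per variant, about ten times what each preference test used. That’s the kind of volume an A/B test on a live site can deliver.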
Stay with the questions
What this comes down to is that when we stay with our questions, we get better at identifying sources of data to support or discard those questions. And rather than look for answers, we begin to embrace the additional questions that inevitably come up.
“I wonder” how many readers will find new freedom in holding onto their questions?