Don’t Get Fooled By Bad Statistics

As a conversion optimization consultant, I look at studies and statistics all the time. Why? Because I’m committed to doing great work that is founded upon facts. My goal is to know the truth about conversion best practices, act upon the truth, and tell other people about it.

But, let’s face it, not everything you read is true. Your mom told you that. In the age of mass information, we’ve got to be more discerning than ever about what’s legit and what’s not. As optimistic as I am about the awesomeness of humankind, I suspect that some people twist the truth to advance their interests.

I’ve worked hard to get at the facts, and to find out which facts are reliable. I’m going to share that information with you, because the better we can discover the truth about conversions, the better we can act upon it to improve our conversions.

As you read, research, and learn about the conversion optimization industry, here is how you can tell whether studies, metrics, data, and statistics are reliable. I’ll share seven signs to look for.

1.  The statistic is backed up by a link to a study or research.

If you read some data, look for verification. Does the author prove what she’s saying? If so, you’ll probably see a link. Click on the link. Does it go to the actual study, or does it go to someone else who’s parroting the same information? If the link goes to a data source, good. If the link goes to someone else’s article making the same claim, be skeptical.

This is the most fundamental aspect of research. If you have a link, and the link goes to something legit, you’re on the right track.

2.  The data is backed up by the author’s own research.

When an organization or author conducts original research and shares it, my trust level improves a little bit.

For example, CopyBlogger suggests that to improve the clickability of my call to action, I should do the following:

[Image: CopyBlogger’s suggestions, including giving the call-to-action button a 3D effect]

I want to know why they make such claims.

Well, here’s my answer, in the same article:

[Image: CopyBlogger’s CreateDebate button test results]

Okay, that’s good. My trust level is medium now, because they told me they ran a study and found a nice percentage increase.

I do have to keep in mind that this is just one study testing one change in one button. That’s not a whole lot of evidence to make me rush off to create red 3D buttons that say “argue now.”

Studies, even ones that an author or group conducts themselves, are good. Bonus points if they explain their research process, show all their data, and state their hypothesis.
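
When a study does publish its numbers, you can sanity-check them yourself. Here’s a minimal sketch in Python of a two-proportion z-test, a standard way to ask whether a button test’s lift could just be noise. The visitor and conversion counts below are hypothetical, since the article above doesn’t publish them (which is exactly the problem):

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Is variant B's conversion rate significantly different from A's?"""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled rate under the null hypothesis that the variants are identical
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, z, p_value

# Hypothetical counts -- the kind of numbers a study ought to publish:
# 1,000 visitors per variant; 50 conversions for A, 65 for B.
p_a, p_b, z, p = two_proportion_z_test(50, 1000, 65, 1000)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}  p = {p:.3f}")
```

With 1,000 visitors per variant, even a 30% relative lift (5.0% vs. 6.5%) comes out to p ≈ 0.15, which isn’t conventionally significant. One button test on its own rarely settles anything.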

3.  The link goes to an accessible source.

Sometimes, the link that verifies a claim will go to a restricted source. In my research on psychology, I occasionally come upon studies that cost money.

For example, I came across a great article in the Journal of the Academy of Marketing Science. I’d love to cite this article, but I can’t dig into it because I can’t afford the $40 fee.

[Image: the journal’s $40 paywall]

So, if I come across this link while I’m reading someone’s article, I might be a little bit skeptical at first. However, if I can determine the veracity of the claim based on the abstract, then my confidence level goes up. Here’s what I might read in someone’s article:

“Some researchers think that people simply prefer to buy products that are made in their own country.”

I’m curious about this claim, so I click on the link:  http://link.springer.com/article/10.1177/0092070303257644

I go to the abstract, and read:

“This study uses a multidimensional unfolding approach to examine the preference patterns of U.K. consumers for domestic products and those originating from specific foreign countries for eight product categories. Results indicate that the observed variability in preferences is linked to consumer ethnocentrism.”

That’s a fair summary of the research, even though I can only verify it against the abstract. I can probably trust that claim.

4.  The writer uses disclaimers.

The more knowledge a person acquires, the more they feel the need to provide disclaimers.

If you are reading an article that makes a lot of bold, brash, big claims, you have a right to be skeptical. On the other hand, if the author takes care to offer disclaimers, it is more likely that the author can be trusted.

Here are some examples of disclaimers:

  • “Though it can’t be proven …”
  • “According to some sources …”
  • “In his opinion …”
  • “The team at DataCenter thinks that …”
  • “Some market analysts have hypothesized …”
  • “There hasn’t been enough research to be conclusive, but there is some indication that …”
  • “Even though not everyone agrees, it seems likely …”
  • “Based on the little research that I’ve done, it looks like …”
  • “His argument may not be airtight, but he does make a fairly compelling case for …”
  • “I came across around four SEOs who said the same thing, so I’d suggest …”
  • “It’s impossible to make hard and fast conclusions from this data, but we can venture a guess …”
  • “This is what I’ve found to be true, but you should test it yourself.”
  • “There are statistics all over the board, but most of the research puts the number somewhere around …”


These authors aren’t trying to be flimsy. They’re just trying to be accurate. They’re avoiding brash claims, because they know that the data they have isn’t 100% verifiable. A bit of honest relativism comes into play.

This type of bet hedging tells me that the author has done their due diligence. They’re not trying to cram falsehoods down my throat. They’re showing me what might be true, while trusting that I’m smart enough to come to my own conclusions. It also shows that the author has enough sense to do the research and calmly offer some data.

5.  The studies and statistics are published by a reputable source.

One of the easiest ways to determine whether a number or claim is reliable is if it’s been published in a reputable source. If you’re a physician, and you want to research recurrent thrombosis in stent patients, you can read this article and be pretty sure that the information is good:

[Image: a JAMA article on stent thrombosis, with its list of authors]

The source is JAMA – the Journal of the American Medical Association. This is a reputable periodical with peer-reviewed studies, as indicated by the list of names below the title.

I have a list of nearly all the professionals in my industry who have a web presence. I know who can be trusted and who can’t. I have deep and wide knowledge about my niche. Therefore, I’m able to determine which sites have reputable information.

In each industry and niche, there are those who are authoritative. Your job is to find out who they are. This takes time and experience, but it can be done.

6.  The authors know their stuff.

Another way to determine whether information and data are worthy of respect is to find out a little bit about the authors. What personal basis do they have for making these claims? Most blogs have a bio section. Most professionals have a LinkedIn profile. Do some research on the author.

I read an article recently that listed these guys as authors:

[Image: an author list with university credentials]

If you read a bio like this, your skepticism should be higher:

“Sunshine M” lives in California and loves to surf. She loves iPhones, kittens, and chamomile tea, and writes blog posts in her free time.


Anyone can fluff some sweet-sounding, ego-building verbiage. But some authors, like those in the first example, have university professorships and published works. I can even email them if I want to. This allows me to trust them to some degree.

I’m not so sure about Sunshine.

7.  The data is corroborated by other sources.

When I was doing some research about price points in conversions, I came across an article in The Journal of Economic Perspectives. That’s a reputable source, so that was good enough for me. I also noticed that the article had been cited 21 times.

[Image: the article listing, showing its citation count]

When a study is cited by other high-level publications and research, that lends validity to its claims.

In the case of the above article, I could track down exactly which journals and studies cited the research. It was impressive.

[Image: the journals and studies citing the article]

Not every source you find is going to have citation lists, but if it does, you’re in good shape. Also, if you find a data point that is cited by other reputable professionals, there’s a strong chance it can be trusted.

The further you get from the source, however, the less reputable the information becomes.

Here’s a fictional example:

  1. BlogCentral claims, “Never use the color green for your call to action! 90% of consumers hate the color green, says UCLA research.” You read this and think, “Hmm. UCLA, huh? That must be awesome.” But caveat lector! Trace the link in BlogCentral, and what do you find? A link to ProClickGroup.
  2. ProClickGroup claims, “According to some research from ColorPsych, citing UCLA research, as many as 9 out of 10 people would choose a color other than green, if asked to pick from a collection of 10 colors.” So you click on the ColorPsych link.
  3. Here’s what ColorPsych writes: “UCLA researchers presented lab rats with 10 colored feeding tubes. As many as 90% of the rats chose magenta, navy blue, black, purple, teal, white, marigold yellow, and dark gray. Only 10% of the rats chose lime green and forest green.”
  4. So, you click on the UCLA research link. What do you find? An abstract to a locked article. The article was published in 1981. It costs $956 to unlock it. The title is “The Effect of Coloration and Dietary Impact on Lab Rats Using FD&C Additives Blue #1, Blue #2, Green #3, Red #3, Red #40, Yellow #5, and Yellow #6.” The abstract reads, in part, “Preliminary analysis is inconclusive, but suggests that the dietary impact of color additives is negligible, but possibly corollary effects have as their causality the possible preference of some colors.”

When you track the data back to its source, you find out that the BlogCentral claim is useless.
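
You can even script the first pass of this kind of link-chasing. Here’s a minimal sketch, assuming the third-party requests and beautifulsoup4 packages and a purely hypothetical starting URL, that lists the external domains a page links out to, so you can see at a glance whether a “source” link points toward actual research or just toward another blog:

```python
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def outbound_domains(url):
    """Fetch a page and return the distinct external domains it links to."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    own_domain = urlparse(url).netloc
    domains = set()
    for a in soup.find_all("a", href=True):
        href = urljoin(url, a["href"])  # resolve relative links
        domain = urlparse(href).netloc
        if domain and domain != own_domain:
            domains.add(domain)
    return sorted(domains)

# Hypothetical starting point -- the blog post making the claim.
for domain in outbound_domains("https://blogcentral.example.com/green-buttons"):
    print(domain)
```

If the only domains you see are other marketing blogs, that’s your cue to keep digging toward the original study.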

A Case Study

Let me provide a quick case study of how I go about my research:

When I was preparing this article, I wanted to find out just how many words we see in a day. So, I Googled, “how many words a day do people read?”

[Image: Google search results for the query]


I clicked on the first link.

[Image: the Utne article]

A few things stood out to me as I looked at the article:

  • I didn’t recognize the publication. That was a warning sign.
  • The data was old — 2009. However, it was ranking high in the SERPs, so it had some algorithm-based authority. That’s good.
  • The tagline of Utne is “cure ignorance,” which means that they’re not just trying to make people laugh. They’re trying to inform people. That’s good.
  • There was a typo in the first line! When I went to look for that typo in the cited article, it wasn’t there. That means the author was careless both in his typing and in his recording of the source information. That was a warning sign.

[Image: the cited Guardian article]

  • The author cited The New York Times and The Guardian. That was good.


Now, this wasn’t really an article. It was just some copied information from The Guardian. So, naturally, I went over to The Guardian’s site to see what the source article looked like. Thankfully, the links worked (good sign).

[Image: the source article on The Guardian’s site]

The information in The Guardian was actually from some subsidiary content source, but it was still a Guardian product. So far, so good. Seeing a photo of the author helped to inspire trust.

[Image: the author’s byline and photo]

I did notice, however, that the article had very few shares. None, actually. This wasn’t exactly confidence-inspiring.

[Image: the article’s share counts]

Definitely forgivable, though.

The content itself is numbers-driven. That’s good.

[Image: the link-count figures]


Numbers are comforting, but they need to be analyzed. When I take some time to read this article, I see that the author is trying to prove something — there’s an overload of information in the new media economy.

What data does she lay out to prove this claim? The number of links on major newspapers’ homepages.

Okay, but think about that. Do you click on every single link on a newspaper’s homepage? All those ads? All those videos? All those terms of service, user agreements, author profiles, social plugins, comment threads, “arts” headlines, S&P 500 quotes, and site feedback links?

Absolutely not.

Just because a newspaper has links on its homepage doesn’t prove that we have information overload. To be clear, I’m not arguing that information overload isn’t real.

I’m just questioning the way in which Bilton, the researcher cited, arrives at his conclusion. His rhetorical statement is, “And we wonder why people have information overload of content.” This statement assumes, of course, that people have “information overload of content.”

But, for all those numbers, he hasn’t proven it to me.

Bilton goes on to state that if you visit 200 web pages in one day, “you’ll see on average 490,000 words.” That’s a fair statement; it works out to an average of 2,450 words per page, roughly the length of a long article. But it still doesn’t prove anything. I visit way more than 200 web pages in one day. But do I read every single word on every single one? Absolutely not.

Just because I see words doesn’t mean that I read those words. I see lots of blades of grass outside my window right now, but I’m not examining each one.

So, let me circle back around to my original query: “How many words a day do people read?” In trying to track down an answer, I came across a fairly reputable site and read some nice-sounding words, but I came to the conclusion that it wasn’t worth repeating in this article. Why?

Basically, it didn’t really answer my question. I don’t care if people see 490,000 words in a day. I care about how many words people read.

Maybe my query was messed up.

That’s the process I go through whenever I track down information. And it doesn’t end there.

I go on leapfrogging pilgrimages that carry me from link to link to link across the web, hoping to find the purported study or source of information. To figure out the answer to my question, “how many words a day do people read?” I might click on 40-50 links.

Conclusion

This article has been a bit different from my ordinary fare. I think it’s an important topic. Let me give you just a couple of takeaways:

  1. Be skeptical of everything. One of my mottos in life is “caveat lector,” or “reader beware.” It’s Latin, so it sounds pretty smart. Basically, you need to be wary of everything you come across. Research things yourself. Maintain a healthy skepticism.
  2. Be a tester. I’m passionate about testing everything, and I have the tools and know-how to do it. That means I can verify or disprove data and claims myself, using my own research. I recommend that every web marketer learn how to test; testing your own site is the way to get the best information. The sketch below shows one piece of that process.
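
If you want to run a test yourself, the first question is how much traffic you need before the result means anything. Here’s a minimal sketch using the standard normal approximation for a two-proportion test; the baseline conversion rate and the lift you hope to detect are hypothetical placeholders you’d swap for your own numbers:

```python
import math

def sample_size_per_variant(baseline, relative_lift):
    """Rough visitors needed per variant to detect the given relative
    lift at 5% significance (two-sided) and 80% power, using the
    normal approximation for a two-proportion test."""
    p1 = baseline
    p2 = baseline * (1 + relative_lift)
    z_alpha = 1.96  # two-sided alpha = 0.05
    z_beta = 0.84   # power = 0.80
    pooled_var = p1 * (1 - p1) + p2 * (1 - p2)
    n = (z_alpha + z_beta) ** 2 * pooled_var / (p2 - p1) ** 2
    return math.ceil(n)

# Hypothetical: 5% baseline conversion rate, trying to detect a 20% lift.
print(sample_size_per_variant(0.05, 0.20))  # ~8,146 visitors per variant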


That’s the price of research. It’s not easy, and it’s not always fun. But when you do seize upon a bit of information that’s backed by solid thinking, reputable research, and great authority, it feels really good.