Every marketer knows the value of making data-driven decisions. Decisions backed by data are more effective, but that only works when the data is accurately collected and applied in the right context. It’s hard to interpret data correctly when you didn’t compile it yourself.
Two types of data
There are two general types of published data: market projections and data that comes from self-reported answers. It’s smart to listen to feedback from surveys, but be cautious about believing market projections.
Presented well, numbers are powerful and persuasive. On the internet, data can be manipulated, altered, and slanted to fit almost any agenda. Inaccurate data isn’t always intentional, either: even when you collect your data directly, it’s possible to misinterpret it.
It’s wise to avoid relying solely on data for making vital business decisions. Data should be considered but set aside when it doesn’t make sense. Here’s why:
You can’t trust every statistic on the internet
If you gather data by searching Google for general statistics on your market, the data you collect might not be accurate, even when it’s from a leading research firm.
According to the internet, 73.6% of statistics are made up. That’s a statistic entrepreneur Mark Suster made up to demonstrate a point about made-up statistics. While sizing up the mobile handset markets in the U.S., Europe, and Asia, Suster reviewed reports from Gartner, Yankee Group, IDC, Goldman Sachs, and Morgan Stanley. The reports showed vastly different data, so he decided to call the people who created them.
On his first call, he spoke with an analyst who wrote one of the reports. When Suster asked how he arrived at his projections, the analyst said, “Well, my report was due, and I didn’t have much time. My boss told me to look at the average growth rate over the past 3 years and increase it by 2% because mobile penetration is increasing.”
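To see just how thin that methodology is, here is a minimal sketch of the back-of-the-envelope projection the analyst described: average the year-over-year growth rate for the past three years, then tack on 2%. The shipment figures are invented purely for illustration; they are not from Suster’s post or any real report.

```python
# Hypothetical handset shipments (millions of units) for the last four years.
# These numbers are made up for illustration only.
shipments = [100.0, 112.0, 121.0, 133.0]

# Year-over-year growth rates for the past three years
growth_rates = [(b - a) / a for a, b in zip(shipments, shipments[1:])]
avg_growth = sum(growth_rates) / len(growth_rates)

# "Increase it by 2% because mobile penetration is increasing"
projected_growth = avg_growth + 0.02
projection = shipments[-1] * (1 + projected_growth)

print(f"average growth:   {avg_growth:.1%}")
print(f"projected growth: {projected_growth:.1%}")
print(f"next-year figure: {projection:.1f}")
```

Swap in different historical figures, or a 1% bump instead of 2%, and the “projection” moves accordingly; nothing in the method connects it to actual future demand, which is exactly Suster’s point.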
Suster grilled another analyst, whose projections were 3% higher than the first report’s. The analyst admitted to interviewing manufacturers and corporate buyers to get a good storyline, then arbitrarily estimating the future numbers. Suster asked, “And you had to show higher growth because nobody buys reports that just show that next year the same thing is going to happen that happened last year?” The analyst replied, “Um, basically.”
The analysts Suster spoke with worked for some of the top research companies businesses rely on for market projections. The guesswork involved in their reporting process highlights the importance of performing market research firsthand before making critical decisions.
Self-reported statistics are generally trustworthy
The point of collecting data is to make use of it. If it’s not accurate, it’s not useful. You can’t do much with projections derived from guesswork, but you can with self-reported data.
For instance, Green Residential reports that Millennial renters have the highest engagement on social media, citing a study that found 89% of Millennials aged 18-29 use social media. The data from this study isn’t an interpretation; it’s a collection of self-reported facts suggesting social media is where marketers should engage with Millennials. That’s an accurate and useful statistic, and it can drive concrete decisions, such as where to focus your media buying, because the numbers describe what respondents actually said rather than what an analyst predicted.
Statistics projecting a market’s growth are difficult to trust because nobody can accurately predict the future. In a sense, projections are self-fulfilling prophecies: when enough people believe them, they drive the market in that general direction. The stock market works the same way.
Statistics that come from self-reported answers are more likely to be accurate because the data isn’t an interpretation; the numbers are raw.
Numbers don’t lie but interpretations might
We’ve all heard that “numbers don’t lie.” That may be true in isolation, but if there’s any possibility for interpretation, the numbers you’re working with could be misleading.
Before you base decisions on data collected by others, consider how that data was gathered. If it was interpreted in any way before being published, treat it as a possibility rather than an absolute. If you can’t verify the data before taking action, it could be a false statistic.