April 13, 2017
The Truth About Email Panel Data
We’re often asked about our stance on the use of email panel data. As senders, we want additional data points, visibility, and signals. In a perfect world, email panel data would enable marketers to compare their engagement to that of their competitors within a contextually relevant group of subscribers. Collected ethically and without bias, this data could help accurately assess what actions lead to increased engagement, or conversely, an unsubscribe or spam placement.
Unfortunately, this isn’t a perfect world. In light of what we believe to be significant misinformation and half-truths floating around about currently available panel data offerings, we think it’s important to share why we believe the effectiveness and accuracy of these offerings fall critically short.
How today’s email panel data is fundamentally flawed
While email panel data is attractive in theory, we couldn’t recommend such a tool with a clear conscience given its current state: the geographical bias, its composition of non-primary accounts, the method by which the data is collected (from third-party applications designed to alter behavior by shuffling emails around), and the improbability of statistical significance. In our opinion, given these circumstances, a data set of such questionable reliability should not currently influence business decisions.
One panel data provider, eDataSource, purchased a company called Boxbe to power its panel offering. Based on 30-day Alexa.com data, we believe that Boxbe may suffer from significant geographical bias, with over 38% of its traffic originating in India, followed by just 11% in the United States. On Google Trends, the regional bias is even more pronounced, with a majority of search traffic originating in the Philippines, followed by India (see chart below). We found it peculiar that consumers in these two countries expressed such heavy interest in Boxbe over the years, so we did a second search of the last 12 months. To our surprise, the search interest has shifted exclusively to India over the past year.
If Boxbe’s overall user base is even remotely consistent with the makeup of the traffic reported by Alexa or Google, the odds of this data overlapping much with your mailing list are low. That is, unless, of course, your mailing list is overwhelmingly based in certain regions of the Eastern Hemisphere.
The most glaring issue with Boxbe, at least according to its customers, is that it misclassifies emails and moves them out of the inbox. As one customer said, “… Boxbe decided to start scanning my messages again, which bounced them into my SPAM folder. Upon finding these lost messages, I returned to Boxbe and verified that all features were disabled, however, messages continued to get snagged by Boxbe.”
We almost couldn’t believe that a tool would report on open or deletion rates when the panel is so clearly biased, but its website boasts that Boxbe “empowers you to choose which businesses and contacts can reach your inbox. Anybody who isn’t on your guest list is automatically asked to verify that they are an actual person. This means less spam for you.” Still not convinced, we decided to create our own Boxbe account. Sure enough, it’s still moving emails out of the main inbox folder and into a separate folder and influencing behavior:
See the other folders labeled “OIB?” Those would be Organizer by OtherInbox, which is owned and operated by Return Path, and used as one of the data sources to inform their panel. In an article titled Consumer Network Panel Data: Just the Facts, Return Path revealed their panel composition as 76% dead or secondary email accounts. Yes, you read that correctly: of all the panelists, only 24% are primary accounts defined as “actively engaged.” And yes, the behavior of Organizer users is also being manipulated by the application.
The article also states that the 76% of secondary accounts are “true-to-life and representative of your subscriber list.” Believing that most consumers have separate accounts just for their junk mail seems like a broad assumption given the remarkable improvement of spam filters in recent years.
According to a Radicati Group study from January 2017, there will be more than 3.7 billion email users globally by year’s end. Based on reported panel sizes of “over two million” by both eDataSource and Return Path, each panel covers roughly 0.05% of all email accounts. Of that extremely small sample, how much actually overlaps with your mailing list? Unless you’re one of the largest senders in the world, chances are it’s not much.
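The arithmetic above is easy to check. Here is a back-of-envelope sketch in Python; the panel and account figures come from the paragraph above, while the uniform-sampling assumption and the 500,000-address example list are ours, purely for illustration (real overlap would likely be even lower, given the biases described earlier):

```python
# Back-of-envelope panel coverage. Figures: reported panel size
# ("over two million") and the Radicati estimate of 3.7 billion accounts.
panel_size = 2_000_000
global_accounts = 3_700_000_000

coverage = panel_size / global_accounts
print(f"Panel coverage: {coverage:.4%}")  # ~0.0541%

# If panelists were drawn uniformly at random from all accounts (they are
# not -- this is a best case), a hypothetical 500,000-address list would
# expect to contain only a few hundred panelists.
list_size = 500_000
expected_overlap = list_size * coverage
print(f"Expected panelists on a 500k list: {expected_overlap:.0f}")
```

Even under that charitable best-case assumption, a half-million-address list would share only about 270 addresses with the panel.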
Suffice it to say, no serious marketer should make critical decisions based on such a small sample of secondary accounts, dead accounts, moved or reorganized messages, or geographical bias. That’s not rainbows and butterflies; that’s common sense.
How today’s panel-based engagement data is fundamentally skewed
If you’ve attempted to measure engagement or inbox placement based on one of today’s commercial panel offerings, it’s critically important to understand where a good portion of this data actually comes from: inbox organizing and productivity applications proven to fundamentally alter the email behavior of recipients. Let’s take a look at a few examples of these applications:
- Organizer by OtherInbox (owned by Return Path)
From getorganizer.com: “Organizer effortlessly and securely manages your email. Receipts, newsletters, and more are instantly organized into folders, saving your email inbox for what is most important to you.”
- Unsubscriber by OtherInbox (owned by Return Path)
From getunsubscriber.com: “After you sign up, an Unsubscribe folder is created in your inbox. Drag in unwanted email and Unsubscriber will block email from those senders from entering your inbox.”
- Boxbe (owned by eDataSource)
From boxbe.com: “Boxbe will prioritize all of the mail and messages from only the people you trust.” They go on, “Boxbe empowers you to choose which businesses and contacts can reach your inbox. Anybody who isn’t on your guest list is automatically asked to verify that they are an actual person. This means less spam for you.”
It is our opinion that applications designed to reshape behavior have no business measuring behavior. Think about that for a moment. Judging by nothing more than the text on these websites, they are moving, reclassifying, filing into folders, and in some cases outright blocking messages from being viewed. We’ll ask again: in what scenario would that possibly be an acceptable measurement of engagement, inbox placement, or statistical significance?
Next time you’re told that your read rates are better or worse according to a panel, remember: of course they are! Here’s a simple equation to keep handy:
Engagement by a Gmail recipient on Gmail ≠ Engagement by a Gmail recipient on a third-party email organization application designed to fundamentally change behavior
Seeds don’t open, click, complain, etc., and that’s a good thing
In our experience, when comparing panel and seed data, the question most people are trying to answer is “what will happen next time I send to a new recipient at a particular mailbox provider?” Email panel data fails to answer this question accurately, especially at providers who employ user-level filtering. And unless you’re one of the largest senders out there, chances are the sample size is not large enough to be statistically meaningful to you.
To deflect attention from these facts, some panel providers have attempted to paint seed testing as statistically inferior because it doesn’t factor in engagement filtering. This argument is comically short-sighted for a few reasons:
- Since seed accounts have no previous engagement activity, seed results are a more accurate data point in mimicking what is likely to happen when you email a new contact for the first time.
- Not all providers are solely focused on engagement. In the case of global filtering, seed data is an important signal in determining spam filter disposition or identifying a problem, such as blocked messages, which is a blind spot in panel data.
Is seeding alone a complete solution? No, and nobody here is arguing that. We believe it is most powerful when paired with analytics and engagement data (real engagement data from your customers, not secondary or dormant panel accounts representing a small subset of your list) to achieve a comprehensive view of performance.
Speaking of opens, clicks, and unsubscribes, you know who does do those things? Your customers, as your own analytics tools can show you, especially if those tools utilize data from your email service providers.
In our opinion, currently available panel offerings are jacks of all trades, masters of none. In other words, they don’t excel at anything they purport to do. They cannot provide a consistent baseline for new subscriber deliverability, nor can they provide nearly the holistic view that a sender’s own analytics can when it comes to engagement.
Sound the alarm and chase the ambulance!
Have you ever received a call or a fancy PowerPoint presentation from an overly ambitious salesperson claiming your inbox rate is low based on email panel data? We know many who have. In fact, just last week a Fortune 100 company was told they had over 40% spam placement at Gmail when they know their inbox rate to be near perfect.
We like to call this “ambulance chasing.” Where’s the context? How large was the sample size? Five recipients? 1,000 recipients? Is this even a real problem, or more of a reflection of the few users included in the panel?
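Sample size matters enormously here. As a rough illustration (our own simplification, using the normal approximation for a binomial proportion and a hypothetical 60% observed inbox rate), the 95% margin of error on an inbox-placement estimate shrinks dramatically with panel sample size:

```python
import math

# Hypothetical observed inbox rate from a panel.
p = 0.6

# 95% margin of error under the normal approximation: 1.96 * sqrt(p(1-p)/n).
for n in (5, 100, 1000):
    moe = 1.96 * math.sqrt(p * (1 - p) / n)
    print(f"n={n:>4}: {p:.0%} \u00b1 {moe:.1%}")
```

With five panel recipients, a “60% inbox” reading carries a margin of error of roughly ±43 percentage points, which is to say it tells you almost nothing.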
In our experience thus far, the more digging you do and the more questions you ask, the more apparent it becomes that today’s email panel data is inaccurate at best. At worst, it can lead marketers down a rabbit hole of trying to solve a problem that doesn’t really exist. We see many senders falling prey to this type of ambulance chasing, but it’s often easily resolved by walking through their analytics data and looking for meaningful changes in engagement. We’ve seen quite a few of these presentations, and we’ve yet to see one that is remotely accurate.
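That sanity check can be as simple as the sketch below, with entirely hypothetical campaign figures: compare the latest campaign’s open rate from your own analytics against a trailing baseline before accepting a panel-based alarm.

```python
# Hypothetical analytics export: (delivered, unique_opens), oldest first.
campaigns = [
    (100_000, 21_500), (98_000, 20_900), (102_000, 22_100),
    (101_000, 21_800), (99_500, 21_300), (100_500, 21_600),
]

rates = [opens / delivered for delivered, opens in campaigns]
baseline = sum(rates[:-1]) / len(rates[:-1])  # trailing average
latest = rates[-1]
drop = (baseline - latest) / baseline

print(f"baseline open rate {baseline:.1%}, latest {latest:.1%}")
if drop > 0.10:  # a >10% relative drop would warrant investigation
    print("Meaningful engagement drop -- investigate.")
else:
    print("No meaningful change -- the panel alarm is likely noise.")
```

If your real customers’ engagement is steady, a panel claiming 40% spam placement is telling you about its panelists, not about your program.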
So what now?
Again, we believe that email panel data is attractive in theory, but not in today’s practice. We fully support data and tools that increase transparency and improve the email sender-receiver relationship, when those tools are unbiased and can be consistently relied upon. We simply don’t believe the current generation of offerings are accurate or effective tools for marketers. Given what we now know about these data sets, we challenge the industry to put a stop to the spread of inaccurate information. And for the consumers of this data, we recommend taking a more objective approach to how you use and interpret it, given the way it’s collected and the uncertainty of its accuracy.