Simplifying complexity: Bringing research in-house
Originally published on Research Live
While the fundamental principles of market research rarely change, it’s hard to deny that new tools and technologies have turned the industry on its head. What is now standard practice would have seemed like science fiction only 20 or 30 years ago.
Research before the data age
Back in 1994, I started my career in the data processing team of a London market research agency. At that time, I felt technologically advanced using Unix workstations to convert pen and paper questionnaires into numbers.
The number of data sources was relatively small, as were the touchpoints and the means of reporting. Survey responses had to be converted into reams of tabulations and then sent to a single printer big enough to merit its own air-conditioned room. Once the analysis tables were ready to be sent to the client (after days of cross-checking), it was time to schedule a run on the laser printer — but everything had to be certain, as running the printer cost a small fortune.
The client-facing teams were just starting to adapt to the data age, and if lucky, each team would share a PC between them running Windows 3.1. Email started being commonly used and the possibilities seemed endless.
The need to keep today's research systems simple and action-based
In the quarter of a century since, the change has, of course, been dramatic. The number of data sources, touchpoints, processing systems, aggregation and reporting tools would be unrecognizable back in 1994. People holding job titles that didn’t exist then, such as ‘data scientist’, are now among the most valuable graduates a company can recruit. In this new environment, it can feel like we all need qualifications in machine learning, AI, and natural language processing just to do our jobs.
The result is that, for businesses trying to embrace a data-driven culture, this level of sophistication can become counterproductive. The more processing, people and systems involved, the more distant the needs of the end-user become from the original source of the insights. It’s easy to find yourself drowning in data while trying to understand what it means and what actions to take.
While the world we are trying to understand and influence is more complicated, we need to ensure our approach is as simple as possible
Our systems need to be fast and responsive, providing a smooth action-based user experience. Chief execs need simple dashboards and metrics that can help drive actions and identify priorities. Insights professionals need straightforward and navigable tools that do the number crunching for them.
At the same time, the simplicity of these systems should not come at the expense of data privacy. And yet, with more technologies, systems and potential data silos, it grows ever harder to ensure data is being used both effectively and ethically.
What’s remarkable is that, despite the wholesale change in the data and insights ecosystem, the core practices have remained largely unchanged. Most companies still outsource all of their data collection and insight generation to third-party vendors, and not only to traditional research agencies, although agencies remain common beneficiaries. This is a lost opportunity.
Companies need to understand the world in real-time. An online retailer or in-store manager cannot wait a few days or weeks during a busy period to know that one of their stores is underperforming. Consumers have unprecedented choice, allowing them to go elsewhere, compare prices, or check a competitor’s delivery terms, all at the click of a button.
The future of research: outsourced tools, in-house brains
The fastest growing and most successful companies in the last decade, and those that will win in the next, are those that embrace data in all of their decision making. They take full ownership of their data in all aspects, from generation to aggregation, dashboarding, reporting and action.
The traditional approach is completely contrary to this, with sub-contracted ownership of data and slower dataflow by design. Businesses need to reduce the distance between consumers and business decisions. They need to integrate data across their entire business — ultimately, they need to change their culture and own their data.
So, what does all this mean for your business? Well, my message is simple: Outsource the tools, not the brains, the process or recommended actions. Your company no longer needs a high-end (and high-priced) researcher to write a coherent questionnaire. With today’s technology, insights that deliver real business value can be driven from within your organization.
25 years on from my introduction to the data and insights ecosystem, I can now see that those of us in the research industry are finally changing our working model — and our clients will be the new beneficiaries. It’s all about simplifying complexity.
Download our guide to building agile in-house research teams
August 4, 2022