Hendon, near the end of the Northern Line, was our destination on Thursday afternoon as a colleague and I visited Middlesex University to give a lecture to second-year undergraduates. The subject: Survey research in the ‘real world’.

It proved a fascinating insight into the disconnect between the theory behind the work that we do and the way that we, in practice, conduct research studies. The students had diligently been taught the importance of random and representative samples, ideal sample sizes, randomised controlled trials and so on, but it’s rare that one has the chance to apply the perfect theory to a research study design. I suppose you need to know the rules before you can determine which ones you can get away with breaking.

Anticipating that the lecture might go down this path, we had talked to a few colleagues about the practicalities you have to consider in the ‘real world’ that can obstruct your path to the ideal piece of research. We came up with the following, our ‘patchwork of real-world practicalities’: a patchwork not just because it made for a pretty slide, but because all these practicalities are ultimately interconnected and interdependent.

We ordered them in a hierarchical fashion: as shown on the top row of the chart below, budget, timescale, access to sample and capacity to undertake the work are, to my mind, the key factors affecting whether or not a particular approach is possible. If one of these doesn’t quite work, your project will fall down pretty quickly.

Patchwork of real-world practicalities

Our next row covers population information, data quality, ethics and the client. I could go into detail on all of these, but I appreciate readers have better things to do (second-year Middlesex undergraduates weren’t afforded the same luxury). Without appropriate care, all of them can derail or undermine a project (naming no client names…), and some of their effects are less visible than others. Pausing on data quality: it’s scary how readily we take for granted statistics that are bandied around by politicians or in the media, when we know little about how those figures were arrived at. People interpret questions in different ways, bring their own biases and often suffer from recall issues, yet a figure, once reported, obscures all of those concerns in its simplicity.

And our final row considers other practicalities affecting delivery. There are often externalities beyond anyone’s control: a number of our public sector projects are currently in a period of purdah, meaning we cannot conduct fieldwork until the General Election is over.

The remaining practicalities all relate to people:

  • we are reliant on the goodwill of people to respond to our surveys, which in turn ensures the healthy response rates that keep our samples robust (a rough sketch of how response rates translate into precision follows this list);
  • we are affected by the various stakeholders applying pressure to our direct client, which typically results in a study trying to answer multiple, often competing, objectives;
  • we are reliant on having a team in place that is capable of delivering the project and, just as critically, one that we enjoy being part of.
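On that first point, precision is where response rates bite. Here is a minimal Python sketch, assuming simple random sampling, a 95% confidence level and the worst-case proportion of 50%; the issued sample size and response rates are purely illustrative, not figures from any of our studies.

    import math

    Z_95 = 1.96  # z-score for a two-sided 95% confidence interval

    def margin_of_error(n, p=0.5):
        """Margin of error for an estimated proportion p from a
        simple random sample of n respondents (worst case at p = 0.5)."""
        return Z_95 * math.sqrt(p * (1 - p) / n)

    # How a falling response rate erodes precision for a fixed issued sample.
    issued = 2000  # hypothetical number of questionnaires sent out
    for response_rate in (0.6, 0.4, 0.2):
        achieved = int(issued * response_rate)
        print(f"response rate {response_rate:.0%}: n = {achieved}, "
              f"margin of error = ±{margin_of_error(achieved):.1%}")

The square root in the formula is why robustness erodes gently at first and then painfully: in this illustration, dropping from a 60% to a 40% response rate widens the margin of error from roughly ±2.8 to ±3.5 percentage points, while a slide to 20% pushes it towards ±5.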

So there we have it. Through the lecture we hoped students would gain an insight into how the reality of conducting research can depart quite dramatically from what the textbooks say. And we also hope we didn’t completely turn them off a career in research!

Do you agree with our patchwork of practicalities?

Written by Sam Whittaker, Senior Research Executive, and Andrew Skone James, Director, in our Higher Education team.