Phil Hearn: Blogger, Writer & Founder of MRDC Software Ltd.

Efficient vs inefficient market research data processing

I am sometimes shocked to see how inefficiently some market research data processing tasks are carried out. Every business has inefficiencies: the wrong people being asked to perform a task, the wrong tools being used, or the wrong approach being adopted. The ‘price’ of these inefficiencies can be anything from a few pounds or a few hours right through to a task taking ten times as long as it should, or perhaps even more.

This guide is broken into two easy-to-follow parts: firstly, the things that can go wrong and, secondly, how to avoid those problems. So, let’s get started.

Things that can go wrong

Types of projects that often go wrong
  • Big (and, in fact, small) tracking studies
  • Big sets of data
  • Large volumes of tables/charts/reports
  • Complex tasks
  • Repetitive requirements/outputs
  • Merging of data sets
  • Repetitive data (sometimes called data loops) such as diaries, trip-based data records, eating occasions
  • Unusual data file formats (either as input or required as outputs)
  • Non-standard data delivery requirements
My lightbulb moment

Many years ago, when I was learning to program, I spent about three days programming a data management and analysis task in Visual Basic. I was delighted when my program worked perfectly and produced correct results – I awaited a pat on the back. Sadly, someone looked at my program code and showed me in about 20 minutes how I could have written the program more efficiently. He had taken 20 minutes to do what I had achieved in three days. I suddenly realised that inefficiency was not about making a difference of 5% or 10%; it could be massive.

DP inefficiencies can inflate costs exponentially

The fact of the matter is that poorly handled tasks in data processing can hugely increase staff time and, of course, costs. Badly handled data processing or programming tasks carry far larger penalties for the wrong approach than most business tasks do. This maxim certainly applies to my industry, market research, where many variables may need to be analysed in depth.

Wrong people

One of the reasons that data processing tasks go over budget is that the wrong people are being asked to carry out a task. This may be due to a lack of training in handling something more complex than a person or team has dealt with before, or it could be that a task needs an expert. I have seen staff struggle over a complex task (and often fail) for hours when an expert could solve the problem in minutes. I often find that people working in data processing have a dogged determination to work through difficulties. Admirable though this is, it can mean an inefficient process arises, which will carry penalties later. Or, more simply, someone spends many times longer on a task than they need to.

Wrong software

The issue of using the wrong software to carry out a data processing task is equally important. Many software packages have a way of getting around a problem, but if this means jumping through too many hoops, it will not only mean that the task is carried out inefficiently; it may also be prone to error or difficult to manage or change in the future. I could name more than one company that, in my opinion, has persisted with an unsuitable product rather than spend less than £1,000 on a suitable one that would have saved thousands in staff costs.

Wrong approach

There are two types of data processing task where I see wrong approaches taken. The first, unsurprisingly, is where there is a complex requirement outside a company or team’s experience. However, I have equally seen relatively simple but large, often repetitive, tasks handled ineffectively. A big task usually needs the right tools to make it manageable.

Digging holes

Perhaps the worst problem is ‘digging holes’. I have witnessed companies embark on major projects that need specialist knowledge to handle efficiently. The team tends to soldier on and succeeds in getting correct results out for a while, but as the project continues, it grows and grows to a point where it becomes almost impossible to manage. To make this scenario worse, the work that has gone into keeping the project alive for a few months becomes a nightmare to unravel. Starting again may be the only sensible decision, but that is not easy to tell a client who is expecting correct results, and expecting them now.

Prototyping is often seen as a wasted effort

One thing that is often underestimated is the benefit of prototyping. In the engineering world, it is standard practice to build prototypes, but in market research data processing, I have rarely seen prototypes built to test processes for major projects. Why not? Is market research so time-pressured that it is better to take the risk of a process failing?

Tracking studies need planning

Perhaps the most common type of mishandled project is a research tracking study. Clients often promise that the questionnaire and analysis required for a tracking study will not change – or, if it does, it will not change much. The reality is that for most projects, the questionnaire and the analysis will change more frequently than expected. This can cause problems. If different software systems are being used to collect and analyse the data, a major data management problem may also arise.

Not many software packages handle tracking studies efficiently

Few software packages are geared to handling changing questionnaires, changing data maps and different analyses over the duration of a typical tracking study. Add to this the fact that there may be data calculations, weighting, changing variables and more, all dependent on the other changes cited. Most software packages cannot handle these problems, or, if they can, it is through clumsy recoding processes that are time-consuming to carry out and highly prone to error.
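To make the recoding burden concrete, here is a minimal sketch in Python with pandas. It is an illustration only, not MRDCL, and the wave names, column names and code frames are invented: it shows the kind of mapping needed to bring each wave’s changing code list onto a single master code frame before the waves can be tabulated together.

```python
import pandas as pd

# Invented example: each wave's questionnaire uses its own code list for Q1
# (brand mentioned), so every wave must be recoded to one master code frame
# before the waves can be tabulated together.
MASTER_FRAME = {1: "Brand A", 2: "Brand B", 3: "Brand C", 99: "Other"}

# Wave-specific recodes: wave 2 renumbered the codes and added a new brand.
WAVE_RECODES = {
    "wave_01": {1: 1, 2: 2, 3: 99},        # Brand C not yet listed; code 3 was 'Other'
    "wave_02": {1: 2, 2: 1, 3: 3, 4: 99},  # codes 1 and 2 swapped; Brand C added
}

def harmonise(wave_data: pd.DataFrame, wave: str) -> pd.DataFrame:
    """Map one wave's raw Q1 codes onto the master code frame."""
    recode = WAVE_RECODES[wave]
    out = wave_data.copy()
    out["q1_master"] = out["q1"].map(recode)
    out["q1_label"] = out["q1_master"].map(MASTER_FRAME)
    return out

# Usage: harmonise each wave, then concatenate and tabulate as one data set.
wave2 = pd.DataFrame({"q1": [1, 2, 3, 4, 1]})
print(harmonise(wave2, "wave_02"))
```

The point is not the code itself but that every questionnaire change forces another entry in the recode tables; maintained by hand in a package not built for it, this is exactly the clumsy, error-prone work described above.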

A brief case study

An example of where you need the right tools

I was asked to help a client with a tracking study in the mobile phone market where the questionnaire changed monthly. Additionally, the brands, sub-brands and tariffs for each sub-brand changed most months. The project also required external data to be merged. The project ran on a monthly cycle, such that the key person on the project had to start the next month’s data processing as soon as he had finished the previous month’s.

A tracking study can go off the rails

As the project grew, it became clear that the 30 days of work per month was creeping up to 40 days, and delivery was becoming unachievable. In this case, we stored all the changing elements of the project in Excel templates so that the whole project could be automated, covering everything from code lists and tables to brands, sub-brands and tariffs. Our MRDCL data processing software could read these spreadsheets and process the survey using easy-to-understand Excel templates. After the overhaul, the task became more of a clerical task than a data processing task. More to the point, several people could easily understand the process and help.
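To give a flavour of the approach, here is a sketch in Python of the general idea. It is not MRDCL’s actual template format, and the file names, sheet names and column names are assumptions: the changing elements live in a spreadsheet that the processing script reads each month, so monthly changes become data entry rather than reprogramming.

```python
import pandas as pd

def load_month_spec(path: str) -> dict:
    """Read this month's changing elements (brands, sub-brands, tariffs)
    from an Excel workbook so the processing run needs no code changes."""
    # Assumed workbook layout: one sheet per element type.
    brands = pd.read_excel(path, sheet_name="brands")    # columns: code, label
    tariffs = pd.read_excel(path, sheet_name="tariffs")  # columns: sub_brand, tariff
    return {"brands": brands, "tariffs": tariffs}

def build_brand_table(survey: pd.DataFrame, spec: dict) -> pd.DataFrame:
    """Count respondents' chosen brand against the current month's brand list."""
    labels = dict(zip(spec["brands"]["code"], spec["brands"]["label"]))
    counts = survey["brand_code"].map(labels).value_counts()
    # Reindex so every brand in this month's template appears, even with zero mentions.
    return counts.reindex(spec["brands"]["label"], fill_value=0).to_frame("respondents")

# Usage (hypothetical file names): adding a brand next month means editing
# the spreadsheet, not the script.
spec = load_month_spec("month_spec.xlsx")
survey = pd.read_csv("survey_data.csv")
print(build_brand_table(survey, spec))
```

The design choice is the important part: once the volatile pieces are data rather than code, the monthly work can be done, and checked, by people who never touch the processing script.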

Finding solutions

Finding the right people to help

What is the lesson from this example? Where things become complex, repetitive or cumbersome, it is time to think about which input is not functioning well. Is it the staff? Is it the expertise? Is it the software? Is it the hardware, even? Or is it several of these things? In the example above, one month of restructuring by an expert meant that the project could be handled with two or three days of work per month rather than 30, and primarily by less skilled staff.

What projects are most likely to cause data processing headaches?

I’ve already mentioned tracking studies as potentially problematic, but it is hard to make a general statement about all research projects. The best I can offer is this: if you have a project outside your comfort zone or different from anything you have handled (easily) before, you should put more effort into checking your planned route. It is easy to be caught out when projects are conceptually simple but have many steps or minor complications in data processing terms; these can soon stack up into a bigger problem.

Why may MRDC Software be able to help?

First of all, let me say there are a lot of good software suppliers in the market research industry. However, the strength of most software suppliers is that their software does the more straightforward things well. Additionally, some specific features may do something else well, such as producing quick charts or summary tables. What’s different about our MRDCL software is that one of its primary purposes is to handle complex tasks efficiently. The examples in the bullet points at the top of this article are MRDCL’s raison d’être. It means that we have the knowledge and experience to handle problem projects well.

Feel that you have a problem project or projects?

If you feel that you have a problem project (or projects), ask us for help. If the problem is outside our area of expertise, we will, of course, tell you, but we are experts in most aspects of data processing and can help you get more from your budget and increase your profit margins. I always liked what a client said to me some years ago. As far as I recall, his words were, “I always knew I had a problem with , but I didn’t know how to deal with it. I also knew there must be a better way”. Since then, whenever I am out of my comfort zone and think, “There must be a better way”, but don’t know which direction to take, I seek help. Maybe we can help you.