An awful lot has been written about big data and how it might affect marketers. Big data is the casual term applied to the sort of data now available as a byproduct of the increasingly connected world around us. It is characterised by three attributes: volume, velocity and variety. But, more than that, it is a subjective term.
It is data so much larger than the traditional stock-in-trade that it simply isn't manageable within the current environment. Just as transaction-level data revolutionised the analysis of purchases, the movement towards interaction and event data has created a data management challenge in which the largest table in a marketing database can be several times larger than the customer table.
It's easy, on a technical level, to believe that this is a step change that could transform your business, and perhaps it can. But it can also sink your business into a mire of data overload if you don't have a manageable solution.
Whether it’s Facebook’s targeting methodology or some as yet unseen mash-up produced from a big data combination, the belief is that data is now truly driving change across corporations. It is one of the biggest shifts in industry thought of late.
Consumer insight has always been crucial for driving creative ideas and has been at the heart of planning since Stanley Pollitt invented the concept; it remains at the core of all good marketing. Yet, the step change now becoming possible, through the integration of the strands of data that criss-cross our everyday lives, means that we need a far more complex toolbox to drive smarter implementation. This is no easy problem to manage.
In 2012, insight gathered from mechanised data sources can deliver significant advances in segmentation and targeting. This is the result of vast volumes of data being accumulated from multiple touchpoints, including traditional channels, online and mobile apps, retail loyalty schemes and external datasets.
It’s only advanced thinking in data manipulation and analysis that allows us to efficiently harness this data and provide fully relevant communications at the right time and right place.
In addition to the significant technical and analytical challenges involved in managing data and analysis resources, there is the not insignificant issue of cost-benefit analysis. It's not enough to have the capability; you need to check it's actually worthwhile.
There has been a tendency in recent times to pile investment into areas of activity based on belief alone. While there's nothing wrong with following a hunch – inspiration has many origins – there is a point where objective analysis is needed.
For instance, how many brands genuinely understand the value of a “like” on Facebook or the scalability of that model? If the management of large-scale data is to become a key part of understanding your customer base, it needs to start off on the right path.
However, just as the financial justification for this sort of data project needs to be carefully investigated, you also need to take heed of the less tangible aspects. Crucially, marketers need to consider how customers feel about their data being used in this manner – will they notice or will it be used to subtly target them with relevant messaging?
This is the real crux: consumers are less likely to reject relevant communications. They have rejected, and will continue to reject, poor targeting, crass personalisation, intrusive ads and perceived misuses of permission.
Permission may have a legal dimension, grounded in what was said when certain pieces of data were captured, but in reality it's entirely subjective. If the consumer is surprised by exactly how a brand uses their personal data, it doesn't matter which boxes were ticked. Compliance may keep you out of trouble with the Information Commissioner, but it won't keep you in business.
The future of managing big data, whatever your context, relies upon tools, skills, objectives and customer value. Ultimately, it is no different from normal marketing data, just bigger.
Charles Ping is strategy director of data intelligence at Communisis