Let’s not measure ourselves into obscurity
Posted
30.10.2017
Author
Simon Bird

An interesting headline
Over the last few years in advertising, media and marketing, it has felt as though something of a civil war has been going on.
That sounds a bit dramatic, but there has been a fairly constant battle between art and science, creative and big data, digital and traditional, mass and targeted, salience and optimisation, emotion and information.
Generalising a little, there’s basically a ‘pro-digital side’ believing mass advertising is inefficient and full of wastage, and that marketing today is all about right message/right place/right time, or direct marketing (DM) at scale. And there’s an ‘anti-digital side’ believing digital advertising is guilty of spurious numbers and spurious models of marketing, and is the online version of the tactical retail stuff from your letterbox that you put straight in the bin.
There hasn’t been much in the middle, not least because the middle ground doesn’t tend to make good headlines.
Both sides of the argument have some valid points. However, one side is winning this war. One side has the lion’s share of the press articles, the lion’s share of new media spend and all of the big technology companies on its side. It has all the modern language too: real-time optimisation, dynamic trading, always on, programmatic trading etc.
Many people will say this is a good thing and that the industry is modernising: ‘Advertising 2.0’ or suchlike. Of course, some level of change is good and completely necessary to help us exploit the new digital ecosystems. But big data may sound a lot more precise than it really is, and, for the good of our industry, this marketing civil war may be one we cannot afford for either side to win.
A Harvard Business Review article from late last year helps highlight the issues that will likely occur if we continue to keep biasing the ‘real-time measurement and optimisation’ side over the other.
The article starts by pointing out the staggering fact that public companies, which are typically those with all the advantages of scale, are disappearing faster than ever. The current rate is six times faster than only 40 years ago and listed companies now have a one in three chance of being delisted in the ensuing five years.
It then goes on to explain how some of this precariousness is down to three fundamental changes in the business environment: the speed of technological change, a much more interconnected world, and a business environment that is more diverse than ever. All of which is pretty understandable; however, the next part of the article explains how companies are inadvertently magnifying the effects of these new forces. That is, companies are actually complicit in creating an environment that makes them more likely to fail, not less.

An interesting sub headline
The core point is that as the business environment gets more and more complex it also gets less predictable; yet, the corporate structures and processes, designed for more stable and predictable times, are preventing companies from adapting to the complexity of their environment. As the system gets more complex the negative effects of this get larger.
Towards the end of the article, they suggest some solutions: firms being realistic about what they can really predict and control, being more prepared for unpredictable events, looking more at events outside the firm or category, and being less reliant on statistical models.
An article in the Wall Street Journal, also last year, mirrors the general sentiment of the HBR piece.
They’re both largely based on the work of a theoretical physicist called Geoffrey West. It sounds a little odd for a scientist to be writing so insightfully about corporate survival but it turns out businesses are ‘complex adaptive systems’ of which the sciences have many – cities, bacteria, evolution and economies to name just a few.

Another statement designed to get people thinking
Unlike companies which are getting less stable, most of the above examples become more stable as they adapt. When West was asked why cities are so stable and companies so fragile (despite both being full of people whose behaviour is typically tricky to predict) he answered simply: “It’s easy, cities tolerate crazy people and companies don’t.” In essence, the central point of both the HBR and WSJ articles – an inability to understand how best to operate with unpredictability (crazy people) is one of the key reasons companies are becoming increasingly less stable.
The above is obviously an extremely abridged version of what is a long and complicated field of study; however, these points provide clear warnings for our own industry and perhaps how we should and shouldn’t behave moving forward.
Marketing (and advertising/media) exists in the aforementioned complex adaptive business ecosystem. One could argue ours is more complex still, due to the even stronger influence of new technology in our industry and the constant stream of new media channels that our target customers are using and adapting to.
Yet much of our industry is currently obsessed with looking for more and more precision from models. Are we really sure our models are as good as we think? Perhaps we are also guilty of thinking we can measure more accurately than we should.
We are not the only field that has grappled with the philosophy of how much to measure, and how precisely. Physics went through this around 100 years ago, when it moved on from the precision of Newtonian physics into the far less precise field of quantum physics.

A summation of sorts
Indeed, its most famous principle is aptly named the Heisenberg uncertainty principle. It boldly states that we can know a particle’s position or its momentum precisely, but not both at once: the more precisely one is measured, the less precisely the other can be known. Quantum physics’ lack of precision has by no means made the field any less important or scientific. Many would say quite the opposite is true.
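For the mathematically inclined, the principle can be written as a simple inequality (standard textbook notation, not from the articles discussed here): the uncertainties in position and momentum trade off against each other, with a hard floor on their product.

```latex
% Heisenberg uncertainty principle:
% \Delta x = uncertainty in position, \Delta p = uncertainty in momentum,
% \hbar = reduced Planck constant.
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
```

In other words, squeezing $\Delta x$ towards zero forces $\Delta p$ to grow without bound, and vice versa. No amount of better instrumentation gets around it, which is the point of the analogy: some limits on precision are structural, not a failure of effort.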
Economics has been battling with this area too. Its models have become more and more sophisticated over the last 50 years, to the point where half the syllabus of most economics degrees is statistics and only ten to 15 percent is conceptual thinking and real-world examples. Despite the models using extremely advanced mathematics, they have regularly failed to predict economic crises; indeed, those models were complicit in exacerbating the global financial crisis, thanks to so much misplaced confidence in what they could measure and predict. Economists would do well to remember John Maynard Keynes’ description of economics as “the science of thinking in terms of models joined to the art of choosing models that are relevant to the real world”.