On past and future innovation – Disruption, inequality and robots
How should we define innovation, how has it been studied in the recent past, and what does future innovation hold for the human race?
Sometimes the word innovation gets misused. Like when people use the word “technology” to mean recent gadgets and gizmos, instead of acknowledging that the term encompasses the first wheel. “Innovation” is another tricky one. Our understanding of recent thinking on innovation – as well as its contemporary partner, “disruption” – was thrown into question in June when Jill Lepore penned an article in The New Yorker that put our ideas about innovation, and specifically Clayton Christensen’s ideas about it, in a new light.

Christensen, heir apparent to fellow Harvard Business School bod Michael Porter (author of the simple, elegant and classic The Five Competitive Forces that Shape Strategy), wrote The Innovator’s Dilemma in 1997. His work on disruptive innovation – claiming that successful businesses focused too much on what they were doing well, missing what, in Lepore’s words, “an entirely untapped customer wanted” – created a cottage industry of conferences, companies and counsels committed to dealing with disruption (not least this blog, which lists disruption as one of its topics of interest).

Lepore’s article describes how, as Western society’s retelling of the past became less dominated by religion and more by science and historicism, the future became less about the fall of Man and more about the idea of progress. This thought took hold particularly during the Enlightenment. In the wake of two World Wars, though, our endless advance toward greater things seemed less obvious:
“Replacing ‘progress’ with ‘innovation’ skirts the question of whether a novelty is an improvement: the world may not be getting better and better but our devices are getting newer and newer.”
The article goes on to look at the handpicked case studies Christensen used in his book. When Christensen describes one of his areas of focus, the disk-drive industry, as being unlike any other in the history of business, Lepore rightly points out that its sui generis nature “makes it a very odd choice for an investigation designed to create a model for understanding other industries”. She spends much of the article utterly debunking several of the author’s case studies, showcasing inaccuracies and even criminal behaviour on the part of those businesses he heralded as disruptive innovators. She also deftly points out, much in line with the thinking in Taleb’s Black Swan, that failures are often forgotten, while those that succeed are grouped and promoted as formulae for success. Such is the case with Christensen’s apparently cherry-picked case studies. Writing about one company, Pathfinder, that tried to branch out into online journalism, seemingly too soon, Lepore comments,
“Had [it] been successful, it would have been greeted, retrospectively, as evidence of disruptive innovation. Instead, as one of its producers put it, ‘it’s like it never existed’… Faith in disruption is the best illustration, and the worst case, of a larger historical transformation having to do with secularization, and what happens when the invisible hand replaces the hand of God as explanation and justification.”
Such were the ramifications of the piece that, when questioned on it recently in Harvard Business Review, Christensen confessed “the choice of the word ‘disruption’ was a mistake I made twenty years ago”. The warning to businesses is that just because something is seen as ‘disruptive’ does not guarantee success, or even mean that it belongs in any long-term strategy. Developing expertise in a disparate area takes time and investment – in people, infrastructure and cash. And for some, the very act of resisting disruption is what has made them thrive. Another recent piece in HBR makes the point that most successful strategies involve not just a single act of deus ex machina thinking-outside-the-boxness, but rather sustained disruption. Though Kodak, Sony and others may have rued the days, months and years they neglected to innovate beyond their core area, the graveyard of dead businesses is surely also littered with companies that innovated too soon, in the wrong way, or at such cost that they were left open to things other than what Schumpeter termed creative destruction.
Outside of cultural and philosophical analysis of the nature and definition of innovation, some may consider of more pressing concern the news that we are soon to be looked after by, and subsequently outmanoeuvred in every way by, machines. The largest and most far-reaching (and therefore not necessarily most likely) of these concerns was recently put forward by Nick Bostrom in his new book Superintelligence: Paths, Dangers, Strategies. According to a review in The Economist, the book posits that once you accept there is nothing inherently magical about the human brain, it follows that an intelligent machine can, in principle, be built. Bostrom worries, though, that “Once intelligence is sufficiently well understood for a clever machine to be built, that machine may prove able to design a better version of itself”, and so on, ad infinitum. “The thought processes of such a machine, he argues, would be as alien to humans as human thought processes are to cockroaches. It is far from obvious that such a machine would have humanity’s best interests at heart—or, indeed, that it would care about humans at all”.
Beyond the admittedly far-off prognostications of the removal of the human race at the hands of the very things it created, machines and digital technology in general pose great risks in the near term, too. For a succinct and alarming introduction to this, watch the enlightening video at the beginning of this post. Since the McKinsey Global Institute published a paper in May soberly titled Disruptive technologies: Advances that will transform life, business, and the global economy, much editorial ink and celluloid (were either medium still in much use) has been spilled and spooled detailing how machines will slowly replace humans in the workplace. This transformation – itself a prime example of creative destruction – is already underway in the blue-collar world, where machines have replaced workers in automotive factories. The Wall Street Journal reports that Chinese electronics makers are facing pressure to automate as labour costs rise, but are challenged by the low margins, precise work and short product life of the phones and other gadgets the country produces. Travel agents and bank clerks have also been rendered largely redundant, thanks to that omnipresent machine, the Internet. Writes The Economist, “[T]eachers, researchers and writers are next. The question is whether the creation will be worth the destruction”. The McKinsey report, according to The Economist, “worries that modern technologies will widen inequality, increase social exclusion and provoke a backlash. It also speculates that public-sector institutions will be too clumsy to prepare people for this brave new world”.
Such thinking gels with an essay in the July/August edition of Foreign Affairs by Erik Brynjolfsson, Andrew McAfee and Michael Spence, titled New World Order. The authors rightly posit that in a free market the biggest premiums are reserved for the products with the most scarcity. When even niche, specialist labour, such as in the arts (see video at start of article), can be replicated and performed at scale by machines, then workers and the owners of capital alike are at great risk. The essay makes the good point that, while a simple economic model suggests technology’s impact increases overall productivity for everyone, in truth the impact is far more uneven. The authors astutely point out,
“Today, it is possible to take many important goods, services, and processes and codify them. Once codified, they can be digitized [sic], and once digitized, they can be replicated. Digital copies can be made at virtually zero cost and transmitted anywhere in the world almost instantaneously.”
Though this sounds utopian and democratic, what it actually does, the essay argues, is propel certain products to superstardom. Network effects create a winner-take-all market. Similarly, they create disproportionately successful individuals. Although there are many factors at play here, the authors readily concede, they also maintain the importance of another, distressing dynamic:
“[A] portion of the growth is linked to the greater use of information technology… When income is distributed according to a power law, most people will be below the average… Globalization and technological change may increase the wealth and economic efficiency of nations and the world at large, but they will not work to everybody’s advantage, at least in the short to medium term. Ordinary workers, in particular, will continue to bear the brunt of the changes, benefiting as consumers but not necessarily as producers. This means that without further intervention, economic inequality is likely to continue to increase, posing a variety of problems. Unequal incomes can lead to unequal opportunities, depriving nations of access to talent and undermining the social contract. Political power, meanwhile, often follows economic power, in this case undermining democracy.”
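The power-law point in that last passage is worth pausing on, because it is as much a mathematical claim as an economic one. Below is a minimal sketch in Python; the Pareto shape and scale are assumptions chosen purely for illustration, not figures from the essay. It shows why a heavy-tailed income distribution leaves most people below the ‘average’: the long right tail drags the mean far above the median.

```python
# Illustrative only: hypothetical Pareto-distributed incomes, not data from the essay.
import numpy as np

rng = np.random.default_rng(seed=42)

a, scale = 1.5, 20_000                             # assumed shape and minimum income
incomes = (rng.pareto(a, size=1_000_000) + 1) * scale

mean_income = incomes.mean()
median_income = np.median(incomes)
share_below_mean = (incomes < mean_income).mean()

print(f"mean income:   {mean_income:,.0f}")
print(f"median income: {median_income:,.0f}")
print(f"share of people below the mean: {share_below_mean:.0%}")
# With these assumed parameters roughly four in five people earn less than
# the mean, even though aggregate income keeps growing with the tail.
```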
There are those who say such fears – of rising inequality and of the destruction through automation of whole swathes of the job market – are unfounded; that many occupations require a certain intuition that cannot be replicated. Time will tell whether this intuition, like an audio recording, a health assessment or the ability to drive a car, will be similarly codified and disrupted (yes, we’ll continue using the word disrupt, for now).
The Big Data Fallacy
The latest issue of Foreign Affairs features the cover article “The Rise of Big Data” by Kenneth Cukier and Viktor Mayer-Schönberger, which mostly details some of the incredible ways companies like UPS, Google and Apple have come to rely on vast arrays of numbers in order to run their businesses better. But data has always posed a problem: it gives an assurance of certainty that tends to foster overconfidence in those relying on it. The article attempts to address this:
“[K]nowing the causes behind things is desirable. The problem is that causes are often extremely hard to figure out… Behavioural economics has shown that humans are conditioned to see causes even where none exist. So we need to be particularly on guard to prevent our cognitive biases from deluding us; sometimes, we just have to let the data speak.”
The sentiment here is admirable, and the context perceptive. But the final part of the quotation (my emphasis) wrongly assumes that data can speak objectively, that there is a fundamental ‘truth’ in a number. All too often, though, the wrong things are measured, or not all variables are measured. What data does not record, or worse, cannot record, is easily overlooked. While ostensibly data is there to help build models and predict future trends and movements, it sometimes leads to a very narrow view of one particular future, and fails to account for possibilities that, though unlikely, could prove devastating. This is what Nassim Nicholas Taleb writes about in his at times unreadable but seminal work, Black Swan. The fictional, paranoid loner Fox Mulder of the hit series The X-Files had it right fifteen years ago when he lamented that “in a universe of infinite possibilities, we may find ourselves at the mercy of anyone or anything that cannot be programmed, categorised or easily referenced”. The financial system before 2008 was a victim of such narrow thinking.
Hendrik Hertzberg, in his Talk of the Town column “Preventive Measures” in this week’s The New Yorker, made an adroit analogy between our quest to categorise and predict acts of crime and the 2002 film Minority Report. Hertzberg points out that in reality this “turns out to be a good deal more difficult than investigating such an act once it occurs”. Indeed, such prediction methods are being implemented, just with somewhat less efficacy than in the Tom Cruise movie. The stop-and-frisk procedure currently employed by the New York Police Department represents a sustained effort at preventative measures to reduce crime – effectively what Cruise and his myrmidons were doing, albeit without the help of psychic imagery as in the film. And while the psychic “Pre-Cogs” turned out to occasionally disagree, the success rate with stop-and-frisk is even less attractive. “In the final months of 2012,” writes the New York Times, only 4% of stops resulted in an arrest. But what is this low figure telling us…?
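One way to see why that figure is so slippery is a simple base-rate calculation. The numbers below are entirely hypothetical – they are not NYPD statistics – but they illustrate that when the behaviour being predicted is rare, even a reasonably discriminating ‘suspicion’ screen produces a low hit rate. On its own, then, the 4% cannot tell us whether stops deter crime or simply sweep up the innocent.

```python
# Hypothetical numbers, not NYPD data: a back-of-the-envelope Bayes calculation
# showing how a low base rate alone can produce a hit rate of around 4%.
base_rate = 0.01             # assume 1% of those stopped are carrying something arrestable
sensitivity = 0.90           # assume the 'suspicion' screen catches 90% of true positives
false_positive_rate = 0.20   # and wrongly flags 20% of everyone else

p_flagged = sensitivity * base_rate + false_positive_rate * (1 - base_rate)
p_arrest_given_stop = sensitivity * base_rate / p_flagged

print(f"Share of stops that would end in arrest: {p_arrest_given_stop:.1%}")
# ~4.3% under these assumptions – in the same ballpark as the reported figure,
# generated here by rarity rather than by any deterrent effect.
```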
Hertzberg also alludes to the dilemma of mountains of data produced without concern for oversight or management – producing more simply because it is possible, rather than thinking about the implications:
“This fall, the National Security Agency, the largest and most opaque component of the counter-terrorism behemoth, will [open] a billion-dollar facility [analysing] intercepted telecommunications… each of the Utah Data Center’s two hundred (at most) professionals will be responsible for reviewing five hundred billion terabytes of information each year, the equivalent of twenty-three million years’ worth of Blu-ray DVDs… that’s a lot of overtime.”
The other problem this data poses – and increasingly this goes for the many industries jumping on the Big Data bandwagon – is that intelligence departments and businesses alike are now technically able to put quantifiable targets and figures to what they want to achieve, without considering whether such targets are actually meaningful. Police claim the low stop-to-arrest ratio implies they are preventing crimes by stopping people before they act; there is nothing in the data to argue otherwise, but nothing to support the claim either. The New York Times article alludes to the debate over what ratio or percentage the Supreme Court would be comfortable with under the standard of “reasonable suspicion”. This leads down a dangerous path where we treat data as the answer to a question, rather than as supporting evidence for an answer.
The Leadership Legend – CEOs, David Petraeus & Marie Antoinette
Zeitgeist has found himself leading projects several times over the past year. The prospect can be a challenging one, and the received wisdom is that looking to the past can help shed light on the future. Looking at both recent and ancient history, however, says one thing more than anything else: leaders are victims of circumstance. Any strategy must adapt to context.
As a 20-something Londoner with money to burn, Zeitgeist naturally found himself on Saturday night sitting at home, reading The New Yorker. Dexter Filkins’ fascinating review of recent biographies of David Petraeus – former CIA director and the man responsible for executing the ‘surge’ in Iraq and Afghanistan – painted an interesting portrait of what leadership is about. Petraeus recognised that the practice, in the early days of the Iraq war, of rounding up countless civilians in order to ferret out insurgents was neither efficient nor especially effective. As Filkins points out, “I witnessed several such roundups, and could only conclude that whichever of these men did not support the uprising when the raids began would almost certainly support it by the time the raids were over”. Leadership in this case, then, came in the ability to spot a deficiency and then to address it with a better solution. Petraeus, who liked to say that “money is ammunition”, focused on the civilians the military wanted to protect, rather than the enemy it wanted to kill – a radical notion in the military at the time. True leadership narratives are often riddled with anecdotes of maverick behaviour of this kind. The fallacy – and this is one of Taleb’s main points in his book on uncertainty, Black Swan – is that the stories of those whose maverick ideas did not work out rarely make for interesting books or films. Few songs will be written about those guys.
Just as Petraeus was able to leverage the time in which he happened to be serving – spotting something he perceived to be at fault and having the opportunity to amend it – so there is an element of luck involved too. “I have plenty of clever generals,” Napoleon once said, “just give me a lucky one.” Petraeus’ luck began with being around at the right time to see how things could be different. It continued when he managed to shepherd his idea for the ‘surge’ to fruition. While at the time the idea of deploying an extra 25,000 soldiers to Iraq was greeted with mixed reactions, to say the least, it can certainly be said to have paid off in large part. It was another example of a maverick move that panned out well. However, as Filkins points out, the timing of it all was what made it such a success. The Awakening, a phrase given to the Sunni-orchestrated truces with US troops that began before the surge, was instrumental. Filkins writes, “Could the surge have worked without the Awakening? Almost certainly not”. The Awakening most assuredly featured tactically in the execution of the surge, but you can be sure it was never part of the strategy. Perhaps it was the failure to notice this, and the attractiveness of the holistic narrative – another fallacy Taleb notes in his book – that led to a surge being attempted, with far less success, in Afghanistan. What works in one place at one time might not work again.
Zeitgeist is also currently wading through Antonia Fraser’s biography of Marie Antoinette. It is quite extraordinary to note how often the autocratic aristocracy were victims of circumstance, rather than able to dictate their own fate through their own policies and leadership. In the long term, though greeted with warmth at the start of her reign, Marie Antoinette was always treated with a modicum of suspicion by the people of France: she hailed from Austria, a country with which political relations were lukewarm and whose culture left many an ordinary Frenchman cold. It was long-gestating prejudices such as these that helped blacken the Queen’s name; the phrase ‘Let them eat cake’ had been ascribed to various monarchs going back over a century before Marie Antoinette came to the throne. In the medium term, the support France provided in the American War of Independence was pivotal. The Treasury spent an enormous amount of money funding the war, which was seen as a proxy battle with England; this action alone nearly bankrupted the country. But, away from finances, there was an ideological lens to consider as well. Landed gentry like Lafayette, who left nobly at the King’s command to support the war, returned not only as lauded heroes, but as heroes who had been fighting alongside a people who yearned to be free of an oppressive, royalist regime. Such thinking proved infectious, and was not forgotten when men like Lafayette returned home. Finally, in the short term, a ruinous spell of weather stunted harvests, creating mass famine across the country in the lead-up to the revolution. All these things could be managed by the royal family to an extent, but the origins of such pressures were truthfully out of their hands.
CEOs today are seen as less wizard-like than they were five or ten years ago, when moguls, particularly in the media industry, bestrode the globe, acquiring companies at their whim and creating ‘synergy’ where none really existed in the first place (think AOL Time Warner). The paradigm shift, of course, has been the global recession that few – including many a lauded business leader – foresaw. Confidence in such people has been shaken. What these histories tell us about how to handle leadership can be summarised as follows:
1. Know your environment. Externalities and trends are likely to influence your business, and not always in obvious ways.
2. Be mindful of context. What works somewhere might not work in the same way again elsewhere.
3. Appearance, rightly or wrongly, counts for a great deal.
4. When you choose to do something can sometimes be as important as the thing itself.
5. A small amount of luck can go a long way.
On the danger of trends
Around this time of year, many companies, journalists and soothsayers become prone to taking educated guesses at trends in the coming year(s). Zeitgeist itself is guilty of such crystal ball gazing, writing on retail trends for Design Week at the beginning of this year.
Valuable as some of these insights are, we must never forget about the externalities that inevitably push certain trends off course (look what the recession did to the popularity and importance of organic food). Though it is written in an annoying manner, Black Swan and its ideas are not to be ignored. Few would have suspected that this year would see the demise of Gaddafi, Bin Laden and Kim Jong-Il, but so it happened. In the 1932 film Shanghai Express, the American traveller Sam asks a question that reveals at once a prideful lack of foresight and an ephemeral resolution. Traits of a nation, perhaps. But it demonstrates the dangers of making judgements about the future based on current trends, and of presuming the status quo will remain just that.
“What future is there being a Chinaman? You’re born, you eat your way through a handful of rice, and you die. What a country. Let’s have a drink!”
Studying the Studio Brand
Apple’s powerful brand helps it sell its various computer devices to customers. Nike’s brand is similarly beneficial as far as sports shoes and apparel are concerned. Both brands have a certain ideology; cues that tip off the buyer as to what to expect from the product, or what kind of ethos they are buying into. Not so for the likes of 20th Century Fox. With the 3D cash cow already struggling, what can film studios do to leverage affinity for their product?
In the early days of filmmaking, the branding of a studio was more prevalent. The audience, in turn, might have had more affinity for one studio over another – which may be what the choice comes down to when presented with two similar products (two asteroid films, for example), just as Nike must compete with other sneaker manufacturers. 20th Century Fox has a diverse stable of films, from Star Wars, Titanic and Avatar to Black Swan and The Wrestler. The latter two were produced by a subsidiary studio, Fox Searchlight, which focuses on smaller art films. With such a narrow remit, it would be interesting to see the studio build its brand with consumers more, encouraging affinity. Could other subsidiary studios be created, in name only, to signal different genres and expectations to movie-going customers? They could take a leaf from Marvel’s marketing handbook, which, saddled with the singularly uninspiring film Captain America, used the equity of the studio’s previous films to raise the assumed level of quality: “From the Studio that brought you…”. It’s a tactic that could be used far more often, by a greater number of studios.
On demographics, devices and ‘Downton Abbey’
“Keynesian paradigm shift” was a term Zeitgeist was introduced to back in those glorious days of university. We’re often on the lookout for that next shift. 2003 was when Zeitgeist first began to take blogs seriously, as your average Iraqi citizen started writing journals online that gave more of an insight into the invasion than any “embedded” Fox News reporter. Incidentally, anyone looking to know more about the way news was covered by those reporters under the care of the US military at the time should check out the fascinating documentary “Control Room”.
There has been much discussion of further paradigm shifts over the last couple of years, as PVR/DVR devices like TiVo and Sky+ have set various network TV honchos and advertising execs fretting about the lessened impact of advertising caused by delayed viewing. Advertisements on television are scheduled at a particular time to appeal to a very particular audience, and may be ephemeral in nature (e.g. for an upcoming event or film). Having viewers watch the commercial at a later time is problematic, as it could be – in the advertiser’s eyes – too late. But having the viewer fast-forward through the commercial break altogether is disastrous. Simply put, companies won’t pay to have an ad on TV if no one is going to watch it. This is especially relevant to shows with covetable demographics, i.e. those watched by the financially comfortable, who ironically are more likely to have purchased a device that makes those advertisements skippable.
However, recent news should cheer those whose job it is to worry about such matters. In the first place, as the world economy stutters into recovery, advertisers are funnelling money back into mainstream media, particularly television, as we reported last October. Moreover, as Variety recently noted, the feeling of watching a show as it is broadcast “live” is a special one. This has long held true for sporting events and the Oscars, but increasingly it applies to popular sitcoms and dramas, too. Shows like ITV’s recent Downton Abbey revealed that people made a point of watching the broadcast live so that they could engage more in the online conversations taking place on social networks like Facebook. UK TV ratings are now at their highest since records began.
This brings up two points: one of cultural philosophy, the other of policy. In the mid-1930s, Walter Benjamin wrote a seminal piece of work known as “The Work of Art in the Age of Mechanical Reproduction”. The crux of this paper rested on the idea that there is something intangible and special about seeing the genuine artefact; beholding the original Mona Lisa in the Denon wing of the Louvre is a more special experience than looking at it on a postcard. There is an “aura” to it. If we extrapolate this to the world of film and television, that aura is now fed by social media chatter amongst friends.
From a policy point of view, an argument Zeitgeist has mentioned before bears noting: technological determinism versus social constructivism – the debate over whether what a technology is intended for necessarily dictates how it is used and how it influences user behaviour. With a heightened demand for the live experience, this is evidently not the case with PVRs. Recent studies show that people are fast-forwarding through commercials less and less and, as mentioned, gravitating toward the live experience more and more. A savvy person might ask how we can mesh these two worlds together. Zeitgeist wouldn’t be surprised to see, in the near future, programme recommendations appearing on your PVR from friends you are connected to over social networks.
Downton Abbey is worth noting again. In the past, when we thought of wildly successful shows and films, the latest teen sensation might have come to mind. And while Twilight and Justin Bieber do occupy a significant part of the current Zeitgeist, shows like Downton Abbey illustrate that there is another audience – a rapidly growing one – that is only just beginning to appear on the radar of media executives. As The Economist recently pointed out, the baby boomer generation has relatively high spending power, and buys a relatively large quantity of media like CDs. And while, according to Variety, movie studios plan to release some 27 prequels or sequels this year, there are signs of hope too. The King’s Speech came very close to not getting made after debacles with funding, and Black Swan had a similarly bumpy road to production; Variety says it “kept losing its funding until the day before principal photography”. These were two of the greatest (and most mature) films of last year according to Zeitgeist, with the former winning both Best Picture and Best Director. Black Swan has grossed more than $100m in the US; the only other film from Fox studios to do the same was the latest Narnia incarnation, which must have cost north of $200m to make once marketing is included. Films about royalty and ballet appeal to a more superannuated audience, and are, perhaps not coincidentally, the ones with the highest profit margins. The much-coveted 18-49 demographic is an anachronism; let’s think bigger (and older).
Neuromarketing Explained
“The trouble with market research is that people don’t think how they feel, they don’t say what they think, and they don’t do what they say”
– David Ogilvy
On Wednesday night, part of the Zeitgeist entity found itself at a Holiday Inn. No, it was not part of a dare. The Account Planning Group (APG) had chosen this venue in central London to host a conference on neuroscience, with specific reference to its application in marketing. Neuromarketing involves using tools, tasks and tests from the realm of cognitive psychology and neuroscience to measure non-conscious reactions in the brain to marketing stimuli. The use of the above image was made all the more appropriate given that the organiser of these events goes by the name of Steve Martin (I kid you not). AdAge recently featured a pretty good article on the subject.
Our host for the evening was Gemma Calvert. As Warwick University notes, “In 1997, Professor Calvert established the world’s first neuromarketing consultancy, Neurosense Limited, which has undertaken numerous fMRI studies for clients in the advertising, marketing and pharmaceutical industries. The company’s clients include Unilever, Viacom Brand Solutions, GMTV, Omnicom, Quest International and McDonald’s Europe. This expertise has formed the basis for the establishment of a dedicated academic group at WDL which aims to help marketers and manufacturers understand how the brain responds to products/fragrances, brand extensions, packaging design and marketing messages.”
Ms Calvert began by talking of Descartes, a forefather of the Enlightenment, who espoused philosophies on the inherent superiority of human beings to primates because we possessed the rational mind. But then, more recently, in the 90s, some dude called Damasio came along, claiming that we are at heart (or rather, in brain) emotional beings ruled by emotive impulses. This theory, it turns out, is closer to the truth. As our brains have expanded through evolution, the limbic brain has come to sit over the reptilian one, with the neocortex resting on top. The cortex makes our rational decisions for us, while the more base parts of the brain handle the instinctive “fight or flight”, “must have sex now” stuff. Unfortunately for champions of the rational brain, however, the cortex makes no decision without subconsciously consulting the limbic system. Our brain is unable to tune into all the information available to it, so sometimes we block out things we deem extraneous. This can lead to unexpected dangers down the road (see the global recession, and the premise of Black Swan). It is well illustrated by the following video:
Remarkably, we even rationalise post hoc, telling ourselves something we know is not true and forcing our rational mind to accept it. There is an excellent article here on the subject of confabulation (link updated 2014). Zeitgeist watched this video last night at the conference and did not believe that there had been a gorilla in the video the first time it was shown. Watching the YouTube video this morning, he now believes this is a different video, one that does contain bears in both clips. It is very unlikely that he is correct. Ms Calvert also highlighted the fallibility of focus groups, as evinced by the great Mr Ogilvy at the beginning of this article. One of her more whimsical comments came after her statement that 97% of new products fail in Japan within the first 12 months (there are specific reasons for problems in that market) – this despite months of testing, focus groups and general consumer research. Ms Calvert’s opinion was that you would be just as well off – statistically speaking, better off – flipping a coin, since at least then you would have a 50-50 chance. Neuromarketing, on the other hand, can give you an insight into how consumers actually feel, rather than merely what they are telling you. The research is applied through eye tracking, fMRI scans and EEG. fMRI involves the rather unnatural state of lying down surrounded by a gigantic magnet; wearing fibre-optic glasses, the subject can be shown pictures or movies, or even be given a joystick to go on a virtual shopping trip. It can be used to study how a 30-second spot affects the brain holistically. EEG, on the other hand, can be used to examine how someone feels about something on a second-by-second basis, producing a positive-or-negative timeline.
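For the curious, here is a rough idea of how such a second-by-second timeline might be computed. This is a minimal sketch only, using frontal alpha asymmetry – a commonly cited proxy in the EEG literature for approach versus withdrawal. It is an assumption on our part, not a description of Neurosense’s actual method; the sampling rate, band limits and the synthetic demo signals are all illustrative.

```python
# Sketch: a per-second "positive or negative" EEG timeline via frontal alpha
# asymmetry (right-minus-left log alpha power). Assumed method, not Neurosense's.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate in Hz

def alpha_power(window, fs=FS, band=(8.0, 13.0)):
    """Average spectral power in the alpha band for a one-second window."""
    freqs, psd = welch(window, fs=fs, nperseg=len(window))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def valence_timeline(left_frontal, right_frontal, fs=FS):
    """One asymmetry score per second: >0 leans positive/approach, <0 negative."""
    n_seconds = len(left_frontal) // fs
    scores = []
    for s in range(n_seconds):
        seg = slice(s * fs, (s + 1) * fs)
        scores.append(np.log(alpha_power(right_frontal[seg]))
                      - np.log(alpha_power(left_frontal[seg])))
    return np.array(scores)

# Demo with synthetic signals standing in for 30 seconds of two frontal channels
t = np.arange(30 * FS) / FS
left = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(len(t))
right = 0.7 * np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(len(t))
print(valence_timeline(left, right)[:5])
```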
Ms Calvert also spoke briefly on behavioural economics, which Zeitgeist has covered previously. Contrary to classical economics, it argues that we are not inherently rational beings making purely rational decisions; according to Ms Calvert, it is a methodology that aims to effect large-scale population change. Thinkbox has the pleasure of hosting none other than Ogilvy’s very own Rory Sutherland on the subject on its website, video of which can be found here. These methodologies can help validate and measure effectiveness; they can help divine brand empathy, loyalty, liking and recognition. The findings were most interesting in cases where consumers were effectively lying to themselves. When Dove tested whether it should enter the household-cleaning market, neuromarketing revealed that consumers were very turned off by the notion, their brains showing strong signs of indifference and even disgust. In focus groups, though, they told researchers they would be quite happy to consider buying such a product. Brain imaging better predicts intended purchases than what consumers actually tell researchers. How to reconcile this contrast? Perhaps the fact that fully 85% of consumer behaviour is driven by non-conscious processes is part of it; we are not even aware of most of the decisions we make. Now neuroscientists are. Sounds like a movie I saw this summer…