On past and future innovation – Disruption, inequality and robots
How to define innovation, how has it been studied in the recent past, and what does future innovation hold for the human race?
Sometimes the word innovation gets misused. Like when people use the word “technology” to mean recent gadgets and gizmos, instead of acknowledging that the term encompasses the first wheel. “Innovation” is another tricky one. Our understanding of recent thinking on innovation – as well as its contemporary partner, “disruption” – was thrown into question in June when Jill Lepore penned an article in The New Yorker that put our ideas about innovation, and specifically Clayton Christensen’s ideas about innovation, in a new light. Christensen, heir apparent to fellow Harvard Business School bod Michael Porter (author of the simple, elegant and classic The Five Competitive Forces that Shape Strategy), wrote The Innovator’s Dilemma in 1997. His work on disruptive innovation, claiming that successful businesses focused too much on what they were doing well, missing what, in Lepore’s words, “an entirely untapped customer wanted”, created a cottage industry of conferences, companies and counsels committed to dealing with disruption (not least this blog, which lists disruption as one of its topics of interest).

Lepore’s article describes how, as Western society’s retelling of the past became less dominated by religion and more by science and historicism, the future became less about the fall of Man and more about the idea of progress. This thought took hold particularly during The Enlightenment. In the wake of two World Wars, though, our endless advance toward greater things seemed less obvious;
“Replacing ‘progress’ with ‘innovation’ skirts the question of whether a novelty is an improvement: the world may not be getting better and better but our devices are getting newer and newer”
The article goes on to look at the case studies Christensen handpicked for his book. When Christensen describes one of his areas of focus, the disk-drive industry, as being unlike any other in the history of business, Lepore rightly points out that its sui generis nature “makes it a very odd choice for an investigation designed to create a model for understanding other industries”. She goes on for much of the article to utterly debunk several of the author’s case studies, showcasing inaccuracies and even criminal behaviour on the part of those businesses he heralded as disruptive innovators. She also deftly points out, much in line with the thinking in Taleb’s Black Swan, that failures are often forgotten, while those that succeed are grouped and promoted as formulae for success. Such is the case with Christensen’s apparently cherry-picked case studies. Writing about one company, Pathfinder, which tried to branch out into online journalism, seemingly too soon, Lepore comments,
“Had [it] been successful, it would have been greeted, retrospectively, as evidence of disruptive innovation. Instead, as one of its producers put it, ‘it’s like it never existed’… Faith in disruption is the best illustration, and the worst case, of a larger historical transformation having to do with secularization, and what happens when the invisible hand replaces the hand of God as explanation and justification.”
Such were the ramifications of the piece that, when questioned on it recently in Harvard Business Review, Christensen confessed “the choice of the word ‘disruption’ was a mistake I made twenty years ago”. The warning to businesses is that just because something is seen as ‘disruptive’ does not guarantee success, nor does it mean it belongs in any long-term strategy. Developing expertise in a disparate area takes time and investment, in terms of people, infrastructure and cash. And for some, the very act of resisting disruption is what has made them thrive. Another recent piece in HBR makes the point that most successful strategies involve not a single act of deus ex machina thinking-outside-the-boxness, but rather sustained disruption. Though Kodak, Sony and others may have rued the days, months and years they neglected to innovate beyond their core area, the graveyard of dead businesses is also surely littered with companies that innovated too soon, in the wrong way, or at such cost that they were left open to things other than what Schumpeter termed creative destruction.
Outside of cultural and philosophical analysis of the nature and definition of innovation, some may consider of more pressing concern the news that we are soon to be looked after by, and subsequently outmanoeuvred in every way by, machines. The largest and most forward-thinking (and therefore not necessarily most likely) of these concerns was recently put forward by Nick Bostrom in his new book Superintelligence: Paths, Dangers, Strategies. According to a review in The Economist, the book posits that once you accept there is nothing inherently magical about the human brain, it follows that an intelligent machine can, in principle, be built. Bostrom worries, though, that “Once intelligence is sufficiently well understood for a clever machine to be built, that machine may prove able to design a better version of itself”, and so on, ad infinitum. “The thought processes of such a machine, he argues, would be as alien to humans as human thought processes are to cockroaches. It is far from obvious that such a machine would have humanity’s best interests at heart—or, indeed, that it would care about humans at all”.
Beyond the admittedly far-off prognostications of the removal of the human race at the hands of the very things it created, machines and digital technology in general pose great risks in the near term, too. For a succinct and alarming introduction to this, watch the enlightening video at the beginning of this post. Since the McKinsey Global Institute published a paper in May soberly titled Disruptive technologies: Advances that will transform life, business, and the global economy, much editorial ink and celluloid (were either medium still in much use) has been spilled and spooled detailing how machines will slowly replace humans in the workplace. This transformation – itself a prime example of creative destruction – is already underway in the blue-collar world, where machines have replaced workers in automotive factories. The Wall Street Journal reports that Chinese electronics makers are facing pressure to automate as labour costs rise, but are challenged by the low margins, precise work and short product life of the phones and other gadgets the country produces. Travel agents and bank clerks have also been rendered all but obsolete, thanks to that omnipresent machine, the Internet. Writes The Economist, “[T]eachers, researchers and writers are next. The question is whether the creation will be worth the destruction”. The McKinsey report, according to The Economist, “worries that modern technologies will widen inequality, increase social exclusion and provoke a backlash. It also speculates that public-sector institutions will be too clumsy to prepare people for this brave new world”.
Such thinking gels with an essay in the July/August edition of Foreign Affairs, by Erik Brynjolfsson, Andrew McAfee and Michael Spence, titled New World Order. The authors rightly posit that in a free market the biggest premiums are reserved for the products with the most scarcity. When even niche, specialist work, such as in the arts (see the video at the start of this article), can be replicated and performed at economies of scale by machines, then labourers and the owners of capital alike are at great risk. The essay makes the good point that, while a simple economic model suggests technology’s impact is to increase overall productivity for everyone, the truth is that its impact is far more uneven. The authors astutely point out,
“Today, it is possible to take many important goods, services, and processes and codify them. Once codified, they can be digitized [sic], and once digitized, they can be replicated. Digital copies can be made at virtually zero cost and transmitted anywhere in the world almost instantaneously.”
Though this sounds utopian and democratic, what it actually does, the essay argues, is propel certain products to super-stardom. Network effects create this winner-take-all market. Similarly, it creates disproportionately successful individuals. Although there are many factors at play here, as the authors readily concede, they also maintain the importance of another, more distressing theory:
“[A] portion of the growth is linked to the greater use of information technology… When income is distributed according to a power law, most people will be below the average… Globalization and technological change may increase the wealth and economic efficiency of nations and the world at large, but they will not work to everybody’s advantage, at least in the short to medium term. Ordinary workers, in particular, will continue to bear the brunt of the changes, benefiting as consumers but not necessarily as producers. This means that without further intervention, economic inequality is likely to continue to increase, posing a variety of problems. Unequal incomes can lead to unequal opportunities, depriving nations of access to talent and undermining the social contract. Political power, meanwhile, often follows economic power, in this case undermining democracy.”
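To make the quoted power-law claim concrete, here is a minimal sketch (not drawn from the essay; the Pareto distribution and its parameters are assumptions chosen purely for illustration) that simulates incomes from a heavy-tailed distribution and checks what share of people end up earning less than the mean.

```python
import numpy as np

# Illustrative only: incomes drawn from a Pareto distribution, a classic
# power-law model, to show that most people fall below the average income.
rng = np.random.default_rng(42)

alpha = 2.0        # assumed tail exponent
scale = 20_000     # assumed minimum income, arbitrary currency units
incomes = scale * (1 + rng.pareto(alpha, size=1_000_000))

mean_income = incomes.mean()
median_income = np.median(incomes)
share_below_mean = (incomes < mean_income).mean()

print(f"Mean income:   {mean_income:,.0f}")
print(f"Median income: {median_income:,.0f}")
print(f"Share earning less than the mean: {share_below_mean:.0%}")
# With a heavy right tail, the mean sits well above the median, so a clear
# majority of people (roughly three-quarters here) earn less than the average.
```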
There are those who say such fears of a rise in inequality, and of the destruction through automation of whole swathes of the job market, are unfounded, and that many occupations require a certain intuition that cannot be replicated. Time will tell whether this intuition, like an audio recording, a health assessment or the ability to drive a car, will be similarly codified and disrupted (yes, we’ll continue using the word disrupt, for now).
The Big Data Fallacy
The latest issue of Foreign Affairs features the cover article “The Rise of Big Data” by Kenneth Cukier and Viktor Mayer-Schönberger, which mostly details some of the incredible ways companies like UPS, Google and Apple have come to rely on vast arrays of numbers in order to run their businesses better. But data has always posed a problem: it gives an assurance of certainty that has a propensity to foster overconfidence in those relying on it. The article attempts to address this:
“[K]nowing the causes behind things is desirable. The problem is that causes are often extremely hard to figure out… Behavioural economics has shown that humans are conditioned to see causes even where none exist. So we need to be particularly on guard to prevent our cognitive biases from deluding us; sometimes, we just have to let the data speak.”
The sentiment here is admirable, and the context perceptive. But the final part of the quotation (my emphasis) wrongly assumes that data can speak objectively, that there is a fundamental ‘truth’ in a number. All too often, though, the wrong things are measured, or not all variables are measured. What data does not record, or worse, cannot record, can often be overlooked. While ostensibly data is there to assist with building models and predicting future trends and movements, it sometimes leads to a very narrow view of one particular future, and fails to account for possibilities that, though unlikely, could prove devastating. This is what Nassim Nicholas Taleb writes about in his at times unreadable but seminal work, Black Swan. The fictional, paranoid loner Fox Mulder of the hit series The X-Files had it right fifteen years ago when he lamented, “in a universe of infinite possibilities, we may find ourselves at the mercy of anyone or anything that cannot be programmed, categorised or easily referenced”. The financial system before 2008 was a victim of such narrow thinking.
Hendrik Hertzberg, in his Talk of the Town column “Preventive Measures” in this week’s The New Yorker, draws an adroit analogy between our quest to categorise and predict acts of crime and the 2002 film Minority Report. Hertzberg points out that in reality this “turns out to be a good deal more difficult than investigating such an act once it occurs”. Indeed, such prediction methods are being implemented, just with somewhat less efficacy than in the Tom Cruise movie. The stop-and-frisk procedure currently employed by the New York Police Department points to a sustained effort at preventative measures to reduce crime, effectively what Cruise and his myrmidons were doing, albeit without the help of psychic imagery as in the film. While the psychic “Pre-Cogs” turned out occasionally to disagree, the success rate of stop-and-frisk is even less attractive. “In the final months of 2012”, writes the New York Times, only 4% of stops resulted in an arrest. But what is this low figure telling us…?
Hertzberg also alludes to the dilemma of mountains of data produced without concern for oversight or management, of producing more just because it is possible rather than thinking about the implications:
“This fall, the National Security Agency, the largest and most opaque component of the counter-terrorism behemoth, will [open] a billion-dollar facility [analysing] intercepted telecommunications… each of the Utah Data Center’s two hundred (at most) professionals will be responsible for reviewing five hundred billion terabytes of information each year, the equivalent of twenty-three million years’ worth of Blu-ray DVDs… that’s a lot of overtime.”
The other problem this data poses – and increasingly this goes for the many industries jumping on the Big Data bandwagon – is that intelligence departments and businesses alike are now technically able to put quantifiable targets and figures to what they want to achieve, without considering whether such targets are actually applicable. Police claim the low stop-to-arrest ratio implies that they are preventing crimes by stopping people before they act; the data offers nothing to argue otherwise. The New York Times article alludes to the debate over what ratio or percentage the Supreme Court would be comfortable with under the tenet of “reasonable suspicion”. This leads down a dangerous path where we treat data as the answer to a question, rather than as supporting evidence for an answer.
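As a purely illustrative sketch (the numbers below are invented, not taken from the NYPD figures), the same 4% hit rate is consistent with very different underlying stories, which is exactly why the figure cannot serve as an answer on its own:

```python
# Entirely hypothetical numbers: two different stories that both produce
# the same headline arrest rate, showing why the figure alone settles little.

def arrest_rate(stops: int, offenders_stopped: int, detection_prob: float) -> float:
    """Share of stops ending in arrest, given how many of those stopped were
    actually committing an offence and how often a frisk detects it."""
    return (offenders_stopped * detection_prob) / stops

STOPS = 100_000  # hypothetical annual number of stops

# Story A: stops are well targeted (8% of those stopped are offenders),
# but only half of offences are detectable during a frisk.
print(f"Well-targeted stops: {arrest_rate(STOPS, 8_000, 0.5):.1%}")

# Story B: stops are nearly random (4% of those stopped are offenders),
# and every offence is detected.
print(f"Near-random stops:   {arrest_rate(STOPS, 4_000, 1.0):.1%}")

# Both print 4.0%: the headline ratio cannot by itself distinguish effective
# deterrence from indiscriminate stopping; it is evidence, not an answer.
```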
The Leadership Legend – CEOs, David Petraeus & Marie Antoinette
Zeitgeist has found himself leading projects several times over the past year. The prospect can sometimes be a challenging one, and the received wisdom is that looking to the past can help shed light on the future. Looking at both recent and ancient history, however, says one thing more than anything else: leaders are victims of circumstance. Any strategy must adapt to context.
As a 20-something Londoner with money to burn, Zeitgeist naturally found himself on Saturday night sitting at home, reading The New Yorker. The fascinating review by Dexter Filkins of recent biographies of David Petraeus, former CIA director and the man responsible for the execution of the ‘surge’ in Iraq and Afghanistan, painted an interesting portrait of what leadership is about. Petraeus recognised that the system in place in the early days of Iraq, of rounding up countless civilians in order to ferret out insurgents, was neither efficient nor especially effective. As Filkins points out, “I witnessed several such roundups, and could only conclude that whichever of these men did not support the uprising when the raids began would almost certainly support it by the time the raids were over”. Leadership, then, in this case, came in the ability to spot a deficiency and then to offer a better solution. Petraeus, who liked to say that “money is ammunition”, focused on the civilians he wanted to protect, rather than the enemy he wanted to kill. This was a radical notion in the military at the time. True leadership narratives are often riddled with anecdotes of absolute maverick behaviour of this kind. The fallacy – and this is one of Taleb’s main points in his book on uncertainty, Black Swan – is that the stories of those whose maverick ideas did not work out rarely make for interesting books or films. Few songs will be written about those guys.
Just as Petraeus was able to leverage the time in which he happened to be serving in order to spot something he perceived to be at fault and have the opportunity to amend it, there is an element of luck involved too. “I have plenty of clever generals”, Napoleon once said, “just give me a lucky one”. Petraeus’ luck began with being around at the right time to see how things could be different. It continued when he managed to shepherd his idea for the ‘surge’ to fruition. While at the time the idea of deploying an extra 25,000 soldiers to Iraq was greeted with mixed reactions, to say the least, it can certainly be said to have paid off in large part. It was another example of a maverick move that panned out well. However, as Filkins points out, the timing of it all was what made it such a success. The Awakening, the phrase given to the Sunni-orchestrated truces with US troops that began before the surge, was instrumental. Filkins writes, “Could the surge have worked without the Awakening? Almost certainly not”. The Awakening most assuredly featured tactically in the execution of the surge, but you can be sure it was never part of the strategy. Perhaps it was the failure to notice this, and the attractiveness of the holistic narrative – another fallacy that Taleb notes in his book – that led to a surge being attempted, with far less success, in Afghanistan. What works in one place at one time might not work again.
Zeitgeist is also currently wading through Antonia Fraser’s biography of Marie Antoinette. It is quite extraordinary to note how many times the autocratic aristocracy were victims of circumstance, rather than being able to dictate their own fate through their own policies and leadership. In the long term, though greeted with warmth at the start of her reign, Marie Antoinette was always treated with a modicum of suspicion by the people of France, hailing as she did from Austria, a country with which political relations were lukewarm and whose culture left many an ordinary Frenchman cold. It was long-gestating prejudices such as these that helped blacken the Queen’s name; the phrase ‘Let them eat cake’ had been ascribed to various monarchs going back over a century before Marie Antoinette came to the throne. In the medium term, the support France provided in the American war of independence was pivotal. The Treasury spent an enormous amount of money funding the war, which was seen as a proxy battle with England; this action alone nearly bankrupted the country. But, away from finances, there was an ideological lens to consider as well. Landed gentry like Lafayette, who left nobly at the King’s command to support the war, returned not only as lauded heroes, but as heroes who had been fighting alongside a people who yearned to be free of an oppressive, royalist regime. Such thinking proved infectious, and was not forgotten when men like Lafayette returned home. Finally, in the short term, a ruinous stretch of weather stunted harvests, creating mass famine across the country in the lead-up to the revolution. All such things were manageable to an extent by the royalty, but in truth the origins of these influences were out of their hands.
CEOs today are seen as less wizard-like than they were five or ten years ago, when moguls, particularly in the media industry, bestrode the globe, acquiring companies at their whim, creating ‘synergy’ where none really existed in the first place (think AOL Time Warner). The paradigm shift, of course, has been the global recession that few – including many a lauded business leader – foresaw. Confidence in such people has been shaken. What these histories tell us about how to handle leadership can be summarised as follows:
1. Know your environment. Externalities and trends are likely to influence your business, and not always in obvious ways.
2. Be mindful of context. What works somewhere might not work in the same way again elsewhere.
3. Appearance, rightly or wrongly, counts for a great deal.
4. When you choose to do something can sometimes be as important as the thing itself.
5. A small amount of luck can go a long way.
On the danger of trends
Around this time of year, many companies, journalists and soothsayers become prone to taking educated guesses at trends in the coming year(s). Zeitgeist itself is guilty of such crystal ball gazing, writing on retail trends for Design Week at the beginning of this year.
Valuable as some of these insights are, we must never forget the externalities that inevitably push certain trends off course (look what the recession did to the popularity and importance of organic food). Though it is written in an annoying manner, Black Swan and its ideas are not to be ignored. Few would have suspected that this year would see the demise of Gaddafi, Bin Laden and Kim Jong-Il, but so it happened. In the 1932 film Shanghai Express, the American traveller Sam asks a question that at once reveals a prideful lack of foresight and an ephemeral resolution. Traits of a nation, perhaps. But it demonstrates the dangers of making judgements about the future based on current trends, and of presuming the status quo will remain just that.
“What future is there being a Chinaman? You’re born, you eat your way through a handful of rice, and you die. What a country. Let’s have a drink!”