Mischief, managed – digital disruptors in need of legacy structures

“Move fast and break things”. That is the motto of Facebook, and unofficially of many of its contemporaries. While much of the most visible impact of new digital organisations has been on how they respond to, engage with and influence user behaviour, just as significant has been the extent to which these organisations have eschewed traditional business models, ways of working and other internal practices. This includes traditional measures of success (hence the cartoon above from The New Yorker), but also of transparency and leadership. Such issues are the focus of this piece, comparing the old with the new and asking where opportunities and challenges can be found.

What makes digital-first organisations different

It’s important to acknowledge the utterly transformative way that digital-first companies do business and create revenue, and how different this is from the way companies operated over the past century. Much of this change can be summed up in the phrase “disruptive innovation”, coined by the great Clayton Christensen way back in 1995. I got to hear from and speak to Clay at a Harvard Business Review event at the end of last year; a clear-thinking, inspiring man. Few ideas from the mid-90s would still find use in organisations today, and yet this theory, paradoxically, holds. The market would certainly seem to bear the concept out. Writing for the Financial Times in April, John Authers noted,

Tech stocks… are leading the market. All the Fang stocks — Facebook, Amazon, Netflix and Google — hit new records this week. Add Apple and Microsoft, and just six tech companies account for 29 per cent of the rise in the S&P 500 since Mr Trump was inaugurated.

The FANG cohort are entirely data-driven organisations that rely on user information (specifically, user-volunteered information) to make their money. The more accurately they can design experiences, services and content around their users, the more likely they are to retain them. The greater the retention, the greater the power of network effects and lock-in. (Importantly, their revenues also make any new entrants easily acquirable prey, inhibiting competition.) These are Marketing 101 ambitions, but they are being deployed at a level of sophistication never seen before. Because of this, they are different businesses from those operating in legacy areas. Those incumbents are encumbered by many things, including heavily codified regulation. Regulatory bodies have not yet woken up to the way these new companies do business; it is only a matter of time. Until then, though, the consensus has been that working in a different way, without the threat of regulation, means traditional business structures can easily be discarded for the sake of efficiency, dismissed entirely as an analogue throwback.

The dangers of difference

One of the conceits of digital-first organisations is that they tend to be set up to democratise the sharing of services or data: disruption through liberalising a product so that everyone can enjoy something previously limited by enforced scarcity (e.g. cheap travel, cheap accommodation). At the same time, they usually have a highly personality-driven structure, where the original founder is treated with almost Messianic reverence. This despite high-profile revelations of the Emperor having no clothes, such as with Twitter’s Jack Dorsey, or with Marissa Mayer, formerly of Google, then Yahoo, now who-knows-where. She left Yahoo with a $23m severance package as reward for doing absolutely zero to save the organisation. Worse, she may have obstructed justice by waiting years to disclose details of cyberattacks. This was particularly galling for Yahoo’s suitor, Verizon, as the information came to light in the middle of its proposed purchase of the company (it resulted in a $350m cut to the acquisition price tag). The SEC is investigating. The silence on this matter is staggering, and points to a cultural lack of transparency that is not uncommon in the Valley. A recent Lex column effectively summarised this leader worship as a “most hallowed and dangerous absurdity”.

Uber’s embodiment of the founder-driven fallacy

Ben Horowitz, co-founder of the venture capital group Andreessen Horowitz, once argued that good founders have “a burning, irrepressible desire to build something great” and are more likely than career CEOs to combine moral authority with “total commitment to the long term”. It works in some cases, including at Google and Facebook, but has failed dismally at Uber.

– Financial Times, June 2017

This culture of focus on the founder has led to a little whitewashing (few would be able to name all of Facebook’s founders beyond the Zuck) and a lot of eggs in one basket. Snap’s recent IPO is a great example of the overriding faith and trust placed in founders, given that it indicated – as the FT calls it – a “21st century governance vacuum”. Governance appears to have been lacking at Uber, as well. The company endured months of salacious rumours and accusations, including candid footage of the founder, Travis Kalanick, berating an employee. This all rumbled on without consequence for quite some time. Travis was Travis, and lip service was paid while the search for some profit – Uber is worth more than 80% of the companies on the Fortune 500, yet in the first half of last year alone made more than $1bn in losses – continued.

Uber’s cultural problems eventually reached such levels (from myriad allegations of sexual harassment, to a lawsuit with Google over self-driving technology, to revelations about ‘Greyball’, software it used to mislead regulators) that Kalanick was initially forced to take a leave of absence. But as mentioned earlier, these organisations are personality-driven; the rot was not confined to one person. This became apparent when David Bonderman had to resign from Uber’s board after making a ludicrously sexist comment directed at none other than his colleague Arianna Huffington, one that illustrated the company’s startlingly old-school, retrograde outlook. This at a meeting where the company’s culture was being reviewed and the message to be delivered was one of turning a corner.

A report issued to the company on a turnaround recommended reducing Kalanick’s responsibilities and hiring a COO. The company has been without one since March. It is also without a CMO, CFO, head of engineering, general counsel and now, CEO. Many issues arise as a start-up grows from a small organisation into a large one. So it is with Uber – one engineer described it as “an organisation in complete, unrelenting chaos” – as it will be with other firms to come. There has been only a belated recognition that structures have to be put in place: the same types of structures that the organisations being disrupted already have. The FT writes,

“Lack of oversight and poor governance was a key theme running through the findings of the report… Their 47 recommendations reveal gaping holes in Uber’s governance structures and human resources practices.”

These types of institutional practices are difficult to enforce in the Valley, precisely because their connotations are of the monolithic corporate mega-firms that employees and founders of these companies are often consciously fighting against. Much of their raison d’être springs from an idealistic desire to change the world, and, methodologically, to do so by running roughshod over traditional work practices. This has its significant benefits (if only in terms of revenue), but from an employee-experience perspective it looks an increasingly questionable approach. Hadi Partovi, an Uber investor and tech entrepreneur, told the FT, “This is a company where there has been no line that you wouldn’t cross if it got in the way of success”. Much of the planned oversight would have been anathema to Kalanick, which is ultimately why the decision for him to leave was unavoidable. Uber now plans to refresh its values, install an independent board chairman, conduct senior management performance reviews and adopt a zero-tolerance policy toward harassment.

Legacy lessons from an incumbent conglomerate

Many of the recommendations in the report issued to Uber would be recognised by anyone working in a more traditional setting (as a former management consultant, I certainly recognise them). While the philosophical objection to such things has already been noted, it must also be recognised that the notion of a framework to police behaviour will be alien to almost anyone working in the Valley. Vivek Wadhwa, a fellow at the Rock Center for Corporate Governance, clarified, “The spoiled brats of Silicon Valley don’t know the basics. It is a revelation for Silicon Valley: ‘duh, you have to have HR people, you can’t sleep with each other… you have to be respectful’.”

Meanwhile, another CEO stepped down recently in more forgiving circumstances, but ones which still prompted unfavourable comparisons: Jeff Immelt of General Electric. As detailed in a stimulating piece last month in The New York Times, Immelt has had a difficult time of it. Firstly, he succeeded a man generally thought to be a visionary CEO, Jack Welch; Fortune magazine in 1999 described Welch as the best manager of the 20th century. So no pressure for Immelt there, then. Secondly, Immelt became Chairman and CEO four days before the 9/11 attacks, and weathered the 2008 financial crisis during his tenure. Lastly, since he took over, the nature of companies, as this article has attempted to make clear, has changed radically. Powerful conglomerates no longer rule the waves.

Immelt has, perhaps belatedly, committed to downsizing GE’s sprawling portfolio in order to make the company more specialised. Moreover, his humility is a million miles from the audacity, braggadocio and egotism of Kalanick, as he acknowledges: “This is not a game of perfection, it’s a game of progress.”

So while the FANGs of the world are undoubtedly changing the landscape of business (not to mention human interaction and behaviour), they also need to recognise that not all legacy structures and processes should be consigned to the dustbin of management history simply because they come from legacy industry sectors. Indeed, diverting more responsibility away from the founder, greater accountability and transparency, and a more structured employee experience might lead to greater returns and higher employee retention rates, and perhaps even mitigate regulatory scrutiny down the line. The opportunity is there for those sensible enough to grasp it.

On past and future innovation – Disruption, inequality and robots

How to define innovation, how has it been studied in the recent past, and what does future innovation hold for the human race?

Sometimes the word innovation gets misused, as when people use the word “technology” to mean recent gadgets and gizmos, instead of acknowledging that the term encompasses the first wheel. “Innovation” is another tricky one. Our understanding of recent thinking on innovation – as well as its contemporary partner, “disruption” – was thrown into question in June when Jill Lepore penned an article in The New Yorker that put our ideas about innovation, and specifically Clayton Christensen’s ideas about innovation, in a new light. Christensen, heir apparent to fellow Harvard Business School bod Michael Porter (author of the simple, elegant and classic The Five Competitive Forces that Shape Strategy), wrote The Innovator’s Dilemma in 1997. His work on disruptive innovation, claiming that successful businesses focused too much on what they were doing well, missing what, in Lepore’s words, “an entirely untapped customer wanted”, created a cottage industry of conferences, companies and counsels committed to dealing with disruption (not least this blog, which lists disruption as one of its topics of interest).

Lepore’s article describes how, as Western society’s retelling of the past became less dominated by religion and more by science and historicism, the future became less about the fall of Man and more about the idea of progress. This thought took hold particularly during the Enlightenment. In the wake of two World Wars, though, our endless advance toward greater things seemed less obvious:

“Replacing ‘progress’ with ‘innovation’ skirts the question of whether a novelty is an improvement: the world may not be getting better and better but our devices are getting newer and newer.”

The article goes on to look at the handpicked case studies Christensen used in his book. When Christensen describes one of his areas of focus, the disk-drive industry, as being unlike any other in the history of business, Lepore rightly points out that its sui generis nature “makes it a very odd choice for an investigation designed to create a model for understanding other industries”. She goes on for much of the article to utterly debunk several of the author’s case studies, showcasing inaccuracies and even criminal behaviour on the part of those businesses he heralded as disruptive innovators. She also deftly points out, much in line with the thinking in Taleb’s The Black Swan, that failures are often forgotten, while those that succeed are grouped and promoted as formulae for success. Such is the case with Christensen’s apparently cherry-picked case studies. Writing about one company, Pathfinder, which tried to branch out into online journalism, seemingly too soon, Lepore comments,

“Had [it] been successful, it would have been greeted, retrospectively, as evidence of disruptive innovation. Instead, as one of its producers put it, ‘it’s like it never existed’… Faith in disruption is the best illustration, and the worst case, of a larger historical transformation having to do with secularization, and what happens when the invisible hand replaces the hand of God as explanation and justification.”

Such were the ramifications of the piece that when questioned on it recently in Harvard Business Review, Christensen confessed, “the choice of the word ‘disruption’ was a mistake I made twenty years ago”. The warning to businesses is that just because something is seen as ‘disruptive’ does not guarantee success, or, fundamentally, that it belongs in any long-term strategy. Developing expertise in a disparate area takes time and investment, in terms of people, infrastructure and cash. And for some, the very act of resisting disruption is what has made them thrive. Another recent piece in HBR makes the point that most successful strategies involve not a single act of deus ex machina thinking-outside-the-boxness, but rather sustained disruption. Though Kodak, Sony and others may have rued the days, months and years they neglected to innovate beyond their core areas, the graveyard of dead businesses is also surely littered with companies that innovated too soon, in the wrong way, or at too great a cost, leaving them open to forces other than what Schumpeter termed creative destruction.

[Image: Sarah Connor from the Terminator series. Caption: “Your new boss”]

Outside of cultural and philosophical analysis of the nature and definition of innovation, some may consider of more pressing concern the news that we are soon to be looked after by, and subsequently outmanoeuvred in every way by, machines. The most far-reaching (and therefore not necessarily most likely) of these concerns was recently put forward by Nick Bostrom in his new book Superintelligence: Paths, Dangers, Strategies. According to a review in The Economist, the book posits that once you assume there is nothing inherently magic about the human brain, it follows that an intelligent machine can, in principle, be built. Bostrom worries, though, that “Once intelligence is sufficiently well understood for a clever machine to be built, that machine may prove able to design a better version of itself”, and so on, ad infinitum. “The thought processes of such a machine, he argues, would be as alien to humans as human thought processes are to cockroaches. It is far from obvious that such a machine would have humanity’s best interests at heart—or, indeed, that it would care about humans at all”.

Beyond these admittedly far-off prognostications of the removal of the human race at the hands of the very things it created, machines and digital technology in general pose great risks in the near term, too. For a succinct and alarming introduction, watch the enlightening video at the beginning of this post. Since the McKinsey Global Institute published a paper in May soberly titled Disruptive technologies: Advances that will transform life, business, and the global economy, much editorial ink and celluloid (were either medium still in much use) has been spilled and spooled detailing how machines will slowly replace humans in the workplace. This transformation – itself a prime example of creative destruction – is already underway in the blue-collar world, where machines have replaced workers in automotive factories. The Wall Street Journal reports that Chinese electronics makers are facing pressure to automate as labour costs rise, but are challenged by the low margins, precise work and short product life of the phones and other gadgets the country produces. Travel agents and bank clerks have also been rendered redundant, thanks to that omnipresent machine, the Internet. Writes The Economist, “[T]eachers, researchers and writers are next. The question is whether the creation will be worth the destruction”. The McKinsey report, according to The Economist, “worries that modern technologies will widen inequality, increase social exclusion and provoke a backlash. It also speculates that public-sector institutions will be too clumsy to prepare people for this brave new world”.

Such thinking gels with an essay in the July/August edition of Foreign Affairs by Erik Brynjolfsson, Andrew McAfee and Michael Spence, titled New World Order. The authors rightly posit that in a free market the biggest premiums are reserved for the products with the most scarcity. When even niche, specialist employment, though, such as in the arts (see the video at the start of the article), can be replicated and performed at economies of scale by machines, then labourers and the owners of capital are at great risk. The essay makes the good point that while a simple economic model suggests technology’s impact increases overall productivity for everyone, the truth is that the impact is more uneven. The authors astutely point out,

“Today, it is possible to take many important goods, services, and processes and codify them. Once codified, they can be digitized [sic], and once digitized, they can be replicated. Digital copies can be made at virtually zero cost and transmitted anywhere in the world almost instantaneously.”

Though this sounds utopian and democratic, what it actually does, the essay argues, is propel certain products to superstardom. Network effects create this winner-take-all market. Similarly, it creates disproportionately successful individuals. Although there are many factors at play here, as the authors readily concede, they also maintain the importance of another important and distressing point:

“[A] portion of the growth is linked to the greater use of information technology… When income is distributed according to a power law, most people will be below the average… Globalization and technological change may increase the wealth and economic efficiency of nations and the world at large, but they will not work to everybody’s advantage, at least in the short to medium term. Ordinary workers, in particular, will continue to bear the brunt of the changes, benefiting as consumers but not necessarily as producers. This means that without further intervention, economic inequality is likely to continue to increase, posing a variety of problems. Unequal incomes can lead to unequal opportunities, depriving nations of access to talent and undermining the social contract. Political power, meanwhile, often follows economic power, in this case undermining democracy.”
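The quoted claim that, under a power-law income distribution, most people fall below the average is easy to check with a quick simulation. The sketch below is purely illustrative and not from the essay: it samples incomes from a Pareto distribution, a standard model of power-law income, with an arbitrarily chosen shape parameter.

```python
import random

# Illustrative sketch: draw a large sample of "incomes" from a Pareto
# distribution and count how many fall below the arithmetic mean.
# alpha = 1.5 is an arbitrary shape choice for demonstration only.
random.seed(0)
alpha = 1.5
incomes = [random.paretovariate(alpha) for _ in range(100_000)]

mean_income = sum(incomes) / len(incomes)
share_below = sum(x < mean_income for x in incomes) / len(incomes)
print(f"Share of sample earning below the mean: {share_below:.0%}")
```

Because the distribution has a heavy right tail, a small number of very large incomes pulls the mean well above the median, so the printed share typically lands around 80 per cent: most of the sample sits below average, which is exactly the essay’s point.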

There are those who say such fears of rising inequality and of the destruction through automation of whole swathes of the job sector are unfounded; that many occupations require a certain intuition that cannot be replicated. Time will tell whether this intuition, like an audio recording, a health assessment or the ability to drive a car, will be similarly codified and disrupted (yes, we’ll continue using the word disrupt, for now).