How should we define innovation, how has it been studied in the recent past, and what does the future of innovation hold for the human race?
Sometimes the word innovation gets misused. Like when people use the word “technology” to mean recent gadgets and gizmos, instead of acknowledging that the term encompasses the first wheel. “Innovation” is another tricky one. Our understanding of recent thoughts on innovation – as well as its contemporary partner, “disruption” – was thrown into question in June when Jill Lepore penned an article in The New Yorker that cast our ideas about innovation, and specifically Clayton Christensen’s ideas about innovation, in a new light.

Christensen, heir apparent to fellow Harvard Business School bod Michael Porter (author of the simple, elegant and classic The Five Competitive Forces that Shape Strategy), wrote The Innovator’s Dilemma in 1997. His work on disruptive innovation – claiming that successful businesses focused too much on what they were doing well, missing what, in Lepore’s words, “an entirely untapped customer wanted” – created a cottage industry of conferences, companies and counsels committed to dealing with disruption (not least this blog, which lists disruption as one of its topics of interest).

Lepore’s article describes how, as Western society’s retelling of the past became less dominated by religion and more by science and historicism, the future became less about the fall of Man and more about the idea of progress. This thought took hold particularly during The Enlightenment. In the wake of two World Wars, though, our endless advance toward greater things seemed less obvious:
“Replacing ‘progress’ with ‘innovation’ skirts the question of whether a novelty is an improvement: the world may not be getting better and better but our devices are getting newer and newer.”
The article goes on to look at the handpicked case studies Christensen used in his book. When Christensen describes one of his areas of focus, the disk-drive industry, as being unlike any other in the history of business, Lepore rightly points out that its sui generis nature “makes it a very odd choice for an investigation designed to create a model for understanding other industries”. She spends much of the article utterly debunking several of the author’s case studies, showcasing inaccuracies and even criminal behaviour on the part of those businesses he heralded as disruptive innovators. She also deftly points out, much in line with the thinking of Taleb’s Black Swan, that failures are often forgotten, while those that succeed are grouped and promoted as formulae for success. Such is the case with Christensen’s apparently cherry-picked case studies. Writing about one company, Pathfinder, that tried to branch out into online journalism, seemingly too soon, Lepore comments,
“Had [it] been successful, it would have been greeted, retrospectively, as evidence of disruptive innovation. Instead, as one of its producers put it, ‘it’s like it never existed’… Faith in disruption is the best illustration, and the worst case, of a larger historical transformation having to do with secularization, and what happens when the invisible hand replaces the hand of God as explanation and justification.”
Such were the ramifications of the piece that, when questioned on it recently in Harvard Business Review, Christensen confessed “the choice of the word ‘disruption’ was a mistake I made twenty years ago”. The warning to businesses is that just because something is seen as ‘disruptive’ does not guarantee success, or fundamentally that it belongs in any long-term strategy. Developing expertise in a disparate area takes time and investment, in terms of people, infrastructure and cash. And for some, the very act of resisting disruption is what has made them thrive. Another recent piece in HBR makes the point that most successful strategies involve not a single act of deus ex machina thinking-outside-the-boxness, but rather sustained disruption. Though Kodak, Sony and others may have rued the days, months and years they neglected to innovate beyond their core area, the graveyard of dead businesses is surely also littered with companies that innovated too soon, in the wrong way, or at too great a cost, leaving them open to things other than what Schumpeter termed creative destruction.
Outside of cultural and philosophical analysis of the nature and definition of innovation, some may consider of more pressing concern the news that we are soon to be looked after by, and subsequently outmanoeuvred in every way by, machines. The largest and most forward-thinking (and therefore not necessarily most likely) of these concerns was recently put forward by Nick Bostrom in his new book Superintelligence: Paths, Dangers, Strategies. According to a review in The Economist, the book posits that once you assume there is nothing inherently magic about the human brain, it follows that an intelligent machine can be built. Bostrom worries, though, that “Once intelligence is sufficiently well understood for a clever machine to be built, that machine may prove able to design a better version of itself”, and so on, ad infinitum. “The thought processes of such a machine, he argues, would be as alien to humans as human thought processes are to cockroaches. It is far from obvious that such a machine would have humanity’s best interests at heart—or, indeed, that it would care about humans at all”.
Beyond the admittedly far-off prognostications of the removal of the human race at the hands of the very things it created, machines and digital technology in general pose great risks in the near term, too. For a succinct and alarming introduction, watch the enlightening video at the beginning of this post. Since the McKinsey Global Institute published a paper in May soberly titled Disruptive technologies: Advances that will transform life, business, and the global economy, much editorial ink and celluloid (were either medium still in much use) has been spilled and spooled detailing how machines will slowly replace humans in the workplace. This transformation – itself a prime example of creative destruction – is already underway in the blue-collar world, where machines have replaced workers in automotive factories. The Wall Street Journal reports that Chinese electronics makers are facing pressure to automate as labour costs rise, but are challenged by the low margins, precise work and short product life of the phones and other gadgets the country produces. Travel agents and bank clerks have also been rendered all but obsolete, thanks to that omnipresent machine, the Internet. Writes The Economist, “[T]eachers, researchers and writers are next. The question is whether the creation will be worth the destruction”. The McKinsey report, according to The Economist, “worries that modern technologies will widen inequality, increase social exclusion and provoke a backlash. It also speculates that public-sector institutions will be too clumsy to prepare people for this brave new world”.
Such thinking gels with an essay in the July/August edition of Foreign Affairs, by Erik Brynjolfsson, Andrew McAfee and Michael Spence, titled New World Order. The authors rightly posit that in a free market the biggest premiums are reserved for the products with the most scarcity. When even niche, specialist employment, though – such as in the arts (see video at start of article) – can be replicated and performed at economies of scale by machines, then labourers and the owners of capital are at great risk. The essay makes good points on how, while a simple economic model suggests that technology’s impact increases overall productivity for everyone, the truth is that the impact is far more uneven. The authors astutely point out,
“Today, it is possible to take many important goods, services, and processes and codify them. Once codified, they can be digitized [sic], and once digitized, they can be replicated. Digital copies can be made at virtually zero cost and transmitted anywhere in the world almost instantaneously.”
Though this sounds utopian and democratic, what it actually does, the essay argues, is propel certain products to superstardom. Network effects create this winner-take-all market. Similarly, it creates disproportionately successful individuals. Although there are many factors at play here, as the authors readily concede, they also maintain the importance of another important and distressing theory:
“[A] portion of the growth is linked to the greater use of information technology… When income is distributed according to a power law, most people will be below the average… Globalization and technological change may increase the wealth and economic efficiency of nations and the world at large, but they will not work to everybody’s advantage, at least in the short to medium term. Ordinary workers, in particular, will continue to bear the brunt of the changes, benefiting as consumers but not necessarily as producers. This means that without further intervention, economic inequality is likely to continue to increase, posing a variety of problems. Unequal incomes can lead to unequal opportunities, depriving nations of access to talent and undermining the social contract. Political power, meanwhile, often follows economic power, in this case undermining democracy.”
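The essay’s claim that “when income is distributed according to a power law, most people will be below the average” can be sanity-checked with a quick simulation – a sketch of our own, not from the essay, with the shape parameter `alpha = 3` chosen purely for illustration:

```python
import random

def share_below_mean(alpha: float, n: int = 100_000, seed: int = 42) -> float:
    """Sample n incomes from a Pareto (power-law) distribution with shape
    `alpha` (minimum income normalised to 1) and return the share of
    people earning less than the mean income."""
    rng = random.Random(seed)
    incomes = [rng.paretovariate(alpha) for _ in range(n)]
    mean = sum(incomes) / n
    return sum(1 for x in incomes if x < mean) / n

# For alpha = 3 the theoretical mean is alpha/(alpha-1) = 1.5, and the
# share of incomes below it is 1 - (2/3)**3, roughly 70 percent.
print(share_below_mean(3.0))
```

In other words, in a power-law world the “average” is dragged upward by a handful of superstars, so most people sit below it – exactly the dynamic the authors describe.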
There are those who say such fears – of a rise in inequality, and of the destruction through automation of whole swathes of the job sector – are unfounded; that many occupations require a certain intuition that cannot be replicated. Time will tell whether this intuition, like an audio recording, a health assessment or the ability to drive a car, will be similarly codified and disrupted (yes, we’ll continue using the word disrupt, for now).
Are incumbent companies starting to see the light when it comes to embracing digital? Evidence is slowly starting to point in that direction.
Artists are known for embracing change and innovation, but the art market itself has been slow to adapt to changing consumer behaviour. Now mega e-tailer Amazon is selling art on its site, and venerable auction house Christie’s is pushing headlong into online-only sales, as Mashable recently reported. And while fashion designers know how to use digital to push the envelope, the fashion industry as a business has been notorious for its skittishness at investing in efficient, immersive digital experiences for its customers, so worried is it about detracting from the brand. So it was reassuring to see during Paris Fashion Week recently that French marque Chloé had got the message. As Zeitgeist’s dear friend and fashion aficionado Rachel Arthur details on her blog, the brand launched a dedicated microsite for its runway show. Brands like Burberry and Louis Vuitton have been doing this for at least three years, so in and of itself it’s nothing new. What made the experience different were two things. Firstly, the site created a journey that started before the show and continued after it, rather than merely offering a stream of live video and little else. More importantly, it tried to make the experience one that reflected the influence of those watching. As Rachel points out,
“As the event unfolded, so too did different albums under a moodboard header, including one for the collection looks, one for accessories, another for the guests, and one from backstage. Users could click on individual images and share them via Twitter, Facebook, Pinterest or Weibo, or heart them to add them to their own personal moodboard page.
‘[We] are excited to see how you direct your own Chloé show,’ read the invite.”
The recognition of platforms like Weibo should be seen as another coup for Chloé. Too often, companies send out communications to global audiences with perfunctory links to Facebook and Twitter. Not only is there no call to action for these links (why should the user go there?), but there is no recognition that one of the world’s most populous and prosperous markets is more into its Renren and Weibo.
Elsewhere, despite what seems like some niggling problems, Zeitgeist was excited and intrigued to read about Disney‘s latest foray into embracing how consumers use digital devices, this time creating a second-screen experience in movie theaters. Second Screen Live, as Disney have branded it, doesn’t immediately sound particularly logical, as GigaOm point out,
“Of all the places I’d thought would be forbidden to the second screen experience, movie theaters were near the top of my list. After all, you’re paying a premium ticket price for the opportunity to sit in a dark theater and immerse yourself in a narrative — second screen devices operate in direct opposition to that.”
And yet the Little Mermaid experience that the writer goes on to describe cannot be faulted for its attempt at innovation, at reaching beyond current thinking (not to mention revenue streams), in order to forge a new relationship between the viewer and the product. Kudos.
Lastly, Zeitgeist wanted to mention the US television network Fox as a classic example of a company that has slowly come to realise the power of working with digital, rather than against it. In years past, companies like Fox were indisputably heavily involved in digital, but only from a punitive standpoint. Fox and others were ruthless in their distribution of takedown notices to sites hosting content they deemed to infringe on their product. Fan sites that exploded in support and admiration for shows like The X-Files were summarily threatened with legal action and closed. Little thought was given to the positive sentiment such sites were creating around the product, or to the destruction of brand equity that such takedown notices brought about. Not to mention the desiccation of communities that had come together from different parts of the world, their single shared attribute being that they were evangelists of what you were selling. Clips of shows, such as The Simpsons, appearing on YouTube would be treated with similar disdain. So it shows how far we’ve come in a few years that this morning, when Zeitgeist went onto YouTube, he was greeted on the homepage with a sponsored link from Fox pointing him to the opening scenes of the latest Simpsons episode, before it aired. Definitely a move in the right direction.
This past week, Zeitgeist had the pleasure of enjoying a new adaptation of Shakespeare’s “Much Ado About Nothing”. This adaptation was performed not at the theatre but at the cinema. It was directed not by Kenneth Branagh or any other luminary of the legitimate stage, but by the quiet, modest, nerdy Joss Whedon, who until a few years ago was best known to millions as the brains behind the cult TV phenomenon “Buffy the Vampire Slayer” (full disclosure: Zeitgeist worked on the show in his days of youth). Whedon was picked to direct a film released last year that can, without much difficulty, be seen as the apotheosis of the Hollywood film industry: “The Avengers”. A mise-en-abyme of a concept, it involves disparate characters, some of whom already have their own fully-fledged franchises, coming together to form another vehicle for future iterations. “The Avengers” became the third-highest grossing film of all time, and it is a thoroughly enjoyable romp. Moreover, to go from directing on such a broad canvas to shooting a film mostly with friends in one’s own home – as with “Much Ado…” – displays an impressive range of creative ingenuity.
Sadly for shareholders and studio executives’ career aspirations, not every film is as sure-fire a hit as “The Avengers”, try as they might (and do) to replicate the same mercurial ingredients that lead to success. Marvel, which originally conceived of the myriad characters surrounding The Avengers mythology, was bought in 2009 by Disney for $4bn. Disney, for all intents and purposes, have a steady strategic head on their shoulders. They parted ways with the quixotic Weinstein brothers while welcoming Pixar back into the fold. They were one of the first to concede the inevitability of collapsing release windows across platforms – something Zeitgeist has written about in the past – and they are debuting a game-changing platform, Infinity, which might revolutionise the way children interact with the plethora of memorable characters the studio have dreamt up over the years. However, such sound business strategy could not save them from the uber-flop that was 2012’s “John Carter”, which lost the studio $200m. This summer, the rationale for their biggest release was built on what appears to be sound logic: taking the on- and off-screen talent behind their massively successful “Pirates of the Caribbean” franchise and bringing them together again for another reboot, in the form of “The Lone Ranger”. The New York Times said the film “descends into nerve-racking incoherence”; it has severely underperformed at the box office, on a budget of $250m. Sony’s “After Earth” similarly underperformed, suddenly throwing Will Smith’s bullet-proof reputation for producing hits into jeopardy.
These summer films – “tentpoles”, to use the terminology bandied about in Los Angeles – are where the money is made (or not) for studios. Over the past ten years, Zeitgeist has watched as these tentpoles have become more concentrated, more risk-averse and therefore less original, more expensive, and more likely to produce either stratospheric results or spectacular failures. Paramount is an interesting example of a studio that has made itself leaner recently, releasing far fewer films and relying on franchises to keep the ship afloat. Editorial Director of Variety Peter Bart seems to think there’s a point when avoiding risk leads to courting entropy. It’s an evolution that has escaped few, yet it was still notable when, last month, famed directors Steven Spielberg and George Lucas spoke out publicly against the way the industry seemed to be headed. Indeed, the atmosphere at Hollywood studios seems to mimic that of the pre-2008 financial sector: leveraging ever more collateral against assets with significant – and unsustainable – levels of risk. The financial sector uses arcane algorithms and employs a large number of Wharton grads whose aim should be to preserve stability and profit; yet even with all this analysis, it failed to see the gigantic readjustment that was imminent. In the film industry, Relativity Media’s reputation for rigorous predictive models of what will make a film successful is rare enough to have earned it a feature in Vanity Fair. So what hope is there that the film industry will change its tune before it is too late? Spielberg pontificates,
“There’s eventually going to be a big meltdown. There’s going to be an implosion where three or four or maybe even a half-dozen of these mega-budgeted movies go crashing into the ground and that’s going to change the paradigm again.”
Instead of correcting course as box-office failures continued unabated, studios have dug in harder. Said Lucas,
“They’re going for gold, but that isn’t going to work forever. And as a result they’re getting narrower and narrower in their focus. People are going to get tired of it. They’re not going to know how to do anything else.”
Such artistic ennui among audiences is admittedly scarcely visible at the moment. “Man of Steel”, another attempt at rebooting a franchise – coming only seven years after the last attempt – is performing admirably, still firmly in the top ten at the US box office after four weeks of release, with over $275m taken domestically. It is interesting to note that audiences have been happy to embrace the new version so quickly after the last franchise launch failed; though actor James Franco finds it contentious, the same has been true of the “Spider-Man” franchise relaunch.
Part of the problem in the industry, some say, lies with those at the top running the various film studios. In “The Curse of the Mogul”, written by lecturers at Columbia University, the authors contend that since 2005 the industry as a whole has underperformed the S&P stock index, yet such stocks remain eminently attractive to investors. The reason, the authors say, is that those running the businesses frame the notion of success differently. They argue that it takes a very special type of person (i.e. them) to manage not only different media, the different audiences they reach and the different trends that come out of that, but, more importantly (in their eyes), the talent. They ask to be judged on Academy Awards rather than bottom lines. The most striking thing in the book – which Zeitgeist is still reading – is the continual pursuit by said moguls of strategic synergies. Such M&A activity excites shareholders but has historically led to minimal returns (think Vivendi, or AOL Time Warner), often because what was presented as operational or content-based synergy was actually nothing of the sort. It’s a point Richard Rumelt makes in his excellent book, “Good Strategy / Bad Strategy”. Some companies are beginning to get the idea. Viacom seemed an outlier in 2006 when it divested CBS. Lately, News Corporation has followed a similar tack, albeit under duress after suffering scandalous revelations about hacking in its news division. A recent article in The Economist states,
“Most shareholders now see that television networks, newspapers, film studios, music labels and other sundry assets add little value by sharing a parent. Their proximity can even hinder performance by distracting management… they have become more assertive and less likely to believe the moguls’ flannel about ‘synergies’.”
So in some ways it was of little surprise that Sony came under the microscope recently as well, part of this larger trend of scrutiny. The company has experienced dark times of late, with shares having plunged 85% over the past 13 years. The departure of Howard Stringer in 2012 coincided with an annual loss of some $6.4bn. Now headed up by Kazuo Hirai, the company has undoubtedly become more focused, with much more being made of its mobile division. Losses have been stemmed, but the company is still floundering, reporting an annual loss in May of $4.6bn. It was only a couple of weeks later that hedge-fund billionaire Dan Loeb – instrumental in getting Marissa Mayer to lead Yahoo – upped his ownership stake in Sony, calling on it to divest its entertainment division in a letter to CEO Hirai. Part of the issue with Sony is a cultural one, where Japan’s ways of working differ strongly from the West’s. This is covered in some detail in a profile of Stringer featured in The New Yorker. In a speech he gave last year, Stringer said, “Japan is a harmonious society which cherishes its social values, including full employment. That leads to conflicts in a world where shareholder value calls for ever greater efficiency”. But Sony’s film division – which includes the James Bond franchise – is performing well; in the year to March 2013 Sony’s film and music businesses produced $905m of operating income, compared with combined losses of $1.9bn in mobile phones, according to The Economist. It ended 2012 in first place among the film studios in market share. Sony is the last studio to consistently deliver hits across genres, reports The New York Times in an excellent article. The article quotes an anonymous Sony executive: “We may not look like the rest of Hollywood, but that doesn’t mean this isn’t a painstakingly thought-through strategy and a profitable one”.
Sadly, the strategy behind films like “After Earth” begins to look flimsy when one glances at the box office results. While Hirai and the Sony board concede that they have met to discuss the possibility of honouring Mr. Loeb’s suggestion – offering 15-20% of the division as an IPO rather than selling it off in full – Mr. Hirai also commented in an interview with CNBC, “We definitely want to make sure we can continue a successful business in the entertainment space. That is for me, first and foremost, the top priority”. In mid-June Loeb sent a second letter, advocating the IPO proposal and saying “Our research has confirmed media reports depicting Entertainment as lacking the discipline and accountability that exist at many of its competitors”. The question is whether selling off its entertainment assets would remove any synergies with other divisions, making the divisions left over less profitable, or whether such synergies even existed in the first place. For Loeb, the “most valuable untapped synergies” are still in the studio and music divisions, yet after decades as one company they still remain untapped. That point won’t make for pleasant reading at Sony HQ.
Another problem is the changing nature of media consumption habits. Not only are we watching films in different ways over different platforms, we are also doing much else besides, from playing video games, which have successfully transitioned beyond the nerdy clique of yesteryear, to general mobile use and second-screening. This transition – and with it the realisation that competition is likely to come not from across regional borders but from startup platforms – is largely being ignored by the French as they insist on trade talks with the US that centre on the preservation of l’exception culturelle. Such trends are evident in business dealings. The Financial Times this weekend detailed Google’s significant foray into developing content, setting up YouTube Space LA. The project gives free soundstage space to artists who are likely to guarantee eyeballs on YouTube, and lead to advertising revenue for the platform. From the stellar success of the first season of “House of Cards” to DreamWorks Animation’s original content partnership announced last month, Netflix has become the bête noire of traditional content producers as it shakes up traditional models. We have written before about the IHS Screen Digest data from earlier this year, which showed worrying trends for the industry; as predicted, audiences are beginning to favour access over ownership, preferring to rent rather than own, which means less profit for the studio. As much due to a decline in revenue from other platforms as to growth in itself, cinemas are expected to be the major area of profit going forward to 2016 (see above chart). We’ve written before about the power cinema still has. Spielberg and Lucas pick up on this:
“You’re going to end up with fewer theaters, bigger theaters with a lot of nice things. Going to the movies will cost 50 bucks or 100 or 150 bucks, like what Broadway costs today, or a football game. It’ll be an expensive thing… [Films] will sit in the theaters for a year, like a Broadway show does. That will be called the ‘movie’ business.”
In a conversation over Twitter (excerpts of which are featured above), Cameron Saunders, MD of 20th Century Fox UK, told Zeitgeist that “major changes were afoot”. Such potential disruption is by no means unique to the film industry, and should come as a surprise to no one. Zeitgeist recently went to see Columbia faculty member Rita McGrath speak at a Harvard Business Review event. In her latest book, “The End of Competitive Advantage”, McGrath discounts the old management-consultant promise of providing sustainable competitive advantages to business. Her assertion is that any advantage is transient; that incumbency and success often lead to entropy, unless there is constant innovation to build on that success. Such a verdict of entropy could well be applied to the film industry. The model has worked well for decades, despite predictions of doom at the advent of television, the VCR, the DVD, et cetera, ad nauseam. But fundamental behavioural shifts are now at play, and the way we devise strategies for what content people want to see, and how they wish to see it, needs to be readdressed, quickly. Otherwise all this deliberation will eventually become much ado about nothing.
UPDATE (15/4/13): Of course, context is everything. The New York Times published an interesting article today saying investing in Hollywood is less risky than investing in Silicon Valley, though the returns in the latter are likely to be greater. Neither are seen as reliable.
This issue isn’t going away. We write again about it, here.
“[T]he big screen. That is its natural habitat—the only place, you might say, where its proud and leonine presence has any meaning. Anything more cramped is a cage, as Jon Stewart showed during this year’s Oscar ceremony. At one point, we found him gazing at his iPhone. “I’m watching ‘Lawrence of Arabia.’ It’s just awesome,” he said, adding, “To really appreciate it, you have to see it in the wide screen.” And he turned the phone on its side. Deserts of vast eternity, reduced to three inches by two.”
- Anthony Lane, The New Yorker
Film can sometimes be a mercurial medium. Especially nowadays. It encompasses multiple genres and, like food, is meant for different occasions, for different needs. Of course, sometimes we go to bad restaurants, or order in, and the experience is terrible. Uber-flop John Carter cost Disney a cool $200m, and wasted many a precious dollar and hour for those that went to see it (admittedly few). But sometimes it’s like a great burger and fries – Die Hard springs to mind – and sometimes it’s a sumptuous six-course meal cooked by a Michelin-starred chef – Lawrence of Arabia, or All the King’s Men. Film can stimulate us, it can teach us, and it can be a breezy bit of consumption to pass the time, like a coffee at Starbucks. Moreover, as with food, it can be consumed in different places and circumstances. There are times when the right way to watch a certain film is on your iPad in a cramped airline seat. Pure escapism. But cinema has a crucial place too.
It was interesting today, when Zeitgeist went to see a movie, that it was preceded by an announcement showing an empty cinema, covered in cobwebs and dust, bemoaning the death of the medium at the hands of pirates. Its aim was to take the audience on a guilt trip: ‘Why are you illegally downloading films?’ ‘Why aren’t you coming to see more films at the cinema?’ it pleaded. There are a couple of things strategically wrong with this approach. Firstly, what is the principal problem here? Alright, people are not going to the cinema as often as we would like. Zeitgeist remembers, from a brief stint working for Fox several years ago, that people went to the cinema 1.8 times a year in the UK. The Economist reports that the share of Americans who attend a cinema at least once a month has declined from 30% in 2000 to 10% in 2011. The assumption is that people are instead pirating films at home, thereby depriving studios of money (ignoring research that suggests those who pirate are often avid cinema-goers, and optimistically equating every film downloaded with ticket revenue lost). Well, one quick way to address this is to make films legally available – at a sizeable premium – on multiple platforms, day and date. We’ve argued this before, and entertainment trade Variety has used our argument for a lead editorial. It should be recognised that, although it is the most prominent face of the film industry, cinema is not what makes the studios money; for years the bulk of profits have been made in home entertainment consumption. Furthermore, two points are often overlooked here. One is that cinemas make most of their profit from the snacks people buy at the cinema, not the films themselves. If you want to increase margins, there should be a much more prominent focus on food options, and that means offering a wider, more tempting range of food, promoted more effectively.
The way such snacks are currently promoted – “Let’s all go to the lobby” – has not altered in half a century. Secondly, and most egregiously, the communication is completely misdirected, talking to the very audience that is already doing what the ad asks of them. The ad is shown nowhere but the cinema, so only people who go to the cinema are subjected to this guilt trip. To avoid feeling guilty, one can avoid the ad by avoiding the cinema. The logic is completely twisted. Negative communications have been shown to be much less effective at influencing behaviour than positive affirmation. So let’s think about a way to promote cinema that goes beyond a highlight reel of what movies are on in a particular season. More robust revenue streams will have to be found soon. Fewer people are turning out to the cinema, and in foreign markets, which are doing relatively well, a far smaller chunk of box-office receipts goes to the studios.
What also played during the reel before the film started was a short film by Disney Animation, nominated for an Academy Award, called Paperman (see trailer above). Zeitgeist had watched the short some days ago on his iPhone after coming across it on Twitter, and enjoyed it thoroughly. It was exciting and convenient to be able to consume something so quickly after hearing about it. Moreover, it was instantly shareable with the 400-odd people who follow our tweets when we retweeted the link. Seeing it in the cinema today, though, really reinforced the power of the big screen: the detail you couldn’t see on the iPhone, the great sound, and the shared laughter and enjoyment of those around you. “Grandeur is a far from simple blessing”, writes Anthony Lane in The New Yorker, in the same 2008 article quoted at the beginning of this post. The pleasure of watching something in the cinema is ultimately an irrational benefit, which can be hard to quantify, but even harder to ignore.
UPDATE (06.12.13): The Economist featured a good article on how cinemas are seeking new revenue streams around the world, here.
“If you have built castles in the air, your work need not be lost; that is where they should be. Now put foundations under them.”
- Henry David Thoreau
Though the brouhaha over the series House of Cards has been building steadily since its announcement almost two years ago, through rumours of budget battles between director and studio, it was upon the release of the series this week that the media meta-echo chamber really went into overdrive. The first season, with a budget far north of $100m, debuted to ebullient praise from critics. But what does it signify for the trail-blazing company’s future?
Aside from the mostly positive reviews, the series piqued the media industry’s interest for other reasons too. It is the first to be created and screened exclusively by Netflix, a company previously known for striking deals with studios to distribute and stream their content. Not satisfied solely with such (sometimes pricey) deals, the company also saw an opportunity for greater brand visibility and a separate revenue stream – assuming it eventually licenses the show to regular TV networks – in fully-fledged independent production. What is also interesting is that the entire first season was made available for instant viewing, all 13 episodes at once. By doing this the company recognised and capitalised on a trend that has been accelerating for almost a decade: people like to watch multiple episodes in one sitting. This has always been the case, but the weekly episodic instalments of shows on network television allowed the audience little say in the matter, and thus no room for such a habit to develop. That changed dramatically with the arrival of the DVD, specifically affordable boxsets, as those who had missed the zeitgeists of West Wing, The Sopranos and 24 were able to catch up quickly with their obsessed brethren. Critics have often noted how viewing multiple episodes at once – which is how such reviews are often conducted, as critics usually receive a disc with several episodes to consider – improves the structure and narrative flow, particularly for shows like Lost. With the arrival of boxsets, such opportunities were available to all. Indeed, marketers leveraged this enthusiasm for consecutive viewing, creating events around it. Netflix saw this with absolute clarity and allowed viewers to watch as much or as little as they desired. Many, it seemed, chose to devour the whole first season in one weekend, which entertainment trade Variety covered humorously, charting the repercussions to the viewer’s psyche across no fewer than six stages of grief.
Zeitgeist has written before about the increasing popularity of streaming, and the complementary preference that audiences have for the type of films (action, romcom, broad comedy) they like to watch when choosing such a distribution method. It is interesting to consider then just how much the viewing experience differs between a 12-hour marathon over two days, and a one-hour slice over a period of three months. As the article in Variety half-jokingly posits, “Is tantric TV viewing a thing? If it’s not, should it be?”.
Of course, Netflix aren’t alone in seeing an opportunity to develop complementary products and assets. Microsoft are pairing the functionality of Kinect with their own content development – letting children “join in” with Sesame Street, for example – and are in the process of setting up a dedicated production studio in Los Angeles. Amazon, which owns the streaming service LoveFilm, is also getting into the game, recently setting up Amazon Studios for original content production. At the end of last year, The Hollywood Reporter announced Amazon would be greenlighting twenty pilots, all of which were “either submitted through the studio’s website or optioned for development”. YouTube recently launched twenty professional channels on its UK website, and Hulu is following suit. It really is quite startling to see such fundamental disruption and turmoil in environments where incumbent stalwarts (such as 20th Century Fox in film and Walmart in retail) have long been accustomed to calling the shots. Could the model become completely inverted, such that the Fox network and HBO become the “dumb pipes” of the TV world, showcasing the best in internet-produced television? Maybe so, and this is not necessarily a bad thing. The Economist this week argue that one of the most important factors in Liberty Global’s recent purchase of Virgin Media was the avoidance of paying corporate tax for “years” to come. If content is still king, though, a problem remains for those incumbents. The New Yorker astutely points out,
“An Internet firm like Netflix producing first-rate content takes us across a psychological line. If Netflix succeeds as a producer, other companies will follow and start taking market share… When that happens, the baton passes, and empire falls—and we will see the first fundamental change in the home-entertainment paradigm in decades.”
Netflix must tread carefully. Crucially, what seems like competitive differentiation and all-quadrant coverage now can quickly shift. Amazon’s ventures into content production will be backed by a sizeable and perpetual stream of revenue from its e-commerce platform, which isn’t going away anytime soon. The BBC are publicly welcoming new entrants, and are devising tactics of their own, such as making episodes available on iPlayer before they screen, if at all, on television. Interesting, but hardly earth-shattering, and likely to make little difference to viewer preference. Netflix will have to do better than that if it wants long-term dominance of this market. It will have to be increasingly careful with its partners, too. Recent, though long-running, rumblings of discord with partners like Time Warner Cable, though seemingly innocuous, tend to be indicative of a larger battle ensuing between corporate titans. Moreover, though the act of providing a deluge of content seems new and sexy now, what about when everyone starts doing it? Netflix’s chief content officer Ted Sarandos told The Economist last week, “Right now our major differentiation is that consumers can watch what they want, when they want it, but that will be the norm with television over time. We’re getting a head start”. Fine, but when that is the norm, what is the strategy for differentiation then? Netflix have made some lofty, daring, innovative moves here, exploiting consumer trends and noticing a gap in the competitive environment. But they will need firm foundations to support this move into an adjacent business area, of which they know relatively little, in the years to come. As President Bartlet of West Wing was often heard to say, “What’s next?”.