We all seem to have less time to ourselves these days. But there seems to be more to watch – on more platforms – than ever before. What trends have led to this, and what’s the result? Much editorial ink has been spilled over the years about how our lives seem to be getting busier, with less free time to ourselves. This is something of a painful irony, given that many of our more intellectual ancestors thought our evolution as a species would quickly lead to a civilisation mostly consumed by thoughts of how to fill the days of leisure. In last week’s New Yorker, Harvard professor Thales Teixeira noted there are three major “fungible” resources we have as people – money, time and attention. The third, according to Teixeira, is the “least explored”. Interestingly, Teixeira calculated the inherent price of attention and how it fluctuates by correlating it with rising ad rates for the Super Bowl. Last year, the price of attention jumped more than 20%. The article elaborates,
“The jump had obvious implications: attention—at least, the kind worth selling—is becoming increasingly scarce, as people spend their free time distracted by a growing array of devices. And, just as the increasing scarcity of oil has led to more exotic methods of recovery, the scarcity of attention, combined with a growing economy built around its exchange, has prompted R. & D. in the [retaining of attention].”
It’s such thinking that has persuaded executives to invest in increasingly multi-platform, creative advertising during the Super Bowl, and media production companies to take their wares to the likes of YouTube and Netflix. But it’s all circular, as demonstrated last week when Amazon announced it would be producing films for cinema release. The plurality of such content over different channels carries important connotations for pricing strategies. At its most fundamental, what is a product worth when it is intangible and potentially only available in digital form? It chimes with an article written earlier this month in The Economist on the customer benefits of e-commerce. Though most knee-jerk reactions would assume price is the biggest benefit to customers, recent research illustrates this is not always the case. Researchers at MIT showed that on average people paid an extra 50% for books online versus in-store. This isn’t because the latest David Baldacci is sold for more on Amazon, but rather because of the long tail, which means more products are able to find the right owner, for a price, where in-store they would comparatively go unsold. More channels have meant more availability for content, which should benefit consumers in that more content destined to be a hit now finds a home, where once it might have been lost if turned down by the major TV or radio networks. The Economist elaborates,
“Seasoned publishers have only a vague idea what book, film or song will be a hit. A major record label can sign only a fraction of the artists available, knowing full well it will unwittingly reject a future superstar. Thanks to cheap digital recording technology, file sharing, YouTube, streaming music and social media, however, barriers to entry have been dismantled. Artists can now record and distribute a song without signing to a major label. Independent labels have proliferated, and they are taking on the artists passed over by major labels. Hit songs are still a lottery, but the public gets three times as many lottery tickets.”
So while we may have less time to consume it, more content over more channels will allow for greater chances for breakout hits, particularly with avid niche audiences. Amazon Prime video content was until recently confined to a niche audience, and the show Transparent dealt with niche subject matter. But the show has broken out into the zeitgeist and won two awards at the recent Golden Globes ceremony. (Full disclosure, we know a producer on the show and were lucky enough to visit the set on the Paramount lot in Los Angeles last summer). It is likely such a great show – recently made available free for 24 hours as a way to upsell customers to Prime – would not have found a home on traditional TV networks, and thus in people’s homes, were it not for this plurality.
Our most popular article this year by far was a piece we wrote on trends in the media and entertainment industry for the coming twelve months. That nothing written since January has proved as popular is a little disappointing, but it is a good indication of what readers come to this blog for.
It’s been an interesting past month or so in the Technology, Media and Telecoms sector. We’re going to attempt to recap some of the more consequential things here, as well as the impact they may have into next year.
Star Wars – And the blockbuster dilemma
Friday saw the release of the first trailer for Star Wars Episode VII, due for release in December 2015. CNBC covered the release at the coda of European Closing Bell, around the point in the segment where a story might be done about a cat caught up a tree (“On a lighter note…”). They discussed the trailer and the franchise on a frivolous note at first, mostly joking about the length of time since the original film’s release. One of the anchors then went on to claim that Disney’s purchase of “Lucasfilms” [sic] and the release of this trilogy of films, given the muted reaction to Episodes I-III, constituted a huge bet on Disney’s part. This showed a profound lack of understanding. Collectively, Episodes I-III, disappointing artistically as they may have been, made a cool $1.2bn. And this is just at the box office. Home video revenues would probably have been the same again, almost certainly more. Most important (whether we like it or not) are revenue streams like toy sales, theme park rides and the like (see below graphic, from StatisticBrain). So we are talking about a product that, despite leaving many unimpressed, managed to generate several billion dollars for Fox, Lucasfilm, et al. With a more reliable pair of hands at the helm in the form of J.J. Abrams, to say Episodes VII-IX are a huge bet is questionable thinking at best.
It can be easy for pundits to forget those ancillary streams, but in contemporary Hollywood it is such areas that are key, and fundamentally influence what films get made. Kenneth Turan, writing in mid-September for the LA Times, echoed such thinking. As with our Star Wars example, so “with the Harry Potter films, and it is happening again with ‘Frozen’, with Disney announcing just last week that it would construct a ‘Frozen’ attraction at Orlando’s Disney World”. It is why studios have scheduled, as of August this year, some 30 movies based on comic books for release over the coming years. Of course, supply follows demand. Such generic schlock wouldn’t be made again and again (and again) if consumers didn’t exercise their capitalist right to choose it and consume it. We have been given Transformers 4 because the market said it wanted it.
But is this desire driven by a faute de mieux – a lack of anything better – in said market? David Fincher may not have been far off the mark back in September when he mentioned in an interview with Playboy that “studios treat audiences like lemmings, like cattle in a stockyard”. But a shift from such a narrow mindset may prove difficult in a consolidated environment – Variety’s editor-in-chief Peter Bart pointed out recently that “six companies control 90% of the media consumed by Americans, compared with 50 companies some 30 years ago”. Some players, of course, are trying to change the way this business works. The most provocative statement of this came in September, when Netflix announced a sequel to “Crouching Tiger, Hidden Dragon”, to be released day-and-date on Netflix and in IMAX cinemas. Kudos. It’s the kind of thing this blog has been advocating since its inception. Though supply has yet to follow in true capitalist fashion, the market is certainly showing a desire for more day-and-date releases. Nor is Netflix alone as an OTT provider trying to develop exclusive content that goes beyond comic books (that in itself should give Netflix pause; about a fifth of its market value has eroded since mid-October). Witness Hulu’s efforts with J.J. Abrams and Stephen King, as well as Amazon’s universally acclaimed Transparent series (full disclosure: a good friend works on the show; Zeitgeist was privileged to take a look around the sets on the Paramount lot while in Los Angeles this summer). And that’s not to say innovative content can’t be developed around blockbuster fare; we really liked 20th Century Fox’s partnership with Vice for ‘Dawn of the Planet of the Apes’, creating short films that filled the gaps between the film and its predecessor. Undoubtedly the model needs to change; unlike last summer, there were no outright bombs this year at the box office, but receipts fell 15% all the same.
The first eight months of 2014 were more than $400m behind the same period in 2013. Interviewed in the FT, Robert Fishman, an analyst with MoffettNathanson, put it well: “It always comes down to the product on the screen. And the product on the screen just hasn’t delivered.” An editorial in The Economist earlier this month praised Hollywood’s business model, suggesting other businesses should emulate it. But beyond some good marketing tactics there seems little that should be copied by others. Indeed, lots more work is needed. Perhaps the first step is merely realising that not all blockbusters need to be released in the summer. Next year, James Bond, Star Wars and The Avengers will all arrive on screens… spread throughout the year. Expect 2015 to feature more innovation on the part of exhibitors too, beyond having their customers be rained on.
Tech wars – Hacking, piracy and monopolies
Sony Pictures faced some embarrassment this week when hackers claimed to have penetrated the company’s systems, getting away with large volumes of data that included detailed information on talent (such as passport details for the likes of Angelina Jolie and Cameron Diaz). The full story is still unfolding. We’ve written a couple of times recently about cybersecurity; it was disappointing but unsurprising to see the spectre of digital warfare raise its head again twice in the past week. The first instance was with Regin, an impressive bit of malware that appears to be the successor to Stuxnet, a spying program developed by Israeli and American intelligence forces to undermine Iranian efforts to develop nuclear materials. Symantec said Regin had probably taken years to develop, with “a degree of technical competence rarely seen”. Regin was focused on Saudi Arabia and Russia, but also Ireland and India, which muddies the waters of authorship. However, in these post-Snowden days it is well known that friendly countries go to significant lengths to spy on each other, and The Economist posited that at least part of the malware was created by those in the UK. Deloitte, ranked number one globally in security consulting by Gartner, is on the case.
The news in other parts of the world is troubling too. In the US, the net neutrality debate rages. It’s too big an issue to be covered here, but the Financial Times and Harvard Business Review cover the topic intelligently, here and here. In China, regulators are cracking down on online TV, a classic case of a long-gestating occurrence that at some arbitrary point grows too big to ignore, suddenly becoming problematic. But, if a recent article on the affair in The Economist is anything to go by, such deeds are likely to merely spur piracy. And in the EU this past week it was disconcerting to see what looked like a mix of jealousy, misunderstanding and outright protectionism when the European Parliament voted for Google to be broken up. No one likes or wants a monopoly; monopolies are bad because they can reduce consumer choice. This is one of the key arguments against the Comcast / Time Warner Cable merger. But Google’s share of advertising revenue is being eaten into by Facebook; its mobile platform Android is popular but is being re-skinned by OEMs looking to put their own branding onto the OS. And Google is not reducing choice in the same way as an offline equivalent, with higher barriers to entry, might. The Economist points out this week:
“[A]lthough switching from Google and other online giants is not costless, their products do not lock customers in as Windows, Microsoft’s operating system, did. And although network effects may persist for a while, they do not confer a lasting advantage… its behaviour is not in the same class as Microsoft’s systematic campaign against the Netscape browser in the late 1990s: there are no e-mails talking about “cutting off” competitors’ ‘air supply'”
The power of lock-in, or substitute products, should not be underestimated. For Apple, this has meant the acquisition of Beats, which it now plans to bundle into future iPhones. For Jeff Bezos, this means bundling the Washington Post into future Amazon Fire products. For media and entertainment providers, it means getting customers to extend their relationship with the business into triple- and quad-play services. But it has been telling this month to hear from two CEOs who are questioning the pursuit of quad-play. For the most part, research shows that it can increase customer retention, though not without lowering the overall price of the product. Sky’s CEO Jeremy Darroch said, “If I look at the existing quadplays in the market, not just in the UK, but pretty much everywhere, I think they’re very much driven by the providers who want to extend their offering, rather than, I think, any significant demand from customers”. Vodafone’s CEO Vittorio Colao joined in, “If someone starts bidding for content then you [might] have to yourself… Personally I have doubts that in the long run that this [exclusive content] will really create a lot of value for the platform. It tends to create lots of value for the owner”. Sony meanwhile are pursuing just such a tack of converged services in the form of a new ad campaign. But the benefits of convergence are usually about the customer being able to have multiple touchpoints, not about the business being able to streamline assets and services in-house. Sony is in the midst of its own tech war, in consoles, where it is firmly ahead of Microsoft, which was seeking a similar path to that of Sky and Vodafone to dominate the living room. But externalities are intervening – mobile gaming revenues will surpass those of the traditional console next year to become the largest gaming segment; no surprise when, by 2020, 90% of the world’s population over six years old will have a mobile phone, according to Ericsson.
So undoubtedly look for more cyberattacks next year, on a wider range of industries, from film, to telco (lots of customer data there), to politics and economics.
Talent wars – Cui bono?
Our last section is the lightest on content, but perhaps the most important. It is the relation between artist and patron. This relationship took a turn for the worse this year. On the larger, corporate side, this issue played itself out as Amazon and publisher Hachette rowed over fees. Hachette, rather than Amazon, appears to have won the battle; it will set the prices on its books, starting from early 2015. It is unlikely to be the last battle between the e-commerce giant and a publisher, and it may well now give the DoJ the go-ahead to examine the company’s alleged anti-competitive misdeeds.
Elsewhere, artist Taylor Swift’s move to excise her catalogue from music streaming service Spotify is a shrewd one on her part. Though Spotify is an extremely popular platform that drives a large share of its revenues to artists, the problem remains that there is little revenue to start with, as much of what there is to do on Spotify can be done for free. The Financial Times writes that it is thanks to artists like Swift that “an era of protectionism is dawning” again for content (think walled gardens and Compuserve). The danger for the music industry is that other artists take note of what Swift has done and follow suit. This would benefit individual artists but be detrimental to the industry itself. And clearly such an issue need not be restricted to the music industry. It’s not hard to anticipate a similar issue affecting film in 2015.
There’s a plethora of activity going on in TMT as the year draws to a close – much of it will impact how businesses behave and how customers interact with said media next year. The secret will be in drawing a long-term strategic course that is agile enough to adapt to disruptive technologies. However, what we’ve hopefully shown in this article is that there are matters across multiple sectors that need immediate attention, ahead of any amorphous future trends.
Back in July of this year, while schoolchildren dreamt of holidays and commuters sweated their way to work, management consultancy McKinsey sat down with president of eBay Marketplaces Devin Wenig. The interview is above; we’re going to pick out some highlights below as Wenig pontificated on the future of bricks and mortar stores, the change needed in marketing, the fallacy of big data and what will make for good competitive advantage over other retailers in the months and years to come. Often with talking heads the output can be generic and anodyne. Wenig, though, offers some insightful thoughts.
The future of the store: “I think stores are going to become as much distribution and fulfillment centers as they are full-fledged shopping experiences… They’ll become technology enabled so that you can go to a store and see enough inventory, but you may shop “shoppable windows.” We’re building those right now for retailers around the world. You may end up hollowing out the real estate, where the showroom is a much smaller part of the footprint, and the inventory and the distribution center become more of that footprint.”
How marketing needs to change: “There are still many instances that I see where it is old-school marketing. It’s still about major TV campaigns, get people into the stores. That’s still important, and that’s not going to go away. But understanding how to engage in a world of exploding social networks, how to use search, how to use catalog, how to optimize, and how to engage—very different skills.”
Competitive advantage: “I think the answer is data… While from the merchant standpoint incredible selection may seem great, from the consumer standpoint it can be overwhelming. I actually don’t want to shop in a store with a billion items for sale, I’m just looking for this. Data is the way to connect a long-tail advantage with consumers that oftentimes want simplicity.”
Executing on strategy: “Great data is both art and science. There’s a lot of press about the science; there’s not as much about the art. But the truth is that judgment matters a lot… we bring quantitative analysis to that to say, “The right way to look at our customers is this, not this,” even though there are infinite ways we could.”
The fallacy of big data: “It’s not about big data, it’s about small data. Big data is useless… it’s about me connecting with you, my business connecting with you. You don’t want to be part of a big data set; you’re just looking to buy a shirt. And that’s about small data. That’s about understanding insights that I can glean about you that don’t feel intrusive, don’t feel creepy, and don’t feel artificial—but feel natural. That, to me, is the future. There are glimmers of success there. I wouldn’t say the industry has arrived. For all the rhetoric about data, it’s a work in progress, but a critically important work in progress.”
Merging experiences: “E-commerce [fulfills] a utilitarian function… Stores have an important element of serendipity… The future of digital commerce is trying to get the best of both… we’re trying to spur inspiration.”
How to define innovation, how has it been studied in the recent past, and what does future innovation hold for the human race?
Sometimes the word innovation gets misused. Like when people use the word “technology” to mean recent gadgets and gizmos, instead of acknowledging that the term encompasses the first wheel. “Innovation” is another tricky one. Our understanding of recent thoughts on innovation – as well as its contemporary partner, “disruption” – was thrown into question in June when Jill Lepore penned an article in The New Yorker that put our ideas about innovation, and specifically Clayton Christensen’s ideas about innovation, in a new light. Christensen, heir apparent to fellow Harvard Business School bod Michael Porter (author of the simple, elegant and classic The Five Competitive Forces that Shape Strategy), wrote The Innovator’s Dilemma in 1997. His work on disruptive innovation, claiming that successful businesses focused too much on what they were doing well, missing what, in Lepore’s words, “an entirely untapped customer wanted”, created a cottage industry of conferences, companies and counsels committed to dealing with disruption (not least this blog, which lists disruption as one of its topics of interest). Lepore’s article describes how, as Western society’s retelling of the past became less dominated by religion and more by science and historicism, the future became less about the fall of Man and more about the idea of progress. This thought took hold particularly during The Enlightenment. In the wake of two World Wars, though, our endless advance toward greater things seemed less obvious;
“Replacing ‘progress’ with ‘innovation’ skirts the question of whether a novelty is an improvement: the world may not be getting better and better but our devices are getting newer and newer”
The article goes on to look at Christensen’s handpicked case studies that he used in his book. When Christensen describes one of his areas of focus, the disk-drive industry, as being unlike any other in the history of business, Lepore rightly points out the sui generis nature of it “makes it a very odd choice for an investigation designed to create a model for understanding other industries”. She goes on for much of the article to utterly debunk several of the author’s case studies, showcasing inaccuracies and even criminal behaviour on the part of those businesses he heralded as disruptive innovators. She also deftly points out, much in the line of thinking in Taleb’s Black Swan, that failures are often forgotten about, and those that succeed are grouped and promoted as formulae for success. Such is the case with Christensen’s apparently cherry-picked case studies. Writing about one company, Pathfinder, that tried to branch out into online journalism, seemingly too soon, Lepore comments,
“Had [it] been successful, it would have been greeted, retrospectively, as evidence of disruptive innovation. Instead, as one of its producers put it, ‘it’s like it never existed’… Faith in disruption is the best illustration, and the worst case, of a larger historical transformation having to do with secularization, and what happens when the invisible hand replaces the hand of God as explanation and justification.”
Such were the ramifications of the piece that when questioned on it recently in Harvard Business Review, Christensen confessed “the choice of the word ‘disruption’ was a mistake I made twenty years ago“. The warning to businesses is that just because something is seen as ‘disruptive’, that does not guarantee success, or mean it fundamentally belongs in any long-term strategy. Developing expertise in a disparate area takes time and investment, in terms of people, infrastructure and cash. And for some, the very act of resisting disruption is what has made them thrive. Another recent piece in HBR makes the point that most successful strategies involve not a single act of deus ex machina thinking-outside-the-boxness, but rather sustained disruption. Though Kodak, Sony and others may have rued the days, months and years they neglected to innovate beyond their core area, the graveyard of dead businesses is also surely littered with companies who innovated too soon, the wrong way or in too costly a process that left them open to things other than what Schumpeter termed creative destruction.
Outside of cultural and philosophical analysis of the nature and definition of innovation, some may consider of more pressing concern the news that we are soon to be looked after by, and subsequently outmanoeuvred in every way by, machines. The largest and most forward-thinking (and therefore not necessarily likely) of these concerns was recently put forward by Nick Bostrom in his new book Superintelligence: Paths, Dangers, Strategies. According to a review in The Economist, the book posits that once you assume there is nothing inherently magic about the human brain, it is evident that an intelligent machine can be built. Bostrom worries, though, that “Once intelligence is sufficiently well understood for a clever machine to be built, that machine may prove able to design a better version of itself” and so on, ad infinitum. “The thought processes of such a machine, he argues, would be as alien to humans as human thought processes are to cockroaches. It is far from obvious that such a machine would have humanity’s best interests at heart—or, indeed, that it would care about humans at all”.
Beyond the admittedly far-off prognostications of the removal of the human race at the hands of the very things it created, machines and digital technology in general pose great risks in the near-term, too. For a succinct and alarming introduction to this, watch the enlightening video at the beginning of this post. Since the McKinsey Global Institute published a paper in May soberly titled Disruptive technologies: Advances that will transform life, business, and the global economy, much editorial ink and celluloid (were either medium to still be in much use) have been spilled and spooled detailing how machines will slowly replace humans in the workplace. This transformation – itself a prime example of creative destruction – is already underway in the blue-collar world, where machines have replaced workers in automotive factories. The Wall Street Journal reports Chinese electronics makers are facing pressure to automate as labour costs rise, but are challenged by the low margins, precise work and short product life of the phones and other gadgets the country produces. Travel agents and bank clerks have also been rendered redundant, thanks to that omnipresent machine, the Internet. Writes The Economist, “[T]eachers, researchers and writers are next. The question is whether the creation will be worth the destruction”. The McKinsey report, according to The Economist, “worries that modern technologies will widen inequality, increase social exclusion and provoke a backlash. It also speculates that public-sector institutions will be too clumsy to prepare people for this brave new world”.
Such thinking gels with an essay in the July/August edition of Foreign Affairs, by Erik Brynjolfsson, Andrew McAfee and Michael Spence, titled New World Order. The authors rightly posit that in a free market the biggest premiums are reserved for the products with the most scarcity. When even niche, specialist employment though, such as in the arts (see video at start of article), can be replicated and performed to economies of scale by machines, then labourers and the owners of capital are at great risk. The essay makes good points on how while a simple economic model suggests that technology’s impact increases overall productivity for everyone, the truth is that the impact is more uneven. The authors astutely point out,
“Today, it is possible to take many important goods, services, and processes and codify them. Once codified, they can be digitized [sic], and once digitized, they can be replicated. Digital copies can be made at virtually zero cost and transmitted anywhere in the world almost instantaneously.”
Though this sounds utopian and democratic, what it actually does, the essay argues, is propel certain products to super-stardom. Network effects create this winner-take-all market. Similarly, it creates disproportionately successful individuals. Although there are many factors at play here, as the authors readily concede, they also maintain the importance of another, distressing theory;
“[A] portion of the growth is linked to the greater use of information technology… When income is distributed according to a power law, most people will be below the average… Globalization and technological change may increase the wealth and economic efficiency of nations and the world at large, but they will not work to everybody’s advantage, at least in the short to medium term. Ordinary workers, in particular, will continue to bear the brunt of the changes, benefiting as consumers but not necessarily as producers. This means that without further intervention, economic inequality is likely to continue to increase, posing a variety of problems. Unequal incomes can lead to unequal opportunities, depriving nations of access to talent and undermining the social contract. Political power, meanwhile, often follows economic power, in this case undermining democracy.”
There are those who say such fears of rising inequality and the destruction through automation of whole swathes of the job sector are unfounded, that many occupations require a certain intuition that cannot be replicated. Time will tell whether this intuition, like an audio recording, health assessment or the ability to drive a car, will be similarly codified and disrupted (yes, we’ll continue using the word disrupt, for now).
Are incumbent companies starting to see the light when it comes to embracing digital? Evidence is slowly starting to point in that direction.
Artists are known for embracing change and innovation, but the art market itself has been slow to adapt to changing consumer behaviour. Now mega e-tailer Amazon is selling art on its site, and venerable auction house Christie’s is pushing headlong into online-only sales, as Mashable recently reported. And while fashion designers know how to use digital to push the envelope, the fashion industry as a business has been notorious for its skittishness about investing in efficient, immersive digital experiences for its customers, so worried is it about detracting from the brand. So it was reassuring to see during Paris Fashion Week recently that French marque Chloé had gotten the message. As Zeitgeist’s dear friend and fashion aficionado Rachel Arthur details on her blog, the brand launched a dedicated microsite for its runway show. Brands like Burberry and Louis Vuitton have been doing this for at least three years, so in itself it’s nothing new. What made the experience different were two things. Firstly, the site created a journey that started before the show and continued after it, rather than merely offering a stream of live video and little else. More importantly, it tried to make the experience one that reflected the influence of those watching. As Rachel points out,
“As the event unfolded, so too did different albums under a moodboard header, including one for the collection looks, one for accessories, another for the guests, and one from backstage. Users could click on individual images and share them via Twitter, Facebook, Pinterest or Weibo, or heart them to add them to their own personal moodboard page.
‘[We] are excited to see how you direct your own Chloé show,’ read the invite.”
The recognition of platforms like Weibo should be seen as another coup for Chloé. Too often, companies send out communications to global audiences with perfunctory links to Facebook and Twitter. Not only is there no call to action for these links (why should the user go there?), but there is no recognition that one of the world’s most populous and prosperous markets is more into its Renren and Weibo.
Elsewhere, despite what seems like some niggling problems, Zeitgeist was excited and intrigued to read about Disney‘s latest foray into embracing how consumers use digital devices, this time creating a second-screen experience in movie theaters. Second Screen Live, as Disney have branded it, doesn’t immediately sound particularly logical, as GigaOm point out,
“Of all the places I’d thought would be forbidden to the second screen experience, movie theaters were near the top of my list. After all, you’re paying a premium ticket price for the opportunity to sit in a dark theater and immerse yourself in a narrative — second screen devices operate in direct opposition to that.”
And yet the Little Mermaid experience that the writer goes on to describe cannot be faulted for its attempt at innovation, at reaching beyond current thinking (not to mention revenue streams), in order to forge a new relationship between the viewer and the product. Kudos.
Lastly, Zeitgeist wanted to mention the US television network Fox as a classic example of a company that has slowly come to realise the power of working with digital, rather than against it. In years past, companies like Fox were indisputably heavily involved in digital, but only from a punitive standpoint. Fox and others were ruthless in their distribution of takedown notices to sites hosting content they deemed to infringe on their product. Fan sites that exploded in support and admiration for shows like The X-Files were summarily threatened with legal action and closed. There was little thought given to the positive sentiment such sites were creating around the product, and little thought given to the destruction of brand equity that such takedown notices brought about. Not to mention the desiccation of communities that had come together from different parts of the world, their single shared attribute being that they were evangelists of what you were selling. Clips of shows, such as The Simpsons, appearing on YouTube would be treated with similar disdain. So it shows how far we’ve come in a few years that this morning when Zeitgeist went onto YouTube he was greeted on the homepage with a sponsored link from Fox pointing him to the opening scenes of the latest Simpsons episode, before it aired. Definitely a move in the right direction.
This past week, Zeitgeist had the pleasure of enjoying a new adaptation of Shakespeare’s “Much Ado about Nothing”. This adaptation was not performed at the theatre but at the cinema. It was not directed by Kenneth Branagh or any other luminary of the legitimate stage, but rather by the quiet, modest, nerdy Joss Whedon, who until a few years ago was best known to millions as the brains behind the cult TV series phenomenon “Buffy the Vampire Slayer” (full disclosure: Zeitgeist worked on the show in his days of youth). Whedon was picked to direct a film released last year that can, without much difficulty, be seen as the apotheosis of the Hollywood film industry; “The Avengers”. A mise-en-abyme of a concept, involving disparate characters, some of whom already have their own fully-fledged franchises, coming together to form another vehicle for future iterations. “The Avengers” became the third-highest grossing film of all time, and it is a thoroughly enjoyable romp. Moreover, to go from directing on such a broad canvas to shooting a film mostly with friends in one’s own home – as with “Much Ado…” – displays an impressive range of creative ingenuity.
Sadly for shareholders and studio executives’ career aspirations, not every film is as sure-fire a hit as “The Avengers”, try as they might (and do) to replicate the same mercurial ingredients that lead to success. Marvel, which originally conceived of the myriad characters surrounding The Avengers mythology, was bought in 2009 by Disney for $4bn. Disney for all intents and purposes have a steady strategic head on their shoulders. They parted ways with the quixotic Weinstein brothers while welcoming Pixar back into the fold. They were one of the first to concede the inevitability of closing release windows – something Zeitgeist has written about in the past – and they are debuting a game-changing platform, Infinity, which might revolutionise the way children interact with the plethora of memorable characters the studio has dreamt up over the years. However, such sound business strategy could not save them from the uber-flop that was 2012’s “John Carter”, which lost the studio $200m. This summer, the rationale for their biggest release has been built on what appears to be sound logic: taking the on- and off-screen talent behind their massively successful “Pirates of the Caribbean” franchise, and bringing them together again for another reboot in the form of “The Lone Ranger”. The New York Times said the film “descends into nerve-racking incoherence”; it has severely underperformed at the box office, on a budget of $250m. Sony’s “After Earth” similarly underperformed, suddenly throwing Will Smith’s bullet-proof reputation for producing hits into jeopardy.
These summer films – “tentpoles” to use the terminology bandied about in Los Angeles – are where the money is made (or not) for studios. Over the past ten years, Zeitgeist has watched as these tentpoles have become more concentrated, more risk-averse and therefore less original, more expensive, and more likely to produce either stratospheric results or spectacular failures. Paramount is an interesting example of a studio that has made itself leaner recently, releasing far fewer films and relying on franchises to keep the ship afloat. Editorial Director of Variety Peter Bart seems to think there’s a point at which avoiding risk leads to courting entropy. It’s an evolution that has escaped few, yet it was still notable when, last month, famed directors Steven Spielberg and George Lucas spoke out publicly against the way the industry seemed to be headed. Indeed, the atmosphere at studios in Hollywood seems to mimic that of a pre-2008 financial sector: leveraging ever more collateral against assets with significant – and unsustainable – levels of risk. The financial sector uses arcane algorithms and employs a large number of Wharton grads whose aim should be to preserve stability and profit. Yet even with all this analysis, it failed to see the gigantic readjustment that was imminent. In the film industry, Relativity Media’s reputation for rigorous predictive models on what will make a film successful is rare enough to have earned it a feature in Vanity Fair. So what hope is there that the film industry will change its tune before it is too late? Spielberg pontificates,
“There’s eventually going to be a big meltdown. There’s going to be an implosion where three or four or maybe even a half-dozen of these mega-budgeted movies go crashing into the ground and that’s going to change the paradigm again.”
Instead of correcting course as box-office failures mounted, studios have dug in harder. Said Lucas,
“They’re going for gold, but that isn’t going to work forever. And as a result they’re getting narrower and narrower in their focus. People are going to get tired of it. They’re not going to know how to do anything else.”
Admittedly, such artistic ennui among audiences is barely visible at the moment. “Man of Steel”, another attempt at rebooting a franchise – coming only seven years after the last attempt – is performing admirably, holding firmly in the top ten at the US box office after four weeks on release, with over $275m taken domestically. It’s interesting to note that audiences have been happy to embrace the new version so quickly after the last franchise launch failed; though actor James Franco finds it contentious, the same has been true of the “Spider-Man” franchise relaunch.
Part of the problem in the industry, some say, is to do with those at the top running the various film studios. In “Curse of the Mogul”, written by lecturers at Columbia University, the authors contend that since 2005 the industry as a whole has underperformed versus the S&P stock index, yet such stocks are still eminently attractive to investors. The reason, the authors say, is that those running the businesses frame the notion of success differently. They argue that it takes a very special type of person (i.e. them) to be able to manage not only different media, the different audiences they reach and the trends that emerge from them, but more importantly (in their eyes) to be able to manage the talent. They ask to be judged on Academy Awards rather than bottom lines. The most striking thing in the book – which Zeitgeist is still reading – is the continual pursuit by said moguls of strategic synergies. Such M&A activity excites shareholders but has historically led to minimal returns (think Vivendi or AOL Time Warner), often because what was presented as operational or content-based synergy is actually nothing of the sort. It’s a point Richard Rumelt makes in his excellent book, “Good Strategy / Bad Strategy”. Some companies are beginning to get the idea. Viacom seemed an outlier in 2006 when it divested CBS. Lately, News Corporation has followed a similar tack, albeit under duress after suffering from scandalous revelations about hacking in its news division. A recent article in The Economist states,
“Most shareholders now see that television networks, newspapers, film studios, music labels and other sundry assets add little value by sharing a parent. Their proximity can even hinder performance by distracting management… they have become more assertive and less likely to believe the moguls’ flannel about ‘synergies’.”
So in some ways it was of little surprise that Sony came under the microscope recently as well, part of this larger trend of scrutiny. The company has experienced dark times of late, with shares having plunged 85% over the past 13 years. The departure of Howard Stringer in 2012 coincided with an annual loss of some $6.4bn. Now headed up by Kazuo Hirai, the company has undoubtedly become more focused, with much more being made of its mobile division. Losses have been stemmed, but the company is still floundering, with an annual loss reported in May of $4.6bn. It was only a couple of weeks later that hedge-fund billionaire Dan Loeb – instrumental in getting Marissa Mayer to lead Yahoo – upped his ownership stake in Sony, calling on it to divest its entertainment division in a letter to CEO Hirai. Part of the issue with Sony is a cultural one, where Japan’s ways of working differ strongly from the West’s. This is covered in some detail in a profile of Stringer featured in The New Yorker. In a speech he gave last year, Stringer said, “Japan is a harmonious society which cherishes its social values, including full employment. That leads to conflicts in a world where shareholder value calls for ever greater efficiency”. But Sony’s film division – which includes the James Bond franchise – is performing well; in the year to March 2013 Sony’s film and music businesses produced $905m of operating income, compared with combined losses of $1.9 billion in mobile phones, according to The Economist. It ended 2012 in first place among the film studios in market share. Sony is the last studio to consistently deliver hits across genres, reports The New York Times in an excellent article. The article quotes an anonymous Sony executive: “We may not look like the rest of Hollywood, but that doesn’t mean this isn’t a painstakingly thought-through strategy and a profitable one”.
Sadly the strategy behind films like ‘After Earth’ begins to look flimsy when one glances at the box office results. While Hirai and the Sony board concede that they have met to discuss the possibility of honouring Mr. Loeb’s suggestion – offering 15-20% of the division as an IPO rather than selling it off in full – Mr. Hirai also commented in an interview with CNBC, “We definitely want to make sure we can continue a successful business in the entertainment space. That is for me, first and foremost, the top priority”. In mid-June Loeb sent a second letter, advocating the IPO proposal and saying, “Our research has confirmed media reports depicting Entertainment as lacking the discipline and accountability that exist at many of its competitors”. The question is whether selling off its entertainment assets would remove any synergies with other divisions, thus making the divisions left over less profitable, or whether such synergies even existed in the first place. For Loeb, the “most valuable untapped synergies” are still in the studio and music divisions; yet after decades as one company, they still remain untapped. That point won’t make for pleasant reading at Sony HQ.
Another problem is the changing nature of media consumption habits. Not only are we watching films in different ways over different platforms, we are also doing much else besides, from playing video games, which have successfully transitioned beyond the nerdy clique of yesteryear, to general mobile use and second-screening. This transition – and with it a realisation that competition is not likely to come from across regional borders but from startup platforms – is largely being ignored by the French as they insist on trade talks with the US that centre on the preservation of l’exception culturelle. Such trends are evident in business dealings. The Financial Times this weekend detailed Google’s significant foray into developing content, setting up YouTube Space LA. The project gives free soundstage space to artists who are likely to guarantee eyeballs on YouTube, and lead to advertising revenue for the platform. From the stellar success of the first season of “House of Cards”, to DreamWorks Animation’s original content partnership announced last month, Netflix has become the bête noire for traditional content producers as it shakes up traditional models. We have written before about the IHS Screen Digest data from earlier this year, showing worrying trends for the industry; as predicted, audiences are beginning to favour access over ownership, preferring to rent rather than own, which means less profit for the studio. As much due to a decline in revenue from other platforms as to growth in and of itself, cinemas are expected to be the major area of profit going forward to 2016 (see above chart). We’ve written before about the power cinema still has. Spielberg and Lucas pick up on this;
“You’re going to end up with fewer theaters, bigger theaters with a lot of nice things. Going to the movies will cost 50 bucks or 100 or 150 bucks, like what Broadway costs today, or a football game. It’ll be an expensive thing… [Films] will sit in the theaters for a year, like a Broadway show does. That will be called the ‘movie’ business.”
In a conversation over Twitter (excerpts of which are featured above), Cameron Saunders, MD of 20th Century Fox UK, told Zeitgeist that “major changes were afoot”. Such potential disruption is by no means unique to the film industry, and should come as a surprise to no one. Zeitgeist recently went to see Columbia faculty member Rita McGrath speak at a Harvard Business Review event. In her latest book, “The End of Competitive Advantage”, McGrath discounts the old management-consultant promise of providing businesses with sustainable competitive advantages. Her assertion is that any advantage is transient, and that incumbency and success often lead to entropy, unless there is constant innovation to build on that success. Such a verdict of entropy could well be applied to the film industry. The model has worked well for decades, despite predictions of doom at the advent of television, the VCR, the DVD, et cetera, ad nauseam. But fundamental behavioural shifts are now at play, and the way we devise strategies for what content people want to see and how they wish to see it needs to be readdressed, quickly. Otherwise all this deliberation will eventually become much ado about nothing.
UPDATE (15/4/13): Of course, context is everything. The New York Times published an interesting article today saying investing in Hollywood is less risky than investing in Silicon Valley, though the returns in the latter are likely to be greater. Neither are seen as reliable.
This issue isn’t going away. We write again about it, here.