With Game of Thrones back on our screens, the question of the gender roles it depicts and promotes continues to be hotly debated.
Some commentators excuse what they see as the blatant misogyny of the series by noting that author George RR Martin could hardly have written female characters otherwise while being true to the historical context of the “Middle Ages” (at least as it is popularly envisioned).
Others are happy to compile lists of powerful female figures on the series and applaud the way they “destabilise” traditional gender roles.
Yet there is not a female figure on Game of Thrones who does not have a medieval counterpart, whether an actual historical person or a character from a literary text that enthralled audiences for centuries. So let’s look at these:
Game of Thrones has Daenerys Targaryen, the only surviving child of King Aerys II Targaryen; the medieval and early modern worlds also had female rulers, some hugely successful (Elizabeth I of England), others less so (Mary Queen of Scots, who was executed at the behest of Elizabeth, her first cousin once removed).
Game of Thrones gives us Cersei Lannister, the widow of King Robert Baratheon and Queen Regent of the Seven Kingdoms. The Arthurian legends give us Morgan Le Fay, who sleeps with her brother King Arthur and bears from that union Mordred, who will eventually be Arthur’s doom.
Women who dress as knights and fight men in battle?
Joan of Arc, painting c. 1485. An artist’s interpretation, since the only known direct portrait has not survived. (Centre Historique des Archives Nationales, Paris, AE II 2490). Wikimedia Commons
Arya Stark and Brienne of Tarth are two of the most popular female characters on Game of Thrones, but they have a long way to go before they can rival the enduring fame and admiration excited by the historical Joan of Arc (1412-1431), who led French armies against the English towards the end of the Hundred Years’ War and was executed by them in 1431.
Records show that the Inquisitors who interrogated Joan were as much concerned with her transvestism as with her claims to hear angelic voices, and devoted a great deal of effort to convincing Joan to give up her masculine attire, a step she never took.
Yet Game of Thrones never even approaches the slippery and surprising world of gender manipulation and redefinition that is a feature of medieval spirituality.
Indeed, Game of Thrones, for all its quasi-medievalism, is completely lacking in a major segment of the medieval world: organised religious life.
Game of Thrones notes the existence of religion, and of religious leaders – some even female (Melisandre, a priestess of the Lord of Light) – but it does not represent the massive proportion of the medieval population that devoted their lives to religious service.
Medieval Western Christianity proved adept at multiplying gender positions, rendering them fluid, and refiguring gendered embodiment in ways that made an identification with one sex and its “properly” aligned gender difficult, if not redundant.
Although monasticism was a strictly sex-segregated undertaking, it was, paradoxically enough, also productive of some extraordinarily complex and fluid gender formulations. Nuns were exhorted by the men tasked with their pastoral care to be “virile” in their faith, and to surpass even men in the strength of their devotion.
Meanwhile, male monastics cultivated “female” virtues such as humility, aware that in the topsy-turvy economy of New Testament Christianity, based as it was on the willing sacrifice of Christ, being humble, gentle, and patient of suffering was also particularly manly.
We know from manuscript evidence (British Library, Cotton Julius E.vii) that nuns enjoyed stories of women who showed their devotion by undertaking male disguise and living undetected as monks for their whole lives.
Saint Wilgefortis in the diocesan museum of Graz, Austria. Wikimedia Commons
The bearded female saint Wilgefortis proved popular in late medieval iconography: this young woman was martyred for her refusal to marry, having joyfully accepted from God a full beard in order to avoid that fate.
While monastics deployed gender identities in innovative ways in regard to themselves, even the gender of the Christian Trinitarian God came under consideration with both male and female mystics contemplating the motherliness of Christ.
Male mystics such as Bernard of Clairvaux (1090-1153) saw themselves suckling at the breast of the Virgin, and feminist critics such as Karma Lochrie have commented on the vaginal imagery of the wound in Christ’s side to which male clerics were so devoted and to which they would envision themselves pressing their mouths.
The 14th-century female mystic Agnes Blannbekin claimed to have received the foreskin of Christ in her mouth in her celebration of the Eucharist and described its taste as sweet as honey.
The world of medieval spiritual gender was powerfully fluid and productive, with performativity the key. Game of Thrones might offer some interesting, and even compelling, female role models in its medievalist worldview, but perhaps contemporary viewers would be more shocked by the gender permutations at play in the “real” Middle Ages.
Nothing is produced in a cultural or social vacuum. All forms of representation intersect and interact with our contemporary world, whether we like it or not. This includes recently acclaimed television programs such as True Detective, Breaking Bad, House of Cards and Game of Thrones.
The fantasy elements of Game of Thrones enhance its images of a brutal world driven by the will to control. This is highly relevant when conflicts in pursuit of power – often underpinned by violence – continue to take place across the globe.
Medieval costuming, settings and magic may seem distant. Yet even Julia Gillard, a fan of the series, linked Game of Thrones to her impending doom as prime minister and the Rudd sword that would eventually slay her.
It is surprising, then, that Jason Jacobs’s recent Conversation article on Game of Thrones claims that social and political context has little if any bearing on its success. Instead, the show is elevated for resisting what he calls the tendency to exhibit “boutique contemporary issues”.
Gender inequality is offered as one example of an issue commonly woven into cultural narratives in a didactic manner. Setting aside his unsettling broad-brush treatment of contemporary feminism, it is unfortunate that Jacobs conceives of discrimination as a “boutique” matter.
On the contrary, such collective problems demand critical inspection in cultural settings. Social content is something neither authors nor viewers can avoid. In the words of Duke University academic Fredric Jameson, we are each “condemned to history” in the inherent sociability of our lives.
Kevin Spacey as Frank Underwood and Robin Wright as Claire Underwood in House of Cards. Courtesy of FOXTEL
Jacobs goes on to argue that Game of Thrones is the best form of entertainment chiefly because it avoids complexity. Such complexity – lauded in an earlier Conversation piece by Jason Mittell – is instead seen as largely negative in the programs Mad Men and Breaking Bad.
But complexity is not the same as complication. Narrative power is produced through sophisticated storytelling. This enables many perspectives to emerge and audience pleasure to be heightened. Moreover, without complexity there is no simplicity: each relies on the other for meaning.
In the same way as other aesthetic forms such as literature and painting, a quality television series can ask important ethical questions. This involves making compromising and morally messy decisions, because the world we live in is difficult and complex. Television that responds to the urgent need for self-questioning cannot be so easily written off as convoluted.
The myth of the cultural divide
Well written and produced programs such as House of Cards and The Wire provide levels of meaning accessible to some, though not necessarily to all. That does not mean that they are less worthy. On the contrary, what is revealed is a wide assortment of narratives that respond to a diversity of viewers.
Audiences are not a uniform mass of receivers interpreting televisual texts in the same way. We are a varied lot, an unpredictable array of individual consumers. Appeals to “entertainment” value may seem to renew the division between purportedly complex cultural artifacts of limited audience reach, and the allegedly modest, accessible-to-all variety – the old “high versus popular” debate.
Bryan Cranston as Walter and Aaron Paul as Jesse in Breaking Bad. Courtesy of FOXTEL.
That argument is long dead. When Man Booker prize winner Eleanor Catton speaks of being influenced by The Wire and Breaking Bad in writing The Luminaries (2013), this only helps confirm the waning of any boundary between high and low culture. Reception can also change over time, with “difficult” art evolving into popular art.
Television, once considered by many as the exclusive location of mostly worthless diversion, is home to much that may be seen as important art. Jacobs reasonably calls for judgements of taste to be a part of the academic’s critical arsenal. Yet his evident distaste for all matters contextual (social and historical) risks reviving a culture war that is a relic of the 20th century.
Further, raising a single television program above “most other contemporary cultural output” takes the process of cultural evaluation to an unhealthy level of precision. The almost infinite vista of cultural forms available to audiences in the 21st century – in television, film, music, literature, and elsewhere – surely demands more cautious language on the part of scholars and critics.
There is no such thing as generic or aesthetic purity. Breaking Bad’s sharp indictment of US health care does not prevent moments of experimental fantasy. A cinematographic style that might at first appear incongruous can in fact tap into questions fundamental to our existence.
True Detective, arguably the most literary series of all, depicts an astonishingly dark realm of violence and despair. Its film noir elements, including ghostly crime scenes, expose audiences to nightmarish gothic moments where the divisions between reality and fantasy begin to blur.
Genres are almost always intertwined, which means that all kinds of narratives adopt different styles of storytelling. These can be at once escapist and strangely familiar. What makes these contemporary television programs particularly successful is their ability to skirt the boundaries between simplicity and complexity, fantasy and reality.
Game of Thrones returned to our screens on Monday, and faster than you could say Daenerys Targaryen, eager fans had either watched the first episode via their legitimate cable subscription, or taken to the illegal file-sharing sites to nab their copy. By 5pm, just under 150,000 downloads had been logged, making Australia the fourth biggest torrenting nation in the world for the show.
For the new season, however, the gods of Westeros had something a little extra in store for the nation’s pirating hordes. Illegal downloaders soon discovered that not one but four episodes were ready and waiting on torrent sites, the result of a leaked DVD “screener” which was sent to reviewers in advance.
No doubt this was a real kick in the wallet for legitimate cable-subscribing fans, who must choose either to remain above the law and work hard to avoid spoilers for the next three weeks, or to dabble in the dark arts of file sharing to keep up.
For many less scrupulous folks this release was no doubt akin to stumbling across a stash of unguarded Lannister gold. In an era of box sets and binge watching, the patience required to wait for a weekly instalment can often be overwhelming. The sweet relief of instant gratification – even if it means no new episodes for three weeks – may be too much to resist.
To others, though, the availability of these extra episodes presents a different kind of quandary. The serialised nature of the show means that for many, Monday has become “Game of Thrones Night”, a ritual which brings families, flatmates, partners and friends together to experience the most thrilling, engaging and often shocking drama on television today.
Even more brilliantly, each new episode is preceded by days of mounting anticipation and followed up with analysis and in-jokes that engender a real sense of shared connection on social media and in offices, universities, cafes, shops and bars across the country.
Watching the first four episodes might be satisfying in the short-term, but the price is the temporary loss of that communal social experience.
This dilemma – to consume immediately or to delay gratification for greater reward – brings to mind the famous, oft-replicated marshmallow test. Beginning in the late 1960s, American psychologist Walter Mischel and his team gave pre-schoolers a fiendishly simple, yet enormously revealing, test of their self-control.
Children were seated at a table in a stimulus-free room and presented with marshmallows or other treats.
The deal was that the children could choose to eat one straight away, or wait until the researcher returned after up to 20 minutes and be rewarded with two marshmallows. The children would then be left alone with the marshmallows and a bell.
Ringing the bell would bring the researcher back and allow them to eat a single marshmallow, but waiting until the researcher returned of their own volition would mean double the reward.
While the tests had initially been designed to measure the age at which the human ability for delayed gratification develops, years later the researchers discovered a much more important result: those children who could resist the temptation longer went on, as adults, to do better on their college admission SATs, to report higher self-worth and lower drug use, and to be less likely to be overweight.
Certainly, debate has continued over what it was that led to greater willpower, with studies variously suggesting a relationship to parenting, to beliefs about the reliability of the world around them, to an ability to find ways to distract themselves in an empty room, or to pure grit. But the fact remains that, however the children managed it, those skills or innate qualities led to benefits in later life.
It would be a stretch to suggest that those who have already downloaded and watched the first four episodes of Game of Thrones would have failed the marshmallow test. And there are other factors at play here – the concern about spoilers being a not-inconsiderable one.
At the very least, the marshmallow test should remind us of the benefits of a developed ability to delay gratification, to postpone short-term satisfaction for richer reward.
In an era of Netflix, Spotify and their less legitimate counterparts, where we rarely have to wait too long to get our hands on the next big thing in entertainment, the pleasure of the increasingly rare shared social experience of the weekly instalment is worth the wait.
And there’s nothing to stop you from eating as many marshmallows as you like while you watch.
When Game of Thrones returns to screens for its fifth season on Sunday night, US time, it will no doubt continue to attract the critical and popular praise that it richly deserves.
DB Weiss and David Benioff’s HBO adaptation of George R. R. Martin’s fantasy novel sequence, A Song of Ice and Fire, has achieved its cultural prominence not because of the vast amount of cash invested in the production, and not on the back of the books’ passionate fan base. It’s not even the lucky coincidence of industrial changes in Hollywood television over the past 20 years that have made the medium more hospitable to signature television – that is, television with a strong authorial identity, style and attitude.
No: Game of Thrones is successful simply because it is much better than most other television and, for that matter, most other contemporary cultural output.
By “better”, I mean it reaches a consistently satisfying – often breathtaking – level of achievement as entertainment.
That’s something not explainable merely by pointing to its fidelity to the books on which it is based, or to its budget. There are many other well-funded television shows that blend history, sex and violence in a genre package, such as Marco Polo, The Borgias, Wolf Hall and Spartacus. The thing is, picking a formula and loading it with cash and a sprinkling of talent doesn’t guarantee critical or popular success: art and culture do not work like that.
What Game of Thrones does best
So what is the distinction of Game of Thrones – what makes it better?
First of all it avoids the temptation to import a bunch of boutique contemporary issues into its narrative.
The women depicted in it, for example, would have little truck with the contemporary feminist tendency to paint women as vulnerable victims in need of legal and state protections against feral men. Daenerys, Brienne and Arya are valiant as lions and cunning as foxes: armies, weapons and courage are their currency. We’ve watched them over four seasons carving out a space for themselves in a hostile world full of pitiless foes.
Its homo- and bisexual characters are not magical emblems of ethical goodness for our edification; its men are not oversensitive metrosexuals in fur.
In avoiding such things, the show sidesteps the self-righteous and pompous tendency of some Hollywood productions to instruct us on how we should conduct our moral lives.
Secondly, Game of Thrones embraces its genre and uses it as a medium for expression.
It was the great philosopher Stanley Cavell who, in his books on Hollywood screwball comedies and melodramas, developed the notion that popular genres in the hands of great artists can be the source of extraordinary accomplishment. Game of Thrones – unlike other critically acclaimed shows such as Mad Men or Breaking Bad – fully revels in its fantasy genre.
More importantly, it draws on the familiar resources of the genre – the magic, the medieval brutality, the monarchical mania for power – in order to do something never achieved in the genre before – even, arguably, by the books. That is, it makes the search for meaning, particularly of those struggling for power and revenge, intelligible in dramatic and spectacular ways.
The threat of the White Walkers; the disintegration and murder of the families of the North; the corruption and debt of the Lannisters; competing religious faiths, one breeding evil, the other a strange kind of solidarity and commitment to justice – the show makes plain what loyalty, lies and betrayal mean when the stakes are mortal, not trivial.
Somehow we feel this speaks to our own search for meaning today, but not in an aversive way that suggests the writers have the cocky confidence to provide an answer.
Popular genres resonate in uncertain times
The unravelling of Westeros, as it is figured by some of the finest performances on screen today, resonates with a profound uncertainty about our relationship to our own history and future. And it is through genre and fiction that this intelligibility is achieved. Why? Because we cannot make sense of it, cannot feel its palpable weight for us, through other forms of discourse such as philosophy, politics or science.
Indeed the great Hollywood directors of the 1940s and 50s all used popular genres in this way – think of John Ford’s westerns, Hitchcock’s thrillers, the melodramas of Douglas Sirk, Vincente Minnelli, Nicholas Ray and Max Ophuls.
None of that would matter much if it was executed poorly.
Veteran actors like Charles Dance, Liam Cunningham, Diana Rigg and Ciaran Hinds have never done better work; the standout contributions are Peter Dinklage’s Tyrion Lannister, Maisie Williams as Arya Stark and Aidan Gillen as Lord Baelish. Dinklage communicates a shifting blend of family resentment and the wit of a well-read intellect. Williams uses her youth as constant misdirection for her evolving capabilities as a killer. Gillen pulls off the feat of never being less than chillingly ahead of us in his political designs while allowing us glimpses of an interiority that almost humanises him.
A win for ‘complex TV’?
Finally, the credit for making Game of Thrones better than the rest of television must lie with Benioff and Weiss.
In a recent Conversation piece, Jason Mittell made the case for denoting the shift to more sophisticated television drama as part of the rise of “complex TV”, by which he meant narrative complexity enabled by “major shifts in the television industry, new forms of television technology, and the growth of active, engaged viewing communities.”
There are many problems with this notion of “complex TV”: as a criterion of achievement, “complexity” is a quality too broad to capture aesthetic specificities. Lots of complex things are quite unrewarding aesthetically – transport and sewerage systems, for example. And complex narratives can be irritating, or over-plotted.
But what is striking is the demotion, in Mittell’s account, of the creative achievement of individual artists who are given a perfunctory mention at the end of his piece. On the contrary. Without Weiss and Benioff, Game of Thrones would be just another disappointing adaptation – think Harry Potter films – of some fine genre writing.
The tendency of cultural and media studies over the past 20 years has been to avoid judgement and discrimination between the bad, the good and the better, by pretending that everything except the contribution of talented artists is important.
In the meantime, the leading artists of our time – such as David Milch (Deadwood), Matthew Weiner (Mad Men), Nic Pizzolatto (True Detective) and Weiss and Benioff – have got on with the job of making great art for the masses. And herein lies one lesson that Game of Thrones can be said to offer us: without the talent and courage of individuals, no justice and, I would argue, no art, is feasible.
The first episode of the fifth season of Game of Thrones will screen in Australia on Monday April 13 in simulcast with the US launch. Details here.
Karan Sondhi has a very typical reason for loving HBO’s Game of Thrones: “You don’t know who’s dying in the next episode.” The wildly successful fantasy book turned TV series is notorious for killing off its main characters. In anticipation of the upcoming Season 5 premiere, HBO Canada opened a pop-up shop on Queen St. West…
In AMC’s Breaking Bad we saw how Walter White ended up Heisenberg, how Jesse Pinkman ended up more broken than he began. But what about Saul Goodman, Mike Ehrmantraut, Tuco Salamanca?
Many successful shows spawn sequels. Producers and networks, keen to capitalise on having hit the jackpot, are loath to let go of a winning formula even after the final episode of a high-rating, well-loved show, and they follow the inevitable path of What Comes Next? This thinking gave us Joey, Frasier and Joanie Loves Chachi. Some are successful; some leave us wishing we’d never set eyes on them.
Vince Gilligan – who is repeatedly proving himself to be an inventive story-telling mind – chose the other direction: How Did We Get Here? Building on the success of Breaking Bad, which he wrote and produced, and rewarding its devoted viewers, he’s spun off a prequel: Better Call Saul. And it’s excellent.
We’re a week away from the finale of season one. The debut episode became the most-watched TV series premiere (for a key demographic) in US cable history, with 6.9 million viewers, when it aired in February. It has claimed further viewing records since.
Why does it work so well? Three reasons: character, character, and character.
Bob Odenkirk as Saul Goodman in Better Call Saul. Ben Leuner/AMC
Gilligan understands that story comes from character, so he develops characters who give endless story, who have enough complexity and internal logic that they can twist and turn and baffle and surprise and still remain in character.
Well-conceived characters are icebergs – we the viewers see about 10% of the whole. Most of what the writer knows about them lies beneath the surface, and it’s these histories and drives that cause complex, interesting characters to act the way they do, to surprise and confound us, to compel us to watch them in every episode they appear.
Saul Goodman is one such creation. He exploded onto our screens fully formed as the fast-hustling, ambulance-chasing lawyer inhabiting a brilliantly, bafflingly over-the-top office. Who decorates like that? How on earth did this creature emerge? In Better Call Saul we find out.
Jonathan Banks as Mike Ehrmantraut. Lewis Jacobs/AMC
We also discover how he met his fixer, Mike Ehrmantraut. And where he first crossed paths with crazy Mexican drug-lord Tuco Salamanca. Better Call Saul is a series of meet-cutes, but not of the romcom kind, more the deeper-into-trouble variety. Saul’s world is being built and, in however many seasons Better Call Saul runs for, we’ll avidly watch Jimmy McGill transform into Saul Goodman, the man Walter White better call.
Another of Gilligan’s talents as a writer is raising the stakes. We saw this repeatedly in Breaking Bad when he put Walter and Jesse under ever-increasing pressure, in seemingly impossible life and death scenarios, and they continually survived.
And not through some random act of god, but from seeing an opportunity where no one else did, through deal making and fast thinking, through chemistry.
Solutions came from character and story logic. They surprised us but they didn’t perplex us. In the world Gilligan had created they made sense. I once heard Gilligan say in an interview that he strives to have seven surprises in every hour of television he writes, and surprise us he does. Repeatedly. Satisfyingly.
And the stakes were raised not only for Walt and Jesse, but for Skyler and Walt Jr, and for Hank, the DEA brother-in-law. Thorough and complex characterisation plus tight, surprising plotting equalled devoted fan viewing.
Walter White (Bryan Cranston) and Saul Goodman in Breaking Bad. AMC
Gilligan has chosen to go back six years with Better Call Saul, to 2002. To a time when Walter White was a law-abiding chemistry teacher, and Jesse Pinkman was still at high school, most likely paying no attention to Walt’s teaching and failing his chemistry exams.
One of the great joys of this choice of 2002 is that at the end of the 10 episodes of season one, we’ve still got five years of Saul’s evolution to explore. A second season of 13 episodes had already been commissioned before the first season aired. Plus the opening scenes of series one promise even more than six years of prequel – could there be life for Slippin’ Jimmy post-Breaking Bad?
In those opening scenes, Jimmy, then Saul, now Gene, is living undercover in Omaha, Nebraska, managing a Cinnabon store and looking mighty nervous that his old life is going to find him.
Bob Odenkirk as Jimmy McGill and Rhea Seehorn as Kim Wexler. Ursula Coyote/AMC
The viewers are more nervous that it won’t. Here’s one sequel I would have complete confidence in.
When we’ve finished watching Better Call Saul I’m betting many of us will turn back to watch Breaking Bad again, from beginning to end. Saul is a spin-off series that adds layers and richness to its parent show. Where so many spinoffs leave us with regrets and the wish we’d never fallen into their arms with so many hopes, Better Call Saul is so far giving us exactly what we want – familiar characters involved in great stories with an added frisson of knowing where it’s all going to end.
Vince Gilligan, please don’t ever stop writing. You’ve got a lot to teach us about storytelling, about the human animal and about life.
In last May’s mid-season finale of “Mad Men,” advertising agency patriarch Bert Cooper dies unexpectedly after watching the live television broadcast of the Apollo 11 moon landing.
The next day, Don Draper has a hallucinatory vision of Bert performing a winsome song-and-dance routine of what must be the greatest of all deceptive advertising promises: “The Best Things in Life Are Free.”
Cooper’s 1969 death could signify much more: as the second half of season seven begins and the series approaches its conclusion, viewers may witness the stirrings of what – in real life – marked the end of advertising’s modern era.
The burst of creative innovation inspired by the mad men of advertising’s heyday began with the rise of television in the 1940s and concluded in the late 1970s, when giant, publicly traded advertising and media holding companies began gobbling up smaller, creative boutiques, like Mad Men’s Sterling Cooper.
The acquisition binge would result in a loss of creative independence and innovation.
Draper embodies modern era’s eccentric personalities
During those halcyon days of the modern era, advertisers, for the first time, crafted messaging with a dual purpose: appealing to consumers’ logic and emotion.
Back then, advertising was primarily a creative endeavor, with messaging honed through market research and focus groups. There was a storytelling element, too: the formulation of big, creative ideas that entertained and engaged consumers through the novel medium of television.
These big, consumer-focused advertising ideas were primarily spearheaded by a few big names: personality-driven men who dominated the business (and still do), along with a few women (like the diminutive but courageous Mary Wells of Wells Rich Greene, who may be a model for Peggy Olson’s character).
Like Bill Bernbach of Doyle Dane Bernbach, Draper is known for elegantly integrating art and copy in his advertisements. In a business that previously separated these distinct elements, Bernbach was one of the first “mad men” to wield creative direction over both.
Take Bernbach’s Volkswagen Beetle ad. The small (admittedly unattractive) German import was foreign to American consumers accustomed to bulky, boxy automobiles. Rather than try to downplay the car’s features, Bernbach put them front and center: “Think Small” and “Lemon” were his spare, Helvetica Bold headlines, which floated in white space above a black and white photo of the Beetle. The minimalism and directness of Bernbach’s iconic Beetle advertising was original and authentic, resonating with car buyers open to economy, function and style.
Draper also possesses characteristics of the Ted Bates agency’s Rosser Reeves, who was renowned for his development of the Unique Selling Proposition. The gregarious Reeves thought every product must convey its “unique benefit,” the quality that differentiated it from the rest of the competition. If this quality didn’t exist – well, it could be created through advertising.
Reeves believed that once a Unique Selling Proposition was established, the advertising message – the brand benefit – had to be repeated (often within the same ad) to resonate with the consumer. Anacin’s “Fast Relief” ads were an example of his effective and memorable use of the repeated USP brand benefit.
Draper employs Reeves’ USP strategy in an episode when he pitches the “It’s Toasted” tagline to Lucky Strike. The client argues that all cigarette tobacco is toasted. Draper’s reply? “Yes, but you will be the first to advertise it.”
Finally, some of Draper’s excesses could be linked to the antics of temperamentally creative George Lois, who once threatened to jump out a window if a client rejected his advertising ideas. “You make the matzos, and I’ll make the ads!” he once shouted as he stood on the window ledge at a Kosher food company’s Long Island City headquarters. (The client eventually agreed to move forward with Lois’ idea.)
Postmodern era defined by market warfare and consolidation
Mad Men has given fans a taste of this unique era in advertising. So what happened in the industry over the ensuing decades?
I began working in the advertising agency business in 1979, and witnessed the seismic shifts that occurred during the next era of advertising: the postmodern era.
Military-inspired jargon dominated marketing strategies; terms like “positioning,” “offensive warfare,” “defensive warfare” and “flanking” were the battle cries of advertising’s new warriors. The cola, beer and burger wars fought for the hearts, minds – and wallets – of consumers, primarily through television advertising. This lasted until the rise of the Internet (itself a powerful, new advertising medium).
The postmodern era saw the growth and domination of publicly traded, global holding companies. None rose faster – and fell further – than Saatchi & Saatchi, which, at one point, was the world’s largest advertising company.
Saatchi & Saatchi, founded in London by two Iraqi immigrants named Maurice and Charles Saatchi, initially gained worldwide attention in the 1970s for its advertising on behalf of Margaret Thatcher’s Conservative Party.
The Saatchi brothers (known among competitors as “Snatch it and Snatch it”) went on an acquisition binge throughout the 1980s that included some of the world’s biggest and best-known agencies. In 1986, the agency purchased Ted Bates for a staggering $500 million.
While ad agencies were growing in size through acquisition, clients sought to grow their market share through aggressive, comparative (sometimes negative) advertising. Before there was Dave, the soft-sell pitchman and founder of Wendy’s, there was “Where’s the Beef?” – Wendy’s attempt to cut into the sales of Burger King and McDonald’s.
Meanwhile, clients took notice that agency principals were getting mega-rich from their advertising budgets. Ted Bates chairman Robert Jacoby reportedly pocketed $100 million from the Bates sale alone. Ultimately, disgruntled shareholders ousted the Saatchi brothers in 1994, and the agency was carved into pieces.
The Saatchis demonstrated that the new model for advertising agencies of the future was first and foremost financial; lost were the creative zeitgeist and “mad men” of the modern era.
But advertising, like nature, abhors a vacuum. During the 1980s a new crop of independent, boutique agencies also emerged with a focus on creativity and the fine art of persuasion: The Martin Agency in Richmond, Virginia; Fallon and Carmichael-Lynch in Minneapolis; and Los Angeles-based Chiat Day, which created Apple’s iconic 1984 ad.
Nonetheless, many of these new wave creative “hot shops” were acquired, like their predecessors, by five major global holding companies: WPP, Omnicom, Publicis, Interpublic and Havas.
It’s telling, then, that after Bert Cooper’s death in Mad Men, the agency’s remaining partners sell a majority stake to rival McCann Erickson – thus ending its creative independence.
My prediction for the end of the show’s run? Don Draper – unable to function in the postmodern world of ads, fads, excessive consumerism and toxic agency politics – abandons advertising once and for all. Perhaps he is tossed out by new agency owners for his contempt of the executive “yes men” who replaced his generation of “mad men.” Or he refuses to dumb down his creative visions for increasingly risk-averse clients. (Or, once again, he is caught in the wrong bed at the wrong time!)
More likely, he walks out unceremoniously – empty handed and alone – continuing the desperate search for the authentic life that has eluded him.
He accepts, finally, that the best things in life actually may be free – and not as advertised.
Last night, the final episode of the fifth season of The Walking Dead screened on Australian television. Over the last five years, the hit US series has reached ever-larger audiences around the world. In the US, each new season continues to break cable ratings records.
Swarms of downloaders accessing the show after each episode airs are evidence of its global following. But the stunning renaissance of the zombie in popular culture is reflected not only in the popularity of films and TV series such as The Walking Dead.
Zombies have become an urban phenomenon, with cities from Sydney to Santiago, Chile, organising annual zombie walks. Not long ago the University of Sydney was plagued by a mass of the undead during a Zedtown event, a humans-versus-zombies game involving hundreds of players.
It’s not just about the zombies
Many cultural theorists have explored the significance of the zombie and its continued prevalence in contemporary culture. With roots in Haitian folklore and precursors in West-African religions, the zombie of Afro-American creole beliefs was animated by magic, unlike the zombie of the late 20th century, which is re-animated by viral contagion.
British sociologist Tim May sees zombie films – from White Zombie (1932) to Night of the Living Dead (1968) and Dawn of the Dead (1978) – as expressions of racial anxiety.
Others examining the same films see the zombie as embodying the mindlessness of consumer society. In his recent piece on The Conversation, Joseph Gillings saw in the zombie’s remorselessness and lack of self-regard an apt metaphor for the terrorism spawned by globalisation’s discontents.
More recently, film scholar Deborah Christie framed the zombie as a means of thinking through anxieties about the post-human condition emerging at the turn of the 21st century.
“It’s their world now, we are just living in it.” That’s how one of the young characters in The Walking Dead puts it while hiding from “walkers” (the name the group gives to zombies) in the forest.
But what attracts our attention in The Walking Dead is not the zombies but the survivors. For them such arcane meanings are less important than finding a way to continue to live. Or to put it another way: the emphasis is less on containing the zombie apocalypse than on understanding the new complexities emerging from a zombie aftermath, in which bare life and community economies must be redefined.
In our view The Walking Dead reflects on the meaning of group solidarity in a brave new world. Rick Grimes, the leader of the survivors, played by British actor Andrew Lincoln, sees his group as a family bound by relations of mutual support.
Throughout the series we see characters transformed by this practice of solidarity. Daryl, the group’s consummate survivalist, is transformed from a stereotypical redneck into someone deeply concerned with the group’s welfare.
Rick’s communitarian family contrasts with other failed collectivities the group has crossed paths with during this and previous seasons. We’ve seen a dystopia under the control of a threatening “governor”; cops holed up in an Atlanta hospital with patients who amount to slaves. We’ve watched a biker-gang collective being decimated by a few zombies because of a lack of group cohesion; we’ve seen a non-zombie cannibal collective dispatch other hapless survivors with bureaucratic efficiency to save themselves.
Rick’s family is also quite different from the utopian walled eco-village it finds itself in during the fifth season. Even though there is enough room for them to live separately, initially Rick’s group refuses this return to the nuclear family. Rejecting the trappings of civilisation, they prefer to remain in collective life.
In different ways, the characters express their desire to “not forget” what has allowed them to survive thus far.
At this moment in The Walking Dead, the real challenge facing the characters in Rick’s family is not whether or not they can survive but rather whether or not they can survive as a “collective” in a setting that promises a return to individual existence.
The dilemma facing Rick’s family may seem to have little to do with our present circumstances – but learning how to value collectivity and act collectively in the face of profound crisis is something we must all embrace.
Zombies have been incorporated into innovative educational tools aimed at disaster preparedness. The Centers for Disease Control and Prevention (CDC) in the US recently put together an emergency-preparedness toolkit for disasters and catastrophes: Preparedness 101: Zombie Apocalypse. The scenario of a zombie apocalypse allows us to think about possible disaster responses and the relation between collectivity and resilience.
This might be seen in the containment strategies of the Ebola outbreak in West Africa. The xenophobic responses that followed the outbreak led nowhere, whereas collective responses proved far more effective in containing the spread of the virus. Climate change is yet another example of the need to think seriously about solidarity as a key instance of disaster preparedness and response.
The Walking Dead is a useful metaphor to think with. As we wait for the sixth season, we can contemplate its role in debating how we anticipate events that may threaten the economic order of things. So, does a zombie apocalypse signify the end of capitalist civilisation, or its perverse consummation?
We take it for granted but acting – or failing to act – in advance of possible futures is in fact an essential aspect of contemporary neoliberal democracies, whether we are talking about terrorism, climate change or a zombie pandemic, as the CDC toolkit implies.
A show like The Walking Dead helps us think through the challenges we face as a species; it helps us reflect on the critical importance of how to make new economies possible, and not just in the aftermath of major disaster.
The final episode of season 5 of The Walking Dead screened on FXTV last night.
If you watch the season finale of The Walking Dead this Sunday, the story will likely evoke events from previous episodes, while making references to an array of minor and major characters. Such storytelling devices belie the show’s simplistic scenario of zombie survival, but are consistent with a major trend in television narrative.
Prime time television’s storytelling palette is broader than ever before, and today, a serialized show like The Walking Dead is more the norm than the exception. We can see the heavy use of serialization in other dramas (The Good Wife and Fargo) and comedies (Girls and New Girl). And some series have used self-conscious narrative devices like dual time frames (True Detective), voice-over narration (Jane the Virgin) and direct address of the viewer (House of Cards). Meanwhile, shows like Louie blur the line between fantasy and reality.
Many have praised contemporary television using cross-media accolades like “novelistic” or “cinematic.” But I believe we should recognize the medium’s aesthetic accomplishments on its own terms. For this reason, the name I’ve given to this shift in television storytelling is “complex TV.”
There is a wealth of facets to explore in such developments (enough to fill a book), but there’s one core question that seems to go unasked: “why has American television suddenly embraced complex storytelling in recent years?”
To answer, we need to consider major shifts in the television industry, new forms of television technology, and the growth of active, engaged viewing communities.
A business model transformed
We can quibble about the precise chronology, but programs that were exceptionally innovative in their storytelling in the 1990s (Seinfeld, The X-Files, Buffy the Vampire Slayer) appear more in line with narrative norms of the 2000s. And many of their innovations – season-long narrative arcs or single episodes that feature markedly unusual storytelling devices – seem almost formulaic today.
What changed to allow this rapid shift to happen?
As with all facets of American television, the industry’s economic goals are a primary motivation for programming decisions.
For most of their existence, television networks sought to reach the broadest possible audiences. Typically, this meant pursuing a strategy of mass appeal featuring what some derisively call “least objectionable programming.” To appeal to as many viewers as possible, these shows avoided controversial content or confusing structures.
But with the advent of cable television channels in the 1980s and 1990s, audiences became more diffuse. Suddenly, it was more feasible to craft a successful program by appealing to a smaller, more demographically uniform subset of viewers – a trend that accelerated into the 2000s.
In one telling example, FOX’s 1996 series Profit, which possessed many of contemporary television’s narrative complexities, was quickly canceled after four episodes for weak ratings (roughly 5.3 million households). These numbers placed it 83rd among 87 prime time series.
Yet today, such ratings would likely rank the show in the top 20 most-watched broadcast programs in a given week.
This era of complex television has benefited not only from more niche audiences, but also from the emergence of channels beyond the traditional broadcast networks. Certainly HBO’s growth into an original programming powerhouse is a crucial catalyst, with landmarks such as The Sopranos and The Wire.
But other cable channels have followed suit, crafting original programming that wouldn’t fly on the traditional “Big Four” networks of ABC, CBS, NBC and FOX.
A well-made, narratively complex series can be used to rebrand a channel as a more prestigious, desirable destination. The Shield and It’s Always Sunny in Philadelphia transformed FX into a channel known for nuanced drama and comedy. Mad Men and Breaking Bad similarly bolstered AMC’s reputation.
The success of these networks has led upstart viewing services like Netflix and Amazon to champion complex, original content of their own – while charging a subscription fee.
The effect of this shift has been to make complex television a desirable business strategy. It’s no longer the risky proposition it was for most of the 20th century.
Miss something? Hit rewind
Technological changes have also played an important role.
Many new series reduce the internal storytelling redundancy typical of traditional television programs (where dialogue was regularly employed to remind viewers what had previously occurred).
Instead, these series subtly refer to previous episodes, insert more characters without worrying about confusing viewers, and present long-simmering mysteries and enigmas that span multiple seasons. Think of examples such as Lost, Arrested Development and Game of Thrones. Such series embrace complexity to an extent that they almost require multiple viewings simply to be understood.
In the 20th century, rewatching a program meant either relying on almost random reruns or being savvy enough to tape the show on your VCR. But viewing technologies such as DVR, on-demand services like HBO GO, and DVD box sets have given producers more leeway to fashion programs that benefit from sequential viewing and planned rewatching.
Serialized novels – like Charles Dickens’ The Mystery of Edwin Drood – were commonplace in the 19th century. Wikimedia Commons
Like 19th century serial literature, 21st century serial television releases its episodes in separate installments. Then, at the end of a season or series, it “binds” them together into larger units via physical boxed sets, or makes them viewable in their entirety through virtual, on-demand streaming. Both encourage binge watching.
Giving viewers the technology to easily watch and rewatch a series at their own pace has freed television storytellers to craft complex narratives that are not dependent on being understood by erratic or distracted viewers. Today’s television assumes that viewers can pay close attention because the technology allows them to easily do so.
Shifts in both technology and industry practices point toward the third major factor leading to the rise in complex television: the growth of online communities of fans.
Today there are a number of robust platforms for television viewers to congregate and discuss their favorite series. This could mean partaking in vibrant discussions on general forums on Reddit or contributing to dedicated, program-specific wikis.
As shows craft ongoing mysteries, convoluted chronologies or elaborate webs of references, viewers embrace practices that I’ve termed “forensic fandom.” Working as a virtual team, dedicated fans embrace the complexities of the narrative – where not all answers are explicit – and seek to decode a program’s mysteries, analyze its story arc and make predictions.
The presence of such discussion and documentation allows producers to stretch their storytelling complexity even further. They can assume that confused viewers can always reference the web to bolster their understanding.
Other factors certainly matter. For example, innovative writer-producers like Joss Whedon, JJ Abrams and David Simon have harnessed their unique visions to craft wildly popular shows. But without the contextual shifts that I’ve described, such innovations would have likely been relegated to the trash bin, joining older series like Profit, Freaks and Geeks and My So-Called Life in the “brilliant but canceled” category.
Furthermore, the success of complex television has led to shifts in how the medium conceptualizes characters, embraces melodrama, re-frames authorship and engages with other media. But those are all broader topics for another chapter – or, as television frequently promises, to be continued.
The star-studded world premiere of Game of Thrones’s fifth season has taken place – in the UK, at the Tower of London. It’s hard to imagine a more fitting venue for a show based on George R R Martin’s notoriously brutal novels.
Just as Aegon the Conqueror built the fictional Red Keep at King’s Landing after his War of Conquest, William the Conqueror founded London’s iconic fortress to subdue the locals after the Norman Conquest and reinforce his dominance as the new monarch. And 900 years on, the Tower has become synonymous with political intrigue, imprisonment, torture and death – a reputation that stems largely from the late 15th and 16th centuries, when several kings, queens and martyrs were imprisoned, murdered or executed within and around its walls.
During the Wars of the Roses, the medieval conflict that inspired Game of Thrones, King Henry VI was confined in the Tower twice by his dynastic rival, Edward IV. Not unlike Aerys II, the “mad” Targaryen king whose misrule incites Robert Baratheon’s rebellion in the series’ back story, Henry VI triggered the historic civil war through his ineffectual leadership and madness.
A prisoner for more than five years, Henry was murdered at the Tower following the death of his only son and heir at the Battle of Tewkesbury (1471). His death, like the slaughter of mad king Aerys and his immediate heirs, was followed by a period of relative peace. But when Edward IV died in 1483, political wrangling resumed and the Tower once again provided the backdrop for the next stage of the conflict: the controversial disappearance of Edward’s sons, the “Princes in the Tower”.
The slaughter or disappearance of young heirs and bastards is also a disturbing and recurring motif in Game of Thrones, resonating with the medieval and Tudor obsession with bloodlines and succession.
Jaime Lannister throws young Bran Stark from a tower at Winterfell to prevent the discovery of his incestuous affair with Cersei and their illegitimate children. Theon Greyjoy fakes the murder of Bran and Rickon Stark to secure Winterfell by butchering two proxies. Joffrey orders the Gold Cloaks to massacre Robert Baratheon’s bastards. Rickard Karstark kills Tywin Lannister’s nephews. Craster sacrifices his newborn sons to the White Walkers. All are innocent children, and all fall victim to the morally ambiguous “game” being played by the leading Houses of Westeros.
The prime responsibility of the noblewomen, such as Cersei Lannister, Margaery Tyrell and Sansa Stark, is to provide the next generation of kings. They are therefore also at the heart of the political machinations. And here, too, the series draws inspiration from the real medieval and Tudor women associated with the Tower.
In his pursuit of a male heir, Henry VIII famously had his politically sharp consort, Anne Boleyn, imprisoned and executed at the Tower on charges of treason, adultery and incest (the same crimes committed by Cersei Lannister). This example will be particularly fresh for those who watched the BBC’s Wolf Hall.
Queen of the Roses
Henry VI’s widow, Margaret of Anjou, was likewise incarcerated in the Tower after a decade-long struggle to secure the crown for her son, Prince Edward. A formidable and proud French woman, who married for political purposes, Margaret was unafraid of engaging with the factionalism of her husband’s all-male council. She fought fiercely, if not always astutely, when her son’s birthright came under attack, and made strategic alliances wherever she could – most notably by marrying Prince Edward to the daughter of her former enemy, Richard Neville, earl of Warwick, better known as the “Kingmaker”.
Captured after leading the Lancastrian army against Edward IV at the aforementioned Battle of Tewkesbury, where her son was killed, Margaret avoided her husband’s grim fate and was eventually ransomed, returning to France to live out her days in relative obscurity.
All decked out. Ian West/PA Wire
If aspects of Margaret’s story sound familiar it’s because she is one of several resilient historical women who inspired the characterisation of Cersei Lannister. While Cersei’s future is uncertain, we’ve seen her fight to influence and fortify Joffrey’s sovereignty, and this season promises to follow her struggle with Margaery Tyrell for control of King Tommen, her second son.
Ravens and crows
It’s impossible to avoid a final comparison between the Night’s Watch, or “Crows”, who swear to be “the shield that guards the realms of men” at the Wall, and that other member of the crow genus – the raven – which is said to protect the kingdom by its presence at the Tower. According to tradition, if the ravens ever leave, the Tower and the realm will fall.
By the same token, one can’t help wondering what will happen to the Seven Kingdoms if the crows defending the northern frontier are slain, or forced to flee, by the White Walkers. In future seasons, Bran and the mysterious three-eyed raven will doubtless have an equally important role to play in the defence of the kingdom.
But for the time being, the imminent season of Game of Thrones will continue to delight and terrify its audience with the same bouts of intrigue, scandal and brutality that have contributed to the Tower’s notorious reputation and popularity.
If winter is coming, then so is more bloodshed, for, as Tower prisoner and martyr Sir Thomas More once said in his account of Richard III’s acquisition of the crown, “king’s games” are “for the more part played upon scaffolds”.