Last night, the final episode of the fifth season of The Walking Dead screened on Australian television. Over the past five years, the hit US series has reached ever-larger audiences around the world. In the US, each new season continues to break cable ratings records.
Swarms of downloaders accessing the show after each episode airs are evidence of its global following. But the stunning renaissance of the zombie in popular culture is reflected not only in the popularity of films and TV series such as The Walking Dead.
Zombies have become an urban phenomenon, with cities from Sydney to Santiago, Chile, organising annual zombie walks. Not long ago the University of Sydney was plagued by a mass of the undead during a Zedtown event, a humans-versus-zombies game involving hundreds of players.
It’s not just about the zombies
Many cultural theorists have explored the significance of the zombie and its continued prevalence in contemporary culture. With roots in Haitian folklore and precursors in West-African religions, the zombie of Afro-American creole beliefs was animated by magic, unlike the zombie of the late 20th century, which is re-animated by viral contagion.
British sociologist Tim May sees zombie films – from White Zombie (1932) to Night of the Living Dead (1968) and Dawn of the Dead (1978) – as expressions of racial anxiety.
Others examining the same films see the zombie as embodying the mindlessness of consumer society. In his recent piece on The Conversation, Joseph Gillings saw in the remorselessness and lack of self-regard an apt metaphor for the terrorism spawned by globalisation’s discontents.
More recently, film scholar Deborah Christie framed the zombie as a means for thinking through anxieties about the post-human condition emerging at the turn of the 21st century.
“It’s their world now, we are just living in it.” That’s how one of the young characters in The Walking Dead puts it while hiding from “walkers” (the name the group gives to zombies) in the forest.
But what attracts our attention in The Walking Dead is not the zombies but the survivors. For them such arcane meanings are less important than finding a way to continue to live. Or to put it another way: the point is less about containing the zombie apocalypse than about understanding the new complexities emerging from a zombie aftermath, in which bare life and community economies must be redefined.
In our view The Walking Dead reflects on the meaning of group solidarity in a brave new world. Rick Grimes, the leader of the survivors, played by British actor Andrew Lincoln, sees his group as a family bound by relations of mutual support.
Throughout the series we see characters transformed by this practice of solidarity. Daryl, the group’s consummate survivalist, is transformed from a stereotypical redneck into someone deeply concerned with the group’s welfare.
Rick’s communitarian family contrasts with other failed collectivities the group has crossed paths with during this and previous seasons. We’ve seen a dystopia under the control of a threatening “governor”; cops holed up in an Atlanta hospital with patients who amount to slaves. We’ve watched a biker-gang collective being decimated by a few zombies because of a lack of group cohesion; we’ve seen a non-zombie cannibal collective dispatch other hapless survivors with bureaucratic efficiency to save themselves.
Rick’s family is also quite different from the utopian walled eco-village it finds itself in during the fifth season. Even though there is enough room for them to live separately, initially Rick’s group refuses this return to the nuclear family. Rejecting the trappings of civilisation, they prefer to remain in collective life.
In different ways, the characters express their desire to “not forget” what has allowed them to survive thus far.
At this moment in The Walking Dead, the real challenge facing the characters in Rick’s family is not whether or not they can survive but rather whether or not they can survive as a “collective” in a setting that promises a return to individual existence.
The dilemma facing Rick’s family may seem to have little to do with our present circumstances – but learning how to value collectivity and act collectively in the face of profound crisis is something we must all embrace.
Zombies have been incorporated into innovative educational tools aimed at disaster preparedness. The Centers for Disease Control and Prevention (CDC) in the US recently put together an emergency-preparedness toolkit for disasters and catastrophes: Preparedness 101: Zombie Apocalypse. The use of a zombie apocalypse allows us to think through possible disaster responses and the relation between collectivity and resilience.
This might be seen in the containment strategies of the Ebola outbreak in West Africa. The xenophobic responses that followed the outbreak led nowhere whereas the collective response has been far more effective in containing the spread of the virus. Climate change is yet another example of the need to think seriously about solidarity as a key instance of disaster-preparedness and response.
The Walking Dead is a useful metaphor to think with. As we wait for the sixth season, we can contemplate its role in debating how we anticipate events that may threaten the economic order of things. So, does a zombie apocalypse signify the end of capitalist civilisation, or its perverse consummation?
We take it for granted but acting – or failing to act – in advance of possible futures is in fact an essential aspect of contemporary neoliberal democracies, whether we are talking about terrorism, climate change or a zombie pandemic, as the CDC toolkit implies.
A show like The Walking Dead helps us think through the challenges we face as a species; it helps us reflect on the critical importance of how to make new economies possible, and not just in the aftermath of major disaster.
The final episode of season 5 of The Walking Dead screened on FXTV last night.
If you watch the season finale of The Walking Dead this Sunday, the story will likely evoke events from previous episodes, while making references to an array of minor and major characters. Such storytelling devices belie the show’s simplistic scenario of zombie survival, but are consistent with a major trend in television narrative.
Prime time television’s storytelling palette is broader than ever before, and today, a serialized show like The Walking Dead is more the norm than the exception. We can see the heavy use of serialization in other dramas (The Good Wife and Fargo) and comedies (Girls and New Girl). And some series have used self-conscious narrative devices like dual time frames (True Detective), voice-over narration (Jane the Virgin) and direct address of the viewer (House of Cards). Meanwhile, shows like Louie blur the line between fantasy and reality.
Many have praised contemporary television using cross-media accolades like “novelistic” or “cinematic.” But I believe we should recognize the medium’s aesthetic accomplishments on its own terms. For this reason, the name I’ve given to this shift in television storytelling is “complex TV.”
There are a wealth of facets to explore about such developments (enough to fill a book), but there’s one core question that seems to go unasked: “why has American television suddenly embraced complex storytelling in recent years?”
To answer, we need to consider major shifts in the television industry, new forms of television technology, and the growth of active, engaged viewing communities.
A business model transformed
We can quibble about the precise chronology, but programs that were exceptionally innovative in their storytelling in the 1990s (Seinfeld, The X-Files, Buffy the Vampire Slayer) appear more in line with narrative norms of the 2000s. And many of their innovations – season-long narrative arcs or single episodes that feature markedly unusual storytelling devices – seem almost formulaic today.
What changed to allow this rapid shift to happen?
As with all facets of American television, the industry’s economic goals are a primary motivation for all programming decisions.
For most of their existence, television networks sought to reach the broadest possible audiences. Typically, this meant pursuing a strategy of mass appeal featuring what some derisively call “least objectionable programming.” To appeal to as many viewers as possible, these shows avoided controversial content or confusing structures.
But with the advent of cable television channels in the 1980s and 1990s, audiences became more diffuse. Suddenly, it was more feasible to craft a successful program by appealing to a smaller, more demographically uniform subset of viewers – a trend that accelerated into the 2000s.
In one telling example, FOX’s 1996 series Profit, which possessed many of contemporary television’s narrative complexities, was quickly canceled after four episodes for weak ratings (roughly 5.3 million households). These numbers placed it 83rd among 87 prime time series.
Yet today, such ratings would likely rank the show in the top 20 most-watched broadcast programs in a given week.
This era of complex television has benefited not only from more niche audiences, but also from the emergence of channels beyond the traditional broadcast networks. Certainly HBO’s growth into an original programming powerhouse is a crucial catalyst, with landmarks such as The Sopranos and The Wire.
But other cable channels have followed suit, crafting original programming that wouldn’t fly on the traditional “Big Four” networks of ABC, CBS, NBC and FOX.
A well-made, narratively complex series can be used to rebrand a channel as a more prestigious, desirable destination. The Shield and It’s Always Sunny in Philadelphia transformed FX into a channel known for nuanced drama and comedy. Mad Men and Breaking Bad similarly bolstered AMC’s reputation.
The success of these networks has led upstart viewing services like Netflix and Amazon to champion complex, original content of their own – while charging a subscription fee.
The effect of this shift has been to make complex television a desirable business strategy. It’s no longer the risky proposition it was for most of the 20th century.
Miss something? Hit rewind
Technological changes have also played an important role.
Many new series reduce the internal storytelling redundancy typical of traditional television programs (where dialogue was regularly employed to remind viewers what had previously occurred).
Instead, these series subtly refer to previous episodes, insert more characters without worrying about confusing viewers, and present long-simmering mysteries and enigmas that span multiple seasons. Think of examples such as Lost, Arrested Development and Game of Thrones. Such series embrace complexity to an extent that they almost require multiple viewings simply to be understood.
In the 20th century, rewatching a program meant either relying on almost random reruns or being savvy enough to tape the show on your VCR. But viewing technologies such as DVR, on-demand services like HBO GO, and DVD box sets have given producers more leeway to fashion programs that benefit from sequential viewing and planned rewatching.
Like 19th century serial literature, 21st century serial television releases its episodes in separate installments. Then, at the end of a season or series, it “binds” them together into larger units via physical boxed sets, or makes them viewable in their entirety through virtual, on-demand streaming. Both encourage binge watching.
Giving viewers the technology to easily watch and rewatch a series at their own pace has freed television storytellers to craft complex narratives that are not dependent on being understood by erratic or distracted viewers. Today’s television assumes that viewers can pay close attention because the technology allows them to easily do so.
Shifts in both technology and industry practices point toward the third major factor leading to the rise in complex television: the growth of online communities of fans.
Today there are a number of robust platforms for television viewers to congregate and discuss their favorite series. This could mean partaking in vibrant discussions on general forums on Reddit or contributing to dedicated, program-specific wikis.
As shows craft ongoing mysteries, convoluted chronologies or elaborate webs of references, viewers embrace practices that I’ve termed “forensic fandom.” Working as a virtual team, dedicated fans embrace the complexities of the narrative – where not all answers are explicit – and seek to decode a program’s mysteries, analyze its story arc and make predictions.
The presence of such discussion and documentation allows producers to stretch their storytelling complexity even further. They can assume that confused viewers can always reference the web to bolster their understanding.
Other factors certainly matter. For example, the creative contributions of innovative writer-producers like Joss Whedon, JJ Abrams and David Simon have harnessed their unique visions to craft wildly popular shows. But without the contextual shifts that I’ve described, such innovations would have likely been relegated to the trash bin, joining older series like Profit, Freaks and Geeks and My So-Called Life in the “brilliant but canceled” category.
Furthermore, the success of complex television has led to shifts in how the medium conceptualizes characters, embraces melodrama, re-frames authorship and engages with other media. But those are all broader topics for another chapter – or, as television frequently promises, to be continued.
Vampires walk among us. But these people aren’t the stuff of nightmares – far from it actually. Just sit down for a drink with one of them and ask for yourself. That’s if you can find one. They aren’t necessarily looking to be found.
I’ve spent five years conducting ethnographic studies of the real vampires living in New Orleans and Buffalo. They are not easy to find, but when you do track them down, they can be quite friendly.
“Real vampires” is the collective term by which these people are known. They’re not “real” in the sense that they turn into bats and live forever, but many do sport fangs and just as many live a primarily nocturnal existence. These are just some of the cultural markers real vampires adopt to express a shared (and, according to them, biological) essence – they need blood (human or animal) or psychic energy from donors in order to feel healthy.
Their self-described nature begins to manifest around or just after puberty. It derives, according to them, from the lack of subtle energies their bodies produce – energies other people take for granted. That’s the general consensus anyway. It’s a condition they claim to be unable to change. So, they embrace it.
The real vampire community, like the legendary figure it emulates, knows few national boundaries, from Russia and South Africa to England and the United States. Particularly in the internet age, vampires are often well attuned to community issues.
This is more true for some than others though. I found the vampires of Buffalo to be keen to keep up to date with the global community, while those in New Orleans were often more interested in the activities of their local vampire houses (an affiliated group of vampires usually led by a vampire elder who helps his or her house members to acclimate to their vampiric nature).
Some houses, and indeed whole vampire communities, as in the case of New Orleans, will combine their efforts to organise charity events, like feeding (not feeding on) the homeless. However, despite their humanitarian efforts, real vampires don’t go around advertising who they are for fear of discrimination by people who simply don’t understand them.
Some semblance of the real vampire community has existed since at least the early to mid-1970s, but my own dealings began in 2009 when I entered the New Orleans community clinging to my digital voice recorder.
I eventually met around 35 real vampires there, but the total number in New Orleans is easily double that. They ranged in age from 18 to 50 and represented both sexes equally. They practised sanguinarian (blood) and psychic feeding – taking energy using, for example, the mind or hands.
Blood is generally described by my study participants as tasting metallic, or “coppery” but can also be influenced by the donor’s physiology, or even how well he or she is hydrated. Some psychic vampires use tantric feeding, that is through erotic or sexual encounters, while others use what could be described as astral feeding or feeding on another from afar. And others feed through emotion.
Afterwards, blood-drinking and psychic vampires feel energised or otherwise better than they would if they were to sustain themselves on regular food alone, like fruits, fish, and vegetables (which they eat too).
These vampires described themselves as atheistic, monotheistic or polytheistic. Some identified as heterosexual, some homosexual and some bisexual. Some were married, some were divorced and some were parents.
Unquestionably, I found the vampires I met to be competent and generally outwardly “normal” citizens. They performed blood-letting rituals safely and only with willing donors and participated regularly in medical exams that scarcely (if ever) indicated complications from their feeding practises.
Tales of the unexpected
What was perhaps most surprising about the vampires I met though was their marked lack of knowledge about vampires in popular culture. They seemed to know much less than you might expect – at least for vampires – about how their kind were depicted in books and films. By this I mean to say that the people I met with and interviewed hadn’t turned to drinking blood or taking psychic energy simply because they had read too many Anne Rice novels.
In fact, the real vampire community in general seems to have appropriated very few of the trappings mainstream culture attaches to creatures of the night. Many do dress in gothic clothes but certainly not all the time, and very, very few sleep in coffins. In fact, those vampires who do dress a certain way or wear fangs do so long after realising their desire to take blood.
This is what might be called a “defiant culture”. Real vampires embrace their instinctual need to feed on blood or energy and use what mainstream culture sees as a negative, deviant figure like the vampire to achieve a sense of self-empowerment. They identify others with a similar need and have produced a community from that need.
But real vampires can also help us understand, and perhaps even shed, some of the ideological baggage each of us carries. They show us how repressive and oppressive categories can lead to marginalisation. Through them, we see the dark side of ourselves.
More generally, this community shows that being different doesn’t have to force you onto the margins of society. Real vampires can and do exist in both “normal” society and their own communities, and that’s okay.
Many jurisdictions around the world now allow gay and lesbian marriages, but “treating” homosexuality remains a politicised topic. Some groups, mainly in the United Kingdom, are visibly displeased by efforts to discredit therapies that claim to change homosexuals.
Despite appearances that we increasingly live in a world where same-sex attraction is acceptable, it’s not been long since homosexuality was considered to be a mental illness. Many psychiatrists thought it was curable with “therapies” designed to turn gay people off their same-sex desires.
Like mental health governing bodies in the United States and Australia, those in the United Kingdom, such as the UK Council for Psychotherapy, have long rejected the idea of conversion therapies for homosexuality.
But the practice has still not been entirely eradicated from mental health care. A 2009 UK survey found that while only 4% of therapists reported they would attempt to change a client’s sexual orientation if one consulted asking for such therapy, 17% reported having assisted at least one client/patient to reduce or change his or her homosexual or lesbian feelings.
Opposition to this sometimes National Health Service-supported therapy (in 40% of cases) was raised in parliament by the UK health minister in April 2014. Political pressure to ban conversion therapies is mounting, and not only because the discredited therapy is a waste of public money. Such therapies are homophobic in that they continue to suggest homosexuality is a mental illness and something that can be eradicated.
A wilful misinterpretation
Mental health care has historically held various positions on homosexuality, from trying to cure it, to defending it against the law, to promoting a more positive view of same-sex attraction.
In April 2014, the UK Royal College of Psychiatrists (RCP) released a position statement on sexual orientation. This reiterated that homosexuality was not a mental disorder.
This document was an important step in the history of psychiatric engagement with lesbian, gay and bisexualities; it was (rather belatedly) in line with the removal of homosexuality from the Diagnostic and Statistical Manual of the American Psychiatric Association in 1973, and the International Classification of Diseases of the World Health Organisation in 1992.
The College said it:
considers that sexual orientation is determined by a combination of biological and postnatal environmental factors. There is no evidence to go beyond this and impute any kind of choice into the origins of sexual orientation.
The statement proved to be controversial, with some religious groups misinterpreting “postnatal factors” (that homosexuality is not only inborn) as a justification for conversion therapies.
The main thrust of the College’s position – that lesbian, gay and bisexuality don’t need to be “treated” – is in keeping with other psychiatric and psychological governing bodies. But in June 2014, Mike Davidson, director of the Core Issues Trust, the same British registered charity that unsuccessfully attempted to advertise “cures” for homosexuality on London buses in 2013, said:
The assumption that people are ‘born gay’ has become deeply rooted in our society and has driven huge political, social and cultural change. As this latest statement [from the College] reveals, that assumption is false and it is vital that professional bodies stand up against it rather than perpetuate it.
This wilful misinterpretation of the College’s position has been echoed by a number of other religious groups hoping to reinstate “cures” for homosexuality.
Push and pull
The debate around whether homosexuality is inborn, acquired, or a vice chosen by the person, is almost as old as modern psychiatry.
While many jurisdictions traditionally made homosexual activity illegal, assuming that the person was choosing to engage in a criminal act, several important sexologists attempted to show that it was inborn. Havelock Ellis’s radical 1897 work Sexual Inversion, for instance, argued homosexuality was innate, natural, found in various periods of history and many cultures, and therefore should not be illegal.
Ellis advocated legalising consensual homosexual sex between adults. But it was many decades before such a law offered lesbian, gay and bisexual people some vestige of legal protection and similar social rights.
It’s only when homosexuality is considered to be acquired that it is sometimes thought to be possible to “treat” it. But this conception requires a belief that it’s actually a mental illness, rather than one of many appropriate means of seeking sexual pleasure and forming emotional bonds.
Psychoanalysts considered homosexuality to be an immature deviation of the sexual object choice, caused by postnatal influences, some of which could be overcome in analysis to turn the analysand to heterosexuality. But in a letter to an American mother of a homosexual man, the father of the movement, Sigmund Freud, insisted:
it is nothing to be ashamed of, no vice, no degradation, it cannot be classified as an illness; we consider it to be a variation of the sexual function produced by certain arrest of sexual development.
He thought psychoanalysis would help people deal with the persecution their sexuality received in an unaccepting world.
A better time
After many years of intense persecution by police in efforts to regulate homosexuality, aversion therapies were developed to “cure” people of their prohibited sexual desires in the mid-20th century. Submitting to such therapies was a way of shortening prison sentences for those incarcerated for being gay.
In what effectively amounted to psychological torture, homosexuals were given emetics and shown gay pornography while being kept unwashed and sleep-deprived in rooms that stunk of their own vomit for days on end, to create an aversion to the objects of their desires. The idea was to forge an association between this horror and their sexual needs.
Ultimately, such therapies were unsuccessful. But they did cause intense trauma and suffering in the people on whom they were performed.
Eventually, after criticism from homosexual rights groups, and in the face of evidence that the “treatments” caused more harm while effecting no cure, homosexuality came to be considered a normal and acceptable expression of sexual desire. It was and still is the case that homophobia causes more suffering than homosexuality, which was decriminalised in many jurisdictions between the 1960s and 1990s. It remains illegal in 78 countries.
The UK Royal College of Psychiatrists’ position statement on sexual orientation is important because it underscores the idea that homosexuality should be acceptable, even if it’s not solely inborn. It should make no difference to people’s sexual rights if they are born that way, if they became homosexual because of some influence after their birth, or if they decide to engage in gay sex because they find it pleasurable.
There’s no case for attempting to cure sexual desire for people of the same gender. There is much to be said for banning treatments that have only added to the suffering and persecution of people for their sexual feelings.
The Royal College of Psychiatrists is to be commended for continuing to state that such “cures” are wrong. The religious groups trying to bring back aversion therapy need to catch up.
The star-studded world premiere of Game of Thrones’s fifth season has taken place – in the UK, at the Tower of London. It’s hard to imagine a more fitting venue for a show based on George R R Martin’s notoriously brutal novels.
Like the fictional Red Keep built by Aegon the Conqueror at King’s Landing after his War of Conquest, William the Conqueror founded London’s iconic fortress to subdue the locals after the Norman Conquest and reinforce his dominance as the new monarch. And 900 years on, the Tower has become synonymous with political intrigue, imprisonment, torture and death – a reputation that stems largely from the late 15th and 16th centuries, when several kings, queens and martyrs were imprisoned, murdered or executed within and around its walls.
During the Wars of the Roses, the medieval conflict that inspired Game of Thrones, King Henry VI was confined in the Tower twice by his dynastic rival, Edward IV. Not unlike Aerys II, the “mad” Targaryen king, who incites Robert Baratheon’s rebellion in the back story to the series, Henry VI’s ineffectual leadership and madness triggered the historic civil war.
A prisoner for more than five years, Henry was murdered at the Tower following the death of his only son and heir at the Battle of Tewkesbury (1471). His execution, like the slaughter of mad king Aerys and his immediate heirs, was followed by a period of relative peace. But when Edward IV died in 1483, political wrangling resumed and the Tower once again provided the backdrop for the next stage of the conflict: the controversial disappearance of Edward’s sons, the “Princes in the Tower”.
Sons and queens
The slaughter or disappearance of young heirs and bastards is also a disturbing and recurring motif in Game of Thrones, resonating with the medieval and Tudor obsession with bloodlines and succession.
Jaime Lannister throws young Bran Stark from a tower at Winterfell to prevent his incestuous affair with Cersei and their illegitimate children being discovered. Theon Greyjoy fakes the murder of Bran and Rickon Stark to secure Winterfell by butchering two proxies. Joffrey orders the Gold Cloaks to massacre Robert Baratheon’s bastards. Rickard Karstark kills Tywin Lannister’s nephews. Craster sacrifices his newborn sons to the White Walkers. All are innocent children, and all fall victim to the morally ambiguous “game” being played by the leading Houses of Westeros.
The prime responsibility of the noblewomen, such as Cersei Lannister, Margaery Tyrell and Sansa Stark, is to provide the next generation of kings. They are therefore also at the heart of the political machinations. And here, too, the series draws inspiration from the real medieval and Tudor women associated with the Tower.
In his pursuit of a male heir, Henry VIII famously had his politically sharp consort, Anne Boleyn, imprisoned and executed at the Tower on charges of treason, adultery and incest (the same crimes associated with Cersei Lannister). This example will be particularly fresh for those who watched the BBC’s Wolf Hall.
Queen of the Roses
Henry VI’s widow, Margaret of Anjou, was likewise incarcerated in the Tower after a decade-long struggle to secure the crown for her son, Prince Edward. A formidable and proud French woman, who married for political purposes, Margaret was unafraid of engaging with the factionalism of her husband’s all-male council. She fought fiercely, if not always astutely, when her son’s birthright came under attack, and made strategic alliances wherever she could – most notably by marrying Prince Edward to the daughter of her former enemy, Richard Neville, earl of Warwick, better known as the “Kingmaker”.
Captured after leading the Lancastrian army against Edward IV at the aforementioned Battle of Tewkesbury, where her son was killed, Margaret avoided her husband’s grim fate and was eventually ransomed, returning to France to live out her days in relative obscurity.
If aspects of Margaret’s story sound familiar it’s because she is one of several resilient historical women who inspired the characterisation of Cersei Lannister. While Cersei’s future is uncertain, we’ve seen her fight to influence and fortify Joffrey’s sovereignty, and this season promises to follow her struggle with Margaery Tyrell for control of King Tommen, her second son.
Ravens and crows
It’s impossible to avoid a final comparison between the Night’s Watch, or “Crows”, who swear to be “the shield that guards the realms of men” at the Wall, and that other species of the crow genus – the raven – which is said to protect the kingdom by its presence at the Tower. According to tradition, if the ravens ever leave, the Tower and the realm will fall.
By the same token, one can’t help wondering what will happen to the Seven Kingdoms if the crows defending the northern frontier are slain, or forced to flee, by the White Walkers. In future seasons, Bran and the mysterious three-eyed raven doubtless have an equally important role to play in the defence of the kingdom.
But for the time being, the imminent season of Game of Thrones will continue to delight and terrify its audience with the same bouts of intrigue, scandal and brutality that have contributed to the Tower’s notorious reputation and popularity.
If winter is coming, then so is more bloodshed, for, as Tower prisoner and martyr Sir Thomas More once said in his account of Richard III’s acquisition of the crown, “king’s games” are “for the more part played upon scaffolds”.