Over*Flow: They Are Risen: Drive-In Distractions and Hallowed Ground Under Lockdown
David Church / Indiana University


Easter 2020 church service at Becky’s Drive-In Theatre (Walnutport, PA)

Among the COVID-19 pandemic’s major disruptions to the entertainment industry, the shuttering of movie theaters across much of the world has been one of the most talked-about developments. Even at a time when popular reportage about box-office numbers tends to gloss over theatrical exhibition’s place as a loss leader compared to home-video distribution windows, the major Hollywood studios have either delayed their upcoming release slates, vastly foreshortened the theatrical window, or premiered a handful of new releases as video-on-demand (VOD) streaming rentals. Perhaps a foregone conclusion, this latter option ironically recalls the same strategies that art-cinema distributors have used for several decades to combat the decline of arthouse theaters—even as the “eventness” of these day-and-date rentals also suggests the longer tradition of pay-per-view’s (PPV) imagined viewing collectivities for boxing and wrestling events.[ (( WWE’s WrestleMania 36, for instance, aired on April 4-5, 2020, but the PPV event’s conspicuous lack of cheering audiences in the stands worked against PPV subscribers’ sense of belonging to a collective viewing audience. On the shift to VOD platforms by art cinema distributors, also see Lucas Hilderbrand, “The Art of Distribution: Video on Demand,” Film Quarterly 64, no. 2 (2010): 24-28.))]

Here in the United States, spring 2020 has witnessed a series of social-distancing restrictions (all of varying severity or laxity, depending on the uneven rollout of local and state guidelines). Among many trade groups attempting to weather the lockdown, the National Association of Theatre Owners lobbied Congress to help keep shuttered theaters afloat through the multi-trillion-dollar COVID-19 relief bills. But these bailout calls from brick-and-mortar businesses have been counterbalanced, in part, by the forms of mobile privatization that cars and trucks have afforded many Americans since the 1950s.[ ((During the 1957-58 H2N2 flu pandemic, movie theaters were not shuttered, but attendance dropped by 25-50% in large cities as people stayed home to avoid infection. In an interesting connection to the boom in streaming services during the COVID-19 pandemic, industry wags debated whether to instead blame the box-office drop-off on the growth of home movie viewing, since October 1957 marked “the first time that daily audience for vintage productions on TV exceeded a full week’s attendance at theaters.” See “Current Alibi: Flu,” Variety, October 23, 1957; and “Flu Cost 10-Mil Tix—Sindlinger,” Variety, November 6, 1957.))] The curbside pickup of groceries, meals, and other essentials calls back to the 1950s rise of drive-in restaurants (minus the sociality); meanwhile, restaurant drive-thru windows have allowed other businesses to stay financially alive, and even drive-through COVID-19 testing facilities (where available) have been created to help foster more literal forms of survival.


Television-style movie spectatorship at 1950s “Autoscope” drive-in system with individual screens per car

In an ironic stroke of timing, however, America’s indoor theaters closed en masse just as its over 300 surviving drive-in theaters were slowly emerging from their winter hibernation—a fraction of which were allowed to re-open for business (subject to state and local restrictions), so long as attendees maintained social-distancing protocols by remaining in and around their vehicles. Even operating with reduced services (such as limited food sales and restroom access), drive-ins became almost the only theaters operating across the nation for weeks, with the admission price for a carload of family members comparable to a VOD rental ($19.99) of the same movies; families might still be isolated together, but with a welcome change of scenery.[ ((For a historical example of this overlap between domestic and public viewing, see the “Autoscope” system, a short-lived variety of 1950s drive-in theater, in which each car, parked in a circular formation, had its own small, individual rear-projection screen served through optical refraction.))] This sudden demand for drive-ins has even led to the creation of pop-up theaters and calls for new construction of permanent drive-in theaters.

National news coverage of this unexpected revival for a decidedly “retro” exhibition style has tended to play up much of the same novelty value that drive-ins originally used to promote themselves back during their 1950s heyday, such as convenience, affordability, and family-friendly ambience—albeit now reframed around their scarcity in the streaming video era.[ ((Also see David Church, Grindhouse Nostalgia: Memory, Home Video, and Exploitation Film Fandom (Edinburgh: Edinburgh University Press, 2015), chp. 1.))] At a deeper level, however, car ownership is far more common in rural America than the high-speed internet access needed for streaming video platforms. For all the media coverage of the COVID-19 lockdown as a boon to Netflix, Hulu, and other major players in the streaming wars, such outsized attention to streaming services ignores that this luxury is not enjoyed by large chunks of the country, especially in more rural areas. Discussions of the so-called “digital divide” may now be less about total inaccessibility than about very limited Internet functionality depending on one’s region and/or class status—hence it remains crucial to heed the affordances of older cinematic technologies like drive-in theaters when newer platforms are not viable options for accessing movies during the lockdown.


Pioneer of drive-in church services, Rev. Robert Schuller, at the Orange Drive-in Theatre, 1978

In a parallel development, some shuttered churches began holding drive-in services using the same short-range FM transmitters that drive-in theaters use to channel sound through car speakers. Assembling their vehicle-bound congregants in the empty parking lots of stores and, in some cases, drive-in theaters themselves, churches have turned to both live and pre-recorded sermons, encouraging parishioners to honk their horns for “amens.” Indeed, the concept of the drive-in church dates back to the 1950s, when drive-in theaters were first used for evangelical purposes during daylight hours. Whereas some churches have turned to livestreaming services, drive-in sermons might be a better option for those unable to technologically access such content—much like the cinematic forms of distraction and comfort offered by drive-in movies. Adamant about holding services during Holy Week, some church leaders echoed President Trump’s delusional hopes to resurrect the economy by Easter Sunday, a potentially lucrative time for churches due to above-average holiday attendance. In some cases, though, local governments have attempted to shut down drive-in services or penalize attendees for violating bans on public assembly, leading to lawsuits over religious freedom.


News coverage of Easter drive-in church service, Walnutport, PA (MSNBC, April 12, 2020)

Yet, the similarities between drive-in movies and drive-in church services would be merely a fluke were they not indicative of a larger politicization of public space during the crisis, with political conservatives and conservative-leaning areas less likely to heed social-distancing measures. Because of the lower land costs required for a permanent drive-in theater, drive-ins have survived best in areas whose lower population densities are often less conducive to high rates of COVID-19 transmission than large cities—the same non-urban areas also more likely to lean conservative. Scott Herring suggests, for example, that regional drive-ins became an unlikely site for the 1970s burgeoning of New Right politics by screening “hixploitation” films that romanticized rural white identity,[ ((Scott Herring, “‘Hixploitation’ Cinema, Regional Drive-ins, and the Cultural Emergence of a Queer New Right,” GLQ 20, no. 1-2 (2014): 95-113.))] while I would argue that drive-ins’ latter-day reputation as sites of 1950s nostalgia can create imagined spaces of reactionary refuge from various forms of social turmoil that now include the COVID-19 threat. Indeed, nostalgizing drive-ins as populist sites runs the risk of nostalgizing the 1950s as a time when America was supposedly “great,” much as the less disciplined behaviors historically permitted by drive-in attendance can play into right-wing skepticism about social-distancing guidelines. Small wonder that the conservative publication The Federalist nostalgically heralded the return of the drive-in theater as “good, clean, old-fashioned fun” for American families, only days after dangerously suggesting that “controlled voluntary infection” through “coronavirus parties” would help build herd immunity in order to re-open the economy sooner (never mind the lives lost in the process).

Much as the Trump administration has filed a statement of support in lawsuits against local crackdowns on drive-in church services, there is no shortage of people who believe that some great reward beyond—whether a spiritual afterlife or thriving markets—should supposedly outweigh their social responsibility to the safety of nonbelievers. Of course, we should not presume that all people attending drive-in movies or church services have ideological reasons for doing so, beyond trying to maintain some sense of normalcy under extraordinary circumstances—yet, both types of drive-in events offer the semblance of community in semi-isolation, while still warding off different sorts of invisible evils. Faced with the existential threat of COVID-19, there are plenty of Americans who might not have ready access to the same means of soothing themselves in such troubling times—but whether attending a drive-in movie or drive-in church service is ultimately any more useful than a Netflix binge or VOD rental remains a question of faith, if not politics.



Image Credits:

  1. Easter 2020 church service at Becky’s Drive-In Theatre (Walnutport, PA)
  2. 1950s “Autoscope” drive-in system
  3. Pioneer of drive-in church services, Rev. Robert Schuller, at the Orange Drive-in Theatre, 1978
  4. News coverage of Easter drive-in church service, Walnutport, PA (MSNBC, April 12, 2020)



A Bachelorette F***ing in a Windmill
Matthew H. Brittingham / Emory University


Hannah Sends Luke Home.

Season 15 of ABC’s reality TV show The Bachelorette (2003-present) featured an interesting scene late in the season: bachelorette Hannah Brown of Tuscaloosa, AL and contestant Luke Parker of Gainesville, GA had a very intense discussion about sex, faith, and the Fantasy Suite. The Fantasy Suite is a luxurious room where the bachelor or bachelorette, if they so desire, can become intimate with contestants. It is a staple of the show when the number of contestants has been cut to the last few. Luke, a kind of villain character on the show who took Hannah to his hometown church in one scene, sat with Hannah over dinner and appeared to call into question the consistency of Hannah’s beliefs should she invite contestants into the Fantasy Suite. Hannah, also a professing Christian, took offense: “you’re judging me and feel like you have the right to when you don’t at this point… guess what, sex might be a sin outside of marriage, but pride is a sin too. I feel like this is a pride thing… I’m a grown woman and can make my own decisions.” Luke again and again tried to backtrack, but his own foot had moved past his mouth and was already entering his stomach. She rejected the idea that, in her words, “you would not think of me as a woman of faith like I am.” Right before pushing Luke into the limousine and basically off the show, she said something that caused a public stir: “I have had sex… and Jesus still loves me.” She then followed this statement by saying that she had already had sex with another contestant on the show… in a windmill no less. In an interspliced interview, she turned to the camera and said, “I didn’t just go to the Fantasy Suite, I f***ed in a windmill. And guess what, we did it a second time,” winking at the camera after this last line. The whole conversation between Hannah and Luke was awkward, and Luke clearly did not want to leave the show—he tried to convince her to hear him out multiple times.

The exchange immediately spilled onto social media and occupied the news for the next several weeks. There were earlier signs that Hannah was clearly a Christian while also not being a holier-than-thou stereotype of a Christian. In the first episode of the show the viewer sees Hannah pray to God for strength before meeting the bachelors. Later in the first episode, after it is revealed that one of the bachelors has a girlfriend back at home, we hear Hannah dropping bleeped-out words left and right. In the exchange with Luke above, we hear the same—talking about how Jesus loves her moments before discussing f***ing in a windmill. For the Bachelor/Bachelorette series in particular, this was new territory. In a broader sense, however, religion has been present on reality TV since nearly its beginning, and it continues to be a force. There was little new about that part of the spectacle. As some scholars who have written about religion and reality TV would say, it was a classic instance of how “Reality television turns intimate moments of prayer, confession, ecstasy and sin into spectacle” (Einstein, Winston, and Madden, p. 8).


Hannah and Luke’s Twitter Battle.

Everybody wanted to participate in the spectacle as Hannah’s sexuality and its connection to Christian piety or impiety, depending on one’s perspective, brought forth boos and cheers. It was a media rose ceremony of sorts. Conversations about Hannah’s faith and position on sex before marriage ranged from The American Conservative to Fox News to The Daily Beast to NPR, with all the types of responses one could imagine. She was supported and attacked in all the normal venues for these kinds of conversations as well — Twitter, Facebook, Instagram, etc. In an interview with People Hannah talked about how “soul-crushing” it was to hear some people say that she “misrepresent being a Christian.” Instead, she affirmed that she “can be a woman of faith and also be sex-positive.” Today reported, quoting Hannah, that “In fact, as ‘an imperfect human, who is yes, also a Christian,’ she believes she’s a good representation of the redemptive nature of her faith.” In an NPR interview, Hannah stated, “It’s tough, because my faith is really important to me, and I do know the Bible, and I do know what it says, and I still stand by what I said. But a lot of people will try to sway what it actually means.” In this same interview, she talked about her background and why the conversation with Luke really irked her:

And I used to carry a lot of shame because I had had sex before. And in that moment, [being questioned by Luke P.], I felt like I was right back in church, just feeling like I was not enough. And that’s what I meant. Well you know what, I have had sex, but, like, I know my relationship with the Lord, I know that he forgives me. He loves me. And I’m not alone in that.

Along with many, many interviews with other media outlets, Hannah posted her own Instagram response to the vile slut-shaming comments she received. The Instagram post likewise became a place where her defenders could rally around her cause.

The media depiction of Hannah’s Christianity and sexuality, as well as her self-portrayal, is notable given trends in American Christianity. Having felt judged and having carried shame in a church she once attended, she framed herself, and was also framed by media commentators, as a “sex-positive” Christian. Certainly, in terms of shame, sex, and church, heavy criticism has been poured on conservative Christian purity culture, a culture with which Hannah appears to be at odds. Hannah, rather, framed her sense of spirituality as a personal relationship with God, a relationship that, in her words, contrasted with once “just feeling like I was not enough” in church. The contrast of shame/sex-positive is still in some ways a reinscription of old fuddy-duddy stereotypes about certain Christians, as if conservative Christians are sex-negative or don’t like having sex (for example, see Williams 2013). Actually, scholar Kelsy Burke (2016) has shown that conservative Christians love talking about sexual pleasure, placed within certain boundaries, of course. On the other hand, Burke’s research does indeed show the opposite too: there are negative Christian messages about sexual pleasure from religious leaders who seek to define the boundaries of who should be doing what, where, and with whom. In American Christianity, these constantly negotiated boundaries have complex histories, both on the more conservative and more liberal side of the bed.

In her self-representation, Hannah further tapped into longstanding religious rhetoric of potential transgression and imperfection. She called herself a person who had “slipped” and “wasn’t perfect,” despite her positive relationship with God. Luke actually tapped into the same rhetoric when he responded to what happened between him and Hannah. He wrote on Instagram, “our conversations and our beliefs led me to believe we were on the same page about sex… As for my time on the show I made mistakes and no I’m not perfect (crazy right) I didn’t totally behave as the man I want to be and I did not represent Christ the way I thought I was prepared to and that has broken me.” [The feud over sex and religion was ongoing too…]. The representation of Christians with two different views of sex is not necessarily new, but it is interesting in light of how Hannah and various media outlets were able to frame the clash in terms of a devout Christian who is “sex-positive” and on a not yet completed faith journey. The spectacle of Hannah and Luke’s conversation might leave a little to be desired for viewers who want a more definitive position from the bachelorette heroine, whichever way that may be. Regardless, the rhetorical work allowed many viewers to treat Hannah sympathetically, even if they did not totally agree with her. It is this small taste of cloudiness surrounding the fairly well-manicured reality TV world that provides just enough titillation and just enough distance from completion to keep viewers coming back. This is perhaps not what critic Michael Warner called “the agony” of “choosing between the orgasm and religion” (cited in Burke, p. 3); rather, it is the tense agony of having a bit of both without being completely satisfied.



Image Credits:

  1. Hannah Sends Luke Home.
  2. Hannah and Luke’s Twitter Battle.


References:

Kelsy Burke, Christians Under Covers: Evangelicals and Sexual Pleasure on the Internet. Berkeley, CA: University of California Press, 2016.

Mara Einstein, Katherine Madden, Diane Winston (eds.). Religion and Reality TV: Faith in Late Capitalism. New York: Routledge, 2018.

Daniel K. Williams, “Sex and the Evangelicals: Gender Issues, the Sexual Revolution, and Abortion in the 1960s,” in American Evangelicals and the 1960s, Axel R. Schäfer (ed.). Madison, WI: University of Wisconsin Press, 2013.​




Legitimating/ed Complexity and Religion in The Leftovers
Kiwi Lanier / University of Texas at Austin

Kevin Garvey, played by Justin Theroux, in The Leftovers

Especially during election years, discussions of faith and politics are almost ubiquitous on broadcast news and social media. And yet, 50% of American adults say they seldom or never discuss religion with anyone outside of their families. [ ((Cooperman, Alan. “Many Americans Don’t Argue about Religion – or Even Talk about It.” Pew Research Center Fact Tank. Pew Research Center, 15 Apr. 2016. Web. 15 May 2016.))] Pew Research Center’s Religion in Public Life Project charts the religious attitudes and behaviors of Americans between 2007 and 2014. According to this study, the percentage of Americans who believed in God declined by 8% between 2007 and 2014, with the percentage of Americans who do not believe in God almost doubling during the same time period. [ ((Wormald, Benjamin. “Religious Landscape Study.” Pew Research Centers Religion Public Life Project RSS. Pew Research Center: Religion and Public Life, 11 May 2015. Web. 15 May 2016.))] 49% of religiously unaffiliated Americans (usually referred to as “nones,” an amusingly coincidental yet frustrating homophone) believe in God, either with absolute certainty or with fair certainty. [ (( Ibid. ))] Because these people (albeit a small section of the population) are choosing to forgo affiliation with a recognized branch of religion, they are free to choose a unique combination of beliefs and rituals — or lack thereof — to appropriate for themselves. These results indicate a shift in how these ideas manifest and evolve in people’s internal lives and patterns of behavior.

Given these shifting attitudes and behaviors, how is a show like The Leftovers (2014–), which portrays explicitly religious characters and themes, so popular? Premium networks like HBO (the network that airs The Leftovers) have the bargaining power to change their lineups however they wish — meaning they could program shows of the same production value and narrative complexity without having to go near religion at all — and yet such shows regularly cover religious ideas, create complex religious characters, and manage to persist and thrive. The Leftovers uses religion and narrative complexity as a mechanism for legitimation; had it aired on a more traditional network, that content may well have been censored to the show’s detriment. Despite changing religious attitudes and beliefs, television can still serve as a site for religious identity negotiation for those who can afford access.

Jason Mittell’s essay on narrative complexity, although divisive, provides a good framework for understanding The Leftovers’ appeal to critics and to its devoted fan base. He writes that television from the 1990s onward exists within a larger framework of complexity and innovation. Mittell cites the rise in channels as a factor in shifting programming tastes among viewers. According to Mittell, the network logic worked like this:

Audiences lacked the weekly consistency to allow for serialized narratives…but as the number of channels has grown and the audience for any single program has shrunk, networks and channels have grown to recognize that a consistent cult following of a small but dedicated audience can suffice to make a show economically viable. [ ((Mittell, Jason. “Narrative Complexity in Contemporary American Television.” The Velvet Light Trap 58.Fall (2006): 29-40. Web. 15 May 2016.))]

Narrative complexity is a possible solution to the interchangeability of procedural dramas and conventional sitcoms that characterized the network era. [ ((Ibid., 31.))] He defines narrative complexity as a “redefinition of episodic forms under the influence of serial narration — not necessarily a complete merger of episodic and serial forms but a shifting balance.” [ ((Ibid., 33.))] Additional features of narrative complexity include the use of narrative spectacle and the outright rejection of episodic and serial conventions to create an entirely different style of storytelling. According to Mittell, this kind of storytelling requires a more engaged viewer than typical television viewing, since the enjoyment comes out of the negotiating processes between the viewer and the program. [ ((Ibid., 38.))]

The Leftovers depicts life in America after an apocalyptic event called the Departure that caused 140 million people to disappear from Earth. The people in this universe (and the audience) never find out where those people went, why those people disappeared, if they are ever coming back, or if the people who disappeared are alive or dead. Characters within the show devise their own explanations for why the Departure occurred and coping mechanisms for dealing with grief and doubt.

One particular result of the Departure is a cult called the Guilty Remnant, which protagonist Kevin Garvey’s estranged wife Laurie joins in the first season. Members of the Guilty Remnant do not worship any god. If they worship anything, it is survivor’s guilt. Upon joining, members of the Guilty Remnant must wear all white, swear a vow of silence, smoke cigarettes, and generally assist in operations of the cult under the leadership of Patti Levin, the founder of the cult and one of the most compelling villains on television. The cult performs acts of both violent and nonviolent intimidation in an attempt to force the people of Mapleton, New York (first season) and Jarden, Texas (second season) to essentially never move on from their grief and guilt.

The Guilty Remnant

The Guilty Remnant targets these acts of intimidation at the families and friends of those who departed. Portraying a corrupt cult is by no means revolutionary, but the cult’s narrative trajectory over the course of the series takes so many violent turns that the viewer feels disoriented. One reviewer says:

Even after three episodes, it’s nearly impossible to tell where the second season’s story is headed. The structuring alone prevents viewers from knowing what next week’s focus will be, and whether that makes the series more accessible to potential fans and as exciting to existing ones is an equally difficult prediction. [ ((Travers, Ben. “Review: ‘The Leftovers’ Season 2 Boldly Embraces the Unknown to Be Born Again.” Indiewire. Indiewire.com, 30 Sept. 2015. Web. 15 May 2016.))]

In an interview, Damon Lindelof (the showrunner) essentially confirms that Travers has picked up on his frustrating yet fascinating modus operandi when he says, “We’re much more interested in the condition of living in a world without answers than we are in providing them.” [ ((Sepinwall, Alan. “Damon Lindelof: ‘I Was Really Depressed Writing Leftovers Season 1.’” HitFix. HitFix, 29 Sept. 2015. Web. 15 May 2016.))]

When the show is not producing confounding narrative elements, its character work is worth noting. The Leftovers includes explicitly religious characters but avoids stereotypes where it can. Christopher Eccleston plays a preacher named Matt Jamison, determined to expose the sins and indiscretions of those who departed, even to their families, in a misguided attempt to prove that the Departure was not the Rapture and to bring people back into his church. However, the Guilty Remnant ends up buying his church out from under him because he couldn’t bring in enough money. Matt’s relationship with God is troubled because his wife, Mary, was in a devastating car accident when the driver of an oncoming car disappeared in the Departure. The accident left Mary paralyzed and nonverbal, making Matt her sole caretaker and further complicating his relationship with his God.

Reverend Matt Jamison

Within Mittell’s framework, the show fits several of his criteria with regard to narrative spectacle. Dream sequences, flashbacks, and visions are often presented without comment or explanation (and even if an explanation is given, it usually prompts even more questions or addresses only the bare elements that need to be clarified), similar to the fantasy sequences that Mittell describes. [ ((Mittell, 38))] In the second season, the show changes setting from Mapleton, New York to Jarden, Texas. This is ultimately explained within the show, but more conventional shows definitely could not pull off such a drastic change and get away with so little clarification. With so many gaps in the storyline, the new, more involved mode of viewer engagement that Mittell calls for is entirely necessary to understand the few details the show provides for the audience. [ ((Mittell, 38))] The storytelling is self-reflexive, as Mittell says narratively complex programming is. It deals with religious themes to create complex plot lines, but it does not shy away from being critical of the same religions it uses for its own benefit.

To conclude, religion on television is certainly still a worthy topic of study due to its changing roles in society. For an audience with decreasing religious affiliation, shows like The Leftovers that handle these themes with enough reflexivity, innovation, and finesse can appeal to critics and viewers of various backgrounds. This show allows viewers to meet and wrestle with religious ideology on their own terms in a way that inspires instead of condemns or shames.

Image Credits:

1. Kevin Garvey.
2. The Guilty Remnant.
3. Matt Jamison.




Sometimes a kiss is just a kiss: (not) responding to the Richard Gere-Shilpa Shetty controversy in India

by: Shanti Kumar / University of Texas-Austin

Shilpa Shetty, it appears, cannot stay out of controversy and news headlines these days. Shetty, a well-known Bollywood actress in India, shot to international prominence after appearing as a contestant on Celebrity Big Brother in the U.K. in January 2007. The British reality TV show was engulfed in a major controversy when Shetty became the target of racist remarks and bullying by some of her housemates led by the now infamous Jade Goody. When Shetty went on to win the show, she not only became a household name in Britain, but was also the focus of attention in many newspapers, television channels and online sites around the world.

Shetty was back in the global news headlines in April 2007, when she was embroiled in another controversy, this time in India. At an AIDS awareness campaign organized in Delhi to benefit truck drivers, the American actor Richard Gere planted a series of kisses on Shetty. Although taken aback by Gere’s actions, Shetty reportedly laughed it off with a comment directed to the truckers, “yeh thoda zyaada ho gaya” (“This is a bit much.”)

Shilpa Shetty and Richard Gere

Condemning the kiss, Prakash Javadekar, spokesman for the Hindu nationalist Bharatiya Janata Party (BJP), proclaimed, “Such a public display is not part of Indian tradition.” In Mumbai, members of the right-wing Hindu nationalist group Shiv Sena stormed onto a set where Shetty was shooting a film, set fire to her photographs and burned effigies of Gere. Poonal Chandra Bhandari, an advocate in the city of Jaipur, filed public interest litigation accusing Gere and Shetty of committing “an obscene act” in a public place. Conceding that the kiss at the public event was “highly sexually erotic,” Dinesh Gupta, Additional Chief Judicial Magistrate in the Jaipur Court, issued an arrest warrant against Gere and summoned Shetty for appearance on May 5, 2007.

Sensing trouble due to the growing controversy, Gere tried to set the record straight with an apology. In a statement addressed to “My dear Indian friends,” and released to the media, Gere wrote, “What we thought was a very successful HIV/AIDS event has taken a sad turn. The evening and event in question was intended to celebrate courageous people and partnerships in the supremely important fight against HIV/AIDS, a worldwide pandemic which has afflicted over 5 million Indians and is still increasing.” Applauding Shetty for taking a leadership role in the fight against AIDS, Gere said, “I assure you, I have utmost respect for her, and she knows this. Of course, I’ve felt terrible that she should carry a burden that is no fault of hers. The burden is mine and no one else’s.”

Shetty, on her part, strongly defended Gere saying, “He is such a gentleman. He is incapable of indecent behaviour.” Lashing out against her critics, Shetty argued, “It was just a kiss on my cheek! What’s the big hue and cry about?” She explained the reason for the kiss as follows: “Earlier during the day during lunch we were teasing him about a dance step in Shall We Dance? When he suddenly bent me down on stage he was doing that whole step from Shall We Dance? I was as taken aback as the people who saw it. It was nothing but a joke and not pre-planned at all.”

But some critics of the kiss seemed unwilling to accept either Gere’s apology or Shetty’s explanation. “The indecency might have been purposefully done as a publicity stunt,” argued Lily Agarwal, a BJP member of the Bhopal City Corporation. Supporting the protests, Agarwal said, “An Indian woman’s greatest asset is her modesty, her reputation and dignity. Shilpa’s lack of any protest only confirms that we are still slaves of the ‘White.’ We will tolerate all humiliation just because we feel the ‘White’ is our master.”

In many postcolonial nations like India, the myth of a homogenous and homogenizing (white) Western culture is a convenient reference point for many political parties and ideological blocs struggling to establish their hegemony in the very diverse terrain of culture. As the noted postcolonial critic Ashis Nandy argues, the myth of “the West” has engendered (and has in turn been engendered by) three responses in colonial and postcolonial India; or more precisely, two responses and one non-response.

The first response, writes Nandy, is to model Indian culture on the idealized myth of Western culture. However, there is more than mere imitation or mimicry involved in this process: It involves “capturing, within one’s own self and one’s own culture, the traits one sees as reasons for the West’s success on the world stage.” This process is seen as a liberal synthesis of “Indian” and “Western” cultures, and justified in terms of universal principles such as “democracy” and “civilization.” In the Gere-Shetty controversy, for instance, some in the Bollywood fraternity embraced this view in their defense of Shetty. Noted Bollywood director Mahesh Bhatt declared, “When the mother of civilisation gets obsessed with trivia, you can be sure doom is around the corner.” Actress Celina Jaitley asked, “If she [Shetty] does not have an objection, why should others be bothered? She is above 18, is grown up and knows what she is doing. I really wonder what has happened to the world’s biggest democracy where every citizen has the right to expression and this reaction from fundamentalists groups is really uncalled for.” Shetty also seemed to endorse this view when she said, “I don’t want the Indian media and Indians to look foolish to the outside world.”

In a similar vein, former attorney general Soli J. Sorabjee criticized Judge Gupta for behaving like the “Taliban moral police,” and opined that “the order is unsustainable and makes us look ridiculous.”

The second response to the so-called clash between “Indian” and “Western” cultures is that of the fundamentalist zealot whose sole aim is somehow to defeat Western culture at its own game. Examples of this type of response abound in India; the over-zealous moral policing of the Gere-Shetty episode by Hindu “fundamentalist” groups like the Shiv Sena in the city of Mumbai, and the Bharatiya Janata Party (BJP) at the national level, being only the most recent. The strategy of the Hindu fundamentalist groups is all too evident. As Nandy puts it, the goal of the Hindu fundamentalists is to:

[D]econtaminate Hinduism of its folk elements … then give it additional teeth with the help of Western technology and secular statecraft, so that Hindus can take on, and ultimately defeat, all their external and internal enemies, if necessary, by liquidating all forms of ethnic plurality — first within Hinduism and then within India, to equal Western Man as a new ubermenschen.

Many liberal-minded Indians who are embarrassed by the political manipulation of religion by fundamentalists tend to classify the response of the Hindu right wing groups as “a retrogression into primitivism and as a pathology of traditions.” But look closely, argues Nandy, and there is nothing “fundamental” about the “fundamentalists.” The almost complete lack of tolerance of the fundamental principles of religion, and the inability to accept the diversity of cultural traditions, demonstrate how the Hindu right has morphed into a highly modern political machinery that seeks to create an “Indian” culture which not only equals but ultimately surpasses Western culture.

The third response of postcolonial Indians to the myth of a Western culture, writes Nandy, is a non-response. This (non)response emerges from a pragmatic recognition of the cultural and historical continuities and tensions between the “colonial” and the “postcolonial,” “Indian” and the “Western” or the “traditional” and the “modern.” This non-response, according to Nandy, is voiced by a majority in postcolonial India and is based on the belief that diverse cultures in India have known how to live with each other for centuries. This belief emerges from a cultural consensus that religion is not a tool for political manipulation but is a way of life with its own principles of tolerance.

The three responses outlined above are inextricably linked in the political, religious and cultural realms of everyday life in India. But, paradoxically enough, both the enthusiastic admirers of the “West” and their over-zealous opponents in the Hindu right wing would like to believe that the third response is merely a minority view. However, the non-response is clearly in evidence, as a majority of Indians ignored the controversy over the Gere-Shetty kiss and the protests organized by Hindu right wing groups fizzled out with a whimper – notwithstanding the excessive media coverage in India and abroad. But perhaps the most powerful impact of the non-response by a majority of Indians to the Gere-Shetty controversy has been that Judge Gupta (who issued the warrants against Shetty and Gere) was quietly transferred from his post in Jaipur to the small town of Kishangarh several hours away. A spokesman for the Court claimed that the transfer was “routine,” but he also said that Judge Gupta acted on a “frivolous” public interest litigation, and noted that the transfer order came from the state’s Chief Justice. Although it is not clear what effect the transfer will have on the Gere-Shetty case, one can only surmise that the judiciary has recognized that the non-response to the controversy is indeed a majority opinion in Indian public culture.

Endnotes
Effigies of Richard Gere burn in India
My dear Indian friends, I’m surprised: Gere
Richard Gere cannot do anything obscene
Gere has apologized: Shilpa
Nandy, Ashis. “The Politics of Secularism and the Recovery of Religious Tolerance,” Alternatives, XIII (1988): 186.
Ibid.
Gere has apologized: Shilpa
Indian judge who ordered Richard Gere’s arrest transferred: report
Nandy, 187.
Ibid.
Ibid., 188.

Image Credits:
1. Shilpa Shetty and Richard Gere





Television and the Work of Mourning

Peter Krause as Nate Fisher Jr.

“You gotta go through all of the necessary stages of grief. That’s how you honor what a person actually meant to you.” — Nate Fisher, Jr. in “In Case of Rapture”, Episode 2, Season 4 of Six Feet Under.

I had not planned on writing this column. Of course, I had planned on writing a column, just not one about missing a television series, particularly one whose episodes I can easily access on DVD and online. But you can’t always plan where life is going to take you, even when you know that a significant portion of it is coming to an end. And throughout the past five years Six Feet Under held a significant place in my life. It was that one show that I tried to never miss, the one that truly compelled me. When Alan Ball announced in 2004 that he would end his HBO series, fans prepared with predictions and all of the other chatter typical of fan boards. Much to my chagrin I would have to accept the loss of what one friend of mine called its “own special brand of fucked up melodrama.”

So as the fifth and final season ended with the Fisher family in mourning, we were somewhat prepared. To loyal audiences, Nate Fisher Jr.’s death wasn’t terribly shocking. After all, Nate had had a near-death experience that began at the end of season 2 and carried over into the beginning of season 3. And it was never clear whether or not his arteriovenous malformation was sufficiently corrected. Because the Fishers specialized in dose after dose of denial, anger, bargaining and depression, the three episodes of televised grief that followed Nate’s passing weren’t even that shocking. If anything, it was appropriate that in the final episode of the final season we see the Fishers accept the loss of Nate through an impromptu commemoration of his life. The gesture provided as much closure as one might expect from a series that specialized in providing audiences with that occasional unresolved death, the kind that reminds us that narrative is the necessary frustration for those who remain.

Six Feet Under: Nate’s Burial

Yet, after the series ended, what surprised me was how much I missed the show. And it wasn’t just me, as friends and acquaintances acted as if they, too, had lost a significant portion of their lives. As a media scholar, I had never underestimated television’s significance. What pages of research and speculation had not explained to me was why I would begin to compulsively review old episodes and settle into a familiar position on the couch in search of something to take Six Feet Under‘s place. Indeed, I was reminded that television, when it is important, brings to our lives a sort of mystical combination of everyday relevance and predictability that reveals the prosaic as simultaneously ordinary and illuminated. When a program is at its most noteworthy even the most pedestrian elements of life shine through. Since I moved to Ohio, Sunday night has meant that I would watch an HBO drama after my preps for Monday’s classes were finished. But Six Feet Under, more so than any other HBO “Sunday Night Program”, became a part of my life. But I am not here to praise Six Feet Under, I am here to mourn it.

Mourning, grief, that part of the human experience that most of us must endure in order to heal after a significant loss, contains complexities of memory that resist language and conventional “understanding”. Reviewing Derrida’s The Work of Mourning, Sorcha Fogerty notes that Derrida accepts:

“the challenge of making the impossible a possibility in mourning; i.e. (i) invoking the possibility of an interiorization of what can never be interiorized (in that the dead are both ‘within us’ but ‘not ours’); and (ii) establishing a language for the unspeakable work of mourning, of how to mourn and how to speak in mourning, how to contend with the intolerable choice between what appear to be the two betrayals of silence and speech. This leads to the central paradox of the work of mourning: that success fails and failure succeeds. This typically Derridean contradiction indicates that if we achieve in some way the successful interiorization (but this is impossible) of the other, we in fact fail, because then the other is no longer other, we are no longer respecting the other’s ‘otherness’ if we somehow draw the other into ourselves. And conversely, if we fail (which we are bound to), we succeed, because we have retained respect for the other as other.”

If this paradox helps explain anything with regard to television production, it may help us understand why so few spin-offs of “dead programs” ever measure up to the programs to whose legacy they are attempting to add. Perhaps the failure of a show like AfterMASH is not so much a confirmation of poor writing and less-than-compelling characters as it is a confirmation of the strength of its parent text. Indeed, successful spin-offs effectively distance themselves from their origins: Frasier succeeds in its failure to become yet another Cheers. Frasier invoked its past; however, its legacy depended on sufficiently exteriorizing its efficient cause so that it could become something sufficiently other than its originating textual universe.

I don’t want to spend much time debating the above proposition. Frankly, that would be the work of a lengthier paper, and fortunately, this column is not that. I am much more interested in exploring how media institutions address the many acts of memory that are fundamental to the experience of significant loss. To be sure, television contains multiple lessons about how we deal with the past. Yet, to invoke Raymond Williams, perhaps we should continue to look at the practices of social communication to understand what the development of televisual communication systems provides us:

“The true basis of this system had preceded the developments in technology. Then as now there was a major, indeed dominant, area of social communication, by word of mouth, within every kind of social group. In addition, then as now, there were specific institutions of that kind of communication which involves or is predicated on social teaching and control: churches, schools, assemblies and proclamations, direction in places of work. All these interacted with forms of communication within the family” (1974, 14-15).

If the work of the family and the church has been the primary site where the consideration of passing and loss took place, perhaps we need to think through what our many reactions to media change can teach us about our status as social animals of the late 20th and early 21st century.

It is no secret that by studying television we can learn how we have invested in the past. Derek Kompare points out in his book Rerun Nation that numerous historical and institutional issues have been fervently negotiated so that past television programming can be presented as a fundamental and valued commodity in American television culture. Yet the manner in which media and memory are processed goes far beyond the printing of DVD box sets and the process of off-network syndication. Online spaces such as Television Heaven claim “to preserve the memory of television programmes both past and present that the writers/reviewers either consider to be true classics, or have a lasting influence on what we watch or how we view the world around us.” And TV Land’s Caught on Camera web page promises us that we can “Hear the juiciest stories, relive the most touching moments, and find out some of the quirkiest facts — all straight from the source.”

TV Land’s ‘Caught on Camera’

If “celebrity reunions” provide audiences a chance to celebrate a past televisual memory, we should not forget that the verb “mourn” has an etymological connection to the Ancient Greek term mermēra, a term that means “to care or cherish.” Such reverence is evident not only in the letter-writing campaigns and online petitions circulated by fans in attempts to save their favorite programs, but also on the many chat boards. These online testaments are often maintained by producers with a vested interest in preserving a space for memorial. And if all this investment in “cyber cemeteries” feels just a bit uncanny, perhaps it is because there is a fine line between nostalgia and commemoration. Yet while the former longs to return home, the latter is mindful that there is no possible return as it calls the past into the present, and ritualistically moves forward.

Allow for the possibility that a fan board at JumptheShark.com could act as a sort of ritual space where tribute and longing intermingle, where numerous memorial acts may be composed. Take, for example, the following post on a JumpTheShark.com board regarding the 1980s television program Frank’s Place:

“I’ve been in mourning for the past 14 years for the best show ever aired. I petitioned my cable company to have BET placed into the line-up because it was the only place where I could see Frank’s Place. Unfortunately, by the time they complied, it was no longer being shown on BET. It was intelligent and wildly funny, unlike most series with predominantly black casts on today. Wish this show would be available on DVD because a visit to the Chez would be like a trip home.”

While the quote reveals a wish to go home, we should not simply conflate it with a nostalgic longing. The post also reveals that the viewer of Frank’s Place, like the mourner who lacks photographs and letters of a loved one, lacks the convenient mnemonic devices that many of the bereaved utilize in order to move through the process of loss. Of course, I am not claiming that by observing how we react to the loss of a television show we necessarily gain a finer understanding of what it means to mourn our brothers and sisters. Rather, I do believe that an honest observation of the way we react to the expiration of a television show offers us another chance to understand the complexities involved in the institution of television as a portion of our social fabric.

All of which brings me back to the question of what it means to “miss television”. When I informed one of my colleagues about the possibility of writing this column, she reacted by saying, “I think you should. I mean, I miss Buffy even though the final season kind of sucked and I have every available DVD.” Indeed, that was the very sentiment that confused me: even though I have every episode of Six Feet Under I still miss the show. More specifically, I miss the show’s particular rhythmic presence and ability to predictably surprise me about questions of death that I simply would have never asked.

For my money, the character I will miss the most is Nate Fisher, Jr. As the heart of the show, Nate grew from resentful to accepting over his five-season span and moved through more melodrama than anyone short of Job. Indeed, as the following testimonial posted on an HBO-maintained Six Feet Under fan board indicates, I am not alone:

“Our family will really miss the Fishers this next year. For the past three years, my teenage son (while in the ninth through twelfth grades) has said that his role model for being a man is a combination of Nate and David–and I am pleased with his choice.

“Nate was a stand-up guy. He didn’t want to be a funeral director; yet, when his family’s finances were in peril, he decided do do what he didn’t wish to do — but to do the ‘right thing’ and pitch in. And he found that he was empathetic and advocated for those grieving (in his profession and in his life as well) in a unique, skilled way. He was horrified at Brenda’s betrayals but tried to understand. He ‘did the right thing’ despite not loving Lisa and married her and TRIED to love her, tried to make their marriage work, despite her distance, her obsessiveness.

“Yes, he acted like a grieving person. A theme of SFU is about how we grieve. And we behave in ‘an unusual manner’ during grief.”

By ending the post, after substantially rehashing past narratives, with “Thank you for the best show we have ever seen. Thank you all,” our writer seems finally to have achieved some sense of closure, no matter how awkward and forced it may seem. But then again, so is grieving.

Work Cited:
Williams, R. (1974). Television: Technology and Cultural Form. Hanover, New Hampshire: Wesleyan University Press.

Image Credits:

1. Peter Krause as Nate Fisher Jr.

2. Six Feet Under: Nate’s Burial

3. TV Land’s ‘Caught on Camera’





When Mullahs Ride the Airwaves: Muslim Televangelists and the Saudi Connection

Dishes and Mosque

Dishes and Mosque


“Soccer is not an illicit form of entertainment, but when practiced in violation of shariah, then it is as abhorrent as any other sin…. When we fanatically love non-Muslim players who perform the sign of the cross upon entering or leaving the field…or when Muslim players imitate the pagan dance of famous infidel players when they score, or put forbidden things on their chests, that’s not acceptable.” The author of this soccer fatwa is Saudi cleric Muhammad Al-Munajid, speaking on the set of Islam’s powerful spokeschannel, Iqra’ TV.

Until recently, sheikhs like Al-Munajid were only able to reach their audience through audio and video recordings sold in Arab street black markets. Those who preached a rigorous interpretation of Islam had a minimal impact among fringe groups of Arab populations, but as satellite technology becomes greatly appealing to the religious and the secular alike, television channels with a strict religious message such as Iqra’ are quickly setting up shop. Inaugurated in 1998, Iqra’ is Saudi Arabia’s most recent and probably most effective campaign for spreading its Wahhabi doctrine, which the channel’s producers temper by saying on their website that their mission is to bring “the teachings of Islam into the homes and hearts of Arabs worldwide.” The Saudis take issue with the Wahhabi label because it makes them look less like the real Islam and more like a sect that is highly disputed in some respectable religious circles. But the systematic indoctrination of imams and financing of religious schools and mosques around the world reveal a rigid reading of Islam which forbids close interaction with non-Muslims and calls for the literal application of shariah laws across the region, including hand amputation for theft, sword beheading for capital crimes, and denying women any role in public life.

For years, Saudi Arabia has flaunted its generosity towards poor Muslim countries by building hospitals, schools, universities and mosques, even in Western Europe and the United States. According to Saudi officials, between 1975 and 2002, the Riyadh government spent more than $70 billion on Islamic projects around the world, excluding the millions of dollars volunteered by Saudi charity foundations and unidentified philanthropists. An estimated 80 percent of mosques in the U.S. are funded by Saudi Arabia, according to Sheik Muhammad Hisham Kabbani, head of the Islamic Supreme Council of America. While the funding of mosques and the ideological direction of those who frequent them do not necessarily correlate, the influence of the Saudis over the content of the sermons, the training of imams, and the substance of Islamic schools’ curricula is undeniable.

Religious spending per se is not the problem here; it is the extremist ideology promoted thanks to this cash availability that is disturbing. The voices of intransigent Islam are featured frequently on the airwaves of Iqra’, and their edicts are often consistent with the Wahhabi attempt to purge Islam of what is perceived as a foreign threat disguised as societal change. In fact, some of the messages on the channel can be extreme, like Saudi cleric Aed Al-Qarni’s recent on-air endorsement of suicide bombing. “Houses and young men must be sacrificed,” he says, “Throats must be slit and skulls must be shattered. This is the road to victory and to shahada (sacrifice). Oh brothers, the idolatrous Vietnamese, Cambodians, and South Africans….Nations with no calling or divine law make sacrifices–sacrificing people, blood, and souls. All the more so should we, the nation of Islam.” And some show moderators appear enlightened by their guests’ revelations, as when Egyptian historian Zaynab Abdel Aziz tells a show host that the “Vatican delegated the US to carry out 9/11.”

While religious platforms such as Iqra’ do not call for jihad bluntly, they contribute to an increasingly radicalized religious culture in the Arab world, making every facet of social, cultural, and economic life a religious issue in need of a fatwa. Fatwas range from whether Muslim women must comply with their husbands’ desire in bed even if they don’t want to, to why thieves’ hands should be amputated, to whether Muslims should shake hands with Jews. Iqra’ (literally: “recite” or “read in an intelligent way”) has found a fertile ground in a region still lacking basic political reforms and jaded by repetitious autocratic and corrupt regimes. For years, religious groups–mostly underground–in the Arab world have become the only viable alternative: when the health system fails customarily in these countries, Islamic groups with disposable cash can intervene with their own doctors for free; when schools educate poorly, the same groups offer their own teachers for free. In the wake of natural disasters like floods and earthquakes, religious groups often respond quickly and more efficiently than governments to help the victims and alleviate their losses, as was the case in the earthquakes of Algeria and last year’s floods of northeastern Morocco. The failure of secular regimes to provide minimum social welfare and secure political freedom in the region has steadily nurtured a new perception whereby the state benefits the elite while religion benefits the masses.

This is why the world of Arab media seems swamped with religious messages. But Arab audiences have by now evolved beyond the dull, state-owned, everything-is-fine television channels, so, in order to appeal to a more media-saturated audience, the producers of Iqra’ are flaunting their skills by making religious preaching less shabby and less threatening. The on-screen graphics and studio sets are comparable to those of entertainment television, but nothing is more alluring than the new look of Islamic scholars and sheikhs who do not always conform to the conventional image of a preacher in a mosque. In fact, many of these preachers and scholars wear suits and use softer tones than usual. Some are young and do not claim religious authority, like the channel’s superstar preacher Amr Khaled, a 38-year-old who hosts one of the most popular programs on Arab television, Sunaa al Hayat (Life Makers).

Khaled, who has become a household name across the Arab world, is seemingly an anomaly in the Saudi quest to popularize Wahhabism: he is young, a business accountant rather than a religious scholar, and he takes a somewhat liberal and tolerant approach to Islamic preaching. His fame at Iqra’ was preceded by a long showdown with Egyptian authorities, who expelled him from Egypt after his religious lectures became spiritual revelations for thousands of well-to-do women and youth in the country. His age, his modern look (jeans or a suit, clean-shaven), and his use of colloquial Arabic make him accessible to a young Arab audience extremely tired of the staid, disconnected sheikhs of Islam. But what makes Khaled’s message appealing to the Saudi channel Iqra’ is that it is liberal only in style and quite conservative in substance. During his lectures and discussions on the hijab, Khaled is rarely original in citing the reasons why Muslim women should be veiled. Women are the pillars of Islamic education, he says, and wearing the veil is a selfless gesture to protect the sanctity of the faith itself: “I think that the primary purpose of legislating hijab, other than preservation of virtue, is…to remind people in the street about Islam; there will be no way better than hijab.” Islam’s integrity, he says on his show, depends on the virtue of its women, and since their responsibility in the temptation of men is inevitable, veiling is a must, even if you don’t understand why. While Khaled’s message lacks originality and critical quality, his highly emotional, talk-show style delivers an innovative and soothing message: you can be pious and still remain modern and cool. And the Amr Khaled phenomenon has only just begun, despite already unprecedented television ratings for his show: five million viewers tune in to his weekly show, and his website records millions of hits daily.

By putting Khaled next to the old and conventional sheikhs, Iqra’s producers are hoping to change the moral path of young Arabs who are still deeply influenced by Western popular culture. Major Internet chat rooms in the region are teeming with testimonies, particularly from young women thanking Khaled for convincing them to put on the veil. Programming this year included not only talk shows and lectures but also dramas and cartoons. It is hard to quantify the impact of Khaled’s hip preaching and Iqra’s religious broadcasting, but religion has never been this popular from Cairo to Casablanca. At a time when political regimes in the region continuously fail their constituencies and Islam is the subject of humiliating headlines, Khaled and a wave of young preachers seem not only innovative but also vengeful, in a let’s-go-back-to-the-roots fashion. It is therefore not a surprise to find Saudi Arabia at the helm of this religious revival in disguise. Though Wahhabism may never become a preferred doctrine of Muslim Arabs, its signature of uncritical, exclusionary spirituality is quickly infiltrating Arab living rooms and delaying badly needed reforms in both religious interpretation and political rule.

The 30-year-old executive manager of Iqra’, Mohammad Hammam, likes to think of his channel as serving a double mission: to counter the post-September 11 image of Islam and to guide Muslims toward a better understanding of their own religion. Many of the ideas propagated from the channel’s sets, however, belie that mission. If there is one, it seems to be to flood the airwaves with a fatigued interpretation of religion, simply refurbished with funky jingles and beardless preachers.

Link
Iqra’ TV

Image Credits

1. Dishes and Mosque

Please feel free to comment.




War, Incendiary Media, and International Law (Part I)

War Protesters In Iraq

War Protesters in Iraq

In numerous major military conflicts of the past twenty years, of which the Iraqi war was/is the most recent, observers of international law have focused increasingly on the abuse of the media to engender violence, ethnic hatred, and even genocide. The media, particularly radio and the internet, have been identified as significant political tools for mass manipulation by dictatorial governments to drive deep-seated animosity between social and ethnic groups, resulting in an intense atmosphere of mistrust, misinformation, and devastating killings. Nationalistic and propagandistic constructions of ethnophobia in the media have helped shape wars and justify mass violence by pitting Serbs against Croats, Hutus against Tutsis, Muslims against Roman Catholics, and Iraqis against Kurds. What these media-influenced atrocities have made clear is that critical media studies must be reconfigured to respond to these and other crisis conditions.

The pre-conflict abuse of the media to inflame inter-ethnic differences is seen as the catalyst for war. Once warfare breaks out, the media can become a centerpiece of the struggle between factions that want to utilize the media to escalate hatred and spread fear against one another. In post-conflict times, with media infrastructures possibly destroyed, journalists killed or fled, and the entire media space quickly becoming a site of renewed struggle between the interim authority and remaining factions, critical questions urgently concern critical media studies from the perspective of international human rights law: To what extent should foreign agencies such as the EU, UN, USAID, etc. intervene in the post-conflict reconstruction of the media space in order to prevent it from being abused again, as well as to help produce and maintain public order? What is the legal basis in human rights law for such an intervention? How do different forms of intervention stand up to legal scrutiny for “managing” and even restricting the freedom of the press in the post-conflict state? How is the line drawn between a “media intervention” aimed at achieving urgent military goals of stabilization and peace-keeping, and a media intervention aimed at the longer-term development of a civil, human-rights-respecting society? In what ways do the perspectives differ among inter-governmental agencies, donor nations, and non-governmental organizations (e.g., journalist associations) regarding the legality of, and the actual protocol for, media intervention? What perspectives do they share, especially as benchmarked against international legal norms? This is the first of a three-part analysis that attempts to open up these questions and introduce to critical media studies practitioners a legal mode of analyzing media and warfare from a human rights perspective. This first piece outlines what media/information intervention is.

The most pressing legal and humanitarian consideration about the mass media, to which the whole question of media intervention is directed, is the profound problem of “hate speech.” The discussion of hate speech in human rights law has moved beyond the confines of racial discrimination in community settings and into the contexts of inter-ethnic violence, armed conflict, and genocide. Indeed, underpinning part of the mandate of the International Criminal Tribunal for Rwanda (ICTR) is the explicit association of the media with genocidal violence as well as the prosecution of media-generated hate speech. The legal definition of hate speech has been most clearly articulated in the International Covenant on Civil and Political Rights (ICCPR) and the International Convention on the Elimination of Racial Discrimination (ICERD). Article 20(2) of the ICCPR prohibits “advocacy of national, racial or religious hatred that constitutes incitement to discrimination, hostility or violence.” Article 4 of the ICERD defines racist speech as “ideas based on racial superiority or hatred, incitement to racial discrimination” and “propaganda activities, which promote and incite racial discrimination.” In addition, “direct and public incitement to commit genocide” is punishable pursuant to Article 3 of the Genocide Convention.

“We need to explore what can be done between the impossible everything and the unacceptable nothing. The political cost of doing everything is usually prohibitive. The moral cost of doing nothing is astronomical. If we accept that we are not going to do everything possible to stem a given conflict, what can we do to have as much impact as we are willing to have?” (Thompson, 2002, 41-42). In the passage above, Jamie Metzl, a key proponent of information intervention, describes the need for intervention as a moral obligation exercised in a context of limited influence. Media/information intervention refers to the means of getting involved in a humanitarian crisis where there is evidence that the mass media have been manipulated to incite hatred and violence. Where humanitarian intervention is taken to avert mass suffering, media intervention campaigns are designed to supplement that action. But where there is weak or even no political will to take action in crisis situations, media intervention campaigns are meant to marshal ideological force within the international community to confront the crisis. Such campaigns are supposed to adhere to human rights norms.

Regarding methods, information intervention can take place in pre-conflict, mid-conflict, and post-conflict times. Strategies such as broadcasting counter-information, dropping leaflets, and the most controversial of all, jamming broadcasting signals from the target state, are best applied in pre-conflict and mid-conflict times. As for after the conflict, reconstruction work typically calls for a robust “media development” program, which can include

  • human rights training and education of journalists
  • enhancement of independent local media outlets
  • setting up interim media commissions
  • establishing licensing mechanisms linked to hate speech laws and other codes of conduct to ensure quality, balanced programming
  • creating programs that promote inter-ethnic conversation
  • protecting the safety of journalists from intimidation and other violent threats
  • forging a monitoring role for the media during the transition to a stable government through elections
  • other activities to democratize the media sphere.

However, while the ultimate legality of such intervention methods created in the name of reconstruction will continue to be debated, the legal ground for more aggressive measures taken in times of imminent or present conflict, such as jamming broadcasting signals, techniques of information manipulation (such as cyberwar), the seizure of transmitters, or even the bombing of broadcasting towers, appears tenuous. These aggressive actions resemble the “use of force,” which is prohibited by the UN Charter and other long-standing international norms. Peter Krug and Monroe Price (2002) warn: “[T]he human rights rationale for what might be called ‘aggressive peacemaking’ and the intrusiveness into the zone of freedom of expression is a precarious one. [Moreover] [w]hen an international governmental organization engages in regulation of the press, its actions may affect the nature of the political system that follows. How a regulatory rule is shaped, how it is presented in the society, how those who will be subject to a seemingly censorial rule react and accept that rule–all these are part of the difficult process of democracy development in a conflict zone” (164). Certainly, it is one thing to prevent violence; it is another for the information intervention program to intrude upon the target state’s autonomous public sphere and even to exert influence and authority in the target state.

Not surprisingly, Jamie Metzl has been criticized for promoting “a more adroit spinning of United States foreign policy represent[ing] a fashionable means of enhancing United States predominance within the international system, using information technology” (Thompson, 2002, 56). It has been argued that the entire effort smacks of hegemonic intention under the guise of humanitarian intervention. In Part II, I will examine in closer detail the legal framework for scrutinizing media intervention according to international human rights norms.

Note
This list is compiled from several media development experiences in post-conflict Bosnia and Kosovo. See, among others, Pech (1999/2000) and Price (2000).

Sources

Krug, Peter, and Monroe Price. “A Module for Media Intervention.” Monroe E. Price and Mark Thompson, eds. Forging Peace: Intervention, Human Rights and the Management of Media Space. Edinburgh: Edinburgh UP, 2002. 148-74.

Pech, Laurent. “Is Dayton Falling? Reforming Media in Bosnia and Herzegovina.” International Journal of Communication Law and Policy 4 (1999/2000): 1-28.

Price, Monroe. “Intervention: Bosnia, the Dayton Accords, and the Seizure of Broadcasting Transmitters.” Cornell International Law Journal 33 (2000): 67-112.

Thompson, Mark. “Defining Information Intervention: An Interview with Jamie Metzl.” Forging Peace. 2002. 41-68.

Image Credits:

1. War Protesters in Iraq

Links:
Ferdinand Nahimana page on Trial Watch website
International Criminal Tribunal for Rwanda
Media Development in Post-war Iraq

Please feel free to comment.




Faith-Based Plot Initiatives

by: Mimi White / Northwestern University

Joan of Arcadia
Joan of Arcadia

What is the story with Joan of Arcadia? I don’t mean this literally, since the basic story (or perhaps more accurately, the “high concept”) is pretty clear: teen girl talks to God. I have been watching the show, off and on, and trying to make sense of it. Brief disclaimer: I watched the show occasionally during its first season on U.S. television and am also watching it now in Finland, where they are airing the first season. So I am not “up to date” with the current season. I have perused episode summaries online, and it seems that my concerns are still relevant, but I cannot be certain about this.

My confusion starts with the high concept at the heart of the show: I really don’t know how to take the God(s) that speak(s) to Joan. Is it an actual divine being, a figment of Joan’s imagination, or something else? In Touched by an Angel at least the angels were really supposed to be angels…really. The God(s) in Joan is not so clear. At some level, it seems to function as a faith-based initiative for generating plots, kind of like an inverse deus ex machina, where God shows up at the start, to get things going, instead of appearing in the nick of time to resolve dilemmas. Can’t figure out how to get Joan into awkward situations, interacting with different high school factions and misfits each week? Gee, let’s have God tell her to join the band (even though she doesn’t play an instrument) or the debate club or the cheerleading team.

But then I wonder about what sort of “God” would put a stumbling, angst-ridden teenager into such difficult situations week after week. I realize there is an ostensible moral, more or less, to be drawn from each situation: Joan doesn’t have to be perfect, as long as she is trying; she makes good or bad decisions and learns something new as a result; she learns about taking responsibility for her actions and exercising free will; etc. But having God in the picture seems to belie these very lessons. After all–and being quite literal here–by following God’s instructions, Joan is precisely not exercising free will, not taking responsibility, not figuring out things for herself. Instead she is obeying the direct orders of someone(s) she takes for an almighty power, even when doing so often places her in yet another impossible position, or embroils her in some new harebrained scheme. (At times–for example when Joan has promised to cook dinner at home, help a friend mount an art exhibit, and fetch her brother’s science notebook from another friend all at the same time–I imagine a sit-com version of the show, revising it, and Joan herself, along the lines of I Love Lucy instead of the angst-ridden domestic drama it is.)

Joan’s God (Gods?) seems to derive from some sort of multicultural Judeo-Christian tradition, broadly speaking, embodied in humans of varying races, genders, and ages. (There aren’t any flora, fauna, or inanimate objects that speak as God. The short-lived Wonderfalls did endow inanimate objects in a Niagara Falls gift shop with the power of directive advice.) This is also God without a specific religion, just a range of muddled, generic ideas about doing deeds, exercising free will, and so on. God(s) starts verging upon New Age spiritualism, as some kind of diffuse power, existing in everyone, or some sort of higher ethical consciousness.

Of course, God could be a metaphor, or an objective correlative, possibly for the interior state of an insecure teenager fitfully progressing toward womanhood (ick!), or something like that. Or Joan could be mentally ill, and hallucinating the Gods who speak to her. But in the larger context of the program, these explanations don’t quite fit. The first of these moots the whole point of bringing in God, specifically, in the first place (as opposed to something along the lines of the inanimate objects of Wonderfalls). As for mental illness, the program suggests that high school is generally an age of affective and cognitive extremes, if not full mental imbalance; everyone is a little crazy. But only Joan actually talks to God(s).

Joan of Arcadia
The cast of Joan of Arcadia

I could continue with my questions and confusions about the show, and about why the different ideas of what God is doing there (besides bossing Joan around) all seem inchoate or nonsensical. But there are reasons for my impulse to try to figure out what is going on. “God” clearly means many different and particular things to many people. Depending on whom you ask, God does or does not exist; God exists in manifold versions; God is (or is not) already a metaphor. The show implicates different versions of God at different times; it is almost impossible to “invent” God without invoking, referencing, or including/excluding at least some of the versions that already exist.

More specifically, here is a show that makes God a manifest part of its content at a time when the American President has publicly called for more faith-based initiatives and has even established an Office of Faith-Based and Community Initiatives in the White House (although there is no specific mention of television drama in its purview). In this context, the program might be expressing, or at least trying to tap into, this conservative socio-political discourse. In addition to featuring embodied God(s) as a recurring character, the show gives Joan a nuclear family that includes prominent representatives of repressive and ideological state apparatuses: her father is on the police force (originally brought to town as chief of police) and her mother is a high school teacher.

Yet even though this is not inaccurate, it still seems too pat. For example, the program’s general tenor of angst and disorder extends beyond the high school characters, into the socio-political world of Arcadia with the rampant political and police corruption that the father exposes, losing his job as police chief in the process. In some ways, at least, the program seems to question the quiescent conservatism that it also advances (in part by having God direct Joan’s fate). And when it comes to religion, the program undercuts its God as avatar of mainstream religion just as readily as it encourages the idea that religion, or at least some sort of faith in a higher being, is a meaningful force in the life of its eponymous heroine.

Maybe the show’s idea of God is just so half-baked that it isn’t even worth thinking about this much. (The inverse of this is something along the lines of “God is whatever you think it means,” yielding a quiescent liberal complement to its quiescent conservatism.) Maybe putting God quite literally in the picture is a means of giving the impression that there really is something substantive to think about. You start to wonder if it isn’t just a cheap gimmick. Perhaps it really is a faith-based initiative after all, or maybe even a perverse joke on the very idea of faith-based initiatives.

Image Credits:
1. Joan of Arcadia
2. The cast of Joan of Arcadia

Links
CBS.com – Joan of Arcadia
Joan of Arcadia Fansite
Office of Faith-Based and Community Initiatives

Please feel free to comment.




Pass the Remote!

by: Natalie Cannon, Zak Salih, and Angela Nemecek

Welcome to Flow’s latest experiment in academic discourse, Pass the Remote. Over the course of each bi-weekly issue of Flow, three or more scholars will exchange open letters on a topic of shared interest. Check back to see the discussion’s progress, and feel free to comment below. If you are interested in contributing to Pass the Remote, contact Christopher Lucas at clucas@mail.utexas.edu.

Pass the Remote Logo

Dear Zak and Angela,

A little over a year ago I got hooked on HBO’s new series, Carnivale. I liked the strangeness of the story — it was like a grittier Tim Burton movie — and I really enjoyed the artistic quality of the cinematography. After my coursework on “Disability and Freakery” last semester, I find a lot more in the show that catches my attention.

The interactions between freaks, or social others, and “normal” people, particularly in the first season of Carnivale, seem to beg for commentary and further study. The fact that the show has two main venues only complicates the query in a good way; it allows for comparisons between the carnival and the settled town of Mintern, which comprise the two branches of the story. All the characters in the traveling Carnivale branch of the show are represented as freaks of various kinds — bearded lady, lizard man with tail, whores, conjoined twins, disabled head roustie, and the dwarf, a voice of authority for the mutilated management. The protagonist Ben Hawkins is rescued by these carnies, and a lot of the tension that drives the first few episodes comes from how he tries (or fails) to fit in, and from the way he is made the butt of jokes and initiation pranks because they all assume that he is “normal,” which is an “other” to their freakish way of life.

Carnivale

Carnivale

The Mintern, CA branch of the show is set up as a “normal” story about holy but otherwise ordinary people — a preacher, his sister, their friends, and a congregation of townspeople and migrants. The odd thing is that the “normal” people are manipulative and turn out to be evil, while the carnies and freaks are the characters the audience most easily identifies with, and the ones who hold the tools to stop the evil. This raises the question: are we to think the other is the answer and the normal is the problem?

Regards,
Natalie Cannon
University of Virginia


Dear Natalie and Angela,

I second Natalie’s notion about the twisted definitions of “other” and “normal” throughout the two seasons of Carnivale. As to her question of whether the “other” is the perceived hero of this series in contrast with the evilness of what we would commonly consider “normal,” I would argue that it’s nearly impossible to arrive at any simple answer.

I think the qualifiers of good and evil in Carnivale are indeed based on these notions of “normal” and “other.” But as the show develops, we come to realize that what is normal and what is other is not based so much on physical characteristics but on actions and internalized characteristics. So, in a sense, Carnivale is validating these traditional notions of good and evil while trying to step outside them. Good and evil are based on character, not physical appearance – it’s a classic theme we’ve seen in numerous television shows, movies, and books. The other is still the problem and the normal is still the answer; that is, if we read Carnivale on a moral level as well as a visual level.

I guess the formula for the morality of this show would be this: visually, the other is the answer and the normal is the problem; morally, the “other” (the evil actions and intent of Brother Justin) is the problem and the “normal” (the good intentions of the prophet Ben Hawkins) is the answer. After all, don’t the common cultural codes tell us that good character is normal and acceptable while bad/evil character is abnormal and unacceptable? Isn’t Carnivale then just reaffirming these cultural codes, albeit under trickier circumstances?

Best,
Zak M. Salih
University of Virginia


Dear Zak and Natalie,

I agree with Zak’s assessment that Carnivale fundamentally reinscribes normative cultural codes, even as it plays with the slippage between outer and inner: morally good characters can look physically “deformed,” while morally bad characters can look physically “normal.” But the show adds a further twist, demanding that evil characters come to be physically altered by their evil.

Brother Justin, for example, shows consistent outward signs of evil — most notably, demonic pupil-less eyes. Indeed, this physical change is a cultural trope of evil; we come to expect this cue in everything from The Ring II to Buffy the Vampire Slayer, since it’s how we tell the “good” guys from the “bad” guys — or even how we tell the Doctor Jekyll/Mr. Hyde versions of the same character apart (Oz on Buffy, for example, gradually morphs into a werewolf when it’s that time of the month).

This physical metamorphosis indicating evil is most evident — and most permanent — in Brother Justin when he requests his gigantic tree tattoo, the Mark of the Usher. Once he has been literally “marked,” he is physically othered into the evil Other we have always known him to be. I’d argue that the viewer finds this form of othering quite satisfying, as now the character’s outside confirms his inside, and his identity is stabilized.

But this stability obviously complicates the status of other Others on the show, since it reaffirms the notion that physical otherness corresponds with moral otherness. In the end, does Carnivale only reify the stereotype that the physically different are morally reprehensible?

Angela Nemecek


Dear Angela and Zak,

While I agree, and enjoy, that Justin finally starts to look as bad visually as he is inwardly, I do not agree that Carnivale only reifies the stereotype that the physically different are morally reprehensible: at the end of Season Two, the ‘freaks’ are still visibly, physically different, yet they are the ones who emerge triumphant.

Perhaps the show is instead subverting that topos, appealing to the audience to look beyond appearances precisely because appearances have been proven deceiving: it was never exactly clear until nearly the end of the season who — Ben or Justin — is the dark one of the generation. They both kill, they both help people, they both are conflicted, and they frequently share the same nightmares, so the division between them as Good or Evil is hazy until Justin reveals his intent.

I would argue that Justin receives his changes as punishment for falling prey to the devil within, rather than being branded so as to physically become an ‘other’ for the audience. In support of this I would also offer that Ben, the established ‘Good Guy,’ heals the sick but not the freakishly disabled like Samson or any of his fellow carnies. He does not exhibit the least desire to do so, and because of these contrasting actions it seems that Carnivale is not operating the way David Mitchell’s “Narrative Prosthesis” theory accuses most representations of the disabled of operating. Mitchell claims that disabled characters are either killed or cured by the end, but in Carnivale they end the same as they were before, if not a little spiritually or morally purified.

Regards,
Natalie Cannon
University of Virginia


Dear Natalie and Angela,

With regard to Mitchell’s “narrative prosthesis,” which Natalie brought up at the end of her post, I’m left wondering how we can apply it to the culminating season (and, if the buzz on the web is true, possibly series) finale that aired nearly a month ago. Natalie points to spiritual and moral purification, an idea I find interesting when analyzed in light of Brother Justin’s death in the cornfield and subsequent resurrection at the hands of Sofie. It appears that Brother Justin has taken on the persona of a dramatic Christ figure — yet aren’t Christ figures commonly considered to be agents of good rather than evil (as we all three seem to agree that Justin is, indeed, morally deformed)?

This notion further complicates our reading of good and evil in Carnivale. Notice how entrenched Brother Justin is in the Church. Over the course of the two seasons, we have seen our traditional notions of faith and religion as moral forces skewed by the nefarious goings-on in Justin’s congregation (everything from the psychological torture of Rev. Balthus to shady allegiances with local politicians and grand baptism sequences that take on the tone of mass brainwashing). Given that Brother Justin is head of this particular religious camp, I’m left wondering what Carnivale is saying about the politics and morality of the devout Christianity on display here. In the same way that the outward/inward morality of the characters is skewed, so too do we see the same complications with Justin’s religious camp (what is normally a force for spiritual good is now a marketplace for evil and sin). It would seem, then, that in the universe of Carnivale, social institutions can be just as deformed and disabled as any carny.

If we consider Brother Justin to be a morally disabled character, then how does his demise/resurrection fit in with the aforementioned narrative prosthesis? He is resurrected, but we have yet to see whether his evil (his disability) has been cured or transplanted into Sofie, who possesses the same black, pupil-less eyes that Angela notes are a common trope of evil in popular entertainment. While his death at the hands of Ben Hawkins might seem to affirm the narrative prosthesis on a moral level, his resurrection further complicates matters.

Best,
Zak M. Salih
University of Virginia


Image Credits:
1. Carnivale

Please feel free to comment.




“Lost”

by: Allison McCracken / DePaul University

My subject for this first column is, appropriately, what we have lost and how we’re coping with that loss — on television, anyway. With a fall season marked by the popularity of programs entitled Without a Trace and Lost, the importance of loss as a televisual theme seems rather obvious. We can easily look back on the past few years for confirmation of this trend: competitive reality programs in which the “unchosen” disappear into the night, through a ritual cab ride (as in The Apprentice or The Bachelor) or simply by going “off” camera. Others like Wife Swap exploit fears of spousal disappearance, creating fractured families who long for reconnection. And death, not love, is certainly all around in the crime procedurals that dominate prime time. These programs litter our evenings with corpses, most often women or children, casualties in a domestic war that has no name. Invisible during their lives, such bodies become sites for investigation after their death, as professionals use the latest technology to probe their flesh for clues to their untimely demise. As hard as these investigators work, however, the “losses” continue to pile up. On the one hand, these programs serve as cautionary tales reinforcing the terror warnings: we must be fearful, we must be good consumers, we must not lose the game. If we make a mistake, we shall be erased. On the other hand, these programs also enact a revealing displacement: both domestically and internationally, America is making people disappear. While the “real” casualties of this administration are rarely represented on television, rituals of death are continually replayed and the sense of loss remains, haunting these texts.

In American prime time, such losses are not exclusively thematic; the industry itself has changed dramatically in the last four years, and the loss of socially progressive programming has been devastating to liberal producers and to the communities they serve. During the 2000 Presidential election, for example, I alternated between watching the returns and reading the reactions to them by Buffy fans online. That evening, we had all previously watched a new episode of the program’s 5th season, “Family,” in which Buffy producer Joss Whedon took a firm stance in support of gay couples, to the delight of fans. This year, one week before the Presidential election, Whedon unexpectedly shut down his television production company, Mutant Enemy, because, he said, “I have a bitter taste in my mouth with where tv has gone the past five years” (Variety, Oct. 24, 2004). Since the surprise cancellation of Angel this year, all of Mutant Enemy’s programs are now off the air, replaced by sometimes entertaining but largely reactionary boy-centered melodramas like Smallville, Everwood, The O.C., Jack and Bobby, and life as we know it. Aside from a few female-centered programs, none of which offers the innovations Buffy did, girls (and queers) have largely vanished from prime-time prominence, along with socially progressive agendas. Television’s experiments of the mid-to-late ’90s, which resulted in such gems as Freaks and Geeks, Homicide, My So-Called Life, Ellen, Oz, Once and Again, and Sex and the City, seem over. For their audiences, these texts represented a socially liberal space that enabled viewers to connect with alternative forms of community which may not have been available to them otherwise. Their loss (and the lack of comparable replacements) is a potentially profound one for many television viewers, who are no longer permitted the range of discussion or the opportunities for community that richer, more critical texts made available to them (and that producers like Whedon often encouraged).

It’s perhaps no surprise that, amidst such loss, prime-time television has turned to God (like many voters in this year’s election). While in the 90s Buffy re-appropriated religious symbols and icons to serve feminist and queer ends, and Oz acknowledged religious diversity and linked spiritual practices with broader humanitarian concerns, God has reappeared in more traditional forms in recent years, as a wise advisor or institutionalized icon. This shift to God in “straight” form has been particularly hard on female characters. The most obvious example is Joan of Arcadia, whose creator, Barbara Hall, rediscovered God after suffering a sexual assault. Hall created Joan so that adolescent girls and other viewers could turn to God in dealing with the perils of modern life. The program, however, often seems to have the opposite effect for Joan. God tells his handmaiden how to make everyone else’s life better except her own, which is continually disrupted by his bizarre requests (unsurprisingly, Joan is not permitted to know God’s reasons beforehand). Similarly, on the much-heralded new drama Jack and Bobby, future President Bobby recoils from his fiercely secular (and unfortunately shrill) mother to embrace religious life, paving the way to his becoming a minister. And last season on Everwood, local doctor Harold Abbott races to church to confess his sins after performing an abortion for a random teenage girl. While the girl herself never reappears, the point is clear: the fallen woman caused this good man to sin.

Alongside these literal references to God, the desire for supernatural or spiritual intervention has taken hold of more secular-seeming dramas as well, most notably J.J. Abrams’ Lost. Lost begins where most disaster films end — after the plane crash on the deserted tropical island. The program is particularly timely in that it deals both with lost people and with feelings of loss generally, especially for a liberal-minded middle-class audience. Lost represents many of those who are normally invisible as protagonists on television (non-Americans, non-Anglos, the disabled, the overweight, an Iraqi citizen, a drug user), but it also suggests the worldview of American liberals who feel stranded in a land in which they have lost social power, and who are haunted by past events which have brought them to where they are. This is a potentially powerful scenario, but Lost has resisted complex interrogations of liberal alienation or American social violence in favor of more comforting supernatural band-aids.

The most successful episode, “Walkabout,” found fans absolutely overjoyed and in tears when it was revealed that Terry O’Quinn’s wheelchair-bound character had been mysteriously healed by the plane crash. Even on such seemingly secular boards as televisionwithoutpity.com, religious rhetoric was plentiful as fans referred, some gingerly in quotation marks, to the “miracle” that had occurred. While the quotation marks indicate some possible discomfort with the term, especially in relation to a program coming from the generally more progressive Buffy writer David Fury and Alias/Felicity creator J.J. Abrams, they also suggest an increased willingness to entertain religious explanations. Indeed, a recent TV Guide poll revealed that 26% of viewers think that the “survivors” are actually all dead, and another 23% that they’re in “Purgatory” (TV Guide, 11/14/04). Perhaps more than anything else, this poll suggests the hopelessness of many audience members who seem willing to embrace, at least televisually, some sign of a divine, or at least an easy, solution to a depressing, perhaps intolerable, situation.

Remarkably, I find myself looking to a procedural for representations of the “disappeared” in which conditions of actual social violence are referenced. Without a Trace is unusual among today’s procedurals because it is the only crime program that consistently offers thoughtful characterizations, fallible detectives, failed investigations, and moments of progressive politics. The program recently departed from its procedural format to offer a fairly faithful adaptation of Barbara Ehrenreich’s social critique of the situation of low-wage working women (the episode titles are “Nickel and Dimed, Parts I and II”). In this case, the “vanished” woman is a single mother trying to make ends meet by working at “Everymart” and cleaning houses on her days off. Desperate for money for her son’s hearing aid, she works as a go-between for drug dealers, who kidnap and eventually kill her. Single female Detective Samantha Spade empathizes with the woman, putting her own life at risk to search for her by going undercover as a low-wage worker; Spade’s “break from common procedure” allows the program to further expose these women’s inhumane working and living conditions. In the episode’s thesis statement, the frustrated Spade angrily mourns the missing woman: “It shouldn’t have been so hard for her, you know? She deserved better. This isn’t about records or files or paper trails. The problem is she’s invisible. This woman has vanished into thin air, and if it weren’t for Jake, [her son], it wouldn’t have even made a ripple. I feel like things happen to people like her and no one notices and no one’s held accountable!” Spade’s critique is remarkable in that it exposes the blinders of our culture generally, well represented by television’s other procedurals — their devotion to “paper trails” and elaborate autopsies while larger structural causes are never addressed. While her outburst does not offer a divine or easy solution, it does significantly acknowledge the pervasive losses caused by our social system. And Spade does mourn these losses, at least for a television moment — and such moments may be the only “real” comfort television has to offer for the next four years.

Links
ABC’s Lost Home Page
CBS’s Without a Trace Page
Religion Online
Religion and The Mass Media: Bibliographic Database

Please feel free to comment.




Political Polarization and the New Hollywood Blockbuster

by: Frederick Wasser / Brooklyn College CUNY

The connection between politics and movies is continuous and yet murky. It is not to be found solely in the intentions of the creative team or in the overt themes of the plotline. The once-popular analysis of the link between politics and cinematic form, which argued that the jump cut is a blow against capitalism, now reads as a quaint reminder of the “60s.” Nonetheless the relationship between films and politics is immediate, perhaps because both involve large populations and both appropriate more and more cultural resources. The rather dramatic change in the American landscape, from the relatively consensual mainstream politics of the postwar era to the polarized, mutually incomprehensible positions of today, is bound to be reflected in films. I think the reflection may even change the way New Hollywood does business.

The term “blockbuster” was originally associated with the saturation bombing of World War Two, and certainly there was a mixture of shock and awe at the blockbusters of the past quarter century, along with some concern over their excesses. I would like to treat the blockbuster as a genre, although there is little unity of content. But there is enough sharing of formal features and marketing strategies to group these films together. Certainly they are all big-budget movies attempting to achieve “event” movie status. (At what point did someone come up with the term “event movie”?) The event movie is defined as that movie that I (and the rest of us) had to go to because everyone else was going to see it. Quite frankly, despite being in the film business and now in the academic film biz, and despite generally liking anything with sprocket holes and even things with time code, there are many, many event films I resented feeling obliged to attend. Like Ignatius Reilly in A Confederacy of Dunces, I took some minor satisfaction in railing at the minor credits in event movies. But the satisfaction was only minor. It was a bit of circular logic: I went because they were important; they were important because everybody else went.

Thus these films were particular evidence for the social importance of film. The marketing campaigns and the various formulations of new Hollywood sufficed to convince a population that this was a “must-see” movie. They were invariably from the major studios, except for the annual noble independent low-budget film that also achieved event status precisely because it was not from the majors. Think of sex, lies, and videotape (1989), The Crying Game (1992), and The Blair Witch Project (1999), to name a few.

Suddenly this year there are two event films with odd relations to the majors, neither one a major studio release and neither very independent. Both are driven by outside-the-box marketing strategies, and both are raising fierce hackles as well as exuberant praise. The dual arrival of The Passion of the Christ and Fahrenheit 9/11 inspires thoughts that we are at a sea change regarding the cultural status of movies or, at least, the blockbusters.

They revealed how much the film product of the last twenty, twenty-five years has been pitched at the entire audience, the entire globe. Media historians have asserted that the general movement for media is to go from general audiences to niche ones (Shaw). This statement only reveals how media historians constantly overlook films. The film industry had constructed niche audiences in the 1950s and 1960s in reaction to the loss of the habitual audience to television. But the 1970s new Hollywood blockbuster formula of cross media marketing was premised on attracting everyone everywhere. Statistics show that video and other ancillary markets meant that eventually all age groups were attracted back to the Hollywood product by the end of the 1980s. No niche audiences here as we headed into the apogee of Titanic (1997).

Yet Passion and Fahrenheit both struck like lightning bolts, within a few months of each other, at the cultural cleavage of America, and it is safe to say that only a few went to both movies (I did) and that even fewer liked both (I didn’t). Will new Hollywood be able to paper over this new culture war as well as it did the old ’60s culture war?

Fahrenheit 9/11

Michael Moore’s Fahrenheit 9/11

Fahrenheit has little in common with the blockbuster. It is a documentary framed around the central everyman character of Michael Moore, in a popularizing tradition that refers both to TV’s 60 Minutes (1968-present) and to Ross McElwee’s Sherman’s March (1986). But the recent turn of the documentary genre towards the popular still rarely attracts production funding from a major studio. It was a sign of extraordinary times that Moore was able to find such financing when the Miramax division of The Walt Disney Company gave his project the go-ahead. This division occasionally reverts to the kind of marginal projects it was famous for when it was still an independent company 12 years ago. But the parent Disney showed mainstream timidity when it sold the film rather than release it itself. It was another sign of the extraordinary times that Disney anticipated political retaliation from the Governor of Florida, who is the brother of the sitting president. Is the breakdown of American polity eroding the nominal autonomy of popular culture?

Passion has many features in common with the blockbuster and many that are not. It had a relatively modest budget for a blockbuster, but it was based on a well-known story that has been filmed many times before. Thus the production and the marketing needed to claim a unique status to compel a global audience to see this new version. Typically the new Hollywood blockbuster handles this problem of universal appeal by promising the audience something new within a comfortable and well-known formula. They do this by giving an “A” budget to a “B” script. Thus great efforts are made to assure the audience that they will have to make very little effort to believe in the reality of such impossible things as contemporary dinosaurs, men with super-human powers, or even the end of the world. All the audience has to do in return is to show up in numbers that surpass the usual clique of dinosaur fans or action hero enthusiasts.

In addition to the realism effect of heightened budgets, the new Hollywood likes to use universal villains such as sharks, treacherous lions, serial murderers, or creatures from outer space. While The Godfather (1972) angered some Italian-American groups and many action films have upset Middle Eastern groups with their choice of villains, these are viewed as lapses. After all, the economics of the blockbuster dictate that it cannot alienate any sizable segment of the global audience. Even those few films that touch on real-world controversies are designed to allow a great deal of flexibility of interpretation. Saving Private Ryan (1998) can serve as a recent example, since it was interpreted both as a corrective to the jingoism of earlier war movies and as a continuation of the same themes. Dances With Wolves (1990) was an earlier example of correcting earlier Western movie attitudes towards Native Americans without challenging the audience.

Thus we have two measures of the new Hollywood formula: heightened believability and the attempt at universal appeal, whether through the avoidance of cultural specifics or through a polyvalent attitude towards cultural divides.

From Mel Gibson’s The Passion of the Christ

Mel Gibson’s The Passion of the Christ

Passion of the Christ follows the former but not the latter.

Gibson placed heavy stress on making the audience believe in his film. Of course belief takes on added meaning when speaking of the gospel of a major religion. But Gibson didn’t just rely on pre-existing faith. He resorted to the tried-and-true tactics of new Hollywood to compel such faith. He emphasized action and he indulged in the use of a subjective camera. The combination of the two, particularly during the flagellation scenes, ensured the same kind of visceral roller-coaster ride that is the feature of most of new Hollywood’s blockbusters. Many critics have commented on the bloody shots of Jesus, as Gibson portrays the beating in detail. In addition to what is being portrayed, we should also note that it is being represented with all the hyperrealism of the blockbuster. The sound effects, particularly in the opening fight on the Mount of Olives, follow the exaggerations that were initially popularized in Raiders of the Lost Ark (1981). Gibson uses the camera in the blockbuster’s hyperbolic manner, even going so far as to turn the camera upside down to reproduce Jesus’ point of view of his torturers. This extreme visual, along with an extreme overhead shot at the end of the film and others, would not have been tolerated in old Hollywood but is relished in contemporary big-budget filmmaking.

The blockbuster’s commitment to visceral effects is well known. Jean-Pierre Geuens is convincing in his explanation of the motivation for such visceral effects. He borrows from William James’ psychology to note that there are two preliminary human reactions to a stimulus before the human can reflect, even emotionally, on that stimulus. For example, with a fire, there is the feeling of fire, and the flight from fire, before there is the reflective reaction of the fear of fire. Geuens notes that contemporary filmmakers aim to cause the first two reactions in their audience without concern for the third stage of a reflective emotion. This is precisely my critique of blockbusters in general, which give us little time to ponder the wonder of the worlds they create, and of Passion in particular, which gives us no time to reflect on the universal message of Jesus’ death. The extreme physicality bullies us into crying in reaction to the brutality, not into reflecting on the love of God’s only son undergoing the human pain of death.

So Gibson learned that lesson of the blockbuster, but he deviates radically from its universal appeal in his own marketing and positioning. He certainly took pride in his own radical brand of Catholicism and used it in the marketing of the movie. He did not seek compromise or consensus when scholars warned him that his script reopened old sectarian wounds. This is the only recent event film that purposefully drives a wedge between Christian and Jew, between fervent believers and rational religionists. This is not the formula of new Hollywood, which now goes to some lengths to attract audiences across social divides. Gibson’s deviation from Hollywood’s marketing is reflected in his self-financing, his use of a small independent distributor (Newmarket), and his extensive use of religious groups to give marketing clout to the movie.

Both Fahrenheit and Passion avoided the bland ambiguities and polyvalent plotlines of new Hollywood. They both became “must-see” movies despite major deviations from the blockbuster formula.

What is happening to the major studio blockbuster? Is it no longer an event?

Certainly not if we just look at the figures. Shrek 2 (2004) and Shark Tale (2004) are performing to the high standards of their studios. The Lord of the Rings trilogy (2001-2003) has been as eventful as Star Wars (1977). Still, I detect in the fractured American audience a glimmer of the down slope of the blockbuster cycle. Peter Bart of Variety also confirms a downward trend when he reports that a certain zest for gamesmanship has gone out of the movie executive suites in light of the 1998 summer release schedule. It stands to reason that the media moguls who own Viacom, News Corporation, and Sony are tired of their filmed entertainment divisions, which rarely contribute more than a fifth of revenue and yet never settle down into a nice, dependable, steady flow of revenue. Instead these divisions continue to operate with low profit margins and high stakes. The atmosphere following the success of Titanic seems to be more one of relief than jubilation. In 2002 Disney announced to Merrill Lynch that it would try to discipline its budget levels. Hardly the mantra of blockbuster moviemaking.

The big movies since the turn of the current century have been full-length animations. Certainly these are the movies that inspire Wall Street to buy stocks and invest money. Animation audiences are typically bimodal, with parents or grandparents bringing children to the theaters or buying the DVDs. These audiences are not the universal ones that blockbusters attracted at the height of the cycle a decade ago. The big live-action films are also turning towards a youthful audience with Harry Potter and The Lord of the Rings trilogy. J. Kapur has noticed that Harry Potter represents the big-budget movie genre turning towards “Englishness.” The Lord of the Rings has the same quality, albeit with New Zealanders representing its Englishness.

It stands to reason that America’s embrace of unilateralism would limit American film exporting. Perhaps we exaggerated the power of our appeal even before the latest Iraq invasion. Early reports in 2002, following the US invasion of Afghanistan, were that a liberated population was storming the cinemas to see Hollywood films. Later reports clarified that audiences were actually anxious to see the latest imports from “Bollywood,” not Hollywood. Our grip is slipping. I suspect that even a favorable result in the upcoming election will not restore the universal global audience to Hollywood, and that the fracture uncovered by Passion and Fahrenheit will lead to more niche-audience marketing for American films than before.

Works cited

Geuens, Jean-Pierre. (2000). Film Production Theory. Albany NY: State University of New York Press.

Kapur, Jyotsna. (2003, Spring). “Free market, branded imagination — Harry Potter and the commercialization of children’s culture.” Jump Cut: A Review of Contemporary Media. Online; accessed September 5, 2004.

Shaw, D. (1991, April 4). “Rise and Fall of American Mass Media: Roles of Technology and Leadership” second annual Roy W. Howard lecture presented at Indiana University Bloomington.

Links of Interest:

1. The Passion of the Christ

2. IMDB’s 100 worst films ever

3. 100 MOST DARING MOVIES EVER MADE

4. Best Movies

Image Credits:

1. Fahrenheit 9/11

2. The Passion of the Christ

Please feel free to comment.