In the Gif Space
Michael Newman / University of Wisconsin-Milwaukee



Maybe some of these gifs are familiar, playing on a silent loop in your mind and putting a smile on your face just from reading them described in words: crying Dawson gif, Jennifer Lawrence OK gif, Michael Jackson popcorn gif, blinking white guy gif, clapping Citizen Kane gif, Homer into the hedges gif, “This is Fine” dog gif, Kermit typing gif, Leo/Gatsby champagne toast gif, pizza delivery to a room on fire gif.

There’s a pleasure in the familiar when reliving such moments of emotional acuity and then reliving them again and again as they repeat. But these little loops are useful too, subbing in for verbal exchange in texts or tweets when emoji are too generic and words are elusive or just take work to summon. Language can be less potent, less immediately and fully expressive anyway. Gifs function in conversation as embodied emotional displays, making up for the lack of facial expressions and gestures online. ((Jackson Tolins and Patrawat Samermit, “GIFs as Embodied Enactments in Text-Mediated Conversation,” Research on Language and Social Interaction 49.2 (2016): 75-91.)) You can borrow a pair of rolling eyes from Krysten Ritter or Judge Judy to convey exasperation.

Gifs are available and easy to find using tools integrated into platforms like Twitter and the keyboards for messaging apps, which allow searching for specific content like “Drake” or “thumbs up” and suggest categories of reaction like “OMG,” “oh snap,” and “oops.”

These gifs come in many varieties, but the most familiar tend to be clips of commercial pop culture featuring figures who are already well known to gif sharers. SpongeBob, Liz Lemon, Kermit the Frog, Rihanna, Kim K, Britney, and the Minions are gif superstars. The Harry Potter and Star Trek movies, The Office, and Adventure Time furnish myriad clips. How did infectious loops extracted from movies and shows find their way into interpersonal communication, into tweetstorms, Facebook comments, and Slack convos, silently auto-playing on all of our screens? Turning brief bits of movies and television into gifs began and spread as a grassroots practice of media fandom within communities that function as gift economies (though the media industry has also gotten into the game of making its own gifs). ((Jennifer Gillan, “Textural Poaching Twin Peaks: The Audrey Horne Sweater Girl GIFs,” Series 2.2 (2016): 9-24; Graig Uhlin, “Playing in the Gif(t) Economy,” Games and Culture 9.6 (2014): 517-527; Lesley Willard, “Tumblr’s Gif Economy: The Promotional Function of Industrially Gifted Gifsets,” Flow, July 27, 2016.)) Like so much about mass media in the 21st century, the story of gifs is one of cultural power and agency see-sawing from industries to people and back again.

Giffing is still a practice of fans, remixing content into a new vernacular of repurposed moments. But the platforms where gifs are shared, the social networks and messaging apps, are integrating fun, user-friendly tools for search and categorization, so that most sharers are now distant from the original work of producing and circulating a gif. Many sharers might have little knowledge of the original context or authorship of the little loop used to convey “mic drop,” “smh,” or “bitch please.” The gif functions as a more semantically replete and richly connotative version of an emoji. ((Linda Huber, “Remix Culture and the Reaction Gif,” Gnovis: A Journal of Communication, Culture & Technology, February 25, 2015.)) Good gifs crop out extraneous parts of the frame, showing nothing but a feeling, like these gifs of Jonah Hill and Emma Watson. (Ever notice how gifs are rarely in the 16:9 ratio of most professional media content?)

What’s left is not so much a clip from a movie or a TV show (Get Him to the Greek and Jimmy Kimmel Live!, respectively) as an affect bomb you drop to express your reaction in a given moment. A gif can be many things, but this use is an expressive enactment: eyerolling, jawdropping, facepalming, shrugging, clapping, nodding, or hairflipping. The meaning is a feeling: ecstatic or pissed or judging you. Gifs don’t communicate information so much as expose interiority: this is me right now.

Both the platforms and the services facilitating gif-sharing are commercial, for-profit actors whose ultimate purpose is to exploit the sharing economy, collecting data about users and commodifying our attention. The story of gifs is exemplary of the digital age in popular media as users appropriate commercial media for their own uses and pleasures, only to have their participation and creativity, their communities of practice, milked as free labor producing commercial value for someone else.

The effects on the gif world go beyond commodification, shaping what kinds of gifs are made, used, and taken up as a shared resource of pop culture quotations. ((On commodification of gifs see Kate M. Miltner and Tim Highfield, “Never Gonna GIF You Up: Analyzing the Cultural Significance of the Animated GIF,” Social Media + Society 3.3 (2017).)) Making money off of gifs shapes which gifs become familiar and available. Just as Netflix is defining a new canon of television shows based on its available content, so are the gif companies enshrining a gif canon of loops that frequently rise to the top of searches and of the “trending” suggestions, like the exuberant twerking girl from Toddlers and Tiaras and the Fozzie Bear facepalm.

The magical randomness and beauty of discovery when it comes to gifs is fading as usage gets encrusted in convention, as gif sharers draw from familiar genres and tropes of giffing collected in a repository of reactions organized alphabetically, “agree” to “you got this.” What was subcultural and novel is becoming formulaic and predictable.

Two rival startups greasing the wheels of this gif mainstreaming are Giphy and Tenor, both with the agenda of owning “the gif space.” This means becoming a Google for emotions, facilitating a flow of expression rather than information. If Facebook owns the “social graph,” the gif startups want to draw an “emotional graph.”

Giphy is the better known and more highly valued of the two. It functions both as an online search engine for gifs, on its own site and within Twitter and Facebook, and as a brand partner for live television events, capturing thousands of moments of award shows, sporting events, and other content as it airs. It is also a gif studio making originals, such as bundles of gestures and expressions featuring actors from the CW series Riverdale, whose target audience of people too young to remember Hillary Clinton as First Lady overlaps with Giphy’s.

You are likely to have used a Tenor product if you have sent a gif in a text message; their GIF Keyboard search tool is used on millions of mobile devices.

Both Giphy and Tenor have vast libraries of gifs tagged by content and type and organized into “reaction” categories, and the billions of searches for these gifs could be fertile terrain for advertising. Tenor has sold the top result in searches for certain terms like “yolo” to sponsors promoting a new movie or a brand like Domino’s, which is to say, sometimes the gif you share is an ad. While this isn’t the apocryphal “subliminal advertising” of the past, it shares with other seamless or frictionless marketing strategies — branded entertainment, ads in search results — a subtlety and an integration into patterns of media usage that make you wonder how much consumers recognize that they’re targets of a persuasive campaign and a big data operation. ((On the history of subliminal advertising see Charles Acland, Swift Viewing: The Popular Life of Subliminal Influence (Durham: Duke UP, 2012); on branded entertainment see Cynthia B. Meyers, “Branded Entertainment Reshapes Media Ecosystem,” Media Industries Project, n.d.))

Reaction gifs illustrate the push-pull dynamics of participatory media as user networks are subsumed and exploited by commercial actors from legacy media giants to cool-kid startups. They occupy a blurry boundary space between media industries and grassroots communities. Giffers appropriate mass media, poaching fleeting moments for new purposes. Gifs can pay tribute and give users new opportunities to geek out over fan crushes and favorite scenes. In turn, as highly engaging, “relatable” media, gifs are ripe for being returned to the circuit of commercial exchange, offering another opportunity to extract value from the audience’s attention as corporate profit-seeking creeps into every once-private moment of life.

The gif canon is at once a new slang, ((Jason Eppink, “A Brief History of the GIF (So Far),” Journal of Visual Culture 13 (2014): 298-306, describes gifs as “a digital slang, a visual vocabulary unencumbered by authorship.”)) a lexicon of emotion expressions, a monument of fandoms, and a platform for monetizing billions of people’s conversations, enlivening them with popcorn munching, side-eye glancing, and happy dancing. As commerce and advertising colonize more and more of our online realms, we really should wonder: whose gif space is it?

Image Credits:
1. Michael Jackson
2. Donald Glover
3. Krysten Ritter
4. Judge Judy
5. Jonah Hill
6. Emma Watson
7. Honey Boo Boo
8. Fozzie Bear
9. Camila Mendes
10. Lili Reinhart
11. Madelaine Petsch
12. James Harden





Teach-Ins and Twitter
Michael Newman / University of Wisconsin-Milwaukee

Vietnam War era Teach-in, March 1965.

The first teach-in was held at the University of Michigan at Ann Arbor in 1965; the -in came from the sit-ins at lunch counters and other segregated public places of the 1950s and 60s where African-Americans demanded equality. The event was organized by professors, who gathered members of the campus community to protest the Vietnam War by teaching about the conflict, an alternative to a work stoppage. A teach-in was to be “a shrewd means of energizing the university without disrupting it.” ((Charles DeBenedetti, An American Ordeal: The Antiwar Movement of the Vietnam Era (Syracuse: Syracuse UP, 1990), 108.)) The overnight event was attended by thousands of students and hundreds of professors, and it was covered in the national news.

Many more teach-ins were staged on college campuses during these tumultuous years, when universities were at the center of movements for social justice and New Left politics. They were credited with making many students who had not been paying much attention to foreign affairs conscious of America’s involvement in Vietnam. A teach-in at Berkeley attracted as many as 30,000 participants (such estimates being contentious) and was a bona fide media event, broadcast on the radio, with folk singers and well-known figures like Dr. Benjamin Spock and Norman Mailer. There were also pro-war teach-ins, but whatever ideas were conveyed, the key purpose of a teach-in was nonviolent demonstration through pedagogy, building a platform for public intellectual discourse. Faculty and students along with members of the public could engage politically within the space of the university, and the university gave legitimacy to dissent over the war.

One statement at a 1965 teach-in provoked a huge controversy over academic freedom and political speech. Eugene D. Genovese, a history professor at Rutgers University, spoke these words at a teach-in on his campus: “Those of you who know me know that I am a Marxist and a Socialist. Therefore, unlike most of my distinguished colleagues here this morning, I do not fear or regret the impending Viet Cong victory in Vietnam. I welcome it.” This was quoted in the papers, and became an issue in the New Jersey gubernatorial race when the challenger called on the sitting governor to fire Prof. Genovese. In this instance, the academic freedom of an outspoken critic was protected when Rutgers’ president and Board of Governors took no action against him.

Teach-ins have not been common for several generations, but the engagement of university students and faculty in political debate perseveres, and in some ways it has a more public presence than ever. As they have for years, critical scholars speak publicly, appear on television and radio, and write for mainstream media outlets. But few platforms have the real-time immediacy of Twitter, or its potential to become the grounds for controversy. Twitter is a polarizing medium. Its ardent users really get it, and use it in ways that outsiders find irritating, confounding, or nonsensical. Twitter is many things for many people, but one way it is being used is very similar to the teach-ins of the 1960s. It is a platform for academics’ social and political criticism, with an often broader potential for spreading dissent than can be contained in a campus auditorium. But the same public soapbox that Twitter represents as a place for critical opinion, commentary, and information is also a bright potential target for the outrage machines seeking offending statements to call out.

The most famous case is that of Steven Salaita, and the tweets at the middle of his ordeal are, like the statements of the Vietnam protests, about a war that should be of concern to all Americans. The Israel-Gaza conflict in the summer of 2014 may have shed no American blood, but our government’s policies in the Middle East mean that we are never disinterested bystanders to clashes between Israelis and Palestinians. Salaita, who was hired to teach at the University of Illinois at Urbana-Champaign beginning in Fall 2014, tweeted passionately and with great urgency and anger about a catastrophic Israeli military assault. ((Some of these tweets are collected here.)) He was “unhired” that summer as a consequence of his tweets on the war and Israel’s occupation of Palestine, and his legal action against the university is ongoing.

He is not the only scholar whose tweeted statements on matters of public concern have drawn chilling responses from right-wing media as well as campus leaders. Saida Grundy, recently hired to teach at Boston University, faced conservative outrage over tweets she wrote about racial politics and American history, particularly for calling out white people. Sara Goldrick-Rab, a professor at the University of Wisconsin-Madison, was similarly targeted: for a tweet describing a conversation with her grandfather, who saw similarities between our Governor Walker and Adolf Hitler; for calling Walker and Republican lawmakers fascists; and for tweets at undergraduates about the UW System suffering cuts and changes to tenure and shared governance. It can hardly be random that those singled out for their critical, so-called offensive views are often women and people of color.

Twitter seems to invite clashes of contexts, and as a platform for dissent and public pedagogy it has clear virtues and limitations. To members of a community within the world of Twitter, who engage with one another on a regular basis and share each other’s codes and references, Twitter can sustain remarkably vigorous discussion and debate. The short character limit is a constraint (though it also makes for succinct writing), but detractors often miss the crucial point that you can tweet more than once. Twitter is a constant flow of ideas among communities just as much as it is a collection of very brief expressions by individuals. It’s also a medium typically used for conversation rather than polished, edited prose. To outsiders, however, tweets are easily excerpted from the flow, abstracted from their context, and run up the flagpole as banners of transgression. They said WHAT!? Because Twitter is so public, and tweets are so easy to quote and embed in stories, the public intellectuals using it are taking huge risks. This isn’t always good for movements critical of the current power structures, and yet it does help to spread a message. The potential for public pedagogy in scholarly tweets is promising, but there can be unreasonable costs.

And these costs can only be managed if our universities rededicate themselves to the fundamental values of shared governance and academic freedom. If only the leadership of the University of Illinois in 2014 had shown the same judgment as the Rutgers administrators and Governors in 1965. Much has changed since then, but one undeniable factor has to be the shrinking investment of the state in public education, and the privatization and corporatization of academe. The influence of pro-Israel donors on the UIUC administration and the Board of Trustees was undoubtedly a cause of Salaita’s unhiring.

My own University of Wisconsin System — where I earned two degrees and have taught for 18 years — has been one prominent example of a public institution weakening its academic freedom protections as it transforms from a thriving public trust into a corporatized and privatized shadow of itself. To be nimble, flexible, efficient, etc., our masters in state government, abetted by a friendly Board of Regents appointed by a very conservative governor, have dramatically diminished the faculty’s rights. In place of the nation’s strongest tenure and shared governance protections, enshrined in state statute, we have new conditions wherein layoffs can be made in the event of academic program changes imposed from on high. This weakening of our position might seem disconnected from the contemporaneous outrages over Salaita and other outspoken tweeting profs, but both are part of the same political process of curtailing the freedom to do critical or controversial work in higher education. Who on a UW System campus will feel free to speak out against an Israeli (or American) war, or will sustain research programs on stem cells or climate change, or will have confidence to criticize an administration’s complicity in shifting the costs of education from the public to the individual? Why would we feel confident that administrators will protect us?

Under current conditions, all kinds of pedagogy are under threat — our work in the traditional classroom as well as the public discourse of blogs, tweets, Facebook updates, Chronicle columns, and whatever else we do to share our knowledge and insight online or off. A real university needs not just to tolerate but to incubate critical, unpopular, and controversial ideas. The teach-ins of the 21st century, whatever forms they assume, will need the freedom even to outrage.

Image Credits:
1. Vietnam War era Teach-in, March 1965.





The Celebrity Sex Tape, Where Porn Meets Reality TV
Michael Z. Newman / University of Wisconsin – Milwaukee

A May 2008 Cover of Us Weekly

During the summer of 2007, between the second and third seasons of The Hills, a scandalous rumor spreads in the celebrity media. Lauren Conrad and Heidi Montag have been best friends, but when Heidi starts dating Spencer Pratt he predictably comes between them. The most believable reality is that Spencer, seeing a ticket to money and fame in his penchant for instigating conflict, feeds gossip outlets the lie that Lauren and her boyfriend Jason have a sex tape. When the cable television thread of the narrative picks up, Lauren confronts Heidi, who might even be innocent of these shenanigans: “You know what you did!”

There was no Lauren Conrad sex tape, but the very idea posed a grievous harm. And The Hills assumed that its viewers knew all about the summer’s true-life drama; Us had just splashed the characters on its cover along with the headline “THE PLOT TO DESTROY LAUREN.” Such drama relies, in the first place, on an understanding of celebrity sex tapes, what they are and what they mean. The audience for celebrity gossip can fill in for itself what one might look like. This imagined object is invested with volatile notions of propriety, privacy, and sexualized gender identity.

Like pornography, a sex tape consists of images forbidden in other media genres. Like reality TV, a sex tape publicizes the everyday lives of real people. Similar to both porn and reality TV, a sex tape offers a sense of heightened access to the real, a view of life we don’t see in mainstream entertainment. As amateur media, however, it is meant for the personal consumption of the creators, and lacks the style of commercial productions. The stolen or leaked sex tape appears to be a document of private passion rather than a product of the media industries. ((Minette Hillyer, “Sex in the Suburban: Porn, Home Movies, and the Live Action Performance of Love in Pam and Tommy Lee: Hardcore and Uncensored,” in Porn Studies, edited by Linda Williams (Durham, NC: Duke UP, 2004), 50-76.))

But a sex tape is not merely amateur porn, which also meets these criteria, and of which perhaps millions of hours exist. As a form of celebrity culture in particular, it at once intensifies and defies some of the characteristics of that ever-more-ubiquitous genre.

Intensifies: the celebrity sex tape offers intimate access to the personal life of the famous person, seemingly overcoming the filtering of publicity in ways similar to paparazzi photos, Instagram, and Twitter. Like much celebrity media now, the sex tape signals a loss of remoteness of public personages. In place of the mystique of privacy with calculated access through artifices of publicity, now we are offered the mystique of transparency and intimacy. Not only are stars just like us; they’re available to us in ordinary moments, the bread and butter of docusoaps. That the sex tape is generally represented as a forbidden entree into celebrity life makes it all the more fascinating as an inside view. When you watch a naked Paris Hilton grasping her boyfriend’s penis, you cannot help but marvel at the fact that this exists and you can look at it whenever you like. It’s reality TV without commercial constraints and contrivances.

A 2009 E! Promo for Keeping Up With the Kardashians

And defies: unlike the typical celebrity media of supermarket tabloids and sites like I’m Not Obsessed, the sex tape is aimed at straight men (though not exclusively). Along with lad mags like Maxim and babe blogs like Egotastic!, the circulation of videos of celebrities having sex represents a flipside of the usual focus of celebrity culture on the ideology of unattainable female beauty and the fantasy of a glamorous good life. The straight male celeb consumer — and despite gender proscription, straight men are consumers — is a voyeur taking sexual pleasure in looking. This dudely consumption promotes an aroused connoisseurship of bodies, bangin’ bods, simultaneous with a contempt for female agency or intellect. As in the cliché about effective casting of female parts: women want to be her, men want to fuck her. Now you see it, not a simulation or suggestion, not foreplay, careful angles, obscured breasts and genitals, polite cutting away. And unlike most celebrity culture, sex tapes are also marketed as straight porn. 1 Night in Paris was the bestselling adult DVD of 2005. However exploitative, that movie helped Paris Hilton establish her brand as a celebrity well known by a wider audience than many others.

The rise of sex tapes represents one kind of broadening of celebrity culture’s audience. It also stands as an example of the mainstreaming of a highly sexualized media culture at the meeting point of pornography and reality television. Some of the biggest sex tapes have starred women who maximize the intertextual appeal of reality TV and tabloids: particularly Paris Hilton (The Simple Life) and Kim Kardashian (Keeping Up with the Kardashians). These are women walking a thin line between glitzy Hollywood and trashy Vegas, quintessentially “famous for being famous.” But this insult never quite fits, as their fame comes from being hot and rich and connected — from being “celebutantes.” At least Pamela Anderson, one template for babe celebs, had a showbiz career before unintentionally becoming a porn star. The sex tape celebutantes are essentially bodies for sale, and are generally thought to be vapid — at least as star personae. That they have leveraged their bodies into good brands speaks not only of entrepreneurial savvy, but also of postfeminist culture that makes possible a media enterprise in which women sell their own sexualized image and identity to both female and male consumers. They maximize publicity not just in the sense of publicizing themselves, but of making public what is ordinarily private.

A sex tape made public by a woman’s ex-boyfriend violates her privacy. Ethically it appears to be black and white, and we take Paris and Kim at their word that this attention is humiliating. But at the same time, they might gain in esteem from being poised victims of betrayal, and also gain brand awareness. They make a show of consternation and regret, but have also been boosted into new levels of recognition. If their public identities are fashioned as bodies men want to get with, their images can be bolstered by representations of what it’s like. In the modern configuration of celebrity, women are selling porny images of themselves quite eagerly. The differences between erotic pictures in mainstream media, like David LaChapelle’s 2000 Vanity Fair photo of Paris Hilton, and explicit sex tapes might be a matter of degree. (( Thomas Fahy, “One Night in Paris (Hilton): Wealth, Celebrity, and the Politics of Humiliation,” in Pop-Porn: Pornography in American Culture, edited by Ann C. Hall and Mardia J. Bishop (Westport, CT: Praeger, 2007), 75-98. )) What sex tapes want in polish and modesty they make up for in authenticity, and celebrities are selling that all the time.

David LaChapelle’s 2000 Vanity Fair photo of Paris Hilton

Thus the flipped nature of sex tapes post-Paris: a wannabe star can actively embrace a kind of attention that was not long ago out of bounds, turning a sex tape into a product rather than by-product of self-sexualized celebrity branding. This has been the fictional conceit of some film and television representations, as on The L.A. Complex. Alicia, a struggling young dancer, takes a job making a sex tape produced to revive the career of a former child actor. It leads to an opportunity in porn, though she gives it up when offered a more legitimate job dancing. It is also the real-life story of Farrah Abraham, the reality TV/tabloid celebrity who comically denied having shot a porn video on purpose.

The negotiation of positive and negative values is more productively ambiguous with Kim Kardashian. In the first episode of Keeping Up with the Kardashians (2007), Kim and her family talk about the need to respond to questions about the sex tape when Kim goes on Tyra in her first major media appearance. At first mention, the Kardashians speak in sober tones of regret. Kim is loath to answer the questions, and her mom Kris conveys anger and frustration. But Kim and her sisters also joke about it. She says she made it “because I was horny and I felt like it,” and they giggle. Making a video recording of sex is hard to represent as a moral failure. Keeping Up exploits the notoriety that the sex tape brings its characters, and assumes the audience’s familiarity in ways quite similar to The Hills. While Kim performs her regret, she also works the advantage presented by any kind of fame. The sexualized celebrity of Kim Kardashian, the “princess” of her family, is at the heart of this series from the start.

The sex tape’s status as both taboo and a magnet for attention is essential to its role in the commercial exploitation of female celeb brands. Calling it a tape, even as digital video is now stored on microchips, speaks to the residual low cultural status of amateur porn associated with analog video, with its cheap connotations. ((On the enduring associations of high and low status with film and video in the digital age, see Michael Z. Newman, Video Revolutions: On the History of a Medium (New York: Columbia UP, 2014), 95-97.)) Sex tapes are dirty but true, shameful but lucrative. They humiliate their subjects while promoting them as desirable sex objects. They offer a deeper fantasy than the tabloids and reality series, and even than ordinary porn: ultimate access to the commodified celebrity body.

Image Credits:

1. A May 2008 Cover of Us Weekly
2. A 2009 E! Promo for Keeping Up With the Kardashians
3. David LaChapelle’s 2000 Vanity Fair photo of Paris Hilton





Immersive Media: Whose Fantasy?
Michael Z. Newman / University of Wisconsin – Milwaukee

Samsung’s exhibit featuring ultra-HD TV sets at CES 2014, Las Vegas

LG’s booth at the 2014 Consumer Electronics Show in Las Vegas is filled with screens of many sizes and appeals. A sales rep in his twenties is going through a 30-second spiel selling a gadget his employer is eager for us to talk about. “What happens,” he explains, “is that it will immerse you in it.” What will do this to me is a 6-inch smartphone. LG’s pitch isn’t size per se, but the novel curved form of this new rectangle, which inclines at either end like the edges of a shallow bowl. Pressed to describe the benefits of a mobile device that isn’t a flat tile like all the others, he leans on his talking point: “the actual feeling of it, the immersive part.”

Immersive, a tech buzzword of the moment, might seem to describe a pleasurable state of absorption, something like the “flow” or being “in the zone” that people often crave. Just as the business of consumer electronics wants to be selling new products all the time, it also likes to have new language to sell them, and immersive speaks to current hopes and fears. Along with it come terms like smart and connected, and novel product categories like wearables have names insiders seem to take pleasure in saying. Like interactive and multimedia, two of yesteryear’s sizzling tech terms, these are power-packed units of a language that speaks through individuals, corporations, and their publicity battalions in support of an ideology of consumerist leisure. The ultimate goal of those displaying immersive (or smart, or whatever) on walls inside CES, repeating it ad nauseam as a talking point in exhibit demos, and hyping it in keynotes, is to sell this year’s wares while making last year’s undesirable. The immersive artifact could be a smartphone, a television, a gaming headset, a hi-res audio player, or a website or app; to be immersed in it might mean to become lost in limitless pleasures of information and entertainment.

“Immersive” 4K Samsung sets

Of the many products described as immersive at this year’s CES, the ones getting the heaviest buzz and craziest crowds were the TVs. Some are curved like the smartphone (and many cinema screens today), but the bigger selling points are their ultra-HD picture resolution in the 4K format (next year maybe they’ll be 8K), their enormous size, and in some instances their OLED screens, which amplify contrast and present true color values. Some are 21×9, which will please cinematographers and aspect ratio connoisseurs. These televisions are so big you supposedly build your house around them. One 85-inch UHD currently for sale fetches $40,000 (cue amusing Amazon reviews).

The 4K format has twice the linear resolution, and four times the pixel count, of the current standard 1080 HD, which is similar to the 2K format of many digital cinema projectors. The pictures look good big. On the TVs I saw, images were very bright, colorful, and crisp.
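(A quick back-of-the-envelope check, using the published panel specs rather than anything measured at the show: a 1080 HD frame is 1920 × 1080, about 2.1 million pixels, while a 4K UHD frame is 3840 × 2160, about 8.3 million. That is twice the pixels along each dimension and four times overall.)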

Demo content on UHD TV sets emphasizing natural imagery to show off the lifelike representation of the format

No one said it explicitly, but what I heard in descriptions of these images was that the hyperrealism of UHD is supposed to be transporting and illusionistic. The beholder might mistake the image for reality, and feel that they are present in the representation. Little commercial media has been produced at 4K resolution, and demo imagery was prepared specially for the show. Shots of the Vegas cityscape and natural wonders (even food porn) might prompt thoughts about just how different it would really be to see these sights unmediated. I will admit to having one of those “I’ve never seen that” reactions, though maybe I’m easily impressed, and it was my first time at CES.

Sony’s immersive living space has projectors for throwing UHD images on tables and ceilings as well as walls

This quality is intensified in Sony’s Life Space UX demo, a model apartment where I waited in a queue outside for my turn inside the immersive video mockup. The centerpiece is Sony’s short throw 4K projector, which can show an image of up to 147 inches issuing from a console sitting against the very wall on which the picture is displayed. The same room has projectors by the pot lights and pendants throwing UHD images onto a tabletop (which becomes a touch interface computer display) and even above the sofa where the spectator sits. As an image shot on a Paris street is projected on the wall, the starry sky above is represented on the ceiling, and sounds of the location are amplified by the room’s stereo system.

CES attendees trying out a flying and shooting game using the Oculus Rift headset

The sense of being enveloped by video is undoubtedly strongest, according to the logic of immersive tech, in gaming that uses the Oculus Rift virtual reality headset in place of the usual screens. I was too impatient to wait in the long line at Intel’s corner of the Convention Center, but I hung around as a booth rep gave a bombastic monologue. Four CES attendees sat playing a fighter pilot game with VR headsets strapped on their faces, and above them Intel’s man addressed a throng of onlookers. After assuring us that only because “Intel is inside” is the product so cool, he continued, gesturing at the players whose bodies were beside us but whose souls were evidently elsewhere: “They are Tom Cruise! How immersive and how real it feels!”

It’s not only the moving image that the tech world thinks of as immersive. The same idea has been used to promote the redesign of the New York Times website, which has replaced page navigation with scrolling and accommodates more multimedia elements with bigger images. The Times would like to engage readers deeply, the better to serve them advertising. The old model of chasing pageviews has been abandoned as new metrics and conceptions of audience value have emerged. The idea of this audience’s behavior is not so different from the video or gaming scenario, though: the individual is arrested by the technology and design of media, ushered into a pleasurable state of absorption in representation, full of attention and focus. Sometimes tech hype about users and their desired habits of attention becomes confused by its own rhetoric. When Marissa Mayer, addressing a packed theater for her keynote, pitched a Yahoo! news app’s content as both “bite-size” and “immersive,” I wondered how she got from A to B. But the underlying message is consistent: your precious users will never leave you. They will lose themselves in your product and be yours forever.

The owner of a UHD set may find value in being immersed or transported, perhaps even in fooling the eye that a two-dimensional array of pixels is real, though the product is too new to know what social significance, if any, it will come to have. This type of TV set is being sold as the new hot thing not to satisfy any widespread public clamoring for more vivid and lifelike video, but because of other products’ recent failures. The inability of 3D and “smart” televisions to coax consumers into upgrading 5 or 10 year-old HD sets means that the companies exhibiting at CES need something else to sell, something new to fill the tech press’s pages. Perhaps picture quality will succeed in persuading early adopters unimpressed by stereoscopy and web widgets. Recent trends are not encouraging for sellers of TV sets, and at least for a time they will face the same problem as they did with 3D: insufficient content in the new format. Even if producers shoot in UHD and distributors make it available, streaming jumbo-sized 4K files might not work without the fastest connections. To be very profitable, the electronics trade dreams of mass TV set replacement on a cycle of just a few years, and its current mandate is to set new standards of picture quality that render our 1080 sets obsolete.

Intel’s exhibit at CES highlights the immersive potential of video games played with VR displays

Establishing a more immersive video standard is more likely to be the wish of the tech industry than its customers. While their interests are only partially overlapping, the electronics and media trades alike are eager to command the full, perpetual attention of consumers, and to keep them spending more and more on leisure, even at a time of economic stagnation for all segments of the population except the rich. Many companies want to win viewers back to the living room TV set or the game console from the ubiquitous mobile screens and computer displays used for watching web video, playing games, and texting and tweeting; many others want to keep us hooked on monthly subscription services with high and rising fees. Content companies of all kinds want a large audience dedicated to them, unable to live without their offerings. When we hear of immersive experiences we might think of losing ourselves in the pleasures of watching, or playing, or reading–or scrolling through, or interacting with–media. Maybe technology and design are crucial to such states of immersion. But it will be, first of all, to the mutual benefit of the media and electronics firms if we concede the terms of our pleasure to their ideals of technological sophistication and expectations of audience activity, and subsume our fantasies in theirs.

Image Credits:

1. Samsung’s exhibit featuring ultra-HD TV sets at CES 2014, Las Vegas – from author’s personal collection
2. “Immersive” 4K Samsung sets – from author’s personal collection
3. Demo content on UHD TV sets emphasizing natural imagery to show off the lifelike representation of the format – from author’s personal collection
4. Sony’s immersive living space has projectors for throwing UHD images on tables and ceilings as well as walls – from author’s personal collection
5. CES attendees trying out a flying and shooting game using the Oculus Rift headset – from author’s personal collection
6. Intel’s exhibit at CES highlights the immersive potential of video games played with VR displays – from author’s personal collection





When Television Marries Computer
Michael Z. Newman / University of Wisconsin – Milwaukee

A screenshot from Atari’s Breakout

“When Television Marries Computer” was the headline of a 1983 New York Times review of Pilgrim in the Microworld, a book about David Sudnow’s experience playing the Atari game Breakout. The review concludes that the marriage is “to go the way of the hula hoop, Rubik’s Cube and the pinball machine,” but there can be no question decades later that the couple is still happy together, and not only in the form of video games. YouTube, Netflix, DVRs, and dozens of other technologies and services combine television and computer. And despite the implied novelty, the TV set and computer were hardly newlyweds in 1983. It was not uncommon in the 1960s and 70s to read of a near future of televisions hooked into computer networks that would transform them into electronic newspapers and libraries, nodes for connecting to the whole world’s information. By 1983, many televisions in several countries were connected to videotex services that delivered news and other information via computer and could be used for two-way communication. Raster-scanning CRT video displays, basically television sets minus the broadcast receiver, were often employed as computer terminal output in the 1960s and 70s. And video games and computers plugged into televisions in the home had been commonplace objects for several years by 1983.

The 1970s was a particularly fecund period for the marriage of television and computer. The emerging culture of home or personal computing, which was inseparable from the development of video games, saw the television set as a crucial component of its ensemble of technologies. We might not realize today what a big deal it was when people then were able, for instance, to type letters and see them at once appear on a television screen. More exciting yet was a computer that would be under the real-time, interactive control of its individual user. Computers over the years had many input/output devices, from punch cards and blinking lights to teletype and light pens. But during the 70s, the standard components of the personal computer using a silicon microchip came to be a QWERTY keyboard and a CRT display — often actually a television set.

The cover of Radio Electronics from September 1973

A device predating consumer-market home computers like the Apple II and Commodore PET was the TV Typewriter, which could be assembled to display text on a CRT screen and could be used as the input/output for a microcomputer built from a kit. The TV Typewriter was called “the opening shot of the computer revolution,” and it helped establish the CRT as the standard output of PCs, in contrast to the teletype used with the minicomputer. ((Paul E. Ceruzzi, A History of Modern Computing, 2nd ed. (Cambridge, MA: MIT P, 2003), 226.)) Video games, which take their name from the televisual apparatus, were the most frequent use of microcomputers before they became integrated into the standard array of workplace tools.

Inscribed in such devices was a utopian ideal of appropriating the computer, long seen as an instrument of institutional control and depersonalization, as a toy for creative individuals to play with in their own personal ways. ((Thomas Streeter, The Net Effect: Romanticism, Capitalism, and the Internet (New York: New York UP, 2011).)) But these objects also express the fantasy of transforming the television set, long identified with network broadcasting and commercial mass media, and remaking it as a participatory rather than one-way technology. The reputation of television as a cause of serious, widespread social problems, at best a fancy time-killing machine, made for idealistic rhetoric about the potential of new technology — from cable and videotape to games and computer networks — to redeem TV from its debased status. Popular rhetoric of the time mobilized citizens to talk back to TV, agitated for public access to the airwaves, and called on the networks to make television function more in the interests of participatory democracy.

Innovators of the 1970s whom we might credit with standing alongside television and computer at their wedding (or at least with fixing them up) often imagined the union in terms of TV marrying up. Video games’ inventors, including Ralph Baer and Nolan Bushnell, described their motivation as providing something to do with a TV set other than watch network programs. ((Tristan Donovan, Replay: The History of Video Games (East Sussex, UK: Yellow Ant Media, 2010), 8; Judy Klemesrud, “Bang! Boing! Ping! It’s King Pong,” New York Times, April 24, 1978.)) Steve Wozniak narrates in his autobiography how excited he was by Pong, which was interesting not so much as a game but as a way of manipulating an image on a TV screen, a key appeal of the Apple computers Wozniak subsequently created. ((Steve Wozniak with Gina Smith, iWoz: How I Invented the Personal Computer, Co-Founded Apple, and Had Fun Doing It (New York: W.W. Norton & Co., 2006), 140.))

Representations of personal computers in the popular press might contrast them with television in terms of value and legitimacy, as in a 1977 Esquire story about computer hobbyists playing the game Star Trek on their workshop-assembled computers. The best thing to be said for Star Trek was that it was “not quite so moronic as the television series from which it borrows its name.” ((Timothy Ferris, “Solid State Fun,” Esquire (March 1977), 101, 121-124.)) An early 80s ad for the Commodore 64 presented the computer as something “on television tonight,” organizing its variety of programs into half-hour time-slots as in television listings. Computers offered programming preferable to whatever was available over the air.

A Commodore 64 Advertisement

The fact that a television screen was an essential component of the new artifact called a home computer made for inevitable comparisons between rival uses of the hardware. One of the most poetic and prophetic voices of home computer culture was Ted Nelson, and while he is most often noted today for originating the term hypertext, much of his 1974 book Computer Lib/Dream Machines was devoted to the utopian, liberating possibilities of the CRT as a graphical display for computers. ((Computer Lib/Dream Machines can be a challenging book to cite, as it has two halves with separate pagination, and it underwent multiple editions. I am citing page numbers parenthetically from the Dream Machines half of a volume which is both the “first edition” and the “Ninth printing, September 1983,” which contains some post-1974 pages.)) Dream Machines prophesies future applications of computers with screens as playthings and tools of communication, “to help people write, think and show.” (2)

One of the covers of Dream Machines

The cover pictures a Superman in sandals floating toward a white-hot screen, entering a paradise of interactive communication through images. Nelson was no fan of TV, arguing that it had hardly progressed since the 50s and that “the bulk of television is about as bad as it ever was.” But he was exhilarated by the prospect of using CRT screens, “lightning in a bottle” (6), in interactive computing allowing users to make pictures imaginatively and change them through their input. “IF COMPUTERS ARE THE WAVE OF THE FUTURE,” he proclaims in caps lock, “DISPLAYS ARE THE SURFBOARDS.” (22) Nelson’s vision of hypertext was not strictly verbal, combining multiple media for diversion and information. It was to be the combination of

a book, a TV show and a penny arcade…a vast tapestry of information, all in plain English (spiced with a few magic tricks on the screen), which the reader may attack and play for the things he wants, branching and jumping on the screen, using simple controls as if he were driving a car…A person is writing to other people, just as before, but on magical paper he can cut up and tie in knots and fly around on. (58)

This techno-utopian vision was to turn television into computerized play, substituting a creative individual’s explorations for commercial programming and transmission. Television’s electronic scanning imagery, by the terms of this imagining, becomes magical and transformative, a far cry from the mass media fare of the networks.

As with many futuristic fantasies, there is a mixture in Dream Machines of outlandish absurdity and profound vision. Today’s computer screens do much of what Nelson imagined four decades ago even if they rarely seem like magical paper to fly around on. But what I find particularly salient in the historical continuities of TV-computer convergence thinking is the status of network-era US television broadcasting as the enduring bad object, the institution in woeful need of technological fixing up. Today we valorize our contemporary interventions into television at the expense of the past, reproducing ideological assumptions about mass media that have changed little in more than half a century.

In a recent technology-themed issue, The Atlantic published a list of its “50 Greatest Breakthroughs Since the Wheel.” While the internet comes in at number 9 and radio at 29, TV is only 45th greatest. The magazine’s explanation for the low score speaks to historical continuity in convergence rhetoric when it comes to television:

Many years from now, perhaps people will regard the second half of the 20th century as the brief moment when broadcast TV could seem a dominant technology. With its obvious-in-retrospect limitations, like one-way information flow rather than interactivity, and dependence on heavy hardware for best display, maybe TV was bound to be a transition to some other system more tailored to individual tastes. Or maybe our panelists were embarrassed to vote for it.

The presentism here is familiar to the point of cliché from discourses of television’s cultural legitimation, in which interactive technologies appear to give users newfound agency and thereby improve TV’s status. The limitations The Atlantic perceives were widely felt in years past, as I have tried to illustrate. These limitations have often figured into the logic behind television’s marriage with computers. Sometimes the things we think are definitively of our own age turn out, upon historical inquiry, to be recurring nightmares or fantasies.

Image Credits:

1. A screenshot from Atari’s Breakout
2. The cover of Radio Electronics from September 1973 – from author’s personal collection
3. A Commodore 64 Advertisement – from author’s personal collection
4. One of the covers of Dream Machines – from author’s personal collection





Flow Favorites: The Bronze Fonz
Michael Z. Newman / University of Wisconsin-Milwaukee

Full Title: Flow Favorites: The Bronze Fonz: Public Art/Popular Culture in Milwaukee, Wisconsin


Every few years, Flow’s editors select our favorite columns of the last few volumes. We’ve added special introductions to these important pieces, and appended the original comments to the piece below. Enjoy!

Flow Senior Editor Mabel Rosenheck:
One trend we’ve seen more and more of at Flow in the last few volumes is the discussion of media in public spaces. Whether digital art in the subway via the mobile phone, or online archives of urban indie rock performances, the city, like the music, is simultaneously “un-mediated and hyper-mediated.” In addition to Michela Ardizzoni’s and Ben Aslinger’s articles, my favorite example is Michael Z. Newman’s “The Bronze Fonz,” which explores not only the relationship between art and popular culture, but also that between cultural memory and urban space. Through the Bronze Fonz, Newman points to the palimpsests inherent not only in media or the city, but in everyday life.

The Bronze Fonz

In August, the latest in a series of recent public artworks in the U.S. honoring TV and movie characters was installed in downtown Milwaukee, Wisconsin: a life-size bronze statue of Arthur Fonzarelli – Fonzie, The Fonz – with his signature two thumbs up. Other statues, most sponsored by TV Land, have included Rocky Balboa at the Philadelphia Museum of Art, Ralph Kramden at the Port Authority Bus Terminal in New York City, and Mary Richards tossing her tam skyward in Minneapolis.

The Fonz of course was a main character in the ABC sitcom Happy Days; his black motorcycle jacket was long ago enshrined at the Smithsonian alongside Archie and Edith Bunker’s chairs. Although shot in Los Angeles in the 1970s, Happy Days was set in Milwaukee in the 1950s, as was its spinoff Laverne & Shirley. Henry Winkler, who played Fonzie, and much of the rest of the Happy Days cast “returned” to Milwaukee for the unveiling. A crowd showed up. Speeches were made. Photos were snapped. Now the statue is a point on the city’s tourist itinerary, along with brewery tours and museums and the pretty lakeshore.

[youtube]http://www.youtube.com/watch?v=mWo0WE7cPjU&feature=related[/youtube]

The announcement in 2007 of plans for the Bronze Fonz was the occasion for more than just curiosity and excitement. Within the city’s arts community, the statue was an object of controversy and scorn. Rather than a project spurred by local arts institutions, the Bronze Fonz was the product of funds raised by Visit Milwaukee, a non-profit agency whose mandate is to promote tourism to boost the local economy. It was launched without the sanction of anyone in the local art world, which bristled at this trespass onto its turf. A prominent gallery owner threatened to close up shop and leave town if the plans went ahead (his gallery did close earlier this year). The director of the Milwaukee Art Museum wrote in opposition to the Bronze Fonz. The Milwaukee Journal Sentinel art critic followed the controversy in a series of blog posts gently slanted against the project. A debate was joined between defenders of a concept of public art as serious and important to a city’s legitimate cultural identity on one side, and defenders of the virtues of increased tourism and fun on the other. The aesthetic mission of art opposed the crowd-pleasing, commercial taint of entertainment, and entertainment won the day.

This episode was yet another occasion for reasserting the cultural hierarchies that position television and pop culture as illegitimate, and that function to reproduce social distinctions. Even the statue’s promoters bought this line. The CEO of Visit Milwaukee, Dave Fantle, told a Journal Sentinel reporter that the Bronze Fonz is “not art but a piece of pop culture.” In part this clarification was self-justifying; Visit Milwaukee managed an end run around the usual processes involved in installing a public artwork in a prominent city location. But no matter the motivation, the discourse positioning a Fonzie statue as at best frivolous fun misses the appeal and significance of popular culture in ordinary people’s lives. To look at the photos people take of themselves posing with the statue is to recognize the depth of our identification with the narratives of popular culture. It might be fun, but it isn’t frivolous.

Sam and Mike with the Bronze Fonz

But at stake in this debate was more than just the promotion of values endorsed by the snooty guardians of culture or the populist civic boosters. The identity of the city was also in question, and this identity is tied up in the representation of place in popular culture like Happy Days and Laverne & Shirley.

Like many Midwestern cities, Milwaukee has a past as a site of productive industry more glorious than its post-industrial present. Today unemployment is high, especially among the city’s large and segregated racial underclass, and economic growth is slow. The city does not like to present this face to the region, the country, and the world. It hopes to attract “creative class” young people and free-spending convention tourism. It wants to seem like a hip, vibrant town with plenty of attractions to excite out-of-towners.

Happy Days (and Laverne & Shirley) might be a fond memory, but is not much to hang onto as a point of pride for Milwaukee. The setting was in some ways incidental to the show. Its producer Garry Marshall wanted to set Happy Days in his hometown of the Bronx, but the folks at Paramount thought this would come off as too ethnic (Brant, 20). They were looking to duplicate the family format of 1950s shows like Father Knows Best and clearly preferred to represent an idealized middle America. Wisconsinites are proud of their culture, for instance their foodways (the German-influenced beer and sausages; dairy foods like cheese and frozen custard; and Friday fish fry events where the Brandy Old Fashioneds are more often savored than the fish). But the ethnic or regional identity of Wisconsin or Milwaukee is hardly in evidence in Happy Days. For instance, more of the show’s characters have Italian names (Fonzarelli, Delvecchio) than the more locally common German ones. One hears more New York accents like Winkler’s than Wisconsin accents. You wouldn’t know from watching Happy Days that the typical white Milwaukeean speaks more like Sarah Palin than Mrs. C.

The setting of the show in Milwaukee was in a sense the denial of place rather than an investment in it. Marshall, an Italian-American and a New Yorker, seems to have wanted to do urban-ethnic, but in some ways the specificity of his characters’ identities got scrubbed from the representation. Setting the show in Milwaukee was a way of striking a note of Americana — an idealization of “normal” America just as much as it was an idealization of the 1950s as a more innocent and familiar and comforting time in history. Jefferson High, where the characters go to school, and Arnold’s drive-in restaurant, where they slurp milkshakes and dance to jukebox records, are meant to be the kind of school and hangout that you might find anywhere in the USA.

cast of happy days

Fonz with the cast of Happy Days

Milwaukee can be tickled to have a Fonzie statue because Fonzie is essentially cool. But it’s hard to be proud to have been chosen as a place indistinct enough to stand in for Anytown, USA. America of the big cities and coasts is often blind to the specificity of the places in the middle of the country, and Happy Days is symptomatic of this. (Things change in Laverne & Shirley, which is more blue-collar and which has the brewery setting to add local color.)

But we are also proud to have been the setting of a popular show at all, an opportunity for “Milwaukee” to have been uttered on a regular basis to an audience of millions. To this day, legend persists that a frozen custard stand called Leon’s was the inspiration for Arnold’s. Leon’s has gorgeous vintage signage and looks like it hasn’t changed since the 1950s.

Leon's Frozen Custard Stand

Leon’s Frozen Custard Stand

If you ask the custard server if it’s true about Leon’s being the original Arnold’s she might say yes (I’ve heard it myself). But the truth is otherwise. Tom Miller, a producer at Paramount, suggested setting Happy Days in the suburban Milwaukee neighborhood where he grew up after Garry Marshall’s Bronx setting was nixed (Brant, 20). Jefferson High was based on Nicolet High in Glendale, WI. Arnold’s was based on a drive-in called The Milky Way on Port Washington Rd. that no longer exists. But it pleases people around town to think that the place that inspired Arnold’s still serves food, and that you can still go back there if you want to visit the 1950s as represented on television. Like everywhere else, nostalgia for a past that never existed lives in Milwaukee.

Michael Z. Newman revisits his column for Flow Favorites:
“The Bronze Fonz” was an occasion for me to think about the connection between a television show I loved as a child and the city where I have lived for seven years as an adult. Now when I occasionally catch a Happy Days episode I find the Milwaukee setting pretty laughable. The distance between the representation of 1950s Milwaukee on TV and my experience of the real Milwaukee in the past few years is great. But I remain a fan of the Bronze Fonz even if he stands for a Hollywood fantasy of middle-American wholesomeness.

Image credits:
1. The Bronze Fonz – photo by author
2. Sam and Mike with the Bronze Fonz – used with permission
3. Fonz with the cast of Happy Days – Author’s personal collection
4. Leon’s Frozen Custard Stand – photo by author

Diane Negra said:

Thanks for this thoughtful, beautifully written piece. I really like the way you draw attention to the role of tv-based public (though often privately sponsored) art, to tv nostalgia and to the slipperiness of ethnic representations in popular culture. Miriam Greenberg’s new book on urban branding is certainly relevant to this discussion.
-November 3rd, 2008 at 3:23 pm

Carly Kocurek said:

Yes, I agree. This is an interesting thing to think about. I myself grew up in a region that has been made iconographic through the writings of Larry McMurtry — Texasville was shot in my home town — and there’s an interesting overlap in the way that the specificity of something like McMurtry’s work can work to the same ends as the placelessness of something like Happy Days. In both, there’s a sort of flattening of the real and a crystallization of time, and the past that was can function much like the past that never was.

And, of course, the debate over the purpose and role of public art is always contentious. I think the question I always turn to is which public is the art supposed to serve? The general public, who seems pleased with the statue? Or, the art public, who finds it revolting? I think too often “public art” means art provided to the public, rather than for or with the public.

In short, a fascinating piece that could be the jumping off point for a number of discussions.
-November 4th, 2008 at 11:12 am

Mabel said:

Although the Rocky statue in Philadelphia was not actually related to any of the other statues, it too has a fascinating and pertinent history. It was constructed by Sylvester Stallone for use in Rocky III, after which he “donated” it to the City of Philadelphia and the Philadelphia Museum of Art… they, however, didn’t want it for the exact reasons you cited about the Fonz statue in Milwaukee: the art world said it was a movie prop, not art. It was subsequently placed at the Spectrum arena in South Philadelphia (where the fights in Rocky I and II took place) and moved back to the art museum briefly for filming other movies (including subsequent Rocky movies and also the movie Philadelphia). What is particularly interesting is the way in which, after 25 years, the Philadelphia art community allowed the statue to be re-erected at the art museum (though at the bottom of the stairs, not the top as in the movies) because of the immense interconnection between the city and Rocky in cultural history, in public history, and in the minds of the people of Philadelphia.

My point with this extended recounting of the Rocky statue in Philadelphia is that it is very interesting to see the ways in which different cities adopt and adapt their cultural representations in media over time. Of course, with or without a statue, the iconic steps are bounded up daily by tourists (and locals) humming “gonna fly now.”

What is further interesting in the case of Philadelphia is that it has a history of public art, from the Calders to Robert Indiana’s Love statue to the Percent for Art program started in 1959, mandating that one percent of all new building projects be devoted to public art. Where has Rocky fit into this tradition of art and public art, which has ranged from the 37-foot realist William Penn statue atop City Hall to Alexander Calder’s abstract “stabiles” to the pop art of Robert Indiana? In this range of artwork and definitions of art, why did it take so long for a pop cultural icon (and Academy Award winner) to be accepted as art?

In any case, great article and a debate about media and place which I would love to see explored further.
-November 4th, 2008 at 3:43 pm

Dave Sagehorn said:

Great work. I really enjoyed this essay and fully intend to get my own picture next to the Bronze Fonz next time I’m in Milwaukee.

I recently did some work on how Wisconsin is represented on television, although more specifically through the show “Aliens in America” – and in doing so realized that the list of Wisconsin-set shows is surprisingly lengthy. But that was one of my favorite aspects of the Bronze Fonz debates, that in one article I read someone argued that the Fonz was at least cooler than Shirley Feeney and Laverne DeFazio. So for some the issue is not only about high culture/low culture divides, but also that Milwaukee has such an excess of television history that it can choose which TV icon to have bronzed.

I think it’s interesting to consider that this statue is not only being tied to the show’s fictionalized 1950s, but perhaps also to the 1970s Milwaukee that saw “Happy Days” in its initial run – there are multiple eras in play to be nostalgic for, and the statue could just as easily evoke personal memories of where and when someone watched the show in addition to recalling the narrative setting. Those kinds of personal ties are real even if the show’s semi-generic version of Milwaukee is decidedly less so.
-November 7th, 2008 at 9:05 am

Daniel Marcus said:

Nice piece, Mike, especially about Milwaukee as a stand-in for Anywhere, USA and thus leeching any meaning from its potential specificity. I eagerly await the statue of four teenagers sitting around a table in a smoke-filled basement as a tribute to That ’70s Show, which did play up the Wisconsin setting a little bit more. Perhaps the whole thing could periodically swivel 90 degrees.
-November 21st, 2008 at 5:53 pm

Michele La Rue said:

The ’50s were my parents’ generation, and the Fonz is accurately portrayed as a real person in Milwaukee. I really believe that the Bronze Fonz represents the city of Milwaukee very well. It immortalized the values he portrayed on the TV series “Happy Days”. Although jobs are hard to find now compared to the 1950s, the Bronze Fonz is a positive thing for people to view when they come to Milwaukee. I’m glad the producers of “Happy Days” chose Milwaukee as the home of the Fonz. The Bronze Fonz represents a more peaceful, fun-loving 1950s as compared to the hectic lifestyles of today.
-May 14th, 2009 at 10:00 am





P2P TV: Ethical Considerations
Michael Z. Newman / University of Wisconsin-Milwaukee


Lost: The Most Torrented Show of 2008

During one recent week, more than 1.7 million people accessed the latest episodes of Heroes and Lost using BitTorrent. Other heavily shared shows included 24, The Big Bang Theory, and Battlestar Galactica. TV accounts for half of BitTorrent traffic, and clearly many viewers are substituting downloads for other forms of access. In 2008, Lost was the most torrented show and its most popular episode was downloaded more than 5.7 million times (13.4 million American viewers tuned in the old-fashioned way, according to Nielsen).

The industry worries that this distribution stream will drain its profits. During the Pirate Bay trial, one of the products for which the media companies demanded compensation was the first season of Prison Break. But some shows’ fortunes have risen with BitTorrent, much as syndication and iTunes have helped Law & Order and The Office. The initial fan enthusiasm for Battlestar Galactica in the US was aided by its availability online before SciFi began to run it here (it aired in the UK first). Episodes of Pushing Daisies, Sarah Connor Chronicles, and True Blood, which might appeal to young, affluent, and technically adept torrenters, have been leaked to the internet to arouse buzz. If file-sharing is sometimes figured in the media industries as a criminal, even terrorist threat, it is sometimes also seen as good promotion.


Battlestar Galactica

P2P TV can be used in many ways, for many reasons. Some users catch up on missed episodes or seasons. Having caught up, they might tune in using more conventional delivery systems. For others, BitTorrent is a source of programs from abroad and helps overcome the maddening asynchronicity of international television flows. Australians often have to wait weeks or months after a favorite show airs in the US before the local channel makes it available, and streaming video at network websites or Hulu is unavailable to them because of geo-blocking. Participants in online fan communities find this infuriating, and BitTorrent overcomes the scheduling inconveniences imposed by TV channels. ((Tama Leaver, “Watching Battlestar Galactica in Australia and the Tyranny of Digital Distance,” Media International Australia 126 (February 2008), 145-154.)) I use BitTorrent to make clips and caps for teaching or research or to post on my blog. It offers me easily downloaded high-def files free of DRM, which, unlike Hulu videos, can be saved to disc. That is, I use p2p to access what I can’t conveniently get another way. It’s mostly content I have already accessed legitimately.

Too many questions are raised by this development in the technology and experience of television to consider in this space, questions having to do with many aspects of the economics of the media business, indeed, questions about what “television” means. I want to focus in the space remaining on one issue that arises in relation to p2p TV: the ethics, from the user’s perspective, of this mode of television distribution and consumption.

Unlike ordinary movie and television viewing, p2p sharing often involves an ethical calculation. File-sharers risk legal action, however unjustified, for copyright infringement. They rationalize their behavior in terms of costs and benefits, and justify their practices not only in terms of legal categories like fair use but also in terms of judgments about right and wrong. For instance, they reason that it’s ethical (or should be) to download content not otherwise available, or to sample a product before committing to it. Perhaps they believe that downloading is justified when one has also paid for the product. (On the evidence that The Dark Knight was both the box office and BitTorrent champion of 2008, we can assume that many viewers paid to see the film and downloaded it for free.)


The Dark Knight: Box Office and BitTorrent Champion of 2008

This kind of reasoning is evident from the way the popular press covers file-sharing. Following the propaganda of media trade groups, p2p is often portrayed in mainstream media as stealing. Lev Grossman jokes that using BitTorrent to access television shows means selling your soul. Yet some writers see justification or at least compromise. A Slate columnist describes BitTorrent as “TiVo for the tech-savvy and the ethically flexible.” An essay in The Independent remarks that “downloading of movies is apparently a victimless crime, and if there is a victim, it’s ‘The Man.’” It might be wrong to steal, but when it’s the devil whom you’re stealing from, well, not so bad. We can call this a compromise ethic. Many of those using BitTorrent might prefer to access television content legitimately, but they see no better alternative right now than file-sharing. They might even judge the media industries as unethical for using DRM, for charging too much (the average monthly cable bill was $85 last year), or for impeding the synchronization of global distribution. They might see their own questionable ethics in a favorable light when compared with the media industry’s.

By contrast, champions of free culture defend piracy in moral terms, owning the name pirate as in “Pirate Bay” and seeing a cultural imperative to save society from corporate and legal overreaching. Pirates of this sort are activists against excessive copyright legislation and enforcement, defenders of the public domain and the cultural commons. Their ethic is one of community and peer exchange as an antidote to corporate consumerism. They might quote Richard Stallman’s GNU software manifesto: “I consider that the golden rule requires that if I like a program I must share it with other people who like it.” Siva Vaidhyanathan describes MP3 file-sharing as “a rational revolt of passionate fans.” ((Siva Vaidhyanathan, Copyrights and Copywrongs: The Rise of Intellectual Property and How It Threatens Creativity (New York: New York UP, 2001), 197.)) Matt Mason writes in The Pirate’s Dilemma, “If democracy is about creating processes that allow people to empower themselves, then pirates are clearly the perfect catalysts for such processes.” ((Matt Mason, The Pirate’s Dilemma: How Youth Culture is Reinventing Capitalism (New York: Free P, 2008), 47.)) Copyleft activists and sympathizers see virtue in precisely what the media industries most fear about file-sharing: the removal of popular culture from the circuit of commercial exchange. This piracy is counter-hegemonic. The media industries want to control and monetize every window of distribution; champions of free culture want to preserve a gift economy of prosumers. ((Johan A. Pouwelse, Pawel Garbacki, Dick Epema and Henk Sips, “Pirates and Samaritans: A decade of measurements on peer production and their implications for net neutrality and copyright,” Telecommunications Policy 32 (2008), 701-712.)) Theirs is a pirate ethic.

Peer distribution of television would seem to demand a somewhat different set of calculations when compared with music and movie file-sharing. For as long as these cultural forms have been produced industrially, the business model of music and movies has been to exchange media for money. The disruption of this exchange allows the audience to access for free what otherwise might have cost it materially, which can easily be framed ethically as free-riding. (This despite the fact that we access “free” movies and music all the time, for instance, when using public libraries or visiting friends.) Television, with some exceptions, is an advertising-driven medium, in which the exchange is of the audience’s attention for the sponsor’s commercials. With digital technologies, audiences are increasingly empowered to avoid commercials, but this can hardly be cast in negative moral terms when so many people consider advertising a nuisance or even a cause of harmful social effects. DVRs probably disrupt the business model of “free” TV more than BitTorrent, yet one rarely observes an ethical discourse around digital time-shifting. (In 2002 Jamie Kellner, then Turner’s CEO, called DVR use “stealing” and was widely ridiculed.) ((Ted Johnson, “TiVo-lution,” Variety (June 7, 2004).)) Like the DVR, p2p de-commercializes TV. BitTorrent files of TV shows are ad-free. But the DVR is a digital VCR, a legitimate cultural technology. BitTorrent, by contrast, is the new Napster.

Thus TV torrenters, like the ones who comment at the news aggregator site Digg, often defend their practice in ways that establish specific ethical norms for television. They reason that those already paying for cable are entitled to download, and insist that DVR recordings and downloads are ethically equivalent. BitTorrent is merely “bandwidth-consuming timeshifting.” As one user explains, “The networks BROADCAST their shows, sending them out FOR FREE into the air all over the country. How can they claim that I am stealing if they are giving it away for free?”

Image Credits:
1. Lost: The Most Torrented Show of 2008
2. Battlestar Galactica
3. The Dark Knight: Box Office and BitTorrent Champion of 2008





TV Binge
Michael Z. Newman / University of Wisconsin-Milwaukee

The Fisher family of HBO’s Six Feet Under

The Fisher family of HBO’s Six Feet Under

We watched Six Feet Under for an episode or two when it began in 2001. I was put off by its inconsistent tone and wasn’t sure I was up for a show in which someone dies every week. Years later, long after it had established a wide and passionate following and run its final episode, we began to watch Six Feet Under again and this time we pretty much couldn’t stop.

The show follows the Fisher family, their employees and friends and lovers, across five cable-size seasons: sixty episodes, fifty hours of drama. The characters grow and change; they have relationships and marry and have children; new characters are introduced while old ones are forgotten. As befits a show set in a funeral home, some of the most (and least) beloved characters die. It took the original audience more than five years to see all of this play out. They had to wait a week for each new episode and endure months of hiatus between seasons. By contrast, we watched in two sustained bursts of viewing: first in the summer of 2008, while much of the schedule was filled with junk, and then in the early winter of 2008 and 2009, after the November sweeps period again ushered in a period of reruns and specials. We binged.

As much as any narrative medium, television affords intense engagement with characters. We get to know the people on the screen so intimately that they become our TV friends. Sometimes we know them better than our real-life friends, because we get so much insight into their psychology, their secrets, their hopes and fears and dreams. When we spend years with characters, they become regular visitors to our living rooms, like pals we see week after week at the same hangout. Binging intensifies the pleasure of this engagement by making characters all the more present in our lives. The relationship becomes more like a passionate but doomed affair, a whirlwind that enlivens us for a time, only to leave us empty and lost when it sadly, inevitably, ends.

Binging isn’t new. As a kid I watched my favorite shows several times a day for years. But the fact that I could binge on a show like Six Feet Under is certainly a product of its historical context. We watched the show on DVD, programming our revival at our pleasure. This kind of viewing experience emphasizes the serial nature of a soapy show such as SFU, reproducing the daily rhythms of daytime drama with content originally intended to be seen weekly.

Binging also makes one more conscious of the season as a narrative unit. Shorter seasons and long hiatuses between them encourage viewers of original cable shows to see the season as a meaningful narrative category. So of course do DVD sets. When we watched The Wire, we would near the end of a season and go online for our “re-up” (Baltimore slang for a fresh shipment of heroin), ordering the next season so it would be ready when we finished the current one. A season of twelve or thirteen episodes can more naturally be constructed as a coherent story, and viewing those twelve or thirteen hours over the course of a week or two (or less) shows off this unity.

The Wire

Students from HBO’s fourth season of The Wire

The shows I binged on as a kid were programs like Three’s Company that aired in the hours between school and dinner. They are episodic rather than serial. Often the episodes were rerun in a different order from their original airing, though I do recall the sitcoms of my childhood progressing year by year, so that you would see a few months of Chrissy episodes of Three’s Company followed by a brief stint of Cindy episodes, and then a few months of Terri episodes, only to return to the early Chrissy seasons. Other than these cast changes, the binging Three’s Company viewer would have scant awareness of the sequencing of episodes. I certainly never had any idea when a season of the show was beginning or ending. Each episode tells its own story and the viewer doesn’t need to have seen the earlier ones to get the later ones. But watching all of a serial drama like SFU on a binge is like tearing through a thick novel in a week at the beach. It’s all one story, and even though episodes and seasons resolve at their conclusions in some respects, as a viewer you always find yourself in the middle.

Other things we have watched this way were similarly soapy in their narrative form. We often binge on the first season or two of a show to catch up when critics and friends tell us we’re missing something good. “Quality TV” these days demands a completist mandate: start at the beginning. This first happened for us with The Sopranos. Battlestar Galactica was another. I find now that I liked BSG much more during our early-seasons binge, when I could better keep its complicated story straight. Long breaks between seasons make it harder for me to remember the situation, the stakes, the state of characters’ relationships with one another. The experience of the first two seasons was so vivid from binging that the subsequent ones have made less of an impression, and my enthusiasm for BSG has diminished.

Battlestar Galactica

The cast of Sci-Fi’s Battlestar Galactica

No matter the format of a show, watching on a binge intensifies the continuity of character arcs. Shows that are hybrids of episodic and serial narrative forms like Degrassi: The Next Generation and Judging Amy seem more serialized when viewed in heavy regular doses. The Wire, which has fairly self-contained season-long arcs with strong thematic and narrative unity and coherence, starts to seem more like five volumes of one big book (e.g., Proust’s In Search of Lost Time) than like a series of novels with continuing characters (e.g., the crime novels of the show’s writers Dennis Lehane and George Pelecanos).

This new kind of binging further removes us from the conditions that obtained for television viewing for many decades. The DVD frees us from the program schedule and the flow of content, which in many cases includes commercials, promos, and idents, as does watching with the aid of a DVR or iTunes or BitTorrent. The new technologies that give viewers the agency to program their own media mark a shift from ephemeral to collectible content. TV shows and movies are now more like books: you can own or borrow copies of them, use them whenever you like, and keep them on a shelf as an advertisement of your taste.

Something is lost in this process, and we should be wary of accepting this new way of viewing as an evolutionary step. In some ways, binging feels unnatural. Watching this way, we lose our connection to the larger viewing audience as community and to the temporality of broadcasting that unites a program with the moment of its airing. (Viewers still get this experience from sports and reality TV and news and talk shows — from genres of programming inimical to binging.) We also lose a significant aesthetic effect of the weekly rhythm of the prime-time serial. It’s hard to be specific about what this means, but it’s a function of how we appreciate a show that airs once a week, and takes a break of a few months between seasons — what we pay attention to, how it makes an impression on us, and how the interval of time between episodes and seasons encourages us to talk and think about the characters and their situations in the space between installments of their story. Binging makes the experience of television more intense and personal. It can feel like too much of a good thing, and maybe it is.

Image Credits:

1. The Fisher family of HBO’s Six Feet Under
2. Students from HBO’s fourth season of The Wire
3. The cast of Sci-Fi’s Battlestar Galactica




lonelygirl15: The Pleasures and Perils of Participation

lonelygirl15 Video Blog on YouTube

lonelygirl15 Video Blog on YouTube


The best television show of the summer of ’06 by far, says Virginia Heffernan, television critic and blogger for The New York Times, was not Rescue Me or Entourage, or even Project Runway. It actually wasn’t on television, strictly speaking, but on YouTube.1 It was lonelygirl15, which is the username of its star, Bree, a personable sixteen-year-old video blogger who has appeared in a regular series of short videos, some of which have been viewed more than half a million times. In her vlogs, Bree sits facing a camera in her bedroom and opens up about her life. She talks about her strict, religious parents, her homeschooling, and her friend Daniel (a/k/a Danielbeast), who often appears in the background sprawled on Bree’s bed reading a magazine. Daniel supposedly edits Bree’s videos and occasionally posts some of his own in response to hers; he is probably also madly in love with her. When we first encountered them, lonelygirl15 and Danielbeast seemed like just another couple of kids with a computer and a webcam engaging in participatory culture and creative expression. They could have been any of millions of young people living in the age of web 2.0, when the consumer/producer distinction is passé and one’s whole life is “content.”

lonelygirl15 with scissors

lonelygirl15 with scissors

But it is now evident that despite appearances, Bree and Daniel are fictional characters rather than real people and that the videos are not their creation alone. It is also clear that those who participated with Bree and Daniel in creating the lg15 phenomenon–by responding to their videos with other videos, by writing about them and spreading their fame, by engaging them in discussion, by taking them at face value, all in good faith–were in some sense bamboozled by a crafty group of storytellers eager to harness the power and exploit the credulity of a new media community and the tastemakers who track it. Like the marks of any good trick, we who were fooled want at once to congratulate and castigate the tricksters. Most of all, we want to understand how they pulled it off. lg15 demonstrates the sharp double edges of today’s participatory media. Web-shared video seems like a democratic technology making it possible for ordinary people in their everyday lives to create an alternative to mainstream media. lonelygirl15 shows that it can also evidently be used by more sophisticated media producers to engage that very audience of eager participants–and to what ends remains to be seen.

When she first attracted widespread attention, Bree was supposedly an average girl who might just make it in Hollywood on the strength of a homemade screen test. Her videos inspired a legion of admirers and detractors, many of whom posted response videos to YouTube. She got written up not only in numerous entries of Heffernan’s blog, but in the Times of London and in New York magazine, which declared that the videos marked “the birth of a new art form,” a line that the lg15 creators later parroted to advertise their intentions. Just as Bree’s star was rising, however, the lg15 community began to doubt the very authenticity that made her persona compelling. One YouTuber expressed skepticism because the videos use fill lights and have a story arc, two things absent from the typical vlog. Bree would reply to some fellow users’ and some reporters’ e-mails, but remained coy about her identity and location. Wikipedia editors opposed her inclusion in their website with the wacky fervor of JFK conspiracy nuts on the grounds, partly, that no one could say if she was real or fake. An untold multitude began to pay more than casual attention and many found themselves descending, in the terms of their fan community, “down the rabbit hole.” Could lonelygirl15 be a viral marketing campaign for a consumer product yet to be revealed, or an elaborate promotion for another media product, like a horror movie, or more simply a Blair Witch-style mockumentary? Could it all be a kind of game? Is Bree sweet or is she a fraud? Is she sixteen or more like twenty-one? Are the videos spontaneous or scripted? When it came to lg15, the lines between reality and fiction, natural personal expression and the mechanics of narrative, authenticity and contrivance were never very clear, and this is what made it such vital, momentous entertainment.

lonelygirl15 reads Viral Marketing for Dummies

lonelygirl15 reads Viral Marketing for Dummies

Although the audience is still watching and wondering about Bree and Daniel, the show lost some of its special charm beginning on September 8, when a confession signed “The Creators,” posted on the lonelygirl15.com message board, claimed that the videos are the work of filmmakers telling a story rather than kids innocently sharing their lives with the world. The following day, the Los Angeles Times reported that an e-mail supposedly from Bree to a MySpace account had been traced to an IP address at the Creative Artists Agency in Beverly Hills. On September 12, the community was abuzz with news that the woman who appears as Bree in the videos is a 19-year-old actress from New Zealand named Jessica, whose MySpace pictures (concealed by a privacy setting but previously public) were discovered in Google’s cache, apparently by some industrious fans’ sleuthing. For several weeks the YouTube and lg15 fan communities had been skeptical. But with this series of revelations, it seemed that part of what made lg15 so fascinating was gone. lg15 worked the tension between faith and doubt, but without that it began to seem like just another commercial media come-on. As one blogger declared, “lonelygirl15 minutes are up.”

Finally, over the following several days, many media outlets reported the identities of “The Creators” as filmmakers Miles Beckett, Ramesh Flinders and Greg Goodfried, and their objective as initiating “a new form of collaborative online entertainment” (Adler). It turns out that the means of their production were quite modest. According to published reports, all they needed were a computer and some software, a $150 camera, some cheap desk lamps, a good actress they found using craigslist, and an open YouTube community where anyone can post or view a video. They shot lonelygirl15 in their own bedrooms.

In the beginning were the videos. Watching them, one can’t help but sympathize with Bree, who complains in her first vlog about her strict upbringing and confesses that the town where she lives is so boring that she has nothing better to do than spend time on her computer. She quotes from Guns, Germs and Steel and idolizes Richard Feynman. Her room is decorated with a Napoleon Dynamite poster and a pink feathered boa. She calls herself a dork and possesses every dorky smart boy’s fantasy, a combination of brains and beauty that makes her seem like a perfect fusion of Alex and Mallory Keaton. Most important, though, Bree has talent as a performer. She uses her eyes, lips, arms, and legs like a comedian of the silent screen. Every time she frowns, shows a wide-open gaze or protruding tongue, every time she rests her chin on her knee she expresses some essential quality of adolescence. The internet is home to innumerable kids goofing off in front of a camera, pulling faces, dancing in their bedrooms, hamming it up. Bree does this routine as well as anyone.

The real lonelygirl15 at the VH1 Big06 Awards

The real lonelygirl15 at the VH1 Big06 Awards

But that was at most half of lonelygirl15’s appeal. The rest was a product of speculation about the production and provenance of the videos. To those like me who were taken in by lg15, the experience of tracking the theories sometimes overwhelmed the experience of watching the show. Long before The Creators’ confession, there were many hints that lg15 wasn’t just a sincere production of two talented kids. Not only are the videos well lit and not only do they trace an arc, they also have unusually clean sound, good postproduction work with fast motion and music, and lots of editing. Although they are hardly technically polished like a feature film, they are several notches above the typical teenage YouTuber’s work. More curious yet, in a video posted August 6, we see a shrine in Bree’s room to Aleister Crowley, the British occultist. The suggestion that her family’s faith is something obscure, cultish, and possibly Satanic seemed contrived to generate controversy and discussion. Another hint of fakery came when it was discovered that the domain name of a fan site, lonelygirl15.com, was registered one month before the first video was posted, suggesting a planned launch. Finally, for two months no one was able to pin down who exactly Bree is or where she lives. No friends came forward and people’s efforts to get in touch with her produced only a few uninformative e-mails.

Taken together, the primary texts and their surrounding discourses made lg15 into a compelling narrative. The videos themselves are full of plot, scenes that play out between Bree and Daniel and stories she tells about her conflict with her homeschooling parents. Blogs, message boards, comment threads, and articles about the videos generated other kinds of narrative: speculative stories that fans and amateur lonelygirl scholars wove about the creation of the vlogs. Considering both the videos and these discourses surrounding them, lg15 was a rich and layered text. One became immersed in its world just as when watching a television serial like Battlestar Galactica or reading a fat novel with a good plot. The videos and the ideas that sprouted from them brought to mind Lost and The Da Vinci Code, and although the audience for lg15 was still smaller than for either of those, its passion was as strong.

Before the confession of fakery, Heffernan told NPR’s On The Media that lg15 fans would not be disappointed to find out that it’s not exactly what it appears to be. Everyone had basically assumed that to be the case. But the fans were dying to know the rest of the story not only of Bree and Daniel and her religious rites and strict parents, but also of how The Creators managed to hook us on it, how they got our attention and what their objectives were. lg15 appealed to what Neil Harris, in his biography of P.T. Barnum, calls “the operational aesthetic.” Barnum’s hoaxes and museum exhibits presented astonishing spectacles and the audience would be eager not only to witness the incredible, but also, essentially, to be shown how the showman pulled it off. The videos were an internet equivalent and the audience was eager to see it unraveled. We were taken in by the magic but at the same time, we needed to know how it had been conjured. The internet has been the site of a zillion hoaxes, but there is something unusual about this one. It comes during this cycle of greater democratization and interactivity, with media users and makers coming together like never before. lg15 has been a contradictory phenomenon, at once drawing its energy from the web community and taking advantage of it. Daniel and Bree’s videos are among the best things to watch on any size screen these days, and this is a testament not only to the craft and imagination of their makers, but also to the audience that has made their feat possible in so many ways.

Note
1 Heffernan made this claim in an interview on NPR’s On The Media during the episode of September 1, 2006.

Image Credits:

1. Video Blog on YouTube

2. With Scissors

3. Reading Viral Marketing for Dummies

4. The real lonelygirl15

Works Cited
Adler, Shawn. “Internet Phenom Lonelygirl15 Reveals Secrets, Says Bree Is Someone ‘Everybody Can Relate To’.” MTV News. MTV Networks. 14 Sept. 2006.

Harris, Neil. Humbug: The Art of P.T. Barnum. Chicago: U of Chicago P, 1981.

Richards, Jonathan. “Worldwide Fame for a Lonely Girl.” Times Online. Times Newspapers Ltd. 19 Aug. 2006.

Rushfield, Richard. “Lonelygirl? Not any Longer.” latimes.com. Los Angeles Times. 16 Sept. 2006.

Sternbergh, Adam. “Hey There, Lonelygirl.” New York Magazine. New York Magazine Holdings. 28 Aug. 2006.
