“Forever Young”: Digital De-Aging, Memory, and Nostalgia
Kathleen Loock / Europa-Universität Flensburg

In The Irishman, digital de-aging allowed Robert De Niro to play the same character at multiple points in his life.

Digital de-aging, the process of making actors appear younger on screen than they actually are, is becoming the new normal in Hollywood. After first experiments with computer-generated youthfulness in a flashback scene of X-Men: The Last Stand (Ratner) in 2006, and—already much more refined—in 2008’s The Curious Case of Benjamin Button (Fincher), TRON: Legacy (Kosinski), released in 2010, had 60-year-old actor Jeff Bridges play opposite his digitally rejuvenated self, who looked like the 32-year-old Bridges from 1982’s TRON (Lisberger). Having an older character interact with his younger self in this manner pushed the boundaries of digital de-aging, which has since advanced further and become an increasingly convincing and, overall, more common visual effect in Hollywood cinema. From Orlando Bloom as unaging elf Legolas in The Hobbit: The Desolation of Smaug (Jackson, 2013) to Michael Douglas’s Hank Pym in Ant-Man (Reed, 2015) and Ant-Man and the Wasp (Reed, 2018), from Robert Downey Jr., who plays a young version of Tony Stark in Captain America: Civil War (Russo & Russo, 2016), to the de-aged Kurt Russell in Guardians of the Galaxy Vol. 2 (Gunn, 2017), Johnny Depp in Pirates of the Caribbean: Dead Men Tell No Tales (Rønning & Sandberg, 2017), and Colin Firth in Kingsman: The Golden Circle (Vaughn, 2017)—(mostly male) actors are getting a digital facelift that erases their real age and chronology.

Over the past decade, digital de-aging has become increasingly more convincing and more common in Hollywood cinema, turning back the clock on (mostly male) actors.

The practice has now become so prevalent that 2019 has variously been called “a monumental year for de-aging in film” (Kemp), “a year haunted by the digital phantoms of movie stars as they once looked” (Dowd), and the year when “Hollywood has become unstuck in time” (Breznican). A total of six blockbusters—Captain Marvel (Boden & Fleck), Avengers: Endgame (Russo & Russo), It Chapter Two (Muschietti), Gemini Man (Lee), Star Wars: The Rise of Skywalker (Abrams), and The Irishman (Scorsese)—de-aged their stars with the help of visual effects companies like Lola VFX, Weta Digital, and Industrial Light & Magic (ILM). These companies have spent the last few years perfecting their respective techniques, which involve “digital cosmetics” to smooth out wrinkles and remove blemishes with patches, blurs, glows, and digital paint, as well as tracking markers, scans, CGI models, performance capture technology, and reference material from past performances that is combined with the new footage. Considering the digitally edited faces of Robert De Niro, Al Pacino, and Joe Pesci in The Irishman, director Martin Scorsese expressed worries about what he calls the “youthification” of actors he has known and worked with all his life (Rose). Digital de-aging is supposed to be an invisible effect in the service of unprecedented realism. But, as Scorsese’s misgivings and many critics’ voices show, it remains a controversial filmmaking tool and one, I argue, that enters into competition with memory and nostalgia for the past.

Martin Scorsese relied on the digital de-aging of actors Robert De Niro, Al Pacino, and Joe Pesci to tell his character-driven, decades-spanning mob epic, The Irishman.

To be sure, the increasingly realist aesthetic of digital de-aging has been lauded as a breakthrough for visual effects technology and storytelling. It has solved problems with the rules about time and time travel in the Marvel Cinematic Universe, and it has made it possible to realize the science-fiction premises of movies like TRON: Legacy, Terminator: Genisys (Taylor, 2015), and Gemini Man, where time warps and cloning drive the plots, as well as decades-spanning epics such as The Irishman that center on the long-term development of (aging) characters, without layers of make-up, prosthetics, or casting different (i.e., younger) actors in the same roles. Advocates of the practice have pointed out how de-aging supports the suspension of disbelief as it allows filmmakers to create less disruptive links between the past and the present. “Youthified” actors have also commented on how de-aging might impact the longevity of their careers. In The Irishman: In Conversation, the Netflix special feature that has Scorsese, De Niro, Pacino, and Pesci talking about the production of The Irishman, De Niro weighs in on the de-aging technology, predicting that, “We’ll all be able to act for another 30 years.” At a screening of Gemini Man, Will Smith joked about the future use of his data and how he will no longer need to stay in shape: “There’s a completely digital 23-year-old version of myself that I can make movies with now. … I’m gonna get really fat and really overweight” (King).

In Gemini Man, 51-year-old Will Smith plays opposite a 23-year-old version of himself.

More skeptical observers of the trend have expressed their fears about the legal implications of digital de-aging technology and the data it amasses and about the diminishing prospects for young actors to land a breakout role. They are also worried about the gray area in which Hollywood’s de-aging efforts and inexpensive, accessible deepfake software seem to converge, arguing that “the drive to fool the viewer is the same” (King). Most notably, however, there is disagreement about whether the technology has advanced enough for de-aging to escape the “uncanny valley” effect (Mori). Despite the high degree of verisimilitude digitally rejuvenated faces have achieved in Hollywood, something seems to be off with “youthified” versions of familiar actors on screen that threatens to disturb audiences and cause discomfort. The uncanniness of a de-aged Will Smith or Robert De Niro can be located in the occasional weird sheen on their altered features as well as in movements that either seem inhumanly fast and smooth (in the case of Gemini Man) or belong to an older, less intense actor rather than the young one that digital de-aging technology has created. “You can make a seventy-something Robert De Niro look young (or at least, come somewhat close to it),” writes Vulture’s Bilge Ebiri, “but you can’t really make him act young. Especially for an audience that remembers what a young Robert De Niro did look like, and sound like, and move like.” This observation is as important as the fact that Gemini Man’s young, muscular Will Smith is nothing like the lanky, mustache-wearing Will Smith audiences know from the 1990s sitcom The Fresh Prince of Bel-Air (NBC, 1990-1996).

I suggest that there is an alienating disconnect that has ultimately less to do with the technical perfection and realist aesthetic of de-aging and more to do with the ways in which the digital doppelgänger interferes with a star’s intertextuality (i.e., the ways in which an actor’s previous films and—aging—star persona determine readings of his or her performances) and, above all, with the memories, desires, and nostalgic longings that audiences associate with a familiar actor’s actual younger self. A de-aged Will Smith or Robert De Niro, in other words, may serve Hollywood’s storytelling purposes, yet the discrepancy between what audiences recall and what they see onscreen may pose an existential threat to how people understand (and remember) themselves and the world in which they live in relation to popular culture. By following an actor’s work over many years and decades, audiences synchronize their own memories and lived experiences with movies, TV shows, and career trajectories, often with a nostalgic glance backwards that helps to construct and maintain a coherent, consistent sense of identity in the present. If digital de-aging produces “youthified” versions of familiar actors as it helps aging performers to stay “forever young,” it produces alternate realities that threaten to overwrite audience memory and eventually detract from the illusion that de-aging technology seeks to create.

Image Credits:

  1. In The Irishman, digital de-aging allowed Robert De Niro to play the same character at multiple points in his life.
  2. Over the past decade, digital de-aging has become increasingly more convincing and more common in Hollywood cinema, turning back the clock on (mostly male) actors.
  3. Martin Scorsese relied on the digital de-aging of actors Robert De Niro, Al Pacino, and Joe Pesci to tell his character-driven, decades-spanning mob epic The Irishman.
  4. In Gemini Man, 51-year-old Will Smith plays opposite a 23-year-old version of himself.


Breznican, Anthony. “The Irishman, Avengers: Endgame, and the De-aging Technology That Could Change Acting Forever.” Vanity Fair 9 Dec. 2019. Web. 9 Mar. 2020. https://www.vanityfair.com/hollywood/2019/12/the-de-aging-technology-that-could-change-acting-forever

Dowd, A. A. “Gemini Man Uses De-Aging Technology to Make a Case against De-Aging Technology.” AV Club 15 Oct. 2019. Web. 9 Mar. 2020. https://film.avclub.com/gemini-man-uses-de-aging-technology-to-make-a-case-agai-1839041610

Ebiri, Bilge. “So, How Is the De-Aging in The Irishman? Incredibly Impressive.” Vulture 27 Sept. 2019. Web. 9 Mar. 2020. https://www.vulture.com/2019/09/the-de-aging-in-the-irishman-how-bad-is-it.html

Kemp, Matt. “‘Holy Grail’ Digital Effects Rewinding the Clock for Actors.” AP News 12 Jan. 2020. Web. 9 Mar. 2020. https://apnews.com/43f8ed7e9a753c2191c9af7f4754bd6c

King, Darryn. “The Game-Changing Tech Behind Gemini Man’s ‘Young’ Will Smith.” Wired 24 Sept. 2019. Web. 9 Mar. 2020. https://www.wired.com/story/game-changing-tech-gemini-man-will-smith/

Mori, Masahiro. “The Uncanny Valley.” Trans. Karl F. MacDorman and Norri Kageki. IEEE Robotics & Automation Magazine (June 2012): 98-100. https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6213238

Rose, Steve. “Will Hollywood’s New Youthifying Tech Keep Old Actors in Work for Ever?” The Guardian 10 June 2019. Web. 10 Mar. 2020. https://www.theguardian.com/film/2019/jun/10/will-hollywoods-new-youthifying-tech-keep-old-actors-in-work-for-ever

A Streaming Comes Across the Sky: Peak TV and the Fate of Nostalgia
Siobhan Lyons / Macquarie University

Entertainment Weekly and TV Guide promoting Twin Peaks and The X-Files in the ‘90s

The last decade witnessed an obsession with nostalgia unseen in previous decades. A seemingly infinite list of reboots and revivals of ’80s and ’90s television, from Dallas to Full House, Roseanne to Twin Peaks, as well as ’80s-themed shows including Stranger Things and American Horror Story: 1984, have been enormously popular not only for their novelty, but also due to the ease of access that streaming platforms have afforded contemporary audiences. This longing for the past has become symptomatic of the “presentless” landscape of contemporary culture. No other culture has been able to archive itself so swiftly or had so much access to older films and television shows via YouTube and internet streaming.

Yet ironically, the very act of streaming—a paragon of binge-watching—has had an unprecedented impact on the very nature of nostalgia itself; the plethora of options given to audiences today through streaming, what has come to be known as “peak TV,” has in turn reframed the media landscape to focus on tailored entertainment, which has, in effect, destabilized the experience and formation of nostalgia.

From Twin Peaks to Peak TV

The term “Peak TV” was coined by FX chief John Landgraf in 2015 to describe the saturation of television shows that were “drowning us in content.”[ ((Joy Press, “Peak TV Is Still Drowning Us in Content, Says TV Prophet John Landgraf,” Vanity Fair, August 3, 2018, https://www.vanityfair.com/hollywood/2018/08/peak-tv-fx-john-landgraf-tca-donald-glover-chris-rock))] This trend skyrocketed in the ensuing years. In November 2019 alone, Apple TV+ released four drama television shows (For All Mankind, The Morning Show, See, and Servant), followed by a fifth in December (Truth Be Told). Twenty more original programs have since been ordered. As an over-the-top (OTT) service, Apple TV+ bypasses broadcasting, cable, and satellite and operates exclusively through the internet, creating a viewing experience that also bypasses time and space, fragmenting audiences as a result.

Apple CEO Tim Cook unveiling new lineup of shows for Apple TV+

Although both Netflix and Apple TV+ emphasize “original programming,” the sheer number of shows has had a troubling impact on the nature of viewing itself. As Catherine Bouris puts it: “Every week brings new TV shows you ‘have to’ watch, every day brings more hype on social media for yet another show you’ve just got to add to your list.”[ ((Catherine Bouris, “There Are Too Many ‘Must-Watch’ TV Shows And We Need To Start Being More Picky,” Goat, May 27, 2019, https://goat.com.au/tv/there-are-too-many-must-watch-tv-shows-and-we-need-to-start-being-more-picky/))] She describes her television viewing experience today as less a form of entertainment or relaxation and more of a “job”: “I’m overwhelmed by the sheer number of TV shows being produced, I’m overwhelmed by the pressure I’m placing on myself to watch every ‘must see’ TV show.”

Kelly Lawler of USA Today writes that Apple TV+ and Netflix are in effect “ruining” TV:

There’s just too much TV. There was too much in 2018 and 2017, too, and probably even before 2015, when FX chief John Landgraf coined the term “Peak TV” to describe the then-400+ scripted TV shows on the air. This year, there are more than 500. And that doesn’t count reality shows, talk shows, documentaries and everything else.[ ((“Too much: Why the streaming wars between Apple, Disney, HBO and more are ruining TV,” USA Today, November 14, 2019, https://www.usatoday.com/story/entertainment/tv/2019/11/14/disney-plus-apple-tv-plus-streaming-wars-ruining-tv/2516655001/))]

In contrast, television programming in the mid to late twentieth century was confined to a set number of television shows that allowed audiences and fan bases to form more organically, enabling shows from The Sopranos to Game of Thrones to distinguish themselves. While a few streaming programs have managed to secure notable fan bases in the last few years, such as Stranger Things and BoJack Horseman, the influx of television shows makes it difficult—if not impossible—to replicate the kind of “golden era” television viewing of previous decades, in which fewer shows were embraced by larger audiences.

The effect is twofold, with the inundation of television shows leading to a more fragmented viewership, which, in turn, is diminishing the formation of nostalgia as something directly related to the shared experience of watching television.

From Communal to “Niche Nostalgia”

Nostalgia—a term coined by Johannes Hofer in 1688 and described as a psychological “disease”[ (( Julie Beck, “When Nostalgia was a Disease,” The Atlantic, August 14, 2013, https://www.theatlantic.com/health/archive/2013/08/when-nostalgia-was-a-disease/278648/))]—is both a personal and shared phenomenon. The kind of nostalgia culture witnessed throughout the 2010s was fostered by the shared viewing of the rebooted shows in their heyday, from The X-Files to Twin Peaks. The broadcast medium—which inculcated both patience and longing—ensured the formation of a substantial audience and fan base from which nostalgia could successfully grow. Cliffhangers and mysteries from “Who Shot J.R.?” to “Who Killed Laura Palmer?” were embraced as communal narratives. As John Hartley points out:

Broadcast TV proved to be better than the press, cinema or even radio at riveting everyone to the same spot, at the same time—in fear, laughter, wonderment, thrill or desire. Television’s emblematic moments—the shooting of J.R. Ewing in Dallas or J.F. Kennedy in Dallas; the moon landings; the twin towers; Princess Diana’s wedding and funeral; the Olympics and football World Cup finals—the cliffhangers, weddings, departures and finales gathered populations from across all demographic and hierarchical boundaries into fleetingly attained but nevertheless real moments of “we-dom.”

Time and People Magazine speculating on J.R.’s shooter in Dallas

Dhiraj Murthy argues that “these events were discussed for days and months afterwards and became part of the cultural memory for a generation of American television watchers.”[ ((Dhiraj Murthy, Twitter: Social Communication in the Twitter Age (Cambridge: Polity, 2013) 35.))] These moments have considerably lessened with the advent of streaming platforms, which have radically altered the experiential nature of television to the point where there is no longer the same sense of communal nostalgia.

In the late twentieth century, broadcast television was defined by a significant sense of community, by the limitations of the broadcast medium that forced audiences to be in a certain place at a certain time. Reruns and videos became the only means of watching a missed episode, which often meant that families or friends would be watching a finite number of programs together.

The 2010s and ’20s, in contrast, offer such an overabundance of content that communal nostalgia may be unable to form in the first place, since tastes and choices are so divided. While this may mean that tastes are much less homogenous than they used to be, it also potentially signals the demise of the nostalgic experience in television as facilitated by group fandom. As Paul Hiebert argues, it’s conceivable that “for coming generations the feeling of nostalgia might eventually disappear—not due to over-exposure or diminished effect, but because of its inability to form in the first place.”[ (( Paul Hiebert, “Will the Web Kill Nostalgia?,” Pacific Standard, March 31, 2015, https://psmag.com/environment/not-unless-we-invent-a-time-machine))]

While many contemporary viewers are, of course, watching and bonding over similar shows, nothing guarantees our viewership experiences are shared the way they used to be, since there is a lower concentration of viewers for any one show. As Brian Raftery asks: “Years from now, when we finally gaze back at the pop highlights of this modern age, will any of us even be looking in the same direction?”[ ((Brian Raftery, “Enjoy the Early-’00s Nostalgia Wave—It Might Be the Last Revival,” Wired, May 24, 2017, https://www.wired.com/2017/05/the-future-of-nostalgia/))] He claims that “future waves of nostalgia will focus less on specific pop-cultural explosions, and more on the technologies that allowed them to spread. That’s partly because it’s never been easier to tune out the mass culture, making shared moments all the more rare.”

Streaming culture has encouraged fragmented, isolated consumer habits that limit the ability for a decade to formulate its own zeitgeist in the same way as the ’60s, ’70s, ’80s and ’90s did. As Dev Allen argues: “The mainstream has split into a thousand different subcultures which is practically impossible to track.”[ ((Dev Allen, “Why Nostalgia Could End with the ’90s,” Now This Nerd, August 28, 2018 https://www.youtube.com/watch?v=PCvzfOj8spk&list=LLCbbhgGkmTNgJm9VrBT_8DA&index=42&t=0s))] Discussing the 2010s, he asks, “What would a 2010s throwback even look like?” further observing that “there’s no such thing as a zeitgeist anymore.”

Amanda D. Lotz observes that while television retains significance, it does so by “aggregating a collection of niche audiences.”[ (( Amanda D. Lotz, The Television Will Be Revolutionized (New York: New York University Press, 2014) 34.))] By extension, television also creates and promotes niche nostalgia, whereby nostalgia increasingly becomes an isolated, tailored experience. Not only does this mean that there will be fewer moments of “we-dom,” as Hartley puts it, but the absence of this sense of “we-dom” will have a corresponding effect on our ability to formulate a distinct cultural character that would form the basis of future nostalgia.

Image Credits:

  1. Entertainment Weekly and TV Guide promoting Twin Peaks and The X-Files in the ‘90s
  2. Apple CEO Tim Cook unveiling new lineup of shows for Apple TV+
  3. Time and People Magazine speculating on J.R.’s shooter in Dallas


Gender, Place, and Nostalgia in The Marvelous Mrs. Maisel
Helen Morgan-Parmett / University of Vermont

Promotional image for The Marvelous Mrs. Maisel, featuring the leading cast posed amidst a New York City street.

“It’s interesting. My father pointed out that my favorite part about a newspaper is the ads for shoes. And I felt bad about that, but now I think maybe they just put those ads in newspapers to distract us. Because if women don’t realize what’s going on in the world, they won’t step in and fix it. Because they will fix it. And accessorize it!” (Midge Maisel, Season 1, Episode 4, “The Disappointment of the Dionne Quintuplets”)

The Marvelous Mrs. Maisel (Amazon, 2017-present) is a dramedy set in late 1950s New York City about a Jewish housewife (Miriam (Midge) Maisel) from the Upper West Side who stumbles into a nascent career in stand-up comedy after her husband cheats on her with his secretary and subsequently leaves her. In the series’ fourth episode, Midge accidentally ends up at a protest in Washington Square Park. As she strolls through the park with her son, someone is playing the piano as children, mothers, strollers, and passersby meander. Disrupting her reverie, she is bumped by another woman, who apologizes before running off, declaring, “I hope she hasn’t spoken yet!” We soon learn “she” is Jane Jacobs[ (( Jacobs is best known for her book The Death and Life of Great American Cities and her efforts at grassroots organizing to protect Greenwich Village, her neighborhood, from Robert Moses’ “slum clearance” plans in the 1950s that were to make way for the building of the interstate highway system. See Jacobs, Jane. The Death and Life of Great American Cities. New York: Random House, 1961. ))], the real-life activist who critiqued and fought 1950s urban renewal policy for its privileging of industry, cars, and capital over people, culture, and experience. Midge makes her way to the rally and finds it is a protest of Robert Moses’ plan to demolish the park to make way for the interstate highway.[ (( The scene is based on an actual event, where activists protesting Moses’ plan organized a rally that has become memorialized in a photo of the “Last Car thru Washington Square” (although it would not be until April 1959 that the Square was actually closed to traffic). http://www.washingtonsquareparkblog.com/2013/04/05/54-years-since-washington-square-park-officially-closed-to-traffic/ ))] Before she knows it, Midge is called to the microphone, much in the same coincidental manner she finds her way to the microphone at the Gaslight Café in the season’s pilot.
Ever the performer, Midge works the crowd, declaring that now that she, and other women, are aware of the harms being done to their city, not only will they not stand for it, but they “will fix it, and accessorize it.”

Miriam (Midge) Maisel (played by Rachel Brosnahan) speaking in Washington Square Park at a Jane Jacobs rally against Robert Moses’ plan to build a highway through the park.

Midge speaking at the Washington Square Park rally, surrounded by protestors holding signs with statements such as “Strollers not cars.” The signs demonstrate the gendered dimensions of Jacobs’ and others’ critique of Robert Moses’ urban renewal plan for New York and other cities, which argued that cities needed to make space for mothers, children, and families.

Left: Picture taken from 1958 rally featuring “Last Car Thru Washington Square.” Right: Picture of filming on-set of The Marvelous Mrs. Maisel, replicating the “Last Car Thru Washington Square” rally.

The rally is so far Midge’s only entry into any kind of formal political activism. Instead, her comedy routine, largely pushing boundaries against 1950s gender norms, becomes the primary focus of her (coincidental) politics. Still, the rally is a notable scene for what it intimates about the show’s commentary on New York City, both then and now, and, perhaps especially, what women might do about it.

Much like the beleaguered New York of Jane Jacobs’ era, New York City has again become subject to a discourse of urban crisis—not because it is faltering, as in Jacobs’ day, but because it is prospering. As the city becomes an urban playground for the rich, and iconic neighborhood establishments close to make way for luxury condos no one will ever occupy, lamentations over a lost New York abound. Consider Jeremiah Moss’ popular blog and book, Vanishing New York[ (( Moss, Jeremiah. Vanishing New York: How a Great City Lost Its Soul. New York: Dey Street Books, 2017. ))], dedicated to cataloging the disappearance of a New York both iconic and mundane. As a July 2007 post ominously notes, “Now I wait, hiding inside these bricks, blighted and condemned, for the wrecking ball to come for me as it will eventually come for you. In the end, we will all be lost in the pile of this vanishing city.”

Moss is neither the first nor the last to decry that New York just isn’t what it used to be. In 1967, Joan Didion’s now-famous essay, “Goodbye to All That,” kicked off a whole genre of writing about a New York loved and lost. Although Didion’s essay is more about a longing for a lost youth than about Moss’ vanishing New York, the current mode of “Goodbye New York” is much more along the lines of the latter, where everyone from celebrities to unknown struggling artists wonders if New York is really worth it anymore. For David Byrne, New York is becoming a city for the 1% that no longer makes things and is especially hostile to the social and economic conditions that foster art, creativity, and culture. For Ann Friedman, whose essay, “Why I’m Glad I Quit New York at Age 24,” went viral, New York is “that guy”—you know the one—“the prom king. He knows he’s great, and he’s gonna make it really, really hard on you if you decide you want to love him.” Friedman is just one of a number of women penning such essays, leading the feminist blog Jezebel to query, “Is Dumping New York City ‘A Girl Thing?’”

Amidst this longing for a lost New York, it is little wonder that some of the most popular recent television series set in the city are period pieces, including The Marvelous Mrs. Maisel.  In its fastidious set design, aimed to transform today’s city streets into the sites, scents, sounds, and experience of 1950s New York, the series produces affects of nostalgia. As production designer, Bill Groom noted, “It’s nice sometimes to capture a little bit of the New York that’s disappearing.” The series imagines a New York that was more gritty and “authentic” than the one of today, but also a past in which women were at the cusp of a movement and leading the charge to create a city that responded to their needs. Undoubtedly, this nostalgia is as much about the present and future as the past, responding to gendered discourses of urban crisis in present-day New York City while reminding us of a past in which New York women were poised to create change.[ (( Eichhorn, Kate. “Feminism’s There : On Post-Ness and Nostalgia.” Feminist Theory 16, no. 3 (December 2015): 251–64. https://doi.org/10.1177/1464700115604127. See also Boym, Svetlana. The Future of Nostalgia. New York: Basic books, 2001. ))]

What are we to make of this nostalgic longing, especially as it becomes pegged to Jane Jacobs and her influential work on gender and the city? As Jacobs makes her way into contemporary popular culture, her theories of women’s importance in creating vital and vibrant cities are also seeing a revival in urban planning and renewal strategies. Marguerite Van Den Berg suggests Jacobs’ appeal derives precisely from her emphasis on gender—“femininity is here associated with the imagined future city: a city of creativity and spontaneity…Jacobs is mobilized because she symbolizes this non-modern spontaneity, but also because she is a woman.”[ (( Van Den Berg, Marguerite. Gender in the Post-Fordist Urban: The Gender Revolution in Planning and Public Policy. Cham, Switzerland: Palgrave Macmillan/Springer Nature, 2019, 25. ))] Yet, unlike Midge, the women of today’s New York, especially women of color and working-class women, are not only fighting new Robert Moses figures—real estate developers and tycoons who want to clear the city to make way for new high rises and highways. They are also, in a sense, fighting today’s Jane Jacobs, whose ideals have been adapted by real estate developers and bureaucrats alike to preserve and reinvent the city’s “authenticity,” driving up rents and displacing the poor as much as more Moses-like strategies do.[ (( Zukin, Sharon. Naked City: The Death and Life of Authentic Urban Places. Oxford; New York: Oxford University Press, 2010. ))] While Midge joins arms with other women calling for the preservation of Washington Square Park, if she were transported to today, would she be protesting yet another coffee shop, Edison-bulb-lit craft beer pub, cat café, or axe-throwing range?

One can only wonder.

Nostalgia is a powerful affect. It remains to be seen how Mrs. Maisel exactly imagines Midge and other women will ultimately “fix” and “accessorize” the city. I guess we will just have to keep watching.

Image Credits:

  1. Promotional image for The Marvelous Mrs. Maisel, featuring the leading cast posed amidst a New York City street.
  2. Miriam (Midge) Maisel (played by Rachel Brosnahan) speaking in Washington Square Park at a Jane Jacobs rally against Robert Moses’ plan to build a highway through the park.
  3. Midge speaking at the Washington Square Park rally, surrounded by protestors holding signs with statements such as “Strollers not cars.” The signs demonstrate the gendered dimensions of Jacobs’ and others’ critique of Robert Moses’ urban renewal plan for New York and other cities, which argued that cities needed to make space for mothers, children, and families.
  4. Left: Picture taken from 1958 rally featuring “Last Car Thru Washington Square.” Right: Picture of filming on-set of The Marvelous Mrs. Maisel, replicating the “Last Car Thru Washington Square” rally.


I Lost My Wife to Facebook, and Other Myths That Might Be True

by: Michele Byers / Saint Mary’s University

Collective Facebook

I first heard about Facebook a few months ago. My cousin, an undergraduate student, was having dinner at my house and mocking my husband mercilessly about his MySpace page. MySpace, she intoned, was over; Facebook was “it.” As a scholar of media, she found me guilty of not keeping up with the times. The Internet isn’t really one of my main research areas, I reasoned; I barely have time to keep up with my email. But I was intrigued. The only way to get into the Facebook system is to sign up and in so doing get a page of one’s own. I did so. A bare bones page… but four hours later I was still “on” Facebook, complaining to my husband about how few old friends I could find.

Facebook is a social networking site or social utility. It has one of those dizzying pedigrees we associate with the Internet age. In 2004, its creator, Mark Zuckerberg, began with the idea of developing an online version of the paper facebook produced for incoming students at Harvard, where he was a student. In less than a year he had dropped out of school and moved to California. Within the next six months, Facebook was made available to students at most universities in the US, soon high school students were invited to join, and networks were expanded to include Britain, Canada, and Australia. Less than two years after it began, Facebook opened its doors to the general public and was estimated to have 12 million users. Today it is said to have as many as 20 million users as rumours circulate that a buyout for as much as 2 billion dollars is in the works.

Mark Zuckerberg

Facebook is a strange place. You can provide an almost unlimited amount of personal information on your page. You can post pictures. You can engage in an ongoing commentary about your life, minute by minute. The system also creates a running mini-feed that lets you know about every minute change any one of your “friends” makes to their page. You can invite people you know and strangers to be your friends, to come to social events, or to join Facebook groups that you start or are merely a member of. A lot of people use Facebook like email. It’s a place to chat and to keep people informed about your life. But for some people — and this is how I, and I suspect many people my age (the over-35s, who are even older than the 25-to-34-year-old “oldies” discussed in one recent Globe & Mail article), get hooked on Facebook — it’s a place to find old friends that we weren’t likely to track down anywhere else. And this is where, I suspect, we differ from younger cohorts of Facebookers, who probably aren’t as nostalgic as we are… yet.

Nostalgia is a definite key for many of the Facebookers I know — and I can’t quite believe how words like Facebooker and Facebooking flow out of me after just a few short weeks of interface. While quite a few of my “friends” are people I work with, live near, or am related to, the ones I really seek out, and the ones I have thus far derived the greatest satisfaction from being connected to, are the old friends I lost touch with along the way. Fredric Jameson says that “an addiction to the photographic image is itself a tangible symptom of an omnipresent, omnivorous, and well-nigh libidinal historicism” (18), one that “cannibalizes” the past into a cacophony of “overstimulating ensembles” (19). This is an apt description of Facebook for many users (and perhaps of many users as well).

An April 23 article in the Toronto Star makes the libidinal quality of the Facebook experience explicit by suggesting that social networking sites are now rivaling Internet pornography for sheer number of users. Jen Gerson writes, “though the website du jour may change, the desire to be connected all the time isn’t going to go away among the youth cohort.” I agree, although I think the focus on youth misses the fact that people in their 30s, 40s, 50s (and so on) are increasingly using social networking sites to stay connected with their pasts as well as their presents. Here’s one example: I recently invited one of my best friends from high school, a busy working mom with four kids, to join Facebook. She so rarely responds to her email that I thought she’d just delete the invitation. But just a week later, she sent me a message (on Facebook) saying she just couldn’t keep herself from trolling the system looking for old friends.

Linda Hutcheon makes the important point that a predilection for seeing postmodern culture and its artifacts as inherently nostalgic is something to be cautious of. She’s right, of course. Nostalgia often implies the longing for a mythic past, whose doors, always barred to us, make what is behind them infinitely desirable, and whose completedness masquerades as simplicity, as authenticity, as a time that was really “real.” We can go home, but we can’t go home. In her recent book, Giving an Account of Oneself, Judith Butler argues that one of the problems of doing so — of giving an account of oneself — is that there is always an originary part that we cannot know: a piece that is beyond language and memory but that is nonetheless foundational to who we are. We are, on some level, aware of this, and search for ways of contacting or connecting with this originary part for which we have no vocabulary of enunciation. This may be part of Facebook’s appeal. Many people are casting around in its multitudes looking for connections to a past they feel cut off from; to parts of themselves they think might be lost; for threads of a narrative that will allow them to give a fuller and more complete account of themselves.

And yet, is there an ironic aspect of Facebook? Is this type of postmodern cultural production/immersion (as we are both in it and, in a sense, co-producers of it) simply a reification of a mythic past, a giving in to longing for something unrecoverable, or do we engage with Facebook via an “ironic distance,” or both (Hutcheon)? Hutcheon argues that irony and nostalgia are both responses to things, rather than things in and of themselves. Facebook can be read as a space where irony and nostalgia co-exist, especially for those who use it to seek out the past. The pages and groups people create are genuine, but they evidence a type of ironic distancing that comes from a recognition that the people reading are, in some sense, at a temporal distance. These groups play a bit like the newest ad in the Diet Pepsi “Forever Young” campaign. Called “Make-Out,” the ad features a couple in their thirties who wish they could make out like they did when they were teens… flash to them making out in the grocery store and at parent-teacher interviews to the 1983 Bonnie Tyler power ballad “Total Eclipse of the Heart.” Happily, they return to their present, but with the recognition that Diet Pepsi keeps them young, connected to their youth… like Facebook.

Facebook T-shirt

Ironic, nostalgic, Facebook does have a high school quality. One friend told me she had resisted joining because of an experience on an earlier social utility: she kept looking for ex-boyfriends and their new girlfriends, comparing the “coolness” of their sites and hers, comparing the numbers of friends each had. It was, she finally explained, like being back in high school. Alyssa Schwartz, in the Globe & Mail, discovered similar invocations of a “high school mentality,” finding Facebook guilty of “bring[ing] back behavior that went out with plaid shirts, Tuff boots and Nirvana.” Note how Schwartz’s invocation of high school, here coded through fashion and music, marks the ironic nostalgia of Facebook participation, but also locates “old” Facebookers as people who were teens in the 1990s, not the 1980s (shoulder pads, Doc Martens, and Madonna), 1970s (hot pants, platform shoes, and Led Zeppelin), or 1960s (mini skirts, saddle shoes, and The Beatles).

There is clearly much to be studied in the worlds of Facebook. The desire to network the past in the present, and the availability and massive usage of this technology, certainly warrant closer examination.

Works Cited
Butler, Judith. Giving an Account of Oneself. New York: Fordham UP, 2005.
Gerson, Jen. “Social networking rivals porn on Web.” The Toronto Star, April 23, 2007.
Hutcheon, Linda. “Irony, Nostalgia, and the Postmodern.”
Jameson, Fredric. Postmodernism, or, The Cultural Logic of Late Capitalism. Durham: Duke UP, 1991.
Schwartz, Alyssa. “Grownups get their Facebook fix.” Globe & Mail, March 31, 2007.

Image Credits:
1. Collective Facebook
2. Mark Zuckerberg
3. Facebook T-shirt

Please feel free to comment.