I can’t recall exactly when I first heard “precariat”—a portmanteau of “precarious” and “proletariat”—in conversation. It was probably late 2013, during a juiced-up bargument—a portmanteau of “bar” and “argument”—at Sidewalk Cafe or Josie’s in the East Village. The term refers to people who have no sustainable means of selling their labor. At the time I first heard it, some artists were speculating that networked culture and emergent technologies would not prove emancipatory. The digital platforms we’d been using to foster alternative economies and new audiences, they argued, would soon turn all of that free content we’d produced into data used to commodify and atomize us.
Within a year or two, things got worse. In cities around the United States, overeducated, underemployed, student-indebted artists realized that they’d been taken for a ride. Art schools that sold MFA teaching credentials in bulk had quietly pivoted away from investing in full-time faculty, opting for inexpensive patchworks of one-semester adjunct positions. Galleries and museums that once hired art handlers for reliable, decently compensated jobs were now using temporary contract labor. Institutions were acting like platforms.
Suddenly everyone was on the hustle. Everyone was a brand. But no one, it seemed, was an actual employee. Initially it was tempting to believe the gig economy offered potential. With that flexibility, you could do a residency! But no one except the independently wealthy had any money left by the residency’s end. The precariat got stuck with stagnant wages, escalating rents, and relentless calls from student loan sharks. Job boards advertised full-time, unpaid internships requiring two years of experience.
The silver lining, I suppose, is that many artists pursued collective, community-oriented projects. And culture workers adopted class consciousness, recognizing shared interests with workers outside their sector. Just look at the New Museum, where this year employees unionized with United Auto Workers Local 2110, which also represents workers at the Bronx Museum and the Museum of Modern Art. —Sean J Patrick Carney
Either “naturally” inherited or ardently sought, privilege—such as the honor (with a price) of being a MoMA trustee—has recently been assailed as pretty much the root of all social evil. Activists and culture critics now invoke the term for virtually anything that some people can have or do, but all people can’t: from living in a Hudson Yards high-rise (“[apparently] wealth and status can be considered inclusive if some token non-elites are allowed to gawk at those who enjoy such privileges,” Rob Horning wrote in A.i.A., June/July 2019) to gaining political power on the basis of race or wealth, rather than merit or vision.
While “white privilege” is the most blatant, persistent, and oft-cited strain of this condition, it can take myriad forms—many of them marked by the offender’s real or purported lack of self-awareness. Thus the all-too-familiar injunction: “Check your privilege!” In 2016, Korean émigré artist Nikki S. Lee, known for her blend-in-with-the-group photo series, was taken to task by Contemptorary blog writer Eunsong Kim for wearing blackface in her “Hip-Hop Project” (2001) and for failing to name or share profits with participants in her “Hispanic Project” (1998). Kim asserted that “as a body that can enter and exit out of communities, as one who gets to DECIDE where she belongs and for how long and with whom, Lee positions herself as fundamentally more privileged.”
What’s the alternative to perpetuating privilege? (A practice so common that “to privilege” became a ubiquitous verb in the 2010s.) Perhaps never, for a moment, to forget the message of David Hammons’s spring 2019 installation at Hauser & Wirth, Los Angeles. There his huddled, ramshackle, homeless-style tents were inscribed with the simple phrase “this could be u.” —Richard Vine
In the 1960s and ’70s, social practice art developed alongside the civil rights movement, feminism, and antiwar activism. Artists like Suzanne Lacy and Mierle Laderman Ukeles chose the street over the studio, creating collaborative works that tackled issues including rape, environmental degradation, and economic inequality and emphasized engagement with the public rather than object-making. In 1998, curator Nicolas Bourriaud’s book Relational Aesthetics presented a new generation of artists’ participatory tactics, ranging from Rirkrit Tiravanija’s Thai food feasts in galleries to Pierre Huyghe’s staged parades in small towns.
But the 2010s may be considered the decade in which social practice finally entered the institution. New York–based public art nonprofit Creative Time, thriving under the leadership of artistic director Anne Pasternak and curator Nato Thompson, commissioned works and organized summits that addressed problems such as gun violence and climate change. Art historian Claire Bishop leveled a sharp critique at socially oriented art in her tome Artificial Hells (2012), arguing that such work should be evaluated on aesthetic rather than ethical terms. In 2010, Chicago-based artist Theaster Gates started the ambitious Rebuild Foundation, which provides affordable housing, arts programming, and other means of community revitalization on the city’s South Side. In summer 2016, at the New Museum in New York, Simone Leigh presented The Waiting Room, a lauded installation that centered on healing the black community and offered services such as herbalist treatments and yoga classes. Activism has taken on new importance in art institutions during the Trump era. Both Ukeles and Lacy have recently received major museum retrospectives: the former at the Queens Museum in 2016, and the latter at the San Francisco Museum of Modern Art in 2019. —Wendy Vogel
The idea of identifying and managing exposure to trauma triggers originally comes from PTSD treatment protocols developed in the 1980s for combat veterans. Trigger warnings migrated into online feminist communities as early as the 1990s—primarily flagging mentions of sexual assault, eating disorders, and self-harm—but the term went mainstream in the 2010s. Slate declared 2013 the “Year of the Trigger Warning,” citing widespread debates over whether they belonged in university classrooms, museums, theaters, or the news. Trigger warnings have alternately been cast as a welcome attempt to mitigate harm and as a symptom of the coddling of millennials, who refuse exposure to any idea or image that might unsettle their view of the world.
The most acrimonious debates over trigger warnings took place in academia, but the concept posed particular problems within the realm of art and culture, where generally progressive attitudes ran up against a reluctance to cede the power of transgression and shock. Triggering, in the sense of prompting an immediate and visceral reaction in the viewer, is generally assumed to be the hallmark of good art, but the connotations of the term in the 2010s tended to tilt negative. In 2016, a number of black staff members at the Contemporary Art Museum in St. Louis protested the institution’s exhibition of works by Kelley Walker that included appropriated images of police brutality smeared with chocolate or toothpaste, arguing in an open letter that the show “trigger[ed] a retraumatization of racial and regional pain.” The museum responded by putting warning signs outside the show acknowledging its potentially disturbing content. Such signs are now commonplace at museums, even in the case of shows that seem, in the scheme of things, pretty tame.
In the catalogue essay for the exhibition “Trigger: Gender as a Tool and as a Weapon,” held at the New Museum in 2017, curator Johanna Burton describes the show’s title as “the true elephant in the room,” albeit one that she doesn’t explicitly take up in the text. Despite the connotations of the title, the show foregrounded intergenerational dialogues about gender identity rather than provocation or offense; its use of the word suggested instead a reclamation of “trigger,” which, by the end of the decade, had become politically toxic. Indeed, the phrase “content warning/notice” has largely replaced “trigger warning”—in part because the association of “trigger” with gun violence might be, well, triggering. —Rachel Wetzler