Illustration via Max Fleishman

The myth of the millennial

Technology now defines our generational gaps.

 

Gillian Branstetter

Internet Culture

Posted on Jun 1, 2016   Updated on Feb 29, 2020, 6:04 am CST

Opinion

If you can remember hair metal bands, leg warmers, and Nancy Reagan, there’s a good chance you might be a millennial. At least according to the New York Times, which, in yet another forced rendition of “kids these days” mania, recently defined the millennial generation as “those born between 1977 and 1995.”

It’s an astonishingly broad range of people. It puts a 39-year-old in the same “generation” as a 21-year-old. In fact, it literally means a millennial could feasibly have given birth to another millennial. The term itself—with all its buzzwordy, ear-grating popularity—is made functionally useless by how many people it gets applied to. Like “middle class” or “nerd,” the word “millennial” has expanded as an identifier to include just about anyone the speaker wants to include, whether for the purposes of their argument or out of sheer laziness. As much as generational differences matter at all, millennials are a distinct, defined group sandwiched between two other distinct, defined groups.

It turns out, however, that such multi-decade approaches to demarcating generations, like the one used by the Times, are not so absurd. According to the U.S. Census Bureau, baby boomers (a generation millennials recently surpassed in size) are defined as “individuals born in the United States between mid-1946 and mid-1964,” a neat period of time bordered by World War II and the dawn of the fourth season of Mad Men. A Harvard study placed the birth dates of generation X between 1966 and 1984. And millennials are often described as a two-decade bunch. William Strauss, a sociologist who coined the term “the Millennial Generation,” even stretches it over all birth dates from 1982 to 2004.

That means a 12-year-old is currently in the same generation as their 34-year-old parent. But if the strides between generations are so wide, what do we gain by defining them at all? What we mean by “generation” is best ascertained by the cultural ephemera we attach to each—baby boomers get everything from Woodstock to disco, and generation X gets everything from MTV to the end of Seinfeld. Boomers get Vietnam, Watergate, and all things Forrest Gump-related. Generation X gets the Challenger disaster and the Rodney King riots.

As Stephen Metcalf notes in Slate, “the most obvious social events that convert a cohort into a generation, from merely demographic fact into poetic destiny, are revolution and war.” This was certainly true for the so-called “Lost Generation” of World War I and the “Greatest Generation” of World War II. Both conflicts so thoroughly engulfed the culture and the population—both in body count and in their ripple effects on the economy—that they were inescapable facts of life for nearly anyone alive during them.

It has been some time since conflict of such scale has afflicted the affluent West. As Metcalf also argues, boomers’ relationship with Vietnam is as “a war they did not fight; a war, in fact, they evaded fighting.” Similarly, generation X’s relationship with conflict is best summed up by the fictional Tyler Durden in Fight Club, who once whined: “We have no Great War. No Great Depression. Our Great War’s a spiritual war.”

Millennials, though coming of age during the two longest wars in American history, remain largely unscathed by the economic sacrifices typically required of such prolonged conflict. As Christopher Jusuf opines in his call to leadership in The Hill: “Millennials have been able to cast aside responsibility for the wars onto the older generation that was in power.”

Many might suggest the Sept. 11 terrorist attacks—and the mass traumatic episode they cast upon a millennial childhood—were such a chapter-ending event, a clear point separating Before and After. But for the average millennial, the attacks themselves were of little consequence. As Foreign Policy noted on the 10th anniversary of the attacks, the 2000s saw sweeping change far surpassing the historical importance of that one day, including climate change, the 2008 financial collapse, and the Arab Spring.

Also selected by Foreign Policy is the largest sea change for most Americans in the last few decades, if not in the last century: The rise of the Internet and smartphones. Think about it: What has had more influence on your daily life, ISIS or Facebook? Did you alter your own behavior after the Arab Spring or after you first heard of someone getting fired for tweeting a nasty joke? Have you spent more time contemplating the rise of India and China or vomiting rainbows on Snapchat?

In place of life-shifting turmoil—a substitution afforded to most of us by cultural and geographic privilege—such nonsense shapes the experiences that come to define generations. They are what generational theorist Karl Mannheim once termed “the curve of the progress of the human species in terms of its vital substructure.”

As the culture becomes more closely tied to technological development, its pace quickens alongside it. The gap between AOL Instant Messenger (launched in 1997) and Facebook (2004) is a mere seven years. Netflix opened its streaming platform in 2008; just eight years later, Americans spend more time on Netflix than reading, socializing, or exercising combined. The App Store is only eight years old, and its 2015 revenue topped $20 billion.

These milestones might seem silly in comparison to worldwide wars or devastating economic collapse, but the rise of the Internet and social media has been the largest alteration most Americans have experienced in their lifetimes. And considering the exponential pace at which such environments change, the meaningful endpoints of a generation—a unit of measurement limited by culture—should be getting closer and closer together.

A person born in 1977 is going to have a completely different relationship to technology than someone born in 1995. The former, now 39, might have obtained their first smartphone at 31 and probably didn’t have a cell phone until they were 23. Their expectations for digital entertainment were more likely shaped by Napster than YouTube, and they are far less likely to have experience with location-based apps like Tinder or Uber.

A person born in 1995, however, lives in an almost completely different digital space. This 21-year-old likely goes to a college with Yik Yak and has never known a parent-free Facebook. They’ve never had a Blockbuster card and have little reason to own a non-mobile device. Such distinctions can seem superficial in a historical context, but the degree to which they matter in a Western nation free of traumatic upheaval cannot be overstated—and in fact, they might become the historical context themselves.

Neither of those two people is usefully classified as a millennial—a term best defined as describing a person whose cultural memory goes no farther back than the Internet but does predate the spread of smartphones, which would put their birth between 1985 and 1994.

The global shift centered on and driven by technology may not meet the standard of war or revolution set by Metcalf. In their place, however, such innovations have come to define the delineation of our experiences over time.

Gillian Branstetter is a social commentator with a focus on the intersection of technology, security, and politics. Her work has appeared in the Washington Post, Business Insider, Salon, the Week, and xoJane. She attended Pennsylvania State University. Follow her on Twitter @GillBranstetter.
