In 2006, Harvard also conducted a Facebook study that went too far
Don’t worry… be happy?
Horrified by the recent news that Facebook likes to play with your emotions? You might have a short memory.
Back in 2006, a group of Harvard researchers did a remarkably poor job handling a treasure trove of longitudinal Facebook data collected from 1,640 students at a mysterious, unidentified college (spoiler alert: It was Harvard’s class of 2009).
The researchers had the blessing not only of Facebook itself at the time, but also of Harvard’s Institutional Review Board (IRB). The study, Tastes, Ties, and Time: Facebook data release (T3), was initially published on Harvard’s Dataverse Network in 2008 and pulled in 2010 to “ensure the privacy of students in the dataset.” Um.
From an outline of the research goals:
The dataset comprises machine-readable files of virtually all the information posted on approximately 1,700 FB profiles by an entire cohort of students at an anonymous, northeastern American university.
Profiles were sampled at one-year intervals, beginning in 2006. This first wave covers first-year profiles, and three additional waves of data will be added over time, one for each year of the cohort’s college career.
Though friendships outside the cohort are not part of the data, this snapshot of an entire class over its four years in college, including supplementary information about where students lived on campus, makes it possible to pose diverse questions about the relationships between social networks, online and offline.
According to the Chronicle of Higher Education, everything started falling apart in 2008, when the study caught the attention of Michael Zimmer, an assistant professor at the University of Wisconsin at Milwaukee. Zimmer discovered that the data wasn’t anonymous at all and later published his ethical objections, which were myriad. Among them: the researchers used student research assistants at Harvard to gain access to non-public profile data, which included “not only the subjects’ self-reported gender and ethnicity, but also their home, state, nation of origin, political views, sexual interests, college major, relational data, and cultural interests.”
While the Harvard study doesn’t share the “mood manipulation” component that unsettled so many Facebook users recently, it’s a good reminder that the ethics of studying social data have always been murky, and the studies themselves borderline (or completely) creepy.
Photo by Taylor Hatmaker
Taylor Hatmaker has reported on the tech industry for nearly a decade, covering privacy and government. Most recently, she was the Debug editor of the Daily Dot. Prior to that, she was a staff writer and deputy editor at ReadWrite, a tech and business reporter for Yahoo News, and the senior editor of Tecca. Her editorial interests include censorship, digital activism, LGBTQ issues, and futurist consumer tech.