A man looking at security camera footage. (Andrey_Popov/Shutterstock, licensed)

A new surveillance state is here: Inside your own home

Our homes are looking and feeling much different.

 

Jake Pitre


Posted on Dec 6, 2021   Updated on Dec 7, 2021, 7:28 am CST

The pandemic accelerated numerous troubling technologies and practices, but the one that may raise the most concern is the rapid encroachment of surveillance technology into our private spaces at home, as many people were forced to lock down for extended periods of time.

There are countless examples of how surveillance technology has become commonplace in our lives and homes. While attending school from home, students were forced to contend with faulty e-proctoring systems, intended to prevent cheating, that generate false positives, alongside countless other losses of data privacy. Ring cameras have proliferated, and Ring-captured videos go viral on TikTok every day. Employees are concerned about how new surveillance measures threaten their privacy as employers seek to track their every keystroke or monitor their engagement via AI-powered cameras. And now Ring has introduced a home security drone, the Always Home Cam, which navigates its way through the halls of your house. 

In other words, our homes are looking and feeling much different. The pandemic has only added to the growing use and domestication of surveillance technologies and systems, meaning our homes are becoming sites of increased security in ways that challenge assumptions about privacy and publicity. There is a burgeoning sense that this implementation is happening far more quickly than we can understand its potential consequences.

As academics Jathan Sadowski and Sophia Maalsen have put it: some of the largest tech companies in the world “are now our roommates.” These new roommates and their universal surveillance have raised questions about how kids and young people who have grown up amid the proliferation of data collection on social media and elsewhere will be impacted. 

As schools and universities invest more in e-proctoring and other systems (one study shows that 80% of teachers in the U.S. use this software, and only one in four limits the tracking to school hours), groups like Fight for the Future aim to put an end to the practice. 

Erica Darragh, Fight for the Future’s community manager, told the Daily Dot that this edtech is “prone to error and discriminatory against students of color,” and that “children are a lot more vulnerable to disinformation or the effects of being fed content that leads into mental health issues, self harm, eating disorders, and these surveillance networks help collect the data that influences these algorithms.” 

Oftentimes, Darragh points out, it’s the companies that approach schools, promising a magical solution to the “problem” of remote learning. 

The use of this technology is, naturally, outpacing regulation, legislation, and other accountability measures. It’s also more overt than ever: in many cases, students are required to download what essentially amounts to spyware in order to take a class. 

Mark Andrejevic, a professor at Monash University and author of Automated Media, argues that this kind of tech gives the illusion of shared control, instilling a feeling of empowerment. But he told the Daily Dot that, in fact, “the control is asymmetric. What alarms me is the idea that, despite any advantages we may get, that this might be a longer-term shift where these technology companies become the mediators for workplace and educational relations.” 

This recalls Shoshana Zuboff’s claim that we are now in an age of “surveillance capitalism,” under which our personal data is gathered by profit-seeking companies to manipulate our behavior. 

The pandemic forced many to become more acquainted than ever with their own homes, and various corporate and institutional interests found an opportunity to collect more data, keep users glued to their screens, and track and locate them with more accuracy. 

For some, it might be difficult to imagine installing a Ring-designed, Amazon-owned security drone within one’s own home. But the numbers don’t lie: Ring cameras were plastered all over websites as recommended holiday gifts.

Andrejevic said the popularity of this technology shouldn’t be surprising, noting that as “state actors and major corporations embrace these levels of surveillance in the name of risk minimization and opportunity maximization, that we receive the same lesson—you can do this at home, too.” 

In other words, this tech is sold to consumers on the premise that the more surveillance you have, the more you can subject your domestic space to the same control that you yourself are being subjected to. Schools, parents, and employers all, in their own way, seek this control, which is itself a form of power. As the world has become more complex and confounding, buying into some semblance of control seems like a natural response. 

One further danger, Andrejevic and others have warned, is the loss of social trust in society.

As tech like online monitoring and AI-powered security devices increases in use, “we need to think about civic alternatives to these commercial systems that mediate our social relations,” Andrejevic says.

It’s no accident that these companies draw attention away from governments, financiers and banks, real estate companies, and other interests that are just as invested in surveilling their citizens and consumers, and in using that data, or someone else’s, to entrench their market dominance.

Moreover, along with getting the attention of people in power, more effort could be spent educating young people about what data is collected about them and how it is used.

Fight for the Future, for instance, is running a campaign to ban e-proctoring and lists which schools use it and which have committed not to. At the very least, Darragh argues, getting this information to kids “can help influence the way they interact with it and how they think about it.” 

Meanwhile, surveillance technology continues to creep into our everyday lives: class-action lawsuits are underway in Illinois, where students allege that the use of e-proctoring violated their consent; Facebook is supposedly deleting users’ facial recognition data, though it hasn’t ruled out using the technology in other ways; and more and more employers are pressuring their workers to “accept home surveillance.”

Surveillance technology isn’t going away. What’s at stake is our relationship to what is public and what is private, and how we feel about the security and comfort of our own homes. 

