Photo via dolphfyn/Shutterstock (Licensed)

Classified documents reveal Facebook’s predatory ad campaigns targeted at children

This isn't the first time the social network has taken advantage of its users' vulnerabilities.


Phillip Tracy


Posted on May 1, 2017   Updated on May 24, 2021, 3:58 pm CDT

Facebook is known for using unsavory tactics to manipulate millions of users, but a new revelation is putting the social media giant under fire for targeting youths.

Confidential documents obtained by the Australian reveal that Facebook has been targeting vulnerable children with predatory advertising practices, and people are outraged.

The leaked files say the company is using algorithms to monitor posts, comments, and interactions to gather information on youths who “need a confidence boost,” and exploiting their emotional state with promoted advertising campaigns. The 23-page leak, dated this year, says Facebook looks at how users as young as 14 interact with the social network to determine if they feel “defeated,” “overwhelmed,” “stressed,” “anxious,” “nervous,” “stupid,” “silly,” “useless,” or a “failure.” Instead of finding ways to help these teenagers, Facebook gives advertisers what it calls a “sentiment analysis” to target those who are most vulnerable.


Facebook is also helping advertisers go after children who are concerned with “looking good and body confidence” or “working out and losing weight.” The report reveals the lengths to which Facebook goes to gain psychological insight into teenage emotions: “Monday-Thursday is about building confidence; the weekend is for broadcasting achievements.” Facebook Australia is also said to be using image-recognition tools on Facebook and Instagram to show advertisers how people represent different moments such as “meal times.”

The documents, which were written by two Australian Facebook executives, only provide data specific to Australia and New Zealand, where Facebook is said to be keeping its eyes on 6.4 million students and “young [people] in the workforce.”

Facebook Australia acknowledged the research, telling the Australian, “The data on which this research is based was aggregated and presented consistent with applicable privacy and legal protections, including the removal of any personally identifiable information.”

A spokesperson also provided a lengthy apology detailing what happens next: “We have opened an investigation to understand the process failure and improve our oversight. We will undertake disciplinary and other processes as appropriate.” The spokesperson would not specify whether these practices were being used in other countries.

If the document is accurate, the social media giant could be in violation of the Australian Code for Advertising and Marketing Communications, which applies to children 14 and younger.

This isn’t the first time the social network has come under the spotlight for tapping into the emotions of its nearly 2 billion monthly active users. In 2012, Facebook conducted an experiment on hundreds of thousands of its members to determine whether tampering with their news feeds would affect their emotional state. The project left users feeling manipulated, with some claiming they felt like they were being used as “lab rats.”

The Daily Dot reached out to Facebook and will update this article if we hear back.

Update 12:52pm CT, May 1: In response to a request for comment, Facebook directed the Daily Dot to a previous statement on the matter. You can read it below in full.

On May 1, 2017, The Australian posted a story regarding research done by Facebook and subsequently shared with an advertiser. The premise of the article is misleading. Facebook does not offer tools to target people based on their emotional state.

The analysis done by an Australian researcher was intended to help marketers understand how people express themselves on Facebook. It was never used to target ads and was based on data that was anonymous and aggregated.

Facebook has an established process to review the research we perform. This research did not follow that process, and we are reviewing the details to correct the oversight.

*First Published: May 1, 2017, 11:43 am CDT