Facebook Inc. had quite a rough week in early October.
On Oct. 4, 2021, from about 11:40 a.m. to 6:30 p.m. EDT, Facebook suffered a nearly 6-hour outage affecting all of its products, including Facebook, Instagram and WhatsApp. In an ironic twist, Facebook announced the news via Twitter.
Facebook claims the outage was its own doing, not a hack. A faulty configuration update to the routers that carry traffic between Facebook's internal systems caused a cascading failure that shut down not just its products, but its major internal systems as well. Employees reportedly had to break into their own offices because the headquarters' security and access system wasn't operational.
A second outage reportedly hit Facebook's systems on Oct. 9, although it did not affect the entire user base, and there were no reports that Facebook's internal systems went down this time.
For many consumers in the U.S. and other English-speaking countries, the loss of social media was an inconvenience. However, in many parts of the world, including developing countries, Facebook functionally acts as the entire Internet, and the outage cut off essential services and communications. Many small businesses in the U.S. also rely heavily on Facebook's platforms for revenue and product placement, and many of them are run by people from marginalized communities.
That, however, was not the only headline that appeared with Facebook’s name on it this week.
On Tuesday, Oct. 5, former Facebook data scientist and whistleblower Frances Haugen testified before a Senate subcommittee. Her account brings to light many of Facebook's well-guarded secrets in what could be Facebook Inc.'s biggest scandal yet, surpassing even Cambridge Analytica.
In her testimony, Haugen released internal research showing how Instagram was harming the mental health of young teens, and how the company was researching and targeting children under the age of 13.
When an ad network collects children's personal information for targeted advertising, it directly violates COPPA, the Children's Online Privacy Protection Act.
The problem Haugen described in her testimony is that, while Facebook may not directly save or record personal information about users under the age of 18, its AI-driven algorithm can still target ads toward younger demographics because of how the algorithm is designed.
The information Haugen provided included a substantial amount of concerning data on how Facebook's AI algorithms and engagement-based ranking, or EBR, can lead users down a rabbit hole of misinformation they can't get out of.
In the leaked documents, Facebook's internal research showed how seemingly innocuous ads, such as ads for healthy eating recipes, can steer young women toward pro-anorexia advertisements and groups in a remarkably short amount of time.
Other revelations concerned the faulty safety mechanisms Facebook uses to curb misinformation and virality.
Facebook's safety mechanisms and content moderation catch only roughly 13% of misinformation on the platform, and these mechanisms stop working on users who scroll through more than 2,000 posts a day. These heavy users also tend to come from the most vulnerable demographics, such as recently widowed individuals.
The testimony offered a first glimpse into how Facebook really operates: a company driven entirely by metrics, run from a single open-plan office stretching a quarter mile long. Its modus operandi is both its driving force and its biggest threat, according to Haugen.
In a workplace driven entirely by profits, growth and short-term metrics, where each employee works in parallel with their colleagues, no one is able to step up and make the decisions that would put systems in place to keep users safe.
This is a pivotal moment for Facebook, and perhaps a point of no return. Now that the public has glimpsed the tech giant's underbelly, it's vital that the U.S. government act to change how Facebook operates. Otherwise, its users and the generations that follow will face substantial risks to their mental health and safety.