Nearly two decades ago, Facebook launched and quickly became a part of our everyday lives. It’s allowed people to make friendships, grow businesses, stay in touch with family members (or snooze them), and share the good, bad, and ugly in their lives. Fast forward to 2021: Facebook has nearly 2 billion daily users. And is valued at more than $900 billion.

But not everyone has hit the “like” button when it comes to the social media giant. Over the years, it’s been at the center of a number of scandals, congressional hearings, and PR nightmares. (Think: privacy issues and election interference.) And it's recently gotten into hot water...again.

First, in September, reports – based in part on documents provided by a whistleblower – revealed some of the company’s alleged practices. Including how Facebook apparently downplayed Instagram’s negative impact on teen girls. Now, the Facebook Papers – aka redacted docs initially provided to Congress by the whistleblower – are shining an even brighter spotlight on the company. And news reports – based on the thousands of internal FB documents – are spilling the tea on FB HQ’s alleged moves on everything from the Jan 6 riot to its impact in foreign countries.

To better understand what the F is going on with FB and what its future could look like, we’re breaking down why the Facebook Papers are flooding the world’s news feed.

The Facebook Whistleblower: Meet Frances Haugen

In September, The Wall Street Journal reported leaked findings from Facebook’s internal research on Instagram. Courtesy of a whistleblower. The report painted a picture of how the company allegedly turned a blind eye to Instagram’s harmful effects. And said the company never made the research public.

Then, in October, the world heard from the whistleblower herself. First in a “60 Minutes” interview. Then, before US lawmakers and members of UK Parliament. Haugen, a former product manager, left Facebook in May after she became frustrated with the company’s lack of transparency about its platforms’ potential harm to users. And has since been spilling the beans…

To the public: On “60 Minutes,” Haugen said Facebook faced constant “conflicts of interest” between what’s good for the public and what’s good for the company. And that today’s version of the platform is “tearing our societies apart.” She said Facebook is aware that hateful content is feeding its algorithm...and its pockets.

To lawmakers on the Hill: Facebook allegedly knew it was promoting anorexia content to young users. And that "authoritarian or terrorist-based leaders" (think: China and Iran) were using the platform to spy on people. Haugen claimed FB put “astronomical profits before people.” She called on Congress to regulate social media in the way it handles Big Tobacco. And to make it possible for people to sue companies over algorithms that promote harmful content. Now, Republicans and Democrats are throwing their support behind regulating Facebook (think: changing how it reaches users and boosts content).

To lawmakers across the pond: Haugen reiterated how FB’s a platform for hate and extremism. And that it fails to keep children safe from harmful content. She called the company negligent and urged UK lawmakers to hold the social media giant responsible – reportedly saying “until the incentives change, Facebook will not change.” It’s a message that could further fuel European govs’ ongoing efforts to crack down on Big Tech. But there could be more updates in store: Haugen’s scheduled to speak with EU officials in November.
In the meantime, the world has been focused on the Facebook Papers. Which brings us to...

Highlights from the Facebook Papers

In late October, 17 US news organizations started exposing Facebook’s alleged practices. And some of these articles are TL;DR for some people. So we Skimm’d them for you:

A threat to democracy...Facebook was apparently in over its head when it came to tackling misinformation about the 2020 election. The company stepped up efforts to help stop the spread of violence and misinformation before Election Day – but apparently removed some of those safeguards leading up to the Jan 6 riot. Meaning, the company allegedly didn't have the right policies or procedures in place to halt the Stop the Steal movement. So far-right users still used FB to organize and spread false content. All of which helped lead to the Jan 6 riot, which left five people dead, about 140 officers injured, and democracy threatened.

A platform for crime...The documents reportedly show that FB’s been falling short when it comes to taking down content with links to terrorism in the Middle East and North Africa. And that a Mexican drug cartel was using FB to recruit, train, and pay members. Plus, there are allegations of human trafficking and exploitation on the site. Think: allowing accounts that feature photos of people being sold as maids in the Middle East. The concerns around people trading and selling maids reportedly prompted Apple to threaten to pull FB from its App Store.

Faults around the world...The impact of Facebook’s actions – or lack thereof – isn’t only noticeable in the US. The docs reportedly mention several other countries, including:

India. Aka the company’s largest market with 340 million users. FB has apparently been a breeding ground for anti-Muslim propaganda and hate speech. And things like bots and fake accounts have impacted the country on a national and political scale, including elections.

Vietnam. Where CEO Mark Zuckerberg allegedly caved to the gov. And personally agreed to censor posts by anti-gov dissidents. Not doing so could have gotten FB kicked out of one of Asia's most profitable markets.

Myanmar and Sri Lanka. Content on FB in these countries has allegedly stoked physical violence against religious or ethnic groups. Including the Rohingya Muslims in Myanmar. Things like translation issues and a lack of cultural awareness of certain regions have apparently allowed this type of content to live on the platform.

Like it, share it…According to company docs, Facebook looked into the effects of removing its “like” button. That came after reportedly learning that the popular feature could cause "stress and anxiety" for Instagram users. But the company found that user engagement with posts and ads (aka the company's bread and butter) took a hit. In the end, FB reportedly didn’t move beyond testing. And despite finding that the share button helped spread false or misleading content, it hasn’t removed that feature either. Company research also apparently found that FB group suggestions could lead users “down the path to conspiracy theories.” And the company’s algorithm has been blamed for fueling hate.

Trying to stay cool...As the owner of Instagram and WhatsApp, Facebook’s been accused of having a monopoly in the social media sphere. But the company is apparently fearful of losing out on younger users. When it comes to the Facebook app, the number of teen users in the US has reportedly declined by 13% since 2019.
And that drop’s expected to grow in the coming years. The company’s even tried to repackage Instagram to appeal to younger users. Enter: “Instagram Kids.” But the social media platform for those under 13 hit a snag after backlash from lawmakers. Now, the company’s hit pause on the rollout. And says it’ll “work with parents, experts, policymakers, and regulators” to address their concerns about the proposed app.

How Did Facebook Get...Here?

Many may be wondering how things got this bad for the company. The Reel(s) answer: Facebook allegedly ignored the warning signs, from research findings to employees’ flags. And the company’s actions have reportedly created division within FB: Some employees have made efforts to steer FB in the right direction. But others have stood behind management’s decisions, with one person even calling executives “brilliant.”

Through all of this, Facebook has maintained that Haugen and the papers are painting a “false” narrative. It said that the Facebook Papers were a “curated selection out of millions of documents at Facebook.” And that allegations that it profits “at the expense of people's safety or wellbeing” conflict with its true interests. Adding that it has invested $13 billion and has over 40,000 employees working to keep users safe.

The weeks of bad news came before Zuckerberg announced that the company’s changing its name to Meta. The rebrand will bring all of its apps and tech under one new umbrella. And shines a light on its new focus: building the metaverse. It’s a virtual space where you can interact with others via an avatar. (Yep, that’s apparently going to be a thing.) TBD whether a name change will be enough to help the company shift the current conversation away from its alleged public harm.

What the Facebook Papers Mean for Facebook

Now that’s meta. Amid all of the scrutiny, Facebook has continued to find succe$$. One study says it may be ‘too big to fail.’ And that if the platform were to shut down, it could hurt developing countries that depend on it for things like news and staying connected to family and friends.

But the newest allegations have orgs advocating for racial justice in tech saying Zuckerberg should resign. Adding that the findings reveal that leadership “continue to sacrifice the safety of our communities to line their pockets.” The orgs are reportedly calling on users to log off FB on Nov 10 to send the company a message.

On the Hill, there’s been bipartisan support to keep Facebook in check. Now, some lawmakers are saying the docs show why strong legislative action needs to happen. Sen. Kirsten Gillibrand (D-NY) said gov regulations “have not kept pace” with evolving algorithms. And called for the creation of a new fed agency to protect individuals’ data and privacy. Meanwhile, the SEC, which also has access to the docs, could investigate further. And has the power to fine Zuckerberg or remove him as chairman – though it's not clear how likely that would be.

theSkimm

Over the last decade, Facebook’s public profile has changed amid growing allegations that it’s not protecting the very people who have helped it succeed. Now, the company is facing what could be its worst crisis yet. But while some people may consider cutting ties with FB, it might not be easy to part ways with a powerful tool that’s tightly woven into billions of lives.

Updated on Oct 28 to reflect that Facebook's changing its company name to Meta