
Broken Code

Inside Facebook and the Fight to Expose Its Harmful Secrets

Large Print (Trade Paperback)
$35.00 US
6"W x 9.17"H x 1.12"D (15.2 x 23.3 x 2.8 cm) | 18 oz (505 g) | 12 per carton
On sale Nov 28, 2023 | 528 Pages | ISBN 9780593793190
Sales rights: US, Canada, Open Market

About

THE NEW YORK TIMES BOOK REVIEW EDITORS’ CHOICE • By an award-winning technology reporter for The Wall Street Journal, a behind-the-scenes look at the manipulative tactics Facebook used to grow its business, how it distorted the way we connect online, and the company insiders who found the courage to speak out

"Broken Code fillets Facebook’s strategic failures to address its part in the spread of disinformation, political fracturing and even genocide. The book is stuffed with eye-popping, sometimes Orwellian statistics and anecdotes that could have come only from the inside." —New York Times Book Review

Once the unrivaled titan of social media, Facebook held a singular place in culture and politics. Along with its sister platforms Instagram and WhatsApp, it was a daily destination for billions of users around the world. Inside and outside the company, Facebook extolled its products as bringing people closer together and giving them voice.

But in the wake of the 2016 election, even some of the company’s own senior executives came to consider those claims pollyannaish and simplistic. As a succession of scandals rocked Facebook, they—and the world—had to ask whether the company could control its own platforms, or even understood them.

Facebook employees set to work in pursuit of answers. They discovered problems that ran far deeper than politics. Facebook was peddling and amplifying anger, looking the other way at human trafficking, enabling drug cartels and authoritarians, and allowing VIP users to break the platform’s supposedly inviolable rules. The employees even raised concerns about whether the product was safe for teens. Facebook was distorting behavior in ways no one inside or outside the company understood.

Enduring personal trauma and professional setbacks, employees successfully identified the root causes of Facebook's viral harms and drew up concrete plans to address them. But the costs of fixing the platform—often measured in tenths of a percent of user engagement—were higher than Facebook's leadership was willing to pay. With their work consistently delayed, watered down, or stifled, those who best understood Facebook’s damaging effect on users were left with a choice: to keep silent or go against their employer.

Broken Code tells the story of these employees and their explosive discoveries. Expanding on “The Facebook Files,” his blockbuster, award-winning series for The Wall Street Journal, reporter Jeff Horwitz lays out in sobering detail not just the architecture of Facebook’s failures, but what the company knew (and often disregarded) about its societal impact. In 2021, the company would rebrand itself Meta, promoting a techno-utopian wonderland. But as Broken Code shows, the problems spawned around the globe by social media can’t be resolved by strapping on a headset.

Excerpt

1

Arturo Bejar’s return to Facebook’s Menlo Park campus in 2019 felt like coming home. The campus was bigger than when he’d left in 2015—Facebook’s staff doubled in size every year and a half—but the atmosphere hadn’t changed much. Engineers rode company bikes between buildings, ran laps on a half-mile trail through rooftop gardens, and met in the nooks of cafés that gave Facebook’s yawning offices a human scale.

Bejar was back because he suspected something at Facebook had gotten stuck. In his early years away from the company, as bad press rained down upon it and then accumulated like water in a pit, he’d trusted that Facebook was addressing concerns about its products as best it could. But he had begun to notice things that seemed off, details that made it seem like the company didn’t care about what its users experienced.

Bejar couldn’t believe that was true. Approaching fifty, he considered his six years at Facebook to be the highlight of a tech career that could only be considered charmed. He’d been a Mexico City teenager writing computer games for himself in the mid-1980s when he’d gotten a chance introduction to Apple co-founder Steve Wozniak, who was taking Spanish lessons in Mexico.

After a summer being shown around by a starstruck teenage tour guide, Wozniak left Bejar an Apple computer and a plane ticket to come visit Silicon Valley. The two stayed in touch, and Wozniak paid for Bejar to earn a computer science degree in London.

“Just do something good for people when you can,” Wozniak told him.

Success followed. After working on a visionary but doomed cybercommunity in the 1990s, Bejar spent more than a decade as the “Chief Paranoid” in Yahoo’s once-legendary security division. Mark Zuckerberg hired him as a Facebook director of engineering in 2009 after an interview held in the CEO’s kitchen.

Though Bejar’s expertise was in security, he’d embraced the idea that safeguarding Facebook’s users meant more than just keeping out criminals. Facebook still had its bad guys, but the engineering work that Facebook required was as much social dynamics as code.

Early in his tenure, Sheryl Sandberg, Facebook’s chief operating officer, asked Bejar to get to the bottom of skyrocketing user reports of nudity. His team sampled the reports and saw they were overwhelmingly false. In reality, users were encountering unflattering photos of themselves, posted by friends, and attempting to get them taken down by reporting them as porn. Simply telling users to cut it out didn’t help. What did was giving users the option to report not liking a photo of themselves, describing how it made them feel, and then prompting them to share that sentiment privately with their friend.

Nudity reports dropped by roughly half, Bejar recalled.

A few such successes led Bejar to create a team called Protect and Care. A testing ground for efforts to head off bad online experiences, promote civil interactions, and help users at risk of suicide, the work felt both groundbreaking and important. The only reason Bejar left the company in 2015 was that he was in the middle of a divorce and wanted to spend more time with his kids.

Though he was away from Facebook by the time the company’s post-2016 election scandals started piling up, Bejar’s six years there instilled in him a mandate long embedded in the company’s official code of conduct: “assume good intent.” When friends asked him about fake news, foreign election interference, or purloined data, Bejar stuck up for his former employer. “Leadership made mistakes, but when they were given the information they always did the right thing,” he would say.

But, truth be told, Bejar didn’t think of Facebook’s travails all that much. Having joined the company three years before its IPO, he had no money worries, and he was busy with nature photography, a series of collaborations with the composer Philip Glass, and restoring cars with his daughter Joanna, who at fourteen wasn’t yet old enough to drive. She documented their progress restoring a Porsche 914—a 1970s model derided for having the aesthetics of a pizza box—on Instagram, which Facebook had bought in 2012.

Joanna’s account became moderately successful, and that’s when things got a little dark. Most of her followers were enthused about a girl getting into car restoration, but some showed up with rank misogyny, like the guy who told Joanna she was getting attention “just because you have tits.”

“Please don’t talk about my underage tits,” Joanna Bejar shot back before reporting the comment to Instagram. A few days later, Instagram notified her that the platform had reviewed the man’s comment. It didn’t violate the platform’s community standards.

Bejar, who had designed the predecessor to the user-reporting system that had just shrugged off the sexual harassment of his daughter, told her the decision was a fluke. But a few months later, Joanna mentioned to Bejar that a kid from a high school in a neighboring town had sent her a picture of his penis via an Instagram direct message. Most of Joanna’s friends had already received similar pics, she told her dad, and they all just tried to ignore them.

Bejar was floored. The teens exposing themselves to girls who they had never met were creeps, but they presumably weren’t whipping out their dicks when they passed a girl in a school parking lot or in the aisle of a convenience store. Why had Instagram become a place where it was accepted that these boys occasionally would—or that young women like his daughter would have to shrug it off?

Bejar’s old Protect and Care team had been renamed and reshuffled after his departure, but he still knew plenty of people at Facebook. When he began peppering his old colleagues with questions about the experience of young users on Instagram, they responded by offering him a consulting agreement. Maybe he could help with some of the things he was concerned about, Bejar figured, or at the very least answer his own questions.

That was how Arturo Bejar found himself back on Facebook’s campus. Just shy of fifty and highly animated—Bejar’s reaction to learning something new and interesting is a gesture meant to evoke his head exploding—he had unusual access due to his easy familiarity with Facebook’s most senior executives. Dubbing himself a “free-range Mexican,” he began poring over internal research and setting up meetings to discuss how the company’s platforms could better support their users.

The mood at the company had certainly darkened in the intervening four years. Yet, Bejar found, everyone at Facebook was just as smart, friendly, and hardworking as they had been before, even if no one any longer thought that social media was pure upside. The company’s headquarters—with its free laundry service, cook-to-order meals, on-site gym, recreation and medical facilities—remained one of the world’s best working environments. It was, Bejar felt, good to be back.

That nostalgia probably explains why it took him several months to check in on what he considered his most meaningful contribution to Facebook—the revamp of the platform’s system for reporting bad user experiences.

It was the same impulse that had led him to avoid setting up meetings with some of his old colleagues from the Protect and Care team. “I think I didn’t want to know,” he said.

Bejar was at home when he finally pulled up his team’s old system. The carefully tested prompts that he and his colleagues had composed—asking users to share their concerns, understand Facebook’s rules, and constructively work out disagreements—were gone. Instead, Facebook now demanded that people allege a precise violation of the platform’s rules by clicking through a gauntlet of pop-ups. Users determined enough to complete the process arrived at a final screen requiring them to reaffirm their desire to submit a report. If they simply clicked a button saying “done,” rendered as the default in bright Facebook blue, the system archived their complaint without submitting it for moderator review.

What Bejar didn’t know then was that, six months prior, a team had redesigned Facebook’s reporting system with the specific goal of reducing the number of completed user reports so that Facebook wouldn’t have to bother with them, freeing up resources that could otherwise be invested in training its artificial intelligence–driven content moderation systems. In a memo about efforts to keep the costs of hate speech moderation under control, a manager acknowledged that Facebook might have overdone its effort to stanch the flow of user reports: “We may have moved the needle too far,” he wrote, suggesting that perhaps the company might not want to suppress them so thoroughly.

The company would later say that it was trying to improve the quality of reports, not stifle them. But Bejar didn’t have to see that memo to recognize bad faith. The cheery blue button was enough. He put down his phone, stunned. This wasn’t how Facebook was supposed to work. How could the platform care about its users if it didn’t care enough to listen to what they found upsetting?

There was an arrogance here, an assumption that Facebook’s algorithms didn’t even need to hear about what users experienced to know what they wanted. And even if regular users couldn’t see that like Bejar could, they would end up getting the message. People like his daughter and her friends would report horrible things a few times before realizing that Facebook wasn’t interested. Then they would stop.

When Bejar next stepped onto Facebook’s campus, he was still surrounded by smart, earnest people. He couldn’t imagine any of them choosing to redesign Facebook’s reporting features with the goal of tricking users into depositing their complaints in the trash; but clearly they had.

“It took me a few months after that to wrap my head around the right question,” Bejar said. “What made Facebook a place where these kinds of efforts naturally get washed away, and people get broken down?”

Unbeknownst to Bejar, a lot of Facebook employees had been asking similar questions. As scrutiny of social media ramped up from without and within, Facebook had accumulated an ever-expanding staff devoted to studying and addressing a host of ills coming into focus.

Broadly referred to as integrity work, this effort had expanded far beyond conventional content moderation. Diagnosing and remediating social media’s problems required not just engineers and data scientists but intelligence analysts, economists, and anthropologists. This new class of tech workers had found themselves up against not just outside adversaries determined to harness social media for their own ends but senior executives’ beliefs that Facebook usage was by and large an absolute good. When ugly things transpired on the company’s namesake social network, these leaders pointed a finger at humanity’s flaws.

Staffers responsible for addressing Facebook’s problems didn’t have that luxury. Their jobs required understanding how Facebook could distort its users’ behavior—and how it was sometimes “optimized” in ways that would predictably cause harm. Facebook’s integrity staffers became the keepers of knowledge that the outside world didn’t know existed and that their bosses refused to believe.

As a small army of researchers with PhDs in data science, behavioral economics, and machine learning was probing how their employer was altering human interaction, I was busy grappling with far more basic questions about how Facebook worked. I had recently moved back to the West Coast to cover Facebook for the Wall Street Journal, a job that came with the unpleasant necessity of pretending to write with authority about a company I did not understand.

Still, there was a reason I wanted to cover social media. After four years of investigative reporting in Washington, the political accountability work I was doing felt pointless. The news ecosystem was dominated by social media now, and stories didn’t get traction unless they appealed to online partisans. There was so much bad information going viral, but the fact-checks I wrote seemed less like a corrective measure than a weak attempt to ride bullshit’s coattails.

Covering Facebook was, therefore, a capitulation. The system of information sharing and consensus building of which I was a part was on its last legs, so I might as well get paid to write about what was replacing it.

The surprise was how hard it was to even figure out the basics. Facebook’s public explainers of the News Feed algorithm—the code that determined which posts were surfaced before billions of users—relied on phrases like “We’re connecting you to who and what matters most.” (I’d later learn there was a reason why the company glossed over the details: focus groups had concluded that in-depth explanations of News Feed left users confused and unsettled—the more people thought about outsourcing “who and what matters most” to Facebook, the less comfortable they got.)

In a nod to its immense power and societal influence, the company created a blog called Hard Questions in 2017, declaring in its inaugural post that it took “seriously our responsibility—and accountability—for our impact and influence.” But Hard Questions never delved into detail, and after a couple of bruising years of public scrutiny, the effort was quietly abandoned.

By the time I started covering Facebook, the company’s reluctance to field reporters’ queries had grown, too. Facebook’s press shop—a generously staffed team of nearly four hundred—had a reputation for being friendly, professional, and reticent to answer questions. I had plenty of PR contacts, but nobody who wanted to tell me how Facebook’s “People You May Know” recommendations worked, which signals sent controversial posts viral, or what the company meant when it said it had imposed extraordinary user-safety measures amid ethnic cleansing in Myanmar. The platform’s content recommendations shaped what jokes, news stories, and gossip went viral across the world. How could it be such a black box?

The resulting frustration explains how I became a groupie of anyone who had a passing familiarity with Facebook’s mechanics. The former employees who agreed to speak to me said troubling things from the get-go. Facebook’s automated enforcement systems were flatly incapable of performing as billed. Efforts to engineer growth had inadvertently rewarded political zealotry. And the company knew far more about the negative effects of social media usage than it let on.

This was wild stuff, far more compelling than the perennial allegations that the platform unfairly censored posts or favored President Trump. But my ex-Facebook sources couldn’t offer much in the way of proof. When they’d left the company, they’d left their work behind Facebook’s walls.

I did my best to cultivate current employees as sources, sending hundreds of notes that boiled down to two questions: How does a company that holds sway over billions of people actually work? And why, so often, does it seem like it doesn’t?

Other reporters did versions of this too, of course. And from time to time we obtained stray documents indicating that Facebook’s powers, and problems, were greater than it let on. I had the luck of being there when the trickle of information became a flood.

Praise

NAMED A PORCHLIGHT BOOKS BEST BUSINESS BOOK OF THE YEAR

"Broken Code fillets Facebook’s strategic failures to address its part in the spread of disinformation, political fracturing and even genocide. The book is stuffed with eye-popping, sometimes Orwellian statistics and anecdotes that could have come only from the inside."
—New York Times Book Review

"Broken Code offers a comprehensive, briskly reported examination of key systems governing [Facebook] and their many failings... A smartly reported investigation into the messy internal machinations of one of the world’s most important and least understood companies."
—Washington Post


“Jeff Horwitz has written a blockbuster exposé of Facebook, the notoriously secretive social media giant whose benign mission—connecting people—masked a growing propensity towards some of humanity’s worst impulses. Populated by concerned, brave employees who defied their employer and leaked thousands of pages of internal documents to Horwitz, with the imperious, remote Mark Zuckerberg and his top lieutenants at the center, Broken Code is brilliant reporting and a page-turning narrative of immense importance.”
—James B. Stewart, Pulitzer Prize-winning investigative journalist and New York Times bestselling author

“A dogged and meticulous reporter, Jeff Horwitz is at the height of his powers in Broken Code, a penetrating portrait of one of the most significant companies in the world and of one of the great new challenges of this technological era.”
—Ronan Farrow, Pulitzer Prize-winning investigative journalist and New York Times bestselling author

"An unsettling account... Stories of executives bumbling their way through or outright ignoring issues within the company are breathtaking and troubling... Horwitz’s reporting shines... This convincingly makes the case that Facebook’s pursuit of growth at any cost has had disastrous offline consequences."
—Publishers Weekly

"Readers interested in the ethics of the internet and technology, the business aspects of social media, and social media's impact on society at large will be fascinated. Horwitz has created an essential resource."
—Booklist

"A well-researched, disturbing study of a tech behemoth characterized by arrogance, hypocrisy, and greed."
—Kirkus Reviews

"Impressive reporting... A thoroughly documented portrait of a company that recognizes its products have harmed people yet declines to meaningfully change them."
—San Francisco Chronicle

Author

Author photo © Camas Goble
JEFF HORWITZ is a technology reporter for The Wall Street Journal. His work on “The Facebook Files” won the George Polk Award for Business Reporting and the Gerald Loeb Award for Beat Reporting. Previously an investigative reporter for the Associated Press in Washington, DC, he lives in the San Francisco Bay Area.

Rights

Available for sale exclusive:
•     Canada
•     Guam
•     Minor Outl.Ins.
•     North Mariana
•     Philippines
•     Puerto Rico
•     Samoa,American
•     US Virgin Is.
•     USA

Available for sale non-exclusive:
•     Afghanistan
•     Aland Islands
•     Albania
•     Algeria
•     Andorra
•     Angola
•     Anguilla
•     Antarctica
•     Argentina
•     Armenia
•     Aruba
•     Austria
•     Azerbaijan
•     Bahrain
•     Belarus
•     Belgium
•     Benin
•     Bolivia
•     Bonaire, Saba
•     Bosnia Herzeg.
•     Bouvet Island
•     Brazil
•     Bulgaria
•     Burkina Faso
•     Burundi
•     Cambodia
•     Cape Verde
•     Centr.Afr.Rep.
•     Chad
•     Chile
•     China
•     Colombia
•     Comoro Is.
•     Congo
•     Cook Islands
•     Costa Rica
•     Croatia
•     Cuba
•     Curacao
•     Czech Republic
•     Dem. Rep. Congo
•     Denmark
•     Djibouti
•     Dominican Rep.
•     Ecuador
•     Egypt
•     El Salvador
•     Equatorial Gui.
•     Eritrea
•     Estonia
•     Ethiopia
•     Faroe Islands
•     Finland
•     France
•     Fren.Polynesia
•     French Guinea
•     Gabon
•     Georgia
•     Germany
•     Greece
•     Greenland
•     Guadeloupe
•     Guatemala
•     Guinea Republic
•     Guinea-Bissau
•     Haiti
•     Heard/McDon.Isl
•     Honduras
•     Hong Kong
•     Hungary
•     Iceland
•     Indonesia
•     Iran
•     Israel
•     Italy
•     Ivory Coast
•     Japan
•     Kazakhstan
•     Kyrgyzstan
•     Laos
•     Latvia
•     Lebanon
•     Liberia
•     Libya
•     Liechtenstein
•     Lithuania
•     Luxembourg
•     Macau
•     Macedonia
•     Madagascar
•     Maldives
•     Mali
•     Marshall island
•     Martinique
•     Mauritania
•     Mayotte
•     Mexico
•     Micronesia
•     Moldavia
•     Monaco
•     Mongolia
•     Montenegro
•     Morocco
•     Myanmar
•     Nepal
•     Netherlands
•     New Caledonia
•     Nicaragua
•     Niger
•     Niue
•     Norfolk Island
•     North Korea
•     Norway
•     Oman
•     Palau
•     Palestinian Ter
•     Panama
•     Paraguay
•     Peru
•     Poland
•     Portugal
•     Qatar
•     Reunion Island
•     Romania
•     Russian Fed.
•     Saint Martin
•     San Marino
•     SaoTome Princip
•     Saudi Arabia
•     Senegal
•     Serbia
•     Sint Maarten
•     Slovakia
•     Slovenia
•     South Korea
•     South Sudan
•     Spain
•     St Barthelemy
•     St.Pier,Miquel.
•     Sth Terr. Franc
•     Suriname
•     Svalbard
•     Sweden
•     Switzerland
•     Syria
•     Tadschikistan
•     Taiwan
•     Thailand
•     Timor-Leste
•     Togo
•     Tokelau Islands
•     Tunisia
•     Turkey
•     Turkmenistan
•     Ukraine
•     Unit.Arab Emir.
•     Uruguay
•     Uzbekistan
•     Vatican City
•     Venezuela
•     Vietnam
•     Wallis,Futuna
•     West Saharan
•     Yemen

Not available for sale:
•     Antigua/Barbuda
•     Australia
•     Bahamas
•     Bangladesh
•     Barbados
•     Belize
•     Bermuda
•     Bhutan
•     Botswana
•     Brit.Ind.Oc.Ter
•     Brit.Virgin Is.
•     Brunei
•     Cameroon
•     Cayman Islands
•     Christmas Islnd
•     Cocos Islands
•     Cyprus
•     Dominica
•     Falkland Islnds
•     Fiji
•     Gambia
•     Ghana
•     Gibraltar
•     Grenada
•     Guernsey
•     Guyana
•     India
•     Iraq
•     Ireland
•     Isle of Man
•     Jamaica
•     Jersey
•     Jordan
•     Kenya
•     Kiribati
•     Kuwait
•     Lesotho
•     Malawi
•     Malaysia
•     Malta
•     Mauritius
•     Montserrat
•     Mozambique
•     Namibia
•     Nauru
•     New Zealand
•     Nigeria
•     Pakistan
•     PapuaNewGuinea
•     Pitcairn Islnds
•     Rwanda
•     S. Sandwich Ins
•     Seychelles
•     Sierra Leone
•     Singapore
•     Solomon Islands
•     Somalia
•     South Africa
•     Sri Lanka
•     St. Helena
•     St. Lucia
•     St. Vincent
•     St.Chr.,Nevis
•     Sudan
•     Swaziland
•     Tanzania
•     Tonga
•     Trinidad,Tobago
•     Turks&Caicos Is
•     Tuvalu
•     Uganda
•     United Kingdom
•     Vanuatu
•     Western Samoa
•     Zambia
•     Zimbabwe