The Chaos Machine by Max Fisher: The Inside Story of How Social Media Rewired Our Minds and Our World
What's it about?
The Chaos Machine (2022) explores the dark side of social media. The design of apps like Facebook and Twitter, combined with the nature of human psychology, often makes social media bring out the worst in us.
Max Fisher, The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World. For many of us, it starts as soon as we wake up. We immediately reach for our phones, checking our notifications, Facebook, Twitter, Instagram.
Hopefully we've got some likes and comments, giving us that dopamine rush we've come to crave. If that sounds familiar, The Chaos Machine is for you. Perhaps you're one of the lucky few who's managed to kick the habit. Either way, there's something to be learned here, something to make us all reflect on what's happening to society. In The Chaos Machine, journalist Max Fisher delves into the dark side of social media. These apps aren't just a harmless distraction, but powerful tools carefully designed to keep us hooked.
Platforms like Facebook and Twitter exploit our psychological vulnerabilities, often with dire real-world consequences, from pandemic misinformation to political upheaval. Social media is changing our brains and our world. It's time, says Fisher, to rethink our relationship with these platforms. And after this lesson, you might just want to delete your accounts altogether. So, if you're ready, let's get started.

Exploiting our psychological weaknesses. If you've ever felt addicted to social media, finding yourself endlessly scrolling, or checking your notifications compulsively, just know that you're not alone.
And it's not your fault. Your brain has been hijacked, which was essentially the plan all along. Sean Parker, Facebook's first president, once revealed that social media apps were intentionally designed to consume our time and attention, as much as possible. The strategy involves giving users small hits of dopamine through likes or comments, making them feel good.
However, what makes these apps truly addictive is the inconsistent nature of the psychological rewards. It's a technique called intermittent variable reinforcement. This method is similar to how slot machines work in casinos. Sometimes you win, sometimes you don't. On social media, sometimes your post gets a lot of likes, but sometimes it gets nothing. This uncertainty keeps users posting, scrolling, and seeking the next reward.
It's subtle, and many people don't realize how addicted they are. But they soon get stuck in what Sean Parker calls a social validation feedback loop, constantly seeking approval from others. Social media taps into our desire for social validation and self-expression. It allows us to express our identities and affiliations, whether it's through a post about a football team or a political stance. This desire for social identity affirmation is deep-rooted. It's why we feel the need to put up a flag, wear a t-shirt with the logo of the college we went to, or display a bumper sticker.
Websites like BuzzFeed cater to this need. Think of all the lists like "31 things only people from a small town will understand." While this might seem harmless, it can exacerbate divisions and foster an us-versus-them mentality. Sometimes it leads to serious real-world consequences. Many of us are vaguely aware of social media's addictiveness and its negative impact. But what you may not know is just how deliberately these apps were designed to exploit our psychological weaknesses.
During his research, Max Fisher interviewed senior members of staff at companies like Facebook and was disturbed by what he discovered. Many of the people Fisher spoke to shrugged off criticism and denied responsibility for the negative effects social media had on users. It was a bit like the executives of a cigarette factory, saying that they couldn't understand all the complaints about the health impact of their product. Absurd, to say the least.
Going beyond our social limits. In 2013, Facebook was facing a challenge. User growth was stagnating and the company needed a strategy to boost engagement. So they decided to experiment.
Was it possible to go beyond the Dunbar limit? The Dunbar limit is named after the anthropologist Robin Dunbar, who once proposed that humans have a social limit, a cognitive cap. At most, we're able to manage around 150 relationships. That's because we evolved in groups that had a maximum of around 150 people. Back in 2013, the average Facebook user had around 130 friends. To stimulate growth, Facebook's algorithm started showing users content from what were known as "weak ties": friends of friends.
This tactic expanded users' social circles significantly. Soon, Twitter adopted a similar approach, encouraging users to follow and engage with friends of friends. This strategy probably seemed harmless, or even beneficial. The social media companies could have argued that they were simply promoting connection. What's wrong with that? Well, when we're pushed beyond our neurological limits, there are consequences.
Consider studies conducted with rhesus monkeys and macaques who, like us, also have a social limit. The studies show that larger group sizes lead to increased aggression and distrust, as these animals struggle to navigate larger social networks. They become more focused on hierarchies and control within the group. Social media users have experienced similar effects over the past decade. As platforms like Facebook and Twitter expanded our social circles beyond the Dunbar limit, online behavior shifted. The digital space became more hostile, and people grew increasingly radicalized.
Renée DiResta is a tech investor who spent time investigating anti-vaccine groups on Facebook. According to her, the company has built an outrage machine. A user might join a standard parenting group and then be shown groups spreading medical misinformation—for instance, the false claim that the Zika virus is manufactured. From there, the algorithm could lead the user to other, even more extreme conspiracies.
Essentially, the algorithm recognizes that a user who's interested in one conspiracy will probably be interested in another. With just a few clicks, the user is exposed to increasingly extreme content—content they might never have chosen to seek out in the first place. But the social media companies don't care about the consequences. If users are engaged and the companies are making money, that's all that matters.
Outrage and punishment. We might like to think that we avoid arguments on the whole, and that we aren't naturally angry people. Yet, when it comes to social media, many of us find it hard to resist. Just take a look at Facebook—all those heated debates and outraged comments.
Outrage, particularly moral outrage, is a powerful and deeply rooted instinct that drives much of our online behavior. Imagine an earlier time in human civilization, when our social groups were limited to around 150 people. How did these groups ensure everyone got along and followed the rules? The answer? Through moral outrage. When someone broke social norms, others would become angry, and they would broadcast their anger to the rest of the group, to ensure the transgressor was punished.
This instinctive behavior is now amplified on social media. Posts expressing moral outrage go viral as our evolutionary instincts kick in, compelling us to shame and punish wrongdoers publicly. A notable example occurred in 2020, when a birdwatcher in New York's Central Park asked a woman to leash her dog. She refused, and the situation escalated. The birdwatcher, a Black man, pulled out his phone to record the incident. In response, the woman, who was white, called 911, falsely claiming she was being threatened by an African-American man.
In a time when police violence against Black people was often in the news, the video went viral, reaching 40 million views. Levels of moral outrage reached fever pitch, as people contacted the employers of the dog owner, pressuring them to fire her, which they did. People were so incensed that they even contacted the shelter where the woman had adopted her dog. Under pressure, she temporarily returned her dog to the shelter. The social media collective had decided this woman needed to be punished, but the online response was clearly out of proportion. Even the birdwatcher had doubts.
"I'm not excusing the racism," he said, "but I don't know if her life needed to be torn apart." On an evolutionary level, we are wired to get a dopamine hit from punishing a perceived wrongdoer. The larger the audience, the more willing we are to express outrage and enact punishment. This explains why, on platforms like Facebook and Twitter, moral outrage and shaming can spiral out of control, with more and more people joining in the frenzy. There's no doubt that social media has a disturbing, unprecedented power to both connect and divide us.

Real-world consequences. Adding fuel to the fire are hate speech and misinformation, which are endemic on most social media platforms.
To give just a few examples, there were the anti-vaccine posts during the COVID-19 pandemic, the YouTube channel of the conspiracy theorist Alex Jones, and extremist Facebook posts in Myanmar, which have been blamed for contributing to genocide. That's how bad it can get. The combination of vitriol and misinformation, boosted by the algorithm, is a dangerous one. But until recently, social media companies had made virtually no attempts to stop it.
In June 2020, hundreds of Facebook employees staged a walkout, protesting the company's inaction when Donald Trump wrote a post inciting violence, targeting people who were protesting against racial injustice. Facebook had failed to take down the post. But in August, under pressure, Facebook and Twitter finally took action. Trump had published a video falsely claiming that children were almost immune to COVID, and this time, the post was removed. Social media platforms also began to introduce other measures, like fact-checking boxes and crackdowns on accounts that shared QAnon conspiracies. But it was a case of too little, too late, especially for QAnon.
By late 2020, many social media users had fallen far down the rabbit hole. They believed that the government was corrupt, that election fraud was widespread, and that Trump, not Biden, should be in power. In mid-December 2020, a month and a half after Biden won the election, Trump posted on Twitter: "Big protest in D.C. on January 6th. Be there, will be wild!" And we all know what happened next.
The Capitol siege. Egged on by Trump and the social media frenzy, thousands descended on Washington, DC and forced their way into the Capitol building in an attempt to overturn the results of the election. In the chaos that followed, many were injured, and some people even died. Ashli Babbitt, a QAnon supporter with an active Twitter account, broke into a room in the Capitol, where she was shot and killed by a police officer. Babbitt was wearing a Trump flag as a cape, and a lot of the commentary on January 6th has focused on Trumpism. While that's certainly one aspect of it, we shouldn't lose sight of what the movement really was—something created by social media.
The people who stormed the Capitol were there because of the posts they'd seen on Facebook, Twitter, and YouTube. They'd been riled up by misinformation, to the point where they felt they needed to act. In the aftermath of the riot, some social media companies banned Trump from using the platforms. But the ban and the solution were only temporary.
Time to turn off the machine? After the Capitol siege, there were calls for change—from politicians, people in the tech industry, and employees of social media companies. Enough was enough. For a while, there was a lot of discussion about how companies could change.
Perhaps subscriptions were the answer. If users paid to log on, social media companies wouldn't be so dependent on ad revenue, and therefore engagement. Ultimately, though, the social media giants don't want to change. They don't see it as their responsibility. And yet, these companies are fully aware of the harm they're causing. And they have been for years.
In public, people like Mark Zuckerberg or Sheryl Sandberg at Facebook will deny responsibility and play down concerns. But in private, it's a different story. In 2021, a Facebook employee-turned-whistleblower shared internal documents with The Wall Street Journal. The whistleblower, Frances Haugen, had had enough. She believed that Facebook was deliberately sacrificing the safety of its users—and was even willing to sacrifice democracy. All the company cared about was profit.
The documents Haugen shared told a troubling story of a company that knew everything. Facebook's executives had been warned of the dangers on the platform, such as vaccine misinformation or the rise in hate speech. But the company did nothing. Haugen later decided to speak publicly about her experiences. In an interview on 60 Minutes, she said that although Facebook had the power to make changes—for example, by tweaking the algorithm to make the site safer—it deliberately chose not to. If users spend less time on Facebook, they'll click on fewer ads, and the company will make less money.
One solution, according to Haugen, would be to turn off the algorithm. Computers shouldn't be the ones deciding what users focus on. But convincing Facebook to switch off the algorithm may not be realistic. And besides, that's not the platform's only problematic feature. Many of the experts Fisher interviewed for The Chaos Machine came to the same conclusion. We'd all benefit if social media were switched off to a certain extent, stripped down and not so tightly woven into itself.
That might mean living with a less interesting internet—an internet with fewer entertaining videos or lively communities. But if that also means living in a world with less hate and misinformation, wouldn't that be worth it? The more people Fisher spoke to about social media, the more he was reminded of the computer HAL in the Stanley Kubrick film 2001: A Space Odyssey. HAL is not supposed to be the villain, but when he malfunctions and attempts to kill the crew on board the spaceship he controls, they're left with no choice.
They have to turn him off. Even though it's difficult, as it means the loss of this extraordinary computer, the humans must take back control. Perhaps there's a lesson there for all of us. In this lesson on The Chaos Machine by Max Fisher, you've learned that social media addiction isn't your fault.
It occurs by design. Platforms like Facebook and Twitter are carefully crafted to keep users hooked, like a slot machine in a casino. Our need for social validation and expression drives us to engage more on social media, but this also fosters division. In addition, these platforms deliberately push us past our social capacity limits, leading to increased aggression and radicalization.
For example, anti-vaccine posts and conspiracy theories spread rapidly, often leading users down extreme rabbit holes. The Capitol siege on January 6, 2021, was a stark example of social media's power. Fueled by misinformation and calls for protest, a mob stormed the Capitol, resulting in absolute chaos. Frustratingly, social media companies have been slow and reluctant to make any significant changes. Internal documents shared by a Facebook whistleblower indicate that these platforms prioritize profit over user safety. Some experts suggest we need to rethink our relationship with social media, possibly even getting rid of it altogether, in order to reduce harm and misinformation.
It's a challenging idea, but one that could lead to a safer, less chaotic world. Okay, that's it for this lesson. We hope you enjoyed it. If you can, please take the time to leave us a rating. We always appreciate your feedback. See you in the next lesson.