Mark Zuckerberg’s Squirrel Paradox

For years, everyone around me criticized Facebook et al. for ruining our family dinners, work meetings, and even weekend parties. Today, we praise them as the saviors of humanity in this pandemic age.

Covid-19 has pushed us into anti-social living and working habits. Some argue that products like Facebook are helping us get through this crisis and letting us experience their real value.

As the use of these platforms has increased during this period (Figure 1), we should seriously start thinking about their side effects. They carry multiple flaws that turn out to be dangerous to our societies. A critical view of these services can help them grow and open space for innovation in a field that has seemed restricted and dominated over the last decade.

This article will focus on two fundamental flaws in the social media paradigm: filter bubbles and personal data exposure. Both threaten our freedom and determine how easily these platforms can be used to manipulate us.

Figure 1 — Source: The New York Times, “The Virus Changed the Way We Internet.”

Filter Bubbles and The Squirrel Paradox

“A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa.” — Mark Zuckerberg

The principle that Zuckerberg refers to is called the “Hierarchy of Death” in journalism, and it is considered a pillar of the new social media landscape.

To describe the issue in detail: presume you somehow gathered a third of humanity on a single platform (as Facebook has, with its 2.7 billion users). How can you keep those people hooked and make sure they consume most of their information on your platform? The problem is naturally challenging. However, the bright people building these platforms came up with a joint solution that is technically optimal.

We refer to their solution as “The Filter Bubbles” throughout this article.


Zuckerberg was a psychology major at Harvard, though he took mostly computer science classes. It is no coincidence that the filtering solution blends two concepts, one from psychology and one from computer science:

  • Confirmation bias: the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one’s prior beliefs or values.
  • Divide and conquer: an algorithm design paradigm in which a problem is recursively broken down into two or more sub-problems of the same or a related type until these become simple enough to be solved directly.

Create a bubble around the user, then feed them what they are already comfortable with. God said, “Let there be light,” and there was light.

In a personalized world, we will increasingly be typed and fed only news that is pleasant, familiar, and confirms our beliefs — and because these filters are invisible, we won’t know what is being hidden from us. Our past interests will determine what we are exposed to in the future, leaving less room for the unexpected encounters that spark creativity, innovation, and the democratic exchange of ideas — Eli Pariser

The filter bubble (or comfort zone) is the field of information deemed relevant to you: content close to your network, or content the AI predicts you will enjoy. On Facebook, for example, that means posts from your friends and from the pages and people you follow, plus publications that resemble them in content, type, or topic.
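To make the mechanism concrete, here is a minimal, hypothetical Python sketch of a feed-ranking loop in the spirit of what is described above. The function names, fields, and weights are illustrative assumptions of mine, not any platform’s actual algorithm: candidate posts are scored by how closely they match topics the user has already engaged with and by whether the author sits inside the user’s network, so the feed naturally narrows toward the existing bubble.

```python
from collections import Counter

def rank_feed(candidate_posts, user_history, user_friends, top_k=10):
    """Toy filter-bubble ranking (illustrative only, not a real platform's code):
    favor posts that look like what the user already engaged with and that
    come from inside their network."""
    # Confirmation-bias signal: how often the user engaged with each topic.
    topic_affinity = Counter(post["topic"] for post in user_history)

    def score(post):
        familiarity = topic_affinity.get(post["topic"], 0)          # reward familiar topics
        proximity = 1.0 if post["author"] in user_friends else 0.0  # reward the inner circle
        return 2.0 * familiarity + proximity

    # By construction, the highest-scoring posts are the most familiar ones.
    return sorted(candidate_posts, key=score, reverse=True)[:top_k]

# A user who only ever engaged with squirrels will keep seeing squirrels.
history = [{"topic": "squirrels", "author": "alice"}] * 5
candidates = [
    {"topic": "squirrels", "author": "alice"},
    {"topic": "world-news", "author": "reuters"},
]
print(rank_feed(candidates, history, user_friends={"alice"}))
```

Every click feeds back into the user’s history, so over time the “unexpected encounters” Pariser mentions score lower and lower.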

Bystander at the Switch

Most modern social media platforms let their AI decide what to show you based on your interactions, location, network, and preferences. What if what the AI chooses to show you harms you? This brings us to a well-known ethical dilemma, often discussed in AI circles: the trolley problem.

Figure 2 — Source: https://tomkow.typepad.com/tomkowcom/trolley_problem/

The most basic version of the dilemma, known as “Bystander at the Switch” or “Switch”, goes thus:

There is a runaway trolley barrelling down the railway tracks. Ahead, on the tracks, there are five people tied up and unable to move. The trolley is headed straight for them. You are standing some distance off in the train yard, next to a lever. If you pull this lever, the trolley will switch to a different set of tracks. However, you notice that there is one person on the side track. You have two options:

- Do nothing and allow the trolley to kill the five people on the main track.
- Pull the lever, diverting the trolley onto the side track where it will kill one person.

What is the right thing to do?

AI filters have turned social networks into nests of closed bubbles. Inside your bubble, you see information tailored to please you and to confirm your thoughts and ideas. Hence, the technically optimal solution becomes a critical problem for freedom of speech, innovation, and democracy in general.

Can we entirely rely on AI to filter information on social media? Is the filter bubble approach the only possibility for content serving and recommendation on these platforms?

The Great Scandal of Cambridge Analytica

Beyond the harm that AI can do, there is an enemy more dangerous and better trained for battle: people. They are very skilled at misusing tools for personal, political, and financial gain. The story of Cambridge Analytica is a living example of how social media can be misused to manipulate us.

To fully understand this matter, we must go back a few years. In 2014, researchers at Cambridge University developed a technique to infer a person’s psychological profile solely from their activity on Facebook, especially from what they like.

Cambridge Analytica, a London-based consumer and political opinion research firm, was interested in this work and approached the group of researchers to collaborate. They refused, except for one psychologist, the Russian-American Dr. Aleksandr Kogan. Kogan knew his colleagues’ techniques and decided, in June 2014, to develop his own application, called ThisIsYourDigitalLife. Its principle was simple: it paid users to complete psychological tests while granting the app access to their data on Facebook.

Kogan sold the data he collected to Cambridge Analytica for around a million dollars. Nearly 270,000 people downloaded ThisIsYourDigitalLife, thinking they were participating in a university study.

Behind the scenes, the application also collected data from each user’s Facebook friends without their knowledge. In the end, more than 50 million profiles were harvested by Cambridge Analytica between 2014 and 2015, making it one of the largest illicit data collections in the history of social media.

Note that this was not a hack: no vulnerability was exploited. Kogan took advantage of the normal functioning of Facebook’s platform at the time. In April 2015, Facebook finally restricted the amount of data developers could access.
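For the technically curious, here is a simplified sketch of the kind of data flow the pre-2015 platform allowed. The endpoint paths and fields are written from memory of the old Graph API and should be read as illustrative assumptions rather than exact calls; the point is that a single consenting quiz-taker’s token opened a door onto their non-consenting friends’ profiles.

```python
import requests

GRAPH = "https://graph.facebook.com/v1.0"  # pre-2015 API version (illustrative)

def harvest(user_token):
    """Simplified sketch of the pre-2015 flow: one consenting user's token
    was enough to enumerate their friends and read the friends' likes.
    Endpoints and fields are illustrative, not exact."""
    profiles = {}

    # 1. The quiz taker grants the app a token and shares their own profile.
    me = requests.get(f"{GRAPH}/me", params={"access_token": user_token}).json()
    profiles[me["id"]] = me

    # 2. With that single token, the app could list the user's friends...
    friends = requests.get(
        f"{GRAPH}/me/friends", params={"access_token": user_token}
    ).json().get("data", [])

    # 3. ...and pull each friend's likes, which the Cambridge researchers had
    #    shown were enough to build a psychological profile.
    for friend in friends:
        likes = requests.get(
            f"{GRAPH}/{friend['id']}/likes", params={"access_token": user_token}
        ).json().get("data", [])
        profiles[friend["id"]] = {"name": friend.get("name"), "likes": likes}

    return profiles  # ~270,000 quiz takers expanded into tens of millions of profiles
```

Nothing like this works today: the friend-data permissions were removed in the platform changes mentioned above, which is exactly why the collection had to happen before 2015.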

We will investigate all apps that had access to large amounts of information before we changed our platform to dramatically reduce data access in 2014 — Mark Zuckerberg

Cambridge Analytica was hired by Donald Trump’s presidential campaign to optimize audience targeting for ads and fundraising. It is also believed to have influenced public opinion during the Brexit campaign.

In the Cambridge Analytica affair, personal data exposure enabled political manipulation. That directly affects democracy and threatens our lives, our history, and our future. And it can happen again: data breaches occur every day, and ever larger cyberattacks are being launched against these platforms.

Why is this centralized social media concept the dominant one? Can we think of a more secure way to share and consume data on social networks? Can we build on top of the blockchain concept to solve this issue?

The Curious Case of @RealDonaldTrump

The emergence of social media has changed how political communication takes place in the United States. Political actors such as politicians, parties, foundations, institutions, and think tanks use social media platforms like Facebook and Twitter to communicate with and engage voters.

In 2008, Barack Obama was the first to use the Internet to organize supporters, advertise, and communicate with individuals in a way that had been impossible in previous elections. Trump took the concept to another level. You would like to think the Internet would be used to benefit humanity, creating sustainable exchange and peace between nations.

Let’s look at the Iran conflict that arose this year after the killing of commander Qassim Suleimani at Baghdad International Airport. President Trump announced the military action by posting an image of the American flag on Twitter, without any text.

Social media pushes leaders to chase popularity and publish attention-grabbing content that users are eager to see. Do you think this is an ethical way to handle political and international issues?

“…decisions were often made because of ideology and politics. As a result many wrong-headed actions were taken, ones that did not solve the problem at hand but that fit with the interests or beliefs of the people in power.” — Joseph E. Stiglitz

I join Joseph E. Stiglitz in saying that today’s politics is influenced by social media even more than by political ideologies, and this is not healthy for the future of our world. We need to make it harder to fall into the same traps again and again. There is a thin line between extremism and populism, and social media platforms are blurring it by not taking concrete action to establish clear rules against war-mongering, manipulation, and hate speech.

The U.S. House of Representatives passed a resolution this year to stop President Donald Trump from taking further military action against Iran, rebuking the president days after he ordered a drone strike that killed a top Iranian commander and raised fears of war. He responded through Twitter, calling House Speaker Nancy Pelosi “Crazy,” and told reporters he did not need Congress’s approval for military action against Iran.

I don’t have to, and you shouldn’t have to, because you have to be able to make split-second decisions sometimes. Sometimes you have to move very, very quickly. — Donald J. Trump

You can see why this is a problem. No one wants the world’s future and global peace to depend on someone’s mood on Twitter.

There is nothing wrong with using social media to market your ideas or gain popularity, but at what cost? I am anti-war, but if you are going to start one, is Twitter really the right medium for declaring it to other nations?

Conclusion

Social media platforms are amazing. They let us connect in ways that were never possible before, and they are also helping us through this crisis. I use Facebook every day to check on my family. However, recognizing their limits and criticizing their weak points is always a great way to improve and develop better solutions.

“Here’s to the crazy ones. The misfits. The rebels. The troublemakers. The round pegs in the square holes. The ones who see things differently. They’re not fond of rules. And they have no respect for the status quo. You can quote them, disagree with them, glorify or vilify them. About the only thing you can’t do is ignore them. Because they change things. They push the human race forward. And while some may see them as the crazy ones, we see genius. Because the people who are crazy enough to think they can change the world, are the ones who do.”

Rob Siltanen, Apple’s ‘Think Different’ Campaign

I still think we can change the world for the better. We can use the power of the Internet to help humanity collaborate, work out its issues, and focus on what benefits all of us. The platforms we build, and how we treat and distribute information across users, are the keys to building sustainable and harmless social media.

Medium is an excellent social writing platform. It helps everyone discover ideas from people worldwide and makes the writing experience profitable for both the writer and the platform.

Meanwhile, 2.7 billion users on Facebook are just trolling and guessing whether Trump will win or lose the election next week.

