An Associated Press article last week reported on German Neo-Nazi groups’ use of Facebook, Instagram and YouTube to “spread their ideology, draw in recruits and make money through ticket sales and branded merchandise.”

The prospect of Nazism reviving in Germany, its birthplace, is nightmarish enough. Worse is that it’s happening with the assistance of American social media. But having just finished reading “An Ugly Truth” — a recently published account of Facebook’s rise from a 2004 Harvard College dorm-room project to a nearly trillion-dollar social media conglomerate with almost 3 billion users — I can’t say I’m surprised.

Facebook’s story mirrors that of Frankenstein. Mary Shelley’s 1818 novel about a young scientist’s obsessive quest to create a humanoid from non-living matter uncannily anticipated the reckless hubris of Facebook founder Mark Zuckerberg’s mission to connect every human being on the planet to a kind of virtual global frat house. In each, the monster got out of the creator’s control.

To comprehend the dangers of Facebook, it’s necessary to know a bit about human psychology and how the platform manipulates that psychology.

People have many irrational impulses, often existing at an unconscious or subconscious level.

Studies have shown, for instance, that the brain is wired with a number of biases that interfere with our perception of reality. These include the tendency to estimate the probability of an event by the ease with which examples come to mind (the availability heuristic), the pessimistic belief that the future is likely to be worse than the present, and the inclination to interpret situations in ways that support the outlook of the social groups to which people belong.


As a hypothetical example of the three biases working in tandem, suppose there’s extensive television news coverage of a fatal plane crash of unknown cause that claims hundreds of lives. The coverage could lead certain viewers to believe that commercial air travel is extremely dangerous, that lax FAA inspection practices are making it increasingly unsafe, and that permissive immigration policies are giving foreign terrorists easy access to board and sabotage flights. Each belief would be false, yet each would feel plausible: the first because vivid coverage makes the risk easy to recall, the second because decline seems natural to expect, and the third because it fits the narrative of certain groups.

Zuckerberg’s brilliant insight behind Facebook was that, in exchange for free access to a virtual social network, people could be persuaded to share enormous amounts of personal information in order to keep or expand their circle of friends. The network permitted users to send messages to one another and share items of interest from websites dealing with everything from politics to sports, cooking, travel, dating, child-rearing and gardening. The large user base, coupled with the technical innovation of algorithmic amplification, turned Facebook into a lucrative targeted-advertising business.

Artificial intelligence algorithms monitor the frequency, duration and content of user “clicks,” not only on users’ own pages but on the various websites they and their virtual friends “like.” This yields a metric of each user’s interests, allowing Facebook to serve a daily “news feed” tailored to those interests and to invite the user to join Facebook virtual “groups” that mesh with them. Such offerings amplify engagement and increase the overall time users spend on the platform.
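To make the mechanics concrete, here is a deliberately simplified sketch, in Python, of how an engagement-driven ranking loop of this general kind can work. The categories, weights and sample data are invented for illustration; this is not Facebook’s actual, proprietary algorithm, only a toy model of the feedback loop the column describes.

```python
from collections import Counter

# Hypothetical interaction log: (post_category, seconds_spent, clicked, liked).
# The categories and scoring weights below are invented for illustration only.
user_history = [
    ("gardening", 45, True, True),
    ("politics", 120, True, False),
    ("politics", 300, True, True),
    ("travel", 10, False, False),
]

def interest_profile(history):
    """Score each category by how much attention it has received."""
    scores = Counter()
    for category, seconds, clicked, liked in history:
        scores[category] += seconds / 60          # dwell time
        scores[category] += 2 if clicked else 0   # clicks count for more
        scores[category] += 3 if liked else 0     # likes count for most
    return scores

def rank_feed(candidate_posts, profile):
    """Order candidate posts so the most 'engaging' categories come first."""
    return sorted(candidate_posts,
                  key=lambda post: profile.get(post["category"], 0),
                  reverse=True)

candidates = [
    {"title": "Ten tomato-staking tips", "category": "gardening"},
    {"title": "Outrage over new election bill", "category": "politics"},
    {"title": "Cheap flights to Lisbon", "category": "travel"},
]

profile = interest_profile(user_history)
for post in rank_feed(candidates, profile):
    print(post["title"])
# The politics post ranks first because it earned the most engagement,
# so the user is shown still more of it: the feedback loop at issue.
```

Even in this toy version, whatever the user already lingers on is what gets pushed to the top of the feed, which is the amplification dynamic described above.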

Given the biases of the human psyche, algorithmic amplification prompts many users to migrate to sites and groups which reinforce their natural tendency towards irrational thinking, pessimism and social self-segregation — in other words, to enter dark, paranoid echo chambers of alternate reality.

Zuckerberg started Facebook with the best of intentions. But his single-minded, relentless pursuit of user growth, and of the advertising revenues that flowed from it, led him to turn a blind eye to the ways in which Facebook was enabling violent extremism, warping the psyche of its users, endangering national security and wrecking the essential institutions of our democracy.

Holding the controlling interest in a monopolistic company virtually unchecked by government regulation, Zuckerberg repeatedly ignored internal and external warnings about the collateral damage Facebook was causing. He reacted only when media exposés, congressional hearings and public outrage forced him to, and then only with half-hearted measures to excise the most offensive posts through a kind of internal censorship known as “content moderation.”


The spread of neo-Nazism, along with other white supremacist, conspiracy-theorist and anti-government extremist groups spewing disinformation and incendiary calls to hate and violence, is only one of the epidemics Facebook has facilitated. Others include:

• The damage which social media bullying has inflicted on the mental health of teenagers;

• The viral spread of ethnic hate speech in Myanmar, which escalated into a genocidal campaign of murder and forced migration against the Rohingya Muslim minority;

• Russia’s interference in the 2016 presidential campaign;

• The propagation by political operatives of blatantly false information vilifying elected officials and candidates for political office;

• The spread of misinformation over Facebook regarding public health recommendations to stem the spread of COVID-19; and


• Donald Trump’s use of Facebook to promote anti-immigrant hatred, racism, political extremism, mistrust of election integrity, and finally, on Jan. 6, 2021, political insurrection.

The time has arrived, and is indeed long overdue, for thoughtful governmental regulation of Facebook and other digital social media platforms, which have shown they can’t be trusted to regulate themselves.

Public health and safety, national security, law and order, and the very survival of democracy may depend upon it.

Elliott Epstein is a trial lawyer with Andrucki & King in Lewiston. His Rearview Mirror column, which has appeared in the Sun Journal for 15 years, analyzes current events in an historical context. He is also the author of “Lucifer’s Child,” a book about the notorious 1984 child murder of Angela Palmer. He may be contacted at epsteinel@yahoo.com
