Treat social media like the climate crisis, researchers say
Mark Zuckerberg must hand over the keys, according to a team of researchers. In a new paper published in the peer-reviewed journal PNAS (Proceedings of the National Academy of Sciences of the United States of America), a diverse group argues that the social media problem is reaching the level of urgency that climate change presents. At this point, it seems silly not to muster all the available resources to check a system that has incited genocidal violence, organized an insurrection, increased resistance to vaccines during a pandemic, and endangered asylum seekers, to name a few.
Seventeen academics, including disinformation researchers, technology ethicists, climatologists, biologists, psychologists, and anthropologists, write that we should treat the social media catastrophe as a ‘crisis discipline’, like climate change and public health. A crisis discipline requires urgent interdisciplinary collaboration to understand and solve the problem, spanning laboratory and field work, global climate modeling, mathematical prediction, and ecological models. They acknowledge the upsides: dispersed social collaboration that works (Wikipedia …) and social media’s potential to promote “the voices of historically disenfranchised groups.” That hardly matches the real consequences, they remind us, where amplified disinformation and paranoia pose a serious threat in a world already facing a climate crisis, the threat of nuclear war, a pandemic, racism, hatred, famine, and inequality.
Tell me something I didn’t get from documentaries, podcasts, books, daily blogs, and political bickering, you say. So, they focus less on specific crimes and catastrophes, asking us instead to consider social media from an evolutionary perspective. They compare the network to “collective behavior,” like locusts devouring everything in their path, rather than to hunter-gatherer groups of a hundred people. The leaderless structures that make this drone-like behavior possible are “complex systems,” such as the global economy. The potential for disaster increases with the scale of the system: “When disrupted, complex systems tend to exhibit finite resilience, followed by catastrophic, sudden and often irreversible changes in functionality,” they write.
They argue that this shit just doesn’t work with a for-profit model that algorithmically prioritizes delivering emotionally charged content and herds people into echo chambers where noise equals attention. Because companies have little incentive to share exactly what they are doing, or to change their models, they write: “This raises the possibility that some business models are fundamentally incompatible with a healthy society.” In other words, unplug Facebook. “Decisions that impact the structure of society should not be guided by the voices of individual stakeholders, but rather by values such as non-maleficence, benevolence, autonomy and justice.”
Taking the next logical step, they suggest elevating the social media architect to a solemn and respected post, one requiring something like a Hippocratic Oath.
Zuckerberg isn’t committing to anything anytime soon. The researchers’ more immediately actionable proposal would combine behavioral science with a macro-level understanding of algorithmic manipulation, as they believe we “lack the scientific framework we would need to answer even the most fundamental questions facing tech companies and their regulators.” It’s not that we lack case studies: the international human rights group Avaaz has developed its own tools to study billions of algorithmically boosted instances of disinformation sharing on Facebook, including the network’s failure to flag Steve Bannon’s quite predictable astroturf operation.
A body of literature would at least leave fewer escape hatches for tech CEOs, who have walked out of hearings with weak excuses, a few promises, and impenetrable nonsense.