
Nobel Peace Prize winner Maria Ressa warns social media is killing our shared reality

November 24, 2021

Rappler co-founder and Nobel Prize laureate Maria Ressa

Speaking at the Australian Strategic Policy Institute’s Sydney Dialogue event last week, Ressa – co-founder of Filipino news site Rappler and the country’s first Nobel laureate – described what she called big tech’s “insidious” manipulation of human biology.

“There’s something fundamentally wrong with our information ecosystem because the platforms that deliver the facts are actually biased against the facts,” Ressa said.

“The world’s largest delivery platform for news is Facebook, and social media in general has become a big behaviour modification system.”

Ressa was responding to questions about whether social media companies ought to create different versions of their platforms to protect ‘weaker’ democracies from the damaging effects of online propaganda campaigns and misinformation.

One of the revelations in the recent Facebook Papers was that the social media company struggled to effectively moderate content to match its growing scale – the more places Facebook reached, it seemed, the less control its Silicon Valley headquarters appeared to have over the way information moved.

“Our biology is very, very vulnerable to this technology,” Ressa said.

“The design of this technology, and the way it can insidiously manipulate people, is powerful in the same way as genetic engineering technology.”

She used the example of gene editing technology CRISPR, saying that governments and regulators quickly put “guard rails” in place around it during its development.

“This is what we failed to do collectively on information technology,” Ressa continued.

“Now it is manipulating our minds insidiously, creating alternate realities, and making it impossible for us to think slow at a time when we need to solve existential problems.”


Junk food for the mind

Ressa’s fellow panelist in the discussion, Dr Zeynep Tufekci, an Associate Professor with the University of North Carolina and long-time critic of the use of data to manipulate information flows, agreed that the information technology landscape as it stands is toxic to individuals and societies.

Dr Tufekci said the ongoing debate around censorship by tech companies and social media often ignores the more fundamental problem with how these products get designed in the first place.

“It’s easier to try and say ‘who should we kick off which platform’ and harder to think about how we need to shift the entire information ecology by design,” she said.

“It’s like food. If you have humans who evolved under conditions of hunger and then you build a cafeteria – the business model of which is to keep you there – that cafeteria is going to serve you chips, ice cream, chips, ice cream one after the other.

“In that case you have taken a very human vulnerability – hunger – and you’ve monetised it using an automated cafeteria.”

Similarly, Dr Tufekci suggested that the human need for information, knowledge, and social connection has been monetised in a way that takes advantage, as Ressa said, of our very biology.

She doesn’t blame the engineers working on these technologies, but rather suggests we have done a poor job of giving companies incentives to build more careful products.

“It’s not because the people working on these technologies are not great or smart or well-meaning,” Dr Tufekci said.

“[Fixing this] has to be something that we ask them to do, rather than not telling them what to do, then getting mad at them.”


One fix at a time

Twitter’s Head of Legal, Policy and Trust, Vijaya Gadde, defended the position of social media companies by saying that solving some of the well-known problems with these platforms isn’t always simple but that it can be done.

“We piloted a bunch of things at Twitter like what we call nudges,” Gadde said.

“These are just quick little pop-ups that appear before you tweet information or before you retweet an article, which might say ‘did you actually read this article?’ or a warning to say ‘this was considered misleading by certain groups, are you sure you want to retweet this?’

“And we’ve had remarkable success in reducing harm on the platform because of those little speed bumps that we’re putting in place.

“So those are the types of things I want to encourage platforms to do and experiment with, but the thing is that if the solutions were easy, we would have found them and implemented them already.”