He Predicted The 2016 Fake News Crisis. Now He's Worried About An Information Apocalypse. By Charlie Warzel

Our platformed and algorithmically optimized world is vulnerable - to propaganda, to misinformation, to dark targeted advertising from foreign governments - so much so that it threatens to undermine a cornerstone of human discourse: the credibility of fact. But it’s what he sees coming next that will really scare the shit out of you.

In mid-2016, Aviv Ovadya realized there was something fundamentally wrong with the internet — so wrong that he abandoned his work and sounded an alarm. A few weeks before the 2016 election, he presented his concerns to technologists in the San Francisco Bay Area and warned of an impending crisis of misinformation in a presentation he titled “Infocalypse.”

The web and the information ecosystem that had developed around it were wildly unhealthy, Ovadya argued. The incentives that governed its biggest platforms were calibrated to reward information that was often misleading or polarizing, or both. Platforms like Facebook, Twitter, and Google prioritized clicks, shares, ads, and money over quality of information, and Ovadya couldn’t shake the feeling that it was all building toward something bad — a kind of critical threshold of addictive and toxic misinformation. The presentation was largely ignored by employees from the Big Tech platforms — including a few from Facebook who would later go on to drive the company’s News Feed integrity effort.

“At the time, it felt like we were in a car careening out of control, and it wasn’t just that everyone was saying, ‘We’ll be fine’ - it was that they didn’t even see the car,” he said. Ovadya saw early what many - including lawmakers, journalists, and Big Tech CEOs - wouldn’t grasp until months later: Our platformed and algorithmically optimized world is vulnerable - to propaganda, to misinformation, to dark targeted advertising from foreign governments - so much so that it threatens to undermine a cornerstone of human discourse: the credibility of fact. But it’s what he sees coming next that will really scare the shit out of you.

“Alarmism can be good - you should be alarmist about this stuff,” Ovadya said one January afternoon before calmly outlining a deeply unsettling projection about the next two decades of fake news, artificial intelligence–assisted misinformation campaigns, and propaganda. “We are so screwed it’s beyond what most of us can imagine,” he said. “We were utterly screwed a year and a half ago and we’re even more screwed now. And depending how far you look into the future it just gets worse.” That future, according to Ovadya, will arrive with a slew of slick, easy-to-use, and eventually seamless technological tools for manipulating perception and falsifying reality, for which terms have already been coined - “reality apathy,” “automated laser phishing,” and “human puppets.”

Which is why Ovadya, an MIT grad with engineering stints at tech companies like Quora, dropped everything in early 2016 to try to prevent what he saw as a Big Tech–enabled information crisis. “One day something just clicked,” he said of his awakening. It became clear to him that, if somebody were to exploit our attention economy and use the platforms that undergird it to distort the truth, there were no real checks and balances to stop it. “I realized if these systems were going to go out of control, there’d be nothing to rein them in and it was going to get bad, and quick,” he said.
Today Ovadya and a cohort of loosely affiliated researchers and academics are anxiously looking ahead toward a future that is alarmingly dystopian. They’re running war game–style disaster scenarios based on technologies that have begun to pop up, and the outcomes are typically disheartening.

For Ovadya - now the chief technologist for the University of Michigan’s Center for Social Media Responsibility and a Knight News innovation fellow at the Tow Center for Digital Journalism at Columbia - the shock and ongoing anxiety over Russian Facebook ads and Twitter bots pales in comparison to the greater threat: Technologies that can be used to enhance and distort what is real are evolving faster than our ability to understand, control, or mitigate them. The stakes are high and the possible consequences more disastrous than foreign meddling in an election - an undermining or upending of core civilizational institutions, an “infocalypse.” And Ovadya says that this one is just as plausible as the last one - and worse.

