From Vice Motherboard:
> This year, the “news aggregator law” came into effect in Russia. It requires websites with over one million daily users that publish links to news stories (Yandex.News has over six million daily users) to be responsible for all the content on their platform, which is an enormous responsibility.
>
> “Our Yandex.News team has been actively working to retain a high quality service for our users following new regulations that impacted our service this past year,” Yandex told Motherboard in a statement, adding that to comply with new regulations, it reduced the number of sources that were aggregated from 7,000 to 1,000 with “official media licenses.”
The predictable result of the Russian government’s new law is that the government can better influence what information is surfaced to Russian citizens: when state news outlets release the same press release en masse, readers of Yandex¹ and other major aggregators are predominantly exposed to what the government wants them to see. So while Russia may interfere with foreign countries’ political processes by exploiting how social network and aggregator algorithms function (along with out-and-out illegal exfiltration and modification of communications data), it is simultaneously trying to immunize itself against equivalent kinds of threats by way of the liabilities it places on the same kinds of companies that do business in Russia.
More broadly, the experience in Russia and the changes in how Yandex operates should raise a flag of caution for advocates in the Western world who are calling for social media companies to be (better) regulated, such as by striking down or modifying Section 230 of the Communications Decency Act (CDA). While there are clear dangers associated with these companies operating as contemporary digital sovereigns, there are also risks associated with imposing harsh liability regimes for publishing other persons’ content.
While such regulations might reduce some foreign interference in political systems, they could simultaneously diminish the frequency with which legitimate alternative sources of information are surfaced to the public. It remains unclear just how we should regulate the spread of malicious political messaging², but, at the same time, it’s critical to ensure that any measures don’t have the detrimental effect of narrowing and diminishing the political conversations in which citizens can participate. It is the very freedom to have such conversations that distinguishes free democratic countries from those that are more autocratic.
- Sidenote: Yandex is the only website I’ve ever had to block from scraping my professional website because it was functionally acting as a DDoS. ↩
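For anyone facing the same problem: Yandex documents its crawler’s user agent as YandexBot, so a robots.txt rule is the usual first step (though robots.txt compliance is voluntary, so a crawler behaving this aggressively may also need to be blocked at the server level):

```
# Ask Yandex's crawler to stay out entirely; other crawlers are unaffected.
User-agent: YandexBot
Disallow: /
```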
- One idea would be to deliberately cut down on how easy it is to spread any and all information. By requiring additional manual effort to share content, only the most motivated would share it. Requiring actual humans to share content with other humans, if done in a robust way, might cut down on the ability of bots to automatically propagate content as though ‘real’ people were sharing it. ↩
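As a rough illustration of the “friction” idea in that footnote, a minimal sketch (hypothetical names and parameters, not any platform’s actual mechanism) is a per-user token bucket on the share action: a human sharing a handful of links a day never notices it, while a bot trying to propagate content at machine speed exhausts its allowance almost immediately.

```python
import time


class ShareThrottle:
    """Per-user throttle on sharing: each share spends a token, and
    tokens refill slowly over time, capped at a small burst size."""

    def __init__(self, burst=3, refill_seconds=600.0, clock=time.monotonic):
        self.burst = burst                    # max shares available at once
        self.refill_seconds = refill_seconds  # seconds to earn one new share
        self.clock = clock                    # injectable for testing
        self._buckets = {}                    # user_id -> (tokens, last_seen)

    def try_share(self, user_id):
        """Return True if this user may share now, spending one token."""
        now = self.clock()
        tokens, last = self._buckets.get(user_id, (float(self.burst), now))
        # Credit tokens earned since the last check, capped at the burst size.
        tokens = min(float(self.burst), tokens + (now - last) / self.refill_seconds)
        if tokens >= 1.0:
            self._buckets[user_id] = (tokens - 1.0, now)
            return True
        self._buckets[user_id] = (tokens, now)
        return False
```

This only raises the cost of *automated* sharing from a single account; a real deployment would have to contend with bot networks spread across many accounts, which is exactly why the footnote hedges with “if done in a robust way.”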