Russian sites remove harmful content twice as fast as foreign ones. While in Western countries a takedown can take days or even weeks, in Russia a single day is usually enough, according to a study by the Association of Bloggers and Agencies (ABA) reviewed by RG. Experts note that a great deal of work has been done, but quantity does not always mean quality.

As the international experience studied by ABA experts shows, many Western countries have in recent years effectively concluded that neither the platforms nor the state can adequately protect users from harmful content, so it is simpler to ban access outright. In Australia, a law passed in 2024 prohibits children under 16 from creating social media accounts; in Germany, children under 16 may use social networks only with their parents' permission; in Belgium, children under 13 may not create accounts without their elders' consent, in Italy the threshold is 14, and in France 15. Similar bills are under consideration in Norway, England and the USA.
Although a broad public debate is also under way in Russia about whether blanket restrictions should be placed on minors' access to social networks, the focus today is on more effective censorship, carried out through cooperation between the state and online platforms.
The main actor in this area is Roskomnadzor (RKN). From January to May 2025 alone, more than 20 million items of illegal content were removed from social networks at RKN's request. These include child pornography, suicide-related content, material designed to draw minors into illegal activity, and dozens of similar categories.
“In my opinion, the work on countering illegal content is now organized quite effectively. This is largely a result of the law on the self-regulation of social networks passed a few years ago, but in general, Russian platforms have learned to respond promptly to Roskomnadzor's requests,” said Anton Gorelkin, first deputy chairman of the State Duma Committee on Information Policy and president of ROCIT.
He noted that the most important task for the Russian Internet industry is to ensure that users have no need to turn to foreign resources. “To fill the platforms with useful and diverse content, competing successfully on the convenience and usefulness of the service,” Gorelkin emphasized.
Quantity does not mean quality, noted Ekaterina Mizulina, head of the League for a Safe Internet. “It is more important to look at how different platforms handle dangerous content and what measures they take. In this respect, Western platforms are many times less effective than ours, and some of them do not respond at all (which is confirmed by the number of fines and court-imposed penalties),” Mizulina told RG.
According to her, the platforms' handling of content differs significantly: each is quicker with some types of content than with others, even though their community guidelines are broadly similar. “For example, Telegram is quickest to block scammers' content (within two to eight hours). But it is slow to react to communities that publish footage of the killing of our military personnel, as well as to the many bots and resources for selling narcotics on the platform, and that situation has not changed in several years,” Mizulina said.
She added that it is important not only to remove such content but also to identify the individuals who create the communities and groups involved in violence, terrorism, drug sales and the distribution of related videos and images, and who also entice teenagers into committing crimes. “In this respect, the situation is still very far from ideal. In many ways, progress here depends on the skills of law enforcement officers and the efficiency of their work,” said the head of the League for a Safe Internet.
The fundamental importance of effective interaction between platforms and regulators was also noted by Elizaveta Belykova, president of the Alliance for the Protection of Children in the Digital Environment. In her view, however, developing technical tools is just as important. “For example, a hash database makes it possible to automatically recognize previously identified malicious content and prevent it from reappearing. In the course of this work, more than 900,000 items of destructive content have been added to the hash database,” Belykova noted.
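The article does not describe how the Alliance's database is implemented, but the general principle can be shown with a minimal sketch: store a fingerprint of every confirmed takedown and check each new upload against the store. The names below are illustrative, and the exact SHA-256 lookup used here only catches byte-identical copies; production systems typically rely on perceptual hashes (such as PDQ or PhotoDNA) that survive re-encoding and cropping.

```python
import hashlib

# Hypothetical in-memory hash database of previously identified
# harmful content. In practice this would be a shared, persistent
# store maintained jointly by the participating platforms.
KNOWN_BAD_HASHES: set[str] = set()

def content_hash(data: bytes) -> str:
    """Return the SHA-256 hex digest of an uploaded file's bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known_harmful(data: bytes) -> bool:
    """Check an upload against the hash database before publishing."""
    return content_hash(data) in KNOWN_BAD_HASHES

def register_harmful(data: bytes) -> None:
    """After moderators confirm a takedown, remember the hash so the
    same file is blocked automatically if it is re-uploaded."""
    KNOWN_BAD_HASHES.add(content_hash(data))

# Usage: a re-upload of previously removed content is caught instantly.
original = b"...confirmed harmful file bytes..."
register_harmful(original)
assert is_known_harmful(original)                 # exact copy is blocked
assert not is_known_harmful(original + b"\x00")   # one changed byte evades an exact hash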
At the same time, certain areas still need improvement. “In particular, it is important to refine the algorithms that use hash databases, increase the transparency of the moderation process for users and partners, and develop cross-platform data exchange so that content distributed across several sites at once can be removed faster. These steps will further improve the system's resilience and reduce the burden of manual moderation,” said the head of the Alliance.
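Belykova does not specify a mechanism for such a cross-platform exchange. One simple form it could take is a feed of newly flagged hashes that partner platforms poll and merge into their local databases; the endpoints, payload shape and function names below are purely illustrative assumptions.

```python
import requests  # standard third-party HTTP client

# Hypothetical feeds of newly flagged hashes published by partner
# platforms; the article names no concrete endpoints or format.
PEER_FEEDS = [
    "https://platform-a.example/hashes/recent",
    "https://platform-b.example/hashes/recent",
]

def sync_from_peers(local_db: set[str]) -> int:
    """Merge hashes flagged by partner platforms into the local
    database, so content taken down on one site is blocked on all
    of them without waiting for a fresh complaint."""
    added = 0
    for url in PEER_FEEDS:
        resp = requests.get(url, timeout=10)
        resp.raise_for_status()
        # Assumed payload shape: {"hashes": ["<hex digest>", ...]}
        for h in resp.json()["hashes"]:
            if h not in local_db:
                local_db.add(h)
                added += 1
    return added
```

In practice such feeds would also need authentication and an audit trail, which is where the transparency requirements Belykova mentions would come in.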