Wikipedia launched in 2001, and its creators never expected it to become so popular. The project was designed on the assumption that volunteer editors could collectively resist threats. At the time, no one anticipated botnets or sophisticated propaganda campaigns, and no tools were built to detect such attacks. The Wikimedia Foundation, which oversees Wikipedia, still does not acknowledge the seriousness of the problem.
We have seen that sophisticated attacks can go unnoticed for months or even years, while the participants rise high in the editor hierarchy. Sometimes even the community's collective power is not enough to undo the consequences of such malicious editing.
In 2019, we uncovered a network of bots engaged in political propaganda in the Russian Wikipedia. Many media outlets covered this high-profile story, including Meduza ("Revenge of the editors"). Despite the lack of technical tools, our persistence allowed us to detect the attack, identify every participant, and complete the investigation.
It took half a year from the start of the attack until the botnet was blocked, far too long a delay. Cleaning the malicious edits out of the articles took even longer.
In the spring of 2021, we began developing Wikify.io, a set of tools for studying Wikipedia, monitoring community activity, and auditing articles. Our goal is not only to detect threats at an early stage, but also to understand how the encyclopedia's community develops. We are building these tools for ourselves and offering them to other researchers as well.
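As an illustration of the kind of early-warning heuristic such monitoring tools can apply, here is a minimal sketch; the function name, thresholds, and input format are our own illustrative assumptions, not part of Wikify.io. It flags accounts whose edit rate within a sliding time window looks like automated activity.

```python
from collections import defaultdict
from datetime import timedelta

def flag_edit_bursts(edits, window_minutes=10, threshold=20):
    """Flag users whose edit count inside any sliding window exceeds a threshold.

    `edits` is a list of (username, datetime) pairs, e.g. parsed from the
    MediaWiki `recentchanges` API feed. Thresholds here are illustrative:
    a human rarely saves 20+ edits within 10 minutes, a bot easily can.
    """
    by_user = defaultdict(list)
    for user, ts in edits:
        by_user[user].append(ts)

    window = timedelta(minutes=window_minutes)
    flagged = set()
    for user, times in by_user.items():
        times.sort()
        left = 0  # left edge of the sliding window over this user's edits
        for right in range(len(times)):
            # shrink the window until it spans at most `window_minutes`
            while times[right] - times[left] > window:
                left += 1
            if right - left + 1 >= threshold:
                flagged.add(user)
                break
    return flagged
```

In a live setting the `edits` list would be fed from the standard MediaWiki API (`action=query&list=recentchanges`), and a flagged account would be queued for human review rather than blocked automatically, since high edit rates alone also match legitimate maintenance bots.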
The long-term plan is to scale these tools to every language edition of Wikipedia. Eventually we will build an analytics team that monitors Wikipedia's health around the clock, identifying threats and nipping them in the bud. We also want to build a team that develops advanced tools for everyday editors.