The page now appearing on Mronline is less a conventional article than a warning: access to the site is being filtered through Anubis, a proof-of-work layer designed to frustrate automated scraping. The message says the system is there because large-scale AI crawlers have become heavy enough to strain services, and that the aim is to make bulk extraction of content costly while leaving ordinary readers largely unaffected.

Anubis draws on the older Hashcash concept, which Adam Back proposed in 1997 as a way to make spam less economical by forcing senders to do computational work before delivery. In the same spirit, Anubis asks a visitor’s browser to complete a challenge that is easy to check but expensive to mass-produce, turning scale itself into the obstacle for bots. Documentation and related discussions describe it as open-source software built for precisely this kind of front-line defence, especially on sites that want to keep material available without giving free rein to scraping tools.
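To make the asymmetry concrete, the sketch below shows a Hashcash-style proof of work in Python. It is illustrative only: the SHA-256 hash, the challenge string, and the leading-zero-bit difficulty parameter are assumptions for the example, not Anubis's actual challenge format, but the underlying trade is the same one the article describes: solving costs many hash attempts, checking costs one.

```python
import hashlib
import itertools


def leading_zero_bits(digest: bytes) -> int:
    """Count leading zero bits in a hash digest."""
    bits = 0
    for byte in digest:
        if byte == 0:
            bits += 8
            continue
        # Add the zero bits at the top of the first non-zero byte, then stop.
        bits += 8 - byte.bit_length()
        break
    return bits


def solve(challenge: str, difficulty: int) -> int:
    """Search nonces until SHA-256(challenge:nonce) has enough leading zero bits.

    Expected cost grows roughly as 2**difficulty hash evaluations, which is
    what makes mass-producing valid answers expensive for a scraper.
    """
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
        if leading_zero_bits(digest) >= difficulty:
            return nonce


def verify(challenge: str, nonce: int, difficulty: int) -> bool:
    """Checking a submitted nonce costs a single hash for the server."""
    digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
    return leading_zero_bits(digest) >= difficulty


if __name__ == "__main__":
    # Hypothetical per-visitor challenge token and difficulty setting.
    challenge, difficulty = "example-session-token", 20
    nonce = solve(challenge, difficulty)                 # expensive for the visitor
    print(nonce, verify(challenge, nonce, difficulty))   # cheap for the site
```

For a single human visitor the one-off cost is a brief burst of computation in the browser; for a crawler fetching millions of pages, the same cost multiplied by scale is the deterrent.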

The notice also underlines a practical trade-off that has become common across the web: stronger defences can mean friction for legitimate users, particularly if their browsers lack modern JavaScript support or rely on privacy tools that interfere with the challenge. That tension helps explain why systems such as Anubis are increasingly being used as a temporary barrier rather than a perfect solution, buying time for site operators while they decide how to respond to aggressive automated access.


Source: Noah Wire Services