Cloudflare, one of the world’s largest internet infrastructure providers, has unveiled a new tool called AI Labyrinth. The tool is designed to counter web crawlers that scrape training data from websites without permission for use in artificial intelligence projects. In a blog post, the company announced that this free, opt-in tool detects inappropriate bot behavior and then steers the offending bots down a trail of AI-generated links. The goal is to slow down, confuse, and waste the resources of bad actors.
Websites have long relied on the robots.txt protocol, a text file that grants or denies access to web crawlers. However, companies active in the AI field, such as Anthropic and Perplexity AI, have been accused of ignoring the protocol. Cloudflare reports that it sees more than 5 billion requests from web bots daily, and although it has tools for identifying and blocking malicious ones, blocking often just prompts attackers to change tactics.
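To make the voluntary nature of robots.txt concrete, here is a minimal sketch using Python's standard-library `urllib.robotparser`. The bot name `ExampleAIBot` and the rules are hypothetical; the key point is that nothing enforces the answer `can_fetch` gives, since a crawler simply chooses whether to ask.

```python
from urllib import robotparser

# A hypothetical robots.txt that denies one AI crawler ("ExampleAIBot")
# while allowing everyone else. Compliance is entirely voluntary.
ROBOTS_TXT = """\
User-agent: ExampleAIBot
Disallow: /

User-agent: *
Allow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A well-behaved crawler checks before fetching; a rogue one never calls this.
print(parser.can_fetch("ExampleAIBot", "https://example.com/articles/"))  # False
print(parser.can_fetch("SomeOtherBot", "https://example.com/articles/"))  # True
```

A crawler that never consults the file sees no obstacle at all, which is exactly the gap AI Labyrinth is meant to exploit.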
Cloudflare has introduced a new way of dealing with these bots. Instead of blocking them, it traps them in a labyrinth of AI-generated content, forcing them to process data that has nothing to do with the real website’s information. According to Cloudflare, the system acts as an advanced honeypot: AI crawlers hunting for links are lured to hidden pages of fake content, while ordinary users never notice those links.

This makes it easier for Cloudflare to identify malicious bots, and it also surfaces new patterns and signatures of bots that would otherwise go unrecognized. According to the published information, the misleading links are designed to be invisible to real website visitors.
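The hidden-link mechanism can be illustrated with a toy sketch. This is not Cloudflare's implementation; the function name and trap URL are invented for illustration. The idea is that the link is hidden from human visitors with CSS and ARIA attributes and marked `nofollow`, so only a crawler that ignores those conventions ever follows it, which is itself a useful signal.

```python
def honeypot_link(trap_url: str) -> str:
    """Return an HTML fragment for a link that real visitors never see.

    Hypothetical sketch of a honeypot link: display:none hides it visually,
    aria-hidden keeps it out of screen readers, and rel="nofollow" tells
    well-behaved bots to skip it. Any client that requests trap_url anyway
    has ignored all three signals.
    """
    return (
        f'<a href="{trap_url}" rel="nofollow" aria-hidden="true" '
        f'style="display:none">archive</a>'
    )

fragment = honeypot_link("/maze/start")
print(fragment)
```

Requests arriving at the trap URL can then be logged and fingerprinted, feeding the bot-detection patterns mentioned above.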
“Our experience has shown that if we first select a variety of topics and then produce separate content for each, the results are both more varied and more convincing,” Cloudflare says in its blog post. “Since preventing the spread of inaccurate information online is particularly important to us, the generated content must be based on scientific facts and real knowledge, simply not specific to the website being crawled.”
Website administrators can enable the AI Labyrinth feature from the bot settings section of their site’s Cloudflare dashboard. The company says this feature is only the first step in using generative AI to counter bots; the goal is to build complex networks of interlinked pages that will be very difficult for bots to detect. AI Labyrinth resembles Nepenthes, a tool designed to mislead crawlers into a sea of worthless AI-generated data.
