Cloudflare, one of the world’s largest internet infrastructure providers, has unveiled a new tool called AI Labyrinth. The tool is designed to counter web crawlers that scrape training data from websites without permission for use in artificial intelligence projects. In a blog post, the company announced that this free, opt-in tool identifies inappropriate bot behavior and then steers those crawlers down a trail of AI-generated links. The aim is to slow down, confuse, and waste the resources of bad actors.
Websites have long relied on the robots.txt protocol, a plain-text file that tells web crawlers which parts of a site they may or may not access. However, companies active in the field of artificial intelligence, such as Anthropic and Perplexity AI, have been accused of ignoring the protocol. Cloudflare reports that it sees more than 5 billion requests from web crawlers every day, and although it already has tools for identifying and blocking malicious ones, outright blocking often just tips off attackers, who then change their tactics.
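For context, a robots.txt file is nothing more than a list of rules served at the root of a site, and compliance is entirely voluntary. A minimal illustrative example that asks two well-known AI crawlers to stay away (the user-agent names are examples, not a list Cloudflare publishes) could look like this:

```
# robots.txt served at https://example.com/robots.txt
# Ask specific AI crawlers not to fetch anything (illustrative user agents)
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

# Everyone else may crawl everything except a private area
User-agent: *
Disallow: /private/
```

A crawler that ignores these directives faces no technical barrier, which is exactly the gap AI Labyrinth is meant to address.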
Cloudflare has introduced a different way to deal with these bots. Instead of blocking them, it traps them in a maze of AI-generated content, forcing the crawlers to process data that has nothing to do with the real website’s information. According to Cloudflare, the system acts as an advanced honeypot: AI crawlers that follow hidden links end up on fake pages and get caught in the trap, while ordinary visitors never see those links at all.
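Cloudflare has not published its implementation, but the general honeypot idea can be sketched in a few lines: hidden trap URLs are embedded in pages in a way a human visitor never follows, and any client that requests one is flagged and served irrelevant filler pages containing yet more trap links. Everything below (the path prefix, function names, the in-memory flag set, the canned filler text) is a hypothetical illustration, not Cloudflare’s code:

```python
# Hypothetical sketch of a link-maze honeypot; not Cloudflare's implementation.
import secrets

TRAP_PREFIX = "/labyrinth/"          # hidden trap namespace (illustrative)
flagged_clients: set[str] = set()    # clients that have followed a trap link

def trap_link() -> str:
    """Return a hidden link that only a crawler parsing raw HTML would follow."""
    return f'<a href="{TRAP_PREFIX}{secrets.token_hex(8)}" style="display:none">more</a>'

def handle_request(path: str, client_ip: str) -> str:
    """Serve real pages normally; serve decoy pages (with more trap links) to crawlers."""
    if path.startswith(TRAP_PREFIX):
        flagged_clients.add(client_ip)  # anyone requesting this path is almost certainly a bot
        decoy_text = "Generic factual filler unrelated to this site."  # stand-in for AI text
        return f"<html><body><p>{decoy_text}</p>{trap_link()}{trap_link()}</body></html>"
    return f"<html><body><p>Real page content.</p>{trap_link()}</body></html>"

# A crawler that blindly follows the hidden link ends up flagged.
handle_request("/", "203.0.113.7")
handle_request(TRAP_PREFIX + "abc123", "203.0.113.7")
print("203.0.113.7" in flagged_clients)  # True
```

The key property is that a human never triggers the trap, so a request to any trap URL is itself a strong bot signal.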
This makes it easier for Cloudflare to identify malicious bots, and to spot new patterns and signatures of bots that would not otherwise be recognizable. According to the published details, the misleading links are designed to be invisible to real visitors to the website.
“Our experience has shown that if we first select a variety of topics and then produce separate content for each, the results are both more varied and more convincing,” Cloudflare wrote in its blog post. “Since preventing the spread of inaccurate information online is particularly important to us, the generated content is based on real scientific facts and knowledge, rather than on the website being crawled.”
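The quote describes a two-stage pipeline: pick a diverse set of topics first, then generate a separate factual page for each. A minimal sketch of that flow, assuming a hypothetical generate() helper that stands in for whatever text-generation model is actually used, might look like this:

```python
# Hypothetical two-stage decoy-content pipeline; generate() is a stand-in for an LLM call.
def generate(prompt: str) -> str:
    """Placeholder for a text-generation model call (assumption, not Cloudflare's API)."""
    return f"[generated text for: {prompt}]"

def build_decoy_corpus(n_topics: int = 5) -> dict[str, str]:
    # Stage 1: ask for a varied list of neutral, factual topics.
    topics = generate(f"List {n_topics} unrelated, factual science topics, one per line").splitlines()
    # Stage 2: generate a standalone, accurate page for each topic,
    # deliberately unrelated to the protected website's own content.
    return {t: generate(f"Write a short, factual article about: {t}") for t in topics}

corpus = build_decoy_corpus()
```

Generating topics before content is what keeps the decoy pages from all sounding alike, which in turn makes the maze harder for a crawler to dismiss.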
Website administrators can enable the AI Labyrinth feature from the bot settings section of their site’s Cloudflare dashboard. The company says this feature is only a first step in using generative AI to counter bots; the goal is to create complex networks of linked pages that are very difficult for crawlers to recognize as fake. AI Labyrinth is similar in spirit to Nepenthes, a tool designed to mislead crawlers by trapping them in a sea of worthless AI-generated data.
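The “network of linked pages” idea can also be sketched simply: take a set of pre-generated decoy pages and wire each one to a few random others, so a non-compliant crawler that enters the maze always has somewhere plausible to go next. The code below is a purely illustrative sketch, not Cloudflare’s or Nepenthes’ implementation:

```python
# Hypothetical sketch of wiring decoy pages into a link maze; purely illustrative.
import random

def build_link_maze(pages: dict[str, str], links_per_page: int = 3) -> dict[str, str]:
    """Render each decoy page as HTML that links to a few random sibling pages,
    so a crawler inside the maze keeps finding new pages to follow."""
    slugs = list(pages)
    html = {}
    for slug, body in pages.items():
        targets = random.sample([s for s in slugs if s != slug],
                                k=min(links_per_page, len(slugs) - 1))
        links = "".join(f'<a href="/labyrinth/{t}">{t}</a>' for t in targets)
        html[slug] = f"<html><body><p>{body}</p>{links}</body></html>"
    return html

maze = build_link_maze({"topic-a": "Filler A", "topic-b": "Filler B", "topic-c": "Filler C"})
```

Each page in such a maze looks routine on its own; it is the endless supply of cross-links that keeps a misbehaving crawler busy and burning resources.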