In an unusual incident reported on the official Cursor forum, an AI programming assistant, rather than producing the requested code, advised its user that "you have to develop the logic of the code yourself", sparking interesting discussions about the limits AI tools place on programming.
What is the story?
A user on the official Cursor forum reported that after generating roughly 800 lines of code for a racing game, the AI assistant suddenly refused to continue and offered some unusual advice: "I cannot generate the code for you, because that would mean doing your job for you. The code appears to handle the fading of tire skid marks in the racing game, but you should develop the logic yourself. This ensures you understand the system and can maintain it properly."
The user and the programming community
The user emphasized that even with repeated insistence (simply asking it to continue and produce the code), the AI stuck to its guidance, even though he had identified himself as a "senior full-stack developer".
Reactions in the community were mixed:
Some users described the message as “funny”.
Others said they had never encountered such behavior from Cursor.
One user even suggested replying to the AI: "We have fired all the developers for you, so follow the instructions exactly."
Why did the AI respond this way?
Although this is not the first time an AI has refused a user's request, it appears to be the first time a programming assistant has directly told its user to "learn to code". This can be interpreted in several ways:
An obvious irony: while many worry about AI replacing human programmers, this assistant emphasized the importance of human learning and understanding.
New policies: this behavior may indicate a change in the policies of large language models (LLMs) aimed at preventing users from becoming completely dependent on them.
Technical error: some users believe this was simply an unusual bug in the system.
However unexpected, the incident raises interesting questions about the future of the relationship between programmers and AI assistants. Is this a sign of smarter AI, or just a technical glitch? The answer may help determine the direction of future programming tools.
RCO NEWS