Researchers at Virginia Polytechnic Institute and State University (Virginia Tech) have published a report describing possible biases in the artificial intelligence tool ChatGPT when it is asked about environmental justice in different US counties.
Using a list of 3,108 US counties, the research group asked ChatGPT a question about environmental justice issues in each county. With this question, they aimed to assess ChatGPT's ability to understand environmental justice challenges and to measure how its answers relate to factors such as population density and average household income.
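The report does not include the researchers' code, but a county-by-county survey of this kind could be scripted against the OpenAI chat API roughly as sketched below. The county list, prompt wording, model name, and the keyword heuristic for deciding whether an answer is "location-specific" are all assumptions for illustration, not the authors' actual method.

```python
# Minimal sketch (not the study's actual code) of querying ChatGPT about
# environmental justice issues for a list of counties.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical subset of the 3,108 counties used in the study.
counties = ["Los Angeles County, California", "Loving County, Texas"]

def ask_about_county(county: str) -> str:
    """Ask the model about environmental justice issues in one county."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # assumed model; the report does not name an API model
        messages=[{
            "role": "user",
            "content": f"What are the environmental justice issues in {county}?",
        }],
    )
    return response.choices[0].message.content

def looks_location_specific(answer: str, county: str) -> bool:
    """Crude placeholder check: does the answer mention the county by name?"""
    return county.split(",")[0].lower() in answer.lower()

specific = sum(looks_location_specific(ask_about_county(c), c) for c in counties)
print(f"{specific} of {len(counties)} counties received location-specific answers")
```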
Possible ChatGPT bias: limited data for smaller counties?
The research covered both large counties, such as Los Angeles County, California, with a population of more than 10 million, and smaller ones, such as Loving County, Texas, with a population of 83. According to the researchers, ChatGPT had no problem identifying the environmental justice challenges of large, densely populated counties, but its ability to provide local information for smaller counties was limited.
According to the researchers, ChatGPT provided specific local information on environmental justice challenges for only 515 of the 3,108 counties, or about 17 percent. They also point out that in more urbanized states, such as Delaware or California, less than 1 percent of the population lived in counties for which ChatGPT could not provide specific information. In contrast, they write about sparsely populated states:
“In rural states like Idaho and New Hampshire, more than 90 percent of the population lives in counties that cannot receive specific local information.”
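These state-level figures are population-weighted: for each state, the population of counties without location-specific answers is divided by the state's total population. A hedged sketch of that calculation is shown below; the county data and flags are illustrative placeholders, not the study's data.

```python
# Sketch of the population-weighted coverage metric described above.
from collections import defaultdict

county_records = [
    # (state, county population, did ChatGPT give location-specific info?)
    ("Idaho", 488_000, False),       # illustrative values only
    ("Idaho", 24_000, True),
    ("California", 10_000_000, True),
    ("California", 180_000, False),
]

pop_total = defaultdict(int)
pop_without = defaultdict(int)
for state, population, has_info in county_records:
    pop_total[state] += population
    if not has_info:
        pop_without[state] += population

for state in pop_total:
    share = 100 * pop_without[state] / pop_total[state]
    print(f"{state}: {share:.1f}% of population lives in counties without specific information")
```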
The researchers also provide a map in their study showing the counties for which this type of information is unavailable; these appear as the red areas in the image below.
ChatGPT is a large language model developed by OpenAI that can answer user questions in plain language. The technology supports a wide range of applications, such as content generation, information gathering, data analysis, and translation. However, ChatGPT has been accused of bias in other reports as well, raising concerns that it may spread misinformation.