Meta's artificial intelligence research team is a step closer to decoding human thought. The company, in collaboration with the Basque Center on Cognition, Brain and Language, has developed an AI model capable of reconstructing sentences from brain activity with up to 80% accuracy. The research relies on a non-invasive way of recording brain activity, and the company says it could pave the way for technology that helps people who have lost the ability to speak.
How it works
Unlike current brain-computer interfaces, which often require invasive implants, Meta's approach relies on magnetoencephalography (MEG) and electroencephalography (EEG). These techniques measure brain activity without surgery. The AI model was trained on recordings of brain activity from 35 volunteers while they typed sentences. When tested on new sentences, Meta claims it can accurately predict up to 80% of the characters typed from MEG data, which is at least twice as effective as EEG-based decoding.
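For readers curious what character-level decoding from brain recordings looks like in practice, the following is a minimal, hypothetical sketch, not Meta's actual model (which uses deep neural networks). It trains a simple classifier on simulated MEG-like sensor windows labeled with the character being typed and reports the fraction of characters predicted correctly; all sizes, names and data here are illustrative assumptions.

```python
# Hypothetical sketch of character-level decoding from MEG-like data.
# This is NOT Meta's model; it only illustrates the general idea of
# mapping short windows of multi-sensor brain recordings to typed characters.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_trials = 1000          # one trial = one typed character (assumption)
n_sensors = 64           # number of sensors, kept small for speed (assumption)
n_timepoints = 50        # samples per window (assumption)
alphabet = list("abcdefghijklmnopqrstuvwxyz ")

# Simulated data standing in for real MEG recordings: each trial is a
# (sensors x time) window, labeled with the character typed during it.
y = rng.integers(0, len(alphabet), size=n_trials)
X = rng.normal(size=(n_trials, n_sensors, n_timepoints))
# Inject a weak character-dependent signal so decoding is possible at all.
for i, label in enumerate(y):
    X[i, label, :] += 1.5

# Flatten each window into a feature vector and fit a linear classifier.
X_flat = X.reshape(n_trials, -1)
X_train, X_test, y_train, y_test = train_test_split(
    X_flat, y, test_size=0.25, random_state=0
)
clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)

# Character-level accuracy: the kind of metric behind headline figures
# such as "up to 80% of characters predicted correctly".
accuracy = clf.score(X_test, y_test)
print(f"Character decoding accuracy: {accuracy:.1%}")
```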
The method still has limitations. MEG requires a magnetically shielded room, and participants must remain still for accurate readings. The technology has also only been tested on healthy people, so its effectiveness for people with brain injuries is unclear.
Artificial intelligence is also mapping how words are formed in our minds
Beyond decoding text, Meta's AI helps researchers understand how the brain converts ideas into language. The model analyzes MEG recordings and tracks brain activity at millisecond resolution, showing how the brain turns abstract thoughts into words, syllables, and even the finger movements used to type them.
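One common way to exploit millisecond-resolution recordings is time-resolved decoding: training a separate classifier at each time point to see when a given representation becomes readable from the signal. The sketch below illustrates that general technique on simulated data; it is an assumption-laden toy example, not Meta's actual analysis, and every size and variable name is hypothetical.

```python
# Hypothetical sketch of time-resolved decoding: train and test a classifier
# at each time point to see WHEN information (e.g. the intended character)
# emerges in the recording. Not Meta's actual analysis pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

n_trials, n_sensors, n_timepoints = 300, 64, 60   # illustrative sizes
y = rng.integers(0, 2, size=n_trials)             # two classes, e.g. two characters
X = rng.normal(size=(n_trials, n_sensors, n_timepoints))
# Make the class readable only in a later time window, mimicking a
# representation that emerges some milliseconds after the intention to type.
X[:, 0, 30:45] += y[:, None] * 2.0

# Decode separately at each time point.
scores = []
for t in range(n_timepoints):
    clf = LogisticRegression(max_iter=1000)
    acc = cross_val_score(clf, X[:, :, t], y, cv=5).mean()
    scores.append(acc)

peak = int(np.argmax(scores))
print(f"Decoding peaks at time point {peak} with accuracy {scores[peak]:.1%}")
```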
One key finding is that the brain uses a "dynamic neural code", a mechanism that chains together different stages of language formation while keeping past information available. This could explain how people assemble sentences fluently when talking or typing.
Meta's research reinforces the idea that AI could someday enable non-invasive brain-computer interfaces for people who cannot communicate verbally. For now, though, the technology is not ready for real-world use: decoding accuracy needs to improve, and MEG hardware is impractical outside the laboratory.
Meta is investing in partnerships to advance this research. The company has announced a $2.2 million donation to the Rothschild Foundation Hospital to support ongoing studies, and it is working with European institutions such as NeuroSpin, Inria and CNRS to further develop its AI in this area.
Source: gizmochina