AI systems process data 'startlingly like' human brain, finds UC Berkeley study

The newly documented similarities between brain waves and AI waves are a benchmark of how close researchers are to building mathematical models that resemble humans as closely as possible


A new study has found that artificial intelligence (AI) systems can process data in a way remarkably similar to how the brain decodes speech.

Researchers from the University of California, Berkeley, tracked individuals' brain activity as they listened to a single syllable, "bah". They then matched the signals produced by an AI system that had been trained to understand English against that brain activity.


A side-by-side graph of the two signals revealed a startling likeness. The researchers noted that the data was raw and unaltered.

"Understanding how different architectures are similar or different from humans is important," said Gasper Begus, assistant professor of linguistics at UC Berkeley and lead author of the study published recently in the journal Scientific Reports.

That is because, he said, understanding how those signals compare to the brain activity of human beings is an important benchmark in the race to build increasingly powerful systems.

For example, Begus said, having that understanding could help put guardrails on increasingly powerful AI models. It could also improve our understanding of how errors and bias are baked into the learning processes.

To do so, Begus turned to his training in linguistics.

He said that the sound of spoken words enters our ears and gets converted into electrical signals, which then travel through the brainstem and to the outer parts of our brain.

Using electrodes, researchers traced that path in response to 3,000 repetitions of a single sound and found that the brain waves for speech closely followed the actual sounds of language.
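The averaging behind that measurement can be illustrated with a minimal sketch. The waveform, sample rate, and noise model below are illustrative stand-ins, not the study's actual data; the point is only that averaging many repetitions of the same sound cancels random noise and leaves the underlying evoked response:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-in for the brain's response to one "bah" sound:
# a short damped oscillation sampled over 50 ms.
t = np.linspace(0.0, 0.05, 50)
true_response = np.sin(2 * np.pi * 100 * t) * np.exp(-t / 0.02)

# Each of the 3,000 repetitions is the same response buried in noise.
n_trials = 3000
trials = true_response + rng.normal(scale=2.0, size=(n_trials, t.size))

# Averaging across trials suppresses the noise (roughly by 1/sqrt(n))
# and recovers the waveform, which is why repeated presentations of a
# single sound yield a clean brain-wave trace.
evoked = trials.mean(axis=0)

residual = np.abs(evoked - true_response).mean()
print(residual)  # far smaller than the single-trial noise scale
```

Any single trial here is dominated by noise; only the average tracks the actual sound, mirroring why the researchers needed thousands of repetitions.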

The researchers transmitted the same recording of the "bah" sound through an unsupervised neural network, an AI system that could interpret sound. They then measured the coinciding waves and documented them as they occurred.
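One simple way to quantify how closely two such waveforms agree, beyond a side-by-side plot, is a correlation coefficient. The two signals below are synthetic stand-ins for the averaged brain response and the network's activity; this is a sketch of the comparison idea, not the study's actual analysis:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative signals standing in for the averaged brain response and
# the neural network's internal activity for the same "bah" sound.
t = np.linspace(0.0, 0.05, 50)
brain_wave = np.sin(2 * np.pi * 100 * t) * np.exp(-t / 0.02)
ai_wave = 0.8 * brain_wave + rng.normal(scale=0.05, size=t.size)

# Pearson correlation: 1.0 means the raw waveforms rise and fall in
# perfect lockstep, 0.0 means no linear relationship.
r = np.corrcoef(brain_wave, ai_wave)[0, 1]
print(r)
```

A correlation near 1.0 would correspond to the "startling likeness" the researchers describe, while unrelated signals would score near zero.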

Begus said he and his colleagues are collaborating with other researchers using brain imaging techniques to measure how these signals might compare. They're also studying how other languages, like Mandarin, are decoded in the brain differently and what that might indicate about knowledge.

Many models are trained on visual cues, like colours or written text - both of which have thousands of variations at the granular level. Language, however, opens the door for a more solid understanding, Begus said.

The English language, for example, has just a few dozen sounds.

"If you want to understand these models, you have to start with simple things. And speech is way easier to understand," Begus said.

In cognitive science, the researchers said, one of the primary goals is to build mathematical models that resemble humans as closely as possible.

The newly documented similarities between brain waves and AI waves are a benchmark of how close researchers are to meeting that goal, they said.

(PTI)
