Learn by Listening: AI Systems Analyse Speech Signals Similarly to Human Brains
Featured June 2024 | Neuroscience
Summary: Artificial intelligence (AI) systems can process sound signals much as the brain decodes speech, a finding that may help explain how AI systems work. Researchers recorded participants' brain activity using electrodes placed on their heads while they listened to a single syllable, then compared it with the signals of an AI system trained to learn English. The patterns were strikingly similar, which may aid in the creation of more capable systems.
Key details
In a newly published study in the journal Scientific Reports, researchers discovered that the signals generated by an AI system trained to learn English were closely comparable to brain waves recorded as subjects listened to the single syllable "bah."
As the participants listened to the sound, the team monitored their brain activity using a system of electrodes placed on their heads. They then compared the brain activity to the signals generated by an AI system.
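Conceptually, that comparison comes down to lining up two time series, the averaged neural response and the AI system's internal signal, and measuring how closely their shapes agree. The sketch below illustrates one simple way to do that in Python; the file names, the assumption that the signals are already resampled and aligned, and the choice of Pearson correlation and cross-correlation are illustrative assumptions, not the study's actual pipeline.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical inputs: a recorded EEG response and an AI model's internal
# signal, both assumed to be resampled to the same rate and aligned to
# stimulus onset. The file names and shapes are assumptions for illustration.
eeg = np.load("eeg_bah_response.npy")        # shape: (n_samples,)
model = np.load("model_bah_activation.npy")  # shape: (n_samples,)

# Z-score both signals so the comparison reflects waveform shape,
# not raw amplitude.
eeg_z = (eeg - eeg.mean()) / eeg.std()
model_z = (model - model.mean()) / model.std()

# Pearson correlation as a simple similarity measure between the waveforms.
r, p = pearsonr(eeg_z, model_z)
print(f"waveform correlation: r={r:.3f}, p={p:.3g}")

# Cross-correlation to check whether one signal lags the other.
n = len(eeg_z)
lags = np.arange(-n + 1, n)
xcorr = np.correlate(eeg_z, model_z, mode="full") / n
best_lag = lags[np.argmax(xcorr)]
print(f"peak similarity at lag of {best_lag} samples")
```

Z-scoring first matters: it makes the measure sensitive to the shape of the two waveforms rather than their amplitudes, which is the sense in which the article describes the signals as similar.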
As AI systems permeate every aspect of daily life in industries ranging from healthcare to education, understanding how and why they give the information they do is becoming increasingly important.
Scientists were surprised to discover that artificial intelligence (AI) systems can process information in a manner strikingly similar to how the brain processes speech, according to a new study from the University of California, Berkeley. The finding might be able to shed some light on the mysterious workings of AI systems.
Researchers from the Berkeley Speech and Computation Lab monitored individuals' brain waves as they listened to the single syllable "bah" while wearing electrodes on their heads. The signals produced by an AI system that had been taught to understand English were then compared with the brain activity.
"The shapes are remarkably similar," said Gasper Begum, lead author of the study just published in the journal Scientific Reports and assistant professor of linguistics at the University of California, Berkeley. "That tells you similar things get encoded, that processing is similar."
The surprising similarities between the two signals can be seen in a side-by-side comparison graph.
Begus stated, "The data has not been altered. This is raw."
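To get a feel for what such a side-by-side figure looks like, here is a minimal plotting sketch; it assumes the same hypothetical signal files as the sketch above and simply draws the two z-scored waveforms next to each other.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical, pre-aligned signals (same assumed files as in the
# earlier sketch).
eeg = np.load("eeg_bah_response.npy")
model = np.load("model_bah_activation.npy")

# Z-score so both traces share a comparable vertical scale.
eeg = (eeg - eeg.mean()) / eeg.std()
model = (model - model.mean()) / model.std()

fig, (ax_left, ax_right) = plt.subplots(1, 2, sharey=True, figsize=(9, 3))
ax_left.plot(eeg, color="tab:blue")
ax_left.set_title('Brain response to "bah" (EEG)')
ax_right.plot(model, color="tab:orange")
ax_right.set_title("AI system signal")
for ax in (ax_left, ax_right):
    ax.set_xlabel("time (samples)")
ax_left.set_ylabel("z-scored amplitude")
fig.tight_layout()
plt.show()
```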