Embry-Riddle Researchers Develop AI System to Decode Aviation Speak

The system uses automatic speech recognition to convert spoken radio transmissions into text.
Aug. 29, 2025

Embry-Riddle Aeronautical University researchers have developed a system that uses artificial intelligence to transcribe and translate aviation radio communications.

While other areas of aviation have undergone automation and technological advancements, radio communication has largely remained unchanged, said Andrew Schneider, an assistant professor in the College of Aviation who directs the university’s Speech and Language AI Lab.

The system uses automatic speech recognition to convert spoken radio transmissions into text. Natural language processing interprets and refines that text by standardizing terminology, formatting spoken numbers and call signs, removing filler words and flagging potential errors.
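As a rough illustration of the kind of post-processing described here — the lab's actual pipeline is not public, so the rules and function below are assumptions — a transcript cleaner might strip filler words, convert spoken digits to numerals, and collapse digit runs into call-sign and altitude formats:

```python
import re

# Hypothetical normalization rules; the lab's actual pipeline is not public.
SPOKEN_DIGITS = {
    "zero": "0", "one": "1", "two": "2", "three": "3", "four": "4",
    "five": "5", "six": "6", "seven": "7", "eight": "8",
    "niner": "9", "nine": "9",
}
FILLERS = {"uh", "um", "ah"}

def normalize_transmission(text: str) -> str:
    """Standardize a raw ASR transcript of a radio transmission."""
    # Drop filler words, then map spoken digits to numerals.
    words = [w for w in text.lower().split() if w not in FILLERS]
    words = [SPOKEN_DIGITS.get(w, w) for w in words]
    out = " ".join(words)
    # Collapse runs of single digits ("1 2 3" -> "123"), as in call signs.
    out = re.sub(r"\b\d(?: \d)+\b",
                 lambda m: m.group(0).replace(" ", ""), out)
    return out

print(normalize_transmission(
    "uh delta one two three descend and maintain one zero thousand"))
# -> "delta 123 descend and maintain 10 thousand"
```

A production system would of course need far richer rules (readbacks, frequencies, mixed alphanumeric call signs), but this shows the general shape of the text-refinement step.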

By enabling large-scale analysis of pilot-controller communications, the system could reveal patterns, phraseology errors and safety concerns that were previously difficult to study, Schneider said.

Schneider said, “We see an opportunity here for another leap forward to help controllers and pilots have safer radio communication.”

In the future, the system could also be used to provide immediate feedback to student pilots and help instructors better target communication issues.

Schneider is collaborating with Dr. Jianhua Liu, associate professor of Electrical and Computer Engineering, to build the AI-driven system.

“With this research, we can improve the efficiency of communications and reduce errors,” said Liu, who has a background in machine learning.

Schneider and Liu’s research has received two grants totaling $30,000 from Embry-Riddle’s Boeing Center for Aviation and Aerospace Safety, including a joint College of Engineering grant.

“We simply couldn’t have launched this work without that support—it enabled us to move from concept to reality,” Schneider said.

Their research, first supported in 2023 with a $15,000 grant, began with the collection of radio communication recordings from 12 high-traffic U.S. airports. The audio was fed into automatic speech recognition tools, such as OpenAI’s Whisper, to create transcriptions.

The off-the-shelf models averaged an 80% word error rate, highlighting the need for aviation-specific tools, said Schneider, a third‑generation pilot and linguist.
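Word error rate, the metric quoted here, is conventionally computed as the word-level edit (Levenshtein) distance between the ASR output and a reference transcript, divided by the number of reference words. A minimal sketch, with a made-up ATC-style example:

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """Levenshtein distance over words, divided by reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edits to turn first i reference words into first j hypothesis words
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + sub)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)

# One substitution ("cleared" -> "clear") and one deletion ("two"):
# 2 edits over 6 reference words, roughly 0.33.
print(word_error_rate("cleared to land runway two seven",
                      "clear to land runway seven"))
```

An 80% rate on this scale means roughly four out of five reference words are misrecognized, dropped or inserted — which is why the off-the-shelf models were unusable on clipped, jargon-heavy radio audio.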

“Aviation English isn’t standard conversational grammar: it’s a condensed, highly specific phraseology spoken over a noisy radio where words get clipped and specialized jargon abounds,” he said.

Liu used his expertise in signal processing, which involves the conversion of analog signals into digital data, to customize an automatic speech recognition tool that reduced the word error rate from 80% to less than 15%.

The system has also been applied in a NASA-funded project that extracted information from flight deck communications recorded with high background noise.

“In aviation, we have done a great job with using numerical data, but until now we haven’t had the tools to use qualitative data at the same scale,” said Dr. Kristy Kiernan, associate director of the Boeing Center for Aviation and Aerospace Safety and the lead investigator on the NASA project.

Kiernan added, “Large language models can open up whole new data sources that we can leverage to improve safety. That’s really exciting.”

At the university’s Second Annual Safety Research Symposium earlier this year, Liu and Schneider shared their initial findings in a presentation titled “Monitoring ATC Communication Issues with ASR and NLP.”

They also delivered a presentation, “Showcasing Speech and Language AI (SaLAI) in Aviation and Aerospace Applications,” at the university’s AI Summit this past fall.

“Their project, developing automatic speech recognition for aviation applications, has many potential use cases,” said Kiernan.

Looking ahead, Schneider and Liu said they are developing real-time applications where the system would interface with aircraft systems to help with tasks like:

  • Detecting inconsistencies between verbal instructions and aircraft behavior
  • Flagging missed calls 
  • Assisting with checklist verification
  • Serving as a smart co-pilot
  • Enhancing situational awareness
  • Preventing communication breakdowns before they escalate

Research assistant Sung Jun “Kevin” Cho, who recently graduated from Embry-Riddle with a Bachelor of Science in Aeronautical Science and has worked on the project, said the potential for AI technology to enhance aviation safety extends to “the real-time use of AI to correct our mistakes, ensuring efficiency and effectiveness of radio communications.”  
