
Have you ever turned down the radio while driving so you could see better?
While seeing and hearing are two very different senses, in a situation like that your brain is working to process audio and visual stimuli at the same time.
“It’s cognitive processing that’s happening. Even if it’s in the background of your awareness, that makes it a little bit more challenging to do what the task at hand is,” said Heath Jones, a research neuroscientist for the Army Aeromedical Research Laboratory. “That overloads you to some degree and it depends on what those signals are. Now, if it’s just the radio playing, that’s fine, but if it’s something where it’s someone telling you the instructions on how to get there, you may not be able to turn that down.”
Too many external stimuli can cause cognitive overload — something that pilots can experience in more dangerous circumstances than just missing an exit on the highway.
The same is true, the Army is finding, for drone operators.
But a technique called multi-sensory cueing, which incorporates auditory and haptic, or “touch,” alerts, can help alleviate some of that overload, Jones said.
“The idea is we can start to offload some of that information that we’re putting on the visual system onto these other systems,” Jones said. “It should not increase the cognitive burden as much as having them try to look at everything while flying the aircraft and make sure they know what’s outside of the aircraft as well.”
As the Army incorporates more unmanned aerial systems, or UASs, into its formations, researchers like Jones are looking at how audio and sensory cues can help drone operators. In the same way that regular pilots have to monitor dozens of radios and screens in the cockpit, drone operators have a similar job of tracking a lot of visual information all at once.
“A lot of the technologies that we use in the cockpit, we can use in the control stations for UAS,” Jones said. “If you think of the human in the cockpit or in the UAS control station or in the air traffic control environment, they deal with a lot of the similar issues.”
The idea of multi-sensory cueing as a way to tackle operator “overload” was acknowledged back in 2015 in a report by the Air Force’s chief scientist, which said that drone operations would require multi-tasking and cause “limited visual attention.” The chief scientist also wrote that multi-sensory cues could “compensate for loss of haptic and auditory information.”
One method that Army researchers are looking at to reduce cognitive burden is something that the Air Force Research Laboratory even had a hand in developing more than a decade ago. What the Air Force described as “3-D audio” or a “sound environment that mimics the way the human body receives aural cues” is what Jones calls “spatializing audio” or separating different sound streams, like multiple radio channels in the cockpit.
The audio can come from a “virtual space where it sounds like it’s coming outside of the head from a location in front of you,” Jones said. For a pilot or operator talking to other aircraft in their formation, “you could tune that radio on your left and put it in a virtual space location to your left,” he added.
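Placing a radio channel at a virtual location, as Jones describes, can be illustrated with a toy example. The sketch below is not USAARL’s or the Air Force’s system; it is a minimal, uncalibrated illustration of the two binaural cues most spatial-audio renderers start from: interaural level difference (the near ear hears the signal louder) and interaural time difference (the far ear hears it slightly later). The head radius and speed of sound are textbook approximations.

```python
import math

def spatialize(mono, azimuth_deg, sample_rate=44100):
    """Render a mono sample list as a (left, right) stereo pair, placing the
    source at a virtual azimuth: 0 = straight ahead, -90 = hard left,
    +90 = hard right.

    Uses two coarse binaural cues:
      * interaural level difference (constant-power panning), and
      * interaural time difference (the far ear hears the signal late).
    Head radius (0.0875 m) and speed of sound (343 m/s) are textbook values.
    """
    pan = max(-1.0, min(1.0, azimuth_deg / 90.0))

    # Constant-power gains: equal at center, favoring the near ear off-axis.
    theta = (pan + 1.0) * math.pi / 4.0
    gain_left, gain_right = math.cos(theta), math.sin(theta)

    # Woodworth-style ITD approximation, ~0.66 ms at most for this head size.
    az = math.radians(abs(pan) * 90.0)
    itd_seconds = (0.0875 / 343.0) * (az + math.sin(az))
    delay = min(int(round(itd_seconds * sample_rate)), len(mono))

    # The far ear's copy starts `delay` samples late.
    delayed = [0.0] * delay + list(mono)[:len(mono) - delay]
    if pan >= 0:  # source to the right: left ear is the far, delayed ear
        return ([gain_left * s for s in delayed],
                [gain_right * s for s in mono])
    else:         # source to the left: right ear is the delayed ear
        return ([gain_left * s for s in mono],
                [gain_right * s for s in delayed])
```

A pilot’s left-side radio channel would simply be rendered with a negative azimuth, so the brain can separate it from other streams by location rather than by concentration alone. Real 3-D audio systems use measured head-related transfer functions rather than these two cues, but the principle is the same.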
Another technique that researchers are exploring is haptic cueing through sensations like vibrations in a shoulder pad or under the seat which could give pilots “information about your aircraft relative to where it’s supposed to be,” Jones said.
This summer, researchers will demo multi-sensory cueing at a UAS summit at Fort Novosel, Alabama, said Bethany Ranes, a sensory scientist for the human performance group at the Army’s aeromedical lab.
“Multi-sensory cueing has been used with our helicopter pilots,” Ranes said. “The reason we’re doing the demo in August is because it’s a ready-to-deploy technology that they can pop right in and start to use with the UAS operators in a variety of different kinds of training environments.”
At Fort Novosel, the full-motion Black Hawk simulator has a multisensory cueing system, and others, like the Longbow trainer for Apache helicopter crews, the transportable Black Hawk simulator and the Black Hawk aircrew simulator, have “features of motion cuing” built into aircrew seats, according to the simulation directorate.
According to aeromedical lab officials, their simulator for Gray Eagle and Shadow drones “can easily be fitted” with multisensory cueing technology, but efforts to install it are on hold until there is a more formal plan for UAS crew stations.
“It will be easier to write requirements and implement these technologies into modernization efforts than for re-fitting the ‘enduring’ or ‘legacy’ fleets,” aeromedical lab officials said.
Pilots
The concept of multi-sensory cueing has also been studied for traditional pilots, who typically monitor dozens of buttons and screens and several radios playing over their headsets. With so much information to track, pilots can become overloaded and miss things that are sometimes right in front of them, a psychological phenomenon called inattentional blindness.
A 2022 study from a New Zealand university noted that while there has been a lot of research on inattentional blindness in car drivers distracted by cellphone conversations, “almost no comparable research has been conducted within the aviation domain despite the significance of both ground-based and mid-air collisions.”
But with multi-sensory cueing, Jones said that the alerts are more attuned to natural human instincts and could notify operators that they need to respond by making a decision or course correcting. Feeling a “buzz buzz buzz” on your right shoulder or hearing a “beep beep beep” in your right ear would tell the pilot that there’s something on their right side that needs attention.
“If you think about the auditory system and its development, you needed to know whether the tiger was behind you on the right or left so you know which way to go,” Jones said. “You don’t really need to think about these cues as much. You just need to know what they mean and what to do after you feel them.”
One device developed by Army aeromedical laboratory researcher Angus Rupert, the Tactile Situation Awareness System, or TSAS, uses a vibrating vest to alert pilots to aircraft orientation irregularities. For example, if a plane is tilting slightly left, the vest buzzes on the pilot’s left side as a prompt to correct it.
“If the phone is in your left pocket and it buzzes, you know it’s on the left side. Now, if you imagine taking eight phones and putting them on a belt around you, you could then buzz someone in the front, back, side or the cardinal directions to give them some information,” Jones said.
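Jones’s “eight phones on a belt” analogy amounts to quantizing a direction into one of eight evenly spaced tactors. A minimal sketch of that mapping, with illustrative tactor names and spacing (not the actual TSAS layout):

```python
# Hypothetical 8-tactor belt, following the "eight phones on a belt"
# analogy. Tactor names and spacing are illustrative, not the TSAS design.
TACTORS = ["front", "front-right", "right", "back-right",
           "back", "back-left", "left", "front-left"]

def tactor_for_bearing(bearing_deg):
    """Pick which of 8 evenly spaced tactors should buzz for a cue at the
    given bearing, in degrees clockwise from straight ahead.

    Each tactor owns a 45-degree arc centered on its direction, so the
    arc boundaries sit at 22.5, 67.5, 112.5, ... degrees."""
    index = int(((bearing_deg % 360) + 22.5) // 45) % 8
    return TACTORS[index]
```

A threat or waypoint at a bearing of 90 degrees would trigger the “right” tactor, so the operator feels which way to look without scanning a display.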
Despite the military’s own research and development of audio and haptic cueing technology, it is not yet widespread across the force — a common problem the military faces when developing and buying emerging technology.
“I think the thing that’s going to move it faster are unfortunate accidents that occur that could have been prevented with technology such as this. The most recent is the accident above the Potomac,” Jones said. “A lot of these technologies can provide information to the pilot that does not take their eyes off of their instruments [and] does not put another display in front of them for them to look at or monitor.”
The Army Combat Readiness Center conducted a survey with five trained investigators to look at the use of TSAS for preventing accidents, according to a summer 2015 article in Flightfax. Between 1992 and 2011, there were 330 Class A rotary-wing mishaps that resulted in fatalities or permanent disability and caused at least $2.5 million in damage. Investigators determined that almost a quarter of the accidents (63 fatalities and more than $700 million in costs) could have been prevented with the TSAS vest.
An Army investigation of an August 2017 crash that killed five soldiers during nighttime training off Hawaii found that the pilots had experienced “spatial disorientation,” meaning they couldn’t determine their position and altitude relative to the earth’s surface. Pilots experienced the same spatial disorientation during a 2015 Black Hawk helicopter crash on a night training flight off the coast of Florida that killed four Louisiana National Guardsmen and seven special operations Marines.
“There’s nothing you can see outside. It’s just pitch black. There’s nothing that can give you any information,” Jones said. “The forces they’re feeling from their position relative to the gravitational forces and how the forces of the movement of the aircraft work, they didn’t even realize that they were in as bad a shape as they were.”
In April 2024, after 12 crashes and 10 deaths over a six-month span, the Army ordered additional safety training across aviation units to “reinforce” pilot skills, including spatial awareness.
Making the job less tedious
Historically, drone operator jobs have been very passive or, as Ranes put it bluntly, boring.
“A lot of the time they’re just sitting there. The fancy schmancy way that we refer to that in our work is there’s a mismatch in cognitive load — where we have non-optimized transitions from low workload to high workload, which just means you’re bored out of your gourd for a really long time, but then suddenly you have to act very quickly and decisively.”
By watching the evolution of drones on the battlefield in Ukraine, the Army has realized that “a passive role for a UAS operator is a complete under-utilization of what we can be doing with unmanned technology,” according to Ranes.
While trying to make the jobs less monotonous with different types of cueing to keep pilots alert to changes around them, researchers are still figuring out what types of information need to be cued.
“We’re trying to understand the cognitive offloading in terms of what would be more beneficial to offload, what would be detrimental to take away from the pilot,” Jones said. “Those types of questions still need to be answered to some degree.”
Jones said he could envision a reality where drone operations are less visual and more interactive.
“I can imagine someone just flying and they’re able to do a little bit more than what they’re able to do now because a lot of this information is being provided to them. That allows them to keep their eyes out of the displays whenever they are able to see outside the aircraft, and when they aren’t able to see outside the aircraft, they don’t get so focused on the one thing that’s going to keep the aircraft level that they miss everything else.”