TRAINING AND SIMULATION
IT2EC NEWS - Air Force Wants to Sift Through Simulation Data to Boost Training
By Sean Carberry
ROTTERDAM, Netherlands — Like the other services, the U.S. Air Force is using more and more virtual and simulated training technology, which is generating volumes of data. The Air Force Research Laboratory’s Human Effectiveness Directorate is trying to determine whether the right data are being collected and how to use them for maximum training effect.
There are gaps on the front end of data gathering and processing — the service needs more granular data and better tracking of each service member’s training — and on the back end, it needs better artificial intelligence tools to make sense of the data, according to Summer Rebensky, a research scientist with Aptima, a contractor working for AFRL.
“We're working not only to look at live, virtual and constructive training and be able to map technologies and training experiences that best suit the training requirements, but also be able to develop training in an agile way that adapts not only to the training environments, but also to the individuals’ proficiencies and needs,” she said at IT2EC, one of Europe's biggest training and simulations conferences.
“Some of the simulators and training systems that we have now collect data at a very high level instead of the granularity that we really need to be able to fully leverage the AI and machine learning that's being developed today,” she said.
The tools to capture the level of granularity the service needs largely exist, so it is less a matter of technology than of communicating those needs to simulator developers, she said.
“Whoever is funding the tools to be developed isn't necessarily the end user,” she said. “We try to play an active role in connecting those two entities to ensure that we're involving them day one of the development of those tools.”
That also requires looking internally to determine what data are needed. That involves interviewing subject matter experts, “seeing what data can be pulled off of these simulators, what kinds of behaviors can be observed, tracked and measured,” she said.
“We're able to integrate all those measures together into measures of performance, instantiate those within simulation, refine those measures,” which then creates a feedback loop, she added.
One problem with current training is that it is often pass-fail, she said. “We see performance is just that kind of a uni-dimensional measure. But when we talk about really capturing performance, we would argue that there's many more aspects to performance — time, accuracy, resource use, the strategies and the decision making,” and those data are all collected in different ways, some from the simulators, some from expert observation.
“Let's say we have two pilots practicing landings for the first time,” she said. “We look at simulator data and the network traffic we're getting off of the simulator. You may see that they both are able to land the aircraft pretty well and within the same performance metrics that we laid out. However, if one pilot was cool, calm and collected while he was landing the aircraft, and the other guy was sweating like the guy from ‘Airplane,’ we wouldn't necessarily say they're both at the same level of readiness.”
Hence, adding in more physiological and biometric measurements and data will lead to a better picture of training effectiveness and readiness, she said.
“In terms of physiological measures, a lot of what we're looking at are stress or workload related measures,” particularly related to the cognitive burden of decision making when working with complex systems and technologies, she said.
“So, they're overloaded with information and data themselves, so how do we ensure that they can handle different levels of workload of the amount of information that's getting pushed to them?” she said. To do that, AFRL is measuring heart rate information along with using functional near-infrared spectroscopy and electroencephalograms to look “at how much oxygen is being sent to your prefrontal cortex — so, how much blood is your brain sending to help you think harder,” she continued.
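For illustration only — this is not AFRL's actual pipeline — the workload-related side of those measurements can be sketched as a toy calculation that flags intervals where a trainee's heart rate climbs well above a resting baseline. The window size and threshold here are assumptions chosen for the example:

```python
# Toy sketch (not AFRL's pipeline): flag high-workload intervals by
# comparing each sliding window's mean heart rate against a resting
# baseline. Window size and threshold are illustrative assumptions.

def high_workload_windows(heart_rates, baseline_bpm, window=5, threshold=1.25):
    """Return start indices of windows whose mean heart rate exceeds
    threshold * baseline_bpm."""
    flagged = []
    for start in range(0, len(heart_rates) - window + 1):
        window_mean = sum(heart_rates[start:start + window]) / window
        if window_mean > threshold * baseline_bpm:
            flagged.append(start)
    return flagged

# Example: heart rate spikes mid-task, then settles back down.
samples = [62, 64, 63, 88, 92, 95, 90, 66, 64]
print(high_workload_windows(samples, baseline_bpm=60))  # → [1, 2, 3, 4]
```

In practice, measures such as fNIRS and EEG would feed far richer signals than a single heart-rate stream, but the principle — comparing a live physiological signal against a per-individual baseline to mark likely overload — is the same.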
As military operations grow more complex, warfighters need to be more adaptable and resilient, she said. “Being able to capture that physiological state allows us a piece of information beyond what we can observe, which is, are they really understanding the gravity of a situation?
“For example, in air traffic control, people may be attending to different aircraft but not recognize the gravity of the situation where there's an imminent collision,” she continued. “So, by using those physiological indicators, we can identify points in which there may be decision-making gaps as well.”
“So, it's when we're able to fuse all of these data metrics together that we're able to really paint the picture of readiness,” she said.
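As a hypothetical sketch of what such fusion might look like — the measure names and weights below are assumptions for illustration, not a description of AFRL's system — identical simulator scores can still yield different readiness pictures once a physiological measure is folded in, as in the two-pilot landing example above:

```python
# Illustrative sketch only: fuse simulator-derived performance measures
# and a physiological "calmness" measure (all normalized to [0, 1]) into
# a single weighted readiness score. Names and weights are assumptions.

def readiness_score(measures, weights):
    """Weighted average of normalized measures."""
    total_weight = sum(weights.values())
    return sum(measures[name] * w for name, w in weights.items()) / total_weight

weights = {"accuracy": 0.4, "time": 0.2, "resource_use": 0.2, "calmness": 0.2}

# Two pilots with identical simulator metrics but different stress levels.
pilot_a = {"accuracy": 0.9, "time": 0.8, "resource_use": 0.85, "calmness": 0.9}
pilot_b = {"accuracy": 0.9, "time": 0.8, "resource_use": 0.85, "calmness": 0.3}

print(round(readiness_score(pilot_a, weights), 2))  # → 0.87
print(round(readiness_score(pilot_b, weights), 2))  # → 0.75
```

A weighted average is only the simplest possible fusion; the point of the sketch is that readiness becomes a multi-dimensional composite rather than the pass-fail measure the article describes.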
Gathering the right data and at the right level of detail is the first challenge. Next is being able to move it, she said. “Everyone is developing these really cutting-edge technologies, but we need to be able to get, send and receive the data back and forth,” she continued. That requires “working as a community towards interoperability.”
Related to moving the data is connecting the data points in a service member’s training history to develop a complete picture, she said.
“Tracking the trainee across the pipeline, across their career — a lot of these pieces of data exist within only each of their locations,” she said. “So how do we actually track someone throughout their career so that we can identify pain points a couple years down the line and bring them back to the beginning of the training pipeline to address them?”
And then there are gaps in deriving meaning from all the data, she added.
“So, we can collect a lot of various pieces of information from simulators nowadays, but how do we translate that into meaningful actionable insights about readiness?” she continued. “Raw data is being captured in large masses, but then how do we distill that down to an actual insight for the instructor level? That's where we need a lot of the science and the engineering to marry.”
Which is why modularity is essential “so that we can develop these systems that can adapt with emerging technologies and data pieces that are coming out,” she said. “So, they'll have to be extensible to be able to plug and play with the newest and greatest technologies and models.”
Better and more artificial intelligence will also help sort the data and derive meaning, she said.
The work of the Human Effectiveness Directorate is not just about delivering better training to service members; it is focused on the entire ecosystem and using tools and data to develop and improve the capabilities of instructors as well, she said.
“Using these tools can also ensure that we have all of our instructors at the same level of readiness, support them where they may not be able to identify gaps themselves and alleviate some of the pressure that's all on the instructors to bring up the next generation,” she said.