A team of student researchers from Air Command and Staff
College, Air University at Maxwell Air Force Base, Alabama,
conducted an Adaptive Flight Training Study during January
2018 to aid the Air Force's advancement of training and
education through virtual reality.
Second Lt. Kenneth Soyars, 14th Student Squadron student pilot,
takes off during a virtual reality flight simulation Jan. 10, 2018,
on Columbus Air Force Base, Mississippi. Two subjects flew at a time
but no other subjects were allowed to watch or learn from other
individuals’ sorties. The Adaptive Flight Training Study pushed
subjects to learn through the VR technology. (U.S. Air Force photo
by Airman 1st Class Keith Holcomb)
The study was held
primarily to find out whether the VR environment would help
adults learn at or above their current rates, and how the
brain works and reacts in conjunction with other parts of
the body during the learning process.
Three test groups were tasked to fly a T-6 Texan II
simulator with no prior T-6 flying experience. The groups
consisted of experienced pilots who had not flown the T-6,
pilots with limited flying experience and none in the T-6,
and a final group with no flying experience whatsoever.
“We took the idea of learning through
advanced technologies like VR, and came up with our idea of
a targeted learning system,” said Maj. Matt Elmore, an Air
Command and Staff College student. “We are focusing on how
our troops learn, using technology to measure the person,
the environment and their performance, to see if we can
provide better feedback both adaptively in the curriculum
and to provide variables or indicators to select people for
certain jobs based on the results.”
The three test
groups flew four simulations; the first simulated flight set
the baseline so its data could be compared to the other
three flights. The task was to fly a basic sortie around
Columbus AFB and land safely.
During the baseline
simulation flight, participants were given 10 minutes to
read instructions on the pattern they would be flying and
how to operate the aircraft. For their virtual training
sessions, the subjects were given three learning
environments that provided fewer visual and auditory cues
as they progressed, to help them learn their task.
Following their training sessions, the subjects flew a
final flight in the T-6 Texan II flight simulator to
determine whether the virtual reality training produced any
improvement.
“The data we are gathering
can hopefully help us start to determine the key factors of
what makes individuals succeed or perform better,” Elmore
said. “Now this won’t be an end all be all but it’s good to
be on the leading edge of this and start the conversation.”
Second Lt. Madeline Schmitz, 14th Student Squadron student
pilot, prepares to take flight in the T-6 Texan II flight simulator
Jan. 10, 2018, on Columbus Air Force Base, Mississippi. The Adaptive
Flight Training Study here pushed subjects to learn through the
virtual reality technology but used the T-6 flight simulator as a
baseline against which to compare the VR sorties. This allowed
researchers to see if the subjects' flights got better or worse
after the VR flight training. (U.S. Air Force photo by Airman 1st
Class Keith Holcomb)
Because a large amount of data is being gathered, multiple
groups are attached to this project. Only a handful of
individuals came to Columbus AFB to set up and conduct the
study, each with a specific skill and a portion of the study
to control.
“There’s a lot of
use cases with our technology that the Air Force was trying
to do, like being able to actually measure the activity of
the brain as a student was learning to fly,” said David
Zakariaie, CEO of Senseye.
The Senseye team members
are primarily setting up the gear, running the VR programs,
and collecting data on where, when and how the subjects'
eyes are moving throughout their sorties.
“We’re focusing on pilots now, but everything that we are
doing here today, could be applied to almost any [Air Force
specialty code],” Zakariaie said.
Along with the
tracking of eye movement, another set of data will be
collected: heart and respiratory patterns will be tracked
throughout the study to see if any connection can be made
to patterns of success or failure during the subjects'
flights.
“We were briefed on the study and learned
they wanted to include, in all of this, an element of state
assessment,” said Capt. Wesley Baker, Air Force Research
Laboratory Deputy Program Manager for Cognitive Performance
Optimization. “For the purposes of this study I will be
measuring the heart rate and respiration data of 15
individuals as they fly in the simulations.”
The
data is specifically being collected from the eyes, heart,
and lungs to possibly find each individual's estimated
maximum cognitive load; this is a factor in the success of
the research, as the implications of the possible findings
could be applied to limitless training environments across
the Air Force.
“What we want to prove is that a
virtual reality environment will help our students learn at
a faster rate than the traditional methods, and more
effectively,” Elmore said. “The real question is where can’t
this kind of learning go? We can drive this training and
make it work for us instead of playing catch up and that’s a
big takeaway, if we become early adopters.”
By U.S. Air Force Airman 1st Class Keith Holcomb
Provided
through DVIDS
Copyright 2018