Even Realistic Videogames like Call of Duty Won’t Help Us Win Wars

Opinion: Research shows that millennial cadets' digital skills don't assist them on the virtual battlefield.
New military research shows that playing videogames like Call of Duty doesn't help soldiers on the virtual battlefield. Sledgehammer Games/Activision

Millennials can now storm the beaches of Normandy and fight Nazis with a new level of realism. Sure, past games like Castle Wolfenstein let them role-play America’s Greatest Generation. But the recently released Call of Duty: WWII provides a more human and realistic dimension of war, whereas previous videogames delved into fantasy or featured cyber-mutant soldiers out of a bad '90s movie.

Will all this realism help future soldiers perform more effectively on the battlefield? To answer this question, a team of West Point researchers headed for the wooded hills that encircle the upstate New York campus, where every summer cadets must perform a series of challenges and exercises to demonstrate their leadership capabilities, ability to work in teams, and so forth, all while cut off from the outside world. It is a real attempt to replicate the conditions of war.

For one of the scenarios, cadets were tasked with conducting an urban raid mission on a clutch of abandoned buildings made to look like a town one might encounter in the Mekong Delta or Anbar Province. Half of the cadets were randomly furnished with a packet of high-definition photos of the target buildings; the other half were given virtual-reality goggles that provided 360-degree footage of the area, similar to Google Street View. One tool was analog and primitive; the other was high-tech and offered near-perfect situational awareness of the surroundings. Guess which one cadets preferred?

In virtually all the performance indicators measured, cadets performed better with photos. They favored sketching out their surroundings on a pad of paper, rather than using goggles that gave them the ability to see 360 degrees, both around and between buildings, the very places enemies tend to lurk.

There is a common assumption that millennials are what behavioral psychologists call digital natives. Given the popularity of games like Call of Duty or World of Warcraft, as well as younger people's near-addiction to their personal handheld devices and church-like devotion to social media, one would expect millennials to be savvier and faster learners when it comes to incorporating new digital technologies at the tactical level.

Based on our research, it turns out this assumption may be false.

We see preliminary evidence of cognitive overload in cadets when they are presented with new technologies in a lab-like environment meant to mimic the stresses of urban warfare. That's true even among those weaned on a diet of videogames.

Our group of military researchers surveyed the cadets after the experiment. The survey revealed that cadets found the virtual-reality goggles unhelpful, preferring to crawl up to the buildings to see them with their own eyes.

The military services are always looking to train soldiers as effectively and efficiently as possible. As the digital age rapidly advances, the military has sought to answer the following question: Can synthetic gaming environments enhance capability development?

A number of military organizations, such as the Army’s Research, Development, and Engineering Command, simulate and model new weapons systems. One game that just entered beta-testing, Operation Overmatch, lets up to eight soldiers combine a variety of capabilities—such as a vehicle with specific communications or night vision gear—to test the tools' effectiveness across different terrain types. Ahead of launching a Modernization Command this year, the Army is gearing up to design prototypes to test long-range precision fires, next-generation combat vehicles, network-centric warfare, and air-and-missile defense, among other things.

But many of the gaming prototypes assume a baseline digital nativism among younger players. The takeaways from our experiment challenge many key assumptions about whether gaming environments can enhance capability development. In other words, we should not assume a 17-year-old who grew up playing nothing but videogames—even the most realistic variants—will be especially proficient in using digital technologies in a war zone.

Our research also highlighted that many millennials lacked the spatial navigation skills of previous generations, who did not have the luxury of getting directions from Waze or Google Maps.

This has huge implications for future soldiers expected to be experts at navigating land. This finding also becomes problematic when urban battlefields are host to increasingly dense populations, and it's not possible to see the target; in these cases, soldiers must rely solely on aerial or satellite imagery before a mission begins.

To take one recent example: In 2016, in the old city of Mosul, the US military was supporting Iraqi Security Forces on missions where it was impossible to see the target in person. US-backed forces depended on technologies such as drones and satellites to take pictures from afar. Soldiers had to make critical decisions like which road to take, which building to attack, and where to drop bombs based solely on aerial photos or live-streaming video, not on their ability to see their objective beforehand, as is often practiced in training.

Simulations form the backbone of training for war. And videogames are a relatively efficient way in multiplayer missions to simulate real-world conflict, teach cultural awareness, and test combat skills, as Corey Mead notes in his 2013 book, War Play: Videogames and the Future of Armed Conflict. Games—Counter-Strike: Global Offensive and Full Spectrum Warrior, to name two—have also been used to help treat soldiers coming home with PTSD.

But sometimes senior military officers take for granted that gaming technology can provide a leg up when it comes to training and performance on the battlefield. The Army tends to assume younger generations of soldiers are digital natives who will adopt new technology as easily as the latest social media app.

Of course, virtual-reality goggles are but one small piece of the digital universe that will change how we fight in the future. But technology will not replace training. Military recruits who've grown up soaked in technology may in fact require more training—think using maps and road atlases. The challenge will be how to train soldiers in skills they haven’t yet developed, while integrating technologies like drone footage before using them in combat.

Call of Duty: WWII will let young gamers replicate the storming of Omaha Beach. We should not disregard these games, or other synthetic prototyping models, as useful training tools, especially insofar as they teach our youth about the grim realities of war, not a sanitized or sensationalist version. Yet we should not assume that millennials who've logged hundreds of hours of gaming will be able to seamlessly employ new technologies in the heat of battle.

In combat-like scenarios, even tech-savvy millennials default to analog.

WIRED Opinion publishes pieces written by outside contributors and represents a wide range of viewpoints.