The Use of Simulation in Military Training

Some years ago, I worked in the realm of simulation for training with the Navy. I learned so much about the power of using simulation for experiential learning that it left me inspired to explore and learn more. The content in this blog outlines some of my own professional development on the topic.

Generally, the Canadian military makes wide use of simulators as training platforms. To name a few examples, the Royal Canadian Navy has Naval and Bridge Simulators (NABS) on each coast, the Canadian Army stood up the Land Vehicle Crew Training System (LVCTS) project to purchase a range of vehicle crew trainers, and the Royal Canadian Air Force has various flight simulators, as well as a simulated airport control tower to train Aerospace Controllers/Operators. Simulation is also used in other fields, such as health care. It would seem that the sky is the limit to the use of simulation in training. New and exciting, though, can rarely be the sole reason for change. The value of simulation in training, including its benefits and drawbacks, will always need to be evaluated on a case-by-case basis.

Simulation in training, however, has shown the potential to provide a method of instruction that engages students in a rich and active learning experience. Assuming a thorough analysis of the training requirements is performed upfront and best practices in instructional design are adhered to throughout all phases, simulation can produce skills that transfer to the workplace and deliver a sound return on investment (ROI) for training budgets. As Page and Smith (1998) wrote, “the need for, and demands on, military simulation are continually increasing. Driven largely by fiscal necessity, an increasing pressure to employ simulation is driving exploration into new methods for modeling combat activities” (p. 57).

Simulation has been defined as a technique, “to replace and amplify real experiences with guided ones, often “immersive” in nature, that evoke or replicate substantial aspects of the real world in a fully interactive fashion” (Lateef, 2010, p.1).  Simulation can sometimes be mistaken for a tool or an instructional technology.  While it is true that simulation may make use of a certain tool or technology, it is, in fact, an instructional method.  As stated by Timothy Clapper (2010), “technology is just a tool to be used in conjunction with a good learning plan that enhances and does not replace the need for active engagement activities” (p. e12).  Having the use of a technologically cutting-edge simulator, for example, does not exclude the need to adhere to good instructional design principles.

Simulation is broken down into three main types: live, virtual and constructive.  These three types can be characterized by these simple distinctions: 1) live simulation uses real people and real equipment/systems; 2) virtual simulation uses real people with simulated equipment/systems; and 3) constructive simulation uses simulated people and simulated equipment/systems. 

Live simulation uses real people and real equipment/systems. Technicians, for example, often learn on real engines, taking them apart, repairing them and putting them back together. Military exercises typically consist of real soldiers running through simulated scenarios in the field, often using real vehicles and real weapons.

Virtual simulation has real people using simulated equipment/systems. On a simulated firing range, for example, a real person fires a simulated rifle at simulated targets.

A constructive simulation has simulated people, environments and equipment/systems. An example of a constructive simulation is the Canadian Virtual Naval Fleet (CVNF). This 3D training environment allows sailors to immerse themselves in 3D ships to familiarize themselves with the different ship classes and to train on procedures related to their occupations (National Defence, 2012).

The best type of simulation, whether live, virtual or constructive, will often depend on the resources available, and the best ROI depends, among other things, on how often the simulation will be used. Live simulation typically carries a high cost, as it is heavily dependent on human resources and materials. Virtual simulation sits at a comparatively medium cost: it often requires fewer resources and can frequently be reused. Constructive simulations are often the lowest-cost option, as they usually require the least in human resources and materials and can be run for many serials without significant additional cost. Given all of these factors, a sound cost-benefit analysis is an important step in determining which type of simulation would best serve a particular learning need.
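The trade-off described above can be sketched with a simple break-even calculation. All of the dollar figures below are purely illustrative assumptions, not real program costs; the point is only that the cheapest option changes with how many training serials will be run.

```python
# Hypothetical cost comparison of the three simulation types.
# All dollar figures are illustrative assumptions, not real program costs.

def total_cost(fixed_cost, cost_per_serial, serials):
    """Total cost of delivering a given number of training serials."""
    return fixed_cost + cost_per_serial * serials

# Assumed cost profiles: live has low setup but a high per-serial cost
# (people, fuel, ammunition); virtual and constructive invert that.
options = {
    "live":         {"fixed": 50_000,  "per_serial": 20_000},
    "virtual":      {"fixed": 400_000, "per_serial": 2_000},
    "constructive": {"fixed": 600_000, "per_serial": 500},
}

for serials in (10, 50, 200):
    cheapest = min(options, key=lambda k: total_cost(
        options[k]["fixed"], options[k]["per_serial"], serials))
    print(serials, "serials ->", cheapest)
```

Under these assumed numbers, live simulation wins for a handful of serials, virtual for moderate usage, and constructive for heavy reuse, which is exactly why usage frequency belongs in the cost-benefit analysis.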

Fidelity is a common term in simulation-based learning, used to denote the “degree to which the simulator replicates reality” (Beaubien & Baker, 2004, p. 2). A typology of fidelity, adapted from Rehmann et al. (1995) and described by Beaubien and Baker (2004), focuses on three main aspects: 1) equipment fidelity; 2) environmental fidelity; and 3) psychological fidelity.

Equipment fidelity refers to how closely the simulator resembles the actual equipment one is training to use. Environmental fidelity refers to how accurately the sensory cues, such as motion and visuals, match the environment to be experienced later on the job. Psychological fidelity refers to the trainee’s overall belief that the simulation is similar to the real-life situation. It is this factor that allows trainees to suspend disbelief and feel truly present in the real-world situation while engaged in the simulation. Psychological fidelity is the main indicator of whether the skills learned in a simulation will, indeed, transfer to the real-life, on-the-job situation. It is also the factor instructional designers can influence most, in that the creation of realistic scenarios within the simulation can go a long way toward increasing its psychological fidelity (Beaubien & Baker, 2004).

When deliberating on the best value for money in simulation investments, it is important to remember that “technology that simulates the environmental or equipment characteristics can increase the psychological fidelity of well designed training scenarios, but cannot compensate for poorly designed ones” (Beaubien & Baker, 2004, p. 2). Prioritizing good instructional design principles can be most valuable in the development of simulation-based learning.

Generally speaking, the higher the equipment and environmental fidelity requirement, the higher the cost a developer can expect to pay for a simulator. A meta-analysis of the research in the field has shown that “the benefits of simulation training can be had from low-cost, desk-top simulations, equally or more so than from expensive high-fidelity simulators” (Hahn, 2011, p. 10). Determining the necessary level of fidelity in each of the three areas is worth the time and effort to get right during the early training design and requirements-gathering phases.

Much current research delves into validating the effectiveness of simulation in learning by determining the level of success of the skill transfer. It is possible for learning to take place within a simulator yet not translate into a real-life situation. As Hahn (2011) aptly states, “if learning [in a simulator] does not result in transferable skills, the training is for naught” (p. 1).

The simplest way to judge whether an effective level of skill transfer has occurred is to watch the student perform the trained tasks in a real-life setting following the simulated training experience. More elaborate studies, such as the one that follows, are often undertaken to further evaluate and quantify the actual percentage of skill transfer achieved through the use of simulation in training.

A report presented to the 2011 Interservice/Industry Training, Simulation & Education Conference (I/ITSEC) outlined a study evaluating the skill transfer of Royal Canadian Mounted Police (RCMP) cadets from pistol training. The report, titled “Pistol Skills Transfer from a Synthetic Environment to Real World Setting” (Krätzig et al., 2011), studied RCMP cadets (N=124), comparing those who trained in a simulated small arms trainer to those who trained exclusively on a live-fire range. The cadets participated in 18 fifty-minute training sessions, live fire for the three control groups and the simulated trainer for the one experimental group, and all were assessed on the live-fire range. Each live-fire cadet fired 2,300 rounds in total across all training and assessments. Each cadet in the experimental group, who trained only in the simulator, fired just 200 rounds, all during the assessments.

Although there were some variations across the different assessments, no significant difference was shown in the assessment scores overall. Some cadets who trained in the simulated trainer failed their first final qualification assessment, but all passed after some remedial training. It was felt that the recoil in the live-fire session may have distracted cadets who had not experienced it in the simulated trainer. The authors stated in the discussion that the experiment “provided evidence that the skills needed for the basic LEO [law enforcement officer] pistol shooting, can be acquired in a synthetic environment” (Krätzig et al., 2011, pp. 6-7).

A follow-up to this study was presented at the 2014 I/ITSEC by Mr. Krätzig. The three-year longitudinal study, which followed the same RCMP members, showed that those who had originally been trained mainly in the simulated small arms trainer continued to show a high level of success in their yearly recertification. In fact, they scored higher in the following years than the members who had originally been trained by live fire alone. Mr. Krätzig surmised several potential factors behind this better skill retention: the simulator-trained members had the opportunity to fire the weapon more often, trained in a quieter environment, and, unlike those on the loud live-firing range, were likely able to hear the instructors’ mentoring of their classmates (Krätzig, 2014).

Business case analysis research has often been done to study the ROI of simulation. In the military, savings on ammunition, vehicle usage and the related maintenance costs, and fuel for ships and aircraft can all be realized when virtual or constructive simulations are used. Operators can also be more competent and prepared, as they are often afforded more practice time in a simulator and can experience, in simulation, situations they might not be able to practice in real-life settings (e.g. fighting a fire on a ship).
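The ammunition savings alone can be illustrated with a back-of-the-envelope calculation using the round counts from the Krätzig et al. (2011) study described earlier. The price per round below is an assumed, illustrative figure, not a number from the report, and the sketch deliberately ignores the simulator's own acquisition and maintenance costs.

```python
# Back-of-the-envelope ammunition savings per cadet, using the round
# counts reported in Krätzig et al. (2011). The cost per round is an
# illustrative assumption, not a figure taken from the report.

ROUNDS_LIVE_TRAINED = 2300  # rounds per live-fire cadet (training + assessments)
ROUNDS_SIM_TRAINED = 200    # rounds per simulator-trained cadet (assessments only)
COST_PER_ROUND = 0.40       # assumed cost of one pistol round, in dollars

rounds_saved = ROUNDS_LIVE_TRAINED - ROUNDS_SIM_TRAINED
savings_per_cadet = rounds_saved * COST_PER_ROUND
print(f"{rounds_saved} rounds saved, ${savings_per_cadet:,.2f} per cadet")
```

A full business case would net such savings, multiplied across every cadet and training cycle, against the simulator's capital and support costs, which is exactly the kind of analysis these ROI studies perform.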

The future of simulation in military training is sure to expand.  Most predict that simulation will never completely eliminate live training, in that a soldier will always need to fire real bullets from a real rifle before going to combat and a pilot will always need to fly a real plane before being certified.  Simulation will, however, play an increasingly strong role in complementing and supplementing training.  Some, such as seen in Simonson et al. (2012), predict that “virtual worlds will represent the standard learning environments at some point in our future” (p. 132).

The rising cost of fuel and resources will cause organizations to continue looking for new ways to train, as has been happening within the Canadian Armed Forces. According to the “Royal Canadian Air Force (RCAF) Modeling and Simulation Strategy and Roadmap (Training) 2025” (National Defence, 2014), the RCAF recognized that it must “shift from live to virtual based methods in order to achieve more effective and efficient training” and that “whenever live and virtual training methods offer the same training value, the preference will be to choose the virtual training method” (p. 6). Live training on actual equipment and the use of live simulation have been seen as the most valuable forms of training for many years. The required shift toward greater use of virtual and constructive simulated training represents a massive, yet necessary, cultural change that has been taking place over the past years.

It’s been too long since I attended the Interservice/Industry Training, Simulation and Education Conference (I/ITSEC) and perhaps some of my information here is aging. I will be sure to add a visit to I/ITSEC to my professional development bucket list in order to update my knowledge in this area. If you have any thoughts on this blog, I would be happy to engage in discussion below.


Beaubien, J., & Baker, D. (2004). The use of simulation for training teamwork skills in health care: How low can you go? Quality and Safety in Health Care, 13(Suppl 1), 51-56.

Clapper, T. (2010). Beyond Knowles: What those conducting simulation need to know about adult learning theory. Clinical Simulation in Nursing, 6(1), e7-e14.

Krätzig, G. P., Hyde, M., et al. (2011). Pistol skills transfer from a synthetic environment to real world setting. The Interservice/Industry Training, Simulation & Education Conference (I/ITSEC), 2011(1).

Lateef, F. (2010). Simulation-based learning: Just like the real thing. Journal of Emergencies, Trauma & Shock, 3(4), 348-352.

National Defence. (2012). Canadian Virtual Naval Fleet: Fact Sheet. Courcelette, QC: Navy Learning Support Centre (East).

National Defence. (2014). Royal Canadian Air Force Modeling and Simulation Strategy and Roadmap (Training) 2025. [Executive Summary]. Ottawa, ON: Directorate of Air Simulation and Training.

Page, E., & Smith, R. (1998). Introduction to military training simulation: a guide for discrete event simulationists. 1998 Winter Simulation Conference Proceedings (Cat No98ch36274), (1), 53. doi: 10.1109/WSC.1998.744899. 

Rehmann, A., Mitman, R., & Reynolds, M. (1995). A handbook of flight simulation fidelity requirements for human factors research. Technical Report No. DOT/FAA/CT-TN95/46. Wright-Patterson AFB, OH: Crew Systems Ergonomics Information Analysis Centre.

Simonson, M., Smaldino, S., Albright, M., & Zvacek, S. (2012). Teaching and learning at a distance: Foundations of distance education (5th ed.) Boston, MA: Pearson.

