It seems some are still married to the whole idea of soft-skills training, which was more popular in the '90s. For us here at Visual Purple, it's like looking back at something we did 10+ years ago and realizing how far our simulations have advanced since those early days. Today's training era offers so many more options than traditional soft-skills training once did. It's kind of like sporting an '80s hair-do in 2010; some people just might look at you a little strange. Soft-skills training is no longer as unique or flexible as once thought, nor is it especially immersive or entertaining. Today, faster computers allow high-fidelity 3D environments and other vast improvements over what 'old school' (oh, I mean soft-skills) training once offered. And dare I mention the outdated CBT (Computer-Based Training) regime?!
In this follow-up to Training 4-1-1 Part 1, which addressed measuring training support, I thought it important to post further statistics on learning outcomes from learning games. These stats are encouraging evidence that serious games do make a real difference.
The statistical references below are from an excerpt of the Kauffman Foundation's Kauffman Thoughtbook 2009.
Can We Prove It?
What proof do we have that any or all of this is true, that games can produce superior learning outcomes? Well, the proof is precious little because the field is so new, but at least it is positive. Witness these games:
• Supercharged! [electrostatics]—a 28 percent increase in learning outcomes over lecture (K. Squire et al., 2004).
• Geography Explorer [geology]—a 15 to 40 percent increase in learning outcomes over lecture (P. McClean et al., 2001).
• Virtual Cell [cell biology]—a 30 to 63 percent improvement in learning outcomes over lecture (ibid.).
• Dimenxian [algebra]—an average increase of one test grade (e.g., from B to A) for most kids, up to three grades for underachieving kids (N. Etuk, 2006).
• River City [ecology, scientific inquiry]—a 370 percent increase in test scores over lecture for D students; a 14 percent increase in test scores over lecture for B students (D. Ketelhut, 2007).
• NIU Torcs [numerical methods]—twice as much time spent by game-playing kids on their homework, and much more highly detailed concept maps (B. Coller, 2006).
Although the topic of this blog post can apply to a wide variety of training types, I'd like to focus on measuring 'baked-in' training support within decision-based and virtual world training applications. So here's the BIG question: Are you measuring training support within your current solutions? And the obvious follow-up: if not, why not?
It is always a challenge to measure the transfer of training to the learner, but doing so is critical to allocating resources based on proven past performance. Of course, you could go to Amazon and buy a book about the how-tos of measuring training support, or you could save yourself the hassle and simply make sure that, when choosing a training solution, a specific level of training support is available and proven.
Although there are general principles that go into training measurement, weighing the pros and cons of a training platform and tracking ROI are key to any training solution's success. Corporate training gives organizations a distinct business and strategic advantage by making them smarter than their competition. Yet according to a recent Training Industry Quarterly ezine article, an estimated 40 to 80 percent of training content fails to take root with learners, so training leaders are seeing a massive amount of waste and unrealized potential. So how are you going to guarantee that the next training/learning solution you bring into your organization provides high learner retention rates and a strong Return on Investment (ROI)?
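To make the ROI question concrete, here is a minimal, hypothetical sketch of the classic training ROI calculation (net program benefits divided by program costs). The function name and all dollar figures are made-up illustration values, not data from Visual Purple or any study cited here.

```python
# Hypothetical sketch: a basic training ROI calculation.
# All figures are invented for illustration only.

def training_roi(benefits: float, costs: float) -> float:
    """Return ROI as a percentage: ((benefits - costs) / costs) * 100."""
    if costs <= 0:
        raise ValueError("costs must be positive")
    return (benefits - costs) / costs * 100

# Example: a program costing $50,000 that yields $120,000 in measured
# performance gains (increased sales, reduced errors, and so on).
roi = training_roi(benefits=120_000, costs=50_000)
print(f"ROI: {roi:.0f}%")  # ROI: 140%
```

The hard part, of course, is not the arithmetic but credibly attributing the `benefits` figure to the training itself, which is exactly what the measurement practices discussed below are for.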
From a past article written by Dorn Williams of Manage Smarter:
“Training is a critical component in any organization’s strategy, but organizations don’t always evaluate the business impact of a training program. Given the large expenditures for training in many organizations, it is important to develop business intelligence tools that will help companies improve the measurement of training effectiveness. These tools need to provide a methodology to measure, evaluate, and continuously improve training, as well as the organizational and technical infrastructure (systems) to implement the methodology. Cross-functional reporting and learning analytics provide important connections between the measures of learning effectiveness offered by a learning management system (LMS) and the larger enterprise metrics that indicate whether learning is transferred and positively affects business results.
Business Performance Impact
Unless a training program exists simply for the sake of training, results should be measured and measurements should include business performance data, not just training data. Including selected metrics—such as sales, customer satisfaction, workplace safety, productivity and others—into a reporting strategy can help demonstrate where training has increased revenue or decreased costs. Measurements that consider performance improvements can provide a benchmark for training effectiveness. After implementing a training initiative or changing an existing program, an organization can observe and record a change in performance. To evaluate retention rates, there should be a lag between the training and these behavior measurements.
Many organizations are unable to evaluate their programs beyond the first two Kirkpatrick levels because they lack the tools to collect the data needed for higher-level evaluations. This is partly because LMSs, the most common repository for training data and mechanism for delivering training, make lower-level evaluations easy but don’t provide any tools for higher-level evaluation. Most LMSs will automatically track and report the information required for Level One and Level Two analyses.”
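The lagged pre/post benchmark the excerpt describes can be sketched in a few lines. This is a hypothetical illustration, assuming a 90-day lag and invented sales figures; it is not a method prescribed by the quoted article.

```python
# Hypothetical sketch of a pre/post performance benchmark with a lag
# between training and measurement, to gauge retention rather than a
# short-lived bump. Metric names, dates, and values are illustrative.
from datetime import date, timedelta

def percent_change(baseline: float, follow_up: float) -> float:
    """Performance change relative to the pre-training baseline."""
    return (follow_up - baseline) / baseline * 100

training_date = date(2010, 3, 1)
lag = timedelta(days=90)                 # wait before measuring retention
measurement_date = training_date + lag

baseline_sales = 200_000                 # monthly sales before training
lagged_sales = 230_000                   # monthly sales 90 days after

print(measurement_date)                  # 2010-05-30
print(f"{percent_change(baseline_sales, lagged_sales):.1f}% improvement")
```

Pairing the lagged metric with the earlier ROI question is what turns a Level One "smile sheet" into a business-impact measurement.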
Our services are not cookie-cutter or out-of-the-box; rather, we collaborate with our clients and experts to create cutting-edge training solutions. Since we track EVERYTHING in our sims (especially important in virtual world sims), the training measurements taken from Visual Purple-produced simulations have returned superb results. It's no surprise that the factor consistently getting the most press is Return on Investment (we'll talk about ROI in future blog posts, as it is an often misunderstood and elusive metric). Clear-cut benefits to the bottom line and to the organization's goals are key to implementing any type of training solution.