eLearning – (EFL/ESL)
A safety guide for training nature guides on best practices to support hikers enjoying activities along the Colorado River.
Japanese Ministry of the Environment (Tohoku)
End-to-end design (ADDIE)
Objectives preparation
Survey design
User Interviews
Low-fidelity rapid prototyping
High-fidelity prototyping
Usability testing & Validation
Presentation
Noteworthy
Illustrator
PowerPoint
Articulate 360
Strenuous hikes through rough mountain terrain, and kayaking or rafting along rapids both mild and wild, can be a challenge for beginners. Trail guides who accompany tourists on hikes and other outdoor adventures can be the difference between a safe, enjoyable experience and a disaster. The Japanese Ministry of the Environment's Tohoku office (the client) hired me to design a training prototype for their non-native English-speaking trainees, who were set to offer guided tours along parts of the Michinoku Coastal Trail in the northeastern Tohoku region.
The experiential scenarios let learners play the role of trail guides in a simulated training session. The training prototype covers several practical knowledge areas that capture essential job duties the learners are expected to perform when accompanying English-speaking tourists on hikes in Japan. Although the learning module was not designed to provide practice in the target language (English), an intermediate IELTS level was considered the minimum proficiency expected of trainees prior to selection.
The essential goal is for the trail guides to keep hikers and rafters safe by:
The intervention uses a multi-mode instructional framework: experiential e-learning and micro-learning simulations combined with instructor-led classroom training (ILT) that delivers knowledge via PowerPoint slides, collaborative role-play, and paper-based learning tools.
Correctly identifying common snakes is an essential survival technique on the trail.
With a short deadline, a quick start was essential, so I used the Successive Approximation Model (SAM) to guide my design process. This iterative model let me evaluate and adjust to incoming evaluative research without stalling development. The old 'move fast and break things' approach is not always the right choice, but on this project it was ideal.
Know your river so you don’t shiver! Know your guide so you don’t slide!
To plan instructional objectives, design learning strategies, and properly sequence the content, I needed an expert's support.
As such, I relied on the expertise of an SME with advanced-level hiking experience. Like the users, the SME is employed by the client. Unlike the trainees, the SME works as a tourist liaison/PR associate, a management position tasked with overseeing the regional and global growth of the Michinoku Coastal Trail in Tohoku, Japan. She has also visited the Colorado National Park. With the SME's help, I could refine the scope of the material and uncover the essential tasks and knowledge areas for users to train on. These included:
After interviews with stakeholders, it was agreed that initial training would be designed as several instructor-led training (ILT) sessions in a classroom during office hours. Although I had yet to do any analysis on learner preferences, it was understood that the stakeholders mainly wanted instructor-led training and were uncertain about what e-learning entailed.
Ultimately, after interviews and a questionnaire to gauge things such as learners’:
I began designing the training as a hybrid of modes. The final design combined ILT, including an authentic role-play, with two self-paced, responsive, computer-based training (CBT) prototypes: an experiential e-learning module and a micro-learning (bite-sized) module optimized for smartphones.
Know your river so you don’t shiver!
Guides need to be able to confidently explain geographical facts without hesitation.
The instructor-led training (ILT) portion was designed as a set of "problem-based" scenarios. Learners, as trail guides, had to build on previous knowledge to solve challenging circumstances. To accommodate stakeholders' desire to follow along at their own pace, some paper-based materials were also provided. ILT was carried out in a classroom using:
After the ILT was completed, the experiential portion, which includes authentic trail hiking, began. We hiked a small section of the nearby Michinoku Trail to run a thorough simulation of the problem/solution-based activities. As learners carried out the role-play, I observed and noted which activities appeared most engaging. One interesting observation was the effectiveness of the mnemonics and heuristics I designed and encouraged learners to chant at intervals.
These were not only great rallying cries, but were vital for:
Later, formative evaluations confirmed as much.
Follow your guide so you don’t slide!
Trail guides must act quickly if there's an injury during the hike.
Following Kirkpatrick evaluation principles, I focused on two of the four aspects of the model: Reaction and Learning.
Reaction: Perhaps the most straightforward of the Kirkpatrick levels, Reaction calls for measuring whether the learners were satisfied and found the training relevant to their role, engaging, and useful. To quantify reactions, I used a paper-based, post-training survey and questionnaires that took the form of 'happiness' and 'smile sheets', as well as a classic Likert scale survey. Importantly, the measurement tools offered space for open-ended, written responses. For this Kirkpatrick level, learners were asked to rate their overall experience and satisfaction with the:
Learning: This Kirkpatrick level focuses on whether the learner has managed to fully acquire the target knowledge. Again, questionnaires and surveys were used, along with a control group and a test group, to identify gaps between learners' pre- and post-intervention knowledge. The control group was trained using only the instructor-led (ILT) and experiential role-play modalities, while the test group completed training with both the e-learning tools and the ILT.
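As a rough sketch of how such a pre-/post-comparison can be tabulated, the snippet below computes each group's average learning gain. All scores and group labels here are invented placeholders for illustration, not the project's actual data:

```python
# Compare average learning gain (post-test minus pre-test) between
# a control group and a test group. Scores are hypothetical.

def average_gain(pre_scores, post_scores):
    """Mean per-learner improvement between pre- and post-tests."""
    gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
    return sum(gains) / len(gains)

control_gain = average_gain([55, 60, 58], [70, 72, 69])  # ILT + role-play only
test_gain = average_gain([54, 61, 57], [82, 85, 80])     # ILT + e-learning

print(f"control gain: {control_gain:.1f}, test gain: {test_gain:.1f}")
```

A larger gain in the test group, as in this toy data, is the kind of signal that would suggest the e-learning added value beyond the ILT alone.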
To evaluate learning transfer for the e-learning (test) group, I relied mainly on quiz results. Additionally, during development I attached hand-coded xAPI statements to click-and-drag triggers, mouse movements, and other interactions in the authoring tool (Articulate 360). I also used this method to measure how long learners spent on tasks and quizzes. The main KPIs, tracked using either xAPI or good old pen and paper, were:
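For readers unfamiliar with xAPI, a statement is a small JSON record of actor, verb, and object, plus optional result data such as success, duration, and score. The snippet below is a minimal, hypothetical example of the kind of statement a drag-and-drop trigger might emit; the learner, activity ID, and result values are illustrative, not the ones used in this project:

```python
import json

# A minimal xAPI-style statement for a completed drag-and-drop task.
# The actor, activity ID, and result values are made up for illustration;
# the "completed" verb ID is a standard ADL vocabulary entry.
statement = {
    "actor": {"mbox": "mailto:trainee@example.com", "name": "Trainee A"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.com/activities/snake-id-drag-drop",
        "definition": {"name": {"en-US": "Snake identification drag-and-drop"}},
    },
    "result": {
        "success": True,
        "duration": "PT45S",       # ISO 8601 duration: 45 seconds on task
        "score": {"scaled": 0.9},  # fraction of items placed correctly
    },
}

print(json.dumps(statement, indent=2))
```

Statements like this are sent to a Learning Record Store (LRS), which is what makes time-on-task and success-rate KPIs queryable after the fact.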
To evaluate engagement and learning transfer for the control group, an assistant and I used notes from our observations. We also relied on learners' self-assessment of their performance and learning via the metrics:
To tabulate this "analog" data, I developed a simple formula to establish a 'basic standard of acceptable performance'. Once a performance baseline was established, it was easier to judge the "quality" of the instructional design in terms of reaction and learning. However, longer-term follow-ups would be needed to truly evaluate knowledge retention.
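The formula itself is not documented here, but a baseline check of this general kind can be sketched as a weighted average compared against an acceptance threshold. The metric names, weights, and threshold below are assumptions for illustration only:

```python
# Hypothetical baseline check: weight each normalized metric (0.0-1.0),
# then compare the weighted average against an acceptance threshold.
# Metric names, weights, and the 0.7 threshold are illustrative assumptions.

def meets_baseline(metrics, threshold=0.7):
    """metrics: {name: (score, weight)}; returns (weighted_avg, passed)."""
    total_weight = sum(weight for _, weight in metrics.values())
    weighted_avg = sum(score * weight for score, weight in metrics.values()) / total_weight
    return weighted_avg, weighted_avg >= threshold

observed = {
    "task_success": (0.8, 2.0),     # observer notes on role-play tasks
    "quiz_score": (0.6, 1.0),       # paper quiz, normalized to 0-1
    "self_assessment": (0.7, 1.0),  # learner's own rating
}
avg, passed = meets_baseline(observed)
```

Weighting observed task success more heavily than self-assessment reflects one reasonable design choice: observed behavior is usually a more reliable signal than self-report.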
The reaction sheets showed that test group users were excited and engaged by both the ILT material and, in particular, the role-play opportunity. The e-learning modules were also engaging: users completed all training, and their evaluation scores after both the tablet-based and smartphone-based micro-learning showed successful knowledge transfer. Unsurprisingly, reactions to the training differed between the control and test groups. Those who trained with both the e-learning and the paper-based tools (test group) showed comparably higher levels of knowledge transfer, as demonstrated by their overall success rate on tasks.
In terms of satisfaction and engagement, again, those who trained with both tools (test group) reacted very positively to the training. For example, a vast majority (89%) answered that they were 'very satisfied' with the usability of the tool. All test group users (100%) found the intervention 'very relevant' to expected job duties, and 91% found the tasks (instructional strategies) to be 'very good', while only 8% felt the tasks were 'satisfactory' and 2% felt the tasks were 'difficult'.
Among managers who attended the initial pitch meetings and who, later on, were privy to early, low-fidelity iterations of the learning module, there was some frustration that features they had expected to be operational were not accessible in the prototype. I explained that this was due to time and budget constraints.
There were also complaints from learners about seemingly 'clickable' buttons that were not actually interactive. Despite these 'dead zones', the test group users noted the general polish and interactivity of the e-learning. Among the control group, some noted that there was 'too much writing' on some of the paper-based material. I acknowledged this and explained to stakeholders that I could easily improve the ratio of "showing, not just telling" in the future, especially if the entire intervention were produced as e-learning. Ultimately, thanks to the control group evaluation, I was able to convince management of the merits of e-learning over ILT alone, which featured PowerPoint 'info dumps' and paper-based interventions. Stakeholders and management came to realize that effectively designed and deployed e-learning can often (although not always) replace wordy description with more visual representation and audio. All stakeholders and users were 'mostly satisfied' with the level-appropriateness of the language used in the module.
Based on the generally positive reactions and the successful transfer of essential knowledge via e-learning, the Environment Ministry ended up investing in further e-learning as part of their staff's onboarding, upskill training, leadership training, and international PR development.