This summer and fall, I’ve been substantially redesigning the training program for new volunteers at Reachout, the crisis hotline where I work. At some point I’ll get around to writing about the rationale for the changes, but in brief:
- Trainees seemed to be less and less fluent readers, so the existing, manual-heavy training program was taking longer and longer to get through
- We saw a lot of variation in preparedness among trainees, due largely to the variable quality of training offered by peer trainers
- Trainee retention was way down and getting worse
- Trainee time-to-complete-training was high and getting higher
Something needed to change.
We more-or-less scrapped the old training model, committing one semester to experimenting with the new approach I designed. Going into this, it was very important to me that everyone understood the reasons for the changes and also understood that the new method might fail completely, since that’s a risk with any total redesign. Everyone needed to be on board with two things: that change was necessary, and that our first efforts might not work.
People agreed. And so we jumped in.
Training Weekend got a makeover, focusing almost exclusively on listening skills and the mechanics of taking a supportive-listening call. We set that as the Training Weekend’s outcome measure: we’d call it a success if we thought most trainees could handle a simple listening call by the end of the weekend. They succeeded.
Since then, we’ve been doing weekly classes taught by staff. Lots and lots and lots of practice: my rule for teaching was that, wherever possible, we would give the trainees a chance to practice each new thing within 30 minutes of learning it. We assigned homework for the first time in years, and we’ve spent a lot of time on values clarification in addition to hard skills.
The true test of all this will come later on: in the months and years of seeing how these trainees do as volunteers, in the long effort of hotline service. But we can still look at some measures now.
Early results
We’re now in week seven after Training Weekend, and all of our trainees are taking real phone calls (under supervision) as part of their practicum experience. We started the semester with nine trainees, and all nine are on the phones right now.
I collected data from the last six semesters of Reachout training. I looked at the total number of trainees (discarding those who never finished Training Weekend), and then counted those who were still involved with Reachout at this point in the semester (retention) and those who were already taking calls under supervision (training speed).
To give the previous semesters the benefit of the doubt for things like vacations, I extended their window a bit, to eight or nine weeks, so if anything their counts are slightly inflated in their favor. I expressed the retention and on-phones counts as percentages of that semester’s trainees, since that makes comparison easier.
Semester | Number of trainees | Still here after 8 weeks | On phones by 8 weeks |
---|---|---|---|
Spring 2011 | 7 | 5 (71%) | 1 (14%) |
Fall 2011 | 14 | 12 (86%) | 7 (50%) |
Spring 2012 | 17 | 9 (53%) | 1 (6%) |
Fall 2012 | 16 | 12 (75%) | 7 (44%) |
Spring 2013 | 12 | 7 (58%) | 3 (25%) |
Average of previous semesters | 13.2 | 9.0 (68%) | 3.8 (29%) |
Fall 2013 | 9 | 9 (100%) | 9 (100%) |
There are lots of ways to look at that data, but on the primary measures I cared about (retention and speed) we’re doing really well.
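If you want to check my arithmetic, here’s a minimal sketch in Python (the counts are just the ones from the table above, and the average row’s percentages are taken as the ratio of the averaged counts):

```python
# Recompute the retention and on-phones percentages from the per-semester counts.
semesters = {
    "Spring 2011": {"trainees": 7,  "retained": 5,  "on_phones": 1},
    "Fall 2011":   {"trainees": 14, "retained": 12, "on_phones": 7},
    "Spring 2012": {"trainees": 17, "retained": 9,  "on_phones": 1},
    "Fall 2012":   {"trainees": 16, "retained": 12, "on_phones": 7},
    "Spring 2013": {"trainees": 12, "retained": 7,  "on_phones": 3},
    "Fall 2013":   {"trainees": 9,  "retained": 9,  "on_phones": 9},
}

for name, s in semesters.items():
    retention = s["retained"] / s["trainees"] * 100
    speed = s["on_phones"] / s["trainees"] * 100
    print(f"{name}: {retention:.0f}% still here, {speed:.0f}% on phones")

# Averages across the five pre-redesign semesters.
previous = [s for name, s in semesters.items() if name != "Fall 2013"]
avg_trainees = sum(s["trainees"] for s in previous) / len(previous)
avg_retained = sum(s["retained"] for s in previous) / len(previous)
avg_on_phones = sum(s["on_phones"] for s in previous) / len(previous)
print(f"Previous-semester averages: {avg_trainees:.1f} trainees, "
      f"{avg_retained:.1f} still here ({avg_retained / avg_trainees:.0%}), "
      f"{avg_on_phones:.1f} on phones ({avg_on_phones / avg_trainees:.0%})")
```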
This stuff matters because we spend somewhere between $700 and $1,500 per trainee by this point in the semester once you factor in food, hall rental, staff time, printing costs, and all the rest of it. The semesters where retention was low (Spring 2012 comes to mind) represent major losses for us in economic terms, not just in terms of future shift coverage.
Losing some trainees is inevitable, and it surprises me that we haven’t had anyone drop out this semester; I think it’s unrealistic to expect perfect retention rates. But look at our hotline’s annual budget (somewhere around $140,000 in 2012; we work really cheap when you consider that that includes a building, heat, insurance, three full-time salaries, several part-time people, and a training budget) and then look at our trainee retention losses in 2012: 8 trainees gone by this point in the spring semester and 4 in the fall, 12 in all. If we assume $1,500 in expenses and staff time per trainee, those 12 dropouts cost us $18,000, roughly 13% of our total operating budget for the year, and that doesn’t even count trainees who dropped out after this point in the semester.
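Here’s that back-of-the-envelope estimate in a few lines of Python, using the rough figures above (the $1,500-per-trainee cost and the $140,000 budget are approximations, not exact accounting):

```python
# Rough cost of 2012 trainee attrition, using approximate figures.
COST_PER_TRAINEE = 1_500    # upper-end estimate of spend per trainee by this point
ANNUAL_BUDGET = 140_000     # approximate 2012 operating budget

# Dropouts by this point in 2012: Spring (17 started, 9 stayed), Fall (16 started, 12 stayed).
dropouts_2012 = (17 - 9) + (16 - 12)               # 12 trainees
lost_dollars = dropouts_2012 * COST_PER_TRAINEE    # $18,000
share_of_budget = lost_dollars / ANNUAL_BUDGET     # about 0.129

print(f"{dropouts_2012} dropouts x ${COST_PER_TRAINEE:,} = ${lost_dollars:,}, "
      f"about {share_of_budget:.1%} of the annual budget")
```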
So, I’m excited. I think this class is going to be among our best-trained groups in years, but in any case, we have reason to be pleased with the retention and training speed numbers. It’s great to see that careful instructional design and performance improvement principles really do work in the field.