The Big Question

Measuring the performance of fire prevention programs starts with asking why they exist in the first place

By Jim Crawford

For some time, many of us in the prevention field have been discussing model performance measures for fire prevention programs. It’s no secret that these measures apply to other injury-control strategies and that they’ve been around for a long time. But the timing seems right to update the terms and provide examples of how they may be useful in our field.
The motivation is simple: We have an obligation as public servants to provide evidence of the results we achieve. But there’s more to measuring performance than might appear on the surface. We can evaluate programs individually, but providing overall evidence of prevention programs’ value to government accountants usually involves a different emphasis. Even so, the evaluation metrics for each approach are similar and even overlap.
4 Measures

Measuring prevention program success can be done in four ways. These measurements represent a continuum that starts even before a prevention effort begins and continues during and after the program, where they’re used to modify it or provide evidence of its results (a brief worked sketch follows the list):
1. Formative measures are used to describe the research that went into understanding the scope of the problem: who was involved and how to reach them. Example: Where are our fires occurring, to whom and why?
2. Process measures are used to describe and monitor a program’s progress toward its goals. Example: We planned to reach 1,000 homes in our service area with smoke alarm messages; did we?
3. Impact measures are used to indicate that changes are in fact beginning to take place; they’re a little more concrete than process measures. Example: We reached the 1,000 homes and installed 1,500 smoke alarms, and six months later, a random check demonstrated that 95% of them were still working.
4. Outcome measures indicate how well a program achieved its overall goals; they’re the most concrete and take the most time. Example: We monitored the 1,000 homes over a period of years and documented that the overall fire death rate had dropped because more homes had working smoke alarms and people were escaping fires.
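For departments that track these numbers in a spreadsheet or a short script, here is one way the smoke alarm example above might be tallied. This is only an illustrative sketch: the home, alarm and 95% figures are the hypothetical numbers from the list, while the field names and the baseline and current death rates are invented for the example.

# Illustrative sketch only; figures and field names are hypothetical.
# Formative measures are the up-front research (where fires occur and to whom),
# so they don't appear as a tally here.

program = {
    "homes_targeted": 1000,          # process goal: homes we planned to reach
    "homes_reached": 1000,           # process result: homes actually reached
    "alarms_installed": 1500,        # impact input: alarms installed
    "alarms_working_at_6mo": 0.95,   # impact result: share still working at a spot check
    "baseline_fire_death_rate": 2.1, # outcome baseline (deaths per 100,000, assumed)
    "current_fire_death_rate": 1.6,  # outcome after several years (assumed)
}

# Process measure: did we reach the homes we planned to reach?
process_pct = 100 * program["homes_reached"] / program["homes_targeted"]

# Impact measure: are working alarms actually in place months later?
working_alarms = program["alarms_installed"] * program["alarms_working_at_6mo"]

# Outcome measure: has the fire death rate moved over the long term?
rate_change_pct = 100 * (
    program["baseline_fire_death_rate"] - program["current_fire_death_rate"]
) / program["baseline_fire_death_rate"]

print(f"Process: reached {process_pct:.0f}% of targeted homes")
print(f"Impact: about {working_alarms:.0f} alarms still working at six months")
print(f"Outcome: fire death rate down {rate_change_pct:.0f}% from baseline")

The point of the sketch isn’t the arithmetic; it’s that each measure answers a different question, asked at a different point in the life of the program.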
There are many variations on these concepts, and you won’t learn them all in one column. But it’s important to realize that we must ask ourselves why we’re doing something in order to define the ultimate goals and objectives. We’re installing smoke alarms not to reduce fires, but to reduce the risk of fire deaths.
Asking Why

Fire and life safety educators do a better job of evaluating their programs than most others in the fire service. When others are measuring how fast they can do something, fire prevention personnel are often focused on the why. But we can still do better, and that’s why some of us are beginning to build a national database of examples of these measures and the concepts behind them.
Recently, at the Washington State Fire Prevention Roundtable, Deputy State Fire Marshal Lyall Smith provided an example of how the state Fire Marshal’s Office is monitoring its program for training and inspections of licensed-care facilities. It was evident to me that they had asked why they were doing the inspections in the first place. It was not simply to do an inspection and count it; it was to identify and remove hazards and, ultimately, to reduce the risks associated with those hazards in licensed-care facilities.
How, then, to measure the impact of that program, especially in light of serious budget cuts to the Fire Marshal’s Office? They realized that if they trained licensed-care facility operators to spot and remedy hazards on their own, they could reduce the number of inspections, shorten inspection times and ultimately lower the risks, all with less effort. They documented the impacts of their program of education and code enforcement to demonstrate an improvement in the inspection process.
After the trainings took place, the number of hazards identified in these care facilities dropped by 48 to 50%, and fewer facilities (as many as 36% fewer) needed follow-up re-inspections. That helps the bottom line, and it provides a solid example of using these measures in the code enforcement world.
There’s always more to the story; every local fire prevention professional knows that. But if we don’t start thinking in terms of performance measurement and providing evidence of our results, we’re going to be in even more trouble than we already are in the current economic climate.
Jim Crawford recently retired as deputy chief and fire marshal with the Vancouver (Wash.) Fire Department and is chair of the NFPA technical committee on professional qualifications for fire marshals. He has written “Fire Prevention Organization and Management,” published by Brady, and has also written a chapter on fire prevention in “Managing Fire and Rescue Services,” published by the International City/County Managers Association. Crawford is a past president of the International Fire Marshals Association and has served on the NFPA’s Standards Council. He is a member of the IAFC and the Institution of Fire Engineers U.S. Branch.

Want More Info?

· The National Fire Academy offers a great course on some of the concepts in this article, “Demonstrating Your Fire Prevention Program’s Worth.” I highly recommend it. Visit www.usfa.dhs.gov/nfa/nfa-101309.shtm for more information.
· Those interested in the Washington State Fire Marshal’s licensed-care facility inspection training can contact Karen Jones, acting chief deputy fire marshal, at karen.jones@wsp.wa.gov.