- By Mike Halligan
- June 1st, 2012
The start of fall for most of our programs is the start of a new budget year and a flurry of life-safety construction projects that we hope to complete before students return from summer vacation. It should also be a time when we look back at the past year and evaluate the results of our fire prevention and life-safety efforts. Taking time to look back at our risk-reduction efforts will only help ensure that we are reaching our target groups with effective programs that demonstrate a reduction in losses from fires. Simply stated, we should be able to prove our efforts are working.
Interestingly, when I asked at a recent meeting who was conducting full or partial program evaluations, none of my peers were doing so. After talking with some of them, it became clear there are several reasons why organizations don't evaluate their programs. Three concerns stood out as the leading reasons these organizations did not conduct fire and life-safety program evaluations:
- Fear of working with statistics, even though simple math is usually all that is required to get the job done.
- Fear that an evaluation may actually identify program weaknesses.
- Lack of adequate knowledge to conduct a meaningful evaluation.
With so much at stake — the lives of our students, guests, and staff — it would seem that we could draw upon the talents of others within our community to assemble a team that can build a program evaluation to measure our efforts. There is no need to be intimidated by the idea of an evaluation; failure to conduct evaluations may result in a failed risk-reduction effort and cost us lives, injuries, and disruptions to our educational mission.
Review and Compare
An evaluation is simply a review and comparison of data. Once you make the decision to evaluate your program, you must collect objective data. Look at your incident reports, loss records, and local fire department reports (NFIRS). These sources should show you how a typical incident happens, how often incidents occur in your buildings, in which types of buildings they occur (classroom, physical plant, residence hall), when they occur (day, time, month), what they cost, and which types of incidents are increasing most rapidly.
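To show how little "statistics" this first pass really requires, here is a minimal sketch of summarizing incident records with nothing but Python's standard library. The field names ("building_type", "month", "cost") and the sample records are hypothetical; substitute whatever fields your own incident reports or NFIRS exports actually contain.

```python
from collections import Counter

# Hypothetical incident records pulled from loss reports and NFIRS data.
incidents = [
    {"building_type": "residence hall", "month": "October", "cost": 1200},
    {"building_type": "residence hall", "month": "November", "cost": 450},
    {"building_type": "classroom", "month": "March", "cost": 300},
]

# Tally incidents by building type and by month, and total the losses.
by_building = Counter(i["building_type"] for i in incidents)
by_month = Counter(i["month"] for i in incidents)
total_cost = sum(i["cost"] for i in incidents)

print(by_building.most_common(1))  # building type with the most incidents
print(total_cost)                  # total dollar loss
```

The same tallies answer most of the questions above — where incidents happen, when, and what they cost — without any formal statistical training.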
Once data collection is complete and you can compare it to a baseline, you will have a reference point by which to evaluate your efforts. This may require an initial collection of data spanning several years. From the baseline, a benchmark can be set to define the level of risk change desired and to measure the results of any recent initiatives.
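The baseline comparison itself is also simple arithmetic. A minimal sketch, assuming you have yearly incident counts spanning several years (the numbers here are illustrative only):

```python
# Hypothetical multi-year baseline: incidents per year before the program.
baseline_years = {2008: 42, 2009: 38, 2010: 40}
current_year_count = 31  # most recent year's count, after recent initiatives

# Average the baseline years, then express this year as a percent change.
baseline_avg = sum(baseline_years.values()) / len(baseline_years)
pct_change = (current_year_count - baseline_avg) / baseline_avg * 100

print(f"Baseline average: {baseline_avg:.1f} incidents/year")
print(f"Change vs. baseline: {pct_change:+.1f}%")
```

A negative percentage against the baseline is the kind of objective, defensible result a benchmark is meant to produce.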
If you will be using a survey to establish a benchmark, make sure it is designed to collect objective information. For example, if you want to measure your staff's thoughts on the leading causes of fires in your facilities, ask them, "What do you see as the leading cause of fires in our buildings?" If you instead ask a leading question, the response is more likely to be biased.
Recognize the Long Term
An evaluation is a long-term tool. While yearly data may show short-term impacts, it may take three to five years to see major risk or behavior changes from your efforts. The results of the evaluation, both short- and long-term, should be shared regularly with all the stakeholders in your organization. I suggest holding yearly meetings to share this information and review results. Team members will often have thoughts on how to further improve the evaluation or the program to achieve even better results, and these meetings allow all stakeholders to agree on revisions to the program.
If your project has a hard completion date, it will be necessary to conduct a final evaluation to show before and after data. The report generated should be shared with the target audience of the program as well as the management team for your school, the local fire department, and, when approved by administration, local community political leaders. Sharing the evaluation will show your local fire department that your school takes fire risk reduction seriously and that your administration is committed to looking at and implementing programs to further reduce the risk of fire in your facilities.
Mike Halligan is the associate director of Environmental Health and Safety at the University of Utah and is responsible for Fire Prevention and Special Events Life Safety. He frequently speaks about performance-based code solutions for campus building projects, is recognized as an expert on residence hall fire safety programs, and conducts school fire prevention program audits/strategic planning. He can be reached at 801/585-9327 or at email@example.com