From: Allen Niemi <anniemi**At_Symbol_Here**MTU.EDU>
Subject: Re: [DCHAS-L] laboratory safety--looking for solutions
Date: September 24, 2012 3:31:09 PM EDT
Reply-To: DCHAS-L <DCHAS-L**At_Symbol_Here**MED.CORNELL.EDU>
Message-ID: <C3795F5505D64F4494546EA7400C27D71562B649**At_Symbol_Here**CHIMBX1.ad.uillinois.edu>


I'd like to propose some answers to Peter's questions and ask a couple more.

1. In theory we want to eliminate all unintended losses and injuries, but given the nature of humans and of working on the unknown, a more realistic goal is to get everyone involved in laboratory research, from the school dean to the students, more knowledgeable about the details of safety issues, and everyone from the very top on down convinced that safety should be a priority no less important than the others. When that happens, appropriating enough funding for training, PPE, technical experts, etc., is a given.
2. It's a lot like what someone (RR?) said about pornography: you know it when you see it. It's not the absence of accidents, it's the willingness and knowledge it takes to learn from them and make corrections. I've worked in research environments where (never mind where) everyone from the bench to the manager participated in new project reviews and regularly scheduled work area inspections. The new hires learned from the experienced workers. The group manager learned before she got to be a manager, and likewise all the way up the ladder.

There are "entities" with nationally or globally recognized safety programs. Who are they and what triggered their decision to move in that direction? In the only case I'm familiar with, the decision was reported by old-timers to have resulted from some notorious disasters and hard-to-ignore fatality statistics. That's my fear, that without some major disaster accompanied by a yearly death toll (apparently one fatality is not enough), we will never make the decision. Even worse, I'm afraid that a lack of profit motive alone will be enough to sideline any attempt to gather support for a serious safety culture improvement. In my experience working in a positive safety culture, the program was neither difficult nor expensive. If you are already doing the training, buying the ppe, etc., then the additional cost is almost zero. If someone out there knows how to convince top university management to buy in to safety I'd be very interested in how you did it.

On Mon, Sep 24, 2012 at 1:01 PM, Ashbrook, Peter C <peteash**At_Symbol_Here**illinois.edu> wrote:

Monona,

I totally agree that the focus ought to be talking about overcoming the challenges. But before doing so, please let's stop comparing lab safety in academia to lab safety in industry. Those comparisons raise the defenses of those of us in academia and make it difficult to focus on the real issue, which is improving laboratory safety.

To help move things forward, one has to define terms and ask the appropriate questions. I'm not sure I can do that, but I'll take a stab at it. Here are two:

1. What are we trying to accomplish with our laboratory safety programs?

2. How will we know if our laboratory safety programs are accomplishing this (these) objective(s)?

We can do all the training, with the best trainers in the world. We can have the best laboratory manuals that cover all the hazards perfectly. We can have the best policies and SOPs, and enforce the heck out of them. We can do all these things (at least theoretically), yet still have laboratory accidents. Why? Sometimes accidents happen despite our best efforts.

By these comments, I do not mean to place the blame for accidents on acts of God. We have a ways to go before our training and guidance and SOPs and enforcement are all up to snuff. We need to establish sustainable programs that will continue after everyone's attention moves on to the next big issue.

In answer to my first question, I would say we want to establish a culture where people not only take their own safety seriously, but look out for the safety of their co-workers. I am struggling to come up with a good answer to my second question. I would really appreciate some good metrics that could be used to support our laboratory safety program.

Like everyone else I know in academia, we are using the UCLA experience and the CSB report on Texas Tech to get people's attention and improve laboratory safety at our institutions. While there may be some out there, I don't know of any schools that are using the obstacles to laboratory safety as an excuse to do nothing. Certainly our professional organizations have made laboratory safety a high profile issue at recent conferences. And I am sure many of us are working on local solutions at our own schools. There are no simple solutions; it is a lot of work over a long period of time.

Peter

From: DCHAS-L Discussion List [mailto:dchas-l**At_Symbol_Here**MED.CORNELL.EDU] On Behalf Of ACTSNYC**At_Symbol_Here**CS.COM
Sent: Sunday, September 23, 2012 12:18 PM
To: DCHAS-L**At_Symbol_Here**MED.CORNELL.EDU
Subject: Re: [DCHAS-L] School labs vs industrial. Was: Undergrads in research labs - r...

I've been on the road, so this subject lapsed. But I'd like to do an "and furthermore."

People came up with all kinds of legitimate reasons why controlling the hours and enforcing the PPE and other rules on school labs is extremely challenging. But none of these are a reason not to look for a way to overcome these challenges. That's what we need to be talking about.

OK, there are times when students need extra time that are hard to predict, there are thousands of different types of labs, some with more hazardous stuff than others, there are budget issues, and more and more. But why can't we look at solutions? With computers, surveillance cameras, card swipe entry systems, and maybe some motivation for administrators after the UCLA suit against the Board of Regents, surely this is a time when we have a shot at coming up with some ways to at least chip away at the edges of this.

The way the subject ended on this forum was a silent endorsement of the strategy to do absolutely nothing. That's just not reasonable at this point.

Monona



In a message dated 9/14/2012 8:23:29 AM Eastern Daylight Time, kim.auletta**At_Symbol_Here**STONYBROOK.EDU writes:


To add to this list of challenges for us in academia to overcome:

1. More than half of our labs are "bio." The use of chemicals is incidental to their experiments and work. Most of the laboratory staff don't consider the majority of these chemicals hazardous; their focus is on their bio material and on maintaining cultures that do not get contaminated. We also have engineering labs where chemical use is incidental to the work.


2. An industrial/commercial facility has everyone working toward the same goal: make a product they can sell so everyone keeps their job. We have essentially 1000 different start-up companies of 2-10 people, each working toward individual goals (their own research money, degrees, papers, recognition).




Kim Gates Auletta
Laboratory Safety Specialist
Environmental Health & Safety
Stony Brook University
Stony Brook, NY 11794-6200
kim.auletta**At_Symbol_Here**stonybrook.edu
631-632-3032
FAX: 631-632-9683
EH&S Web site: http://www.stonybrook.edu/ehs/lab/




On Thu, Sep 13, 2012 at 5:05 PM, ILPI <info**At_Symbol_Here**ilpi.com> wrote:

On Sep 13, 2012, at 3:13 PM, ACTSNYC**At_Symbol_Here**cs.com wrote:


* Just what is it about a school laboratory as opposed to an industrial one that justifies putting a price on life this way?




Justify? Nothing. Explain? It's a lot of things. I suppose the question could be restated as "Why is academic culture not a safety culture despite our best efforts?" Our discussions on the list and at the ACS Philly meeting have hit a lot of these points lately, but I will try to summarize and add a few more thoughts. These are *generalized* and may not apply at all schools - and not all industries are paragons of safety, either.


1. Transient nature of the workers. Professors generally stop working in the lab after a few years - they are consumed by the almighty funding quest, teaching, etc. The professor trains the first couple of students, those students train the incoming ones, and a giant game of telephone ensues. The result is that people learn procedures that are not entirely safe or are downright dangerous, because SOPs are either not established, not updated, or not consulted. And, gotta love this, that original professor who started the chain probably learned all his/her skills the same way. I mentioned a specific example on the list previously - every student in the lab I worked in (including myself) had been taught the incorrect way to syringe t-BuLi, and every one of us thought it was correct at the time even though the Aldrich bulletin described another way.





2. Immortality. Most of the students, postdocs, and young faculty aren't old enough to seriously appreciate the consequences of a safety screwup in the lab. Unless you have witnessed a horrific accident, had an underwear-changing near miss, gone to the E-room in the middle of the night sure you were having a heart attack, or had kids who would be orphans if things went wrong, you generally don't tend to tune in that it Really Can Happen To You.


Industry has the same problem, but they solve it with enforceable policies. Do This and you're fired. Don't Do This and you're fired, etc. In my academic career I've seen students pushed out of PhD programs for poor academic performance, poor research, plagiarism, personal issues, etc., but I have never heard of anyone being forced out or sanctioned for safety issues. Perhaps that's something that should never happen - safety should be about education/improvement/partnership/support rather than blame - but with the UCLA prosecution that whole paradigm is on the brink of a seismic shift.





3. Complacency/familiarity. The more you work with something, the less dangerous it seems. You may take every precaution using n-BuLi the first time, but use it enough and you'll start becoming complacent. After you get some on your hand and it doesn't do anything but feel soapy, or you quench a big reaction without incident, you start to cut corners or not use the required PPE. Then comes the day when your scaleup runs away thermally and you don't have an ice bucket, or it's humid and it does catch fire in air. The first couple times you use a dichloromethane wash bottle and get some on your hand, you worry about it, but after a while you're hosing organic gunk off your hand with it.


Industry has the same problem, but in different ways and to a lesser degree. Changes in scale (Texas Tech, UCLA) occur less often, and in many cases expertise is available for the engineering/safety challenges of scaleup. Procedures are more repetitive, and therefore you can draw boundary lines - cut this corner and you're fired, etc.





4. Unfamiliarity/number of chemicals. Investigative research requires that you buy and use a lot of different chemicals, many of which you have probably never used before. New toxicity, reactivity, storage, compatibility, etc. issues all arise, and a busy grad student may not feel compelled or inclined to investigate these - all while buying the 500 mL bottle instead of the 25 mL one because it's only a few dollars more. To most grad students, an amine is an amine; they are all the same. That student might not ever realize that a particular diammine crosslinks your DNA.


Again, except perhaps in industrial research, the number of chemicals in industry is far lower. Most places don't want to create and hold onto a storeroom of chemicals kept on hand "just in case" someone happens to need them and can't wait 2 days for an Aldrich order.




5. No formal reporting/corrective mechanisms. When a safety problem comes up in industry, it carries all sorts of direct consequences (workman's comp, down time, lost profits, OSHA worries, whistleblower fears, lawsuits, bad publicity). Corrective action is generally straightforward - fire an employee, make a new rule, post a sign, put a new guard on a machine, whatever.



I've witnessed dozens of near misses and minor accidents in academic labs. I have never been at a school where identifying, reporting, and correcting them was encouraged, or even considered. Most go unreported because folks fear being punished or restricted. That's sad, because one person's near miss could be a lesson that saves someone else from a direct hit. Bringing up safety matters will not earn you tenure - and it could even cost you tenure, because professors are supposed to be "focused on research and getting grants" (yes, we all know that safety is an integral and non-removable component of research).


I suppose that the one thing that could go the furthest in changing the academic world would be a safety rewards program. Maybe DCHAS needs to develop a model rewards program just like there are model CHP's and the GHS model. More on that some other day.




6. God Complex/blindness. Certain PIs and researchers will simply not take safety information, advice, or rules from someone else seriously. "I'm a PhD chemist and I think I know how to handle chemicals safely, thank you very much." We've had discussions here about that one. Then there are those who really do believe they are taking safety seriously but have glaring issues in their lab. Their students wear goggles all the time, there is proper signage and whatnot, but there are four waste bottles (two unlabeled) in the cluttered hood with wires hanging down in front of it. This goes back to #3 in the end.




7. Inertia. "That's the way things have always been done and we haven't had any problems." This goes up the administrative chain as well. The recent spate of horrid academic accidents has, perversely, put our community into a unique position - we can actually get attention to address long-standing issues that administrations formerly would have been reluctant to address or spend money on.




8. Dysfunctional organization and/or poor leadership. This is more on a department by department or institutional basis. We all know it happens, but I'm trying to address things we can directly affect/effect.




In summary, those are just a few reasons, just ***off the top of my head***. It goes way deeper than this. We could all collaborate and write a book. Wait, it's been done: http://portal.acs.org/portal/PublicWebSite/about/governance/committees/chemicalsafety/CNBP_029720 A big round of applause for everyone who participated in that!


Rob Toreki


======================================================
Safety Emporium - Lab & Safety Supplies featuring brand names
you know and trust. Visit us at http://www.SafetyEmporium.com
esales**At_Symbol_Here**safetyemporium.com or toll-free: (866) 326-5412
Fax: (856) 553-6154, PO Box 1003, Blackwood, NJ 08012

--
Allen Niemi, PhD
Director
Occupational Safety and Health Services
Room 322 Lakeshore Center
Michigan Technological University
Phone: 906-487-2118
Fax: 906-487-3048
