From: Allen Niemi <anniemi**At_Symbol_Here**MTU.EDU>
Subject: Re: [DCHAS-L] laboratory safety--looking for solutions
Date: September 25, 2012 10:54:41 AM EDT
Reply-To: DCHAS-L <DCHAS-L**At_Symbol_Here**MED.CORNELL.EDU>
Message-ID: <920A41E71262413C83CF1842DB9E8536**At_Symbol_Here**bruekbergterm1>


I think your point is well taken about disincentives to reporting, but I don't think anyone is suggesting that accidents and close calls would be reported to funding agencies. The methods used by funding agencies to require effective safety programs as a funding prerequisite must be designed to go beyond a simple certification by an EHS official for the university. I'd like to know if anyone has ever declined to sign off on a funding agency's required EHS program certification on a major research proposal (only some agencies even require them). I think what is being proposed is to establish metrics that an agency would request rather than a simple sign-off. Sure, a school could falsify its metrics, but there are ways of deterring even that in the design. We get periodic campus visits from representatives of many major funding agencies; there's no reason they couldn't include some safety performance evaluations while they're here. In many cases, all they have to do is walk through a couple of labs. I like where this idea is going, and I think the first thing that needs to be done is to get the major scientific research funding agencies on board with the idea that they can, and should, play an important role in improving university research safety. ACS, CSB, OSHA/NIOSH, and several other institutions should have enough influence if they can coordinate their story and lay it out in public.

On Mon, Sep 24, 2012 at 8:05 PM, Ben Ruekberg <bruekberg**At_Symbol_Here**chm.uri.edu> wrote:

Pardon my mentioning it, but don't you think that you are setting up a conflict of interests here?

If funding is linked to safety performance, wouldn't that inhibit self-reporting of minor accidents and close calls?

I seem to recall talk about an ever-correcting safety program based on self-reporting of minor incidents and close calls.

It seems to me that you cannot really have both thorough self-reporting and withholding of funding. Perhaps I am being overly pessimistic or failing to see an obvious solution, but I think there might be a problem.

Thank you,

Ben


From: DCHAS-L Discussion List [mailto:dchas-l**At_Symbol_Here**MED.CORNELL.EDU] On Behalf Of Jeskie, Kimberly B.
Sent: Tuesday, September 25, 2012 8:22 AM


To: DCHAS-L**At_Symbol_Here**MED.CORNELL.EDU
Subject: Re: [DCHAS-L] laboratory safety--looking for solutions

Sounds like a great idea. One relatively easy way to do that would be to have an information keeper/assembler and have everyone shoot their ideas for metrics to that body. If we do something like this, I would suggest some parameters around the metrics (e.g., if someone suggests a metric, we also have to identify whether it's leading or lagging, truly measurable, objective, and reproducible).

Kimberly Begley Jeskie, MPH-OSHM

Operations Manager

Physical Sciences Directorate

Oak Ridge National Laboratory

Office: (865) 574-4945

Cell: (865) 919-4134

From: DCHAS-L Discussion List [mailto:dchas-l**At_Symbol_Here**MED.CORNELL.EDU] On Behalf Of Neal Langerman
Sent: Monday, September 24, 2012 6:15 PM
To: DCHAS-L**At_Symbol_Here**MED.CORNELL.EDU
Subject: Re: [DCHAS-L] laboratory safety--looking for solutions

Kim's point about granting agencies is correct. The job we need to do is to provide the metric. In the ideal world, a group would develop a procedure in conjunction with the ACS Committee on Chemical Safety, and this would be provided to NIH/NSF/DoD/DHS, etc., as guidance.

If we safety professionals cannot agree on a metric, then the agencies will not be able to and will not be inclined to take action.

So, this group should have a discussion on metrics.

------------------------------------------------------------------------------------

The information contained in this message is privileged and confidential and protected from disclosure. If the reader of this message is not the intended recipient, or an employee or agent responsible for delivering this message to the intended recipient, you are hereby notified that any dissemination, distribution or copying of this communication is strictly prohibited. If you have received this communication in error, please notify us immediately by replying to the message and deleting it from your computer.

ACSafety has a new address:

NEAL LANGERMAN, Ph.D.

ADVANCED CHEMICAL SAFETY, Inc.

PO Box 152329

SAN DIEGO CA 92195

011(619) 990-4908 (phone, 24/7)

www.chemical-safety.com

We no longer support FAX.

From: DCHAS-L Discussion List [mailto:dchas-l**At_Symbol_Here**MED.CORNELL.EDU] On Behalf Of Kim Auletta
Sent: Monday, September 24, 2012 2:26 PM
To: DCHAS-L**At_Symbol_Here**MED.CORNELL.EDU
Subject: Re: [DCHAS-L] laboratory safety--looking for solutions

Rob - ACS does need to include safety as a requirement for accreditation, but remember, ACS is only one of my accrediting groups. ABET covers the engineering school, and I haven't run into the bio groups yet.

I'd like to add that the bigger "stick" would be to have the GRANTING agencies include safety as a metric in their requirements. This was mentioned in the CSB Texas Tech review, and I think it's the one way to keep a PI on the safety track. And it needs to be something more than the CEC EH&S signs for some DoD work that's done on campus.

Kim Gates Auletta

Laboratory Safety Specialist

Environmental Health & Safety

Stony Brook University

Stony Brook, NY 11794-6200

On Mon, Sep 24, 2012 at 3:42 PM, ILPI <info**At_Symbol_Here**ilpi.com> wrote:

Peter,

Definitely, it's hard not to take it personally/professionally when all the factors underlying academic safety are laid bare. A lot of folks on the front line are doing everything possible to increase academic safety, and an examination of these challenges is not an indictment of these efforts or people. But we do have to acknowledge that academia and industry are very different and seek solutions that work best for each.

An arguably glib answer to your questions 1 and 2 is simply what Monona was asking for in her initial rhetorical question - that academic labs have a working environment at least as "safe" as industry's. Or, in something right up there with "world peace", to make research as safe as possible.

In pursuit of safer research labs and having identified some of the numerous factors/challenges faced in the academic research laboratory, I would add these questions to your list:

1. How much control do we have over each of these underlying factors/challenges? (Identify underlying factors)

2. What *can* be done in the context of current control? (Identify best practices)

3. What more will it take to raise that level of control and can we do it? (Explore new paradigms)

For example, take the first point I raised below - the transient nature of the workers. We have no control over that. What we can do, as discussed here, is to have each incoming worker registered in a workplace management system that tracks their safety training progress. What can we do to increase control? New ideas or multiple overlapping layers of defense are required there - have workers report the chemicals they are working with, institute laboratory procedure audits, start a safety peer-review program, use your tracking system to ensure that SOPs are read by quizzing workers on them, mandatory refresher courses, etc. Some of these will overlap other areas, of course. And you have to be careful not to have it appear burdensome/busywork, so there's a fine balance. Is any of this practical with limited resources? Add "limited resources" as a challenge in itself!

Or take the enormous challenge raised by Kim - "We have essentially 1000 different start-up companies of 2-10 people all working for individual goals (their own research money, degrees, papers, recognition)." Wow, no control over individual motivations. Safety culture and everything we do already is how we mitigate it. But to raise our level of control will take, in my opinion, the ACS requiring safety culture from day 1 of the curriculum - folks need to realize that no matter what they do in the laboratory, safety is an integral part and not a prethought, afterthought, or checklist. Accreditation requirements should reflect this - and not just on paper ("we teach safety in all our classes, yes"), but with an on-site audit that shows it is actually *implemented* and *effective*. Likewise, ACS should consider requiring safety and risk assessment/mitigation in the Experimental sections of synthesis papers.

Hmmm, when I said write a book, I think I should have said "multi-volume set".

Rob

======================================================

Safety Emporium - Lab & Safety Supplies featuring brand names

you know and trust. Visit us at http://www.SafetyEmporium.com

Fax: (856) 553-6154, PO Box 1003, Blackwood, NJ 08012

On Sep 24, 2012, at 1:01 PM, Ashbrook, Peter C wrote:

Monona,

I totally agree that the focus ought to be talking about overcoming the challenges. But before doing so, please let's stop comparing lab safety in academia to lab safety in industry. Those comparisons raise the defenses of those of us in academia and make it difficult to focus on the real issue, which is improving laboratory safety.

To help move things forward, one has to define terms and ask the appropriate questions. I'm not sure I can do that, but I'll take a stab at it. Here are two:

1. What are we trying to accomplish with our laboratory safety programs?

2. How will we know if our laboratory safety programs are accomplishing this (these) objective(s)?

We can do all the training, with the best trainers in the world. We can have the best laboratory manuals that cover all the hazards perfectly. We can have the best policies and SOPs, and enforce the heck out of them. We can do all these things (at least theoretically), yet still have laboratory accidents. Why? Sometimes accidents happen despite our best efforts.

By these comments, I do not mean to place the blame for accidents on acts of God. We have a ways to go before our training and guidance and SOPs and enforcement are all up to snuff. We need to establish sustainable programs that will continue after everyone's attention moves on to the next big issue.

In answer to my first question, I would say we want to establish a culture where people not only take their own safety seriously, but look out for the safety of their co-workers. I am struggling to come up with a good answer to my second question. I would really appreciate some good metrics that could be used to support our laboratory safety program.

Like everyone else I know in academia, we are using the UCLA experience and the CSB report on Texas Tech to get people's attention and improve laboratory safety at our institutions. While there may be some out there, I don't know of any schools that are using the obstacles to laboratory safety as an excuse to do nothing. Certainly our professional organizations have made laboratory safety a high-profile issue at recent conferences. And I am sure many of us are working on local solutions at our own schools. There are no simple solutions; it is a lot of work over a long period of time.

Peter

From: DCHAS-L Discussion List [mailto:dchas-l**At_Symbol_Here**MED.CORNELL.EDU] On Behalf Of ACTSNYC**At_Symbol_Here**CS.COM
Sent: Sunday, September 23, 2012 12:18 PM
To: DCHAS-L**At_Symbol_Here**MED.CORNELL.EDU
Subject: Re: [DCHAS-L] School labs vs industrial. Was: Undergrads in research labs - r...

I've been on the road, so this subject lapsed. But I'd like to do an "and furthermore."

People came up with all kinds of legitimate reasons why controlling the hours and enforcing the PPE and other rules on school labs is extremely challenging. But none of these are a reason not to look for a way to overcome these challenges. That's what we need to be talking about.

OK, there are times when students need extra time that are hard to predict, and there are thousands of different types of labs, some with more hazardous stuff than others, and there are budget issues, and more and more. Why can't we look at solutions? With computers, surveillance cameras, card-swipe entry systems, and maybe some motivation for administrators after the UCLA suit against the Board of Regents, surely this is a time when we have a shot at coming up with some ways to at least chip at the edges of this.

The way the subject ended on this forum was a silent endorsement of the strategy to do absolutely nothing. That's just not reasonable at this point.

Monona

In a message dated 9/14/2012 8:23:29 AM Eastern Daylight Time, kim.auletta**At_Symbol_Here**STONYBROOK.EDU writes:


To add to this list of challenges for us in academia to overcome:

1. More than 1/2 our labs are "bio". The use of chemicals is incidental to their experiments/work. Most of the laboratory staff don't consider the majority of these chemicals as hazardous. Their focus is on their bio material and maintaining cultures that do not get contaminated. We also have engineering labs that use chemicals as incidental to their work.


2. An industrial/commercial facility has everyone working for the same goal - make a product that they can sell and everyone keeps their job. We have essentially 1000 different start-up companies of 2-10 people all working for individual goals (their own research money, degrees, papers, recognition).

Kim Gates Auletta
Laboratory Safety Specialist
Environmental Health &Safety
Stony Brook University
Stony Brook, NY 11794-6200
kim.auletta**At_Symbol_Here**stonybrook.edu
631-632-3032
FAX: 631-632-9683
EH&S Web site: http://www.stonybrook.edu/ehs/lab/

On Thu, Sep 13, 2012 at 5:05 PM, ILPI <info**At_Symbol_Here**ilpi.com> wrote:

On Sep 13, 2012, at 3:13 PM, ACTSNYC**At_Symbol_Here**cs.com wrote:

* Just what is it about a school laboratory as opposed to an industrial one that justifies putting a price on life this way?

Justify? Nothing. Explain? It's a lot of things. I suppose the question could be restated as "Why is academic culture not a safety culture despite our best efforts?" Our discussions on the list and at the ACS Philly meeting have hit a lot of these points lately, but I will try to summarize and add a few more thoughts. These are *generalized* and may not apply at all schools - and not all industries are paragons of safety, either.


1. Transient nature of the workers. Professors generally stop working in the lab after a few years - they are consumed by the almighty funding quest, teaching etc. He/she trains the first couple students, those students train the incoming ones, and a giant game of telephone ensues. The result is that people learn procedures that are not entirely safe or downright dangerous because SOP's are either not established, updated, or consulted. And, gotta love this, that original professor who started the chain probably learned all his/her skills the same way. I mentioned a specific example on the list previously - every student in the lab I worked in (including myself) had been taught the incorrect way to syringe t-BuLi and every one of us thought it was correct at the time even though the Aldrich bulletin had another way.

2. Immortality. Most of the students, postdocs, and young faculty aren't old enough to seriously appreciate the consequences of a safety screwup in the lab. Unless you have witnessed a horrific accident, had an underwear-changing near miss, gone to the E-room in the middle of the night sure you were having a heart attack, or had kids who would be orphans if things went wrong, you generally don't tend to tune in that it Really Can Happen To You.


Industry has the same problem, but they solve it with enforceable policies. Do This and you're fired. Don't Do This and you're fired etc. In my academic career I've seen students pushed out of PhD programs for poor academic performance, poor research, plagiarism, personal issues etc., but I have never heard of anyone being forced out or sanctioned for safety issues. Perhaps that's something that should never happen - safety should be about education/improvement/partnership/support rather than blame, but with the UCLA prosecution that whole paradigm is on the brink of seismic shift.

3. Complacency/familiarity. The more you work with something, the less dangerous it seems. You may take every precaution using n-BuLi the first time, but use it enough and you'll start becoming complacent. After you get some on your hand and it doesn't do anything but feel soapy, or you quench a big reaction without incident, you start to cut corners or not use the required PPE. Then comes the day when your scaleup runs away thermally and you don't have an ice bucket, or it's humid and it does catch fire in air. The first couple times you use a dichloromethane wash bottle and get some on your hand, you worry about it, but after a while you're hosing organic gunk off your hand with it.

Industry has the same problem, but in different ways and to a lesser degree. Changes in scale (Texas Tech, UCLA) occur less often, and in many cases expertise is available for the engineering/safety challenges of scaleup. Procedures are more repetitive, and therefore you can draw boundary lines - cut this corner and you're fired, etc.

4. Unfamiliarity/number of chemicals. Investigative research requires that you buy and use a lot of different chemicals, many of which you have probably never used before. New toxicity, reactivity, storage, compatibility, etc. issues all arise, and a busy grad student may not feel compelled or inclined to investigate these - all while buying the 500 mL bottle instead of the 25 mL one because it's only a few dollars more. To most grad students, an amine is an amine; they are all the same. That student might not ever realize that a particular diamine crosslinks your DNA.


Again, except perhaps for industrial research, the number of chemicals is far lower. Most places don't want to create and hold onto a storeroom of chemicals that you want to keep on hand "just in case" you happen to need some and can't wait 2 days for your Aldrich order.

5. No formal reporting/corrective mechanisms. When a safety problem comes up in industry, it carries all sorts of direct consequences (workman's comp, downtime, lost profits, OSHA worries, whistleblower fears, lawsuits, bad publicity). Corrective action is generally straightforward - fire an employee, make a new rule, post a sign, put a new guard on a machine, whatever.

I've witnessed dozens of near misses and minor accidents in academic labs. I have never been at a school where identifying, reporting, and correcting them was encouraged, or even considered. Most go unreported because folks fear being punished or restricted. Which is sad, because one person's near miss could be a lesson that saves someone else from a direct hit. Bringing up safety matters will not earn you tenure - and it could even cost you it, because professors should be "focused on research and getting grants" (yes, we all know that safety is an integral and non-removable component of research).


I suppose that the one thing that could go the furthest in changing the academic world would be a safety rewards program. Maybe DCHAS needs to develop a model rewards program just like there are model CHP's and the GHS model. More on that some other day.

6. God complex/blindness. Certain PIs and researchers will simply not take safety information, advice, or rules from someone else seriously. "I'm a PhD chemist and I think I know how to handle chemicals safely, thank you very much." We've had discussions here about that one. Then there are those who really do believe they are taking safety seriously but have glaring issues in their lab. Their students wear goggles all the time, there is proper signage and whatnot, but there are four waste bottles (two unlabeled) in the cluttered hood with wires hanging down in front of it. This goes back to #3 in the end.

7. Inertia. "That's the way things have always been done and we haven't had any problems." This goes up the administrative chain as well. The recent spate of horrid academic accidents has, perversely, put our community into a unique position - we can actually get attention to address long-standing issues that administrations formerly would have been reluctant to address or spend money on.

8. Dysfunctional organization and/or poor leadership. This is more on a department-by-department or institutional basis. We all know it happens, but I'm trying to address things we can directly affect/effect.

In summary, those are just a few reasons, just ***off the top of my head***. It goes way deeper than this. We could all collaborate and write a book. Wait, it's been done. http://portal.acs.org/portal/PublicWebSite/about/governance/committees/chemicalsafety/CNBP_029720 A big round of applause for everyone who participated in that!


Rob Toreki


======================================================
Safety Emporium - Lab &Safety Supplies featuring brand names
you know and trust. Visit us at http://www.SafetyEmporium.com
esales**At_Symbol_Here**safetyemporium.com or toll-free: (866) 326-5412
Fax: (856) 553-6154, PO Box 1003, Blackwood, NJ 08012

--
Allen Niemi, PhD
Director
Occupational Safety and Health Services
Room 322 Lakeshore Center
Michigan Technological University
Phone: 906-487-2118
Fax: 906-487-3048




The content of this page reflects the personal opinion(s) of the author(s) only, not the American Chemical Society, ILPI, Safety Emporium, or any other party. Use of any information on this page is at the reader's own risk. Unauthorized reproduction of these materials is prohibited. Send questions/comments about the archive to secretary@dchas.org.
The maintenance and hosting of the DCHAS-L archive is provided through the generous support of Safety Emporium.