Sunday, July 26, 2015

WHY do Risk Assessment?

Chris Keil is a technically savvy colleague who has done a lot to advance the science of human exposure modeling.  He is a prime mover and editor of both editions of our bible for occupational exposure modeling:  Mathematical Models for Estimating Occupational Exposure to Chemicals.

Chris recently sent me and other colleagues a note asking for our help in a project he is doing.  An excerpt from his email is presented below:

“I’m doing a project in which I am writing on the WHY of occupational safety and health. Searching for “Ethics and OSH” yields lots of info on the Ethics of OSH *practice* but not so much the philosophical/ethical basis for it.

Lots of the written rationale for OHS is tied to it being a good idea economically. And there are vague references that it is the “right thing to do”. What I’m looking for are scholarly treatments of why OSH is the “right thing to do”.

If you know of any such treatments, please send them my way.”

In my opinion, this issue is fairly apparent and straightforward.   Indeed, I believe that our forefathers in the United States were absolutely brilliant in that they wanted to separate religion from the state while also defining and asserting human values that were universally applicable to all people irrespective of religion.   This is not to say that religious principles, particularly Judeo-Christian beliefs, did not drive these values.  Rather, I believe, they intended that no particular religious dogma would be associated with the assertion and establishment of these as secular rules to live by.

The second sentence of the July 4, 1776, U.S. Declaration of Independence is particularly blunt, elegant and powerful in this regard:

“We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.”

I would argue that an untoward health effect from a chemical exposure or other workplace hazard is a direct threat to a person’s pursuit of Happiness if not their Life.

Indeed, some believe that the kernel of some of these ideas was voiced by the English philosopher John Locke almost 90 years earlier, in 1689, when he wrote about the importance of "life, liberty, health, and indolency of body…" (ref:  same wiki web site as above).

The outdated term “indolency” is defined as:


indolency (plural indolencies)
1.     (obsolete) The lack of pain; absence of pain

It would be hard to argue that this country of ours is not based on these principles.  They define who we are and how we should act as a nation and as a people.   To be true to these very clearly stated and agreed-upon values, it is not hard to see that we need to control the threats to “indolency” that might exist within our society from chemical exposure or other workplace hazards.

I have always found it to be particularly difficult and often quite inefficient to manage a risk to health from chemical exposure that was not first reasonably assessed.   Indeed, if we do not even attempt to assess a risk of chemical exposure then it is often tacitly (and often incorrectly) assumed to be negligible.  In short, doing good, proactive OSH allows us to "walk the walk" relative to the most basic of our values.

Doing good OSH may be good for the bottom line, but that reason is not even close to why it should be done.   Doing good OSH lies at what should be the heart of our stated and agreed-upon governing values as citizens and as people.

As usual, I (and Chris) would love to hear your thoughts on this issue.

Sunday, July 19, 2015

To Measure (or Estimate) Skin Exposure or Not

Those of you who read this blog know that I have featured the work of Chris Packham from the UK and his tireless work on stemming the risk from dermal exposure.

I recently got an email from Chris commenting on a recent “If the only tool you have is a hammer…” blog in which I again highlighted the need for better tools to assess dermal exposure and risk.   Chris’ correspondence goes on at some length recounting the limitations of the current methodologies and in general concluding that we should not even try to measure (or estimate with models) dermal exposure at this point.  In short, Chris suggests that we put that effort into managing the risk by choking down the exposure rather than assessing it.

I am reproducing the last paragraph of his correspondence below with the essence of his message.

“To measure, or not to measure?
Hygienists, in their training, tend to concentrate on measurement. I suggest that this can lead to a form of ‘tunnel vision’, the view being that if we cannot measure then it is not important for the hygienist. When, admittedly some years ago, I participated in some of the AIHA conferences, it appeared to me that the hygienists’ concern was to be able to demonstrate compliance. The view seemed to be that: “We cannot measure and there are no regulatory limits, so as with skin we cannot demonstrate to the client/employer that he or she is compliant, let us just ignore this.”
My opinion, for what it is worth, is that attempting to measure skin exposure takes time away from work to reduce exposure and thus, whilst from an academic viewpoint is, perhaps, time well spent, it does little to reduce the incidence and prevalence of damage to health due to workplace skin exposure. I believe that risk assessment for skin exposure will remain subjective for years to come. What is important is that we attempt to introduce a measure of consistency so that we rate the risks of each task assessment using the same criteria. We can then rank the risks such that we tackle them in order of declining severity.”

I would be happy to send Chris’ entire note to those who request it; however, I am going to have to respectfully disagree with Chris on this issue.

My position is that we can always make some quantitative estimate of dermal exposure using available models and information from sources like the EPA Exposure Factors Handbook and some worst case assumptions.   This process then bounds the quantitative upper level of exposure and is amenable to refinement with more data.
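As a concrete illustration of this bounding approach, a worst-case screening calculation might look like the sketch below. All input values are hypothetical placeholders of my own choosing, not numbers drawn from the Exposure Factors Handbook; a real assessment would substitute documented exposure factors.

```python
# Worst-case ("on the skin - in the body") dermal dose screen.
# All inputs below are hypothetical placeholders for illustration.

def worst_case_dermal_dose(conc_mg_per_cm3, film_cm, area_cm2,
                           events_per_day, body_weight_kg):
    """Bounding daily dose (mg/kg-day) assuming 100% absorption."""
    mass_per_event_mg = conc_mg_per_cm3 * film_cm * area_cm2
    return mass_per_event_mg * events_per_day / body_weight_kg

# Hypothetical inputs: neat liquid (1 g/cm3 = 1000 mg/cm3),
# 0.001 cm film over ~840 cm2 of skin, 2 events/day, 70 kg worker.
dose = worst_case_dermal_dose(1000.0, 1.0e-3, 840.0, 2, 70.0)
print(f"Bounding dose: {dose:.1f} mg/kg-day")
```

The point of such a sketch is not the number itself but that it produces a defensible upper bound which better data can only lower.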

I have always found it very difficult to effectively manage any risk which has not been reasonably assessed.  The tools may be blunt but, in my opinion, they are better than not estimating the exposure potential, and the risk from that potential, at all.  Rather than being an academic exercise, I see these efforts as a concerted and rational effort to quantitatively gauge the level of dermal exposure potential as a critical step in the assessment and management of risk.

Although it significantly over-estimates the potential exposure, the “on the skin – in the body” assumption used as a worst case by some has its value in quantitatively bounding that worst case.  More importantly, it points the way to the value of experimental data in lowering the uncertainty.   My sense is that the use of relatively poor tools is preferable to not attempting to use estimating tools at all. I believe that how we learn and progress is in the trying; that is, we do the best we can with the tools at hand and report the uncertainties associated with those efforts.

A prime example of this occurred many years ago when I was working for a large chemical company.  We had a product with a carcinogenic residual monomer, and the business asked me what level of residual would render a virtually safe dose as was defined for this carcinogen at the time.   My analysis used worst case assumptions, including the “on the skin – in the body” assumption, which, when all was done, meant that the monomer had to be reduced during manufacture to relatively low levels.  This assessment, and the documented and rational risk management that came from it, stood for a significant period of time until it was agreed that more data would help to refine the risk assessment.   With the addition of good experimental data, we determined that only a relatively small fraction of the amount on the skin could ever get into the systemic circulation, and this provided considerable relief relative to how much residual monomer would be allowed within the product.

In this case we were required to present a quantitative estimate of dermal exposure, which drove our estimation of risk.  We did so with the best information available to us, which ultimately prompted the gathering (at some expense) of more data and a successful resolution of the issue.   This could not have been done if we had simply abandoned any estimation of exposure and instead tried to choke off the exposure without an assessment.

This is not to say that you cannot manage a risk without assessing it, only that doing so is quite difficult, can be very inefficient, and can present an impossible situation in a practical sense.

I look forward to continuing this dialog with Chris and other readers of this blog.

Sunday, July 12, 2015

Dimensional Analysis: a Simple but Effective Tool

Dimensional Analysis (DA), sometimes called unit analysis, is a method by which modelers or other technologists keep track of the units of numbers during mathematical operations.   It is a reality check in that it tells you if the value(s) that you use and calculate make any sense or not.   Indeed, it has saved me many times from making embarrassing mistakes.  Granted, I still make embarrassing mistakes, but it happens less often because of DA.  Also, I have gotten smart enough to always require peer review and sign-off for important work.

Conceptually DA is relatively easy to understand.  You simply need to  know what units are associated with the quantities you are working with and then do the algebra to cancel them out.   

Of course, knowing the units is first. For example, length is always expressed in some common measure like: inch, foot, yard, meter etc.    Area has units of length squared like square meters or m2.   Volume is length cubed; for example, m3.

You do the same for mass:  e.g., grams (g), pounds (lbs).

The quantity “time” is the most interesting and, by definition, most dynamic: e.g.,  seconds (s), minutes (min), hours (hrs), years (yr).

The first step in DA is understanding the conversion factors for the various units within each quantity.  For example: 0.001 mg = 1 microgram (ug) = 1,000 nanograms (ng) = 1,000,000 picograms (pg).
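These chained conversions are easy to mechanize. A minimal Python sketch (the table and function names are my own, for illustration) keeps one factor per unit relative to a base unit, so any pair of units can be converted by passing through the base:

```python
# Mass-unit conversion via a table of factors relative to one
# base unit (grams). Routing through the base unit means we only
# need one factor per unit, not one per pair of units.

FACTORS_TO_G = {"g": 1.0, "mg": 1e-3, "ug": 1e-6, "ng": 1e-9, "pg": 1e-12}

def convert_mass(value, from_unit, to_unit):
    """Convert `value` between any two units listed in FACTORS_TO_G."""
    return value * FACTORS_TO_G[from_unit] / FACTORS_TO_G[to_unit]

print(f"0.001 mg = {convert_mass(0.001, 'mg', 'ug'):g} ug")
print(f"1 ug = {convert_mass(1, 'ug', 'ng'):g} ng")
```

The same pattern works for length, volume, or time; only the factor table changes.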

Ultimately, the quantities get combined into the common entities that we know and love: e.g., Speed = miles/hr, Concentration = g/m3.

The real fun comes when they get combined into more complicated algorithms, as is often done in modeling.   For example, imagine trying to understand and sort out the following modeling algorithm, which describes the dynamic point-in-time concentration C in an air volume in a well-mixed box containing a spill:

C = [(α)(M0) / ((α)(V) – Q)] [e^(–(Q/V)t) – e^(–αt)]

Here we need to sort things out by doing DA.

You would need to know the units of α, M0, V, Q and t, which are:

M0 = initial spill mass (mg)
α = evaporation rate constant (1/min)
V = effective room volume (m3)
Q = ventilation rate (m3/min)
t = time (min)

Let's start by taking "bite sized" pieces of the equation:

The quantity (α)(M0) has the units of (1/min)(mg) or, when combined, mg/min.   The quantity (α)(V) – Q has the units of (1/min)(m3), or, when combined, m3/min.    If you divide these units into the units for (α)(M0), you get mg/min divided by m3/min, which cancels out the minutes (min) to give mg/m3 – which happens to be the units of C.

The units in the exponents completely cancel out, so the exponents are “unitless”.   Thus "e" raised to a unitless power is a unitless number as well.
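Once the units check out, the equation is easy to evaluate numerically. Here is a short sketch of the well-mixed box calculation with hypothetical inputs (my reading of the model as C(t) = αM0/(αV – Q) × [e^(–Qt/V) – e^(–αt)]; the spill size, room volume, and rates below are made up for illustration):

```python
import math

# Well-mixed box with an exponentially decaying evaporation source.
# Units: alpha 1/min, M0 mg, V m3, Q m3/min, t min  ->  C in mg/m3.

def box_conc(t, alpha, M0, V, Q):
    """Point-in-time concentration (mg/m3) at time t (min)."""
    return alpha * M0 / (alpha * V - Q) * (
        math.exp(-Q * t / V) - math.exp(-alpha * t))

# Hypothetical inputs: 100 mg spill, alpha = 0.1/min,
# 50 m3 room, 1 m3/min ventilation.
for t in (0, 5, 15, 60):
    print(f"t = {t:3d} min   C = {box_conc(t, 0.1, 100.0, 50.0, 1.0):.3f} mg/m3")
```

Note that C(0) = 0, as it should be for a spill that has not yet evaporated, which is exactly the kind of sanity check DA and a quick calculation make possible.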
Two excellent tutorials on DA online are:

The first part of the second url is somewhat animated and I found it to be particularly clear.  You may want to skip the part on the use of DA for chemical reactions, though it is quite interesting.

If you have avoided diving into this sort of analysis in the past, I urge you to consider these tutorials (or others you might find online).  Once you understand these concepts, you will be able to analyze any algorithm for its units to see if it, its numbers and its predictions make sense.

I love to get comments back from the readers of this blog.  I would really like to know if this subject was of any value or perhaps aimed too low.  I simply know that DA has been an excellent tool for me over the years and wanted to share it.

Sunday, July 5, 2015

Cancer Risk Estimated at Legal OELs and ACGIH Voluntary OELs

I admire Adam Finkel for his intellectual acumen and force.  Adam has been fighting the good fight relative to exposure limits since I have known him.  My friend and colleague Tom Armstrong recently made me aware of items that Adam and his colleagues at the Center for Public Integrity in Washington, DC have published online.  They provide a remarkably user-friendly and informative tool that shows the predicted level of protection provided by OSHA occupational exposure limits (OELs) versus those provided by the American Conference of Governmental Industrial Hygienists (ACGIH).

One can always argue with how the quantitative levels of risk were determined.   Adam and his colleagues anticipated this and provide the details and the rationale online.

For me perhaps the most interesting and useful tool they published in this recent effort  is their “Unequal Risk Investigation” cancer-risk graphic.   A screen shot of this tool is below:

The live version of this tool allows one to see the difference between OSHA and ACGIH exposure limits for carcinogens on a scale of estimated risk of cancer from exposure at the limits, ranging from 1 in 1,000 (10^-3) to 1,000 in 1,000 (10^0).   You can filter the information by any of 12 categories including construction, manufacturing, health care and agriculture.  The tool also allows you to drill down to the individual chemical for specific risk estimates.  In the above screenshot the details of the estimated risks and uses of trichloroethylene are shown.

For those of you who read this blog regularly, you may recognize that what Adam and his colleagues are doing here is pretty much in line with the idea of Risk@OEL, in which the residual risk at any exposure limit is presented as part of the documentation of that limit.  Because of our inability to reasonably establish true thresholds of risk from exposure to non-carcinogens, my thinking is that residual risk should be calculated (with error bands) for all toxic end-points, not just cancer.
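As a sketch of what a Risk@OEL calculation could look like under standard linear low-dose assumptions, consider the fragment below. The unit risk value and the occupational adjustment factors are hypothetical placeholders of my own, not any agency's numbers, and the linear-extrapolation approach is one convention among several:

```python
# Sketch of an excess lifetime cancer risk estimate at an OEL,
# assuming linear low-dose extrapolation. The inhalation unit
# risk (IUR) and adjustment factors here are hypothetical.

def risk_at_oel(oel_mg_m3, iur_per_mg_m3,
                hours_per_week=40.0, weeks_per_year=50.0,
                working_years=45.0, lifetime_years=70.0):
    """Excess lifetime risk for a working lifetime at the OEL."""
    # Fraction of a continuous lifetime actually spent exposed at work.
    frac = (hours_per_week / (24.0 * 7.0)) \
         * (weeks_per_year / 52.0) \
         * (working_years / lifetime_years)
    return iur_per_mg_m3 * oel_mg_m3 * frac

# Hypothetical: OEL of 1 mg/m3, IUR of 1e-2 per mg/m3 (continuous).
print(f"Estimated excess lifetime risk: {risk_at_oel(1.0, 1e-2):.1e}")
```

A fuller treatment would carry error bands on the unit risk through to the final number, which is exactly the uncertainty reporting argued for above.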

Indeed, we all need to be aware of the significant uncertainty that exists around these estimates.   Uncertainty notwithstanding, however, my sense is that we need to openly provide these estimates along with our best understanding of the error bands associated with them.

I would love to hear your comments about the information and ideas being presented here.  Do you believe it is OK not to include these estimates and their uncertainties in the documentation of the exposure limits?