
Sunday, July 19, 2015

To Measure (or Estimate) Skin Exposure or Not

Those of you who read this blog know that I have featured Chris Packham from the UK and his tireless work on stemming the risk from dermal exposure.

I recently got an email from Chris commenting on a recent “If the only tool you have is a hammer…” blog in which I again highlighted the need for better tools to assess dermal exposure and risk.  Chris’ correspondence goes on at some length recounting the limitations of the current methodologies and in general concluding that we should not even try to measure (or estimate with models) dermal exposure at this point.  In short, Chris suggests that we put that effort into managing the risk by choking down the exposure rather than assessing it.

I am reproducing the last paragraph of his correspondence below with the essence of his message.

“To measure, or not to measure?
Hygienists, in their training, tend to concentrate on measurement. I suggest that this can lead to a form of ‘tunnel vision’, the view being that if we cannot measure then it is not important for the hygienist. When, admittedly some years ago, I participated in some of the AIHA conferences, it appeared to me that the hygienists’ concern was to be able to demonstrate compliance. The view seemed to be that: “We cannot measure and there are no regulatory limits, so as with skin we cannot demonstrate to the client/employer that he or she is compliant, let us just ignore this.”
My opinion, for what it is worth, is that attempting to measure skin exposure takes time away from work to reduce exposure and thus, whilst from an academic viewpoint is, perhaps, time well spent, it does little to reduce the incidence and prevalence of damage to health due to workplace skin exposure. I believe that risk assessment for skin exposure will remain subjective for years to come. What is important is that we attempt to introduce a measure of consistency so that we rate the risks of each task assessment using the same criteria. We can then rank the risks such that we tackle them in order of declining severity.”

I would be happy to send Chris’ entire note to those who request it at mjayjock@gmail.com; however, I am going to have to respectfully disagree with Chris on this issue. 

My position is that we can always make some quantitative estimate of dermal exposure using available models, information from sources like the EPA Exposure Factors Handbook, and some worst-case assumptions.  This process bounds the upper level of exposure quantitatively and is amenable to refinement with more data.

I have always found it very difficult to effectively manage any risk which has not been reasonably assessed.  The tools may be blunt but, in my opinion, they are better than not estimating the exposure potential, and the risk from that potential, at all.  Rather than being an academic exercise, I see these efforts as a concerted and rational effort to quantitatively gauge the level of dermal exposure potential as a critical step in the assessment and management of risk.

Although it significantly over-estimates the potential exposure, the “on the skin – in the body” assumption used as a worst case by some has its value in quantitatively bounding a worst case.  More important, it points the way to the value of experimental data in lowering the uncertainty.  My sense is that the use of relatively poor tools is preferable to not attempting to use estimating tools at all. I believe that how we learn and progress is in the trying; that is, we do the best we can with the tools at hand and report the uncertainties associated with those efforts.
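To make the bounding arithmetic concrete, here is a minimal sketch of a screening-level dermal dose estimate; the function and every numeric input are hypothetical placeholders of the kind one would draw from the EPA Exposure Factors Handbook, not figures from any actual assessment:

```python
# Screening-level dermal dose bound: the worst case assumes everything
# deposited on the skin enters the body ("on the skin - in the body").
# All numbers below are hypothetical placeholders.

def dermal_dose_mg_per_kg_day(surface_loading_mg_cm2: float,
                              skin_area_cm2: float,
                              events_per_day: float,
                              absorbed_fraction: float,
                              body_weight_kg: float) -> float:
    """Systemic dose (mg/kg-day) from material deposited on the skin."""
    deposited_mg_day = surface_loading_mg_cm2 * skin_area_cm2 * events_per_day
    return deposited_mg_day * absorbed_fraction / body_weight_kg

# Worst case: 100% of what lands on the skin is absorbed.
upper_bound = dermal_dose_mg_per_kg_day(0.01, 840, 1, 1.0, 70)
# Refined with (hypothetical) experimental absorption data, e.g. 5%.
refined = dermal_dose_mg_per_kg_day(0.01, 840, 1, 0.05, 70)
print(f"bounding: {upper_bound:.4f} mg/kg-day, refined: {refined:.5f} mg/kg-day")
```

The bounding number is deliberately conservative; as the monomer story below illustrates, its main job is to show whether the worst case is acceptable and, if not, what experimental data would buy you.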

A prime example of this approach occurred many years ago when I was working for a large chemical company.  We had a product with a carcinogenic residual monomer, and the business asked me what level of residual would render a virtually safe dose as it was defined for this carcinogen at the time.  My analysis used worst-case assumptions, including the “on the skin – in the body” assumption, which, when all was done, meant that the monomer had to be reduced during manufacture to relatively low levels.  This assessment, and the documented and rational risk management that came from it, stood for a significant period of time until it was agreed that more data would help to refine the risk assessment.  With the addition of good experimental data, we determined that only a relatively small fraction of the amount on the skin could ever get into the systemic circulation, and this provided considerable relief relative to how much residual monomer would be allowed within the product.

In this case we were required to present a quantitative estimate of dermal exposure, which drove our estimation of risk.  We did so with the best information available to us, which ultimately justified the expense of gathering more data and led to a successful resolution of the issue.  This could not have been done if we had simply abandoned any estimation of exposure and instead tried to choke off the exposure without an assessment.

This is not to say that you cannot manage a risk without assessing it, only that doing so is quite difficult, can be very inefficient, and in a practical sense can present an impossible situation.

I look forward to continuing this dialog with Chris and other readers of this blog.


Sunday, July 12, 2015

Dimensional Analysis a Simple but Effective Tool

Dimensional Analysis (DA), sometimes called unit analysis, is a method by which modelers and other technologists keep track of the units of numbers during mathematical operations.  It is a reality check in that it tells you whether the value(s) you use and calculate make any sense.  Indeed, it has saved me many times from making embarrassing mistakes.  Granted, I still make embarrassing mistakes, but it happens less often because of DA.  Also, I have gotten smart enough to always require peer review and sign-off for important work.

Conceptually DA is relatively easy to understand.  You simply need to  know what units are associated with the quantities you are working with and then do the algebra to cancel them out.   

Of course, knowing the units comes first. For example, length is always expressed in some common measure such as inch, foot, yard, or meter.  Area has units of length squared, such as square meters (m³ with the exponent 2, i.e., m²).  Volume is length cubed; for example, m³.

You do the same for mass:  e.g., grams (g), pounds (lbs).

The quantity “time” is the most interesting and, by definition, most dynamic: e.g.,  seconds (s), minutes (min), hours (hrs), years (yr).

The first step in DA is understanding the conversion factors for the various units within each quantity.  For example: 0.001 mg = 1 microgram (ug) = 1,000 nanograms (ng) = 1,000,000 picograms (pg).

Ultimately, the quantities get combined into the common entities that we know and love: e.g., speed = miles/hr, concentration = g/m³.
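As a tiny worked example (my own, with an arbitrary starting mass), the mass conversion chain given above can be written out so that each factor cancels the previous unit:

```python
# Convert 0.25 mg to picograms by chaining the factors given above;
# each step cancels the previous unit (mg -> ug -> ng -> pg).
mass_mg = 0.25
mass_ug = mass_mg * 1000      # 1 mg = 1,000 ug
mass_ng = mass_ug * 1000      # 1 ug = 1,000 ng
mass_pg = mass_ng * 1000      # 1 ng = 1,000 pg
print(f"{mass_mg} mg = {mass_pg:,.0f} pg")  # 0.25 mg = 250,000,000 pg
```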

The real fun comes when they get combined into more complicated algorithms, as is often done in modeling.  For example, imagine trying to understand and sort out the following modeling algorithm, which describes the dynamic point-in-time concentration (C) in the air of a well-mixed box containing a spill:

C = [(α)(M0) / ((α)(V) − Q)] × [e^(−(Q/V)t) − e^(−αt)]

Here we need to sort things out by doing DA.

You would need to know the units of α, M0, V, Q and t, which are:

M0 = initial spill mass (mg)
α = evaporation rate constant (1/min)
V = effective room volume (m³)
Q = ventilation rate (m³/min)
t = time (min)

Let's start by taking "bite sized" pieces of the equation:

The quantity (α)(M0) has the units of (1/min)(mg) or, when combined, mg/min.  The quantity (α)(V) − Q has the units of (1/min)(m³) − m³/min or, when combined, m³/min.  If you divide the units for (α)(M0) by these units, you get mg/min divided by m³/min, which cancels out the minutes (min) to give mg/m³, which happens to be the units of C.

The units in the exponents completely cancel out, so the exponents are “unitless”.  Thus "e" raised to a unitless power is a unitless number as well.
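To see DA enforced programmatically, below is a minimal sketch using the third-party pint units library (an assumption on my part; any units-aware library would do). The numeric inputs are hypothetical placeholders; the point is that the units either cancel to mg/m³ as shown above or pint raises an error:

```python
# A dimensional-analysis check of the spill equation using the "pint"
# units library (pip install pint). Numbers are hypothetical placeholders;
# if the units did not work out, pint would raise a DimensionalityError.
import math
import pint

ureg = pint.UnitRegistry()

alpha = 0.1 / ureg.minute              # evaporation rate constant (1/min)
M0 = 500 * ureg.milligram              # initial spill mass (mg)
V = 30 * ureg.meter ** 3               # effective room volume (m3)
Q = 2 * ureg.meter ** 3 / ureg.minute  # ventilation rate (m3/min)
t = 15 * ureg.minute                   # elapsed time (min)

# The exponents must be unitless, exactly as the DA above showed.
exp1 = (-(Q / V) * t).to("dimensionless").magnitude
exp2 = (-alpha * t).to("dimensionless").magnitude

C = (alpha * M0 / (alpha * V - Q)) * (math.exp(exp1) - math.exp(exp2))
print(C.to(ureg.mg / ureg.m ** 3))  # ~7.2 milligram / meter ** 3
```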
 
Two excellent tutorials on DA online are:



The first part of the second URL is somewhat animated, and I found it to be particularly clear.  You may want to skip the part on the use of DA for chemical reactions, though it is quite interesting.

If you have avoided diving into this sort of analysis in the past, I urge you to consider these tutorials (or others you might find online).  Once you understand these concepts, you will be able to analyze any algorithm for its units to see whether it, its numbers, and its predictions make sense.

I love to get comments back from the readers of this blog.  I would really like to know if this subject was of any value or perhaps aimed too low.  I simply know that DA has been an excellent tool for me over the years and wanted to share it.

Sunday, July 5, 2015

Cancer Risk Estimated at Legal OELs and ACGIH Voluntary OELs

I admire Adam Finkel for his intellectual acumen and force.  Adam has been fighting the good fight relative to exposure limits since I have known him.  My friend and colleague Tom Armstrong recently made me aware of items that Adam and his colleagues at the Center for Public Integrity in Washington, DC have published online: a remarkably user-friendly and informative tool that shows the predicted level of protection provided by OSHA occupational exposure limits (OELs) versus that provided by the limits of the American Conference of Governmental Industrial Hygienists (ACGIH).

One can always argue with how the quantitative levels of risk were determined.  Adam and his colleagues anticipated this and provide the details and the rationale online:  http://www.publicintegrity.org/2015/06/26/17563/about-our-methodology

For me, perhaps the most interesting and useful tool they published in this recent effort is their “Unequal Risk Investigation” cancer-risk graphic.

[Screenshot: the Unequal Risk Investigation tool, here showing the estimated risks and uses of trichloroethylene]

In the live version of this tool you can see the difference between OSHA and ACGIH exposure limits for carcinogens on a scale from an estimated 1 in 1,000 (10⁻³) to 1,000 in 1,000 (10⁰) risk of cancer from exposure at the exposure limits.  You can filter the information by any of 12 categories, including construction, manufacturing, health care and agriculture.  The tool also allows you to drill down to the individual chemical for specific risk estimates.

For those of you who read this blog regularly, you may recognize that what Adam and his colleagues are doing here is pretty much in line with the idea of Risk@OEL, in which the residual risk at any exposure limit is presented as part of the documentation of that limit.  Because of our inability to reasonably establish true thresholds of risk from exposure to non-carcinogens, my thinking is that residual risk should be calculated (with error bands) for all toxic end-points, not just cancer.
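To make the Risk@OEL idea concrete, here is a minimal, entirely illustrative sketch of a linear no-threshold residual-risk calculation; the unit risk, the OEL, and the occupational adjustment pattern are my own hypothetical placeholders, not values from Adam's analysis:

```python
# Illustrative Risk@OEL calculation under a linear no-threshold model.
# The unit risk and OEL below are hypothetical placeholders.

unit_risk_per_ug_m3 = 2e-6   # lifetime cancer risk per ug/m3, continuous exposure
oel_mg_m3 = 0.5              # hypothetical 8-hour occupational exposure limit

# Adjust a continuous-lifetime unit risk to a working-lifetime pattern:
# 8/24 hours, 5/7 days, 45 working years out of a 70-year lifetime.
occupational_adjustment = (8 / 24) * (5 / 7) * (45 / 70)

oel_ug_m3 = oel_mg_m3 * 1000
residual_risk = unit_risk_per_ug_m3 * oel_ug_m3 * occupational_adjustment
print(f"estimated residual cancer risk at the OEL: {residual_risk:.1e}")
# ~1.5e-4, i.e., roughly 0.15 extra cancers per 1,000 exposed workers
```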

Indeed, I believe we all need to be aware of the significant uncertainty that exists around these estimates.  However, uncertainty notwithstanding, my sense is that we need to openly provide these estimates along with our best understanding of the error bands associated with them.

I would love to hear your comments about the information and ideas being presented here.  Do you believe it is OK not to include these estimates and their uncertainties in the documentation of the exposure limits?


Sunday, June 28, 2015

Exposure Modeling Leading to Discovery


Models can often present us with remarkable learning experiences.   Indeed, combining model development with the necessary aspects of experimentation to feed a model can lead to some important discoveries.  This week I am going to recount a discovery that my colleagues and I had while developing a model of the indoor air exposure potential from the off-gassing of pesticide from treated wood.

The pesticide was an important product for the company I was working for at the time, and as the Manager of Human Health Risk Assessment it was my responsibility to conduct a risk assessment on the use of the product as a preservative for wood used in the indoor environment.

Our first experiment was to put the treated wood into a glass chamber and measure the pesticide’s concentration in the chamber’s air.  From this and other experiments, the generation rate of the off-gassing was estimated.

An early lesson in all this is that models and simulations are not reality but simply a portrayal of reality.  If they are reasonable portrayals, we can learn something.  Indeed, glass chambers are not real rooms, but these results provided a lot of information about the ultimate use of treated wood indoors in residences.

The system was complicated enough that we decided to do dynamic modeling of the chemical in compartments in a manner completely analogous to physiologically based pharmacokinetic (PBPK) models (https://en.wikipedia.org/wiki/Physiologically_based_pharmacokinetic_modelling). That is, each compartment is conceptually constructed to describe the instantaneous and integrated rate of pesticide input to and output from it.  For each compartment there was also an option to describe changes or reactions of the pesticide during its time there; a minimal code sketch of this bookkeeping follows the list below.

The compartments we chose in our glass chamber experiment were:

·         The treated wood
·         The air space in the chamber
·         The chamber walls
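As promised above, here is a minimal sketch of that compartmental bookkeeping, written as ordinary differential equations and solved with SciPy rather than ACSL/SimuSolv; the structure follows the three compartments listed, and all rate constants are hypothetical placeholders except the wall-degradation rate, which anticipates the fitted value discussed below:

```python
# A minimal sketch (not the authors' ACSL/SimuSolv model) of the
# three-compartment mass balance: treated wood -> chamber air -> chamber
# walls, with first-order degradation on the walls.
from scipy.integrate import solve_ivp

k_emit = 0.01   # 1/hr, off-gassing from wood into chamber air (hypothetical)
k_dep = 0.20    # 1/hr, deposition from air onto the walls (hypothetical)
ach = 0.50      # 1/hr, ventilation removal from chamber air (hypothetical)
k_deg = 0.005   # 1/hr, degradation while on the walls (the fitted value below)

def mass_balance(t, y):
    """Instantaneous rate of pesticide input to and output from each compartment."""
    wood, air, walls = y
    d_wood = -k_emit * wood
    d_air = k_emit * wood - (k_dep + ach) * air
    d_walls = k_dep * air - k_deg * walls
    return [d_wood, d_air, d_walls]

# 100 mg initially in the wood; nothing yet in the air or on the walls.
sol = solve_ivp(mass_balance, (0.0, 300.0), [100.0, 0.0, 0.0])
wood, air, walls = sol.y[:, -1]
print(f"after 300 hr: wood={wood:.1f} mg, air={air:.2f} mg, walls={walls:.1f} mg")
```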

A remarkable bit of technology that allowed us to do this was the Advanced Continuous Simulation Language (ACSL) software (https://en.wikipedia.org/wiki/AEgis_Technologies).  At the time, the best way to run this software was on mini-computers (known colloquially as “pizza boxes” because of their shape and size) running UNIX, with a program subscription from the Dow Chemical Company called SimuSolv.  I mention this because there was a lot of expense (tens of thousands of dollars) involved in getting a “pizza box”, having the technical IT support to run UNIX on this minicomputer, and paying for the SimuSolv license.  Today you can get a PC license for ACSL for as little as $500 (http://www.acslx.com/sales/).  I am not trying to sell or advertise ACSL.  My intent is only to point out that ACSL is a good modeling tool that we went to a lot of trouble and expense to use, and that it has become far more affordable over the last 20 years.

We ran a number of experiments to feed this model and during our initial analysis could not make the model work!

In our initial model, we assumed that there was NO reaction of the pesticide while it was in any compartment.   As a result the model predicted exposures that were about 5 times higher than what was actually measured in the glass chamber.   Clearly, the model got it wrong and needed to be refined to account for the "lost" material.

We asked the synthesis chemists about the stability of the pesticide in air or on glass surfaces and they said that it should be very stable for the time frame we were measuring (a few hundred hours).  
 
Because our model did not work, and notwithstanding the chemists’ comments, we hypothesized that the combination of a long residence time on the internal chamber glass surface and the large surface-area-to-volume ratio of the pesticide film on the glass could indeed lead to degradation.  This degradation would come from reaction with oxygen, trace amounts of tropospheric ozone, or other reactive species present in the untreated suburban air used to ventilate the chamber.

We changed the model to allow for degradation while on the chamber walls, and SimuSolv allowed us to optimize the model for the degradation rate that provided the best fit to the data.  This eventually led to 0.005/hr as the estimated rate of degradation.  Assuming first-order kinetics, that rate corresponds to a half-life of about 139 hours, meaning roughly 40% of the deposited pesticide would degrade to other, typically less toxic species within 100 hours.
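For the record, the first-order arithmetic behind that fitted rate is simple to check:

```python
# First-order decay implied by the fitted rate constant.
import math

k = 0.005  # fitted degradation rate constant, 1/hr
t = 100    # hours

fraction_degraded = 1 - math.exp(-k * t)
half_life = math.log(2) / k
print(f"degraded after {t} hr: {fraction_degraded:.0%}")  # ~39%
print(f"half-life: {half_life:.0f} hr")                   # ~139 hr
```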

The chemists congratulated us on our model fit but said that they did not believe that degradation was occurring.  That led to another series of experiments in which we demonstrated that essentially pure pesticide placed on glass was significantly transformed into almost a dozen chromatographically distinct compounds after prolonged exposure to ambient air.  Clearly, the significant rate of degradation that the model predicted was occurring.

The initial failure of the model allowed us to discover this important mechanism that was driving the concentration in the glass chamber.

Clearly, most of us do not live in glass houses, and subsequent experiments with real rooms showed that much stronger effects, presumably from absorption (probably with degradation), were in play; however, the lesson here should not be lost: modeling can lead to some important discoveries.


We published most of the above work in the AIHA J and I would be happy to send a copy to anyone who requests it from me at:  mjayjock@gmail.com

Monday, June 22, 2015

Exposure Modeling Research - The Time is Now

For someone who has been advocating modeling for exposure estimation for many years, it is very heartening to see research in this area taking root and growing.

Twenty-four years ago this spring, a friend and colleague, Neil Hawkins, suggested that I meet with a young woman who was an IH working for Dow Corning.  Her name was Susan Arnold, and Neil said that she was very bright, with a lot of energy, and that I should talk with her about exposure modeling.  I contacted Susan, and we went out to dinner at the AIHA Annual Conference in Salt Lake City in the spring of 1991.  We have been friends and colleagues ever since, and Susan has worked as a modeler throughout.  Indeed, she received her Master’s degree with a modeling project and will defend her PhD thesis on modeling at the University of Minnesota later this summer.  Suffice it to say that Neil and I are very proud of Susan and her accomplishments.

At this month’s conference of the American Industrial Hygiene Association in Salt Lake City, I and many of my colleagues were treated to some of the excellent work coming out of the University of Minnesota under the leadership of Dr. Gurumurthy Ramachandran or, as many of us know him, Ram.  

On the 24th anniversary of our first meeting in Salt Lake City, Susan presented three papers on modeling, which I will mention very briefly here; I will send her slides to whoever asks for them.

For many years Susan, Ram, Perry Logan, John Mulhausen and others have been interested in investigating the nature, power and accuracy of “expert judgment” within the realm of industrial hygiene.  Indeed, since the beginning of the profession, the mantle or cloak of “expert judgment” has been invoked most times an IH would declare a particular exposure scenario to be “safe” or in need of further investigation.  The term was so ubiquitous that it begged to be defined.  This was done in the latest (and, I believe, earlier) editions of the AIHA exposure “Strategies Book”.  The quote below is from the 3rd Edition:
“The application and appropriate use of knowledge gained from the formal education, experience, experimentation, inference, and analogy.  The capacity of an experienced professional to draw correct inferences from incomplete quantitative data, frequently on the basis of observations, analogy and intuition.”

The nature of the professional judgment of industrial hygienists has been put to the test by asking them to characterize well-described exposure scenarios (without monitoring data) by placing them in one of four bins: less than 10% of the OEL; 10% to less than 50% of the OEL; 50% to less than 100% of the OEL; and greater than or equal to the OEL.  When asked to do this without information provided by modeling, they systematically underestimated the true exposure.
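As a small illustration, here is one way to code that binning scheme; the boundary conventions at exactly 10%, 50% and 100% of the OEL are my assumption, since the studies' exact conventions are not given here:

```python
# A sketch of the four-bin exposure categorization scheme described above.
def exposure_bin(fraction_of_oel: float) -> int:
    """Return the exposure category for an exposure expressed as a
    fraction of the OEL (boundary handling is an assumption)."""
    if fraction_of_oel < 0.10:
        return 1  # less than 10% of the OEL
    elif fraction_of_oel < 0.50:
        return 2  # 10% to less than 50% of the OEL
    elif fraction_of_oel < 1.0:
        return 3  # 50% to less than 100% of the OEL
    return 4      # greater than or equal to the OEL

print([exposure_bin(x) for x in (0.05, 0.3, 0.8, 1.2)])  # [1, 2, 3, 4]
```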

Note: Even when you have monitoring data, characterizing or placing the exposure  in the correct bin is challenging.  If you do not believe me, read a previous blog on the Smart Phone App:  IH DIG (http://jayjock-associates.blogspot.com/2014/01/ih-dig-and-pump-monkey.html).  Play IH DIG and you will understand. 

Susan’s three presentations get into the issue of professional judgment aided by modeling while putting some of the most popular models through their paces in both the laboratory and real world.   The titles of the three talks she presented are:

  • Evaluating Model Performance under Highly Controlled Conditions
  • Evaluating Model Performance under Real World Conditions
  • Predicting Construction Related Silica Exposure Using Input from Chamber and Field Studies

 As mentioned above, send me an email request (mjayjock@gmail.com) and I will send you these slides.

Research into exposure assessment modeling is really just getting started; there is still plenty of room for folks to get involved in this growing field.  Indeed, as Susan wrote in the last conclusion of one of her talks:  “A very young science… there is still much to learn!”

Saturday, June 13, 2015

New Research into Eddy Diffusivity (D)

One cannot teach (or blog) without learning.  It is one of the very real perks of trying to convey knowledge and information. 

At the recent conference of the American Industrial Hygiene Association in Salt Lake City, I and many of my colleagues were treated to some of the excellent work coming out of the University of Minnesota under the leadership of Dr. Gurumurthy Ramachandran or, as many of us know him, Ram.   Two of his graduate students presented their work which I will be summarizing here over the next few weeks. 

This week, it is my pleasure to summarize the presentation and work of Yuan Shao, who told us of his efforts to determine the eddy diffusivity coefficient (D) from more easily measured quantities such as ventilation rates and room dimensions.

You may remember a blog on this subject that I published on December 30, 2013 (http://jayjock-associates.blogspot.com/2013/12/the-eddy-diffusion-near-field-model-is.html), entitled:  The Eddy Diffusion Near Field Model is Now Useable.   In that 2013 blog I discussed how the eddy diffusivity model should be ideally suited for modeling many indoor sources; however, the major problem with the use of the model is the determination of a critical model parameter, specifically the eddy diffusivity coefficient (D).  Indeed, the predictions of this model are highly dependent on D, as described below.

The critical variable D depends on how the air moves about randomly within the indoor environment.  Unfortunately, D has historically proven very difficult to measure or estimate.  As a result, many of us wishing to use this model have been forced to use a very wide range of estimates for D, and the utility of the model has been quite limited.  In that blog I discussed the research of Dr. Kai-Chung Cheng from Stanford University and his work to relate D to the ventilation rate, expressed as air changes per hour (ACH), and the room’s dimensions.  I noted Dr. Cheng’s work as a real advancement in our ability to use the eddy diffusivity model which, by the way, is one of the available modules in the freeware spreadsheet IH MOD.
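For readers who want to experiment, below is a minimal sketch of one common form of the turbulent eddy diffusion model for a continuous point source; the emission rate, D value, and distance are hypothetical placeholders, and a defensible value of D is exactly the input this research aims to supply:

```python
# One common form of the turbulent eddy diffusion model for a continuous
# point source: C(r, t) = G / (4*pi*D*r) * erfc(r / (2*sqrt(D*t))).
# All input values are hypothetical placeholders.
import math
from scipy.special import erfc

G = 10.0   # emission rate, mg/min (hypothetical)
D = 0.5    # eddy diffusivity, m2/min (hypothetical: the hard-to-get input)
r = 0.5    # distance from the source, m
t = 30.0   # elapsed time, min

C = G / (4 * math.pi * D * r) * erfc(r / (2 * math.sqrt(D * t)))
print(f"C at {r} m after {t} min: {C:.2f} mg/m3")
```

Note how sensitive the prediction is to D: halving D roughly doubles the near-field concentration, which is why a measurable surrogate like ACH matters so much.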

It would appear that Yuan Shao has advanced that effort and provided us with more data and analysis of this important topic.   His conclusions are presented below: 

  • An exposure chamber was constructed to create conditions for the eddy diffusion studies.
  • A diffusion model accounting for chamber boundary, advection, and the removal of contaminant by the local ventilation system was developed.
  • In this study, the measured and modeled data fit well over a range of experimental conditions. There is a strong linear relationship between D and ACH, providing a surrogate parameter for estimating D in real-life settings.
  • The values of D obtained from the experiments are generally consistent with values reported in the literature.
  • These findings make the use of turbulent eddy diffusion models for exposure assessment in workplace environments more feasible.


This is exactly the type of work that has been needed for many years but is now coming out as a result of these excellent research programs.

Yuan Shao has given me permission to send his full slide deck to whoever asks me for it at:  mjayjock@gmail.com.

As always, I would be very interested in your comments about this work and your experience with the Eddy Diffusivity Model and IH MOD.


Sunday, June 7, 2015

Having a Hammer as a Sole Tool Focuses Your View of Problems to Nails


A noted psychologist, Abraham Maslow, is credited by some as coming up with one of my favorite quotes which I am paraphrasing below:

“If the only tool you have is a hammer, 
  you will see every problem as a nail”

Our Industrial Hygiene tool kit is rich in tools designed to assess the exposure and risk from the inhalation of toxicants.  Indeed, essentially all of our exposure limits (TLVs, PELs, OELs, etc.) are set as airborne concentrations that might occur in the breathing zone of workers.  I am unaware of any similar compendium of dermal exposure limits, but my readers have pleasantly surprised me in the past, so if you know of any please send me an email:  mjayjock@gmail.com.

Indeed, if a chemical has a relatively high molecular weight (say >200 Daltons) and an octanol-water partition coefficient greater than 100, its exposure potential will most likely result more from dermal exposure than from inhalation.  In fact, I seem to remember biological and air monitoring studies done with pentachlorophenol in open wood treatment lines showing that the majority (>90%) of the systemic exposure/dose to the workers came from dermal rather than inhalation exposure.
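Codified, that rule of thumb is nothing more than a two-condition screen; the thresholds come from this post and are heuristics, not regulatory criteria:

```python
# A sketch of the rule-of-thumb screen described above; the thresholds
# (MW > 200 Da, Kow > 100) are heuristics from the post, not regulatory
# criteria.
def dermal_route_likely_dominant(mol_weight_da: float, kow: float) -> bool:
    """Flag chemicals whose exposure potential is likely dominated by
    the dermal route rather than inhalation."""
    return mol_weight_da > 200 and kow > 100

print(dermal_route_likely_dominant(266.3, 1.3e5))  # pentachlorophenol-like values
```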

I met Chris Packham in London many years ago, and he struck me with his focus and dedication to the science of controlling worker health risk from dermal exposure.  Clearly he has continued that dedication in his more recent teaching and writing.  The following quote, taken from a document he recently sent me, does indeed provide food for thought:

It is well established that inhalation of toxic chemicals can result in systemic effects, i.e. damage to internal organs and systems. A great deal of research and development has been undertaken resulting in strategies and equipment to monitor inhalation exposure. As a result in many countries there are exposure limits for a wide range of chemicals. Far less attention has been paid to the potential for chemicals to penetrate the skin and either cause or contribute to systemic toxic effects. Yet there is considerable evidence showing the potential for skin exposure to do this, including with chemicals that are unlikely ever to be inhaled because of their physical properties.(1) There is also a view that inhalation exposure results in more serious damage to health than can occur from skin exposure, often regarded as “just a rash”. Yet the EU Classification, Labelling and Packaging Regulation (EU1272/2008) contains the Hazard Statement 'H310 – Fatal in contact with skin'.

In this article the author will review the evidence showing why, in considering risks of damage to health due to the use of chemicals, the potential for skin exposure to cause systemic damage must be an integral part of any chemical exposure risk assessment.


If you would like the full text of this piece by Chris, just let me know at mjayjock@gmail.com and I will send it to you.

Chris has also had a recent (February 2015) piece printed by the British Occupational Hygiene Society on this subject, which I would be happy to send to you as well.


I would be very interested to hear how readers of this blog address dermal exposure and risk assessment and how these efforts compare to what is done for inhalation risks.