
Monday, December 14, 2015

Regulations Need Good Tools for Risk Assessment

In the last blog I asserted that the “cold hand of regulation” is needed before risk assessment will happen for the vast majority of chemicals used in commerce.  A colleague wrote to me about that blog and reminded me that having an ostensibly comprehensive set of regulations is no guarantee that good risk assessment will be done.  I have excerpted below a portion of the email from this colleague, who is literally on the front lines of applying the available risk assessment tools.  Please note that I have always found this IH professional to be insightful and plain-speaking, as well as dedicated and passionate about making a difference.

“I've recently been doing some more work for U.S.-based multi-national firms reviewing REACh documentation, and I have to say, I am kind of disillusioned about the 'promise' of the REACh regulation's outcomes.  So much of what I see for extended SDSs [Safety Data Sheets] are just cookie-cutter verbiage - or, use Tier I screening tools to justify squishy statements that have very little utility to the end/downstream users of chemical-containing products.   I am sure there are a multitude of reasons for why this has happened, but the end result (IMHO) is going to fall far short of the original intent of the regs.”

I can only say that I completely agree with this observation from this seasoned IH/RA professional.  Indeed, I believe that I know the primary reason for this unfortunate state of affairs; namely, a lack of well-developed tools, particularly in the realm of exposure assessment.

The first threshold or gate in risk assessment is the decision to do a risk assessment.  As I argued in the last blog, to date that threshold has not been crossed for most chemicals in this country.  It has been different in Europe: there has been a movement in the EU for the last 15 years or so to cross this threshold, and they are clearly ahead of us in this regard.

Once you are on the hook to do a risk assessment then you need the resources to make it happen.  If you do not have them then you have to develop them.  Since you will be applying them literally to thousands of chemicals, they have to be generally applicable to a large number of chemicals.   The tools for this task need to strike a balance between being “sharp” and incisive enough to render good answers for specific chemicals while being “general” enough to be applicable in a cost-effective manner.  You obviously cannot measure everything everywhere; as such, the development of validated and comprehensive exposure and effects models is critical.

I have asserted for years that we have yet to do the basic research needed to properly feed our exposure models and make them “sharp” enough to be generally useful in the above context.  We did our best to lay out a specific template for research for the EU in a series of 2005 workshops sponsored by the European Commission Joint Research Centre (JRC) in Ispra, Italy.  These reports, especially the 100+ page report on exposure source characterization, drew on the combined expertise of seasoned and respected scientists from around the world (Berkeley, Virginia Tech, USEPA, EU, Japan, China) to point to where the research was needed.  That document and its recommendations lay on the JRC server, and in my files and on my hard drive, for years without any action.  I can no longer find it on the JRC servers, but I have it and would be happy to send the report to anyone asking at mjayjock@gmail.com.  You can also find it as a downloadable link on my webpage:
http://www.jayjock-associates.com/educational-files-and-events/


Instead of doing the basic, initially expensive but ultimately cost-effective detailed research and tool development, the regulatory community in Europe has developed or adopted lightweight, stop-gap approaches, which have resulted in the outcomes described by my colleague on the front lines; namely, “cookie-cutter verbiage - or, use Tier I screening tools to justify squishy statements that have very little utility to the end/downstream users of chemical-containing products.”

In my opinion, there really is no substitute for doing it right and I hope that someday the research and its work products will fulfill the original intent of the REACh (and hopefully the upcoming US and other world-wide chemical) regulation.


Tuesday, December 1, 2015

Chemical Risk Assessment Needs the Cold Hand of Regulation

I worked for a large specialty chemical company for 35 years.  The company had a reputation as a leader in the area of human health risk assessment.  I believe that reputation came as a result of its response to a tragedy that occurred years earlier, when some of its workers were unknowingly exposed to a potent carcinogen and many became ill.  The heartbreak of this incident caused the owners to really understand, and act on, the fact that you typically cannot manage any risk that has not first been reasonably assessed.  When I came onto the risk assessment scene in the mid-1980s the culture was well in place, but the tools for quantitative risk assessment were (and, I must say, remain) relatively under-developed.  I did, and continue to this day to, spend my professional time working on them.

For much of those 30+ years I have been busy working on these tools as applied to compounds that were clearly hazardous; that is, those designed or discovered to be biologically active.  This included biocides and “stand out” toxicants such as benzene, formaldehyde, and chlorinated hydrocarbons, along with any other molecules important to the company that had somehow adversely affected human health or had been shown in animal testing to be carcinogenic, neurotoxic or a reproductive hazard.

This is how essentially all chemical risk assessment is done today, and it is lacking.  It is “reactive” risk assessment, in which relatively few chemicals are evaluated and the vast majority go unaddressed.  This was convincingly shown in what is known as the “HPV Challenge”.  An excerpt from an Environmental Defense Fund web site:

When it launched the HPV Challenge in 1998, the U.S. Environmental Protection Agency (EPA) acknowledged there were huge gaps in publicly available hazard data even for HPV chemicals (those produced in or imported into the U.S. in amounts equal to or exceeding one million pounds annually). 

This June 2015 web site (https://www.edf.org/health/reports/high-hopes-low-marks) generally asserts a continued lack of information born of missed deadlines and data quality concerns.

After thinking about this literally for decades, I have come to the conclusion that even highly “enlightened” companies such as the one I worked for (and continue to work for as a consultant) will not take on the burden of doing risk assessments on all chemicals by itself.  The systematic, comprehensive and shared risk assessment of chemicals is something that needs to occur in the public interest and therefore should be subject to public governance; that is, regulation.

The “Government” has shown itself to be very capable of screwing things up but I frankly do not see a reasonable alternative.  I suggest that we simply have to do a better job of governing and not throw the risk assessment “baby” out with the governmental “bathwater”.

The European Union has been trying to do this with REACh, and more recently in this country we are trying to “reform” the Toxic Substances Control Act.  Ultimately, I believe the cold hand of regulation will be the best, and perhaps only, way to do rational and comprehensive chemical risk assessment.

As usual, I would love to hear your take on this opinion which I can present here as anonymous if you prefer.

Sunday, August 2, 2015

Wanted: Topics for this Blog from You, the Readers


To date, I have published 120 blogs in this space on essentially a weekly basis for more than two years.   It has been a very fulfilling activity in that I have connected with many wonderful colleagues and learned a lot in the process.

Now, in the middle of the summer of 2015, I have decided to take a break, to re-evaluate the purpose of this blog, and to seek your input.

I know a lot of you are on vacation or doing other activities so I may repeat this request in the fall when I hope to restart this weekly blog.

What I am asking is that you send me your questions or requests for topics for this blog, within the very general realm of human health exposure and risk assessment.  This could include anything under the rather broad topics of:
  • Exposure modeling
  • Exposure monitoring
  • Toxicology
  • Exposure Limits
  • Ethics of Risk Assessment
  • Politics of Risk Assessment

Indeed, even if your question or topic does not fit into any of these exactly, please ask anyway.   Maybe I can add something or send you in the right direction.

Please contact me at mjayjock@gmail.com


Have a good summer and I hope to be back online again in the fall.

Sunday, July 26, 2015

WHY do Risk Assessment?

Chris Keil is a technically savvy colleague who has done a lot to advance the science of human exposure modeling.  He is a prime mover and editor of both editions of our bible for occupational exposure modeling:  Mathematical Models for Estimating Occupational Exposure to Chemicals.

Chris recently sent me and other colleagues a note asking for our help in a project he is doing.  An excerpt from his email is presented below:

“I’m doing a project in which I am writing on the WHY of occupational safety and health. Searching for “Ethics and OSH” yields lots of info on the Ethics of OSH *practice* but not so much the philosophical/ethical basis for it.

Lots of the written rationale for OHS is tied to it being a good idea economically. And there are vague references that it is the “right thing to do”. What I’m looking for are scholarly treatments of why OSH is the “right thing to do”.

If you know of any such treatments, please send them my way.”

In my opinion, this issue is fairly apparent and straightforward.  Indeed, I believe that our forefathers in the United States were absolutely brilliant in that they wanted to separate religion from the state but also wanted to define and assert human values that were universally applicable to all people irrespective of religion.  This is not to say that religious principles, particularly Judaeo-Christian beliefs, did not drive these values.  Rather, I believe, they intended that no particular religious dogma would be associated with the assertion and establishment of these secular rules to live by.

The second sentence of the July 4, 1776, U.S. Declaration of Independence is particularly blunt, elegant and powerful in this regard:

“We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.”  https://en.wikipedia.org/wiki/Life,_Liberty_and_the_pursuit_of_Happiness

I would argue that an untoward health effect from a chemical exposure or other workplace hazard is a direct threat to a person’s pursuit of Happiness if not their Life.

Indeed, some believe that the kernel of some of these ideas was voiced by the English philosopher John Locke almost 90 years earlier, in 1689, when he wrote about the importance of "life, liberty, health, and indolency of body…" (ref: same wiki web site as above).

The outdated term “indolency” is an obsolete noun meaning the lack or absence of pain.


It would be hard to argue that this country is not based on these principles.  They define who we are and how we should act as a nation and as a people.  To be true to these very clearly stated and agreed-upon values, it is not hard to see that we need to control the threats to “indolency” that might exist within our society from chemical exposure or other workplace hazards.

I have always found it to be particularly difficult and often quite inefficient to manage a risk to health from chemical exposure that was not first reasonably assessed.   Indeed, if we do not even attempt to assess a risk of chemical exposure then it is often tacitly (and often incorrectly) assumed to be negligible.  In short, doing good, proactive OSH allows us to "walk the walk" relative to the most basic of our values.

Doing good OSH may be good for the bottom line, but that is not even close to the reason it should be done.  Doing good OSH lies at what should be the heart of our stated and agreed-upon governing values as citizens and as people.

As usual, I (and Chris) would love to hear your thoughts on this issue.




Sunday, July 19, 2015

To Measure (or Estimate) Skin Exposure or Not

Those of you who read this blog know that I have featured the work of Chris Packham from the UK and his tireless work on stemming the risk from dermal exposure.

I recently got an email from Chris commenting on a recent “If the only tool you have is a hammer…” blog in which I again highlighted the need for better tools to assess dermal exposure and risk.  Chris' correspondence goes on at some length recounting the limitations of the current methodologies, in general concluding that we should not even try to measure (or estimate with models) dermal exposure at this point.  In short, Chris suggests that we put that effort into managing the risk by choking down the exposure rather than assessing it.

I am reproducing the last paragraph of his correspondence below with the essence of his message.

“To measure, or not to measure?
Hygienists, in their training, tend to concentrate on measurement. I suggest that this can lead to a form of ‘tunnel vision’, the view being that if we cannot measure then it is not important for the hygienist. When, admittedly some years ago, I participated in some of the AIHA conferences, it appeared to me that the hygienists' concern was to be able to demonstrate compliance. The view seemed to be that: “We cannot measure and there are no regulatory limits, so as with skin we cannot demonstrate to the client/employer that he or she is compliant, let us just ignore this.”
My opinion, for what it is worth, is that attempting to measure skin exposure takes time away from work to reduce exposure and thus, whilst from an academic viewpoint it is, perhaps, time well spent, it does little to reduce the incidence and prevalence of damage to health due to workplace skin exposure. I believe that risk assessment for skin exposure will remain subjective for years to come. What is important is that we attempt to introduce a measure of consistency so that we rate the risks of each task assessment using the same criteria. We can then rank the risks such that we tackle them in order of declining severity.”

I would be happy to send Chris’ entire note to those who request it at mjayjock@gmail.com; however, I am going to have to respectfully disagree with Chris on this issue. 

My position is that we can always make some quantitative estimate of dermal exposure using available models, information from sources like the EPA Exposure Factors Handbook, and some worst case assumptions.  This process then bounds the quantitative upper level of exposure and is amenable to refinement with more data.

I have always found it very difficult to effectively manage any risk which has not been reasonably assessed.  The tools may be blunt but, in my opinion, they are better than not estimating the exposure potential, and the risk from that potential, at all.  Rather than being an academic exercise, I see these efforts as a concerted and rational effort to quantitatively gauge the level of dermal exposure potential as a critical step in the assessment and management of risk.

Although it significantly over-estimates the potential exposure, the “on the skin – in the body” assumption used by some as a worst case has its value in quantitatively bounding that worst case.  More important, it points the way to the value of experimental data in lowering the uncertainty.  My sense is that the use of relatively poor tools is preferable to not attempting to use estimating tools at all.  I believe that how we learn and progress is in the trying; that is, we do the best we can with the tools at hand and report the uncertainties associated with those efforts.
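That bounding approach can be sketched as a small calculation.  In the snippet below the loading, skin surface area, body weight and absorbed fraction are all hypothetical placeholders (the EPA Exposure Factors Handbook is the kind of source real values would come from), and the function name is mine:

```python
# Hedged sketch of the worst-case "on the skin - in the body" bounding
# calculation. All numeric inputs are illustrative assumptions.

def dermal_dose(loading_mg_per_cm2, area_cm2, body_weight_kg, absorbed_fraction=1.0):
    """Systemic dose (mg per kg body weight) for a single dermal exposure event."""
    return loading_mg_per_cm2 * area_cm2 * absorbed_fraction / body_weight_kg

# Worst case: 100% of what lands on the skin is assumed to enter the body.
worst_case = dermal_dose(0.001, 840.0, 70.0)

# Refined with (hypothetical) experimental data: only 5% is actually absorbed.
refined = dermal_dose(0.001, 840.0, 70.0, absorbed_fraction=0.05)

print(round(worst_case, 6))  # bounding estimate, mg/kg
print(round(refined, 6))     # refined estimate, mg/kg
```

The worst case bounds the answer; measured absorption data can only move the estimate downward, which is exactly the refinement path described in the monomer example below.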

A prime example of this occurred many years ago when I was working for a large chemical company.  We had a product with a carcinogenic residual monomer, and the business asked me what level of residual would render a virtually safe dose as it was defined for this carcinogen at the time.  My analysis used worst case assumptions, including the “on the skin – in the body” assumption, which, when all was done, meant that the monomer had to be reduced during manufacture to relatively low levels.  This assessment, and the documented and rational risk management that came from it, stood for a significant period of time until it was agreed that more data would help to refine the risk assessment.  With the addition of good experimental data, we determined that only a relatively small fraction of the amount on the skin could ever get into the systemic circulation, and this provided considerable relief relative to how much residual monomer would be allowed within the product.

In this case we were required to present a quantitative estimate of dermal exposure, which drove our estimation of risk.  We did so with the best information available to us, which ultimately justified the gathering (and expense) of more data and a successful resolution of the issue.  This could not have been done if we had simply abandoned any estimation of exposure and instead tried to choke off the exposure without an assessment.

This is not to say that you cannot manage a risk without assessing it, only that doing so is quite difficult, can be very inefficient, and in a practical sense can present an impossible situation.

I look forward to continuing this dialog with Chris and other readers of this blog.


Sunday, July 12, 2015

Dimensional Analysis a Simple but Effective Tool

Dimensional Analysis (DA), sometimes called unit analysis, is a method by which modelers and other technologists keep track of the units of numbers during mathematical operations.  It is a reality check in that it tells you whether the value(s) that you use and calculate make any sense.  Indeed, it has saved me many times from making embarrassing mistakes.  Granted, I still make embarrassing mistakes, but it happens less often because of DA.  Also, I have gotten smart enough to always require peer review and sign-off for important work.

Conceptually, DA is relatively easy to understand.  You simply need to know what units are associated with the quantities you are working with and then do the algebra to cancel them out.

Of course, knowing the units is first. For example, length is always expressed in some common measure like: inch, foot, yard, meter etc.    Area has units of length squared like square meters or m2.   Volume is length cubed; for example, m3.

You do the same for mass:  e.g., grams (g), pounds (lbs).

The quantity “time” is the most interesting and, by definition, most dynamic: e.g.,  seconds (s), minutes (min), hours (hrs), years (yr).

Another early step in DA is understanding the conversion factors for the various units within each quantity.  For example: 0.001 mg = 1 microgram (ug) = 1,000 nanograms (ng) = 1,000,000 picograms (pg).

Ultimately, the quantities get combined into the common entities that we know and love: e.g., speed = miles/hr, concentration = g/m3.

The real fun comes when quantities get combined into more complicated algorithms, as is often done in modeling.  For example, imagine trying to understand and sort out the following modeling algorithm, which describes the dynamic point-in-time concentration C in the air of a well-mixed box containing a spill:

C = [(α)(M0) / ((α)(V) − Q)] × [e^(−(Q/V)(t)) − e^(−(α)(t))]

Here we need to sort things out by doing DA.

You would need to know the units of α, M0, V, Q and t, which are:

M0 = initial spill mass (mg)
α = evaporation rate constant (1/min)
V = effective room volume (m3)
Q = ventilation rate (m3/min)
t = time (min)

Let's start by taking "bite sized" pieces of the equation:

The quantity (α)(M0) has the units of (1/min)(mg), which combine to mg/min.  The quantity (α)(V) − Q has the units of (1/min)(m3) − m3/min, which combine to m3/min.  If you divide the first set of units by the second, you get mg/min divided by m3/min; the minutes (min) cancel out to give mg/m3, which happens to be the units of C.

The units in the exponents completely cancel out, so the exponents are “unitless.”  Thus "e" raised to a unitless power is a unitless number as well.
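As a sketch of how this DA plays out numerically, the snippet below evaluates a well-mixed-box spill model of this general form while tracking the units in comments.  The parameter values are purely illustrative, not from any real scenario:

```python
import math

# Illustrative inputs only; units noted in comments.
M0 = 100.0    # initial spill mass, mg
alpha = 0.2   # evaporation rate constant, 1/min
V = 30.0      # effective room volume, m3
Q = 3.0       # ventilation rate, m3/min
t = 15.0      # elapsed time, min

numerator = alpha * M0        # (1/min)(mg) = mg/min
denominator = alpha * V - Q   # (1/min)(m3) - m3/min = m3/min

# Both exponents are unitless: (m3/min)(min)/m3 and (1/min)(min) each cancel.
C = numerator / denominator * (math.exp(-Q * t / V) - math.exp(-alpha * t))

# mg/min divided by m3/min leaves mg/m3: the units of a concentration.
print(round(C, 3))
```

Keeping the unit bookkeeping in comments next to each line is a cheap way to make the DA reviewable by a colleague at sign-off time.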
 
Two excellent tutorials on DA online are:



The first part of the second URL is somewhat animated, and I found it to be particularly clear.  You may want to skip the part on the use of DA for chemical reactions, but it is quite interesting.

If you have avoided diving into this sort of analysis in the past, I urge you to consider these tutorials (or others you might find online).  Once you understand these concepts, you will be able to analyze any algorithm for its units to see if it, its numbers and its predictions make sense.

I love to get comments back from the readers of this blog.  I would really like to know if this subject was of any value or perhaps aimed too low.  I simply know that DA has been an excellent tool for me over the years and wanted to share it.




Sunday, July 5, 2015

Cancer Risk Estimated at Legal OELs and ACGIH Voluntary OELs



I admire Adam Finkel for his intellectual acumen and force.  Adam has been fighting the good fight relative to exposure limits since I have known him.  My friend and colleague Tom Armstrong recently made me aware of items that Adam and his colleagues at the Center for Public Integrity in Washington, DC have published online: a remarkably user-friendly and informative tool that shows the predicted level of protection provided by OSHA occupational exposure limits (OELs) versus those provided by the American Conference of Governmental Industrial Hygienists (ACGIH).

One can always argue with how the quantitative levels of risk were determined.  Adam and his colleagues anticipated this and provide the details and the rationale online:  http://www.publicintegrity.org/2015/06/26/17563/about-our-methodology

For me, perhaps the most interesting and useful tool they published in this recent effort is their “Unequal Risk Investigation” cancer-risk graphic.  The live version of this tool lets one see the difference between OSHA and ACGIH exposure limits for carcinogens on a scale from an estimated 1 in 1,000 (10^-3) to 1,000 in 1,000 (10^0) risk of cancer from exposure at the exposure limits.  You can filter the information by any of 12 categories, including construction, manufacturing, health care and agriculture.  The tool also allows you to drill down to the individual chemical for specific risk estimates, such as the details of the estimated risks and uses of trichloroethylene.

For those of you who read this blog regularly, you may recognize that what Adam and his colleagues are doing here is pretty much in line with the idea of Risk@OEL in which the residual risk at any exposure limit is presented as part of the documentation of that limit.  Because of our inability to reasonably establish true thresholds of risk from exposure to non-carcinogens, my thinking is that residual risk should be calculated (with error bands) for all toxic end-points not just cancer.

Indeed, we all need to be aware of the significant uncertainty that exists around these estimates.  However, uncertainty notwithstanding, my sense is that we need to openly provide these estimates along with our best understanding of the error bands associated with them.

I would love to hear your comments about the information and ideas being presented here.  Do you believe it is OK not to include these estimates and their uncertainties in the documentation of the exposure limits?


Sunday, June 28, 2015

Exposure Modeling Leading to Discovery


Models can often present us with remarkable learning experiences.   Indeed, combining model development with the necessary aspects of experimentation to feed a model can lead to some important discoveries.  This week I am going to recount a discovery that my colleagues and I had while developing a model of the indoor air exposure potential from the off-gassing of pesticide from treated wood.

The pesticide was an important product for the company I was working for at the time, and as the Manager of Human Health Risk Assessment it was my responsibility to conduct a risk assessment on the use of the product as a preservative for wood in the indoor environment.

Our first experiment was to put the treated wood into a glass chamber and measure the pesticide's concentration in the chamber's air.  From this and other experiments, the generation rate of the off-gassing was estimated.

An early lesson in all this is that models and simulations are not reality but simply a portrayal of reality.  If they are reasonable portrayals, we can learn something.  Indeed, glass chambers are not real rooms, but these results provided a lot of information about the ultimate use of treated wood indoors in residences.

The system was complicated enough that we decided to do dynamic modeling of the chemical in compartments in a manner that is completely analogous to physiologically based pharmacokinetic (PBPK) models (https://en.wikipedia.org/wiki/Physiologically_based_pharmacokinetic_modelling). That is, each compartment is conceptually constructed to describe the instantaneous and integrated rate of pesticide input to and output from it.   While in any compartment, there was an option to describe changes or reactions of the pesticide during its time in that compartment.

The compartments we chose in our glass chamber experiment were:

  • The treated wood
  • The air space in the chamber
  • The chamber walls
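A minimal sketch of that compartmental bookkeeping might look like the following.  This is emphatically not the ACSL/SimuSolv model we actually used; it is a toy forward-Euler integration, and every rate constant in it is a hypothetical placeholder:

```python
# Toy three-compartment mass balance: treated wood -> chamber air -> chamber
# walls, with ventilation loss from the air and optional first-order
# degradation of material while on the walls. All rate constants are invented.

def simulate(hours, dt=0.01, k_offgas=0.01, k_vent=0.5,
             k_deposit=0.2, k_degrade=0.0):
    wood, air, walls = 100.0, 0.0, 0.0   # mg of pesticide in each compartment
    for _ in range(int(hours / dt)):
        offgas = k_offgas * wood * dt      # wood -> air
        vented = k_vent * air * dt         # air -> outside via ventilation
        deposited = k_deposit * air * dt   # air -> walls
        degraded = k_degrade * walls * dt  # loss while on the walls
        wood -= offgas
        air += offgas - vented - deposited
        walls += deposited - degraded
    return wood, air, walls

# With k_degrade = 0 the only true loss is ventilation, so the three
# compartments together always hold less than the starting 100 mg.
wood, air, walls = simulate(hours=100)
print(round(wood, 1), round(air, 3), round(walls, 1))
```

Each compartment is just an input/output balance, which is exactly the PBPK-style framing described above; adding a reaction term to a compartment (as we eventually did for the walls) is one extra line.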





A remarkable bit of technology that allowed us to do this was the Advanced Continuous Simulation Language (ACSL) software (https://en.wikipedia.org/wiki/AEgis_Technologies).  At the time, the best way to run this software was on mini-computers (known colloquially as “pizza boxes” because of their shape and size) running UNIX, with a program subscription from the Dow Chemical Company called SimuSolv.  I mention this because there was a lot of expense (tens of thousands of dollars) involved in getting a “pizza box”, having the technical IT support to run UNIX on this minicomputer, and paying for the SimuSolv license.  Today you can get a PC license for ACSL for as little as $500 (http://www.acslx.com/sales/).  I am not trying to sell or advertise ACSL.  My intent is only to point out that ACSL is a good modeling tool that we went to a lot of trouble and expense to use, and that it has gotten a lot more affordable over the last 20 years.

We ran a number of experiments to feed this model and during our initial analysis could not make the model work!

In our initial model, we assumed that there was NO reaction of the pesticide while it was in any compartment.   As a result the model predicted exposures that were about 5 times higher than what was actually measured in the glass chamber.   Clearly, the model got it wrong and needed to be refined to account for the "lost" material.

We asked the synthesis chemists about the stability of the pesticide in air or on glass surfaces and they said that it should be very stable for the time frame we were measuring (a few hundred hours).  
 
Because our model did not work, and notwithstanding the chemists' comments, we hypothesized that the combination of a long residence time on the internal chamber glass surface and the large surface-area-to-volume ratio of the pesticide residue film on the glass could indeed lead to degradation.  This degradation would come from reaction with oxygen or trace amounts of tropospheric ozone or other reactive species present in the untreated suburban air used to ventilate the chamber.

We changed the model to allow for degradation while on the chamber walls, and SimuSolv allowed us to optimize the model for the degradation rate that provided the best fit to the data.  This eventually led to 0.005/hr as the estimated first-order rate of degradation, which corresponds to a half-life of about 139 hours; that is, in roughly 139 hours half of the deposited pesticide would have degraded to other, typically less toxic species.
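For a first-order loss process, the fitted rate constant translates directly into a half-life and into the fraction degraded at any time.  A quick check of the 0.005/hr figure:

```python
import math

k = 0.005  # fitted first-order degradation rate constant, 1/hr (from the text)

# For first-order kinetics, the fraction of deposited pesticide remaining
# after t hours is exp(-k*t), so the fraction degraded is 1 - exp(-k*t).
fraction_degraded_100h = 1.0 - math.exp(-k * 100.0)

# The half-life (time for 50% of the residue to degrade) is ln(2)/k.
half_life_hours = math.log(2.0) / k

print(round(fraction_degraded_100h, 3))  # about 0.39 degraded at 100 hours
print(round(half_life_hours, 1))         # about 138.6 hours to 50% degraded
```

Note that at this rate roughly 40%, not 50%, is gone at the 100-hour mark; the 50% point arrives at one half-life, near 139 hours.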

The chemists congratulated us on our model fit but said that they did not believe that degradation was occurring.  That led to another series of experiments in which we demonstrated that essentially pure pesticide placed on glass was significantly transformed into almost a dozen chromatographically distinct compounds after prolonged exposure to ambient air.  Clearly, the significant degradation that the model predicted was occurring.

The initial failure of the model allowed us to discover this important mechanism that was driving the concentration in the glass chamber.

Clearly, most of us do not live in glass houses, and subsequent experiments with real rooms showed that much stronger effects, presumably from absorption (probably with degradation), were in play; however, the lesson here should not be lost: modeling can lead to some important discoveries.


We published most of the above work in the AIHA J and I would be happy to send a copy to anyone who requests it from me at:  mjayjock@gmail.com

Monday, June 22, 2015

Exposure Modeling Research - The Time is Now

For someone who has been advocating the modeling of exposure estimation for many years, it is very heartening to see research in this area taking root and growing.

Twenty-four years ago this spring, a friend and colleague, Neil Hawkins, suggested that I meet with a young woman who was an IH working for Dow Corning.  Her name was Susan Arnold and Neil said that she was very bright with a lot of energy and that I should talk with her about exposure modeling.   I contacted Susan and we went out to dinner at the AIHA Annual Conference in Salt Lake City in the spring of 1991.    We have been friends and colleagues ever since and Susan has worked as a modeler ever since.   Indeed, she received her Master’s Degree with a modeling project and will defend her PhD thesis on modeling at the University of Minnesota later this summer.  Suffice it to say that Neil and I are very proud of Susan and her accomplishments.  

At this month’s conference of the American Industrial Hygiene Association in Salt Lake City, I and many of my colleagues were treated to some of the excellent work coming out of the University of Minnesota under the leadership of Dr. Gurumurthy Ramachandran or, as many of us know him, Ram.  

On the 24th anniversary of our first meeting in Salt Lake City, Susan presented three papers on modeling, which I will mention very briefly here; I will send her slides to whoever asks for them.

For many years Susan, Ram, Perry Logan, John Mulhausen and others have been interested in investigating the nature, power and accuracy of “expert judgment” within the realm of industrial hygiene.  Indeed, since the beginning of the profession, the mantle or cloak of “expert judgment” has been invoked most times an IH would declare a particular exposure scenario to be “safe” or in need of further investigation.  The term was so ubiquitous that it begged to be defined.  This was done in the latest (and, I believe, earlier) editions of the AIHA exposure “Strategies Book.”  The quote below is from the 3rd Edition:
“The application and appropriate use of knowledge gained from formal education, experience, experimentation, inference, and analogy.  The capacity of an experienced professional to draw correct inferences from incomplete quantitative data, frequently on the basis of observations, analogy and intuition.”

The professional judgment of Industrial Hygienists has been put to the test by asking them to characterize well-described exposure scenarios (without monitoring data) by placing them in one of four bins: less than 10% of the OEL, 10–50% of the OEL, 50% to less than 100% of the OEL, and greater than or equal to the OEL.  When asked to do this without information provided by modeling, they systematically underestimated the true exposure.
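For concreteness, the four judgment categories described above can be expressed as a small helper function.  This is merely an illustration of the binning scheme, not a tool from the studies themselves; the function name and the return labels are my own.

```python
def judgment_bin(exposure, oel):
    """Place an exposure estimate into one of the four judgment
    categories relative to the occupational exposure limit (OEL):
    <10%, 10-50%, 50-<100%, or >=100% of the OEL."""
    frac = exposure / oel
    if frac < 0.10:
        return "<10% of OEL"
    elif frac < 0.50:
        return "10-50% of OEL"
    elif frac < 1.00:
        return "50-<100% of OEL"
    else:
        return ">=OEL"

# For example, an estimate of 30 ppm against a 100 ppm OEL lands
# in the second bin:
category = judgment_bin(30.0, 100.0)
```

Note that the research finding above is precisely that, without modeling support, professionals tend to pick a bin that is too low for the true exposure.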

Note: Even when you have monitoring data, characterizing or placing the exposure  in the correct bin is challenging.  If you do not believe me, read a previous blog on the Smart Phone App:  IH DIG (http://jayjock-associates.blogspot.com/2014/01/ih-dig-and-pump-monkey.html).  Play IH DIG and you will understand. 

Susan’s three presentations get into the issue of professional judgment aided by modeling while putting some of the most popular models through their paces in both the laboratory and real world.   The titles of the three talks she presented are:

  • Evaluating Model Performance under Highly Controlled Conditions
  • Evaluating Model Performance under Real World Conditions
  • Predicting Construction Related Silica Exposure Using Input from Chamber and Field Studies

 As mentioned above, send me an email request (mjayjock@gmail.com) and I will send you these slides.

Research into exposure assessment modeling is really just getting started; there is still plenty of room for folks to get involved in this growing field.  Indeed, as Susan wrote in the final conclusion of one of her talks:  “A very young science… there is still much to learn!”

Saturday, June 13, 2015

New Research into Eddy Diffusivity (D)

One cannot teach (or blog) without learning.  It is one of the very real perks of trying to convey knowledge and information. 

At the recent conference of the American Industrial Hygiene Association in Salt Lake City, I and many of my colleagues were treated to some of the excellent work coming out of the University of Minnesota under the leadership of Dr. Gurumurthy Ramachandran or, as many of us know him, Ram.   Two of his graduate students presented their work which I will be summarizing here over the next few weeks. 

This week, it is my pleasure to summarize the presentation and work of Yuan Shao, who told us of his efforts to determine the Eddy Diffusivity Coefficient (D) from more easily measured quantities such as ventilation rates and room dimensions.

You may remember a blog I did some time ago on this subject, published on December 30, 2013 (http://jayjock-associates.blogspot.com/2013/12/the-eddy-diffusion-near-field-model-is.html) and entitled:  The Eddy Diffusion Near Field Model is Now Useable.   In that 2013 blog I discussed how the Eddy Diffusivity Model should be ideally suited for modeling many indoor sources; however, the major problem with the use of the model has been the determination of a critical model parameter; specifically, the Eddy Diffusivity Coefficient (D).  Indeed, the predictions of this model are highly dependent on D, as discussed below.

The critical variable D depends on how the air moves about randomly within the indoor environment.  Unfortunately, D has historically proven very difficult to measure or estimate.  As a result, many of us wishing to use this model have been forced to work with a very wide range of estimates for D, and the utility of the model has been quite limited.   In that blog I discussed the research of Dr. Kai-Chung Cheng of Stanford University and his work to relate D to the ventilation rate, expressed as air changes per hour (ACH), and the room’s dimensions.   I noted Dr. Cheng’s work as a real advancement in our ability to use the Eddy Diffusivity Model which, by the way, is one of the available modules in the freeware spreadsheet IH MOD.
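For readers who have not used it, this model is typically written as a continuous point source diffusing into a turbulent space, where the concentration at distance r from the source after time t is C(r, t) = G/(4πDr) · erfc(r/(2√(Dt))).  The sketch below codes that standard form; it is a generic illustration, not Dr. Cheng’s or Yuan Shao’s specific formulation, and the value of D you supply must still come from a measurement or from a D-versus-ACH correlation such as those discussed in this research.

```python
import math

def eddy_concentration(G, D, r, t):
    """Continuous point-source turbulent eddy diffusion model:

        C(r, t) = G / (4*pi*D*r) * erfc(r / (2*sqrt(D*t)))

    G : emission rate (mg/s)
    D : eddy diffusivity coefficient (m^2/s) -- the hard-to-get input
    r : distance from the source (m)
    t : elapsed time (s)
    Returns concentration in mg/m^3."""
    return G / (4.0 * math.pi * D * r) * math.erfc(r / (2.0 * math.sqrt(D * t)))

# At long times erfc(...) -> 1, so the concentration field approaches
# the steady state G/(4*pi*D*r), falling off with distance from the source.
near = eddy_concentration(G=1.0, D=0.01, r=0.5, t=3600.0)
far = eddy_concentration(G=1.0, D=0.01, r=2.0, t=3600.0)
```

The sensitivity to D is easy to see in this form: halving D roughly doubles the predicted steady-state concentration at any distance, which is why a defensible way to estimate D from ACH and room dimensions matters so much.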

It would appear that Yuan Shao has advanced that effort and provided us with more data and analysis of this important topic.   His conclusions are presented below: 

  • An exposure chamber was constructed to create conditions for the eddy diffusion studies.
  • A diffusion model accounting for chamber boundary, advection and the removal of contaminant due to the local ventilation system was developed.
  • In this study, the measured and modeled data fit well over a range of experimental conditions. There is a strong linear relationship between D and ACH, providing a surrogate parameter for estimating D in real-life settings.
  • The values of D obtained from the experiments are generally consistent with values reported in the literature.
  • These findings make the use of turbulent eddy diffusion models for exposure assessment in workplace environments more feasible.


This is exactly the type of work that has been needed for many years but is now coming out as a result of these excellent research programs.

Yuan Shao has given me permission to send his full slide deck to whoever asks me for it at:  mjayjock@gmail.com.

As always, I would be very interested in your comments about this work and your experience with the Eddy Diffusivity Model and IH MOD.


Sunday, June 7, 2015

Having a Hammer as a Sole Tool Focuses Your View of Problems to Nails


A noted psychologist, Abraham Maslow, is credited by some as coming up with one of my favorite quotes which I am paraphrasing below:

“If the only tool you have is a hammer, 
  you will see every problem as a nail”

Our Industrial Hygiene tool kit is rich in tools designed to assess the exposure and risk from the inhalation of toxicants.   Indeed, essentially all of our exposure limits (TLVs, PELs, OELs, etc.) are set as airborne concentrations that might occur in the breathing zone of workers.  I am unaware of any similar compendium of dermal exposure limits, but my readers have pleasantly surprised me in the past; so if you know of any, please send me an email at mjayjock@gmail.com.

Indeed, if a chemical has a relatively high molecular weight (say >200 Daltons) and an octanol–water partition coefficient of greater than 100, its exposure potential will most likely result more from dermal exposure than from inhalation.  For example, I seem to remember biological and air monitoring studies done with pentachlorophenol in open wood treatment lines showing that the majority (>90%) of the systemic exposure/dose to the workers came from dermal rather than inhalation exposure.
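The rule of thumb above can be written down as a crude screening check.  This is only my informal heuristic from the text, not a validated or regulatory screening criterion, and the function name and thresholds are illustrative:

```python
def dermal_route_likely_dominant(mol_weight, kow):
    """Crude screen per the rule of thumb in the text: a chemical with a
    molecular weight above ~200 Daltons AND an octanol-water partition
    coefficient (Kow) above ~100 is more likely to present its exposure
    potential via the dermal route than via inhalation."""
    return mol_weight > 200.0 and kow > 100.0

# Pentachlorophenol (MW ~266, Kow well above 100) flags as a dermal
# concern, consistent with the wood-treatment studies mentioned above;
# a light, less lipophilic solvent would not.
flag = dermal_route_likely_dominant(266.3, 1.0e5)
```

A screen like this is no substitute for measurement or for a dermal model such as IH SkinPerm; its only job is to remind us to look past the air sampling pump.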

I met Chris Packham in London many years ago, and he struck me with his focus and dedication to the science of controlling worker health risk from dermal exposure.   Clearly he has continued that dedication in his more recent teachings and writings.   The following quote was taken from a document that he recently sent me, and it does indeed provide food for thought:

It is well established that inhalation of toxic chemicals can result in systemic effects, i.e. damage to internal organs and systems. A great deal of research and development has been undertaken resulting in strategies and equipment to monitor inhalation exposure. As a result in many countries there are exposure limits for a wide range of chemicals. Far less attention has been paid to the potential for chemicals to penetrate the skin and either cause or contribute to systemic toxic effects. Yet there is considerable evidence showing the potential for skin exposure to do this, including with chemicals that are unlikely ever to be inhaled because of their physical properties.(1) There is also a view that inhalation exposure results in more serious damage to health than can occur from skin exposure, often regarded as “just a rash”. Yet the EU Classification, Labelling and Packaging Regulation (EU1272/2008) contains the Hazard Statement 'H310 – Fatal in contact with skin'.

In this article the author will review the evidence showing why, in considering risks of damage to health due to the use of chemicals, the potential for skin exposure to cause systemic damage must be an integral part of any chemical exposure risk assessment.


If you would like the full text of this piece by Chris, just let me know at mjayjock@gmail.com and I will send it to you.

Chris has also had a recent (February 2015) piece printed by the British Occupational Hygiene Society on this subject, which I would be happy to send to you as well.


I would be very interested to hear how readers of this blog address dermal exposure and risk assessment and how these efforts compare to what is done for inhalation risks.

Saturday, May 30, 2015

Risk Assessment and the American Industrial Hygiene Association


The American Industrial Hygiene Association has made great strides in the realm of human health exposure and risk assessment in the last 20 years or so.   I had frankly not thought of it in this manner but it became obvious to me when a friend and colleague, Dr. Jack Hamilton (Bostik, Inc) mentioned this happy fact during a recent visit I made to his workplace.

Jack is a toxicologist with a strong technical background in risk assessment.  When I started explaining the various tools for Industrial Hygienists that have been developed by AIHA volunteer groups, he made what is now an obvious but dramatic point; specifically, the AIHA has made substantial and dramatic advances in the practical development of human health exposure assessment.
 

The annual American Industrial Hygiene Conference and Exposition is happening this week in Salt Lake City.  Given Jack’s comment and my newfound appreciation, I thought I would outline what I see as the highlights of these offerings.   The following is a partial cut-and-paste from the Exposure Assessment Strategies Committee web site:

https://www.aiha.org/get-involved/VolunteerGroups/Pages/Exposure-Assessment-Strategies-Committee.aspx


TOOLS (For the Practicing Industrial Hygienist)

  

The following software tools provide the practicing industrial hygienist with quick and easy access to the information necessary to evaluate exposure profiles and determine whether the exposures are acceptable, unacceptable, or whether more data are needed to make the determination. The tools are all free and are regularly updated. Several are available in multiple languages.
  • IHSTAT / IHSTAT Macro Free Version: Excel applications that calculate a variety of exposure statistics, perform goodness-of-fit tests and graph exposure data. Multiple languages available. The links on the site will show you how to adjust the macro settings in your version of Excel, if needed.
  • IH MOD: Includes mathematical models for estimating occupational exposures. Multiple languages. Click on this link for IH MOD General Help.
  • IH SkinPerm: Excel application for estimating dermal exposures. Factors in evaporation and absorption. The manual for IH SkinPerm is available separately.

 

These free software products represent literally thousands of volunteer hours of technical effort by some of the top workers in the field.

They come associated with perhaps the best and most authoritative books available on the subject of occupational exposure assessment:

A Strategy for Assessing and Managing Occupational Exposures, 3rd edition
Edited by Joselito S. Ignacio and William H. Bullock (NOTE: THE 4TH EDITION IS COMING VERY SOON!)

Mathematical Models for Estimating Occupational Exposure to Chemicals, 2nd Edition
Edited by Charles B. Keil, Catherine E. Simmons, and T. Renee Anthony

This is probably a good place to mention a book we published in 2000 that I believe still has some value:

Risk Assessment Principles for the Industrial Hygienist, M.A. Jayjock, J.R. Lynch and D.I. Nelson
You can view this entire book on books.google.com or buy it in either pdf or hardcopy at:

In all, the body of work put out by the AIHA volunteers especially over the last 20 years has indeed been remarkable but it is not finished.  As mentioned above, the 4th Edition of the basic strategies book is due out any day now.  Revisions to the modeling software are constantly being made.

If you want to become a part of this movement, the AIHA and the various committees would welcome you!  Indeed, if you are going to Salt Lake City next week please consider coming to the meetings of the Exposure Strategies Committee, the Risk Assessment Committee, The Toxicology Committee or any other committee that might strike your interest.  It is a great place to learn and grow.