
Monday, October 28, 2013

Why Isn’t Risk Assessment Done on All Chemicals?

I have recently been asked why there are so few occupational exposure limits compared to the number of chemicals in commerce.   There are a number of potential answers to this, but I think the simplest and most accurate is that there is a lack of data.   That raises the next question: why is there so little data?

We all probably know by now that Human Health Risk Assessment is the integration of the ability of a chemical to cause harm to human health with the actual exposure to that chemical that might occur in the real world.   In terms of a simple conceptual model:   Risk = (Toxicological Harm per unit Exposure) × (Exposure).   EVERY chemical will cause toxicological harm at some dose.   Pure oxygen breathed for an extended period is toxic.  Indeed, Wikipedia reports that pulmonary and ocular toxicity result from extended exposures to elevated oxygen levels at normal pressure.   Given a dose high enough, toxicological harm comes from each and EVERY chemical we can think of.  The "what" of that untoward health outcome is called the HAZARD IDENTIFICATION for that substance; in the oxygen example it is pulmonary and ocular toxicity.  It is usually the first bad thing health-wise that happens as you ramp up the exposure level, and it can range from irritation to more serious outcomes like cancer or death.   The exposure level above which that bad health effect happens defines the POTENCY of the chemical's toxicological effect, with highly potent materials causing effects at very low exposures.   Oxygen is not very potent.  By comparison, benzene is very potent in its ability to cause toxicological harm.

The point of all this is that you cannot begin to do a risk assessment without data on the toxicological properties (HAZARD IDENTIFICATION and POTENCY) of the chemical of interest.  If you have No data, you have No risk assessment, unless you really force the issue, which will be discussed below.

Unfortunately, much of the toxicological data we have exists because we first saw the adverse health effects of the chemicals on exposed humans.   Benzene falls into this category, as do asbestos, vinyl chloride and all the other known human carcinogens.   Seeing people stricken by these substances caused them to be studied.   The other category of well-studied chemicals is pesticides, primarily because they are designed to be commercial poisons; they are expected to be relatively potent toxicants and are clearly worthy of study.   As a result, they are also highly regulated.  How do we address the rest of the chemicals in our world?

In the late 1990s the Environmental Defense Fund issued a groundbreaking report entitled Toxic Ignorance (The Continuing Absence of Basic Health Testing for Top-Selling Chemicals in the United States):  http://www.edf.org/sites/default/files/243_toxicignorance_0.pdf   It showed with undeniable hard data that, at that point in time, "…even the most basic toxicity testing results cannot be found in the public record for nearly 75% of the top volume chemicals in commercial use."   As you might imagine it caused quite a stir; the EPA got involved and eventually hatched the High Production Volume (HPV) Challenge program:  http://www.epa.gov/hpv/.   This resulted in considerably more toxicological data, but as you might guess there remains a severe lack of data for the tens of thousands of chemicals still in commerce to which folks are exposed every day.

But that takes us to an even more fundamental question:  Why haven't we been testing the chemicals that we breathe, eat and touch in our environment to determine their toxicological effects all along?   Why has it taken the cold hand of public scrutiny and regulation to get things moving?  I think one of the strongest factors is the misguided presumption of safety by those with an economic stake in these chemicals.   Many believe at some level that "no proof of risk means proof of no risk."  This is, of course, not true; however, for these folks there is no incentive to go "looking for trouble" by testing the chemicals to identify the hazard they pose and the potency of that hazard.   Toxicology testing is, or has been, considered relatively expensive to do.   Thus, the reasoning goes: why spend money on testing when you assume the chemical is safe and the results could only bring bad news?

There is another large factor in this: mistrust of toxicological data.   Those who do not like the results point to the high doses used in toxicological studies and assert that they do not represent the exposures received in the real world and are therefore unrealistic measures of risk.   We will get into these issues (dose-response extrapolation and setting exposure limits) in future blogs, but I think you can see how the politics play out to create a strong bias against doing toxicological testing.

So at the end of 2013 we are left with lots and lots of chemicals but relatively little toxicology data to characterize the hazard and potency of these tens of thousands of substances in commerce.   My sense, however, is that, as Bob Dylan once sang, "The times they are a-changin'…"   The European REACh statute is essentially forcing risk assessment and the assignment of exposure limits based on very prescriptive procedures.  I will go over REACh in a future blog because I believe it has the capacity, given the political will, to ultimately drive the science of risk assessment.   This could conceivably force much more testing of both toxicity and exposure, but that remains to be seen.
On another front, the EPA is spending considerable money (tens and perhaps hundreds of millions of USD) and resources on advanced toxicological screening using innovative molecular biology techniques in a program entitled ToxCast:  http://www.epa.gov/ncct/toxcast/

More confident data will begin to feed the risk assessment process, which should ultimately lower the uncertainty of the analyses.  Lower uncertainty will, in turn, lower the forced overestimation of risk that comes from doing a risk assessment with a lack of data.  Indeed, in the end this confident knowledge will be much more cost-effective, focusing resources on real risk rather than spending money on phantom risk born of uncertainty.

Monday, October 21, 2013

Career Advantages of Being a Modeler

It occurred to me recently that I have not given you all the really good reasons why you should be dedicated to learning modeling. I will attempt to do so herein. The reasons are listed and further explained below:

  1.  It's cool!
  2.  It will really help you in your job as an industrial hygienist or exposure assessor.
  3.  You will become a relatively rare and sought-after animal.
  4.  It could dramatically increase your value to your employer.


It’s Cool:
Huey Lewis once sang that "it's hip to be square!"   If you have any feel for science at all, understanding or looking into the workings of reality can be a real rush.  The Science Channel counts on this human trait to gather viewers.  Indeed, seeking and organizing the factors that drive human exposure in any scenario is part of being human, and many of us find it to be simply fun.   Let's face it, we are curious animals that love to be competent and to develop tools (i.e., models), and acting on that curiosity is an end in itself for many of us.

It will really help you in your job as an industrial hygienist or exposure assessor:
Modeling will inform your judgment as to where the significant exposures might be, whether they occur in the present, happened in the past or have not occurred as yet.   It will allow you to estimate the exposure potential of scenarios literally on the other side of the globe.   It should also mean you will most likely waste less time monitoring exposure scenarios that do not need measuring while focusing on others that do.   Properly done, skill in modeling could in the long run mean less direct monitoring and more effort put into characterizing what exactly is causing the potential over-exposures.

You will become a relatively rare and sought after animal:
Thanks to the efforts of my colleagues within the AIHA there are quite a few more modelers out there in the Industrial Hygiene community than there were 20 years ago when we started beating the drum, but there are frankly still relatively few.   It is not the sort of discipline that you pick up very quickly, and there are very few places to actually learn it.   The two-day AIHA Professional Development Course is probably the best, but it is very intense and, while Rome was not built in a day, it is even harder to "make a modeler" in two days.    Indeed, there are quite a few reasons that there remains a relative lack of folks who are reasonably skilled in human exposure modeling.   I outline this situation in detail in the following document:


Those of you who read this short Word document will find that it is an "offer of service" to clients to take the time and attention needed to actually train professionals on the job in modeling so they become fully functional.   The offer has been out for a while and I have yet to have any takers.   If I get a lot of response to this particular blog I may reproduce it in a future blog.   Frankly, it walks the line between service to you and your management and self-promotion, but I am willing to take that chance to get the word out.
The fact remains that there are very few places to get this training, and if you take the time to do so you will be a rare, and valuable, exception.    You will no longer be someone who just measures and interprets exposures.  You will be a technologist who predicts exposures and understands and can explain the scientific basis for that judgment.   That skill is worth something, as the next point stresses.

It could dramatically increase your value to your employer:
I tell you truthfully that being able to model exposures (and dose-response) made my career at the Rohm and Haas Company.   The skill was responsible for at least three promotions within that company.   Using models, I was able to predict exposures with just a modicum of data and the invocation of assumptions.   I could explain and justify those predictions based on first-principles (and reviewable) science, and the managers just loved it.   Over 80% of my work was rendering these informed and explained technical opinions regarding the relative safety of products.   When the margins of safety were high enough, it gave them the confident knowledge they needed to proceed.   When the margins were not adequate, it gave me the necessary arguments (and support from my management) to obtain more information and data to reduce the uncertainty and, usually, the predicted level of risk.


Bottom Line:  Becoming skilled at modeling is not an easy or a short road but it’s the road less traveled and it could offer tremendous benefits to you and your career. 

Friday, October 18, 2013

Exposure Modeling Course at the SRA in Baltimore, December 8, 2013

PLEASE NOTE THAT THIS COURSE COVERS MONTE CARLO UNCERTAINTY ANALYSIS RELATED TO EXPOSURE MODELING.   IT WILL NOT TEACH MODELING PER SE.  SORRY FOR ANY CONFUSION.

This is your chance to learn Monte Carlo uncertainty analysis in a one-day course from an excellent teacher.   Tom Armstrong and I previously put together and taught a half-day course on human health exposure modeling here in the Philadelphia area.   Tom asked if I would be interested in teaching a one-day course at the SRA in Baltimore this year.  I told him that I would, but that there would have to be enough students attending to make it worthwhile for either of us to do so.   Tom agreed to put the course together and teach it by himself if a minimal number of students signed up, and to have me come on board if the student count became high enough.   Well, it looks like there are enough folks signing up for Tom to give the course.   Tom is an excellent modeler and teacher, and you will get your (or your company's) money's worth if you attend this class.   If considerably more of you sign up in the next few weeks, I will be there as well, doing what I really love: teaching and getting paid for it.

The course is on Sunday, Dec 8.

Below is mostly cut and paste from the SRA:

HOTEL RESERVATIONS
SRA has arranged for a special rate of $164 a night at the Hilton Baltimore. The Hilton is located in Baltimore’s Inner Harbor district and is only 15 minutes from BWI airport. Transportation from BWI Airport is available via the Light Rail System for $1.60 each way. The Hilton is located adjacent to the Convention Center station. Guests can enjoy the hotel’s prime location next to Oriole Park at Camden Yards; M&T Bank Stadium, home of the Baltimore Ravens; and the University of Maryland Medical Center.

Please use this link for the discounted SRA rate: http://www.hilton.com/en/hi/groups/personalized/B/BWICCHH-SRA-20131207/index.jhtml?WT.mc_id=POG


ANNUAL MEETING WORKSHOPS
Workshops will take place on Sunday, December 8, and Thursday, December 12 at the Hilton.  Listed below are the available workshops.  Go to http://sra.org/sites/default/files/pdf/events/2013_Workshops.pdf to see all of the workshop descriptions.

If that link does not work, try this one:
http://www.sra.org/sites/default/files/pdf/events/2013%20SRA%20Workshops%20FINAL.pdf

I understand that you can sign up and pay for the workshop without registering for the conference.   I also hear that there is a significant discount for full-time students who want to take this course.

Sunday, October 13, 2013

Introduction to IH MOD Slide Deck

Dr. Tom Armstrong is a good friend and colleague and a tireless worker in the cause of educating the Industrial Hygiene community.  Tom recently dropped a free and valuable gift in our laps: his 53-slide presentation, as a PDF file, introducing folks to IH MOD.

Those of you who have been reading this blog may know that I think very highly of IH MOD.  For some details, see the Sept 2, 2013 post:  Modeling Math made Easy or at least Easier

IH MOD is an Excel spreadsheet workbook that literally has many hundreds of hours of development put into it.   It is very skillfully programmed in Visual Basic by Daniel Drolet.   I am pretty sure it was Tom who took the idea to Daniel and, at least initially, provided the lion's share of technical support relative to the models.   I had the privilege of spending some time with these two a few years ago working on an expansion of the models.

IH MOD makes the mathematical calculation part of inhalation exposure modeling easy, indeed almost trivial, and it is written to be user-friendly.   Now we have 53 slides that provide an excellent introduction to this great tool.    In the slides Tom covers the following:

  • Brief theory of modeling to estimate exposures 
  • What IH Mod is 
  • Contents and layout of IH Mod 
  • Starting and Navigating in IH Mod 
  • Entering parameter values in IH Mod 
  • Interpreting the results from IH Mod 
  • Examples of what IH Mod can do 
  • Worked Examples for you to try IH Mod 

The slides are available online at:

http://www.ihmod.org/uploads/3/2/9/5/3295818/intro_ih_mod_1_hr.pdf

Even if you do not plan on studying or viewing these slides soon, it is worth downloading and saving the PDF as a reference for future study.  If you EVER want to get into exposure modeling (and all the career-advancing advantages it could bring), you will be glad you did.

Monday, October 7, 2013

Simple Approaches to Common Modeling Problems

Previous blogs have been pretty technical – this one is intended to provide more practical advice on simple approaches to common modeling problems.

I am going to stick with the theme of modeling the concentration of volatile organic compounds (VOCs) in the breathing zone.   As discussed in previous blogs, we have the models and the software tools to make these estimations, but we need to feed them.   Traditionally the most difficult inputs for modelers to get are the ventilation rate and the emission rate.   We spent a lot of ink talking about estimating the ventilation rate (Q), and I think that, of the two, it is the easier to estimate.   The generation or emission rate (G) is more challenging but certainly doable.

Consider a small spill of a mixture of aromatic VOCs.   The first task is to figure out which VOC(s) need to be evaluated.   This goes to the concept of the controlling health hazard.   That is, which compound or compounds present the highest risk, such that if it (they) are controlled then the other components will also not present an untoward risk to human health.  If all the VOCs have similar vapor pressures, then the simplest first crack at this problem is to take the ratio of each chemical's percentage within the mixture over its exposure limit.   For example, let us consider the mixture we used earlier of 1% benzene, 50% toluene and 49% xylene.  I am not certain of the current TLVs, but for the sake of this example let's say they are 0.1, 50 and 50 ppmV, respectively.   The ratios would be 1%/0.1 = 10 for benzene and about 1 for toluene and xylene.   Clearly, benzene is the controlling health hazard in this example.     If it were just a 50/50 mixture of xylene and toluene (without the benzene) with the above supposed TLVs, then you should probably look at both, at least initially.   Toluene would be expected to be more volatile, but with the techniques we are using, xylene may be a close enough second that you need to look at both.  Plus, they most likely have similar adverse health end-points and thus need to be considered additive in their exposure and risk.  On the other hand, at this point in our understanding, benzene presents a dramatically different and more dreaded risk compared to the other two aromatics, and concurrent exposure to those two would not be considered to add to this risk.
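
For illustration, here is a minimal sketch of this screening ratio in Python, using the supposed (not current) TLVs from the example:

  # Percent-in-mixture divided by exposure limit for each component;
  # the TLVs here are the supposed values from the example, not current ones.
  mixture = {
      "benzene": (1.0, 0.1),    # (percent in mixture, assumed TLV in ppmV)
      "toluene": (50.0, 50.0),
      "xylene":  (49.0, 50.0),
  }

  ratios = {name: pct / tlv for name, (pct, tlv) in mixture.items()}
  for name, r in sorted(ratios.items(), key=lambda kv: -kv[1]):
      print(f"{name:8s} percent/TLV ratio = {r:.1f}")
  print("Controlling health hazard:", max(ratios, key=ratios.get))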

The next important task is to figure out how much of the chemical you are interested in is evaporating.  One simple way of getting at this value is to weigh the loss of the delivering vessel (paint can, aerosol can, pump spray, etc.), figure out how much of that loss is the evaporating chemical of interest, and divide by the drying time.  These are rough estimates but can provide good information.
   
So at this point we have determined that benzene is the controlling health hazard in our spill, and we have figured out approximately how much has been released and evaporated.   Now we have to figure out a rate of emission.   If you know how long the spill takes to evaporate, you can estimate G by simply taking 1% of the total mass of the spill (the benzene fraction) and dividing it by the time of evaporation.   This gives you a rough average for G.  This would be modeled with a constant G (using IH MOD) for that time period, with G = 0 at the end of the evaporation period.   No matter how long the evaporation takes, you need to compare the modeled concentration, expressed as an 8-hr time-weighted average, with the 8-hr TLV-TWA of the chemical.    This assumes the worker is only exposed for 8 hours.  A minimal sketch of this constant-G calculation appears below.
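
IH MOD will do this math for you, but to make the mechanics concrete, here is a minimal sketch of the same constant-G, well-mixed one-box calculation in Python.  Every input value (spill size, evaporation time, room volume, ventilation rate) is a hypothetical assumption for illustration, not data from any real scenario:

  import numpy as np

  # Hypothetical inputs; substitute values for your own scenario.
  M_spill = 640.0      # grams of solvent mixture spilled (assumed)
  frac = 0.01          # benzene as a fraction of the spill mass (1%)
  t_evap = 60.0        # minutes for the spill to fully evaporate (assumed)
  V = 100.0            # room volume, cubic meters (assumed)
  Q = 10.0             # general ventilation rate, m^3/min (assumed)

  G = M_spill * frac * 1000.0 / t_evap   # mg/min, rough constant average

  dt = 0.1                               # minutes, integration time step
  t = np.arange(0.0, 480.0 + dt, dt)     # one 8-hour shift
  C = np.zeros_like(t)                   # mg/m^3 in the well-mixed room
  for i in range(1, len(t)):
      g = G if t[i] <= t_evap else 0.0   # emission stops when the spill is gone
      C[i] = C[i-1] + dt * (g - Q * C[i-1]) / V   # dC/dt = (G - Q*C)/V

  print(f"Peak concentration: {C.max():.2f} mg/m^3")
  print(f"8-hr TWA:           {C.mean():.2f} mg/m^3")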

If the spill is actually shrinking in area during the evaporation, then you can use a first-order assumption.  In this case we would assume that the spill takes 8 half-lives to essentially disappear (e.g., it starts at 640 grams and goes to 2.5 grams in 8 half-lives).   Thus, as a rough estimate, the half-life is the evaporation time divided by 8, and the derivation in the previous blog provides the decay constant; that is, k = 0.693/half-life.   If the drying time is less than 8 hours, you will get essentially the same 8-hr time-weighted average room concentration as with the constant G, but it will affect the 15-minute time-weighted average used for comparison to a Short Term Exposure Limit (STEL).  The highest 15-minute average during the evaporation of the spill will be significantly higher with the first-order assumption than with an assumed constant rate of evaporation, as the sketch below illustrates.
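
Here is a sketch of the same box model with the first-order emission assumption, using the same hypothetical inputs as the constant-G sketch above.  Because the emission is front-loaded, the worst 15-minute average rises sharply while the 8-hr TWA stays essentially the same:

  import numpy as np

  M0 = 640.0 * 0.01 * 1000.0   # mg of benzene in the spill (1% of 640 g)
  t_evap = 60.0                # minutes of total evaporation time (assumed)
  k = 0.693 / (t_evap / 8.0)   # 1/min; half-life = evaporation time / 8
  V, Q = 100.0, 10.0           # room volume (m^3) and ventilation (m^3/min)

  dt = 0.1
  t = np.arange(0.0, 480.0 + dt, dt)
  C = np.zeros_like(t)
  for i in range(1, len(t)):
      g = k * M0 * np.exp(-k * t[i])              # mg/min, first-order emission
      C[i] = C[i-1] + dt * (g - Q * C[i-1]) / V   # same one-box mass balance

  w = int(15.0 / dt)           # number of samples in a 15-minute window
  rolling = np.convolve(C, np.ones(w) / w, mode="valid")
  print(f"8-hr TWA:         {C.mean():.2f} mg/m^3")
  print(f"Worst 15-min TWA: {rolling.max():.2f} mg/m^3")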

Another way to get an evaporation rate is to put a relatively thin layer of the evaporating material (simulating the spill) into an open dish, put it on an open scale under the same conditions as the spill, and record the weight loss over time.    The thickness of the material in the dish should be similar to the thickness in the actual exposure scenario.    Consider the data set below for a product that has about 50% solids and 50% solvent.   The solvent is the one from our example:  1% benzene, 50% toluene and 49% xylene.  The only measurement is the total weight versus time.  The difference in weight over time is the solvent evaporation, and we are assuming that benzene is 1% of that number.

[Data table: total sample weight versus elapsed time for the evaporating product]
You can be pretty sure you have captured the vast majority of the weight loss when the weight does not change significantly over a period of a few hours.   In this case less than 1% of the product mass is lost between 140 and 160 minutes.  The time it took to lose half the solvent weight is roughly the half-life, and again k = 0.693/half-life.  In this case the half-life is 20 minutes.   To be more precise, you could model the weight loss data as a first-order decay, as has been shown in previous blogs.  Remember we are only interested in the portion of the evaporating mass that is the chemical of interest, so if benzene is 1% of the evaporating mass, G is 1/100th of the overall evaporation rate indicated in the data.
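
If you want that extra precision, here is a minimal sketch of such a first-order fit using scipy.  The scale readings below are hypothetical numbers constructed to be consistent with this example (64 g of product, about 32 g of solvent, roughly a 20-minute half-life); substitute your own measurements:

  import numpy as np
  from scipy.optimize import curve_fit

  # Hypothetical scale readings: total sample weight (g) versus time (min),
  # constructed to be consistent with the example above.
  t = np.array([0, 20, 40, 60, 80, 100, 120, 140, 160], dtype=float)
  wt = np.array([64.0, 48.2, 40.1, 36.0, 34.0, 33.0, 32.5, 32.2, 32.1])

  def model(t, w_solids, w_solvent, k):
      # Non-volatile solids plus exponentially decaying solvent mass
      return w_solids + w_solvent * np.exp(-k * t)

  popt, _ = curve_fit(model, t, wt, p0=[32.0, 32.0, 0.03])
  w_solids, w_solvent, k = popt
  print(f"k = {k:.4f} 1/min, half-life = {0.693 / k:.1f} min")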

Remember that this scale experiment only gets you the half-life and k values; you still need to know how much was actually spilled to estimate the airborne concentration with a model.   Note also that the average benzene emission rate in the scale experiment is roughly 320 mg over 160 minutes (2 mg/min) for 64 grams of product.  If you spill 640 grams of product it would be 10 times higher.   Also, the first-order evaporation rate at the beginning of a 640-gram spill would be about 80 mg/minute over the first 20 minutes after the spill.

If the evaporating surface area does not shrink with time (e.g., an open vessel), then you can do the above experiment with a deep source, so the measured rate should be relatively constant.

I am doing all this without peer review, which is usually a bad idea; however, this is a simple educational blog.  If you find some errors or have some issues with the advice, please let me know and I will issue corrections and expand the discussion.