
Sunday, November 24, 2013

More Modeling Links PLUS Using OLD Models on New PCs

One of the very best things about doing a blog is the almost real-time connection with colleagues in far-off places.  Without leaving my home, I can interact and network with some marvelous folks who can literally be anywhere.

Many of you have, no doubt, seen the material sent out at regular intervals by Andrew Cutz.   Andrew provides a real service, and a recent item from him really caught my eye.   He put out a notice of a new EPA web site:  EPA-Expo-Box (A Toolbox for Exposure Assessors).   This can be found at: http://www.epa.gov/risk/expobox/.   It is a remarkably complete and well organized resource outlining what appear to be most of the EPA resources on this topic.   I drilled down a little into the site and found a page that lists a bunch of exposure modeling tools developed by and for the EPA.   The page is http://www.epa.gov/risk/expobox/routes/inh-cal.htm.    If you start digging into any of these and have questions, just send me an email and I will try to answer them.  If they are of general enough interest I will address them in a future blog. 
Theo Shaeffers from the Netherlands has come through again with a revised version of his website.  An excerpt from a recent email he sent to me is below: 

Hi Mike
Updated the  http://www.tsac.nl/websites.html site with more interesting freeware and moved  the Dutch texts to the end.
End paste.

This site is a “list of lists” or a compendium site with tons of resources.  If you click around on this TSAC site you may come to:  http://www.epa.gov/opptintr/exposure/  which shows all of EPA OPPT’s Exposure Assessment Tools and Models.   You will notice if you go to this page that quite a few of the models are dated well before the advent of Windows 7, 8, or 8.1.    They remain very good models, but they have not been updated to run specifically on these newer operating systems.   Indeed, some of them do not run on the newer operating systems at all.

A prime example of the above became painfully obvious to me last week.  I needed to install and run the relatively user-friendly and freely available model shown within the above link entitled:  Multi-Chamber Concentration and Exposure Model (MCCEM) version 1.2 (2/2/2001), available online at:  http://www.epa.gov/oppt/exposure/pubs/mccem.htm

MCCEM is a remarkable model in that it allows for the estimation of airborne concentrations in multiple compartments of a home or other compartmentalized building from sources in individual rooms or zones.   Given ventilation rates (or using rates from an extensive database of homes), it calculates not only the airborne concentrations but also the dose received by persons in these compartments given their time-spent and breathing-rate inputs.   It also allows for Monte Carlo uncertainty analysis while letting the user place people and sources in various volumes during the day to receive their exposure, which the model duly calculates.   It provides output reports and a .csv file of the concentrations at each time interval for which the model is run (I typically prescribe every minute).
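For those who like to see the machinery, below is a minimal sketch (in Python) of the kind of two-zone, well-mixed mass balance that sits underneath multi-compartment models like MCCEM.  It is emphatically not MCCEM itself, which handles many zones, time-varying sources, occupant doses and Monte Carlo analysis; the emission rate, airflows and volumes below are illustrative assumptions only.

```python
# Two-zone (near-field / far-field) well-mixed mass balance -- a sketch of the kind of
# multi-compartment calculation MCCEM performs, not MCCEM itself.
# All parameter values are illustrative assumptions.

G = 10.0                 # emission rate in the source zone, mg/min
Q = 3.0                  # general ventilation through the far field, m3/min
beta = 1.5               # interzonal airflow between the two zones, m3/min
V_nf, V_ff = 2.0, 50.0   # zone volumes, m3

dt = 1.0 / 60.0          # time step, min (1 second)
C_nf = C_ff = 0.0        # starting concentrations, mg/m3

for _ in range(int(480 / dt)):   # simulate an 8-hour period, one step at a time
    dC_nf = (G + beta * C_ff - beta * C_nf) / V_nf
    dC_ff = (beta * C_nf - beta * C_ff - Q * C_ff) / V_ff
    C_nf += dC_nf * dt
    C_ff += dC_ff * dt

print(f"Near-field concentration after 8 h: {C_nf:.1f} mg/m3")
print(f"Far-field concentration after 8 h:  {C_ff:.1f} mg/m3")
```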
The MCCEM model has been around for some time and runs very well on Windows 98 and Windows XP, but it may encounter compatibility issues on later versions of Windows (7, 8 and 8.1).  Indeed, neither I nor my IT guru Steven Wright (FedSolution.com) could get it to run on Windows 8.1.   Eventually, the most effective way we found to run it on newer PCs was to first install Windows XP as a virtual machine (e.g., in Oracle VM VirtualBox, a free download) on a PC running the later version of Windows and then install the MCCEM software there. 

These models are “oldies but goodies,” and someday someone may update them for the new Windows operating systems, but until then we are just going to have to cope.





Sunday, November 17, 2013

Learning from our Mistakes in Modeling

After the discussion of the octanol/water partition coefficient in the last blog, I thought this would be a good point to go over what was perhaps my biggest professional surprise and how we can learn a lot from our mistakes.   What I am recounting below is my recollection of the facts.   To “blind” the identity of the product, some of the numbers are changed, but the scientific facts remain unchanged.

I was responsible for a risk assessment for a water treatment chemical that was used to treat large circulating systems within a plant.  The active ingredient in the product was highly lipophilic (it had a high octanol/water partition coefficient) with limited water solubility (about 500 ppm) (see previous blog on this subject).   It was used in the water at about 5 ppm by weight.   The vapor pressure of the active ingredient was relatively low, such that the saturation or headspace concentration of the pure material (see previous blog on this subject) was only about 1 mg/m3.    The exposure limit was relatively low at 0.4 mg/m3, based on the local upper respiratory tract irritation seen in rat inhalation studies at relatively low concentrations in air.  

So I ran the models (UNIFAC and modified Raoult’s Law):  even with a large thermodynamic activity coefficient (see previous blog on UNIFAC), the worst-case saturation concentration of this active ingredient at 5 ppm in water was a VERY small portion of the exposure limit.   I told my internal clients that I could not see any problem with inhalation exposure and risk from this active in this application. 
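For readers who want to see the arithmetic, here is a rough sketch of that screening calculation.  The molecular weight, activity coefficient and other inputs are assumptions chosen only to mirror the story (5 ppm active in water, roughly 1 mg/m3 headspace over the pure material); in the real assessment the activity coefficient came from UNIFAC.

```python
# Sketch of the modified-Raoult's-law screening calculation described above.
# All numbers are illustrative assumptions, not the real product data.

MW_active = 300.0      # g/mol, assumed
MW_water = 18.0        # g/mol

C_sat_pure = 1.0       # mg/m3, saturation headspace over the PURE active (from the story)
ppm_w = 5.0            # mass fraction of active in the treated water, ppm by weight
gamma = 1000.0         # "large" UNIFAC-style activity coefficient, assumed

# Convert ppm by weight to mole fraction (dilute solution, so water dominates the moles)
x_active = (ppm_w / 1e6 / MW_active) / (1.0 / MW_water)

# Modified Raoult's law: the headspace concentration over the solution scales the
# pure-component saturation value by (activity coefficient x mole fraction)
C_headspace = gamma * x_active * C_sat_pure

print(f"Mole fraction of active in the water: {x_active:.2e}")
print(f"Worst-case headspace concentration:   {C_headspace:.2e} mg/m3")
# Compare with the 0.4 mg/m3 exposure limit -- a very small fraction, as noted above.
```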

Imagine my surprise when we started getting reports of upper respiratory irritation in systems where older actives were being substituted with our new product.   I requested that some air monitoring be done in selected areas within a specific plant.   To my complete and utter amazement, some of the values came back around 0.5 mg/m3.   This is 50% of the saturation value for the PURE active!   How could this be?   How could a 5 ppm solution in water generate this much airborne active ingredient?   I had to know, so I booked a trip ASAP to visit the site, see things for myself, and repeat the monitoring.

The highest sampled airborne concentration occurred over an open sump that contained about 5,000 gallons of the treated water.   As soon as I saw the sump, the whole thing started to make sense.   On top of the liquid surface was a 1-2 inch layer of FOAM.   Apparently some part of the plant process resulted in a foaming agent going into the water, and the agitation within the process produced the foam.  

Because the foam was much more "oily" than the water, our active was partitioning into the foaming agent and thus into the foam.   Subsequent analysis of the foam showed that it had very high concentrations of our active ingredient.   As the foam bubbles “popped,” they released aerosol particles to the air that were rich in our active, which became further concentrated as the water and any other more volatile components evaporated from the aerosol particles because of their high surface-area-to-volume ratio.   The entire mix of aerosol and active vapor produced the high and irritating breathing-zone concentrations above the sump.

If there was no foam, there was no concentrating of the active ingredient into the foam and no high level of airborne exposure.   The reality from a product stewardship perspective became clear to me; namely, when foam was present in some systems, these scenarios were clearly capable of producing unacceptably high airborne concentrations.  

For me, the moral of this story is twofold.  First, one should always have some level of “ground truthing” for the predictions of a model.   That is, do some monitoring.   Those who model should always be open to monitoring.   They are not two separate camps.   The modeling informs the monitoring and the monitoring “ground truths” the models.   Second, we should always be ready to have Mother Nature throw us a curve ball.  We can predict a lot with models but we should always be open to potential game changers from unanticipated factors. 


Monday, November 11, 2013

Octanol Water Partition Coefficient – What does it Mean for Exposure Assessment?

The octanol/water partition coefficient is determined by a relatively simple test.   It is done by taking roughly equal volumes of octanol (an “oily” alcohol) and water, mixing them together, throwing the chemical of interest into the mix, and mixing some more.   We all know that oil and water do not mix, and as a result, when everything settles down we get an oil layer of octanol on top of the water layer, with the chemical of interest dissolved in each.   The ratio of the amount in the octanol to the amount dissolved in the water is the octanol/water partition coefficient, or Kow.   This number indicates whether a chemical is lipophilic (a “fat lover”) or hydrophilic (a “water lover”).   For example, methanol goes mostly into the water while benzene goes mostly into the octanol.   Most organics are so lipophilic that the scale is made logarithmic.   Thus, we take the log of the ratio, which is called log Kow or pKow.   If the Kow is 1000, then the log Kow or pKow is 3.    Since the concentration of methanol in the octanol is only about 17% of that in the water, its Kow is 0.17 and its pKow is -0.77.   See the previous blog on logarithms (A Simple Refresher of Logarithms, July 2013) if this is at all unclear.
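If the logarithm bookkeeping above is at all hazy, here is a tiny numerical illustration (the 1000 value is just a made-up example of a lipophilic chemical):

```python
import math

# Kow is the ratio of the chemical's concentration in the octanol layer to that in the
# water layer; log Kow (called pKow above) is simply its base-10 logarithm.

kow_hypothetical = 1000.0                 # mostly in the octanol (illustrative value)
print(math.log10(kow_hypothetical))       # 3.0  -> pKow of 3

kow_methanol = 0.17                       # mostly in the water
print(math.log10(kow_methanol))           # about -0.77
```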

The pKow of benzene is 2.13, which means it really likes fats and generally dislikes water.  What does this all mean for those of us in the exposure assessment business?   Well, first of all, it tells us roughly where benzene or any other fat-loving chemical will wind up in the environment.   It will tend to wind up in environmental compartments that are rich in organic constituents, like some soils and most sediments.   If inhaled or ingested, it will tend to accumulate in areas of the body that have a lot of blood flow and a lot of fat. 

In general, chemicals with a high pKow tend to bioaccumulate or bioconcentrate in organisms within the environment.   That is, they concentrate in the tissues of animals as you go up the food chain.  DDT was a prime example of this in the case of the Bald Eagle.  Since we are at the top of the food chain, this could also be problematic for us.

Finally, chemicals with a high pKow tend to be absorbed by the skin much more readily than those with a low pKow.   Look at the following equation for dermal absorption from Dr. Wil tenBerge’s web site:
From this it appears that pKow and molecular weight drive the dermal absorption of chemicals.
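Since the equation from Dr. tenBerge's site is not reproduced here, the sketch below uses the widely cited Potts-Guy correlation for the skin permeability coefficient as a stand-in; it has the same two drivers (log Kow and molecular weight).  Treat the coefficients and the example chemical as illustrative assumptions rather than tenBerge's exact equation.

```python
# Stand-in dermal permeability estimate (Potts-Guy type correlation):
#   log10(kp, cm/h) ~= -2.7 + 0.71*log10(Kow) - 0.0061*MW
# Higher log Kow pushes kp up; higher molecular weight pushes it down.

def skin_permeability_cm_per_h(log_kow: float, mw: float) -> float:
    return 10 ** (-2.7 + 0.71 * log_kow - 0.0061 * mw)

# Example: benzene (log Kow about 2.13, MW about 78 g/mol)
kp = skin_permeability_cm_per_h(2.13, 78.1)
print(f"Estimated skin permeability coefficient for benzene: {kp:.3f} cm/h")
```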

Of course, the higher the pKow, the lower the water solubility, and this is an important consideration in some exposure assessments.   One can get faked out by this fact, which will be the subject of my next blog, in which I recount what was perhaps my greatest surprise ever in doing an exposure assessment.



Monday, November 4, 2013

The Promise of REACh to Exposure and Risk Assessment

REACh could be the most impactful piece of legislation in the history of risk assessment.  From my perspective, the jury is still out.

Below are excerpts from a chapter I co-authored with Ron Pearson and Susan Arnold that currently appears in the 2nd Edition of the AIHA book Mathematical Modeling for Estimating Occupational Exposure to Chemicals.  The chapter is entitled: REACh – A New and Important Reason to Learn Modeling.

In the 1990s at meetings of the European Union’s (EU) Council of Environment Ministers, concerns were raised that European chemical policies did NOT provide enough protection to chemical product users. An evaluation of existing rules revealed that new substances were heavily regulated, but they made up only a tiny fraction of the total chemicals in commerce. On the other hand, existing substances, which made up the vast majority of the total volume of chemicals in use, were essentially unregulated. Out of this, REACh was conceived.[1],[2]


This lack of risk assessment attention for existing chemicals has also been recognized on this side of the Atlantic.  The US EPA Board of Scientific Counselors has explicitly noted that a comprehensive assessment of human consumer exposure to chemicals has not occurred and that the vast majority of exposures have not been systematically and proactively addressed[3].   Indeed, it is reasonably well established at this point that the risks of most types of consumer chemical exposures in modern society have not been assessed.

The types of exposure mentioned here are the exposures to chemicals that result predominantly from residential sources.  Because of the leadership of the European Union and the development of REACh, we are on the cusp of a change in which regulatory mandates playing out in the rest of the world are literally driving the overall scientific development of human health exposure assessment for the multitude of common and relatively unstudied substances to which humans are exposed.

What then is REACh all about at its base level and what does it mean to the Industrial Hygienist?

REACh stands for Registration, Evaluation, Authorisation and restriction of Chemical Substances.  It is a complex piece of rulemaking that lawmakers in Europe have been hammering out for some time.  REACh became the law of the land in the EU on June 1, 2007.  Its primary purpose is to evaluate and control the human health risk from chemicals. It places the burden of proof for chemical safety on manufacturers, removing it from regulatory authorities.  Because of the complexity and broad scope of REACh, it is driving the need to screen and assess many chemicals and substances in a relatively short (but rather immediate) time frame.
  
So it is clear that REACh was conceived in Europe to regulate the majority of chemicals that exist in commerce.  What, then, is the upshot of REACh relative to industrial hygiene?  Professionals anticipate that REACh will actually spur innovations and opportunities in estimating exposures to consumer products, which would positively affect the field of IH. Toxicologists, the other half of any risk assessment team, are facing the same issues. 

First Comes Assessment and Modeling

If REACh requires a comprehensive, scientifically valid and rational evaluation of human exposure to substances, then modeling is an indispensable element of that assessment.   Since it has been established that this level of comprehensive assessment of human exposure to chemicals has not occurred and that the vast majority of exposures have not been systematically and proactively addressed, a considerable amount of scientific model development needs to occur.  For example, personal chemical exposures to consumers from predominantly residential sources have been ignored by most chemical modelers.  Given this commitment within REACh to assessment, exposure model utilization and development should come to the forefront[4].
Exposure models are critical to any commitment for comprehensive exposure assessment because we will never be able to monitor or measure every exposure everywhere.  The need for models is particularly acute with REACh because it increases proportionally with the growing universe of chemicals under consideration. Also, as a technical expert, an industrial hygienist has a critical need for objective and rational scientific tools for analysis.  These facts necessitate both continuing education in and the ongoing scientific development of the discipline. 
Specific research needs relative to exposure model development and validation have been outlined and presented[5]; unfortunately, to date the authors are unaware of any concerted and coordinated effort to follow up on these recommendations. (Jayjock Note:  This document and web site can no longer be found online.  I will put it on my web site but in the meantime, please send me an email at mjayock@gmail.com and I will send you a copy).
Conclusion
Stripping away all the complexity, the “bottom line” for those responsible for conducting risk assessments under REACh is to provide enough quality information about any substance’s exposure pathway(s) and toxic effects in the real world to pass the “red face” test of accountability.   The size of the challenge is clearly daunting and the need for tool development is critical. 
Jayjock Comment:   This chapter was written about 5 years ago and I must say, from my perspective, I have not seen a groundswell of data or of researched and developed modeling tools (e.g., source characterization) coming forth.   The economic and political tides in all of this continue to shift, but we can only hope that the promised progress will ultimately be forthcoming.    

[1]     http://ec.europa.eu/enterprise/sectors/chemicals/reach/index_en.htm
 (Accessed October 28, 2013)
[2]    http://ecb.jrc.it/REACH/  (Accessed January 9, 2008)
[3]    USEPA: Human Health Research Program Review: A Report of the US EPA Science Advisory Board,  Final Report of the Subcommittee on Human Health, EPA Office of Research and Development , May 18, 2005 revised July 18, 2006,  [available ONLINE] http://www.epa.gov/osp/bosc/pdf/hh0507rpt.pdf (Accessed October 28, 2013)
[4]    Jayjock, M.A, C.F. Chaisson, S. Arnold and E.J. Dederick, Modeling framework for human exposure assessment, Journal of Exposure Science and Environmental Epidemiology (2007) 17, S81–S89.

[5]     Kephalopoulos, S, A. Arvanitis, M.A. Jayjock (Eds):  Global CEM Net Report of the Workshop no. 2 on “Source Characterization, Transport and Fate”, Intra (Italy), 20-21 June 2005.  Available online: http://www.jrc.ec.europa.eu/pce/documentation/eur_reports/Global%20CEM%20Net%20Workshop%202%20SOURCES.pdf   (Last accessed January 9, 2008).


Monday, October 28, 2013

Why Isn’t Risk Assessment Done on All Chemicals?

I have recently been asked why there are so few occupational exposure limits compared to the number of chemicals in commerce.   There are a number of potential answers to this, but I think the simplest and most accurate is that there is a lack of data.   That begs the question: why is there so little data?

We all probably know by now that human health risk assessment is the integration of the ability of a chemical to cause harm to human health with the actual exposure to that chemical that might occur in the real world.   In terms of a simple conceptual model:   Risk = (Toxicological Harm per unit of Exposure) x (Exposure).    EVERY chemical will cause toxicological harm at some dose.   Pure oxygen breathed for an extended period is toxic.  Indeed, Wikipedia reports that pulmonary and ocular toxicity result from extended exposures to elevated oxygen levels at normal pressure.   Given a dose high enough, toxicological harm comes from each and EVERY chemical we can think of.  The “what” of that untoward health outcome is called the HAZARD IDENTIFICATION for that substance. In the above example with oxygen it is pulmonary and ocular toxicity.  It is usually the first bad thing, health-wise, that happens as you ramp up the exposure level, and it can range from irritation to more serious outcomes like cancer or death.   The exposure level at which that bad health effect happens defines the POTENCY of the chemical’s toxicological effect, with highly potent materials causing effects at very low exposures.   Oxygen is not very potent.  By comparison, benzene is very potent in its ability to cause toxicological harm.
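To make the conceptual model concrete, here is a minimal numerical sketch.  "Potency" here stands for the toxicological harm caused per unit of exposure, and all of the numbers are made-up illustrations rather than real toxicity values.

```python
# Minimal sketch of the conceptual model: Risk = (harm per unit of exposure) x (exposure).
# A highly potent chemical produces a much larger screening-level risk at the same exposure.
# All values are illustrative assumptions.

def screening_risk(potency_per_unit_exposure: float, exposure: float) -> float:
    return potency_per_unit_exposure * exposure

exposure = 10.0   # some common measure of exposure (same units for both chemicals)
print(screening_risk(potency_per_unit_exposure=0.001, exposure=exposure))  # low potency  -> 0.01
print(screening_risk(potency_per_unit_exposure=1.0,   exposure=exposure))  # high potency -> 10.0
```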

The point of all this is that you cannot begin to do a risk assessment without data on the toxicological properties (HAZARD IDENTIFICATION and POTENCY) of the chemical of interest.  If you have No data you have No risk assessment unless you really force the issue which will be discussed below.

Unfortunately, much of the toxicological data that we have was sparked by first seeing the adverse health effects of the chemicals on exposed humans.   Benzene falls into this category.   So do asbestos and vinyl chloride and all the other known human carcinogens.   Seeing people stricken by these substances caused them to be studied.   The other category of well-studied chemicals is pesticides.  This is primarily because they are designed to be commercial poisons, so they are expected to be relatively potent toxicants and are clearly worthy of study.   As a result, they are also highly regulated.  How do we address the rest of the chemicals in our world?

In the late 1990s the Environmental Defense Fund issued a groundbreaking report entitled Toxic Ignorance (The Continuing Absence of Basic Health Testing for Top-Selling Chemicals in the United States):  http://www.edf.org/sites/default/files/243_toxicignorance_0.pdf    It showed with undeniable hard data that, at that point in time, “…even the most basic toxicity testing results cannot be found in the public record for nearly 75% of the top volume chemicals in commercial use.”   As you might imagine, it caused quite a stir, and the EPA got involved and eventually hatched the High Production Volume (HPV) Challenge program:  http://www.epa.gov/hpv/.   This resulted in considerably more toxicological data, but as you might guess, there remains a severe lack of data for the tens of thousands of chemicals still in commerce to which folks are exposed every day.

But that takes us to an even more fundamental question:  why haven’t we been testing the chemicals that we breathe, eat and touch in our environment to determine their toxicological effects all along?   Why has it taken the cold hand of public scrutiny and regulation to get things moving?  I think one of the strongest factors behind this is the misguided presumption of safety by those with an economic stake in these chemicals.   Many believe at some level that “no proof of risk means proof of no risk.”   This is, of course, not true; however, for these folks there is no incentive to go “looking for trouble” by testing the chemicals to identify the hazard they pose and the potency of that hazard.   Toxicology testing is, or has been, considered relatively expensive.   Thus, the reasoning goes, why spend money on testing when you assume the chemical is safe and the results could only bring bad news?

There is another large factor in this, and that is the mistrust of toxicological data.   Those who do not like the results point to the high doses used in the toxicological studies and assert that they do not relate to or represent the exposures received in the real world and are therefore unrealistic measures of risk.   We will get into these issues (dose-response extrapolation and setting exposure limits) in future blogs, but I think you can see how the politics might be playing out in all of this to create a strong bias against doing toxicological testing.

So at the end of 2013 we are left with lots and lots of chemicals but relatively little toxicology data to characterize the hazard and potency of these tens of thousands of substances in commerce.   My sense, however, is that, as Bob Dylan once sang, “The times they are a-changin’…”   The European REACh statute is essentially forcing risk assessment and the assignment of exposure limits based on very prescriptive procedures.   I will go over REACh in a future blog because I believe it has the capacity, given the political will, to ultimately drive the science of risk assessment.   This could conceivably force much more testing of both toxicity and exposure, but that remains to be seen.
On another front, the EPA is spending considerable money (tens and perhaps hundreds of millions of USD) and resources on advanced toxicological screening using innovative molecular biology techniques in a program entitled ToxCast:  http://www.epa.gov/ncct/toxcast/     

More confident data will begin to feed the risk assessment process, which should ultimately lower the uncertainty of the analyses.  Lower uncertainty will, in turn, lower the forced overestimation of risk that comes from doing a risk assessment with a lack of data.  Indeed, in the end this confident knowledge will be much more cost-effective, focusing resources on real risk rather than spending money on phantom risk borne of uncertainty. 

Monday, October 21, 2013

Career Advantages of Being a Modeler

It occurred to me recently that I have not given you all the really good reasons why you should be dedicated to learning modeling. I will attempt to do so herein. The reasons are listed and further explained below:

  1.  It’s cool!
  2.  It will really help you in your job as an industrial hygienist or exposure assessor.
  3.  You will become a relatively rare and sought after animal.
  4.  It could dramatically increase your value to your employer.


It’s Cool:
Huey Lewis once sang that “it’s hip to be square!”   If you have any feel for science at all, understanding or looking into the workings of reality can be a real rush.  The Science Channel counts on this human trait to gather viewers.  Indeed, seeking and organizing the factors that drive human exposure in any scenario is part of being human, and many of us find it to be simply fun.   Let’s face it, we are curious animals that love to be competent and develop tools (i.e., models), and acting on that curiosity is an end in itself for many of us.

It will really help you in your job as an industrial hygienist or exposure assessor:
Modeling will inform your judgment as to where the significant exposures might be, whether they occur in the present, happened in the past, or have not occurred yet.   It will allow you to estimate the exposure potential of scenarios literally on the other side of the globe.   It should also ultimately mean you will waste less time monitoring exposure scenarios that do not need measuring while focusing on others that do.  Properly done, skill in modeling could in the long run mean less direct monitoring and more effort put into characterizing what exactly is causing the potential over-exposures. 

You will become a relatively rare and sought after animal:
Thanks to the efforts of my colleagues within the AIHA, there are quite a few more modelers out there in the industrial hygiene community than there were 20 years ago when we started beating the drum, but there are frankly still relatively few.   It is not the sort of discipline that you pick up very quickly, and there are very few places to actually learn it.   The 2-day AIHA Professional Development Course is probably the best, but it is very intense and, while Rome was not built in a day, it is even harder to “make a modeler” in two days.    Indeed, there are quite a few reasons that there remains a relative lack of folks who are reasonably skilled in human exposure modeling.   I outline this situation in detail in the following document:    


Those of you who read this short Word document will find that it is an “offer of service” to clients to take the time and attention needed to actually train professionals on the job in modeling so they become fully functional.   The offer has been out for a while and I have yet to have any takers.   If I get a lot of response to this particular blog I may reproduce it in a future blog.   Frankly, it walks the line between service to you and your management and self-promotion, but I am willing to take that chance to get the word out.
The fact remains that there are very few places to get this training, and if you take the time to do so you will be a rare, and valuable, exception.    You will no longer be someone who just measures and interprets exposures. You will be a technologist who predicts exposures and who understands and can explain the scientific basis for that judgment.  That skill is worth something, as the next point stresses.

It could dramatically increase your value to your employer:
I tell you truthfully that being able to model exposures (and dose-response) made my career at the Rohm and Haas Company.   The skill was responsible for at least 3 promotions within that company.   Using models, I was able to predict exposures with just a modicum of data and the invocation of assumptions.   I could explain and justify those predictions based on first-principles (and reviewable) science, and the managers just loved it.   Over 80% of my work was rendering these informed and explained technical opinions regarding the relative safety of products.   When the margins of safety were high enough, it gave them the confident knowledge they needed to proceed.   When the margins were not adequate, it gave me the necessary arguments (and support from my management) to obtain more information and data to reduce the uncertainty and, usually, the predicted level of risk.


Bottom Line:  Becoming skilled at modeling is not an easy or a short road but it’s the road less traveled and it could offer tremendous benefits to you and your career. 

Friday, October 18, 2013

Exposure Modeling Course at the SRA in Baltimore December 8 2013

PLEASE NOTE THAT THIS COURSE COVERS MONTE CARLO UNCERTAINTY ANALYSIS RELATED TO EXPOSURE MODELING.   IT WILL NOT TEACH MODELING PER SE.  SORRY FOR ANY CONFUSION.

This is your chance to learn Monte Carlo uncertainty analysis modeling in a one-day course from an excellent teacher.   Tom Armstrong and I have previously put together and taught a half-day course on human health exposure modeling here in the Philadelphia area.   Tom asked if I would be interested in teaching a one-day course at the SRA in Baltimore this year.  I told him that I would, but that there would have to be enough students attending to make it worthwhile for him or me to do so.   Tom agreed to put the course together and teach it by himself if a minimal number of students signed up and to have me come on board if the student count became high enough.   Well, it looks like there are enough folks signing up for Tom to give the course.   Tom is an excellent modeler and teacher, and you will get your (or your company's) money's worth if you attend this class.   If considerably more of you sign up in the next few weeks, I will be there as well, doing what I really love: teaching and getting paid for it.
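For anyone wondering what Monte Carlo uncertainty analysis looks like in practice, here is a minimal sketch wrapped around a simple well-mixed-room exposure model.  The distributions and their parameters are purely illustrative assumptions, not course material.

```python
import random

# Monte Carlo uncertainty analysis sketch: propagate uncertain inputs through a
# simple steady-state exposure model (C = G / Q) and look at the output distribution.
# All distributions and parameter values below are illustrative assumptions.

N = 10_000
results = []
for _ in range(N):
    G = random.lognormvariate(2.3, 0.5)   # emission rate, mg/min (uncertain)
    Q = random.uniform(5.0, 20.0)         # room ventilation rate, m3/min (uncertain)
    results.append(G / Q)                 # steady-state concentration, mg/m3

results.sort()
print(f"Median concentration:          {results[N // 2]:.2f} mg/m3")
print(f"95th percentile concentration: {results[int(0.95 * N)]:.2f} mg/m3")
```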

The course is on Sunday Dec 8.

Below is mostly cut and paste from the SRA:

HOTEL RESERVATIONS
SRA has arranged for a special rate of $164 a night at the Hilton Baltimore. The Hilton is located in Baltimore’s Inner Harbor district and is only 15 minutes from BWI airport. Transportation from BWI Airport is available via the Light Rail System for $1.60 each way. The Hilton is located adjacent to the Convention Center station. Guests can enjoy the hotel’s prime location next to Oriole Park at Camden Yards; M&T Stadium, home of the Baltimore Ravens; and the University of Maryland Medical Center.

Please use this link for the discounted SRA rate: http://www.hilton.com/en/hi/groups/personalized/B/BWICCHH-SRA-20131207/index.jhtml?WT.mc_id=POG


ANNUAL MEETING WORKSHOPS
Workshops will take place on Sunday, December 8, and Thursday, December 12 at the Hilton.  Listed below are the available workshops.  Go to http://sra.org/sites/default/files/pdf/events/2013_Workshops.pdf to see all of the workshop descriptions.

If that link does not work, try this one:
http://www.sra.org/sites/default/files/pdf/events/2013%20SRA%20Workshops%20FINAL.pdf

I understand that you can sign up and pay for the workshop without registering for the conference.   I also hear that there is a significant discount for folks who are full-time students and want to take this course.