
Sunday, November 24, 2013

More Modeling Links PLUS Using OLD Models on New PCs

One of the very best things about doing a blog is the almost real-time connection with colleagues in far-off places.  Without leaving my home, I can interact and network with some marvelous folks who can literally be anywhere.

Many of you have, no doubt, seen the material sent out at regular intervals by Andrew Cutz.  Andrew provides a real service, and a recent item from him really caught my eye.  He put out a notice of a new EPA web site:  EPA-Expo-Box (A Toolbox for Exposure Assessors).  This can be found at: http://www.epa.gov/risk/expobox/.  It is a remarkably complete and well-organized resource outlining what appear to be most of the EPA resources on this topic.  I drilled down a little and found a page that lists a number of exposure modeling tools developed by and for the EPA: http://www.epa.gov/risk/expobox/routes/inh-cal.htm.  If you start digging into any of these and have some questions, just send me an email and I will try to answer them.  If they are of general enough interest, I will address them in a future blog.
Theo Shaeffers from the Netherlands has come through again with a revised version of his website.  An excerpt from a recent email he sent to me is below: 

Hi Mike
Updated the  http://www.tsac.nl/websites.html site with more interesting freeware and moved  the Dutch texts to the end.
End paste.

This site is a “list of lists,” a compendium with tons of resources.  If you click around on this TSAC site you may come to http://www.epa.gov/opptintr/exposure/, which shows all of EPA OPPT’s Exposure Assessment Tools and Models.  You will notice on this page that quite a few of the models are dated well before the advent of Windows 7, 8 or 8.1.  They remain very good models, but they have not been updated to run specifically on these newer operating systems.  Indeed, some of them do not run on the newer operating systems at all.

A prime example of the above became painfully obvious to me last week.  I needed to install and run the relatively user-friendly and freely available model shown within the above link entitled Multi-Chamber Concentration and Exposure Model (MCCEM) version 1.2 (2/2/2001), available online at:  http://www.epa.gov/oppt/exposure/pubs/mccem.htm

MCCEM is a remarkable model in that it allows for the estimation of airborne concentrations in multiple compartments of a home or other compartmentalized building from sources in individual rooms or zones.   Given ventilation rates (or using rates from an extensive database of homes), it calculates not only the airborne concentrations but also the dose received by persons in these compartments, given inputs for their time spent and breathing rates.   It also supports Monte Carlo uncertainty analysis and lets the user place people and sources in various volumes during the day; the model duly calculates the resulting exposure.   It provides output reports and a .csv file of the concentrations at each time interval for which the model is run (I typically prescribe every minute).
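For readers who have never used a multi-zone model, the core of any such tool is a set of coupled well-mixed mass balances, one per zone.  Below is a minimal two-zone sketch in that spirit.  This is NOT MCCEM's own code or algorithm, and every airflow, volume and emission number in it is a made-up illustrative value.

```python
# Minimal two-zone (near-field / far-field) well-mixed mass balance,
# integrated with a simple Euler step once per minute.  Illustrative only;
# all parameter values below are assumptions, not MCCEM defaults.

def two_zone_concentrations(G=100.0,      # emission rate in the near field, mg/min
                            V_nf=8.0,     # near-field volume, m3
                            V_ff=100.0,   # far-field (room) volume, m3
                            Q=5.0,        # room supply/exhaust airflow, m3/min
                            beta=2.0,     # near/far-field exchange airflow, m3/min
                            t_end=240, dt=1.0):
    """Return a list of (time, C_near, C_far) tuples, one per time step."""
    C_nf = C_ff = 0.0      # starting concentrations, mg/m3
    history = [(0.0, C_nf, C_ff)]
    t = 0.0
    while t < t_end - 1e-9:
        dC_nf = (G + beta * (C_ff - C_nf)) / V_nf          # source + exchange
        dC_ff = (beta * (C_nf - C_ff) - Q * C_ff) / V_ff   # exchange - exhaust
        C_nf += dC_nf * dt
        C_ff += dC_ff * dt
        t += dt
        history.append((t, C_nf, C_ff))
    return history

profile = two_zone_concentrations()
t, c_near, c_far = profile[-1]
print(f"after {t:.0f} min: near-field {c_near:.1f} mg/m3, far-field {c_far:.1f} mg/m3")
```

At steady state this particular setup settles to G/Q in the far field and G/Q + G/beta near the source, which is a handy sanity check on the numerical output.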
The MCCEM model has been around for some time and runs very well on Windows 98 and Windows XP, but it may encounter compatibility issues on later versions of Windows (7, 8 and 8.1).  Indeed, neither I nor my IT guru Steven Wright (FedSolution.com) could get it to run on Windows 8.1.   Eventually, the most effective approach we found for newer PCs was to first install Windows XP as a virtual machine (e.g., in Oracle VM VirtualBox, a free download) on a PC running the later version of Windows and then install the MCCEM software there.

These models are “oldies but goodies.”  Someday someone may update them for the new Windows operating systems, but until then we are just going to have to cope.





Sunday, November 17, 2013

Learning from our Mistakes in Modeling

After the discussion of the octanol/water partition coefficient in the last blog, I thought this would be a good point to go over what was perhaps my biggest professional surprise and how much we can learn from our mistakes.   What I am recounting below is my recollection of the facts.   To “blind” the identity of the product, some of the numbers have been changed, but the scientific facts remain unchanged.

I was responsible for a risk assessment for a water treatment chemical used to treat large circulating systems within a plant.  The active ingredient in the product was highly lipophilic (it had a high octanol/water partition coefficient) and had limited water solubility (about 500 ppm) (see previous blog on this subject).   It was used in the water at about 5 ppm by weight.   The vapor pressure of the active ingredient was relatively low, such that the saturation or headspace concentration of the pure material (see previous blog on this subject) was only about 1 mg/m3.    The exposure limit was relatively low at 0.4 mg/m3, based on the local upper respiratory tissue irritation seen in rat inhalation studies at relatively low airborne concentrations.

So I ran the models (UNIFAC and modified Raoult’s Law).  Even with a large thermodynamic activity coefficient (see previous blog on UNIFAC), the worst-case saturation concentration of this active ingredient at 5 ppm in water was a VERY small fraction of the exposure limit.   I told my internal clients that I could not see any problem with inhalation exposure and risk from this active in this application.
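To make that screening calculation concrete, here is a rough sketch of the modified Raoult's Law estimate.  The saturation concentration (1 mg/m3), use level (5 ppm) and exposure limit (0.4 mg/m3) come from the story above; the molecular weight and activity coefficient are purely assumed stand-ins, since the product is blinded.

```python
# Worst-case headspace estimate via modified Raoult's Law:
#     C_air ~ gamma * x * C_sat(pure)
# where x is the mole fraction of the active in water and gamma its
# activity coefficient (e.g., from UNIFAC).  MW_ACTIVE and GAMMA below
# are illustrative assumptions, not the blinded product's actual data.

MW_WATER = 18.0
MW_ACTIVE = 300.0        # ASSUMED molecular weight of the active, g/mol
C_SAT_PURE = 1.0         # saturation conc. of the pure active, mg/m3 (from post)
GAMMA = 1.0e4            # ASSUMED large UNIFAC activity coefficient
EXPOSURE_LIMIT = 0.4     # mg/m3 (from post)

mass_frac = 5.0e-6       # 5 ppm by weight in water

# Mole fraction of the active in the (very dilute) aqueous solution
moles_active = mass_frac / MW_ACTIVE
moles_water = (1.0 - mass_frac) / MW_WATER
x = moles_active / (moles_active + moles_water)

c_air = GAMMA * x * C_SAT_PURE
print(f"mole fraction x = {x:.2e}")
print(f"worst-case headspace = {c_air:.2e} mg/m3 "
      f"({100 * c_air / EXPOSURE_LIMIT:.2f}% of the limit)")
```

Even with a very large assumed activity coefficient, the predicted airborne concentration comes out as a tiny fraction of the limit, which is exactly why the monitoring results that follow were such a shock.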

Imagine my surprise when we started getting reports of upper respiratory irritation in systems where older actives were being substituted with our new product.   I requested that some air monitoring be done in selected areas within a specific plant.   To my complete and utter amazement, some of the values came back around 0.5 mg/m3.   This is 50% of the saturation value for the PURE active!   How could this be?   How could a 5 ppm solution in water generate this much airborne active ingredient?   I had to know, so I booked a trip ASAP to visit the site, see things for myself and repeat the monitoring.

The highest sampled airborne concentration occurred over an open sump that contained about 5000 gallons of the treated water.   As soon as I saw the sump, the whole thing started to make sense.   On top of the liquid surface was a 1-2 inch layer of FOAM.   Apparently some part of the plant process resulted in a foaming agent going into the water, and the agitation within the process produced the foam.

Because foam is typically much more "oily" than water, our active was partitioning into the foaming agent and thus into the foam.   Subsequent analysis of the foam showed that it had very high concentrations of our active ingredient.   As the foam bubbles “popped,” they released aerosol particles to the air that were rich in our active, which became further concentrated as the water and any other more volatile components evaporated from the aerosol particles because of their high surface-area-to-volume ratio.   The mix of aerosol and active vapor produced the high and irritating breathing zone concentrations above the sump.

If there was no foam, there was no concentrating of the active ingredient and no high levels of airborne exposure.   The reality from a product stewardship perspective became clear to me; namely, when foam was present, these scenarios were clearly capable of producing unacceptably high airborne concentrations.

For me, the moral of this story is twofold.  First, one should always have some level of “ground truthing” for the predictions of a model.   That is, do some monitoring.   Those who model should always be open to monitoring.   They are not two separate camps.   The modeling informs the monitoring and the monitoring “ground truths” the models.   Second, we should always be ready to have Mother Nature throw us a curve ball.  We can predict a lot with models but we should always be open to potential game changers from unanticipated factors. 


Monday, November 11, 2013

Octanol Water Partition Coefficient – What does it Mean for Exposure Assessment?

The octanol/water partition coefficient is determined by a relatively simple test.   It is done by taking roughly equal volumes of octanol (an “oily” alcohol) and water, mixing them together, adding the chemical of interest, and mixing some more.   We all know that oil and water do not mix, so when everything settles down we get a layer of octanol on top of the water layer, with the chemical of interest dissolved in each.   The ratio of the concentration in the octanol to the concentration in the water is the octanol/water partition coefficient, or Kow.   This number indicates whether a chemical is lipophilic (a “fat lover”) or hydrophilic (a “water lover”).   For example, methanol would go mostly into the water while benzene goes mostly into the octanol.   Many organics are so highly lipophilic that the scale is made logarithmic.   Thus, we take the log of the ratio, which is called log Kow or pKow.   If the Kow is 1000, then the log Kow or pKow is 3.    Since the concentration of methanol in the octanol is only 17% of its concentration in the water, its Kow is 0.17 and its pKow is -0.77.   See the previous blog on logarithms (A Simple Refresher of Logarithms, July 2013) if this is at all unclear.
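The arithmetic in the paragraph above can be checked in a few lines.  The last line, giving the fraction of the chemical that ends up in the octanol layer, assumes the test's roughly equal volumes of the two phases.

```python
import math

# Kow is the octanol/water concentration ratio; log Kow (the author's
# "pKow") is simply its base-10 logarithm.

def log_kow(kow):
    return math.log10(kow)

def kow_from_log(lk):
    return 10.0 ** lk

print(log_kow(1000))        # 3.0  (a Kow of 1000 gives a log Kow of 3)
print(log_kow(0.17))        # about -0.77 (methanol)
# For equal phase volumes, the fraction of the chemical in the octanol
# layer is Kow / (1 + Kow):
print(0.17 / (1 + 0.17))    # about 0.145
```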

The pKow of benzene is 2.13, which means it really likes fats and generally dislikes water.  What does this all mean for those of us in the exposure assessment business?   Well, first of all, it tells us roughly where benzene or any other fat-loving chemical will wind up in the environment.   It will tend to wind up in environmental compartments that are rich in organic constituents, like some soils and most sediments.   If inhaled or ingested, it will tend to accumulate in areas of the body that have a lot of blood flow and a lot of fat.

In general, chemicals with a high pKow tend to bioaccumulate or bioconcentrate in organisms within the environment.   That is, they concentrate in the tissues of animals as you go up the food chain.  DDT was a prime example of this in the case of the Bald Eagle.  Since we are at the top of the food chain, this could also be problematic for us.

Finally, chemicals with a high pKow tend to be absorbed through the skin much more readily than those with a low pKow.   Look at the equation for dermal absorption on Dr. Wil tenBerge’s web site:
From it, it appears that pKow and molecular weight drive the dermal absorption of chemicals.
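The equation from the tenBerge site is not reproduced here, but one widely cited correlation of exactly this kind is the Potts-Guy (1992) equation for the skin permeability coefficient.  The sketch below uses that correlation as an illustration of the pKow/molecular-weight dependence; it is not necessarily the equation shown on the site.

```python
# Potts-Guy (1992) correlation for the skin permeability coefficient
# kp (cm/h) as a function of log Kow and molecular weight:
#     log10(kp) = -2.72 + 0.71 * log_kow - 0.0061 * MW
# Used here purely to illustrate that higher log Kow raises dermal
# absorption while higher molecular weight lowers it.

def potts_guy_kp(log_kow, mw):
    """Estimated skin permeability coefficient, cm/h."""
    return 10.0 ** (-2.72 + 0.71 * log_kow - 0.0061 * mw)

# Benzene (log Kow ~2.13, MW ~78) vs methanol (log Kow ~-0.77, MW ~32):
print(potts_guy_kp(2.13, 78.1))    # ~2e-2 cm/h
print(potts_guy_kp(-0.77, 32.0))   # ~3e-4 cm/h
```

The fat-loving benzene comes out with a permeability coefficient roughly two orders of magnitude higher than the water-loving methanol, which is the qualitative point of the paragraph above.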

Of course, the higher the pKow, the lower the water solubility, and this is an important consideration in some exposure assessments.   One can get faked out by this fact, which will be the subject of my next blog, along with how I received perhaps my greatest surprise ever in doing an exposure assessment.



Monday, November 4, 2013

The Promise of REACh to Exposure and Risk Assessment

REACh could be the most impactful piece of legislation in the history of risk assessment.  From my perspective, the jury is still out.

Below are excerpts from a chapter I co-authored with Ron Pearson and Susan Arnold that currently appears in the 2nd Edition of the AIHA book Mathematical Models for Estimating Occupational Exposure to Chemicals.  The chapter is entitled: REACh – A New and Important Reason to Learn Modeling.

In the 1990s at meetings of the European Union’s (EU) Council of Environment Ministers, concerns were raised that European chemical policies did NOT provide enough protection to chemical product users. An evaluation of existing rules revealed that new substances were heavily regulated, but they made up only a tiny fraction of the total chemicals in commerce. On the other hand, existing substances, which made up the vast majority of the total volume of chemicals in use, were essentially unregulated. Out of this, REACh was conceived.[1],[2]


This lack of risk assessment attention to existing chemicals has also been recognized on this side of the Atlantic.  The US EPA Board of Scientific Counselors has explicitly noted that no comprehensive assessment of human consumer exposure to chemicals has occurred and that the vast majority of exposures have not been systematically and proactively addressed[3].   Indeed, it is reasonably well established at this point that the risks of most types of consumer chemical exposures in modern society have not been assessed.

The types of exposure mentioned here are the exposures to chemicals that result predominately from residential sources.  Because of the leadership of the European Union and the development of REACh, we are on the cusp of a change in which regulatory mandates playing out in the rest of the world are literally driving the overall scientific development of human health exposure assessment for the multitude of common and relatively unstudied substances to which humans are exposed.

What then is REACh all about at its base level and what does it mean to the Industrial Hygienist?

REACh stands for Registration, Evaluation, Authorisation and restriction of Chemical Substances.  It is a complex piece of rulemaking that lawmakers have been hammering out in Europe for some time.  REACh became the law of the land in the EU on June 1, 2007.  Its primary purpose is to evaluate and control the human health risk from chemicals. It places the burden of proof for chemical safety on manufacturers, removing it from regulatory authorities.  Because of the complexity and broad scope of REACh, it is driving the need to screen and assess many chemicals and substances in a relatively short (but rather immediate) time-frame.
  
So it is clear that REACh was conceived in Europe to regulate the majority of chemicals in commerce.  What then is the upshot of REACh relative to industrial hygiene?  Professionals anticipate that REACh will actually spur innovations and opportunities in estimating exposures to consumer products, which would positively affect the field of IH. Toxicologists, the second half of any risk assessment team, are facing the same issues.

First Comes Assessment and Modeling

If REACh requires a comprehensive, scientifically valid and rational evaluation of human exposure to substances, then modeling is an indispensable element of that assessment.   Since it has been established that this level of comprehensive assessment of human exposure to chemicals has not occurred and that the vast majority of exposures have not been systematically and proactively addressed, a considerable amount of scientific model development needs to occur.  For example, personal chemical exposures to consumers from predominately residential sources have been ignored by most chemical modelers.  Given this commitment within REACh to assessment, exposure model utilization and development should come to the forefront[4].
Exposure models are critical to any commitment to comprehensive exposure assessment because we will never be able to monitor or measure every exposure everywhere.  The need for models is particularly acute with REACh because it grows in proportion to the expanding universe of chemicals under consideration. Also, as a technical expert, an industrial hygienist has a critical need for objective and rational scientific tools for analysis.  These facts necessitate both continuing education in and the ongoing scientific development of the discipline.
Specific research needs relative to exposure model development and validation have been outlined and presented[5]; unfortunately, to date the authors are unaware of any concerted and coordinated effort to follow up on these recommendations. (Jayjock Note:  This document and web site can no longer be found online.  I will put it on my web site but in the meantime, please send me an email at mjayock@gmail.com and I will send you a copy).
Conclusion
Stripping away all the complexity, the “bottom line” for those responsible for conducting risk assessments under REACh is to provide enough quality information about any substance’s exposure pathway(s) and toxic effects in the real world to pass the “red face” test of accountability.   The size of the challenge is clearly daunting and the need for tool development is critical. 
Jayjock Comment:   This chapter was written about 5 years ago and I must say, from my perspective, I have not seen a groundswell of data or of researched and developed modeling tools (e.g., source characterization) coming forth.   The economic and political tides in all of this continue to shift, but we can only hope that the promised progress will ultimately be forthcoming.

[1]    http://ec.europa.eu/enterprise/sectors/chemicals/reach/index_en.htm (Accessed October 28, 2013)
[2]    http://ecb.jrc.it/REACH/  (Accessed January 9, 2008)
[3]    USEPA: Human Health Research Program Review: A Report of the US EPA Science Advisory Board, Final Report of the Subcommittee on Human Health, EPA Office of Research and Development, May 18, 2005, revised July 18, 2006.  [Available ONLINE] http://www.epa.gov/osp/bosc/pdf/hh0507rpt.pdf (Accessed October 28, 2013)
[4]    Jayjock, M.A, C.F. Chaisson, S. Arnold and E.J. Dederick, Modeling framework for human exposure assessment, Journal of Exposure Science and Environmental Epidemiology (2007) 17, S81–S89.

[5]     Kephalopoulos, S, A. Arvanitis, M.A. Jayjock (Eds):  Global CEM Net Report of the Workshop no. 2 on “Source Characterization, Transport and Fate”, Intra (Italy), 20-21 June 2005.  Available online: http://www.jrc.ec.europa.eu/pce/documentation/eur_reports/Global%20CEM%20Net%20Workshop%202%20SOURCES.pdf   (Last accessed January 9, 2008).