
Sunday, January 25, 2015

i-SVOC (2014): A Modern and Remarkable Advancement over AMEM

For me the best part about writing a blog is the networking and interaction with colleagues.  After last week's blog about an old but, I thought, still useful software tool (AMEM), I was contacted by Shen Tian, P.E., Environmental, Health and Safety Engineer at Bayer Material Science.  Shen advised that he had found and uses EPA's i-SVOC (2014).  (Note: SVOC stands for semi-volatile organic compounds, of which phthalates and flame retardants are prime examples.)  Shen advises that this model can make similar estimates of SVOCs (e.g., flame retardants and phthalates) emitting from products (substrates).  Copying from the EPA web site on this program, we are told:

“Computer program i-SVOC Version 1.0 is a Microsoft Windows-based application for dynamic modeling of the emissions, transport, sorption, and distribution of semi-volatile organic compounds (SVOCs) in the indoor environment. This program covers the following indoor media or compartments:
·         air,
·         sources,
·         sinks (i.e., sorption by interior surfaces),
·         contaminant barriers,
·         suspended particles, and
·         settled particles.”
One can find and download the program and documentation at:  http://www.epa.gov/nrmrl/appcd/mmd/i-sovc.html.  Shen went on in his email to discuss his experience with the model.  He notes that the emission of an SVOC from a plastic depends on the following user inputs, which I have copied and listed below:
  • D: diffusion coefficient of SVOC in the substrate,
  • Kma: partition coefficient of SVOC between solid and air,
  • ha: mass transfer coefficient in the air
  • C0: the initial SVOC concentration in the substrate. 
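
To make these four inputs a bit more concrete, below is a very rough back-of-the-envelope sketch (in Python) of a gas-side-limited emission estimate built from nothing but D, Kma, ha and C0.  To be clear, this is my own illustration, not the i-SVOC algorithm, and every value in it is a made-up placeholder for a phthalate-like SVOC.

# Illustrative only -- NOT the i-SVOC algorithm; all values are hypothetical.
D     = 1.0e-14   # diffusion coefficient of the SVOC in the substrate, m^2/s
Kma   = 1.0e10    # material/air partition coefficient (dimensionless)
ha    = 1.0e-3    # gas-side mass transfer coefficient, m/s
C0    = 1.0e8     # initial SVOC concentration in the substrate, ug/m^3
y_air = 0.0       # assume negligible SVOC already in the room air, ug/m^3

# Air concentration in equilibrium with the source surface
y0 = C0 / Kma                                  # ug/m^3

# If the gas-side resistance controls, the initial emission flux is simply
flux = ha * (y0 - y_air)                       # ug/m^2/s

# Rough time for the top ~10 um of the material to feel the surface depletion
t_char = (10e-6) ** 2 / D                      # s

print(f"y0 = {y0:.3g} ug/m^3, initial flux = {flux * 3600:.3g} ug/m^2/h")
print(f"Characteristic diffusion time for the top 10 um: {t_char / 3600:.1f} h")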

I have to admit I had not heard of the i-SVOC model, so I downloaded it and installed it on my Windows 8.1 laptop (reportedly, it will work on Windows 7 or XP as well).  My first impression of this program:  Holy Cow!  If AMEM is high school, this is graduate school!  It is a remarkable piece of freeware with a very slick interface that attempts to determine the time course and fate of an SVOC of concern in all the various compartments (sources, sinks, settled dust and airborne particulate matter) extant indoors.  The documentation was written by Dr. Zhishi Guo and is presented in a 2013, 80-page PDF file that downloads with the program.  I have known of Dr. Guo's work for many years; he is a brilliant modeler who has done outstanding work at the EPA in constructing and sharing many very useful and relatively user-friendly models.  The i-SVOC model was also reviewed and tested by some of the best technical minds working in this field.  By all indications, this is a first-rate model and program, but it will require a significant level of study and time from anyone wishing to use it.  To make this task easier, a number of demo input files are provided with the download to help the new user.
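
To give a flavor of the kind of compartmental bookkeeping a program like this must do, here is a bare-bones single-zone sketch I put together in Python with a constant source, a sorptive sink, ventilation, and instantaneous gas/particle partitioning.  Again, this is only my own illustration with invented numbers; i-SVOC's dynamic treatment of these compartments is far more sophisticated.

import numpy as np

# Bare-bones single-zone SVOC mass balance -- an illustration only, NOT the
# i-SVOC formulation.  All parameter values below are hypothetical.
V      = 30.0     # room volume, m^3
Q      = 15.0     # ventilation rate, m^3/h (0.5 air changes per hour)
E      = 5.0      # constant gas-phase emission rate from the source, ug/h
A_sink = 50.0     # interior sink (wall/floor) surface area, m^2
v_d    = 0.05     # sorption (deposition) velocity to the sinks, m/h
Kp_TSP = 0.5      # dimensionless particle/gas partition term (Kp x TSP)

dt, t_end = 0.1, 240.0              # time step and duration, h
t = np.arange(0.0, t_end, dt)
C_gas = np.zeros_like(t)            # gas-phase concentration, ug/m^3

for i in range(1, len(t)):
    loss = (Q / V + v_d * A_sink / V) * C_gas[i - 1]   # ventilation + sink sorption
    C_gas[i] = C_gas[i - 1] + dt * (E / V - loss)

C_particle = C_gas[-1] * Kp_TSP     # SVOC riding on suspended particles at t_end
print(f"Gas-phase concentration at {t_end:.0f} h: {C_gas[-1]:.2f} ug/m^3")
print(f"Particle-bound concentration:          {C_particle:.2f} ug/m^3")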
In order to feed this model its inputs, Shen Tian advises that the companion EPA modeling program (PARAMS 1.0) can be used to estimate quite a few of the required parameters.  I went looking for this program and found a link at: (http://www.epa.gov/ordntrnt/ORD/NRMRL/appcd/mmd/PARAMS.zip); however, when I tried the link it was broken with the dreaded 404 error.  I am looking into reviving this link and will let you know what happens, if you like, via email correspondence.  If worst comes to worst, I will get a copy of PARAMS 1.0 and make it available to anyone who sends me a request at mjayjock@gmail.com.
In his work Shen reports that the most challenging part of using i-SVOC is the estimation of D, since there has not been a lot of testing done on SVOC/material pairs.  To help fill this particular gap, he tells us that Liu, Guo et al. recently published a testing method for estimating D and Kma in Atmospheric Environment 89 (2014) 76-84, entitled "Experimental method development for estimating solid-phase diffusion coefficients and material/air partition coefficients of SVOCs," which he found helpful in filling that gap.
In all, i-SVOC represents a remarkable accomplishment and resource for anyone doing this type of human exposure assessment of SVOCs indoors.  I am indebted to Shen Tian for pointing us toward it.  If you have a need to perform this type of analysis, I believe it would be well worth your effort to learn and use it.

Sunday, January 18, 2015

AMEM: A Tool to Estimate Human Exposure from Fire Retardants and Plasticizers in Plastics


A lot of plastics in our homes are not just made of polymer.  Many have monomeric additives within them, and sometimes these additives represent a significant portion of the mass of the plastic.  For example, the plasticizer content of flexible PVC is typically greater than 10% and can be as high as 30-40%.  Phthalates are often used as the plasticizer to make PVC flexible.  All phthalates are monomers, and all will diffuse out of the plasticized PVC matrix given enough time to do so.  The typical mechanism for plasticized items used indoors is for the plasticizer to diffuse out of the PVC and then partition into house dust.  Much of house dust is composed of our shed skin cells and is therefore expected to be lipophilic.  The plasticizer partitions into the dust, and the dust is then distributed within the residence.  Of course, some will be cleaned up and removed, but it is difficult to determine how much might be removed before human exposure occurs.  What we do know is that there is a significant amount of phthalates in the dust of some homes.  The dominant source of exposure is anticipated to be hand-to-mouth ingestion of this dust, with kids doing more of it than adults but adults doing at least some of this type of ingestion throughout our lives.
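
Just to show the arithmetic of that last (hand-to-mouth) step, here is a generic screening-level dust-ingestion dose calculation in Python.  The input values are entirely hypothetical placeholders, not measured data or agency defaults.

# Generic screening-level dust-ingestion dose -- hypothetical inputs only.
C_dust = 500.0    # phthalate concentration in settled house dust, ug/g
IR     = 0.06     # incidental dust/soil ingestion rate for a small child, g/day
BW     = 15.0     # child body weight, kg

daily_dose = C_dust * IR / BW       # ug per kg body weight per day
print(f"Estimated daily dose: {daily_dose:.1f} ug/kg-bw/day")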

While researching this issue on the Internet, I found the following data for a PVC "geomembrane," which is another way of saying a canal liner used to contain water.  They report that the PVC film was 30-35% plasticizer, but I did not find out which plasticizer was used.  Their report shows the following loss of plasticizer to the water.

Service Years:                    2        4        5        9        14       19
Plasticizer Content Remaining:    92.1%    67.7%    67.8%    65.6%    58.0%    54.3%
(Original Specification Value: None)

I am frankly not sure how this relates to plasticizer diffusing from a PVC product to its surface and then being transported within the indoor environment via dust; however, it is an interesting study.
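
For what it is worth, one can fit a simple first-order loss curve to the service-year data in the table above; the short Python snippet below does exactly that.  This is nothing more than curve-fitting for illustration and says nothing about whether first-order loss is the right mechanism for a submerged geomembrane.

import numpy as np

# Log-linear least-squares fit of remaining = exp(intercept - k*t) to the
# geomembrane data quoted above -- purely illustrative curve-fitting.
years     = np.array([2.0, 4.0, 5.0, 9.0, 14.0, 19.0])
remaining = np.array([92.1, 67.7, 67.8, 65.6, 58.0, 54.3])   # percent of original

slope, intercept = np.polyfit(years, np.log(remaining / 100.0), 1)
k = -slope                                    # apparent first-order loss rate, 1/yr
print(f"Apparent loss rate constant: {k:.3f} per year")
print(f"Implied half-life:           {np.log(2.0) / k:.0f} years")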

It should be mentioned that plasticizers are not the only monomers that diffuse out of plastic.  Indeed, any monomeric additive would be expected to do so, including flame retardants.  Some flame retardants appear in the plastics of our electronics cases at concentrations around 10% and are thus prime candidates for the migration-to-dust-to-ingestion pathway.

It turns out there has been a tool around for some time; it was developed in 1989 by Arthur D. Little, Inc. for the EPA Office of Pollution Prevention and Toxics, Economics, Exposure, and Technology Division, Exposure Assessment Branch (EAB).  It is an "oldie but goodie."  Indeed, it is a DOS program that EPA claims can be run in the modern Windows (7 or 8?) environment.  I keep an old PC around just to run this old stuff (it runs DOS 6.2, Windows 3.2 and Windows XP), so I am not sure whether AMEM will run on Windows 7 or 8.  If anyone has any experience with this, please let me know.

You can download and learn about it at:  http://www.epa.gov/oppt/exposure/pubs/amem.htm   A cut and paste of some of the Q&A from this EPA site is presented below:
The model assumes:
·    The chemical is homogeneously distributed throughout the polymer and is not initially present in the phase external to the polymer,
·    Migration of the chemical is not affected by the migration of any other chemical or by the penetration into the polymer of any component of the external phase,
·    The migration is isothermal,
·    Fick's law of diffusion and convective mass transfer theory apply.
How Does AMEM Work?
AMEM is a DOS-based software product developed in 1989 that uses a family of mathematical equations that address simple and complex scenarios for chemical migration through a specific polymer. The more complex equations require more input data. Using the model, you may:
·    Develop migration estimates,
·    Consider the effect of chemical partitioning between the polymer and the external phase, and
·    Consider the effect of mass transfer resistances in the external phase.
In all cases the model estimates the fraction migrated (i.e., the fraction of the chemical initially present in the polymer that migrates) after a specified duration. This model only provides one parameter needed to estimate exposure. The user must then use other equations and/or models to estimate exposure.
What Do I Need to Use AMEM?
·    Polymer category (i.e., Silicone Rubber, Natural Rubber, LDPE, HDPE, Polystyrene, or unplasticized PVC) or diffusion coefficient of the polymer.  (MAJ Note: This is the first I have noticed the category of unplasticized PVC.  I think, however, the program and documentation might still present some insight for the estimation of phthalate migration out of PVC).
·    Molecular weight of additive.
·    Total polymer sheet thickness (cm)
·    External phase (i.e., air, water, or solid)
·    One- or two-sided migration
·    Time frame of interest
What Type of Computer System Do I Need?
·    Processor - IBM-compatible computer with a Pentium microprocessor (minimum speed: 33 MHZ)
·    Memory - 640K
·    Hard disk space - 2 MB
·    Operating System - AMEM is a DOS-based program; however, it can be run in a Windows environment by using keystrokes rather than a mouse.

(Jayjock Note:  640k of memory and 2 MB of hard disk space really shows how far we have come since 1989).
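
Coming back to what AMEM actually estimates, the "fraction migrated" from a plane sheet is the classic Fickian series solution (Crank), and it is easy to code up.  The sketch below is my own illustration of that textbook relation under idealized assumptions (one-sided migration into a perfect sink with no external-phase resistance); it is not AMEM's code, and the example inputs are invented.

import math

# Crank's series solution for the fraction of additive migrated from a plane
# sheet -- an illustration of the textbook relation, NOT AMEM's code.  Assumes
# one-sided migration into a perfect sink with no external-phase resistance.
def fraction_migrated(D_cm2_s, thickness_cm, t_seconds, n_terms=200):
    """Fraction of the additive initially in the sheet that has migrated out."""
    L = thickness_cm
    series = 0.0
    for n in range(n_terms):
        m = 2 * n + 1
        series += math.exp(-D_cm2_s * (m * math.pi) ** 2 * t_seconds
                           / (4.0 * L ** 2)) / m ** 2
    return 1.0 - (8.0 / math.pi ** 2) * series

# Hypothetical example: D = 1e-12 cm^2/s, 0.1 cm thick sheet, 5 years of service
t = 5 * 365.25 * 24 * 3600.0
print(f"Fraction migrated after 5 years: {fraction_migrated(1e-12, 0.1, t):.3f}")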
 
If anyone knows of a better tool to answer these questions, please drop me a line and I may write about it here.

Monday, January 12, 2015

Inhalation Model Research Needs

Hopefully the readers of this blog are now convinced of the importance of models that estimate the exposure potential of workers and other humans.  My sense, however, is that we are only scratching the surface of the potential value of these models.  Indeed, many folks still reject the idea of modeling in favor of the direct approach of measuring, which they consider the gold standard.  To the extent that we do not have the tools to feed our models with proper inputs, they are correct.  It is by now an old "Catch-22": it costs more to get the model inputs for any single question at hand than it does to directly measure, so we almost always directly measure.  The reality is that once we have the modeling input parameters we can use them in many different scenarios, so in the end modeling would generally be a much more cost-effective way forward.
Please don't get me wrong, models are still very valuable tools but they could be so much more powerful and useful if properly developed with research done as a public works project in the general shared interest.  
Indeed, it was with this in mind that I spent quite a bit of time in Italy starting in 2004 trying to organize such an effort.  Given an introduction by my friend and colleague, Bert Hakkinen, I began working with Stylianos Kephalopoulos, who was head of the Physical and Chemical Exposure Unit of the Institute for Health and Consumer Protection (IHCP/PCE) at the Joint Research Centre (JRC) of the European Commission in Ispra, which is just north of Milan.
The REACh regulation was happening in Europe and it was obvious to many that exposure assessment tools needed to be developed to help with the implementation of this ground-breaking legislation.
Together we first organized a pre-workshop to set up the questions and issues, and then later a series of 5 simultaneous workshops on the general subject of modeling, which took place in June 2005 in Intra, Italy.  I was an organizer and moderator for the pre-workshop and for the workshop on model "Source Characterization," since I had always seen this as a vital research need.  For this workshop on Source Characterization, we invited and gathered modelers from all over the world, with the following folks coming to the workshop:

Arvanitis A.   JRC/IHCP/PCE (EU) (Rapporteur)
Bruinen de Bruin Y    JRC/IHCP/PCE (EU)
Delmaar C.   RIVM (Netherlands)
Flessner C.   EPA (USA)
Hanninen O.   KTL (Finland)
Hubal E. Cohen   EPA (USA)
Jantunen M.   KTL (Finland)
Jayjock M.   The Lifeline Group (USA) (Moderator)
Kephalopoulos S.   JRC/IHCP/PCE (EU) (Co-ordinator)
Koistinen K.   JRC/IHCP/PCE (EU)
Little J.   Virginia Polytechnic Inst. (USA)
Mason M.   EPA (USA)
Matoba Y.   Sumitomo (Japan)
McKone T.   University of California (USA)
Nazaroff W.   University of California (USA)
Pandian M.    Infoscientific.com (USA)
Price P.        The Lifeline Group (USA)
Shade W.     Rohm and Haas, Co (USA)
Sheldon L.   EPA (USA)
Sutcliffe R.  Health Canada (CAN)
Won D.        National Research Council (CAN)
Wu K.          University of Taiwan (Taiwan)
Zhang Y.     Tsinghua University (China)

Quite a few other fine modelers could not make this workshop but contributed to the report.

I must tell you that this was a remarkably talented and energetic group and it was all I could do to keep up with the ideas coming out of this critical mass of world-class modelers.   The main conclusions of our deliberations are presented below:

“It is the recommendation of the Workshop participants that the work products presented herein to be used in the systematic development of human exposure models for their use in a tiered approach to exposure/risk assessment. 
Given that the 5 bins presented herein represent a consensus taxonomy or universe of sources, the workshop participants advise that a reasonably representative subset of this comprehensive listing be selected for specific laboratory analysis and model development. It was further suggested that exposure models designed to describe these sources of exposure and the transport and fate of substances should be constructed using a step-wise approach as outlined in this report.”

In essence the group determined that there was no reasonably inclusive outline description of source types and certainly no systematic research effort to characterize them.   The two-day workshop resulted in the following primary work products: 
  • Identification of existing source sub-models: presented in the pre-workshop report and references
  • A defined Taxonomy of Sources
  • Identification and definition of the attributes and characteristics of First Principle, Mechanistic Source and Transport/Fate Models to be developed in a tiered approach 
All of the details of these outcomes are described in the 104-page workshop report, which I will send to anyone requesting it: mjayjock@gmail.com.
This work and report are almost 10 years old.   From my perspective some progress has been made primarily from the work of Drs. Bill Nazaroff and John Little and their colleagues in the characterization of indoor air sources.    I think even Bill and John will admit that the vast majority of work that we outlined in this workshop has not been started.   From my perspective the effective implementation of REACh continues to limp along without these tools.  Any effective re-authorization of TSCA would also require the fruits of this research. 
As usual, nothing really is going to happen without committed resources ($).  I simply plan to pull this report out every few years, dust it off, and remind folks that it is here.  If we, as a society, are really serious about doing a comprehensive job of human health risk assessment for chemicals, we will ultimately need to develop these cost-effective tools.


Sunday, January 4, 2015

Acceptable Risk: Personal Gut Check

As Exposure and Risk Assessors we typically sit at the center point of imploding pressures.  We have a distinct responsibility to at least three external groups; namely, our charges, our employers and our profession.  In addition, I believe from a moral and ethical perspective, we need to be responsible to ourselves.

Our charges are the folks on the receiving end of any exposure to chemicals that might result from our analyses and recommendations.   When we do risk assessments for this group we are often forced into the role of risk manager and this has some strong implications for us.  Please let me explain why I say this. 
 
When we commit to doing a risk assessment, it of course needs to come to a conclusion relative to the risk.  In the world of Industrial Hygiene this is often the comparison of the predicted or measured exposure (EXP) to the Occupational Exposure Limit (OEL).  When EXP/OEL is less (hopefully much less) than 1, the risk assessment has a happy face.  This risk is considered to be "acceptable," or at least not unacceptable (see previous blogs on this topic).  At this point the risk assessment report normally gets written.  If, however, the ratio of EXP/OEL is at or above unity, we have a sad face.  Typically, this means the application of some risk management option to choke down the EXP so that we can get back to a happy face, and the report gets written at that point.  In a long career I do not ever remember writing a report that said the risk was unacceptable.  In essence, the risk assessor is forced to become a risk manager before his or her report can be written with a happy face.
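
If it helps to see that decision rule written down, the arithmetic is trivial; the numbers in the little Python sketch below are invented purely for illustration.

# The EXP/OEL "happy face" test in code form -- illustrative numbers only.
def hazard_quotient(exposure_mg_m3: float, oel_mg_m3: float) -> float:
    """EXP/OEL ratio; values below 1 are conventionally read as acceptable."""
    return exposure_mg_m3 / oel_mg_m3

hq = hazard_quotient(exposure_mg_m3=0.2, oel_mg_m3=1.0)
print(f"EXP/OEL = {hq:.2f} -> " + ("acceptable" if hq < 1.0 else "needs risk management"))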

All of this presupposes that the EXP was accurately estimated and the OEL is an appropriate measure of an exposure level that presents an acceptable risk.   Whether this is true or not are issues for other discussions but for purposes of this treatise; let us assume it is true.

In doing a risk assessment we need to come to grips with issues around the safety of our charges and the needs of our employers.   We do this while staying true to the precepts and standards of our profession.   Almost all of this typically involves discussion and decisions involving money and where we as a professional draw the line on what we “believe” is reasonably acceptable risk.

I have never found balancing these needs to be easy.  We as the assessors are, of course, human with our own set of needs.   Some of which I am listing below:
  • To assure the health and safety of the workers relative to this risk
  • To surprise and delight the boss or client relative to our ability to add value to the organization
    • remain "promotable" as an employee
    • remain "on contract" as a consultant
  • To assure our livelihood:
    • maintain income and a lifestyle
    • provide well for our families
    • educate our children
  • To feel good about what we are doing in our careers and as a human being

It really is not simple to balance these.  For example, you could stay employed and keep your job but still be considered a "problem" by your employer.  On the other hand, you have to be part of the team with your boss and be invested in the success of the organization.

For me this all boils down to a specific and not very complicated personal "gut check" I have for any risk assessment that I am willing to write and sign:  Would I allow a member of my family to be exposed to the occupational scenarios I am declaring to be acceptably safe?  If I would, then I feel I can endorse the analysis and conclusions.

I am not suggesting that this is the only standard; it is simply mine.  I admit to being extremely fortunate and grateful in my career because this standard never forced me to decide between employment and professional and personal integrity with my previous employer of 35 years or my current or past clients.  Somehow I only wind up with the type of client that seems to subliminally understand and appreciate this position.  If I lived in another time, place or circumstance, I am frankly not sure I could have maintained this standard.  My point is that I believe we should all have a "gut check" to gauge our bottom line with clients, employers and ourselves.  I believe we all need to keep our personal standard in mind when we are put into the position of determining and ultimately managing acceptable risk for other human beings.

The general subject of “acceptable risk” has generated a remarkable amount of interest and comment from readers of this blog.    Indeed, last week’s blog is on track to be the most popular post in almost 90 previous offerings.   We have begun work on what will probably be a Roundtable on this subject at the 2016 AIHA Conference in Baltimore.  

Human Health Risk Assessment is, after all, a human pursuit and, in the end, quite personal.  I would love to hear from folks reading this blog about your personal “gut checks” in doing human health risk assessment.