
Sunday, February 22, 2015

Inhalation Exposure from Spills Modeling – Part II

A few years ago I was fortunate enough to be hired by Dr. Perry Logan of the 3M Company to do some solvent spill modeling. Perry is a great collaborator, and this partnership resulted in a considerable amount of experimental work and a paper published online and in hard copy as an “open access” document in the Annals of Occupational Hygiene. For me, the really neat part about this work was that the laboratory experiments were done by very savvy technologists and the data generated really showed the validity of the modeling approach. I would be happy to send a pdf of this publication and the Word document of the extensive supplementary material published online (mostly laboratory test details) to anyone who asks at: mjayjock@gmail.com

During this work we looked at evaporating liquids with constant sources, such as might occur from an open vessel or deep pool. We also examined first-order decreasing sources, such as typically occur with thin spills as they initially spread out on the floor or counter top. The first type of source (i.e., constant) would be generally applicable to "non-spill" scenarios with relatively deep evaporating liquids.

Some of the important learnings from this work for me are:

These solvents evaporated so rapidly after being spilled that the exposure was essentially a bolus occurring in a very short time frame (minutes).   As I have written in previous blogs, bolus exposures are not often or well considered by the IH and risk assessment community within most workplace scenarios.

Given no STEL or Ceiling value for any particular compound of interest, one was compelled to use the ACGIH Excursion limit as outlined in the TLV Booklet:

“Excursions in worker exposure levels may exceed 3 times the TLV-TWA for not more than a total of 30 minutes during a workday, and under no circumstances should they exceed 5 times the TLV-TWA, provided that the [8 hour] TLV-TWA is not exceeded.”

In the case of these very volatile solvents this de facto exposure limit was exceeded during 1 L spills by all of the solvents under consideration because they evaporated so quickly. 

Given that situation, evacuation is the most appropriate response to a spill with a quantity beyond a determined threshold.   That threshold can be estimated using the techniques described in this paper and supplementary material.
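For readers who log or model concentration-time data, here is a minimal sketch of how one might check a time series against this excursion guidance. It assumes evenly spaced samples and assumes zero exposure outside the sampled period; the data, function name, and units are my illustrative placeholders, not values from the paper.

```python
import numpy as np

def excursion_check(c: np.ndarray, dt_min: float, tlv_twa: float) -> dict:
    """Check a concentration series against the ACGIH excursion guidance.

    c       -- concentrations, same units as tlv_twa
    dt_min  -- minutes between samples (evenly spaced)
    tlv_twa -- the 8-hour TLV-TWA
    Zero exposure is assumed for any unsampled part of the 8-hour shift.
    """
    minutes_above_3x = dt_min * float(np.sum(c > 3.0 * tlv_twa))
    twa_8hr = float(np.sum(c) * dt_min / 480.0)
    return {
        "never_exceeds_5x": bool(np.all(c <= 5.0 * tlv_twa)),
        "minutes_above_3x": minutes_above_3x,
        "meets_30min_rule": minutes_above_3x <= 30.0,
        "twa_8hr": twa_8hr,
        "meets_twa": twa_8hr <= tlv_twa,
    }

# Hypothetical 1-minute samples during and after a small spill:
c = np.array([0.0, 120.0, 400.0, 260.0, 150.0, 80.0, 30.0, 10.0, 2.0, 0.0])
print(excursion_check(c, dt_min=1.0, tlv_twa=50.0))
```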

The practical conclusions from this work of potential use to practicing Industrial Hygienists are presented below:

Clearly, the chamber methods used in this work will not be available to all Industrial Hygienists. However, the critical factor for determining both constant and first-order decreasing evaporation rates for single-component liquids resides in the simple gravimetric measurement of evaporative solvent weight loss over time. The practicing hygienist should consider conducting gravimetric measurements of evaporation rates for both types of models (constant rate for constant sources and decreasing rates for spills) using tools that should be readily available to most investigators. This would involve placing a small container on a scale accurate to within 0.1 g in a laboratory hood. The liquid of interest would be placed in the container and mass loss recorded versus time.

The type of scenario to be modeled would determine the specific experimental setup. A constant emission source, such as an open vessel of evaporating solvent, could be simulated with a petri dish or bottle cap with a small surface-area-to-volume ratio, in an attempt to produce a constant rate of evaporation from a constrained surface area. Simulation of a spill (an exponentially decreasing source) might use a petri dish, paint can lid, or floor tile in which the solvent thickness represents a realistic value, achieved by using a spill volume that barely reaches the perimeter of the selected surface.

For the constant evaporation rate experiments, given a known surface area and representative air movement over the liquid (measured with an anemometer), one could readily calculate the evaporation rate per unit area. For the exponentially decreasing sources, alpha (α) could also be calculated. The details of both calculation techniques are described in the available supplementary material, and a simple curve-fitting sketch is shown below.
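To make the curve fitting concrete, here is a minimal Python sketch of both fits under stated assumptions: the time points, mass readings, and dish area are hypothetical placeholders, and the paper's supplementary material remains the authoritative source for the actual calculation details.

```python
import numpy as np

t = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])             # elapsed time, min

# Constant source (deep pool, small surface-area-to-volume ratio):
# mass falls linearly, so the fitted slope is the evaporation rate.
m_pool = np.array([50.0, 49.1, 48.2, 47.3, 46.4, 45.5])   # g (hypothetical)
slope, _ = np.polyfit(t, m_pool, 1)
area_cm2 = 60.0                                           # measured dish area
print(f"constant rate: {-slope:.2f} g/min "
      f"= {-slope / area_cm2:.4f} g/min per cm^2")

# Decaying source (thin spill): m(t) = M0 * exp(-alpha * t),
# so ln(m) is linear in t with slope -alpha.
m_spill = np.array([10.0, 7.4, 5.5, 4.1, 3.0, 2.2])       # g (hypothetical)
log_slope, _ = np.polyfit(t, np.log(m_spill), 1)
alpha = -log_slope
print(f"alpha = {alpha:.3f} min^-1, half-life = {np.log(2) / alpha:.1f} min")
```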

This blog lives on your questions and your experience as exposure assessors relative to the application of modeling techniques.  It will ultimately slow and stop without this participation.  Please let me know what you think about the above information, whether you might consider doing this and what would keep you from it.   That could, in turn, help me to determine what information or specific approach or calculation advice, if any, might be needed.


Sunday, February 15, 2015

Evaporation Rate of Small Spills

When we spill a volatile chemical we have a problem relative to potential inhalation exposure.   Depending on the specific situation we need to answer the following questions:   Do we clean it up immediately or do we simply stand back and let it evaporate?  Do I (or we) need to get out of the room or the area?   What is my exposure/risk potential and the potential of others in the area if I do not evacuate and stay to clean it up?

It is such questions that modelers live on! In order to answer them, however, they need some reasonable input relative to the evaporation rate, size of the spill and the room, the ventilation rate, and the rate of linear air movement within the room. A very useful model for this scenario is the two-zone model developed and promoted by Dr. Mark Nicas and available on the slick freeware spreadsheet IH MOD. Please see some of my previous blogs on these if you are interested in learning more about them. Like all models, the two-zone model needs to be fed with the appropriate inputs.

The basic idea around the two zone model is to put both the source of the exposure and the breathing zone of the exposed person into the inner or near-field zone.  It is the geometry of the scenario that determines the size and shape of the near-field.  For example, someone spraying the hair on their head with hair-spray would have a near-field well described by a sphere of about 0.5 m diameter around their head.   Someone cleaning up a spill by hand might have a near-field described as a hemisphere with a radius equal to an arm’s length.
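As a concrete illustration, here is a small sketch of how those two near-field geometries translate into model inputs. The interzonal airflow rate β = ½ × (free surface area) × (random air speed) follows the usual two-zone formulation; the air speed and arm's length below are hypothetical values, not recommendations.

```python
import math

s = 0.05  # measured random air speed, m/s (hypothetical)

# Hair-spray example: sphere of 0.5 m diameter around the head.
r = 0.25                                               # m
v_sphere = (4.0 / 3.0) * math.pi * r**3                # near-field volume, m^3
beta_sphere = 0.5 * (4.0 * math.pi * r**2) * s * 60.0  # interzonal flow, m^3/min

# Spill cleanup: hemisphere of one arm's length radius, sitting on the floor.
r = 0.70                                               # hypothetical arm length, m
v_hemi = (2.0 / 3.0) * math.pi * r**3
beta_hemi = 0.5 * (2.0 * math.pi * r**2) * s * 60.0    # curved free surface only

print(f"sphere: V = {v_sphere:.3f} m^3, beta = {beta_sphere:.1f} m^3/min")
print(f"hemisphere: V = {v_hemi:.3f} m^3, beta = {beta_hemi:.1f} m^3/min")
```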

Some of the inputs needed for the two-zone model of a spill are usually readily available. Let's say our hypothetical spill of liquid has a volume of 100 cm³. If we assume a circular spill of 1 m in diameter, that calculates out to a spill thickness of roughly 0.1 mm (see the quick calculation below). That seems reasonable assuming it is on a relatively nonporous surface like tile or finished cement. We can also approximate the ventilation rate: if it is a laboratory or industrial area, then 3-5 mixing air changes per hour seems reasonable; if it is a residential area with doors and windows closed, then about 1/10th of this value would be typical. Room size is easy to get. So what are we missing? Answer: the evaporation rate.
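The thickness figure is just volume divided by footprint area; a quick sketch of the arithmetic using the numbers above:

```python
import math

v_spill = 100.0                       # spill volume, cm^3
diameter = 100.0                      # assumed spill diameter, cm (1 m)
area = math.pi * (diameter / 2.0)**2  # ~7854 cm^2
print(f"thickness = {10.0 * v_spill / area:.2f} mm")  # ~0.13 mm
```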

This brings us to the subject of sub-models, or models that are used to feed other models. Some evaporating sources are essentially constant within the time frame of any one-day exposure. A prime example would be an open drum. Spills, on the other hand, are typically not constant. That is, they typically shrink with time as they evaporate. As they shrink, their rate of generation decreases until it ceases entirely when the liquid is all gone. A model that seems to do a reasonably good job of mathematically describing this situation is the first-order decay model. First-order decay was also discussed in detail in a previous blog if you are interested in going back. It is described here briefly as a rate that is dependent on, and directly proportional to, how much of the original spill remains. That is, the rate is maximized in the beginning, when all or 100% of the spill is available to evaporate. After 10% evaporates, the evaporation rate is 90% of the maximum. After 50% evaporates, the rate is half that of the original. The time it takes to get to 50% is the half-life of this first-order kinetic model. After 7 half-lives, less than 1% of the spill remains and the evaporation rate is less than 1% of the maximum. Theoretically you never get to zero, while in reality you certainly do: after 7-8 half-lives it is essentially gone. In any event, this model appears to do a credible job of describing the evaporation rate of spills when compared against real-world data.

The basic model is:

Evaporation Rate = (Initial Evaporation Rate) × exp(−α·t)

It can be shown that:
Evaporation Rate = α × M0 × exp(−α·t)

Where:
  • M0 = initial mass of the spill
  • α = the evaporation rate constant, i.e., the fraction of the remaining mass evaporating per unit time, usually per minute. Thus α = 0.10 min⁻¹ means that, at any instant, about 10% of the remaining mass is evaporating per minute. (A short sketch of this source term follows the list.)
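Here is a minimal sketch of that source term, using illustrative numbers (roughly 100 cm³ of acetone; the α value is a placeholder, not a measured one):

```python
import numpy as np

alpha = 0.10      # min^-1 (illustrative)
m0 = 79_000.0     # mg; ~100 cm^3 of acetone at ~0.79 g/cm^3

t = np.arange(0.0, 61.0, 1.0)              # minutes
rate = alpha * m0 * np.exp(-alpha * t)     # evaporation rate, mg/min
mass_left = m0 * np.exp(-alpha * t)        # mg remaining

print(f"half-life = {np.log(2) / alpha:.1f} min")
print(f"remaining after 7 half-lives: {100 * 0.5**7:.1f}% of M0")  # < 1%
```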

Drs. Nicas and Keil wrote a seminal paper about α in the context of modeling spills, which I will be happy to send to whoever asks me for it at: mjayjock@gmail.com.

This paper has a number of values of α for common laboratory solvents and also forwards the following data-fitting relationship for other volatile organic compounds (a worked example follows below):

  α (min⁻¹) = 0.000524 × Pv + 0.0108 × (SA/VOL)
Where:
  • Pv = saturation vapor pressure in mm Hg at 20 °C
  • SA/VOL = initial surface area to volume ratio of the spill, cm⁻¹
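As a worked example, here is the fitted relationship applied to a hypothetical thin acetone spill. Acetone's vapor pressure at 20 °C is roughly 185 mm Hg, and the spill geometry matches the 100 cm³, 1 m diameter example from earlier in this post; treat all of these as illustrative inputs.

```python
import math

pv = 185.0                      # mm Hg at 20 C (approximate, acetone)
v_spill = 100.0                 # cm^3
sa = math.pi * (100.0 / 2)**2   # cm^2; 1 m diameter circle (~7854)
sa_vol = sa / v_spill           # ~79 cm^-1 for this very thin film

alpha = 0.000524 * pv + 0.0108 * sa_vol
print(f"alpha = {alpha:.2f} min^-1, half-life = {math.log(2) / alpha:.2f} min")
```

For this thin film the relationship predicts α near 0.9 min⁻¹, i.e., a half-life under a minute, which is consistent with these spills behaving as bolus exposures.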

An experimental evaluation of this algorithm with n-pentane is presented in the book: Mathematical Models for Estimating Occupational Exposure to Chemicals, 2nd Edition, AIHA Press.

What is so useful about this model, as implemented in IH MOD, is that it provides estimates of the PEAK breathing zone concentration along with the time-weighted average for whatever time frame you want (e.g., 15 minutes or 8 hours), to be compared with whatever exposure limit (Ceiling, 15-minute STEL, or 8- or 24-hour TWA) you deem appropriate. A simplified sketch of this kind of calculation follows.
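To show how peak and TWA values fall out of such a calculation, here is a deliberately simplified sketch: a single well-mixed box (IH MOD's two-zone treatment is more refined) fed by the first-order decaying source defined above. All inputs are illustrative.

```python
import numpy as np

alpha = 0.95      # min^-1, from the fitted-relationship sketch above
m0 = 79_000.0     # mg (about 100 cm^3 of acetone)
v_room = 30.0     # m^3
lam = 3.0 / 60.0  # min^-1; 3 mixing air changes per hour

# Analytic solution of V*dC/dt = alpha*M0*exp(-alpha*t) - lam*V*C with C(0)=0:
t = np.linspace(0.0, 60.0, 6001)            # minutes
c = (alpha * m0 / (v_room * (lam - alpha))) * (
    np.exp(-alpha * t) - np.exp(-lam * t))  # mg/m^3

in_15 = t <= 15.0
print(f"peak = {c.max():.0f} mg/m^3")
print(f"15-min TWA = {c[in_15].mean():.0f} mg/m^3")  # uniform grid: mean = TWA
```

Even this crude one-box version shows the bolus character of such spills: the peak arrives within a few minutes and dwarfs the longer-term average.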

In a later blog I will discuss other work done and published on spill modeling in an industrial setting where the risk was driven by peak exposures.


Monday, February 9, 2015

Introduction to Monte Carlo Simulations for Exposure Assessment

Exposure assessment modeling has its Stars. Last week I discussed some of the work of Dr. Zhishi Guo; this week it is Thomas W. Armstrong PhD, CIH and AIHA Fellow. I am happy and grateful to count Tom as a friend and colleague. He is a steady and tireless worker in Industrial Hygiene education and in the development of tools for the profession of exposure assessment. He gives freely of his time in these endeavors, and his overall approach and judgment in the realm of exposure and risk assessment is first-rate. It is his most recent contribution that is being highlighted here today; namely, an article in this month's AIHA Synergist on Monte Carlo Simulation (MCS) and its role in making decisions.

Like all good educational pieces, it is clearly written making it easy for the beginner to understand.  Anyone can see it online at:  http://synergist.aiha.org/Monte-Carlo-Risk-Assessment

I just wanted to highlight some of the lower-level details of the process along with some of my basic experience with the commercial software mentioned in the article. For me, the really neat aspect of most MCS software is that it sits on top of Microsoft Excel as an add-on. Many, if not most, of us know and love Excel as a remarkably capable program that can do some very complex calculations. Indeed, given all the functionality that it has, Excel should be a basic tool of any technologist. What it cannot do easily by itself is treat any cell as a DISTRIBUTION rather than a single value or a single value resulting from a calculation within the spreadsheet.

Let us assume that we have, say, cell B3 in a normal spreadsheet with the value 7. What MCS add-on software can do is easily and simply allow you to describe that cell (B3) as a DISTRIBUTION. Let us say we want it to be a normal (Gaussian) distribution with a mean of 7 and a standard deviation of 2. The MCS add-on will allow you to do this. You could have just as easily chosen a uniform distribution with a minimum of 2 and a maximum of 12. In the case of the normal distribution (mean = 7, sd = 2), the MCS software samples the cell and gets a value constrained by the distribution; that is, it literally samples the distribution. Say, for example, this single sample returns the value 5.2. That value could be used elsewhere in the spreadsheet with other values that were either constant or also sampled from defined distributions. After all this, the spreadsheet has done exactly ONE realization of the calculation without any iteration.

Set the MCS software to 10,000 iterations and the PC goes through the sampling and calculations again and again automatically and never gets tired. It will keep a record of the output distributions of the cells that you choose. Anything that can be represented as a single deterministic outcome calculation in Excel can now be presented as a DISTRIBUTION of outcomes. It is, IMHO, technological magic!
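For readers who want to see the machinery without the add-on, here is a minimal sketch of the same idea in Python/numpy. The two distributions mirror the examples above; the "formula" combining them is a made-up placeholder for whatever your spreadsheet actually calculates.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000                                     # iterations

b3 = rng.normal(loc=7.0, scale=2.0, size=n)    # "cell B3" ~ Normal(7, 2)
c3 = rng.uniform(low=2.0, high=12.0, size=n)   # another cell ~ Uniform(2, 12)

result = b3 / c3   # placeholder for the spreadsheet's deterministic formula

for p in (5, 50, 95):
    print(f"{p}th percentile: {np.percentile(result, p):.2f}")
```

Ten thousand iterations of something like this run in well under a second on any modern PC, which brings me to the next point.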

My first IBM compatible PC had an early generation 8086 CPU which was well before 286, 386 or Pentium CPUs which are now considered ancient in our world of multi-core monster processors. The commercial MCS software in those early days cost a few hundred bucks and came on a single 3.5” floppy.  I would set up a spreadsheet to do 10,000 MCS iterations and leave to have dinner!   It was usually complete when I returned an hour or two later.   Today, a typical CPU will do 10,000 iterations in seconds!

Today the MCS commercial software is also considerably more expensive and typically requires an annual fee for commercial (not educational) customers to keep it up to date through the various evolving versions of Excel.  If you have the need for all its “bells and whistles” (and there are a lot) and a company or client to purchase it then I think the commercial software makes sense.  If not, then you may want to look into the freeware versions that Tom mentions in his piece.

If any of the readers of this blog have considerable experience with these freeware versions, either Simular or Simulacion, I would love to hear the details of your experience which I would be happy to pass on here.


Sunday, February 1, 2015

Dr. Zhishi Guo and the Programs IAQX, PARAMS v1.1 and More

I can count the folks that I know who have provided free, user-friendly modeling programs in the service of human health exposure/risk assessment on the fingers of my two hands. Dr. Zhishi Guo is one such person. He retired from US EPA last March and is currently serving as the Deputy Editor for the journal Indoor and Built Environment while also doing some consulting. Last week this blog highlighted one of his programs, i-SVOC, as an important tool. I found it to be a remarkable piece of freeware with a very slick interface that attempts to determine the time course and fate of any semi-volatile organic compound (SVOC) of concern in all the various compartments (sources, sinks, settled dust and airborne particulate matter) extant indoors.

Although I have yet to really put it through its paces, Shen Tian, P.E., Environmental, Health and Safety Engineer at Bayer Material Science, who initially identified it to me, has worked with i-SVOC and reports that it requires a number of input variables to feed it properly. Shen found that another program from Dr. Guo, PARAMS 1.0, can be used to estimate quite a few of the parameters required for input. Using a Google search, I had trouble locating a link for PARAMS 1.0 that was not broken. I emailed Dr. Guo asking for his help. He responded:

“Last year, I updated programs IAQX and PARAMS to version 1.1 for compatibility with the Windows 7 and 8. The installation files for the new version can be downloaded from http://www.epa.gov/nrmrl/appcd/mmd/iaq.html”.

I checked it out and indeed there is a wealth of freeware and information on this page. I downloaded everything I could from it. The descriptions on this page for the three programs offered on it are reproduced below:

IAQX v1.1: IAQX stands for Simulation Tool Kit for Indoor Air Quality and Inhalation Exposure. It is a Microsoft Windows-based indoor air quality (IAQ) simulation software package that complements and supplements existing IAQ simulation programs (such as RISK) and is designed mainly for advanced users. The IAQX package consists of five stand-alone simulation programs. In addition to performing conventional IAQ simulations, which compute the time/concentration profile and inhalation exposure, IAQX can estimate the adequate ventilation rate when certain air quality criteria are provided by the user, a unique feature useful for product stewardship and risk management. IAQX will be developed in a cumulative manner and more special-purpose simulation programs will be added to the package in the future.

PARAMS 1.1: This Microsoft Windows-based computer program implements 30 methods for estimating the parameters in indoor emissions source models, which are an essential component of indoor air quality (IAQ) and exposure models. These methods fall into seven categories:

1. the properties of indoor air,
2. the first-order decay rate constants for solvent emissions from indoor coating materials,
3. gas-phase, liquid-phase, and overall mass transfer coefficients,
4. molar volume,
5. molecular diffusivity in air, liquid, and solid materials,
6. solid-air partition coefficient, and
7. vapor pressure and volatility for pure organic compounds and petroleum-based solvents and the properties of water.

Potential users include those who develop or use IAQ and exposure models, and those who develop or use quantitative structure-activity relationship (QSAR) models. In addition, many calculations are useful to researchers in areas other than indoor air quality. Users can benefit from this program in two ways: first, it serves as a handy tool by putting commonly used parameter estimation methods in one place; second, it saves users time by taking over tedious calculations.

RISK: The latest published version of the RISK computer model is designed to allow calculation of individual exposure to indoor air pollutants from sources. The model runs in the MS-Windows operating environment and is designed to calculate exposure due to individual, as opposed to population, activity patterns and source use. The model also provides the capability to calculate risk due to the calculated exposure.

Even as someone with a very keen and long-standing interest in inhalation exposure modeling, I must admit that I was only vaguely aware of the availability of these tools. Along with i-SVOC, they form a remarkable legacy for the work of Dr. Guo and his colleagues. I am highlighting them here to point out their existence in the hopes that more folks will use them. We can only hope that Dr. Guo continues to stay active in the science and that the EPA will continue to sponsor and encourage exposure assessment scientists of his caliber.

Sunday, January 25, 2015

i-SVOC (2014): A Modern and Remarkable Advancement over AMEM

For me the best part about writing a blog is the networking and interaction with colleagues. After last week's blog about an old but, I thought, still useful software tool (AMEM), I was contacted by Shen Tian, P.E., Environmental, Health and Safety Engineer at Bayer Material Science. Shen advised that he had found and uses EPA's i-SVOC (2014) (Note: SVOC stands for Semi-Volatile Organic Compound, of which phthalates and flame retardants are prime examples). Shen advises that this model can do similar estimates of SVOCs (e.g., flame retardants and phthalates) emitting from products (substrates). Copying from the EPA web site on this program, we are told:

“Computer program i-SVOC Version 1.0 is a Microsoft Windows-based application for dynamic modeling of the emissions, transport, sorption, and distribution of semi-volatile organic compounds (SVOCs) in the indoor environment. This program covers the following indoor media or compartments:
· air,
· sources,
· sinks (i.e., sorption by interior surfaces),
· contaminant barriers,
· suspended particles, and
· settled particles.”
One can find and download the program and documentation at: http://www.epa.gov/nrmrl/appcd/mmd/i-sovc.html. Shen went on in his email to discuss his experience with the model. He notes that the emission of an SVOC from a plastic depends on the following user inputs, which I have copied and listed below (a sketch of how they combine follows the list):
  • D: diffusion coefficient of SVOC in the substrate,
  • Kma: partition coefficient of SVOC between solid and air,
  • ha: mass transfer coefficient in the air
  • C0: the initial SVOC concentration in the substrate. 
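My own back-of-envelope reading of how these four inputs fit together at the material/air interface is sketched below. To be clear, this is not i-SVOC's actual numerics, and every value is illustrative.

```python
kma = 1.0e7     # Kma: solid/air partition coefficient (dimensionless)
c0 = 1.0e9      # C0: initial SVOC concentration in the substrate, ug/m^3
ha = 3.6        # ha: gas-phase mass transfer coefficient, m/h
c_air = 0.0     # bulk indoor air concentration, ug/m^3

y0 = c0 / kma               # air concentration in equilibrium with the surface
flux = ha * (y0 - c_air)    # emission flux from the surface, ug/(m^2*h)
print(f"y0 = {y0:.0f} ug/m^3, initial flux = {flux:.0f} ug/(m^2*h)")
# D (not used above) governs how fast the near-surface layer is depleted
# and resupplied from the bulk, i.e., how long that flux can be sustained.
```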

I have to admit I had not heard of the i-SVOC model, so I downloaded it and installed it on my Windows 8.1 laptop (reportedly, it will work on Windows 7 or XP as well). My first impression of this program: Holy Cow! If AMEM is high school, this is graduate school! It is a remarkable piece of freeware with a very slick interface that attempts to determine the time course and fate of an SVOC of concern in all the various compartments (sources, sinks, settled dust and airborne particulate matter) extant indoors. The documentation was written by Dr. Zhishi Guo and is presented in a 2013, 80-page PDF file that downloads with the program. I have known of Dr. Guo's work for many years and he is a brilliant modeler who has done outstanding work at the EPA in constructing and sharing many very useful and relatively user-friendly models. The i-SVOC model was also reviewed and tested by some of the best technical minds working in this field. By all indications, this is a first-rate model and program, but it will require a significant level of study and time from anyone wishing to use it. To make this task easier, a number of demo input files are provided with the download to help the new user.
In order to feed this model relative to its inputs, Shen Tian advises that the companion modeling program (PARAMS 1.0) from EPA can be used to estimate quite a few of the parameters required for input. I went looking for this program and found a link at: (http://www.epa.gov/ordntrnt/ORD/NRMRL/appcd/mmd/PARAMS.zip); however, when I tried the link it was broken with the dreaded 404 error. I am looking into reviving this link and will let you know what happens, if you like, via email correspondence. If worst comes to worst, I will get a copy of PARAMS 1.0 and make it available to anyone who sends me a request at mjayjock@gmail.com.
In his work Shen reports that the most challenging part of using i-SVOC is the estimation of D, since there has not been a lot of testing done on SVOC/material pairs. To help fill this particular gap, he tells us that Liu, Guo, et al. recently published a testing method for estimating D and Kma in Atmospheric Environment 89 (2014) 76-84, entitled: "Experimental method development for estimating solid-phase diffusion coefficients and material/air partition coefficients of SVOCs," which he found helpful.
In all, i-SVOC represents a remarkable accomplishment and resource for anyone doing this type of human exposure assessment of SVOC indoors.  I am indebted to Shen Tian for pointing us toward it.   If you have a need to perform this type of analysis, I believe that it would be well worth your effort to learn and use it.

Sunday, January 18, 2015

AMEM: A Tool to Estimate Human Exposure from Fire Retardants and Plasticizers in Plastics


A lot of plastics in our homes are not just made of polymer. Many have monomeric additives within them, and sometimes these additives represent a significant portion of the mass of the plastic. For example, the plasticizer content in flexible PVC is typically greater than 10% and can be as high as 30-40%. Phthalates are often used as the plasticizer to make PVC flexible. All phthalates are monomers and all will diffuse out of the plasticized PVC matrix given enough time to do so. The typical mechanism for plasticized items used indoors is for the plasticizer to diffuse out of the PVC and then partition into house dust. Much of house dust is composed of our shed skin cells and is therefore expected to be lipophilic. The plasticizer would partition into the dust and the dust would be distributed within the residence. Of course, some will be cleaned up and removed, but it is difficult to determine how much might be removed before human exposure may occur. What we do know is that there is a significant amount of phthalates in the dust of some homes. The dominant source of exposure is anticipated to be hand-to-mouth ingestion of this dust, with kids doing more of this than adults but with adults doing at least some of this type of ingestion our entire lives.

While researching this issue on the Internet, I found the following data for a PVC “geomembrane”, which is another way of saying a canal liner that was used to contain water. The report states that the PVC film was 30-35% plasticizer, but I did not find out which plasticizer they used. Their report shows the following loss of plasticizer to the water:

Service Years                   Plasticizer Content Remaining
2                               92.1%
4                               67.7%
5                               67.8%
9                               65.6%
14                              58.0%
19                              54.3%
Original Specification Value    None

I am frankly not sure how this relates to plasticizer diffusing from a PVC product to its surface to be then transported within the indoor environment via dust; however, it is an interesting study.

It should be mentioned that plasticizers are not the only monomers that diffuse out of plastic. Indeed, any monomeric additive would be expected to do so, including flame retardants. Some flame retardants appear in the plastics of our electronics cases at concentrations around 10% and are thus prime candidates for the migration-to-dust-to-ingestion pathway.

It turns out there has been a tool around for some time that was developed in 1989 by Arthur D. Little, Inc. for the EPA Office of Pollution Prevention and Toxics, Economics, Exposure, and Technology Division, Exposure Assessment Branch (EAB). It is an “oldie but goodie”. Indeed, it is a DOS program that EPA claims can be run in the modern Windows (7 or 8?) environment. I keep an old PC around just to run this old stuff (it runs DOS 6.2, Windows 3.2 and Windows XP), so I am not sure if AMEM will run on Windows 7 or 8. If anyone has any experience with this, please let me know.

You can download and learn about it at:  http://www.epa.gov/oppt/exposure/pubs/amem.htm   A cut and paste of some of the Q&A from this EPA site is presented below:
The model assumes:
·    The chemical is homogeneously distributed throughout the polymer and is not initially present in the phase external to the polymer,
·    Migration of the chemical is not affected by the migration of any other chemical or by the penetration into the polymer of any component of the external phase,
·    The migration is isothermal,
·    and Fick's law of diffusion and convective mass transfer theory applies.
How Does AMEM Work?
AMEM is a DOS-based software product developed in 1989 that uses a family of mathematical equations that address simple and complex scenarios for chemical migration through a specific polymer. The more complex equations require more input data. Using the model, you may:
·    Develop migration estimates,
·    Consider the effect of chemical partitioning between the polymer and the external phase, and
·    Consider the effect of mass transfer resistances in the external phase.
In all cases the model estimates the fraction migrated (i.e., the fraction of the chemical initially present in the polymer that migrates) after a specified duration. This model only provides one parameter needed to estimate exposure. The user must then use other equations and/or models to estimate exposure.
What Do I Need to Use AMEM?
Polymer category (i.e., Silicone Rubber, Natural Rubber, LDPE, HDPE, Polystyrene, or unplasticized PVC) or diffusion coefficient of the polymer.  (MAJ Note: This is the first I have noticed the category of unplasticized PVC.  I think, however, the program and documentation might still present some insight for the estimation of phthalate migration out of PVC).
·    Molecular weight of additive.
·    Total polymer sheet thickness (cm).
·    External phase (i.e., air, water, or solid).
·    One- or two-sided migration.
·    Time frame of interest.
What Type of Computer System Do I Need?
·    Processor - IBM-compatible computer with a Pentium microprocessor (minimum speed: 33 MHZ)
·    Memory - 640K
·    Hard disk space - 2 MB
·    Operating System - AMEM is a DOS-based program; however, it can be run in a Windows environment by using keystrokes, not a mouse.

(Jayjock Note:  640k of memory and 2 MB of hard disk space really shows how far we have come since 1989).
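For the simplest AMEM-style case (one-sided migration into a well-stirred external phase, with no partitioning limit and no external mass-transfer resistance), the fraction-migrated curve is the classic Fickian slab solution. Here is a sketch of that calculation; the diffusion coefficient and sheet thickness are hypothetical, chosen only to be in the plasticizer-in-PVC ballpark.

```python
import math

def fraction_migrated(d_cm2_s: float, thickness_cm: float,
                      t_s: float, terms: int = 200) -> float:
    """Series solution (Crank) for one-sided diffusion out of a slab."""
    total = 0.0
    for n in range(terms):
        k = 2 * n + 1
        total += (8.0 / (k * math.pi) ** 2) * math.exp(
            -d_cm2_s * (k * math.pi) ** 2 * t_s / (4.0 * thickness_cm ** 2))
    return 1.0 - total

d = 1.0e-12   # cm^2/s, hypothetical diffusion coefficient
l = 0.1       # cm, hypothetical sheet thickness
for years in (2, 5, 10, 20):
    t = years * 365.25 * 24.0 * 3600.0
    print(f"{years:>2} yr: {100.0 * fraction_migrated(d, l, t):.1f}% migrated")
```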
 
If anyone knows of a better tool to answer these questions, please drop me a line and I may write about it here.





Monday, January 12, 2015

Inhalation Model Research Needs

Hopefully the readers of this blog are now convinced of the importance of models that estimate the exposure potential of workers and other humans. My sense is that we are only scratching the surface of the potential value of these models, however. Indeed, many folks still reject the idea of modeling in favor of the direct approach of measuring, which they consider the gold standard. To the extent that we do not have the tools to feed our models with proper inputs, they are correct. It is by now an old “Catch 22”: it costs more to get the model inputs for any single question at hand than it does to directly measure, so we almost always directly measure. The reality is that once we have the modeling input parameters, we can use them in many different scenarios, so in the end modeling would generally be a much more cost-effective way forward.
Please don't get me wrong, models are still very valuable tools but they could be so much more powerful and useful if properly developed with research done as a public works project in the general shared interest.  
Indeed, it was with this in mind that I spent quite a bit of time in Italy starting in 2004 in an effort to organize such an effort.  Given an introduction by my friend and colleague, Bert Hakkinen, I began working with Stylianos Kephalopoulos, who was head of the Physical and Chemical Exposure Unit of the Institute for Health and Consumer Protection (IHCP/PCE) at the Joint Research Centre (JRC) of the European Commission in Ispra which is just north of Milan.  
The REACh regulation was happening in Europe and it was obvious to many that exposure assessment tools needed to be developed to help with the implementation of this ground-breaking legislation.
Together we first organized a pre-workshop to set up the questions and issues, and then later a series of 5 simultaneous workshops on the general subject of modeling that happened in June 2005 in Intra, Italy. I was an organizer and moderator for the pre-workshop and for the workshop on model “Source Characterization”, since I had always seen this as a vital research need. For this workshop on Source Characterization, we invited and gathered modelers from all over the world, with the following folks coming to the workshop:

Arvanitis A.   JRC/IHCP/PCE (EU) (Rapporteur)
Bruinen de Bruin Y    JRC/IHCP/PCE (EU)
Delmaar C.   RIVM (Netherlands)
Flessner C.   EPA (USA)
Hanninen O.   KTL (Finland)
Hubal E. Cohen   EPA (USA)
Jantunen M.   KTL (Finland)
Jayjock M.   The Lifeline Group (USA) (Moderator)
Kephalopoulos S.   JRC/IHCP/PCE (EU) (Co-ordinator)
Koistinen K.   JRC/IHCP/PCE (EU)
Little J.   Virginia Polytechnic Inst. (USA)
Mason M.   EPA (USA)
Matoba Y.   Sumitomo (Japan)
McKone T.   University of California (USA)
Nazaroff W.   University of California (USA)
Pandian M.    Infoscientific.com (USA)
Price P.        The Lifeline Group (USA)
Shade W.     Rohm and Haas, Co (USA)
Sheldon L.   EPA (USA)
Sutcliffe R.  Health Canada (CAN)
Won D.        National Research Council (CAN)
Wu K.          University of Taiwan (Taiwan)
Zhang Y.     Tsinghua University (China)

Quite a few other fine modelers could not make this workshop but contributed to the report.

I must tell you that this was a remarkably talented and energetic group and it was all I could do to keep up with the ideas coming out of this critical mass of world-class modelers.   The main conclusions of our deliberations are presented below:

“It is the recommendation of the Workshop participants that the work products presented herein to be used in the systematic development of human exposure models for their use in a tiered approach to exposure/risk assessment. 
Given that the 5 bins presented herein represent a consensus taxonomy or universe of sources, the workshop participants advise that a reasonably representative subset of this comprehensive listing be selected for specific laboratory analysis and model development. It was further suggested that exposure models designed to describe these sources of exposure and the transport and fate of substances should be constructed using a step-wise approach as outlined in this report.”

In essence the group determined that there was no reasonably inclusive outline description of source types and certainly no systematic research effort to characterize them.   The two-day workshop resulted in the following primary work products: 
  • Identification of existing source sub-models: presented in the pre-workshop report and references
  • A defined Taxonomy of Sources
  • Identification and definition of the attributes and characteristics of First Principle, Mechanistic Source and Transport/Fate Models to be developed in a tiered approach 
All of the details of these outcomes are described in the 104 page workshop report which I will send to anyone requesting it: mjayjock@gmail.com
This work and report are almost 10 years old.   From my perspective some progress has been made primarily from the work of Drs. Bill Nazaroff and John Little and their colleagues in the characterization of indoor air sources.    I think even Bill and John will admit that the vast majority of work that we outlined in this workshop has not been started.   From my perspective the effective implementation of REACh continues to limp along without these tools.  Any effective re-authorization of TSCA would also require the fruits of this research. 
As usual, nothing really is going to happen without committed resources ($). I simply plan to pull this report out every few years, dust it off and remind folks that it is here. If we, as a society, are really serious about doing a comprehensive job of human health risk assessment for chemicals, we will ultimately need to develop these cost-effective tools.