Sunday, February 22, 2015

Inhalation Exposure from Spills Modeling – Part II

A few years ago I was fortunate enough to be hired by Dr. Perry Logan of the 3M Company to do some solvent spill modeling. Perry is a great collaborator, and this partnership resulted in a considerable amount of experimental work and a paper published online and in hard copy as an “open access” document in the Annals of Occupational Hygiene. For me the really neat part of this work was that the laboratory experiments were done by very savvy technologists and the data they generated really showed the validity of the modeling approach. I would be happy to send a pdf of this publication, along with the Word document of the extensive supplementary material published online (mostly laboratory test details), to anyone who asks at: mjayjock@gmail.com

During this work we looked at evaporating liquids with constant sources, such as might occur from an open vessel or deep pool. We also examined first-order decreasing sources, such as typically occur when thin spills initially spread out on a floor or counter top. The first type of source (i.e., constant) would be generally applicable to "non-spill" scenarios with relatively deep evaporating liquids.

Some of the important learnings from this work for me are:

These solvents evaporated so rapidly after being spilled that the exposure was essentially a bolus occurring in a very short time frame (minutes). As I have written in previous blogs, bolus exposures are not often, or not well, considered by the IH and risk assessment community in most workplace scenarios.

Given no STEL or Ceiling value for a particular compound of interest, one was compelled to use the ACGIH Excursion Limit as outlined in the TLV Booklet:

“Excursions in worker exposure levels may exceed 3 times the TLV-TWA for not more than a total of 30 minutes during a workday, and under no circumstances should they exceed 5 times the TLV-TWA, provided that the [8 hour] TLV-TWA is not exceeded.”

In the case of these very volatile solvents, this de facto exposure limit was exceeded in 1 L spills of every solvent under consideration because they evaporated so quickly.

Given that situation, evacuation is the most appropriate response to any spill larger than a determined threshold quantity. That threshold can be estimated using the techniques described in the paper and its supplementary material.

The practical conclusions from this work of potential use to practicing Industrial Hygienists are presented below:

Clearly, the chamber methods used in this work will not be available to all Industrial Hygienists. However, the critical factor for determining both constant and first-order decreasing evaporation rates for single-component liquids resides in the simple gravimetric measurement of evaporative solvent weight loss with time. The practicing hygienist should consider conducting gravimetric measurements of evaporation rates for both types of models (constant rate for constant sources and decreasing rates for spills) using tools that should be readily available to most investigators. This would involve placing a small container on a scale accurate to within 0.1 g in a laboratory hood. The liquid of interest would be placed in the container and mass loss recorded versus time.

The type of scenario to be modeled would determine the specific experimental setup. A constant emission source, such as an open vessel of evaporating solvent, could be simulated with a petri dish or bottle cap having a small surface-area-to-volume ratio, in an attempt to produce a constant rate of evaporation from a constrained surface area. Simulation of a spill (an exponentially decreasing source) might use a petri dish, paint can lid, or floor tile in which the solvent thickness represents a realistic value, achieved by using a spill volume that barely reaches the perimeter of the selected surface.

For the constant evaporation rate experiments, given a known surface area and representative air movement over the liquid (measured with an anemometer), one can readily calculate the evaporation rate per unit area. For the exponentially decreasing sources, alpha (α) can also be calculated. The details of both calculation techniques are described in the available supplementary material.
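For anyone who wants to try it, below is a minimal sketch in Python (using only NumPy) of how such bench-top weight-loss records could be reduced to the two quantities of interest. All of the numbers are made up for illustration; your own balance readings, dish area, and time steps would replace them.

    import numpy as np

    # --- Constant-source experiment (deep liquid, constrained surface area) ---
    # Hypothetical balance readings: mass loss should be essentially linear.
    t1 = np.array([0.0, 5.0, 10.0, 15.0, 20.0])        # minutes
    m1 = np.array([80.0, 78.1, 76.2, 74.2, 72.3])      # grams remaining
    slope = np.polyfit(t1, m1, 1)[0]                   # g/min (negative)
    area = 20.0                                        # cm2, assumed dish area
    print(f"constant rate: {-slope:.3f} g/min, {-slope/area:.4f} g/min/cm2")

    # --- Spill experiment (thin layer, exponentially decreasing source) ---
    # mass(t) = M0*exp(-alpha*t), so ln(mass) vs. t is linear with slope -alpha.
    t2 = np.array([0.0, 1.0, 2.0, 4.0, 6.0, 8.0, 10.0])       # minutes
    m2 = np.array([50.0, 42.1, 35.4, 25.1, 17.8, 12.6, 8.9])  # grams remaining
    alpha = -np.polyfit(t2, np.log(m2), 1)[0]
    print(f"alpha = {alpha:.3f} per min, half-life = {np.log(2)/alpha:.1f} min")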

This blog lives on your questions and your experience as exposure assessors relative to the application of modeling techniques. It will ultimately slow and stop without this participation. Please let me know what you think about the above information, whether you might consider doing this, and what would keep you from it. That could, in turn, help me determine what information, specific approach, or calculation advice, if any, might be needed.


Sunday, February 15, 2015

Evaporation Rate of Small Spills

When we spill a volatile chemical we have a problem relative to potential inhalation exposure. Depending on the specific situation, we need to answer the following questions: Do we clean it up immediately, or do we simply stand back and let it evaporate? Do I (or we) need to get out of the room or the area? What is my exposure/risk potential, and that of others in the area, if I do not evacuate and stay to clean it up?

It is such questions that modelers live on! In order to answer them, however, they need some reasonable inputs relative to the evaporation rate, the size of the spill and the room, the ventilation rate, and the rate of linear air movement within the room. A very useful model for this scenario is the two-zone model developed and promoted by Dr. Mark Nicas and available in the slick freeware spreadsheet IH MOD. Please see some of my previous blogs on these topics if you are interested in learning more. Like all models, the two-zone model needs to be fed with the appropriate inputs.

The basic idea around the two zone model is to put both the source of the exposure and the breathing zone of the exposed person into the inner or near-field zone.  It is the geometry of the scenario that determines the size and shape of the near-field.  For example, someone spraying the hair on their head with hair-spray would have a near-field well described by a sphere of about 0.5 m diameter around their head.   Someone cleaning up a spill by hand might have a near-field described as a hemisphere with a radius equal to an arm’s length.
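For what it is worth, the near-field volumes implied by those two geometries are easy to compute. A quick sketch, with the arm’s length (0.6 m) being my own assumed value:

    import math

    # Sphere around the head while spraying: 0.5 m diameter
    r_sphere = 0.25                                  # m
    v_sphere = (4/3) * math.pi * r_sphere**3
    print(f"spray near-field: {v_sphere:.3f} m3")    # ~0.065 m3

    # Hemisphere at arm's length while wiping up a spill (assumed 0.6 m reach)
    r_arm = 0.6                                      # m
    v_hemi = (2/3) * math.pi * r_arm**3
    print(f"cleanup near-field: {v_hemi:.3f} m3")    # ~0.45 m3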

Some of the inputs needed for the two-zone model of a spill are usually readily available. Let’s say our hypothetical spill of liquid has a volume of 100 cm3. If we assume a circular spill 1 m in diameter, that calculates out to a spill thickness of a little over 0.1 mm. That seems reasonable assuming it is on a relatively nonporous surface like tile or finished cement. We can also approximate the ventilation rate. If it is a laboratory or industrial area, then 3-5 mixing air changes per hour seems reasonable; if it is a residential area with doors and windows closed, then about 1/10th this value would be typical. Room size is easy to get. So what are we missing? Answer: the evaporation rate.
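The thickness arithmetic is worth making explicit, since the same surface area and volume also give the SA/VOL ratio that feeds the evaporation rate estimate discussed below. A quick sketch:

    import math

    volume_cm3 = 100.0
    diameter_m = 1.0
    area_cm2 = math.pi * (diameter_m * 100 / 2)**2        # ~7854 cm2
    thickness_mm = volume_cm3 / area_cm2 * 10
    print(f"thickness: {thickness_mm:.2f} mm")            # ~0.13 mm
    print(f"SA/VOL: {area_cm2 / volume_cm3:.1f} per cm")  # ~78.5 cm-1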

This brings us to the subject of sub-models, or models that are used to feed other models. Some evaporating sources are essentially constant within the time frame of any one-day exposure. A prime example would be an open drum. Spills, on the other hand, are typically not constant. That is, they typically shrink with time as they evaporate. As they shrink, their rate of generation decreases until it ceases entirely when the liquid is all gone. A model that seems to do a reasonably good job of mathematically describing this situation is the first-order decay model. First-order decay was also discussed in detail in a previous blog if you are interested in going back. It is described here briefly as a rate that is dependent on, and directly proportional to, how much of the original spill remains. That is, the rate is at its maximum in the beginning, when all or 100% of the spill is available to evaporate. After 10% evaporates, the evaporation rate is 90% of the maximum. After 50% evaporates, the rate is half that of the original. The time it takes to get to 50% is the half-life of this first-order kinetic model. After 7 half-lives, less than 1% of the spill remains and the evaporation rate is less than 1% of the maximum. Theoretically you never get to zero, while in reality you certainly do; after 7-8 half-lives it is essentially gone. In any event, this model appears to do a credible job of describing the evaporation rate of spills when compared against real-world data.

The basic model is:

Evaporation Rate = (Initial Evaporation Rate)(exp(-α t))

Since the remaining mass follows M(t) = (M0)(exp(-α t)), the evaporation rate is simply -dM/dt, which gives:
Evaporation Rate = (α)(M0)(exp(-α t))

Where:
  • M0 = initial mass of the spill
  • α = the evaporation rate constant, i.e., the proportion of the remaining mass evaporating per unit time, usually per minute. Thus α = 0.10 min⁻¹ means that roughly 10% of the remaining mass will evaporate every minute.
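To make the arithmetic concrete, here is a minimal sketch of the model with an assumed α of 0.10 per minute and an assumed 100 g spill; it reproduces the half-life behavior described above:

    import numpy as np

    M0 = 100.0       # g, assumed initial spill mass
    alpha = 0.10     # per minute, assumed rate constant

    t = np.array([0.0, 7.0, 14.0, 21.0, 48.5])      # minutes (~0 to 7 half-lives)
    rate = alpha * M0 * np.exp(-alpha * t)          # evaporation rate, g/min
    remaining = M0 * np.exp(-alpha * t)             # mass left, g

    print(f"half-life = {np.log(2)/alpha:.1f} min")  # ~6.9 min
    for ti, ri, mi in zip(t, rate, remaining):
        print(f"t = {ti:5.1f} min: rate = {ri:5.2f} g/min, remaining = {mi:5.1f} g")

At 48.5 minutes (about 7 half-lives) less than 1 g of the original 100 g remains, matching the "essentially gone" observation above.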

Drs. Nicas and Keil wrote a seminal paper about α in the context of modeling spills, which I will be happy to send to whoever asks me for it at: mjayjock@gmail.com.

This paper presents a number of values of α for common laboratory solvents and also offers the following data-fitting relationship for other volatile organic compounds (a worked example follows the definitions below):

  α (min⁻¹) = 0.000524 Pv + 0.0108 (SA/VOL)
Where:
  • Pv = saturation vapor pressure in mm Hg at 20 C
  • SA/VOL = initial surface-area-to-volume ratio of the spill, cm⁻¹
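As a purely illustrative calculation (my numbers, not values from the paper): take acetone, with a saturation vapor pressure of roughly 185 mm Hg at 20 C, spilled as in the earlier example (100 cm3 spread over a 1 m diameter circle, SA/VOL ≈ 78.5 cm⁻¹):

    # Nicas & Keil data-fitting relationship (see above)
    Pv = 185.0      # mm Hg, approximate vapor pressure of acetone at 20 C
    sa_vol = 78.5   # cm-1, from the 100 cm3 / 1 m diameter example above

    alpha = 0.000524 * Pv + 0.0108 * sa_vol
    print(f"alpha = {alpha:.2f} per minute")       # ~0.94 min-1
    print(f"half-life = {0.693 / alpha:.2f} min")  # well under a minute

A half-life well under one minute is consistent with the bolus-type exposures described in the spill work above.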

An experimental evaluation of this algorithm with n-pentane is presented in the book Mathematical Models for Estimating Occupational Exposure to Chemicals, 2nd Edition, AIHA Press.

What is so useful about this model, as implemented in IH MOD, is that it provides estimates of the PEAK breathing zone concentration along with the time-weighted average for whatever time frame you want (e.g., 15 minutes or 8 hours), to be compared with whatever exposure limit (Ceiling, 15-minute STEL, or 8 or 24 hour TWA) you deem appropriate.
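IH MOD implements the full two-zone equations; as a much simpler illustration of the same idea, here is a single-zone (completely well-mixed room) sketch with my own assumed inputs. It is not the two-zone model, but it shows how a peak and a TWA fall out of the decaying-source solution:

    import numpy as np

    # Illustrative single-zone (well-mixed room) inputs -- my own assumptions:
    M0 = 79_000.0      # mg of solvent, ~100 cm3 of acetone at 0.79 g/cm3
    alpha = 0.94       # per minute, from the correlation sketch above
    V = 50.0           # m3, room volume
    Q = 2.5            # m3/min ventilation (= 3 air changes per hour here)

    lam = Q / V        # ventilation rate constant, per minute
    t = np.linspace(0.0, 60.0, 6001)   # minutes, 0.01 min steps

    # Analytic solution of V*dC/dt = alpha*M0*exp(-alpha*t) - Q*C, with C(0) = 0
    C = alpha * M0 * (np.exp(-alpha * t) - np.exp(-lam * t)) / (V * (lam - alpha))

    print(f"peak: {C.max():.0f} mg/m3 at {t[C.argmax()]:.1f} min")
    print(f"15-min TWA: {C[t <= 15.0].mean():.0f} mg/m3")

The two-zone version would show a still higher peak in the near-field for the person standing over the spill; IH MOD handles that arithmetic for you.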

In a later blog I will discuss other work done and published on spill modeling in an industrial setting where the risk was driven by peak exposures.


Monday, February 9, 2015

Introduction to Monte Carlo Simulations for Exposure Assessment

Exposure assessment modeling has its stars. Last week I discussed some of the work of Dr. Zhishi Guo; this week it is Thomas W. Armstrong, PhD, CIH and AIHA Fellow. I am happy and grateful to count Tom as a friend and colleague. He is a steady and tireless worker in Industrial Hygiene education and in the development of tools for the profession of exposure assessment. He gives freely of his time in these endeavors, and his overall approach and judgment in the realm of exposure and risk assessment is first-rate. It is his most recent contribution that is being highlighted here today; namely, an article in this month’s AIHA Synergist on Monte Carlo Simulation (MCS) and its role in making decisions.

Like all good educational pieces, it is clearly written, making it easy for the beginner to understand. Anyone can see it online at: http://synergist.aiha.org/Monte-Carlo-Risk-Assessment

I just wanted to highlight some of the lower-level details of the process, along with some of my basic experience with the commercial software mentioned in the article. For me the really neat aspect of most MCS software is that it sits on top of Microsoft Excel as an add-on. Many, if not most, of us know and love Excel as a remarkably capable program that can do some very complex calculations. Indeed, given all the functionality that it has, Excel should be a basic tool of any technologist. What it cannot do easily by itself is treat any cell as a DISTRIBUTION rather than a single value, or a single value resulting from a calculation within the spreadsheet.

Let us assume that we have, say, cell B3 in a normal spreadsheet with the value 7. What MCS add-on software can do is easily and simply allow you to describe that cell (B3) as a DISTRIBUTION. Let us say we want it to be a normal (Gaussian) distribution with a mean of 7 and a standard deviation of 2. The MCS add-on will allow you to do this. You could just as easily have chosen a uniform distribution with a minimum of 2 and a maximum of 12. In the case of the normal distribution (mean = 7, sd = 2), the MCS software samples the cell and gets a value constrained by the distribution; that is, it literally samples the distribution. Say, for example, this single sample returns the value 5.2. That value could be used elsewhere in the spreadsheet with other values that were either constant or also sampled from defined distributions.

After all this, the spreadsheet has done exactly ONE realization, without any iteration. Set the MCS software to 10,000 iterations and the PC goes through the sampling and calculations again and again automatically and never gets tired. It will keep a record of the output distributions of the cells that you choose. Anything that can be represented as a single deterministic outcome calculation in Excel can now be presented as a DISTRIBUTION of outcomes. It is, IMHO, technological magic!
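For anyone curious to see the bare mechanics outside of Excel, here is a minimal sketch in Python, with NumPy’s random number generator standing in for the add-on’s sampling engine and an arbitrary downstream calculation standing in for the rest of the spreadsheet:

    import numpy as np

    rng = np.random.default_rng(seed=1)   # seeded so the run is repeatable
    n = 10_000                            # iterations, as in the text

    # Cell B3 as a DISTRIBUTION rather than a single value:
    b3 = rng.normal(loc=7.0, scale=2.0, size=n)
    # b3 = rng.uniform(low=2.0, high=12.0, size=n)  # the alternative mentioned

    # Some downstream spreadsheet calculation using B3 (illustrative only):
    result = 3.0 * b3 + 5.0

    # The output is now a distribution, summarized rather than a single number:
    print(f"mean = {result.mean():.2f}")
    print(f"5th/50th/95th percentiles = {np.percentile(result, [5, 50, 95])}")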

My first IBM-compatible PC had an early-generation 8086 CPU, well before the 286, 386, or Pentium CPUs that are themselves now considered ancient in our world of multi-core monster processors. The commercial MCS software in those early days cost a few hundred bucks and came on a single 3.5” floppy. I would set up a spreadsheet to do 10,000 MCS iterations and leave to have dinner! It was usually complete when I returned an hour or two later. Today, a typical CPU will do 10,000 iterations in seconds!

Today, commercial MCS software is also considerably more expensive, and it typically requires an annual fee from commercial (not educational) customers to keep it up to date through the various evolving versions of Excel. If you have the need for all its “bells and whistles” (and there are a lot) and a company or client to purchase it, then I think the commercial software makes sense. If not, then you may want to look into the freeware versions that Tom mentions in his piece.

If any of the readers of this blog have considerable experience with these freeware versions, either Simular or Simulacion, I would love to hear the details of your experience which I would be happy to pass on here.


Sunday, February 1, 2015

Dr. Zhishi Guo and the Programs IAQX, PARAMS v1.1 and More

I can count on the fingers of my two hands the folks I know who have provided free, user-friendly modeling programs in the service of human health exposure/risk assessment. Dr. Zhishi Guo is one such person. He retired from US EPA last March and is currently serving as the Deputy Editor for the journal Indoor and Built Environment while also doing some consulting. Last week this blog highlighted one of his programs, i-SVOC, as an important tool. I found it to be a remarkable piece of freeware with a very slick interface that attempts to determine the time course and fate of any semi-volatile organic compound (SVOC) of concern in all the various compartments (sources, sinks, settled dust and airborne particulate matter) extant indoors.

Although I have yet to really put it through its paces, Shen Tian, P.E., an Environmental, Health and Safety Engineer at Bayer Material Science who initially identified it to me, has worked with i-SVOC and reports that it requires a number of input variables to feed it properly. Shen found that another program from Dr. Guo, PARAMS 1.0, can be used to estimate quite a few of the parameters required for input. Using a Google search, I had trouble locating a link for PARAMS 1.0 that was not broken, so I emailed Dr. Guo asking for his help. He responded:

“Last year, I updated programs IAQX and PARAMS to version 1.1 for compatibility with the Windows 7 and 8. The installation files for the new version can be downloaded from http://www.epa.gov/nrmrl/appcd/mmd/iaq.html”.

I checked it out and indeed there is a wealth of freeware and information on this page. I downloaded everything I could from it. The descriptions on this page for the three programs offered on it are reproduced below:

IAQX v1.1: IAQX stands for Simulation Tool Kit for Indoor Air Quality and Inhalation Exposure. It is a Microsoft Windows-based indoor air quality (IAQ) simulation software package that complements and supplements existing IAQ simulation programs (such as RISK) and is designed mainly for advanced users. The IAQX package consists of five stand-alone simulation programs. In addition to performing conventional IAQ simulations, which compute the time/concentration profile and inhalation exposure, IAQX can estimate the adequate ventilation rate when certain air quality criteria are provided by the user, a unique feature useful for product stewardship and risk management. IAQX will be developed in a cumulative manner and more special-purpose simulation programs will be added to the package in the future.

PARAMS 1.1: This Microsoft Windows-based computer program implements 30 methods for estimating the parameters in indoor emissions source models, which are an essential component of indoor air quality (IAQ) and exposure models. These methods fall into seven categories:

1. the properties of indoor air,
2. the first-order decay rate constants for solvent emissions from indoor coating materials,
3. gas-phase, liquid-phase, and overall mass transfer coefficients,
4. molar volume,
5. molecular diffusivity in air, liquid, and solid materials,
6. solid-air partition coefficient, and
7. vapor pressure and volatility for pure organic compounds and petroleum-based solvents and the properties of water.

Potential users include those who develop or use IAQ and exposure models, and those who develop or use quantitative structure-activity relationship (QSAR) models. In addition, many calculations are useful to researchers in areas other than indoor air quality. Users can benefit from this program in two ways: first, it serves as a handy tool by putting commonly used parameter estimation methods in one place; second, it saves users time by taking over tedious calculations.

RISK: The latest published version of the RISK computer model is designed to allow calculation of individual exposure to indoor air pollutants from sources. The model runs in the MS-Windows operating environment and is designed to calculate exposure due to individual, as opposed to population, activity patterns and source use. The model also provides the capability to calculate risk due to the calculated exposure.

Even as someone with a very keen and long-standing interest in inhalation exposure modeling, I must admit that I was only vaguely aware of the availability of these tools. Along with i-SVOC, they form a remarkable legacy for the work of Dr. Guo and his colleagues. I am highlighting them here to point out their existence in the hopes that more folks will use them. We can only hope that Dr. Guo continues to stay active in the science and that the EPA will continue to sponsor and encourage exposure assessment scientists of his caliber.