
Monday, September 26, 2016

Modeling Aerosol Exposures

I have gotten very few requests for blog topics since issuing the offer some time ago.  One such request came from Richard Quenneville, who asks how one might model exposure to aerosols or airborne particulates.

Aerosols are certainly different from vapors or gases, and the differences significantly complicate any attempt to model exposure to them.   Even relatively small aerosol particles (microns or tenths of microns) are much larger than the individual molecules that make up a gas or vapor.  This gives them different properties, at least in the following areas:
  •  They are typically more readily electrically charged, especially if they are generated by sliding along a surface (e.g., dust from transporting powder in a pneumatic tube).  This charge can affect the size distribution and sampling of the aerosol.
  •  With or without electrical charge, aerosol particles are often susceptible to combining with one another in a mechanism known as agglomeration.  This process, of course, changes the size distribution of the aerosol.
  •  Most important, because they have much more mass than vapor molecules, they have a settling velocity that increases with particle size; this, again, constantly changes the airborne size distribution of the aerosol over time.
  •  Because of their mass, airborne particles do NOT always make it into sampling orifices, thus biasing their measurement.

Assuming agglomeration is not happening on a time scale relevant to the potential exposure, one can estimate the time-interval concentration of any aerosol particle or size range of particles.   This is done by taking the average settling velocity of the particles in that size range and accounting for their loss from settling.   Typically this is done for particles settling from 2 meters in height to the floor.  If one is sure that the breathing zone remains at, say, 2 meters high, you can calculate the concentration loss from the horizontal volume at 2 meters height to, say, 1.8 meters.   If you do this over small enough time intervals, you can estimate a time-weighted average aerosol concentration for any time period, dependent on the nature of the aerosol source.
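
For those who want to see the bookkeeping, below is a minimal sketch of that interval-by-interval settling calculation in Python.  All of the specifics (size bins, starting concentrations, layer height, time step) are hypothetical placeholders of my choosing, and it assumes unit-density spheres settling per Stokes' law in still air, with no ongoing generation or agglomeration:

    # Sketch: interval-by-interval settling loss from a well-mixed layer.
    # Assumptions (mine, for illustration): unit-density spheres, Stokes'
    # law, still air, a 2-meter breathing-zone layer, no fresh generation
    # and no agglomeration after time zero. All bin values are hypothetical.

    MU_AIR = 1.81e-5   # dynamic viscosity of air at ~20 C, Pa*s
    RHO_P = 1000.0     # particle density, kg/m^3 (unit density assumed)
    G_ACC = 9.81       # gravitational acceleration, m/s^2

    def stokes_velocity(d_um):
        """Terminal settling velocity (m/s) of a sphere d_um microns across."""
        d_m = d_um * 1e-6
        return RHO_P * G_ACC * d_m ** 2 / (18.0 * MU_AIR)

    H = 2.0        # height of the layer being depleted, m
    DT = 1.0       # time step, s
    T_END = 600.0  # averaging period, s (10 minutes)

    # Hypothetical size bins: (mean diameter in microns, initial mg/m^3)
    bins = [(1.0, 0.5), (10.0, 1.0), (50.0, 0.8)]

    n_steps = int(T_END / DT)
    twa_total = 0.0
    for d_um, c0 in bins:
        v = stokes_velocity(d_um)
        loss_per_step = min(v * DT / H, 1.0)  # fraction settling out each step
        c, c_sum = c0, 0.0
        for _ in range(n_steps):
            c_sum += c
            c *= 1.0 - loss_per_step
        twa_bin = c_sum / n_steps
        twa_total += twa_bin
        print(f"{d_um:6.1f} um: v = {v:.2e} m/s, 10-min TWA = {twa_bin:.3f} mg/m^3")

    print(f"Total 10-min TWA across bins: {twa_total:.3f} mg/m^3")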

This brings up another complication of dealing with aerosols.  Compared to vapors, predicting the “release” or generation rate of particulate into the air is highly problematic because it depends on many undefined or unmeasured factors, such as inter-particle forces.  I have never been able to use first-principles models to predict this rate. Instead, we have had success experimentally determining the rate by simulating the mechanism of generation, measuring the resultant concentrations, and back-calculating the rate of generation.  I personally think this is what needs to happen for the exposure assessment of nanoparticles released to the air in various scenarios.
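
As one concrete way to do that back-calculation: if the test room or chamber can be treated as a single well-mixed box with ventilation rate Q and volume V, then C(t) = (G/Q)(1 - exp(-Qt/V)), and a measured concentration can be inverted for G.  The numbers below are hypothetical, and a real aerosol version would need an additional removal term for settling alongside ventilation:

    import math

    def generation_rate(c_meas, q, v, t):
        """Back-calculated generation rate (mg/min) from a measured
        concentration c_meas (mg/m^3) in a well-mixed room of volume v (m^3)
        ventilated at q (m^3/min), sampled t minutes after generation starts
        (assumes zero initial concentration and perfect mixing)."""
        return c_meas * q / (1.0 - math.exp(-q * t / v))

    # Hypothetical example: 2.0 mg/m^3 measured after 30 minutes in a
    # 100 m^3 room ventilated at 5 m^3/min.
    print(f"G = {generation_rate(2.0, 5.0, 100.0, 30.0):.1f} mg/min")  # ~12.9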

Please note, settling is dependent on the particle size distribution of the generated aerosol.  I have seen situations in plants that were literally “particle fountains,” with size distributions in which a significant portion of the particles were greater than 100 microns.  These particles hit the floor in a time frame of seconds, which dramatically lowers the total aerosol mass/volume.   Particles on the other end of the spectrum, e.g., nanoparticles, are going to remain airborne essentially indefinitely and will not settle at an appreciable rate in most scenarios.
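
To put rough, illustrative numbers on those time frames (my own back-of-the-envelope figures, assuming unit-density spheres in still air): Stokes' law gives a terminal velocity of v = (particle density) x g x d^2 / (18 x air viscosity).  For a 100-micron particle that works out to roughly 0.3 m/s, so a 2-meter fall takes on the order of seconds (Stokes' law overpredicts somewhat at this size, but the order of magnitude holds).  For a 0.1-micron particle the same formula gives about 3 x 10^-7 m/s; even allowing for the slip correction that applies at that size, settling 2 meters would take weeks, which is effectively never in any workplace scenario.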

Finally, aerosols, especially insoluble aerosols, will deposit in the respiratory tract based on particle size.  At the current time we have some aerosol exposure limits specified in terms of inhalable and respirable particulate.   These fractions are defined mathematically by the ACGIH, and the algorithms can be applied to the concentrations in the size intervals above to render the amount of aerosol that might be inhaled (inhalable mass concentration) or that might reach the deep pulmonary region of the lungs (respirable mass concentration).
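
For reference, here is a small Python sketch of those size-selective conventions as I understand them from the ACGIH/ISO definitions; anyone using this should verify the constants against the current TLV booklet:

    import math

    def inhalable_fraction(d):
        """ACGIH/ISO inhalable convention, d = aerodynamic diameter in
        microns, defined for d <= 100 um."""
        return 0.5 * (1.0 + math.exp(-0.06 * d))

    def respirable_fraction(d):
        """Respirable convention: inhalable fraction times penetration
        through a cumulative-lognormal curve (median 4.25 um, GSD 1.5)."""
        x = math.log(d / 4.25) / math.log(1.5)
        phi = 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))  # std normal CDF
        return inhalable_fraction(d) * (1.0 - phi)

    for d in (1.0, 4.0, 10.0, 25.0, 100.0):
        print(f"{d:6.1f} um: inhalable {inhalable_fraction(d):.2f}, "
              f"respirable {respirable_fraction(d):.2f}")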

The above analysis sounds daunting mathematically, and indeed it is not simple; however, it is nothing that an Excel spreadsheet cannot handle with relative ease, given the proper scenario-specific inputs: dimensions, generation rate, initial particle size distribution, size-interval-specific settling velocities, and the ACGIH algorithms.   Like all models it is not exact but, I believe, it is accurate enough to be useful.
 


Thursday, May 26, 2016

Exposure Modeling will Make You a Super Star


I see spectacular headlines when I am checking out at the supermarket.  Indeed, spectacular headlines seem to work for the National Enquirer, so I hoped that they would work for me here.

I have literally grown old extolling the virtues and power of exposure assessment modeling for industrial hygienists; however, my friend and colleague Perry Logan tells me that what I have done is not enough.  He advises that one has to mention something many, many times before it sinks in.  I do not remember how many repetitions Perry suggested, but it was many more than a few.   Also, committing to using models is not a trivial decision; it takes considerable effort.   Thus, Perry is almost certainly correct: I have not promoted modeling enough.

I may be older but I am not done, and I am going to list some of the very basic (and some self-serving) reasons an IH should learn exposure modeling:

    It will definitely enhance your standing with your employer and/or your clients

You will present yourself as “one of the few”: a relatively rare professional who can take the factors that cause exposure and apply them in a systematic manner to render predictions of exposure and risk.  This often can be done without a lot of measured data, which managers seem to particularly like.

Indeed, many people see models as technological magic and those who use them as wizards.  It often does not hurt you or your career to subtly let them think this is so even while you might tell them otherwise.

You will have confidence born of the knowledge and ability that you personally gained to estimate exposures using models, and no one can take that from you.

These models are, for the most part, made up of first principles, that is, basic laws of nature like the conservation of mass, and are therefore reasonably true and useful on their face.   Clearly they can be both wrong and misused, but at their core they are aimed at being reasoned and reasonable descriptors of reality, or at least the reality that we know.  If they fall short, they provide a mechanism and framework for fixing themselves.  They can become complicated, but they can also be “pulled apart” so that their pieces can be examined individually as to whether they make sense.

    Complex mathematical operations are no longer an issue with available free software.

I am prone to math errors.  Running long strings of calculations by hand invariably led me to simple mistakes and wrong answers.  To save my credibility, I learned early in my career that programming the calculation steps into a spreadsheet or BASIC program took more time initially but assured I had a tool that would not produce math errors.   That early effort has grown dramatically, with other talented colleagues (like Tom Armstrong and Daniel Drolet) taking up the cause, and the result is IH MOD, a free Excel spreadsheet with most any modeling calculation you might need.

    Like any other skill (or Rome), modeling acumen will not be built in a day, but the inputs can be structured to be very simple at first and then build on themselves.

Simple models can be learned in a day (or even in less than an hour).  They are typically less useful than more complicated models; however, they have real uses and, most important, they form the basis for building your knowledge, background, comfort level, and skill base in this critical area.   How many times have you climbed a long hill (or task) one step at a time, only to look back after a while and appreciate how far you have come?

If you go back through the earlier entries of this blog, you can hopefully see this progression.   Start with an equilibrium model and build from there.   Perhaps the simplest model I know is the equilibrium model:  C = G/Q, or concentration (C) equals the generation rate (G) of a contaminant divided by the ventilation rate (Q).    If you do not understand this model, PLEASE write to me and let me know where you get lost.  I will put together a brief blog that goes into enough detail to explain it.  Once you have this model, we will move on to more complicated models, but I need your help: give me feedback via email (mjayjock@gmail.com) as to whether the lessons are working and, if not, where you get lost.
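
To make that equilibrium model concrete, here it is as a few lines of Python, with hypothetical numbers: a contaminant released at 100 mg/min into a space ventilated at 20 m^3/min gives 5 mg/m^3 at equilibrium:

    def equilibrium_concentration(g_mg_per_min, q_m3_per_min):
        """C = G/Q: steady-state concentration (mg/m^3) from the generation
        rate G (mg/min) divided by the ventilation rate Q (m^3/min)."""
        return g_mg_per_min / q_m3_per_min

    print(equilibrium_concentration(100.0, 20.0), "mg/m^3")  # -> 5.0 mg/m^3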

If any of you are willing to start this journey, I am willing to teach you in short 10-20 minute blogs.

I cannot think of anything that has helped my career more than an interest and understanding of exposure assessment models.

   

Thursday, April 14, 2016

Risk Assessment Without Numbers

Adam Finkel recently sent me a Commentary from an advance access publication (January 2016) of the Annals of Occupational Hygiene entitled “Hygiene Without Numbers” by Hans Kromhout.    Adam knows me and knows that I could not read such a piece and NOT comment.

I have never met and do not know Dr. Hans Kromhout except by reputation, but I found his two pages of comments to be right on the mark; I would be happy to forward them to anyone requesting it of me at mjayjock@gmail.com.

Hans Kromhout described control banding as a "numberless intervention" and generally criticized its adequacy.  Indeed, I have always been frankly wary of control banding, which, in my opinion, uses available and typically quite limited data to take educated guesses at ranges of toxicity and thereby set the level of control needed at various bands of exposure.  When combined with “exposure banding,” one takes a similar banding approach to estimating the level of exposure that might be extant, to get some notion of risk.   I CAN see this as the FIRST step in a process aimed at understanding and controlling risk for a large number of chemicals but, like Dr. Kromhout, I do not see it as the end game.  There is simply too much uncertainty related to underestimation or, on the other side, overestimation of risk, and both conditions are unacceptable for obvious reasons.

Everyone wants to “add value” to their organization and be “cost-effective.”  These are well-worn and, on their face, not unreasonable precepts, enshrined in our psyche over at least the past 20-30 years, especially in corporate America.  Indeed, I believe these personal/professional drivers have fed the rush to banding.   The bottom line for me is that, according to my mother, there is no free lunch.  When one is committed to trying to understand the risk to human health from exposure to the vast majority of chemicals in commerce, we face an enormous shortfall in basic information on both the toxicity of and the exposure associated with our interactions with these chemicals in a modern society.  I see banding as a response to the pressures that result from this uncomfortable situation.  As indicated above, I see it as a positive initial move but believe that, in the majority of cases, it does not reasonably or adequately assess the risk.

Risk assessment desperately needs data, and it needs the subsequent modeling of those data, the application of the scientific method, to interpret them and adequately estimate the level of risk.   That is, we need data on both toxicity and exposure, accompanied by modeling of these data to inform our knowledge of, and confident decisions concerning, the risk posed.   Like food and water, I believe that freedom from unacceptable chemical risk should be considered a human need, and its importance and provision should be recognized and addressed as such.

Spending the money to get the “numbers” will be much more expensive than proceeding with banding as the end game; however, it will be “cost-effective” relative to preventing unacceptable exposures and risk (or over-regulation).  This should be an important precept for any society that truly values both its general economic health and the physical health of its citizens.