
Monday, September 26, 2016

Modeling Aerosol Exposures

I have gotten very few requests for blog topics since issuing the offer some time ago.  One such request has come from Richard Quenneville, who asks how one might model aerosol or airborne particulate exposure.

Aerosols are certainly different from vapors or gases, and the differences significantly complicate any attempt to model exposure to them.  Even relatively small aerosol particles (microns or tenths of microns) are much larger than the individual molecules that make up a gas or vapor.  This gives them different properties, at least in the following areas:
  • They are typically more readily electrically charged, especially if they are generated by sliding along a surface (e.g., dust from transporting powder in a pneumatic tube).  This charge can affect the size distribution and sampling of the aerosol.
  • With or without electrical charge, aerosol particles are often susceptible to combining with one another in a mechanism known as agglomeration.  This process, of course, changes the size distribution of the aerosol.
  • Most important, because they have much more mass than vapor molecules, they have a settling velocity that increases with particle size; this, again, constantly changes the airborne size distribution of the aerosol over time.
  • Because of their mass, airborne particles do NOT always make it into sampling orifices, thus biasing their measurement.

Assuming agglomeration is not happening in a time frame relevant to the potential exposure, one can estimate the concentration of any aerosol particle or size range of particles over any time interval.   This is done by taking the average settling velocity of the particles in that size range and accounting for their loss from settling.   Typically this is done for particles settling from a height of 2 meters to the floor.  If one is sure that the breathing zone remains at, say, 2 meters in height, you can calculate the concentration lost as particles settle out of the horizontal slab of air between 2 meters and, say, 1.8 meters.   If you do this over small enough time intervals, you can estimate a time-weighted average aerosol concentration for any time period, depending on the nature of the aerosol source.
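
As a purely illustrative sketch (my own, not the author's spreadsheet or any published tool), the Python fragment below treats settling out of a well-mixed 2-meter layer as a first-order loss for a single size bin and computes a time-weighted average; the particle density, size and starting concentration are invented for the example:

import math

def stokes_settling_velocity(d_um, rho_p=2000.0, mu=1.81e-5, g=9.81):
    # Terminal settling velocity (m/s) of a sphere of diameter d_um
    # (micrometers) in air, per Stokes' law (reasonable for ~1-100 um).
    d_m = d_um * 1.0e-6
    return rho_p * g * d_m ** 2 / (18.0 * mu)

def settling_twa(c0, d_um, height_m=2.0, t_end_s=600.0, dt_s=1.0):
    # C(t) = C0 * exp(-(v_s / H) * t) for one size bin in a well-mixed
    # layer of depth H, then a time-weighted average via the trapezoid rule.
    k = stokes_settling_velocity(d_um) / height_m   # first-order loss, 1/s
    n = int(t_end_s / dt_s)
    conc = [c0 * math.exp(-k * i * dt_s) for i in range(n + 1)]
    return sum((conc[i] + conc[i + 1]) / 2.0 for i in range(n)) / n

# Example: 5 mg/m3 of 20-micron dust settling over a 10-minute period.
print(round(settling_twa(c0=5.0, d_um=20.0), 2), "mg/m3 TWA")

Repeating the calculation for each size bin and summing the bins reproduces the constantly shifting size distribution and total concentration described above.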

This brings up another complication of dealing with aerosols.  Compared to vapors, predicting the “release” or generation rate of particulate into the air is highly problematic because it depends on many undefined or unmeasured factors such as inter-particle forces.  I have never been able to use first-principle models to predict this rate. Instead, we have had success determining this rate experimentally: simulating the mechanism of generation, measuring the resultant concentrations and back-calculating the rate of generation.  I personally think this is what needs to happen for the exposure assessment of nanoparticles released to the air in various scenarios.
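
As a hedged sketch of that back-calculation (the numbers are invented, and a real test would also account for settling and wall losses), the idea at its simplest is to run the generation mechanism in a ventilated chamber until the well-mixed concentration is steady and then invert the equilibrium balance:

# Back-calculating a generation rate from a simulated-generation chamber test.
# At steady state in a well-mixed, ventilated space, G = C * Q; settling and
# wall losses, ignored here, would add loss terms to the balance.
C_measured = 2.5   # mg/m3, measured steady-state concentration (invented)
Q_chamber = 0.5    # m3/min, chamber ventilation rate (invented)
G = C_measured * Q_chamber
print(G, "mg/min back-calculated generation rate")   # 1.25 mg/min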

Please note, settling is dependent on the particle size distribution of the generated aerosol.  I have seen situations in plants that were literally “particle fountains,” with particle size distributions in which a significant portion of the particles was greater than 100 microns.  These particles hit the floor in a time frame of seconds, which dramatically lowers the total aerosol mass per volume.   Particles on the other end of the spectrum, e.g., nanoparticles, are going to remain essentially airborne and not settle at an appreciable rate in most scenarios.

Finally, aerosols, especially insoluble aerosols, will deposit in the respiratory tract based on particle size.  At the current time we have some aerosol exposure limits specified in terms of total and respirable particulate.   These size fractions are defined mathematically by the ACGIH, and these algorithms can be applied to the concentrations in the size intervals above to render the amount of aerosol that might be inhaled (inhalable mass concentration) or that might reach the deep pulmonary region of the lungs (respirable mass concentration).
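
For readers who want to see the shape of these criteria, the sketch below encodes the inhalable and respirable conventions as they are commonly published (inhalable fraction: 0.5·(1 + exp(-0.06·d)); respirable fraction: the inhalable fraction times the complement of a cumulative lognormal with median 4.25 um and GSD 1.5).  Treat this as my paraphrase and check the current ACGIH TLV booklet before relying on it:

from math import erf, exp, log, sqrt

def inhalable_fraction(d_um):
    # Inhalable convention (aerodynamic diameter in um, valid to ~100 um).
    return 0.5 * (1.0 + exp(-0.06 * d_um))

def respirable_fraction(d_um):
    # Respirable convention: inhalable fraction times the complement of a
    # cumulative lognormal (median 4.25 um, GSD 1.5).
    z = log(d_um / 4.25) / log(1.5)
    return inhalable_fraction(d_um) * (1.0 - 0.5 * (1.0 + erf(z / sqrt(2.0))))

# Weight each modeled size bin by these fractions, then sum the bins to get
# inhalable and respirable mass concentrations.
for d in (1.0, 4.0, 10.0, 25.0):
    print(d, round(inhalable_fraction(d), 2), round(respirable_fraction(d), 2))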

The above analysis sounds daunting mathematically, and indeed it is not simple; however, it is nothing that an Excel spreadsheet cannot handle with relative ease given the proper inputs: scenario-specific dimensions, generation rate, initial particle size distribution, settling velocity for each particle size interval, and the ACGIH algorithms.   Like all models it is not exact but, I believe, it is accurate enough to be useful.
 


Thursday, May 26, 2016

Exposure Modeling will Make You a Super Star


I see spectacular headlines when I am checking out of the supermarket.  Indeed, spectacular headlines seem to work for the National Enquirer, so I hoped that they would work for me here.

I have literally grown old extolling the virtues and power of exposure assessment modeling for industrial hygienists; however, my friend and colleague Perry Logan tells me that what I have done is not enough.  He advises that one has to mention something many, many times before it sinks in.  I do not remember how many times Perry suggested, but it was many more than a few.   Also, committing to using models is not a trivial decision; it requires at least some considerable effort.   Thus, Perry is almost certainly correct: I have not promoted modeling enough.

I may be older but I am not done, and I am going to list some of the very basic (and somewhat self-serving) reasons an IH should get into learning exposure modeling:

    It will definitely enhance your standing with your employer and/or your clients

You will present yourself as “one of the few”: a relatively rare professional who can take the factors that cause and predict exposure and apply them in a systematic manner to render predictions of exposure and risk.  This often occurs without the need for a lot of data, which managers seem to particularly like.

Indeed, many people see models as technological magic and those who use them as wizards.  It often does not hurt you or your career to subtly let them think this is so even while you might tell them otherwise.

    You will have confidence born of the knowledge and ability that you personally gained to estimate exposures using models, and no one can take that from you.

These models are, for the most part, made up of first principles, that is, basic laws of nature like the conservation of mass, and are therefore pretty true and useful on their face.   Clearly they can be both wrong and misused, but at their core they are aimed at being reasoned and reasonable descriptors of reality, or at least the reality that we know.  If they fall short, they provide a mechanism and framework to fix themselves.  They can become complicated, but they can also be “pulled apart” so that their pieces can be examined individually as to whether they make sense.

    Complex mathematical operations are no longer an issue with available free software.

I am prone to math errors.  Running long strings of calculations has invariably led me to simple mistakes and wrong answers.  To save my credibility, I learned early in my career that programming the calculation steps into a spreadsheet or BASIC program took more time initially but assured that I had a tool that would not produce math errors.   That early effort has grown dramatically, with other talented colleagues (like Tom Armstrong and Daniel Drolet) taking up the cause, and the result is IH MOD, a free Excel spreadsheet with most any modeling calculation you might need.

    Like any other skill (or Rome), modeling acumen will not be built in a day, but the inputs can be structured to be very simple at first and then build on themselves.

Simple models can be learned in a day (or even less than an hour), but they are typically less useful than more complicated models; however, they have some use and, most important, they form the basis for building your knowledge, background, comfort level and skill base in this critical area.   How many times have you climbed a long hill (or task) one step at a time, only to look back after a while and appreciate how far you have come?

If you go back through this blog to earlier entries, you can hopefully see this progression.   Start with an equilibrium model and build from there.   Perhaps the simplest model I know is the equilibrium model:  C = G/Q, or concentration (C) is equal to the generation rate (G) of a contaminant divided by the ventilation rate (Q).    If you do not understand this model, PLEASE write to me and let me know where you get lost.  I will put together a brief blog that goes into enough detail to explain it.  Once you have this model, we will move on to more complicated models, but I need your help: give me feedback via email (mjayjock@gmail.com) as to whether the lessons are working and, if not, where you get lost.
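
To make the arithmetic concrete, here is the model with invented numbers (my example, not taken from IH MOD):

# Equilibrium (steady-state, well-mixed) model: C = G / Q
G = 100.0   # generation rate, mg/min (e.g., solvent evaporating from a tray)
Q = 20.0    # ventilation rate, m3/min of dilution air
C = G / Q   # steady-state concentration
print(C, "mg/m3")   # 5.0 mg/m3

Halve the ventilation and the predicted concentration doubles; building exactly that kind of intuition is the point of starting simple.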

If any of you are willing to start this journey, I am willing to teach you in short 10-20 minute blogs.

I cannot think of anything that has helped my career more than an interest and understanding of exposure assessment models.

   

Thursday, April 14, 2016

Risk Assessment Without Numbers

Adam Finkel recently sent me a Commentary from an advance-access publication (January 2016) of the Annals of Occupational Hygiene entitled “Hygiene Without Numbers” by Hans Kromhout.    Adam knows me and knows that I could not read such a piece and NOT comment.

I have never met and do not know Dr. Hans Kromhout, except by reputation, but I found his words to be right to the mark in his two pages of comments which I would be happy to forward to anyone requesting it of me at mjayjock@gmail.com.

Hans Kromhout described control banding as a "numberless intervention" and generally criticized its adequacy.  Indeed, I have always been frankly wary of control banding, which, in my opinion, uses available and typically quite limited data to take educated guesses at ranges of toxicity and so prescribe the level of control needed at various bands of exposure.  When combined with “exposure banding,” one takes a similar banding approach to estimating the level of exposure that might be extant, to get some notion of risk.   I CAN see this as the FIRST step in a process aimed at understanding and controlling risk for a large number of chemicals but, like Dr. Kromhout, I do not see it as the end game.  There is simply too much uncertainty related to underestimation or, on the other side, overestimation of risk, and both conditions are unacceptable for obvious reasons.

Everyone wants to “add value” to their organization and be “cost-effective”.  These are well-worn and, on their face, not unreasonable precepts, enshrined in our psyche over at least the past 20-30 years, especially in Corporate America.  Indeed, I believe these personal/professional drivers have fed the rush to banding.   The bottom line for me is that, according to my mother, there is no free lunch.  When we commit to understanding the risk to human health from exposure to the vast majority of chemicals in commerce, we face an enormous shortfall in basic information on both the toxicity of these chemicals and our exposure to them in a modern society.  I see banding as a response to the pressures that result from this uncomfortable situation.  As indicated above, I see it as a positive initial move but believe that, in the majority of cases, it does not reasonably or adequately assess the risk.

Risk assessment desperately needs data, and the subsequent modeling of those data, as the application of the scientific method, to interpret them and adequately estimate the level of risk.   That is, we need data on both toxicity and exposure, accompanied by modeling of those data, to inform our confident knowledge of, and decisions concerning, the risk posed.   Like food and water, freedom from unacceptable chemical risk should, I believe, be considered a human need, and its importance and provision should be recognized and addressed as such.

Spending the money to get the “numbers” will be much more expensive than proceeding with banding as the end game; however, it will be “cost-effective” relative to preventing unacceptable exposures and risk (or over-regulation).  This should be an important precept for any society that truly values both its general economic health and the physical health of its citizens.



Monday, December 14, 2015

Regulations Need Good Tools for Risk Assessment

In the last blog I asserted the need for the “cold hand of regulation” before risk assessment for the vast majority of chemicals used in commerce would happen.   A colleague wrote to me about that blog and reminded me that having an ostensibly comprehensive set of regulations is no guarantee that good risk assessment will be done.  I have excerpted below a portion of the email from this colleague, who is literally on the front lines in the application of available risk assessment tools.  Please note that I have always found this IH professional to be insightful and plain-spoken while being dedicated and passionate about making a difference.

“I've recently been doing some more work for U.S.-based multi-national firms reviewing REACh documentation, and I have to say, I am kind of disillusioned about the 'promise' of the REACh regulation's outcomes.  So much of what I see for extended SDSs [Safety Data Sheets] are just cookie-cutter verbiage - or, use Tier I screening tools to justify squishy statements that have very little utility to the end/downstream users of chemical-containing products.   I am sure there are a multitude of reasons for why this has happened, but the end result (IMHO) is going to fall far short of the original intent of the regs.”

I can only say that I completely agree with this observation from this seasoned IH/RA professional.   Indeed, I believe that I know the primary reason for this unfortunate state-of-affairs; namely, it is a lack of well-developed tools particularly in the realm of exposure assessment.

The first threshold or gate in risk assessment is the decision to do a risk assessment.   As I argued in the last blog, to date that threshold has not been crossed for most chemicals in this country.  It has been different in Europe: there has been a movement in the EU for the last 15 years or so to cross this threshold, and they are clearly ahead of us.

Once you are on the hook to do a risk assessment, you need the resources to make it happen.  If you do not have them, you have to develop them.  Since you will be applying them to literally thousands of chemicals, they have to be generally applicable to a large number of chemicals.   The tools for this task need to strike a balance between being “sharp” and incisive enough to render good answers for specific chemicals while being “general” enough to be applicable in a cost-effective manner.  You obviously cannot measure everything everywhere; as such, the development of validated and comprehensive exposure and effects models is critical.

I have asserted for years that we have yet to do the basic research needed to properly feed our exposure models and make them “sharp” enough to be generally useful in the above context.  We did our best to lay out a specific template for research for the EU in a series of 2005 workshops sponsored by the European Commission Joint Research Centre (JRC) in Ispra, Italy. These reports, especially the 100+ page report on exposure source characterization, used the combined expertise of seasoned and respected scientists from around the world (Berkeley, Virginia Tech, USEPA, EU, Japan, China) to point to where the research was needed.  That document and its recommendations lay on a JRC server, and in my files and hard drive, for years without any action.  I can no longer find it on the JRC servers, but I have it and would be happy to send this report to anyone asking at mjayjock@gmail.com.  You can also find it as a downloadable link on my webpage:
http://www.jayjock-associates.com/educational-files-and-events/


Instead of doing the basic, initially expensive but ultimately cost-effective detailed research and tool development, the regulatory community in Europe has developed or adopted lightweight, stopgap approaches that have produced the outcomes described by my colleague on the front lines; namely, “cookie-cutter verbiage - or, use Tier I screening tools to justify squishy statements that have very little utility to the end/downstream users of chemical-containing products.”

In my opinion, there really is no substitute for doing it right, and I hope that someday the research and its work products will fulfill the original intent of the REACh regulation (and, hopefully, of the upcoming US and other worldwide chemical regulations).


Tuesday, December 1, 2015

Chemical Risk Assessment Needs the Cold Hand of Regulation

I worked for a large specialty chemical company for 35 years.  The company had a reputation as a leader in the area of human health risk assessment.   I believe that reputation came as a result of its response to a tragedy that occurred years earlier, when some of its workers were unknowingly exposed to a potent carcinogen and many became ill.   The heartbreak of this incident caused the owners to really understand, and act on, the fact that you typically cannot manage any risk which has not first been reasonably assessed.  When I came onto the risk assessment scene in the mid-1980s, the culture was well in place, but the tools for quantitative risk assessment were not; they were (and, I must say, remain) relatively under-developed.  I worked on them then and continue to work on them to this day.

For much of those 30+ years I have been busy working on these tools as applied to compounds that were clearly hazardous; that is, those designed or discovered to be biologically active.   These included biocides and the “stand out” toxicants such as benzene, formaldehyde, chlorinated hydrocarbons and any other molecules important to the company that had somehow adversely affected human health or had been shown in animal testing to be carcinogenic, neurotoxic or a reproductive hazard.

This is how essentially all chemical risk assessment is done today, and it is lacking.  It is “reactive” risk assessment, in which relatively few chemicals are evaluated and the vast majority go unaddressed.   This was convincingly shown in what has become known as the “HPV Challenge”.   An excerpt from an Environmental Defense Fund web site:

When it launched the HPV Challenge in 1998, the U.S. Environmental Protection Agency (EPA) acknowledged there were huge gaps in publicly available hazard data even for HPV chemicals (those produced in or imported into the U.S. in amounts equal to or exceeding one million pounds annually). 

This June 2015 web site (https://www.edf.org/health/reports/high-hopes-low-marks) generally asserts a continued lack of information born of missed deadlines and data quality concerns.

After thinking about this literally for decades, I have come to the conclusion that even highly “enlightened” companies such as the one I worked for (and continue to work for as a consultant) will not take on the burden of doing risk assessments on all chemicals by themselves.  The systematic, comprehensive and shared risk assessment of chemicals is something that needs to occur in the public interest and therefore should be subject to public governance; that is, regulation.

The “Government” has shown itself to be very capable of screwing things up but I frankly do not see a reasonable alternative.  I suggest that we simply have to do a better job of governing and not throw the risk assessment “baby” out with the governmental “bathwater”.

The European Union has been trying to do this with REACh, and more recently in this country we are trying to “reform” the Toxic Substances Control Act.   Ultimately, I believe the cold hand of regulation will be the best and perhaps only way to do rational and comprehensive chemical risk assessment.

As usual, I would love to hear your take on this opinion which I can present here as anonymous if you prefer.

Sunday, August 2, 2015

Wanted: Topics for this Blog from You, the Readers


To date, I have published 120 blogs in this space on essentially a weekly basis for more than two years.   It has been a very fulfilling activity in that I have connected with many wonderful colleagues and learned a lot in the process.

Now, in the middle of the summer of 2015, I have decided to take a break, to re-evaluate the purpose of this blog and to seek your input.

I know a lot of you are on vacation or doing other activities so I may repeat this request in the fall when I hope to restart this weekly blog.

What I am asking is that you send me your questions or requests for topics for this blog within the very general realm of human health exposure and risk assessment.     This could include anything under the rather broad topics of:
  • Exposure modeling
  • Exposure monitoring
  • Toxicology
  • Exposure limits
  • Ethics of risk assessment
  • Politics of risk assessment

Indeed, even if your question or topic does not fit into any of these exactly, please ask anyway.   Maybe I can add something or send you in the right direction.

Please contact me at mjayjock@gmail.com


Have a good summer and I hope to be back online again in the fall.

Sunday, July 26, 2015

WHY do Risk Assessment?

Chris Keil is a technically savvy colleague who has done a lot to advance the science of human exposure modeling.  He is a prime mover and editor of both editions of our bible for occupational exposure modeling:  Mathematical Models for Estimating Occupational Exposure to Chemicals.

Chris recently sent me and other colleagues a note asking for our help with a project he is doing.  An excerpt from his email is presented below:

“I’m doing a project in which I am writing on the WHY of occupational safety and health. Searching for “Ethics and OSH” yields lots of info on the Ethics of OSH *practice* but not so much the philosophical/ethical basis for it.

Lots of the written rationale for OHS is tied to it being a good idea economically. And there are vague references that it is the “right thing to do”. What I’m looking for are scholarly treatments of why OSH is the “right thing to do”.

If you know of any such treatments, please send them my way.”

In my opinion, this issue is fairly apparent and straightforward.   Indeed, I believe that our forefathers in the United States were absolutely brilliant in that they wanted to separate religion from the state but also wanted to define and assert human values that were universally applicable to all people irrespective of religion.   This is not to say that religious principles, particularly Judaeo-Christian beliefs, did not drive these values.  Rather, I believe, they intended that no particular religious dogma would be associated with the assertion and establishment of these as secular rules to live by.

The second sentence of the July 4, 1776, U.S. Declaration of Independence is particularly blunt, elegant and powerful in this regard:

“We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.”  https://en.wikipedia.org/wiki/Life,_Liberty_and_the_pursuit_of_Happiness

I would argue that an untoward health effect from a chemical exposure or other workplace hazard is a direct threat to a person’s pursuit of Happiness if not their Life.

Indeed, some believe that the kernel of some of these ideas was voiced by the English philosopher John Locke almost 90 years earlier, in 1689, when he wrote about the importance of "life, liberty, health, and indolency of body…" (ref: same wiki web site as above).

The outdated term “indolency” is defined as:

indolency (noun; plural: indolencies): (obsolete) the lack of pain; absence of pain

It would be hard to argue that this country of ours is not based on these principles.  They define who we are and how we should act as a nation and as a people.   To be true to these very clearly stated and agreed-to values, it is not hard to see that we need to control the threats to “indolency” that might exist within our society from chemical exposure or other workplace hazards.

I have always found it to be particularly difficult and often quite inefficient to manage a risk to health from chemical exposure that was not first reasonably assessed.   Indeed, if we do not even attempt to assess a risk of chemical exposure then it is often tacitly (and often incorrectly) assumed to be negligible.  In short, doing good, proactive OSH allows us to "walk the walk" relative to the most basic of our values.

Doing good OSH may be good for the bottom line, but that reason is not even close to why it should be done.   Doing good OSH lies at what should be the heart of our agreed-to and stated governing values as citizens and as people.

As usual, I (and Chris) would love to hear your thoughts on this issue.