
Thursday, March 7, 2019

Are the Exposure Models Used for REACh Wrong?


Dr. Joonas Koivisto and 16 others, including this writer, recently authored what I believe is a very important paper:  Source specific exposure and risk assessment for indoor aerosols.    The title sounds as if the paper were focused narrowly on aerosol assessment, but it is actually a comprehensive look at inhalation exposure models and their fitness for making decisions in chemical regulation and risk assessment.   The reality is that aerosols represent the most challenging scenarios for modeling because of their added properties compared to gases; if one can accurately model aerosols, then gases are relatively simple to model.

The publication outlines the current state of the science and available models.  It also makes a developing case for the use of first-principle mathematical mass balance models versus other types of models (knowledge-based models and statistical models of exposure determinants), especially for regulatory decisions such as those mandated by REACh.

The Europeans are much more advanced than the US in the application of exposure models because they have to be.   The REACh regulation requires a risk assessment for literally thousands of chemicals, and a risk assessment requires an exposure assessment.  There is not nearly enough measured exposure data available, so they have turned to models.   It is clearly evident that the inputs to, and databases for, the mathematical mass balance models have not been sufficiently developed, so the European regulators have turned to knowledge-based and statistical models of exposure determinants.  These models are more easily applied because the inputs are relatively simple.   The paper implies that these models are not performing up to the task and that there is a real need to develop the input data necessary to feed the more competent first-principle mathematical mass balance models.
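For readers who have not seen one written out, here is a minimal sketch (mine, not from the paper) of the simplest first-principle mass balance model: a single well-mixed zone with a constant source.  The function name and the numbers are purely illustrative assumptions.

    # Minimal sketch (not from the paper): a single-zone, well-mixed mass balance
    # model of the kind the authors mean by "first principle" models.
    # Assumptions: constant emission rate G, constant ventilation rate Q,
    # instantaneously well-mixed room of volume V, zero initial concentration.
    import math

    def well_mixed_concentration(G_mg_min, Q_m3_min, V_m3, t_min):
        """Concentration (mg/m3) at time t from V*dC/dt = G - Q*C, with C(0) = 0."""
        c_ss = G_mg_min / Q_m3_min                      # steady-state value, G/Q
        return c_ss * (1.0 - math.exp(-Q_m3_min * t_min / V_m3))

    # Example (illustrative numbers only): 100 mg/min source, 2 m3/min ventilation,
    # 50 m3 room, concentration after 30 minutes of emission.
    print(round(well_mixed_concentration(100, 2, 50, 30), 1), "mg/m3")   # about 34.9

Every term in it is traceable to the conservation of mass, which is exactly the property the knowledge-based and statistical tools lack.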

The paper points to an earlier paper I did with Tom Armstrong and Mike Taylor in which we tested the mass balance two-zone Near-Field/Far-Field (NF/FF) model against the Daubert legal criteria, which are widely used by the courts to assess whether an expert witness's scientific testimony is methodologically valid.   In that paper we concluded that the NF/FF model fulfils the Daubert criteria and that, when it is used within its stated limitations, it adequately estimates exposure for the purposes of legal decisions.  The implication is that the models currently used for making REACh decisions would, most likely, not pass the Daubert criteria, which require that these models:

1) Are applicable and have been tested.
2) Have been subjected to peer review and publication.
3) Have a known and acceptable rate of error.
4) Have standards and controls concerning their operation that are maintained.
5) Are generally accepted in the relevant scientific community.

This Daubert paper is:  Jayjock, M.A., Armstrong, T., Taylor, M., 2011. The Daubert Standard as applied to exposure assessment modeling using the two zone (NF/FF) model estimation of indoor air breathing zone concentration as an example. J. Occup. Environ. Hyg. 8, D114–D122.   I will email an electronic copy to anyone requesting it: mjayjock@gmail.com.
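For those who have not seen the NF/FF model written out, here is a hedged sketch of its steady-state equations as commonly presented in the IH modeling literature (my own Python illustration, not code from IH Mod or from the papers above); the parameter values are made up.

    # Steady-state near-field/far-field (NF/FF) sketch; names and numbers are illustrative.
    def nf_ff_steady_state(G_mg_min, Q_m3_min, beta_m3_min):
        """Return (C_nf, C_ff) in mg/m3 for a constant source G located in the near field.

        Q    = room (far-field) ventilation rate, m3/min
        beta = interzonal airflow between NF and FF, m3/min
               (often estimated as 0.5 * free surface area of the NF * random air speed)
        """
        c_ff = G_mg_min / Q_m3_min            # far field sees the whole source diluted by Q
        c_nf = c_ff + G_mg_min / beta_m3_min  # near field adds the local increment G/beta
        return c_nf, c_ff

    # Illustrative numbers only: 100 mg/min source, Q = 20 m3/min, beta = 5 m3/min
    c_nf, c_ff = nf_ff_steady_state(100, 20, 5)
    print(f"NF = {c_nf:.0f} mg/m3, FF = {c_ff:.0f} mg/m3")   # NF = 25, FF = 5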

What Dr. Koivisto and the other authors are asserting in this paper is somewhat striking; namely, that the currently used REACh models need to be explicitly challenged against the Daubert (or similar objective) criteria and, if found wanting, better alternatives should be developed and employed.   This would, most likely, result in something this writer has been advocating for many years; specifically, comprehensive research into, and compilation of, exposure source databases.

This should be a straightforward, objective scientific exercise; that is, a technically competent and empowered group of scientists would set open and objective criteria and test the currently used, regulatory-sanctioned models against those standards.   The reality, as I see it, is that there are strong vested interests and forces at work in this case that may resist this sort of effort.   Change is never easy but, hopefully, scientific integrity, good judgement and established facts will ultimately work to improve the public health, partisan politics notwithstanding.

The paper was published online this week at https://doi.org/10.1016/j.scitotenv.2019.02.398 as gold open access, which means that the full PDF text is a free download from the publisher, Elsevier.



Wednesday, December 12, 2018

Simple Techniques for Assessing Airflow in Occupied Spaces


Jeff Burton is a treasure to our profession.  He wrote a piece on ventilation earlier this year and published it in the AIHA Synergist.  I found it to be incredibly valuable.  On the chance that you did not see it, I am reproducing part of it below with his permission.  It is a trove of practical advice born from a lifetime of experience and a great resource for any practicing IH.

One thing that Jeff did not mention, but that I think is important, is that much of this information can be used as exposure modelling input.

I am reproducing the first few paragraphs of the article below.  If you are a member of AIHA, you can go to the online version in the Synergist to see it in all its glory at: 

https://www.aiha.org/membercenter/SynergistArchives/2018SynergistArchives/Pages/Six-Ways-to-Approximate-Airflow.aspx

If you are not a member, and you want it for your personal use, you can send me a request (mjayjock@gmail.com) and I will send you the original MS Word document that Jeff sent to me. 

______________________________________________________


Six Ways to Approximate Airflow

Simple Techniques for Assessing Airflow in Occupied Spaces

By D. Jeff Burton

Every occupational health and safety professional must be able to evaluate the air the occupants of a space are experiencing to assess the potential for IAQ problems and their solutions.

Most OHS professionals today are unable to conduct in-depth testing or measurement of HVAC systems and their airflows. Specialized knowledge of testing, measurement, and balancing is often required on the complex systems of today. Industrial hygiene engineers or TAB (testing, adjusting, and balancing) specialists can be employed to make detailed measurements. However, an OHS professional can often gather enough simple information to quickly provide approximate answers to questions about airflow in a space, regardless of the complexity of the system.

This article provides guidelines for simple testing, measurements, and approximations an OHS professional might perform. These include temperature and humidity; air movement and distribution, outdoor air flowrates, and air exchange rates in the occupied space; concentrations of carbon dioxide in the air; and the effects of wind on the airflow through a building.

The following equipment is needed to perform the simple tests and measurements described in this article: tape measure, thermometer, psychrometer, smoke tubes, and carbon dioxide monitor.

...
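To illustrate my earlier point about feeding exposure models, here is a sketch (my own, not reproduced from Jeff's article) of one common simple technique: estimating the air exchange rate of a space from the decay of carbon dioxide after the occupants leave, using nothing more than the CO2 monitor on the equipment list and a watch.  The readings below are made up.

    # Tracer-decay estimate of air exchange rate from CO2 readings (illustrative only).
    import math

    def air_changes_per_hour(c_start_ppm, c_end_ppm, c_outdoor_ppm, elapsed_hours):
        """Air exchange rate N (per hour) from C(t) - Cout = (C0 - Cout) * exp(-N * t)."""
        return math.log((c_start_ppm - c_outdoor_ppm) /
                        (c_end_ppm - c_outdoor_ppm)) / elapsed_hours

    # Example with assumed readings: 1200 ppm falling to 700 ppm over 1 hour,
    # outdoor CO2 taken as 400 ppm.
    N = air_changes_per_hour(1200, 700, 400, 1.0)
    print(f"about {N:.1f} air changes per hour")   # about 1.0 ACH

Multiplying N by the room volume gives an estimate of the outdoor airflow Q, which is exactly the kind of model input I mentioned above.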


Are REACh Exposure Models Good Enough?


The political will in the European Union to enact REACh was and is extraordinary.   The body politic in the EU wants this regulation and certainly needs it to be effective. It should be clear that it cannot be effective if the exposure assessment half of the risk equation used for REACh is faulty.  Underestimation of exposure and risk hurts people's health directly; overestimation hurts people's well-being by unnecessarily hurting the economy.   The use of good modelling tools is critical or REACh, in my opinion, will ultimately be doomed to fail. 

I have always thought that first-principle physical chemical models (FPModels) are superior to models that are not based on first principles (NFPModels).  Now a thoughtful and talented researcher based in Denmark (Dr. Antti Joonas Koivisto) is examining and demonstrating with logic and DATA exactly why first-principle models are better and, most likely, even necessary for making good regulatory decisions.

An early question might be:  why develop NFPModels when FPModels are available for development?   The easy and probably correct answer:  NFPModels can be developed relatively quickly and with less effort and expense. FPModels are available but need to be parameterized for critical exposure scenarios, and that means research dollars.

NFPModels, for the most part, are based on dimensionless factors to calculate scores, which are then converted to exposure values.  They are conceptual models that do not have to conform to first principles and are thus (using Joonas' word) somewhat vague.

While there are other NFPModels, the big hitter in the EU for modelling exposure under REACh appears to be Stoffenmanager® v.7.1, which as of last month:

•  is reportedly validated by 15 scientific studies based on more than 6,000 measurements;
•  has more than 33,000 users, with 50 new users per week;
•  has been used to make over 200,000 regulatory decisions.

It is accepted by the Dutch Labour Inspectorate as a validated method to evaluate exposure to hazardous substances in the workplace.   More important, the European Commission officially recognises Stoffenmanager as an instrument to comply with the REACh regulation.

Other REACh-recommended NFPModels include:

ECETOC TRA
MEASE
EMKG-EXPO-TOOL
ART

Although somewhat varied in their approach, they all share the same feature: they are all based on dimensionless factors to calculate scores, which are then converted to exposure values.  They are conceptual models that do not have to conform to first principles (like the conservation of mass).  Thus, they are not scientifically formalized, and that leaves them difficult to explain.

Dr. Koivisto asserts, and I agree, that there should be minimum requirements for regulatory exposure models and that those criteria should be no less stringent than the Daubert criteria used in US courts for valid scientific testimony.  The model criteria: 
•  Is applicable and has been tested.
•  Has been subjected to peer review and publication.
•  Has a known and acceptable rate of error.
•  Has standards and controls concerning its operation that exist and are maintained.
•  Is generally accepted in the relevant scientific community.
Joonas goes on to advise that FPModels are superior to the above NFPModels (what he calls “imaginary” models) because:

•  Mass flows are traceable → the model can be used for environmental, occupational and consumer exposure assessment!!
•  There are no unit conversions!!
•  Error analysis can be made separately for emission source, emission controls, and dispersion.
•  No need for Tier levels; the Tier level depends on available information.
•  Possible “calibration” is straightforward (e.g., chamber tests).
•  In the NF/FF model the NF volume and air mixing are adjustable according to the source (free parameterization).
•  Results are easy to interpret.
•  TRANSPARENT!
•  Easy to develop for further needs.
•  No need to discretize parameters (e.g., room size, ventilation rate, …).
•  Accuracy is superior when compared to mechanistic or conceptual models.

I took most of the above from a November 29, 2018 presentation that Joonas gave in Denmark.  I will be happy to send the PowerPoint slide deck of that talk to anyone who asks at mjayjock@gmail.com.

Wednesday, September 12, 2018

IH Mod 2.0 - A Major Advance in Exposure Modelling Tools


I have not blogged for quite a while, primarily because in 125 blogs I pretty much exhausted what I wanted to say on various topics.  Also, new ideas for blogs from readers seemed to have dried up.  I am, however, moved to post again by the wonderful work of Daniel Drolet and Tom Armstrong on the tool many of us know as IH Mod.  For years, they have wanted to combine the power of these deterministic models with the new dimension of stochastic uncertainty modelling (e.g., Monte Carlo simulation).  Daniel is a brilliant programmer and he made it happen!  It is now available as IH Mod 2.0 and, as usual, it's a free download.  Daniel and Tom and all the folks who worked on this have done so without pay for the benefit of the profession.  Below is Tom's announcement.   I remain open at mjayjock@gmail.com for ideas for future blogs.   I do have another blog that will come out soon with goodies from Jeff Burton and the wonderful tools on ventilation he has recently provided to the profession.

Attention all exposure assessors who use or want to use mathematical modeling to estimate airborne exposure to chemicals!   IH Mod 2.0 and a Support File are now available on the public-access Exposure Assessment Strategies Committee web page:  https://www.aiha.org/get-involved/VolunteerGroups/Pages/Exposure-Assessment-Strategies-Committee.aspx

IH Mod 2.0 includes the same mathematical models as the still-available original IH Mod.   IH Mod 2.0 gives the user the choice between running the models in deterministic (point-value parameters) mode or in Monte Carlo simulation mode, with choices of distributions of parameter values.  This runs right in MS Excel with no other software needed.  It requires a desktop install of MS Excel, for Windows or Apple computers.  The currently posted version has English, French, Serbo-Croatian and Japanese language options.  Spanish, German and Italian will be available soon.
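To give a flavor of what the Monte Carlo mode does, here is a minimal sketch of the idea in Python (my own illustration; IH Mod 2.0 itself is pure Excel and its models and distribution options are far richer than this): the point-value inputs are replaced by distributions, the model is run thousands of times, and the result is a distribution of concentrations rather than a single number.  The distributions and numbers below are assumptions for illustration only.

    # Monte Carlo on the simplest steady-state model, C = G/Q (illustrative sketch).
    import random, statistics, math

    def well_mixed_ss(G, Q):
        return G / Q                                    # mg/m3

    random.seed(1)
    results = []
    for _ in range(10_000):
        G = random.lognormvariate(math.log(100), 0.3)   # emission rate, mg/min (assumed lognormal)
        Q = random.uniform(15, 25)                      # ventilation, m3/min (assumed uniform)
        results.append(well_mixed_ss(G, Q))

    results.sort()
    print(f"median = {statistics.median(results):.1f} mg/m3, "
          f"95th percentile = {results[int(0.95 * len(results))]:.1f} mg/m3")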

A Support File for IH Mod 2.0 is also available.  It includes useful information about IH Mod 2.0, along with spreadsheet tabs to estimate liquid spill pool generation rates via the Hummel-Fehrenbacher equation, a units-of-measure conversion tool, examples of generation rate estimation, a "Bootstrap" procedure tool, a summary of approaches to estimate ALPHA for the exponentially decreasing emission rate models, and some links to other resources.  The support file is evolving and will be updated periodically with new information.  Check back at the EASC web page (URL above) for updates.

Monday, September 26, 2016

Modeling Aerosol Exposures

I have gotten very few requests for blog topics since issuing the offer some time ago.  One such request has come from Richard Quenneville who asks how one might model aerosol or airborne particulate exposure.

Aerosols are certainly different from vapors or gases and the differences significantly complicate any attempt to model their exposure.   Even relatively small aerosol particles (microns or tenths of microns) are much larger than the individual molecules that make up a gas or vapor.  This gives them different properties at least in the following areas:
•  They are typically more readily electrically charged, especially if they are generated by sliding along a surface (e.g., dust from transporting powder in a pneumatic tube).  This charge can affect the size distribution and sampling of the aerosol.
•  With or without electrical charge, aerosol particles are often susceptible to combining with one another in a mechanism known as agglomeration.  This process, of course, changes the size distribution of the aerosol.
•  Most important, because they have much more mass than vapor molecules, they have a settling velocity, which increases with increasing particle size; this, again, constantly changes the airborne size distribution of the aerosol with time.
•  Because of their mass, airborne particles do NOT always make it into sampling orifices, thus biasing their measurement.

Assuming agglomeration is not happening in a time frame relevant to the potential exposure, one can estimate the concentration in any time interval for any aerosol particle or size range of particles.   This is done by taking the average settling velocity of the particles in that size range and accounting for their loss from settling.   Typically this is done for particles settling from 2 meters in height to the floor.  If one is sure that the breathing zone remains at, say, 2 meters high, you can calculate the concentration loss from the horizontal volume at 2 meters height to, say, 1.8 meters.   If you do this over small enough time intervals, you can estimate a time-weighted average aerosol concentration for any time period, dependent on the nature of the aerosol source.
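For the mathematically inclined, here is a sketch of that bookkeeping (my own illustration, with assumed inputs): each size bin loses mass by first-order settling at a rate equal to its Stokes settling velocity divided by the height of the layer, and stepping through time yields the changing concentrations and a time-weighted average.

    # Size-binned settling loss sketch; particle density, bins and times are assumptions.
    import math

    G = 9.81            # m/s2
    MU_AIR = 1.81e-5    # Pa*s, air viscosity at about 20 C
    RHO_P = 1000.0      # kg/m3, assumed unit-density particles

    def stokes_settling_velocity(d_um):
        """Terminal settling velocity (m/s) by Stokes' law; reasonable for roughly 1-80 um."""
        d_m = d_um * 1e-6
        return RHO_P * G * d_m ** 2 / (18.0 * MU_AIR)

    def settle(bins_mg_m3, height_m=2.0, minutes=60, dt_s=10):
        """Decay each size bin's concentration over time; return the TWA by bin (mg/m3)."""
        twa = {d: 0.0 for d in bins_mg_m3}
        conc = dict(bins_mg_m3)
        steps = int(minutes * 60 / dt_s)
        for _ in range(steps):
            for d in conc:
                conc[d] *= math.exp(-stokes_settling_velocity(d) * dt_s / height_m)
                twa[d] += conc[d] / steps
        return twa

    # Illustrative initial concentrations (mg/m3) in three size bins (diameters in um)
    print({d: round(c, 3) for d, c in settle({2: 1.0, 10: 1.0, 50: 1.0}).items()})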

This brings up another complication of dealing with aerosols.  Compared to vapors, predicting the “release” or generation rate of particulate into the air is highly problematic because it depends on many undefined or unmeasured factors such as inter-particle forces.  I have never been able to use first-principle models to predict this rate. Instead, we have had success experimentally determining this rate by simulating the mechanism of generation, measuring the resultant concentrations and back-calculating the rate of generation.  I personally think this is what needs to happen for the exposure assessment of nanoparticles released to the air in various scenarios.
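In its simplest form, that back-calculation is just the well-mixed model run in reverse.  The sketch below (mine, with made-up numbers) ignores settling and other losses, which a real chamber study would have to account for.

    # Back-calculating an emission rate from a measured chamber concentration (sketch).
    def back_calculated_generation_rate(c_measured_mg_m3, q_chamber_m3_min):
        """Emission rate G (mg/min) implied by a measured steady-state concentration, G = C*Q."""
        return c_measured_mg_m3 * q_chamber_m3_min

    # Example with assumed numbers: 3.2 mg/m3 measured in a chamber ventilated at 0.5 m3/min
    print(back_calculated_generation_rate(3.2, 0.5), "mg/min")   # 1.6 mg/min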

Please note, settling is dependent on the particle size distribution of the generated aerosol.  I have seen situations in plants that were literally “particle fountains,” with particle size distributions in which a significant portion of the particles were greater than 100 microns.  These particles hit the floor in a time frame of seconds, which dramatically lowers the total aerosol mass per volume of air.   Particles on the other end of the spectrum, e.g., nanoparticles, are going to remain essentially airborne and not settle at an appreciable rate in most scenarios.

Finally, aerosol, especially insoluble aerosol, will deposit in the respiratory tract based on particle size.  At the current time we have some aerosol exposure limits specified in terms of inhalable and respirable particulate.   These fractions are defined mathematically by the ACGIH, and those algorithms can be applied to the concentrations in the size intervals above to render the amount of aerosol that might be inhaled (inhalable mass concentration) or be able to reach the deep pulmonary regions of the lungs (respirable mass concentration).
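For completeness, here is a sketch of the published ACGIH/ISO size-selective conventions (the standard formulas, coded by me; check the current TLV documentation before relying on them) that turn a particle diameter into inhalable and respirable mass fractions.

    # ACGIH/ISO particle size-selective sampling conventions (illustrative implementation).
    import math

    def inhalable_fraction(d_um):
        """Inhalable convention, valid for d <= 100 um."""
        return 0.5 * (1.0 + math.exp(-0.06 * d_um))

    def respirable_fraction(d_um):
        """Respirable convention: inhalable fraction times a lognormal penetration
        curve with median 4.25 um and geometric standard deviation 1.5."""
        x = math.log(d_um / 4.25) / math.log(1.5)
        cdf = 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
        return inhalable_fraction(d_um) * (1.0 - cdf)

    # Fractions at a few example diameters (um): inhalable, respirable
    for d in (2, 10, 50):
        print(d, "um:", round(inhalable_fraction(d), 2), round(respirable_fraction(d), 2))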

The above analysis sounds daunting mathematically, and indeed it is not simple; however, it is nothing that an Excel spreadsheet cannot handle with relative ease, given the proper input of scenario-specific dimensions, generation rate, initial particle size distribution, particle size interval-specific settling velocities and the ACGIH algorithms.   Like all models it is not exact but, I believe, it is accurate enough to be useful.
 


Thursday, May 26, 2016

Exposure Modeling will Make You a Super Star


I see spectacular headlines when I am checking out of the supermarket.  Indeed, spectacular headlines seem to work for the National Enquirer, so I hoped that they would work for me here.

I have literally grown old extolling the virtues and power of exposure assessment modeling for industrial hygienists; however, my friend and colleague Perry Logan tells me that what I have done is not enough.  He advises that one has to mention something many, many times before it sinks in.  I do not remember how many times Perry suggested, but it was many more than a few.   Also, committing to using models is not a trivial decision and requires at least some considerable effort.   Thus, Perry is almost certainly correct: I have not promoted modeling enough.

I may be older but I am not done, and I am going to list some of the very basic (and somewhat self-serving) reasons an IH should get into learning exposure modeling:

    It will definitely enhance your standing with your employer and/or your clients

You will present yourself as “one of the few”: a relatively rare professional who can take the factors that cause and predict exposure and apply them in a systematic manner to render predictions of exposure and risk.  This often occurs without the need for a lot of data, which managers seem to particularly like.

Indeed, many people see models as technological magic and those who use them as wizards.  It often does not hurt you or your career to subtly let them think this is so even while you might tell them otherwise.

    You will have confidence born of the knowledge and ability that you personally gained to estimate exposures using models, and no one can take that from you.

These models are, for the most part, made up of first principles; that is, basic laws of nature like the conservation of mass, and they are therefore pretty true and useful on their face.   Clearly they can be both wrong and misused, but at their core they are aimed at being reasoned and reasonable descriptors of reality, or at least the reality that we know.  If they fall short, they provide a mechanism and framework to fix themselves.  They can become complicated, but they can also be “pulled apart” so that their pieces can be examined individually as to whether they make sense.

    Complex mathematical operations are no longer an issue with available free software.

I am prone to math errors.  Running long strings of calculations invariably has led me to simple mistakes and wrong answers.  In order to save my credibility, I learned early in my career that programming the calculation steps into a spreadsheet or BASIC program took more time initially but assured I had a tool that would not produce math errors.   That early effort has grown dramatically, with other talented colleagues (like Tom Armstrong and Daniel Drolet) taking up the cause, and the result is IH Mod – a free Excel spreadsheet with almost any modeling calculation you might need.

    Like any other skill (or Rome), modeling acumen will not be built in a day, but the inputs can be structured to be very simple at first and then build on themselves.

Simple models can be learned in a day (or even less than an hour), but they are typically less useful than more complicated models; however, they have some use and, most important, they form the basis for building your knowledge, background, comfort level and skill base in this critical area.   How many times have you climbed a long hill (or task) one step at a time, only to look back after a while and appreciate how far you have come?

If you go back through this blog to earlier entries, you can hopefully see this progression.   Start with an equilibrium model and build from there.   Perhaps the simplest model I know is the equilibrium model:  C = G/Q, or concentration (C) is equal to the generation rate (G) of a contaminant divided by the ventilation rate (Q).    If you do not understand this model, PLEASE write to me and let me know where you get lost.  I will put together a brief blog that goes into enough detail to explain it.  Once you have this model, we will move on to more complicated models, but I need your help: give me feedback via email (mjayjock@gmail.com) as to whether the lessons are working or not and, if not, where you get lost.
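And to show just how little is involved at this first step, here is the equilibrium model as a few lines of Python (the numbers are purely illustrative); the same arithmetic can, of course, be done on a napkin or in a single spreadsheet cell.

    # The equilibrium model C = G/Q, with made-up example numbers.
    def equilibrium_concentration(G_mg_min, Q_m3_min):
        """Equilibrium concentration in mg/m3: generation rate divided by ventilation rate."""
        return G_mg_min / Q_m3_min

    # Example: a process releasing 50 mg/min of solvent vapor into a space
    # ventilated with 10 m3/min of dilution air.
    print(equilibrium_concentration(50, 10), "mg/m3")   # 5.0 mg/m3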

If any of you are willing to start this journey, I am willing to teach you in short 10-20 minute blogs.

I cannot think of anything that has helped my career more than an interest and understanding of exposure assessment models.

   

Thursday, April 14, 2016

Risk Assessment Without Numbers

Adam Finkel recently sent me a Commentary from an advance access publication (January 2016) of the Annals of Occupational Hygiene entitled “Hygiene Without Numbers” by Hans Kromhout.    Adam knows me and knows that I could not read such a piece and NOT comment.

I have never met and do not know Dr. Hans Kromhout, except by reputation, but I found his words to be right on the mark in his two pages of comments, which I would be happy to forward to anyone requesting them of me at mjayjock@gmail.com.

Hans Kromhout described control banding as a "numberless intervention" and generally criticized its adequacy.  Indeed, I have always been frankly wary of control banding, which, in my opinion, uses available and typically quite limited data to take educated guesses at ranges of toxicity in order to assign the level of needed control at various bands of exposure.  When combined with “exposure banding,” one takes a similar banding-estimate approach to the level of exposure that might be extant, to get some notion of risk.   I CAN see this as the FIRST step in a process aimed at understanding and controlling risk for a large number of chemicals but, like Dr. Kromhout, I do not see it as the end game.  There is simply too much uncertainty related to underestimation or, on the other side, overestimation of risk, and both conditions are unacceptable for obvious reasons.

Everyone wants to “add value” to their organization and be “cost-effective”.  These are well-worn and, on their face, not unreasonable precepts, enshrined in our psyche over at least the past 20-30 years, especially in corporate America.  Indeed, I believe that these personal and professional drivers have fed the rush to banding.   The bottom line for me is that, according to my mother, there is no free lunch.  When one is committed to trying to understand the risk to human health from exposure to the vast majority of chemicals in commerce, we face an enormous shortfall in basic information related to both the toxicity and the exposure associated with our interactions with these chemicals in a modern society.  I see banding as a response to the pressures that result from this uncomfortable situation.  As indicated above, I see it as a positive initial move but, I believe, in the majority of cases it does not reasonably or adequately assess the risk.

Risk assessment desperately needs data, and the subsequent modeling of those data, as the application of the scientific method to interpret the data and adequately estimate the level of risk.   That is, we need data on both toxicity and exposure, accompanied by modeling of these data to inform our confident knowledge of, and decisions concerning, the risk posed.   Like food and water, I believe that freedom from unacceptable risk from chemicals should be considered a human need, and its importance and provision should be recognized and addressed as such.

Spending the money to get the “numbers” will be much more expensive than proceeding with banding as the end game; however, it will be “cost-effective” relative to preventing unacceptable exposures and risk (or over-regulation).  This should be an important precept for any society that truly values both its general economic health and the physical health of its citizens.