
Thursday, March 7, 2019

Are the Exposure Models Used for REACh Wrong?


Dr. Joonas Koivisto and 16 others, including this writer, have recently authored what I believe is a very important paper:  Source specific exposure and risk assessment for indoor aerosols.    The title suggests a paper focused on aerosol assessment, but it is actually a comprehensive look at inhalation exposure models and their fitness for making decisions in chemical regulation and risk assessment.   The reality is that aerosols represent the most challenging scenarios for modeling because of properties, such as deposition and coagulation, that gases do not have.  If one can accurately model aerosols, then gases are relatively simple to model.  

The publication outlines the current state of the science and the available models.  It also builds a case for the use of first-principles mathematical mass-balance models over other types of models (knowledge-based models and statistical models of exposure determinants), especially for regulatory decisions such as those mandated by REACh.

The Europeans are much more advanced than the US in the application of exposure models because they have to be.   The REACh regulation requires a risk assessment for literally thousands of chemicals, and a risk assessment requires an exposure assessment.  There is not nearly enough measured exposure data available, so they have turned to models.   The inputs to, and databases for, the mathematical mass-balance models have not been sufficiently developed, so the European regulators have turned to knowledge-based and statistical models of exposure determinants.  These models are more easily applied because their inputs are relatively simple.   The paper implies that these models are not performing up to the task and that there is a real need to develop the input data necessary to feed the more competent first-principles mathematical mass-balance models.  

The paper points to an earlier paper I wrote with Tom Armstrong and Mike Taylor in which we tested the mass-balance two-zone Near-Field/Far-Field (NF/FF) model against the Daubert legal criteria, which are widely used by the courts to assess whether an expert witness's scientific testimony is methodologically valid.   In that paper we concluded that the NF/FF model fulfills the Daubert criteria and that, when used within its stated limitations, it adequately estimates exposure for legal decisions.  The implication is that the models currently used for making REACh decisions would, most likely, not pass the Daubert criteria, which require that these models:

1) Are applicable and have been tested.
2) Have been subjected to peer review and publication.
3) Have a known and acceptable rate of error.
4) Are maintained with standards and controls governing their operation.
5) Are generally accepted in the relevant scientific community.

This Daubert paper is:  Jayjock, M.A., Armstrong, T., Taylor, M., 2011. The Daubert Standard as applied to exposure assessment modeling using the two zone (NF/FF) model estimation of indoor air breathing zone concentration as an example. J. Occup. Environ. Hyg. 8, D114–D122.   I will email an electronic copy to anyone requesting it: mjayjock@gmail.com.
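For readers unfamiliar with the model, the steady-state form of the two-zone NF/FF model can be sketched in a few lines of Python. This is a minimal editor's illustration with hypothetical input values, not the authors' implementation; `G` is the contaminant emission rate, `Q` the room ventilation rate, and `beta` the airflow between the near and far fields.

```python
def nf_ff_steady_state(G, Q, beta):
    """Steady-state concentrations for the two-zone (NF/FF) model.

    G    : contaminant emission rate in the near field (mg/min)
    Q    : room (far-field) ventilation rate (m^3/min)
    beta : airflow between the near and far fields (m^3/min)

    Returns (C_NF, C_FF) in mg/m^3.
    """
    C_FF = G / Q             # far field: full emission diluted by room ventilation
    C_NF = C_FF + G / beta   # near field: adds the local term G/beta
    return C_NF, C_FF

# Illustrative (hypothetical) inputs: 100 mg/min source, 3 m^3/min room
# ventilation, 5 m^3/min interzonal airflow.
C_NF, C_FF = nf_ff_steady_state(G=100.0, Q=3.0, beta=5.0)
# C_FF ≈ 33.3 mg/m^3, C_NF ≈ 53.3 mg/m^3
```

The structure shows why the model matters for breathing-zone estimates: the near field always adds a local term (G/beta) on top of the general room concentration (G/Q), so concentrations near a source exceed what a well-mixed single-box model would predict.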

What Dr. Koivisto and the other authors are asserting in this paper is somewhat striking; namely, that the currently used REACh models need to be explicitly challenged against the Daubert (or similar objective) criteria and, if found wanting, better alternatives should be developed and employed.   This would, most likely, result in something this writer has been advocating for many years; specifically, comprehensive research and compilation of exposure source databases.

This should be a straightforward, objective scientific exercise; that is, a technically competent and empowered group of scientists would set open and objective criteria and test the currently used, regulatory-sanctioned models against those standards.   The reality, as I see it, is that there are strong vested interests and forces at work in this case that may resist this sort of effort.   Change is never easy but, hopefully, scientific integrity, good judgement and established facts will ultimately work to improve the public health, partisan politics notwithstanding.

The paper was published online this week at https://doi.org/10.1016/j.scitotenv.2019.02.398 as gold open access, which means that the full PDF text is a free download from the publisher, Elsevier.  



9 comments:

  1. Mike
    Thanks for moving this topic forward with your blog. I am in receipt of the paper from Tom's email this morning, but still have to carefully review it. I am very interested in the conclusion on the need for a data library, as it rolls right into advocacy for Big Data management (an AIHA Content Priority).
    Thanks Steve

  2. Mike
    Thanks for sharing this article. I see a couple of things to advance to the broader Exposure Assessment and Risk Assessment community:
    1. The emissions library need is apparent. And that need aligns to the AIHA Content Priority around Big Data. Should we be advocating to the membership and board an active role in OWNING AND MANAGING such a library?
    2. The standardization of data capture across all measurement methods (conventional air sampling as well as emerging sensors) is ever expanding. We started this with the excellent work of Mark Stenzel, Daniel Drolet, and Dr. Susan Arnold with IHEST... which empowers the individual. But are we actively and consistently soliciting feedback on scenarios captured? Should we be? What are the barriers to getting that feedback (quite assuredly corporate legal departments)? Respond here or drop me an email. I am ready to post into Catalyst with an Open Forum comment that I could create and you could review and collaborate on...
    Respectfully,
    Steve

    Replies
    1. Hi Steve,

      Anything you can do to open up this discussion is welcome, and I would happily participate. I see this as a top-down effort. Neil Hawkins and I published a paper many years ago exhorting industrial hygienists to collect more data on exposure determinants during scenario testing. It went over like a lead balloon. Everyone claimed they already had too much work.
      Best Regards,
      Mike

    2. Mike
      Thanks. I reached out to Tom Armstrong and Dr. Arnold on summary results for IHEST user feedback (is anyone collecting scenarios?). I have not checked yet for replies.
      Will press for more energy on this at the conference... will you make Minneapolis?
      Regards
      Steve

    3. Hi Steve, I got the Edward J. Baier Award this year (thanks to Mark Stenzel), which means an almost free trip to the conference. Looking forward to seeing you there.

  3. Hi Mike, Thank you for sharing your interesting work. I will read your paper. As you know, REACh uses a tiered approach for consumer exposure. In my limited experience, the Tier 1 model is conservative but may not be accurate. One can use a refined exposure model or measured data, but that takes more time and expense. Regards, David

    Replies
    1. Dear David,
      The current problem with the ECHA-recommended Tier 1 models is that they are not reliably "precautionary". ECETOC TRA, MEASE, and EMKG-EXPO-TOOL all both overestimate and underestimate, as shown by, e.g., Lamb et al. (2015), Landberg et al. (2017; 2018), van Tongeren et al. (2017), and Spinazzè et al. (2017). What these tools have is simply a large deviation from measured values: their predictions typically range over about five orders of magnitude. This means that if the measured value is 1, the model may predict anywhere from 0.01 to 1,000. How can anyone make decisions based on that kind of accuracy?
      We should clarify the requirements for a regulatory decision tool. The best starting point is to revisit the Daubert principles, as was done by Jayjock et al. (2011) for the two-zone (NF/FF) model.

      Kind regards,
      Joonas

      References:
      van Tongeren, M., Lamb, J., Cherrie, J.W., MacCalman, L., Basinas, I., Hesse, S., 2017. Validation of lower tier exposure tools used for REACH: comparison of tools estimates with available exposure measurements. Ann Work Expo Health. 61, 921–938.
      Lamb, J., Hesse, S., Miller, B.G., MacCalman, L., Schroeder, K., Cherrie, J., van Tongeren, M., 2015. Evaluation of Tier 1 Exposure Assessment Models under REACH (eteam) Project-Final Overall Project Summary Report. Available: http://www.baua.de/de/Publikationen/Fachbeitraege/F2303-D26-D28.html.
      Landberg, H.E., Axmon, A., Westberg, H., Tinnerberg, H., 2017. A study of the validity of two exposure assessment tools: Stoffenmanager and the advanced REACH tool. Ann. Work Expo. Health. 61, 575–588.
      Landberg, H.E., Westberg, H., Tinnerberg, H., 2018. Evaluation of risk assessment approaches of occupational chemical exposures based on models in comparison with measurements. Saf. Sci. 109, 412–420.
      Spinazzè, A., Lunghini, F., Campagnolo, D., Rovelli, S., Locatelli, M., Cattaneo, A., Cavallo, D.M., 2017. Accuracy evaluation of three modelling tools for occupational exposure assessment. Ann. Work Expo. Health. 61, 284–298.
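      The accuracy Joonas describes can be illustrated with a few lines of Python (an editor's sketch; the measured value and prediction band are the hypothetical figures quoted in the comment above):

```python
import math

# Hypothetical figures from the comment: a measured value of 1, with model
# predictions falling anywhere between 0.01 and 1000.
measured = 1.0
low, high = 0.01, 1000.0

# Width of the prediction band in orders of magnitude (log10 units).
span_orders = math.log10(high) - math.log10(low)   # ~5 orders of magnitude

# Worst-case multiplicative error relative to the measured value:
# up to 1000-fold over, or 100-fold under.
worst_factor = max(high / measured, measured / low)
```

A tool with this band can thus be off by a factor of a thousand in one direction, which is the core of the argument that such predictions cannot anchor a regulatory decision.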

    2. This is an important topic and as someone who works regularly with ConsExpo, the RIFM 2-Box Air Dispersion Model (NF/FF analyses) and ARA's Multi-Path Particle Deposition Model, I am eager to learn from your collective insights. Will there be a workshop for this that is open for attendance?

    3. Dear Madhuri,

      Good to hear from you! We owe Joonas a debt of gratitude for pushing this issue and supplying data-based evidence of the need for further work. I do not know of any current or planned workshops but this issue is emerging and I think there will be more on this. Stay tuned. Best Regards, Mike
