We all probably know by
now that Human Health Risk Assessment is the integration of the ability of a chemical to
cause harm to human health with the actual exposure to that chemical that might
occur in the real world. In terms of a simple conceptual model: Risk = (Toxicological Harm per unit of Exposure) × (Exposure).
EVERY chemical will cause toxicological harm at some dose. Pure oxygen breathed for an extended period
is toxic. Indeed Wikipedia reports that
pulmonary and ocular toxicity result from extended exposures to elevated oxygen
levels at normal pressure. Given a high enough dose, toxicological harm comes from each and EVERY chemical we can think of. The “what” of that untoward health outcome is called the HAZARD IDENTIFICATION for that substance; in the oxygen example above it is pulmonary and ocular toxicity. It is usually the first bad thing, health-wise, that happens as you ramp up the exposure level, and it can range from irritation to more serious outcomes like cancer or even death. The exposure level at which that bad health effect happens defines the POTENCY of the chemical’s toxicological effect, with highly potent materials causing effects at very low exposures. Oxygen is not very potent. By comparison, benzene is very potent in its ability to cause toxicological harm.
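To make that conceptual model concrete, here is a minimal sketch in Python. It is a hypothetical illustration of my own, not a standard regulatory method, and the potency and exposure numbers are made up for the example rather than real toxicological data:

```python
# A minimal, hypothetical sketch of the conceptual model:
#   Risk = (Toxicological Harm per unit of Exposure) x (Exposure)
# The potency and exposure numbers below are illustrative only,
# not real toxicological data.

def screening_risk(potency_per_unit_exposure, exposure):
    """Combine potency (harm per unit of exposure) with exposure."""
    return potency_per_unit_exposure * exposure

# Hypothetical chemicals: one highly potent, one weakly potent.
chemicals = {
    "highly potent substance": {"potency": 10.0, "exposure": 0.01},
    "weakly potent substance": {"potency": 0.001, "exposure": 5.0},
}

for name, c in chemicals.items():
    risk = screening_risk(c["potency"], c["exposure"])
    print(f"{name}: potency={c['potency']}, exposure={c['exposure']}, "
          f"screening risk={risk}")

# The toy numbers show the highly potent chemical at a tiny exposure
# (risk = 0.1) outranking the weakly potent chemical at a much larger
# exposure (risk = 0.005). That is why a risk assessment needs data on
# BOTH hazard/potency and exposure.
```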
The point of all this is
that you cannot begin to do a risk assessment without data on the toxicological
properties (HAZARD IDENTIFICATION and POTENCY) of the chemical of interest. If you have No data you have No risk assessment, unless you really force the issue, which will be discussed below.
Unfortunately, much of the toxicological data that we do have was sparked by first seeing the adverse health effects of the chemicals on exposed humans. Benzene falls into this category. So do asbestos, vinyl chloride, and all the other known human carcinogens.
Seeing people stricken by these substances caused them to be
studied. The other category of
well-studied chemicals is pesticides.
This is primarily because they are designed to be commercial poisons, so they are expected to be relatively potent toxicants and clearly worthy of study. As a result, they are also highly regulated. How do we address the rest of the chemicals in our world?
In the late 1990s the Environmental Defense Fund issued a groundbreaking report entitled Toxic Ignorance (The Continuing Absence of Basic Health Testing for Top-Selling Chemicals in the United States): http://www.edf.org/sites/default/files/243_toxicignorance_0.pdf It proved with undeniable hard data that, at that point in time, “…even the most basic toxicity testing results cannot be found in the public record for nearly 75% of the top volume chemicals in commercial use.” As you might imagine, it caused quite a stir, and the EPA got involved and eventually hatched the High Production Volume (HPV) Challenge program: http://www.epa.gov/hpv/. This resulted in considerably more toxicological data, but, as you might guess, there remains a severe lack of data for the tens of thousands of chemicals still in commerce to which folks are exposed every day.
But that takes us to an
even more fundamental question: Why
haven’t we been testing the chemicals that we breathe, eat and touch in our environment to
determine their toxicological effects all along? Why has it taken the cold hand of public
scrutiny and regulation to get things moving?
I think one of the strongest factors behind this is the misguided presumption of safety by those with an economic stake in these chemicals. Many believe at some level that “no proof of risk means proof of no risk.” This is, of course, not true; however, for these folks there is no incentive to go “looking for trouble” by testing the chemicals to identify the hazard they pose and the potency of that hazard. Toxicology testing is, or has been, considered relatively expensive. Thus, the reasoning goes, why spend money on testing when you already assume the chemical is safe and the results could only bring bad news?
There is another large factor in this: the mistrust of toxicological data. Those who do not like the results point to the high doses used in the toxicological studies and assert that they do not relate to or represent the exposures received in the real world and are therefore unrealistic measures of risk. We will get into these issues (dose-response extrapolation and setting exposure limits) in future blogs, but I think you can see how the politics might be playing out in all this to create a strong bias towards not doing toxicological testing.
So at the end of 2013 we are left
with lots and lots of chemicals but relatively little toxicology data to
characterize the hazard and potency of these tens of thousands of substances in
commerce. My sense, however, is that, as Bob Dylan once sang, “The times they are a-changin’…” The European REACh statute is essentially driving the field by forcing risk assessment and the assignment of exposure limits based on very prescriptive procedures. I will go over REACh in a future blog because I believe it has the capacity, given the political will, to ultimately drive the science of risk assessment. This could conceivably force much more testing of both toxicity and exposure, but that remains to be seen.
On another front the EPA
is spending considerable money (tens and perhaps hundreds of millions of USD)
and resources in advanced toxicological screening using innovative molecular
biology techniques in a program entitled ToxCast: http://www.epa.gov/ncct/toxcast/
More confident data will begin to feed the risk assessment process, which should ultimately lower the uncertainty of the analyses. Lower uncertainty will,
in turn, lower the forced overestimation of risk that comes from doing a risk
assessment with a lack of data. Indeed,
in the end this confident knowledge will be much more cost-effective in focusing
on real risk and not spending money on phantom risk borne of uncertainty.
I encourage you to consider the Tox21 program also, which is running >8000 unique chemicals and mixtures through approximately 30 high-throughput assays, including assays evaluating mitochondrial membrane potential, DNA damage pathways, stress response pathways, and nuclear receptor responses. More information may be found at: http://ntp.niehs.nih.gov/?objectid=05F80E15-F1F6-975E-77DDEDBDF3B941CD
EPA has recently released "Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology" for public comment.
The link to the report is: http://cfpub.epa.gov/ncea/risk/recordisplay.cfm?deid=259936
Thanks, I am aware of these efforts and applaud them; however, my sense is that they are not to the point where we can translate these results into exposure limits or reasonable exposure-response curves for exposure routes found in the real world. I am sure it is coming - just not here yet.
Chris Packham from the UK is an expert in chemical risk assessment from dermal exposure. Chris sent me this comment that I thought should be included below:
For some reason my system here would not let me comment online. However, here are the thoughts I would have posted:
One of the issues that I am frequently confronted with is that in my particular field - damage to health due to workplace skin exposure - the hazard is moveable. We generally acquire chemicals to fulfil a particular purpose. In the process we may well change the chemical's characteristics and thus also the hazard. We may heat, mix, react, contaminate the chemical. Furthermore, in the majority of cases in the real industrial world we will be working with mixtures and these can represent completely different hazards than will be the case for the individual constituents.
In the 1990s the Dermal Exposure Network of the EU spent three years trying to identify a process whereby we could develop validatable exposure limits for skin exposure, only to have to report that this was not possible. The exposure limits for skin exposure for REACH are 'derived exposure limits' and not ones that are scientifically defensible; nor, if they were, would they have any practicable use in day-to-day risk assessments.
As the EU's Agency for Safety and Health has stated: “However, there is no scientific method of measuring the results of the body’s exposure to risk through dermal contact. Consequently no dermal exposure standards have been set.” - from “Occupational skin diseases and dermal exposure in the European Union (EU-25): policy and practice overview”, European Agency for Safety and Health at Work.
Another question would be "What would you measure?" Would you measure what lands on the skin, what adheres to the skin, what is absorbed into the skin or what penetrates the skin, or, in many cases, a combination of all of these?
Then, given that different areas of skin on the body react differently, where would you measure and how would you correlate the different measurements?
Consider also that most occupational skin disease is chronic and, with the most common form, irritant contact dermatitis, not limited to exposure to a single chemical. Irritant contact dermatitis is almost always due to repeated exposures to a multitude of different chemicals, each exposure causing some sub-clinical irritant damage until the skin finally succumbs. We have no method at present of measuring or quantifying the acute effects, although we can detect the subclinical damage at any particular point in time by measuring skin hydration. (Anyone wishing to know more on this can drop me an e-mail at chris@enviroderm.co.uk and I will send you a document on skin hydration measurement.)
So exposure limits would be largely meaningless in my field. However, we are currently looking at a measuring technique (polarized light spectroscopy) that appears to offer a possibility of detecting and quantifying the cumulative effect of repeated skin exposure to chemicals at a sub-clinical and acute level, but our project is currently being held up due to lack of funding.
One of the most problematic aspects of risk assessment I encounter for skin exposure is determining the real hazard in the workplace, particularly since this can vary during the task or in successive tasks. My view is that we have a long way to go yet!
If you would like to post them on to your blog please feel free to do so.
Best regards
Chris
Chris Packham
FRSPH, FIIRSM, FInstSMM, MCMI, RSP, MBICSc
EnviroDerm Services
Unit 10, Building 11, The Mews, Mitcheldean, GL17 0SN
Tel: 01386 832 311
Mobile: 07818 035 898
www.enviroderm.co.uk
Mike,
I have to take issue with your suggestion that suppliers and users of common chemicals in commerce have not been testing their materials. At the American Cleaning Institute, we found as a result of our participation in the OECD and EPA HPV Chemical programs that we were able to "liberate" a substantial amount of information from our companies. For 149 chemicals we sponsored in seven categories, we disgorged >6,100 studies across the SIDS endpoint spectrum and had to conduct only 8 additional studies across those chemicals to complete the dossiers.
Based on this success, we set a goal to make publicly available hazard and exposure information and to conduct a screening-level risk assessment for every chemical used in each of our members' consumer cleaning products. The first step of this project was to compile an Ingredient Inventory (http://www.aciscience.org/IngredientInventory.aspx) for our members' products. We identified about 920 chemicals that are used. We are currently identifying sources of publicly available hazard data and we hope to release that next set of information in December.
We've had a great deal of success in identifying publicly available hazard data for our chemicals, but where we could not find it, the issue is usually that such data are not publicly available, not that the tests have not been conducted. I think there is a similar story being revealed in the REACH arena as well - there is an awful lot of data available for high and medium volume chemicals, but much of it is held confidential within companies because of its value. And if anything, data compensation under REACH has probably exacerbated that trend.
Paul DeLeo
Senior Director, Environmental Safety
American Cleaning Institute
Hi Paul,
I am aware of the efforts of the American Cleaning Institute (ACI) and I applaud them as very proactive and in the best tradition of product stewardship. Also, I think your points on the HPV program "liberating" data are right to the mark for a good portion of the high volume chemicals. On the other hand, I have to say, after spending 35 years inside a Fortune 200 chemical manufacturing company, that there has been in the past a real bias against testing any chemical not required by regulation to be tested, for the reasons I mention above. If you look at the thousands of chemicals out there in commerce, I am pretty sure that only a small percentage have been tested at all or, if they have been tested, I am guessing that the data are not very complete, even if they are not publicly available.
High production volume does not always mean high exposure potential. The systematic approach to assessing the risk of a defined universe of chemicals - as is currently underway at ACI - remains an unfortunately rare event.