
Monday, October 28, 2013

Why Isn’t Risk Assessment Done on All Chemicals?

I have recently been asked why there are so few occupational exposure limits compared to the number of chemicals in commerce.  There are a number of potential answers to this, but I think the simplest and most accurate one is that there is a lack of data.  So that begs the question: why is there so little data?

We all probably know by now that Human Health Risk Assessment is the integration of a chemical's ability to cause harm to human health with the actual exposure to that chemical that might occur in the real world.  In terms of a simple conceptual model:  Risk = (Toxicological Harm per unit Exposure) x (Exposure).  EVERY chemical will cause toxicological harm at some dose.  Pure oxygen breathed for an extended period is toxic.  Indeed, Wikipedia reports that pulmonary and ocular toxicity result from extended exposures to elevated oxygen levels at normal pressure.  Given a high enough dose, toxicological harm comes from each and EVERY chemical we can think of.  The "what" of that untoward health outcome is called the HAZARD IDENTIFICATION for that substance.  In the above example with oxygen, it is pulmonary and ocular toxicity.  It is usually the first bad thing health-wise that happens as you ramp up the exposure level, and it can range from irritation to more serious outcomes like cancer or death.  The exposure level above which that bad health effect happens defines the POTENCY of the chemical's toxicological effect, with highly potent materials causing effects at very low exposures.  Oxygen is not very potent.  By comparison, benzene is very potent in its ability to cause toxicological harm.

The point of all this is that you cannot begin to do a risk assessment without data on the toxicological properties (HAZARD IDENTIFICATION and POTENCY) of the chemical of interest.  If you have No data, you have No risk assessment, unless you really force the issue, which will be discussed below.

Unfortunately, much of the toxicological data that we have was generated only after the adverse health effects of the chemicals were first seen in exposed humans.  Benzene falls into this category.  So do asbestos and vinyl chloride and all the other known human carcinogens.  Seeing people stricken by these substances caused them to be studied.  The other category of well-studied chemicals is pesticides.  This is primarily because they are designed to be commercial poisons, so they are expected to be relatively potent toxicants and are clearly worthy of study.  As a result they are also highly regulated.  How do we address the rest of the chemicals in our world?

In the late 1990s the Environmental Defense Fund issued a groundbreaking report entitled Toxic Ignorance (The Continuing Absence of Basic Health Testing for Top-Selling Chemicals in the United States):  http://www.edf.org/sites/default/files/243_toxicignorance_0.pdf   It showed with undeniable hard data that, at that point in time, "…even the most basic toxicity testing results cannot be found in the public record for nearly 75% of the top volume chemicals in commercial use."  As you might imagine, it caused quite a stir; the EPA got involved and eventually hatched the High Production Volume (HPV) Challenge program:  http://www.epa.gov/hpv/.  This resulted in considerably more toxicological data, but as you might guess there remains a severe lack of data for the tens of thousands of chemicals still in commerce to which folks are exposed every day.

But that takes us to an even more fundamental question:  why haven't we been testing the chemicals that we breathe, eat and touch in our environment to determine their toxicological effects all along?  Why has it taken the cold hand of public scrutiny and regulation to get things moving?  I think one of the strongest factors is the misguided presumption of safety by those with an economic stake in these chemicals.  Many believe at some level that "no proof of risk means proof of no risk."  This is, of course, not true; however, for these folks there is no incentive to go "looking for trouble" by testing the chemicals to identify the hazard they pose and the potency of that hazard.  Toxicology testing is, or has been, considered relatively expensive to do.  Thus, the reasoning goes, why spend money on testing when you assume the chemical is safe and the results could only bring bad news?

There is another large factor in this, and that is the mistrust of toxicological data.  Those who do not like the results point to the high doses used in the toxicological studies and assert that they do not relate to or represent the exposures received in the real world and are therefore unrealistic measures of risk.  We will get into these issues (dose-response extrapolation and setting exposure limits) in future blogs, but I think you can see how the politics might play out in all this to create a strong bias against doing toxicological testing.

So at the end of 2013 we are left with lots and lots of chemicals but relatively little toxicology data to characterize the hazard and potency of these tens of thousands of substances in commerce.  My sense, however, is that, as Bob Dylan once sang, "The times they are a-changin'…"  The European REACh statute is essentially driving risk assessment by requiring it, along with the assignment of exposure limits, based on very prescriptive procedures.  I will go over REACh in a future blog because I believe it has the capacity, given the political will, to ultimately drive the science of risk assessment.  This could conceivably force much more testing of both toxicity and exposure, but that remains to be seen.
On another front, the EPA is spending considerable money (tens and perhaps hundreds of millions of USD) and resources on advanced toxicological screening using innovative molecular biology techniques in a program entitled ToxCast:  http://www.epa.gov/ncct/toxcast/

More confident data will begin to feed the risk assessment process, which should ultimately lower the uncertainty of the analyses.  Lower uncertainty will, in turn, lower the forced overestimation of risk that comes from doing a risk assessment with a lack of data.  Indeed, in the end this confident knowledge will be much more cost-effective, focusing resources on real risk rather than spending money on phantom risk born of uncertainty.

Monday, October 21, 2013

Career Advantages of Being a Modeler

It occurred to me recently that I have not given you all the really good reasons why you should be dedicated to learning modeling. I will attempt to do so herein. The reasons are listed and further explained below:

  1.  It's cool!
  2.  It will really help you in your job as an industrial hygienist or exposure assessor.
  3.  You will become a relatively rare and sought-after animal.
  4.  It could dramatically increase your value to your employer.


It’s Cool:
Huey Lewis once sang that "it's hip to be square!"  If you have any feel for science at all, understanding or looking into the workings of reality can be a real rush.  The Science Channel counts on this human trait to gather viewers.  Indeed, seeking and organizing the factors that drive human exposure in any scenario is part of being human, and many of us find it to be simply fun.  Let's face it, we are curious animals that love to be competent and to develop tools (i.e., models), and acting on that curiosity is an end in itself for many of us.

It will really help you in your job as an industrial hygienist or exposure assessor:
Modeling will inform your judgment as to where the significant exposures might be, whether they occur in the present, happened in the past or have not occurred as yet.  It will allow you to estimate the exposure potential of scenarios literally on the other side of the globe.  It should also ultimately mean you will waste less time monitoring exposure scenarios that do not need measuring while focusing on others that do.  Properly done, skill in modeling could in the long run mean less direct monitoring and more effort put into characterizing what exactly is causing the potential over-exposures.

You will become a relatively rare and sought after animal:
Thanks to the efforts of my colleagues within the AIHA, there are quite a few more modelers out there in the Industrial Hygiene community than there were 20 years ago when we started beating the drum, but there are frankly still relatively few.  It is not the sort of discipline that you pick up very quickly, and there are very few places to actually learn it.  The 2-day AIHA Professional Development Course is probably the best, but it is very intense and, while Rome was not built in a day, it is even harder to "make a modeler" in two days.  Indeed, there are quite a few reasons that there remains a relative lack of folks who are reasonably skilled in human exposure modeling.  I outline this situation in detail in the following document:


Those of you who read this short Word document will find that it is an "offer of service" to clients to take the time and attention needed to actually train professionals on the job in modeling so that they become fully functional.  The offer has been out for a while and I have yet to have any takers.  If I get a lot of response to this particular blog I may reproduce it in a future post.  Frankly, it walks the line between service to you and your management and self-promotion, but I am willing to take that chance to get the word out.
The fact remains that there are very few places to get this training, and if you take the time to do so you will be a rare, and valuable, exception.  You will no longer be someone who just measures and interprets exposures.  You will be a technologist who predicts exposures and who understands and can explain the scientific basis for that judgment.  That skill is worth something, as the next point stresses.

It could dramatically increase your value to your employer:
I tell you truthfully that being able to model exposures (and dose-response) made my career at the Rohm and Haas Company.  The skill was responsible for at least 3 promotions within that company.  Using models, I was able to predict exposures with just a modicum of data and the invocation of assumptions.  I could explain and justify those predictions based on first-principles (and reviewable) science, and the managers just loved it.  Over 80% of my work was rendering these informed and explained technical opinions regarding the relative safety of products.  When the margins of safety were high enough, it gave them the confident knowledge they needed to proceed.  When the margins were not adequate, it gave me the necessary arguments (and support from my management) to obtain more information and data to reduce the uncertainty and usually the predicted level of risk.


Bottom Line:  Becoming skilled at modeling is not an easy or a short road but it’s the road less traveled and it could offer tremendous benefits to you and your career. 

Friday, October 18, 2013

Exposure Modeling Course at the SRA in Baltimore December 8 2013

PLEASE NOTE THAT THIS COURSE COVERS MONTE CARLO UNCERTAINTY ANALYSIS RELATED TO EXPOSURE MODELING.   IT WILL NOT TEACH MODELING PER SE.  SORRY FOR ANY CONFUSION.

This is your chance to learn Monte Carlo uncertainty analysis modeling in a one-day course from an excellent teacher.  Tom Armstrong and I have previously put together and taught a half-day course on human health exposure modeling here in the Philadelphia area.  Tom asked if I would be interested in teaching a one-day course at the SRA in Baltimore this year.  I told him that I would, but that there would have to be enough students attending to make it worthwhile for him or me to do so.  Tom agreed to put the course together and teach it by himself if a minimal number of students signed up, and to have me come on board if the student count became high enough.  Well, it looks like there are enough folks signing up for Tom to give the course.  Tom is an excellent modeler and teacher and you will get your (or your company's) money's worth if you attend this class.  If considerably more of you sign up in the next few weeks, I will be there as well - doing what I really love - teaching and getting paid for it.

The course is on Sunday Dec 8.

Below is mostly cut and paste from the SRA:

HOTEL RESERVATIONS
SRA has arranged for a special rate of $164 a night at the Hilton Baltimore. The Hilton is located in Baltimore's Inner Harbor district and is only 15 minutes from BWI airport. Transportation from BWI Airport is available via the Light Rail System for $1.60 each way. The Hilton is located adjacent to the Convention Center station. Guests can enjoy the hotel's prime location next to Oriole Park at Camden Yards; M&T Stadium, home of the Baltimore Ravens; and the University of Maryland Medical Center.

Please use this link for the discounted SRA rate: http://www.hilton.com/en/hi/groups/personalized/B/BWICCHH-SRA-20131207/index.jhtml?WT.mc_id=POG


ANNUAL MEETING WORKSHOPS
Workshops will take place on Sunday, December 8, and Thursday, December 12 at the Hilton.  Listed below are the available workshops.  Go to http://sra.org/sites/default/files/pdf/events/2013_Workshops.pdf to see all of the workshop descriptions.

If that link does not work, try this one:
http://www.sra.org/sites/default/files/pdf/events/2013%20SRA%20Workshops%20FINAL.pdf

I understand that you can sign up and pay for the workshop without registering for the conference.  I also hear that there is a significant discount for folks who are full-time students and want to take this course.







Sunday, October 13, 2013

Introduction to IH MOD Slide Deck

Dr. Tom Armstrong is a good friend and colleague and a tireless worker in the cause of educating the Industrial Hygiene community.  Tom just recently dropped a free and valuable gift in our laps.  It is his 53-slide presentation, as a PDF file, introducing folks to IH MOD.

Those of you who have been reading this blog may know that I think very highly of IH MOD.  For some details, see the Sept 2, 2013 post:  Modeling Math made Easy or at least Easier

IH MOD is an Excel spreadsheet workbook file that literally has many hundreds of hours of development put into it.  It is very skillfully programmed in Visual Basic by Daniel Drolet.  I am pretty sure it was Tom who took the idea to Daniel and, at least initially, provided the lion's share of technical support relative to the models.  I had the privilege of spending some time with these two a few years ago working on an expansion of the models.

IH MOD makes the mathematical calculation part of inhalation exposure modeling easy, indeed almost trivial, and it is written to be user friendly.   Now we have 53 slides that provide an excellent introduction to this great tool.    In the slides Tom covers the following:

  • Brief theory of modeling to estimate exposures 
  • What IH Mod is 
  • Contents and layout of IH Mod 
  • Starting and Navigating in IH Mod 
  • Entering parameter values in IH Mod 
  • Interpreting the results from IH Mod 
  • Examples of what IH Mod can do 
  • Worked Examples for you to try IH Mod 

The slides are available online at:

http://www.ihmod.org/uploads/3/2/9/5/3295818/intro_ih_mod_1_hr.pdf

Even if you do not plan on studying or viewing these slides soon - it is worth downloading and saving the PDF as a reference for future study.  If you EVER want to get into exposure modeling (and all the career-advancing advantages that it could bring) you will be glad you did.

Monday, October 7, 2013

Simple Approaches to Common Modeling Problems

Previous blogs have been pretty technical – this one is intended to provide more practical advice on simple approaches to common modeling problems.

I am going to stick with the theme of modeling the concentration of volatile organic compounds (VOCs) in the breathing zone.  As discussed in previous blogs, we have the models and the software tools to make these estimations, but we need to feed them.  Traditionally the most difficult inputs for modelers to get are the ventilation rate and the emission rate.  We spent a lot of ink talking about estimation of the ventilation rate (Q), and I think that, between the two, it is the easier to estimate.  The generation or emission rate (G) is more challenging but certainly doable.

Consider a small spill of a mixture of aromatic volatile organic compounds (VOCs).  The first task is to figure out which VOC(s) need to be evaluated.  This goes to the concept of the controlling health hazard; that is, which compound or compounds present the highest risk such that, if it (they) are controlled, then the other components will also not present an untoward risk to human health.  If all the VOCs have similar vapor pressures, then the simplest first crack at this problem is to take the ratio of each chemical's percentage within the mixture over its exposure limit.  For example, let us consider the example we used earlier of a mixture with 1% benzene, 50% toluene and 49% xylene.  I am not certain of the current TLVs, but for the sake of this example let's say they are 0.1, 50 and 50 ppmV, respectively.  The ratios would be 1%/0.1 = 10 for benzene and about 1 for toluene and xylene.  Clearly, benzene is the controlling health hazard in this example.  If it were just a 50/50 mixture of xylene and toluene (without the benzene) with the above supposed TLVs, then you should probably look at both, at least initially.  Toluene would be expected to be more volatile, but with the techniques we are using, xylene may be a close enough second that you need to look at both.  Plus, they most likely have similar adverse health end-points and thus need to be considered additive in their exposure and risk.  On the other hand, at this point in our understanding, benzene presents a dramatically different and more dreaded risk compared to the other two aromatics, and concurrent exposure to those two would not be considered to add to this risk.
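
To make the screening step concrete, here is a minimal sketch in Python using the illustrative percentages and supposed TLVs from the example above (not current ACGIH values):

    # Screen for the controlling health hazard in a VOC mixture:
    # ratio = (percent in mixture) / (exposure limit in ppmV).
    # The component with the largest ratio "controls" the assessment.
    mixture = {  # example values only, not current TLVs
        "benzene": {"percent": 1.0,  "tlv_ppm": 0.1},
        "toluene": {"percent": 50.0, "tlv_ppm": 50.0},
        "xylene":  {"percent": 49.0, "tlv_ppm": 50.0},
    }

    for name, data in mixture.items():
        ratio = data["percent"] / data["tlv_ppm"]
        print(f"{name:8s} percent/TLV = {ratio:5.1f}")

    # Output: benzene 10.0, toluene 1.0, xylene ~1.0 -> benzene controls.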

The next important task is to figure out how much of the chemical you are interested in is evaporating.  One simple way of getting at this value is to weigh the loss from the delivering vessel (paint can, aerosol can, pump spray, etc.), figure out how much of that loss is the evaporating chemical of interest, and divide by the drying time.  These are rough estimates but can provide good information.
   
So at this point we have determined that benzene is the controlling health hazard in our spill and we have figured out approximately how much has been released and evaporated.  Now we have to figure out a rate of emission.  If you know how long the spill takes to evaporate, you can estimate G by simply taking 1% of the total mass of the spill and dividing it by the time of evaporation.  This gives you a rough average for G.  This would be modeled (using IH MOD) with a constant G for that time period, with G = 0 at the end of the evaporation period.  No matter how long the evaporation takes, you need to compare the modeled concentration, expressed as an 8-hour time-weighted average, with the 8-hour TLV-TWA of the chemical.  This assumes the worker is only exposed for 8 hours.
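
As a rough numerical sketch of this constant-G estimate (the spill mass and drying time below are assumed, illustrative numbers, not values from the scenario above):

    # Average emission rate G for benzene from a spill of the
    # 1% benzene / 50% toluene / 49% xylene mixture.
    spill_mass_g = 500.0      # assumed total spill mass (grams)
    benzene_frac = 0.01       # benzene is 1% of the spilled liquid
    dry_time_min = 120.0      # assumed time for the spill to fully evaporate

    G_avg = spill_mass_g * benzene_frac * 1000.0 / dry_time_min   # mg/min
    print(f"average G for benzene = {G_avg:.1f} mg/min")          # ~41.7 mg/min
    # In IH MOD: constant G for 120 minutes, G = 0 afterward,
    # then compare the resulting 8-hour TWA to the TLV-TWA.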

If the spill is actually shrinking in area during the evaporation, then you can use a first-order assumption.  In this case we would assume that the spill takes 8 half-lives to essentially disappear (e.g., it starts at 640 grams and goes to 2.5 grams in 8 half-lives).  Thus, as a rough estimate, the half-life is the evaporation time divided by 8, and a derivation in a previous blog provides the decay constant; that is, k = 0.693/half-life.  If the drying time is less than 8 hours, you will get essentially the same 8-hour time-weighted average room concentration as with the constant G, but it will affect the 15-minute time-weighted average for comparison to a Short Term Exposure Limit (STEL).  The highest 15-minute average during the evaporation of the spill will be significantly higher with the first-order assumption than with an assumed constant rate of evaporation.
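
Here is a minimal sketch of that first-order bookkeeping, using the same assumed spill as in the constant-G sketch above; the point is that the first-order assumption front-loads the emission and therefore raises the early 15-minute averages used for a STEL comparison:

    import math

    M0_mg        = 5000.0                # assumed benzene mass in the spill (mg)
    dry_time_min = 120.0                 # assumed total evaporation time (min)
    half_life    = dry_time_min / 8.0    # spill "disappears" in ~8 half-lives -> 15 min
    k            = 0.693 / half_life     # first-order decay constant (1/min)

    # Mass emitted during the first 15 minutes under each assumption
    constant_G = M0_mg / dry_time_min                    # mg/min
    m_const_15 = constant_G * 15.0                       # mg
    m_first_15 = M0_mg * (1.0 - math.exp(-k * 15.0))     # mg

    print(f"k = {k:.4f} per min (half-life {half_life:.0f} min)")
    print(f"first 15 min, constant G : {m_const_15:.0f} mg")
    print(f"first 15 min, first order: {m_first_15:.0f} mg")
    # The first-order case releases about 4x more mass early on,
    # which drives up the peak 15-minute average concentration.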

Another way to get an evaporation rate is to put a relatively thin layer of the evaporating material (simulating the spill) into an open dish, put it on an open scale under the same conditions as the spill, and record the weight loss with time.  The thickness of the material in the dish should be similar to the thickness in the actual exposure scenario.  Consider the data set below for a product that has about 50% solids and 50% solvent.  The solvent is the one from our example:  1% benzene, 50% toluene and 49% xylene.  The only measurement is the total weight versus time.  The difference in weight over time is the solvent evaporation, and we are assuming that the benzene is 1% of that number.


(Data table from the original post: total product weight recorded versus time over roughly 160 minutes.)

You are pretty sure you have captured the vast majority of the weight loss when the weight does not change significantly over a period of a few hours.  In this case, less than 1% of the product mass is lost between 140 and 160 minutes.  The time it took to lose half the solvent weight is roughly the half-life, and again k = 0.693/half-life.  In this case the half-life is 20 minutes.  To be more precise you could model the weight loss data as a first-order decay, as has been shown in previous blogs.  Remember that we are only interested in the portion of the evaporating mass that is the chemical of interest.  So if benzene is 1% of the evaporating mass, G is 1/100th of the overall evaporation rate indicated in the data.

Remember that this scale experiment only gets you the half-life and k values - you still need to know how much was actually spilled to estimate the airborne concentration with a model.  Note also that the average benzene emission rate in the scale experiment is roughly 320 mg over 160 minutes (about 2 mg/minute) for 64 grams of product.  If you spill 640 grams of product, it would be 10 times higher.  Also, the average first-order evaporation rate at the beginning of a 640-gram spill would be about 80 mg/minute over the first 20 minutes after the spill.
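
A minimal sketch of the scale-up described in the last two paragraphs, using the numbers quoted above (64 g of product on the scale, a half-life of roughly 20 minutes, and the hypothetical 640 g spill):

    import math

    # Scale experiment: 64 g of product (50% solids, 50% solvent;
    # benzene is 1% of the solvent), weight loss recorded for 160 minutes.
    benzene_scale_mg = 64.0 * 0.5 * 0.01 * 1000.0    # = 320 mg of benzene
    G_avg_scale = benzene_scale_mg / 160.0           # ~2 mg/min average

    # Scale up to a 640 g spill (10x the mass on the scale)
    benzene_spill_mg = benzene_scale_mg * 10.0       # 3200 mg
    G_avg_spill = benzene_spill_mg / 160.0           # ~20 mg/min average

    # First-order view with the observed half-life of ~20 minutes
    k = 0.693 / 20.0                                 # ~0.035 per min
    m_first_20 = benzene_spill_mg * (1.0 - math.exp(-k * 20.0))
    G_first_20 = m_first_20 / 20.0                   # ~80 mg/min over the first 20 min

    print(G_avg_scale, G_avg_spill, G_first_20)      # ~2, 20, 80 mg/min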

If the evaporating surface area does not shrink with time (e.g., an open vessel) then you can do the above experiment with a deep source so that the measured rate should be relatively constant.

I am doing all this without peer review - which is usually a bad idea; however, this is a simple educational blog.  If you find some errors or have some issues with the advice please let me know and I will issue corrections and expand the discussion.   



Monday, September 30, 2013

Gifts from the Netherlands – More UNIFAC and more

Compared to the US, the Netherlands is a relatively small country, but it is gigantic in the stature and generosity of its scientists doing human health risk assessment.  Of course, these folks are not the only good guys in the world making great contributions to our field from relatively small countries (Denmark and the UK come to mind), but one Dutch colleague really highlighted this for me recently because of his kindness.  My friend and colleague, Theo Scheffers (formal title:  Ir. Theo Scheffers RAH), read the last blog and sent me an email.  He asked if I was aware of an Excel sheet (XLUnifac.xls) that has been highlighted as an advanced tool for REACh.  REACh stands for Registration, Evaluation, Authorisation and Restriction of Chemicals.  It is a very big deal in Europe and something I will cover in a future blog or two.

Turns out I had not heard of XLUnifac.xls, so Theo sent me a copy along with a PDF manual for the spreadsheet.  The program and manual were written by Preben Randhol and Hilde K. Engelien for their students at the Norwegian University of Science and Technology (NTNU).  It will handle 15 mixture components (UNIFACAL does 5) and really goes into the background of the UNIFAC method, which the UNIFACAL program discussed last week does not.  So if you really like looking under the hood, this is the program and document for you.  You can find the program with a Google search, but if that gives you trouble just email me (mjayjock@gmail.com) and I will send it to you along with the manual.

It further turns out that Theo has sent me a link to a site with quite a few very interesting tools for the Industrial Hygiene community.  He apologizes that some of the material is in Dutch, but there is enough there for any of us to find of interest.  The site is:  http://www.tsac.nl/websites.html
I will let you folks explore its contents for yourselves, but I wanted to put up two sites that really caught my eye.  The first is http://limitvalue.ifa.dguv.de/Webform_gw.aspx.  This site presents a very extensive database of Occupational Exposure Limits from all over the world.  It includes OSHA and NIOSH limits but not the ACGIH TLVs.

Another site that I found very interesting and potentially quite valuable:  http://www.chemspider.com/.   Below is a cut and paste from this site:

“ChemSpider is a free chemical structure database providing fast text and structure search access to over 29 million structures from hundreds of data sources. Watch our introduction video.”

Pretty cool!

Down in the Dutch language part of the TSAC web site is a section entitled SKINPERM.  In this section there is a link to the home page of another outstanding contributor to our science from the Netherlands:  Dr. Wil ten Berge.  Wil's site, http://home.planet.nl/~wtberge/, has been up for many years and contains a lot of great stuff.  The page is not flashy and has the very modest introduction reproduced below:

   "Main areas of interest are:
  • Estimation of the permeation rate of substances through the skin.
  • Estimation of dose/response relationships for acute inhalation toxicity, controlled by exposure concentration, exposure period and other independent variables"
The "skin" link leads to the program SKINPERM, which estimates dermal exposure to chemicals applied to the skin and in the air over the skin.  It does this as well as or better than anything else I have seen or have been able to find after many years of searching.  The "relationships" link presents his groundbreaking software for acute dose-response modeling (usually lethality) as a function of both time and concentration.  This is a very big deal in setting Emergency Response Planning Guidelines (ERPGs) and is used all over the world.  Wil provides these remarkable tools for free.  I will be discussing the technical details of all this in future blogs, but I just wanted to show off some of these links and tip my hat to our colleagues from the Netherlands.



Monday, September 23, 2013

Getting Activity Coefficients for Mixtures

In last week's blog I talked about non-ideal solution mixtures that can depart far from Raoult's Law predictions of the vapor pressure over a mixture.  The fix for this situation was to modify Raoult's Law to include an activity coefficient (AC):

VPM = (VPP)(MF)(AC)

VPM = vapor pressure of the compound of interest over the mixture
VPP = vapor pressure of pure compound
MF = mole fraction of the compound
AC = activity coefficient of the compound in the mixture

The thermodynamic AC can be very large (>1000), as in the case of benzene coming out of water, or pretty close to one (1.0), as in the case of a mixture of compounds with similar structures, like methanol in ethanol or a mixture of aromatic hydrocarbons.


So how do we come up with a numeric AC for a compound of interest in a mixture of chemicals?  Well, there are models, of course.  In this case there are relatively complicated physical-chemical models that use the structural characteristics of the molecules in the mixture to estimate the AC values of each component.  Here again we need someone to do the computer coding so we do not have to wade through all the math; we need a dedicated, user-friendly program.  I think that the most useful one that I have found over the years of looking has been UNIFACAL.exe.  This little program (1.7 MB) can do wonders.  The screen shot below shows what I mean:

(Screen shot of the UNIFACAL input form from the original post.)

Notice that there are places for 6 mixture components.  I have entered the two-component (binary) system that I talked about in the last blog:  1.7 grams (the solubility limit) of benzene in 1 liter of water.  UNIFACAL comes with a modest database of benzene, chlorobenzene, ethylbenzene, toluene and water.  It is a relatively simple matter to add compounds to the database.  For example, I added ethanol by adding up 1 x CH3 group, 1 x CH2 group, and 1 x OH group.  You will find it is even easier than it sounds once you know the structure of the chemical you want to add and actually put it into the UNIFACAL database.  All the halogens are in the database, along with some silicon, sulfur and nitrogen containing moieties.

Notice that the AC of benzene in this aqueous mixture is predicted to be over 2400, which means the vapor pressure of pure benzene (VPP) is essentially fully expressed over the mixture.  The reasons for this are discussed in the previous blog.
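
As a quick numerical check of the modified Raoult's Law for this case (the pure-benzene vapor pressure of roughly 95 mmHg at 25 C is an approximate value I am assuming here, and the AC of ~2400 is the UNIFACAL estimate above):

    # VPM = VPP * MF * AC for benzene at its solubility limit in water
    mol_benzene = 1.7 / 78.11        # 1.7 g benzene
    mol_water   = 1000.0 / 18.02     # 1 L (~1000 g) of water

    MF  = mol_benzene / (mol_benzene + mol_water)   # ~3.9e-4
    VPP = 95.0      # mmHg, approximate vapor pressure of pure benzene at 25 C
    AC  = 2400.0    # activity coefficient estimated by UNIFACAL

    VPM = VPP * MF * AC
    print(f"mole fraction = {MF:.2e}, VPM = {VPM:.0f} mmHg")
    # ~89 mmHg, i.e., nearly the full vapor pressure of pure benzene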

The latest version of this program was written and has been shared as freeware since 1998 by Bruce Choy and Danny D. Reible from The University of Sydney Australia and Louisiana State University and we owe them a debt of gratitude for their generosity. 

You can download this remarkable program at:  

Another way to estimate vapor pressure of compounds in water is to use Henry's Law Constant (HLC).
Henry's Law is simply a variation on Raoult's Law.  It says that the amount of a mixture's component vapor in the equilibrium head-space will be in constant proportion to the amount of that component dissolved in the liquid mixture.  This constant ratio is called the Henry's Law Constant (HLC), and if one knows the concentration in water and the HLC, then he or she can calculate the concentration (i.e., partial pressure) in the headspace.  Remember that the saturation head-space concentration is convertible to the partial pressure, which is the vapor pressure over the mixture (VPM).  We went through this calculation in a previous blog.  Many compounds have published HLCs, but these, of course, are only for water.  I must say that water is used a lot as a solvent, but if you are not dealing with water and have dissimilar organics (e.g., straight and branched chain hydrocarbons and lower alcohols), you need to use the AC approach above.
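
A minimal sketch of the Henry's Law route for the same benzene-in-water case; the Henry's Law constant of roughly 5.5e-3 atm*m3/mol at 25 C is an approximate literature value used only for illustration:

    # Headspace partial pressure via Henry's Law: p = HLC * C_aq
    HLC = 5.5e-3                      # atm*m3/mol, approximate value for benzene at 25 C
    C_aq = 1700.0 / 78.11             # 1.7 g/L = 1700 g/m3 -> mol/m3

    p_atm  = HLC * C_aq
    p_mmHg = p_atm * 760.0
    print(f"benzene headspace partial pressure ~ {p_mmHg:.0f} mmHg")
    # ~91 mmHg, consistent with the activity-coefficient estimate above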

Note that the AC is a function of concentration (i.e., mole fraction), so you need to understand what might happen to the concentration of the evaporating liquid over the time of exposure.  If the time of exposure is pretty short, or only a small portion of the evaporating liquid is expected to disappear during the exposure, then you do not need to worry about it much; however, if the composition is expected to change significantly during the exposure then you need to account for this.  I usually do this by taking the worst case.  That is, what is the worst emission rate that might occur during the exposure?  Use that for the entire exposure period, knowing that it is a purposeful overestimate that still might be useful.  If you cannot live with the overestimation there are other ways to approach the problem, but they always require more work.

In the next blog I am going to talk about practical approaches to estimating emission source rates by various means under different circumstances.   If you send me some of the specifics of what has been challenging you I may use it as an example.   Give me as much detail as possible and let me know what I might need to "blind" for reasons of confidentiality.