
Sunday, January 25, 2015

i-SVOC (2014): A Modern and Remarkable Advancement over AMEM

For me the best part about writing a blog is the networking and interaction with colleagues.  After last week's blog about an old but, I thought, still useful software tool (AMEM), I was contacted by Shen Tian, P.E., Environmental, Health and Safety Engineer at Bayer Material Science.  Shen advised that he had found and uses EPA's i-SVOC (2014). (Note: SVOC stands for Semi-Volatile Organic Compounds, of which phthalates and flame retardants are prime examples.)  Shen advises that this model can make similar estimates of SVOCs (e.g., flame retardants and phthalates) emitting from products (the substrate).  Copying from the EPA web site on this program, we are told:

“Computer program i-SVOC Version 1.0 is a Microsoft Windows-based application for dynamic modeling of the emissions, transport, sorption, and distribution of semi-volatile organic compounds (SVOCs) in the indoor environment. This program covers the following indoor media or compartments:
·         air,
·         sources,
·         sinks (i.e., sorption by interior surfaces),
·         contaminant barriers,
·         suspended particles, and
·         settled particles.”
One can find and download the program and documentation at: http://www.epa.gov/nrmrl/appcd/mmd/i-sovc.html.  Shen went on in his email to discuss his experience with the model.  He notes that the emission of an SVOC from a plastic depends on the following user inputs, which I have copied and listed below:
  • D: diffusion coefficient of the SVOC in the substrate
  • Kma: partition coefficient of the SVOC between solid and air
  • ha: mass transfer coefficient in the air
  • C0: the initial SVOC concentration in the substrate
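To see how these four inputs interact, below is a rough numerical sketch of diffusion-limited emission from a slab with a surface mass-transfer boundary. To be clear, this is my own illustrative toy in Python, not i-SVOC's actual algorithm, and every parameter value in it is invented for the example.

```python
import numpy as np

def svoc_emission(D, Kma, ha, C0, L, t_end, n=50, steps=20000):
    """One-dimensional finite-difference sketch of SVOC emission from a slab.

    D     : diffusion coefficient of the SVOC in the substrate (m^2/s)
    Kma   : material/air partition coefficient (dimensionless)
    ha    : mass transfer coefficient in the air (m/s)
    C0    : initial SVOC concentration in the substrate (ug/m^3)
    L     : slab thickness (m), sealed on the back face
    Returns the fraction of the initial mass emitted by time t_end (s),
    assuming the room-air concentration stays near zero.
    """
    dx = L / n
    dt = t_end / steps
    assert D * dt / dx**2 < 0.5, "explicit scheme stability limit exceeded"
    C = np.full(n, float(C0))
    emitted = 0.0
    for _ in range(steps):
        flux = ha * C[-1] / Kma                      # ug/m^2/s leaving the surface
        dC = np.empty(n)
        dC[1:-1] = D * (C[2:] - 2 * C[1:-1] + C[:-2]) / dx**2
        dC[0] = D * (C[1] - C[0]) / dx**2            # no-flux (sealed) back face
        dC[-1] = D * (C[-2] - C[-1]) / dx**2 - flux / dx
        C += dC * dt
        emitted += flux * dt
    return emitted / (C0 * L)

# Made-up parameters for a 1 mm slab over roughly four months:
frac = svoc_emission(D=1e-14, Kma=1e5, ha=1e-3, C0=1.0, L=1e-3, t_end=1e7)
print(f"fraction emitted: {frac:.2f}")
```

Even this toy shows the behavior Shen describes: with a large Kma and small D the substrate releases its load very slowly, which is exactly why D is the input that matters most.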

I have to admit I had not heard of the i-SVOC model, so I downloaded it and installed it on my Windows 8.1 laptop (reportedly, it will work on Windows 7 or XP as well).  My first impression of this program:  Holy Cow!  If AMEM is high school, this is graduate school!  It is a remarkable piece of freeware with a very slick interface that attempts to determine the time course and fate of an SVOC of concern in all the various compartments (sources, sinks, settled dust and airborne particulate matter) extant indoors.  The documentation was written by Dr. Zhishi Guo and is presented in an 80-page PDF file from 2013 that downloads with the program.  I have known of Dr. Guo's work for many years; he is a brilliant modeler who has done outstanding work at the EPA in constructing and sharing many very useful and relatively user-friendly models.  The i-SVOC model was also reviewed and tested by some of the best technical minds working in this field.  By all indications, this is a first-rate model and program, but it will require a significant level of study and time from anyone wishing to use it.  To make this task easier, a number of demo input files are provided with the download to help the new user.
In order to feed this model relative to its inputs, Shen Tian advises that the companion modeling program (PARAMS 1.0) from EPA can be used to estimate quite a few of the required input parameters.  I went looking for this program and found a link at: (http://www.epa.gov/ordntrnt/ORD/NRMRL/appcd/mmd/PARAMS.zip); however, when I tried the link it was broken with the dreaded 404 error.  I am looking into reviving this link and will let you know what happens, if you like, via email correspondence.  If worst comes to worst, I will get a copy of PARAMS 1.0 and make it available to anyone who sends me a request at mjayjock@gmail.com.
In his work Shen reports that the most challenging part of using i-SVOC is the estimation of D, since there has not been a lot of testing done on SVOC/substrate pairs.  To help fill this particular gap, he tells us that Liu, Guo and colleagues recently published a testing method for estimating D and Kma in Atmospheric Environment 89 (2014) 76-84, entitled "Experimental method development for estimating solid-phase diffusion coefficients and material/air partition coefficients of SVOCs," which he found helpful.
In all, i-SVOC represents a remarkable accomplishment and resource for anyone doing this type of human exposure assessment of SVOC indoors.  I am indebted to Shen Tian for pointing us toward it.   If you have a need to perform this type of analysis, I believe that it would be well worth your effort to learn and use it.

Sunday, January 18, 2015

AMEM: A Tool to Estimate Human Exposure from Fire Retardants and Plasticizers in Plastics


A lot of plastics in our homes are not made of just polymer.  Many have monomeric additives within them, and sometimes these additives represent a significant portion of the mass of the plastic.  For example, the plasticizer content of flexible PVC is typically greater than 10% and can be as high as 30-40%.  Phthalates are often used as the plasticizer to make PVC flexible.  All phthalates are monomers and all will diffuse out of the plasticized PVC matrix given enough time to do so.  The typical mechanism for plasticized items used indoors is for the plasticizer to diffuse out of the PVC and then partition into house dust.  Much of house dust is composed of our shed skin cells and is therefore expected to be lipophilic.  The plasticizer would partition into the dust, and the dust would be distributed within the residence.  Of course, some will be cleaned up and removed, but it is difficult to determine how much might be removed before human exposure occurs.  What we do know is that there is a significant amount of phthalates in the dust of some homes.  The dominant source of exposure is anticipated to be hand-to-mouth ingestion of this dust, with kids doing more of it than adults but adults doing at least some of this type of ingestion throughout our entire lives.

While researching this issue on the Internet I found the following data for a PVC "geomembrane," which is another way of saying a canal liner used to contain water.  They report that the PVC film was 30-35% plasticizer, but I did not find out which plasticizer they used.  Their report shows the following loss of plasticizer to the water.

Service Years                    2       4       5       9       14      19      Original Specification Value
Plasticizer Content Remaining    92.1%   67.7%   67.8%   65.6%   58.0%   54.3%   None

I am frankly not sure how this relates to plasticizer diffusing from a PVC product to its surface to then be transported within the indoor environment via dust; however, it is an interesting study.
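One thing we can do with the numbers above is check whether the loss resembles the square-root-of-time behavior expected for early-stage Fickian diffusion out of a slab. The quick least-squares fit below is my own back-of-the-envelope exercise, not anything from the geomembrane report:

```python
import math

# Plasticizer remaining vs. service years for the PVC geomembrane (from the table)
years = [2, 4, 5, 9, 14, 19]
remaining = [0.921, 0.677, 0.678, 0.656, 0.580, 0.543]

# Early-stage Fickian loss from a slab grows as sqrt(time), so fit
# fraction_lost = k * sqrt(t) by least squares through the origin.
x = [math.sqrt(t) for t in years]
y = [1.0 - r for r in remaining]
k = sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)
print(f"fitted k = {k:.3f} per sqrt(year)")
for t, r in zip(years, remaining):
    pred = 1.0 - k * math.sqrt(t)
    print(f"{t:2d} yr: observed {r:.3f}, sqrt-t model {pred:.3f}")
```

The fit is crude (the reported loss in the first 2-4 years is faster than a single sqrt-t curve allows), which hints that more than simple diffusion was going on in that canal, perhaps surface extraction by the water.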

It should be mentioned that plasticizers are not the only monomers to diffuse out of plastic.  Indeed, any monomer would be expected to do so, including flame retardants.  Some flame retardants appear in the plastics of our electronic cases at concentrations around 10% and are thus prime candidates for the migration-to-dust-to-ingestion pathway.

It turns out there has been a tool around for some time, developed in 1989 by Arthur D. Little, Inc. for the EPA Office of Pollution Prevention and Toxics, Economics, Exposure, and Technology Division, Exposure Assessment Branch (EAB).  It is an "oldie but goodie."  Indeed, it is a DOS program that EPA claims can be run in the modern Windows (7 or 8?) environment.  I keep an old PC around just to run this old stuff (it runs DOS 6.2, Windows 3.2 and Windows XP), so I am not sure whether AMEM will run on Windows 7 or 8.  If anyone has any experience with this, please let me know.

You can download and learn about it at:  http://www.epa.gov/oppt/exposure/pubs/amem.htm   A cut and paste of some of the Q&A from this EPA site is presented below:
The model assumes:
·    The chemical is homogeneously distributed throughout the polymer and is not initially present in the phase external to the polymer,
·    Migration of the chemical is not affected by the migration of any other chemical or by the penetration into the polymer of any component of the external phase,
·    The migration is isothermal,
·    Fick's law of diffusion and convective mass transfer theory apply.
How Does AMEM Work?
AMEM is a DOS-based software product developed in 1989 that uses a family of mathematical equations that address simple and complex scenarios for chemical migration through a specific polymer. The more complex equations require more input data. Using the model, you may:
·    Develop migration estimates,
·    Consider the effect of chemical partitioning between the polymer and the external phase, and
·    Consider the effect of mass transfer resistances in the external phase.
In all cases the model estimates the fraction migrated (i.e., the fraction of the chemical initially present in the polymer that migrates) after a specified duration. This model only provides one parameter needed to estimate exposure. The user must then use other equations and/or models to estimate exposure.
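For the simplest case AMEM covers (migration into a well-mixed external phase with no partitioning limit or boundary-layer resistance), the fraction migrated from a plane sheet has a classic series solution (Crank's textbook result). The sketch below is my own rendering of that textbook solution, not AMEM's code, and the example numbers are invented:

```python
import math

def fraction_migrated(D, ell, t, terms=200):
    """Crank's series solution for one-sided diffusion out of a plane sheet
    of thickness ell into a well-mixed external phase, with no partition or
    boundary-layer resistance (the simplest scenario AMEM addresses).

    D in cm^2/s, ell in cm, t in s; returns the dimensionless fraction migrated.
    """
    total = 0.0
    for n in range(terms):
        lam = (2 * n + 1) * math.pi / (2 * ell)  # eigenvalues of the slab problem
        total += 8 / ((2 * n + 1) ** 2 * math.pi ** 2) * math.exp(-D * lam ** 2 * t)
    return 1.0 - total

# Invented example: additive with D = 1e-10 cm^2/s in a 1 mm sheet after ~12 days
print(f"fraction migrated: {fraction_migrated(1e-10, 0.1, 1e6):.3f}")
```

At short times this reduces to the familiar 2*sqrt(D*t/pi)/ell behavior, which is why the fraction migrated initially grows with the square root of time.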
What Do I Need to Use AMEM?
Polymer category (i.e., Silicone Rubber, Natural Rubber, LDPE, HDPE, Polystyrene, or unplasticized PVC) or diffusion coefficient of the polymer.  (MAJ Note: This is the first I have noticed the category of unplasticized PVC.  I think, however, the program and documentation might still present some insight for the estimation of phthalate migration out of PVC).
·    Molecular weight of additive.
·    Total polymer sheet thickness (cm)
·    External phase (i.e., air, water, or solid)
·    One- or two-sided migration
·    Time frame of interest
What Type of Computer System Do I Need?
·    Processor - IBM-compatible computer with a Pentium microprocessor (minimum speed: 33 MHz)
·    Memory - 640K
·    Hard disk space - 2 MB
·    Operating System - AMEM is a DOS-based program; however, it can be run in a Windows environment using keystrokes rather than a mouse.

(Jayjock Note:  640k of memory and 2 MB of hard disk space really shows how far we have come since 1989).
 
If anyone knows of a better tool to answer these questions, please drop me a line and I may write about it here.





Monday, January 12, 2015

Inhalation Model Research Needs

Hopefully the readers of this blog are now convinced of the importance of models that estimate the exposure potential of workers and other humans.  My sense, however, is that we are only scratching the surface of the potential value of these models.  Indeed, many folks still reject the idea of modeling in favor of the direct approach of measuring, which they consider the gold standard.  To the extent that we do not have the tools to feed our models with proper inputs, they are correct.  It is by now an old "Catch 22":  it costs more to get the model inputs for any single question at hand than it does to directly measure, so we almost always directly measure.  The reality is that once we have the modeling input parameters we can use them in many different scenarios, so that in the end modeling would generally be a much more cost-effective way forward.
Please don't get me wrong, models are still very valuable tools but they could be so much more powerful and useful if properly developed with research done as a public works project in the general shared interest.  
Indeed, it was with this in mind that I spent quite a bit of time in Italy starting in 2004 trying to organize such an effort.  Given an introduction by my friend and colleague, Bert Hakkinen, I began working with Stylianos Kephalopoulos, who was head of the Physical and Chemical Exposure Unit of the Institute for Health and Consumer Protection (IHCP/PCE) at the Joint Research Centre (JRC) of the European Commission in Ispra, which is just north of Milan.
The REACh regulation was happening in Europe and it was obvious to many that exposure assessment tools needed to be developed to help with the implementation of this ground-breaking legislation.
Together we first organized a pre-workshop to set up the questions and issues, and then later a series of 5 simultaneous workshops on the general subject of modeling, held in June 2005 in Intra, Italy.  I was an organizer and moderator for the pre-workshop and the workshop on model "Source Characterization," since I had always seen this as a vital research need.  For this workshop on Source Characterization, we invited and gathered modelers from all over the world, with the following folks coming to the workshop:

Arvanitis A.   JRC/IHCP/PCE (EU) (Rapporteur)
Bruinen de Bruin Y    JRC/IHCP/PCE (EU)
Delmaar C.   RIVM (Netherlands)
Flessner C.   EPA (USA)
Hanninen O.   KTL (Finland)
Hubal E. Cohen   EPA (USA)
Jantunen M.   KTL (Finland)
Jayjock M.   The Lifeline Group (USA) (Moderator)
Kephalopoulos S.   JRC/IHCP/PCE (EU) (Co-ordinator)
Koistinen K.   JRC/IHCP/PCE (EU)
Little J.   Virginia Polytechnic Inst. (USA)
Mason M.   EPA (USA)
Matoba Y.   Sumitomo (Japan)
McKone T.   University of California (USA)
Nazaroff W.   University of California (USA)
Pandian M.    Infoscientific.com (USA)
Price P.        The Lifeline Group (USA)
Shade W.     Rohm and Haas, Co (USA)
Sheldon L.   EPA (USA)
Sutcliffe R.  Health Canada (CAN)
Won D.        National Research Council (CAN)
Wu K.          University of Taiwan (Taiwan)
Zhang Y.     Tsinghua University (China)

Quite a few other fine modelers could not make this workshop but contributed to the report.

I must tell you that this was a remarkably talented and energetic group and it was all I could do to keep up with the ideas coming out of this critical mass of world-class modelers.   The main conclusions of our deliberations are presented below:

“It is the recommendation of the Workshop participants that the work products presented herein to be used in the systematic development of human exposure models for their use in a tiered approach to exposure/risk assessment. 
Given that the 5 bins presented herein represent a consensus taxonomy or universe of sources, the workshop participants advise that a reasonably representative subset of this comprehensive listing be selected for specific laboratory analysis and model development. It was further suggested that exposure models designed to describe these sources of exposure and the transport and fate of substances should be constructed using a step-wise approach as outlined in this report.”

In essence the group determined that there was no reasonably inclusive outline description of source types and certainly no systematic research effort to characterize them.   The two-day workshop resulted in the following primary work products: 
  • Identification of existing source sub-models: presented in the pre-workshop report and references
  • A defined Taxonomy of Sources
  • Identification and definition of the attributes and characteristics of First Principle, Mechanistic Source and Transport/Fate Models to be developed in a tiered approach 
All of the details of these outcomes are described in the 104-page workshop report, which I will send to anyone requesting it: mjayjock@gmail.com
This work and report are almost 10 years old.   From my perspective some progress has been made primarily from the work of Drs. Bill Nazaroff and John Little and their colleagues in the characterization of indoor air sources.    I think even Bill and John will admit that the vast majority of work that we outlined in this workshop has not been started.   From my perspective the effective implementation of REACh continues to limp along without these tools.  Any effective re-authorization of TSCA would also require the fruits of this research. 
As usual nothing really is going to happen without committed resources ($).   I simply plan to pull this report out every few years, dust it off and remind folks that it is here.   If we, as a society, are really serious about doing a comprehensive job of human health risk assessment to chemicals we will ultimately need to develop these cost-effective tools.


Sunday, January 4, 2015

Acceptable Risk: Personal Gut Check

As Exposure and Risk Assessors we typically sit at the center point of imploding pressures.  We have a distinct responsibility to at least three external groups; namely, our charges, our employers and our profession.  In addition, I believe from a moral and ethical perspective, we need to be responsible to ourselves.

Our charges are the folks on the receiving end of any exposure to chemicals that might result from our analyses and recommendations.   When we do risk assessments for this group we are often forced into the role of risk manager and this has some strong implications for us.  Please let me explain why I say this. 
 
When we commit to doing a risk assessment, it of course needs to come to a conclusion relative to the risk.  In the world of Industrial Hygiene this is often the comparison of the predicted or measured exposure (EXP) to the Occupational Exposure Limit (OEL).  When EXP/OEL is less (hopefully much less) than 1, the risk assessment gets a happy face.  This risk is considered to be "acceptable" or at least not unacceptable (see previous blogs on this topic).  At that point the risk assessment report normally gets written.  If, however, the ratio of EXP/OEL is 1 or greater, we have a frowning face.  Typically, this means the application of some risk management option to choke down the EXP so that we can get back to a happy face, and the report gets written at that point.  In a long career I do not ever remember writing a report that said the risk was unacceptable.  In essence, the risk assessor is forced to become a risk manager before his or her report can be written with a happy face.
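The screening arithmetic above boils down to a one-line hazard quotient. The numbers in this little sketch are invented purely for illustration:

```python
def risk_ratio(exp_twa, oel):
    """The EXP/OEL comparison from the post: below 1 earns the happy face."""
    return exp_twa / oel

# e.g., a measured 8-hour TWA of 1.2 ppm against an OEL of 5 ppm (made-up values)
hq = risk_ratio(1.2, 5.0)
verdict = "acceptable" if hq < 1.0 else "apply risk management"
print(f"EXP/OEL = {hq:.2f}: {verdict}")
```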

All of this presupposes that the EXP was accurately estimated and that the OEL is an appropriate measure of an exposure level presenting an acceptable risk.  Whether this is true is an issue for other discussions; for purposes of this treatise, let us assume it is.

In doing a risk assessment we need to come to grips with issues around the safety of our charges and the needs of our employers.   We do this while staying true to the precepts and standards of our profession.   Almost all of this typically involves discussion and decisions involving money and where we as a professional draw the line on what we “believe” is reasonably acceptable risk.

I have never found balancing these needs to be easy.  We as the assessors are, of course, human, with our own set of needs, some of which I list below:
  • To assure the health and safety of the workers relative to this risk
  • To surprise and delight the boss or client relative to our ability to add value to the organization
    • remain "promotable" as an employee
    • remain "on contract" as a consultant
  • To assure our livelihood:
    • maintain income and a lifestyle
    • provide well for our families
    • educate our children
  • To feel good about what we are doing in our careers and as a human being

It really is not simple to balance these.  For example, you could stay employed and keep your job but still be considered a "problem" by your employer.  On the other hand, you have to be part of the team with your boss and be invested in the success of the organization.

For me this all boils down to a specific and not very complicated personal  “gut check” I have for any risk assessment that I am willing to write and sign:  Would I allow a member of my family to be exposed to the occupational scenarios I am declaring to be acceptably safe?    If I would then I feel like I can endorse the analysis and conclusions.

I am not suggesting that this is the only standard; it is simply mine.  I admit to being extremely fortunate and grateful in my career because this standard never forced me to decide between employment and professional and personal integrity with my previous employer of 35 years or my current or past clients.  Somehow I only wind up with the type of client that seems to subliminally understand and appreciate this position.  If I lived in another time, place or circumstance, I am frankly not sure I could have maintained this standard.  My point is that I believe we should all have a "gut check" to gauge our bottom line with clients, employers and ourselves.  I believe we all need to keep our personal standard in mind when we are put into the position of determining and ultimately managing acceptable risk for other human beings.

The general subject of “acceptable risk” has generated a remarkable amount of interest and comment from readers of this blog.    Indeed, last week’s blog is on track to be the most popular post in almost 90 previous offerings.   We have begun work on what will probably be a Roundtable on this subject at the 2016 AIHA Conference in Baltimore.  

Human Health Risk Assessment is, after all, a human pursuit and, in the end, quite personal.  I would love to hear from folks reading this blog about your personal “gut checks” in doing human health risk assessment.




Monday, December 29, 2014

Acceptable Risk: The Discussion Continues

After the blog on “Acceptable Risk” I received an email and phone call from Harry Ettinger who, in addition to being a friend and colleague, is a Past President of the American Industrial Hygiene Association.  Harry has been around a long time and has a wealth of knowledge, wisdom and perspective.  Harry suggests that “Acceptable Risk” is an important subject that deserves additional attention.  He encouraged a panel discussion at a future AIHCE to air out the issues around this topic and the companion subjects:  “how safe is safe enough and how clean is clean enough”.    

If any readers of this blog are interested in participating on such a panel, please let me know (mjayjock@gmail.com) and I will put you on the list of potential participants.

Harry goes on to make the following points:

  •        There is probably no definitive answer that will satisfy everyone (or perhaps anyone).
  •        It depends on which side of the fence you are sitting on.
  •        It needs:

o    a balance so that the greatest good (whatever that means) is provided
§   to the greatest number of people who are most important (however that is measured)
§   by ethical/unbiased/knowledgeable decision makers (if these decision makers can be found or even exist)
§   in a long-term time frame (rather than short term)

Harry points out that, unfortunately, we typically define acceptable risk, to a large extent, on the basis of:

§  who has the most political clout
§  shouts the loudest
§  short term considerations
§  media influence
§  risk that has typically already been accepted

Harry advises that consideration should be added to any discussion of “acceptable risk” relative to the subjective perception and psychology regarding the source of the risk. 

He reminds us that we kill 38,000 people a year (approximately 100 deaths a day) in car incidents, and do not think very long about this death rate. We know that reducing the speed limit will reduce the death rate but when this restriction was introduced, many people objected.  Indeed,  New Mexico currently has a 75 mph speed limit on its Interstate.   If a rail car carrying chemicals such as Chlorine/Ammonia/shale oil/etc. derails, the subject is off the front page of the newspaper in 2-3 days (even if there are fatalities). If that same rail car was carrying protective clothing with minimal radioactive contamination (that is not readily released) the uproar would continue indefinitely.

Harry suggests that another topic to add to the discussion is that acceptable risk changes over time. What is unacceptable today was acceptable 10-20 years ago, and may or may not be acceptable in the future.  He believes that the definition of acceptable risk varies as a function of time, of where it happens, and of the other relative risks extant at the time. This suggests that the 1/1000 definition in the benzene decision (which we still quote and use) probably needs to be updated.


I appreciate Harry providing his considerable insight on this subject and encourage you all to weigh in on this important topic.   As I get older I am beginning to appreciate more and more why some tribal folks highly value their elders. 

Sunday, December 21, 2014

Dermal Exposure from the Air

We can get a significant dermal exposure to a toxicant from having exposed skin in contact with that toxicant in ambient air.   I am not talking about splashing or the settling of mist onto the skin, the mode of exposure discussed in this blog is pure vapor in air-to-skin-to-systemic absorption.  This manner of exposure can become very important when the respiratory route is reasonably well guarded via the use of a respirator.

The classic example of this type of exposure is phenol which has a relatively low exposure limit indicating that it is quite toxic via inhalation.   It is also quite irritating to the respiratory tract and thus may provide some good warning indications of exposure via that route.  Thus, folks who have to work in environments at greater than the OEL (ACGIH TLV and OSHA PEL = 5 ppm as an 8 hour time weighted average) would almost certainly be using respirators.

Phenol also readily penetrates the skin and, again, I believe dermal exposure to liquid phenol placed or splashed onto the skin surface would be very irritating or even corrosive. 

A more subtle route of exposure is as vapor molecules going from the air above the skin, onto the skin and then through the stratum corneum and dermis to be systemically absorbed.  This presents a real problem to the Industrial Hygienist trying to evaluate or estimate this exposure potential.

One method would be to put absorbent patches on the skin, which are subsequently desorbed and analyzed.  These could provide a measure of the weight per square centimeter to which the skin was exposed.  Multiply this loading by the total area of exposed skin for at least some measure of what the exposure might be.
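As a sketch of that arithmetic, with entirely hypothetical numbers:

```python
# Entirely hypothetical numbers for illustration: a desorbed patch
# reporting 2 ug/cm2 over a shift, and roughly 2,000 cm2 of exposed
# skin (hands, forearms, face and neck).
patch_loading = 2.0      # ug/cm2 recovered from the patch
exposed_area = 2000.0    # cm2 of uncovered skin
dose_mg = patch_loading * exposed_area / 1000.0  # ug -> mg
print(f"estimated dermal loading: {dose_mg:.1f} mg")  # prints: estimated dermal loading: 4.0 mg
```

Note that this gives the mass delivered to the skin surface, not the systemically absorbed dose; the fraction absorbed still depends on the chemical's skin permeability.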

Other than involving a lot of logistics and laboratory development, the above method has the problem of being done AFTER the fact of exposure.  What we need is a method that is prospective; that is, predictive of the exposure before it happens.

I have discussed Dr. Wil ten Berge's work here before.  He developed SKINPERM, which has been around for a long time, and, working with Daniel Drolet, Rosalie Tibaldi and Tom Armstrong, he has produced the more recent, more user-friendly IHSkinperm model.  The basic modeling engine that Wil developed is the only one I know of that will take the airborne concentration of a chemical (along with other physical-chemical properties) and estimate the dermal exposure potential of that chemical to exposed skin.

You can find SKINPERM on Wil’s website: http://home.wxs.nl/~wtberge/qsarperm.html   In addition to the model, Wil has a wealth of educational material on dermal exposure on this web site that has been around serving the risk assessment community for some time.   Put “IHSkinperm” into Google and the first two hits are the manual and the spreadsheet.

Unless someone is wearing well-fitted vapor barrier clothing, you might want to give some consideration to the possibility that even skin ostensibly covered with cloth clothing is at least somewhat “exposed” to vapors.   Considering the person naked (about 2 m2 of skin) would represent a worst case or bounding condition for these estimates.
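To see why the vapor-to-skin route can matter, here is a back-of-the-envelope comparison in the spirit of what SKINPERM computes.  The air-to-skin permeation coefficient below is a placeholder I picked for illustration, not a measured value; SKINPERM estimates the real one from physical-chemical properties:

```python
# Hypothetical screening comparison of dermal vapor uptake vs. inhaled intake.
# All parameter values here are placeholders, NOT measured data.
kp_air = 0.5      # m/h, ASSUMED air-to-skin permeation coefficient
c_air = 19.0      # mg/m3, roughly 5 ppm of phenol at room conditions
area = 2.0        # m2, naked whole-body worst case from the post
hours = 8.0       # full-shift exposure

dermal_mg = kp_air * c_air * area * hours   # mass crossing the skin
inhaled_mg = c_air * 10.0                   # ~10 m3 of air breathed in a workday
print(f"dermal ~{dermal_mg:.0f} mg vs inhaled ~{inhaled_mg:.0f} mg")
```

With these made-up inputs the dermal pathway delivers a dose of the same order as breathing air at the OEL, which is exactly why a respirator alone may not protect a worker from a skin-penetrating vapor like phenol.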

It is the Holiday Season and I would like to take this opportunity to wish everyone reading this, along with their loved-ones, according to their preference, a Merry Christmas, Happy Hanukah, or whatever holiday you observe including the Return of the Sun during this special time of year. 



Sunday, December 14, 2014

What is ACCEPTABLE Risk?

In this week’s blog I go out on a limb, the topic is ACCEPTABLE RISK.

More than a few years ago, a then newly minted PhD and friend and colleague, Jack Hamilton, asked my opinion about rules one might use to set a level of exposure and risk as ACCEPTABLE.  Being older and a lot grayer than Jack, I got to render my opinions.  I decided to share these opinions with you in this week's blog to hopefully stimulate some discussion.  I do not pretend to have all, or maybe any, of the answers, but I am willing to share my opinions to shine a light into this somewhat dark area in the hope of drawing out additional discourse from you, my intelligent readers.

Finding and declaring ACCEPTABLE risk levels is almost never easy.  If it were easy, our lives would be duller and more people could do our jobs.  The question of risk acceptability is political and social.  Acceptable to whom?  When?  The answer(s) always has (have) been a mix of politically recognized and derived subjective VALUES.

I have heard some folks on the extreme political left declare that Risk Assessment is a "Tool of the Capitalist Devil."  I believe that, to the extent that anyone independently determines what is "acceptable" for OTHER stakeholders, this harshly worded judgment may have a ring of truth.

Perhaps my most important mentor as I was learning and developing in this field was Dr. Irv Rosenthal.  Irv was a wonderfully intelligent and wise person and a gifted teacher.  Irv reminded me that we often ask permission to pass someone in a narrow hallway.  We say "excuse me" or "beg your pardon."  Why then, Irv asked, would we not seek their permission to render them at risk (however small) from exposure to compounds we introduce into their environment?  Why can't they participate in drawing the line relative to their own exposures and the putative consequences?

The situation with human health is such that we look for consensus among industrial colleagues, academics, regulators and judges as to what are historically "acceptable" levels of risk.  We get benchmarks like 1 in 1,000 lifetime risk for carcinogen exposure to workers and 1 in 1,000,000 for non-workers.  This has been evolving somewhat in the courts (e.g., the US Supreme Court benzene decision), but I do not think it has ever been put directly to the ultimate stakeholders, those being exposed.  These folks personally deal with exposure and risk, and my sense is that, properly educated and, much more important, properly empowered, they would come up with doable and workable limits.

I mentioned this possibility at an American Industrial Hygiene National Conference in a Forum on Risk Assessment in the 1990s, and the reaction was somewhat predictable.  Very few expressed a willingness to open the process up to these types of potential complications and "problems."  I admit that it will not be easy - I just think it will ultimately be necessary to make the process more politically viable, legitimate and inherently ethical.

So, to answer Jack's question succinctly, I advised him to use his skill to determine the quantitative level of exposure and risk as best he could.  I suggested that, if pressed, he leave the issue of "acceptability" to the first viable choice from the following rough hierarchy.

    1. A "Gold Standard" or criteria you were given by the client or that the client agreed to.   Note:  This is also known as “Rendering unto Caesar those things belonging to Caesar”.
    2. A criterion that is acceptable to some standard-setting or regulatory authority AND seems to make sense to you.
    3. An unwritten but generally accepted "rule of thumb" or common practice that you can refer to and that makes sense to you.
    4. Your "gut feel" on what it should be. 

In every case above, it would be beneficial to identify within the report which line of the above hierarchy was used and the reasoning for using it.  Doing so is particularly important for #4.  Indeed, my advice to Jack was to avoid using #4 unless you are asked specifically to do so by the client OR you have an overwhelming personal need to make your opinion known.  We should always keep in mind that the determination or declaration of ACCEPTABLE risk is a somewhat subjective risk management function and is not, strictly speaking, risk assessment.

Finally, I believe that there should be a new line at the top of this hierarchy that does not exist yet:

Acceptable risk defined in quantitative terms by consensual agreement among all the principal stakeholders.   My sense is that this is a worthy goal.