Value and Use of Mechanistic Data to the Regulatory Community
George M. Rusch, Ph.D.
Department of Toxicology
101 Columbia Road
Morristown, NJ 07962
To an outsider looking at the regulatory community, it would appear that there is resistance to using mechanistic data in risk assessment. However, if we examine the process carefully, it is not so much a reluctance to use mechanistic data as a need to follow an approach that is consistent and transparent - an approach that will provide recommendations that are protective of public health even if they are overly conservative. Consider two examples: the Linear Multistage Model for low-dose extrapolation in cancer risk assessment (Portier and Kaplan, 1989) and the development of chronic Reference Dose values (Dourson et al., 1996; Jarabek, 1994; Calabrese, 1985).
First, consider the Linear Multistage Model. Although it involves a sophisticated mathematical model, it is consistent in its approach and will always give a value that is protective of public health. Unfortunately, because it does not include mechanistic considerations, it sometimes gives numbers that are overly conservative. Members of the regulatory community recognize this when presented with contrary evidence. However, since this new information requires them to go against a standard model, the "burden of proof" clearly rests with the group submitting the new, mechanistic data. As our understanding of mechanisms, and our ability to develop physiologically based pharmacokinetic (PB/PK) profiles of how chemicals are handled by the body, has progressed, the regulatory community appears far more willing to accept and use this information. The use of this data benefits the public as well as industry, in that it avoids reliance on default assumptions, which in reality are nothing more than conservative educated guesses (Leung, 1991). When we treat a group of animals with large doses of a chemical and define an effect in that animal, we are still left with many unanswered questions as we try to extend this finding into a human risk assessment. Was our animal model a good surrogate for man? Which is the most appropriate model? Are we more like rats, mice or monkeys? Will what happened as a result of these high doses occur (with lower frequency and severity) at low doses, or is there a threshold? In the past, these uncertainties were accounted for by applying uncertainty factors or by using upper confidence bounds on maximum likelihood estimates.
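The model's low-dose behavior can be made concrete. The multistage model has the general form P(d) = 1 - exp(-(q0 + q1 x d + q2 x d^2 + ...)), so at low doses the extra risk over background is approximately q1 x d: the linear term dominates regardless of the higher-order coefficients. A minimal sketch, with purely illustrative coefficients:

```python
import math

def multistage_extra_risk(d, q):
    """Extra risk over background under the multistage model:
    P(d) = 1 - exp(-(q[0] + q[1]*d + q[2]*d**2 + ...)), so the
    extra risk is 1 - exp(-(q[1]*d + q[2]*d**2 + ...))."""
    poly = sum(qi * d**i for i, qi in enumerate(q) if i > 0)
    return 1.0 - math.exp(-poly)

# Purely illustrative coefficients, not fitted to any real data set
q = [0.01, 2.0e-3, 5.0e-4]

d = 1.0e-4                   # a low dose
exact = multistage_extra_risk(d, q)
approx_linear = q[1] * d     # the linear (q1) term alone
print(exact, approx_linear)  # nearly identical at low dose
```

This is why the model is "linear at low dose": whatever the higher-order terms, they vanish faster than q1 x d as the dose approaches zero.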
Today, these questions can be answered far more precisely through an understanding of the mechanism of action and, where that mechanism is appropriately characterized, such data are increasingly being accepted by the regulatory community. Probably one of the best examples of an understanding of the mechanism of action having a major impact on carcinogenic risk assessment arose from studies on the chronic toxicity/carcinogenicity of unleaded gasoline. In the initial study, male and female rats and mice were exposed to vapors of unleaded gasoline for periods of up to two years. The male rats developed renal tumors; no increases in renal tumors were seen in female rats or in mice of either sex. The dilemma is "Are we more like male rats, or like female rats and mice?" Excellent research conducted at the Chemical Industry Institute of Toxicology identified that the male rat liver produces a low molecular weight protein, alpha-2u-globulin, which is secreted into the blood, filtered out by the kidney and broken down into amino acids there. However, when male rats were exposed to unleaded gasoline or similar chemicals, these chemicals complexed with the alpha-2u-globulin, inhibiting its breakdown and leading to a buildup of the complex in the kidney. This led to renal toxicity, renal necrosis and ultimately tumors. Further research demonstrated that this protein is not found in significant amounts in female rats or in mice. The next, and most important, question was whether humans produce this or a similar protein. Borghoff and Lagarde (1993) studied these questions and determined that, compared to male rat kidneys, human kidneys have a much lower concentration of low molecular weight proteins. Also, no proteins were found in the human kidney that were similar to alpha-2u-globulin in charge or chemical binding specificity.
These findings provided the solid scientific support for the US EPA to conclude that "the male rat kidney response to chemicals that induce alpha-2u-globulin accumulation is probably not relevant to humans for the purpose of risk assessment" (EPA, 1990). This conclusion was a major departure from the Linear Multistage Model approach; however, it was grounded in solid science and, as such, the reasons for the conclusion were clear. Given the widespread use of unleaded gasoline, the impact of this decision is tremendous. Had it been concluded that exposure to vapors of unleaded gasoline posed a significant cancer risk, control measures would have been required that would have been far more extensive, and far more expensive to the public, than those in use today. Alternatively, there could have been a movement to find an alternative fuel that ultimately might have had other, greater problems.
A second example of how an understanding of the mechanism of carcinogenic action may have a profound impact on a risk assessment involves the hepatotoxicity and carcinogenicity of chloroform. In lifetime oral gavage studies at the maximum tolerated dose, chloroform induced hepatocellular carcinomas in female B6C3F1 mice. However, when chloroform was given in the drinking water, no increase in liver cancer was seen, even though the mice received the same total dose of chloroform.
Subsequent research showed that hepatocellular necrosis and repair is a prerequisite for carcinogenicity, and that necrosis does not occur at dose levels below 1,800 ppm (Larson et al., 1994a; Larson et al., 1994b). In the gavage study, the large bolus dose overwhelmed the body's defense mechanisms, while in the drinking water study the dose was received in small increments over the entire day, giving the mouse's liver time to detoxify it. If one uses the Linearized Multistage Model, one calculates a one-in-a-million cancer risk at a chloroform drinking water level of 4 ppb. Using the threshold approach, EPA has recently indicated that it could recommend 300 ppb as a Maximum Contaminant Level Goal (MCLG) for chloroform in drinking water. This would represent a departure from the "zero" tolerance MCLG for carcinogens under the Safe Drinking Water Act (SDWA). Again, several years of research have gone into developing the information supporting this position; chloroform is probably one of the most intensely studied toxicological problems. The reason is that chloroform, like the other trihalomethanes, is formed by the reaction of the chlorine used for water purification with organic substances present in the drinking water. Chloroform levels are currently controlled at 100 ppb. Had chloroform been judged to have an impact on human health at that level, the cost to the public of developing alternate technologies for water purification could run into the billions of dollars. Unlike the alpha-2u-globulin issue, chloroform continues to be a controversial topic (Clark, 1998). Adoption of a 300 ppb MCLG would represent the first time that EPA has departed from the "zero" tolerance concept in the SDWA. While drinking water suppliers and others have argued that this reassessment is fully justified by the weight of the scientific evidence, some environmental groups have expressed strong opposition to this "new approach."
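The gap between the two approaches can be shown with simple arithmetic on the figures quoted above: a one-in-a-million risk at 4 ppb implies a linear slope of 2.5 x 10^-7 per ppb, so the linearized model assigns a nominal risk to any finite concentration, while the threshold interpretation assigns none below the point where cytotoxicity begins. A sketch (the threshold here is simply taken as the proposed 300 ppb MCLG):

```python
# Figures from the text: one-in-a-million lifetime risk at 4 ppb under the
# linearized multistage (LMS) extrapolation; 300 ppb is the proposed MCLG
# under the threshold (cytotoxicity and regeneration) interpretation.
lms_risk, lms_level_ppb = 1.0e-6, 4.0
slope_per_ppb = lms_risk / lms_level_ppb   # 2.5e-7 per ppb

def linear_risk(ppb):
    """Nominal lifetime cancer risk under low-dose linear extrapolation."""
    return slope_per_ppb * ppb

def threshold_risk(ppb, threshold_ppb=300.0):
    """Zero incremental risk below the threshold (taken here as the
    proposed MCLG); above it, some unspecified positive risk."""
    return 0.0 if ppb <= threshold_ppb else None

print(linear_risk(100))     # current 100 ppb control level -> ~2.5e-05
print(linear_risk(300))     # proposed MCLG under the linear model -> ~7.5e-05
print(threshold_risk(100))  # 0.0 under the threshold interpretation
```

The same 100 ppb of chloroform thus carries a nominal 2.5 x 10^-5 lifetime risk under one model and zero under the other, which is why the mechanistic evidence matters so much here.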
This has led to a debate within the scientific and environmental communities over the importance of using sound science in risk assessments. One of the key concerns is how industry can justify the expense of research in critical areas of toxicology if the results of these programs will not be used in the regulatory risk assessment process (Goodman, 1998).
Understanding the mechanism or mode of action is also important in developing guidelines for exposure to chemicals for short periods of time. There continues to be much discussion on how to extrapolate findings of toxicity from one exposure period to another (EPA, 1998). In some instances, the relationship can be expressed as the product of concentration and time (C x t = K); in others it may be a more complex relationship (e.g., C^n x t = K or C x t^n = K); and in some cases there is a threshold for a response. Of these three possibilities, prediction of a threshold response requires the most knowledge of the mechanism of action for the development of a meaningful risk assessment.
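The practical difference between these relationships can be sketched directly. Under Haber's rule (C x t = K), a four-fold shorter exposure permits a four-fold higher concentration; with C^n x t = K and n > 1, the permissible concentration rises more slowly as time shortens. The point of departure below (100 ppm for 60 minutes) is purely hypothetical:

```python
def equivalent_concentration(c_ref, t_ref, t, n=1.0):
    """Concentration at duration t that is equitoxic to c_ref at t_ref,
    assuming C**n * t = K (n = 1 reduces to Haber's rule, C * t = K)."""
    k = (c_ref ** n) * t_ref
    return (k / t) ** (1.0 / n)

c_ref, t_ref = 100.0, 60.0  # hypothetical point of departure: 100 ppm, 60 min

# Haber's rule: 4x shorter exposure -> 4x higher equitoxic concentration
print(equivalent_concentration(c_ref, t_ref, 15.0, n=1.0))  # 400.0

# C^2 x t = K: 4x shorter exposure -> only 2x higher concentration
print(equivalent_concentration(c_ref, t_ref, 15.0, n=2.0))  # 200.0
```

Which form applies, and the value of n, must come from the data for the specific chemical; a threshold response fits neither form and requires mechanistic understanding.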
Several examples come to mind. The most common are the irritant gases, such as ammonia and hydrogen chloride. Both have relatively low systemic toxicity, and risk assessments based solely on chronic, systemic toxicity could lead to recommended exposure levels that are highly irritating - especially if the data used were derived from exposures involving fairly long daily periods, such as 6 hours, while the exposure time in question was relatively short, e.g., 15 minutes.
Chemicals that are anesthetics frequently present the opposite problem: exposures at levels slightly below the acute lethal concentration can often be tolerated without any signs of systemic toxicity. Two examples are 1,1,2-trichloro-1,2,2-trifluoroethane (CFC 113) (Trochimowicz et al., 1988) and 1,1-dichloro-1-fluoroethane (HCFC 141b) (Brock et al., 1995). Both materials have 4-hour acute lethal concentrations of approximately 60,000 ppm, yet in chronic studies rats tolerated daily 6-hour exposures to levels up to 20,000 ppm. Thus, each week they received two and one-half times the 4-hour median lethal dose. Long-term survival in the high exposure level groups in both of these studies was at least as good as in the air-exposed controls. Any risk assessment based solely on the acute data would markedly overestimate the chronic toxicity, and a similar evaluation based on the chronic data would underestimate the acute toxicity. When risk assessments have been conducted on this class of compounds, this type of data makes it apparent that exposure level, not cumulative exposure, is the most important parameter.
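The "two and one-half times" figure follows from simple ppm-hour arithmetic on the values quoted above, and illustrates why cumulative dose is the wrong metric for these anesthetics:

```python
# Values from the text for CFC 113 and HCFC 141b:
acute_lc50_ppm, acute_hours = 60_000, 4   # 4-hour median lethal concentration
chronic_ppm, hours_per_day, days_per_week = 20_000, 6, 5  # tolerated chronically

acute_lethal_dose = acute_lc50_ppm * acute_hours                   # 240,000 ppm*h
weekly_chronic_dose = chronic_ppm * hours_per_day * days_per_week  # 600,000 ppm*h

# Rats survived 2.5 lethal "doses" worth of exposure per week: concentration,
# not cumulative exposure, is the controlling parameter for these compounds.
print(weekly_chronic_dose / acute_lethal_dose)  # 2.5
```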
A similar conclusion can be reached when evaluating the toxicity of hydrogen sulfide, though from respiratory rather than anesthetic effects. In this case, the body can convert low levels of hydrogen sulfide to sulfate, and as long as the body's ability to oxidize the sulfide is not overwhelmed, there will be no systemic toxicity. However, even short exposures above this level can be lethal. The relative insensitivity of hydrogen sulfide-induced lethality to length of exposure is shown in Table 1. These results, obtained from many studies (Prior et al., 1988; Zwart et al., 1990), show that the median lethal concentration changes only from 835 ppm to 335 ppm (2.5-fold) while the exposure time varies from 10 minutes to 6 hours (36-fold). Any risk assessment that does not recognize this phenomenon could easily fail to account for short-term, high-level (peak) exposures.
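The two endpoints quoted above are sufficient to estimate the exponent n in a C^n x t = K relationship: (835/335)^n = 360/10 gives n of roughly 3.9, a steep concentration dependence. A sketch of that calculation:

```python
import math

# Endpoints from Table 1: LC50 of 835 ppm at 10 min and 335 ppm at 360 min (6 h)
c1, t1 = 835.0, 10.0
c2, t2 = 335.0, 360.0

# Solve C**n * t = K using the two points: (c1/c2)**n = t2/t1
n = math.log(t2 / t1) / math.log(c1 / c2)
k = (c1 ** n) * t1

print(round(n, 1))  # ~3.9: lethality is far more sensitive to concentration
                    # than to duration, so peak exposures dominate the risk
```

With n near 4, averaging a brief peak over a full day grossly understates its hazard, which is exactly the failure mode the text warns against.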
An excellent example of the use of mechanistic data by a government group in conducting a risk assessment is the National Advisory Committee on Acute Exposure Guideline Levels for Hazardous Substances' evaluation of aniline. This Committee, whose membership includes individuals from both Federal and State agencies as well as scientists from academia, industry and special interest groups, reviews the toxicology data on a wide variety of chemicals and then recommends exposure guidance levels for acute exposure periods from 30 minutes to 8 hours. Acutely, the primary toxic effect of aniline is the formation of methemoglobin (Kim and Carlson, 1986). It is known that humans, especially the young, can be more sensitive than rats to methemoglobin formation; therefore, an uncertainty factor of thirty was applied to the rat data. The analysis then proceeded to compare findings in the rat with projected human effects. Exposure of rats to 100 ppm of aniline resulted in an elevation of methemoglobin to 22%. The literature indicated that in humans, methemoglobin levels of 15 to 20% result in clinical cyanosis but without hypoxic symptoms (Kiese, 1974). Thus 100 ppm for 8 hours was taken as the basis for deriving the acute exposure guidance level for minimal toxicity. In the same rat study, administration of 150 ppm for 8 hours resulted in a methemoglobin level of 41%. The literature indicated that, in humans, methemoglobin levels of 20 to 40% are associated with fatigue, lethargy, exertional dyspnea and headache. These symptoms were considered to represent a threshold for disabling effects, and this level was taken as the basis for deriving the acute exposure guidance levels associated with disability or serious toxicity. Finally, using the data from this rat study, it was projected that an exposure at 250 ppm for 8 hours would result in a methemoglobin level of 70 to 80%.
This was considered to be the threshold for lethality in humans, and this level was chosen as the basis for the acute guidance level for lethality. This represents an excellent example of how an understanding of the mechanism of toxic action for a chemical can be used to refine the risk assessment process. The methemoglobin levels associated with exposure to aniline had been determined in an animal model. The clinical signs expected in humans were evaluated from a totally independent data set that did not involve exposure to aniline. Put together, they formed a complete package for the risk assessment of acute exposures to aniline.
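The committee's logic can be summarized as a mapping from rat exposure to methemoglobin level to human severity band, with the thirty-fold uncertainty factor applied to each rat point of departure. The code below simply reproduces that arithmetic; the resulting numbers illustrate the method and are not the adopted guideline values:

```python
# Rat 8-hour aniline exposures -> methemoglobin levels, from the study cited
rat_methb_pct = {100: 22, 150: 41, 250: 75}  # ppm -> % MetHb (250 ppm: projected 70-80%)

# Human severity associated with those MetHb ranges (Kiese, 1974, as summarized)
severity = {
    100: "minimal toxicity (clinical cyanosis, no hypoxic symptoms)",
    150: "disabling effects (fatigue, lethargy, exertional dyspnea, headache)",
    250: "threshold for lethality",
}

UF = 30  # uncertainty factor: humans, especially the young, more sensitive than rats

for ppm, endpoint in severity.items():
    basis = ppm / UF  # illustrative guidance basis, not an adopted guideline value
    print(f"{ppm} ppm (rat MetHb {rat_methb_pct[ppm]}%) -> {endpoint}: ~{basis:.1f} ppm")
```

The strength of the approach is that the animal dosimetry and the human clinical interpretation come from independent data sets joined only by the methemoglobin mechanism.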
In summary, regulators are making more frequent use of mechanistic data in the regulatory process. In doing so, there must be a clear understanding of how this data is being used. The less sophisticated approaches, using simple uncertainty factors (typically multiples of 10), had the advantage of consistency and clarity. Because mechanistic data often applies to a single chemical or a small family of chemicals, its applicability must be carefully documented. When used properly, it is a powerful tool that helps to increase the precision of our risk assessments. It can provide the basis for using or rejecting acute data in chronic exposure assessments, or the reverse. It can also help us understand the risk that may be associated with short-term peak exposures occurring during the day, as contrasted with a lower level of exposure averaged over the entire day. As the body of data grows, so will its acceptance by both the public and the regulatory community.
Table 1. Hydrogen Sulfide Median Lethal Concentrations for Inhalation Exposures in Rats

Exposure Duration     Median Lethal Concentration (ppm)
10 minutes            835
6 hours               335

Data from Prior et al. (1988) and Zwart et al. (1990).
Brock, W.J., Trochimowicz, H.J., Millischer, R-J., Farr, C., Kawano, T. and Rusch, G.M. (1995). Acute and Subchronic Toxicity of 1,1-dichloro-1-fluoroethane (HCFC-141b). Food and Chemical Toxicology, 33(6):483-490.
Borghoff, S.J. and Lagarde, W.H. (1993). Assessment of Binding of 2,4,4-Trimethyl-2-pentanol to Low Molecular Weight Proteins Isolated from Kidneys of Male Rats and Humans. Toxicology and Applied Pharmacology, 119:228-235.
Calabrese, E.J. (1985). Uncertainty Factors and Interindividual Variation. Regulatory Toxicology and Pharmacology, 5:190-196.
Clark, D.P. (1998). Chloroform Controversy Raises Major Science, Risk Policy Stakes. Risk Policy Report, Vol. 5 (6), June 19, 1998. Inside Washington Publishers, Washington, DC
Dourson, M.L., Felter, S.P. and Robinson, D. (1996). Evolution of Science-Based Uncertainty Factors in Noncancer Risk Assessment. Regulatory Toxicology and Pharmacology, 24:108-120.
EPA (1998). Summary of the US EPA Workshop on the Relationship Between Exposure Duration and Toxicity. US EPA Risk Assessment Forum, Washington, DC, September 30, 1998.
EPA (US Environmental Protection Agency) (1990). Alpha-2u-Globulin: Association with Chemically Induced Renal Toxicity and Neoplasia in the Male Rat. Draft report prepared for the Risk Assessment Forum, Office of Research and Development, Washington, DC.
Goodman, J.I. (1998). Open Letter to Members of the Society of Toxicology. Society of Toxicology, Reston, VA, August 4, 1998.
Jarabek, A.M. (1994). Inhalation RfC Methodology: Dosimetric Adjustments and Dose-Response Estimation of Noncancer Toxicity in the Upper Respiratory Tract. Inhalation Toxicology, 6 (Suppl.):301-325.
Kiese, M. (1974). Methemoglobinemia: A Comprehensive Treatise. CRC Press, Cleveland, Ohio.
Kim, Y.C. and Carlson, G.P. (1986). The Effect of an Unusual Workshift on Chemical Toxicity. II. Studies on the Exposure of Rats to Aniline. Fundamental and Applied Toxicology, 7:144-152.
Larson, J.L., Sprankle, C.S. and Butterworth, B.E. (1994a). Lack of Chloroform-Induced DNA Repair In Vitro and In Vivo in Hepatocytes of Female B6C3F1 Mice. Environmental and Molecular Mutagenesis, 23:132-136.
Larson, J.L., Wolf, D.C. and Butterworth, B.E. (1994b). Induced Cytotoxicity and Cell Proliferation in the Hepatocarcinogenicity of Chloroform in Female B6C3F1 Mice: Comparison of Administration by Gavage in Corn Oil vs. Ad Libitum in Drinking Water. Fundamental and Applied Toxicology, 22:90-102.
Leung, H.W. (1991). Development and Utilization of Physiologically Based Pharmacokinetic Models for Toxicological Applications. Journal of Toxicology and Environmental Health, 32:247-267.
Portier, C.J. and Kaplan, N.L. (1989). Variability of Safe Dose Estimates When Using Complicated Models of the Carcinogenic Process. Fundamental and Applied Toxicology, 13:533-544.
Prior, M.G., Sharma, A.K., Yong, S. et al. (1988). Concentration-Time Interactions in Hydrogen Sulfide Toxicity in Rats. Canadian Journal of Veterinary Research, 52:375-379.
Trochimowicz, H.J., Rusch, G.M., Chiu, T. and Wood, C.K. (1988). Chronic Inhalation Toxicity/Carcinogenicity Study in Rats Exposed to Fluorocarbon 113 (FC-113). Fundamental and Applied Toxicology, 11:68-75.
Zwart, A., Arts, J.H.E., Klokman-Houweling, J.M. et al. (1990). Determination of Concentration-Time-Mortality Relationships to Replace LC50 Values. Inhalation Toxicology, 2:105-117.