Exposure science and the U.S. EPA National Center for Computational Toxicology

Abstract

The emerging field of computational toxicology applies mathematical and computer models and molecular biological and chemical approaches to explore both qualitative and quantitative relationships between sources of environmental pollutant exposure and adverse health outcomes. The integration of modern computing with molecular biology and chemistry will allow scientists to better prioritize data, inform decision makers on chemical risk assessments and understand a chemical's progression from the environment to the target tissue within an organism and, ultimately, to the key steps that trigger an adverse health effect. In this paper, several of the major research activities sponsored by the Environmental Protection Agency's National Center for Computational Toxicology are highlighted, and potential links between research in computational toxicology and human exposure science are identified. As with traditional approaches to toxicity testing and hazard assessment, exposure science is required to inform the design and interpretation of high-throughput assays. In addition, common themes inherent throughout National Center for Computational Toxicology research activities are highlighted for emphasis as exposure science advances into the 21st century.

Introduction

Computational toxicology is a new and high-priority research area at the US Environmental Protection Agency (US EPA, 2003; Kavlock et al., 2007). Defined as the application of mathematical and computer models to predict adverse effects and to better understand the mechanism(s) through which a given chemical induces harm, computational toxicology provides approaches to explore both qualitative and quantitative relationships between sources of environmental pollutant exposure and adverse health outcomes. This integration of modern computing with molecular biology and chemistry will allow scientists to better prioritize data, inform decision makers on chemical risk assessments and understand a chemical's progression from the environment to the target tissue within an organism and, ultimately, to the key steps that trigger an adverse health effect.

In February 2005, the US EPA established the National Center for Computational Toxicology (NCCT) to conduct and sponsor research in this area. The overall goal of the computational toxicology research program of EPA's Office of Research and Development (ORD) is to use emerging technologies to improve quantitative risk assessment and reduce uncertainties in the source-to-adverse-outcome continuum by ultimately providing a systems-level understanding of biological processes and their perturbation. The importance and relevance of this mission and of NCCT-initiated research received strong support with the recent release of the National Academy of Sciences report calling for a transformative shift in toxicity testing and risk assessment (NRC, 2007). The report, Toxicity Testing in the 21st Century: A Vision and a Strategy, calls for a collaborative effort across the toxicology community to rely less on animal studies and more on in vitro tests using human cells and cellular components to identify chemicals with toxic effects. A framework for implementing this long-range vision is provided by the recently formalized collaboration between two NIH institutes (NIEHS and NHGRI) and the EPA to use high-speed, automated screening methods to efficiently test compounds for potential toxicity (Collins et al., 2008).

These high-visibility efforts in toxicity testing and computational toxicology raise important research questions and opportunities for exposure scientists. The National Academies report authors (NRC, 2007) emphasize that population-based data and human exposure information are required at each step of their vision for toxicity testing, and that these data will continue to play a critical role in guiding both the development and the use of toxicity information.

The NCCT Computational Toxicology program has identified the need to include exposure information for chemical prioritization, for modeling system response to chemical exposures across multiple levels of biological organization and for linking information on potential toxicity of environmental contaminants to real-world health outcomes (e.g., complex disease). As a starting point, several common themes have emerged among the NCCT research projects. These themes are of particular interest to exposure scientists as we consider how best to incorporate the tools of computational toxicology into exposure research, as well as how best to contribute to research in computational toxicology. The research conducted in the NCCT is designed to address the need for: (1) characterization of the target system across levels of biological organization; (2) improved linkages across the source-to-outcome continuum; and (3) a shift from the linear source-to-dose paradigm to a systems-based approach. In addition, the complexity of the systems under study and the multidimensional nature of data produced using emerging technologies require extensive collaboration and advanced environmental informatics capabilities. In this paper, with these common themes in mind, potential links between research conducted in the US EPA's National Center for Computational Toxicology and human exposure science are discussed; the need for exposure science to address chemical screening, prioritization and toxicity testing in the 21st century is identified; and priority research areas for exposure scientists are proposed.

NCCT research activities

ToxCast™: Prioritizing the Toxicity Testing of Environmental Chemicals

Globally, there is a need to characterize the potential risks to human health and the environment that arise from the manufacture and use of tens of thousands of chemicals. In 2007, US EPA's NCCT launched ToxCast™ to develop a cost-effective in vitro approach for prioritizing the toxicity testing of large numbers of chemicals in a short period of time (Dix et al., 2007). Using data from state-of-the-art high-throughput screening (HTS) bioassays developed in the pharmaceutical industry, ToxCast™ is building computational models to forecast the potential human toxicity of chemicals. The premise underlying ToxCast™ is that toxicological response is driven by interactions between chemicals and biomolecular targets. For most environmental chemicals, the protein targets and biological effects underlying potential adverse effects have yet to be identified or characterized. The strategy of ToxCast™ is to focus on a diverse range of assays and data types to identify potential targets. To address this goal, the ToxCast™ program applies a multiple-target matrix approach. The matrix contains an expanded number of potential targets whose chemical interactions may be characterized by in silico models, biochemical assays, cell-based in vitro assays (based on both human and animal tissues) and nonmammalian models. The resulting data span levels of biological organization: molecular, cellular, tissue and whole organism. The overall pattern across many assays and data types will be used to develop a fingerprint, or bioactivity profile, that can be used as a predictor of toxicity. These hazard predictions will provide EPA regulatory programs with science-based information helpful in prioritizing chemicals for more detailed toxicological evaluations and will lead to more efficient use of animal testing. The resulting data will also provide insights into modes of action of chemical toxicity in an unprecedented and unbiased manner. This, in turn, has implications for identifying potentially susceptible populations, both from a life-stage viewpoint and from a genetic (polymorphic) standpoint, as toxicity pathways intersect with disease pathways.
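
To make the fingerprint idea concrete, the sketch below assembles a hypothetical chemical-by-assay bioactivity matrix and fits an off-the-shelf classifier to predict a binary in vivo outcome. The data are simulated stand-ins, not ToxCast™ results, and the modeling choices are illustrative assumptions rather than the program's actual signature methodology.

```python
# Minimal sketch: assemble a chemical-by-assay bioactivity matrix and fit a
# classifier that predicts an in vivo toxicity outcome from the in vitro
# "fingerprint". All values are simulated; nothing here is ToxCast data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_chemicals, n_assays = 320, 400          # roughly the scale of ToxCast phase I
# Rows: chemicals; columns: HTS end points (e.g., potency scores, 0 = inactive).
bioactivity = rng.gamma(shape=1.5, scale=1.0, size=(n_chemicals, n_assays))
# Binary in vivo outcome from legacy guideline studies (1 = adverse finding).
in_vivo_outcome = rng.integers(0, 2, size=n_chemicals)

# The fingerprint-to-outcome "signature" is simply a supervised model here,
# evaluated by its cross-validated ability to predict the in vivo label.
model = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(model, bioactivity, in_vivo_outcome, cv=5,
                         scoring="roc_auc")
print(f"cross-validated ROC AUC: {scores.mean():.2f}")
```

With real assay data and curated in vivo outcomes, the same cross-validation step is what distinguishes a predictive signature from noise.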

The ToxCast™ program is being implemented using a tiered, multiphase approach (see www.epa.gov/ncct/toxcast). In phase I, over 300 well-characterized chemicals have been profiled in over 400 HTS end points. These end points include biochemical assays of protein function, cell-based transcriptional reporter assays, multicell interaction assays, transcriptomics on primary cell cultures and developmental assays in zebrafish embryos. Almost all of the phase I compounds have been tested in traditional toxicology tests, including developmental toxicity, multigeneration studies and subchronic and chronic rodent bioassays. Phase I ToxCast™ signatures will be defined and evaluated by their ability to predict outcomes from existing mammalian toxicity testing and to identify toxicity pathways that are relevant to human health effects.

ToxCast™ phase II, scheduled to launch in FY09, will bring the total number of chemicals screened to nearly 1000. These additional compounds will represent broader chemical structure and use classes, and will include some pharmaceutical agents with known adverse side effects, to evaluate the predictive bioactivity signatures developed in phase I. As a result of the memorandum of understanding (MOU) with the National Toxicology Program/NIEHS and the NIH Chemical Genomics Center (NCGC)/NHGRI, additional chemical screening capability is being made accessible, and it is now projected that more than 5000 chemicals will enter the NCGC's high-throughput screening program within the next year. It is anticipated that successful conclusion of ToxCast™ phases I and II will provide EPA regulatory programs with a tool for rapidly and efficiently screening compounds and prioritizing further toxicity testing.

As computational analyses of ToxCast™ phase I data begin, NCCT has identified the need to consider exposure potential both for selecting phase II chemicals and for providing real-world relevance for the interpretation of toxicity screening. As ToxCast™ and related research activities provide information on the key events required to incorporate mode-of-action information along the continuum, similar key exposure metrics at comparable resolution will need to be identified (Figure 1). This would build upon the great strides made in the last 20 years in physiologically based pharmacokinetic (PBPK) modeling and expand on that success as emerging technologies blur the boundaries between exposure and effects sciences. The ultimate goal would be an integrated program in which biomarkers of exposure and bioindicators of effects are jointly determined and can be used to enhance biologically based dose–response models by providing measured parameters linking relevant exposures to the probability of an adverse outcome (NRC, 2007).

Figure 1. Cascade of exposure–response processes for integrating exposure science and toxicogenomic mode-of-action information.

Distributed Structure-Searchable Toxicity Database Network: Informatics for Environmental Health Risk Assessment

Specific activities at the NCCT include research to define chemical properties that can be used as indicators of potential toxicity in prioritizing toxicity testing, as well as to construct computational models of chemical interactions with biological systems for human health risk assessment. This research requires the creation of flexible databases covering a broad range of chemical space, so that the wide range of multidimensional data spanning levels of biological organization across the source-to-outcome continuum can be accessed, combined and interpreted using novel approaches.

The Distributed Structure-Searchable Toxicity (DSSTox) Database Network is creating a chemical data foundation for improved structure–activity and predictive toxicology capabilities, with broad linkages to chemical data resources across and outside of EPA (Richard et al., 2006). The DSSTox website (US EPA, 2008) publishes downloadable chemical structure files associated with toxicity data in a variety of formats, along with documentation, links to source information, quality review procedures and guidance for users. Standardized chemical structure annotation of a diverse array of toxicology-related data and resources, coupled with the online DSSTox Structure Browser, provides structure-searchability and direct access to these data (including EPA's Integrated Risk Information System, the fathead minnow acute toxicity database and High-Production Volume Chemical lists, the National Toxicology Program's (NTP) bioassay database, as well as estrogen-receptor binding data, rodent carcinogenicity data and, most recently, gene expression data). The DSSTox project is also providing primary structure annotation and cheminformatics support to both the NTP HTS and EPA ToxCast™ programs, in conjunction with NCCT's Aggregated Computational Toxicology Resource (ACToR) project, slated for public release in late 2008 (Richard et al., 2008). The latter provides a relational database platform for surveying vast Internet data resources pertaining to environmental toxicology (hundreds of thousands of chemicals), including high- and medium-production chemical lists and exposure-related data (Judson et al., in press). ACToR will also serve as the primary storage and analysis resource for the ToxCast™ HTS data, linking these data to standardized historical toxicological test results and broader chemical resources.
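
As a hedged illustration of what structure-searchability means in practice, the sketch below scans a structure-annotated SDF file for a substructure of interest using the open-source RDKit toolkit. The file name and property tags are assumptions made for this example and do not reflect the actual DSSTox schema or the Structure Browser implementation.

```python
# Illustrative substructure search over a structure-annotated SDF file.
# "dsstox_subset.sdf" and the "CASRN" property tag are hypothetical.
from rdkit import Chem

query = Chem.MolFromSmarts("c1ccccc1Cl")        # chlorinated aromatic ring

supplier = Chem.SDMolSupplier("dsstox_subset.sdf")  # hypothetical local file
for mol in supplier:
    if mol is None:                              # skip records that fail to parse
        continue
    if mol.HasSubstructMatch(query):
        name = mol.GetProp("_Name")
        casrn = mol.GetProp("CASRN") if mol.HasProp("CASRN") else "n/a"
        print(f"{name} ({casrn}) matches the query substructure")
```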

Similarly, it is imperative that exposure data be accessible and linked to the rapidly growing base of toxicity data. Development of consolidated data and knowledge bases for exposure is a high priority. Existing tools and platforms currently being implemented with environmental toxicity information should be considered to provide the most useful links to existing toxicity and environmental health data. The relevance and value of exposure information for toxicology and risk assessment will increase dramatically if links to these data are immediately apparent to an investigator searching the universe of toxicity and health data for a given compound. Chemical structure annotation of exposure-related data, such as could be provided by DSSTox, and incorporation of such data into the new ACToR resource will greatly enhance linkages between these exposure data and toxicity-related human health end points. In a preliminary demonstration, a DSSTox file of 60 chemical structures was created to index chemical-related content within the EPA Children's Total Exposure to Persistent Pesticides and Other Persistent Organic Pollutants (CTEPP) database (US EPA, 2006). Ideally, text tables within the CTEPP PDF documents would be tagged and indexed in web-accessible files such that chemical structure searches on the Internet could locate relevant exposure content. These sorts of linkages have the potential to bring the toxicology and exposure science research communities into closer alignment and foster more productive interaction.
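
A minimal sketch of the kind of identifier-keyed linkage described here, assuming hypothetical CSV extracts of exposure and toxicity summaries: a shared chemical identifier such as a CASRN (or an InChIKey derived from the structure annotation) serves as the join key.

```python
# Sketch of structure-keyed linkage between an exposure data set (CTEPP-like)
# and a toxicity resource (ACToR-like). File and column names are hypothetical;
# the point is that a shared chemical identifier is the join key.
import pandas as pd

exposure = pd.read_csv("ctepp_residue_summary.csv")   # CASRN, medium, median_conc
toxicity = pd.read_csv("actor_assay_summary.csv")     # CASRN, assay, hit_call

# One inner join makes every exposure record structure-linkable to toxicity data.
linked = exposure.merge(toxicity, on="CASRN", how="inner")
print(linked.head())
```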

v-Liver™: Characterizing Toxicity Pathways and Extrapolating Dose–Response

ToxCast™ has generated an unprecedented amount of rodent data for discovering in vitro biomarkers of adverse outcomes in vivo, which will be vital for prioritizing chemicals for further testing. Rodent liver toxicity is currently the most frequent cause for the regulation of orally consumed environmental chemicals. The Virtual Liver Project (v-Liver™) will use ToxCast™ and other public and agency data to aid in extrapolating from in vitro assays to clinical outcomes across chemicals, doses, genders, life stages and populations (Kavlock et al., 2007). Virtual tissues offer a novel translational paradigm for predicting target organ toxicity by fusing molecular and cellular systems modeling for physiologically relevant simulation (Knudsen and Kavlock, 2008). The goal of v-Liver™ is to quantitatively simulate liver injury due to chronic chemical exposure by modeling the linkage of perturbed molecular pathways with adaptive or adverse processes leading to changes of cell state, and the integration of this response through a dynamic cellular network giving rise to macroscopic tissue alterations. Histopathology is currently the clinical gold standard for estimating adverse liver outcomes. In the long term, the Virtual Liver's ability to quantitatively predict tissue lesions from molecular and cellular network dynamics will help in accurately assessing human risks from exposure to environmental stressors.

The first phase of v-Liver™ is a proof of concept focused on a subset of ToxCast™ chemicals and apical toxicity end points. These initially include a subset of pathways leading from nuclear receptor activation to proliferative sublobular lesions in rodents through a combination of cellular mitogenic, mutagenic and regenerative proliferation processes. Currently, relevant quantitative molecular, cellular and tissue-level data are being gathered on chemicals with known toxicological profiles to cross-validate the in silico modeling approach. In addition, qualitative and quantitative information on normal and pathologic processes across levels of biological organization is being curated in a knowledge base for virtual tissue construction. Finally, the multiscale molecular, cellular and tissue responses are simulated via an agent-based modeling (ABM) approach, in which the liver tissue is conceptualized as an ecosystem of heterogeneous cells. The ABM approach attempts to faithfully model the microanatomic heterogeneity of the complex hepatic acinus as a network of parenchymal and nonparenchymal "agents" in a nutrient and xenobiotic gradient. Agents model the dynamic behavior of liver cells in response to their microenvironment by processing endogenous and xenobiotic inputs through molecular circuits. ABM approaches encapsulate molecular, cellular and tissue complexity, effectively enabling in silico simulation of functional liver unit(s) across species, chemicals, doses and times. The modularity of the approach also simplifies integration with physiologic models at the organism scale.
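
The following toy sketch illustrates the ABM idea: hepatocyte "agents" arrayed along an acinus respond to a xenobiotic gradient and accumulate injury above a threshold. Every rule and parameter here is an invented simplification for illustration; the actual v-Liver™ model is far richer.

```python
# Toy agent-based sketch of a hepatic acinus under chronic chemical dosing.
# All rules, thresholds, and rates are assumptions made up for this example.
import random

random.seed(0)

class Hepatocyte:
    def __init__(self, position):
        self.position = position   # 0.0 = periportal inlet, 1.0 = central vein
        self.damage = 0.0
        self.alive = True

    def step(self, inlet_dose):
        if not self.alive:
            return
        # Assumed rule: local concentration falls along the acinus as upstream
        # cells metabolize the compound.
        local = inlet_dose * (1.0 - 0.6 * self.position)
        self.damage += max(0.0, local - 1.0) * 0.05   # injury above a threshold
        if self.damage > 1.0 and random.random() < 0.2:
            self.alive = False                        # stochastic cell death

acinus = [Hepatocyte(i / 99) for i in range(100)]
for t in range(50):                                   # simulate 50 time steps
    for cell in acinus:
        cell.step(inlet_dose=2.0)

dead = sum(not c.alive for c in acinus)
print(f"{dead} of {len(acinus)} hepatocytes lost after chronic dosing")
```

The appeal of the agent formulation is exactly what the paragraph above describes: cell-level rules can be swapped or enriched (molecular circuits, nonparenchymal cell types) without restructuring the tissue-level simulation.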

The liver's response to environmental chemicals spans multiple levels of organization, from molecular interactions to alterations in tissue structure. Novel computational approaches are required to ensure that information on biological effects is developed at environmentally relevant exposures. Integrating exposure, organism-level ADME and virtual tissues is vital for assessing the risk of adverse outcomes in human populations. Significant work in exposure modeling has focused on the application of mass-balance approaches to model chemical fate and transport from source to individual to internal dose. This research should be continued, with specific emphasis on developing the inputs required to predict the response of biological systems to environmental perturbations, such as those required for v-Liver™. In addition, there continue to be significant challenges associated with modeling and predicting individual and population interaction with the environment. As with tissue-level biological systems, the embedded complexity (e.g., feedback, multiple scales, multiple stressors) of these higher-level systems requires consideration of novel approaches. Conceptualizing a population as an ecosystem of heterogeneous individuals, and the individual as an ecosystem of heterogeneous behaviors, will facilitate holistic modeling of human–environment interaction. Use cases for data-rich compounds should be developed to test the utility of this approach for identifying key exposure events that can support tissue-level predictions. In the long term, the source-to-outcome modeling paradigm may become more integrative by capitalizing on emerging multiscale systems modeling approaches such as those being applied in the v-Liver™ project.
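
A minimal sketch of the mass-balance source-to-dose chain mentioned above, assuming a one-compartment indoor-air model feeding a one-compartment body burden. All parameter values are illustrative assumptions, not measured inputs.

```python
# One-compartment indoor-air mass balance coupled to a one-compartment body
# burden: a crude source-to-dose chain. All parameters are illustrative.
import numpy as np

dt = 0.01                      # time step, h
t = np.arange(0, 24, dt)       # simulate one day

V_room = 30.0                  # room volume, m^3
ach = 0.5                      # air changes per hour
emission = 100.0               # source strength, ug/h
inhalation = 0.6               # breathing rate, m^3/h
k_elim = 0.1                   # first-order elimination from the body, 1/h

C_air = np.zeros_like(t)       # air concentration, ug/m^3
body = np.zeros_like(t)        # body burden, ug
for i in range(1, t.size):
    # Air: emission in, ventilation out.
    C_air[i] = C_air[i - 1] + (emission / V_room - ach * C_air[i - 1]) * dt
    # Body: inhaled intake in, first-order elimination out.
    intake = inhalation * C_air[i] * dt
    body[i] = body[i - 1] + intake - k_elim * body[i - 1] * dt

print(f"steady-state air conc ~ {C_air[-1]:.1f} ug/m^3, "
      f"body burden after 24 h ~ {body[-1]:.1f} ug")
```

The internal-dose output of a chain like this is the natural boundary condition for a tissue-level model such as v-Liver™.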

Mechanistic Indicators of Childhood Asthma Study: Understanding Environmental Factors of Complex Disease

Ultimately, the systems of primary interest to human-health risk assessors are those at the individual and population levels. Emerging tools in molecular biology provide the potential to develop cellular and molecular indicators of exposure that can be used to assess the vulnerability of humans to environmental stressors. The Mechanistic Indicators of Childhood Asthma (MICA) study has been designed to incorporate state-of-the-art technologies to examine the physiological and environmental factors that interact to increase the risk of asthmatic responses (http://www.epa.gov/dears/studies.htm). Collected markers of susceptibility, exposure and effects are being used to analyze and characterize combined risk factors that relate to asthma severity in a cohort of children. The MICA study provides an opportunity to advance a systems-based approach for evaluating complex relationships among environmental factors, physiological biomarkers and health outcomes. This study also provides a platform for applying and testing computational approaches to evaluate the multifactorial, multidimensional data that are becoming standard output of environmental health and ecogenetic studies (NRC, 2008), and to use these data for hypothesis development.

The MICA study is primarily a clinically based observational children's study. Multiple measures of health status, asthma severity, environmental exposure and gene expression have been collected in a case/control cohort of 200 children (aged 9–12 years). Environmental samples have been collected with a focus on three broad classes of particulate-associated chemicals: volatile organic compounds, metals and polycyclic aromatic hydrocarbons. In the NCCT component of the MICA study, advanced statistical and machine learning methods are being applied, in combination with mechanistic information, to evaluate the multiple types of biomarker data collected in MICA (similar approaches for combining data are presented by Reif et al., in press). Methods and tools are being applied to evaluate and visualize gene expression data in novel ways. These approaches are being used to characterize the relationship between rat and human response, to characterize the relationship between gene expression and biological response, and to evaluate the utility of gene expression data collected in a human cohort study for understanding relationships between exposure, susceptibility and early effects (Reif et al., in preparation; Heidenfelder et al., submitted).
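
The sketch below illustrates, on simulated stand-in data, the kind of integrated analysis described here: demographic, exposure and summarized expression features are fused in a single predictive model of case/control status. The column names, cohort values and choice of model are assumptions for illustration, not the study's actual variables or methods.

```python
# Hedged sketch of integrating heterogeneous biomarker data in one model.
# All data are simulated stand-ins for the MICA measurements.
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.preprocessing import OneHotEncoder, StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 200                                           # cohort of 200 children
df = pd.DataFrame({
    "age": rng.integers(9, 13, n),
    "sex": rng.choice(["F", "M"], n),
    "pah_exposure": rng.lognormal(0.0, 1.0, n),   # hypothetical exposure metric
    "metal_exposure": rng.lognormal(0.0, 1.0, n),
    "gene_module_score": rng.normal(0.0, 1.0, n), # summarized expression feature
})
y = rng.integers(0, 2, n)                         # case/control label

# Encode categorical and scale numeric features, then classify.
pre = ColumnTransformer([
    ("cat", OneHotEncoder(), ["sex"]),
    ("num", StandardScaler(), ["age", "pah_exposure", "metal_exposure",
                               "gene_module_score"]),
])
clf = Pipeline([("prep", pre), ("model", LogisticRegression(max_iter=1000))])
print(cross_val_score(clf, df, y, cv=5, scoring="roc_auc").mean())
```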

The MICA study provides a case example of how exposure information and computational toxicology have the potential to provide a mechanistic interpretation of biomonitoring data, whether these data are "classical" concentration measurements or "toxicogenomic" markers, in relation to exposure patterns, routes and pathways. As methods for assessing health risks resulting from exposures to individual environmental pollutants improve, environmental health scientists are turning their attention toward characterizing relationships between multiple environmental factors and complex disease. The MICA study is an example of this shift. Computational tools and approaches for efficiently characterizing the exposure potential of environmental compounds are required for screening and prioritization, as well as for environmental health studies. One possibility, sketched below, is to formulate an exposure classification index based on a limited set of metrics designed to efficiently cover exposure space. Application of environmental informatics approaches may help to identify the critical metrics for representing personal exposure over time, place, life stage and lifestyle or behavior. Such an approach could also inform exposure data collection at the personal, residential, community and ambient levels. The MICA study also serves as an example of the type of study outlined in Figure 1, in which biomarkers of exposure and bioindicators of effects are jointly determined and linked computationally to support modeling for risk assessment.
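
One hedged way such an exposure classification index might be formulated: a small set of metrics is scaled to a common range, weighted and binned into classes. The metrics, weights and cut points below are invented for illustration only.

```python
# Illustrative exposure classification index: a weighted sum of min-max scaled
# metrics binned into low/medium/high classes. All choices are assumptions.
import numpy as np

def exposure_index(freq_days, duration_h, proximity, conc_ugm3,
                   weights=(0.3, 0.2, 0.2, 0.3)):
    """Weighted sum of scaled metrics; returns a score in [0, 1]."""
    metrics = np.array([freq_days / 7.0,            # use days per week
                        duration_h / 24.0,          # hours per day
                        proximity,                  # already scaled to 0-1
                        min(conc_ugm3 / 100.0, 1.0)])  # capped concentration
    return float(np.dot(weights, metrics))

idx = exposure_index(freq_days=5, duration_h=2, proximity=0.8, conc_ugm3=30)
label = "high" if idx > 0.6 else "medium" if idx > 0.3 else "low"
print(f"index = {idx:.2f} -> {label} exposure class")
```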

Exposure science for computational toxicology

Clearly the new field of computational toxicology provides significant opportunities for exposure scientists. The challenge is to move forward and consider new approaches for measuring, modeling and assessing exposure to address 21st century research needs for environmental health risk assessment. Consideration of analogies in hazard assessment may help to inform our path forward.

The NRC vision (2007) of a shift toward characterizing toxicity pathways requires a commensurate shift toward characterizing exposure across all levels of biological organization (Figure 1). Interpretation of toxicogenomic hazard data requires contextual relevance. Pathways identified using HTS approaches, such as those being developed in the ToxCast™ program, are being anchored to apical end points using conventional toxicity data. Similarly, understanding the relevant perturbations leading to these toxicogenomic end points requires anchoring stressors to real-world human exposure (e.g., biomonitoring data and other conventional exposure metrics). As illustrated in the examples below, new approaches to risk assessment require exposure science to extend beyond traditional boundaries and predict exposures down to the molecular level. This requires consideration of the interactions between exposure and effect, and highlights the need for interdisciplinary teams to define these interactions.

Suter (1999) notes that, conventionally, risk assessment considered the process by which release of a contaminant results in exposure of a target or receptor. Induction of effects is assumed to occur after the exposure process, facilitating separate analysis of exposure and hazard. However, as risk assessment has moved to address risks resulting from exposures to multiple stressors, this assumption is no longer appropriate. Effects at one organizational level affect others, resulting in complex health outcomes. So, rather than considering the flow of contaminants along the source-to-outcome continuum, there is a need to characterize cascades of alternating processes and states in an overall network (Suter, 1999).

As toxicity testing relies more on evaluating the mode of action for compounds, systems approaches describing the molecular basis of disease (Loscalzo et al., 2007) are being considered for risk assessment purposes (Edwards and Preston, 2008). With this approach, molecular networks for disease can be generated (Schadt and Lum, 2006) and used to derive key event networks for use in mode of action determination. The resulting networks describe the overall connectivity of the system along with the perturbations of that system resulting in certain disease states. Mode of action can now be defined as the perturbations of this “normal” state by a specific stressor or mixture. Such a holistic systems approach demands exposure metrics and models to characterize key stressors at a level of resolution commensurate with that of the response or effects. An example of this type of approach has previously been demonstrated for ecological risk assessment (Ankley et al., submitted; Ekman et al., 2007, 2008). Development and application of toxicogenomic molecular indicators of exposure (e.g., Sen et al., 2007) and nanotechnology-based sensors (Weis, 2005) provides the potential to mechanistically link traditional exposure metrics and end points measured in HTS assays. Together, a focus on mode of action and characterization of stressors at all levels of biological organization enables the vision for toxicity testing in the 21st century set forth by the NRC (NRC, 2007) by providing a framework in which to interpret toxicity pathway perturbations.
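
As a hedged illustration of the key-event-network idea, the sketch below encodes a toy directed graph from a molecular initiating event to an apical outcome and asks which downstream key events a given perturbation implicates. The nodes and edges are placeholders for exposition, not a curated pathway.

```python
# Toy key-event network for mode-of-action reasoning. Nodes and edges are
# invented placeholders; a real network would be curated from pathway data.
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("receptor binding", "altered transcription"),
    ("altered transcription", "protein depletion"),
    ("protein depletion", "impaired cell function"),
    ("impaired cell function", "tissue lesion"),
    ("tissue lesion", "adverse outcome"),
])

# A stressor perturbs an upstream node; everything reachable downstream is a
# candidate key event to anchor with measured exposure and effect data.
perturbed = "receptor binding"
downstream = nx.descendants(G, perturbed)
print(f"perturbing '{perturbed}' implicates: {sorted(downstream)}")
```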

The NCCT research program and the NRC vision present tremendous challenges and opportunities for exposure science. In May 2008, US EPA established a Community of Practice in Exposure Science for Toxicity Testing, Screening and Prioritization (ExpoCop) to provide a forum for promoting the advancement of exposure science and to begin to address some of the challenges alluded to in this paper (US EPA, 2008). We look forward to broad participation from the exposure science community as we continue this important dialog.

References

  1. Ankley, et al. Endocrine disrupting chemicals in fish: developing exposure indicators and predictive models of effects based on mechanism of action. submitted.

  2. Collins F.S., Gray G.M., and Bucher J.R. Transforming environmental health protection. Science 2008: 319: 906–907.

  3. Dix D.J., Houck K.A., Martin M.T., Richard A.M., Setzer R.W., and Kavlock R.J. The ToxCast program for prioritizing toxicity testing of environmental chemicals. Toxicol Sci 2007: 95 (1): 5–12.

  4. Edwards S.W., and Preston R.J. Systems biology and mode of action based risk assessment. Toxicol Sci 2008 doi:10.1093/toxsci/kfn190.

  5. Ekman D.R., Teng Q., et al. NMR analysis of male fathead minnow urinary metabolites: a potential approach for studying impacts of chemical exposures. Aquat Toxicol 2007: 85 (2): 104–112.

  6. Ekman D.R., Teng Q., et al. Investigating compensation and recovery of fathead minnow (Pimephales promelas) exposed to 17alpha-ethynylestradiol with metabolite profiling. Environ Sci Technol 2008: 42 (11): 4188–4194.

  7. Heidenfelder B.L., Reif D.M., Cohen Hubal E.A., Hudgens E.E., Bramble L.A., Wagner J.G., Harkema J.R., Morishita M., Keeler G.J., Edwards S.W., and Gallagher J.E. Comparative microarray analysis and pulmonary morphometric changes in brown Norway rats exposed to ovalbumin and/or concentrated air particulates. Submitted.

  8. Judson R., Richard A., Dix D., Houck K., Elloumi F., Martin M., Cathey T., Transue T.R., Spencer R., and Wolf M. ACToR — aggregated computational toxicology resource. Toxicol Appl Pharmacol, Available online 18 July 2008.

  9. Kavlock R.J., Ankley G., Blancato J., Breen M., Conolly R., Dix D., Houck K., Hubal E., Judson R., Rabinowitz J., Richard A., Setzer R.W., Shah I., Villeneuve D., and Weber E. Computational toxicology: a state of the science mini review. Toxicol Sci 2007: 103 (1): 14–27.

  10. Knudsen T.B., and Kavlock R.J. Comparative Bioinformatics and Computational Toxicology, Abbot B., and Hansen D. (Eds.). 3rd edn. Taylor & Francis 2009.

  11. Loscalzo J., Kohane I., and Barabasi A.L. Human disease classification in the postgenomic era: a complex systems approach to human pathobiology. Mol Syst Biol 2007: 3: 124.

  12. National Research Council of the National Academies (NRC). Toxicity Testing in the 21st Century: A Vision and A Strategy. The National Academies Press, Washington, DC, 2007.

  13. National Research Council of the National Academies (NRC). The National Children's Study Research Plan: A Review. The National Academies Press, Washington, DC, 2008.

  14. Richard A., Yang C., and Judson R. Toxicity data informatics: supporting a new paradigm for toxicity prediction. Toxicol Mech Meth 2008: 18: 103–118.

  15. Richard A.M., Gold L.S., and Nicklaus M.C. Chemical structure indexing of toxicity data on the Internet: Moving toward a flat world. Curr Opin Drug Discov Devel 2006: 9 (3): 314–325.

  16. Reif D.M., et al. Integrating demographic, clinical, and environmental exposure information to identify genomic biomarkers associated with subtypes of childhood asthma. 2008 Joint Annual Conference ISEE/ISEA. Pasadena, CA.

  17. Reif D.M., Motsinger A.A., McKinney B.A., Edwards K.M., Chanock S.J., Rock M.T., Crowe Jr J.E., and Moore J.H. Integrated analysis of genetic and proteomic data identifies biomarkers associated with systemic adverse events following smallpox vaccination. Genes Immun, published online 16 October 2008. doi:10.1038/gene.2008.80.

  18. Schadt E.E., and Lum P.Y. Thematic review series: systems biology approaches to metabolic and cardiovascular disorders. Reverse engineering gene networks to identify key drivers of complex disease phenotypes. J Lipid Res 2006: 47: 2601–2613.

  19. Sen B., Mahadevan B., and DeMarini D.M. Transcriptional responses to complex mixtures — a review. Mutat Res 2007: 636: 144–177.

  20. Suter G.W. Developing conceptual models for complex ecological risk assessments. Hum Ecol Risk Assess 1999: 5 (2): 375–396.

  21. U.S. EPA. A Framework for a Computational Toxicology Research Program. Office of Research and Development, Washington, DC, 2003 EPA 600/R-03/065 http://www.epa.gov/comptox/publications/comptoxframework06_02_04.pdf.

  22. U.S. EPA. A Pilot Study of Children's Total Exposure to Persistent Pesticides and Other Persistent Organic Pollutants (CTEPP). Volume I: Final Report. Contract Number 68-D-99-011, U.S. Environmental Protection Agency, Office of Research and Development, Research Triangle Park, NC, 2006 Available online at http://www.epa.gov/heasd/ctepp/ctepp_report.pdf.

  23. U.S. EPA. EPA Community of Practice: Exposure Science for Toxicity Testing, Screening, and Prioritization 2008: http://www.epa.gov/ncct/practice_community/exposure_science.html. Accessed September 16, 2008.

  24. Weis B.K., Balshaw D., Barr J.R., Brown D., Ellisman M., Lioy P., et al. Personalized exposure assessment: promising approaches for human environmental health research. Environ Health Perspect 2005: 113 (7): 840–848.

Author information

Correspondence to Elaine A Cohen Hubal.

Disclaimer

The US Environmental Protection Agency, through its Office of Research and Development, funded and managed the research described here. It has been subjected to the Agency's administrative review and approved for publication.

About this article

Cohen Hubal, E., Richard, A., Shah, I. et al. Exposure science and the U.S. EPA National Center for Computational Toxicology. J Expo Sci Environ Epidemiol 20, 231–236 (2010). https://doi.org/10.1038/jes.2008.70

Keywords

  • exposure modeling
  • toxicology
  • bioinformatics
  • toxicogenomics
