Animal-Free Chemical Safety Assessment

The exponential growth of the Internet of Things and the global popularity and remarkable decline in cost of the mobile phone are driving the digital transformation of medical practice. The rapidly maturing digital, non-medical world of mobile (wireless) devices, cloud computing and social networking is coalescing with the emerging digital medical world of omics data, biosensors and advanced imaging, offering the increasingly realistic prospect of personalized medicine. Described as a potential "seismic" shift from the current "healthcare" model to a "wellness" paradigm that is predictive, preventative, personalized and participatory, this change is based on the development of increasingly sophisticated biosensors which can track and measure key biochemical variables in people. Additional key drivers in this shift are metabolomic and proteomic signatures, which are increasingly being reported as pre-symptomatic, diagnostic and prognostic of toxicity and disease. These advances also have profound implications for the toxicological evaluation and safety assessment of pharmaceuticals and environmental chemicals. An approach based primarily on human in vivo and high-throughput in vitro human cell-line data is a distinct possibility. This would transform current chemical safety assessment practice from a human "data poor" to a human "data rich" environment. It could also lead to a seismic shift from the current animal-based to an animal-free chemical safety assessment paradigm.


INTRODUCTION
In a paper titled "The feasibility of replacing animal testing for assessing consumer safety: a suggested future direction," Fentem et al. (2004) discussed how the new "omics" technologies (genomics, transcriptomics, proteomics and metabonomics) could be used in the future to replace animal-based data in human chemical safety assessment. At that time, a major impediment to progress was that most of these data were generated in clinical settings and were not abundant in the public domain. The authors recommended that making these data and information generally accessible, in an ethical and legal way, could lead to the translation of experimental non-animal data for use in safety assessment (Fentem et al., 2004).
Much has changed in the research landscape since then. The expansion of the internet, allowing greater connectivity of devices and sensors, increased computational speed, cloud computing and multi-disciplinary collaborations are the main developments that characterize these changes. For example, in recent years the speed of supercomputers has increased by several orders of magnitude, boasting processing speeds of 10^15 floating-point operations per second, which will soon reach 10^18 floating-point operations per second (Witze, 2014). Without such computational power, the production of approximately 1.8 zettabytes (10^21 bytes) of genomic, epigenomic, transcriptomic, proteomic and metabolomic data each year, roughly doubling the world's information resource every two years, would not be possible (Dearry, 2013). Indeed, more than 50,000 omics papers are published each year (Cote et al., 2014). Movements such as Open Access in publishing (Bains, 2009), the Open Source Initiative in software development^1, Open Source engineered human tissue models (De Wever et al., 2015) and the Open PHACTS Foundation (Williams et al., 2012), together with the need for multi-disciplinary approaches to the access, integration and analysis of big data sets (Schumacher et al., 2014; Alyass et al., 2015), have led to a burgeoning of collaborative research^2. This in turn has led to the proliferation of publicly available databases that include omics data for human disease, as well as survey and clinical assay data on human exposure and health outcomes (Zhu et al., 2008; Sakurai et al., 2011; Kim et al., 2012; Kotera et al., 2012; Kamburov et al., 2013; Wachter and Beissbarth, 2015). This new environment is leading to significant paradigm shifts in medicine and toxicology. Indeed, medicine is being transformed into a data science (Topol, 2010; Hood et al., 2015; Topol et al., 2015).
These changes could lead to the transformation of human chemical safety assessment from a "human data poor" to a "human data rich" arena with the consequent elimination of animal-based toxicology studies that currently underpin chemical safety assessment. In this review the components that could bring about an animal-free chemical safety assessment paradigm are discussed.

SYSTEMS BIOLOGY
Contemporary methods for the diagnosis of human disease originated in the late 19th century and are based on simple observational correlations between clinical syndromes and pathological analysis (Loscalzo et al., 2007; Loscalzo and Barabasi, 2011). Over the same period, research followed the reductionist approach, which attempts to explain complex phenomena by defining the functional properties of the individual components that make up a system (Sobradillo et al., 2011). Consequently, the research focus progressed from the whole organism (anatomy) to the organs (physiology), cells (cell biology) and ultimately to subcellular molecular interactions (genes, proteins, lipids and metabolites; molecular biology) (Figure 1). This reductionist strategy is based on the assumption that many of the functions of the whole organism can be understood by knowing the properties of the component parts (Sobradillo et al., 2011). Both the approach to the diagnosis of disease and the reductionist strategy to research have made major contributions to our understanding of health and disease; however, both have significant inherent limitations. Current methods for diagnosing disease lack sensitivity for identifying preclinical disease (i.e., identifying precursor events that support early detection and treatment) and specificity in unequivocally defining disease (Loscalzo et al., 2007). The reductionist approach does not account for phenomena that emerge from the interactions of parts, and that appear as 'coordinated' functions of the individual components at higher levels of system organization (Sobradillo et al., 2011).
An alternative to the reductionist mind-set is the 'systems biology' perspective that integrates events at various levels of 'system' organization, and accounts for interactions of individual components and emerging properties that cannot be deduced from information on the single elements alone (Sobradillo et al., 2011).
Systems biology and systems medicine are characterized by the application of computational and mathematical modeling techniques that aim to unravel and understand the complexity of normal and diseased biological systems (Galas and Hood, 2009;Loscalzo and Barabasi, 2011;Chen and Snyder, 2012;Jack et al., 2013). They are biology-based, inter-disciplinary studies that deploy engineering approaches to discover emergent properties of cells, tissues and organs functioning as a system from the interactions between genetic, metabolic and cell signaling responses (Galas and Hood, 2009;Loscalzo and Barabasi, 2011;Chen and Snyder, 2012;Jack et al., 2013).
In systems biology, the implications of altered molecular and cellular components that result from exposure to chemical and non-chemical stressors are studied and integrated across multiple levels of biological organization: from genes to gene expression products, to alterations in biochemical pathways and networks, and the propagation of effects from cells to tissues to organs and the whole body (Andersen et al., 2005; Zhang et al., 2010). Disease arises as a consequence of disease-perturbed networks in the affected organ, which propagate from one or a few networks to many as the disease progresses. These initial perturbations may be due to genetic changes (e.g., mutations) and/or exposure to stressors in the environment (e.g., infectious organisms or chemicals). The perturbations dynamically alter the information expressed in the networks, and these altered dynamics of information flow explain the pathophysiology of the disease and suggest new approaches to diagnosis and therapy (Hood et al., 2012). By treating disease as a consequence of genetic and/or environmental perturbations of biological networks, the systems approach also considers social and environmental influences that may impact health. The cross-talk of all networks is integrated in order to understand their functioning in the context of the individual (Hood and Friend, 2011; Smarr, 2012; Hood et al., 2012, 2013, 2015). Importantly, there is a growing body of evidence that these perturbations conform to biological patterns or 'signatures' that are associated with specific diseases (Nicholson and Holmes, 2006; Holmes and Nicholson, 2007; Holmes et al., 2008; Bouhifd et al., 2013).

THE INTERNET OF THINGS, THE MOBILE PHONE AND PERSONALIZED MEDICINE
"Medicine is undergoing a revolution that will transform the practice of healthcare in virtually every way" (Hood  , 2013). The systems approach to disease is beginning to change healthcare by deploying technologies that permit the rapid sequencing of an individual human genome and the quantification of "units of biological information" such as single genes, single molecules, single cells and single organs to provide disease relevant information on health or disease for the individual. This is resulting in an explosion of patient data that is transforming "traditional biology and medicine" into an information science (Hood and Friend, 2011;Hood et al., 2012Hood et al., , 2013Hood et al., , 2015Smarr, 2012). By harnessing the capabilities of computational analysis of "big data" the digital revolution is transforming healthcare just as it has already transformed communications, finance, retail and information technology (Hood and Friend, 2011;Hood et al., 2012Hood et al., , 2013Hood et al., , 2015. The digital revolution is making the management and analysis of extremely large biological and environmental datasets tractable and it is driving the invention of personal monitoring devices that can digitize biological information, thus enabling, the individual assessment of wellness and disease commonly described as personalized medicine (Hood and Friend, 2011;Hood et al., 2012Hood et al., , 2013Hood et al., , 2015Smarr, 2012).
Personalized Medicine, Stratified Medicine, Precision Medicine^3,4 and P4 Medicine are interchangeable terms for systems medicine approaches to individualized healthcare (Topol, 2010; Smarr, 2012; Hood et al., 2012, 2013; Collins and Varmus, 2015; Topol et al., 2015). Personalized Medicine is a medical model that separates patients into different groups, with medical decisions, practices, interventions and products being tailored to the individual patient based on their predicted response or risk of disease. It is emerging from the convergence of systems medicine (the healthcare-focused derivative of systems biology) and the digital revolution (Hood et al., 2013). Its proponents ascribe the digital transformation of medical practice to the "coalescence of the rapidly maturing digital, non-medical world of mobile (wireless) devices, cloud computing and social networking with the emerging digital medical world of genomics, biosensors and advancing imaging" (Topol, 2012). Described as the "greatest convergence in our history," this revolution has become possible because of the exponential growth of the Internet of Things (IoT) and the global popularity and remarkable decline in cost of the mobile phone^5 (Topol, 2010, 2012; Mak, 2015).
The IoT has been defined as a "global infrastructure for the information society, enabling advanced services by interconnecting (physical and virtual) things based on existing and evolving interoperable information and communication technologies"^6. Fundamentally, the IoT comprises sensors, increasingly embedded into smartphones and wearable devices, interconnected via the internet. The ability to pack 19 million transistors into integrated circuits occupying a 16 nm space explains how there are more than 2 billion transistors in some current smartphones (Topol et al., 2015). Current devices are already able to "digitize the biology of a human being," using wearable sensors to quantify physiological metrics such as vital signs or relevant features of an individual's environment, provide high-definition images of the anatomy, and elucidate an individual's biology by sequencing their DNA, RNA, microbiome and epigenome (Topol, 2014). In the next 5 years, individuals with hypertension and diabetes could have their blood pressure and glucose levels continuously monitored, most routine laboratory tests may be obtainable with smartphone kits, and time-series measurements of key biochemical variables should be feasible (Smarr, 2012; Topol et al., 2015). Billions of data points from each individual will be uploaded to a virtual cloud, where sophisticated algorithms will decipher 'signal' from the noise generated by the complexities of health and disease (Hood and Friend, 2011; Smarr, 2012; Hood et al., 2012, 2013, 2015). Unsurprisingly, this would constitute a seismic shift from the current "healthcare" model to a "wellness" paradigm that is predictive, preventative, personalized and participatory (Hood and Friend, 2011; Smarr, 2012; Hood et al., 2012, 2013, 2015).
Central to personalized medicine is biomarker tracking: the monitoring of time-series measurements of key biochemical variables in an individual. Soon it may be possible to use integrated microfluidic chip technology to rapidly measure a panel of plasma proteins from a finger-prick volume of whole blood. This could provide inexpensive, point-of-care, informative clinical diagnosis (Fan et al., 2008; Hood and Friend, 2011; Smarr, 2012; Hood et al., 2013, 2015). This technology could lead to the identification of organ-specific blood protein "fingerprints" that distinguish normal functioning from disease-perturbed biological networks (Hood et al., 2012, 2013). Such fingerprints or "signatures" are not confined to proteins. In fact, the field of metabolomics has tremendous potential for the identification of pre-symptomatic, diagnostic and prognostic metabolic signatures of disease, toxicity and exposure to environmental pollutants (Nicholson and Holmes, 2006; Holmes and Nicholson, 2007; Holmes et al., 2008; Bouhifd et al., 2013). The wellness paradigm may be characterized by the longitudinal monitoring of integrative personal omics profiles (iPOP), which combine genomics, transcriptomics, proteomics, metabolomics and autoantibody profiles (Stanberry et al., 2013). In this approach, changes in metabolite expression levels reflect differential expression of biological pathways and are associated with disease (Stanberry et al., 2013; Guo et al., 2015).
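Such longitudinal biomarker tracking lends itself to straightforward computational treatment. The following is a minimal sketch, in Python, of how a departure from an individual's personal baseline might be flagged in a time series of a biochemical variable; the rolling window, z-score threshold and synthetic glucose values are illustrative assumptions, not parameters from the cited literature.

```python
# Minimal sketch: flagging departures from a personal baseline in a
# longitudinal biomarker series. Window length and threshold are
# illustrative assumptions only.
import numpy as np

def flag_anomalies(values, window=30, z_threshold=3.0):
    """Return indices where a measurement deviates from the trailing
    personal baseline by more than z_threshold standard deviations."""
    values = np.asarray(values, dtype=float)
    flags = []
    for i in range(window, len(values)):
        baseline = values[i - window:i]
        mu, sigma = baseline.mean(), baseline.std(ddof=1)
        if sigma > 0 and abs(values[i] - mu) / sigma > z_threshold:
            flags.append(i)
    return flags

# Synthetic example: a stable fasting-glucose series with one excursion.
rng = np.random.default_rng(0)
glucose = rng.normal(5.0, 0.2, size=120)   # mmol/L, simulated data
glucose[100] = 7.5                         # injected departure from baseline
print(flag_anomalies(glucose))             # expected to flag index 100
```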

METABOLOMICS
Genetics alone cannot fully explain differences in disease predisposition (Nicholson, 2006). Only about 5-10% of total human genetic variance occurs across populations and ethnic groups, although disease distributions and drug toxicity may vary greatly. Broadly speaking, genomics does not account for differences in phenotype. Although a gene may be expressed and a protein may be synthesized, the protein may not be in the proper form to induce a metabolic change and therefore a phenotypic effect. The epigenome, which consists of non-sequence-based modifications such as DNA methylation, is heritable and may affect normal phenotypes and predisposition to disease (Feinberg, 2010; Feinberg et al., 2010). Indeed, epigenetic changes have been shown to have a strong relationship with cancer and other common diseases.
Critical illness is characterized by the loss of metabolic homeostasis (Serkova et al., 2011; Mastrangelo et al., 2014). Monitoring the fluctuations of endogenous, low-molecular-weight molecules in blood (plasma and serum) and urine is an important way to detect various human pathologies such as cancer, cardiovascular disease, diabetes, and drug and chemical toxicity (Serkova et al., 2011). Metabolomics, or metabolic profiling, is the quantitative description of all low-molecular-weight (<1 kDa) components in a biological sample. These include metabolites solely under endogenous control as well as those originating from exogenous sources (the microbiome, diet, drugs and environmental pollutants).
Genes and environment combine to shape the metabolome, as do factors such as gender, age, diet, exposure to xenobiotics and products of the gut microbiota (Mastrangelo et al., 2014). In addition, metabolomics, which quantifies metabolites to track the developing response to a stimulus, has the advantage of being high-throughput and currently provides the best approach to delineating and understanding the biological mechanism preceding an effect (Kosmides et al., 2013; Mastrangelo et al., 2014).
The advantage of metabolomics is that it allows the evaluation of changes at a higher level of organization, that is, closer to the phenotype, and should therefore provide a more reliable indication of the state of health of the individual (Figure 2). This is possible because endogenous small molecules are at the top of the systems biology continuum and reflect and magnify (several thousand-fold) perturbations that occur at the genomic, transcriptomic and proteomic levels (Raamsdonk et al., 2001; Wishart, 2012). Indeed, metabolomics data are needed to construct powerful top-down systems biology tools that link the omics disciplines (Coen et al., 2008). In this respect, the ability to link the information provided by the different omics data and build a pathway of toxicity (PoT), linking an external stressor-induced perturbation to a disease endpoint, is analogous to the development of chemically agnostic adverse outcome pathways (AOPs) proposed for use in chemical safety assessment (Ankley et al., 2010; Bouhifd et al., 2013; Burden et al., 2015; Athersuch, 2016; Edwards et al., 2016). It could provide novel information on phenotypic characteristics and therefore the potential to investigate the output of complex, interconnected networks (Kosmides et al., 2013).
Metabolomics is very sensitive, currently capable of detecting femtomolar to attomolar (10^-15 to 10^-18 M) changes in metabolite concentrations (Veenstra, 2012). Small dietary changes, increased physical activity, elevated stress or even seasonal variations can significantly alter metabolic profiles (Monte et al., 2014). Another advantage of metabolomics is that the experimental and analytical variation in commonly used methods of metabolite measurement is several orders of magnitude smaller than biological variation, which confers robustness on metabolomic signatures (Keun et al., 2002; Maher et al., 2007).

TARGETED AND NON-TARGETED METABOLOMICS
There are two main strategies used in metabolomic studies: targeted and non-targeted (Fiehn, 2001; Mastrangelo et al., 2014; Guo et al., 2015). Targeted approaches are generally used for the identification of potential direct or surrogate biomarkers of health, disease and mechanistic pathways, while non-targeted approaches are used for the detection of broad classes of biochemicals to provide a comprehensive functional phenotype integrating clinical phenotypes with genetic and non-genetic factors (Guo et al., 2015). Non-targeted studies require the application of bioinformatics and computational tools to analyze and interpret large and complex data sets.
There are three specific applications of metabolomics: non-targeted metabolic fingerprinting, non-targeted metabolic profiling and targeted metabolic profiling (Kraly et al., 2009). Non-targeted metabolic fingerprinting seeks to measure a global profile of metabolites, with identification of specific profiles based on pattern recognition. A major weakness of metabolic fingerprinting is the inability to identify specific biomarkers for a disease state or therapeutic endpoint. Metabolic profiling is the measurement of the full complement of low-molecular-weight metabolites and their intermediates, such as amino acids, carbohydrates and lipids, that reflects the dynamic response to genetic modification and physiological, pathophysiological and/or developmental stimuli (Clarke and Haselden, 2008). In targeted metabolic profiling, one or two analytes are tracked over time; because of this, the approach is often excluded from discussions of metabolomics. It is, however, a very useful tool for understanding biological systems (Kraly et al., 2009).
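As a generic illustration of the pattern-recognition step in non-targeted fingerprinting (and not a method taken from the studies cited above), the sketch below applies principal component analysis to a synthetic samples-by-features intensity matrix; a real workflow would first perform peak alignment, normalization and scaling.

```python
# Minimal sketch of non-targeted metabolic fingerprinting: unsupervised
# pattern recognition (PCA) on a samples x metabolite-features matrix.
# All data are synthetic placeholders.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
n_features = 200
controls = rng.normal(0.0, 1.0, size=(20, n_features))
treated = rng.normal(0.0, 1.0, size=(20, n_features))
treated[:, :10] += 2.0  # hypothetical 10-metabolite perturbation

X = np.vstack([controls, treated])
scores = PCA(n_components=2).fit_transform(X)

# Separation of the two groups along PC1 is the 'fingerprint' pattern;
# individual metabolites are not identified at this stage.
print(scores[:20, 0].mean(), scores[20:, 0].mean())
```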

METABOLOMICS AND THE MICROBIOME
Metabolic profiling can also include the contribution from gut microorganisms, the microbiome. The microbiome comprises more than 100 trillion microorganisms belonging to 300-500 different species that live inside and on every human being (Guarner and Malagelada, 2003). The microorganisms in a healthy human adult are estimated to outnumber human cells by a ratio of ten to one^7 and to make up 1-3% of body mass^8 (0.75-2.25 kg in a 75 kg person), and they represent a confounding factor when interpreting genomic, proteomic or metabolomic response data (Nicholson et al., 2004; Nicholson, 2006; Kinross et al., 2008; Nicholson and Lindon, 2008). An individual's microbiome is unique, may share as little as 1% of the same types of bacteria with other people (Kinross et al., 2008) and may change with age, diet, drugs, disease and medical or surgical intervention (Kinross et al., 2008). The gut microbiome interacts with the other systems in the body and has metabolic, trophic and protective functions (Guarner and Malagelada, 2003; Kinross et al., 2009). It influences the levels of cytochrome P450 enzymes (Nicholson et al., 2004) and has a significant role in obesity (Kinross et al., 2008; Li et al., 2008; Calvani et al., 2010), sepsis (MacFie et al., 1999; Shimizu et al., 2006), inflammatory bowel disease, irritable bowel syndrome and colon cancer (Guarner and Malagelada, 2003). The gut microbiome contributes to inter-individual variability in drug toxicities and may contribute to the bioactivation of carcinogens that would not have been metabolized by human cells (Nicholson et al., 2005). Metabolites arising from the gut microbiome merge with endogenous chemicals, thereby altering the metabolome without having influenced gene and protein expression.

METABOLITE SIGNATURES
In toxicology, extensive efforts are underway to identify signatures of toxicity: patterns of metabolite changes predictive of the manifestation of toxicity and disease, more commonly known as metabolite signatures (Bouhifd et al., 2013, 2014, 2015). Similarly, in clinical applications, the prediction of xenobiotic toxicity or drug effects in an individual based on a mathematical model of pre-intervention metabolite signatures is known as pharmacometabolomics (Clayton et al., 2006). The identification of signatures associated with toxicity, drug effects and disease endpoints in a range of media, including serum, plasma, urine, mucosa, exhaled breath, saliva, hair, tissue and cultured cells, shows steady growth and could provide human data that may be used in chemical safety assessment (Bouhifd et al., 2013; Zhang et al., 2013a,b; Armitage and Barbas, 2014; Sulek et al., 2014). With the application of powerful bioinformatics and statistics, metabolite signatures can be used to identify a PoT which connects the molecular initiating event (MIE) of a toxicant with an adverse outcome (Ankley et al., 2010; Bouhifd et al., 2013; Vinken, 2013; Athersuch, 2016). The development of an underpinning mechanistic toxicology in the form of a perturbed PoT is a key concept for the implementation of the much vaunted Toxicity Testing for the 21st Century (NRC, 2007).
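To make the statistical step concrete, the sketch below shows one common way (among many) of deriving a candidate metabolite signature: a sparse, L1-penalized classifier separating exposed from control profiles, with cross-validated predictive performance. The data and the retained-feature "signature" are synthetic placeholders, not results from the cited studies.

```python
# Minimal sketch: a candidate metabolite signature of toxicity derived as
# the non-zero coefficients of an L1-penalized logistic regression.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
X = rng.normal(size=(60, 100))   # 60 samples x 100 metabolite features
y = np.repeat([0, 1], 30)        # 0 = control, 1 = exposed
X[y == 1, :5] += 1.5             # hypothetical 5-metabolite response

clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
print(cross_val_score(clf, X, y, cv=5).mean())  # predictive performance

clf.fit(X, y)
signature = np.flatnonzero(clf.coef_[0])        # retained features
print(signature)                                # candidate signature indices
```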
The following examples illustrate how the identification of in vivo metabolic signatures in pharmaceutical applications, environmental and occupational toxicology, and in vitro systems is a rapidly growing area that could provide actionable data for human chemical safety assessment.
In clinical applications, metabolite signatures have been identified in a range of biological fluids that can distinguish patients with various cancers (including colorectal, pancreatic, gastric, liver, breast, ovarian, kidney, bladder, prostate, oesophageal, lung and oral cancer) from healthy controls; are consistent with early indications of diabetes, liver dysfunction and disruption of gut microbiome homeostasis in healthy volunteers (Guo et al., 2015); can distinguish between race and genotype in response to the antihypertensive drug atenolol (Wikoff et al., 2013); and were able to discriminate hepatitis B virus (HBV)-infected subjects from healthy controls (Zhang et al., 2013a).
In environmental toxicology, metabolic profiling of urinary metabolites has been shown to detect early effects of environmental and lifestyle exposure to cadmium in a human population (Ellis et al., 2012); to distinguish controls and alcohol consumers, but not smokers, exposed to a complex mixture such as welding fumes (Kuo et al., 2012); to identify intermediate biomarkers of response to environmental/occupational concentrations of lead, cadmium and arsenic in smelter workers (Dudka et al., 2014); to indicate oxidative stress-related effects in humans exposed to environmental concentrations of polycyclic aromatic hydrocarbons (PAHs) (Wang et al., 2015); and to associate male infertility with arsenic exposure via a PoT involving oxidative stress and sexual hormone disruption (Shen et al., 2013).
Distinguishable metabolic signatures have been observed in in vitro cultured human fibroblast cells infected by herpes simplex virus type-1 (HSV-1) and human cytomegalovirus (HCMV) (Rabinowitz et al., 2011). In vitro metabolic signatures of hepatotoxicity in HepG2/C3a cells under microfluidic culture conditions, which appeared consistent with literature reports of in vivo hepatotoxicity, have been identified (Choucha Snouber et al., 2013), and different in vitro hepatic metabolic signatures and pathways for ammonia, dimethylsulfoxide and paracetamol toxicity have been identified in liver and kidney co-cultures (Shintu et al., 2012).

DOSE-DEPENDENT METABOLITE SIGNATURES
To ensure public safety and environmental quality, regulatory agencies are required by law to undertake science-based safety and risk assessments of potential hazards (Burgoon and Zacharewski, 2008). These agencies use dose-response modeling to identify a Reference Point (RP), also known as a point of departure (PoD), which is the point of transition on the dose-effect curve used to derive a health-based guidance value (Sand et al., 2006, 2012; Burgoon and Zacharewski, 2008). Therefore, the identification of dose-dependent changes in metabolite signatures would permit the use of such data in the current safety assessment paradigm (European Food Safety Authority, 2014). Encouraging examples of dose-dependent changes in metabolic biomarkers in both in vivo and in vitro studies are increasingly reported. For example, dose-dependent changes were observed in urinary metabolite biomarkers of male infertility in Han Chinese men following environmental exposure to arsenic, and in cadmium-induced renal toxicity in Chinese women (Gao et al., 2014). In addition, the metabolomic changes observed in the latter study were sufficiently distinct to allow the differentiation of cadmium-induced renal toxicity from chronic kidney disease (Gao et al., 2014). In male Sprague Dawley rats treated with 0.5 or 2 mg/kg HgCl2 [mercury(II) chloride], marked and maximal kidney tubule necrosis was observed by 48 h post exposure at the high dose, with modest injury at the low dose (Griffin and Bollard, 2004). In vitro, organ-specific, dose-dependent, predictive, compound-specific metabolite signatures for ammonia and paracetamol toxicity were observed in microfluidic liver and kidney co-cultures (Shintu et al., 2012).
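As an illustration of how an RP could be derived from such data, the sketch below fits a Hill dose-response model to synthetic metabolite-change data and inverts it at a 10% benchmark response. This is a simplified example under stated assumptions; a regulatory benchmark-dose analysis would also compute a lower confidence bound (BMDL) and compare candidate models.

```python
# Minimal sketch of Reference Point (PoD) derivation: fit a Hill model to
# synthetic dose-response data and invert it at a 10% benchmark response.
import numpy as np
from scipy.optimize import curve_fit

def hill(dose, top, ec50, n):
    return top * dose**n / (ec50**n + dose**n)

dose = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])          # arbitrary units
response = np.array([0.02, 0.05, 0.18, 0.45, 0.80, 0.95])  # synthetic data

(top, ec50, n), _ = curve_fit(hill, dose, response, p0=[1.0, 3.0, 1.0])

bmr = 0.10 * top                           # 10% of the fitted maximum
bmd = ec50 * (bmr / (top - bmr))**(1 / n)  # Hill model inverted at the BMR
print(f"BMD10 ~ {bmd:.2f} (same units as dose)")
```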
The use of metabolite signatures in the safety assessment process could be possible if signatures observed in vivo can be reproduced in appropriate in vitro systems in which dose-response relationships could be more easily observed and measured (European Food Safety Authority, 2014). There are encouraging developments in this area.

MICROFLUIDICS AND BIOCHIPS
Common laboratory practice is to use two-dimensional (2D) cell culture techniques, that is, to grow cells on a flat substrate such as a petri dish or microtiter plate (van Duinen et al., 2015). In three-dimensional (3D) cell culture, cells are permitted to grow and interact with their surroundings in all three dimensions; such cultures have been shown to improve on 2D cultures (van Duinen et al., 2015), for example through apical-basal polarization (Schoenenberger et al., 1994), lumen formation (Debnath et al., 2003), reduced proliferation and increased differentiation (Weaver et al., 1997) and numerous changes in RNA and protein expression (Lin and Bissell, 1993).
Yet existing 2D and 3D cell culture models do not fully recapitulate subtle organ-specific variations in the in vivo microenvironment (Huh et al., 2012). In situ, cells experience organ-specific dynamic variations in spatiotemporal chemical gradients and mechanical forces (e.g., cyclic strain, compression, fluid shear stresses) in their local tissue microenvironment that are crucial governors of their survival, growth and function. Thus, many fundamental aspects of cell behavior are mechanosensitive, including adhesion, spreading, migration, gene expression and cell-cell interactions (Jansen et al., 2015). Integrin-mediated mechanosensing feeds into cell fate decisions by activating various downstream signaling cascades connected to gene expression (Jansen et al., 2015).
Microfluidic techniques are based on micrometer-sized, hollow channels lined with living cells, arranged to recreate tissue- and organ-level physiology, which are continuously perfused with nutrient medium (Huh et al., 2010, 2012, 2013; Tseng et al., 2014; van Duinen et al., 2015). These technologies, also known as biochips and about the size of a computer memory stick, further increase the physiological relevance of 3D cell culture by enabling spatially controlled co-cultures (e.g., liver and kidney), perfusion flow and spatial control over signaling gradients (Smith et al., 2013; Tseng et al., 2014; van Duinen et al., 2015). The detection of a metabolomic signature in a co-cultured biochip should, in theory, be similar to an in vivo blood or urine metabolomic signature, as both are aggregate responses to a chemical stressor.
When coupled to metabolomics and intracellular gene and protein levels, biochips have the potential to provide a functional cell response. They can behave as "biosensor" systems when combined with metabolomic studies of organ culture media, which may be useful in a high-throughput small-molecule screening approach. Indeed, biochips are being developed for high-throughput assay development (Prot and Leclerc, 2012; Trietsch et al., 2013). In addition, advances in real-time quantification of changes in intracellular metabolic activities have the potential to vastly improve the prediction of current and future cellular phenotypes based on metabolomic signatures (Heinemann et al., 2014). A proof-of-principle microfluidic-based inline small-molecule extraction system, which allows continuous metabolomic analysis of living systems, has been developed. This technology could detect cyclic patterns and forecastable metabolic trajectories: metabolic oscillations and predictable transitions in both growth- and stress-related changes in E. coli and ovine whole blood could be observed and measured (Heinemann et al., 2014). The combination of recent advances in stem cell biology, such as induced pluripotent stem cells (Moreno et al., 2015) and organoid technology (Astashkina et al., 2012), with microfluidic 3D cell culture will lead to the implementation of personalized medicine and companion diagnostics in the next 5 years (van Duinen et al., 2015). However, specific metabolite signatures have been observed that are a cellular response to the culture mode and cellular environment in biochips (Choucha Snouber et al., 2013; Sturla et al., 2014). For example, a cytoprotective cell response was induced in HepG2/C3a cells by the microfluidic biochip conditions (Prot et al., 2011). There are also differences in response between biochips and conventional plate cultures, which may arise from differences in mechanosensing (Huh et al., 2012; Jansen et al., 2015). These culture-specific signatures must be distinguished from signatures that are consistent with those observed in vivo. Nonetheless, transcriptomic and proteomic signatures of acetaminophen toxicity observed in cultured HepG2/C3A cells in a microfluidic biochip study have been shown to be similar to those reported in vivo (Prot et al., 2011). Many more examples like that of acetaminophen, encompassing a representative chemical space, are required to provide the evidence base to replace animal-based toxicity testing.

VALIDATION OF BIOCHIP DATA
Validation is the independent assessment of the scientific basis, the reproducibility and the predictive capacity of a test. Currently, the validation of in vitro models is a significant challenge in drug candidate and toxicity screening. A high percentage of new chemical and biological entities still fail late-stage human drug testing, receive regulatory "black box" warnings, or are removed from the market for safety reasons after regulatory approval (van Duinen et al., 2015). There are a number of reasons for the perception that in vitro cell-based assays and subsequent preclinical in vivo studies do not yet provide sufficient pharmacological and toxicity data or reliable predictive capacity for understanding drug candidate and environmental chemical performance in vivo (Adler et al., 2011; Astashkina et al., 2012; Piersma et al., 2014). The discussion of the regulatory acceptance of in vitro data for safety assessment is beyond the scope of this review; however, the reader may find the reviews of Adler et al. (2011) and Piersma et al. (2014) useful.
A key problem for the novel technologies is the absence of a point of reference, i.e., a "traditional test" or "gold standard." In the absence of reference data, "scientific validation" needs to be stressed. This would involve a systematic review of the extent to which a given test reflects current scientific understanding; in the case of toxicity, this would require review of established modes of action (MOA), PoTs and AOPs. This is in contrast to traditional validation, which largely considers the test system as a black box and compares the results obtained therein with those of another black box, the animal model(s) (Pamies et al., 2014). However, because of their origin, human-derived 3D cell culture models are expected to be better predictors of clinical and toxic outcome than animal models (van Duinen et al., 2015). Clinical results for the success or failure of compounds with regard to toxicology should be used as the reference points for retrospective validation (van Duinen et al., 2015). Nonetheless, more human toxicity data and high-quality in vivo data are critical for assessing the true predictive power of in vitro data-based models of in vivo toxicity (Huang et al., 2016). Although historically these data have in many cases not been publicly available, particularly for drugs, there are increasingly numerous freely available databases that may provide such data, e.g., the Human Metabolome Database (Wishart et al., 2013) and the Consensus Path Database (ConsensusPathDB-human) (Kamburov et al., 2013). Therefore, an alternative validation strategy would be to compare biochemical changes between an in vitro model system and in vivo human interaction networks, such as gene, protein, signaling, metabolic and drug-target interactions, as well as gene regulation and biochemical pathway perturbations (Dumas, 2012; Kamburov et al., 2013; European Food Safety Authority, 2014).
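One simple way to operationalize such a comparison, shown here purely as an illustration with hypothetical metabolite names and values, is to quantify the agreement between in vitro and in vivo responses as the rank correlation of fold-changes over a shared metabolite panel; network-level comparisons against resources such as ConsensusPathDB-human would extend the same idea to pathways and interactions.

```python
# Minimal sketch: agreement between in vitro and in vivo responses as the
# Spearman correlation of log2 fold-changes over shared metabolites.
# All names and values are hypothetical placeholders.
from scipy.stats import spearmanr

shared_metabolites = ["lactate", "glutathione", "taurine", "citrate", "alanine"]
in_vitro_log2fc = [1.2, -0.8, 0.4, -0.1, 0.9]  # biochip vs. control
in_vivo_log2fc = [0.9, -1.1, 0.6, 0.2, 0.7]    # human data vs. control

rho, p = spearmanr(in_vitro_log2fc, in_vivo_log2fc)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```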

CHARACTERISATION OF HUMAN EXPOSURE
The estimation of human exposure is of fundamental importance in the evaluation of the relevance and interpretation of dose-response data for toxicity in the assessment of health risks (Thomas et al., 2013). Therefore, a PoD determined in an in vitro system must be extrapolated to an in vivo PoD, which in turn must be related to an administered dose or tissue dose arising from human exposure (Thomas et al., 2013; Wetmore et al., 2015).
Human exposure may be estimated from the measurement of parent chemical concentrations in blood or of urinary metabolite concentrations, from which exposure concentrations can be inferred using reverse dosimetry (Tan et al., 2006; Lyons et al., 2008; McNally et al., 2012, 2014). This approach could be appropriate in the case of blood parent chemical or urinary metabolite concentrations of known environmental pollutants measured as part of a metabolic profile. For chemicals without biological monitoring data, as would be the case with a PoD derived from biochips, high-throughput human exposure models are being developed which combine environmental fate and transport models with indoor or consumer exposure models (Arnot et al., 2010, 2012; Wambaugh et al., 2013, 2014, 2015; Wetmore et al., 2015). Comparison of the administered dose or tissue dose with human exposure predictions could provide a margin of exposure (MOE) approach that can help the shift from a hazard-based toward a more risk-based methodology (Thomas et al., 2013; Wetmore et al., 2015).
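In its simplest form, reverse dosimetry inverts a kinetic model. The sketch below illustrates the arithmetic under a deliberately crude steady-state, one-compartment assumption with hypothetical parameter values; practical applications use PBPK models and account for population variability, as discussed later in this review.

```python
# Minimal sketch of reverse dosimetry under a steady-state, one-compartment
# assumption. All parameter values are hypothetical placeholders.
urinary_conc = 12.0      # ug/L, measured urinary biomarker
urine_output = 1.6       # L/day, assumed daily urine volume
fraction_excreted = 0.4  # assumed fraction of dose excreted as this metabolite
body_weight = 70.0       # kg

# At steady state: intake * fraction_excreted = urinary_conc * urine_output
intake = urinary_conc * urine_output / fraction_excreted  # ug/day
print(f"Estimated exposure ~ {intake / body_weight:.2f} ug/kg/day")
```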

BIOINFORMATICS: PATHWAY AND NETWORK ANALYSIS
Omics data must distinguish changes and pathways associated with impending pathology from benign adaptive changes that are responsive to the chemical but are not associated with toxicity (Harrill and Rusyn, 2008). Increasingly powerful data analytics, required to distinguish biological signals from noise, are becoming available (Braun, 2014). Once identified, metabolomic signatures have relevance beyond clinical biomarkers, as they represent a powerful means of identifying mechanisms of toxicity and disease (Wikoff et al., 2013; Zhang et al., 2013a).
The rapid proliferation of metabolomics studies has led to difficulties in the identification of compounds and of their physiological roles and toxicity- or disease-specific pathways (Collins, 2004). The Human Metabolome Database (HMDB) is a resource designed to address these issues. It is an open-access database with up-to-date reference information about metabolites, metabolic pathways, biomarkers and reference NMR, MS/MS (tandem mass spectrometry) and GC-MS (gas chromatography-mass spectrometry) spectra for compound identification (Wishart, 2007; Wishart et al., 2009, 2013, 2016). At the time of writing, the HMDB contains 41,993 metabolite entries, more than 5,000 normal and abnormal metabolite concentrations, nearly 800 metabolic and disease-associated pathways, and dozens of cancer biomarkers (Wishart et al., 2016). However, currently only a fraction of the known human metabolome is linked to pathways and secondary processes such as gut microbiome-generated effects and lipid metabolism (Wikoff et al., 2009, 2013). Even with the growing number of knowledge-based metabolic pathway databases that can be used to reveal the higher-order systemic operation of cells, organs and whole organisms (Stobbe et al., 2014; Zhukova and Sherman, 2014), more comprehensive tools and databases specifically designed for network and pathway analysis using metabolomics data are required (Xia and Wishart, 2010a,b; Kankainen et al., 2011; Wikoff et al., 2013). For example, perturbed metabolic pathways have been identified by mapping transcriptomic, proteomic and metabolomic data signatures using freely available resources such as the KEGG database and Ingenuity canonical pathways (Posma et al., 2014). However, few network mapping programs consider that the typical mammal has metabolic contributions from symbiotic gut microbiota and even parasitic organisms (Gill et al., 2006; Nicholson et al., 2012). MetaboNetworks, a freely available tool for the identification of complex metabolic reaction networks, combines metabolic reactions from different organisms and allows the delineation and combination of reaction networks from selected and combined organisms (Posma et al., 2014).
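A core operation in such pathway analysis is the over-representation test. The sketch below shows the standard hypergeometric enrichment calculation with illustrative counts; it is a generic example, not the algorithm of any specific tool named above, and real analyses must define the background universe carefully.

```python
# Minimal sketch of metabolite-set over-representation analysis: does a
# pathway contain more significantly changed metabolites than expected by
# chance? All counts are illustrative placeholders.
from scipy.stats import hypergeom

M = 2000  # metabolites measured (background universe)
K = 40    # of these, members of the pathway under test
n = 100   # metabolites significantly changed in the study
k = 9     # changed metabolites that fall in the pathway

# P(X >= k): probability of at least this much overlap by chance
p_value = hypergeom.sf(k - 1, M, K, n)
print(f"enrichment p = {p_value:.4f}")
```

The expected overlap by chance here is n * K / M = 2 metabolites, so an observed overlap of 9 yields a small p-value and the pathway would be flagged as perturbed.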
Another promising web-based tool is ConsensusPathDB-human, where human in vivo signatures may be identified by interrogating 32 freely accessible databases via a single portal (Kamburov et al., 2013). ConsensusPathDB-human integrates interaction networks in humans, including binary and complex protein-protein, genetic, metabolic, signaling, gene regulatory and drug-target interactions, as well as biochemical pathways (Kamburov et al., 2013).
However, confidence in the quality and reliability of omics data must be improved. Specifically, significant improvements are necessary in the sensitivity, accuracy and reproducibility of these data (Leung et al., 2013). Biological variation and differences in "time of capture" of samples and inter-laboratory variation can lead to a lack of reproducibility (Leung et al., 2013).
There are efforts under way that combine high-quality omics and phenome data in the same database and that are already demonstrating an impressive level of sophistication and predictive capability. Access to the data is currently restricted, but these efforts demonstrate what is possible. For example, the Clinical Genome Resource, set up by the US National Human Genome Research Institute, is a database of disease-related variants containing information that could guide medical responses to those variants, as well as the evidence supporting the associations (Rehm et al., 2015). Genomics England, which runs the 100,000 Genomes Project, aims to bolster progress in this area by establishing 'clinical interpretation partnerships': doctors and researchers will collaborate to establish robust models of diseases that can potentially be mapped to specific genetic alterations (Eisenstein, 2015). This will be achieved by building a database of clinical data with matching rich phenotype data from patient records^9. Data will remain in a secure environment within which researchers will work (Siva, 2015)^10,11. The 100,000 Genomes Project provides a monthly update of the number of whole human genomes sequenced; as of 1 February 2016, 6,597 genomes had been sequenced^12.
The Health Nucleus, offered by Human Longevity Inc.^13, uses whole genome sequence analysis, advanced clinical imaging and machine learning, combined with comprehensive curation of personal health history, with the aim of delivering the most complete picture of individual health. Currently, the database contains 20,000 genomes with matching phenomes, with a view to expanding to over one million. The larger the database, the more effective the correlations, because the 6.5 billion bases in each individual's DNA differ from those of another individual by just 3%. The phenome data are generated using advanced techniques to measure unique body structures and metabolomic profiles, and machine learning techniques are used to uncover associations. The company claims a level of sophistication where genomes can be matched to photographs, and voice recordings matched to sex, age, height and face shape; in the "Face Project" they claim to have matched 100 photographs to 100 genomes^14,15.
As with all new developments a global initiative involving industry, regulatory agencies and academic institutions is required to standardize 'omics' methods and reach a consensus on the reliability and interpretation of endpoints (Leung et al., 2013).

PHYSIOLOGICALLY BASED PHARMACOKINETIC (PBPK) MODELING
A PBPK model is an independent, structural model comprising compartments that correspond directly and realistically to the organs and tissues of the body, connected by the cardiovascular system. PBPK models are mathematical descriptions of biological systems, in this case the human body, which are translated into computer code and solved computationally. They are frameworks that can capture our understanding of the science underlying the biological processes that lead to disease (McNally et al., 2011). The principal application of PBPK models is the prediction of the appropriate form of the target-tissue dose, or dose-metric, of the parent chemical or its reactive metabolite(s). The dose-metric must capture the critical biochemical steps that lead to the moiety causing the effect at the target site. Such mechanisms may take place within any compartment, e.g., blood, organ or sub-cellular compartment. Use of an appropriate dose-metric in chemical safety assessment calculations provides a better basis for relating the observed effects to the external or administered exposure concentration of the parent chemical (Conolly and Butterworth, 1995; Barton et al., 1998; IGHRC, 1999; Johanson et al., 1999; Andersen, 2003; Lipscomb and Poet, 2008).
Physiologically based pharmacokinetic models can be used for forward or reverse dosimetry. The former converts inhalation, dermal, oral or intravenous exposure to a chemical into a target-tissue dose; the latter reconstructs exposure or dose from parent chemical and/or metabolite(s) in urine, blood or in vitro surrogates of organ or tissue concentration (Tan et al., 2006; Lyons et al., 2008; Louisse et al., 2012; McNally et al., 2012, 2014; Bessems et al., 2014). Therefore, PBPK models can be used to translate an RP derived from concentration-response relationships measured in biochips into a plausible distribution of human in vivo concentrations. This can be achieved by linking a PBPK model with Bayesian inference, where replacing single point values for model parameters with informative prior distributions converts a deterministic model into a population-based model (McNally et al., 2012, 2014).
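As a concrete, deliberately reduced illustration of the forward problem, the sketch below solves a one-compartment oral-dosing model numerically; a full PBPK model would have many physiologically realistic compartments, but the structure (ordinary differential equations translated into code and solved computationally) is the same. All parameter values are hypothetical. Reverse dosimetry inverts this relationship, for example by Bayesian inference over prior distributions of the parameters, as described above.

```python
# Minimal sketch: a one-compartment kinetic model (a reduced stand-in for a
# multi-compartment PBPK model) solved for forward dosimetry.
# All parameter values are hypothetical.
import numpy as np
from scipy.integrate import solve_ivp

V = 42.0      # L, volume of distribution (assumed)
CL = 5.0      # L/h, clearance (assumed)
k_abs = 1.0   # 1/h, oral absorption rate constant (assumed)
dose = 100.0  # mg, oral dose

def model(t, y):
    gut, blood = y  # amounts in mg
    return [-k_abs * gut,                    # absorption from the gut
            k_abs * gut - (CL / V) * blood]  # uptake minus clearance

sol = solve_ivp(model, (0.0, 24.0), [dose, 0.0], dense_output=True)
t = np.linspace(0.0, 24.0, 5)
conc = sol.sol(t)[1] / V  # blood concentration, mg/L
print(dict(zip(t.round(1), conc.round(3))))
```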

REGULATORY ACCEPTANCE
In the USA, regulatory action must be defensible in court, where, in the absence of the preferred proof that something "is or is not true," the supporting arguments are based on precedent and expert opinion. Regulators will change their actions when expert scientific opinion supports the use of alternative models over animal models, and when regulatory action based on those alternative models can be defended in court; i.e., regulators take their cues from expert scientists who provide them with legally defensible actions, not the other way around.
In the absence of a way to measure the "accuracy" of a new test versus existing animal test results, the default is to prove that an alternative-to-animals test produces results that are "similar" or "comparable" to the previous animal studies. If the results are different, then the alternative system cannot replace the animal studies, i.e., current practice and historical precedent win.
The emergence of human data, such as chemical body burdens (i.e., full chemical and metabolite profiles) and biomarkers of effects for health status or steps along an AOP progression, should change the current paradigm. The burden of proof to "validate" a new test should not require comparison with animal data; rather, it should rest on which test provides the most accurate result to best protect public health. More accurate estimates of risk based on human in vivo data must be considered more relevant and less uncertain than estimates based on current practice, which are derived from a few animal test results, primarily in rodents, adjusted by uncertainty factors that are poorly supported scientifically.
In Europe, the regulatory objective is not to obtain the most accurate estimate of risk; rather, it is to drive the control of exposure to a level at which there is confidence of no significant risk. Regulatory acceptance is based more on the understanding, transparency and robustness of new approaches and adherence to the stipulations of the regulations. European regulatory authorities must be confident in new technologies in order to adopt them and would generally do so without reference to court proceedings.

THE NEAR FORESEEABLE FUTURE
If "foreseeable" refers to a range of time for which forecasts are possible and "forecasting" is to calculate or predict (some future event or condition) usually as a result of study and analysis of available pertinent data, then the next 5-10 years should see the transformation of occupational and environmental toxicology from a human data poor to a human data rich discipline. This transformation will come about through the coalescence of systems medicine and the digital revolution, the components of which, will in turn coalesce with the highthroughput in vitro systems-based toxicity testing paradigm proposed in the US National Research Council vision and strategy for future toxicity testing and safety assessment (NRC, 2007). The generation of human data that may be used in chemical safety assessment continues. For example, the development of a fully integrated wrist-band sensor for in situ analysis of sweat can provide real-time assessment of the physiological state of human subjects and may represent a platform for the development of a wide range of personalized diagnostic and physiological monitoring applications (Gao et al., 2016). Human sweat is a medium considered to be rich in physiological information (Sonner et al., 2015). It is reasonable to predict that the sophisticated sampling technology developed for such a device could be used to provide sweat samples for frequent, non-invasive metabolic profiling. Likewise, the development of relatively lowcost "electronic noses" for the non-invasive analysis of volatile organic compounds (VOCs) signatures in exhaled breath for the early detection of various cancers and other diseases must also bode well for the near future (Konvalina and Haick, 2013;Rattray et al., 2014;Krilaviciute et al., 2015;Gasparri et al., 2016).
Optimism for the development of an animal-free chemical safety assessment paradigm, based on the identification of pre-symptomatic, diagnostic and prognostic metabolic signatures of toxicity and disease using non-invasive or minimally invasive biosensors, appears to be justified.

AUTHOR CONTRIBUTIONS
This review is entirely the work of GL.

FUNDING
This publication and the work it describes were funded by the Health and Safety Executive (HSE). Its contents, including any opinions and/or conclusions expressed, are those of the author alone and do not necessarily reflect HSE policy.

ACKNOWLEDGMENTS
The author thanks Dr. Rob DeWoskin for his helpful comments in general and on regulatory acceptance in the USA in particular, Dr. Steve Fairhurst for the UK and European perspective on regulatory acceptance, and Dr. Julia Fentem and Dr. Carl Westmoreland of Unilever, UK, for their general comments.