DHEAS and Human Development: An Evolutionary Perspective

Adrenarche, the post-natal rise of DHEA and DHEAS, is unique to humans and the African apes. Recent findings have linked DHEA in humans to the development of the left dorsolateral prefrontal cortex (LDLPFC) between the ages of 4–8 years and the right temporoparietal junction (rTPJ) from 7 to 12 years of age. Given the association of the LDLPFC with the 5-to-8 transition and the rTPJ with mentalizing during middle childhood, DHEA may have played an important role in the evolution of the human brain. I argue that increasing protein in the diet over the course of human evolution not only increased levels of DHEAS, but linked meat consumption with brain development during the important 5-to-8 transition. Consumption of animal protein has been associated with increased IGF-1, which has been implicated in the development of the adrenal zona reticularis (ZR), the site of DHEAS production. In humans and chimpanzees, the zona reticularis emerges at 3–4 years, along with the onset of DHEA/S production. For chimpanzees this coincides with weaning and peak synaptogenesis. Among humans, weaning is completed around 2 ½ years, while synaptogenesis peaks around 5 years. Thus, in chimpanzees, early cortical maturation is tied to the mother; in humans it may be associated with post-weaning provisioning by others. I call for further research on adrenarche among the African apes as a critical comparison to humans. I also suggest research in subsistence populations to establish the role of nutrition and energetics in the timing of adrenarche and the onset of middle childhood.


INTRODUCTION
Adrenarche, the post-natal rise in androgen production by the adrenal gland, including both dehydroepiandrosterone (DHEA) and its sulfated form (DHEAS), has attracted increasing attention for its role in middle childhood (1). Once thought to be involved in the initiation of puberty (2), it is now clear that the rise in DHEAS is an independent event (3). Once thought unique to humans, it is now known to be shared with chimps (4), bonobos (5), and gorillas (6). Previously considered for its actions as a sex steroid, or a sex steroid precursor (7), DHEAS is now known to act through a variety of non-genomic mechanisms (8). Yet the functional significance of adrenarche remains little understood.
Together, DHEA and DHEAS are the most abundant steroid hormones in human circulation, with DHEAS the more abundant of the two (9). DHEA is generally considered the active form, but DHEAS can be converted to DHEA within cells (7). In addition, DHEAS is the form produced by the adrenal gland (10), making it a marker of adrenal function. Thus, in what follows I will use DHEA/S as a general term, but will differentiate between DHEA and DHEAS where their specific effects have been demonstrated.
Ironically, the wide-ranging physiological impact of DHEA/S makes it hard to focus on a single primary function and has impeded a more systematic understanding of this important hormone. In addition, the most common animal models, the rat and the mouse, do not exhibit adrenal production of DHEA/S. However, both mice and rats produce DHEAS within the brain (29), as do humans (30-32), hence the term neurosteroid for DHEA/S (33). Yet neural production of DHEA/S cannot explain the origins of adrenal production of DHEA/S and its circulation throughout the body.
More recently, the spiny mouse has been reported to develop an adrenal zona reticularis from postnatal days 8-20, a possible analog to human adrenarche, in addition to producing DHEAS in the brain throughout life (34). Interestingly, the species also exhibits high levels of fetal adrenal DHEA production and menstruates (35), suggesting that it may be a useful rodent model for studying the effects of DHEA/S in humans.
Thus, current results raise the question of what is evolutionarily novel and adaptive about the high and increasing levels of circulating DHEAS in humans at the onset of middle childhood around 6-7 years. I have suggested that the primary effect (among others) of increasing DHEAS is the maintenance of plasticity in the developing brain (36, 37), a point since elaborated by others (38, 39). Recent findings showing an impact of DHEA on cortical development and cortical-limbic connectivity in children (14-16, 40-43) provide strong support for this argument.
Here I extend my argument about the evolutionary origins of adrenarche in humans to include both the social and nutritional context of the infancy-to-early-childhood transition as well as the shift from early childhood to middle childhood. More specifically, I suggest that the higher titers of DHEA/S in humans relative to apes may reflect increasing levels of meat in the diet with the advent of the genus Homo (44). Consumption of meat by early hominids is generally agreed to have provided increased energy to support a larger brain, whether this included a reduction in gut size (45) or not (46), and regardless of when the role of cooking became important (47).
Importantly, animal protein intake has been related to increased IGF-1 levels in human adults (48, 49) and children (50, 51), as well as in mice (19). At the same time, IGF-1 has been implicated in the development of the zona reticularis (52), the layer of the adrenal gland responsible for DHEA production.
I hypothesize that increased meat consumption over the course of human evolution played a role in increasing production of DHEA/S as a part of the extended development of an energetically costly brain (53). Starting with the genus Homo, meat consumption would have led to increased IGF-1 and elevated DHEA/S levels beyond those in the African apes. After weaning, meat provided by males (54) would have provisioned children (55, 56) and supplied protein and energy for cortical synaptogenesis (53, 57). Increased DHEA/S would have acted as a co-factor in promoting cortical maturation, including in the right temporoparietal junction (rTPJ), leading to an increased capacity for mentalizing and perspective-taking before the onset of reproductive maturation, a useful trait for a species with pair-bonding and biparental care (58).
I forward this argument in three parts. In the first section, I lay out a series of steps potentially linking intake of animal protein to brain development. These start with the consumption of animal protein, which has been associated with higher IGF-1 levels in adults (48, 49) and children (59). IGF-1 has also been implicated in the development of the zona reticularis (52), suggesting that higher IGF-1 titers could promote a thicker zona reticularis and increased DHEAS production. IGF-1 itself plays a key role in brain development, both in utero and post-natally (60, 61), promoting neurogenesis, neurite growth, and synaptogenesis (62, 63). DHEA has been shown to increase energy available in neurons through mitochondrial respiration (64-66), potentially providing additional energy for the metabolic costs of early brain development (53).
In the second section I compare the timing of adrenarche in humans and the great apes. I present evidence showing that while the zona reticularis begins to mature around the age of 3-4 years in both humans (67) and chimpanzees (68), its relationship to other developmental landmarks varies between the two species. Most importantly, in chimpanzees the onset of weaning and peak synaptogenesis in the prefrontal cortex at 3-4 years (69) occur roughly in concert with the emergence of the ZR. In comparison, in humans weaning is completed around 2 ½ years (70, 71), while the peak of synaptogenesis occurs around 5 years (72). Thus, in chimpanzees, early cortical maturation is tied to weaning, while in humans it is more closely associated with DHEAS production.
In the third section, I integrate the timing of DHEA impacts on cortical thickness (14) with social and behavioral markers to present a picture of DHEAS's potential role in the evolution of human childhood stages. Most obviously, the 5-to-8 transition to middle childhood, referred to as the "age of reason and responsibility" by White (73), maps onto the effects of DHEA on the LDLPFC from 4 to 8 years. Lancy and Grove (74) point out that in many societies children are considered incapable of learning before this age, consistent with the importance of the LDLPFC for the development of executive function (75).
The impact of DHEA on the rTPJ during middle childhood and its consequences for mentalizing are harder to interpret. Hrdy (76) emphasizes the importance, over the course of human evolution, of child care by girls as practice for raising their own infants. Mentalizing is an important part of such skills (77, 78). Child care would presumably have been less important for males, but mentalizing may have been useful in learning the skills of cooperative hunting (79). In addition, mentalizing might be beneficial for developing an understanding of one's self in relationship to the opposite sex (1, 80) before puberty exaggerates sexually dimorphic physical and behavioral characteristics that can lead to misunderstandings and tension between the sexes.
Based on the first three sections I end with suggestions for future research directions in the comparative study of adrenarche and middle childhood among both the great apes and humans. Current evidence for adrenarche and its relationship to differences in hormonal (81), cranial (82), and brain (83-85) traits in these two related ape species (86) is quite limited. A simple direct comparison of adrenarcheal timing in the two species would add immensely to our understanding.
In addition, while the onset of middle childhood is thought to be consistent across human populations (74, 87), the evidence regarding adrenarche in subsistence populations (88-91) is scattered and incoherent. A better understanding of variation in adrenarche in the context of poor nutrition, high disease burdens, and traditional child care practices would help to ground evolutionary perspectives on adrenarche in a more realistic context. It would also lay the foundation for a better understanding of the role of adrenarcheal timing in cognitive development (37).

IGF-1 AND DHEA/S
IGF-1 (insulin-like growth factor 1), as the name suggests, is closely related to insulin. Like insulin, IGF-1 plays a role in glucose regulation (92), and both hormones impact mitochondrial function, protecting against oxidative damage while promoting oxidative capacity (93). However, IGF-1 is most directly associated with protein metabolism, including cell proliferation and differentiation (72) as well as protein synthesis (92). At an organismal level, variation in circulating IGF-1 levels has been related to differences in protein consumption across the life cycle, including during infancy (94), childhood (59), and adulthood (48, 49). In fact, early protein consumption may lead to programming of the IGF-1 axis that becomes apparent during adolescence (95).
IGF-1 is of special interest because it plays an important role in the development of the adrenal gland, including the zona reticularis (52, 96). The zona reticularis develops as primordial stem cells located at the surface of the adrenal cortex move from the zona glomerulosa through the zona fasciculata to their final residence in the zona reticularis (52, 96). IGF-1, given its ability to inhibit apoptosis (97), is thought to enhance the survival of the migrating cells. Higher levels of IGF-1 during the formative period of the adrenal gland, before the age of 6 years when the zona reticularis is established (67), may lead to increased numbers of progenitor cells reaching the ZR. More ZR cells would presumably lead to increased DHEA/S production. Thus, accelerated early somatic growth, linked to increased IGF-1 (98, 99), may be associated with greater DHEA/S as well.
DHEA is generally classified as a weak androgen, i.e., a sex steroid. However, DHEA acts on a variety of cell types and receptors, making it difficult to characterize a single mode of action. For instance, in prostate-derived LNCaP tumor cells, DHEA acts at the androgen receptor (AR) and estrogen receptor beta (ER-beta) with similar affinities, but the effects at ER-beta appear to be more physiologically relevant (100). In contrast, DHEA does have demonstrable effects on AR mRNA expression in ovarian granulosa cells (101). Thus, the actions of DHEA/S are unlikely to be sufficiently characterized as those of a weak androgen alone, and it is important to consider the specific tissue and/or organ involved.
DHEAS has also been suggested to act primarily as a sex-steroid precursor because of its conversion into testosterone and/or estrogen within peripheral target tissues (102). However, the importance of peripheral conversion of DHEAS is most obvious in post-menopausal women, for whom ovarian steroid production has ceased. In this case, DHEAS is responsible for 100% of estrogens and 70% of circulating androgens (7). In men and premenopausal women, gonadal sources of estrogen and testosterone may obscure the contribution of peripheral conversion of DHEAS to the sex hormones.
More recently, non-genomic mechanisms of DHEA action have been clearly demonstrated, including actions through the IGF-1, sigma-1, TrkA, and GABA receptors as well as the DHEA-specific GPCR (8, 103). I specifically mention the sigma-1R, TrkA receptor, and IGF-1R because of their roles in the brain. The sigma-1R is a chaperone that brings molecules from the cell membrane to the mitochondria-associated membrane (MAM) at the nexus of the mitochondria and endoplasmic reticulum (ER) (103). The sigma-1R has been related to axonal guidance and dendritic arborization (104). The TrkA receptor is important in transducing the effect of nerve growth factor (NGF) and has been related to neuronal differentiation and survival (105). As already mentioned, the IGF-1R is related to mitochondrial energy production in neurons (106).
Regardless of the specific receptors involved, DHEA/S has demonstrable effects in the brain. DHEAS regulates the IGF-1 system in the rat hypothalamus by downregulating IGF-1 levels (107). DHEAS is also known to modulate the release of neurotransmitters such as GABA, 5-HT, glutamate, and dopamine [see (108) for a review], effects that may involve the sigma-1R. DHEA, on the other hand, has been shown to modulate glucose and lactate uptake (109) and glucose metabolism (110), and to increase mitochondrial energy production in the rat brain (65, 66). A fuller picture of the metabolic effects of DHEA/S on the brain awaits future research.
Nonetheless, the DHEAS-related impact on neurotransmitter production and release has important implications for patterns of neural activity and brain development. DHEA administration in adults has been shown to inhibit connectivity between the amygdala, the hippocampus, and the insula (13), regions connected by glutamatergic and dopaminergic pathways. In addition, individual variation in DHEA from the age of 4-23 years has been linked to differences in connectivity of the amygdala with both the anterior cingulate cortex and the visual cortex (14, 16). Such differences in connectivity may reflect the impact of increasing levels of DHEA on neurotransmitter release and the subsequent production and maintenance of synaptic connections.
Taken as a whole, findings on DHEA and the sigma-1, TrkA, and IGF-1 receptors in neurons suggest that DHEA/S may be one thread in a non-genomic metabolic pathway linking protein intake and brain development in humans, as follows. Increased protein intake would lead to increased IGF-1 production by the liver. Increased IGF-1 would act directly to increase mitochondrial energy production within brain neurons. At the same time, increased levels of IGF-1 during development would promote the growth of the adrenal zona reticularis and with it DHEA/S production. DHEA/S would act on the brain to augment mitochondrial energy production (65, 66) while protecting neurons against related mitochondrial production of oxygen free radicals and apoptosis (64), with the net outcome of increased neurotransmitter release. See Figure 1 for a diagrammatic representation of how meat consumption might play a role in neurotransmitter release acting through the sigma-1 receptor.

Figure 1. Proposed pathway from meat consumption to neurotransmitter release. Increased IGF-1 would result in a thicker ZR and greater production of DHEA/S. DHEA/S crosses the blood brain barrier and enters neurons. Within the neuron, DHEA acts at the sigma-1 receptor located on the mitochondria-associated membrane between the mitochondria and the endoplasmic reticulum. Activation of the sigma-1 receptor increases energy production and alleviates stress-related production of oxygen free radicals. The result is increased production and release of neurotransmitters, as suggested by the arrows along the axon.

ADRENARCHE IN THE AFRICAN APES
DHEAS produced by the fetal adrenal is present prenatally in a wide variety of primate species (111). The fetal adrenal is very large in utero and then atrophies after birth, meaning DHEAS levels decline rapidly post-natally, with levels among adults generally low across primate species (112). In rhesus macaques, close examination of the adrenal gland indicates that the zona reticularis develops during the period just after birth while the fetal zone of the adrenal is atrophying (113), leading to a transient post-natal increase in DHEAS production (114).
Many primate species, including rhesus and pigtailed macaques and yellow baboons (115), show detectable levels of circulating DHEAS post-natally (112). In fact, across primate species, circulating adult DHEAS levels are strongly related to life span (115, 116). Interestingly, in the common marmoset (Callithrix jacchus), females show increased levels of DHEAS during adulthood, while adult males do not (117, 118).
However, clear and sustained post-natal increases in DHEAS are limited to humans (119, 120), chimps (4), and bonobos (5). In fact, DHEAS levels appear to be much higher in chimps and bonobos compared to gorillas, which would make adrenarche a derived trait <10 million years old (121). At the same time, substantially higher levels of DHEAS have been documented in human females relative to chimpanzees (68), suggesting that the human pattern is further derived relative to that shared by chimpanzees and bonobos.
The timing of increases in DHEA/S production across humans, bonobos, and chimpanzees appears to be generally similar. Recently published longitudinal results based on 53 wild chimpanzees from birth to 20 years of age show that urinary DHEAS, after declining from birth, starts to rise around 2-3 years (4). This rise continues until at least 20 years of age, with no significant difference in urinary DHEAS between the sexes. These findings are consistent with those from a cross-sectional study of 86 captive chimpanzees, ages 1-12 years, in which females showed higher serum levels of DHEA starting at 2-4 years (122). Males from the same study showed increases in DHEAS starting at 4-6 years.
In humans, a recent cross-sectional study of almost 2,000 individuals (m = 1,031; f = 926) using a sensitive LC-MS/MS system demonstrates an increase in serum DHEA starting at 3-5 years and continuing into the 20s (119). Remer et al. (120) show a similar pattern based on LC-MS/MS measurements of urinary DHEA and its metabolites in 400 children 3-17 years of age. The data on age patterns of DHEAS for bonobos are much more limited. A single cross-sectional study of 53 captive animals, ages 1-18 years, shows an increase in urinary DHEAS after 5 years of age (5). As with chimpanzees, the values appear to be still increasing at the upper end of the age range.
Taken together, these studies suggest very similar patterns of DHEAS in humans, bonobos, and chimpanzees, with onset around 3-5 years and continued increases into at least the late teens. Comparison of age-related cortisol patterns reinforces the similarity of adrenal development between humans and chimpanzees. In humans, urinary cortisol declines from birth and then starts to increase at 10 years of age, with higher levels for males (123). Sabbi et al. (4) show a similar pattern for wild chimpanzees, with an increase from 10 years but no sex differentiation. Comparable results are not available for bonobos.
It is important, however, to point out that the phenotypic signs used to define adrenarche (presence of acne, body odor, and hair) appear on average at about 8 years for girls (124) and 9 years for boys (125). Phenotypic signs of adrenarche have not been characterized in either of the two ape species. Thus, while chimpanzees show a similar pattern of age-related DHEAS and cortisol to humans, the timing of adrenarche as defined by phenotypic markers and associated behavioral changes in chimps and bonobos remains to be investigated.

Current Research on Adrenarche in Subsistence Populations
Up to this point, I have drawn from results among WEIRD (Western, educated, industrialized, rich, and democratic) populations (126) to characterize human adrenarche. However, these populations are generally well-nourished, largely sedentary, and carry low disease burdens, leaving abundant energy to fuel the development of the brain and confounding any comparison with wild chimpanzees. Nonetheless, Shi et al. (127) report that animal protein intake and fat mass explained a small but significant amount of variation in adrenal androgen secretion (5 and 1%, respectively) in a sample of 137 pre-pubertal (ages 3-12 years) German children. Thus, given the existence of undernutrition, high disease burdens, and habitual physical activity, we might expect that adrenarche, like puberty, would be delayed in subsistence populations. Furthermore, given energetic constraints, the relationship between DHEA/S, fat depots, and animal protein intake might be more sharply drawn.
A previous set of studies in subsistence populations based on older sample collection and assay techniques (88, 91, 128) could not adequately address the timing of adrenarche. These studies conceptualized adrenarche as a part of puberty and as a consequence did not sample individuals young enough to catch adrenarcheal onset.
However, three recent studies yield results that speak to variation in the timing of adrenarche in subsistence populations and its possible causes. In the study most comparable to that of wild chimps, Helfrecht et al. (89) compared cross-sectional age-related patterns of DHEAS and cortisol derived from hair among Aka pygmies and Ngandu farmers of the Congo with those of Sidama agriculturalists from Ethiopia. The authors used GAM models to determine the transition point at which DHEAS levels start to increase again after declining from birth, based on 160 individuals (80 m; 80 f), ages 3-18 years, from all three populations. Results indicate an average age of adrenarche of about 8 years for the Ngandu and Sidama and 9 years for the Aka, with no significant sex differences in any of the three groups. In addition, age patterns of cortisol among Aka females and Sidama males appear to include a nadir around the age of 10 years.
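The transition-point logic of this analysis can be illustrated with a short statistical sketch. The Python snippet below uses simulated data and a simple broken-stick (piecewise-linear) model rather than the GAM approach of the original study; all values and variable names are hypothetical.

```python
# Sketch: estimate the age at which hair DHEAS stops declining and begins
# to rise, using a broken-stick fit as a stand-in for the GAM-based
# transition-point analysis. Simulated data only; not the original method.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

def broken_stick(age, t, a, b1, b2):
    # log(DHEAS): slope b1 before transition age t, slope b2 after;
    # a is the level at the transition point.
    return np.where(age < t, a + b1 * (age - t), a + b2 * (age - t))

# Simulate 160 children, ages 3-18 years, with a true transition at 8 years.
age = rng.uniform(3, 18, 160)
log_dheas = broken_stick(age, 8.0, 2.0, -0.10, 0.15) + rng.normal(0, 0.2, 160)

# popt[0] is the estimated transition (adrenarcheal) age.
popt, _ = curve_fit(broken_stick, age, log_dheas, p0=[7.0, 2.0, -0.1, 0.1])
print(f"Estimated age at adrenarche: {popt[0]:.1f} years")
```

The broken-stick form makes the "decline, then rise" shape explicit, with the breakpoint itself estimated as a free parameter; a GAM would instead locate the same transition as the minimum of a smooth age curve.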
Importantly, all three of these subsistence populations exhibit high rates of growth stunting (89). Stunting has been associated with lower DHEAS among children in a single study from rural Malawi (59), so undernutrition alone might account for later age at adrenarcheal onset among Aka, Ngandu, and Sidama children. In addition, the Aka and the Ngandu, as residents of a tropical forest, presumably carry high parasite burdens (89), which could play a role in adrenarcheal timing relative to the non-forest-dwelling Sidama.
However, Helfrecht et al. (89) do not include anthropometric measures to test the relationship of DHEAS with stunting or other indices of nutritional status. Nor do they have measures of animal protein consumption that might be used to test the hypothesis that meat consumption is important to variation in adrenarcheal timing. Thus, the results are tantalizing, but without reference values for hair DHEAS from industrialized populations for comparison, as well as measures of dietary intake and nutritional status, they simply call for further investigation.
In a study of the impacts of migration on reproductive maturation (also cross-sectional), Houghton et al. (124) compared the timing of adrenarche in girls among British natives, Bangladeshi natives, and both 1st and 2nd generation Bangladeshi immigrants to the U.K., using salivary DHEAS. Average age at adrenarche (determined by Weibull regression as an estimate of when 50% of the sample passed a threshold value of 400 pg/ml) was 7.1, 7.2, and 7.4 years for the British, Bangladeshi natives, and 2nd generation immigrants, respectively, while for 1st generation immigrants the average was 5.3 years.
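The logic of this estimate can be made concrete with a minimal sketch. Assuming only that the probability of having passed the 400 pg/ml threshold by a given age follows a Weibull cumulative distribution (Houghton et al.'s exact specification may differ), the median age at adrenarche is the age at which that probability reaches 50%. The data and names below are hypothetical.

```python
# Sketch: estimate median age at adrenarche from binary "salivary DHEAS
# above 400 pg/ml" status, assuming P(passed by age a) = 1 - exp(-(a/lam)**k).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
age = rng.uniform(3, 14, 300)                    # ages of sampled girls
true_k, true_lam = 6.0, 7.5                      # shape, scale (years)
p_passed = 1 - np.exp(-(age / true_lam) ** true_k)
passed = rng.random(300) < p_passed              # True = DHEAS > 400 pg/ml

def neg_log_lik(params):
    k, lam = params
    if k <= 0 or lam <= 0:                       # keep the search in range
        return 1e12
    p = 1 - np.exp(-(age / lam) ** k)
    p = np.clip(p, 1e-9, 1 - 1e-9)               # numerical safety
    return -np.sum(passed * np.log(p) + (~passed) * np.log(1 - p))

fit = minimize(neg_log_lik, x0=[3.0, 8.0], method="Nelder-Mead")
k_hat, lam_hat = fit.x
median_age = lam_hat * np.log(2) ** (1.0 / k_hat)  # age at which 50% pass
print(f"Estimated median age at adrenarche: {median_age:.1f} years")
```

The closed-form median of the Weibull, lam * ln(2)^(1/k), is what turns the fitted curve into a single "average age at adrenarche" figure of the kind reported above.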
Houghton et al. also analyzed their DHEAS results with regard to nutritional status, reporting that BMI quartiles predicted the onset of adrenarche, with the highest quartile showing a significant difference from the others. However, BMI quartiles did not predict DHEAS levels subsequent to adrenarche, suggesting that earlier onset of DHEAS production is not related to higher levels of DHEAS during middle childhood and adolescence. In other words, the earlier emergence of the zona reticularis did not appear to be related to the development of a thicker zona reticularis, as indexed by DHEAS production, contrary to my argument above.
The population comparison in Houghton et al.'s study is instructive in two important ways. An earlier study, based on serum DHEAS, reported an average age of adrenarche of 7.7 years among school girls in Taiwan (129). Thus, the timing of adrenarche among the British, Bangladeshi, and Taiwanese natives provides a clear baseline for the onset of adrenarche among adequately nourished girls at 7-8 years. The results from the Bangladeshi immigrants also show that adrenarche can vary substantially across populations. On the other hand, the results do not suggest a clear reason for the substantially earlier onset of adrenarche among the 1st generation migrants, nor do they speak to adrenarche in boys.
Together these three studies provide clear evidence that the timing of adrenarche can vary across populations and may be related to nutritional status. However, they say little about potential causes for such differences, including the differences in energy stores (131), diet (132), and/or heavy parasite burdens (133), or the protein consumption proposed here, that characterize subsistence populations. They also leave open potential sex differences in adrenarcheal timing, which may be involved in the development of attachment and gender roles (1).

Comparative Timing of Zona Reticularis, Brain, and Weaning
Examination of adrenal glands from chimpanzees suggests that the ZR begins to emerge around the age of 3 years and continues to broaden into adulthood (134), similar to the pattern found in humans (67). Unfortunately, data regarding the maturation of the zona reticularis are not available for bonobos.
The emergence of the zona reticularis in chimpanzees appears coincident with the eruption of the first permanent molar (M1) at 3.3 years of age in both wild and captive animals (135, 136). M1 eruption is of specific interest because it is related to the age of weaning across mammals in general (137). At the same time, a period of elevated synaptogenesis lasts from 3 to 5 years (69), paralleling the process of weaning starting around 3 and finishing at ca. 4.5 years (135). Thus, it appears that increased synaptogenesis, with its elevated energy requirements (138), takes place while energy availability from breast milk declines and DHEA/S levels rise.
In humans the emergence of the ZR at age 3 (67) is similar to that seen in chimpanzees (134). However, the relationship of ZR emergence to the timing of molar eruption, the period of elevated synaptogenesis, and weaning differs from that observed in the chimpanzee. M1 eruption, at 5.5 years, is delayed by a couple of years relative to chimps. The period of peak brain glucose utilization, associated with synaptogenesis, is reached at about 5 years (53, 57). In contrast, age at weaning in natural fertility populations is close to 2 ½ years, although the data are poor (70, 71).
In all, it appears that in humans the onset of weaning has been accelerated relative to chimps, while dental maturation has been delayed and the peak of synaptogenesis comes later. Thus, the human pattern of early development appears to have separated brain development from breastfeeding, breaking their ancestral integration. This comparison makes it clear that breastfeeding intervals have been shortened to increase reproductive rates, while at the same time humans show slower growth rates consistent with our extended life spans, a point made previously by Bogin (55, 56), Bogin et al. (139), Kramer (140), and Kramer and Otárola-Castillo (141).
It has been suggested on the basis of M1 eruption timing that humans would be expected to exhibit weaning somewhere between 5 and 7 years (137). If this were so, weaning in humans would be associated with a period of declining glucose utilization (53), as in chimps. Instead, the temporal advance of weaning means that the period of peak glucose utilization and increased synaptogenesis from the age of 4-8 years falls outside of the period of breastfeeding.

DHEAS AND HUMAN DEVELOPMENT
The association of DHEA with the development of the LDLPFC is notable for two reasons: 4-8 years is a period of elevated glucose utilization (53, 57) and increased synaptogenesis (138), and the timing also maps onto the so-called 5-to-8 transition, when children begin to develop cognitive skills that allow for greater individual independence (142). White (73) points out that children become cognitively and socially capable of carrying out basic social tasks during this period. Among hunter-gatherer groups this shift is associated with increasing play directed toward subsistence activities [see examples in (143)], as well as the care of younger siblings (144).
The association of DHEA with the rTPJ from 7 to 12 years of age is even more striking because it suggests DHEA's importance to the development of theory of mind (ToM). ToM is well-developed in humans [for whom it is associated with activity in the rTPJ (145)], but rudimentary in chimpanzees (146). During middle childhood, the differentiation of thinking about the thoughts of others, rather than their actions or feelings, is associated with the development of the rTPJ (147, 148).
Thus, DHEA appears to play a role in two key transitional periods: that from infancy to early childhood and that from early childhood to the juvenile stage, or middle childhood. The first of these steps represents a standard transition in mammals from dependent infants to largely self-sufficient juveniles. As such, one might expect that the pattern of DHEAS production during this period would differ between humans and chimpanzees primarily in magnitude or timing. The second transition, on the other hand, is thought to be unique to humans, as is middle childhood itself (55). Hence the role of DHEA in the development of the rTPJ may be a relatively recent phenomenon in evolutionary terms and, as such, associated with other novel physiological, neurological, or genetic differences between humans and both chimps and bonobos.
The dramatic changes in secondary sexual characteristics during puberty brought on by testosterone and estrogen overshadow the physical effects of DHEAS, including acne, body odor, oily skin, and body hair [see (37) for a review]. As mentioned earlier, DHEA and DHEAS can be converted into testosterone and/or estrogen and act through the AR (100) or the ER (149), so it is possible that DHEA contributes to the effects of testosterone and estradiol. However, given its lack of strong affinity for the AR and ER, DHEA/S is unlikely to have much discernible impact on secondary sexual characteristics during puberty when testosterone and estrogen levels are rising.
Nonetheless, DHEA/S may continue to have effects on brain development throughout puberty. In fact, among prepubertal children, Nguyen et al. (14) found an interaction of DHEA with testosterone on cortical thickness in the right cingulate cortex and occipital pole, while Barendse et al. (150) report an interaction of testosterone and DHEA on white matter microstructure. These findings presumably reflect different modes of action for the two hormones, with testosterone acting through the AR while DHEA acts at other receptors, including sigma-1, TrkA, and IGF-1 (103). If so, prolonged cortical maturation starting at 6 years and continuing into the 20s would appear to reflect increasing levels of DHEA/S as a separate but interrelated process with the impact of reproductive maturation on brain development.
Less attention has been given to the implications of the end of the steady rise in DHEAS in the 20s (151, 152). One study reported a peak for females at about 25 years, with males showing a peak at 30 (152). This timing is roughly consistent with the end of cortical maturation in the 20s as judged by myelination (153-155). Such continued maturation presumably underlies the emergence of young adulthood as a human developmental stage [see (156) for a discussion of young adulthood].
In contrast, myelination in the chimpanzee appears to reach a peak during puberty, around 12 years (157). However, it is unclear how this timing corresponds to age patterns of DHEAS. Bernstein et al. (112) show a peak in serum DHEAS at 10 years of age for a sample they label PAN, i.e., both chimpanzees and bonobos. In contrast, Behringer et al. (5) found urinary DHEAS in bonobos still increasing at the upper end of their 1-18 year age range.

EVOLUTION OF EARLY AND MIDDLE CHILDHOOD
Up to this point I have focused on physiological and cellular mechanisms underlying adrenarche and the effects of DHEA on the brain, including increased cortical thickness in the LDLPFC from 4 to 8 years, and the rTPJ from 7 to 12 (14). Together these brain changes underlie the behavioral and cognitive changes of the 5-to-8 transition and middle childhood (1,37). The evolutionary question is the nature of the selection pressures that created this novel human life stage of early and middle childhood, inserted between infancy and adolescence (55,56,139).
Bogin (55) suggests three possible hypotheses for the evolutionary benefits of early and middle childhood against the backdrop of generally longer development in humans. These include: (1) a reproductive and feeding strategy for the mother, (2) a way of eliciting child care for older children, and (3) a way of reducing the energetic cost of juvenile children. All of these factors appear to apply to early childhood. While a nursing mother is preoccupied with a nursing infant, toddlers can look to other adults and older siblings for both food and social interaction, while their small size makes them less expensive to feed.
With the advent of middle childhood the focus of development shifts to the role of socialization and cultural learning for the child itself (87). Hrdy (76) stresses the importance of child care experience during middle childhood for girls among hunter-gatherer societies, such as the Ju/'hoansi. The development of mentalizing would be especially helpful for such mothers-in-training, as recent findings documenting changes in the social brain during pregnancy emphasize (158). But girls of this age are also enculturated in women's subsistence activities, as well as morality and religion (87).
Boys are solicitous of their younger siblings as well. However, among subsistence societies childhood activities start to become gender-specific during the 7-12 year stage, with girls expected to do childcare (144). Thus, among hunter-gatherers, pre-pubertal boys start to spend more time playing at and practicing hunting skills [see (159, 160) for examples]. As part of this, mentalizing would be important for coordinating behavior during a hunt. In fact, mentalizing can include understanding the mind of prey animals [see (161) for such accounts among the San].
In addition, mentalizing would be useful for developing an understanding of one's self in relationship to the opposite sex. Del Giudice (1) has argued that middle childhood is a time when attachment becomes sexually differentiated and the increase in DHEAS stimulates genetically based, sexually related behaviors that will be more fully developed with puberty. The fact that DHEAS has been related to amygdala connectivity and emotion in prepubertal children (40), together with the role of the amygdala in human chemosensory processing (162, 163), provides a potential link by which body odor at adrenarche might be related to the emergence of sexual awareness. Specifically, the development of body odor as a function of sebaceous glands (37) may be synchronous with the development of the emotional salience of body odor as a signal.

HUMAN EVOLUTION, WEANING, AND MEAT
To be convincing, arguments about the evolution of unique human traits like mentalizing (148) or the emergence of a novel life stage like middle childhood (55) require both a physiological substrate and evidence for a phylogenetic precursor. The onset of DHEAS production during early childhood (119, 120) and its impact on brain development during both early and middle childhood (14, 42, 43), together with the emergence of mentalizing from 9 to 11 years (147), fit both criteria. Adrenarche is a physiological process with neurological consequences related to behaviors during middle childhood, and the increase in DHEAS is linked to the development of the adrenal ZR. A similar increase in DHEA/S among chimpanzees provides a phylogenetic precursor, though the behavioral consequences of adrenarche for chimpanzees are unclear.
Evidence for the role of meat in human brain evolution starting with the origins of the genus Homo can only come from the hominid evolutionary record. Recent findings based on analysis of barium/calcium isotopic ratios from five Australopithecus africanus teeth suggest weaning around 1 year (164). Furthermore, the isotopic analysis shows cyclic changes in the barium/calcium ratios after apparent weaning, suggesting renewed periods of milk intake in response to environmental fluctuations, similar to those observed among orangutans.
These findings are important in suggesting that provisioning by non-maternal family members during early childhood development may have started alongside an increase in endocranial volume with Homo habilis (165). In other words, provisioning of weaned infants with meat would have helped to alleviate seasonal undernutrition, thus promoting growth and survival while allowing for energetically expensive brain development during this period, as discussed above for modern humans.
It is generally agreed that increased meat consumption was critical to early Homo adaptations, including increased brain size (166, 167). The importance of specific factors, such as the role of cooking in making meat especially energy rich (47, 168) and the timing of the habitual use of fire for cooking, is a topic of much discussion (169-171), as are the roles of essential fatty acids (172, 173) and of vitamins and minerals (174). I am not arguing that DHEA/S supplants those factors, i.e., this is not a case of endocrinological "newcomer bias" with regard to metabolic regulation (175), but that DHEA/S represents a previously unrecognized pathway promoting increased brain size, one that seems to fit the specific trajectory of human brain development.

FUTURE RESEARCH DIRECTIONS
Because direct experimentation is not possible, arguments about evolutionary history are by their nature speculative. However, evolutionary arguments can be substantiated by what Wilson (176) referred to as consilience, the jumping together of relevant elements, each subject to empirical investigation, to bear on the central question. In this case, I highlight two related areas where further research can generate empirical results that can then be applied to the interpretation of the fossil record. These are: (1) better characterization of adrenarcheal timing and its association with behavior in the African apes; and (2) the role of protein consumption and nutritional status in adrenarcheal timing among human subsistence populations and its potential implications for behavioral and cognitive development.

Adrenarche in Chimpanzees and Bonobos
For the great apes, given the longitudinal results available from wild chimpanzees (4), the immediate focus of inquiry shifts to a more complete characterization of adrenarche among bonobos. On-going work at the Kokolopori and LuiKotale study sites in the Democratic Republic of Congo by researchers at the Max Planck Institute for Evolutionary Anthropology in Leipzig (https://www.eva.mpg.de/primat/research-groups/bonobos/main-page) is ideally positioned to produce results for wild bonobos. The value of such results would be greatly increased by comparison with a parallel project among captive bonobos and chimpanzees at the same institute. Most specifically, the availability of a well-tested urinary IGFBP-3 (IGF binding protein 3) assay, as a proxy for IGF-1 (5), would make it possible to test the mechanism put forward here. Does individual variation in IGFBP-3 predict variation in DHEAS and the timing of adrenarche? Do longitudinal increases in IGFBP-3 predict increases in DHEA/S within individuals?
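As a sketch of how the second, longitudinal question might be analyzed, the following Python snippet fits a random-intercept mixed model predicting DHEAS from IGFBP-3 within individuals, with age as a covariate. The data are simulated and the model is only one plausible specification, not the protocol of the Max Planck projects.

```python
# Sketch: does urinary IGFBP-3 (a proxy for IGF-1) predict urinary DHEAS
# within individuals? Random-intercept mixed model on simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_ind, n_obs = 40, 6                              # individuals, samples each
ind = np.repeat(np.arange(n_ind), n_obs)
age = np.tile(np.linspace(2, 12, n_obs), n_ind)   # repeated sampling ages
indiv_effect = rng.normal(0, 0.3, n_ind)[ind]     # stable individual levels
igfbp3 = 1.0 + 0.10 * age + rng.normal(0, 0.2, n_ind * n_obs)
dheas = (0.5 + 0.40 * igfbp3 + 0.05 * age
         + indiv_effect + rng.normal(0, 0.2, n_ind * n_obs))

df = pd.DataFrame({"ind": ind, "age": age, "igfbp3": igfbp3, "dheas": dheas})
model = smf.mixedlm("dheas ~ igfbp3 + age", df, groups=df["ind"])
result = model.fit()
print(result.summary())                           # inspect the igfbp3 slope
```

The random intercept separates stable between-individual differences from within-individual change, which is what gives the IGFBP-3 coefficient its longitudinal interpretation.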
If bonobos show age-related increases in DHEA/S similar to those clearly documented for wild chimpanzees by Sabbi et al. (4), the two species together would represent a common pattern useful as a single point of comparison for humans. On the other hand, if patterns of adrenarche differ, it is possible that DHEAS plays some role in reported differences between chimpanzee and bonobo brains (83-85). This is a topic worthy of investigation in its own right.

Adrenarche in Subsistence Populations
As discussed previously, current results suggest that adrenarche may be delayed by a year or two in subsistence populations, but the possible causes and potential implications have yet to receive much attention. The hypothesis advanced here, that meat consumption plays a role in adrenarche, can be tested by using measures of animal protein intake, along with skinfold measures as markers of energy stores, as predictors of DHEA/S in a lean subsistence population, much as Shi et al. (127) did among children in Germany. If the results show a significant relationship with animal protein intake, controlling for measures of energy stores, the next step is to test IGF-1 as a possible mediator between protein consumption and DHEA/S (a minimal analysis sketch follows below).

The question remains as to whether differences in the timing of adrenarche associated with undernutrition have a demonstrable impact on the timing of cognitive processes associated with the 5-to-8 transition and middle childhood. In WEIRD populations, in addition to the underlying process of brain development, cognitive development is scheduled by age-related progression through school, which plays an important role in entraining attention (177-179). In fact, cross-cultural studies suggest that the timing of the 5-to-8 transition is consistent across societies (74, 87), implying that underlying brain maturation, including peak glucose utilization (53) and associated synaptogenesis (138), shows similar timing across populations.
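The proposed dietary test could look something like the following sketch: an ordinary least squares model of DHEA/S on animal protein intake controlling for skinfold-based energy stores, followed by a simple check of whether adding IGF-1 attenuates the protein coefficient (a basic mediation step). All data and variable names are hypothetical.

```python
# Sketch: does animal protein intake predict DHEA/S after controlling for
# energy stores, and does adding IGF-1 attenuate the protein coefficient?
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 150
protein = rng.gamma(4.0, 10.0, n)          # animal protein intake, g/day
skinfold = rng.normal(8.0, 2.0, n)         # triceps skinfold, mm
igf1 = 50 + 0.8 * protein + rng.normal(0, 10, n)   # protein -> IGF-1
dheas = 20 + 0.5 * igf1 + 1.5 * skinfold + rng.normal(0, 10, n)

df = pd.DataFrame({"protein": protein, "skinfold": skinfold,
                   "igf1": igf1, "dheas": dheas})

# Step 1: total effect of protein on DHEA/S, controlling for energy stores.
total = smf.ols("dheas ~ protein + skinfold", df).fit()
# Step 2: if IGF-1 mediates, its inclusion should shrink the protein term.
mediated = smf.ols("dheas ~ protein + igf1 + skinfold", df).fit()
print(f"protein coefficient, total model: {total.params['protein']:.3f}")
print(f"protein coefficient, with IGF-1:  {mediated.params['protein']:.3f}")
```

A substantial drop in the protein coefficient once IGF-1 enters the model would be consistent with, though not proof of, the mediating pathway proposed here.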
Thus, a significant delay in adrenarcheal timing would seem most likely to shift the relative impact of increasing DHEA/S levels away from maturation of the LDLPFC at 4-8 years and toward maturation of the rTPJ from 7 to 12 years. As a consequence, delayed adrenarcheal timing might bias brain development away from impulse control and decision-making during the 5-to-8 transition and more toward mentalizing during middle childhood.
Such a brain might end up producing a mind characterized more by attention to the thoughts of others than to abstract thoughts and rules, as suggested by reports of important differences in attentional style in subsistence vs. WEIRD populations (178, 180). Furthermore, greater emphasis on mentalizing might lead to mentalizing about the thoughts of animals, especially among hunter-gatherers who are so intimate with their prey (181-183). Or, in the words of Hallowell (184), animals might come to be seen as non-human persons.

CONCLUSION
Recent evidence that DHEA/S plays a role in the development of the human brain calls for an evolutionary explanation of adrenarche among both humans and the African apes. I hypothesize that increasing consumption of meat among our hominid ancestors promoted increased IGF-1, leading to increased growth of the adrenal zona reticularis and increased production of DHEA/S. In turn, DHEA/S may promote the mitochondrial energy production critical for synaptogenesis and brain development.
Comparison of the timing of brain development relative to weaning and dental eruption patterns suggests that, unlike in chimps, in humans the maximal brain energy demands are not supported by maternal energy stores. Thus, provisioning of young children by kin with meat may have led to the increased importance of DHEA/S in brain development during the 5-to-8 transition, along with the later development of a middle childhood stage.
Results from wild chimpanzees demonstrate age-related patterns of DHEAS and cortisol very similar to those displayed in humans, providing a baseline from which to understand how DHEAS may have played a role in childhood development over the course of human evolution. More work is needed to determine adrenarcheal timing among wild bonobos, and whether differences in DHEA/S are related to developmental differences between the two ape species.
In addition, findings among subsistence populations are tantalizing in suggesting a delayed age at adrenarche relative to the industrialized world. But the current results are open to interpretation, and the specific factors behind the apparent delays call for investigation. To test the hypothesis suggested here, data on animal protein consumption, as well as anthropometric measures of energetic status, are needed from energetically constrained populations. Such work may have important implications for understanding the impact of adrenarcheal timing on the development of cognition in subsistence populations and, by inference, among humans generally.

AUTHOR CONTRIBUTIONS
BC was responsible for the entire production of this manuscript.