23 results for adrenal cortex hormones
in DigitalCommons@The Texas Medical Center
Abstract:
BACKGROUND: Decisions regarding whether to administer intensive care to extremely premature infants are often based on gestational age alone. However, other factors also affect the prognosis for these patients. METHODS: We prospectively studied a cohort of 4446 infants born at 22 to 25 weeks' gestation (determined on the basis of the best obstetrical estimate) in the Neonatal Research Network of the National Institute of Child Health and Human Development to relate risk factors assessable at or before birth to the likelihood of survival, survival without profound neurodevelopmental impairment, and survival without neurodevelopmental impairment at a corrected age of 18 to 22 months. RESULTS: Among study infants, 3702 (83%) received intensive care in the form of mechanical ventilation. Among the 4192 study infants (94%) for whom outcomes were determined at 18 to 22 months, 49% died, 61% died or had profound impairment, and 73% died or had impairment. In multivariable analyses of infants who received intensive care, exposure to antenatal corticosteroids, female sex, singleton birth, and higher birth weight (per each 100-g increment) were each associated with reductions in the risk of death and the risk of death or profound or any neurodevelopmental impairment; these reductions were similar to those associated with a 1-week increase in gestational age. At the same estimated likelihood of a favorable outcome, girls were less likely than boys to receive intensive care. The outcomes for infants who underwent ventilation were better predicted with the use of the above factors than with use of gestational age alone. CONCLUSIONS: The likelihood of a favorable outcome with intensive care can be better estimated by consideration of four factors in addition to gestational age: sex, exposure or nonexposure to antenatal corticosteroids, whether single or multiple birth, and birth weight. (ClinicalTrials.gov numbers, NCT00063063 and NCT00009633.)
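The multivariable estimate described in this abstract combines gestational age with the four additional factors in a regression-style model. A minimal sketch of how such a combination works is given below, assuming hypothetical coefficients; the published NICHD estimator's values are not reproduced here, and the equal placeholder weights only echo the abstract's point that each factor carries roughly the weight of one additional week of gestation.

```python
import math

# Hypothetical coefficients for illustration only -- NOT the published
# NICHD estimator. Each factor shifts the log-odds of a favorable outcome.
HYPOTHETICAL_COEFS = {
    "intercept": -4.0,
    "per_week_beyond_22": 0.9,
    "per_100g_above_500": 0.9,
    "female": 0.9,
    "singleton": 0.9,
    "antenatal_steroids": 0.9,
}

def favorable_outcome_probability(ga_weeks, birth_weight_g, female,
                                  singleton, antenatal_steroids):
    """Toy logistic estimate of P(favorable outcome) from the five factors."""
    c = HYPOTHETICAL_COEFS
    logit = (c["intercept"]
             + c["per_week_beyond_22"] * (ga_weeks - 22)
             + c["per_100g_above_500"] * (birth_weight_g - 500) / 100
             + c["female"] * female
             + c["singleton"] * singleton
             + c["antenatal_steroids"] * antenatal_steroids)
    return 1.0 / (1.0 + math.exp(-logit))

# Example: a 24-week, 650-g singleton girl exposed to antenatal steroids.
print(round(favorable_outcome_probability(24, 650, 1, 1, 1), 2))
```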
Abstract:
The role of adrenal and thyroid hormones in the development of chief and parietal cells was studied in the rat. Administration of corticosterone or thyroxine in the first and second postnatal weeks resulted in the precocious appearance of pepsinogen in the oxyntic gland mucosa and an increase in basal acid output. When pups were adrenalectomized or made hypothyroid, both pepsinogen and basal acid secretion were lowered. Corticosterone injection increased pepsinogen content and acid secretion to levels higher than those of controls in hypothyroid and adrenalectomized rats, while thyroxine had no such effect in adrenalectomized rats. Morphologically, chief cells responded to corticosterone or thyroxine with increases in both zymogen granules and RER. Chief cells, however, contained fewer zymogen granules and less RER in adrenalectomized and hypothyroid rats. Corticosterone was effective in restoring the normal morphological appearance of chief cells in hypothyroid rats, while thyroxine had no effect in adrenalectomized rats. In response to corticosterone or thyroxine, parietal cells in normal animals appeared to contain more mitochondria, tubulovesicles and intracellular canaliculi than those of controls. Unlike chief cells, parietal cells retained normal ultrastructure in the absence of adrenal and thyroid hormones. These data indicate that (1) corticosterone is necessary for the functional and morphological development of chief cells; (2) the morphological development of parietal cells does not appear to depend upon corticosterone; and (3) the effect of thyroxine on the development of chief and parietal cells is due to corticosterone.
Abstract:
We used micro-infusions during eyelid conditioning in rabbits to investigate the relative contributions of cerebellar cortex and the underlying deep nuclei (DCN) to the expression of cerebellar learning. These tests were conducted using two forms of cerebellum-dependent eyelid conditioning for which the relative roles of cerebellar cortex and DCN are controversial: delay conditioning, which is largely unaffected by forebrain lesions, and trace conditioning, which involves interactions between forebrain and cerebellum. For rabbits trained with delay conditioning, silencing cerebellar cortex by micro-infusions of the local anesthetic lidocaine unmasked stereotyped short-latency responses. This was also the case after extinction, as observed previously with reversible blockade of cerebellar cortex output. Conversely, increasing cerebellar cortex activity by micro-infusions of the GABA(A) antagonist picrotoxin reversibly abolished conditioned responses. Effective cannula placements were clustered around the primary fissure and deeper in hemispheric lobules IV (HIV) and V (HV) of the anterior lobe. In well-trained trace-conditioned rabbits, silencing this same area of cerebellar cortex or reversibly blocking cerebellar cortex output also unmasked short-latency responses. Because Purkinje cells are the sole output of cerebellar cortex, these results provide evidence that the expression of well-timed conditioned responses requires a well-timed decrease in the activity of Purkinje cells in the anterior lobe. The parallels between results from delay and trace conditioning suggest similar contributions of plasticity in cerebellar cortex and DCN in both instances.
Abstract:
The ability to represent time is an essential component of cognition, but its neural basis is unknown. Although extensively studied both behaviorally and electrophysiologically, a general theoretical framework describing the elementary neural mechanisms used by the brain to learn temporal representations is lacking. It is commonly believed that the underlying cellular mechanisms reside in higher-order cortical regions, but recent studies show sustained neural activity in primary sensory cortices that can represent the timing of expected reward. Here, we show that local cortical networks can learn temporal representations through a simple framework predicated on reward-dependent expression of synaptic plasticity. We assert that temporal representations are stored in the lateral synaptic connections between neurons and demonstrate that reward-modulated plasticity is sufficient to learn these representations. We implement our model numerically to explain reward-time learning in the primary visual cortex (V1), demonstrate experimental support, and suggest additional experimentally verifiable predictions.
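A minimal numerical sketch of the kind of reward-modulated plasticity rule described above is given below; the network size, dynamics, and all parameter values are arbitrary illustrative assumptions, not the authors' model.

```python
import numpy as np

# Sketch of reward-modulated plasticity in a recurrent network: lateral
# weights W change only when a delayed reward signal coincides with recent
# pre/post coactivity stored in a decaying eligibility trace.
rng = np.random.default_rng(0)
n_neurons, n_steps, reward_time = 50, 100, 60
learning_rate, trace_decay = 0.01, 0.9

W = rng.normal(0, 0.05, (n_neurons, n_neurons))   # lateral synaptic weights
np.fill_diagonal(W, 0.0)
eligibility = np.zeros_like(W)

stimulus = np.zeros(n_neurons)
stimulus[:5] = 1.0                                # tonic drive to a few cells
rate = np.zeros(n_neurons)

for t in range(n_steps):
    rate = np.tanh(W @ rate + stimulus)           # simple rate dynamics
    # Hebbian coactivity accumulates in the eligibility trace.
    eligibility = trace_decay * eligibility + np.outer(rate, rate)
    reward = 1.0 if t == reward_time else 0.0     # delayed reward signal
    W += learning_rate * reward * eligibility     # plasticity gated by reward

print("mean absolute lateral weight after learning:", np.abs(W).mean())
```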
Abstract:
Inappropriate response tendencies may be stopped via a specific fronto/basal ganglia/primary motor cortical network. We sought to characterize the functional role of two regions in this putative stopping network, the right inferior frontal gyrus (IFG) and the primary motor cortex (M1), using electrocorticography from subdural electrodes in four patients while they performed a stop-signal task. On each trial, a motor response was initiated, and on a minority of trials a stop signal instructed the patient to try to stop the response. For each patient, there was a greater right IFG response in the beta frequency band (approximately 16 Hz) for successful versus unsuccessful stop trials. This finding adds to evidence for a functional network for stopping because changes in beta frequency activity have also been observed in the basal ganglia in association with behavioral stopping. In addition, the right IFG response occurred 100-250 ms after the stop signal, a time range consistent with a putative inhibitory control process rather than with stop-signal processing or feedback regarding success. A downstream target of inhibitory control is M1. In each patient, there was alpha/beta band desynchronization in M1 for stop trials. However, the degree of desynchronization in M1 was less for successfully than unsuccessfully stopped trials. This reduced desynchronization on successful stop trials could relate to increased GABA inhibition in M1. Together with other findings, the results suggest that behavioral stopping is implemented via synchronized activity in the beta frequency band in a right IFG/basal ganglia network, with downstream effects on M1.
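The beta-band responses and desynchronization reported above are typically quantified as band-limited power relative to a baseline window. Below is a generic sketch of that computation on a synthetic signal, with assumed sampling rate and filter settings rather than the study's actual analysis pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 1000                                  # sampling rate in Hz (assumed)
t = np.arange(0, 2.0, 1 / fs)
# Synthetic "ECoG" trace: a 16 Hz beta burst riding on noise in the second half.
signal = np.sin(2 * np.pi * 16 * t) * (t > 1.0) + 0.5 * np.random.randn(t.size)

def band_power(x, fs, low, high):
    """Instantaneous power in a band via band-pass filtering + Hilbert envelope."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    envelope = np.abs(hilbert(filtfilt(b, a, x)))
    return envelope ** 2

beta_power = band_power(signal, fs, 13, 30)
# Desynchronization is usually reported as a drop in band power relative
# to a pre-event baseline window; here the first 0.5 s serves as baseline.
baseline = beta_power[: fs // 2].mean()
change_pct = 100 * (beta_power[fs:].mean() - baseline) / baseline
print(f"beta-power change vs. baseline: {change_pct:.1f}%")
```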
Abstract:
The modulation of gene regulation by progesterone (P), acting through its classical intracellular progestin receptors in the brain and resulting in alterations in physiology and behavior, has been well studied. The mechanisms mediating the short-latency effects of P are less well understood. Recent studies have revealed rapid nonclassical signaling action of P involving the activation of intracellular signaling pathways. We explored the involvement of protein kinase C (PKC) in P-induced rapid signaling in the ventromedial nucleus of the hypothalamus (VMN) and preoptic area (POA) of the rat brain. Both the Ca2+-independent (basal) PKC activity, representing the activation of PKC by the in vivo treatments, and the Ca2+-dependent (total) PKC activity, assayed in the presence of exogenous cofactors in vitro, were determined. A comparison of the two activities demonstrated the strength and temporal status of PKC regulation by steroid hormones in vivo. P treatment resulted in a rapid increase in basal PKC activity in the VMN but not the POA. Estradiol benzoate priming augmented the P-initiated increase in basal PKC activity in both the VMN and POA. These increases were inhibited by intracerebroventricular administration of a PKC inhibitor 30 min prior to P. The total PKC activity remained unchanged, demonstrating maximal PKC activation within 30 min in the VMN. In contrast, P significantly attenuated total PKC activity in the POA, with or without estradiol benzoate priming. These rapid changes in P-initiated PKC activity were not due to changes in PKC protein levels or phosphorylation status.
Abstract:
A model for cerebellar involvement in motor learning was tested using classical eyelid conditioning in the rabbit. Briefly, we assume that modifications of the strength of granule cell synapses at Purkinje cells in the cerebellar cortex and mossy fiber (MF) synapses at the cerebellar interpositus nuclei are responsible for the acquisition, adaptively timed expression, and extinction of conditioned eyelid responses (CRs). A corollary of these assumptions is that the cerebellar cortex is necessary for acquisition and extinction. This model also suggests a mechanism whereby the cerebellar cortex can discriminate different times during a conditioned stimulus (CS) and thus mediate the learned timing of CRs. Therefore, experiments were done to determine the role of the cerebellar cortex in the timing, extinction, and acquisition of CRs. Lesions of the cerebellar cortex that included the anterior lobe disrupted the learned timing of CRs such that they occurred at extremely short latencies. Stimulation of MFs in the middle cerebellar peduncle as the CS could support differently timed CRs in the same animal. These data indicate that synaptic plasticity in the cerebellar cortex mediates the learned timing of CRs. The short-latency CRs that resulted from anterior lobe damage did not extinguish, while CRs in animals receiving lesions that did not include the anterior lobe extinguished normally. Preliminary data suggest that lesions of the anterior lobe which produce short-latency responses prevent the acquisition of CRs to a novel CS. These findings indicate that the anterior lobe of the cerebellar cortex is necessary for eyelid conditioning. The involvement of the anterior lobe in eyelid conditioning has not been previously reported; however, the anterior lobe has generally been spared in lesion studies examining cerebellar cortex involvement in eyelid conditioning due to its relatively inaccessible location. The observation that the anterior lobe of the cerebellar cortex is not always required for the basic expression of CRs, but is necessary for response timing, extinction, and acquisition, is consistent with the hypothesis that eyelid conditioning can involve plasticity in both the cerebellar cortex and the interpositus nucleus and that plasticity in the nucleus is controlled by Purkinje cell activity.
Abstract:
Congenital Adrenal Hyperplasia (CAH), due to 21-hydroxylase deficiency, has an estimated incidence of 1:15,000 births and can result in death, salt-wasting crisis or impaired growth. It has been proposed that early diagnosis and treatment of infants detected from newborn screening for CAH will decrease the incidence of mortality and morbidity in the affected population. The Texas Department of Health (TDH) began mandatory screening for CAH in June 1989, and Texas is one of fourteen states to provide neonatal screening for the disorder. The purpose of this study was to describe the cost and effect of screening for CAH in Texas during 1994 and to compare cases first detected by screen and first detected clinically between January 1, 1990 and December 31, 1994. This study used a longitudinal descriptive research design. The data were secondary and had previously been collected by the Texas Department of Health. Along with the descriptive study, an economic analysis was done. The cost of the program was defined, measured and valued for four phases of screening: specimen collection, specimen testing, follow-up and diagnostic evaluation. There were 103 infants with Classical CAH diagnosed during the study, and 71 of the cases had the more serious Salt-Wasting form of the disease. Of the infants diagnosed with Classical CAH, 60% of the cases were first detected by screen and 40% were first detected because of clinical findings before the screening results were returned. The base case cost of adding newborn screening to an existing program (excluding the cost of specimen collection) was $357,989 for 100,000 infants. The cost per case of Classical CAH diagnosed, based on the number of infants first detected by screen in 1994, was $126,892. There were 42 infants diagnosed with the more benign Nonclassical form of the disease. When these cases were included in the total, the cost per infant to diagnose Congenital Adrenal Hyperplasia was $87,848.
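The cost-per-case figures quoted above follow from dividing the attributable screening cost by the number of screen-detected cases. A generic restatement of that arithmetic is sketched below; the case count and screened-infant total used in the example are placeholders, since the 1994 denominators are not given in the abstract.

```python
# Generic cost-per-case arithmetic. Only the per-100,000-infant base cost
# is quoted in the abstract; the example inputs below are hypothetical.
BASE_COST_PER_100K = 357_989   # quoted screening cost per 100,000 infants

def cost_per_case(infants_screened, cases_detected_by_screen):
    """Screening cost attributable to the screened cohort, per screen-detected case."""
    program_cost = BASE_COST_PER_100K * infants_screened / 100_000
    return program_cost / cases_detected_by_screen

# Placeholder example (not the study's 1994 figures):
print(f"${cost_per_case(320_000, 9):,.0f} per screen-detected case")
```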
Abstract:
Despite much attention, the function of oligosaccharide chains of glycoproteins remains largely unknown. Our understanding of oligosaccharide function in vivo has been limited to the use of reagents and targeted mutations that eliminate entire oligosaccharide chains. However, most, if not all, biological functions for oligosaccharides have been attributed to specific terminal sequences on these oligosaccharides, yet there have been few studies to examine the consequences of modifying terminal oligosaccharide structures in vivo. To address this issue, mice were created bearing a targeted mutation in β1,4-galactosyltransferase, an enzyme responsible for elaboration of many of the proposed biologically active carbohydrate epitopes. Most galactosyltransferase-null mice died within the first few weeks after birth and were characterized by stunted growth, thin skin, sparse hair, and dehydration. In addition, the adrenal cortices were poorly stratified and spermatogenesis was delayed. The few surviving adults had puffy skin (myxedema), difficulty delivering pups at birth (dystocia), and failed to lactate (agalactosis). All of these defects are consistent with endocrine insufficiency, which was confirmed by markedly decreased levels of serum thyroxine. The anterior pituitary gland appeared functionally delayed in newborn mutant mice, since the constituent cells were quiescent and nonsecretory, unlike those of control littermates. However, the anterior pituitary acquired a normal secretory phenotype during neonatal development, although it remained abnormally small and its glycoprotein hormones were devoid of β1,4-galactosyl residues. These results support in vitro studies suggesting that incomplete glycosylation of pituitary hormones leads to the creation of hormone antagonists that downregulate subsequent endocrine function, producing polyglandular endocrine insufficiency. More surprisingly, the fact that some mice survive this neonatal period indicates the presence of a previously unrecognized compensatory pathway for glycoprotein hormone glycosylation and/or action. In addition to its well-studied biosynthetic function in the Golgi complex, a GalTase isoform is also expressed on the sperm surface, where it functions as a gamete receptor during fertilization by binding to its oligosaccharide ligand on the egg coat glycoprotein ZP3. Aggregation of GalTase by multivalent ZP3 oligosaccharides activates a G-protein cascade leading to the acrosome reaction. Although GalTase-null males are fertile, the mutant sperm bind less ZP3 than wild-type sperm and, unlike wild-type sperm, are unable to undergo the acrosome reaction in response to either zona pellucida glycoproteins or anti-GalTase antiserum. However, mutant and wild-type sperm undergo the acrosome reaction normally in response to calcium ionophore, which bypasses the requirement for ZP3 binding. Interestingly, the phenotype of the GalTase-null sperm is reciprocal to that of sperm that overexpress surface GalTase, which bind more ZP3, leading to precocious acrosome reactions. These results confirm that GalTase functions as at least one of the sperm receptors for ZP3 and that GalTase participates in the ZP3-induced signal transduction pathway during zona pellucida-induced acrosome reactions.
Abstract:
This research demonstrates cholinergic modulation of thalamic input into the limbic cortex. A projection from the mediodorsal thalamus (MD) to the anterior cingulate cortex was defined anatomically and physiologically. Injections of horseradish peroxidase into the anterior cingulate cortex label neurons in the lateral, parvocellular region of MD. Electrical stimulation of this area produces a complex field potential in the anterior cingulate cortex, which was further characterized by current density analysis and single cell recordings. The monosynaptic component of the response was identified as a large negative field which is maximal in layer IV of the anterior cingulate cortex. This response shows remarkable tetanic potentiation at frequencies near 7 Hz. During a train of 50 or more stimuli, the response would grow quickly and remain at a fairly stable potentiated level throughout the train. Cholinergic modulation of this thalamic response was demonstrated by iontophoretic application of the cholinergic agonist carbachol, which decreased the effectiveness of the thalamic input by rapidly attenuating the response during a train of stimuli. The effect was apparently mediated by muscarinic receptors, since the effect of carbachol was blocked by atropine but not by hexamethonium. To determine the source of the cingulate cortex cholinergic innervation, lesions were made in the anterior and medial thalamus and in the nucleus of the diagonal band of Broca. The effects of these lesions on choline acetyltransferase activity in the cingulate cortex were determined by a micro-radioenzymatic assay. Only the lesions of the nucleus of the diagonal band significantly decreased the choline acetyltransferase activity in the cingulate cortex regions. Therefore, the diagonal band appears to be a major source of this cholinergic innervation and may be involved in gating of sensory information from the thalamus into the limbic cortex. Attempts to modulate the cingulate response to MD stimulation with electrical stimulation of the diagonal band, however, were not successful.
Abstract:
Human behavior appears to be regulated in part by noradrenergic transmission, since antidepressant drugs modify the number and function of β-adrenergic receptors in the central nervous system. Affective illness is also known to be associated with the endocrine system, particularly the hypothalamic-pituitary-adrenal axis. The aim of the present study was to determine whether hormones, in particular adrenocorticotrophin (ACTH) and corticosterone, may influence behavior by regulating brain noradrenergic receptor function. Chronic treatment with ACTH accelerated the increase or decrease in rat brain β-adrenergic receptor number induced by a lesion of the dorsal noradrenergic bundle or treatment with the antidepressant imipramine. Chronic administration of ACTH alone had no effect on β-receptor number, although it reduced norepinephrine-stimulated cyclic AMP accumulation in brain slices. Treatment with imipramine also reduced the cyclic AMP response to norepinephrine but was accompanied by a decrease in β-adrenergic receptor number. Both the imipramine and ACTH treatments reduced the affinity of β-adrenergic receptors for norepinephrine, but only the antidepressant modified the potency of the neurotransmitter to stimulate second messenger production. Neither ACTH nor imipramine treatment altered Gpp(NH)p- or fluoride-stimulated adenylate cyclase, cyclic AMP, cyclic GMP, or cyclic GMP-stimulated cyclic AMP phosphodiesterase, or the activity of the guanine nucleotide binding protein (Gs). These findings suggested that post-receptor components of the cyclic nucleotide generating system are not influenced by the hormone or antidepressant. This conclusion was verified by the finding that neither treatment altered adenosine-stimulated cyclic AMP accumulation in brain tissue. A detailed examination of the α- and β-adrenergic receptor components of norepinephrine-stimulated cyclic AMP production revealed that ACTH, but not imipramine, administration reduced the contribution of the α-receptor-mediated response. Like ACTH treatment, corticosterone diminished the α-adrenergic component, indicating that adrenal steroids probably mediate the neurochemical responses to ACTH administration. The data indicate that adrenal steroids and antidepressants decrease noradrenergic receptor function by selectively modifying the α- and β-receptor components. The functional similarity in the action of the steroid and antidepressants suggests that adrenal hormones normally contribute to the maintenance of receptor systems which regulate affective behavior in man.
Abstract:
Deficits in social cognition are prominent symptoms of many human psychiatric disorders, but the origin of such deficits remains largely unknown. To further current knowledge regarding the neural network mediating social cognition, the present research program investigated the individual contributions of two temporal lobe structures, the amygdala and hippocampal formation, and one frontal lobe region, the orbital frontal cortex (Areas 11 and 13), to primate social cognition. Based on previous research, we hypothesized that the amygdala, hippocampal formation and orbital frontal cortex contribute significantly to the formation of new social relationships, but less to the maintenance of familiar ones. Thirty-six male rhesus macaques (Macaca mulatta) served as subjects, and were divided into four experimental groups: Neurotoxic amygdala lesion (A-ibo, n = 9), neurotoxic or aspiration orbital frontal cortex lesion (O, n = 9), neurotoxic hippocampal formation lesion (H-ibo, n = 9) or sham-operated control (C, n = 9). Six social groups (tetrads) were created, each containing one member from each experimental group. The effect of lesion on established social relationships was assessed during pre- and post-surgical unrestrained social interactions, whereas the effect of lesion on the formation of new relationships was assessed during an additional phase of post-surgical testing with shuffled tetrad membership. Results indicated that these three neural structures each contribute significantly to both the formation and maintenance of social relationships. Furthermore, the amygdala appears to primarily mediate normal responses to threatening social signals, whereas the orbital frontal cortex plays a more global role in social cognition by mediating responses to both threatening and affiliative social signals. By contrast, the hippocampal formation seems to contribute to social cognition indirectly by providing access to previous experience during social judgments. These conclusions were further investigated with three experiments that measured behavioral and physiological (stress hormone) reactivity to threatening stimuli, and three additional experiments that measured subjects' ability to flexibly alter behavioral responses depending on the incentive value of a food reinforcer. Data from these six experiments further confirmed and strengthened the three conclusions originating from the social behavior experiments and, when combined with the current literature, helped to formulate a simple, but testable, theoretical model of primate social cognition.
Abstract:
Adult monkeys (Macaca mulatta) with lesions of the hippocampal formation, perirhinal cortex, or areas TH/TF, as well as controls, were tested on tasks of object, spatial and contextual recognition memory. Using a visual paired-comparison (VPC) task, all experimental groups showed a lack of object recognition relative to controls, although this impairment emerged at 10 sec with perirhinal lesions, 30 sec with areas TH/TF lesions and 60 sec with hippocampal lesions. In contrast, only perirhinal lesions impaired performance on delayed nonmatching-to-sample (DNMS), another task of object recognition memory. All groups were tested on DNMS with distraction (dDNMS) to examine whether the use of active cognitive strategies during the delay period could enable good performance on DNMS in spite of impaired recognition memory (revealed by the VPC task). Distractors affected performance of animals with perirhinal lesions at the 10-sec delay (the only delay at which their DNMS performance was above chance). They did not affect performance of animals with areas TH/TF lesions. Hippocampectomized animals were impaired at the 600-sec delay (the only delay at which prevention of active strategies would likely affect their behavior). While lesions of areas TH/TF impaired spatial location memory and object-in-place memory, hippocampal lesions impaired only object-in-place memory. The pattern of results for perirhinal cortex lesions on the different task conditions indicated that this cortical area is not critical for spatial memory. Finally, all three lesions impaired contextual recognition memory processes. The pattern of impairment appeared to result from the formation of only a global representation of the object and background, and suggests that all three areas are recruited for associating information across sources. These results support the view that (1) the perirhinal cortex maintains storage of information about an object and the context in which it is learned for a brief period of time, (2) areas TH/TF maintain information about spatial location and form associations between objects and their spatial relationship (a process that likely requires additional time) and (3) the hippocampal formation mediates associations between objects, their spatial relationship and the general context in which these associations are formed (an integrative function that requires additional time).