983 results for "physiological control"
Abstract:
Herbicides that inhibit the enzyme protoporphyrinogen oxidase (PROTOX) are usually effective in controlling dicotyledonous weeds, and their agronomic efficacy is affected by environmental and physiological factors. The objective of this review is to summarize the knowledge of these factors reported in the scientific literature over the last decade. Environmental factors that influence PROTOX inhibitors include temperature, irradiance and relative humidity. The most relevant physiological factors are the activity of enzymes that can detoxify herbicides and of enzymes that mitigate the effects of oxidative stress in plants. The study also suggests some possible management strategies that could optimize the activity of PROTOX-inhibiting herbicides.
Abstract:
Depending on the cultivar, the use of desiccants before harvest can favor the maintenance of physiological quality. The objective of the study was to assess the physiological quality of soybean seeds as affected by the use of a preharvest desiccant and by desiccation timing in two harvests (2011/12 and 2012/13). The treatments were five soybean cultivars, two growth stages of application, a control (without desiccant application), and three desiccants (glufosinate-ammonium, carfentrazone-ethyl and paraquat) (2011/12 harvest). In the 2012/13 harvest, carfentrazone-ethyl was replaced by diquat. The physiological quality of seeds was assessed by the percentage of viability and by vigor (cold test, tetrazolium test and accelerated aging test). In the 2011/12 harvest, harvest was brought forward by six days with the use of glufosinate-ammonium and paraquat when desiccation was performed at stage R7.1, with maintenance of seed quality; this, however, depended on the cultivar. In the 2012/13 harvest there was no early harvesting, owing to rain in the preharvest period, and desiccation did not affect the physiological quality of the seeds. Cultivar NA5909 RG was more tolerant to remaining in the cultivation environment, maintaining viability above 90% and vigor above 71% by the cold test, compared to cultivar BMX Turbo (2011/12 harvest). It is concluded that desiccation can be a viable alternative for early soybean harvesting, but it depends on the cultivar, the timing of desiccation, the active ingredient of the desiccant and the absence of rain before harvest.
Abstract:
In order to adapt to daily environmental changes, especially in light availability, many organisms, including plants, have developed a vital mechanism that controls time-dependent biological events: the circadian clock. The circadian clock predicts the changes that occur over a period of approximately 24 hours, preparing plants for the following phases of the cycle. Some of these adaptations can influence the response of weeds to herbicide application. Thus, the objectives of this review are to describe the physiological and genetic mechanisms of the circadian clock in plants and to demonstrate the relationship of this phenomenon with the effectiveness of herbicides for weed control. Relationships are described between the circadian clock and the time of herbicide application, leaf angle and herbicide interception, as well as photosynthetic activity in response to the circadian clock and herbicide efficiency. The role of phytochrome B (phyB) in the sensitivity of plants to the herbicide glyphosate is also discussed. A greater understanding of the circadian clock in plants is essential to achieve greater herbicide efficiency and hence better weed control and higher crop yields.
Abstract:
Considerable evidence suggests that nitroxidergic mechanisms in the nucleus tractus solitarii (NTS) participate in cardiovascular reflex control. Much of that evidence, being based on responses to nitric oxide precursors or inhibitors of nitric oxide synthesis, has been indirect and circumstantial. We sought to directly determine cardiovascular responses to nitric oxide donors microinjected into the NTS and to determine if traditional receptor mechanisms might account for responses to certain of these donors in the central nervous system. Anesthetized adult Sprague Dawley rats that were instrumented for recording arterial pressure and heart rate were used in the physiological studies. Microinjection of nitric oxide itself into the NTS did not produce any cardiovascular responses and injection of sodium nitroprusside elicited minimal depressor responses. The S-nitrosothiols, S-nitrosoglutathione (GSNO), S-nitrosoacetylpenicillamine (SNAP), and S-nitroso-D-cysteine (D-SNC) produced no significant cardiovascular responses while injection of S-nitroso-L-cysteine (L-SNC) elicited brisk, dose-dependent depressor and bradycardic responses. In contrast, injection of glyceryl trinitrate elicited minimal pressor responses without associated changes in heart rate. It is unlikely that the responses to L-SNC were dependent on release of nitric oxide in that 1) the responses were not affected by injection of oxyhemoglobin or an inhibitor of nitric oxide synthesis prior to injection of L-SNC and 2) L- and D-SNC released identical amounts of nitric oxide when exposed to brain tissue homogenates. Although GSNO did not independently affect blood pressure, its injection attenuated responses to subsequent injection of L-SNC. Furthermore, radioligand binding studies suggested that in rat brain synaptosomes there is a saturable binding site for GSNO that is displaced from that site by L-SNC. 
The studies suggest that S-nitrosocysteine, not nitric oxide, may be an interneuronal messenger for cardiovascular neurons in the NTS.
Abstract:
Salivary cortisol is an index of plasma free cortisol and is obtained by a noninvasive procedure. We have been using salivary cortisol as a tool for physiological and diagnostic studies, among them the emergence of the circadian rhythm in preterm and term infants. The salivary cortisol circadian rhythm in term and preterm infants was established between 8 and 12 postnatal weeks. In preterm infants the emergence of the circadian rhythm paralleled the onset of the sleep rhythm. We also studied the use of salivary cortisol for screening for Cushing's syndrome (CS) in control and obese outpatients, based on the circadian rhythm and the overnight 1 mg dexamethasone (DEX) suppression test. Salivary cortisol was suppressed to less than 100 ng/dl after 1 mg DEX in control and obese patients. A single salivary cortisol measurement above the 90th percentile of the obese group values had sensitivity and specificity of 93 and 93% (23:00 h), and 91 and 94% (after 1 mg DEX), respectively. The sensitivity improved to 100% when we combined both parameters. We also studied 11 CS children and 21 age-matched primary obese children, for whom salivary cortisol sensitivity and specificity were 100/95% (23:00 h) and 100/95% (1 mg DEX), respectively. As in adults, sensitivity and specificity of 100% were obtained by combining the 23:00 h and 1 mg DEX measurements. The measurement of salivary cortisol is a useful tool for physiological studies and for the diagnosis of CS in children and adults on an outpatient basis.
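The gain in sensitivity from combining the 23:00 h and post-DEX cutoffs follows from the standard definitions of the two metrics. A minimal sketch, using hypothetical counts (not the study's raw data) to illustrate why a parallel combination, where a case is called positive if either test is positive, can only raise sensitivity:

```python
# Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP).
# Combining two tests "in parallel" (positive if EITHER is positive)
# raises sensitivity, which is how combining the 23:00 h and
# post-DEX cutoffs can reach 100% in the abstract.

def sensitivity(tp, fn):
    return tp / (tp + fn)

def specificity(tn, fp):
    return tn / (tn + fp)

# Hypothetical example: 100 true cases; test A misses 7, test B misses 9,
# and no case is missed by both tests.
cases_missed_by_a = {3, 8, 15, 22, 41, 67, 90}
cases_missed_by_b = {5, 11, 30, 44, 52, 71, 88, 93, 99}
missed_by_both = cases_missed_by_a & cases_missed_by_b  # empty set

sens_a = sensitivity(100 - len(cases_missed_by_a), len(cases_missed_by_a))
sens_combined = sensitivity(100 - len(missed_by_both), len(missed_by_both))
print(f"{sens_a:.2f} {sens_combined:.2f}")  # 0.93 1.00
```

With 21 obese controls, a specificity of 20/21 corresponds to the roughly 95% reported for the children's cohort; the parallel rule trades some specificity for the sensitivity gain.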
Abstract:
There is a close association between the location of angiotensin (Ang) receptors and many important brain nuclei involved in the regulation of the cardiovascular system. The present review encompasses the physiological role of Ang II in the brainstem, particularly its influence on baroreflex control of the heart and kidney. Activation of AT1 receptors in the brainstem by fourth ventricle (4V) administration to conscious rabbits, or by local administration of Ang II into the rostral ventrolateral medulla (RVLM) of anesthetized rabbits, acutely increases renal sympathetic nerve activity (RSNA) and RSNA baroreflex responses. Administration of the Ang antagonist Sarile into the RVLM of anesthetized rabbits blocked the effects of Ang II on the RSNA baroreflex, indicating that the RVLM is the major site of sympathoexcitatory action of Ang II given into the cerebrospinal fluid surrounding the brainstem. However, in conscious animals, blockade of endogenous Ang receptors in the brainstem by the 4V AT1 receptor antagonist losartan resulted in sympathoexcitation, suggesting an overall greater activity of endogenous Ang II within the sympathoinhibitory pathways; the RSNA response to airjet stress in conscious rabbits, however, was markedly attenuated. While we found no effect of acute central Ang on heart rate baroreflexes, chronic 4V infusion inhibited the baroreflex and chronic losartan increased baroreflex gain. Thus, brainstem Ang II acutely alters sympathetic responses to specific afferent inputs, forming part of a potentially important mechanism for the integration of autonomic response patterns. The sympathoexcitatory AT1 receptors appear to be activated during stress, surgery and anesthesia.
Abstract:
The sarcoplasmic reticulum (SR) Ca2+-ATPase (SERCA2a) is under the control of an SR protein named phospholamban (PLN). Dephosphorylated PLN inhibits SERCA2a, whereas phosphorylation of PLN at either the Ser16 site by PKA or the Thr17 site by CaMKII reverses this inhibition, thus increasing SERCA2a activity and the rate of Ca2+ uptake by the SR. This leads to an increase in the velocity of relaxation, SR Ca2+ load and myocardial contractility. In the intact heart, β-adrenoceptor stimulation results in phosphorylation of PLN at both Ser16 and Thr17 residues. Phosphorylation of the Thr17 residue requires both stimulation of the CaMKII signaling pathways and inhibition of PP1, the major phosphatase that dephosphorylates PLN. These two prerequisites appear to be fulfilled by β-adrenoceptor stimulation, which as a result of PKA activation, triggers the activation of CaMKII by increasing intracellular Ca2+, and inhibits PP1. Several pathological situations such as ischemia-reperfusion injury or hypercapnic acidosis provide the required conditions for the phosphorylation of the Thr17 residue of PLN, independently of the increase in PKA activity, i.e., increased intracellular Ca2+ and acidosis-induced phosphatase inhibition. Our results indicated that PLN was phosphorylated at Thr17 at the onset of reflow and immediately after hypercapnia was established, and that this phosphorylation contributes to the mechanical recovery after both the ischemic and acidic insults. Studies on transgenic mice with Thr17 mutated to Ala (PLN-T17A) are consistent with these results. Thus, phosphorylation of the Thr17 residue of PLN probably participates in a protective mechanism that favors Ca2+ handling and limits intracellular Ca2+ overload in pathological situations.
Abstract:
Methods for reliable evaluation of spinal cord (SC) injury in rats at short periods (2 and 24 h) after lesion were tested to characterize the mechanisms implicated in primary SC damage. We measured the physiological changes occurring after several procedures for producing SC injury, with particular emphasis on sensorimotor functions. Segmental and suprasegmental reflexes were tested in 39 male Wistar rats weighing 250-300 g divided into three control groups that were subjected to a) anesthesia, b) dissection of soft prevertebral tissue, and c) laminectomy of the vertebral segments between T10 and L1. In the lesion group the SC was completely transected, hemisected or subjected to vertebral compression. All animals were evaluated 2 and 24 h after the experimental procedure by the hind limb motility index, Bohlman motor score, open-field, hot-plate, tail flick, and paw compression tests. The locomotion scale proved to be less sensitive than the sensorimotor tests. A reduction in exploratory movements was detected in the animals 24 h after the procedures. The hot-plate was the most sensitive test for detecting sensorimotor deficiencies following light, moderate or severe SC injury, and was also the simplest test of reflex function. The hemisection model produced reproducible moderate SC injury, which allowed us to quantify the resulting behavior and analyze the evolution of the lesion and its consequences during the first 24 h after injury. We conclude that hemisection permitted the quantitation of behavioral responses for evaluation of the development of deficits after lesions. Hind limb evaluation scores and spontaneous exploration events provided a sensitive index of immediate injury effects after SC lesion at 2 and 24 h.
Taken together, locomotion scales, open-field, and hot-plate tests represent reproducible, quantitatively sensitive methods for detecting functional deficiencies within short periods of time, indicating their potential for the study of cellular mechanisms of primary injury and repair after traumatic SC injury.
Abstract:
Several studies of the quantitative relationship between sodium need and sodium intake in rats are reviewed. Using acute diuretic treatment 24 h beforehand, intake matches need fairly accurately when intake is spread out in time by using a hypotonic solution of NaCl. In contrast, using a hypertonic solution, intake is typically double the need. Using the same diuretic treatment, although the natriuresis occurs within ~1 h, the appetite appears only slowly over 24 h. Increased plasma levels of aldosterone parallel the increased intake; however, treatment with metyrapone blocks the rise in aldosterone but has no effect on appetite. Satiation of sodium appetite was studied in rats using sodium loss induced by chronic diuretic treatment and daily salt consumption sessions. When a simulated foraging cost was imposed on NaCl access in the form of a progressive ratio lever press task, rats showed satiation for NaCl (break point) after consuming an amount close to their estimated deficit. The chronic diuretic regimen produced hypovolemia and large increases in plasma aldosterone concentration and renin activity. These parameters were reversed to or toward non-depleted control values at the time of behavioral satiation in the progressive ratio protocol. Satiation mechanisms for sodium appetite thus do appear to exist. However, they do not operate quantitatively when concentrated salt is available at no effort, but instead allow overconsumption. There are reasons to believe that such a bias toward overconsumption may have been beneficial over evolutionary time, but such biasing for salt and other commodities is maladaptive in a resource-rich environment.
Abstract:
Vacuolar H+-ATPase is a large multi-subunit protein that mediates ATP-driven vectorial H+ transport across membranes. It is widely distributed and present in virtually all eukaryotic cells, in intracellular membranes or in the plasma membrane of specialized cells. In subcellular organelles, the ATPase is responsible for acidification of the vesicular interior, which requires an intraorganellar acidic pH to maintain optimal enzyme activity. Control of vacuolar H+-ATPase depends on the potential difference across the membrane in which the proton ATPase is inserted. Since the transport performed by H+-ATPase is electrogenic, translocation of H+ ions across the membrane by the pump creates a lumen-positive voltage in the absence of a neutralizing current, generating an electrochemical potential gradient that limits the activity of H+-ATPase. In many intracellular organelles and cell plasma membranes, this potential difference established by the ATPase is normally dissipated by a parallel and passive Cl- movement, which provides an electric shunt compensating for the positive charge transferred by the pump. The underlying mechanisms for the differences in the requirement for chloride by different tissues have not yet been adequately identified, and there is still some controversy as to the molecular identity of the associated Cl--conducting proteins. Several candidates have been identified: the ClC family members, which may or may not mediate nCl-/H+ exchange, and the cystic fibrosis transmembrane conductance regulator. In this review, we discuss some tissues where the association between H+-ATPase and chloride channels has been demonstrated and plays a relevant physiologic role.
Abstract:
In a prospective case-control study, we compared amniotic fluid amino acid levels in non-immune hydrops fetalis (NIHF) and normal fetuses. Eighty fetuses underwent amniocentesis for different reasons at the prenatal diagnosis unit of the Department of Obstetrics and Gynecology, Faculty of Medicine, Dicle University. Forty of these fetuses were diagnosed with NIHF. The study included 40 women each in the NIHF (mean age: 27.69 ± 4.56 years) and control (27.52 ± 5.49 years) groups; the control women had abnormal double- or triple-screening test values but normal fetuses. Gestational ages at the time of sample collection were 23.26 ± 1.98 and 23.68 ± 1.49 weeks, respectively. Amniotic fluid amino acid concentrations (intra-assay variation: 2.26-7.85%; interassay variation: 3.45-8.22%) were measured by gas chromatography using EZ:faast kits (EZ:faast GC/FID free (physiological) amino acid kit; Phenomenex, USA). The standard for quantitation was a mixture of free amino acids from Phenomenex. The levels of 21 amino acids were measured. The mean phosphoserine and serine levels were significantly lower in the NIHF group, while the taurine, α-aminoadipic acid (aaa), glycine, cysteine, NH4, and arginine (Arg) levels were significantly higher compared to control. Significant risk variables for the NIHF group and odds coefficients were obtained using a binary logistic regression method. The respective odds ratios and 95% confidence intervals for the risk variables phosphoserine, taurine, aaa, Arg, and NH4 were 3.31 (1.84-5.97), 2.45 (1.56-3.86), 1.78 (1.18-2.68), 2.18 (1.56-3.04), and 2.41 (1.66-3.49), respectively. The significant difference between NIHF and control fetuses suggests that the amniotic fluid levels of some amino acids may be useful for the diagnosis of NIHF.
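The reported odds ratios and confidence intervals follow the standard logistic regression relationships OR = exp(β) and 95% CI = exp(β ± 1.96·SE). A small sketch showing how the coefficient and its standard error can be recovered from a reported interval, using the phosphoserine values from the abstract (3.31, CI 1.84-5.97); the back-calculation itself is illustrative, not part of the study:

```python
import math

# For a logistic regression coefficient beta with standard error se:
#   odds ratio  OR = exp(beta)
#   95% CI      = exp(beta - 1.96*se), exp(beta + 1.96*se)
# Since the CI is symmetric on the log scale, beta sits at the log-scale
# midpoint of the interval, and se is the log-scale half-width / 1.96.
ci_low, ci_high = 1.84, 5.97                     # phosphoserine CI from the abstract
beta = math.log(math.sqrt(ci_low * ci_high))     # log-scale midpoint
se = (math.log(ci_high) - math.log(ci_low)) / (2 * 1.96)

odds_ratio = math.exp(beta)
ci = (math.exp(beta - 1.96 * se), math.exp(beta + 1.96 * se))
print(round(odds_ratio, 2), round(ci[0], 2), round(ci[1], 2))  # 3.31 1.84 5.97
```

The same arithmetic applies to the other four risk variables; a CI excluding 1.0, as all five reported intervals do, corresponds to a coefficient significantly different from zero.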
Abstract:
The Seed Vigor Imaging System (SVIS®) software has been successfully used to evaluate seed physiological potential through automated analysis of scanned seedlings. In this research, the efficiency of this system was compared to that of other accepted tests for assessing cucumber (Cucumis sativus L.) seed vigor in distinct seed lots of the Supremo and Safira cultivars. Seeds were subjected to germination, traditional and saturated salt accelerated aging, seedling emergence, seedling length and SVIS analyses (determination of vigor indices, seedling growth uniformity, and lengths of primary root, hypocotyl and whole seedlings). It was also determined whether the definition of seedling growth/uniformity ratios affects the sensitivity of the SVIS®. Results showed that SVIS analyses provided consistent identification of seed lot performance and produced information comparable to that from recommended seed vigor tests, demonstrating suitable sensitivity for a rapid and objective evaluation of the physiological potential of cucumber seeds. Analyses of four-day-old cucumber seedlings using the SVIS® are more accurate, and the growth/uniformity ratio does not affect the precision of results.
Abstract:
Traumatic brain injury (TBI) often affects social adaptive functioning, and these changes in social adaptability are usually associated with general damage to the frontal cortex. Recent evidence suggests that certain neurons within the orbitofrontal cortex appear to be specialized for the processing of faces and facial expressions. The orbitofrontal cortex also appears to be involved in self-initiated somatic activation to emotionally-charged stimuli. According to Somatic Marker Theory (Damasio, 1994), reduced physiological activation fails to provide an individual with appropriate somatic cues to personally-relevant stimuli and this, in turn, may result in maladaptive behaviour. Given the susceptibility of the orbitofrontal cortex in TBI, it was hypothesized that impaired perception and reactivity to socially-relevant information might be responsible for some of the social difficulties encountered after TBI. Fifteen persons who sustained a moderate to severe brain injury were compared to age- and education-matched control participants. In the first study, both groups were presented with photographs of models displaying the major emotions and either asked to identify the emotions or simply view the faces passively. In a second study, participants were asked to select cards from decks that varied in terms of how much money could be won or lost; those decks with higher losses were considered high-risk decks. Electrodermal activity was measured concurrently in both situations. Relative to controls, TBI participants were found to have difficulty identifying expressions of surprise, sadness, anger, and fear. TBI persons were also found to be under-reactive, as measured by electrodermal activity, while passively viewing slides of negative expressions. No group difference in reactivity to high-risk card decks was observed.
The ability to identify emotions in the face, and electrodermal reactivity to faces and to high-risk decks in the card game, were examined in relationship to social monitoring and empathy as described by family members or friends on the Brock Adaptive Functioning Questionnaire (BAFQ). Difficulties identifying negative expressions (i.e., sadness, anger, fear, and disgust) predicted problems in monitoring social situations. As well, a modest relationship was observed between hypo-arousal to negative faces and problems with social monitoring. Finally, hypo-arousal in the anticipation of risk during the card game related to problems in empathy. In summary, these data are consistent with the view that alterations in the ability to perceive emotional expressions in the face and the disruption in arousal to personally-relevant information may account for some of the difficulties in social adaptation often observed in persons who have sustained a TBI. Furthermore, these data provide modest support for Damasio's Somatic Marker Theory in that physiological reactivity to socially-relevant information has some value in predicting social function. Therefore, the assessment of TBI persons, particularly those with adaptive behavioural problems, should be expanded to determine whether alterations in perception and reactivity to socially-relevant stimuli have occurred. When this is the case, rehabilitative strategies aimed more specifically at these difficulties should be considered.
Abstract:
Cognitive control involves the ability to flexibly adjust cognitive processing in order to resist interference and promote goal-directed behaviour. Although the frontal cortex is considered to be broadly involved in cognitive control, the mechanisms by which frontal brain areas implement control functions are unclear. Furthermore, aging is associated with reductions in the ability to implement control functions, and questions remain as to whether unique cortical responses serve a compensatory role in maintaining maximal performance in later years. Described here are three studies in which electrophysiological data were recorded while participants performed modified versions of the standard Sternberg task. The goal was to determine how top-down control is implemented in younger adults and altered in aging. In study 1, the effects of frequent stimulus repetition on the interference-related N450 were investigated in a Sternberg task with a small stimulus set (requiring extensive stimulus resampling) and a task with a large stimulus set (requiring no stimulus resampling). The data indicated that the constant stimulus resampling required by small stimulus sets can undercut the effect of proactive interference on the N450. In study 2, younger and older adults were tested in a standard version of the Sternberg task to determine whether the unique frontal positivity, previously shown to predict memory impairment in older adults during a proactive interference task, would be associated with improved performance when memory recognition could be aided by unambiguous stimulus familiarity. Here, results indicated that the frontal positivity was associated with poorer memory performance, replicating the effect observed in a more cognitively demanding task, and showing that stimulus familiarity does not mediate compensatory cortical activations in older adults.
Although the frontal positivity could be interpreted to reflect maladaptive cortical activation, it may also reflect attempts at compensation that fail to fully ameliorate age-related decline. Furthermore, the frontal positivity may be the result of older adults' reliance on late-occurring, controlled processing, in contrast to younger adults' ability to identify stimuli at very early stages of processing. In the final study, working memory load was manipulated in the proactive interference Sternberg task in order to investigate whether the N450 reflects simple interference detection, with little need for cognitive resources, or an active conflict resolution mechanism that requires executive resources to implement. Independent component analysis was used to isolate the effect of interference, revealing that the canonical N450 was based on two dissociable cognitive control mechanisms: a left frontal negativity that reflects active interference resolution but requires executive resources to implement, and a right frontal negativity that reflects global response inhibition, which can be relied on when executive resources are minimal but at the cost of a slowed response. Collectively, these studies advance understanding of the factors that influence younger and older adults' ability to satisfy goal-directed behavioural requirements in the face of interference, and of the effects of age-related cognitive decline.
Abstract:
This thesis explored whether individual characteristics could predict changes in postural control in young adults under conditions of height-induced postural threat. Eighty-two young adults completed questionnaires to assess trait anxiety, trait movement reinvestment, physical risk-taking, and previous experience with height-related activities. Tests of static (quiet standing) and anticipatory (rise to toes) postural control were completed under conditions of low and high postural threat manipulated through changes in surface height. Individual characteristics were able to significantly predict changes in static, but not anticipatory postural control. Trait movement reinvestment and physical risk-taking were the most influential predictors. Evidence was provided that changes in fear and physiological arousal mediated the relationship between physical risk-taking and changes in static postural control. These results suggest that individual characteristics shape the postural strategy employed under threatening conditions and may be important for clinicians to consider during balance assessment and treatment protocols.