Abstract:
We determined the effect of muscle glycogen concentration and postexercise nutrition on anabolic signaling and rates of myofibrillar protein synthesis (MPS) after resistance exercise (REX). Sixteen young, healthy men matched for age, body mass, peak oxygen uptake (VO2peak) and strength (one-repetition maximum; 1RM) were randomly assigned to either a nutrient or a placebo group. After 48 h of diet and exercise control, subjects undertook a glycogen-depletion protocol consisting of one-leg cycling to fatigue (LOW), while the other leg rested (NORM). The next morning, following an overnight fast, a primed, constant infusion of L-[ring-13C6] phenylalanine was commenced and subjects completed 8 sets of 5 unilateral leg press repetitions at 80% 1RM. Immediately after REX and 2 h later, subjects consumed a 500 ml bolus of a protein/CHO (20 g whey + 40 g maltodextrin) or placebo beverage. Muscle biopsies from the vastus lateralis of both legs were taken at rest and at 1 and 4 h after REX. Muscle glycogen concentration was higher in NORM than in LOW at all time points in both the nutrient and placebo groups (P < 0.05). Postexercise Akt-p70S6K-rpS6 phosphorylation increased in both groups with no differences between legs (P < 0.05). mTOR Ser2448 phosphorylation in placebo increased 1 h after exercise in NORM (P < 0.05), whereas with nutrient, mTOR phosphorylation increased ~4-fold in LOW (P < 0.01) and ~11-fold in NORM (P < 0.01; different between legs, P < 0.05). Postexercise rates of MPS were not different between NORM and LOW with nutrient (0.070 ± 0.022 vs. 0.068 ± 0.018 %/h) or placebo (0.045 ± 0.021 vs. 0.049 ± 0.017 %/h). We conclude that commencing high-intensity REX with low muscle glycogen availability does not compromise the anabolic signal and subsequent rates of MPS, at least during the early (4 h) postexercise recovery period.
Abstract:
Purpose: The aim of this study was to determine the early time course of exercise-induced signaling after the divergent contractile activity associated with resistance and endurance exercise. Methods: Sixteen male subjects were randomly assigned to either a cycling (CYC; n = 8, 60 min, 70% VO2peak) or resistance exercise (REX; n = 8, 8 × 5 leg extension, 80% one-repetition maximum, 3-min recovery) group. Serial muscle biopsies were obtained from the vastus lateralis at rest before exercise, immediately after, and after 15, 30, and 60 min of passive recovery to determine early signaling responses after exercise. Results: There were comparable increases from rest in Akt Thr308/Ser473 and mTOR Ser2448 phosphorylation during the postexercise time course, peaking 30-60 min after both CYC and REX (P < 0.05). There were also similar patterns in p70S6K Thr389 and 4E-BP1 Thr37/46 phosphorylation, but a greater magnitude of effect was observed for REX and CYC, respectively (P < 0.05). However, AMPK Thr172 phosphorylation was only significantly elevated after CYC (P < 0.05), and we observed divergent responses for glycogen synthase Ser641 and AS160 phosphorylation, which were enhanced after CYC but not REX (P < 0.05). Conclusions: We show a similar time course for Akt-mTOR-S6K phosphorylation during the initial 60-min recovery period after divergent contractile stimuli. Conversely, enhanced phosphorylation of proteins that promote glucose transport and glycogen synthesis occurred only after endurance exercise. Our results indicate that both endurance and resistance exercise initiate translational signaling, but that high-load, low-repetition contractile activity fails to promote phosphorylation of the pathways regulating glucose metabolism.
Abstract:
Anxiety traits can be stable, permanent characteristics of an individual across time that are relatively insusceptible to influence by a particular situation. One way to study trait anxiety in an experimental context is through the use of rat lines selected for contrasting phenotypes of fear and anxiety. It is not clear whether the behavioral differences between two contrasting rat lines in one given anxiety test are also present in other paradigms of state anxiety. Here, we examine the extent to which multiple anxiety traits generalize across animal lines originally selected for a single anxiety trait. We review the behavioral results available in the literature for eight rat genetic models of trait anxiety - namely Maudsley Reactive and Non-reactive rats, Floripa H and L rats, Tsukuba High and Low Emotional rats, High and Low Anxiety-related rats, High and Low Ultrasonic Vocalization rats, Roman High and Low Avoidance rats, Syracuse High and Low Avoidance rats, and Carioca High and Low Conditioned Freezing rats - across 11 behavioral paradigms of innate anxiety or aversive learning frequently used in the experimental setting. We observed both convergence and divergence of behavioral responses in these selected lines across the 11 paradigms. We find that a predisposition for a specific anxiety trait usually generalizes to other anxiety-provoking stimuli. However, this generalization is not observed across all genetic models, indicating some unique trait-state interactions. Genetic models of enhanced anxiety-related responses are beginning to help define how anxiety can manifest differently depending on the underlying traits and the current environmentally induced state.
Abstract:
Individual variability in the acquisition, consolidation and extinction of conditioned fear potentially contributes to the development of fear pathology, including posttraumatic stress disorder (PTSD). Pavlovian fear conditioning is a key tool for the study of fundamental aspects of fear learning. Here, we used selected mouse lines of High and Low Pavlovian conditioned fear created from an advanced intercross line (AIL) in order to begin to identify the cellular basis of phenotypic divergence in Pavlovian fear conditioning. We investigated whether phosphorylated MAPK (p44/42 ERK/MAPK), a protein kinase required in the amygdala for the acquisition and consolidation of Pavlovian fear memory, is differentially expressed following Pavlovian fear learning in the High and Low fear lines. We found that following Pavlovian auditory fear conditioning, High and Low line mice differ in the number of pMAPK-expressing neurons in the dorsal subnucleus of the lateral amygdala (LAd). In contrast, this difference was not detected in the ventral medial (LAvm) or ventral lateral (LAvl) amygdala subnuclei, or in control animals. We propose that this apparent increase in plasticity at a known locus of fear memory acquisition and consolidation relates to intrinsic differences between the two fear phenotypes. These data provide important insights into the micronetwork mechanisms encoding phenotypic differences in fear. Understanding the circuit-level cellular and molecular mechanisms that underlie individual variability in fear learning is critical for the development of effective treatments of fear-related illnesses such as PTSD.
Abstract:
A growing body of evidence suggests that mitochondrial function may be important in brain development and psychiatric disorders. However, detailed expression profiles of the relevant genes in human brain development and fear-related behavior remain unclear. Using publicly available microarray data and Gene Ontology (GO) analysis, we identified the genes and functional categories associated with chronological age in the prefrontal cortex (PFC) and the caudate nucleus (CN) of psychiatrically normal humans ranging in age from birth to 50 years. Among those, we found that a substantial number of genes in the PFC (115) and the CN (117) are associated with the GO term 'mitochondrion' (FDR q-value < 0.05). A greater proportion of these genes in the PFC (91%) than in the CN (62%) showed a linear increase in expression during postnatal development. Using quantitative PCR, we validated the developmental expression pattern of four genes: monoamine oxidase B (MAOB), NADH dehydrogenase flavoprotein (NDUFV1), mitochondrial uncoupling protein 5 (SLC25A14) and tubulin beta-3 chain (TUBB3). In mice, the overall developmental expression patterns of MAOB, SLC25A14 and TUBB3 in the PFC were comparable to those observed in humans (p < 0.05). However, mice selectively bred for high fear did not exhibit the normal developmental changes of MAOB and TUBB3. These findings suggest that the genes associated with mitochondrial function in the PFC play a significant role in brain development and fear-related behavior.
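The FDR threshold quoted above (q-value < 0.05) is typically obtained with the Benjamini-Hochberg procedure. The abstract does not state which implementation the authors used, so the following is only a minimal illustrative sketch with hypothetical p-values.

```python
import numpy as np

def benjamini_hochberg(pvals):
    """Benjamini-Hochberg FDR: convert raw p-values into q-values."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)                          # indices sorting p ascending
    ranked = p[order] * m / np.arange(1, m + 1)    # p_(i) * m / i
    q = np.minimum.accumulate(ranked[::-1])[::-1]  # min over j >= i keeps q monotone
    q = np.clip(q, 0.0, 1.0)
    qvals = np.empty(m)
    qvals[order] = q                               # back to the original gene order
    return qvals

# Hypothetical p-values for five genes; genes with q < 0.05 would be reported
pvals = [0.0002, 0.009, 0.04, 0.30, 0.70]
qvals = benjamini_hochberg(pvals)
for p, q in zip(pvals, qvals):
    print(f"p = {p:.4f} -> q = {q:.4f}  significant: {q < 0.05}")
```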
Abstract:
Although the endocannabinoid system (ECS) has been implicated in brain development and various psychiatric disorders, the precise mechanisms by which the ECS affects mood and anxiety disorders remain unclear. Here, we investigated the developmental and disease-related expression patterns of the cannabinoid receptor 1 (CB1) and cannabinoid receptor 2 (CB2) genes in the dorsolateral prefrontal cortex (PFC) of humans. Using mice selectively bred for high and low fear, we further investigated the potential association between fear memory and cannabinoid receptor expression in the brain. CB1, but not CB2, mRNA levels in the PFC gradually decrease during postnatal development from birth to 50 years of age (r² > 0.6 and adjusted p < 0.05). CB1 levels in the PFC of patients with major depression were higher than in age-matched controls (adjusted p < 0.05). In mice, CB1, but not CB2, levels in the PFC were positively correlated with freezing behavior in classical fear conditioning (p < 0.05). These results suggest that CB1 in the PFC may play a significant role in regulating mood and anxiety symptoms. Our study demonstrates the advantage of utilizing data from postmortem brain tissue together with a mouse model of fear to enhance our understanding of the role of the cannabinoid receptors in mood and anxiety disorders.
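For illustration only: the mouse result above is a simple correlation between receptor expression and freezing behavior. A minimal sketch of such an analysis, assuming hypothetical per-animal CB1 mRNA values and freezing scores (not the study's data), using SciPy's Pearson correlation:

```python
import numpy as np
from scipy import stats

# Hypothetical per-mouse values: CB1 mRNA in the PFC (arbitrary units)
# and percent time spent freezing during the fear-conditioning test.
cb1_mrna = np.array([1.2, 1.8, 2.1, 2.6, 3.0, 3.4, 3.9, 4.5])
freezing = np.array([12.0, 20.0, 25.0, 31.0, 38.0, 41.0, 50.0, 57.0])

r, p = stats.pearsonr(cb1_mrna, freezing)  # Pearson r and two-sided p-value
print(f"r = {r:.3f}, r^2 = {r**2:.3f}, p = {p:.5f}")
```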
Abstract:
The recent independent review of the UK Liverpool Care Pathway (LCP) [1], commissioned following substantial concerns raised by members of the public and health professionals, found that implementation of the LCP is often associated with poor care [1]. The Neuberger Report highlighted the complexity of the various ethical, safety, clinical practice and negligence issues associated with pathway usage and how, despite technological advances, diagnosing dying continues to be challenging. The UK Government's decision to phase out the LCP as policy following these findings has generated considerable debate both within and beyond the UK. However, another key issue raised by the Neuberger Report is the palliative care community's perceived willingness to readily adopt new clinical practices in the absence of evidence. It is this translational issue that this editorial explores.
Abstract:
High-wind events such as storms and hurricanes cause severe damage to low-rise buildings (housing, schools, and industrial, commercial, and farm buildings). Roof claddings often suffer the worst damage, which then leads to accelerated damage to the whole building. Australia leads the way in solving this international problem through extensive research and development work and has adequate design documents in place. This paper first briefly illustrates the nature of high-wind events and then the commonly observed damage to buildings. Australian research work and design practice are then described, on the basis of which suitable design recommendations for wind-resistant buildings are presented.
Abstract:
Background: Cancer-related malnutrition is associated with increased morbidity, poorer tolerance of treatment, decreased quality of life, increased hospital admissions, and increased health care costs (Isenring et al., 2013). This study's aim was to determine whether a novel, automated screening system was a useful tool for nutrition screening when compared against a full nutrition assessment using the Patient-Generated Subjective Global Assessment (PG-SGA) tool. Methods: A single-site, observational, cross-sectional study was conducted in an outpatient oncology day care unit within a Queensland tertiary facility, with three hundred outpatients (51.7% male, mean age 58.6 ± 13.3 years). Eligibility criteria: ≥18 years, receiving anticancer treatment, able to provide written consent. Patients completed the Malnutrition Screening Tool (MST). Nutritional status was assessed using the PG-SGA. Data for the automated screening system were extracted from the pharmacy software program Charm; this included body mass index (BMI) and weight records dating back up to six months. Results: The prevalence of malnutrition was 17%. Any weight loss over the three to six weeks prior to the most recent weight record, as identified by the automated screening system, predicted PG-SGA-classified malnutrition with 56.52% sensitivity, 35.43% specificity, 13.68% positive predictive value and 81.82% negative predictive value. An MST score of 2 or greater was a stronger predictor of PG-SGA-classified malnutrition (70.59% sensitivity, 69.48% specificity, 32.14% positive predictive value, 92.02% negative predictive value). Conclusions: Both the automated screening system and the MST fell short of the accepted professional standards for sensitivity (80%) or specificity (60%) when compared to the PG-SGA. Although the MST remains the better predictor of malnutrition in this setting, uptake of this tool in the Oncology Day Care Unit remains challenging.
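The screening statistics reported above are simple ratios over a 2x2 table of screening result versus PG-SGA classification. A minimal sketch follows; the counts are back-calculated from the reported MST figures and the 17% prevalence in 300 patients, since the abstract does not give the actual 2x2 table.

```python
def screening_metrics(tp, fp, fn, tn):
    """Screening-test metrics from the counts of a 2x2 confusion matrix."""
    sensitivity = tp / (tp + fn)  # proportion of malnourished patients flagged
    specificity = tn / (tn + fp)  # proportion of well-nourished patients not flagged
    ppv = tp / (tp + fp)          # positive predictive value
    npv = tn / (tn + fn)          # negative predictive value
    return sensitivity, specificity, ppv, npv

# Counts reconstructed for illustration (MST >= 2 vs. PG-SGA malnutrition, n = 300)
sens, spec, ppv, npv = screening_metrics(tp=36, fp=76, fn=15, tn=173)
print(f"sensitivity={sens:.2%}  specificity={spec:.2%}  PPV={ppv:.2%}  NPV={npv:.2%}")
```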
Abstract:
Background: This is an updated version of a Cochrane review first published in Issue 1, 2010 of The Cochrane Library. In many clinical areas, integrated care pathways are utilised as structured multidisciplinary care plans that detail essential steps in caring for patients with specific clinical problems. In particular, care pathways for the dying have been developed as a model to improve care of patients who are in the last days of life. These care pathways were designed with the aim of ensuring that the most appropriate management occurs at the most appropriate time and that it is provided by the most appropriate health professional. There have been sustained concerns about the safety of implementing end-of-life care pathways, particularly in the UK. Therefore, there is a significant need for clinicians and policy makers to be informed about the effects of end-of-life care pathways by a systematic review. Objectives: To assess the effects of end-of-life care pathways, compared with usual care (no pathway) or with care guided by another end-of-life care pathway, across all healthcare settings (e.g. hospitals, residential aged care facilities, community). In particular, we aimed to assess the effects on symptom severity and quality of life of people who are dying; on those involved in the care, such as families, carers and health professionals; or a combination of these. Search Methods: We searched the Cochrane Central Register of Controlled Trials (CENTRAL) (Issue 6, 2013), MEDLINE, EMBASE, PsycINFO, CINAHL, review articles and reference lists of relevant articles. We conducted the original search in September 2009 and the updated search in June 2013. Selection Criteria: All randomised controlled trials (RCTs), quasi-randomised trials or high-quality controlled before-and-after studies comparing use versus non-use of an end-of-life care pathway in caring for the dying. Data Collection and Analysis: Two review authors assessed the results of the searches against the predetermined criteria for inclusion. Main Results: The original review identified 920 titles. The updated search found 2042 potentially relevant titles (including the original 920), but no additional studies met the criteria for inclusion in the review update. Authors' Conclusions: With sustained concerns about the safety of pathway implementation and the lack of available evidence on important patient and relative outcomes, recommendations for the use of end-of-life pathways in caring for the dying cannot be made. Since the last version of this review, no new studies have met the criteria for inclusion in the review update. Given recently documented concerns about the potential adverse effects associated with the Liverpool Care Pathway (the most commonly used end-of-life care pathway), we do not recommend decision making based on indirect or low-quality evidence. All health services using end-of-life care pathways are encouraged to have their use of the pathway, to date, independently audited. Any subsequent use should be based on carefully documented evaluations. Large RCTs or other well-designed controlled studies are urgently required for the evaluation of end-of-life care pathways in caring for dying people in various clinical settings. In future studies, outcome measures should include benefits or harms concerning the outcomes of interest in this review in relation to patients, families, carers and health professionals.
Abstract:
The Accelerating the Mathematics Learning of Low Socio-Economic Status Junior Secondary Students project aims to address the issues faced by severely underperforming mathematics students as they enter high school. It seeks to accelerate students' learning of mathematics through a vertical curriculum so that they can access Year 10 mathematics subjects, thus improving their life chances. This paper reports on the theory underpinning the project and illustrates it with examples of the curriculum designed to achieve this acceleration.
Abstract:
Background: In Pacific Island Countries (PICs), the epidemiology of dengue is characterized by long-term transmission of a single dengue virus (DENV) serotype. The emergence of a new serotype in one island country often indicates that major outbreaks with this serotype will follow in other PICs. Objectives: Filter paper (FP) cards, on which whole blood or serum from patients with suspected dengue had been dried, were evaluated as a method for transporting this material by standard mail delivery throughout the Pacific. Study design: Twenty-two FP-dried whole blood samples collected from patients in New Caledonia and Wallis & Futuna Islands during DENV-1 and DENV-4 transmission, and 76 FP-dried sera collected from patients in Yap State, Majuro (Republic of the Marshall Islands), Tonga and Fiji before and during outbreaks of DENV-2 in Yap State and DENV-4 in Majuro, were tested for the presence of DENV RNA by serotype-specific RT-PCR at the Institut Louis Malardé in French Polynesia. Results: The serotype of DENV could be determined, by a variety of RT-PCR procedures, in the FP-dried samples after more than three weeks of transport at ambient temperatures. In most cases, sequencing of the envelope gene to genotype the viruses was also possible. Conclusions: The serotype and genotype of DENV can be determined from FP-dried serum or whole blood samples transported over thousands of kilometers at ambient, tropical temperatures. This simple and low-cost approach to virus identification should be evaluated in isolated and resource-poor settings for the surveillance of a range of significant viral diseases.
Abstract:
The Australian Commission on Safety and Quality in Health Care commissioned this rapid review to identify recent evidence in relation to three key questions: 1. What is the current evidence of quality and safety issues regarding the hospital experience of people with cognitive impairment (dementia/delirium)? 2. What are the existing evidence-based pathways, best practice or guidelines for cognitive impairment in hospitals? 3. What are the key components of an ideal patient journey for a person with dementia and/or delirium? The purpose of this review is to identify best practice in caring for patients with cognitive impairment (CI) in acute hospital settings. CI refers to patients with dementia and delirium but can include other conditions. For the purposes of this report, ‘Hospitals’ is defined as acute care settings and includes care provided by acute care institutions in other settings (e.g. Multipurpose Services and Hospital in the Home). It does not include residential aged care settings or palliative care services that are not part of a service provided by an acute care institution. Method: Both peer-reviewed publications and the grey literature were comprehensively searched for recent (primarily post-2010) publications, reports and guidelines that addressed the three key questions. The literature was evaluated and graded according to the National Health and Medical Research Council (NHMRC) levels of evidence criteria (see Evidence Summary - Appendix B). Results: Thirty-one recent publications were retrieved in relation to quality and safety issues faced by people with CI in acute hospitals. The results indicate that CI is a common problem in hospitals (upwards of 30%, with the rate increasing with patient age), although this is likely to be an underestimate, in part due to the number of patients without a formal dementia diagnosis. There is a large body of evidence showing that patients with CI have worse outcomes following hospitalisation than patients without CI, including increased mortality, more complications, longer hospital stays, increased system costs, and functional and cognitive decline [4]. To improve the care of patients with CI in hospital, best practice guidelines have been developed, of which sixteen recent guidelines/position statements/standards were identified in this review (Table 2). Four guidelines described standards or quality indicators for providing optimal care for the older person with CI in hospital in general, while three focused on delirium diagnosis, prevention and management. The remaining guidelines/statements focused on specific issues in relation to the care of patients with CI in acute hospitals, including hydration, nutrition, wandering and care in the Emergency Department (ED). A key message in several of the guidelines was that older patients should be assessed for CI at admission; this is particularly important in the case of delirium, which can indicate an emergency, so that treatment can be implemented. A second clear message ...