985 results for INTENSIVE TRAINING
Abstract:
The standards in this chapter focus on maximising the patient's ability to adhere to the prescribed treatment. Many people are extremely shocked when they are told they have TB; some refuse to accept it, and others are relieved to find out what is wrong and that treatment is available. The reaction depends on many factors, including cultural beliefs and values, previous experience and knowledge of the disease. Even though TB is more common among vulnerable groups, it can affect anyone, and it is important for patients to be able to discuss their concerns in relation to their own individual context. The cure for TB relies on the patient receiving a full, uninterrupted course of treatment, which can only be achieved if the patient and the health service work together. A system needs to be in place to trace patients who miss their treatment appointments (late patients). The best success will be achieved through the use of flexible, innovative and individualised approaches. The treatment and care the patient has received will inevitably have an impact on his or her willingness to attend in the future. A well-defined system of late patient tracing is mandatory in all situations. However, when late rates are high (above 10%), any tracing system will be useless without also examining the service as a whole.
Abstract:
This article presents a fieldbus simulation platform and its remote access interface, which together enable a wide range of experiments in which users can configure operation sequences and procedures typical of Foundation Fieldbus systems. The simulation system was developed in LabVIEW, with requirements for deterministic execution, and is delivered through the Moodle course management web framework. The results were obtained through three different evaluations: schedule table execution, simulator functionality and, finally, simulator productivity and achievement. The evaluation attests that this new tool is feasible and can be applied to training for fieldbus automation systems, given its robustness and stability in tests and the positive feedback from users. (C) 2008 ISA. Published by Elsevier Ltd. All rights reserved.
Abstract:
BACKGROUND: Guidelines for red blood cell (RBC) transfusions exist; however, transfusion practices vary among centers. This study aimed to analyze transfusion practices and the impact of patient and institutional characteristics on the indications for RBC transfusions in preterm infants. STUDY DESIGN AND METHODS: RBC transfusion practices were investigated in a multicenter prospective cohort of preterm infants with a birth weight of less than 1500 g born at eight public university neonatal intensive care units of the Brazilian Network on Neonatal Research. Variables associated with any RBC transfusion were analyzed by logistic regression analysis. RESULTS: Of 952 very-low-birth-weight infants, 532 (55.9%) received at least one RBC transfusion. The percentages of transfused neonates were 48.9, 54.5, 56.0, 61.2, 56.3, 47.8, 75.4, and 44.7%, respectively, for Centers 1 through 8. The number of transfusions during the first 28 days of life was higher in Centers 4 and 7 than in the other centers. After 28 days, the number of transfusions decreased, except in Center 7. Multivariate logistic regression analysis showed a higher likelihood of transfusion in infants with late-onset sepsis (odds ratio [OR], 2.8; 95% confidence interval [CI], 1.8-4.4), intraventricular hemorrhage (OR, 9.4; 95% CI, 3.3-26.8), intubation at birth (OR, 1.7; 95% CI, 1.0-2.8), need for an umbilical catheter (OR, 2.4; 95% CI, 1.3-4.4), days on mechanical ventilation (OR, 1.1; 95% CI, 1.0-1.2), oxygen therapy (OR, 1.1; 95% CI, 1.0-1.1), parenteral nutrition (OR, 1.1; 95% CI, 1.0-1.1), and birth center (p < 0.001). CONCLUSIONS: The need for RBC transfusions in very-low-birth-weight preterm infants was associated with clinical conditions and birth center. The distribution of the number of transfusions during the hospital stay may be used as a measure of neonatal care quality.
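The odds ratios and confidence intervals reported in this abstract come from a multivariate logistic regression. As a hedged illustration only (this is not the authors' code, and the coefficient values below are hypothetical), the standard Wald back-transformation from a fitted coefficient to an OR with a 95% CI is:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient (beta) and its standard
    error (se) into an odds ratio with a Wald confidence interval:
    OR = exp(beta), CI = exp(beta +/- z * se)."""
    or_ = math.exp(beta)
    lower = math.exp(beta - z * se)
    upper = math.exp(beta + z * se)
    return or_, lower, upper

# Hypothetical inputs: beta = 1.03, SE = 0.23 yields OR ~ 2.8
# with a 95% CI of roughly 1.8-4.4, the shape of estimate
# reported above for late-onset sepsis.
print(odds_ratio_ci(1.03, 0.23))
```

This is only the back-transformation step; fitting the model itself (and choosing covariates such as birth center) is the substantive work the study describes.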
Abstract:
The sustainability of fast-growing tropical Eucalyptus plantations is of concern in a context of rising fertilizer costs, since large amounts of nutrients are removed with biomass every 6-7 years from highly weathered soils. A better understanding of the dynamics of tree nutrient requirements is needed to match fertilization regimes to the availability of each nutrient in the soil. The nutrition of Eucalyptus plantations has been intensively investigated and many studies have focused on specific fluxes in the biogeochemical cycles of nutrients. However, studies dealing with complete cycles are scarce for the Tropics. The objective of this paper was to compare these cycles for Eucalyptus plantations in Congo and Brazil, with contrasting climates, soil properties, and management practices. The main features were similar in the two situations. Most nutrient fluxes were driven by crown establishment in the first two years after planting and by total biomass production thereafter. These forests were characterized by huge nutrient requirements: 155, 10, 52, 55 and 23 kg ha(-1) of N, P, K, Ca and Mg, respectively, in the first year after planting at the Brazilian study site. High growth rates in the first months after planting were essential to take advantage of the large amounts of nutrients released into the soil solutions by organic matter mineralization after harvesting. This study highlighted the predominant role of the biological and biochemical cycles over the geochemical cycle of nutrients in tropical Eucalyptus plantations and indicated the prime importance of carefully managing organic matter in these soils. Limited nutrient losses through deep drainage after clear-cutting in the sandy soils of the two study sites showed the remarkable efficiency of Eucalyptus trees in keeping limited nutrient pools within the ecosystem, even after major disturbances.
Nutrient input-output budgets suggested that Eucalyptus plantations take advantage of soil fertility inherited from previous land uses and that long-term sustainability will require an increase in the inputs of certain nutrients. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
Only 7% of the once extensive forest along the eastern coast of Brazil remains, and much of that is degraded and threatened by agricultural expansion and urbanization. We wondered if methods similar to those developed to establish fast-growing Eucalyptus plantations might also work to enhance survival and growth of rainforest species on degraded pastures composed of highly competitive C(4) grasses. An 8-factor experiment was laid out to contrast the value of different intensities of cultivation, application of fertilizer and weed control on the growth and survival of a mixture of 20 rainforest species planted at two densities: 3 m x 1 m, and 3 m x 2 m. Intensive management increased seedling survival from 90% to 98%, increased stemwood production and leaf area index (LAI) approximately 4-fold, and increased stemwood production per unit of light absorbed by 30%. Annual growth in stem biomass was closely related to LAI alone (r(2) = 0.93, p < 0.0001), and the regression improved further in combination with canopy nitrogen content (r(2) = 0.99, p < 0.0001). Intensive management resulted in a nearly closed forest canopy in less than 4 years, and offers a practical means to establish functional forests on abandoned agricultural land. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
We used environmental accounting to evaluate high-intensity clonal eucalyptus production in São Paulo, Brazil, converting inputs (environmental, material, and labor) to emergy units so that ecological efficiency could be compared on a common basis. Input data were compiled under three pH management scenarios (lime, ash, and sludge). The dominant emergy input is environmental work (transpired water, approximately 58% of total emergy), followed by diesel (approximately 15%); most purchased emergy is invested during harvest (41.8% of 7-year production totals). Where recycled materials are used for pH amendment (ash or sludge instead of lime), we observe marked improvements in ecological efficiency; lime (raw) yielded the highest unit emergy value (UEV = emergy per unit energy in the product = 9.6E+03 sej J(-1)), whereas using sludge and ash (recycled) reduced the UEV to 8.9E+03 and 8.8E+03 sej J(-1), respectively. The emergy yield ratio was similarly affected, suggesting a better ecological return on energy invested. Sensitivity of resource use to other operational modifications (e.g., decreased diesel, labor, or agrochemicals) was small (<3% change). Emergy synthesis permits comparison of sustainability among forest production systems globally. This eucalyptus scheme shows the highest ecological efficiency of the analyzed pulp production operations (UEV range = 1.1E+04 to 3.6E+04 sej J(-1)) despite its high operational intensity.
Abstract:
Objective: To describe an outbreak of imipenem-resistant, metallo-beta-lactamase-producing Pseudomonas aeruginosa (enzyme type blaSPM-1) spread by horizontal transmission among patients admitted to a mixed adult ICU. Methods: A case-control study was carried out, including 47 patients (cases) and 122 patients (controls) admitted to the mixed ICU of a university hospital in Minas Gerais, Brazil, from November 2003 to July 2005. The infection site, risk factors, mortality, antibiotic susceptibility, metallo-beta-lactamase (MBL) production, enzyme type, and clonal diversity were analyzed. Results: A temporal/spatial relationship was detected in most patients (94%), overall mortality was 55.3%, and pneumonia was the predominant infection (85%). The majority of isolates (95%) were resistant to imipenem and other antibiotics, except for polymyxin, and showed MBL production (76.7%). Only blaSPM-1 (33%) was identified in the 15 specimens analyzed. In addition, 4 clones were identified, with a predominance of clones A (61.5%) and B (23.1%). On multivariate analysis, advanced age, mechanical ventilation, tracheostomy, and previous imipenem use were significant risk factors for imipenem-resistant P. aeruginosa infection. Conclusions: Clonal dissemination of MBL-producing P. aeruginosa strains with a spatial/temporal relationship disclosed problems in hospital infection control practice, low adherence to hand hygiene, and empirical antibiotic use. (C) 2008 Elsevier Espana, S.L. All rights reserved.
Abstract:
Although it has long been supposed that resistance training causes adaptive changes in the CNS, the sites and nature of these adaptations have not previously been identified. In order to determine whether the neural adaptations to resistance training occur to a greater extent at cortical or subcortical sites in the CNS, we compared the effects of resistance training on the electromyographic (EMG) responses to transcranial magnetic (TMS) and electrical (TES) stimulation. Motor evoked potentials (MEPs) were recorded from the first dorsal interosseous muscle of 16 individuals before and after 4 weeks of resistance training for the index finger abductors (n = 8), or training involving finger abduction-adduction without external resistance (n = 8). TMS was delivered at rest at intensities from 5% below the passive threshold to the maximal output of the stimulator. TMS and TES were also delivered at the active threshold intensity while the participants exerted torques ranging from 5 to 60% of their maximum voluntary contraction (MVC) torque. The average latency of MEPs elicited by TES was significantly shorter than that of TMS MEPs (TES latency = 21.5 ± 1.4 ms; TMS latency = 23.4 ± 1.4 ms; P < 0.05), which indicates that the site of activation differed between the two forms of stimulation. Training resulted in a significant increase in MVC torque for the resistance-training group, but not the control group. There were no statistically significant changes in the corticospinal properties measured at rest for either group. For the active trials involving both TMS and TES, however, the slope of the relationship between MEP size and the torque exerted was significantly lower after training for the resistance-training group (P < 0.05). Thus, for a specific level of muscle activity, the magnitude of the EMG responses to both forms of transcranial stimulation was smaller following resistance training.
These results suggest that resistance training changes the functional properties of spinal cord circuitry in humans, but does not substantially affect the organisation of the motor cortex.
Abstract:
Atherosclerotic plaque contains apoptotic endothelial cells, with oxidative stress implicated in this process. Vitamin E and α-lipoic acid are a potent antioxidant combination with the potential to prevent endothelial apoptosis. Regular exercise is known to increase myocardial protection; however, little research has investigated the effects of exercise on the endothelium. The purpose of these studies was to investigate the effects of antioxidant supplementation and/or exercise training on proteins that regulate apoptosis in endothelial cells. Male rats received a control or antioxidant-supplemented diet (vitamin E and α-lipoic acid) and were assigned to sedentary or exercise-trained groups for 14 weeks. Left ventricular endothelial cells (LVECs) were isolated, and levels of the anti-apoptotic protein Bcl-2 and the pro-apoptotic protein Bax were measured. Antioxidant supplementation caused a fourfold increase in Bcl-2 (P < 0.05) with no change in Bax (P > 0.05). The Bcl-2:Bax ratio was increased sixfold with antioxidant supplementation compared to non-supplemented animals (P < 0.05). Exercise training, either alone or combined with antioxidant supplementation, had no significant effect on Bcl-2, Bax or Bcl-2:Bax (P > 0.05) compared to non-supplemented animals. However, Bax was significantly lower (P < 0.05) in the supplemented trained group than in the non-supplemented trained animals. In cultured bovine endothelial cells incubated for 24 h with vitamin E and/or α-lipoic acid, the combination of the two antioxidants increased Bcl-2 to a greater extent than incubation with the vehicle alone. In summary, vitamin E and α-lipoic acid increase endothelial cell Bcl-2, which may provide increased protection against apoptosis. (c) 2005 Elsevier Ltd. All rights reserved.
Abstract:
While the physiological adaptations that occur following endurance training in previously sedentary and recreationally active individuals are relatively well understood, the adaptations to training in already highly trained endurance athletes remain unclear. While significant improvements in endurance performance and corresponding physiological markers are evident following submaximal endurance training in sedentary and recreationally active groups, an additional increase in submaximal training (i.e. volume) in highly trained individuals does not appear to further enhance either endurance performance or associated physiological variables [e.g. peak oxygen uptake (V̇O2peak), oxidative enzyme activity]. It seems that, for athletes who are already trained, improvements in endurance performance can be achieved only through high-intensity interval training (HIT). The limited research which has examined changes in muscle enzyme activity in highly trained athletes following HIT has revealed no change in oxidative or glycolytic enzyme activity, despite significant improvements in endurance performance (p < 0.05). Instead, an increase in skeletal muscle buffering capacity may be one mechanism responsible for an improvement in endurance performance. Changes in plasma volume, stroke volume, muscle cation pumps, myoglobin, capillary density and fibre type characteristics have yet to be investigated in response to HIT in the highly trained athlete. Information relating to HIT programme optimisation in endurance athletes is also very sparse. Preliminary work using the velocity at which V̇O2max is achieved (Vmax) as the interval intensity, and fractions (50 to 75%) of the time to exhaustion at Vmax (Tmax) as the interval duration, has been successful in eliciting improvements in performance in long-distance runners. However, Vmax and Tmax have not been used with cyclists.
Instead, HIT programme optimisation research in cyclists has revealed that repeated supramaximal sprinting may be as effective as more traditional HIT programmes for eliciting improvements in endurance performance. Further examination of the biochemical and physiological adaptations that accompany different HIT programmes, as well as investigation into the optimal HIT programme for eliciting performance enhancements in highly trained athletes, is required.
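The Vmax/Tmax prescription described in the preceding abstract reduces to simple arithmetic. As a sketch under stated assumptions (the runner's values below are hypothetical, not taken from the studies cited):

```python
def hit_interval(vmax_kmh, tmax_s, fraction=0.6):
    """Prescribe one HIT work interval: run at Vmax (the velocity at
    which VO2max is achieved) for a chosen fraction (50-75% per the
    text) of Tmax, the time to exhaustion at Vmax.
    Returns (interval duration in s, distance covered in m)."""
    duration_s = fraction * tmax_s
    distance_m = (vmax_kmh / 3.6) * duration_s  # km/h -> m/s, then * s
    return duration_s, distance_m

# Hypothetical runner: Vmax = 18 km/h, Tmax = 300 s, 60% fraction
# -> 180 s work intervals covering 900 m each.
print(hit_interval(18.0, 300.0))
```

The interval intensity is fixed at Vmax; only the duration is scaled, which is why the research question above concerns the optimal fraction of Tmax rather than the speed.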
Abstract:
To the Editor: The increase in medical graduates expected over the next decade presents a huge challenge to the many stakeholders involved in providing their prevocational and vocational medical training.1 Increased numbers will add significantly to the teaching and supervision workload for registrars and consultants, while specialist training and access to advanced training positions may be compromised. However, this predicament may also provide opportunities for innovation in the way internships are delivered. Although facing these same challenges, regional and rural hospitals could use this situation to enhance their workforce by creating opportunities for interns and junior doctors to acquire valuable experience in non-metropolitan settings. We surveyed a representative sample (n = 147; 52% of the total cohort) of Year 3 Bachelor of Medicine and Bachelor of Surgery students at the University of Queensland about their perceptions and expectations of their impending internship and the importance of its location (ie, urban/metropolitan versus regional/rural teaching hospitals) to their future training and career plans. Most students (n = 127; 86%) reported a high degree of contemplation about their internship choice. Issues relating to career progression and support ranked highest in their expectations. Most perceived internships in urban/metropolitan hospitals as more beneficial to their future career prospects compared with regional/rural hospitals, but, interestingly, felt that they would have more patient responsibility and greater contact with and supervision by senior staff in a regional setting (Box). Regional and rural hospitals should try to harness these positive perceptions and act to address any real or perceived shortcomings in order to enhance their future workforce.2 They could look to establish partnerships with rural clinical schools3 to enhance recruitment of interns as early as Year 3.
To maximise competitiveness with their urban counterparts, regional and rural hospitals need to offer innovative training and career progression pathways to junior doctors, to combat the perception that internships in urban hospitals are more beneficial to future career prospects. Partnerships between hospitals, medical schools and vocational colleges, with input from postgraduate medical councils, should provide vertical integration4 in the important period between student and doctor. Work is underway to more closely evaluate and compare the intern experience across regional/rural and urban/metropolitan hospitals, and track student experiences and career choices longitudinally. This information may benefit teaching hospitals and help identify the optimal combination of resources necessary to provide quality teaching and a clear career pathway for the expected influx of new interns.