939 results for "discharge"
Abstract:
BACKGROUND Cold atmospheric plasma (CAP, i.e. ionized air) is an innovative, promising tool for reducing bacteria. OBJECTIVE We conducted the first clinical trial with the novel PlasmaDerm® VU-2010 device to assess safety and, as secondary endpoints, efficacy and applicability of 45 s/cm² cold atmospheric plasma as add-on therapy against chronic venous leg ulcers. METHODS From April 2011 to April 2012, 14 patients were randomized to receive standardized modern wound care (n = 7) or plasma in addition to standard care (n = 7) 3× per week for 8 weeks. The ulcer size was determined weekly (Visitrak®, photodocumentation). Bacterial load (bacterial swabs, contact agar plates) and pain during and between treatments (visual analogue scales) were assessed. Patients and doctors rated the applicability of plasma (questionnaires). RESULTS The plasma treatment was safe, with 2 SAEs and 77 AEs distributed approximately equally between the two groups (P = 0.77 and P = 1.0, Fisher's exact test). Two AEs were probably related to plasma. Plasma treatment resulted in a significant reduction in lesional bacterial load (P = 0.04, Wilcoxon signed-rank test). A more than 50% ulcer size reduction was noted in 5/7 and 4/7 patients in the standard and plasma groups, respectively, and a greater size reduction occurred in the plasma group (plasma: −5.3 cm², standard: −3.4 cm²) (non-significant, P = 0.42, log-rank test). The only ulcer that closed after 7 weeks had received plasma. Patients in the plasma group reported less pain than those in the control group. The applicability of plasma was not rated inferior to standard wound care (P = 0.94, Wilcoxon-Mann-Whitney test). Physicians would, by trend, recommend (P = 0.06, Wilcoxon-Mann-Whitney test) or repeat (P = 0.08, Wilcoxon-Mann-Whitney test) plasma treatment. CONCLUSION Cold atmospheric plasma displays favourable antibacterial effects.
We demonstrated that plasma treatment with the PlasmaDerm® VU-2010 device is safe and effective in patients with chronic venous leg ulcers. Thus, larger controlled trials and the development of devices with larger application surfaces are warranted.
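The safety comparison above uses Fisher's exact test on 2×2 adverse-event tables. As a minimal sketch of that statistic — assuming a hypothetical one-SAE-per-arm split, since the abstract gives only totals (P = 1.0 is consistent with an equal split) — the two-sided test can be computed directly from the hypergeometric distribution:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of all tables (with the same
    margins) that are no more probable than the observed one.
    """
    n = a + b + c + d
    row1, col1 = a + b, a + c

    def p_table(x):  # probability of the table whose top-left cell is x
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    lo, hi = max(0, row1 + col1 - n), min(row1, col1)
    p_obs = p_table(a)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))
```

With one SAE in each 7-patient arm, no table with the same margins is more probable than the observed one, so the two-sided P-value is 1.0 — consistent with the reported safety result.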
Abstract:
Radiogenic helium (⁴He) is produced by the decay of uranium and thorium in the Earth’s mantle and crust. From there, it is degassed to the atmosphere and eventually escapes to space. Assuming that all of the ⁴He produced is degassed, about 70% of the total He degassed from Earth comes from the continental crust. However, the outgoing flux of crustal He has not been directly measured at the Earth’s surface, and the migration pathways are poorly understood. Here we present measurements of helium isotopes and the long-lived cosmogenic radioisotope ⁸¹Kr in the deep, continental-scale Guarani aquifer in Brazil and show that crustal He reaches the atmosphere primarily by the surficial discharge of deep groundwater. We estimate that He in Guarani groundwater discharge accounts for about 20% of the assumed global flux from continental crust, and that other large aquifers may account for about 33%. Old groundwater ages suggest that He in the Guarani aquifer accumulates over half-million- to one-million-year timescales. We conclude that He degassing from the continents is regulated by groundwater discharge rather than by episodic tectonic events, and suggest that the assumed steady state between crustal production and degassing of He, and its resulting atmospheric residence time, should be re-examined.
Abstract:
Objectives: Depression is associated with poor prognosis in patients with cardiovascular disease (CVD). We hypothesized that depressive symptoms at discharge from a cardiac rehabilitation program are associated with an increased risk of future CVD-related hospitalizations. Methods: We examined 486 CVD patients (mean age = 59.8 ± 11.2 years) who enrolled in a comprehensive 3-month rehabilitation program and completed the depression subscale of the Hospital Anxiety and Depression Scale (HADS-D). At follow-up we evaluated the predictive value of depressive symptoms for CVD-related hospitalizations, controlling for sociodemographic factors, cardiovascular risk factors, and disease severity. Results: During a mean follow-up of 41.5 ± 15.6 months, 63 patients experienced a CVD-related hospitalization. The percentage of depressive patients (HADS-D ≥ 8) decreased from 16.9% at rehabilitation entry to 10.7% at discharge. Depressive symptoms at discharge from rehabilitation were a significant predictor of outcome (HR 1.32, 95% CI 1.09–1.60; p = 0.004). Patients with clinically relevant depressive symptoms at discharge had a 2.5-fold increased relative risk of poor cardiac prognosis compared to patients without clinically relevant depressive symptoms, independent of other prognostic variables. Conclusion: In patients with CVD, depressive symptoms at discharge from rehabilitation indicated a poor cardiac prognosis.
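The step from an HR of 1.32 to a roughly 2.5-fold risk can be made concrete with the proportional-hazards model form h(t|x) = h₀(t)·exp(βx). A minimal sketch, assuming the reported HR of 1.32 applies per HADS-D point (the abstract does not state the unit):

```python
import math

HR_PER_POINT = 1.32            # reported hazard ratio (assumed per HADS-D point)
BETA = math.log(HR_PER_POINT)  # corresponding Cox regression coefficient

def relative_hazard(score_a, score_b):
    """Relative hazard between two HADS-D scores under proportional hazards."""
    return math.exp(BETA * (score_a - score_b))
```

Under this assumption, a 3-point difference in HADS-D corresponds to 1.32³ ≈ 2.3, the same order as the reported 2.5-fold risk for clinically relevant symptoms.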
Abstract:
The evolution of landscapes depends crucially on the climate history. This is particularly evident in South America, where landscape responses to orbital climate shifts have been well documented. However, while most studies have focused on inferring temperature variations from paleoclimate proxy data, estimates of water budget changes have been complicated by a lack of adequate physical information. Here, we present a methodology and related results that allowed us to extract water discharge values from the sedimentary record of the 40 Ka-old fluvial terrace deposits in the Pisco valley, western Peru. In particular, this valley hosts a Quaternary cut-and-fill succession that we used, in combination with beryllium-10 (¹⁰Be)-based sediment flux, gauging records, channel geometries, and grain size measurements, to quantitatively assess sediment and water discharge values c. 40 Ka ago in relation to present-day conditions. We compare these discharge estimates to the discharge regime of the modern Pisco River and find that the water discharge of the paleo-Pisco River during the Minchin pluvial period, c. 40 Ka ago, was c. 7–8 times greater than that of the modern Pisco River, considering both the mean and the maximum water discharge. In addition, the calculations show that the inferred water discharge estimates depend mainly on channel gradients and grain size values, and to a lesser extent on channel width measures. Finally, we found that the c. 40 Ka-old Minchin terrace material was more poorly sorted than the modern deposits, which might reflect that sediment transport during the past period was characterized by a larger divergence from equal mobility than in the modern situation. In summary, the differences in grain size distribution and inferred water discharge estimates between the modern and the paleo-Pisco River suggest that the 40 Ka-old Minchin period was characterized by a wetter climate and more powerful flood events.
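The link between channel gradient, grain size, width, and discharge can be illustrated with a standard flow-resistance calculation. This is a generic sketch, not the authors' method: it combines the Manning equation with a Strickler-type roughness estimate from median grain size, and all input values are illustrative.

```python
def manning_discharge(width_m, depth_m, slope, d50_m):
    """Estimate discharge Q (m^3/s) for a rectangular channel.

    Roughness n is taken from the Strickler relation n ~ 0.047 * d50^(1/6)
    (d50 in metres), one common grain-size-based parameterization.
    """
    n = 0.047 * d50_m ** (1 / 6)             # Manning roughness from grain size
    area = width_m * depth_m                  # cross-sectional flow area (m^2)
    radius = area / (width_m + 2 * depth_m)   # hydraulic radius (m)
    return (1 / n) * area * radius ** (2 / 3) * slope ** 0.5
```

Varying the inputs (e.g. `manning_discharge(40, 2.0, 0.01, 0.05)` for a 40 m wide, 2 m deep gravel-bed channel) shows how the reconstructed discharge responds to each channel parameter; in terrace reconstructions, slope and grain size also constrain the inferred flow depth, which strengthens their influence on the final estimate.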
Abstract:
Accurate rainfall data are the key input for modelling river discharge and soil loss. Remote areas of Ethiopia often lack adequate precipitation data, and where such data are available, there may be substantial temporal or spatial gaps. To counter this challenge, the Climate Forecast System Reanalysis (CFSR) of the National Centers for Environmental Prediction (NCEP) readily provides weather data for any geographic location on Earth between 1979 and 2014. This study assesses the applicability of CFSR weather data to three watersheds in the Blue Nile Basin in Ethiopia. To this end, the Soil and Water Assessment Tool (SWAT) was set up to simulate discharge and soil loss, using CFSR and conventional weather data, in three small-scale watersheds ranging from 112 to 477 ha. Calibrated simulation results were compared to observed river discharge and observed soil loss over a period of 32 years. The conventional weather data resulted in very good discharge outputs for all three watersheds, while the CFSR weather data resulted in unsatisfactory discharge outputs for all three gauging stations. Soil loss simulation with conventional weather inputs yielded satisfactory outputs for two of the three watersheds, while the CFSR weather input yielded unsatisfactory results for all three. Overall, the simulations with conventional data produced far better results for discharge and soil loss than the simulations with CFSR data. The simulations with CFSR data were unable to adequately represent the specific regional climate of the three watersheds, performing even worse in climatic areas with two rainy seasons. Hence, CFSR data should not be used lightly in remote areas lacking conventional weather data, where no prior analysis is possible.
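SWAT studies typically grade performance from "very good" to "unsatisfactory" using the Nash–Sutcliffe efficiency (NSE) against published thresholds (e.g. Moriasi et al.'s NSE > 0.75 for "very good" and NSE ≤ 0.50 for "unsatisfactory"). The abstract does not name its criteria, so that mapping is an assumption here; a minimal sketch of the metric itself:

```python
def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; values <= 0 mean the
    model predicts no better than the mean of the observations."""
    mean_obs = sum(observed) / len(observed)
    err = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - err / var
```

Reproducing the observed discharge series exactly gives NSE = 1, while simply predicting the long-term mean gives NSE = 0; runs driven by poorly matched forcing data fall toward or below zero and would be rated unsatisfactory under such thresholds.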
Abstract:
As the requirements for health care hospitalization have become more demanding, the discharge planning process has become a more important part of the health services system. A thorough understanding of hospital discharge planning can, then, contribute to our understanding of the health services system. This study involved the development of a process model of discharge planning from hospitals. Model building involved the identification of factors used by discharge planners to develop aftercare plans, and the specification of the roles of these factors in the development of the discharge plan. The factors in the model were concatenated in 16 discrete decision sequences, each of which produced an aftercare plan.

The sample for this study comprised 407 inpatients admitted to the M. D. Anderson Hospital and Tumor Institute at Houston, Texas, who were discharged to any site within Texas during a 15-day period. Allogeneic bone marrow donors were excluded from the sample. The factors considered in the development of discharge plans were recorded by discharge planners and were used to develop the model. Data analysis consisted of sorting the discharge plans using the plan development factors until, for some combination and sequence of factors, all patients were discharged to a single site. The arrangement of factors that led to that aftercare plan became a decision sequence in the model.

The model constructs the same discharge plans as those developed by hospital staff for every patient in the study. Tests of the validity of the model should be extended to other patients at the MDAH, to other cancer hospitals, and to other inpatient services. Revisions of the model based on these tests should be of value in the management of discharge planning services and in the design and development of comprehensive community health services.