871 results for wastewater discharge


Relevance: 20.00%

Abstract:

This paper applies a policy analysis approach to the question of how to regulate micropollution effectively and sustainably. Micropollution is a complex policy problem characterized by a huge number and diversity of chemical substances, as well as by various entry paths into the aquatic environment. It challenges traditional water quality management by calling for new technologies in wastewater treatment and for behavioral changes in industry, agriculture and civil society. In light of such challenges, two questions arise: how can such a complex phenomenon be regulated so that water quality is maintained in the future, and what can we learn from past experience in water quality regulation? To answer these questions, policy analysis focuses strongly on the design and choice of policy instruments and on the mix of such measures. In this paper, we review instruments commonly used in past water quality regulation and evaluate their ability to respond, in a sustainable way, to the characteristics of a more recent water quality problem, namely micropollution. On this basis, we develop a new framework that integrates both the problem dimension (i.e., causes and effects of a problem) and the sustainability dimension (e.g., long-term, cross-sectoral and multi-level) to assess which policy instruments are best suited to regulate micropollution. We conclude that sustainability criteria help to identify an appropriate instrument mix of end-of-pipe and source-directed measures to reduce aquatic micropollution.

Relevance: 20.00%

Abstract:

REASONS FOR PERFORMING STUDY: In clinical practice, veterinarians often depend on owner-reported signs to assess the clinical course of horses with recurrent airway obstruction (RAO). OBJECTIVES: To test whether owner-reported information on frequency of coughing and observation of nasal discharge are associated with clinical, cytological and bronchoprovocation findings in RAO-affected horses in nonstandardised field conditions. STUDY DESIGN: Cross-sectional study comparing healthy and RAO-affected horses. METHODS: Twenty-eight healthy and 34 RAO-affected Swiss Warmblood horses were grouped according to owner-reported 'coughing frequency' and 'nasal discharge'. Differences between these groups were examined using clinical examination, blood gas analyses, endoscopic mucus scores, cytology of tracheobronchial secretion and bronchoalveolar lavage fluid, and airway hyperresponsiveness determined by plethysmography with histamine bronchoprovocation. RESULTS: Frequently coughing horses differed most markedly from healthy control animals. Histamine bronchoprovocation-derived parameters were significantly different between the healthy control group and all RAO groups. Mucus grades and tracheobronchial secretion and bronchoalveolar lavage fluid neutrophil percentages had particularly high variability, with overlap of findings between groups. Owner satisfaction with the clinical status of the horse was high, even in severely affected horses. CONCLUSIONS: Owner-reported coughing and nasal discharge are associated with specific clinical and diagnostic findings in RAO-affected horses in field settings. While airway hyperresponsiveness differentiates best between healthy horses and asymptomatic RAO-affected horses, the absence of coughing and nasal discharge does not rule out significant neutrophilic airway inflammation. Owner satisfaction with the clinical status of the horse was uninformative.
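
Group comparisons of this kind are typically handled with non-parametric tests, given the high variability and overlap noted in the results. Purely as an illustration (the abstract does not name the tests, and the values below are invented), a comparison of a diagnostic read-out across owner-reported groups could look like:

    # Non-parametric comparison of a hypothetical BALF neutrophil percentage
    # across owner-reported groups; values are invented for illustration.
    from scipy import stats

    healthy          = [4, 6, 3, 8, 5, 7]
    occasional_cough = [9, 15, 6, 22, 11, 18]
    frequent_cough   = [28, 35, 14, 41, 30, 25]

    h_stat, p_value = stats.kruskal(healthy, occasional_cough, frequent_cough)
    print(f"Kruskal-Wallis H = {h_stat:.1f}, p = {p_value:.3f}")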

Relevance: 20.00%

Abstract:

BACKGROUND: It is often assumed that horses with mild respiratory clinical signs, such as mucous nasal discharge and occasional coughing, have an increased risk of developing recurrent airway obstruction (RAO). HYPOTHESIS: Compared to horses without any clinical signs of respiratory disease, those with occasional coughing, mucous nasal discharge, or both have an increased risk of developing signs of RAO (frequent coughing, increased breathing effort, exercise intolerance, or a combination of these) as characterized by the Horse Owner Assessed Respiratory Signs Index (HOARSI 1-4). ANIMALS: Two half-sibling families descending from 2 RAO-affected stallions (n = 65 and n = 47) and an independent replication population of unrelated horses (n = 88). METHODS: In a retrospective cohort study, standardized information on the occurrence and frequency of coughing, mucous nasal discharge, poor performance, and abnormal breathing effort (and on these factors combined in the HOARSI), together with management factors, was collected at intervals of 1.3-5 years. RESULTS: Compared to horses without clinical signs of respiratory disease (half-siblings 7%; unrelated horses 3%), those with mild respiratory signs developed clinical signs of RAO more frequently: half-siblings with mucous nasal discharge 35% (P < .001, OR: 7.0, sensitivity: 62%, specificity: 81%); with mucous nasal discharge and occasional coughing 43% (P < .001, OR: 9.9, sensitivity: 55%, specificity: 89%); unrelated horses with occasional coughing 25% (P = .006, OR: 9.7, sensitivity: 75%, specificity: 76%). CONCLUSIONS AND CLINICAL IMPORTANCE: Occasional coughing and mucous nasal discharge might indicate an increased risk of developing RAO.
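
The reported odds ratios, sensitivities and specificities all derive from 2×2 tables of early-sign status versus later RAO development. A minimal sketch of that calculation, using invented cell counts rather than the study's data, is:

    # Hypothetical 2x2 table: rows = early sign present/absent,
    # columns = later RAO signs developed / not developed. Counts are illustrative only.
    a, b = 21, 39   # sign present: developed RAO / did not
    c, d = 3, 42    # sign absent:  developed RAO / did not

    odds_ratio = (a * d) / (b * c)    # cross-product ratio
    sensitivity = a / (a + c)         # horses with the sign among those that developed RAO
    specificity = d / (b + d)         # horses without the sign among those that did not
    print(f"OR = {odds_ratio:.1f}, sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")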

Relevance: 20.00%

Abstract:

BACKGROUND: Cold atmospheric plasma (CAP, i.e. ionized air) is an innovative and promising tool for reducing bacterial load. OBJECTIVE: We conducted the first clinical trial with the novel PlasmaDerm® VU-2010 device to assess safety and, as secondary endpoints, efficacy and applicability of 45 s/cm² cold atmospheric plasma as add-on therapy against chronic venous leg ulcers. METHODS: From April 2011 to April 2012, 14 patients were randomized to receive standardized modern wound care (n = 7) or plasma in addition to standard care (n = 7) 3× per week for 8 weeks. The ulcer size was determined weekly (Visitrak®, photodocumentation). Bacterial load (bacterial swabs, contact agar plates) and pain during and between treatments (visual analogue scales) were assessed. Patients and doctors rated the applicability of plasma (questionnaires). RESULTS: The plasma treatment was safe, with 2 SAEs and 77 AEs distributed approximately equally between the two groups (P = 0.77 and P = 1.0, Fisher's exact test). Two AEs were probably related to plasma. Plasma treatment resulted in a significant reduction in lesional bacterial load (P = 0.04, Wilcoxon signed-rank test). A more than 50% ulcer size reduction was noted in 5/7 and 4/7 patients in the standard and plasma groups, respectively, and a greater size reduction occurred in the plasma group (plasma: -5.3 cm², standard: -3.4 cm²) (non-significant, P = 0.42, log-rank test). The only ulcer that closed after 7 weeks had received plasma. Patients in the plasma group reported less pain than the control group. The applicability of plasma was not rated inferior to standard wound care (P = 0.94, Wilcoxon-Mann-Whitney test). By trend, physicians would recommend (P = 0.06, Wilcoxon-Mann-Whitney test) or repeat (P = 0.08, Wilcoxon-Mann-Whitney test) plasma treatment. CONCLUSION: Cold atmospheric plasma displays favourable antibacterial effects. We demonstrated that plasma treatment with the PlasmaDerm® VU-2010 device is safe and effective in patients with chronic venous leg ulcers. Thus, larger controlled trials and the development of devices with larger application surfaces are warranted.
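
The abstract names the statistical tests but not the underlying data. The sketch below only illustrates the test types (Fisher's exact test for adverse-event counts, Wilcoxon signed-rank test for paired bacterial-load measurements) on invented numbers, using scipy:

    # Illustration of the test types named above, on hypothetical data
    # (not the trial's raw data).
    from scipy import stats

    # Hypothetical adverse-event table: [patients with AEs, patients without] per arm
    ae_table = [[6, 1], [5, 2]]
    print(stats.fisher_exact(ae_table))      # odds ratio and p-value

    # Hypothetical lesional bacterial load (log CFU) before vs. after plasma, per patient
    before = [6.1, 5.8, 6.4, 5.9, 6.0, 6.2, 5.7]
    after  = [5.2, 5.5, 5.9, 5.1, 5.8, 5.4, 5.3]
    print(stats.wilcoxon(before, after))     # paired, non-parametric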

Relevance: 20.00%

Abstract:

Radiogenic ⁴He is produced by the decay of uranium and thorium in the Earth's mantle and crust. From there, it is degassed to the atmosphere and eventually escapes to space. Assuming that all of the ⁴He produced is degassed, about 70% of the total He degassed from Earth comes from the continental crust. However, the outgoing flux of crustal He has not been directly measured at the Earth's surface, and the migration pathways are poorly understood. Here we present measurements of helium isotopes and the long-lived cosmogenic radioisotope ⁸¹Kr in the deep, continental-scale Guarani aquifer in Brazil and show that crustal He reaches the atmosphere primarily by the surficial discharge of deep groundwater. We estimate that He in Guarani groundwater discharge accounts for about 20% of the assumed global flux from continental crust, and that other large aquifers may account for about 33%. Old groundwater ages suggest that He in the Guarani aquifer accumulates over half- to one-million-year timescales. We conclude that He degassing from the continents is regulated by groundwater discharge rather than by episodic tectonic events, and suggest that the assumed steady state between crustal production and degassing of He, and its resulting atmospheric residence time, should be re-examined.
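
As a rough illustration of the flux argument (not the paper's values or method), the He delivered to the surface by an aquifer scales as the dissolved He concentration times the groundwater discharge rate. A back-of-the-envelope sketch with assumed inputs:

    # Back-of-the-envelope sketch: crustal 4He delivered to the atmosphere by an
    # aquifer ~ dissolved He concentration x groundwater discharge. All values
    # below are assumptions for illustration, not the paper's estimates.
    AVOGADRO = 6.022e23          # atoms per mole
    MOLAR_VOLUME = 22_414.0      # cm3 STP per mole of ideal gas

    he_conc = 1e-4               # dissolved 4He, cm3 STP per cm3 of water (assumed He-rich old groundwater)
    discharge = 5e13             # groundwater discharge, cm3 of water per year (assumed)

    he_flux_atoms = he_conc * discharge / MOLAR_VOLUME * AVOGADRO
    print(f"4He flux ~ {he_flux_atoms:.1e} atoms per year")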

Relevance: 20.00%

Abstract:

Objectives: Depression is associated with poor prognosis in patients with cardiovascular disease (CVD). We hypothesized that depressive symptoms at discharge from a cardiac rehabilitation program are associated with an increased risk of future CVD-related hospitalizations. Methods: We examined 486 CVD patients (mean age 59.8 ± 11.2 years) who enrolled in a comprehensive 3-month rehabilitation program and completed the depression subscale of the Hospital Anxiety and Depression Scale (HADS-D). At follow-up, we evaluated the predictive value of depressive symptoms for CVD-related hospitalizations, controlling for sociodemographic factors, cardiovascular risk factors, and disease severity. Results: During a mean follow-up of 41.5 ± 15.6 months, 63 patients experienced a CVD-related hospitalization. The percentage of depressive patients (HADS-D ≥ 8) decreased from 16.9% at rehabilitation entry to 10.7% at discharge. Depressive symptoms at discharge from rehabilitation were a significant predictor of outcome (HR 1.32, 95% CI 1.09–1.60; p = 0.004). Patients with clinically relevant depressive symptoms at discharge had a 2.5-fold increased relative risk of poor cardiac prognosis compared to patients without clinically relevant depressive symptoms, independent of other prognostic variables. Conclusion: In patients with CVD, depressive symptoms at discharge from rehabilitation indicated a poor cardiac prognosis.
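
A hazard ratio of this kind is typically obtained from a Cox proportional hazards model. A minimal sketch with the lifelines library on a small synthetic data set (column names and values are assumptions, not the study's data) might look like:

    # Cox proportional hazards fit of the kind reported above, on synthetic data;
    # 'hads_d' stands in for the depression score at discharge.
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.DataFrame({
        "months_to_event": [12, 41, 55, 30, 48, 22, 60, 35, 50, 18],  # follow-up time
        "hospitalized":    [1, 0, 0, 1, 0, 1, 0, 0, 1, 1],            # 1 = CVD-related hospitalization
        "hads_d":          [9, 4, 8, 11, 5, 3, 2, 10, 6, 12],         # HADS-D at discharge
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="months_to_event", event_col="hospitalized")
    cph.print_summary()   # exp(coef) is the hazard ratio per additional HADS-D point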

Relevance: 20.00%

Abstract:

The evolution of landscapes crucially depends on the climate history. This is particularly evident in South America, where landscape responses to orbital climate shifts have been well documented. However, while most studies have focused on inferring temperature variations from paleoclimate proxy data, estimates of water budget changes have been complicated by a lack of adequate physical information. Here, we present a methodology and related results that allowed us to extract water discharge values from the sedimentary record of c. 40 ka old fluvial terrace deposits in the Pisco valley, western Peru. In particular, this valley hosts a Quaternary cut-and-fill succession that we used, in combination with beryllium-10 (¹⁰Be)-based sediment flux, gauging records, channel geometries and grain size measurements, to quantitatively assess sediment and water discharge values c. 40 ka ago in relation to present-day conditions. We compare these discharge estimates to the discharge regime of the modern Pisco River and find that the water discharge of the paleo-Pisco River during the Minchin pluvial period, c. 40 ka ago, was c. 7–8 times greater than that of the modern Pisco River, considering both the mean and the maximum water discharge. In addition, the calculations show that the inferred water discharge estimates depend mainly on channel gradients and grain size values, and to a lesser extent on channel width measurements. Finally, we found that the c. 40 ka old Minchin terrace material was more poorly sorted than the modern deposits, which might reflect that sediment transport during that period was characterized by a larger divergence from equal mobility than under modern conditions. In summary, the differences in grain size distribution and inferred water discharge estimates between the modern and the paleo-Pisco River suggest that the c. 40 ka old Minchin period was characterized by a wetter climate and more powerful flood events.
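
The abstract does not spell out the discharge reconstruction, but estimates of this kind combine channel geometry, gradient and grain size through a flow-resistance law. A generic sketch using Manning's equation with a Strickler-type roughness closure (not necessarily the study's formulation; all input values are assumed) is:

    # Generic sketch of estimating water discharge from preserved channel geometry,
    # gradient and grain size. NOT the paper's exact method; inputs are assumed.

    def manning_discharge(width_m, depth_m, slope, d84_m):
        """Discharge (m3/s) for a rectangular channel via Manning's equation."""
        n = 0.047 * d84_m ** (1.0 / 6.0)            # Strickler-type roughness from grain size
        area = width_m * depth_m                    # cross-sectional flow area
        hydraulic_radius = area / (width_m + 2 * depth_m)
        velocity = hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5 / n
        return area * velocity

    # Hypothetical paleo-channel reconstruction vs. modern channel
    q_paleo = manning_discharge(width_m=80.0, depth_m=2.5, slope=0.012, d84_m=0.15)
    q_modern = manning_discharge(width_m=45.0, depth_m=1.2, slope=0.012, d84_m=0.10)
    print(f"paleo/modern discharge ratio ~ {q_paleo / q_modern:.1f}")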

Relevance: 20.00%

Abstract:

Accurate rainfall data are the key input parameter for modelling river discharge and soil loss. Remote areas of Ethiopia often lack adequate precipitation data, and where such data are available, there may be substantial temporal or spatial gaps. To counter this challenge, the Climate Forecast System Reanalysis (CFSR) of the National Centers for Environmental Prediction (NCEP) readily provides weather data for any geographic location on earth between 1979 and 2014. This study assesses the applicability of CFSR weather data to three watersheds in the Blue Nile Basin in Ethiopia. To this end, the Soil and Water Assessment Tool (SWAT) was set up to simulate discharge and soil loss, using CFSR and conventional weather data, in three small-scale watersheds ranging from 112 to 477 ha. Calibrated simulation results were compared to observed river discharge and observed soil loss over a period of 32 years. The conventional weather data resulted in very good discharge outputs for all three watersheds, while the CFSR weather data resulted in unsatisfactory discharge outputs for all three gauging stations. Soil loss simulation with conventional weather inputs yielded satisfactory outputs for two of the three watersheds, while the CFSR weather input yielded unsatisfactory results for all three. Overall, the simulations with the conventional data produced far better results for discharge and soil loss than the simulations with CFSR data. The simulations with CFSR data were unable to adequately represent the specific regional climate of the three watersheds, performing even worse in climatic areas with two rainy seasons. Hence, CFSR data should not be used lightly in remote areas lacking conventional weather data, where no prior analysis is possible.
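
Performance labels such as "very good" and "unsatisfactory" for simulated discharge are conventionally tied to statistics like the Nash-Sutcliffe efficiency (NSE); the abstract does not state the exact criteria used, but the statistic itself is a one-liner (the series below are invented):

    # Nash-Sutcliffe efficiency between observed and simulated discharge;
    # values near 1 indicate good agreement, while values <= 0.5 are commonly
    # rated unsatisfactory. The series are made up for illustration.
    import numpy as np

    observed  = np.array([1.2, 3.4, 8.9, 5.1, 2.0, 0.9, 0.7, 4.3])   # m3/s
    simulated = np.array([1.0, 3.9, 7.5, 5.8, 2.4, 1.1, 0.6, 3.7])   # m3/s

    nse = 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)
    print(f"NSE = {nse:.2f}")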

Relevance: 20.00%

Abstract:

A 6-month-long, bench-scale simulation of an industrial wastewater stabilization pond (WSP) system was conducted to evaluate responses to several potential performance-enhancing treatments. The industrial WSP system consists of an anaerobic primary (1ry) WSP treating high-strength wastewater, followed by facultative secondary (2ry) and aerobic tertiary (3ry) WSPs in series treating lower-strength wastewater. The 1ry WSP was simulated with four glass aquaria fed with wastewater from the actual WSP system. The treatments examined were phosphorus supplementation (PHOS), phosphorus supplementation with pH control (PHOS+ALK), and phosphorus supplementation with pH control and effluent recycle (PHOS+ALK+RCY). The supplementary phosphorus treatment alone did not yield any significant change versus the CONTROL 1ry model pond. The average carbon to phosphorus ratio of the feed wastewater received from the WSP system was already 100:0.019 (i.e., 2,100 mg/l : 0.4 mg/l). The pH-control treatments (PHOS+ALK and PHOS+ALK+RCY) produced significant results, with 9 to 12 percent more total organic carbon (TOC) removal, 43 percent more volatile organic acid (VOA) generation, 78 percent more 2-ethoxyethanol removal and 14 percent more bis(2-chloroethyl)ether removal, and 100- to 10,000-fold increases in bacterial enzyme activity and heterotrophic bacterial numbers. Recycling a 10-percent portion of the effluent yielded less variability for certain physicochemical parameters in the PHOS+ALK+RCY 1ry model pond, but overall there was no statistically detectable improvement in performance versus no recycle. The 2ry and 3ry WSPs were also simulated in the laboratory to monitor the effect and fate of increased phosphorus loadings, as might occur if supplemental phosphorus were added to the 1ry WSP. Noticeable increases in algal growth were observed at feed phosphorus concentrations of 0.5 mg/l; however, there were no significant changes in the monitored physicochemical parameters. The effluent phosphorus concentrations from both the 2ry and 3ry model ponds did increase notably when feed phosphorus concentrations were increased from 0.5 to 1.0 mg/l.
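
The quoted carbon-to-phosphorus ratio is simply the feed concentrations normalized to carbon = 100; a quick check (the comparison to the common BOD:N:P = 100:5:1 rule of thumb is added here for context, not taken from the study):

    # Verify the C:P ratio quoted above: 2,100 mg/L carbon to 0.4 mg/L phosphorus.
    carbon_mg_l = 2100.0
    phosphorus_mg_l = 0.4

    p_per_100_c = phosphorus_mg_l / carbon_mg_l * 100
    print(f"C:P = 100:{p_per_100_c:.3f}")   # -> 100:0.019, far more P-limited than the BOD:N:P = 100:5:1 rule of thumb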

Relevance: 20.00%

Abstract:

As the requirements for health care hospitalization have become more demanding, the discharge planning process has become a more important part of the health services system. A thorough understanding of hospital discharge planning can, then, contribute to our understanding of the health services system. This study involved the development of a process model of discharge planning from hospitals. Model building involved the identification of factors used by discharge planners to develop aftercare plans, and the specification of the roles of these factors in the development of the discharge plan. The factors in the model were concatenated into 16 discrete decision sequences, each of which produced an aftercare plan. The sample for this study comprised 407 inpatients admitted to the M. D. Anderson Hospital and Tumor Institute at Houston, Texas, who were discharged to any site within Texas during a 15-day period. Allogeneic bone marrow donors were excluded from the sample. The factors considered in the development of discharge plans were recorded by discharge planners and were used to develop the model. Data analysis consisted of sorting the discharge plans using the plan development factors until, for some combination and sequence of factors, all patients were discharged to a single site. The arrangement of factors that led to that aftercare plan became a decision sequence in the model. The model constructs the same discharge plans as those developed by hospital staff for every patient in the study. Tests of the validity of the model should be extended to other patients at the MDAH, to other cancer hospitals, and to other inpatient services. Revisions of the model based on these tests should be of value in the management of discharge planning services and in the design and development of comprehensive community health services.
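
The 16 decision sequences can be thought of as an ordered set of rules, each mapping a combination of planning factors to one aftercare plan. The sketch below illustrates that structure with invented factor names and rules, not the study's actual sequences:

    # Illustrative sketch of the structure described above: ordered decision
    # sequences mapping discharge-planning factors to an aftercare plan.
    # Factor names and rules are invented, not taken from the study.

    # Each decision sequence: (predicate over patient factors, resulting plan)
    DECISION_SEQUENCES = [
        (lambda p: p["needs_skilled_nursing"] and not p["caregiver_at_home"],
         "skilled nursing facility"),
        (lambda p: p["needs_skilled_nursing"] and p["caregiver_at_home"],
         "home health agency"),
        (lambda p: True, "discharge home with routine follow-up"),   # default sequence
    ]

    def aftercare_plan(patient):
        """Return the plan produced by the first decision sequence that applies."""
        for applies, plan in DECISION_SEQUENCES:
            if applies(patient):
                return plan

    print(aftercare_plan({"needs_skilled_nursing": True, "caregiver_at_home": False}))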

Relevance: 20.00%

Abstract:

A bench-scale treatability study was conducted on a high-strength wastewater from a chemical plant to develop an alternative to the existing waste stabilization pond treatment system. The objective of this study was to determine the treatability of the wastewater by the activated sludge process and, if treatable, to determine appropriate operating conditions and to evaluate the degradability of bis(2-chloroethyl)ether (Chlorex) and benzene in the activated sludge system. Four 4-L plexiglass, complete-mix, continuous-flow activated sludge reactors were operated in parallel under different operating conditions over a 6-month period. The operating conditions examined were hydraulic retention time (HRT), sludge retention time (SRT), nutrient supplementation, and Chlorex/benzene spikes. Generally, the activated sludge system treating high-strength wastewater was stable under large variations in organic loading and operating conditions. At an HRT of 2 days, more than 90% removal efficiency with good sludge settleability was achieved when the organic loading was less than 0.4 g BOD₅/g MLVSS/d or 0.8 g COD/g MLVSS/d. An SRT of at least 20 days was required to maintain steady operation. Phosphorus addition enhanced the performance of the system, especially during stressed operation. On average, removals of benzene and Chlorex were 73-86% and 37-65%, respectively. In addition, the low-strength wastewater was treatable by the activated sludge process, showing more than 90% BOD removal at an HRT of 0.5 days; in general, however, this sludge had poor settling characteristics. The aerated lagoon process treating high-strength wastewater also provided significant organic reduction but did not produce an acceptable effluent concentration.
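
The stability threshold of about 0.4 g BOD₅/g MLVSS/d is an organic (food-to-microorganism) loading rate, i.e. the mass of BOD fed per day divided by the biomass held in the reactor. A minimal sketch of that calculation, with assumed feed and biomass values rather than the study's data, is:

    # Organic (F/M) loading rate of the kind behind the ~0.4 g BOD5/g MLVSS/d
    # threshold. Input values are illustrative, not the study's data.
    def fm_ratio(flow_m3_d, bod_mg_l, volume_m3, mlvss_mg_l):
        """Food-to-microorganism ratio, g BOD5 / g MLVSS / d (mg/L == g/m3)."""
        return (flow_m3_d * bod_mg_l) / (volume_m3 * mlvss_mg_l)

    # HRT of 2 days on a 4-L bench reactor implies a feed flow of 2 L/d
    fm = fm_ratio(flow_m3_d=0.002, bod_mg_l=800.0, volume_m3=0.004, mlvss_mg_l=2500.0)
    print(f"F/M = {fm:.2f} g BOD5/g MLVSS/d")   # below 0.4, i.e. within the range reported as stable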