975 results for INTENSIVE TREATMENT


Relevance:

30.00%

Publisher:

Abstract:

Changes in grassland management intended to increase productivity can lead to sequestration of substantial amounts of atmospheric C in soils. Management-intensive grazing (MiG) can increase forage production in mesic pastures, but potential impacts on soil C have not been evaluated. We sampled four pastures (to 50 cm depth) in Virginia, USA, under MiG and neighboring pastures that were extensively grazed or hayed to evaluate impacts of grazing management on total soil organic C and N pools, and soil C fractions. Total organic soil C averaged 8.4 Mg C ha(-1) (22%) greater under MiG; differences were significant at three of the four sites examined, while total soil N was greater at two sites. Surface (0-10 cm) particulate organic matter (POM) C increased at two sites; POM C for the entire depth increment (0-50 cm) did not differ significantly between grazing treatments at any of the sites. Mineral-associated C was related to silt plus clay content and tended to be greater under MiG. None of soil C:N ratio, POM C, or the POM C:total C ratio was an accurate indicator of differences in total soil C between grazing treatments, though differences in total soil C between treatments attributable to changes in POM C (43%) were larger than expected based on POM C as a percentage of total C (24.5%). Soil C sequestration rates, estimated by calculating total organic soil C differences between treatments (assuming they arose from the change in grazing management and can be achieved elsewhere) and dividing by duration of treatment, averaged 0.41 Mg C ha(-1) year(-1) across the four sites.
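The averaged rate quoted above is simple arithmetic: the between-treatment difference in total organic soil C divided by the duration of the management change. A minimal sketch; the 20-year duration below is hypothetical, not a value from the study:

```python
# Sketch of the rate calculation described in the abstract above.
# The duration is hypothetical; the study used site-specific durations of MiG.

def sequestration_rate(delta_c_mg_per_ha: float, years: float) -> float:
    """Soil C sequestration rate in Mg C ha^-1 yr^-1."""
    return delta_c_mg_per_ha / years

# e.g. the 8.4 Mg C ha^-1 average difference accrued over an assumed 20 years
print(round(sequestration_rate(8.4, 20.0), 2))  # 0.42
```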

Relevance:

30.00%

Publisher:

Abstract:

Efficient management of domestic wastewater is a primary requirement for human well-being. Failure to adequately address issues of wastewater collection, treatment and disposal can lead to adverse public health and environmental impacts. The increasing spread of urbanisation has led to the conversion of previously rural land into urban developments and the more intensive development of semi-urban areas. However, the provision of reticulated sewerage facilities has not kept pace with this expansion in urbanisation. This has resulted in a growing dependency on onsite sewage treatment. Though considered only a temporary measure in the past, these systems are now regarded as the most cost-effective option and have become a permanent feature in some urban areas. This report is the first of a series to be produced and is the outcome of a research project initiated by the Brisbane City Council. The primary objective of the research was to relate the treatment performance of onsite sewage treatment systems to soil conditions at each site, with the emphasis being on septic tanks. This report consists of a ‘state of the art’ review of research undertaken in the arena of onsite sewage treatment. The evaluation brings together significant work undertaken locally and overseas. It focuses mainly on septic tanks, in keeping with the primary objectives of the project, and has acted as the springboard for the later field investigations and analysis undertaken as part of the project. Septic tanks continue to be widely used due to their simplicity and low cost. Their treatment performance can be highly variable due to numerous factors, but a properly designed, operated and maintained septic tank can produce effluent of satisfactory quality.
The reduction of hydraulic surges from washing machines and dishwashers, the regular removal of accumulated septage and the elimination of harmful chemicals are some of the practices that can improve system performance considerably. The relative advantage of multi-chamber over single-chamber septic tanks is an issue that needs to be resolved in view of conflicting research outcomes. In recent years, aerobic wastewater treatment systems (AWTS) have been gaining in popularity. This can be mainly attributed to the desire to avoid subsurface effluent disposal, which is the main cause of septic tank failure. The use of aerobic processes for treatment of wastewater, and the disinfection of effluent prior to disposal, is capable of producing effluent of a quality suitable for surface disposal. However, the field performance of these systems has been disappointing. A significant number do not perform to stipulated standards and quality can be highly variable. This is primarily due to householder neglect or ignorance of correct operational and maintenance procedures. Other problems include greater susceptibility to shock loadings and sludge bulking. As identified in the literature, a number of design features can also contribute to this wide variation in quality. The other treatment processes in common use are the various types of filter systems, including intermittent and recirculating sand filters. These systems too have their inherent advantages and disadvantages. Furthermore, as in the case of aerobic systems, their performance is very much dependent on individual householder operation and maintenance practices. In recent years the use of biofilters, and particularly of peat, has attracted research interest. High removal rates of various wastewater pollutants have been reported in the research literature. Despite these satisfactory results, leachate from peat has been reported in various studies.
This is an issue that needs further investigation, and as such biofilters can still be considered to be in the experimental stage. The use of other filter media such as absorbent plastic and bark has also been reported in the literature. The safe and hygienic disposal of treated effluent is a matter of concern in the case of onsite sewage treatment. Subsurface disposal is the most common option, and the only one in the case of septic tank treatment. Soil is an excellent treatment medium if suitable conditions are present. The processes of sorption, filtration and oxidation can remove the various wastewater pollutants. The subsurface characteristics of the disposal area are among the most important parameters governing process performance. Therefore it is important that soil and topographic conditions are taken into consideration in the design of the soil absorption system. Seepage trenches and beds are the common systems in use. Seepage pits or chambers can be used where subsurface conditions warrant, whilst above-grade mounds have been recommended for a variety of difficult site conditions. All these systems have their inherent advantages and disadvantages, and the preferred soil absorption system should be selected based on site characteristics. The use of gravel as in-fill for beds and trenches is open to question. It does not contribute to effluent treatment and has been shown to reduce the effective infiltrative surface area. This is due to physical obstruction and the migration of fines entrained in the gravel into the soil matrix. The surface application of effluent is coming into increasing use with the advent of aerobic treatment systems. This has the advantage that treatment is undertaken in the upper soil horizons, which are chemically and biologically the most effective in effluent renovation. Numerous research studies have demonstrated the feasibility of this practice. However, the overriding criterion is the quality of the effluent.
It has to be of exceptionally good quality in order to ensure that there are no resulting public health impacts due to aerosol drift. This essentially is the main issue of concern, due to the unreliability of the effluent quality from aerobic systems. Secondly, it has also been found that most householders do not take adequate care in the operation of spray irrigation systems or in the maintenance of the irrigation area. Under these circumstances surface disposal of effluent should be approached with caution and would require appropriate householder education and stringent compliance requirements. Despite all this, the efficiency with which the process is undertaken will ultimately rest with the individual householder, and this is where most concern rests. Greywater requires similar consideration. Surface irrigation of greywater is currently permitted in a number of local authority jurisdictions in Queensland. Considering that greywater constitutes the largest fraction of the total wastewater generated in a household, it could be considered a potential resource. Unfortunately, in most circumstances the only pretreatment required prior to reuse is the removal of oil and grease. This is an issue of concern, as greywater can be considered a weak to medium-strength sewage: it contains primary pollutants such as BOD material and nutrients and may also include microbial contamination. Therefore its use for surface irrigation can pose a potential health risk. This is further compounded by the fact that most householders are unaware of the potential adverse impacts of indiscriminate greywater reuse. As in the case of blackwater effluent reuse, there have been suggestions that greywater should also be subjected to stringent guidelines. Under these circumstances the surface application of any wastewater requires careful consideration.
The other option available for the disposal of effluent is the use of evaporation systems. The use of evapotranspiration systems has been covered in this report. Research has shown that these systems are susceptible to a number of factors, in particular climatic conditions; as such, their applicability is location specific. The design of systems based solely on evapotranspiration is also questionable. In order to ensure more reliability, such systems should be designed to include soil absorption. The successful use of these systems for intermittent usage has been noted in the literature. Taking into consideration the issues discussed above, subsurface disposal of effluent is the safest under most conditions, provided the facility has been designed to accommodate site conditions. The main problem associated with subsurface disposal is the formation of a clogging mat on the infiltrative surfaces. Due to the formation of the clogging mat, the capacity of the soil to handle effluent is no longer governed by the soil’s hydraulic conductivity as measured by the percolation test, but rather by the infiltration rate through the clogged zone. The characteristics of the clogging mat have been shown to be influenced by various soil and effluent characteristics, and the mechanisms of its formation by various physical, chemical and biological processes. Biological clogging is the most common process and occurs when bacterial growth or its by-products reduce the soil pore diameters; it is generally associated with anaerobic conditions. The formation of the clogging mat provides significant benefits. It acts as an efficient filter for the removal of microorganisms. Also, as the clogging mat increases the hydraulic impedance to flow, unsaturated flow conditions will occur below the mat. This permits greater contact between effluent and soil particles, thereby enhancing the purification process.
This is particularly important in the case of highly permeable soils. However, the adverse impacts of clogging mat formation cannot be ignored, as they can lead to a significant reduction in the infiltration rate. This in fact is the most common cause of soil absorption system failure. As the formation of the clogging mat is inevitable, it is important to ensure that it does not impede effluent infiltration beyond tolerable limits. Various strategies have been investigated to either control clogging mat formation or to remediate its severity. Intermittent dosing of effluent is one such strategy that has attracted considerable attention, though research conclusions with regard to short rest intervals are contradictory. It has been claimed that intermittent rest periods result in aerobic decomposition of the clogging mat, leading to a subsequent increase in the infiltration rate. Contrary to this, it has also been claimed that short rest periods are insufficient to completely decompose the clogging mat, and that the intermediate by-products that form as a result of aerobic processes in fact lead to even more severe clogging. It has been further recommended that rest periods should be much longer, in the range of about six months, which entails the provision of a second, alternating seepage bed. Other concepts that have been investigated are the design of the bed to meet the equilibrium infiltration rate that eventuates after clogging mat formation; improved geometry, such as the use of seepage trenches instead of beds; serial instead of parallel effluent distribution; and low-pressure dosing of effluent. Physical measures such as oxidation with hydrogen peroxide and replacement of the infiltration surface have been shown to be only of short-term benefit.
Another issue of importance is the degree of pretreatment that should be provided to the effluent prior to subsurface application, and the influence exerted by pollutant loadings on clogging mat formation. Laboratory studies have shown that the total mass loadings of BOD and suspended solids are important factors in the formation of the clogging mat, as is the nature of the suspended solids. The finer particles from extended aeration systems, when compared to those from septic tanks, penetrate deeper into the soil and hence ultimately form a denser clogging mat. However, the importance of improved pretreatment in clogging mat formation may need to be qualified in view of other research studies. It has also been shown that effluent quality may be a factor in the case of highly permeable soils, but this may not be the case with fine-structured soils. The ultimate test of onsite sewage treatment system efficiency rests with the final disposal of effluent. The implications of system failure, as evidenced by surface ponding of effluent or seepage of contaminants into the groundwater, can be very serious, leading to environmental and public health impacts. Significant microbial contamination of surface water and groundwater has been attributed to septic tank effluent, and there are a number of documented instances of septic tank-related waterborne disease outbreaks affecting large numbers of people. In a recent incident, the local authority, and not the individual septic tank owners, was found liable for an outbreak of viral hepatitis A, as no action had been taken to remedy septic tank failure. This illustrates the responsibility placed on local authorities in terms of ensuring the proper operation of onsite sewage treatment systems. Even a properly functioning soil absorption system is only capable of removing phosphorus and microorganisms.
The nitrogen remaining after plant uptake will not be retained in the soil column, but will instead gradually seep into the groundwater as nitrate. Conditions for nitrogen removal by denitrification are not generally present in a soil absorption bed, so dilution by groundwater is the only treatment available for reducing the nitrogen concentration to specified levels. Based on subsurface conditions, this essentially entails a maximum allowable concentration of septic tanks in a given area. Unfortunately, nitrogen is not the only wastewater pollutant of concern. Relatively long survival times and travel distances have been noted for microorganisms originating from soil absorption systems. This is likely to happen if saturated conditions persist under the soil absorption bed, or due to surface runoff of effluent as a result of system failure. Soils also have a finite capacity for the removal of phosphorus; once this capacity is exceeded, phosphorus too will seep into the groundwater. The relatively high mobility of phosphorus in sandy soils has been noted in the literature. These issues have serious implications for the design and siting of soil absorption systems. It is important to ensure not only that the system design is based on subsurface conditions, but also that the density of these systems in a given area is controlled. This essentially involves the adoption of a land capability approach to determine the limitations of an individual site for onsite sewage disposal. The most limiting factor at a particular site would determine the overall capability classification for that site, which would also dictate the type of effluent disposal method to be adopted.
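The dilution argument above (groundwater dilution as the only nitrate control, hence a maximum allowable septic tank density) is at heart a mass balance. A minimal sketch with entirely hypothetical parameter values; real siting assessments use site-specific hydrogeology:

```python
# Illustrative mass-balance sketch of the dilution argument: the mixed
# nitrate-N concentration when a tank's annual N load is diluted by effluent
# plus groundwater recharge, assuming complete mixing. All values hypothetical.

def mixed_nitrate_mg_l(n_load_kg_yr: float,
                       effluent_m3_yr: float,
                       recharge_m3_yr: float) -> float:
    """Nitrate-N concentration (mg/L) after dilution by recharge."""
    total_water_l = (effluent_m3_yr + recharge_m3_yr) * 1000.0  # m^3 -> L
    return n_load_kg_yr * 1e6 / total_water_l                   # kg -> mg

# e.g. one household: ~2.7 kg N/yr in 270 m^3 effluent, mixed with 600 m^3
# of recharge over the lot; lower recharge (denser tanks) raises the result
print(round(mixed_nitrate_mg_l(2.7, 270.0, 600.0), 1))  # 3.1
```

Holding a target concentration fixed and solving for recharge per tank is what turns this into a maximum allowable tank density.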

Relevance:

30.00%

Publisher:

Abstract:

Objective: To determine the prevalence, severity, location, etiology, treatment, and healing of medical device-related pressure ulcers in intensive care patients for up to 7 days. Design: Prospective repeated measures study. Setting and participants: Patients in 6 intensive care units of 2 major medical centers, one each in Australia and the United States, were screened 1 day per month for 6 months. Those with device-related ulcers were followed daily for up to 7 days. Outcome measures: Device-related ulcer prevalence, pain, infection, treatment, healing. Results: 15 of 483 patients had device-related ulcers; 9 of these 15, with 11 ulcers, were followed beyond screening. Their mean age was 60.5 years; most were men, overweight, and at increased pressure ulcer risk. Endotracheal and nasogastric tubes were the cause of most device-related ulcers. Repositioning was the most frequent treatment. 4 of the 11 ulcers healed within the 7-day observation period. Conclusion: Device-related ulcer prevalence was 3.1%, similar to that reported in the limited literature available, indicating an ongoing problem. Systematic assessment and repositioning of devices are the mainstays of care. We recommend continued prevalence determination and that nurses remain vigilant to prevent device-related ulcers, especially in patients with nasogastric and endotracheal tubes.
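The 3.1% prevalence reported above is the proportion of screened patients with device-related ulcers; a sketch using the abstract's own counts:

```python
# Point prevalence as a percentage of patients screened, using the counts
# reported in the abstract above (15 of 483 screened patients).

def prevalence_pct(cases: int, screened: int) -> float:
    """Point prevalence as a percentage."""
    return 100.0 * cases / screened

print(round(prevalence_pct(15, 483), 1))  # 3.1
```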

Relevance:

30.00%

Publisher:

Abstract:

Mortality and cost outcomes of elderly intensive care unit (ICU) trauma patients were characterised in a retrospective cohort study from an Australian tertiary ICU. Trauma patients admitted between January 2000 and December 2005 were grouped into three major age categories: aged ≥65 years admitted into the ICU (n=272); aged ≥65 years admitted into a general ward (n=610); and aged <65 years admitted into the ICU (n=1617). Hospital mortality predictors were characterised as odds ratios (OR) using logistic regression. The impact of predictor variables on (log) total hospital-stay costs was determined using least squares regression. An alternate treatment-effects regression model estimated the mortality cost-effect as an endogenous variable. Mortality predictors (P ≤0.0001, comparator: ICU ≥65 years, ventilated) were: ICU <65 not-ventilated (OR 0.014); ICU <65 ventilated (OR 0.090); ICU ≥65 not-ventilated (OR 0.061); ward ≥65 (OR 0.086); increasing injury severity score; and Charlson comorbidity index of 1 and 2, compared with zero (OR 2.21 [1.40 to 3.48] and OR 2.57 [1.45 to 4.55]). The raw mean daily ICU and hospital costs in A$ 2005 (US$) for age <65 and ≥65 to ICU, and ≥65 to the ward were: for year 2000: ICU, $2717 (1462) and $2777 (1494); hospital, $1837 (988) and $1590 (855); ward $933 (502); for year 2005: ICU, $3202 (2393) and $3086 (2307); hospital, $1938 (1449) and $1914 (1431); ward $1180 (882). Cost increments were predicted by age ≥65 and ICU admission, increasing injury severity score, mechanical ventilation, Charlson comorbidity index increments and hospital survival. The mortality cost-effect was estimated at -63% by least squares regression and -82% by the treatment-effects regression model. Patient demographic factors, injury severity and its consequences predict both cost and survival in trauma. The cost mortality effect was biased upwards by conventional least squares regression estimation.
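As a reminder of how the odds ratios above relate to the underlying logistic regression: each OR is the exponential of the model coefficient for that predictor. The coefficient below is illustrative only, not an estimate from the study; it just shows that a negative coefficient yields a protective OR below 1, as for the younger and non-ventilated groups:

```python
import math

# Odds ratio implied by a logistic regression coefficient: OR = exp(beta).
# The beta value used here is hypothetical, not taken from the study.

def odds_ratio(beta: float) -> float:
    """Odds ratio implied by a logistic regression coefficient."""
    return math.exp(beta)

assumed_beta = -2.41  # hypothetical coefficient for illustration
print(round(odds_ratio(assumed_beta), 3))  # a protective OR, well below 1
```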

Relevance:

30.00%

Publisher:

Abstract:

The common goal of tissue engineering is to develop substitutes that can closely mimic the structure of the extracellular matrix (ECM). Similarly important, however, are the intensive material properties, which have often been overlooked, in particular for soft tissues that are assumed not to bear load. The mechanostructural properties determine not only the structural stability of biomaterials but also their physiological functionality, by directing cellular activity and regulating cell fate decisions. The aim here is to emphasize that cells can sense intensive material properties such as elasticity and reside, proliferate, migrate and differentiate accordingly, no matter whether the construct is from a natural source like cartilage or skin, or a synthetic one. The specific objective of this work is to provide a tunable scheme for manipulating the elasticity of collagen-based constructs, to be used to demonstrate how to engineer cell behavior and regulate mechanotransduction. Articular cartilage was chosen as it represents one of the most complex hierarchical arrangements of collagen meshwork among both connective tissues and ECM-like biomaterials. Corona discharge treatment was used to produce constructs with varying densities of crosslinked collagen and, accordingly, varying stiffness. The results demonstrated that the elastic modulus increased by up to 33% for samples treated for up to one minute, as crosslink density was found to increase with exposure time. According to the thermal analysis, longer exposure to corona increased crosslink density, as the denaturation enthalpy increased. However, the spectroscopy results suggested that, despite the stabilization of the collagen structure, the integrity of the triple helical structure remained intact.
The in vitro superficial culture of heterologous chondrocytes also showed that corona treatment can modulate migration, with increased focal adhesion of cells due to enhanced stiffness and without cytotoxic effects, providing a basis for reinforcing three-dimensional collagen-based biomaterials in order to direct cell function and mediate mechanotransduction.

Relevance:

30.00%

Publisher:

Abstract:

Aim To test the efficacy of Medilixir [cream] against the standard treatment of aqueous cream in the provision of relief from the symptoms of post-burn itch. Design RCT with two parallel arms. Setting Professor Stuart Pegg Adult Burns Centre, Royal Brisbane and Women's Hospital, Brisbane, Australia. Participants Fifty-two patients aged between 18 and 80 years who were admitted directly to the burns centre between 10 March and 22 July 2008, were able to provide informed consent, and had shown no allergic reaction to a patch test with the study medication were randomised. Patients admitted from intensive care or high dependency were excluded. Main results Effect estimates and confidence intervals were not reported for any of the outcomes; only group means/proportions and P-values from hypothesis testing were provided. More patients in the intervention group reported itch reduction compared to the comparison treatment (91% vs. 82%, P=0.001). Itch recurrence after cream application occurred later in the intervention group than in the control group (P<0.001). Use of antipruritic medication was significantly greater in the control group (P=0.023). There was no difference in sleep disturbance between groups (not quantified). On average, Medilixir took longer to apply than aqueous cream (157 s for Medilixir vs. 139 s for aqueous cream; mean difference 17 s), but the authors noted that the groups did not differ significantly (the CI for the mean difference and P-values were not reported).

Relevance:

30.00%

Publisher:

Abstract:

The use of the Sengstaken–Blakemore tube as a life-saving treatment for bleeding oesophageal varices is slowly becoming the least preferred method, possibly due to the potential complications associated with its placement. Nursing practice pertaining to the care of this patient group appears ad hoc and reliant on local knowledge and experience, as opposed to recognised evidence of best practice. Therefore, this paper focuses on the application of Lewin's transitional change theory to introduce a change in nursing practice: a guideline to enhance the care of patients with a Sengstaken–Blakemore tube in situ within a general intensive care unit. This approach identified some of the complexities surrounding the change process, including the driving forces that must be harnessed and the restraining forces that must be minimised for the adoption of change to be successful.

Relevance:

30.00%

Publisher:

Abstract:

As for many other cancers, metastasis is the leading cause of death of patients with ovarian cancer. Vigorous basic and clinical research is being performed to develop more efficacious treatment strategies to improve the poor outcome of women with this cancer. Current treatment for ovarian cancer includes advanced cytoreductive surgery and traditional platinum and taxane combination chemotherapy. Clinical trials using novel cytotoxic agents and tyrosine kinase inhibitors have also been progressing. In parallel, the application of robust, unbiased, high-throughput research platforms using transcriptomic and proteomic approaches has identified that not only individual cell signalling pathways, but a network of molecular pathways, play an important role in the biology of ovarian cancer. Furthermore, intensive genomic and epigenetic analyses have revealed single nucleotide polymorphisms associated with the risk and/or aetiology of this cancer, including patient response to treatment. Taken together, these approaches, which are advancing our understanding, will have an impact on the generation of new therapeutic approaches and strategies for improving the outcome and quality of life of patients with ovarian cancer in the near future.

Relevance:

30.00%

Publisher:

Abstract:

Vegetable cropping systems are often characterised by high inputs of nitrogen fertiliser. Elevated emissions of nitrous oxide (N2O) can be expected as a consequence. In order to mitigate N2O emissions from fertilised agricultural fields, the use of nitrification inhibitors in combination with ammonium-based fertilisers has been promoted. However, no data are currently available on the use of nitrification inhibitors in sub-tropical vegetable systems. A field experiment was conducted to investigate the effect of the nitrification inhibitor 3,4-dimethylpyrazole phosphate (DMPP) on N2O emissions and yield from broccoli production in sub-tropical Australia. Soil N2O fluxes were monitored continuously (3 h sampling frequency) with fully automated, pneumatically operated measuring chambers linked to a sampling control system and a gas chromatograph. Cumulative N2O emissions over the 5 month observation period amounted to 298 g-N/ha, 324 g-N/ha, 411 g-N/ha and 463 g-N/ha in the conventional fertiliser (CONV), the DMPP treatment (DMPP), the DMPP treatment with a 10% reduced fertiliser rate (DMPP-red) and the zero fertiliser (0N) treatments, respectively. The temporal variation of N2O fluxes showed only low emissions over the broccoli cropping phase, but significantly elevated emissions were observed in all treatments after broccoli residues were incorporated into the soil. Overall, 70–90% of the total emissions occurred in this 5-week fallow phase. There was a significant inhibition effect of DMPP on N2O emissions and soil mineral N content over the broccoli cropping phase, where the application of DMPP reduced N2O emissions by 75% compared to the standard practice. However, there was no statistical difference between the treatments during the fallow phase or when the whole season was considered.
This study shows that DMPP has the potential to reduce N2O emissions from intensive vegetable systems, but also highlights the importance of post-harvest emissions from incorporated vegetable residues. N2O mitigation strategies in vegetable systems need to target these post-harvest emissions and a better evaluation of the effect of nitrification inhibitors over the fallow phase is needed.
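The 75% cropping-phase reduction quoted above is a relative difference in cumulative flux between treatments; a minimal sketch, with made-up fluxes rather than the study's measurements:

```python
# Relative reduction of one cumulative flux versus a control, in percent.
# Example fluxes are hypothetical; the study's cropping-phase values differ.

def percent_reduction(control: float, treated: float) -> float:
    """Relative reduction of `treated` versus `control`, in percent."""
    return 100.0 * (control - treated) / control

# e.g. 100 g N/ha under standard practice vs 25 g N/ha with an inhibitor
print(percent_reduction(100.0, 25.0))  # 75.0
```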

Relevance:

30.00%

Publisher:

Abstract:

In his letter, Cunha suggests that oral antibiotic therapy is safer and less expensive than intravenous therapy via central venous catheters (CVCs) (1). The implication is that costs will fall and increased health benefits will be enjoyed, resulting in a gain in efficiency within the healthcare system. CVCs are often used in critically ill patients to deliver antimicrobial therapy, but expose patients to a risk of catheter-related bloodstream infection (CRBSI). Our current knowledge about the efficiency (i.e. cost-effectiveness) of allocating resources toward interventions that prevent CRBSI in patients requiring a CVC has already been reviewed (2). If for some patient groups antimicrobial therapy can be delivered orally, instead of through a CVC, then the costs and benefits of this alternate strategy should be evaluated...

Relevance:

30.00%

Publisher:

Abstract:

Objectives To examine the level of knowledge of doctors about the law on withholding and withdrawing life-sustaining treatment from adults who lack decision-making capacity, and factors associated with a higher level of knowledge. Design, setting and participants Postal survey of all specialists in emergency medicine, geriatric medicine, intensive care, medical oncology, palliative medicine, renal medicine and respiratory medicine on the AMPCo Direct database in New South Wales, Victoria and Queensland. Survey initially posted to participants on 18 July 2012 and closed on 31 January 2013. Main outcome measures Medical specialists’ levels of knowledge about the law, based on their responses to two survey questions. Results Overall response rate was 32%. For the seven statements contained in the two questions about the law, the mean knowledge score was 3.26 out of 7. State and specialty were the strongest predictors of legal knowledge. Conclusions Among doctors who practise in the end-of-life field, there are some significant knowledge gaps about the law on withholding and withdrawing life-sustaining treatment from adults who lack decision-making capacity. Significant consequences for both patients and doctors can flow from a failure to comply with the law. Steps should be taken to improve doctors’ legal knowledge in this area and to harmonise the law across Australia.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Donation after Cardiac Death (DCD) is one possible solution to the worldwide organ shortage. Intensive care physicians are central to the success of DCD, since they are responsible for making the clinical judgements and decisions associated with it. Yet international evidence shows health care professionals have not embraced DCD and are often reluctant to consider it as an option for patients. PURPOSE: To explore intensive care physicians' clinical judgements when selecting a suitable DCD candidate. METHODS: Using interpretative exploratory methods, six intensive care physicians from three hospital sites in Australia were interviewed. Following verbatim transcription, the data were subjected to thematic analysis. FINDINGS: Three distinct themes emerged. Reducing harm and increasing benefit was a major focus of intensive care physicians during determination of DCD. There was an acceptance of DCD if there was clear evidence that donation was what the patient and family wanted. Characteristics of a defensible decision comprised sequencing, separation and isolation, timing, consensus and collaboration, and trust and communication, ensuring that judgements were robust and defensible. The final theme revealed the importance of minimising uncertainty and discomfort when predicting length of survival following withdrawal of life-sustaining treatment. CONCLUSION: DCD decisions are made within an environment of uncertainty due to the imprecision associated with predicting time of death. This lack of certainty contributed to the cautious and collaborative strategies used by intensive care physicians when dealing with patients, family members and colleagues around end-of-life decisions, initiation of withdrawal of life-sustaining treatment and the discussion of DCD. This study recommends that nationally consistent policies be developed urgently to increase the degree of certainty for intensive care staff concerning the DCD process.

Relevância:

30.00% 30.00%

Publicador:

Resumo:

Background: To effectively care for people who are terminally ill, including those without decision-making capacity, palliative care physicians must know and understand the legal standing of Advance Care Planning (ACP) in their jurisdiction of practice. This includes the use of advance directives/living wills (ADs) and substitute decision-makers (SDMs), who can legally consent to or refuse treatment if there is no valid AD. Aim: The study aimed to investigate the knowledge, attitudes and practices of the medical specialists most often involved in end-of-life care in relation to the law on withholding/withdrawing life-sustaining treatment (WWLST) from adults without decision-making capacity. Design/participants: A pre-piloted survey was posted to specialists in palliative, emergency, geriatric, renal and respiratory medicine, intensive care and medical oncology in three Australian states. Surveys were analysed using SPSS 20 and SAS 9.3. Results: The overall response rate was 32% (867/2702); 52% of responses were from palliative care specialists. Palliative care specialists and geriatricians had significantly more positive attitudes towards the law (χ²(42) = 94.352; p < 0.001) and higher levels of knowledge about the WWLST law (χ²(7) = 30.033; p < 0.001) than the other specialists, while still having critical gaps in their knowledge. Conclusions: A high level of knowledge of the law is essential to ensure that patients' wishes and decisions, expressed through ACP, are respected to the maximum extent possible within the law, thereby according with the principles and philosophy of palliative care. It is also essential to protect health professionals from legal action resulting from the unauthorised provision or removal of treatment.
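The group comparisons reported above rest on the chi-square test of independence applied to contingency tables of survey responses. A minimal sketch of how such a statistic is computed is shown below; the counts are entirely hypothetical and are not the study's data, and the 2×2 layout (two specialty groupings by high/low knowledge) is an illustrative simplification of the multi-group tables the study would have used.

```python
# Sketch of a chi-square test of independence on a contingency table.
# All counts are hypothetical, NOT the study's data.

def chi_square_independence(table):
    """Return (chi-square statistic, degrees of freedom) for a contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand_total = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count under independence of rows and columns
            expected = row_totals[i] * col_totals[j] / grand_total
            chi2 += (observed - expected) ** 2 / expected
    dof = (len(table) - 1) * (len(table[0]) - 1)
    return chi2, dof

# Rows: specialty grouping; columns: high vs low knowledge score (hypothetical)
observed = [
    [120, 30],   # e.g. palliative care / geriatrics
    [200, 150],  # e.g. other specialties
]
stat, dof = chi_square_independence(observed)
print(f"chi2({dof}) = {stat:.3f}")
```

The statistic is then compared against the chi-square distribution with the given degrees of freedom to obtain the p-value; in practice a package routine such as `scipy.stats.chi2_contingency` would be used, which also returns the p-value directly.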

Relevância:

30.00% 30.00%

Publicador:

Resumo:

Point sources of wastewater pollution, including effluent from municipal sewage treatment plants and intensive livestock and processing industries, can contribute significantly to the degradation of receiving waters (Chambers et al. 1997; Productivity Commission 2004). This has led to increasingly stringent local wastewater discharge quotas (particularly for nitrogen, phosphorus and suspended solids), and many municipal authorities and industry managers are now faced with upgrading their existing treatment facilities in order to comply. However, with high construction, energy and maintenance expenses and increasing labour costs, traditional wastewater treatment systems are becoming an escalating financial burden for the communities and industries that operate them. This report was generated, in the first instance, for the Burdekin Shire Council to provide information on design aspects and parameters critical for developing duckweed-based wastewater treatment (DWT) in the Burdekin region. However, the information will be relevant to a range of wastewater sources throughout Queensland. The information has been collated from published literature and from both overseas and local studies of pilot- and full-scale DWT systems. This report also considers options to generate revenue from duckweed production (a significant feature of DWT), and provides specifications and component cost information (current at the time of publication) for a large-scale demonstration of an integrated DWT and fish production system.

Relevância:

30.00% 30.00%

Publicador:

Resumo:

Coal seam gas (CSG) is a growing industry in Queensland and represents a potential major employer and deliverer of financial prosperity for years to come. CSG is a natural gas composed primarily of methane and is found trapped underground in coal beds. During the gas extraction process, significant volumes of associated water are also produced. This associated water could be a valuable resource; however, it contains various dissolved salts that make it problematic for beneficial use. Consequently, there is a need to implement water treatment strategies that purify the associated water to comply with Queensland's strict guidelines and mitigate environmental risks. The resultant brine is also of importance, as it too must ultimately be dealt with in an economical manner. In some respects the CSG industry does not face a water problem, since the water itself has inherent value to society, but rather a "salt issue" to solve. This study analyses the options involved in both the water treatment and salt recovery processes. A brief overview of the constituents present in Queensland CS water is given to illustrate the challenges involved, and a range of treatment technologies is discussed. Water treatment technologies examined include clarification (ballasted flocculation, dissolved air flotation, electrocoagulation), membrane filtration (ultrafiltration), ion exchange softening and desalination (ion exchange, reverse osmosis desalination and capacitive deionization). Brine management options highlighted include reinjection, brine concentration ponds, membrane techniques (membrane distillation, forward osmosis), thermal methods, electrodialysis, electrodialysis reversal, bipolar membrane electrodialysis, wind-assisted intensive evaporation, membrane crystallization, eutectic freeze crystallization and vapor compression.
Overall, this investigation is intended as a tool for developing effective CS water treatment and management strategies in Queensland and worldwide.