966 results for Months.


Abstract:

Phenomenology is a term that has been described as a philosophy, a research paradigm and a methodology, and has been equated with qualitative research. In this paper we first clarify phenomenology by tracing its development both as a philosophy and as a research method. Next, we make a case for the use of phenomenology in empirical investigations of management phenomena. The paper discusses a selection of central concepts pertaining to phenomenology as a scientific research method, including description, phenomenological reduction and free imaginative variation. In particular, the paper elucidates the efficacy of Giorgi’s descriptive phenomenological research praxis as a qualitative research method and shows how it can be applied to create a deeper and richer understanding of management practice.

Abstract:

The Cooperative Research Centre (CRC) for Rail Innovation is conducting a tranche of industry-led research projects looking into safer rail level crossings. This paper provides an overview of the Affordable Level Crossings project, which is researching both the engineering and human factors aspects of low-cost level crossing warning devices (LCLCWDs) and is facilitating a comparative trial of these devices over a period of 12 months in several jurisdictions. LCLCWDs are characterised by the use of alternative technologies for high-cost components, including train detection and connectivity (e.g. radar, acoustic and magnetic-induction train detection systems, and wireless connectivity replacing traditional track circuits and wiring). These devices often make use of solar power where mains power is not available, and aim to achieve substantial savings in lifecycle costs. The project involves trialling LCLCWDs in shadow mode, where devices are installed without the road-user interface at a number of existing level crossing sites that are already equipped with conventional active warning systems. Deploying lower-cost devices may provide a significantly larger safety benefit across the network than deploying expensive conventional devices, as the lower cost allows more passive level crossing sites to be upgraded with the same capital investment. The project will investigate the reliability and safety integrity of the low-cost devices, evaluate lifecycle costs and investigate human factors issues related to warning reliability. This paper focuses on the requirements and safety issues of LCLCWDs and provides an overview of the Rail CRC projects.
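A back-of-envelope sketch in Python of the capital-allocation argument above; every cost and risk-reduction figure here is invented for illustration and is not drawn from the project.

```python
# Hypothetical illustration: if a conventional active warning system costs far
# more per site than an LCLCWD, a fixed budget upgrades many more passive
# crossings. All figures below are invented.
BUDGET = 10_000_000          # capital available, AUD (assumed)
COST_CONVENTIONAL = 500_000  # per-site cost, conventional system (assumed)
COST_LCLCWD = 100_000        # per-site cost, low-cost device (assumed)
RISK_REDUCTION_CONV = 0.90   # assumed per-site risk reduction, conventional
RISK_REDUCTION_LC = 0.70     # assumed (lower) per-site risk reduction, LCLCWD

sites_conv = BUDGET // COST_CONVENTIONAL
sites_lc = BUDGET // COST_LCLCWD

# Network-level benefit ~ number of sites upgraded x per-site risk reduction
print(f"Conventional: {sites_conv} sites, benefit index {sites_conv * RISK_REDUCTION_CONV:.0f}")
print(f"LCLCWD:       {sites_lc} sites, benefit index {sites_lc * RISK_REDUCTION_LC:.0f}")
```

Under these assumed numbers, the low-cost devices deliver a larger network-wide benefit even though each device is individually less effective.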

Abstract:

The purpose of this study was to investigate the association between temperament in Australian infants aged 2–7 months and the feeding practices of their first-time mothers (n=698). Associations between feeding practices and beliefs (Infant Feeding Questionnaire) and infant temperament (easy-difficult continuous scale from the Short Temperament Scale for Infants) were tested using linear and binary logistic regression models adjusted for a comprehensive range of covariates. Mothers of infants with a more difficult temperament reported a lower awareness of infant cues, were more likely to use food to calm, and reported greater concern about both overweight and underweight. Associations with the covariate maternal depression score largely mirrored these findings. Infant temperament may be an important variable to consider in future research on the prevention of childhood obesity. In practice, mothers of temperamentally difficult infants may need targeted feeding advice to minimise the adoption of undesirable feeding practices.
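A minimal sketch of the kind of adjusted binary logistic regression described above, using statsmodels; the variable names and synthetic data are assumptions for illustration, not the study's dataset.

```python
# Sketch (not the study's actual analysis): a binary feeding practice
# regressed on a continuous easy-difficult temperament score, adjusted for
# covariates. Data and variable names are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 698
df = pd.DataFrame({
    "uses_food_to_calm": rng.integers(0, 2, n),   # binary feeding practice
    "temperament": rng.normal(3, 1, n),           # easy-difficult scale
    "maternal_depression": rng.normal(5, 2, n),   # covariate
    "infant_age_months": rng.uniform(2, 7, n),    # covariate
})

model = smf.logit(
    "uses_food_to_calm ~ temperament + maternal_depression + infant_age_months",
    data=df,
).fit(disp=False)
print(np.exp(model.params))  # odds ratios per unit increase in each predictor
```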

Abstract:

Objectives We aimed to use simple clinical questions to group women and provide their specific rates of miscarriage, preterm delivery and stillbirth for reference, and to describe which groups experienced particularly low or high rates of each event. Methods Data were collected as part of the Australian Longitudinal Study on Women's Health, a national prospective cohort. Reproductive histories were obtained from 5806 women aged 31–36 years in 2009 who had self-reported an outcome for one or more pregnancies. Age at first birth, number of live births, smoking status, fertility problems, use of in vitro fertilisation (IVF), education and physical activity were the variables that best separated women into groups for calculating the rates of miscarriage, preterm delivery and stillbirth. Results Women reported 10,247 live births, 2544 miscarriages, 1113 preterm deliveries and 113 stillbirths. Miscarriage was correlated with stillbirth (r = 0.09, P<0.001). The calculable rate of miscarriage ranged from 11.3 to 86.5 miscarriages per 100 live births. Women with high rates of miscarriage typically had fewer live births, were more likely to smoke and were more likely to have tried unsuccessfully to conceive for ≥12 months. The highest proportion of live preterm delivery (32.2%) occurred in women who had one live birth, had tried unsuccessfully to conceive for ≥12 months, had used IVF and had 12 years of education or equivalent. Women who were aged 14–19.99 years at their first birth and reported low physical activity had 38.9 stillbirths per 1000 live births, compared with the lowest observed rate of 5.5 per 1000 live births. Conclusion Different groups of women experience vastly different rates of each adverse pregnancy event. We have used simple questions to establish reference data that stratify women into low- and high-rate groups, which may be useful in counselling women who have experienced miscarriage, preterm delivery or stillbirth, as well as women intending to conceive.
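A minimal sketch of how such stratified rates (events per 100 live births within groups defined by simple questions) can be computed; the grouping variables and records are invented, not the ALSWH data.

```python
# Sketch of the rate construction reported above: events per 100 live births
# within groups defined by simple clinical questions. Data are illustrative.
import pandas as pd

df = pd.DataFrame({
    "smoker":        [0, 0, 1, 1, 0, 1],
    "tried_12m":     [0, 1, 0, 1, 1, 1],   # tried to conceive >= 12 months
    "live_births":   [2, 1, 2, 1, 3, 1],
    "miscarriages":  [0, 1, 1, 2, 0, 2],
})

grouped = df.groupby(["smoker", "tried_12m"])[["miscarriages", "live_births"]].sum()
grouped["per_100_live_births"] = 100 * grouped["miscarriages"] / grouped["live_births"]
print(grouped)
```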

Abstract:

Efficient management of domestic wastewater is a primary requirement for human wellbeing. Failure to adequately address issues of wastewater collection, treatment and disposal can lead to adverse public health and environmental impacts. The increasing spread of urbanisation has led to the conversion of previously rural land into urban developments and the more intensive development of semi-urban areas. However, the provision of reticulated sewerage facilities has not kept pace with this expansion in urbanisation, resulting in a growing dependency on onsite sewage treatment. Though considered only a temporary measure in the past, these systems are now regarded as the most cost-effective option and have become a permanent feature in some urban areas.

This report is the first of a series and is the outcome of a research project initiated by the Brisbane City Council. The primary objective of the research was to relate the treatment performance of onsite sewage treatment systems to soil conditions at the site, with the emphasis on septic tanks. The report consists of a state-of-the-art review of research in the arena of onsite sewage treatment, bringing together significant work undertaken locally and overseas. It focuses mainly on septic tanks, in keeping with the primary objectives of the project, and acted as the springboard for the later field investigations and analysis undertaken as part of the project.

Septic tanks continue to be used widely due to their simplicity and low cost. Their treatment performance can be highly variable due to numerous factors, but a properly designed, operated and maintained septic tank can produce effluent of satisfactory quality. The reduction of hydraulic surges from washing machines and dishwashers, regular removal of accumulated septage and the elimination of harmful chemicals are some of the practices that can improve system performance considerably. The relative advantages of multi-chamber over single-chamber septic tanks are an issue that needs to be resolved in view of conflicting research outcomes.

In recent years, aerobic wastewater treatment systems (AWTS) have been gaining in popularity. This can be mainly attributed to the desire to avoid subsurface effluent disposal, which is the main cause of septic tank failure. The use of aerobic processes for the treatment of wastewater, together with disinfection of effluent prior to disposal, is capable of producing effluent of a quality suitable for surface disposal. However, the field performance of these systems has been disappointing: a significant number do not perform to stipulated standards, and effluent quality can be highly variable. This is primarily due to householder neglect or ignorance of correct operational and maintenance procedures. Other problems include greater susceptibility to shock loadings and sludge bulking, and, as identified in the literature, a number of design features can also contribute to this wide variation in quality. The other treatment processes in common use are the various types of filter systems, including intermittent and recirculating sand filters. These systems too have their inherent advantages and disadvantages and, as in the case of aerobic systems, their performance is very much dependent on individual householder operation and maintenance practices.
In recent years the use of biofilters, particularly peat, has attracted research interest. High removal rates of various wastewater pollutants have been reported in the research literature. Despite these satisfactory results, leachate from peat has been reported in various studies; this issue needs further investigation, and as such biofilters should still be considered experimental. The use of other filter media, such as absorbent plastic and bark, has also been reported in the literature.

The safe and hygienic disposal of treated effluent is a matter of concern in onsite sewage treatment. Subsurface disposal is the most common option, and the only option in the case of septic tank treatment. Soil is an excellent treatment medium if suitable conditions are present: the processes of sorption, filtration and oxidation can remove the various wastewater pollutants. The subsurface characteristics of the disposal area are among the most important parameters governing process performance, so soil and topographic conditions must be taken into consideration in the design of the soil absorption system. Seepage trenches and beds are the common systems in use; seepage pits or chambers can be used where subsurface conditions warrant, whilst above-grade mounds have been recommended for a variety of difficult site conditions. All these systems have their inherent advantages and disadvantages, and the preferred soil absorption system should be selected based on site characteristics. The use of gravel as in-fill for beds and trenches is open to question: it does not contribute to effluent treatment and has been shown to reduce the effective infiltrative surface area, due to physical obstruction and the migration of fines entrained in the gravel into the soil matrix.

The surface application of effluent is coming into increasing use with the advent of aerobic treatment systems. This has the advantage that treatment is undertaken in the upper soil horizons, which are chemically and biologically the most effective in effluent renovation, and numerous research studies have demonstrated the feasibility of this practice. However, the overriding criterion is the quality of the effluent: it has to be of exceptionally good quality to ensure that there are no public health impacts from aerosol drift. This is the main issue of concern, given the unreliability of effluent quality from aerobic systems. Secondly, it has been found that most householders do not take adequate care in the operation of spray irrigation systems or in the maintenance of the irrigation area. Under these circumstances, surface disposal of effluent should be approached with caution and requires appropriate householder education and stringent compliance requirements. Even so, the efficiency of the process will ultimately rest with the individual householder, and this is where most concern lies.

Greywater requires similar consideration. Surface irrigation of greywater is currently permitted in a number of local authority jurisdictions in Queensland. Considering that greywater constitutes the largest fraction of the total wastewater generated in a household, it can be regarded as a potential resource. Unfortunately, in most circumstances the only pretreatment required prior to reuse is the removal of oil and grease.
This is an issue of concern, as greywater can be considered a weak to medium sewage: it contains primary pollutants such as BOD material and nutrients, and may also carry microbial contamination. Its use for surface irrigation can therefore pose a potential health risk, compounded by the fact that most householders are unaware of the potential adverse impacts of indiscriminate greywater reuse. As in the case of blackwater effluent reuse, there have been suggestions that greywater should also be subject to stringent guidelines. Under these circumstances, the surface application of any wastewater requires careful consideration.

The other option available for the disposal of effluent is the use of evaporation systems; the use of evapotranspiration systems is covered in this report. Research has shown that these systems are susceptible to a number of factors, in particular climatic conditions, so their applicability is location specific. The design of systems based solely on evapotranspiration is also questionable; to ensure greater reliability, the systems should be designed to include soil absorption. The successful use of these systems for intermittent usage has been noted in the literature.

Taking into consideration the issues discussed above, subsurface disposal of effluent is the safest under most conditions, provided the facility has been designed to accommodate site conditions. The main problem associated with subsurface disposal is the formation of a clogging mat on the infiltrative surfaces. Once the clogging mat forms, the capacity of the soil to handle effluent is no longer governed by the soil's hydraulic conductivity as measured by the percolation test, but rather by the infiltration rate through the clogged zone. The characteristics of the clogging mat have been shown to be influenced by various soil and effluent characteristics, and the mechanisms of its formation by various physical, chemical and biological processes. Biological clogging is the most common process and occurs when bacterial growth or its by-products reduce the soil pore diameters; it is generally associated with anaerobic conditions.

The formation of the clogging mat provides significant benefits. It acts as an efficient filter for the removal of microorganisms, and because it increases the hydraulic impedance to flow, unsaturated flow conditions occur below the mat. This permits greater contact between effluent and soil particles, thereby enhancing the purification process, which is particularly important in highly permeable soils. However, the adverse impacts of clogging mat formation cannot be ignored, as it can lead to a significant reduction in the infiltration rate; this is in fact the most common cause of soil absorption system failure. As the formation of the clogging mat is inevitable, it is important to ensure that it does not impede effluent infiltration beyond tolerable limits. Various strategies have been investigated to either control clogging mat formation or remediate its severity. Intermittent dosing of effluent is one such strategy that has attracted considerable attention, although research conclusions with regard to short rest periods are contradictory.
It has been claimed that intermittent rest periods result in aerobic decomposition of the clogging mat, leading to a subsequent increase in the infiltration rate. Contrary to this, it has also been claimed that short rest periods are insufficient to completely decompose the clogging mat, and that the intermediate by-products of the aerobic processes in fact lead to even more severe clogging. It has been further recommended that rest periods should be much longer, in the range of about six months, which entails the provision of a second, alternating seepage bed. Other concepts that have been investigated are designing the bed to meet the equilibrium infiltration rate that eventuates after clogging mat formation; improved geometry, such as the use of seepage trenches instead of beds; serial instead of parallel effluent distribution; and low-pressure dosing of effluent. Physical measures such as oxidation with hydrogen peroxide and replacement of the infiltration surface have been shown to be of only short-term benefit.

Another important issue is the degree of pretreatment that should be provided to the effluent prior to subsurface application, and the influence of pollutant loadings on clogging mat formation. Laboratory studies have shown that the total mass loadings of BOD and suspended solids are important factors in the formation of the clogging mat, as is the nature of the suspended solids: the finer particles from extended aeration systems penetrate deeper into the soil than those from septic tanks, and hence ultimately form a denser clogging mat. However, the importance of improved pretreatment to clogging mat formation may need to be qualified in view of other research, which has shown that effluent quality may be a factor in highly permeable soils but not in fine-structured soils.

The ultimate test of onsite sewage treatment system efficiency rests with the final disposal of effluent. The implications of system failure, as evidenced by surface ponding of effluent or seepage of contaminants into the groundwater, can be very serious, leading to environmental and public health impacts. Significant microbial contamination of surface water and groundwater has been attributed to septic tank effluent, and there are a number of documented instances of septic-tank-related waterborne disease outbreaks affecting large numbers of people. In one recent incident, the local authority, and not the individual septic tank owners, was found liable for an outbreak of viral hepatitis A because no action had been taken to remedy septic tank failure. This illustrates the responsibility placed on local authorities to ensure the proper operation of onsite sewage treatment systems.

Even a properly functioning soil absorption system is only capable of removing phosphorus and microorganisms. The nitrogen remaining after plant uptake will not be retained in the soil column, but will instead gradually seep into the groundwater as nitrate; conditions for nitrogen removal by denitrification are not generally present in a soil absorption bed. Dilution by groundwater is the only treatment available for reducing the nitrogen concentration to specified levels. Based on subsurface conditions, this essentially entails a maximum allowable density of septic tanks in a given area.
Unfortunately, nitrogen is not the only wastewater pollutant of concern. Relatively long survival times and travel distances have been noted for microorganisms originating from soil absorption systems; this is likely to happen if saturated conditions persist under the soil absorption bed, or where effluent runs off at the surface as a result of system failure. Soils also have a finite capacity for the removal of phosphorus: once this capacity is exceeded, phosphorus too will seep into the groundwater, and the relatively high mobility of phosphorus in sandy soils has been noted in the literature. These issues have serious implications for the design and siting of soil absorption systems. It is important to ensure not only that the system design is based on subsurface conditions, but also that the density of these systems in a given area is controlled. This essentially involves the adoption of a land capability approach to determine the limitations of an individual site for onsite sewage disposal: the most limiting factor at a particular site determines the overall capability classification for that site, which in turn dictates the type of effluent disposal method to be adopted.
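A small worked example of the design principle noted above, namely that the bed must be sized on the long-term acceptance rate through the clogged zone rather than on the clean-soil percolation rate; all flows and rates below are assumed values for illustration only.

```python
# Illustrative sizing calculation: once the clogging mat forms, the bed must
# be sized on a long-term acceptance rate (LTAR) through the clogged zone,
# not on the clean-soil percolation rate. All values are assumed.
daily_flow_L = 4 * 150            # household flow: 4 persons x 150 L/day (assumed)
ltar_L_per_m2_day = 10.0          # assumed LTAR through the clogging mat
clean_soil_rate = 50.0            # assumed clean-soil infiltration, L/m2/day

area_clean = daily_flow_L / clean_soil_rate
area_ltar = daily_flow_L / ltar_L_per_m2_day
print(f"Bed area sized on clean soil: {area_clean:.0f} m2")
print(f"Bed area sized on LTAR:       {area_ltar:.0f} m2")  # much larger
```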

Abstract:

We examined the effects of progressive resistance training (PRT) and supplementation with calcium–vitamin D₃ fortified milk on markers of systemic inflammation, and the relationship between inflammation and changes in muscle mass, size and strength. Healthy men aged 50–79 years (n = 180) participated in this 18-month randomized controlled trial, which comprised a 2 × 2 factorial design. Participants were randomized to (1) PRT + fortified milk supplement, (2) PRT, (3) fortified milk supplement, or (4) a control group. Participants assigned to PRT trained 3 days per week, while those in the supplement groups consumed 400 ml/day of milk containing 1,000 mg calcium plus 800 IU vitamin D₃. We collected venous blood samples at baseline, 12 and 18 months to measure the serum concentrations of IL-6, TNF-α and hs-CRP. There were no exercise × supplement interactions, but serum IL-6 was 29% lower (95% CI, -62, 0) in the PRT group than in the control group after 12 months. Conversely, IL-6 was 31% higher (95% CI, -2, 65) in the supplement group than in the non-supplemented groups after 12 and 18 months. These between-group differences did not persist after adjusting for changes in fat mass. In the PRT group, mid-tibia muscle cross-sectional area increased less in men with higher pre-training inflammation than in men with lower inflammation (net difference ≈2.5%, p < 0.05). In conclusion, serum IL-6 concentration decreased following PRT, whereas it increased after supplementation with fortified milk, concomitant with changes in fat mass. Furthermore, low-grade inflammation at baseline restricted muscle hypertrophy following PRT.
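A minimal sketch of how a 2 × 2 factorial exercise × supplement interaction can be tested with an ordinary least-squares model; the data and variable names are synthetic assumptions, not the trial's analysis.

```python
# Sketch of a factorial analysis: testing an exercise x supplement
# interaction on (log) serum IL-6. Data and effect sizes are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 180
df = pd.DataFrame({
    "prt": rng.integers(0, 2, n),    # 1 = progressive resistance training
    "milk": rng.integers(0, 2, n),   # 1 = fortified milk supplement
})
df["log_il6"] = 0.5 - 0.3 * df["prt"] + 0.25 * df["milk"] + rng.normal(0, 0.4, n)

model = smf.ols("log_il6 ~ prt * milk", data=df).fit()  # '*' adds the interaction
print(model.summary().tables[1])
```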

Abstract:

Objective: The expedited 10 g protein counter (EP-10) is a quick and valid clinical tool for dietary protein quantification. This study aimed to assess the clinical effectiveness of the EP-10 in improving serum albumin and transferrin levels in chronic hemodialysis patients. Methods: Forty-five patients with low serum albumin (<38 g/L) were enrolled. Parameters measured included dry weight, height, dietary intake, levels of serum albumin, transferrin, potassium and phosphate, and kinetic modeling (Kt/V). The nutritional intervention incorporated the EP-10 in two ways: (1) to quantify the protein intake of patients, and (2) to educate patients to meet their protein requirements. Mean values of the nutritional parameters before and after intervention were compared using a paired t-test. Results: Three months after nutritional intervention, mean albumin levels increased significantly from 32.2±4.8 g/L to 37.0±3.2 g/L (p<0.001). Thirty-eight (84%) patients showed an increase in albumin levels, while two (4%) maintained their levels. Of the thirty-six (80%) patients with low transferrin levels (<200 mg/dL), 28 (78%) had an increase and two maintained their levels post-intervention. Mean transferrin levels increased significantly from 169.4±39.9 mg/dL to 180.9±38.1 mg/dL (p<0.05). Conclusion: Nutritional intervention incorporating the EP-10 method can significantly improve albumin and transferrin levels in chronic hemodialysis patients.
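A minimal sketch of the pre/post comparison described above, a paired t-test on serum albumin; the values are synthetic, with only the test procedure mirroring the abstract.

```python
# Sketch: paired t-test on serum albumin before and after an intervention.
# Values are synthetic stand-ins generated to resemble the reported means.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
albumin_pre = rng.normal(32.2, 4.8, 45)                 # g/L, before intervention
albumin_post = albumin_pre + rng.normal(4.8, 2.0, 45)   # g/L, three months later

t_stat, p_value = stats.ttest_rel(albumin_pre, albumin_post)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```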

Abstract:

Diabetic foot ulcers are among the diabetes complications most likely to lead to hospitalisation and contribute to many leg amputations. Trained diabetic foot teams and specialists managing diabetic foot ulcers have demonstrated reductions in amputations and hospitalisation of up to 90%. Few such teams exist in Australia; access is therefore limited for many geographical populations, which may partly explain the high rates of hospitalisation. Aim: This pilot study aims to analyse whether local clinicians managing diabetic foot complications report improved access to diabetic foot specialists and improved outcomes with the introduction of a telehealth store-and-forward system. Method: A store-and-forward telehealth system was implemented in six different Queensland locations between August 2009 and February 2010. Sites were offered ad hoc and/or fortnightly telehealth access to a diabetic foot speciality service. Six months after commencement of the trial, a survey was sent to the 14 eligible clinicians involved to gauge their perceptions of the telehealth system. Results: Eight participants returned the surveys. The majority of responding clinicians reported that the telehealth system was easy to use (100%), improved their access to diabetic foot speciality services (75%), improved upskilling of local diabetes service staff (100%) and improved patient outcomes (100%). Conclusion: This pilot study suggests that clinicians found the store-and-forward telehealth system very useful in improving access to speciality services, clinical skills and patient outcomes. It supports the recommendation that telehealth systems be made available for diabetic foot ulcer management.

Abstract:

Purpose The primary objective of this study was to examine the effect of exercise on subjective sleep quality in heart failure patients. Methods This study used a randomised, controlled trial design with blinded end-point analysis. Participants were randomly assigned to a 12-week programme of education and self-management support (control) or to the same programme with the addition of a tailored physical activity programme designed and supervised by an exercise specialist (intervention). The intervention consisted of 1 hour of aerobic and resistance exercise twice a week. Participants included 108 patients referred to three hospital heart failure services in Queensland, Australia. Results Patients who participated in supervised exercise classes showed significant improvement in subjective sleep quality, sleep latency, sleep disturbance and global sleep quality scores after 12 weeks of supervised hospital-based exercise. Secondary analysis showed that improvements in sleep quality were correlated with improvements in geriatric depression score (p<0.005) and exercise performance (p=0.03). General linear models were used to examine whether the changes in sleep quality following intervention occurred independently of changes in depression, exercise performance and weight, with separate models adjusting for each covariate. Results suggest that exercise significantly improved sleep quality independent of changes in depression, exercise performance and weight. Conclusion This study supports the hypothesis that a 12-week programme of aerobic and resistance exercise improves subjective sleep quality in patients with heart failure. This is the first randomised controlled trial to examine the role of exercise in the improvement of sleep quality for patients with this disease. While this study establishes exercise as a therapy for poor sleep quality, further research is needed to investigate exercise as a treatment for other parameters of sleep in this population. Study investigators plan to undertake a more in-depth examination within the next 12 months.

Abstract:

Purpose Exercise for Health was a randomized, controlled trial designed to evaluate two modes of delivering an 8-month translational exercise intervention (face-to-face [FtF] and over-the-telephone [Tel]), commencing 6 weeks post-surgery (PS) for breast cancer. Methods Outcomes included quality of life (QoL), function (fitness and upper-body function) and treatment-related side effects (fatigue, lymphoedema, body mass index, menopausal symptoms, anxiety, depression and pain). Generalised estimating equation modelling determined time (baseline [5 weeks PS], mid-intervention [6 months PS], post-intervention [12 months PS]), group (FtF, Tel, Usual Care [UC]) and time-by-group effects. 194 women representative of the breast cancer population were randomised to the FtF (n=67), Tel (n=67) and UC (n=60) groups. Results There were significant (p<0.05) interaction effects on QoL, fitness and fatigue, with differences observed between the treatment groups and the UC group. Trends observed for the two treatment groups were similar: both reported improved QoL, fitness and fatigue over time, and the changes observed between baseline and post-intervention were clinically relevant. In contrast, the UC group experienced no change, or worsening QoL, fitness and fatigue, at mid-intervention. Although improvements in the UC group occurred by 12 months post-surgery, the change did not meet the clinically relevant threshold. There were no differences in other treatment-related side effects between groups. Conclusion This translational intervention trial, delivered either face-to-face or over-the-telephone, supports exercise as a form of adjuvant breast cancer therapy that can prevent declines in fitness and function during treatment and optimise recovery post-treatment.
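A minimal sketch of the generalised estimating equation approach described above, with group, time and time-by-group terms and an exchangeable working correlation; the data, time coding and effect sizes are invented for illustration.

```python
# Sketch: repeated QoL measures modelled with GEE, clustering on participant.
# Synthetic data; group effects and time coding are assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n, times = 194, [0, 6, 12]   # participants; months post-baseline (assumed coding)
rows = []
for pid in range(n):
    group = rng.choice(["FtF", "Tel", "UC"])
    for t in times:
        qol = 70 + (2 if group != "UC" else 0) * (t / 6) + rng.normal(0, 5)
        rows.append({"pid": pid, "group": group, "time": t, "qol": qol})
df = pd.DataFrame(rows)

model = smf.gee("qol ~ C(group) * time", groups="pid", data=df,
                cov_struct=sm.cov_struct.Exchangeable()).fit()
print(model.summary())
```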

Abstract:

Background Total hip arthroplasty (THA) is a commonly performed procedure and numbers are increasing with ageing populations. One of the most serious complications of THA is the surgical site infection (SSI), caused by pathogens entering the wound during the procedure. SSIs are associated with a substantial burden for health services, increased mortality and reduced functional outcomes in patients. Numerous approaches to preventing these infections exist, but there is no gold standard in practice and the cost-effectiveness of alternative strategies is largely unknown. Objectives The aim of this project was to evaluate the cost-effectiveness of strategies claiming to reduce deep surgical site infections following total hip arthroplasty in Australia. The objectives were:
1. identification of competing strategies, or combinations of strategies, that are clinically relevant to the control of SSI related to hip arthroplasty;
2. evidence synthesis and pooling of results to assess the volume and quality of evidence claiming to reduce the risk of SSI following total hip arthroplasty;
3. construction of an economic decision model incorporating cost and health outcomes for each of the identified strategies;
4. quantification of the effect of uncertainty in the model; and
5. assessment of the value of perfect information among model parameters to inform future data collection.
Methods The literature relating to SSI in THA was reviewed, in particular to establish definitions of these concepts; to understand mechanisms of aetiology and microbiology, risk factors, diagnosis and consequences; and to give an overview of existing infection prevention measures. Published economic evaluations on this topic were also reviewed and their limitations for Australian decision-makers identified. A Markov state-transition model was developed for the Australian context and subsequently validated by clinicians. The model was designed to capture key events related to deep SSI occurring within the first 12 months following primary THA. Relevant infection prevention measures were selected by reviewing clinical guideline recommendations combined with expert elicitation. Strategies selected for evaluation were the routine use of pre-operative antibiotic prophylaxis (AP) versus no antibiotic prophylaxis (No AP), or AP in combination with antibiotic-impregnated cement (AP & ABC) or laminar air operating rooms (AP & LOR). The best available evidence for clinical effect size and utility parameters was harvested from the medical literature using reproducible methods. Queensland hospital data were extracted to inform patients' transitions between model health states and the related costs captured in assigned treatment codes. Costs related to infection prevention were derived from reliable hospital records and expert opinion. Uncertainty in model input parameters was explored in probabilistic sensitivity analyses and scenario analyses, and the value of perfect information was estimated. Results The cost-effectiveness analysis was performed from a health services perspective using a hypothetical cohort of 30,000 THA patients aged 65 years. The baseline rate of deep SSI was 0.96% within one year of a primary THA. The routine use of antibiotic prophylaxis (AP) was highly cost-effective and resulted in cost savings of over $1.6m whilst generating an extra 163 QALYs (without consideration of uncertainty).
Deterministic and probabilistic analysis (considering uncertainty) identified antibiotic prophylaxis combined with antibiotic-impregnated cement (AP & ABC) as the most cost-effective strategy. Using AP & ABC generated the highest net monetary benefit (NMB), an incremental $3.1m NMB compared with using antibiotic prophylaxis alone, and the probability that it does not have the largest NMB was very low (<5%). Not using antibiotic prophylaxis (No AP), or using antibiotic prophylaxis combined with laminar air operating rooms (AP & LOR), resulted in worse health outcomes and higher costs. Sensitivity analyses showed that the model was sensitive to the initial cohort starting age and the additional costs of ABC, but the best strategy did not change, even for extreme values. The cost-effectiveness improved with a higher proportion of cemented primary THAs and higher baseline rates of deep SSI. The value of perfect information indicated that no additional research is required to support the model conclusions. Conclusions Preventing deep SSI with antibiotic prophylaxis and antibiotic-impregnated cement has been shown to improve health outcomes among hospitalised patients, save lives and enhance resource allocation. By implementing a more beneficial infection control strategy, scarce health care resources can be used more efficiently to the benefit of all members of society. The results of this project provide Australian policy makers with key information about how to efficiently manage risks of infection in THA.
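A highly simplified sketch of a Markov state-transition cohort model of the kind described above; the state space, transition probabilities, costs, utilities and willingness-to-pay threshold are all invented, not the project's calibrated values.

```python
# Sketch: a cohort cycles monthly through three states for 12 months,
# accruing costs and QALYs. All numbers below are invented for illustration.
import numpy as np

states = ["well", "deep_SSI", "dead"]           # simplified state space
P = np.array([                                  # monthly transition matrix (assumed)
    [0.998, 0.001, 0.001],                      # well -> well / deep SSI / dead
    [0.100, 0.880, 0.020],                      # deep SSI may resolve or persist
    [0.000, 0.000, 1.000],                      # dead is absorbing
])
cost = np.array([0.0, 8000.0, 0.0])             # monthly cost per state, AUD (assumed)
utility = np.array([0.80, 0.55, 0.0]) / 12      # monthly QALY weight (assumed)

cohort = np.array([30_000.0, 0.0, 0.0])         # everyone starts in 'well'
total_cost = total_qaly = 0.0
for _ in range(12):                             # 12 monthly cycles
    cohort = cohort @ P
    total_cost += cohort @ cost
    total_qaly += cohort @ utility

wtp = 50_000                                    # willingness to pay per QALY (assumed)
print(f"Cost ${total_cost:,.0f}, QALYs {total_qaly:,.0f}, "
      f"NMB ${wtp * total_qaly - total_cost:,.0f}")
```

Comparing strategies then amounts to re-running the model with strategy-specific transition probabilities and upfront costs, and ranking the resulting NMBs.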

Abstract:

Background and aims: Lower-limb lymphoedema is a serious and feared sequela of treatment for gynaecological cancer. Given the limited prospective data on the incidence of, and risk factors for, lymphoedema after treatment for gynaecological cancer, we initiated a prospective cohort study in 2008. Methods: Data were available for 353 women with malignant disease. Participants were assessed before treatment and at regular intervals for two years after treatment. Follow-up visits were grouped into time periods of six weeks to six months (time 1), nine to 15 months (time 2), and 18 to 24 months (time 3). Preliminary analyses up to time 2 used generalised estimating equations with the best-fitting covariance structure to model the repeated measures of Functional Assessment of Cancer Therapy-General (FACT-G) quality of life (QoL) scores and self-reported swelling at each follow-up period. Results: Depending on the time period, between 30% and 40% of patients self-reported swelling of the lower limb. The QoL of those with self-reported swelling was lower at all time periods than that of those without swelling. Mean (95% CI) FACT-G scores at times 0, 1 and 2 were 80.7 (78.2, 83.2), 83.0 (81.0, 85.0) and 86.3 (84.2, 88.4), respectively, for those with swelling, and 85.0 (83.0, 86.9), 86.0 (84.1, 88.0) and 88.9 (87.0, 90.7), respectively, for those without swelling. Conclusions: Lower-limb swelling adversely influences QoL and change in QoL over time in patients with gynaecological cancer.

Abstract:

Background: Women who birth in private facilities in Australia are more likely to have a caesarean birth than women who birth in public facilities and these differences remain after accounting for sector differences in the demographic and health risk profiles of women. However, the extent to which women’s preferences and/or freedom to choose their mode of birth further account for differences in the likelihood of caesarean birth between the sectors remains untested. Method: Women who birthed in Queensland, Australia during a two-week period in 2009 were mailed a self-report survey approximately three months after birth. Seven hundred and fifty-seven women provided cross-sectional retrospective data on where they birthed (public or private facility), mode of birth (vaginal or caesarean) and risk factors, along with their preferences and freedom to choose their mode of birth. A hierarchical logistic regression was conducted to determine the extent to which maternal risk and freedom to choose one’s mode of birth explain sector differences in the likelihood of having a caesarean birth. Findings: While there was no sector difference in women’s preference for mode of birth, women who birthed in private facilities had higher odds of feeling able to choose either a vaginal or caesarean birth, and feeling able to choose only a caesarean birth. Women had higher odds of having caesarean birth if they birthed in private facilities, even after accounting for significant risk factors such as age, body mass index, previous caesarean and use of assisted reproductive technology. However, there was no association between place of birth and odds of having a caesarean birth after also accounting for freedom to choose one’s mode of birth. Conclusions: These findings call into question suggestions that the higher caesarean birth rate in the private sector in Australia is attributable to increased levels of obstetric risk among women birthing in the private sector or maternal preferences alone. Instead, the determinants of sector differences in the likelihood of caesarean births are complex and are linked to differences in the perceived choices for mode of birth between women birthing in the private and public systems.
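A minimal sketch of a hierarchical (blockwise) logistic regression of the kind described above, in which the sector effect is compared before and after entering perceived freedom to choose; variable names and data are synthetic assumptions.

```python
# Sketch: risk factors are entered first (block 1), then perceived freedom
# to choose (block 2), and the sector coefficient is compared across blocks.
# All data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 757
df = pd.DataFrame({
    "caesarean": rng.integers(0, 2, n),
    "private": rng.integers(0, 2, n),
    "age": rng.normal(31, 5, n),
    "prev_caesarean": rng.integers(0, 2, n),
    "felt_could_choose": rng.integers(0, 2, n),
})

block1 = smf.logit("caesarean ~ private + age + prev_caesarean", df).fit(disp=False)
block2 = smf.logit("caesarean ~ private + age + prev_caesarean + felt_could_choose",
                   df).fit(disp=False)
# If the 'private' odds ratio attenuates toward 1 once choice is entered, the
# sector difference is statistically accounted for by perceived freedom to choose.
print(np.exp(block1.params["private"]), np.exp(block2.params["private"]))
```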

Abstract:

Background: Malaria is a major public health burden in the tropics, with the potential to increase significantly in response to climate change. Analyses of data from the recent past can elucidate how short-term variations in weather factors affect malaria transmission. This study explored the impact of climate variability on the transmission of malaria in the tropical rain forest area of Mengla County, south-west China. Methods: Ecological time-series analysis was performed on data collected between 1971 and 1999. Auto-regressive integrated moving average (ARIMA) models were used to evaluate the relationship between weather factors and malaria incidence. Results: At the monthly time scale, the predictors of malaria incidence included minimum temperature, maximum temperature and fog day frequency. The effect of minimum temperature on malaria incidence was greater in the cool months than in the hot months. The fog day frequency in October had a positive effect on malaria incidence in May of the following year. At the annual time scale, the annual fog day frequency was the only weather predictor of the annual incidence of malaria. Conclusion: Fog day frequency was, for the first time, found to be a predictor of malaria incidence in a rain forest area. The delayed effect of fog on malaria transmission may involve providing water input and maintaining aquatic breeding sites for mosquitoes at vulnerable times, when there is little rainfall during the six-month dry season. These findings should be considered in the prediction of future patterns of malaria for similar tropical rain forest areas worldwide.
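A minimal sketch of an ARIMA model with weather covariates, including fog day frequency lagged seven months (October to the following May); the model order, lag and data are assumptions for illustration, not the study's fitted model.

```python
# Sketch: monthly malaria incidence regressed on weather covariates via an
# ARIMA model with exogenous regressors. Synthetic data; assumed order and lag.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(5)
n = 348                                   # months, 1971-1999
fog = pd.Series(rng.poisson(6, n), dtype=float)
tmin = pd.Series(20 + 5 * np.sin(np.arange(n) * 2 * np.pi / 12) + rng.normal(0, 1, n))
fog_lag7 = fog.shift(7).fillna(fog.mean())            # October -> following May
incidence = 50 + 2 * tmin + 1.5 * fog_lag7 + rng.normal(0, 5, n)

exog = pd.concat([tmin.rename("tmin"), fog_lag7.rename("fog_lag7")], axis=1)
model = ARIMA(incidence, exog=exog, order=(1, 0, 1)).fit()
print(model.params)
```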

Abstract:

Background: Falciparum malaria is the most deadly of the four main types of human malaria. Although great success has been achieved since the launch of the National Malaria Control Programme in 1955, malaria remains a serious public health problem in China. This paper aimed to analyse the geographic distribution, demographic patterns and time trends of falciparum malaria in China. Methods: The annual numbers of falciparum malaria cases during 1992–2003, and the individual case reports of clinical falciparum malaria during 2004–2005, were extracted from communicable disease information systems at the Chinese Center for Disease Control and Prevention. The annual number of cases and the annual incidence were mapped by matching them to the corresponding province- and county-level administrative units in a geographic information system. The distribution of falciparum malaria by age, gender and origin of infection was analysed. Time-series analysis was conducted to investigate the relationship between falciparum malaria in the endemic provinces and imported falciparum malaria in non-endemic provinces. Results: Falciparum malaria was endemic in two provinces of China during 2004–05, while imported malaria was reported in 26 non-endemic provinces. Annual incidence of falciparum malaria was mapped at county level in the two endemic provinces, Yunnan and Hainan. The sex ratio (male vs. female) for the number of cases in Yunnan was 1.6 among children aged 0–15 years and reached 5.7 among adults over 15 years of age. The number of malaria cases in Yunnan was positively correlated with the imported malaria of concurrent months in the non-endemic provinces. Conclusion: The endemic area of falciparum malaria in China has remained restricted to two provinces, Yunnan and Hainan. Stable transmission occurs in the border region of Yunnan and the hilly, forested south of Hainan. The age and gender distribution in the endemic area is characterised by a predominance of cases in adult men. Imported falciparum malaria in the non-endemic area of China, driven mainly by malaria transmission in Yunnan, has increased both spatially and temporally. Specific intervention measures targeted at mobile population groups are warranted.
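A minimal sketch of the concurrent-month relationship reported above, correlating monthly cases in an endemic province with imported cases in non-endemic provinces; the series are synthetic stand-ins for the surveillance data.

```python
# Sketch: Pearson correlation between two concurrent monthly case series.
# Both series are synthetic, with a deliberately induced link.
import numpy as np

rng = np.random.default_rng(6)
months = 24                                   # 2004-2005
yunnan_cases = rng.poisson(200, months)
imported_cases = 0.2 * yunnan_cases + rng.poisson(10, months)  # induced link

r = np.corrcoef(yunnan_cases, imported_cases)[0, 1]
print(f"Pearson r (concurrent months) = {r:.2f}")
```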