968 results for Central cost


Relevance: 20.00%

Abstract:

Background
People admitted to intensive care units and those with chronic health care problems often require long-term vascular access. Central venous access devices (CVADs) are used for administering intravenous medications and blood sampling. CVADs are covered with a dressing and secured with an adhesive or adhesive tape to protect them from infection and reduce movement. Dressings are changed when they become soiled with blood or start to come away from the skin. Repeated removal and application of dressings can cause damage to the skin. The skin is an important barrier that protects the body against infection. Less frequent dressing changes may reduce skin damage, but it is unclear whether this practice affects the frequency of catheter-related infections.

Objectives
To assess the effect of the frequency of CVAD dressing changes on the incidence of catheter-related infections and other outcomes, including pain and skin damage.

Search methods
In June 2015 we searched: the Cochrane Wounds Specialised Register; the Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library); Ovid MEDLINE; Ovid MEDLINE (In-Process & Other Non-Indexed Citations); Ovid EMBASE; and EBSCO CINAHL. We also searched clinical trials registries for registered trials. There were no restrictions with respect to language, date of publication or study setting.

Selection criteria
All randomised controlled trials (RCTs) evaluating the effect of the frequency of CVAD dressing changes on the incidence of catheter-related infections in patients in any healthcare setting.

Data collection and analysis
We used standard Cochrane review methodology. Two review authors independently assessed studies for inclusion and performed risk of bias assessment and data extraction. We undertook meta-analysis where appropriate; otherwise we synthesised heterogeneous data descriptively.

Main results
We included five RCTs (2277 participants) that compared different frequencies of CVAD dressing changes. The studies were all conducted in Europe and published between 1995 and 2009. Participants were recruited from the intensive care and cancer care departments of one children's and four adult hospitals. The studies used a variety of transparent dressings and compared a longer interval between dressing changes (5 to 15 days; intervention) with a shorter interval between changes (2 to 5 days; control). In each study participants were followed up until the CVAD was removed or until discharge from ICU or hospital.

- Confirmed catheter-related bloodstream infection (CRBSI): one trial randomised 995 people receiving central venous catheters to a longer or shorter interval between dressing changes and measured CRBSI. It is unclear whether there is a difference in the risk of CRBSI between people having long or short intervals between dressing changes (RR 1.42, 95% confidence interval (CI) 0.40 to 4.98) (low quality evidence).
- Suspected catheter-related bloodstream infection: two trials randomised a total of 151 participants to longer or shorter dressing intervals and measured suspected CRBSI. It is unclear whether there is a difference in the risk of suspected CRBSI between people having long or short intervals between dressing changes (RR 0.70, 95% CI 0.23 to 2.10) (low quality evidence).
- All cause mortality: three trials randomised a total of 896 participants to longer or shorter dressing intervals and measured all cause mortality. It is unclear whether there is a difference in the risk of death from any cause between people having long or short intervals between dressing changes (RR 1.06, 95% CI 0.90 to 1.25) (low quality evidence).
- Catheter-site infection: two trials randomised a total of 371 participants to longer or shorter dressing intervals and measured catheter-site infection. It is unclear whether there is a difference in risk of catheter-site infection between people having long or short intervals between dressing changes (RR 1.07, 95% CI 0.71 to 1.63) (low quality evidence).
- Skin damage: one small trial (112 children) and three trials (1475 adults) measured skin damage. There was very low quality evidence for the effect of long intervals between dressing changes on skin damage compared with short intervals (children: RR of scoring ≥ 2 on the skin damage scale 0.33, 95% CI 0.16 to 0.68; data for adults not pooled).
- Pain: two studies involving 193 participants measured pain. It is unclear if there is a difference between long and short interval dressing changes in pain during dressing removal (RR 0.80, 95% CI 0.46 to 1.38) (low quality evidence).

Authors' conclusions
The best available evidence is currently inconclusive regarding whether longer intervals between CVAD dressing changes are associated with more or less catheter-related infection, mortality or pain than shorter intervals.
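The risk ratios and confidence intervals quoted in these results follow the standard log-normal approximation for comparing two event proportions. As an illustrative sketch only, using made-up counts rather than any trial's raw data, the calculation looks like this in Python:

```python
import math

def risk_ratio_ci(a, n1, b, n2, z=1.96):
    """Risk ratio of event proportions a/n1 vs b/n2, with a 95% CI
    from the standard log-normal approximation."""
    rr = (a / n1) / (b / n2)
    # standard error of log(RR)
    se_log_rr = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo = math.exp(math.log(rr) - z * se_log_rr)
    hi = math.exp(math.log(rr) + z * se_log_rr)
    return rr, lo, hi

# hypothetical counts for illustration only (not the trial's data):
# 7 infections among 500 (long interval) vs 5 among 495 (short interval)
rr, lo, hi = risk_ratio_ci(7, 500, 5, 495)
```

A confidence interval spanning 1.0, as in the CRBSI result above, is exactly why such comparisons are reported as "unclear".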

Relevance: 20.00%

Abstract:

Background
There has been considerable publicity regarding population ageing and hospital emergency department (ED) overcrowding. Our study aims to investigate the impact of one intervention piloted in Queensland, Australia, the Hospital in the Nursing Home (HiNH) program, on reducing ED and hospital attendances from residential aged care facilities (RACFs).

Methods
A quasi-experimental study was conducted at an intervention hospital undertaking the program and a control hospital with normal practice. Routine Queensland health information system data were extracted for analysis.

Results
Significant reductions in the number of ED presentations per 1000 RACF beds (rate ratio (95% CI): 0.78 (0.67–0.92); p = 0.002), the number of hospital admissions per 1000 RACF beds (0.62 (0.50–0.76); p < 0.0001), and the number of hospital admissions per 100 ED presentations (0.61 (0.43–0.85); p = 0.004) were observed in the intervention hospital after the intervention, while there were no significant differences between the intervention and control hospitals before the intervention. Pre-test and post-test comparison within the intervention hospital also showed significant decreases in the ED presentation rate (0.75 (0.65–0.86); p < 0.0001) and the hospital admission rate per RACF bed (0.66 (0.54–0.79); p < 0.0001), and a non-significant reduction in the hospital admission rate per ED presentation (0.82 (0.61–1.11); p = 0.196).

Conclusions
The Hospital in the Nursing Home program could be effective in reducing ED presentations and hospital admissions from RACF residents. Implementation of the program across a variety of settings is needed to fully assess the ongoing benefits for patients and any possible cost savings.
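The rate ratios reported above compare event counts against a common exposure denominator (RACF beds). A minimal sketch of that calculation, using the usual log-normal approximation for Poisson counts and hypothetical numbers rather than the study's data:

```python
import math

def rate_ratio_ci(e1, n1, e0, n0, z=1.96):
    """Rate ratio of events per unit exposure (e.g. per 1000 RACF beds),
    with a 95% CI from the log-normal approximation for Poisson counts."""
    rr = (e1 / n1) / (e0 / n0)
    se_log_rr = math.sqrt(1 / e1 + 1 / e0)
    lo = math.exp(math.log(rr) - z * se_log_rr)
    hi = math.exp(math.log(rr) + z * se_log_rr)
    return rr, lo, hi

# hypothetical counts for illustration: 390 ED presentations over 500 beds
# after the intervention vs 500 presentations over 500 beds before it
rr, lo, hi = rate_ratio_ci(390, 500, 500, 500)
```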

Relevance: 20.00%

Abstract:

Contains Board of Directors minutes (1903, 1907), Executive Committee minutes (1907), Removal Committee minutes (1903-1917), Annual Reports (1910, 1913), Monthly Reports (1901-1919), Monthly Bulletins (1914-1915), studies of those removed, Bressler's "The Removal Work, Including Galveston," and several papers relating to the IRO and immigration. Financial papers include a budget (1914), comparative per capita cost figures (1909-1922), audits (1915-1918), receipts and expenditures (1918-1922), investment records, bank balances (1907-1922), removal work cash book (1904-1911), office expenses cash account (1903-1906), and the financial records of other agencies working with the IRO (1906). Includes also removal case records of first the Jewish Agricultural Society (1899-1900), and then of the IRO (1901-1922) when it took over its work, family reunion case records (1901-1904), and the follow-up records of persons removed to various cities (1903-1914). Contains also the correspondence of traveling agents' contacts throughout the U.S. from 1905-1914, among them Stanley Bero, Henry P. Goldstein, Philip Seman, and Morris D. Waldman.

Relevance: 20.00%

Abstract:

There is an increasing requirement for more astute land resource management through efficiencies in agricultural inputs in sugarcane production systems. A precision agriculture (PA) approach can provide a pathway to a sustainable sugarcane production system. One of the impediments to the adoption of PA practices is access to paddock-scale mapping layers displaying variability in soil properties, crop growth and surface drainage. Variable rate application (VRA) of nutrients is an important component of PA. However, agronomic expertise within PA systems has fallen well behind the significant advances in PA technologies. Generally, advisers in the sugar industry have a poor comprehension of the complex interaction of variables that contribute to within-paddock variations in crop growth. This is regarded as a significant impediment to the progression of PA in sugarcane and is one of the reasons for the poor adoption of VRA of nutrients in a PA approach to improved sugarcane production. This project has therefore established a number of key objectives which will contribute to the adoption of PA and the staged progression of VRA supported by relevant and practical agronomic expertise. These objectives include: provision of base soil attribute mapping for a large sector of the central cane-growing region, derived from Veris 3100 electrical conductivity (EC) surveys and digital elevation datasets collected with GPS mapping technology; analysis of archived satellite imagery to determine the location and stability of yield patterns over time and under varying seasonal conditions at selected project study sites; establishment of experiments to determine appropriate VRA nitrogen rates on various soil types subjected to extended anaerobic conditions; and establishment of trials to determine nitrogen rates applicable to the declining yield potential associated with the aging of ratoons in the crop cycle.

Preliminary analysis of archived yield estimation data indicates that yield patterns remain relatively stable over time. Results also indicate that where there is considerable variability in EC values there is also significant variation in yield.

Relevance: 20.00%

Abstract:

Background
A cancer diagnosis elicits greater distress than any other medical diagnosis, and yet very few studies have evaluated the efficacy of structured online self-help therapeutic programs in alleviating this distress. This study aims to assess the efficacy over time of an internet Cognitive Behaviour Therapy (iCBT) intervention ('Finding My Way') in improving distress, coping and quality of life for individuals with a recent diagnosis of early stage cancer of any type.

Methods/Design
The study is a multi-site Randomised Controlled Trial (RCT) seeking to enrol 188 participants, who will be randomised to either the Finding My Way intervention or an attention-control condition. Both conditions are delivered online, with six modules released once per week and an additional booster module released one month after program completion. Participants complete online questionnaires on four occasions: at baseline (immediately prior to accessing the modules); post-treatment (immediately after program completion); and then three and six months later. Primary outcomes are general distress and cancer-specific distress, with secondary outcomes including Health-Related Quality of Life (HRQoL), coping, health service utilisation, intervention adherence, and user satisfaction. A range of baseline measures will be assessed as potential moderators of outcomes. Eligible participants are individuals recently diagnosed with any type of cancer, being treated with curative intent, aged over 18 years, with sufficient English language literacy, internet access, and an active email account and phone number. Participants are blinded to treatment group allocation. Randomisation is computer generated and stratified by gender.

Discussion
Compared to the few prior published studies, Finding My Way will be the first adequately powered trial to offer an iCBT intervention to curatively treated patients of heterogeneous cancer types in the immediate post-diagnosis/treatment period. If found efficacious, Finding My Way will assist with overcoming common barriers to face-to-face therapy in a cost-effective and accessible way, thus helping to reduce distress after cancer diagnosis and consequently decrease the cancer burden for individuals and the health system.

Trial registration
Australian New Zealand Clinical Trials Registry ACTRN12613000001796, 16.10.13.
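The protocol specifies computer-generated randomisation stratified by gender. One common way to implement this is permuted-block randomisation within each stratum; the sketch below is a generic illustration, and the block size, arm labels and seed are assumptions rather than details from the trial:

```python
import random

def permuted_block_randomise(participants, block_size=4, seed=2013):
    """Allocate (participant_id, stratum) pairs to two arms using
    permuted blocks within each stratum, keeping the arms balanced."""
    rng = random.Random(seed)
    allocation = {}
    strata = {}
    for pid, stratum in participants:
        strata.setdefault(stratum, []).append(pid)
    for stratum, ids in strata.items():
        arms = []
        for pid in ids:
            if not arms:  # start a new shuffled block with equal arm counts
                arms = ["intervention", "control"] * (block_size // 2)
                rng.shuffle(arms)
            allocation[pid] = arms.pop()
    return allocation

# hypothetical cohort: 16 participants, stratified by gender
people = [(i, "F" if i % 2 else "M") for i in range(16)]
alloc = permuted_block_randomise(people)
```

Blocking within strata guarantees near-equal arm sizes per gender even if recruitment stops mid-stream, which is the usual motivation for stratified designs like this one.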

Relevance: 20.00%

Abstract:

Over the last decade, many countries have used performance-based contracting (PBC) to manage and maintain roads. The implementation of PBC provides additional benefits for the government and public, such as cost savings and improved conditions of contracted road assets. In Australia, PBC is already being implemented on all categories of roads: national, state, urban and rural. The Australian PBC arrangement is designed to turn over control of, and responsibility for, roadway system maintenance, rehabilitation and capital improvement projects to private contractors. Contractors' responsibilities include determining treatment types and the design, programming and undertaking of the works needed to maintain road networks at predetermined performance levels. Indonesia initiated two PBC pilot projects in 2011: the Pantura section Demak–Trengguli (7.68 kilometers) in Central Java Province and the section Ciasem–Pamanukan (18.5 kilometers) in West Java Province. Both sections are categorized as national roads, and the contract duration for both projects is four years. To facilitate a possible way forward, it is proposed to conduct a study of Australia's experience of advancing from pilot projects to nation-wide programs using PBC. The study focuses on the scope of contracts, bidding processes, risk allocation and key drivers, using relevant PBC case studies from Australia. Recommendations for future nation-wide PBC deployment should be based on further research into risk allocation, including investigation of standard conditions of contract and the implications of contract clauses for the risk management strategies to be adopted by contractors. Based on the nature of the risks, some are best managed by the project owner. It is very important that all parties involved be open to the new rules of contract and convince themselves of the potential benefits of the use of PBC. The most recent challenging issues are explored and described.

Relevance: 20.00%

Abstract:

Sustainable management of native pastures requires an understanding of the bounds of pasture composition, cover and soil surface condition within which healthy pastoral landscapes persist. A survey of 107 Aristida/Bothriochloa pasture sites in inland central Queensland was conducted. The sites were chosen for their current diversity of tree cover, apparent pasture condition and soil type to assist in setting more objective bounds on condition ‘states’ in such pastures. Assessors’ estimates of pasture condition were strongly correlated with herbage mass (r = 0.57) and projected ground cover (r = 0.58), and moderately correlated with pasture crown cover (r = 0.35) and tree basal area (r = 0.32). Pasture condition was not correlated with pasture plant density or the frequency of simple guilds of pasture species. The soil type of Aristida/Bothriochloa pasture communities was generally hard-setting, low in cryptogam cover but moderately covered with litter and projected ground cover (30–50%). There was no correlation between projected ground cover of pasture and estimated ground-level cover of plant crowns. Tree basal area was correlated with broad categories of soil type, probably because greater tree clearing has occurred on the more fertile, heavy-textured clay soils. Of the main perennial grasses, some showed strong soil preferences, for example Tripogon loliiformis for hard-setting soils and Dichanthium sericeum for clays. Common species, such as Chrysopogon fallax and Heteropogon contortus, had no strong soil preference. Wiregrasses (Aristida spp.) tended to be uncommon at both ends of the estimated pasture condition scale whereas H. contortus was far more common in pastures in good condition. Sedges (Cyperaceae) were common on all soil types and for all pasture condition ratings. Plants identified as increaser species were Tragus australianus, daisies (Asteraceae) and potentially toxic herbaceous legumes such as Indigofera spp. and Crotalaria spp.
Pasture condition could not be reliably predicted based on the abundance of a single species or taxon but there may be scope for using integrated data for four to five ecologically contrasting plants such as Themeda triandra with daisies, T. loliiformis and flannel weeds (Malvaceae).
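The condition-versus-attribute relationships above are reported as Pearson correlation coefficients. A minimal sketch of that calculation; the site values below are invented for illustration and are not the survey's data:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation for paired observations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# hypothetical condition scores vs herbage mass (kg/ha) for six sites
condition = [1, 2, 2, 3, 4, 5]
herbage = [800, 1200, 1100, 1900, 2400, 3100]
r = pearson_r(condition, herbage)
```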

Relevance: 20.00%

Abstract:

A survey of 108 sites deemed to contain Aristida/Bothriochloa native pastures was conducted in central inland Queensland, Australia, to quantitatively describe the pastures and attempt to delineate possible sub-types. The pastures were described in terms of their floristic composition, plant density and crown cover. There were generally ~20 (range 5–33) main pasture species at a site. A single dominant perennial grass was rare, with three to six prominent species the norm. Chrysopogon fallax (golden-beard grass) was the perennial grass most consistently found in all pastures whereas Aristida calycina (dark wiregrass), Enneapogon spp. (bottlewasher grasses), Brunoniella australis (blue trumpet) and Panicum effusum (hairy panic) were all regularly present. The pastures did not readily separate into broad floristic sub-groups, but three groups that landholders could recognise from a combination of the dominant tree and soil type were identified. The three groups were Eucalyptus crebra (narrow-leaved ironbark), E. melanophloia (silver-leaved ironbark) and E. populnea (poplar box). The pastures of the three main sub-groups were then characterised by the prominent presence, singly or in combination, of Bothriochloa ewartiana (desert bluegrass), Eremochloa bimaculata (poverty grass), Bothriochloa decipiens (pitted bluegrass) or Heteropogon contortus (black speargrass). The poplar box group had the greatest diversity of prominent grasses whereas the narrow-leaved ironbark group had the least. Non-native Cenchrus ciliaris (buffel grass) and Melinis repens (red Natal grass) were generally present at low densities. Describing pastures in terms of frequency of a few species or species groups sometimes failed to capture the true nature of the pasture but plant abundance for most species, as density, herbage mass of dry matter or plant crown cover, was correlated with its recorded frequency.
A quantitative description of an average pasture in fair condition is provided but it was not possible to explain why some species often occur together or fail to co-exist in Aristida/Bothriochloa pastures, for example C. ciliaris and E. bimaculata rarely co-exist whereas Tragus australianus (small burrgrass) and Enneapogon spp. are frequently recorded together. Most crown cover was provided by perennial grasses but many of these are Aristida spp. (wiregrasses) and not regarded as useful forage for livestock. No new or improved categorisation of the great variation evident in the Aristida/Bothriochloa native pasture type can be given despite the much improved detail provided of the floristic composition by this survey.

Relevance: 20.00%

Abstract:

The prevalence of resistance to phosphine in the rust-red flour beetle, Tribolium castaneum, from eastern Australia was investigated, as well as the potential fitness cost of this type of resistance. Discriminating dose tests on 115 population samples collected from farms from 2006 to 2010 showed that populations containing insects with the weakly resistant phenotype are common in eastern Australia (65.2% of samples), although the frequency of resistant phenotypes within samples was typically low (median of 2.3%). The population cage approach was used to investigate the possibility that carrying the alleles for weak resistance incurs a fitness cost. Hybridized populations were initiated using a resistant strain and either of two different susceptible strains. There was no evidence of a fitness cost based on the frequency of susceptible phenotypes in hybridized populations that were reared for seven generations without exposure to phosphine. This suggests that resistance alleles will tend to persist in field populations that have undergone selection even if selection pressure is removed. The prevalence of resistance is a warning that this species has been subject to considerable selection pressure and that effective resistance management practices are needed to address this problem. The resistance prevalence data also provide a baseline against which to measure management success.
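Discriminating dose tests estimate the frequency of the resistant phenotype as a binomial proportion (survivors out of insects tested). A sketch of how such a frequency and its uncertainty might be summarised with a Wilson score interval; the counts are hypothetical, not the survey's data:

```python
import math

def wilson_interval(k, n, z=1.96):
    """Wilson score 95% CI for a binomial proportion k/n, e.g.
    survivors of a discriminating dose out of n insects tested."""
    p = k / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return centre - half, centre + half

# hypothetical sample: 3 survivors among 130 insects tested
lo, hi = wilson_interval(3, 130)
```

The Wilson form is preferred over the simple normal approximation when the proportion is small, as resistant-phenotype frequencies typically are here.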

Relevance: 20.00%

Abstract:

Breaches of biosecurity, leading to incursions by invasive species, have the potential to cause substantial economic, social and environmental losses, including drastic reductions in biodiversity. It is argued that improving biosecurity reduces risk to biodiversity, while maintaining stable ecosystems through biodiversity can be a safeguard against biosecurity breaches. The global costs of invasive alien species (IAS) have been estimated at around US$350 billion, while alien invertebrate and vertebrate pests and weeds are estimated to cost Australia at least $7 billion a year. A striking current example is the incursion by Myrtle Rust (Puccinia psidii), an organism which can infect all members of the Myrtaceae, the most important family in the Australian flora. Myrtle Rust was first detected on a property on the central coast of New South Wales in late April 2010. Two years later the disease had been detected in numerous locations in Queensland and New South Wales, ranging from commercial plant nurseries and public amenities to large areas of bushland. This particular breach of biosecurity will inevitably diminish the biodiversity of flora and fauna over large areas of the continent. Integrated pest management (IPM), an enrichment of diversity in managing invasive and other pest species, offers the best opportunity to address problems such as these. Australia's response to increasing biosecurity risk is comprehensive and includes national networking of scientists engaged in a complex program of biosecurity research and development, including studies of IPM. This network is being enhanced by the development of international linkages.

Relevance: 20.00%

Abstract:

The built environment is a major contributor to the world’s carbon dioxide emissions, with a considerable amount of energy consumed in buildings for heating, ventilation and air-conditioning, space illumination, use of electrical appliances and other anthropogenic activities. The development of sustainable buildings seeks to ameliorate this situation, mainly by reducing energy consumption. Sustainable building design, however, is a complicated process involving a large number of design variables, each with a range of feasible values. There are also multiple, often conflicting, objectives involved, such as life cycle costs and occupant satisfaction. One approach to dealing with this is through the use of optimization models. In this paper, a new multi-objective optimization model is developed for sustainable building design by considering the design objectives of cost and energy consumption minimization and occupant comfort level maximization. In a case study demonstration, it is shown that the model can derive a set of suitable design solutions in terms of life cycle cost, energy consumption and indoor environmental quality, helping the client and design team gain a better understanding of the design space and the trade-off patterns between different design objectives. The model can be very useful in the conceptual design stages for determining appropriate operational settings to achieve optimal building performance, minimizing energy consumption while maximizing occupant comfort.
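The trade-off analysis described above rests on identifying non-dominated (Pareto-optimal) designs: those for which no other candidate is at least as good on every objective and strictly better on one. A minimal sketch of that idea, with the three objectives all expressed as minimisation (comfort negated) and entirely hypothetical candidate designs, not values from the paper:

```python
def dominates(a, b):
    """Design a dominates b if it is no worse on every objective and
    strictly better on at least one (all objectives minimised)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(designs):
    """Return the non-dominated subset of (name, objectives) pairs."""
    return [
        (name, obj) for name, obj in designs
        if not any(dominates(other, obj) for _, other in designs if other != obj)
    ]

# hypothetical candidates: (life cycle cost $k, energy MWh/yr, -comfort score)
designs = [
    ("A", (520, 310, -0.70)),
    ("B", (480, 340, -0.65)),
    ("C", (560, 300, -0.80)),
    ("D", (600, 360, -0.60)),  # worse than A on all three objectives
]
front = pareto_front(designs)
```

Presenting the front rather than a single "best" design is what lets the client and design team weigh the trade-off patterns between cost, energy and comfort themselves.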