Abstract:
Background Malnutrition is common in patients with advanced epithelial ovarian cancer (EOC), and is associated with impaired quality of life (QoL), longer hospital stay and higher risk of treatment-related adverse events. This phase III multi-centre randomised clinical trial tested early enteral feeding versus standard care on postoperative QoL. Methods From 2009 to 2013, 109 moderately to severely malnourished patients requiring surgery for suspected advanced EOC were enrolled at five sites across Queensland and randomised to intervention (n = 53) or control (n = 56) groups. The intervention involved intraoperative nasojejunal tube placement and enteral feeding until adequate oral intake could be maintained. Despite being randomised to the intervention, 20 patients did not receive feeds (13 did not receive the feeding tube; 7 had it removed early). Control involved postoperative diet as tolerated. QoL was measured at baseline, 6 weeks postoperatively and 30 days after the third cycle of chemotherapy. The primary outcome measure was the difference in QoL between the intervention and control groups. Secondary endpoints included treatment-related adverse event occurrence, length of stay, postoperative services use, and nutritional status. Results Baseline characteristics were comparable between treatment groups. No significant difference in QoL was found between the groups at any time point. There was a trend towards better nutritional status in patients who received the intervention, but the differences did not reach statistical significance except in the intention-to-treat analysis at 7 days postoperatively (11.8 intervention vs. 13.8 control, p = 0.04). Conclusion Early enteral feeding did not significantly improve patients' QoL compared with standard care, but may improve nutritional status.
Abstract:
Objective: To examine if streamlining a medical research funding application process saved time for applicants. Design: Cross-sectional surveys before and after the streamlining. Setting: The National Health and Medical Research Council (NHMRC) of Australia. Participants: Researchers who submitted one or more NHMRC Project Grant applications in 2012 or 2014. Main outcome measures: Average researcher time spent preparing an application and the total time for all applications in working days. Results: The average time per application increased from 34 working days before streamlining (95% CI 33 to 35) to 38 working days after streamlining (95% CI 37 to 39; mean difference 4 days, bootstrap p value <0.001). The estimated total time spent by all researchers on applications after streamlining was 614 working years, a 67-year increase from before streamlining. Conclusions: Streamlined applications were shorter but took longer to prepare on average. Researchers may be allocating a fixed amount of time to preparing funding applications based on their expected return, or may be increasing their time in response to increased competition. Many potentially productive years of researcher time are still being lost to preparing failed applications.
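The reported mean difference in preparation time was tested with a bootstrap p value. A minimal sketch of such a test, on made-up working-day figures rather than the actual survey responses:

```python
import random

def bootstrap_mean_diff_p(before, after, n_boot=10_000, seed=1):
    """Two-sided bootstrap p-value for a difference in means.

    Illustrative only: the working-day figures below are hypothetical,
    not the NHMRC survey data analysed in the abstract.
    """
    rng = random.Random(seed)
    observed = sum(after) / len(after) - sum(before) / len(before)
    pooled = before + after  # resample under the null of no difference
    n_b, n_a = len(before), len(after)
    count = 0
    for _ in range(n_boot):
        sample = [rng.choice(pooled) for _ in range(n_b + n_a)]
        diff = sum(sample[n_b:]) / n_a - sum(sample[:n_b]) / n_b
        if abs(diff) >= abs(observed):
            count += 1
    # add-one correction avoids reporting an exact zero p-value
    return observed, (count + 1) / (n_boot + 1)

# Hypothetical working-day estimates per application (not the study data)
before = [33, 34, 35, 34, 33, 36, 34]
after = [38, 37, 39, 38, 37, 40, 38]
diff, p = bootstrap_mean_diff_p(before, after)
```

With clearly separated groups like these, the observed difference of 4 days is rarely matched under pooled resampling, so the p value is small.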
Abstract:
The cost effectiveness of antimicrobial stewardship (AMS) programmes in hospital settings of Organisation for Economic Co-operation and Development (OECD) countries was reviewed; the review was limited to adult patient populations. For each of the 36 included studies, the type of AMS strategy and the clinical and cost outcomes were evaluated. The main AMS strategy implemented was prospective audit with intervention and feedback (PAIF), followed by the use of rapid technology, including rapid polymerase chain reaction (PCR)-based methods and matrix-assisted laser desorption/ionisation time-of-flight (MALDI-TOF) technology, for the treatment of bloodstream infections. All but one of the 36 studies reported that AMS resulted in a reduction in pharmacy expenditure. Among 27 studies measuring changes to health outcomes, either no change was reported post-AMS, or the additional benefits achieved from these outcomes were not quantified. Only two studies performed a full economic evaluation: one on a PAIF-based AMS intervention; and the other on use of rapid technology for the selection of appropriate treatment for serious Staphylococcus aureus infections. Both studies found the interventions to be cost effective. AMS programmes achieved a reduction in pharmacy expenditure, but there was a lack of consistency in the reported cost outcomes, making it difficult to compare between interventions. A failure to capture complete costs in terms of resource use makes it difficult to determine the true cost of these interventions. There is an urgent need for full economic evaluations that compare relative changes in both clinical and cost outcomes to enable identification of the most cost-effective AMS strategies in hospitals.
Abstract:
The normal range for scrotal circumference in Australian beef bulls was established using more than 300,000 measurements of breed, management group, age, liveweight, and scrotal circumference. The data used were derived from Australian bull breeders and two large research projects in northern Australia. Most bulls were within 250 to 750 kg liveweight and 300 to 750 days of age. The differences between breeds and variances within breeds were higher when scrotal circumference was predicted from age rather than liveweight, because of variance in growth rates. The average standard deviation for scrotal circumference predicted from liveweight and from age was 25 and 30 mm, respectively. Scrotal circumference by liveweight relationships have a similar pattern across all breeds, except in Wagyu, with a 50 to 70 mm range in average scrotal circumference at liveweights between 250 and 750 kg. Temperate breed bulls tended to have higher scrotal circumference at the same liveweight than tropically adapted breeds. Five groupings of common beef breeds in Australia were identified, within which there were similar predictions of scrotal circumference from liveweight. It was concluded that liveweight and breed are required to identify whether scrotal circumference is within the normal range for Australian beef bulls, which experience a wide range of nutritional conditions.
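The normal-range logic described here, predicting scrotal circumference from liveweight and then asking whether an observed value falls within roughly two standard deviations of the prediction, can be sketched as follows. The linear slope and intercept are hypothetical placeholders, not the study's breed-specific equations; only the 25 mm SD comes from the abstract.

```python
def within_normal_range(liveweight_kg, observed_sc_mm,
                        slope=0.4, intercept=160.0, sd_mm=25.0):
    """Check whether a bull's scrotal circumference lies within about two
    standard deviations of the value predicted from liveweight.

    The slope/intercept are illustrative placeholders, not the
    breed-specific prediction equations from the study; 25 mm is the
    average SD reported for liveweight-based prediction.
    """
    predicted = intercept + slope * liveweight_kg
    return abs(observed_sc_mm - predicted) <= 2 * sd_mm

# A 500 kg bull with a 360 mm scrotal circumference
ok = within_normal_range(500, 360)
```

In practice the prediction step would use the breed grouping appropriate to the bull, as the abstract concludes that breed and liveweight are both required.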
Abstract:
- Objective This study examined chronic disease risks and the use of a smartphone activity tracking application during an intervention in Australian truck drivers (April-October 2014).
- Methods Forty-four men (mean age = 47.5 [SD 9.8] years) completed baseline health measures, and were subsequently offered access to a free wrist-worn activity tracker and smartphone application (Jawbone UP) to monitor step counts and dietary choices during a 20-week intervention. Chronic disease risks were evaluated against guidelines; weekly step count and dietary logs registered by drivers in the application were analysed to evaluate use of the Jawbone UP.
- Results Chronic disease risks were high (e.g. 97% had a high waist circumference [≥94 cm]). Eighteen drivers (41%) did not start the intervention; smartphone technical barriers were the main reason for dropout. Across 20 weeks, drivers who used the Jawbone UP logged step counts for an average of 6 [SD 1] days/week; mean step counts remained consistent across the intervention (weeks 1–4 = 8,743 [SD 2,867] steps/day; weeks 17–20 = 8,994 [SD 3,478] steps/day). The median number of dietary logs decreased significantly from the start (17 [IQR 38] logs/week) to the end of the intervention (0 [IQR 23] logs/week; p<0.01); the median proportion of healthy diet choices relative to total diet choices logged increased across the intervention (weeks 1–4 = 38 [IQR 21]%; weeks 17–20 = 58 [IQR 18]%).
- Conclusions Step counts were more successfully monitored than dietary choices in those drivers who used the Jawbone UP.
- Implications Smartphone technology facilitated active living and healthy dietary choices, but also impeded intervention engagement in a number of these high-risk Australian truck drivers.
Abstract:
Cooked prawn colour is known to be a driver of market price and a visual indicator of product quality for the consumer. Although there is a general understanding that colour variation exists in farmed prawns, there has been no attempt to quantify this variation or identify where it is most prevalent. The objectives of this study were threefold: firstly, to compare three quantitative methods of measuring prawn colour or pigmentation: two different colorimeters and colour quantification from digital images. Secondly, to quantify the amount of pigmentation variation that exists in farmed prawns within ponds, across ponds and across farms. Lastly, to assess the effects of ice storage or freeze-thawing of raw product prior to cooking. Each method was able to detect quantitative differences in prawn colour, although conversion of image-based quantification of prawn colour from RGB to Lab was unreliable. Considerable colour variation was observed between prawns from different ponds and different farms, and this variation potentially affects product value. Different post-harvest methods prior to cooking were also shown to have a profound detrimental effect on prawn colour. Both long periods of ice storage and freeze-thawing of raw product were detrimental to prawn colour. However, ice storage immediately after cooking was shown to be beneficial to prawn colour. Results demonstrated that darker prawn colour was preserved by holding harvested prawns alive in chilled seawater, limiting the time between harvesting and cooking, and avoiding long periods of ice storage or freeze-thawing of uncooked product.
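The RGB-to-Lab conversion noted as unreliable above involves the standard colourimetric pipeline (sRGB → linear RGB → XYZ → CIELAB), each step of which depends on assumptions about gamma, primaries and illuminant that a colorimeter controls but a digital image may not satisfy. A sketch of that pipeline, assuming a D65 white point:

```python
def srgb_to_lab(r, g, b):
    """Convert an 8-bit sRGB triple to CIELAB (D65 illuminant).

    Standard colourimetric pipeline, shown to illustrate why image-based
    RGB-to-Lab conversion is sensitive to assumptions (gamma, illuminant,
    white point) that a calibrated colorimeter controls directly.
    """
    def inv_gamma(c):
        # undo the sRGB transfer function to get linear light
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    rl, gl, bl = inv_gamma(r), inv_gamma(g), inv_gamma(b)
    # linear RGB -> XYZ (sRGB primaries, D65 white)
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    # normalise by the D65 reference white
    xn, yn, zn = x / 0.95047, y / 1.00000, z / 1.08883

    def f(t):
        # CIELAB compression function with its linear toe
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116

    fx, fy, fz = f(xn), f(yn), f(zn)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)
```

A pure white pixel maps to approximately L = 100, a = 0, b = 0; uncontrolled lighting or camera processing shifts every step of this chain, which is one plausible source of the unreliability reported.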
Abstract:
Bioactivities of peel and flesh extracts of three genetically diverse mango (Mangifera indica L.) varieties were studied. Nam Doc Mai peel extracts, which contained the largest amounts of polyphenols, reduced MCF-7 viable cell numbers with an IC50 (dose required for 50% inhibition of cell viability) of 56 μg/mL and significantly (p<0.01) induced cell death in MDA-MB-231 cells, compared with other varieties. Hydrophilic fractions of Nam Doc Mai peel extracts had the highest bioactivity values against both MCF-7 and MDA-MB-231 cells. Soluble polyphenols were present in the largest amounts in most hydrophilic fractions. The Nam Doc Mai mango variety shows high fruit peel bioactivity, which appears to be related to the nature of its polyphenol composition.
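An IC50 of the kind quoted can be estimated crudely by interpolating where a dose-response curve crosses 50% viability. A sketch on made-up data (not the mango-extract measurements; published IC50s normally come from a fitted dose-response model, not linear interpolation):

```python
def ic50(doses, viability):
    """Estimate IC50 by linear interpolation between the two doses that
    bracket 50% viability.

    Illustrative only: real analyses typically fit a sigmoidal
    dose-response model rather than interpolate.
    """
    points = list(zip(doses, viability))
    for (d0, v0), (d1, v1) in zip(points, points[1:]):
        if v0 >= 50 >= v1:
            # linear interpolation between the bracketing points
            return d0 + (v0 - 50) * (d1 - d0) / (v0 - v1)
    raise ValueError("viability never crosses 50%")

# Hypothetical dose-response data (dose in ug/mL, % viable cells)
doses = [0, 20, 40, 60, 80]
viab = [100, 85, 62, 45, 30]
est = ic50(doses, viab)
```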
Abstract:
An estimated 110 Mt of dust is eroded by wind from the Australian land surface each year, most of which originates from the arid and semi-arid rangelands. Livestock production is thought to increase the susceptibility of the rangelands to wind erosion by reducing vegetation cover and modifying surface soil stability. However, research is yet to quantify the impacts of grazing land management on the erodibility of the Australian rangelands, or determine how these impacts vary among land types and over time. We present a simulation analysis that links a pasture growth and animal production model (GRASP) to the Australian Land Erodibility Model (AUSLEM) to evaluate the impacts of stocking rate, stocking strategy and land condition on the erodibility of four land types in western Queensland, Australia. Our results show that declining land condition, overstocking, and using inflexible stocking strategies have the potential to increase land erodibility and amplify accelerated soil erosion. However, land erodibility responses to grazing are complex and influenced by land type sensitivities to different grazing strategies and local climate characteristics. Our simulations show that land types which are more resilient to livestock grazing tend to be least susceptible to accelerated wind erosion. Increases in land erodibility occur most often during climatic transitions, when vegetation cover is most sensitive to grazing pressure. However, grazing effects are limited during extreme wet and dry periods, when the influence of climate on vegetation cover is strongest. Our research provides the opportunity to estimate the effects of different land management practices across a range of land types, and provides a better understanding of the mechanisms of accelerated erosion resulting from pastoral activities. The approach could help further assessment of land erodibility at a broader scale, particularly if combined with wind erosion models.
Abstract:
Background The objective is to estimate the incremental cost-effectiveness of the Australian National Hand Hygiene Initiative implemented between 2009 and 2012, using healthcare-associated Staphylococcus aureus bacteraemia as the outcome. Baseline comparators are the eight existing state and territory hand hygiene programmes. The setting is the Australian public healthcare system, and 1,294,656 admissions from the 50 largest Australian hospitals are included. Methods The design is a cost-effectiveness modelling study using a before-and-after quasi-experimental design. The primary outcome is cost per life year saved from reduced cases of healthcare-associated Staphylococcus aureus bacteraemia, with cost estimated as the annual on-going maintenance costs less the costs saved from fewer infections. Data were harvested from existing sources or were collected prospectively, and the time horizon for the model was 12 months, 2011–2012. Findings No useable pre-implementation Staphylococcus aureus bacteraemia data were made available from the 11 study hospitals in Victoria or the single hospital in the Northern Territory, leaving 38 hospitals among six states and territories available for cost-effectiveness analyses. Total annual costs increased by $2,851,475 for a return of 96 years of life, giving an incremental cost-effectiveness ratio (ICER) of $29,700 per life year gained. Probabilistic sensitivity analysis revealed a 100% chance the initiative was cost effective in the Australian Capital Territory and Queensland, with ICERs of $1,030 and $8,988 respectively. There was an 81% chance it was cost effective in New South Wales with an ICER of $33,353, a 26% chance for South Australia with an ICER of $64,729 and a 1% chance for Tasmania and Western Australia. The 12 hospitals in Victoria and the Northern Territory incur annual on-going maintenance costs of $1.51M; no information was available to describe cost savings or health benefits.
Conclusions The Australian National Hand Hygiene Initiative was cost-effective against an Australian threshold of $42,000 per life year gained. The return on investment varied among the states and territories of Australia.
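The headline figure follows directly from the reported numbers: an ICER is simply the incremental cost divided by the incremental health gain. A one-line check using the abstract's own values:

```python
def icer(delta_cost, delta_effect):
    """Incremental cost-effectiveness ratio: extra cost per unit of
    extra health effect (here, dollars per life year gained)."""
    return delta_cost / delta_effect

# National-level figures reported in the abstract
ratio = icer(2_851_475, 96)  # reported (rounded) as $29,700 per life year
```

Because this falls below the $42,000-per-life-year threshold, the initiative is judged cost-effective at the national level.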
Abstract:
Four species of large mackerels (Scomberomorus spp.) co-occur in the waters off northern Australia and are important to fisheries in the region. State fisheries agencies monitor these species for fisheries assessment; however, data inaccuracies may exist due to difficulties with identification of these closely related species, particularly when specimens are incomplete after fish processing. This study examined the efficacy of using otolith morphometrics to differentiate among, and predict, the four mackerel species off northeastern Australia. Seven otolith measurements and five shape indices were recorded from 555 mackerel specimens. Multivariate modelling, including linear discriminant analysis (LDA) and support vector machines, successfully differentiated among the four species based on otolith morphometrics. Cross validation determined a predictive accuracy of at least 96% for both models. An optimum predictive model for the four mackerel species was an LDA model that included fork length, feret length, feret width, perimeter, area, roundness, form factor and rectangularity as explanatory variables. This analysis may improve the accuracy of fisheries monitoring, the estimates based on this monitoring (e.g. mortality rate) and the overall management of mackerel species in Australia.
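Cross-validated species classification of the kind reported here can be illustrated with a simple stand-in: leave-one-out cross-validation of a nearest-centroid classifier on synthetic two-species "morphometric" data. This is not the study's LDA or support vector machine model, and the feature values are invented, not otolith measurements:

```python
import random

def loocv_accuracy(samples, labels):
    """Leave-one-out cross-validation of a nearest-centroid classifier:
    each specimen is predicted from centroids fit on all the others."""
    def centroid(rows):
        return [sum(col) / len(rows) for col in zip(*rows)]

    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    correct = 0
    for i in range(len(samples)):
        train = [(s, l) for j, (s, l) in enumerate(zip(samples, labels)) if j != i]
        cents = {lab: centroid([s for s, l in train if l == lab])
                 for lab in set(l for _, l in train)}
        pred = min(cents, key=lambda lab: dist2(samples[i], cents[lab]))
        correct += pred == labels[i]
    return correct / len(samples)

# Synthetic two-species example: [feret length, otolith area] per specimen
rng = random.Random(0)
samples = [[10 + rng.gauss(0, 0.3), 5 + rng.gauss(0, 0.2)] for _ in range(20)]
samples += [[12 + rng.gauss(0, 0.3), 6 + rng.gauss(0, 0.2)] for _ in range(20)]
labels = ["S. commerson"] * 20 + ["S. munroi"] * 20
acc = loocv_accuracy(samples, labels)
```

With well-separated synthetic species the cross-validated accuracy is high; the study's 96%+ accuracy on real otoliths reflects genuinely distinct otolith shapes among the four species.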
Abstract:
Head motion (HM) is a well-known confound in analyses of functional MRI (fMRI) data. Neuroimaging researchers therefore typically treat HM as a nuisance covariate in their analyses. Even so, it is possible that HM shares a common genetic influence with the trait of interest. Here we investigate the extent to which this relationship is due to shared genetic factors, using HM extracted from resting-state fMRI and maternal- and self-report measures of Inattention and Hyperactivity-Impulsivity from the Strengths and Weaknesses of ADHD Symptoms and Normal Behaviour (SWAN) scales. Our sample consisted of healthy young adult twins (N = 627 (63% females), including 95 MZ and 144 DZ twin pairs, mean age 22, who had mother-reported SWAN; N = 725 (58% females), including 101 MZ and 156 DZ pairs, mean age 25, with self-reported SWAN). This design enabled us to distinguish genetic from environmental factors in the association between head movement and ADHD scales. HM was moderately correlated with maternal reports of Inattention (r = 0.17, p-value = 7.4E-5) and Hyperactivity-Impulsivity (r = 0.16, p-value = 2.9E-4), and these associations were mainly due to pleiotropic genetic factors, with genetic correlations [95% CIs] of rg = 0.24 [0.02, 0.43] and rg = 0.23 [0.07, 0.39]. Correlations between self-reports and HM were not significant, due largely to increased measurement error. These results indicate that treating HM as a nuisance covariate in neuroimaging studies of ADHD will likely reduce power to detect between-group effects, as the implicit assumption of independence between HM and Inattention or Hyperactivity-Impulsivity is not warranted. The implications of this finding are problematic for fMRI studies of ADHD, as failing to apply HM correction is known to increase the likelihood of false positives.
We discuss two ways to circumvent this problem: censoring the motion-contaminated frames of the RS-fMRI scan, or explicitly modelling the relationship between HM and Inattention or Hyperactivity-Impulsivity.
Abstract:
- Objective To compare health service cost and length of stay between a traditional and an accelerated diagnostic approach to assessing acute coronary syndromes (ACS) among patients who presented to the emergency department (ED) of a large tertiary hospital in Australia.
- Design, setting and participants This historically controlled study analysed data collected from two independent patient cohorts presenting to the ED with potential ACS. The first cohort of 938 patients was recruited in 2008–2010, and these patients were assessed using the traditional diagnostic approach detailed in the national guideline. The second cohort of 921 patients was recruited in 2011–2013 and was assessed with an accelerated diagnostic approach known as the Brisbane protocol. The Brisbane protocol applied early serial troponin testing at 0 and 2 h after presentation to the ED, compared with 0 and 6 h testing in the traditional assessment process. The Brisbane protocol also defined a low-risk group of patients in whom no objective testing was performed. A decision tree model was used to compare the expected cost and length of stay in hospital between the two approaches. Probabilistic sensitivity analysis was used to account for model uncertainty.
- Results Compared with the traditional diagnostic approach, the Brisbane protocol was associated with a reduced expected cost of $1229 (95% CI −$1266 to $5122) and a reduced expected length of stay of 26 h (95% CI −14 to 136 h). The Brisbane protocol allowed physicians to discharge a higher proportion of low-risk and intermediate-risk patients from the ED within 4 h (72% vs 51%). Results from sensitivity analysis suggested the Brisbane protocol had a high chance of being cost-saving and time-saving.
- Conclusions This study provides some evidence of cost savings from a decision to adopt the Brisbane protocol. Benefits would arise for the hospital and for patients and their families.
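A decision tree model of the kind used here reduces each diagnostic pathway to an expected value: the probability-weighted sum of the costs of its branches. A minimal sketch with hypothetical probabilities and costs, not the study's figures:

```python
def expected_value(branches):
    """Expected value of one decision-tree arm: sum of probability x cost.

    Branch probabilities and dollar costs below are illustrative
    placeholders, not parameters from the study's model.
    """
    total_p = sum(p for p, _ in branches)
    assert abs(total_p - 1.0) < 1e-9, "branch probabilities must sum to 1"
    return sum(p * cost for p, cost in branches)

# Hypothetical arms: (probability, cost) for early discharge vs admission
traditional = expected_value([(0.51, 1_800), (0.49, 6_200)])
accelerated = expected_value([(0.72, 1_500), (0.28, 6_000)])
saving = traditional - accelerated
```

Probabilistic sensitivity analysis, as used in the study, would repeat this calculation many times with probabilities and costs drawn from distributions rather than fixed values.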
Abstract:
An application that translates raw thermal melt curve data into more easily assimilated knowledge is described. This program, called ‘Meltdown’, performs a number of data remediation steps before classifying melt curves and estimating melting temperatures. The final output is a report that summarizes the results of a differential scanning fluorimetry (DSF) experiment. Meltdown uses a Bayesian classification scheme, enabling reproducible identification of various trends commonly found in DSF datasets. The goal of Meltdown is not to replace human analysis of the raw data, but to provide a sensible interpretation of the data that makes this useful experimental technique accessible to naïve users, as well as providing a starting point for detailed analyses by more experienced users.
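Meltdown's Bayesian procedure is not described here, but a common starting point for melting-temperature estimation from a DSF melt curve is the peak of the first derivative dF/dT. A sketch on a synthetic sigmoidal curve (this heuristic is an assumption for illustration, not Meltdown's algorithm):

```python
import math

def estimate_tm(temps, fluorescence):
    """Estimate a melting temperature as the temperature at which the
    fluorescence derivative dF/dT is largest.

    A common DSF heuristic shown for illustration; it is not the
    Bayesian classification scheme used by Meltdown.
    """
    best_i, best_slope = 0, float("-inf")
    for i in range(len(temps) - 1):
        slope = (fluorescence[i + 1] - fluorescence[i]) / (temps[i + 1] - temps[i])
        if slope > best_slope:
            best_i, best_slope = i, slope
    # report the midpoint of the steepest interval
    return (temps[best_i] + temps[best_i + 1]) / 2

# Synthetic sigmoidal melt curve with a transition near 55 degrees C
temps = [25 + 0.5 * i for i in range(100)]
fluor = [1 / (1 + math.exp(-(t - 55) / 2)) for t in temps]
tm = estimate_tm(temps, fluor)
```

Real DSF traces need the remediation steps the abstract mentions (smoothing, baseline handling, rejection of non-sigmoidal curves) before a derivative peak is meaningful.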
Abstract:
A ternary metal complex involving vitamin B6, with the formula [Cu(bipy)(pn)(OH)]·H2O (bipy = 2,2′-bipyridine, pn = anionic pyridoxine), has been synthesized and studied in the solid state by means of spectroscopy and X-ray crystallography. The geometry around copper(II) is distorted square pyramidal, with two oxygens from the phenolic and 4-(hydroxymethyl) groups of pn, two nitrogens from bipy and an axial OH− ion forming the coordination sphere. In this structure pn exists in a new anionic form, with deprotonation of the phenolic group. The structure also provides a rare example of monodentate hydroxyl coordination to copper.
Abstract:
Background The Global Burden of Diseases, Injuries, and Risk Factors (GBD) study used the disability-adjusted life year (DALY) to quantify the burden of diseases, injuries, and risk factors. This paper provides an overview of injury estimates from the 2013 update of GBD, with detailed information on incidence, mortality, DALYs and rates of change from 1990 to 2013 for 26 causes of injury, globally, by region and by country. Methods Injury mortality was estimated using the extensive GBD mortality database, corrections for ill-defined cause of death, and the cause of death ensemble modelling tool. Morbidity estimation was based on inpatient and outpatient data sets, 26 cause-of-injury and 47 nature-of-injury categories, and seven follow-up studies with patient-reported long-term outcome measures. Results In 2013, 973 million (uncertainty interval (UI) 942 to 993) people sustained injuries that warranted some type of healthcare, and 4.8 million (UI 4.5 to 5.1) people died from injuries. Between 1990 and 2013 the global age-standardised injury DALY rate decreased by 31% (UI 26% to 35%). The rate of decline in DALY rates was significant for 22 cause-of-injury categories, including all the major injuries. Conclusions Injuries continue to be an important cause of morbidity and mortality in the developed and developing world. The decline in rates for almost all injuries is so prominent that it warrants a general statement that the world is becoming a safer place to live in. However, the patterns vary widely by cause, age, sex, region and time, and there are still large improvements that need to be made.