361 results for Temperature-related health
Abstract:
Background: Reducing rates of healthcare-acquired infection has been identified by the Australian Commission on Safety and Quality in Health Care as a national priority. One of the goals is the prevention of central venous catheter-related bloodstream infection (CR-BSI). At least 3,500 cases of CR-BSI occur annually in Australian hospitals, resulting in unnecessary deaths and costs to the healthcare system of between $25.7 million and $95.3 million. Two approaches to preventing these infections have been proposed: use of antimicrobial catheters (A-CVCs), or a catheter care and management ‘bundle’. Given finite healthcare budgets, decisions about the optimal infection control policy require consideration of the effectiveness and value for money of each approach. Objectives: The aim of this research is to use a rational economic framework to inform efficient infection control policy relating to the prevention of CR-BSI in the intensive care unit. It addresses three questions relating to decision-making in this area: 1. Is additional investment in activities aimed at preventing CR-BSI an efficient use of healthcare resources? 2. What is the optimal infection control strategy of the two major approaches that have been proposed to prevent CR-BSI? 3. What uncertainty is there in this decision, and can a research agenda to improve decision-making in this area be identified? Methods: A decision-analytic model-based economic evaluation was undertaken to identify an efficient approach to preventing CR-BSI in Queensland Health intensive care units. A Markov model describing the epidemiology and prognosis of CR-BSI was developed in conjunction with a panel of clinical experts. The model was parameterised using data systematically identified from the published literature and extracted from routine databases. The quality of the data used in the model, its validity to clinical experts and its sensitivity to modelling assumptions were assessed. 
Two separate economic evaluations were conducted. The first compared all commercially available A-CVCs alongside uncoated catheters to identify which was cost-effective for routine use. The uncertainty in this decision was estimated, along with the value of collecting further information to inform the decision. The second evaluation compared the use of A-CVCs to a catheter care bundle. We were unable to estimate the cost of the bundle because it is unclear what the full resource requirements for its implementation are, and what the value of these would be in an Australian context. We therefore undertook a threshold analysis to identify the cost and effectiveness thresholds at which a hypothetical bundle would dominate the use of A-CVCs under various clinical scenarios. Results: In the first evaluation of A-CVCs, the findings from the baseline analysis, in which uncertainty is not considered, show that the use of any of the four A-CVCs will result in health gains accompanied by cost savings. The MR catheters dominate the baseline analysis, generating 1.64 QALYs and cost savings of $130,289 per 1,000 catheters. With uncertainty, and based on current information, the MR catheters remain the optimal decision and return the highest average net monetary benefit ($948 per catheter) relative to all other catheter types. This conclusion was robust to all scenarios tested; however, the probability of error in this conclusion is high: 62% in the baseline scenario. Using a value of $40,000 per QALY, the expected value of perfect information associated with this decision is $7.3 million. An analysis of the expected value of perfect information for individual parameters suggests that it may be worthwhile for future research to focus on providing better estimates of the mortality attributable to CR-BSI and the effectiveness of both SPC and CH/SSD (int/ext) catheters. 
In the second evaluation, of the catheter care bundle relative to A-CVCs, the results which do not consider uncertainty indicate that a bundle must achieve a relative risk of CR-BSI of no more than 0.45 to be cost-effective relative to MR catheters. If the bundle can reduce rates of infection from 2.5% to effectively zero, it is cost-effective relative to MR catheters if national implementation costs are less than $2.6 million ($56,610 per ICU). If the bundle can achieve a relative risk of 0.34 (comparable to that reported in the literature), it is cost-effective relative to MR catheters if costs over an 18-month period are below $613,795 nationally ($13,343 per ICU). Once uncertainty in the decision is considered, the cost threshold for the bundle increases to $2.2 million. Therefore, if each of the 46 Level III ICUs could implement an 18-month catheter care bundle for less than $47,826, this approach would be cost-effective relative to A-CVCs. However, the uncertainty is substantial, and the probability of error in concluding that the bundle is the cost-effective approach at a cost of $2.2 million is 89%. Conclusions: This work highlights that infection control to prevent CR-BSI is an efficient use of healthcare resources in the Australian context. If there is no further investment in infection control, an opportunity cost is incurred: the potential for a more efficient healthcare system. Minocycline/rifampicin catheters are the optimal choice of antimicrobial catheter for routine use in Australian Level III ICUs; however, if a catheter care bundle implemented in Australia were as effective as those used in the large United States studies, it would be preferred over the catheters if it could be implemented for less than $47,826 per Level III ICU. Uncertainty is very high in this decision and arises from multiple sources. 
There are likely greater costs to this uncertainty for A-CVCs, which may carry hidden costs, than there are for a catheter care bundle, which is more likely to provide indirect benefits to clinical practice and patient safety. Research into the mortality attributable to CR-BSI, the effectiveness of SPC and CH/SSD (int/ext) catheters and the cost and effectiveness of a catheter care bundle in Australia should be prioritised to reduce uncertainty in this decision. This thesis provides the economic evidence to inform one area of infection control, but there are many other infection control decisions for which information about the cost-effectiveness of competing interventions does not exist. This work highlights some of the challenges and benefits to generating and using economic evidence for infection control decision-making and provides support for commissioning more research into the cost-effectiveness of infection control.
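The decision rule used above, ranking catheter types by net monetary benefit (NMB) at a $40,000-per-QALY willingness-to-pay threshold, can be illustrated with a short sketch. The figures below are the baseline results quoted in the abstract (1.64 QALYs gained and $130,289 saved per 1,000 catheters); this is a simplified illustration of the NMB formula only, not a reproduction of the thesis's probabilistic Markov model (which yields the $948-per-catheter average NMB).

```python
# Illustrative net-monetary-benefit calculation for ranking interventions.
# NMB = (willingness to pay per QALY) * incremental QALYs - incremental cost.

WTP = 40_000  # $ per QALY, the threshold quoted in the abstract

def net_monetary_benefit(delta_qalys, delta_cost, wtp=WTP):
    """Per-catheter NMB; a positive value favours the intervention."""
    return wtp * delta_qalys - delta_cost

# Baseline per-1,000-catheter figures for MR catheters, expressed per catheter:
# 1.64 QALYs gained and $130,289 saved (i.e. a negative incremental cost).
nmb_mr = net_monetary_benefit(1.64 / 1000, -130_289 / 1000)
print(round(nmb_mr))  # 196: ≈ $66 of QALY value plus ≈ $130 of cost savings
```

An intervention that both gains QALYs and saves money, as here, has positive NMB at any threshold and therefore dominates in the baseline analysis.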
Abstract:
Purpose: To investigate whether wearing different presbyopic refractive corrections alters the pattern of eye and head movements when searching for dynamic targets in driving-related traffic scenes. Methods: Eye and head movements of 20 presbyopes (mean age = 56.2 ± 5.7 years), who had no experience of wearing presbyopic corrections or were unadapted wearers, were recorded using the faceLAB™ eye and head tracker while wearing five different corrections: single vision lenses (SV), progressive addition lenses (PALs), bifocal spectacles (BIF), monovision and multifocal contact lenses (MTF CLs), in random order (within-subjects comparison). Recorded traffic scenes of suburban roads and expressways with edited targets were viewed as dynamic stimuli. Results: The magnitude of eye and head movements was significantly greater for SV, BIF and PALs than monovision and MTF CLs (p < 0.001). In addition, BIF wear led to more eye movements than PAL wear (p = 0.017), while PAL wear resulted in greater head movements than SV wear (p = 0.018). The ratio of eye to head movement was smaller for PALs than all other groups (p < 0.001). The number of saccades made to fixate a target was significantly higher for BIF and PALs than monovision or MTF CLs (p < 0.05). Conclusions: Different presbyopic corrections can alter eye and head movement patterns. Wearing spectacles such as BIF and PALs produced relatively greater eye and head movements and saccades when viewing dynamic targets. The impact of these changes in eye and head movement patterns may have implications for driving performance under real-world driving conditions.
Abstract:
Purpose: To investigate whether wearing different presbyopic vision corrections alters the pattern of eye and head movements when viewing dynamic driving-related traffic scenes. Methods: Participants included 20 presbyopes (mean age: 56±5.7 years) who had no experience of wearing presbyopic vision corrections (i.e. all were single vision wearers). Eye and head movements were recorded while wearing five different vision corrections: single vision lenses (SV), progressive addition spectacle lenses (PALs), bifocal spectacle lenses (BIF), monovision (MV) and multifocal contact lenses (MTF CL) in random order. Videotape recordings of traffic scenes of suburban roads and expressways (with edited targets) were presented as dynamic driving-related stimuli and digital numeric display panels included as near visual stimuli (simulating speedometer and radio). Eye and head movements were recorded using the faceLAB™ system and the accuracy of target identification was also recorded. Results: The magnitude of eye movements while viewing the driving-related traffic scenes was greater when wearing BIF and PALs than MV and MTF CL (p≤0.013). The magnitude of head movements was greater when wearing SV, BIF and PALs than MV and MTF CL (p<0.0001) and the number of saccades was significantly higher for BIF and PALs than MV (p≤0.043). Target recognition accuracy was poorer for all vision corrections when the near stimulus was located at eccentricities inferiorly and to the left, rather than directly below the primary position of gaze (p=0.008), and PALs gave better performance than MTF CL (p=0.043). Conclusions: Different presbyopic vision corrections alter eye and head movement patterns. In particular, the larger magnitude of eye and head movements and greater number of saccades associated with the spectacle presbyopic corrections, may impact on driving performance.
Abstract:
Objective: To quantify the extent to which alcohol related injuries are adequately identified in hospitalisation data using ICD-10-AM codes indicative of alcohol involvement. Method: A random sample of 4373 injury-related hospital separations from 1 July 2002 to 30 June 2004 were obtained from a stratified random sample of 50 hospitals across 4 states in Australia. From this sample, cases were identified as involving alcohol if they contained an ICD-10-AM diagnosis or external cause code referring to alcohol, or if the text description extracted from the medical records mentioned alcohol involvement. Results: Overall, identification of alcohol involvement using ICD codes detected 38% of the alcohol-related sample, whilst almost 94% of alcohol-related cases were identified through a search of the text extracted from the medical records. The resultant estimate of alcohol involvement in injury-related hospitalisations in this sample was 10%. Emergency department records were the most likely to identify whether the injury was alcohol-related with almost three-quarters of alcohol-related cases mentioning alcohol in the text abstracted from these records. Conclusions and Implications: The current best estimates of the frequency of hospital admissions where alcohol is involved prior to the injury underestimate the burden by around 62%. This is a substantial underestimate that has major implications for public policy, and highlights the need for further work on improving the quality and completeness of routine administrative data sources for identification of alcohol-related injuries.
Abstract:
Background: Work-related injuries in Australia are estimated to cost around $57.5 billion annually; however, there are currently insufficient surveillance data available to support an evidence-based public health response. Emergency departments (EDs) in Australia are a potential source of information on work-related injuries, though most EDs do not have an ‘Activity Code’ to identify work-related cases, with information about the presenting problem recorded in a short free-text field. This study compared methods for interrogating text fields to identify work-related injuries presenting at emergency departments, to inform approaches to surveillance of work-related injury.---------- Methods: Three approaches were used to interrogate an injury description text field to classify cases as work-related: keyword search, index search, and content analytic text mining. Sensitivity and specificity were examined by comparing cases flagged by each approach to cases coded with an Activity code during triage. Methods to improve the sensitivity and/or specificity of each approach were explored by adjusting the classification techniques within each broad approach.---------- Results: The basic keyword search detected 58% of cases (specificity 0.99), an index search detected 62% of cases (specificity 0.87), and the content analytic text mining (using adjusted probabilities) approach detected 77% of cases (specificity 0.95).---------- Conclusions: The findings of this study provide strong support for continued development of text searching methods to obtain information from routine emergency department data, to improve the capacity for comprehensive injury surveillance.
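The simplest of the three approaches compared above, a keyword search scored against the triage Activity code as the reference standard, can be sketched as follows. The keyword list and example records here are invented for illustration and are not drawn from the study's data; the sensitivity/specificity calculation mirrors the evaluation described in the Methods.

```python
# Sketch of a keyword-search classifier for free-text injury descriptions,
# evaluated against a reference label (the triage Activity code).

WORK_KEYWORDS = {"work", "workplace", "forklift", "scaffold", "employer"}  # hypothetical list

def flag_work_related(description: str) -> bool:
    """Flag a record as work-related if any keyword appears as a token."""
    return any(tok in WORK_KEYWORDS for tok in description.lower().split())

def sensitivity_specificity(records):
    """records: list of (free-text description, reference label) pairs."""
    tp = fp = tn = fn = 0
    for text, is_work in records:
        flagged = flag_work_related(text)
        if flagged and is_work:       tp += 1
        elif flagged and not is_work: fp += 1
        elif not flagged and is_work: fn += 1
        else:                         tn += 1
    return tp / (tp + fn), tn / (tn + fp)

records = [
    ("fell from scaffold at work", True),
    ("injured hand with forklift", True),
    ("fell off bicycle in park", False),
    ("cut finger cooking at home", False),
    ("back pain lifting boxes", True),   # missed: no keyword present
]
sens, spec = sensitivity_specificity(records)
print(sens, spec)  # sensitivity ≈ 0.67, specificity = 1.0
```

The toy output reproduces the pattern reported above: keyword matching rarely flags non-work cases (high specificity) but misses work-related cases whose text lacks an obvious keyword (modest sensitivity), which is what motivates the index-search and text-mining refinements.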
Abstract:
Introduction. In adults, oral health has been shown to worsen during critical illness as well as to influence systemic health. There is a paucity of paediatric critical care research in the area of oral health; hence the purpose of the Critically ill Children’s Oral Health (CCOH) study is to describe the status of oral health of critically ill children over time spent in the paediatric intensive care unit (PICU). The study will also examine the relationship between poor oral health and a variety of patient characteristics and PICU therapies, and explore the relationship between dysfunctional oral health and PICU-related Healthcare-Associated Infections (HAI). Method. An observational study was undertaken at a single tertiary-referral PICU. Oral health was measured using the Oral Assessment Scale (OAS) and by culturing oropharyngeal flora. Information was also collected on the use of supportive therapies, the clinical characteristics of the children and the occurrence of PICU-related HAI. Results. Forty-six participants were consecutively recruited to the CCOH study. Of the participants, 63% (n=32) had oral dysfunction while 41% (n=19) demonstrated pathogenic oropharyngeal colonisation during their critical illness. The potential systemic pathogens isolated from the oropharynx included Candida sp., Staphylococcus aureus, Haemophilus influenzae, Enterococcus sp. and Pseudomonas aeruginosa. The severity of critical illness had a significant positive relationship (p=0.046) with pathogenic and absent colonisation of the oropharynx. Sixty-three percent of PICU-related HAI involved the preceding or simultaneous colonisation of the oropharynx by the causative pathogen. Conclusion. Given the prevalence of poor oral health during childhood critical illness and the potential systemic consequences, evidence-based oral hygiene practices should be developed and validated to guide clinicians when nursing critically ill children.
Abstract:
The increase in life expectancy worldwide during the last three decades has increased age-related disability, leading to the risk of loss of quality of life. How to improve quality of life, including physical and mental health, for older people and optimise their life potential has become an important health issue. This study used the Theory of Planned Behaviour to examine factors influencing health behaviours and their relationship with quality of life. A cross-sectional mailed survey of 1300 Australians over 50 years was conducted at the beginning of 2009, with 730 completed questionnaires returned (response rate 63%). Preliminary analysis reveals that physiological changes of old age, especially increasing waist circumference and comorbidity, were closely related to health status, especially a worse physical health summary score. Physical activity was the behaviour with the lowest adherence among respondents, compared with eating healthy food and taking medication regularly as prescribed. The increasing number of older people living alone with comorbid disease may be a barrier influencing their attitude and self-control toward physical activity. A multidisciplinary and integrated approach, including hospital and non-hospital care, is required to provide appropriate services and facilities for older people.
Abstract:
Alcohol and drug dependency is a widespread health and social issue encountered by registered nurses in contemporary practice. A study aiming to describe the experiences of registered nurses working in an alcohol and drug unit in South East Queensland was undertaken. Data were analysed via Giorgi’s phenomenological method, and an unexpected but significant finding highlighted the frustration felt by registered nurses regarding experiences of stigma they identified in their daily work encounters. Secondary analysis confirmed the phenomenon of stigma with three themes: (1) inappropriate judgement; (2) advocacy; and (3) education. Consequently, the findings concluded that registered nurses working in this field need to become advocates for their clients, ensuring professional conduct is upheld at all times. This paper recommends that stigma could be addressed by incorporating alcohol and other drug dependency subjects and clinical placements into the curriculum of Bachelor of Nursing degrees, and through in-service education for all practising registered nurses.
Abstract:
Negative mood regulation (NMR) expectancies have been linked to substance problems in previous research, but the neurobiological correlates of NMR are unknown. In the present study, NMR was examined in relation to self-report indices of frontal lobe functioning, mood and alcohol use in 166 volunteers of both genders who ranged in age from 17 to 43 years. Contrary to expectations based on previous findings in addicts and problem drinkers, scores on the NMR scale did not differ between Low Risk and High Risk drinkers as defined by the Alcohol Use Disorders Identification Test (AUDIT). However, NMR scores were significantly negatively correlated with all three indices of frontal lobe dysfunction on the Frontal Systems Behavior Scale (FrSBe) Self-Rating Form as well as with all three indices of negative mood on the Depression Anxiety Stress Scales (DASS), which in turn were all positively correlated with FrSBe. Path analyses indicated that NMR partially mediated the direct effects of frontal lobe dysfunction (as indexed by FrSBe) on DASS Stress and DASS Depression. Further, the High Risk drinkers scored significantly higher on the Disinhibition and Executive Dysfunction indices of the FrSBe than did Low Risk drinkers. Results are consistent with the notion that NMR is a frontal lobe function.
Abstract:
Chlamydia trachomatis sexually transmitted infection can cause serious reproductive morbidities. This study determined the prevalence of a serum IgG response to C. trachomatis putative stress response proteins in females, to test for an association with genital tract pathology. There was no significant association of serum IgG to HtrA, Tsp, or RseP with infection or pathology. cHSP60 serum IgG prevalence was significantly associated with infection compared to negative (infertile) controls (p = 0.002), but not with upper genital tract pathology. Serum IgG1-4 antibody subclass responses to the antigens were not significantly different between cohorts, although different responses to each antigen were detected.
Abstract:
Hot and cold temperatures significantly increase mortality rates around the world, but which measure of temperature is the best predictor of mortality is not known. We used mortality data from 107 US cities for the years 1987–2000 and examined the association between temperature and mortality using Poisson regression and modelled a non-linear temperature effect and a non-linear lag structure. We examined mean, minimum and maximum temperature with and without humidity, and apparent temperature and the Humidex. The best measure was defined as that with the minimum cross-validated residual. We found large differences in the best temperature measure between age groups, seasons and cities, and there was no one temperature measure that was superior to the others. The strong correlation between different measures of temperature means that, on average, they have the same predictive ability. The best temperature measure for new studies can be chosen based on practical concerns, such as choosing the measure with the least amount of missing data.
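The model-comparison strategy described above, fitting a Poisson regression of daily deaths on each candidate temperature measure and choosing the measure with the smallest cross-validated residual, can be sketched as follows. The data are simulated, and a simple quadratic in temperature stands in for the study's non-linear temperature and lag functions; the cross-validated Poisson deviance plays the role of the residual criterion.

```python
# Sketch: compare two temperature measures by cross-validated Poisson deviance.
import numpy as np

rng = np.random.default_rng(0)
n = 730  # two years of daily data
mean_t = 15 + 10 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.normal(0, 2, n)
max_t = mean_t + 5 + rng.normal(0, 2, n)            # noisier proxy of the same signal
true_rate = np.exp(3 + 0.002 * (mean_t - 15) ** 2)  # U-shaped risk around 15 °C
deaths = rng.poisson(true_rate)

def fit_poisson(X, y, iters=50):
    """Poisson GLM (log link) fitted by iteratively reweighted least squares."""
    beta = np.zeros(X.shape[1])
    beta[0] = np.log(y.mean())                      # warm start at the mean rate
    for _ in range(iters):
        mu = np.exp(X @ beta)
        z = X @ beta + (y - mu) / mu                # working response
        W = mu                                      # IRLS weights
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    return beta

def cv_deviance(temp, y, folds=5):
    """Out-of-sample Poisson deviance for a quadratic model in `temp`."""
    X = np.column_stack([np.ones_like(temp), temp, temp ** 2])
    idx = np.arange(len(y)) % folds
    dev = 0.0
    for k in range(folds):
        beta = fit_poisson(X[idx != k], y[idx != k])
        mu, yk = np.exp(X[idx == k] @ beta), y[idx == k]
        with np.errstate(divide="ignore", invalid="ignore"):
            term = np.where(yk > 0, yk * np.log(yk / mu), 0.0)
        dev += 2 * np.sum(term - (yk - mu))
    return dev

# The less noisy measure typically attains the lower cross-validated deviance.
print(round(cv_deviance(mean_t, deaths), 1), round(cv_deviance(max_t, deaths), 1))
```

In this simulation the two measures are strongly correlated, echoing the paper's point that highly correlated temperature measures have, on average, similar predictive ability, so practical concerns such as missing data can reasonably drive the choice.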
Abstract:
The field of collaborative health planning faces significant challenges created by the narrow focus of the available information, the absence of a framework to organise that information and the lack of systems to make information accessible and guide decision-making. These challenges have been magnified by the rise of the ‘healthy communities movement’, as a result of which, there have been more frequent calls for localised, collaborative and evidence-driven health related decision-making. This paper discusses the role of decision support systems as a mechanism to facilitate collaborative health decision-making. The paper presents a potential information management framework to underpin a health decision support system and describes the participatory process that is currently being used to create an online tool for health planners using geographic information systems. The need for a comprehensive information management framework to guide the process of planning for healthy communities has been emphasised. The paper also underlines the critical importance of the proposed framework not only in forcing planners to engage with the entire range of health determinants, but also in providing sufficient flexibility to allow exploration of the local setting-based determinants of health.
Abstract:
The performance criteria of piezoelectric polymers based on polyvinylidene fluoride (PVDF) in complex space environments have been evaluated. Thin films of these materials are being explored as in-situ responsive materials for large-aperture space-based telescopes, with the shape deformation and optical features dependent on long-term degradation effects, mainly due to thermal cycling, vacuum UV exposure and atomic oxygen. A summary of previous studies related to materials testing and performance prediction based on a laboratory environment is presented. The degradation pathways are a combination of molecular chemical changes, primarily induced via radiative damage, and physical degradation processes due to temperature and atomic oxygen exposure, resulting in depoling, loss of orientation and surface erosion. Experimental validation of these materials for use in space is being conducted as part of MISSE-6 (Materials International Space Station Experiment), and an overview of the experimental strategies is discussed here.
Abstract:
Context: The magnitude of exercise-induced weight loss depends on the extent of compensatory responses. An increase in energy intake is likely to result from changes in the appetite control system toward an orexigenic environment; however, few studies have measured how exercise impacts on both orexigenic and anorexigenic peptides. ---------- Objective: The aim of the study was to investigate the effects of medium-term exercise on fasting/postprandial levels of appetite-related hormones and subjective appetite sensations in overweight/obese individuals. ---------- Design and Setting: We conducted a longitudinal study in a university research center. ---------- Participants and Intervention: Twenty-two sedentary overweight/obese individuals (age, 36.9 ± 8.3 yr; body mass index, 31.3 ± 3.3 kg/m2) took part in a 12-wk supervised exercise programme (five times per week, 75% maximal heart rate) and were requested not to change their food intake during the study. ---------- Main Outcome Measures: We measured changes in body weight and fasting/postprandial plasma levels of glucose, insulin, total ghrelin, acylated ghrelin (AG), peptide YY, and glucagon-like peptide-1 and feelings of appetite. ---------- Results: Exercise resulted in a significant reduction in body weight and fasting insulin and an increase in AG plasma levels and fasting hunger sensations. A significant reduction in postprandial insulin plasma levels and a tendency toward an increase in the delayed release of glucagon-like peptide-1 (90–180 min) were also observed after exercise, as well as a significant increase (127%) in the suppression of AG postprandially. ---------- Conclusions: Exercise-induced weight loss is associated with physiological and biopsychological changes toward an increased drive to eat in the fasting state. However, this seems to be balanced by an improved satiety response to a meal and improved sensitivity of the appetite control system.
Abstract:
Advances in symptom management strategies through a better understanding of cancer symptom clusters depend on the identification of symptom clusters that are valid and reliable. The purpose of this exploratory research was to investigate alternative analytical approaches to identify symptom clusters for patients with cancer, using readily accessible statistical methods, and to justify which methods of identification may be appropriate for this context. Three studies were undertaken: (1) a systematic review of the literature, to identify analytical methods commonly used for symptom cluster identification for cancer patients; (2) a secondary data analysis to identify symptom clusters and compare alternative methods, as a guide to best practice approaches in cross-sectional studies; and (3) a secondary data analysis to investigate the stability of symptom clusters over time. The systematic literature review identified, in the 10 years prior to March 2007, 13 cross-sectional studies implementing multivariate methods to identify cancer-related symptom clusters. The methods commonly used to group symptoms were exploratory factor analysis, hierarchical cluster analysis and principal components analysis. Common factor analysis methods were recommended as the best practice cross-sectional methods for cancer symptom cluster identification. A comparison of alternative common factor analysis methods was conducted, in a secondary analysis of a sample of 219 ambulatory cancer patients with mixed diagnoses, assessed within one month of commencing chemotherapy treatment. Principal axis factoring, unweighted least squares and image factor analysis identified five consistent symptom clusters, based on patient self-reported distress ratings of 42 physical symptoms. Extraction of an additional cluster was necessary when using alpha factor analysis to determine clinically relevant symptom clusters. 
The recommended approaches for symptom cluster identification using non-multivariate-normal data were: principal axis factoring or unweighted least squares for factor extraction, followed by oblique rotation; and use of the scree plot and Minimum Average Partial procedure to determine the number of factors. In contrast to other studies, which typically interpret pattern coefficients alone, in these studies symptom clusters were determined on the basis of structure coefficients. This approach was adopted for the stability of the results, as structure coefficients are correlations between factors and symptoms that are unaffected by the correlations between factors. Symptoms could be associated with multiple clusters as a foundation for investigating potential interventions. The stability of these five symptom clusters was investigated in separate common factor analyses, 6 and 12 months after chemotherapy commenced. Five qualitatively consistent symptom clusters were identified over time (Musculoskeletal-discomforts/lethargy, Oral-discomforts, Gastrointestinal-discomforts, Vasomotor-symptoms, Gastrointestinal-toxicities), but at 12 months two additional clusters were determined (Lethargy and Gastrointestinal/digestive symptoms). Future studies should include physical, psychological, and cognitive symptoms. Further investigation of the identified symptom clusters is required for validation, to examine causality, and potentially to suggest interventions for symptom management. Future studies should use longitudinal analyses to investigate change in symptom clusters, the influence of patient related factors, and the impact on outcomes (e.g., daily functioning) over time.
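The Minimum Average Partial (MAP) procedure recommended above for choosing the number of factors can be sketched as follows: components are partialled out of the correlation matrix one at a time, and the retained number is the one that minimises the average squared partial correlation. The correlation matrix below is a small synthetic example with two obvious variable clusters, not the study's 42-symptom data, and this is a simplified version of Velicer's original test.

```python
# Sketch of Velicer's Minimum Average Partial (MAP) test for the number
# of components/factors to retain from a correlation matrix.
import numpy as np

def minimum_average_partial(R):
    """Return the number of components selected by the MAP criterion."""
    p = R.shape[0]
    eigvals, eigvecs = np.linalg.eigh(R)
    order = np.argsort(eigvals)[::-1]               # largest components first
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    loadings = eigvecs * np.sqrt(np.maximum(eigvals, 0))
    avg_sq = []
    for m in range(p - 1):                          # m components partialled out
        A = loadings[:, :m]
        C = R - A @ A.T                             # residual covariance
        d = np.sqrt(np.clip(np.diag(C), 1e-12, None))
        partial = C / np.outer(d, d)                # residual (partial) correlations
        off = partial[~np.eye(p, dtype=bool)]       # off-diagonal entries only
        avg_sq.append(np.mean(off ** 2))
    return int(np.argmin(avg_sq))

# Two clearly separated clusters of correlated variables.
R = np.eye(6)
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5)]:
    R[i, j] = R[j, i] = 0.7
print(minimum_average_partial(R))  # 2
```

The average squared partial correlation falls while genuine common variance is being removed and rises once only unique variance remains, which is why its minimum marks the number of clusters to retain.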