711 results for CI
Abstract:
OBJECTIVES: Ecological studies have suggested an inverse relationship between latitude and risks of some cancers. However, associations between solar ultraviolet radiation (UVR) exposure and esophageal cancer risk have not been fully explored. We therefore investigated the associations of nevi, freckles, and measures of ambient UVR over the life course with risks of esophageal cancers. METHODS: We compared estimated lifetime residential ambient UVR among Australian patients with esophageal cancer (330 with esophageal adenocarcinoma (EAC), 386 with esophago-gastric junction adenocarcinoma (EGJAC), and 279 with esophageal squamous cell carcinoma (ESCC)) and 1471 population controls. We asked people where they had lived at different periods of their life, and assigned ambient UVR to each location based on measurements from NASA's Total Ozone Mapping Spectrometer database. Freckling and nevus burden were self-reported. We used multivariable logistic regression models to estimate the magnitude of associations between phenotype, ambient UVR, and esophageal cancer risk. RESULTS: Compared with population controls, patients with EAC and EGJAC were less likely to have high levels of estimated cumulative lifetime ambient UVR (EAC odds ratio (OR) 0.59, 95% confidence interval (CI) 0.35-0.99; EGJAC OR 0.55, 0.34-0.90). We found no association between UVR and risk of ESCC (OR 0.91, 0.51-1.64). The associations were independent of age, sex, body mass index, education, state of recruitment, frequency of reflux, smoking status, alcohol consumption, and H. pylori serostatus. Patients with EAC were also significantly less likely to report high levels of nevi than controls. CONCLUSIONS: These data show an inverse association between ambient solar UVR at residential locations and risk of EAC and EGJAC, but not ESCC.
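By way of illustration, odds ratios like those above are the standard output of a multivariable logistic regression of case status on exposure plus confounders. The sketch below is not the authors' code; the DataFrame and its column names (case, uvr_high, age, sex, bmi, smoker) are hypothetical stand-ins for the study variables.

```python
# Minimal sketch: multivariable logistic regression returning odds ratios
# with 95% CIs. All column names are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def odds_ratios(df: pd.DataFrame) -> pd.DataFrame:
    """Fit case status on exposure and confounders; return ORs and 95% CIs."""
    res = smf.logit("case ~ uvr_high + age + C(sex) + bmi + C(smoker)",
                    data=df).fit(disp=False)
    ci = res.conf_int()  # columns 0 and 1 hold the CI bounds on the log scale
    out = pd.DataFrame({"OR": np.exp(res.params),
                        "CI_low": np.exp(ci[0]),
                        "CI_high": np.exp(ci[1])})
    return out.drop(index="Intercept")  # exponentiated coefficients = ORs
```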
Abstract:
The medical records of 273 patients aged 75 years and older were reviewed to evaluate the quality of emergency department (ED) care through the use of quality indicators. One hundred fifty records contained evidence of an attempt to carry out a cognitive assessment. Documented evidence of cognitive impairment (CI) was reported in 54 cases. Of these patients, 30 had no documented evidence of an acute change in cognitive function from baseline; of 26 patients discharged home with preexisting CI (i.e., no acute change from baseline), 15 had no documented evidence of previous consideration of this issue by a health care provider; and 12 of 21 discharged patients who screened positive for cognitive issues for the first time were not referred for outpatient evaluation. These findings suggest that the majority of older adults in the ED do not receive a formal cognitive assessment, and more than half of those with CI do not receive care that meets the quality indicators for geriatric emergency care. Recommendations for improvement are discussed.
Abstract:
What is the contribution of innovation brokers in leveraging research and development (R&D) investment to enhance industry-wide capabilities? The case of the Australian Cooperative Research Centre for Construction Innovation (CRC CI) is considered in the context of motivating supply chain firms to improve their organizational capabilities in order to acquire, assimilate, transfer and exploit R&D outcomes to their advantage, and to create broader industry and national benefits. A previous audit and analysis has shown an increase in business R&D investment since 2001. The role of the CRC CI in contributing to growth in the absorptive capacity of the Australian construction industry as a whole is illustrated through two programmes: digital modelling/building information modelling (BIM) and construction site safety. Numerous positive outcomes in productivity, quality, improved safety and competitiveness were achieved between 2001 and 2009.
Abstract:
Background & aims: The Australasian Nutrition Care Day Survey (ANCDS) ascertained whether malnutrition and poor food intake are independent risk factors for health-related outcomes in Australian and New Zealand hospital patients. Methods: Phase 1 recorded nutritional status (Subjective Global Assessment) and 24-h food intake (0, 25, 50, 75, 100% intake). Outcomes data (Phase 2) were collected 90 days post-Phase 1 and included length of hospital stay (LOS), readmissions and in-hospital mortality. Results: Of 3122 participants (47% females, 65 ± 18 years) from 56 hospitals, 32% were malnourished and 23% consumed ≤ 25% of the offered food. Malnourished patients had greater median LOS (15 days vs. 10 days, p < 0.0001) and readmission rates (36% vs. 30%, p = 0.001). Median LOS for patients consuming ≤ 25% of the food was higher than for those consuming ≥ 50% (13 vs. 11 days, p < 0.0001). The odds of 90-day in-hospital mortality were about twice as high for malnourished patients (95% CI: 1.09–3.34, p = 0.023) and for those consuming ≤ 25% of the offered food (95% CI: 1.13–3.51, p = 0.017). Conclusion: The ANCDS establishes that malnutrition and poor food intake are independently associated with in-hospital mortality in the Australian and New Zealand acute care setting.
Abstract:
Rationale: The Australasian Nutrition Care Day Survey (ANCDS) evaluated whether malnutrition and decreased food intake are independent risk factors for negative outcomes in hospitalised patients. Methods: A multicentre (56 hospitals) cross-sectional survey was conducted in two phases. Phase 1 evaluated nutritional status (defined by Subjective Global Assessment) and 24-hour food intake recorded as 0, 25, 50, 75, and 100% intake. Phase 2 data, which included length of stay (LOS), readmissions and mortality, were collected 90 days post-Phase 1. Logistic regression was used to control for confounders: age, gender, disease type and severity (using Patient Clinical Complexity Level scores). Results: Of 3122 participants (53% males, mean age: 65±18 years), 32% were malnourished and 23% consumed ≤25% of the offered food. Median LOS for malnourished (MN) patients was higher than for well-nourished (WN) patients (15 vs. 10 days, p<0.0001). Median LOS for patients consuming ≤25% of the food was higher than for those consuming ≥50% (13 vs. 11 days, p<0.0001). MN patients had higher readmission rates (36% vs. 30%, p = 0.001). The odds of 90-day in-hospital mortality were 1.8 times greater for MN patients (95% CI: 1.03–3.22, p = 0.04) and 2.7 times greater for those consuming ≤25% of the offered food (95% CI: 1.54–4.68, p = 0.001). Conclusion: The ANCDS demonstrates that malnutrition and/or decreased food intake are associated with longer LOS and readmissions. The survey also establishes that malnutrition and decreased food intake are independent risk factors for in-hospital mortality in acute care patients, and highlights the need for appropriate nutritional screening and support during hospitalisation. Disclosure of Interest: None Declared.
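As a worked example of the arithmetic behind figures such as "odds ... 1.8 times greater (95% CI: 1.03–3.22)", the sketch below computes an odds ratio and its 95% CI from a 2×2 table with the standard log (Woolf) method; the counts are invented, not the ANCDS data.

```python
# Minimal sketch: odds ratio with a 95% CI from a 2x2 table (Woolf method).
# The example counts are placeholders, not the survey's data.
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """a, b = deaths/survivors among exposed; c, d = among unexposed."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

print(odds_ratio_ci(30, 970, 25, 2097))  # illustrative counts only
```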
Abstract:
We review the theory of intellectual property (IP) in the creative industries (CI) from the evolutionary economic perspective based on evidence from China. We argue that many current confusions and dysfunctions about IP can be traced to three widely overlooked aspects of the growth of knowledge context of IP in the CI: (1) the effect of globalization; (2) the dominating relative economic value of reuse of creative output over monopoly incentives to create input; and (3) the evolution of business models in response to institutional change. We conclude that a substantial weakening of copyright will, in theory, produce positive net public and private gain due to the evolutionary dynamics of all three dimensions.
Abstract:
Background: Chronic leg ulcers cause long-term ill-health for older adults and place a significant burden on health service resources. Although evidence on effective management of the condition is available, a significant evidence-practice gap is known to exist, for many suggested reasons, e.g. multiple care providers and the costs of care and treatments. This study aimed to identify effective health service pathways of care which facilitated evidence-based management of chronic leg ulcers. Methods: A sample of 70 patients presenting with a lower limb (leg or foot) ulcer at specialist wound clinics in Queensland, Australia were recruited for an observational study and survey. Retrospective data were collected on demographics, health, medical history, treatments, costs and health service pathways in the previous 12 months. Prospective data were collected on health service pathways, pain, functional ability, quality of life, treatments, wound healing and recurrence outcomes for 24 weeks from admission. Results: Retrospective data indicated that evidence-based guidelines were poorly implemented prior to admission to the study, e.g. only 31% of participants with a lower limb ulcer had an ABPI or duplex assessment in the previous 12 months. On average, participants accessed care 2–3 times/week for 17 weeks from multiple health service providers in the 12 months before admission to the study clinics. Following admission to specialist wound clinics, participants accessed care on average once per week for 12 weeks from a smaller range of providers. The median ulcer duration on admission to the study was 22 weeks (range 2–728 weeks). Following admission to wound clinics, implementation of key indicators of evidence-based care increased (p<0.001), and Kaplan-Meier survival analysis found the median time to healing was 12 weeks (95% CI 9.3–14.7). Implementation of evidence-based care was significantly related to improved healing outcomes (p<0.001). Conclusions: This study highlights the complexities involved in accessing expertise and evidence-based wound care for adults with chronic leg or foot ulcers. Results demonstrate that access to wound management expertise can promote streamlined health services and evidence-based wound care, leading to efficient use of health resources and improved health.
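The median time to healing with its 95% CI is a Kaplan-Meier quantity. A minimal sketch of that estimate follows, using the lifelines library and invented follow-up data (weeks observed and whether the ulcer healed), not the study's records.

```python
# Minimal sketch: Kaplan-Meier median time-to-healing with a 95% CI.
# The durations and event indicators below are illustrative only.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.utils import median_survival_times

weeks  = np.array([4, 8, 10, 12, 12, 16, 20, 24, 24, 24])  # follow-up (weeks)
healed = np.array([1, 1,  1,  1,  0,  1,  1,  1,  0,  0])  # 1 = ulcer healed

kmf = KaplanMeierFitter().fit(weeks, event_observed=healed)
print(kmf.median_survival_time_)                        # median weeks to healing
print(median_survival_times(kmf.confidence_interval_))  # 95% CI for the median
```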
Abstract:
Objective: To establish risk factors for moderate and severe microbial keratitis among daily contact lens (CL) wearers in Australia. Design: A prospective, 12-month, population-based, case-control study. Participants: New cases of moderate and severe microbial keratitis in daily-wear CL users presenting in Australia over a 12-month period were identified through surveillance of all ophthalmic practitioners. Case detection was augmented by record audits at major ophthalmic centers. Controls were users of daily-wear CLs in the community identified using a national telephone survey. Testing: Cases and controls were interviewed by telephone to determine subject demographics and CL wear history. Multiple binary logistic regression was used to determine independent risk factors, and the univariate population attributable risk percentage (PAR%) was estimated for each risk factor. Main Outcome Measures: Independent risk factors, relative risk (with 95% confidence intervals [CIs]), and PAR%. Results: There were 90 eligible moderate and severe cases related to daily wear of CLs reported during the study period. We identified 1090 community controls using daily-wear CLs. Independent risk factors for moderate and severe keratitis, after adjusting for age, gender, and lens material type, included poor storage case hygiene 6.4× (95% CI, 1.9-21.8; PAR, 49%), infrequent storage case replacement 5.4× (95% CI, 1.5-18.9; PAR, 27%), solution type 7.2× (95% CI, 2.3-22.5; PAR, 35%), occasional overnight lens use (<1 night per week) 6.5× (95% CI, 1.3-31.7; PAR, 23%), high socioeconomic status 4.1× (95% CI, 1.2-14.4; PAR, 31%), and smoking 3.7× (95% CI, 1.1-12.8; PAR, 31%). Conclusions: Moderate and severe microbial keratitis associated with daily use of CLs was independently associated with factors likely to cause contamination of CL storage cases (frequency of storage case replacement, hygiene, and solution type). Other factors included occasional overnight use of CLs, smoking, and socioeconomic class. Disease load may be considerably reduced by attention to modifiable risk factors related to CL storage case practice.
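PAR% values of the kind quoted above follow from Levin's formula, PAR% = Pe(RR − 1) / (1 + Pe(RR − 1)) × 100, where Pe is the exposure prevalence (here, among community controls) and RR is approximated by the odds ratio. A minimal sketch, with an invented prevalence rather than the study's:

```python
# Minimal sketch: univariate population attributable risk percent (Levin).
def par_percent(prevalence_exposed: float, rr: float) -> float:
    """PAR% = Pe*(RR-1) / (1 + Pe*(RR-1)) * 100."""
    excess = prevalence_exposed * (rr - 1.0)
    return 100.0 * excess / (1.0 + excess)

# e.g. a risk factor carried by 15% of controls with a 6.4-fold risk:
print(round(par_percent(0.15, 6.4), 1))  # illustrative inputs only
```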
Abstract:
Background: The mechanisms underlying socioeconomic inequalities in mortality from cardiovascular diseases (CVD) are largely unknown. We studied the contribution of childhood socioeconomic conditions and adulthood risk factors to inequalities in CVD mortality in adulthood. Methods: The prospective GLOBE study was carried out in the Netherlands, with baseline data from 1991, and linked with the cause-of-death register in 2007. At baseline, participants reported on adulthood socioeconomic position (SEP) (own educational level), childhood socioeconomic conditions (occupational level of respondent’s father), and a broad range of adulthood risk factors (health behaviours, material circumstances, psychosocial factors). The present study is based on 5,395 men and 6,306 women, and the data were analysed using Cox regression models and hazard ratios (HR). Results: A low adulthood SEP was associated with increased CVD mortality for men (HR 1.84; 95% CI: 1.41-2.39) and women (HR 1.80; 95% CI: 1.04-3.10). Those with poorer childhood socioeconomic conditions were more likely to die from CVD in adulthood, but this reached statistical significance only among men with the poorest childhood socioeconomic circumstances. About half of the investigated adulthood risk factors showed significant associations with CVD mortality among both men and women, namely renting a house, experiencing financial problems, smoking, physical activity and marital status. Alcohol consumption and BMI showed a U-shaped relationship with CVD mortality among women, with the risk being significantly greater for both abstainers and heavy drinkers, and among women who were underweight or obese. Among men, being single or divorced and using sleep/anxiety drugs increased the risk of CVD mortality. In explanatory models, the largest contributors to adulthood CVD inequalities were material conditions for men (42%; 95% CI: −73 to −20) and behavioural factors for women (55%; 95% CI: −191 to −28). Simultaneous adjustment for adulthood risk factors and childhood socioeconomic conditions attenuated the HR for the lowest adulthood SEP to 1.34 (95% CI: 0.99-1.82) for men and 1.19 (95% CI: 0.65-2.15) for women. Conclusions: Adulthood material, behavioural and psychosocial factors played a major role in the explanation of adulthood SEP inequalities in CVD mortality. Childhood socioeconomic circumstances made a modest contribution, mainly via their association with adulthood risk factors. Policies and interventions to reduce health inequalities are likely to be most effective when considering the influence of socioeconomic circumstances across the entire life course, and in particular poor material conditions and unhealthy behaviours in adulthood.
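Hazard ratios of this kind come from a Cox proportional-hazards model of time to CVD death on SEP and risk factors. The sketch below uses the lifelines library with a tiny invented DataFrame; the column names are hypothetical, not the GLOBE variables.

```python
# Minimal sketch: Cox proportional-hazards model yielding hazard ratios
# (the exp(coef) column) with 95% CIs. All data below are invented.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "years":  [15.2, 16.0, 9.8, 12.4, 16.0, 7.1, 14.0, 11.5],  # follow-up time
    "died":   [1, 0, 1, 0, 0, 1, 1, 0],                        # 1 = CVD death
    "low_ed": [1, 0, 1, 1, 0, 0, 1, 0],                        # low education
    "smoker": [1, 1, 0, 0, 1, 1, 1, 0],
})
cph = CoxPHFitter().fit(df, duration_col="years", event_col="died")
cph.print_summary()  # exp(coef) = hazard ratio per covariate
```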
Abstract:
Objective: To examine the association between individual- and neighborhood-level disadvantage and self-reported arthritis. Methods: We used data from a population-based cross-sectional study conducted in 2007 among 10,757 men and women ages 40–65 years, selected from 200 neighborhoods in Brisbane, Queensland, Australia using a stratified 2-stage cluster design. Data were collected using a mail survey (68.5% response). Neighborhood disadvantage was measured using a census-based composite index, and individual disadvantage was measured using self-reported education, household income, and occupation. Arthritis was indicated by self-report. Data were analyzed using multilevel modeling. Results: The overall rate of self-reported arthritis was 23% (95% confidence interval [95% CI] 22–24). After adjustment for sociodemographic factors, arthritis prevalence was greatest for women (odds ratio [OR] 1.5, 95% CI 1.4–1.7), those ages 60–65 years (OR 4.4, 95% CI 3.7–5.2), those with a diploma/associate diploma (OR 1.3, 95% CI 1.1–1.6), those who were permanently unable to work (OR 4.0, 95% CI 3.1–5.3), and those with a household income <$25,999 (OR 2.1, 95% CI 1.7–2.6). Independent of individual-level factors, residents of the most disadvantaged neighborhoods were 42% (OR 1.4, 95% CI 1.2–1.7) more likely than those in the least disadvantaged neighborhoods to self-report arthritis. Cross-level interactions between neighborhood disadvantage and education, occupation, and household income were not significant. Conclusion: Arthritis prevalence is greater in more socially disadvantaged neighborhoods. These are the first multilevel data to examine the relationship of individual- and neighborhood-level disadvantage with arthritis, and they have important implications for policy, health promotion, and other intervention strategies designed to reduce the rates of arthritis, indicating that intervention efforts may need to focus on both people and places.
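Multilevel modeling here means a logistic model with a random intercept for each neighborhood, so individual- and neighborhood-level effects are estimated jointly. A minimal sketch using statsmodels' Bayesian mixed GLM follows; the formula's column names are hypothetical stand-ins for the study variables.

```python
# Minimal sketch: random-intercept (multilevel) logistic regression with
# respondents nested in neighborhoods. Column names are placeholders.
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

def fit_multilevel(df: pd.DataFrame):
    """arthritis (0/1) on individual SES plus neighborhood disadvantage."""
    model = BinomialBayesMixedGLM.from_formula(
        "arthritis ~ age + C(sex) + education + income + disadvantage",
        {"nbhd": "0 + C(neighborhood)"},  # one random intercept per neighborhood
        df,
    )
    return model.fit_vb()  # variational Bayes approximation
```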
Abstract:
PURPOSE: To test the reliability of Timed Up and Go Tests (TUGTs) in cardiac rehabilitation (CR) and compare TUGTs to the 6-Minute Walk Test (6MWT) for outcome measurement. METHODS: Sixty-one of 154 consecutive community-based CR patients were prospectively recruited. Subjects undertook repeated TUGTs and 6MWTs at the start of CR (start-CR), postdischarge from CR (post-CR), and 6 months postdischarge from CR (6 months post-CR). The main outcome measurements were TUGT time (TUGTT) and 6MWT distance (6MWD). RESULTS: Mean (SD) TUGTT1 and TUGTT2 at the 3 assessments were 6.29 (1.30) and 5.94 (1.20); 5.81 (1.22) and 5.53 (1.09); and 5.39 (1.60) and 5.01 (1.28) seconds, respectively. A reduction in TUGTT occurred between each outcome point (P ≤ .002). Repeated TUGTTs were strongly correlated at each assessment, intraclass correlation (95% CI) = 0.85 (0.76–0.91), 0.84 (0.73–0.91), and 0.90 (0.83–0.94), despite a reduction between TUGTT1 and TUGTT2 of 5%, 5%, and 7%, respectively (P ≤ .006). Relative decreases in TUGTT1 (TUGTT2) occurred from start-CR to post-CR and from start-CR to 6 months post-CR of −7.5% (−6.9%) and −14.2% (−15.5%), respectively, while relative increases in 6MWD1 (6MWD2) occurred, 5.1% (7.2%) and 8.4% (10.2%), respectively (P < .001 in all cases). Pearson correlation coefficients for 6MWD1 to TUGTT1 and TUGTT2 across all times were −0.60 and −0.68 (P < .001) and the intraclass correlations (95% CI) for the speeds derived from averaged 6MWDs and TUGTTs were 0.65 (0.54, 0.73) (P < .001). CONCLUSIONS: Similar relative changes occurred for the TUGT and the 6MWT in CR. A significant correlation between the TUGTT and 6MWD was demonstrated, and we suggest that the TUGT may provide a related or a supplementary measurement of functional capacity in CR.
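The intraclass correlations above quantify test-retest agreement between the repeated TUGTs. A minimal sketch with the pingouin library and invented long-format data (one row per subject × trial) follows; ICC2 (two-way random effects, single measurement) is a common choice for this design, though the paper's exact variant is not restated here.

```python
# Minimal sketch: intraclass correlation for repeated TUGT times.
# Data are invented and in long format: one row per subject x trial.
import pandas as pd
import pingouin as pg

long = pd.DataFrame({
    "subject": [1, 1, 2, 2, 3, 3, 4, 4],
    "trial":   ["TUGT1", "TUGT2"] * 4,
    "seconds": [6.3, 5.9, 5.8, 5.5, 7.0, 6.6, 5.4, 5.0],
})
icc = pg.intraclass_corr(data=long, targets="subject",
                         raters="trial", ratings="seconds")
print(icc[["Type", "ICC", "CI95%"]])  # ICC2 row = two-way random, single rater
```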
Abstract:
BACKGROUND: A long length of stay (LOS) in the emergency department (ED) associated with overcrowding has been found to adversely affect the quality of ED care. The objective of this study is to determine whether patients who speak a language other than English at home have a longer LOS in EDs compared to those who speak only English at home. METHODS: A secondary data analysis of a Queensland state-wide hospital EDs dataset (Emergency Department Information System) was conducted for the period 1 January 2008 to 31 December 2010. RESULTS: The interpreter requirement was highest among Vietnamese speakers (23.1%), followed by Chinese (19.8%) and Arabic speakers (18.7%). There were significant differences in the distributions of departure statuses among the language groups (Chi-squared=3236.88, P<0.001). Compared with English speakers, the beta coefficient for LOS in the EDs measured in minutes was, among Vietnamese, 26.3 (95% CI: 22.1–30.5); Arabic, 10.3 (95% CI: 7.3–13.2); Spanish, 9.4 (95% CI: 7.1–11.7); Chinese, 8.6 (95% CI: 2.6–14.6); Hindi, 4.0 (95% CI: 2.2–5.7); Italian, 3.5 (95% CI: 1.6–5.4); and German, 2.7 (95% CI: 1.0–4.4). The final regression model explained 17% of the variability in LOS. CONCLUSION: There is a close relationship between the language spoken at home and LOS in EDs, indicating that language could be an important predictor of prolonged LOS in EDs and that improving language services might reduce LOS and ease overcrowding in EDs in Queensland's public hospitals.
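The beta coefficients above read as mean extra minutes of ED stay relative to English speakers, which an ordinary least-squares model with a treatment-coded language factor yields directly. A minimal sketch, with hypothetical column names rather than the Emergency Department Information System fields:

```python
# Minimal sketch: OLS of ED length of stay (minutes) on language, with
# English as the reference level. Column names are placeholders.
import pandas as pd
import statsmodels.formula.api as smf

def los_by_language(df: pd.DataFrame):
    """Return per-language betas (minutes vs. English) and their 95% CIs."""
    res = smf.ols(
        "los_minutes ~ C(language, Treatment(reference='English'))"
        " + age + C(triage_category)",
        data=df,
    ).fit()
    return res.params, res.conf_int()
```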
Abstract:
Compression ignition (CI) engine design is subject to many constraints, which presents a multi-criteria optimisation problem that the engine researcher must solve. In particular, the modern CI engine must not only be efficient, but must also deliver low gaseous, particulate and life cycle greenhouse gas emissions so that its impact on urban air quality, human health and global warming is minimised. Consequently, this study undertakes a multi-criteria analysis which seeks to identify alternative fuels, injection technologies and combustion strategies that could potentially satisfy these CI engine design constraints. Three datasets are analysed with the Preference Ranking Organization Method for Enrichment Evaluations and Geometrical Analysis for Interactive Aid (PROMETHEE-GAIA) algorithm to explore the impact of: (1) an ethanol fumigation system; (2) alternative fuels (20% biodiesel and synthetic diesel) and alternative injection technologies (mechanical direct injection and common rail injection); and (3) various biodiesel fuels made from three feedstocks (i.e. soy, tallow, and canola) tested at several blend percentages (20-100%) on the resulting emissions and efficiency profile of the various test engines. The results show that moderate ethanol substitutions (~20% by energy) at moderate load, high percentage soy blends (60-100%), and alternative fuels (biodiesel and synthetic diesel) provide an efficiency and emissions profile that yields the most “preferred” solutions to this multi-criteria engine design problem. Further research is, however, required to reduce Reactive Oxygen Species (ROS) emissions with alternative fuels, and to deliver technologies that do not significantly reduce the median diameter of particle emissions.
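PROMETHEE ranks alternatives by pairwise preference comparisons on each criterion, aggregated into net outranking flows (GAIA adds a principal-components view of those flows). The sketch below implements a minimal PROMETHEE II with a linear preference function; the fuels, criteria, weights, and thresholds are all invented for illustration.

```python
# Minimal sketch: PROMETHEE II net-flow ranking with linear preference
# functions. Alternatives, weights, and thresholds are illustrative only.
import numpy as np

def promethee_ii(X, weights, maximize, p):
    """X: alternatives x criteria; p: linear preference thresholds."""
    n = X.shape[0]
    signed = np.where(maximize, X, -X)        # flip minimized criteria
    phi = np.zeros(n)
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            d = signed[a] - signed[b]
            pref = np.clip(d / p, 0.0, 1.0)   # linear preference in [0, 1]
            pref[d <= 0] = 0.0                # no preference when not better
            pi_ab = float(weights @ pref)     # aggregated preference of a over b
            phi[a] += pi_ab / (n - 1)         # contributes to positive flow
            phi[b] -= pi_ab / (n - 1)         # contributes to negative flow
    return phi                                # net flow; higher is better

# Three fuels scored on efficiency (max), NOx (min), PM (min):
X = np.array([[0.42, 5.1, 0.030],
              [0.40, 4.2, 0.022],
              [0.38, 3.9, 0.018]])
phi = promethee_ii(X, weights=np.array([0.5, 0.3, 0.2]),
                   maximize=np.array([True, False, False]),
                   p=np.array([0.05, 1.5, 0.01]))
print(np.argsort(-phi))  # ranking, best first
```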
Abstract:
Background: Decreased ability to perform Activities of Daily Living (ADLs) during hospitalisation has negative consequences for patients and health service delivery. Objective: To develop an index to stratify patients at lower and higher risk of a significant decline in ability to perform ADLs at discharge. Design: Prospective two-cohort study comprising a derivation cohort (n=389; mean age 82.3 years; SD 7.1) and a validation cohort (n=153; mean age 81.5 years; SD 6.1). Patients and setting: General medical patients aged ≥70 years admitted to three university-affiliated acute care hospitals in Brisbane, Australia. Measurement and main results: The short ADL Scale was used to identify a significant decline in ability to perform ADLs from premorbid status to discharge. In the derivation cohort, 77 patients (19.8%) experienced a significant decline. Four significant factors were identified for patients independent at baseline: 'requiring moderate assistance to being totally dependent on others with bathing'; 'difficulty understanding others (frequently or all the time)'; 'requiring moderate assistance to being totally dependent on others with performing housework'; and a 'history of experiencing at least one fall in the 90 days prior to hospital admission'; 'independent at baseline' was itself protective against decline at discharge. 'Difficulty understanding others (frequently or all the time)' and 'requiring moderate assistance to being totally dependent on others with performing housework' were also predictors for patients dependent in ADLs at baseline. Sensitivity, specificity, Positive Predictive Value (PPV), and Negative Predictive Value (NPV) of the dichotomised DADLD risk scores were: 83.1% (95% CI 72.8–90.7); 60.5% (95% CI 54.8–65.9); 34.2% (95% CI 27.5–41.5); and 93.5% (95% CI 89.2–96.5). In the validation cohort, 47 patients (30.7%) experienced a significant decline. Sensitivity, specificity, PPV and NPV of the DADLD were: 78.7% (95% CI 64.3–89.3); 69.8% (95% CI 60.1–78.3); 53.6% (95% CI 41.2–65.7); and 88.1% (95% CI 79.2–94.1). Conclusions: The DADLD Index is a useful tool for identifying patients at higher risk of decline in ability to perform ADLs at discharge.
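Sensitivity, specificity, PPV, and NPV are simple proportions from the 2×2 table of predicted versus observed decline, each with a binomial CI. A minimal sketch using Wilson intervals follows; the counts are placeholders, not the derivation-cohort data.

```python
# Minimal sketch: screening-test metrics with 95% Wilson CIs, the form of
# the sensitivity/specificity/PPV/NPV figures reported for the DADLD.
from statsmodels.stats.proportion import proportion_confint

def metric(numer: int, denom: int):
    """Proportion with a 95% Wilson confidence interval."""
    lo, hi = proportion_confint(numer, denom, method="wilson")
    return numer / denom, lo, hi

tp, fn, fp, tn = 64, 13, 123, 189   # illustrative 2x2 counts only
print("sensitivity", metric(tp, tp + fn))
print("specificity", metric(tn, tn + fp))
print("PPV        ", metric(tp, tp + fp))
print("NPV        ", metric(tn, tn + fn))
```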
Abstract:
Objective: The aim of this study was to examine the prevalence of overweight and obesity and their association with demographic, reproductive and work variables in a representative cohort of working nurses and midwives. Design: A cross-sectional study of self-reported survey data. Settings: Australia, New Zealand and the United Kingdom. Methods: Measurement outcomes included BMI categories, demographic (age, gender, marital status, ethnicity), reproductive (parity, number of births, mother's age at first birth, birth type and menopausal status) and workforce (registration council, employment type and principal specialty) variables. Participants: 4996 respondents to the Nurses and Midwives e-Cohort Study who were currently registered and working in nursing or midwifery in Australia (n=3144), New Zealand (n=778) or the United Kingdom (n=1074). Results: Amongst the sample, 61.87% were outside the healthy weight range, and across all three jurisdictions the prevalence of obesity in nurses and midwives exceeded rates in the source populations by 1.73% to 3.74%. Being overweight or obese was significantly associated with increasing age (35–44 yrs aOR 1.71, 95% CI 1.41–2.08; 45–54 yrs aOR 1.90, 95% CI 1.56–2.31; 55–64 yrs aOR 2.22, 95% CI 1.71–2.88) and male gender (aOR 1.46, 95% CI 1.15–1.87). Primiparous nurses and midwives were more likely to be overweight or obese (aOR 1.37, 95% CI 1.06–1.76), as were those who had reached menopause (aOR 1.37, 95% CI 1.11–1.69). Nurses and midwives in part-time or casual employment had significantly reduced risk of being overweight or obese (aOR 0.81, 95% CI 0.70–0.94 and aOR 0.75, 95% CI 0.59–0.96 respectively), whilst working in aged care carried increased risk (aOR 1.37, 95% CI 1.04–1.80). Conclusion: Nurses and midwives in this study had a higher prevalence of obesity and overweight than the general population, and those who are older, male, primiparous or menopausal have significantly higher risk of overweight or obesity, as do those working full-time or in aged care. The consequences of overweight and obesity in this occupational group may impact on their workforce participation and their management of overweight and obese patients in their care, as well as influencing their individual health behaviours and risks of occupational injury and chronic disease.