Abstract:
Study/Objective This study examines the current state of disaster response education for Australian paramedics from a national and international perspective and identifies both potential gaps in content and challenges to the sustainability of knowledge acquired through occasional training. Background As demands for domestic and international disaster response increase, experience in the field has begun to challenge the traditional assumption that responding to mass-casualty events requires little specialist training. The need for a “streamlined process of safe medical team deployment into disaster regions” [1] is generally accepted and, in Australia, the emergence of national humanitarian aid training has begun to respond to this gap. However, calls for a national framework for disaster health education [2] have gained little traction. Methods A critical analysis of the peer-reviewed and grey literature on the core components/competencies and training methods required to prepare Australian paramedics to contribute to effective health disaster response has been conducted. Research from the past 10 years has been examined, along with federal and state policy on paramedic disaster education. Results The literature shows that education and training for disaster response is variable and that an evidence-based study specifically designed to outline sets of core competencies for Australian health care professionals has never been undertaken. While such competencies in disaster response have been developed for the American paradigm, it is suggested that disaster response within the Australian context differs somewhat from that of the US, and therefore a gap in the current knowledge base exists. Conclusion Further research is needed to develop core competencies specific to Australian paramedics in order to standardise teaching in the area of health disaster management. Until this occurs, the task of evaluating or creating disaster curricula that adequately prepare and maintain paramedics for an effective all-hazards disaster response remains largely unattainable.
Abstract:
Meat/meat alternatives (M/MA) are key sources of Fe, Zn and protein, but intake tends to be low in young children. Australian recommendations state that Fe-rich foods, including M/MA, should be the first complementary foods offered to infants. The present paper reports M/MA consumption by Australian infants and toddlers, compares intake with guidelines, and suggests strategies to enhance adherence to those guidelines. Mother–infant dyads recruited as part of the NOURISH and South Australian Infants Dietary Intake studies provided 3 d of intake data at three time points: Time 1 (T1) (n 482, mean age 5·5 (SD 1·1) months), Time 2 (T2) (n 600, mean age 14·0 (SD 1·2) months) and Time 3 (T3) (n 533, mean age 24·0 (SD 0·7) months). Of the 170 infants consuming solids and aged greater than 6 months at T1, 50 (29 %) consumed beef, lamb, veal (BLV) or pork on at least one of the 3 d. Commercial infant foods containing BLV or poultry were the most common form of M/MA consumed at T1, whilst by T2 BLV mixed dishes (including pasta bolognaise) had become more popular and remained so at T3. Processed M/MA increased in popularity over time, led by pork (including ham). The present study shows that M/MA are not being eaten by Australian infants or toddlers regularly enough, or in adequate quantities, to meet recommendations, and that the form in which these foods are eaten can lead to smaller M/MA serve sizes and greater Na intake. Parents should be encouraged to offer M/MA in a recognisable form, as one of the first complementary foods, in order to increase acceptance at a later age.
Abstract:
Background Dementia is a chronic illness without cure or effective treatment, which results in declining mental and physical function and a need for assistance from others to manage activities of daily living. Many people with dementia live in long-term care facilities, yet research into their quality of life (QoL) was rare until the last decade. Previous studies failed to incorporate important variables related to the facility and care provision, or to look closely at the daily lives of residents. This paper presents a protocol for a comprehensive, multi-perspective assessment of the QoL of residents with dementia living in long-term care in Australia. A secondary aim is to investigate the effectiveness of self-report instruments for measuring QoL. Methods The study uses a descriptive, mixed-methods design to examine how facility, care staff and resident factors impact QoL. Over 500 residents with dementia from a stratified, random sample of 53 facilities are being recruited. A sub-sample of 12 residents is also taking part in qualitative interviews and observations. Conclusions This national study will provide a broad understanding of the factors underlying QoL for residents with dementia in long-term care. The study applies a methodology similar to that of the US-based Collaborative Studies of Long Term Care (CS-LTC) Dementia Care Study to the Australian setting.
Abstract:
This paper examines factors that affect the trade of recyclable waste in both exporting and importing countries. To this end, we employ two important elements: first, we adopt a gravity model in our empirical methodology; second, we select five waste and scrap commodities and undertake estimations using commodity-level trade data. We demonstrate that the higher the wage, per capita GDP or population of an importing country, the more recyclable waste it imports. This result suggests that the demand for final goods and, accordingly, the demand for materials, including recycled material, have strong effects on the import volume of recyclable waste. It also implies that a developing country's imports from developed countries increase with expanding industrial activity and economic growth. We find no evidence of a pollution haven for waste and recycling.
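For context, a log-linear gravity specification of the kind typically estimated in such studies can be written as follows. The abstract does not give the paper's exact covariates or functional form, so the importer-side terms below simply mirror the factors it names (wage, per capita GDP, population), while the exporter GDP and bilateral distance terms are standard gravity-model assumptions:

    \ln X_{ijk} = \beta_0 + \beta_1 \ln \mathit{wage}_j + \beta_2 \ln \mathit{GDPpc}_j + \beta_3 \ln \mathit{pop}_j + \beta_4 \ln \mathit{GDP}_i + \beta_5 \ln \mathit{dist}_{ij} + \varepsilon_{ijk}

Here X_{ijk} denotes the volume of waste commodity k traded from exporting country i to importing country j; positive estimates of β1, β2 and β3 would correspond to the importer-side effects reported in the abstract.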
Abstract:
This paper reports findings from an empirical study examining the influence of student background and educational experiences on the development of career choice capability. Secondary school students in Years 9–12 (N = 706) in New South Wales, Australia, were invited to participate in an online survey that examined factors influencing their readiness to make career choices. The survey included questions relating to student demographics, parental occupation, attitudes to school and to learning, career aspirations, and students' knowledge of the further education or skills required to achieve their desired goal. We found no significant differences in the proportions of students who were ‘uncertain’ of their future career aspirations with respect to individual characteristics such as age and gender. There were, however, significant differences in relation to students' family background and their perceptions of their own academic abilities and self-efficacy.
Abstract:
This article discusses findings from a qualitative research project that explored the connections between alternative education and Indigenous learners. The study investigated how flexi school leaders reported supporting Indigenous young people to remain engaged in education. The survey results provide demographic data on Indigenous participation in this sample of flexi schools and reveal that a high number of Indigenous young people are participating in these schools. Furthermore, a high number of Indigenous staff members are working in multiple roles within these schools. The implications of these findings are twofold. First, the current Indigenous education policy environment is focused heavily on ‘Closing the Gap’, emphasising the urgent need for significant improvement in educational outcomes for Indigenous young people. The findings from this study suggest that flexi schools are playing a significant role in supporting Indigenous young people to remain engaged in education, yet this remains a limited focus within the literature and education policy. Second, the high participation rates of Indigenous young people and staff suggest an urgent need to explore this context through research. Further research would assist in understanding the culture of the flexi school context. Research should also explore why a high number of Indigenous young people and staff members participate in this educational context and how this could inform approaches to engaging Indigenous young people in conventional school settings.
Abstract:
Antioxidant requirements have been defined neither for endurance nor for ultra-endurance athletes. To verify whether an acute bout of ultra-endurance exercise modifies the need for nutritive antioxidants, we aimed (1) to investigate the changes in endogenous and exogenous antioxidants in response to an Ironman triathlon; (2) to particularise the relevance of antioxidant responses to indices of oxidatively damaged blood lipids, blood cell compounds and lymphocyte DNA; and (3) to examine whether potential time points of increased susceptibility to oxidative damage are associated with alterations in antioxidant status. Blood was sampled from forty-two well-trained male athletes 2 d pre-race, immediately post-race, and 1, 5 and 19 d later. The key findings of the present study are as follows: (1) immediately post-race, vitamin C, alpha-tocopherol, and levels measured by the Trolox equivalent antioxidant capacity, ferric reducing ability of plasma and oxygen radical absorbance capacity (ORAC) assays increased significantly; exercise-induced changes in plasma antioxidant capacity were associated with changes in uric acid, bilirubin and vitamin C. (2) Significant inverse correlations between ORAC levels and indices of oxidatively damaged DNA immediately and 1 d post-race suggest a protective role of the acute antioxidant responses in DNA stability. (3) Significant decreases in carotenoids and gamma-tocopherol 1 d post-race indicate that antioxidant intake during the first 24 h of recovery following acute ultra-endurance exercise requires specific attention. Furthermore, the present study illustrates the importance of a diversified and well-balanced diet for maintaining a physiological antioxidant status in ultra-endurance athletes, with reference to current recommendations.
Abstract:
Background Depression is a common psychiatric disorder in older people. This study aimed to examine the screening accuracy of the Geriatric Depression Scale (GDS) and the Collateral Source version of the Geriatric Depression Scale (CS-GDS) in the nursing home setting. Methods Eighty-eight residents from 14 nursing homes were assessed for depression using the GDS and the CS-GDS, and validated against clinician-diagnosed depression using the Semi-structured Clinical Diagnostic Interview for DSM-IV-TR Axis I Disorders (SCID) for residents without dementia and the Provisional Diagnostic Criteria for Depression in Alzheimer Disease (PDCdAD) for those with dementia. The screening performances of five versions of the GDS (30-, 15-, 10-, 8- and 4-item) and two versions of the CS-GDS (30- and 15-item) were analyzed using receiver operating characteristic (ROC) curves. Results Among residents without dementia, both the self-rated (AUC = 0.75–0.79) and proxy-rated (AUC = 0.67) GDS variations performed significantly better than chance in screening for depression. However, neither instrument adequately identified depression among residents with dementia (AUC between 0.57 and 0.70). Among the GDS variations, the 4- and 8-item scales had the highest AUCs, with optimal cut-offs of >0 and >3, respectively. Conclusions The validity of the GDS in detecting depression requires a certain level of cognitive functioning. While the CS-GDS is designed to remedy this issue by using an informant, it did not have adequate validity in detecting depression among residents with dementia. Further research is needed on informant selection and other factors that can potentially influence the validity of proxy-based measures in the nursing home setting.
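As background on the kind of analysis described above, the short Python sketch below shows how ROC-based screening accuracy and an optimal cut-off are commonly computed. The data, variable names and the use of the Youden index are illustrative assumptions, not the study's actual procedure.

    import numpy as np
    from sklearn.metrics import roc_curve, roc_auc_score

    rng = np.random.default_rng(0)
    # Hypothetical data: 88 residents, clinician diagnosis (1 = depressed) and a GDS-style score
    depressed = rng.integers(0, 2, size=88)
    gds_score = depressed * 3 + rng.poisson(2, size=88)   # toy scores; higher = more symptoms

    auc = roc_auc_score(depressed, gds_score)
    fpr, tpr, thresholds = roc_curve(depressed, gds_score)

    # Optimal cut-off by the Youden index (maximises sensitivity + specificity - 1)
    best = thresholds[np.argmax(tpr - fpr)]
    print(f"AUC = {auc:.2f}; optimal rule: score >= {best:.0f}")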
Abstract:
A lack of access to primary care services, decreasing numbers of general practitioners (GPs) and free-of-charge visits have been cited as factors contributing to the rising demand on emergency departments. This study aims to investigate the sources of patients' referrals to emergency departments and to track changes in the source of referral over a six-year period in Queensland. Data from the Queensland Emergency Department Information System were analyzed based on records from 21 hospitals for the period 2003–04 to 2008–09. The emergency department data were compared with publicly available data on GP services and patient attendance rates. In Queensland, the majority of patients are self-referred, and the proportion of self-referred patients grew by 6.6% between 2003–04 and 2008–09 (from 84.4% to 90%). The numbers of referrals made by GPs, hospitals and community services decreased by 29.4%, 40% and 42%, respectively, over the six-year period. The number of full-time workload equivalent GPs per 100,000 people increased by 4.5%, and the number of GP attendances per capita rose by 4% (from 4.25 to 4.42). An examination of changes in the triage category of self-referred patients revealed increases in triage categories 1–3 of 60%, 36.2% and 14.4%, respectively. The numbers of self-referred patients in triage categories 4 and 5 decreased by 10.5% and 21.9%, respectively. The results of this analysis reveal that, although the number of services provided by GPs increased, the number of referrals decreased and the proportion of self-referred patients presenting to emergency departments rose over the six-year period. In addition, growth in the urgent triage categories (1–3) was observed, with a decline in the numbers in the non-urgent categories (4–5) among patients who came directly to emergency departments. Understanding the reasons behind this situation is crucial for appropriate demand management. Possible explanations will be sought and presented based on patients' responses to an emergency department users' questionnaire.
Abstract:
Background In the emergency department, portable point-of-care testing (POCT) coagulation devices may facilitate stroke patient care by providing rapid International Normalized Ratio (INR) measurement. The objective of this study was to evaluate the reliability, validity, and impact on clinical decision-making of a POCT device for INR testing in the setting of acute ischemic stroke (AIS). Methods A total of 150 patients (50 healthy volunteers, 51 anticoagulated patients, 49 AIS patients) were assessed in a tertiary care facility. INRs were measured using the Roche CoaguChek S and the standard laboratory technique. Results The intraclass correlation coefficient between the POCT device and standard laboratory INRs was high overall: 0.932 (95% CI 0.69–0.78). In the AIS group alone, the correlation coefficient was also high, 0.937 (95% CI 0.59–0.74), and the diagnostic accuracy of the POCT device was 94%. Conclusions When used by a trained health professional in the emergency department to assess INR in acute ischemic stroke patients, the CoaguChek S is reliable and provides rapid results. However, as concordance with laboratory INR values decreases at higher INR values, it is recommended that CoaguChek S INRs in the >1.5 range be confirmed with a standard laboratory measurement.
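For reference, agreement between two measurement methods of this kind can be quantified with an intraclass correlation coefficient. The abstract does not state which ICC form was used, so the sketch below assumes the common two-way random-effects, absolute-agreement form ICC(2,1), with hypothetical paired INR values.

    import numpy as np

    def icc_2_1(y):
        # ICC(2,1): two-way random effects, absolute agreement, single measurement.
        # y has shape (n_subjects, k_raters); here k_raters = 2 (POCT vs laboratory).
        n, k = y.shape
        grand = y.mean()
        row_means = y.mean(axis=1)
        col_means = y.mean(axis=0)
        ss_rows = k * np.sum((row_means - grand) ** 2)
        ss_cols = n * np.sum((col_means - grand) ** 2)
        ss_err = np.sum((y - grand) ** 2) - ss_rows - ss_cols
        ms_rows = ss_rows / (n - 1)
        ms_cols = ss_cols / (k - 1)
        ms_err = ss_err / ((n - 1) * (k - 1))
        return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

    # Hypothetical paired measurements: column 0 = POCT INR, column 1 = laboratory INR
    inr = np.array([[1.1, 1.0], [2.3, 2.4], [3.1, 3.4], [1.0, 1.0], [2.0, 1.9]])
    print(f"ICC(2,1) = {icc_2_1(inr):.3f}")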
Abstract:
In this paper we introduce a novel domain-invariant covariance normalization (DICN) technique that relocates both in-domain and out-domain i-vectors into a third, dataset-invariant space, providing an improvement for out-domain PLDA speaker verification with a very small number of unlabelled in-domain adaptation i-vectors. By capturing the dataset variance from a global mean using both development out-domain i-vectors and limited unlabelled in-domain i-vectors, we can obtain domain-invariant representations of the PLDA training data. The DICN-compensated out-domain PLDA system is shown to perform as well as in-domain PLDA training with as few as 500 unlabelled in-domain i-vectors for NIST-2010 SRE and 2000 unlabelled in-domain i-vectors for NIST-2008 SRE, and to provide considerable relative improvement over both out-domain and in-domain PLDA development when more in-domain i-vectors are available.
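One plausible reading of "capturing the dataset variance from a global mean" is a whitening transform estimated from deviations of both i-vector collections around a pooled global mean; the published DICN formulation may differ, so the Python sketch below is only an assumption-laden illustration of that idea.

    import numpy as np

    def dicn_transform(out_dom, in_dom, eps=1e-6):
        # out_dom: (N_out, d) development out-domain i-vectors
        # in_dom:  (N_in, d) unlabelled in-domain adaptation i-vectors
        pooled = np.vstack([out_dom, in_dom])
        m = pooled.mean(axis=0)                      # global mean over both collections
        dev = pooled - m                             # deviations capturing dataset variance
        C = dev.T @ dev / len(pooled)
        vals, vecs = np.linalg.eigh(C + eps * np.eye(C.shape[0]))
        W = vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T   # inverse square root (whitening)
        return m, W

    # Usage: normalise PLDA training data and evaluation i-vectors alike
    # m, W = dicn_transform(out_domain_ivectors, in_domain_ivectors)
    # x_normalised = (x - m) @ W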
Abstract:
Background Stroke incidence has fallen since 1950, but recent trends suggest that it may be stabilizing or increasing. We investigated time trends in stroke occurrence and in-hospital morbidity and mortality in the Calgary Health Region. Methods All patients admitted to hospitals in the Calgary Health Region between 1994 and 2002 with a primary discharge diagnosis code (ICD-9 or ICD-10) of stroke were included, as were in-hospital strokes. Stroke type, date of admission, age, gender, discharge disposition (died, discharged) and in-hospital complications (pneumonia, pulmonary embolism, deep venous thrombosis) were recorded. Poisson and simple linear regression were used to model time trends of occurrence by stroke type and age group and to extrapolate future trends. Results From 1994 to 2002, 11,642 stroke events were observed. Of these, 9,879 patients (84.8%) were discharged from hospital, 1,763 (15.1%) died in hospital, and 591 (5.1%) developed in-hospital complications from pneumonia, pulmonary embolism or deep venous thrombosis. Both in-hospital mortality and complication rates were highest for hemorrhages. Over the study period, the rate of stroke admission remained stable; however, the total number of stroke admissions to hospital increased significantly (p=0.012), driven by increases in intracerebral hemorrhage (p=0.021) and ischemic stroke admissions (p=0.011). Subarachnoid hemorrhage rates declined. In-hospital stroke mortality declined overall, owing to decreases in deaths from ischemic stroke, intracerebral hemorrhage and subarachnoid hemorrhage. Conclusions Although age-adjusted stroke occurrence rates were stable from 1994 to 2002, this stability was accompanied by a sharp increase in the absolute number of stroke admissions and a decline in proportional in-hospital mortality. Further research into changes in stroke severity over time is needed to understand the causes of declining in-hospital stroke mortality rates.
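To illustrate the kind of trend modelling described, a Poisson regression of annual stroke admissions on calendar year with a population offset could look like the sketch below. The exact model specification is not given in the abstract, and the yearly counts and population figures here are hypothetical.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical yearly counts of stroke admissions and regional population, 1994-2002
    df = pd.DataFrame({
        "year": np.arange(1994, 2003),
        "admissions": [1150, 1180, 1210, 1250, 1290, 1330, 1360, 1410, 1460],
        "population": np.linspace(830_000, 950_000, 9),
    })

    # Poisson GLM with a log-population offset models the admission *rate* over time
    X = sm.add_constant(df["year"] - df["year"].min())
    model = sm.GLM(df["admissions"], X, family=sm.families.Poisson(),
                   offset=np.log(df["population"])).fit()
    print(model.summary())
    # exp(coefficient on year) estimates the annual multiplicative change in the rate
    print("annual rate ratio:", np.exp(model.params.iloc[1]))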
Abstract:
Background: Prediction of outcome after stroke is important for triage decisions, prognostic estimates for families and appropriate resource utilization. Prognostication must be timely and simple to apply. Several scales have shown good prognostic value. In Calgary, the Orpington Prognostic Score (OPS) has been used to predict outcome as an aid to rehabilitation triage; however, the OPS has not been assessed for predictive capability at one week. Methods: Among patients admitted to a sub-acute stroke unit, first-week OPS values were examined to determine whether any correlation existed between the first-week score and final disposition after rehabilitation. The predictive validity of the OPS at one week was compared with that of the National Institutes of Health Stroke Scale (NIHSS) score at 24 hours using logistic regression and receiver operating characteristic analysis. The primary outcome was final disposition, determined at discharge from the stroke unit if the patient went directly home or died, or otherwise at discharge from the inpatient rehabilitation unit. Results: The first-week OPS was highly predictive of final disposition. However, no major advantage of the first-week OPS was observed when compared with the 24-hour NIHSS score; both scales were equally predictive of the final disposition of stroke patients post-rehabilitation. Conclusion: The first-week OPS can be used to predict final outcome, but the NIHSS at 24 hours provides the same prognostic information.
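For illustration, comparing the predictive validity of two scores with logistic regression and ROC analysis might look like the sketch below. The simulated data, variable names and the use of AUC as the comparison metric are assumptions for the example, not the study's actual analysis.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(1)
    n = 200
    # Hypothetical data: a latent stroke severity drives both scores and the outcome
    severity = rng.normal(0.0, 1.0, n)
    ops_week1 = 3.2 + 1.0 * severity + rng.normal(0, 0.6, n)   # Orpington score at one week
    nihss_24h = 8.0 + 5.0 * severity + rng.normal(0, 3.0, n)   # NIHSS at 24 hours
    home = (rng.random(n) < 1.0 / (1.0 + np.exp(1.5 * severity))).astype(int)  # 1 = directly home

    for name, x in [("OPS at 1 week", ops_week1), ("NIHSS at 24 h", nihss_24h)]:
        clf = LogisticRegression().fit(x.reshape(-1, 1), home)
        prob = clf.predict_proba(x.reshape(-1, 1))[:, 1]
        print(f"{name}: AUC = {roc_auc_score(home, prob):.2f}")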
Abstract:
"4,400 people die every day of AIDS in sub-Saharan Africa. Treatment exists. In about 60 days, a patient can go from here to here. We call this transformation the Lazarus Effect. It is the result of two pills a day taken by a HIV/AIDS patient for about 60 days. Learn more about how you can help give people the chance of life and joinred.com."The Lazarus Effect video, the (RED) Campaign.This Chapter explores how a number of non-government organizations, charities, and philanthropists have promoted ’grants' as a means of stimulating investment in research and development into neglected diseases. Each section considers the nature of the campaign; the use of intellectual property rights, such as trade marks; and the criticisms made of such endeavors. Section 2 looks at the (RED) Campaign, which is designed to boost corporate funding and consumer support for the Global Fund. Section 3 examines the role of the Gates Foundation in funding research and development in respect of infectious diseases. It explores the championing by Bill Gates of ’creative capitalism'. Section 4 considers the part of the Clinton Foundation in the debate over access to essential medicines. The Chapter concludes that, despite their qualities, such marketing initiatives fail to address the underlying inequalities and injustices of international patent law.
Abstract:
Despite the extensive work on modelling collisions in port waters, little research effort has been devoted to modelling collisions at port anchorages. This paper aims to fill this important gap in the literature by applying the Navigation Traffic Conflict Technique (NTCT) to measure collision potentials in anchorages and to examine the factors contributing to collisions. Drawing on the principles of the NTCT, a collision potential measurement model and a collision potential prediction model were developed. These models were illustrated using vessel movement data for the anchorages in Singapore port waters. Results showed that the measured collision potentials are in close agreement with those perceived by harbour pilots. Higher collision potentials were found in anchorages attached to the shoreline and to international fairways, but not in those attached to confined waters. Higher operating speeds, larger numbers of isolated danger marks and daytime conditions were associated with reductions in collision potential.
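As an illustration of the kind of prediction model described, one could relate a measured collision potential to the factors the abstract names. The abstract does not give the model's functional form, so the log-linear regression, covariates and simulated data below are assumptions, not the published NTCT model.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    n = 120  # hypothetical anchorage-period observations
    df = pd.DataFrame({
        "speed": rng.uniform(2, 8, n),            # mean operating speed (knots)
        "danger_marks": rng.integers(0, 5, n),    # isolated danger marks in the anchorage
        "day": rng.integers(0, 2, n),             # 1 = daytime, 0 = night
    })
    # Hypothetical, strictly positive collision potential consistent with the reported signs
    df["potential"] = np.exp(-0.5 - 0.1 * df["speed"] - 0.15 * df["danger_marks"]
                             - 0.2 * df["day"] + rng.normal(0, 0.2, n))

    # Log-linear regression of collision potential on the candidate factors
    model = smf.ols("np.log(potential) ~ speed + danger_marks + day", data=df).fit()
    print(model.summary())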