825 results for Mansfield, Katherine
Abstract:
Background The Global Burden of Disease, Injuries, and Risk Factor study 2013 (GBD 2013) is the first of a series of annual updates of the GBD. Risk factor quantification, particularly of modifiable risk factors, can help to identify emerging threats to population health and opportunities for prevention. The GBD 2013 provides a timely opportunity to update the comparative risk assessment with new data for exposure, relative risks, and evidence on the appropriate counterfactual risk distribution. Methods Attributable deaths, years of life lost, years lived with disability, and disability-adjusted life-years (DALYs) have been estimated for 79 risks or clusters of risks using the GBD 2010 methods. Risk–outcome pairs meeting explicit evidence criteria were assessed for 188 countries for the period 1990–2013 by age and sex using three inputs: risk exposure, relative risks, and the theoretical minimum risk exposure level (TMREL). Risks are organised into a hierarchy with blocks of behavioural, environmental and occupational, and metabolic risks at the first level of the hierarchy. The next level in the hierarchy includes nine clusters of related risks and two individual risks, with more detail provided at levels 3 and 4 of the hierarchy. Compared with GBD 2010, six new risk factors have been added: handwashing practices, occupational exposure to trichloroethylene, childhood wasting, childhood stunting, unsafe sex, and low glomerular filtration rate. For most risks, data for exposure were synthesised with a Bayesian meta-regression method, DisMod-MR 2.0, or spatial-temporal Gaussian process regression. Relative risks were based on meta-regressions of published cohort and intervention studies. Attributable burden for clusters of risks and all risks combined took into account evidence on the mediation of some risks such as high body-mass index (BMI) through other risks such as high systolic blood pressure and high cholesterol. Findings All risks combined account for 57·2% (95% uncertainty interval [UI] 55·8–58·5) of deaths and 41·6% (40·1–43·0) of DALYs. Risks quantified account for 87·9% (86·5–89·3) of cardiovascular disease DALYs, ranging to a low of 0% for neonatal disorders and neglected tropical diseases and malaria. In terms of global DALYs in 2013, six risks or clusters of risks each caused more than 5% of DALYs: dietary risks accounting for 11·3 million deaths and 241·4 million DALYs, high systolic blood pressure for 10·4 million deaths and 208·1 million DALYs, child and maternal malnutrition for 1·7 million deaths and 176·9 million DALYs, tobacco smoke for 6·1 million deaths and 143·5 million DALYs, air pollution for 5·5 million deaths and 141·5 million DALYs, and high BMI for 4·4 million deaths and 134·0 million DALYs. Risk factor patterns vary across regions and countries and with time. In sub-Saharan Africa, the leading risk factors are child and maternal malnutrition, unsafe sex, and unsafe water, sanitation, and handwashing. In women, in nearly all countries in the Americas, north Africa, and the Middle East, and in many other high-income countries, high BMI is the leading risk factor, with high systolic blood pressure as the leading risk in most of Central and Eastern Europe and south and east Asia. For men, high systolic blood pressure or tobacco use are the leading risks in nearly all high-income countries, in north Africa and the Middle East, Europe, and Asia. For men and women, unsafe sex is the leading risk in a corridor from Kenya to South Africa. 
Interpretation Behavioural, environmental and occupational, and metabolic risks can explain half of global mortality and more than one-third of global DALYs, providing many opportunities for prevention. Of the larger risks, the attributable burden of high BMI has increased in the past 23 years. In view of the prominence of behavioural risk factors, behavioural and social science research on interventions for these risks should be strengthened. Many prevention and primary care policy options are available now to act on key risks.
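To make the arithmetic behind the comparative risk assessment concrete, the sketch below shows the standard population attributable fraction (PAF) calculation for a single categorical risk, with the theoretical minimum risk exposure level (TMREL) as the counterfactual distribution. All numbers are hypothetical illustrations, not GBD 2013 estimates, and this is not the GBD codebase.

```python
# Minimal sketch (not GBD code): population attributable fraction (PAF)
# for one categorically measured risk, using illustrative exposure
# prevalences and relative risks. The counterfactual (TMREL) is taken
# to be the lowest-risk category.

def attributable_fraction(prevalence, relative_risk, counterfactual):
    """PAF = (sum(P_i * RR_i) - sum(P'_i * RR_i)) / sum(P_i * RR_i)."""
    observed = sum(p * rr for p, rr in zip(prevalence, relative_risk))
    ideal = sum(p * rr for p, rr in zip(counterfactual, relative_risk))
    return (observed - ideal) / observed

# Illustrative values only: three exposure categories, with all exposure
# moved to the lowest-risk category under the TMREL counterfactual.
prevalence     = [0.50, 0.30, 0.20]   # observed exposure distribution
relative_risk  = [1.00, 1.40, 2.10]   # risk of the outcome vs. reference
counterfactual = [1.00, 0.00, 0.00]   # TMREL: everyone in category 1

paf = attributable_fraction(prevalence, relative_risk, counterfactual)
attributable_dalys = paf * 250e6      # hypothetical total DALYs for the outcome
print(f"PAF = {paf:.1%}, attributable DALYs = {attributable_dalys:,.0f}")
```

Attributable burden is then simply the PAF multiplied by the total burden for that outcome, summed over outcomes and, with mediation adjustments, over risks.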
Abstract:
Introduction Hospitalisation for percutaneous coronary intervention (PCI) is often short, with limited nurse-teaching time and poor information absorption. Currently, patients are discharged home only to wait up to 4-8 weeks to commence a secondary prevention program and visit their cardiologist. This wait is an anxious time for patients, and confidence or self-efficacy (SE) to self-manage may be low. Objectives To determine the effects of a nurse-led, educational intervention on participant SE and anxiety in the early post-discharge period. Methods A pilot study was undertaken as a randomised controlled clinical trial. Thirty-three participants were recruited, with n=13 randomised to the intervention group. A face-to-face, nurse-led, educational intervention was undertaken within the first 5-7 days post-discharge. Intervention group participants received standard post-discharge education and a physical assessment, with a strong focus on the emotional impact of cardiovascular events and PCI. Early reiteration of post-discharge education was offered, along with health professional support, with the aim of increasing patients’ SE so that they could effectively manage their post-discharge health, well-being, and anxieties. Self-efficacy to return to normal activities was measured using the cardiac self-efficacy (CSE) scale to gauge participants’ ability to self-manage post-PCI after attending the intervention. State and trait anxiety were also measured using the State-Trait Anxiety Inventory (STAI) to determine if an increase in SE would influence participant anxiety. Results There were some increases in mean CSE scores in the intervention group participants over time. Areas of increase included return to normal social activities and confidence to change diet. Although reductions were observed in mean state and trait anxiety scores in both groups, an overall larger reduction in intervention group participants was observed over time. Conclusion It is essential that patients are given the education, support, and skills to self-manage in the early post-discharge period so that they have greater SE and are less anxious. This study provides some initial evidence that nurse-led support and education during this period, particularly the first week following PCI, is beneficial and could be trialled using alternative modes of communication to support remote and rural PCI patients and extended to other cardiovascular patients.
Abstract:
Eleven new human polyomaviruses have been recently discovered, yet for most of these viruses, little is known of their biology and clinical impact. Rolling circle amplification (RCA) is an ideal method for the amplification of the circular polyomavirus genome due to its high fidelity amplification of circular DNA. In this study, a modified RCA method was developed to selectively amplify a range of polyomavirus genomes. Initial evaluation showed a multiplexed temperature-graded reaction profile gave the best yield and sensitivity in amplifying BK polyomavirus in a background of human DNA, with up to 1 × 10⁸-fold increases in viral genomes from as little as 10 genome copies per reaction. Furthermore, the method proved to be more sensitive and provided a 200-fold greater yield than that of random hexamer-based standard RCA. Application of the method to other novel human polyomaviruses showed successful amplification of TSPyV, HPyV6, HPyV7, and STLPyV from positive clinical samples with low viral loads, with viral genome enrichment ranging from 1 × 10⁸ up to 1 × 10¹⁰. This directed RCA method can be applied to selectively amplify other low-copy polyomaviral genomes from a background of competing non-specific DNA, and is a useful tool in further research into the rapidly expanding Polyomaviridae family.
Abstract:
Aim: The purpose of this study was to determine the percentage of patients assessed as malnourished using the Subjective Global Assessment in two hospitals in Ho Chi Minh City and Can Tho across multiple wards; and to investigate the association with factors including gender, age, days since admission, medical diagnosis and number of medications used. Methods: This cross-sectional study involved 205 inpatients from a hospital in Ho Chi Minh City and 78 inpatients and 89 outpatients from a hospital in Can Tho. Malnutrition status was assessed using the Subjective Global Assessment. Ward, gender, age, medical diagnosis, time since admission and medication number were extracted from medical records. Results: 35.6% of inpatients and 9.0% of outpatients were malnourished. Multivariate analysis revealed that the factors predicting malnutrition status within inpatients (OR (95% CI)) were: age (OR = 1.03 (1.01-1.06)); cancer diagnosis (OR = 34.25 (3.16-370.89)); respiratory ward (OR = 11.49 (1.05-125.92)); or general medicine ward (OR = 20.34 (2.10-196.88)). Conclusions: Results indicate that malnutrition is a common problem in hospitals in Vietnam. Further research is needed to confirm this finding across a wider range of hospitals and to investigate the feasibility and efficacy of implementation of nutrition interventions in hospital settings.
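The odds ratios reported above are the kind produced by a multivariable logistic regression of malnutrition status on patient characteristics, with coefficients exponentiated to give odds ratios and confidence intervals. The sketch below is illustrative only; the data file and variable names are hypothetical, and this is not the authors' code.

```python
# Illustrative sketch: multivariable logistic regression for malnutrition
# status, with coefficients exponentiated to odds ratios (as reported above).
# The CSV file and column names are hypothetical, not from the study.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("inpatients.csv")  # hypothetical dataset

model = smf.logit(
    "malnourished ~ age + cancer_diagnosis + respiratory_ward + general_medicine_ward",
    data=df,
).fit()

# Exponentiate log-odds coefficients and their confidence limits.
odds_ratios = pd.DataFrame({
    "OR": np.exp(model.params),
    "2.5%": np.exp(model.conf_int()[0]),
    "97.5%": np.exp(model.conf_int()[1]),
})
print(odds_ratios.round(2))
```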
Abstract:
Alternaria leaf blight is the most prevalent disease of cotton in northern Australia. A trial was conducted at Katherine Research Station, Northern Territory, Australia, to determine the effects of foliar application of potassium nitrate (KNO3) on the suppression of Alternaria leaf blight of cotton. Disease incidence, severity and leaf shedding were assessed at the bottom (1-7 nodes), middle (8-14 nodes) and the top (15+ nodes) of plants at weekly intervals from 7 July to 22 September 2004. Disease incidence, severity and shedding at the middle canopy level were significantly higher for all treatments than those from bottom and top canopies. Foliar KNO3, applied at 13 kg/ha, significantly (P < 0.05) reduced the mean disease incidence, severity and leaf shedding assessed during the trial period. KNO3 significantly (P < 0.001) reduced the disease severity and leaf shedding at the middle canopy level. Almost all leaves in the middle canopy became infected in the first week of July in contrast to infection levels of 50-65% at the bottom and top of the canopy. Disease severity and leaf shedding in the middle canopy were significantly (P < 0.05) lower in KNO3-treated plots than the control plots from the second and third weeks of July to the second and third weeks of August. This study demonstrates that foliar application of KNO3 may be effective in reducing the effect of Alternaria leaf blight of cotton in northern Australia.
Abstract:
Prediction of the initiation, appearance and emergence of leaves is critically important to the success of simulation models of crop canopy development and some aspects of crop ontogeny. Data on leaf number and crop ontogeny were collected on five cultivars of maize differing widely in maturity and genetic background grown under natural and extended photoperiods, and planted on seven sowing dates from October 1993 to March 1994 at Gatton, South-east Queensland. The same temperature coefficients were established for crop ontogeny before silking, and the rates of leaf initiation, leaf tip appearance and full leaf expansion, the base, optimum and maximum temperatures for each being 8, 34 and 40°C. After silking, the base temperature for ontogeny was 0°C, but the optimum and maximum temperatures remained unchanged. The rates of leaf initiation, appearance of leaf tips and full leaf expansion varied in a relatively narrow range across sowing times and photoperiod treatments, with average values of 0.040 leaves (°Cd)⁻¹, 0.021 leaves (°Cd)⁻¹, and 0.019 leaves (°Cd)⁻¹, respectively. The relationships developed in this study provided satisfactory predictions of leaf number and crop ontogeny (tassel initiation to silking, emergence to silking and silking to physiological maturity) when assessed using independent data from Gatton (South-east Queensland), Katherine and Douglas Daly (Northern Territory), Walkamin (North Queensland) and Kununurra (Western Australia).
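The leaf-number predictions described above rest on thermal time accumulated between the cardinal temperatures (base 8°C, optimum 34°C, maximum 40°C) multiplied by a leaf appearance rate (about 0.021 leaf tips per °Cd). The sketch below illustrates that calculation under a broken-linear temperature response, a common simplification that may not be the exact function used in the study; the daily temperature series is invented for illustration.

```python
# Sketch of thermal-time-driven leaf appearance, using the cardinal
# temperatures (base 8 °C, optimum 34 °C, maximum 40 °C) and the mean
# leaf-tip appearance rate (0.021 leaves per °Cd) reported above.
# The daily mean temperatures are invented, and the broken-linear
# response is a common simplification, not necessarily the study's own.

BASE, OPTIMUM, MAXIMUM = 8.0, 34.0, 40.0
TIP_APPEARANCE_RATE = 0.021  # leaf tips per degree-day (°Cd)

def daily_thermal_time(t_mean: float) -> float:
    """Effective degree-days for one day under a broken-linear response."""
    if t_mean <= BASE or t_mean >= MAXIMUM:
        return 0.0
    if t_mean <= OPTIMUM:
        return t_mean - BASE
    # Linear decline in effectiveness from the optimum to the maximum.
    return (OPTIMUM - BASE) * (MAXIMUM - t_mean) / (MAXIMUM - OPTIMUM)

daily_means = [22, 25, 28, 30, 27, 24, 26]  # hypothetical daily mean temps (°C)
cumulative_tt = sum(daily_thermal_time(t) for t in daily_means)
leaf_tips = TIP_APPEARANCE_RATE * cumulative_tt
print(f"Thermal time: {cumulative_tt:.0f} °Cd, predicted leaf tips: {leaf_tips:.1f}")
```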
Abstract:
Background Previous studies (mostly questionnaire-based in children) suggest that outdoor activity is protective against myopia. There are few studies on young adults investigating the impact of simply being outdoors versus that of performing physical activity. The aim was to study the relationship between the refractive error of young adults and their physical activity patterns. Methods Twenty-seven university students, aged 18 to 25 years, wore a pedometer (Omron HJ720ITE) for seven days both during the semester and holiday periods. They simultaneously recorded the type of activity performed, its duration, the number of steps taken (from the pedometer) and their location (indoors/outdoors) in a logbook. Mean spherical refractive error was used to divide participants into three groups (emmetropes: +1.00 to -0.50 D, low myopes: -0.62 to -3.00 D, higher myopes: -3.12 D or greater myopia). Results There were no significant differences between the refractive groups during the semester or holiday periods; the average daily times spent outdoors, the duration of physical activity, the ratio of physical activity performed outdoors to indoors and the amount of near work performed were similar. The peak exercise intensity was similar across all groups: approximately 100 steps per minute, a brisk walk. Up to one-third of all physical activity was performed outdoors. There were some significant differences in activities performed during semester and holiday times. For example, low myopes spent significantly less time outside (49 ± 47 versus 74 ± 41 minutes, p = 0.005) and performed less physical activity (6,388 ± 1,747 versus 6,779 ± 2,746 steps per day; p = 0.03) during the holidays compared to during semester. Conclusions The fact that all groups had similar low exercise intensity but many were not myopic suggests that physical activity levels are not critical. There were differences in the activity patterns of low myopes during semester and holiday periods. This study highlights the need for a larger longitudinal-based study with particular emphasis on how discretionary time is spent.
Abstract:
Hand hygiene is the primary measure in hospitals to reduce the spread of infections, with nurses experiencing the greatest frequency of patient contact. The ‘5 critical moments’ of hand hygiene initiative has been implemented in hospitals across Australia, accompanied by awareness-raising, staff training and auditing. The aim of this study was to understand the determinants of nurses’ hand hygiene decisions, using an extension of a common health decision-making model, the theory of planned behaviour (TPB), to inform future health education strategies to increase compliance. Nurses from 50 Australian hospitals (n = 2378) completed standard TPB measures (attitude, subjective norm, perceived behavioural control [PBC], intention) and the extended variables of group norm, risk perceptions (susceptibility, severity) and knowledge (subjective, objective) at Time 1, while a sub-sample (n = 797) reported their hand hygiene behaviour 2 weeks later. Regression analyses identified subjective norm, PBC, group norm, subjective knowledge and risk susceptibility as the significant predictors of nurses’ hand hygiene intentions, with intention and PBC predicting their compliance behaviour. Rather than targeting attitudes which are already very favourable among nurses, health education strategies should focus on normative influences and perceptions of control and risk in efforts to encourage hand hygiene adherence.
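The predictors above were identified by regressing intention (and, at follow-up, behaviour) on the standard and extended TPB variables. The sketch below shows what such a regression looks like; the data file and variable names are hypothetical and are not taken from the study.

```python
# Illustrative sketch of a TPB-style multiple regression: hand hygiene
# intention regressed on standard and extended predictors. The data file
# and variable names are hypothetical, not the study's own.
import pandas as pd
import statsmodels.formula.api as smf

nurses = pd.read_csv("tpb_time1.csv")  # hypothetical Time 1 survey data

intention_model = smf.ols(
    "intention ~ attitude + subjective_norm + pbc + group_norm"
    " + subjective_knowledge + risk_susceptibility",
    data=nurses,
).fit()
print(intention_model.summary())
```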
Abstract:
The persistent low employment rate of people with disability has emerged as a concern for the Australian Government and society in general. The research addressed the gap between the supply and demand sides of disability employment by exploring organisational mechanisms underlying the proactive employment of people with disability. Data was collected from a large Australian retail organisation that currently employs people with disability. The findings revealed how the organisation legitimises disability employment practices, within its internal and external operating environments. The research informs the areas of government policy and organisational practices concerning future employment opportunities for people with disability.
Abstract:
Bigeyed bugs (Geocoris spp., Hemiptera: Geocoridae) are common predators in Australian agricultural crops, yet the development and reproductive biology of Australian geocorids have not been described before. Here we present the effects of diet, temperature and photoperiod on the development and survival of Geocoris lubra Kirkaldy from egg to adult. Nymphal survival of G. lubra reared on live aphids (Aphis gossypii Glover) was very low but improved slightly on a diet of Helicoverpa armigera (Hübner) eggs. Development was faster and nymphal survival improved significantly at 27°C compared with 25°C. Further investigation at 27°C showed that photoperiod influenced development time, but not survival, of immature G. lubra. Development time was significantly longer at 10L:14D. Fecundity of first generation G. lubra was not affected by photoperiod, although egg viability was greater at 12L:12D.
Abstract:
The current study explored underlying beliefs regarding work safety among a sample of experienced Australian electrical workers. A qualitative research methodology using the theory of planned behavior as a framework was employed. A series of interviews and focus groups with licensed electrical workers (N = 46) were analyzed using thematic content analysis. Beliefs were classified as advantages (e.g., personal safety of self and co-workers), disadvantages (e.g., inconvenience to customers/clients and workload), referents (e.g., supervisors, work colleagues, customers), barriers (e.g., time and cost), and facilitators (e.g., training and knowledge, equipment availability) of safety adherence. The belief basis of the theory of planned behavior was a useful framework for exploring workers’ safety beliefs. The identified beliefs can inform future research about the important factors influencing safe work decisions and inform strategies to promote safer workplace decision making within the electrical safety context.
Abstract:
Seated: David Stern, Bernice Stern, and Jessica Agosti; left to right: Doreen Stern, her husband Michael Stern, Carol Richardson, her husband Blake Richardson, John Agosti. The baby is Katherine Stern
Abstract:
Background Alcohol expectancies likely play a role in people’s perceptions of alcohol-involved sexual violence. However, no appropriate measure exists to examine this link comprehensively. Objective The aim of this research was to develop an alcohol expectancy measure which captures young adults’ beliefs about alcohol’s role in sexual aggression and victimization. Method Two cross-sectional samples of young Australian adults (18–25 years) were recruited for scale development (Phase 1) and scale validation (Phase 2). In Phase 1, participants (N = 201; 38.3% males) completed an online survey with an initial pool of alcohol expectancy items stated in terms of three targets (self, men, women) to identify the scale’s factor structure and most effective items. A revised alcohol expectancy scale was then administered online to 322 young adults (39.6% males) in Phase 2. To assess the predictive, convergent, and discriminant validity of the scale, participants also completed established measures of personality, social desirability, alcohol use, general and context-specific alcohol expectancies, and impulsiveness. Results Principal axis factoring (Phase 1) and confirmatory factor analysis (Phase 2) resulted in a target-equivalent five-factor structure for the final 66-item Drinking Expectancy Sexual Vulnerabilities Questionnaire (DESV-Q). The factors were labeled (1) Sexual Coercion, (2) Sexual Vulnerability, (3) Confidence, (4) Self-Centeredness, and (5) Negative Cognitive and Behavioral Changes. The measure demonstrated effective items, high internal consistency, and satisfactory predictive, convergent, and discriminant validity. Conclusions The DESV-Q is a purpose-specific instrument that could be used in future research to elucidate people’s attributions for alcohol-involved sexual aggression and victimization.