Abstract:
Barmah Forest virus (BFV) disease is one of the most widespread mosquito-borne diseases in Australia. The number of outbreaks and the incidence rate of BFV in Australia have attracted growing concern about the spatio-temporal complexity and underlying risk factors of BFV disease. A large number of notifications have been recorded continuously in Queensland since 1992, yet little is known about the spatial and temporal characteristics of the disease. I aim to use notification data to better understand the effects of climatic, demographic, socio-economic and ecological risk factors on the spatial epidemiology of BFV disease transmission, to develop predictive risk models and to forecast future disease risks under climate change scenarios. Computerised data files of daily notifications of BFV disease and climatic variables in Queensland during 1992-2008 were obtained from Queensland Health and the Australian Bureau of Meteorology, respectively. Projections of climate data for the years 2025, 2050 and 2100 were obtained from the Commonwealth Scientific and Industrial Research Organisation (CSIRO). Data on socio-economic, demographic and ecological factors were also obtained from the relevant government departments, as follows: 1) socio-economic and demographic data from the Australian Bureau of Statistics; 2) wetlands data from the Department of Environment and Resource Management; and 3) tidal readings from the Queensland Department of Transport and Main Roads. Disease notifications were geocoded, and spatial and temporal patterns of the disease were investigated using geostatistics. Visualisation of BFV disease incidence rates through mapping reveals substantial spatio-temporal variation at the statistical local area (SLA) level over time. Results reveal high incidence rates of BFV disease along coastal areas compared with Queensland as a whole.
A Mantel-Haenszel chi-square analysis for trend reveals a statistically significant relationship between BFV disease incidence rates and age groups (χ² = 7587, p < 0.01). Semi-variogram analysis and smoothed maps created from interpolation techniques indicate that the pattern of spatial autocorrelation was not homogeneous across the state. A cluster analysis was used to detect hot spots/clusters of BFV disease at the SLA level. The most likely spatial and space-time clusters are detected at the same locations across coastal Queensland (p < 0.05). The study demonstrates heterogeneity of disease risk at the SLA level and reveals the spatial and temporal clustering of BFV disease in Queensland. Because the importance of wetlands in the transmission of BFV disease remains unclear, discriminant analysis was employed to establish a link between wetland classes, climate zones and BFV disease. The multivariable discriminant modelling analyses demonstrate that the wetland types saline 1, riverine and saline tidal influence were the most significant risk factors for BFV disease in all climate and buffer zones, while lacustrine, palustrine, estuarine, saline 2 and saline 3 wetlands were less important. The model accuracies were 76%, 98% and 100% for BFV risk in subtropical, tropical and temperate climate zones, respectively. This study demonstrates that BFV disease risk varied with wetland class and climate zone, and suggests that wetlands may act as potential breeding habitats for BFV vectors. Multivariable spatial regression models were applied to assess the impact of spatial climatic, socio-economic and tidal factors on BFV disease in Queensland. Spatial regression models were developed to account for spatial effects, and generated superior estimates over a traditional regression model.
In the spatial regression models, BFV disease incidence shows an inverse relationship with minimum temperature, low tide and distance to coast, and a positive relationship with rainfall in coastal areas, whereas across Queensland as a whole the disease shows an inverse relationship with minimum temperature and high tide and a positive relationship with rainfall. This study determines the most significant spatial risk factors for BFV disease across Queensland. Empirical models were developed to forecast the future risk of BFV disease outbreaks in coastal Queensland using existing climatic, socio-economic and tidal conditions under climate change scenarios. Logistic regression models were developed using BFV disease outbreak data for the baseline period (2000-2008). The most parsimonious model had high sensitivity, specificity and accuracy, and was used to estimate and forecast BFV disease outbreaks for the years 2025, 2050 and 2100 under climate change scenarios for Australia. Important contributions arising from this research are that: (i) it innovatively identifies high-risk coastal areas by creating buffers based on grid centroids and using fine-grained spatial units, i.e., mesh blocks; (ii) it uses a spatial regression method to account for spatial dependence and heterogeneity of the data in the study area; (iii) it determines a range of potential spatial risk factors for BFV disease; and (iv) it predicts the future risk of BFV disease outbreaks under climate change scenarios in Queensland, Australia. In conclusion, the thesis demonstrates that the distribution of BFV disease exhibits distinct spatial and temporal variation. Such variation is influenced by a range of spatial risk factors including climatic, demographic, socio-economic, ecological and tidal variables. The thesis demonstrates that spatial regression methods can be applied to better understand the transmission dynamics of BFV disease and its risk factors.
The research findings show that disease notification data can be integrated with multi-factorial risk factor data to develop predictive models and forecast future potential disease risks under climate change scenarios. This thesis may have implications for BFV disease control and prevention programs in Queensland.
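The forecasting step described above — a logistic regression of outbreak occurrence on climatic covariates — can be sketched as follows. This is a minimal from-scratch illustration using hypothetical standardised rainfall and minimum-temperature values, not the thesis's data, covariate set, or fitted model:

```python
import math

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Fit a logistic regression by plain batch gradient descent.

    X: rows of standardised covariates; y: 0/1 outbreak labels.
    Returns (weights, bias).
    """
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        gw, gb = [0.0] * d, 0.0
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))  # predicted outbreak probability
            err = p - yi
            for j in range(d):
                gw[j] += err * xi[j]
            gb += err
        for j in range(d):
            w[j] -= lr * gw[j] / n
        b -= lr * gb / n
    return w, b

def predict_outbreak(w, b, x):
    """Probability of an outbreak for one covariate row."""
    z = b + sum(wj * xj for wj, xj in zip(w, x))
    return 1.0 / (1.0 + math.exp(-z))

# Toy training set: columns are (standardised rainfall, standardised
# minimum temperature); outbreaks (1) coincide with wet, cooler periods,
# mirroring the direction of effects reported in the abstract.
X = [[0.9, -0.5], [1.2, -0.8], [0.1, 0.6], [-0.7, 0.9], [1.5, -1.1], [-1.0, 1.2]]
y = [1, 1, 0, 0, 1, 0]
w, b = fit_logistic(X, y)
```

Once fitted, the same coefficients can be applied to projected covariate values (e.g. for 2025, 2050 or 2100) to produce forecast outbreak probabilities.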
Abstract:
What are the most appropriate methodological approaches for researching the psychosocial determinants of health and wellbeing among young people from refugee backgrounds over the resettlement period? What kinds of research models can involve young people in meaningful reflections on their lives and futures while simultaneously yielding valid data to inform services and policy? This paper reports on the methods developed for a longitudinal study of health and wellbeing among young people from refugee backgrounds in Melbourne, Australia. The study involves 100 newly-arrived young people 12 to 18 years of age, and employs a combination of qualitative and quantitative methods implemented as a series of activities carried out by participants in personalized settlement journals. This paper highlights the need to think outside the box of traditional qualitative and/or quantitative approaches for social research into refugee youth health and illustrates how integrated approaches can produce information that is meaningful to policy makers, service providers and to the young people themselves.
Abstract:
One of the next great challenges of cell biology is the determination of the enormous number of protein structures encoded in genomes. In recent years, advances in electron cryo-microscopy and high-resolution single particle analysis have developed to the point where they now provide a methodology for high-resolution structure determination. Using this approach, images of randomly oriented single particles are aligned computationally to reconstruct 3-D structures of proteins and even whole viruses. One of the limiting factors in obtaining high-resolution reconstructions is obtaining a large enough representative dataset (>100,000 particles). Traditionally, particles have been picked manually, an extremely labour-intensive process. The problem is made especially difficult by the low signal-to-noise ratio of the images. This paper describes the development of automatic particle picking software, which has been tested with both negatively stained and cryo-electron micrographs. The algorithm has been shown to be capable of selecting most of the particles, with few false positives. Further work will involve extending the software to detect differently shaped and oriented particles.
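The abstract does not specify the picking algorithm, but a common baseline for automatic particle picking is template matching by normalised cross-correlation (NCC): a reference particle image is slid across the micrograph, and positions whose local NCC score approaches 1.0 are selected. A minimal sketch on toy 2-D intensity arrays — the function names, toy data and 0.8 threshold are illustrative choices, not the paper's:

```python
import math

def ncc_match(image, template):
    """Slide `template` over `image`, returning a normalised
    cross-correlation score map; scores near 1.0 mark likely particles."""
    th, tw = len(template), len(template[0])
    tvals = [v for row in template for v in row]
    tmean = sum(tvals) / len(tvals)
    tnorm = math.sqrt(sum((v - tmean) ** 2 for v in tvals))
    H, W = len(image), len(image[0])
    scores = []
    for i in range(H - th + 1):
        row_scores = []
        for j in range(W - tw + 1):
            # flatten the current window row-major, matching tvals
            win = [image[i + a][j + b] for a in range(th) for b in range(tw)]
            wmean = sum(win) / len(win)
            wnorm = math.sqrt(sum((v - wmean) ** 2 for v in win))
            num = sum((x - wmean) * (t - tmean) for x, t in zip(win, tvals))
            denom = wnorm * tnorm
            row_scores.append(num / denom if denom else 0.0)
        scores.append(row_scores)
    return scores

def pick_particles(scores, threshold=0.8):
    """Positions (row, col) whose correlation exceeds the threshold."""
    return [(i, j) for i, row in enumerate(scores)
            for j, s in enumerate(row) if s >= threshold]

# Toy micrograph: one synthetic "particle" embedded at (1, 1).
image = [[0, 0, 0, 0],
         [0, 5, 1, 0],
         [0, 1, 5, 0],
         [0, 0, 0, 0]]
template = [[5, 1],
            [1, 5]]
hits = pick_particles(ncc_match(image, template))
```

The per-window normalisation makes the score insensitive to local brightness and contrast changes; real pickers additionally apply noise filtering and non-maximum suppression around each peak.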
Abstract:
People with Parkinson’s disease (PD) have been reported to be at higher risk of malnutrition than an age-matched population due to PD motor and non-motor symptoms and pharmacotherapy side effects. The prevalence of malnutrition in PD has yet to be well-defined. Community-dwelling people with PD, aged > 18 years, were recruited (n = 97, 61 M, 36 F). The Patient-Generated Subjective Global Assessment (PG-SGA) was used to assess nutritional status, the Parkinson’s Disease Questionnaire (PDQ-39) was used to assess quality of life, and the Beck Depression Inventory (BDI) was used to measure depression. Levodopa equivalent doses (LEDs) were calculated based on reported Parkinson’s disease medication. Weight, height, mid-arm circumference (MAC) and calf circumference were measured. Cognitive function was measured using the Addenbrooke’s Cognitive Examination. Average age was 70.0 (SD 9.1, range 35–92) years. Based on SGA, 16 (16.5%) were moderately malnourished (SGA B) while none were severely malnourished (SGA C). The well-nourished participants (SGA A) had a better quality of life, t(90) = −2.28, p < 0.05, and reported fewer depressive symptoms, t(94) = −2.68, p < 0.05, than malnourished participants. Age, years since diagnosis, cognitive function and LEDs did not significantly differ between the groups. The well-nourished participants had lower PG-SGA scores, t(95) = −5.66, p = 0.00, higher BMIs, t(95) = 3.44, p < 0.05, larger MACs, t(95) = 3.54, p < 0.05, and larger calf circumferences, t(95) = 2.29, p < 0.05, than malnourished participants. The prevalence of malnutrition in community-dwelling adults with PD in this study is comparable to that in other studies of community-dwelling adults without PD and is higher than in other PD studies where a nutritional status assessment tool was used. Further research is required to understand the primary risk factors for malnutrition in this group.
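The group comparisons above rely on two-sample t-tests. As a rough, self-contained illustration of the statistic (Welch's form, which does not assume equal group variances — the numbers are toy values, not the study's data):

```python
import math

def welch_t(sample_a, sample_b):
    """Welch's two-sample t statistic and its degrees of freedom."""
    na, nb = len(sample_a), len(sample_b)
    ma, mb = sum(sample_a) / na, sum(sample_b) / nb
    va = sum((x - ma) ** 2 for x in sample_a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in sample_b) / (nb - 1)
    se2 = va / na + vb / nb  # squared standard error of the mean difference
    t = (ma - mb) / math.sqrt(se2)
    # Welch-Satterthwaite approximation for the degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Toy scores for two nutritional-status groups (illustrative only).
t, df = welch_t([1, 2, 3, 4], [3, 4, 5, 6])
```

The resulting t is compared against the t distribution with df degrees of freedom to obtain the p-values reported in the abstract.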
Abstract:
Establishing single sex classes within co-educational sites is an option that Australian schools are again exploring. To date, Australia has experienced three ‘waves’ of interest in establishing single sex classes: the first focused on equitable education opportunities for girls (Alloway & Gilbert, 1997), the second centred on boys’ literacy and engagement (Gilbert & Gilbert, 1998), and the current wave focuses on perceived differences between the sexes in co-educational classrooms (Protheroe, 2009; Gurian, Stevens & Daniels, 2009). With the intersection of the middle schooling movement, focusing on learner-centred classrooms (Pendergast & Bahr, 2010), and current educational agendas aimed at improving student performance and measurable learning outcomes (Ministerial Council on Education, Employment, Training and Youth Affairs, 2008), it is understandable that schools are exploring such student grouping options. However, after thirty years of international research into the efficacy of single sex classes in co-educational settings, the results remain unclear. This paper seeks to navigate the ‘muddy waters’ of this body of research and suggests a framework to help guide school communities through the decision-making process associated with considering single sex classes.
Abstract:
This paper examines the history of the journal International Research in Geographical and Environmental Education (IRGEE) in terms of its sustainable future. The development of geographical and environmental education is evaluated as reflected in the articles published in the journal. A content analysis of all 526 papers and forum sections that have appeared in the journal since Volume 1 Number 1 was published in 1992 was conducted. The analysis revealed themes that have experienced increasing or declining interest over the 18 years of publication of IRGEE (1992-2009), while other themes have remained current throughout this period. The main findings of this analysis are: a) the total number of articles has increased more than threefold; b) articles related to geographical education (sensu stricto) outweighed those related to environmental education; c) the themes “syllabi, textbooks, curricula” and “values, attitudes” attract the attention of researchers with increasing strength; and d) emerging subjects, such as GIS and sustainability, have appeared dynamically in recent years.
Abstract:
The Australian Curriculum marks national reforms in social science education, first with the return to the disciplines of history and geography and second, through a new approach to interdisciplinary learning. This paper raises the question of whether the promise of interdisciplinary learning can be realised in the middle years of schooling if teachers have to teach history as a discipline rather than within an over-arching integrated curriculum framework. The paper explores the national blueprints and considers the national history curriculum in light of theories of teachers’ knowledge and middle school education. Evidence from teacher interviews indicates that historical understanding can be achieved through integrated frameworks to meet the goals of middle schooling.
Abstract:
During the last four decades, educators have created a range of critical literacy approaches for different contexts, including compulsory schooling (Luke & Woods, 2009) and second language education (Luke & Dooley, 2011). Despite inspirational examples of critical work with young students (e.g., O’Brien, 1994; Vasquez, 1994), Comber (2012) laments the persistent myth that critical literacy is not viable in the early years. Assumptions about childhood innocence and the priorities of the back-to-basics movement seem to limit the possibilities for early years literacy teaching and learning. Yet teachers of young students need not face an either/or choice between the basic and critical dimensions of literacy. Systematic ways of treating literacy in all its complexity exist. We argue that the integrative imperative is especially important in schools that are under pressure to improve technical literacy outcomes. In this chapter, we document how critical literacy was addressed in a fairytales unit taught to 4.5- to 5.5-year-olds in a high-diversity, high-poverty Australian school. We analyze the affordances and challenges of different approaches to critical literacy, concluding that they are complementary rather than competing sources of possibility. Furthermore, we make the case for turning familiar classroom activities to critical ends.
Abstract:
Background & aims: One aim of the Australasian Nutrition Care Day Survey was to determine the nutritional status and dietary intake of acute care hospital patients. Methods: Dietitians from 56 hospitals in Australia and New Zealand completed a 24-h survey of the nutritional status and dietary intake of adult hospitalised patients. Nutritional risk was evaluated using the Malnutrition Screening Tool. Participants ‘at risk’ underwent nutritional assessment using Subjective Global Assessment. Based on the International Classification of Diseases (Australian modification), participants were also deemed malnourished if their body mass index was <18.5 kg/m². Dietitians recorded participants’ dietary intake at each main meal and snacks as 0%, 25%, 50%, 75%, or 100% of that offered. Results: 3122 patients (mean age: 64.6 ± 18 years) participated in the study. Forty-one percent of the participants were “at risk” of malnutrition. Overall malnutrition prevalence was 32%. Fifty-five percent of malnourished participants and 35% of well-nourished participants consumed ≤50% of the food offered during the 24-h audit. “Not hungry” was the most common reason for not consuming everything offered during the audit. Conclusion: Malnutrition and sub-optimal food intake are prevalent in acute care patients across hospitals in Australia and New Zealand and warrant appropriate interventions.
Abstract:
Background & aims The Australasian Nutrition Care Day Survey (ANCDS) ascertained whether malnutrition and poor food intake are independent risk factors for health-related outcomes in Australian and New Zealand hospital patients. Methods Phase 1 recorded nutritional status (Subjective Global Assessment) and 24-h food intake (0, 25, 50, 75, 100% intake). Outcomes data (Phase 2) were collected 90 days post-Phase 1 and included length of hospital stay (LOS), readmissions and in-hospital mortality. Results Of 3122 participants (47% females, 65 ± 18 years) from 56 hospitals, 32% were malnourished and 23% consumed ≤25% of the offered food. Malnourished patients had a greater median LOS (15 days vs. 10 days, p < 0.0001) and higher readmission rates (36% vs. 30%, p = 0.001). Median LOS for patients consuming ≤25% of the food was higher than for those consuming ≤50% (13 vs. 11 days, p < 0.0001). The odds of 90-day in-hospital mortality were approximately twice as great for malnourished patients (CI: 1.09–3.34, p = 0.023) and for those consuming ≤25% of the offered food (CI: 1.13–3.51, p = 0.017), respectively. Conclusion The ANCDS establishes that malnutrition and poor food intake are independently associated with in-hospital mortality in the Australian and New Zealand acute care setting.
Abstract:
One aim of the Australasian Nutrition Care Day Survey (ANCDS) was to explore the dietary intake and nutritional status of acute care hospital patients. Dietitians from 56 hospitals in Australia and New Zealand completed a 24-hour nutritional status and dietary intake audit of 3000 adult patients. Participants were evaluated for nutritional risk using the Malnutrition Screening Tool (MST). Those ‘at risk’ underwent nutritional assessment using Subjective Global Assessment (SGA). Dietitians observed participants’ dietary intake at each main meal and recorded mid-meal intake via participant interviews. Intakes were recorded as 0%, 25%, 50%, 75%, or 100% of that offered for each meal during the 24-hour audit. Preliminary results are reported for 1550 participants (males = 853; females = 697), with age = 64 ± 17 years and BMI = 27 ± 7 kg/m². Fifty-five percent (n = 853) of the participants had a BMI > 25 kg/m². The MST identified 41% (n = 636) as ‘at risk’ of malnutrition. Of those ‘at risk’, 70% were assessed as malnourished, resulting in an overall malnutrition prevalence of 30% (25% moderately malnourished, 5% severely malnourished). One-quarter of malnourished participants (n = 118) were on standard hospital diets without additional nutritional support. Fifty percent of malnourished patients (n = 235) and 40% of all patients (n = 620) had an overall food consumption of ≤50% during the 24-hour audit. The ANCDS found that ‘skeletons in the hospital closet’ continue to exist and that acute care patients continue to have suboptimal dietary intake. The ANCDS provides valuable insight into gaps in existing nutrition care practices.
Abstract:
Rationale: The Australasian Nutrition Care Day Survey (ANCDS) evaluated whether malnutrition and decreased food intake are independent risk factors for negative outcomes in hospitalised patients. Methods: A multicentre (56 hospitals) cross-sectional survey was conducted in two phases. Phase 1 evaluated nutritional status (defined by Subjective Global Assessment) and 24-hour food intake, recorded as 0, 25, 50, 75, or 100% intake. Phase 2 data, which included length of stay (LOS), readmissions and mortality, were collected 90 days post-Phase 1. Logistic regression was used to control for confounders: age, gender, disease type and severity (using Patient Clinical Complexity Level scores). Results: Of 3122 participants (53% males, mean age: 65 ± 18 years), 32% were malnourished and 23% consumed ≤25% of the offered food. Median LOS for malnourished (MN) patients was higher than for well-nourished (WN) patients (15 vs. 10 days, p < 0.0001). Median LOS for patients consuming ≤25% of the food was higher than for those consuming ≤50% (13 vs. 11 days, p < 0.0001). MN patients had higher readmission rates (36% vs. 30%, p = 0.001). The odds of 90-day in-hospital mortality were 1.8 times greater for MN patients (CI: 1.03–3.22, p = 0.04) and 2.7 times greater for those consuming ≤25% of the offered food (CI: 1.54–4.68, p = 0.001). Conclusion: The ANCDS demonstrates that malnutrition and/or decreased food intake are associated with longer LOS and readmissions. The survey also establishes that malnutrition and decreased food intake are independent risk factors for in-hospital mortality in acute care patients, and highlights the need for appropriate nutritional screening and support during hospitalisation. Disclosure of Interest: None Declared.
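Odds ratios with confidence intervals like those reported above can be computed from a 2×2 exposure-outcome table; a minimal sketch using the standard Wald interval on the log odds ratio, with hypothetical counts rather than the survey's data:

```python
import math

def odds_ratio(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table with a Wald 95% confidence interval.

    a: exposed with outcome, b: exposed without outcome,
    c: unexposed with outcome, d: unexposed without outcome.
    """
    est = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of the log odds ratio
    lo = math.exp(math.log(est) - z * se)
    hi = math.exp(math.log(est) + z * se)
    return est, (lo, hi)

# Hypothetical counts: in-hospital deaths among malnourished vs.
# well-nourished patients (illustrative only).
est, (lo, hi) = odds_ratio(30, 970, 20, 1980)
```

Note the survey adjusted for confounders via logistic regression, where the adjusted odds ratio is the exponential of the fitted coefficient; the crude 2×2 calculation above shows only the unadjusted form.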
Abstract:
Since the architectural design studio learning environment was first established in the early 19th century at the École des Beaux-Arts in Paris, there has been a complete transformation in how the discipline of architecture is practiced and how students of architecture acquire information. Digital technologies allow students to access information instantly, and learning is no longer confined to the rigid boundaries of a physical campus environment. In many schools of architecture in Australia, however, the physical design studio learning environments remain largely unchanged. Many could be mistaken for environments last refurbished 30 years ago, devoid of any significant technological intervention. While some teaching staff are eagerly embracing new digital technologies and attempting to modify their pedagogical approaches, the physical design studio learning environment is resistant to such efforts. In a study aimed at better understanding how staff and students adapt to new blended learning environments, a group of 165 second-year architecture students at a large school of architecture in Australia were separated into two different design studio learning environments: 70% of students were allocated to a traditional design studio setting and 30% to a new, high-technology-embedded prototype digital learning laboratory. The digital learning laboratory was purpose-designed for the case-study users, adapted Student-Centred Active Learning Environment for Undergraduate Programs [SCALE-UP] principles, and was built as part of a larger university research project. The architecture students attended the same lectures, followed the same studio curriculum and completed the same pieces of assessment; the only major differences were the teaching staff and the physical environment within which the studios were conducted.
At the end of the semester, all staff and students were asked to complete a questionnaire about their experiences and preferences within the two respective learning environments. The questionnaire responses represented the opinions of 100% of the 10 teaching staff and over 70% of the students. Using a qualitative grounded theory approach, data were coded, extrapolated and compared to reveal emerging key themes. These key themes formed the basis for in-depth interviews and focus groups with teaching staff and students, allowing the researchers to understand the data in more detail. The results verified what had become increasingly evident during the course of the semester: an underlying negative resistance to the new digital studio learning environment by both staff and students. Many participants openly exhibited a yearning for a return to the traditional design studio learning environment, particularly when the new technology caused frustration by being unreliable or failing altogether. This paper reports on the study, discusses the negative resistance and explores the major contributors to that resistance. The researchers are not aware of any similar previous studies across these particular settings and believe that this study offers a necessary and important contribution to emergent research about adaptation to new digital learning environments.
Abstract:
Superconducting thick films of Bi2Sr2CaCu2Oy (Bi-2212) on single-crystalline (100) MgO substrates have been prepared using a doctor-blade technique and a partial-melt process. It is found that the phase composition and the amount of Ag addition to the paste affect the structure and superconducting properties of the partially melted thick films. The optimum heat treatment schedule for obtaining high Jc has been determined for each paste. The heat treatment ensures attainment of high purity of the crystalline Bi-2212 phase and high orientation of the Bi-2212 crystals, with the c-axis perpendicular to the substrate. The highest Tc, obtained by resistivity measurement, is 92.2 K. The best value of the transport critical current density Jct of these thick films, measured at 77 K in self-field, is 8 × 10³ A cm⁻².
Abstract:
Since March 2010 in Queensland, legislation has specified the type of restraint and seating row for child passengers under 7 years according to age. This study explored regional parents’ child restraint practices and the influence of their health beliefs on these. A brief intercept interview was verbally administered to a convenience sample of parent-drivers (n = 123) in Toowoomba in February 2010, after the announcement of changes to the legislation but prior to enforcement. Parents who agreed to be followed up were then reinterviewed after enforcement began (May-June 2010). The Health Belief Model was used to gauge beliefs about susceptibility to crashing, children being injured in a crash, and the likely severity of injuries. Self-efficacy and perceptions about barriers to, and benefits of, using age-appropriate restraints with children were also assessed. Results: There were very high levels of rear seating reported for children (initial interview 91%; follow-up 100%). Dedicated child restraint use was 96.9% at the initial interview, though 11% of restraints were deemed inappropriate for the child’s age. Self-reported restraint practices for children under 7 were used to categorise parental practices as ‘Appropriate’ (all children in an age-appropriate restraint and the rear seat) or ‘Inappropriate’ (≥1 child inappropriately restrained). 94% of parents were aware of the legislation, but only around one third gave accurate descriptions of the requirements. However, 89% of parents were deemed to have ‘Appropriate’ restraint practices. Parents with ‘Inappropriate’ practices were significantly more likely than those with ‘Appropriate’ practices to disagree that child restraints provide better protection for children in a crash than adult seatbelts. For self-efficacy, parents with ‘Appropriate’ practices were more likely than those with ‘Inappropriate’ practices to report being ‘completely confident’ about installing child restraints.
The results suggest that efforts to increase levels of appropriate restraint use should aim to better inform parents about the superior protection offered by child restraints compared with adult seat belts for children.