Abstract:
Introduction: Built environment interventions designed to reduce non-communicable diseases and health inequity complement urban planning agendas focused on creating more ‘liveable’, compact, pedestrian-friendly, less automobile-dependent and more socially inclusive cities. However, what constitutes a ‘liveable’ community is not well defined. Moreover, there appears to be a gap between the concept and delivery of ‘liveable’ communities. The recently funded NHMRC Centre of Research Excellence (CRE) in Healthy Liveable Communities, established in early 2014, has defined ‘liveability’ from a social determinants of health perspective. Using purpose-designed multilevel longitudinal datasets, it addresses five themes targeting key gaps in the evidence base for building healthy and liveable communities. The CRE in Healthy Liveable Communities seeks to generate and exchange new knowledge about: 1) measurement of policy-relevant built environment features associated with leading non-communicable disease risk factors (physical activity, obesity), health outcomes (cardiovascular disease, diabetes) and mental health; 2) causal relationships between built environment interventions and these outcomes, using data from longitudinal studies and natural experiments; 3) thresholds for built environment interventions; 4) economic benefits of built environment interventions designed to influence health and wellbeing outcomes; and 5) factors, tools, and interventions that facilitate the translation of research into policy and practice. This evidence is critical to inform future policy and practice in health, land use, and transport planning. Moreover, to ensure policy relevance and facilitate research translation, the CRE in Healthy Liveable Communities builds upon ongoing, and has established new, multi-sector collaborations with national and state policy-makers and practitioners.
The symposium will commence with a brief introduction that situates the research within an Australian health and urban planning context and outlines the CRE in Healthy Liveable Communities, its structure and team. Next, an overview of the five research themes will be presented. Following these presentations, the Discussant will consider the implications of the research and opportunities for translation and knowledge exchange. Theme 2 will establish whether, and to what extent, the neighbourhood environment (built and social) is causally related to physical and mental health and associated behaviours and risk factors. In particular, research conducted under this theme will: use data from large-scale, longitudinal multilevel studies (HABITAT, RESIDE, AusDiab) to examine relationships that meet causality criteria, via statistical methods such as longitudinal mixed-effects and fixed-effects models, multilevel models and structural equation models; analyse data on residential preferences to investigate confounding due to neighbourhood self-selection, using measurement and analysis tools such as propensity score matching and ‘within-person’ change modelling to address this confounding; analyse data on individual-level factors that might confound, mediate or modify relationships between the neighbourhood environment and health and wellbeing (e.g., psychosocial factors, knowledge, perceptions, attitudes, functional status); and analyse data on both objective neighbourhood characteristics and residents’ perceptions of those characteristics, to more accurately assess the relative contributions of objective and perceptual factors to outcomes such as health and wellbeing, physical activity, active transport, obesity, and sedentary behaviour.
At the completion of Theme 2, we will have demonstrated and applied statistical methods appropriate for determining causality, and generated evidence about causal relationships between the neighbourhood environment, health, and related outcomes. This will provide planners and policy-makers with a more robust (valid and reliable) basis on which to design healthy communities.
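The ‘within-person’ change modelling mentioned in this theme can be illustrated with a minimal numpy sketch. All data and variable names below are hypothetical (not drawn from HABITAT, RESIDE or AusDiab): demeaning each person's exposure and outcome removes all time-invariant confounders, such as stable residential preferences, which is the core idea of a fixed-effects analysis of longitudinal data.

```python
import numpy as np

def within_person_estimate(ids, x, y):
    """Fixed-effects ('within-person' change) estimate of the slope of y on x.

    Demeaning each person's x and y removes time-invariant confounders
    (e.g. stable residential preferences), so the slope is identified
    from within-person change only.
    """
    ids = np.asarray(ids)
    xd = np.asarray(x, float).copy()
    yd = np.asarray(y, float).copy()
    for pid in np.unique(ids):
        m = ids == pid
        xd[m] -= xd[m].mean()
        yd[m] -= yd[m].mean()
    return float(xd @ yd / (xd @ xd))

# Hypothetical data: three waves for two people. Between-person
# differences would bias a pooled slope, but the within-person
# slope relating exposure to outcome is 2 for both people.
ids = [1, 1, 1, 2, 2, 2]
x = [0, 1, 2, 10, 11, 12]    # e.g. a walkability exposure
y = [0, 2, 4, 50, 52, 54]    # e.g. weekly minutes of physical activity
print(within_person_estimate(ids, x, y))  # → 2.0
```

The same demeaning logic underlies fixed-effects panel estimators in full statistical packages; this sketch only shows why within-person change removes stable confounding.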
Abstract:
BACKGROUND: Dengue has been a major public health concern in Australia since it re-emerged in Queensland in 1992-1993. We explored spatio-temporal characteristics of locally-acquired dengue cases in northern tropical Queensland, Australia during the period 1993-2012. METHODS: Locally-acquired notified cases of dengue were collected for northern tropical Queensland from 1993 to 2012. Descriptive spatial and temporal analyses were conducted using geographic information system tools and geostatistical techniques. RESULTS: A total of 2,398 locally-acquired dengue cases were recorded in northern tropical Queensland during the study period. The areas affected by dengue cases exhibited spatial and temporal variation over the study period. Notified cases of dengue occurred more frequently in autumn. Mapping of dengue by statistical local areas (census units) revealed substantial spatio-temporal variation over time and place. Statistically significant differences in dengue incidence rates were found between males and females, with more cases in females (χ² = 15.17, d.f. = 1, p < 0.01). Differences were observed among age groups, but these were not statistically significant. There was a significant positive spatial autocorrelation of dengue incidence for the four sub-periods, with the Moran's I statistic ranging from 0.011 to 0.463 (p < 0.01). Semi-variogram analysis and smoothed maps created using interpolation techniques indicate that the pattern of spatial autocorrelation was not homogeneous across northern Queensland. CONCLUSIONS: Tropical areas are potential high-risk areas for mosquito-borne diseases such as dengue. This study demonstrated that locally-acquired dengue cases exhibited spatial and temporal variation over the past twenty years in northern tropical Queensland, Australia. The study therefore provides an impetus for further investigation of clusters and risk factors in these high-risk areas.
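The Moran's I statistic reported above measures global spatial autocorrelation: values near +1 indicate that similar incidence rates cluster in neighbouring areas. A minimal numpy sketch, using a hypothetical four-area contiguity matrix rather than the study's census units, is:

```python
import numpy as np

def morans_i(x, w):
    """Global Moran's I for values x with spatial weight matrix w."""
    x = np.asarray(x, float)
    w = np.asarray(w, float)
    z = x - x.mean()            # deviations from the mean
    n = len(x)
    s0 = w.sum()                # sum of all spatial weights
    return float(n / s0 * (z @ w @ z) / (z @ z))

# Hypothetical four areas along a line (rook contiguity); low values
# cluster next to low, high next to high, so I is clearly positive.
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], float)
x = [1.0, 2.0, 8.0, 9.0]
print(round(morans_i(x, w), 3))  # → 0.4
```

Dedicated spatial statistics libraries additionally provide significance tests (permutation or normal approximation) for I; this sketch computes only the statistic itself.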
Abstract:
This report presents the final deliverable from the project titled ‘Conceptual and statistical framework for a water quality component of an integrated report card’, funded by the Marine and Tropical Sciences Research Facility (MTSRF; Project 3.7.7). The key management driver of this, and a number of other MTSRF projects concerned with indicator development, is the requirement for state and federal government authorities and other stakeholders to provide robust assessments of the present ‘state’ or ‘health’ of regional ecosystems in the Great Barrier Reef (GBR) catchments and adjacent marine waters. An integrated report card format that encompasses both biophysical and socioeconomic factors is an appropriate framework through which to deliver these assessments and meet a variety of reporting requirements. It is now well recognised that a ‘report card’ format for environmental reporting is very effective for community and stakeholder communication and engagement, and can be a key driver in galvanising community and political commitment and action. Although a report card needs to be understandable by all levels of the community, it also needs to be underpinned by sound, quality-assured science. In this regard, this project was to develop approaches to address the statistical issues that arise from amalgamation or integration of sets of discrete indicators into a final score or assessment of the state of the system. In brief, the two main issues are (1) selecting, measuring and interpreting specific indicators that vary both in space and time, and (2) integrating a range of indicators in such a way as to provide a succinct but robust overview of the state of the system. Although there is considerable research on, and knowledge of, the use of indicators to inform the management of ecological, social and economic systems, methods for how best to integrate multiple disparate indicators remain poorly developed.
Therefore, the objectives of this project were to (i) focus on statistical approaches aimed at ensuring that estimates of individual indicators are as robust as possible, and (ii) present methods that can be used to report on the overall state of the system by integrating estimates of individual indicators. It was agreed at the outset that this project would focus on developing methods for a water quality report card. This was driven largely by the requirements of the Reef Water Quality Protection Plan (RWQPP) and led to strong partner engagement with the Reef Water Quality Partnership.
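One simple way to integrate discrete indicators into a single report card score, in the spirit of issue (2) above, is to rescale each indicator against guideline bounds and map a weighted average onto a letter grade. The sketch below is illustrative only: the indicator names, bounds, weights and grade bands are hypothetical, not those adopted by the project.

```python
import numpy as np

def indicator_score(value, lo, hi):
    """Rescale a raw indicator onto [0, 1] against guideline bounds."""
    return float(np.clip((value - lo) / (hi - lo), 0.0, 1.0))

def report_card_grade(scores, weights):
    """Weighted aggregate of indicator scores, mapped to a letter grade."""
    s = float(np.average(scores, weights=weights))
    for cut, grade in [(0.8, "A"), (0.6, "B"), (0.4, "C"), (0.2, "D")]:
        if s >= cut:
            return s, grade
    return s, "E"

# Hypothetical water quality indicators, each already oriented so that
# higher is better, with made-up guideline bounds and weights:
scores = [indicator_score(v, lo, hi) for v, lo, hi in
          [(0.7, 0.0, 1.0),    # water clarity index
           (0.5, 0.0, 1.0),    # dissolved oxygen compliance
           (0.9, 0.0, 1.0)]]   # chlorophyll-a compliance
print(report_card_grade(scores, weights=[0.4, 0.3, 0.3]))
```

A weighted arithmetic mean is only one of several aggregation rules; geometric means or worst-indicator rules penalise poor individual indicators more strongly, which is part of the statistical trade-off the project examined.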
Abstract:
This thesis proposes three novel models which extend the statistical methodology for motor unit number estimation, a clinical neurology technique. Motor unit number estimation is important in the treatment of degenerative muscular diseases and, potentially, spinal injury. Additionally, a recent and largely untested statistic for statistical model choice is found to be a practical alternative for larger datasets. The existing methods for dose finding in dual-agent clinical trials are found to be suitable only for designs of modest dimensions. The model choice case study is the first of its kind and contains interesting results obtained using so-called unit information prior distributions.
Abstract:
There is a wide range of potential study designs for intervention studies to decrease nosocomial infections in hospitals. The analysis is complex due to competing events, clustering, multiple timescales and time-dependent period and intervention variables. This review considers the popular pre-post quasi-experimental design and compares it with randomized designs. Randomization can be done in several ways: randomization of the cluster [intensive care unit (ICU) or hospital] in a parallel design; randomization of the sequence in a cross-over design; and randomization of the time of intervention in a stepped-wedge design. We introduce each design in the context of nosocomial infections and discuss the designs with respect to the following key points: bias, control for nonintervention factors, and generalizability. Statistical issues are discussed. A pre-post-intervention design is often the only choice that will be informative for a retrospective analysis of an outbreak setting. It can be seen as a pilot study with further, more rigorous designs needed to establish causality. To yield internally valid results, randomization is needed. Generally, the first choice in terms of the internal validity should be a parallel cluster randomized trial. However, generalizability might be stronger in a stepped-wedge design because a wider range of ICU clinicians may be convinced to participate, especially if there are pilot studies with promising results. For analysis, the use of extended competing risk models is recommended.
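The stepped-wedge design described above randomizes the time at which each cluster crosses over from control to intervention, with all clusters starting in control and all receiving the intervention by the end. A minimal sketch of such a randomization, with hypothetical ICU names and an assumed even spread of crossover times, is:

```python
import random

def stepped_wedge_schedule(clusters, n_steps, seed=0):
    """Randomize the step at which each cluster (e.g. an ICU) crosses
    over from control to intervention. All clusters start in control;
    all have switched by the final step."""
    rng = random.Random(seed)          # fixed seed for a reproducible draw
    order = clusters[:]
    rng.shuffle(order)                 # random crossover order
    per_step = -(-len(order) // n_steps)  # ceiling division
    # Spread crossover times as evenly as possible over the steps.
    return {c: 1 + i // per_step for i, c in enumerate(order)}

icus = ["ICU-A", "ICU-B", "ICU-C", "ICU-D", "ICU-E", "ICU-F"]
print(stepped_wedge_schedule(icus, n_steps=3, seed=42))
```

Because the order is randomized but the number of clusters crossing over per step is fixed, secular trends in infection rates can be separated from the intervention effect in the analysis.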
Abstract:
The Galilee and Eromanga basins are sub-basins of the Great Artesian Basin (GAB). In this study, a multivariate statistical approach (hierarchical cluster analysis, principal component analysis and factor analysis) is carried out to identify hydrochemical patterns and assess the processes that control hydrochemical evolution within key aquifers of the GAB in these basins. The results of the hydrochemical assessment are integrated into a previously developed 3D geological model to support the analysis of spatial patterns of hydrochemistry, and to identify the hydrochemical and hydrological processes that control hydrochemical variability. In this area of the GAB, the hydrochemical evolution of groundwater is dominated by evapotranspiration near the recharge area, resulting in a dominance of Na–Cl water types. This is shown conceptually using two selected cross-sections which represent discrete groundwater flow paths from the recharge areas to the deeper parts of the basins. With increasing distance from the recharge area, a shift towards a dominance of carbonate (e.g. the Na–HCO3 water type) is observed. The assessment of hydrochemical changes along groundwater flow paths highlights how aquifers are separated in some areas, and how mixing between groundwater from different aquifers occurs elsewhere, controlled by geological structures, including between GAB aquifers and coal-bearing strata of the Galilee Basin. The results of this study suggest that distinct hydrochemical differences can be observed within the previously defined Early Cretaceous–Jurassic aquifer sequence of the GAB. A revision of the two previously recognised hydrochemical sequences is proposed, resulting in three hydrochemical sequences based on systematic differences in hydrochemistry, salinity and dominant hydrochemical processes.
The integrated approach presented in this study, which combines complementary multivariate statistical techniques with a detailed assessment of the geological framework of these sedimentary basins, can be adopted in other complex multi-aquifer systems to assess hydrochemical evolution and its geological controls.
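The principal component analysis step of such a multivariate approach can be sketched with plain numpy: samples are standardized ion by ion, then decomposed by SVD. The major-ion values below are hypothetical illustrations of Na–Cl-dominated versus Na–HCO3-dominated waters, not data from the Galilee or Eromanga basins.

```python
import numpy as np

def pca(data, n_components=2):
    """PCA of a samples-by-ions matrix via SVD on standardized data.

    Returns component scores and the fraction of total variance
    explained by each retained component.
    """
    X = np.asarray(data, float)
    X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each ion
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    scores = U[:, :n_components] * s[:n_components]
    explained = (s**2 / (s**2).sum())[:n_components]
    return scores, explained

# Hypothetical major-ion concentrations (Na, Cl, HCO3) for six samples:
# two Na-Cl-dominated waters and four HCO3-dominated waters.
X = [[120, 180, 20], [130, 190, 25], [30, 20, 150],
     [35, 25, 160], [28, 22, 155], [33, 24, 148]]
scores, explained = pca(X)
print(explained[0] > 0.9)  # one component separates the two water types
```

In practice the component loadings (rows of `Vt`) are inspected to interpret which ions drive each component, and cluster analysis is run on the scores to group samples into hydrochemical facies.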
Abstract:
This paper explores how traditional media organizations (such as magazines, music, film, books, and newspapers) develop routines for coping with an increasingly productive audience. While previous studies have reported on how such organizations have been affected by digital technologies, this study makes a contribution to this literature by being one of the first to show how organizational routines for engaging with an increasingly productive audience actually emerge and diffuse between industries. The paper explores to what extent routines employed by two traditional media organizations have been brought in from other organizational settings, specifically from so-called ‘software platform operators’. Data on routines for engaging with productive audiences have been collected from two information-rich cases in the music and the magazine industries, and from eight high-profile software platform operators. The paper concludes that the routines employed by the two traditional media organizations and by the software platform operators are based on the same set of principles: Provide the audience with (a) tools that allow them to easily generate cultural content; (b) building blocks which facilitate their creative activities; and (c) recognition and rewards based on both rationality and emotion.
Abstract:
This paper presents an overview of the strengths and limitations of existing and emerging geophysical tools for landform studies. The objectives are to discuss recent technical developments and to provide a review of relevant recent literature, with a focus on propagating field methods with terrestrial applications. For the various methods in this category, including ground-penetrating radar (GPR), electrical resistivity (ER), seismics, and electromagnetic (EM) induction, the technical backgrounds are introduced, followed by a section on novel developments relevant to landform characterization. For several decades, GPR has been popular for characterization of the shallow subsurface and in particular sedimentary systems. Novel developments in GPR include, amongst others, the use of multi-offset systems to improve signal-to-noise ratios and data collection efficiency, and the increased use of 3D data. Multi-electrode ER systems have become popular in recent years as they allow for relatively fast and detailed mapping. Novel developments include time-lapse monitoring of dynamic processes as well as the use of capacitively-coupled systems for fast, non-invasive surveys. EM induction methods are especially popular for fast mapping of spatial variation, but can also be used to obtain information on the vertical variation in subsurface electrical conductivity. In recent years several examples of the use of plane-wave EM for characterization of landforms have been published. Seismic methods for landform characterization include seismic reflection and refraction techniques and the use of surface waves. A recent development is the use of passive sensing approaches. The use of multiple geophysical methods, which can benefit from their sensitivity to different subsurface parameters, is becoming more common.
Strategies for coupled and joint inversion of complementary datasets will, once more widely available, benefit the geophysical study of landforms. Three case studies are presented on the use of electrical and GPR methods for characterization of landforms ranging from metres to hundreds of metres in dimension. In a study of polygonal patterned ground in the Saginaw Lowlands, Michigan, USA, electrical resistivity tomography was used to characterize differences in subsurface texture and water content associated with polygon-swale topography. A sand-filled thermokarst feature was also identified using electrical resistivity data. The second example concerns the use of constant spread traversing (CST) for characterization of large-scale glaciotectonic deformation at Ludington Ridge, Michigan. Multiple CST surveys parallel to an ~60 m high cliff, where broad (~100 m) synclines and narrow clay-rich anticlines are visible, illustrated that at least one of the narrow structures extended inland. A third case study discusses internal structures of an eolian dune on a coastal spit in New Zealand. Both 35 and 200 MHz GPR data, which clearly identified a paleosol and internal sedimentary structures of the dune, were used to improve understanding of the development of the dune, which may shed light on paleo-wind directions.
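A basic step in interpreting GPR profiles such as those described above is converting two-way travel time to reflector depth using an assumed or measured radar velocity. The sketch below uses an assumed velocity of 0.1 m/ns, a typical order-of-magnitude value for unsaturated sediments; real surveys estimate velocity from CMP/WARR soundings or targets of known depth.

```python
def gpr_depth(twt_ns, velocity_m_per_ns=0.1):
    """Convert GPR two-way travel time (ns) to reflector depth (m).

    The default velocity of 0.1 m/ns is an assumed, illustrative value;
    actual velocities depend on sediment type and water content.
    """
    # The wave travels down and back, so depth is half the path length.
    return velocity_m_per_ns * twt_ns / 2.0

print(gpr_depth(100))  # depth in metres for a 100 ns reflection
```

Because depth scales linearly with velocity, an error in the assumed velocity translates directly into a proportional depth error, which is why velocity calibration matters when tying radar reflectors to features such as paleosols.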
Abstract:
This paper addresses research from a three-year longitudinal study that engaged children in data modeling experiences from the beginning school year through to third year (6-8 years). A data modeling approach to statistical development differs in several ways from what is typically done in early classroom experiences with data. In particular, data modeling immerses children in problems that evolve from their own questions and reasoning, with core statistical foundations established early. These foundations include a focus on posing and refining statistical questions within and across contexts, structuring and representing data, making informal inferences, and developing conceptual, representational, and metarepresentational competence. Examples are presented of how young learners developed and sustained informal inferential reasoning and metarepresentational competence across the study to become “sophisticated statisticians”.