537 results for Statistical decision


Relevance: 20.00%

Abstract:

There is a wide range of potential study designs for intervention studies to decrease nosocomial infections in hospitals. The analysis is complex due to competing events, clustering, multiple timescales and time-dependent period and intervention variables. This review considers the popular pre-post quasi-experimental design and compares it with randomized designs. Randomization can be done in several ways: randomization of the cluster [intensive care unit (ICU) or hospital] in a parallel design; randomization of the sequence in a cross-over design; and randomization of the time of intervention in a stepped-wedge design. We introduce each design in the context of nosocomial infections and discuss the designs with respect to the following key points: bias, control for nonintervention factors, and generalizability. Statistical issues are discussed. A pre-post-intervention design is often the only choice that will be informative for a retrospective analysis of an outbreak setting. It can be seen as a pilot study with further, more rigorous designs needed to establish causality. To yield internally valid results, randomization is needed. Generally, the first choice in terms of the internal validity should be a parallel cluster randomized trial. However, generalizability might be stronger in a stepped-wedge design because a wider range of ICU clinicians may be convinced to participate, especially if there are pilot studies with promising results. For analysis, the use of extended competing risk models is recommended.
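The randomized alternatives discussed above differ mainly in how intervention timing is assigned. As a generic illustration (cluster counts and period layout are hypothetical, not taken from the review), a stepped-wedge schedule can be sketched as follows:

```python
import random

def stepped_wedge_schedule(n_clusters, n_periods, seed=42):
    """0/1 intervention status per cluster (rows) and period (columns).
    All clusters start in control and cross over at a randomized step,
    so by the final period every cluster receives the intervention."""
    rng = random.Random(seed)
    # one crossover step per cluster, spread over periods 1..n_periods-1
    steps = [1 + (i * (n_periods - 1)) // n_clusters for i in range(n_clusters)]
    rng.shuffle(steps)  # randomization of the time of intervention
    return [[int(t >= s) for t in range(n_periods)] for s in steps]

for row in stepped_wedge_schedule(4, 5):
    print(row)
```

In a parallel cluster design the same matrix would instead have whole rows of all 0s or all 1s; in a cross-over design each row would switch in both directions according to the randomized sequence.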


This paper presents a performance-based optimisation approach for conducting trade-off analysis between safety (roads) and condition (bridges and roads). Safety was based on potential for improvement (PFI). Road condition was based on surface distresses, and bridge condition was based on apparent age per subcomponent. The analysis uses a non-monetised optimisation that expands upon classical Pareto optimality by observing performance across time. It was found that good results depended on the availability of early-age treatments and were limited by a frontier effect: as the end of the analysis period approaches, the optimisation algorithm cannot realise the long-term benefits of deploying actions. A disaggregated bridge condition index proved capable of improving levels of service in bridge subcomponents.
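The non-monetised comparison at the heart of such a trade-off can be illustrated with a generic non-dominated filter (a minimal sketch with hypothetical safety/condition scores, not the paper's optimisation):

```python
def pareto_front(points):
    """Return the non-dominated points when every objective is to be
    maximised (e.g. a safety score and a condition score).  A point is
    dominated if another point is at least as good in all objectives
    and strictly better in at least one."""
    front = []
    for p in points:
        dominated = any(
            all(q[i] >= p[i] for i in range(len(p))) and q != p
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

# hypothetical (safety, condition) scores for candidate maintenance plans
plans = [(0.9, 0.4), (0.6, 0.8), (0.7, 0.7), (0.5, 0.5), (0.4, 0.9)]
print(pareto_front(plans))
```

Observing performance across time, as the paper does, amounts to applying this filter to trajectories rather than to single-period scores.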


We describe an investigation into how Massey University’s Pollen Classifynder can accelerate the understanding of pollen and its role in nature. The Classifynder is an imaging microscopy system that can locate, image and classify slide-based pollen samples. Given the laboriousness of purely manual image acquisition and identification, it is vital to exploit assistive technologies like the Classifynder to enable acquisition and analysis of pollen samples. It is also vital that we understand the strengths and limitations of automated systems so that they can be used (and improved) to complement the strengths and weaknesses of human analysts to the greatest extent possible. This article reviews some of our experiences with the Classifynder system and our exploration of alternative classifier models to enhance both accuracy and interpretability. Our experiments in the pollen analysis problem domain have been based on samples from the Australian National University’s pollen reference collection (2,890 grains, 15 species) and images bundled with the Classifynder system (400 grains, 4 species). These samples have been represented using the Classifynder image feature set. We additionally work through a real-world case study in which we assess the ability of the system to determine the pollen make-up of samples of New Zealand honey. In addition to the Classifynder’s native neural network classifier, we have evaluated linear discriminant, support vector machine, decision tree and random forest classifiers on these data, with encouraging results. Our hope is that our findings will help enhance the performance of future releases of the Classifynder and other systems for accelerating the acquisition and analysis of pollen samples.
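A classifier comparison of this kind can be sketched with scikit-learn (assumed here); the synthetic features below merely stand in for the Classifynder image feature set:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier

# synthetic stand-in: 400 "grains", 20 image features, 4 "species"
X, y = make_classification(n_samples=400, n_features=20,
                           n_informative=10, n_classes=4,
                           random_state=0)

models = {
    "LDA": LinearDiscriminantAnalysis(),
    "SVM": SVC(kernel="rbf"),
    "Decision tree": DecisionTreeClassifier(random_state=0),
    "Random forest": RandomForestClassifier(n_estimators=100, random_state=0),
}

# 5-fold cross-validated accuracy for each candidate model
scores = {name: cross_val_score(m, X, y, cv=5).mean()
          for name, m in models.items()}
for name, acc in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {acc:.3f}")
```

On real pollen features the ranking may of course differ; the point is only the shape of the evaluation loop.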


Objectives Directly measuring disease incidence in a population is difficult and not feasible to do routinely. We describe the development and application of a new method of estimating, at a population level, the number of incident genital chlamydia infections, and the corresponding incidence rates, by age and sex, using routine surveillance data. Methods A Bayesian statistical approach was developed to calibrate the parameters of a decision-pathway tree against national data on numbers of notifications and tests conducted (2001-2013). Independent beta probability density functions were adopted as priors on the time-independent parameters; the shape parameters of these beta distributions were chosen to match prior estimates sourced from peer-reviewed literature or expert opinion. To best facilitate the calibration, multivariate Gaussian priors on (the logistic transforms of) the time-dependent parameters were adopted, using the Matérn covariance function to favour smooth changes over consecutive years and across adjacent age cohorts. The model outcomes were validated by comparing them with independent empirical epidemiological measures, i.e., prevalence and incidence as reported by other studies. Results Model-based estimates suggest that the total number of people acquiring chlamydia per year in Australia has increased by ~120% over 12 years. Nationally, an estimated 356,000 people acquired chlamydia in 2013, which is 4.3 times the number of reported diagnoses. This corresponds to an annual chlamydia incidence estimate of 1.54% in 2013, up from 0.81% in 2001 (a ~90% increase). Conclusions We developed a statistical method that uses routine surveillance (notifications and testing) data to produce estimates of the extent of, and trends in, chlamydia incidence.
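The Matérn-smoothed Gaussian prior over the yearly parameters can be illustrated as follows (a minimal NumPy sketch; the lengthscale and variance values are illustrative, not the calibrated ones):

```python
import numpy as np

def matern32(t1, t2, lengthscale=2.0, variance=1.0):
    """Matérn covariance (nu = 3/2) between two time points: nearby
    years are strongly correlated, which favours smooth year-to-year
    change in the logit-scale parameter."""
    d = abs(t1 - t2) / lengthscale
    return variance * (1 + np.sqrt(3) * d) * np.exp(-np.sqrt(3) * d)

years = np.arange(2001, 2014)
K = np.array([[matern32(a, b) for b in years] for a in years])
K += 1e-9 * np.eye(len(years))  # jitter for numerical stability

rng = np.random.default_rng(0)
logit_draw = rng.multivariate_normal(np.zeros(len(years)), K)
prior_draw = 1 / (1 + np.exp(-logit_draw))  # back-transform to probabilities
```

The same construction extends to a two-dimensional grid of year by age cohort, with a product or sum of Matérn terms giving correlation across adjacent cohorts as well.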


The Galilee and Eromanga basins are sub-basins of the Great Artesian Basin (GAB). In this study, a multivariate statistical approach (hierarchical cluster analysis, principal component analysis and factor analysis) is carried out to identify hydrochemical patterns and assess the processes that control hydrochemical evolution within key aquifers of the GAB in these basins. The results of the hydrochemical assessment are integrated into a previously developed 3D geological model to support the analysis of spatial patterns of hydrochemistry, and to identify the hydrochemical and hydrological processes that control hydrochemical variability. In this area of the GAB, the hydrochemical evolution of groundwater is dominated by evapotranspiration near the recharge area, resulting in a dominance of Na–Cl water types. This is shown conceptually using two selected cross-sections which represent discrete groundwater flow paths from the recharge areas to the deeper parts of the basins. With increasing distance from the recharge area, a shift towards a dominance of carbonate (e.g. the Na–HCO3 water type) is observed. The assessment of hydrochemical changes along groundwater flow paths highlights how aquifers are separated in some areas, and how mixing between groundwater from different aquifers occurs elsewhere, controlled by geological structures, including between GAB aquifers and coal-bearing strata of the Galilee Basin. The results of this study suggest that distinct hydrochemical differences can be observed within the previously defined Early Cretaceous–Jurassic aquifer sequence of the GAB. A revision of the two previously recognised hydrochemical sequences is proposed, resulting in three hydrochemical sequences based on systematic differences in hydrochemistry, salinity and dominant hydrochemical processes.
The integrated approach presented in this study, which combines complementary multivariate statistical techniques with a detailed assessment of the geological framework of these sedimentary basins, can be adopted in other complex multi-aquifer systems to assess hydrochemical evolution and its geological controls.
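The multivariate workflow (standardisation, hierarchical clustering, dimensionality reduction) can be sketched as follows; scikit-learn and SciPy are assumed, and the ion concentrations below are hypothetical stand-ins, not GAB measurements:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# columns: Na, Cl, HCO3, SO4 (mg/L) for two synthetic water types
na_cl = rng.normal([800, 900, 150, 60], 50, size=(20, 4))    # Na-Cl type
na_hco3 = rng.normal([300, 100, 600, 30], 50, size=(20, 4))  # Na-HCO3 type
X = StandardScaler().fit_transform(np.vstack([na_cl, na_hco3]))

# hierarchical cluster analysis (Ward linkage), cut into two clusters
labels = fcluster(linkage(X, method="ward"), t=2, criterion="maxclust")

# principal component analysis for a 2D view of the same samples
pc_scores = PCA(n_components=2).fit_transform(X)
```

In practice the cluster labels and principal-component scores would then be mapped back onto sample locations within the 3D geological model to reveal the spatial patterns discussed above.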


This paper presents a layered framework for integrating different Socio-Technical Systems (STS) models and perspectives into a whole-of-systems model. Holistic modelling plays a critical role in the engineering of STS due to the interplay between social and technical elements within these systems and the resulting emergent behaviour. The framework decomposes STS models into components, where each component is a static object, a dynamic object or a behavioural object. Based on existing literature, a classification of the different elements that make up STS (social, technical or natural environment) is developed; each object can in turn be classified according to the STS elements it represents. Using the proposed framework, it is possible to systematically decompose models to the point where points of interface can be identified and the contextual factors required to transform a component of one model so that it interfaces with another can be obtained. Using an airport inbound passenger facilitation process as a case-study socio-technical system, three different models are analysed: a Business Process Modelling Notation (BPMN) model, a Hybrid Queue-based Bayesian Network (HQBN) model and an Agent Based Model (ABM). It is found that the framework enables the modeller to identify non-trivial interface points, such as between the spatial interactions of an ABM and the causal reasoning of an HQBN, and between the process activity representation of a BPMN model and simulated behavioural performance in an HQBN. Such a framework is a necessary enabler for integrating different modelling approaches in understanding and managing STS.
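The decomposition into typed components might be sketched as a small data structure (the names and the simple pairing rule below are illustrative, not the paper's formal framework):

```python
from dataclasses import dataclass
from enum import Enum

class ObjectType(Enum):
    STATIC = "static"
    DYNAMIC = "dynamic"
    BEHAVIOURAL = "behavioural"

class STSElement(Enum):
    SOCIAL = "social"
    TECHNICAL = "technical"
    NATURAL = "natural environment"

@dataclass
class Component:
    name: str
    obj_type: ObjectType
    element: STSElement
    model: str  # source model, e.g. "BPMN", "HQBN", "ABM"

def interface_points(components):
    """Pair components from different models that represent the same
    STS element -- candidate points of interface between the models."""
    return [(a.name, b.name)
            for i, a in enumerate(components)
            for b in components[i + 1:]
            if a.model != b.model and a.element == b.element]

parts = [
    Component("passenger agent", ObjectType.BEHAVIOURAL, STSElement.SOCIAL, "ABM"),
    Component("queue node", ObjectType.DYNAMIC, STSElement.TECHNICAL, "HQBN"),
    Component("check-in activity", ObjectType.BEHAVIOURAL, STSElement.SOCIAL, "BPMN"),
]
print(interface_points(parts))
```

The contextual factors needed to transform one component into another would then be attached to each identified pair.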


Introduction Systematic reviews, through the synthesis of multiple primary research studies, can be powerful tools in enabling evidence-informed public health policy debate, development and action. In seeking to optimize the utility of these reviews, it is important to understand the needs of those using them. Previous work has emphasized that researchers should adopt methods that are appropriate to the problems that public health decision-makers are grappling with, as well as to the policy context in which they operate [1,2]. Meeting these demands poses significant methodological challenges for review authors and prompts a reconsideration of the resources, training and support structures available to facilitate the efficient and timely production of useful, comprehensive reviews. The Cochrane Public Health Group (CPHG) was formed in 2008 to support reviews of complex, upstream public health topics. The majority of CPHG authors are from the UK, which has historically been at the forefront of efforts to promote the production and use of systematic reviews of research relevant to public health decision-makers. The UK therefore provides a suitably mature national context in which to examine (i) the current and future demands of decision-makers to increase the use, value and impact of evidence syntheses; (ii) the implications this has for the scope and methods of reviews and (iii) the required action to build and support capacity to conduct such reviews.


This paper addresses research from a three-year longitudinal study that engaged children in data modeling experiences from the beginning school year through to third year (6-8 years). A data modeling approach to statistical development differs in several ways from what is typically done in early classroom experiences with data. In particular, data modeling immerses children in problems that evolve from their own questions and reasoning, with core statistical foundations established early. These foundations include a focus on posing and refining statistical questions within and across contexts, structuring and representing data, making informal inferences, and developing conceptual, representational, and metarepresentational competence. Examples are presented of how young learners developed and sustained informal inferential reasoning and metarepresentational competence across the study to become “sophisticated statisticians”.


Evidence from economic evaluations is often not used to inform healthcare policy despite being well regarded by policymakers and physicians. This article employs the accessibility and acceptability framework to review the barriers to using evidence from economic evaluation in healthcare policy and the strategies used to overcome these barriers. Economic evaluations are often inaccessible to policymakers due to the absence of relevant economic evaluations, the time and cost required to conduct and interpret economic evaluations, and lack of expertise to evaluate quality and interpret results. Consistently reported factors that limit the translation of findings from economic evaluations into healthcare policy include poor quality of research informing economic evaluations, assumptions used in economic modelling, conflicts of interest, difficulties in transferring resources between sectors, negative attitudes to healthcare rationing, and the absence of equity considerations. Strategies to overcome these barriers have been suggested in the literature, including training, structured abstract databases, rapid evaluation, reporting checklists for journals, and considering factors other than cost effectiveness in economic evaluations, such as equity or budget impact. The factors that prevent or encourage decision makers to use evidence from economic evaluations have been identified, but the relative importance of these factors to decision makers is uncertain.


Extant models of decision making in social neurobiological systems have typically explained task dynamics as characterized by transitions between two attractors. In this paper, we model a three-attractor task exemplified in a team sport context. The model showed that an attacker–defender dyadic system can be described by the angle x between a vector connecting the participants and the try line. This variable was proposed as an order parameter of the system and could be dynamically expressed by integrating a potential function. Empirical evidence has revealed that this kind of system has three stable attractors, with a potential function of the form V(x) = −k1x + k2ax²/2 − bx⁴/4 + x⁶/6, where k1 and k2 are two control parameters. Random fluctuations were also observed in system behavior, modeled as white noise εt, leading to the equation of motion dx/dt = −dV/dx + √Q εt, where Q is the noise variance. The model successfully mirrored the behavioral dynamics of agents in a social neurobiological system, exemplified by interactions of players in a team sport.
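The equation of motion can be integrated numerically with an Euler–Maruyama scheme (a minimal sketch; the parameter values are illustrative, chosen so the potential actually has three stable attractors, and are not fitted to the empirical data):

```python
import math
import random

def dV(x, k1, k2, a=1.0, b=1.0):
    # derivative of V(x) = -k1*x + k2*a*x**2/2 - b*x**4/4 + x**6/6
    return -k1 + k2 * a * x - b * x**3 + x**5

def simulate(x0=0.0, k1=0.0, k2=0.2, Q=0.001, dt=1e-3, steps=20000, seed=3):
    """Euler-Maruyama integration of dx/dt = -dV/dx + sqrt(Q)*noise.
    With k1 = 0 and 0 < k2 < 0.25 the potential has stable attractors
    at x = 0 and at two symmetric nonzero angles."""
    rng = random.Random(seed)
    x = x0
    for _ in range(steps):
        x += -dV(x, k1, k2) * dt + math.sqrt(Q * dt) * rng.gauss(0.0, 1.0)
    return x
```

Starting in the central well with weak noise, the trajectory fluctuates around x = 0; raising Q or tilting the potential via k1 produces the transitions between attractors that the model uses to describe attacker-defender interactions.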