961 results for Statistical Theory
Abstract:
2000 Mathematics Subject Classification: 41A25, 41A36, 40G15.
Abstract:
2000 Mathematics Subject Classification: 41A25, 41A36.
Abstract:
Statisticians, along with other scientists, have made significant computational advances that enable the estimation of formerly intractable statistical models. The Bayesian inference framework, combined with Markov chain Monte Carlo estimation methods such as the Gibbs sampler, enables the estimation of discrete choice models such as the multinomial logit (MNL) model. MNL models are frequently applied in transportation research to model choice outcomes such as mode, destination, or route choices, or to model categorical outcomes such as crash outcomes. Recent developments allow the potentially limiting assumptions of MNL, such as the independence from irrelevant alternatives (IIA) property, to be relaxed. However, relatively little transportation-related research has focused on Bayesian MNL models, whose tractability is of great value to researchers and practitioners alike. This paper addresses MNL model specification issues in the Bayesian framework, such as the value of including prior information on parameters, allowing for nonlinear covariate effects, and extensions to random parameter models, which relax the usual limiting IIA assumption. It also provides an example, using route-choice data, that demonstrates the considerable potential of the Bayesian MNL approach for many transportation applications. The paper concludes with a discussion of the pros and cons of this Bayesian approach and identifies when its application is worthwhile.
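As an illustration of the kind of estimation this abstract describes, the following is a minimal sketch of Bayesian MNL estimation in Python. It uses a simple random-walk Metropolis sampler with independent Gaussian priors rather than the Gibbs sampler referenced above, and the array names and shapes are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def mnl_loglik(beta, X, y):
    """Multinomial logit log-likelihood.

    X : (n, J, k) array of alternative-specific attributes
    y : (n,) array of chosen alternative indices
    """
    v = X @ beta                       # (n, J) systematic utilities
    v -= v.max(axis=1, keepdims=True)  # numerical stability
    logp = v - np.log(np.exp(v).sum(axis=1, keepdims=True))
    return logp[np.arange(len(y)), y].sum()

def log_prior(beta, prior_sd=2.0):
    """Independent Gaussian priors on the coefficients (assumed)."""
    return -0.5 * np.sum((beta / prior_sd) ** 2)

def metropolis_mnl(X, y, n_iter=20000, step=0.05, seed=0):
    """Random-walk Metropolis sampler for the MNL posterior."""
    rng = np.random.default_rng(seed)
    k = X.shape[2]
    beta = np.zeros(k)
    lp = mnl_loglik(beta, X, y) + log_prior(beta)
    draws = np.empty((n_iter, k))
    for t in range(n_iter):
        prop = beta + step * rng.standard_normal(k)
        lp_prop = mnl_loglik(prop, X, y) + log_prior(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            beta, lp = prop, lp_prop
        draws[t] = beta
    return draws  # discard burn-in before summarising
```

Posterior means and credible intervals are then summarised from the retained draws; informative prior knowledge enters simply by changing `log_prior`.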
Abstract:
In a seminal data mining article, Leo Breiman [1] argued that to develop effective predictive classification and regression models, we need to move away from sole dependence on statistical algorithms and embrace a wider toolkit of modeling algorithms that includes data mining procedures. Nevertheless, many researchers still rely solely on statistical procedures when undertaking data modeling tasks; this sole reliance has led to the development of irrelevant theory and questionable research conclusions ([1], p. 199). We will outline initiatives that the HPC & Research Support group is undertaking to engage researchers with data mining tools and techniques, including a new range of seminars, workshops, and one-on-one consultations covering data mining algorithms, the relationship between data mining and the research cycle, and the limitations and problems of these new algorithms. Organisational limitations and restrictions on these initiatives are also discussed.
Abstract:
Many traffic situations require drivers to cross or merge into a stream having higher priority. Gap acceptance theory enables us to model such processes in order to analyse traffic operation. This discussion demonstrates that a numerical search, fine-tuned by statistical analysis, can be used to determine the most likely critical gap for a sample of drivers, based on their largest rejected gap and accepted gap. The method shares some common features with the Maximum Likelihood Estimation technique (Troutbeck 1992) but lends itself well to contemporary analysis tools such as spreadsheets and is particularly analytically transparent. The method is considered not to bias the estimate of the critical gap as a result of very small or very large rejected gaps. However, it requires a sample large enough to give reasonable representation of largest rejected gap/accepted gap pairs within a fairly narrow highest-likelihood search band.
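For context, the likelihood underlying this style of critical-gap estimation can be sketched as follows, assuming log-normally distributed critical gaps (as in Troutbeck's maximum-likelihood formulation) and wholly hypothetical gap data; this is a sketch, not the spreadsheet procedure the abstract describes.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def neg_loglik(params, rejected, accepted):
    """Negative log-likelihood for log-normal critical gaps.

    Each driver's critical gap is assumed to lie between their largest
    rejected gap and their accepted gap, so each driver contributes
    log[F(accepted) - F(rejected)] to the log-likelihood.
    """
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    F_a = norm.cdf((np.log(accepted) - mu) / sigma)
    F_r = norm.cdf((np.log(rejected) - mu) / sigma)
    return -np.sum(np.log(np.clip(F_a - F_r, 1e-12, None)))

# hypothetical largest-rejected/accepted gap pairs (seconds)
rejected = np.array([2.1, 3.0, 2.6, 3.4, 1.8, 2.9])
accepted = np.array([4.5, 5.2, 3.9, 6.0, 4.1, 4.8])

res = minimize(neg_loglik, x0=[np.log(4.0), 0.3],
               args=(rejected, accepted), method="Nelder-Mead")
mu_hat, sigma_hat = res.x
mean_critical_gap = np.exp(mu_hat + 0.5 * sigma_hat**2)  # log-normal mean
```

Because each driver contributes only the probability mass between their largest rejected and accepted gaps, a reasonable spread of such pairs across the search band is needed, consistent with the sample-size caveat above.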
Abstract:
To enhance workplace safety in the construction industry, it is important to understand the interrelationships among safety risk factors associated with construction accidents. This study incorporates systems theory into Heinrich's domino theory to explore the interrelationships of risks and break the chain of accident causation. Through both empirical and statistical analyses of 9,358 accidents that occurred in the U.S. construction industry between 2002 and 2011, the study investigates relationships between accidents and injury elements (e.g., injury type, part of body, injury severity) and the nature of construction injuries by accident type. The study then discusses relationships between accidents and risks, including worker behavior, injury source, and environmental condition, and identifies key risk factors and risk combinations causing accidents. The research outcomes will assist safety managers in prioritizing risks according to the likelihood of accident occurrence and injury characteristics, and in paying more attention to balancing significant risk relationships to prevent accidents and achieve safer working environments.
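The abstract does not specify which statistical procedures were used; as one hedged illustration of testing a relationship between accidents and injury elements, a chi-square test of association between accident type and injury severity on an entirely invented contingency table could look like this:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical counts: rows are accident types, columns are injury severities.
table = np.array([
    [120,  80,  30],   # fall from elevation
    [ 95,  60,  15],   # struck-by object
    [ 40,  35,  25],   # caught-in/between
])

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p_value:.4f}")
```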
Abstract:
Measuring Earth material behaviour on time scales of millions of years transcends our current capability in the laboratory. We review an alternative path that considers multiscale and multiphysics approaches with quantitative structure-property relationships. This approach provides a sound basis for incorporating physical principles such as chemistry, thermodynamics, diffusion, and geometry-energy relations into simulations and data assimilation across the vast range of length and time scales encountered in the Earth. We identify key length scales for Earth systems processes and find a substantial scale separation between chemical, hydrous, and thermal diffusion. We propose that this allows a simplified two-scale analysis in which the outputs from the micro-scale model are used as inputs for meso-scale simulations, which in turn become the micro-model for the next scale up. We present two fundamental theoretical approaches to link the scales: asymptotic homogenisation from a macroscopic, thermodynamic view and percolation renormalisation from a microscopic, statistical mechanics view.
Abstract:
Interest in utilising multiple heterogeneous Unmanned Aerial Vehicles (UAVs) in close proximity is growing rapidly. This presents many challenges for the effective coordination and management of these UAVs: converting the current n-to-1 paradigm (n operators operating a single UAV) into the 1-to-n paradigm (one operator managing n UAVs). This paper introduces an Information Abstraction methodology used to produce the functional capability framework initially proposed by Chen et al. and its Level Of Detail (LOD) indexing scale. The framework was validated by comparing operator workload and Situation Awareness (SA) across three experimental scenarios involving multiple autonomous, heterogeneous UAVs. The first scenario used a high-LOD configuration with highly abstracted UAV functional information; the second used a mixed-LOD configuration; and the final scenario used a low-LOD configuration with maximal UAV functional information. Results show a statistically significant decrease in operator workload when a UAV's functional information is displayed in its physical form (low LOD, maximal information) compared with the mixed-LOD configuration.
Abstract:
This paper reports research from a three-year longitudinal study that engaged children in data modeling experiences from their first year of school through to their third year (ages 6-8). A data modeling approach to statistical development differs in several ways from what is typically done in early classroom experiences with data. In particular, data modeling immerses children in problems that evolve from their own questions and reasoning, with core statistical foundations established early. These foundations include a focus on posing and refining statistical questions within and across contexts, structuring and representing data, making informal inferences, and developing conceptual, representational, and metarepresentational competence. Examples are presented of how young learners developed and sustained informal inferential reasoning and metarepresentational competence across the study to become “sophisticated statisticians”.
Exploring variation in measurement as a foundation for statistical thinking in the elementary school
Abstract:
This study was based on the premise that variation is the foundation of statistics and statistical investigations. The study followed the development of fourth-grade students' understanding of variation through participation in a sequence of two lessons based on measurement. In the first lesson all students measured the arm span of one student, revealing pathways students follow in developing understanding of variation and linear measurement (related to research question 1). In the second lesson each student's arm span was measured once, introducing a different aspect of variation for students to observe and contrast. From this second lesson, students' development of the ability to compare their representations for the two scenarios and explain differences in terms of variation was explored (research question 2). Students' documentation, in both workbook and software formats, enabled us to monitor their engagement and identify their increasing appreciation of the need to observe, represent, and contrast the variation in the data. Following the lessons, a written student assessment was used for judging retention of understanding of variation developed through the lessons and the degree of transfer of understanding to a different scenario (research question 3).
Abstract:
We apply an information-theoretic cost metric, the symmetrized Kullback-Leibler (sKL) divergence, or $J$-divergence, to fluid registration of diffusion tensor images. The difference between diffusion tensors is quantified based on the sKL-divergence of their associated probability density functions (PDFs). Three-dimensional DTI data from 34 subjects were fluidly registered to an optimized target image. To allow large image deformations but preserve image topology, we regularized the flow with a large-deformation diffeomorphic mapping based on the kinematics of a Navier-Stokes fluid. A driving force was developed to minimize the $J$-divergence between the deforming source and target diffusion functions, while reorienting the flowing tensors to preserve fiber topography. In initial experiments, we showed that the sKL-divergence based on full diffusion PDFs is adaptable to higher-order diffusion models, such as high angular resolution diffusion imaging (HARDI). The sKL-divergence was sensitive to subtle differences between two diffusivity profiles, showing promise for nonlinear registration applications and multisubject statistical analysis of HARDI data.
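For zero-mean Gaussian displacement PDFs defined by diffusion tensors, the sKL (J) divergence has a simple closed form in which the diffusion-time factor cancels; below is a minimal numpy sketch of that formula (up to a convention-dependent factor of two), offered as an illustration rather than the authors' registration code.

```python
import numpy as np

def skl_divergence(D1, D2):
    """Symmetrized KL (J) divergence between the zero-mean Gaussian
    displacement PDFs implied by two diffusion tensors D1, D2 (3x3 SPD).

    sKL = 0.25 * ( tr(D2^{-1} D1) + tr(D1^{-1} D2) - 2n ),  n = 3.
    The diffusion-time scaling of the covariances cancels here.
    """
    n = D1.shape[0]
    t1 = np.trace(np.linalg.solve(D2, D1))  # tr(D2^{-1} D1)
    t2 = np.trace(np.linalg.solve(D1, D2))  # tr(D1^{-1} D2)
    return 0.25 * (t1 + t2 - 2 * n)

# identical tensors give zero divergence
D = np.diag([1.7e-3, 0.4e-3, 0.3e-3])   # illustrative eigenvalues (mm^2/s)
assert np.isclose(skl_divergence(D, D), 0.0)
```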
Abstract:
With the extension of the work of the preceding paper, the relativistic front form for Maxwell's equations for electromagnetism is developed and shown to be particularly suited to the description of paraxial waves. The generators of the Poincaré group in a form applicable directly to the electric and magnetic field vectors are derived. It is shown that the effect of a thin lens on a paraxial electromagnetic wave is given by a six-dimensional transformation matrix, constructed out of certain special generators of the Poincaré group. The method of construction guarantees that the free propagation of such waves as well as their transmission through ideal optical systems can be described in terms of the metaplectic group, exactly as found for scalar waves by Bacry and Cadilhac. An alternative formulation in terms of a vector potential is also constructed. It is chosen in a gauge suggested by the front form and by the requirement that the lens transformation matrix act locally in space. Pencils of light with accompanying polarization are defined for statistical states in terms of the two-point correlation function of the vector potential. Their propagation and transmission through lenses are briefly considered in the paraxial limit. This paper extends Fourier optics and completes it by formulating it for the Maxwell field. We stress that the derivations depend explicitly on the "henochromatic" idealization as well as the identification of the ideal lens with a quadratic phase shift and are heuristic to this extent.
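The "quadratic phase shift" identification mentioned at the end has a familiar scalar Fourier-optics form, shown below for orientation only; the paper itself works with the full Maxwell field rather than this scalar analogue.

```latex
% Ideal thin lens of focal length f acting on a paraxial,
% monochromatic scalar field of wavenumber k:
\[
  u_{\text{out}}(x,y) \;=\; u_{\text{in}}(x,y)\,
      \exp\!\Bigl[-\,\tfrac{i k}{2 f}\bigl(x^{2}+y^{2}\bigr)\Bigr].
\]
```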
Abstract:
Post-traumatic stress disorder (PTSD) is a debilitating psychiatric disorder that has a major impact on the ability to function effectively in daily life. PTSD may develop in response to exposure to an event or events perceived as potentially harmful or life-threatening. It has high prevalence rates in the community, especially among vulnerable groups such as military personnel or those in emergency services. Despite extensive research in this field, the underlying mechanisms of the disorder remain largely unknown. The identification of risk factors for PTSD has posed a particular challenge, as onset of the disorder can be delayed and most people who are exposed to traumatic events will not meet diagnostic criteria for PTSD. With the advent of the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5), the classification of PTSD has changed from an anxiety disorder to the category of stress- and trauma-related disorders. This has the potential to refocus PTSD research on the nature of stress and the stress-response relationship. This review focuses on some of the important findings from psychological and biological research based on early models of stress and resilience. Improving our understanding of PTSD by investigating both genetic and psychological risk and coping factors that influence the stress response, as well as their interaction, may provide a basis for more effective and earlier intervention.