907 results for Statistical model
Abstract:
The interatomic potential of the I-I system at intermediate and small distances is calculated from atomic DFS electron densities within a statistical model. Structures in the potential due to the electronic shells are investigated. Calculations of the elastic differential scattering cross-section at small angles and several keV impact energies show a detailed peak pattern that can be correlated with individual electronic-shell interactions.
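A classical statistical-model potential in this spirit can be sketched as a screened Coulomb interaction. The sketch below uses the Molière approximation to the Thomas-Fermi screening function with the Firsov screening length (atomic units), not the DFS-based potential the abstract actually computes:

```python
import math

def moliere_screening(x):
    """Molière approximation to the Thomas-Fermi screening function."""
    return (0.35 * math.exp(-0.3 * x)
            + 0.55 * math.exp(-1.2 * x)
            + 0.10 * math.exp(-6.0 * x))

def screened_potential(r, z1=53, z2=53):
    """Screened Coulomb potential V(r) = (Z1*Z2/r) * phi(r/a) in atomic
    units, with the Firsov screening length a; Z = 53 for iodine."""
    a = 0.8853 / (z1 ** 0.5 + z2 ** 0.5) ** (2 / 3)
    return z1 * z2 / r * moliere_screening(r / a)
```

Unlike a potential built from shell-resolved electron densities, this smooth screening function shows no shell structure, which is precisely what the DFS-based calculation adds.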
Abstract:
The interatomic potential of the ion-atom scattering system I^N+-I at small and intermediate internuclear distances is calculated for different charge states N from atomic Dirac-Fock-Slater (DFS) electron densities within a statistical model. The behaviour of the potential structures, due to ionized electronic shells, is studied through calculations of classical elastic differential scattering cross-sections.
Abstract:
Object recognition is complicated by clutter, occlusion, and sensor error. Since pose hypotheses are based on image feature locations, these effects can lead to false negatives and positives. In a typical recognition algorithm, pose hypotheses are tested against the image, and a score is assigned to each hypothesis. We use a statistical model to determine the score distribution associated with correct and incorrect pose hypotheses, and use binary hypothesis testing techniques to distinguish between them. Using this approach we can compare algorithms and noise models, and automatically choose values for internal system thresholds to minimize the probability of making a mistake.
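The threshold-selection idea can be sketched in a few lines. In the sketch below the two score distributions are invented Gaussians standing in for the paper's empirically derived ones, and the threshold is chosen to minimize the empirical error under equal priors:

```python
import random

random.seed(0)

# Hypothetical score samples: scores of correct pose hypotheses drawn
# from one distribution, incorrect ones from another (both made up here).
correct = [random.gauss(8.0, 1.0) for _ in range(1000)]
incorrect = [random.gauss(4.0, 1.5) for _ in range(1000)]

def error_rate(threshold):
    """Probability of a mistake when a pose is accepted if its score is
    >= threshold, assuming equal priors on the two hypotheses."""
    false_neg = sum(s < threshold for s in correct) / len(correct)
    false_pos = sum(s >= threshold for s in incorrect) / len(incorrect)
    return 0.5 * (false_neg + false_pos)

# Scan candidate thresholds and keep the one minimizing the empirical error.
best = min((t / 10 for t in range(0, 121)), key=error_rate)
```

The same machinery supports the comparisons the abstract mentions: swapping in a different noise model changes the two score distributions, and the minimized error probability gives a single figure of merit per algorithm.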
Abstract:
This paper describes a new statistical, model-based approach to building a contact state observer. The observer uses measurements of the contact force and position, and prior information about the task encoded in a graph, to determine the current location of the robot in the task configuration space. Each node represents what the measurements will look like in a small region of configuration space by storing a predictive, statistical measurement model. The approach assumes that the measurements are statistically block-independent conditioned on knowledge of the model, which is a fairly good approximation of the actual process. Arcs in the graph represent possible transitions between models. Beam Viterbi search is used to match the measurement history against possible paths through the model graph in order to estimate the most likely path for the robot. The resulting approach provides a new decision process that can be used as an observer for event-driven manipulation programming. The decision procedure is significantly more robust than simple threshold decisions because the measurement history is used to make decisions. The approach can be used to enhance the capabilities of autonomous assembly machines and in quality control applications.
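As a rough illustration of the matching step, the toy sketch below keeps only the `beam` best path hypotheses per measurement; the contact states, transition graph, and likelihood table are invented for the example, not taken from the paper:

```python
import math

# Toy model graph (invented): nodes are contact states, arcs are the
# allowed transitions between them.
transitions = {
    "free": ["free", "contact"],
    "contact": ["contact", "slide"],
    "slide": ["slide"],
}
# Hypothetical per-node measurement log-likelihoods for a binary
# force reading ("low" or "high").
log_lik = {
    "free":    {"low": math.log(0.8), "high": math.log(0.2)},
    "contact": {"low": math.log(0.3), "high": math.log(0.7)},
    "slide":   {"low": math.log(0.4), "high": math.log(0.6)},
}

def beam_viterbi(measurements, start="free", beam=2):
    """Match a measurement history against paths through the graph,
    keeping only the `beam` best (log-prob, path) hypotheses per step."""
    hyps = [(log_lik[start][measurements[0]], [start])]
    for m in measurements[1:]:
        extended = [
            (lp + log_lik[nxt][m], path + [nxt])
            for lp, path in hyps
            for nxt in transitions[path[-1]]
        ]
        hyps = sorted(extended, reverse=True)[:beam]  # beam pruning
    return hyps[0][1]  # most likely contact-state path
```

Because whole paths are scored rather than single readings, one noisy force sample cannot flip the estimated contact state on its own, which is the robustness advantage over threshold decisions.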
Abstract:
We present a tree-structured architecture for supervised learning. The statistical model underlying the architecture is a hierarchical mixture model in which both the mixture coefficients and the mixture components are generalized linear models (GLIMs). Learning is treated as a maximum likelihood problem; in particular, we present an Expectation-Maximization (EM) algorithm for adjusting the parameters of the architecture. We also develop an on-line learning algorithm in which the parameters are updated incrementally. Comparative simulation results are presented in the robot dynamics domain.
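A one-level special case of such a mixture can be fitted with EM in a few lines. The sketch below is much simpler than the hierarchical GLIM architecture of the abstract: synthetic 1-D data, two linear experts with a known noise scale, and a single mixing weight instead of a gating network, all illustrative assumptions:

```python
import math
import random

random.seed(1)

# Synthetic 1-D data from two linear "experts" with slopes +2 and -2
# (data, slopes, and noise scale are invented for illustration).
xs = [i / 50 for i in range(100)]
ys = [(2 * x if random.random() < 0.5 else -2 * x) + random.gauss(0, 0.1)
      for x in xs]

def em(xs, ys, iters=30, sigma=0.1):
    w1, w2, pi = 1.0, -1.0, 0.5          # initial slopes and mixing weight
    for _ in range(iters):
        # E-step: responsibility of expert 1 for each data point.
        r = []
        for x, y in zip(xs, ys):
            p1 = pi * math.exp(-((y - w1 * x) ** 2) / (2 * sigma ** 2))
            p2 = (1 - pi) * math.exp(-((y - w2 * x) ** 2) / (2 * sigma ** 2))
            r.append(p1 / (p1 + p2))
        # M-step: responsibility-weighted least squares for each slope,
        # then update the mixing weight.
        w1 = (sum(ri * x * y for ri, x, y in zip(r, xs, ys))
              / sum(ri * x * x for ri, x in zip(r, xs)))
        w2 = (sum((1 - ri) * x * y for ri, x, y in zip(r, xs, ys))
              / sum((1 - ri) * x * x for ri, x in zip(r, xs)))
        pi = sum(r) / len(r)
    return w1, w2, pi
```

The full architecture replaces the constant mixing weight with input-dependent gating GLIMs and nests such mixtures in a tree, but the E-step/M-step alternation is the same.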
Abstract:
Compositional random vectors are fundamental tools in the Bayesian analysis of categorical data. Many of the issues that are discussed with reference to the statistical analysis of compositional data have a natural counterpart in the construction of a Bayesian statistical model for categorical data. This note builds on the idea of cross-fertilization of the two areas recommended by Aitchison (1986) in his seminal book on compositional data. Particular emphasis is put on the problem of what parameterization to use.
Abstract:
A statistical method is proposed for classifying voltage sags according to their origin, downstream or upstream of the recording point. The goal is to obtain a statistical model, based on the sag waveforms, that characterises one type of sag and discriminates it from the other. The model is built using multi-way principal component analysis and is later used to project the available registers into a new, lower-dimensional space. A case base of diagnosed sags is then built in the projection space. Finally, classification is done by comparing new sags against those in the case base. Similarity is defined in the projection space using a combination of distances to recover the nearest neighbours of the new sag, and the method assigns the origin of the new sag according to the origin of its neighbours.
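The final classification step can be sketched as nearest-neighbour voting in the projection space. Everything in the sketch below is invented for illustration: the case base is a handful of hand-placed 2-D points, and plain Euclidean distance stands in for the paper's combination of distance measures:

```python
import math
from collections import Counter

# Hypothetical case base: each diagnosed sag has already been projected
# into a low-dimensional space (2-D here) and labelled with its origin.
case_base = [
    ((0.9, 1.1), "downstream"), ((1.0, 0.8), "downstream"),
    ((1.2, 1.0), "downstream"), ((-0.9, -1.0), "upstream"),
    ((-1.1, -0.7), "upstream"), ((-0.8, -1.2), "upstream"),
]

def classify(projected_sag, k=3):
    """Assign an origin by majority vote among the k nearest neighbours
    of the new sag in the projection space."""
    nearest = sorted(case_base,
                     key=lambda c: math.dist(c[0], projected_sag))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]
```

In the actual method the coordinates would come from projecting the sag waveform onto the multi-way principal components rather than being given directly.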
Abstract:
Breast cancer is the most frequent neoplasm among women worldwide and in Colombia. This article describes trends in breast cancer mortality in Bogotá and Colombia between 1995 and 2009. Methodology: Descriptive study of mortality-rate trends using an Age-Period-Cohort statistical model. Cases were taken from breast cancer death certificates registered with the Departamento Nacional de Estadística between 1995 and 2009. Different models were evaluated using the estimable-functions method: period, period-drift (linear change over time), period-age, period-cohort, and period-age-cohort. Results: The breast cancer mortality rate in Colombia was 6.78 per 100,000, remaining constant across the three periods, while in Bogotá it was 7.78 per 100,000, with a downward trend among women aged 40 to 69 in the last period studied. In this study, the period + cohort effect best describes the behaviour of breast cancer mortality rates in both settings (Bogotá AIC: 13.8, p=0.314; Colombia AIC: 27.4, p=0.238). Conclusions: There is a downward trend in breast cancer mortality in certain age groups in Bogotá in the period 2005-2009. Several hypotheses could explain this phenomenon, among them access to screening, although there are no studies on the matter. Strengthening screening and early diagnosis in the rest of the country is recommended.
Abstract:
BACKGROUND: In Colombia, 2010 reports from the National Nutrition Situation Survey (ENSIN 2010) show that one in two Colombians has a body mass index higher than expected. METHODS: This cross-sectional study determined the prevalence of obesity and other cardiovascular risk factors in a population of health sciences students at a regional university during the first academic period of 2013. The sample size was n=113 subjects, 60.5% from the medicine programme and 39.5% from nursing. Using clinical, anthropometric, and serum measurements, the study examined habits and lifestyle factors such as alcohol consumption, tobacco use, and sedentary behaviour, together with their association with inflammatory events related to the pathophysiology of weight-related health processes, and specified a statistical model suited to understanding the behaviour of obesity and cardiovascular disease. RESULTS: The estimated prevalence of overweight and obesity by body mass index (BMI) was 27.7% (95% CI: 19.9%-37.2%); by abdominal circumference (OBPABD) it was 27.4% (95% CI: 19.9%-36.4%); and by waist-hip ratio (OBICC) it was 3.5% (95% CI: 1.3%-9.3%). CONCLUSIONS: Given the presence of unhealthy habits together with overweight and obesity, a general nutritional assessment of students across the different faculties is needed in the first instance, and preventive strategies should be proposed, since the literature documents both the effects of unhealthy habits and the benefits of preventing them, for which an association with cardiovascular disease has been found.
To obtain more information on the behaviour of cardiovascular risk factors, retrospective studies involving the university's other degree programmes should be carried out, so that the entire university population can be evaluated.
Abstract:
Asset correlations are of critical importance in quantifying portfolio credit risk and economic capital in financial institutions. Estimation of asset correlation with rating transition data has focused on the point estimation of the correlation without giving any consideration to the uncertainty around these point estimates. In this article we use Bayesian methods to estimate a dynamic factor model for default risk using rating data (McNeil et al., 2005; McNeil and Wendin, 2007). Bayesian methods allow us to formally incorporate human judgement in the estimation of asset correlation, through the prior distribution, and to fully characterize a confidence set for the correlations. Results indicate: i) a two-factor model, rather than the one-factor model proposed by the Basel II framework, better represents the historical default data; ii) the importance of unobserved factors in this type of model is reinforced, and the levels of the implied asset correlations critically depend on the latent state variable used to capture the dynamics of default, as well as on other assumptions of the statistical model; iii) the posterior distributions of the asset correlations show that the Basel recommended bounds for this parameter understate the level of systemic risk.
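For context, the one-factor benchmark that Basel II builds on can be written in a few lines. The sketch below is the standard Vasicek conditional-default-probability formula, not the article's Bayesian dynamic factor model:

```python
import math
from statistics import NormalDist

N = NormalDist()

def conditional_pd(pd, rho, z):
    """Vasicek one-factor model: probability of default conditional on the
    systematic factor z, for unconditional default probability `pd` and
    asset correlation `rho`."""
    return N.cdf((N.inv_cdf(pd) - math.sqrt(rho) * z) / math.sqrt(1 - rho))
```

The sensitivity of the conditional PD to `rho` under adverse draws of `z` is what makes the choice of correlation bounds a systemic-risk question.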
Abstract:
We use an empirical statistical model to demonstrate significant skill in making extended-range forecasts of the monthly-mean Arctic Oscillation (AO). Forecast skill derives from persistent circulation anomalies in the lowermost stratosphere and is greatest during boreal winter. A comparison to the Southern Hemisphere provides evidence that both the time scale and predictability of the AO depend on the presence of persistent circulation anomalies just above the tropopause. These circulation anomalies most likely affect the troposphere through changes to waves in the upper troposphere, which induce surface pressure changes that correspond to the AO.
Abstract:
A seasonal forecasting system that is capable of skilfully predicting rainfall totals on a regional scale would be of great value to Ethiopia. Here, we describe how a statistical model can exploit the teleconnections described in part 1 of this pair of papers to develop such a system. We show that, in most cases, the predictors selected objectively by the statistical model can be interpreted in the light of physical teleconnections with Ethiopian rainfall, and discuss why, in some cases, unexpected regions are chosen as predictors. We show that the forecast has skill in all parts of Ethiopia, and argue that this method could provide the basis of an operational seasonal forecasting system for Ethiopia.
Abstract:
A method is proposed to determine the extent of degradation in the rumen involving a two-stage mathematical modeling process. In the first stage, a statistical model shifts (or maps) the gas accumulation profile obtained using a fecal inoculum to a ruminal gas profile. Then, a kinetic model determines the extent of degradation in the rumen from the shifted profile. The kinetic model is presented as a generalized mathematical function, allowing any one of a number of alternative equation forms to be selected. This method might allow the gas production technique to become an approach for determining extent of degradation in the rumen, decreasing the need for surgically modified animals while still maintaining the link with the animal. Further research is needed before the proposed methodology can be used as a standard method across a range of feeds.
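The two-stage structure can be caricatured as follows. Both functions are placeholders: the paper fits the fecal-to-ruminal shift statistically rather than with a fixed scale and lag, and its kinetic model is a generalized function, of which the first-order effective-degradability form below is only one common instance:

```python
def shift_fecal_to_ruminal(profile, scale=1.8, lag=2.0):
    """Stage 1 (placeholder): map a fecal-inoculum gas accumulation curve,
    given as (time, gas) pairs, onto a ruminal one with a fixed linear
    scale and lag; the paper fits this shift statistically instead."""
    return [(t - lag, scale * g) for t, g in profile if t >= lag]

def effective_degradability(kd, kp, a=0.2, b=0.7):
    """Stage 2: a common first-order kinetic form, E = a + b*kd/(kd+kp),
    with degradation rate kd and ruminal passage rate kp (all parameter
    values here are illustrative)."""
    return a + b * kd / (kd + kp)
```

Separating the two stages is what lets the gas-production technique substitute for ruminally fistulated animals: only stage 1 touches the inoculum source, while stage 2 stays a standard rumen kinetic calculation.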
Abstract:
Biologists frequently attempt to infer the character states at ancestral nodes of a phylogeny from the distribution of traits observed in contemporary organisms. Because phylogenies are normally inferred from data, it is desirable to account for the uncertainty in estimates of the tree and its branch lengths when making inferences about ancestral states or other comparative parameters. Here we present a general Bayesian approach for testing comparative hypotheses across statistically justified samples of phylogenies, focusing on the specific issue of reconstructing ancestral states. The method uses Markov chain Monte Carlo techniques for sampling phylogenetic trees and for investigating the parameters of a statistical model of trait evolution. We describe how to combine information about the uncertainty of the phylogeny with uncertainty in the estimate of the ancestral state. Our approach does not constrain the sample of trees only to those that contain the ancestral node or nodes of interest, and we show how to reconstruct ancestral states of uncertain nodes using a most-recent-common-ancestor approach. We illustrate the methods with data on ribonuclease evolution in the Artiodactyla. Software implementing the methods (BayesMultiState) is available from the authors.
Abstract:
1. We studied a reintroduced population of the formerly critically endangered Mauritius kestrel Falco punctatus Temminck from its inception in 1987 until 2002, by which time the population had attained carrying capacity for the study area. Post-1994 the population received minimal management other than the provision of nestboxes. 2. We analysed data collected on survival (1987-2002) using program MARK to explore the influence of density-dependent and density-independent processes on survival over the course of the population's development. 3. We found evidence for non-linear, threshold density dependence in juvenile survival rates. Juvenile survival was also strongly influenced by climate, with the temporal distribution of rainfall during the cyclone season being the most influential climatic variable. Adult survival remained constant throughout. 4. Our most parsimonious capture-mark-recapture statistical model, which was constrained by density and climate, explained 75.4% of the temporal variation exhibited in juvenile survival rates over the course of the population's development. 5. This study is an example of how data collected as part of a threatened species recovery programme can be used to explore the role and functional form of natural population regulatory processes. With the improvements in conservation management techniques and the resulting success stories, formerly threatened species offer unique opportunities to further our understanding of the fundamental principles of population ecology.