935 results for Random Walk Models


Relevance:

30.00%

Publisher:

Abstract:

Stochastic models for three-dimensional particles have many applications in applied sciences. Lévy-based particle models are a flexible approach to particle modelling. The structure of the random particles is given by a kernel smoothing of a Lévy basis. The models are easy to simulate but statistical inference procedures have not yet received much attention in the literature. The kernel is not always identifiable and we suggest one approach to remedy this problem. We propose a method to draw inference about the kernel from data often used in local stereology and study the performance of our approach in a simulation study.
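
A minimal sketch of the kind of simulation the abstract describes (a kernel smoothing of a Lévy basis), assuming a gamma basis and a Gaussian kernel on a 1D grid; the parameters are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Discretize the index space and simulate a gamma Levy basis: independent
# gamma increments, one per grid cell of measure dv (assumed parameters).
n = 512
v = np.linspace(0.0, 1.0, n, endpoint=False)
dv = 1.0 / n
L = rng.gamma(shape=5.0 * dv, scale=1.0, size=n)

# Kernel smoothing X(u) = sum_j k(u - v_j) L_j with an assumed Gaussian
# kernel of bandwidth h; X is one realization of the random field.
h = 0.05
K = np.exp(-0.5 * ((v[:, None] - v[None, :]) / h) ** 2)
X = K @ L
```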

Relevance:

30.00%

Publisher:

Abstract:

Context. According to the sequential accretion model (or core-nucleated accretion model), giant planet formation is based first on the formation of a solid core which, when massive enough, can gravitationally bind gas from the nebula to form the envelope. The most critical part of the model is the formation time of the core: to trigger the accretion of gas, the core has to grow to several Earth masses before the gas component of the protoplanetary disc dissipates. Aims: We calculate planetary formation models including a detailed description of the dynamics of the planetesimal disc, taking into account both gas drag and excitation by forming planets. Methods: We computed the formation of planets, considering the oligarchic regime for the growth of the solid core. Embryos growing in the disc stir their neighbouring planetesimals, exciting their relative velocities, which makes accretion more difficult. Here we introduce a more realistic treatment for the evolution of planetesimals' relative velocities, which directly impacts the formation timescale. For this, we computed the excitation state of planetesimals resulting from stirring by forming planets and from gas-solid interactions. Results: We find that the formation of giant planets is favoured by the accretion of small planetesimals, as their random velocities are more easily damped by the gas drag of the nebula. Moreover, the capture radius of a protoplanet with a (tiny) envelope is also larger for small planetesimals. However, planets migrate as a result of disc-planet angular momentum exchange, with important consequences for their survival: due to the slow growth of a protoplanet in the oligarchic regime, rapid inward type I migration has important implications for intermediate-mass planets that have not yet started their runaway gas accretion phase. Most of these planets are lost into the central star. Surviving planets have masses either below 10 M⊕ or above several Jupiter masses. Conclusions: To form giant planets before the dissipation of the disc, small planetesimals (~0.1 km) have to be the major contributors to the solid accretion process. However, the combination of oligarchic growth and fast inward migration leads to the absence of intermediate-mass planets. Other processes must therefore be at work to explain the population of extrasolar planets presently known.
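
To illustrate the damping argument (small planetesimals lose their random velocity to gas drag faster), here is a hedged order-of-magnitude sketch in the quadratic-drag regime; every number is an assumption for illustration, not a value from the paper:

```python
import numpy as np

rho_gas = 1e-9   # nebular gas density, g/cm^3 (assumed)
rho_p = 2.0      # planetesimal bulk density, g/cm^3 (assumed)
v_rel = 1e4      # random velocity relative to the gas, cm/s (assumed)
C_d = 1.0        # drag coefficient, quadratic-drag regime (assumed)
YEAR = 3.15e7    # seconds per year

for r in (1e4, 1e5, 1e6, 1e7):  # planetesimal radius 0.1 km to 100 km, in cm
    m = 4.0 / 3.0 * np.pi * rho_p * r**3                # planetesimal mass
    F = 0.5 * C_d * np.pi * r**2 * rho_gas * v_rel**2   # quadratic drag force
    t_damp = m * v_rel / F                              # momentum / force
    print(f"r = {r / 1e5:6.1f} km -> t_damp ~ {t_damp / YEAR:9.2e} yr")
```

Since t_damp grows linearly with radius, the 0.1 km bodies are damped orders of magnitude faster than 100 km ones, which is the mechanism favouring giant-planet formation above.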

Relevance:

30.00%

Publisher:

Abstract:

Perceptual learning is a training-induced improvement in performance. Mechanisms underlying the perceptual learning of depth discrimination in dynamic random-dot stereograms were examined by assessing stereothresholds as a function of decorrelation. The inflection point of the decorrelation function was defined as the level of decorrelation corresponding to 1.4 times the threshold at 0% decorrelation. In general, stereothresholds increased with increasing decorrelation. Following training, stereothresholds and standard errors of measurement decreased systematically for all tested decorrelation values. Post-training decorrelation functions were reduced by a multiplicative constant (approximately 5), exhibiting changes in stereothresholds without changes in the inflection points. Disparity energy model simulations indicate that a post-training reduction in neuronal noise is sufficient to account for the perceptual learning effects. In two subjects, the learning effects were retained over a period of six months, which may have application for training stereo-deficient subjects.

Relevance:

30.00%

Publisher:

Abstract:

In recent years, the econometrics literature has shown a growing interest in the study of partially identified models, in which the object of economic and statistical interest is a set rather than a point. The characterization of this set and the development of consistent estimators and inference procedures for it with desirable properties are the main goals of partial identification analysis. This review introduces the fundamental tools of the theory of random sets, which brings together elements of topology, convex geometry, and probability theory to develop a coherent mathematical framework to analyze random elements whose realizations are sets. It then elucidates how these tools have been fruitfully applied in econometrics to reach the goals of partial identification analysis.
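
One worked formula from this literature (the Aumann expectation and its support-function characterization, standard in random-set theory rather than specific to this review): for an integrably bounded random closed convex set F, the sharp identification region for the mean of its measurable selections is

```latex
% E[F] is the Aumann expectation of F; h_F is its support function.
\[
  \mathbb{E}[F] \;=\; \Bigl\{\, \mu \in \mathbb{R}^d \;:\;
    \langle \mu, u\rangle \,\le\, \mathbb{E}\bigl[h_F(u)\bigr]
    \ \text{ for all } u \in \mathbb{S}^{d-1} \Bigr\},
  \qquad h_F(u) \;=\; \sup_{x \in F}\,\langle u, x\rangle .
\]
```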

Relevance:

30.00%

Publisher:

Abstract:

INTRODUCTION The aim of this study was to determine the reproducibility and accuracy of linear measurements on 2 types of dental models derived from cone-beam computed tomography (CBCT) scans: CBCT images, and Anatomodels (InVivoDental, San Jose, Calif); these were compared with digital models generated from dental impressions (Digimodels; Orthoproof, Nieuwegein, The Netherlands). The Digimodels were used as the reference standard. METHODS The 3 types of digital models were made from 10 subjects. Four examiners repeated 37 linear tooth and arch measurements 10 times. Paired t tests and intraclass correlation coefficients were used to determine the reproducibility and accuracy of the measurements. RESULTS The CBCT images showed significantly smaller intraclass correlation coefficient values and larger duplicate measurement errors compared with the corresponding values for Digimodels and Anatomodels. The average difference between measurements on CBCT images and Digimodels ranged from -0.4 to 1.65 mm, with limits-of-agreement values up to 1.3 mm for crown-width measurements. The average difference between Anatomodels and Digimodels ranged from -0.42 to 0.84 mm, with limits-of-agreement values up to 1.65 mm. CONCLUSIONS Statistically significant differences between measurements on Digimodels and Anatomodels, and between Digimodels and CBCT images, were found. Although the mean differences might be clinically acceptable, the random errors were relatively large compared with corresponding measurements reported in the literature for both Anatomodels and CBCT images, and might be clinically important. Therefore, with the CBCT settings used in this study, measurements made directly on CBCT images and Anatomodels are not as accurate as measurements on Digimodels.
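
A minimal sketch of the limits-of-agreement (Bland-Altman) computation behind comparisons like these, with made-up paired crown-width measurements in mm; nothing below uses the study's data:

```python
import numpy as np

digi = np.array([8.1, 7.9, 9.2, 6.8, 8.5, 7.4])  # reference, e.g. Digimodels
cbct = np.array([8.4, 7.6, 9.5, 7.1, 8.9, 7.2])  # test, e.g. CBCT images

diff = cbct - digi
bias = diff.mean()                          # mean difference between methods
sd = diff.std(ddof=1)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement
print(f"bias = {bias:.2f} mm, limits of agreement = "
      f"[{loa[0]:.2f}, {loa[1]:.2f}] mm")
```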

Relevance:

30.00%

Publisher:

Abstract:

Effects of conspecific neighbours on survival and growth of trees have been found to be related to species abundance. Both positive and negative relationships may explain observed abundance patterns. Surprisingly, it is rarely tested whether such relationships could be biased or even spurious due to transforming neighbourhood variables or influences of spatial aggregation, distance decay of neighbour effects and standardization of effect sizes. To investigate potential biases, communities of 20 identical species were simulated with log-series abundances but without species-specific interactions. No relationship of conspecific neighbour effects on survival or growth with species abundance was expected. Survival and growth of individuals was simulated in random and aggregated spatial patterns using no, linear, or squared distance decay of neighbour effects. Regression coefficients of statistical neighbourhood models were unbiased and unrelated to species abundance. However, variation in the number of conspecific neighbours was positively or negatively related to species abundance depending on transformations of neighbourhood variables, spatial pattern and distance decay. Consequently, effect sizes and standardized regression coefficients, often used in model fitting across large numbers of species, were also positively or negatively related to species abundance depending on transformation of neighbourhood variables, spatial pattern and distance decay. Tests using randomized tree positions and identities provide the best benchmarks by which to critically evaluate relationships of effect sizes or standardized regression coefficients with tree species abundance. This will better guard against potential misinterpretations.
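
A hedged sketch of the recommended benchmark: shuffle tree identities/positions, refit the neighbourhood effect, and compare the observed effect size against the resulting null distribution. The data and the simple standardized-slope stand-in for the full neighbourhood model are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(4)

n = 400
conspecific = rng.poisson(3.0, n).astype(float)  # neighbour counts per tree
survival = rng.binomial(1, 0.7, n)               # independent of neighbours

def effect_size(x, y):
    # Standardized slope of a simple linear fit (stand-in for the full model).
    return np.polyfit((x - x.mean()) / x.std(), y, 1)[0]

observed = effect_size(conspecific, survival)
null = np.array([effect_size(rng.permutation(conspecific), survival)
                 for _ in range(999)])
p_value = (np.abs(null) >= abs(observed)).mean()  # randomization p-value
print(f"observed = {observed:.3f}, randomization p = {p_value:.3f}")
```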

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVES To test the applicability, accuracy, precision, and reproducibility of various 3D superimposition techniques for radiographic data transformed to triangulated surface data. METHODS Five superimposition techniques (3P: three-point registration; AC: anterior cranial base; AC + F: anterior cranial base + foramen magnum; BZ: both zygomatic arches; 1Z: one zygomatic arch) were tested using eight pairs of pre-existing CT data (pre- and post-treatment). These were obtained from non-growing orthodontic patients treated with rapid maxillary expansion. All datasets were superimposed by three operators independently, who repeated the whole procedure one month later. Accuracy was assessed by the distance (D) between superimposed datasets on three form-stable anatomical areas, located on the anterior cranial base and the foramen magnum. Precision and reproducibility were assessed using the distances between models at four specific landmarks. Non-parametric multivariate models and Bland-Altman difference plots were used for the analyses. RESULTS There was no difference among operators or between time points in the accuracy of each superimposition technique (p > 0.05). The AC + F technique was the most accurate (D < 0.17 mm), as expected, followed by the AC and BZ superimpositions, which presented a similar level of accuracy (D < 0.5 mm). 3P and 1Z were the least accurate superimpositions (D > 0.79 mm). Although precision and reproducibility were comparable across techniques (p > 0.05), the detected structural changes differed significantly between techniques (p < 0.05). Bland-Altman difference plots showed that the BZ superimposition was comparable to AC, though it presented a slightly higher random error. CONCLUSIONS Superimposition of 3D datasets using surface models created from voxel data can provide accurate, precise, and reproducible results, offering also high efficiency and increased post-processing capabilities. In the present study population, the BZ superimposition was comparable to AC, with the added advantage of being applicable to scans with a smaller field of view.
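
A minimal sketch of the accuracy metric described above: the mean distance D between already-superimposed surface meshes, evaluated by nearest-neighbour search. The vertex sets are random stand-ins, not patient data:

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(2)

pre = rng.normal(size=(5000, 3))               # pre-treatment surface vertices
post = pre + 0.1 * rng.normal(size=(5000, 3))  # registered post-treatment set

# Nearest-neighbour distance from each post-treatment vertex to the
# pre-treatment surface; D is their mean over a form-stable region.
d, _ = cKDTree(pre).query(post)
D = d.mean()
print(f"mean surface distance D = {D:.3f} (model units)")
```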

Relevance:

30.00%

Publisher:

Abstract:

With the recognition of the importance of evidence-based medicine, there is an emerging need for methods to systematically synthesize available data. Specifically, methods to provide accurate estimates of test characteristics for diagnostic tests are needed to help physicians make better clinical decisions. To provide more flexible approaches for meta-analysis of diagnostic tests, we developed three Bayesian generalized linear models. Two of these models, a bivariate normal and a binomial model, analyzed pairs of sensitivity and specificity values while incorporating the correlation between these two outcome variables. Noninformative independent uniform priors were used for the variances of sensitivity and specificity and for the correlation. We also applied an inverse Wishart prior to check the sensitivity of the results. The third model was a multinomial model in which the test results were modeled as multinomial random variables. All three models can include specific imaging techniques as covariates in order to compare performance. Vague normal priors were assigned to the coefficients of the covariates. The computations were carried out using the 'Bayesian inference using Gibbs sampling' (BUGS) implementation of Markov chain Monte Carlo techniques. We investigated the properties of the three proposed models through extensive simulation studies. We also applied these models to a previously published meta-analysis dataset on cervical cancer as well as to an unpublished melanoma dataset. In general, our findings show that the point estimates of sensitivity and specificity were consistent among the Bayesian and frequentist bivariate normal and binomial models. However, in the simulation studies, the estimates of the correlation coefficient from the Bayesian bivariate models were not as good as those obtained from frequentist estimation, regardless of which prior distribution was used for the covariance matrix. The Bayesian multinomial model consistently underestimated sensitivity and specificity regardless of the sample size and correlation coefficient. In conclusion, the Bayesian bivariate binomial model provides the most flexible framework for future applications because of the following strengths: (1) it facilitates direct comparison between different tests; (2) it captures the variability in both sensitivity and specificity simultaneously, as well as the intercorrelation between the two; and (3) it can be directly applied to sparse data without ad hoc correction.
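
The thesis used the BUGS implementation of MCMC; as a hedged modern stand-in, here is a minimal PyMC sketch of a bivariate binomial model for paired sensitivity/specificity with a between-study correlation. The study counts and priors are illustrative assumptions, not the thesis specification:

```python
import numpy as np
import pymc as pm

# Hypothetical per-study diagnostic counts (not from the thesis data).
tp = np.array([42, 18, 60, 27]); fn = np.array([8, 5, 12, 6])
tn = np.array([70, 33, 95, 50]); fp = np.array([10, 7, 15, 9])
k = len(tp)

with pm.Model():
    # Population means of logit-sensitivity and logit-specificity.
    mu = pm.Normal("mu", 0.0, 2.0, shape=2)
    # Between-study covariance, capturing the sens/spec correlation.
    chol, _, _ = pm.LKJCholeskyCov("chol", n=2, eta=2.0,
                                   sd_dist=pm.Exponential.dist(1.0))
    logit_pair = pm.MvNormal("logit_pair", mu=mu, chol=chol, shape=(k, 2))
    sens = pm.math.invlogit(logit_pair[:, 0])
    spec = pm.math.invlogit(logit_pair[:, 1])
    # Binomial likelihoods: diseased subjects for sensitivity,
    # healthy subjects for specificity.
    pm.Binomial("y_se", n=tp + fn, p=sens, observed=tp)
    pm.Binomial("y_sp", n=tn + fp, p=spec, observed=tn)
    idata = pm.sample(1000, tune=1000, chains=2, random_seed=7)
```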

Relevance:

30.00%

Publisher:

Abstract:

Strategies are compared for the development of a linear regression model with stochastic (multivariate normal) regressor variables and the subsequent assessment of its predictive ability. Bias and mean squared error of four estimators of predictive performance are evaluated in simulated samples of 32 population correlation matrices. Models including all of the available predictors are compared with those obtained using selected subsets. The subset selection procedures investigated include two stopping rules, C_p and S_p, each combined with an 'all possible subsets' or 'forward selection' of variables. The estimators of performance utilized include parametric (MSEP_m) and non-parametric (PRESS) assessments in the entire sample, and two data-splitting estimates restricted to a random or balanced (Snee's DUPLEX) 'validation' half sample. The simulations were performed as a designed experiment, with population correlation matrices representing a broad range of data structures.

The techniques examined for subset selection do not generally result in improved predictions relative to the full model. Approaches using 'forward selection' result in slightly smaller prediction errors and less biased estimators of predictive accuracy than 'all possible subsets' approaches, but no differences are detected between the performances of C_p and S_p. In every case, prediction errors of models obtained by subset selection in either of the half splits exceed those obtained using all predictors and the entire sample.

Only the random split estimator is conditionally (on β) unbiased; however, MSEP_m is unbiased on average and PRESS is nearly so in unselected (fixed form) models. When subset selection techniques are used, MSEP_m and PRESS always underestimate prediction errors, by as much as 27 percent (on average) in small samples. Despite their bias, the mean squared errors (MSE) of these estimators are at least 30 percent less than that of the unbiased random split estimator. The DUPLEX split estimator suffers from large MSE as well as bias, and seems of little value within the context of stochastic regressor variables.

To maximize predictive accuracy while retaining a reliable estimate of that accuracy, it is recommended that the entire sample be used for model development, and a leave-one-out statistic (e.g. PRESS) be used for assessment.
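
A minimal sketch of the recommended leave-one-out assessment: PRESS computed from a single fit via the identity e_(i) = e_i / (1 − h_ii), so the model is never refit n times. The data are simulated, not the dissertation's:

```python
import numpy as np

rng = np.random.default_rng(5)

n, p = 60, 4
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])
beta = np.array([1.0, 0.5, -0.3, 0.0, 0.2])
y = X @ beta + rng.normal(scale=0.8, size=n)

H = X @ np.linalg.solve(X.T @ X, X.T)   # hat matrix of the full-sample fit
e = y - H @ y                           # ordinary residuals
press = np.sum((e / (1.0 - np.diag(H))) ** 2)   # leave-one-out shortcut
print(f"PRESS = {press:.2f}, residual MSE = {e @ e / (n - p - 1):.3f}")
```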

Relevance:

30.00%

Publisher:

Abstract:

The implementation of a charging policy for heavy goods vehicles in European Union (EU) member countries has been imposed to reflect the costs of construction and maintenance of infrastructure as well as externalities such as congestion, accidents and environmental impact. In this context, EU countries approved the Eurovignette directive (1999/62/EC) and its amending directive (2006/38/EC), which established a legal framework to regulate the system of tolls. Even if that regulation seeks to increase the efficiency of freight, it will trigger direct and indirect effects on Spain's regional economies by increasing transport costs. This paper presents the development of a multiregional input-output methodology (MRIO) with elastic trade coefficients to predict interregional trade, using transport attributes integrated in multinomial logit models. This method is highly useful for the ex-ante evaluation of transport policies because it incorporates the cost sensitivity of road freight transport and determines the regional distributive and substitution economic effects in countries like Spain, characterized by socio-demographic and economic attributes that differ region by region. It will thus be possible to determine cost-effective strategies under different policy scenarios. The MRIO model is then used to determine the impact on the employment rate of imposing a charge in the Madrid-Sevilla corridor in Spain. Measuring the impact on the employment rate matters because it is one of the main macroeconomic indicators of Spain's regional and national economic situation. A previous study (DESTINO), using an MRIO method, estimated the employment impacts of a road pricing policy across Spanish regions, considering a fuel tax charge (€/liter) on the entire shortest-cost-path network for freight transport. It found that the variation in employment is expected to be substantial for some regions and negligible for others: in that Spanish case study, regional employment reductions ranged between 16.1% (Rioja) and 1.4% (Madrid region). This variation seems to be related to either the intensity of freight transport in each region or the dependency of regions on transport-intensive economic sectors. In fact, regions with freight-transport-intensive sectors will lose more jobs, while regions with a predominantly service economy undergo a fairly insignificant loss of employment. This paper focuses on evaluating a freight transport vehicle-kilometre charge (€/km) on a non-tolled motorway corridor (A-4) between Madrid and Sevilla (517 km). The consequences of implementing the road pricing policy show that the employment reductions are not as high as those stated in the previous research, because this corridor does not affect the whole freight transport system of Spain.
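
A hedged sketch of the core mechanics with one aggregate sector and three hypothetical regions: multinomial-logit trade coefficients driven by transport costs, fed into a Leontief solution. All parameters are illustrative assumptions, not estimates from the paper:

```python
import numpy as np

# Unit transport costs c[r, s] from origin region r to destination s (assumed).
c = np.array([[0.0, 1.0, 2.0],
              [1.0, 0.0, 1.5],
              [2.0, 1.5, 0.0]])
theta = 0.8  # assumed logit cost-sensitivity parameter

# Elastic trade coefficients: share of region s's purchases sourced from r.
T = np.exp(-theta * c)
T /= T.sum(axis=0, keepdims=True)

a = np.array([0.25, 0.30, 0.20])         # input per unit output, by region
A = T * a[np.newaxis, :]                 # multiregional coefficient matrix
f = T @ np.array([100.0, 80.0, 60.0])    # final demand routed by trade shares
x = np.linalg.solve(np.eye(3) - A, f)    # Leontief solution: gross outputs
print(x.round(1))

# Raising c on one corridor (e.g. a km-charge) and recomputing T, A, f, x
# yields the distributive and substitution effects discussed above.
```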

Relevance:

30.00%

Publisher:

Abstract:

During launch, satellites and their equipment are subjected to loads of a random nature over a wide frequency range. Their vibro-acoustic response is an important issue to be analysed, for example for folded solar arrays and antennas. The main issue at low modal density is modelling combinations that couple air layers, structures and the external fluid. Depending on the modal density, different methodologies, such as FEM, BEM and SEA, should be considered. This work focuses on the analysis of different combinations of the methodologies stated previously, used to characterise the vibro-acoustic response of two rectangular sandwich-structure panels, both isolated and enclosing an air layer between them, under a diffuse acoustic field. Focusing on the modelling of air layers, different models are proposed. To illustrate the phenomenology described and studied, experimental results from an acoustic test on an ARA-MKIII solar array in folded configuration are presented along with numerical results.

Relevance:

30.00%

Publisher:

Abstract:

Crowd-induced dynamic loading in large structures, such as gymnasiums or stadiums, is usually modelled as a series of harmonic loads defined in terms of their Fourier coefficients. Different values of these coefficients, obtained from full-scale measurements, can be found in design codes. Recently, an alternative has been proposed, based on the random generation of load time histories that take into account the phase lag among individuals inside the crowd. This paper presents the testing done on a structure designed to be a gymnasium. Two series of dynamic tests were performed on the gym slab. For the first test an electrodynamic shaker was placed at several locations, and during the second one people located inside a marked area bounced and jumped guided by different metronome rates. A finite element model (FEM) is presented, and a comparison of numerically predicted and experimentally observed vibration modes and frequencies has been used to assess its validity. The second group of measurements is compared with predictions made using the FEM model and three alternatives for crowd-induced load modelling.
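
A minimal sketch of the random load-generation alternative: each person contributes a truncated Fourier series of harmonics of the jumping rate, with a random phase lag representing imperfect synchronization within the crowd. The weight, Fourier coefficients and phase spread are assumed values, not those used in the tests:

```python
import numpy as np

rng = np.random.default_rng(0)

G = 800.0                  # weight of one person, N (assumed)
f_jump = 2.0               # metronome jumping rate, Hz (assumed)
alphas = (1.7, 1.0, 0.4)   # Fourier coefficients of harmonics 1-3 (assumed)
n_people, fs, T = 50, 200.0, 10.0
t = np.arange(0.0, T, 1.0 / fs)

F = np.zeros_like(t)
for _ in range(n_people):
    phi = rng.normal(0.0, 0.4)  # random phase lag per person, rad (assumed)
    F += G * (1.0 + sum(a * np.sin(2.0 * np.pi * (i + 1) * f_jump * t
                                   + (i + 1) * phi)
                        for i, a in enumerate(alphas)))
# F(t) is one realization of the crowd load to apply to the FEM model.
```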

Relevance:

30.00%

Publisher:

Abstract:

This special issue gathers together a number of recent papers on fractal geometry and its applications to the modeling of flow and transport in porous media. The aim is to provide a systematic approach for analyzing the statics and dynamics of fluids in fractal porous media by means of theory, modeling and experimentation. The topics covered include lacunarity analyses of multifractal and natural grayscale patterns, random packings of self-similar pore/particle size distributions, Darcian and non-Darcian hydraulic flows, diffusion within fractals, models for the permeability and thermal conductivity of fractal porous media, and the hydrophobicity and surface erosion properties of fractal structures.

Relevance:

30.00%

Publisher:

Abstract:

Services in smart environments seek to increase the quality of people's lives. Among the most important issues when developing this kind of environment are testing and validating such services. These tasks usually imply high costs and annoying or unfeasible real-world testing. In such cases, artificial societies may be used to simulate the smart environment (i.e. the physical environment, equipment and humans). With this aim, the CHROMUBE methodology guides test engineers when modeling human beings. Such models reproduce behaviors that are highly similar to the real ones. Originally, these models are based on automata whose transitions are governed by random variables. The automaton's structure and the probability distribution functions of each random variable are determined by a manual trial-and-error process. In this paper, an alternative extension of this methodology is presented which avoids that manual process. It is based on learning human behavior patterns automatically from sensor data by using machine learning techniques. The presented approach has been tested on a real scenario, where this extension has yielded highly accurate human behavior models.
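
A minimal sketch of the learning step, assuming the sensor data have already been discretized into activity labels: estimate the automaton's transition probabilities from observed consecutive pairs. The labels and sequence are invented for illustration:

```python
from collections import defaultdict

# Hypothetical activity sequence inferred from sensor data.
events = ["sleep", "kitchen", "bathroom", "kitchen", "living", "sleep",
          "kitchen", "living", "living", "sleep"]

counts = defaultdict(lambda: defaultdict(int))
for a, b in zip(events, events[1:]):
    counts[a][b] += 1  # observed transition a -> b

# Row-normalize the counts into the transition probabilities of the automaton.
P = {a: {b: n / sum(row.values()) for b, n in row.items()}
     for a, row in counts.items()}
print(P["kitchen"])  # empirical distribution of what follows "kitchen"
```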

Relevance:

30.00%

Publisher:

Abstract:

Recent experimental data on the conductivity σ+(T), T → 0, on the metallic side of the metal-insulator transition in ideally random (neutron-transmutation-doped) 70Ge:Ga have shown that σ+(0) ∝ (N − Nc)^μ with μ = ½, confirming earlier ultra-low-temperature results for Si:P. This value is inconsistent with theoretical predictions based on diffusive classical scaling models, but it can be understood by a quantum-directed percolative filamentary amplitude model in which electronic basis states exist that have a well-defined momentum parallel, but not normal, to the applied electric field. The model, which is based on a new kind of broken symmetry, also explains the anomalous sign reversal of the derivative of the temperature dependence in the critical regime.