937 results for Bose-Einstein condensation statistical model


Relevance: 100.00%

Abstract:

Background: There is a recognized need to move from mortality to morbidity outcome predictions following traumatic injury. However, there are few morbidity outcome prediction scoring methods, and these fail to incorporate important comorbidities or cofactors. This study aims to develop and evaluate a method that includes such variables. Methods: This was a consecutive case series registered in the Queensland Trauma Registry that consented to a prospective 12-month telephone-conducted follow-up study. A multivariable statistical model was developed relating Trauma Registry data to trichotomized 12-month post-injury outcome (categories: no limitations, minor limitations and major limitations). Cross-validation techniques using successive single hold-out samples were then conducted to evaluate the model's predictive capabilities. Results: In total, 619 participated, with 337 (54%) experiencing no limitations, 101 (16%) experiencing minor limitations and 181 (29%) experiencing major limitations 12 months after injury. The final parsimonious multivariable statistical model included whether the injury was in the lower extremity body region, injury severity, age, length of hospital stay, pulse at admission and whether the participant was admitted to an intensive care unit. This model explained 21% of the variability in post-injury outcome. Predictively, 64% of those with no limitations, 18% of those with minor limitations and 37% of those with major limitations were correctly identified. Conclusion: Although carefully developed, this statistical model lacks the predictive power necessary for its use as the basis of a useful prognostic tool. Further research is required to identify variables other than those routinely used in the Trauma Registry to develop a model with the necessary predictive utility.
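The "successive single hold-out" evaluation described above is leave-one-out cross-validation. A minimal sketch, using synthetic stand-in data and a simple nearest-centroid classifier rather than the study's actual multivariable model:

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic stand-ins for registry predictors and a 3-category outcome
X = rng.normal(size=(60, 4))
y = rng.integers(0, 3, size=60)

def nearest_centroid_predict(X_train, y_train, x):
    # classify by the closest class mean (a simple stand-in classifier)
    classes = np.unique(y_train)
    dists = [np.linalg.norm(x - X_train[y_train == c].mean(axis=0))
             for c in classes]
    return classes[int(np.argmin(dists))]

# successive single hold-out (leave-one-out) evaluation
correct = 0
for i in range(len(X)):
    mask = np.arange(len(X)) != i          # hold out sample i
    if nearest_centroid_predict(X[mask], y[mask], X[i]) == y[i]:
        correct += 1
loo_accuracy = correct / len(X)
```

Per-category hit rates, as reported in the abstract, would be obtained by tallying `correct` separately for each outcome category.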

Relevance: 100.00%

Abstract:

We experimentally investigate the outcoupling of atoms from Bose-Einstein condensates using two radio-frequency (rf) fields in the presence of gravity. We show that the fringe separation in the resulting interference pattern derives entirely from the energy difference between the two rf fields and not the gravitational potential difference between the two resonances. We subsequently demonstrate how the phase and polarization of the rf radiation directly control the phase of the matter wave interference and provide a semiclassical interpretation of the results.

Relevance: 100.00%

Abstract:

We investigate the quantum many-body dynamics of dissociation of a Bose-Einstein condensate of molecular dimers into pairs of constituent bosonic atoms and analyze the resulting atom-atom correlations. The quantum fields of both the molecules and atoms are simulated from first principles in three dimensions using the positive-P representation method. This allows us to provide an exact treatment of the molecular field depletion and s-wave scattering interactions between the particles, as well as to extend the analysis to nonuniform systems. In the simplest uniform case, we find that the major source of atom-atom decorrelation is atom-atom recombination, which produces molecules outside the initially occupied condensate mode. The unwanted molecules are formed from dissociated atom pairs with nonopposite momenta. The net effect of this process, which becomes increasingly significant for dissociation durations corresponding to more than about 40% conversion, is to reduce the atom-atom correlations. In addition, for nonuniform systems we find that mode mixing due to inhomogeneity can result in further degradation of the correlation signal. We characterize the correlation strength via the degree of squeezing of particle number-difference fluctuations in a certain momentum-space volume and show that the correlation strength can be increased if the signals are binned into larger counting volumes.

Relevance: 100.00%

Abstract:

This study aimed to determine whether there is a difference between the returns of companies listed on the IBOVESPA and those listed in the Corporate Governance Levels created by BOVESPA in December 2000 to distinguish companies that voluntarily adopt additional corporate governance practices. The main objective of this process is to improve transparency between investors and companies, thereby reducing information asymmetry. The research began with a literature review covering information asymmetry, transaction cost theory, agency theory and the history of corporate governance worldwide and in Brazil. To test for differences between the returns of companies listed on the IBOVESPA and in the differentiated corporate governance levels, the ANOVA statistical model and the t-test were applied to a sample comprising all companies listed on the IBOVESPA on 31/03/2011 whose shares were traded between 2006 and 2010, the last five years since the implementation of the governance levels in Brazil. The results show that companies listed on the Novo Mercado and in Level 1 have higher returns than companies listed on the traditional market.
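The pairwise group comparison used in studies like this one can be illustrated with Welch's t statistic; a minimal sketch with synthetic return series (not the study's data), computed directly rather than via a statistics package:

```python
import numpy as np

rng = np.random.default_rng(1)
# hypothetical monthly returns for two listing segments (synthetic)
novo_mercado = rng.normal(0.012, 0.05, size=60)
traditional  = rng.normal(0.008, 0.05, size=60)

def welch_t(a, b):
    # Welch's t statistic: difference in means over the unpooled
    # standard error (does not assume equal group variances)
    va, vb = a.var(ddof=1) / len(a), b.var(ddof=1) / len(b)
    return (a.mean() - b.mean()) / np.sqrt(va + vb)

t_stat = welch_t(novo_mercado, traditional)
```

In practice the statistic would be referred to a t distribution with Welch-Satterthwaite degrees of freedom to obtain a p-value.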

Relevance: 100.00%

Abstract:

The aim of this study was to determine the cues used to signal avoidance of difficult driving situations and to test the hypothesis that drivers with relatively poor high contrast visual acuity (HCVA) have fewer crashes than drivers with relatively poor normalised low contrast visual acuity (NLCVA). This is because those with poorer HCVA are well aware of their difficulties and avoid dangerous driving situations, while those with poorer NLCVA are often unaware of the extent of their problem. Age, self-reported situation avoidance and HCVA were collected during a practice-based study of 690 drivers. Screening was also carried out on 7254 drivers at various venues, mainly motorway sites, throughout the UK. Age, self-reported situation avoidance and prior crash involvement were recorded, and Titmus vision screeners were used to measure HCVA and NLCVA. Situation avoidance increased in reduced visibility conditions and was influenced by age and HCVA. Only half of the drivers used visual cues to signal situation avoidance, and most of these drivers used high rather than low contrast cues. A statistical model designed to remove confounding interrelationships between variables showed, for drivers who did not report situation avoidance, that crash involvement decreased for drivers with below average HCVA and increased for those with below average NLCVA. These relationships accounted for less than 1% of the crash variance, so the hypothesis was not strongly supported. © 2002 The College of Optometrists.

Relevance: 100.00%

Abstract:

Analysis of variance (ANOVA) is the most efficient method available for the analysis of experimental data. Analysis of variance is a method of considerable complexity and subtlety, with many different variations, each of which applies in a particular experimental context. Hence, it is possible to apply the wrong type of ANOVA to data and, therefore, to draw an erroneous conclusion from an experiment. This article reviews the types of ANOVA most likely to arise in clinical experiments in optometry including the one-way ANOVA ('fixed' and 'random effect' models), two-way ANOVA in randomised blocks, three-way ANOVA, and factorial experimental designs (including the varieties known as 'split-plot' and 'repeated measures'). For each ANOVA, the appropriate experimental design is described, a statistical model is formulated, and the advantages and limitations of each type of design discussed. In addition, the problems of non-conformity to the statistical model and determination of the number of replications are considered. © 2002 The College of Optometrists.
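As a minimal illustration of the fixed-effect one-way ANOVA reviewed above, the F statistic can be computed directly from the between-group and within-group sums of squares (synthetic data, not from any clinical study):

```python
import numpy as np

rng = np.random.default_rng(2)
# three treatment groups of 10 replicates each (synthetic measurements)
groups = [rng.normal(loc, 1.0, size=10) for loc in (0.0, 0.0, 1.0)]

def one_way_anova_F(groups):
    # fixed-effect one-way ANOVA:
    # F = (between-group mean square) / (within-group mean square)
    all_x = np.concatenate(groups)
    grand = all_x.mean()
    k, N = len(groups), len(all_x)
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (N - k))

F = one_way_anova_F(groups)
```

The statistic is referred to an F distribution with (k − 1, N − k) degrees of freedom; random-effect, randomised-block and factorial designs partition the sums of squares differently, which is exactly why the article stresses matching the ANOVA variant to the design.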

Relevance: 100.00%

Abstract:

A visualization plot of a data set of molecular data is a useful tool for gaining insight into a set of molecules. In chemoinformatics, most visualization plots are of molecular descriptors, and the statistical model most often used to produce a visualization is principal component analysis (PCA). This paper takes PCA, together with four other statistical models (NeuroScale, GTM, LTM, and LTM-LIN), and evaluates their ability to produce clustering in visualizations not of molecular descriptors but of molecular fingerprints. Two different tasks are addressed: understanding structural information (particularly combinatorial libraries) and relating structure to activity. The quality of the visualizations is compared both subjectively (by visual inspection) and objectively (with global distance comparisons and local k-nearest-neighbor predictors). On the data sets used to evaluate clustering by structure, LTM is found to perform significantly better than the other models. In particular, the clusters in LTM visualization space are consistent with the relationships between the core scaffolds that define the combinatorial sublibraries. On the data sets used to evaluate clustering by activity, LTM again gives the best performance but by a smaller margin. The results of this paper demonstrate the value of using both a nonlinear projection map and a Bernoulli noise model for modeling binary data.
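A minimal sketch of the PCA baseline used in such visualizations, applied to toy binary fingerprints (random bits, purely illustrative; the paper's fingerprints and the other four models are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(3)
# toy binary "fingerprints": 100 molecules x 32 bits
fps = (rng.random((100, 32)) < 0.3).astype(float)

def pca_2d(X):
    # centre the data, then project onto the top-2 right singular
    # vectors (the leading principal components)
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:2].T

coords = pca_2d(fps)   # 2-D visualization coordinates
```

PCA's implicit Gaussian noise assumption is one reason the paper's Bernoulli-noise LTM fares better on binary fingerprint data.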

Relevance: 100.00%

Abstract:

The target of no-reference (NR) image quality assessment (IQA) is to establish a computational model to predict the visual quality of an image. The existing prominent method is based on natural scene statistics (NSS). It uses the joint and marginal distributions of wavelet coefficients for IQA. However, this method is only applicable to JPEG2000 compressed images. Since the wavelet transform fails to capture the directional information of images, an improved NSS model is established by contourlets. In this paper, the contourlet transform is applied to the NSS of images, and the relationship between contourlet coefficients is represented by their joint distribution. The statistics of contourlet coefficients are suitable indicators of variation in image quality. In addition, an image-dependent threshold is adopted to reduce the effect of image content on the statistical model. Finally, image quality can be evaluated by combining the extracted features in each subband nonlinearly. Our algorithm is trained and tested on the LIVE database II. Experimental results demonstrate that the proposed algorithm is superior to the conventional NSS model and can be applied to different distortions. © 2009 Elsevier B.V. All rights reserved.

Relevance: 100.00%

Abstract:

We provide a theoretical explanation of the results on the intensity distributions and correlation functions obtained from a random-beam speckle field in nonlinear bulk waveguides reported in the recent publication by Bromberg et al. [Nat. Photonics 4, 721 (2010)]. We study both the focusing and defocusing cases and, in the limit of small speckle size (short-correlated disordered beam), provide analytical asymptotes for the intensity probability distributions at the output facet. Additionally, we provide a simple relation between the speckle sizes at the input and output of a focusing nonlinear waveguide. The results are of practical significance for nonlinear Hanbury Brown and Twiss interferometry in both optical waveguides and Bose-Einstein condensates. © 2012 American Physical Society.
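For a fully developed linear speckle field (the short-correlated limit referenced above), the intensity follows a negative-exponential distribution, so the normalised second moment ⟨I²⟩/⟨I⟩² should equal 2. A quick numerical check with synthetic circular-Gaussian field samples (illustrative only, not the paper's nonlinear calculation):

```python
import numpy as np

rng = np.random.default_rng(4)
# a short-correlated random beam: complex circular-Gaussian field samples
field = rng.normal(size=100_000) + 1j * rng.normal(size=100_000)
intensity = np.abs(field) ** 2

# for fully developed linear speckle, P(I) = exp(-I/<I>)/<I>,
# so the normalised second moment <I^2>/<I>^2 is ~2
g2 = (intensity ** 2).mean() / intensity.mean() ** 2
```

Departures of this moment from 2 at the output facet are one signature of the nonlinear propagation the paper analyses.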

Relevance: 100.00%

Abstract:

2000 Mathematics Subject Classification: 62P10, 62J12.

Relevance: 100.00%

Abstract:

2010 Mathematics Subject Classification: 94A17.

Relevance: 100.00%

Abstract:

This study explores factors related to prompt difficulty in Automated Essay Scoring. The sample was composed of 6,924 students. For each student, there were 1-4 essays, across 20 different writing prompts, for a total of 20,243 essays. The E-rater® v.2 essay scoring engine developed by the Educational Testing Service was used to score the essays. The scoring engine employs a statistical model that incorporates 10 predictors associated with writing characteristics, of which 8 were used. The Rasch partial credit analysis was applied to the scores to determine the difficulty levels of prompts. In addition, the scores were used as outcomes in a series of hierarchical linear models (HLM) in which students and prompts constituted the cross-classification levels. This methodology was used to explore the partitioning of the essay score variance. The results indicated significant differences in prompt difficulty levels due to genre. Descriptive prompts, as a group, were found to be more difficult than the persuasive prompts. In addition, the essay score variance was partitioned between students and prompts. The amount of the essay score variance that lies between prompts was found to be relatively small (4 to 7 percent). When the essay-level, student-level and prompt-level predictors were included in the model, it was able to explain almost all variance that lies between prompts. Since in most high-stakes writing assessments only 1-2 prompts per student are used, the essay score variance that lies between prompts represents an undesirable or "noise" variation. Identifying factors associated with this "noise" variance may prove to be important for prompt writing and for constructing Automated Essay Scoring mechanisms for weighting prompt difficulty when assigning essay scores.

Relevance: 100.00%

Abstract:

Solar activity indicators, such as sunspot numbers, sunspot area and flares, over the Sun's photosphere are not symmetric between the Sun's northern and southern hemispheres. This behavior is also known as the North-South Asymmetry of the different solar indices. Among the conclusions obtained by several authors, we can note that the N-S asymmetry is a real and systematic phenomenon and is not due to random variability. In the present work, the probability distributions from the Marshall Space Flight Center (MSFC) database are investigated using a statistical tool arising from the well-known Non-Extensive Statistical Mechanics proposed by C. Tsallis in 1988. We present our results and discuss their physical implications with the help of a theoretical model and observations. We find a strong dependence between the nonextensive entropic parameter q and the long-term solar variability present in the sunspot area data. Among the most important results, we highlight that the asymmetry index q reveals the dominance of the North over the South. This behavior has been discussed and confirmed by several authors, but it has never before been attributed to a property of a statistical model. Thus, we conclude that this parameter can be considered an effective measure for diagnosing long-term variations of the solar dynamo. Finally, this dissertation opens a new approach for investigating time series in astrophysics from the perspective of non-extensivity.
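The Tsallis entropy behind the entropic parameter q has a simple closed form, S_q = (1 − Σᵢ pᵢ^q)/(q − 1), which recovers the Shannon entropy as q → 1. A minimal sketch (illustrative only, not the MSFC analysis):

```python
import numpy as np

def tsallis_entropy(p, q):
    # S_q = (1 - sum_i p_i^q) / (q - 1); Shannon entropy in the q -> 1 limit
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # 0 * log 0 and 0^q contribute nothing
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

uniform = np.full(4, 0.25)
s_q = tsallis_entropy(uniform, q=2.0)   # (1 - 4 * 0.25**2) / 1 = 0.75
```

Fitting q to observed probability distributions (e.g. of sunspot areas per hemisphere) is the kind of diagnostic the dissertation builds on.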

Relevance: 100.00%

Abstract:

We present the essential features of the dissipative parametric instability in the universal complex Ginzburg-Landau equation. The dissipative parametric instability is excited through a parametric modulation of frequency-dependent losses in a zig-zag fashion in the spectral domain. Such damping is introduced alternately for spectral components in the +ΔF and −ΔF regions, where F can represent wavenumber or temporal frequency depending on the application. Such a spectral modulation can destabilize the homogeneous stationary solution of the system, leading to the growth of spectral sidebands and to consequent pattern formation: both stable and unstable patterns in one- and two-dimensional systems can be excited. The dissipative parametric instability provides a useful and interesting tool for the control of pattern formation in nonlinear optical systems, with potentially interesting technological applications such as the design of mode-locked lasers emitting pulse trains with tunable repetition rate; it could also find realizations in nanophotonic circuits or in dissipative polaritonic Bose-Einstein condensates.

Relevance: 100.00%

Abstract:

An abstract of a thesis devoted to using helix-coil models to study unfolded states.

Research on polypeptide unfolded states has received much more attention in the last decade or so than it has in the past. Unfolded states are thought to be implicated in various misfolding diseases and likely play crucial roles in protein folding equilibria and folding rates. Structural characterization of unfolded states has proven to be much more difficult than the now well established practice of determining the structures of folded proteins. This is largely because many core assumptions underlying folded structure determination methods are invalid for unfolded states. This has led to a dearth of knowledge concerning the nature of unfolded state conformational distributions. While many aspects of unfolded state structure are not well known, there does exist a significant body of work stretching back half a century that has been focused on structural characterization of marginally stable polypeptide systems. This body of work represents an extensive collection of experimental data and biophysical models associated with describing helix-coil equilibria in polypeptide systems. Much of the work on unfolded states in the last decade has not been devoted specifically to the improvement of our understanding of helix-coil equilibria, which arguably is the most well characterized of the various conformational equilibria that likely contribute to unfolded state conformational distributions. This thesis seeks to provide a deeper investigation of helix-coil equilibria using modern statistical data analysis and biophysical modeling techniques. The studies contained within seek to provide deeper insights and new perspectives on what we presumably know very well about protein unfolded states.

Chapter 1 gives an overview of recent and historical work on studying protein unfolded states. The study of helix-coil equilibria is placed in the context of the general field of unfolded state research, and the basics of helix-coil models are introduced.
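The basic helix-coil machinery introduced in Chapter 1 can be sketched with the classic Zimm-Bragg transfer-matrix formulation (a generic textbook model, not the thesis's own): s is the helix propagation weight, sigma the nucleation penalty, and the mean helicity follows from a derivative of ln Z.

```python
import numpy as np

def zimm_bragg_partition(n, s, sigma):
    # Transfer-matrix partition function for an n-residue chain.
    # Rows index the previous residue's state (helix, coil):
    # helix after helix weighs s, helix after coil weighs sigma*s,
    # coil always weighs 1.
    M = np.array([[s,         1.0],
                  [sigma * s, 1.0]])
    v = np.array([sigma * s, 1.0])   # weights for the first residue
    end = np.array([1.0, 1.0])
    return v @ np.linalg.matrix_power(M, n - 1) @ end

def mean_helicity(n, s, sigma, ds=1e-6):
    # fractional helix content theta = (1/n) * d(ln Z)/d(ln s),
    # here by central finite difference
    z_plus = zimm_bragg_partition(n, s + ds, sigma)
    z_minus = zimm_bragg_partition(n, s - ds, sigma)
    return s * (np.log(z_plus) - np.log(z_minus)) / (2 * ds) / n

theta = mean_helicity(20, 1.5, 0.001)   # a 20-residue chain
```

Models such as Lifson-Roig, and the extended model of Chapter 2, refine this scheme with more states and interaction energies, but the transfer-matrix structure is the same.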

Chapter 2 introduces the newest incarnation of a sophisticated helix-coil model. State-of-the-art statistical techniques are employed to estimate the energies of various physical interactions that influence helix-coil equilibria. A new Bayesian model selection approach is utilized to test many long-standing hypotheses concerning the physical nature of the helix-coil transition. Some assumptions made in previous models are shown to be invalid, and the new model exhibits greatly improved predictive performance relative to its predecessor.

Chapter 3 introduces a new statistical model that can be used to interpret amide exchange measurements. As amide exchange can serve as a probe for residue-specific properties of helix-coil ensembles, the new model provides a novel and robust method to use these types of measurements to characterize helix-coil ensembles experimentally and to test the position-specific predictions of helix-coil models. The statistical model is shown to perform considerably better than the most commonly used method for interpreting amide exchange data. The estimates of the model obtained from amide exchange measurements on an example helical peptide also show a remarkable consistency with the predictions of the helix-coil model.

Chapter 4 involves a study of helix-coil ensembles through the enumeration of helix-coil configurations. Aside from providing new insights into helix-coil ensembles, this chapter also introduces a new method by which helix-coil models can be extended to calculate new types of observables. Future work on this approach could potentially allow helix-coil models to move into use domains that were previously inaccessible and reserved for other types of unfolded state models that were introduced in Chapter 1.