999 results for distributions of nucleon density


Relevance: 100.00%

Publisher:

Abstract:

We developed a coarse-grained yet microscopically detailed model to study the statistical fluctuations of single-molecule protein conformational dynamics of adenylate kinase. We explored the underlying conformational energy landscape and found that the system has two basins of attraction, the open and closed conformations, connected by two separate pathways. The kinetics is found to be nonexponential, consistent with single-molecule conformational dynamics experiments. Furthermore, we found that the statistical distribution of the kinetic times for the conformational transition has a long power-law tail, reflecting the exponential density of states of the underlying landscape. We also studied the joint distribution of the two pathways and found memory effects.
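
As a hedged illustration of the last point, the sketch below (not the authors' model; all rates and units are assumed) shows how an exponential distribution of barrier heights combined with Arrhenius kinetics produces a power-law tail in the distribution of transition times, P(tau) ~ tau^-(1 + kT/E0).

```python
# Monte Carlo sketch: exponential density of states -> power-law waiting times.
import numpy as np

rng = np.random.default_rng(0)

kT = 1.0          # thermal energy (arbitrary units, assumed)
E0 = 2.0          # mean of the exponential barrier distribution (assumed)
tau0 = 1.0        # attempt time (assumed)

# Draw barrier heights from an exponential density of states.
barriers = rng.exponential(scale=E0, size=200_000)

# Arrhenius kinetics: waiting time grows exponentially with the barrier.
tau = tau0 * np.exp(barriers / kT)

# Estimate the tail exponent from the survival function on a log-log scale.
taus = np.sort(tau)
survival = 1.0 - np.arange(1, taus.size + 1) / taus.size
mask = (taus > 10) & (survival > 1e-4)
slope, _ = np.polyfit(np.log(taus[mask]), np.log(survival[mask]), 1)

print(f"empirical tail exponent of P(tau > t): {slope:.2f}")
print(f"theoretical value -kT/E0 = {-kT / E0:.2f}")
```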

Relevance: 100.00%

Publisher:

Abstract:

Techniques are proposed for evaluating forecast probabilities of events. The tools are especially useful when, as in the case of the Survey of Professional Forecasters (SPF) expected probability distributions of inflation, recourse cannot be made to the method of construction in the evaluation of the forecasts. The tests of efficiency and conditional efficiency are applied to the forecast probabilities of events of interest derived from the SPF distributions, and supplement a whole-density evaluation of the SPF distributions based on the probability integral transform approach.
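
A minimal sketch of the probability integral transform (PIT) step mentioned above, on synthetic data (the forecast means, standard deviations and outcomes are illustrative assumptions, not SPF data): if the forecast densities are correct, the PIT values should be i.i.d. uniform on [0, 1].

```python
# Whole-density evaluation via the probability integral transform (PIT).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical example: forecasts are N(mu_t, sigma_t) densities for inflation,
# outcomes y_t are drawn from a (deliberately different) true distribution.
T = 200
mu = rng.normal(2.0, 0.5, size=T)        # forecast means (assumed)
sigma = np.full(T, 1.0)                   # forecast standard deviations (assumed)
y = rng.normal(mu, 1.3)                   # realized outcomes, overdispersed

# PIT: evaluate each forecast CDF at the realized outcome.
z = stats.norm.cdf(y, loc=mu, scale=sigma)

# Test uniformity of the PIT values (Kolmogorov-Smirnov against U(0, 1)).
ks_stat, p_value = stats.kstest(z, "uniform")
print(f"KS statistic = {ks_stat:.3f}, p-value = {p_value:.3f}")
```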

Relevance: 100.00%

Publisher:

Abstract:

The eigenvalue densities of two random matrix ensembles, the Wigner Gaussian matrices and the Wishart covariance matrices, are decomposed into the contributions of the individual eigenvalue distributions. It is shown that the fluctuations of all eigenvalues, for medium matrix sizes, are described with good precision by nearly normal distributions.
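
A rough numerical sketch of the idea for the Wigner (Gaussian) case, with an assumed matrix size and normalisation; it samples matrices, collects each ordered eigenvalue and checks how close its fluctuations are to a normal distribution:

```python
# Decompose the Wigner eigenvalue density into ordered-eigenvalue distributions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

N = 20            # medium matrix size (assumed)
samples = 2000    # number of sampled matrices (assumed)

eigs = np.empty((samples, N))
for s in range(samples):
    A = rng.normal(size=(N, N))
    H = (A + A.T) / np.sqrt(2 * N)     # GOE-like normalisation (assumed)
    eigs[s] = np.sort(np.linalg.eigvalsh(H))

# For each selected ordered eigenvalue, test normality of its fluctuations.
for k in (0, N // 2, N - 1):
    stat, p = stats.shapiro(eigs[:, k])
    print(f"eigenvalue #{k + 1}: mean = {eigs[:, k].mean():+.3f}, "
          f"Shapiro-Wilk p = {p:.3f}")
```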

Relevance: 100.00%

Publisher:

Abstract:

The reconstruction of Extensive Air Showers (EAS) observed by particle detectors at ground level is based on the characteristics of observables like the lateral particle density and the arrival times. The lateral densities, inferred for different EAS components from detector data, are usually parameterised by applying various lateral distribution functions (LDFs). The LDFs are used in turn for evaluating quantities like the total number of particles or the density at particular radial distances. Typical expressions for LDFs assume azimuthal symmetry of the density around the shower axis. The deviations of the lateral particle density from this assumption, arising for various reasons, are smoothed out in the case of compact arrays like KASCADE, but not in the case of arrays like Grande, which sample only a smaller part of the azimuthal variation. KASCADE-Grande, an extension of the former KASCADE experiment, is a multi-component EAS experiment located at the Karlsruhe Institute of Technology (Campus North), Germany. The lateral distributions of charged particles are deduced from the basic information provided by the Grande scintillators (the energy deposits), first in the observation plane and then in the intrinsic shower plane. Azimuthal dependences should be taken into account in all steps. As the energy deposit in the scintillators depends on the angles of incidence of the particles, azimuthal dependences are already involved in the first step: the conversion from energy deposits to charged particle density. This is done by using the Lateral Energy Correction Function (LECF), which evaluates the mean energy deposited by a charged particle, taking into account the contribution of other particles (e.g. photons) to the energy deposit. By using a very fast procedure for evaluating the energy deposited by various particles, we prepared realistic LECFs depending on the angle of incidence of the shower and on the radial and azimuthal coordinates of the detector location. Mapping the lateral density from the observation plane onto the intrinsic shower plane does not remove the azimuthal dependences arising from geometric and attenuation effects, in particular for inclined showers. Realistic procedures for applying correction factors are developed. Specific examples are given of the bias introduced by neglecting the azimuthal asymmetries in the conversion from the energy deposit in the Grande detectors to the lateral density of charged particles in the intrinsic shower plane. (C) 2011 Elsevier B.V. All rights reserved.
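
For context, a typical azimuthally symmetric LDF of the kind the text refers to is the NKG function; the sketch below evaluates it with illustrative parameters (the shower age, Moliere radius and total particle number are assumptions, not KASCADE-Grande's actual parameterisation).

```python
# Azimuthally symmetric NKG-type lateral distribution function.
import numpy as np
from scipy.special import gamma

def nkg_density(r, n_ch, s=1.4, r_m=89.0):
    """Charged-particle density [m^-2] at core distance r [m].

    n_ch : total number of charged particles
    s    : shower age parameter (assumed)
    r_m  : Moliere radius in m (assumed)
    """
    norm = gamma(4.5 - s) / (2 * np.pi * r_m**2 * gamma(s) * gamma(4.5 - 2 * s))
    x = r / r_m
    return n_ch * norm * x**(s - 2.0) * (1.0 + x)**(s - 4.5)

# Density at a few radial distances for a shower of 1e7 charged particles.
for r in (40.0, 100.0, 300.0, 600.0):
    print(f"r = {r:5.0f} m : rho = {nkg_density(r, 1e7):.3e} m^-2")
```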

Relevance: 100.00%

Publisher:

Abstract:

In this paper we study the possible microscopic origin of heavy-tailed probability distributions for the price variation of financial instruments. We extend the standard log-normal process to include another random component, in the so-called stochastic volatility models. We study these models under an assumption, akin to the Born-Oppenheimer approximation, in which the volatility has already relaxed to its equilibrium distribution and acts as a background to the evolution of the price process. In this approximation, we show that all stochastic volatility models should exhibit a scaling relation in the time lag of zero-drift modified log-returns. We verify that the Dow Jones Industrial Average index indeed follows this scaling. We then focus on two popular stochastic volatility models, the Heston and Hull-White models. In particular, we show that in the Hull-White model the resulting probability distribution of log-returns in this approximation corresponds to the Tsallis (Student's t) distribution. The Tsallis parameters are given in terms of the microscopic stochastic volatility model. Finally, we show that the log-returns for 30 years of Dow Jones index data are well fitted by a Tsallis distribution, and we obtain the relevant parameters. (c) 2007 Elsevier B.V. All rights reserved.
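
A minimal sketch of the final fitting step on synthetic data (a real analysis would substitute actual index log-returns): it fits a Student-t (Tsallis-type) distribution and a Gaussian by maximum likelihood and compares their log-likelihoods, with heavy tails showing up as a small number of degrees of freedom.

```python
# Fit a Student-t (Tsallis-type) distribution to daily log-returns.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Stand-in for index log-returns: synthetic heavy-tailed data; replace with
# e.g. np.diff(np.log(prices)) for a real price series.
returns = stats.t.rvs(df=3.5, scale=0.01, size=5000, random_state=rng)

# Fit Student-t (location-scale) and Gaussian by maximum likelihood.
df_hat, loc_t, scale_t = stats.t.fit(returns)
loc_n, scale_n = stats.norm.fit(returns)

ll_t = stats.t.logpdf(returns, df_hat, loc_t, scale_t).sum()
ll_n = stats.norm.logpdf(returns, loc_n, scale_n).sum()

print(f"Student-t fit: df = {df_hat:.2f}, scale = {scale_t:.4f}")
print(f"log-likelihood: t = {ll_t:.1f}  vs  normal = {ll_n:.1f}")
```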

Relevance: 100.00%

Publisher:

Abstract:

In Central Amazonia, significantly more individuals and biomass of flying insects were present at the forest edge than in the understory throughout the year, as monitored by flight interception traps. Numbers and biomass of flying insects increased at higher rates at the edge with rainfall, an increase associated with termite swarming behavior and increased homopteran density. The most abundant insects were Diptera, Coleoptera, Hymenoptera and Isoptera, whose ranked abundances varied with respect to forest edge and understory, as well as with season.

Relevance: 100.00%

Publisher:

Abstract:

In Bayesian inference it is often desirable to have a posterior density that reflects mainly the information from the sample data. To achieve this, it is important to employ prior densities which add little information to that provided by the sample. The literature offers many such prior densities, for example Jeffreys (1967), Lindley (1956, 1961), Hartigan (1964), Bernardo (1979), Zellner (1984), and Tibshirani (1989). In the present article, we compare the posterior densities of the reliability function R(t) of a Weibull distribution obtained by using the Jeffreys, maximal data information (Zellner, 1984), Tibshirani, and reference priors.
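
A simplified grid-based sketch of one such posterior, assuming a known Weibull shape parameter and a Jeffreys-type prior proportional to 1/lambda (the data and mission time below are synthetic placeholders, not the paper's comparison):

```python
# Posterior of the Weibull reliability R(t) under a Jeffreys-type prior.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

k = 1.8                                   # known shape parameter (assumed)
data = stats.weibull_min.rvs(k, scale=10.0, size=25, random_state=rng)

# Grid over the scale parameter lambda.
lam = np.linspace(1.0, 40.0, 4000)
log_prior = -np.log(lam)                              # Jeffreys-type 1/lambda
log_like = np.array([stats.weibull_min.logpdf(data, k, scale=l).sum()
                     for l in lam])

# Normalised posterior weights on the grid.
w = np.exp(log_prior + log_like - (log_prior + log_like).max())
w /= w.sum()

# Push the posterior over lambda through R(t) = exp(-(t/lambda)^k).
t = 8.0
R = np.exp(-(t / lam) ** k)
post_mean_R = np.sum(R * w)
print(f"posterior mean of R({t}) = {post_mean_R:.3f}")
```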

Relevance: 100.00%

Publisher:

Abstract:

Nuclear morphometry (NM) uses image analysis to measure features of the cell nucleus which are classified as bulk properties, shape or form, and DNA distribution. Studies have used these measurements as diagnostic and prognostic indicators of disease, with inconclusive results. The distributional properties of these variables have not been systematically investigated, although much medical data exhibit nonnormal distributions. Measurements are made on several hundred cells per patient, so summary measures reflecting the underlying distribution are needed.

Distributional characteristics of 34 NM variables from prostate cancer cells were investigated using graphical and analytical techniques. Cells per sample ranged from 52 to 458. A small sample of patients with benign prostatic hyperplasia (BPH), representing non-cancer cells, was used for general comparison with the cancer cells.

Data transformations such as log, square root and 1/x did not yield normality as measured by the Shapiro-Wilk test. A modulus transformation, used for distributions having abnormal kurtosis values, also did not produce normality.

Kernel density histograms of the 34 variables exhibited non-normality, and 18 variables also exhibited bimodality. A bimodality coefficient was calculated, and three variables (DNA concentration, shape and elongation) showed the strongest evidence of bimodality and were studied further.

Two analytical approaches were used to obtain a summary measure for each variable for each patient: cluster analysis to determine significant clusters, and a mixture model analysis using a two-component Gaussian model with equal variances. The mixture component parameters were used to bootstrap the log-likelihood ratio to determine the significant number of components (1 or 2). These summary measures were used as predictors of disease severity in several proportional odds logistic regression models. The disease severity scale had 5 levels and was constructed from 3 components: extracapsular penetration (ECP), lymph node involvement (LN+) and seminal vesicle involvement (SV+), which represent surrogate measures of prognosis. The summary measures were not strong predictors of disease severity. There was some indication from the mixture model results that there were changes in the mean levels and proportions of the components at the lower severity levels.
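
A compact sketch of the two summary tools described above, the bimodality coefficient and the two-component equal-variance Gaussian mixture, applied to synthetic data (the bootstrap of the log-likelihood ratio is omitted for brevity):

```python
# Bimodality coefficient and two-component equal-variance Gaussian mixture.
import numpy as np
from scipy import stats
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)

# Stand-in for one nuclear-morphometry variable measured on ~300 cells.
x = np.concatenate([rng.normal(1.0, 0.2, 180), rng.normal(2.0, 0.2, 120)])

# Bimodality coefficient: (skew^2 + 1) / (kurtosis + 3(n-1)^2 / ((n-2)(n-3))),
# using excess kurtosis; values above ~0.555 suggest bimodality.
n = x.size
g = stats.skew(x, bias=False)
kurt = stats.kurtosis(x, bias=False)              # excess kurtosis
bc = (g**2 + 1) / (kurt + 3 * (n - 1)**2 / ((n - 2) * (n - 3)))
print(f"bimodality coefficient = {bc:.3f}")

# Two-component mixture with a shared (tied) variance vs. a single Gaussian.
X = x.reshape(-1, 1)
gm1 = GaussianMixture(1, covariance_type="tied", random_state=0).fit(X)
gm2 = GaussianMixture(2, covariance_type="tied", random_state=0).fit(X)
llr = 2 * (gm2.score(X) - gm1.score(X)) * n       # log-likelihood ratio
print(f"2-vs-1 component log-likelihood ratio = {llr:.1f}")
print(f"component means = {gm2.means_.ravel()}, weights = {gm2.weights_}")
```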

Relevance: 100.00%

Publisher:

Abstract:

This paper develops a model of a spatial economy in which interregional trade patterns and the structure of the transport network are determined endogenously as a result of the interaction between industrial location behavior and increasing returns in transportation, in particular, economies of transport density. The traditional models assume either the structure of the transport network or industrial location patterns, and hence, they are unable to explain the interdependence of the two. It is shown that economies of transport density can be the primary source of industrial localization.

Relevance: 100.00%

Publisher:

Abstract:

In this work, mixtures of vacuum gas oil and low-density polyethylene, a major component of common industrial and consumer household plastics, were pyrolytically co-processed in a fluid catalytic cracking (FCC) riser reactor as a viable route for the energy and petrochemical revalorisation of plastic wastes into valuable petrochemical feedstocks and fuel within an existing industrial technology. Using equilibrium FCC catalyst, the oil–polymer blends were catalytically cracked at temperatures between 773 K and 973 K and catalyst-to-feed ratios of 5:1, 7:1 and 10:1. The influence of each of these processing parameters on the cracking gas and liquid yield patterns was studied and presented. The compositional distributions of the liquid and gaseous products obtained were also analysed and presented. The results revealed that, with very little modification to the existing process superstructure, the yields and compositional distributions of products from the fluid catalytic cracking of the oil–polymer blends were in many cases very similar to those of the oil feedstock processed alone, demonstrating the viability of co-processing the feedstocks without significant detriment to FCC product yields and quality.

Relevance: 100.00%

Publisher:

Abstract:

The thermal degradation of high-density polyethylene has been modelled by the random breakage of polymer bonds, using a set of population balance equations. A model was proposed in which the population balances were lumped into representative sizes, so that the experimentally determined molecular weight distribution of the original polymer could be used as the initial condition. This model was then compared with two cases of the unlumped population balance, which assumed unimolecular initial distributions of 100 and 500 monomer units, respectively. The model that utilised the experimentally determined molecular weight distribution was found to best describe the experimental data. The model fits suggested a second mechanism, in addition to random breakage, at slow reaction rates. (c) 2005 Elsevier Ltd. All rights reserved.
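
A minimal sketch of a random-breakage population balance of this kind, for a single (unimolecular) initial chain length; the rate constant and chain lengths are illustrative assumptions, not the paper's lumped model, and an experimental molecular weight distribution could be substituted for the initial condition.

```python
# Random-scission population balance:
#   dN_i/dt = -k (i-1) N_i + 2 k * sum_{j>i} N_j
# A chain of i monomers has i-1 breakable bonds; every longer chain j
# contributes two possible fragments of length i.
import numpy as np
from scipy.integrate import solve_ivp

k = 1e-3          # scission rate constant per bond (assumed, 1/s)
i_max = 100       # longest chain considered, in monomer units (assumed)

N0 = np.zeros(i_max)
N0[-1] = 1.0      # all chains initially of length i_max (unimolecular case)

def rhs(t, N):
    i = np.arange(1, i_max + 1)
    # tail[m] = sum of N_j over chain lengths j > m+1, computed top-down
    tail = np.concatenate([np.cumsum(N[::-1])[::-1][1:], [0.0]])
    return -k * (i - 1) * N + 2 * k * tail

sol = solve_ivp(rhs, (0.0, 2000.0), N0, t_eval=[0.0, 500.0, 2000.0])

i = np.arange(1, i_max + 1)
for t, N in zip(sol.t, sol.y.T):
    print(f"t = {t:6.0f} s : number-average length = {np.sum(i * N) / N.sum():.1f}")
```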

Relevance: 100.00%

Publisher:

Abstract:

We introduce a novel inversion-based neuro-controller for solving control problems involving uncertain nonlinear systems which can also compensate for multi-valued systems. The approach uses recent developments in neural networks, especially in the context of modelling statistical distributions, which are applied to forward and inverse plant models. Provided that certain conditions are met, an estimate of the intrinsic uncertainty in the outputs of a neural network can be obtained using the statistical properties of the network. More generally, multicomponent distributions can be modelled by the mixture density network. In this work a novel robust inverse control approach is obtained, based on importance sampling from these distributions. This importance sampling provides a structured and principled approach to constraining the complexity of the search space for the ideal control law. The performance of the new algorithm is illustrated through simulations with example systems.
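
A toy sketch of the importance-sampling idea, using a hand-specified Gaussian mixture to stand in for the mixture density network's inverse model and a simple multi-valued plant (all functions and parameters here are hypothetical, not the paper's controller):

```python
# Importance sampling of candidate controls from a mixture proposal.
import numpy as np

rng = np.random.default_rng(6)

def forward_model(u):
    """Hypothetical multi-valued plant: two controls map to the same output."""
    return np.sin(u)

target = 0.5                                  # desired plant output

# Stand-in for the MDN's conditional mixture over controls given the target.
weights = np.array([0.5, 0.5])
means = np.array([0.6, 2.6])                  # two candidate inverse branches
stds = np.array([0.3, 0.3])

# Draw candidate controls from the mixture proposal.
n = 5000
comp = rng.choice(2, size=n, p=weights)
u = rng.normal(means[comp], stds[comp])

# Weight each candidate by how well the forward model hits the target.
err = forward_model(u) - target
w = np.exp(-0.5 * (err / 0.05) ** 2)
w /= w.sum()

u_best = u[np.argmax(w)]                      # single best candidate control
print(f"selected control u = {u_best:.3f}, "
      f"forward_model(u) = {forward_model(u_best):.3f}")
```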