956 results for "Bivariate Gaussian distribution"


Relevance:

80.00%

Publisher:

Abstract:

Interpolation techniques for spatial data have been applied frequently in various fields of geosciences. Although most conventional interpolation methods assume that it is sufficient to use first- and second-order statistics to characterize random fields, researchers have now realized that these methods cannot always provide reliable interpolation results, since geological and environmental phenomena tend to be very complex, presenting non-Gaussian distributions and/or non-linear inter-variable relationships. This paper proposes a new approach to the interpolation of spatial data, which can be applied with great flexibility. Suitable cross-variable higher-order spatial statistics are developed to measure the spatial relationship between the random variable at an unsampled location and those in its neighbourhood. Given the computed cross-variable higher-order spatial statistics, the conditional probability density function (CPDF) is approximated via polynomial expansions, which is then utilized to determine the interpolated value at the unsampled location as an expectation. In addition, the uncertainty associated with the interpolation is quantified by constructing prediction intervals of interpolated values. The proposed method is applied to a mineral deposit dataset, and the results demonstrate that it outperforms kriging methods in uncertainty quantification. The introduction of the cross-variable higher-order spatial statistics noticeably improves the quality of the interpolation since it enriches the information that can be extracted from the observed data, and this benefit is substantial when working with data that are sparse or have non-trivial dependence structures.


In an estuary, mixing and dispersion result from a combination of large-scale advection and small-scale turbulence, which are complex to estimate. The predictions of scalar transport and mixing are often inferred and rarely accurate, due to inadequate understanding of the contributions of these different scales to estuarine recirculation. A multi-device field study was conducted in a small sub-tropical estuary under neap tide conditions with near-zero fresh water discharge for about 48 hours. During the study, acoustic Doppler velocimeters (ADV) were sampled at high frequency (50 Hz), while an acoustic Doppler current profiler (ADCP) and global positioning system (GPS)-tracked drifters were used to obtain lower-frequency spatial distributions of the flow parameters within the estuary. The velocity measurements were complemented with continuous measurements of water depth, conductivity, temperature and other physicochemical parameters. Thorough quality control was carried out by implementing relevant error-removal filters on the individual data sets to intercept spurious data. A triple decomposition (TD) technique was introduced to assess the contributions of tides, resonance and 'true' turbulence in the flow field. The time series of mean flow measurements for both the ADCP and drifters were consistent with those of the mean ADV data when sampled within a similar spatial domain. The tidal-scale fluctuations of velocity and water level were used to examine the response of the estuary to tidal inertial currents. The channel exhibited a mixed-type wave with a typical phase lag between 0.035π and 0.116π. A striking feature of the ADV velocity data was the slow fluctuations, which exhibited large amplitudes of up to 50% of the tidal amplitude, particularly in slack waters. Such slow fluctuations were simultaneously observed in a number of physicochemical properties of the channel. The ensuing turbulence field showed some degree of anisotropy.
For all ADV units, the horizontal turbulence ratio ranged between 0.4 and 0.9, and decreased towards the bed, while the vertical turbulence ratio was on average unity at z = 0.32 m and approximately 0.5 for the upper ADV (z = 0.55 m). The results of the statistical analysis suggested that the ebb-phase turbulence field was dominated by eddies that evolved from ejection-type processes, while that of the flood phase contained mixed eddies with a significant fraction related to sweep-type processes. Over 65% of the skewness values fell within the range expected of a finite Gaussian distribution, and the bulk of the excess kurtosis values (over 70%) fell between -0.5 and +2. The TD technique described herein allowed the characterisation of a broader temporal scale of fluctuations of the high-frequency data sampled within the duration of a few tidal cycles. The study provides characterisation of the ranges of fluctuation required for accurate modelling of shallow-water dispersion and mixing in a sub-tropical estuary.
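The triple decomposition above separates each velocity record into a tidal-scale component, a slow fluctuation, and a turbulent residual. A minimal sketch of that idea on synthetic data uses two moving-average filters with different cutoff windows; the 15-minute and 30-second windows and all signal amplitudes here are illustrative assumptions, not values from the study.

```python
import numpy as np
from scipy.ndimage import uniform_filter1d

rng = np.random.default_rng(0)

# Synthetic 50 Hz velocity record: a tidal oscillation, a slower
# fluctuation, and white-noise "turbulence" (amplitudes are illustrative).
fs = 50.0                                    # sampling rate (Hz)
t = np.arange(0.0, 3600.0, 1.0 / fs)         # one hour of data
tide = 0.30 * np.sin(2 * np.pi * t / 3600.0)
slow = 0.05 * np.sin(2 * np.pi * t / 300.0)
turb = 0.02 * rng.standard_normal(t.size)
u = tide + slow + turb

def moving_average(x, window_s, fs):
    """Centred moving average over a window of `window_s` seconds."""
    return uniform_filter1d(x, size=int(window_s * fs))

# Triple decomposition: two cutoffs split tidal, slow, and turbulent
# scales (window lengths are illustrative choices, not the paper's).
u_tidal = moving_average(u, 900.0, fs)           # > 15 min: tidal scale
u_slow = moving_average(u, 30.0, fs) - u_tidal   # 30 s - 15 min: slow fluctuations
u_turb = u - u_tidal - u_slow                    # residual: "true" turbulence

# By construction the three components sum back to the original record.
recon_err = np.max(np.abs(u_tidal + u_slow + u_turb - u))
```

The turbulent residual retains roughly the injected noise level, while the tidal component carries most of the variance, mirroring how the TD technique isolates scale contributions.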


The structure and dynamics of the two-dimensional linear shear flow of inelastic disks at high area fractions are analyzed. The event-driven simulation technique is used in the hard-particle limit, where the particles interact through instantaneous collisions. The structure (relative arrangement of particles) is analyzed using the bond-orientational order parameter. It is found that the shear flow reduces the order in the system, and the order parameter in a shear flow is lower than that in a collection of elastic hard disks at equilibrium. The distribution of relative velocities between colliding particles is analyzed. The relative velocity distribution undergoes a transition from a Gaussian distribution for nearly elastic particles, to an exponential distribution at low coefficients of restitution. However, the single-particle distribution function is close to a Gaussian in the dense limit, indicating that correlations between colliding particles have a strong influence on the relative velocity distribution. This results in a much lower dissipation rate than that predicted using the molecular chaos assumption, where the velocities of colliding particles are considered to be uncorrelated.


The distribution of relative velocities between colliding particles in shear flows of inelastic spheres is analysed in the volume fraction range 0.4-0.64. Particle interactions are considered to be due to instantaneous binary collisions, and the collision model has a normal coefficient of restitution e(n) (negative of the ratio of the post- and pre-collisional relative velocities of the particles along the line joining the centres) and a tangential coefficient of restitution e(t) (negative of the ratio of post- and pre-collisional velocities perpendicular to the line joining the centres). The distribution of pre-collisional normal relative velocities (along the line joining the centres of the particles) is found to be an exponential distribution for particles with low normal coefficient of restitution in the range 0.6-0.7. This is in contrast to the Gaussian distribution for the normal relative velocity in an elastic fluid in the absence of shear. A composite distribution function, which consists of an exponential and a Gaussian component, is proposed to span the range of inelasticities considered here. In the case of rough particles, the relative velocity tangential to the surfaces at contact is also evaluated, and it is found to be close to a Gaussian distribution even for highly inelastic particles. Empirical relations are formulated for the relative velocity distribution. These are used to calculate the collisional contributions to the pressure, shear stress and the energy dissipation rate in a shear flow. The results of the calculation were found to be in quantitative agreement with simulation results, even for low coefficients of restitution for which the predictions obtained using the Enskog approximation are in error by an order of magnitude. The results are also applied to the flow down an inclined plane, to predict the angle of repose and the variation of the volume fraction with angle of inclination.
These results are also found to be in quantitative agreement with previous simulations.
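The composite distribution described above mixes an exponential and a Gaussian component. A minimal sketch of such a mixture density is below; the mixture weight and the two scale parameters are illustrative free parameters, not the fitted values from the paper.

```python
import numpy as np

def composite_pdf(v, weight, sigma_g, sigma_e):
    """Composite PDF for the pre-collisional normal relative velocity:
    a mixture of a Gaussian core and a two-sided exponential tail.
    `weight` is the exponential fraction; both components are normalized,
    so the mixture is too. All parameter values here are illustrative."""
    gauss = np.exp(-v**2 / (2 * sigma_g**2)) / np.sqrt(2 * np.pi * sigma_g**2)
    expo = np.exp(-np.abs(v) / sigma_e) / (2 * sigma_e)
    return weight * expo + (1 - weight) * gauss

# Sanity check: a mixture of two normalized densities integrates to one.
v = np.linspace(-30.0, 30.0, 300001)
dv = v[1] - v[0]
area = composite_pdf(v, 0.4, 1.0, 1.5).sum() * dv
```

Sliding `weight` from 0 toward 1 interpolates from the nearly elastic (Gaussian) limit to the strongly inelastic (exponential) limit, which is how a single functional form can span the range of inelasticities.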


Maize is one of the most important crops in the world. The products generated from this crop are largely used in the starch industry, the animal and human nutrition sector, and biomass energy production and refineries. For these reasons, there is much interest in estimating the potential grain yield of maize genotypes in relation to the environment in which they will be grown, as productivity directly affects agribusiness or farm profitability. Questions like these can be investigated with ecophysiological crop models, which can be organized according to different philosophies and structures. The main objective of this work is to conceptualize a stochastic model for predicting maize grain yield and productivity under different conditions of water supply while considering the uncertainties of daily climate data. Therefore, one focus is to explain the model construction in detail, and the other is to present some results in light of the philosophy adopted. A deterministic model was built as the basis for the stochastic model. The former performed well in terms of the curve shape of the above-ground dry matter over time as well as the grain yield under full and moderate water deficit conditions. Through the use of a triangular distribution for the harvest index and a bivariate normal distribution of the averaged daily solar radiation and air temperature, the stochastic model satisfactorily simulated grain productivity, i.e., it was found that 10,604 kg ha⁻¹ is the most likely grain productivity, very similar to the productivity simulated by the deterministic model and to that observed under real conditions in a field experiment. © 2012 American Society of Agricultural and Biological Engineers.
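The stochastic sampling scheme described above (a triangular distribution for the harvest index and a bivariate normal for season-averaged radiation and temperature) can be sketched as a Monte Carlo simulation. Everything below other than those two distributional choices is an assumption for illustration: the means, covariance, triangular bounds, and the toy yield-response function are not the paper's fitted model.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # Monte Carlo draws

# Bivariate normal for season-averaged daily solar radiation
# (MJ m-2 day-1) and air temperature (deg C); means, variances and
# correlation are illustrative, not the paper's calibrated values.
mean = [22.0, 24.0]
cov = [[4.0, 1.2],
       [1.2, 2.25]]
rad, temp = rng.multivariate_normal(mean, cov, size=n).T

# Triangular distribution for the harvest index (min, mode, max assumed).
hi = rng.triangular(0.40, 0.50, 0.55, size=n)

# Toy yield response: biomass grows with radiation and falls off as
# temperature departs from an optimum; grain yield = biomass x harvest index.
biomass = 900.0 * rad * np.exp(-((temp - 25.0) / 8.0) ** 2)  # kg/ha, illustrative
yield_kg_ha = biomass * hi

# "Most likely" productivity summarized here by the median of the draws.
most_likely = np.median(yield_kg_ha)
```

The output is a full distribution of grain productivity rather than a single number, which is the point of the stochastic formulation: the deterministic model corresponds to a single draw at the distribution means.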


Bacteria play an important role in many ecological systems. The molecular characterization of bacteria using either cultivation-dependent or cultivation-independent methods reveals the large scale of bacterial diversity in natural communities, and the vastness of subpopulations within a species or genus. Understanding how bacterial diversity varies across different environments and also within populations should provide insights into many important questions of bacterial evolution and population dynamics. This thesis presents novel statistical methods for analyzing bacterial diversity using widely employed molecular fingerprinting techniques. The first objective of this thesis was to develop Bayesian clustering models to identify bacterial population structures. Bacterial isolates were identified using multilocus sequence typing (MLST), and Bayesian clustering models were used to explore the evolutionary relationships among isolates. Our method involves the inference of genetic population structures via an unsupervised clustering framework where the dependence between loci is represented using graphical models. The population dynamics that generate such a population stratification were investigated using a stochastic model, in which homologous recombination between subpopulations can be quantified within a gene flow network. The second part of the thesis focuses on cluster analysis of community compositional data produced by two different cultivation-independent analyses: terminal restriction fragment length polymorphism (T-RFLP) analysis, and fatty acid methyl ester (FAME) analysis. The cluster analysis aims to group bacterial communities that are similar in composition, which is an important step for understanding the overall influences of environmental and ecological perturbations on bacterial diversity.
A common feature of T-RFLP and FAME data is zero-inflation, which indicates that the observation of a zero value is much more frequent than would be expected, for example, from a Poisson distribution in the discrete case, or a Gaussian distribution in the continuous case. We provided two strategies for modeling zero-inflation in the clustering framework, which were validated by both synthetic and empirical complex data sets. We show in the thesis that our model that takes into account dependencies between loci in MLST data can produce better clustering results than those methods which assume independent loci. Furthermore, computer algorithms that are efficient in analyzing large-scale data were adopted to meet the increasing computational needs. Our method that detects homologous recombination in subpopulations may provide a theoretical criterion for defining bacterial species. The clustering of bacterial community data, including T-RFLP and FAME, provides an initial effort for discovering the evolutionary dynamics that structure and maintain bacterial diversity in the natural environment.
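The zero-inflation idea above can be made concrete with a minimal likelihood sketch for the discrete case: each observation is a structural zero with probability `pi`, otherwise Poisson-distributed. This is only an illustration of the modeling ingredient, not the thesis's full clustering framework; the toy counts and parameter values are assumptions.

```python
import math

def zip_loglik(counts, pi, lam):
    """Log-likelihood of a zero-inflated Poisson (ZIP) model: with
    probability `pi` an observation is a structural zero, otherwise
    it is Poisson(`lam`)."""
    ll = 0.0
    for y in counts:
        if y == 0:
            # A zero can come from the inflation component or from Poisson.
            ll += math.log(pi + (1 - pi) * math.exp(-lam))
        else:
            ll += math.log(1 - pi) - lam + y * math.log(lam) - math.lgamma(y + 1)
    return ll

# Zero-heavy toy fragment counts (illustrative, not real T-RFLP data).
data = [0, 0, 0, 0, 0, 0, 3, 1, 4, 0, 2, 0]

# A zero-inflated parameterization should beat a plain Poisson (pi = 0)
# on data like this, where zeros are over-represented.
ll_zip = zip_loglik(data, pi=0.4, lam=2.5)
ll_poisson = zip_loglik(data, pi=0.0, lam=sum(data) / len(data))
```

In a clustering framework, a likelihood of this form would replace the plain Poisson (or Gaussian) emission density for each cluster component.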


The unconfined aquifer of the Continental Terminal in Niger was investigated by magnetic resonance sounding (MRS) and by 14 pumping tests in order to improve calibration of MRS outputs at field scale. The reliability of the standard relationship used for estimating aquifer transmissivity by MRS was checked; it was found that the parametric factor can be estimated with an uncertainty ≤ 150% by a single point of calibration. The MRS water content (θMRS) was shown to be positively correlated with the specific yield (Sy), and θMRS always displayed higher values than Sy. A conceptual model was subsequently developed, based on estimated changes of the total porosity, Sy, and the specific retention Sr as a function of the median grain size. The resulting relationship between θMRS and Sy showed a reasonably good fit with the experimental dataset, considering the inherent heterogeneity of the aquifer matrix (residual error ≈ 60%). Interpreted in terms of aquifer parameters, MRS data suggest a log-normal distribution of the permeability and a one-sided Gaussian distribution of Sy. These results demonstrate the efficiency of the MRS method for fast and low-cost prospection of hydraulic parameters for large unconfined aquifers.


State-of-the-art image-set matching techniques typically implicitly model each image-set with a Gaussian distribution. Here, we propose to go beyond these representations and model image-sets as probability distribution functions (PDFs) using kernel density estimators. To compare and match image-sets, we exploit Csiszár f-divergences, which bear strong connections to the geodesic distance defined on the space of PDFs, i.e., the statistical manifold. Furthermore, we introduce valid positive definite kernels on the statistical manifold, which let us make use of more powerful classification schemes to match image-sets. Finally, we introduce a supervised dimensionality reduction technique that learns a latent space where f-divergences reflect the class labels of the data. Our experiments on diverse problems, such as video-based face recognition and dynamic texture classification, evidence the benefits of our approach over the state-of-the-art image-set matching methods.
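The core comparison step above, a Csiszár f-divergence between two kernel density estimates, can be sketched in one dimension with the KL divergence as the chosen f-divergence. The real method operates on high-dimensional image features with positive definite kernels on the statistical manifold; the 1-D Gaussian "image sets" here are purely illustrative.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)

# Two toy "image sets", each summarized as a 1-D feature sample.
set_a = rng.normal(0.0, 1.0, 500)
set_b = rng.normal(1.5, 1.0, 500)

# Model each set as a PDF via kernel density estimation.
kde_a = gaussian_kde(set_a)
kde_b = gaussian_kde(set_b)

def kl_divergence(p, q, grid):
    """Numerical KL(p || q) on a grid -- one member of the Csiszar
    f-divergence family used to compare the two estimated PDFs."""
    pv = p(grid)
    qv = np.maximum(q(grid), 1e-300)   # guard against log(0)
    dx = grid[1] - grid[0]
    return np.sum(pv * np.log(np.maximum(pv, 1e-300) / qv)) * dx

grid = np.linspace(-8.0, 10.0, 4001)
d_ab = kl_divergence(kde_a, kde_b, grid)   # different sets: large
d_aa = kl_divergence(kde_a, kde_a, grid)   # identical sets: zero
```

Matching then reduces to nearest-neighbour (or kernel-based) classification under this divergence.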


In this paper, numerical modelling of fracture in concrete using a two-dimensional lattice model is presented, and a few issues related to the lattice modelling technique as applied to concrete fracture are also reviewed. A comparison is made between acoustic emission (AE) events and the number of fractured elements. To implement the heterogeneity of the plain concrete, two methods are used: generating the grain structure of the concrete using Fuller's distribution, and randomly distributing the concrete material properties following a Gaussian distribution. In the first method, the modelling of the concrete at the meso level is carried out following the existing methods available in the literature. The aggregates present in the concrete are assumed to be perfect spheres, represented as circles in the two-dimensional lattice network. A three-point bend (TPB) specimen is tested in the experiment under crack mouth opening displacement (CMOD) control at a rate of 0.0004 mm/sec, and the fracture process in the same TPB specimen is modelled using a regular triangular 2D lattice network. Load versus CMOD plots obtained by using both methods are compared with experimental results. It was observed that the number of fractured elements increases near and beyond the peak load, that is, once the crack starts to propagate; AE hits also increase rapidly beyond the peak load. It should be mentioned that although the lattice modelling of concrete fracture used in the present study is very similar to those already available in the literature, the present work brings out certain finer details which are not available explicitly in the earlier works.


Magnetron sputtering is a promising technique for the growth of oxide materials including ZnO, which allows deposition of films at low temperatures with good electrical properties. The current-voltage (I-V) characteristics of Au Schottky contacts on magnetron-sputtered ZnO films have been measured over a temperature range of 278-358 K. Both the effective barrier height (phi(B,eff)) and the ideality factor (n) are found to be functions of temperature, and this behavior has been interpreted on the basis of a Gaussian distribution of barrier heights due to barrier-height inhomogeneities that prevail at the interface. The density of states (DOS) near the Fermi level is determined using a model based on space-charge-limited current (SCLC). The dispersion in both the real and imaginary parts of the dielectric constant at low frequencies with increasing temperature is attributed to the space-charge effect. Complex impedance plots exhibited two semicircles, corresponding to the bulk grains and the grain boundaries. (c) 2006 Elsevier B.V. All rights reserved.
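The Gaussian-barrier-height interpretation above rests on a standard result: for a Gaussian distribution of barrier heights with mean φ̄ and standard deviation σ, the apparent barrier extracted from I-V data is φ_eff = φ̄ − σ²/(2kT), so φ_eff rises with temperature. The sketch below evaluates that relation; the mean barrier and σ values are illustrative, not the paper's fitted parameters.

```python
import numpy as np

k_B = 8.617e-5  # Boltzmann constant (eV/K)

def apparent_barrier(T, phi_mean, sigma):
    """Apparent (effective) Schottky barrier height for a Gaussian
    distribution of barrier heights with mean `phi_mean` and standard
    deviation `sigma` (both in eV):
        phi_eff = phi_mean - sigma^2 / (2 k T)
    Parameter values used below are illustrative only."""
    return phi_mean - sigma**2 / (2 * k_B * T)

T = np.array([278.0, 300.0, 358.0])   # the measurement range (K)
phi_eff = apparent_barrier(T, phi_mean=0.80, sigma=0.10)
```

Because low-barrier patches dominate transport at low temperature, φ_eff sits below the mean barrier and increases toward it as T grows, reproducing the temperature dependence the abstract describes.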


Inflation is a period of accelerated expansion in the very early universe, which has the appealing aspect that it can create primordial perturbations via quantum fluctuations. These primordial perturbations have been observed in the cosmic microwave background, and these perturbations also function as the seeds of all large-scale structure in the universe. Curvaton models are simple modifications of the standard inflationary paradigm, where inflation is driven by the energy density of the inflaton, but another field, the curvaton, is responsible for producing the primordial perturbations. The curvaton decays after inflation has ended, at which point the isocurvature perturbations of the curvaton are converted into adiabatic perturbations. Since the curvaton must decay, it must have some interactions. Additionally, realistic curvaton models typically have some self-interactions. In this work we consider self-interacting curvaton models, where the self-interaction is a monomial in the potential, suppressed by the Planck scale, and thus the self-interaction is very weak. Nevertheless, since the self-interaction makes the equations of motion non-linear, it can modify the behaviour of the model very drastically. The most intriguing aspect of this behaviour is that the final properties of the perturbations become highly dependent on the initial values. Departures from a Gaussian distribution are important observables of the primordial perturbations. Due to the non-linearity of the self-interacting curvaton model and its sensitivity to initial conditions, it can produce significant non-Gaussianity of the primordial perturbations. In this work we investigate the non-Gaussianity produced by the self-interacting curvaton, and demonstrate that the non-Gaussianity parameters do not obey the analytically derived approximate relations often cited in the literature. Furthermore we also consider a self-interacting curvaton with a mass in the TeV-scale.
Motivated by realistic particle physics models such as the Minimally Supersymmetric Standard Model, we demonstrate that a curvaton model within the mass range can be responsible for the observed perturbations if it can decay late enough.


We develop an alternate characterization of the statistical distribution of the inter-cell interference power observed in the uplink of CDMA systems. We show that the lognormal distribution better matches the cumulative distribution and complementary cumulative distribution functions of the uplink interference than the conventionally assumed Gaussian distribution and variants based on it. This is in spite of the fact that many users together contribute to uplink interference, with the number of users and their locations both being random. Our observations hold even in the presence of power control and cell selection, which have hitherto been used to justify the Gaussian distribution approximation. The parameters of the lognormal are obtained by matching moments, for which detailed analytical expressions that incorporate wireless propagation, cellular layout, power control, and cell selection parameters are developed. The moment-matched lognormal model, while not perfect, is an order of magnitude better in modeling the interference power distribution.
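The moment-matching step described above has a closed form: for a lognormal with parameters (μ, σ), the mean and variance are E[X] = exp(μ + σ²/2) and Var[X] = (exp(σ²) − 1)·exp(2μ + σ²), and these invert directly. The sketch below performs that inversion; the interference mean and variance fed in are arbitrary illustrative numbers, not values from the paper's propagation/layout analysis.

```python
import math

def lognormal_from_moments(mean, var):
    """Moment-match a lognormal to a given mean and variance.
    If X ~ Lognormal(mu, sigma^2):
        E[X]   = exp(mu + sigma^2 / 2)
        Var[X] = (exp(sigma^2) - 1) * exp(2*mu + sigma^2)
    Inverting these two equations gives (mu, sigma)."""
    sigma2 = math.log(1.0 + var / mean**2)
    mu = math.log(mean) - sigma2 / 2.0
    return mu, math.sqrt(sigma2)

# Example: interference power with mean 2.0 and variance 5.0
# (arbitrary linear-scale units, purely illustrative).
mu, sigma = lognormal_from_moments(2.0, 5.0)

# Round-trip check: the matched lognormal reproduces both moments.
m = math.exp(mu + sigma**2 / 2.0)
v = (math.exp(sigma**2) - 1.0) * math.exp(2.0 * mu + sigma**2)
```

In the paper's setting, the inputs `mean` and `var` would come from the analytical moment expressions that account for propagation, cellular layout, power control, and cell selection.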


One of the most fundamental and widely accepted ideas in finance is that investors are compensated through higher returns for taking on non-diversifiable risk. Hence the quantification, modeling and prediction of risk have been, and still are, one of the most prolific research areas in financial economics. It was recognized early on that there are predictable patterns in the variance of speculative prices. Later research has shown that there may also be systematic variation in the skewness and kurtosis of financial returns. Lacking in the literature so far is an out-of-sample forecast evaluation of the potential benefits of these new, more complicated models with time-varying higher moments. Such an evaluation is the topic of this dissertation. Essay 1 investigates the forecast performance of the GARCH (1,1) model when estimated with 9 different error distributions on Standard and Poor's 500 Index futures returns. By utilizing the theory of realized variance to construct an appropriate ex post measure of variance from intra-day data, it is shown that allowing for a leptokurtic error distribution leads to significant improvements in variance forecasts compared to using the normal distribution. This result holds for daily, weekly and monthly forecast horizons. It is also found that allowing for skewness and time variation in the higher moments of the distribution does not further improve forecasts. In Essay 2, by using 20 years of daily Standard and Poor's 500 index returns, it is found that density forecasts are much improved by allowing for constant excess kurtosis but not improved by allowing for skewness. By allowing the kurtosis and skewness to be time varying, the density forecasts are not further improved but on the contrary made slightly worse. In Essay 3 a new model incorporating conditional variance, skewness and kurtosis based on the Normal Inverse Gaussian (NIG) distribution is proposed.
The new model and two previously used NIG models are evaluated by their Value at Risk (VaR) forecasts on a long series of daily Standard and Poor's 500 returns. The results show that only the new model produces satisfactory VaR forecasts for both 1% and 5% VaR. Taken together, the results of the thesis show that kurtosis appears not to exhibit predictable time variation, whereas some predictability is found in the skewness. However, the dynamic properties of the skewness are not completely captured by any of the models.
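The VaR evaluation step can be sketched with SciPy's NIG implementation: VaR at level α is minus the α-quantile of the return distribution. The shape, location, and scale parameters below are illustrative stand-ins (in the essays these are estimated from the data, with conditional, time-varying moments), and SciPy's (a, b) parameterization is used as-is.

```python
from scipy.stats import norminvgauss

# Illustrative NIG parameters: a controls tail heaviness, b asymmetry
# (b < 0 gives the left skew typical of equity returns). Not fitted values.
a, b = 1.5, -0.3
loc, scale = 0.0005, 0.01   # daily location and scale, illustrative

def nig_var(alpha):
    """Value at Risk at level `alpha`: the loss exceeded with probability
    alpha, i.e. minus the alpha-quantile of the return distribution."""
    return -norminvgauss.ppf(alpha, a, b, loc=loc, scale=scale)

var_1 = nig_var(0.01)   # 1% VaR
var_5 = nig_var(0.05)   # 5% VaR
```

A backtest would then count how often realized losses exceed `var_1` and `var_5` and compare those exceedance rates with the nominal 1% and 5% levels.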


Recent single molecule experiments have suggested the existence of a photochemical funnel in the photophysics of conjugated polymers, like poly[2-methoxy-5-(2'-ethylhexyl)oxy-1,4-phenylenevinylene] (MEH-PPV). The funnel is believed to be a consequence of the presence of conformational or chemical defects along the polymer chain and efficient non-radiative energy transfer among different chromophore segments. Here we address the effect of the excitation energy dynamics on the photophysics of PPV. The PPV chain is modeled as a polymer with the length distribution of chromophores given either by a Gaussian or by a Poisson distribution. We observe that the Poisson distribution of the segment lengths explains the photophysics of PPV better than the Gaussian distribution. A recently proposed version of an extended 'particle-in-a-box' model is used to calculate the exciton energies and the transition dipole moments of the chromophores, and a master equation to describe the excitation energy transfer among different chromophores. The rate of energy transfer is assumed to be given here, as a first approximation, by the well-known Förster expression. The observed excitation population dynamics confirms the photochemical funneling of excitation energy from shorter to longer chromophores of the polymer chain. The time scale of spectral shift and energy transfer for our model polymer, with realistic values of optical parameters, is in the range of 200-300 ps. We find that the excitation energy may not always migrate towards the longest chromophore segments in the polymer chain as the efficiency of energy transfer between chromophores depends on the separation distance between the two and their relative orientation.
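The distance and orientation dependence in the last sentence is carried by the Förster rate, k ∝ (1/τ)(R₀/r)⁶ with an orientation factor κ² (κ² = 2/3 for isotropically averaged dipoles, 0 for perpendicular ones). The sketch below evaluates that expression in arbitrary units; the Förster radius, lifetime, and distances are illustrative assumptions, not the paper's optical parameters.

```python
def forster_rate(r, kappa2, R0=3.0, tau=1.0):
    """Forster energy-transfer rate between two chromophores:
        k = (1/tau) * (kappa2 / (2/3)) * (R0 / r)**6
    where r is the donor-acceptor separation, kappa2 the orientation
    factor (2/3 for isotropic averaging, 0 for perpendicular dipoles),
    R0 the Forster radius and tau the donor lifetime. Arbitrary units;
    all parameter values here are illustrative."""
    return (1.0 / tau) * (kappa2 / (2.0 / 3.0)) * (R0 / r) ** 6

# The rate falls as r^-6 and vanishes for perpendicular dipoles, which
# is why excitation need not always reach the longest segment.
k_near = forster_rate(2.0, 2.0 / 3.0)   # close pair, isotropic kappa^2
k_far = forster_rate(4.0, 2.0 / 3.0)    # doubled separation: 64x slower
k_perp = forster_rate(2.0, 0.0)         # perpendicular dipoles: no transfer
```

In the master-equation picture, rates of this form populate the transfer matrix between all chromophore pairs, and the exciton population then relaxes toward the lower-energy (longer) segments it can actually reach.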


Anthesis was studied at the canopy level in 10 Norway spruce stands from 9 localities in Finland from 1963 to 1974. Distributions of pollen catches were compared to the normal Gaussian distribution. The basis for the timing studies was the 50 per cent point of the anthesis-fitted normal distribution. Development up to this point was given in calendar days, in degree days (>5 °C) and in period units. The count of each parameter began on March 19 (inclusive). Male flowering in Norway spruce stands was found to have more annual variation in quantity than in the Scots pine stands studied earlier. Anthesis in spruce in northern Finland occurred at a later date than in the south. The heat sums needed for anthesis varied latitudinally less in spruce than in pine. The variation of pollen catches in spruce increased towards the north-west, as in the case of Scots pine. In the unprocessed data, calendar days were found to be the most accurate forecast of anthesis in Norway spruce, both for a single year and for the majority of cases of stand averages over several years. Locally, the period unit could be a more accurate parameter for the stand average. However, on a calendar-day basis, when annual deviations between expected and measured heat sums were converted to days, period units were narrowly superior to days. The geographical correlations with respect to the timing of flowering, calculated against distances measured along simulated post-glacial migration routes, were stronger than purely latitudinal correlations. Effects of the reinvasion of Norway spruce into Finland are thus still visible in spruce populations, just as they were in Scots pine populations. The proportion of the average annual heat sum needed for spruce anthesis grew rapidly north of a latitude of ca. 63°, and the heat sum needed for anthesis decreased only slightly towards the timberline.
In light of flowering phenology, it seems probable that the northwesterly third of Finnish Norway spruce populations are incompletely adapted to the prevailing cold climate. A moderate warming of the climate would therefore be beneficial for Norway spruce. This accords roughly with the adaptive situation in Scots pine.
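The timing statistic used above, the 50 per cent point of a normal distribution fitted to the pollen catches, can be sketched with a method-of-moments fit: treat daily catches as frequency weights on the day axis, and the 50% point of the fitted normal is its mean. The daily catch numbers below are toy values for illustration, not the study's stand-level data.

```python
import numpy as np

# Toy daily pollen catches over a flowering period (grains/day,
# illustrative numbers only).
days = np.arange(1, 11, dtype=float)   # day of the flowering period
catch = np.array([1, 3, 8, 18, 30, 28, 16, 7, 3, 1], dtype=float)

# Method-of-moments normal fit: daily catches act as frequency weights.
w = catch / catch.sum()
mu = np.sum(w * days)                          # mean flowering day
sigma = np.sqrt(np.sum(w * (days - mu) ** 2))  # spread of anthesis

# For a normal distribution the 50% point coincides with the mean,
# so this is the timing statistic the study tracks per stand and year.
halfway_day = mu
```

Accumulating calendar days, degree days, or period units up to `halfway_day` then gives the three competing forecast parameters compared in the study.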