979 results for Density-function


Relevance: 60.00%

Abstract:

It is known that the empirical orthogonal function method is unable to detect possible nonlinear structure in climate data. Here, isometric feature mapping (Isomap), as a tool for nonlinear dimensionality reduction, is applied to 1958–2001 ERA-40 sea-level pressure anomalies to study nonlinearity of the Asian summer monsoon intraseasonal variability. Using the leading two Isomap time series, the probability density function is shown to be bimodal. A two-component bivariate Gaussian mixture model is then applied to identify the monsoon phases, with the two regimes representing enhanced and suppressed phases, respectively. The relationship with the large-scale seasonal mean monsoon indicates that the frequency of monsoon regime occurrence is significantly perturbed, in agreement with conceptual ideas, with a preference for enhanced convection on intraseasonal time scales during large-scale strong monsoons. Trend analysis suggests a shift in the concentration of monsoon convection, with less emphasis on South Asia and more on the East China Sea.
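A minimal sketch of this pipeline, assuming scikit-learn and standing in synthetic data for the SLP anomaly matrix (time × grid points); the neighbourhood size and all other settings are illustrative, not the paper's:

```python
import numpy as np
from sklearn.manifold import Isomap
from sklearn.mixture import GaussianMixture

# Placeholder for daily SLP anomaly maps, shape (days, grid points).
rng = np.random.default_rng(0)
slp_anoms = rng.standard_normal((4000, 500))

# Nonlinear dimensionality reduction to a two-dimensional embedding.
embedding = Isomap(n_neighbors=20, n_components=2).fit_transform(slp_anoms)

# Two-component bivariate Gaussian mixture identifying the two phases.
gmm = GaussianMixture(n_components=2, covariance_type="full").fit(embedding)
phase = gmm.predict(embedding)   # 0/1 regime label for each day
print(gmm.means_)                # regime centres in Isomap space
```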

Relevance: 60.00%

Abstract:

The Asian summer monsoon is a high-dimensional and highly nonlinear phenomenon involving considerable moisture transport towards land from the ocean, and is critical for the whole region. We have used daily ECMWF reanalysis (ERA-40) sea-level pressure (SLP) anomalies relative to the seasonal cycle, over the region 50–145°E, 20°S–35°N, to study the nonlinearity of the Asian monsoon using Isomap. We have focused on the two-dimensional embedding of the SLP anomalies for ease of interpretation. Unlike the unimodality obtained from tests performed in empirical orthogonal function space, the probability density function within the two-dimensional Isomap space turns out to be bimodal. However, a clustering procedure applied to the SLP data reveals support for three clusters, which are identified using a three-component bivariate Gaussian mixture model. The modes appear similar to the active and break phases of the monsoon over South Asia, in addition to a third phase, which shows active conditions over the Western North Pacific. Using the low-level wind field anomalies, the active phase over South Asia is found to be characterised by a strengthening and an eastward extension of the Somali jet, whereas during the break phase the Somali jet weakens near southern India and the monsoon trough in northern India also weakens. Interpretation is aided using the APHRODITE gridded land precipitation product for monsoon Asia. The effect of the large-scale seasonal mean monsoon and lower boundary forcing, in the form of ENSO, is also investigated and discussed. The outcome is that ENSO perturbs the intraseasonal regimes, in agreement with conceptual ideas.
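The support for three clusters can be checked with an information criterion; a sketch, reusing the `embedding` array from the snippet above (BIC as the selection rule is an assumption here, not necessarily the authors' procedure):

```python
from sklearn.mixture import GaussianMixture

# Fit mixtures with increasing component counts and compare BIC;
# a minimum at k = 3 would support the three-regime interpretation.
bic = {k: GaussianMixture(n_components=k, covariance_type="full",
                          random_state=0).fit(embedding).bic(embedding)
       for k in range(1, 7)}
print(min(bic, key=bic.get), bic)
```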

Relevance: 60.00%

Abstract:

Particle filters are fully non-linear data assimilation techniques that aim to represent the probability distribution of the model state given the observations (the posterior) by a number of particles. In high-dimensional geophysical applications, the number of particles required by the sequential importance resampling (SIR) particle filter to capture the high-probability region of the posterior is too large for the filter to be usable. However, particle filters can be formulated using proposal densities, which gives greater freedom in how particles are sampled and allows for a much smaller number of particles. Here a particle filter is presented which uses the proposal density to ensure that all particles end up in the high-probability region of the posterior probability density function. This gives rise to the possibility of non-linear data assimilation in high-dimensional systems. The particle filter formulation is compared to the optimal proposal density particle filter and the implicit particle filter, both of which also utilise a proposal density. We show that when observations are available at every time step, both schemes will be degenerate when the number of independent observations is large, unlike the new scheme. The sensitivity of the new scheme to its parameter values is explored theoretically and demonstrated using the Lorenz (1963) model.
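For orientation, a minimal SIR filter on the Lorenz (1963) model, using the model dynamics as the proposal; the proposal-density schemes discussed in the abstract modify the forecast step shown here. All numerical settings are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def lorenz63_step(x, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz (1963) model."""
    dx = np.empty_like(x)
    dx[..., 0] = sigma * (x[..., 1] - x[..., 0])
    dx[..., 1] = x[..., 0] * (rho - x[..., 2]) - x[..., 1]
    dx[..., 2] = x[..., 0] * x[..., 1] - beta * x[..., 2]
    return x + dt * dx

n_part, obs_sd, q_sd = 500, 1.0, 0.1
truth = np.array([1.0, 1.0, 25.0])
particles = truth + rng.standard_normal((n_part, 3))

for step in range(1, 1001):
    truth = lorenz63_step(truth)
    particles = lorenz63_step(particles) + q_sd * rng.standard_normal((n_part, 3))
    if step % 40 == 0:  # observe the x component every 40 steps
        y = truth[0] + obs_sd * rng.standard_normal()
        logw = -0.5 * ((y - particles[:, 0]) / obs_sd) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # Multinomial resampling: duplicate high-weight particles.
        particles = particles[rng.choice(n_part, size=n_part, p=w)]

print("truth:", truth, "ensemble mean:", particles.mean(axis=0))
```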

Relevance: 60.00%

Abstract:

A class of identification algorithms is introduced for Gaussian process (GP) models. The fundamental approach is to propose a new kernel function which leads to a covariance matrix with low rank, a property that is then exploited for computational efficiency in both model parameter estimation and model prediction. The objectives of maximizing the marginal likelihood and of minimizing the Kullback–Leibler (K–L) divergence between the estimated output probability density function (pdf) and the true pdf are used as the respective cost functions. For each cost function, an efficient coordinate descent algorithm is proposed that estimates the kernel parameters using a one-dimensional derivative-free search and the noise variance using a fast gradient descent algorithm. Numerical examples are included to demonstrate the effectiveness of the new identification approaches.
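The low-rank property is what buys the speed-up: if the kernel gives K = ΦΦᵀ with Φ an n × r feature matrix (r ≪ n), the usual O(n³) solve collapses to O(nr²) via the Woodbury identity. A sketch of that trick alone, with hypothetical basis functions rather than the paper's kernel:

```python
import numpy as np

def lowrank_gp_mean(Phi, y, Phi_star, noise_var):
    """Predictive mean of GP regression with kernel K = Phi @ Phi.T,
    computed with the Woodbury identity: only an r x r system is solved."""
    r = Phi.shape[1]
    A = noise_var * np.eye(r) + Phi.T @ Phi
    alpha = (y - Phi @ np.linalg.solve(A, Phi.T @ y)) / noise_var
    return Phi_star @ (Phi.T @ alpha)

rng = np.random.default_rng(2)
X = np.linspace(0.0, 1.0, 200)[:, None]
centres = np.linspace(0.0, 1.0, 10)[None, :]          # r = 10 basis centres
feat = lambda Z: np.exp(-0.5 * ((Z - centres) / 0.1) ** 2)
y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * rng.standard_normal(200)
print(lowrank_gp_mean(feat(X), y, feat(X[:5]), noise_var=0.01))
```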

Relevance: 60.00%

Abstract:

This contribution proposes a novel probability density function (PDF) estimation-based over-sampling (PDFOS) approach for two-class imbalanced classification problems. The classical Parzen-window kernel function is adopted to estimate the PDF of the positive class. Synthetic instances are then generated from the estimated PDF as additional training data. The essential concept is to re-balance the class distribution of the original imbalanced data set under the principle that the synthetic samples follow the same statistical properties as the original positive class. Based on the over-sampled training data, the radial basis function (RBF) classifier is constructed by applying the orthogonal forward selection procedure, in which the classifier's structure and the parameters of the RBF kernels are determined using a particle swarm optimisation algorithm based on the criterion of minimising the leave-one-out misclassification rate. The effectiveness of the proposed PDFOS approach is demonstrated by an empirical study on several imbalanced data sets.
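A condensed sketch of the core PDFOS step, using scikit-learn's Gaussian KDE as the Parzen-window estimator; the bandwidth (the critical tuning quantity) and the toy data are placeholders:

```python
import numpy as np
from sklearn.neighbors import KernelDensity

def pdf_oversample(X_pos, n_new, bandwidth=0.5, seed=0):
    """Fit a Parzen-window (Gaussian KDE) estimate of the positive-class
    PDF and draw synthetic instances from it."""
    kde = KernelDensity(kernel="gaussian", bandwidth=bandwidth).fit(X_pos)
    return kde.sample(n_samples=n_new, random_state=seed)

rng = np.random.default_rng(3)
X_pos = rng.standard_normal((30, 2)) + [2.0, 2.0]  # small positive class
X_synth = pdf_oversample(X_pos, n_new=170)         # re-balance to ~200
X_train_pos = np.vstack([X_pos, X_synth])          # feed to the RBF classifier
```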

Relevance: 60.00%

Abstract:

The activation of aerosols to form cloud droplets is dependent upon vertical velocities whose local variability is not typically resolved at the GCM grid scale. Consequently, it is necessary to represent the subgrid-scale variability of vertical velocity in the calculation of cloud droplet number concentration. This study uses the UK Chemistry and Aerosols community model (UKCA) within the Hadley Centre Global Environmental Model (HadGEM3), coupled for the first time to an explicit aerosol activation parameterisation, and hence known as UKCA-Activate. We explore the range of uncertainty in estimates of the indirect aerosol effects attributable to the choice of parameterisation of the subgrid-scale variability of vertical velocity in HadGEM-UKCA. Results of simulations demonstrate that the use of a characteristic vertical velocity cannot replicate results derived with a distribution of vertical velocities, and is to be discouraged in GCMs. This study focuses on the effect of the variance (σ_w²) of a Gaussian pdf (probability density function) of vertical velocity. Fixed values of σ_w (spanning the range measured in situ by nine flight campaigns found in the literature) and a configuration in which σ_w depends on turbulent kinetic energy (TKE) are tested. Results from the mid-range fixed-σ_w and TKE-based configurations both compare well with observed vertical velocity distributions and cloud droplet number concentrations. The radiative flux perturbation due to the total effects of anthropogenic aerosol is estimated at −1.9 W m⁻² with σ_w = 0.1 m s⁻¹, −2.1 W m⁻² with σ_w derived from TKE, −2.25 W m⁻² with σ_w = 0.4 m s⁻¹, and −2.3 W m⁻² with σ_w = 0.7 m s⁻¹. The breadth of this range is 0.4 W m⁻², which is comparable to a substantial fraction of the total diversity of current aerosol forcing estimates. Reducing the uncertainty in the parameterisation of σ_w would therefore be an important step towards reducing the uncertainty in estimates of the indirect aerosol effects. Detailed examination of regional radiative flux perturbations reveals that aerosol microphysics can be responsible for some climate-relevant radiative effects, highlighting the importance of including microphysical aerosol processes in GCMs.
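The key point, that a single characteristic w cannot reproduce an average over a w distribution when activation is nonlinear, is easy to see with Gauss-Hermite quadrature over a Gaussian pdf of w. The activation curve below is a made-up placeholder standing in for the actual parameterisation:

```python
import numpy as np

def mean_droplet_number(mu_w, sigma_w, n_quad=16):
    """Average an activation curve N_d(w) over a Gaussian pdf of vertical
    velocity w, via Gauss-Hermite quadrature."""
    x, wts = np.polynomial.hermite.hermgauss(n_quad)
    w = mu_w + np.sqrt(2.0) * sigma_w * x          # quadrature nodes in w
    n_d = 200.0 * np.maximum(w, 0.0) ** 0.4        # hypothetical N_d(w)
    return np.sum(wts * n_d) / np.sqrt(np.pi)

for sw in (0.1, 0.4, 0.7):                         # fixed sigma_w values tested
    print(sw, mean_droplet_number(0.0, sw))
# The single characteristic-w evaluation N_d(mu_w) gives 0 here,
# while every distribution average above is positive.
```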

Relevance: 60.00%

Abstract:

The disadvantage of the majority of data assimilation schemes is the assumption that the conditional probability density function of the state of the system given the observations [the posterior probability density function (PDF)] is distributed either locally or globally as a Gaussian. The advantage, however, is that through various different mechanisms they ensure initial conditions that are predominantly in linear balance, so that spurious gravity wave generation is suppressed. The equivalent-weights particle filter is a data assimilation scheme that allows for a representation of a potentially multimodal posterior PDF. It does this via proposal densities that lead to extra terms being added to the model equations, which means that the advantage of the traditional data assimilation schemes, in generating predominantly balanced initial conditions, is no longer guaranteed. This paper looks in detail at the impact the equivalent-weights particle filter has on dynamical balance and gravity wave generation in a primitive equation model. The primary conclusions are that (i) provided the model error covariance matrix imposes geostrophic balance, each additional term required by the equivalent-weights particle filter is also geostrophically balanced; (ii) the relaxation term required to ensure the particles are in the locality of the observations has little effect on gravity waves and actually induces a reduction in gravity wave energy if sufficiently large; and (iii) the equivalent-weights term, which leads to the particles having equivalent significance in the posterior PDF, produces a change in gravity wave energy comparable to the stochastic model error. Thus, the scheme does not produce significant spurious gravity wave energy and so has potential for application in real high-dimensional geophysical applications.
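Schematically, and hedged as one common form of the relaxation proposal rather than the paper's exact equations, the extra terms enter the model equations as

```latex
x^{n} \;=\; f\!\left(x^{n-1}\right)
\;+\; \tau\, Q H^{T}\!\left(H Q H^{T} + R\right)^{-1}\!\left(y - H f\!\left(x^{n-1}\right)\right)
\;+\; \beta^{n}, \qquad \beta^{n} \sim N(0, Q),
```

so every increment carries the model error covariance Q as a left factor, which is why imposing geostrophic balance on Q suffices for conclusion (i).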

Relevance: 60.00%

Abstract:

In this paper, we formulate a flexible density function from the selection mechanism viewpoint (see, for example, Bayarri and DeGroot (1992) and Arellano-Valle et al. (2006)) which possesses nice biological and physical interpretations. The new density function contains as special cases many models that have been proposed recently in the literature. In constructing this model, we assume that the number of competing causes of the event of interest has a general discrete distribution characterized by its probability generating function. This function has an important role in the selection procedure as well as in computing the conditional personal cure rate. Finally, we illustrate how various models can be deduced as special cases of the proposed model.
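A standard version of this construction, stated for orientation: if N is the number of competing causes with probability generating function A(s), and each latent cause survives past t with probability S(t), then

```latex
S_{\mathrm{pop}}(t) \;=\; \sum_{n=0}^{\infty} P(N = n)\, S(t)^{n} \;=\; A\!\big(S(t)\big),
\qquad
p_{0} \;=\; \lim_{t \to \infty} S_{\mathrm{pop}}(t) \;=\; A(0) \;=\; P(N = 0),
```

which makes explicit how the pgf controls both the population survival and the cure fraction p₀.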

Relevance: 60.00%

Abstract:

Kumaraswamy [A generalized probability density function for double-bounded random processes, J. Hydrol. 46 (1980), pp. 79–88] introduced a distribution for double-bounded random processes with hydrological applications. For the first time, based on this distribution, we describe a new family of generalized distributions (denoted with the prefix `Kw`) to extend the normal, Weibull, gamma, Gumbel and inverse Gaussian distributions, among several other well-known distributions. Some special distributions in the new family, such as the Kw-normal, Kw-Weibull, Kw-gamma, Kw-Gumbel and Kw-inverse Gaussian distributions, are discussed. We express the ordinary moments of any Kw generalized distribution as linear functions of probability weighted moments (PWMs) of the parent distribution. We also obtain the ordinary moments of order statistics as functions of PWMs of the baseline distribution. We use the method of maximum likelihood to fit the distributions in the new class and illustrate the potentiality of the new model with an application to real data.
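For a baseline cdf G with density g, the Kw-G construction has the closed forms (the usual presentation of this family, restated here for reference):

```latex
F(x) \;=\; 1 - \big[1 - G(x)^{a}\big]^{b},
\qquad
f(x) \;=\; a\, b\, g(x)\, G(x)^{a-1}\big[1 - G(x)^{a}\big]^{b-1},
\qquad a, b > 0.
```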

Relevance: 60.00%

Abstract:

In this paper, we propose a new two-parameter lifetime distribution with increasing failure rate, the complementary exponential geometric distribution, which is complementary to the exponential geometric model proposed by Adamidis and Loukas (1998). The new distribution arises from a latent complementary risks scenario, in which the lifetime associated with a particular risk is not observable; rather, we observe only the maximum lifetime value among all risks. The properties of the proposed distribution are discussed, including a formal derivation of its probability density function and explicit algebraic formulas for its reliability and failure rate functions, moments (including the mean and variance), coefficient of variation, and modal value. Parameter estimation is based on the usual maximum likelihood approach. We report the results of a misspecification simulation study performed to assess the extent of misspecification errors when testing the exponential geometric distribution against our complementary one for different sample sizes and censoring percentages. The methodology is illustrated on four real data sets, and a comparison between the two modeling approaches is made.
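Under one common parametrization of this construction (an assumption here: Z geometric on {1, 2, ...} with success probability θ, i.i.d. exponential risks with rate λ, and the observed lifetime the maximum), the cdf has the closed form F(y) = θF_X(y) / (1 − (1 − θ)F_X(y)) with F_X(y) = 1 − e^(−λy), which a quick simulation confirms:

```python
import numpy as np

rng = np.random.default_rng(4)
lam, theta, n = 1.5, 0.3, 50_000

# Latent complementary-risks mechanism: observe the MAXIMUM lifetime
# among a geometric number of exponential risks.
z = rng.geometric(theta, size=n)
y = np.array([rng.exponential(1.0 / lam, size=k).max() for k in z])

t = 2.0
Fx = 1.0 - np.exp(-lam * t)
F_closed = theta * Fx / (1.0 - (1.0 - theta) * Fx)
print(F_closed, (y <= t).mean())   # the two numbers should agree closely
```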

Relevance: 60.00%

Abstract:

In this paper, a new parametric method to deal with discrepant experimental results is developed. The method is based on the fit of a probability density function to the data. This paper also compares the characteristics of different methods used to deduce recommended values and uncertainties from a discrepant set of experimental data. The methods are applied to the published ¹³⁷Cs and ⁹⁰Sr half-lives, and special emphasis is given to the deduced confidence intervals. The obtained results are analyzed considering two fundamental properties expected from an experimental result: the probability content of confidence intervals and the statistical consistency between different recommended values. The recommended values and uncertainties for the ¹³⁷Cs and ⁹⁰Sr half-lives are 10,984 (24) days and 10,523 (70) days, respectively.
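The paper's specific pdf fit is not reproduced here; as a related baseline, the sketch below maximises a Gaussian likelihood in which each quoted uncertainty is inflated by a common unknown variance τ², one standard way to reconcile discrepant measurements (the half-life values are placeholders, not the paper's data):

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical discrepant half-life measurements (days) with quoted
# one-sigma uncertainties; placeholder values only.
val = np.array([10970.0, 11020.0, 10940.0, 11005.0])
sig = np.array([15.0, 10.0, 25.0, 8.0])

def negloglik(p):
    mu, log_tau = p
    var = sig**2 + np.exp(2.0 * log_tau)   # inflate each variance by tau^2
    return 0.5 * np.sum((val - mu) ** 2 / var + np.log(var))

res = minimize(negloglik, x0=[val.mean(), 2.0], method="Nelder-Mead")
mu_hat, tau_hat = res.x[0], np.exp(res.x[1])
print(mu_hat, tau_hat)
```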

Relevance: 60.00%

Abstract:

Impedance spectroscopy and nuclear magnetic resonance (NMR) were used to investigate the mobility of water molecules located in the interlayer space of H⁺-exchanged bentonite clay. The conductivity obtained by ac measurements was 1.25 × 10⁻⁴ S/cm at 298 K. Proton (¹H) lineshapes and spin-lattice relaxation times were measured as a function of temperature over the range 130–320 K. The NMR experiments exhibit the qualitative features associated with proton motion, namely ¹H NMR line narrowing and a well-defined spin-lattice relaxation rate maximum. The temperature dependence of the proton spin-lattice relaxation rates was analyzed with the spectral density function appropriate for proton dynamics in a two-dimensional system. The self-diffusion coefficient estimated from our NMR data, D ≈ 2 × 10⁻⁷ cm²/s at 300 K, is consistent with those reported for exchanged montmorillonite clay hydrates studied by NMR and quasi-elastic neutron scattering (QNS).
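Schematically (the generic homonuclear dipolar form, not the paper's specific two-dimensional J(ω)), the relaxation rate samples the spectral density at the Larmor frequency,

```latex
\frac{1}{T_{1}} \;\propto\; J(\omega_{0}) + 4\, J(2\omega_{0}),
```

so the observed rate maximum occurs where ω₀τ_c ≈ 1, and the functional form adopted for J(ω) is what distinguishes two-dimensional from three-dimensional diffusion in the analysis.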

Relevance: 60.00%

Abstract:

Birnbaum and Saunders (1969a) introduced a probability distribution which is commonly used in reliability studies. For the first time, based on this distribution, the so-called beta-Birnbaum-Saunders distribution is proposed for fatigue life modeling. Various properties of the new model are derived, including expansions for the moments, the moment generating function, mean deviations, and the density function of the order statistics and their moments. We discuss maximum likelihood estimation of the model's parameters. The superiority of the new model is illustrated by means of three real failure data sets.
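The underlying beta-G construction, stated in its standard form: for a baseline cdf G with density g (here Birnbaum-Saunders),

```latex
F(x) \;=\; I_{G(x)}(a, b) \;=\; \frac{1}{B(a, b)} \int_{0}^{G(x)} t^{a-1} (1 - t)^{b-1}\, dt,
\qquad
f(x) \;=\; \frac{g(x)}{B(a, b)}\, G(x)^{a-1} \big[1 - G(x)\big]^{b-1}.
```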

Relevance: 60.00%

Abstract:

In the present work, a new approach for the determination of the partition coefficient at different interfaces, based on density functional theory, is proposed. Our results for log P_ow at the n-octanol/water interface in a large supercell, −0.30 for acetone and 0.95 for methane, are comparable with the experimental values of −0.24 and 0.78, respectively. We believe that the remaining differences are mainly related to the absence of van der Waals interactions and to the limited number of molecules considered in the supercell. The numerical deviations are smaller than those observed for interpolation-based tools. As the proposed model is parameter-free, it is not limited to the n-octanol/water interface.
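Whatever electronic-structure method supplies the free energies, the target quantity is fixed by thermodynamics: with ΔG_wat and ΔG_oct the solvation free energies of the solute in the two phases,

```latex
\log P_{\mathrm{ow}} \;=\; \frac{\Delta G_{\mathrm{wat}} - \Delta G_{\mathrm{oct}}}{RT \ln 10},
```

so a solute bound more strongly by octanol (ΔG_oct more negative) gives log P_ow > 0, as for methane here.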

Relevance: 60.00%

Abstract:

Chambers (1998) explores the interaction between long memory and aggregation. For continuous-time processes, he takes the aliasing effect into account when studying temporal aggregation. For discrete-time processes, however, he seems to fail to do so. This note gives the spectral density function of temporally aggregated long-memory discrete-time processes in light of the aliasing effect. The results differ from those in Chambers (1998) and are supported by a small simulation exercise. As a result, the order of integration d may not be invariant to temporal aggregation, specifically when d is negative and the aggregation is of the stock type.
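The aliasing point can be made concrete for the stock (skip-sampling) case Y_T = X_{mT}: the spectral density of the aggregate folds together all aliased frequencies of the original process,

```latex
f_{Y}(\omega) \;=\; \frac{1}{m} \sum_{k=0}^{m-1} f_{X}\!\left(\frac{\omega + 2\pi k}{m}\right),
\qquad -\pi \le \omega \le \pi,
```

so the behaviour of f_X away from the origin, which the folding mixes in, can alter the apparent memory of the aggregated series.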