950 results for Gaussian probability function


Relevance:

40.00%

Publisher:

Abstract:

Atmospheric effects can significantly degrade the reliability of free-space optical communications. One such effect is scintillation, which is caused by atmospheric turbulence and refers to random fluctuations in the irradiance and phase of the received laser beam. In this paper we investigate the use of multiple lasers and multiple apertures to mitigate scintillation. Since the scintillation process is slow, we adopt a block-fading channel model and study the outage probability under the assumptions of orthogonal pulse-position modulation and non-ideal photodetection. Assuming perfect receiver channel state information (CSI), we derive the signal-to-noise ratio (SNR) exponents for the cases when the scintillation is lognormal, exponential, and gamma-gamma distributed, which cover a wide range of atmospheric turbulence conditions. Furthermore, when CSI is also available at the transmitter, we illustrate that very large gains in SNR are possible (in some cases larger than 15 dB) by adapting the transmitted power. Under a long-term power constraint, we outline fundamental design criteria via a simple expression that relates the required number of lasers and apertures, for a given code rate and number of codeword blocks, to completely remove system outages. Copyright © 2009 IEEE.
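
As a rough illustration of the block-fading outage quantity studied in this abstract (not the authors' analytical derivation, which also accounts for pulse-position modulation and non-ideal photodetection), the following Python sketch estimates the outage probability of a multi-aperture link under lognormal scintillation by Monte Carlo; the scintillation variance, SNR, and threshold values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def outage_probability(n_apertures, snr_db, threshold_db,
                       sigma_x=0.3, trials=100_000):
    """Monte Carlo outage estimate for a block-fading FSO link with
    lognormal scintillation and simple aperture averaging.
    All numerical values are illustrative assumptions."""
    snr = 10 ** (snr_db / 10)
    threshold = 10 ** (threshold_db / 10)
    # Lognormal irradiance per aperture, normalised so that E[I] = 1.
    log_i = rng.normal(-sigma_x ** 2 / 2, sigma_x,
                       size=(trials, n_apertures))
    irradiance = np.exp(log_i)
    combined = irradiance.mean(axis=1)   # average over receive apertures
    return np.mean(combined * snr < threshold)

for n in (1, 2, 4):
    print(n, "apertures:", outage_probability(n, snr_db=10, threshold_db=5))
```

Increasing the number of apertures averages out the irradiance fluctuations and drives the estimated outage probability down, which is the diversity effect the paper quantifies through SNR exponents.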

Relevance:

40.00%

Publisher:

Abstract:

We propose a principled algorithm for robust Bayesian filtering and smoothing in nonlinear stochastic dynamic systems when both the transition function and the measurement function are described by non-parametric Gaussian process (GP) models. GPs are gaining increasing importance in signal processing, machine learning, robotics, and control for representing unknown system functions by posterior probability distributions. This modern way of system identification is more robust than finding point estimates of a parametric function representation. Our principled filtering/smoothing approach for GP dynamic systems is based on analytic moment matching in the context of the forward-backward algorithm. Our numerical evaluations demonstrate the robustness of the proposed approach in situations where other state-of-the-art Gaussian filters and smoothers can fail. © 2011 IEEE.
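
The moment matching at the core of this approach can be illustrated with a toy sketch: pushing a Gaussian state through a GP transition model and reading off the mean and variance of the prediction. In the sketch below the moments are estimated by sampling rather than in closed form (a simplification of the paper's analytic computation), and the squared-exponential kernel hyperparameters and the toy system are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def gp_posterior(X, y, x_star, ell=1.0, sf=1.0, noise=0.1):
    """Posterior mean/variance of a GP with squared-exponential kernel."""
    def k(a, b):
        return sf**2 * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)
    K = k(X, X) + noise**2 * np.eye(len(X))
    ks = k(X, x_star)
    mu = ks.T @ np.linalg.solve(K, y)
    var = sf**2 - np.einsum('ij,ij->j', ks, np.linalg.solve(K, ks))
    return mu, var

def moment_match(X, y, m, P, n_samples=2000):
    """Gaussian approximation to the predictive distribution when a
    N(m, P) state is pushed through the GP transition model; the moments
    are estimated by sampling here as a stand-in for the analytic step."""
    xs = rng.normal(m, np.sqrt(P), n_samples)
    mu, var = gp_posterior(X, y, xs)
    # Law of total variance: Var = E[var] + Var[mean].
    return mu.mean(), mu.var() + var.mean()

# Toy transition data: x_{t+1} = sin(x_t) + noise (an assumed example system).
X = rng.uniform(-3, 3, 30)
y = np.sin(X) + 0.1 * rng.normal(size=30)
print(moment_match(X, y, m=0.5, P=0.2))
```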

Relevance:

40.00%

Publisher:

Abstract:

A fundamental problem in the analysis of structured relational data like graphs, networks, databases, and matrices is to extract a summary of the common structure underlying relations between individual entities. Relational data are typically encoded in the form of arrays; invariance to the ordering of rows and columns corresponds to exchangeable arrays. Results in probability theory due to Aldous, Hoover and Kallenberg show that exchangeable arrays can be represented in terms of a random measurable function which constitutes the natural model parameter in a Bayesian model. We obtain a flexible yet simple Bayesian nonparametric model by placing a Gaussian process prior on the parameter function. Efficient inference utilises elliptical slice sampling combined with a random sparse approximation to the Gaussian process. We demonstrate applications of the model to network data and clarify its relation to models in the literature, several of which emerge as special cases.
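
Elliptical slice sampling, the inference engine named above, fits in a few lines; the following generic Python sketch follows Murray, Adams and MacKay (2010) and is not tied to the paper's sparse-GP setting. The toy Gaussian likelihood in the demo is an assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def elliptical_slice(f, chol_Sigma, log_lik):
    """One elliptical slice sampling update for a latent vector f
    with prior N(0, Sigma); chol_Sigma is the Cholesky factor of Sigma."""
    nu = chol_Sigma @ rng.normal(size=len(f))   # auxiliary prior draw
    log_y = log_lik(f) + np.log(rng.uniform())  # slice level
    theta = rng.uniform(0, 2 * np.pi)
    theta_min, theta_max = theta - 2 * np.pi, theta
    while True:
        f_new = f * np.cos(theta) + nu * np.sin(theta)  # point on the ellipse
        if log_lik(f_new) > log_y:
            return f_new
        # Shrink the bracket towards theta = 0 and try again.
        if theta < 0:
            theta_min = theta
        else:
            theta_max = theta
        theta = rng.uniform(theta_min, theta_max)

# Toy demo: Gaussian prior and Gaussian likelihood (assumed example).
Sigma = np.array([[1.0, 0.5], [0.5, 1.0]])
L = np.linalg.cholesky(Sigma)
data = np.array([0.8, -0.3])
log_lik = lambda f: -0.5 * np.sum((data - f) ** 2) / 0.5**2
f, samples = np.zeros(2), []
for _ in range(1000):
    f = elliptical_slice(f, L, log_lik)
    samples.append(f)
print("posterior mean estimate:", np.mean(samples, axis=0))
```

The update needs no step-size tuning and always accepts, which is what makes it attractive for latent Gaussian models such as the array model described above.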

Relevance:

40.00%

Publisher:

Abstract:

Based on the second-order solutions obtained for three-dimensional weakly nonlinear random waves propagating over a steady uniform current in finite water depth, the joint statistical distribution of the velocity and acceleration of a fluid particle in the current direction is derived using the characteristic function expansion method. From the joint distribution and the Morison equation, the theoretical distributions of the drag forces, inertia forces and total random forces caused by waves propagating over a steady uniform current are determined. The distribution of the inertia forces is Gaussian, as is the distribution derived using the linear wave model, whereas the distributions of the drag forces and total random forces deviate slightly from those derived utilizing the linear wave model. The distributions presented can be determined from the wave number spectrum of the ocean waves, the current speed and the second-order wave-wave and wave-current interactions. As an illustrative example, for fully developed deep ocean waves, the parameters appearing in the distributions near the still water level are calculated for various wind speeds and current speeds using the Donelan-Pierson-Banner spectrum, and the effects of the current and of the nonlinearity of the ocean waves on the distributions are studied. (c) 2006 Elsevier Ltd. All rights reserved.
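
For reference, the Morison equation invoked above has the standard per-unit-length form for a cylindrical member (the symbols are the conventional ones, not taken from the paper):

$$ F(t) \;=\; \underbrace{\tfrac{1}{2}\,\rho\,C_D\,D\,u(t)\,\lvert u(t)\rvert}_{\text{drag}} \;+\; \underbrace{\rho\,C_M\,\frac{\pi D^2}{4}\,\dot{u}(t)}_{\text{inertia}}, $$

where u is the horizontal particle velocity in the current direction (wave plus current), ρ the water density, D the member diameter, and C_D, C_M the drag and inertia coefficients. Since the acceleration is Gaussian to leading order, the inertia force is Gaussian, while the quadratic u|u| drag term is what makes the drag and total force distributions non-Gaussian.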

Relevance:

40.00%

Publisher:

Abstract:

The greatest relaxation time for an assembly of three-dimensional rigid rotators in an axially symmetric bistable potential is obtained exactly in terms of continued fractions as a sum of the zero-frequency decay functions (averages of the Legendre polynomials) of the system. This is accomplished by studying the entire time evolution of the Green function (transition probability) by expanding the time-dependent distribution as a Fourier series and proceeding to the zero-frequency limit of the Laplace transform of that distribution. The procedure is entirely analogous to the calculation of the characteristic time of the probability evolution (the integral of the configuration-space probability density function with respect to the position co-ordinate) for a particle undergoing translational diffusion in a potential, a concept originally used by Malakhov and Pankratov (Physica A 229 (1996) 109). This procedure allowed them to obtain exact solutions of the Kramers one-dimensional translational escape rate problem for piecewise parabolic potentials. The solution was accomplished by posing the problem in terms of the appropriate Sturm-Liouville equation, which could be solved in terms of the parabolic cylinder functions. The method (as applied to rotational problems and posed in terms of recurrence relations for the decay functions, i.e., the Brinkman approach, cf. Blomberg, Physica A 86 (1977) 49, as opposed to the Sturm-Liouville one) demonstrates clearly that the greatest relaxation time, unlike the integral relaxation time, which is governed by a single decay function (albeit coupled to all the others in non-linear fashion via the underlying recurrence relation), is governed by a sum of decay functions. The method is easily generalized to multidimensional state spaces by matrix continued fraction methods, allowing one to treat non-axially symmetric potentials, where the distribution function is governed by two state variables. (C) 2001 Elsevier Science B.V. All rights reserved.
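
For orientation, the two characteristic times contrasted here are conventionally defined as follows (notation assumed, not the paper's):

$$ \tau_{\mathrm{int}} \;=\; \frac{1}{C(0)}\int_0^{\infty} C(t)\,dt \;=\; \frac{\tilde{C}(0)}{C(0)}, \qquad \tau_{\mathrm{max}} \;=\; \frac{1}{\lambda_1}, $$

where C(t) is the relevant equilibrium decay function, \tilde{C}(s) its Laplace transform (so the integral relaxation time is its zero-frequency limit), and λ_1 the smallest non-vanishing eigenvalue of the underlying Fokker-Planck operator. The abstract's point is that τ_max, unlike τ_int, is expressible as a sum over the zero-frequency decay functions.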

Relevance:

40.00%

Publisher:

Abstract:

The present study concerns the characterization of probability distributions using the residual entropy function. The concept of entropy is extensively used in the literature as a quantitative measure of the uncertainty associated with a random phenomenon. The commonly used lifetime models in reliability theory are the exponential, Pareto, beta, Weibull and gamma distributions. Several characterization theorems are obtained for the above models using reliability concepts such as the failure rate, mean residual life function, vitality function and variance residual life function. Most of the work on characterization of distributions in the reliability context centers around the failure rate or the residual life function. An important aspect of interest in the study of entropy is that of locating distributions for which Shannon's entropy is maximum subject to certain restrictions on the underlying random variable. We introduce the geometric vitality function and examine its properties; it is established that the geometric vitality function determines the distribution uniquely. The problem of averaging the residual entropy function is examined, and truncated versions of higher-order entropies are also defined. In this study it is established that the residual entropy function determines the distribution uniquely and that its constancy is characteristic of the geometric distribution.
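
For reference, the residual entropy function of a lifetime X with density f and survival function F̄(t) = P(X > t) is conventionally defined, in the continuous case, as

$$ H(X;t) \;=\; -\int_t^{\infty} \frac{f(x)}{\bar F(t)} \,\log\!\frac{f(x)}{\bar F(t)}\,dx, $$

which reduces to Shannon's entropy at t = 0; the discrete analogue of this quantity underlies the geometric-distribution characterization mentioned above.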

Relevance:

40.00%

Publisher:

Abstract:

In this thesis, (outlier-)robust estimators and tests for the unknown parameter of a continuous density function are developed using the likelihood depth introduced by Mizera and Müller (2004). The resulting methods are then applied to three different distributions. For one-dimensional parameters, the likelihood depth of a parameter in a data set is computed as the minimum of the proportion of observations for which the derivative of the log-likelihood function with respect to the parameter is non-negative and the proportion for which this derivative is non-positive. The parameter of maximal depth is therefore the one for which both proportions are equal; it is initially chosen as the estimator, since likelihood depth is intended to measure how well a parameter fits the data set. Asymptotically, the parameter of maximal depth is the one for which the probability that the derivative of the log-likelihood with respect to the parameter is non-negative for an observation equals one half. If this does not hold for the true underlying parameter, the estimator based on likelihood depth is biased. This thesis shows how this bias can be corrected so that the corrected estimators are consistent. To develop tests for the parameter, the simplicial likelihood depth introduced by Müller (2005), which is a U-statistic, is used. It turns out that for the same distributions for which likelihood depth yields biased estimators, the simplicial likelihood depth is an unbiased U-statistic; in particular, its asymptotic distribution is known and tests for various hypotheses can be formulated. For some hypotheses, however, the shift in the depth leads to poor power of the corresponding test, so corrected tests are introduced together with conditions under which they are consistent. The thesis consists of two parts. The first part presents the general theory of the estimators and tests and proves their consistency. The second part applies the theory to three distributions: the Weibull distribution, the Gaussian copula and the Gumbel copula, demonstrating how the methods of the first part can be used to derive (robust) consistent estimators and tests for the unknown parameter of the distribution. Overall, robust estimators and tests can be found for all three distributions by means of likelihood depths. On uncontaminated data, existing standard methods are sometimes superior, but the advantage of the new methods becomes apparent on contaminated data and data with outliers.
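
A minimal Python sketch of the depth computation described above, for the exponential distribution (an example family assumed here; the thesis treats the Weibull distribution and two copulas). For Exp(λ) the score is d/dλ log f(x; λ) = 1/λ - x, the maximum-depth estimate converges to λ/ln 2, and taking ln 2 / median restores consistency, illustrating the kind of bias correction discussed.

```python
import numpy as np

rng = np.random.default_rng(3)

def likelihood_depth_exp(lmbda, x):
    """Likelihood depth of a rate parameter lmbda for an Exp(lmbda) model:
    the minimum of the fractions of observations with non-negative and
    non-positive score d/dlmbda log f(x; lmbda) = 1/lmbda - x."""
    score = 1.0 / lmbda - x
    return min(np.mean(score >= 0), np.mean(score <= 0))

x = rng.exponential(scale=1 / 2.0, size=1000)   # true rate 2.0
grid = np.linspace(0.5, 6.0, 500)
depths = [likelihood_depth_exp(l, x) for l in grid]
raw = grid[int(np.argmax(depths))]              # tends to lambda / ln 2
print("raw max-depth estimate:", raw)
print("bias-corrected estimate:", np.log(2) / np.median(x))
```

Because the estimate depends on the data only through the sample median of the scores, gross outliers move it very little, which is the robustness property the thesis exploits.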

Relevance:

40.00%

Publisher:

Abstract:

The Support Vector (SV) machine is a novel type of learning machine, based on statistical learning theory, which contains polynomial classifiers, neural networks, and radial basis function (RBF) networks as special cases. In the RBF case, the SV algorithm automatically determines centers, weights and threshold so as to minimize an upper bound on the expected test error. The present study is devoted to an experimental comparison of these machines with a classical approach, where the centers are determined by k-means clustering and the weights are found using error backpropagation. We consider three machines, namely a classical RBF machine, an SV machine with Gaussian kernel, and a hybrid system with the centers determined by the SV method and the weights trained by error backpropagation. Our results show that on the US Postal Service database of handwritten digits, the SV machine achieves the highest test accuracy, followed by the hybrid approach. The SV approach is thus not only theoretically well-founded, but also superior in a practical application.
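
A hedged, present-day reconstruction of the comparison, using scikit-learn and its small digits set as a stand-in for the US Postal Service data, with a least-squares output layer in place of error backpropagation; all hyperparameter values are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import load_digits
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# (1) SV machine with Gaussian (RBF) kernel: centres are the support vectors.
svm = SVC(kernel="rbf", gamma=0.001, C=10).fit(Xtr, ytr)
print("SVM accuracy:", svm.score(Xte, yte))

# (2) Classical RBF network: k-means centres plus a linear output layer
# (regularised least squares here, instead of error backpropagation).
centres = KMeans(n_clusters=60, random_state=0, n_init=10).fit(Xtr).cluster_centers_

def rbf_features(X, centres, gamma=0.001):
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

out = Ridge(alpha=1e-3).fit(rbf_features(Xtr, centres), np.eye(10)[ytr])
pred = rbf_features(Xte, centres).dot(out.coef_.T) + out.intercept_
print("RBF-net accuracy:", (pred.argmax(1) == yte).mean())
```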

Relevance:

40.00%

Publisher:

Abstract:

Using the classical Parzen window estimate as the target function, the kernel density estimation is formulated as a regression problem and the orthogonal forward regression technique is adopted to construct sparse kernel density estimates. The proposed algorithm incrementally minimises a leave-one-out test error score to select a sparse kernel model, and a local regularisation method is incorporated into the density construction process to further enforce sparsity. The kernel weights are finally updated using the multiplicative nonnegative quadratic programming algorithm, which has the ability to reduce the model size further. Except for the kernel width, the proposed algorithm has no other parameters that need tuning, and the user is not required to specify any additional criterion to terminate the density construction procedure. Two examples are used to demonstrate the ability of this regression-based approach to effectively construct a sparse kernel density estimate with comparable accuracy to that of the full-sample optimised Parzen window density estimate.
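
A simplified sketch of the final weight-update stage, in the spirit of the multiplicative nonnegative quadratic programming step (not the paper's exact recursion; the kernel width, sample, and stopping rule are assumptions): minimise (1/2) w'Bw - c'w over non-negative weights summing to one, with the Parzen window estimate as the target.

```python
import numpy as np

rng = np.random.default_rng(4)

def gauss_kernel(a, b, h):
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * h * h)) \
        / (np.sqrt(2 * np.pi) * h)

x = rng.normal(0, 1, 200)
h = 0.3
B = gauss_kernel(x, x, h)   # kernel responses between sample points
c = B.mean(axis=1)          # Parzen window target evaluated at each point

# Multiplicative non-negative update followed by renormalisation
# onto the simplex; many weights are driven towards zero.
w = np.full(len(x), 1 / len(x))
for _ in range(200):
    w *= c / (B @ w + 1e-12)   # elementwise multiplicative step
    w /= w.sum()

print("kernel weights above 1e-4:", np.sum(w > 1e-4), "of", len(x))
```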

Relevance:

40.00%

Publisher:

Abstract:

Radial basis function networks can be trained quickly using linear optimisation once centres and other associated parameters have been initialised. The authors propose a small adjustment to a well-accepted initialisation algorithm which improves the network accuracy over a range of problems. The algorithm is described and results are presented.
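
In standard form (symbols assumed here, not taken from the paper), the linear-optimisation step referred to above is the regularised least-squares solution for the output weights once the centres c_j and widths σ_j are fixed:

$$ \mathbf{w} = (\Phi^{\top}\Phi + \lambda I)^{-1}\Phi^{\top}\mathbf{y}, \qquad \Phi_{ij} = \exp\!\big(-\|\mathbf{x}_i - \mathbf{c}_j\|^2 / (2\sigma_j^2)\big), $$

with λ ≥ 0 an optional regularisation parameter; this is why the quality of the centre initialisation dominates the network's accuracy.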

Relevance:

40.00%

Publisher:

Abstract:

A new sparse kernel probability density function (pdf) estimator based on a zero-norm constraint is constructed using the classical Parzen window (PW) estimate as the target function. The so-called zero-norm of the parameters is used in order to achieve enhanced model sparsity, and it is suggested to minimize an approximate function of the zero-norm. It is shown that, under certain conditions, the kernel weights of the proposed pdf estimator based on the zero-norm approximation can be updated using the multiplicative nonnegative quadratic programming algorithm. Numerical examples are employed to demonstrate the efficacy of the proposed approach.
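
One widely used smooth surrogate of the zero-norm (an illustrative choice; the paper's particular approximation may differ) is

$$ \|\mathbf{w}\|_0 \;=\; \#\{\,i : w_i \neq 0\,\} \;\approx\; \sum_i \big(1 - e^{-\alpha |w_i|}\big), $$

where α > 0 controls how sharply the surrogate penalises small but non-zero weights; adding such a term to the quadratic fitting objective is what drives most kernel weights to exactly zero.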

Relevance:

40.00%

Publisher:

Abstract:

Consideration is given to a standard CDMA system and to the determination of the density function of the interference, with and without Gaussian noise, using sampling-theory concepts. The formula derived provides fast and accurate results and is a simple, useful alternative to other methods.
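
The paper's sampling-theory formula is not reproduced here; as a rough illustration of the quantity being characterised, the following Python sketch estimates the multiple-access interference in a simple synchronous CDMA model by simulation and compares its spread with the standard Gaussian approximation (the processing gain N and user count K are assumptions, and additive receiver noise is omitted for brevity).

```python
import numpy as np

rng = np.random.default_rng(5)

# Synchronous CDMA with random binary spreading codes: despread the
# K - 1 interfering users with user 0's code and inspect the result.
N, K, trials = 31, 10, 20_000
codes = rng.choice([-1.0, 1.0], size=(trials, K, N))
bits = rng.choice([-1.0, 1.0], size=(trials, K))
desired = codes[:, 0, :]
interference = np.einsum('tk,tkn,tn->t',
                         bits[:, 1:], codes[:, 1:, :], desired) / N

print("simulated interference std:", interference.std())
print("Gaussian approximation std:", np.sqrt((K - 1) / N))
```

A histogram of `interference` against the fitted Gaussian makes visible the deviations in the tails that motivate computing the exact density rather than relying on the Gaussian approximation.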