983 results for Inverse Gaussian Distribution


Relevance: 80.00%

Abstract:

Within the classification of orbits in axisymmetric stellar systems, we present a new algorithm able to automatically classify orbits according to their nature. The algorithm applies the correlation integral method to the surface of section (SoS) of the orbit: by fitting the cumulative distribution function built from the consequents in the SoS, we obtain the value of its logarithmic slope m, which is directly related to the orbit's nature: for slopes m ≈ 1 we expect the orbit to be regular, while for slopes m ≈ 2 we expect it to be chaotic. This method gives a fast and reliable way to classify orbits and, furthermore, we provide an analytical expression for the probability that an orbit is regular or chaotic given the logarithmic slope m of its correlation integral. Although the method works well statistically, the underlying algorithm can fail in some cases, misclassifying individual orbits under peculiar circumstances. The performance of the algorithm benefits from a rich sampling of the traces on the SoS, which can be obtained with long numerical integration of the orbits. Finally, we note that the algorithm does not differentiate between the subtypes of regular orbits, namely resonantly trapped and untrapped orbits; such a distinction would be a useful feature, which we leave for future work. Since the result of the analysis is a probability linked to a Gaussian distribution, by the very definition of a distribution some orbits are classified as belonging to the class opposite to their true nature; these misclassifications form the probabilistic tails of the distribution. So while the method produces fair statistical results, it lacks absolute classification precision.
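
As a rough illustration of the slope-fitting step, the sketch below estimates the logarithmic slope m of the correlation integral from a set of SoS points; the function name, radius grid and synthetic test sets are ours, not the paper's.

```python
import numpy as np
from scipy.spatial.distance import pdist

def correlation_integral_slope(points, r_min=1e-3, r_max=0.5, n_r=20):
    """Estimate the logarithmic slope m of the correlation integral
    C(r) ~ r**m from an (N, 2) array of consequents on the SoS."""
    d = pdist(points)                      # all pairwise distances
    radii = np.logspace(np.log10(r_min), np.log10(r_max), n_r)
    # C(r): fraction of pairs closer than r (cumulative distribution
    # of the pairwise distances)
    C = np.array([np.mean(d < r) for r in radii])
    mask = C > 0                           # avoid log(0) at tiny radii
    m, _ = np.polyfit(np.log(radii[mask]), np.log(C[mask]), 1)
    return m

# Regular orbits trace a 1-D curve on the SoS (m ≈ 1); chaotic orbits
# fill an area (m ≈ 2).  Quick check with synthetic point sets:
theta = np.random.uniform(0, 2 * np.pi, 2000)
curve = np.c_[np.cos(theta), np.sin(theta)]     # 1-D set: slope ≈ 1
area = np.random.uniform(-1, 1, (2000, 2))      # 2-D set: slope ≈ 2
print(correlation_integral_slope(curve), correlation_integral_slope(area))
```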

Relevance: 40.00%

Abstract:

Controlling both the minimum size of holes and the minimum size of structural members is an essential requirement in the topology optimization design process for manufacturing. This paper addresses both requirements by means of a unified approach involving mesh-independent projection techniques. An inverse projection is developed to control the minimum hole size, while a standard direct projection scheme is used to control the minimum length of structural members. In addition, a heuristic scheme combining both contrasting requirements simultaneously is discussed. Two topology optimization implementations are contributed: one in which the projection (either inverse or direct) is used at each iteration, and another based on a two-phase scheme. In the first phase, the compliance minimization is carried out without any projection until convergence. In the second phase, the chosen projection scheme is applied iteratively until a solution is obtained that satisfies either the minimum member size or the minimum hole size. Examples demonstrate the various features of the projection-based techniques presented.
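
A minimal sketch of the two projection schemes, assuming the regularized Heaviside projection of Guest et al. as the underlying operator; the function names and the brute-force neighbour search are illustrative, not the paper's implementation.

```python
import numpy as np

def density_filter(x, coords, r):
    """Linear (hat-kernel) filter: weighted average of the design
    variables within radius r of each element centroid."""
    dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    w = np.maximum(0.0, r - dists)         # hat weights, zero outside r
    return (w @ x) / w.sum(axis=1)

def heaviside_project(x_tilde, beta):
    """Smooth Heaviside projection pushing filtered densities to 0/1."""
    return 1.0 - np.exp(-beta * x_tilde) + x_tilde * np.exp(-beta)

def direct_projection(x, coords, r_member, beta):
    # controls the minimum member (solid) size
    return heaviside_project(density_filter(x, coords, r_member), beta)

def inverse_projection(x, coords, r_hole, beta):
    # controls the minimum hole (void) size: project the void phase
    return 1.0 - heaviside_project(density_filter(1.0 - x, coords, r_hole), beta)

# Usage on random element centroids and densities:
coords = np.random.default_rng(0).uniform(size=(100, 2))
x = np.random.default_rng(1).uniform(size=100)
rho = direct_projection(x, coords, r_member=0.2, beta=8.0)
```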

Relevance: 40.00%

Abstract:

In this thesis, likelihood depths, introduced by Mizera and Müller (2004), are used to develop (outlier-)robust estimators and tests for the unknown parameter of a continuous density function; the procedures are then applied to three different distributions. For a one-dimensional parameter, the likelihood depth of the parameter in the data set is computed as the minimum of two fractions: the fraction of the data for which the derivative of the log-likelihood function with respect to the parameter is non-negative, and the fraction for which this derivative is non-positive. The parameter for which both fractions are equal therefore has the greatest depth, and it is initially chosen as the estimator, since the likelihood depth is intended to measure how well a parameter fits the data set. Asymptotically, the parameter with the greatest depth is the one for which the probability that the derivative of the log-likelihood function with respect to the parameter is non-negative for an observation equals one half. If this does not hold for the underlying parameter, the estimator based on the likelihood depth is biased. This thesis shows how this bias can be corrected so that the corrected estimators are consistent. To develop tests for the parameter, the simplex likelihood depth introduced by Müller (2005), which is a U-statistic, is used. It turns out that for the same distributions for which the likelihood depth yields biased estimators, the simplex likelihood depth is an unbiased U-statistic; in particular, its asymptotic distribution is known and tests for various hypotheses can be formulated. The shift in depth, however, leads to poor power of the corresponding test for some hypotheses; corrected tests are therefore introduced, together with conditions under which they are consistent. The thesis consists of two parts. The first part presents the general theory of the estimators and tests and proves their consistency. The second part applies the theory to three different distributions: the Weibull distribution and the Gaussian and Gumbel copulas, showing how the procedures of the first part can be used to derive (robust) consistent estimators and tests for the unknown parameter of a distribution. Overall, robust estimators and tests based on likelihood depths can be found for all three distributions. On uncontaminated data, existing standard methods are partly superior, but the advantage of the new methods shows on contaminated data and data with outliers.
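
To make the depth construction concrete, here is a small sketch for the exponential distribution (a Weibull with fixed shape), where the score is 1/λ − x and the bias correction reduces to the factor ln 2; all names and the grid search are ours.

```python
import numpy as np

def likelihood_depth(lam, x):
    """Likelihood depth of rate parameter lam for i.i.d. exponential
    data x: the minimum of the fraction of observations with
    non-negative score and the fraction with non-positive score,
    where the score is d/dlam log f(x; lam) = 1/lam - x."""
    score = 1.0 / lam - x
    return min(np.mean(score >= 0), np.mean(score <= 0))

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=500)          # true rate = 0.5

# Depth is maximized where both fractions are 1/2, i.e. at
# lam = 1/median(x).  Since median = ln(2)/rate for the exponential,
# this raw maximum-depth estimator estimates rate/ln 2 and is biased;
# multiplying by ln 2 gives the corrected, consistent estimator.
grid = np.linspace(0.1, 2.0, 200)
lam_raw = grid[np.argmax([likelihood_depth(l, x) for l in grid])]
print(lam_raw, np.log(2) / np.median(x))          # raw vs corrected
```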

Relevance: 40.00%

Abstract:

Calculating the potentials on the heart's epicardial surface from the body surface potentials constitutes one form of the inverse problem of electrocardiography (ECG). Since these problems are ill-posed, one approach is to use zero-order Tikhonov regularization, where the squared norms of both the residual and the solution are minimized, with a relative weight determined by the regularization parameter. In this paper, we used three different methods to choose the regularization parameter in inverse solutions of ECG: the L-curve, generalized cross validation (GCV) and the discrepancy principle (DP). Among them, the GCV method has received less attention in solutions to ECG inverse problems than the other methods. Since the DP approach needs knowledge of the noise norm, we used a model function to estimate it. The performance of the various methods was compared using a concentric spheres model and a real-geometry heart-torso model, with a distribution of current dipoles placed inside the heart model as the source. Gaussian measurement noise was added to the body surface potentials. The results show that all three methods produce good inverse solutions when the noise is small; but as the noise increases, the DP approach produces better results than the L-curve and GCV methods, particularly in the real-geometry model. Both the GCV and L-curve methods perform well in low to medium noise situations.
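
A hedged sketch of zero-order Tikhonov regularization with GCV-based parameter choice, written via the SVD so each candidate parameter is cheap to evaluate; the test matrix and noise level are synthetic stand-ins, not the ECG models of the paper.

```python
import numpy as np

def tikhonov_gcv(A, b, lambdas):
    """x_lam = argmin ||A x - b||^2 + lam^2 ||x||^2, with lam chosen
    by minimizing the generalized cross validation function."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    beta = U.T @ b
    best = (np.inf, None, None)
    for lam in lambdas:
        f = s**2 / (s**2 + lam**2)            # Tikhonov filter factors
        x = Vt.T @ (f * beta / s)             # regularized solution
        resid = np.linalg.norm(A @ x - b) ** 2
        gcv = resid / (len(b) - f.sum()) ** 2  # GCV(lam)
        if gcv < best[0]:
            best = (gcv, lam, x)
    return best[1], best[2]

# Example on a random ill-conditioned system with Gaussian noise:
rng = np.random.default_rng(1)
A = rng.normal(size=(80, 60)) @ np.diag(0.9 ** np.arange(60))
x_true = rng.normal(size=60)
b = A @ x_true + 0.01 * rng.normal(size=80)
lam, x = tikhonov_gcv(A, b, np.logspace(-4, 1, 50))
```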

Relevance: 40.00%

Abstract:

The retrieval of wind vectors from satellite scatterometer observations is a non-linear inverse problem. A common approach to solving inverse problems is to adopt a Bayesian framework and to infer the posterior distribution of the parameters of interest given the observations by using a likelihood model relating the observations to the parameters, and a prior distribution over the parameters. We show how Gaussian process priors can be used efficiently with a variety of likelihood models, using local forward (observation) models and direct inverse models for the scatterometer. We present an enhanced Markov chain Monte Carlo method to sample from the resulting multimodal posterior distribution. We go on to show how the computational complexity of the inference can be controlled by using a sparse, sequential Bayes algorithm for estimation with Gaussian processes. This helps to overcome the most serious barrier to the use of probabilistic, Gaussian process methods in remote sensing inverse problems, which is the prohibitively large size of the data sets. We contrast the sampling results with the approximations that are found by using the sparse, sequential Bayes algorithm.
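
The paper's enhanced MCMC sampler is not reproduced here; the toy sketch below only shows the basic setting it addresses: a Gaussian prior, an ambiguous (periodic) forward model standing in for the scatterometer observation operator, and a plain Metropolis sampler drawing from the resulting multimodal posterior. All parameters are illustrative.

```python
import numpy as np

def log_posterior(u, obs, noise_sd=0.3, prior_sd=2.0):
    """Gaussian prior over u, Gaussian noise on an ambiguous forward
    model (a stand-in, not the real scatterometer model), so several
    values of u explain the observation: a multimodal posterior."""
    fwd = np.sin(2 * u)
    log_lik = -0.5 * np.sum((obs - fwd) ** 2) / noise_sd**2
    log_prior = -0.5 * u**2 / prior_sd**2
    return log_lik + log_prior

def metropolis(logp, obs, n_steps=20000, step=0.5):
    rng = np.random.default_rng(2)
    u, lp, samples = 0.0, logp(0.0, obs), []
    for _ in range(n_steps):
        u_new = u + step * rng.normal()        # random-walk proposal
        lp_new = logp(u_new, obs)
        if np.log(rng.uniform()) < lp_new - lp:  # accept/reject
            u, lp = u_new, lp_new
        samples.append(u)
    return np.array(samples)

samples = metropolis(log_posterior, obs=np.array([0.7]))
# A histogram of `samples` shows several modes, one per ambiguity.
```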

Relevance: 40.00%

Abstract:

The inverse controller is traditionally assumed to be a deterministic function. This paper presents a pedagogical methodology for estimating the stochastic model of the inverse controller. The proposed method is based on Bayes' theorem: using Bayes' rule to obtain the stochastic model of the inverse controller allows knowledge of the uncertainty in both the inverse and the forward model to be used in estimating the optimal control signal. The paper presents the methodology for general nonlinear systems and demonstrates it on nonlinear single-input single-output (SISO) and multiple-input multiple-output (MIMO) examples. © 2006 IEEE.
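
A minimal grid-based sketch of the idea, with a hypothetical plant model and illustrative noise/prior parameters: Bayes' rule, p(u | y) ∝ p(y | u) p(u), turns forward-model uncertainty and a prior over controls into a stochastic inverse controller.

```python
import numpy as np

def forward(u):
    return np.tanh(u)                     # hypothetical plant model

u_grid = np.linspace(-3, 3, 601)          # candidate control signals
y_target = 0.5
noise_sd = 0.1                            # forward-model uncertainty
prior_sd = 1.0                            # prior over control effort

# Posterior over the control signal given the target output:
log_lik = -0.5 * (y_target - forward(u_grid)) ** 2 / noise_sd**2
log_prior = -0.5 * u_grid**2 / prior_sd**2
post = np.exp(log_lik + log_prior)
post /= np.trapz(post, u_grid)            # normalize p(u | y_target)

u_mean = np.trapz(u_grid * post, u_grid)  # posterior-mean control signal
```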

Relevance: 40.00%

Abstract:

We report the suitability of an Einstein-Podolsky-Rosen entanglement source for Gaussian continuous-variable quantum key distribution at 1550 nm. Our source is based on a single continuous-wave squeezed vacuum mode combined with a vacuum mode at a balanced beam splitter. Extending a recent security proof, we characterize the source by quantifying the extractable length of a composable secure key from a finite number of samples under the assumption of collective attacks. We show that distances on the order of 10 km are achievable with this source for a reasonable sample size, despite the fact that the entanglement was generated using a vacuum mode. Our security analysis applies to all states having an asymmetry in the field quadrature variances, including those generated by the superposition of two squeezed modes with different squeezing strengths.
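
As a back-of-the-envelope companion (our construction, using the standard Gaussian quantum-optics convention of vacuum variance 1), the sketch below builds the covariance matrix of the two-mode state obtained by mixing squeezed vacuum with vacuum on a balanced beam splitter, exhibiting the quadrature-variance asymmetry the security analysis relies on.

```python
import numpy as np

r = 1.0                                          # squeezing parameter
V_sq = np.diag([np.exp(-2 * r), np.exp(2 * r)])  # squeezed vacuum (x, p)
V_vac = np.eye(2)                                # vacuum mode

# Balanced beam splitter: outputs (a ± b)/sqrt(2), inputs uncorrelated
A = 0.5 * (V_sq + V_vac)     # reduced covariance of output mode A
B = 0.5 * (V_sq + V_vac)     # reduced covariance of output mode B
C = 0.5 * (V_sq - V_vac)     # cross-correlations between A and B

cov = np.block([[A, C], [C, B]])   # full two-mode covariance matrix
print(np.diag(cov))   # unequal x and p variances: asymmetric quadratures
```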

Relevance: 30.00%

Abstract:

The El Niño Southern Oscillation (ENSO) is a climatic phenomenon related to the inter-annual variability of global meteorological patterns, influencing sea surface temperature and rainfall variability. It influences human health indirectly through extreme temperature and moisture conditions that may accelerate the spread of some vector-borne viral diseases, such as dengue fever (DF). This work examines the spatial distribution of the association between ENSO and DF in the countries of the Americas during 1995-2004, a period which includes the 1997-1998 El Niño, one of the most important climatic events of the 20th century. Data on the Southern Oscillation Index (SOI), which indicates El Niño-La Niña activity, were obtained from the Australian Bureau of Meteorology. The annual DF incidence (AI_y) by country was computed using Pan American Health Organization data. SOI and AI_y values were standardised as deviations from the mean and plotted in bar-line graphs. The regression coefficients between SOI and AI_y (r_{SOI,AI}) were calculated and spatially interpolated with an inverse distance weighted algorithm. The results indicate that of the five years registering the highest numbers of cases (1998, 2002, 2001, 2003 and 1997), four had El Niño activity. In the southern hemisphere, the annual spatially weighted mean centre of epidemics moved southward, from 6° 31' S in 1995 to 21° 12' S in 1999, and the r_{SOI,AI} values were negative in Cuba, Belize, Guyana and Costa Rica, indicating synchrony between higher DF incidence rates and higher El Niño activity. The r_{SOI,AI} map allows visualisation of a graded surface with higher values of the ENSO-DF association for Mexico, Central America, the northern Caribbean islands and the extreme north-northwest of South America.
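
A minimal sketch of inverse distance weighted interpolation, the scheme used for the r_{SOI,AI} surface; the station coordinates and values below are hypothetical.

```python
import numpy as np

def idw(xy_known, values, xy_query, power=2.0):
    """Inverse distance weighted interpolation: each query point gets a
    weighted average of the known values, weights = 1/distance**power."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=-1)
    d = np.maximum(d, 1e-12)               # avoid division by zero
    w = 1.0 / d**power
    return (w @ values) / w.sum(axis=1)

# Hypothetical example: interpolate correlation coefficients given at
# a few country centroids onto a regular grid.
stations = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
r_values = np.array([-0.4, 0.1, -0.2])
gx, gy = np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50))
grid = np.c_[gx.ravel(), gy.ravel()]
surface = idw(stations, r_values, grid).reshape(gx.shape)
```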

Relevance: 30.00%

Abstract:

Background: Previous studies elsewhere have indicated that GB virus C (GBV-C) infection is frequent in patients infected with human immunodeficiency virus type 1 (HIV-1), owing to the similar transmission routes of the two viruses. The aim of this study was to determine the prevalence, incidence density and genotypic characteristics of GBV-C in this population. Methodology/Principal Findings: The study population included 233 patients from a cohort primarily comprised of homosexual men recently infected with HIV-1 in Sao Paulo, Brazil. The presence of GBV-C RNA was determined in plasma samples by reverse transcriptase-nested polymerase chain reaction and quantified by real-time PCR. GBV-C genotypes were determined by direct sequencing. HIV viral load and CD4+ and CD8+ T lymphocyte counts were also measured in all patients. The overall prevalence of GBV-C infection was 0.23 (95% CI: 0.18 to 0.29) in the study group. There was no significant difference between patients with and without GBV-C infection or glycoprotein E2 antibodies with regard to age, sex, HIV-1 viral load, CD4+ and CD8+ T cell counts, or treatment with antiretroviral drugs. An inverse correlation was observed between GBV-C and HIV-1 loads at enrollment and after one year. A positive but non-significant correlation was also observed between GBV-C load and CD4+ T lymphocyte count. Phylogenetic analysis of the GBV-C isolates revealed the presence of genotypes 1 and 2, the latter subclassified into subtypes 2a and 2b. Conclusion/Significance: GBV-C infection is common in recently HIV-1 infected patients in Sao Paulo, Brazil, and the predominant genotype is 2b. This study provides the first report of GBV-C prevalence at the time of HIV-1 diagnosis and of the one-year incidence density of GBV-C infection.
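
As a quick arithmetic check (assuming a simple Wald interval, and inferring the count of positives from the quoted prevalence, since the abstract does not state it), the quoted 95% CI is reproduced by:

```python
import numpy as np

k, n = 54, 233                 # positives inferred from 0.23 * 233
p = k / n
se = np.sqrt(p * (1 - p) / n)  # standard error of a binomial proportion
lo, hi = p - 1.96 * se, p + 1.96 * se
print(round(p, 2), round(lo, 2), round(hi, 2))   # 0.23 (0.18, 0.29)
```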

Relevance: 30.00%

Abstract:

Leakage reduction in water supply systems and distribution networks has become an increasingly important issue in the water industry, since leaks and ruptures result in major physical and economic losses. Hydraulic transient solvers can be used in system operational diagnosis, namely for leak detection purposes, due to their capability to describe the dynamic behaviour of the system and to provide substantial amounts of data. In this research work, the combination of hydraulic transient analysis with an optimisation model, through inverse transient analysis (ITA), has been used for leak detection and location in an experimental facility containing PVC pipes. Observed transient pressure data have been used for testing ITA. A key factor in the success of the leak detection technique is the accurate calibration of the transient solver, namely adequate boundary conditions and the description of energy dissipation effects, since PVC pipes are characterised by a viscoelastic mechanical response. Results have shown that leaks were located with an accuracy of 4-15% of the total length of the pipeline, depending on the discretisation of the system model.
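
A skeleton of the ITA loop: simulate transients for candidate leak parameters, compare with observed pressures, and optimise the mismatch. Only that structure reflects the method; `simulate_transient` below is a placeholder for a full transient solver with viscoelastic pipe behaviour, so the example runs end to end on synthetic data.

```python
import numpy as np
from scipy.optimize import minimize

def simulate_transient(leak_pos, leak_size, t):
    # Hypothetical damped-oscillation response whose phase and damping
    # depend on the leak parameters (illustration only).
    return np.exp(-leak_size * t) * np.cos(2 * np.pi * (t - leak_pos))

t = np.linspace(0, 5, 200)
rng = np.random.default_rng(3)
observed = simulate_transient(0.35, 0.8, t) + 0.01 * rng.normal(size=t.size)

def misfit(params):
    """Sum of squared differences between observed and simulated
    transient pressures for candidate leak parameters."""
    leak_pos, leak_size = params
    return np.sum((observed - simulate_transient(leak_pos, leak_size, t)) ** 2)

res = minimize(misfit, x0=[0.5, 0.5], method="Nelder-Mead")
print(res.x)    # recovered (leak_pos, leak_size), near (0.35, 0.8)
```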

Relevance: 30.00%

Abstract:

The residence time distribution and mean residence time of a 10% sodium bicarbonate solution dried in a conventional spouted bed with inert bodies were measured with the stimulus-response method. Methylene blue was used as a chemical tracer, and the effects of the paste feed mode, the size distribution of the inert bodies, and the mean particle size on the residence times and dried powder properties were investigated. The results showed that the residence time distributions were best reproduced by the perfect mixing cell model, i.e., N = 1 in the continuous stirred tanks-in-series model. The mean residence times ranged from 6.04 to 12.90 min and were significantly affected by the factors studied. Analysis of variance on the experimental data showed that the mean residence times were affected by the mean diameter of the inert bodies at a significance level of 1% and by their size distribution at a level of 5%. Moreover, altering the paste feed from dripping to pneumatic atomization affected the mean residence time at a 5% significance level. The dried powder characteristics proved to be adequate for further industrial manipulation, as demonstrated by the low moisture content, narrow range of particle sizes, and good flow properties. These results are significant for the drying of heat-sensitive materials because they show that, by simultaneously changing the size distribution and average size of the inert bodies, the mean residence time of a paste can be halved, thus decreasing losses due to degradation.
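
A short sketch of the stimulus-response calculation under the N = 1 (perfect mixing) model: from a measured tracer concentration curve C(t), form the residence time distribution E(t) = C(t)/∫C dt and its moments. The time grid and τ value are illustrative, not the measured ones.

```python
import numpy as np

t = np.linspace(0, 60, 600)              # time, min
tau = 9.5                                # assumed mean residence time, min
C = np.exp(-t / tau)                     # synthetic tracer response:
                                         # perfect mixing gives E = e^(-t/tau)/tau

E = C / np.trapz(C, t)                   # normalized RTD
t_mean = np.trapz(t * E, t)              # mean residence time, ≈ tau
sigma2 = np.trapz((t - t_mean) ** 2 * E, t)
N = t_mean**2 / sigma2                   # tanks-in-series estimate, ≈ 1
print(t_mean, N)
```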

Relevance: 30.00%

Abstract:

We discuss the expectation propagation (EP) algorithm for approximate Bayesian inference using a factorizing posterior approximation. For neural network models, we use a central limit theorem argument to make EP tractable when the number of parameters is large. For two types of models, we show that EP can achieve optimal generalization performance when data are drawn from a simple distribution.
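
The heart of EP is projection by moment matching: the non-Gaussian "tilted" distribution (cavity times true factor) is replaced by the Gaussian with the same mean and variance, which minimizes KL(tilted || Gaussian). The sketch below illustrates this step numerically on a grid, with an arbitrary non-Gaussian factor of our choosing rather than the paper's neural network setting.

```python
import numpy as np

theta = np.linspace(-10, 10, 4001)
cavity = np.exp(-0.5 * theta**2 / 4.0)            # cavity N(0, 4), unnormalized
factor = (0.7 * np.exp(-0.5 * (theta - 2) ** 2)   # illustrative non-Gaussian
          + 0.3 * np.exp(-0.5 * theta**2 / 9))    # likelihood factor
tilted = cavity * factor
tilted /= np.trapz(tilted, theta)                 # normalize on the grid

mean = np.trapz(theta * tilted, theta)            # matched first moment
var = np.trapz((theta - mean) ** 2 * tilted, theta)  # matched second moment
# N(mean, var) is the EP projection of the tilted distribution; dividing
# it by the cavity yields the updated approximate factor.
print(mean, var)
```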

Relevance: 30.00%

Abstract:

In this paper we study the possible microscopic origin of heavy-tailed probability distributions for the price variation of financial instruments. We extend the standard log-normal process to include another random component in the so-called stochastic volatility models. We study these models under an assumption, akin to the Born-Oppenheimer approximation, in which the volatility has already relaxed to its equilibrium distribution and acts as a background to the evolution of the price process. In this approximation, we show that all models of stochastic volatility should exhibit a scaling relation in the time lag of zero-drift modified log-returns. We verify that the Dow Jones Industrial Average index indeed follows this scaling. We then focus on two popular stochastic volatility models, the Heston and Hull-White models. In particular, we show that in the Hull-White model the resulting probability distribution of log-returns in this approximation corresponds to the Tsallis (Student's t) distribution. The Tsallis parameters are given in terms of the microscopic stochastic volatility model. Finally, we show that the log-returns for 30 years of Dow Jones index data are well fitted by a Tsallis distribution, and we obtain the relevant parameters. © 2007 Elsevier B.V. All rights reserved.
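
A sketch of the approximation in code, assuming a Hull-White-style lognormal equilibrium volatility with illustrative parameters: once the scale is drawn from its equilibrium distribution, mixing Gaussians over that scale already produces tails heavier than Gaussian.

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(4)
n = 100_000
sigma = rng.lognormal(mean=-0.5, sigma=0.5, size=n)   # equilibrium vols
returns = sigma * rng.normal(size=n)                  # modified log-returns

print(kurtosis(returns))   # excess kurtosis > 0: heavier than Gaussian
```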

Relevance: 30.00%

Abstract:

We consider the statistical properties of the local density of states of a one-dimensional Dirac equation in the presence of various types of disorder with a Gaussian white-noise distribution. It is shown how either the replica trick or supersymmetry can be used to calculate exactly all the moments of the local density of states. Careful attention is paid to how the results change if the local density of states is averaged over atomic length scales. For both the replica trick and supersymmetry, the problem is reduced to finding the ground state of a zero-dimensional Hamiltonian which is written solely in terms of a pair of coupled spins which are elements of u(1,1). This ground state is explicitly found for the particular case of the Dirac equation corresponding to an infinite metallic quantum wire with a single conduction channel. The calculated moments of the local density of states agree with those found previously by Al'tshuler and Prigodin [Sov. Phys. JETP 68 (1989) 198] using a technique based on recursion relations for Feynman diagrams. © 2001 Elsevier Science B.V. All rights reserved.
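
As a purely numerical companion (our construction, not the paper's field-theoretic calculation), one can estimate moments of the local density of states for a disordered 1-D lattice model; a tight-binding chain with Gaussian on-site disorder stands in here for the discretized Dirac problem, with the delta functions broadened into Lorentzians.

```python
import numpy as np

def ldos_at_site(L=200, W=0.5, E=0.0, eta=0.05, site=100, seed=None):
    """LDOS at one site of a tight-binding chain with Gaussian
    white-noise disorder: rho(E) = sum_n |psi_n(site)|^2 delta(E - E_n),
    with the delta broadened to width eta."""
    rng = np.random.default_rng(seed)
    H = np.diag(W * rng.normal(size=L))           # Gaussian on-site disorder
    H += np.diag(-np.ones(L - 1), 1) + np.diag(-np.ones(L - 1), -1)
    evals, evecs = np.linalg.eigh(H)
    weights = evecs[site, :] ** 2
    lorentz = (eta / np.pi) / ((E - evals) ** 2 + eta**2)
    return np.sum(weights * lorentz)

samples = np.array([ldos_at_site(seed=s) for s in range(300)])
moments = [np.mean(samples**k) for k in (1, 2, 3)]
print(moments)   # disorder-averaged moments of the local DOS
```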