30 results for distribution (probability theory)

in Aston University Research Archive


Relevance:

90.00%

Publisher:

Abstract:

This thesis describes the Generative Topographic Mapping (GTM) --- a non-linear latent variable model, intended for modelling continuous, intrinsically low-dimensional probability distributions, embedded in high-dimensional spaces. It can be seen as a non-linear form of principal component analysis or factor analysis. It also provides a principled alternative to the self-organizing map --- a widely established neural network model for unsupervised learning --- resolving many of its associated theoretical problems. An important potential application of the GTM is visualization of high-dimensional data. Since the GTM is non-linear, the relationship between data and its visual representation may be far from trivial, but a better understanding of this relationship can be gained by computing the so-called magnification factor. In essence, the magnification factor relates the distances between data points, as they appear when visualized, to the actual distances between those data points. There are two principal limitations of the basic GTM model. The computational effort required will grow exponentially with the intrinsic dimensionality of the density model. However, if the intended application is visualization, this will typically not be a problem. The other limitation is the inherent structure of the GTM, which makes it most suitable for modelling moderately curved probability distributions of approximately rectangular shape. When the target distribution is very different from this, the aim of maintaining an `interpretable' structure, suitable for visualizing data, may come into conflict with the aim of providing a good density model. The fact that the GTM is a probabilistic model means that results from probability theory and statistics can be used to address problems such as model complexity. Furthermore, this framework provides solid ground for extending the GTM to wider contexts than that of this thesis.
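As a hedged illustration of the magnification-factor idea, the sketch below numerically compares latent-space and data-space distances for an invented one-dimensional non-linear mapping; it does not implement the GTM's actual RBF mapping, and the function `mapping` and all numbers are assumptions:

```python
import numpy as np

def mapping(x):
    # Invented 1-D -> 2-D non-linear embedding, standing in for the GTM's
    # RBF mapping from latent space to data space (not the actual GTM model).
    return np.array([np.sin(x), np.cos(2.0 * x)])

def magnification_factor(x, eps=1e-6):
    # Magnification factor = sqrt(det(J^T J)), with the Jacobian J of the
    # latent-to-data mapping estimated by central finite differences.
    J = ((mapping(x + eps) - mapping(x - eps)) / (2.0 * eps)).reshape(-1, 1)
    return float(np.sqrt(np.linalg.det(J.T @ J)))

# A unit step in latent space near x = 0.3 is stretched by this factor in data space.
mf = magnification_factor(0.3)
```

Where the factor is large, points that look close in the latent (visualization) space are in fact far apart in data space.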

Relevance:

90.00%

Publisher:

Abstract:

A range of physical and engineering systems exhibits irregular, complex dynamics featuring the alternation of quiet and burst time intervals, called intermittency. The form of intermittency most popular in laser science is on-off intermittency [1]. On-off intermittency can be understood as the conversion of noise, in a system close to an instability threshold, into effective time-dependent fluctuations that result in the alternation of stable and unstable periods. On-off intermittency has recently been demonstrated in semiconductor, Erbium-doped and Raman lasers [2-5]. The recently demonstrated random distributed feedback (random DFB) fiber laser exhibits irregular dynamics near the generation threshold [6,7]. Here we show intermittency in the cascaded random DFB fiber laser. We study intensity fluctuations in a random DFB fiber laser based on nitrogen-doped fiber. Under appropriate pumping, the laser generates first and second Stokes components at 1120 nm and 1180 nm, respectively. We study the intermittency in the radiation of the second Stokes wave. A typical time trace near the generation threshold of the second Stokes wave (Pth) is shown in Fig. 1a. From a number of sufficiently long time traces we calculate the statistical distribution of the time between major spikes in the dynamics, Fig. 1b. To eliminate the contribution of high-frequency components of the spikes we use a low-pass filter along with a reference value of the output power. The experimental data are fitted by a power law, ⟨T⟩ ~ (P − Pth)^γ, where ⟨T⟩ is the mean time between spikes. There are two different intermittency regimes. Just above Pth, the mean time follows a −3/2 power law, which is typical of on-off intermittency with hopping between two states (the first and second Stokes waves in our case) [7]. At higher power, the mean time follows a −4 power law, indicating a change in the intermittency type to multistate. Multistable dynamics has been observed in erbium-doped fiber lasers [8].
The origin of the multiple states in our system could probably be connected with polarization hopping or other mechanisms and should be investigated further. We have presented the first experimental statistical characterisation of the on-off and multistate intermittencies that occur in the generation of the second Stokes wave in a nitrogen-doped random DFB fiber laser. References: [1] H. Fujisaka and T. Yamada, “A New Intermittency in Coupled Dynamical Systems,” Prog. Theor. Phys. 74, 918 (1985). [2] S. Osborne, A. Amann, D. Bitauld, and S. O’Brien, “On-off intermittency in an optically injected semiconductor laser,” Phys. Rev. E 85, 056204 (2012). [3] S. Sergeyev, K. O'Mahoney, S. Popov, and A. T. Friberg, “Coherence and anticoherence resonance in high-concentration erbium-doped fiber laser,” Opt. Lett. 35, 3736 (2010). [4] A. E. El-Taher, S. V. Sergeyev, E. G. Turitsyna, P. Harper, and S. K. Turitsyn, “Intermittent self-pulsing in a fiber Raman laser,” in Proc. Conf. Nonlinear Photonics, paper ID 1367139, Colorado Springs, USA, 2012. [5] S. K. Turitsyn, S. A. Babin, A. E. El-Taher, P. Harper, D. V. Churkin, S. I. Kablukov, J. D. Ania-Castañón, V. Karalekas, and E. V. Podivilov, “Random distributed feedback fibre laser,” Nat. Photon. 4, 231 (2010). [6] I. D. Vatnik, D. V. Churkin, S. A. Babin, and S. K. Turitsyn, “Cascaded random distributed feedback Raman fiber laser operating at 1.2 μm,” Opt. Express 19, 18486 (2011). [7] W. Feller, An Introduction to Probability Theory and Its Applications, Vol. 1, 3rd ed. (Wiley, New York, 1968). [8] G. Huerta-Cuellar, A. N. Pisarchik, and Y. O. Barmenkov, “Experimental characterization of hopping dynamics in a multistable fiber laser,” Phys. Rev. E 78, 035202(R) (2008).
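The power-law fit of the mean inter-spike time described above can be sketched as follows; the pump powers and times here are synthetic stand-ins for the measured data, and `Pth` is an arbitrary value, not the laser's actual threshold:

```python
import numpy as np

Pth = 1.0                           # stand-in generation threshold (arbitrary units)
P = np.linspace(1.05, 1.5, 10)      # pump powers above threshold
T = 2.0 * (P - Pth) ** -1.5         # synthetic mean inter-spike times, exact -3/2 law

# Fit the exponent gamma as the slope of log(T) versus log(P - Pth).
gamma, logC = np.polyfit(np.log(P - Pth), np.log(T), 1)
```

On real traces, the times would first be extracted by thresholding the low-pass-filtered intensity, as described in the abstract.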

Relevance:

80.00%

Publisher:

Abstract:

This thesis provides an interoperable language for quantifying uncertainty using probability theory. A general introduction to interoperability and uncertainty is given, with particular emphasis on the geospatial domain. Existing interoperable standards used within the geospatial sciences are reviewed, including Geography Markup Language (GML), Observations and Measurements (O&M) and the Web Processing Service (WPS) specifications. The importance of uncertainty in geospatial data is identified and probability theory is examined as a mechanism for quantifying these uncertainties. The Uncertainty Markup Language (UncertML) is presented as a solution to the lack of an interoperable standard for quantifying uncertainty. UncertML is capable of describing uncertainty using statistics, probability distributions or a series of realisations. The capabilities of UncertML are demonstrated through a series of XML examples. This thesis then provides a series of example use cases where UncertML is integrated with existing standards in a variety of applications. The Sensor Observation Service - a service for querying and retrieving sensor-observed data - is extended to provide a standardised method for quantifying the inherent uncertainties in sensor observations. The INTAMAP project demonstrates how UncertML can be used to aid uncertainty propagation using a WPS by allowing UncertML as input and output data. The flexibility of UncertML is demonstrated with an extension to the GML geometry schemas to allow positional uncertainty to be quantified. Further applications and developments of UncertML are discussed.
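As a purely illustrative sketch of serialising a probability distribution to XML, the snippet below builds a small Gaussian description; the element and attribute names are invented for this example and are NOT the actual UncertML vocabulary defined in the thesis:

```python
import xml.etree.ElementTree as ET

# Invented vocabulary: these element/attribute names are not the real UncertML schema.
dist = ET.Element("Distribution", type="Gaussian")
ET.SubElement(dist, "mean").text = "12.3"
ET.SubElement(dist, "variance").text = "2.5"
xml_string = ET.tostring(dist, encoding="unicode")
```

The point is only the pattern: a distribution type plus its parameters, expressed in a machine-readable, interoperable form.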

Relevance:

80.00%

Publisher:

Abstract:

This thesis explores the process of developing a principled approach for translating a model of mental-health risk expertise into a probabilistic graphical structure. Probabilistic graphical structures combine graph theory and probability theory, providing numerous advantages for representing domains involving uncertainty, such as the mental-health domain. This thesis builds on the advantages that probabilistic graphical structures offer in representing such domains. The Galatean Risk Screening Tool (GRiST) is a psychological model for mental health risk assessment based on fuzzy sets. In this thesis the knowledge encapsulated in the psychological model was used to develop the structure of the probability graph by exploiting the semantics of the clinical expertise. This thesis describes how a chain graph can be developed from the psychological model to provide a probabilistic evaluation of risk that complements the one generated by GRiST’s clinical expertise. The GRiST knowledge structure was decomposed into component parts, which were in turn mapped into equivalent probabilistic graphical structures, such as Bayesian belief nets and Markov random fields, to produce a composite chain graph that provides a probabilistic classification of risk to complement the expert clinical judgements.
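A minimal, hedged illustration of the kind of probabilistic evaluation such a graph supports: a toy two-node Bayesian belief net, not the GRiST chain graph itself, with all probabilities invented:

```python
# Toy two-node belief net: a binary risk node updated by one binary indicator
# via Bayes' rule. All probabilities are invented, not GRiST values.
p_risk = 0.1                      # prior P(risk = high)
p_ind_given_risk = 0.8            # P(indicator present | risk = high)
p_ind_given_no_risk = 0.2         # P(indicator present | risk = low)

# Marginal probability of observing the indicator, then the posterior risk.
p_ind = p_risk * p_ind_given_risk + (1.0 - p_risk) * p_ind_given_no_risk
posterior = p_risk * p_ind_given_risk / p_ind   # P(risk = high | indicator present)
```

A chain graph generalises this by combining directed (Bayesian-net) and undirected (Markov-field) components over many such nodes.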

Relevance:

80.00%

Publisher:

Abstract:

A cross-country pipeline construction project is exposed to an uncertain environment due to its enormous size (physical, manpower requirement and financial value), complexity in design technology and involvement of external factors. These uncertainties can lead to several changes in project scope during the process of project execution. Unless the changes are properly controlled, the time, cost and quality goals of the project may never be achieved. A methodology is proposed for project control through risk analysis, contingency allocation and hierarchical planning models. Risk analysis is carried out through the analytic hierarchy process (AHP) due to the subjective nature of risks in construction projects. The results of risk analysis are used to determine the logical contingency for project control with the application of probability theory. Ultimate project control is carried out by a hierarchical planning model, which enables decision makers to take vital decisions in the changing environment of the construction period. Goal programming (GP), a multiple-criteria decision-making technique, is proposed for model formulation because of its flexibility and priority-based structure. The project is planned hierarchically at three levels—project, work package and activity. GP is applied separately at each level. The decision variables of each model are different planning parameters of the project. In this study, the models are formulated from the owner's perspective and their effectiveness in project control is demonstrated.
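The AHP step can be sketched as follows: a hedged example computing a priority vector as the principal eigenvector of a reciprocal pairwise-comparison matrix (the matrix entries are invented, not from the case study):

```python
import numpy as np

# Reciprocal pairwise-comparison matrix for three hypothetical risk factors:
# A[i, j] says how much more important factor i is than factor j.
A = np.array([[1.0, 3.0, 5.0],
              [1.0 / 3.0, 1.0, 2.0],
              [1.0 / 5.0, 1.0 / 2.0, 1.0]])

# AHP priorities: the principal eigenvector of A, normalised to sum to one.
eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, np.argmax(eigvals.real)].real
weights = principal / principal.sum()
```

In a full AHP application the principal eigenvalue would also be used to compute a consistency ratio before the weights are accepted.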

Relevance:

80.00%

Publisher:

Abstract:

Projects that are exposed to uncertain environments can be effectively controlled with the application of risk analysis during the planning stage. The Analytic Hierarchy Process, a multiattribute decision-making technique, can be used to analyse and assess project risks which are objective or subjective in nature. Among other advantages, the process logically integrates the various elements in the planning process. The results from risk analysis and activity analysis are then used to develop a logical contingency allowance for the project through the application of probability theory. The contingency allowance is created in two parts: (a) a technical contingency, and (b) a management contingency. This provides a basis for decision making in a changing project environment. Effective control of the project is made possible by limiting changes to within the monetary contingency allowance for the work package concerned, and by utilizing the contingency through proper appropriation. The whole methodology is applied to a pipeline-laying project in India, and its effectiveness in project control is demonstrated.
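A minimal sketch of a probability-weighted contingency allowance split into its two parts; the risk probabilities and cost impacts are invented, not from the Indian case study:

```python
# Each risk is a (probability, cost impact) pair; all numbers are invented.
technical_risks = [(0.3, 50_000.0), (0.1, 200_000.0)]
management_risks = [(0.2, 80_000.0)]

# Probability-weighted (expected-cost) contingency for each part, then the total.
technical_contingency = sum(p * c for p, c in technical_risks)
management_contingency = sum(p * c for p, c in management_risks)
total_contingency = technical_contingency + management_contingency
```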

Relevance:

40.00%

Publisher:

Abstract:

We consider the direct adaptive inverse control of nonlinear multivariable systems with different delays between every input-output pair. In direct adaptive inverse control, the inverse mapping is learned from examples of input-output pairs. This makes the obtained controller suboptimal, since the network may have to learn the response of the plant over a larger operational range than necessary. Moreover, in certain applications the control problem can be redundant, implying that the inverse problem is ill-posed. In this paper we propose a new algorithm that allows uncertainty in nonlinear multivariable control systems to be estimated and exploited. This approach allows us to model strongly non-Gaussian distributions of control signals as well as processes with hysteresis. The proposed algorithm circumvents the dynamic programming problem by using the predicted neural network uncertainty to localise the possible control solutions to consider.
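A minimal sketch of learning an inverse mapping from input-output examples, using a linear least-squares model of a toy plant as a stand-in for the paper's neural network (and without its uncertainty estimates):

```python
import numpy as np

# Toy plant y = 2u, learned in reverse: fit u ~ w*y from example pairs so the
# model maps a desired output back to the control that produces it.
rng = np.random.default_rng(0)
u = rng.uniform(-1.0, 1.0, size=(200, 1))   # example control inputs
y = 2.0 * u                                  # corresponding plant outputs (noise-free)

w, *_ = np.linalg.lstsq(y, u, rcond=None)    # least-squares inverse model
u_cmd = float(w[0, 0] * 1.0)                 # control for a target output y = 1.0
```

For a redundant or hysteretic plant this direct fit breaks down, which is exactly the regime the paper's uncertainty-aware approach targets.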

Relevance:

40.00%

Publisher:

Abstract:

We obtain the exact asymptotic result for the disorder-averaged probability distribution function for a random walk in the biased Sinai model and show that it is characterized by a creeping behavior of the displacement moments with time, ⟨x^n(t)⟩ ∼ t^{μn}, where μ < 1 is the dimensionless mean drift. We employ a method with origins in quantum diffusion, based on the exact mapping of the problem to an imaginary-time Schrödinger equation. For nonzero drift such an equation has an isolated lowest eigenvalue separated by a gap from the quasicontinuous excited states, and the eigenstate corresponding to the former governs the long-time asymptotic behavior.
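A hedged Monte Carlo sketch of a biased random walk in a quenched random environment, in the spirit of the Sinai model; the environment statistics and all parameters are invented for illustration, and the simulation is not the paper's exact analytical method:

```python
import numpy as np

rng = np.random.default_rng(1)
n_sites, n_steps, n_walkers = 10_000, 5_000, 200

# Quenched random environment: each site gets a fixed right-hop probability
# with a small positive mean bias and strong site-to-site disorder.
p_right = np.clip(0.55 + 0.3 * rng.standard_normal(n_sites), 0.05, 0.95)

positions = np.full(n_walkers, n_sites // 2)
for _ in range(n_steps):
    hop_right = rng.random(n_walkers) < p_right[positions]
    positions = positions + np.where(hop_right, 1, -1)

# Ensemble-averaged displacement; deep traps in the environment are what
# produce the anomalously slow ("creeping") growth with time.
mean_drift = float(np.mean(positions - n_sites // 2))
```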

Relevance:

40.00%

Publisher:

Abstract:

We find the probability distribution of the fluctuating parameters of a soliton propagating through a medium with additive noise. Our method is a modification of the instanton formalism (the method of optimal fluctuation) based on a saddle-point approximation in the path integral. We first solve, self-consistently, the fundamental problem of soliton propagation within the framework of the noisy nonlinear Schrödinger equation. We then consider model modifications due to in-line (filtering, amplitude and phase modulation) control. We examine how control elements change the error probability in optical soliton transmission. Even though a weak noise is considered, we are interested here in the probabilities of error-causing large fluctuations, which are beyond perturbation theory. We describe in detail a new phenomenon of soliton collapse that occurs under the combined action of noise, filtering and amplitude modulation. © 2004 Elsevier B.V. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Most of the common techniques for estimating conditional probability densities are inappropriate for applications involving periodic variables. In this paper we introduce two novel techniques for tackling such problems, and investigate their performance using synthetic data. We then apply these techniques to the problem of extracting the distribution of wind vector directions from radar scatterometer data gathered by a remote-sensing satellite.
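As a hedged sketch of a density model suited to periodic variables, the snippet below evaluates a two-component von Mises (circular normal) mixture and checks that it normalises over one period; the component parameters are invented, and the paper's actual estimators may differ:

```python
import numpy as np

def von_mises_pdf(theta, mu, kappa):
    # Circular normal density on [-pi, pi); np.i0 is the modified Bessel I0.
    return np.exp(kappa * np.cos(theta - mu)) / (2.0 * np.pi * np.i0(kappa))

def mixture_pdf(theta):
    # Two invented components, e.g. two preferred wind directions.
    return (0.6 * von_mises_pdf(theta, 0.0, 2.0)
            + 0.4 * von_mises_pdf(theta, np.pi / 2.0, 4.0))

# Riemann sum over one full period: a proper circular density integrates to 1.
theta = np.linspace(-np.pi, np.pi, 10_001)
total = float(np.sum(mixture_pdf(theta[:-1])) * (theta[1] - theta[0]))
```

Unlike a Gaussian mixture, this density is periodic by construction, so direction 179° and direction −179° are correctly treated as neighbours.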

Relevance:

30.00%

Publisher:

Abstract:

Most of the common techniques for estimating conditional probability densities are inappropriate for applications involving periodic variables. In this paper we apply two novel techniques to the problem of extracting the distribution of wind vector directions from radar scatterometer data gathered by a remote-sensing satellite.

Relevance:

30.00%

Publisher:

Abstract:

Most conventional techniques for estimating conditional probability densities are inappropriate for applications involving periodic variables. In this paper we introduce three related techniques for tackling such problems, and investigate their performance using synthetic data. We then apply these techniques to the problem of extracting the distribution of wind vector directions from radar scatterometer data gathered by a remote-sensing satellite.

Relevance:

30.00%

Publisher:

Abstract:

Most of the common techniques for estimating conditional probability densities are inappropriate for applications involving periodic variables. In this paper we introduce three novel techniques for tackling such problems, and investigate their performance using synthetic data. We then apply these techniques to the problem of extracting the distribution of wind vector directions from radar scatterometer data gathered by a remote-sensing satellite.

Relevance:

30.00%

Publisher:

Abstract:

A major problem in modern probabilistic modeling is the huge computational complexity involved in typical calculations with multivariate probability distributions when the number of random variables is large. Because exact computations are infeasible in such cases and Monte Carlo sampling techniques may reach their limits, there is a need for methods that allow for efficient approximate computations. One of the simplest approximations is based on the mean field method, which has a long history in statistical physics. The method is widely used, particularly in the growing field of graphical models. Researchers from disciplines such as statistical physics, computer science, and mathematical statistics are studying ways to improve this and related methods and are exploring novel application areas. Leading approaches include the variational approach, which goes beyond factorizable distributions to achieve systematic improvements; the TAP (Thouless-Anderson-Palmer) approach, which incorporates correlations by including effective reaction terms in the mean field theory; and the more general methods of graphical models. Bringing together ideas and techniques from these diverse disciplines, this book covers the theoretical foundations of advanced mean field methods, explores the relation between the different approaches, examines the quality of the approximation obtained, and demonstrates their application to various areas of probabilistic modeling.
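As a minimal sketch of the naive mean field method mentioned above, the snippet below iterates the Ising self-consistency equation m = tanh(beta * (J * z * m + h)) to a fixed point; all parameter values are invented for illustration:

```python
import math

# Naive mean-field magnetisation for an Ising model with coordination number z:
# each spin feels the average field of its neighbours instead of their actual values.
beta, J, z, h = 0.5, 1.0, 4.0, 0.1   # invented parameters (ordered phase: beta*J*z > 1)
m = 0.0
for _ in range(200):                  # fixed-point iteration of the self-consistency equation
    m = math.tanh(beta * (J * z * m + h))
```

The TAP approach mentioned above refines exactly this equation by adding a reaction (Onsager) correction term to the effective field.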

Relevance:

30.00%

Publisher:

Abstract:

Electronic information tools have become increasingly popular with channel manufacturers in their efforts to manage resellers. Although these tools have been found to increase the efficiency of communications, researchers and practitioners alike have questioned their effectiveness. To investigate how top-down electronic information affects social channel relationships, we consider the use of such tools in information technology distribution channels. Using electronic communications theory and channel governance theory, we hypothesize that the usefulness of the tools is a function of the type of information inherent in each tool (demand-creation information or supply-fulfillment information) and the particular communications characteristics of this information.