964 results for Breakdown Probability


Relevance: 20.00%

Abstract:

This presentation examines the degree of certainty that can be attained in science. The scientific paradigm comprises two extremes: causality and determinism on one side, probability and indeterminism on the other. By appealing to Hume's notions of resemblance and contiguity, one can reject both causality and objective chance as unfounded and non-empirical. The problem of induction and the gambler's fallacy spring from the same cognitive/heuristic source. Hume describes these mental tendencies in his essays "Of Probability" and "Of the Idea of Necessary Connexion". A discussion of Hume's conception of probability, along with other interpretations of probability, will be required. Even though science glorifies and idealizes causality, probability can be understood as being just as coherent. A probabilistic attitude, even if it is equally non-empirical, could prove more advantageous than the old paradigm of causality.

Relevance: 20.00%

Abstract:

In this thesis the old philosophical question "does every event have a cause?" will be examined in the light of quantum mechanics and probability theory. In physics as well as in the philosophy of science, the orthodox position holds that the physical world is indeterministic. At the fundamental level of physical reality, the quantum level, events would happen without causes, by chance, by 'irreducible' randomness. The most precise physical theorem leading to this conclusion is Bell's theorem. Here the premises of this theorem will be re-examined. It will be recalled that solutions to the theorem other than indeterminism are conceivable, some of which are known but neglected, such as 'superdeterminism'. But it will be argued that further solutions compatible with determinism exist, notably through the study of model physical systems. One of the general conclusions of this thesis is that the interpretation of Bell's theorem and of quantum mechanics depends crucially on the philosophical premises from which one starts. For example, within a worldview like Spinoza's, the quantum world can well be understood as deterministic. But it will be argued that even a determinism considerably less radical than Spinoza's is not ruled out by the physical experiments. If this is true, the 'determinism versus indeterminism' debate is not settled in the laboratory: it remains philosophical and open, contrary to what is often thought. In the second part of this thesis a model for the interpretation of probability will be proposed. A conceptual study of the notion of probability indicates that the hypothesis of determinism helps in understanding what a 'probabilistic system' is. It appears that determinism can answer certain questions to which indeterminism has no answer. For this reason we will conclude that Laplace's conjecture, namely that probability theory presupposes an underlying deterministic reality, retains all its legitimacy. This thesis will use the methods of both philosophy and physics; the two fields prove to be solidly connected here and to offer a vast potential for cross-fertilization, in both directions.
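
For reference, the sharpest standard statement of the constraint at issue is the CHSH form of Bell's inequality (a textbook formulation, not specific to this thesis): for correlation functions E(a, b) between outcomes at detector settings a and b, any local hidden-variable model satisfies

    |E(a,b) - E(a,b')| + |E(a',b) + E(a',b')| \le 2,

while quantum mechanics predicts values up to 2\sqrt{2}. Re-examining the premises behind this bound, such as the measurement-independence assumption targeted by superdeterminism, is what keeps deterministic readings on the table.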

Relevance: 20.00%

Abstract:

The present study concerns the characterization of probability distributions using the residual entropy function. The concept of entropy is extensively used in the literature as a quantitative measure of the uncertainty associated with a random phenomenon. The commonly used lifetime models in reliability theory are the exponential, Pareto, beta, Weibull and gamma distributions. Several characterization theorems have been obtained for these models using reliability concepts such as the failure rate, mean residual life function, vitality function and variance residual life function. Most work on the characterization of distributions in the reliability context centres on the failure rate or the residual life function. An important aspect of the study of entropy is locating distributions for which Shannon's entropy is maximal subject to certain restrictions on the underlying random variable. The geometric vitality function is introduced and its properties examined; it is established that the geometric vitality function determines the distribution uniquely. The problem of averaging the residual entropy function is examined, and truncated versions of the higher-order entropies are defined. The study establishes that the residual entropy function determines the distribution uniquely and that its constancy is characteristic of the geometric distribution.
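
For orientation, the residual entropy function discussed here is standardly defined, in the continuous case, as the Shannon entropy of the remaining lifetime given survival past t (a sketch; notation assumed):

    H(f; t) = - \int_t^{\infty} \frac{f(x)}{\bar{F}(t)} \log \frac{f(x)}{\bar{F}(t)} \, dx,

where \bar{F} is the survival function. The memoryless exponential law has H(f; t) constant in t, and the geometric law plays the analogous role in the discrete case, which is the constancy characterization stated above.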

Relevance: 20.00%

Abstract:

Department of Statistics, Cochin University of Science and Technology

Relevance: 20.00%

Abstract:

The forms of natural rubber studied were sheet (RSS 4 and RSS 5), ISNR 20 and EBC; for the latter two forms, samples from both the estate and non-estate sectors were included. The samples were collected from different locations at specified intervals over a particular period. The effect of the extent of mastication on raw rubber properties, as well as on the properties of the compounds and vulcanizates, was also studied. The consistency of the raw rubber properties and the breakdown behaviour of skim rubber were studied by collecting samples periodically from selected processing units. The effect of incorporating skim rubber into ISNR 20 has also been investigated.

Relevance: 20.00%

Abstract:

In this thesis we attempt a probabilistic analysis of some physically realizable, though complex, storage and queueing models. It is essentially a mathematical study of the stochastic processes underlying these models; our aim is an improved understanding of their behaviour that may widen their applicability. Different inventory systems with random lead times, server vacations, bulk demands, varying ordering levels, etc. are considered. We also study some finite- and infinite-capacity queueing systems with bulk service and server vacations, and obtain the transient solution in certain cases. Each chapter of the thesis is provided with its own introduction and key references.
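
As a concrete illustration of one model family mentioned above, the following is a minimal simulation sketch of an M/M/1 queue with multiple server vacations; the rates and the estimator are illustrative assumptions, not taken from the thesis.

    # Minimal sketch: M/M/1 queue where the server, on emptying the queue,
    # takes exponential(theta) vacations until work is present on return.
    # All parameters are illustrative, not the thesis's values.
    import random

    def mm1_vacation_wait(lam=0.6, mu=1.0, theta=0.5, n_customers=100_000, seed=1):
        """Estimate the mean waiting time in queue by simulating FIFO arrivals."""
        rng = random.Random(seed)
        t_arrival = 0.0        # arrival epoch of the current customer
        server_free_at = 0.0   # earliest time the server can start a service
        total_wait = 0.0
        for _ in range(n_customers):
            t_arrival += rng.expovariate(lam)
            # Multiple-vacation rule: if the server went idle before this
            # arrival, it keeps taking vacations until one ends after it.
            while server_free_at < t_arrival:
                server_free_at += rng.expovariate(theta)
            start = max(t_arrival, server_free_at)
            total_wait += start - t_arrival
            server_free_at = start + rng.expovariate(mu)
        return total_wait / n_customers

    print(mm1_vacation_wait())  # close to 3.5 = lam/(mu*(mu-lam)) + 1/theta

The printed estimate can be checked against the known decomposition result for M/M/1 with multiple exponential vacations: the ordinary M/M/1 waiting time plus the mean residual vacation.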

Relevance: 20.00%

Abstract:

Many nonlinear optical microscopy techniques based on high-intensity nonlinear phenomena have been developed in recent years. A new technique based on minimally invasive in-situ analysis of specific bound elements in biological samples is described in the present work. Imaging-mode Laser-Induced Breakdown Spectroscopy (LIBS) is proposed as a combination of LIBS, femtosecond laser material processing and microscopy. The calcium distribution in the peripheral cell wall of the sunflower seedling (Helianthus annuus L.) stem is studied as a first application of imaging-mode LIBS. First, several nonlinear optical microscopy techniques are reviewed. The spatial resolution of the imaging-mode LIBS microscope is discussed on the basis of the Point-Spread Function (PSF) concept. The primary processes of Laser-Induced Breakdown (LIB) are reviewed; we consider ionization, breakdown, plasma formation and ablation. Water with a defined calcium salt concentration is used as a model of the biological object in the preliminary experiments. The transient LIB spectra are measured and analysed for both nanosecond and femtosecond laser excitation. The experiment on local calcium concentration measurements in the peripheral cell wall of the sunflower seedling stem employing nanosecond LIBS shows that a nanosecond laser is not a suitable excitation source for biological applications: the ablation craters have random shapes and depths over 20 µm. Analysis of the femtosecond laser ablation craters shows a reproducible circular form. At 3.5 µJ laser pulse energy the crater diameter is 4 µm and its depth 140 nm for a single laser pulse, which results in an analytical volume of about 1 femtoliter. Experimental results of two-dimensional and surface sectioning of bound calcium concentrations are presented in the work.
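
As a sanity check on the quoted analytical volume (a sketch assuming idealized crater profiles): bounding the crater between a cone and a cylinder with the measured dimensions gives

    V_cyl = \pi r^2 d = \pi (2 µm)^2 (0.14 µm) \approx 1.8 µm^3 \approx 1.8 fL,    V_cone = V_cyl / 3 \approx 0.6 fL,

since 1 µm^3 = 1 fL; both bounds are consistent with the stated analytical volume of about 1 femtoliter.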

Relevance: 20.00%

Abstract:

Using the independent particle model as our basis, we present a scheme to reduce the complexity and computational effort of calculating inclusive probabilities in many-electron collision systems. As an example we present an application to K-K charge transfer in collisions of 2.6 MeV Ne^{9+} on Ne. We are able to give impact-parameter-dependent probabilities for many-particle states which could lead to KLL Auger electrons after the collision, and we compare them with experimental values.
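
A sketch of the kind of reduction involved (generic notation, assumed here; the paper's own derivation may differ in detail): for a single-determinant, independent-particle description, the q-particle inclusive probabilities collapse to q x q determinants of the one-particle density matrix,

    P_{f_1 ... f_q} = det[ \gamma_{f_i f_j} ]_{i,j=1}^{q},    \gamma_{fg} = \sum_{k=1}^{N} c_{fk} c*_{gk},

where c_{fk} is the single-particle amplitude for occupied initial orbital k ending up in state f. Evaluating small determinants instead of summing over all N-particle final states is what tames the combinatorial effort.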

Relevance: 20.00%

Abstract:

Using the single-particle amplitudes from a 20-level coupled-channel calculation with ab initio relativistic self-consistent LCAO-MO Dirac-Fock-Slater energy eigenvalues and matrix elements, we calculate impact-parameter-dependent K-hole transfer probabilities within the framework of the inclusive probability formalism. As an example we show results for the heavy asymmetric collision system S^{15+} on Ar for impact energies from 4.7 to 16 MeV. The inclusive probability formalism, which reinstates the many-particle aspect of the collision system, permits a qualitative and quantitative agreement with experiment that is not achieved by the single-particle picture.
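
For context, a sketch of the semiclassical coupled-channel equations from which such single-particle amplitudes are obtained (generic notation, assumed here): expanding the time-dependent single-particle state in the 20 molecular basis states \phi_j with (generally time-dependent) energies \epsilon_j gives

    i\hbar \dot{a}_k(t) = \sum_j a_j(t) \langle \phi_k | \hat{V}(t) | \phi_j \rangle \exp( (i/\hbar) \int^t (\epsilon_k - \epsilon_j) \, dt' ),

integrated along the classical nuclear trajectory for each impact parameter b; the resulting amplitudes a_k(b) then feed the inclusive probability formalism.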

Relevance: 20.00%

Abstract:

Graphical techniques for modeling the dependencies of random variables have been explored in a variety of different areas, including statistics, statistical physics, artificial intelligence, speech recognition, image processing, and genetics. Formalisms for manipulating these models have been developed relatively independently in these research communities. In this paper we explore hidden Markov models (HMMs) and related structures within the general framework of probabilistic independence networks (PINs). The paper contains a self-contained review of the basic principles of PINs. It is shown that the well-known forward-backward (F-B) and Viterbi algorithms for HMMs are special cases of more general inference algorithms for arbitrary PINs. Furthermore, the existence of inference and estimation algorithms for more general graphical models provides a set of analysis tools for HMM practitioners who wish to explore a richer class of HMM structures. Examples of relatively complex models to handle sensor fusion and coarticulation in speech recognition are introduced and treated within the graphical model framework to illustrate the advantages of the general approach.
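
A minimal sketch of the forward pass of the forward-backward (F-B) algorithm the abstract refers to; the toy model (2 hidden states, 2 symbols) is illustrative, not from the paper.

    # Forward pass of the F-B algorithm for a discrete HMM.
    import numpy as np

    def forward(pi, A, B, obs):
        """Return p(obs) and the forward messages alpha[t, i] =
        p(o_1..o_t, s_t = i), for initial distribution pi, transition
        matrix A[i, j] = p(s_{t+1}=j | s_t=i) and emission matrix
        B[i, k] = p(o_t=k | s_t=i)."""
        T, N = len(obs), len(pi)
        alpha = np.zeros((T, N))
        alpha[0] = pi * B[:, obs[0]]
        for t in range(1, T):
            alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        return alpha[-1].sum(), alpha

    pi = np.array([0.6, 0.4])
    A = np.array([[0.7, 0.3],
                  [0.4, 0.6]])
    B = np.array([[0.9, 0.1],
                  [0.2, 0.8]])
    likelihood, _ = forward(pi, A, B, obs=[0, 1, 0])
    print(likelihood)

Seen through the PIN lens, this recursion is message passing on the chain-structured graph; the same machinery generalizes to the richer structures the paper advocates.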

Relevance: 20.00%

Abstract:

Compositional data analysis motivated the introduction of a complete Euclidean structure in the simplex of D parts. This was based on the early work of J. Aitchison (1986) and completed recently when the Aitchison distance in the simplex was associated with an inner product and orthonormal bases were identified (Aitchison and others, 2002; Egozcue and others, 2003). A partition of the support of a random variable generates a composition by assigning the probability of each interval to a part of the composition. One can imagine the partition being refined, so that the probability density comes to represent a kind of continuous composition of probabilities in a simplex of infinitely many parts. This intuitive idea leads to a Hilbert space of probability densities obtained by generalizing the Aitchison geometry for compositions in the simplex to the set of probability densities.
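
A sketch of the construction described above: a partition of the support turns a probability law into a composition, which Aitchison geometry then handles via the centered log-ratio (clr) transform. The standard-normal example and 4-part partition are illustrative assumptions.

    # Partition of the support -> composition in the simplex -> clr coordinates.
    import numpy as np
    from scipy.stats import norm

    cuts = [-np.inf, -1.0, 0.0, 1.0, np.inf]   # D = 4 intervals
    composition = np.diff(norm.cdf(cuts))       # probability of each interval

    # Centered log-ratio transform, the isometry behind the Aitchison
    # inner product and distance.
    clr = np.log(composition) - np.log(composition).mean()
    print(composition, clr)

Refining the partition (more, narrower intervals) is exactly the limit process the abstract gestures at: the composition approaches the density itself.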

Relevance: 20.00%

Abstract:

The Aitchison vector space structure for the simplex is generalized to a Hilbert space structure A2(P) for distributions and likelihoods on arbitrary spaces. Central notions of statistics, such as information or likelihood, can be identified in the algebraic structure of A2(P) with their corresponding notions in compositional data analysis, such as the Aitchison distance or the centered log-ratio transform. In this way rather elaborate aspects of mathematical statistics can be understood easily in the light of a simple vector space structure and of compositional data analysis. For example, combinations of statistical information, such as Bayesian updating or the combination of likelihood and robust M-estimation functions, are simple additions/perturbations in A2(Pprior), and weighting observations corresponds to a weighted addition of the corresponding evidence. Likelihood-based statistics for general exponential families turns out to have a particularly easy interpretation in terms of A2(P): regular exponential families form finite-dimensional linear subspaces of A2(P), and they correspond to finite-dimensional subspaces formed by their posteriors in the dual information space A2(Pprior). The Aitchison norm can be identified with mean Fisher information. The closing constant itself is identified with a generalization of the cumulant function and shown to be the Kullback-Leibler directed information. Fisher information is the local geometry of the manifold induced by the A2(P) derivative of the Kullback-Leibler information, and the space A2(P) can therefore be seen as the tangential geometry of statistical inference at the distribution P. The discussion of A2(P)-valued random variables, such as estimating functions or likelihoods, gives a further interpretation of Fisher information as the expected squared norm of evidence and a scale-free understanding of unbiased reasoning.
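
A sketch of the central operation (notation assumed, following the Bayes-space literature): identifying densities up to a positive scale factor, perturbation is pointwise multiplication followed by renormalization,

    (f \oplus g)(x) = \frac{f(x)\, g(x)}{\int f g \, dP},

so Bayes' rule, posterior \propto prior x likelihood, is literally the vector addition posterior = prior \oplus likelihood in A2(Pprior), and down-weighting an observation by a factor \alpha is the scalar powering (\alpha \odot f)(x) \propto f(x)^{\alpha}.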

Relevance: 20.00%

Abstract:

In this paper we define a new scheme to develop and evaluate protection strategies for building reliable GMPLS networks, based on what we have called the network protection degree (NPD). The NPD consists of an a priori evaluation, the failure sensibility degree (FSD), which provides the failure probability, and an a posteriori evaluation, the failure impact degree (FID), which determines the impact on the network in case of failure in terms of packet loss and recovery time. Having mathematically formulated these components, we present experimental results that demonstrate the benefits of using the NPD to enhance some current QoS routing algorithms so that they offer a certain degree of protection.
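
The abstract does not reproduce the formulas, so the following is only a hypothetical toy illustration of how an NPD-style score could combine an a priori failure probability (the FSD side) with an a posteriori impact term (the FID side) built from packet loss and recovery time; the field names, weights and normalization are all assumptions, not the paper's definitions.

    # Hypothetical toy NPD-style score; NOT the paper's actual FSD/FID formulas.
    from dataclasses import dataclass

    @dataclass
    class LinkStats:
        failure_prob: float    # a priori failure probability (FSD ingredient)
        packet_loss: float     # expected loss on failure, normalized to [0, 1]
        recovery_time: float   # expected recovery time, normalized to [0, 1]

    def toy_npd(link: LinkStats, w_loss: float = 0.5, w_time: float = 0.5) -> float:
        """Combine failure likelihood and failure impact into one degree in
        [0, 1]; higher means the link warrants stronger protection."""
        fsd = link.failure_prob
        fid = w_loss * link.packet_loss + w_time * link.recovery_time
        return fsd * fid   # expected impact: probability times consequence

    print(toy_npd(LinkStats(failure_prob=0.02, packet_loss=0.8, recovery_time=0.3)))

A QoS routing algorithm could then prefer paths minimizing such a score, which is the kind of enhancement the experiments evaluate.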