931 results for Failure Probability


Relevance: 20.00%

Abstract:

The reliability of a piece of equipment or a device is commonly understood as the probability that it carries out the functions expected of it adequately, without failure and within specified performance limits, at a given age and for a desired mission time, when used under the designated application and operating environmental stress. Approaches to reliability studies can be broadly classified as probabilistic or deterministic. The main interest of the former is to devise tools and methods to identify, within a proper statistical framework, the random mechanism governing the failure process, while the latter addresses the question of finding the causes of failure and the steps that reduce individual failures, thereby enhancing reliability. In the probabilistic approach, to which the present study subscribes, the concept of a life distribution, a mathematical idealisation that describes the failure times, is fundamental, and a basic question a reliability analyst has to settle is the form of the life distribution. It is for this reason that a major share of the literature on the mathematical theory of reliability is focussed on methods of arriving at reasonable models of failure times and on the failure patterns that induce such models. The application of the methodology of lifetime distributions is not confined to assessing the endurance of equipment and systems; it ranges over a wide variety of scientific investigations where the word lifetime may not refer to the length of life in the literal sense, but can be conceived in its most general form as a non-negative random variable. Thus the tools developed in connection with modelling lifetime data have found applications in other areas of research such as actuarial science, engineering, the biomedical sciences, economics, and extreme value theory.
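
As a minimal sketch of the life-distribution concept (not part of the abstract), the Python snippet below assumes a Weibull failure-time model, a common choice for failure times, and evaluates its reliability (survival) function R(t) and failure rate h(t) at a few ages; all parameter values are invented for illustration.

```python
import math

def weibull_reliability(t, shape, scale):
    """Survival function R(t) = P(T > t) for a Weibull life distribution."""
    return math.exp(-((t / scale) ** shape))

def weibull_hazard(t, shape, scale):
    """Failure (hazard) rate h(t) = f(t) / R(t); increases with t when shape > 1."""
    return (shape / scale) * (t / scale) ** (shape - 1)

# Hypothetical component: shape > 1 models wear-out (increasing failure rate).
shape, scale = 2.0, 1000.0  # scale in operating hours
for age in (100, 500, 1000):
    print(f"age={age:5d}h  R(t)={weibull_reliability(age, shape, scale):.3f}"
          f"  h(t)={weibull_hazard(age, shape, scale):.2e}/h")
```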

Relevance: 20.00%

Abstract:

In this paper, we study the relationship between the failure rate and the mean residual life of doubly truncated random variables. Accordingly, we develop characterizations for the exponential, Pareto II and beta distributions. Further, we generalize the identities for the Pearson and the exponential family of distributions given respectively in Nair and Sankaran (1991) and Consul (1995). Applications of these measures in the context of length-biased models are also explored.
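
The paper's identities are not reproduced in the abstract; as a hedged numerical sketch of the two quantities it relates, the snippet below evaluates the failure rate h(t) = f(t)/(1 - F(t)) and the mean residual life m(t) = E[T - t | T > t] for an (untruncated) exponential distribution, where both are constant and satisfy h(t)*m(t) = 1, the classical characterization that the doubly truncated results generalize; the rate parameter is arbitrary.

```python
import math

rate = 0.5  # hypothetical exponential rate parameter

def pdf(t):       return rate * math.exp(-rate * t)
def survival(t):  return math.exp(-rate * t)

def failure_rate(t):
    return pdf(t) / survival(t)

def mean_residual_life(t, upper=100.0, steps=100000):
    # m(t) = (1/S(t)) * integral of S(u) du from t to infinity, by quadrature
    h = (upper - t) / steps
    integral = sum(survival(t + (i + 0.5) * h) for i in range(steps)) * h
    return integral / survival(t)

for t in (0.0, 1.0, 5.0):
    h_t, m_t = failure_rate(t), mean_residual_life(t)
    print(f"t={t}: h(t)={h_t:.4f}, m(t)={m_t:.4f}, h*m={h_t * m_t:.4f}")
```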

Relevance: 20.00%

Abstract:

Using the independent particle model as our basis, we present a scheme to reduce the complexity and computational effort needed to calculate inclusive probabilities in many-electron collision systems. As an example we present an application to K-K charge transfer in collisions of 2.6 MeV Ne^9+ on Ne. We are able to give impact-parameter-dependent probabilities for many-particle states which could lead to KLL Auger electrons after the collision, and we compare with experimental values.
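
The paper's scheme itself is not given in the abstract; as a rough sketch of the combinatorial core of inclusive probabilities in an independent-particle picture (ignoring antisymmetrization and rearrangement corrections), the snippet below assumes each electron transfers independently with its own single-particle probability and computes the probability that exactly k electrons transfer (a Poisson-binomial distribution) by dynamic programming, rather than by summing over all 2^n configurations; the probabilities are invented.

```python
def inclusive_probabilities(p):
    """Probability that exactly k of the independent single-particle
    events occur, for k = 0..n (Poisson-binomial), via dynamic programming."""
    probs = [1.0]  # distribution over k after considering zero particles
    for pi in p:
        new = [0.0] * (len(probs) + 1)
        for k, q in enumerate(probs):
            new[k]     += q * (1.0 - pi)  # particle i does not transfer
            new[k + 1] += q * pi          # particle i transfers
        probs = new
    return probs

# Hypothetical single-particle transfer probabilities at one impact parameter.
p_single = [0.30, 0.25, 0.10, 0.05]
dist = inclusive_probabilities(p_single)
for k, q in enumerate(dist):
    print(f"P(exactly {k} transfers) = {q:.4f}")
print("P(at least 1 transfer) =", f"{1.0 - dist[0]:.4f}")
```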

Relevance: 20.00%

Abstract:

Using the single-particle amplitudes from a 20-level coupled-channel calculation with ab initio relativistic self-consistent LCAO-MO Dirac-Fock-Slater energy eigenvalues and matrix elements, we calculate impact-parameter-dependent K-hole transfer probabilities within the framework of the inclusive probability formalism. As an example we show results for the heavy asymmetric collision system S^15+ on Ar for impact energies from 4.7 to 16 MeV. The inclusive probability formalism, which reinstates the many-particle aspect of the collision system, permits a qualitative and quantitative agreement with experiment that is not achieved by the single-particle picture.

Relevance: 20.00%

Abstract:

This bachelor’s thesis examines the crisis of hegemonic masculinities in David Lodge’s Campus Trilogy. In the course of the thesis, I demonstrate that the male characters in the novels aspire to hegemonic ideals of masculinity, but that ultimately most of them fail in their aspirations. However, I also show that this failure does not lead to the abandonment of the pursuit, but merely to its reformulation and to the male characters’ continued attempts to live up to the reformulated ideal. To achieve this, I conduct a close reading of the novels and, based on it, first determine the predominant types of hegemonic masculinity in each novel and then whether certain characters aspire to these hegemonic ideals. Next, I analyze whether or not they are successful. This analysis is chiefly based on the sociological concept of hegemonic masculinities developed by Connell. With the help of this concept, the thesis shows that several types of masculinity can be identified in the novels and that they exist in hierarchical relation to each other. Furthermore, it shows that these aspirations and the ideals themselves are always prone to crises brought on by societal changes in their environment. In most cases, however, these crises lead not to the collapse of the ideal or the failure of its pursuit, but to the reformulation and continuation of both.

Relevance: 20.00%

Abstract:

Graphical techniques for modeling the dependencies of random variables have been explored in a variety of different areas including statistics, statistical physics, artificial intelligence, speech recognition, image processing, and genetics. Formalisms for manipulating these models have been developed relatively independently in these research communities. In this paper we explore hidden Markov models (HMMs) and related structures within the general framework of probabilistic independence networks (PINs). The paper contains a self-contained review of the basic principles of PINs. It is shown that the well-known forward-backward (F-B) and Viterbi algorithms for HMMs are special cases of more general inference algorithms for arbitrary PINs. Furthermore, the existence of inference and estimation algorithms for more general graphical models provides a set of analysis tools for HMM practitioners who wish to explore a richer class of HMM structures. Examples of relatively complex models to handle sensor fusion and coarticulation in speech recognition are introduced and treated within the graphical model framework to illustrate the advantages of the general approach.
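
For readers who want the special case in code, here is a minimal sketch of the forward pass of an HMM, one half of the forward-backward procedure that the paper shows to be a special case of general PIN inference; the two-state model and its probabilities are toy values, not from the paper.

```python
def hmm_forward(init, trans, emit, observations):
    """Forward algorithm: returns P(observations) under the HMM.
    init[s]       -- initial state probabilities
    trans[r][s]   -- state transition probabilities
    emit[s][o]    -- emission probabilities
    """
    n_states = len(init)
    # alpha[s] = P(obs[0..t], state_t = s)
    alpha = [init[s] * emit[s][observations[0]] for s in range(n_states)]
    for obs in observations[1:]:
        alpha = [
            emit[s][obs] * sum(alpha[r] * trans[r][s] for r in range(n_states))
            for s in range(n_states)
        ]
    return sum(alpha)

# Toy two-state HMM with two observation symbols (values are illustrative).
init  = [0.6, 0.4]
trans = [[0.7, 0.3], [0.4, 0.6]]
emit  = [[0.9, 0.1], [0.2, 0.8]]
print("P(obs sequence) =", hmm_forward(init, trans, emit, [0, 1, 0]))
```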

Relevance: 20.00%

Abstract:

Compositional data analysis motivated the introduction of a complete Euclidean structure in the simplex of D parts. This was based on the early work of J. Aitchison (1986) and completed recently, when the Aitchison distance in the simplex was associated with an inner product and orthonormal bases were identified (Aitchison and others, 2002; Egozcue and others, 2003). A partition of the support of a random variable generates a composition by assigning the probability of each interval to a part of the composition. One can imagine the partition being refined until the probability density represents a kind of continuous composition of probabilities in a simplex of infinitely many parts. This intuitive idea leads to a Hilbert space of probability densities, obtained by generalizing the Aitchison geometry for compositions in the simplex to the set of probability densities.
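
To make the partition-to-composition construction concrete (a sketch added here, not from the paper), the snippet below partitions the support of a standard exponential random variable into four intervals, assigns each interval's probability to a part, and applies the centered log-ratio (clr) transform associated with the Aitchison geometry; the cut points are arbitrary.

```python
import math

def exp_cdf(x, rate=1.0):
    return 1.0 - math.exp(-rate * x)

def composition_from_partition(cut_points, cdf):
    """Probability of each interval of the partition, closed to a composition."""
    edges = [0.0] + cut_points + [float("inf")]
    parts = [1.0 - cdf(a) if b == float("inf") else cdf(b) - cdf(a)
             for a, b in zip(edges[:-1], edges[1:])]
    total = sum(parts)
    return [p / total for p in parts]  # closure to the simplex

def clr(composition):
    """Centered log-ratio transform: log(x_i) minus the mean of the logs."""
    logs = [math.log(p) for p in composition]
    mean = sum(logs) / len(logs)
    return [l - mean for l in logs]

comp = composition_from_partition([0.5, 1.0, 2.0], exp_cdf)  # 4-part composition
print("composition:", [round(p, 4) for p in comp])
print("clr coords :", [round(c, 4) for c in clr(comp)])
```

Refining the partition (more cut points) gives compositions with more parts, which is exactly the limiting process the abstract alludes to.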

Relevance: 20.00%

Abstract:

The Aitchison vector space structure for the simplex is generalized to a Hilbert space structure A2(P) for distributions and likelihoods on arbitrary spaces. Central notions of statistics, such as information or likelihood, can be identified in the algebraic structure of A2(P), together with their corresponding notions in compositional data analysis, such as the Aitchison distance or the centered log-ratio transform. In this way rather elaborate aspects of mathematical statistics can be understood easily in the light of a simple vector space structure and of compositional data analysis. For example, combinations of statistical information, such as Bayesian updating or the combination of likelihood and robust M-estimation functions, are simple additions/perturbations in A2(Pprior). Weighting observations corresponds to a weighted addition of the corresponding evidence. Likelihood-based statistics for general exponential families turns out to have a particularly easy interpretation in terms of A2(P). Regular exponential families form finite-dimensional linear subspaces of A2(P), and they correspond to finite-dimensional subspaces formed by their posteriors in the dual information space A2(Pprior). The Aitchison norm can be identified with mean Fisher information. The closing constant itself is identified with a generalization of the cumulant function and shown to be the Kullback-Leibler directed information. Fisher information is the local geometry of the manifold induced by the A2(P) derivative of the Kullback-Leibler information, and the space A2(P) can therefore be seen as the tangential geometry of statistical inference at the distribution P. The discussion of A2(P)-valued random variables, such as estimation functions or likelihoods, gives a further interpretation of Fisher information as the expected squared norm of evidence, and a scale-free understanding of unbiased reasoning.
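
As a finite-dimensional illustration of the claim that Bayesian updating is a perturbation (a sketch of the idea, not the paper's construction), the snippet below shows that perturbing a discrete prior by a likelihood, i.e. componentwise multiplication followed by closure, which is the "addition" of the Aitchison geometry, reproduces Bayes' rule; the three-state problem is invented.

```python
def close(v):
    """Closure: rescale a positive vector so its components sum to one."""
    s = sum(v)
    return [x / s for x in v]

def perturb(p, q):
    """Aitchison perturbation (the 'addition' of the simplex geometry)."""
    return close([pi * qi for pi, qi in zip(p, q)])

# Hypothetical discrete problem with three states.
prior      = close([1.0, 1.0, 1.0])      # uniform prior
likelihood = [0.7, 0.2, 0.1]             # L(theta_i; data), up to a constant

posterior = perturb(prior, likelihood)   # Bayes' rule as a perturbation
print("posterior:", [round(p, 4) for p in posterior])
```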

Relevance: 20.00%

Abstract:

Memory errors are a common cause of incorrect software execution and security vulnerabilities. We have developed two new techniques that help software continue to execute successfully through memory errors: failure-oblivious computing and boundless memory blocks. The foundation of both techniques is a compiler that generates code that checks accesses via pointers to detect out-of-bounds accesses. Instead of terminating or throwing an exception, the generated code takes another action that keeps the program executing without memory corruption. Failure-oblivious code simply discards invalid writes and manufactures values to return for invalid reads, enabling the program to continue its normal execution path. Code that implements boundless memory blocks stores invalid writes away in a hash table, to be returned as the values for corresponding out-of-bounds reads. The net effect is to (conceptually) give each allocated memory block unbounded size and to eliminate out-of-bounds accesses as a programming error. We have implemented both techniques and acquired several widely used open-source servers (Apache, Sendmail, Pine, Mutt, and Midnight Commander). With standard compilers, all of these servers are vulnerable to buffer overflow attacks, as documented at security tracking web sites. Both failure-oblivious computing and boundless memory blocks eliminate these security vulnerabilities (as well as other memory errors). Our results show that our compiler enables the servers to execute successfully through buffer overflow attacks and to continue to correctly service user requests without security vulnerabilities.
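
The techniques themselves live in a C compiler; purely as a conceptual sketch of the access semantics described above (not the authors' implementation), the snippet below models a boundless memory block in Python: out-of-bounds writes are stashed in a hash table instead of corrupting memory, and out-of-bounds reads return the stashed value or a manufactured one. Dropping the hash table and always manufacturing read values would give the failure-oblivious variant.

```python
class BoundlessBlock:
    """Conceptual sketch of a boundless memory block: out-of-bounds writes
    go to a hash table and are returned by matching out-of-bounds reads."""

    def __init__(self, size, manufactured=0):
        self._data = [0] * size
        self._overflow = {}            # index -> value, for OOB accesses
        self._manufactured = manufactured

    def write(self, index, value):
        if 0 <= index < len(self._data):
            self._data[index] = value
        else:
            self._overflow[index] = value   # stash instead of corrupting memory

    def read(self, index):
        if 0 <= index < len(self._data):
            return self._data[index]
        # manufacture a value if nothing was stashed at this index
        return self._overflow.get(index, self._manufactured)

block = BoundlessBlock(size=4)
block.write(10, 99)        # out of bounds: stashed, no corruption
print(block.read(10))      # 99 (stashed write), instead of undefined behavior
print(block.read(37))      # 0  (manufactured value)
```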

Relevance: 20.00%

Abstract:

In this paper a novel methodology is introduced, aimed at minimizing the probability of network failure and the failure impact (in terms of QoS degradation) while optimizing resource consumption. A detailed study of MPLS recovery techniques and their GMPLS extensions is also presented. In this scenario, features that reduce the failure impact while offering minimum failure probabilities are analyzed as well. Novel two-step routing algorithms using this methodology are proposed. Results show that these methods offer high protection levels with optimal resource consumption.
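
The two-step algorithms are not specified in the abstract; as a hedged sketch of one natural building block, the snippet below finds the path that minimizes end-to-end failure probability by running Dijkstra with edge weights -log(1 - p_link), since minimizing 1 - prod(1 - p_link) over a path is equivalent to minimizing that sum; the topology and per-link failure probabilities are invented.

```python
import heapq
import math

def min_failure_path(graph, src, dst):
    """graph[u] = [(v, p_fail), ...]; returns (path, failure_probability).
    Dijkstra on weights -log(1 - p) minimizes 1 - prod(1 - p) over the path."""
    dist = {src: 0.0}
    prev = {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, p in graph[u]:
            nd = d - math.log(1.0 - p)
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1], 1.0 - math.exp(-dist[dst])

# Hypothetical topology: per-link failure probabilities.
graph = {
    "A": [("B", 0.01), ("C", 0.05)],
    "B": [("D", 0.02)],
    "C": [("D", 0.01)],
    "D": [],
}
print(min_failure_path(graph, "A", "D"))  # expect A-B-D, about 0.0298
```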

Relevance: 20.00%

Abstract:

Objective: To determine whether the presence of severe arterial hypertension, different age ranges, male sex, dyslipidemia under treatment, obesity, or multiple clinical risk predictors is associated with a higher probability of developing acute heart failure, non-fatal acute myocardial infarction, or death of cardiovascular origin in patients older than 65 years undergoing a non-cardiac surgical procedure. Materials and Methods: A retrospective cohort study was carried out of patients undergoing non-cardiac surgery at the Fundación Cardioinfantil between January 1, 2003 and December 31, 2005. Statistical significance was determined by bivariate analysis, Student's t-test and the chi-squared test, with subsequent logistic regression. Results: Of 1,600 events analyzed, 9.2% presented cardiovascular complications, of which 40.1% were fatal. Smoking, severe arterial hypertension and a greater number of clinical risk predictors were associated with fatal and non-fatal outcomes, with relative risks of 5.3 (95% CI, 3.7-7.69), 4.27 (95% CI, 2.56-7.11) and up to 18.86 (95% CI, 9.59-37.1) respectively. Dyslipidemia, male sex and advanced age were associated with non-fatal events, with RRs of 3.1 (95% CI, 1.97-4.87), 1.67 (95% CI, 1.06-2.62) and 2.49 (95% CI, 1.03-6.05) respectively. Conclusions: The risk factors for coronary disease in the general population are also risk factors for perioperative cardiovascular complications in non-cardiac surgery.
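
The study's raw counts are not reported in the abstract; as a generic sketch of how a relative risk and its 95% confidence interval are obtained from a 2x2 cohort table (the Katz log interval), the snippet below uses made-up counts, not the study's data.

```python
import math

def relative_risk(a, b, c, d):
    """2x2 cohort table: exposed (a events / b no-event),
    unexposed (c events / d no-event). Returns (RR, 95% CI)."""
    rr = (a / (a + b)) / (c / (c + d))
    # Standard error of log(RR), normal-approximation (Katz) interval
    se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, (lo, hi)

# Hypothetical counts: 40/200 exposed vs 30/800 unexposed had the outcome.
rr, (lo, hi) = relative_risk(40, 160, 30, 770)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```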

Relevance: 20.00%

Abstract:

In networks with small buffers, such as networks based on optical packet switching, the convolution approach is one of the most accurate methods for connection admission control. Admission control and resource management have been addressed in other works oriented to bursty traffic and ATM. This paper focuses on heterogeneous traffic in OPS-based networks. For heterogeneous traffic in bufferless networks, the enhanced convolution approach is a good solution. However, both methods (CA and ECA) incur a high computational cost when the number of connections is large. Two new mechanisms (UMCA and ISCA), based on the Monte Carlo method, are proposed to overcome this drawback. Simulation results show that our proposals achieve a lower computational cost than the enhanced convolution approach, with a small stochastic error in the probability estimation.
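
Neither UMCA nor ISCA is described in the abstract; as a hedged sketch of the two ingredients being compared, the snippet below computes the overflow probability P(offered load > capacity) for independent on-off connections both exactly, by convolving the per-connection rate distributions (the convolution approach in miniature), and approximately, by plain Monte Carlo sampling; the traffic mix and capacity are invented.

```python
import random
from collections import defaultdict

# Hypothetical heterogeneous connections: (peak_rate, activity_probability)
connections = [(10, 0.3)] * 20 + [(25, 0.1)] * 8
capacity = 120

def overflow_convolution(conns, cap):
    """Exact: convolve the two-point rate distributions of all connections."""
    dist = {0: 1.0}
    for rate, p_on in conns:
        new = defaultdict(float)
        for load, prob in dist.items():
            new[load] += prob * (1.0 - p_on)       # connection silent
            new[load + rate] += prob * p_on        # connection active
        dist = new
    return sum(prob for load, prob in dist.items() if load > cap)

def overflow_monte_carlo(conns, cap, samples=100000):
    """Estimate: sample each connection's on/off state independently."""
    hits = 0
    for _ in range(samples):
        load = sum(rate for rate, p_on in conns if random.random() < p_on)
        hits += load > cap
    return hits / samples

print("convolution :", overflow_convolution(connections, capacity))
print("monte carlo :", overflow_monte_carlo(connections, capacity))
```

With many connections the exact convolution's state space grows quickly, which is the computational-cost issue the Monte Carlo estimators are meant to sidestep.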

Relevance: 20.00%

Abstract:

INTRODUCTION: Contrast-induced acute kidney injury (CIAKI) is defined as a deterioration of renal function within 48 hours of the administration of radiopharmaceuticals. Several clinical interventions have been studied to prevent this event, such as the administration of N-acetylcysteine (NAC) prior to the diagnostic procedure. Previous meta-analyses have compared different clinical interventions to prevent CIAKI without conclusive results. The present meta-analysis examines the evidence for the efficacy of administering NAC prior to the diagnostic procedure to prevent CIAKI in patients with pre-existing nephropathy. METHODOLOGY: Three independent reviewers analyzed randomized controlled trials in populations with nephropathy, both published and from the grey literature, in English, Spanish, Portuguese, Italian, German and French, comparing NAC versus hydration or placebo. RESULTS: The studies found were published between 2000 and 2010. Thirty-seven reports with 6,022 patients were analyzed. The intraclass correlation for the three observers was kappa r = 0.97 (95% CI 0.93-0.99). Patients with nephropathy who received NAC prior to the procedure had a 34% lower probability of developing acute renal failure, with an OR adjusted for publication bias of 0.66 (95% CI 0.50-0.88), p = 0.002. In a sensitivity analysis using the Jadad scale, the results in low-quality studies were not significant (p = 0.202). CONCLUSION: The results support an association between preventive treatment with N-acetylcysteine and a lower frequency of contrast-induced acute kidney injury in patients with nephropathy.
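
The abstract reports only the pooled estimate; as a generic sketch of one standard pooling method (inverse-variance fixed-effect weighting on the log odds-ratio scale, not necessarily the method used in this meta-analysis), the snippet below pools made-up 2x2 tables.

```python
import math

def pooled_odds_ratio(studies):
    """studies: list of 2x2 tables (a, b, c, d) =
    (treated events, treated non-events, control events, control non-events).
    Inverse-variance fixed-effect pooling on the log-OR scale."""
    num = den = 0.0
    for a, b, c, d in studies:
        log_or = math.log((a * d) / (b * c))
        var = 1/a + 1/b + 1/c + 1/d          # Woolf variance of log(OR)
        w = 1.0 / var
        num += w * log_or
        den += w
    pooled = num / den
    se = math.sqrt(1.0 / den)
    return (math.exp(pooled),
            (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se)))

# Hypothetical trials: NAC arm vs control arm event counts.
studies = [(8, 92, 15, 85), (5, 120, 11, 114), (12, 188, 18, 182)]
or_, (lo, hi) = pooled_odds_ratio(studies)
print(f"pooled OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```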

Relevance: 20.00%

Abstract:

This is a presentation given to 3rd-year project students on our BSc degree programmes to help them project-manage their 3rd-year dissertations. It covers three practical methods. Fact: skills audits, to help make projects realistic. Failure: risk assessment, to help with contingency planning. Fiction: Gantt charts, to help with managing time and effort.