984 results for EXPONENTIATED WEIBULL DISTRIBUTION


Relevance: 100.00%

Abstract:

In interval-censored survival data, the event of interest is not observed exactly but is only known to occur within some time interval. Such data appear very frequently. In this paper, we are concerned only with parametric forms, and so a location-scale regression model based on the exponentiated Weibull distribution is proposed for modeling interval-censored data. We show that the proposed log-exponentiated Weibull regression model for interval-censored data represents a parametric family of models that includes other regression models that are broadly used in lifetime data analysis. For interval-censored data, we employ a frequentist analysis, a jackknife estimator, a parametric bootstrap and a Bayesian analysis for the parameters of the proposed model. We derive the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes and present some ways to assess global influence. Furthermore, various simulations are performed for different parameter settings, sample sizes and censoring percentages; in addition, the empirical distribution of some modified residuals is displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to a modified deviance residual in log-exponentiated Weibull regression models for interval-censored data. (C) 2009 Elsevier B.V. All rights reserved.
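The building blocks of such a model can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: the exponentiated Weibull distribution function is F(t) = (1 - exp(-(t/sigma)^k))^alpha, and an observation known only to fall in the interval (l, r] contributes log(F(r) - F(l)) to the log-likelihood. The parameter values and censoring intervals below are invented for the example.

```python
import math

def ew_cdf(t, alpha, k, sigma):
    """CDF of the exponentiated Weibull distribution:
    F(t) = (1 - exp(-(t/sigma)^k))^alpha, for t > 0."""
    return (1.0 - math.exp(-((t / sigma) ** k))) ** alpha

def interval_censored_loglik(intervals, alpha, k, sigma):
    """Log-likelihood for interval-censored data: each event is only
    known to lie in (l, r], contributing log(F(r) - F(l))."""
    ll = 0.0
    for l, r in intervals:
        p = ew_cdf(r, alpha, k, sigma) - (ew_cdf(l, alpha, k, sigma) if l > 0 else 0.0)
        ll += math.log(p)
    return ll

# Hypothetical censoring intervals (l, r] for five subjects.
data = [(0.0, 1.2), (0.5, 2.0), (1.0, 3.5), (2.0, 4.0), (0.8, 1.5)]
ll = interval_censored_loglik(data, alpha=1.5, k=2.0, sigma=2.5)
```

Maximizing this function over (alpha, k, sigma) would give the frequentist point estimates the abstract refers to.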

Relevance: 100.00%

Abstract:

In many applications of lifetime data analysis, it is important to perform inferences about the change-point of the hazard function. The change-point can be a maximum for unimodal hazard functions or a minimum for bathtub-shaped hazard functions, and is usually of great interest in medical or industrial applications. For lifetime distributions where this change-point can be calculated analytically, its maximum likelihood estimator is easily obtained from the invariance properties of maximum likelihood estimators, and confidence intervals follow from their asymptotic normality. Considering the exponentiated Weibull distribution for the lifetime data, the hazard function can take different forms: constant, increasing, unimodal, decreasing or bathtub-shaped. This model gives great flexibility of fit, but no analytic expression exists for the change-point of the hazard function. We therefore use Markov chain Monte Carlo methods to obtain posterior summaries for the change-point of the hazard function under the exponentiated Weibull distribution.
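For a unimodal hazard, the change-point is the argmax of h(t) = f(t)/(1 - F(t)). Since no closed form exists, a grid search gives a quick numerical sketch; this is illustrative only (the paper uses MCMC for posterior summaries), and the parameter values, chosen so that k < 1 and alpha*k > 1 yield a unimodal hazard, are made up.

```python
import math

def ew_pdf(t, alpha, k, sigma):
    """Density of the exponentiated Weibull distribution."""
    u = (t / sigma) ** k
    base = 1.0 - math.exp(-u)
    return (alpha * k / sigma) * (t / sigma) ** (k - 1) * math.exp(-u) * base ** (alpha - 1)

def ew_cdf(t, alpha, k, sigma):
    return (1.0 - math.exp(-((t / sigma) ** k))) ** alpha

def ew_hazard(t, alpha, k, sigma):
    return ew_pdf(t, alpha, k, sigma) / (1.0 - ew_cdf(t, alpha, k, sigma))

def hazard_changepoint(alpha, k, sigma, t_max=10.0, n=20000):
    """Grid search for the time at which a unimodal hazard peaks."""
    grid = [t_max * (i + 1) / n for i in range(n)]
    return max(grid, key=lambda t: ew_hazard(t, alpha, k, sigma))

# k < 1 and alpha*k > 1 gives a unimodal hazard (illustrative values).
t_star = hazard_changepoint(alpha=4.0, k=0.5, sigma=1.0)
```

In the Bayesian version, this argmax would be recomputed at each MCMC draw of (alpha, k, sigma) to build a posterior sample for the change-point.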

Relevance: 100.00%

Abstract:

Purpose - The purpose of this paper is to present designs for an accelerated life test (ALT). Design/methodology/approach - Bayesian methods and Markov Chain Monte Carlo (MCMC) simulation were used. Findings - The paper proposes a Bayesian method based on MCMC for an ALT under the exponentiated Weibull (EW) distribution (for lifetime) and the Arrhenius model (relating the stress variable to the model parameters). The paper concludes that this is a reasonable alternative to classical statistical methods, since the implementation of the proposed method is simple, does not require advanced computational understanding, and allows inferences on the parameters to be made easily. Using the predictive density of a future observation, a procedure was developed to plan the ALT and also to verify whether the conformance fraction of the manufacturing process reaches some desired quality level. This procedure is useful for statistical process control in many industrial applications. Research limitations/implications - The results may be applied in semiconductor manufacturing. Originality/value - The Exponentiated-Weibull-Arrhenius model has never before been used to plan an ALT. © Emerald Group Publishing Limited.
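The Arrhenius life-stress relation mentioned above is commonly written log scale(T) = beta0 + beta1/T with T in Kelvin, so the acceleration factor between two temperatures depends only on beta1. A minimal sketch with invented coefficients (not the paper's fitted model):

```python
import math

def arrhenius_scale(temp_kelvin, beta0, beta1):
    """Arrhenius life-stress relation: the log of the scale parameter of
    the lifetime distribution is linear in 1/T (T in Kelvin)."""
    return math.exp(beta0 + beta1 / temp_kelvin)

def acceleration_factor(t_use, t_stress, beta1):
    """Ratio of characteristic life at use vs. accelerated temperature;
    beta0 cancels in the ratio."""
    return math.exp(beta1 * (1.0 / t_use - 1.0 / t_stress))

# Hypothetical coefficients: beta1 > 0 means life shortens as temperature rises.
af = acceleration_factor(t_use=328.0, t_stress=398.0, beta1=6000.0)
```

In an EW-Arrhenius ALT, the scale parameter of the EW lifetime distribution at each stress temperature would be given by this relation, with (beta0, beta1) assigned priors and sampled by MCMC.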

Relevance: 100.00%

Abstract:

In this thesis, (outlier-)robust estimators and tests for the unknown parameter of a continuous density function are developed using the likelihood depth introduced by Mizera and Müller (2004). The methods are then applied to three different distributions. For one-dimensional parameters, the likelihood depth of a parameter in a data set is computed as the minimum of the fraction of observations for which the derivative of the log-likelihood with respect to the parameter is non-negative and the fraction for which this derivative is non-positive. The parameter with the greatest depth is therefore the one for which both fractions are equal; it is initially chosen as the estimator, since the likelihood depth is intended to measure how well a parameter fits the data set. Asymptotically, the parameter with the greatest depth is the one for which the probability that the derivative of the log-likelihood with respect to the parameter is non-negative for an observation equals one half. If this does not hold for the underlying parameter, the estimator based on the likelihood depth is biased. This thesis shows how this bias can be corrected so that the corrected estimators are consistent. To develop tests for the parameter, the simplicial likelihood depth introduced by Müller (2005), which is a U-statistic, is used. It turns out that for the same distributions for which the likelihood depth yields biased estimators, the simplicial likelihood depth is an unbiased U-statistic. In particular, its asymptotic distribution is known, and tests for various hypotheses can be formulated. The shift in the depth, however, leads to poor power of the corresponding test for some hypotheses. Corrected tests are therefore introduced, together with conditions under which they are consistent. The thesis consists of two parts.
The first part presents the general theory of the estimators and tests and proves their consistency. The second part applies the theory to three distributions: the Weibull distribution, the Gaussian copula and the Gumbel copula. This shows how the methods of the first part can be used to derive (robust) consistent estimators and tests for the unknown parameter of a distribution. Overall, robust estimators and tests based on likelihood depths can be found for all three distributions. On uncontaminated data, existing standard methods are sometimes superior, but the advantage of the new methods shows in contaminated data and data with outliers.
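The one-dimensional likelihood depth and its bias correction can be illustrated with the exponential distribution, a simpler case than the distributions treated in the thesis. For Exp(lambda), the score is d/dlambda log f(x; lambda) = 1/lambda - x, which is non-negative exactly when x <= 1/lambda, so the depth maximizer satisfies 1/lambda = median(x). This estimator is biased, since the median of Exp(lambda) is ln(2)/lambda; multiplying by ln(2) restores consistency, mirroring the kind of correction developed in the thesis.

```python
import math
import random

def likelihood_depth_exp(lam, data):
    """Likelihood depth of lambda for i.i.d. Exp(lambda) data: the minimum
    of the fraction of points with non-negative score 1/lambda - x and the
    fraction with non-positive score."""
    n = len(data)
    nonneg = sum(1 for x in data if 1.0 / lam - x >= 0) / n
    nonpos = sum(1 for x in data if 1.0 / lam - x <= 0) / n
    return min(nonneg, nonpos)

def depth_estimator_exp(data):
    """Depth maximizer 1/median, and its bias-corrected version ln(2)/median."""
    s = sorted(data)
    n = len(s)
    med = s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])
    return 1.0 / med, math.log(2.0) / med

random.seed(1)
sample = [random.expovariate(2.0) for _ in range(10001)]  # true lambda = 2
raw, corrected = depth_estimator_exp(sample)
```

The raw depth maximizer converges to 1/median = lambda/ln(2), about 44% too large, while the corrected estimator converges to the true lambda.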

Relevance: 100.00%

Abstract:

The geometrical factors defining an adhesive joint are of great importance, as the design strongly conditions the performance of the bond. One of the most relevant geometrical factors is the thickness of the adhesive, since it decisively influences the mechanical properties of the bond and has a clear economic impact on manufacturing processes, particularly for long runs. Traditional mechanical joints (riveting, welding, etc.) are characterised by predictable performance and are very reliable in service conditions. Thus, structural adhesive joints will only be selected in industrial applications with demanding mechanical requirements and adverse environmental conditions if suitable reliability (equal to or higher than that of mechanical joints) is guaranteed. For this purpose, the objective of this paper is to analyse the influence of the adhesive thickness on the mechanical behaviour of the joint and, by means of a statistical analysis based on the Weibull distribution, propose the optimum adhesive thickness combining the best mechanical performance with high reliability. This procedure, which can be applied without great difficulty to other joints and adhesives, provides a general basis for a more reliable use of adhesive bonding and, therefore, for its better and wider use in industrial manufacturing processes.
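As a sketch of the kind of Weibull analysis the abstract describes (not the paper's data or code), two-parameter Weibull strength parameters are often estimated by median-rank regression on the linearized CDF: ln(-ln(1 - F)) = m ln(s) - m ln(s0), where m is the Weibull modulus and s0 the characteristic strength. The sample below is synthetic.

```python
import math
import random

def weibull_mrr(strengths):
    """Median-rank regression for the two-parameter Weibull distribution.
    Linearize F(s) = 1 - exp(-(s/s0)^m) as y = m*ln(s) - m*ln(s0) and fit
    a least-squares line through the points (ln s_(i), ln(-ln(1 - F_i)))."""
    s = sorted(strengths)
    n = len(s)
    xs = [math.log(v) for v in s]
    ys = [math.log(-math.log(1.0 - (i + 0.5) / n)) for i in range(n)]  # midpoint plotting positions
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    m = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sum((x - xbar) ** 2 for x in xs)
    s0 = math.exp(xbar - ybar / m)  # intercept = -m*ln(s0)
    return m, s0

# Synthetic strengths (MPa) drawn from Weibull(m=8, s0=40) via the inverse CDF.
random.seed(7)
sample = [40.0 * (-math.log(1.0 - random.random())) ** (1.0 / 8.0) for _ in range(200)]
m_hat, s0_hat = weibull_mrr(sample)
```

Fitting one such Weibull model per thickness level, then comparing moduli and characteristic strengths across levels, is one way to select the thickness combining high strength with low scatter.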

Relevance: 100.00%

Abstract:

In the photovoltaic field, back-contact solar cell technology has appeared as an alternative to traditional silicon modules. This type of cell places both the positive and negative contacts on the back side of the cell, maximizing the surface exposed to light and simplifying the interconnection of the cells in the module. The Emitter Wrap-Through solar cell structure presents thousands of tiny holes that wrap the emitter from the front surface to the rear surface. These holes are made in a first step over the silicon wafers by means of a laser drilling process. This step is quite harmful from a mechanical point of view, since the holes act as stress concentrators, reducing the strength of the wafers. This paper presents the results of the strength characterization of drilled wafers. The study is carried out by testing the samples with the ring-on-ring device. Finite Element models are developed to simulate the tests, and the stress concentration factor of the drilled wafers under these loading conditions is determined from the FE analysis. Moreover, the material strength is characterized by fitting the fracture stress of the samples to a three-parameter Weibull cumulative distribution function. The parameters obtained are compared with those obtained in the analysis of a set of samples without holes, to validate the method employed for studying the strength of drilled silicon wafers.
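The three-parameter Weibull CDF used to characterize fracture stress adds a threshold stress below which the failure probability is zero. A minimal sketch (the parameter values are illustrative, not the paper's estimates):

```python
import math

def weibull3_cdf(stress, sigma_u, sigma_0, m):
    """Three-parameter Weibull failure probability:
    F(s) = 1 - exp(-((s - sigma_u)/sigma_0)^m) for s > sigma_u, else 0.
    sigma_u is the threshold stress, sigma_0 the scale, m the modulus."""
    if stress <= sigma_u:
        return 0.0
    return 1.0 - math.exp(-(((stress - sigma_u) / sigma_0) ** m))

# Illustrative parameters: threshold 80 MPa, scale 120 MPa, modulus 3.
pf = weibull3_cdf(150.0, sigma_u=80.0, sigma_0=120.0, m=3.0)
```

Fitting (sigma_u, sigma_0, m) to the measured fracture stresses of drilled and undrilled wafers, and comparing the two parameter sets, is the comparison the abstract describes.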

Relevance: 100.00%

Abstract:

2000 Mathematics Subject Classification: 62F25, 62F03.

Relevance: 100.00%

Abstract:

2000 Mathematics Subject Classification: 62E16, 65C05, 65C20.

Relevance: 100.00%

Abstract:

This article introduces generalized beta-generated (GBG) distributions. Sub-models include all classical beta-generated, Kumaraswamy-generated and exponentiated distributions. They are maximum entropy distributions under three intuitive conditions, which show that the classical beta generator skewness parameters only control tail entropy and an additional shape parameter is needed to add entropy to the centre of the parent distribution. This parameter controls skewness without necessarily differentiating tail weights. The GBG class also has tractable properties: we present various expansions for moments, generating function and quantiles. The model parameters are estimated by maximum likelihood and the usefulness of the new class is illustrated by means of some real data sets. (c) 2011 Elsevier B.V. All rights reserved.
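The GBG construction can be checked numerically. Its density is g(x) = c/B(a,b) * f(x) * F(x)^(ac-1) * (1 - F(x)^c)^(b-1), where F is the parent CDF with density f; with a uniform parent this must integrate to one. A minimal sketch with arbitrary parameter values:

```python
import math

def beta_fn(a, b):
    """Beta function via log-gamma, B(a,b) = Gamma(a)Gamma(b)/Gamma(a+b)."""
    return math.exp(math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b))

def gbg_pdf(x, a, b, c, F, f):
    """Generalized beta-generated density for parent CDF F and density f:
    g(x) = c/B(a,b) * f(x) * F(x)^(a*c - 1) * (1 - F(x)^c)^(b - 1)."""
    Fx = F(x)
    return (c / beta_fn(a, b)) * f(x) * Fx ** (a * c - 1) * (1.0 - Fx ** c) ** (b - 1)

# With a uniform(0,1) parent (F(x) = x, f(x) = 1) the density should
# integrate to one; midpoint-rule check over (0, 1):
a, b, c = 2.0, 3.0, 1.5
n = 100000
total = sum(gbg_pdf((i + 0.5) / n, a, b, c, lambda x: x, lambda x: 1.0)
            for i in range(n)) / n
```

Setting c = 1 recovers the classical beta-generated class, and a = b = 1 the exponentiated class G(x) = F(x)^c, the sub-models named in the abstract.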

Relevance: 100.00%

Abstract:

The fit of fracture strength data of brittle materials (Si3N4, SiC, and ZnO) to the Weibull and normal distributions is compared in terms of the Akaike information criterion. For Si3N4, the Weibull distribution fits the data better than the normal distribution, but for ZnO the result is just the opposite. In the case of SiC, the difference is not large enough to make a clear distinction between the two distributions. There is not sufficient evidence to show that the Weibull distribution is always preferred to other distributions, and the uncritical use of the Weibull distribution for strength data is questioned.
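The comparison described above can be reproduced in miniature: fit both models by maximum likelihood and compare AIC = 2p - 2*loglik. This sketch uses synthetic, strongly right-skewed data rather than the paper's Si3N4, SiC or ZnO measurements; for such data the Weibull fit should win, while for near-symmetric data the normal would.

```python
import math
import random

def normal_loglik(data):
    """Maximized Gaussian log-likelihood (MLE mean and variance)."""
    n = len(data)
    mu = sum(data) / n
    var = sum((x - mu) ** 2 for x in data) / n
    return -0.5 * n * (math.log(2 * math.pi * var) + 1.0)

def weibull_loglik(data, k):
    """Weibull log-likelihood profiled in the shape k: the MLE of the
    scale given k is lam = (mean(x^k))^(1/k)."""
    n = len(data)
    lam = (sum(x ** k for x in data) / n) ** (1.0 / k)
    return sum(math.log(k / lam) + (k - 1) * math.log(x / lam) - (x / lam) ** k
               for x in data)

def aic(loglik, n_params=2):
    return 2 * n_params - 2 * loglik

random.seed(3)
data = [random.expovariate(1.0) for _ in range(300)]  # heavily right-skewed

# Crude grid search over the shape k in [0.3, 3.0].
best_wll = max(weibull_loglik(data, 0.3 + 0.01 * i) for i in range(271))
aic_weibull = aic(best_wll)
aic_normal = aic(normal_loglik(data))
```

The smaller AIC identifies the preferred model; repeating this per material is the essence of the comparison in the abstract.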

Relevance: 100.00%

Abstract:

The modeling and analysis of lifetime data is an important aspect of statistical work in a wide variety of scientific and technological fields. Good (1953) introduced a probability distribution which is commonly used in the analysis of lifetime data. For the first time, based on this distribution, we propose the so-called exponentiated generalized inverse Gaussian distribution, which extends the exponentiated standard gamma distribution (Nadarajah and Kotz, 2006). Various structural properties of the new distribution are derived, including expansions for its moments, moment generating function, moments of the order statistics, and so forth. We discuss maximum likelihood estimation of the model parameters. The usefulness of the new model is illustrated by means of a real data set. (c) 2010 Elsevier B.V. All rights reserved.

Relevance: 100.00%

Abstract:

Graduate Program in Applied and Computational Mathematics - FCT

Relevance: 100.00%

Abstract:

We introduce a five-parameter continuous model, called the McDonald inverted beta distribution, to extend the two-parameter inverted beta distribution and provide new four- and three-parameter sub-models. We give a mathematical treatment of the new distribution including expansions for the density function, moments, generating and quantile functions, mean deviations, entropy and reliability. The model parameters are estimated by maximum likelihood and the observed information matrix is derived. An application of the new model to real data shows that it can consistently give a better fit than other important lifetime models. (C) 2012 The Franklin Institute. Published by Elsevier Ltd. All rights reserved.

Relevance: 100.00%

Abstract:

For the first time, we introduce a generalized form of the exponentiated generalized gamma distribution [Cordeiro et al. The exponentiated generalized gamma distribution with application to lifetime data, J. Statist. Comput. Simul. 81 (2011), pp. 827-842.] that is the baseline for the log-exponentiated generalized gamma regression model. The new distribution can accommodate increasing, decreasing, bathtub- and unimodal-shaped hazard functions. A second advantage is that it includes classical distributions reported in the lifetime literature as special cases. We obtain explicit expressions for the moments of the baseline distribution of the new regression model. The proposed model can be applied to censored data since it includes as sub-models several widely known regression models. It therefore can be used more effectively in the analysis of survival data. We obtain maximum likelihood estimates for the model parameters by considering censored data. We show that our extended regression model is very useful by means of two applications to real data.