968 results for Weibull Probability Plot


Relevance: 20.00%

Publisher:

Abstract:

In this paper, we propose a new three-parameter long-term lifetime distribution, the long-term complementary exponential geometric distribution, induced by a latent complementary risk framework and exhibiting decreasing, increasing and unimodal hazard functions. The new distribution arises from latent complementary risk scenarios in which the lifetime associated with a particular risk is not observable; rather, we observe only the maximum lifetime value among all risks, together with the presence of long-term survivors. The properties of the proposed distribution are discussed, including its probability density function and explicit algebraic formulas for its reliability, hazard and quantile functions and order statistics. Parameter estimation is based on the usual maximum-likelihood approach. A simulation study assesses the performance of the estimation procedure. We compare the new distribution with its particular cases, as well as with the long-term Weibull distribution, on three real data sets, observing its potential and competitiveness in comparison with some usual long-term lifetime distributions.
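
The long-term (cure-fraction) structure described above can be sketched directly: the population survival is a mixture of a cured fraction and a proper baseline survival function. The snippet below is a minimal illustration, using an exponential baseline as a stand-in for the paper's complementary exponential geometric baseline (an assumption for illustration only).

```python
import math

def long_term_survival(t, p, base_survival):
    """Population survival with a long-term (cured) fraction p:
    S_pop(t) = p + (1 - p) * S0(t), where S0 is a proper survival function."""
    return p + (1.0 - p) * base_survival(t)

# Stand-in baseline: exponential survival S0(t) = exp(-lam * t).
# The paper's baseline is the complementary exponential geometric;
# the exponential is used here only to illustrate the mixture.
lam = 0.5
S0 = lambda t: math.exp(-lam * t)

p = 0.3  # illustrative cure fraction
print(long_term_survival(0.0, p, S0))    # 1.0 at time zero
print(long_term_survival(100.0, p, S0))  # approaches the plateau p = 0.3
```

As t grows, S_pop(t) levels off at p instead of decaying to zero, which is exactly what distinguishes a long-term lifetime model from a standard one.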

Relevance: 20.00%

Publisher:

Abstract:

We introduce a five-parameter continuous model, called the McDonald inverted beta distribution, to extend the two-parameter inverted beta distribution and provide new four- and three-parameter sub-models. We give a mathematical treatment of the new distribution, including expansions for the density function, moments, generating and quantile functions, mean deviations, entropy and reliability. The model parameters are estimated by maximum likelihood and the observed information matrix is derived. An application of the new model to real data shows that it can consistently give a better fit than other important lifetime models. (C) 2012 The Franklin Institute. Published by Elsevier Ltd. All rights reserved.

Relevance: 20.00%

Publisher:

Abstract:

For any continuous baseline distribution G, G. M. Cordeiro and M. de Castro [A new family of generalized distributions, J. Statist. Comput. Simul. 81 (2011), pp. 883-898] proposed a new generalized distribution (denoted here with the prefix 'Kw-G' (Kumaraswamy-G)) with two extra positive parameters. They studied some of its mathematical properties and presented special sub-models. We derive a simple representation for the Kw-G density function as a linear combination of exponentiated-G distributions. Some new distributions are proposed as sub-models of this family, for example, the Kw-Chen [Z. A. Chen, A new two-parameter lifetime distribution with bathtub shape or increasing failure rate function, Statist. Probab. Lett. 49 (2000), pp. 155-161], Kw-XTG [M. Xie, Y. Tang, and T. N. Goh, A modified Weibull extension with bathtub-shaped failure rate function, Reliab. Eng. System Safety 76 (2002), pp. 279-285] and Kw-Flexible Weibull [M. Bebbington, C. D. Lai, and R. Zitikis, A flexible Weibull extension, Reliab. Eng. System Safety 92 (2007), pp. 719-726]. New properties of the Kw-G distribution are derived, including asymptotes, shapes, moments, moment generating function, mean deviations, Bonferroni and Lorenz curves, reliability, Rényi entropy and Shannon entropy. New properties of the order statistics are investigated. We discuss estimation of the parameters by maximum likelihood. We provide two applications to real data sets and discuss a bivariate extension of the Kw-G distribution.
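
The Kw-G construction can be made concrete: for a baseline cdf G with density g, the Kw-G density is f(x) = a b g(x) G(x)^(a-1) [1 - G(x)^a]^(b-1). A minimal sketch, using a Weibull baseline purely as an illustrative choice:

```python
import math

def kw_g_pdf(x, a, b, G, g):
    """Kumaraswamy-G density:
    f(x) = a * b * g(x) * G(x)**(a-1) * (1 - G(x)**a)**(b-1)."""
    Gx = G(x)
    return a * b * g(x) * Gx ** (a - 1) * (1.0 - Gx ** a) ** (b - 1)

# Illustrative baseline: Weibull(shape k, scale s).
k, s = 2.0, 1.0
G = lambda x: 1.0 - math.exp(-(x / s) ** k)
g = lambda x: (k / s) * (x / s) ** (k - 1) * math.exp(-(x / s) ** k)

# Sanity check: with a = b = 1 the Kw-G density reduces to the baseline g.
x = 0.7
print(kw_g_pdf(x, 1.0, 1.0, G, g), g(x))
```

The two extra parameters a and b reshape the baseline's tails and mode, which is what makes sub-models such as Kw-Chen and Kw-Flexible Weibull possible.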

Relevance: 20.00%

Publisher:

Abstract:

This article introduces generalized beta-generated (GBG) distributions. Sub-models include all classical beta-generated, Kumaraswamy-generated and exponentiated distributions. They are maximum entropy distributions under three intuitive conditions, which show that the classical beta generator skewness parameters only control tail entropy and an additional shape parameter is needed to add entropy to the centre of the parent distribution. This parameter controls skewness without necessarily differentiating tail weights. The GBG class also has tractable properties: we present various expansions for moments, generating function and quantiles. The model parameters are estimated by maximum likelihood and the usefulness of the new class is illustrated by means of some real data sets. (c) 2011 Elsevier B.V. All rights reserved.
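
A sketch of the GBG density helps to see how the sub-models arise: in the form commonly stated for this class, f(x) = c/B(a,b) g(x) G(x)^(ac-1) [1 - G(x)^c]^(b-1), so c = 1 recovers the classical beta-generated class and a = 1 the Kumaraswamy-generated class. A minimal Python illustration, with an exponential parent distribution chosen purely for illustration:

```python
import math

def gbg_pdf(x, a, b, c, G, g):
    """Generalized beta-generated density (as commonly stated):
    f(x) = c/B(a,b) * g(x) * G(x)**(a*c - 1) * (1 - G(x)**c)**(b - 1)."""
    B = math.gamma(a) * math.gamma(b) / math.gamma(a + b)  # beta function B(a,b)
    Gx = G(x)
    return (c / B) * g(x) * Gx ** (a * c - 1) * (1.0 - Gx ** c) ** (b - 1)

# Illustrative parent: exponential with rate 1.
G = lambda x: 1.0 - math.exp(-x)
g = lambda x: math.exp(-x)

# Sanity check: a = b = c = 1 recovers the parent density.
print(gbg_pdf(1.3, 1.0, 1.0, 1.0, G, g), g(1.3))
```

Setting c = 1 or a = 1 in the call shows directly how the beta-generated and Kumaraswamy-generated special cases fall out of the same expression.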

Relevance: 20.00%

Publisher:

Abstract:

For the first time, we introduce a generalized form of the exponentiated generalized gamma distribution [Cordeiro et al., The exponentiated generalized gamma distribution with application to lifetime data, J. Statist. Comput. Simul. 81 (2011), pp. 827-842] that serves as the baseline for the log-exponentiated generalized gamma regression model. The new distribution can accommodate increasing, decreasing, bathtub-shaped and unimodal hazard functions. A second advantage is that it includes classical distributions reported in the lifetime literature as special cases. We obtain explicit expressions for the moments of the baseline distribution of the new regression model. The proposed model can be applied to censored data, since it includes as sub-models several widely known regression models, and can therefore be used more effectively in the analysis of survival data. We obtain maximum likelihood estimates of the model parameters under censoring. We show that our extended regression model is very useful by means of two applications to real data.
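
As a rough illustration of the baseline, the exponentiated generalized gamma cdf can be taken as the generalized gamma cdf raised to a power lam (one common parameterization; the paper's exact parameterization may differ). A self-contained sketch, computing the regularized incomplete gamma by its power series:

```python
import math

def reg_lower_gamma(k, x, terms=300):
    """Regularized lower incomplete gamma P(k, x) via its power series:
    P(k, x) = x**k * exp(-x) / Gamma(k) * sum_n x**n / (k (k+1) ... (k+n))."""
    if x <= 0.0:
        return 0.0
    s, term = 0.0, 1.0 / k
    for n in range(1, terms + 1):
        s += term
        term *= x / (k + n)
    return math.exp(k * math.log(x) - x - math.lgamma(k)) * s

def egg_cdf(x, alpha, beta, k, lam):
    """Exponentiated generalized gamma cdf, taken here as the generalized
    gamma cdf raised to the power lam: F(x) = P(k, (x/alpha)**beta)**lam."""
    return reg_lower_gamma(k, (x / alpha) ** beta) ** lam

# Sanity check: beta = k = lam = 1 reduces to the exponential cdf
# 1 - exp(-x/alpha).
print(egg_cdf(1.0, 1.0, 1.0, 1.0, 1.0))  # ~0.6321
```

Varying beta, k and lam is what lets the family interpolate between increasing, decreasing, bathtub-shaped and unimodal hazards.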

Relevance: 20.00%

Publisher:

Abstract:

We study a five-parameter lifetime distribution, called the McDonald extended exponential model, which generalizes the exponential, generalized exponential, Kumaraswamy exponential and beta exponential distributions, among others. We obtain explicit expressions for the moments and incomplete moments, quantile and generating functions, mean deviations, Bonferroni and Lorenz curves and the Gini concentration index. The method of maximum likelihood and a Bayesian procedure are adopted for estimating the model parameters. The applicability of the new model is illustrated by means of a real data set.
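
The Gini concentration index mentioned above has a simple empirical counterpart (the paper derives closed-form expressions for the model itself; the sketch below is the sample version based on the mean absolute difference):

```python
def gini(sample):
    """Empirical Gini concentration index via the mean absolute difference:
    G = sum_i sum_j |x_i - x_j| / (2 * n**2 * mean(x))."""
    n = len(sample)
    mean = sum(sample) / n
    mad = sum(abs(xi - xj) for xi in sample for xj in sample)
    return mad / (2.0 * n * n * mean)

# Perfect equality gives G = 0; full concentration pushes G towards
# its sample maximum (n - 1) / n.
print(gini([1.0, 1.0, 1.0, 1.0]))           # 0.0
print(round(gini([0.0, 0.0, 0.0, 4.0]), 2)) # 0.75
```

For a fitted lifetime model, the same index can be obtained by plugging the model's Lorenz curve into the usual Gini integral instead of using sample pairs.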

Relevance: 20.00%

Publisher:

Abstract:

This paper analyzes concepts of independence and assumptions of convexity in the theory of sets of probability distributions. The starting point is Kyburg and Pittarelli's discussion of "convex Bayesianism" (in particular their proposals concerning E-admissibility, independence, and convexity). The paper offers an organized review of the literature on independence for sets of probability distributions; new results on graphoid properties and on the justification of "strong independence" (using exchangeability) are presented. Finally, the connection between Kyburg and Pittarelli's results and recent developments on the axiomatization of non-binary preferences, and its impact on "complete" independence, are described.
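
E-admissibility, one of the criteria discussed above, can be illustrated with a toy credal set: an act is E-admissible if it maximizes expected utility under at least one distribution in the set. All names and numbers below are purely illustrative.

```python
def e_admissible(acts, credal_set):
    """Return the acts that maximize expected utility under at least one
    distribution in the credal set (the E-admissibility criterion).
    acts: dict mapping act name -> utility vector over states.
    credal_set: list of probability vectors over the same states."""
    admissible = set()
    for p in credal_set:
        eu = {name: sum(pi * ui for pi, ui in zip(p, u))
              for name, u in acts.items()}
        best = max(eu.values())
        admissible.update(n for n, v in eu.items() if abs(v - best) < 1e-12)
    return admissible

# Toy example: two states, three acts, and a credal set given by its
# two extreme points.
acts = {"a1": [1.0, 0.0], "a2": [0.0, 1.0], "a3": [0.4, 0.4]}
credal = [[0.8, 0.2], [0.2, 0.8]]
print(sorted(e_admissible(acts, credal)))  # ['a1', 'a2']; a3 is never optimal
```

Since expected utility is linear in the probabilities, enlarging the credal set to its convex hull cannot make a strictly dominated act like a3 optimal, which is one way the convexity assumptions analyzed in the paper interact with E-admissibility.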

Relevance: 20.00%

Publisher:

Abstract:

Objectives To evaluate the accuracy and probabilities of different fetal ultrasound parameters to predict neonatal outcome in isolated congenital diaphragmatic hernia (CDH). Methods Between January 2004 and December 2010, we prospectively evaluated 108 fetuses with isolated CDH (82 left-sided and 26 right-sided). The following parameters were evaluated: gestational age at diagnosis, side of the diaphragmatic defect, presence of polyhydramnios, presence of liver herniated into the fetal thorax (liver-up), lung-to-head ratio (LHR) and observed/expected LHR (o/e-LHR), observed/expected contralateral and total fetal lung volume (o/e-ContFLV and o/e-TotFLV) ratios, ultrasonographic fetal lung volume/fetal weight ratio (US-FLW), observed/expected contralateral and main pulmonary artery diameter (o/e-ContPA and o/e-MPA) ratios and the contralateral vascularization index (Cont-VI). The outcomes were neonatal death and severe postnatal pulmonary arterial hypertension (PAH). Results Neonatal mortality was 64.8% (70/108). Severe PAH was diagnosed in 68 (63.0%) cases, of which 63 died neonatally (92.6%) (P < 0.001). Gestational age at diagnosis, side of the defect and polyhydramnios were not associated with poor outcome (P > 0.05). LHR, o/e-LHR, liver-up, o/e-ContFLV, o/e-TotFLV, US-FLW, o/e-ContPA, o/e-MPA and Cont-VI were associated with both neonatal death and severe postnatal PAH (P < 0.001). Receiver operating characteristics curves indicated that measuring total lung volumes (o/e-TotFLV and US-FLW) was more accurate than considering only the contralateral lung size (LHR, o/e-LHR and o/e-ContFLV; P < 0.05), and Cont-VI was the most accurate ultrasound parameter for predicting neonatal death and severe PAH (P < 0.001). Conclusions Evaluating total lung volumes is more accurate than measuring only the contralateral lung size. Evaluating pulmonary vascularization (Cont-VI) is the most accurate predictor of neonatal outcome. Estimating the probability of survival and severe PAH allows classification of cases according to prognosis. Copyright (C) 2011 ISUOG. Published by John Wiley & Sons, Ltd.

Relevance: 20.00%

Publisher:

Abstract:

Abstract Background Smear-negative pulmonary tuberculosis (SNPTB) accounts for 30% of pulmonary tuberculosis (PTB) cases reported annually in developing nations. Polymerase chain reaction (PCR) may provide an alternative for the rapid detection of Mycobacterium tuberculosis (MTB); however, few data are available regarding the clinical utility of PCR in SNPTB in a setting with a high burden of TB/HIV co-infection. Methods To evaluate the performance of the PCR dot-blot in parallel with pretest probability (clinical suspicion) in patients suspected of having SNPTB, a prospective study of 213 individuals with clinical and radiological suspicion of SNPTB was carried out from May 2003 to May 2004 in a TB/HIV reference hospital. Respiratory specialists classified the pretest probability of active disease into high, intermediate and low categories. Expectorated sputum was examined by direct microscopy (Ziehl-Neelsen staining), culture (Lowenstein-Jensen) and PCR dot-blot. The gold standard was based on culture positivity combined with the clinical definition of PTB. Results In smear-negative and HIV subjects, active PTB was diagnosed in 28.4% (43/151) and 42.2% (19/45), respectively. In the high, intermediate and low pretest probability categories, active PTB was diagnosed in 67.4% (31/46), 24% (6/25) and 7.5% (6/80), respectively. PCR had a sensitivity of 65% (95% CI: 50%-78%) and a specificity of 83% (95% CI: 75%-89%). There was no difference in the sensitivity of PCR in relation to HIV status. Among patients not previously treated for TB and those treated in the past, PCR sensitivity was 69% and 43%, and specificity was 85% and 80%, respectively. High pretest probability, when used as a diagnostic test, had a sensitivity of 72% (95% CI: 57%-84%) and a specificity of 86% (95% CI: 78%-92%). Using the PCR dot-blot in parallel with high pretest probability as a diagnostic test, the sensitivity, specificity, and positive and negative predictive values were 90%, 71%, 75% and 88%, respectively. Among patients not previously treated for TB and HIV subjects, this approach had sensitivity, specificity, and positive and negative predictive values of 91%, 79%, 81% and 90%, and 90%, 65%, 72% and 88%, respectively. Conclusion The PCR dot-blot associated with a high clinical suspicion may provide an important contribution to the diagnosis of SNPTB, mainly in patients not previously treated who attend a TB/HIV reference hospital.
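
Under a conditional-independence assumption between the two tests (not stated in the abstract), the standard "parallel" combination rule approximately reproduces the joint sensitivity and specificity reported above:

```python
def parallel_combination(sens1, spec1, sens2, spec2):
    """'Positive if either test is positive' rule: sensitivity rises,
    specificity falls, assuming conditional independence of the tests.
    sens = 1 - (1 - s1)(1 - s2);  spec = sp1 * sp2."""
    sens = 1.0 - (1.0 - sens1) * (1.0 - sens2)
    spec = spec1 * spec2
    return sens, spec

# PCR dot-blot (65% / 83%) combined with high pretest probability
# treated as a test (72% / 86%), values taken from the abstract.
sens, spec = parallel_combination(0.65, 0.83, 0.72, 0.86)
print(round(sens, 2), round(spec, 2))  # 0.9 0.71, matching the reported 90%/71%
```

This shows why the parallel strategy is attractive for SNPTB: it trades a drop in specificity (83% to 71%) for a large gain in sensitivity (65% to 90%).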

Relevance: 20.00%

Publisher:

Abstract:

OBJECTIVE: To estimate the pretest probability of Cushing's syndrome (CS) diagnosis by a Bayesian approach using intuitive clinical judgment. MATERIALS AND METHODS: Physicians were asked, in seven endocrinology meetings, to answer three questions: "Based on your personal expertise, after obtaining clinical history and physical examination, without using laboratory tests, what is your probability of diagnosing Cushing's syndrome?"; "For how long have you been practicing endocrinology?"; and "Where do you work?". A Bayesian beta regression, using the WinBUGS software, was employed. RESULTS: We obtained 294 questionnaires. The mean pretest probability of CS diagnosis was 51.6% (95% CI: 48.7-54.3). The probability was directly related to experience in endocrinology, but not to place of work. CONCLUSION: The pretest probability of CS diagnosis was estimated using a Bayesian methodology. Although pretest likelihood can be context-dependent, experience based on years of practice may help the practitioner to diagnose CS. Arq Bras Endocrinol Metab. 2012;56(9):633-7
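
A pretest probability like the 51.6% elicited above is useful precisely because it converts to a posttest probability once a test result arrives, via Bayes' theorem in odds form. The likelihood ratio below is purely illustrative (e.g. a hypothetical positive screening test with LR+ = 10), not a value from the paper:

```python
def posttest_probability(pretest, lr):
    """Bayes' theorem in odds form:
    posterior odds = prior odds * likelihood ratio."""
    pre_odds = pretest / (1.0 - pretest)
    post_odds = pre_odds * lr
    return post_odds / (1.0 + post_odds)

# Mean pretest probability of CS from the survey: 51.6%.
# LR+ = 10 is an illustrative assumption for a strongly positive test.
print(round(posttest_probability(0.516, 10.0), 3))  # 0.914
```

With a likelihood ratio of 1 (an uninformative test) the posttest probability equals the pretest probability, which is a quick sanity check on the formula.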

Relevance: 20.00%

Publisher:

Abstract:

OBJECTIVE: The objective of this study was to evaluate the frequencies of human platelet antigens in oncohematological patients with thrombocytopenia and to analyze the probability of their incompatibility with platelet transfusions. METHODS: Platelet antigen genotyping was performed by sequence-specific primer polymerase chain reaction (SSP-PCR) for the HPA-1a, HPA-1b, HPA-2a, HPA-2b, HPA-3a, HPA-3b, HPA-4a, HPA-4b, HPA-5a, HPA-5b, HPA-15a and HPA-15b alleles in 150 patients of the Hematology Service of the Hospital das Clínicas (FMUSP). RESULTS: The allele frequencies found were: HPA-1a: 0.837; HPA-1b: 0.163; HPA-2a: 0.830; HPA-2b: 0.170; HPA-3a: 0.700; HPA-3b: 0.300; HPA-4a: 1; HPA-4b: 0; HPA-5a: 0.887; HPA-5b: 0.113; HPA-15a: 0.457 and HPA-15b: 0.543. CONCLUSIONS: Data from the present study show that the a allele is more common in the population than the b allele, except for HPA-15. This suggests that patients homozygous for the b allele are more predisposed to alloimmunization and refractoriness to platelet transfusions from immune causes. Platelet genotyping could be of great value in diagnosing alloimmune thrombocytopenia and in providing compatible platelet concentrates for these patients.
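
The allele frequencies above translate into genotype (and hence incompatibility) probabilities if one assumes Hardy-Weinberg equilibrium, an assumption not stated in the abstract but standard for this kind of calculation:

```python
def genotype_frequencies(p_a):
    """Hardy-Weinberg genotype frequencies for a biallelic marker
    with allele frequencies p_a and p_b = 1 - p_a."""
    p_b = 1.0 - p_a
    return {"aa": p_a ** 2, "ab": 2.0 * p_a * p_b, "bb": p_b ** 2}

# HPA-1 allele frequencies reported in the study: a = 0.837, b = 0.163.
hpa1 = genotype_frequencies(0.837)

# Under HWE, the expected proportion of HPA-1 bb homozygotes -- the
# patients most exposed to anti-HPA-1a alloimmunization from random
# donors -- is about 2.7%.
print(round(hpa1["bb"], 4))  # 0.0266
```

The same function applied to HPA-15 (a = 0.457) shows why that system behaves differently: both homozygous genotypes are common, so incompatibility risk is spread across both alleles.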

Relevance: 20.00%

Publisher:

Abstract:

The theory of the 3D multipole probability tomography method (3D GPT), to image source poles, dipoles, quadrupoles and octopoles of a geophysical vector or scalar field dataset, is developed. A geophysical dataset is assumed to be the response of an aggregation of poles, dipoles, quadrupoles and octopoles. These physical sources are used to reconstruct, without a priori assumptions, the most probable position and shape of the true buried geophysical sources, by determining the location of their centres and the critical points of their boundaries, such as corners, wedges and vertices. This theory is then adapted to the geoelectrical, gravity and self-potential methods. A few synthetic examples using simple geometries and three field examples are discussed in order to demonstrate the notably enhanced resolution power of the new approach. First, the application to a field example related to a dipole-dipole geoelectrical survey carried out in the archaeological park of Pompei is presented. The survey aimed at recognizing remains of the ancient Roman urban network, including roads, squares and buildings, buried under the thick pyroclastic cover that fell during the 79 AD Vesuvius eruption. The revealed anomaly structures are ascribed to well-preserved remnants of some aligned walls of Roman edifices, buried and partially destroyed by the 79 AD Vesuvius pyroclastic fall. Then, a field example related to a gravity survey carried out in the volcanic area of Mount Etna (Sicily, Italy) is presented, aimed at imaging as accurately as possible the differential mass density structure within the first few km of depth inside the volcanic apparatus. An assemblage of vertical prismatic blocks appears to be the most probable gravity model of the Etna apparatus within the first 5 km of depth below sea level. Finally, an experimental SP dataset collected in the Mt. Somma-Vesuvius volcanic district (Naples, Italy) is analysed in order to define the location and shape of the sources of two SP anomalies of opposite sign detected in the northwestern sector of the surveyed area. The modelled sources are interpreted as the polarization state induced by an intense hydrothermal convective flow mechanism within the volcanic apparatus, from the free surface down to about 3 km of depth b.s.l.
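
The core numerical step of probability tomography is a normalized cross-correlation between the measured anomaly values and a scanner function for a trial source placed at each node of the tomospace; the sketch below shows only this normalization step, with a made-up scanner vector, not the method's actual multipole scanner functions:

```python
import math

def occurrence_probability(data, scanner):
    """Source-occurrence probability factor as the normalized
    cross-correlation between measured anomaly values and a scanner
    function evaluated at the same stations; the result lies in [-1, 1],
    with values near +/-1 marking the most probable source positions."""
    num = sum(d * s for d, s in zip(data, scanner))
    den = math.sqrt(sum(d * d for d in data) * sum(s * s for s in scanner))
    return num / den

# Toy example: data that perfectly matches the scanner (up to a positive
# scale factor) yields occurrence probability 1.
scanner = [0.2, 0.5, 1.0, 0.5, 0.2]
data = [2.0 * s for s in scanner]
print(occurrence_probability(data, scanner))  # 1.0
```

Scanning the trial source through a 3D grid and mapping this value is what produces the tomographic images of pole, dipole, quadrupole and octopole centres described above.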

Relevance: 20.00%

Publisher:

Abstract:

The assessment of the RAMS (Reliability, Availability, Maintainability and Safety) performance of a system generally includes the evaluation of the "importance" of its components and/or of the basic parameters of the model through the use of importance measures. The analytical equations proposed in this study allow the estimation of the first-order Differential Importance Measure on the basis of the Birnbaum measures of the components, under the hypothesis of uniform percentage changes of the parameters. Aging phenomena are introduced into the model by assuming exponential-linear or Weibull distributions for the failure probabilities. An algorithm based on a combination of Monte Carlo simulation and cellular automata is applied in order to evaluate the performance of a networked system, made up of source nodes, user nodes and directed edges subject to failure and repair. Importance sampling techniques are used for the estimation of the first- and total-order Differential Importance Measures through a single simulation of the system "operational life". All the output variables are computed simultaneously on the basis of the same sequence of the involved components, event types (failure or repair) and transition times. The failure/repair probabilities are forced to be the same for all components; the transition times are sampled from the unbiased probability distributions, or sampling can also be forced, for instance by assuring the occurrence of at least one failure within the system operational life. The algorithm allows considering different types of maintenance actions: corrective maintenance, performed either immediately upon component failure or, for hidden failures that are not detected until an inspection, upon finding that the component has failed; and preventive maintenance, performed at fixed intervals. It is possible to use a restoration factor to determine the age of the component after a repair or any other maintenance action.
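
Under the uniform-percentage-change hypothesis, the first-order Differential Importance Measure is commonly written as a Birnbaum-weighted share, DIM_i = B_i x_i / sum_j B_j x_j, where B_i is the Birnbaum measure of parameter x_i (this is the standard form of the measure; the paper's analytical equations extend it to aging components). A toy two-component series system illustrates it:

```python
def dim_h2(birnbaum, params):
    """First-order Differential Importance Measure under uniform
    percentage changes of the parameters:
    DIM_i = B_i * x_i / sum_j (B_j * x_j)."""
    contrib = [b * x for b, x in zip(birnbaum, params)]
    total = sum(contrib)
    return [c / total for c in contrib]

# Toy series system with component unreliabilities q1, q2:
# system unreliability Q = 1 - (1 - q1)(1 - q2), so the Birnbaum
# measure of each component is B_i = dQ/dq_i = 1 - q_other.
q1, q2 = 0.1, 0.2
birnbaum = [1.0 - q2, 1.0 - q1]
dims = dim_h2(birnbaum, [q1, q2])
print([round(d, 3) for d in dims])  # [0.308, 0.692]; DIMs sum to 1
```

The additivity property visible here (the DIMs of all parameters sum to one) is what makes the measure convenient for ranking groups of parameters as well as single ones.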

Relevance: 20.00%

Publisher:

Abstract:

Spatial prediction of hourly rainfall via radar calibration is addressed. The change of support problem (COSP), arising when the spatial supports of different data sources do not coincide, is faced in a non-Gaussian setting; in fact, hourly rainfall in the Emilia-Romagna region, in Italy, is characterized by an abundance of zero values and right-skewness of the distribution of positive amounts. Rain gauge direct measurements at sparsely distributed locations and hourly cumulated radar grids are provided by ARPA-SIMC Emilia-Romagna. We propose a three-stage Bayesian hierarchical model for radar calibration, exploiting rain gauges as the reference measure. Rain probability and amounts are modeled via linear relationships with radar in the log scale; spatially correlated Gaussian effects capture the residual information. We employ a probit link for rainfall probability and a Gamma distribution for positive rainfall amounts; the two steps are joined via a two-part semicontinuous model. Three model specifications differently addressing COSP are presented; in particular, a stochastic weighting of all radar pixels, driven by a latent Gaussian process defined on the grid, is employed. Estimation is performed via MCMC procedures implemented in C, linked to the R software. Communication and evaluation of probabilistic, point and interval predictions are investigated. A non-randomized PIT histogram is proposed for correctly assessing calibration and coverage of two-part semicontinuous models. Predictions obtained with the different model specifications are evaluated via graphical tools (reliability plot, sharpness histogram, PIT histogram, Brier score plot and quantile decomposition plot), proper scoring rules (Brier score, continuous ranked probability score) and consistent scoring functions (root mean square error and mean absolute error, addressing the predictive mean and median, respectively). Calibration is reached and the inclusion of neighbouring information slightly improves predictions. All specifications outperform a benchmark model with uncorrelated effects, confirming the relevance of spatial correlation for modeling rainfall probability and accumulation.
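
The two-part semicontinuous structure described above composes cleanly: the predictive mean of rainfall is the probit occurrence probability times the Gamma mean for positive amounts. The sketch below assumes a log link for the Gamma mean, which is an illustrative choice, not necessarily the paper's specification:

```python
import math

def std_normal_cdf(z):
    """Standard normal cdf via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def two_part_mean(eta_occ, eta_amount):
    """Predictive mean of a two-part semicontinuous rainfall model:
    P(rain) from a probit link times the conditional Gamma mean for
    positive amounts (log link assumed for the mean)."""
    p_rain = std_normal_cdf(eta_occ)  # probit: P(Y > 0)
    mu_pos = math.exp(eta_amount)     # conditional mean of positive rainfall
    return p_rain * mu_pos

# Illustrative linear predictors, e.g. intercept + slope * log(radar value):
# eta_occ = 0 gives P(rain) = 0.5, and exp(log 2) = 2 mm as conditional mean.
print(round(two_part_mean(0.0, math.log(2.0)), 3))  # 0.5 * 2.0 = 1.0
```

Keeping the occurrence and amount parts separate is also what makes the non-randomized PIT histogram necessary: the predictive distribution has a point mass at zero plus a continuous Gamma part.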