10 results for Failure Rate

in Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)


Relevance:

60.00%

Publisher:

Abstract:

Identification of all important community members, as well as of the numerically dominant members of a community, are key aspects of microbial community analysis of bioreactor samples. A systematic study was conducted with artificial consortia to test whether denaturing gradient gel electrophoresis (DGGE) is a reliable technique to obtain such community data under conditions where results would not be affected by differences in DNA extraction efficiency from cells. A total of 27 consortia were established by mixing DNA extracted from Escherichia coli K12, Burkholderia cepacia and Stenotrophomonas maltophilia in different proportions. Concentrations of DNA of single organisms in the consortia were either 0.04, 0.4 or 4 ng/µl. DGGE-PCR of genomic DNA with primer sets targeted at the V3 and V6-V8 regions of the 16S rDNA failed to detect the three community members in only 7% of consortia, but provided incorrect information about dominance or co-dominance for 85% and 89% of consortia with the primer sets for the V6-V8 and V3 regions, respectively. The high failure rate in detection of dominant B. cepacia with the primers for the V6-V8 region was attributable to a single-nucleotide primer mismatch in the target sequences of both the forward and reverse primers. Amplification bias in PCR of E. coli and S. maltophilia for the V6-V8 region, and of all three organisms for the V3 region, occurred due to interference of genomic DNA in PCR-DGGE, since a nested PCR approach, in which PCR-DGGE was started from mixtures of 16S rRNA genes of the organisms, provided correct information about the relative abundance of original DNA in the sample. Multiple bands were not observed in pure-culture amplicons produced with the V6-V8 primer pair, but pure-culture V3 DGGE profiles of E. coli, S. maltophilia and B. cepacia contained 5, 3 and 3 bands, respectively.
These results demonstrate that DGGE was suitable for identification of all important community members in the three-membered artificial consortium, but not for identification of the dominant organisms in this small community. Multiple DGGE bands obtained for single organisms with the V3 primer pair could greatly confound interpretation of DGGE profiles. (C) 2008 Elsevier Ltd. All rights reserved.

Relevance:

60.00%

Publisher:

Abstract:

In this paper, we formulate a flexible density function from the selection mechanism viewpoint (see, for example, Bayarri and DeGroot (1992) and Arellano-Valle et al. (2006)) which possesses nice biological and physical interpretations. The new density function contains as special cases many models that have been proposed recently in the literature. In constructing this model, we assume that the number of competing causes of the event of interest has a general discrete distribution characterized by its probability generating function. This function has an important role in the selection procedure as well as in computing the conditional personal cure rate. Finally, we illustrate how various models can be deduced as special cases of the proposed model. (C) 2011 Elsevier B.V. All rights reserved.
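As a concrete illustration of the role of the probability generating function in this competing-causes construction, the sketch below (my own illustration, not taken from the paper) assumes a Poisson-distributed number of causes and an exponential latent lifetime: the population survival is the pgf evaluated at the latent survival, S_pop(t) = A(S(t)), and the cure fraction is A(0) = P(no causes):

```python
import math

def pgf_poisson(s, theta):
    """Probability generating function A(s) = E[s**N] of a Poisson(theta) count."""
    return math.exp(-theta * (1.0 - s))

def population_survival(t, theta, lam):
    """Population survival under the competing-causes selection structure:
    S_pop(t) = A(S(t)), here with exponential latent survival S(t) = exp(-lam*t).
    The Poisson/exponential choices are illustrative assumptions."""
    latent_survival = math.exp(-lam * t)
    return pgf_poisson(latent_survival, theta)

# The cure fraction is the probability of zero competing causes, A(0).
theta = 1.5
cure_rate = pgf_poisson(0.0, theta)   # exp(-theta)
```

As t grows, population_survival(t, theta, lam) flattens out at cure_rate rather than at zero, which is the "cure rate" behaviour the abstract refers to.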

Relevance:

60.00%

Publisher:

Abstract:

In this paper we propose a new lifetime distribution which can handle bathtub-shaped, unimodal, increasing and decreasing hazard rate functions. The model has three parameters and generalizes the exponential power distribution proposed by Smith and Bain (1975) with the inclusion of an additional shape parameter. The maximum likelihood estimation procedure is discussed. A small-scale simulation study examines the performance of the likelihood ratio statistics under small and moderate sized samples. Three real datasets illustrate the methodology. (C) 2010 Elsevier B.V. All rights reserved.
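The hazard shapes mentioned above can be checked numerically for the two-parameter Smith and Bain (1975) baseline. The parameterization used below, S(t) = exp(1 - exp((t/alpha)**beta)), is one common form of the exponential power distribution and is an assumption of this sketch, not taken from the paper:

```python
import math

def ep_hazard(t, alpha, beta):
    """Hazard of the Smith-Bain exponential power distribution with
    S(t) = exp(1 - exp((t/alpha)**beta)), so that
    h(t) = (beta/alpha) * (t/alpha)**(beta - 1) * exp((t/alpha)**beta)."""
    z = (t / alpha) ** beta
    return (beta / alpha) * (t / alpha) ** (beta - 1) * math.exp(z)

# beta < 1 gives a bathtub shape: the hazard first decreases, then increases.
bathtub = [ep_hazard(t, 1.0, 0.5) for t in (0.1, 1.0, 9.0)]
# beta >= 1 gives a monotonically increasing hazard.
increasing = [ep_hazard(t, 1.0, 2.0) for t in (0.5, 1.0, 2.0)]
```

The extra shape parameter introduced in the paper enlarges this family further, which is what allows unimodal shapes in addition to the two checked here.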

Relevance:

60.00%

Publisher:

Abstract:

In this paper, we propose a new two-parameter lifetime distribution with increasing failure rate, the complementary exponential geometric distribution, which is complementary to the exponential geometric model proposed by Adamidis and Loukas (1998). The new distribution arises in a latent complementary risks scenario, in which the lifetime associated with a particular risk is not observable; rather, we observe only the maximum lifetime value among all risks. The properties of the proposed distribution are discussed, including a formal proof of its probability density function and explicit algebraic formulas for its reliability and failure rate functions, moments (including the mean and variance), variation coefficient, and modal value. Parameter estimation is based on the usual maximum likelihood approach. We report the results of a misspecification simulation study performed to assess the extent of misspecification errors when testing the exponential geometric distribution against our complementary one in the presence of different sample sizes and censoring percentages. The methodology is illustrated on four real datasets; we also make a comparison between the two modeling approaches. (C) 2011 Elsevier B.V. All rights reserved.
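The latent maximum construction can be sketched directly. This is a minimal simulation under stated assumptions (a geometric count on {1, 2, ...} with success probability theta and exponential latent risks; parameter names are mine), checking the closed-form CDF against the "maximum over all risks" mechanism:

```python
import math
import random

def ceg_cdf(y, lam, theta):
    """Closed-form CDF of the complementary exponential geometric model:
    Y = max of N iid Exp(lam) lifetimes, N ~ Geometric(theta) on {1, 2, ...},
    giving F(y) = theta*u / (1 - (1-theta)*u) with u = 1 - exp(-lam*y)."""
    u = 1.0 - math.exp(-lam * y)
    return theta * u / (1.0 - (1.0 - theta) * u)

def ceg_sample(lam, theta, rng):
    """Draw Y from the latent complementary-risks construction itself."""
    n = 1
    while rng.random() > theta:          # Geometric(theta) number of risks
        n += 1
    return max(rng.expovariate(lam) for _ in range(n))

rng = random.Random(42)
lam, theta = 1.0, 0.3
samples = [ceg_sample(lam, theta, rng) for _ in range(20000)]
empirical = sum(y <= 2.0 for y in samples) / len(samples)
# empirical should be close to ceg_cdf(2.0, lam, theta)
```

The direct simulation and the algebraic CDF agreeing is exactly the "complementary" (maximum) counterpart of the minimum-based Adamidis and Loukas construction.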

Relevance:

60.00%

Publisher:

Abstract:

In survival analysis applications, the failure rate function may frequently present a unimodal shape. In such cases, the log-normal or log-logistic distributions are used. In this paper, we are concerned only with parametric forms, so a location-scale regression model based on the Burr XII distribution is proposed for modeling data with a unimodal failure rate function, as an alternative to the log-logistic regression model. Assuming censored data, we consider a classical analysis, a Bayesian analysis and a jackknife estimator for the parameters of the proposed model. For different parameter settings, sample sizes and censoring percentages, various simulation studies are performed and compared to the performance of the log-logistic and log-Burr XII regression models. In addition, we use sensitivity analysis to detect influential or outlying observations, and residual analysis is used to check the assumptions of the model. Finally, we analyze a real data set under log-Burr XII regression models. (C) 2008 Published by Elsevier B.V.
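For reference, the unimodal failure rate of the two-parameter Burr XII baseline is easy to write down. This sketch uses the common standard parameterization S(x) = (1 + x**c)**(-k), which is an assumption here and not necessarily the exact location-scale form used in the paper:

```python
def burr12_hazard(x, c, k):
    """Failure rate of the Burr XII distribution with S(x) = (1 + x**c)**(-k):
    h(x) = c * k * x**(c - 1) / (1 + x**c).
    For c > 1 the hazard is unimodal, peaking at x = (c - 1)**(1/c)."""
    return c * k * x ** (c - 1) / (1.0 + x ** c)

# c = 3, k = 1: the hazard rises to a peak near x = 2**(1/3) and then falls,
# the unimodal shape that motivates this model as a log-logistic alternative.
peak = 2 ** (1.0 / 3.0)
values = [burr12_hazard(x, 3.0, 1.0) for x in (0.5, peak, 3.0)]
```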

Relevance:

60.00%

Publisher:

Abstract:

In this paper, a simple relation between the Leimkuhler curve and the mean residual life is established. The result is illustrated with several models commonly used in informetrics, such as exponential, Pareto and lognormal. Finally, relationships with some other reliability concepts are also presented. (C) 2010 Elsevier Ltd. All rights reserved.
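The mean residual life in question is m(t) = E[X - t | X > t] = ∫ₜ^∞ S(x) dx / S(t), and it can be approximated numerically. The sketch below is my own illustration (not the paper's Leimkuhler-curve derivation) and checks the textbook fact that the exponential distribution has constant mean residual life 1/λ:

```python
import math

def mean_residual_life(survival, t, upper=40.0, steps=20000):
    """Trapezoid-rule approximation of m(t) = integral of S from t to upper,
    divided by S(t); accurate when S is negligible beyond `upper`."""
    h = (upper - t) / steps
    xs = [t + i * h for i in range(steps + 1)]
    integral = h * (sum(survival(x) for x in xs)
                    - 0.5 * (survival(t) + survival(upper)))
    return integral / survival(t)

lam = 2.0
exp_survival = lambda x: math.exp(-lam * x)
m0 = mean_residual_life(exp_survival, 0.0)   # should be near 1/lam = 0.5
m1 = mean_residual_life(exp_survival, 1.0)   # also near 0.5: memorylessness
```

For Pareto-type tails, common in informetrics, m(t) instead grows with t, which is one reason the reliability notion transfers usefully to citation and productivity data.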

Relevance:

60.00%

Publisher:

Abstract:

Considering a series representation of a coherent system using a shift transform of the component lifetimes T_i at their critical levels Y_i, we study two problems. First, under such a shift transform, we analyse the preservation properties of the non-parametric distribution classes; second, we study the association-preserving property of the component lifetimes under such transformations. (c) 2007 Elsevier B.V. All rights reserved.

Relevance:

60.00%

Publisher:

Abstract:

In this paper we introduce the Weibull power series (WPS) class of distributions, which is obtained by compounding Weibull and power series distributions, where the compounding procedure follows the same way that was previously carried out by Adamidis and Loukas (1998). This new class of distributions has as a particular case the two-parameter exponential power series (EPS) class of distributions (Chahkandi and Ganjali, 2009), which contains several lifetime models such as the exponential geometric (Adamidis and Loukas, 1998), exponential Poisson (Kus, 2007) and exponential logarithmic (Tahmasbi and Rezaei, 2008) distributions. The hazard function of our class can be increasing, decreasing and upside-down bathtub shaped, among others, while the hazard function of an EPS distribution is only decreasing. We obtain several properties of the WPS distributions, such as moments, order statistics, estimation by maximum likelihood and inference for a large sample. Furthermore, the EM algorithm is also used to determine the maximum likelihood estimates of the parameters, and we discuss maximum entropy characterizations under suitable constraints. Special distributions are studied in some detail. Applications to two real data sets are given to show the flexibility and potentiality of the new class of distributions. (C) 2010 Elsevier B.V. All rights reserved.
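As an illustration of the compounding, consider the Weibull-geometric special case: N latent Weibull lifetimes with N following the geometric member of the power series family, and the observed lifetime being their minimum. This sketch (my own, with parameter names that are assumptions) checks the resulting closed-form survival against a direct simulation of min(X_1, ..., X_N):

```python
import math
import random

def wg_survival(x, a, b, theta):
    """Survival of the Weibull-geometric special case of the WPS class:
    X = min of N iid Weibull(shape a, scale b) lifetimes,
    N on {1, 2, ...} with P(N = n) = (1 - theta) * theta**(n - 1);
    S(x) = (1 - theta) * Sw / (1 - theta * Sw), Sw = exp(-(x/b)**a)."""
    sw = math.exp(-((x / b) ** a))
    return (1.0 - theta) * sw / (1.0 - theta * sw)

def wg_sample(a, b, theta, rng):
    """Draw X via the latent construction: sample N, then take the minimum."""
    n = 1
    while rng.random() < theta:          # continue with probability theta
        n += 1
    # Weibull via inverse transform: b * (-log U)**(1/a), U in (0, 1]
    return min(b * (-math.log(1.0 - rng.random())) ** (1.0 / a) for _ in range(n))

rng = random.Random(7)
a, b, theta = 2.0, 1.0, 0.5
samples = [wg_sample(a, b, theta, rng) for _ in range(20000)]
empirical = sum(x > 1.0 for x in samples) / len(samples)
# empirical should be close to wg_survival(1.0, a, b, theta)
```

Replacing the geometric count with Poisson or logarithmic counts recovers the other compound families the abstract lists, all within the same pattern.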

Relevance:

30.00%

Publisher:

Abstract:

The study evaluated the in vitro influence of the pulse-repetition rate of an Er:YAG laser and of dentin depth on the tensile bond strength of the dentin-resin interface. Dentin from the buccal or lingual surfaces of human third molars was submitted to tensile testing at different depths (superficial, 1.0 and 1.5 mm) of the same dental area, using the same sample. Surface treatments were acid conditioning alone (control) and Er:YAG laser irradiation (80 mJ) followed by acid conditioning, with different pulse-repetition rates (1, 2, 3, or 4 Hz). The Single Bond/Z-250 system was used. The samples were stored in distilled water at 37 °C for 24 h, and then the first test (superficial dentin) was performed. The bond failures were analyzed. Subsequently, the specimens were identified, ground to 1.0- and 1.5-mm depths, and submitted again to the treatments and then to the second and third bond tests, following the same procedure and failure analysis. ANOVA and the Tukey test demonstrated a significant difference for treatment (p < 0.001) and for the treatment × depth interaction (p < 0.05). The tested depths did not show an influence (p > 0.05) on the bond strength of the dentin-resin interface. It may be concluded that the Er:YAG laser at 1, 2, 3, or 4 Hz combined with acid conditioning did not increase the resin tensile bond strength to dentin, regardless of dentin depth. (C) 2007 Wiley Periodicals, Inc.

Relevance:

30.00%

Publisher:

Abstract:

In the Eplerenone Post-Acute Myocardial Infarction Heart Failure Efficacy and Survival Study (n = 6632), the eplerenone-associated reduction in all-cause mortality was significantly greater in those with a history of hypertension (Hx-HTN). There were 4007 patients with Hx-HTN (eplerenone: n = 1983) and 2625 patients without Hx-HTN (eplerenone: n = 1336). Propensity scores for eplerenone use, calculated separately for patients with and without Hx-HTN, were used to assemble matched cohorts of 1838 and 1176 pairs of patients. In patients with Hx-HTN, all-cause mortality occurred in 18% of patients treated with placebo (rate, 1430/10 000 person-years) and 14% of patients treated with eplerenone (rate, 1058/10 000 person-years) during 2350 and 2457 years of follow-up, respectively (hazard ratio [HR]: 0.71; 95% CI: 0.59 to 0.85; P < 0.0001). The composite end point of cardiovascular hospitalization or cardiovascular mortality occurred in 33% of placebo-treated patients (3029/10 000 person-years) and 28% of eplerenone-treated patients (2438/10 000 person-years) with Hx-HTN (HR: 0.82; 95% CI: 0.72 to 0.94; P = 0.003). In patients without Hx-HTN, eplerenone reduced heart failure hospitalization (HR: 0.73; 95% CI: 0.55 to 0.97; P = 0.028) but had no effect on mortality (HR: 0.91; 95% CI: 0.72 to 1.15; P = 0.435) or on the composite end point (HR: 0.91; 95% CI: 0.76 to 1.10; P = 0.331). Eplerenone should, therefore, be prescribed to all post-acute myocardial infarction patients with reduced left ventricular ejection fraction and heart failure, regardless of Hx-HTN.
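As a small arithmetic check on the figures quoted above, the crude incidence-rate ratios implied by the per-10,000-person-year rates land close to the reported hazard ratios. This is only a sanity check: the HRs come from the matched Cox analysis, not from this simple division:

```python
def crude_rate_ratio(rate_treated, rate_control):
    """Crude incidence-rate ratio from two rates on the same scale."""
    return rate_treated / rate_control

# Rates per 10,000 person-years quoted for the Hx-HTN stratum:
rr_mortality = crude_rate_ratio(1058, 1430)   # ~0.74, near the reported HR 0.71
rr_composite = crude_rate_ratio(2438, 3029)   # ~0.80, near the reported HR 0.82
```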