935 results for Increasing failure rate


Relevance: 90.00%

Abstract:

AIM To assess the long-term success of maxillary fixed retainers, investigate their effect on gingival health, and analyse the survival rate after a mean period of 7 years (minimum 5 years) in retention. SUBJECTS AND METHODS Forty-one subjects were included in the study. A clinical examination of the upper canine-to-canine region, including gingival index (GI), plaque index (PI), probing depth, and bleeding on probing (BOP), was performed. Intraoral photographs and dental impressions were taken, the irregularity index was determined and compared with the immediate post-treatment values, and failures of retainers were recorded and analysed. RESULTS The mean observed retention time was 7 years and 5 months. Irregularity index: changes occurring during retention differed statistically between the lateral incisors bonded to retainers and the canines not bonded to retainers. Only six patients showed changes in the irregularity index of the lateral incisors despite a retainer being in place. Periodontal health: the median GI for all teeth bonded to upper retainers was 1.10 and the median PI was 1.14. PI was not a significant predictor of GI. The overall BOP for teeth bonded to the retainer was 22.3 per cent per participant. Failure rate: 28 of 41 patients (68.3 per cent) experienced no failure of the upper bonded retainer. Detachments were the most frequent incidents. CONCLUSION Although plaque accumulation might be increased in patients with already poor oral hygiene, maxillary bonded retainers caused no significant negative effects on periodontal health.

Relevance: 90.00%

Abstract:

Fixation failure of glenoid components is the main cause of unsuccessful total shoulder arthroplasties. The characteristics of these failures are still not well understood; hence, attempts at improving implant fixation are somewhat blind and the failure rate remains high. This lack of understanding is largely due to the fundamental problem that direct observations of failure are impossible, as the fixation is inherently embedded within the bone. Twenty custom-made implants, reflecting various common fixation designs, and a specimen set-up were prepared to enable direct observation of failure while the specimens were exposed to cyclic superior loads during laboratory experiments. Finite element analyses of the laboratory tests were also carried out to explain the observed failure scenarios. All implants, irrespective of the particular fixation design, failed at the implant-cement interface, and failure initiated at the inferior part of the component fixation. Finite element analyses indicated that this failure scenario was caused by a weak and brittle implant-cement interface and by tensile stresses in the inferior region, possibly worsened by a stress-raiser effect at the inferior rim. The results of this study indicate that glenoid failure can be delayed or prevented by improving the strength of the implant-cement interface. Any design feature that reduces the geometrical stress raiser and, more generally, the inferior tensile stresses should also delay implant loosening.

Relevance: 90.00%

Abstract:

This research explores Bayesian updating as a tool for estimating parameters probabilistically by dynamic analysis of data sequences. Two distinct Bayesian updating methodologies are assessed. The first approach focuses on Bayesian updating of failure rates for primary events in fault trees. A Poisson Exponentially Weighted Moving Average (PEWMA) model is implemented to carry out Bayesian updating of failure rates for individual primary events in the fault tree. To provide a basis for testing the PEWMA model, a fault tree is developed based on the 2005 Texas City Refinery incident. A qualitative fault tree analysis is then carried out to obtain a logical expression for the top event. A dynamic fault tree analysis is performed by evaluating the top event probability at each Bayesian updating step via Monte Carlo sampling from the posterior failure rate distributions. PEWMA modeling is demonstrated to be advantageous over conventional conjugate Poisson-Gamma updating techniques when failure data are collected over long time spans. The second approach focuses on Bayesian updating of parameters in non-linear forward models; specifically, the technique is applied to the hydrocarbon material balance equation. To test the accuracy of the implemented Bayesian updating models, a synthetic data set is developed using the Eclipse reservoir simulator. Both structured-grid and MCMC sampling based solution techniques are implemented and are shown to model the synthetic data set with good accuracy. Furthermore, a graphical analysis shows that the implemented MCMC model displays good convergence properties. A case study demonstrates that the likelihood variance affects the rate at which the posterior assimilates information from the measured data sequence. Error in the measured data significantly affects the accuracy of the posterior parameter distributions. Increasing the likelihood variance mitigates random measurement errors but causes the overall variance of the posterior to increase. Bayesian updating is shown to be advantageous over deterministic regression techniques because it allows for the incorporation of prior belief and full modeling of uncertainty over the parameter ranges. As such, the Bayesian approach to estimating parameters in the material balance equation shows utility for incorporation into reservoir engineering workflows.
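As a point of reference for the comparison above, here is a minimal sketch of the conventional conjugate Poisson-Gamma updating scheme that PEWMA is benchmarked against, with Monte Carlo sampling of a top-event probability for a hypothetical two-event OR gate (all priors, counts and the gate structure are illustrative, not taken from the study):

```python
import numpy as np

rng = np.random.default_rng(42)

def gamma_poisson_update(alpha, beta, n_failures, exposure_time):
    """Conjugate update: Gamma(alpha, beta) prior on a Poisson failure
    rate, after observing n_failures over exposure_time."""
    return alpha + n_failures, beta + exposure_time

# Illustrative priors for two primary events (rates in failures/year).
alpha1, beta1 = gamma_poisson_update(2.0, 10.0, n_failures=3, exposure_time=5.0)
alpha2, beta2 = gamma_poisson_update(1.0, 20.0, n_failures=1, exposure_time=5.0)

# Monte Carlo: sample posterior rates, convert to probabilities of at
# least one failure over a one-year mission, combine through an OR gate.
t = 1.0
lam1 = rng.gamma(alpha1, 1.0 / beta1, size=100_000)
lam2 = rng.gamma(alpha2, 1.0 / beta2, size=100_000)
p1 = 1.0 - np.exp(-lam1 * t)
p2 = 1.0 - np.exp(-lam2 * t)
p_top = p1 + p2 - p1 * p2  # OR gate for independent events
print(f"Top event probability: {p_top.mean():.3f} "
      f"(95% interval {np.percentile(p_top, 2.5):.3f}-{np.percentile(p_top, 97.5):.3f})")
```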

Relevance: 90.00%

Abstract:

Cardiovascular disease is one of the leading causes of death around the world. Resting heart rate has been shown to be a strong and independent risk marker for adverse cardiovascular events and mortality, yet its role as a predictor of risk is somewhat overlooked in clinical practice. With the aim of highlighting its prognostic value, the role of resting heart rate as a risk marker for death and other adverse outcomes was examined in a number of different patient populations.

A systematic review of studies that previously assessed the prognostic value of resting heart rate for mortality and other adverse cardiovascular outcomes is presented, and new analyses of nine clinical trials were carried out. Both the original Cox model and the extended Cox model that allows for time-dependent covariates were used to evaluate and compare the predictive value of baseline and time-updated heart rate measurements for adverse outcomes in the CAPRICORN, EUROPA, PROSPER, PERFORM, BEAUTIFUL and SHIFT populations. Pooled individual-patient meta-analyses of the CAPRICORN, EPHESUS, OPTIMAAL and VALIANT trials, and of the BEAUTIFUL and SHIFT trials, were also performed. The discrimination and calibration of the models were evaluated using Harrell's C-statistic and likelihood ratio tests, respectively. Finally, following on from the systematic review, meta-analyses of the relation between baseline and time-updated heart rate and the risk of death from any cause and from cardiovascular causes were conducted.

Elevated baseline and time-updated resting heart rates were both associated with an increased risk of mortality and other adverse cardiovascular events in all of the populations analysed. In some cases, elevated time-updated heart rate was associated with risk of events where baseline heart rate was not. Time-updated heart rate also contributed additional information about the risk of certain events beyond knowledge of baseline heart rate or previous heart rate measurements. Adding resting heart rate to the models where it was associated with risk of outcome improved both discrimination and calibration; in general, the models including time-updated heart rate along with the baseline or previous measurement had the highest (and similar) C-statistics, and thus the greatest discriminative ability. The meta-analyses demonstrated that a 5 bpm higher baseline heart rate was associated with a 7.9% and an 8.0% increase in the risk of all-cause and cardiovascular death, respectively (both p < 0.001). Additionally, a 5 bpm higher time-updated heart rate (adjusted for baseline heart rate in eight of the ten studies included) was associated with a 12.8% (p < 0.001) and a 10.9% (p < 0.001) increase in the risk of all-cause and cardiovascular death, respectively.

These findings may motivate health care professionals to routinely assess resting heart rate in order to identify individuals at higher risk of adverse events. The fact that adding time-updated resting heart rate improved the discrimination and calibration of models for certain outcomes, even if only modestly, strengthens the case for including it in traditional risk models. The findings have particular implications for the clinical management of patients with pre-existing disease: an elevated or increasing heart rate over time could be used as a tool, potentially alongside other established risk scores, to help doctors identify patient deterioration or those at higher risk who might benefit from more intensive monitoring or treatment re-evaluation. Further exploration of the role of continuous recording of resting heart rate, for example while patients are at home, would be informative, as would investigation into the cost-effectiveness and optimal frequency of resting heart rate measurement. One of the most vital areas for future research is establishing an objective cut-off value defining a high resting heart rate.
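For concreteness, a minimal sketch of an extended Cox model with a time-updated covariate, using the Python lifelines library on long-format data (the column names and values are invented for illustration; this is not the thesis's actual code or data):

```python
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(7)

# Long-format data: one row per patient per follow-up interval, with the
# resting heart rate re-measured at the start of each interval.
rows = []
for pid in range(60):
    hr = rng.normal(70, 8)
    for start in range(0, 24, 6):                # visits every 6 months
        hr += rng.normal(1.0, 3.0)               # heart rate drifts over time
        risk = 0.02 * np.exp(0.02 * (hr - 70))   # higher HR -> higher hazard
        event = rng.random() < risk * 6
        rows.append({"id": pid, "start": start, "stop": start + 6,
                     "hr": hr, "event": int(event)})
        if event:
            break

df = pd.DataFrame(rows)
ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()  # coef on "hr" is per 1 bpm; exp(5*coef) gives a per-5-bpm HR
```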

Relevance: 80.00%

Abstract:

This paper investigates how to improve action selection for online policy learning in robotic scenarios using reinforcement learning (RL) algorithms. Since finding control policies with any RL algorithm can be very time consuming, we propose combining RL algorithms with heuristic functions for selecting promising actions during the learning process. With this aim, we investigate the use of heuristics for increasing the rate of convergence of RL algorithms and contribute a new learning algorithm, Heuristically Accelerated Q-learning (HAQL), which adds heuristic action selection to the Q-learning algorithm. Experimental results on robot navigation show that even very simple heuristic functions significantly enhance the learning rate.
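A minimal sketch of the HAQL idea as described: a heuristic function biases greedy action selection, while the Q-learning update itself is unchanged (the state/action sizes, ξ and ε below are placeholder choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n_states, n_actions = 100, 4
Q = np.zeros((n_states, n_actions))
H = np.zeros((n_states, n_actions))  # e.g. a bonus for actions pointing at the goal

def haql_select(s, xi=1.0, epsilon=0.1):
    """Heuristic-biased epsilon-greedy: H steers exploration toward
    promising actions, but only Q carries learned values."""
    if rng.random() < epsilon:
        return int(rng.integers(n_actions))
    return int(np.argmax(Q[s] + xi * H[s]))

def q_update(s, a, r, s_next, alpha=0.1, gamma=0.95):
    """Standard Q-learning update; the heuristic never enters here, so a
    poor heuristic slows learning without changing the fixed point."""
    Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
```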

Relevance: 80.00%

Abstract:

The technology of self-reducing pellets for ferro-alloy production is emerging because of its lower electric energy consumption and improved metal recovery compared with the traditional process. This paper presents the effects of reduction temperature, ferro-silicon addition and slag-forming additions on the production of high-carbon ferro-chromium from self-reducing pellets. The pellets were composed of Brazilian chromium ore (chromite) concentrate, petroleum coke, Portland cement, ferro-silicon and slag-forming components (silica and hydrated lime), and were processed at 1773 K, 1823 K and 1873 K in an induction furnace. The products, containing slag and metallic phases, were analysed by scanning electron microscopy and chemical analyses (XEDS). Increasing the temperature from 1773 K to 1823 K had a large effect on the reduction time for pellets without Fe-Si addition: reduction was around four times faster at 1823 K than at 1773 K for a reaction fraction close to one. Further increasing the temperature from 1823 K to 1873 K, however, only improved the kinetics by a factor of two. At 1773 K, adding 2% ferro-silicon to the pellet increased the reaction rate around six-fold compared with the agglomerate without it. The addition of fluxing agents (silica and lime), which form an initial slag before the reduction is complete, impaired full reduction. The pellets became less porous after the reduction process.

Relevance: 80.00%

Abstract:

This paper discusses the effects of temperature, ferro-silicon addition and fluxing agents on the production of high-carbon ferro-chromium by the self-reducing process. The use of self-reducing agglomerates for ferro-alloy production is emerging because it lowers electric energy consumption and improves metal recovery compared with traditional processes. The self-reducing pellets were composed of chromite, petroleum coke, cement and small (0.1%-2%) additions of ferro-silicon. The slag composition was adjusted by adding fluxing agents. The reduction of the pellets was carried out at 1773 K (1500 °C), 1823 K (1550 °C) and 1873 K (1600 °C) in an induction furnace. The products, containing slag and metallic phases, were analysed by scanning electron microscopy and chemical analyses (XEDS). Increasing the temperature from 1773 K to 1823 K had a large effect on the reduction time: it decreased from 30 to 10 minutes to reach a reduction fraction of around 0.98. No significant effect on reduction time was observed when the temperature was increased further from 1823 K to 1873 K. At 1773 K, adding 2% ferro-silicon to the pellet increased the reaction rate around six-fold compared with the agglomerate without this addition. The addition of fluxing agents (silica and hydrated lime) affects the reduction time (an inverse relationship), and the pellets become less porous after reduction.

Relevance: 80.00%

Abstract:

The inverse Weibull distribution can model failure-rate shapes that are quite common in reliability and biological studies. A three-parameter generalized inverse Weibull distribution with decreasing and unimodal failure rates is introduced and studied. We provide a comprehensive treatment of the mathematical properties of the new distribution, including expressions for the moment generating function and the rth generalized moment. The mixture model of two generalized inverse Weibull distributions is investigated and its identifiability is demonstrated. For the first time, we propose a location-scale regression model based on the log-generalized inverse Weibull distribution for modelling lifetime data, and we develop some diagnostic tools for sensitivity analysis. Two applications to real data illustrate the potential of the proposed regression model.
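A numerical sketch of the unimodal failure rate, assuming the generalized inverse Weibull cdf F(x) = exp(−γ(β/x)^α) (this parameterization is our reading of the family, not copied from the paper; parameter values are illustrative):

```python
import numpy as np

def giw_cdf(x, alpha, beta, gamma):
    # Assumed generalized inverse Weibull cdf: F(x) = exp(-gamma*(beta/x)**alpha)
    return np.exp(-gamma * (beta / x) ** alpha)

def giw_hazard(x, alpha, beta, gamma):
    """h(x) = f(x) / (1 - F(x)), with f obtained by differentiating the cdf."""
    F = giw_cdf(x, alpha, beta, gamma)
    f = gamma * alpha * beta**alpha * x ** (-alpha - 1) * F
    return f / (1.0 - F)

x = np.linspace(0.05, 5, 200)
h = giw_hazard(x, alpha=2.0, beta=1.0, gamma=1.0)
print(f"hazard peaks at x ≈ {x[np.argmax(h)]:.2f}")  # unimodal: rises, then falls
```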

Relevance: 80.00%

Abstract:

A five-parameter distribution, the so-called beta modified Weibull distribution, is defined and studied. The new distribution contains, as special sub-models, several important distributions discussed in the literature, such as the generalized modified Weibull, beta Weibull, exponentiated Weibull, beta exponential, modified Weibull and Weibull distributions, among others. It can be used effectively in the analysis of survival data since it accommodates monotone, unimodal and bathtub-shaped hazard functions. We derive the moments and examine the order statistics and their moments. We propose the method of maximum likelihood for estimating the model parameters and obtain the observed information matrix. A real data set illustrates the importance and flexibility of the new distribution.
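The construction behind this family is the beta-G generator: a baseline cdf G is composed with a Beta(a, b) density to give f(x) = g(x) G(x)^{a−1} [1−G(x)]^{b−1} / B(a, b). A quick numerical sketch, using a plain Weibull baseline as a stand-in (the paper's baseline is the modified Weibull, handled identically):

```python
import numpy as np
from scipy.special import beta as beta_fn
from scipy.stats import weibull_min

def beta_g_pdf(t, p, q, g_pdf, g_cdf):
    """Beta-G density with beta parameters (p, q): f = g * G**(p-1) * (1-G)**(q-1) / B(p, q)."""
    G = g_cdf(t)
    return g_pdf(t) * G**(p - 1) * (1.0 - G)**(q - 1) / beta_fn(p, q)

# Stand-in baseline: plain Weibull(shape=1.5, scale=1).
base = weibull_min(c=1.5, scale=1.0)
t = np.linspace(0.001, 5.0, 500)
f = beta_g_pdf(t, p=2.0, q=1.5, g_pdf=base.pdf, g_cdf=base.cdf)
print(f"density integrates to ≈ {np.trapz(f, t):.3f}")  # sanity check, ≈ 1
```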

Relevance: 80.00%

Abstract:

A bathtub-shaped failure rate function is very useful in survival analysis and reliability studies, but most well-known lifetime distributions do not have this property. For the first time, we propose a location-scale regression model based on the logarithm of an extended Weibull distribution that can accommodate bathtub-shaped failure rate functions. We use the method of maximum likelihood to estimate the model parameters, and some inferential procedures are presented. We reanalyse a real data set under the new model and under the log-modified Weibull regression model, performing a model check based on martingale-type residuals and generated envelopes, and using the AIC and BIC statistics to select appropriate models.
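The AIC/BIC comparison used for model selection reduces to a few lines; a sketch with placeholder log-likelihoods (the numbers are invented, not results from the paper):

```python
import numpy as np

def aic(loglik, k):
    # Akaike information criterion: 2k - 2*log-likelihood (lower is better)
    return 2 * k - 2 * loglik

def bic(loglik, k, n):
    # Bayesian information criterion: k*log(n) - 2*log-likelihood
    return k * np.log(n) - 2 * loglik

# Hypothetical fits of two competing regression models to n = 200 lifetimes
for name, ll, k in [("log-extended Weibull", -201.3, 5),
                    ("log-modified Weibull", -204.8, 4)]:
    print(f"{name}: AIC = {aic(ll, k):.1f}, BIC = {bic(ll, k, n=200):.1f}")
```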

Relevance: 80.00%

Abstract:

A four-parameter generalization of the Weibull distribution capable of modelling a bathtub-shaped hazard rate function is defined and studied. The beauty and importance of this distribution lie in its ability to model monotone as well as non-monotone failure rates, which are quite common in lifetime problems and reliability. The new distribution has a number of well-known lifetime distributions as special sub-models, such as the Weibull, extreme value, exponentiated Weibull, generalized Rayleigh and modified Weibull distributions, among others. We derive two infinite sum representations for its moments and obtain the density of the order statistics. The method of maximum likelihood is used for estimating the model parameters, and the observed information matrix is obtained. Two applications are presented to illustrate the proposed distribution.
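To see a bathtub shape from one of the sub-models, a short sketch assuming the modified Weibull hazard h(t) = a t^{b−1}(b + λt)e^{λt}, which is bathtub-shaped for 0 < b < 1 (parameter values are illustrative):

```python
import numpy as np

def mw_hazard(t, a, b, lam):
    # Assumed modified Weibull hazard: the t**(b-1) term dominates early,
    # the exp(lam*t) term dominates late -> bathtub shape for 0 < b < 1
    return a * t**(b - 1) * (b + lam * t) * np.exp(lam * t)

t = np.linspace(0.01, 4.0, 400)
h = mw_hazard(t, a=0.5, b=0.4, lam=1.0)
i_min = int(np.argmin(h))
print(f"hazard falls to a minimum near t ≈ {t[i_min]:.2f}, then rises")
```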

Relevance: 80.00%

Abstract:

This paper proposes a regression model based on the modified Weibull distribution, which can model bathtub-shaped failure rate functions. Assuming censored data, we consider maximum likelihood and jackknife estimators for the model parameters. We derive the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes and also present some ways to perform global influence analysis. In addition, simulations are performed for various parameter settings, sample sizes and censoring percentages, and the empirical distribution of the modified deviance residual is displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to a martingale-type residual in log-modified Weibull regression models with censored data. Finally, we analyse a real data set under log-modified Weibull regression models; a diagnostic analysis and a model check based on the modified deviance residual are performed to select appropriate models.
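A compact sketch of censored maximum likelihood and the martingale-type residuals mentioned above, using the plain Weibull for brevity (the data are synthetic; the paper's model adds covariates and the extra modified-Weibull parameter):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
t = rng.weibull(1.5, size=200)           # true shape 1.5, scale 1
c = rng.uniform(0.2, 2.0, size=200)      # random censoring times
time = np.minimum(t, c)
delta = (t <= c).astype(float)           # 1 = observed failure, 0 = censored

def negloglik(theta):
    """Censored Weibull log-likelihood: sum of delta*log h(t) - H(t), with
    hazard h(t) = (k/s)*(t/s)**(k-1) and cumulative hazard H(t) = (t/s)**k."""
    k, s = np.exp(theta)                 # optimize on the log scale
    log_h = np.log(k / s) + (k - 1) * np.log(time / s)
    H = (time / s) ** k
    return -(np.sum(delta * log_h) - np.sum(H))

res = minimize(negloglik, x0=[0.0, 0.0], method="Nelder-Mead")
k_hat, s_hat = np.exp(res.x)
print(f"shape ≈ {k_hat:.2f}, scale ≈ {s_hat:.2f}")

# Martingale residuals r_i = delta_i - H(t_i); values scattered around 0
# with no trend against covariates indicate an adequate fit.
r = delta - (time / s_hat) ** k_hat
print(f"mean martingale residual ≈ {r.mean():.3f}")
```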

Relevance: 80.00%

Abstract:

We study in detail the so-called beta modified Weibull distribution, motivated by the wide use of the Weibull distribution in practice and by the fact that the generalization provides a continuous crossover towards cases with different shapes. The new distribution is important since it contains as special sub-models some widely known distributions, such as the generalized modified Weibull, beta Weibull, exponentiated Weibull, beta exponential, modified Weibull and Weibull distributions, among several others, and it provides more flexibility for analysing complex real data. Various mathematical properties of this distribution are derived, including its moments and moment generating function. We examine the asymptotic distributions of the extreme values. Explicit expressions are also derived for the chf, mean deviations, Bonferroni and Lorenz curves, reliability and entropies. The estimation of parameters is approached by two methods, moments and maximum likelihood, and we compare by simulation the performance of the resulting estimates. We obtain the expected information matrix. Two applications are presented to illustrate the proposed distribution.
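Bonferroni and Lorenz curves for any of these models can be computed numerically from the standard definitions L(p) = (1/μ)∫₀^{q(p)} x f(x) dx and B(p) = L(p)/p; a sketch with a plain Weibull stand-in for the fitted distribution:

```python
import numpy as np
from scipy.stats import weibull_min
from scipy.integrate import quad

dist = weibull_min(c=1.5, scale=1.0)  # stand-in for a fitted lifetime model
mu = dist.mean()

def lorenz(p):
    """L(p) = (1/mu) * integral_0^{q(p)} x f(x) dx, with q(p) the p-quantile."""
    q = dist.ppf(p)
    val, _ = quad(lambda x: x * dist.pdf(x), 0, q)
    return val / mu

for p in (0.25, 0.5, 0.75):
    L = lorenz(p)
    print(f"p = {p}: Lorenz L(p) = {L:.3f}, Bonferroni B(p) = {L / p:.3f}")
```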

Relevance: 80.00%

Abstract:

A four-parameter extension of the generalized gamma distribution capable of modelling a bathtub-shaped hazard rate function is defined and studied. The beauty and importance of this distribution lie in its ability to model monotone and non-monotone failure rate functions, which are quite common in lifetime data analysis and reliability. The new distribution has a number of well-known lifetime distributions as special sub-models, such as the exponentiated Weibull, exponentiated generalized half-normal, exponentiated gamma and generalized Rayleigh, among others. We derive two infinite sum representations for its moments, calculate the density of the order statistics and give two expansions for their moments. The method of maximum likelihood is used for estimating the model parameters, and the observed information matrix is obtained. Finally, a real data set from the medical area is analysed.
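The order-statistic densities used in such derivations all follow the generic formula f_{i:n}(x) = n!/[(i−1)!(n−i)!] F(x)^{i−1}[1−F(x)]^{n−i} f(x); a quick numerical check with a Weibull stand-in for the parent distribution:

```python
import numpy as np
from scipy.stats import weibull_min
from scipy.special import comb

def order_stat_pdf(x, i, n, dist):
    """Density of the i-th order statistic of n iid draws from dist;
    the constant n!/((i-1)!(n-i)!) equals i * C(n, i)."""
    F, f = dist.cdf(x), dist.pdf(x)
    return i * comb(n, i) * F**(i - 1) * (1 - F)**(n - i) * f

dist = weibull_min(c=2.0, scale=1.0)
x = np.linspace(0.001, 3.0, 600)
pdf_min = order_stat_pdf(x, i=1, n=5, dist=dist)   # density of the sample minimum
print(f"integrates to ≈ {np.trapz(pdf_min, x):.3f}")  # sanity check, ≈ 1
```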

Relevance: 80.00%

Abstract:

Mehlich-1, resin-HCO₃, and Pi tests were used to assess available P in an acid tropical Oxisol in Brazil treated with gypsum, which has been preferred over lime for ameliorating Al toxicity in the subsoil. The soil was incubated in the laboratory with rates up to 75 g kg⁻¹ of phosphogypsum (PG) containing 0.3% total P, natural gypsum, or reagent-grade gypsum, and with up to 100 mg P kg⁻¹ as triple superphosphate (TSP) or phosphate rock (PR). In the greenhouse, two consecutive maize crops were grown on soil treated with 50 mg P kg⁻¹ of TSP and PG rates up to 75 g kg⁻¹. The incubation study showed that Mehlich-P and Pi-P increased with increasing PG rate in the TSP, PR, and control treatments. Resin-HCO₃ underestimated available P from TSP and PR because of the reaction between the resin and gypsum. Mehlich-1 overestimated available P from PR compared with TSP because the strongly acidic Mehlich-1 dissolves PR excessively. Pi underestimated available P from PR in the natural and reagent-grade gypsum treatments because the Ca common-ion effect from gypsum depresses PR dissolution. The effects of PG on available P were similar in the incubation and greenhouse studies. Both Mehlich-P and Pi-P correlated well with P uptake by maize, whereas resin-P did not.