929 results for DECREASING FAILURE RATE


Relevance: 100.00%

Abstract:

Aims: This paper describes the development of a risk adjustment (RA) model predictive of individual lesion treatment failure in percutaneous coronary intervention (PCI), for use in a quality monitoring and improvement program. Methods and results: Prospectively collected data for 3972 consecutive revascularisation procedures (5601 lesions) performed between January 2003 and September 2011 were studied. Data on procedures to September 2009 (n = 3100) were used to identify factors predictive of lesion treatment failure. The factors identified were lesion risk class (p < 0.001), occlusion type (p < 0.001), patient age (p = 0.001), vessel system (p < 0.04), vessel diameter (p < 0.001), unstable angina (p = 0.003) and presence of major cardiac risk factors (p = 0.01). A Bayesian RA model was built using these factors, and its predictive performance was tested on the remaining procedures (area under the receiver operating characteristic curve: 0.765; Hosmer–Lemeshow p value: 0.11). Cumulative sum (CUSUM), exponentially weighted moving average (EWMA) and funnel plots were constructed using the RA model and subjectively evaluated. Conclusion: An RA model was developed and applied to statistical process control (SPC) monitoring of lesion failure in a PCI database. If linked to appropriate quality improvement governance response protocols, SPC using this RA tool might improve quality control and risk management by identifying variation in performance based on a comparison of observed and expected outcomes.
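For illustration, a minimal sketch of the kind of risk-adjusted CUSUM monitoring the abstract describes, driving a Steiner-style chart with per-lesion failure probabilities from an RA model. The odds ratio, control limit, and toy data below are illustrative assumptions, not values from the paper.

```python
import math

def risk_adjusted_cusum(outcomes, predicted_probs, odds_ratio=2.0, control_limit=4.0):
    """Risk-adjusted CUSUM for binary failure outcomes.

    outcomes        : list of 0/1 observed lesion failures
    predicted_probs : per-lesion failure probabilities from the RA model
    odds_ratio      : alternative-hypothesis odds ratio (assumed, not from the paper)
    control_limit   : signal threshold h (assumed)
    """
    s, path, signals = 0.0, [], []
    for t, (y, p) in enumerate(zip(outcomes, predicted_probs)):
        # Log-likelihood ratio weight for a doubling of the odds of failure.
        denom = 1.0 - p + odds_ratio * p
        w = math.log(odds_ratio / denom) if y == 1 else math.log(1.0 / denom)
        s = max(0.0, s + w)
        path.append(s)
        if s > control_limit:
            signals.append(t)
            s = 0.0  # restart the chart after a signal
    return path, signals

# Toy usage with made-up data: 1 = lesion treatment failure.
probs    = [0.05, 0.10, 0.30, 0.08, 0.40, 0.12]
observed = [0,    1,    1,    0,    1,    0]
path, signals = risk_adjusted_cusum(observed, probs)
print(path, signals)
```

The chart accumulates evidence of worse-than-expected outcomes: failures of low-risk lesions add large weights, failures of high-risk lesions small ones, so the comparison of observed and expected outcomes is built into every step.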

Relevance: 100.00%

Abstract:

Ambiguity resolution plays a crucial role in real-time kinematic GNSS positioning, which gives centimetre-precision positioning results if all the ambiguities in each epoch are correctly fixed to integers. However, incorrectly fixed ambiguities can result in positioning offsets of up to several metres without notice. Hence, ambiguity validation is essential to control the quality of ambiguity resolution. Currently, the most popular ambiguity validation method is the ratio test, whose criterion is often determined empirically. An empirically determined criterion can be dangerous, because a fixed criterion cannot fit all scenarios and does not directly control the ambiguity resolution risk. In practice, depending on the underlying model strength, the ratio test criterion can be too conservative for some models and too risky for others. A more rational approach is to determine the criterion according to the underlying model and the user requirement. Missed detection of incorrect integers leads to hazardous results and should be strictly controlled; in ambiguity resolution, the missed-detection rate is known as the failure rate. In this paper, a fixed failure rate ratio test method is presented and applied to the analysis of GPS and Compass positioning scenarios. The fixed failure rate approach is derived from integer aperture estimation theory and is theoretically rigorous. In this approach, a table of ratio test criteria is computed from extensive data simulations, and real-time users can determine the ratio test criterion by looking up this table. The method has previously been applied to medium-distance GPS ambiguity resolution, but multi-constellation and high-dimensional scenarios have not been discussed so far. In this paper, a general ambiguity validation model is derived based on hypothesis testing theory, the fixed failure rate approach is introduced, and in particular the relationship between the ratio test threshold and the failure rate is examined. Finally, the factors that influence the ratio test threshold under the fixed failure rate approach are discussed based on extensive data simulations. The results show that the fixed failure rate approach is a more reasonable ambiguity validation method when a proper stochastic model is used.
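As a reference point, a minimal sketch of the ratio test in one common convention, comparing the squared residual norms of the two best integer candidates; the candidate values and the threshold c are illustrative assumptions, and the fixed failure rate approach essentially replaces the constant c with a model-dependent table lookup.

```python
def ratio_test(q1, q2, c=2.0):
    """Accept the best integer candidate only if the second-best candidate
    fits sufficiently worse than the best one.

    q1 : squared norm of residuals for the best integer candidate
    q2 : squared norm for the second-best candidate (q2 >= q1)
    c  : threshold; empirical in practice, or taken from a fixed failure
         rate criteria table as in the paper
    """
    return (q2 / q1) >= c

# Toy usage: the second-best candidate fits 3x worse -> accept the fix.
print(ratio_test(q1=1.8, q2=5.4, c=2.0))  # True
```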

Relevance: 100.00%

Abstract:

Many large-scale GNSS CORS networks have been deployed around the world to support various commercial and scientific applications. To use these networks for real-time kinematic positioning services, one of the major challenges is ambiguity resolution (AR) over long inter-station baselines in the presence of considerable atmospheric biases. Usually, the widelane ambiguities are fixed first, followed by determination of the narrowlane ambiguity integers based on the ionosphere-free model, in which the widelane integers are introduced as known quantities. This paper seeks to improve AR performance over long baselines through efficient procedures for improved float solutions and ambiguity fixing. The contribution is threefold: (1) instead of using ionosphere-free measurements, absolute and/or relative ionospheric constraints are introduced in an ionosphere-constrained model to enhance the model strength, resulting in better float solutions; (2) realistic widelane ambiguity precision is estimated by capturing the multipath effects due to the observation complexity, improving the reliability of widelane AR; (3) for narrowlane AR, partial AR is applied to a subset of ambiguities selected according to successively increasing elevation. For fixing the scalar ambiguity, a rounding method with controllable error probability is proposed. The established ionosphere-constrained model can be solved efficiently with a sequential Kalman filter; it can be reduced to special cases simply by adjusting the variances of the ionospheric constraints, or extended with more parameters and constraints. The presented methodology is tested over seven baselines of around 100 km from the US CORS network. The results show that the new widelane AR scheme obtains a 99.4% successful fixing rate with a 0.6% failure rate, while the new rounding method for narrowlane AR obtains a fixing rate of 89% with a failure rate of 0.8%. In summary, AR reliability can be efficiently improved with rigorously controlled probability of incorrectly fixed ambiguities.
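A minimal sketch of what an error-probability-controlled rounding decision for a scalar ambiguity can look like, using the classical success rate 2Φ(1/(2σ)) − 1 of nearest-integer rounding for an unbiased, normally distributed float ambiguity. The function names and the tolerance value are assumptions for illustration, not the paper's algorithm; the 0.8% tolerance merely echoes the narrowlane failure rate reported above.

```python
from math import erf, sqrt

def phi(x):
    # Standard normal CDF.
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def rounding_success_rate(sigma):
    """P(nearest-integer rounding of an unbiased float ambiguity is correct):
    the classical 2*Phi(1/(2*sigma)) - 1 formula."""
    return 2.0 * phi(1.0 / (2.0 * sigma)) - 1.0

def fix_by_rounding(float_amb, sigma, max_error=0.008):
    """Round only when the implied error probability is within tolerance;
    otherwise keep the ambiguity float. Tolerance value is an assumption."""
    if 1.0 - rounding_success_rate(sigma) <= max_error:
        return round(float_amb)
    return None

print(fix_by_rounding(3.12, sigma=0.15))  # precise enough -> 3
print(fix_by_rounding(3.12, sigma=0.40))  # too uncertain  -> None
```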

Relevance: 100.00%

Abstract:

The ambiguity acceptance test is an important quality control procedure in high-precision GNSS data processing. Although ambiguity acceptance test methods have been extensively investigated, how to determine their thresholds is still not well understood. Currently, the threshold is determined with either the empirical approach or the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis, while the FF-approach is theoretically rigorous but computationally demanding. Hence, the key to the threshold determination problem is how to determine the threshold efficiently and in a reasonable way. In this study, a new threshold determination method, named the threshold function method, is proposed to reduce the complexity of the FF-approach. The threshold function method simplifies the FF-approach through a modeling procedure and an approximation procedure. The modeling procedure uses a rational function model to describe the relationship between the FF-difference test threshold and the integer least-squares (ILS) success rate. The approximation procedure replaces the ILS success rate with the easy-to-calculate integer bootstrapping (IB) success rate. The corresponding modeling and approximation errors are analysed with simulated data so as to avoid nuisance biases and the impact of an unrealistic stochastic model. The results indicate that the proposed method can greatly simplify the FF-approach without introducing significant modeling error. The threshold function method makes fixed failure rate threshold determination feasible for real-time applications.
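A sketch of the two ingredients the abstract combines: the standard integer bootstrapping success rate, computed from the conditional standard deviations of the (decorrelated) ambiguities, and a rational function mapping that success rate to a test threshold. The rational function coefficients below are placeholders for illustration, not the values fitted in the paper.

```python
from math import erf, sqrt

def phi(x):
    # Standard normal CDF.
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def ib_success_rate(cond_stds):
    """Integer bootstrapping success rate:
        Ps = prod_i (2 * Phi(1 / (2 * sigma_{i|I})) - 1),
    where sigma_{i|I} are the conditional standard deviations."""
    ps = 1.0
    for sigma in cond_stds:
        ps *= 2.0 * phi(1.0 / (2.0 * sigma)) - 1.0
    return ps

def threshold_function(ps, a0=3.0, a1=-1.5, b1=0.5):
    """Hypothetical rational threshold model mu(Ps) = (a0 + a1*Ps) / (1 + b1*Ps).
    The paper fits such a model to fixed failure rate simulations; these
    coefficients are placeholders only."""
    return (a0 + a1 * ps) / (1.0 + b1 * ps)

ps = ib_success_rate([0.08, 0.10, 0.12])   # toy conditional std devs (cycles)
print(ps, threshold_function(ps))
```

The appeal of this construction is that both steps are closed-form, so a real-time receiver can evaluate the threshold at every epoch instead of running the simulation-heavy FF-approach online.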

Relevance: 100.00%

Abstract:

Ambiguity validation, an important procedure in integer ambiguity resolution, tests the correctness of the fixed integer ambiguities of phase measurements before they are used in the positioning computation. Most existing investigations of ambiguity validation focus on the test statistic; how to determine the threshold more reasonably is less well understood, although it is one of the most important topics in ambiguity validation. Currently, there are two threshold determination methods in the ambiguity validation procedure: the empirical approach and the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis. The fixed failure rate approach has a rigorous basis in probability theory, but it employs a more complicated procedure. This paper focuses on how to determine the threshold easily and reasonably. Both the FF-ratio test and the FF-difference test are investigated in this research, and extensive simulation results show that the FF-difference test can achieve comparable or even better performance than the well-known FF-ratio test. Another benefit of adopting the FF-difference test is that its threshold can be expressed as a function of the integer least-squares (ILS) success rate with a specified failure rate tolerance. Thus, a new threshold determination method, named the threshold function method, is proposed for the FF-difference test. The threshold function method preserves the fixed failure rate characteristic and is also easy to apply. The performance of the threshold function is validated with simulated data. The validation results show that with the threshold function method, the impact of the modelling error on the failure rate is less than 0.08%. Overall, the threshold function for the FF-difference test is a very promising threshold determination method, and it makes the FF-approach applicable to real-time GNSS positioning applications.
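For completeness, a minimal sketch of the FF-difference test decision itself. Here the threshold is treated as a given input; in the paper it would come from the threshold function of the ILS success rate at the chosen failure rate tolerance. The numbers are illustrative assumptions.

```python
def ff_difference_test(q1, q2, threshold):
    """FF-difference test: accept the integer fix when the gap between the
    second-best (q2) and best (q1) candidates' squared residual norms
    exceeds the threshold."""
    return (q2 - q1) >= threshold

# Toy usage: a clear gap between the two best candidates -> accept the fix.
print(ff_difference_test(q1=1.8, q2=5.4, threshold=3.0))  # True
```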

Relevance: 100.00%

Abstract:

In this paper, we study the relationship between the failure rate and the mean residual life of doubly truncated random variables. Accordingly, we develop characterizations for the exponential, Pareto II and beta distributions. Further, we generalize the identities for the Pearson and exponential families of distributions given in Nair and Sankaran (1991) and Consul (1995), respectively. Applications of these measures in the context of length-biased models are also explored.
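For orientation, the classical (untruncated) forms of the two quantities and the identity connecting them, from which the exponential characterization follows; the doubly truncated analogues studied in the paper condition on the truncation interval instead. A sketch in standard notation, not the paper's:

```latex
h(x) = \frac{f(x)}{S(x)}, \qquad
m(x) = \mathbb{E}\left[\,X - x \mid X > x\,\right]
     = \frac{\int_x^{\infty} S(t)\,dt}{S(x)},
\qquad\text{so}\qquad
h(x) = \frac{1 + m'(x)}{m(x)} .
```

In particular, a constant mean residual life m(x) = 1/λ forces h(x) = λ, recovering the exponential distribution; the identities for doubly truncated variables generalize this interplay.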

Relevance: 100.00%

Abstract:

The purpose of the present study was to evaluate in vivo the failure rate of metallic brackets bonded with two orthodontic composites. Nineteen patients, aged 10.5 to 38.7 years and needing corrective orthodontic treatment, were selected for the study. The enamel surfaces from second premolar to second premolar were treated with Transbond Plus Self Etching Primer (3M Unitek). Next, 380 orthodontic brackets were bonded to maxillary and mandibular teeth, as follows: 190 with Transbond XT composite (3M Unitek) (control) and 190 with Transbond Plus Color Change (3M Unitek) (experimental), in contralateral quadrants. The bonded brackets were light-cured for 40 s, and initial alignment archwires were inserted. Bond failure rates were recorded over a six-month period, at the end of which six bond failures had occurred, three for each composite. The Kaplan-Meier method and the log-rank (Mantel-Cox) test were used for statistical analysis, and no statistically significant difference was found between the materials (p=0.999). Both Transbond XT and Transbond Plus Color Change had low debonding rates over the study period.
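A minimal, hand-rolled sketch of the Kaplan-Meier survival estimate used in the analysis above (written out rather than taken from a statistics package); the bracket follow-up data are made up for illustration.

```python
from collections import Counter

def kaplan_meier(durations, events):
    """Kaplan-Meier estimator: S(t) = prod over failure times t_i <= t of
    (1 - d_i / n_i), with d_i failures and n_i subjects at risk at t_i.

    durations : follow-up time per bracket (months)
    events    : 1 = bond failure observed, 0 = censored at end of follow-up
    """
    failures = Counter(t for t, e in zip(durations, events) if e == 1)
    s, curve = 1.0, []
    for t in sorted(failures):
        n_at_risk = sum(1 for d in durations if d >= t)
        s *= 1.0 - failures[t] / n_at_risk
        curve.append((t, s))
    return curve

# Toy data: 10 brackets followed for 6 months, two bond failures.
months = [6, 6, 2, 6, 6, 4, 6, 6, 6, 6]
failed = [0, 0, 1, 0, 0, 1, 0, 0, 0, 0]
print(kaplan_meier(months, failed))  # ~[(2, 0.9), (4, 0.8)]
```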

Relevance: 100.00%

Abstract:

Resistance of trypanosomes to melarsoprol is ascribed to reduced uptake of the drug via the P2 nucleoside transporter. The aim of this study was to look for evidence of drug resistance in Trypanosoma brucei gambiense isolates from sleeping sickness patients in Ibba, South Sudan, an area with a high melarsoprol failure rate. Eighteen T. b. gambiense stocks were characterized phenotypically, and only 10 strains genotypically. In vitro, all isolates were sensitive to melarsoprol, melarsen oxide and diminazene. Infected mice were cured with a 4-day treatment of 2.5 mg/kg body weight melarsoprol, confirming that the isolates were sensitive. The gene that codes for the P2 transporter, TbAT1, was amplified by PCR and sequenced. The sequences were almost identical to the TbAT1(sensitive) reference, except for one point mutation, C1384T, resulting in the amino acid change proline-462 to serine. None of the described TbAT1(resistant)-type mutations were detected. In a T. b. gambiense sleeping sickness focus where melarsoprol had to be abandoned due to the high incidence of treatment failures, no evidence of drug-resistant trypanosomes or of TbAT1(resistant)-type alleles of the P2 transporter could be found. These findings indicate that factors other than drug resistance contribute to melarsoprol treatment failures.

Relevance: 100.00%

Abstract:

INTRODUCTION: Stable reconstruction of proximal femoral (PF) fractures is especially challenging due to the peculiarity of the injury patterns and the high load-bearing requirement. Since its introduction in 2007, the PF locking compression plate (LCP) 4.5/5.0 has improved osteosynthesis for intertrochanteric and subtrochanteric fractures of the femur. This study reports our early results with this implant. METHODS: Between January 2008 and June 2010, 19 of 52 patients (12 males, 7 females; mean age 59 years, range 19-96 years) presenting with fractures of the trochanteric region were treated at the authors' level 1 trauma centre with open reduction and internal fixation using the PF-LCP. Postoperatively, partial weight bearing was allowed for all 19 patients. Follow-up included a thorough clinical and radiological evaluation at 1.5, 3, 6, 12, 24, 36 and 48 months. Failure analysis was based on conventional radiological and clinical assessment of the type of fracture, postoperative reduction, secondary fracture displacement in relation to the fracture constellation, and postoperative clinical function (Merle d'Aubigné score). RESULTS: In 18 patients, surgery achieved adequate reduction and stable fixation without intraoperative complications; in one patient an ad latus displacement was observed on postoperative X-rays. At the three-month follow-up, four patients presented with secondary varus collapse, and at the six-month follow-up two patients had cut-outs of the proximal fragment, with one patient having implant failure due to a broken proximal screw. Revision surgeries were performed in eight patients: one received a change of one screw, three underwent reosteosynthesis with implantation of a condylar plate, and one underwent hardware removal with secondary implantation of a total hip prosthesis. Eight patients suffered from persistent trochanteric pain, and three patients underwent hardware removal. CONCLUSIONS: Early results of PF-LCP osteosynthesis show major complications in 7 of 19 patients, requiring reosteosynthesis or prosthesis implantation due to secondary loss of reduction, or hardware removal. Further studies are required to evaluate the limitations of this device.

Relevance: 100.00%

Abstract:

BACKGROUND: We observed a case of conductor externalization in a Biotronik Linox lead. OBJECTIVE: To investigate the performance of the Linox and identical Sorin Vigila leads and the prevalence of conductor externalization. METHODS: We compared the performance of all Linox and Vigila leads implanted at our center (BL group; n=93) with that of all Boston Scientific Endotak Reliance leads (ER group; n=190) and Medtronic Sprint Quattro leads (SQ group; n=202) implanted during the same period, and screened all BL group patients for conductor externalization. RESULTS: We identified 8 lead failures in the BL group (the index case of conductor externalization; 6 cases of non-physiological high-rate sensing; one case of high-voltage conductor fracture). Prospective fluoroscopic screening of 98% of all active BL group cases revealed one additional case of conductor externalization. Median follow-up was 41, 27 and 29 months for the BL, ER and SQ groups, respectively; lead survival was 94.9%, 99.2% and 100% at 3 years, and 88%, 97.5% and 100% at 5 years (p=0.038 for BL vs. ER and p=0.007 for BL vs. SQ by the log-rank test). Younger age at implant was an independent predictor of lead failure in the BL group (adjusted HR 0.85 [95% confidence interval 0.77-0.94]; p=0.001). CONCLUSION: At our center, survival of the Linox lead is 88% at five years, significantly worse than its comparators. Conductor externalization is present in a minority of failed Linox leads. Younger age at implant is an independent predictor of Linox lead failure.

Relevance: 100.00%

Abstract:

Cover title.

Relevance: 100.00%

Abstract:

A common assumption in the restaurant industry is that restaurants fail at an exceedingly high rate. However, statistical research to support this assumption is limited. The authors present a study of 10 years in the life of three markets and offer new data for managers to consider.