415 results for Failure Rate


Relevance:

100.00%

Publisher:

Abstract:

Aims: This paper describes the development of a risk adjustment (RA) model predictive of individual lesion treatment failure in percutaneous coronary interventions (PCI), for use in a quality monitoring and improvement program. Methods and results: Prospectively collected data for 3972 consecutive revascularisation procedures (5601 lesions) performed between January 2003 and September 2011 were studied. Data on procedures to September 2009 (n = 3100) were used to identify factors predictive of lesion treatment failure. The factors identified were lesion risk class (p < 0.001), occlusion type (p < 0.001), patient age (p = 0.001), vessel system (p < 0.04), vessel diameter (p < 0.001), unstable angina (p = 0.003) and presence of major cardiac risk factors (p = 0.01). A Bayesian RA model was built using these factors, and its predictive performance was tested on the remaining procedures (area under the receiver operating characteristic curve: 0.765; Hosmer–Lemeshow p value: 0.11). Cumulative sum, exponentially weighted moving average and funnel plots were constructed using the RA model and evaluated subjectively. Conclusion: An RA model was developed and applied to statistical process control (SPC) monitoring of lesion failure in a PCI database. If linked to appropriate quality improvement governance response protocols, SPC using this RA tool might improve quality control and risk management by identifying variation in performance based on a comparison of observed and expected outcomes.
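
As an illustration of the kind of SPC chart built on such an RA model, here is a minimal risk-adjusted CUSUM sketch in Python. The weights follow the standard log-likelihood-ratio construction; the odds ratio, decision threshold and function name are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def risk_adjusted_cusum(outcomes, predicted_risks, odds_ratio=2.0, h=4.0):
    """Risk-adjusted CUSUM for lesion treatment failure.

    outcomes        : 1 if the lesion treatment failed, else 0
    predicted_risks : per-lesion failure probability from the RA model
    odds_ratio      : alternative hypothesis (here, a doubling of failure odds)
    h               : decision threshold; crossing it signals a shift
    """
    s, path = 0.0, []
    for y, p in zip(outcomes, predicted_risks):
        # Log-likelihood-ratio weight for this case under H1 vs H0.
        if y == 1:
            w = np.log(odds_ratio / (1 - p + odds_ratio * p))
        else:
            w = np.log(1 / (1 - p + odds_ratio * p))
        s = max(0.0, s + w)  # the CUSUM statistic never drops below zero
        path.append(s)
        if s > h:
            print("Signal: observed failures exceed risk-adjusted expectation")
            s = 0.0          # restart the chart after a signal
    return path
```

Because each weight depends on the lesion's predicted risk, a run of failures in low-risk cases raises the statistic much faster than the same run in high-risk cases, which is exactly the comparison of observed and expected outcomes the abstract describes.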

Relevance:

100.00%

Publisher:

Abstract:

Ambiguity resolution plays a crucial role in real-time kinematic GNSS positioning, which delivers centimetre-precision results if all the ambiguities in each epoch are correctly fixed to integers. However, incorrectly fixed ambiguities can produce positioning offsets of up to several metres without notice, so ambiguity validation is essential for controlling ambiguity resolution quality. Currently, the most popular ambiguity validation method is the ratio test, whose criterion is often determined empirically. An empirically determined criterion can be dangerous, because a fixed criterion cannot fit all scenarios and does not directly control the ambiguity resolution risk. In practice, depending on the underlying model strength, the ratio test criterion can be too conservative for some models and too risky for others. A more rational approach is to determine the criterion according to the underlying model and the user requirement. Missed detections of incorrect integers lead to hazardous results and should be strictly controlled; in ambiguity resolution, the missed-detection rate is known as the failure rate. In this paper, a fixed failure rate ratio test method is presented and applied to the analysis of GPS and Compass positioning scenarios. The fixed failure rate approach is derived from integer aperture estimation theory and is therefore theoretically rigorous. In this approach, a criteria table for the ratio test is computed from extensive data simulations, and real-time users determine the ratio test criterion by looking up the table. The method has previously been applied to medium-distance GPS ambiguity resolution, but multi-constellation and high-dimensional scenarios have not been examined so far. In this paper, a general ambiguity validation model is derived from hypothesis testing theory, the fixed failure rate approach is introduced, and in particular the relationship between the ratio test threshold and the failure rate is examined. Finally, the factors that influence the fixed failure rate ratio test threshold are discussed on the basis of extensive data simulation. The results show that, given a proper stochastic model, the fixed failure rate approach is a more reasonable ambiguity validation method.
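
A minimal sketch of the ratio test with a precomputed criteria table, in Python. The table values and names are hypothetical placeholders; the real criteria are obtained by per-model simulation, as the abstract describes:

```python
# Hypothetical excerpt of a fixed failure rate criteria table: for a given
# ambiguity dimension, the ratio test threshold mu that holds the failure
# rate at a chosen tolerance (real tables are simulated per model).
CRITERIA_TABLE = {4: 0.35, 6: 0.40, 8: 0.45}   # illustrative values only

def ratio_test(q_best, q_second, dim):
    """Accept the best integer candidate if q_best / q_second <= mu.

    q_best   : squared residual norm of the best integer candidate
    q_second : squared residual norm of the second-best candidate
    dim      : number of ambiguities, used as the table lookup key
    """
    mu = CRITERIA_TABLE[dim]        # table lookup, as a real-time user would do
    ratio = q_best / q_second       # a ratio near 1 means an ambiguous decision
    return ratio <= mu              # accept the fixed solution, else keep float
```

The point of the fixed failure rate construction is that mu varies with the model strength (here crudely proxied by dimension), instead of being one empirical constant for all scenarios.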

Relevance:

100.00%

Publisher:

Abstract:

Many large-scale GNSS CORS networks have been deployed around the world to support various commercial and scientific applications. To use these networks for real-time kinematic positioning services, one of the major challenges is ambiguity resolution (AR) over long inter-station baselines in the presence of considerable atmospheric biases. Usually, the widelane ambiguities are fixed first, followed by determination of the narrowlane ambiguity integers based on the ionosphere-free model, in which the widelane integers are introduced as known quantities. This paper seeks to improve AR performance over long baselines through efficient procedures for improved float solutions and ambiguity fixing. The contribution is threefold: (1) instead of using ionosphere-free measurements, absolute and/or relative ionospheric constraints are introduced in an ionosphere-constrained model to strengthen the model and thereby produce better float solutions; (2) realistic widelane ambiguity precision is estimated by capturing the multipath effects due to observation complexity, improving the reliability of widelane AR; (3) for narrowlane AR, partial AR is applied to a subset of ambiguities selected according to successively increasing elevation. For fixing the scalar ambiguity, a rounding method with controllable error probability is proposed. The ionosphere-constrained model can be solved efficiently with a sequential Kalman filter; it can be reduced to special cases simply by adjusting the variances of the ionospheric constraints, or extended with more parameters and constraints. The methodology is tested on seven baselines of around 100 km from the US CORS network. The results show that the new widelane AR scheme achieves a 99.4% successful fixing rate with a 0.6% failure rate, while the new narrowlane rounding method achieves an 89% fixing rate with a 0.8% failure rate. In summary, AR reliability can be efficiently improved with rigorous, controllable probability of incorrectly fixed ambiguities.
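
A minimal sketch of an error-probability-controllable rounding step for a scalar ambiguity, assuming a zero-mean normal error model. The 0.8% default mirrors the narrowlane failure rate quoted above, but the function name and defaults are illustrative, not the paper's algorithm:

```python
from math import erf, sqrt, floor

def round_with_error_control(a_float, sigma, max_fail_prob=0.008):
    """Scalar ambiguity rounding with a controllable error probability.

    Under a zero-mean normal error model, rounding to the nearest integer
    is correct with probability 2*Phi(1/(2*sigma)) - 1, where Phi is the
    standard normal CDF. The integer is accepted only if the complementary
    error probability stays below the tolerance.
    """
    phi = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))
    p_correct = 2.0 * phi(1.0 / (2.0 * sigma)) - 1.0
    if 1.0 - p_correct <= max_fail_prob:
        return floor(a_float + 0.5), True   # fixed integer accepted
    return a_float, False                   # precision too poor: keep the float
```

The design choice is that the acceptance decision is driven by the ambiguity's own precision rather than by a global empirical rule, which is what makes the failure rate controllable.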

Relevance:

100.00%

Publisher:

Abstract:

The ambiguity acceptance test is an important quality control procedure in high-precision GNSS data processing. Although ambiguity acceptance test methods have been extensively investigated, how to determine their thresholds is still not well understood. Currently, the threshold is determined either empirically or with the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis, while the FF-approach is theoretically rigorous but computationally demanding. The key threshold determination problem is therefore how to determine the threshold efficiently and in a reasonable way. In this study, a new threshold determination method, named the threshold function method, is proposed to reduce the complexity of the FF-approach. The threshold function method simplifies the FF-approach through a modelling procedure and an approximation procedure. The modelling procedure uses a rational function model to describe the relationship between the FF-difference test threshold and the integer least-squares (ILS) success rate. The approximation procedure replaces the ILS success rate with the easy-to-calculate integer bootstrapping (IB) success rate. The corresponding modelling and approximation errors are analysed with simulated data to avoid nuisance biases and the impact of an unrealistic stochastic model. The results indicate that the proposed method greatly simplifies the FF-approach without introducing significant modelling error, making fixed failure rate threshold determination feasible for real-time applications.
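
A sketch of the two ingredients in Python: the closed-form integer bootstrapping success rate used as the approximation, and a rational function mapping it to a threshold. The IB formula is the standard closed form from the GNSS literature; the rational-function coefficients are placeholders, not the fitted values from the paper:

```python
from math import erf, sqrt

def ib_success_rate(cond_sigmas):
    """Integer bootstrapping success rate from the conditional standard
    deviations of the (decorrelated) ambiguities:

        P_IB = prod_i [ 2 * Phi(1 / (2 * sigma_{i|I})) - 1 ]

    This easy-to-evaluate closed form stands in for the ILS success rate
    in the threshold function method.
    """
    phi = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))
    p = 1.0
    for s in cond_sigmas:
        p *= 2.0 * phi(1.0 / (2.0 * s)) - 1.0
    return p

def threshold_function(p_success, coeffs=(0.6, -0.5, 1.0, 0.2)):
    """Hypothetical rational function mapping the success rate to an
    FF-difference test threshold; the coefficients are placeholders,
    not the model fitted in the paper."""
    a0, a1, b0, b1 = coeffs
    return (a0 + a1 * p_success) / (b0 + b1 * p_success)

# Usage: threshold_function(ib_success_rate([0.08, 0.10, 0.12]))
```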

Relevance:

100.00%

Publisher:

Abstract:

Ambiguity validation, an important step in integer ambiguity resolution, tests the correctness of the fixed integer ambiguities of phase measurements before they are used in the positioning computation. Most existing investigations of ambiguity validation focus on the test statistic; how to determine the threshold more reasonably is less well understood, although it is one of the most important topics in ambiguity validation. Currently, there are two threshold determination methods in the ambiguity validation procedure: the empirical approach and the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis; the fixed failure rate approach has a rigorous basis in probability theory but involves a more complicated procedure. This paper focuses on how to determine the threshold easily and reasonably. Both the FF-ratio test and the FF-difference test are investigated, and extensive simulation results show that the FF-difference test can achieve comparable or even better performance than the well-known FF-ratio test. A further benefit of the FF-difference test is that its threshold can be expressed as a function of the integer least-squares (ILS) success rate for a specified failure rate tolerance. A new threshold determination method, a threshold function for the FF-difference test, is therefore proposed. The threshold function method preserves the fixed failure rate characteristic and is also easy to apply. Its performance is validated with simulated data; the results show that, with the threshold function method, the impact of the modelling error on the failure rate is less than 0.08%. Overall, the threshold function for the FF-difference test is a very promising threshold determination method, and it makes the FF-approach applicable to real-time GNSS positioning applications.
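
For concreteness, a minimal sketch of the FF-difference test decision itself; the threshold argument would come from a threshold function of the ILS success rate as described above, and the names here are illustrative:

```python
def difference_test(q_best, q_second, threshold):
    """FF-difference test: accept the fixed solution when the gap between
    the second-best and best squared residual norms exceeds a threshold
    chosen to meet a specified failure rate tolerance."""
    return (q_second - q_best) >= threshold
```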

Relevance:

70.00%

Publisher:

Abstract:

This project was an observational study of outpatients following lower limb surgical procedures for the removal of skin cancers. The findings highlight a previously unreported high surgical site failure rate. The results also identified four potential risk factors (increasing age, presence of leg pain, split-skin graft, and haematoma) that negatively affect surgical site healing in this population.

Relevance:

70.00%

Publisher:

Abstract:

This paper presents an event-based failure model to predict the number of failures that occur in water distribution assets. Such models have often been based on analysis of historical failure data combined with pipe characteristics and environmental conditions. In this paper, weather data have been added to the model to take into account the commonly observed seasonal variation of the failure rate. The theoretical basis of existing logistic regression models is briefly described, along with the refinements made to the model to include the seasonal variation of weather. The performance of these refinements is tested using data from two Australian water authorities.
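
A minimal sketch of such a logistic failure model with seasonal weather covariates, using scikit-learn. The feature names, synthetic data and sine/cosine seasonal encoding are illustrative assumptions rather than the authors' specification:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def seasonal_features(age_years, diameter_mm, rainfall_mm, day_of_year):
    # Encode seasonality as sine/cosine terms so the model can capture
    # the observed annual cycle in failure rates.
    angle = 2.0 * np.pi * day_of_year / 365.25
    return np.column_stack([
        age_years, diameter_mm, rainfall_mm,
        np.sin(angle), np.cos(angle),
    ])

# X: one row per pipe-period; y: 1 if a failure event occurred in the period.
rng = np.random.default_rng(0)
X = seasonal_features(rng.uniform(5, 80, 500),
                      rng.choice([100, 150, 225], 500),
                      rng.uniform(0, 120, 500),
                      rng.integers(1, 366, 500))
y = rng.integers(0, 2, 500)        # placeholder outcomes for the sketch
model = LogisticRegression(max_iter=1000).fit(X, y)
print(model.coef_)                 # seasonal terms expose the climate effect
```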

Relevance:

70.00%

Publisher:

Abstract:

This paper details a statistical analysis of historical failure data that focuses on determining how local climate affects pipe failure rates. Seasonality was found in the data, indicating an effect of local climate on failure rate. Significant variation in failure rates was seen between the months of December and May, especially in February/March, whilst limited variation was seen in other months of the year. Further analysis found that failure rates were strongly correlated with the minimum antecedent precipitation index and net evaporation, and that climate affects failure rate by influencing soil moisture content. Interaction effects between static attributes of the pipe-environment system and local climate were also investigated.
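
A minimal sketch of the antecedent precipitation index referenced above; the recursive form and decay constant are textbook conventions, not necessarily the values used in the paper:

```python
def antecedent_precipitation_index(daily_rain_mm, k=0.9):
    """Antecedent precipitation index (API), a soil-moisture proxy:

        API_t = k * API_{t-1} + P_t

    where P_t is the daily precipitation and k is a decay constant
    (k = 0.9 is a common textbook choice). The paper correlates failure
    rates with the *minimum* API, i.e. min(series) over the window.
    """
    api, series = 0.0, []
    for p in daily_rain_mm:
        api = k * api + p
        series.append(api)
    return series
```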

Relevance:

70.00%

Publisher:

Abstract:

BACKGROUND After general surgery, the lower limb experiences some of the highest complication rates. However, little is known about the factors that contribute to surgical site failure in the lower limb dermatological surgery population. OBJECTIVE To determine the incidence of lower limb surgical site failure and to explore the predictors that contribute to it. METHODS A prospective observational study design was used to collect data from 73 participants from July 2010 to March 2012. Incidence was determined as the percentage of surgical site failures in the total population. Predictors were determined with a binary logistic regression model. RESULTS The surgical site failure rate was 53.4%. Split-skin grafting had a higher failure rate than primary closure, 66% versus 26.1%. Increasing age (p = .04) and the presence of postoperative hematoma (p = .01) were identified as predictors of lower limb surgical site failure, and all patients who developed surgical site infection experienced surgical site failure (p = .01). CONCLUSION The findings confirm that the lower limb is at high risk of surgical site failure, and two predictors of surgical site failure were determined in this cohort. However, to understand this phenomenon and make recommendations to reduce surgical site complications, further research in this field is required.

Relevance:

60.00%

Publisher:

Abstract:

Poor student engagement and high failure rates in first year units were addressed at the Queensland University of Technology (QUT) with a course restructure involving a fresh approach to introducing programming. Students' first taste of programming in the new course focused less on the language and syntax, and more on problem solving and design, and on the role of programming in relation to other technologies they are likely to encounter in their studies. In effect, several technologies that have historically been compartmentalised and taught in isolation have been brought together as a breadth-first introduction to IT. Incorporating databases and Web development technologies into what used to be a purely programming unit gave students a very short introduction to each technology, with programming acting as the glue between them. As a result, students not only had a clearer understanding of the application of programming in the real world, but were able to determine their preference or otherwise for each of the technologies introduced, which will help them when the time comes to choose a course major. Students engaged well in an intensely collaborative learning environment for this unit, which was designed both to support the needs of students and to meet industry expectations. Attrition from the unit was low, computer laboratory practical attendance rates remained high throughout the semester for the first time, and the failure rate fell to a single-figure percentage.

Relevance:

60.00%

Publisher:

Abstract:

In response to a range of contextual drivers, the worldwide adoption of ERP systems in higher education institutions (HEIs) has increased substantially over the past decade. Although the difficulties and the high failure rate of ERP implementations in university environments have been cited in the literature, research on critical success factors (CSFs) for ERP implementations in this context is rare and fragmented. This paper is part of a larger research effort that aims to contribute to understanding the phenomenon of ERP implementations and evaluations in HEIs in the Australasian region; it identifies previously reported critical success factors for ERP system implementations and discusses the importance of these factors.

Relevance:

60.00%

Publisher:

Abstract:

The consistently high failure rate in Queensland University of Technology's introductory programming subject reflects a dilemma facing other universities worldwide. Experiments were conducted over a number of semesters to quantify the effectiveness of collaborative learning on introductory level programming students, replicating previous studies in this area. A selection of workshops in the introductory programming subject required students to problem-solve and program in pairs, mimicking the eXtreme Programming concept of pair programming. The failure rate for the subject fell from an average of 30% since 2003 (with a high of 41% in 2006) to just 5% for students who worked consistently in pairs.

Relevance:

60.00%

Publisher:

Abstract:

Background The preservation of meniscal tissue is important to protect joint surfaces. Purpose We take an aggressive approach to meniscal repair, including repairing tears other than those classically suited to repair. Here we present the medium- to long-term outcomes of inside-out meniscal repair in elite athletes. Study Design Case series; Level of evidence, 4. Methods Forty-two elite athletes underwent 45 meniscal repairs, all performed using an arthroscopically assisted inside-out technique. Eighty-three percent of these athletes had ACL reconstruction at the same time. Patients returned a completed questionnaire (including Lysholm and International Knee Documentation Committee [IKDC] scores). Mean follow-up was 8.5 years. Failure was defined as the development of joint line pain, locking, or swelling requiring repeat arthroscopy and partial meniscectomy. Results The average Lysholm and subjective IKDC scores were 89.6 and 85.4, respectively. Eighty-one percent of patients returned to their main sport, most at a similar level, at a mean of 10.4 months after repair, reflecting the high proportion of concurrent ACL reconstructions in this group. We identified 11 definite failures (10 medial and 1 lateral meniscus) that required excision, representing a 24% failure rate. One further patient had a possible failed repair, giving a worst-case failure rate of 26.7% at a mean of 42 months after surgery. However, 7 of these failures were associated with a further injury, so the atraumatic failure rate was 11%. Age, size, and location of the tears were not associated with a higher failure rate. Medial meniscal repairs were significantly more likely to fail than lateral repairs, with failure rates of 36.4% and 5.6%, respectively (P < .05). Conclusion Meniscal repair and healing are possible, and most elite athletes can return to their preinjury level of activity.

Relevance:

60.00%

Publisher:

Abstract:

My journey with Peer Assisted Study Sessions, or Supplemental Instruction (SI), began in 1993 when I took over a 1st year, 1st semester unit in QUT's Bachelor of Engineering program. The unit had 500 enrolments, with students from all 10 engineering majors at QUT. The 500 students received a 2 hour lecture and a 1 hour tutorial per week, usually run by academic staff or postgraduate students. The unit covered basic mechanics, a challenging set of topics on how forces interact with various bodies. One normally expects 1st year students to find it difficult to come to grips with the material; even so, when I took over that unit in 1993, the failure rate had usually been around 50%.

Relevance:

60.00%

Publisher:

Abstract:

Modern society has come to expect electrical energy on demand, while many facilities in power systems are aging beyond repair and maintenance. The risk of failure increases as equipment ages, with potentially serious consequences for the continuity of electricity supply. Because the equipment used in high-voltage power networks is very expensive, it may not be economically feasible to purchase spares and store them in a warehouse for extended periods; on the other hand, there is normally a significant lead time between ordering equipment and receiving it. This situation has created considerable interest in the evaluation and application of probability methods for aging plant and the provision of spares in bulk supply networks, and is of particular importance for substations.

Quantitative adequacy assessment of substation and sub-transmission power systems is generally done using a contingency enumeration approach, which includes the evaluation of contingencies and their classification based on selected failure criteria. The problem is complex because of the need to model the operation of substation and sub-transmission equipment in detail using network flow evaluation, and to consider multiple levels of component failure. In this thesis a new model for aging equipment is developed that combines the standard treatment of random failures with a specific model for aging failures. This technique is applied to include and examine the impact of aging equipment on the reliability of bulk supply loads and distribution network consumers over a defined range of planning years.

Power system risk indices depend on many factors, such as the physical network configuration and operation, the aging condition of the equipment, and the relevant constraints. The impact and importance of equipment reliability on power system risk indices in a network with aging facilities provide valuable information that helps utilities better understand network performance and the weak links in the system. In this thesis, algorithms are developed to measure the contribution of individual pieces of equipment to the power system risk indices, as part of a novel risk analysis tool. A new cost-worth approach is also developed that supports early planning decisions on the replacement of non-repairable aging components, in order to maintain an economically acceptable level of system reliability. The concepts, techniques and procedures developed in this thesis are illustrated numerically using published test systems. The methods and approaches presented substantially improve the accuracy of risk predictions by explicitly considering the effect of equipment entering a period of increased risk of non-repairable failure.
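
A minimal sketch of how an aging (end-of-life) failure probability can be combined with a constant random-failure probability for a component in a contingency enumeration run, assuming a Weibull aging model; the shape, scale and probability values are illustrative, not the thesis's fitted parameters:

```python
from math import exp

def aging_failure_prob(age_years, horizon_years, beta=3.0, eta=40.0):
    """Probability that a non-repairable component that has survived to
    `age_years` fails within the next `horizon_years`, under a Weibull
    aging model (shape beta > 1 gives an increasing hazard):

        R(t) = exp(-(t / eta) ** beta)
        P(fail in (t, t+h] | alive at t) = 1 - R(t + h) / R(t)
    """
    R = lambda t: exp(-((t / eta) ** beta))
    return 1.0 - R(age_years + horizon_years) / R(age_years)

# Combining random and aging failure modes: the component is unavailable
# if either its constant random outage occurs or it reaches end of life
# within the study period (independence assumed for the sketch).
p_random = 0.01                      # illustrative random-failure probability
p_aging = aging_failure_prob(age_years=35, horizon_years=5)
p_total = 1.0 - (1.0 - p_random) * (1.0 - p_aging)
print(f"aging: {p_aging:.3f}, combined: {p_total:.3f}")
```

Evaluating p_aging per component and per planning year is what lets the risk indices reflect equipment entering its period of increased non-repairable failure risk, as the abstract describes.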