291 results for Failure Probability
Abstract:
This study used automated data processing techniques to calculate a set of novel treatment plan accuracy metrics and to investigate their usefulness as predictors of quality assurance (QA) success and failure. 151 beams from 23 prostate and cranial IMRT treatment plans were used in this study. These plans had been evaluated before treatment using measurements with a diode array system. The TADA software suite was adapted to allow automatic batch calculation of several proposed plan accuracy metrics, including mean field area, small-aperture, off-axis and closed-leaf factors. All of these results were compared with the gamma pass rates from the QA measurements and correlations were investigated. The mean field area factor provided a threshold field size (5 cm², equivalent to a 2.2 x 2.2 cm² square field), below which all beams failed the QA tests. The small aperture score provided a useful predictor of plan failure when averaged over all beams, despite being weakly correlated with gamma pass rates for individual beams. By contrast, the closed-leaf and off-axis factors provided information about the geometric arrangement of the beam segments but were not useful for distinguishing between plans that passed and failed QA. This study has provided some simple tests for plan accuracy, which may help minimise time spent on QA assessments of treatments that are unlikely to pass.
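As an illustration only, the threshold behaviour described above could be screened for automatically. The minimal Python sketch below assumes per-segment aperture areas are available and uses the 5 cm² mean field area threshold reported in the abstract; the function names and beam data are hypothetical and are not part of the TADA suite.

```python
# Minimal sketch of a field-size screening check. The 5 cm^2 threshold is the
# value reported in the study; beam data and function names are hypothetical.

def mean_field_area(segment_areas_cm2):
    """Mean aperture area (cm^2) over the segments of a beam."""
    return sum(segment_areas_cm2) / len(segment_areas_cm2)

def likely_to_fail_qa(segment_areas_cm2, threshold_cm2=5.0):
    """Flag a beam whose mean field area falls below the threshold,
    below which all beams in the study failed their QA measurements."""
    return mean_field_area(segment_areas_cm2) < threshold_cm2

# Hypothetical beam composed of small apertures (roughly 2 x 2 cm segments)
beam_segments = [4.1, 3.8, 5.2, 4.5]  # cm^2
print(likely_to_fail_qa(beam_segments))  # True -> flag for review before QA
```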
Abstract:
Insulated rail joints are critical for train safety as they control electrical signalling systems; unfortunately they exhibit excessive ratchetting of the railhead near the endpost insulators. This paper reports a three-dimensional global model of these joints under wheel–rail contact pressure loading and a sub-model examining the ratchetting failures of the railhead. The sub-model employs a non-linear isotropic–kinematic elastic–plastic material model and predicts stress/strain levels in the localised railhead zone adjacent to the endpost which is placed in the air gap between the two rail ends at the insulated rail joint. The equivalent plastic strain plot is utilised to capture the progressive railhead damage adequately. Associated field and laboratory testing results of damage to the railhead material suggest that the simulation results are reasonable.
Abstract:
This article presents new theoretical and empirical evidence on the forecasting ability of prediction markets. We develop a model that predicts that the time until expiration of a prediction market should negatively affect the accuracy of prices as a forecasting tool in the direction of a ‘favourite/longshot bias’. That is, high-likelihood events are underpriced, and low-likelihood events are overpriced. We confirm this result using a large data set of prediction market transaction prices. Prediction markets are reasonably well calibrated when time to expiration is relatively short, but prices are significantly biased for events farther in the future. When the time value of money is considered, the miscalibration can be exploited to earn excess returns only when the trader has a relatively low discount rate.
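For readers who want to see what such a calibration analysis involves, the sketch below bins binary contracts by transaction price and compares each bin's implied probability with the observed event frequency. The binning scheme and variable names are illustrative assumptions, not the authors' procedure.

```python
# Sketch of a calibration check for binary prediction-market contracts,
# assuming prices in [0, 1] and outcomes in {0, 1}. Data are hypothetical.
from collections import defaultdict

def calibration_table(prices, outcomes, n_bins=10):
    """Group trades into price bins and report, per bin, the mean price
    (implied probability), the observed event frequency, and the count."""
    bins = defaultdict(list)
    for p, y in zip(prices, outcomes):
        bins[min(int(p * n_bins), n_bins - 1)].append((p, y))
    table = []
    for b in sorted(bins):
        ps, ys = zip(*bins[b])
        table.append((sum(ps) / len(ps), sum(ys) / len(ys), len(ys)))
    return table

# A favourite/longshot bias appears as observed frequencies above the mean
# price in high-price bins and below the mean price in low-price bins.
```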
Abstract:
A known limitation of the Probability Ranking Principle (PRP) is that it does not cater for dependence between documents. Recently, the Quantum Probability Ranking Principle (QPRP) has been proposed, which implicitly captures dependencies between documents through “quantum interference”. This paper explores whether this new ranking principle leads to improved performance for subtopic retrieval, where novelty and diversity are required. In a thorough empirical investigation, models based on the PRP, as well as other recently proposed ranking strategies for subtopic retrieval (i.e. Maximal Marginal Relevance (MMR) and Portfolio Theory (PT)), are compared against the QPRP. On the given task, it is shown that the QPRP outperforms these other ranking strategies. Unlike MMR and PT, one of the main advantages of the QPRP is that no parameter estimation or tuning is required, making the QPRP both simple and effective. This research demonstrates that the application of quantum theory to problems within information retrieval can lead to significant improvements.
Abstract:
In this work, we summarise the development of a ranking principle based on quantum probability theory, called the Quantum Probability Ranking Principle (QPRP), and we also provide an overview of the initial experiments performed employing the QPRP. The main difference between the QPRP and the classic Probability Ranking Principle is that the QPRP implicitly captures the dependencies between documents by means of “quantum interference”. Consequently, the optimal ranking of documents is based not solely on the documents' probability of relevance but also on their interference with the previously ranked documents. Our research shows that the application of quantum theory to problems within information retrieval can lead to consistently better retrieval effectiveness, while still being simple, elegant and tractable.
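To make the selection strategy concrete, the sketch below ranks documents greedily by probability of relevance plus accumulated interference with the already-ranked documents. The document scores and interference values are placeholders for illustration and are not taken from the papers summarised here.

```python
# Minimal sketch of greedy QPRP-style ranking: at each position, choose the
# document maximising its relevance probability plus the total interference
# with the documents ranked so far. Scores and interference are placeholders.

def qprp_rank(prob, interference):
    """prob: {doc: P(relevance)}; interference: {(doc, ranked_doc): value}."""
    ranked = []
    remaining = set(prob)
    while remaining:
        best = max(
            remaining,
            key=lambda d: prob[d]
            + sum(interference.get((d, r), 0.0) for r in ranked),
        )
        ranked.append(best)
        remaining.remove(best)
    return ranked

# Hypothetical example: negative interference between two near-duplicate
# documents pushes the second one down the ranking, promoting diversity.
prob = {"d1": 0.8, "d2": 0.75, "d3": 0.5}
interference = {("d2", "d1"): -0.4, ("d1", "d2"): -0.4}
print(qprp_rank(prob, interference))  # ['d1', 'd3', 'd2']
```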
Abstract:
The operation of the law rests on the selection of an account of the facts. Whether this involves prediction or postdiction, it is not possible to achieve certainty. Any attempt to model the operation of the law completely will therefore raise questions of how to model the process of proof. In the selection of a model a crucial question will be whether the model is to be used normatively or descriptively. Focussing on postdiction, this paper presents and contrasts the mathematical model with the story model. The former carries the normative stamp of scientific approval, whereas the latter has been developed by experimental psychologists to describe how humans reason. Neil Cohen's attempt to use a mathematical model descriptively provides an illustration of the dangers in not clearly setting this parameter of the modelling process. It should be kept in mind that the labels 'normative' and 'descriptive' are not eternal. The mathematical model has its normative limits, beyond which we may need to critically assess models with descriptive origins.
Abstract:
It has been 21 years since the decision in Rogers v Whitaker, and the legal principles concerning informed consent and liability for negligence are still strongly grounded in this landmark High Court decision. This paper considers more recent developments in the law concerning the failure to disclose inherent risks in medical procedures, focusing on the decision in Wallace v Kam [2013] HCA 19. In this case, the appellant underwent a surgical procedure that carried a number of risks. The surgery itself was not performed in a sub-standard way, but the surgeon failed to disclose two risks to the patient, a failure that constituted a breach of the surgeon’s duty of care in negligence. One of the undisclosed risks was considered to be less serious than the other, and this lesser risk eventuated, causing injury to the appellant. The more serious risk did not eventuate, but the appellant argued that if the more serious risk had been disclosed, he would have avoided his injuries completely because he would have refused to undergo the procedure. Liability was disputed by the surgeon, with particular reference to causation principles. The High Court of Australia held that the appellant should not be compensated for harm that resulted from a risk he would have been willing to run. We examine the policy reasons underpinning the law of negligence in this specific context and consider some of the issues raised by this unusual case. We question whether some of the judicial reasoning adopted in this case represents a significant shift in traditional causation principles.
Abstract:
While the Probability Ranking Principle for Information Retrieval provides the basis for formal models, it makes a very strong assumption of independence between documents. However, it has been observed that in real situations this assumption does not always hold. In this paper we propose a reformulation of the Probability Ranking Principle based on quantum theory. Quantum probability theory naturally includes interference effects between events. We posit that this interference captures the dependencies between judgements of document relevance. The outcome is a more sophisticated principle, the Quantum Probability Ranking Principle, that provides a more sensitive ranking which caters for interference/dependence between documents’ relevance.
Abstract:
Integer ambiguity resolution is an indispensable procedure for all high precision GNSS applications. The correctness of the estimated integer ambiguities is the key to achieving highly reliable positioning, but the solution cannot be validated with classical hypothesis testing methods. The integer aperture estimation theory unifies all existing ambiguity validation tests and provides a new perspective from which to review them, enabling a better understanding of the ambiguity validation problem. This contribution analyses two simple but efficient ambiguity validation tests, the ratio test and the difference test, from three aspects: acceptance region, probability basis and numerical results. The major contributions of this paper can be summarised as follows: (1) The ratio test acceptance region is an overlap of ellipsoids, while the difference test acceptance region is an overlap of half-spaces. (2) The probability basis of these two popular tests is analysed for the first time. The difference test is an approximation to the optimal integer aperture estimator, while the ratio test follows an exponential relationship in probability. (3) The limitations of the two tests are identified for the first time. Both tests may under-evaluate the failure risk if the model is not strong enough or the float ambiguities fall in a particular region. (4) Extensive numerical results are used to compare the performance of the two tests. The simulation results show that the ratio test outperforms the difference test in some models, while the difference test performs better in others. In particular, in the medium-baseline kinematic model the difference test outperforms the ratio test; this superiority is independent of frequency number, observation noise and satellite geometry, but depends on the success rate and failure rate tolerance. A smaller failure rate leads to a larger performance discrepancy.
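As a rough illustration of the two tests compared in this abstract, the sketch below applies them to the quadratic forms of the best and second-best integer candidates. The threshold values and variable names are assumptions for illustration, not values recommended by the paper.

```python
# Sketch of the ratio test and difference test for ambiguity validation,
# assuming R1 and R2 are the squared norms ||a_float - a_int||^2_Qa of the
# best and second-best integer candidates (e.g. from an ILS search).
# Threshold values here are illustrative only.

def ratio_test(R1, R2, c=2.0):
    """Accept the fixed solution if the second-best candidate is sufficiently
    worse than the best in a relative sense."""
    return R2 / R1 >= c

def difference_test(R1, R2, d=12.0):
    """Accept the fixed solution if the second-best candidate is sufficiently
    worse than the best in an absolute sense."""
    return R2 - R1 >= d

# Example: with R1 = 1.3 and R2 = 4.2 the ratio test accepts (4.2/1.3 ≈ 3.2),
# while this difference test rejects (4.2 - 1.3 = 2.9 < 12.0).
print(ratio_test(1.3, 4.2), difference_test(1.3, 4.2))
```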
Abstract:
In Kumar v Suncorp Metway Insurance Limited [2004] QSC 381, Douglas J examined s37 of the Motor Accident Insurance Act 1994 (Qld) in the context of an accident involving multiple insurers when a notice of accident had not been given to the Nominal Defendant.
Abstract:
The railhead is perhaps the most highly stressed civil infrastructure, due to the passage of heavily loaded wheels through a very small contact patch. The stresses at the contact patch cause yielding of the railhead material and wear. Many theories exist for the prediction of these mechanisms in continuous rails; the process in discontinuous rails is comparatively sparingly researched. Discontinuous railhead edges fail due to the accumulation of excessive plastic strains. This is a significant, widely reported safety concern, as these edges form part of Insulated Rail Joints (IRJs) in the signalling track circuitry. Since Hertzian contact is not valid at a discontinuous edge, 3D finite element (3DFE) models of wheel contact at a railhead edge have been used in this research. Elastic–plastic material properties of the head-hardened rail steel have been experimentally determined through uniaxial monotonic tension tests and incorporated into a FE model of a cylindrical specimen subject to cyclic tension loading. The parameters required for the Chaboche kinematic hardening model have been determined from the stabilised hysteresis loops of the cyclic load simulation and implemented into the 3DFE model. The 3DFE predictions show that the plastic strain accumulation in the vicinity of the wheel contact at discontinuous railhead edges is governed by the contact due to the passage of wheels rather than by the magnitude of the loads the wheels carry. Therefore, to eliminate this failure mechanism, modification of the contact patch is essential; a reduction in wheel load cannot solve this problem.
Abstract:
This paper evaluates and proposes various compensation methods for three-level Z-source inverters under semiconductor-failure conditions. Unlike the fault-tolerant techniques used in traditional three-level inverters, where either an extra phase-leg or collective switching states are used, the proposed methods for three-level Z-source inverters simply reconfigure their relevant gating signals so as to ride through the failed semiconductor conditions smoothly without any significant decrease in their ac-output quality and amplitude. These features are partly attributed to the inherent boost characteristics of a Z-source inverter, in addition to its usual voltage-buck operation. By focusing on specific types of three-level Z-source inverters, it can also be shown that the dual Z-source inverter has the unique additional ability to force the common-mode voltage to zero even under semiconductor-failure conditions. To verify the described performance features, PLECS simulation and experimental testing were performed, with some results captured and shown in a later section for visual confirmation.
Abstract:
An increasing range of services are now offered via online applications and e-commerce websites. However, problems with online services still occur at times, even for the best service providers, due to technical failures, informational failures, or a lack of required website functionality. Also, the widespread and increasing implementation of web services means that service failures are both more likely to occur and more likely to have serious consequences. In this paper we first develop a digital service value chain framework based on existing service delivery models adapted for digital services. We then review the current literature on service failure prevention, and provide a typology of technologies and approaches that can be used to prevent failures of different types (functional, informational, system) that can occur at different stages of web service delivery. This makes a contribution to theory by relating specific technologies and technological approaches to the point in the value chain framework where they will have the maximum impact. Our typology can also be used to guide the planning, justification and design of robust, reliable web services.
Abstract:
The ubiquitin-proteasome system targets many cellular proteins for degradation and thereby controls most cellular processes. Although it is well established that proteasome inhibition is lethal, the underlying mechanism is unknown. Here, we show that proteasome inhibition results in a lethal amino acid shortage. In yeast, mammalian cells, and flies, the deleterious consequences of proteasome inhibition are rescued by amino acid supplementation. In all three systems, this rescuing effect occurs without noticeable changes in the levels of proteasome substrates. In mammalian cells, the amino acid scarcity resulting from proteasome inhibition is the signal that causes induction of both the integrated stress response and autophagy, in an unsuccessful attempt to replenish the pool of intracellular amino acids. These results reveal that cells can tolerate protein waste, but not the amino acid scarcity resulting from proteasome inhibition.
Abstract:
A better understanding of the behaviour of prepared cane and bagasse, and the ability to model the mechanical behaviour of bagasse as it is squeezed in a milling unit to extract juice, would help identify how to improve the current process. There are opportunities to decrease bagasse moisture from a milling unit. The behaviour of bagasse in chutes is poorly understood. Previous investigations have shown that juice flow through bagasse obeys Darcy’s permeability law, that the grip of the rough surface of the grooves on the bagasse can be represented by the Mohr-Coulomb failure criterion for soils, and that the internal mechanical behaviour of the bagasse is critical state behaviour similar to that for sand and clay. Progress has been made in the last 11 years towards implementing a mechanical model for bagasse in finite element software. The objective is to be able to correctly simulate various simple mechanical loading conditions measured in the laboratory. Combining these behaviours is thought to have a high probability of reproducing the complicated stress conditions in a milling unit. This paper reports on progress made towards modelling the fifth and final (and most challenging) of the simple loading conditions: the shearing of heavily over-consolidated bagasse, using a specific model for bagasse in a multi-element simulation.