996 results for "Expectation hypothesis failure"


Relevance:

20.00%

Publisher:

Abstract:

The author, Dean Shepherd, is a scholar of entrepreneurship—how entrepreneurs think, decide to act, and feel. He recently realized that while his publications in academic journals have implications for entrepreneurs, those implications have remained relatively hidden within articles published in journals largely inaccessible to those involved in the entrepreneurial process. This series is designed to bring the practical implications of his research to the forefront.


Background There are few data regarding the effectiveness of remote monitoring for older people with heart failure. We conducted a post hoc sub-analysis of a previously published large Cochrane systematic review and meta-analysis of relevant randomized controlled trials to determine whether structured telephone support and telemonitoring were effective in this population. Methods A post hoc sub-analysis of a systematic review and meta-analysis that applied the Cochrane methodology was conducted. Meta-analyses of all-cause mortality, all-cause hospitalizations and heart failure-related hospitalizations were performed for studies in which the mean or median age of participants was 70 or more years. Results The mean or median age of participants was 70 or more years in eight of the 16 (n=2,659/5,613; 47%) structured telephone support studies and four of the 11 (n=894/2,710; 33%) telemonitoring studies. Structured telephone support (RR 0.80; 95% CI=0.63-1.00) and telemonitoring (RR 0.56; 95% CI=0.41-0.76) interventions reduced mortality. Structured telephone support interventions also reduced heart failure-related hospitalizations (RR 0.81; 95% CI=0.67-0.99). Conclusion Despite a systematic bias towards recruitment of individuals younger than the epidemiological average into the randomized controlled trials, older people with heart failure did benefit from structured telephone support and telemonitoring. These post hoc sub-analysis results were similar to the overall effects observed in the main meta-analysis. While further research is required to confirm these findings, the evidence at hand indicates that discrimination by age alone may not be appropriate when inviting participation in a remote monitoring service for heart failure.
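The risk ratios above are reported with 95% confidence intervals; as a minimal sketch, this is how an RR and its interval are obtained from 2x2 event counts using the standard normal approximation on the log scale (the counts below are hypothetical, not taken from the review):

```python
import math

def risk_ratio(events_tx, n_tx, events_ctrl, n_ctrl):
    """Risk ratio with a 95% CI from the normal approximation on log(RR)."""
    rr = (events_tx / n_tx) / (events_ctrl / n_ctrl)
    # standard error of log(RR) for a single 2x2 table
    se = math.sqrt(1 / events_tx - 1 / n_tx + 1 / events_ctrl - 1 / n_ctrl)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical counts (deaths / participants in each arm), not from the review:
rr, lo, hi = risk_ratio(40, 450, 50, 444)
```

A meta-analysis pools such log-RRs across studies (e.g. by inverse-variance weighting) before exponentiating back to the RR scale.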


Many large-scale GNSS CORS networks have been deployed around the world to support various commercial and scientific applications. To make use of these networks for real-time kinematic positioning services, one of the major challenges is ambiguity resolution (AR) over long inter-station baselines in the presence of considerable atmospheric biases. Usually, the widelane ambiguities are fixed first, followed by determination of the narrowlane ambiguity integers based on the ionosphere-free model in which the widelane integers are introduced as known quantities. This paper seeks to improve AR performance over long baselines through efficient procedures for improved float solutions and ambiguity fixing. The contribution is threefold: (1) instead of using the ionosphere-free measurements, absolute and/or relative ionospheric constraints are introduced in the ionosphere-constrained model to enhance the model strength, resulting in better float solutions; (2) realistic widelane ambiguity precision is estimated by capturing the multipath effects due to the observation complexity, improving the reliability of widelane AR; (3) for narrowlane AR, partial AR is applied to a subset of ambiguities selected according to successively increasing elevation. For fixing each scalar ambiguity, a rounding method with controllable error probability is proposed. The established ionosphere-constrained model can be efficiently solved with a sequential Kalman filter. It can be either reduced to some special models simply by adjusting the variances of the ionospheric constraints, or extended with more parameters and constraints. The presented methodology is tested over seven baselines of around 100 km from the US CORS network. The results show that the new widelane AR scheme achieves a 99.4% success rate with a 0.6% failure rate, while the new rounding method for narrowlane AR achieves a fixing rate of 89% with a failure rate of 0.8%. In summary, AR reliability can be efficiently improved with a rigorously controlled probability of incorrectly fixed ambiguities.
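A rounding method with controllable error probability rests on the fact that, for independent float ambiguities with standard deviations sigma_i, the probability of rounding every ambiguity to the correct integer is bounded below by the product of (2*Phi(1/(2*sigma_i)) - 1). A minimal sketch of such a probability-gated rounding decision (the sigma values and tolerance are illustrative, not the paper's):

```python
import math

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def rounding_success_rate(sigmas):
    """Lower bound on P(all ambiguities round to the correct integer),
    assuming independent float ambiguities with std devs sigmas (cycles)."""
    p = 1.0
    for s in sigmas:
        p *= 2.0 * phi(1.0 / (2.0 * s)) - 1.0
    return p

def accept_fix(sigmas, max_failure_rate):
    """Round to integers only if the bounded failure risk is tolerable."""
    return 1.0 - rounding_success_rate(sigmas) <= max_failure_rate
```

For example, a single ambiguity with a 0.1-cycle standard deviation rounds correctly with probability 2*Phi(5) - 1, i.e. well above 99.99%, while a 0.5-cycle ambiguity would be rejected under any tight failure tolerance.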


This study used automated data processing techniques to calculate a set of novel treatment plan accuracy metrics and investigate their usefulness as predictors of quality assurance (QA) success and failure. A total of 151 beams from 23 prostate and cranial IMRT treatment plans were used in this study. These plans had been evaluated before treatment using measurements with a diode array system. The TADA software suite was adapted to allow automatic batch calculation of several proposed plan accuracy metrics, including mean field area, small-aperture, off-axis and closed-leaf factors. All of these results were compared with the gamma pass rates from the QA measurements and correlations were investigated. The mean field area factor provided a threshold field size (5 cm2, equivalent to a 2.2 x 2.2 cm square field), below which all beams failed the QA tests. The small-aperture score provided a useful predictor of plan failure when averaged over all beams, despite being weakly correlated with gamma pass rates for individual beams. By contrast, the closed-leaf and off-axis factors provided information about the geometric arrangement of the beam segments but were not useful for distinguishing between plans that passed and failed QA. This study has provided some simple tests for plan accuracy, which may help minimise time spent on QA assessments of treatments that are unlikely to pass.
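The field-area threshold lends itself to a simple automated pre-QA screen. A sketch follows: the 5 cm2 cutoff is the study's finding, but the beam data and function names are hypothetical:

```python
# Study finding: below this mean field area, all beams failed the QA tests.
THRESHOLD_CM2 = 5.0

def flag_beams(beams):
    """beams: iterable of (beam_id, mean_field_area_cm2) pairs.
    Returns the IDs predicted to fail patient-specific QA."""
    return [beam_id for beam_id, area in beams if area < THRESHOLD_CM2]

# Hypothetical plan with one small-field beam:
plan = [("B1", 12.4), ("B2", 4.1), ("B3", 7.8)]
```

Flagged beams could then be prioritised for measurement rather than measuring every plan in full.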


Insulated rail joints are critical for train safety as they control electrical signalling systems; unfortunately they exhibit excessive ratchetting of the railhead near the endpost insulators. This paper reports a three-dimensional global model of these joints under wheel–rail contact pressure loading and a sub-model examining the ratchetting failures of the railhead. The sub-model employs a non-linear isotropic–kinematic elastic–plastic material model and predicts stress/strain levels in the localised railhead zone adjacent to the endpost which is placed in the air gap between the two rail ends at the insulated rail joint. The equivalent plastic strain plot is utilised to capture the progressive railhead damage adequately. Associated field and laboratory testing results of damage to the railhead material suggest that the simulation results are reasonable.


It has been 21 years since the decision in Rogers v Whitaker, and the legal principles concerning informed consent and liability for negligence remain strongly grounded in this landmark High Court decision. This paper considers more recent developments in the law concerning the failure to disclose inherent risks in medical procedures, focusing on the decision in Wallace v Kam [2013] HCA 19. In this case, the appellant underwent a surgical procedure that carried a number of risks. The surgery itself was not performed in a sub-standard way, but the surgeon failed to disclose two risks to the patient, a failure that constituted a breach of the surgeon's duty of care in negligence. One of the undisclosed risks was considered less serious than the other, and this lesser risk eventuated, causing injury to the appellant. The more serious risk did not eventuate, but the appellant argued that if it had been disclosed, he would have avoided his injuries completely because he would have refused to undergo the procedure. Liability was disputed by the surgeon, with particular reference to causation principles. The High Court of Australia held that the appellant should not be compensated for harm that resulted from a risk he would have been willing to run. We examine the policy reasons underpinning the law of negligence in this specific context and consider some of the issues raised by this unusual case. We question whether some of the judicial reasoning adopted in this case represents a significant shift in traditional causation principles.


Integer ambiguity resolution is an indispensable procedure for all high-precision GNSS applications. The correctness of the estimated integer ambiguities is the key to achieving highly reliable positioning, but the solution cannot be validated with classical hypothesis testing methods. The integer aperture estimation theory unifies all existing ambiguity validation tests and provides a new perspective from which to review existing methods, enabling a better understanding of the ambiguity validation problem. This contribution analyses two simple but efficient ambiguity validation tests, the ratio test and the difference test, from three aspects: acceptance region, probability basis and numerical results. The major contributions of this paper can be summarized as follows: (1) The ratio test acceptance region is an overlap of ellipsoids, while the difference test acceptance region is an overlap of half-spaces. (2) The probability basis of these two popular tests is analyzed for the first time. The difference test is an approximation to the optimal integer aperture estimator, while the ratio test follows an exponential relationship in probability. (3) The limitations of the two tests are identified for the first time. Both tests may underestimate the failure risk if the model is not strong enough or the float ambiguities fall in particular regions. (4) Extensive numerical results are used to compare the performance of the two tests. The simulation results show that the ratio test outperforms the difference test in some models while the difference test performs better in others. In particular, in the medium-baseline kinematic model the difference test outperforms the ratio test; this superiority is independent of the number of frequencies, observation noise and satellite geometry, but depends on the success rate and the failure-rate tolerance. A smaller failure-rate tolerance leads to a larger performance discrepancy.
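Both tests compare the float ambiguity vector against the best and second-best integer candidates under the norm weighted by the ambiguity covariance matrix. A minimal sketch, with user-chosen thresholds that are illustrative rather than the paper's values:

```python
import numpy as np

def qnorm2(x, Q_inv):
    """Squared weighted norm ||x||^2_Q = x^T Q^{-1} x."""
    return float(x @ Q_inv @ x)

def ratio_test(a_float, a_best, a_second, Q_inv, c=2.0):
    """Accept the fixed solution when the second-best integer candidate is
    at least c times worse than the best (acceptance region: overlap of
    ellipsoids, as described above)."""
    return qnorm2(a_float - a_second, Q_inv) / qnorm2(a_float - a_best, Q_inv) >= c

def difference_test(a_float, a_best, a_second, Q_inv, d=12.0):
    """Accept when the squared-norm gap exceeds a tolerance d (acceptance
    region: overlap of half-spaces)."""
    return qnorm2(a_float - a_second, Q_inv) - qnorm2(a_float - a_best, Q_inv) >= d
```

The two acceptance criteria differ only in whether the two candidate distances are compared by quotient or by difference, which is what produces the ellipsoidal versus half-space acceptance regions.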


In Kumar v Suncorp Metway Insurance Limited [2004] QSC 381, Douglas J examined s 37 of the Motor Accident Insurance Act 1994 (Qld) in the context of an accident involving multiple insurers where a notice of accident had not been given to the Nominal Defendant.


Railhead is perhaps the most highly stressed civil infrastructure, owing to the passage of heavily loaded wheels over a very small contact patch. The stresses at the contact patch cause yielding and wear of the railhead material. Many theories exist for predicting these mechanisms in continuous rails; the corresponding process in discontinuous rails is relatively sparsely researched. Discontinuous railhead edges fail through the accumulation of excessive plastic strain. Significant safety concerns are widely reported, as these edges form part of Insulated Rail Joints (IRJs) in the signalling track circuitry. Since Hertzian contact is not valid at a discontinuous edge, 3D finite element (3DFE) models of wheel contact at a railhead edge have been used in this research. Elastic–plastic material properties of the head-hardened rail steel were experimentally determined through uniaxial monotonic tension tests and incorporated into a FE model of a cylindrical specimen subject to cyclic tension loading. The parameters required for the Chaboche kinematic hardening model were determined from the stabilised hysteresis loops of the cyclic load simulation and implemented into the 3DFE model. The 3DFE predictions of plastic strain accumulation in the vicinity of wheel contact at discontinuous railhead edges are shown to be governed by the contact caused by the passage of wheels rather than by the magnitude of the loads the wheels carry. Therefore, to eliminate this failure mechanism, modification of the contact patch is essential; reducing the wheel load alone cannot solve the problem.
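The Armstrong–Frederick backstress evolution at the core of the Chaboche model (in the uniaxial case, d_alpha = C d_eps_p - gamma * alpha * |d_eps_p|) can be sketched with a simple explicit integration; the parameter values below are assumed for demonstration and are not the paper's calibration:

```python
import numpy as np

# Illustrative uniaxial Chaboche model with a single Armstrong-Frederick
# backstress; parameter values are assumed, not the paper's calibration.
E = 210e3               # Young's modulus [MPa]
SIG_Y = 780.0           # initial yield stress [MPa]
C, GAMMA = 50e3, 100.0  # kinematic hardening modulus and recall term

def run_strain_path(targets, n_sub=500):
    """Explicitly integrate stress along piecewise-linear strain targets.

    Returns an (N, 2) array of (total strain, stress) pairs."""
    eps = eps_p = alpha = 0.0
    out = []
    for tgt in targets:
        for e in np.linspace(eps, tgt, n_sub)[1:]:
            eps = e
            sig_tr = E * (eps - eps_p)           # elastic trial stress
            f = abs(sig_tr - alpha) - SIG_Y      # yield function
            if f > 0.0:                          # plastic correction
                n = np.sign(sig_tr - alpha)
                # plastic multiplier from the linearised consistency condition
                dlam = f / (E + C - GAMMA * alpha * n)
                eps_p += dlam * n
                alpha += (C * n - GAMMA * alpha) * dlam
            out.append((eps, E * (eps - eps_p)))
    return np.array(out)

# One loading cycle to +/-1% strain (yield begins near 0.37% strain):
hist = run_strain_path([0.01, -0.01, 0.01])
```

The recall term gamma makes the backstress saturate at C/GAMMA, which is what produces the stabilised hysteresis loops the parameters are fitted to.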


In a standard overlapping generations growth model with a fixed amount of land and endogenous fertility, the competitive economy converges to a steady state with a zero population growth rate and positive consumption per capita. The Malthusian hypothesis is interpreted as a positive statement about the relationship between population growth and consumption per capita when production exhibits diminishing returns to labor and there is a fixed amount of land essential for production. Even when individuals care only about the number of their children and not about their children's welfare, the equilibrium is such that they eventually choose to have only one child per adult. Hence, if Malthus's "positive check" on population is the result of the response of optimizing agents to competitively determined prices, Malthus's pessimistic conjecture is not necessarily true, even though his other assumptions hold.


A new test of hypothesis for classifying stationary time series based on the bias-adjusted estimators of the fitted autoregressive model is proposed. It is shown theoretically that the proposed test has desirable properties. Simulation results show that when time series are short, the size and power estimates of the proposed test are reasonably good, and thus this test is reliable in discriminating between short-length time series. As the length of the time series increases, the performance of the proposed test improves, but the benefit of bias-adjustment reduces. The proposed hypothesis test is applied to two real data sets: the annual real GDP per capita of six European countries, and quarterly real GDP per capita of five European countries. The application results demonstrate that the proposed test displays reasonably good performance in classifying relatively short time series.
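The flavour of the approach can be illustrated in the simplest case, AR(1), where the least-squares coefficient estimator has a well-known small-sample bias of approximately -(1 + 3*phi)/n. The sketch below applies that first-order correction and a crude distance-based classifier; it illustrates the idea of discriminating short series via bias-adjusted estimates, and is not the paper's exact test statistic:

```python
import numpy as np

def ar1_coef(x):
    """Least-squares AR(1) coefficient estimate."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    return float(np.sum(x[1:] * x[:-1]) / np.sum(x[:-1] ** 2))

def ar1_bias_adjusted(x):
    """First-order (Kendall-type) bias adjustment for short series:
    the LS estimator is biased by roughly -(1 + 3*phi)/n."""
    phi = ar1_coef(x)
    return phi + (1.0 + 3.0 * phi) / len(x)

def same_process(x, y, tol=0.3):
    """Crude classifier: declare the two series the same AR(1) class
    when their bias-adjusted coefficients are within tol."""
    return abs(ar1_bias_adjusted(x) - ar1_bias_adjusted(y)) < tol
```

A formal test would replace the fixed tolerance with a critical value derived from the estimators' sampling distribution, which is where the paper's theoretical contribution lies.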


Background Engaging clients from the outset of psychotherapy is important for therapeutic success. However, there is little research evaluating therapists’ initial attempts to engage clients. This article reports retrospective analysis of data from a trial of online Cognitive Behavioural Therapy (CBT) for depression. Qualitative and quantitative methods were used to evaluate how therapists manage clients’ expectations at the outset of therapy and its relationship with client retention in the therapeutic intervention. Aims To develop a system to codify expectation management in initial sessions of online CBT and evaluate its relationship with retention. Method Initial qualitative research using conversation analysis identified three different communication practices used by therapists at the start of first sessions: no expectation management, some expectation management, and comprehensive expectation management. These findings were developed into a coding scheme that enabled substantial inter-rater agreement (weighted Kappa = 0.78; 95% CI: 0.52 to 0.94) and was applied to all trial data. Results Adjusting for a range of client variables, primary analysis of data from 147 clients found comprehensive expectation management was associated with clients remaining in therapy for 1.4 sessions longer than those who received no expectation management (95% CI: -0.2 to 3.0). This finding was supported by a sensitivity analysis including an additional 21 clients (1.6 sessions, 95% CI: 0.2 to 3.1). Conclusions Using a combination of qualitative and quantitative methods, this study suggests a relationship between expectation management and client retention in online CBT for depression, which has implications for professional practice. A larger prospective study would enable a more precise estimate of retention.
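The inter-rater agreement above is a weighted kappa; as a sketch, a linearly weighted Cohen's kappa for ordinal codes can be computed as follows (the article does not specify its weighting scheme, and the example ratings in the test are invented):

```python
import numpy as np

def weighted_kappa(r1, r2, n_levels):
    """Linearly weighted Cohen's kappa for two raters' ordinal codes
    (integers 0 .. n_levels-1). 1 = perfect agreement, 0 = chance level."""
    O = np.zeros((n_levels, n_levels))
    for a, b in zip(r1, r2):
        O[a, b] += 1.0
    O /= O.sum()                                 # observed proportions
    E = np.outer(O.sum(axis=1), O.sum(axis=0))   # chance-expected proportions
    i, j = np.indices((n_levels, n_levels))
    W = np.abs(i - j) / (n_levels - 1)           # linear disagreement weights
    return 1.0 - (W * O).sum() / (W * E).sum()
```

With a three-level scheme like "no / some / comprehensive expectation management", linear weights penalise a no-vs-comprehensive disagreement twice as heavily as a no-vs-some one.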


This paper evaluates and proposes various compensation methods for three-level Z-source inverters under semiconductor-failure conditions. Unlike the fault-tolerant techniques used in traditional three-level inverters, where either an extra phase-leg or collective switching states are used, the proposed methods for three-level Z-source inverters simply reconfigure the relevant gating signals so as to ride through the failed-semiconductor conditions smoothly without any significant decrease in ac-output quality and amplitude. These features are partly attributed to the inherent boost characteristic of a Z-source inverter, in addition to its usual voltage-buck operation. Focusing on specific types of three-level Z-source inverters, it can also be shown that the dual Z-source inverter has the additional unique ability to force the common-mode voltage to zero even under semiconductor-failure conditions. To verify the described performance features, PLECS simulation and experimental testing were performed, with representative results captured and shown in a later section for visual confirmation.


An increasing range of services is now offered via online applications and e-commerce websites. However, problems with online services still occur at times, even for the best service providers, due to technical failures, informational failures, or a lack of required website functionality. Moreover, the widespread and increasing implementation of web services means that service failures are both more likely to occur and more likely to have serious consequences. In this paper we first develop a digital service value chain framework based on existing service delivery models adapted for digital services. We then review current literature on service failure prevention, and provide a typology of technologies and approaches that can be used to prevent failures of different types (functional, informational, system) that can occur at different stages of web service delivery. This makes a contribution to theory by relating specific technologies and technological approaches to the point in the value chain framework where they will have the maximum impact. Our typology can also be used to guide the planning, justification and design of robust, reliable web services.
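A typology of this shape pairs failure types with candidate prevention approaches; a simplified illustrative mapping follows (the three failure types come from the paper, but the listed techniques are generic examples, not the paper's specific typology):

```python
# Illustrative mapping of failure type -> example prevention approaches;
# the technique lists are generic examples, not the paper's typology.
PREVENTION = {
    "functional":    ["pre-release functional testing", "regression test suites"],
    "informational": ["input validation", "content review workflows"],
    "system":        ["redundant servers", "load balancing", "health monitoring"],
}

def prevention_for(failure_type):
    """Look up candidate prevention approaches for a classified failure."""
    return PREVENTION.get(failure_type.lower(), [])
```

In the full framework, each technique would additionally be attached to the stage of the value chain where it has the greatest effect.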


The ubiquitin-proteasome system targets many cellular proteins for degradation and thereby controls most cellular processes. Although it is well established that proteasome inhibition is lethal, the underlying mechanism is unknown. Here, we show that proteasome inhibition results in a lethal amino acid shortage. In yeast, mammalian cells, and flies, the deleterious consequences of proteasome inhibition are rescued by amino acid supplementation. In all three systems, this rescuing effect occurs without noticeable changes in the levels of proteasome substrates. In mammalian cells, the amino acid scarcity resulting from proteasome inhibition is the signal that causes induction of both the integrated stress response and autophagy, in an unsuccessful attempt to replenish the pool of intracellular amino acids. These results reveal that cells can tolerate protein waste, but not the amino acid scarcity resulting from proteasome inhibition.