719 results for Empirical Comparison


Relevance: 20.00%

Abstract:

- Background Nilotinib and dasatinib are now being considered as alternatives to imatinib for the first-line treatment of chronic myeloid leukaemia (CML).
- Objective This technology assessment reviews the available evidence for the clinical effectiveness and cost-effectiveness of dasatinib, nilotinib and standard-dose imatinib for the first-line treatment of Philadelphia chromosome-positive CML.
- Data sources Databases [including MEDLINE (Ovid), EMBASE, Current Controlled Trials, ClinicalTrials.gov, the US Food and Drug Administration website and the European Medicines Agency website] were searched from the search end date of the last technology appraisal report on this topic (October 2002) to September 2011.
- Review methods A systematic review of clinical effectiveness and cost-effectiveness studies; a review of surrogate relationships with survival; a review and critique of manufacturer submissions; and a model-based economic analysis.
- Results Two clinical trials (dasatinib vs imatinib and nilotinib vs imatinib) were included in the effectiveness review. Survival was not significantly different for dasatinib or nilotinib compared with imatinib with the 24-month follow-up data available. The rates of complete cytogenetic response (CCyR) and major molecular response (MMR) were higher for patients receiving dasatinib than for those receiving imatinib at 12 months' follow-up (CCyR 83% vs 72%, p < 0.001; MMR 46% vs 28%, p < 0.0001). The rates of CCyR and MMR were higher for patients receiving nilotinib than for those receiving imatinib at 12 months' follow-up (CCyR 80% vs 65%, p < 0.001; MMR 44% vs 22%, p < 0.0001). An indirect comparison analysis showed no difference between dasatinib and nilotinib in CCyR or MMR rates at 12 months' follow-up (CCyR, odds ratio 1.09, 95% CI 0.61 to 1.92; MMR, odds ratio 1.28, 95% CI 0.77 to 2.16). There is observational association evidence from imatinib studies supporting the use of CCyR and MMR at 12 months as surrogates for all-cause overall survival and progression-free survival in patients with CML in chronic phase. In the cost-effectiveness modelling, scenario analyses were provided to reflect the extensive structural uncertainty and the different approaches to estimating overall survival (OS). First-line dasatinib is predicted to provide very poor value for money compared with first-line imatinib, with deterministic incremental cost-effectiveness ratios (ICERs) of between £256,000 and £450,000 per quality-adjusted life-year (QALY). Conversely, first-line nilotinib provided favourable ICERs at the willingness-to-pay threshold of £20,000-30,000 per QALY.
- Limitations The empirical trial data are immature relative to life expectancy, forcing reliance either on surrogate relationships or on assumptions about cumulative survival and treatment duration.
- Conclusions From the two trials available, dasatinib and nilotinib have a statistically significant advantage over imatinib as measured by MMR or CCyR. Taking into account the treatment pathways for patients with CML, i.e. assuming the use of second-line nilotinib, first-line nilotinib appears to be more cost-effective than first-line imatinib. Dasatinib was not cost-effective compared with imatinib and nilotinib at decision thresholds of £20,000 or £30,000 per QALY. Uncertainty in the cost-effectiveness analysis would be substantially reduced with better and more UK-specific data on the incidence and cost of stem cell transplantation in patients with chronic CML.
- Funding The Health Technology Assessment Programme of the National Institute for Health Research.
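The cost-effectiveness comparison above rests on the incremental cost-effectiveness ratio (ICER): the extra cost per extra QALY gained by the new treatment. A minimal sketch of the calculation; the figures below are purely illustrative, not taken from the assessment:

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Illustrative figures only: a therapy costing 40,000 GBP more and yielding
# 0.125 extra QALYs gives an ICER of 320,000 GBP/QALY, far above a
# 20,000-30,000 GBP/QALY willingness-to-pay threshold.
ratio = icer(140_000, 100_000, 2.625, 2.5)
```

A treatment is conventionally judged cost-effective when its ICER falls below the decision threshold.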

Relevance: 20.00%

Abstract:

This paper provides a first look at the acceptance of Accountable-eHealth (AeH) systems, a new genre of eHealth systems designed to manage the information privacy concerns that hinder the proliferation of eHealth. The underlying concept of AeH systems is appropriate use of information through after-the-fact accountability for intentional misuse of information by healthcare professionals. An online questionnaire survey was used to collect data from three educational institutions in Queensland, Australia. A total of 23 hypotheses relating to 9 constructs were tested using a structural equation modelling technique. Moderation effects on the hypotheses were also tested, based on six moderation factors, to understand their role in the research model. A total of 334 valid responses were received. The cohort consisted of medical, nursing and other health-related students studying at various levels in both undergraduate and postgraduate courses. Hypothesis testing provided sufficient evidence to accept 7 hypotheses. The empirical research model was able to predict 47.3% of healthcare professionals' perceived intention to use AeH systems. All six moderation factors showed a significant influence on the research model. A validation of this model with a wider survey cohort is recommended as a future study.

Relevance: 20.00%

Abstract:

The conventional wisdom is that offenders have very high discount rates, not only with respect to income and fines but also with respect to time incarcerated. These rates are difficult to measure objectively, and the usual approach is to ask subjects hypothetical questions and infer time preference from their answers. In this article, we propose estimating the rates at which offenders discount time incarcerated from their equilibrium plea, defined by the discount rate that equates the time spent in jail following a guilty plea with the expected time spent in jail following a trial. Offenders are assumed to exhibit positive time preference and to discount time spent in jail at a constant rate. Our choice of sample is interesting because the offenders are not on bail, punishment is not delayed, and the offences are planned, therefore conforming to Becker's model of the decision to commit a crime. Contrary to the discussion in the literature, we do not find evidence of consistently high time discount rates, and therefore cannot unequivocally infer that the prison experience always results in low levels of specific deterrence.
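The identification idea, a discount rate at which the offender is indifferent between the certain jail time from a plea and the expected jail time from a trial, can be sketched numerically. Here the present value of a sentence of length T at constant rate r is taken as (1 - exp(-rT))/r and the rate is found by bisection; the functional form and every number below are illustrative assumptions (requiring the plea's raw time to be below the trial's expected time), not the authors' specification:

```python
import math

def pv_sentence(T, r):
    """Present value of T years in jail, discounted continuously at rate r."""
    return T if r == 0 else (1.0 - math.exp(-r * T)) / r

def equilibrium_rate(t_plea, t_trial, p_convict, lo=1e-9, hi=5.0):
    """Bisect for the rate equating the plea's certain cost with the
    trial's probability-weighted expected cost."""
    f = lambda r: pv_sentence(t_plea, r) - p_convict * pv_sentence(t_trial, r)
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Hypothetical numbers: a 2-year plea versus a 50% chance of 5 years at trial.
r_star = equilibrium_rate(2.0, 5.0, 0.5)  # roughly 0.16 per year
```

A higher implied r_star would indicate a more present-oriented offender, which is the quantity the paper infers from observed pleas.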

Relevance: 20.00%

Abstract:

We use Bayesian model selection techniques to test extensions of the standard flat LambdaCDM paradigm. Dark-energy and curvature scenarios, and primordial perturbation models, are considered. To that end, we calculate the Bayesian evidence in favour of each model using Population Monte Carlo (PMC), a new adaptive sampling technique which was recently applied in a cosmological context. The Bayesian evidence is immediately available from the PMC sample used for parameter estimation without further computational effort, and it comes with an associated error estimate. Moreover, it provides an unbiased estimator of the evidence after any fixed number of iterations and is naturally parallelizable, in contrast with MCMC and nested sampling methods. By comparison with analytical predictions for simulated data, we show that our results obtained with PMC are reliable and robust. The variability of the evidence evaluation and its stability across various cases are estimated both from simulations and from data. For the cases we consider, the log-evidence is calculated to a precision of better than 0.08. Using a combined set of recent CMB, SNIa and BAO data, we find inconclusive evidence between flat LambdaCDM and simple dark-energy models. A curved Universe is moderately to strongly disfavoured with respect to a flat cosmology. Using physically well-motivated priors within the slow-roll approximation of inflation, we find a weak preference for a running spectral index. A Harrison-Zel'dovich spectrum is weakly disfavoured. With the current data, tensor modes are not detected; the large prior volume on the tensor-to-scalar ratio r results in moderate evidence in favour of r = 0.
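The point that the evidence falls out of the sample "for free" can be illustrated with plain (non-adaptive) importance sampling, the building block of PMC: for samples drawn from a proposal q, the average of the weights p̃(θ)/q(θ), where p̃ is the unnormalized posterior, is an unbiased estimate of the evidence Z. A toy sketch with a one-dimensional Gaussian target whose evidence is known in closed form; this illustrates only the estimator, not the paper's cosmological pipeline:

```python
import math
import random

random.seed(42)

def target_unnorm(theta):
    """Unnormalized target exp(-theta^2/2); its true integral is sqrt(2*pi)."""
    return math.exp(-0.5 * theta * theta)

def normal_pdf(x, mu, sigma):
    """Gaussian proposal density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Evidence estimate: Z ~ mean of p~(x)/q(x) over draws x ~ q.
mu_q, sigma_q = 0.0, 2.0
samples = [random.gauss(mu_q, sigma_q) for _ in range(200_000)]
weights = [target_unnorm(x) / normal_pdf(x, mu_q, sigma_q) for x in samples]
z_hat = sum(weights) / len(weights)
# z_hat should sit close to sqrt(2*pi), about 2.5066.
```

PMC improves on this by iteratively adapting q towards the posterior, which shrinks the variance of the weights; the estimator itself is unchanged.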

Relevance: 20.00%

Abstract:

This research studied the distributed computing of all-to-all comparison problems with big data sets. The thesis formalised the problem and developed a high-performance, scalable computing framework with a programming model, data distribution strategies and task scheduling policies to solve it. The study considered storage usage, data locality and load balancing to improve performance. The research outcomes can be applied in bioinformatics, biometrics, data mining and other domains in which all-to-all comparison is a typical computing pattern.
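The all-to-all comparison pattern named above amounts to evaluating a pairwise function over every unordered pair of inputs, giving n(n-1)/2 tasks for n items. A minimal single-machine sketch; the comparison function here is a placeholder, not the thesis framework:

```python
from itertools import combinations

def all_to_all(items, compare):
    """Evaluate `compare` on every unordered pair: n items -> n*(n-1)/2 tasks."""
    return {(a, b): compare(a, b) for a, b in combinations(items, 2)}

# Placeholder comparison: number of characters two strings share.
similarity = lambda a, b: len(set(a) & set(b))
result = all_to_all(["GATTACA", "GATACA", "TACCAT"], similarity)
# 3 items produce 3 pairwise tasks
```

The quadratic growth in tasks, and the fact that each input item is needed by many tasks, is what makes data distribution and locality the central difficulties once the problem is spread over a cluster.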

Relevance: 20.00%

Abstract:

The current state of the practice in Blackspot Identification (BSI) uses safety performance functions based on total crash counts to identify transport system sites with potentially high crash risk. This paper postulates that total crash count variation over a transport network results from multiple distinct crash-generating processes, including geometric characteristics of the road, spatial features of the surrounding environment, and driver behaviour factors. However, these multiple sources are ignored by current modelling methodologies when either explaining or predicting crash frequencies across sites. Instead, current practice employs models that imply a single underlying crash-generating process. This mis-specification may correlate crashes with the wrong contributing factors (e.g. concluding a crash is predominantly caused by a geometric feature when it is a behavioural issue), which may ultimately lead to inefficient use of public funds and misidentification of true blackspots. This study proposes a latent class model consistent with a multiple-crash-process theory and investigates its influence on correctly identifying crash blackspots. We first present the theoretical and corresponding methodological approach, in which a Bayesian Latent Class (BLC) model is estimated assuming that crashes arise from two distinct risk-generating processes: engineering factors and unobserved spatial factors. The Bayesian model incorporates prior information about the contribution of each underlying process to the total crash count. The methodology is applied to the state-controlled roads in Queensland, Australia, and the results are compared to an Empirical Bayesian Negative Binomial (EB-NB) model. A comparison of goodness-of-fit measures shows significantly improved performance of the proposed model over the EB-NB model. The detection of blackspots was also improved. In addition, modelling crashes as the result of two fundamentally separate underlying processes reveals more detailed information about unobserved crash causes.
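The core latent-class idea, that each site's crash count is generated by one of a small number of hidden processes, can be sketched with a two-component Poisson mixture fitted by EM. This frequentist toy on synthetic counts stands in for, and is much simpler than, the paper's Bayesian Latent Class estimator:

```python
import math

def poisson_pmf(k, lam):
    """Poisson probability mass function."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def em_two_poisson(counts, lam1=1.0, lam2=5.0, pi=0.5, iters=200):
    """EM for a two-component Poisson mixture: each site's crash count is
    treated as coming from one of two latent crash-generating processes."""
    for _ in range(iters):
        # E-step: responsibility of the low-mean component for each count
        resp = [pi * poisson_pmf(k, lam1) /
                (pi * poisson_pmf(k, lam1) + (1 - pi) * poisson_pmf(k, lam2))
                for k in counts]
        # M-step: re-estimate the mixing weight and both component means
        n1 = sum(resp)
        pi = n1 / len(counts)
        lam1 = sum(r * k for r, k in zip(resp, counts)) / n1
        lam2 = sum((1 - r) * k for r, k in zip(resp, counts)) / (len(counts) - n1)
    return pi, lam1, lam2

# Synthetic "sites": 550 low-risk counts and 100 high-risk counts.
counts = [0] * 200 + [1] * 250 + [2] * 100 + [7] * 30 + [8] * 40 + [9] * 30
pi_hat, lam_low, lam_high = em_two_poisson(counts)
# The fit should recover a ~85% low-risk class with mean ~0.8 crashes
# and a ~15% high-risk class with mean ~8 crashes.
```

Sites whose posterior responsibility favours the high-mean component are the mixture analogue of blackspots; the BLC model additionally places priors on the class contributions instead of maximizing the likelihood.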

Relevance: 20.00%

Abstract:

Previous research identifies various reasons companies invest in information technology (IT), often as a means to generate value. To add to the discussion of IT value generation, this study investigates investments in enterprise software systems that support business processes. Managers of more than 500 Swiss small and medium-sized enterprises (SMEs) responded to a survey regarding the levels of their IT investment in enterprise software systems and the perceived utility of those investments. The authors use logistic and ordinary least squares regression to examine whether IT investments in two business processes affect SMEs' performance and competitive advantage. Using cluster analysis, they also develop a firm typology with four distinct groups that differ in their investments in enterprise software systems. These findings offer key implications for both research and managerial practice.

Relevance: 20.00%

Abstract:

Business process models have become an effective way of examining business practices to identify areas for improvement. While common information-gathering approaches are generally effective, they can be quite time consuming and risk introducing inaccuracies when information is forgotten or incorrectly interpreted by analysts. In this study, the potential of a role-playing approach to process elicitation and specification is examined. The method allows stakeholders to enter a virtual world and role-play actions as they would in reality. As actions are completed, a model is automatically developed, removing the need for stakeholders to learn and understand a modelling grammar. An empirical investigation comparing both the modelling outputs and the participant behaviour of this virtual-world role-play elicitor with an S-BPM process modelling tool found that, while the modelling approaches of the two groups varied greatly, the virtual-world elicitor may not only improve both the number of individual process task steps remembered and the correctness of task ordering, but also reduce the time required for stakeholders to model a process view.

Relevance: 20.00%

Abstract:

The interdependence of Greece and other European stock markets, and the subsequent portfolio implications, are examined in the wavelet and variational mode decomposition domains. In applying the decomposition techniques, we analyze the structural properties of the data and distinguish between short- and long-term dynamics of stock market returns. First, GARCH-type models are fitted to obtain the standardized residuals. Next, different copula functions are evaluated and, based on conventional information criteria, the time-varying Joe-Clayton copula is chosen to model the tail dependence between the stock markets. The short-run lower-tail dependence time paths show a sudden increase in comovement during the global financial crisis. The long-run dependence results suggest that European stock markets have a higher interdependence with the Greek stock market. Individual countries' Value at Risk (VaR) separates the countries into two distinct groups. Finally, the two-asset portfolio VaR measures identify potential markets for diversifying investment in the Greek stock market.
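Value at Risk, used above to group the markets, can be illustrated with the simplest historical-simulation estimator: the loss at the empirical alpha-quantile of past returns. This is a generic sketch, not the paper's copula-based VaR:

```python
def historical_var(returns, alpha=0.05):
    """Historical-simulation VaR: the loss not exceeded with probability 1 - alpha,
    read off the empirical alpha-quantile of the return sample."""
    ordered = sorted(returns)          # worst returns first
    idx = int(alpha * len(ordered))    # index of the alpha-quantile
    return -ordered[idx]               # report the loss as a positive number

# Ten daily returns; the 10% VaR picks out the second-worst outcome.
rets = [0.012, -0.034, 0.005, -0.011, 0.021, -0.052, 0.008, -0.003, 0.017, -0.019]
var_10 = historical_var(rets, alpha=0.10)  # 0.034, i.e. a 3.4% loss
```

Portfolio VaR in the paper is computed from the joint (copula-modelled) return distribution rather than a single series, but the quantile reading is the same.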

Relevance: 20.00%

Abstract:

Solving large-scale all-to-all comparison problems using distributed computing is increasingly significant for various applications. Previous efforts to implement distributed all-to-all comparison frameworks have treated the two phases of data distribution and comparison task scheduling separately. This leads to high storage demands as well as poor data locality for the comparison tasks, creating a need to redistribute the data at runtime. Furthermore, most previous methods have been developed for homogeneous computing environments, so their overall performance degrades even further when they are used in heterogeneous distributed systems. To tackle these challenges, this paper presents a data-aware task scheduling approach for solving all-to-all comparison problems in heterogeneous distributed systems. The approach formulates the requirements for data distribution and comparison task scheduling simultaneously as a constrained optimization problem. Then, metaheuristic data pre-scheduling and dynamic task scheduling strategies are developed, along with an algorithmic implementation, to solve the problem. The approach provides perfect data locality for all comparison tasks, avoiding rearrangement of data at runtime. It achieves load balancing among heterogeneous computing nodes, thus reducing the overall computation time. It also reduces data storage requirements across the network. The effectiveness of the approach is demonstrated through experimental studies.
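The load-balancing goal on heterogeneous nodes can be illustrated with a greedy longest-processing-time heuristic: take tasks largest-first and give each to the node that would finish it earliest given that node's speed. This is a generic sketch of the idea, not the paper's constrained-optimization formulation, and it ignores the data-locality constraints the paper couples with scheduling:

```python
def schedule(task_costs, node_speeds):
    """Greedy LPT for heterogeneous nodes: biggest tasks first, each assigned
    to the node with the earliest finish time (current load + cost / speed)."""
    loads = [0.0] * len(node_speeds)   # accumulated run time per node
    assignment = {}
    for tid, cost in sorted(enumerate(task_costs), key=lambda t: -t[1]):
        node = min(range(len(node_speeds)),
                   key=lambda n: loads[n] + cost / node_speeds[n])
        loads[node] += cost / node_speeds[node]
        assignment[tid] = node
    return assignment, loads

# Four comparison tasks on two nodes, the second node twice as fast:
assignment, loads = schedule([4.0, 3.0, 2.0, 1.0], [1.0, 2.0])
# Fast node takes tasks 0, 2 and 3; the makespan is max(loads) = 3.5.
```

The makespan, max(loads), is the quantity a load-balancing scheduler minimizes; the paper's approach additionally constrains where input data may be placed so that no task needs data moved at runtime.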

Relevance: 20.00%

Abstract:

Imbalance is not only a major direct cause of downtime in wind turbines; it also accelerates the degradation of neighbouring and downstream components (e.g. the main bearing and generator). Along with detection, quantifying the imbalance is essential, as some residual imbalance always exists even in a healthy turbine. Three commonly used sensor technologies (vibration, acoustic emission and electrical measurements) are investigated in this work to verify their sensitivity to different imbalance grades. The study is based on data obtained from experimental tests performed on a small-scale wind turbine drive-train test rig for different shaft speeds and imbalance levels. According to the analysis results, electrical measurements appear to be the most suitable for tracking the development of imbalance.

Relevance: 20.00%

Abstract:

Although hundreds of thousands of organic products are traded daily, little is known about how imported organic products are evaluated by consumers in the importing country. This paper analyzes Japanese wine point-of-sale (POS) data to examine whether consumers differentiate between local and imported organic products. The results of our hedonic analyses show that the premium for imported organic red (white) wines is about 42.996% (8.872%), while that for domestic organic red (white) wines is about 6.440% (1.214%), implying that Japanese consumers pay higher premiums for imported organic agricultural products than for those produced in Japan.
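In a hedonic log-price regression, a dummy coefficient beta maps to a percentage price premium of exp(beta) - 1. The coefficient below is hypothetical, chosen only to show the conversion, and is not an estimate from the paper:

```python
import math

def premium_pct(beta):
    """Percent price premium implied by a dummy-variable coefficient
    in a log-price hedonic regression: (exp(beta) - 1) * 100."""
    return (math.exp(beta) - 1.0) * 100.0

# Hypothetical coefficient of 0.25 on an "imported organic" dummy:
p = premium_pct(0.25)  # about 28.40%
```

The conversion matters because reading beta itself as a percentage overstates small premiums and understates large ones.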

Relevance: 20.00%

Abstract:

Non-resident workforces experience high labour turnover, which affects organisational operations and worker satisfaction and, in turn, partners' ability to cope with work-related absences. Research suggests that partner satisfaction may be increased by providing a range of support services, including professional, practical and social support. A search was conducted to identify the support available to non-resident workers in the resources and health industries, and the spread of supports in each industry was compared and contrasted with that available to families of deployed defence personnel. The resources industry primarily provided social support and lacked professional and practical supports. Health-professional support services were largely directed towards extended locum support rather than towards Fly-In Fly-Out workers. Improving sources of support to parallel those provided to the Australian Defence Force is suggested as a way to increase partner satisfaction. The implications are to understand the level of uptake, perceived importance and utilisation of such support services.

Relevance: 20.00%

Abstract:

Objective(s) To describe how doctors define and use the terms "futility" and "futile treatment" in end-of-life care.
Design, Setting, Participants A qualitative study using semi-structured interviews with 96 doctors, across a range of specialties, who treat adults at the end of life. Doctors were recruited from three large Australian teaching hospitals and were interviewed from May to July 2013.
Results Doctors' conceptions of futility focused on the quality and chance of patient benefit. Aspects of benefit included physiological effect, the weighing of benefits and burdens, and quantity and quality of life. Quality and length of life were linked, but many doctors described instances in which benefit was determined by quality of life alone. Most doctors described the assessment of the chance of achieving patient benefit as a subjective exercise. Despite a broad conceptual consensus about what futility means, doctors noted variability in how the concept was applied in clinical decision-making. Over half the doctors also identified treatment that is futile but nevertheless justified, such as short-term treatment that supports the family of a dying person.
Conclusions There is an overwhelming preference for a qualitative approach to assessing futility, which brings with it variation in clinical decision-making. "Patient benefit" is at the heart of doctors' definitions of futility. Determining patient benefit requires discussions with patients and families about their values and goals, as well as the burdens and benefits of further treatment.