14 results for utility analysis

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance: 100.00%

Abstract:

A simulation model adopting a health system perspective showed population-based screening with DXA, followed by alendronate treatment of persons with osteoporosis, or with anamnestic fracture and osteopenia, to be cost-effective in Swiss postmenopausal women from age 70, but not in men. INTRODUCTION: We assessed the cost-effectiveness of a population-based screen-and-treat strategy for osteoporosis (DXA followed by alendronate treatment if osteoporotic, or osteopenic in the presence of fracture), compared to no intervention, from the perspective of the Swiss health care system. METHODS: A published Markov model assessed by first-order Monte Carlo simulation was refined to reflect the diagnostic process and treatment effects. Women and men entered the model at age 50. Main screening ages were 65, 75, and 85 years. Age at bone densitometry was flexible for persons fracturing before the main screening age. Realistic assumptions were made with respect to persistence with intended 5 years of alendronate treatment. The main outcome was cost per quality-adjusted life year (QALY) gained. RESULTS: In women, costs per QALY were Swiss francs (CHF) 71,000, CHF 35,000, and CHF 28,000 for the main screening ages of 65, 75, and 85 years. The threshold of CHF 50,000 per QALY was reached between main screening ages 65 and 75 years. Population-based screening was not cost-effective in men. CONCLUSION: Population-based DXA screening, followed by alendronate treatment in the presence of osteoporosis, or of fracture and osteopenia, is a cost-effective option in Swiss postmenopausal women after age 70.
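The first-order Monte Carlo evaluation of a Markov model described above can be sketched in a few lines. Everything numeric below is an invented placeholder (states, cycle count, transition probabilities, costs in CHF, utilities), not the calibrated Swiss inputs; the same random seed is reused in both arms (common random numbers), so the treatment arm differs only through its lower fracture risk.

```python
import random

# Minimal first-order Monte Carlo sketch of a Markov screen-and-treat model.
# All probabilities, costs (CHF), and utilities are hypothetical placeholders.

def simulate(treated, seed, cycles=35):
    rng = random.Random(seed)           # same seed in both arms (common random numbers)
    state = "well"
    cost = 2000.0 if treated else 0.0   # hypothetical screening + drug cost
    qalys = 0.0
    p_fracture = 0.012 if treated else 0.020   # treatment lowers fracture risk
    for _ in range(cycles):
        qalys += 0.85 if state == "well" else 0.70   # hypothetical state utilities
        r = rng.random()
        if state == "well":
            if r < 0.020:                  # hypothetical background mortality
                break
            if r < 0.020 + p_fracture:
                state = "fracture"
                cost += 8000.0             # hypothetical fracture cost
        elif r < 0.040:                    # higher mortality after fracture
            break
    return cost, qalys

def icer(n=5000):
    """Incremental cost per QALY gained, screening vs. no intervention."""
    d_cost = d_qaly = 0.0
    for i in range(n):
        c1, q1 = simulate(True, i)
        c0, q0 = simulate(False, i)
        d_cost += c1 - c0
        d_qaly += q1 - q0
    return d_cost / d_qaly
```

The common-random-numbers trick makes the treated arm dominate the untreated arm path by path, which sharply reduces the Monte Carlo noise on the incremental cost-effectiveness ratio.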

Relevance: 30.00%

Abstract:

Economic theory distinguishes two concepts of utility: decision utility, objectively quantifiable by choices, and experienced utility, referring to the satisfaction derived from obtaining an outcome. To date, experienced utility is typically measured with subjective ratings. This study aimed to quantify experienced utility by global levels of neuronal activity. Neuronal activity was measured by means of electroencephalographic (EEG) responses to gain and omission of graded monetary rewards at the level of the EEG topography in human subjects. A novel analysis approach allowed us to approximate psychophysiological value functions for the experienced utility of monetary rewards. In addition, we identified the time windows of the event-related potentials (ERP) and the respective intracortical sources in which variations in neuronal activity were significantly related to the value or valence of outcomes. Results indicate that value functions of experienced utility and regret increase disproportionately with monetary value, and thus contradict the compressing value functions of decision utility. The temporal pattern of outcome evaluation suggests an initial (∼250 ms) coarse evaluation of valence, concurrent with a finer-grained evaluation of the value of gained rewards, whereas the evaluation of the value of omitted rewards emerges later. We hypothesize that this temporal double dissociation is explained by reward prediction errors. Finally, a late, previously unreported, reward-sensitive ERP topography (∼500 ms) was identified. The sources of these topographical covariations were estimated in the ventromedial prefrontal cortex, the medial frontal gyrus, the anterior and posterior cingulate cortex, and the hippocampus/amygdala. The results provide important new evidence regarding “how,” “when,” and “where” the brain evaluates outcomes with different hedonic impact.

Relevance: 30.00%

Abstract:

OBJECTIVES: To evaluate the potential improvement of antimicrobial treatment by utilizing a new multiplex polymerase chain reaction (PCR) assay that identifies sepsis-relevant microorganisms in blood. DESIGN: Prospective, observational, international, multicenter trial. SETTING: University hospitals in Germany (n = 2), Spain (n = 1), and the United States (n = 1), and one Italian tertiary general hospital. PATIENTS: A total of 436 sepsis patients with 467 episodes of antimicrobial treatment. METHODS: Whole blood for PCR and blood culture (BC) analysis was sampled independently for each episode. The potential impact of reporting microorganisms by PCR on adequacy and timeliness of antimicrobial therapy was analyzed. The number of gainable days on early adequate antimicrobial treatment attributable to PCR findings was assessed. MEASUREMENTS AND MAIN RESULTS: Sepsis criteria, days on antimicrobial therapy, antimicrobial substances administered, and microorganisms identified by PCR and BC susceptibility tests. RESULTS: BC diagnosed 117 clinically relevant microorganisms; PCR identified 154. Ninety-nine episodes were BC positive (BC+); 131 episodes were PCR positive (PCR+). Overall, 127.8 days of clinically inadequate empirical antibiotic treatment were observed in the 99 BC+ episodes. Utilizing PCR-aided diagnostics would have yielded a potential reduction of 106.5 clinically inadequate treatment days. The ratio of gainable early adequate treatment days to the number of PCR tests performed is 22.8 days/100 tests overall (confidence interval 15-31) and 36.4 days/100 tests in the intensive care and surgical ward populations (confidence interval 22-51). CONCLUSIONS: Rapid PCR identification of microorganisms may contribute to a reduction of early inadequate antibiotic treatment in sepsis.
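The reported overall ratio can be reproduced from the abstract's own figures, assuming one PCR test per treatment episode (467 tests), which the abstract implies but does not state outright:

```python
# Reproducing the reported gainable-days ratio from the abstract's numbers,
# assuming one PCR test per antimicrobial treatment episode (467 tests).
gainable_days = 106.5        # potential reduction in inadequate treatment days
pcr_tests = 467              # episodes of antimicrobial treatment
ratio_per_100 = gainable_days / pcr_tests * 100   # → 22.8 days per 100 tests
```

Under that assumption the arithmetic lands exactly on the paper's 22.8 days/100 tests.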

Relevance: 30.00%

Abstract:

Amyloids and prion proteins are clinically and biologically important beta-structures, whose supersecondary structures are difficult to determine by standard experimental or computational means. In addition, significant conformational heterogeneity is known or suspected to exist in many amyloid fibrils. Recent work has indicated the utility of pairwise probabilistic statistics in beta-structure prediction. We develop here a new strategy for beta-structure prediction, emphasizing the determination of beta-strands and pairs of beta-strands as fundamental units of beta-structure. Our program, BETASCAN, calculates likelihood scores for potential beta-strands and strand-pairs based on correlations observed in parallel beta-sheets. The program then determines the strands and pairs with the greatest local likelihood for all of the sequence's potential beta-structures. BETASCAN suggests multiple alternate folding patterns and assigns relative a priori probabilities based solely on amino acid sequence, probability tables, and pre-chosen parameters. The algorithm compares favorably with the results of previous algorithms (BETAPRO, PASTA, SALSA, TANGO, and Zyggregator) in beta-structure prediction and amyloid propensity prediction. Accurate prediction is demonstrated for experimentally determined amyloid beta-structures, for a set of known beta-aggregates, and for the parallel beta-strands of beta-helices, amyloid-like globular proteins. BETASCAN is able both to detect beta-strands with higher sensitivity and to detect the edges of beta-strands in a richly beta-like sequence. For two proteins (Abeta and Het-s), there exist multiple sets of experimental data implying contradictory structures; BETASCAN is able to detect each competing structure as a potential structure variant. The ability to correlate multiple alternate beta-structures to experiment opens the possibility of computational investigation of prion strains and structural heterogeneity of amyloid. 
BETASCAN is publicly accessible on the Web at http://betascan.csail.mit.edu.
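A toy sketch of the pairwise likelihood idea: score every candidate strand pair by summing per-position pairing log-odds and keep the pair with the greatest local score. The propensity table below is invented for illustration (hydrophobic residues favored, proline penalized); it is not BETASCAN's trained statistics.

```python
# Toy pairwise scoring of candidate parallel beta-strand pairs.
# The log-odds table is invented for illustration, not BETASCAN's tables.
HYDROPHOBIC = set("AFILMVWY")
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

PAIR_LOGODDS = {}
for a in AMINO_ACIDS:
    for b in AMINO_ACIDS:
        score = 0.4 if a in HYDROPHOBIC and b in HYDROPHOBIC else -0.2
        if "P" in (a, b):
            score = -1.5          # proline strongly disfavored in strands
        PAIR_LOGODDS[a, b] = score

def pair_score(seq, i, j, length):
    """Summed log-odds for pairing seq[i:i+length] with seq[j:j+length]."""
    return sum(PAIR_LOGODDS[seq[i + k], seq[j + k]] for k in range(length))

def best_pair(seq, length=5):
    """(score, i, j) of the non-overlapping strand pair with the best score."""
    return max((pair_score(seq, i, j, length), i, j)
               for i in range(len(seq) - length + 1)
               for j in range(i + length, len(seq) - length + 1))
```

On a made-up sequence such as "VIVAVPKDGSAVLIVF", the best-scoring pair matches the two hydrophobic stretches and skips the proline-containing middle, the kind of local-likelihood choice the abstract describes.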

Relevance: 30.00%

Abstract:

Stemmatology, or the reconstruction of the transmission history of texts, is a field that stands particularly to gain from digital methods. Many scholars already take stemmatic approaches that rely heavily on computational analysis of the collated text (e.g. Robinson and O’Hara 1996; Salemans 2000; Heikkilä 2005; Windram et al. 2008 among many others). Although there is great value in computationally assisted stemmatology, providing as it does a reproducible result and allowing access to the relevant methodological process in related fields such as evolutionary biology, computational stemmatics is not without its critics. The current state-of-the-art effectively forces scholars to choose between a preconceived judgment of the significance of textual differences (the Lachmannian or neo-Lachmannian approach, and the weighted phylogenetic approach) or to make no judgment at all (the unweighted phylogenetic approach). Some basis for judgment of the significance of variation is sorely needed for medieval text criticism in particular. By this, we mean that there is a need for a statistical empirical profile of the text-genealogical significance of the different sorts of variation in different sorts of medieval texts. The rules that apply to copies of Greek and Latin classics may not apply to copies of medieval Dutch story collections; the practices of copying authoritative texts such as the Bible will most likely have been different from the practices of copying the Lives of local saints and other commonly adapted texts. It is nevertheless imperative that we have a consistent, flexible, and analytically tractable model for capturing these phenomena of transmission. In this article, we present a computational model that captures most of the phenomena of text variation, and a method for analysis of one or more stemma hypotheses against the variation model. We apply this method to three ‘artificial traditions’ (i.e. texts copied under laboratory conditions by scholars to study the properties of text variation) and four genuine medieval traditions whose transmission history is known or deduced in varying degrees. Although our findings are necessarily limited by the small number of texts at our disposal, we demonstrate here some of the wide variety of calculations that can be made using our model. Certain of our results call sharply into question the utility of excluding ‘trivial’ variation such as orthographic and spelling changes from stemmatic analysis.

Relevance: 30.00%

Abstract:

The popularity of Online Social Networks has recently been overshadowed by the privacy problems they pose. Users are becoming increasingly vigilant about the information they disclose and strongly oppose the use of their information for commercial purposes. Nevertheless, as long as the network is offered to users for free, providers have little choice but to generate revenue through personalized advertising to remain financially viable. Our study empirically investigates the ways out of this deadlock. Using conjoint analysis we find that privacy is indeed important for users. We identify three groups of users with different utility patterns: Unconcerned Socializers, Control-conscious Socializers and Privacy-concerned. Our results provide relevant insights into how network providers can capitalize on different user preferences by specifically addressing the needs of distinct groups in the form of various premium accounts. Overall, our study is the first attempt to assess the value of privacy in monetary terms in this context.

Relevance: 30.00%

Abstract:

Background: A clinically relevant bleeding diathesis is a frequent diagnostic challenge, which sometimes remains unexplained despite extensive investigations. The aim of our work was to evaluate the diagnostic utility of functional platelet testing by flow cytometry in this context. Methods: In case of negative results after standard laboratory work-up, flow cytometric analysis (FCA) of platelet function was performed. We analyzed surface glycoproteins (GP) Ibα, IIb, and IIIa; P-selectin expression and PAC-1 binding after graded doses of ADP, collagen and thrombin; content/secretion of dense granules; and the ability to generate procoagulant platelets. Results: Out of 437 patients investigated with standard tests between January 2007 and December 2011, we identified 67 (15.3%) with high bleeding scores and a non-diagnostic standard laboratory work-up including platelet aggregation studies. Among these patients, FCA revealed several potentially causative platelet defects: decreased dense-granule content/secretion (n=13); decreased alpha-granule secretion induced by ADP (n=10), convulxin (n=4) or thrombin (n=3); decreased fibrinogen-receptor activation induced by ADP (n=11), convulxin (n=11) or thrombin (n=8); and decreased generation of COAT-platelets, i.e. highly procoagulant platelets induced by simultaneous activation with collagen and thrombin (n=16). Conclusion: Our work confirms that storage pool defects are frequent in patients with a bleeding diathesis and normal coagulation and platelet aggregation studies. Additionally, flow cytometric analysis is able to identify discrete platelet activation defects. In particular, we show for the first time that a relevant proportion of these patients has an isolated impaired ability to generate COAT-platelets - a conceptually new defect in platelet procoagulant activity that is missed by conventional laboratory work-up.

Relevance: 30.00%

Abstract:

Growth codes are a subclass of Rateless codes that have found interesting applications in data dissemination problems. Compared to other Rateless and conventional channel codes, Growth codes show improved intermediate performance, which is particularly useful in applications where partial data has some utility. In this paper, we investigate the asymptotic performance of Growth codes using the Wormald method, which was proposed for studying the Peeling Decoder of LDPC and LDGM codes. In contrast to previous works, the Wormald differential equations are formulated from the nodes' perspective, which enables a numerical solution for the expected asymptotic decoding performance of Growth codes. Our framework is appropriate for any class of Rateless codes that does not include a precoding step. We further study the performance of Growth codes with moderate and large codeblocks through simulations and use the generalized logistic function to model the decoding probability. We then exploit the decoding probability model in an illustrative application of Growth codes to error resilient video transmission. The video transmission problem is cast as a joint source and channel rate allocation problem that is shown to be convex with respect to the channel rate. This illustrative application highlights the main advantage of Growth codes, namely improved performance in the intermediate loss region.
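The generalized logistic model of decoding probability mentioned above can be written directly as a function of the overhead ratio (received symbols over source symbols). The growth rate, midpoint, and shape parameter below are illustrative values, not parameters fitted to Growth codes simulations.

```python
import math

# Generalized logistic model of intermediate decoding probability as a
# function of the overhead ratio received/k. The parameters growth,
# midpoint and nu are illustrative, not fitted values.

def decoding_probability(received, k, growth=25.0, midpoint=1.05, nu=1.0):
    x = received / k                     # overhead ratio
    return (1.0 + math.exp(-growth * (x - midpoint))) ** (-1.0 / nu)
```

Below the midpoint overhead the probability stays near zero and above it near one, which is the steep transition such intermediate-performance curves are meant to capture; nu skews the curve asymmetrically around the midpoint.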

Relevance: 30.00%

Abstract:

BACKGROUND: In clinical practice, the high dose ACTH stimulation test (HDT) is frequently used in the assessment of adrenal insufficiency (AI). However, there is uncertainty regarding the optimal time-points and number of blood samplings. The present study compared the utility of a single cortisol value taken either 30 or 60 minutes after ACTH stimulation with the traditional interpretation of the HDT. METHODS: Retrospective analysis of 73 HDTs performed at a single tertiary endocrine centre. Serum cortisol was measured at baseline and 30 and 60 minutes after intravenous administration of 250 µg synthetic ACTH1-24. Adrenal insufficiency was defined as a stimulated cortisol level <550 nmol/l. RESULTS: Twenty patients (27.4%) showed an insufficient rise in serum cortisol using traditional HDT criteria and were diagnosed with AI. Ten individuals showed insufficient cortisol values after 30 minutes, rising to sufficient levels at 60 minutes. All patients with an insufficient cortisol response after 60 minutes also had an insufficient result after 30 minutes. The cortisol value taken after 30 minutes did not add incremental diagnostic value in any of the cases under investigation compared with the 60-minute sample. CONCLUSIONS: Based on the findings of the present analysis, the utility of a cortisol measurement 30 minutes after high dose ACTH injection was low and did not add incremental diagnostic value to a single measurement after 60 minutes.
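The decision rule being compared is simple enough to state in code. The cutoff and units follow the abstract (550 nmol/l); the function names are ours, not the study's.

```python
# Interpreting the high dose ACTH test (HDT): adrenal insufficiency (AI)
# if the stimulated serum cortisol (nmol/l) stays below the cutoff.
CUTOFF_NMOL_L = 550

def ai_traditional(cortisol_30, cortisol_60):
    """Traditional HDT reading: insufficient only if both samples stay low."""
    return max(cortisol_30, cortisol_60) < CUTOFF_NMOL_L

def ai_single_60(cortisol_60):
    """Single 60-minute reading, which the study found equally informative."""
    return cortisol_60 < CUTOFF_NMOL_L
```

The ten patients who were low at 30 minutes but rose above the cutoff by 60 minutes come out as sufficient under both rules, which is exactly why the 30-minute sample added no diagnostic value in this series.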

Relevance: 30.00%

Abstract:

An accurate detection of individuals at clinical high risk (CHR) for psychosis is a prerequisite for effective preventive interventions. Several psychometric interviews are available, but their prognostic accuracy is unknown. We conducted a prognostic accuracy meta-analysis of psychometric interviews used to examine referrals to high risk services. The index test was an established CHR psychometric instrument used to identify subjects with and without CHR (CHR+ and CHR-). The reference index was psychosis onset over time in both CHR+ and CHR- subjects. Data were analyzed with MIDAS (STATA13). Area under the curve (AUC), summary receiver operating characteristic curves, quality assessment, likelihood ratios, Fagan's nomogram and probability modified plots were computed. Eleven independent studies were included, with a total of 2,519 help-seeking, predominantly adult subjects (CHR+: N=1,359; CHR-: N=1,160) referred to high risk services. The mean follow-up duration was 38 months. The AUC was excellent (0.90; 95% CI: 0.87-0.93) and comparable to other tests in preventive medicine, suggesting clinical utility in subjects referred to high risk services. Meta-regression analyses revealed an effect for exposure to antipsychotics and no effects for type of instrument, age, gender, follow-up time, sample size, quality assessment, or proportion of CHR+ subjects in the total sample. Fagan's nomogram indicated a low positive predictive value (5.74%) in the general non-help-seeking population. Despite the clear need to further improve prediction of psychosis, these findings support the use of psychometric prognostic interviews for CHR as clinical tools for indicated prevention in subjects seeking help at high risk services worldwide.
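The Fagan's nomogram step from pretest to post-test probability is just an odds calculation. The pretest probability and positive likelihood ratio used in the example below are illustrative, not the meta-analysis estimates behind the reported 5.74% figure.

```python
# Post-test probability via Fagan's nomogram arithmetic:
# odds = p / (1 - p); post-test odds = pretest odds * likelihood ratio.

def post_test_probability(pretest_p, likelihood_ratio):
    pretest_odds = pretest_p / (1.0 - pretest_p)
    post_odds = pretest_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)
```

With a low pretest probability, as in the general non-help-seeking population, even a strong positive test leaves the post-test probability low: for example, a hypothetical 1% pretest probability and a likelihood ratio of 5 yields a post-test probability under 5%.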

Relevance: 30.00%

Abstract:

AIM To evaluate the prognostic value of electrophysiological stimulation (EPS) in the risk stratification for tachyarrhythmic events and sudden cardiac death (SCD). METHODS We conducted a prospective cohort study and analyzed the long-term follow-up of 265 consecutive patients who underwent programmed ventricular stimulation at the Luzerner Kantonsspital (Lucerne, Switzerland) between October 2003 and April 2012. Patients underwent EPS for SCD risk evaluation because of structural or functional heart disease and/or electrical conduction abnormality and/or after syncope/cardiac arrest. EPS was considered abnormal if a sustained ventricular tachycardia (VT) was inducible. The primary endpoint of the study was SCD or, in implanted patients, adequate ICD activation. RESULTS During EPS, sustained VT was induced in 125 patients (47.2%) and non-sustained VT in 60 patients (22.6%); in 80 patients (30.2%) no arrhythmia could be induced. In our cohort, 153 patients (57.7%) underwent ICD implantation after the EPS. During follow-up (mean duration 4.8 ± 2.3 years), a primary endpoint event occurred in 49 patients (18.5%). The area under the receiver operating characteristic curve (AUROC) was 0.593 (95%CI: 0.515-0.670) for a left ventricular ejection fraction (LVEF) < 35% and 0.636 (95%CI: 0.563-0.709) for inducible sustained VT during EPS. The AUROC of EPS was higher in the subgroup of patients with LVEF ≥ 35% (0.681, 95%CI: 0.578-0.785). Cox regression analysis showed that both sustained VT during EPS (HR: 2.26, 95%CI: 1.22-4.19, P = 0.009) and LVEF < 35% (HR: 2.00, 95%CI: 1.13-3.54, P = 0.018) were independent predictors of primary endpoint events. CONCLUSION EPS provides a benefit in risk stratification for future tachyarrhythmic events and SCD and should especially be considered in patients with LVEF ≥ 35%.

Relevance: 30.00%

Abstract:

RATIONALE The use of 6-minute-walk distance (6MWD) as an indicator of exercise capacity to predict postoperative survival in lung transplantation has not previously been well studied. OBJECTIVES To evaluate the association between 6MWD and postoperative survival following lung transplantation. METHODS Adult, first-time, lung-only transplantations per the United Network for Organ Sharing database from May 2005 to December 2011 were analyzed. Kaplan-Meier methods and Cox proportional hazards modeling were used to determine the association between preoperative 6MWD and post-transplant survival after adjusting for potential confounders. A receiver operating characteristic curve was used to determine the 6MWD value that provided maximal separation in 1-year mortality. A subanalysis was performed to assess the association between 6MWD and post-transplant survival by disease category. MEASUREMENTS AND MAIN RESULTS A total of 9,526 patients were included for analysis. The median 6MWD was 787 ft (25th-75th percentiles = 450-1,082 ft). Increasing 6MWD was associated with a significantly lower overall hazard of death (P < 0.001). A continuous increase in walk distance through 1,200-1,400 ft conferred an incremental survival advantage. Although 6MWD strongly correlated with survival, the ability of a single dichotomous value to predict outcomes was limited. All disease categories demonstrated significantly longer survival with increasing 6MWD (P ≤ 0.009) except pulmonary vascular disease (P = 0.74); however, the low volume in this category (n = 312; 3.3%) may limit the ability to detect an association. CONCLUSIONS 6MWD is significantly associated with post-transplant survival and is best incorporated into transplant evaluations on a continuous basis given the limited ability of a single dichotomous value to predict outcomes.
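The "maximal separation" cutoff search behind a receiver operating characteristic curve can be sketched with the Youden index. The walk distances and outcomes below are invented toy records, not the registry data.

```python
# Choosing the single walk-distance cutoff (ft) that maximally separates
# 1-year mortality, via the Youden index. The records are invented:
# (distance walked in ft, died within 1 year).
records = [(300, True), (420, True), (500, False), (640, True), (700, False),
           (820, False), (900, True), (1010, False), (1150, False), (1300, False)]

def youden(cutoff):
    """Sensitivity + specificity - 1 when 'walked < cutoff' predicts death."""
    tp = sum(1 for d, died in records if d < cutoff and died)
    fn = sum(1 for d, died in records if d >= cutoff and died)
    tn = sum(1 for d, died in records if d >= cutoff and not died)
    fp = sum(1 for d, died in records if d < cutoff and not died)
    return tp / (tp + fn) + tn / (tn + fp) - 1

best_cutoff = max((d for d, _ in records), key=youden)
```

Even the best single cutoff on these toy data misclassifies a long-walking death and a short-walking survivor, echoing the paper's point that a single dichotomous value has limited predictive power compared to the continuous distance.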

Relevance: 30.00%

Abstract:

Although the recycling of municipal wastewater can play an important role in water supply security and ecosystem protection, the percentage of wastewater recycled is generally low and strikingly variable. Previous research has employed detailed case studies to examine the factors that contribute to recycling success but usually lacks a comparative perspective across cases. In this study, 25 water utilities in New South Wales, Australia, were compared using fuzzy-set Qualitative Comparative Analysis (fsQCA). This research method applies binary logic and set theory to identify the minimal combinations of conditions that are necessary and/or sufficient for an outcome to occur within the set of cases analyzed. The influence of six factors (rainfall, population density, coastal or inland location, proximity to users, cost recovery, and revenue for water supply services) was examined for two outcomes, agricultural use and "heavy" (i.e., commercial/municipal/industrial) use. Each outcome was explained by two different pathways, illustrating that different combinations of conditions are associated with the same outcome. Generally, while economic factors are crucial for heavy use, factors relating to water stress and geographical proximity matter most for agricultural reuse. These results suggest that policies to promote wastewater reuse may be most effective if they target uses that are most feasible for utilities and correspond to the local context. This work also makes a methodological contribution by illustrating the potential utility of fsQCA for understanding the complex drivers of performance in water recycling.
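The set-theoretic core of fsQCA reduces to two measures over fuzzy membership scores. The membership values below are invented, and the variable names are ours for illustration, not the study's coded data.

```python
# fsQCA's two core measures over fuzzy set memberships in [0, 1]:
# consistency: how nearly cases with the condition are a subset of the outcome;
# coverage: how much of the outcome the condition accounts for.

def consistency(condition, outcome):
    return sum(min(c, o) for c, o in zip(condition, outcome)) / sum(condition)

def coverage(condition, outcome):
    return sum(min(c, o) for c, o in zip(condition, outcome)) / sum(outcome)

# Hypothetical membership scores for five utilities.
cost_recovery = [0.9, 0.8, 0.2, 0.7, 0.1]
heavy_reuse   = [0.8, 0.9, 0.3, 0.6, 0.2]
```

A consistency near 1 would support treating cost recovery as (close to) sufficient for heavy reuse within the analyzed cases, while coverage indicates how much of the heavy-reuse outcome that pathway explains.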