868 results for Cost Over run


Relevance:

30.00%

Publisher:

Abstract:

The Bora wind is a mesoscale phenomenon which typically affects the Adriatic Sea basin for several days each year, especially during winter. The Bora wind has been studied for its intense outbreak across the Dinaric Alps. The properties of the Bora wind are widely discussed in the literature, and scientific papers usually focus on the eastern Adriatic coast, where strong turbulence and severe gust intensity are more pronounced. However, the impact of the Bora wind can also be significant over Italy, and not only in terms of wind speed intensity. Depending on the synoptic pressure pattern (cyclonic or anticyclonic Bora) and on the season, heavy snowfall, severe storms, storm surges and floods can occur along the Adriatic coast and on the windward flanks of the Apennines. In the present work five Bora cases that occurred in recent years have been selected and their evolution has been simulated with the BOLAM-MOLOCH model set, developed at ISAC-CNR in Bologna. Each case study has been addressed by a control run and by several sensitivity tests, performed with the purpose of better understanding the role played by air-sea latent and sensible heat fluxes. The tests show that the removal of the fluxes induces modifications in the wind approaching the coast and a decrease in the total precipitation amount predicted over Italy. In order to assess the role of heat fluxes, further analysis has been carried out: column-integrated water vapour fluxes have been computed along the Italian coastline and an atmospheric water balance has been evaluated inside a box volume over the Adriatic Sea. The balance computation shows that, although the latent heat flux produces a significant impact on the precipitation field, its contribution to the balance is relatively minor. The most significant and lasting case study, that of February 2012, has been studied in more detail in order to explain the marked drop in the total precipitation amount simulated in the sensitivity tests with removed heat fluxes with respect to the CNTRL run. In these experiments the relative humidity and potential temperature distributions over different cross-sections have been examined. With respect to the CNTRL run, a drier and more stable boundary layer, characterised by a more pronounced wind shear at the lower levels, is observed to establish above the Adriatic Sea. Finally, in order to demonstrate that the interaction of the Bora flow with the Apennines also plays a crucial role, sensitivity tests varying the orography height have been considered. The results of these sensitivity tests indicate that the propagation of the Bora wind over the Adriatic Sea, and in turn its meteorological impact over Italy, is influenced both by the large air-sea heat fluxes and by the interaction with the Apennines, which decelerates the upstream flow.
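
For reference, the column-integrated water vapour flux used in the analysis above is a standard diagnostic: the vertical integral of specific humidity times the horizontal wind over pressure, divided by gravity. The sketch below is a minimal illustration of that calculation on a single hypothetical sounding; the variable names, profile values and level spacing are assumptions for illustration and are not output of the BOLAM-MOLOCH simulations.

```python
import numpy as np

G = 9.81  # gravitational acceleration (m s^-2)

def column_integrated_vapour_flux(q, u, v, p):
    """Column-integrated water vapour flux (kg m^-1 s^-1).

    q    : specific humidity (kg/kg) on pressure levels
    u, v : wind components (m/s) on the same levels
    p    : pressure levels (Pa), ordered from surface to top
    Implements IVT = (1/g) * integral of q*V dp.
    """
    # trapz along decreasing pressure gives a negative value, hence the sign flip.
    flux_u = -np.trapz(q * u, p) / G
    flux_v = -np.trapz(q * v, p) / G
    return flux_u, flux_v, np.hypot(flux_u, flux_v)

# Hypothetical Bora-like sounding: moist low levels, drier aloft, north-easterly flow.
p = np.array([1000e2, 925e2, 850e2, 700e2, 500e2, 300e2])   # Pa
q = np.array([8.0, 6.5, 5.0, 3.0, 1.0, 0.2]) * 1e-3          # kg/kg
u = np.array([-12.0, -15.0, -14.0, -10.0, -6.0, -3.0])       # m/s
v = np.array([-6.0, -8.0, -7.0, -5.0, -3.0, -1.0])           # m/s
print(column_integrated_vapour_flux(q, u, v, p))
```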

Relevance:

30.00%

Publisher:

Abstract:

Local and regional procurement (LRP) of food aid is often claimed to lead to a quicker and more cost-effective response. We generate timeliness and cost-effectiveness estimates by comparing US-funded LRP activities in nine countries against in-kind, transoceanic food aid shipments from the US to the same countries during the same timeframe. Procuring food locally or distributing cash or vouchers results in a time savings of nearly 14 weeks, a 62 percent gain. Cost-effectiveness varies significantly by commodity type. Procuring grains locally saved over 50 percent, on average, while local procurement of processed commodities was not always cost-effective.

Relevance:

30.00%

Publisher:

Abstract:

Human development causes degradation of stream ecosystems due to impacts on channel morphology, hydrology, and water quality. Urbanization, the second leading cause of stream impairment, increases the amount of impervious surface cover, thus reducing infiltration and increasing surface runoff of precipitation, which ultimately affects stream hydrologic processes and aquatic biodiversity. The main objective of this study was to assess the overall health of Miller Run, a small tributary of the Bull Run and Susquehanna River watersheds, through an integrative hydrologic and water quality approach in order to determine the degree of Bucknell University’s impact on the stream. Hydrologic conditions, including stage and discharge, and water quality conditions, including total suspended solids, ion, nutrient, and dissolved metal concentrations, specific conductivity, pH, and temperature, were measured and evaluated at two sampling sites (upstream and downstream of Bucknell’s main campus) during various rain events from September 2007 to March 2008. The primary focus of the stream analysis was one main rain event on 26 February 2008. The results provided evidence that Miller Run is impacted by Bucknell’s campus. From a hydrologic perspective, the stream’s hydrograph showed the exact opposite pattern of what would be expected from a ‘normal’ stream: Miller Run had a flashier downstream hydrograph and a broader upstream hydrograph, most likely due to the increased amount of impervious surface cover throughout the downstream half of the watershed. From a water quality perspective, sediment loads increased at a faster rate and were significantly higher downstream compared to upstream. These elevated sediment concentrations were probably the combined result of sediment runoff from upstream and downstream construction sites that were being developed over the course of the study. Sodium, chloride, and potassium concentrations, in addition to specific conductivity, also significantly increased downstream of Bucknell’s campus due to the runoff of road salts. Calcium and magnesium concentrations did not appear to be impacted by urbanization, although they did demonstrate a significant dilution effect downstream. The downstream site was not directly affected by elevated nitrate concentrations; however, soluble reactive phosphorus concentrations tended to increase downstream and ammonium concentrations significantly peaked partway through the rain event downstream. These patterns suggest that Miller Run may be impacted by nutrient runoff from the golf course, athletic fields, and/or fertilizer applications on the main campus. Dissolved manganese and iron concentrations also appeared to increase slightly downstream, demonstrating the effect of urban runoff from roads and parking lots. pH and temperature both decreased farther downstream, but neither showed a significant impact of urbanization. More studies are necessary to determine how Miller Run responds to changes in season, climate, precipitation intensity, and land use. This study represents the baseline analysis of Miller Run’s current hydrologic and water quality conditions; based on these initial findings, Bucknell should strongly consider modifications to improve stormwater management practices and to reduce the campus’s overall impact on the stream in order to enhance and preserve the integrity of its natural water resources.
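
One way to make the "flashier downstream hydrograph" observation concrete is a flashiness index computed from the storm hydrographs. The sketch below applies the Richards-Baker flashiness index to two hypothetical discharge series; the choice of index and the numbers are illustrative assumptions, not the metric or data used in this study.

```python
def flashiness_index(discharge):
    """Richards-Baker flashiness index: the sum of absolute changes in
    discharge between successive observations divided by total discharge.
    Higher values indicate a flashier hydrograph."""
    changes = sum(abs(b - a) for a, b in zip(discharge, discharge[1:]))
    return changes / sum(discharge)

# Hypothetical storm hydrographs (m^3/s) at the two sampling sites.
upstream   = [0.05, 0.08, 0.14, 0.20, 0.22, 0.20, 0.16, 0.12, 0.09, 0.07]
downstream = [0.06, 0.25, 0.55, 0.40, 0.22, 0.14, 0.10, 0.08, 0.07, 0.06]
print("upstream R-B index:  ", round(flashiness_index(upstream), 3))    # broader response
print("downstream R-B index:", round(flashiness_index(downstream), 3))  # flashier response
```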

Relevance:

30.00%

Publisher:

Abstract:

The present study was conducted to estimate the direct losses due to Neospora caninum in Swiss dairy cattle and to assess the costs and benefits of different potential control strategies. A Monte Carlo simulation spreadsheet module was developed to estimate the direct costs caused by N. caninum, with and without control strategies, and to estimate the costs of these control strategies in a financial analysis. The control strategies considered were "testing and culling of seropositive female cattle", "discontinued breeding with offspring from seropositive cows", "chemotherapeutical treatment of female offspring" and "vaccination of all female cattle". Each parameter in the module that was considered uncertain was described using a probability distribution. The simulations were run with 20,000 iterations over a time period of 25 years. The median annual losses due to N. caninum in the Swiss dairy cow population were estimated to be 9.7 million euros. All control strategies that required yearly serological testing of all cattle in the population produced high costs and thus were not financially profitable. Among the other control strategies, two showed benefit-cost ratios (BCR) >1 and positive net present values (NPV): "discontinued breeding with offspring from seropositive cows" (BCR=1.29, NPV=25 million euros) and "chemotherapeutical treatment of all female offspring" (BCR=2.95, NPV=59 million euros). In economic terms, the best control strategy currently available would therefore be "discontinued breeding with offspring from seropositive cows".
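
As a sketch of the kind of stochastic benefit-cost calculation described above, the code below draws uncertain inputs from probability distributions, discounts annual benefits and costs over a 25-year horizon, and reports the median benefit-cost ratio and net present value. The distributions, discount rate and cost figures are illustrative assumptions, not the parameter values of the Swiss spreadsheet module.

```python
import numpy as np

rng = np.random.default_rng(42)
N_ITER = 20_000      # iterations, as in the module described above
DISCOUNT = 0.03      # assumed annual discount rate
YEARS = 25           # time horizon used in the study

def npv(cash_flows, rate):
    """Net present value of a sequence of annual cash flows (year 1 onwards)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

benefit_npv, cost_npv = [], []
for _ in range(N_ITER):
    # Uncertain inputs drawn from illustrative distributions.
    annual_loss = rng.triangular(7e6, 9.7e6, 13e6)   # EUR/year without control
    loss_avoided = rng.uniform(0.4, 0.7)             # fraction of losses avoided
    control_cost = rng.uniform(1.5e6, 3.0e6)         # EUR/year for the strategy

    benefit_npv.append(npv([annual_loss * loss_avoided] * YEARS, DISCOUNT))
    cost_npv.append(npv([control_cost] * YEARS, DISCOUNT))

benefit_npv, cost_npv = np.array(benefit_npv), np.array(cost_npv)
bcr = benefit_npv / cost_npv
net_pv = benefit_npv - cost_npv
print(f"median BCR: {np.median(bcr):.2f}, median NPV: EUR {np.median(net_pv) / 1e6:.1f} million")
```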

Relevance:

30.00%

Publisher:

Abstract:

Virtualization has become a common abstraction layer in modern data centers. By multiplexing hardware resources into multiple virtual machines (VMs) and thus enabling several operating systems to run on the same physical platform simultaneously, it can effectively reduce power consumption and building size or improve security by isolating VMs. In a virtualized system, memory resource management plays a critical role in achieving high resource utilization and performance. Insufficient memory allocation to a VM will degrade its performance dramatically; conversely, over-allocation wastes memory resources. Meanwhile, a VM’s memory demand may vary significantly. As a result, effective memory resource management calls for a dynamic memory balancer, which, ideally, can adjust memory allocation in a timely manner for each VM based on its current memory demand and thus achieve the best memory utilization and the optimal overall performance. In order to estimate the memory demand of each VM and to arbitrate possible memory resource contention, a widely proposed approach is to construct an LRU-based miss ratio curve (MRC), which provides not only the current working set size (WSS) but also the correlation between performance and the target memory allocation size. Unfortunately, the cost of constructing an MRC is nontrivial. In this dissertation, we first present a low-overhead LRU-based memory demand tracking scheme, which includes three orthogonal optimizations: AVL-based LRU organization, dynamic hot set sizing and intermittent memory tracking. Our evaluation results show that, for the whole SPEC CPU 2006 benchmark suite, after applying the three optimizing techniques, the mean overhead of MRC construction is lowered from 173% to only 2%. Based on the current WSS, we then predict its trend in the near future and take different strategies for different prediction results. When there is a sufficient amount of physical memory on the host, the balancer redistributes memory locally among its VMs. Once the local memory resource is insufficient and the memory pressure is predicted to persist for a sufficiently long time, a relatively expensive solution, VM live migration, is used to move one or more VMs from the overloaded host to other host(s). Finally, for transient memory pressure, a remote cache is used to alleviate the temporary performance penalty. Our experimental results show that this design achieves a 49% center-wide speedup.
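
To make the MRC idea concrete, the sketch below builds an LRU miss ratio curve from a page-access trace using Mattson stack distances. It is a deliberately simple O(N*M) list-based version; the AVL-based LRU organization, dynamic hot set sizing and intermittent tracking described above are exactly what reduce this cost in practice. The trace and sizes are made up.

```python
def miss_ratio_curve(trace, max_size):
    """Miss ratio of an LRU-managed memory of 1..max_size pages for a page trace."""
    stack = []                   # most-recently-used page at index 0
    hist = [0] * max_size        # hist[d] = number of accesses with stack distance d
    misses_everywhere = 0        # cold misses and reuses deeper than max_size
    for page in trace:
        if page in stack:
            d = stack.index(page)            # reuse (stack) distance
            if d < max_size:
                hist[d] += 1
            else:
                misses_everywhere += 1
            stack.pop(d)
        else:
            misses_everywhere += 1           # first touch misses at every size
        stack.insert(0, page)                # move (or add) page to MRU position
    total, hits, curve = len(trace), 0, []
    for size in range(1, max_size + 1):
        hits += hist[size - 1]               # distance < size  =>  hit
        curve.append((size, 1.0 - hits / total))
    return curve

# Hypothetical page-access trace of a VM; sizes are in pages.
trace = [1, 2, 3, 1, 2, 4, 5, 1, 2, 3, 4, 5, 1, 2]
for size, miss_ratio in miss_ratio_curve(trace, max_size=5):
    print(f"memory = {size} pages -> miss ratio = {miss_ratio:.2f}")
```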

Relevance:

30.00%

Publisher:

Abstract:

Moisture-induced distresses have been the prevalent distress type affecting the deterioration of both asphalt and concrete pavement sections. While various surface techniques have been employed over the years to minimize the ingress of moisture into the pavement structural sections, subsurface drainage components like open-graded base courses remain the best alternative for minimizing the time the pavement structural sections are exposed to saturated conditions. This research therefore focuses on assessing the performance and cost-effectiveness of pavement sections containing both treated and untreated open-graded aggregate base materials. Three common roadway aggregates, comprising two virgin aggregates and one recycled aggregate, were investigated using four open-graded gradations and two binder types. Laboratory tests were conducted to determine the hydraulic, mechanical and durability characteristics of treated and untreated open-graded mixes made from these three aggregate types. Results of the experimental program show that, for the same gradation and mix design types, limestone samples have the greatest drainage capacity, stability under traffic loads and resistance to degradation from environmental conditions such as freeze-thaw. However, depending on the gradation and mix design used, all three aggregate types, namely limestone, natural gravel and recycled concrete, can meet the minimum coefficient of hydraulic conductivity required for good drainage in most pavements. Test results for both asphalt- and cement-treated open-graded samples indicate that an air void content in the range of 15-25 percent will produce a treated open-graded base course with sufficient drainage capacity and long-term stability under both traffic and environmental loads. Using the Mechanistic-Empirical Pavement Design Guide (MEPDG) software, computer simulations of pavement performance were conducted on pavement sections containing these open-graded aggregate base materials to determine whether the MEPDG-predicted pavement performance is sensitive to drainage. Using three truck traffic levels and four climatic regions, the results of the computer simulations indicate that the predicted performance was not sensitive to the drainage characteristics of the open-graded base course. Based on the MEPDG-predicted pavement performance, the cost-effectiveness of the pavement sections with open-graded base was computed on the assumption that the increased service life experienced by these sections was attributable to the positive effects of subsurface drainage. The two cost analyses used gave contrasting results, with one indicating that the inclusion of open-graded base courses can lead to substantial savings.
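
The cost-effectiveness question above reduces to a life-cycle (present-worth) comparison in which the drained section costs more to build but is assumed to last longer between rehabilitations. The sketch below shows that calculation with hypothetical unit costs, service lives, horizon and discount rate; none of the figures come from this study.

```python
def present_worth(initial_cost, rehab_cost, service_life, horizon, rate):
    """Present worth of initial construction plus periodic rehabilitation,
    assuming a rehabilitation every service_life years over the analysis
    horizon at a constant real discount rate."""
    pw = initial_cost
    year = service_life
    while year < horizon:
        pw += rehab_cost / (1.0 + rate) ** year
        year += service_life
    return pw

RATE, HORIZON = 0.04, 40   # assumed 4% real discount rate, 40-year analysis period

# Hypothetical costs per lane-mile: the drained section (open-graded base)
# costs more up front but is assumed to reach rehabilitation later.
undrained = present_worth(900_000, 350_000, service_life=12, horizon=HORIZON, rate=RATE)
drained   = present_worth(980_000, 350_000, service_life=16, horizon=HORIZON, rate=RATE)
print(f"undrained present worth: ${undrained:,.0f}")
print(f"drained present worth:   ${drained:,.0f}")
```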

Relevance:

30.00%

Publisher:

Abstract:

The Michigan Department of Transportation (MDOT) is evaluating upgrading its portion of the Wolverine Line between Chicago and Detroit to accommodate high speed rail. This will entail upgrading the track to allow trains to run at speeds in excess of 110 miles per hour (mph). An important component of this upgrade will be to assess the requirements for ballast material for high speed rail. In the event that the existing ballast materials do not meet specifications for higher speed trains, additional ballast will be required. The purpose of this study, therefore, is to investigate the current MDOT railroad ballast quality specifications and compare them to both national and international specifications for use on high speed rail lines. The study found that while MDOT has quality specifications for railroad ballast, it does not have any for high speed rail. In addition, the American Railway Engineering and Maintenance-of-Way Association (AREMA), while also having specifications for railroad ballast, does not have specific specifications for high speed rail lines. The AREMA aggregate specifications for ballast include the following tests: (1) LA Abrasion, (2) Percent Moisture Absorption, (3) Flat and Elongated Particles, and (4) Sulfate Soundness. Internationally, some countries do require a higher standard for high speed rail, such as the Los Angeles (LA) Abrasion test with a more demanding performance threshold and the Micro-Deval test, which is used to determine the maximum speed at which a high speed train can operate. Since there are no existing MDOT ballast specifications for high speed rail, it is assumed that aggregate ballast specifications for the Wolverine Line will use the higher international specifications. The Wolverine Line, however, is located in southern Michigan, a region of sedimentary rocks which generally do not meet the existing MDOT ballast specifications. The investigation found that there were only 12 quarries in Michigan that meet the MDOT specification. Of these 12 quarries, six were igneous or metamorphic rock quarries, while six were carbonate quarries. Of the six carbonate quarries, four were located in the Lower Peninsula and two in the Upper Peninsula. Two of the carbonate quarries were located in close proximity to the Wolverine Line, while the remaining quarries were at a significant haulage distance. In either case, the cost of haulage becomes an important consideration. In this regard, four of the quarries have lake terminals allowing water transportation to downstate ports. The Upper Peninsula also has a significant amount of metal mining in both igneous and metamorphic rock that generates significant amounts of waste rock that could be used as ballast material. The main drawback, however, is the distance to the Wolverine Line. One potential source is Cliffs Natural Resources, which operates two large surface mines in the Marquette area with rail and water transportation to both Lake Superior and Lake Michigan. Both mines extract rock with a compressive strength far in excess of most ballast materials used in the United States, which would make an excellent ballast material. Discussions with Cliffs, however, indicated that due to environmental concerns they would most likely not be interested in producing a ballast material.
In the United States, carbonate aggregates, while used for ballast, often do not meet ballast specifications, and they are also prone to particle degradation that can lead to fouling and cementation issues. Thus, many carbonate aggregate quarries in close proximity to railroads are not used. Since Michigan has a significant number of carbonate quarries, the research also investigated using the dynamic properties of aggregate as a possible additional test of aggregate ballast quality. The dynamic strength of a material can be assessed using a split Hopkinson pressure bar (SHPB). The SHPB has traditionally been used to assess the dynamic properties of metals, but over the past 20 years it has also been used to assess the dynamic properties of brittle materials such as ceramics and rock. In addition, the wear properties of metals have been related to their dynamic properties. Wear or breakdown of ballast is one of the main problems with ballast material due to the dynamic loading generated by trains, which will be significantly higher for high speed rail. Previous research has indicated that the Port Inland quarry along Lake Michigan in the southern Upper Peninsula has dynamic properties that might make it potentially usable as an aggregate for high speed rail. The dynamic strength testing conducted in this research indicates that the Port Inland limestone in fact has a dynamic strength close to that of igneous rocks and much higher than other carbonate rocks in the Great Lakes region. It is recommended that further research be conducted to investigate the Port Inland limestone as a high speed rail ballast material.
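
The SHPB data reduction mentioned above follows the classical one-dimensional analysis: specimen strain rate is proportional to the reflected bar strain, and specimen stress to the transmitted bar strain scaled by the bar-to-specimen area ratio. The sketch below applies those textbook relations to synthetic, idealised pulses; the bar and specimen properties are assumptions for illustration, not the apparatus or measurements used in this research.

```python
import numpy as np

def shpb_specimen_response(eps_refl, eps_trans, dt, bar_E, bar_A, bar_c0, spec_A, spec_L):
    """Classical 1-D split Hopkinson pressure bar reduction.

    eps_refl, eps_trans : reflected and transmitted bar strain histories
    dt                  : sampling interval (s)
    Returns specimen strain rate (1/s), strain (-) and stress (Pa) histories.
    """
    strain_rate = -2.0 * bar_c0 * eps_refl / spec_L       # strain rate from reflected pulse
    strain = np.cumsum(strain_rate) * dt                  # running time integral
    stress = bar_E * (bar_A / spec_A) * eps_trans         # stress from transmitted pulse
    return strain_rate, strain, stress

# Synthetic, idealised half-sine pulses purely for illustration.
t = np.arange(0.0, 100e-6, 1e-6)
eps_refl = -400e-6 * np.sin(np.pi * t / t[-1])    # reflected pulse
eps_trans = 300e-6 * np.sin(np.pi * t / t[-1])    # transmitted pulse

rate, strain, stress = shpb_specimen_response(
    eps_refl, eps_trans, dt=1e-6,
    bar_E=200e9, bar_A=np.pi * 0.0095**2, bar_c0=5000.0,   # assumed steel bars
    spec_A=np.pi * 0.006**2, spec_L=0.01)                  # assumed rock specimen
print(f"peak stress: {stress.max() / 1e6:.0f} MPa at strain {strain[np.argmax(stress)]:.4f}")
```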

Relevance:

30.00%

Publisher:

Abstract:

This thesis develops high performance real-time signal processing modules for direction of arrival (DOA) estimation for localization systems. It proposes highly parallel algorithms for performing subspace decomposition and polynomial rooting, which are otherwise traditionally implemented using sequential algorithms. The proposed algorithms address the emerging need for real-time localization in a wide range of applications. As the antenna array size increases, the complexity of the signal processing algorithms increases, making it increasingly difficult to satisfy real-time constraints. This thesis addresses real-time implementation by proposing parallel algorithms that maintain a considerable improvement over traditional algorithms, especially for systems with a larger number of antenna array elements. Singular value decomposition (SVD) and polynomial rooting are two computationally complex steps and act as the bottleneck to achieving real-time performance. The proposed algorithms are suitable for implementation on field programmable gate arrays (FPGAs), single instruction multiple data (SIMD) hardware or application specific integrated circuits (ASICs), which offer a large number of processing elements that can be exploited for parallel processing. The designs proposed in this thesis are modular, easily expandable and easy to implement. Firstly, this thesis proposes a fast-converging SVD algorithm. The proposed method reduces the number of iterations it takes to converge to the correct singular values, thus coming closer to real-time performance. A general algorithm and a modular system design are provided, making it easy for designers to replicate and extend the design to larger matrix sizes. Moreover, the method is highly parallel, which can be exploited on the various hardware platforms mentioned earlier. A fixed-point implementation of the proposed SVD algorithm is presented. The FPGA design is pipelined to the maximum extent to increase the maximum achievable frequency of operation. The system was developed with the objective of achieving high throughput, and the various modern cores available in FPGAs were used to maximize performance; these modules are described in detail. Finally, a parallel polynomial rooting technique based on Newton’s method, applicable exclusively to root-MUSIC polynomials, is proposed. Unique characteristics of the root-MUSIC polynomial’s complex dynamics were exploited to derive this polynomial rooting method. The technique exhibits parallelism and converges to the desired roots within a fixed number of iterations, making it suitable for rooting polynomials of large degree. We believe this is the first time that the complex dynamics of the root-MUSIC polynomial have been analyzed to propose such an algorithm. In all, the thesis addresses two major bottlenecks in a direction of arrival estimation system by providing simple, high throughput, parallel algorithms.
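
The parallel character of Newton-based rooting can be shown with a simple data-parallel sketch: independent Newton iterations launched from many points on the unit circle and updated in lockstep, which maps naturally onto FPGA/SIMD processing elements. This is only a generic illustration of the idea; it does not use the root-MUSIC-specific complex dynamics exploited in the thesis, and the test polynomial, starting-point count and iteration budget are arbitrary assumptions.

```python
import numpy as np

def newton_roots_parallel(coeffs, n_starts=64, iters=30):
    """Run independent Newton iterations from starting points spread on the unit
    circle; every update is applied to all starting points at once, which is the
    data parallelism that hardware implementations exploit.
    coeffs: polynomial coefficients, highest degree first."""
    dcoeffs = np.polyder(coeffs)
    z = np.exp(2j * np.pi * np.arange(n_starts) / n_starts)   # initial guesses
    for _ in range(iters):
        p = np.polyval(coeffs, z)
        dp = np.polyval(dcoeffs, z)
        dp = np.where(np.abs(dp) < 1e-12, 1.0, dp)   # guard against zero derivative
        z = z - p / dp
    z = z[np.abs(np.polyval(coeffs, z)) < 1e-8]      # keep converged points only
    roots = []
    for r in z:                                       # collapse near-duplicates
        if all(abs(r - s) > 1e-6 for s in roots):
            roots.append(r)
    return np.array(roots)

# Example: z^3 - 1, whose roots are the three cube roots of unity.
print(np.sort_complex(newton_roots_parallel(np.array([1.0, 0.0, 0.0, -1.0]))))
```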

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND/AIMS: Alveolar echinococcosis (AE) is a serious liver disease. The aim of this study was to explore the long-term prognosis of AE patients, the burden of this disease in Switzerland and the cost-effectiveness of treatment. METHODS: Relative survival analysis was undertaken using a national database with 329 patient records. 155 representative cases had sufficient details regarding treatment costs and patient outcome to estimate the financial implications and treatment costs of AE. RESULTS: For an average 54-year-old patient diagnosed with AE in 1970, life expectancy was estimated to be reduced by 18.2 and 21.3 years for men and women, respectively. By 2005 this loss had fallen to approximately 3.5 and 2.6 years, respectively. Patients undergoing radical surgery had a better outcome, whereas older patients had a poorer prognosis than younger patients. Costs amounted to approximately 108,762 euros per patient. Assuming the improved life expectancy of AE patients is due to modern treatment, the cost per disability-adjusted life year (DALY) saved is approximately 6,032 euros. CONCLUSIONS: Current treatments have substantially improved the prognosis of AE patients compared to the 1970s. The cost per DALY saved is low compared to the average national annual income. Hence, AE treatment is highly cost-effective in Switzerland.
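
The headline figure above is simply the incremental treatment cost divided by the DALYs averted per patient. The snippet below is a back-of-the-envelope check using a rounded DALY gain of about 18 per patient, which is of the same order as the improvement in life expectancy reported above; the published estimate also reflects discounting and disability weights not captured here, so this is illustrative only.

```python
def cost_per_daly(treatment_cost, dalys_averted):
    """Cost-effectiveness expressed as cost per DALY averted."""
    return treatment_cost / dalys_averted

# Roughly EUR 108,762 per patient and ~18 DALYs averted give a ratio close to
# the reported EUR 6,032 per DALY.
print(round(cost_per_daly(108_762, 18)))
```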

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: To compare the costs of function- and pain-centred inpatient treatment in patients with chronic low back pain over 3 years of follow-up. DESIGN: Cost analysis of a randomized controlled trial. PATIENTS: A total of 174 patients with chronic low back pain were randomized to function- or pain-centred inpatient treatment. METHODS: Data on direct and indirect costs were gathered by questionnaires sent to patients, health insurance providers, employers, and the Swiss Disability Insurance Company. RESULTS: There was a non-significant difference in total medical costs after 3 years of follow-up. Total costs were 77,305 Euros in the function-centred inpatient treatment group and 83,085 Euros in the pain-centred inpatient treatment group. Likewise, indirect costs from lost work days after 3 years were non-significantly lower in the function-centred inpatient treatment group (6,354 Euros; 95% confidence interval -20,892, 8,392) and direct medical costs were non-significantly higher in the function-centred inpatient treatment group (574 Euros; 95% confidence interval -862, 2,011). CONCLUSION: The total costs of function-centred and pain-centred inpatient treatment were similar over the whole 3-year follow-up.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND The Fractional Flow Reserve Versus Angiography for Multivessel Evaluation (FAME) 2 trial demonstrated a significant reduction in subsequent coronary revascularization among patients with stable angina and at least 1 coronary lesion with a fractional flow reserve ≤0.80 who were randomized to percutaneous coronary intervention (PCI) compared with best medical therapy. The economic and quality-of-life implications of PCI in the setting of an abnormal fractional flow reserve are unknown. METHODS AND RESULTS We calculated the cost of the index hospitalization based on initial resource use and follow-up costs based on Medicare reimbursements. We assessed patient utility using the EQ-5D health survey with US weights at baseline and 1 month and projected quality-adjusted life-years assuming a linear decline over 3 years in the 1-month utility improvements. We calculated the incremental cost-effectiveness ratio based on cumulative costs over 12 months. Initial costs were significantly higher for PCI in the setting of an abnormal fractional flow reserve than with medical therapy ($9927 versus $3900, P<0.001), but the $6027 difference narrowed over 1-year follow-up to $2883 (P<0.001), mostly because of the cost of subsequent revascularization procedures. Patient utility was improved more at 1 month with PCI than with medical therapy (0.054 versus 0.001 units, P<0.001). The incremental cost-effectiveness ratio of PCI was $36,000 per quality-adjusted life-year, which was robust in bootstrap replications and in sensitivity analyses. CONCLUSIONS PCI of coronary lesions with reduced fractional flow reserve improves outcomes and appears economically attractive compared with best medical therapy among patients with stable angina.
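
The reported ICER follows from the 12-month cost difference and the projected QALY gain: a 1-month utility improvement assumed to decline linearly to zero over 3 years contributes the area of a triangle. The snippet below reconstructs that arithmetic from the figures quoted above; the published analysis used bootstrapped patient-level data, so this is only an approximate check.

```python
# Index costs and the cumulative 12-month cost difference reported above (USD).
cost_pci, cost_med = 9927, 3900
initial_diff = cost_pci - cost_med        # $6027
cum_cost_diff_12m = 2883                  # reported 12-month cumulative difference

# 1-month utility gains, assumed to decline linearly to zero over 3 years:
# the QALY gain is the area of a triangle with height du and base 3 years.
du = 0.054 - 0.001
qaly_gain = du * 3 / 2                    # ~0.0795 QALYs

print(f"initial cost difference: ${initial_diff}")
print(f"projected QALY gain: {qaly_gain:.4f}")
print(f"ICER: ${cum_cost_diff_12m / qaly_gain:,.0f} per QALY")   # ~$36,000, as reported
```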

Relevance:

30.00%

Publisher:

Abstract:

Background: WHO's 2013 revisions to its Consolidated Guidelines on antiretroviral drugs recommend routine viral load monitoring, rather than clinical or immunological monitoring, as the preferred monitoring approach on the basis of clinical evidence. However, HIV programmes in resource-limited settings require guidance on the most cost-effective use of resources in view of other competing priorities such as expansion of antiretroviral therapy coverage. We assessed the cost-effectiveness of alternative patient monitoring strategies. Methods: We evaluated a range of monitoring strategies, including clinical, CD4 cell count, and viral load monitoring, alone and together, at different frequencies and with different criteria for switching to second-line therapies. We used three independently constructed and validated models simultaneously. We estimated costs on the basis of resource use projected in the models and associated unit costs; we quantified impact as disability-adjusted life years (DALYs) averted. We compared alternatives using incremental cost-effectiveness analysis. Findings: All models show that clinical monitoring delivers significant benefit compared with a hypothetical baseline scenario with no monitoring or switching. Regular CD4 cell count monitoring confers a benefit over clinical monitoring alone, at an incremental cost that makes it affordable in more settings than viral load monitoring, which is currently more expensive. Viral load monitoring without CD4 cell count every 6-12 months provides the greatest reductions in morbidity and mortality, but incurs a high cost per DALY averted, resulting in lost opportunities to generate health gains if implemented instead of increasing antiretroviral therapy coverage or expanding antiretroviral therapy eligibility. Interpretation: The priority for HIV programmes should be to expand antiretroviral therapy coverage, first at a CD4 cell count lower than 350 cells per μL, and then at a CD4 cell count lower than 500 cells per μL, using lower-cost clinical or CD4 monitoring. At current costs, viral load monitoring should be considered only after high antiretroviral therapy coverage has been achieved. Point-of-care technologies and other factors reducing costs might make viral load monitoring more affordable in future. Funding: Bill & Melinda Gates Foundation, WHO.
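
Incremental cost-effectiveness analysis, as used above, ranks strategies by cost, discards dominated options, and reports each remaining strategy's ICER relative to the next cheaper alternative. The sketch below implements that standard procedure on entirely hypothetical per-1,000-patient figures; it is not a reproduction of the three models used in the study.

```python
def incremental_cea(strategies):
    """Incremental cost-effectiveness analysis.
    strategies: list of (name, cost, dalys_averted), all with distinct effects."""
    frontier = sorted(strategies, key=lambda s: s[1])
    # Drop strongly dominated options (cost no lower, strictly fewer DALYs averted).
    frontier = [s for s in frontier
                if not any(o[1] <= s[1] and o[2] > s[2] for o in frontier)]
    # Drop extendedly dominated options (ICER higher than the next step's ICER).
    changed = True
    while changed:
        changed = False
        for i in range(1, len(frontier) - 1):
            icer_i = (frontier[i][1] - frontier[i - 1][1]) / (frontier[i][2] - frontier[i - 1][2])
            icer_next = (frontier[i + 1][1] - frontier[i][1]) / (frontier[i + 1][2] - frontier[i][2])
            if icer_i > icer_next:
                del frontier[i]
                changed = True
                break
    return [(cur[0], (cur[1] - prev[1]) / (cur[2] - prev[2]))
            for prev, cur in zip(frontier, frontier[1:])]

# Hypothetical costs (USD) and DALYs averted per 1,000 patient-years.
strategies = [
    ("no monitoring",               0,   0),
    ("clinical monitoring",    40_000, 120),
    ("CD4 monitoring",         90_000, 150),
    ("viral load monitoring", 250_000, 170),
]
for name, icer in incremental_cea(strategies):
    print(f"{name}: ${icer:,.0f} per DALY averted")
```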

Relevance:

30.00%

Publisher:

Abstract:

QUESTION UNDER STUDY The aim of this study was to evaluate the cost-effectiveness of ticagrelor and generic clopidogrel as add-on therapy to acetylsalicylic acid (ASA) in patients with acute coronary syndrome (ACS), from a Swiss perspective. METHODS Based on the PLATelet inhibition and patient Outcomes (PLATO) trial, one-year mean healthcare costs per patient treated with ticagrelor or generic clopidogrel were analysed from a payer perspective in 2011. A two-part decision-analytic model estimated treatment costs, quality-adjusted life years (QALYs), life years and the cost-effectiveness of ticagrelor and generic clopidogrel in patients with ACS up to a lifetime horizon at a discount rate of 2.5% per annum. Sensitivity analyses were performed. RESULTS Over a patient's lifetime, treatment with ticagrelor generates an additional 0.1694 QALYs and 0.1999 life years at a cost of CHF 260 compared with generic clopidogrel. This results in an incremental cost-effectiveness ratio (ICER) of CHF 1,536 per QALY and CHF 1,301 per life year gained. Ticagrelor dominated generic clopidogrel over the five-year and one-year periods, with treatment generating cost savings of CHF 224 and 372 while gaining 0.0461 and 0.0051 QALYs and 0.0517 and 0.0062 life years, respectively. Univariate sensitivity analyses confirmed the dominant position of ticagrelor in the first five years, and probabilistic sensitivity analyses showed a high probability of cost-effectiveness over a lifetime. CONCLUSION During the first five years after ACS, treatment with ticagrelor dominates generic clopidogrel in Switzerland. Over a patient's lifetime, ticagrelor is highly cost-effective compared with generic clopidogrel, as indicated by ICERs well below commonly accepted willingness-to-pay thresholds.
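
The lifetime ICERs above follow directly from the incremental cost and the incremental effect. The snippet below is a simple check of that arithmetic (the one-franc difference from the reported CHF 1,536 per QALY is rounding inside the model); it does not reproduce the two-part decision-analytic model itself.

```python
def icer(delta_cost, delta_effect):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect."""
    return delta_cost / delta_effect

# Lifetime horizon, ticagrelor versus generic clopidogrel (figures quoted above).
print(f"CHF {icer(260, 0.1694):,.0f} per QALY gained")        # ~1,535
print(f"CHF {icer(260, 0.1999):,.0f} per life year gained")   # ~1,301
```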

Relevance:

30.00%

Publisher:

Abstract:

Reliable detection of JAK2-V617F is critical for accurate diagnosis of myeloproliferative neoplasms (MPNs); in addition, sensitive mutation-specific assays can be applied to monitor disease response. However, there has been no consistent approach to JAK2-V617F detection, with assays varying markedly in performance, affecting clinical utility. Therefore, we established a network of 12 laboratories from seven countries to systematically evaluate nine different DNA-based quantitative PCR (qPCR) assays, including those in widespread clinical use. Seven quality control rounds involving over 21,500 qPCR reactions were undertaken using centrally distributed cell line dilutions and plasmid controls. The two best-performing assays were tested on normal blood samples (n=100) to evaluate assay specificity, followed by analysis of serial samples from 28 patients transplanted for JAK2-V617F-positive disease. The most sensitive assay, which performed consistently across a range of qPCR platforms, predicted outcome following transplant, with the mutant allele detected a median of 22 weeks (range 6-85 weeks) before relapse. Four of seven patients achieved molecular remission following donor lymphocyte infusion, indicative of a graft vs MPN effect. This study has established a robust, reliable assay for sensitive JAK2-V617F detection, suitable for assessing response in clinical trials, predicting outcome and guiding management of patients undergoing allogeneic transplant.
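
Quantitative JAK2-V617F assays of the kind compared above typically report a mutant allele burden derived from standard curves built on plasmid or cell-line dilution series. The sketch below shows that generic quantification step (standard-curve fit, copy-number interpolation, percent mutant alleles) on invented Ct values; it is not any of the nine assays evaluated in this study.

```python
import numpy as np

def fit_standard_curve(copies, ct):
    """Fit Ct = slope * log10(copies) + intercept to a plasmid dilution series
    and report the implied amplification efficiency."""
    slope, intercept = np.polyfit(np.log10(copies), ct, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0
    return slope, intercept, efficiency

def copies_from_ct(ct, slope, intercept):
    """Interpolate a copy number from a measured Ct using the standard curve."""
    return 10 ** ((ct - intercept) / slope)

# Invented plasmid standards (copies per reaction vs. measured Ct).
std_copies = np.array([1e6, 1e5, 1e4, 1e3, 1e2])
std_ct_mut = np.array([18.1, 21.5, 24.9, 28.2, 31.6])   # mutant-specific reaction
std_ct_wt  = np.array([17.8, 21.2, 24.6, 27.9, 31.3])   # wild-type reaction

s_m, i_m, eff_m = fit_standard_curve(std_copies, std_ct_mut)
s_w, i_w, eff_w = fit_standard_curve(std_copies, std_ct_wt)

# An invented patient sample measured with both reactions.
mut = copies_from_ct(27.0, s_m, i_m)
wt  = copies_from_ct(20.5, s_w, i_w)
burden = 100.0 * mut / (mut + wt)
print(f"efficiencies: {eff_m:.2f} / {eff_w:.2f}; mutant allele burden: {burden:.2f}%")
```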

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND Whole genome sequencing (WGS) is increasingly used in molecular-epidemiological investigations of bacterial pathogens, despite requiring cost- and time-intensive analyses. We combined strain-specific single nucleotide polymorphism (SNP) typing and targeted WGS to investigate a tuberculosis cluster spanning 21 years in Bern, Switzerland. METHODS Based on the genome sequences of three historical outbreak Mycobacterium tuberculosis isolates, we developed a strain-specific SNP-typing assay to identify further cases. We screened 1,642 patient isolates and performed WGS on all identified cluster isolates. We extracted SNPs to construct genomic networks. Clinical and social data were collected retrospectively. RESULTS We identified 68 patients associated with the outbreak strain. Most were diagnosed in 1991-1995, but cases were observed until 2011. Two-thirds belonged to the homeless and substance-abusing milieu. Targeted WGS revealed 133 variable SNP positions among outbreak isolates. Genomic network analyses suggested a single origin of the outbreak, with subsequent division into three sub-clusters. Isolates from patients with confirmed epidemiological links differed by 0-11 SNPs. CONCLUSIONS Strain-specific SNP genotyping allowed rapid and inexpensive identification of M. tuberculosis outbreak isolates in a population-based strain collection. Subsequent targeted WGS provided detailed insights into transmission dynamics. This combined approach could be applied to track bacterial pathogens in real time and at high resolution.
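
The genomic network step above amounts to computing pairwise SNP distances between isolates and joining them into a network, here approximated by a minimum spanning tree. The sketch below illustrates this on four invented isolate profiles typed at ten variable positions; it is not the outbreak data or the exact network method used in the study.

```python
from itertools import combinations

def snp_distance(a, b):
    """Number of differing calls between two SNP profiles; ambiguous calls ('N') are ignored."""
    return sum(1 for x, y in zip(a, b) if x != y and 'N' not in (x, y))

def minimum_spanning_network(profiles):
    """Prim's algorithm on pairwise SNP distances: a simple stand-in for the
    genomic network analysis described above."""
    names = list(profiles)
    dist = {(p, q): snp_distance(profiles[p], profiles[q])
            for p, q in combinations(names, 2)}
    in_tree, edges = {names[0]}, []
    while len(in_tree) < len(names):
        # Consider only edges crossing the cut between tree and non-tree nodes.
        candidates = [(d, p, q) for (p, q), d in dist.items()
                      if (p in in_tree) ^ (q in in_tree)]
        d, p, q = min(candidates)
        edges.append((p, q, d))
        in_tree.update((p, q))
    return edges

# Invented isolates typed at ten of the 133 variable SNP positions.
profiles = {
    "index_1991": "ACGTACGTAC",
    "case_1993":  "ACGTACGTAT",
    "case_1995":  "ACGAACGTAT",
    "case_2011":  "ACGAACGCTT",
}
for p, q, d in minimum_spanning_network(profiles):
    print(f"{p} -- {q}: {d} SNPs")
```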