795 results for Slot-based task-splitting algorithms


Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: To assess how different diagnostic decision aids perform in terms of sensitivity, specificity, and harm. METHODS: Four diagnostic decision aids were compared, as applied to a simulated patient population: a findings-based algorithm following a linear or branched pathway, a serial threshold-based strategy, and a parallel threshold-based strategy. Headache in immune-compromised HIV patients in a developing country was used as an example. Diagnoses included cryptococcal meningitis, cerebral toxoplasmosis, tuberculous meningitis, bacterial meningitis, and malaria. Data were derived from literature and expert opinion. Diagnostic strategies' validity was assessed in terms of sensitivity, specificity, and harm related to mortality and morbidity. Sensitivity analyses and Monte Carlo simulation were performed. RESULTS: The parallel threshold-based approach led to a sensitivity of 92% and a specificity of 65%. Sensitivities of the serial threshold-based approach and the branched and linear algorithms were 47%, 47%, and 74%, respectively, and the specificities were 85%, 95%, and 96%. The parallel threshold-based approach resulted in the least harm, with the serial threshold-based approach, the branched algorithm, and the linear algorithm being associated with 1.56-, 1.44-, and 1.17-times higher harm, respectively. Findings were corroborated by sensitivity and Monte Carlo analyses. CONCLUSION: A threshold-based diagnostic approach is designed to find the optimal trade-off that minimizes expected harm, enhancing sensitivity and lowering specificity when appropriate, as in the given example of a symptom pointing to several life-threatening diseases. Findings-based algorithms, in contrast, solely consider clinical observations. A parallel workup, as opposed to a serial workup, additionally allows for all potential diseases to be reviewed, further reducing false negatives. The parallel threshold-based approach might, however, not be as good in other disease settings.
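As a rough illustration of why a parallel threshold-based workup tends to reduce false negatives compared with a serial one, the following Monte Carlo sketch simulates patients who may carry several of the listed diseases and compares the two workups in terms of sensitivity and expected harm. All prevalences, test characteristics, and harm weights are hypothetical placeholders, not the values used in the study.

```python
# Monte Carlo sketch: parallel vs. serial threshold-based workup.
# All numbers below are hypothetical illustrations, not the study's data.
import numpy as np

rng = np.random.default_rng(0)

prevalence   = np.array([0.10, 0.08, 0.07, 0.05, 0.15])   # per-disease prevalence (hypothetical)
sensitivity  = np.array([0.90, 0.85, 0.70, 0.92, 0.95])   # per-disease test sensitivity
specificity  = np.array([0.95, 0.90, 0.93, 0.96, 0.97])
harm_missed  = np.array([10.0, 8.0, 9.0, 10.0, 6.0])      # harm of an untreated disease
harm_overtreat = 1.0                                       # harm of one unnecessary treatment

def simulate(n=100_000):
    # each patient independently has or does not have each disease
    has = rng.random((n, len(prevalence))) < prevalence
    # test result: positive with prob = sensitivity if diseased, 1 - specificity otherwise
    pos = np.where(has,
                   rng.random(has.shape) < sensitivity,
                   rng.random(has.shape) < (1.0 - specificity))

    # parallel workup: every disease is tested, every positive result is treated
    treated_par = pos
    # serial workup: diseases are tested in a fixed order; stop at the first positive
    first_pos = np.argmax(pos, axis=1)
    any_pos = pos.any(axis=1)
    treated_ser = np.zeros_like(pos)
    treated_ser[np.arange(n), first_pos] = any_pos

    def harm(treated):
        missed = has & ~treated
        over = treated & ~has
        return (missed * harm_missed).sum(axis=1).mean() + harm_overtreat * over.sum(axis=1).mean()

    def sens(treated):
        return (has & treated).sum() / max(has.sum(), 1)

    return {"parallel": (sens(treated_par), harm(treated_par)),
            "serial":   (sens(treated_ser), harm(treated_ser))}

print(simulate())   # parallel workup: higher sensitivity, typically lower expected harm
```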

Relevance:

30.00%

Publisher:

Abstract:

MOTIVATION: Analysis of millions of pyro-sequences is currently playing a crucial role in the advance of environmental microbiology. Taxonomy-independent, i.e. unsupervised, clustering of these sequences is essential for the definition of Operational Taxonomic Units. For this application, reproducibility and robustness should be the most sought after qualities, but have thus far largely been overlooked. RESULTS: More than 1 million hyper-variable internal transcribed spacer 1 (ITS1) sequences of fungal origin have been analyzed. The ITS1 sequences were first properly extracted from 454 reads using generalized profiles. Then, otupipe, cd-hit-454, ESPRIT-Tree and DBC454, a new algorithm presented here, were used to analyze the sequences. A numerical assay was developed to measure the reproducibility and robustness of these algorithms. DBC454 was the most robust, closely followed by ESPRIT-Tree. DBC454 features density-based hierarchical clustering, which complements the other methods by providing insights into the structure of the data. AVAILABILITY: An executable is freely available for non-commercial users at ftp://ftp.vital-it.ch/tools/dbc454. It is designed to run under MPI on a cluster of 64-bit Linux machines running Red Hat 4.x, or on a multi-core OSX system. CONTACT: dbc454@vital-it.ch or nicolas.guex@isb-sib.ch.
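The reproducibility assay itself is not detailed here, but the general idea of measuring clustering reproducibility can be sketched as follows: cluster two random subsamples of the data and compare the resulting partitions on the items they share (for example with the adjusted Rand index). This is a generic illustration, not the DBC454 algorithm or the paper's assay; the feature matrix is a synthetic stand-in for sequence-derived data.

```python
# Generic reproducibility assay for unsupervised clustering: re-cluster two
# subsamples and compare the partitions on shared items.
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 10))            # stand-in for sequence-derived feature vectors

def cluster(indices, n_clusters=20):
    labels = AgglomerativeClustering(n_clusters=n_clusters).fit_predict(X[indices])
    return dict(zip(indices, labels))

# two overlapping 80% subsamples
a = rng.choice(len(X), size=int(0.8 * len(X)), replace=False)
b = rng.choice(len(X), size=int(0.8 * len(X)), replace=False)
la, lb = cluster(a), cluster(b)

shared = sorted(set(la) & set(lb))
score = adjusted_rand_score([la[i] for i in shared], [lb[i] for i in shared])
print(f"reproducibility (adjusted Rand index on shared items): {score:.3f}")
```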

Relevance:

30.00%

Publisher:

Abstract:

Evidence-based practice (EBP) aims for a new distribution of power centered on scientific evidence rather than clinical expertise. The present article describes the operational process of EBP by describing the stages of implementing this type of practice. This presentation of stages is essential given that there are many conceptions and models of EBP and that some nurses have a limited knowledge of its rules and implications. Given that the number and formulation of the stages vary by author, the process presented here attempts to integrate the different stages reviewed.

Relevance:

30.00%

Publisher:

Abstract:

For the last two decades, supertree reconstruction has been an active field of research and has seen the development of a large number of major algorithms. Because of the growing popularity of supertree methods, it has become necessary to evaluate the performance of these algorithms to determine which are the best options (especially with regard to the widely used supermatrix approach). In this study, seven of the most commonly used supertree methods are investigated using a large empirical data set (in terms of number of taxa and molecular markers) from the worldwide flowering plant family Sapindaceae. Supertree methods were evaluated using several criteria: similarity of the supertrees with the input trees, similarity between the supertrees and the total evidence tree, level of resolution of the supertree, and computational time required by the algorithm. Additional analyses were also conducted on a reduced data set to test whether performance was affected by the heuristic searches rather than by the algorithms themselves. Based on our results, two main groups of supertree methods were identified: the matrix representation with parsimony (MRP), MinFlip, and MinCut methods performed well according to our criteria, whereas the average consensus, split fit, and most similar supertree methods performed more poorly or at least did not behave in the same way as the total evidence tree. Results for the super distance matrix, the most recent approach tested here, were promising, with at least one derived method performing as well as MRP, MinFlip, and MinCut. The output of each method was only slightly improved when applied to the reduced data set, suggesting that the heuristic searches behaved correctly and that the algorithms were relatively insensitive to data set size and missing data. Results also showed that the MRP analyses could reach a high level of quality even when using a simple heuristic search strategy, with the exception of MRP with the Purvis coding scheme and reversible parsimony. The future of supertrees lies in the implementation of a standardized heuristic search for all methods and in increased computing power to handle large data sets. The latter would prove particularly useful for promising approaches such as the maximum quartet fit method, which still requires substantial computing power.
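For readers unfamiliar with MRP, the coding step can be sketched as follows: every clade of every input tree becomes one binary character, with taxa inside the clade scored 1, other taxa of that tree scored 0, and taxa absent from that tree scored '?' (Baum-Ragan coding). The tree representation below (taxon set plus a list of clades) is a simplification for illustration; the resulting matrix would then be analyzed with a standard parsimony program.

```python
# Sketch of matrix representation with parsimony (MRP) coding (Baum-Ragan):
# one binary character per clade of each input tree.
def mrp_matrix(input_trees, all_taxa):
    """input_trees: list of (taxa_in_tree, clades) where each clade is a frozenset."""
    columns = []
    for taxa_in_tree, clades in input_trees:
        for clade in clades:
            columns.append({t: ('1' if t in clade else '0') if t in taxa_in_tree else '?'
                            for t in all_taxa})
    return {t: ''.join(col[t] for col in columns) for t in all_taxa}

# toy example with two partially overlapping input trees
tree1 = ({'A', 'B', 'C', 'D'}, [frozenset({'A', 'B'}), frozenset({'A', 'B', 'C'})])
tree2 = ({'B', 'C', 'D', 'E'}, [frozenset({'D', 'E'})])
for taxon, row in sorted(mrp_matrix([tree1, tree2], {'A', 'B', 'C', 'D', 'E'}).items()):
    print(taxon, row)   # e.g. taxon A -> "11?" (absent from the second tree)
```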

Relevance:

30.00%

Publisher:

Abstract:

The Missouri River floods of 2011 will go down in history as the longest-duration flooding event this state has seen to date. The combination of above-normal snowfall in the upper Missouri River basin, followed by the equivalent of nearly one year's worth of rainfall in May, created an above-normal runoff situation that filled the Missouri River and the six main reservoirs within the basin. Compounding this problem were colder-than-normal temperatures, which kept much of the snowpack in the upper basin on the ground longer into the spring, setting the stage for this historic event.

The U.S. Army Corps of Engineers (USACE) began increasing the outflow at Gavin's Point, near Yankton, South Dakota, in May. On June 14, 2011, the outflow reached a record rate of over 160,000 cubic feet per second (cfs), over twice the previous record outflow set in 1997. This increased output from Gavin's Point caused the Missouri River to flow out of its banks, covering over 283,000 acres of land in Iowa, forcing hundreds of evacuations, damaging 255,000 acres of cropland, and significantly impacting the levee system on the Missouri River basin. Over the course of the summer, approximately 64 miles of primary roads were closed due to Missouri River flooding, including 54 miles of Interstate Highway. Many county secondary roads were closed by high water or overburdened due to the numerous detours and road closures in the area.

As the Missouri River levels began to increase, municipalities and counties, aided by State and Federal agencies, began preparing for a sustained flood event. Citizens, businesses, state agencies, local governments, and non-profits made substantial preparations, in some cases expending millions of dollars on emergency protective measures to protect their facilities from the impending flood. Levee monitors detected weak spots in the levee system in all affected counties, with several levees identified as at-risk levees that could potentially fail. Of particular concern were the 28 miles of levees protecting Council Bluffs. Based on this concern, Council Bluffs prepared an evacuation plan for the approximately 30,000 residents who resided in the protected area.

On May 25, 2011, Governor Branstad directed the execution of the Iowa Emergency Response Plan in accordance with Section 401 of the Stafford Act. On May 31, 2011, HSEMD Administrator Brigadier General J. Derek Hill formally requested that the USACE provide technical assistance and advanced measures for the communities along the Missouri River basin. On June 2, 2011, Governor Branstad issued a State of Iowa Proclamation of Disaster Emergency for Fremont, Harrison, Mills, Monona, Pottawattamie, and Woodbury counties.

The length of this flood event created a unique set of challenges for Federal, State, and local entities. In many cases, these organizations were conducting response and recovery operations simultaneously. Due to the length of the event, the State Emergency Operations Center and the local Emergency Operations Centers remained open for an extended period of time, putting additional strain on many organizations and resources. In response to this disaster, Governor Branstad created the Missouri River Recovery Coordination Task Force to oversee the State's recovery efforts. The Governor announced the creation of this Task Force on October 17, 2011, and appointed HSEMD Administrator Brigadier General J. Derek Hill as its chairman. This Task Force would be a temporary group of State agency representatives and interested stakeholders brought together to support the recovery efforts of the Iowa communities impacted by the Missouri River Flood. Collectively, this group would analyze and share damage assessment data, coordinate assistance across various stakeholders, monitor progress, capture best practices, and identify lessons learned.

Relevance:

30.00%

Publisher:

Abstract:

This report covers state-of-the-art research on infrastructure inventory and data collection, with sign inventory as a case study. The development of an agency-wide sign inventory is based on feature inventory and location information. With respect to location, a quick and simple location acquisition tool is critical for tying assets to an accurate location-referencing system. This research effort contrasts legacy referencing systems (route and milepost) with global positioning system (GPS)-based techniques (latitude and longitude) integrated into a geographic information system (GIS) database. A summary comparison of field accuracies using a variety of consumer-grade devices is also provided. This research, and the data collection tools developed, are critical in supporting the Iowa Department of Transportation (DOT) Statewide Sign Management System development effort. For the last two years, a Task Force has embarked on a comprehensive effort to develop a sign management system to improve sign quality, to manage all aspects of signage from request through ordering, fabrication, installation, maintenance, and ultimately removal, and to provide the ability to budget for these key assets on a statewide basis. This effort supported the development of a sign inventory tool and marks the beginning of a sign management system to support the Iowa DOT in consistent, cost-effective, and objective decision making for signs and their maintenance.
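A minimal sketch of the linear-referencing step, i.e. converting a GPS fix into a legacy route-and-milepost reference by projecting the point onto the route centerline, is shown below. The route geometry, coordinate system, and units are hypothetical; a production tool would use the DOT's projected route centerlines and its own referencing conventions.

```python
# Convert a GPS fix into a route-and-milepost reference by projecting the
# point onto the route geometry (hypothetical coordinates, meters).
from shapely.geometry import LineString, Point

route = LineString([(0, 0), (5000, 100), (12000, -200), (20000, 0)])  # hypothetical centerline
METERS_PER_MILE = 1609.344

def to_milepost(projected_xy):
    """Return the milepost of the closest point on the route to the GPS fix."""
    p = Point(projected_xy)
    distance_along = route.project(p)    # meters from the route origin
    offset = p.distance(route)           # perpendicular offset, a rough QC check
    return distance_along / METERS_PER_MILE, offset

milepost, offset_m = to_milepost((6100, 250))
print(f"milepost {milepost:.2f}, offset from centerline {offset_m:.1f} m")
```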

Relevance:

30.00%

Publisher:

Abstract:

This report, the Full Report, is the culmination of the Task Force's responsibilities as set out in Executive Order 5, dated October 30, 2007. The Executive Order specifies a number of goals and report requirements. There is a commonly held perception that the use of detention may serve as a deterrent to future delinquency. Data in this report reflect that approximately 40% of youth detained in 2006 were re-detained in 2006. Research conducted by national experts indicates that, particularly for low-risk/low-level offenders, the use of detention is not neutral and may increase the likelihood of recidivism. Comparable data for Iowa are not available (national data studied for this report provide level of risk, but risk level related to detention is not presently available for Iowa). The Task Force finds no evidence suggesting that recidivism levels (as related to detention risk) in Iowa should differ from those found in other states. Data in this report also suggest that detention is one of the juvenile justice system's more costly sanctions ($257-$340 per day). Other sites and local jurisdictions have been able to redirect savings from the reduced use of juvenile detention to support less costly, community-based detention alternatives without compromising public safety.

Relevance:

30.00%

Publisher:

Abstract:

The resilient modulus (MR) input parameters in the Mechanistic-Empirical Pavement Design Guide (MEPDG) program have a significant effect on the projected pavement performance. The MEPDG program uses three different levels of inputs depending on the desired level of accuracy. The primary objective of this research was to develop a laboratory testing program utilizing the Iowa DOT servo-hydraulic testing machine for evaluating typical Iowa unbound materials and to establish a database of input values for MEPDG analysis. This was achieved by carrying out a detailed laboratory testing program designed in accordance with the AASHTO T307 resilient modulus test protocol using common Iowa unbound materials. The program included laboratory tests to characterize basic physical properties of the unbound materials, specimen preparation, and repeated load triaxial tests to determine the resilient modulus. The MEPDG resilient modulus input parameter library for typical Iowa unbound pavement materials was established from the repeated load triaxial MR test results. This library includes the non-linear, stress-dependent resilient modulus model coefficient values for level 1 analysis, the unbound material property values correlated to resilient modulus for level 2 analysis, and the typical resilient modulus values for level 3 analysis. The resilient modulus input parameter library can be utilized when designing low-volume roads in the absence of any basic soil testing. Based on the results of this study, the use of level 2 analysis for the MEPDG resilient modulus input is recommended, since the repeated load triaxial test for level 1 analysis is complicated, time consuming, expensive, and requires sophisticated equipment and skilled operators.
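For context, the stress-dependent model commonly associated with MEPDG level 1 unbound material inputs is MR = k1 * pa * (theta/pa)^k2 * (tau_oct/pa + 1)^k3, where theta is the bulk stress, tau_oct the octahedral shear stress, and pa the atmospheric pressure. The sketch below fits the coefficients k1, k2, k3 to repeated load triaxial data; the data points are made up for illustration, not taken from the study.

```python
# Fit the stress-dependent resilient modulus model to (hypothetical) repeated
# load triaxial data; real values would come from AASHTO T307 test results.
import numpy as np
from scipy.optimize import curve_fit

PA = 101.325  # atmospheric pressure, kPa

def mr_model(stresses, k1, k2, k3):
    theta, tau_oct = stresses
    return k1 * PA * (theta / PA) ** k2 * (tau_oct / PA + 1.0) ** k3

# hypothetical bulk stress, octahedral shear stress and measured MR (all kPa)
theta   = np.array([  83.,  124.,  207.,  310.,  414.,  517.])
tau_oct = np.array([  12.,   25.,   41.,   62.,   83.,  104.])
mr_meas = np.array([95e3, 110e3, 135e3, 155e3, 170e3, 180e3])

(k1, k2, k3), _ = curve_fit(mr_model, (theta, tau_oct), mr_meas, p0=(1000.0, 0.5, -0.1))
print(f"k1={k1:.1f}, k2={k2:.3f}, k3={k3:.3f}")
```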

Relevance:

30.00%

Publisher:

Abstract:

The objective of this study is to systematically evaluate the Iowa Department of Transportation's (DOT's) existing Pavement Management Information System (PMIS) with respect to the input information required for Mechanistic-Empirical Pavement Design Guide (MEPDG) rehabilitation analysis and design. To accomplish this objective, all available PMIS data for interstate and primary roads in Iowa were retrieved from the Iowa DOT PMIS. The retrieved data were evaluated with respect to the input requirements and outputs of the latest version of the MEPDG software (version 1.0). The input parameters that are required for MEPDG HMA rehabilitation design but are currently unavailable in the Iowa DOT PMIS were identified. The differences between the Iowa DOT PMIS and the MEPDG in the specific measurement metrics, and their units, used for some of the pavement performance measures were identified and discussed. Based on the results of this study, it is recommended that the Iowa DOT PMIS be updated, if possible, to include the identified parameters that are currently unavailable but required for MEPDG rehabilitation design. Similarly, the measurement units of distress survey results in the Iowa DOT PMIS should be revised to correspond to those of the MEPDG performance predictions.

Relevance:

30.00%

Publisher:

Abstract:

The objective of this research is to determine whether the nationally calibrated performance models used in the Mechanistic-Empirical Pavement Design Guide (MEPDG) provide a reasonable prediction of actual field performance, and whether the desired accuracy or correspondence exists between predicted and monitored performance for Iowa conditions. A comprehensive literature review was conducted to identify the MEPDG input parameters and the MEPDG verification/calibration process. The sensitivities of MEPDG predictions to input parameters were studied using different versions of the MEPDG software. Based on the literature review and sensitivity analysis, a detailed verification procedure was developed. A total of sixteen different types of pavement sections across Iowa, not used for national calibration in NCHRP 1-47A, were selected. A database of MEPDG inputs and the actual pavement performance measures for the selected pavement sites was prepared for verification. The accuracy of the MEPDG performance models for Iowa conditions was statistically evaluated. The verification testing showed promising results in terms of the MEPDG's performance prediction accuracy for Iowa conditions. Recalibrating the MEPDG performance models for Iowa conditions is recommended to improve the accuracy of predictions.
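One simple way to statistically compare predicted and monitored performance is to compute the bias and root-mean-square error of the residuals and apply a paired t-test, as sketched below. The arrays are placeholders; the study's actual statistical procedure and values may differ.

```python
# Compare MEPDG-predicted and field-measured distress: bias, RMSE, paired t-test.
import numpy as np
from scipy import stats

predicted = np.array([0.12, 0.20, 0.31, 0.08, 0.25, 0.18])   # e.g. rut depth (in), placeholder
measured  = np.array([0.10, 0.24, 0.28, 0.11, 0.30, 0.15])

residual = predicted - measured
bias = residual.mean()
rmse = np.sqrt((residual ** 2).mean())
t_stat, p_value = stats.ttest_rel(predicted, measured)        # paired t-test

print(f"bias={bias:+.3f} in, RMSE={rmse:.3f} in, paired t-test p={p_value:.3f}")
```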

Relevance:

30.00%

Publisher:

Abstract:

Intensity-modulated radiotherapy (IMRT) treatment plan verification by comparison with measured data requires access to the linear accelerator and is time consuming. In this paper, we propose a method for monitor unit (MU) calculation and plan comparison for step-and-shoot IMRT based on the Monte Carlo code EGSnrc/BEAMnrc. The beamlets of an IMRT treatment plan are individually simulated using Monte Carlo and converted into absorbed dose to water per MU. The dose of the whole treatment can be expressed through a linear matrix equation of the MU and dose per MU of every beamlet. Owing to the positivity of the absorbed dose and MU values, this equation is solved for the MU values using a non-negative least-squares fit optimization algorithm (NNLS). The Monte Carlo plan is formed by multiplying the Monte Carlo absorbed dose to water per MU by the Monte Carlo/NNLS MU. Treatment plans for several localizations calculated with a commercial treatment planning system (TPS) are compared with the proposed method for validation. The Monte Carlo/NNLS MUs are close to the ones calculated by the TPS and lead to a treatment dose distribution which is clinically equivalent to the one calculated by the TPS. This procedure can be used for IMRT QA, and further development could allow the technique to be applied to other radiotherapy techniques such as tomotherapy or volumetric modulated arc therapy.
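The MU determination step can be sketched as a non-negative least-squares problem: stacking the per-beamlet dose-per-MU distributions as columns of a matrix, the MU vector that best reproduces the target dose under the constraint MU >= 0 is recovered with an NNLS solver. The matrix below is random placeholder data standing in for the Monte Carlo dose-per-MU of each beamlet.

```python
# Recover monitor units from per-beamlet dose-per-MU columns with NNLS.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
n_voxels, n_beamlets = 2000, 40
dose_per_mu = rng.random((n_voxels, n_beamlets)) * 0.02   # Gy/MU per beamlet, placeholder
true_mu = rng.uniform(5.0, 50.0, n_beamlets)              # "unknown" MU values
target_dose = dose_per_mu @ true_mu                       # dose the plan should deliver

mu_fit, residual_norm = nnls(dose_per_mu, target_dose)    # enforce MU >= 0
mc_plan_dose = dose_per_mu @ mu_fit                       # Monte Carlo plan dose

print(f"max MU error: {np.abs(mu_fit - true_mu).max():.3e}, residual: {residual_norm:.3e}")
```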

Relevance:

30.00%

Publisher:

Abstract:

Voxel-based morphometry from conventional T1-weighted images has proved effective for quantifying Alzheimer's disease (AD) related brain atrophy and enables fairly accurate automated classification of AD patients, patients with mild cognitive impairment (MCI), and elderly controls. Little is known, however, about the classification power of volume-based morphometry, where the features of interest consist of a few brain structure volumes (e.g. hippocampi, lobes, ventricles) as opposed to hundreds of thousands of voxel-wise gray matter concentrations. In this work, we experimentally evaluate two distinct volume-based morphometry algorithms (FreeSurfer and an in-house algorithm called MorphoBox) for automatic disease classification on a standardized data set from the Alzheimer's Disease Neuroimaging Initiative. Results indicate that both algorithms achieve classification accuracy comparable to the conventional whole-brain voxel-based morphometry pipeline using SPM for AD vs elderly controls and for MCI vs controls, and higher accuracy for classification of AD vs MCI and of early vs late AD converters, thereby demonstrating the potential of volume-based morphometry to assist in the diagnosis of mild cognitive impairment and Alzheimer's disease.
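A minimal sketch of classification from a handful of structure volumes is shown below, using a cross-validated logistic regression on synthetic volumes. It only illustrates the volume-based feature setup; the study's actual features, classifiers, and evaluation protocol are those applied to FreeSurfer, MorphoBox, and the ADNI data set.

```python
# Classify from a few structure volumes (synthetic stand-ins) with cross-validation.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_per_group = 60
# synthetic volumes (mL): [left hippocampus, right hippocampus, lateral ventricles, temporal lobe]
controls = rng.normal([4.0, 4.1, 25.0, 110.0], [0.4, 0.4, 6.0, 9.0], (n_per_group, 4))
patients = rng.normal([3.2, 3.3, 35.0, 100.0], [0.5, 0.5, 8.0, 10.0], (n_per_group, 4))

X = np.vstack([controls, patients])
y = np.array([0] * n_per_group + [1] * n_per_group)

clf = make_pipeline(StandardScaler(), LogisticRegression())
accuracy = cross_val_score(clf, X, y, cv=10).mean()
print(f"10-fold cross-validated accuracy: {accuracy:.2f}")
```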

Relevance:

30.00%

Publisher:

Abstract:

Radioactive soil-contamination mapping and risk assessment is a vital issue for decision makers. Traditional approaches for mapping the spatial concentration of radionuclides employ various regression-based models, which usually provide a single-value prediction realization accompanied (in some cases) by estimation error. Such approaches do not provide the capability for rigorous uncertainty quantification or probabilistic mapping. Machine learning is a recent and fast-developing approach based on learning patterns and information from data. Artificial neural networks for prediction mapping have been especially powerful in combination with spatial statistics. A data-driven approach provides the opportunity to integrate additional relevant information about spatial phenomena into a prediction model for more accurate spatial estimates and associated uncertainty. Machine-learning algorithms can also be used for a wider spectrum of problems than before: classification, probability density estimation, and so forth. Stochastic simulations are used to model spatial variability and uncertainty. Unlike regression models, they provide multiple realizations of a particular spatial pattern that allow uncertainty and risk quantification. This paper reviews the most recent methods of spatial data analysis, prediction, and risk mapping, based on machine learning and stochastic simulations in comparison with more traditional regression models. The radioactive fallout from the Chernobyl Nuclear Power Plant accident is used to illustrate the application of the models for prediction and classification problems. This fallout is a unique case study that provides the challenging task of analyzing huge amounts of data ('hard' direct measurements, as well as supplementary information and expert estimates) and solving particular decision-oriented problems.
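The idea of using multiple realizations for uncertainty and risk quantification can be sketched with a simple bootstrap ensemble of neural networks: each realization gives a prediction map, and the spread across realizations yields, per location, the probability of exceeding a concentration threshold. This is a generic illustration with synthetic data, not the specific geostatistical or machine-learning models reviewed in the paper.

```python
# Bootstrap ensemble of neural networks for probabilistic exceedance mapping
# (synthetic data; a stand-in for radionuclide concentration measurements).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
coords = rng.uniform(0, 100, (300, 2))                                         # sampled locations (km)
conc = np.exp(-((coords - 50) ** 2).sum(1) / 800) + rng.normal(0, 0.05, 300)   # measurements

grid = np.array([[x, y] for x in range(0, 101, 5) for y in range(0, 101, 5)])
realizations = []
for seed in range(20):
    idx = rng.choice(len(coords), len(coords), replace=True)    # bootstrap resample
    model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=seed)
    model.fit(coords[idx], conc[idx])
    realizations.append(model.predict(grid))

realizations = np.array(realizations)
prob_exceed = (realizations > 0.5).mean(axis=0)                 # P(concentration > threshold)
print(f"grid cells with exceedance probability > 0.9: {(prob_exceed > 0.9).sum()}")
```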

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVES: Darunavir was designed for activity against HIV resistant to other protease inhibitors (PIs). We assessed the efficacy, tolerability and risk factors for virological failure of darunavir for treatment-experienced patients seen in clinical practice. METHODS: We included all patients in the Swiss HIV Cohort Study starting darunavir after recording a viral load above 1000 HIV-1 RNA copies/mL given prior exposure to both PIs and nonnucleoside reverse transcriptase inhibitors. We followed these patients for up to 72 weeks, assessed virological failure using different loss of virological response algorithms and evaluated risk factors for virological failure using a Bayesian method to fit discrete Cox proportional hazard models. RESULTS: Among 130 treatment-experienced patients starting darunavir, the median age was 47 years, the median duration of HIV infection was 16 years, and 82% received mono or dual antiretroviral therapy before starting highly active antiretroviral therapy. During a median patient follow-up period of 45 weeks, 17% of patients stopped taking darunavir after a median exposure of 20 weeks. In patients followed beyond 48 weeks, the rate of virological failure at 48 weeks was at most 20%. Virological failure was more likely where patients had previously failed on both amprenavir and saquinavir and as the number of previously failed PI regimens increased. CONCLUSIONS: As a component of therapy for treatment-experienced patients, darunavir can achieve a similar efficacy and tolerability in clinical practice to that seen in clinical trials. Clinicians should consider whether a patient has failed on both amprenavir and saquinavir and the number of failed PI regimens before prescribing darunavir.
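As an illustration of the modelling setup (not the Bayesian discrete-time Cox models actually fitted in the study), a standard Cox proportional hazards fit on the two risk factors highlighted in the conclusions might look like the sketch below; the data frame is synthetic.

```python
# Simplified Cox proportional hazards sketch for virological-failure risk factors
# (synthetic data; the study used a Bayesian method and discrete Cox models).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 130
df = pd.DataFrame({
    "weeks_to_failure_or_censor": rng.uniform(4, 72, n).round(),
    "failed": rng.integers(0, 2, n),                   # 1 = virological failure observed
    "failed_ampr_and_saqu": rng.integers(0, 2, n),     # prior failure on both amprenavir and saquinavir
    "n_failed_pi_regimens": rng.integers(0, 6, n),     # number of previously failed PI regimens
})

cph = CoxPHFitter()
cph.fit(df, duration_col="weeks_to_failure_or_censor", event_col="failed")
cph.print_summary()                                    # hazard ratios for each covariate
```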

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a new non-parametric atlas registration framework, derived from the optical flow model and active contour theory, applied to automatic subthalamic nucleus (STN) targeting in deep brain stimulation (DBS) surgery. In a previous work, we demonstrated that the STN position can be predicted based on the position of surrounding visible structures, namely the lateral and third ventricles. A STN targeting process can thus be obtained by registering these structures of interest between a brain atlas and the patient image. Here we aim to improve on the results of state-of-the-art targeting methods and, at the same time, to reduce the computational time. Our simultaneous segmentation and registration model shows mean STN localization errors statistically similar to those of the best-performing registration algorithms tested so far and to the targeting experts' variability. Moreover, the computational time of our registration method is much lower, which is a worthwhile improvement from a clinical point of view.
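A generic sketch of non-parametric (deformable) atlas-to-patient registration is shown below, using the demons algorithm available in SimpleITK; the paper's own model, which couples optical flow with active contours and performs simultaneous segmentation and registration, is not reproduced here. The file names are hypothetical.

```python
# Generic deformable atlas-to-patient registration with the demons algorithm.
# Not the paper's method; images are placeholders for atlas and patient volumes.
import SimpleITK as sitk

atlas = sitk.ReadImage("atlas_ventricles.nii.gz", sitk.sitkFloat32)     # hypothetical path
patient = sitk.ReadImage("patient_t1.nii.gz", sitk.sitkFloat32)         # hypothetical path

demons = sitk.DemonsRegistrationFilter()
demons.SetNumberOfIterations(200)
demons.SetStandardDeviations(1.5)               # smoothing of the displacement field
displacement = sitk.Cast(demons.Execute(patient, atlas), sitk.sitkVectorFloat64)

transform = sitk.DisplacementFieldTransform(displacement)
warped_atlas = sitk.Resample(atlas, patient, transform, sitk.sitkLinear, 0.0)
sitk.WriteImage(warped_atlas, "atlas_in_patient_space.nii.gz")
```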