95 results for Slot-based task-splitting algorithms


Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Prospective data describing the appropriateness of use of colonoscopy based on detailed panel-based clinical criteria are not available. METHODS: In a cohort of 553 consecutive patients referred for colonoscopy to two university-based Swiss outpatient clinics, the percentage of patients who underwent colonoscopy for appropriate, equivocal, and inappropriate indications and the relationship between appropriateness of use and the presence of relevant endoscopic lesions were prospectively assessed. This assessment was based on criteria of the American Society for Gastrointestinal Endoscopy and explicit American and Swiss criteria developed in 1994 by a formal panel process using the RAND/UCLA appropriateness method. RESULTS: The procedures were rated appropriate or equivocal in 72.2% by criteria of the American Society for Gastrointestinal Endoscopy, in 68.5% by explicit American criteria, and in 74.4% by explicit Swiss criteria (not statistically significant, NS). Inappropriate use (overuse) of colonoscopy was found in 27.8%, 31.5%, and 25.6%, respectively (NS). The proportion of appropriate procedures was higher with increasing age. Almost all reasons for using colonoscopy could be assessed by the two explicit criteria sets, whereas 28.4% of reasons for using colonoscopy could not be evaluated by the criteria of the American Society for Gastrointestinal Endoscopy (p < 0.0001). The probability of finding a relevant endoscopic lesion was distinctly higher in procedures rated appropriate or equivocal than in procedures judged inappropriate. CONCLUSIONS: The rate of inappropriate use of colonoscopy is substantial in Switzerland. Explicit criteria allow assessment of almost all indications encountered in clinical practice. In this study, all sets of appropriateness criteria significantly enhanced the probability of finding a relevant endoscopic lesion during colonoscopy.

Relevance:

30.00%

Publisher:

Abstract:

CONTEXT AND OBJECTIVES: A multicentric study was set up to assess the feasibility for Swiss cancer registries of actively retrieving 3 additional variables of epidemiological and aetiological relevance for melanoma, and of potential use for the evaluation of prevention campaigns. MATERIAL AND METHODS: The skin type, family history of melanoma and precise anatomical site were retrieved for melanoma cases registered in 5 Swiss cantons (Neuchâtel, St-Gall and Appenzell, Vaud and Wallis) over 3 to 6 consecutive years (1995-2002). Data were obtained via a short questionnaire administered by the physicians (mostly dermatologists) who originally excised the lesions. As the detailed body site was routinely collected in Ticino, data from this Cancer Registry were included in the body site analysis. Relative melanoma density (RMD) was computed as the ratio of observed to expected numbers of melanomas allowing for body site surface areas, and further adjusted for site-specific melanocyte density. RESULTS: Of the 1,645 questionnaires sent, 1,420 (86.3%) were returned. The detailed cutaneous site and skin type were reliably obtained for 84.7% and 78.7% of questionnaires, and family history was known in 76% of instances. The prevalence of sun-sensitive subjects and of patients with melanoma-affected first-degree relatives, two target groups for early detection and surveillance campaigns, was 54.1% and 3.4%, respectively. After translation into the 4th digit of the International Classification of Diseases for Oncology, the anatomical site codes from printed (original information) and pictorial support (body chart from the questionnaire) concurred for 94.6% of lesions. Discrepancies occurred mostly for lesions on the upper, outer part of the shoulder, for which the clinician's textual description was "shoulder blade". This differential misclassification suggests an under-estimation by about 10% of melanomas of the upper limbs and an over-estimation of 5% for truncal melanomas. Sites of highest melanoma risk were the face, the shoulder and the upper arm for both sexes, the back for men and the leg for women. Three major features of this series were: (1) an unexpectedly high RMD for the face in women (6.2 vs 4.2 in men), (2) the absence of a male predominance for melanomas on the ears, and (3) for the upper limbs, a steady gradient of increasing melanoma density with increasing proximity to the trunk, regardless of sex. DISCUSSION AND CONCLUSION: The feasibility of retrieving the skin type, the precise anatomical location and the family history of melanoma in a reliable manner was demonstrated thanks to the collaboration of Swiss dermatologists. Use of a schematic body drawing improves the quality of the anatomical site data and facilitates the reporting task of doctors. Age and sex patterns of RMD paralleled general indicators of sun exposure and behaviour, except for the hand (RMD=0.2). These Swiss results support some site or sun exposure specificity in the aetiology of melanoma.
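
The RMD computation described above reduces to an observed-over-expected ratio, with expected counts proportional to each site's share of body surface area (before the additional melanocyte-density adjustment). A minimal sketch, where all counts and surface-area fractions are invented for illustration:

```python
# Hypothetical illustration of relative melanoma density (RMD);
# the counts and surface-area fractions below are made up, not study data.
sites = {
    #        observed count, fraction of body surface area
    "face": (40, 0.035),
    "back": (120, 0.17),
    "leg":  (90, 0.21),
    "hand": (3, 0.025),
}
total_observed = sum(obs for obs, _ in sites.values())
for site, (obs, area_fraction) in sites.items():
    # Expected count if melanoma risk were uniform per unit of skin area
    # (approximated here over the listed sites only).
    expected = total_observed * area_fraction
    print(f"{site:5s} RMD = {obs / expected:.2f}")
```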

Relevance:

30.00%

Publisher:

Abstract:

Recently, graph theory and complex networks have been widely used as a means to model the functionality of the brain. Among the different neuroimaging techniques available for constructing brain functional networks, electroencephalography (EEG), with its high temporal resolution, is a useful instrument for the analysis of functional interdependencies between different brain regions. Alzheimer's disease (AD) is a neurodegenerative disease which leads to substantial cognitive decline and, eventually, dementia in elderly people. To achieve a deeper insight into the behavior of functional cerebral networks in AD, here we study their synchronizability in 17 newly diagnosed AD patients compared to 17 healthy control subjects in a no-task, eyes-closed condition. The cross-correlation of artifact-free EEGs was used to construct the brain functional networks. The extracted networks were then tested for their synchronization properties by calculating the eigenratio of the Laplacian matrix of the connection graph, i.e., the largest eigenvalue divided by the second smallest one. In AD patients, we found an increase in the eigenratio, i.e., a decrease in the synchronizability of brain networks across the delta, alpha, beta, and gamma EEG frequencies over a wide range of network costs. This finding indicates a disruption of functional brain networks in early AD.
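
The eigenratio test is straightforward to reproduce. A minimal sketch, assuming an unweighted network obtained by keeping the strongest cross-correlations up to a given cost (the thresholding rule here is an illustrative assumption, not the paper's exact pipeline):

```python
# Minimal sketch (not the paper's code): synchronizability eigenratio
# lambda_max / lambda_2 of the Laplacian of a thresholded correlation network.
import numpy as np

def eigenratio(corr, cost):
    """corr: (n, n) correlation matrix; cost: fraction of possible edges kept."""
    n = corr.shape[0]
    w = np.abs(corr.copy())
    np.fill_diagonal(w, 0.0)
    upper = w[np.triu_indices(n, k=1)]
    k = max(1, int(round(cost * upper.size)))   # number of edges to keep
    thresh = np.sort(upper)[::-1][k - 1]
    adj = (w >= thresh).astype(float)
    np.fill_diagonal(adj, 0.0)
    lap = np.diag(adj.sum(axis=1)) - adj        # graph Laplacian L = D - A
    eig = np.sort(np.linalg.eigvalsh(lap))
    return eig[-1] / eig[1]                     # assumes a connected network

rng = np.random.default_rng(0)
eeg = rng.standard_normal((19, 1000))           # 19 channels of toy "EEG"
print(f"eigenratio at 30% cost: {eigenratio(np.corrcoef(eeg), 0.3):.2f}")
```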

Relevance:

30.00%

Publisher:

Abstract:

A recurring task in the analysis of mass genome annotation data from high-throughput technologies is the identification of peaks or clusters in a noisy signal profile. Examples of such applications are the definition of promoters on the basis of transcription start site profiles, the mapping of transcription factor binding sites based on ChIP-chip data and the identification of quantitative trait loci (QTL) from whole genome SNP profiles. Input to such an analysis is a set of genome coordinates associated with counts or intensities. The output consists of a discrete number of peaks with respective volumes, extensions and center positions. We have developed for this purpose a flexible one-dimensional clustering tool, called MADAP, which we make available as a web server and as a standalone program. A set of parameters enables the user to customize the procedure to a specific problem. The web server, which returns results in textual and graphical form, is useful for small to medium-scale applications, as well as for evaluation and parameter tuning in view of large-scale applications, which require a local installation. The program, written in C++, can be freely downloaded from ftp://ftp.epd.unil.ch/pub/software/unix/madap. The MADAP web server can be accessed at http://www.isrec.isb-sib.ch/madap/.
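
To make the input/output contract concrete, here is a minimal gap-based one-dimensional clustering sketch over (coordinate, count) pairs. It is emphatically not MADAP's algorithm, just a baseline for the same task; `max_gap` and `min_volume` are hypothetical parameters:

```python
# Toy 1D clustering of genome coordinates with counts: positions closer than
# max_gap are merged, and each cluster is reported with its center (count-
# weighted mean), extension (span) and volume (total counts).
import numpy as np

def cluster_1d(positions, counts, max_gap=100, min_volume=10):
    order = np.argsort(positions)
    pos, cnt = np.asarray(positions)[order], np.asarray(counts)[order]
    clusters, start = [], 0
    for i in range(1, len(pos) + 1):
        if i == len(pos) or pos[i] - pos[i - 1] > max_gap:
            p, c = pos[start:i], cnt[start:i]
            volume = c.sum()
            if volume >= min_volume:
                clusters.append({
                    "center": float(np.average(p, weights=c)),
                    "extension": int(p[-1] - p[0]),
                    "volume": int(volume),
                })
            start = i
    return clusters

peaks = cluster_1d([100, 105, 130, 900, 910], [5, 7, 3, 20, 4])
print(peaks)   # two peaks: one around 110, one around 902
```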

Relevance:

30.00%

Publisher:

Abstract:

Guilbert ER, Morin D, Guilbert AC, Gagnon H, Robitaille J, Richardson M. International Journal of Nursing Practice 2011; 17: 315-321. Task-shifting in the delivery of hormonal contraceptive methods: validation of a questionnaire and preliminary results. In order to alleviate the problem of access to effective contraceptive methods in Quebec, Canada, as well as to legitimize nurses' practice in family planning, a collaborative agreement was developed that allows nurses, in conjunction with pharmacists, to provide hormonal contraceptives to healthy women of reproductive age for a 6-month period. Training in hormonal contraception was offered to targeted nurses before they could begin this practice. A questionnaire, based on Rogers's theory of diffusion of innovations, was developed and validated to specifically evaluate this phenomenon. Preliminary results show that the translation of training into practice might be suboptimal. The validated questionnaire can now be used to fully understand the set of factors influencing this new practice.

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: To assess how different diagnostic decision aids perform in terms of sensitivity, specificity, and harm. METHODS: Four diagnostic decision aids were compared, as applied to a simulated patient population: a findings-based algorithm following a linear or branched pathway, a serial threshold-based strategy, and a parallel threshold-based strategy. Headache in immune-compromised HIV patients in a developing country was used as an example. Diagnoses included cryptococcal meningitis, cerebral toxoplasmosis, tuberculous meningitis, bacterial meningitis, and malaria. Data were derived from literature and expert opinion. Diagnostic strategies' validity was assessed in terms of sensitivity, specificity, and harm related to mortality and morbidity. Sensitivity analyses and Monte Carlo simulation were performed. RESULTS: The parallel threshold-based approach led to a sensitivity of 92% and a specificity of 65%. Sensitivities of the serial threshold-based approach and the branched and linear algorithms were 47%, 47%, and 74%, respectively, and the specificities were 85%, 95%, and 96%. The parallel threshold-based approach resulted in the least harm, with the serial threshold-based approach, the branched algorithm, and the linear algorithm being associated with 1.56, 1.44, and 1.17 times higher harm, respectively. Findings were corroborated by sensitivity and Monte Carlo analyses. CONCLUSION: A threshold-based diagnostic approach is designed to find the optimal trade-off that minimizes expected harm, enhancing sensitivity and lowering specificity when appropriate, as in the given example of a symptom pointing to several life-threatening diseases. Findings-based algorithms, in contrast, solely consider clinical observations. A parallel workup, as opposed to a serial workup, additionally allows for all potential diseases to be reviewed, further reducing false negatives. The parallel threshold-based approach might, however, not perform as well in other disease settings.
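
The difference between a serial and a parallel workup can be made concrete with a small Monte Carlo sketch. All prevalences and test characteristics below are invented, and each disease is reduced to a single test with one Se/Sp pair; this does not reproduce the paper's harm model:

```python
# Toy Monte Carlo comparison of serial vs parallel workup sensitivity.
import numpy as np

rng = np.random.default_rng(42)
diseases = ["crypto", "toxo", "tb", "bact", "malaria"]
prev = np.array([0.15, 0.10, 0.10, 0.05, 0.20])   # hypothetical prevalences
se, sp = 0.85, 0.90                                # hypothetical test Se/Sp

n = 100_000
truth = rng.random((n, len(diseases))) < prev      # who truly has what
tests = np.where(truth,
                 rng.random((n, len(diseases))) < se,        # true positives
                 rng.random((n, len(diseases))) < 1 - sp)    # false positives

# Parallel workup: all diseases are tested; every positive is acted upon.
parallel_missed = (truth & ~tests).any(axis=1)

# Serial workup: test diseases in order, stop at the first positive result,
# so diseases later in the sequence go untested.
first_pos = np.argmax(tests, axis=1)
any_pos = tests.any(axis=1)
serial_detected = tests & (np.arange(len(diseases)) <= first_pos[:, None])
serial_detected[~any_pos] = False
serial_missed = (truth & ~serial_detected).any(axis=1)

# Patient-level sensitivity: diseased patients with no missed disease.
has_disease = truth.any(axis=1)
print("parallel sensitivity:", 1 - parallel_missed[has_disease].mean())
print("serial sensitivity:  ", 1 - serial_missed[has_disease].mean())
```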

Relevance:

30.00%

Publisher:

Abstract:

MOTIVATION: Analysis of millions of pyrosequences is currently playing a crucial role in the advancement of environmental microbiology. Taxonomy-independent, i.e. unsupervised, clustering of these sequences is essential for the definition of Operational Taxonomic Units. For this application, reproducibility and robustness should be the most sought-after qualities, but have thus far largely been overlooked. RESULTS: More than 1 million hyper-variable internal transcribed spacer 1 (ITS1) sequences of fungal origin have been analyzed. The ITS1 sequences were first properly extracted from 454 reads using generalized profiles. Then, otupipe, cd-hit-454, ESPRIT-Tree and DBC454, a new algorithm presented here, were used to analyze the sequences. A numerical assay was developed to measure the reproducibility and robustness of these algorithms. DBC454 was the most robust, closely followed by ESPRIT-Tree. DBC454 features density-based hierarchical clustering, which complements the other methods by providing insights into the structure of the data. AVAILABILITY: An executable is freely available for non-commercial users at ftp://ftp.vital-it.ch/tools/dbc454. It is designed to run under MPI on a cluster of 64-bit Linux machines running Red Hat 4.x, or on a multi-core OSX system. CONTACT: dbc454@vital-it.ch or nicolas.guex@isb-sib.ch.
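
One way to operationalize a reproducibility assay of this kind (the published assay may differ) is to re-cluster overlapping subsamples and score agreement on the shared items, for instance with the adjusted Rand index. A sketch on toy 2D data standing in for embedded sequences:

```python
# Hedged sketch of a clustering reproducibility assay: cluster two random
# 80% subsamples and compare labels on the items they share.
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(0)
data = np.vstack([rng.normal(c, 0.3, size=(100, 2)) for c in (0, 2, 4)])

def labels_on(idx):
    return DBSCAN(eps=0.5, min_samples=5).fit_predict(data[idx])

scores = []
for _ in range(20):
    a = rng.choice(len(data), size=int(0.8 * len(data)), replace=False)
    b = rng.choice(len(data), size=int(0.8 * len(data)), replace=False)
    shared = np.intersect1d(a, b)
    la, lb = labels_on(a), labels_on(b)
    # Map shared items back to their positions within each subsample.
    pa = {v: i for i, v in enumerate(a)}
    pb = {v: i for i, v in enumerate(b)}
    scores.append(adjusted_rand_score([la[pa[s]] for s in shared],
                                      [lb[pb[s]] for s in shared]))
print(f"mean ARI over subsample pairs: {np.mean(scores):.3f}")  # 1.0 = fully reproducible
```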

Relevance:

30.00%

Publisher:

Abstract:

Evidence-based practice (EBP) aims for a new distribution of power centered on scientific evidence rather than clinical expertise. The present article describes the operational process of EBP by describing the implementation stages of this type of practice. This stage presentation is essential given that there are many conceptions and models of EBP and that some nurses have a limited knowledge of its rules and implications. Given that the number and formulation of the stages vary by author, the process presented here attempts to integrate the different stages reviewed.

Relevance:

30.00%

Publisher:

Abstract:

For the last two decades, supertree reconstruction has been an active field of research and has seen the development of a large number of major algorithms. Because of the growing popularity of supertree methods, it has become necessary to evaluate the performance of these algorithms to determine which are the best options (especially with regard to the widely used supermatrix approach). In this study, seven of the most commonly used supertree methods are investigated using a large empirical data set (in terms of number of taxa and molecular markers) from the worldwide flowering plant family Sapindaceae. Supertree methods were evaluated using several criteria: similarity of the supertrees with the input trees, similarity between the supertrees and the total evidence tree, level of resolution of the supertree and computational time required by the algorithm. Additional analyses were also conducted on a reduced data set to test whether the performance levels were affected by the heuristic searches rather than the algorithms themselves. Based on our results, two main groups of supertree methods were identified: the matrix representation with parsimony (MRP), MinFlip, and MinCut methods performed well according to our criteria, whereas the average consensus, split fit, and most similar supertree methods performed more poorly or at least did not behave the same way as the total evidence tree. Results for the super distance matrix, that is, the most recent approach tested here, were promising, with at least one derived method performing as well as MRP, MinFlip, and MinCut. The output of each method was only slightly improved when applied to the reduced data set, suggesting a correct behavior of the heuristic searches and a relatively low sensitivity of the algorithms to data set sizes and missing data. Results also showed that the MRP analyses could reach a high level of quality even when using a simple heuristic search strategy, with the exception of MRP with the Purvis coding scheme and reversible parsimony. The future of supertrees lies in the implementation of a standardized heuristic search for all methods and in increased computing power to handle large data sets. The latter would prove particularly useful for promising approaches such as the maximum quartet fit method, which still requires substantial computing power.
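
The first evaluation criterion, similarity of a supertree with its input trees, is commonly scored with the Robinson-Foulds distance. A self-contained toy sketch on rooted trees encoded as nested tuples (not the study's implementation):

```python
# Robinson-Foulds distance between two rooted trees sharing the same taxa:
# the size of the symmetric difference of their internal clades (leaf sets).
def leaf_sets(tree, acc):
    """Return the leaf set of `tree`, adding every internal clade to `acc`."""
    if not isinstance(tree, tuple):          # a leaf label
        return frozenset([tree])
    leaves = frozenset().union(*(leaf_sets(child, acc) for child in tree))
    acc.add(leaves)
    return leaves

def rf_distance(t1, t2):
    a, b = set(), set()
    r1, r2 = leaf_sets(t1, a), leaf_sets(t2, b)
    assert r1 == r2, "trees must share the same taxa"
    a.discard(r1)
    b.discard(r2)                            # the root clade is uninformative
    return len(a ^ b)

supertree  = ((("A", "B"), "C"), ("D", "E"))
input_tree = ((("A", "C"), "B"), ("D", "E"))
print(rf_distance(supertree, input_tree))    # 2: one conflicting clade per tree
```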

Relevance:

30.00%

Publisher:

Abstract:

Intensity-modulated radiotherapy (IMRT) treatment plan verification by comparison with measured data requires having access to the linear accelerator and is time consuming. In this paper, we propose a method for monitor unit (MU) calculation and plan comparison for step-and-shoot IMRT based on the Monte Carlo code EGSnrc/BEAMnrc. The beamlets of an IMRT treatment plan are individually simulated using Monte Carlo and converted into absorbed dose to water per MU. The dose of the whole treatment can be expressed through a linear matrix equation of the MUs and the dose per MU of every beamlet. Due to the positivity of the absorbed dose and MU values, this equation is solved for the MU values using a non-negative least-squares fit optimization algorithm (NNLS). The Monte Carlo plan is formed by multiplying the Monte Carlo absorbed dose to water per MU by the Monte Carlo/NNLS MUs. For validation, treatment plans for several localizations calculated with a commercial treatment planning system (TPS) are compared with the proposed method. The Monte Carlo/NNLS MUs are close to the ones calculated by the TPS and lead to a treatment dose distribution which is clinically equivalent to the one calculated by the TPS. This procedure can be used for IMRT QA, and further development could allow this technique to be used for other radiotherapy techniques like tomotherapy or volumetric modulated arc therapy.
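
The linear step at the heart of the method is a standard non-negative least-squares problem, D ≈ A·mu with mu ≥ 0, where column j of A holds the per-MU dose of beamlet j in every voxel. A minimal sketch with random toy numbers standing in for simulated beamlet doses:

```python
# NNLS recovery of monitor units from per-beamlet dose-per-MU columns.
# The matrix below is random toy data, not a simulated beam.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)
n_voxels, n_beamlets = 500, 12
A = rng.random((n_voxels, n_beamlets))              # dose per MU, one column per beamlet
mu_true = rng.uniform(0, 100, n_beamlets)           # "TPS" monitor units
dose = A @ mu_true + rng.normal(0, 0.01, n_voxels)  # target dose with a little noise

mu_fit, residual = nnls(A, dose)                    # enforce mu >= 0
print("max MU deviation:", np.abs(mu_fit - mu_true).max())
print("residual norm:   ", residual)
```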

Relevance:

30.00%

Publisher:

Abstract:

Voxel-based morphometry from conventional T1-weighted images has proved effective to quantify Alzheimer's disease (AD) related brain atrophy and to enable fairly accurate automated classification of AD patients, patients with mild cognitive impairment (MCI) and elderly controls. Little is known, however, about the classification power of volume-based morphometry, where the features of interest consist of a few brain structure volumes (e.g. hippocampi, lobes, ventricles) as opposed to hundreds of thousands of voxel-wise gray matter concentrations. In this work, we experimentally evaluate two distinct volume-based morphometry algorithms (FreeSurfer and an in-house algorithm called MorphoBox) for automatic disease classification on a standardized data set from the Alzheimer's Disease Neuroimaging Initiative. Results indicate that both algorithms achieve classification accuracy comparable to the conventional whole-brain voxel-based morphometry pipeline using SPM for AD vs elderly controls and MCI vs controls, and higher accuracy for classification of AD vs MCI and early vs late AD converters, thereby demonstrating the potential of volume-based morphometry to assist diagnosis of mild cognitive impairment and Alzheimer's disease.
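
In this setting the classifier input is just a short feature vector of structure volumes. A hedged sketch with synthetic volumes; the feature set, group means and classifier choice are all assumptions, unrelated to FreeSurfer, MorphoBox or the ADNI data:

```python
# Toy volume-based classification: a linear model on four structure volumes.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical volumes (mL): [left hippocampus, right hippocampus, ventricles, temporal lobe]
controls = rng.normal([3.5, 3.6, 25.0, 110.0], [0.3, 0.3, 5.0, 8.0], (80, 4))
patients = rng.normal([2.8, 2.9, 35.0, 100.0], [0.4, 0.4, 8.0, 9.0], (80, 4))
X = np.vstack([controls, patients])
y = np.r_[np.zeros(80), np.ones(80)]        # 0 = control, 1 = AD

clf = make_pipeline(StandardScaler(), LogisticRegression())
acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(f"cross-validated accuracy: {acc.mean():.2f} +/- {acc.std():.2f}")
```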

Relevance:

30.00%

Publisher:

Abstract:

Radioactive soil-contamination mapping and risk assessment are vital issues for decision makers. Traditional approaches for mapping the spatial concentration of radionuclides employ various regression-based models, which usually provide a single-value prediction realization accompanied (in some cases) by estimation error. Such approaches do not provide the capability for rigorous uncertainty quantification or probabilistic mapping. Machine learning is a recent and fast-developing approach based on learning patterns and information from data. Artificial neural networks for prediction mapping have been especially powerful in combination with spatial statistics. A data-driven approach provides the opportunity to integrate additional relevant information about spatial phenomena into a prediction model for more accurate spatial estimates and associated uncertainty. Machine-learning algorithms can also be used for a wider spectrum of problems than before: classification, probability density estimation, and so forth. Stochastic simulations are used to model spatial variability and uncertainty. Unlike regression models, they provide multiple realizations of a particular spatial pattern that allow uncertainty and risk quantification. This paper reviews the most recent methods of spatial data analysis, prediction, and risk mapping, based on machine learning and stochastic simulations, in comparison with more traditional regression models. The radioactive fallout from the Chernobyl Nuclear Power Plant accident is used to illustrate the application of the models for prediction and classification problems. This fallout is a unique case study that provides the challenging task of analyzing huge amounts of data ('hard' direct measurements, as well as supplementary information and expert estimates) and solving particular decision-oriented problems.
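
A minimal sketch of neural-network prediction mapping in this spirit: coordinates in, activity out, evaluated on a regular grid to produce a map. The training points are synthetic, and real studies combine such models with geostatistics and stochastic simulation rather than using a bare regressor:

```python
# Toy ANN prediction mapping of a synthetic contamination field.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
xy = rng.uniform(0, 100, (400, 2))                       # measurement locations
hotspot = np.array([60.0, 40.0])
activity = 50 * np.exp(-np.sum((xy - hotspot) ** 2, axis=1) / 500) \
           + rng.normal(0, 1, 400)                       # synthetic fallout field

net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
net.fit(xy, activity)

# Predict on a regular grid to produce the contamination map.
gx, gy = np.meshgrid(np.linspace(0, 100, 50), np.linspace(0, 100, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
prediction_map = net.predict(grid).reshape(gx.shape)
print("predicted peak activity:", prediction_map.max())
```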

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVES: Darunavir was designed for activity against HIV resistant to other protease inhibitors (PIs). We assessed the efficacy, tolerability and risk factors for virological failure of darunavir for treatment-experienced patients seen in clinical practice. METHODS: We included all patients in the Swiss HIV Cohort Study starting darunavir after recording a viral load above 1000 HIV-1 RNA copies/mL given prior exposure to both PIs and nonnucleoside reverse transcriptase inhibitors. We followed these patients for up to 72 weeks, assessed virological failure using different loss of virological response algorithms and evaluated risk factors for virological failure using a Bayesian method to fit discrete Cox proportional hazard models. RESULTS: Among 130 treatment-experienced patients starting darunavir, the median age was 47 years, the median duration of HIV infection was 16 years, and 82% received mono or dual antiretroviral therapy before starting highly active antiretroviral therapy. During a median patient follow-up period of 45 weeks, 17% of patients stopped taking darunavir after a median exposure of 20 weeks. In patients followed beyond 48 weeks, the rate of virological failure at 48 weeks was at most 20%. Virological failure was more likely where patients had previously failed on both amprenavir and saquinavir and as the number of previously failed PI regimens increased. CONCLUSIONS: As a component of therapy for treatment-experienced patients, darunavir can achieve efficacy and tolerability in clinical practice similar to those seen in clinical trials. Clinicians should consider whether a patient has failed on both amprenavir and saquinavir and the number of failed PI regimens before prescribing darunavir.
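
Risk factors of this kind are typically evaluated with proportional hazards regression. A hedged sketch using lifelines' standard Cox fit in place of the paper's Bayesian discrete-time model; all column names and values are hypothetical:

```python
# Toy Cox proportional hazards fit for time to virological failure.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "weeks_to_failure":     [45, 20, 72, 12, 60, 33, 72, 25, 50, 18],
    "failed":               [0, 1, 0, 1, 0, 1, 0, 1, 1, 0],  # 1 = virological failure
    "failed_apv_and_sqv":   [0, 1, 0, 1, 1, 0, 0, 1, 0, 1],  # prior failure on both drugs
    "n_failed_pi_regimens": [1, 4, 0, 5, 2, 3, 1, 4, 2, 3],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="weeks_to_failure", event_col="failed")
cph.print_summary()   # hazard ratios for each candidate risk factor
```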

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a new nonparametric atlas registration framework, derived from the optical flow model and the active contour theory, applied to automatic subthalamic nucleus (STN) targeting in deep brain stimulation (DBS) surgery. In a previous work, we demonstrated that the STN position can be predicted based on the position of surrounding visible structures, namely the lateral and third ventricles. An STN targeting process can thus be obtained by registering these structures of interest between a brain atlas and the patient image. Here we aim to improve on the results of state-of-the-art targeting methods and at the same time to reduce the computational time. Our simultaneous segmentation and registration model shows mean STN localization errors statistically similar to those of the best-performing registration algorithms tested so far and to the experts' targeting variability. Moreover, the computational time of our registration method is much lower, which is a worthwhile improvement from a clinical point of view.
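
For intuition, here is a plain optical-flow registration sketch with scikit-image on synthetic blobs standing in for the ventricle structures; the paper's model additionally couples the flow with active contours, which is not reproduced here:

```python
# Toy nonrigid registration of an "atlas" slice to a "patient" slice via TV-L1
# optical flow (scikit-image), then warping the atlas with the estimated flow.
import numpy as np
from skimage.registration import optical_flow_tvl1
from skimage.transform import warp

def blob(cx, cy, shape=(128, 128), sigma=8.0):
    yy, xx = np.mgrid[: shape[0], : shape[1]]
    return np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2 * sigma**2))

patient = blob(64, 64) + blob(90, 40)     # "visible structures"
atlas   = blob(60, 68) + blob(86, 44)     # same anatomy, displaced

v, u = optical_flow_tvl1(patient, atlas)  # flow mapping atlas onto patient
rows, cols = np.meshgrid(np.arange(128), np.arange(128), indexing="ij")
registered = warp(atlas, np.array([rows + v, cols + u]), mode="edge")

print("MSE before:", float(np.mean((patient - atlas) ** 2)))
print("MSE after: ", float(np.mean((patient - registered) ** 2)))
```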

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: Accuracy studies of Patient Safety Indicators (PSIs) are critical but limited by the large samples required due to the low occurrence of most events. We tested a sampling design based on test results (verification-biased sampling [VBS]) that minimizes the number of subjects to be verified. METHODS: We considered 3 real PSIs, whose rates were calculated using 3 years of discharge data from a university hospital, as well as a hypothetical screen for very rare events. Sample size estimates, based on the expected sensitivity and precision, were compared across 4 study designs: random and VBS, with and without constraints on the size of the population to be screened. RESULTS: Over sensitivities ranging from 0.3 to 0.7 and PSI prevalence levels ranging from 0.02 to 0.2, the optimal VBS strategy makes it possible to reduce the sample size by up to 60% in comparison with simple random sampling. For PSI prevalence levels below 1%, the minimal sample size required was still over 5000. CONCLUSIONS: Verification-biased sampling permits substantial savings in the required sample size for PSI validation studies. However, sample sizes still need to be very large for many of the rarer PSIs.
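
To see why simple random sampling is so expensive here, a back-of-the-envelope calculation: estimating sensitivity Se to within a half-width d at 95% confidence needs roughly z²·Se·(1−Se)/d² verified true events, and under random sampling the screened population is that count divided by the PSI prevalence. The half-width below is an assumed value, and the paper's optimal VBS formulas are not reproduced:

```python
# Rough sample-size sketch for a sensitivity estimate under random sampling;
# the abstract reports that optimal VBS cuts these numbers by up to 60%.
from math import ceil

def random_sampling_size(se, prevalence, half_width=0.1, z=1.96):
    events_needed = z**2 * se * (1 - se) / half_width**2
    return ceil(events_needed / prevalence)   # records to screen overall

for se in (0.3, 0.5, 0.7):
    for prev in (0.02, 0.2):
        print(f"Se={se:.1f}, prevalence={prev:.2f}: "
              f"N ≈ {random_sampling_size(se, prev):,}")
```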