811 results for Algorithm Calibration
Abstract:
Background: Attention to patients with acute minor illnesses requesting same-day consultation represents a major burden in primary care. The workload is assumed by general practitioners in many countries. A number of reports suggest that care for these patients may be provided, at least in part, by nurses. However, there is scarce information with respect to the applicability of a program of nurse management for adult patients with acute minor illnesses in large areas. The aim of this study is to assess the effectiveness of a program of nurse algorithm-guided care for adult patients with acute minor illnesses requesting same-day consultation in primary care in a large, densely populated area. Methods: A cross-sectional study of all adult patients seeking same-day consultation for 16 common acute minor illnesses in a large geographical area with 284 primary care practices. Patients were included in a program of nurse case management using management algorithms. The main outcome measure was case resolution, defined as completion of the algorithm by the nurse without need to refer the patient to the general practitioner. The secondary outcome measure was return to consultation, defined as the requirement of a new consultation in primary care for the same reason as the first one within a 7-day period. Results: During a two-year period (April 2009-April 2011), a total of 1,209,669 consultations were performed in the program. Case resolution was achieved by nurses in 62.5% of consultations. The remaining cases were referred to a general practitioner. Resolution rates ranged from 94.2% in patients with burns to 42% in patients with upper respiratory symptoms. None of the 16 minor illnesses had a resolution rate below 40%. Return to consultation within the 7-day period was low, at only 4.6%. Conclusions: A program of algorithm-guided care is effective for nurse case management of patients requesting same-day consultation for minor illnesses in primary care.
Abstract:
Fetal MRI reconstruction aims at finding a high-resolution image given a small set of low-resolution images. It is usually modeled as an inverse problem where the regularization term plays a central role in the reconstruction quality. The literature has considered several regularization terms, such as Dirichlet/Laplacian energy, Total Variation (TV)-based energies and, more recently, non-local means. Although TV energies are quite attractive because of their ability to preserve edges, standard explicit steepest gradient techniques have been applied to optimize fetal-based TV energies. The main contribution of this work lies in the introduction of a well-posed TV algorithm from the point of view of convex optimization. Specifically, our proposed TV optimization algorithm for fetal reconstruction is optimal w.r.t. the asymptotic and iterative convergence speeds O(1/n²) and O(1/√ε), while existing techniques are in O(1/n) and O(1/ε). We apply our algorithm to (1) clinical newborn data, considered as ground truth, and (2) clinical fetal acquisitions. Our algorithm compares favorably with the literature in terms of speed and accuracy.
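The O(1/n) versus O(1/n²) contrast above is the familiar gap between plain and Nesterov-accelerated first-order methods. The toy sketch below, which is not the authors' fetal reconstruction algorithm, illustrates that gap on a smoothed 1D TV denoising objective; the signal, noise level, smoothing parameter and step size are all illustrative assumptions.

```python
# Toy illustration only: plain vs Nesterov-accelerated gradient descent on a
# smoothed 1D TV denoising objective. Not the authors' fetal MRI algorithm;
# signal, noise level, step size and smoothing eps are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
x_true = np.concatenate([np.zeros(50), np.ones(50)])    # piecewise-constant signal
y = x_true + 0.2 * rng.standard_normal(x_true.size)     # noisy observation

lam, eps = 0.5, 1e-3                                     # TV weight, smoothing parameter

def grad(x):
    """Gradient of 0.5*||x - y||^2 + lam * sum_i sqrt((x_{i+1} - x_i)^2 + eps)."""
    d = np.diff(x)
    w = d / np.sqrt(d**2 + eps)
    g_tv = np.zeros_like(x)
    g_tv[:-1] -= w
    g_tv[1:] += w
    return (x - y) + lam * g_tv

L = 1.0 + 4.0 * lam / np.sqrt(eps)   # crude Lipschitz bound for the smoothed-TV gradient
step = 1.0 / L

# Plain gradient descent: objective gap shrinks like O(1/n).
x_gd = np.zeros_like(y)
for _ in range(500):
    x_gd -= step * grad(x_gd)

# Nesterov/FISTA-style acceleration: objective gap shrinks like O(1/n^2).
x_acc, x_prev, t = np.zeros_like(y), np.zeros_like(y), 1.0
for _ in range(500):
    t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
    z = x_acc + ((t - 1.0) / t_next) * (x_acc - x_prev)
    x_prev, x_acc, t = x_acc, z - step * grad(z), t_next

print("plain GD    error:", np.linalg.norm(x_gd - x_true))
print("accelerated error:", np.linalg.norm(x_acc - x_true))
```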
Abstract:
Pyogenic liver abscess is a severe condition and a therapeutic challenge. Treatment failure may be due to an unrecognized ingested foreign body that migrated from the gastrointestinal tract. There has recently been a marked increase in the number of reported cases of this condition, but initial misdiagnosis as cryptogenic liver abscess still occurs in the majority of cases. We conducted the current study to characterize this entity and provide a diagnostic strategy applicable worldwide. To this end, data were collected from our case and from a systematic review that identified 59 well-described cases. Another systematic review identified series of cryptogenic and Asian Klebsiella liver abscesses; these data were pooled and compared with the data from the cases of migrated foreign body liver abscess. The review points out the low diagnostic accuracy of history taking, modern imaging, and even surgical exploration. A fistula found through imaging procedures or endoscopy warrants surgical exploration. Findings suggestive of foreign body migration are symptoms of gastrointestinal perforation, computed tomography demonstration of a thickened gastrointestinal wall in continuity with the abscess, and adhesions seen during surgery. Treatment failure, left lobe location, unique location (that is, only 1 abscess location within the liver), and absence of underlying conditions also point to the diagnosis, as shown by comparison with the cryptogenic liver abscess series. This study demonstrates that migrated foreign body liver abscess is a specific entity that is increasingly reported. It usually is not cured when unrecognized, and diagnosis is usually delayed. This study provides what we consider the best available evidence for timely diagnosis with worldwide applicability. Increased awareness is required to treat this underestimated condition effectively, and further studies are needed.
Abstract:
Bioassays with bioreporter bacteria are usually calibrated with analyte solutions of known concentrations that are analysed along with the samples of interest. This is done because bioreporter output (the intensity of light, fluorescence or colour) depends not only on the target concentration, but also on the incubation time and the physiological activity of the cells in the assay. Comparing the bioreporter output with standardized colour tables in the field seems rather difficult and error-prone. A new approach to control assay variations and improve ease of application could be an internal calibration based on the use of multiple bioreporter cell lines with drastically different reporter protein outputs at a given analyte concentration. To test this concept, different Escherichia coli-based bioreporter strains expressing either cytochrome c peroxidase (CCP, or CCP mutants) or β-galactosidase upon induction with arsenite were constructed. The reporter strains differed either in the catalytic activity of the reporter protein (for CCP) or in the rates of reporter protein synthesis (for β-galactosidase), which, indeed, resulted in output signals with different intensities at the same arsenite concentration. Hence, it was possible to use combinations of these cell lines to define arsenite concentration ranges at which none, one or more cell lines gave qualitative (yes/no) visible signals that were relatively independent of incubation time or bioreporter activity. The discriminated concentration ranges would fit very well with the current permissive levels (e.g. of the World Health Organization) for arsenite in drinking water (10 µg l⁻¹).
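As a concrete picture of the internal-calibration readout, the sketch below maps the pattern of yes/no signals from several strains onto a bracketed concentration range. The strain names and visible-signal thresholds are hypothetical placeholders, not the constructs or detection limits reported here.

```python
# Illustrative sketch of the multi-strain "internal calibration" readout:
# each strain turns visibly "on" above its own threshold, so the pattern of
# yes/no signals brackets the arsenite concentration. Strain names and
# thresholds are hypothetical placeholders, not values from the study.
from typing import Dict, Tuple

# Hypothetical visible-signal thresholds (µg/L arsenite) for three reporter strains.
THRESHOLDS: Dict[str, float] = {"strain_low": 5.0, "strain_mid": 10.0, "strain_high": 50.0}

def concentration_range(signals: Dict[str, bool]) -> Tuple[float, float]:
    """Return (lower, upper) bounds in µg/L implied by which strains responded."""
    on = [THRESHOLDS[s] for s, seen in signals.items() if seen]
    off = [THRESHOLDS[s] for s, seen in signals.items() if not seen]
    lower = max(on) if on else 0.0              # at least the highest responding threshold
    upper = min(off) if off else float("inf")   # below the lowest non-responding threshold
    return lower, upper

# Example: only the two most sensitive strains respond -> the sample sits between
# 10 and 50 µg/L, i.e. above a 10 µg/L guideline value.
print(concentration_range({"strain_low": True, "strain_mid": True, "strain_high": False}))
```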
Abstract:
BACKGROUND: Surveillance of multiple congenital anomalies is considered to be more sensitive for the detection of new teratogens than surveillance of all or isolated congenital anomalies. The current literature proposes the manual review of all cases for classification into isolated or multiple congenital anomalies. METHODS: Multiple anomalies were defined as two or more major congenital anomalies, excluding sequences and syndromes. A computer algorithm for classification of major congenital anomaly cases in the EUROCAT database according to International Classification of Diseases (ICD) version 10 codes was programmed, further developed, and implemented for 1 year's data (2004) from 25 registries. The group of cases classified as having potential multiple congenital anomalies was manually reviewed by three geneticists to reach a final agreement on classification as "multiple congenital anomaly" cases. RESULTS: A total of 17,733 cases with major congenital anomalies were reported, giving an overall prevalence of major congenital anomalies of 2.17%. The computer algorithm classified 10.5% of all cases as "potentially multiple congenital anomalies". After manual review of these cases, 7% were agreed to have true multiple congenital anomalies. Furthermore, the algorithm classified 15% of all cases as having chromosomal anomalies, 2% as monogenic syndromes, and 76% as isolated congenital anomalies. The proportion of multiple anomalies varies by congenital anomaly subgroup, reaching up to 35% in cases with bilateral renal agenesis. CONCLUSIONS: The implementation of the EUROCAT computer algorithm is a feasible, efficient, and transparent way to improve classification of congenital anomalies for surveillance and research.
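A rough sketch of the kind of rule-based triage such an algorithm performs is given below; the ICD-10 code prefixes used for the chromosomal, monogenic and minor/sequence groups are placeholders, not EUROCAT's actual code lists.

```python
# Hedged sketch of the kind of rule-based triage the EUROCAT algorithm performs:
# route each case to chromosomal / monogenic syndrome / isolated / "potential
# multiple" (the last group going to manual genetic review). The ICD-10 code
# sets below are placeholders, not the registry's actual exclusion lists.
from typing import List

CHROMOSOMAL = {"Q90", "Q91", "Q92", "Q93", "Q96", "Q97", "Q98", "Q99"}   # placeholder prefixes
MONOGENIC = {"Q87"}                                                      # placeholder prefixes
MINOR_OR_SEQUENCE = {"Q353", "Q670"}                                     # placeholder codes to ignore

def classify(case_codes: List[str]) -> str:
    codes = [c.replace(".", "").upper() for c in case_codes]
    if any(c[:3] in CHROMOSOMAL for c in codes):
        return "chromosomal"
    if any(c[:3] in MONOGENIC for c in codes):
        return "monogenic syndrome"
    major = [c for c in codes if c not in MINOR_OR_SEQUENCE]
    if len(major) >= 2:
        return "potential multiple (manual review)"
    return "isolated"

print(classify(["Q21.0"]))            # isolated
print(classify(["Q21.0", "Q79.2"]))   # potential multiple (manual review)
```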
Abstract:
The Highway Safety Manual (HSM) is the national safety manual that provides quantitative methods for analyzing highway safety. The HSM presents crash modification factors related to work zone characteristics such as work zone duration and length. These crash modification factors were based on high-impact work zones in California. Therefore, there was a need to use work zone and safety data from the Midwest to calibrate these crash modification factors for use in the Midwest. Almost 11,000 Missouri freeway work zones were analyzed to derive a representative and stratified sample of 162 work zones, more than four times the number of work zones used in the HSM. This dataset was used for modeling and testing crash modification factors applicable to the Midwest. The dataset contained work zones ranging from 0.76 mile to 9.24 miles in length and with durations from 16 days to 590 days. A combined fatal/injury/non-injury model produced an R² fit of 0.9079 and a prediction slope of 0.963. The resulting crash modification factors of 1.01 for duration and 0.58 for length were smaller than the values in the HSM. Two practical application examples illustrate the use of the crash modification factors for comparing alternate work zone setups.
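As a rough illustration of how such factors are used to compare alternate work zone setups, the sketch below scales a baseline crash frequency with HSM-style percent-increase CMFs; treating 1.01 and 0.58 as the coefficients in that form, along with the baseline and percentages, is an assumption made purely for illustration.

```python
# Hedged sketch of how work-zone CMFs of this form are typically applied:
# scale a baseline crash frequency by a CMF that grows with the percent
# increase in duration/length relative to a reference setup. The percent-
# increase formulation mirrors the HSM convention; treating the 1.01 and
# 0.58 values as those coefficients is an assumption for illustration.
def cmf_percent_increase(pct_increase: float, coefficient: float) -> float:
    """CMF = 1 + (percent increase x coefficient) / 100."""
    return 1.0 + pct_increase * coefficient / 100.0

def predicted_crashes(base_crashes: float, pct_longer_duration: float, pct_longer_length: float) -> float:
    cmf_d = cmf_percent_increase(pct_longer_duration, 1.01)   # duration coefficient from the study
    cmf_l = cmf_percent_increase(pct_longer_length, 0.58)     # length coefficient from the study
    return base_crashes * cmf_d * cmf_l

# Compare two hypothetical setups against a 10-crash baseline:
# A: 50% longer duration, same length; B: same duration, 50% longer length.
print(predicted_crashes(10.0, 50.0, 0.0))   # setup A
print(predicted_crashes(10.0, 0.0, 50.0))   # setup B
```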
Abstract:
The atomic force microscope is not only a very convenient tool for studying the topography of different samples, but it can also be used to measure specific binding forces between molecules. For this purpose, one type of molecule is attached to the tip and the other one to the substrate. Bringing the tip close to the substrate allows the molecules to bind together. Retracting the tip breaks the newly formed bond. The rupture of a specific bond appears in the force-distance curves as a spike from which the binding force can be deduced. In this article we present an algorithm to automatically process force-distance curves in order to obtain bond strength histograms. The algorithm is based on a fuzzy logic approach that permits an evaluation of the "quality" of every event and makes the detection procedure much faster compared with manual selection. In this article, the software has been applied to measure the binding strength between tubulin and microtubule-associated proteins.
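The sketch below is not the authors' software, but it shows the general shape of such a detector: candidate rupture events are picked out as sharp recoveries toward the baseline in the retract curve, and each is given a fuzzy "quality" score; all membership breakpoints and the synthetic curve are illustrative assumptions.

```python
# Illustrative sketch (not the authors' software): find candidate rupture
# events in a retract force-distance curve as sharp negative-to-baseline
# force jumps, and give each a fuzzy "quality" score combining jump height
# and adhesion depth. All membership breakpoints are arbitrary assumptions.
import numpy as np

def ramp(x, a, b):
    """Simple ramp membership: 0 below a, 1 above b, linear in between."""
    return float(np.clip((x - a) / (b - a), 0.0, 1.0))

def detect_ruptures(force_pN: np.ndarray, min_quality: float = 0.5):
    jumps = np.diff(force_pN)                      # positive jump = snap back toward baseline
    events = []
    for i, jump in enumerate(jumps):
        if jump <= 0:
            continue
        depth = -force_pN[i]                       # how far below baseline before the snap
        q_height = ramp(jump, 5.0, 30.0)           # fuzzy: jump size in pN
        q_depth = ramp(depth, 5.0, 20.0)           # fuzzy: adhesion depth in pN
        quality = min(q_height, q_depth)           # fuzzy AND
        if quality >= min_quality:
            events.append({"index": i, "force_pN": depth, "quality": quality})
    return events

# Synthetic retract curve: baseline noise plus one adhesion dip of ~40 pN.
rng = np.random.default_rng(1)
curve = rng.normal(0.0, 1.0, 200)
curve[100:120] -= np.linspace(5, 40, 20)           # gradual pull on the bond
print(detect_ruptures(curve))                      # the snap near index 119 should be reported
```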
Abstract:
Background: The first AO comprehensive pediatric long bone fracture classification system has been established following a structured path of development and validation with experienced pediatric surgeons. Methods: A follow-up series of agreement studies was applied to specify and evaluate a grading system for displacement of pediatric supracondylar fractures. An iterative process comprising an international group of 5 experienced pediatric surgeons (Phase 1) followed by a pragmatic multicenter agreement study involving 26 raters (Phase 2) was used. The last evaluations were conducted on a consecutive collection of 154 supracondylar fractures documented by standard anteroposterior and lateral radiographs. Results: Fractures were classified according to 1 of 4 grades: I = incomplete fracture with no or minimal displacement; II = incomplete fracture with continuity of the posterior (extension fracture) or anterior cortex (flexion fracture); III = lack of bone continuity (broken cortex), but still some contact between the fracture planes; IV = complete fracture with no bone continuity (broken cortex) and no contact between the fracture planes. A diagnostic algorithm to support the practical application of the grading system in a clinical setting, as well as an aid using a circle placed over the capitellum, was proposed. The overall kappa coefficients were 0.68 and 0.61 in the Phase 1 and Phase 2 studies, respectively. In the Phase 1 study, fracture grades I, II, III, and IV were classified with median accuracies of 91%, 82%, 83%, and 99.5%, respectively. Similar median accuracies of 86% (Grade I), 73% (Grade II), 83% (Grade III), and 92% (Grade IV) were reported for the Phase 2 study. Reliability was high in distinguishing complete, unstable fractures from stable injuries [i.e., kappa coefficients of 0.84 (Phase 1) and 0.83 (Phase 2) were calculated]; in Phase 2, surgeons' accuracies in classifying complete fractures were all above 85%. Conclusions: With clear and unambiguous definitions, this new grading system for supracondylar fracture displacement has proved to be sufficiently reliable and accurate when applied by pediatric surgeons in the framework of clinical routine as well as research.
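Encoded directly from the grade definitions quoted above, a minimal decision function might look as follows; the boolean inputs are one assumed way of recording the radiographic findings and are not part of the published diagnostic algorithm or the capitellum-circle aid.

```python
# Sketch of the four displacement grades as a simple decision function, based
# only on the definitions quoted above; the boolean inputs (complete break,
# fragment contact, displacement) are an assumed encoding of the radiographic
# findings, not part of the published algorithm.
def supracondylar_grade(complete_break: bool, fragments_in_contact: bool, displaced: bool) -> str:
    if not complete_break:                  # incomplete fracture, one cortex still continuous
        return "II" if displaced else "I"
    return "III" if fragments_in_contact else "IV"

print(supracondylar_grade(complete_break=False, fragments_in_contact=True, displaced=False))  # I
print(supracondylar_grade(complete_break=True, fragments_in_contact=False, displaced=True))   # IV
```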
Abstract:
Background: Research in epistasis or gene-gene interaction detection for human complex traits has grown over the last few years. It has been marked by promising methodological developments, improved translation efforts of statistical epistasis to biological epistasis, and attempts to integrate different omics information sources into the epistasis screening to enhance power. The quest for gene-gene interactions poses severe multiple-testing problems. In this context, the maxT algorithm is one technique to control the false-positive rate. However, the memory needed by this algorithm rises linearly with the number of hypothesis tests. Gene-gene interaction studies will require memory proportional to the squared number of SNPs. A genome-wide epistasis search would therefore require terabytes of memory. Hence, cache problems are likely to occur, increasing the computation time. In this work we present a new version of maxT, requiring an amount of memory independent of the number of genetic effects to be investigated. This algorithm was implemented in C++ in our epistasis screening software MBMDR-3.0.3. We evaluate the new implementation in terms of memory efficiency and speed using simulated data. The software is illustrated on real-life data for Crohn’s disease. Results: In the case of a binary (affected/unaffected) trait, the parallel workflow of MBMDR-3.0.3 analyzes all gene-gene interactions in a dataset of 100,000 SNPs typed on 1,000 individuals within 4 days and 9 hours, using 999 permutations of the trait to assess statistical significance, on a cluster composed of 10 blades, each containing four Quad-Core AMD Opteron(tm) 2352 2.1 GHz processors. In the case of a continuous trait, a similar run takes 9 days. Our program found 14 SNP-SNP interactions with a multiple-testing corrected p-value of less than 0.05 on real-life Crohn’s disease (CD) data. Conclusions: Our software is the first implementation of the MB-MDR methodology able to solve large-scale SNP-SNP interaction problems within a few days, without using much memory, while adequately controlling the type I error rates. A new implementation to reach genome-wide epistasis screening is under construction. In the context of Crohn’s disease, MBMDR-3.0.3 could identify epistasis involving regions that are well known in the field and could be explained from a biological point of view. This demonstrates the power of our software to find relevant phenotype-genotype higher-order associations.
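For readers unfamiliar with maxT, the single-step version of the idea can be sketched as follows: for each permutation of the trait only the maximum statistic over all SNP pairs is retained, so memory grows with the number of permutations rather than with the number of tests. This toy sketch uses placeholder pairwise statistics and is not the step-down maxT variant implemented in MBMDR-3.0.3.

```python
# Hedged sketch of the single-step maxT idea behind permutation-based FWER
# control: for each permutation of the trait we only keep the maximum test
# statistic, so stored state grows with the number of permutations, not with
# the number of SNP pairs. The statistics below are simple placeholders on
# random data, not the MB-MDR test.
import numpy as np
from itertools import combinations

def pair_statistics(genotypes: np.ndarray, trait: np.ndarray) -> np.ndarray:
    """Toy pairwise statistic: squared correlation of each SNP-pair product with the trait."""
    stats = []
    for i, j in combinations(range(genotypes.shape[1]), 2):
        feature = genotypes[:, i] * genotypes[:, j]
        r = np.corrcoef(feature, trait)[0, 1]
        stats.append(0.0 if np.isnan(r) else r * r)
    return np.asarray(stats)

def maxT_adjusted_pvalues(genotypes, trait, n_perm=999, seed=0):
    rng = np.random.default_rng(seed)
    observed = pair_statistics(genotypes, trait)
    perm_max = np.empty(n_perm)                       # only one number kept per permutation
    for b in range(n_perm):
        perm_max[b] = pair_statistics(genotypes, rng.permutation(trait)).max()
    # adjusted p-value: fraction of permutation maxima at least as large as each observed stat
    return (1 + (perm_max[None, :] >= observed[:, None]).sum(axis=1)) / (n_perm + 1)

geno = np.random.default_rng(1).integers(0, 3, size=(100, 6))   # 100 individuals, 6 SNPs
trait = np.random.default_rng(2).integers(0, 2, size=100)       # binary affected/unaffected
print(maxT_adjusted_pvalues(geno, trait, n_perm=199).round(3))
```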
Abstract:
The relationship between source separation and blind deconvolution is well known: if a filtered version of an unknown i.i.d. signal is observed, temporal independence between samples can be used to retrieve the original signal, in the same manner as spatial independence is used for source separation. In this paper we propose the use of a Genetic Algorithm (GA) to blindly invert linear channels. The use of a GA is justified in the case of a small number of samples, where other gradient-like methods fail because of poor estimation of the statistics.
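A minimal sketch of this kind of GA-based blind inversion is shown below, not the scheme proposed in the paper: candidate equalizer filters are evolved to maximize the absolute excess kurtosis of their output, a standard non-Gaussianity criterion for recovering an i.i.d. sub-Gaussian source. The channel, source length, population size and mutation scale are all illustrative assumptions.

```python
# Minimal sketch of a GA searching for an inverse (equalizer) filter: fitness
# is the absolute excess kurtosis of the equalized output, which is largest
# when the combined channel+equalizer response approaches a delta for an
# i.i.d. binary source. All numerical choices are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
source = rng.choice([-1.0, 1.0], size=300)          # short i.i.d. binary source (few samples)
channel = np.array([1.0, 0.6, 0.3])                 # "unknown" FIR channel
observed = np.convolve(source, channel, mode="full")[: source.size]

def kurtosis_abs(x):
    x = (x - x.mean()) / (x.std() + 1e-12)
    return abs(np.mean(x**4) - 3.0)                 # |excess kurtosis|

def fitness(w):
    return kurtosis_abs(np.convolve(observed, w, mode="full")[: observed.size])

POP, GENS, N_TAPS = 40, 120, 6
pop = rng.normal(0.0, 0.5, size=(POP, N_TAPS))
for _ in range(GENS):
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[-POP // 2:]]   # keep the fitter half (elitism)
    cut = rng.integers(1, N_TAPS, size=POP // 2)    # one-point crossover positions
    mates = parents[rng.permutation(POP // 2)]
    children = np.array([np.concatenate([p[:c], m[c:]]) for p, m, c in zip(parents, mates, cut)])
    children += rng.normal(0.0, 0.05, size=children.shape)   # Gaussian mutation
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(w) for w in pop])]
combined = np.convolve(channel, best)               # should approach a scaled, delayed delta
print("combined channel+equalizer response:", np.round(combined / np.max(np.abs(combined)), 2))
```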
Abstract:
Based on the results of an evaluation performed during the winter of 1985-86, six Troxler 3241-B Asphalt Content Gauges were purchased for District use in monitoring project asphalt contents. Use of these gauges will help reduce the need for chemical-based extractions. Effective use of the gauges depends on the accurate preparation and transfer of project mix calibrations from the Central Lab to the Districts. The objective of this project was to evaluate the precision and accuracy of a gauge in determining asphalt contents and to develop a mix calibration transfer procedure for implementation during the 1987 construction season. The first part of the study was accomplished by preparing mix calibrations on the Central Lab gauge and taking multiple measurements of a sample with known asphalt content. The second part was accomplished by preparing transfer pans, obtaining count data on the pans using each gauge, and transferring calibrations from one gauge to another through the use of calibration transfer equations. The transferred calibrations were tested by measuring samples with a known asphalt content. The study established that the Troxler 3241-B Asphalt Content Gauge yields results of acceptable accuracy and precision, as evidenced by a standard deviation of 0.04% asphalt content on multiple measurements of the same sample. The calibration transfer procedure proved feasible and resulted in the calibration transfer portion of Materials I.M. 335, Method of Test for Determining the Asphalt Content of Bituminous Mixtures by the Nuclear Method.
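A heavily simplified sketch of a calibration transfer of this general kind is given below: counts from the transfer pans relate the slave gauge to the master gauge, and a slave-gauge count is mapped back into master-gauge counts before the master mix calibration is applied. The linear forms and all numbers are illustrative assumptions and do not reproduce the procedure codified in Materials I.M. 335.

```python
# Hedged sketch of a calibration transfer of the kind described: relate the
# two gauges' counts on the same transfer pans with a linear fit, then map a
# slave-gauge count into equivalent master-gauge counts before applying the
# master calibration. Linear forms and all numbers are illustrative only.
import numpy as np

# Master-gauge calibration: counts recorded on pans of known asphalt content (%).
master_counts = np.array([12500.0, 11800.0, 11150.0, 10500.0])
asphalt_pct = np.array([4.0, 5.0, 6.0, 7.0])
cal_slope, cal_intercept = np.polyfit(master_counts, asphalt_pct, 1)

# Transfer pans measured on both gauges: fit slave counts -> master counts.
slave_counts_on_pans = np.array([13100.0, 12380.0, 11700.0, 11020.0])
t_slope, t_intercept = np.polyfit(slave_counts_on_pans, master_counts, 1)

def asphalt_content_from_slave(count_slave: float) -> float:
    """Convert a slave-gauge count to % asphalt via the transferred calibration."""
    equivalent_master_count = t_slope * count_slave + t_intercept
    return cal_slope * equivalent_master_count + cal_intercept

print(round(asphalt_content_from_slave(12380.0), 2))   # should land near 5.0% for this toy data
```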
Abstract:
In a previous study, moisture loss indices were developed based on field measurements from one CIR-foam and one CIR-emulsion construction site. To calibrate these moisture loss indices, additional CIR construction sites were monitored using embedded moisture and temperature sensors. In addition, to determine the optimum timing of an HMA overlay on the CIR layer, the potential of using the stiffness of the CIR layer measured by a geo-gauge instead of the moisture measurement by a nuclear gauge was explored. Based on the monitoring of moisture and stiffness at seven CIR project sites, the following conclusions were derived: 1. In some cases, the in-situ stiffness remained constant and, in other cases, despite some rainfall, the stiffness of the CIR layers steadily increased during the curing time. 2. The stiffness measured by the geo-gauge was affected by a significant amount of rainfall. 3. The moisture indices developed for CIR sites can be used for predicting the moisture level in a typical CIR project. The initial moisture content and temperature were the most significant factors in predicting the future moisture content in the CIR layer. 4. The stiffness of a CIR layer is an extremely useful tool for contractors to use in timing their HMA overlay. To determine the optimal timing of an HMA overlay, it is recommended that the moisture loss index be used in conjunction with the stiffness of the CIR layer.
Abstract:
Context: Ovarian tumor (OT) typing is a competency expected of pathologists, with significant clinical implications. OTs, however, come in numerous different types, some rather rare, with the consequence that some departments have few opportunities for practice. Aim: Our aim was to design a tool for pathologists to train in the typing of less common OTs. Method and Results: Representative slides of 20 less common OTs were scanned (Nano Zoomer Digital Hamamatsu®) and the diagnostic algorithm proposed by Young and Scully was applied to each case (Young RH and Scully RE, Seminars in Diagnostic Pathology 2001, 18: 161-235), to include: recognition of morphological pattern(s); shortlisting of differential diagnoses; proposal of relevant immunohistochemical markers. The next steps of this project will be: evaluation of the tool in several post-graduate training centers in Europe and Québec; improvement of its design based on evaluation results; diffusion to a larger public. Discussion: In clinical medicine, solving many cases is recognized as of utmost importance for a novice to become an expert. This project relies on virtual slide technology to provide pathologists with a learning tool aimed at increasing their skills in OT typing. After due evaluation, this model might be extended to other uncommon tumors.
Abstract:
In this paper, a hybrid simulation-based algorithm is proposed for the Stochastic Flow Shop Problem. The main idea of the methodology is to transform the stochastic problem into a deterministic problem and then apply simulation to the latter. In order to achieve this goal, we rely on Monte Carlo simulation and an adapted version of a deterministic heuristic. This approach aims to provide flexibility and simplicity, since it is not constrained by any previous assumptions and relies on well-tested heuristics.
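A compact sketch of the hybrid idea, under assumptions of our own choosing, is shown below: expected processing times define a deterministic instance, an NEH-style insertion heuristic (assumed here purely for illustration, not necessarily the heuristic adapted in the paper) builds a job permutation, and Monte Carlo simulation then estimates the stochastic makespan of that permutation.

```python
# Hedged sketch of the hybrid idea: replace stochastic processing times by
# their expectations, build a permutation with a deterministic heuristic
# (an NEH-style insertion is assumed here purely for illustration), and then
# score that permutation by Monte Carlo simulation of the stochastic makespan.
import numpy as np

rng = np.random.default_rng(0)
n_jobs, n_machines = 6, 3
mean_times = rng.uniform(2.0, 10.0, size=(n_jobs, n_machines))   # expected processing times

def makespan(perm, times):
    """Completion time of the last job on the last machine for a permutation flow shop."""
    completion = np.zeros((len(perm), times.shape[1]))
    for pos, job in enumerate(perm):
        for m in range(times.shape[1]):
            prev_job = completion[pos - 1, m] if pos > 0 else 0.0
            prev_machine = completion[pos, m - 1] if m > 0 else 0.0
            completion[pos, m] = max(prev_job, prev_machine) + times[job, m]
    return completion[-1, -1]

def neh_like(times):
    """Insertion heuristic on the deterministic (expected-time) instance."""
    order = list(np.argsort(-times.sum(axis=1)))                  # jobs by decreasing total work
    perm = [order[0]]
    for job in order[1:]:
        candidates = [perm[:i] + [job] + perm[i:] for i in range(len(perm) + 1)]
        perm = min(candidates, key=lambda p: makespan(p, times))
    return perm

perm = neh_like(mean_times)

# Monte Carlo evaluation of the chosen permutation under stochastic times
# (lognormal noise around the means is an illustrative assumption).
samples = [makespan(perm, mean_times * rng.lognormal(0.0, 0.2, size=mean_times.shape))
           for _ in range(1000)]
print("permutation:", perm)
print("expected-time makespan:", round(makespan(perm, mean_times), 2))
print("simulated mean makespan:", round(float(np.mean(samples)), 2))
```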