838 results for Automated algorithms


Relevance:

20.00%

Publisher:

Abstract:

The research reported in this series of articles aimed at (1) automating the search of questioned ink specimens in ink reference collections and (2) evaluating the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples be analysed in an accurate and reproducible way and compared in an objective and automated way; the latter requirement is due to the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited to different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, high-performance thin-layer chromatography, despite its reputation for lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model. It is therefore possible to move away from the traditional subjective approach, which is based entirely on expert opinion and is usually not very informative. While there is room for improvement, this report demonstrates the significant gains over the traditional subjective approach in both the search of ink specimens in ink databases and the interpretation of their evidential value.
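
As an illustration of the kind of probabilistic evaluation described above, the sketch below computes a score-based likelihood ratio from two calibration score distributions (same-ink vs different-ink comparisons). The score distributions and the likelihood_ratio helper are hypothetical stand-ins, not the paper's actual model.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical similarity scores from calibration data (NOT from the paper):
# comparisons between HPTLC profiles of the same ink vs different inks.
rng = np.random.default_rng(0)
same_ink_scores = rng.normal(0.90, 0.05, 500).clip(0, 1)
diff_ink_scores = rng.normal(0.50, 0.15, 5000).clip(0, 1)

f_same = gaussian_kde(same_ink_scores)   # score density under "same ink"
f_diff = gaussian_kde(diff_ink_scores)   # score density under "different inks"

def likelihood_ratio(score):
    """LR > 1 supports 'same ink'; LR < 1 supports 'different inks'."""
    return float(f_same(score) / f_diff(score))

print(likelihood_ratio(0.85))   # strength of evidence for a given score
```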

Relevance:

20.00%

Publisher:

Abstract:

PURPOSE: To improve tag persistence by providing constant tag contrast across all cardiac phases when using balanced steady-state free precession (bSSFP) imaging. MATERIALS AND METHODS: The flip angles of the imaging radiofrequency pulses were optimized to compensate for the fading of the tagging contrast-to-noise ratio (Tag-CNR) at later cardiac phases in bSSFP imaging. Complementary spatial modulation of magnetization (CSPAMM) tagging was implemented to improve the Tag-CNR. Numerical simulations were performed to examine the behavior of the Tag-CNR with the proposed method and to compare the resulting Tag-CNR with that obtained from the more commonly used spoiled gradient echo (SPGR) imaging. A gel phantom and five healthy human volunteers were scanned on a 1.5T scanner using bSSFP imaging with and without the proposed technique. The phantom was also scanned with SPGR imaging. RESULTS: With the proposed technique, the Tag-CNR remained almost constant during the whole cardiac cycle. Using bSSFP imaging, the Tag-CNR was about double that of SPGR. CONCLUSION: Tag persistence was significantly improved when the proposed method was applied, with better Tag-CNR during the diastolic cardiac phase. The improved Tag-CNR will support automated tagging analysis and quantification methods.
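
The abstract's flip-angle optimization is specific to bSSFP, but the underlying idea of ramping flip angles to hold tag contrast constant goes back to the classic spoiled-gradient-echo recursion tan(alpha_n) = E1 * sin(alpha_{n+1}), with E1 = exp(-TR/T1) (Fischer et al.). Below is a minimal sketch of that backward recursion, shown only to illustrate the principle; the paper's bSSFP schedule is derived differently.

```python
import numpy as np

def ramped_flip_angles(n_phases, tr_ms, t1_ms, alpha_last_deg=25.0):
    """Backward recursion tan(a_n) = E1 * sin(a_{n+1}) for SPGR tagging.

    Increasing the imaging flip angle over the cardiac cycle keeps the
    tagged signal (and hence the Tag-CNR) roughly constant as the tag fades.
    """
    e1 = np.exp(-tr_ms / t1_ms)
    alphas = np.zeros(n_phases)
    alphas[-1] = np.deg2rad(alpha_last_deg)
    for n in range(n_phases - 2, -1, -1):
        alphas[n] = np.arctan(e1 * np.sin(alphas[n + 1]))
    return np.rad2deg(alphas)

# Illustrative timing values (not the paper's protocol):
print(ramped_flip_angles(n_phases=20, tr_ms=4.0, t1_ms=850.0))
```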

Relevance:

20.00%

Publisher:

Abstract:

Voxel-based morphometry from conventional T1-weighted images has proved effective in quantifying Alzheimer's disease (AD)-related brain atrophy and enables fairly accurate automated classification of AD patients, patients with mild cognitive impairment (MCI), and elderly controls. Little is known, however, about the classification power of volume-based morphometry, where the features of interest consist of a few brain structure volumes (e.g. hippocampi, lobes, ventricles) as opposed to hundreds of thousands of voxel-wise gray matter concentrations. In this work, we experimentally evaluate two distinct volume-based morphometry algorithms (FreeSurfer and an in-house algorithm called MorphoBox) for automatic disease classification on a standardized data set from the Alzheimer's Disease Neuroimaging Initiative. Results indicate that both algorithms achieve classification accuracy comparable to the conventional whole-brain voxel-based morphometry pipeline using SPM for AD vs elderly controls and for MCI vs controls, and higher accuracy for AD vs MCI and for early vs late AD converters, thereby demonstrating the potential of volume-based morphometry to assist the diagnosis of mild cognitive impairment and Alzheimer's disease.
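
To make the "few volumes as features" idea concrete, here is a minimal classification sketch in the spirit of the study: a handful of structure volumes per subject fed to a cross-validated linear classifier. The data are random stand-ins; the feature count, subject count, and choice of logistic regression are assumptions, not the paper's protocol.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical feature matrix: one row per subject, a handful of structure
# volumes (e.g. left/right hippocampus, ventricles, temporal lobe), typically
# normalized by intracranial volume. Random values stand in for real data.
rng = np.random.default_rng(1)
X = rng.normal(size=(120, 6))        # stand-in for FreeSurfer/MorphoBox volumes
y = rng.integers(0, 2, size=120)     # 0 = control, 1 = AD (toy labels)

clf = make_pipeline(StandardScaler(), LogisticRegression())
acc = cross_val_score(clf, X, y, cv=10).mean()
print(f"10-fold CV accuracy: {acc:.2f}")
```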

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a new non-parametric atlas registration framework, derived from the optical flow model and active contour theory, applied to automatic subthalamic nucleus (STN) targeting in deep brain stimulation (DBS) surgery. In a previous work, we demonstrated that the STN position can be predicted from the position of surrounding visible structures, namely the lateral and third ventricles. An STN targeting process can thus be obtained by registering these structures of interest between a brain atlas and the patient image. Here we aim to improve on state-of-the-art targeting methods while reducing the computational time. Our simultaneous segmentation and registration model shows mean STN localization errors statistically similar to those of the best-performing registration algorithms tested so far, and to the targeting experts' variability. Moreover, the computational time of our registration method is much lower, which is a worthwhile improvement from a clinical point of view.
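
As a sketch of the optical-flow registration family this framework builds on, the following implements one demons-style displacement update with Gaussian regularization. It is a generic illustration only; the paper's model additionally couples an active-contour (simultaneous segmentation) term, which is not reproduced here.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def demons_step(fixed, moving, u, v, sigma=2.0):
    """One demons/optical-flow update of the displacement field (u, v).

    Uses backward mapping: we want fixed(x) ~= moving(x + u(x)).
    The Gaussian smoothing acts as the regularizer.
    """
    ny, nx = fixed.shape
    yy, xx = np.mgrid[0:ny, 0:nx].astype(float)
    warped = map_coordinates(moving, [yy + v, xx + u], order=1, mode="nearest")
    gy, gx = np.gradient(fixed)
    diff = warped - fixed
    denom = gx**2 + gy**2 + diff**2 + 1e-12   # demons normalization
    u = gaussian_filter(u - diff * gx / denom, sigma)
    v = gaussian_filter(v - diff * gy / denom, sigma)
    return u, v

# Toy usage: recover a small horizontal shift of a smooth blob.
fixed = np.zeros((64, 64)); fixed[24:40, 24:40] = 1.0
fixed = gaussian_filter(fixed, 3.0)
moving = np.roll(fixed, 3, axis=1)
u = np.zeros_like(fixed); v = np.zeros_like(fixed)
for _ in range(50):
    u, v = demons_step(fixed, moving, u, v)
print(f"mean |u| after 50 iterations: {np.abs(u).mean():.2f}")
```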

Relevance:

20.00%

Publisher:

Abstract:

The reliable and objective assessment of chronic disease state has been, and still is, a significant challenge in clinical medicine. An essential feature of human behavior related to health status, functional capacity, and quality of life is physical activity during daily life. A common way to assess physical activity is to measure the quantity of body movement. Since human activity is controlled by various factors both extrinsic and intrinsic to the body, quantitative parameters provide only a partial assessment and do not allow a clear distinction between normal and abnormal activity. In this paper, we propose a methodology for the analysis of human activity patterns based on the definition of different physical activity time series together with appropriate analysis methods. The temporal pattern of postures, movements, and transitions between postures was quantified using fractal analysis and symbolic dynamics statistics. The derived nonlinear metrics were able to discriminate patterns of daily activity generated from healthy and chronic-pain states.
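
Of the two pattern metrics named, fractal analysis is easy to illustrate: the sketch below implements basic detrended fluctuation analysis (DFA) and returns the scaling exponent of an activity time series. The scale choices and white-noise test signal are illustrative; the paper also uses symbolic-dynamics statistics, which are not shown.

```python
import numpy as np

def dfa(x, scales=(4, 8, 16, 32, 64)):
    """Detrended fluctuation analysis: returns the scaling exponent alpha.

    alpha ~ 0.5 for white noise, ~ 1.0 for 1/f-like (long-range correlated)
    signals; activity series from different health states can differ here.
    """
    y = np.cumsum(x - np.mean(x))              # integrated profile
    flucts = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        f2 = []
        for seg in segs:
            coeffs = np.polyfit(t, seg, 1)     # local linear detrend
            f2.append(np.mean((seg - np.polyval(coeffs, t))**2))
        flucts.append(np.sqrt(np.mean(f2)))
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

print(dfa(np.random.default_rng(2).normal(size=4096)))  # ~0.5 for white noise
```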

Relevance:

20.00%

Publisher:

Abstract:

The purpose of this project was to determine the feasibility of using pavement condition data collected for the Iowa Pavement Management Program (IPMP) as input to the Iowa Quadrennial Need Study. The need study, conducted by the Iowa Department of Transportation (Iowa DOT) every four years, currently uses manually collected highway infrastructure condition data (roughness, rutting, cracking, etc.). Because of the Iowa DOT's 10-year data collection cycles, condition data for a given highway segment may be up to 10 years old. In some cases, the need study process has resulted in wide fluctuations in funding allocated to individual Iowa counties from one study to the next. This volatility in funding levels makes it difficult for county engineers to plan and program road maintenance and improvements. One possible remedy is to input more current and less subjective infrastructure condition data. The IPMP was initially developed to satisfy the Intermodal Surface Transportation Efficiency Act (ISTEA) requirement that federal-aid-eligible highways be managed through a pavement management system. Currently all metropolitan planning organizations (MPOs) in Iowa and 15 of Iowa's 18 RPAs participate in the IPMP. The core of this program is a statewide database of pavement condition and construction history information. The pavement data are collected by machine in two-year cycles. Using pilot areas, researchers examined the implications of using the automated data collected for the IPMP as input to the need study computer program, HWYNEEDS. The results show that using the IPMP automated data in HWYNEEDS is feasible and beneficial, resulting in less volatility in the level of total need between successive quadrennial need studies. In other words, the more current the data, the smaller the shift in total need.

Relevance:

20.00%

Publisher:

Abstract:

Regulatory gene networks contain generic modules, like those involving feedback loops, which are essential for the regulation of many biological functions (Guido et al. in Nature 439:856-860, 2006). We consider a class of self-regulated genes, which are the building blocks of many regulatory gene networks, and study the steady-state distribution of the associated Gillespie dynamics, providing efficient numerical algorithms. We also study a regulatory gene network of interest in gene therapy, using mean-field models with time delays. Convergence of the related time-nonhomogeneous Markov chain is established for a class of linear catalytic networks with feedback loops.
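
For readers unfamiliar with the Gillespie algorithm, here is a minimal stochastic simulation of a hypothetical self-repressing gene; the long-run histogram of the protein count approximates the steady-state distribution studied. The reaction set, rate names, and values are illustrative, not taken from the paper.

```python
import numpy as np

def gillespie_self_regulated(t_max=1000.0, k_on=1.0, k_off=0.5,
                             k_prod=10.0, k_deg=0.1, seed=3):
    """Gillespie SSA for a hypothetical self-repressing gene.

    Protein is produced at rate k_prod while the gene is 'on'; the protein
    switches the gene off at rate k_off * p (negative feedback loop).
    """
    rng = np.random.default_rng(seed)
    t, gene_on, p, traj = 0.0, 1, 0, []
    while t < t_max:
        rates = [k_prod * gene_on,        # produce one protein
                 k_deg * p,               # degrade one protein
                 k_off * p * gene_on,     # feedback: protein switches gene off
                 k_on * (1 - gene_on)]    # gene switches back on
        total = sum(rates)
        t += rng.exponential(1.0 / total)         # time to next reaction
        r = rng.uniform(0, total)                 # pick which reaction fires
        if r < rates[0]:
            p += 1
        elif r < rates[0] + rates[1]:
            p -= 1
        elif r < rates[0] + rates[1] + rates[2]:
            gene_on = 0
        else:
            gene_on = 1
        traj.append(p)
    return np.array(traj)   # its histogram approximates the steady state

counts = gillespie_self_regulated()
print(counts.mean(), counts.std())
```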

Relevance:

20.00%

Publisher:

Abstract:

Fetal MRI reconstruction aims at finding a high-resolution image given a small set of low-resolution images. It is usually modeled as an inverse problem in which the regularization term plays a central role in the reconstruction quality. The literature has considered several regularization terms, such as Dirichlet/Laplacian energy, Total Variation (TV)-based energies, and more recently non-local means. Although TV energies are attractive because of their edge-preserving ability, standard explicit steepest-gradient techniques have been applied to optimize TV energies in fetal reconstruction. The main contribution of this work lies in the introduction of a well-posed TV algorithm from the point of view of convex optimization. Specifically, our proposed TV optimization algorithm for fetal reconstruction is optimal with respect to the asymptotic and iterative convergence speeds, O(1/n²) and O(1/√ε), whereas existing techniques are in O(1/n) and O(1/ε). We apply our algorithm to (1) clinical newborn data, considered as ground truth, and (2) clinical fetal acquisitions. Our algorithm compares favorably with the literature in terms of speed and accuracy.
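
The O(1/n²) rate quoted above is characteristic of accelerated proximal-gradient schemes such as FISTA. The sketch below shows the generic FISTA loop with its momentum sequence; the data-fidelity gradient and proximal operator are passed in as callables, and the LASSO toy usage (soft-thresholding standing in for the TV prox) is an assumption for demonstration only, not the paper's algorithm.

```python
import numpy as np

def fista(grad_f, prox_g, x0, lipschitz, n_iter=200):
    """Generic FISTA loop: O(1/n^2) objective decay vs O(1/n) for plain
    proximal gradient. In a TV-regularized reconstruction, grad_f would be
    the data-fidelity gradient and prox_g the TV proximal operator."""
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(n_iter):
        x_new = prox_g(y - grad_f(y) / lipschitz, 1.0 / lipschitz)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)   # momentum step
        x, t = x_new, t_new
    return x

# Toy usage: LASSO, with soft-thresholding as a stand-in for the TV prox.
rng = np.random.default_rng(4)
A = rng.normal(size=(50, 100)); b = rng.normal(size=50); lam = 0.1
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda x, step: np.sign(x) * np.maximum(np.abs(x) - lam * step, 0.0)
L = np.linalg.norm(A, 2) ** 2               # Lipschitz constant of grad f
x_hat = fista(grad_f, prox_g, np.zeros(100), L)
print(np.count_nonzero(np.abs(x_hat) > 1e-8), "nonzero coefficients")
```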

Relevance:

20.00%

Publisher:

Abstract:

Kansas State University, with funding from the Kansas Department of Transportation (KDOT), has developed a computerized reduction system for profilograms produced by mechanical profilographs. The commercial version of the system (ProScan (trademark)) is marketed by Devore Systems, Inc. The system consists of an IBM-compatible PC (486SX33 or better), an Epson LQ-570 printer, a Logitech Scanman 32 hand scanner, a paper transport unit, and the ProScan software. The scanner is not compatible with IBM computers that use Micro Channel architecture. The Iowa DOT Transportation Centers could realize the following advantages by using ProScan: (1) save about 5 to 8 staff hours of reduction and reporting time per Transportation Center per week for a Materials Technician 3 or 4 (the time savings would come during the busiest part of the season); (2) reduce errors in the reduction, transfer, and typing of profile values; (3) increase the accuracy of the monitor results; and (4) allow rapid evaluation of contractor traces when tolerance limits between monitor and certified results are exceeded.

Relevance:

20.00%

Publisher:

Abstract:

This project examines similarities and differences between the automated condition data collected on and off county paved roads and the manual condition data collected by Iowa Department of Transportation (DOT) staff in 2000 and 2001. Also, the researchers will provide staff support to the advisory committee in exploring other options for the highway needs process. The results show that the automated condition data can be used in a converted highway needs process with no major differences between the two methods. Even though the foundation rating difference was significant, the foundation rating weighting factor in HWYNEEDS is minimal and should not have a major impact. In terms of RUTF formula-based distribution, the results clearly show the superiority of the condition-based analysis over the non-condition-based one. That correlation can be further enhanced by adding more distress variables to the analysis.

Relevance:

20.00%

Publisher:

Abstract:

Normal and abnormal brains can be segmented by registering the target image with an atlas. Here, an atlas is defined as the combination of an intensity image (template) and its segmented image (the atlas labels). After registering the atlas template and the target image, the atlas labels are propagated to the target image. We define this process as atlas-based segmentation. In recent years, researchers have investigated registration algorithms to match atlases to query subjects, as well as strategies for atlas construction. In this paper we present a review of the automated approaches for atlas-based segmentation of magnetic resonance brain images. We aim to point out the strengths and weaknesses of atlas-based methods and suggest new research directions. We use two different criteria to present the methods. First, we refer to the algorithms according to their atlas-based strategy: label propagation, multi-atlas methods, and probabilistic techniques. Subsequently, we classify the methods according to their medical target: the brain and its internal structures, tissue segmentation in healthy subjects, tissue segmentation in fetuses, neonates and elderly subjects, and segmentation of damaged brains. A quantitative comparison of the results reported in the literature is also presented.
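
As a concrete instance of the "multi-atlas methods" category, the sketch below fuses labels propagated from several registered atlases by voxel-wise majority voting, the simplest fusion rule covered by such reviews. The toy label arrays are illustrative; a real pipeline would first warp each atlas's labels into the target space.

```python
import numpy as np

def majority_vote_fusion(propagated_labels):
    """Voxel-wise majority vote over labels propagated from several atlases.

    propagated_labels: integer array of shape (n_atlases, *image_shape),
    each slice being one atlas's labels already warped to the target space.
    (Loop-based for clarity; fine for a toy, slow for full volumes.)
    """
    n_atlases = propagated_labels.shape[0]
    flat = propagated_labels.reshape(n_atlases, -1)
    fused = np.array([np.bincount(col).argmax() for col in flat.T])
    return fused.reshape(propagated_labels.shape[1:])

# Toy example: three "atlases" voting on a 2x2 label image.
labels = np.array([[[1, 0], [2, 2]],
                   [[1, 1], [2, 0]],
                   [[1, 0], [0, 2]]])
print(majority_vote_fusion(labels))   # -> [[1 0]
                                      #     [2 2]]
```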

Relevance:

20.00%

Publisher:

Abstract:

In a multicenter study, a new fully automated Roche Diagnostics Elecsys HBsAg II screening assay, with improved sensitivity for HBsAg mutant detection, was compared to well-established HBsAg tests: AxSYM HBsAg V2 (Abbott), Architect HBsAg (Abbott), Advia Centaur HBsAg (Bayer), Enzygnost HBsAg 5.0 (Dade-Behring), and Vitros Eci HBsAg (Ortho). A total of 16 seroconversion panels, samples of 60 native HBsAg mutants and 31 recombinant HBsAg mutants, dilution series of NIBSC and PEI standards, 156 HBV positive samples comprising genotypes A to G, 686 preselected HBsAg positive samples from different stages of infection, 3,593 samples from daily routine, and 6,360 unselected blood donations were tested to evaluate the analytical and clinical sensitivity, the detection of mutants, and the specificity of the new assay. Elecsys HBsAg II showed statistically significantly better sensitivity on the seroconversion panels than the comparison tests. Fifty-seven of the 60 native mutants and all recombinant mutants were found positive. Among the 156 HBV samples with different genotypes and 696 preselected HBsAg positive samples, Elecsys HBsAg II achieved a sensitivity of 100%. The lower detection limit for the NIBSC standard was calculated to be 0.025 IU/ml, and for the PEI standards ad and ay it was <0.001 and <0.005 U/ml, respectively. Within 2,724 daily routine specimens and 6,360 unselected blood donations, Elecsys HBsAg II showed specificities of 99.97% and 99.88%, respectively. In conclusion, the new Elecsys HBsAg II shows high sensitivity for the detection of all stages of HBV infection and of HBsAg mutants, paired with high specificity in blood donors, daily routine samples, and potentially interfering sera.

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes a very fast method for blindly approximating a nonlinear mapping that transforms a sum of random variables. The estimation is surprisingly good even when the basic assumption is not satisfied. We use the method to provide a good initialization for inverting post-nonlinear mixtures and Wiener systems. Experiments show that the algorithm's speed is strongly improved and its asymptotic performance preserved, at very low extra computational cost.
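
A common way to realize such a fast blind initialization, when the mapped variable is approximately Gaussian (e.g. a sum of many independent sources), is Gaussianization: map the empirical CDF of the observation onto Gaussian quantiles to undo a monotone nonlinearity up to scale. The sketch below is written under that assumption and is not necessarily the authors' exact procedure.

```python
import numpy as np
from scipy.stats import norm

def gaussianize(observed):
    """Estimate the inverse nonlinearity by Gaussianization.

    If s is roughly Gaussian and x = f(s) with f monotone, mapping the
    empirical CDF of x to Gaussian quantiles recovers s up to scale.
    """
    ranks = np.argsort(np.argsort(observed))   # 0..n-1 rank of each sample
    u = (ranks + 0.5) / len(observed)          # empirical CDF values in (0, 1)
    return norm.ppf(u)                         # Gaussian quantile transform

rng = np.random.default_rng(5)
s = rng.normal(size=2000)                      # ~Gaussian "sum of sources"
x = np.tanh(s) + 0.7 * s**3                    # unknown monotone distortion
s_hat = gaussianize(x)
print(np.corrcoef(s, s_hat)[0, 1])             # close to 1: distortion undone
```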

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we present a comprehensive study of different Independent Component Analysis (ICA) algorithms for the calculation of the coherency and sharpness of electroencephalogram (EEG) signals, in order to investigate the possibility of early detection of Alzheimer's disease (AD). We found that ICA algorithms can help with artifact rejection and noise reduction, improving the discriminative power of features in high-frequency bands (especially in the high alpha and beta ranges). In addition to comparing different ICA algorithms, we investigate the optimum number of selected components, to inform component-selection decisions in future work.
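
A minimal sketch of the ICA-based artifact-rejection step mentioned above: unmix toy multichannel "EEG", zero the component flagged as an artifact, and project back to channel space before any coherency/sharpness features are computed. The channel count, sources, and peak-to-std artifact heuristic are all assumptions, not the paper's pipeline.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Toy stand-in for multichannel EEG: two "brain" rhythms plus an artifact
# (e.g. eye blink) mixed into 4 channels.
rng = np.random.default_rng(6)
t = np.linspace(0, 10, 2560)                       # 256 Hz, 10 s
alpha_wave = np.sin(2 * np.pi * 10 * t)            # ~10 Hz alpha rhythm
beta_wave = np.sin(2 * np.pi * 20 * t)             # ~20 Hz beta rhythm
blink = (np.abs(t % 2 - 1) < 0.05).astype(float)   # spiky periodic artifact
S = np.c_[alpha_wave, beta_wave, blink]
X = S @ rng.normal(size=(3, 4))                    # mix into 4 "channels"

ica = FastICA(n_components=3, random_state=0, max_iter=1000)
components = ica.fit_transform(X)                  # unmixed sources

# Flag the spikiest component as the artifact (crude peak-to-std heuristic),
# zero it, and reconstruct the cleaned channel-space signals.
idx = np.argmax([np.max(np.abs(c)) / np.std(c) for c in components.T])
components[:, idx] = 0.0
X_clean = ica.inverse_transform(components)
print(X_clean.shape)                               # (2560, 4) cleaned channels
```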