998 results for Diagnostic Algorithms
Abstract:
As digital image processing techniques are used in an increasingly broad range of consumer applications, developers have come to recognise algorithm performance evaluation as an area of vital importance. With digital image processing algorithms now playing a greater role in security and protection applications, it is crucial that their performance can be studied empirically. Outside the field of biometrics, little emphasis has been placed on algorithm performance evaluation until now, and where evaluation has taken place, it has been carried out in a cumbersome and unsystematic fashion, without any standardised approach. This paper presents a comprehensive testing methodology and framework aimed at automating the evaluation of image processing algorithms. Ultimately, the test framework aims to shorten the algorithm development life cycle by helping to identify algorithm performance problems more quickly and efficiently.
Abstract:
Magdeburg, Univ., Faculty of Mathematics, habilitation thesis, 2006
Abstract:
Gastrointestinal cancers, HCC, ectopeptidases, differential display, gasdermin-like
Abstract:
Background: Myocardial perfusion scintigraphy (MPS) in patients not reaching 85% of the maximum predicted heart rate (MPHR) has reduced sensitivity.
Objectives: In an attempt to maintain diagnostic sensitivity without losing functional exercise data, a new exercise and dipyridamole combined protocol (EDCP) was developed. Our aim was to evaluate the feasibility and safety of this protocol and to compare its diagnostic sensitivity against standard exercise and dipyridamole protocols.
Methods: In patients not achieving a sufficient exercise (SE) test and with no contraindications, 0.56 mg/kg of dipyridamole was administered intravenously over 1 minute simultaneously with exercise, followed by 99mTc-MIBI injection.
Results: Of 155 patients, 41 had MPS with EDCP, 47 had an SE test (≥ 85% MPHR) and 67 underwent the dipyridamole-alone test (DIP). All underwent coronary angiography within 3 months. The sensitivity of the three stress methods for the diagnosis of coronary lesions was compared. For stenosis ≥ 70%, EDCP yielded 97% sensitivity, SE 90% and DIP 95% (p = 0.43). For lesions ≥ 50%, the sensitivities were 94%, 88% and 95%, respectively (p = 0.35). Side effects of EDCP were present in only 12% of the patients, significantly fewer than with DIP (p < 0.001).
Conclusions: The proposed combined protocol is a valid and safe method that yields adequate diagnostic sensitivity, preserving exercise prognostic information in patients unable to reach the target heart rate, with fewer side effects than DIP.
Abstract:
Data Mining, Vision Restoration, Treatment outcome prediction, Self-Organising-Map
Abstract:
Background: Guidelines recommend that in suspected stable coronary artery disease (CAD), a clinical (non-invasive) evaluation should be performed before coronary angiography.
Objective: We assessed the efficacy of patient selection for coronary angiography in suspected stable CAD.
Methods: We prospectively selected consecutive patients without known CAD referred to a high-volume tertiary center. Demographic characteristics, risk factors, symptoms and non-invasive test results were correlated with the presence of obstructive CAD. We estimated the CAD probability based on available clinical data and the incremental diagnostic value of previous non-invasive tests.
Results: A total of 830 patients were included; the median age was 61 years, 49.3% were male, 81% had hypertension and 35.5% were diabetic. Non-invasive tests were performed in 64.8% of the patients. At coronary angiography, 23.8% of the patients had obstructive CAD. The independent predictors of obstructive CAD were: male gender (odds ratio [OR] 3.95; 95% confidence interval [CI] 2.70-5.77), age (OR per 5-year increment 1.15; 95% CI 1.06-1.26), diabetes (OR 2.01; 95% CI 1.40-2.90), dyslipidemia (OR 2.02; 95% CI 1.32-3.07), typical angina (OR 2.92; 95% CI 1.77-4.83) and a previous non-invasive test (OR 1.54; 95% CI 1.05-2.27).
Conclusions: In this study, fewer than a quarter of the patients referred for coronary angiography with suspected CAD had the diagnosis confirmed. Better clinical and non-invasive assessment is necessary to improve the efficacy of patient selection for coronary angiography.
Abstract:
This work describes a test tool that allows performance testing of different end-to-end available bandwidth estimation algorithms, along with their different implementations. The goal of such tests is to find the best-performing algorithm and implementation and to use it in the congestion control mechanism of high-performance reliable transport protocols. The main idea of this paper is to describe the options that provide an available bandwidth estimation mechanism for high-speed data transport protocols, and to develop the basic functionality of such a test tool, with which it will be possible to manage instances of the test application on all testing hosts involved, aided by some middleware.
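The comparison the abstract describes can be illustrated with a minimal sketch: a harness that runs several estimator implementations against a link with a known ground-truth available bandwidth and ranks them by mean absolute error. The estimator callables and the ground-truth value below are purely hypothetical stand-ins, not part of the tool described in the paper.

```python
import statistics

def score_estimators(estimators, true_abw_mbps, runs=5):
    """Return the mean absolute estimation error (Mbps) per implementation."""
    results = {}
    for name, estimate in estimators.items():
        errors = [abs(estimate() - true_abw_mbps) for _ in range(runs)]
        results[name] = statistics.mean(errors)
    return results

# Illustrative stand-ins for real estimator implementations.
demo = {
    "impl_a": lambda: 94.0,  # consistently close to the true value
    "impl_b": lambda: 80.0,  # biased low
}
scores = score_estimators(demo, true_abw_mbps=95.0)
best = min(scores, key=scores.get)  # the best-performing implementation
```

In a real deployment the callables would wrap the actual estimation tools running on the testing hosts, and the harness would also record probe overhead and convergence time, not just accuracy.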
Abstract:
In this paper we investigate various algorithms for performing the Fast Fourier Transform (FFT)/Inverse Fast Fourier Transform (IFFT), and proper techniques for maximizing FFT/IFFT execution speed, such as pipelining or parallel processing, and the use of memory structures with pre-computed values (look-up tables, LUTs) or other dedicated hardware components (usually multipliers). Furthermore, we discuss the optimal hardware architectures that best apply to various FFT/IFFT algorithms, along with their ability to exploit parallel processing with minimal data dependences in the FFT/IFFT calculations. An interesting approach also considered in this paper is the application of the integrated processing-in-memory Intelligent RAM (IRAM) chip to high-speed FFT/IFFT computing. The results of the assessment study emphasize that the execution speed of the FFT/IFFT algorithms is tightly connected to the ability of the FFT/IFFT hardware to support the parallelism provided by the given algorithm. Therefore, we suggest that the basic Discrete Fourier Transform (DFT)/Inverse Discrete Fourier Transform (IDFT) can also provide high performance, by utilizing a specialized FFT/IFFT hardware architecture that can exploit the parallelism provided by the DFT/IDFT operations. The proposed improvements include simplified multiplications over symbols given in a polar coordinate system, using sine and cosine look-up tables, and an approach for performing parallel addition of N input symbols.
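The LUT idea mentioned above can be sketched in software: a direct N-point DFT whose twiddle factors come from pre-computed sine and cosine tables, so no trigonometric function is evaluated inside the inner loop. This is an illustrative sketch of the general technique, not the hardware architecture proposed in the paper.

```python
import math

def make_twiddle_lut(N):
    # Pre-compute one period of cosine and sine; all twiddle factors
    # cos(2*pi*k*n/N) and sin(2*pi*k*n/N) are table entries at index (k*n) % N.
    cos_lut = [math.cos(2 * math.pi * k / N) for k in range(N)]
    sin_lut = [math.sin(2 * math.pi * k / N) for k in range(N)]
    return cos_lut, sin_lut

def dft_lut(x):
    """Direct DFT of a real sequence using pre-computed twiddle tables."""
    N = len(x)
    cos_lut, sin_lut = make_twiddle_lut(N)
    out = []
    for k in range(N):
        re = sum(x[n] * cos_lut[(k * n) % N] for n in range(N))
        im = -sum(x[n] * sin_lut[(k * n) % N] for n in range(N))
        out.append(complex(re, im))
    return out

# A single cosine cycle over N = 8 samples concentrates its energy
# in bins 1 and N-1 of the spectrum.
spectrum = dft_lut([math.cos(2 * math.pi * n / 8) for n in range(8)])
```

In hardware, the two tables would map to small ROMs, and the N per-bin sums are exactly the independent operations the abstract proposes to add in parallel.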
Abstract:
Some practical aspects of implementing genetic algorithms for the life cycle management of electrotechnical equipment are considered.
Abstract:
It is common to find in experimental data persistent oscillations in the aggregate outcomes and high levels of heterogeneity in individual behavior. Furthermore, it is not unusual to find significant deviations from aggregate Nash equilibrium predictions. In this paper, we employ an evolutionary model with boundedly rational agents to explain these findings. We use data from common property resource experiments (Casari and Plott, 2003). Instead of positing individual-specific utility functions, we model decision makers as selfish and identical. Agent interaction is simulated using an individual learning genetic algorithm, where agents have constraints in their working memory, a limited ability to maximize, and experiment with new strategies. We show that the model replicates most of the patterns that can be found in common property resource experiments.
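The learning mechanism described above can be sketched as follows: each selfish, identical agent keeps a small working memory of candidate strategies (appropriation levels), picks the one that scores best against the others' behavior, and occasionally experiments with a random new strategy. The payoff function and all parameters here are illustrative, not those calibrated to the Casari and Plott (2003) data.

```python
import random

def payoff(own, others_total, capacity=100):
    # Selfish common-pool-resource style payoff: private gain from
    # appropriation minus a congestion cost as total extraction rises.
    return own * (capacity - own - others_total) / capacity

def learn(agent_memory, others_total, mutate_p=0.1, rng=random):
    """One learning step: pick the best remembered strategy; with
    probability mutate_p, overwrite a random memory slot (experimentation)."""
    best = max(agent_memory, key=lambda s: payoff(s, others_total))
    if rng.random() < mutate_p:
        agent_memory[rng.randrange(len(agent_memory))] = rng.randint(0, 50)
    return best

# A single agent with a 4-slot working memory, no experimentation this step.
memory = [10, 20, 30, 40]
choice = learn(memory, others_total=40, mutate_p=0.0)
```

Because each agent evaluates only the few strategies it remembers, behavior stays heterogeneous across agents and aggregate play can oscillate instead of settling at the Nash prediction, which is the pattern the abstract reports.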
Abstract:
"Vegeu el resum a l'inici del fitxer adjunt."
Abstract:
Purpose: To investigate the accuracy of 4 clinical instruments in the detection of glaucomatous damage.
Methods: 102 eyes of 55 test subjects (mean age = 66.5 years, range = [39; 89]) underwent Heidelberg Retinal Tomography (HRT III) (disc area < 2.43) and standard automated perimetry (SAP) using Octopus (Dynamic), Pulsar (TOP) and the Moorfields Motion Displacement Test (MDT) (ESTA strategy). Eyes were separated into three groups: 1) Healthy (H): IOP < 21 mmHg and healthy discs on clinical examination; 39 subjects, 78 eyes. 2) Glaucoma suspect (GS): suspicious discs on clinical examination; 12 subjects, 15 eyes. 3) Glaucoma (G): progressive structural or functional loss; 14 subjects, 20 eyes. Clinical diagnostic precision was examined using the cut-off associated with the p < 5% normative limit of MD (Octopus/Pulsar), PTD (MDT) and MRA (HRT) analysis. The sensitivity, specificity and accuracy were calculated for each instrument.
Results: See table.
Conclusions: Despite the advantage of defining glaucoma suspects using clinical optic disc examination, the HRT did not yield significantly higher accuracy than the functional measures. HRT, MDT and Octopus SAP yielded higher accuracy than Pulsar perimetry, although the results did not reach statistical significance. Further studies are required to investigate the structure-function correlations between these instruments.
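The three diagnostic measures reported per instrument are standard functions of a 2x2 confusion matrix. A minimal sketch, using made-up counts rather than the study's actual table:

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity and accuracy from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)           # positives detected among diseased eyes
    specificity = tn / (tn + fp)           # negatives detected among healthy eyes
    accuracy = (tp + tn) / (tp + fn + tn + fp)  # overall proportion correct
    return sensitivity, specificity, accuracy

# Illustrative counts only (e.g. 20 glaucomatous and 78 healthy eyes tested).
sens, spec, acc = diagnostic_metrics(tp=18, fn=2, tn=70, fp=8)
```

Note that with the study's unbalanced groups (78 healthy vs. 20 glaucoma eyes), overall accuracy is dominated by specificity, which is one reason sensitivity and specificity are reported separately per instrument.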