952 results for evaluation algorithm
Abstract:
Decision-tree induction algorithms represent one of the most popular techniques for dealing with classification problems. However, traditional decision-tree induction algorithms implement a greedy approach for node splitting that is inherently susceptible to convergence to local optima. Evolutionary algorithms can avoid the problems associated with a greedy search and have been successfully employed in the induction of decision trees. Previously, we proposed a lexicographic multi-objective genetic algorithm for decision-tree induction, named LEGAL-Tree. In this work, we propose to extend this approach substantially, particularly w.r.t. two important evolutionary aspects: the initialization of the population and the fitness function. We carry out a comprehensive set of experiments to validate our extended algorithm. The experimental results suggest that it outperforms both traditional algorithms for decision-tree induction and another evolutionary algorithm in a variety of application domains.
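A minimal sketch of the lexicographic fitness comparison at the heart of a LEGAL-Tree-style algorithm: objectives are ranked by priority, and a lower-priority objective only decides between two individuals when all higher-priority ones are tied within a tolerance. The objective names, ordering, and tolerance values below are illustrative assumptions, not the paper's actual configuration.

```python
# A sketch of lexicographic fitness comparison for a genetic algorithm.
# Objectives, their ordering, and tolerances are illustrative assumptions.

def lexicographic_better(a, b, objectives, tolerances):
    """Return True if individual `a` beats `b` lexicographically.

    `objectives` is a priority-ordered list of (name, maximize) pairs;
    `tolerances` maps each name to the minimum difference considered
    significant at that priority level.
    """
    for name, maximize in objectives:
        diff = a[name] - b[name]
        if not maximize:
            diff = -diff
        if abs(diff) > tolerances[name]:
            return diff > 0          # decided at this priority level
        # difference within tolerance: fall through to next objective
    return False                      # effectively tied on all objectives

# Example: prioritize validation accuracy, break near-ties by tree size.
objectives = [("accuracy", True), ("tree_size", False)]
tolerances = {"accuracy": 0.01, "tree_size": 0}
a = {"accuracy": 0.913, "tree_size": 41}
b = {"accuracy": 0.909, "tree_size": 25}
print(lexicographic_better(a, b, objectives, tolerances))  # False: near-tie on accuracy, b is smaller
```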
Abstract:
The scaling down of transistor technology allows microelectronics manufacturers such as Intel and IBM to build ever more sophisticated systems on a single microchip. Classical interconnection solutions based on shared buses or direct connections between the modules of the chip are becoming obsolete, as they struggle to sustain the increasingly tight bandwidth and latency constraints that these systems demand. The most promising solution for future chip interconnects is the Network on Chip (NoC). NoCs are networks composed of routers and channels used to interconnect the different components installed on a single microchip. Examples of advanced processors based on NoC interconnects are the IBM Cell processor, composed of eight CPUs, which is installed in the Sony PlayStation 3, and the Intel Teraflops project, composed of 80 independent (simple) microprocessors. On-chip integration is becoming popular not only in the Chip Multi Processor (CMP) research area but also in the wider and more heterogeneous world of Systems on Chip (SoC). SoCs comprise all the electronic devices that surround us, such as cell phones, smartphones, home embedded systems, automotive systems, set-top boxes, etc. SoC manufacturers such as STMicroelectronics, Samsung, and Philips, as well as universities such as Bologna University, M.I.T., and Berkeley, are all proposing proprietary frameworks based on NoC interconnects. These frameworks help engineers switch design methodologies and speed up the development of new NoC-based systems on chip. In this Thesis we propose an introduction to CMP and SoC interconnection networks. Then, focusing on SoC systems, we propose:
• a detailed simulation-based analysis of the Spidergon NoC, an STMicroelectronics solution for SoC interconnects. The Spidergon NoC differs from many classical solutions inherited from the parallel computing world. Here we propose a detailed analysis of this NoC topology and its routing algorithms (a minimal sketch of routing on this topology follows this abstract). Furthermore, we propose Equalized, a new routing algorithm designed to optimize the use of the resources of the network while also increasing its performance;
• a methodology flow based on modified publicly available tools that, combined, can be used to design, model, and analyze any kind of System on Chip;
• a detailed analysis of an STMicroelectronics-proprietary transport-level protocol that the author of this Thesis helped to develop;
• a simulation-based comprehensive comparison of different network interface designs, proposed by the author and the researchers at the AST lab, to integrate shared-memory and message-passing based components on a single System on Chip;
• a powerful and flexible solution to address the timing closure exception issue in the design of synchronous Networks on Chip. Our solution is based on relay-station repeaters and allows the power and area demands of NoC interconnects to be reduced while also reducing their buffer needs;
• a solution to simplify the design of NoCs while also increasing their performance and reducing their power and area consumption. We propose to replace complex and slow virtual channel-based routers with multiple, flexible, small Multi Plane ones. This solution allows us to reduce the area and power dissipation of any NoC while also increasing its performance, especially when resources are scarce.
This Thesis has been written in collaboration with the Advanced System Technology laboratory in Grenoble, France, and the Computer Science Department at Columbia University in the City of New York.
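As referenced in the first contribution above, here is a minimal sketch of minimal-path routing on a Spidergon topology, assuming its usual structure: N nodes on a ring with clockwise, counter-clockwise, and "across" links to the diametrically opposite node. It illustrates the topology only; it is not the Equalized algorithm proposed in the thesis.

```python
# A sketch of shortest-path routing on a Spidergon NoC topology.
# Assumes the standard structure: ring links in both directions plus an
# across link; the decision threshold n // 4 is the usual minimal-path rule.

def spidergon_next_hop(current, dest, n):
    """Return the next node on a minimal route from `current` to `dest`."""
    assert n % 2 == 0, "Spidergon is defined for an even number of nodes"
    cw = (dest - current) % n            # hops if we only go clockwise
    if cw == 0:
        return current                   # already there
    ccw = n - cw                         # hops if we only go counter-clockwise
    # Taking the across link first pays off when the ring distance
    # exceeds a quarter of the ring in either direction.
    if min(cw, ccw) > n // 4:
        return (current + n // 2) % n    # across link
    if cw <= ccw:
        return (current + 1) % n         # clockwise link
    return (current - 1) % n             # counter-clockwise link

# Trace a packet through a 16-node Spidergon from node 0 to node 7.
node, path = 0, [0]
while node != 7:
    node = spidergon_next_hop(node, 7, 16)
    path.append(node)
print(path)  # [0, 8, 7]
```

The across links keep the network diameter low while the ring keeps the router degree at three, which is one reason this topology is attractive for SoC layouts.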
Abstract:
Myocardial perfusion quantification by means of Contrast-Enhanced Cardiac Magnetic Resonance images relies on time-consuming frame-by-frame manual tracing of regions of interest. In this Thesis, a novel automated technique for myocardial segmentation and non-rigid registration as a basis for perfusion quantification is presented. The proposed technique is based on three steps: reference frame selection, myocardial segmentation, and non-rigid registration. In the first step, the reference frame in which both endo- and epicardial segmentation will be performed is chosen. Endocardial segmentation is achieved by means of a statistical region-based level-set technique followed by a curvature-based regularization motion. Epicardial segmentation is achieved by means of an edge-based level-set technique followed again by a regularization motion. To take into account the changes in position, size, and shape of the myocardium throughout the sequence due to out-of-plane respiratory motion, a non-rigid registration algorithm is required. The proposed non-rigid registration scheme consists of a novel multiscale extension of the normalized cross-correlation algorithm in combination with level-set methods. The myocardium is then divided into standard segments. Contrast enhancement curves are computed by measuring the mean pixel intensity of each segment over time, and perfusion indices are extracted from each curve. The overall approach has been tested on synthetic and real datasets. For validation purposes, the sequences have been manually traced by an experienced interpreter, and contrast enhancement curves as well as perfusion indices have been computed. Comparisons between automatically extracted and manually obtained contours and enhancement curves showed high inter-technique agreement. Comparisons of perfusion indices computed using both approaches against quantitative coronary angiography and visual interpretation demonstrated that the two techniques have similar diagnostic accuracy. In conclusion, the proposed technique allows fast, automated, and accurate measurement of intra-myocardial contrast dynamics, and may thus address the strong clinical need for quantitative evaluation of myocardial perfusion.
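A minimal sketch of the normalized cross-correlation (NCC) matching that underlies the registration scheme described above. The exhaustive search and the two-level coarse-to-fine note below are illustrative simplifications; the thesis's multiscale extension and its combination with level sets are not reproduced here.

```python
# A sketch of NCC template matching, the building block of
# cross-correlation-based registration. Window sizes are illustrative.

import numpy as np

def ncc(patch, template):
    """Normalized cross-correlation between two equally sized arrays."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    return (p * t).sum() / denom if denom > 0 else 0.0

def best_match(image, template):
    """Exhaustive NCC search: return (row, col) of the best placement."""
    th, tw = template.shape
    ih, iw = image.shape
    best, best_pos = -np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            score = ncc(image[r:r + th, c:c + tw], template)
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos

# Coarse-to-fine idea: match on a downsampled image first, then refine
# in a small neighbourhood at full resolution (omitted here).
rng = np.random.default_rng(0)
image = rng.random((64, 64))
template = image[20:30, 35:45].copy()
print(best_match(image, template))  # (20, 35)
```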
Abstract:
Photovoltaic (PV) solar panels generally produce electricity in the 6% to 16% efficiency range, the rest being dissipated as thermal losses. To recover this energy, hybrid photovoltaic-thermal (PVT) systems have been devised: devices that simultaneously convert solar energy into electricity and heat. It is thus interesting to study the PVT system globally from different points of view in order to evaluate the advantages and disadvantages of this technology and its possible uses. In Chapter II, a numerical optimization of the PVT absorber by a genetic algorithm is carried out, analyzing different internal channel profiles in order to find the right compromise between performance and technical and economic feasibility. In Chapter III, thanks to a mobile structure built at the university lab, the electrical and thermal output power of PVT panels is compared experimentally with separate photovoltaic and solar-thermal production. By collecting a large amount of experimental data under different seasonal conditions (ambient temperature, irradiation, wind, ...), this mobile structure is used to evaluate the average gains and losses in both thermal and electrical efficiency with respect to separate production over the year. In Chapter IV, new equation-based PVT and solar-thermal models in steady-state conditions are developed with the Dymola software, which uses the Modelica language. This makes it possible, in a simpler way than with previous system-modelling software, to model and evaluate different concepts of the PVT panel structure before prototyping and measuring it. Chapter V concerns the definition of PVT boundary conditions within an HVAC system. This was done through year-long simulations with the Polysun software in order to assess the best solar-assisted integrated structure by means of the F_save (solar energy saving) factor. Finally, Chapter VI presents the conclusions and perspectives of this PhD work.
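A minimal sketch of the genetic-algorithm loop behind the Chapter II absorber optimization. The chromosome (channel width, height, and spacing) and the toy fitness function trading thermal gain against a cost penalty are illustrative assumptions, not the thesis's model.

```python
# A sketch of a genetic algorithm over channel-profile parameters.
# Bounds, fitness, and GA settings are illustrative assumptions.

import random
random.seed(1)

BOUNDS = [(2.0, 20.0), (1.0, 10.0), (5.0, 50.0)]  # mm: width, height, spacing

def fitness(ch):
    width, height, spacing = ch
    heat_transfer = width * height / spacing        # toy thermal gain
    cost = 0.1 * (width + height) + 50.0 / spacing  # toy manufacturing penalty
    return heat_transfer - cost

def random_individual():
    return [random.uniform(lo, hi) for lo, hi in BOUNDS]

def crossover(a, b):
    # Uniform crossover: each gene comes from either parent.
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(ch, rate=0.2):
    # Gaussian perturbation, clamped to the feasible bounds.
    return [min(hi, max(lo, g + random.gauss(0, 0.5))) if random.random() < rate else g
            for g, (lo, hi) in zip(ch, BOUNDS)]

pop = [random_individual() for _ in range(30)]
for gen in range(50):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]                                 # truncation selection
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(20)]
best = max(pop, key=fitness)
print([round(g, 2) for g in best], round(fitness(best), 2))
```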
Abstract:
Complex network analysis is a very popular topic in computer science. Unfortunately, these networks, extracted from different contexts, are usually very large, and their analysis may be very complicated: computing metrics on these structures can be very complex. Among all metrics, we analyse the extraction of subnetworks called communities: groups of nodes that probably play the same role within the whole structure. Community extraction is an interesting operation in many different fields (biology, economics, ...). In this work we present a parallel community detection algorithm that can operate on networks with huge numbers of nodes and edges. After an introduction to graph theory and high-performance computing, we explain our design strategies and our implementation. Then, we show a performance evaluation carried out on a distributed-memory architecture, namely the IBM BlueGene/Q "Fermi" supercomputer at the CINECA supercomputing center, Italy, and we comment on our results.
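For illustration, a minimal sketch of label-propagation community detection, one of the simplest algorithms that maps naturally onto the distributed-memory setting described above. This single-process version is an assumption-based example; the thesis's actual algorithm and its parallel decomposition are not reproduced here.

```python
# A sketch of label-propagation community detection on a small graph.
# The graph and round limit are illustrative assumptions.

from collections import Counter
import random
random.seed(0)

def label_propagation(adj, rounds=20):
    """adj: dict node -> list of neighbours. Returns node -> community label."""
    labels = {v: v for v in adj}          # every node starts in its own community
    nodes = list(adj)
    for _ in range(rounds):
        random.shuffle(nodes)             # randomized order reduces oscillation
        changed = False
        for v in nodes:
            if not adj[v]:
                continue
            counts = Counter(labels[u] for u in adj[v])
            best = max(counts, key=counts.get)   # most frequent neighbour label
            if best != labels[v]:
                labels[v], changed = best, True
        if not changed:
            break                          # converged
    return labels

# Two triangles joined by a single bridge edge.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3],
       3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
print(label_propagation(adj))  # nodes 0-2 and 3-5 typically share a label each
```

In an MPI decomposition, each rank would own a partition of the vertices and exchange boundary labels at the end of every round.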
Abstract:
This thesis aimed at addressing some of the issues that, at the state of the art, prevent P300-based brain-computer interface (BCI) systems from moving from research laboratories to end users' homes. An innovative asynchronous classifier has been defined and validated. It relies on the introduction of a set of thresholds in the classifier; these thresholds have been assessed considering the distributions of score values relating to target stimuli, non-target stimuli, and epochs of voluntary no-control. With the asynchronous classifier, a P300-based BCI system can adapt its speed to the current state of the user and can automatically suspend control when the user diverts attention from the stimulation interface. Since EEG signals are non-stationary and show inherent variability, in order to make long-term use of BCI possible it is important to track changes in ongoing EEG activity and to adapt the BCI model parameters accordingly. To this aim, the asynchronous classifier has been subsequently improved by introducing a self-calibration algorithm for the continuous and unsupervised recalibration of the subjective control parameters. Finally, an index for the online monitoring of EEG quality has been defined and validated in order to detect potential problems and system failures. The thesis ends with the description of a translational work involving end users (people with amyotrophic lateral sclerosis, ALS). Following the concepts of the user-centered design approach, the phases relating to the design, development, and validation of an innovative assistive device are described. The proposed assistive technology (AT) has been specifically designed to meet the needs of people with ALS during the different phases of the disease (i.e., different degrees of motor impairment). Indeed, the AT can be accessed with several input devices, either conventional (mouse, touchscreen) or alternative (switches, head tracker), up to a P300-based BCI.
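A minimal sketch of the thresholded decision rule behind an asynchronous P300 classifier: scores are accumulated per stimulus and compared against thresholds so that the system can also output "no control". The score model and threshold semantics below are illustrative assumptions, not the validated classifier.

```python
# A sketch of an asynchronous (thresholded) P300 decision rule.
# Threshold values and the margin criterion are illustrative assumptions.

def asynchronous_decision(scores, t_target, t_nocontrol):
    """scores: dict stimulus -> accumulated classifier score.

    Returns the selected stimulus, or None for voluntary no-control
    (control is suspended rather than a selection being forced).
    """
    best = max(scores, key=scores.get)
    ordered = sorted(scores.values(), reverse=True)
    margin = ordered[0] - ordered[1] if len(ordered) > 1 else ordered[0]
    if ordered[0] < t_nocontrol:
        return None                     # all scores low: user not attending
    if margin < t_target:
        return None                     # ambiguous: keep accumulating epochs
    return best

scores = {"A": 4.2, "B": 1.1, "C": 0.7}
print(asynchronous_decision(scores, t_target=1.5, t_nocontrol=2.0))  # 'A'
print(asynchronous_decision({"A": 1.0, "B": 0.9}, 1.5, 2.0))         # None
```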
Abstract:
Tandem mass spectral libraries are gaining more and more importance for the identification of unknowns in different fields of research, including metabolomics, forensics, toxicology, and environmental analysis. In particular, the recent introduction of reliable, robust, and transferable libraries has increased the general acceptance of these tools. Herein, we report results obtained from a thorough evaluation of the match reliability of two tandem mass spectral libraries: the MSforID library established by the Oberacher group in Innsbruck and the Weinmann library established by the Weinmann group in Freiburg. Three different experiments were performed: (1) spectra of the libraries were searched against their corresponding library after excluding either the single compound-specific spectrum or all compound-specific spectra prior to searching; (2) the libraries were searched against each other, using either library as reference set or sample set; (3) spectra acquired on different mass spectrometric instruments were matched to both libraries. Almost 13,000 tandem mass spectra were included in this study. The MSforID search algorithm was used for spectral matching. Statistical evaluation of the library search results revealed that in principle both libraries enable the sensitive and specific identification of compounds. Due to the higher mass accuracy of the QqTOF instrument compared with the QTrap, matches to the MSforID library were more reliable when comparing spectra with both libraries. Furthermore, only the MSforID library was shown to be efficiently transferable to different kinds of tandem mass spectrometers, including "tandem-in-time" instruments; this is due to its coverage of a large range of different collision energy settings, including the very low range, which is an outstanding characteristic of the MSforID library.
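A minimal sketch of the leave-one-out procedure of experiment (1): each spectrum is searched against the library with itself excluded, and the top hit counts as correct if it names the same compound. The compounds, spectra, and `toy_match` similarity below are made-up placeholders; the MSforID scoring function is not reproduced here.

```python
# A sketch of leave-one-spectrum-out library evaluation.
# Library entries and the similarity function are illustrative assumptions.

def leave_one_out_sensitivity(library, match):
    """library: list of (compound_id, spectrum) pairs."""
    correct = 0
    for i, (cid, spec) in enumerate(library):
        reference = [entry for j, entry in enumerate(library) if j != i]
        best_cid = max(reference, key=lambda e: match(spec, e[1]))[0]
        correct += (best_cid == cid)
    return correct / len(library)

def toy_match(a, b):
    """Toy similarity: intensity product summed over shared m/z peaks."""
    return sum(a[mz] * b[mz] for mz in a.keys() & b.keys())

lib = [("caffeine", {195: 1.0, 138: 0.6}),
       ("caffeine", {195: 0.9, 138: 0.7, 110: 0.2}),
       ("cocaine",  {304: 1.0, 182: 0.8})]
print(round(leave_one_out_sensitivity(lib, toy_match), 2))  # 0.67
```

The stricter variant, which excludes all spectra of the query compound, makes a correct top hit impossible by construction; comparing the score distributions of the two runs is one way to separate sensitivity from specificity.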
Abstract:
Ninety strains from a collection of well-identified clinical isolates of gram-negative nonfermentative rods collected over a period of 5 years were evaluated using the new colorimetric VITEK 2 card. The VITEK 2 colorimetric system identified 53 (59%) of the isolates to the species level and 9 (10%) to the genus level; 28 (31%) isolates were misidentified. An algorithm combining the colorimetric VITEK 2 card and 16S rRNA gene sequencing for the adequate identification of gram-negative nonfermentative rods was developed. According to this algorithm, any identification by the colorimetric VITEK 2 card other than Achromobacter xylosoxidans, Acinetobacter sp., Burkholderia cepacia complex, Pseudomonas aeruginosa, and Stenotrophomonas maltophilia should be subjected to 16S rRNA gene sequencing when accurate identification of nonfermentative rods is of concern.
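The proposed two-step algorithm can be summarized in a few lines: accept the colorimetric VITEK 2 result only for the five taxa listed above; otherwise require 16S rRNA gene sequencing. A minimal sketch, with illustrative function names:

```python
# A sketch of the two-step identification algorithm described above.

RELIABLE_VITEK2_TAXA = {
    "Achromobacter xylosoxidans",
    "Acinetobacter sp.",
    "Burkholderia cepacia complex",
    "Pseudomonas aeruginosa",
    "Stenotrophomonas maltophilia",
}

def identify(vitek2_result):
    """Return (identification, method) for a nonfermentative rod."""
    if vitek2_result in RELIABLE_VITEK2_TAXA:
        return vitek2_result, "VITEK 2 colorimetric card"
    # Any other card result is considered unreliable for this group.
    return None, "16S rRNA gene sequencing required"

print(identify("Pseudomonas aeruginosa"))
print(identify("Ralstonia pickettii"))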
Abstract:
Routine bridge inspections require labor-intensive and highly subjective visual interpretation to determine bridge deck surface condition. Light Detection and Ranging (LiDAR), a relatively new class of survey instrument, has become a popular and increasingly used technology for providing as-built and inventory data in civil applications. While an increasing number of private and governmental agencies possess terrestrial and mobile LiDAR systems, an understanding of the technology's capabilities and potential applications continues to evolve. LiDAR is a line-of-sight instrument, and as such, care must be taken when establishing scan locations and resolution to allow the capture of data at a resolution adequate for defining features that contribute to the analysis of bridge deck surface condition. Information such as the location, area, and volume of spalling on deck surfaces, undersides, and support columns can be derived from properly collected LiDAR point clouds. These point clouds contain information that can provide quantitative surface condition data, resulting in more accurate structural health monitoring. LiDAR scans were collected at three study bridges, each of which displayed a varying degree of degradation. A variety of commercially available analysis tools and an independently developed algorithm written in ArcGIS Python (ArcPy) were used to locate and quantify surface defects such as the location, volume, and area of spalls. The results were visually and numerically displayed in a user-friendly web-based decision support tool integrating prior bridge condition metrics for comparison. LiDAR data processing procedures, along with the strengths and limitations of point clouds for defining features useful for assessing bridge deck condition, are discussed. Point cloud density and incidence angle are two attributes that must be managed carefully to ensure the data collected are of high quality and useful for bridge condition evaluation. When collected properly, LiDAR data can be analyzed to provide a useful data set from which to derive bridge deck condition information.
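A minimal sketch of one way surface defects can be flagged in a deck point cloud: fit a reference plane to the deck, then grid the points that fall more than a tolerance below it. The thresholds and the grid approach are illustrative assumptions, not the ArcPy workflow used in the study.

```python
# A sketch of plane-fit spall flagging on a deck-surface point cloud.
# Cell size, depth tolerance, and the toy data are illustrative assumptions.

import numpy as np

def detect_spall_cells(points, cell=0.05, depth_tol=0.01):
    """points: (N, 3) array in metres. Returns deepest deficit per grid cell."""
    # Least-squares plane z = ax + by + c fitted to all deck points.
    A = np.c_[points[:, 0], points[:, 1], np.ones(len(points))]
    coeff, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    residual = points[:, 2] - A @ coeff      # negative = below the deck plane

    spalls = {}
    for (x, y, _), r in zip(points, residual):
        if r < -depth_tol:                   # point sits below the plane
            key = (int(x // cell), int(y // cell))
            spalls[key] = min(spalls.get(key, 0.0), r)
    return spalls

# Area ~ flagged cells * cell**2; volume ~ sum of |depth| * cell**2.
rng = np.random.default_rng(2)
pts = rng.random((2000, 3)) * [10, 3, 0]     # flat 10 m x 3 m deck at z = 0
pts[:100, 2] -= 0.03                         # carve a 3 cm deep defect
print(len(detect_spall_cells(pts)), "cells flagged")
```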
Abstract:
In this study, the use of magnesium as a Hall thruster propellant was evaluated. A xenon Hall thruster was modified such that magnesium propellant could be loaded into the anode, using waste heat from the thruster discharge to drive propellant vaporization. A control scheme was developed which allowed precise control of the mass flow rate while still using plasma heating as the main mechanism for evaporation. The thruster anode, which also served as the propellant reservoir, was designed such that the open area was too low for sufficient vapor flow at normal operating temperatures (i.e., with plasma heating alone). The remaining heat needed to achieve enough vapor flow to sustain the thruster discharge came from a counter-wound resistive heater located behind the anode. The control system is able to arrest thermal runaway in a direct-evaporation feed system and stabilize the discharge current during voltage-limited operation. A proportional-integral-derivative control algorithm was implemented to enable automated operation of the mass flow control system, using the discharge current as the measured variable and the anode heater current as the controlled parameter. Steady-state operation at constant voltage with discharge current excursions of less than 0.35 A was demonstrated for 70 min. Using this long-duration method, stable operation was achieved with heater powers as low as 6% of the total discharge power. With the thermal mass flow control system, the thruster operated stably enough and long enough that performance measurements could be obtained and compared to the performance of the thruster using xenon propellant. When operated with magnesium, the thruster produced thrust ranging from 34 mN at 200 V to 39 mN at 300 V with 1.7 mg/s of propellant, and 27 mN of thrust at 300 V using 1.0 mg/s of propellant. The thrust-to-power ratio ranged from 24 mN/kW at 200 V to 18 mN/kW at 300 V. The specific impulse was 2000 s at 200 V and upwards of 2700 s at 300 V. The anode efficiency was found to be ~23% using magnesium, which is substantially lower than the 40% anode efficiency of xenon at approximately equivalent molar flow rates. Measurements in the plasma plume of the thruster, operated using magnesium and xenon propellants, were obtained using a Faraday probe to measure the off-axis current distribution, a retarding potential analyzer to measure ion energy, and a double Langmuir probe to measure plasma density, electron temperature, and plasma potential. Additionally, the off-axis current distributions and ion energy distributions were compared to measurements made in krypton and bismuth plasmas obtained in previous studies of the same thruster. The comparisons showed that magnesium had the largest beam divergence of the four propellants, while the others had similar divergence; they also showed that magnesium and krypton both had very low voltage utilization compared to xenon and bismuth. It is likely that the differences in plume structure are due to atomic differences between the propellants: the ionization mean free path decreases with increasing atomic mass. Magnesium and krypton have long ionization mean free paths and therefore require physically larger thruster dimensions for efficient operation and would benefit from magnetic shielding.
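A minimal sketch of the proportional-integral-derivative loop described above, with discharge current as the measured variable and anode heater current as the controlled parameter. The gains, output limits, and toy first-order plant model are illustrative assumptions, not the values used on the thruster.

```python
# A sketch of PID mass flow control: discharge current in, heater current out.
# All numeric values are illustrative assumptions.

class PID:
    def __init__(self, kp, ki, kd, setpoint, out_min=0.0, out_max=10.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, measured, dt):
        error = self.setpoint - measured          # discharge-current error (A)
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        return min(self.out_max, max(self.out_min, out))  # clamp heater current

# Toy first-order plant: discharge current relaxes toward a value set by
# heater current (vaporization lag folded into one time constant).
pid = PID(kp=2.0, ki=0.5, kd=0.1, setpoint=4.0)
discharge, dt = 3.0, 0.1
for step in range(300):                           # 30 s of simulated time
    heater = pid.update(discharge, dt)
    discharge += dt / 5.0 * (0.8 * heater - discharge)
print(round(discharge, 2))                        # settles near the 4.0 A setpoint
```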
Abstract:
Chronic diarrhea is defined as a decrease in fecal consistency lasting for four or more weeks. A myriad of disorders are associated with chronic diarrhea. In developed countries, chronic diarrhea is mostly caused by non-infectious diseases. There are four pathogenic mechanisms leading to chronic diarrhea: osmotic diarrhea, secretory diarrhea, inflammatory diarrhea, and dysmotility. Overlaps between these mechanisms are possible. A 72-hour fecal collection as well as the fasting test are important diagnostic tools for identifying the underlying pathomechanism. Identifying the pathomechanism narrows down the possible etiologies of chronic diarrhea and therefore allows a cost-saving diagnostic workup. Endoscopy is well established in the workup of chronic diarrhea. This article gives an overview of the main causes and mechanisms leading to chronic diarrhea and proposes an algorithm for the diagnostic evaluation.
Abstract:
The aim of this paper is to evaluate the diagnostic contribution of various types of texture features to the discrimination of hepatic tissue in abdominal non-enhanced Computed Tomography (CT) images. Regions of Interest (ROIs) corresponding to the classes normal liver, cyst, hemangioma, and hepatocellular carcinoma were drawn by an experienced radiologist. For each ROI, five distinct sets of texture features were extracted using First Order Statistics (FOS), the Spatial Gray Level Dependence Matrix (SGLDM), the Gray Level Difference Method (GLDM), Laws' Texture Energy Measures (TEM), and Fractal Dimension Measurements (FDM). In order to evaluate the ability of the texture features to discriminate the various types of hepatic tissue, each set of texture features, or its reduced version after genetic-algorithm-based feature selection, was fed to a feed-forward Neural Network (NN) classifier. For each NN, the area under the Receiver Operating Characteristic (ROC) curve (Az) was calculated for all one-vs-all discriminations of hepatic tissue. Additionally, the total Az for the multi-class discrimination task was estimated. The results show that features derived from FOS perform better than the other texture features (total Az: 0.802+/-0.083) in the discrimination of hepatic tissue.
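A minimal sketch of First Order Statistics (FOS) features, the best-performing set in this study, computed from the grey-level histogram of a ROI. The paper's exact feature list may differ; mean, variance, skewness, kurtosis, and entropy are a common choice.

```python
# A sketch of histogram-based FOS feature extraction from a ROI.
# The toy ROI and grey-level range are illustrative assumptions.

import numpy as np

def fos_features(roi, levels=256):
    """roi: 2-D array of grey levels in [0, levels). Returns a feature dict."""
    hist = np.bincount(roi.ravel(), minlength=levels).astype(float)
    p = hist / hist.sum()                  # grey-level probabilities
    g = np.arange(levels)
    mean = (g * p).sum()
    var = ((g - mean) ** 2 * p).sum()
    std = np.sqrt(var)
    skew = ((g - mean) ** 3 * p).sum() / std ** 3 if std > 0 else 0.0
    kurt = ((g - mean) ** 4 * p).sum() / std ** 4 if std > 0 else 0.0
    nz = p[p > 0]
    entropy = -(nz * np.log2(nz)).sum()
    return {"mean": mean, "variance": var, "skewness": skew,
            "kurtosis": kurt, "entropy": entropy}

rng = np.random.default_rng(3)
roi = rng.integers(40, 90, size=(32, 32))   # toy ROI of CT grey levels
print({k: round(v, 2) for k, v in fos_features(roi).items()})
```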
Abstract:
Tracking the user's visual attention is a fundamental aspect of novel human-computer interaction paradigms found in Virtual Reality. For example, multimodal interfaces or dialogue-based communication with virtual and real agents greatly benefit from the analysis of the user's visual attention as a vital source for deictic references or turn-taking signals. Current approaches to determining visual attention rely primarily on monocular eye trackers; hence, they are restricted to the interpretation of two-dimensional fixations relative to a defined area of projection. The study presented in this article compares the precision, accuracy, and application performance of two binocular eye tracking devices. Two algorithms are compared which derive the depth information required for visual attention-based 3D interfaces. This information is further applied to an improved VR selection task in which a binocular eye tracker and an adaptive neural network algorithm are used during the disambiguation of partly occluded objects.
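A minimal sketch of how depth can be derived from binocular gaze data: each eye defines a ray, and the 3D fixation point is estimated as the midpoint of the shortest segment between the two rays, since measured rays rarely intersect exactly. This is the underlying geometry only; neither of the two compared algorithms is reproduced here.

```python
# A sketch of 3D fixation estimation from two gaze rays (closest-approach
# midpoint of two lines). Eye positions and the target are illustrative.

import numpy as np

def fixation_point(o1, d1, o2, d2):
    """Closest-approach midpoint of two gaze rays (origins, directions)."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    w = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b                 # ~0 when the rays are parallel
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    p1 = o1 + t1 * d1                     # closest point on left-eye ray
    p2 = o2 + t2 * d2                     # closest point on right-eye ray
    return (p1 + p2) / 2

# Eyes 6.5 cm apart, both verging on a target 50 cm ahead.
left, right = np.array([-0.0325, 0, 0]), np.array([0.0325, 0, 0])
target = np.array([0.0, 0.0, 0.5])
p = fixation_point(left, target - left, right, target - right)
print(np.round(p, 3))                     # ~[0, 0, 0.5]
```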
Abstract:
A tandem mass spectral database system consists of a library of reference spectra and a search program. State-of-the-art search programs show a high tolerance for variability in compound-specific fragmentation patterns produced by collision-induced decomposition and enable sensitive and specific 'identity search'. In this communication, the performance characteristics of two search algorithms combined with the 'Wiley Registry of Tandem Mass Spectral Data, MSforID' (Wiley Registry MSMS, John Wiley and Sons, Hoboken, NJ, USA) were evaluated. The search algorithms tested were the MSMS search algorithm implemented in the NIST MS Search program 2.0g (NIST, Gaithersburg, MD, USA) and the MSforID algorithm (John Wiley and Sons, Hoboken, NJ, USA). Sample spectra were acquired on different instruments and thus covered a broad range of possible experimental conditions, or were generated in silico. For each algorithm, more than 30,000 matches were performed. Statistical evaluation of the library search results revealed that in principle both search algorithms can be combined with the Wiley Registry MSMS to create a reliable identification tool. It appears, however, that a higher degree of spectral similarity is necessary to obtain a correct match with the NIST MS Search program. This characteristic of the NIST MS Search program has a positive effect on specificity, as it helps to avoid false positive matches (type I errors), but reduces sensitivity. Thus, particularly with sample spectra acquired on instruments differing in their setup from tandem-in-space-type fragmentation, a comparably higher number of false negative matches (type II errors) was observed when searching the Wiley Registry MSMS.
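A minimal sketch of dot-product (cosine) spectral matching, a common basis for 'identity search'; the MSforID and NIST scoring functions evaluated above are more elaborate and are not reproduced here. Real matchers also bin m/z values within a tolerance rather than requiring exact peak positions, as this toy version does.

```python
# A sketch of cosine similarity between two peak lists.
# The query and reference spectra below are illustrative assumptions.

import math

def cosine_score(a, b):
    """a, b: dict m/z -> intensity. Returns similarity in [0, 1]."""
    shared = a.keys() & b.keys()
    dot = sum(a[mz] * b[mz] for mz in shared)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# A higher acceptance threshold (as in the NIST MS Search behaviour
# described above) trades sensitivity for specificity.
query = {195.1: 100.0, 138.1: 60.0, 110.1: 15.0}
reference = {195.1: 95.0, 138.1: 70.0}
print(round(cosine_score(query, reference), 3))  # ~0.987
```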
Abstract:
Recent treatment planning studies have demonstrated the use of physiologic images in radiation therapy treatment planning to identify regions for functional avoidance. This image-guided radiotherapy (IGRT) strategy may reduce the injury and/or functional loss following thoracic radiotherapy. 4D computed tomography (CT), developed for radiotherapy treatment planning, is a relatively new imaging technique that allows the acquisition of a time-varying sequence of 3D CT images of the patient's lungs through the respiratory cycle. Guerrero et al. developed a method to calculate ventilation images from 4D CT, which is potentially better suited and more broadly available for IGRT than the current standard imaging methods. The key to extracting functional information from 4D CT is the construction of a volumetric deformation field that accurately tracks the motion of the patient's lungs during the respiratory cycle. The spatial accuracy of the displacement field directly impacts the ventilation images: higher spatial registration accuracy results in fewer ventilation image artifacts and physiologic inaccuracies. Presently, a consistent methodology for evaluating the spatial accuracy of the deformable image registration (DIR) transformation is lacking. Evaluation of the 4D CT-derived ventilation images will be performed to assess correlation with global measurements of lung ventilation, as well as regional correlation of the distribution of ventilation with the current clinical standard, SPECT. This requires a novel framework both for the detailed assessment of an image registration algorithm's performance characteristics and for quality assurance of spatial accuracy in routine application. Finally, we hypothesize that hypo-ventilated regions, identified on 4D CT ventilation images, will correlate with hypo-perfused regions in lung cancer patients who have obstructive lesions. A prospective imaging trial of patients with locally advanced non-small-cell lung cancer will allow this hypothesis to be tested. These advances are intended to contribute to the validation and clinical implementation of CT-based ventilation imaging in prospective clinical trials, in which the impact of this imaging method on patient outcomes may be tested.
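A minimal sketch of one common formulation of CT-derived ventilation: regional volume change estimated from the Jacobian determinant of the DIR displacement field, with ventilation proportional to det(J) - 1. Guerrero et al.'s HU-based formulation differs; this only illustrates how a voxel-wise map follows from the registration output.

```python
# A sketch of Jacobian-determinant ventilation from a displacement field.
# The synthetic uniform-expansion field is an illustrative assumption.

import numpy as np

def jacobian_ventilation(disp, spacing=(1.0, 1.0, 1.0)):
    """disp: (3, Z, Y, X) displacement field in mm. Returns det(J) - 1."""
    grads = np.empty((3, 3) + disp.shape[1:])
    for i in range(3):                        # displacement component
        for j, h in enumerate(spacing):       # derivative direction
            grads[i, j] = np.gradient(disp[i], h, axis=j)
            if i == j:
                grads[i, j] += 1.0            # J = I + grad(u)
    # Determinant of the 3x3 matrix at every voxel (matrix axes moved last).
    J = np.moveaxis(grads, (0, 1), (-2, -1))
    return np.linalg.det(J) - 1.0

# Uniform 5% expansion along z (axis 0 of the volume): det(J) = 1.05.
z = np.arange(20, dtype=float)
disp = np.zeros((3, 20, 16, 16))
disp[0] = 0.05 * z[:, None, None]
print(np.round(jacobian_ventilation(disp).mean(), 3))  # ~0.05
```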