903 results for Artificial intelligence -- Data processing


Relevance:

100.00%

Publisher:

Abstract:

The objective of this study is to provide an alternative modelling approach, an artificial neural network (ANN) model, to predict the compositional viscosity of binary mixtures of room-temperature ionic liquids (ILs) [Cn-mim][NTf2], with n = 4, 6, 8, 10, in methanol and ethanol over the entire range of molar fraction and a broad range of temperatures, T = 293.0–328.0 K. The results show that the proposed ANN model provides an alternative way to predict compositional viscosity successfully, with highly improved accuracy, and also show its potential to be utilized over a wide range of temperatures and for more complex compositions, i.e., more complex intermolecular interactions between components, for which it would be hard or impossible to establish an analytical model. © 2010 IEEE.
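
As a rough illustration of the kind of model described above, the sketch below fits a small feed-forward neural network to predict viscosity from composition and temperature. The input features (IL mole fraction, temperature, alkyl chain length n), network size and placeholder data are assumptions for illustration, not the authors' configuration.

```python
# Minimal ANN viscosity-model sketch (assumed inputs: IL mole fraction,
# temperature in K, alkyl chain length n; placeholder training data).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# X columns: [x_IL, T, n]; y: measured viscosity (toy values).
X = np.array([[0.1, 293.0, 4], [0.5, 298.0, 6], [0.9, 328.0, 8], [0.3, 313.0, 10]])
y = np.array([1.2, 5.4, 30.1, 8.7])

model = make_pipeline(
    StandardScaler(),                       # scale inputs for stable training
    MLPRegressor(hidden_layer_sizes=(10,),  # one small hidden layer
                 max_iter=5000, random_state=0),
)
model.fit(X, y)
print(model.predict([[0.4, 303.0, 6]]))     # predicted viscosity for a new mixture
```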

Relevance:

100.00%

Publisher:

Abstract:

Quantile normalization (QN) is a technique for microarray data processing and is the default normalization method in the Robust Multi-array Average (RMA) procedure, which was primarily designed for analysing gene expression data from Affymetrix arrays. Given the abundance of Affymetrix microarrays and the popularity of the RMA method, it is crucially important that the normalization procedure is applied appropriately. In this study we carried out simulation experiments and also analysed real microarray data to investigate the suitability of RMA when it is applied to datasets with different groups of biological samples. From our experiments, we showed that RMA with QN does not preserve the biological signal included in each group, but rather mixes the signals between the groups. We also showed that the Median Polish method in the summarization step of RMA has a similar mixing effect. RMA is one of the most widely used methods in microarray data processing and has been applied to a vast volume of data in biomedical research. The problematic behaviour of this method suggests that previous studies employing RMA could have been misled or adversely affected. We therefore think it is crucially important that the research community recognizes the issue and starts to address it. The two core elements of the RMA method, quantile normalization and Median Polish, both have the undesirable effect of mixing biological signals between different sample groups, which can be detrimental to drawing valid biological conclusions and to any subsequent analyses. Based on the evidence presented here and in the literature, we recommend exercising caution when using RMA as a method of processing microarray gene expression data, particularly in situations where there are likely to be unknown subgroups of samples.
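
For reference, quantile normalization forces every array to share the same empirical distribution. The minimal sketch below (columns are arrays, rows are probes; ties are not averaged, and this is not the RMA implementation itself) shows why the transform makes all arrays identical in their sorted values, which is exactly the property that can mix signals between biologically distinct groups.

```python
import numpy as np

def quantile_normalize(X):
    """Quantile-normalize the columns of X (probes x arrays).

    Each value is replaced by the mean of all values sharing its rank,
    so every column ends up with the same sorted values.
    """
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)   # rank of each value per column
    mean_dist = np.sort(X, axis=0).mean(axis=1)         # reference (mean) distribution
    return mean_dist[ranks]

X = np.array([[5., 4., 3.],
              [2., 1., 4.],
              [3., 4., 6.],
              [4., 2., 8.]])
print(quantile_normalize(X))
```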

Relevance:

100.00%

Publisher:

Abstract:

Field programmable gate array (FPGA) devices boast abundant resources with which custom accelerator components for signal, image and data processing may be realised; however, realising high performance, low cost accelerators currently demands manual register transfer level design. Software-programmable 'soft' processors have been proposed as a way to reduce this design burden, but they are unable to support performance and cost comparable to custom circuits. This paper proposes a new soft processing approach for FPGAs which promises to overcome this barrier. A high performance, fine-grained streaming processor, known as a Streaming Accelerator Element, is proposed which realises accelerators as large-scale custom multicore networks. By adopting a streaming execution approach with advanced program control and memory addressing capabilities, typical program inefficiencies can be almost completely eliminated, enabling performance and cost which are unprecedented amongst software-programmable solutions. When used to realise accelerators for fast Fourier transform, motion estimation, matrix multiplication and Sobel edge detection, the proposed architecture is shown to enable real-time operation with performance and cost comparable to hand-crafted custom circuit accelerators and up to two orders of magnitude beyond existing soft processors.

Relevance:

100.00%

Publisher:

Abstract:

This book provides a comprehensive tutorial on similarity operators. The authors systematically survey the set of similarity operators, primarily focusing on their semantics, while also touching upon mechanisms for processing them effectively.

The book starts off by providing introductory material on similarity search systems, highlighting the central role of similarity operators in such systems. This is followed by a systematic, categorized overview of the variety of similarity operators that have been proposed in the literature over the last two decades, including advanced operators such as RkNN, Reverse k-Ranks, Skyline k-Groups and K-N-Match. Since indexing is a core technology in the practical implementation of similarity operators, various indexing mechanisms are summarized. Finally, current research challenges are outlined, so as to enable interested readers to identify potential directions for future investigations.

In summary, this book offers a comprehensive overview of the field of similarity search operators, allowing readers to understand the area of similarity operators as it stands today, and in addition providing them with the background needed to understand recent novel approaches.
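
To make the operator family concrete, the sketch below contrasts a plain kNN query with a reverse k-nearest-neighbour (RkNN) query over a small set of vectors, using a naive scan rather than any of the index structures surveyed in the book.

```python
import numpy as np

def knn(data, q, k):
    """Indices of the k points closest to query q (Euclidean distance)."""
    d = np.linalg.norm(data - q, axis=1)
    return np.argsort(d)[:k]

def rknn(data, q, k):
    """Reverse kNN: indices of points that have q among their own k nearest
    neighbours, computed by a naive scan over the dataset."""
    result = []
    for i, p in enumerate(data):
        others = np.vstack([np.delete(data, i, axis=0), q])  # q appended last
        d = np.linalg.norm(others - p, axis=1)
        # p is in RkNN(q) if q's distance ranks within p's k smallest.
        if np.argsort(d).tolist().index(len(others) - 1) < k:
            result.append(i)
    return result

data = np.array([[0., 0.], [1., 0.], [5., 5.], [6., 5.]])
q = np.array([0.5, 0.1])
print(knn(data, q, k=2), rknn(data, q, k=1))
```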

Relevance:

100.00%

Publisher:

Abstract:

Urothelial cancer (UC) is highly recurrent and can progress from a non-invasive (NMIUC) to a more aggressive muscle-invasive (MIUC) subtype that invades the muscle tissue layer of the bladder. We present a proof-of-principle study showing that network-based features of gene pairs can be used to improve classifier performance and the functional analysis of urothelial cancer gene expression data. In the first step of our procedure, each individual sample of a UC gene expression dataset is inflated by gene pair expression ratios that are defined based on a given network structure. In the second step, an elastic net feature selection procedure for network-based signatures is applied to discriminate between NMIUC and MIUC samples. We performed a repeated random subsampling cross-validation in three independent datasets. The network signatures were characterized by a functional enrichment analysis and studied for the enrichment of known cancer genes. We observed that the network-based gene signatures derived from meta-collections of protein-protein interaction (PPI) databases such as CPDB, and from the PPI databases HPRD and BioGrid, improved the classification performance compared to single-gene signatures. The network-based signatures derived from PPI databases showed a prominent enrichment of cancer genes (e.g., TP53, TRIM27 and HNRNPA2B1). We provide a novel integrative approach to large-scale gene expression analysis for the identification and development of novel diagnostic targets in bladder cancer. Furthermore, our method allows cancer gene associations to be linked to network-based expression signatures that are not observed in gene-based expression signatures.
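
A schematic version of the two-step procedure, under the assumptions of log-scale expression values, a toy edge list and scikit-learn's elastic-net-penalised logistic regression (the original work uses its own elastic net feature selection and real PPI networks):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def add_pair_features(X, genes, edges):
    """Step 1: inflate each sample with gene-pair ratios defined by a network.

    X: samples x genes matrix of log2 expression values, so a ratio of raw
    intensities becomes a difference of log values.
    edges: list of (gene_a, gene_b) pairs taken from a PPI network.
    """
    idx = {g: i for i, g in enumerate(genes)}
    pair_cols = [X[:, idx[a]] - X[:, idx[b]] for a, b in edges]
    return np.column_stack([X] + pair_cols)

genes = ["TP53", "TRIM27", "HNRNPA2B1", "GAPDH"]
edges = [("TP53", "TRIM27"), ("TRIM27", "HNRNPA2B1")]    # toy network
rng = np.random.default_rng(0)
X = rng.normal(8, 1, size=(20, len(genes)))               # placeholder expression data
y = rng.integers(0, 2, size=20)                           # 0 = NMIUC, 1 = MIUC (toy labels)

X_net = add_pair_features(X, genes, edges)
# Step 2: an elastic-net-penalised classifier selects a sparse network signature.
clf = LogisticRegression(penalty="elasticnet", solver="saga",
                         l1_ratio=0.5, C=1.0, max_iter=5000)
clf.fit(X_net, y)
print(clf.coef_.round(2))
```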

Relevance:

100.00%

Publisher:

Abstract:

The main objective of this work was to monitor a set of physical-chemical properties of heavy oil process streams through nuclear magnetic resonance spectroscopy, in order to propose an analysis procedure and online data processing for process control. Different statistical methods that relate the results obtained by nuclear magnetic resonance spectroscopy with the results obtained by the conventional standard methods during the characterization of the different streams have been implemented, in order to develop models for predicting these same properties. Real-time knowledge of these physical-chemical properties of petroleum fractions is very important for enhancing refinery operations, ensuring technically, economically and environmentally proper operation.

The first part of this work involved the determination of many physical-chemical properties, at the Matosinhos refinery, by following standard methods used to evaluate and characterize light vacuum gas oil, heavy vacuum gas oil and fuel oil fractions. Kinematic viscosity, density, sulfur content, flash point, carbon residue, P-value and atmospheric and vacuum distillations were the properties analysed. Besides the analysis using the standard methods, the same samples were analysed by nuclear magnetic resonance spectroscopy.

The second part of this work concerned the application of multivariate statistical methods, which correlate the physical-chemical properties with the quantitative information acquired by nuclear magnetic resonance spectroscopy. Several methods were applied, including principal component analysis, principal component regression, partial least squares and artificial neural networks. Principal component analysis was used to reduce the number of predictive variables and to transform them into new variables, the principal components. These principal components were used as inputs to the principal component regression and artificial neural network models. For the partial least squares model, the original data were used as input. Taking into account the performance of the developed models, assessed through selected statistical performance indexes, it was possible to conclude that principal component regression led to the worst performance. Better results were achieved with the partial least squares and artificial neural network models; however, it was with the artificial neural network model that better predictions were obtained for almost all of the properties analysed. With reference to the results obtained, it was possible to conclude that nuclear magnetic resonance spectroscopy combined with multivariate statistical methods can be used to predict the physical-chemical properties of petroleum fractions. It has been shown that this technique can be considered a potential alternative to the conventional standard methods, having yielded very promising results.
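
As a sketch of the chemometric workflow described above, the snippet below builds a principal component regression and a partial least squares model on placeholder NMR-like data; the component counts, data shapes and target property are illustrative assumptions, not the thesis configuration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.cross_decomposition import PLSRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 200))   # placeholder: 40 samples x 200 NMR intensities
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=40)   # placeholder property

# Principal component regression: PCA compresses the spectra, then a linear
# model maps the scores to the property of interest.
pcr = make_pipeline(PCA(n_components=5), LinearRegression()).fit(X, y)

# Partial least squares works directly on the original spectra.
pls = PLSRegression(n_components=5).fit(X, y)

print(pcr.score(X, y), pls.score(X, y))   # in-sample R^2 for each model
```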

Relevance:

100.00%

Publisher:

Abstract:

This paper describes the implementation of a dual-axis positioning system controller. The system was designed to be used for space-dependent ultrasound signal acquisition problems, such as pressure field mapping. The work developed can be grouped into two main areas: hardware and software. Each axis includes one stepper motor connected to a driver circuit, which is in turn connected to a processing unit. The graphical user interface is simple and clear for the user. The system resolution was computed as 127 µm, with an accuracy of 2.44 µm. Although the target application is ultrasound signal acquisition, the controller can be applied to other devices with up to four stepper motors. The application was developed as open-source software, so it can be used or modified to fit different purposes.
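
A sketch of how target positions might be converted into step commands, using the 127 µm resolution reported above; the axis names and move values are assumptions for illustration, not part of the published controller.

```python
RESOLUTION_UM = 127.0   # linear displacement per motor step, from the text

def steps_for_move(target_um, current_um, resolution_um=RESOLUTION_UM):
    """Number of steps and direction needed to move one axis to target_um."""
    delta = target_um - current_um
    steps = round(abs(delta) / resolution_um)
    direction = 1 if delta >= 0 else -1
    return steps, direction

# Hypothetical two-axis move for a pressure-field mapping scan.
for axis, (target, current) in {"x": (10_000.0, 0.0), "y": (2_540.0, 0.0)}.items():
    n, d = steps_for_move(target, current)
    print(f"{axis}: {n} steps, direction {d:+d}")
```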

Relevance:

100.00%

Publisher:

Abstract:

Report of the Supervised Teaching Practice, Master's Degree in Mathematics Teaching, Universidade de Lisboa, Instituto de Educação, 2014.

Relevance:

100.00%

Publisher:

Abstract:

Super-resolution refers to the process of obtaining a high-resolution image from one or more low-resolution images. In this work, we present a novel method for the super-resolution problem in the limited case where only one low-resolution image is given as input. The proposed method is based on statistical learning for inferring the high-frequency regions that help to distinguish a high-resolution image from a low-resolution one. These inferences are obtained from the correlation between regions of low and high resolution that come exclusively from the image to be super-resolved, in terms of small neighborhoods. Markov random fields are used as a model to capture the local statistics of high- and low-resolution data when they are analysed at different scales and resolutions. Experimental results show the viability of the method.
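
The inference step can be pictured as a nearest-neighbour lookup between low-resolution patches and their high-frequency counterparts, both harvested from the input image itself; the sketch below shows only that lookup and omits the Markov random field model that enforces consistency between neighbouring patches.

```python
import numpy as np

def infer_high_frequencies(lr_patches, hf_patches, query_patches):
    """For each query patch (flattened), copy the high-frequency detail of its
    most similar low-resolution patch (Euclidean distance in patch space).

    lr_patches, hf_patches: corresponding pairs assumed to be precomputed from
    the input image and a downscaled copy of it.
    """
    predicted = []
    for q in query_patches:
        dists = np.linalg.norm(lr_patches - q, axis=1)
        predicted.append(hf_patches[np.argmin(dists)])
    return np.array(predicted)

rng = np.random.default_rng(0)
lr = rng.normal(size=(100, 25))    # placeholder 5x5 low-resolution patches
hf = rng.normal(size=(100, 25))    # their high-frequency residuals
queries = rng.normal(size=(4, 25))
print(infer_high_frequencies(lr, hf, queries).shape)   # (4, 25)
```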

Relevance:

100.00%

Publisher:

Abstract:

The smart grid concept is rapidly evolving towards practical implementations able to bring its advantages into practice. Evolution in legacy equipment and infrastructures is not sufficient to accomplish the smart grid goals, as it does not consider the needs of the players operating in a complex environment that is dynamic and competitive in nature. Artificial intelligence based applications can provide solutions to these problems, supporting decentralized intelligence and decision-making. A case study illustrates the importance of Virtual Power Players (VPP) and multi-player negotiation in the context of smart grids. This case study is based on real data and aims at optimizing energy resource management, considering generation, storage and demand response.
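
A toy version of the resource-management optimisation mentioned in the case study, written as a linear programme that meets a demand target at minimum cost from generation, storage discharge and demand response; all figures are illustrative assumptions, not the real data used in the paper.

```python
from scipy.optimize import linprog

# Decision variables: [generation_kW, storage_discharge_kW, demand_response_kW]
cost = [0.10, 0.05, 0.20]          # assumed cost per kW of each resource
demand = 120.0                     # kW to be supplied in this period

# Equality constraint: the three resources together must cover the demand.
A_eq = [[1.0, 1.0, 1.0]]
b_eq = [demand]

# Capacity limits for each resource (assumed).
bounds = [(0, 100), (0, 40), (0, 30)]

res = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print(res.x, res.fun)              # dispatch per resource and total cost
```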

Relevance:

100.00%

Publisher:

Abstract:

Swarm Intelligence (SI) is a growing research field of Artificial Intelligence (AI). SI is the general term for several computational techniques that draw ideas and inspiration from the social behaviours of insects and other animals. This paper presents the hybridization and combination of different AI approaches, namely Bio-Inspired Techniques (BIT), Multi-Agent Systems (MAS) and Machine Learning Techniques (MLT). The resulting system is applied to the problem of scheduling jobs to machines in dynamic manufacturing environments.
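
A compact illustration of a bio-inspired technique applied to job scheduling: a particle swarm optimiser over random-key encodings that assigns jobs to machines so as to minimise the makespan. It is a generic sketch, not the hybrid BIT/MAS/MLT system the paper describes.

```python
import numpy as np

def makespan(keys, times, n_machines):
    """Decode random keys into a job-to-machine assignment and return the
    load of the busiest machine."""
    assign = np.clip(keys.astype(int), 0, n_machines - 1)
    loads = np.zeros(n_machines)
    for job, machine in enumerate(assign):
        loads[machine] += times[job]
    return loads.max()

def pso_schedule(times, n_machines, n_particles=30, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(0, n_machines, (n_particles, len(times)))   # positions (random keys)
    v = np.zeros_like(x)                                         # velocities
    pbest = x.copy()
    pbest_val = np.array([makespan(p, times, n_machines) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = np.clip(x + v, 0, n_machines - 1e-9)
        vals = np.array([makespan(p, times, n_machines) for p in x])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        gbest = pbest[pbest_val.argmin()].copy()
    return np.clip(gbest.astype(int), 0, n_machines - 1), pbest_val.min()

times = np.array([4., 7., 2., 5., 3., 6.])   # processing time of each job
print(pso_schedule(times, n_machines=2))     # assignment and best makespan
```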

Relevance:

100.00%

Publisher:

Abstract:

Electricity markets are complex environments with very particular characteristics. A critical issue regarding these characteristics concerns the constant changes they are subject to. This is a result of the restructuring of electricity markets, which was performed to increase competitiveness but also greatly increased the complexity and unpredictability of those markets. The constant growth in market unpredictability resulted in an amplified need for market participants to foresee market behaviour. The need to understand the market mechanisms, and how the interaction of the involved players affects the outcomes of the markets, contributed to the increased use of simulation tools. Multi-agent based software is particularly well suited to analysing dynamic and adaptive systems with complex interactions among their constituents, such as electricity markets.

This dissertation presents ALBidS (Adaptive Learning strategic Bidding System), a multi-agent system created to provide decision support to market negotiating players. This system is integrated with the MASCEM electricity market simulator, so that its advantage in supporting a market player can be tested using cases based on real market data. ALBidS considers several different methodologies based on very distinct approaches, to provide alternative suggestions of the best actions for the supported player to perform. The approach chosen as the player's actual action is selected through reinforcement learning algorithms which, for each situation, simulation circumstance and context, decide which proposed action has the highest chance of success. Some of the considered approaches are supported by a mechanism that creates profiles of competitor players. These profiles are built according to their observed past actions and reactions when faced with specific situations, such as success and failure. The system's context awareness and analysis of simulation circumstances, both in terms of results performance and execution time adaptation, are complementary mechanisms that endow ALBidS with further adaptation and learning capabilities.
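
The reinforcement-learning selection described above can be pictured, in its simplest form, as a multi-armed bandit over the available strategies; the epsilon-greedy sketch below is a generic illustration of that idea (with hypothetical strategy names and toy rewards), not the actual ALBidS algorithms.

```python
import random

class StrategySelector:
    """Epsilon-greedy selection among bidding strategies, updated from the
    observed outcome (e.g. profit) of each simulated negotiation."""

    def __init__(self, strategies, epsilon=0.1):
        self.strategies = list(strategies)
        self.epsilon = epsilon
        self.value = {s: 0.0 for s in self.strategies}   # running average reward
        self.count = {s: 0 for s in self.strategies}

    def choose(self):
        if random.random() < self.epsilon:                # explore
            return random.choice(self.strategies)
        return max(self.strategies, key=self.value.get)   # exploit best so far

    def update(self, strategy, reward):
        self.count[strategy] += 1
        n = self.count[strategy]
        self.value[strategy] += (reward - self.value[strategy]) / n

selector = StrategySelector(["price_forecast", "competitor_profile", "game_theory"])
for _ in range(100):
    s = selector.choose()
    reward = random.gauss(1.0 if s == "price_forecast" else 0.5, 0.1)  # toy outcomes
    selector.update(s, reward)
print(max(selector.value, key=selector.value.get))   # best-performing strategy
```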

Relevance:

100.00%

Publisher:

Abstract:

Final Master's Project submitted to obtain the degree of Master in Mechanical Engineering.

Relevance:

100.00%

Publisher:

Abstract:

Computational Intelligence (CI) includes four main areas: Evolutionary Computation (genetic algorithms and genetic programming), Swarm Intelligence, Fuzzy Systems and Neural Networks. This article shows how CI techniques go beyond the strict limits of the Artificial Intelligence field and can help solve real problems from distinct engineering areas: Mechanical, Computer Science and Electrical Engineering.

Relevance:

100.00%

Publisher:

Abstract:

Electricity markets are complex environments, involving a large number of different entities that operate in a dynamic scene to obtain the best advantages and profits. MASCEM (Multi-Agent System for Competitive Electricity Markets) is a multi-agent electricity market simulator that models market players and simulates their operation in the market. Market players are entities with specific characteristics and objectives, making their own decisions and interacting with other players. This paper presents a methodology to provide decision support to electricity market negotiating players. This model allows the integration of different strategic approaches for electricity market negotiations, and the choice of the most appropriate one at each time, for each different negotiation context. The methodology is integrated in ALBidS (Adaptive Learning strategic Bidding System), a multi-agent system that provides decision support to MASCEM's negotiating agents so that they can properly achieve their goals. ALBidS uses artificial intelligence methodologies and data analysis algorithms to provide effective adaptive learning capabilities to such negotiating entities. The main contribution is a methodology that combines several distinct strategies to build action proposals, so that the best one can be chosen at each time, depending on the context and simulation circumstances. The selection process includes reinforcement learning algorithms, a mechanism for the analysis of negotiation contexts, a mechanism for managing the system's efficiency/effectiveness balance, and a mechanism for defining competitor players' profiles.