867 results for Artificial neural net
Abstract:
This paper provides an overview of IDS types, how they work, and the configuration considerations and issues that affect them. Advanced methods of increasing IDS performance are explored, such as specification-based IDS for protecting Supervisory Control And Data Acquisition (SCADA) and Cloud networks. By reviewing varied studies, ranging from configuration issues and specific problems to custom techniques and cutting-edge research, the paper also provides a reference for others interested in learning about and developing IDS solutions. Intrusion detection is an area of much-needed study, required to provide solutions that satisfy evolving services and the networks and systems that support them. This paper aims to be a reference on IDS technologies for other researchers and developers interested in the field of intrusion detection.
Abstract:
Accurate estimation of road pavement geometry and layer material properties through the use of proper nondestructive testing and sensor technologies is essential for evaluating a pavement's structural condition and determining options for maintenance and rehabilitation. For these purposes, pavement deflection basins produced by the nondestructive Falling Weight Deflectometer (FWD) test are commonly used. The FWD test drops weights on the pavement to simulate traffic loads and measures the resulting pavement deflection basins. Backcalculation of pavement geometry and layer properties from FWD deflections is a difficult inverse problem, and its solution with conventional mathematical methods is often challenging due to the ill-posed nature of the problem. In this dissertation, a hybrid algorithm was developed to seek robust and fast solutions to this inverse problem. The algorithm is based on soft computing techniques, mainly Artificial Neural Networks (ANNs) and Genetic Algorithms (GAs), as well as numerical analysis techniques to properly simulate the geomechanical system. The widely used layered pavement analysis program ILLI-PAVE was employed in the analyses of various flexible pavement types, including full-depth asphalt and conventional flexible pavements, built on either lime-stabilized soils or untreated subgrade. Nonlinear properties of the subgrade soil and the base course aggregate as transportation geomaterials were also considered. A computer program, the Soft Computing Based System Identifier (SOFTSYS), was developed. In SOFTSYS, ANNs are used as surrogate models to provide faster approximations of the nonlinear finite element program ILLI-PAVE. The deflections obtained from FWD tests in the field were matched with the predictions obtained from the numerical simulations to develop the SOFTSYS models.
The solution to the inverse problem for multi-layered pavements is computationally hard to achieve and is often not feasible due to field variability and the quality of the collected data. The primary difficulty in the analysis arises from the substantial increase in the degree of non-uniqueness of the mapping from the pavement layer parameters to the FWD deflections. The insensitivity of some layer properties lowered the performance of the SOFTSYS models. Still, the SOFTSYS models were shown to work effectively with the synthetic data obtained from ILLI-PAVE finite element solutions. In general, SOFTSYS solutions very closely matched the ILLI-PAVE mechanistic pavement analysis results. For SOFTSYS validation, field-collected FWD data were successfully used to predict pavement layer thicknesses and layer moduli of in-service flexible pavements. Some of the very promising SOFTSYS results indicated average absolute errors on the order of 2%, 7%, and 4% for the Hot Mix Asphalt (HMA) thickness estimates of full-depth asphalt pavements, full-depth pavements on lime-stabilized soils, and conventional flexible pavements, respectively. The field validations of SOFTSYS also produced meaningful results. The thickness data obtained from Ground Penetrating Radar testing matched reasonably well with the predictions from the SOFTSYS models. The differences in the HMA and lime-stabilized soil layer thicknesses were attributed to deflection data variability in the FWD tests. The backcalculated asphalt concrete layer thickness results matched better for full-depth asphalt flexible pavements built on lime-stabilized soils than for conventional flexible pavements. Overall, SOFTSYS was capable of producing reliable thickness estimates despite the variability of field-constructed asphalt layer thicknesses.
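The surrogate-plus-GA idea behind SOFTSYS can be illustrated with a toy sketch. Everything below is hypothetical: a closed-form function stands in for an ANN surrogate trained on ILLI-PAVE runs, and a simple real-coded GA recovers two illustrative layer parameters (an HMA thickness and a subgrade modulus) from a synthetic deflection basin.

```python
import random

random.seed(0)

# Toy stand-in for the trained ANN surrogate: maps layer parameters
# (thickness h, modulus E; units and form are illustrative) to three
# synthetic deflection-basin readings.  In SOFTSYS this role is played
# by ANNs trained on ILLI-PAVE finite element solutions.
def surrogate_deflections(h, E):
    d0 = 100.0 / (h * E ** 0.5)       # centre deflection
    d1 = 60.0 / (h ** 0.8 * E ** 0.6)
    d2 = 30.0 / (h ** 0.5 * E ** 0.8)
    return (d0, d1, d2)

def misfit(params, target):
    """Squared mismatch between surrogate and measured deflections."""
    pred = surrogate_deflections(*params)
    return sum((p - t) ** 2 for p, t in zip(pred, target))

def backcalculate(target, pop_size=60, generations=80,
                  bounds=((2.0, 20.0), (5.0, 50.0))):
    """Simple real-coded GA: elitism, blend crossover, Gaussian mutation."""
    def rand_ind():
        return [random.uniform(lo, hi) for lo, hi in bounds]
    pop = [rand_ind() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda ind: misfit(ind, target))
        elite = scored[: pop_size // 4]
        children = list(elite)            # elitism keeps the best found
        while len(children) < pop_size:
            a, b = random.sample(elite, 2)
            child = [(x + y) / 2 + random.gauss(0, 0.3) for x, y in zip(a, b)]
            child = [min(max(v, lo), hi) for v, (lo, hi) in zip(child, bounds)]
            children.append(child)
        pop = children
    return min(pop, key=lambda ind: misfit(ind, target))

# Recover known parameters from their own synthetic deflection basin.
true_h, true_E = 8.0, 20.0
target = surrogate_deflections(true_h, true_E)
est_h, est_E = backcalculate(target)
```

Because the three deflections depend on the parameters with distinct exponents, the toy basin is identifiable; the real problem, as the abstract notes, suffers from non-uniqueness and data variability.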
Abstract:
SMART GRIDS - COMPUTATIONAL MODELS DEVELOPMENT AND DEMAND SIDE MANAGEMENT APPLICATIONS
This thesis focuses on the development of computational models and their applications to demand side management within the smart grid scope. The performance of the electrical network players is studied, and a domestic prosumer model is presented. The economic dispatch problem, considering production and energy consumption forecasts obtained from artificial neural networks, is also presented. The existing demand response models are studied, and a computational tool based on the fuzzy subtractive clustering algorithm is developed. Energy consumption profiles and operation modes are analyzed, including a brief analysis of the introduction of the electric vehicle and of contingencies on the electrical network.
Consumer energy management applications within the scope of the InovGrid pilot project are presented. Computational systems are developed for the acquisition, monitoring, control and supervision of consumption data provided by smart meters, allowing consumer actions to be incorporated into the management of their electrical energy consumption.
Abstract:
This paper presents a semi-parametric algorithm for parsing football video structures. The approach works as two interleaved processes that collaborate closely towards a common goal. The core part of the proposed method performs fast automatic football video annotation by looking at the entropy variance within a series of shot frames. The entropy is extracted from the Hue component of the HSV color space, not as a global feature but in the spatial domain, to identify regions within a shot that characterize a certain activity during the shot period. The second part of the algorithm identifies dominant color regions that may represent players and the playfield for further activity recognition. Experimental results show that the proposed football video segmentation algorithm performs with high accuracy.
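The spatial hue-entropy idea can be illustrated with a small sketch; the block size, bin count and sample values are my own assumptions, not the paper's:

```python
import math

def hue_entropy(hues, bins=16):
    """Shannon entropy of a hue histogram (hue given in degrees, [0, 360))."""
    hist = [0] * bins
    for h in hues:
        hist[int(h / 360.0 * bins) % bins] += 1
    n = len(hues)
    return -sum((c / n) * math.log2(c / n) for c in hist if c)

def block_entropies(hue_frame, block=4):
    """Split a 2-D grid of hue values into block x block tiles and return
    the entropy of each tile, preserving the spatial layout row by row."""
    rows, cols = len(hue_frame), len(hue_frame[0])
    out = []
    for r0 in range(0, rows, block):
        for c0 in range(0, cols, block):
            tile = [hue_frame[r][c]
                    for r in range(r0, min(r0 + block, rows))
                    for c in range(c0, min(c0 + block, cols))]
            out.append(hue_entropy(tile))
    return out

# A uniform green "playfield" tile has zero entropy; a tile with mixed
# hues (e.g. players over grass) has high entropy.
flat = [[120.0] * 4 for _ in range(4)]              # all grass-green
mixed = [[120.0, 10.0, 240.0, 120.0] for _ in range(4)]
```

Tracking how these per-tile entropies vary across the frames of a shot is the kind of signal the annotation stage described above can exploit.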
Abstract:
Growing models have been widely used for clustering and topology learning. Traditionally, these models work in stationary environments, grow incrementally, and adapt their nodes to a given distribution based on global parameters. In this paper, we present an enhanced unsupervised self-organising network for the modelling of visual objects. We first develop a framework for building non-rigid shapes using the growth mechanism of the self-organising maps, and then we define an optimal number of nodes, without overfitting or underfitting the network, based on knowledge obtained from information-theoretic considerations. We present experimental results for hands, and we quantitatively evaluate the matching capabilities of the proposed method with the topographic product.
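The grow-then-adapt mechanism can be illustrated with a deliberately simplified 1-D sketch; the paper's network is SOM-based and models 2-D shapes, so the insertion rule, learning rates and node cap below are illustrative only:

```python
import random

random.seed(1)

def grow_network(samples, max_nodes=8, epochs=30, lr=0.2):
    """1-D growing chain: adapt nodes toward samples, accumulate the
    quantisation error per winner node, and insert a new node next to
    the node with the largest accumulated error."""
    nodes = [min(samples), max(samples)]
    for _ in range(epochs):
        error = [0.0] * len(nodes)
        for x in samples:
            i = min(range(len(nodes)), key=lambda k: abs(nodes[k] - x))
            error[i] += abs(nodes[i] - x)
            nodes[i] += lr * (x - nodes[i])           # move the winner
            for j in (i - 1, i + 1):                  # drag its neighbours
                if 0 <= j < len(nodes):
                    nodes[j] += 0.5 * lr * (x - nodes[j])
        if len(nodes) < max_nodes:
            w = max(range(len(nodes)), key=lambda k: error[k])
            n = w + 1 if w + 1 < len(nodes) else w - 1
            nodes.insert(min(w, n) + 1, (nodes[w] + nodes[n]) / 2)
        nodes.sort()
    return nodes

data = [random.uniform(0, 1) for _ in range(200)]
net = grow_network(data)
```

The cap `max_nodes` plays the role of the stopping criterion; the paper derives that number from information-theoretic considerations rather than fixing it by hand.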
Abstract:
In contemporary societies, higher education must shape individuals able to solve problems in a workable and simple manner; therefore, a multidisciplinary view of problems, with insights from disciplines like psychology, mathematics or computer science, becomes mandatory. Undeniably, the great challenge for teachers is to provide comprehensive training in General Chemistry with high standards of quality, aiming not only at promoting students' academic success, but also at the understanding of the competences/skills required for their future practice. Thus, this work focuses on the development of an intelligent system to assess the Quality-of-General-Chemistry-Learning, based on factors related to subject, teachers and students.
Abstract:
The inclusion of General Chemistry (GC) in the curricula of higher education courses in science and technology aims, on the one hand, to develop the students' skills necessary for further studies and, on the other hand, to respond to the need of endowing future professionals with the knowledge to analyze and solve multidisciplinary problems in a sustainable way. The participation of students in the evaluation of the role played by GC in their training is crucial, and the analysis of the results can be an essential tool to increase success in the education of students and to improve practices in various professions. Indeed, this work focuses on the development of an intelligent system to assess the role of GC. The computational framework is built on top of a Logic Programming approach to Knowledge Representation and Reasoning, complemented with a problem solving methodology moored on Artificial Neural Networks. The results obtained so far show that the proposed model stands for a good start, with an overall accuracy higher than 95%.
Abstract:
An integrated analysis of naproxen adsorption on bone char under batch and packed-bed column conditions has been performed. Kinetic, thermodynamic and breakthrough parameters have been calculated using adsorption models and artificial neural networks. Results show that naproxen removal using bone char under batch conditions is a feasible and effective process, which could involve electrostatic and non-electrostatic interactions depending mainly on pH conditions. However, the application of a packed-bed column for naproxen adsorption on bone char is not effective for the treatment of diluted solutions, due to the low degree of adsorbent utilization (below 4%) at the tested operating conditions. The proposed mechanism for naproxen removal using bone char could include a complexation process between phosphate and naproxen, hydrogen bonding, and possibly hydrophobic interactions via π–π electron interactions. This study highlights the relevance of performing an integrated analysis of adsorbent effectiveness under batch and dynamic conditions to establish the best process configuration for the removal of emerging water pollutants such as pharmaceuticals.
Abstract:
This research develops a neural predictive control structure to control a pH process, characterized as a SISO (Single Input - Single Output) system. pH control is a process of great importance in the petrochemical industry, where the goal is to keep the acidity level of a product constant or to neutralize the influent of a fluid treatment plant. The pH control process demands robustness from the control system, since the process can exhibit nonlinear static gain and nonlinear dynamics. The neural predictive controller builds on two other theories, the first concerning predictive control and the second artificial neural networks (ANNs). The controller can be divided into two blocks, one responsible for identification and the other for computing the control signal. Neural identification uses a multilayer feedforward ANN trained with the Error Back-Propagation methodology. Offline training of the network starts from the plant's input and output data; in this way the synaptic weights are adjusted and the network becomes able to represent the system with the highest possible accuracy. The resulting neural model is used to predict the future outputs of the system; the optimizer then computes a sequence of control actions by minimizing a quadratic objective function, making the process output follow a desired reference signal. Two applications were developed, both on the C++ Builder platform: the first performs identification via neural networks and the second is responsible for process control. The tools implemented and applied here are generic, and both allow the control structure to be applied to any new process.
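A minimal sketch of the predictive loop described above. Everything is hypothetical: a toy nonlinear difference equation stands in for both the pH plant and its trained neural model (i.e., identification is assumed perfect), and the optimizer is a plain scan over candidate control values minimizing the quadratic cost:

```python
def plant(y, u):
    """Toy nonlinear SISO plant standing in for the pH process
    (real titration dynamics are far more involved)."""
    return 0.8 * y + 0.4 * u - 0.05 * y * abs(y)

def predict(model, y, u_seq):
    """Roll the (neural) model forward over a candidate control sequence."""
    ys = []
    for u in u_seq:
        y = model(y, u)
        ys.append(y)
    return ys

def mpc_step(model, y, ref, u_prev, horizon=3, lam=0.1,
             candidates=tuple(x / 10.0 for x in range(-20, 21))):
    """Pick the control minimising a quadratic cost: tracking error over
    the horizon plus a penalty on the control move."""
    def cost(u):
        preds = predict(model, y, [u] * horizon)
        return sum((p - ref) ** 2 for p in preds) + lam * (u - u_prev) ** 2
    return min(candidates, key=cost)

# Closed loop: drive the process output toward the reference signal.
ref, y, u = 1.0, 0.0, 0.0
for _ in range(30):
    u = mpc_step(plant, y, ref, u)
    y = plant(y, u)
```

In the thesis the optimizer minimizes the quadratic objective analytically or numerically over a sequence of moves; the exhaustive scan here just keeps the sketch short.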
Abstract:
The intersection of Artificial Intelligence and the Law is a multifaceted matter whose effects shape culture, organizations and social issues when the emergent information technologies are taken into consideration. From this point of view, the weight of formal and informal Conflict Resolution settings should be highlighted, and the use of defective data, information or knowledge must be emphasized; indeed, it is hard to handle with traditional problem solving methodologies. Therefore, this work focuses on the development of decision support systems, in terms of their knowledge representation and reasoning procedures, under a formal framework based on Logic Programming, complemented with an approach to computing centered on Artificial Neural Networks. It is intended to evaluate the Quality-of-Judgments and the respective Degree-of-Confidence that one has in such happenings.
Abstract:
Acute Coronary Syndrome (ACS) affects a broad and heterogeneous set of human beings and poses a serious diagnosis and risk-stratification problem. Although different tools, such as biomarkers, are at one's disposal for the diagnosis and prognosis of ACS, they have to be previously evaluated and validated in different scenarios and patient cohorts. Besides ensuring that a diagnosis is correct, attention should also be directed to ensuring that therapies are correctly and safely applied. Indeed, this work focuses on the development of a diagnosis decision support system in terms of its knowledge representation and reasoning mechanisms, given here in terms of a formal framework based on Logic Programming, complemented with a problem solving methodology to computing anchored on Artificial Neural Networks. On the one hand, it caters for the evaluation of ACS predisposing risk and the respective Degree-of-Confidence that one has in such a happening. On the other hand, it may be seen as a major development of Multi-Value Logics to understand things and one's behavior. Undeniably, the proposed model allows for an improvement of the diagnosis process, properly classifying the patients who present the pathology (sensitivity ranging from 89.7% to 90.9%) as well as classifying the absence of ACS (specificity ranging from 88.4% to 90.2%).
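For concreteness, the reported sensitivity and specificity follow directly from a confusion matrix; the counts below are hypothetical, chosen only to reproduce values inside the abstract's ranges, and are not from the study:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity: fraction of ACS patients correctly flagged.
       Specificity: fraction of non-ACS patients correctly cleared."""
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative confusion counts (tp = true positives, fn = false
# negatives, tn = true negatives, fp = false positives).
sens, spec = sensitivity_specificity(tp=90, fn=10, tn=89, fp=11)
```

Here `sens` is 0.90 and `spec` is 0.89, i.e. within the 89.7-90.9% and 88.4-90.2% bands up to rounding of the hypothetical counts.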
Abstract:
On the one hand, pesticides may be absorbed into the body orally, dermally, ocularly or by inhalation, and human exposure may be dietary, recreational and/or occupational, with toxicity that may be acute or chronic. On the other hand, the environmental fate and toxicity of a pesticide are contingent on its physico-chemical characteristics, the soil composition and adsorption. Human toxicity also depends on exposure time and individual susceptibility. Therefore, this work focuses on the development of an Artificial Intelligence based diagnosis support system to assess the pesticide toxicological risk to humans, built under a formal framework based on Logic Programming for knowledge representation and reasoning, complemented with an approach to computing grounded on Artificial Neural Networks. The proposed solution is unique in itself, as it caters for the explicit treatment of incomplete, unknown, or even self-contradictory information, in either a qualitative or quantitative setting.
Abstract:
In an organisation, any optimization process faces increasing challenges and requires new approaches to the organizational phenomenon. Indeed, this work addresses the problem of efficiency dynamics through intangible variables that may support a different view of corporations. It focuses on the challenges that information management and the incorporation of context bring to competitiveness. Thus, this work presents the analysis and development of an intelligent decision support system in terms of a formal agenda built on a Logic Programming based methodology for problem solving, complemented with an approach to computing grounded on Artificial Neural Networks. The proposed model is in itself fairly precise, with overall accuracy, sensitivity and specificity values higher than 90%. The proposed solution is indeed unique, catering for the explicit treatment of incomplete, unknown, or even self-contradictory information, in either a quantitative or qualitative arrangement.
Abstract:
The problems related to the modelling of water quality in reservoirs can be approached from different viewpoints. This work resorts to problem solving methods emanating from the scientific area of Artificial Intelligence, as well as to tools used in the search for solutions, such as Decision Trees, Artificial Neural Networks and the Nearest-Neighbour method. Currently, the methods for assessing water quality are very restrictive because they do not indicate the water quality in real time. The development of forecasting models based on techniques of Knowledge Discovery in Databases proved to be an alternative, in view of a pro-active behavior that may contribute decisively to diagnosing, preserving and requalifying the reservoirs. In this work, unsupervised learning was used to study the dynamics of the reservoirs, and two distinct behaviors, related to the time of year, were described.
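A minimal unsupervised-learning sketch in the spirit of the study; the features, values and the choice of plain k-means are illustrative assumptions (the thesis's actual KDD pipeline is richer):

```python
import random

random.seed(2)

def kmeans(points, k=2, iters=20):
    """Plain k-means: an unsupervised way to expose distinct regimes
    (e.g. two season-related reservoir behaviours) in the data."""
    # Deterministic spread initialisation: extreme points as seeds.
    centres = [min(points), max(points)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda j: sum((a - b) ** 2
                                      for a, b in zip(p, centres[j])))
            groups[i].append(p)
        for i, g in enumerate(groups):
            if g:   # recentre each cluster on the mean of its members
                centres[i] = tuple(sum(c) / len(g) for c in zip(*g))
    return centres

# Synthetic (temperature, dissolved-oxygen) readings: a warm and a cold
# regime, standing in for summer/winter reservoir samples.
summer = [(24 + random.gauss(0, 0.5), 6 + random.gauss(0, 0.2))
          for _ in range(30)]
winter = [(8 + random.gauss(0, 0.5), 10 + random.gauss(0, 0.2))
          for _ in range(30)]
centres = sorted(kmeans(summer + winter), key=lambda c: c[0])
```

On this toy data the two centres land near the winter and summer regimes, mirroring the "two distinct behaviors related to the time of year" reported above.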
Abstract:
This paper presents a methodology for short-term load forecasting based on genetic algorithm feature selection and artificial neural network modeling. A feed-forward artificial neural network is used to model the 24-h-ahead load based on past consumption, weather and stock index data. A genetic algorithm is used to find the best subset of variables for modeling. Three data sets from different geographical locations, encompassing areas of different dimensions with distinct load profiles, are used to evaluate the methodology. The developed approach was found to generate models achieving a minimum mean absolute percentage error under 2%. The feature selection algorithm was able to significantly reduce the number of features used and to increase the accuracy of the models.
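A hedged sketch of the GA feature-selection idea: synthetic data, a 1-NN regressor standing in for the neural network, and illustrative GA settings (the paper's actual encoding, model and fitness are not specified here). The target depends only on features 0 and 2; the GA should learn to keep those and drop the noise features.

```python
import random

random.seed(3)

def make_data(n=80, n_feat=5):
    """Synthetic 'load' rows: target uses only features 0 and 2."""
    rows = []
    for _ in range(n):
        x = [random.uniform(0, 1) for _ in range(n_feat)]
        rows.append((x, 2.0 * x[0] - 1.5 * x[2]))
    return rows

def cv_error(mask, train, valid):
    """1-NN regression error using only the features enabled in mask."""
    feats = [i for i, m in enumerate(mask) if m]
    if not feats:
        return float("inf")
    err = 0.0
    for x, y in valid:
        nearest = min(train,
                      key=lambda r: sum((r[0][i] - x[i]) ** 2 for i in feats))
        err += abs(nearest[1] - y)
    return err / len(valid)

def ga_select(train, valid, n_feat=5, pop=20, gens=25):
    """Binary-chromosome GA: each bit switches one input feature on/off."""
    popn = [[random.randint(0, 1) for _ in range(n_feat)] for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=lambda m: cv_error(m, train, valid))
        elite = popn[: pop // 4]
        popn = list(elite)                       # elitism
        while len(popn) < pop:
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, n_feat)    # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:            # bit-flip mutation
                child[random.randrange(n_feat)] ^= 1
            popn.append(child)
    return min(popn, key=lambda m: cv_error(m, train, valid))

data = make_data()
train, valid = data[:60], data[60:]
best = ga_select(train, valid)
```

Irrelevant features inflate the nearest-neighbour distances without carrying signal, so masks that disable them score a lower validation error, which is exactly the mechanism the paper exploits with its neural model.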