993 results for Experimental software engineering
Abstract:
In this work, the electrocoagulation (EC) technique was used to treat fish-farming effluent. A laboratory-scale EC reactor with a capacity of 1.5 L was assembled, using a set of four aluminum electrode plates, a microprocessor-controlled high-torque mechanical stirrer, conductor wires with alligator clips and a voltage source with adjustable power. The electrodes were arranged inside the electrolytic cell in a monopolar, parallel configuration, 11 mm apart. The effluent used in this study was collected from fish-farming tanks at the fish-breeding center of the Department of Fisheries Engineering of the Universidade Federal do Ceará. To determine the best operating condition of the reactor, an experimental design was prepared with the Statgraphics software, defining the operational variables and their respective ranges (initial pH from 4 to 8, conductivity from 1000 to 4000 μS cm-1, electrolysis time from 15 to 35 min, stirring from 200 to 600 rpm and current from 1 to 2.5 A), whose combinations totaled 35 experimental runs. Based on the results of the physico-chemical laboratory analyses, the optimum operating conditions of the reactor are initial pH = 8, conductivity = 1000 μS cm-1, time = 35 min, stirring = 200 rpm and current = 2.5 A. Under these conditions, removals of 84.95% for COD, 98.06% for nitrite, 82.43% for nitrate, 98.05% for total phosphorus and 95.32% for turbidity were achieved, at an operating cost of 4.59 R$/m3 of treated effluent.
Based on the results obtained, it can be concluded that some of the analyzed parameters (pH, turbidity, temperature, TDS, nitrite, nitrate and total phosphorus) comply with the standards established for class 2 fresh water by CONAMA Resolution No. 357/05, and with CONAMA Resolution No. 430/2011 and SEMACE (CE) Ordinance No. 154/2002 for discharge of the final effluent into receiving bodies. Besides being an alternative, efficient and promising method for treating fish-farming effluent, the electrocoagulation technique also proved environmentally sound, since it dispenses with the high reagent consumption required by conventional treatment.
Abstract:
We have devised a program that allows computation of the power of the F-test, and hence determination of appropriate sample and subsample sizes, in the context of the one-way hierarchical analysis of variance with fixed effects. The power at a fixed alternative is an increasing function of the sample size and of the subsample size. The program makes it easy to obtain the power of the F-test for a range of sample and subsample sizes, and therefore to choose appropriate sizes based on a desired power. The program can be used for the 'ordinary' case of the one-way analysis of variance, as well as for hierarchical analysis of variance with two stages of sampling. Examples of the practical use of the program are given.
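The abstract does not include the program itself; as a rough sketch of the underlying computation, the power of the one-way ANOVA F-test can be obtained from the noncentral F distribution. The function name, the effect-size parametrization via Cohen's f, and the default significance level are illustrative assumptions, not the authors' implementation:

```python
from scipy.stats import f, ncf

def anova_f_power(k, n, effect_f, alpha=0.05):
    """Approximate power of the one-way fixed-effects ANOVA F-test.

    k: number of groups; n: observations per group;
    effect_f: Cohen's f effect size (an assumed parametrization).
    """
    df1, df2 = k - 1, k * (n - 1)
    nc = effect_f ** 2 * k * n            # noncentrality parameter, f^2 * N
    f_crit = f.ppf(1 - alpha, df1, df2)   # critical value under H0
    return 1 - ncf.cdf(f_crit, df1, df2, nc)
```

Tabulating this function over a grid of sample and subsample sizes reproduces the kind of power table the abstract describes, with power increasing in n as stated.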
Abstract:
This research work deals with the modeling and design of a low-level speed controller for the mobile robot PRIM. The main objective is to develop an effective educational and research tool. On one hand, the interest in using the open mobile platform PRIM lies in integrating several subjects closely related to automatic control theory in an educational context, embracing communications, signal processing, sensor fusion and hardware design, among others. On the other hand, the idea is to implement useful navigation strategies such that the robot can serve as a mobile multimedia information point. It is in this context, when navigation strategies are oriented towards goal achievement, that a local model predictive control is attained; such studies present a very interesting control strategy for developing the future capabilities of the system. The research developed also includes visual information as a meaningful source that allows detecting obstacle position coordinates and planning the obstacle-free trajectory the robot should follow.
Abstract:
Machine Learning for geospatial data: algorithms, software tools and case studies
The thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence, mainly concerned with the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? In a few words, most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can find solutions for classification, regression and probability density modeling problems in high-dimensional geo-feature spaces, composed of geographical space and additional relevant spatially referenced features. They are well suited to being implemented as predictive engines in decision support systems for environmental data mining, including pattern recognition, modeling and prediction as well as automatic data mapping. Their efficiency is competitive with geostatistical models in low-dimensional geographical spaces, but they are indispensable in high-dimensional geo-feature spaces.
The most important and popular machine learning algorithms and models of interest to geo- and environmental sciences are presented in detail, from a theoretical description of the concepts to software implementation. The main algorithms and models considered are the multilayer perceptron (MLP, a workhorse of machine learning), general regression neural networks (GRNN), probabilistic neural networks (PNN), self-organising (Kohonen) maps (SOM), Gaussian mixture models (GMM), radial basis function networks (RBF) and mixture density networks (MDN). This set of models covers machine learning tasks such as classification, regression and density estimation.
Exploratory data analysis (EDA) is the initial and a very important part of data analysis. In this thesis, the concepts of exploratory spatial data analysis (ESDA) are considered using both the traditional geostatistical approach, experimental variography, and machine learning. Experimental variography, which studies the relations between pairs of points, is a basic tool for the geostatistical analysis of anisotropic spatial correlations; it helps to detect the presence of spatial patterns describable by two-point statistics.
A machine learning approach to ESDA is presented through the k-nearest neighbors (k-NN) method, which is simple and has very good interpretation and visualization properties. An important part of the thesis deals with a current hot topic, the automatic mapping of geospatial data. The general regression neural network (GRNN) is proposed as an efficient model to solve this task. The performance of the GRNN model is demonstrated on Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions. The thesis consists of four chapters: theory, applications, software tools and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office, developed over the last 15 years and used both for many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and for fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, classification of soil types and hydrogeological units, decision-oriented mapping with uncertainties, and natural hazard (landslides, avalanches) assessment and susceptibility mapping. Complementary tools for exploratory data analysis and visualisation were also developed, with care taken to make the software user friendly and easy to use.
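The GRNN proposed for automatic mapping is, at its core, Nadaraya-Watson kernel regression over the training samples. A minimal sketch follows; the function name, the single isotropic kernel width `sigma`, and the dense pairwise computation are illustrative assumptions, not the thesis implementation:

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=1.0):
    """General Regression Neural Network sketch (Nadaraya-Watson
    kernel regression): predictions are kernel-weighted averages of
    the training targets; sigma controls the kernel width."""
    # Squared Euclidean distances between every query and training point
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2 * sigma ** 2))          # Gaussian kernel weights
    return (w @ y_train) / w.sum(axis=1)        # weighted average per query
```

In spatial interpolation, `X_train` would hold coordinates (and any geo-features) of measured locations and `sigma` would be tuned, e.g. by cross-validation; a very small `sigma` makes the model reproduce the training values at the training points.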
Abstract:
The creation of a Pilot Project to support the pre-development stage of software product elaboration can nowadays be used as an approach that improves the overall scheme of running an information technology project. The subject is not new, but until now no model has been presented that gives a deep description of this important stage in the early phase of a project. This Master's Thesis presents research results and findings concerning the pre-development study from the Software Engineering point of view. The aspects of feasibility study and pilot prototype development are analyzed. As a result, a Pilot Project technique is formulated and a scheme is presented. The experimental part focuses on a particular area of Pilot Project scheme implementation: internationally distributed software projects. The specific characteristics, aspects, obstacles, advantages and disadvantages are considered using the example of the cross-border region of Russia and Finland, and a real case of applying the Pilot Project technique is given.
Abstract:
This master's thesis studies and presents, from the literature, how evolutionary algorithms are used to solve different search and optimisation problems in the area of software engineering. Evolutionary algorithms are methods which imitate the natural evolution process. An artificial evolution process evaluates the fitness of each individual, the individuals being candidate solutions. The next population of candidate solutions is formed from the good properties of the current population by applying different mutation and crossover operations. Different kinds of evolutionary algorithm applications related to software engineering were searched for in the literature, classified and presented, together with the necessary basics of evolutionary algorithms. It was concluded that the majority of evolutionary algorithm applications related to software engineering concern software design or testing. For example, there were applications for classifying software production data, project scheduling, static task scheduling for parallel computing, allocating modules to subsystems, N-version programming, test data generation and generating an integration test order. Many applications were experimental rather than ready for real production use. There were also some Computer Aided Software Engineering tools based on evolutionary algorithms.
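The fitness-evaluation, crossover and mutation loop described above can be sketched as a minimal genetic algorithm. All names, operator choices (tournament selection, one-point crossover, bit-flip mutation) and parameter values are illustrative assumptions, not any specific application from the thesis:

```python
import random

def genetic_algorithm(fitness, n_bits=20, pop_size=30, generations=100,
                      p_mut=0.05, seed=0):
    """Minimal GA sketch: tournament selection, one-point crossover,
    bit-flip mutation over fixed-length bit strings."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def select():
            a, b = rng.sample(pop, 2)          # binary tournament
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = select(), select()
            cut = rng.randrange(1, n_bits)     # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [b ^ (rng.random() < p_mut) for b in child]  # mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# Example: maximise the number of 1-bits ("OneMax")
best = genetic_algorithm(sum)
```

In a software-engineering setting the fitness function would instead score, say, the branch coverage achieved by a generated test input.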
Abstract:
The present thesis is focused on minimizing the experimental effort needed to predict pollutant propagation in rivers, by mathematical modelling and knowledge re-use. Mathematical modelling is based on the well-known advection-dispersion equation, while the knowledge re-use approach employs the methods of case-based reasoning, graphical analysis and text mining. The thesis contributes to the pollutant transport research field: (1) analytical and numerical models for pollutant transport prediction; (2) two novel techniques which enable the use of variable parameters along rivers in analytical models; (3) models for the estimation of pollutant transport characteristic parameters (velocity, dispersion coefficient and nutrient transformation rates) as functions of water flow, channel characteristics and/or seasonality; (4) a graphical analysis method for the identification of pollution sources along rivers; (5) a case-based reasoning tool for the identification of crucial information related to pollutant transport modelling; and (6) the application of a software tool for the reuse of information during pollutant transport modelling research. These support tools are applicable both in water quality research and in practice, as they can be involved in multiple activities. The models are capable of predicting pollutant propagation along rivers in cases of both ordinary pollution and accidents. They can also be applied to other similar rivers when modelling pollutant transport with little available experimental concentration data, because the parameter estimation models developed in this thesis enable the calculation of transport characteristic parameters as functions of river hydraulic parameters and/or seasonality.
The similarity between rivers is assessed using case-based reasoning tools, and additional necessary information can be identified using the information-reuse software. Such systems support users and open up possibilities for new modelling methods, monitoring facilities and better river water quality management tools. They are also useful for estimating the environmental impact of possible technological changes and can be applied in the pre-design stage and/or in the practical use of processes.
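For the uniform-parameter case, the advection-dispersion equation mentioned above has a classical closed-form solution for an instantaneous point release. The sketch below evaluates it; function and parameter names are illustrative, and real rivers need the thesis's variable-parameter techniques:

```python
import math

def pollutant_concentration(x, t, mass, area, u, D):
    """1-D analytical solution of the advection-dispersion equation for
    an instantaneous release of `mass` at x = 0, t = 0 in a channel of
    cross-section `area`, flow velocity `u`, dispersion coefficient `D`:

        C(x, t) = M / (A * sqrt(4*pi*D*t)) * exp(-(x - u*t)^2 / (4*D*t))
    """
    return (mass / (area * math.sqrt(4 * math.pi * D * t))
            * math.exp(-((x - u * t) ** 2) / (4 * D * t)))
```

The concentration peak travels downstream at velocity `u` while spreading at a rate set by `D`, which is why estimating these two parameters from hydraulic data is central to the thesis.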
Abstract:
In a paper machine, it is undesirable for the boundary layer flows on the fabric and roll surfaces to travel into the closing nips and create overpressure. In this thesis, the aerodynamic behavior of grooved and smooth rolls is compared in order to understand the nip flow phenomena, which are the main reason vacuum and grooved roll constructions are designed. A common method of removing the boundary layer flow from the closing nip is the vacuum roll construction; its downside is high operational cost due to pressure losses in the vacuum roll shell. The deep grooved roll has the same goal: to create a pressure difference over the paper web and keep the paper attached to the roll or fabric surface in the drying pocket of the paper machine. A literature review revealed that the aerodynamic functionality of the grooved roll is not well known. In this thesis, the aerodynamic functionality of the grooved roll in interaction with a permeable or impermeable wall is studied by varying the groove properties, using computational fluid dynamics simulations as the research tool. The simulations were performed with the commercial fluid dynamics software ANSYS Fluent. Simulation results from 3- and 2-dimensional fluid dynamics models are compared with laboratory-scale measurements made with a grooved roll simulator designed for this research. The variables in the comparison are the paper or fabric wrap angle, surface velocities, groove geometry and wall permeability. Present-day computational and modeling resources limit grooved roll fluid dynamics simulations at the paper machine scale; based on the analysis of the aerodynamic functionality of the grooved roll, a grooved roll simulation tool is therefore proposed. The smooth roll simulations show that the closing nip pressure does not depend on the length of boundary layer development.
An increase in surface velocity affects the pressure distribution in the closing and opening nips. The 3D grooved roll model reveals the aerodynamic functionality of the grooved roll: with the optimal groove size it is possible to avoid closing nip overpressure and keep the web attached to the fabric surface over the wrap angle. The groove flow friction and minor losses play different roles as the wrap angle changes. The proposed 2D grooved roll simulation tool is able to replicate the grooved roll aerodynamic behavior with reasonable accuracy. For a small wrap angle, the chosen approach for calculating the groove friction losses predicts the pressure distribution correctly; for a large wrap angle, it yields too large pressure gradients, and the way of calculating the air flow friction losses in the groove has to be reconsidered. The aerodynamic functionality of the grooved roll is based on minor and viscous losses in the closing and opening nips as well as in the grooves. The proposed 2D grooved roll model is a simplification intended to reduce computational and modeling effort; it makes it possible to simulate complex constructions at the paper machine scale. In order to use the grooved roll as a replacement for the vacuum roll, the grooved roll properties have to be considered on the basis of the web handling application.
Abstract:
This work evaluates an experimental teaching activity on polarimetry for pharmacy and food engineering courses. Foods brought by the undergraduate students were used to demonstrate multidisciplinary concepts, and these concepts were associated with the teaching of polarimetry. According to the results, the benefits of this contextualization extend beyond the class, and the undergraduate students became interested in food quality control. It can be concluded that the experimental emphasis is valid and creates motivation and interest in learning physical chemistry, compared with the traditional methodology used to teach polarimetry.
Abstract:
This work was conducted to evaluate the performance of a low-pressure (bubbler) irrigation system under field conditions. The evaluation consisted of a study in two phases: in the first, the hydraulic design of the irrigation system was prepared using the Bubbler software, version 1.1, while in the second the system was installed and field-tested. Heights of 0.77, 0.71, 0.68 and 0.67 m were set at the outlets of the emitter hoses in the field, as recommended by the program. Flow rates were measured at each emitter hose to determine the Christiansen Uniformity Coefficient (CUC), the Distribution Uniformity (DU) and the Application Efficiency (AE). The tests showed a CUC of 96.64%, a DU of 95.85% and an AE of 86.98%. In the field, the system delivered an average flow rate of 64.8 L h-1 against the 79.2 L h-1 established by the program. The measured flow rates differed from the values projected by the application as a consequence of variations in the diameters and in the head losses (linear and local), which showed a standard deviation of 0.23 m.
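The uniformity indicators reported above follow standard formulas; a minimal sketch of their computation from per-emitter flow rates (the lowest-quarter definition of distribution uniformity is the usual one and is assumed here):

```python
import statistics

def christiansen_cuc(flows):
    """Christiansen Uniformity Coefficient:
    CUC = 100 * (1 - sum(|q_i - mean|) / (n * mean))."""
    m = statistics.mean(flows)
    dev = sum(abs(q - m) for q in flows)
    return 100 * (1 - dev / (len(flows) * m))

def distribution_uniformity(flows):
    """DU = 100 * (mean of the lowest quarter of flows / overall mean)."""
    s = sorted(flows)
    low = s[:max(1, len(s) // 4)]
    return 100 * statistics.mean(low) / statistics.mean(flows)
```

Both indicators equal 100% for perfectly uniform emitter flows and decrease as the flows spread out, which is how the reported 96.64% and 95.85% values would be obtained from the field measurements.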
Abstract:
The development of techniques that increase the efficiency of soil conservation practices is necessary in view of the great losses caused by erosion. To this end, a software tool was developed that uses databases generated in a Geographic Information System and allows level-terrace systems to be designed more rationally, taking into account the spatial variations of the terrain. As input, the software requires elevation and slope images, as well as soil type, use and management characteristics, the methodology for calculating the spacing between terraces, and the recommendation for choosing the most suitable terrace type. As output, the software provides an image with the terrace system laid out, which can be saved in different formats, and a report that can be printed and used together with the image to implement the system.
Abstract:
Grain cooling by ambient-air ventilation is widely applied in the final stages of drying and for subsequent temperature control of stored grain. The objective of this work is a theoretical-experimental study of the thermal state of a soybean grain mass subjected to aeration. Experimental data were obtained on the cooling dynamics of a pre-heated soybean grain mass for different grain-column heights and air velocities. The analysis of the results showed that the cooling rate varies significantly throughout the process and over the whole domain, and that the thermal diffusivity of the layers could not be considered constant. Two mathematical models were presented to simulate the cooling dynamics. In the first model, the cooling domain was divided by a moving boundary into two zones represented by different thermal diffusivities (analogous to a Stefan problem). In the other model, the whole domain was hypothetically divided into thin layers, and thermal equilibrium between the air and the grain mass in each layer was considered to be reached instantaneously ("homogeneous reactors"). The simulation results showed satisfactory agreement with the experimental data.
Abstract:
The objective of this work was to develop software and hardware for automatic monitoring and control of precision irrigation using center-pivot systems. The work was carried out at the Department of Rural Engineering - LER of the Escola Superior de Agricultura "Luiz de Queiroz" - ESALQ, Universidade de São Paulo - USP, in Piracicaba - SP. Discrete electronic components, various integrated circuits, radio-frequency modules, microcontrollers of the Basic Step family and a microcomputer were used, with the Delphi and TBasic programming languages. The hardware consists of two electronic circuits, one interfacing with the computer and the other monitoring tensiometer readings and transmitting them to the computer via radio frequency. The range and efficiency of data transmission of the radio-frequency modules and the performance of the software and hardware were evaluated. The results showed that both the circuits and the applications developed worked satisfactorily. The radio communication tests indicated a maximum range of 50 m. It was concluded that the developed system has great potential for use in precision irrigation with center pivots, provided that the range of the radios is increased.
Abstract:
The PRAPRAG software is a tool for choosing the agricultural machines and implements with the lowest cost per area or per quantity produced, and for planning machine acquisition for the farm from a technical and economic point of view. The Borland Delphi 3.0 programming language was used and, from manufacturers' brochures, a database was created in which the user can register machines and implements and modify their usage characteristics. The software proved to be a useful and user-friendly tool. It brings speed, safety and reliability to the productive and economic management of farms, in the selection and acquisition of mechanized sets and in determining the associated labor costs.
Abstract:
Advances in aerial application of agrochemicals have moved towards reduced spray volumes, which can cause poor distribution and, consequently, irregular deposition. The present work aimed to evaluate the quality of aerial spray application on soybean (Glycine max L.). An experimental agricultural aircraft was used, applying a spray volume of 20 L ha-1. To determine the volumes deposited on the leaves of the upper, middle and lower thirds of the soybean plants, brilliant blue food dye was added to the spray mixture; the leaves were washed and the deposited volume determined by spectrophotometry. The droplet spectrum was obtained with artificial targets made of water-sensitive paper distributed in the upper and middle thirds of the plants. The data were submitted to single-factor analysis of variance, considering the different positions on the plant, and control charts were built from the lower and upper control limits. Aerial spray application on soybean showed lower values of volume median diameter, relative span and coverage in the middle third than in the upper third of the crop. Deposition was lowest in the lower third. The spray coverage indicators showed that aerial application with the evaluated experimental aircraft is not under statistical process control, i.e., outside the quality standard.
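The control-chart verdict above rests on comparing each observation with lower and upper control limits. A minimal sketch of the standard Shewhart-style individuals chart is given below; the 3-sigma limits and the simple "no point outside the limits" rule are assumptions for illustration, as the abstract does not give the exact formulas used:

```python
import statistics

def control_limits(samples):
    """Shewhart-style individuals chart limits: center line at the mean,
    control limits at mean +/- 3 standard deviations."""
    m = statistics.mean(samples)
    s = statistics.stdev(samples)
    return m - 3 * s, m, m + 3 * s

def under_statistical_control(samples):
    """Naive rule: the process is under control if no point falls
    outside the 3-sigma limits computed from the data."""
    lcl, _, ucl = control_limits(samples)
    return all(lcl <= x <= ucl for x in samples)
```

Applied to per-target deposition or coverage values, a single grossly deviant measurement pushes a point outside the limits, which is the kind of evidence behind the "not under statistical process control" conclusion.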