944 results for "Correlazione digitale immagini freeware MATLAB"
Abstract:
One possible approach to the management of municipal solid waste is energy recovery, that is, incineration with recovery of the energy released. It is very important, however, to control the incineration process properly so as to avoid, as far as possible, the release into the atmosphere of pollutants that could cause industrial contamination problems. Ensuring that both the incineration process and the treatment of the flue gases run under optimal conditions presupposes a good knowledge of the dependencies between the process variables. Suitable methods are needed for measuring the most important variables, and the measured values must be processed with appropriate models to transform them into control quantities. A classical control model looks unpromising in this case, given the complexity of the processes, the lack of a quantitative description, and the need to perform the calculations in real time. This can only be achieved with the help of modern data processing techniques and computational methods, such as simulation techniques, mathematical models, knowledge-based systems and intelligent interfaces. [Ono, 1989] describes a fuzzy-logic control system applied to the field of municipal waste incineration. At the FZK research centre in Karlsruhe, applications combining fuzzy logic with neural networks [Jaeschke, Keller, 1994] are being developed for the control of the TAMARA waste incineration pilot plant. This thesis proposes a knowledge acquisition method for the control of complex systems inspired by human behaviour. When we face an unknown situation, at first we do not know how to act, except by extrapolating from previous experience that may prove useful. By applying trial and error, reinforcement of hypotheses, and so on, we acquire and refine knowledge and build a mental model. An analogous method, implementable in a computer system, can be designed using Artificial Intelligence techniques. Thus, in a complex process we often have a set of process data which, a priori, is not structured enough to be useful. Knowledge acquisition proceeds through a series of stages:
- A first selection of the variables we are interested in.
- State of the system. We can start by applying classification techniques (unsupervised learning) to group the data and obtain a representation of the state of the plant. A classification can be established, but normally almost all the data fall into a single class, corresponding to normal operation. Once this is done, classical statistical methods are used to refine the knowledge by looking for correlations between variables (principal component analysis), so as to simplify and shorten the list of variables (see the sketch after this list).
- Signal analysis. To analyse and classify the signals (for example, the furnace temperature), methods better suited to describing the nonlinear behaviour of the system, such as neural networks, can be used. A further step is to establish causal relations between the variables; analytical models are helpful here.
- As the final result of the process, the knowledge-based system is designed.
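To illustrate the variable-reduction stage above, here is a minimal MATLAB sketch of principal component analysis; the data and the 95% threshold are hypothetical placeholders, not values from the thesis:

```matlab
% X: one row per time sample, one column per measured plant variable (placeholder data)
X = randn(1000, 20);                          % e.g. 20 process variables, 1000 samples
[coeff, score, ~, ~, explained] = pca(X);     % Statistics and Machine Learning Toolbox
k  = find(cumsum(explained) >= 95, 1);        % components covering 95% of the variance
Xr = score(:, 1:k);                           % reduced description of the plant state
```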
The main objective is to apply the method to the specific case of controlling a municipal solid waste treatment plant based on energy recovery. First, Chapter 2, Municipal solid waste, deals with the global problem of waste management, giving an overview of the different existing alternatives and of the current national and international situation. The problems of waste incineration are analysed in greater detail, with particular attention to those characteristics of the waste that matter most for the combustion process. Chapter 3, Description of the process, gives a general description of the incineration process and of the different elements of an incineration plant: from the reception and storage of the waste, through the different types of furnaces and the requirements of good combustion practice codes, to the combustion air system and the flue gas system. The different flue gas cleaning systems are also presented and, finally, the ash and slag removal system. Chapter 4, The Girona municipal solid waste treatment plant, describes the main systems of the Girona incineration plant: the waste feed, the type of furnace, the energy recovery system, and the flue gas cleaning system. It also describes the control system, the operation, the plant's operating data, the instrumentation, and the variables of interest for the control of the combustion process. Chapter 5, Techniques used, provides a global view of knowledge-based systems and expert systems. The different techniques used are explained: neural networks, classification systems, qualitative models and expert systems, illustrated with some application examples. With respect to knowledge-based systems, the conditions for their applicability and the forms of knowledge representation are analysed first. The different forms of reasoning are then described: neural networks, expert systems and fuzzy logic, and a comparison is made between them. An application of neural networks to the analysis of temperature time series is presented. The analysis of operating data by means of statistical techniques and the use of classification techniques are also dealt with. Another section is devoted to the different types of models, including a discussion of qualitative models. The computer-aided design environment for the design of supervisory systems, CASSD, used in this thesis is described, together with the analysis tools for obtaining qualitative information about the behaviour of the process: Abstractors and ALCMEN. An example of the application of these techniques to find the relations between the temperature and the operator's actions is included. Finally, the main characteristics of expert systems in general, and of the expert system CEES 2.0, which also forms part of the CASSD environment used, are analysed. Chapter 6, Results, presents the results obtained by applying the different techniques: neural networks, classification, the development of the combustion process model, and rule generation.
Within the data analysis section, a neural network is used to classify a temperature signal. The use of the LINNEO+ method to classify the plant's operating states is also described. In the section devoted to modelling, a combustion model is developed that serves as a basis for analysing the behaviour of the furnace under steady-state and dynamic conditions. A parameter, the flame surface, related to the extent of the fire on the grate, is defined. The dynamic response of the incineration process is analysed by means of a linearized model. Qualitative relations between the variables are then defined and used to build a qualitative model. Next, a new qualitative model is developed, taking the analytical dynamic model as its basis. Finally, the development of the knowledge base of the expert system through rule generation is addressed. Chapter 7, Control system of an incineration plant, analyses the objectives of a control system for an incineration plant and its design and implementation. The basic objectives of the combustion control system, its configuration and its implementation in Matlab/Simulink are described, using the different tools developed in the previous chapter. Lastly, to show how the different methods developed in this thesis can be applied, an expert system is built to keep the furnace temperature constant by acting on the waste feed. Finally, the chapter Conclusions presents the conclusions and results of this thesis.
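As a flavour of that final application, here is a minimal sketch of the kind of rule such an expert system could encode; the setpoint, deadband and gains are hypothetical, not taken from the thesis:

```matlab
% One illustrative control rule: hold furnace temperature near a setpoint
% by adjusting the waste feed (all numbers are placeholders).
T_set  = 1000;                    % target furnace temperature [degC]
T_meas = 985;                     % current measured temperature [degC]
feed   = 1.0;                     % normalized waste feed rate
if T_meas < T_set - 10
    feed = feed * 1.05;           % temperature low: feed more waste
elseif T_meas > T_set + 10
    feed = feed * 0.95;           % temperature high: feed less waste
end
```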
Abstract:
The proposal presented in this thesis is to provide designers of knowledge-based supervisory systems for dynamic systems with a framework that facilitates their tasks, avoiding interface problems among tools, data flow and management. The approach is intended to be useful to both control and process engineers in assisting their tasks. The use of AI technologies to diagnose and control process loops and, of course, to assist supervisory tasks such as fault detection and diagnosis, is within the scope of this work. Special effort has been put into the integration of tools for assisting expert supervisory system design. With this aim, the experience of Computer Aided Control Systems Design (CACSD) frameworks has been analysed and used to design a Computer Aided Supervisory Systems Design (CASSD) framework. In this sense, some basic facilities are required to be available in this proposed framework:
Abstract:
This paper presents the model SCOPE (Soil Canopy Observation, Photochemistry and Energy fluxes), a vertically (1-D) integrated radiative transfer and energy balance model. The model links visible to thermal infrared radiance spectra (0.4 to 50 μm) as observed above the canopy to the fluxes of water, heat and carbon dioxide, as a function of vegetation structure and the vertical profiles of temperature. The output of the model is the spectrum of outgoing radiation in the viewing direction together with the turbulent heat fluxes, photosynthesis and chlorophyll fluorescence. A special routine is dedicated to the calculation of the photosynthesis rate and chlorophyll fluorescence at the leaf level as a function of net radiation and leaf temperature. The fluorescence contributions from individual leaves are integrated over the canopy layer to calculate top-of-canopy fluorescence. The calculation of radiative transfer and the energy balance is fully integrated, allowing for feedback between leaf temperatures, leaf chlorophyll fluorescence and radiative fluxes. Leaf temperatures are calculated on the basis of energy balance closure. Model simulations were evaluated against observations reported in the literature and against data collected during field campaigns. These evaluations showed that SCOPE is able to reproduce realistic radiance spectra, directional radiance and energy balance fluxes. The model may be applied in the design of algorithms for the retrieval of evapotranspiration from optical and thermal earth observation data, for the validation of existing methods to monitor vegetation functioning, to help interpret canopy fluorescence measurements, and to study the relationships between synoptic observations and diurnally integrated quantities. The model has been implemented in Matlab and has a modular design, allowing for great flexibility and scalability.
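As an illustration of the energy balance closure step, a minimal MATLAB sketch that solves for the leaf temperature making net radiation equal the sum of the turbulent fluxes; all constants are illustrative assumptions, not SCOPE's internals:

```matlab
% Leaf energy balance closure: find Tl such that Rn - H(Tl) - LE = 0.
rho = 1.2;  cp = 1004;           % air density [kg m^-3], heat capacity [J kg^-1 K^-1]
Rn = 400;  Ta = 293;  ra = 50;   % net radiation [W m^-2], air temp [K], aerodynamic resistance [s m^-1] (assumed)
LE = 150;                        % latent heat flux, held fixed here for simplicity [W m^-2]
residual = @(Tl) Rn - rho*cp*(Tl - Ta)/ra - LE;   % sensible heat H = rho*cp*(Tl-Ta)/ra
Tl = fzero(residual, Ta)         % leaf temperature that closes the balance
```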
Abstract:
This study examined the opinions of in-service and prospective chemistry teachers about the importance of using molecular and crystal models in secondary-level school practice, and investigated some of the reasons for their (non-)usage. The majority of participants stated that the use of models plays an important role in chemistry education and that they would use them more often if circumstances were more favourable. Many teachers claimed that three-dimensional (3D) models are still not available in sufficient numbers at their schools; they also pointed to the lack of available computer facilities during chemistry lessons. The research revealed that, besides the inadequate material circumstances, fewer than one third of participants are able to use simple (freeware) computer programs for drawing molecular structures and presenting them in virtual space; however, both groups of teachers expressed a willingness to improve their knowledge in this area. The investigation points to several actions that could be taken to improve the current situation.
Abstract:
Space applications are challenged by the reliability of parallel computing systems (FPGAs) employed in spacecraft due to Single-Event Upsets. The work reported in this paper aims to achieve self-managing systems that are reliable for space applications by applying autonomic computing constructs to parallel computing systems. A novel technique, 'Swarm-Array Computing', inspired by swarm robotics and built on the foundations of autonomic and parallel computing, is proposed as a path to achieving autonomy. The constitution of swarm-array computing, comprising four constituents, namely the computing system, the problem/task, the swarm and the landscape, is considered. Three approaches that bind these constituents together are proposed. The feasibility of one of the three proposed approaches is validated on the SeSAm multi-agent simulator, and landscapes representing the computing space and problem are generated using MATLAB.
Abstract:
This paper illustrates how nonlinear programming and simulation tools, which are available in packages such as MATLAB and SIMULINK, can easily be used to solve optimal control problems with state- and/or input-dependent inequality constraints. The method presented is illustrated with a model of a single-link manipulator. The method is suitable to be taught to advanced undergraduate and Master's level students in control engineering.
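A generic sketch of the approach under assumed dynamics, cost and bounds (not the paper's actual formulation): parameterize the torque on a time grid, simulate the single-link arm with ode45, and let fmincon (Optimization Toolbox) enforce the input constraint as box bounds.

```matlab
function singlelink_oc
% Direct single-shooting: find a torque profile driving a single-link
% arm to the upright position subject to |u| <= umax (assumed model).
N = 20;  T = 2;  tu = linspace(0, T, N);  umax = 5;
u0 = zeros(1, N);                                     % initial guess
u  = fmincon(@(u) cost(u, tu), u0, [], [], [], [], ...
             -umax*ones(1,N), umax*ones(1,N));        % input bounds
plot(tu, u); xlabel('t [s]'); ylabel('u [Nm]');
end

function J = cost(u, tu)
% Simulate assumed pendulum dynamics under the interpolated input profile.
f = @(t, x) [x(2); interp1(tu, u, t) - 9.81*sin(x(1)) - 0.5*x(2)];
[~, x] = ode45(f, [tu(1) tu(end)], [0; 0]);           % start from rest
J = (x(end,1) - pi)^2 + x(end,2)^2;                   % end upright, at rest
end
```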
Abstract:
The work reported in this paper proposes Swarm-Array computing, a novel technique inspired by swarm robotics and built on the foundations of autonomic and parallel computing. The approach aims to apply autonomic computing constructs to parallel computing systems and, in effect, achieve the self-ware objectives that describe self-managing systems. The constitution of swarm-array computing, comprising four constituents, namely the computing system, the problem/task, the swarm and the landscape, is considered. Approaches that bind these constituents together are proposed. Space applications employing FPGAs are identified as a potential area for applying swarm-array computing to build reliable systems. The feasibility of a proposed approach is validated on the SeSAm multi-agent simulator, with landscapes generated using MATLAB.
Abstract:
In recent years nonpolynomial finite element methods have received increasing attention for the efficient solution of wave problems. As with their close cousin the method of particular solutions, high efficiency comes from using solutions to the Helmholtz equation as basis functions. We present and analyze such a method for the scattering of two-dimensional scalar waves from a polygonal domain that achieves exponential convergence purely by increasing the number of basis functions in each element. Key ingredients are the use of basis functions that capture the singularities at corners and the representation of the scattered field towards infinity by a combination of fundamental solutions. The solution is obtained by minimizing a least-squares functional, which we discretize in such a way that a matrix least-squares problem is obtained. We give computable exponential bounds on the rate of convergence of the least-squares functional that are in very good agreement with the observed numerical convergence. Challenging numerical examples, including a nonconvex polygon with several corner singularities, and a cavity domain, are solved to around 10 digits of accuracy with a few seconds of CPU time. The examples are implemented concisely with MPSpack, a MATLAB toolbox for wave computations with nonpolynomial basis functions, developed by the authors. A code example is included.
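For readers without the toolbox, here is a plain-MATLAB sketch of the underlying idea, approximating the scattered field by fundamental solutions and fitting the boundary condition in the least-squares sense; it does not use MPSpack's actual API, and the geometry and parameters are assumed:

```matlab
% Method of fundamental solutions for sound-soft scattering from the unit circle.
k  = 10;                                    % wavenumber (assumed)
zb = exp(1i*linspace(0, 2*pi, 200).');      % boundary collocation points
zs = 0.8*exp(1i*linspace(0, 2*pi, 80).');   % source points just inside the obstacle
A  = besselh(0, 1, k*abs(zb - zs.'));       % fundamental solutions on the boundary
ui = exp(1i*k*real(zb));                    % incident plane wave
c  = A \ (-ui);                             % least squares: u_scat = -u_inc on boundary
resid = norm(A*c + ui)/norm(ui)             % boundary residual as an accuracy indicator
```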
Abstract:
Techniques for modelling urban microclimates and urban block surface temperatures are sought by urban planners and architects for strategic urban design at the early design stages. This paper introduces a simplified mathematical model for urban simulations (UMsim), covering urban surface temperatures and microclimates. The nodal network model has been developed by integrating a coupled thermal and airflow model. Direct solar radiation, diffuse radiation, reflected radiation, long-wave radiation, heat convection in air, and heat transfer in the exterior walls and ground within the complex are all taken into account. The relevant equations have been solved using the finite difference method on the Matlab platform. Comparisons have been conducted between the data produced by the simulation and data from an urban experimental study carried out in a real architectural complex on the campus of Chongqing University, China, in July 2005 and January 2006. The results show satisfactory agreement between the two sets of data. UMsim can be used to simulate microclimates, in particular the surface temperatures of urban blocks, and can therefore be used to assess the impact of urban surface properties on urban microclimates. UMsim will be able to produce robust data and images of urban environments for sustainable urban design.
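For flavour, a minimal finite-difference sketch in MATLAB of the kind of nodal heat conduction calculation involved; this is illustrative 1-D wall conduction only, with assumed material properties, not UMsim itself:

```matlab
% Explicit 1-D finite-difference conduction through an exterior wall.
alpha = 7e-7;  L = 0.2;  n = 21;          % diffusivity [m^2/s], thickness [m], nodes (assumed)
dx = L/(n-1);  dt = 0.4*dx^2/alpha;       % explicit stability requires dt <= dx^2/(2*alpha)
T = 20*ones(n,1);  T(1) = 35;             % interior held at 20 degC, outer surface forced to 35 degC
for step = 1:5000                         % march in time
    T(2:n-1) = T(2:n-1) + alpha*dt/dx^2 * (T(3:n) - 2*T(2:n-1) + T(1:n-2));
end
plot(linspace(0, L, n), T); xlabel('depth [m]'); ylabel('T [degC]');
```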
Abstract:
A Bond Graph is a graphical modelling technique that represents the flow of energy between the components of a system. When used to model power electronic systems, it is necessary to incorporate bond graph elements that represent a switch. In this paper, three different methods of modelling switching devices are compared and contrasted: the Modulated Transformer with a binary modulation ratio (MTF), the ideal switch element, and the Switched Power Junction (SPJ) method. These three methods are used to model a dc-dc boost converter, and simulations are then run in MATLAB/SIMULINK. To provide a reference against which to compare results, the converter is also simulated using PSPICE. Both quantitative and qualitative comparisons are made to determine the suitability of each of the three bond graph switch models for specific power electronics applications.
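For orientation, a minimal state-space sketch of the converter being modelled (an averaged ideal-switch description with assumed component values, not any of the paper's three bond-graph formulations):

```matlab
% Averaged dc-dc boost converter: states are inductor current and capacitor voltage.
Vin = 12;  Lind = 1e-3;  C = 100e-6;  R = 20;  d = 0.5;   % assumed values, duty cycle d
f = @(t, x) [ (Vin - (1-d)*x(2)) / Lind;                  % d(iL)/dt
              ((1-d)*x(1) - x(2)/R) / C ];                % d(vC)/dt
[t, x] = ode45(f, [0 0.05], [0; 0]);
plot(t, x(:,2)); xlabel('t [s]'); ylabel('v_C [V]');      % settles near Vin/(1-d) = 24 V
```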
Abstract:
The development of an Artificial Neural Network model of UK domestic appliance energy consumption is presented. The model uses diary-style appliance use data and a survey questionnaire collected from 51 households during the summer of 2010. It also incorporates measured energy data and is sensitive to socioeconomic, physical dwelling and temperature variables. A prototype model is constructed in MATLAB using a two-layer feed-forward network with backpropagation training and has a 12:10:24 architecture. Model outputs include appliance load profiles, which can be applied to the fields of energy planning (micro renewables and smart grids), building simulation tools and energy policy.
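A minimal sketch of such a network in MATLAB (Deep Learning Toolbox); the data here are random placeholders standing in for the survey and monitoring inputs:

```matlab
% 12:10:24 feed-forward network with backpropagation training.
X = rand(12, 500);                    % 12 input features per sample (placeholder)
T = rand(24, 500);                    % 24-point daily appliance load profile (placeholder)
net = feedforwardnet(10, 'traingd');  % one hidden layer of 10 units, gradient-descent backprop
net = train(net, X, T);               % input (12) and output (24) sizes are inferred from the data
profile = net(X(:, 1));               % predicted load profile for one household
```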
Abstract:
The development of a combined engineering and statistical Artificial Neural Network model of UK domestic appliance load profiles is presented. The model uses diary-style appliance use data and a survey questionnaire collected from 51 suburban households and 46 rural households during the summers of 2010 and 2011 respectively. It also incorporates measured energy data and is sensitive to socioeconomic, physical dwelling and temperature variables. A prototype model is constructed in MATLAB using a two-layer feed-forward network with backpropagation training, which has a 12:10:24 architecture. Model outputs include appliance load profiles, which can be applied to the fields of energy planning (microrenewables and smart grids), building simulation tools and energy policy.
Abstract:
This paper describes an experimental application of constrained predictive control and feedback linearisation based on dynamic neural networks. It also verifies experimentally a method for handling input constraints, which are transformed by the feedback linearisation mappings. A performance comparison with a PID controller is also provided. The experimental system consists of a laboratory-based single-link manipulator arm, which is controlled in real time using MATLAB/SIMULINK together with data acquisition equipment.
Abstract:
The type and thickness of insulation on the horizontal topside of cold pitched roofs plays a significant role in controlling air movement, energy conservation and the reduction of moisture transfer through the ceiling to the loft (roof void) space. To investigate its importance, a numerical model using a HAM software package on a Matlab platform with a Simulink simulation tool has been developed, using in-situ measurements of airflows from the dwelling space through the ceiling to the loft in three houses of different configurations and loft spaces. Considering typical UK roof underlays (i.e. bituminous felt and a vapour permeable underlay), in-situ measurements in the three houses were made using a calibrated passive sampling technique. Using the measured airflows, the effect of air movement on three types of roof insulation (fibreglass, cellulose and foam) was modelled to investigate the associated energy losses and moisture transport. The thickness of the insulation materials was varied, but the ceiling airtightness and eaves gap size were kept constant. These cases were considered in order to visualize the effects of the changing parameters. In addition, two roof underlays of differing resistances were considered and compared, to assess the influence of the underlay, if any, on energy conservation. The comparison of these insulation materials in relation to the other parameters showed that the type and thickness of the insulation material contribute significantly to energy conservation and to the reduction of moisture transfer through the roof, and hence through the building as a whole.
Abstract:
The ability to create accurate geometric models of neuronal morphology is important for understanding the role of shape in information processing. Despite a significant amount of research on automating neuron reconstruction from image stacks obtained via microscopy, in practice most data are still collected manually. This paper describes Neuromantic, an open source system for three-dimensional digital tracing of neurites. Neuromantic reconstructions are comparable in quality to those of existing commercial and freeware systems, while balancing the speed and accuracy of manual reconstruction. The combination of semi-automatic tracing, intuitive editing, and the ability to visualize large image stacks on standard computing platforms provides a versatile tool that can help address the bottleneck in the availability of reconstructions. Practical considerations for reducing the computational time and space requirements of the extended algorithm are also discussed.