37 results for "Hydrologic Modeling Catchment and Runoff Computations" at Universidad Politécnica de Madrid
Application of the Extended Kalman filter to fuzzy modeling: Algorithms and practical implementation
Abstract:
The modeling phase is fundamental both in the analysis of a dynamic system and in the design of a control system. This phase is even more critical when it must be carried out on-line and the only information about the system comes from input/output data. This paper presents adaptation algorithms for fuzzy systems based on the extended Kalman filter, which allow accurate models to be obtained without renouncing the computational efficiency that characterizes the Kalman filter, and which permit on-line implementation alongside the process.
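The adaptation scheme described can be sketched as follows. This is a minimal illustration, not the paper's algorithm: it assumes a zero-order Takagi-Sugeno fuzzy model whose consequent parameters are adapted on-line with an EKF-style recursion (for a model linear in its parameters this reduces to a Kalman/RLS update); the membership functions and noise values are illustrative assumptions.

```python
import numpy as np

def gaussian_mf(x, centers, width):
    """Normalized Gaussian membership degrees for a scalar input x."""
    w = np.exp(-0.5 * ((x - centers) / width) ** 2)
    return w / w.sum()

centers = np.linspace(-1.0, 1.0, 5)  # rule centers (assumed)
theta = np.zeros(5)                  # consequent parameters to adapt
P = np.eye(5) * 100.0                # parameter covariance
q, r = 1e-6, 1e-2                    # process / measurement noise (assumed)

def ekf_step(x, y):
    """One on-line update from an input/output sample (x, y)."""
    H = gaussian_mf(x, centers, 0.4)  # output Jacobian w.r.t. theta
    P[:] = P + q * np.eye(5)          # covariance prediction
    S = H @ P @ H + r                 # innovation variance
    K = P @ H / S                     # Kalman gain
    theta[:] = theta + K * (y - H @ theta)
    P[:] = P - np.outer(K, H @ P)

# Identify y = sin(pi * x) from noisy samples, one at a time.
rng = np.random.default_rng(0)
for _ in range(500):
    x = rng.uniform(-1.0, 1.0)
    ekf_step(x, np.sin(np.pi * x) + rng.normal(0.0, 0.05))
```

Because each sample triggers a fixed-cost rank-one update, the recursion runs in constant time per sample, which is what makes in-line use with the process plausible.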
Advances in the modeling, characterization and reliability of concentrator multijunction solar cells
Abstract:
Concentrator photovoltaic systems (CPV) seem to be one of the most promising ways to generate electricity at competitive prices. Nowadays, research is focused on increasing the efficiency and the concentration of the systems in order to reduce costs. At the same time, another important area of research is the study of the reliability of the different components which make up a CPV system. In fact, in order for a CPV system to be cost-effective, it should have a warranty at least similar to that of systems based on Si solar cells. In the present thesis, we study in depth the behavior of multijunction solar cells under ultra-high concentration.
With this purpose in mind, a three-dimensional circuital distributed model which is able to simulate the behavior of triple-junction solar cells under different working conditions has been developed. Also, an advanced characterization of these solar cells has been carried out in order to better understand their behavior and thus contribute to improving efficiency. Finally, accelerated life tests have been carried out on commercial lattice-matched triple-junction solar cells in order to determine their reliability. In order to simulate triple-junction solar cells, a 3D circuital distributed model which integrates a full description of the tunnel junction has been developed. We have analyzed the behavior of the multijunction solar cell under light profiles which cause the current density photo-generated in the solar cell to be higher than the tunnel junction's peak current density. The advanced model developed also takes into account the lateral current spreading through the semiconductor layers which constitute and surround the tunnel junction. Therefore, the effects of non-uniform light profiles, in both irradiance and the spectral content produced by the concentrators on the solar cell, have been simulated and analyzed. In order to determine which recombination mechanisms are limiting the behavior of each subcell in a triple-junction stack, and to try to reduce them when possible, an electrical characterization of single-junction solar cells that resemble the subcells in a triple-junction stack has been carried out. Also, the dark I-V curves of the GaInP and GaAs subcells in a dual-junction solar cell have been determined by using an electro-optical reciprocity theorem. Finally, the impact of the different recombination mechanisms on the behavior of the triple-junction solar cell under concentration has been analyzed.
In order to determine the reliability of these solar cells, a temperature accelerated life test has been carried out on commercial triple-junction solar cells. In the present thesis, the design and the evolution of the test, as well as the data obtained from the analysis of the preliminary results, are presented.
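The distributed circuit model itself is far too large to reproduce here, but its elementary building block, the single-diode relation solved at each node of such a mesh, can be sketched as below. All parameter values are illustrative assumptions, not numbers from the thesis.

```python
import numpy as np

J_PH = 140.0    # photo-generated current density, A/cm^2 (~1000 suns, assumed)
J_0 = 1e-19     # diode saturation current density, A/cm^2 (assumed)
N_ID = 1.0      # diode ideality factor
V_T = 0.02585   # thermal voltage at ~300 K, V

def current_density(v):
    """Net current density J(V) of one elementary diode node."""
    return J_PH - J_0 * (np.exp(v / (N_ID * V_T)) - 1.0)

def open_circuit_voltage(lo=0.0, hi=2.0, tol=1e-9):
    """Bisection on J(V) = 0; J is monotonically decreasing in V."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if current_density(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

In a distributed model, thousands of such nodes are coupled through lateral sheet resistances and tunnel-junction branches; the point here is only the per-node nonlinearity that the circuit solver must handle.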
Abstract:
At present, all methods in Evolutionary Computation are bio-inspired by the fundamental principles of neo-Darwinism, as well as by vertical gene transfer. Virus transduction is one of the key mechanisms of horizontal gene propagation in microorganisms (e.g. bacteria). In the present paper, we model and simulate a transduction operator, exploring the possible role and usefulness of transduction in a genetic algorithm. The genetic algorithm including transduction has been named PETRI (an abbreviation of Promoting Evolution Through Reiterated Infection). Our results show how PETRI reaches higher fitness values as the transduction probability comes close to 100%. The conclusion is that transduction improves the performance of a genetic algorithm, assuming a population divided among several sub-populations or "bacterial colonies".
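A transduction-like operator of the kind described can be sketched as follows; the fragment-selection and colony-pairing rules here are assumptions for illustration, not the exact PETRI operator.

```python
import random

def fitness(individual):
    """Toy OneMax fitness (an assumption for illustration)."""
    return sum(individual)

def transduce(colonies, p_transduction, rng=random):
    """With probability p, copy a random gene fragment from the fittest
    donor of each colony into a random recipient in the next colony,
    mimicking virus-mediated horizontal gene transfer."""
    for i, colony in enumerate(colonies):
        if rng.random() > p_transduction:
            continue
        donor = max(colony, key=fitness)           # virus packs donor genes
        other = colonies[(i + 1) % len(colonies)]  # horizontal transfer
        recipient = rng.choice(other)
        a = rng.randrange(len(donor))
        b = rng.randrange(a, len(donor))
        recipient[a:b + 1] = donor[a:b + 1]        # fragment insertion
```

Unlike crossover, the transfer is asymmetric (donor to recipient) and crosses sub-population boundaries, which is what distinguishes horizontal from vertical gene flow in this setting.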
Abstract:
The opening of new windows on the façade is proposed as a refurbishment strategy in an existing building in Málaga to facilitate cross ventilation of the dwellings. The building is a residential block of 140 public housing units for rent for people with low income in Málaga (Spain), property of the City Council. By modeling with Computational Fluid Dynamics (CFD), eleven configurations of openings are studied in two different areas of the building's main housing type. The quantity of air introduced into or extracted from the room and the generated airflow patterns are obtained. The modeling allows the different opening configurations to be compared in order to determine the most appropriate ventilation option for every room.
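As a rough complement to a CFD study like the one above, the flow through a single opening is often estimated with the standard orifice equation; the discharge coefficient and pressure difference below are assumed example values, not results from the paper.

```python
import math

def airflow_m3s(area_m2, delta_p_pa, cd=0.61, rho=1.2):
    """Volumetric flow (m^3/s) through an opening driven by a pressure
    difference: Q = Cd * A * sqrt(2 * dP / rho). Cd = 0.61 is a common
    sharp-edged-orifice assumption; rho is air density in kg/m^3."""
    return cd * area_m2 * math.sqrt(2.0 * delta_p_pa / rho)

def air_changes_per_hour(q_m3s, room_volume_m3):
    """Air change rate implied by a steady flow through the room."""
    return q_m3s * 3600.0 / room_volume_m3
```

Such hand estimates give only bulk flow; the airflow *patterns* inside the room, which the paper uses to compare opening configurations, require the CFD model.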
Abstract:
A clear statement in these lines textually cited (Byers et al., 1938) defines the framework of this special issue: “True soil is the product of the action of climate and living organism upon the parent material, as conditioned by the local relief. The length of time during which these forces are operative is of great importance in determining the character of the ultimate product. Drainage conditions are also important and are controlled by local relief, by the nature of the parent material or underlying rock strata, or by the amount of precipitation in relation to rate of percolation and runoff water. There are, therefore, five principal factors of soil formation: Parent material, climate, biological activity, relief and time. These soil forming factors are interdependent, each modifying the effectiveness of the others.” Owing to these various processes associated with its formation and genesis, soil dynamics reveals a high complexity that creates several levels of structure, using this term in a broad sense.
Abstract:
Hydrology is the study of the properties, distribution and effects of water on the Earth's soil, rocks and atmosphere. It also encompasses the study of the hydrologic cycle of precipitation, runoff, infiltration, storage, and evaporation, including the physical, biological and chemical reaction of water with the earth and its relation to life.
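The precipitation-runoff partitioning mentioned above is commonly estimated with the SCS curve-number method; this sketch implements that standard formula with an assumed example curve number.

```python
def scs_runoff_mm(p_mm, cn):
    """Direct runoff depth (mm) for storm rainfall p_mm and curve number cn,
    using the SCS curve-number method."""
    s = 25400.0 / cn - 254.0   # potential maximum retention, mm
    ia = 0.2 * s               # initial abstraction (standard 0.2*S assumption)
    if p_mm <= ia:
        return 0.0             # all rainfall abstracted, no direct runoff
    return (p_mm - ia) ** 2 / (p_mm - ia + s)
```

For example, with an assumed CN of 75, a 100 mm storm yields roughly 41 mm of direct runoff, the rest being abstracted by interception, infiltration and storage.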
Abstract:
Irregular computations pose some of the most interesting and challenging problems in automatic parallelization. Irregularity appears in certain kinds of numerical problems and is pervasive in symbolic applications. Such computations often use dynamic data structures, which make heavy use of pointers. This complicates all the steps of a parallelizing compiler, from independence detection to task partitioning and placement. Starting in the mid 80s there has been significant progress in the development of parallelizing compilers for logic programming (and more recently, constraint programming) resulting in quite capable parallelizers. The typical applications of these paradigms frequently involve irregular computations, and make heavy use of dynamic data structures with pointers, since logical variables represent in practice a well-behaved form of pointers. This arguably makes the techniques used in these compilers potentially interesting. In this paper, we introduce in a tutorial way some of the problems faced by parallelizing compilers for logic and constraint programs and provide pointers to some of the significant progress made in the area. In particular, this work has resulted in a series of achievements in the areas of inter-procedural pointer aliasing analysis for independence detection, cost models and cost analysis, cactus-stack memory management, techniques for managing speculative and irregular computations through task granularity control and dynamic task allocation (such as work-stealing schedulers), etc.
Abstract:
Irregular computations pose some of the most interesting and challenging problems in automatic parallelization. Irregularity appears in certain kinds of numerical problems and is pervasive in symbolic applications. Such computations often use dynamic data structures which make heavy use of pointers. This complicates all the steps of a parallelizing compiler, from independence detection to task partitioning and placement. In the past decade there has been significant progress in the development of parallelizing compilers for logic programming and, more recently, constraint programming. The typical applications of these paradigms frequently involve irregular computations, which arguably makes the techniques used in these compilers potentially interesting. In this paper we introduce in a tutorial way some of the problems faced by parallelizing compilers for logic and constraint programs. These include the need for inter-procedural pointer aliasing analysis for independence detection and having to manage speculative and irregular computations through task granularity control and dynamic task allocation. We also provide pointers to some of the progress made in these areas. In the associated talk we demonstrate representatives of several generations of these parallelizing compilers.
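One of the runtime techniques mentioned above, work stealing, can be illustrated with a minimal sketch: each worker owns a deque, popping its newest task locally while idle thieves steal the oldest task from the other end. This toy version (with a lock where real schedulers use lock-free deques) is illustrative only, not a compiler runtime.

```python
import threading
from collections import deque

class WorkStealingDeque:
    """Per-worker task deque: the owner works LIFO, thieves steal FIFO."""

    def __init__(self):
        self._tasks = deque()
        self._lock = threading.Lock()

    def push(self, task):
        """Owner enqueues a newly spawned task at its own end."""
        with self._lock:
            self._tasks.append(task)

    def pop(self):
        """Owner takes its newest task (good cache locality)."""
        with self._lock:
            return self._tasks.pop() if self._tasks else None

    def steal(self):
        """An idle thief takes the oldest task (likely the largest subtree)."""
        with self._lock:
            return self._tasks.popleft() if self._tasks else None
```

The LIFO/FIFO asymmetry is the design point: owners exploit locality among recently spawned tasks, while thieves grab old tasks that tend to represent large unexplored work, keeping contention between the two ends low.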
Abstract:
Sustaining irrigated agriculture to meet food production needs while maintaining aquatic ecosystems is at the heart of many policy debates in various parts of the world, especially in arid and semi-arid areas. Researchers and practitioners are increasingly calling for integrated approaches, and policy-makers are progressively supporting the inclusion of ecological and social aspects in water management programs. This paper contributes to this policy debate by providing an integrated economic-hydrologic modeling framework that captures the socio-economic and environmental effects of various policy initiatives and climate variability. This modeling integration includes a risk-based economic optimization model and a hydrologic water management simulation model that have been specified for the Middle Guadiana basin, a vulnerable drought-prone agro-ecological area with highly regulated river systems in southwest Spain. Namely, two key water policy interventions were investigated: the implementation of minimum environmental flows (supported by the European Water Framework Directive, EU WFD), and a reduction in the legal amount of water delivered for irrigation (a planned measure included in the new Guadiana River Basin Management Plan, GRBMP, still under discussion). Results indicate that current patterns of excessive water use for irrigation in the basin may put environmental flow demands at risk, jeopardizing the WFD's goal of restoring the 'good ecological status' of water bodies by 2015. Conflicts between environmental and agricultural water uses will be stressed during prolonged dry episodes, and particularly in summer low-flow periods, when there is an important increase in crop irrigation water requirements. Securing minimum stream flows would entail a substantial reduction in irrigation water use for rice cultivation, which might affect the profitability and economic viability of small rice-growing farms located upstream in the river.
The new GRBMP could contribute to balancing competing water demands in the basin and to increasing economic water productivity, but might not be sufficient to ensure the provision of environmental flows as required by the WFD. A thorough revision of the basin's water use concession system for irrigation seems to be needed in order to bring the GRBMP in line with the WFD objectives. Furthermore, the study illustrates that social, economic, institutional, and technological factors, in addition to bio-physical conditions, are important issues to be considered for designing and developing water management strategies. The research initiative presented in this paper demonstrates that hydro-economic models can explicitly integrate all these issues, constituting a valuable tool that could assist policy makers in implementing sustainable irrigation policies.
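The economic side of such a hydro-economic trade-off can be caricatured in a few lines: allocate the flow remaining above a minimum environmental flow to crops in order of marginal profit. This linear-profit toy, with made-up numbers, is an assumption for illustration, not the paper's risk-based optimization model.

```python
def allocate(flow_hm3, min_env_flow_hm3, crops):
    """Give the water left above the environmental flow to crops in
    descending order of marginal profit (optimal when profits are linear).
    crops: iterable of (name, max_demand_hm3, profit_per_hm3)."""
    available = max(0.0, flow_hm3 - min_env_flow_hm3)
    plan, profit = {}, 0.0
    for name, demand, price in sorted(crops, key=lambda c: -c[2]):
        w = min(available, demand)   # serve the most profitable use first
        plan[name] = w
        profit += w * price
        available -= w
    return plan, profit
```

Raising the environmental flow requirement shrinks `available`, and the loss falls first on the lowest-value crop, which mirrors the paper's finding that securing stream flows squeezes water-intensive rice cultivation first.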
Abstract:
In recent years the missing fourth circuit element, the memristor, was successfully synthesized. However, the mathematical complexity and variety of the models behind this component, in addition to the existence of convergence problems in the simulations, make the design of memristor-based applications long and difficult. In this work we present a memristor model characterization framework which supports the automated generation of subcircuit files. The proposed environment allows the designer to choose and parameterize the memristor model that best suits a given application. The framework carries out characterizing simulations in order to study possible non-convergence problems, solving the dependence on the simulation conditions and guaranteeing the functionality and performance of the design. Additionally, the occurrence of undesirable effects related to PVT variations is also taken into account. By performing a Monte Carlo or a corner analysis, the designer is made aware of the safety margins which assure correct device operation.
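As an example of the kind of model such a framework would parameterize, the widely used HP linear dopant drift memristor description can be sketched as follows; the parameter values and the hard window are illustrative assumptions.

```python
import numpy as np

R_ON, R_OFF = 100.0, 16e3   # limit resistances, ohm (assumed values)
D, MU_V = 10e-9, 1e-14      # device thickness (m), dopant mobility (m^2/sV)

def simulate(v_drive, dt, x0=0.5):
    """Integrate the internal state x (doped-region fraction, 0..1) and
    the current under a voltage drive, with a hard [0, 1] window."""
    x, xs, currents = x0, [], []
    for v in v_drive:
        resistance = R_ON * x + R_OFF * (1.0 - x)
        i = v / resistance
        x += MU_V * R_ON / D ** 2 * i * dt  # linear dopant drift
        x = min(max(x, 0.0), 1.0)           # hard window at the boundaries
        xs.append(x)
        currents.append(i)
    return np.array(xs), np.array(currents)
```

The hard clipping at x = 0 and x = 1 is precisely the kind of discontinuity that provokes the convergence problems the paper's characterizing simulations are meant to expose; smoother window functions trade that robustness against model fidelity.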
Abstract:
Dynamic thermal management techniques require a collection of on-chip thermal sensors that imply a significant area and power overhead. Finding the optimum number of temperature monitors and their location on the chip surface to optimize accuracy is an NP-hard problem. In this work we improve the modeling of the problem by including area, power and networking constraints along with the consideration of three inaccuracy terms: spatial errors, sampling rate errors and monitor-inherent errors. The problem is solved by the simulated annealing algorithm. We apply the algorithm to a test case employing three different types of monitors to highlight the importance of the different metrics. Finally we present a case study of the Alpha 21364 processor under two different constraint scenarios.
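A stripped-down version of the simulated annealing search can be sketched as below. The cost function here (worst Manhattan distance from any grid cell to its nearest monitor) is only a stand-in for the paper's spatial-error term, and the area, power and networking constraints are omitted.

```python
import math
import random

def cost(monitors, n):
    """Worst Manhattan distance from any grid cell to its nearest monitor."""
    return max(min(abs(x - mx) + abs(y - my) for mx, my in monitors)
               for x in range(n) for y in range(n))

def anneal(n=8, k=3, steps=2000, t0=2.0, seed=0):
    """Place k monitors on an n x n grid by simulated annealing."""
    rng = random.Random(seed)
    monitors = [(rng.randrange(n), rng.randrange(n)) for _ in range(k)]
    current = cost(monitors, n)
    best, best_cost = list(monitors), current
    for step in range(steps):
        t = t0 * (1.0 - step / steps) + 1e-3           # linear cooling
        candidate = list(monitors)
        candidate[rng.randrange(k)] = (rng.randrange(n), rng.randrange(n))
        c = cost(candidate, n)
        if c < current or rng.random() < math.exp((current - c) / t):
            monitors, current = candidate, c           # Metropolis acceptance
            if current < best_cost:
                best, best_cost = list(monitors), current
    return best, best_cost
```

The real problem adds per-monitor area and power terms, sampling-rate and monitor-inherent error models, and network routing constraints to this cost, which is what makes exact optimization NP-hard and annealing attractive.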
Abstract:
When we interact with the environment in our daily life (using a toothbrush, opening doors, using cell phones, etc.), or in professional situations (medical interventions, manufacturing processes, etc.), we typically perform dexterous manipulations that involve multiple fingers and the palms of both hands. Therefore, multi-finger haptic methods can provide a realistic and natural human-machine interface that enhances immersion when interacting with simulated or remote environments. Most commercial devices allow haptic interaction with only one contact point, which may be sufficient for some exploration or palpation tasks but is not enough to perform advanced object manipulations such as grasping.
In this thesis, I investigate the mechanical design, control and applications of a modular haptic device that can provide force feedback to the index, thumb and middle fingers of the user. The designed mechanical device is optimized with a multi-objective design function to achieve low inertia, a large workspace, high manipulability, and force feedback of up to 3 N within the workspace; the bandwidth and rigidity of the device are assessed through simulation and real experimentation. One of the most important areas when designing haptic devices is the end-effector, since it is in contact with the user. In this thesis the design and evaluation of a thimble-like, lightweight, user-adaptable, and cost-effective device that incorporates four contact force sensors is described. This design allows estimation of the forces applied by a user during manipulation of virtual and real objects. The design of a real-time, modular control architecture for multi-finger haptic interaction is described, and the requirements for control of multi-finger haptic devices are explored. A large number of signals have to be acquired, processed and sent over the network, and mathematical computations such as the device direct and inverse kinematics, the Jacobian, grasp detection algorithms, etc. have to be calculated in real time to assure the required high fidelity of the haptic interaction. The hardware control architecture has different modules and consists of an FPGA for the low-level controller and an RT controller for managing all the complex calculations (Jacobian, kinematics, etc.); this provides a compact and scalable solution for the required high computation capabilities, assuring a correct frequency of 1 kHz for the control loop. A set-up for dexterous virtual and real manipulation is described. Moreover, a new algorithm named the iterative kinematic decoupling method was implemented to solve the inverse kinematics of a robotic manipulator.
In order to understand the importance of multi-modal interaction including haptics, a subject study was carried out to look for sensory stimuli that correlate with fast response times and enhanced accuracy. This experiment was carried out in collaboration with neuroscientists from the Technion Israel Institute of Technology. By comparing the grasping response times in unimodal (auditory, visual, and haptic) events with the response times in events with bimodal and trimodal combinations, it is concluded that in grasping tasks the synchronized motion of the fingers to generate the grasping response relies mainly on haptic cues. This processing-speed advantage of haptic cues suggests that multimodal haptic virtual environments are superior in generating motor contingencies, enhancing the plausibility of events. Applications that include haptics provide users with more time at the cognitive stages to fill in missing information creatively and form a richer experience. A major application of haptic devices is the design of new simulators to train manual skills for the medical sector. In collaboration with physical therapists from Griffith University in Australia, we developed a simulator for hand rehabilitation exercises. First, the non-linear stiffness properties of the metacarpophalangeal joint of the index finger were estimated by using the designed end-effector; these parameters are implemented in a scenario that simulates the behavior of the human hand and that allows haptic interaction through the designed haptic device. The potential applications of this work are related to educational and medical training purposes. In this thesis, new methods to simultaneously control the position and orientation of a robotic manipulator and the grasp of a robotic hand when interacting with large real environments are also studied. The reachable workspace is extended by automatically switching between rate and position control modes.
Moreover, the human hand gesture is recognized by reading the relative movements of the index, thumb and middle fingers of the user during the early stages of the approximation-to-the-object phase and then mapped to the robotic hand actuators. These methods are validated to perform dexterous manipulation of objects with a robotic manipulator, and different robotic hands. This work is the result of a research collaboration with researchers from the Harvard BioRobotics Laboratory. The developed experiments show that the overall task time is reduced and that the developed methods allow for full dexterity and correct completion of dexterous manipulations.
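The thesis's iterative kinematic decoupling method is not reproduced here, but the inverse kinematics problem it addresses can be illustrated with the classical closed-form solution for a two-link planar linkage of the kind found in finger mechanisms; the link lengths are assumed example values.

```python
import math

def ik_2link(x, y, l1=0.1, l2=0.1):
    """Elbow-down joint angles (rad) reaching point (x, y) in the plane."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if abs(c2) > 1.0:
        raise ValueError("target out of reach")
    q2 = math.acos(c2)                       # elbow angle from cosine rule
    q1 = math.atan2(y, x) - math.atan2(l2 * math.sin(q2),
                                       l1 + l2 * math.cos(q2))
    return q1, q2

def fk_2link(q1, q2, l1=0.1, l2=0.1):
    """Forward kinematics, used here to verify the inverse solution."""
    return (l1 * math.cos(q1) + l2 * math.cos(q1 + q2),
            l1 * math.sin(q1) + l2 * math.sin(q1 + q2))
```

Closed forms like this exist only for simple chains; for general mechanisms one falls back on iterative schemes, which is the niche the iterative kinematic decoupling method targets at the 1 kHz rates haptic control demands.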
Abstract:
Most CPV systems are based on Fresnel lenses. Among these, the LPI-patented Fresnel-Köhler (FK) concentrator stands out for performance and practical reasons. The VentanaTM power train is the first off-the-shelf commercial product based on the FK and comprises both the primary (POE) lenses (a 36-unit 1×1 m² acrylic panel manufactured by EVONIK and 10×) and glass (or Savosil) secondary optics (SOE). This high-concentration optical train (Cg = 1,024×, ~250 mm optical depth) fits with 5×5 mm² (at least) solar cells. The optical train is the fruit of a one-year development that has included design, modeling, prototyping and characterization, and through the process LPI had the opportunity to find out how well the actual performance correlates with models, but also learned practical aspects of a CPV system of this kind, some of which have a very positive impact on system performance and reliability.
Abstract:
Acquired brain injury (ABI) [1-2] refers to any brain damage occurring after birth. It usually causes certain damage to portions of the brain. ABI may result in a significant impairment of an individual's physical, cognitive and/or psychosocial functioning. The main causes are traumatic brain injury (TBI), cerebrovascular accident (CVA) and brain tumors. The main consequence of ABI is a dramatic change in the individual's daily life. This change involves a disruption of the family, a loss of future income capacity and an increase in lifetime cost. One of the main challenges in neurorehabilitation is to obtain a dysfunctional profile of each patient in order to personalize the treatment. This paper proposes a system to generate a patient's dysfunctional profile by integrating theoretical, structural and neuropsychological information on a 3D brain imaging-based model. The main goal of this dysfunctional profile is to help therapists design the most suitable treatment for each patient. At the same time, the results obtained are a source of clinical evidence to improve the accuracy and quality of our rehabilitation system. Figure 1 shows the diagram of the system, which is composed of four main modules: image-based extraction of parameters, theoretical modeling, classification, and the co-registration and visualization module.
Abstract:
The aim of the paper is to discuss the use of knowledge models to formulate general applications. First, the paper presents the recent evolution of the software field, where increasing attention is paid to conceptual modeling. Then, the current state of knowledge modeling techniques is described, where increased reliability is achieved through modern knowledge acquisition techniques and supporting tools. The KSM (Knowledge Structure Manager) tool is described next. First, the concept of knowledge area is introduced as a building block where methods to perform a collection of tasks are included together with the bodies of knowledge providing the basic methods to perform the basic tasks. Then, the CONCEL language to define vocabularies of domains and the LINK language for method formulation are introduced. Finally, the object-oriented implementation of a knowledge area is described and a general methodology for application design and maintenance supported by KSM is proposed. To illustrate the concepts and methods, an example of a system for intelligent traffic management in a road network is described. This example is followed by a proposal of generalization for reuse of the resulting architecture. Finally, some concluding comments are offered about the feasibility of using the knowledge modeling tools and methods for general application design.
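The knowledge-area notion described above can be caricatured in a few lines: a unit bundling a CONCEL-like domain vocabulary with LINK-like task methods. The class and method names are illustrative assumptions, not KSM's actual interface.

```python
class KnowledgeArea:
    """A building block bundling a domain vocabulary with task methods."""

    def __init__(self, name, vocabulary):
        self.name = name
        self.vocabulary = set(vocabulary)  # CONCEL-like domain terms
        self.methods = {}                  # LINK-like task methods

    def add_method(self, task, fn):
        """Register a method (body of knowledge) that performs a task."""
        self.methods[task] = fn

    def perform(self, task, **inputs):
        """Run a task, rejecting inputs named outside the vocabulary."""
        unknown = set(inputs) - self.vocabulary
        if unknown:
            raise ValueError(f"terms outside vocabulary: {unknown}")
        return self.methods[task](**inputs)

# Illustrative knowledge area echoing the paper's traffic-management example.
traffic = KnowledgeArea("traffic", ["flow", "capacity"])
traffic.add_method("saturation", lambda flow, capacity: flow / capacity)
```

Keeping the vocabulary explicit is what makes such a unit reusable: a new application can adopt the area wholesale as long as it speaks the same domain terms.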