981 results for "plug in"


Relevance: 60.00%

Abstract:

The aim of this project is the creation of an economical MIDI controller that uses present-day technology and is based on the idea of the Theremin, a classical instrument conceived by Lev Serguéievich Termen. In order to accomplish this, the project has been divided into two sections: hardware and software. The hardware section explains why the Arduino Uno microprocessor board has been chosen, sets out its technical specifications and describes the use of ultrasonic sensors, which enable the user to interact with the controller through hand gestures, like the Theremin. The assembly of the devices is presented, as well as the improvement made by using four of these sensors to offer more interactive capabilities with the MIDI controller. The programming of the Arduino board, which performs the measurements with the sensors and sends them through the USB serial port, is also explained here. The software section introduces the Max/MSP programming environment as well as the plug-in developed with this language, which connects the MIDI controller with professional audio software (Ableton Live). The blocks that build the sensor controller plug-in are explained in detail, along with how the information the Arduino delivers through the USB port is transformed into MIDI data. In addition, an explanation of the correct handling of the MIDI controller is given, focusing on how the user should move their hands above the sensors and where to place the instrument to avoid interference problems between the ultrasonic signals. Also, a cost estimate of both materials and engineering is provided.
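The distance-to-MIDI transformation described above can be sketched in a few lines. This is an illustrative Python sketch, not the thesis's actual Max/MSP patch; the 5-50 cm sensing range and the linear mapping are assumptions.

```python
def distance_to_midi(distance_cm, d_min=5.0, d_max=50.0):
    """Map an ultrasonic distance reading (cm) to a MIDI value in 0-127.

    Readings outside [d_min, d_max] are clamped; the mapping is linear,
    with the closest hand position giving the highest value (assumed
    range and direction, for illustration only).
    """
    clamped = max(d_min, min(d_max, distance_cm))
    fraction = (d_max - clamped) / (d_max - d_min)
    return round(fraction * 127)
```

For example, a hand at the minimum distance yields 127 and a hand at or beyond the maximum distance yields 0, matching the 7-bit range of MIDI data bytes.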

Relevance: 60.00%

Abstract:

Context. This thesis is framed within experimental software engineering. More concretely, it addresses the problems that arose when assessing process conformance in test-driven development experiments conducted by UPM's Experimental Software Engineering group. Process conformance was studied using Besouro, an Eclipse plug-in. It has been observed that Besouro does not work correctly in some circumstances, which casts doubt on the correctness of the existing experimental data and renders it unusable. Aim. The main objective of this work is the identification and correction of Besouro's faults. A secondary goal is fixing the datasets already obtained in past experiments to the maximum possible extent, so that existing experimental results can be used with confidence. Method. (1) Test Besouro using different sequences of events (creation methods, assertions, etc.) to identify the underlying faults. (2) Fix the code, and (3) fix the datasets using code specially created for this purpose. Results. (1) We confirmed the existence of several faults in Besouro's code that affected Test-First and Test-Last episode identification. These faults caused the incorrect identification of 20% of episodes. (2) We were able to fix Besouro's code. (3) The correction of existing datasets was possible, subject to some restrictions (such as the impossibility of tracing code size increases to programming time). Conclusion. The results of past experiments dependent upon Besouro's data may not be trustworthy. We suspect that more faults remain in Besouro's code, whose identification requires further analysis.
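The kind of episode classification at stake can be illustrated with a toy sketch: an episode is Test-First when test editing precedes production editing. Besouro's actual rules are considerably richer, and the event labels here are hypothetical.

```python
def classify_episode(events):
    """Classify a development episode as 'test-first' or 'test-last'.

    `events` is a chronological list of labels such as 'test-edit' and
    'prod-edit' (a simplification of the editor events a tool like
    Besouro records). The episode is test-first when a test edit
    precedes the first production edit.
    """
    first_test = next((i for i, e in enumerate(events) if e == 'test-edit'), None)
    first_prod = next((i for i, e in enumerate(events) if e == 'prod-edit'), None)
    if first_test is None:
        return 'test-last'
    if first_prod is None or first_test < first_prod:
        return 'test-first'
    return 'test-last'
```

A fault in logic of this kind, e.g. mishandling episodes that contain no test events at all, is exactly the sort of defect that would silently misclassify a fraction of episodes.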

Relevance: 60.00%

Abstract:

One of the main objectives of the European Commission related to climate and energy is the well-known set of 20-20-20 targets to be achieved by 2020: Europe has to reduce greenhouse gas emissions by at least 20% below 1990 levels, 20% of EU energy consumption has to come from renewable resources and, finally, a 20% reduction in primary energy use compared with projected levels has to be achieved by improving energy efficiency. In order to reach these objectives, it is necessary to reduce overall emissions, mainly in transport (reducing CO2, NOx and other pollutants), and to increase the penetration of intermittent renewable energy. A high deployment of battery electric vehicles (BEVs) and plug-in hybrid electric vehicles (PHEVs), with a low-cost source of energy storage, could help to achieve both targets. Hybrid electric vehicles (HEVs) combine a conventional internal combustion engine (ICE) with one or more electric motors. There are different grades of hybridization, from micro hybrids with start-stop capability, through mild hybrids (with kinetic energy recovery) and medium hybrids (mild hybrids plus energy assist), to full hybrids (medium hybrids plus electric launch capability). These last types of vehicles use a typical battery capacity of around 1-2 kWh. Plug-in hybrid electric vehicles (PHEVs) use larger battery capacities to achieve a limited electric-only driving range. These vehicles are charged either by on-board electricity generation or by plugging into electric outlets; typical battery capacity is around 10 kWh. Battery electric vehicles (BEVs) are driven only by electric power and their typical battery capacity is around 15-20 kWh. One type of PHEV, the Extended Range Electric Vehicle (EREV), operates as a BEV until its plug-in battery capacity is depleted, at which point its gasoline engine powers an electric generator to extend the vehicle's range.

The charging of PHEVs (including EREVs) and BEVs will have different impacts on the electric grid, depending on the number of vehicles and the start time for charging. The lecture will begin by analyzing the electrical power requirements for charging PHEVs and BEVs in the Flanders region (Belgium) under different charging scenarios. Secondly, based on an activity-based microsimulation mobility model, an efficient method to reduce this impact will be presented.
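The dependence of grid impact on vehicle count and charging start time can be illustrated with a deliberately naive uncontrolled-charging profile. The vehicle numbers and charger rating in the usage note are invented examples, not figures from the Flanders study.

```python
def fleet_charging_load(n_vehicles, charger_kw, start_hour, duration_h):
    """Hourly charging load (kW) over a 24-hour day for a fleet whose
    vehicles all begin charging at `start_hour` and each draw
    `charger_kw` for `duration_h` hours (a naive worst-case scenario:
    simultaneous, uncontrolled charging)."""
    load = [0.0] * 24
    for h in range(start_hour, start_hour + duration_h):
        load[h % 24] += n_vehicles * charger_kw
    return load
```

For instance, 100 PHEVs on hypothetical 3.5 kW chargers all starting at 18:00 for 3 hours add a 350 kW block to the evening peak, which is precisely the coincidence a smarter, mobility-model-driven schedule would spread out.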

Relevance: 60.00%

Abstract:

The concept of algorithm is one of the core subjects in computer science. It is extremely important, then, for students to get a good grasp of this concept from the very start of their training. In this respect, having a tool that helps and shepherds students through the process of learning this concept can make a huge difference to their instruction. Much has been written about how helpful algorithm visualization tools can be. Most authors agree that the most important part of the learning process is how students use the visualization tool: learners who are actively involved in visualization consistently outperform learners who view the algorithms passively. Therefore we think that one of the best exercises to learn an algorithm is for the user to simulate the algorithm's execution while using a visualization tool, thus performing a visual algorithm simulation. The first part of this thesis presents the eMathTeacher set of requirements together with an eMathTeacher-compliant tool called GRAPHs.
For some years, we have been developing a theory about the key features of an effective e-learning system for teaching mathematical concepts and algorithms. This led to the definition of the eMathTeacher concept, which has materialized in the eMathTeacher set of requirements. An e-learning tool is eMathTeacher compliant if it works as a virtual math trainer. In other words, it has to be an online self-assessment tool that helps students to actively and autonomously learn math concepts or algorithms, correcting their mistakes and providing them with clues to find the right answer. In an eMathTeacher-compliant tool, algorithm simulation does not continue until the user enters the correct answer. GRAPHs is an extensible environment designed for active and independent visual simulation-based learning of graph algorithms, set up to integrate tools that help the user simulate the execution of different algorithms. Apart from the options of creating and editing the graph, and visualizing the changes made to the graph during simulation, the environment also includes step-by-step correction, algorithm pseudo-code animation, pop-up questions, data structure handling and XML-based interaction log creation features. Assessment, on the other hand, is a key part of any learning process. Through the use of e-learning environments, huge amounts of data can be output about this process. Nevertheless, this information has to be interpreted and represented in a practical way to arrive at a sound assessment that is not confined to merely counting mistakes. This includes establishing relationships between the available data and providing instructive linguistic descriptions of learning evolution. Additionally, formative assessment should specify the level of attainment of the learning goals defined by the instructor. Until now, only human experts were capable of making such assessments.

While facing this problem, our goal has been to create a computational model that simulates the instructor's reasoning and generates an enlightening learning evolution report in natural language. The second part of this thesis presents the granular linguistic model of learning assessment, which models the assessment of the learning process and implements the automated generation of formative assessment reports. The model is a particularization of the granular linguistic model of a phenomenon (GLMP) paradigm, based on fuzzy logic and the computational theory of perceptions, to the assessment phenomenon. This technique, useful for implementing complex assessment criteria using inference systems based on linguistic rules, has been applied to two particular cases: the assessment of the interaction logs generated by GRAPHs and the criterion-based assessment of Moodle quizzes. As a consequence, several expert systems to assess different algorithm simulations and Moodle quizzes have been implemented, tested and used in the classroom. Apart from the grade, the designed expert systems also generate natural language progress reports on the achieved proficiency level, based exclusively on the objective data gathered from correct and incorrect responses. In addition, two applications, capable of being configured to implement the expert systems, have been developed. One is geared up to process the files output by GRAPHs and the other is a Moodle plug-in set up to perform assessment based on quiz results.
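The core of a fuzzy-logic assessor of this style can be hinted at with a single perception mapping that turns a numeric score into a linguistic label. A real GLMP composes hierarchies of such mappings; the labels and triangular membership functions below are illustrative assumptions only.

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def assess(correct_ratio):
    """Return the best-fitting linguistic label for a ratio of correct
    answers in [0, 1], in the spirit of one (much simplified) GLMP
    perception mapping."""
    labels = {
        'low':    triangular(correct_ratio, -0.01, 0.0, 0.5),
        'medium': triangular(correct_ratio, 0.0, 0.5, 1.0),
        'high':   triangular(correct_ratio, 0.5, 1.0, 1.01),
    }
    return max(labels, key=labels.get)
```

A report generator would then assemble such labels into natural-language sentences ("the student shows a high level of attainment on goal X"), which is the step that previously required a human expert.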

Relevance: 60.00%

Abstract:

In 1938 a new man appeared on the academic scene, homo ludens, introduced by the historian Johan Huizinga. Anticipating an imminent increase in free time, driven by the reduction of the working day through growing industrial automation, homo ludens came to complement the working time occupied since the Industrial Revolution by homo faber. The validity and strength of Huizinga's proposal rest on Play as a generator of culture, and from there on the study of society's behavior through its ludic dimension. Herein lies the representative power of HOMO LUDENS for the post-war avant-gardes, which set out to include it in their analytical and propositional cultural debates. In the post-war period, Europe lived the triumphalism of victory amid ruined societies and cities in urgent need of reconstruction. Other realities were making themselves heard, both the 'American way of life' and "The Americans" (Robert Frank, 1958): the society of abundance, popular taste, the inclusion of the everyday and the banal, advertising, mass media, consumption, the conversion of the war industry to domestic production, telecommunications, robots, new materials, the space race and science fiction. England emerged as a natural social and cultural link with the United States, becoming a catalyst for avant-garde ideas. This effervescent panorama, 'the pop years', arose as a cultural reality that represented the complexity, the chaos and the fascination of the imagery of a present future, illustrated in the artistic proposals of the period. Novel ideas about the ludic use of free time, as a way of fostering the creative nature of the human being, emerged from both the United States and Europe: the Independent Group, Charles and Ray Eames, Black Mountain College, the Theatre of Action, The Factory, the Situationist International and the Beat Generation.

These groups of great cultural impact based their actions on the use of Play in their proposals, essentially under the influence of Dada and singularly of Marcel Duchamp, whose proposals revolutionized twentieth-century art history. All of them permanently explored the union of art and life through ludic experiences, serving as motivators for the architectural and urbanistic proposals examined in this research. The principal ones are: Alison and Peter Smithson's "Berlin Hauptstadt" (1957); ARCHIGRAM's "Plug-in City" (1963-1964) and Instant City (1968); Yona Friedman's "Mobile Architecture" (1957); Cedric Price's "Fun Palace" (1960-1961); and Constant Nieuwenhuys's "New Babylon" (1959-64). The research was guided by a 'flexible' search for concepts, facts, figures and projects. Drawing on an analysis of the writings, drawings, studies and works of the avant-garde protagonists, as well as a contextual reading of the period enriched and illustrated with significant events and amusing anecdotes, it operates simultaneously with a multiplicity of sources. A large volume of information from very different areas of knowledge (philosophy, art, architecture, anthropology, sociology, psychology, etc.) is handled, working with a 'living' object of study, open and in constant reorganization. Beyond communicating its dérives, as a construction of historical discourses, the work also aims to encourage the formulation of new ones. Through a meticulous process, relationships were sought between the theoretical concepts of Play elaborated by various thinkers from Kant and Schiller onward that bore a clear relation to the design processes of the avant-garde architects under study. Two factors are key: the serious character of Play, and the decision to situate the research in the realm of play in the sense of free, rule-less play.

The study concludes on the close relationship between Architecture and Play in the avant-garde examined, going into depth on: the creative act of designing as a ludic process, on the architect's part; the development of a homo ludens addressee/user; and the new disciplinary tools developed: topological support, plug-in, fragmentation, flexibility, on-off, transportability. From this new ludic architecture, the theoretical and practical disciplinary contributions are explored under this approach: Architecture through self-movement, the ephemeral, the fragment, chance and indeterminacy, the fictitious, the void.

Relevance: 60.00%

Abstract:

kdens produces univariate kernel density estimates and graphs the result. kdens supplements Stata's official kdensity command. Important additions are: adaptive (i.e. variable-bandwidth) kernel density estimation, several automatic bandwidth selectors including the Sheather-Jones plug-in estimator, pointwise variability bands and confidence intervals, boundary correction for variables with bounded domain, and fast binned approximation estimation. Note that the moremata package, also available from SSC, is required.
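To illustrate the kind of estimate kdens computes, here is a minimal Gaussian KDE in Python using Silverman's rule-of-thumb bandwidth (a simpler selector than the Sheather-Jones plug-in estimator kdens offers). kdens itself is a Stata package; this sketch only demonstrates the underlying idea.

```python
import math
import statistics

def silverman_bandwidth(data):
    """Silverman's rule-of-thumb bandwidth: 1.06 * sd * n^(-1/5)."""
    n = len(data)
    return 1.06 * statistics.stdev(data) * n ** (-1 / 5)

def kde(x, data, h=None):
    """Gaussian kernel density estimate at point x: the average of
    normal bumps of width h centered on each observation."""
    h = h if h is not None else silverman_bandwidth(data)
    return sum(
        math.exp(-0.5 * ((x - xi) / h) ** 2) / math.sqrt(2 * math.pi)
        for xi in data
    ) / (len(data) * h)
```

The adaptive variant kdens adds would let h vary with the local density; boundary correction would modify the kernels near the edges of a bounded domain.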

Relevance: 60.00%

Abstract:

The reliability of printed circuit board assemblies under dynamic environments, such as those found on board airplanes, ships and land vehicles, is receiving more attention. This research analyses the dynamic characteristics of a printed circuit board (PCB) supported by edge retainers and plug-in connectors. By modelling the wedge retainer and connector as simply supported boundary conditions with appropriate rotational spring stiffnesses along their respective edges, with the aid of finite element codes, natural frequencies for the board that agree with the experimental ones are obtained. For a PCB supported by two opposite wedge retainers and a plug-in connector, with its remaining edge free of any restraint, it is found that these real supports behave somewhere between the simply supported and clamped boundary conditions, providing a percentage fixity 39.5% above the classical simply supported case. By using an eigensensitivity method, the rotational stiffnesses representing the boundary supports of the PCB can be updated effectively, so that the model represents the dynamics of the PCB accurately. The result shows that the percentage error in the fundamental frequency of the PCB finite element model is substantially reduced from 22.3% to 1.3%. The procedure demonstrates the effectiveness of using only the vibration test frequencies as reference data when the mode shapes of the original untuned model are almost identical to the referenced modes/experimental data. When only modal frequencies are used in model improvement, the analysis is much simplified; furthermore, the time taken to obtain the experimental data is substantially reduced, as the experimental mode shapes are not required. In addition, this thesis advocates a relatively simple method of determining the support locations that maximise the fundamental frequency of vibrating structures.

The technique is simple and does not require any optimisation or sequential search algorithm in the analysis. The key to the procedure is to position the necessary supports so as to eliminate the lower modes of the original configuration. This is accomplished by introducing point supports along the nodal lines of the highest possible mode of the original configuration, so that all the lower modes are eliminated by the introduction of the new or extra supports to the structure. The thesis also proposes inspecting the average driving point residues along the nodal lines of vibrating plates to find the optimal locations of the supports. Numerical examples are provided to demonstrate the method's validity. Applying it to the PCB supported on three sides by two wedge retainers and a connector, it is found that the single point constraint that yields the maximum fundamental frequency is located at the mid-point of the nodal line, namely node 39. This point support has the effect of increasing the structure's fundamental frequency from 68.4 Hz to 146.9 Hz, i.e. 115% higher.
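One plausible reading of the "percentage fixity" measure is a linear interpolation of a measured natural frequency between the ideal simply supported and clamped values. The function below is an illustrative sketch of that reading, not the thesis's definition, and the frequencies in the test are invented.

```python
def percentage_fixity(f_measured, f_simply_supported, f_clamped):
    """Express a measured natural frequency as a fixity percentage:
    0% for an ideal simply supported edge, 100% for a fully clamped
    one (linear interpolation; an assumed, illustrative definition)."""
    span = f_clamped - f_simply_supported
    return 100.0 * (f_measured - f_simply_supported) / span
```

On this reading, the real retainer/connector supports, reported as 39.5% above the simply supported case, sit roughly two-fifths of the way toward fully clamped behaviour.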

Relevance: 60.00%

Abstract:

Browsing constitutes an important part of the user information searching process on the Web. In this paper, we present a browser plug-in called ESpotter, which recognizes entities of various types on Web pages and highlights them according to their types to assist user browsing. ESpotter uses a range of standard named entity recognition techniques. In addition, a key new feature of ESpotter is that it addresses the problem of multiple domains on the Web by adapting lexicons and patterns to these domains.
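The domain-adaptation idea can be sketched as pattern tables keyed by domain. The lexicons and regular expressions below are invented for illustration and do not reproduce ESpotter's actual resources.

```python
import re

# Hypothetical per-domain pattern tables; a real system would load
# these from adapted lexicons rather than hard-code them.
DOMAIN_PATTERNS = {
    'academic': {'PERSON': re.compile(r'\b(?:Dr|Prof)\. [A-Z][a-z]+\b')},
    'business': {'ORG': re.compile(r'\b[A-Z][A-Za-z]+ (?:Inc|Ltd)\.')},
}

def spot_entities(text, domain):
    """Return (entity, type) pairs found using the patterns adapted to
    the given domain."""
    found = []
    for etype, pattern in DOMAIN_PATTERNS.get(domain, {}).items():
        for match in pattern.finditer(text):
            found.append((match.group(), etype))
    return found
```

The same page text thus yields different highlights depending on which domain's patterns are active, which is the behaviour the paper's adaptation mechanism provides automatically.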

Relevance: 60.00%

Abstract:

The modern grid system, or the smart grid, is likely to be populated with multiple distributed energy sources, e.g. wind power, PV power and plug-in electric vehicles (PEVs). It will also include a variety of linear and nonlinear loads. The intermittent nature of renewable energies like PV and wind, together with the increased penetration of electric vehicles (EVs), makes the stable operation of the utility grid challenging. In order to ensure stable operation of the utility grid and to support smart grid functionalities such as fault ride-through, frequency response, reactive power support and mitigation of power quality issues, an energy storage system (ESS) could play an important role. A fast-acting bidirectional energy storage system that can rapidly provide and absorb power and/or VARs for a sufficient time is a potentially valuable tool to support this functionality. Battery energy storage systems (BESS) are one of a range of suitable energy storage systems because they can provide and absorb power for a sufficient time and respond reasonably fast. Conventional BESS already on the grid are made up primarily of new batteries. The cost of these batteries can be high, which makes most BESS an expensive solution. In order to assist the move towards a low carbon economy and to reduce battery cost, this work researches the opportunities for re-using batteries on the electricity grid after their primary use in low and ultra-low carbon vehicles (EVs/HEVs). This research aims to develop a new generation of second life battery energy storage systems (SLBESS) which could interface to the low/medium voltage network to provide the necessary grid support reliably and cost-effectively. The reliability/performance of these batteries is not clear, but is almost certainly worse than that of a new battery.

Manufacturers indicate that a mixture of gradual degradation and sudden failure are both possible, and failure mechanisms are likely to be related to how hard the batteries were driven inside the vehicle. Several figures from a number of sources, including DECC (the Department of Energy and Climate Change) and the Arup and Cenex reports, indicate anything from 70,000 to 2.6 million electric and hybrid vehicles on the road by 2020. Once a vehicle battery has degraded to around 70-80% of its capacity, it is considered to be at the end of its first life application. This leaves capacity available for a second life at a much lower cost than a new BESS. Assuming a battery capacity of around 5-18 kWh (MHEV 5 kWh - BEV 18 kWh) and an approximate 10-year life span, this equates to a projected battery storage capability available for second life of more than 1 GWh by 2025. Moreover, each vehicle manufacturer has different specifications for battery chemistry, number and arrangement of battery cells, capacity, voltage, size, etc. To enable research and investment in this area and to maximize the remaining life of these batteries, one of the design challenges is to combine these hybrid batteries into a grid-tie converter in which their different performance characteristics and parameter variations can be catered for, and in which a hot-swapping mechanism is available so that, as a battery ends its second life, it can be replaced without affecting the overall system operation. This integration of either a single type of battery with vastly different performance capability, or a hybrid battery system, into a grid-tie energy storage system differs from existing work on battery energy storage systems (BESS), which deals with a single type of battery with common characteristics. This thesis addresses and solves, for the first time, the power electronic design challenges of integrating second life hybrid batteries into a grid-tie energy storage unit.

This study details a suitable multi-modular power electronic converter and its various switching strategies, which can integrate widely different batteries to a grid-tie inverter irrespective of their characteristics, voltage levels and reliability. The proposed converter provides high efficiency and enhanced control flexibility, and has the capability to operate in different operational modes from input to output. Designing an appropriate control system for this kind of hybrid battery storage system is also important because of the variation in battery types, differences in characteristics and different levels of degradation. This thesis proposes a generalized distributed power sharing strategy, based on a weighting function, that aims to make optimal use of a set of hybrid batteries according to their relative characteristics while providing the necessary grid support by distributing the power between the batteries. The strategy is adaptive in nature and varies as the individual battery characteristics change in real time, for example as a result of degradation. A suitable bidirectional distributed control strategy, or module-independent control technique, has been developed for each mode of operation of the proposed modular converter. Stability is an important consideration in the control of all power converters, and as such this thesis investigates the control stability of the multi-modular converter in detail. Many controllers use PI/PID-based techniques with fixed control parameters; however, this was not found to be suitable from a stability point of view. Issues of control stability with this controller type under one of the operating modes led to the development of an alternative adaptive, nonlinear Lyapunov-based control for the modular power converter.

Finally, detailed simulation and experimental validation of the proposed power converter operation, power sharing strategy, control structures and control stability have been undertaken using a grid-connected, laboratory-based multi-modular hybrid battery energy storage system prototype. The experimental validation has demonstrated the feasibility of this new energy storage system for use in future grid applications.
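The shape of a weighting-function power split can be sketched as follows. The weight used here, available capacity times state of charge, is a placeholder assumption standing in for the thesis's richer, adaptive weighting.

```python
def share_power(total_kw, batteries):
    """Split a grid power request among heterogeneous (second-life)
    batteries in proportion to a per-battery weight. Each battery is a
    dict with 'capacity_kwh' and 'soc' (state of charge, 0-1); the
    weight capacity*soc is an illustrative choice only."""
    weights = [b['capacity_kwh'] * b['soc'] for b in batteries]
    total_w = sum(weights)
    return [total_kw * w / total_w for w in weights]
```

A weaker or more degraded module thus automatically carries less of the demand, and because the weights can be recomputed at every control interval, the split adapts as battery characteristics drift.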

Relevance: 60.00%

Abstract:

2000 Mathematics Subject Classification: 62E16, 62F15, 62H12, 62M20.

Relevance: 60.00%

Abstract:

Efficient and reliable techniques for power delivery and utilization are needed to accommodate the increasing penetration of renewable energy sources in electric power systems. Such methods are also required to meet the current and future demands of plug-in electric vehicles and high-power electronic loads. Distributed control and optimal power network architectures will lead to viable solutions to the energy management issue with a high level of reliability and security. This dissertation is aimed at developing and verifying new techniques for distributed control by deploying DC microgrids, involving distributed renewable generation and energy storage, throughout the operating AC power system. To achieve the findings of this dissertation, an energy system architecture was developed involving AC and DC networks, both with distributed generation and demands. The various components of the DC microgrid were designed and built, including DC-DC converters, voltage source inverters (VSIs), and AC-DC rectifiers featuring novel designs developed by the candidate. New control techniques were developed and implemented to maximize the operating range of the power conditioning units used for integrating renewable energy into the DC bus. The control and operation of the DC microgrids in the hybrid AC/DC system involve intelligent energy management. Real-time energy management algorithms were developed and experimentally verified. These algorithms are based on intelligent decision-making elements along with an optimization process. This was aimed at enhancing the overall performance of the power system and mitigating the effect of heavy non-linear loads with variable intensity and duration. The developed algorithms were also used for managing the charging/discharging process of plug-in electric vehicle emulators. The protection of the proposed hybrid AC/DC power system was studied.
Fault analysis, protection schemes, and coordination were presented, in addition to ideas on how to retrofit currently available AC protection concepts and devices for use in a DC network. A study was also conducted on how changing the distribution architecture and distributing the storage assets across the various zones of the network affect the system's dynamic security and stability. A practical shipboard power system was studied as an example of a hybrid AC/DC power system involving pulsed loads. The proposed hybrid AC/DC power system, along with the ideas, controls, and algorithms presented in this dissertation, was experimentally verified at the Smart Grid Testbed, Energy Systems Research Laboratory.
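The real-time energy management described in this abstract can be illustrated, in miniature, as a rule-based dispatch loop that decides how the storage and the AC grid cover the net load on the DC bus. This is a hedged sketch only: the power limit, state-of-charge thresholds, and all names (`BusState`, `dispatch`) are invented for the example and do not reproduce the dissertation's algorithms, which combine intelligent decision-making with an optimization process.

```python
from dataclasses import dataclass

@dataclass
class BusState:
    pv_kw: float    # renewable generation feeding the DC bus
    load_kw: float  # aggregate demand, including plug-in EV chargers
    soc: float      # storage state of charge, 0.0-1.0

def dispatch(state: BusState, p_max_kw: float = 5.0,
             soc_min: float = 0.2, soc_max: float = 0.9):
    """Rule-based dispatch: charge storage from surplus renewables,
    discharge storage to cover deficits, use the AC grid for the rest.
    Returns (storage_kw, grid_kw); storage_kw > 0 means charging,
    grid_kw > 0 means exporting to the AC side."""
    net = state.pv_kw - state.load_kw
    if net >= 0:    # surplus: charge while the storage has headroom
        storage = min(net, p_max_kw) if state.soc < soc_max else 0.0
    else:           # deficit: discharge while energy remains
        storage = -min(-net, p_max_kw) if state.soc > soc_min else 0.0
    grid = net - storage  # whatever storage cannot absorb or supply
    return storage, grid
```

A real controller would replace the fixed thresholds with the optimization step the abstract mentions, but the surplus/deficit split above is the core decision the algorithms must make at every control interval.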

Relevância:

60.00%

Publicador:

Resumo:

Event-B is a formal method for modeling and verification of discrete transition systems. Event-B development yields proof obligations that must be verified (i.e. proved valid) in order to keep the produced models consistent. Satisfiability Modulo Theories (SMT) solvers are automated theorem provers used to verify the satisfiability of logic formulas with respect to a background theory (or combination of theories). SMT solvers not only handle large first-order formulas, but can also generate models and proofs, as well as identify unsatisfiable subsets of hypotheses (unsat cores). Tool support for Event-B is provided by the Rodin platform: an extensible Eclipse-based IDE that combines modeling and proving features. A SMT plug-in for Rodin has been developed intending to integrate alternative, efficient verification techniques into the platform. We implemented a series of complements to the SMT solver plug-in for Rodin, namely improvements to the user interface for cases in which proof obligations are reported as invalid by the plug-in. Additionally, we modified some of the plug-in features, such as support for proof generation and unsat-core extraction, to comply with the SMT-LIB standard for SMT solvers. We undertook tests using applicable proof obligations to demonstrate the new features. The contributions described can potentially affect productivity in a positive manner.
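The unsat-core extraction mentioned above can be illustrated in miniature. The toy below works on propositional clauses only (a real SMT solver reasons modulo theories and exposes cores through the SMT-LIB `get-unsat-core` command): given a set of named hypotheses that is unsatisfiable, it greedily drops hypotheses to return a minimal (not necessarily minimum) subset that is still unsatisfiable. All names and the clause representation are invented for the example.

```python
from itertools import product

def satisfiable(clauses):
    """Brute-force SAT check. Clauses are frozensets of int literals
    (a negative literal is a negated variable)."""
    vars_ = sorted({abs(l) for c in clauses for l in c})
    for bits in product([False, True], repeat=len(vars_)):
        assign = dict(zip(vars_, bits))
        # a clause is satisfied when at least one literal agrees
        if all(any(assign[abs(l)] == (l > 0) for l in c) for c in clauses):
            return True
    return False

def unsat_core(named_clauses):
    """Greedy core shrinking: try dropping each named hypothesis and
    keep the drop whenever the remainder stays unsatisfiable."""
    core = dict(named_clauses)
    for name in list(core):
        trial = {n: c for n, c in core.items() if n != name}
        if not satisfiable(list(trial.values())):
            core = trial
    return set(core)
```

For instance, from hypotheses p, p → q, ¬q, and an irrelevant r, the core keeps only the first three; a Rodin user sees the analogous result when the plug-in reports which hypotheses actually caused the contradiction.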

Relevância:

60.00%

Publicador:

Resumo:

My thesis examines fine-scale habitat use and movement patterns of age-1 Greenland cod (Gadus macrocephalus ogac) tracked using acoustic telemetry. Recent advances in tracking technologies such as GPS and acoustic telemetry have led to increasingly large and detailed datasets that present new opportunities for researchers to address fine-scale ecological questions regarding animal movement and spatial distribution. There is a growing demand for home range models that will not only work with massive quantities of autocorrelated data, but that can also exploit the added detail inherent in these high-resolution datasets. Most published home range studies use radio-telemetry or satellite data from terrestrial mammals or avian species, and most studies that evaluate the relative performance of home range models use simulated data. In Chapter 2, I used actual field-collected data from age-1 Greenland cod tracked with acoustic telemetry to evaluate the accuracy and precision of six home range models: minimum convex polygons, kernel densities with plug-in bandwidth selection and with the reference bandwidth, adaptive local convex hulls, Brownian bridges, and dynamic Brownian bridges. I then applied the most appropriate model to two years (2010-2012) of tracking data collected from 82 tagged Greenland cod in Newman Sound, Newfoundland, Canada, to determine diel and seasonal differences in habitat use and movement patterns (Chapter 3). Little is known about juvenile cod ecology, so resolving these relationships will provide valuable insight into activity patterns, habitat use, and predator-prey dynamics, while filling a knowledge gap regarding the use of space by age-1 Greenland cod in a coastal nursery habitat. By doing so, my thesis demonstrates an appropriate technique for modelling the spatial use of fish from acoustic telemetry data that can be applied to high-resolution, high-frequency tracking datasets collected from mobile organisms in any environment.
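Of the home range models the thesis compares, the kernel-density estimator is the easiest to sketch. The NumPy fragment below estimates the area of the 95% kernel home range from a set of position fixes, using the reference (Silverman) bandwidth, h_j = sigma_j * n^(-1/6) per axis for 2-D data; plug-in bandwidth selection is considerably more involved and would normally come from a statistics package. The function names, grid resolution, and simulated data are all assumptions made for the illustration.

```python
import numpy as np

def reference_bandwidth(xy):
    """Silverman's reference bandwidth per axis for 2-D data:
    h_j = sigma_j * n**(-1/6)."""
    return xy.std(axis=0, ddof=1) * len(xy) ** (-1 / 6)

def kde_home_range(xy, iso=0.95, grid=100):
    """Area (in squared input units) of the `iso` kernel home range,
    estimated on a regular grid with a product Gaussian kernel."""
    hx, hy = reference_bandwidth(xy)
    pad = 3 * max(hx, hy)
    gx = np.linspace(xy[:, 0].min() - pad, xy[:, 0].max() + pad, grid)
    gy = np.linspace(xy[:, 1].min() - pad, xy[:, 1].max() + pad, grid)
    X, Y = np.meshgrid(gx, gy)
    dens = np.zeros_like(X)
    for x, y in xy:  # sum of Gaussian kernels centred on each fix
        dens += np.exp(-0.5 * (((X - x) / hx) ** 2 + ((Y - y) / hy) ** 2))
    cell = (gx[1] - gx[0]) * (gy[1] - gy[0])
    dens /= dens.sum() * cell               # normalise to a density
    order = np.sort(dens.ravel())[::-1]     # densest cells first
    cum = np.cumsum(order) * cell
    level = order[np.searchsorted(cum, iso)]  # lowest density inside
    return float((dens >= level).sum() * cell)
```

For fixes drawn from a standard bivariate normal, the analytic 95% contour encloses an area of about 18.8 units squared, and the kernel estimate lands nearby (slightly larger, because of the smoothing).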

Relevância:

60.00%

Publicador:

Resumo:

Open Access funded by the Medical Research Council. Acknowledgment: the work reported here was funded by a grant from the Medical Research Council, UK (grant number MR/J013838/1).

Relevância:

60.00%

Publicador:

Resumo:

In 2014, over 70% of people in Great Britain accessed the Internet every day. This resource is an optimal vector for malicious attackers to penetrate home computers, and as such, compromised pages have been increasing in both number and complexity. This paper presents X-Secure, a novel browser plug-in designed to raise the awareness of inexperienced users by analysing web pages before malicious scripts are executed by the host computer. X-Secure was able to detect over 90% of the tested attacks and provides a danger level based on cumulative analysis of the source code, the URL, and the remote server, using a set of heuristics, hence increasing the situational awareness of users browsing the Internet.
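Cumulative heuristic scoring of the kind described can be sketched as a weighted checklist over the page source and its URL. The heuristics, weights, and names below are hypothetical examples in the spirit of the paper, not X-Secure's actual rule set (which also inspects the remote server).

```python
import re
from urllib.parse import urlparse

# Hypothetical heuristics: each contributes a weight to a cumulative
# danger score when its predicate fires on the page or its URL.
HEURISTICS = [
    ("eval of strings",     2, lambda src, url: "eval(" in src),
    ("document.write",      1, lambda src, url: "document.write" in src),
    ("hex-obfuscated text", 2, lambda src, url:
        bool(re.search(r"(\\x[0-9a-fA-F]{2}){8,}", src))),
    ("hidden iframe",       3, lambda src, url:
        bool(re.search(r"<iframe[^>]+(width|height)=.?0", src))),
    ("IP-literal host",     2, lambda src, url:
        bool(re.fullmatch(r"[\d.]+", urlparse(url).hostname or ""))),
]

def danger_level(source: str, url: str):
    """Return the cumulative danger score and the heuristics that fired."""
    hits = [(name, w) for name, w, test in HEURISTICS if test(source, url)]
    return sum(w for _, w in hits), [name for name, _ in hits]
```

A benign page scores 0, while a page combining several indicators accumulates a high score; a plug-in like X-Secure would map such scores onto the danger levels it presents to the user before any script runs.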