992 results for "system configuration"


Relevance:

30.00%

Publisher:

Abstract:

This paper describes a knowledge model for a configuration problem in the domain of traffic control. The goal of this model is to help traffic engineers in the dynamic selection of a set of messages to be presented to drivers on variable message signs. This selection is done in a real-time context using data recorded by traffic detectors on motorways. The system follows an advanced knowledge-based solution that implements two abstract problem-solving methods according to a model-based approach recently proposed in the knowledge engineering field. Finally, the paper discusses the advantages and drawbacks found for this problem as a consequence of the applied knowledge modelling approach.


The aim of this paper is to describe an intelligent system for real-time road traffic control. The purpose of the system is to help traffic engineers select the state of traffic control devices in real time, using data recorded by traffic detectors on motorways. The system follows an advanced knowledge-based approach that implements an abstract generic problem-solving method, called propose-and-revise, which was proposed in Artificial Intelligence, within the knowledge engineering field, as a standard cognitive structure for solving configuration design problems. The paper presents the knowledge model of the system together with its inference strategy and describes how it was applied to the M-40 urban ring road of the city of Madrid.
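The propose-and-revise cycle described above can be sketched generically: extend a partial configuration one decision at a time, then repair any violated constraints with known fixes. The Python skeleton below, and the toy constraint problem it solves, are illustrative only; the function names and the example constraints are not from the paper.

```python
def propose_and_revise(initial, extend, violations, fixes):
    """Generic propose-and-revise skeleton: repeatedly extend a partial
    configuration, then repair any violated constraints with known fixes."""
    design = dict(initial)
    while True:
        step = extend(design)             # propose: add one design decision
        if step is None:                  # configuration is complete
            return design
        design.update(step)
        for v in violations(design):      # revise: apply a fix per violation
            design.update(fixes[v](design))

# Toy instance: propose two parameters a and b, with the constraint b <= a.
def extend(d):
    if "a" not in d:
        return {"a": 5}
    if "b" not in d:
        return {"b": 9}
    return None

violations = lambda d: ["b_gt_a"] if d.get("b", 0) > d.get("a", 0) else []
fixes = {"b_gt_a": lambda d: {"b": d["a"]}}

result = propose_and_revise({}, extend, violations, fixes)
# result == {"a": 5, "b": 5}: the proposed b = 9 was revised down to a
```

The revise step is what distinguishes the method from plain constructive search: a violated constraint does not trigger backtracking but an in-place repair of the offending decision.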


This document is the result of a web development process to create a tool that will allow Cracow University of Technology to consult, create and manage timetables. The technologies chosen for this purpose are Apache Tomcat Server, MySQL Community Server, the JDBC driver, and Java Servlets and JSPs for the server side. The client side relies on JavaScript, jQuery, AJAX and CSS to provide the dynamic behaviour. The document justifies the choice of these technologies and explains some development tools that help in the integration and development of all these elements: specifically, NetBeans IDE and MySQL Workbench have been used. After explaining all the elements involved in the development of the web application, the architecture and the code developed are explained through UML diagrams. Some implementation details related to security are explained in more depth through sequence diagrams. As the source code of the application is provided, an installation manual has been written to run the project. In addition, as the platform is intended to be a beta that will grow, some unimplemented ideas for future development are also presented. Finally, some annexes with important files and scripts related to the initialisation of the platform are attached. This project started from an existing tool that needed to be expanded. The main purpose of the project throughout its development has been to set the roots for a whole new platform that will replace the existing one. To this end, a deep inspection of the existing web technologies was needed: a web server and a SQL database had to be chosen. Although there were many alternatives, Java technology was finally selected for the server because of the large community behind it, the ease of modelling the language through UML diagrams, and the fact that it is licence-free software. Apache Tomcat is the open-source server that can use Java Servlet and JSP technology.
Regarding the SQL database, MySQL Community Server is the most popular open-source SQL server, with a large community behind it and quite a lot of tools to manage the server. JDBC is the driver needed to connect Java and MySQL. Once the technologies that would be part of the platform were chosen, the development process started. After a detailed explanation of the development environment installation, UML use case diagrams were used to set the main tasks of the platform; UML class diagrams served to establish the relations between the generated classes; the architecture of the platform was represented through UML deployment diagrams; and Enhanced Entity-Relationship (EER) models were used to define the tables of the database and their relationships. Apart from the previous diagrams, some implementation issues were explained to give a better understanding of the developed code; UML sequence diagrams helped to explain this. Once the whole platform was properly defined and developed, the performance of the application was shown: it was demonstrated that, in the current state of the code, the platform covers the use cases that were set as the main target. Nevertheless, some requirements needed for the proper working of the platform have been specified. As the project is intended to grow, some ideas that could not be added to this beta have been described so that they are not lost for future development. Finally, some annexes containing important configuration issues for the platform have been added after the corresponding explanation, as well as an installation guide that will let a new developer get the project ready. In addition to this document, some other files related to the project are provided: - Javadoc. The Javadoc containing the information of every Java class created, necessary for a better understanding of the source code. - database_model.mwb. This file contains the model of the database for MySQL Workbench.
This model allows, among other things, generating the MySQL script for the creation of the tables. - ScheduleManager.war. The WAR file that allows loading the developed application into Tomcat Server without using NetBeans. - ScheduleManager.zip. The source code exported from the NetBeans project, containing all the Java packages, JSPs, JavaScript files and CSS files that are part of the platform. - config.properties. The configuration file to obtain the names and credentials used for the database, also explained in Annex II (Example of config.properties file). - db_init_script.sql. The SQL script to initialise the database, explained in Annex III (SQL statements for MySQL initialization).


Dynamic and Partial Reconfiguration (DPR) allows a system to modify certain parts of itself at run-time. This feature gives rise to the capability of evolution: changing parts of the configuration according to the online evaluation of performance or other parameters. The evolution is achieved through a bio-inspired model in which the features of the system are identified as genes. The objective of the evolution need not be a single one; in this work, power consumption is taken into consideration together with the quality of filtering a noisy image, which is the measure of performance. Pareto optimality is applied to the evolutionary process in order to find a representative set of solutions that are optimal in terms of both performance and power consumption. The main contributions of this paper are: implementing an evolvable system on a low-power Spartan-6 FPGA included in a Wireless Sensor Network node and, by making a real measure of power consumption available at run-time, achieving multi-objective evolution, which yields different optimal configurations, among which the selected one will depend on the relative weights given to performance and power consumption.
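Pareto optimality over two objectives reduces to a dominance test between candidate configurations. The Python sketch below shows how a non-dominated set of (filtering error, power) candidates would be extracted; the data points are invented, not the paper's measurements.

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly
    better in at least one (both objectives are minimised here)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the non-dominated objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Toy population of (filtering error, power in mW) pairs, both to be minimised.
pop = [(0.10, 5.0), (0.20, 3.0), (0.15, 4.0), (0.20, 5.0)]
front = pareto_front(pop)
# front keeps the three trade-off points and drops (0.20, 5.0),
# which is worse than (0.10, 5.0) in error at equal power
```

A configuration is then picked from the front according to the relative weights given to the two objectives, as the abstract describes.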


This paper describes the GTH-UPM system for the Albayzin 2014 Search on Speech Evaluation. The evaluation task consists of searching for a list of terms/queries in audio files. The GTH-UPM system we present is based on a LVCSR (Large Vocabulary Continuous Speech Recognition) system. We have used the MAVIR corpus and the Spanish partition of the EPPS (European Parliament Plenary Sessions) database for training both acoustic and language models. The main effort has been focused on lexicon preparation and text selection for the language model construction. The system makes use of different lexicons and language models depending on the task being performed. For the best configuration of the system on the development set, we obtained a FOM of 75.27 for the keyword spotting task.
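In an LVCSR-based system, searching for a term reduces, in the simplest case, to matching its word sequence against the recognised 1-best transcript. A minimal sketch follows; the transcript and queries are invented for illustration.

```python
def search_terms(queries, transcript):
    """Naive spoken-term search over a 1-best transcript: a query hits
    if its word sequence appears contiguously in the recognised words."""
    words = transcript.lower().split()
    hits = {}
    for q in queries:
        qw = q.lower().split()
        hits[q] = any(words[i:i + len(qw)] == qw
                      for i in range(len(words) - len(qw) + 1))
    return hits

hits = search_terms(["parlamento europeo", "madrid"],
                    "sesion del Parlamento Europeo en Bruselas")
# hits == {"parlamento europeo": True, "madrid": False}
```

Real systems score candidate detections and trade hits against false alarms, which is what the FOM metric summarises; this sketch only shows the exact-match core of the task.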


Configuration tools based on high-level programming languages like LabVIEW allow the development of highly complex data acquisition systems, based on reconfigurable FPGA hardware, in a short time. The standardization of the hardware/software design cycle and the use of tools like EPICS ease the integration with the data acquisition and control platform of ITER, the Linux-based CODAC Core System (CCS). In this project a methodology is proposed to simplify the full integration cycle of new platforms like CompactRIO (cRIO), in which the data acquisition functionality can be reconfigured by the user to fit specific requirements. The main objective of this MSc final project is to integrate a cRIO NI-9159 system and its different analog and digital input/output modules with EPICS in a CCS. The CCS consists of a set of software tools that simplifies the integration of instrumentation and control systems in the ITER experiment. To achieve this goal the following tasks are carried out: • Development of a DAQ system based on FPGA using the cRIO hardware platform. This task comprises the configuration of the system and the implementation, in LabVIEW for FPGA, of the hardware needed to communicate with the I/O modules NI9205, NI9264, NI9401, NI9477, NI9426, NI9425 and NI9476. • Implementation of a software driver using the asynDriver methodology to integrate the cRIO system with EPICS. This task requires defining all the necessary EPICS records and creating the appropriate interfaces that allow communication with the hardware. • Description of the cRIO system and the EPICS driver in the ITER plant description tool, SDD. This automates the creation of the EPICS applications, called IOCs.


This paper deals with the prediction of velocity fields on the 2415-3S airfoil, which will be used for an unmanned aerial vehicle with an internal propulsion system, analysing the air flow through an internal duct of the airfoil using computational fluid dynamics. The main objective is to evaluate the effect of the internal air flow past the airfoil and how it affects the aerodynamic performance in terms of lift and drag forces. For this purpose, three different designs of the internal duct were studied, starting from the base 2415-3S airfoil developed in a previous investigation and based on the hypothesis of decreasing the flow separation produced when the propulsive airflow merges with the external flow, thereby obtaining the best configuration. To that end, an exhaustive study of the mesh sensitivity was performed. An unstructured mesh was used since the computational domain is three-dimensional and complex. The selected mesh contains approximately 12.5 million elements. Both the computational domain and the numerical solution were produced with commercial CAD and CFD software, respectively. The flow was treated as incompressible, steady air. The boundary conditions are in concordance with the experimental setup in the AF 6109 wind tunnel. The k-ε model is utilized to describe the turbulent flow process, following the references. The results yielded velocity contours as well as lift and drag coefficients, and also the location of separation and reattachment regions in some cases, at zero degrees angle of attack, on the internal and external surfaces of the airfoil. Finally, the configuration with the best aerodynamic performance was selected: the option without curved baffles.


This paper deals with the prediction of pressure and velocity fields on the 2415-3S airfoil, which will be used for an unmanned aerial vehicle with an internal propulsion system, analysing the air flow through an internal duct of the airfoil using computational fluid dynamics. The main objective is to evaluate the effect of the internal air flow past the airfoil and how it affects the aerodynamic performance in terms of lift and drag forces. For this purpose, three different designs of the internal duct were studied, starting from the base 2415-3S airfoil developed in a previous investigation and based on the hypothesis of decreasing the flow separation produced when the propulsive airflow merges with the external flow, thereby obtaining the best configuration. To that end, an exhaustive study of the mesh sensitivity was performed. An unstructured mesh was used since the computational domain is three-dimensional and complex. The selected mesh contains approximately 12.5 million elements. Both the computational domain and the numerical solution were produced with commercial CAD and CFD software, respectively. The flow was treated as incompressible, steady air. The boundary conditions are in concordance with the experimental setup in the AF 6109 wind tunnel. The k-ε model is utilized to describe the turbulent flow process, following the references. The results yielded pressure and velocity contours as well as lift and drag coefficients, and also the location of separation and reattachment regions in some cases, at zero degrees angle of attack, on the internal and external surfaces of the airfoil. Finally, the configuration with the best aerodynamic performance was selected: the option without curved baffles.
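The lift and drag forces reported by a CFD solver are usually reduced to dimensionless coefficients before comparing configurations. A minimal sketch of that reduction, with invented numbers rather than the paper's results:

```python
def aero_coefficients(lift, drag, rho, v, area):
    """Convert dimensional lift/drag forces [N] into the dimensionless
    coefficients C_L = L / (q*S) and C_D = D / (q*S), with dynamic
    pressure q = 0.5 * rho * v**2 and reference area S."""
    q = 0.5 * rho * v ** 2          # dynamic pressure [Pa]
    return lift / (q * area), drag / (q * area)

# Illustrative values only: sea-level air density, 20 m/s freestream,
# 0.5 m^2 reference area. q = 245 Pa, so q*S = 122.5 N per unit coefficient.
cl, cd = aero_coefficients(lift=61.25, drag=6.125, rho=1.225, v=20.0, area=0.5)
# cl == 0.5 and cd == 0.05
```

Comparing C_L and C_D rather than raw forces is what makes the three duct designs comparable across runs with different freestream conditions.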


Pumped storage hydro plants (PSHP) can provide adequate energy storage and frequency regulation capacities in isolated power systems having significant renewable energy resources. Due to its high wind and solar potential, several plans have been developed for La Palma Island in the Canary archipelago, aimed at increasing the penetration of these energy sources. In this paper, the performance of the frequency control of La Palma power system is assessed, when the demand is supplied by the available wind and solar generation with the support of a PSHP which has been predesigned for this purpose. The frequency regulation is provided exclusively by the PSHP. Due to topographic and environmental constraints, this plant has a long tail-race tunnel without a surge tank. In this configuration, the effects of pressure waves cannot be neglected and, therefore, usual recommendations for PID governor tuning provide poor performance. A PI governor tuning criterion is proposed for the hydro plant and compared with other criteria according to several performance indices. Several scenarios considering solar and wind energy penetration have been simulated to check the plant response using the proposed criterion. This tuning of the PI governor maintains La Palma system frequency within grid code requirements.
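The effect of PI governor gains on frequency regulation can be illustrated on a deliberately simplified plant. The sketch below uses a toy first-order surrogate, not the paper's hydraulic model with pressure waves, just to show the PI control law and a step-response check; all numbers are assumptions.

```python
def pi_step_response(kp, ki, plant_gain=1.0, tau=5.0, dt=0.01, t_end=30.0):
    """Discrete PI controller driving a first-order plant
    dy/dt = (plant_gain*u - y)/tau toward a unit setpoint.
    Returns the simulated output trajectory (forward-Euler integration)."""
    y, integral, out = 0.0, 0.0, []
    for _ in range(int(t_end / dt)):
        error = 1.0 - y                  # setpoint minus measured output
        integral += error * dt
        u = kp * error + ki * integral   # PI control law
        y += dt * (plant_gain * u - y) / tau
        out.append(y)
    return out

# Illustrative gains: integral action drives the steady-state error to zero.
y = pi_step_response(kp=2.0, ki=0.5)
# y[-1] is close to the unit setpoint after 30 s
```

Governor tuning criteria of the kind compared in the paper score such trajectories with performance indices (e.g. integral-of-error measures) and pick the gains that minimise them.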


The thermodynamic stability and oligomerization status of the tumor suppressor p53 tetramerization domain have been studied experimentally and theoretically. A series of hydrophilic mutations at Met-340 and Leu-344 of human p53 were designed to disrupt the hydrophobic dimer–dimer interface of the tetrameric oligomerization domain of p53 (residues 325–355). Meanfield calculations of the free energy of the solvated mutants as a function of interdimer distance were compared with experimental data on the thermal stability and oligomeric state (tetramer, dimer, or equilibrium mixture of both) of each mutant. The calculations predicted a decreasing stability and oligomeric state for the following amino acids at residue 340: Met (tetramer) > Ser > Asp, His, Gln > Glu, Lys (dimer), whereas the experimental results showed the following order: Met (tetramer) > Ser > Gln > His, Lys > Asp, Glu (dimers). For residue 344, the calculated trend was Leu (tetramer) > Ala > Arg, Gln, Lys (dimer), and the experimental trend was Leu (tetramer) > Ala, Arg, Gln, Lys (dimer). The discrepancy for the lysine side chain at residue 340 is attributed to the dual nature of lysine, both hydrophobic and charged. The incorrect prediction of stability of the mutant with Asp at residue 340 is attributed to the fact that within the meanfield approach, we use the wild-type backbone configuration for all mutants, but low melting temperatures suggest a softening of the α-helices at the dimer–dimer interface. Overall, this initial application of meanfield theory toward a protein-solvent system is encouraging for the application of the theoretical model to more complex systems.


Interdependence between the geometry of a fault system, its kinematics, and seismicity is investigated. A quantitative measure is introduced for the inconsistency between a fixed configuration of faults and the slip rates on each fault. This measure, named geometric incompatibility (G), depicts summarily the instability near the fault junctions: their divergence or convergence ("unlocking" or "locking up") and the accumulation of stress and deformations. Accordingly, the changes in G are connected with the dynamics of seismicity. Apart from geometric incompatibility, we consider the deviation K from the well-known Saint-Venant condition of kinematic compatibility. This deviation depicts summarily unaccounted stress and strain accumulation in the region and/or internal inconsistencies in a reconstruction of a block-and-fault system (its geometry and movements). The estimates of G and K provide a useful tool for bringing together the data on different types of movement in a fault system. An analog of the Stokes formula is found that allows determination of the total values of G and K in a region from data on its boundary. The phenomenon of geometric incompatibility implies that the nucleation of strong earthquakes is to a large extent controlled by processes near fault junctions. The junctions that have been locked up may act as transient asperities, and unlocked junctions may act as transient weakest links. Tentative estimates of K and G are made for each end of the Big Bend of the San Andreas fault system in Southern California. The recent strong earthquakes Landers (1992, M = 7.3) and Northridge (1994, M = 6.7) both reduced K but had opposite impacts on G: Landers unlocked the area, whereas Northridge locked it up again.


This study analyzes the repeatability, reproducibility and accuracy of a new hyperspectral system based on a pushbroom sensor as a means of measuring spectral features and color of materials and objects. The hyperspectral system consisted of a CCD camera, a spectrograph and an objective lens. An additional linear moving system allowed the mechanical scanning of the complete scene. A uniform overhead luminaire with daylight configuration was used to irradiate the scene using d:45 geometry. We followed the guidelines of the ASTM E2214-08 Standard Practice for Specifying and Verifying the Performance of Color-Measuring Instruments that define the standards and latest multidimensional procedures. The results obtained are analyzed in-depth and compared to those recently reported by other authors for spectrophotometers and multispectral systems. It can be concluded that hyperspectral systems are reliable and can be used in the industry to perform spectral and color readings with a high spatial resolution.
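Repeatability statistics of the kind prescribed by ASTM E2214 can be computed from repeated CIELAB readings of the same patch. A minimal sketch of the mean colour difference from the mean (MCDM), using a simple Euclidean CIELAB distance and hypothetical readings:

```python
import math

def mcdm(lab_readings):
    """Mean Colour Difference from the Mean: average Euclidean CIELAB
    distance of repeated (L*, a*, b*) readings from their centroid.
    A common short-term repeatability statistic; 0.0 means all repeated
    readings were identical."""
    n = len(lab_readings)
    mean = [sum(c[i] for c in lab_readings) / n for i in range(3)]
    return sum(math.dist(c, mean) for c in lab_readings) / n

# Three hypothetical repeated readings of one grey patch:
readings = [(50.0, 0.1, -0.1), (50.2, 0.0, 0.1), (49.8, -0.1, 0.0)]
# mcdm(readings) is a small positive number (roughly 0.2 here)
```

Production instruments are typically characterised with more elaborate colour-difference formulas, but the structure of the statistic, distance of each repeat from the centroid, is the same.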


In this work we study Forward Osmosis (FO) as an emerging desalination technology and its capability to totally or partially replace Reverse Osmosis (RO) in order to reduce the great amount of energy required by current desalination plants. For this purpose, we propose a superstructure that includes both membrane-based desalination technologies, allowing the selection of only one of them or a combination of both while seeking the optimal configuration of the network. The optimization problem is solved for a seawater desalination plant with a given fresh water production. The results obtained show that the optimal solution combines both desalination technologies to reduce not only the energy consumption but also the total cost of the desalination process, in comparison with the same plant operating only with RO.
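The idea of the superstructure, choosing one technology or a combination of both, can be illustrated with a brute-force scan over the FO/RO production split. The cost curves below are invented stand-ins for the paper's actual cost model; they only encode the qualitative coupling that FO pre-dilution makes the RO stage cheaper.

```python
def best_split(total_flow, cost_fo, cost_ro, step=0.05):
    """Scan the fraction f of production assigned to FO and return the
    (f, total cost) pair minimising a given per-m3 cost model. A toy
    stand-in for the paper's superstructure optimisation."""
    best = None
    for i in range(int(1 / step) + 1):
        f = i * step                                   # fraction treated by FO
        cost = total_flow * (f * cost_fo(f) + (1 - f) * cost_ro(f))
        if best is None or cost < best[1]:
            best = (f, cost)
    return best

# Invented convex cost curves [$/m3]: RO gets cheaper as FO share grows.
split, cost = best_split(1000.0,
                         cost_fo=lambda f: 0.40 + 0.30 * f,
                         cost_ro=lambda f: 0.50 - 0.25 * f)
# an interior split beats both the pure-RO (f=0) and pure-FO (f=1) corners
```

The real problem is a network design with many more degrees of freedom, but the qualitative result the abstract reports, a hybrid beating either pure option, already appears in this one-dimensional toy.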


We consider the effect of quantum interference on population distribution and photon statistics of a cavity field interacting with dressed states of a strongly driven three-level atom. We analyse three coupling configurations of the cavity field to the driven atom, with the cavity frequency tuned to the outer Rabi sideband, the inner Rabi sideband and the central frequency of the 'singly dressed' three-level atom. The quantum doubly dressed states for each configuration are identified and the population distribution and photon statistics are interpreted in terms of transitions among these dressed states and their populations. We find that the population distribution depends strongly on quantum interference and the cavity damping. For the cavity field tuned to the outer or inner Rabi sidebands the cavity damping induces transitions between the dressed states which are forbidden for the ordinary spontaneous emission. Moreover, we find that in the case of the cavity field coupled to the inner Rabi sideband the population distribution is almost Poissonian with a large average number of photons that can be controlled by quantum interference. This system can be considered as a one-atom dressed-state laser with controlled intensity.
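Whether a photon-count record is Poissonian, as reported for the inner-sideband coupling, is commonly quantified with the Mandel Q parameter. A short sketch with illustrative count records:

```python
def mandel_q(counts):
    """Mandel Q parameter of a photon-number record:
    Q = (<n^2> - <n>^2) / <n> - 1.
    Q = 0 for Poissonian statistics, Q < 0 sub-Poissonian,
    Q > 0 super-Poissonian."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / n
    return var / mean - 1.0

# mandel_q([0, 8, 4, 4]) == 1.0   (variance twice the mean: super-Poissonian)
# mandel_q([4, 4, 4, 4]) == -1.0  (zero variance: maximally sub-Poissonian)
```

For a nearly Poissonian cavity field of the kind described above, Q would sit close to zero, with its sign and magnitude controlled by the interference parameters.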


The influence of a new aeration system on the biopile performance was investigated. The purpose was to increase biodegradation efficiency by optimising airflow through the pile. During a 1-month field trial, the performance of a new system using two perforated vertical pipes with wind-driven turbines was compared with that of a standard pile configuration with two horizontal perforated pipes. Both piles were composed of a similar mix of diesel-contaminated soils, woodchips, compost and NPK fertiliser. Hydrocarbons were recovered using solvent extraction, and determined both gravimetrically and by gas chromatography. Total heterotrophs, pH and moisture content were also assessed. Air pressure measurements were made to compare the efficiency of suction in the pipes. Results at the end of the experiment showed that there was no significant difference between the two piles in the total amount of hydrocarbon biodegradation. The normalised degradation rate was, however, considerably higher in the new system than in the standard one, suggesting that the vertical venting method may have improved the efficiency of the biological reactions in the pile. The pressure measurements showed a significant improvement in the suction produced by the new aeration system. However, many factors other than the airflow (oxygen supply) may influence and limit the biodegradation rates, including moisture content, age of contaminants and the climatic conditions. Additional experiments and modelling need to be carried out to explore further the new aeration method and to develop criteria and guidelines for engineering design of optimal aeration schemes in order to achieve maximum biodegradation in biopiles.
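Normalised degradation rates of the kind compared between the two piles are commonly derived from a first-order decay model. A minimal sketch with hypothetical TPH concentrations (the values are not from the trial):

```python
import math

def first_order_k(c0, ct, t):
    """First-order decay constant k [1/day] from C(t) = C0 * exp(-k*t);
    a common way to normalise hydrocarbon degradation rates so that piles
    with different starting concentrations can be compared."""
    return math.log(c0 / ct) / t

# Hypothetical TPH concentrations [mg/kg] over a 30-day trial:
k = first_order_k(c0=10000.0, ct=6000.0, t=30.0)
half_life = math.log(2) / k   # days needed to halve the concentration
```

Comparing k (or half-life) between piles factors out the initial contamination level, which is what lets a higher "normalised degradation rate" coexist with no significant difference in total hydrocarbon removed.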