902 results for Supervisory Control and Data Acquisition (SCADA)
Abstract:
The properties of data and activities in business processes can be used to greatly facilitate several relevant tasks performed at design- and run-time, such as fragmentation, compliance checking, or top-down design. Business processes are often described using workflows. We present an approach for mechanically inferring business domain-specific attributes of workflow components (including data items, activities, and elements of sub-workflows), taking as a starting point known attributes of workflow inputs and the structure of the workflow. We achieve this by modeling these components as concepts and applying sharing analysis to a Horn clause-based representation of the workflow. The analysis is applicable to workflows featuring complex control and data dependencies, embedded control constructs such as loops and branches, and embedded component services.
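The core inference step can be pictured as a fixed-point propagation of known input attributes along the workflow's data-flow structure. The sketch below is only an illustration of that idea (the loan-approval fragment and all names are invented; the actual approach works on a Horn clause representation via sharing analysis):

```python
# Hypothetical sketch: propagating domain attributes of workflow inputs
# through a workflow's data-flow edges by fixed-point iteration.
from collections import defaultdict

def infer_attributes(edges, input_attrs):
    """edges: list of (source, target) data-flow dependencies.
    input_attrs: dict mapping workflow inputs to sets of domain attributes.
    Returns the attributes inferred for every component."""
    attrs = defaultdict(set)
    for comp, a in input_attrs.items():
        attrs[comp] |= set(a)
    changed = True
    while changed:  # iterate to a fixed point (handles loops in the workflow)
        changed = False
        for src, dst in edges:
            new = attrs[src] - attrs[dst]
            if new:
                attrs[dst] |= new
                changed = True
    return dict(attrs)

# Invented example: a small loan-approval workflow fragment
edges = [("application", "credit_check"), ("credit_check", "decision"),
         ("application", "decision")]
inferred = infer_attributes(edges, {"application": {"confidential"}})
print(inferred["decision"])  # the 'confidential' attribute propagates
```

The fixed-point loop is what lets the same propagation handle embedded loops and branches: iteration simply continues until no component gains a new attribute.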
Abstract:
The characterization of photovoltaic modules provides the electrical specifications needed to determine the energy efficiency of a concentrator photovoltaic (CPV) module. This characterization is obtained through IV-curve measurements, in the same way as for conventional silicon modules. This project carried out the optimization and extension of a program for the measurement and characterization of up to four photovoltaic modules installed outdoors on a tracker. The program, developed in LabVIEW, operates the measurement system and acquires the characterization data of the module being measured. As a first step, an already deployed application was taken as a basis and its operation was analyzed in order to optimize it and extend it with new features. The most relevant new feature for module measurement seeks to prevent the module, between one measurement and the next, from dissipating all the energy it absorbs and thus heating up. This has been achieved by introducing an electronic load into the measurement system, which keeps the module biased at its maximum power point whenever no measurement is being taken on it. This document describes the devices that make up the whole measurement system, as well as the program's software. In addition, a user manual is included for easy handling of the program.
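As an illustration of what IV-curve characterization extracts, the sketch below (not the thesis's LabVIEW code; the sample values are invented) computes the short-circuit current, open-circuit voltage and maximum power point from sampled IV data:

```python
# Illustrative sketch: basic characterization parameters from a measured
# IV curve. Assumes samples start at V = 0 and sweep past open circuit.

def characterize_iv(voltages, currents):
    """Return short-circuit current, open-circuit voltage (linear
    interpolation at the zero-current crossing) and the maximum power point."""
    isc = currents[0]  # current at V = 0
    voc = None
    points = list(zip(voltages, currents))
    for (v1, i1), (v2, i2) in zip(points, points[1:]):
        if i1 > 0 >= i2:  # current crosses zero between these samples
            voc = v1 + (v2 - v1) * i1 / (i1 - i2)
            break
    powers = [v * i for v, i in zip(voltages, currents)]
    p_mpp = max(powers)
    v_mpp = voltages[powers.index(p_mpp)]
    return isc, voc, v_mpp, p_mpp

# Invented sample curve for a module
v = [0.0, 10.0, 20.0, 30.0, 35.0, 40.0]
i = [5.0, 4.9, 4.7, 4.0, 2.0, -0.5]
isc, voc, v_mpp, p_mpp = characterize_iv(v, i)
print(isc, voc, v_mpp, p_mpp)
```

The maximum power point found this way is also the bias point at which the electronic load would hold the module between measurements.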
Abstract:
The spread of wireless networks and the growing proliferation of mobile devices require the development of mobility control mechanisms to support the different demands of traffic under different network conditions. A major obstacle to developing this kind of technology is the complexity involved in handling all the information about the large number of Moving Objects (MO), as well as the entire signaling overhead required to manage these procedures in the network. Although several initiatives have been proposed by the scientific community to address this issue, they have not proved effective, since they depend on a particular request from the MO, which is responsible for triggering the mobility process. Moreover, they are often guided only by wireless medium statistics, such as the Received Signal Strength Indicator (RSSI) of the candidate Point of Attachment (PoA). Thus, this work seeks to develop, evaluate and validate a sophisticated communication infrastructure for Wireless Networking for Moving Objects (WiNeMO) systems by making use of the flexibility provided by the Software-Defined Networking (SDN) paradigm, where network functions are easily and efficiently deployed by integrating the OpenFlow and IEEE 802.21 standards. For benchmarking purposes, the analysis was conducted on both the control and data planes, demonstrating that the proposal significantly outperforms a typical IP-based SDN with QoS-enabled capabilities, by allowing the network to handle multimedia traffic with optimal Quality of Service (QoS) transport and acceptable Quality of Experience (QoE) over time.
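The contrast the abstract draws is between RSSI-only, terminal-triggered handover and a network-side decision that can weigh more information. The toy sketch below (entirely invented: the score function, weights and thresholds are illustrative, not from the work) shows a controller-side PoA selection that considers load as well as RSSI:

```python
# Hypothetical sketch of a network-driven PoA selection: the controller
# scores candidates by more than RSSI alone (weights are invented).

def select_poa(candidates):
    """candidates: list of dicts with 'name', 'rssi' (dBm), 'load' (0..1)."""
    def score(poa):
        # normalize RSSI from [-90, -30] dBm to [0, 1]; penalize loaded PoAs
        rssi_norm = (poa["rssi"] + 90) / 60
        return 0.6 * rssi_norm + 0.4 * (1 - poa["load"])
    return max(candidates, key=score)["name"]

poas = [{"name": "AP1", "rssi": -40, "load": 0.9},   # strong signal, congested
        {"name": "AP2", "rssi": -55, "load": 0.1}]   # weaker signal, idle
print(select_poa(poas))
```

An RSSI-only policy would pick AP1 here; weighing load steers the MO to the idle AP2, which is the kind of decision an SDN controller with a network-wide view can make.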
Abstract:
The Galway Bay wave energy test site promises to be a vital resource for wave energy researchers and developers. As part of the development of this site, a floating power system is being developed to provide power and data acquisition capabilities, including its function as a local grid connection, allowing for the connection of up to three wave energy converter devices. This work shows results from scaled physical model testing and numerical modelling of the floating power system and an oscillating water column connected with an umbilical. Results from this study will be used to influence further scaled testing as well as the full scale design and build of the floating power system in Galway Bay.
Abstract:
Light is the main information about the interstellar medium accessible on Earth. Based on this information one can draw conclusions about the composition of the region where the light originates, as well as about its history. This requires that the different absorption and emission features in the spectrum can be identified and assigned to certain molecules, atoms or ions. To enable the identification of the different species, precise spectroscopic investigations of the species in the laboratory are necessary. In this work a new spectroscopic method is presented, which can be used to record pure rotational spectra of mass-selected, cold, stored molecular ions. It is based on the idea of state-specific attachment of helium atoms to the stored molecular ions. The new technique has been made possible through the development and recent completion of two new 22-pole ion trap instruments in the Laboratory Astrophysics group at the University of Cologne. These new instruments have the advantage of reaching temperatures as low as 4 K, compared to the 10 K of the predecessor instrument. These low temperatures enable the ternary attachment of helium atoms to the stored molecular ions and thereby make it possible to develop this new method for pure rotational spectroscopy. Accordingly, this work is divided into two parts. The first part deals with the new FELion experiment, which was built and characterized in the first part of the thesis. FELion is a cryogenic 22-pole ion trap apparatus that makes it possible to generate, mass select, store, cool down, and analyze molecular ions. The different components of the instrument, e.g. the Storage Ion Source for generating the ions or the first quadrupole mass filter, are described and characterized in this part. In addition, the newly developed control and data acquisition system is introduced. With this instrument the measurements presented in the second part of the work were performed.
The second part deals with the new action spectroscopic method of state-selective helium attachment to the stored molecular ions. For a deeper analysis of the new technique, the CD+/helium and HCO+/helium systems are investigated in detail. Analytical and numerical models of the process are presented and compared to experimental results. The results of these investigations point to a seemingly very general applicability of the new method to a wide class of molecular ions. In the final part of the thesis, measurements of the rotational spectrum of l-C3H+ are presented. These measurements deserve to be highlighted, since it was possible for the first time to unambiguously measure four low-lying rotational transitions of l-C3H+ in the laboratory. These measurements (Brünken et al. ApJL 783, L4 (2014)) enabled the reliable identification of previously unidentified emission lines observed in several regions of the interstellar medium (Pety et al. Astron. Astrophys. 548, A68 (2012); McGuire et al. The Astrophysical Journal 774, 56 (2013); McGuire et al. The Astrophysical Journal 783, 36 (2014)).
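Why state-selective attachment yields a spectroscopic signal can be seen in a toy rate model: if helium attaches at different rates to two rotational states, resonantly pumping population between them changes how many He-tagged ions are formed during the trapping time. The model below is my own illustration, not one of the thesis's analytical or numerical models; all rates are invented:

```python
# Toy two-state kinetic model of state-selective helium attachment,
# integrated with a simple Euler scheme. All rate constants are invented.

def tagged_fraction(k_ground, k_excited, pump_rate, t_trap=1.0, dt=1e-3):
    """n_g, n_e: bare-ion populations in two rotational states.
    Tagging removes ions at a state-dependent rate; pumping transfers
    population between the states when the radiation is resonant."""
    n_g, n_e, tagged = 1.0, 0.0, 0.0
    for _ in range(int(t_trap / dt)):
        pump = pump_rate * (n_g - n_e) * dt   # resonant population transfer
        attach_g = k_ground * n_g * dt
        attach_e = k_excited * n_e * dt
        n_g += -pump - attach_g
        n_e += pump - attach_e
        tagged += attach_g + attach_e
    return tagged

off = tagged_fraction(k_ground=2.0, k_excited=0.5, pump_rate=0.0)
on = tagged_fraction(k_ground=2.0, k_excited=0.5, pump_rate=5.0)
print(off > on)  # pumping into the slower-attaching state reduces tagging
```

Scanning the radiation frequency and counting tagged ions thus traces out the rotational spectrum: the tagged-ion count dips (or rises) only where a transition is resonant.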
Abstract:
Reconfigurable platforms are a promising technology that offers an interesting trade-off between flexibility and performance, which many recent embedded system applications demand, especially in fields such as multimedia processing. These applications typically involve multiple ad-hoc tasks for hardware acceleration, which are usually represented using formalisms such as Data Flow Diagrams (DFDs), Data Flow Graphs (DFGs), Control and Data Flow Graphs (CDFGs) or Petri Nets. However, none of these models is able to capture at the same time the pipeline behavior between tasks (which can therefore coexist in order to minimize the application execution time), their communication patterns, and their data dependencies. This paper shows that knowledge of all this information can be effectively exploited to reduce the resource requirements and improve the timing performance of modern reconfigurable systems, where a set of hardware accelerators is used to support the computation. For this purpose, this paper proposes a novel task representation model, named Temporal Constrained Data Flow Diagram (TCDFD), which includes all this information. This paper also presents a mapping-scheduling algorithm that is able to take advantage of the new TCDFD model. It aims at minimizing the dynamic reconfiguration overhead while meeting the communication requirements among the tasks. Experimental results show that the presented approach achieves up to 75% resource savings and up to 89% reduction in reconfiguration overhead with respect to other state-of-the-art techniques for reconfigurable platforms.
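Where the reconfiguration-overhead savings come from can be illustrated with a much simpler model than TCDFD: if the scheduler knows which accelerator each task needs, it can reuse a region that already holds that accelerator instead of reconfiguring. The sketch below is not the paper's mapping-scheduling algorithm, only a greedy LRU placement with invented costs:

```python
# Illustrative sketch (not TCDFD): greedy placement of hardware tasks onto
# reconfigurable regions, reusing already-loaded accelerators when possible.

def schedule(tasks, n_regions, reconf_cost):
    """tasks: accelerator types in execution order. On a miss, the
    least-recently-used region is reconfigured; hits cost nothing."""
    regions = [None] * n_regions
    lru = list(range(n_regions))   # region indices, least recently used first
    overhead = 0
    for t in tasks:
        if t in regions:
            r = regions.index(t)   # hit: accelerator already loaded
        else:
            r = lru[0]             # miss: reconfigure the LRU region
            regions[r] = t
            overhead += reconf_cost
        lru.remove(r)
        lru.append(r)
    return overhead

tasks = ["fir", "fft", "fir", "fft"]
naive = len(tasks) * 10                              # reconfigure every task
reused = schedule(tasks, n_regions=2, reconf_cost=10)  # reuse when loaded
print(naive, reused)
```

Even this crude reuse halves the overhead on the invented workload; the paper's contribution is doing this while also honoring pipeline, communication and data-dependency constraints, which this sketch ignores.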
Abstract:
In the near future, the LHC experiments will continue to be upgraded as the LHC luminosity increases from the design value of 10³⁴ cm⁻²s⁻¹ to 7.5 × 10³⁴ cm⁻²s⁻¹ with the HL-LHC project, to reach 3000 fb⁻¹ of accumulated statistics. After the end of a period of data collection, CERN will face a long shutdown to improve overall performance by upgrading the experiments and implementing more advanced technologies and infrastructures. In particular, ATLAS will upgrade parts of the detector, the trigger, and the data acquisition system. It will also implement new strategies and algorithms for processing and transferring the data to the final storage. This PhD thesis presents a study of a new pattern recognition algorithm to be used in the trigger system, software designed to provide the information necessary to select physical events from background data. The idea is to use the well-known Hough Transform as an algorithm for detecting particle trajectories. The effectiveness of the algorithm has already been validated in the past, independently of particle physics applications, for detecting generic shapes in images. Here, a software emulation tool is proposed for the hardware implementation of the Hough Transform, to reconstruct the tracks in the ATLAS Trigger and Data Acquisition system. Until now, it has never been implemented in electronics in particle physics experiments, and as a hardware implementation it would provide overall latency benefits. A comparison between the simulated data and the physical system was performed on a Xilinx UltraScale+ FPGA device.
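The Hough Transform idea behind the algorithm can be shown in a few lines: each detector hit votes for every track candidate (here parameterized by slope and intercept of a straight line) consistent with it, and real tracks appear as peaks in the vote accumulator. The geometry and binning below are invented for illustration and are far simpler than the ATLAS case:

```python
# Simplified Hough Transform sketch: hits vote for (slope, intercept)
# track candidates; a genuine track produces a peak in the accumulator.
from collections import Counter

def hough_vote(hits, slopes):
    """hits: (x, y) points; slopes: candidate slope values.
    Returns an accumulator keyed by (slope, rounded intercept)."""
    acc = Counter()
    for x, y in hits:
        for m in slopes:
            c = round(y - m * x, 2)  # intercept this hit implies for slope m
            acc[(m, c)] += 1
    return acc

# Five hits on the line y = 0.5 x + 1, plus one noise hit
hits = [(1, 1.5), (2, 2.0), (3, 2.5), (4, 3.0), (5, 3.5), (2, 9.0)]
slopes = [0.0, 0.25, 0.5, 0.75]
acc = hough_vote(hits, slopes)
(best_m, best_c), votes = acc.most_common(1)[0]
print(best_m, best_c, votes)  # the true track parameters win the vote
```

The appeal for a hardware trigger is that the voting loop is embarrassingly parallel: each hit updates accumulator cells independently, which maps naturally onto FPGA logic.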
Abstract:
This paper describes the communication stack of the REMPLI system: a structure using power lines and IP-based networks for communication, data acquisition, and control of energy distribution and consumption. It is furthermore prepared to use alternative communication media such as GSM or analog modem connections. The REMPLI system provides communication services for existing applications, namely automated meter reading, energy billing and domotic applications. The communication stack, consisting of the physical, network, transport and application layers, is described, as well as the communication services provided by the system. We show how the peculiarities of power-line communication influence the design of the communication stack, by introducing requirements to use the limited bandwidth efficiently, optimize traffic, and implement fair use of the communication medium among the large number of communication partners.
Abstract:
The TCABR data analysis and acquisition system has been upgraded to support a joint research programme using remote participation technologies. The architecture of the new system uses the Java language as its programming environment. Since application parameters and hardware in a joint experiment are complex, with a large variability of components, requirements and specification solutions need to be flexible and modular, independent of operating system and computer architecture. To describe and organize the information on all the components and the connections among them, systems are developed using eXtensible Markup Language (XML) technology. The communication between clients and servers uses remote procedure calls (RPC) based on XML (XML-RPC). The integration of the Java language, XML and XML-RPC technologies makes it easy to develop a standard data and communication access layer between users and laboratories using common software libraries and a Web application. The libraries allow data retrieval using the same methods for all user laboratories in the joint collaboration, and the Web application provides simple graphical user interface (GUI) access. The TCABR tokamak team, in collaboration with the IPFN (Instituto de Plasmas e Fusao Nuclear, Instituto Superior Tecnico, Universidade Tecnica de Lisboa), is implementing these remote participation technologies. The first version was tested at the Joint Experiment on TCABR (TCABRJE), a Host Laboratory Experiment organized in cooperation with the IAEA (International Atomic Energy Agency) in the framework of the IAEA Coordinated Research Project (CRP) on "Joint Research Using Small Tokamaks". (C) 2010 Elsevier B.V. All rights reserved.
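What XML-RPC contributes to such a layer is that both the call and its result travel as plain XML documents, which any language can produce and parse. The minimal illustration below uses Python's standard library rather than the system's Java code, and the method name and arguments (`getSignal`, a shot number and channel) are invented:

```python
# Minimal illustration of XML-RPC marshalling: a remote call and its
# result are both encoded as XML documents. Method/argument names invented.
import xmlrpc.client

# Encode a call such as getSignal(shot_number, channel) as an XML request
request = xmlrpc.client.dumps((12345, "mirnov_01"), methodname="getSignal")
print("<methodCall>" in request)  # the request is an XML document

# Encode a response carrying signal samples, then decode it as a client would
response = xmlrpc.client.dumps(([0.1, 0.2, 0.3],), methodresponse=True)
(data,), _ = xmlrpc.client.loads(response)
print(data)
```

Because the wire format is language-neutral XML over HTTP, the same server can serve Java, Python or web-application clients with identical methods, which is exactly the uniformity the abstract describes across collaborating laboratories.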
Abstract:
Supervising and controlling the many processes involved in petroleum production is both dangerous and complex. Herein, we propose a multiagent supervisory and control system for handling continuous processes such as those in the chemical and petroleum industries. In its architecture, there are agents responsible for managing data production and analysis, as well as the production equipment. Fuzzy controllers are used as control agents. The application of the fuzzy control system to the management of an offshore petroleum production installation, applied to a subsea separation process, is described. © 2008 IEEE.
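A control agent of the kind mentioned can be sketched as a small Mamdani-style fuzzy controller: fuzzify the input with membership functions, fire a few rules, and defuzzify by a weighted average. The membership functions, rules and the separator-level scenario below are invented for illustration, not taken from the paper:

```python
# Sketch of a Mamdani-style fuzzy controller (rules and memberships invented):
# maps a separator level error in [-1, 1] to a valve adjustment in [-1, 1].

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def valve_command(level_error):
    """Three rules, defuzzified by the weighted average of rule outputs."""
    rules = [
        (tri(level_error, -1.5, -1.0, 0.0), -1.0),  # error negative -> close
        (tri(level_error, -1.0, 0.0, 1.0), 0.0),    # error near zero -> hold
        (tri(level_error, 0.0, 1.0, 1.5), 1.0),     # error positive -> open
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0

print(valve_command(0.0), valve_command(0.5), valve_command(1.0))
```

The appeal for hazardous continuous processes is that each rule reads as an operator heuristic ("if the level is a bit high, open the valve a bit"), while the defuzzified output still varies smoothly with the input.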
Abstract:
This is the first part of a study investigating a model-based transient calibration process for diesel engines. The motivation is to populate hundreds of calibratable parameters in a methodical and optimal manner by using model-based optimization in conjunction with the manual process, so that, relative to the manual process used by itself, a significant improvement in transient emissions and fuel consumption and a sizable reduction in calibration time and test cell requirements are achieved. Empirical transient modelling and optimization are addressed in the second part of this work, while the data required for model training and generalization are the focus of the current work. Transient and steady-state data from a turbocharged multi-cylinder diesel engine have been examined from a model training perspective. A single-cylinder engine with external air handling has been used to expand the steady-state data to encompass the transient parameter space. Based on comparative model performance and differences in the non-parametric space, primarily driven by a high engine pressure difference between the exhaust and intake manifolds (ΔP) during transients, it is recommended that transient emission models be trained with transient training data. It has been shown that electronic control module (ECM) estimates of transient charge flow and the exhaust gas recirculation (EGR) fraction cannot be accurate at the high engine ΔP frequently encountered during transient operation, and that such estimates do not account for cylinder-to-cylinder variation. The effects of high engine ΔP must therefore be incorporated empirically by using transient data generated from a spectrum of transient calibrations. Specific recommendations have been made on how to choose such calibrations, how much data to acquire, and how to specify transient segments for data acquisition. Methods to process transient data to account for transport delays and sensor lags have been developed.
The processed data have then been visualized using statistical means to understand transient emission formation. Two modes of transient opacity formation have been observed and described. The first mode is driven by high engine ΔP and low fresh-air flow rates, while the second mode is driven by high engine ΔP and high EGR flow rates. The EGR fraction is inaccurately estimated in both modes, while EGR distribution has been shown to be present but unaccounted for by the ECM. The two modes and associated phenomena are essential to understanding why transient emission models are calibration dependent and, furthermore, how to choose training data that will result in good model generalization.
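Two of the processing steps mentioned, transport-delay alignment and sensor-lag correction, can be sketched generically. The code below is my own illustration (the delay search by correlation and the first-order lag inversion are standard techniques; the signals and constants are invented, and the study's actual methods may differ):

```python
# Generic sketches of two transient-data processing steps:
# (1) find a transport delay by maximizing correlation against a reference,
# (2) invert a first-order sensor lag, x_meas' = (x_true - x_meas) / tau.

def best_delay(reference, delayed, max_shift):
    """Return the shift (in samples) that best aligns `delayed` to `reference`."""
    def corr(shift):
        return sum(a * b for a, b in zip(reference, delayed[shift:]))
    return max(range(max_shift + 1), key=corr)

def undo_first_order_lag(measured, tau, dt):
    """Reconstruct the true signal from a lagged measurement
    using a finite-difference estimate of the derivative."""
    out = []
    for k in range(len(measured) - 1):
        deriv = (measured[k + 1] - measured[k]) / dt
        out.append(measured[k] + tau * deriv)
    return out

pulse = [0, 0, 0, 1, 2, 1, 0, 0, 0, 0]   # reference signal
delayed = [0, 0] + pulse[:-2]             # same signal, 2 samples later
print(best_delay(pulse, delayed, max_shift=4))
```

Aligning each channel in time this way is what allows emissions measured downstream to be attributed to the in-cylinder conditions that actually produced them.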
Abstract:
Objectives: The present study describes the natural history of Porphyromonas gingivalis, Actinobacillus actinomycetemcomitans and Prevotella intermedia over a 5-year period and the effect of a triclosan/copolymer dentifrice on these organisms in a normal adult population. Material and Methods: Subgingival plaque samples were collected from 504 adult volunteers. Probing pocket depths (PPD) and relative attachment levels were measured using an automated probe. Participants were matched for disease status (CPI), plaque index, age and gender, and allocated to receive either a triclosan/copolymer or a placebo dentifrice. Re-examination and subgingival plaque sampling were repeated after 1, 2, 3, 4 and 5 years. P. gingivalis, A. actinomycetemcomitans and P. intermedia were detected and quantitated using an enzyme-linked immunosorbent assay. Logistic regression and generalised linear modelling were used to analyse the data. Results: This 5-year longitudinal study showed considerable volatility in the acquisition and loss (below the level of detection) of all three organisms in this population. Relatively few subjects had these organisms on multiple occasions. While P. gingivalis was related to loss of attachment and to PPD ≥ 3.5 mm, there was no relationship between A. actinomycetemcomitans or P. intermedia and disease progression over the 5 years of the study. Smokers with P. gingivalis had more PPD ≥ 3.5 mm than smokers without this organism. There was no significant effect of the triclosan dentifrice on P. gingivalis or A. actinomycetemcomitans. Subjects using triclosan were more likely to have P. intermedia than those not using the dentifrice; however, this did not translate into these subjects having higher levels of P. intermedia, and its presence was uniform, showing no signs of increasing over the course of the study. Conclusion: The present 5-year longitudinal study has shown the transient nature of colonisation with P. gingivalis, A. actinomycetemcomitans and P. intermedia in a normal adult population. The use of a triclosan-containing dentifrice did not lead to an overgrowth of these organisms. The clinical effect of the dentifrice would appear to be independent of its antimicrobial properties.
Abstract:
Beyond the classical statistical approaches (determination of basic statistics, regression analysis, ANOVA, etc.), a new set of applications of different statistical techniques has increasingly gained relevance in the analysis, processing and interpretation of data concerning the characteristics of forest soils, as can be seen in some recent publications in the context of Multivariate Statistics. These new methods require additional care that is not always taken or mentioned in some approaches. In the particular case of geostatistical applications it is necessary, besides geo-referencing all data acquisition, to collect the samples in regular grids and in sufficient quantity so that the variograms can reflect the spatial distribution of soil properties in a representative manner. The great majority of Multivariate Statistics techniques (Principal Component Analysis, Correspondence Analysis, Cluster Analysis, etc.), despite not requiring in most cases the assumption of a normal distribution, nevertheless need a proper and rigorous strategy for their use. In this work, some reflections on these methodologies are presented, in particular on the main constraints that often occur during the data collection process and on the various possibilities for linking these different techniques. Finally, illustrations of some particular cases of application of these statistical methods are also presented.
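The variogram requirement mentioned above can be made concrete: the empirical semivariogram averages squared differences between samples a given lag apart, and it only reflects spatial structure reliably when the grid is regular and well populated. The sketch below (1-D transect, invented sample values) shows the computation:

```python
# Toy empirical semivariogram on a regular 1-D grid:
# gamma(h) = mean squared difference of samples separated by lag h, halved.

def empirical_variogram(values, max_lag):
    """values: measurements on a regular grid; returns {lag: semivariance}."""
    gamma = {}
    for h in range(1, max_lag + 1):
        diffs = [(values[i + h] - values[i]) ** 2
                 for i in range(len(values) - h)]
        gamma[h] = sum(diffs) / (2 * len(diffs))
    return gamma

# Invented soil-property samples along a transect
samples = [1.0, 1.2, 1.1, 1.6, 1.8, 2.1, 2.0, 2.4]
g = empirical_variogram(samples, max_lag=3)
print(g)  # semivariance grows with lag: nearby samples are more alike
```

Note how the number of pairs shrinks as the lag grows, which is precisely why sufficient sample quantity is needed for the variogram to be representative at the lags of interest.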
Abstract:
Objective. To study the acquisition and cross-transmission of Staphylococcus aureus in different intensive care units (ICUs). Methods. We performed a multicenter cohort study in which 6 ICUs in 6 countries participated. During a 3-month period at each ICU, all patients had nasal and perineal swab specimens obtained at ICU admission and during their stay. All S. aureus isolates collected were genotyped by spa typing and multilocus variable-number tandem-repeat analysis typing for cross-transmission analysis. A total of 629 patients were admitted to the ICUs, and 224 of these patients were found to be colonized with S. aureus at least once during their ICU stay (22% were found to be colonized with methicillin-resistant S. aureus [MRSA]). A total of 316 patients who tested negative for S. aureus at ICU admission and had at least 1 follow-up swab sample obtained for culture were eligible for acquisition analysis. Results. A total of 45 patients acquired S. aureus during their ICU stay (31 acquired methicillin-susceptible S. aureus [MSSA] and 14 acquired MRSA). Several factors believed to affect the rate of acquisition of S. aureus were analyzed in univariate and multivariate analyses, including the amount of hand disinfectant used, colonization pressure, number of beds per nurse, antibiotic use, length of stay, and ICU setting (private room versus open ICU treatment). Greater colonization pressure and a greater number of beds per nurse correlated with a higher rate of acquisition of both MSSA and MRSA. The type of ICU setting was related to MRSA acquisition only, and the amount of hand disinfectant used was related to MSSA acquisition only. In 18 (40%) of the cases of S. aureus acquisition, cross-transmission from another patient was possible. Conclusions. Colonization pressure, the number of beds per nurse, and the treatment of all patients in private rooms correlated with the number of S. aureus acquisitions in an ICU.
The amount of hand disinfectant used was correlated with the number of cases of MSSA acquisition but not with the number of cases of MRSA acquisition. The number of cases of patient-to-patient cross-transmission was comparable for MSSA and MRSA.
Abstract:
Multiple genome-wide association studies (GWAS) have been performed in HIV-1-infected individuals, identifying common genetic influences on viral control and disease course. Similarly, common genetic correlates of acquisition of HIV-1 after exposure have been interrogated using GWAS, although generally in small samples. Under the auspices of the International Collaboration for the Genomics of HIV, we have combined the genome-wide single nucleotide polymorphism (SNP) data collected by 25 cohorts, studies, or institutions on HIV-1-infected individuals and compared them to carefully matched population-level data sets (a list of all collaborators appears in Note S1 in Text S1). After imputation using the 1000 Genomes Project reference panel, we tested approximately 8 million common DNA variants (SNPs and indels) for association with HIV-1 acquisition in 6,334 infected patients and 7,247 population samples of European ancestry. Initial association testing identified the SNP rs4418214, the C allele of which is known to tag the HLA-B*57:01 and B*27:05 alleles, as genome-wide significant (p = 3.6 × 10⁻¹¹). However, restricting the analysis to individuals with a known date of seroconversion suggested that this association was due to frailty bias in studies of lethal diseases. Further analyses, including testing recessive genetic models, testing for bulk effects of non-genome-wide-significant variants, stratifying by sexual or parenteral transmission risk, and testing previously reported associations, showed no evidence for a genetic influence on HIV-1 acquisition (with the exception of CCR5Δ32 homozygosity). Thus, these data suggest that genetic influences on HIV acquisition are either rare or have smaller effects than can be detected by this sample size.
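The per-variant test at the core of such a GWAS can be illustrated schematically: for each variant, allele counts in cases (infected) and controls (population) form a 2x2 table tested with a chi-square statistic. The counts below are invented and the study's actual analysis (imputation, covariates, logistic regression) is far richer; this only shows the basic contrast:

```python
# Schematic 2x2 allelic association test (counts invented): compare alt/ref
# allele counts between cases and controls with a 1-d.f. chi-square statistic.

def allelic_chi2(case_alt, case_ref, ctrl_alt, ctrl_ref):
    table = [[case_alt, case_ref], [ctrl_alt, ctrl_ref]]
    total = case_alt + case_ref + ctrl_alt + ctrl_ref
    chi2 = 0.0
    for r in range(2):
        for c in range(2):
            # expected count under independence of row (group) and column (allele)
            expected = sum(table[r]) * (table[0][c] + table[1][c]) / total
            chi2 += (table[r][c] - expected) ** 2 / expected
    return chi2

# Identical allele frequencies in cases and controls: no association signal...
print(allelic_chi2(100, 900, 100, 900))
# ...while a frequency difference inflates the statistic past the usual
# single-test threshold (3.84 for p < 0.05; GWAS thresholds are far stricter)
print(allelic_chi2(150, 850, 100, 900) > 3.84)
```

Repeating such a test across ~8 million variants is why genome-wide significance demands p-values near 5 × 10⁻⁸ rather than 0.05, and why the rs4418214 signal at p = 3.6 × 10⁻¹¹ merited, and survived, closer scrutiny before being attributed to frailty bias.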