6 results for monitoring systems
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (Digital Library of Intellectual Production of the University of São Paulo)
Abstract:
We present a method for simultaneous monitoring of optical signal-to-noise ratio (OSNR) and differential group delay (DGD) based on degree of polarization (DOP) measurements in optical communication systems. For the first time in the literature (to the best of our knowledge), the proposed scheme is shown to extract OSNR and DGD values independently and simultaneously from the DOP measurements. This is possible because the OSNR is related to the maximum DOP, while the DGD is related to the ratio between the maximum and minimum DOP values. We experimentally measured OSNR and DGD over the ranges 10 to 30 dB and 0 to 90 ps, respectively, for a 10 Gb/s non-return-to-zero signal. A theoretical analysis of the DOP accuracy needed to measure low DGD values and high OSNRs shows that current polarimeter technology is capable of yielding an OSNR measurement within 1 dB accuracy for OSNR values up to 34 dB, while the DGD error is limited to 1.5% for DGD values above 10 ps. Also for the first time to our knowledge, the technique is demonstrated to accurately measure first-order polarization mode dispersion (PMD) in the presence of high second-order PMD (as high as 2071 ps²). © 2012 Optical Society of America
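The OSNR extraction described above can be sketched under a simplified, commonly assumed model: a fully polarized signal plus unpolarized ASE noise in the measurement bandwidth gives DOP_max = r/(r+1), with r the linear OSNR, so r = DOP_max/(1 − DOP_max). The DGD is then read off a calibration curve of the DOP min/max ratio, which the abstract does not give; only the observable is computed here. This is an illustrative sketch, not the paper's actual calibration.

```python
import math

def osnr_db_from_dop_max(dop_max):
    """Estimate OSNR (dB) from the maximum measured DOP.

    Assumes a simplified model: fully polarized signal, unpolarized ASE
    within the measurement bandwidth, so DOP_max = r / (r + 1) with r
    the linear OSNR.  The paper's actual calibration may differ.
    """
    r = dop_max / (1.0 - dop_max)  # linear OSNR under the assumed model
    return 10.0 * math.log10(r)

def dop_ratio(dop_max, dop_min):
    """DGD observable: the DGD is extracted from DOP_min / DOP_max
    via a calibration curve (not reproduced here)."""
    return dop_min / dop_max
```

For example, a maximum DOP of 0.5 corresponds to 0 dB OSNR under this model, and 0.9 to about 9.5 dB.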
Abstract:
A power transformer requires continuous monitoring and fast protection, as it is a very expensive piece of equipment and an essential element of an electrical power system. The most common protection technique is percentage differential logic, which discriminates between internal faults and other operating conditions. Unfortunately, some operating conditions of power transformers can mislead conventional protection, negatively affecting power system stability. This study proposes a new algorithm that improves protection performance by combining fuzzy logic, artificial neural networks, and genetic algorithms. An electrical power system was modelled in the Alternative Transients Program software to obtain the operating conditions and fault situations needed to test the developed algorithm, as well as a commercial differential relay. Results show improved reliability, as well as a fast response of the proposed technique compared with conventional ones.
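The conventional percentage differential logic the study improves upon can be sketched as follows: the relay trips when the operating current (the phasor sum of the currents entering the protected zone) exceeds a restraint proportional to the through current plus a minimum pickup. The slope and pickup values below are illustrative assumptions, not the paper's settings.

```python
def percentage_differential_trip(i_primary, i_secondary,
                                 slope=0.3, pickup=0.2):
    """Conventional percentage differential logic (illustrative sketch).

    i_primary, i_secondary: complex current phasors (per unit), both
    referenced into the protected zone, so they nearly cancel under
    normal load and external (through) faults.
    slope: restraint percentage; pickup: minimum operating current.
    Trips when I_op > slope * I_res + pickup.
    """
    i_op = abs(i_primary + i_secondary)                 # operating current
    i_res = (abs(i_primary) + abs(i_secondary)) / 2.0   # restraining current
    return i_op > slope * i_res + pickup
```

An internal fault fed from both sides (e.g. 1.0 pu and 0.8 pu in phase into the zone) trips, while a through current (1.0 pu in, 1.0 pu out) does not; the misleading conditions the abstract mentions, such as inrush, are exactly the cases where this simple characteristic fails.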
Abstract:
The Pierre Auger Observatory is a facility built to detect air showers produced by cosmic rays above 10¹⁷ eV. During clear nights with a low illuminated moon fraction, the UV fluorescence light produced by air showers is recorded by optical telescopes at the Observatory. To correct the observations for variations in atmospheric conditions, atmospheric monitoring is performed at regular intervals, ranging from several minutes (for cloud identification) to several hours (for aerosol conditions) to several days (for vertical profiles of temperature, pressure, and humidity). In 2009, the monitoring program was upgraded to allow for additional targeted measurements of atmospheric conditions shortly after the detection of air showers of special interest, e.g., showers produced by very high-energy cosmic rays or showers with atypical longitudinal profiles. The former events are of particular importance for determining the energy scale of the Observatory, and the latter are characteristic of unusual air shower physics or exotic primary particle types. The purpose of targeted (or "rapid") monitoring is to improve the resolution of the atmospheric measurements for such events. In this paper, we report on the implementation of the rapid monitoring program and its current status. The rapid monitoring data have been analyzed and applied to the reconstruction of air showers of high interest, and they indicate that air fluorescence measurements affected by clouds and aerosols are effectively corrected using measurements from the regular atmospheric monitoring program. We find that the rapid monitoring program has potential for supporting dedicated physics analyses beyond the standard event reconstruction.
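The two-tier scheme described above, regular monitoring on fixed cadences plus event-triggered rapid monitoring, can be summarized in a short sketch. The interval values are order-of-magnitude placeholders taken from the text ("several minutes/hours/days"), and the 10 EeV energy cut is an illustrative assumption, not the Observatory's actual trigger condition.

```python
# Regular monitoring cadences, in seconds (orders of magnitude per the text)
REGULAR_INTERVALS_S = {
    "cloud_identification": 5 * 60,    # several minutes
    "aerosol_conditions": 3 * 3600,    # several hours
    "vertical_profiles": 3 * 86400,    # several days
}

def triggers_rapid_monitoring(energy_eev, atypical_profile,
                              energy_cut_eev=10.0):
    """Targeted ('rapid') monitoring fires shortly after showers of
    special interest: very high energy or an atypical longitudinal
    profile.  The energy cut here is an illustrative placeholder."""
    return energy_eev >= energy_cut_eev or atypical_profile
```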
Abstract:
A complete census of planetary systems around a volume-limited sample of solar-type stars (FGK dwarfs) in the Solar neighborhood (d ≤ 15 pc), with uniform sensitivity down to Earth-mass planets within their Habitable Zones out to several AU, would be a major milestone in extrasolar planet astrophysics. This fundamental goal can be achieved with a mission concept such as NEAT, the Nearby Earth Astrometric Telescope. NEAT is designed to carry out space-borne, extremely high-precision astrometric measurements at the 0.05 μas (1σ) accuracy level, sufficient to detect dynamical effects due to orbiting planets of even lower mass than Earth's around the nearest stars. Such a survey mission would provide the actual planetary masses and the full orbital geometry for all components of the detected planetary systems down to the Earth-mass limit. The NEAT performance limits can be achieved by carrying out differential astrometry between the targets and a set of suitable reference stars in the field. The NEAT instrument design consists of an off-axis-parabola single-mirror telescope (D = 1 m); a large-field-of-view detector located 40 m away from the telescope, made of eight small movable CCDs surrounding a fixed central CCD; and an interferometric calibration system monitoring dynamical Young's fringes originating from metrology fibers located at the primary mirror. The mission profile is driven by the fact that the two main modules of the payload, the telescope and the focal plane, must be located 40 m apart, leading to the choice of a formation-flying option as the reference mission and of a deployable-boom option as an alternative. The proposed mission architecture relies on two satellites, of about 700 kg each, operating at L2 for 5 years, flying in formation and offering a capability of more than 20,000 reconfigurations.
The two satellites will be launched in a stacked configuration on a Soyuz ST launch vehicle. The NEAT primary science program will encompass an astrometric survey of our 200 closest F-, G-, and K-type stellar neighbors, with an average of 50 visits each distributed over the nominal mission duration. The main survey operations will use approximately 70% of the mission lifetime. The remaining 30% of NEAT observing time might be allocated, for example, to improving the characterization of the architecture of selected planetary systems around nearby targets of specific interest (low-mass stars, young stars, etc.) discovered by Gaia, ground-based high-precision radial-velocity surveys, and other programs. With its exquisite astrometric precision, NEAT holds the promise of providing the first thorough census of Earth-mass planets around stars in the immediate vicinity of our Sun.
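The sizing of the 0.05 μas requirement follows from the standard astrometric-signature formula: the host star's wobble is α = (m_p/M_*) × (a/d), which in microarcseconds, with the planet mass in Earth masses, the stellar mass in solar masses, the semi-major axis in AU, and the distance in parsecs, is a short calculation:

```python
EARTH_TO_SUN_MASS = 3.003e-6  # M_Earth / M_Sun

def astrometric_signature_uas(m_planet_earth, a_au, m_star_sun, d_pc):
    """Peak astrometric wobble of the host star, in microarcseconds.

    alpha = (m_p / M_*) * (a / d); with a in AU and d in pc the ratio
    a/d is in arcseconds, hence the 1e6 factor for microarcseconds.
    """
    return 1e6 * EARTH_TO_SUN_MASS * m_planet_earth / m_star_sun * a_au / d_pc
```

An Earth analog around a solar twin at 10 pc produces a signature of about 0.3 μas, six times NEAT's quoted 0.05 μas (1σ) single-measurement accuracy, which is what makes the Earth-mass census of the nearest stars feasible.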
Abstract:
Abstract Background A typical purification system that provides purified water meeting ionic and organic chemical standards must be protected from microbial proliferation, to minimize cross-contamination in cleaning and preparations in pharmaceutical industries and health environments. Methodology Water samples taken directly from the public distribution water tank and at twelve different stages of a typical purification system were analyzed to identify the isolated bacteria. Two miniature kits were used: (i) the api 20 NE identification system (bioMérieux) for non-enteric, non-fermenting gram-negative rods; and (ii) the BBL Crystal identification system (Becton Dickinson) for enteric and non-fermenting gram-negative rods. The efficacy of the chemical sanitizers used in the stages of the system against the bacteria isolated and identified from the sampled water was evaluated by the minimum inhibitory concentration (MIC) method. Results The 78 isolated colonies were identified as belonging to the genera Pseudomonas, Flavobacterium, and Acinetobacter. According to the miniature kits used in the identification, the prevalences were P. aeruginosa 32.05%, P. pickettii (Ralstonia pickettii) 23.08%, P. vesicularis 12.82%, P. diminuta 11.54%, F. aureum 6.42%, P. fluorescens 5.13%, A. lwoffi 2.56%, P. putida 2.56%, P. alcaligenes 1.28%, P. paucimobilis 1.28%, and F. multivorum 1.28%. Conclusions We found that careful identification of gram-negative non-fermenting bacteria isolated from drinking water and water purification systems is required, since the genus Pseudomonas comprises opportunistic pathogens that disperse and adhere easily to surfaces, forming biofilms that interfere with cleaning and disinfection procedures in hospital and industrial environments.
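The MIC readout used to evaluate the sanitizers can be sketched as follows: in a dilution series, the MIC is the lowest sanitizer concentration at which no visible growth occurs, with growth also absent at every higher concentration. The data representation below is an illustrative assumption; the abstract does not specify the dilution scheme.

```python
def minimum_inhibitory_concentration(results):
    """Determine the MIC from a dilution series (illustrative sketch).

    results: dict mapping sanitizer concentration (e.g. mg/L) to a
    bool, True if visible bacterial growth was observed at that
    concentration.  Returns the lowest concentration with no growth
    such that all higher concentrations also show no growth, or None
    if growth persists at every tested concentration.
    """
    mic = None
    for conc in sorted(results, reverse=True):  # scan highest first
        if results[conc]:    # growth observed: stop, lower doses fail too
            break
        mic = conc           # no growth: current candidate MIC
    return mic
```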
Abstract:
Abstract Background Recent advances in medical and biological technology have stimulated the development of new testing systems that provide huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of a genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. Results This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to a human genome laboratory. We introduce the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients with updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control-flow specifications based on process algebra (ACP). The main difference between our approach and related work is that we join two important aspects: (1) process scalability, achieved through the relational database implementation, and (2) process correctness, ensured using process algebra. Furthermore, the software allows end users to define genetic tests without requiring any knowledge of business process notation or process algebra. Conclusions This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have demonstrated the feasibility and shown the usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing through easy end-user interfaces.
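The ACP-style control-flow specifications mentioned above compose laboratory steps with sequential ("·"), alternative ("+"), and merge ("||") operators. A minimal trace-semantics sketch, with processes represented as sets of action-name tuples (a representation assumed here for illustration, not the CEGH implementation), looks like this:

```python
def seq(p, q):
    """Sequential composition (ACP '·'): every trace of p followed by
    every trace of q."""
    return {a + b for a in p for b in q}

def alt(p, q):
    """Alternative composition (ACP '+'): run either p or q."""
    return p | q

def merge(p, q):
    """Free merge (ACP '||'): all order-preserving interleavings of a
    trace from p with a trace from q (no communication modeled)."""
    out = set()
    for a in p:
        for b in q:
            out |= _interleave(a, b)
    return out

def _interleave(a, b):
    """All interleavings of two traces, preserving each trace's
    internal order."""
    if not a:
        return {b}
    if not b:
        return {a}
    return ({a[:1] + t for t in _interleave(a[1:], b)} |
            {b[:1] + t for t in _interleave(a, b[1:])})
```

For example, `seq({("extract_dna",)}, {("amplify",)})` yields the single trace `("extract_dna", "amplify")`, while merging two independent single-step tests yields both orderings, which is how the high testing parallelism of the laboratory routine is captured.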