950 results for "Monitoring vibration systems"
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Graduate Program in Mechanical Engineering - FEB
Abstract:
This Special Issue presents a selection of papers initially presented at the 11th International Conference on Vibration Problems (ICOVP-2013), held from 9 to 12 September 2013 in Lisbon, Portugal. The main topics of this Special Issue are linear and, mainly, nonlinear dynamics, chaos and control of systems and structures, and their applications in different fields of science and engineering. In keeping with the goal of the Special Issue, the selected contributions are divided into three major parts: “Vibration Problems in Vertical Transportation Systems”, “Nonlinear Dynamics, Chaos and Control of Elastic Structures” and “New Strategies and Challenges for Aerospace and Ocean Structures Dynamics and Control”.
Abstract:
Graduate Program in Mechanical Engineering - FEIS
Abstract:
Linear parameter-varying (LPV) control is a model-based control technique that takes into account time-varying parameters of the plant. In the case of rotating systems supported by lubricated bearings, the dynamic characteristics of the bearings change in time as a function of the rotating speed. Hence, LPV control can tackle run-up and run-down operating conditions, in which the dynamic characteristics of the rotating system change significantly in time due to the bearings and high vibration levels occur. In this work, the LPV control design for a flexible shaft supported by plain journal bearings is presented. The model used in the LPV control design is updated from unbalance response experimental results, and dynamic coefficients for the entire range of rotating speeds are obtained by numerical optimization. Experimental implementation of the designed LPV control resulted in a strong reduction of vibration amplitudes when crossing the critical speed, without affecting system behavior at sub- or supercritical speeds.
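As a rough illustration of the LPV idea described in this abstract, the sketch below interpolates controller gains over the scheduling parameter (rotating speed), so the control law adapts during run-up and run-down. The speed grid, gains, and PD-like control law are hypothetical stand-ins, not the paper's identified rotor-bearing model.

```python
import numpy as np

# Controller gains designed offline at a few frozen rotating speeds [rad/s]
speed_grid = np.array([100.0, 300.0, 500.0])      # scheduling parameter grid
gain_grid = np.array([[2.0, 0.5],                 # hypothetical [Kp, Kd]
                      [4.5, 0.8],
                      [3.0, 0.6]])

def lpv_gains(speed):
    """Linearly interpolate the controller gains at the current speed."""
    kp = np.interp(speed, speed_grid, gain_grid[:, 0])
    kd = np.interp(speed, speed_grid, gain_grid[:, 1])
    return kp, kd

def control_force(speed, displacement, velocity):
    """Speed-scheduled PD-like control force acting on the shaft."""
    kp, kd = lpv_gains(speed)
    return -kp * displacement - kd * velocity

# Evaluate the scheduled force at a few speeds while crossing a resonance
for w in (150.0, 250.0, 350.0):
    print(w, control_force(w, displacement=1e-4, velocity=2e-3))
```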
Abstract:
A power transformer needs continuous monitoring and fast protection, as it is a very expensive piece of equipment and an essential element of an electrical power system. The most common protection technique is percentage differential logic, which discriminates between internal faults and other operating conditions. Unfortunately, some operating conditions of power transformers can mislead conventional protection, negatively affecting power system stability. This study proposes a new algorithm to improve protection performance by using fuzzy logic, artificial neural networks and genetic algorithms. An electrical power system was modelled using the Alternative Transients Program software to obtain the operating conditions and fault situations needed to test the developed algorithm, as well as a commercial differential relay. Results show improved reliability, as well as a fast response of the proposed technique when compared with conventional ones.
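For context, here is a minimal sketch of the conventional percentage differential criterion that the proposed algorithm improves on: trip when the differential current exceeds a fixed percentage (slope) of the restraint current. The slope and pickup settings are illustrative assumptions, not values from the study.

```python
def percentage_differential_trip(i_primary, i_secondary,
                                 slope=0.3, min_pickup=0.2):
    """Return True if the relay should trip (internal fault suspected)."""
    i_diff = abs(i_primary - i_secondary)               # operating quantity
    i_restraint = (abs(i_primary) + abs(i_secondary)) / 2.0
    return i_diff > max(slope * i_restraint, min_pickup)

# Normal through-load: currents nearly balance, so the relay restrains
print(percentage_differential_trip(1.00, 0.98))   # False
# Internal fault: large mismatch between windings, so the relay trips
print(percentage_differential_trip(5.00, 0.40))   # True
```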
Abstract:
The Pierre Auger Observatory is a facility built to detect air showers produced by cosmic rays above 10^17 eV. During clear nights with a low illuminated moon fraction, the UV fluorescence light produced by air showers is recorded by optical telescopes at the Observatory. To correct the observations for variations in atmospheric conditions, atmospheric monitoring is performed at regular intervals ranging from several minutes (for cloud identification) to several hours (for aerosol conditions) to several days (for vertical profiles of temperature, pressure, and humidity). In 2009, the monitoring program was upgraded to allow for additional targeted measurements of atmospheric conditions shortly after the detection of air showers of special interest, e.g., showers produced by very high-energy cosmic rays or showers with atypical longitudinal profiles. The former events are of particular importance for the determination of the energy scale of the Observatory, and the latter are characteristic of unusual air shower physics or exotic primary particle types. The purpose of targeted (or "rapid") monitoring is to improve the resolution of the atmospheric measurements for such events. In this paper, we report on the implementation of the rapid monitoring program and its current status. The rapid monitoring data have been analyzed and applied to the reconstruction of air showers of high interest, and indicate that the air fluorescence measurements affected by clouds and aerosols are effectively corrected using measurements from the regular atmospheric monitoring program. We find that the rapid monitoring program has potential for supporting dedicated physics analyses beyond the standard event reconstruction.
Abstract:
A complete census of planetary systems around a volume-limited sample of solar-type stars (FGK dwarfs) in the Solar neighborhood (d ≤ 15 pc), with uniform sensitivity down to Earth-mass planets within their Habitable Zones out to several AUs, would be a major milestone in extrasolar planet astrophysics. This fundamental goal can be achieved with a mission concept such as NEAT, the Nearby Earth Astrometric Telescope. NEAT is designed to carry out space-borne extremely-high-precision astrometric measurements at the 0.05 µas (1σ) accuracy level, sufficient to detect dynamical effects due to orbiting planets of mass even lower than Earth's around the nearest stars. Such a survey mission would provide the actual planetary masses and the full orbital geometry for all the components of the detected planetary systems down to the Earth-mass limit. The NEAT performance limits can be achieved by carrying out differential astrometry between the targets and a set of suitable reference stars in the field. The NEAT instrument design consists of an off-axis parabola single-mirror telescope (D = 1 m), a detector with a large field of view located 40 m away from the telescope and made of 8 small movable CCDs located around a fixed central CCD, and an interferometric calibration system monitoring dynamical Young's fringes originating from metrology fibers located at the primary mirror. The mission profile is driven by the fact that the two main modules of the payload, the telescope and the focal plane, must be located 40 m apart, leading to the choice of a formation-flying option as the reference mission, with a deployable boom option as an alternative. The proposed mission architecture relies on the use of two satellites, of about 700 kg each, operating at L2 for 5 years, flying in formation and offering a capability of more than 20,000 reconfigurations. The two satellites will be launched in a stacked configuration using a Soyuz ST launch vehicle. The NEAT primary science program will encompass an astrometric survey of our 200 closest F-, G- and K-type stellar neighbors, with an average of 50 visits each distributed over the nominal mission duration. The main survey operation will use approximately 70% of the mission lifetime. The remaining 30% of NEAT observing time might be allocated, for example, to improve the characterization of the architecture of selected planetary systems around nearby targets of specific interest (low-mass stars, young stars, etc.) discovered by Gaia, ground-based high-precision radial-velocity surveys, and other programs. With its exquisite astrometric precision, NEAT holds the promise of providing the first thorough census of Earth-mass planets around stars in the immediate vicinity of our Sun.
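As a quick worked check (not taken from the paper) of why a 0.05 µas accuracy floor reaches Earth-mass planets nearby, the standard astrometric signature α = (m_p / M_*) · (a / d) can be evaluated for an Earth analogue:

```python
# Astrometric signature: the wobble of the host star on the sky, in
# micro-arcseconds, is alpha_uas = 1e6 * (m_p / M_star) * a_AU / d_pc.
# This is the standard textbook formula, not code from the paper.
M_EARTH_IN_SUNS = 3.0e-6   # Earth's mass in solar masses (approx.)

def astrometric_signature_uas(mass_ratio, a_au, d_pc):
    """Stellar wobble amplitude in micro-arcseconds."""
    return 1e6 * mass_ratio * a_au / d_pc

# Earth analogue around a Sun-like star at 10 pc
alpha = astrometric_signature_uas(M_EARTH_IN_SUNS, a_au=1.0, d_pc=10.0)
print(f"{alpha:.2f} uas")   # ~0.30 uas, i.e. ~6x NEAT's 0.05 uas accuracy
```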
Abstract:
Background: A typical purification system providing purified water that meets ionic and organic chemical standards must be protected from microbial proliferation to minimize cross-contamination, for use in cleaning and preparations in pharmaceutical industries and in health environments. Methodology: Samples of water taken directly from the public distribution water tank and at twelve different stages of a typical purification system were analyzed for the identification of isolated bacteria. Two miniature kits were used: (i) an identification system (api 20 NE, Bio-Mérieux) for non-enteric and non-fermenting gram-negative rods; and (ii) an identification system (BBL Crystal, Becton Dickinson) for enteric and non-fermenting gram-negative rods. The efficiency of the chemical sanitizers used in the stages of the system against the bacteria isolated and identified in the sampled water was evaluated by the minimum inhibitory concentration (MIC) method. Results: The 78 isolated colonies were identified as belonging to the genera Pseudomonas, Flavobacterium and Acinetobacter. According to the miniature kits used in the identification, the prevalence of isolates was P. aeruginosa 32.05%, P. picketti (Ralstonia picketti) 23.08%, P. vesiculares 12.82%, P. diminuta 11.54%, F. aureum 6.42%, P. fluorescens 5.13%, A. lwoffi 2.56%, P. putida 2.56%, P. alcaligenes 1.28%, P. paucimobilis 1.28%, and F. multivorum 1.28%. Conclusions: We found that research was required for the identification of gram-negative non-fermenting bacteria isolated from drinking water and water purification systems, since the genus Pseudomonas comprises opportunistic pathogens that disperse and adhere easily to surfaces, forming biofilms that interfere with cleaning and disinfection procedures in hospital and industrial environments.
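As an aside, the MIC readout logic referred to above can be sketched as follows, assuming the usual definition of MIC as the lowest concentration in a dilution series with no visible growth; the concentrations and growth flags are invented for illustration.

```python
# MIC readout: the lowest concentration in the dilution series at which no
# visible growth occurs (assumes the usual monotone growth/no-growth pattern).
def minimum_inhibitory_concentration(results):
    """results: iterable of (concentration, growth_observed) pairs."""
    no_growth = sorted(conc for conc, grew in results if not grew)
    return no_growth[0] if no_growth else None

# Hypothetical twofold dilution series of a sanitizer vs. one isolate (mg/L)
series = [(0.5, True), (1.0, True), (2.0, True), (4.0, False), (8.0, False)]
print(minimum_inhibitory_concentration(series))   # -> 4.0
```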
Abstract:
Background: Recent advances in medical and biological technology have stimulated the development of new testing systems that provide huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of a genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. Results: This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to a human genome laboratory. We introduce the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients with updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control-flow specifications based on process algebra (ACP). The main difference between our approach and related work is that we join two important aspects: 1) process scalability, achieved through the relational database implementation, and 2) correctness of processes, ensured using process algebra. Furthermore, the software allows end users to define genetic tests without requiring any knowledge of business process notation or process algebra. Conclusions: This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have proven the feasibility and shown the usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing through easy-to-use end-user interfaces.
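A toy sketch of the process-algebra flavor of workflow specification mentioned above: atomic steps composed by sequential and parallel operators, then executed by walking the expression tree. This is not the CEGH implementation, and the step names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Step:
    """An atomic laboratory task."""
    name: str
    def run(self):
        print(f"run {self.name}")

@dataclass
class Seq:
    """Sequential composition (x . y in ACP): left completes before right."""
    left: object
    right: object
    def run(self):
        self.left.run()
        self.right.run()

@dataclass
class Par:
    """Parallel composition (x || y): order irrelevant; run arbitrarily here."""
    left: object
    right: object
    def run(self):
        self.left.run()
        self.right.run()

# "extract DNA, then (PCR and paperwork in any order), then report results"
workflow = Seq(Step("extract_dna"),
               Seq(Par(Step("pcr"), Step("paperwork")),
                   Step("report_results")))
workflow.run()
```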
Abstract:
Although Recovery is often described as the least studied and documented phase of the Emergency Management Cycle, a wide literature is available describing the characteristics and sub-phases of this process. Previous works do not provide an overall perspective because of a lack of systematic, consistent monitoring of recovery using advanced technologies such as remote sensing and GIS. Given the key role of Remote Sensing in Response and Damage Assessment, this thesis aims to verify the appropriateness of such advanced monitoring techniques for detecting recovery progress over time, with close attention to the main characteristics of the study event: the Hurricane Katrina storm surge. Based on multi-source, multi-sensor and multi-temporal data, post-Katrina recovery was analysed using both a qualitative and a quantitative approach. The first phase was dedicated to investigating the relation between urban types, damage and recovery state, with reference to geographical and technological parameters. Damage and recovery scales were proposed to systematize critical observations of notable surge-induced effects on various typologies of structures, analyzed at a per-building level. This wide-ranging investigation allowed a new understanding of the distinctive features of the recovery process. A quantitative analysis was employed to develop methodological procedures suited to recognizing and monitoring the distribution, timing and characteristics of recovery activities in the study area. Promising results, gained by applying supervised classification algorithms to detect the localization and distribution of blue tarps, proved that this methodology can help the analyst detect and monitor recovery activities in areas affected by medium damage. The study found that Mahalanobis Distance was the classifier that provided the most accurate results in localizing blue roofs, with 93.7% of blue roofs classified correctly and a producer accuracy of 70%; it was also the classifier least sensitive to spectral signature alteration. The application of dissimilarity-based textural classification to satellite imagery demonstrated the suitability of this technique for detecting debris distribution and for monitoring demolition and reconstruction activities in the study area. Linking these geographically extensive techniques with expert per-building interpretation of advanced-technology ground surveys provides a multi-faceted view of the physical recovery process. Remote sensing and GIS technologies combined with an advanced ground survey approach provide an extremely valuable capability for monitoring Recovery activities and may constitute a technical basis to guide aid organizations and local governments in Recovery management.
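As an illustration of the Mahalanobis Distance classifier that performed best in this study, the sketch below assigns a pixel to the class whose training mean is closest in covariance-weighted distance; the training samples are synthetic stand-ins for multispectral pixel values, not the thesis's imagery.

```python
import numpy as np

def fit_class(samples):
    """Return (mean, inverse covariance) for one class's training pixels."""
    mu = samples.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(samples, rowvar=False))
    return mu, cov_inv

def mahalanobis_sq(x, mu, cov_inv):
    """Squared Mahalanobis distance from pixel x to a class distribution."""
    d = x - mu
    return float(d @ cov_inv @ d)

rng = np.random.default_rng(0)
blue_tarp = rng.normal([30, 60, 160], 10, size=(200, 3))   # bluish pixels
other = rng.normal([90, 90, 90], 25, size=(200, 3))        # everything else
classes = {"blue_tarp": fit_class(blue_tarp), "other": fit_class(other)}

pixel = np.array([35.0, 65.0, 150.0])
label = min(classes, key=lambda c: mahalanobis_sq(pixel, *classes[c]))
print(label)   # -> blue_tarp
```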
Abstract:
Nowadays there is increasing interest in wireless sensor networks (WSNs) for environmental monitoring systems, as they can be used to improve quality of life at a time when living conditions are becoming a major concern. This paper describes the design and development of a real-time monitoring system based on a ZigBee WSN, characterized by low energy consumption, low cost, reduced dimensions and fast adaptation to the network tree topology. The developed system encompasses optimized sensing of environmental parameters, low-rate transmission from sensor nodes to the gateway, packet parsing and data storage in a remote database, and real-time visualization through a web server.
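A minimal sketch of the gateway-side pipeline described above (parse a sensor packet, store the reading, query it back). The fixed packet layout is a made-up assumption, and SQLite stands in for the remote database; neither reflects the actual ZigBee frames used in the paper.

```python
import sqlite3
import struct
import time

# Assumed layout: node_id (uint16), temp x100 (int16), humidity x10 (uint16)
PACKET_FMT = "<HhH"

def parse_packet(payload: bytes):
    """Decode a fixed-format sensor packet into engineering units."""
    node_id, temp_x100, hum_x10 = struct.unpack(PACKET_FMT, payload)
    return node_id, temp_x100 / 100.0, hum_x10 / 10.0

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE readings (ts REAL, node INTEGER, temp REAL, hum REAL)")

raw = struct.pack(PACKET_FMT, 7, 2315, 481)   # node 7: 23.15 C, 48.1 %RH
node, temp, hum = parse_packet(raw)
db.execute("INSERT INTO readings VALUES (?, ?, ?, ?)",
           (time.time(), node, temp, hum))
print(db.execute("SELECT node, temp, hum FROM readings").fetchall())
```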
Abstract:
Machines with moving parts give rise to vibrations and, consequently, noise. The set-up and status of each machine yield a distinctive vibration signature. Therefore, a change in the vibration signature, due to a change in the machine's state, can be used to detect incipient defects before they become critical. This is the goal of condition monitoring, in which the information obtained from a machine's signature is used to detect faults at an early stage. A large number of signal processing techniques can be used to extract interesting information from a measured vibration signal. This study seeks to detect rotating machine defects using a range of techniques including synchronous time averaging, Hilbert transform-based demodulation, continuous wavelet transform, Wigner-Ville distribution and the spectral correlation density function. The detection and diagnostic capabilities of these techniques are discussed and compared on the basis of experimental results concerning gear tooth faults, i.e. a fatigue crack at the tooth root and tooth spalls of different sizes, as well as assembly faults in a diesel engine. Moreover, sensitivity to fault severity is assessed by applying these signal processing techniques to gear tooth faults of different sizes.
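As an example of one technique named above, Hilbert transform-based demodulation can be sketched on a synthetic, amplitude-modulated signal standing in for a measured gear vibration: the envelope exposes fault-related modulation (e.g., once-per-revolution impacts from a cracked tooth). The carrier, modulation frequency, and noise level are illustrative assumptions.

```python
import numpy as np
from scipy.signal import hilbert

fs = 10_000                                      # sampling rate [Hz]
t = np.arange(0, 1.0, 1 / fs)
carrier = np.sin(2 * np.pi * 1000 * t)           # gear-mesh "carrier"
modulation = 1.0 + 0.5 * np.sin(2 * np.pi * 20 * t)  # 20 Hz fault modulation
x = modulation * carrier + 0.05 * np.random.randn(t.size)

# Magnitude of the analytic signal = amplitude envelope
envelope = np.abs(hilbert(x))

# The envelope spectrum peaks at the fault modulation frequency (~20 Hz)
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(envelope.size, 1 / fs)
print(f"dominant envelope frequency: {freqs[spectrum.argmax()]:.1f} Hz")
```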