892 results for Acoustic Emission, Source Separation, Condition Monitoring, Diesel Engines, Injector Faults
Abstract:
Natural aerosol plays a significant role in the Earth’s system due to its ability to alter the radiative balance of the Earth. Here we use a global aerosol microphysics model together with a radiative transfer model to estimate radiative effects for five natural aerosol sources in the present-day atmosphere: dimethyl sulfide (DMS), sea-salt, volcanoes, monoterpenes, and wildfires. We calculate large annual global mean aerosol direct and cloud albedo effects, especially for DMS-derived sulfate (−0.23 W m⁻² and −0.76 W m⁻², respectively), volcanic sulfate (−0.21 W m⁻² and −0.61 W m⁻²) and sea-salt (−0.44 W m⁻² and −0.04 W m⁻²). The cloud albedo effect responds nonlinearly to changes in emission source strengths. The natural sources have both markedly different radiative efficiencies and indirect/direct radiative effect ratios. Aerosol sources that contribute a large number of small particles (DMS-derived and volcanic sulfate) are highly effective at influencing cloud albedo per unit of aerosol mass burden.
Abstract:
Contamination of the electroencephalogram (EEG) by artifacts greatly reduces the quality of the recorded signals. There is a need for automated artifact removal methods. However, such methods are rarely evaluated against one another via rigorous criteria, with results often presented based upon visual inspection alone. This work presents a comparative study of automatic methods for removing blink, electrocardiographic, and electromyographic artifacts from the EEG. Three methods are considered: wavelet-, blind source separation (BSS)-, and multivariate singular spectrum analysis (MSSA)-based correction. These are applied to data sets containing mixtures of artifacts. Metrics are devised to measure the performance of each method. The BSS method is seen to be the best approach for artifacts of high signal-to-noise ratio (SNR). By contrast, MSSA performs well at low SNRs, but at the expense of a large number of false-positive corrections.
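The abstract does not describe implementations; purely as a hedged illustration of the BSS idea it compares, the sketch below applies FastICA from scikit-learn to synthetic EEG-like channels and zeroes the component with the highest kurtosis (a common blink signature). All signals, channel counts and thresholds here are invented.

```python
# Hypothetical sketch: BSS-based blink removal from simulated EEG (not the authors' code).
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
fs, t = 250, np.arange(0, 10, 1 / 250)           # 10 s at 250 Hz

# Two "neural" sources plus one blink-like artifact source.
neural1 = np.sin(2 * np.pi * 10 * t)             # alpha-band oscillation
neural2 = np.sin(2 * np.pi * 4 * t + 0.5)        # theta-band oscillation
blink = np.zeros_like(t)
blink[::fs] = 50.0                               # one large transient per second
blink = np.convolve(blink, np.hanning(50), mode="same")

sources = np.c_[neural1, neural2, blink]
mixing = rng.normal(size=(4, 3))                 # 4 scalp channels, 3 sources
eeg = sources @ mixing.T + 0.05 * rng.normal(size=(len(t), 4))

# Blind source separation: estimate independent components.
ica = FastICA(n_components=3, random_state=0)
components = ica.fit_transform(eeg)              # shape (samples, components)

# Blink components are sparse and spiky, so flag the one with highest kurtosis.
kurt = ((components - components.mean(0)) ** 4).mean(0) / components.var(0) ** 2 - 3
artifact_idx = int(np.argmax(kurt))

cleaned = components.copy()
cleaned[:, artifact_idx] = 0.0                   # zero the artifact component
eeg_clean = ica.inverse_transform(cleaned)       # back-project to channel space

print(f"flagged component {artifact_idx} as blink (kurtosis {kurt[artifact_idx]:.1f})")
```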
Abstract:
The stratigraphic subdivision and correlation of dune deposits is difficult, especially when age determinations are not available. A better understanding of the controls on the texture and composition of eolian sands is necessary to interpret ancient eolian sediments. The Imbituba-Jaguaruna coastal zone (Southern Brazil, 28°-29°S) stands out due to its four well-preserved Late Pleistocene (eolian generation 1) to Holocene eolian units (eolian generations 2, 3, and 4). In this study, we evaluate the grain-size and heavy-mineral characteristics of the Imbituba-Jaguaruna eolian units through statistical analysis of hundreds of sediment samples. Grain-size parameters and heavy-mineral content allow us to distinguish the Pleistocene from the Holocene units. The grain size displays a pattern of fining and better sorting from generation 1 (older) to 4 (younger), whereas the content of mechanically stable (dense and hard) heavy minerals decreases from eolian generation 1 to 4. The variation in grain size and heavy-mineral content records shifts in the origin and balance (input versus output) of eolian sediment supply, attributable mainly to relative sea-level changes. Dunefields subjected to relative sea-level lowstand conditions (eolian generation 1) are characterized by lower accumulation rates and intense post-depositional dissection by fluvial incision. Low accumulation rates favor deflation in the eolian system, which promotes concentration of denser and stable heavy minerals (increase of the ZTR index) as well as coarsening of eolian sands. Dissection involves the selective removal of finer sediments and less dense heavy minerals to the coastal source area. Under a high rate of relative sea-level rise and transgression (eolian generation 2), coastal erosion prevents deflation through high input of sediments to the coastal eolian source. This condition favors dunefield growth. Coastal erosion feeds sand from local sources to the eolian system, including sands from previous dunefields (eolian generation 1) and from drowned incised valleys. Therefore, dunefields corresponding to transgressive phases inherit the grain-size and heavy-mineral characteristics of previous dunefields, leading to selective enrichment of finer sands and lighter minerals. Eolian generations 3 and 4 developed during a regressive-progradational phase (Holocene relative sea-level highstand). The high rate of sediment supply during the highstand phase prevents deflation. The lack of coastal erosion favors sediment supply from distal sources (fluvial sediments rich in unstable heavy minerals). Thus, dunefields of the transgressive and highstand systems tracts may be distinguished from dunefields of the lowstand systems tract through high rates of accumulation (low deflation) in the former. The sediment source of the transgressive dunefields (high input of previously deposited coastal sands) differs from that of the highstand dunefields (high input of distal fluvial sands). Based on this case study, we propose a general framework for the relation between relative sea level, sediment supply, and the texture and mineralogy of eolian sediments deposited in siliciclastic wet coastal zones similar to the Imbituba-Jaguaruna coast.
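As a rough, hypothetical illustration of the per-sample descriptors mentioned above (not the authors' workflow), the sketch below computes Folk & Ward graphic mean and sorting in phi units and a ZTR index from invented grain-size and heavy-mineral count data.

```python
# Hypothetical sketch of common sedimentological descriptors (invented sample data).
import numpy as np

def folk_ward_stats(phi):
    """Graphic mean and sorting (Folk & Ward, 1957) from grain sizes in phi units."""
    p5, p16, p50, p84, p95 = np.percentile(phi, [5, 16, 50, 84, 95])
    mean = (p16 + p50 + p84) / 3.0
    sorting = (p84 - p16) / 4.0 + (p95 - p5) / 6.6
    return mean, sorting

def ztr_index(counts):
    """ZTR index: percent of zircon + tourmaline + rutile among counted transparent heavies."""
    ztr = counts["zircon"] + counts["tourmaline"] + counts["rutile"]
    return 100.0 * ztr / sum(counts.values())

# Made-up sample: fine, moderately sorted sand rich in stable heavy minerals.
rng = np.random.default_rng(1)
phi_sizes = rng.normal(loc=2.5, scale=0.4, size=300)          # grain sizes in phi
heavy_counts = {"zircon": 120, "tourmaline": 60, "rutile": 20,
                "kyanite": 40, "hornblende": 60}              # grains counted per slide

mean_phi, sorting_phi = folk_ward_stats(phi_sizes)
print(f"graphic mean = {mean_phi:.2f} phi, sorting = {sorting_phi:.2f} phi")
print(f"ZTR index = {ztr_index(heavy_counts):.1f} %")
```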
Abstract:
A dynamic atmosphere generator with a naphthalene emission source has been constructed and used for the development and evaluation of a bioluminescence sensor based on the bacterium Pseudomonas fluorescens HK44 immobilized in 2% agar gel (101 cells mL⁻¹) placed in sampling tubes. A steady naphthalene emission rate (around 7.3 nmol min⁻¹ at 27 °C and 7.4 mL min⁻¹ of purified air) was obtained by covering the diffusion unit containing solid naphthalene with a PTFE filter membrane. The time elapsed from gelation of the agar matrix to analyte exposure ("maturation time") was found to be relevant for the bioluminescence assays, being most favorable between 1.5 and 3 h. The maximum light emission, observed after 80 min, is dependent on the analyte concentration and the exposure time (evaluated between 5 and 20 min), but not on the flow rate of naphthalene in the sampling tube over the range of 1.8-7.4 nmol min⁻¹. A good linear response was obtained between 50 and 260 nmol L⁻¹, with a limit of detection estimated at 20 nmol L⁻¹, far below the recommended threshold limit value for naphthalene in air.
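The abstract reports a linear range and an estimated detection limit without giving the calculation; one common convention is a least-squares calibration line with the LOD taken as three times the residual standard deviation over the slope. The sketch below illustrates that convention with invented calibration points and is not the authors' procedure.

```python
# Hypothetical calibration sketch (invented data): fit a linear response and estimate
# a detection limit as 3 * s(residuals) / slope, a common analytical convention.
import numpy as np

conc = np.array([50, 100, 150, 200, 260], dtype=float)       # nmol L^-1 (invented)
signal = np.array([1.1, 2.0, 3.2, 4.1, 5.3])                 # relative light units (invented)

slope, intercept = np.polyfit(conc, signal, 1)
residuals = signal - (slope * conc + intercept)
s_res = residuals.std(ddof=2)                                 # residual standard deviation

lod = 3.0 * s_res / slope
print(f"slope = {slope:.4f} RLU per nmol L^-1, intercept = {intercept:.3f}")
print(f"estimated LOD ≈ {lod:.1f} nmol L^-1")
```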
Abstract:
This project is based on artificial intelligence (AI) and digital image processing (IP) for automatic condition monitoring of sleepers in the railway track. Rail inspection is a very important task in railway maintenance for traffic safety and in preventing dangerous situations. Monitoring railway track infrastructure is an important aspect in which periodical inspection of the rail rolling plane is required. Up to the present day, inspection of the railroad has been carried out manually by trained personnel: a human operator walks along the railway track searching for sleeper anomalies. This way of monitoring is no longer acceptable because of its slowness and subjectivity. Hence, it is desirable to automate such intuitive human skills in order to develop more robust and reliable testing methods. Images of wooden sleepers have been used as data for this project. The aim of this project is to present a vision-based technique for inspecting railway sleepers (wooden planks under the railway track) by automatic interpretation of non-destructive testing (NDT) data, using AI techniques to determine the results of inspection.
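The abstract leaves the image-analysis details to the project report; as a hedged sketch of what a simple vision-based check might look like (not the method used here), the snippet below flags a sleeper image when its Canny edge-pixel density exceeds a hypothetical threshold.

```python
# Hypothetical sketch of a vision-based sleeper check (not the project's method):
# flag an image as suspect when its edge-pixel density exceeds a chosen limit.
import cv2
import numpy as np

def sleeper_is_suspect(image_path, edge_density_limit=0.08):
    """Return (flag, density): True if the fraction of edge pixels suggests surface damage."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(image_path)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)       # suppress wood-grain noise
    edges = cv2.Canny(blurred, 50, 150)               # hypothetical Canny thresholds
    density = np.count_nonzero(edges) / edges.size
    return density > edge_density_limit, density

if __name__ == "__main__":
    suspect, density = sleeper_is_suspect("sleeper_0001.png")   # hypothetical file name
    print(f"edge density = {density:.3f}, suspect = {suspect}")
```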
Abstract:
The motivation for this thesis work is the need to improve the reliability of equipment and the quality of service to railway passengers, as well as the requirement for cost-effective and efficient condition-based maintenance management for rail transportation. This thesis work develops a fusion of various machine vision analysis methods to achieve high performance in the automation of wooden rail track inspection.

Condition monitoring in rail transport is done manually by a human operator, where people rely on inference systems and assumptions to develop conclusions. The use of condition monitoring allows maintenance to be scheduled, or other actions to be taken to avoid the consequences of failure, before the failure occurs. Manual or automated condition monitoring of materials in fields of public transportation such as railways, aerial navigation and traffic safety, where safety is of prime importance, requires non-destructive testing (NDT).

In general, wooden railway sleeper inspection is done manually by a human operator, who moves along the rail sleepers and gathers information by visual and sound analysis to examine the presence of cracks. Human inspectors working on the lines visually inspect wooden sleepers to judge their quality. In this project work, a machine vision system is developed based on the manual visual analysis procedure; it uses digital cameras and image processing software to perform similar inspections. Manual inspection requires much effort, is sometimes error prone, and can be difficult even for a human operator because of frequent changes in the inspected material. The machine vision system developed classifies the condition of the material by examining individual pixels of images, processing them, and attempting to develop conclusions with the assistance of knowledge bases and features.

A pattern recognition approach is developed based on the methodological knowledge from the manual procedure. The pattern recognition approach for this thesis work was developed around a non-destructive testing method to identify the flaws found in the manual condition monitoring of sleepers.

In this method, a test vehicle is designed to capture sleeper images in a manner similar to visual inspection by a human operator, and the raw data for the pattern recognition approach are provided by the captured images of the wooden sleepers. The data from the NDT method were further processed and appropriate features were extracted. The purpose of collecting data with the NDT method is to achieve high accuracy and reliable classification results. A key idea is to use an unsupervised classifier, based on the features extracted from the method, to discriminate the condition of wooden sleepers into either good or bad. A self-organising map is used as the classifier for the wooden sleeper classification.

In order to achieve greater integration, the data collected by the machine vision system were combined by a strategy called fusion. Data fusion was examined at two different levels, namely sensor-level fusion and feature-level fusion. As the goal was to reduce human error in classifying rail sleepers as good or bad, the results obtained by feature-level fusion, compared with the actual classification, were satisfactory.
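Neither the SOM configuration nor the fused feature set is given in this abstract; the sketch below is a minimal self-organising map written directly in NumPy, trained on invented fused feature vectors and with its units labelled afterwards from reference samples, only to illustrate the kind of unsupervised good/bad discrimination described.

```python
# Hypothetical sketch (not the thesis implementation): a tiny 1-D self-organising map
# trained on fused feature vectors, then labelled from reference samples to separate
# "good" from "bad" sleepers.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for feature-level fusion: concatenated descriptors per sleeper (invented).
good = rng.normal(loc=[0.2, 0.1, 0.3], scale=0.05, size=(40, 3))
bad = rng.normal(loc=[0.7, 0.8, 0.6], scale=0.05, size=(40, 3))
X = np.vstack([good, bad])
labels = np.array(["good"] * 40 + ["bad"] * 40)

# Train a 1-D SOM with a handful of units.
n_units = 6
W = rng.uniform(size=(n_units, X.shape[1]))
for epoch in range(200):
    lr = 0.5 * (1 - epoch / 200)                       # decaying learning rate
    radius = max(1.0, 3.0 * (1 - epoch / 200))         # decaying neighbourhood radius
    for x in rng.permutation(X):
        bmu = int(np.argmin(np.linalg.norm(W - x, axis=1)))
        dist = np.abs(np.arange(n_units) - bmu)
        h = np.exp(-(dist ** 2) / (2 * radius ** 2))   # neighbourhood function
        W += lr * h[:, None] * (x - W)

# Label each unit by the majority class of the training samples it wins.
bmus = np.array([int(np.argmin(np.linalg.norm(W - x, axis=1))) for x in X])
unit_labels = {}
for u in range(n_units):
    wins = labels[bmus == u]
    if wins.size:
        values, counts = np.unique(wins, return_counts=True)
        unit_labels[u] = values[np.argmax(counts)]

def classify(x):
    """Classify a fused feature vector by the label of its best-matching unit."""
    return unit_labels.get(int(np.argmin(np.linalg.norm(W - x, axis=1))), "unlabelled")

print(classify(np.array([0.25, 0.15, 0.30])))          # expected: 'good'
```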
Abstract:
In Borlänge, source separation has been the basis for management of household waste for over five years. This report reviews today's system and gives a model for further follow-up through waste grouping. In the basic system, waste is separated into three fractions: biodegradable, waste to energy, and waste to landfill. All waste is packed in plastic bags, put in separate containers for each fraction, and collected from the property. Separate analyses were made of waste from single-family houses and apartment buildings. The amount of waste per household and week, the number of non-sorted bags, and the purity, recovery rate and density of each fraction were calculated. The amount of packaging collected together with the household waste is given. Material collected under the Swedish law of Producers' Responsibility is not covered in this report.
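As a small, hypothetical illustration of the per-fraction figures listed above (purity, recovery rate, amount per household and week), the sketch below computes them from invented sorting-analysis masses; the numbers are not from the report.

```python
# Hypothetical sketch (invented figures, not the report's data): per-fraction purity
# and recovery rate from sorted-waste analysis masses in kg.
# "correct"   = mass in the fraction that belongs there,
# "foreign"   = mis-sorted mass found in the fraction,
# "elsewhere" = mass of this material found in the other fractions.
fractions = {
    "biodegradable":     {"correct": 180.0, "foreign": 20.0, "elsewhere": 35.0},
    "waste_to_energy":   {"correct": 240.0, "foreign": 30.0, "elsewhere": 25.0},
    "waste_to_landfill": {"correct": 60.0,  "foreign": 15.0, "elsewhere": 10.0},
}
households, weeks = 350, 1   # invented sample size

total_mass = sum(f["correct"] + f["foreign"] for f in fractions.values())
print(f"waste per household and week: {total_mass / (households * weeks):.2f} kg")

for name, f in fractions.items():
    purity = f["correct"] / (f["correct"] + f["foreign"])
    recovery = f["correct"] / (f["correct"] + f["elsewhere"])
    print(f"{name:17s} purity {purity:5.1%}  recovery {recovery:5.1%}")
```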
Abstract:
This paper investigates problems concerning vegetation along railways and proposes automatic means of detecting ground vegetation. Digital images of railway embankments have been acquired and used for this purpose. The current work proposes two main algorithms to achieve this automation. First, a vegetation detection algorithm has been investigated for detecting ground vegetation. Second, a rail detection algorithm capable of identifying the rails, and hence the valid sampling area, has been investigated. The results achieved in the current work show satisfactory (qualitative) detection rates.
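The two algorithms are only named in this abstract; a common starting point for ground-vegetation detection in RGB images, shown below purely as a hedged sketch and not as the paper's method, is thresholding an excess-green index (ExG = 2G − R − B).

```python
# Hypothetical sketch (not the paper's algorithm): flag ground vegetation in an
# embankment image by thresholding the excess-green index ExG = 2G - R - B.
import cv2
import numpy as np

def vegetation_mask(image_path, exg_threshold=0.10):
    bgr = cv2.imread(image_path)
    if bgr is None:
        raise FileNotFoundError(image_path)
    bgr = bgr.astype(np.float32) / 255.0
    b, g, r = cv2.split(bgr)
    exg = 2.0 * g - r - b                        # excess-green index per pixel
    mask = (exg > exg_threshold).astype(np.uint8)
    return mask, mask.mean()                     # mask and vegetation coverage fraction

if __name__ == "__main__":
    mask, coverage = vegetation_mask("embankment_0001.jpg")   # hypothetical file name
    print(f"vegetation coverage: {coverage:.1%}")
```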
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
The exponential growth in the applications of radio frequency (RF) is accompanied by great challenges, such as more efficient use of the spectrum, as in the design of new architectures for multi-standard receivers or software-defined radio (SDR). The key challenge in designing a software-defined radio architecture is the implementation of a wide-band receiver that is reconfigurable, low cost and low power, with a high level of integration and flexibility. As a new solution for SDR design, a direct demodulator architecture based on five-port technology, or multi-port demodulator, has been proposed. However, the use of the five-port junction as a direct-conversion receiver requires an I/Q calibration (or regeneration) procedure in order to generate the in-phase (I) and quadrature (Q) components of the transmitted baseband signal. In this work, we evaluate the performance of a blind calibration technique, based on independent component analysis, for the regeneration of the I/Q components from the five-port downconversion, without additional knowledge such as training or pilot sequences of the transmitted signal, by exploiting the statistical properties of the three output signals.
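The calibration procedure itself is not detailed here; conceptually, if the three five-port outputs are modelled as unknown linear-plus-offset combinations of the baseband I and Q components, ICA can recover the I/Q pair up to order, sign and scale. The sketch below illustrates that idea on synthetic data with scikit-learn's FastICA and is not the implementation evaluated in the work.

```python
# Hypothetical sketch (synthetic data, not the thesis implementation): blind I/Q
# regeneration from three five-port outputs modelled as unknown linear-plus-offset
# combinations of the baseband I and Q components, using FastICA.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_symbols = 5000

# QPSK-like baseband: I and Q are independent +/-1 streams (modelling assumption).
i_true = rng.choice([-1.0, 1.0], size=n_symbols)
q_true = rng.choice([-1.0, 1.0], size=n_symbols)
iq = np.c_[i_true, q_true]

# Unknown five-port behaviour: three outputs, each a different I/Q combination plus DC.
mixing = rng.normal(size=(3, 2))
offsets = rng.normal(size=3)
outputs = iq @ mixing.T + offsets + 0.02 * rng.normal(size=(n_symbols, 3))

# Blind calibration: ICA recovers I and Q up to order, sign and scale.
ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(outputs)

# Correlate with the true components only to check the separation in this demo.
corr = np.corrcoef(np.c_[recovered, iq].T)[:2, 2:]
print("|correlation| between recovered and true I/Q:\n", np.round(np.abs(corr), 3))
```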
Abstract:
Conventional methods for solving the nonlinear blind source separation problem generally impose a series of restrictions in order to obtain a solution, often leading to imperfect separation of the original sources and high computational cost. In this work, we propose an alternative measure of independence based on information theory and use artificial intelligence tools to solve, first, the linear and, later, the nonlinear blind source separation problems. In the linear model, we apply genetic algorithms with Rényi negentropy as the independence measure to find a separation matrix from linear mixtures of signals, using waveforms, audio and images. A comparison is made with two types of Independent Component Analysis algorithms that are widespread in the literature. Subsequently, we use the same measure of independence as the cost function in the genetic algorithm to recover source signals that were mixed by nonlinear functions generated by a radial basis function artificial neural network. Genetic algorithms are powerful tools for global search and are therefore well suited for use in blind source separation problems. Tests and analyses are carried out through computer simulations.
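As a hedged sketch of the linear case only (with a log-cosh negentropy approximation standing in for the Rényi-based measure, and invented signals), a genetic algorithm searching for a separation matrix might look like the following; it is not the code developed in the work.

```python
# Hypothetical sketch (invented signals, not the thesis code): a genetic algorithm
# searching for a 2x2 separation matrix, with a log-cosh negentropy approximation
# standing in for the Rényi-based independence measure used in the work.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 4000)
sources = np.c_[np.sin(2 * np.pi * 13 * t), np.sign(np.sin(2 * np.pi * 7 * t))]
mixtures = sources @ rng.normal(size=(2, 2)).T            # unknown linear mixing

# Whiten the mixtures (zero mean, identity covariance).
x = mixtures - mixtures.mean(0)
d, E = np.linalg.eigh(np.cov(x.T))
z = x @ E @ np.diag(d ** -0.5) @ E.T

gauss_ref = np.log(np.cosh(rng.normal(size=100_000))).mean()   # E[G(nu)], nu ~ N(0, 1)

def fitness(genes):
    """Sum of per-component negentropy approximations (higher = more non-Gaussian)."""
    W = genes.reshape(2, 2)
    W = W / np.linalg.norm(W, axis=1, keepdims=True)      # unit-norm rows
    y = (z @ W.T)
    y = y / y.std(0)
    return float(sum((np.log(np.cosh(y[:, i])).mean() - gauss_ref) ** 2 for i in range(2)))

def tournament(pop, scores, k=3):
    idx = rng.choice(len(pop), size=k, replace=False)
    return pop[idx[np.argmax(scores[idx])]]

pop = rng.normal(size=(40, 4))
for generation in range(60):
    scores = np.array([fitness(p) for p in pop])
    children = [pop[np.argmax(scores)].copy()]            # elitism: keep the best candidate
    while len(children) < len(pop):
        parent1, parent2 = tournament(pop, scores), tournament(pop, scores)
        alpha = rng.uniform()
        child = alpha * parent1 + (1 - alpha) * parent2   # blend crossover
        child = child + 0.1 * rng.normal(size=4)          # Gaussian mutation
        children.append(child)
    pop = np.array(children)

best = max(pop, key=fitness).reshape(2, 2)
best = best / np.linalg.norm(best, axis=1, keepdims=True)
estimates = z @ best.T                                    # recovered up to order/sign/scale
corr = np.abs(np.corrcoef(np.c_[estimates, sources].T)[:2, 2:])
print("max |correlation| of each estimate with a true source:", corr.max(axis=1).round(3))
```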
Abstract:
This dissertation presents a new proposal for the direction of arrival (DOA) detection problem for more than one signal incident simultaneously on an antenna array with linear or planar geometry, using intelligent algorithms. The DOA estimator is developed using techniques of conventional beamforming (CBF), blind source separation (BSS), and the neural estimator MRBF (Modular Structure of Radial Basis Functions). The developed MRBF estimator has its capacity extended through interaction with the BSS technique. The BSS estimates the steering vectors of the multiple plane waves that reach the array at the same frequency, that is, it manages to separate mixed signals without a priori information. The technique developed in this work makes it possible to identify the directions of multiple sources and to identify and exclude interference sources.
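The estimators themselves are developed in the dissertation; only as an illustration of the conventional beamforming (CBF) ingredient, the sketch below computes a Bartlett DOA spectrum for two invented plane waves on an assumed 8-element, half-wavelength uniform linear array.

```python
# Hypothetical sketch (invented scenario, not the dissertation code): Bartlett
# (conventional beamforming) DOA spectrum on an 8-element uniform linear array.
import numpy as np

rng = np.random.default_rng(0)
M, snapshots = 8, 500
true_doas = np.deg2rad([-20.0, 35.0])                       # invented source directions

def steering(theta):
    # Half-wavelength spacing: pi * sin(theta) phase shift between adjacent elements.
    return np.exp(-1j * np.pi * np.sin(theta) * np.arange(M))

A = np.column_stack([steering(th) for th in true_doas])      # array manifold, M x 2
S = (rng.normal(size=(2, snapshots)) + 1j * rng.normal(size=(2, snapshots))) / np.sqrt(2)
N = 0.1 * (rng.normal(size=(M, snapshots)) + 1j * rng.normal(size=(M, snapshots)))
X = A @ S + N                                                # received snapshots

R = X @ X.conj().T / snapshots                               # sample covariance matrix

scan = np.deg2rad(np.linspace(-90, 90, 721))
power = np.array([np.real(steering(th).conj() @ R @ steering(th)) / M for th in scan])

# The two strongest local maxima of the spectrum estimate the directions of arrival.
peaks = [i for i in range(1, len(scan) - 1)
         if power[i] > power[i - 1] and power[i] > power[i + 1]]
top_two = sorted(peaks, key=lambda i: power[i], reverse=True)[:2]
print("estimated DOAs (deg):", sorted(np.rad2deg(scan[top_two]).round(1)))
```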
Abstract:
This work considers the development of a filtering system composed of an intelligent algorithm that separates information and noise coming from sensors interconnected by a Foundation Fieldbus (FF) network. The algorithm is implemented through standard FF function blocks, with on-line training through OPC (OLE for Process Control), and embedded in a DSP (digital signal processor) that interacts with the fieldbus devices. The ICA (Independent Component Analysis) technique, which explores the possibility of separating mixed signals based on the fact that they are statistically independent, was chosen for this blind source separation (BSS) process. The algorithm and its implementations are presented, as well as the results.
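The function-block and DSP details are beyond a short example; the core separation step might resemble the hedged sketch below, where FastICA (standing in for whichever ICA variant is embedded) splits a slow process signal from impulsive noise seen by two simulated sensors.

```python
# Hypothetical sketch (synthetic data, not the embedded implementation): ICA used to
# separate a slow process variable from noise observed by two fieldbus sensors.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 60, 3000)                              # 60 s of samples
process = np.sin(2 * np.pi * 0.05 * t)                    # slow process trend
noise = 0.8 * rng.laplace(size=t.size)                    # impulsive measurement noise

# Two sensors observe different unknown combinations of signal and noise.
sensor1 = 1.0 * process + 0.6 * noise
sensor2 = 0.7 * process + 1.0 * noise
readings = np.c_[sensor1, sensor2]

ica = FastICA(n_components=2, random_state=0)
components = ica.fit_transform(readings)

# Keep the smoother component as the process estimate (smaller sample-to-sample change).
roughness = np.abs(np.diff(components, axis=0)).mean(axis=0)
estimate = components[:, int(np.argmin(roughness))]
print("selected component index:", int(np.argmin(roughness)))
```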
Abstract:
Fuel is a material used to produce heat or power by burning, and lubricity is the capacity for reducing friction. The aim of this work is to evaluate the lubricity of eight fossil and renewable fuels used in Diesel engines by means of an HFRR tester, following the ASTM D 6079-04 standard. In this test, a sphere of AISI 52100 steel (diameter 6.00 ± 0.05 mm, Ra 0.05 ± 0.005 μm, E = 210 GPa, HRC 62 ± 4, HV0.2 631 ± 47) is subjected to reciprocating motion under a normal load of 2 N at a frequency of 50 Hz, producing a wear track length of 1.1 ± 0.1 mm on a flat disc of AISI 52100 steel (HV0.05 184 ± 10, Ra 0.02 ± 0.005 μm). The test duration was 75 minutes (225,000 cycles). Each test was repeated six times, and the results were characterized by intrinsic signatures of the signals of lubricant film percentage, friction coefficient, contact heating and sound pressure level (SPL, in dB). These signal signatures were obtained by two thermocouples and a portable decibel meter coupled to a data acquisition system and to the HFRR system. The wettability of a droplet of diesel fuel in thermal equilibrium on the horizontal surface of a virgin flat 52100 steel disc (Ra 0.02 ± 0.005 μm) was measured by its contact angle, 7.0 ± 3.5°, while the results obtained for the B5, B20 and B100 biodiesel blends originating from the ethylic transesterification of soybean oil were, respectively, 7.5 ± 3.5°, 13.5 ± 3.5° and 19.0 ± 1.0°; for distilled water, 78.0 ± 6.0°; and for the B5, B20 and B100 biodiesel blends originating from the ethylic transesterification of sunflower oil, respectively, 7.0 ± 4.0°, 8.5 ± 4.5° and 19.5 ± 2.5°. Lubricant films of different thicknesses were formed and measured as a percentage by means of the contact resistance technique, suggesting several regimes, from boundary to hydrodynamic lubrication. All fuels analyzed in this study produced ball wear scars with diameters smaller than 400 μm. The lowest values were observed for the balls lubricated by the B100, B20 and B5 sunflower blends and by the B20 and B5 soybean blends (WSD < 215 μm).
Abstract:
The work reported here involved an investigation into the grinding process, one of the last finishing processes carried out on a production line. Although several input parameters are involved in this process, attention today focuses strongly on the form and amount of cutting fluid employed, since these substances may be seriously pernicious to human health and to the environment, and involve high purchasing and maintenance costs when utilized and stored incorrectly. The type and amount of cutting fluid used directly affect some of the main output variables of the grinding process, which are analyzed here, such as tangential cutting force, specific grinding energy, acoustic emission, diametrical wear, roughness, residual stress and scanning electron microscopy. To analyze the influence of these variables, an optimised fluid application methodology was developed (involving rounded 5, 4 and 3 turn diameter nozzles and high fluid application pressures) to reduce the amount of fluid used in the grinding process and improve its performance in comparison with the conventional fluid application method (of diffuser nozzles and lower fluid application pressure). To this end, two types of cutting fluid (a 5% synthetic emulsion and neat oil) and two abrasive tools (an aluminium oxide and a superabrasive CBN grinding wheel) were used. The results revealed that, in every situation, the optimised application of cutting fluid significantly improved the efficiency of the process, particularly the combined use of neat oil and CBN grinding wheel.