982 results for Data handling


Relevance:

60.00%

Publisher:

Abstract:

One of the main challenges of slow speed machinery condition monitoring is that the energy generated by an incipient defect is too weak to be detected by traditional vibration measurements, owing to its low impact energy. Acoustic emission (AE) measurement is an alternative, as it can detect crack initiation or rubbing between moving surfaces. However, AE measurement requires a high sampling frequency, and consequently a huge amount of data must be processed. It also requires expensive hardware for data capture and storage, as well as signal processing techniques to retrieve valuable information on the state of the machine. AE signals have been utilised for early detection of defects in bearings and gears. This paper presents an online condition monitoring (CM) system for slow speed machinery which attempts to overcome those challenges. The system incorporates signal processing techniques relevant to slow speed CM, including noise removal to enhance the signal-to-noise ratio and peak-holding down-sampling to reduce the burden of massive data handling. The analysis software runs in the LabVIEW environment, which enables online remote control of data acquisition, real-time analysis, offline analysis and diagnostic trending. The system has been fully implemented on a site machine and is contributing significantly to improved maintenance efficiency and safer, more reliable operation.
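
The abstract names peak-holding down-sampling as the step that tames the AE data volume. The paper's own implementation is not given here; the following is a minimal sketch of the general idea, in which each block of consecutive samples is replaced by its peak absolute value (the block size and the NumPy-based approach are assumptions).

```python
import numpy as np

def peak_hold_downsample(signal: np.ndarray, block_size: int) -> np.ndarray:
    """Reduce a high-rate AE signal by keeping the peak |value| of each block.

    This preserves short transient bursts (e.g. from crack initiation) that a
    plain decimation or averaging step would smear out or miss.
    """
    n_blocks = len(signal) // block_size          # drop the incomplete tail block
    trimmed = signal[:n_blocks * block_size]
    blocks = trimmed.reshape(n_blocks, block_size)
    # Index of the largest absolute value in each block, then take that sample
    peak_idx = np.argmax(np.abs(blocks), axis=1)
    return blocks[np.arange(n_blocks), peak_idx]

# Example: 1 s of AE data at 1 MHz reduced by a factor of 1000
raw = np.random.randn(1_000_000)
reduced = peak_hold_downsample(raw, block_size=1000)   # 1000 samples remain
```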

Relevance:

60.00%

Publisher:

Abstract:

MapReduce frameworks such as Hadoop are well suited to handling large sets of data which can be processed separately and independently, with canonical applications in information retrieval and sales record analysis. Rapid advances in sequencing technology have ensured an explosion in the availability of genomic data, with a consequent rise in the importance of large-scale comparative genomics, often involving operations and data relationships which deviate from the classical MapReduce structure. This work examines the application of Hadoop to patterns of this nature, using as our focus a well-established workflow for identifying promoters (binding sites for regulatory proteins) across multiple gene regions and organisms, coupled with the unifying step of assembling these results into a consensus sequence. Our approach demonstrates the utility of Hadoop for problems of this nature, showing how the tyranny of the "dominant decomposition" can be at least partially overcome. It also demonstrates how load balancing and the granularity of parallelism can be optimized by pre-processing that splits and reorganizes input files, allowing a wide range of related problems to be brought under the same computational umbrella.
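
The unifying consensus step maps naturally onto the MapReduce pattern. The workflow's actual implementation is not reproduced here; the sketch below is a generic Hadoop Streaming-style mapper/reducer pair (assumed input: one aligned sequence per line) that emits per-column base observations and reduces them to a majority-vote consensus.

```python
from collections import Counter
from itertools import groupby

def mapper(lines):
    """Map: for each aligned sequence, emit (column_index, base) pairs."""
    for line in lines:
        seq = line.strip().upper()
        for col, base in enumerate(seq):
            if base in "ACGT":
                yield col, base

def reducer(pairs):
    """Reduce: majority-vote base per column, assembled into a consensus string."""
    consensus = {}
    for col, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        counts = Counter(base for _, base in group)
        consensus[col] = counts.most_common(1)[0][0]
    return "".join(consensus[c] for c in sorted(consensus))

if __name__ == "__main__":
    # Local stand-in for the Hadoop shuffle; on a cluster the two functions
    # would run as separate Hadoop Streaming mapper/reducer scripts.
    aligned = ["ACGTACGT", "ACGAACGT", "ACGTACGA"]
    print(reducer(mapper(aligned)))   # -> ACGTACGT
```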

Relevance:

60.00%

Publisher:

Abstract:

1 PDF document (8 pp., English). Contributed to: VSMM'08: 14th International Conference on Virtual Systems and Multimedia (Limassol, Cyprus, Oct 20-25, 2008).

Relevance:

60.00%

Publisher:

Abstract:

It is widely acknowledged that a company's ability to acquire market share, and hence its profitability, is very closely linked to the speed with which it can produce a new design. Indeed, a study by the U.K. Department of Trade and Industry has shown that the critical factor determining profitability is the timely delivery of the new product. Late entry to market or high production costs dramatically reduce profits, whilst an overrun on development cost has comparatively little effect. This paper describes a method which aims to assist the designer in producing higher-performance turbomachinery designs more quickly by accelerating the design process. The adopted approach combines an enhanced version of the 'Signposting' design process management methodology with industry-standard analysis codes and Computational Fluid Dynamics (CFD). It has been specifically configured to enable process-wide iteration, near-instantaneous generation of guidance data for the designer and fully automatic data handling. A successful laboratory experiment based on the design of a large high-pressure steam turbine is described, and this leads on to current work extending the proven concept to a full industrial application: the design of aeroengine compressors with Rolls-Royce plc.

Relevance:

60.00%

Publisher:

Abstract:

A review of the data (handling) requirements for length-based stock assessment is presented, with emphasis on the relationship between the expected outputs and the key features of the samples required, and on biases and other sources of inaccuracy.

Relevance:

60.00%

Publisher:

Abstract:

The use of computational systems for data storage, data processing and information production has spread rapidly in recent years, and Geographic Information Systems (GIS) are part of this scenario. Computer-based access to geographic information is today a reality in corporate environments, government agencies, schools and homes. This dissertation presents a proposal for modelling urban zoning elements based on a domain ontology. Ontologies are represented as classes and attributes of a given domain. In the proposed approach, these classes are exported to the XMI format, preserving the class, attribute and relationship definitions of the analysed domain and composing a class repository that, in principle, allows their reuse. As an example of the proposal, an ontology of the urban zoning of the municipality of Macaé-RJ was built with the Protégé editor, following the Municipal Master Plan. The resulting ontology was exported to XMI, and a class diagram was then created by importing the classes into ArgoUML, a modelling tool for systems based on the object-oriented (OO) paradigm. This import makes the ontology available as a package of classes that can be used by OO-based applications for the development of information systems. To demonstrate the use of these classes, a prototype was developed with the ALOV Map software, which displays them on the Web as thematic maps.
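
The dissertation builds its ontology in Protégé and exports it as XMI; the exact class model is not reproduced here. As a purely illustrative sketch of what a small urban-zoning domain ontology looks like when built programmatically, the snippet below uses Python's rdflib (an assumed substitute for the Protégé/XMI toolchain) with hypothetical class and property names.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import OWL, RDF, RDFS

# Hypothetical namespace and class names for an urban-zoning domain ontology
ZON = Namespace("http://example.org/urban-zoning#")

g = Graph()
g.bind("zon", ZON)

# Domain classes: a generic Zone with two illustrative specialisations
for cls in (ZON.Zone, ZON.ResidentialZone, ZON.IndustrialZone):
    g.add((cls, RDF.type, OWL.Class))
g.add((ZON.ResidentialZone, RDFS.subClassOf, ZON.Zone))
g.add((ZON.IndustrialZone, RDFS.subClassOf, ZON.Zone))

# An attribute of Zone (maximum building height, in metres)
g.add((ZON.maxBuildingHeight, RDF.type, OWL.DatatypeProperty))
g.add((ZON.maxBuildingHeight, RDFS.domain, ZON.Zone))
g.add((ZON.maxBuildingHeight, RDFS.label, Literal("maximum building height (m)")))

# Serialise to RDF/XML; the dissertation instead exports Protégé classes to XMI
print(g.serialize(format="xml"))
```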

Relevance:

60.00%

Publisher:

Abstract:

A two-year, comprehensive, quantitative investigation was conducted to analyze and identify the spatial distribution of petrogenic and biogenic hydrocarbons in sediments, surface waters, fish and shellfish of Biscayne Bay, Florida. The goal of the first year of the project was to establish baseline information to support oil spill impact assessment and clean-up. One hundred and fifty-five sediment and eleven biota samples were collected. The areas sampled included the Miami River, the Intracoastal Waterway, tidal flats, access canals and environmentally sensitive shorelines. The second year of the study centered on areas exhibiting petroleum contamination, including the Miami River, Little River, Goulds Canal, Black Creek and Military Canal, from which surface and subsurface sediment, biota and surface water were collected. Sample collection, analyses and data handling for the two-year project were conducted so that all information was court-competent and scientifically accurate, and chain of custody was maintained for all samples. Total hydrocarbon content of surface sediments ranged from below detection limits to a high of 2663.44 pg/g, and several sample stations contained petroleum contamination. The majority of biota samples exhibited hydrocarbon concentrations and characteristics indicating little, if any, petroleum contamination. Surface water samples ranged from 0.78 to 64.47 μg/L, and several contained petroleum hydrocarbons. Our results indicate several areas of petroleum contamination, characterized by industrial complexes, port facilities, marinas, major boating routes and many of the major tributaries emptying into Biscayne Bay.

Relevance:

60.00%

Publisher:

Abstract:

Remote sensing-based Production Efficiency Models (PEMs) spring from the concept of light use efficiency and have been applied more and more to estimating terrestrial Net Primary Productivity (NPP) regionally and globally. However, global NPP estimates vary greatly among models, data sources and data handling methods. Because direct observation or measurement of NPP is unavailable at the global scale, the precision and reliability of the models cannot be guaranteed, although there are ways to improve model accuracy through the input parameters. In this study, five remote sensing-based PEMs are compared: CASA, GLO-PEM, TURC, SDBM and VPM. We divided the input parameters into three categories and analyzed the uncertainty of (1) vegetation distribution, (2) the fraction of photosynthetically active radiation absorbed by the canopy (fPAR) and (3) light use efficiency (ε). Ground measurements from the Hulunbeier typical grassland and meteorological measurements were used for accuracy evaluation. Results show that a real-time, more accurate vegetation distribution could significantly affect the accuracy of the models, since it is used directly or indirectly in all models and affects other parameters simultaneously. Higher spatial and spectral resolution remote sensing data may reduce the uncertainty of fPAR by up to 51.3%, which is essential for improving model accuracy.
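
The PEMs compared above share the light-use-efficiency logic in which productivity is the product of incoming PAR, the fraction absorbed by the canopy (fPAR) and a conversion efficiency ε, with ε usually down-regulated by environmental scalars. The sketch below illustrates this common structure only; the parameter values and stress scalars are placeholders, not those of CASA, GLO-PEM, TURC, SDBM or VPM.

```python
def lue_npp(par, fpar, eps_max, temp_scalar=1.0, water_scalar=1.0):
    """Generic light-use-efficiency productivity estimate.

    par          incident photosynthetically active radiation (MJ m-2 per period)
    fpar         fraction of PAR absorbed by the canopy (0..1)
    eps_max      maximum light use efficiency (g C per MJ APAR)
    temp_scalar  temperature down-regulation factor (0..1), placeholder
    water_scalar moisture down-regulation factor (0..1), placeholder
    """
    apar = par * fpar                       # absorbed PAR
    eps = eps_max * temp_scalar * water_scalar
    return apar * eps                       # g C m-2 per period

# Illustrative numbers only (not calibrated to any of the five models)
print(lue_npp(par=250.0, fpar=0.6, eps_max=0.5, temp_scalar=0.9, water_scalar=0.8))
```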

Relevance:

60.00%

Publisher:

Abstract:

The main lines of work on the Semantic Web in the field of television archives are analysed and described. The Semantic Web is first analysed and contextualised from a general perspective, and the main initiatives working with audiovisual content are then examined: the MuNCH project, the S5T project, Semantic Television and VideoActive.

Relevance:

60.00%

Publisher:

Abstract:

The performance of the surface zone of concrete is acknowledged as a major factor governing the rate of deterioration of reinforced concrete structures, as it provides the only barrier to the ingress of water containing dissolved ionic species such as chlorides which, ultimately, initiate corrosion of the reinforcement. In-situ monitoring of cover-zone concrete is therefore critical in attempting to make realistic predictions of the in-service performance of the structure. To this end, this paper presents developments in a remote interrogation system that allows continuous, real-time monitoring of cover-zone concrete from an office setting. Use is made of a multi-electrode array embedded within the cover-zone concrete to acquire discretized electrical resistivity and temperature measurements, with both parameters monitored spatially and temporally. On-site instrumentation, which allows remote interrogation of concrete samples placed at a marine exposure site, is detailed, together with data handling and processing procedures. Site measurements highlight the influence of temperature on electrical resistivity, and an Arrhenius-based temperature correction protocol is developed using on-site measurements to standardize resistivity data to a reference temperature; this is an advance over the use of laboratory-based procedures. The testing methodology and interrogation system represent a robust, low-cost and high-value technique which could be deployed for intelligent monitoring of reinforced concrete structures.
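
The Arrhenius-based temperature correction mentioned above normalizes each resistivity reading to a reference temperature so that trends reflect changes in the concrete rather than in ambient conditions. The paper derives its own correction constant from the on-site data; the sketch below shows only the generic form of such a correction, with an assumed, illustrative activation-energy constant.

```python
import math

# Illustrative activation-energy constant b = Ea/R in Kelvin; the paper fits
# its own value from on-site measurements, so this number is a placeholder.
B_ARRHENIUS_K = 2900.0

def resistivity_at_reference(rho_measured_ohm_m, temp_c, ref_temp_c=20.0):
    """Normalize a resistivity reading to a reference temperature (Arrhenius form).

    rho_ref = rho(T) * exp(b * (1/T_ref - 1/T)), with temperatures in Kelvin.
    """
    t = temp_c + 273.15
    t_ref = ref_temp_c + 273.15
    return rho_measured_ohm_m * math.exp(B_ARRHENIUS_K * (1.0 / t_ref - 1.0 / t))

# Example: a reading of 120 ohm-m taken at 30 degC, standardized to 20 degC
print(resistivity_at_reference(120.0, temp_c=30.0))   # > 120 ohm-m, as expected
```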

Relevance:

60.00%

Publisher:

Abstract:

Free-roaming dogs (FRD) represent a potential threat to the quality of life in cities from an ecological, social and public health point of view. One of the most urgent concerns is the role of uncontrolled dogs as reservoirs of infectious diseases transmittable to humans and, above all, rabies. An estimate of the FRD population size and characteristics in a given area is the first step for any relevant intervention programme. Direct count methods are still prominent because of their non-invasive approach; information technologies can support such methods, facilitating data collection and allowing for more efficient data handling. This paper presents a new framework for data collection using a topological algorithm implemented as an ArcScript in ESRI® ArcGIS software, which allows for a random selection of the sampling areas. It also supplies a mobile phone application for Android® operating system devices which integrates the Global Positioning System (GPS) and Google Maps™. The potential of such a framework was tested in two Italian regions. Coupling innovative technological solutions with common counting methods facilitates data collection and transcription. It also paves the way for future applications which could support dog population management systems.
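
The ArcScript algorithm itself is not reproduced here; as a purely illustrative sketch of the underlying idea of randomly selecting sampling areas, the snippet below draws a random subset of grid cells covering a study area (cell size, extent and sample count are assumptions, not values from the paper).

```python
import random

def make_grid(min_x, min_y, max_x, max_y, cell_size):
    """Enumerate square grid cells (as bounding boxes) covering the study area."""
    cells = []
    y = min_y
    while y < max_y:
        x = min_x
        while x < max_x:
            cells.append((x, y, x + cell_size, y + cell_size))
            x += cell_size
        y += cell_size
    return cells

def select_sampling_areas(cells, n, seed=None):
    """Randomly select n cells as sampling areas for the street counts."""
    rng = random.Random(seed)
    return rng.sample(cells, n)

# Illustrative projected extent (metres) and a 500 m cell size
grid = make_grid(0, 0, 10_000, 10_000, cell_size=500)   # 400 cells
areas = select_sampling_areas(grid, n=20, seed=42)
print(len(areas), areas[0])
```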

Relevance:

60.00%

Publisher:

Abstract:

This paper examines how a primary school teacher leads students in carrying out a data handling task, and the reflection she makes on it. The work is part of a larger study that follows an interpretative and qualitative research methodology with a case study design. The results indicate that, in the exploration of the task, the teacher's practice addresses the phases of the statistical investigation cycle in different ways. The teacher showed particular concern for the participation of the students in making decisions about the procedures to follow. In her reflection, she identifies the main moments of the lesson, values the decisions made and highlights the data collection phase as the most meaningful moment.

Relevance:

60.00%

Publisher:

Abstract:

The Chartered Institution of Building Services Engineers (CIBSE) produced a technical memorandum (TM36) presenting research on the impact of future climate on building energy use and thermal comfort. One climate projection for each of four CO2 emissions scenarios was used in TM36, providing a deterministic outlook. As part of the UK Climate Impacts Programme (UKCIP), probabilistic climate projections are being studied in relation to building energy simulation techniques. Including uncertainty in climate projections is considered an important advance in climate impacts modelling and is included in the latest UKCIP data (UKCP09). Incorporating the stochastic nature of these new climate projections in building energy modelling requires a significant increase in data handling and careful statistical interpretation of the results to provide meaningful conclusions. This paper compares the results from building energy simulations when applying deterministic and probabilistic climate data, based on two case study buildings: (i) a mixed-mode office building with exposed thermal mass and (ii) a mechanically ventilated, lightweight office building. Building (i) represents an energy-efficient building design that provides passive and active measures to maintain thermal comfort. Building (ii) relies entirely on mechanical means for heating and cooling, with its lightweight construction raising concern over increased cooling loads in a warmer climate. Devising an effective probabilistic approach highlighted greater uncertainty in predicting building performance, depending on the type of building modelled and the performance factors under consideration. Results indicate that the range of calculated quantities depends not only on the building type but also, strongly, on the performance parameters of interest. Uncertainty is likely to be particularly marked with regard to thermal comfort in naturally ventilated buildings.
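
Handling probabilistic projections means running the building model over many sampled weather years and interpreting the outputs statistically rather than as a single number. The sketch below is a generic illustration of that workflow; `simulate_overheating_hours` is a hypothetical toy stand-in for a full building energy simulation, and the percentile choices are assumptions.

```python
import random
import statistics

def simulate_overheating_hours(weather_year):
    """Hypothetical stand-in for a full building energy simulation run.

    In practice each weather_year would be a UKCP09-derived weather file fed
    to a dynamic simulation tool; a toy response keeps the sketch runnable.
    """
    return max(0.0, 40.0 * (weather_year["mean_summer_temp"] - 18.0))

def summarise(results, percentiles=(10, 50, 90)):
    """Summarise the distribution of a performance metric across projections."""
    ordered = sorted(results)
    return {p: ordered[int(p / 100 * (len(ordered) - 1))] for p in percentiles}

# Sample many equally plausible projected weather years (toy distribution)
rng = random.Random(1)
weather_years = [{"mean_summer_temp": rng.gauss(20.0, 1.5)} for _ in range(1000)]

hours = [simulate_overheating_hours(w) for w in weather_years]
print("mean overheating hours:", round(statistics.mean(hours), 1))
print("percentiles:", summarise(hours))
```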

Relevance:

60.00%

Publisher:

Abstract:

TLAStat+ is a powerful but easy-to-use statistics software package. It is a statistics add-in for use in conjunction with Microsoft Excel, providing extra calculation, graphing and help features for statistical data analysis. TLAStat+ is designed for teaching introductory and intermediate statistics courses. It runs on PCs under various versions of Excel and has powerful data handling capabilities.

Relevance:

60.00%

Publisher:

Abstract:

The control of molecular architectures has been exploited in layer-by-layer (LbL) films deposited on Au interdigitated electrodes, thus forming an electronic tongue (e-tongue) system that reached an unprecedentedly high sensitivity (down to 10⁻¹² M) in detecting catechol. Such high sensitivity was made possible by using units containing the enzyme tyrosinase, which interacted specifically with catechol, and by processing impedance spectroscopy data with information visualization methods. These methods, including the parallel coordinates technique, were also useful for identifying the major contributors to the high distinguishing ability toward catechol. Among the several film architectures tested, the most efficient had a tyrosinase layer deposited atop LbL films of alternating layers of dioctadecyldimethylammonium bromide (DODAB) and 1,2-dipalmitoyl-sn-glycero-3-phospho-rac-(1-glycerol) (DPPG), viz., (DODAB/DPPG)₅/DODAB/Tyr. The latter represents a more suitable medium for immobilizing tyrosinase than conventional polyelectrolytes. Furthermore, the distinction was more effective at low frequencies, where double-layer effects at the film/liquid interface dominate the electrical response. Because the optimization of film architectures based on information visualization is completely generic, the approach presented here may be extended to designing architectures for other types of applications in addition to sensing and biosensing. © 2013 American Chemical Society.
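
Parallel coordinates, named above as one of the information visualization methods, draws each measurement as a line across one axis per feature, so samples that the sensor distinguishes appear as separable bundles of lines. The sketch below uses pandas' built-in parallel_coordinates on a small synthetic table; the feature names and values are illustrative only, not the paper's impedance data.

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import parallel_coordinates

# Synthetic stand-in for features extracted from impedance spectra:
# responses at a few low frequencies, for two hypothetical sample classes.
rng = np.random.default_rng(0)
rows = []
for label, offset in [("catechol (trace)", 0.0), ("buffer only", 0.4)]:
    for _ in range(15):
        rows.append({
            "class": label,
            "C@1Hz": 1.0 + offset + rng.normal(0, 0.05),
            "C@10Hz": 0.8 + offset + rng.normal(0, 0.05),
            "C@100Hz": 0.5 + offset + rng.normal(0, 0.05),
        })
df = pd.DataFrame(rows)

# One line per sample, coloured by class; separable bundles indicate distinction
ax = parallel_coordinates(df, "class", colormap="viridis")
ax.set_ylabel("normalized response (arbitrary units)")
plt.tight_layout()
plt.show()
```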