873 results for data acquisition


Relevance:

100.00%

Publisher:

Abstract:

"PB2005-917005."

Relevance:

100.00%

Publisher:

Abstract:

The development cost of any civil infrastructure is very high; over its life span, a civil structure is subjected to numerous physical loads and environmental effects that damage it. Failing to identify this damage at an early stage may result in severe property loss and may pose a threat to people and the environment. There is therefore a need for effective damage detection techniques to ensure the safety and integrity of the structure. One Structural Health Monitoring approach is to evaluate a structure using statistical analysis. In this study, a civil structure 8 feet long and 3 feet in diameter, instrumented with thermocouple sensors at four different levels, is analyzed under controlled and variable conditions. Statistical analysis was applied to identify possible damage, and the analysis detected structural defects at various levels of the structure.
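
As a rough illustration of the statistical approach described above, a minimal sketch follows, assuming synthetic four-level thermocouple readings and a simple z-score decision rule (the study's actual statistical procedure is not specified here):

```python
# Minimal sketch of statistical damage detection on multi-level sensor data.
# The data, threshold and level layout are hypothetical illustrations.
import numpy as np

def detect_damage(baseline: np.ndarray, test: np.ndarray, z_thresh: float = 3.0):
    """Flag sensor levels whose test-condition mean deviates from the
    controlled-condition baseline by more than z_thresh standard errors."""
    mu = baseline.mean(axis=1)                      # per-level baseline mean
    se = baseline.std(axis=1, ddof=1) / np.sqrt(baseline.shape[1])
    z = (test.mean(axis=1) - mu) / se               # standardized deviation
    return np.abs(z) > z_thresh, z

rng = np.random.default_rng(0)
baseline = rng.normal(25.0, 0.5, size=(4, 200))     # 4 levels, controlled runs
test = baseline + rng.normal(0, 0.5, size=(4, 200))
test[2] += 2.0                                      # simulated anomaly at level 3
flags, z = detect_damage(baseline, test)
for lvl, (f, zi) in enumerate(zip(flags, z), start=1):
    print(f"level {lvl}: z = {zi:+.1f} -> {'possible damage' if f else 'ok'}")
```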

Relevance:

80.00%

Publisher:

Abstract:

Beyond the classical statistical approaches (determination of basic statistics, regression analysis, ANOVA, etc.), a new set of applications of different statistical techniques has gained increasing relevance in the analysis, processing and interpretation of data concerning the characteristics of forest soils. This is evident in some recent publications in the context of Multivariate Statistics. These new methods require additional care that is not always taken or referred to in some approaches. In the particular case of geostatistical applications it is necessary, besides geo-referencing all data acquisition, to collect the samples on regular grids and in sufficient quantity so that the variograms can reflect the spatial distribution of soil properties in a representative manner. As for the great majority of Multivariate Statistics techniques (Principal Component Analysis, Correspondence Analysis, Cluster Analysis, etc.), although in most cases they do not require the assumption of a normal distribution, they nevertheless need a proper and rigorous strategy for their use. In this work, some reflections on these methodologies are presented, in particular on the main constraints that often occur during the data collection process and on the various ways of linking these different techniques. Finally, illustrations of some particular applications of these statistical methods are also presented.
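
To illustrate the geostatistical point about regular grids and variograms, a minimal sketch follows, assuming synthetic values on a regular 10 x 10 grid; names and data are illustrative, and a real study would use geo-referenced measurements (and typically a dedicated library):

```python
# Minimal sketch of an empirical semivariogram for gridded soil samples.
import numpy as np

def empirical_semivariogram(coords, values, bins):
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)          # unique sample pairs
    d, sq = d[iu], sq[iu]
    idx = np.digitize(d, bins)
    return np.array([sq[idx == i].mean() if np.any(idx == i) else np.nan
                     for i in range(1, len(bins))])

# Regular 10 x 10 grid, as the text recommends for representative variograms.
xx, yy = np.meshgrid(np.arange(10.0), np.arange(10.0))
coords = np.column_stack([xx.ravel(), yy.ravel()])
rng = np.random.default_rng(1)
values = np.sin(coords[:, 0] / 3.0) + rng.normal(0, 0.2, len(coords))
gamma = empirical_semivariogram(coords, values, bins=np.linspace(0, 8, 9))
print(np.round(gamma, 3))  # semivariance should rise with lag distance
```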

Relevance:

80.00%

Publisher:

Abstract:

The TCABR data analysis and acquisition system has been upgraded to support a joint research programme using remote participation technologies. The architecture of the new system uses the Java language as the programming environment. Since application parameters and hardware in a joint experiment are complex, with great variability of components, requirements and specifications, solutions need to be flexible and modular, independent of operating system and computer architecture. To describe and organize the information on all the components and the connections among them, the systems are developed using Extensible Markup Language (XML) technology. Communication between clients and servers uses remote procedure calls based on XML (XML-RPC). The integration of the Java language, XML and XML-RPC makes it easy to develop a standard data and communication access layer between users and laboratories using common software libraries and a Web application. The libraries allow data retrieval using the same methods for all user laboratories in the joint collaboration, and the Web application provides a simple graphical user interface (GUI). The TCABR tokamak team, in collaboration with the IPFN (Instituto de Plasmas e Fusao Nuclear, Instituto Superior Tecnico, Universidade Tecnica de Lisboa), is implementing these remote participation technologies. The first version was tested at the Joint Experiment on TCABR (TCABRJE), a Host Laboratory Experiment organized in cooperation with the IAEA (International Atomic Energy Agency) in the framework of the IAEA Coordinated Research Project (CRP) on "Joint Research Using Small Tokamaks". (C) 2010 Elsevier B.V. All rights reserved.
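
A minimal sketch of the kind of XML-RPC data access described above, assuming a hypothetical server URL and method name (the actual TCABR service interface is not given here):

```python
# Minimal sketch of an XML-RPC data-access client. The URL and the
# "get_signal" method are hypothetical; xmlrpc.client is standard Python.
import xmlrpc.client

server = xmlrpc.client.ServerProxy("http://tcabr.example.org:8080/rpc")
# The same call works for every collaborating laboratory: shot number
# and signal name are the only inputs the user needs to know.
data = server.get_signal(25000, "plasma_current")
print(len(data), "samples received")
```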

Relevance:

80.00%

Publisher:

Abstract:

Each plasma physics laboratory has its own proprietary control and data acquisition scheme, usually different from one laboratory to another; each laboratory has its own way of controlling the experiment and retrieving data from the database. Fusion research relies to a great extent on international collaboration, and such private systems make it difficult to follow the work remotely. The TCABR data analysis and acquisition system has been upgraded to support a joint research programme using remote participation technologies. The choice of MDSplus (Model Driven System plus) is justified by the fact that it is widely used, so scientists from different institutions may use the same system in different experiments on different tokamaks without needing to know how each machine handles its acquisition and data analysis. Another important point is that MDSplus has a library system that allows communication between different languages (Java, Fortran, C, C++, Python) and programs such as MATLAB, IDL and OCTAVE. In the case of the TCABR tokamak, interfaces (the object of this paper) were developed between the system already in use and MDSplus, instead of adopting MDSplus at all stages, from control and data acquisition to data analysis. This was done to preserve a complex system already in operation that would otherwise take a long time to migrate. The implementation also allows new components to be added that use MDSplus fully at all stages. (c) 2012 Elsevier B.V. All rights reserved.
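
A minimal sketch of remote data retrieval through the MDSplus thin-client Python API, assuming a hypothetical host, tree name and node path; only the Connection/openTree/get calls are standard MDSplus usage:

```python
# Minimal sketch of MDSplus thin-client access. Host, tree and node
# names are hypothetical placeholders, not the actual TCABR layout.
from MDSplus import Connection

conn = Connection("mdsplus.example.org")   # mdsip server of the laboratory
conn.openTree("tcabr", 25000)              # tree name and shot number
ip = conn.get("\\TOP.SIGNALS:PLASMA_CURRENT").data()
t = conn.get("dim_of(\\TOP.SIGNALS:PLASMA_CURRENT)").data()
print(ip.shape, t.shape)                   # signal and its time base
```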

Relevance:

80.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance:

70.00%

Publisher:

Abstract:

The literature offers limited knowledge of the Bluetooth protocol-based data acquisition process and of the accuracy and reliability of analyses performed on the resulting data. This paper extends the body of knowledge surrounding the use of data from the Bluetooth Media Access Control Scanner (BMS) as a complementary traffic data source. A multi-layer simulation model named Traffic and Communication Simulation (TCS) is developed. TCS is used to model the theoretical properties of BMS data and to analyse the accuracy and reliability of travel time estimation using the BMS data.
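
A minimal sketch of BMS-style travel-time estimation, assuming synthetic (MAC, timestamp) detections at an upstream and a downstream scanner; real matching must also handle re-detections within an antenna zone:

```python
# Minimal sketch: match MAC addresses seen at two scanners and take the
# median of the matched time differences. All detections are synthetic.
upstream = {"aa:01": 10.0, "aa:02": 12.5, "aa:03": 15.0}   # MAC -> time (s)
downstream = {"aa:01": 95.0, "aa:03": 103.0, "bb:07": 40.0}

travel_times = [downstream[mac] - t0 for mac, t0 in upstream.items()
                if mac in downstream]
if travel_times:
    est = sorted(travel_times)[len(travel_times) // 2]     # median, outlier-robust
    print(f"matched {len(travel_times)} devices, travel time ≈ {est:.1f} s")
```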

Relevance:

70.00%

Publisher:

Abstract:

Big Data and predictive analytics have received significant attention from the media and academic literature throughout the past few years, and it is likely that these emerging technologies will materially impact the mining sector. This short communication argues, however, that these technological forces will probably unfold differently in the mining industry than they have in many other sectors because of significant differences in the marginal cost of data capture and storage. To this end, we offer a brief overview of what Big Data and predictive analytics are, and explain how they are bringing about changes in a broad range of sectors. We discuss the “N=all” approach to data collection being promoted by many consultants and technology vendors in the marketplace but, by considering the economic and technical realities of data acquisition and storage, we then explain why an “n ≪ all” data collection strategy probably makes more sense for the mining sector. Finally, towards shaping the industry’s policies with regard to technology-related investments in this area, we conclude by putting forward a conceptual model for leveraging Big Data tools and analytical techniques that is a more appropriate fit for the mining sector.

Relevance:

70.00%

Publisher:

Abstract:

Network data packet capture and replay capabilities are basic requirements for forensic analysis of faults and security-related anomalies, as well as for testing and development. Cyber-physical networks, in which data packets are used to monitor and control physical devices, must operate within strict timing constraints, in order to match the hardware devices' characteristics. Standard network monitoring tools are unsuitable for such systems because they cannot guarantee to capture all data packets, may introduce their own traffic into the network, and cannot reliably reproduce the original timing of data packets. Here we present a high-speed network forensics tool specifically designed for capturing and replaying data traffic in Supervisory Control and Data Acquisition systems. Unlike general-purpose "packet capture" tools it does not affect the observed network's data traffic and guarantees that the original packet ordering is preserved. Most importantly, it allows replay of network traffic precisely matching its original timing. The tool was implemented by developing novel user interface and back-end software for a special-purpose network interface card. Experimental results show a clear improvement in data capture and replay capabilities over standard network monitoring methods and general-purpose forensics solutions.
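
A minimal software sketch of timing-preserving replay, assuming Scapy and a hypothetical capture file; the tool described above achieves precise timing with special-purpose hardware, which a user-space loop like this can only approximate:

```python
# Minimal sketch of timing-preserving packet replay with Scapy (assumes
# Scapy is installed, the pcap file exists, and root privileges to send).
import time
from scapy.all import rdpcap, sendp

packets = rdpcap("scada_capture.pcap")     # file name is illustrative
start = time.monotonic()
t0 = float(packets[0].time)
for pkt in packets:
    # Wait until this packet's original offset from the first packet.
    delay = (float(pkt.time) - t0) - (time.monotonic() - start)
    if delay > 0:
        time.sleep(delay)
    sendp(pkt, iface="eth0", verbose=False)  # interface name is illustrative
```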

Relevance:

70.00%

Publisher:

Abstract:

A Radio Frequency (RF) based digital data transmission scheme with 8-channel encoder/decoder ICs is proposed for surface electrode switching of a 16-electrode wireless Electrical Impedance Tomography (EIT) system. An RF-based wireless digital data transmission module (WDDTM) is developed, and the electrode switching of an EIT system is studied by analyzing the boundary data collected and the resistivity images of practical phantoms. An electrode switching module (ESM) is built with analog multiplexers and switched by parallel digital data transmitted over a wireless transmitter/receiver (T-x/R-x) module working with radio frequency technology. Parallel digital bits are generated using an NI USB-6251 card working on the LabVIEW platform and sent to the transmission module, which transmits the digital data to the receiver end. The transmitter/receiver module developed is interfaced with the personal computer (PC) and with practical phantoms through the ESM and a USB-based DAQ system, respectively. It is observed that the digital bits required for multiplexer operation are sequentially generated by the digital output (D/O) ports of the DAQ card. Parallel-to-serial and serial-to-parallel conversion of the digital data is performed by the encoder and decoder ICs. The wireless digital data transmission module successfully transmitted and received the parallel data required for switching the current and voltage electrodes. A 1 mA, 50 kHz sinusoidal constant current is injected at the phantom boundary using the common ground current injection protocol, and the boundary potentials developed at the voltage electrodes are measured. Resistivity images of the practical phantoms are reconstructed from the boundary data using EIDORS. The boundary data and the resistivity images reconstructed from the surface potentials are studied to assess the wireless digital data transmission system. Boundary data profiles of the practical phantom with different configurations show that the multiplexers operate in the sequence required for the common ground current injection protocol. The voltage peaks obtained at the proper positions in the boundary data profiles confirm the sequential operation of the multiplexers and the successful wireless transmission of the digital bits. The reconstructed images and their image parameters confirm that the boundary data are successfully acquired by the DAQ system, which in turn indicates proper sequential operation of the multiplexers and successful wireless transmission of the digital bits. Hence the developed RF-based wireless digital data transmission module (WDDTM) is found suitable for transmitting the digital bits required for electrode switching in a wireless EIT data acquisition system. (C) 2011 Elsevier Ltd. All rights reserved.
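
A minimal sketch of an electrode-switching sequence for the common ground current injection protocol, assuming a hypothetical 8-bit layout (4 address bits each for the current and voltage multiplexers); the actual bit assignment of the WDDTM is not specified here:

```python
# Minimal sketch: enumerate the parallel digital words a DAQ card might
# drive into the multiplexers. The 4+4 bit layout is an assumption.
N = 16                                  # electrodes on the phantom boundary
for current_el in range(N):             # electrode driven against ground
    for voltage_el in range(N):         # electrode whose potential is read
        if voltage_el == current_el:
            continue                    # skip the driven electrode
        word = (current_el << 4) | voltage_el
        print(f"I={current_el:2d} V={voltage_el:2d} bits={word:08b}")
```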

Relevance:

70.00%

Publisher:

Abstract:

The NMR-based approach to metabolomics typically involves the collection of two-dimensional (2D) heteronuclear correlation spectra for the identification and assignment of metabolites. In cases of spectral overlap, a 3D spectrum becomes necessary, but acquiring one at sufficient resolution is hampered by slow data acquisition. We describe here a method to acquire three spectra (one 3D and two 2D) simultaneously in a single data set, based on a combination of fast data acquisition techniques: G-matrix Fourier transform (GFT) NMR spectroscopy, parallel data acquisition and non-uniform sampling. The following spectra are acquired simultaneously: (1) C-13 multiplicity-edited GFT (3,2)D HSQC-TOCSY, (2) 2D [H-1,H-1] TOCSY and (3) 2D [C-13,H-1] HETCOR. The spectra are obtained at high resolution and provide high-dimensional spectral information for resolving ambiguities. While the GFT spectrum has previously been shown to provide good resolution, the editing of spin systems based on their CH multiplicities further resolves ambiguities in resonance assignment. The experiment is demonstrated on a mixture of 21 metabolites commonly observed in metabolomics, with spectra acquired at natural abundance of C-13. This is the first application of a combination of three fast NMR methods to small molecules and opens up new avenues for high-throughput approaches to NMR-based metabolomics.
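
A minimal sketch of one ingredient of the fast-acquisition combination, non-uniform sampling, assuming an illustrative grid size, sampling fraction and exponential bias:

```python
# Minimal sketch of an exponentially biased non-uniform sampling (NUS)
# schedule for an indirect dimension. All parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(42)
grid, fraction, t2_decay = 256, 0.25, 0.3
# Bias toward early increments, where the signal (before T2 decay) is strongest.
weights = np.exp(-np.arange(grid) / (grid * t2_decay))
points = rng.choice(grid, size=int(grid * fraction), replace=False,
                    p=weights / weights.sum())
print(np.sort(points))   # the increments to actually record
```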

Relevance:

70.00%

Publisher:

Abstract:

The mapping and geospatial analysis of benthic environments are multidisciplinary tasks that have become more accessible in recent years because of advances in technology and cost reductions in survey systems. The complex relationships that exist among physical, biological, and chemical seafloor components require advanced, integrated analysis techniques to enable scientists and others to visualize patterns and, in so doing, allow inferences to be made about benthic processes. Effective mapping, analysis, and visualization of marine habitats are particularly important because the subtidal seafloor environment is not readily viewed directly by eye. Research in benthic environments relies heavily, therefore, on remote sensing techniques to collect effective data. Because many benthic scientists are not mapping professionals, they may not adequately consider the links between data collection, data analysis, and data visualization. Projects often start with clear goals, but may be hampered by the technical details and skills required for maintaining data quality through the entire process from collection through analysis and presentation. The lack of technical understanding of the entire data handling process can represent a significant impediment to success. While many benthic mapping efforts have detailed their methodology as it relates to the overall scientific goals of a project, only a few published papers and reports focus on the analysis and visualization components (Paton et al. 1997, Weihe et al. 1999, Basu and Saxena 1999, Bruce et al. 1997). In particular, the benthic mapping literature often briefly describes data collection and analysis methods, but fails to provide sufficiently detailed explanation of particular analysis techniques or display methodologies so that others can employ them. In general, such techniques are in large part guided by the data acquisition methods, which can include both aerial and water-based remote sensing methods to map the seafloor without physical disturbance, as well as physical sampling methodologies (e.g., grab or core sampling). The terms benthic mapping and benthic habitat mapping are often used synonymously to describe seafloor mapping conducted for the purpose of benthic habitat identification. There is a subtle yet important difference, however, between general benthic mapping and benthic habitat mapping. The distinction is important because it dictates the sequential analysis and visualization techniques that are employed following data collection. In this paper general seafloor mapping for identification of regional geologic features and morphology is defined as benthic mapping. Benthic habitat mapping incorporates the regional scale geologic information but also includes higher resolution surveys and analysis of biological communities to identify the biological habitats. In addition, this paper adopts the definition of habitats established by Kostylev et al. (2001) as a “spatially defined area where the physical, chemical, and biological environment is distinctly different from the surrounding environment.” (PDF contains 31 pages)

Relevance:

70.00%

Publisher:

Abstract:

Without knowledge of basic seafloor characteristics, the ability to address any number of critical marine and/or coastal management issues is diminished. For example, management and conservation of essential fish habitat (EFH), a requirement mandated by federally guided fishery management plans (FMPs), requires among other things a description of habitats for federally managed species. Although the attributes important to habitat are numerous, the tools currently available cannot efficiently and effectively describe many of them, especially at the scales required. However, several characteristics of seafloor morphology are readily obtainable at multiple scales and can serve as useful descriptors of habitat. Recent advances in acoustic technology, such as multibeam echosounding (MBES), can provide remote indication of surficial sediment properties such as texture, hardness, or roughness, and further permit highly detailed renderings of seafloor morphology. With acoustic-based surveys providing a relatively efficient method of data acquisition, there is a need for efficient and reproducible automated segmentation routines to process the data. Using MBES data collected by the Olympic Coast National Marine Sanctuary (OCNMS), and through a contracted seafloor survey, we expanded on the techniques of Cutter et al. (2003) to describe an objective, repeatable process that uses parameterized local Fourier histogram (LFH) texture features to automate segmentation of surficial sediments from acoustic imagery using a maximum likelihood decision rule. Sonar signatures and classification performance were evaluated using video imagery obtained from a towed camera sled. Segmented raster images were converted to polygon features and attributed using a hierarchical deep-water marine benthic classification scheme (Greene et al. 1999) for use in a geographical information system (GIS). (PDF contains 41 pages.)
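
A minimal sketch of a maximum-likelihood decision rule over texture features, assuming synthetic class statistics in place of trained local Fourier histogram (LFH) features:

```python
# Minimal sketch: Gaussian maximum-likelihood classification of per-pixel
# feature vectors. Class names, means and covariances are synthetic.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(7)
classes = {                                # per-class mean and covariance
    "sand":   (np.array([0.2, 0.1]), np.eye(2) * 0.02),
    "gravel": (np.array([0.6, 0.4]), np.eye(2) * 0.05),
    "rock":   (np.array([0.9, 0.8]), np.eye(2) * 0.03),
}

def classify(feature_vec):
    # Assign the class with the highest likelihood (equal priors assumed).
    scores = {name: multivariate_normal(mu, cov).pdf(feature_vec)
              for name, (mu, cov) in classes.items()}
    return max(scores, key=scores.get)

pixel_features = rng.normal([0.58, 0.42], 0.05, size=(5, 2))
print([classify(f) for f in pixel_features])   # most should come out "gravel"
```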