887 results for DATA-ACQUISITION SYSTEM
Abstract:
The seismic reflection prospecting technique is an important and widely used method in petroleum and coal exploration, and it has matured in every respect, from data acquisition through data processing to data interpretation. Seismic prospecting in metallic mines, however, and high-resolution seismic prospecting in particular, is still at the research and exploration stage. This paper first reviews the basic theory and the current state of research on seismic reflection in metallic mines, and explains in detail the basic theory, improvements, convergence speed and capability of the integrated global optimization method. It then introduces the basic theory, implementation and practical performance of the vector noise-suppression algorithm. Building on the application of the integrated global optimization method to static correction and on the vector noise-suppression algorithm, we carefully processed the seismic data of the Tongling metallic mine and describe the processing flow, the key steps and the processing results. On the basis of these results, we analyse the main reflection characteristics, the geological interpretation, the reflection structure of the upper crust, the spatial distribution of the Wutong Formation, the spatial shape of some lithological bodies and the contact relations between horizons that were revealed.
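The abstract does not give implementation details of the integrated global optimization method. As one hedged illustration of how a global optimizer can be applied to residual static correction, the sketch below uses simulated annealing to search for per-trace time shifts that maximize stack power; this is a standard approach to the problem, not necessarily the paper's own method, and the synthetic data, shift range and cooling schedule are illustrative placeholders.

```python
import numpy as np

def stack_power(traces, shifts):
    """Stack-power objective: energy of the sum of time-shifted traces."""
    stacked = np.zeros(traces.shape[1])
    for trace, s in zip(traces, shifts):
        stacked += np.roll(trace, s)          # integer-sample static shift
    return np.sum(stacked ** 2)

def anneal_statics(traces, max_shift=10, n_iter=5000, t0=1.0, cooling=0.999, seed=0):
    """Simulated-annealing search for residual static shifts (toy sketch)."""
    rng = np.random.default_rng(seed)
    shifts = np.zeros(len(traces), dtype=int)
    current = best = stack_power(traces, shifts)
    best_shifts = shifts.copy()
    temp = t0
    for _ in range(n_iter):
        trial = shifts.copy()
        i = rng.integers(len(traces))                      # perturb one trace's shift
        trial[i] = rng.integers(-max_shift, max_shift + 1)
        value = stack_power(traces, trial)
        # Metropolis rule: always accept improvements, sometimes accept worse moves
        if value > current or rng.random() < np.exp((value - current) / temp):
            shifts, current = trial, value
            if value > best:
                best, best_shifts = value, trial.copy()
        temp *= cooling
    return best_shifts

# Hypothetical usage on synthetic data: 24 traces of 500 samples each
traces = np.random.default_rng(1).normal(size=(24, 500))
print(anneal_statics(traces))
```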
Abstract:
Passive monitoring of large sites typically requires coordination between multiple cameras, which in turn requires methods for automatically relating events between distributed cameras. This paper tackles the problem of self-calibration of multiple cameras which are very far apart, using feature correspondences to determine the camera geometry. The key problem is finding such correspondences. Since the camera geometry and photometric characteristics vary considerably between images, one cannot use brightness and/or proximity constraints. Instead we apply planar geometric constraints to moving objects in the scene in order to align the scene's ground plane across multiple views. We do not assume synchronized cameras, and we show that enforcing geometric constraints enables us to align the tracking data in time. Once we have recovered the homography which aligns the planar structure in the scene, we can compute from the homography matrix the 3D position of the plane and the relative camera positions. This in turn enables us to recover a homography matrix which maps the images to an overhead view. We demonstrate this technique in two settings: a controlled lab setting where we test the effects of errors in internal camera calibration, and an uncontrolled, outdoor setting in which the full procedure is applied to external camera calibration and ground plane recovery. In spite of noise in the internal camera parameters and image data, the system successfully recovers both planar structure and relative camera positions in both settings.
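As a hedged illustration of the homography-recovery step, the sketch below estimates the 3×3 homography H from ground-plane point correspondences using the standard direct linear transform (DLT). The paper's pipeline obtains such correspondences from tracked moving objects; the correspondences and the known test homography here are hypothetical.

```python
import numpy as np

def estimate_homography(src, dst):
    """Direct linear transform: fit H such that dst ~ H @ src (homogeneous)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    A = np.asarray(rows)
    _, _, vt = np.linalg.svd(A)
    H = vt[-1].reshape(3, 3)          # null vector of A, reshaped to 3x3
    return H / H[2, 2]                # normalize so H[2,2] == 1

def apply_homography(H, points):
    """Map 2D points through H and convert back from homogeneous coordinates."""
    pts = np.hstack([points, np.ones((len(points), 1))])
    mapped = (H @ pts.T).T
    return mapped[:, :2] / mapped[:, 2:]

# Hypothetical check: map points through a known homography and recover it
H_true = np.array([[1.2, 0.1, 5.0], [0.0, 0.9, -3.0], [1e-3, 2e-3, 1.0]])
src = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0], [0.5, 0.5]])
dst = apply_homography(H_true, src)
H = estimate_homography(src, dst)
print(np.round(H - H_true, 6))        # residuals should be ~0
```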
Abstract:
This report describes a computational system with which phonologists may describe a natural language in terms of autosegmental phonology, currently the most advanced theory pertaining to the sound systems of human languages. This system allows linguists to easily test autosegmental hypotheses against a large corpus of data. The system was designed primarily with tonal systems in mind, but also provides support for tree or feature matrix representation of phonemes (as in The Sound Pattern of English), as well as syllable structures and other aspects of phonological theory. Underspecification is allowed, and trees may be specified before, during, and after rule application. The association convention is automatically applied, and other principles such as the conjunctivity condition are supported. The representation was designed so that rules are written in a form as close as possible to the existing conventions of autosegmental theory while remaining purely textual for maximum portability.
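As a hedged, simplified illustration of the kind of machinery the report describes, the sketch below applies a bare-bones version of the autosegmental association convention: tones and tone-bearing units (TBUs) are linked one-to-one left to right, leftover TBUs receive the final tone, and leftover tones dock onto the final TBU. A real system (including the one reported here) also handles pre-linked associations, underspecification and rule ordering, which this toy omits; the example melody and syllables are hypothetical.

```python
def associate(tones, tbus):
    """Left-to-right, one-to-one association of tones to tone-bearing units,
    then spreading/docking of whatever is left over (toy illustration)."""
    links = []
    # Step 1: one-to-one, left to right
    for i in range(min(len(tones), len(tbus))):
        links.append((tones[i], tbus[i]))
    # Step 2: leftover TBUs receive the last tone (spreading)
    if len(tbus) > len(tones) and tones:
        links.extend((tones[-1], tbu) for tbu in tbus[len(tones):])
    # Step 3: leftover tones dock onto the last TBU (contour formation)
    if len(tones) > len(tbus) and tbus:
        links.extend((tone, tbus[-1]) for tone in tones[len(tbus):])
    return links

# Hypothetical example: an HL melody over three syllables
print(associate(["H", "L"], ["ba", "na", "na"]))
# [('H', 'ba'), ('L', 'na'), ('L', 'na')]
```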
Abstract:
This thesis describes a system that synthesizes regularity-exposing attributes from large protein databases. After processing primary and secondary structure data, this system discovers an amino acid representation that captures what are thought to be the three most important amino acid characteristics (size, charge, and hydrophobicity) for tertiary structure prediction. A neural network trained using this 16-bit representation achieves a performance accuracy on the secondary structure prediction problem that is comparable to the one achieved by a neural network trained using the standard 24-bit amino acid representation. In addition, the thesis describes bounds on secondary structure prediction accuracy, derived using an optimal learning algorithm and the probably approximately correct (PAC) model.
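The abstract does not list the discovered 16-bit code. As a hedged sketch of the general idea, the fragment below encodes residues by approximate size, charge and hydrophobicity features rather than a one-hot (orthogonal) code, and builds sliding-window inputs of the kind typically fed to a secondary-structure network. The feature values, residue subset and window length are illustrative placeholders, not the thesis's learned representation.

```python
import numpy as np

# Illustrative per-residue features: [size, charge, hydrophobicity]
# Rough placeholder values for a handful of residues, not the thesis's code.
FEATURES = {
    "G": [0.0, 0.0, -0.4], "A": [0.1, 0.0, 1.8], "V": [0.3, 0.0, 4.2],
    "L": [0.4, 0.0, 3.8], "K": [0.5, +1.0, -3.9], "D": [0.3, -1.0, -3.5],
    "R": [0.6, +1.0, -4.5], "F": [0.6, 0.0, 2.8],
}

def window_encode(sequence, window=13):
    """Encode each position as the concatenated features of a centered window."""
    half = window // 2
    pad = [[0.0, 0.0, 0.0]]           # padding feature vector beyond the termini
    feats = [FEATURES.get(aa, [0.0, 0.0, 0.0]) for aa in sequence]
    padded = pad * half + feats + pad * half
    return np.array([np.concatenate(padded[i:i + window])
                     for i in range(len(sequence))])

X = window_encode("GAVLKDARFF")
print(X.shape)   # (10, 39): one 13 * 3-dimensional input vector per residue
```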
Abstract:
Direct-injection electrospray ionization mass spectrometry in combination with information-dependent data acquisition (IDA), using a triple-quadrupole/linear ion trap combination, allows high-throughput qualitative analysis of complex phospholipid species from child whole blood. In the IDA experiments, scans to detect specific head groups (precursor ion or neutral loss scans) were used as survey scans to detect phospholipid classes. An enhanced resolution scan was then used to confirm the mass assignments, and the enhanced product ion scan was implemented as a dependent scan to determine the composition of each phospholipid class. These survey and dependent scans were performed sequentially and repeated for the entire duration of analysis, thus providing the maximum information from a single injection. In this way, 50 different phospholipids belonging to the phosphatidylethanolamine, phosphatidylserine, phosphatidylinositol, phosphatidylcholine and sphingomyelin classes were identified in child whole blood. Copyright (C) 2005 John Wiley & Sons, Ltd.
Abstract:
Ferré, S. and King, R. D. (2004) BLID: an Application of Logical Information Systems in Bioinformatics. In P. Eklund (editor), 2nd International Conference on Formal Concept Analysis (ICFCA), Feb 2004. LNCS 2961, Springer.
Abstract:
Rowland, J.J. and Taylor, J. (2002). Adaptive denoising in spectral analysis by genetic programming. Proc. IEEE Congress on Evolutionary Computation (part of WCCI), May 2002. pp 133-138. ISBN 0-7803-7281-6
Abstract:
This paper presents the hardware design and implementation of a wireless sensor network mote for building-deployment applications. The core of the mote design is the 8-bit AVR microcontroller Atmega1281 together with the 2.4 GHz wireless communication chip CC2420. The module PCB is fabricated using stackable technology, which provides powerful configuration capability. The mote is formed from three main layers, each 25 mm² in size: an RF layer, a sensor layer and a power layer. The sensors were selected carefully to meet both the building-monitoring and the design requirements. Besides sensing, the mote provides actuation and interfaces to external meters and sensors for management, control and data-recording tasks. Experiments show that the developed mote works effectively, giving stable data acquisition and good communication and power performance.
Abstract:
Coastal zones and shelf-seas are important for tourism, commercial fishing and aquaculture. As a result, the importance of good water quality within these regions to support life is recognised worldwide and a number of international directives for monitoring them now exist. This paper describes the AlgaRisk water quality monitoring demonstration service that was developed and operated for the UK Environment Agency in response to the microbiological monitoring needs within the revised European Union Bathing Waters Directive. The AlgaRisk approach used satellite Earth observation to provide near-real time monitoring of microbiological water quality and a series of nested operational models (atmospheric and hydrodynamic-ecosystem) provided a forecast capability. For the period of the demonstration service (2008–2013) all monitoring and forecast datasets were processed in near-real time on a daily basis and disseminated through a dedicated web portal, with extracted data automatically emailed to agency staff. Near-real time data processing was achieved using a series of supercomputers and an Open Grid approach. The novel web portal and Java-based viewer enabled users to visualise and interrogate current and historical data. The system description, the algorithms employed and example results focussing on a case study of an incidence of the harmful algal bloom Karenia mikimotoi are presented. Recommendations and the potential exploitation of web services for future water quality monitoring services are discussed.
Abstract:
Available methods for measuring the impact of ocean acidification (OA) and leakage from carbon capture and storage (CCS) on marine sedimentary pH profiles are unsuitable for replicated experimental setups. To overcome this issue, a novel optical sensor application is presented, using off-the-shelf optode technology (MOPP). The application is validated using microprofiling during a CCS leakage experiment, where the impact of and recovery from a high CO2 plume were investigated in two types of natural marine sediment. MOPP offered user-friendliness, speed of data acquisition, robustness to sediment type, and a large sediment depth range. This ensemble of characteristics overcomes many of the challenges found with other pH measuring methods in OA and CCS research. The impact varied greatly between sediment types, depending on baseline pH variability and sediment permeability. Sedimentary pH profile recovery was quick, with profiles close to control conditions 24 h after the cessation of the leak. However, variability of pH within the finer sediment was still apparent 4 days into the recovery phase. Habitat characteristics therefore need to be considered to truly disentangle high CO2 perturbation impacts on benthic systems. Impacts on natural communities depend not only on the pH gradient caused by the perturbation, but also on other processes that outlive the perturbation, adding complexity to recovery.
Abstract:
We have used four telescopes at different longitudes to obtain near-continuous light-curve coverage of the star HD80606 as it was transited by its ~4-MJup planet. The observations were performed during the predicted transit windows around 2008 October 25 and 2009 February 14. Our data set is unique in that it simultaneously constrains the duration of the transit and the planet's period. Our Markov Chain Monte Carlo analysis of the light curves, combined with constraints from radial-velocity data, yields system parameters consistent with previously reported values. We find a planet-to-star radius ratio marginally smaller than previously reported, corresponding to a planet radius of Rp = 0.921 ± 0.036 RJup.
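As a hedged illustration of the relation the quoted radius ratio rests on, the transit flux deficit is approximately the square of the planet-to-star radius ratio, delta ≈ (Rp/Rs)². The sketch below fits that depth (together with mid-transit time and duration of a simple box-shaped transit) to a light curve with emcee; the synthetic data, flat priors and box model are illustrative only and far simpler than the paper's full Markov Chain Monte Carlo analysis.

```python
import numpy as np
import emcee

def box_transit(t, t0, duration, depth):
    """Toy transit model: unit flux with a flat dip of the given depth."""
    flux = np.ones_like(t)
    flux[np.abs(t - t0) < duration / 2.0] -= depth
    return flux

def log_prob(theta, t, flux, err):
    t0, duration, depth = theta
    if not (0.0 < duration < 2.0 and 0.0 < depth < 0.1):
        return -np.inf                               # simple flat priors
    model = box_transit(t, t0, duration, depth)
    return -0.5 * np.sum(((flux - model) / err) ** 2)

# Synthetic light curve (illustrative): depth 0.01, i.e. a radius ratio of ~0.1
rng = np.random.default_rng(0)
t = np.linspace(-1.0, 1.0, 400)
err = 0.002
flux = box_transit(t, 0.0, 0.5, 0.01) + rng.normal(0, err, t.size)

ndim, nwalkers = 3, 32
p0 = np.array([0.0, 0.5, 0.01]) + 1e-4 * rng.normal(size=(nwalkers, ndim))
sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob, args=(t, flux, err))
sampler.run_mcmc(p0, 2000)
samples = sampler.get_chain(discard=500, flat=True)
print(np.median(samples, axis=0))     # recovered t0, duration, depth
```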
Abstract:
A self-tuning filter is disclosed. The self-tuning filter includes a digital clocking signal and an input coupled to the digital clocking signal, whereby the input reads a value incident on the input when the digital clocking signal changes to a predetermined state. A clock-tunable filter is, furthermore, coupled to the digital clocking signal so that the frequency of the clock-tunable filter is adjusted in relation to a sampling frequency at which the digital clocking signal operates. The self-tuning filter may be applied to an input of a data acquisition unit and applied to an input having a variable sampling frequency. A method of controlling the frequency of a clock-tunable filter is also disclosed.
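The abstract's core idea, a filter whose response tracks the sampling clock, can be illustrated with a hedged software analogy: the sketch below redesigns a digital anti-alias low-pass whenever the acquisition sampling rate changes, keeping the cutoff at a fixed fraction of the Nyquist frequency. The SciPy-based design, the Butterworth topology and the 0.4 fraction are illustrative choices, not taken from the patent.

```python
import numpy as np
from scipy.signal import butter, lfilter

class ClockTunedFilter:
    """Low-pass filter whose cutoff follows the current sampling frequency."""

    def __init__(self, cutoff_fraction=0.4, order=4):
        self.cutoff_fraction = cutoff_fraction   # cutoff as a fraction of Nyquist
        self.order = order
        self.fs = None
        self.ba = None

    def retune(self, fs):
        """Recompute coefficients when the sampling clock changes."""
        cutoff_hz = self.cutoff_fraction * fs / 2.0
        self.ba = butter(self.order, cutoff_hz, btype="low", fs=fs)
        self.fs = fs

    def process(self, samples, fs):
        if fs != self.fs:                        # sampling clock changed: retune
            self.retune(fs)
        b, a = self.ba
        return lfilter(b, a, samples)

# Hypothetical usage with two different acquisition rates
filt = ClockTunedFilter()
rng = np.random.default_rng(0)
print(filt.process(rng.normal(size=1000), fs=10_000.0)[:3])
print(filt.process(rng.normal(size=1000), fs=48_000.0)[:3])
```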
Abstract:
Raman spectroscopy is a noninvasive, nondestructive tool for capturing multiplexed biochemical information across diverse molecular species including proteins, lipids, DNA, and mineralizations. Based on light scattering from molecules, cells, and tissues, it is possible to detect molecular fingerprints and discriminate between subtly different members of each biochemical class. Raman spectroscopy is ideal for detecting perturbations from the expected molecular structure such as those occurring during senescence and the modification of long-lived proteins by metabolic intermediates as we age. Here, we describe the sample preparation, data acquisition, signal processing, data analysis and interpretation involved in using Raman spectroscopy for detecting age-related protein modifications in complex biological tissues.
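As a hedged sketch of the signal-processing step mentioned in the chapter, the fragment below applies a common Raman preprocessing chain, Savitzky–Golay smoothing, polynomial baseline subtraction and area normalization, to a spectrum. The window lengths, polynomial orders and synthetic spectrum are illustrative choices, not the chapter's protocol.

```python
import numpy as np
from scipy.signal import savgol_filter

def preprocess_raman(wavenumbers, intensities,
                     smooth_window=11, smooth_order=3, baseline_order=5):
    """Smooth, subtract a polynomial baseline, and area-normalize a spectrum."""
    smoothed = savgol_filter(intensities, smooth_window, smooth_order)
    # Crude baseline: low-order polynomial fit to the whole smoothed spectrum
    coeffs = np.polyfit(wavenumbers, smoothed, baseline_order)
    corrected = smoothed - np.polyval(coeffs, wavenumbers)
    corrected -= corrected.min()                 # keep intensities non-negative
    step = wavenumbers[1] - wavenumbers[0]       # assume a uniform axis
    area = corrected.sum() * step
    return corrected / area if area > 0 else corrected

# Synthetic spectrum: two peaks on a sloping baseline plus noise (illustrative)
wn = np.linspace(600, 1800, 1200)
rng = np.random.default_rng(0)
spectrum = (np.exp(-((wn - 1004) / 8) ** 2) + 0.6 * np.exp(-((wn - 1655) / 15) ** 2)
            + 1e-4 * wn + rng.normal(0, 0.02, wn.size))
processed = preprocess_raman(wn, spectrum)
print(processed.max(), processed.sum() * (wn[1] - wn[0]))   # normalized to unit area
```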
Abstract:
The relationship between lameness and feeding behaviour in dairy cows is not yet fully understood. This study examined the effect of lameness on feeding behaviour at two points during lactation. Forty-five Holstein–Friesian dairy cows (average parity 3.3) were housed in cubicle accommodation after calving and fed a total mixed ration (TMR). At approximately 60 and 120 days post partum, 48 h of information on feeding behaviour (including number of meals eaten, meal duration, meal size and feeding rate) was collected for each animal using feed boxes fitted to a data recording system. At the same time points, locomotion scores were recorded for each cow as a measure of lameness (1.0 = sound to 4.5 = severely lame). Relationships between feeding behaviour and locomotion score were analysed using Residual Maximum Likelihood (REML) analysis. At both time points, cows with higher locomotion scores ate fewer (P < 0.001), larger meals (P < 0.001) and had a shorter total feeding time (P < 0.001). At day 60 post partum, an increase in locomotion score was associated with a decrease in dry matter intake (DMI) (P < 0.05), but at day 120 post partum no relationship was found between locomotion score and DMI. No relationship was found at either time point between locomotion score and mean meal duration or rate of feeding. The results of this study suggest that the effect of lameness on feeding behaviour in dairy cows does not remain constant across lactation.
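As a hedged sketch of the kind of REML analysis described, the fragment below fits a linear mixed model with locomotion score as a fixed effect and cow as a random intercept using statsmodels, which estimates variance components by REML. The data frame, column names, simulated values and model formula are hypothetical placeholders, not the study's actual dataset or model specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: repeated meal-size records and a locomotion score per cow
rng = np.random.default_rng(0)
n_cows, n_obs = 45, 4
data = pd.DataFrame({
    "cow": np.repeat(np.arange(n_cows), n_obs),
    "locomotion": np.repeat(rng.uniform(1.0, 4.5, n_cows), n_obs),
})
cow_effect = np.repeat(rng.normal(0, 0.3, n_cows), n_obs)   # random cow-level intercepts
data["meal_size_kg"] = (2.5 + 0.4 * data["locomotion"]
                        + cow_effect + rng.normal(0, 0.2, len(data)))

# Mixed model: fixed effect of locomotion score, random intercept per cow (REML)
model = smf.mixedlm("meal_size_kg ~ locomotion", data, groups=data["cow"])
result = model.fit(reml=True)
print(result.summary())
```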
Abstract:
Tissue microarrays (TMAs) represent a powerful method for undertaking large-scale tissue-based biomarker studies. While TMAs offer several advantages, there are a number of issues specific to their use which need to be considered when employing this method. Given the investment in TMA-based research, guidance on design and execution of experiments will be of benefit and should help researchers new to TMA-based studies to avoid known pitfalls. Furthermore, a consensus on quality standards for TMA-based experiments should improve the robustness and reproducibility of studies, thereby increasing the likelihood of identifying clinically useful biomarkers. In order to address these issues, the National Cancer Research Institute Biomarker and Imaging Clinical Studies Group organized a 1-day TMA workshop held in Nottingham in May 2012. The document herein summarizes the conclusions from the workshop. It includes guidance and considerations on all aspects of TMA-based research, including the pre-analytical stages of experimental design, the analytical stages of data acquisition, and the post-analytical stages of data analysis. A checklist is presented which can be used both for planning a TMA experiment and interpreting the results of such an experiment. For studies of cancer biomarkers, this checklist could be used as a supplement to the REMARK guidelines.