888 results for Supervisory Control and Data Acquisition (SCADA) Topology
Abstract:
The design and development of a Bottom Pressure Recorder for a Tsunami Early Warning System are described here. The special requirements it must satisfy for the specific application, deployment on the ocean bed and pressure monitoring of the water column above, are dealt with; high-resolution data digitization and low circuit power consumption are typical ones. The implementation details of the data sensing and acquisition subsystem that meet these requirements are also brought out. The data processing part typically encompasses a tsunami detection algorithm that must detect an event of significance against a background of periodic and aperiodic noise signals. Such an algorithm and its simulation are presented, along with the results of sea trials carried out on the system off the Chennai coast. The high quality and fidelity of the data show that the system design is robust despite its low cost and, with suitable augmentations, is ready for full-fledged deployment on the ocean bed. (C) 2013 Elsevier Ltd. All rights reserved.
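The abstract does not spell out the detection algorithm, but a common approach for bottom-pressure records, similar in spirit to that used by NOAA's DART tsunameters, is to extrapolate the slowly varying tidal signal from recent samples and flag large residuals. The Python sketch below illustrates that idea only; the function name, window length, polynomial order, and threshold are illustrative assumptions, not values from the paper.

```python
import numpy as np

def detect_events(pressure, window=240, threshold=0.03):
    """Flag samples whose bottom pressure deviates from the predicted tide.

    A minimal sketch: fit a low-order polynomial to the recent record to
    model the slowly varying tide, extrapolate one step ahead, and flag
    residuals above `threshold`. The window length, polynomial order, and
    3 cm threshold are assumptions for illustration, not from the paper.
    """
    flags = np.zeros(len(pressure), dtype=bool)
    t = np.arange(window)
    for i in range(window, len(pressure)):
        # Model the background tide from the last `window` samples.
        coeffs = np.polyfit(t, pressure[i - window:i], deg=2)
        predicted = np.polyval(coeffs, window)  # one-step extrapolation
        # A large residual suggests a non-tidal event such as a tsunami.
        flags[i] = abs(pressure[i] - predicted) > threshold
    return flags
```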
Abstract:
This report is a detailed description of the data processing of NOAA/MLML spectroradiometry data. It introduces the MLML_DBASE programs, describes the assembly of diverse data files, and explains the general algorithms and how individual routines are used. Definitions of data structures are presented in the appendices. [PDF contains 48 pages]
Abstract:
Commercially available software packages for IBM PC compatibles are evaluated for use in data acquisition and processing work. Moss Landing Marine Laboratories (MLML) has acquired computers since 1978 for shipboard data acquisition (i.e., CTD, radiometric, etc.) and data processing. Hewlett-Packard desktops were used first, followed by a transition to DEC VAXstations, with software developed mostly by the author and others at MLML (Broenkow and Reaves, 1993; Feinholz and Broenkow, 1993; Broenkow et al., 1993). IBM PCs were at first very slow and limited in available software, so they were not used in the early days. Improved technology, such as higher-speed microprocessors and a wide range of commercially available software, makes use of the PC more reasonable today. MLML is making a transition toward using PCs for data acquisition and processing. The advantages are portability and the availability of outside support.
Abstract:
In this paper, a wireless sensor network mote hardware design and implementation are introduced for a building deployment application. The core of the mote design is based on the 8-bit AVR microcontroller ATmega1281 and the 2.4 GHz wireless communication chip CC2420. The module PCB is fabricated using stackable technology, which provides powerful configuration capability. Three main layers of size 25 mm² are structured to form the mote: the RF, sensor, and power layers. The sensors were selected carefully to meet both the building monitoring and design requirements. Besides the sensing capability, actuation and interfacing to external meters/sensors are provided to perform different management control and data recording tasks. Experiments show that the developed mote works effectively, giving stable data acquisition, and has good communication and power performance.
Abstract:
PURPOSE: X-ray computed tomography (CT) is widely used, both clinically and preclinically, for fast, high-resolution anatomic imaging; however, compelling opportunities exist to expand its use in functional imaging applications. For instance, spectral information combined with nanoparticle contrast agents enables quantification of tissue perfusion levels, while temporal information details cardiac and respiratory dynamics. The authors propose and demonstrate a projection acquisition and reconstruction strategy for 5D CT (3D + dual energy + time) which recovers spectral and temporal information without substantially increasing radiation dose or sampling time relative to anatomic imaging protocols.

METHODS: The authors approach the 5D reconstruction problem within the framework of low-rank and sparse matrix decomposition. Unlike previous work on rank-sparsity constrained CT reconstruction, the authors establish an explicit rank-sparse signal model to describe the spectral and temporal dimensions. The spectral dimension is represented as a well-sampled time- and energy-averaged image plus regularly undersampled principal components describing the spectral contrast. The temporal dimension is represented as the same time- and energy-averaged reconstruction plus contiguous, spatially sparse, and irregularly sampled temporal contrast images. Using a nonlinear, image-domain filtration approach that the authors refer to as rank-sparse kernel regression, they transfer image structure from the well-sampled time- and energy-averaged reconstruction to the spectral and temporal contrast images. This regularization strategy strictly constrains the reconstruction problem while approximately separating the temporal and spectral dimensions. Separability results in a highly compressed representation of the 5D data in which projections are shared between the temporal and spectral reconstruction subproblems, enabling substantial undersampling. The authors solved the 5D reconstruction problem using the split Bregman method and GPU-based implementations of backprojection, reprojection, and kernel regression. Using a preclinical mouse model, the authors apply the proposed algorithm to study myocardial injury following radiation treatment of breast cancer.

RESULTS: Quantitative 5D simulations are performed using the MOBY mouse phantom. Twenty data sets (ten cardiac phases, two energies) are reconstructed with 88 μm isotropic voxels from 450 total projections acquired over a single 360° rotation. In vivo 5D myocardial injury data sets acquired in two mice injected with gold and iodine nanoparticles are also reconstructed with 20 data sets per mouse using the same acquisition parameters (dose: ∼60 mGy). For both the simulations and the in vivo data, the reconstruction quality is sufficient to perform material decomposition into gold and iodine maps, to localize the extent of myocardial injury (gold accumulation), and to measure cardiac functional metrics (vascular iodine). This 5D CT imaging protocol represents a 95% reduction in radiation dose per cardiac phase and energy and a 40-fold decrease in projection sampling time relative to the authors' standard imaging protocol.

CONCLUSIONS: The 5D CT data acquisition and reconstruction protocol efficiently exploits the rank-sparse nature of spectral and temporal CT data to provide high-fidelity reconstruction results without increased radiation dose or sampling time.
Abstract:
We recently developed an approach for testing the accuracy of network inference algorithms by applying them to biologically realistic simulations with known network topology. Here, we seek to determine the degree to which the network topology and data sampling regime influence the ability of our Bayesian network inference algorithm, NETWORKINFERENCE, to recover gene regulatory networks. NETWORKINFERENCE performed well at recovering feedback loops and multiple targets of a regulator with small amounts of data, but required more data to recover multiple regulators of a gene. When collecting the same number of data samples at different intervals from the system, the best recovery was produced by sampling intervals long enough that sampling covered the propagation of regulation through the network, but not so long that the intervals missed the internal dynamics. These results further elucidate the possibilities and limitations of network inference based on biological data.
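The sampling-interval effect reported here can be reproduced in a toy setting. The Python sketch below, an illustrative construction rather than the NETWORKINFERENCE algorithm, simulates a two-gene cascade in which the target tracks its regulator after a five-step delay, then scores how well a lagged correlation recovers the link from a fixed budget of samples taken at different intervals; recovery collapses once the interval greatly exceeds the regulatory delay. All names and parameter values are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-gene cascade: regulator A is an AR(1) process and target B
# tracks A after a fixed 5-step regulatory delay. Purely illustrative;
# this is not the NETWORKINFERENCE algorithm.
n_steps, delay, phi = 12500, 5, 0.7
a = np.zeros(n_steps)
for t in range(1, n_steps):
    a[t] = phi * a[t - 1] + rng.normal()
b = np.roll(a, delay) + 0.2 * rng.normal(size=n_steps)
b[:delay] = 0.0

def best_lagged_corr(a, b, interval, n_samples=100, max_lag=6):
    """Best |correlation| between A and lagged B from a fixed budget of
    n_samples observations taken every `interval` time steps."""
    a_s = a[::interval][:n_samples]
    b_s = b[::interval][:n_samples]
    return max(abs(np.corrcoef(a_s[:len(a_s) - k], b_s[k:])[0, 1])
               for k in range(1, max_lag + 1))

# Recovery degrades once the interval greatly exceeds the delay.
for interval in (1, 5, 25, 125):
    print(interval, round(best_lagged_corr(a, b, interval), 2))
```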
Abstract:
Type 1 diabetes (T1DM) is associated with an increased risk of macrovascular complications. We examined longitudinal associations of serum conventional lipids and nuclear magnetic resonance (NMR)-determined lipoprotein subclasses with carotid intima-media thickness (IMT) in adults with T1DM (n=455) enrolled in the Diabetes Control and Complications Trial (DCCT). Data on serum lipids and lipoproteins were collected at DCCT baseline (1983-89) and were correlated with common and internal carotid IMT determined by ultrasonography during the observational follow-up of the DCCT, the Epidemiology of Diabetes Interventions and Complications (EDIC) study, at EDIC 'Year 1' (1994-1996) and EDIC 'Year 6' (1998-2000). This article contains data on the associations of DCCT baseline lipoprotein profiles (NMR-based VLDL and chylomicron, IDL/LDL, and HDL subclasses, and 'conventional' total, LDL-, HDL-, and non-HDL-cholesterol and triglycerides) with carotid IMT at EDIC Years 1 and 6, stratified by gender. The data are supplemental to our original research article describing detailed associations of DCCT baseline lipids and lipoprotein profiles with EDIC Year 12 carotid IMT (Basu et al., in press) [1].
Abstract:
The implementation of an accurate and reliable data acquisition system is the first step towards developing a good control system. The data acquisition system should provide valuable data readings, not only for control purposes but also for applications in different research areas.