896 results for sensor grid database system
Abstract:
We experimentally demonstrated a highly sensitive twist sensor system based on a 45° and an 81° tilted fibre grating (TFG). The 81°-TFG exhibits a set of dual peaks caused by the birefringence induced by its extremely tilted structure. When the 81°-TFG is subjected to twist, the coupling to the two peaks interchanges, providing a mechanism to measure and monitor the twist. We have investigated the performance of the sensor system with three interrogation methods (spectral, power measurement and voltage measurement). The experimental results clearly show that the 81°-TFG and the 45°-TFG can be combined to form a full-fibre twist sensor system capable of measuring not just the magnitude but also recognising the direction of the applied twist.
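A minimal sketch of the power-measurement interrogation idea described above, assuming twist is read from the normalised difference of the two 81°-TFG peak powers; the calibration constant, sign convention and function name are hypothetical, not values from the paper.

```python
# Hypothetical power-measurement interrogation for the dual-peak 81deg-TFG
# twist sensor. The linear calibration and all constants are illustrative
# assumptions, not from the paper.

def twist_from_peak_powers(p1_mw: float, p2_mw: float,
                           k_cal: float = 25.0) -> float:
    """Estimate twist (deg) from the two peak powers.

    As the fibre is twisted, coupling transfers between the two
    birefringence-induced peaks, so the normalised power difference
    (p1 - p2) / (p1 + p2) changes sign with twist direction and grows
    with twist magnitude. k_cal (deg per unit ratio) is an assumed
    constant obtained from a spectral calibration run.
    """
    ratio = (p1_mw - p2_mw) / (p1_mw + p2_mw)
    return k_cal * ratio

# Example: peak 1 stronger than peak 2 -> positive (e.g. clockwise) twist.
print(twist_from_peak_powers(1.2, 0.8))   # ~ +5.0 deg
print(twist_from_peak_powers(0.8, 1.2))   # ~ -5.0 deg, opposite direction
```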
Abstract:
We present a compact, portable and low-cost generic interrogation strain sensor system using a fibre Bragg grating configured in transmission mode with a vertical-cavity surface-emitting laser (VCSEL) light source and a GaAs photodetector embedded in a polymer skin. The photocurrent value is read and stored by a microcontroller. In addition, the photocurrent data are sent via Bluetooth to a computer or tablet device that can present the live data in a real-time graph. With a matched grating and VCSEL, the system is able to automatically scan and lock the VCSEL to the most sensitive edge of the grating. Commercially available VCSEL and photodetector chips are thinned down to 20 µm and integrated in an ultra-thin flexible optical foil using several thin-film deposition steps. A dedicated micro-mirror plug is fabricated to couple the driving optoelectronics to the fibre sensors. The resulting optoelectronic package can be embedded in a thin, planar sensing sheet whose host material is a flexible and stretchable polymer. The result is a fully embedded fibre sensing system - a photonic skin. Further investigations are currently being carried out to determine the stability and robustness of the embedded optoelectronic components. © 2012 Society of Photo-Optical Instrumentation Engineers (SPIE).
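A sketch of how such an automatic scan-and-lock routine could look, assuming the VCSEL wavelength is tuned via drive current and the lock point is where the transmitted power changes fastest; the hardware-driver callables and the simulated grating response below are assumptions, not the paper's implementation.

```python
import numpy as np

def scan_and_lock(set_vcsel_current, read_photocurrent,
                  currents_ma=np.linspace(1.0, 5.0, 201)):
    """Sweep the drive current, then lock to the steepest grating edge.

    set_vcsel_current and read_photocurrent are hypothetical stand-ins
    for the microcontroller's hardware drivers.
    """
    response = []
    for i_ma in currents_ma:
        set_vcsel_current(i_ma)             # tunes the emission wavelength
        response.append(read_photocurrent())
    slope = np.gradient(np.asarray(response), currents_ma)
    i_lock = currents_ma[np.argmax(np.abs(slope))]  # most sensitive point
    set_vcsel_current(i_lock)
    return i_lock

# Demo with a simulated Gaussian FBG transmission notch instead of hardware.
center, width = 3.0, 0.5
state = {"i": 0.0}
set_i = lambda i: state.update(i=i)
read_pd = lambda: 1.0 - 0.8 * np.exp(-((state["i"] - center) / width) ** 2)
print(scan_and_lock(set_i, read_pd))   # locks on a notch edge, ~2.65 or ~3.35 mA
```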
Abstract:
In this paper, we report a simple fibre laser torsion sensor system using an intracavity tilted fibre grating as a torsion-encoded loss filter. When the grating is subjected to twist, it induces loss in the cavity, affecting the laser oscillation build-up time. By measuring the build-up time, both the twist direction and the twist angle on the grating can be monitored. Using a low-cost photodiode and a two-channel digital oscilloscope, we have characterised the torsion sensing capability of this fibre laser system and obtained a torsion sensitivity of ~412 µs/(rad/m) over the dynamic range from -150° to +150°.
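A worked example of this interrogation, using the reported sensitivity of ~412 µs/(rad/m); the zero-twist baseline build-up time and the sign convention are illustrative assumptions.

```python
# Build-up-time interrogation for the fibre laser torsion sensor.
SENSITIVITY_US_PER_RAD_PER_M = 412.0   # reported in the abstract
T0_US = 150.0                          # assumed build-up time at zero twist

def twist_rate_from_buildup(t_us: float) -> float:
    """Twist rate in rad/m from a measured laser build-up time (us).

    A build-up time longer than the zero-twist baseline maps to one
    twist direction and a shorter one to the other, so both angle and
    direction come from a single oscilloscope measurement.
    """
    return (t_us - T0_US) / SENSITIVITY_US_PER_RAD_PER_M

print(twist_rate_from_buildup(562.0))   # +1.0 rad/m
print(twist_rate_from_buildup(47.0))    # -0.25 rad/m, opposite direction
```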
Abstract:
A low-cost fiber optic sensor system based on multimode fiber and an LED light source is presented. A multimode fiber Bragg grating (MMFBG) element is used as a strain sensor. In a matched grating scheme, an MMFBG similar to the sensing one was used as a reference in the receiving unit. For detection of large wavelength shifts, we demonstrated the feasibility of MMFBG wavelength detection using a single-mode fiber fused coupler as an edge filter. The high cost normally associated with wavelength interrogators for single-mode fiber FBG sensors was overcome by using a low-cost multimode fiber-pigtailed LED light source. The multimode fiber sensing system has the potential to retain many of the advantages of its single-mode FBG sensor system counterparts. The MMFBG sensing schemes could be used for short-distance, high-sensitivity, high-speed strain, temperature and acoustic sensing applications.
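A minimal sketch of edge-filter interrogation of this kind, assuming the coupler's splitting ratio varies roughly linearly with wavelength near its spectral edge; the slope and rest-ratio calibration values below are invented for illustration, not from the paper.

```python
# Fused-coupler edge-filter readout of the MMFBG wavelength shift.
EDGE_SLOPE_PER_NM = 0.12     # assumed change in arm ratio per nm
RATIO_AT_REST = 0.50         # assumed arm ratio at zero strain

def wavelength_shift_nm(p_arm1_mw: float, p_arm2_mw: float) -> float:
    """Wavelength shift of the MMFBG reflection, in nm.

    Using the ratio of the two coupler output arms cancels power
    fluctuations of the low-cost LED source, leaving only the
    wavelength-dependent splitting of the edge filter.
    """
    ratio = p_arm1_mw / (p_arm1_mw + p_arm2_mw)
    return (ratio - RATIO_AT_REST) / EDGE_SLOPE_PER_NM

print(wavelength_shift_nm(0.56, 0.44))   # ~ +0.5 nm shift
```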
Abstract:
A hybrid WDM/TDM-enabled, microstructure-based optical fiber sensor network with large capacity is proposed. Assisted by a Fabry-Perot filter, a demodulation system with a high scanning speed of 500 Hz and a wavelength resolution of less than 4.91 pm is realized. © OSA 2015.
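A minimal sketch of scanning Fabry-Perot demodulation of the kind used here: each filter sweep maps wavelength to time, and sensor wavelengths are recovered as peaks in the photodetector trace. The scan range, threshold and peak logic are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def peak_wavelengths_nm(trace, lam_start=1525.0, lam_stop=1565.0,
                        threshold=0.5):
    """Recover sensor wavelengths from one FP-filter sweep."""
    trace = np.asarray(trace)
    lam = np.linspace(lam_start, lam_stop, trace.size)
    above = trace > threshold * trace.max()
    # a sample is a peak if it exceeds both neighbours and the threshold
    peaks = (trace[1:-1] > trace[:-2]) & (trace[1:-1] > trace[2:]) & above[1:-1]
    return lam[1:-1][peaks]

# Demo: two simulated FBG reflections at 1540 nm and 1550 nm.
lam = np.linspace(1525.0, 1565.0, 4000)
trace = np.exp(-((lam - 1540) / 0.1) ** 2) + np.exp(-((lam - 1550) / 0.1) ** 2)
print(peak_wavelengths_nm(trace))   # ~ [1540. 1550.]
```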
Abstract:
ACM Computing Classification System (1998): H.5.2, H.2.8, J.2, H.5.3.
Abstract:
Database design is a difficult problem for non-expert designers. It is desirable to assist such designers during the problem-solving process by means of a knowledge-based (KB) system. A number of prototype KB systems have been proposed; however, they have many shortcomings. Few have incorporated sufficient expertise in modeling relationships, particularly higher-order relationships. There has been no empirical study that experimentally tested the effectiveness of any of these KB tools. The problem-solving behavior of non-experts, whom the systems were intended to assist, has not been one of the bases for system design. In this project, a consulting system for conceptual database design that addresses the above shortcomings was developed and empirically validated. The system incorporates (a) findings on why non-experts commit errors and (b) heuristics for modeling relationships. Two approaches to knowledge-base implementation - system restrictiveness and decisional guidance - were used and compared in this project. The Restrictive approach is proscriptive and limits the designer's choices at various design phases by forcing him/her to follow a specific design path. The Guidance approach, which is less restrictive, provides context-specific, informative and suggestive guidance throughout the design process. The main objectives of the study are to evaluate (1) whether the knowledge-based system is more effective than a system without the knowledge base and (2) which knowledge-implementation strategy - restrictive or guidance - is more effective. To evaluate the effectiveness of the knowledge base itself, the two systems were compared with a system that does not incorporate the expertise (Control). The experimental procedure involved student subjects solving a task without using the system (pre-treatment task) and another task using one of the three systems (experimental task). The experimental task scores of those subjects who performed satisfactorily in the pre-treatment task were analyzed. The results are: (1) the knowledge-based approach to database design support led to more accurate solutions than the control system; (2) there was no significant difference between the two KB approaches; (3) the Guidance approach led to the best performance; and (4) the subjects perceived the Restrictive system to be easier to use than the Guidance system.
Abstract:
This research presents several components encompassing the scope of data partitioning and replication management in a distributed GIS database. Modern Geographic Information System (GIS) databases are often large and complicated, so data partitioning and replication management problems need to be addressed in the development of an efficient and scalable solution. Part of the research is to study the patterns of geographical raster data processing and to propose algorithms that improve the availability of such data. These algorithms and approaches target the granularity of geographic data objects as well as data partitioning in geographic databases, to achieve high data availability and Quality of Service (QoS) considering distributed data delivery and processing. To achieve this goal, a dynamic, real-time approach for mosaicking digital images of different temporal and spatial characteristics into tiles is proposed. This dynamic approach reuses digital images on demand and generates mosaicked tiles only for the required region, according to the user's requirements such as resolution, temporal range and target bands, to reduce redundancy in storage and to utilize available computing and storage resources more efficiently. Another part of the research pursued methods for efficient acquisition of GIS data from external heterogeneous databases and Web services, as well as end-user GIS data delivery enhancements, automation and 3D virtual-reality presentation. Vast numbers of computing, network and storage resources on the Internet are idle or not fully utilized. The proposed "Crawling Distributed Operating System" (CDOS) approach employs such resources and creates benefits for the hosts that lend their CPU, network and storage resources to be used in the GIS database context. The results of this dissertation demonstrate effective ways to develop a highly scalable GIS database. The approach developed in this dissertation has resulted in the creation of the TerraFly GIS database, which is used by the US government, researchers and the general public to facilitate Web access to remotely sensed imagery and GIS vector information.
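A sketch of the on-demand tile-selection step implied by the mosaicking approach above, assuming a simplified image catalog; the data structures, field names and sort order are hypothetical reductions of the described pipeline, not the TerraFly code.

```python
from dataclasses import dataclass

@dataclass
class SourceImage:
    bbox: tuple          # (xmin, ymin, xmax, ymax)
    acquired: int        # acquisition time, e.g. unix epoch seconds
    resolution_m: float  # ground resolution in metres per pixel
    bands: frozenset     # spectral bands available in this image

def overlaps(a, b):
    """True if two (xmin, ymin, xmax, ymax) boxes intersect."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def select_sources(catalog, region, t_range, max_res_m, bands):
    """Pick candidate images for one tile, sharpest and newest first."""
    hits = [img for img in catalog
            if overlaps(img.bbox, region)
            and t_range[0] <= img.acquired <= t_range[1]
            and img.resolution_m <= max_res_m
            and bands <= img.bands]
    return sorted(hits, key=lambda i: (i.resolution_m, -i.acquired))

# Demo with an invented two-image catalog and tile request.
catalog = [
    SourceImage((0, 0, 10, 10), 1300, 1.0, frozenset({"red", "nir"})),
    SourceImage((5, 5, 15, 15), 1400, 0.5, frozenset({"red", "nir"})),
]
print(select_sources(catalog, region=(6, 6, 9, 9), t_range=(1200, 1500),
                     max_res_m=1.0, bands=frozenset({"red"})))
```

A mosaicker would then composite the tile from this sorted list, stopping once the requested extent is covered, so storage and computation are spent only on regions users actually request.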
Abstract:
With the recent explosion in the complexity and amount of digital multimedia data, there has been a huge impact on the operations of various organizations in areas such as government services, education, medical care, business and entertainment. To satisfy the growing demand for multimedia data management systems, an integrated framework called DIMUSE is proposed and deployed for distributed multimedia applications to offer a full scope of multimedia-related tools and provide appealing experiences for the users. This research mainly focuses on video database modeling and retrieval by addressing a set of core challenges. First, a comprehensive multimedia database modeling mechanism called Hierarchical Markov Model Mediator (HMMM) is proposed to model high-dimensional media data, including video objects, low-level visual/audio features, and historical access patterns and frequencies. The associated retrieval and ranking algorithms are designed to support not only general queries but also complicated temporal event pattern queries. Second, system training and learning methodologies are incorporated so that user interests are mined efficiently to improve retrieval performance. Third, video clustering techniques are proposed to continuously increase searching speed and accuracy by architecting a more efficient multimedia database structure. A distributed video management and retrieval system is designed and implemented to demonstrate the overall performance. The proposed approach is further customized for a mobile-based video retrieval system to address the perception subjectivity issue by considering the individual user's profile. Moreover, to deal with security and privacy issues and concerns in distributed multimedia applications, DIMUSE also incorporates a practical framework called SMARXO, which supports multilevel multimedia security control. SMARXO efficiently combines role-based access control (RBAC), XML and an object-relational database management system (ORDBMS) to achieve proficient security control. A distributed multimedia management system named DMMManager (Distributed MultiMedia Manager) is developed with the proposed DIMUSE framework to support multimedia capturing, analysis, retrieval, authoring and presentation in a single framework.
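An illustrative reduction of the role-based part of such multilevel control: roles hold permissions on multimedia objects at different granularities (video, scene, shot). The roles and permission table below are invented for the example, not SMARXO's actual schema.

```python
# Invented role-permission table at three multimedia granularities.
ROLE_PERMS = {
    "admin":   {("video", "read"), ("video", "write"),
                ("scene", "read"), ("shot", "read")},
    "analyst": {("video", "read"), ("scene", "read")},
    "guest":   {("video", "read")},
}

def can_access(role: str, level: str, action: str) -> bool:
    """True if the role holds the (level, action) permission."""
    return (level, action) in ROLE_PERMS.get(role, set())

print(can_access("analyst", "scene", "read"))   # True
print(can_access("guest", "shot", "read"))      # False: finer granularity denied
```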
Abstract:
The purpose of this research is to develop design considerations for environmental monitoring platforms for the detection of hazardous materials using System-on-a-Chip (SoC) design. The design considerations focus on improving key areas such as: (1) sampling methodology; (2) context awareness; and (3) sensor placement. These design considerations for environmental monitoring platforms using wireless sensor networks (WSN) are applied to the detection of methylmercury (MeHg) and the environmental parameters affecting its formation (methylation) and deformation (demethylation). The sampling methodology investigates a proof of concept for the monitoring of MeHg using three primary components: (1) chemical derivatization; (2) preconcentration using the purge-and-trap (P&T) method; and (3) sensing using Quartz Crystal Microbalance (QCM) sensors. This study focuses on the measurement of inorganic mercury (Hg) (e.g., Hg2+) and applies lessons learned to organic Hg (e.g., MeHg) detection. Context awareness of a WSN and its sampling strategies is enhanced by using spatial analysis techniques, namely geostatistical analysis (i.e., classical variography and ordinary point kriging), to help predict the phenomena of interest at unmonitored locations (i.e., locations without sensors). This aids in making more informed decisions on control of the WSN (e.g., communications strategy, power management, resource allocation, sampling rate and strategy, etc.). This methodology improves the precision of controllability by adding potentially significant information about unmonitored locations. Two types of sensors are investigated in this study for near-optimal placement in a WSN: (1) environmental (e.g., humidity, moisture, temperature, etc.) and (2) visual (e.g., camera) sensors. The near-optimal placement of environmental sensors is found using a strategy that minimizes the variance of the spatial analysis based on randomly chosen points representing the sensor locations. Spatial analysis is performed using geostatistical analysis, and optimization occurs with Monte Carlo analysis. Visual sensor placement is accomplished for omnidirectional cameras operating in a WSN using an optimal placement metric (OPM), which is calculated for each grid point based on line-of-sight (LOS) in a defined number of directions, with known obstacles taken into consideration. Optimal areas for camera placement are determined from the areas generating the largest OPMs. Statistical analysis is performed using Monte Carlo analysis with varying numbers of obstacles and cameras in a defined space.
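A minimal sketch of an OPM-style calculation, assuming the metric is the total unobstructed line-of-sight distance over a fixed set of ray directions on an occupancy grid; the grid, number of directions and scoring are assumptions about the metric's form, not the dissertation's exact definition.

```python
import math

def opm(grid, x, y, n_dirs=8, max_range=50):
    """Sum of clear line-of-sight steps from (x, y) over n_dirs rays."""
    rows, cols = len(grid), len(grid[0])
    score = 0
    for k in range(n_dirs):
        ang = 2 * math.pi * k / n_dirs
        for r in range(1, max_range):
            i = int(round(y + r * math.sin(ang)))
            j = int(round(x + r * math.cos(ang)))
            if not (0 <= i < rows and 0 <= j < cols) or grid[i][j]:
                break            # left the grid or hit an obstacle
            score += 1
    return score

# Demo: 1 marks an obstacle; the most open free cell gets the highest OPM.
grid = [[0] * 10 for _ in range(10)]
grid[4][6] = 1
best = max(((x, y) for x in range(10) for y in range(10) if not grid[y][x]),
           key=lambda p: opm(grid, *p))
print(best, opm(grid, *best))
```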
Abstract:
Database design is a difficult problem for non-expert designers. It is desirable to assist such designers during the problem-solving process by means of a knowledge-based (KB) system. Although a number of prototype KB systems have been proposed, there are many shortcomings. Firstly, few have incorporated sufficient expertise in modeling relationships, particularly higher-order relationships. Secondly, there does not seem to be any published empirical study that experimentally tested the effectiveness of any of these KB tools. Thirdly, the problem-solving behavior of non-experts, whom the systems were intended to assist, has not been one of the bases for system design. In this project, a consulting system for conceptual database design called CODA, which addresses the above shortcomings, was developed and empirically validated. More specifically, the CODA system incorporates (a) findings on why non-experts commit errors and (b) heuristics for modeling relationships. Two approaches to knowledge-base implementation were used and compared in this project, namely system restrictiveness and decisional guidance (Silver 1990). The Restrictive system uses a proscriptive approach and limits the designer's choices at various design phases by forcing him/her to follow a specific design path. The Guidance system approach, which is less restrictive, provides context-specific, informative and suggestive guidance throughout the design process. Both approaches are intended to prevent erroneous design decisions. The main objectives of the study are to evaluate (1) whether the knowledge-based system is more effective than the system without a knowledge base and (2) which approach to knowledge implementation - Restrictive or Guidance - is more effective. To evaluate the effectiveness of the knowledge base itself, the systems were compared with a system that does not incorporate the expertise (Control). An experimental procedure using student subjects was used to test the effectiveness of the systems. The subjects solved a task without using the system (pre-treatment task) and another task using one of the three systems, viz. Control, Guidance or Restrictive (experimental task). Analysis of the experimental task scores of those subjects who performed satisfactorily in the pre-treatment task revealed that the knowledge-based approach to database design support led to more accurate solutions than the control system. Of the two KB approaches, the Guidance approach was found to lead to better performance when compared to the Control system. The subjects perceived the Restrictive system to be easier to use than the Guidance system.
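An illustrative contrast of the two knowledge-implementation styles compared here, reduced to a single consultation step; the rule text and messages are invented. The point is only the interaction difference: the Restrictive step rejects a disallowed choice outright, while the Guidance step allows it but returns context-specific advice.

```python
# Invented heuristic standing in for the system's relationship-modeling rules.
HEURISTIC = ("A ternary relationship should be checked for decomposition "
             "into binary relationships before it is accepted.")

def restrictive_step(choice: str) -> str:
    """Proscriptive style: disallowed design moves are blocked."""
    if choice == "ternary_relationship":
        raise ValueError("Not allowed at this phase: " + HEURISTIC)
    return choice

def guidance_step(choice: str):
    """Decisional-guidance style: any move proceeds, with advice attached."""
    advice = HEURISTIC if choice == "ternary_relationship" else None
    return choice, advice

print(guidance_step("ternary_relationship"))
try:
    restrictive_step("ternary_relationship")
except ValueError as err:
    print(err)
```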
Abstract:
The Tara Oceans Expedition (2009-2013) sampled the world's oceans on board a 36 m long schooner, collecting environmental data and organisms from viruses to planktonic metazoans for later analyses using modern sequencing and state-of-the-art imaging technologies. Tara Oceans data are particularly suited to studying the genetic, morphological and functional diversity of plankton. The present data publication contains measurements from the Continuous Surface Sampling System (CSSS) made during one campaign of the Tara Oceans Expedition. Water was pumped at the front of the vessel from ~2 m depth, then de-bubbled and circulated to a Sea-Bird TSG temperature and conductivity sensor. System maintenance (instrument cleaning, flushing) was done approximately once a week and in port between successive legs. All data were stamped with a GPS position.