896 results for sensor grid database system
Abstract:
One of the most important advantages of database systems is that the underlying mathematics is rich enough to specify very complex operations with a small number of statements in the database language. This research covers an aspect of biological informatics, the marriage of information technology and biology, involving the study of real-world phenomena using virtual plants derived from L-system simulation. L-systems were introduced by Aristid Lindenmayer as a mathematical model of multicellular organisms. Little consideration has been given to the problem of persistent storage for these simulations, and current procedures for querying data generated by L-systems for scientific experiments, simulations and measurements are also inadequate. To address these problems, this paper presents a generic data-modeling process and tool (L-DBM) that bridges L-systems and database systems. The paper shows how L-system productions can be generically and automatically represented in database schemas and how a database can be populated from the L-system strings. It further describes the idea of pre-computing recursive structures in the data into derived attributes using compiler generation, and supplies a method for establishing a correspondence between biologists' terms and compiler-generated terms in a biologist's computing environment. Given a specific set of L-system productions and their declarations, the L-DBM can generate the corresponding schema, covering both simple terminology correspondences and complex recursive data attributes and relationships.
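As a rough illustration of the kind of mapping the abstract describes, the sketch below derives a string from a toy set of L-system productions and stores each module and its parent/child derivation links in a relational database. The productions, table names and schema layout are illustrative assumptions, not the actual L-DBM schema.

import sqlite3

# Hypothetical deterministic L-system (Lindenmayer's algae example).
productions = {"A": "AB", "B": "A"}
axiom = "A"
steps = 4

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE module (id INTEGER PRIMARY KEY, symbol TEXT, step INTEGER)")
conn.execute("CREATE TABLE derivation (parent_id INTEGER, child_id INTEGER)")

def insert_module(symbol, step):
    cur = conn.execute("INSERT INTO module (symbol, step) VALUES (?, ?)", (symbol, step))
    return cur.lastrowid

# Store the axiom, then rewrite step by step, recording which module in step n
# produced which modules in step n+1.
current = [(insert_module(s, 0), s) for s in axiom]
for step in range(1, steps + 1):
    nxt = []
    for parent_id, symbol in current:
        for child in productions.get(symbol, symbol):
            child_id = insert_module(child, step)
            conn.execute("INSERT INTO derivation VALUES (?, ?)", (parent_id, child_id))
            nxt.append((child_id, child))
    current = nxt

# Example query: how many modules of each symbol exist at the final step.
for row in conn.execute("SELECT symbol, COUNT(*) FROM module WHERE step = ? GROUP BY symbol", (steps,)):
    print(row)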
Abstract:
The number of location-aware mobile applications is increasing. However, these applications typically retrieve location only from the mobile device's GPS chip, which means that indoors or in denser environments they do not work properly. To provide location information everywhere, a pedestrian Inertial Navigation System (INS) is typically used, but such systems can have a large estimation error because, in order to make the system wearable, they rely on low-cost and low-power sensors. In this work a pedestrian INS is proposed in which force sensors are combined with accelerometer data to obtain a better detection of the stance phase of the human gait cycle, which leads to improvements in location estimation. Besides sensor fusion, an information fusion architecture is proposed, based on information from GPS and several inertial units placed on the pedestrian's body, which is used to learn the pedestrian's gait behavior and correct, in real time, the inertial sensor errors, thus improving location estimation.
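The stance-phase detection idea lends itself to a short sketch: the snippet below flags a sample as stance when the accelerometer magnitude is close to gravity and the foot force sensor reports load. The thresholds and the simple AND-combination are assumptions for illustration, not the exact fusion rule of the proposed INS.

import numpy as np

def detect_stance(accel_xyz, force, acc_tol=0.5, force_min=20.0):
    """accel_xyz: (N, 3) in m/s^2, force: (N,) in newtons. Returns a boolean stance mask."""
    acc_mag = np.linalg.norm(accel_xyz, axis=1)
    near_gravity = np.abs(acc_mag - 9.81) < acc_tol   # little dynamic acceleration
    foot_loaded = force > force_min                   # foot is carrying weight
    return near_gravity & foot_loaded

# During detected stance samples, a zero-velocity update (ZUPT) would typically
# be applied to bound the drift of the inertial navigation solution.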
Abstract:
The goal of this work was to move structural health monitoring (SHM) one step closer to being ready for mainstream use by the Iowa Department of Transportation (DOT) Office of Bridges and Structures. To meet this goal, the objective of this project was to implement a pilot multi-sensor continuous monitoring system on the Iowa Falls Arch Bridge such that autonomous data analysis, storage, and retrieval can be demonstrated. The challenge with this work was to develop the open channels for communication, coordination, and cooperation of various Iowa DOT offices that could make use of the data. In a way, the end product was to be something akin to a control system that would allow for real-time evaluation of the operational condition of a monitored bridge. Development and finalization of general hardware and software components for a bridge SHM system were investigated and completed. This development and finalization was framed around the demonstration installation on the Iowa Falls Arch Bridge. The hardware system focused on using off-the-shelf sensors that could be read in either “fast” or “slow” modes depending on the desired monitoring metric. As hoped, the installed system operated with very few problems. In terms of communications—in part due to the anticipated installation on the I-74 bridge over the Mississippi River—a hardline digital subscriber line (DSL) internet connection and grid power were used. During operation, this system would transmit data to a central server location where the data would be processed and then archived for future retrieval and use. The pilot monitoring system was developed for general performance evaluation purposes (construction, structural, environmental, etc.) such that it could be easily adapted to the Iowa DOT’s bridges and other monitoring needs. The system was developed allowing easy access to near real-time data in a format usable to Iowa DOT engineers.
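As a hedged sketch of the kind of acquisition loop such a system might run, the snippet below reads a set of channels in either a "fast" or a "slow" mode at different rates and buffers the records for transmission to a central server. The sensor names, rates and the read/upload placeholders are assumptions, not the actual Iowa DOT implementation.

import time, json

# Illustrative channel configuration: mode name and sampling rate in Hz.
SENSOR_MODES = {"strain_gauge_1": ("fast", 100.0),   # e.g. live structural response
                "temperature_1": ("slow", 0.1)}      # e.g. environmental trend

def read_sensor(name):
    return 0.0  # placeholder for the actual data-acquisition call

def acquire(duration_s=1.0):
    records = []
    start = time.time()
    while time.time() - start < duration_s:
        now = time.time()
        for name, (mode, rate_hz) in SENSOR_MODES.items():
            # Sample each channel no faster than its configured rate.
            last = acquire.last_sample.setdefault(name, 0.0)
            if now - last >= 1.0 / rate_hz:
                records.append({"sensor": name, "mode": mode, "t": now,
                                "value": read_sensor(name)})
                acquire.last_sample[name] = now
        time.sleep(0.001)
    return records

acquire.last_sample = {}

# The buffered records would then be serialized (e.g. as JSON) and sent over the
# bridge's internet connection to the central server for processing and archival.
print(json.dumps(acquire(0.05)[:3], indent=2))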
Abstract:
Graduate Program in Electrical Engineering - FEIS
Abstract:
Bioinformatics is a recent and emerging discipline which aims at studying biological problems through computational approaches. Most branches of bioinformatics, such as Genomics, Proteomics and Molecular Dynamics, are particularly computationally intensive, requiring huge amounts of computational resources for running algorithms of ever-increasing complexity over data of ever-increasing size. In the search for computational power, the EGEE Grid platform, the world's largest community of interconnected clusters load-balanced as a whole, seems particularly promising and is considered the new hope for satisfying the ever-increasing computational requirements of bioinformatics, as well as physics and other computational sciences. The EGEE platform, however, is rather new and not yet free of problems. In addition, specific requirements of bioinformatics need to be addressed in order to use this new platform effectively for bioinformatics tasks. In my three-year Ph.D. work I addressed numerous aspects of this Grid platform, with particular attention to those needed by the bioinformatics domain. I hence created three major frameworks, Vnas, GridDBManager and SETest, plus an additional smaller standalone solution, to enhance the support for bioinformatics applications in the Grid environment and to reduce the effort needed to create new applications, additionally addressing numerous existing Grid issues and performing a series of optimizations. The Vnas framework is an advanced system for the submission and monitoring of Grid jobs that provides an abstraction with reliability over the Grid platform. In addition, Vnas greatly simplifies the development of new Grid applications by providing a callback system that simplifies the creation of arbitrarily complex multistage computational pipelines, and provides an abstracted virtual sandbox which bypasses Grid limitations. Vnas also reduces the usage of Grid bandwidth and storage resources by transparently detecting equality of virtual sandbox files based on their content, across different submissions, even when performed by different users. BGBlast, an evolution of the earlier project GridBlast, now provides a Grid Database Manager (GridDBManager) component for managing and automatically updating biological flat-file databases in the Grid environment. GridDBManager sports novel features such as an adaptive replication algorithm that constantly optimizes the number of replicas of the managed databases in the Grid environment, balancing response times (performance) against storage costs according to a programmed cost formula. GridDBManager also provides highly optimized automated management of older versions of the databases based on reverse delta files, which reduces the storage costs required to keep such older versions available in the Grid environment by two orders of magnitude. The SETest framework provides a way for the user to test and regression-test Python applications riddled with side effects (a common case with Grid computational pipelines), which could not easily be tested using the more standard methods of unit testing or test cases. The technique is based on a new concept of datasets containing invocations and results of filtered calls. The framework hence significantly accelerates the development of new applications and computational pipelines for the Grid environment, and reduces the effort required for their maintenance. An analysis of the impact of these solutions is provided in this thesis. This Ph.D.
work originated various publications in journals and conference proceedings as reported in the Appendix. Also, I orally presented my work at numerous international conferences related to Grid and bioinformatics.
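The content-based detection of equal sandbox files can be illustrated with a short sketch: files are keyed by a hash of their contents, so identical inputs are stored and transferred only once across submissions. The store layout and function names below are assumptions, not Vnas's actual mechanism.

import hashlib, os, shutil

STORE = "sandbox_store"   # hypothetical local content-addressed store

def content_key(path, chunk=1 << 20):
    # Hash the file contents so equality is detected regardless of file name or owner.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while True:
            block = f.read(chunk)
            if not block:
                break
            h.update(block)
    return h.hexdigest()

def add_to_sandbox(path):
    os.makedirs(STORE, exist_ok=True)
    key = content_key(path)
    target = os.path.join(STORE, key)
    if not os.path.exists(target):      # first time this content is seen
        shutil.copyfile(path, target)   # otherwise the existing copy is reused
    return key                          # submissions reference content by key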
Abstract:
Staphylococcus carnosus is a facultative anaerobic bacterium that features the cytoplasmic NreABC system, which is necessary for the regulation of nitrate respiration and of the nitrate reductase gene narG in response to oxygen and nitrate availability. NreB is the sensor kinase of a two-component system and represents the oxygen sensor of the system; under anaerobic conditions it binds an oxygen-labile [4Fe-4S]2+ cluster. NreB autophosphorylates, and phosphoryl transfer activates the response regulator NreC, which induces narG expression. The third component of the Nre system is the nitrate receptor NreA. In this study, the role of the nitrate receptor protein NreA in nitrate regulation, and its functional and physiological effect on oxygen regulation and on the interaction with the NreBC two-component system, were investigated. In vivo, a reporter gene assay measuring expression of the NreABC-regulated nitrate reductase gene narG was used for quantitative evaluation of NreA function. Maximal narG expression in wild-type S. carnosus required anaerobic conditions and the presence of nitrate. Deletion of nreA allowed expression of narG under aerobic conditions, and under anaerobic conditions nitrate was no longer required for maximal induction. This indicates that NreA is a nitrate-regulated inhibitor of narG expression. Purified NreA and the variant NreA(Y95A) inhibited the autophosphorylation of anaerobic NreB partially and completely, respectively. Neither NreA nor NreA(Y95A) stimulated dephosphorylation of NreB-phosphate, however. Inhibition of phosphorylation was relieved completely when NreA with bound nitrate (NreA•[NO3-]) was used. The same effects of NreA were observed with aerobically isolated, Fe-S-less NreB, which indicates that NreA does not influence the iron-sulfur cluster of NreB. In summary, the data of this study show that NreA interacts with the oxygen sensor NreB and controls its phosphorylation level in a nitrate-dependent manner. This modulation of NreB function by NreA and nitrate results in nitrate/oxygen co-sensing by an NreA/NreB sensory unit, which transmits the regulatory signals from oxygen and nitrate as a joint signal to target promoters. Therefore, nitrate and oxygen regulation of nitrate dissimilation follows a new mode of regulation not present in other facultative anaerobic bacteria.
Abstract:
Nowadays, words like Smart City, Internet of Things and Environmental Awareness surround us, with growing interest from the Computer Science and Engineering communities. Services supporting these paradigms are based on large amounts of sensed data which, once obtained and gathered, need to be analyzed in order to build maps, infer patterns and extract useful information, all with the goal of achieving a better quality of life. Traditional sensing techniques, like wired or wireless sensor networks, require an intensive deployment of distributed sensors to acquire real-world conditions. We propose SenSquare, a crowdsensing approach based on smartphones and a central coordination server for collecting data that is homogeneous in time and space. SenSquare relies on technologies such as the lightweight CoAP protocol, geofencing and the Military Grid Reference System.
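A minimal sketch of the client-side idea follows: a smartphone reading is kept only if it falls inside a geofence assigned by the coordination server and is tagged with a coarse grid cell before upload. The simple latitude/longitude binning stands in for the Military Grid Reference System, and the geofence rule is an assumption for illustration.

import math

def grid_cell(lat, lon, cell_deg=0.01):
    # Coarse spatial bin; a stand-in for a real MGRS cell identifier.
    return (math.floor(lat / cell_deg), math.floor(lon / cell_deg))

def inside_geofence(lat, lon, fence):
    """fence: (min_lat, min_lon, max_lat, max_lon) assumed to be assigned by the server."""
    min_lat, min_lon, max_lat, max_lon = fence
    return min_lat <= lat <= max_lat and min_lon <= lon <= max_lon

reading = {"lat": 44.4949, "lon": 11.3426, "pm10": 31.0}   # illustrative sensed value
fence = (44.48, 11.32, 44.52, 11.36)
if inside_geofence(reading["lat"], reading["lon"], fence):
    reading["cell"] = grid_cell(reading["lat"], reading["lon"])
    # The tagged reading would then be sent to the coordination server, e.g. over CoAP.
    print(reading)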
Abstract:
The electromechanical transfer characteristics of adhesively bonded piezoelectric sensors are investigated. By the use of dynamic piezoelectricity theory, Mindlin plate theory for flexural wave propagation, and a multiple integral transform method, the frequency-response functions of piezoelectric sensors with and without backing materials are developed and the pressure-voltage transduction functions of the sensors calculated. The corresponding simulation results show that the sensitivity of the sensors is not only dependent on the sensors' inherent features, such as piezoelectric properties and geometry, but also on local characteristics of the tested structures and the admittance and impedance of the attached electrical circuit. It is also demonstrated that the simplified rigid mass sensor model can be used to analyze successfully the sensitivity of the sensor at low frequencies, but that the dynamic piezoelectric continuum model has to be used for higher frequencies, especially around the resonance frequency of the coupled sensor-structure vibration system.
Abstract:
The presence of electronic systems in motor vehicles has increased considerably over the last 30 years, making it possible to raise their standards of efficiency, safety and comfort. Automatic windscreen-wiper activation systems, based on optical rain sensors, have seen almost exponential growth over the last 10 to 15 years; in 2000 only 5% of new vehicles produced in Europe were equipped with such a system, whereas today it is widely available across the existing automotive range. The present work consisted of studying a solution for rain detection in motor vehicles using a piezoelectric sensor, with the aim of obtaining a more versatile solution applicable at several points of the vehicle. The small dimensions and high sensitivity of the sensor, and the ease of applying it to the test surfaces, were the factors that motivated the choice of this type of device as the sensing element. The hypotheses defined for the laboratory procedure were based on the conclusions of previous studies in the field of automotive rain sensors and on the capabilities of piezoelectric materials for measuring rainfall. The sensors were installed under the vehicle surfaces that simultaneously presented greater exposure to rainfall while the vehicle is in motion and a lower risk of being damaged. The results obtained allowed the conclusion that this type of sensor can detect high rainfall levels, even on surfaces with considerable capacity for elastic deformation. Its future implementation in motor vehicles requires some further work on improving the sensor fixing processes and the signal conditioning used.
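As a hedged illustration of how such a piezoelectric signal could be turned into a rain indication, the sketch below counts impact peaks above a threshold, with a short dead time to ignore the ringing that follows each drop. The threshold, dead time and synthetic signal are assumptions, not the signal conditioning used in this work.

import numpy as np

def count_impacts(signal, fs_hz, threshold, dead_time_s=0.005):
    dead = int(dead_time_s * fs_hz)
    hits, i = 0, 0
    while i < len(signal):
        if abs(signal[i]) > threshold:
            hits += 1
            i += dead        # skip the ringing that follows one impact
        else:
            i += 1
    return hits

fs = 10_000
t = np.arange(0, 1.0, 1 / fs)
signal = 0.01 * np.random.randn(t.size)   # sensor noise floor
signal[::2000] += 0.5                     # synthetic "drops" every 0.2 s
print(count_impacts(signal, fs, threshold=0.2))   # roughly 5 impacts in this second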
Abstract:
Future distribution systems will have to deal with an intensive penetration of distributed energy resources while ensuring reliable and secure operation according to the smart grid paradigm. SCADA (Supervisory Control and Data Acquisition) is an essential infrastructure for this evolution. This paper proposes a new conceptual design of an intelligent SCADA with a decentralized, flexible, and intelligent approach, adaptive to the context (context awareness). This SCADA model is used to support the energy resource management undertaken by a distribution network operator (DNO). Resource management considers all the involved costs, power flows, and electricity prices, allowing the use of network reconfiguration and load curtailment. Locational Marginal Prices (LMP) are evaluated and used in specific situations to apply Demand Response (DR) programs on a global or a local basis. The paper includes a case study using a 114-bus distribution network and load demand based on real data.
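One decision such an intelligent SCADA could support is sketched below: when the locational marginal price on a bus exceeds a limit, a demand response event curtails a fraction of the flexible load at that bus. The prices, limits and curtailment rule are illustrative assumptions, not the paper's resource management model.

# Hypothetical LMPs (EUR/MWh) and flexible load (MW) per bus.
lmp_eur_mwh = {"bus_12": 48.0, "bus_37": 95.0, "bus_80": 110.0}
flexible_load_mw = {"bus_12": 1.2, "bus_37": 0.8, "bus_80": 2.5}

def plan_dr(lmp, flexible, price_limit=90.0, curtail_fraction=0.3):
    """Return MW of load to curtail per bus (local DR only on expensive buses)."""
    return {bus: round(curtail_fraction * flexible[bus], 3)
            for bus, price in lmp.items() if price > price_limit}

print(plan_dr(lmp_eur_mwh, flexible_load_mw))
# {'bus_37': 0.24, 'bus_80': 0.75}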
Abstract:
In this paper a new PCA-based positioning sensor and localization system for mobile robots operating in unstructured environments (e.g. industry, services, domestic, ...) is proposed and experimentally validated. The inexpensive positioning system resorts to principal component analysis (PCA) of images acquired by a video camera installed onboard, looking upwards at the ceiling. This solution has the advantage of avoiding the need to select and extract features. The principal components of the acquired images are compared with previously registered images stored in a reduced onboard image database, and the measured position is fused with odometry data. The optimal estimates of position and slippage are provided by Kalman filters with globally stable error dynamics. The experimental validation reported in this work focuses on the results of a set of experiments carried out in a real environment, where the robot travels along a lawn-mower trajectory. A small position error estimate with bounded covariance was always observed, for arbitrarily long experiments, and slippage was estimated accurately in real time.
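The PCA matching step can be sketched briefly: ceiling images with known positions are projected onto their principal components, and a new image is located at the position of its nearest neighbour in that reduced space; in the paper this measurement is then fused with odometry in a Kalman filter. The image size, number of components and random database below are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
db_images = rng.random((50, 32 * 32))          # 50 flattened ceiling images (stand-ins)
db_positions = rng.random((50, 2)) * 10.0      # known (x, y) in metres for each image

mean = db_images.mean(axis=0)
_, _, vt = np.linalg.svd(db_images - mean, full_matrices=False)
components = vt[:8]                            # keep 8 principal components
db_proj = (db_images - mean) @ components.T    # database projected into PCA space

def locate(image):
    q = (image.ravel() - mean) @ components.T
    nearest = np.argmin(np.linalg.norm(db_proj - q, axis=1))
    return db_positions[nearest]               # position measurement to fuse with odometry

print(locate(db_images[7]))                    # recovers the stored position of image 7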
Abstract:
The forthcoming smart grids will comprise integrated microgrids operating in grid-connected and isolated mode, with local generation, storage and demand response (DR) programs. The proposed model is based on three successive, complementary steps for power transactions in the market environment. The first step is characterized as a microgrid's internal market; the second concerns negotiations between distinct interconnected microgrids; and the third refers to the actual electricity market. The proposed approach is modeled and tested using a MAS framework directed at the study of smart grid environments, including the simulation of electricity markets. This is achieved through the integration of the proposed approach with the MASGriP (Multi-Agent Smart Grid Platform) system.
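A toy sketch of the three successive steps follows: each microgrid first clears its internal market to obtain a net position, deficits are then matched against neighbouring microgrids' surpluses, and any remaining imbalance goes to the electricity market. The quantities and matching rule are assumptions for illustration, not MASGriP's negotiation logic.

# Hypothetical generation and demand per microgrid, in MWh.
microgrids = {"MG_A": {"generation": 5.0, "demand": 3.0},   # surplus of 2
              "MG_B": {"generation": 1.0, "demand": 4.0}}   # deficit of 3

# Step 1: internal market -> net position per microgrid.
net = {mg: v["generation"] - v["demand"] for mg, v in microgrids.items()}

# Step 2: negotiation between interconnected microgrids.
trades = []
for buyer, need in [(m, -n) for m, n in net.items() if n < 0]:
    for seller, avail in [(m, n) for m, n in net.items() if n > 0]:
        q = min(need, avail)
        if q > 0:
            trades.append((seller, buyer, q))
            net[seller] -= q
            net[buyer] += q
            need -= q

# Step 3: remaining imbalances are settled in the wholesale electricity market.
market = {mg: n for mg, n in net.items() if abs(n) > 1e-9}
print(trades, market)   # [('MG_A', 'MG_B', 2.0)] {'MG_B': -1.0}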
Abstract:
We describe a low-cost, high-quality device capable of monitoring indirect activity by detecting touch-release events on a conducting surface, i.e., the animal's cage cover. In addition to the detecting sensor itself, the system includes an IBM PC interface for prompt data storage. The hardware/software design, while also serving other purposes, is used to record the circadian activity rhythm pattern of rats over time in an automated, computerized fashion using minimal-cost computer equipment (IBM PC XT). Once the sensor detects a touch-release action of the rat in the upper portion of the cage, the interface sends a command to the PC, which records the time (hours-minutes-seconds) at which the activity occurred. As a result, the computer builds up several files (one per detector/sensor) containing a time list of all recorded events. Data can be visualized in terms of actograms, indicating the number of detections per hour, and analyzed by mathematical tools such as the Fast Fourier Transform (FFT) or cosinor. In order to demonstrate method validation, an experiment was conducted on 8 Wistar rats under 12/12-h light/dark cycle conditions (lights on at 7:00 a.m.). The results provide a biological validation of the method, since it detected the presence of circadian activity rhythm patterns in the behavior of the rats.
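The data handling described can be sketched in a few lines: each touch-release detection is logged as a timestamp (one list per sensor), and the list is binned into counts per hour of day to build an actogram. The simulated timestamps below are illustrative, not recorded data.

from collections import Counter
from datetime import datetime, timedelta
import random

random.seed(1)
start = datetime(2024, 1, 1, 7, 0, 0)                      # lights on at 7:00 a.m.
events = sorted(start + timedelta(seconds=random.uniform(0, 86400))
                for _ in range(500))                       # one sensor, one simulated day

hourly_counts = Counter(e.hour for e in events)            # detections per hour of day
for hour in range(24):
    print(f"{hour:02d}:00  {'#' * (hourly_counts[hour] // 2)}")

# A Fast Fourier Transform or cosinor fit of such hourly series would then be
# used to test for a roughly 24 h rhythm.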
Abstract:
A new localization approach to increase the navigational capabilities and object manipulation of autonomous mobile robots, based on an encoded infrared sheet-of-light beacon system that provides position errors smaller than 0.02 m, is presented in this paper. To achieve this minimal position error, a resolution enhancement technique has been developed that utilises inbuilt odometric/optical flow sensor information. The system respects strong low-cost constraints by using an innovative assembly for the digitally encoded infrared transmitter. For better guidance of mobile robot vehicles, an online traffic signalling capability is also incorporated. Other added features are its low computational complexity and online localization capability, all achieved without any estimation uncertainty. The constructional details, experimental results and computational methodologies of the system are also described.
Abstract:
Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al. 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours when running on 40 processors, and produces roughly 20 GB of output as 50,000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command-line utilities for data pre-processing and post-processing prior to job resubmission. Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts. G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat and is therefore easy for system administrators to install and maintain. G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al. 2008) and POLCOMS (Holt et al. 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al. 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al. 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al. 2008), which aims to simulate the world's coastal oceans. A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) The scientist prepares input files on his or her local machine. (2) Using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource. (3) The scientist runs the relevant workflow script on his or her local machine. This is unmodified except that calls to run the model (e.g.
with “mpirun”) are simply replaced with calls to "GRexRun". (4) The G-Rex middleware automatically handles the uploading of input files to the remote resource, and the downloading of output files back to the user, including their deletion from the remote system, during the run. (5) The scientist monitors the output files, using familiar analysis and visualization tools on his or her own local machine. G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale Grid resources such as the UK National Grid Service.
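The usage pattern can be sketched from the client side: a lightweight program polls the service over plain HTTP, downloading new output files while the run is in progress so that data does not accumulate on the remote system. The URLs and endpoint names below are hypothetical placeholders, not G-Rex's actual REST API.

import time
import urllib.request

BASE = "http://example.org/grex/instances/nemo-run-42"   # hypothetical service URL

def fetch(path):
    with urllib.request.urlopen(BASE + path) as resp:
        return resp.read()

def monitor(poll_s=30):
    while True:
        status = fetch("/status").decode().strip()        # e.g. "RUNNING" or "FINISHED"
        for name in fetch("/outputs").decode().split():   # names of newly available files
            with open(name, "wb") as f:
                f.write(fetch("/outputs/" + name))         # download; server may then delete
        if status == "FINISHED":
            break
        time.sleep(poll_s)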