952 results for sensor grid database system


Relevance: 100.00%

Publisher:

Abstract:

Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al., 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours running on 40 processors, and produces roughly 20 GB of output as 50,000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command utilities for data pre-processing and post-processing prior to job resubmission. Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, preventing it from accumulating on the remote system and allowing the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts.
G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight, and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat, and is therefore easy for system administrators to install and maintain. G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al., 2008) and POLCOMS (Holt et al., 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al., 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al., 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al., 2008), which aims to simulate the world's coastal oceans. A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) The scientist prepares input files on his or her local machine. (2) Using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource. (3) The scientist runs the relevant workflow script on his or her local machine; this is unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun". (4) The G-Rex middleware automatically handles the uploading of input files to the remote resource, and the downloading of output files back to the user, including their deletion from the remote system, during the run.
(5) The scientist monitors the output files, using familiar analysis and visualization tools on his or her own local machine. G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale Grid resources such as the UK National Grid service.
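
The interaction pattern described above (create a run, stream output back while it executes, drive everything from a simple client) can be sketched roughly as follows. This is an illustrative Python sketch only: the resource paths, the `run_remote_job` helper and the stubbed transport are hypothetical, not the actual G-Rex API.

```python
# Illustrative sketch only: the real G-Rex resource layout is not specified
# here, so this mimics the REST pattern the abstract describes (create a
# run, then repeatedly fetch output while the job progresses) against an
# injected 'transport' callable instead of a live HTTP server.

def run_remote_job(transport, service="nemo", inputs=("namelist",)):
    """Drive a hypothetical REST job lifecycle and collect its output chunks."""
    job = transport("POST", f"/grex/{service}/runs", {"inputs": list(inputs)})
    job_id = job["id"]
    chunks = []
    while True:
        status = transport("GET", f"/grex/{service}/runs/{job_id}")
        # Output is streamed back during the run, as G-Rex does, so it
        # never accumulates on the remote system.
        chunks.extend(status.get("new_output", []))
        if status["state"] == "finished":
            break
    return chunks

def fake_transport_factory():
    """Stub standing in for an HTTP client; returns canned responses."""
    calls = {"n": 0}
    def transport(method, path, body=None):
        if method == "POST":
            return {"id": "run-1"}
        calls["n"] += 1
        if calls["n"] < 3:
            return {"state": "running", "new_output": [f"chunk{calls['n']}"]}
        return {"state": "finished", "new_output": ["chunk3"]}
    return transport

output = run_remote_job(fake_transport_factory())
print(output)
```

In a real deployment the transport would be an HTTP client, which is why a plain browser or curl can also drive a run.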

Abstract:

The simulated annealing approach to structure solution from powder diffraction data, as implemented in the DASH program, is easily amenable to parallelization at the level of individual runs. Very large increases in execution speed can therefore be achieved by distributing individual DASH runs over a network of computers. The GDASH program achieves this by packaging DASH in a form that enables it to run under the Univa UD Grid MP system, which harnesses networks of existing computing resources to perform calculations.
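
The run-level parallelization the abstract describes can be illustrated with a toy example: several independently seeded simulated annealing runs minimizing a simple cost function, executed concurrently, with the best solution kept. This is a minimal sketch assuming a trivial one-dimensional cost; it is not DASH's crystallographic objective, and threads stand in for the Grid MP network.

```python
# Sketch of the "parallelize at the run level" idea: simulated annealing is
# stochastic, so independent runs with different seeds can execute
# concurrently and the best result is kept. A grid system like Univa UD
# Grid MP distributes runs across machines; here threads stand in for that.
import math
import random
from concurrent.futures import ThreadPoolExecutor

def anneal(seed, steps=2000):
    """One independent annealing run minimizing a toy cost (x - 3)^2."""
    rng = random.Random(seed)
    x = rng.uniform(-10, 10)
    cost = (x - 3.0) ** 2
    temp = 1.0
    for _ in range(steps):
        cand = x + rng.gauss(0, 0.5)
        cand_cost = (cand - 3.0) ** 2
        # Accept improvements always, worse moves with Boltzmann probability.
        if cand_cost < cost or rng.random() < math.exp((cost - cand_cost) / temp):
            x, cost = cand, cand_cost
        temp *= 0.995  # geometric cooling schedule
    return cost, x

with ThreadPoolExecutor() as pool:
    results = list(pool.map(anneal, range(8)))  # 8 seeded runs in parallel
best_cost, best_x = min(results)
print(best_x)  # should land close to the true minimum at 3.0
```

Because the runs share nothing, speedup scales with the number of machines, which is exactly why this workload suits a grid.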

Abstract:

A distributed database system is subject to site failures and link failures. This paper presents a reactive system approach to achieving fault tolerance in such a system. The reactive system concepts are an attractive paradigm for system design, development and maintenance because they separate policies from mechanisms. In this paper we give a solution that uses different reactive modules to implement the fault-tolerance policies and the failure-detection mechanisms. The solution shows that the two can be separated without impact on each other, so the system can adapt to constant changes in environments and user requirements.
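
The policy/mechanism separation can be sketched as follows; the class and method names are illustrative, not taken from the paper. The detector is the mechanism, the interchangeable policy objects decide the reaction, and either side can change without touching the other.

```python
# Minimal sketch of separating fault-tolerance *policy* from failure-
# detection *mechanism* as reactive modules. All names are hypothetical.

class FailureDetector:
    """Mechanism: notices failures and notifies whichever policy is plugged in."""
    def __init__(self, policy):
        self.policy = policy

    def report(self, kind, site):
        # The mechanism knows nothing about how the reaction is chosen.
        return self.policy.react(kind, site)

class FailoverPolicy:
    """Policy A: redirect transactions to a replica site."""
    def react(self, kind, site):
        return f"redirect traffic from {site} to replica"

class BlockingPolicy:
    """Policy B: block updates until the failed site recovers."""
    def react(self, kind, site):
        return f"block updates touching {site} until recovery"

# The same detector mechanism works unchanged with either policy.
detector = FailureDetector(FailoverPolicy())
print(detector.report("site-failure", "S1"))
detector.policy = BlockingPolicy()   # policy swapped, mechanism untouched
print(detector.report("site-failure", "S1"))
```

Swapping the policy object at runtime mirrors the paper's claim that the system can adapt to changing environments and requirements without reworking the detection machinery.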


Abstract:

A great number of research organisations in Japan have been conducting structural steel experiments for many years, in particular seismic tests of steel structures such as cyclic-loading and pseudo-dynamic tests, in order to determine their seismic performance. However, the original test data obtained by most research organisations are not stored in a manner suited to distribution and possible use by others. With the rapid development of information networks, structural engineers and researchers are able to exchange various types of test data over the Internet. In this paper, the authors present the development of a distributed collaborative database system for structural steel experiments. The database is made available on the Internet, and the use of the Java language enables efficient interactive retrieval. The potential applications of the developed database system in structural engineering education are validated for the retrieval of experimental data and seismic numerical analysis.

Abstract:

Because of the high seismic requirements placed on civil infrastructure in Japan, a great number of research organizations have been conducting structural steel experiments, in particular seismic tests such as cyclic-loading and pseudo-dynamic tests, for many years to determine the seismic performance of steel structures. However, the original test data obtained by most research organizations are not stored in a manner suited to distribution and possible use by others. Although a Numerical Database of Steel Structures (NDSS) was developed some years ago to preserve and share the experimental data of ultimate strength tests acquired at Nagoya University, it was not easy to access this database from other computer platforms because proper communication media were not supported. With the rapid development of information networks and their browsers, structural engineers and researchers are able to exchange various types of test data over the Internet. This paper presents the development of a distributed collaborative database system for structural steel experiments. The database is made available on the World Wide Web, and the Java language enables efficient interactive retrieval. Applications of the developed database system to the retrieval of experimental data and to seismic numerical analysis are validated in the form of examples.

Abstract:

Sports video data is growing rapidly as a result of maturing digital technologies that support digital video capture, faster data processing, and large storage. However, (1) semi-automatic content extraction and annotation, (2) scalable indexing models, and (3) effective retrieval and browsing still pose the most challenging problems for maximizing the usage of large video databases. This article presents the findings from a comprehensive work that proposes a scalable and extensible sports video retrieval system, with two major contributions in the area of sports video indexing and retrieval. The first contribution is a new sports video indexing model that utilizes a semi-schema-based indexing scheme on top of an object-relationship approach. This indexing model is scalable and extensible, as it enables gradual index construction supported by the ongoing development of future content-extraction algorithms. The second contribution is a set of novel queries, based on XQuery, that generate dynamic and user-oriented summaries and event structures. The proposed sports video retrieval system has been fully implemented and populated with soccer, tennis, swimming, and diving videos. The system has been evaluated with 20 users to demonstrate and confirm its feasibility and benefits. The experimental sports genres were specifically selected to represent the four main categories of the sports domain: period-, set-point-, time (race)-, and performance-based sports. Thus, the proposed system should be generic and robust for all types of sports.
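
As a rough illustration of querying such an index, the sketch below stores a tiny hypothetical soccer event index as XML and retrieves events with the standard library's XPath subset. The paper itself uses XQuery over its semi-schema-based index, which is considerably more expressive than this stand-in; the element and attribute names here are invented.

```python
# Hypothetical mini-index for one soccer match, queried with the XPath
# subset supported by xml.etree.ElementTree (a simple stand-in for the
# paper's XQuery-based retrieval).
import xml.etree.ElementTree as ET

INDEX = """
<video sport="soccer" match="A-vs-B">
  <event type="goal" player="P7" minute="12"/>
  <event type="foul" player="P3" minute="30"/>
  <event type="goal" player="P7" minute="78"/>
</video>
"""

root = ET.fromstring(INDEX)

def events_by(root, etype, player=None):
    """User-oriented retrieval: filter events by type and, optionally, player."""
    hits = root.findall(f".//event[@type='{etype}']")
    if player is not None:
        hits = [e for e in hits if e.get("player") == player]
    return [int(e.get("minute")) for e in hits]

print(events_by(root, "goal", "P7"))  # minutes of P7's goals: [12, 78]
```

A dynamic summary (e.g. "all goals by player X") is then just a parameterized query over the event index, which is the spirit of the paper's XQuery contribution.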

Abstract:

A grid computing system consists of a group of programs and resources that are spread across machines in the grid. A grid system has a dynamic environment and decentralized, distributed resources, so it is important to provide efficient scheduling for applications. Task scheduling is an NP-hard problem for which deterministic algorithms are inadequate, so heuristic algorithms such as particle swarm optimization (PSO) are needed. PSO is a simple parallel algorithm that can be applied in different ways to solve optimization problems. PSO searches the problem space globally and needs to be combined with other methods to search locally as well. In this paper, we propose a hybrid scheduling algorithm to solve the independent task-scheduling problem in grid computing. We have combined PSO with the gravitational emulation local search (GELS) algorithm to form a new method, PSO-GELS. Our experimental results demonstrate the effectiveness of PSO-GELS compared to other algorithms.
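
A bare-bones version of the PSO half of the approach can be sketched as follows: each particle holds one continuous coordinate per task, decoded to a machine index, with makespan as the fitness. This is a hedged sketch, not the paper's PSO-GELS; the GELS local-search phase, the encoding details and the parameter choices of the paper are omitted.

```python
# Simplified PSO for assigning independent tasks to machines, minimizing
# makespan. Continuous particle positions are decoded to machine indices.
import random

def makespan(assign, times, machines):
    """Finish time of the busiest machine under a given assignment."""
    load = [0.0] * machines
    for task, m in enumerate(assign):
        load[m] += times[task]
    return max(load)

def decode(pos, machines):
    return [int(abs(p)) % machines for p in pos]

def pso_schedule(times, machines, particles=20, iters=200, seed=1):
    rng = random.Random(seed)
    n = len(times)
    xs = [[rng.uniform(0, machines) for _ in range(n)] for _ in range(particles)]
    vs = [[0.0] * n for _ in range(particles)]
    pbest = [x[:] for x in xs]
    pcost = [makespan(decode(x, machines), times, machines) for x in xs]
    g = pcost.index(min(pcost))
    gbest, gcost = pbest[g][:], pcost[g]
    for _ in range(iters):
        for i in range(particles):
            for d in range(n):
                r1, r2 = rng.random(), rng.random()
                # Standard velocity update: inertia + cognitive + social terms.
                vs[i][d] = (0.7 * vs[i][d]
                            + 1.5 * r1 * (pbest[i][d] - xs[i][d])
                            + 1.5 * r2 * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            c = makespan(decode(xs[i], machines), times, machines)
            if c < pcost[i]:
                pbest[i], pcost[i] = xs[i][:], c
                if c < gcost:
                    gbest, gcost = xs[i][:], c
    return decode(gbest, machines), gcost

times = [4, 2, 7, 3, 5, 1]          # hypothetical task run times
assign, cost = pso_schedule(times, machines=2)
print(assign, cost)                  # a near-balanced two-machine schedule
```

The paper's point is that this global search tends to stall near good regions, which is why it is hybridized with a local search such as GELS.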

Abstract:

The purpose of grid computing is to produce a virtual supercomputer from free resources available through widespread networks such as the Internet. This resource distribution, changes in resource availability, and an unreliable communication infrastructure pose a major challenge for efficient resource allocation. Because of the geographical spread of resources and their distributed management, grid scheduling is considered to be an NP-complete problem. It has been shown that evolutionary algorithms offer good performance for grid scheduling. This article uses a new evolutionary (distributed) algorithm inspired by the effect of leaders in social groups, the group leaders' optimization algorithm (GLOA), to solve the problem of scheduling independent tasks in a grid computing system. Simulation results comparing GLOA with several other evolutionary algorithms show that GLOA produces shorter makespans.

Abstract:

This paper introduces a basic framework for a rehabilitation motion practice system that detects 3D motion trajectories with the Microsoft Kinect (MSK) sensor system, and proposes a cost-effective 3D motion matching algorithm. The rehabilitation motion practice system displays a reference 3D motion from the database that the player (patient) tries to follow. The player's motion is traced by the MSK sensor system and then compared with the reference motion to evaluate how well the player follows it. In this system, the 3D motion matching algorithm is the key to accurately evaluating the player's performance. Although similarity measurement of 3D trajectories is one of the most important tasks in 3D motion analysis, existing methods are still limited. Recent research has focused on full-length 3D trajectory data sets, but not every point on a trajectory plays the same role or carries the same meaning. We therefore developed a new cost-effective method that uses a smaller set of features, called the 'signature', a flexible descriptor computed from the region of 'elbow points'. Our proposed method consequently runs faster than methods that use the full-length trajectory information. The similarity of trajectories is measured based on the signature using an alignment method such as dynamic time warping (DTW), continuous dynamic time warping (CDTW) or the longest common sub-sequence (LCSS) method. In our experimental studies, we applied the MSK sensor system to detect, trace and match the 3D motion of the human body. The application was conceived as a system for guiding rehabilitation practice, evaluating how well a practice motion was performed by comparing the patient's motion, traced by the MSK system, with a pre-defined reference motion in a database.
To evaluate the accuracy of our 3D motion matching algorithm, we compared our method with two other methods on an Australian sign-word dataset. The results show that our algorithm outperforms the others in matching 3D motion, and it can serve as a base framework for various 3D motion-based applications at low cost and with high accuracy.
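
Of the alignment methods mentioned, plain DTW is the simplest to show. The sketch below is the textbook dynamic-time-warping distance for one-dimensional sequences, not the paper's signature-based variant, which applies such measures to elbow-point features rather than raw trajectories.

```python
# Classic dynamic time warping: aligns two sequences that follow the same
# shape at different speeds, as used for comparing motion trajectories.

def dtw(a, b):
    """O(len(a)*len(b)) DTW distance with absolute-difference local cost."""
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # insertion
                                 d[i][j - 1],      # deletion
                                 d[i - 1][j - 1])  # match
    return d[n][m]

# Same shape traced at different speeds: DTW distance is zero.
print(dtw([0, 1, 2, 3], [0, 1, 1, 2, 3]))  # 0.0
```

Running DTW on a short signature instead of the full trajectory is what gives the paper's method its speed advantage: the quadratic cost shrinks with the square of the sequence length.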

Abstract:

An underwater gas pipeline is the portion of a pipeline that crosses a river beneath its bottom. Underwater gas pipelines are subject to increasing dangers as time goes by, and an accident at an underwater gas pipeline can lead to a technological and environmental disaster on the scale of an entire region. Therefore, timely troubleshooting of all underwater gas pipelines in order to prevent potential accidents remains a pressing task for the industry. The most important aspect of resolving this challenge is the quality of the automated system in question; at present the industry has no automated system that fully meets the needs of the experts maintaining underwater gas pipelines in the field. Principal aim of this research: to develop a new automated monitoring system that simplifies the process of evaluating technical condition and making decisions on planning, preventive maintenance and repair work on underwater gas pipelines. Objectives: creation of a shared model for the new automated system via IDEF3; development of a new database system to store all information about underwater gas pipelines; development of a new application that works with database servers and explains the results obtained from the server; and calculation of MTBF values for specified pipelines based on quantitative data obtained from tests of the system. Conclusions: the new automated system, PodvodGazExpert, has been developed for timely and qualitative determination of the physical condition of underwater gas pipelines; the mathematical analysis of the new system is based on the principal component analysis method; and determining the physical condition of an underwater gas pipeline with the new system increases the MTBF by a factor of 8.18 over the existing system used in the industry today.

Abstract:

Algae are considered a promising future source of biofuels. However, the environmental impact of algae-based fuel has shown high variability in previous LCA studies, owing to a lack of accurate data from researchers and industry. The National Alliance for Advanced Biofuels and Bioproducts (NAABB) project was designed to produce and evaluate new technologies that can be implemented by the algal biofuel industry and to establish overall process sustainability. The MTU research group within NAABB worked on the environmental sustainability part of the consortium with UOP-Honeywell and with the University of Arizona (Dr. Paul Blowers). Several life cycle analysis (LCA) models were developed within the GREET Model and SimaPro 7.3 software to quantitatively assess the environmental viability and sustainability of algal fuel processes. The baseline GREET Harmonized algae life cycle was expanded and replicated in SimaPro; important differences in emission factors between the GREET/E-Grid database and the SimaPro/Ecoinvent database were compared, and adjustments were made to the SimaPro analyses. The results indicated that in most cases SimaPro imposes a higher emission penalty for inputs of electricity, chemicals, and other materials to the algae biofuel life cycle. A system-wide model of the algae life cycle was built, starting with preliminary data from the literature, progressing to detailed analyses based on inputs from all NAABB research areas, and finally investigating several important scenarios as variations on the baseline. Scenarios include conversion to jet fuel instead of biodiesel or renewable diesel, impacts of infrastructure for algae cultivation, co-product allocation methodology, and different uses of lipid-extracted algae (LEA). The infrastructure impact of algae cultivation is minimal compared to the overall life cycle.
However, the scenarios investigating LEA use for animal feed, instead of internal recycling for energy use and nutrient recovery, reflect the high potential variability in LCA results: calculated life cycle GHG values for biofuel production scenarios in which LEA is used as animal feed ranged from a 55% reduction to a 127% increase relative to the GREET baseline scenario, depending on the choice of feed meal. The allocation method also affects LCA results significantly. Four novel harvesting technologies and two extraction technologies described in the NAABB internal report were analyzed using the SimaPro LCA software. The results indicated that the combination of acoustic harvesting and acoustic extraction is the most promising of all the combinations for optimizing the extraction of algae oil. These scenario evaluations provide important insights for planning the future of an algae-based biofuel industry.

Abstract:

AIMS This study's objective is to assess the safety of non-therapeutic atomoxetine exposures reported to the US National Poison Database System (NPDS). METHODS This is a retrospective database study of non-therapeutic single-agent ingestions of atomoxetine in children and adults reported to the NPDS between 2002 and 2010. RESULTS A total of 20 032 atomoxetine exposures were reported during the study period, and 12 370 of these were single-agent exposures. The median age was 9 years (interquartile range 3, 14), and 7380 patients were male (59.7%). Of the single-agent exposures, 8813 (71.2%) were acute, 3315 (26.8%) were acute-on-chronic, and 166 (1.3%) were chronic. In 10 608 (85.8%) cases the exposure was unintentional, 1079 (8.7%) were suicide attempts, and 629 (5.1%) were cases of abuse. Of these cases, 3633 (29.4%) were managed at health-care facilities. Acute-on-chronic exposure was associated with an increased risk of a suicidal reason for exposure compared with acute ingestion (odds ratio 1.44, 95% confidence interval 1.26-1.65). The most common clinical effects were drowsiness or lethargy (709 cases; 5.7%), tachycardia (555; 4.5%), and nausea (388; 3.1%). Major toxicity was observed in 21 cases (seizures in nine (42.9%), tachycardia in eight (38.1%), coma in six (28.6%), and ventricular dysrhythmia in one case (4.8%)). CONCLUSIONS Non-therapeutic atomoxetine exposures were largely safe, but seizures were rarely observed.
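
The reported association can be illustrated by how an odds ratio and its Wald confidence interval are computed from a 2x2 table. The counts below are hypothetical, since the abstract does not give the full cross-tabulation, so the numbers produced are not the study's.

```python
# Odds ratio with a Wald-type 95% confidence interval from a 2x2 table.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and CI for [[exposed: outcome yes=a, no=b], [unexposed: yes=c, no=d]]."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts, NOT the study's actual cross-tabulation:
or_, lo, hi = odds_ratio_ci(50, 450, 80, 1020)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

With the study's real counts, this calculation would reproduce the reported OR of 1.44 (95% CI 1.26-1.65).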

Abstract:

This paper describes a practical activity, part of a renewable energy course, in which students build their own complete wind generation system, including blades, a permanent-magnet generator, power electronics and control. After connecting the system to the electric grid, it was tested under real wind scenarios. The paper describes the electrical part of the work, namely the design criteria for the surface-mounted permanent-magnet machine, as well as the power electronics for power control and grid connection. A Kalman filter is used for voltage phase estimation, and current commands are obtained in order to control active and reactive power. The system has been connected to the grid, and its active and reactive power have been measured.
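
As a minimal illustration of the Kalman filtering idea, the sketch below runs a scalar filter that recovers a constant value from noisy measurements. The paper's filter for grid-voltage phase estimation is more elaborate (it tracks a rotating phase, not a constant) and is not reproduced here; the nominal-voltage figure is illustrative.

```python
# Scalar Kalman filter for a constant state with no process noise: each
# measurement update blends the prediction with the new observation
# according to the Kalman gain.
import random

def kalman_constant(measurements, meas_var, init=0.0, init_var=1.0):
    x, p = init, init_var
    for z in measurements:
        k = p / (p + meas_var)   # Kalman gain
        x = x + k * (z - x)      # measurement update
        p = (1 - k) * p          # variance update
    return x

rng = random.Random(0)
true_value = 230.0               # illustrative nominal voltage magnitude
zs = [true_value + rng.gauss(0, 5.0) for _ in range(200)]
estimate = kalman_constant(zs, meas_var=25.0, init=200.0, init_var=100.0)
print(estimate)  # converges close to 230
```

The same predict/update structure, applied to a state that rotates at the grid frequency, is what lets the filter track the voltage phase for the active/reactive power control described above.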

Abstract:

An inertial sensor-based monitoring system for measuring and analyzing upper limb movements is presented. The final goal is the integration of this motion-tracking device within a portable rehabilitation system for brain injury patients. A set of four inertial sensors mounted on a special garment worn by the patient provides the quaternions representing the orientation of the patient's upper limb in space. A kinematic model is built to estimate 3D upper limb motion for accurate therapeutic evaluation. The human upper limb is represented as a kinematic chain of rigid bodies with three joints and six degrees of freedom. The system has been validated by co-registration of movements with a commercial optoelectronic tracking system. Successful results obtained at the Institut Guttmann Neurorehabilitation Hospital show a high correlation between the signals provided by the two devices.
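
Composing the per-segment quaternions delivered by the sensors into a limb orientation is the core operation in such a kinematic chain. The sketch below composes two illustrative joint rotations and applies them to a vector using plain Hamilton-product quaternion algebra; the joint names and angles are invented for the example.

```python
# Quaternion algebra on plain (w, x, y, z) tuples: compose two joint
# rotations of a kinematic chain and rotate a segment direction vector.
import math

def qmul(q, r):
    """Hamilton product of two quaternions."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def rotate(q, v):
    """Rotate vector v by unit quaternion q via q * v * q_conjugate."""
    qv = (0.0, *v)
    qc = (q[0], -q[1], -q[2], -q[3])
    return qmul(qmul(q, qv), qc)[1:]

def axis_angle(axis, angle):
    """Unit quaternion for a rotation of `angle` radians about `axis`."""
    s = math.sin(angle / 2)
    return (math.cos(angle / 2), axis[0]*s, axis[1]*s, axis[2]*s)

# Two joints in a chain: 90 deg about z (e.g. 'shoulder'), then 90 deg
# about x (e.g. 'elbow'); composing them gives the distal orientation.
shoulder = axis_angle((0, 0, 1), math.pi / 2)
elbow = axis_angle((1, 0, 0), math.pi / 2)
total = qmul(shoulder, elbow)
print([round(c, 6) for c in rotate(total, (1.0, 0.0, 0.0))])  # approx (0, 1, 0)
```

Chaining three such joint quaternions with fixed segment lengths yields the forward-kinematics estimate of hand position that the therapeutic evaluation relies on.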