916 results for sensor grid database system


Relevance:

100.00%

Publisher:

Abstract:

It is difficult to reach a decision when many users meet in the same place, and reconciling their differing opinions often consumes too much time. TAmI (Group Decision Making Toolkit) is a system for group decision making in Ambient Intelligence [1]. The program is composed of IGATA [2], WebMeeting and the related database system. However, because the IP address and password are sent without any encryption, they are exposed to attackers, who can use them for malicious purposes. As a result, even if an attacker manipulates the outcome, the participating members cannot detect it. Therefore, in this paper, we study a method for applying user authentication to TAmI.
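
A minimal sketch of how user authentication of the kind proposed here could avoid sending the password in clear text, assuming a shared secret between client and server; the function names and the challenge-response scheme below are illustrative assumptions, not the actual TAmI design.

    import hashlib
    import hmac
    import os

    def make_challenge() -> bytes:
        # Server side: issue a random nonce so the password itself never travels over the network.
        return os.urandom(16)

    def client_response(password: str, challenge: bytes) -> str:
        # Client side: answer the challenge with an HMAC keyed by the password.
        return hmac.new(password.encode(), challenge, hashlib.sha256).hexdigest()

    def verify(stored_password: str, challenge: bytes, response: str) -> bool:
        # Server side: recompute the HMAC and compare in constant time.
        expected = hmac.new(stored_password.encode(), challenge, hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, response)

    # Example round trip
    nonce = make_challenge()
    assert verify("secret", nonce, client_response("secret", nonce))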

Relevance:

100.00%

Publisher:

Abstract:

Dance and performance have been practiced by humans since practically the beginning of their existence. Over this entire period, and up to the present day, these activities have evolved in ways that have kept them relevant and of great importance to human society and culture. This evolution has been felt not only in the styles of the dances and performances but also in the props and effects they employ to make them more attractive to spectators. Despite this evolution, most effects do not allow any level of interaction with the dance or performance, so there is a clear separation between the dance itself and the accessory effects that make up the stage setting. To bridge this divide, we began a study aimed at creating a system that would break down this barrier and bring the two components together, creating effects that interact with the dance itself, making the performance more interactive rather than just another accessory component, and at the same time making the whole show more appealing to the general public. To build such a system, we relied on current motion-sensor technologies to establish the link between the performer and the effects. The market offers several motion sensors that could be used to build the system, but only one could be chosen; therefore, a first study was carried out to determine which of these sensors would be most suitable, taking a variety of factors into account. After the sensor was chosen, the MoveU system was developed, and a series of tests was then carried out to validate the prototype and verify whether the proposed objectives had been achieved. Finally, MoveU was demonstrated to a group of people (dancers and spectators) so that they could give their opinions and suggest possible improvements. A set of questionnaires was also created for the audience to whom the prototype was demonstrated, in order to carry out a statistical analysis to determine whether the system would appeal to people and to draw conclusions about this work.

Relevance:

100.00%

Publisher:

Abstract:

As exploration of our solar system and outer space moves into the future, spacecraft are being developed to venture on increasingly challenging missions with bold objectives. The spacecraft tasked with completing these missions are becoming progressively more complex, which increases the potential for mission failure due to hardware malfunctions and unexpected spacecraft behavior. A solution to this problem lies in the development of an advanced fault management system. Fault management enables a spacecraft to respond to failures and take repair actions so that it may continue its mission. The two main approaches developed for spacecraft fault management have been rule-based and model-based systems. Rules map sensor information to system behaviors, thus achieving fast response times and making the actions of the fault management system explicit. These rules are developed by having a human reason through the interactions between spacecraft components, a process limited by the number of interactions a human can reason about correctly. In the model-based approach, the human provides component models, and the fault management system reasons automatically about system-wide interactions and complex fault combinations. This approach improves correctness and makes the underlying system models explicit, whereas these are implicit in the rule-based approach. We propose a fault detection engine, Compiled Mode Estimation (CME), that unifies the strengths of the rule-based and model-based approaches. CME uses a compiled model to determine spacecraft behavior more accurately. Reasoning related to fault detection is compiled off-line into a set of concurrent, localized diagnostic rules. These are then combined on-line with sensor information to reconstruct the diagnosis of the system. These rules enable a human to inspect the diagnostic consequences of CME. Additionally, CME is capable of reasoning through component interactions automatically while still providing fast and correct responses. The implementation of this engine has been tested against the NEAR spacecraft's advanced rule-based system, resulting in the detection of failures beyond those detected by the rules. This evolution in fault detection will enable future missions to explore the furthest reaches of the solar system without the burden of human intervention to repair failed components.
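
A rough sketch of the compiled-rule idea described above: each localized rule maps a sensor observation to the component modes consistent with it, and combining rules on-line intersects those candidate sets. The components, sensors and modes are hypothetical illustrations, not the actual CME rule set.

    # Each compiled rule maps a local observation to candidate modes per component.
    RULES = {
        ("outlet_pressure", "low"): {"valve": {"stuck_closed", "degraded"}},
        ("valve_current", "nominal"): {"valve": {"degraded", "ok"}},
    }

    def diagnose(observations):
        # Intersect per-sensor candidates: a mode survives only if it is
        # consistent with every observation mentioning its component.
        diagnosis = {}
        for obs in observations:
            for component, modes in RULES.get(obs, {}).items():
                diagnosis[component] = diagnosis.get(component, modes) & modes
        return diagnosis

    print(diagnose([("outlet_pressure", "low"), ("valve_current", "nominal")]))
    # -> {'valve': {'degraded'}}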

Relevance:

100.00%

Publisher:

Abstract:

Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them also involves complicated workflows implemented as shell scripts. A new grid middleware system that is well suited to climate modelling applications is presented in this paper. Grid Remote Execution (G-Rex) allows climate models to be deployed as Web services on remote computer systems and then launched and controlled as if they were running on the user's own computer. Output from the model is transferred back to the user while the run is in progress to prevent it from accumulating on the remote system and to allow the user to monitor the model. G-Rex has a REST architectural style, featuring a Java client program that can easily be incorporated into existing scientific workflow scripts. Some technical details of G-Rex are presented, with examples of its use by climate modellers.
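
Since G-Rex exposes models as Web services in a REST style, a client interaction might look roughly like the sketch below; the base URL, endpoint paths and JSON fields are hypothetical placeholders rather than the actual G-Rex API.

    import time
    import requests  # third-party HTTP client

    BASE = "https://example.org/grex"  # hypothetical G-Rex service URL

    # Launch a remote model run (endpoint and payload are illustrative only).
    job = requests.post(f"{BASE}/services/climate-model/instances",
                        json={"args": ["--years", "10"]}).json()
    job_id = job["id"]

    # Poll for status and copy output back while the run is in progress,
    # so data does not accumulate on the remote system.
    while True:
        status = requests.get(f"{BASE}/services/climate-model/instances/{job_id}").json()
        for name in status.get("finishedOutputFiles", []):
            data = requests.get(f"{BASE}/services/climate-model/instances/{job_id}/outputs/{name}")
            with open(name, "wb") as f:
                f.write(data.content)
        if status["state"] in ("FINISHED", "ERROR"):
            break
        time.sleep(30)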

Relevance:

100.00%

Publisher:

Abstract:

Cryoturbated Upper Chalk is a dichotomous porous medium wherein the intra-fragment porosity provides water storage and the inter-fragment porosity provides potential pathways for relatively rapid flow near saturation. Chloride tracer movement through 43 cm long and 45 cm diameter undisturbed chalk columns was studied at water application rates of 0.3, 1.0, and 1.5 cm h⁻¹. Microscale heterogeneity in effluent was recorded using a grid collection system consisting of 98 funnel-shaped cells, each 3.5 cm in diameter. The total porosity of the columns was 0.47 ± 0.02 m³ m⁻³, approximately 13% of pores were >15 µm in diameter, and the saturated hydraulic conductivity was 12.66 ± 1.31 m day⁻¹. Although the columns remained unsaturated during the leaching event at all application rates, the proportion of flow through macropores increased as the application rate decreased. The number of dry cells (with 0 ml of effluent) also increased as the application rate decreased. Half of the leachate was collected from 15, 19 and 22 cells at the 0.3, 1.0 and 1.5 cm h⁻¹ application rates respectively. Similar breakthrough curves (BTCs) were obtained at all three application rates when plotted as a function of cumulative drainage, but they were distinctly different when plotted as a function of time. The BTCs indicate that the columns have a similar drainage requirement irrespective of application rate, as the rise to the maximum (C/C₀) is almost identical. However, the time required to achieve that leaching requirement varies with application rate, and residence time was shorter at higher application rates. A two-region convection-dispersion model was used to describe the BTCs and fitted well (r² = 0.97-0.99). There was a linear relationship between the dispersion coefficient and pore water velocity (correlation coefficient r = 0.95). The results demonstrate the microscale heterogeneity of hydrodynamic properties in the Upper Chalk. Copyright (C) 2007 John Wiley & Sons, Ltd.
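
For reference, the standard two-region (mobile-immobile) convection-dispersion model usually written for such column experiments is sketched below in LaTeX; the symbols follow common usage and may differ from the paper's exact notation.

    \begin{align}
    \theta_m \frac{\partial C_m}{\partial t} + \theta_{im} \frac{\partial C_{im}}{\partial t}
      &= \theta_m D \frac{\partial^2 C_m}{\partial x^2} - q \frac{\partial C_m}{\partial x},\\
    \theta_{im} \frac{\partial C_{im}}{\partial t} &= \alpha \,(C_m - C_{im}),
    \end{align}

where \theta_m and \theta_{im} are the mobile and immobile water contents, C_m and C_{im} the corresponding solute concentrations, D the dispersion coefficient, q the Darcy flux, and \alpha the first-order mass-transfer coefficient between the two regions.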

Relevance:

100.00%

Publisher:

Abstract:

This thesis focuses on using photovoltaic electricity to power air conditioners in a tropical climate. The study takes place in Surabaya, Indonesia, at two different locations: the classroom, located on the UBAYA campus, and the home office, 10 km away. Indonesia has an average solar irradiation of about 4.8 kWh/m²/day (PWC Indonesia, 2013), which provides ideal conditions for these tests. At the home office, tests were conducted on different photovoltaic systems. A series of measuring devices recorded the performance of the 800 W PV system and the consumption of the 1.35 kW (cooling capacity) air conditioner. For an off-grid system, many of the components need to be oversized. The inverter has to be oversized to meet the start-up load of the air conditioner, which can be 3 to 8 times the operating power (Rozenblat, 2013). The high energy consumption of the air conditioner would require a large battery storage to provide one day of autonomy, and the PV system's output must at least match the consumption of the air conditioner. A grid-connected system provides a much better solution, with the 800 W PV system providing 80% of the 3.5 kWh load of the air conditioner and the other 20% coming from the grid during periods of low irradiation. In this system the start-up load is provided by the grid, so the inverter does not need to be oversized. With the grid-connected system, the PV panels' production does not need to match the consumption of the air conditioner, although a smaller PV array will mean that a smaller percentage of the load is covered by PV. Using the results from the home office tests and measurements made in the classroom, two different PV systems (8 kW and 12 kW) were simulated to power both the current air conditioners (COP 2.78) and new air conditioners (COP 4.0). The payback period of the systems can vary greatly depending on whether a feed-in tariff is awarded. If the feed-in tariff is awarded, the best system is the 12 kW system, with a payback period of 4.3 years and a levelized cost of energy of -3,334 IDR/kWh. If the feed-in tariff is not granted, then the 8 kW system is the best choice, with a lower payback period and lower levelized cost of energy than the 12 kW system under the same conditions.
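
A minimal sketch of the kind of payback and levelized-cost arithmetic behind those figures; the functions ignore discounting and the example numbers are hypothetical placeholders, not the thesis's actual inputs.

    def simple_payback_years(capital_cost_idr, annual_savings_idr):
        # Years for cumulative savings to repay the up-front system cost (no discounting).
        return capital_cost_idr / annual_savings_idr

    def levelized_cost_of_energy(capital_cost_idr, annual_om_idr, annual_kwh, lifetime_years):
        # LCOE = lifetime costs / lifetime energy produced (undiscounted simplification).
        total_cost = capital_cost_idr + annual_om_idr * lifetime_years
        return total_cost / (annual_kwh * lifetime_years)

    # Hypothetical example values, for illustration only:
    print(simple_payback_years(capital_cost_idr=180_000_000, annual_savings_idr=42_000_000))
    print(levelized_cost_of_energy(180_000_000, 2_000_000, 16_000, 25))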

Relevance:

100.00%

Publisher:

Abstract:

In order to guarantee database consistency, a database system should synchronize the operations of concurrent transactions. The database component responsible for such synchronization is the scheduler. A scheduler synchronizes operations belonging to different transactions by means of concurrency control protocols. Concurrency control protocols may present different behaviors: in general, a scheduler's behavior can be classified as aggressive or conservative. This paper presents the Intelligent Transaction Scheduler (ITS), which has the ability to synchronize the execution of concurrent transactions in an adaptive manner. This scheduler adapts its behavior (aggressive or conservative) according to the characteristics of the computing environment in which it is inserted, using an expert system based on fuzzy logic. The ITS can implement different correctness criteria, such as conventional (syntactic) serializability and semantic serializability. In order to evaluate the performance of the ITS relative to schedulers with exclusively aggressive or conservative behavior, it was applied in a dynamic environment, namely a Mobile Database Community (MDBC). An MDBC simulator was developed and many sets of tests were run. The experimental results presented herein demonstrate the efficiency of the ITS in synchronizing transactions in a dynamic environment.
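
A rough sketch of the adaptive idea, using a plain conflict-rate threshold as a stand-in for the fuzzy-logic expert system described above; the metric and threshold are illustrative assumptions only.

    class AdaptiveScheduler:
        """Switches between conservative (validate before executing) and aggressive
        (execute first, validate at commit) behavior based on the observed conflict rate."""

        def __init__(self, conflict_threshold=0.3):
            self.conflict_threshold = conflict_threshold
            self.mode = "aggressive"

        def observe(self, conflicts, committed):
            rate = conflicts / max(committed, 1)
            self.mode = "conservative" if rate > self.conflict_threshold else "aggressive"
            return self.mode

    scheduler = AdaptiveScheduler()
    print(scheduler.observe(conflicts=12, committed=20))  # high conflict rate -> conservative
    print(scheduler.observe(conflicts=1, committed=20))   # low conflict rate  -> aggressive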

Relevance:

100.00%

Publisher:

Abstract:

The northern portion of Rio Grande do Norte State is characterized by intense coastal dynamics affecting areas with ecosystems of moderate to high environmental sensitivity. The main socioeconomic activities of the state are installed in this region: the salt industry, shrimp farming, fruit growing and the oil industry. The oil industry suffers the effects of coastal dynamics, which cause problems such as erosion and the exposure of wells and pipelines along the shore. This motivated the monitoring of such modifications, seeking to understand the changes that cause environmental impacts and to detect and assess the areas most vulnerable to them. Coastal areas under the influence of the oil industry are highly vulnerable and sensitive in the event of accidents involving oil spills in their vicinity. Therefore, geoenvironmental monitoring of the region was established with the aim of evaluating the evolution of the entire coastal area and checking the sensitivity of each site to the presence of oil. The goal of this work was the implementation of a computer system that combines the insertion and visualization of thematic maps for the generation of environmental vulnerability maps, using Business Intelligence (BI) techniques on vector information previously stored in the database. The fundamental design interest was to implement a scalable system that meets the needs of diverse fields of study and is suitable for generating online vulnerability maps, automating the methodology so as to facilitate data manipulation and deliver fast results for real-time operational decision-making. To develop the geographic database it was necessary to generate the conceptual model of the selected data, and the Web system was built using the PostgreSQL database system, its spatial extension PostGIS, the GlassFish web server and GeoServer to display maps on the Web.
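
A small sketch of how such a system might pull vulnerability polygons out of PostgreSQL/PostGIS for map generation; the connection settings, table and column names are hypothetical, not the project's actual schema.

    import psycopg2  # PostgreSQL driver

    conn = psycopg2.connect(dbname="geodb", user="gis", password="***", host="localhost")
    cur = conn.cursor()

    # Fetch vulnerability polygons intersecting an area of interest as GeoJSON.
    cur.execute("""
        SELECT id, vulnerability_index, ST_AsGeoJSON(geom)
        FROM coastal_vulnerability
        WHERE ST_Intersects(geom, ST_MakeEnvelope(%s, %s, %s, %s, 4326))
    """, (-37.4, -5.2, -36.8, -4.8))

    for feature_id, index, geojson in cur.fetchall():
        print(feature_id, index, geojson[:60])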

Relevance:

100.00%

Publisher:

Abstract:

This paper describes an environment for constructing multimedia applications which are used to present multimedia database objects from different sources in accordance with spatiotemporal constraints. The main contribution of this paper is to propose an environment which integrates both a modelling CASE tool and an object-oriented database system.

Relevance:

100.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

100.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

100.00%

Publisher:

Abstract:

Graduate Program in Electrical Engineering - FEIS

Relevance:

100.00%

Publisher:

Abstract:

Much research has focused on desertification and land degradation assessments without putting sufficient emphasis on prevention and mitigation, although the concept of sustainable land management (SLM) is increasingly being acknowledged. A variety of SLM measures have already been applied at the local level, but they are rarely adequately recognised, evaluated, shared or used for decision support. WOCAT (World Overview of Technologies and Approaches) has developed an internationally recognised, standardised methodology to document and evaluate SLM technologies and approaches, including their spatial distribution, allowing the sharing of SLM knowledge worldwide. The recent integration of the methodology into a participatory process now allows this knowledge to be analysed and used for decision support at the local and national levels. The use of the WOCAT tools stimulates evaluation (self-evaluation as well as learning from comparing experiences) within SLM initiatives, where all too often there is not only insufficient monitoring but also a lack of critical analysis. The comprehensive questionnaires and database system facilitate documenting, evaluating and disseminating local experiences of SLM technologies and their implementation approaches. This evaluation process - in a team of experts and together with land users - greatly enhances understanding of the reasons behind successful (or failed) local practices. It has now been integrated into a new methodology for appraising and selecting SLM options. The methodology combines a local collective learning and decision approach with the use of the evaluated global best practices from WOCAT in a concise three-step process: i) identifying land degradation and locally applied solutions in a stakeholder learning workshop; ii) assessing local solutions with the standardised WOCAT tool; iii) jointly selecting promising strategies for implementation with the help of a decision support tool. The methodology has been implemented in various countries and study sites around the world, mainly within the FAO LADA (Land Degradation Assessment) project and the EU-funded DESIRE project. Investments in SLM must be carefully assessed and planned on the basis of properly documented experiences and evaluated impacts and benefits: concerted efforts are needed and sufficient resources must be mobilised to tap the wealth of knowledge and learn from SLM successes.

Relevance:

100.00%

Publisher:

Abstract:

A tandem mass spectral database system consists of a library of reference spectra and a search program. State-of-the-art search programs show a high tolerance for variability in compound-specific fragmentation patterns produced by collision-induced decomposition and enable sensitive and specific 'identity search'. In this communication, the performance characteristics of two search algorithms combined with the 'Wiley Registry of Tandem Mass Spectral Data, MSforID' (Wiley Registry MSMS, John Wiley and Sons, Hoboken, NJ, USA) were evaluated. The search algorithms tested were the MSMS search algorithm implemented in the NIST MS Search program 2.0g (NIST, Gaithersburg, MD, USA) and the MSforID algorithm (John Wiley and Sons, Hoboken, NJ, USA). Sample spectra were acquired on different instruments, thus covering a broad range of possible experimental conditions, or were generated in silico. For each algorithm, more than 30,000 matches were performed. Statistical evaluation of the library search results revealed that, in principle, both search algorithms can be combined with the Wiley Registry MSMS to create a reliable identification tool. It appears, however, that a higher degree of spectral similarity is necessary to obtain a correct match with the NIST MS Search program. This characteristic of the NIST MS Search program has a positive effect on specificity, as it helps to avoid false positive matches (type I errors), but it reduces sensitivity. Thus, particularly with sample spectra acquired on instruments whose setup differs from tandem-in-space-type fragmentation, a comparably higher number of false negative matches (type II errors) was observed when searching the Wiley Registry MSMS.
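
The trade-off described above reduces to standard sensitivity and specificity figures; a minimal sketch with hypothetical match counts (not the study's actual results):

    def sensitivity(true_positives, false_negatives):
        # Fraction of correct compounds that the search actually matched.
        return true_positives / (true_positives + false_negatives)

    def specificity(true_negatives, false_positives):
        # Fraction of absent compounds correctly left unmatched.
        return true_negatives / (true_negatives + false_positives)

    # Hypothetical counts for illustration only:
    print(sensitivity(true_positives=900, false_negatives=100))  # 0.90
    print(specificity(true_negatives=950, false_positives=50))   # 0.95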

Relevance:

100.00%

Publisher:

Abstract:

The main aim of the methodology presented in this paper is to provide a framework for a participatory process for the appraisal and selection of options to mitigate desertification and land degradation. This methodology is being developed within the EU project DESIRE (www.desire-project.eu/) in collaboration with WOCAT (www.wocat.org). It is used to select promising conservation strategies for test implementation in each of the 16 degradation and desertification hotspot sites in the Mediterranean and around the world. The methodology consists of three main parts. In the first step, prevention and mitigation strategies already applied at the respective DESIRE study site are identified and listed during a workshop with representatives of different stakeholder groups (land users, policy makers, researchers). The participatory and process-oriented approach initiates a mutual learning process among the different stakeholders by sharing knowledge and jointly reflecting on current problems and solutions related to land degradation and desertification. In the second step, these identified, locally applied solutions (technologies and approaches) are assessed with the help of the WOCAT methodology. Comprehensive questionnaires and a database system have been developed to document and evaluate all relevant aspects of technical measures as well as implementation approaches, by teams of researchers and specialists together with land users. This research process ensures the systematic assessment and piecing together of local information, together with specific details about the environmental and socio-economic setting. The third part consists of another stakeholder workshop where promising strategies for sustainable land management in the given context are selected, based on the WOCAT best practices database, including the locally applied strategies evaluated at the DESIRE sites. These promising strategies will be assessed with the help of a selection and decision support tool and adapted for test implementation at the study site.