952 results for sensor grid database system


Relevance:

100.00%

Publisher:

Abstract:

List-type data is the most common form of data in ecological research. Based on an analysis of the characteristics of list-type data, its relationship to metadata, and issues of data security and sharing policy, a design and development scheme for a list-type data management system for ecological research is proposed. The study argues that a dataset's metadata is not merely a description of the dataset entity: to a certain extent it also determines the content and quantity of the dataset entities, as well as the intrinsic relationships among them, and it is precisely these relationships that provide the basis for managing list-type data.
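
The claim that metadata both describes list-type datasets and determines the links used to manage them can be pictured with a small sketch. The dataset names, field lists, and link keys below are hypothetical, invented purely for illustration of metadata-driven validation and linking, and are not taken from the system described above.

```python
# Minimal sketch: metadata records that describe list-type datasets and also
# declare the relationships used to manage them. All names are hypothetical.

metadata = {
    "plot_survey": {
        "fields": ["plot_id", "species", "count", "date"],
        "links": {"plot_id": "plot_registry"},   # relationship to another dataset
    },
    "plot_registry": {
        "fields": ["plot_id", "latitude", "longitude"],
        "links": {},
    },
}

def validate(dataset_name, rows):
    """Check that every row carries exactly the fields its metadata prescribes."""
    expected = set(metadata[dataset_name]["fields"])
    return all(set(row) == expected for row in rows)

def join_key(src, dst):
    """Return the field that relates two datasets, as declared in the metadata."""
    for field, target in metadata[src]["links"].items():
        if target == dst:
            return field
    return None

rows = [{"plot_id": "P1", "species": "Quercus", "count": 12, "date": "2008-06-01"}]
print(validate("plot_survey", rows))             # True
print(join_key("plot_survey", "plot_registry"))  # plot_id
```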

Relevance:

100.00%

Publisher:

Abstract:

HIRFL (the Heavy Ion Research Facility in Lanzhou) is China's first large-scale heavy-ion physics research facility, and the HIRFL control system is a key element in keeping the facility running correctly and efficiently. This thesis applies database and network technology to design a database system for the HIRFL control system, so that the entire facility can be controlled over the network from a single application. The thesis first introduces database system theory, then covers the background knowledge relevant to each part of the software design, and finally discusses in detail the database design and the human-machine interface design of the application. The database is built on SQL Server 2000 and accessed through ODBC; the human-machine interface is implemented as a Windows program using object-oriented programming. In addition to building a database containing the device information of every part of the control system, this work delivers a set of distributed HIRFL control software covering all basic control functions, providing a platform for further development and refinement of the HIRFL control system database.
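
As a rough illustration of the ODBC-based access described above, the Python sketch below connects to a SQL Server database and reads device records. The connection string, table, and column names are hypothetical, and the thesis itself used Windows clients written with object-oriented tools, so this is only a minimal stand-in for the access pattern, not the actual HIRFL software.

```python
# Minimal sketch of ODBC access to a control-system device table.
# Server, credentials, table name, and columns are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=hirfl-db;DATABASE=control;UID=operator;PWD=secret"
)
cursor = conn.cursor()

# Fetch the set points of all devices in one (hypothetical) subsystem.
cursor.execute(
    "SELECT device_id, set_point, unit FROM devices WHERE subsystem = ?",
    ("injector",),
)
for device_id, set_point, unit in cursor.fetchall():
    print(f"{device_id}: {set_point} {unit}")

conn.close()
```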

Relevance:

100.00%

Publisher:

Abstract:

As every sector of the information society develops, data acquisition and storage are becoming increasingly important. Because traditional database technology lacks support for temporal relationships, it cannot manage time-related data effectively. The temporal relational model extends the traditional relational model and provides good support for temporal relationships. More and more applications require database systems that can store massive amounts of data with an efficient storage structure and good query performance. Taking this as its goal, this thesis designs and implements the AgiTDB temporal database system and validates it in an enterprise setting, with good results. The main work of the thesis is as follows:

1. The current state of temporal database research is analyzed and the problems found in practice are summarized; to address the storage and query performance shortcomings of existing temporal databases, the AgiTDB architecture is designed and its main modules are presented: the storage, query, compression, concurrency control, and file access modules.

2. An efficient data file structure and a multi-level index based on time intervals are designed; the data cache structure in the system kernel, the storage service workflow, and the message queue management mechanism are presented, together with the structure and workflow of the query manager and the high-speed query cache.

3. A lossy compression algorithm and its workflow are given, along with a unified interface for lossless compression algorithms; the system's concurrency control mechanism is designed and implemented on the basis of read-write locks.

4. The AgiTDB system is developed and implemented, and successfully embedded into the Agilor real-time database system as its historical data management subsystem.

Practical applications show that the Agilor real-time database, with AgiTDB as its historical data management subsystem, can store massive amounts of data with good system performance. Agilor has already been applied in petrochemicals, electric power, telecommunications, and many other fields.
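
A time-interval index of the kind mentioned in point 2 can be sketched very simply. The thesis does not give AgiTDB's exact layout, so the two-level structure below (a coarse directory of time buckets over time-sorted samples) is only an assumed illustration of how interval queries avoid scanning the whole history.

```python
# Minimal sketch of a two-level time index: a coarse directory of buckets,
# each holding (timestamp, value) samples sorted by time. The layout is an
# assumption for illustration, not AgiTDB's actual structure.
from bisect import bisect_left, bisect_right

class TimeIndex:
    def __init__(self, bucket_seconds=3600):
        self.bucket_seconds = bucket_seconds
        self.buckets = {}                        # bucket id -> sorted [(t, value)]

    def insert(self, t, value):
        b = self.buckets.setdefault(int(t) // self.bucket_seconds, [])
        b.insert(bisect_left(b, (t, value)), (t, value))

    def query(self, t0, t1):
        """Return all samples with t0 <= t <= t1, touching only relevant buckets."""
        out = []
        for bid in range(int(t0) // self.bucket_seconds,
                         int(t1) // self.bucket_seconds + 1):
            b = self.buckets.get(bid, [])
            lo = bisect_left(b, (t0, float("-inf")))
            hi = bisect_right(b, (t1, float("inf")))
            out.extend(b[lo:hi])
        return out

idx = TimeIndex()
for t in range(0, 10000, 10):
    idx.insert(t, t * 0.1)
print(len(idx.query(3600, 7200)))               # only two buckets are touched
```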

Relevance:

100.00%

Publisher:

Abstract:

This paper describes a system built on the guiding method of data fusion that has achieved satisfactory results in mobile robot localization. The system has two distinguishing features: a distributed blackboard is used as the computational structure, giving the system parallel processing capability, and temporal (time-sequence) reasoning is introduced into the system, so that the important role of time is taken into account during data fusion. The fusion method and architecture proposed here can in principle be applied to other related problem domains.
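
The distributed blackboard with temporal reasoning can be pictured with a small sketch: independent knowledge sources post time-stamped position hypotheses to a shared blackboard, and a control step fuses only those that are still fresh. The structure, weights, and freshness window below are illustrative assumptions, not the authors' actual system.

```python
# Minimal sketch of a blackboard for sensor fusion with a simple temporal rule.
# Knowledge sources post time-stamped position estimates; stale entries are ignored.
# Weights and the freshness window are illustrative assumptions.

class Blackboard:
    def __init__(self, max_age=1.0):
        self.entries = []            # (timestamp, source, (x, y), weight)
        self.max_age = max_age

    def post(self, timestamp, source, position, weight=1.0):
        self.entries.append((timestamp, source, position, weight))

    def fuse(self, now):
        """Weighted average of all estimates posted within the freshness window."""
        fresh = [(p, w) for t, _, p, w in self.entries if now - t <= self.max_age]
        if not fresh:
            return None
        total = sum(w for _, w in fresh)
        x = sum(p[0] * w for p, w in fresh) / total
        y = sum(p[1] * w for p, w in fresh) / total
        return x, y

bb = Blackboard(max_age=0.5)
bb.post(10.0, "odometry", (2.0, 3.1), weight=1.0)
bb.post(10.2, "sonar", (2.3, 2.9), weight=2.0)
bb.post(8.0, "old_fix", (9.9, 9.9))          # too old, excluded from the fusion
print(bb.fuse(now=10.3))                     # roughly (2.2, 2.97)
```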

Relevance:

100.00%

Publisher:

Abstract:

The biggest drawback of Chinese-character databases on microcomputers is that they are too slow to be practical. This paper proposes a new Chinese-character retrieval method, the flag-field method, which solves this key problem and speeds up lookups by nearly a factor of ten. Combined with a series of further measures such as single-level continuous querying, it also saves a considerable amount of storage space and broadens the range of applications for microcomputers. The database is well suited to "end users": even people with no computer background can learn to use it within a minute or two.

Relevance:

100.00%

Publisher:

Abstract:

In this paper, through extensive study and design, a technical plan for establishing the exploration database center is made that combines imported and self-developed techniques. Through research and repeated experiment, a modern database center has been set up, with high-performance hardware and network, a well-configured system, complete data storage and management, and fast, direct data support. Based on a study of decision theory, methods, and models, an exploration decision assistant schema is designed, and one decision plan, the well location decision support system, has been evaluated and put into action.

1. Establishment of the Shengli exploration database center. Research is made on the hardware configuration of the database center, including its workstations and all connected hardware and systems. The hardware of the database center is formed by connecting workstations, microcomputer workstations, disk arrays, and the equipment used for seismic processing and interpretation. Research on data storage and management covers analysis of the contents to be managed, data flow, data standards, data QC, backup and restore policy, and optimization of the database system. A reasonable data management regulation and workflow have been established, creating a scientific exploration data management system. Data loading was carried out according to a schedule, and in the end more than 200 seismic survey projects, amounting to 25 TB, have been loaded.

2. Exploration work support system and its application. Support for seismic data processing includes automatic extraction of seismic attributes, GIS navigation, data ordering, extraction of data cubes of any size, a pseudo huge-capacity disk array, and a standard output exchange format. Prestack data can be accessed directly by the processing system, or transferred to other processing systems through the standard exchange format. Support for seismic interpretation includes automatic scanning and storage of interpretation results and internal data quality control; the interpretation system is connected directly to the database center for real-time access to seismic, formation, and well data. Comprehensive geological study is supported over the intranet, with the ability to query or display data graphically on the navigation system under geological constraints. The production management support system is mainly used to collect, analyze, and display production data, its core technology being controlled data collection and the creation of multiple standard forms.

3. Exploration decision support system design. By classifying the workflow and data flow of all exploration stages and studying decision theory and methods, the target of each decision step, the decision models, and their requirements, three conceptual models have been formed for the Shengli exploration decision support system: the exploration distribution support system, the well location support system, and the production management support system. The well location decision support system has passed evaluation and been put into action.

4. Technical advances. Hardware and software are matched for high performance in the database center. A parallel computer system, database server, huge-capacity ATL, disk arrays, network, and firewall are combined to create the first exploration database center in China, with a reasonable configuration, high performance, and the ability to manage the complete exploration data sets. A technology for managing huge volumes of exploration data has been formed, in which exploration data standards and management regulations guarantee data quality, safety, and security. A multifunction query and support system provides comprehensive exploration information support, covering geological study, seismic processing and interpretation, and production management; many new database and computer technologies are used in the system to provide real-time information support for exploration work. Finally, the Shengli exploration decision support system is designed.

5. Application and benefit. Data storage has reached 25 TB, and thousands of users in the Shengli oil field access the data, improving work efficiency several times over. The technology has also been applied by many other units of SINOPEC. Providing data to the project "Exploration Achievements and Evaluation of Favorable Targets in the Hekou Area" shortened the data preparation period from 30 days to 2 days, enriched data abundance by 15 percent, and gave the project full information support from the database center. Providing previously processed results to the project "Pre-stack Depth Migration in the Guxi Fracture Zone" reduced repeated processing, shortened the work period by one month, improved processing precision and quality, and saved 30 million yuan in data processing investment. Automatically providing a project database for the project "Geological and Seismic Study of the Southern Slope Zone of the Dongying Sag" shortened data preparation time, giving researchers more time for research and thus improving interpretation precision and quality.

Relevance:

100.00%

Publisher:

Abstract:

In an outsourced database system the data owner publishes information through a number of remote, untrusted servers with the goal of enabling clients to access and query the data more efficiently. As clients cannot trust servers, query authentication is an essential component in any outsourced database system. Clients should be given the capability to verify that the answers provided by the servers are correct with respect to the actual data published by the owner. While existing work provides authentication techniques for selection and projection queries, there is a lack of techniques for authenticating aggregation queries. This article introduces the first known authenticated index structures for aggregation queries. First, we design an index that features good performance characteristics for static environments, where few or no updates occur to the data. Then, we extend these ideas and propose more involved structures for the dynamic case, where the database owner is allowed to update the data arbitrarily. Our structures feature excellent average case performance for authenticating queries with multiple aggregate attributes and multiple selection predicates. We also implement working prototypes of the proposed techniques and experimentally validate the correctness of our ideas.
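
The core idea of an authenticated index for aggregation can be hinted at, very roughly, with a hash tree whose internal digests bind a subtree sum together with the children's digests, so the untrusted server returns an aggregate plus a short proof and the client recomputes the owner's published root digest. The sketch below is one simplified instance of that idea for a single SUM over the whole set; the node layout is an assumption and it is not the article's actual index structure.

```python
# Minimal sketch: a Merkle-style tree whose internal digests bind the subtree SUM,
# so a client holding only the root digest can verify the total returned by an
# untrusted server. Simplified to a whole-set SUM; the node layout is assumed.
import hashlib

def h(*parts):
    return hashlib.sha256("|".join(str(p) for p in parts).encode()).hexdigest()

def build(values):
    """Return (digest, total) for the subtree over `values`."""
    if len(values) == 1:
        return h("leaf", values[0]), values[0]
    mid = len(values) // 2
    (ld, ls), (rd, rs) = build(values[:mid]), build(values[mid:])
    total = ls + rs
    return h("node", total, ld, rd), total

# Data owner: builds the tree and publishes only the root digest.
values = [4, 8, 15, 16, 23, 42]
root_digest, _ = build(values)

# Untrusted server: answers a SUM query with the total and the children's digests.
mid = len(values) // 2
(ld, ls), (rd, rs) = build(values[:mid]), build(values[mid:])
answer = {"total": ls + rs, "left_digest": ld, "right_digest": rd}

# Client: recomputes the root digest from the answer and compares.
ok = h("node", answer["total"], answer["left_digest"], answer["right_digest"]) == root_digest
print(answer["total"], ok)   # 108 True
```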

Relevance:

100.00%

Publisher:

Abstract:

Emerging healthcare applications can benefit enormously from recent advances in pervasive technology and computing. This paper introduces the CLARITY Modular Ambient Health and Wellness Measurement Platform, which is a heterogeneous and robust pervasive healthcare solution currently under development at the CLARITY Center for Sensor Web Technologies. This intelligent and context-aware platform comprises the Tyndall Wireless Sensor Network prototyping system, augmented with an agent-based middleware and frontend computing architecture. The key contribution of this work is to highlight how interoperability, expandability, reusability and robustness can be manifested in the modular design of the constituent nodes and the inherently distributed nature of the controlling software architecture.

Relevance:

100.00%

Publisher:

Abstract:

The aim of this work is to improve retrieval and navigation services on bibliographic data held in digital libraries. This paper presents the design and implementation of OntoBib, an ontology-based bibliographic database system that adopts ontology-driven search in its retrieval. The presented work exemplifies how a digital library of bibliographic data can be managed using Semantic Web technologies and how utilizing domain-specific knowledge improves both search efficiency and the navigation of web information and document retrieval.
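
Ontology-driven retrieval of the kind OntoBib performs can be hinted at with a small rdflib/SPARQL sketch: the query follows relationships in the ontology (here, co-authorship) instead of matching free text. The namespace, properties, and sample triples are invented for illustration and are not OntoBib's actual schema.

```python
# Minimal sketch of ontology-driven bibliographic search with rdflib.
# The namespace, properties, and sample triples are hypothetical.
from rdflib import Graph, Namespace, Literal, URIRef

BIB = Namespace("http://example.org/bib#")
g = Graph()

paper = URIRef("http://example.org/bib#paper1")
g.add((paper, BIB.title, Literal("Ontology-based retrieval")))
g.add((paper, BIB.hasAuthor, BIB.smith))
g.add((paper, BIB.hasAuthor, BIB.jones))
g.add((BIB.smith, BIB.name, Literal("A. Smith")))
g.add((BIB.jones, BIB.name, Literal("B. Jones")))

# Navigate the ontology: find co-authors of A. Smith rather than papers
# that merely mention the name in their text.
q = """
PREFIX bib: <http://example.org/bib#>
SELECT DISTINCT ?coauthorName WHERE {
    ?author bib:name "A. Smith" .
    ?paper bib:hasAuthor ?author ;
           bib:hasAuthor ?coauthor .
    ?coauthor bib:name ?coauthorName .
    FILTER(?coauthor != ?author)
}
"""
for row in g.query(q):
    print(row.coauthorName)    # B. Jones
```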

Relevance:

100.00%

Publisher:

Abstract:

In this paper, we present an inertial-sensor-based monitoring system for measuring the movement of human upper limbs. Two wearable inertial sensors are placed near the wrist and elbow joints, respectively. The measurement drift in segment orientation is dramatically reduced after a Kalman filter is applied to estimate inclinations using accelerations and turning rates from gyroscopes. Using premeasured lengths of the upper and lower arms, we compute the position of the wrist and elbow joints via a proposed kinematic model. Experimental results demonstrate that this new motion capture system, in comparison to an optical motion tracker, possesses an RMS position error of less than 0.009 m, with a drift of less than 0.005 m/s in five daily activities. In addition, the RMS angle error is less than 3°. This indicates that the proposed approach has performed well in terms of accuracy and reliability.
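
The drift reduction described above, fusing gyroscope rates with accelerometer-derived inclination, can be sketched with a one-dimensional filter. The gains and noise figures below are illustrative assumptions, and this simple scalar Kalman filter only stands in for the paper's full orientation estimator.

```python
# Minimal sketch: scalar Kalman filter fusing a gyro rate (prediction) with an
# accelerometer inclination (measurement) to limit drift. Noise values are assumed.

def kalman_inclination(gyro_rates, accel_angles, dt=0.01, q=0.001, r=0.03):
    """gyro_rates in rad/s, accel_angles in rad; returns fused angle estimates."""
    angle, p = accel_angles[0], 1.0          # state and its variance
    estimates = []
    for rate, meas in zip(gyro_rates, accel_angles):
        # Predict: integrate the gyro rate; variance grows by process noise q.
        angle += rate * dt
        p += q
        # Update: correct toward the accelerometer inclination.
        k = p / (p + r)                      # Kalman gain
        angle += k * (meas - angle)
        p *= (1.0 - k)
        estimates.append(angle)
    return estimates

# A gyro with a constant bias would drift on its own; the accelerometer
# measurement keeps the estimate anchored near the true inclination.
rates = [0.02] * 500                         # biased gyro reading, rad/s
accel = [0.1] * 500                          # true inclination stays at 0.1 rad
print(round(kalman_inclination(rates, accel)[-1], 3))   # ~0.101, close to 0.1
```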

Relevance:

100.00%

Publisher:

Abstract:

The overall objective of this work was the study and development of sensors based on polymer optical fiber. The growth of polymer technology in recent years has allowed this type of optical fiber to be introduced into telecommunications and into sensor development. The advantages of optical metrology with polymer fiber have been attracting the attention of the scientific community, since they allow the development of systems that are low cost, or cost-competitive with conventional technologies. Given the timeliness of the topic, the work first describes the polymer optical fiber technology available on the market and the state of the art of polymer optical fiber sensors. Two types of sensors based on intensity modulation are then described. An extrinsic sensor was designed to evaluate the amount of light scattered and absorbed by particles suspended in a liquid. The sensor was characterized with respect to the concentration, size, and reflectivity of the suspended particles, and was tested in environmental monitoring, namely in the turbidity analysis of sediment samples collected in burnt areas; the system developed was compared with a commercial one. An intrinsic sensor, based on side-polished polymer optical fiber, was analyzed analytically. The theoretical model evaluates the sensor under different conditions of macrobending and of the refractive index of the surrounding medium, and it was positively validated against experimental results. The sensitivity to temperature was evaluated, and the knowledge acquired was applied to the development of a system capable of monitoring the curing of different materials. A technique is also presented for improving the sensitivity of the curvature sensor by applying a coating to the sensitive zone. The dependence of the power transmitted by a side-polished optical fiber on curvature served as the basis for the development of an instrumented knee brace and elbow brace capable of quantitatively evaluating joint movement. The need for portability led to the development of a wireless system for data acquisition and transmission. The prototypes developed are expected to have a significant impact on future systems for physical medicine and rehabilitation.

Relevance:

100.00%

Publisher:

Abstract:

It is difficult to reach a decision when many users meet in the same place: resolving a problem can take a long time because of the variety of opinions involved. TAmI (Group Decision Making Toolkit) is a system for group decision making in Ambient Intelligence [1]. The program is composed of IGATA [2], WebMeeting, and the related database system. However, because the IP address and password are sent without any encryption, they are exposed to attackers, who can use them for malicious purposes. As a result, even if an attacker corrupts the outcome, the participating members cannot tell. Therefore, in this paper we study a method of applying user authentication to TAmI.
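
One way to add the user authentication the paper calls for, without ever sending the password in the clear, is a simple salted challenge-response exchange. The sketch below is a generic illustration of that idea, not the scheme actually adopted for TAmI; names and parameters are assumptions.

```python
# Minimal sketch of challenge-response authentication so the password (or its hash)
# never crosses the network in the clear. Scheme and names are illustrative only.
import hashlib, hmac, os

# Server side: stores only a derived verifier, never the plain password.
users = {"alice": hashlib.sha256(b"salt1" + b"correct horse").hexdigest()}

def server_challenge():
    return os.urandom(16).hex()

def client_response(password, salt, challenge):
    verifier = hashlib.sha256(salt + password.encode()).hexdigest()
    return hmac.new(verifier.encode(), challenge.encode(), hashlib.sha256).hexdigest()

def server_verify(username, challenge, response):
    expected = hmac.new(users[username].encode(), challenge.encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, response)

challenge = server_challenge()
response = client_response("correct horse", b"salt1", challenge)
print(server_verify("alice", challenge, response))   # True
```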

Relevance:

100.00%

Publisher:

Abstract:

Dance and live performance have been developed and practised by humankind practically since its existence. Over this long period, and up to the present day, these activities have evolved in ways that have kept them relevant and of great importance to human society and culture. This evolution is felt not only in the styles of dance and performance but also in the props and effects they employ to make themselves more attractive to the audience. Despite this evolution, most effects allow no interaction with the dance or performance itself, so that there is a clear separation between the dance proper and the stage effects that accompany it. To bridge this divide, we set out to create a system that would bring the two components together, producing effects that interact with the dance itself, making the show more interactive rather than adding just another accessory, and at the same time making the whole performance more appealing to the general public. To build such a system we turned to current motion-sensor technologies to establish the link between the performer and the effects. The market offers several motion sensors that could be used, but only one could be chosen, so the first part of the work is a study to determine which of these sensors would be most suitable for the system, taking a variety of factors into account. After the sensor was chosen, the MoveU system was developed, and a series of tests was carried out to validate the prototype and verify whether the proposed objectives had been achieved. Finally, MoveU was demonstrated to a number of people (dancers and spectators) so that they could give their opinions and suggest possible improvements. A set of questionnaires was also prepared for the audience to whom the prototype was demonstrated, in order to carry out a statistical analysis of whether the system would appeal to people and to draw conclusions about this work.

Relevance:

100.00%

Publisher:

Abstract:

As exploration of our solar system and outer space moves into the future, spacecraft are being developed to venture on increasingly challenging missions with bold objectives. The spacecraft tasked with completing these missions are becoming progressively more complex. This increases the potential for mission failure due to hardware malfunctions and unexpected spacecraft behavior. A solution to this problem lies in the development of an advanced fault management system. Fault management enables a spacecraft to respond to failures and take repair actions so that it may continue its mission. The two main approaches developed for spacecraft fault management have been rule-based and model-based systems. Rules map sensor information to system behaviors, achieving fast response times and making the actions of the fault management system explicit. These rules are developed by having a human reason through the interactions between spacecraft components, a process limited by the number of interactions a human can reason about correctly. In the model-based approach, the human provides component models, and the fault management system reasons automatically about system-wide interactions and complex fault combinations. This approach improves correctness and makes the underlying system models explicit, whereas they remain implicit in the rule-based approach. We propose a fault detection engine, Compiled Mode Estimation (CME), that unifies the strengths of the rule-based and model-based approaches. CME uses a compiled model to determine spacecraft behavior more accurately. Reasoning related to fault detection is compiled in an off-line process into a set of concurrent, localized diagnostic rules. These are then combined on-line with sensor information to reconstruct the diagnosis of the system. The rules enable a human to inspect the diagnostic consequences of CME. Additionally, CME is capable of reasoning through component interactions automatically while still providing fast and correct responses. The implementation of this engine has been tested against the NEAR spacecraft's advanced rule-based system, resulting in the detection of failures beyond those caught by the rules. This evolution in fault detection will enable future missions to explore the furthest reaches of the solar system without the burden of human intervention to repair failed components.
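
The idea of compiling diagnosis into small, localized rules that are combined on-line with sensor data can be caricatured in a few lines. The component names, thresholds, and rules below are invented; this only illustrates the general rule-combination step, not CME's actual compilation algorithm.

```python
# Minimal sketch: localized diagnostic rules, each looking at a few sensors,
# combined on-line into a system-level diagnosis. Rules and thresholds are invented.

def thruster_rule(sensors):
    if sensors["thruster_cmd"] == "fire" and sensors["accel"] < 0.01:
        return {"thruster": "failed-off"}
    return {}

def battery_rule(sensors):
    if sensors["bus_voltage"] < 24.0:
        return {"battery": "undervoltage"}
    return {}

RULES = [thruster_rule, battery_rule]       # produced off-line, applied on-line

def diagnose(sensors):
    """Combine the conclusions of all localized rules into one diagnosis."""
    diagnosis = {}
    for rule in RULES:
        diagnosis.update(rule(sensors))
    return diagnosis or {"system": "nominal"}

print(diagnose({"thruster_cmd": "fire", "accel": 0.0, "bus_voltage": 27.5}))
# {'thruster': 'failed-off'}
```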