803 results for common agent architecture design


Relevance: 100.00%

Publisher:

Abstract:

The mediator software architecture design was developed to provide data integration and retrieval in distributed, heterogeneous environments. Since the initial conceptualization of this architecture, many new technologies have emerged that can facilitate its implementation. The purpose of this thesis was to show that a mediator framework supporting users of mobile devices could be implemented using common software technologies available today. In addition, the prototype was developed with a view to providing a better understanding of what a mediator is and to exposing issues that will have to be addressed in fuller, more robust designs. The prototype was implemented using Java, XML, and the Simple Object Access Protocol (SOAP), among other technologies; SOAP was used for inter-process communication. In the end, it is expected that more data-intensive software applications will be possible in a world with ever-increasing demands for information.
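
To make the mediator idea above concrete, here is a minimal Java sketch (not the thesis's actual code) in which a mediator forwards a query to registered source wrappers and merges their XML answers; the SourceWrapper interface and all names are hypothetical, and the SOAP transport is replaced by plain method calls.

import java.util.ArrayList;
import java.util.List;

// Hypothetical wrapper interface: each data source answers a query with an XML fragment.
interface SourceWrapper {
    String query(String xmlQuery);
}

// Minimal mediator: forwards the query to every registered wrapper and merges the answers.
// In the prototype summarized above this call would travel over SOAP; here it is a plain method call.
public class Mediator {
    private final List<SourceWrapper> wrappers = new ArrayList<>();

    void register(SourceWrapper wrapper) {
        wrappers.add(wrapper);
    }

    String query(String xmlQuery) {
        StringBuilder merged = new StringBuilder("<results>");
        for (SourceWrapper wrapper : wrappers) {
            merged.append(wrapper.query(xmlQuery));
        }
        return merged.append("</results>").toString();
    }

    public static void main(String[] args) {
        Mediator mediator = new Mediator();
        mediator.register(q -> "<source name='a'>stub answer to " + q + "</source>");
        mediator.register(q -> "<source name='b'>stub answer to " + q + "</source>");
        System.out.println(mediator.query("find restaurants"));
    }
}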

Relevance: 100.00%

Publisher:

Abstract:

This dissertation studies context-aware applications and the client-side algorithms proposed for them. The required context-aware infrastructure is discussed in depth to illustrate that such an infrastructure collects the mobile user's context information, registers service providers, derives the user's current context, distributes it among context-aware applications, and provides tailored services. The proposed approach strikes a balance between the context server and the mobile devices: context acquisition is centralized at the server to ensure the usability of context information across mobile devices, while context reasoning remains at the application level. Centralized context acquisition combined with distributed context reasoning is therefore viewed as the better overall solution. A context-aware search application is designed and implemented at the server side. A new algorithm is proposed that takes user context profiles into account: by feeding the dynamics of the system back as feedback, prior user selections are saved for further analysis so that they can improve the results of subsequent searches. On the basis of these server-side developments, corresponding solutions are provided at the client side. A software proxy component is set up for data collection; this research argues that the client-side proxy should contain the context reasoning component, and its implementation supports this claim in that context-aware applications are able to derive the user context profiles. Furthermore, a context cache scheme is implemented to manage the cache on the client device in order to minimize processing requirements and other resources (bandwidth, CPU cycles, power). Java and MySQL are used to implement the proposed architecture and to test scenarios derived from users' daily activities. To meet the practical demands of a testing environment without the heavy cost of establishing a comprehensive infrastructure, a software simulation using the free Yahoo search API is provided as a realistic means to evaluate the effectiveness of the design. The integration of the Yahoo search engine into the context-aware architecture shows how a context-aware application can meet user demands for tailored services and products in and around the user's environment. The test results show that the overall design is highly effective, providing new features and enriching the mobile user's experience through a broad scope of potential applications.
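
The abstract above mentions a client-side context cache for limiting bandwidth, CPU and power use. A minimal Java sketch of such a cache, assuming a simple LRU eviction policy and hypothetical key/value types (the dissertation's actual scheme may differ), is:

import java.util.LinkedHashMap;
import java.util.Map;

// Simple LRU cache for context-derived results, intended to limit repeated
// round trips (bandwidth), recomputation (CPU) and therefore battery drain.
class ContextCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    ContextCache(int capacity) {
        super(16, 0.75f, true);   // access-order = true gives LRU behaviour
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity; // evict the least recently used entry
    }

    public static void main(String[] args) {
        ContextCache<String, String> cache = new ContextCache<>(2);
        cache.put("location", "office");
        cache.put("activity", "meeting");
        cache.get("location");              // touch "location" so it stays
        cache.put("weather", "sunny");      // evicts "activity"
        System.out.println(cache.keySet()); // prints [location, weather]
    }
}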

Relevance: 100.00%

Publisher:

Abstract:

The performance, energy efficiency and cost improvements due to traditional technology scaling have begun to slow down and present diminishing returns. Underlying reasons for this trend include the fundamental physical limits of transistor scaling, the growing significance of quantum effects as transistors shrink, and a growing mismatch between transistors and interconnects in size, speed and power. Continued Moore's Law scaling will not come from technology scaling alone; it must involve improvements to design tools and the development of new disruptive technologies such as 3D integration. 3D integration offers potential improvements to interconnect power and delay by translating the routing problem into a third dimension, and it enables transistor density scaling independent of the technology node. Furthermore, 3D IC technology opens up a new architectural design space of heterogeneously integrated, high-bandwidth CPUs. Vertical integration promises to provide the CPU architectures of the future by integrating high-performance processors with on-chip high-bandwidth memory systems and highly connected network-on-chip structures. Such techniques can overcome the well-known CPU performance bottlenecks referred to as the memory and communication walls. However, the promising improvements in performance and energy efficiency offered by 3D CPUs do not come without cost, both in the financial investment needed to develop the technology and in the increased complexity of design. The two main limitations of 3D IC technology have been heat removal and TSV reliability. Transistor stacking increases power density, current density and thermal resistance in air-cooled packages. Furthermore, the technology introduces vertical through-silicon vias (TSVs) that create new points of failure in the chip and require the development of new BEOL technologies. Although these issues can be controlled to some extent using thermal- and reliability-aware physical and architectural 3D design techniques, high-performance embedded cooling schemes, such as micro-fluidic (MF) cooling, are fundamentally necessary to unlock the true potential of 3D ICs. A new paradigm is being put forth which integrates the computational, electrical, physical, thermal and reliability views of a system. The unification of these diverse aspects of integrated circuits is called Co-Design. Independent design and optimization of each aspect leads to sub-optimal designs due to a lack of understanding of cross-domain interactions and their impact on the feasibility region of the architectural design space. Co-Design enables optimization across layers with a multi-domain view and thus unlocks new high-performance and energy-efficient configurations. Although the co-design paradigm is becoming increasingly necessary in all fields of IC design, it is even more critical in 3D ICs where, as we show, the inter-layer coupling and the higher degree of connectivity between components exacerbate the interdependence between architectural parameters, physical design parameters and the multitude of metrics of interest to the designer (i.e. power, performance, temperature and reliability). In this dissertation we present a framework for multi-domain co-simulation and co-optimization of 3D CPU architectures with both air and MF cooling solutions. Finally, we propose an approach for design space exploration and modeling within the new Co-Design paradigm, and discuss possible avenues for future improvement of this work.
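
For flavour, here is a toy Java sketch of the kind of cross-domain design-space sweep the co-design flow above implies; the performance, power and thermal models, the constraint values and the configuration space are all placeholders, not the dissertation's framework.

// Toy design-space sweep in the spirit of the co-design flow described above:
// iterate over (stacked layers, cooling option), evaluate hypothetical performance,
// power and thermal models together, and keep only thermally feasible points.
public class CoDesignSweep {
    public static void main(String[] args) {
        double bestScore = Double.NEGATIVE_INFINITY;
        String best = "none";
        for (int layers = 1; layers <= 4; layers++) {
            for (boolean microFluidic : new boolean[]{false, true}) {
                double perf = layers;                        // assume a bandwidth-bound workload
                double power = 8.0 + 5.0 * layers;           // watts (placeholder model)
                double thetaJA = microFluidic ? 0.5 : 2.5;   // K/W, depends on cooling scheme
                double tempRise = power * thetaJA;           // simplistic thermal model
                if (tempRise > 60.0) continue;               // thermal constraint: infeasible point
                double score = perf / power;                 // energy-efficiency objective
                if (score > bestScore) {
                    bestScore = score;
                    best = layers + " layers, " + (microFluidic ? "MF" : "air") + " cooling";
                }
            }
        }
        System.out.println("best feasible configuration: " + best);
    }
}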

Relevance: 100.00%

Publisher:

Abstract:

Dacryocystorhinostomy is the treatment of choice for obstruction of the lachrymal apparatus. At the end of the last century, the development of endoscopic instruments for nasosinusal surgery made it possible to perform the procedure through an endonasal endoscopic approach. Nonetheless, anatomical variations make the endonasal approach difficult to reproduce. Aim: to study the endoscopic anatomy of the lachrymal fossa through transillumination of the common canaliculus. Study design: experimental. Materials and Methods: we dissected 40 lachrymal pathways from 20 human cadavers in three stages: 1) identification and dilation of the lachrymal canaliculus; 2) introduction of an optic fiber beam; 3) endoscopic dissection of the lachrymal sac, describing its position. Results: the most frequent position of the lachrymal sac was between the free border of the middle turbinate and its insertion, immediately underneath it. The maxillary line was seen in 95% of the cases. Septoplasty was needed in 12.5%, uncinectomy in 35% and middle turbinectomy in 7.5%. Conclusion: although the lachrymal sac has a most frequent location, its position varied considerably. Transillumination of the common canaliculus proved useful, overcoming the problem of anatomical variability.

Relevance: 100.00%

Publisher:

Abstract:

Technological and societal evolution means that, nowadays, a good part of the population has access to mobile devices with advanced capabilities. With this type of device we have access to countless sources of real-time information, yet this capability is still not fully exploited today. This project takes advantage of this reality to create, using these mobile devices, a network for exchanging traffic information. Users only need their mobile device to automatically obtain the latest traffic information while, in parallel, sharing their own information with other users. Although there are other alternatives on the market, with solutions offering the same kind of functionality, none of them uses this type of device (conventional GPS units, for example, are used instead). One of the requirements for implementing this project is a geocoding solution. Several existing solutions were tested, but none fully met the project's requirements, which led to the development of a new solution that does. The solution is highly modular, made up of several components, each with clearly identified responsibilities. Its architecture follows the design patterns of a Service Oriented Architecture: all components expose their operations through web services, and their discovery relies on the WS-Discovery protocol. These components fall into two categories: the core components, responsible for providing the functionality required by this project, and the external modules, which include the applications that present that functionality to the user. Two ways of consuming the information offered by the SIAT service were created: a mobile application and a website. For mobile devices, an application was developed for the Windows Phone 7 operating system.
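
As a small illustration of one of the components mentioned above, here is a toy reverse-geocoding routine in Java; in the real system this operation would be exposed as a web service and discovered via WS-Discovery, and all names, data and the nearest-neighbour approach are hypothetical rather than the project's actual geocoding solution.

import java.util.HashMap;
import java.util.Map;

// Toy reverse-geocoding component: maps coordinates to the closest known place name.
public class SimpleGeocoder {
    private final Map<String, double[]> places = new HashMap<>();

    SimpleGeocoder() {
        places.put("Lisboa",  new double[]{38.7223, -9.1393});
        places.put("Porto",   new double[]{41.1579, -8.6291});
        places.put("Coimbra", new double[]{40.2033, -8.4103});
    }

    // Returns the known place closest to the given coordinates (naive linear scan).
    String reverseGeocode(double lat, double lon) {
        String best = null;
        double bestDist = Double.MAX_VALUE;
        for (Map.Entry<String, double[]> entry : places.entrySet()) {
            double dLat = lat - entry.getValue()[0];
            double dLon = lon - entry.getValue()[1];
            double dist = dLat * dLat + dLon * dLon; // squared degrees; fine for a toy
            if (dist < bestDist) {
                bestDist = dist;
                best = entry.getKey();
            }
        }
        return best;
    }

    public static void main(String[] args) {
        System.out.println(new SimpleGeocoder().reverseGeocode(38.74, -9.15)); // prints Lisboa
    }
}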

Relevance: 100.00%

Publisher:

Abstract:

Competitive electricity markets are complex environments, involving a large number of different entities that act in a dynamic scene to obtain the best advantages and profits. MASCEM is an electricity market simulator able to model market players and simulate their operation in the market. As market players are complex entities, with their own characteristics and objectives, making their decisions and interacting with other players, a multi-agent architecture is used and has proved to be adequate. MASCEM players have learning capabilities and different risk preferences. They are able to refine their strategies according to their past experience (both real and simulated) and considering other agents' behavior. Each agent's behavior is also subject to its risk preferences.
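
A minimal Java sketch of a player that refines its bid from past experience and shades it by a risk-preference parameter follows; this is a generic illustration of the idea, not MASCEM's actual strategy code, and all names and constants are hypothetical.

// Toy bidding agent: keeps an exponentially weighted estimate of the clearing
// price from past rounds and shades its bid according to a risk preference.
public class BiddingAgent {
    private double priceEstimate;      // learned expectation of the clearing price
    private final double learningRate; // weight given to the most recent observation
    private final double riskAversion; // 0 = risk neutral, 1 = very cautious

    BiddingAgent(double initialEstimate, double learningRate, double riskAversion) {
        this.priceEstimate = initialEstimate;
        this.learningRate = learningRate;
        this.riskAversion = riskAversion;
    }

    // Refine the estimate with the clearing price observed in the last round.
    void observe(double clearingPrice) {
        priceEstimate += learningRate * (clearingPrice - priceEstimate);
    }

    // A cautious seller bids below its estimate to raise the chance of being dispatched.
    double nextBid() {
        return priceEstimate * (1.0 - 0.2 * riskAversion);
    }

    public static void main(String[] args) {
        BiddingAgent agent = new BiddingAgent(50.0, 0.3, 0.5);
        for (double observed : new double[]{48.0, 52.0, 55.0}) {
            agent.observe(observed);
        }
        System.out.printf("next bid: %.2f EUR/MWh%n", agent.nextBid());
    }
}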

Relevance: 100.00%

Publisher:

Abstract:

Thesis to obtain the Master of Science Degree in Computer Science and Engineering

Relevance: 100.00%

Publisher:

Abstract:

Dissertation presented at the Faculdade de Ciências e Tecnologia da Universidade Nova de Lisboa to obtain the degree of Master in Electrical and Computer Engineering

Relevance: 100.00%

Publisher:

Abstract:

Nowadays, many manufacturing and industrial systems have a diagnosis system on top of them, responsible for ensuring the lifetime of the system itself. It achieves this by performing both diagnosis and error recovery procedures in real production time on each of the individual parts of the system. Many paradigms are currently used for diagnosis; however, they still fail to meet all the requirements imposed by enterprises, making a different approach necessary. This is mostly the case for the error recovery paradigms, since the great diversity now present in the industrial environment makes it highly unlikely that every single error can be fixed in real time, without stopping production. This work proposes a paradigm still relatively unknown in manufacturing: Artificial Immune Systems (AIS), which rely on bio-inspired algorithms and are a valid alternative to the paradigms currently in use. The proposed work is a multi-agent architecture that implements an Artificial Immune System based on bio-inspired algorithms. The main goal of this architecture is to find a resolution for the error currently detected by the system. The proposed architecture was tested in two different simulation environments, each meant to prove a different point of view through different tests. These tests determine whether, as the research suggests, this paradigm is a promising alternative for the industrial environment, what should be done to improve the current architecture, and whether it should be applied in a decentralised system.
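
To give an idea of the bio-inspired algorithms AIS approaches build on, below is a generic clonal-selection step in Java; it is illustrative only, with a toy affinity function and hypothetical parameters, and is not the error-recovery architecture proposed in the work above.

import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;
import java.util.Random;

// Generic clonal selection: high-affinity candidates are cloned and lightly
// mutated, low-affinity ones are replaced by fresh random candidates.
public class ClonalSelection {
    static double affinity(double candidate) {
        return -Math.abs(candidate - 42.0); // toy "antigen": the value 42
    }

    public static void main(String[] args) {
        Random rnd = new Random(1);
        List<Double> population = new ArrayList<>();
        for (int i = 0; i < 10; i++) population.add(rnd.nextDouble() * 100.0);

        for (int generation = 0; generation < 50; generation++) {
            population.sort(Comparator.comparingDouble(ClonalSelection::affinity).reversed());
            List<Double> next = new ArrayList<>(population.subList(0, 3)); // elite survives
            for (int i = 0; i < 3; i++) {                                  // clone and mutate elite
                next.add(population.get(i) + rnd.nextGaussian());
            }
            while (next.size() < population.size()) {                      // keep diversity
                next.add(rnd.nextDouble() * 100.0);
            }
            population = next;
        }
        population.sort(Comparator.comparingDouble(ClonalSelection::affinity).reversed());
        System.out.println("best candidate after 50 generations: " + population.get(0));
    }
}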

Relevance: 100.00%

Publisher:

Abstract:

PhD thesis, Doctoral Programme in Electronics and Computer Engineering

Relevance: 100.00%

Publisher:

Abstract:

This project presents the implementation of a design, and its subsequent synthesis on an FPGA, of a wormhole packet-switching architecture for a Network-on-Chip infrastructure with a 2D-mesh topology. Taking a circuit-switching router as the starting point, the modules were specified in Verilog in order to obtain the desired wormhole architecture. Designing the control machine that governs the flits making up the packets inside the NoC, and adding queues at the router output (output queuing), are the main points of this work. Finally, the two router architectures were compared in terms of area and memory costs, and several conclusions and experimental results were obtained.
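
The flit-level control described above is designed in Verilog; purely as a behavioural illustration of the wormhole protocol (a head flit reserves an output port, body flits follow on the reserved path, the tail flit releases it), here is a hypothetical Java sketch.

// Behavioural sketch of a per-output wormhole reservation; not the project's Verilog design.
public class WormholeOutputPort {
    enum FlitType { HEAD, BODY, TAIL }

    private Integer owner = null; // input port currently holding the reservation

    boolean forward(int inputPort, FlitType flit) {
        switch (flit) {
            case HEAD:
                if (owner != null) return false;   // port busy: head flit must wait
                owner = inputPort;                 // reserve the wormhole path
                return true;
            case BODY:
                return owner != null && owner == inputPort; // only the owner may send
            case TAIL:
                if (owner != null && owner == inputPort) {
                    owner = null;                  // tail flit tears the path down
                    return true;
                }
                return false;
            default:
                return false;
        }
    }

    public static void main(String[] args) {
        WormholeOutputPort port = new WormholeOutputPort();
        System.out.println(port.forward(0, FlitType.HEAD)); // true: path reserved by input 0
        System.out.println(port.forward(1, FlitType.HEAD)); // false: input 1 is blocked
        System.out.println(port.forward(0, FlitType.BODY)); // true
        System.out.println(port.forward(0, FlitType.TAIL)); // true: path released
        System.out.println(port.forward(1, FlitType.HEAD)); // true: input 1 can now proceed
    }
}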

Relevance: 100.00%

Publisher:

Abstract:

Our website's central aim is to disseminate all the material developed in the Proyecto Wolframio, seeking to be a simple and agile tool for managing that material and making it more accessible to users. For the development of the project, an architecture was designed in which several technologies and several coding and programming languages coexist: HTML, PHP, MySQL, JavaScript, AJAX, jQuery and CSS.

Relevance: 100.00%

Publisher:

Abstract:

This project introduces GNSS-SDR, an open source Global Navigation Satellite System software-defined receiver. The lack of reconfigurability of current commercial off-the-shelf receivers and the advent of new radionavigation signals and systems make software receivers an appealing approach to designing new architectures and signal processing algorithms. With the aim of exploring the full potential of this forthcoming scenario, with a plurality of new signal structures and frequency bands available for positioning, this paper describes the software architecture design and provides details about its implementation, targeting a multiband, multisystem GNSS receiver. The result is a testbed for GNSS signal processing that allows any kind of customization, including interchangeability of signal sources, signal processing algorithms, interoperability with other systems, output formats, and the offering of interfaces to all the intermediate signals, parameters and variables. The source code release under the GNU General Public License (GPL) secures practical usability, inspection, and continuous improvement by the research community, allowing discussion based on tangible code and the analysis of results obtained with real signals. The source code is complemented by a development ecosystem consisting of a website (http://gnss-sdr.org), a revision control system, instructions for users and developers, and communication tools. The project shows in detail the design of the initial blocks of the Signal Processing Plane of the receiver: the signal conditioner, the acquisition block and the receiver channel. The project also extends the functionality of the acquisition and tracking modules of the GNSS-SDR receiver to track the new Galileo E1 signals now available. Each section provides a theoretical analysis, implementation details of each block and subsequent testing to confirm the calculations, using both synthetically generated signals and real signals from satellites in space.
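
To illustrate what the acquisition block does, a highly simplified, hypothetical Java sketch follows; the real GNSS-SDR code is written in C++, uses real PRN codes and also searches over Doppler bins, whereas this toy version only searches the code phase of a random +/-1 spreading code.

import java.util.Random;

// Toy acquisition: correlate the received samples against every circular shift
// of a local code replica and report the shift that maximizes the correlation.
public class ToyAcquisition {
    public static void main(String[] args) {
        int codeLength = 1023;
        Random rnd = new Random(7);

        // Toy spreading code (stand-in for a real PRN sequence).
        int[] code = new int[codeLength];
        for (int i = 0; i < codeLength; i++) code[i] = rnd.nextBoolean() ? 1 : -1;

        // Received signal: the same code circularly delayed by 300 chips, plus noise.
        int trueDelay = 300;
        double[] rx = new double[codeLength];
        for (int i = 0; i < codeLength; i++) {
            rx[i] = code[(i + codeLength - trueDelay) % codeLength] + 0.5 * rnd.nextGaussian();
        }

        // Serial search: correlate against every candidate code phase, keep the maximum.
        int bestPhase = -1;
        double bestMetric = Double.NEGATIVE_INFINITY;
        for (int phase = 0; phase < codeLength; phase++) {
            double corr = 0.0;
            for (int i = 0; i < codeLength; i++) {
                corr += rx[i] * code[(i + codeLength - phase) % codeLength];
            }
            if (corr > bestMetric) {
                bestMetric = corr;
                bestPhase = phase;
            }
        }
        System.out.println("estimated code phase: " + bestPhase + " (true: " + trueDelay + ")");
    }
}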

Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND: Red blood cell-derived microparticles (RMPs) are small phospholipid vesicles shed from RBCs in blood units, where they accumulate during storage. Because microparticles are bioactive, it has been suggested that RMPs are mediators of posttransfusion complications or, on the contrary, constitute a potential hemostatic agent. STUDY DESIGN AND METHODS: This study was performed to establish the impact on coagulation of RMPs isolated from blood units. Using calibrated automated thrombography, we investigated whether RMPs affect thrombin generation (TG) in plasma. RESULTS: We found that RMPs were not only able to increase TG in plasma in the presence of a low exogenous tissue factor (TF) concentration, but also to initiate TG in plasma in the absence of exogenous TF. TG induced by RMPs in the absence of exogenous TF was affected neither by the presence of blocking anti-TF nor by the absence of Factor (F)VII. It was significantly reduced in plasma deficient in FVIII or FIX and abolished in FII-, FV-, FX- or FXI-deficient plasma. TG was also totally abolished when the anti-FXI antibody 01A6 was added to the sample. Finally, neither Western blotting, flow cytometry, nor immunogold labeling allowed the detection of traces of TF antigen. In addition, RMPs did not contain polyphosphate, an important modulator of coagulation. CONCLUSIONS: Taken together, our data show that RMPs have FXI-dependent procoagulant properties and are able to initiate and propagate TG. The anionic surface of RMPs might be the site of FXI-mediated TG amplification and of intrinsic tenase and prothrombinase complex assembly.

Relevance: 100.00%

Publisher:

Abstract:

Testbeds are a stage between simulation and production. To this end, they must be as close as possible to production environments (i.e. real hardware, on-the-field deployments) while keeping the traits of experimentation facilities (i.e. fault tolerance, ease of deployment, testing and data collection). This paper presents WiBed, a FOSS platform for WiFi testbeds based on OpenWRT Linux, made to run on commodity IEEE 802.11 WiFi routers as part of the Community-lab.net project, a global testbed for community networks. WiBed has been designed to support realistic low-layer network experiments (according to the OSI model). This work collects the details of the architecture, design and implementation of WiBed as consolidated during its operation as a testbed, together with a set of routing experimentation results obtained during the Wireless Battlemesh v7, where WiBed was used as the testbed platform.