16 results for Information Architecture
Abstract:
The BDI architecture, where agents are modelled based on their beliefs, desires and intentions, provides a practical approach to developing large-scale systems. However, it is not well suited to modelling complex Supervisory Control And Data Acquisition (SCADA) systems pervaded by uncertainty. In this paper we address this issue by extending the operational semantics of Can(Plan) into Can(Plan)+. We start by modelling the beliefs of an agent as a set of epistemic states where each state, possibly using a different representation, models part of the agent's beliefs. These epistemic states are stratified to make them commensurable and to reason about the uncertain beliefs of the agent. The syntax and semantics of a BDI agent are extended accordingly, and we identify fragments with computationally efficient semantics. Finally, we examine how primitive actions are affected by uncertainty and we define an appropriate form of lookahead planning.
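The idea of stratified epistemic states can be illustrated with a small sketch. The Python fragment below is a hypothetical illustration, not the Can(Plan)+ formalism itself: the class names, plausibility ranks and query interface are assumptions. Each epistemic state is a partial belief model with a rank, and the agent's belief query consults the states in rank order so that heterogeneous representations become commensurable.

```python
# Hypothetical sketch of stratified epistemic states for an uncertainty-aware
# BDI agent; names and the ranking scheme are illustrative assumptions only.

class EpistemicState:
    """One partial model of the agent's beliefs, with a plausibility rank
    (lower rank = more entrenched / more certain)."""
    def __init__(self, name, rank, facts):
        self.name = name
        self.rank = rank
        self.facts = dict(facts)            # proposition -> degree of belief in [0, 1]

    def degree(self, proposition):
        return self.facts.get(proposition)  # None if this state says nothing

class StratifiedBeliefBase:
    """Makes heterogeneous epistemic states commensurable by consulting them
    in order of rank and returning the most entrenched available answer."""
    def __init__(self, states):
        self.states = sorted(states, key=lambda s: s.rank)

    def query(self, proposition):
        for state in self.states:
            d = state.degree(proposition)
            if d is not None:
                return d, state.name
        return 0.0, None                    # unknown propositions are not believed

beliefs = StratifiedBeliefBase([
    EpistemicState("sensor_model", rank=1, facts={"pump_overheating": 0.8}),
    EpistemicState("default_rules", rank=2, facts={"pump_overheating": 0.1,
                                                   "grid_connected": 1.0}),
])
print(beliefs.query("pump_overheating"))    # (0.8, 'sensor_model')
```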
Abstract:
This paper examines the relation between technical possibilities, liberal logics, and the concrete reconfiguration of markets. It focuses on the enrolling of innovations in communication and information technologies into the markets traditionally dominated by stock exchanges. With the development of capacities to trade on-screen, the power of incumbent market makers has been challenged as a less stable array of competing quasi-public and private marketplaces emerges. Developing a case study of the Toronto Stock Exchange, I argue that narrative emphasis on the performative power of sociotechnical innovations, the deterritorialisation of financial relations, and the erosion of state capacities needs qualification. A case is made for the importance of developing an understanding of: the spaces of encounter between emerging social technologies and property rights, rules of exchange, and structures of governance; and the interplay of orderings of different institutional composition and spatial reach in the reconfiguration of market architectures. Only then can a better grasp be gained of the evolving dynamics between making markets, the regulatory powers of the state, and their delimitations.
Abstract:
A key element in the architecture of a quantum-information processing network is a reliable physical interface between fields and qubits. We study a process of entanglement transfer engineering, where two remote qubits respectively interact with an entangled two-mode continuous-variable (CV) field. We quantify the entanglement induced in the qubit state at the expense of the loss of entanglement in the CV system. We discuss the range of mixed entangled states which can be obtained with this setup. Furthermore, we suggest a protocol to determine the residual entangling power of the light fields, thus inferring the entanglement left in the field modes, which, after the interaction, are no longer in a Gaussian state. Two different setups are proposed: a cavity-QED system and an interface between superconducting qubits and field modes. We address in detail the practical difficulties inherent in these two proposals, showing that the latter is promising in many aspects.
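To make "entanglement induced in the qubit state" concrete, the sketch below computes the Wootters concurrence of an assumed two-qubit density matrix. The example state (a Bell state mixed with white noise) is a hypothetical illustration and is not taken from the paper.

```python
# Sketch: quantify two-qubit entanglement via the Wootters concurrence.
# The example density matrix is an assumed illustration, not the paper's state.
import numpy as np

def concurrence(rho):
    """Wootters concurrence C(rho) = max(0, l1 - l2 - l3 - l4), where l_i are
    the square roots of the eigenvalues of rho * rho_tilde in decreasing order."""
    sy = np.array([[0, -1j], [1j, 0]])
    yy = np.kron(sy, sy)
    rho_tilde = yy @ rho.conj() @ yy
    eigvals = np.linalg.eigvals(rho @ rho_tilde)
    lam = np.sort(np.sqrt(np.abs(eigvals)))[::-1]   # abs() guards tiny negatives
    return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

# A Bell state mixed with white noise (mixing weight p = 0.9).
bell = np.zeros((4, 4), dtype=complex)
bell[0, 0] = bell[0, 3] = bell[3, 0] = bell[3, 3] = 0.5
p = 0.9
rho = p * bell + (1 - p) * np.eye(4) / 4
print(round(concurrence(rho), 3))   # > 0 indicates the qubits are entangled
```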
Abstract:
Model Driven Architecture supports the transformation from reusable models to executable software. Business representations, however, cannot be fully and explicitly represented in such models for direct transformation into running systems. Thus, once business needs change, the language abstractions used by MDA (e.g. Object Constraint Language / Action Semantics), being low level, have to be edited directly. We therefore describe an Agent-oriented Model Driven Architecture (AMDA) that uses a set of business models under continuous maintenance by business people, reflecting the current business needs and being associated with adaptive agents that interpret the captured knowledge to behave dynamically. Three contributions of the AMDA approach are identified: 1) to Agent-oriented Software Engineering, a method of building adaptive Multi-Agent Systems; 2) to MDA, a means of abstracting high level business-oriented models to align executable systems with their requirements at runtime; 3) to distributed systems, the interoperability of disparate components and services via the agent abstraction.
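The idea of agents interpreting continuously maintained business models at runtime, rather than regenerating code, can be sketched as follows. The rule format, class names and example rules are assumptions for illustration and are not the AMDA notation.

```python
# Hypothetical sketch: an adaptive agent interpreting editable business rules
# at runtime, so behaviour changes when the model changes (no regeneration).
# The rule schema and names are illustrative assumptions, not AMDA's notation.

business_model = {
    "approve_order": [
        {"if": lambda o: o["amount"] < 1000, "then": "auto_approve"},
        {"if": lambda o: o["customer_tier"] == "gold", "then": "fast_track"},
        {"if": lambda o: True, "then": "manual_review"},   # default rule
    ]
}

class BusinessAgent:
    def __init__(self, model):
        self.model = model                  # shared model, editable by business people

    def decide(self, goal, case):
        for rule in self.model[goal]:       # interpret the captured knowledge
            if rule["if"](case):
                return rule["then"]

agent = BusinessAgent(business_model)
print(agent.decide("approve_order", {"amount": 250, "customer_tier": "silver"}))
# Editing business_model at runtime immediately changes the agent's behaviour.
```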
Abstract:
This paper presents the design of a novel single-chip adaptive beamformer capable of performing 50 GFLOPS (giga floating-point operations per second). The core processor is a QR array implemented on a fully efficient linear systolic architecture, derived using a mapping that allows individual processors for boundary and internal cell operations. In addition, the paper highlights a number of rapid design techniques that have been used to realise this system. These include an architecture synthesis tool for quickly developing the circuit architecture and the utilisation of a library of parameterisable silicon intellectual property (IP) cores to rapidly develop detailed silicon designs.
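The numerical core of a QR-based adaptive beamformer can be sketched in a few lines. The sketch below uses a batch QR decomposition of the snapshot matrix to form minimum-variance weights; it is only a functional illustration (array size, steering vector and data are assumptions) and not the recursive systolic implementation described in the paper.

```python
# Functional sketch of QR-based adaptive (MVDR-style) beamforming.
# The systolic array in the paper updates the triangular factor recursively;
# here a batch QR of the snapshot matrix illustrates the same algebra.
import numpy as np

rng = np.random.default_rng(0)
n_antennas, n_snapshots = 8, 256
steering = np.exp(1j * np.pi * np.arange(n_antennas) * np.sin(np.deg2rad(20)))

# Assumed received data: one interferer from -40 degrees plus receiver noise.
interferer = np.exp(1j * np.pi * np.arange(n_antennas) * np.sin(np.deg2rad(-40)))
X = (np.outer(rng.standard_normal(n_snapshots), interferer)
     + 0.1 * (rng.standard_normal((n_snapshots, n_antennas))
              + 1j * rng.standard_normal((n_snapshots, n_antennas))))

# QR of the data matrix gives the square-root factor of the sample covariance:
# X = Q R  =>  X^H X = R^H R, so R^H R w = a is solved by two triangular solves.
_, R = np.linalg.qr(X)
tmp = np.linalg.solve(R.conj().T, steering)     # forward substitution
w = np.linalg.solve(R, tmp)                     # back substitution
w /= steering.conj() @ w                        # MVDR normalisation

print(abs(w.conj() @ steering))                 # ~1: look direction preserved
print(abs(w.conj() @ interferer))               # small: interferer suppressed
```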
Abstract:
A key issue in the design of next-generation Internet routers and switches will be the provision of traffic manager (TM) functionality in the datapaths of their high-speed switching fabrics. A new architecture that allows dynamic deployment of different TM functions is presented. By considering the processing requirements of operations such as policing, congestion handling, queuing, shaping and scheduling, a solution has been derived that is scalable and offers a consistent programmable interface. Programmability is achieved using a function computation unit, which determines the action (e.g. drop, queue, remark, forward) based on the packet attribute information and a memory storage part. Results of a Xilinx Virtex-5 FPGA reference design are presented.
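The role of the function computation unit, mapping packet attributes and stored per-flow state to an action, can be illustrated with a short behavioural sketch. The attribute names, thresholds and policy table below are assumptions, not the FPGA design itself.

```python
# Behavioural sketch of a programmable traffic-manager decision function:
# packet attributes plus per-flow state map to an action. Field names,
# thresholds and the policy table are illustrative assumptions only.

ACTIONS = ("drop", "queue", "remark", "forward")   # the possible outcomes

policy = {
    # traffic class -> committed rate (packets per interval) and queue limit
    "voice": {"rate_limit": 100, "queue_limit": 50},
    "best_effort": {"rate_limit": 1000, "queue_limit": 500},
}

flow_state = {}   # per-flow counters (the "memory storage part")

def tm_decision(packet):
    cls = packet["class"]
    state = flow_state.setdefault(packet["flow_id"], {"count": 0, "queued": 0})
    state["count"] += 1
    rules = policy[cls]
    if state["count"] > rules["rate_limit"]:        # policing
        return "drop" if cls == "best_effort" else "remark"
    if state["queued"] >= rules["queue_limit"]:     # congestion handling
        return "drop"
    if packet.get("burst"):                         # shaping: delay bursty traffic
        state["queued"] += 1
        return "queue"
    return "forward"

print(tm_decision({"flow_id": 1, "class": "voice", "burst": False}))  # forward
```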
Abstract:
Continuing achievements in hardware technology are bringing ubiquitous computing closer to reality. The notion of a connected, interactive and autonomous environment is common to all sensor networks, biosystems and radio frequency identification (RFID) devices, and the emergence of significant deployments and sophisticated applications can be expected. However, as more information is collected and transmitted, security issues will become vital for such a fully connected environment. In this study the authors consider adding security features to low-cost devices such as RFID tags. In particular, the authors consider the implementation of a digital signature architecture that can be used for device authentication, to prevent tag cloning, and for data authentication, to prevent transmission forgery. The scheme is built around the signature variant of the cryptoGPS identification scheme and the SHA-1 hash function. When implemented in 130 nm CMOS, the full design uses 7494 gates and consumes 4.72 µW of power, making it smaller and more power efficient than previous low-cost digital signature designs. The study also presents a low-cost SHA-1 hardware architecture which is the smallest standardised hash function design to date.
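The overall flow of a GPS-style (Girault-Poupard-Stern) signature, with a commitment, an SHA-1-derived challenge and a linear response, can be sketched as follows. The toy parameters below are far too small to be secure, and the exact coupon format, challenge length and key sizes of cryptoGPS differ; this is only a structural illustration of the commitment/challenge/response pattern mentioned in the abstract.

```python
# Structural sketch of a GPS-style signature flow with an SHA-1-derived
# challenge. Toy parameters -- NOT the cryptoGPS parameterisation and NOT
# secure; it only illustrates commitment / challenge / response.
import hashlib
import secrets

n = 2**61 - 1                         # toy public modulus (assumption)
g = 3                                 # toy base
s = secrets.randbelow(2**16)          # private key
V = pow(g, s, n)                      # public key

def sign(message: bytes):
    r = secrets.randbelow(2**40)      # commitment nonce, chosen with r >> s*c
    x = pow(g, r, n)                  # commitment (a precomputed "coupon" on a tag)
    c = int.from_bytes(hashlib.sha1(x.to_bytes(8, "big") + message).digest(),
                       "big") % 2**8  # challenge derived from SHA-1
    y = r + s * c                     # response: plain integer arithmetic
    return c, y

def verify(message: bytes, c, y):
    x = (pow(g, y, n) * pow(V, -c, n)) % n        # recompute the commitment
    c2 = int.from_bytes(hashlib.sha1(x.to_bytes(8, "big") + message).digest(),
                        "big") % 2**8
    return c == c2

c, y = sign(b"tag-id-0001")
print(verify(b"tag-id-0001", c, y))   # True
```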
Abstract:
The ability of building information modeling (BIM) to positively impact projects in the AEC sector through greater collaboration and integration is widely acknowledged. This paper aims to examine the development of BIM and how it can contribute to the cold-formed steel (CFS) building industry. This is achieved through the adoption of a qualitative methodology encompassing a literature review and exploratory interviews with industry experts, culminating in the development of e-learning material for the sector. In doing so, the research team have collaborated with one of the United Kingdom’s largest cold-formed steel designer/fabricators. By demonstrating the capabilities of BIM software and providing technical and informative videos of its use, this project delivers two key outcomes. Firstly, it provides invaluable assistance in the transition from traditional processes to fully collaborative 3D BIM, as required by the UK Government under the “Government Construction Strategy” for all public sector projects by 2016. Secondly, it demonstrates BIM’s potential not only within CFS companies but also within the AEC sector as a whole. Given the flexibility, adaptability and interoperability of BIM software, the results indicate that BIM and its underlying ethos are a key tool in the development of the industry as a whole.
Abstract:
Architecture Description Languages (ADLs) have emerged in recent years as a tool for providing high-level descriptions of software systems in terms of their architectural elements and the relationships among them. Most of the current ADLs exhibit limitations which prevent their widespread use in industrial applications. In this paper, we discuss these limitations and introduce ALI, an ADL that has been developed to address such limitations. The ALI language provides a rich and flexible syntax for describing component interfaces, architectural patterns, and meta-information. Multiple graphical architectural views can then be derived from ALI's textual notation.
Abstract:
Due to the intermittent nature of renewable generation, it is desirable to consider the potential of controlling the demand-side load to smooth overall system demand. The architecture and control methodologies of such a system on a large scale would require careful consideration. Some of these considerations, such as communications infrastructure, systems architecture, control methodologies and security, are discussed in this paper. A domestic fridge is used as an example of a controllable appliance. A layered approach to the smart grid is introduced, showing how each smart-grid component, from physical cables to end-devices (or smart applications), can be mapped to these layers. Security plays an integral part in each component of the smart grid and is therefore an integral part of each layer. The controllable fridge is described in detail as one potential smart-grid application that maps to the layered approach. A demonstration system is presented which involves a Raspberry Pi (a low-power, low-cost device representing the appliance controller).
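A minimal sketch of the appliance-controller logic, assuming the fridge defers its compressor when a hypothetical grid stress signal is high but never leaves a safe temperature band, is shown below. The signal names, thresholds and policy are assumptions, not the demonstrator's code.

```python
# Minimal sketch of a demand-responsive fridge controller, such as might run
# on the Raspberry Pi appliance controller. Thresholds, signal names and the
# control policy are illustrative assumptions.

TEMP_MIN, TEMP_MAX = 1.0, 6.0     # normal cabinet temperature band (deg C)
DEFER_MAX = 7.5                   # never let deferral push the cabinet past this

def compressor_command(temp_c: float, grid_stress: float) -> bool:
    """Return True to run the compressor, False to stay off.
    grid_stress in [0, 1]: 1 means the grid is asking loads to back off."""
    if temp_c <= TEMP_MIN:
        return False                          # cold enough, always off
    if temp_c >= DEFER_MAX:
        return True                           # food safety overrides the grid signal
    if grid_stress > 0.8 and temp_c < TEMP_MAX:
        return False                          # defer load while inside the safe band
    return temp_c >= TEMP_MAX                 # normal thermostat behaviour

# Example: warm-ish cabinet during a grid stress event -> load is deferred.
print(compressor_command(temp_c=5.5, grid_stress=0.9))   # False
print(compressor_command(temp_c=7.8, grid_stress=0.9))   # True (safety limit)
```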
Abstract:
This paper presents a multi-agent system approach to address the difficulties encountered in traditional SCADA systems deployed in critical environments such as electrical power generation, transmission and distribution. The approach models uncertainty and combines multiple sources of uncertain information to deliver robust plan selection. We examine the approach in the context of a simplified power supply/demand scenario using a residential grid connected solar system and consider the challenges of modelling and reasoning with uncertain sensor information in this environment. We discuss examples of plans and actions required for sensing, establish and discuss the effect of uncertainty on such systems and investigate different uncertainty theories and how they can fuse uncertain information from multiple sources for effective decision making in such a complex system.
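As an illustration of fusing uncertain readings for plan selection, the sketch below combines two independent sensor-derived indications of a supply deficit with a simple Bayesian product rule and picks the plan with the highest expected utility. The sensors, prior, likelihood ratios and utility table are assumptions, not the paper's scenario data.

```python
# Sketch: fuse two independent, noisy estimates of "supply deficit" and select
# the plan with the best expected utility. Probabilities, priors and the
# utility table are illustrative assumptions.

def fuse(prior, likelihood_ratios):
    """Naive-Bayes fusion: combine independent sensor likelihood ratios for the
    event 'supply deficit' with the prior, returning a posterior probability."""
    odds = prior / (1 - prior)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1 + odds)

# Sensor A (solar forecast) and sensor B (smart meter) both weakly indicate a deficit.
p_deficit = fuse(prior=0.2, likelihood_ratios=[3.0, 2.5])

utilities = {
    # plan -> (utility if deficit occurs, utility if it does not)
    "discharge_battery": (8.0, -1.0),
    "import_from_grid":  (5.0,  2.0),
    "do_nothing":        (-10.0, 4.0),
}

def expected_utility(plan):
    u_def, u_ok = utilities[plan]
    return p_deficit * u_def + (1 - p_deficit) * u_ok

best = max(utilities, key=expected_utility)
print(round(p_deficit, 2), best)   # ~0.65, discharge_battery
```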
Abstract:
The upcoming IEEE 802.11ac standard boosts the throughput of the previous IEEE 802.11n by adding wider 80 MHz and 160 MHz channels with up to 8 antennas (versus a 40 MHz channel and 4 antennas in 802.11n). This necessitates new 1-8 stream 256/512-point Fast Fourier Transform (FFT) / inverse FFT (IFFT) processing with 80/160 MSample/s throughput. Although there is abundant related work, it all fails to meet the requirements of IEEE 802.11ac FFT/IFFT on point size, throughput and multiple data streams at the same time. This paper proposes the first software-defined FFT/IFFT architecture as a solution. By making use of a customised soft stream processor on FPGA, we show how a software-defined FFT architecture can meet all the requirements of IEEE 802.11ac with low cost and high resource efficiency. When compared with the dedicated Xilinx FFT core, our implementation uses only one third of the resources and achieves up to three times the resource efficiency.
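The timing requirement implied by the abstract can be checked with simple arithmetic: at 160 MSample/s a 512-point block arrives every 3.2 µs per stream, so with 8 interleaved streams the engine must complete one 512-point transform roughly every 0.4 µs. The sketch below works through that budget; the 200 MHz processing clock is an assumed figure for illustration only.

```python
# Back-of-the-envelope throughput budget for IEEE 802.11ac FFT/IFFT processing.
# The 200 MHz clock is an assumed figure for illustration only.

sample_rate = 160e6        # samples per second per stream (160 MHz channel)
fft_points = 512
n_streams = 8
clock_hz = 200e6           # assumed processing clock

block_time = fft_points / sample_rate            # time to gather one FFT block
time_per_fft = block_time / n_streams            # budget per transform, interleaved
cycles_per_fft = time_per_fft * clock_hz         # available clock cycles

print(f"block arrives every {block_time * 1e6:.2f} us per stream")
print(f"one {fft_points}-point FFT must finish every {time_per_fft * 1e6:.2f} us")
print(f"=> about {cycles_per_fft:.0f} cycles available per transform at 200 MHz")
```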