968 results for Open quantum systems


Relevance:

30.00%

Publisher:

Abstract:

The behaviours of autonomous agents may deviate from those deemed to be for the good of the societal systems of which they are a part. Norms have therefore been proposed as a means to regulate agent behaviours in open and dynamic systems, where these norms specify the obliged, permitted and prohibited behaviours of agents. Regulation can effectively be achieved through enforcement mechanisms that result in a net loss of utility for an agent whose behaviour fails to comply with the norms. Recognition of compliance is thus crucial for achieving regulation. In this paper we propose a generic architecture for observing agent behaviours and recognising these behaviours as constituting, or counting as, compliance or violation. The architecture deploys monitors that receive inputs from observers and process these inputs together with transition network representations of individual norms. In this way, monitors determine the fulfillment or violation status of norms. The paper also describes a proof-of-concept implementation and deployment of monitors in electronic contracting environments.
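The core idea of recognising compliance through a transition network can be sketched in a few lines. The norm below ("after receiving an order, the seller must ship before the deadline") and all state and event names are illustrative assumptions, not the paper's actual architecture:

```python
# Hypothetical sketch: one norm encoded as a transition network. A monitor
# consumes observed events and moves the norm through its lifecycle states.

class NormMonitor:
    """Tracks one norm instance: inactive -> active -> fulfilled/violated."""

    def __init__(self):
        self.state = "inactive"

    def observe(self, event):
        # Transition network: edges are (state, event) -> new state.
        transitions = {
            ("inactive", "order_received"): "active",   # obligation activated
            ("active", "goods_shipped"): "fulfilled",   # compliance recognised
            ("active", "deadline_passed"): "violated",  # violation recognised
        }
        # Unmatched events leave the state unchanged.
        self.state = transitions.get((self.state, event), self.state)
        return self.state

m = NormMonitor()
for e in ["order_received", "goods_shipped"]:
    m.observe(e)
print(m.state)  # fulfilled
```

A monitor instance per norm per contract, fed by observers, is enough to report the fulfillment or violation status described in the abstract.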

Relevance:

30.00%

Publisher:

Abstract:

The Open Provenance Model is a model of provenance that is designed to meet the following requirements: (1) To allow provenance information to be exchanged between systems, by means of a compatibility layer based on a shared provenance model. (2) To allow developers to build and share tools that operate on such a provenance model. (3) To define provenance in a precise, technology-agnostic manner. (4) To support a digital representation of provenance for any 'thing', whether produced by computer systems or not. (5) To allow multiple levels of description to coexist. (6) To define a core set of rules that identify the valid inferences that can be made on provenance representations. This document contains the specification of the Open Provenance Model (v1.1), resulting from a community effort to achieve interoperability in the Provenance Challenge series.
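As a concrete illustration of requirements (4) and (6), a provenance graph and one valid inference can be sketched as follows. The edge names follow the OPM v1.1 vocabulary, but the data structure and example graph are illustrative assumptions:

```python
# Minimal sketch of an OPM-style provenance graph: artifacts (a:), processes
# (p:) and agents (ag:) connected by causal edges.
from collections import namedtuple

Edge = namedtuple("Edge", "kind src dst")  # e.g. used(process, artifact)

edges = [
    Edge("used", "p:bake", "a:flour"),
    Edge("wasGeneratedBy", "a:cake", "p:bake"),
    Edge("wasControlledBy", "p:bake", "ag:baker"),
    Edge("wasDerivedFrom", "a:cake", "a:flour"),
]

def derived_from(artifact, es):
    """Transitive closure of wasDerivedFrom: the kind of valid inference
    requirement (6) refers to."""
    out, frontier = set(), {artifact}
    while frontier:
        nxt = {e.dst for e in es if e.kind == "wasDerivedFrom" and e.src in frontier}
        frontier = nxt - out
        out |= nxt
    return out

print(derived_from("a:cake", edges))  # {'a:flour'}
```

Because the graph is technology-agnostic, the same structure describes provenance of physical things ("a:cake") as naturally as computational ones.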

Relevance:

30.00%

Publisher:

Abstract:

A description of a data item's provenance can be provided in different forms, and which form is best depends on the intended use of that description. Because of this, different communities have made quite distinct underlying assumptions in their models for electronically representing provenance. Approaches deriving from the library and archiving communities emphasise agreed vocabulary by which resources can be described and, in particular, assert their attribution (who created the resource, who modified it, where it was stored etc.) The primary purpose here is to provide intuitive metadata by which users can search for and index resources. In comparison, models for representing the results of scientific workflows have been developed with the assumption that each event or piece of intermediary data in a process' execution can and should be documented, to give a full account of the experiment undertaken. These occurrences are connected together by stating where one derived from, triggered, or otherwise caused another, and so form a causal graph. Mapping between the two approaches would be beneficial in integrating systems and exploiting the strengths of each. In this paper, we specify such a mapping between Dublin Core and the Open Provenance Model. We further explain the technical issues to overcome and the rationale behind the approach, to allow the same method to apply in mapping similar schemes.
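The flavour of such a mapping, expanding flat attribution metadata into causal graph fragments, can be sketched in code. The correspondences below (dc:creator to a creation process, dc:source to a derivation edge) are simplified assumptions for illustration, not the paper's full mapping:

```python
# Illustrative sketch: each Dublin Core attribution statement is expanded
# into an OPM-style causal fragment (edges as (kind, src, dst) tuples).

def dc_to_opm(resource, dc):
    edges = []
    if "creator" in dc:
        # dc:creator -> a (synthesised) creation process controlled by the
        # creator agent; the process node name here is purely hypothetical.
        proc = f"create({resource})"
        edges.append(("wasGeneratedBy", resource, proc))
        edges.append(("wasControlledBy", proc, dc["creator"]))
    if "source" in dc:
        # dc:source -> the resource was derived from the source artifact
        edges.append(("wasDerivedFrom", resource, dc["source"]))
    return edges

print(dc_to_opm("report.pdf", {"creator": "alice", "source": "data.csv"}))
```

The direction matters: the flat record can always be expanded into a graph, while collapsing a causal graph back into attribution metadata loses the intermediate steps, which is one of the technical issues the paper discusses.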

Relevance:

30.00%

Publisher:

Abstract:

This paper addresses the feasibility of implementing Japanese manufacturing systems in the United States. The recent success of Japanese transplant companies suggests that Just-In-Time (JIT) production is possible within America's industrial environment. Once American workers receive proper training, they have little difficulty participating in rapid setup procedures and utilizing the kanban system. Japanese transplants are gradually developing Japanese-style relationships with their American supplier companies by initiating long-term, mutually beneficial agreements. They are also finding ways to cope with America's problem of distance, which is steadily decreasing as an obstacle to JIT delivery. American companies, however, encounter significant problems in trying to convert traditionally organized factories to the JIT system. This paper demonstrates that it is both feasible and beneficial for American manufacturers to implement JIT production techniques. Many of the difficulties manufacturers experience center around a general lack of information about JIT. Once a company realizes its potential for setup-time reduction, a prerequisite for the JIT system, workers and managers can work together to create a new process for handling equipment changeover. Significant results are possible with minimal investment. Also, supervisors often do not realize that the JIT method of ordering goods from suppliers is compatible with current systems. This "kanban system" not only enhances current systems but also reduces the amount of paperwork and scheduling involved. When arranging JIT delivery of supplier goods, American manufacturers tend to overlook important aspects of JIT supplier management.
However, by making long-term commitments, initiating the open exchange of information, assisting suppliers in reaching new standards of performance, increasing the level of communication, and relying more on suppliers' engineering capabilities, even American manufacturers can develop Japanese-style supplier relationships that enhance the effectiveness of the system.
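The pull-based kanban ordering loop mentioned above can be illustrated with a toy simulation: a card is released whenever a container is emptied, and each card is itself the replenishment order, which is why the system reduces paperwork and scheduling. All quantities below are illustrative:

```python
# Toy sketch of kanban ordering: demand is satisfied from containers, and an
# empty container releases one card (one replenishment order) to the supplier.

def kanban_orders(daily_demand, container_size, days):
    """Yield the number of replenishment cards released each day."""
    in_container = 0
    for _ in range(days):
        need = daily_demand
        cards = 0
        while need > 0:
            if in_container == 0:
                cards += 1            # empty container -> card goes to supplier
                in_container = container_size
            take = min(need, in_container)
            in_container -= take
            need -= take
        yield cards

print(list(kanban_orders(daily_demand=25, container_size=10, days=3)))  # [3, 2, 3]
```

No central schedule is computed: the cards released each day are driven entirely by consumption, which is the sense in which kanban "enhances current systems" rather than replacing them.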

Relevance:

30.00%

Publisher:

Abstract:

Electronic applications are currently developed under the reuse-based paradigm. This design methodology presents several advantages for the reduction of the design complexity, but brings new challenges for the test of the final circuit. The access to embedded cores, the integration of several test methods, and the optimization of the several cost factors are just a few of the several problems that need to be tackled during test planning. Within this context, this thesis proposes two test planning approaches that aim at reducing the test costs of a core-based system by means of hardware reuse and integration of the test planning into the design flow. The first approach considers systems whose cores are connected directly or through a functional bus. The test planning method consists of a comprehensive model that includes the definition of a multi-mode access mechanism inside the chip and a search algorithm for the exploration of the design space. The access mechanism model considers the reuse of functional connections as well as partial test buses, cores transparency, and other bypass modes. The test schedule is defined in conjunction with the access mechanism so that good trade-offs among the costs of pins, area, and test time can be sought. Furthermore, system power constraints are also considered. This expansion of concerns makes an efficient, yet fine-grained, search possible in the huge design space of a reuse-based environment. Experimental results clearly show the variety of trade-offs that can be explored using the proposed model, and its effectiveness on optimizing the system test plan. Networks-on-chip are likely to become the main communication platform of systems-on-chip. Thus, the second approach presented in this work proposes the reuse of the on-chip network for the test of the cores embedded into the systems that use this communication platform.
A power-aware test scheduling algorithm aiming at exploiting the network characteristics to minimize the system test time is presented. The reuse strategy is evaluated considering a number of system configurations, such as different positions of the cores in the network, power consumption constraints and number of interfaces with the tester. Experimental results show that the parallelization capability of the network can be exploited to reduce the system test time, whereas area and pin overhead are strongly minimized. In this manuscript, the main problems of the test of core-based systems are firstly identified and the current solutions are discussed. The problems being tackled by this thesis are then listed and the test planning approaches are detailed. Both test planning techniques are validated for the recently released ITC’02 SoC Test Benchmarks, and further compared to other test planning methods of the literature. This comparison confirms the efficiency of the proposed methods.
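The trade-off that power-aware test scheduling optimises can be shown with a deliberately simple greedy sketch: cores are grouped into parallel test sessions whose summed power stays under a cap, and the system test time is the sum of each session's longest test. This is only an illustration of the cost model; the thesis's actual algorithms are far more elaborate:

```python
# Hypothetical greedy sketch of power-constrained test scheduling.

def schedule(cores, power_cap):
    """cores: list of (name, test_time, power). Returns (sessions, total_time)."""
    pending = sorted(cores, key=lambda c: -c[1])     # longest tests first
    sessions = []
    while pending:
        session, power = [], 0
        for c in pending[:]:
            if power + c[2] <= power_cap:            # fits under the power cap
                session.append(c)
                power += c[2]
                pending.remove(c)
        sessions.append(session)
    # a session lasts as long as its slowest test; sessions run back to back
    total = sum(max(c[1] for c in s) for s in sessions)
    return sessions, total

cores = [("c1", 100, 3), ("c2", 80, 2), ("c3", 60, 2), ("c4", 40, 1)]
_, t = schedule(cores, power_cap=5)
print(t)  # 160: {c1,c2} run in parallel (100), then {c3,c4} (60)
```

Relaxing the power cap lets more cores run in parallel and shortens the schedule, which is exactly the parallelization capability the network reuse strategy exploits.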

Relevance:

30.00%

Publisher:

Abstract:

Companies and institutions operate in an environment that offers both opportunities and threats, which demands a body of information oriented to tactical, operational and strategic processes and decisions. However, obtaining fast, high-quality information is not merely a matter of acquiring information-system packages or developing them in-house; unfortunately, this is what has happened most often. To move beyond this amateurism, a systemic diagnosis of the organization is needed, with the goal of identifying the informational requirements for building a decision-support system. This study therefore carries out a systemic diagnosis of a pharmacy using the Soft Systems Methodology, which, through extensive interaction between the researcher and the people involved, identifies and structures the problematic situation in a linked fashion, analysing it from two standpoints: one related to the real world and the other to systems thinking. Through this process it develops a learning cycle that not only identifies the informational requirements for building an information system but also gathers and organizes often divergent views of a complex reality, in order to propose a set of activities and actions that can contribute to improving the problematic situation.

Relevance:

30.00%

Publisher:

Abstract:

The focus of this thesis is to discuss the development and modeling of an interface architecture to be employed for interfacing analog signals in mixed-signal SOC. We claim that the approach that is going to be presented is able to achieve wide frequency range, and covers a large range of applications with constant performance, allied to digital configuration compatibility. Our primary assumptions are to use a fixed analog block and to promote application configurability in the digital domain, which leads to a mixed-signal interface. The use of a fixed analog block avoids the performance loss common to configurable analog blocks. The usage of configurability on the digital domain makes possible the use of all existing tools for high level design, simulation and synthesis to implement the target application, with very good performance prediction. The proposed approach utilizes the concept of frequency translation (mixing) of the input signal followed by its conversion to the ΣΔ domain, which makes possible the use of a fairly constant analog block and also a uniform treatment of the input signal from DC to high frequencies. The programmability is performed in the ΣΔ digital domain, where performance can be closely achieved according to application specification. Theoretical and simulation models of the interface performance are developed for design space exploration and for physical design support. Two prototypes are built and characterized to validate the proposed model and to implement some application examples. The usage of this interface as a multi-band parametric ADC and as a two-channel analog multiplier and adder is shown. The multi-channel analog interface architecture is also presented. The characterization measurements support the main advantages of the proposed approach.
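The signal path described, frequency translation by mixing followed by ΣΔ conversion, can be sketched numerically. The parameters below (sampling rate, tone frequency, first-order modulator) are illustrative assumptions, not the thesis's actual design:

```python
# Sketch of the interface's signal path: mix to translate the band of
# interest toward DC, then first-order sigma-delta modulation to a 1-bit
# stream whose running average tracks the down-converted input.
import math

def mix(signal, f_lo, fs):
    # multiply by a local oscillator: shifts the f_lo component toward DC
    return [s * math.cos(2 * math.pi * f_lo * n / fs) for n, s in enumerate(signal)]

def sigma_delta(signal):
    integ, out = 0.0, []
    for x in signal:
        bit = 1.0 if integ >= 0 else -1.0   # 1-bit quantiser
        integ += x - bit                    # accumulate the quantisation error
        out.append(bit)
    return out

fs = 1000.0
# 100 Hz input tone, amplitude 0.5
x = [0.5 * math.cos(2 * math.pi * 100 * n / fs) for n in range(1000)]
bits = sigma_delta(mix(x, f_lo=100, fs=fs))
# mixing a 0.5-amplitude tone with a matched oscillator leaves a DC term of
# 0.25, which the mean of the bitstream approximates
print(round(sum(bits) / len(bits), 2))  # 0.25
```

Because only the oscillator frequency and the digital ΣΔ post-processing change per application, the analog block can stay fixed, which is the central claim of the architecture.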

Relevance:

30.00%

Publisher:

Abstract:

The Brookhart complex Ni(α-diimine)Cl2 (1) (α-diimine = 1,4-bis(2,6-diisopropylphenyl)-acenaphthenediimine) has been characterized after impregnation on silica (S1) and on MAO-modified silicas (4.0, 8.0 and 23.0 wt.% Al/SiO2, called S2, S3 and S4, respectively). Treatment of these heterogeneous systems with MAO produces active catalysts for ethylene polymerization. A high catalytic activity was obtained with the supported system 1/S3 (196 kg PE/mol[Ni]·h·atm; toluene, Al/Ni = 1000, 30 °C, 60 min, atmospheric ethylene pressure). The effects of the polymerization conditions were tested with the catalyst supported on S2, and the best catalytic activity was obtained with hexane as solvent, MAO as cocatalyst, an Al/Ni molar ratio of 1000 and a temperature of 30 °C (285 kg PE/mol[Ni]·h·atm). When the reaction was carried out according to the in situ methodology, the activity practically doubled and the polymers showed similar properties. By DSC analysis, the polymers produced by the supported catalysts showed the absence of a melting transition, similar to the results obtained with the homogeneous systems. According to the GPC curves, however, the polymers obtained with the supported system show polydispersities (Mw/Mn) varying between 1.7 and 7.0. A polyethylene blend (BPE/LPE) was prepared using the complexes Ni(α-diimine)Cl2 (1) (α-diimine = 1,4-bis(2,6-diisopropylphenyl)-acenaphthenediimine) and {TpMs*}TiCl3 (2) (TpMs* = hydridobis(3-mesitylpyrazol-1-yl)(5-mesitylpyrazol-1-yl)) supported in situ on MAO-modified silica (4.0 wt.% Al/SiO2, S2). Ethylene polymerization reactions were carried out in toluene at two temperatures (0 and 30 °C), varying the nickel molar fraction (xNi) and using MAO as external cocatalyst. At all temperatures the activities vary linearly with xNi, indicating the absence of a synergic effect between the nickel and titanium species.
The maximum activity was found at 0 °C. The melting temperature of the polyethylene blends produced at 0 °C decreases as xNi increases, indicating good compatibility between the phases of the polyethylenes obtained with the two catalysts. The melting temperature of the blends also depends on the order in which the catalysts were supported on the MAO-modified silica: immobilizing 1 first (2/1/S2) produces polymers with a melting temperature (Tm) lower than that obtained when the titanium complex is supported first (1/2/S2). Scanning electron microscopy (SEM) of the polyethylenes obtained with the two systems (2/1/S2 and 1/2/S2) shows the formation of spherical polymer particles, indicating that the spherical morphology of the support is reproduced. The synthesis, characterization and ethylene-oligomerization properties of four organometallic CrIII compounds are also described: [bis[2-(3,5-dimethyl-1-pyrazolyl)ethyl]amine]chromium(III) chloride (3a), [bis[2-(3,5-dimethyl-1-pyrazolyl)ethyl]benzylamine]chromium(III) chloride (3b), [bis[2-(3,5-dimethyl-1-pyrazolyl)ethyl]ether]chromium(III) chloride (3c) and [bis[2-(3-phenyl-1-pyrazolyl)ethyl]ether]chromium(III) chloride (3d). With the exception of compound 3a, all the chromium complexes were active for oligomerization after activation with MAO, and the TOFs obtained differ from those achieved with CrCl3(thf)3. Coordination of a tridentate ligand to the metal centre does not appreciably change the formation of C4 and C6, but the amount of C8 decreases while C10 and C12+ increase. The polymers produced by catalyst 3a at 3 and 20 bar of ethylene have, according to DSC analyses, melting temperatures of 133.8 and 136 °C respectively, indicating that in both cases high-density polyethylene is produced. The molar mass, determined by GPC, is 46647 g/mol with Mw/Mn = 2.4 (3 bar). The 3c/MAO system showed TOF, activity and selectivity toward the different α-olefins that vary with the ethylene pressure used, revealing a strong sensitivity to the concentration of dissolved ethylene.
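The activity unit used throughout (kg PE per mol of Ni per hour per atm) is just a normalisation of the polymer yield, which can be made explicit. The example numbers below are illustrative, chosen only to reproduce the quoted 196 figure, and are not data from the thesis:

```python
# Worked check of the catalytic activity unit kg PE / (mol[Ni] * h * atm).

def activity(mass_pe_kg, mol_ni, time_h, pressure_atm):
    """Polymer yield normalised by catalyst amount, time and ethylene pressure."""
    return mass_pe_kg / (mol_ni * time_h * pressure_atm)

# e.g. 1.96 g of polyethylene from 10 umol of Ni in 1 h at 1 atm
print(round(activity(1.96e-3, 10e-6, 1.0, 1.0)))  # 196
```

Normalising by pressure as well as time and catalyst loading is what makes runs at atmospheric pressure and at elevated pressure comparable.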

Relevance:

30.00%

Publisher:

Abstract:

The work described in this thesis aims to support the distributed design of integrated systems and considers specifically the need for collaborative interaction among designers. Particular emphasis was given to issues which were only marginally considered in previous approaches, such as the abstraction of the distribution of design automation resources over the network, the possibility of both synchronous and asynchronous interaction among designers and the support for extensible design data models. Such issues demand a rather complex software infrastructure, as possible solutions must encompass a wide range of software modules: from user interfaces to middleware to databases. To build such a structure, several engineering techniques were employed and some original solutions were devised. The core of the proposed solution is based on the joint application of two homonymic technologies: CAD Frameworks and object-oriented frameworks. The former concept was coined in the late 1980s within the electronic design automation community and comprises a layered software environment which aims to support CAD tool developers, CAD administrators/integrators and designers. The latter, developed during the last decade by the software engineering community, is a software architecture model to build extensible and reusable object-oriented software subsystems. In this work, we proposed to create an object-oriented framework which includes extensible sets of design data primitives and design tool building blocks. Such an object-oriented framework is included within a CAD Framework, where it plays important roles in typical CAD Framework services such as design data representation and management, versioning, user interfaces, design management and tool integration.
The implemented CAD Framework, named Cave2, follows the classical layered architecture presented by Barnes, Harrison, Newton and Spickelmier, but the possibilities granted by the object-oriented framework foundations allowed a series of improvements which were not available in previous approaches:
- Object-oriented frameworks are extensible by design, so the same holds for the implemented sets of design data primitives and design tool building blocks. This means that both the design representation model and the software modules dealing with it can be upgraded or adapted to a particular design methodology, and that such extensions and adaptations still inherit the architectural and functional aspects implemented in the object-oriented framework foundation.
- The design semantics and the design visualization are both part of the object-oriented framework, but in clearly separated models. This allows different visualization strategies for a given design data set, which gives collaborating parties the flexibility to choose individual visualization settings.
- The control of the consistency between semantics and visualization, a particularly important issue in a design environment with multiple views of a single design, is also included in the foundations of the object-oriented framework. This mechanism is generic enough to be used by further extensions of the design data model, as it is based on the inversion of control between view and semantics: the view receives the user input and propagates the event to the semantic model, which evaluates whether a state change is possible and, if so, triggers the change of state of both semantics and view. Our approach takes advantage of this inversion of control and adds a layer between semantics and view to account for multi-view consistency.
- To optimize the consistency control mechanism between views and semantics, we propose an event-based approach that captures each discrete interaction of a designer with his or her design views. The information about each interaction is encapsulated in an event object, which may be propagated to the design semantics, and thus to other views, according to the consistency policy in use. Furthermore, the use of event pools allows a late synchronization between view and semantics when no network connection between them is available.
- The use of proxy objects significantly raises the abstraction of the integration of design automation resources, as both remote and local tools and services are accessed through method calls on a local object. The connection to remote tools and services using a look-up protocol also completely abstracts the network location of such resources, allowing resources to be added and removed at runtime.
- The implemented CAD Framework is completely based on Java technology, relying on the Java Virtual Machine as the layer that grants independence between the CAD Framework and the operating system.
All these improvements contributed to a higher abstraction of the distribution of design automation resources and also introduced a new paradigm for the remote interaction between designers. The resulting CAD Framework is able to support fine-grained collaboration based on events, so every single design update performed by a designer can be propagated to the rest of the design team regardless of their location in the distributed environment.
This can increase group awareness and allow a richer transfer of experiences among designers, significantly improving the collaboration potential when compared to previously proposed file-based or record-based approaches. Three different case studies were conducted to validate the proposed approach, each focusing on a subset of the contributions of this thesis. The first uses the proxy-based resource distribution architecture to implement a prototyping platform using reconfigurable hardware modules. The second extends the foundations of the implemented object-oriented framework to support interface-based design; these extensions, design representation primitives and tool blocks, are used to implement a design entry tool named IBlaDe, which allows the collaborative creation of functional and structural models of integrated systems. The third case study concerns the integration of multimedia metadata into the design data model, explored in the frame of an online educational and training platform.
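The inversion of control between view and semantics described above can be reduced to a short sketch: views forward user events to the semantic model, which validates the change and, if it is accepted, updates itself and refreshes every registered view. The class names, the toy validation rule and the dictionary-based state are illustrative assumptions, not Cave2's actual API:

```python
# Minimal sketch of event-based multi-view consistency via inversion of
# control: the view never mutates state directly.

class SemanticModel:
    def __init__(self):
        self.state = {}
        self.views = []

    def attach(self, view):
        self.views.append(view)
        view.model = self

    def propose(self, key, value):
        # validation hook: here we merely reject empty values, a stand-in
        # for real design-methodology rules
        if value is None:
            return False
        self.state[key] = value
        for v in self.views:          # multi-view consistency: all views refresh
            v.refresh(key, value)
        return True

class View:
    def __init__(self):
        self.shown = {}

    def user_edit(self, key, value):
        # user input enters through the view, but the model decides
        return self.model.propose(key, value)

    def refresh(self, key, value):
        self.shown[key] = value

model = SemanticModel()
v1, v2 = View(), View()
model.attach(v1); model.attach(v2)
v1.user_edit("width", 42)
print(v2.shown)  # {'width': 42}
```

An event-pool variant would simply queue the `propose` calls instead of applying them immediately, giving the late view/semantics synchronization mentioned for offline operation.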

Relevance:

30.00%

Publisher:

Abstract:

SOUZA, Rodrigo B.; MEDEIROS, Adelardo A. D.; NASCIMENTO, João Maria A.; GOMES, Heitor P.; MAITELLI, André L. A Proposal to the Supervision of Processes in an Industrial Environment with Heterogeneous Systems. In: INTERNATIONAL CONFERENCE OF THE IEEE INDUSTRIAL ELECTRONICS SOCIETY, 32., 2006, Paris. Proceedings... Paris: IECON, 2006.

Relevance:

30.00%

Publisher:

Abstract:

The development of robots has shown itself as a very complex interdisciplinary research field. The predominant procedure for these developments in the last decades is based on the assumption that each robot is a fully personalized project, with the direct embedding of hardware and software technologies in robot parts with no level of abstraction. Although this methodology has brought countless benefits to robotics research, it has, on the other hand, imposed major drawbacks: (i) the difficulty of reusing hardware and software parts in new robots or new versions; (ii) the difficulty of comparing the performance of different robot parts; and (iii) the difficulty of adapting development needs, at both hardware and software levels, to the expertise of local groups. Large advances might be achieved, for example, if physical parts of a robot could be reused in a different robot constructed with other technologies by another researcher or group. This paper proposes a framework for robots, TORP (The Open Robot Project), that aims to put forward a standardization of all dimensions (electrical, mechanical and computational) of a shared robot development model. This architecture is based on the dissociation between the robot and its parts, and between the robot parts and their technologies. In this paper, the first specification for a TORP family and the first humanoid robot constructed following the TORP specification set are presented, as well as the advances proposed for their improvement.
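The dissociation between a robot part and its technology amounts to programming against a technology-neutral contract. The sketch below illustrates that idea; all class and method names are hypothetical and do not come from the TORP specification:

```python
# Sketch: robot-level code depends only on an abstract part interface, so a
# part built with one technology can be swapped for another.
from abc import ABC, abstractmethod

class Actuator(ABC):
    """Technology-neutral contract for any actuator part."""
    @abstractmethod
    def move_to(self, angle_deg: float) -> float: ...

class HobbyServo(Actuator):
    def move_to(self, angle_deg):
        return max(0.0, min(180.0, angle_deg))   # limited mechanical range

class IndustrialServo(Actuator):
    def move_to(self, angle_deg):
        return angle_deg % 360.0                 # continuous rotation

def wave(arm: Actuator):
    # the robot-level behaviour never mentions the part's technology
    return [arm.move_to(a) for a in (0.0, 90.0, 200.0)]

print(wave(HobbyServo()))       # [0.0, 90.0, 180.0]
print(wave(IndustrialServo()))  # [0.0, 90.0, 200.0]
```

The same separation also enables drawback (ii)'s fix: two parts implementing one contract can be benchmarked against each other under identical robot-level code.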

Relevance:

30.00%

Publisher:

Abstract:

Photosynthetic induction in leaves of four-month-old Eucalyptus urograndis seedlings and of cuttings obtained from adult trees, both previously dark-adapted, was studied by the in vivo and in situ Open Photoacoustic Cell technique. Results for the gas-exchange component of the photoacoustic (PA) signal were interpreted considering that the gas-uptake component has a phase angle nearly opposite to that of the oxygen-evolution component. By subtracting the thermal component from the total PA signal, we studied the competition between gas uptake and oxygen evolution during photosynthetic induction. Seedlings presented net oxygen evolution before cuttings did, but cuttings reached a higher steady-state photosynthetic activity. The chlorophyll (Chl) a/b ratio and the Chl fluorescence induction characteristic Fv/Fm were significantly higher for cuttings, while there was no difference between samples in stomatal density or leaf thickness. Thus the differences in the PA signals of seedlings and cuttings are associated with differences between the photosystem 2 antenna systems of these samples.
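The phasor bookkeeping implied by this analysis can be sketched numerically: each PA component is an amplitude/phase pair, subtracting the thermal phasor from the total leaves the gas-exchange component, and oxygen evolution and gas uptake sit in near antiphase so they partially cancel. All amplitudes and phases below are illustrative numbers, not measured values:

```python
# Sketch of PA signal decomposition with complex phasors.
import cmath, math

def phasor(amplitude, phase_deg):
    return cmath.rect(amplitude, math.radians(phase_deg))

thermal    = phasor(1.0, 0)
o2_evol    = phasor(0.8, 60)
gas_uptake = phasor(0.3, 60 + 180)      # nearly opposite phase to O2 evolution

total = thermal + o2_evol + gas_uptake
gas_exchange = total - thermal           # remove the thermal component
# antiphase components cancel partially: |0.8 - 0.3| = 0.5
print(round(abs(gas_exchange), 2))  # 0.5
```

During induction, the measured gas-exchange amplitude therefore reflects the competition between the two processes: net evolution appears only once the evolution phasor outgrows the uptake phasor.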