923 results for Complex Engineering Systems
Abstract:
We propose a preliminary methodology for agent-oriented software engineering based on the idea of agent interaction analysis. This approach uses interactions between undetermined agents as the primary component of analysis and design. Agents as a basis for software engineering are useful because they provide a powerful and intuitive abstraction which can increase the comprehensibility of a complex design. The paper describes a process by which the designer can derive the interactions that can occur in a system satisfying the given requirements and use them to design the structure of an agent-based system, including the identification of the agents themselves. We suggest that this approach has the flexibility necessary to provide agent-oriented designs for open and complex applications, and has value for future maintenance and extension of these systems.
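As a hedged illustration of the interaction-first analysis described above, the sketch below (all role and interaction names are hypothetical, not from the paper) records interactions between as-yet-undetermined roles and then collects, for each role, the interactions it participates in, yielding candidate agents for the design:

```python
from collections import defaultdict

# Hypothetical interaction records: (initiating role, responding role, purpose).
# The roles are deliberately undetermined at this stage; agents emerge later.
interactions = [
    ("customer", "order-taker", "place order"),
    ("order-taker", "stock-keeper", "reserve items"),
    ("stock-keeper", "shipper", "dispatch items"),
]

def candidate_agents(interactions):
    """Group interactions by participating role: each role that carries
    responsibilities in some interaction is a candidate agent."""
    roles = defaultdict(list)
    for initiator, responder, purpose in interactions:
        roles[initiator].append(purpose)
        roles[responder].append(purpose)
    return dict(roles)

agents = candidate_agents(interactions)
```

The grouping makes the interaction, not the agent, the primary unit of analysis; agent boundaries fall out of which roles share interactions.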
Abstract:
The work described in this thesis aims to support the distributed design of integrated systems, considering specifically the need for collaborative interaction among designers. Particular emphasis was given to issues only marginally considered in previous approaches, such as the abstraction of the distribution of design automation resources over the network, the possibility of both synchronous and asynchronous interaction among designers, and the support for extensible design data models. Such issues demand a rather complex software infrastructure, as possible solutions must encompass a wide range of software modules: from user interfaces to middleware to databases. To build such an infrastructure, several engineering techniques were employed and some original solutions were devised. The core of the proposed solution is based on the joint application of two homonymous technologies: CAD Frameworks and object-oriented frameworks. The former concept was coined in the late 1980s within the electronic design automation community and comprises a layered software environment which aims to support CAD tool developers, CAD administrators/integrators and designers. The latter, developed during the last decade by the software engineering community, is a software architecture model for building extensible and reusable object-oriented software subsystems. In this work, we propose an object-oriented framework which includes extensible sets of design data primitives and design tool building blocks. This object-oriented framework is included within a CAD Framework, where it plays important roles in typical CAD Framework services such as design data representation and management, versioning, user interfaces, design management and tool integration.
The implemented CAD Framework, named Cave2, follows the classical layered architecture presented by Barnes, Harrison, Newton and Spickelmier, but the object-oriented framework foundations allowed a series of improvements not available in previous approaches:

- Object-oriented frameworks are extensible by design, so the same holds for the implemented sets of design data primitives and design tool building blocks. Both the design representation model and the software modules dealing with it can be upgraded or adapted to a particular design methodology, and such extensions and adaptations still inherit the architectural and functional aspects implemented in the object-oriented framework foundation.
- The design semantics and the design visualization are both part of the object-oriented framework, but in clearly separated models. This allows different visualization strategies for a given design data set, giving collaborating parties the flexibility to choose individual visualization settings.
- The control of consistency between semantics and visualization, a particularly important issue in a design environment with multiple views of a single design, is also included in the foundations of the object-oriented framework. The mechanism is generic enough to be used by further extensions of the design data model, as it is based on an inversion of control between view and semantics: the view receives the user input and propagates the event to the semantic model, which evaluates whether a state change is possible and, if so, triggers the change of state of both semantics and view. Our approach took advantage of this inversion of control and added a layer between semantics and view to account for multi-view consistency.
- To optimize the consistency control mechanism between views and semantics, we propose an event-based approach that captures each discrete interaction of a designer with his or her design views. The information about each interaction is encapsulated in an event object, which may be propagated to the design semantics, and thus to other views, according to the consistency policy in use. Furthermore, the use of event pools allows late synchronization between view and semantics when no network connection is available between them.
- The use of proxy objects significantly raised the level of abstraction of the integration of design automation resources, as both remote and local tools and services are accessed through method calls on a local object. Connecting to remote tools and services through a look-up protocol also completely abstracts the network location of such resources, allowing resources to be added and removed at runtime.
- The implemented CAD Framework is entirely based on Java technology, relying on the Java Virtual Machine as the layer that grants independence between the CAD Framework and the operating system.

All these improvements contributed to a higher abstraction of the distribution of design automation resources and introduced a new paradigm for remote interaction between designers. The resulting CAD Framework supports fine-grained, event-based collaboration, so every single design update performed by a designer can be propagated to the rest of the design team regardless of their location in the distributed environment.
This can increase group awareness and allow a richer transfer of experiences among designers, significantly improving the collaboration potential when compared to previously proposed file-based or record-based approaches. Three case studies were conducted to validate the proposed approach, each one focusing on a subset of the contributions of this thesis. The first uses the proxy-based resource distribution architecture to implement a prototyping platform using reconfigurable hardware modules. The second extends the foundations of the implemented object-oriented framework to support interface-based design. Such extensions, design representation primitives and tool blocks, are used to implement a design entry tool named IBlaDe, which allows the collaborative creation of functional and structural models of integrated systems. The third case study regards the integration of multimedia metadata into the design data model. This possibility is explored in the frame of an online educational and training platform.
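A minimal sketch of the view–semantics inversion of control described above, in Python rather than the Java of Cave2, with all class and method names illustrative (this is not the Cave2 API): the view never changes itself directly; it wraps each user interaction in an event and raises it, and the semantic model validates the change and then refreshes every attached view:

```python
class SemanticModel:
    def __init__(self):
        self.state = {}
        self.views = []

    def attach(self, view):
        self.views.append(view)

    def handle(self, event):
        key, value = event
        if value is None:              # stand-in for a real validity check
            return False               # illegal change: nothing happens
        self.state[key] = value        # commit the semantic change first...
        for view in self.views:
            view.refresh(key, value)   # ...then push it to every view
        return True


class View:
    def __init__(self, model):
        self.shown = {}
        self.model = model
        model.attach(self)

    def user_input(self, key, value):
        # inversion of control: the view only raises an event;
        # the semantic model decides whether the state changes
        return self.model.handle((key, value))

    def refresh(self, key, value):
        self.shown[key] = value


model = SemanticModel()
a, b = View(model), View(model)
a.user_input("width", 10)    # accepted: both views now show width = 10
a.user_input("width", None)  # rejected by the semantic model
```

An event-pool variant would queue events instead of delivering them immediately, enabling the late view–semantics synchronization mentioned above when the network connection is unavailable.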
Abstract:
A neural approach to the economic load dispatch problem in power systems is presented in this paper. Systems based on artificial neural networks have high computational rates due to the use of a massive number of simple processing elements and the high degree of connectivity between these elements. The ability of neural networks to realize complex nonlinear functions makes them attractive for system optimization. The neural networks applied to economic load dispatch reported in the literature sometimes fail to converge to feasible equilibrium points. The internal parameters of the modified Hopfield network developed here are computed using the valid-subspace technique; these parameters guarantee the network's convergence to feasible equilibrium points. A solution of the economic load dispatch problem corresponds to an equilibrium point of the network. Simulation results and a comparative analysis with other neural approaches illustrate the efficiency of the proposed approach.
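For context, the optimization problem behind this abstract can be stated compactly. The sketch below is not the modified Hopfield network of the paper; it solves a small economic load dispatch instance directly, assuming quadratic generator costs C_i(P) = a_i + b_i·P + c_i·P², a single demand-balance constraint, and no generator limits, via the equal-incremental-cost condition b_i + 2·c_i·P_i = λ (the cost coefficients are illustrative textbook-style values, not data from the paper):

```python
# Closed-form economic load dispatch for quadratic costs: at the optimum
# every unit runs at the same incremental cost lambda, so
#   P_i = (lambda - b_i) / (2 c_i),  and  sum(P_i) = demand  fixes lambda.

def dispatch(units, demand):
    """units: list of (a, b, c) cost coefficients. Returns (lambda, loads)."""
    s1 = sum(1.0 / (2 * c) for _, _, c in units)   # sum of 1/(2 c_i)
    s2 = sum(b / (2 * c) for _, b, c in units)     # sum of b_i/(2 c_i)
    lam = (demand + s2) / s1                       # common incremental cost
    loads = [(lam - b) / (2 * c) for _, b, c in units]
    return lam, loads

lam, loads = dispatch([(500, 5.3, 0.004),
                       (400, 5.5, 0.006),
                       (200, 5.8, 0.009)], demand=800)
```

A feasible equilibrium point in the paper's Hopfield formulation corresponds to exactly this kind of load vector: one that meets the demand at equal incremental cost.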
Abstract:
Domains where knowledge representation is too complex to be described analytically and deterministically are very common in the petroleum industry, particularly in the field of exploration and production. In these domains, applications of artificial intelligence techniques are very suitable, especially in cases where the preservation of corporate and technical knowledge is important. The Laboratory for Research on Artificial Intelligence Applied to Petroleum Engineering (LIAP) at Unicamp has, during the last 10 years, dedicated research efforts to building intelligent systems in the well drilling and petroleum production fields. In the following sections, recent advances in intelligent systems under development in the research laboratory are described. (C) 2001 Published by Elsevier B.V.
Abstract:
A complex system is often identified by the absence of a characteristic length, e.g. as in a fractal. A very large system subject to fragmentation and/or aggregation dynamics passes through such complex configurations. We statistically study the creation and maintenance of such configurations in space dimensions d = 1 to 5 and find that they are easily created (maintained) for small (large) d. An intermediate d such as d = 3 seems to be ideal for the creation and maintenance of complex systems. This has consequences for a statistical description of the universe.
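A dimension-free toy version of such fragmentation/aggregation dynamics (an illustrative sketch, not the model of the paper) can be simulated by repeatedly splitting or merging integer cluster masses; total mass is conserved while cluster sizes spread over many scales:

```python
import random

def evolve(masses, steps, p_frag=0.5, rng=random.Random(0)):
    """Toy dynamics: at each step either fragment one cluster (uniform
    integer split) or aggregate two randomly chosen clusters."""
    masses = list(masses)
    for _ in range(steps):
        if rng.random() < p_frag and any(m > 1 for m in masses):
            i = rng.choice([k for k, m in enumerate(masses) if m > 1])
            cut = rng.randint(1, masses[i] - 1)
            masses[i:i + 1] = [cut, masses[i] - cut]   # fragment in place
        elif len(masses) > 1:
            i, j = rng.sample(range(len(masses)), 2)
            masses[min(i, j)] += masses[max(i, j)]     # aggregate the pair
            del masses[max(i, j)]
    return masses

final = evolve([1024], steps=2000)
```

The spatial embedding (the dimension d that the paper varies) is deliberately absent here; it governs which clusters can actually meet or break apart.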
Abstract:
Power-law distributions, i.e. Lévy flights, have been observed in various economic, biological, and physical systems in the high-frequency regime. These distributions can be successfully explained via the gradually truncated Lévy flight (GTLF). In general, these systems converge to a Gaussian distribution in the low-frequency regime. In the present work, we develop a model for the physical basis of the cut-off length in the GTLF and its variation with respect to the time interval between successive observations. We observe that the GTLF automatically approaches a Gaussian distribution in the low-frequency regime. We applied the present method to analyze time series from some physical and financial systems. The agreement between the experimental results and theoretical curves is excellent. The present method can be applied to analyze time series in a variety of fields, which in turn provides a basis for the development of further microscopic models for the system. © 2000 Elsevier Science B.V. All rights reserved.
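As a hedged sketch of the idea (the truncation rule here is illustrative, not the paper's exact distribution), the code below draws Pareto-tailed steps and "gradually truncates" them by accepting a magnitude beyond the cutoff ℓ only with probability exp(-(|x| − ℓ)/ℓ); single steps remain strongly non-Gaussian, while sums of many steps approach a Gaussian, mirroring the high- versus low-frequency regimes discussed above:

```python
import math
import random

def gtlf_step(alpha=1.5, cutoff=10.0, rng=random.random):
    """One gradually truncated heavy-tailed step (illustrative rule)."""
    while True:
        x = rng() ** (-1.0 / alpha)            # Pareto(alpha) magnitude >= 1
        if x <= cutoff or rng() < math.exp(-(x - cutoff) / cutoff):
            return x if rng() < 0.5 else -x    # random sign

def excess_kurtosis(xs):
    """Sample excess kurtosis; 0 for a Gaussian."""
    n = len(xs)
    m = sum(xs) / n
    v = sum((x - m) ** 2 for x in xs) / n
    return sum((x - m) ** 4 for x in xs) / (n * v * v) - 3.0

random.seed(1)
single = [gtlf_step() for _ in range(20000)]                     # high frequency
summed = [sum(gtlf_step() for _ in range(100)) for _ in range(2000)]  # low frequency
```

Because the gradual truncation gives the steps a finite variance, the central limit theorem drives the aggregated series toward a Gaussian, which is the convergence the abstract describes.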
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
We consider a class of involutive systems of n smooth vector fields on the n + 1 dimensional torus. We obtain a complete characterization for the global solvability of this class in terms of Liouville forms and of the connectedness of all sublevel and superlevel sets of the primitive of a certain 1-form in the minimal covering space.
Abstract:
Competitive learning is an important machine learning approach which is widely employed in artificial neural networks. In this paper, we present a rigorous definition of a new type of competitive learning scheme realized on large-scale networks. The model consists of several particles walking within the network and competing with each other to occupy as many nodes as possible, while attempting to reject intruder particles. The particle's walking rule is composed of a stochastic combination of random and preferential movements. The model has been applied to solve community detection and data clustering problems. Computer simulations reveal that the proposed technique achieves high precision in community and cluster detection, as well as low computational complexity. Moreover, we have developed an efficient method for estimating the most likely number of clusters by using an evaluator index that monitors the information generated by the competition process itself. We hope this paper will provide an alternative approach to the study of competitive learning.
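A minimal sketch of particle-competition learning in this spirit (illustrative; the walking and domination rules here are simplified stand-ins, not the authors' exact scheme): each particle mixes a uniformly random neighbor choice with a preferential one biased toward nodes it already dominates, and each visit reinforces the visitor's hold on the node:

```python
import random

def compete(adj, starts, steps=5000, p_pref=0.6, seed=42):
    """adj: adjacency lists; starts: one start node per particle.
    Returns the dominant particle index for every node."""
    rng = random.Random(seed)
    # domination[v][k]: how strongly particle k holds node v
    domination = {v: [1.0] * len(starts) for v in adj}
    pos = list(starts)
    for _ in range(steps):
        for k, v in enumerate(pos):
            nbrs = adj[v]
            if rng.random() < p_pref:   # preferential move toward own territory
                weights = [domination[u][k] for u in nbrs]
                nxt = rng.choices(nbrs, weights=weights)[0]
            else:                       # purely random move
                nxt = rng.choice(nbrs)
            domination[nxt][k] += 1.0   # reinforce the visitor's hold
            pos[k] = nxt
    return {v: max(range(len(starts)), key=lambda k: domination[v][k])
            for v in adj}

# two 4-cliques joined by a single bridge edge (3-4)
adj = {0: [1, 2, 3], 1: [0, 2, 3], 2: [0, 1, 3], 3: [0, 1, 2, 4],
       4: [5, 6, 7, 3], 5: [4, 6, 7], 6: [4, 5, 7], 7: [4, 5, 6]}
labels = compete(adj, starts=[0, 7])
```

On this two-clique toy graph each particle tends to claim the clique it starts in, which is the community-detection behavior the abstract describes; the rejection of intruder particles in the full model sharpens that territorial split.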
The role of empirical research in the study of complex forms of governance in agroindustrial systems
Abstract:
The growing complexity of supply chains poses new challenges for Agricultural Research Centers and statistical agencies. The aim of this perspective paper is to discuss the role of empirical research in understanding the complex forms of governance in agribusiness. The authors argue that there are three fundamental levels of analysis: (i) the basic structure of the market, (ii) the formal contractual arrangements that govern relations within the agroindustrial system and (iii) the transactional dimensions governed by non-contractual means. The case of the agrochemical industry in Brazil illustrates how traditional analyses that only address market structure are insufficient to fully explain the agricultural sector and its supply chain. The article concludes by suggesting some indicators which could be collected by statistical agencies to improve understanding of the complex relationships among agribusiness segments. In doing so, the paper seeks to minimize costs and to enable a better formulation of public and private policies.
Abstract:
Doctoral program: Motor praxiology, physical education and sport training
Abstract:
[ES] The challenge of achieving a more efficient electrical grid requires the massive introduction of renewable energy into the grid, thereby reducing CO2 emissions. To this end, we propose not only to control production, as has been done until now, but also to control demand. Accordingly, this research evaluates the use of Model-Driven Engineering to manage the complexity of modelling electrical grids, Business Intelligence to analyze the large volume of simulation data, and Collective Intelligence to optimize the distribution of energy among the millions of devices on the demand side.
Abstract:
[EN] A complex stochastic Boolean system (CSBS) is a complex system depending on an arbitrarily large number
Abstract:
This thesis describes modelling tools and methods suited for complex systems (systems that are typically represented by a plurality of models). The basic idea is that all models representing the system should be linked by well-defined model operations in order to build a structured repository of information, a hierarchy of models. The port-Hamiltonian framework is a good candidate for solving this kind of problem, as it supports the most important model operations natively. The thesis in particular addresses the problem of integrating distributed parameter systems into a model hierarchy, and shows two possible mechanisms to do that: a finite-element discretization in port-Hamiltonian form, and a structure-preserving model order reduction for discretized models obtainable from commercial finite-element packages.
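A minimal port-Hamiltonian sketch (illustrative, unrelated to the thesis' actual tooling): an LC oscillator with state x = (charge q, flux φ) and Hamiltonian H = q²/(2C) + φ²/(2L) evolves as dx/dt = J·∇H with skew-symmetric J = [[0, 1], [-1, 0]]; the implicit midpoint rule is a simple structure-preserving discretization in the same spirit, conserving this quadratic energy exactly:

```python
C, L = 2.0, 0.5   # illustrative capacitance and inductance

def hamiltonian(q, phi):
    """Stored energy H = q^2/(2C) + phi^2/(2L)."""
    return q * q / (2 * C) + phi * phi / (2 * L)

def midpoint_step(q, phi, h):
    """One implicit midpoint step for x' = A x,
    A = J * diag(1/C, 1/L) = [[0, 1/L], [-1/C, 0]]."""
    a, b = h / (2 * L), h / (2 * C)        # half-step coefficients
    # solve (I - h/2 A) x_new = (I + h/2 A) x_old for the 2x2 system
    r1, r2 = q + a * phi, phi - b * q
    det = 1 + a * b
    return (r1 + a * r2) / det, (r2 - b * r1) / det

q, phi = 1.0, 0.0
e0 = hamiltonian(q, phi)
for _ in range(1000):
    q, phi = midpoint_step(q, phi, h=0.05)
```

Because J is skew-symmetric the continuous flow conserves H, and the implicit midpoint rule conserves quadratic invariants exactly; structure-preserving model order reduction pursues the same goal for reduced models of much larger discretized systems.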
Abstract:
[DE] In the present work, new methods for the synthesis of inorganic materials with novel architectures on the micrometer and nanometer scale are described. The central role of shape control is based on the template-induced deposition of the inorganic materials on self-assembled monolayers. Suitable substrates are gold-coated glass slides and gold colloids, which occupy an intermediate position between the world of atoms and molecules and the macroscopic world of extended solids. On these substrates, thiols can be adsorbed as a monomolecular layer, thereby changing the surface properties of the substrate. A particular focus of this work is the synthesis of thiols tailored to the requirements of each application.

In the first part of the work, gold-coated glass surfaces were used as templates. The deposition of calcium carbonate was studied as a function of the thickness of the adsorbed monolayer. Aragonite, one of the three main phases of the calcium carbonate system, was deposited under mild conditions on polyaromatic amide surfaces with layer thicknesses of 5-400 nm. The adjustable parameters were the chain length of the polymer, the ω-substituent, the binding to the gold surface through the use of different aminothiols, and the crystallization temperature. The layer thickness of the polymer films was controlled by an automated synthesis cycle. Titanium oxide films could be patterned on surfaces. For this purpose, a specially synthesized thiol was used which combined the functionality of a styrene unit at the surface boundary with a means of later removal from the surface.

The PDMS stamping technique produced microstructures on the gold surface in the range of 5 to 10 µm, which in turn could be transferred into the titanium oxide film via polymerization and deposition of the polymer. Three-dimensional structures were obtained using gold colloid templates. Tetraethylene glycol was monofunctionalized with a thiol group in exchange for a hydroxyl group. The resulting molecule was self-assembled on colloidal gold, yielding a water-soluble gold colloid; the preparation was carried out in a single-phase reaction. The gold colloids thus obtained were used as crystallization templates for the three-dimensional deposition of calcium carbonate. It was found that the glycol modifies the crystallization and the habit of the crystal at low pH. At elevated pH (pH = 12), however, the glycol-coated gold colloids act as templates and lead to spherical aggregates. Exposing gold colloids to long-chain dithiols leads to aggregation and precipitation of the colloids due to the cross-linking of several gold colloids by the thiol groups of the alkyl dithiols. To avoid this, a dithiol protected on one side was synthesized in this work, with which the aggregation could be suppressed. Subsequent deprotection of the thiol function led to gold colloids whose surface could be thiol-functionalized. These thiol-active gold colloids served as templates for the deposition of lead sulfide from organic/aqueous solution. The operation of the protecting group and the deprotection could be demonstrated by plasmon resonance spectroscopy. Titanium oxide / gold / polystyrene composites in tube form could also be synthesized; for this purpose, a human hair was chosen as a biological template for shaping.

By coating the hair with gold, assembling a styrene monomer which additionally carried a thiol functionality, polymerizing on the surface, depositing the titanium oxide film, and finally dissolving the biological template, a tube structure on the micrometer scale could be produced. In this work, gold colloids served not only as crystallization templates and shaping agents; they were themselves modified to form wire-like agglomerates on the nanometer scale. For this purpose, silicon dioxide templates were used. On the one hand, nanotubes of amorphous SiO2 could be prepared by a sol-gel method; on the other hand, this work made use of biological silicon oxide hollow needles isolated from marine sponges. Gold colloids were embedded in the hollow structures, and the structure was consolidated by forming colloid-thiol networks through the addition of dithiols. The gold nanowires, in the range of 100 to 500 nm, were exposed by dissolving the SiO2 template.