853 results for Heterogeneous interacting-agent model
Abstract:
Tsvetomir Tsachev - This report surveys some results in the field of optimal control of continuous heterogeneous systems published in the periodical scientific literature in recent years. A dynamical system is called heterogeneous if each of its elements has its own dynamics. Here we consider optimal control of systems whose heterogeneity is described by a one- or two-dimensional parameter, where each value of the parameter corresponds to an element of the system. Heterogeneous dynamical systems are used to model processes in economics, epidemiology, biology, public security (limiting drug use) and elsewhere. Here we consider a model of optimal investment in education at the macroeconomic level [11], of limiting the consequences of the spread of AIDS [9], of a market for carbon emission rights [3, 4], and of optimal macroeconomic growth under a rising level of frontier technologies [1]. Keywords: optimal control, continuous heterogeneous dynamical systems, applications in economics and epidemiology
Abstract:
Most machine-learning algorithms are designed for datasets whose features are all of a single type, and very little attention has been given to datasets with mixed-type features. We recently proposed a probabilistic latent-variable model to handle mixed types, the generalised generative topographic mapping (GGTM), which describes the data by type-specific distributions that are conditionally independent given the latent space. It has often been observed that visualisations of high-dimensional datasets can be poor in the presence of noisy features. In this paper we therefore extend the GGTM to estimate feature saliency values (GGTMFS) as an integrated part of the parameter learning process with an expectation-maximisation (EM) algorithm. The efficacy of the proposed GGTMFS model is demonstrated on both synthetic and real datasets.
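The feature-saliency idea in the abstract above can be illustrated with a much simpler stand-in model. The sketch below is not the GGTM itself: it runs a feature-saliency EM on a plain Gaussian mixture, where each feature is a mixture of a component-specific "relevant" density and a shared "irrelevant" background density, and the saliency rho_d is learned alongside the other parameters. The toy data, initialisation and all variable names are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two clusters in features 0 and 1; feature 2 is pure noise.
X = np.vstack([rng.normal(0, 1, (100, 3)), rng.normal(3, 1, (100, 3))])
X[:, 2] = rng.normal(0, 1, 200)

def norm_pdf(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

K, (N, D) = 2, X.shape
mu = np.quantile(X, [0.25, 0.75], axis=0)   # (K, D) component means
var = np.ones((K, D))                        # component variances
mu0, var0 = X.mean(0), X.var(0)              # shared "irrelevant" density
rho = np.full(D, 0.5)                        # feature saliencies, in [0, 1]
pi = np.full(K, 1.0 / K)                     # mixing weights

for _ in range(50):
    # E-step: each feature is a rho-weighted mix of relevant/irrelevant densities
    rel = norm_pdf(X[None], mu[:, None], var[:, None])   # (K, N, D)
    irr = norm_pdf(X, mu0, var0)                         # (N, D)
    per_feat = rho * rel + (1 - rho) * irr               # (K, N, D)
    r = pi[:, None] * per_feat.prod(-1)
    r /= r.sum(0)                                        # component responsibilities
    u = r[..., None] * rho * rel / per_feat              # P(feature relevant & comp k)
    # M-step
    pi = r.mean(1)
    mu = (u * X).sum(1) / (u.sum(1) + 1e-12)
    var = (u * (X - mu[:, None]) ** 2).sum(1) / (u.sum(1) + 1e-12) + 1e-6
    rho = u.sum((0, 1)) / N

print(np.round(rho, 2))   # the noise feature should end up least salient
```

The same structural idea carries over to mixed-type data: the per-feature densities simply become type-specific, as in the GGTM.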
Abstract:
In studies of complex heterogeneous networks, particularly of the Internet, significant attention has been paid to analysing network failures caused by hardware faults or overload. There, network reaction was modelled as rerouting of traffic away from failed or congested elements. Here we model network reaction to congestion on much shorter time scales, when the input traffic rate through congested routes is reduced. As an example we consider the Internet, where local mismatch between demand and capacity results in traffic losses. We describe the onset of congestion as a phase transition characterised by strong, albeit relatively short-lived, fluctuations of losses caused by noise in input traffic and exacerbated by the heterogeneous nature of the network, manifested in a power-law load distribution. The fluctuations may result in the network strongly overreacting to the first signs of congestion by significantly reducing input traffic along communication paths where congestion is utterly negligible. © 2013 IEEE.
Abstract:
Heterogeneous datasets arise naturally in most applications due to the use of a variety of sensors and measuring platforms. Such datasets can be heterogeneous in terms of the error characteristics and sensor models. Treating such data is most naturally accomplished using a Bayesian or model-based geostatistical approach; however, such methods generally scale rather badly with the size of the dataset, and require computationally expensive Monte Carlo based inference. Recently, within the machine learning and spatial statistics communities, many papers have explored the potential of reduced rank representations of the covariance matrix, often referred to as projected or fixed rank approaches. In such methods the covariance function of the posterior process is represented by a reduced rank approximation which is chosen such that there is minimal information loss. In this paper a sequential Bayesian framework for inference in such projected processes is presented. The observations are considered one at a time, which avoids the need for the high-dimensional integrals typically required in a Bayesian approach. A C++ library, gptk, which is part of the INTAMAP web service, is introduced which implements projected, sequential estimation and adds several novel features. In particular the library includes the ability to use a generic observation operator, or sensor model, to permit data fusion. It is also possible to cope with a range of observation error characteristics, including non-Gaussian observation errors. Inference for the covariance parameters is explored, including the impact of the projected process approximation on likelihood profiles. We illustrate the projected sequential method in application to synthetic and real datasets. Limitations and extensions are discussed. © 2010 Elsevier Ltd.
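The one-observation-at-a-time idea is easy to demonstrate with a generic reduced-rank sketch (this is not the gptk library or its API): represent the process in a fixed low-dimensional basis, so the posterior is a Gaussian over a small weight vector, and absorb each observation with a rank-one Kalman-style update. The basis, lengthscale and noise level below are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(1)

# Reduced-rank representation: M RBF features at fixed "inducing" centres.
M, noise_var = 15, 0.05
centres = np.linspace(0, 1, M)

def phi(x):
    # (N, M) basis matrix; the posterior process lives in this M-dim span
    return np.exp(-0.5 * ((np.atleast_1d(x)[:, None] - centres) / 0.1) ** 2)

# Prior over basis weights: w ~ N(0, I), so f(x) = phi(x) @ w
mean = np.zeros(M)
cov = np.eye(M)

# Observations arrive one at a time; each update is a rank-one
# (Kalman-style) correction, so no high-dimensional integral is needed.
x_obs = rng.uniform(0, 1, 40)
y_obs = np.sin(2 * np.pi * x_obs) + rng.normal(0, np.sqrt(noise_var), 40)

for x, y in zip(x_obs, y_obs):
    h = phi(x)[0]                     # (M,) feature row for this datum
    s = h @ cov @ h + noise_var       # predictive variance of y
    gain = cov @ h / s                # Kalman gain
    mean = mean + gain * (y - h @ mean)
    cov = cov - np.outer(gain, h @ cov)

x_test = np.linspace(0, 1, 5)
f_mean = phi(x_test) @ mean
print(np.round(f_mean, 2))            # should roughly track sin(2*pi*x)
```

A non-trivial observation operator (sensor model) would simply replace the feature row `h` with the operator applied to the basis, which is what makes this formulation convenient for data fusion.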
Abstract:
It is important to help researchers find valuable papers in a large literature collection. To this end, many graph-based ranking algorithms have been proposed. However, most of these algorithms suffer from ranking bias, which hurts the usefulness of a ranking algorithm because it returns a ranking list with an undesirable time distribution. This paper is a focused study on how to alleviate ranking bias by leveraging the heterogeneous network structure of the literature collection. We propose a new graph-based ranking algorithm, MutualRank, that integrates mutual reinforcement relationships among networks of papers, researchers, and venues to achieve a more synthetic, accurate, and less-biased ranking than previous methods. MutualRank provides a unified model that involves both intra- and inter-network information for ranking papers, researchers, and venues simultaneously. We use the ACL Anthology Network as the benchmark data set and construct the gold standard from the computational linguistics course websites of well-known universities and two well-known textbooks. The experimental results show that MutualRank greatly outperforms state-of-the-art competitors, including PageRank, HITS, CoRank, FutureRank, and P-Rank, in ranking papers, both in improving ranking effectiveness and in alleviating ranking bias. The rankings of researchers and venues produced by MutualRank are also quite reasonable.
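The mutual-reinforcement idea can be shown on a toy heterogeneous paper-author-venue network. The adjacency matrices, damping weights and update rule below are invented for illustration; they are not MutualRank's actual equations, only the general pattern of coupled score propagation between networks.

```python
import numpy as np

# Toy heterogeneous bibliographic network: 4 papers, 3 authors, 2 venues.
P2P = np.array([[0, 1, 1, 0],      # row i: papers cited by paper i
                [0, 0, 1, 0],
                [0, 0, 0, 0],
                [0, 0, 1, 0]], float)
A2P = np.array([[1, 1, 0, 0],      # row j: papers written by author j
                [0, 0, 1, 0],
                [0, 0, 1, 1]], float)
V2P = np.array([[1, 1, 0, 0],      # row k: papers published at venue k
                [0, 0, 1, 1]], float)

def spread(M, s):
    # Each source node splits its score evenly over its out-links.
    out = M.sum(1, keepdims=True)
    W = np.divide(M, out, out=np.zeros_like(M), where=out > 0)
    return W.T @ s

norm = lambda s: s / s.sum()
p, a, v = norm(np.ones(4)), norm(np.ones(3)), norm(np.ones(2))

for _ in range(100):
    # Papers are reinforced by citations, authors and venues; authors and
    # venues are reinforced in turn by the papers they are linked to.
    p = norm(0.4 * spread(P2P, p) + 0.3 * spread(A2P, a) + 0.3 * spread(V2P, v))
    a = norm(spread(A2P.T, p))
    v = norm(spread(V2P.T, p))

print("paper scores:", np.round(p, 3))
print("top paper:", int(p.argmax()))   # paper 2, which receives three citations
```

The coupling is what distinguishes this family of methods from a plain PageRank on the citation graph alone: a paper's standing feeds back into its authors' and venue's scores, and vice versa.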
Abstract:
Small and Medium Enterprises (SMEs) play an important part in the economy of any country. Initially, a flat management hierarchy, quick response to market changes and cost competitiveness were seen as the competitive characteristics of an SME. Recently, in developed economies, technological capabilities (TCs) management (managing existing, and developing or assimilating new, technological capabilities for continuous process and product innovations) has become important for both large organisations and SMEs to achieve sustained competitiveness. Various technological innovation capability (TIC) models have therefore been developed at firm level to assess firms' innovation capability. The outputs of these models help policy makers and firm managers devise policies for deepening a firm's technical knowledge generation, acquisition and exploitation capabilities for a sustained technological competitive edge. In developing countries, however, TCs management is more a matter of TCs upgrading: acquiring TCs from abroad and then assimilating, innovating and exploiting them. Most TIC models for developing countries delineate the level of TIC required as firms move from the acquisition to the innovative level. However, these models do not provide tools for assessing a firm's existing level of TIC and the various factors affecting it, which would support practical interventions for upgrading firms' TCs towards improved or new processes and products. Recently, the Government of Pakistan (GOP) has realised the importance of TCs upgrading in SMEs, especially export-oriented ones, for their sustained competitiveness, and has launched various initiatives with local and foreign assistance to identify ways and means of upgrading local SMEs' capabilities. This research targets this gap and develops a TIC assessment model for identifying the existing level of TIC of manufacturing SMEs located in clusters in Sialkot, Pakistan.
SME executives in three different export-oriented clusters at Sialkot were interviewed to analyse the technological capabilities development initiatives (CDIs) they had taken to develop and upgrade their firms' TCs. Data analysed at the CDI, firm, cluster and cross-cluster levels first helped classify the interviewed firms as leaders, followers and reactors, with leader firms claiming to introduce mostly new CDIs to their cluster. Second, the data analysis showed that the interviewed leader firms mostly exhibited 'learning by interacting' and 'learning by training' capabilities for acquiring expertise from customers and international consultants. However, these leader firms did not show much evidence of learning by using, reverse engineering and R&D capabilities, which according to the extant literature are necessary for upgrading a firm's existing TIC level, and thus its TCs, for better value-added processes and products. The research results are supported by the extant literature on the Sialkot clusters. In sum, this research developed a TIC assessment model that qualitatively identifies the interviewed firms' TIC levels and the factors affecting them, and that is validated by the existing literature on the interviewed Sialkot clusters. Further, the research gives policy-level recommendations for upgrading TIC, and thus TCs, at the firm and cluster levels to target better value-added markets.
Abstract:
The small-scale energy-transfer mechanism in zero-temperature superfluid turbulence of helium-4 is still a widely debated topic. Currently, the main hypothesis is that weakly nonlinear interacting Kelvin waves (KWs) transfer energy to sufficiently small scales such that energy is dissipated as heat via phonon excitations. Theoretically, there are at least two proposed theories for Kelvin-wave interactions. We perform the most comprehensive numerical simulation of weakly nonlinear interacting KWs to date and show, using a specially designed numerical algorithm incorporating the full Biot-Savart equation, that our results are consistent with the nonlocal six-wave KW interactions as proposed by L'vov and Nazarenko.
Abstract:
This work introduces a model in which agents of a network act upon one another according to three different kinds of moral decisions. These decisions are based on an increasing level of sophistication in the empathy capacity of the agent, a hierarchy which we name Piaget's ladder. The decision strategy of the agents is non-rational, in the sense that the strategies are arbitrarily fixed, and the model presents quenched disorder given by the distribution of its defining parameters. An analytical solution for this model is obtained in the large-system limit, as well as a leading-order correction for finite-size systems, which shows that typical realisations of the model develop a phase structure with both continuous and discontinuous non-thermal transitions.
Abstract:
Addressing some of the limitations of the theoretical models of firm internationalization, this paper argues that in order to understand the process of firm internationalization over time, one should observe firms and their contextual environments as complex interacting processes. Drawing on the theoretical models of the firm internationalization process, this study proposes a research model that may be suitable for the empirical examination of the dynamics of internationalization. The proposed model builds a relationship between the pattern of firm internationalization, operationalized via concepts such as entry, diversity, pace and phases, and its context (environment, firm, management). By analyzing the interaction between the multi-level processes that shape internationalization, one can explore the reasons behind the dynamic profile of firm internationalization.
Abstract:
The phenomenal growth of the Internet has connected us to a vast amount of computation and information resources around the world. However, making use of these resources is difficult due to the unparalleled massiveness, high communication latency, shared-nothing architecture and unreliable connections of the Internet. In this dissertation, we present a distributed software agent approach, which brings a new distributed problem-solving paradigm to Internet computing research with an enhanced client-server scheme, inherent scalability and heterogeneity. Our study discusses the role of a distributed software agent in Internet computing and classifies it into three major categories by the objects it interacts with: computation agents, information agents and interface agents. The discussion of the problem domain and the deployment of the computation agent and the information agent are presented with the analysis, design and implementation of experimental systems in high-performance Internet computing and in scalable Web searching. In the computation agent study, we show that high-performance Internet computing can be achieved with our proposed Java massive computation agent (JAM) model. We analyzed the JAM computing scheme and built a brute-force ciphertext decryption prototype. In the information agent study, we discuss the scalability problem of existing Web search engines and design an approach to Web searching with distributed collaborative index agents. This approach can be used to construct a more accurate, reusable and scalable solution to deal with the growth of the Web and of the information on it. Our research reveals that by deploying distributed software agents in Internet computing, we obtain a more cost-effective way to make better use of the gigantic network of computation and information resources on the Internet. The case studies in our research show that we are now able to solve many practically hard or previously unsolvable problems caused by the inherent difficulties of Internet computing.
Abstract:
Today, databases have become an integral part of information systems. In the past two decades, we have seen different database systems being developed independently and used in different application domains. Today's interconnected networks and advanced applications, such as data warehousing, data mining and knowledge discovery, and intelligent data access to information on the Web, have created a need for integrated access to such heterogeneous, autonomous, distributed database systems. Heterogeneous/multidatabase research has focused on this issue, resulting in many different approaches. However, no single, generally accepted methodology has emerged in academia or industry that provides ubiquitous intelligent data access from heterogeneous, autonomous, distributed information sources. This thesis describes a heterogeneous database system being developed at the High-performance Database Research Center (HPDRC). A major impediment to ubiquitous deployment of multidatabase technology is the difficulty of resolving semantic heterogeneity, that is, of identifying related information sources for integration and querying purposes. Our approach considers the semantics of the meta-data constructs in resolving this issue. The major contributions of the thesis include: (i) a scalable, easy-to-implement architecture for developing a heterogeneous multidatabase system, utilizing the Semantic Binary Object-oriented Data Model (Sem-ODM) and the Semantic SQL query language to capture the semantics of the data sources being integrated and to provide an easy-to-use query facility; (ii) a methodology for semantic heterogeneity resolution that investigates the extents of the meta-data constructs of component schemas; this methodology is shown to be correct, complete and unambiguous; (iii) a semi-automated technique for identifying semantic relations, which is the basis of semantic knowledge for integration and querying, using shared ontologies for context mediation; (iv) resolutions for schematic conflicts and a language for defining global views from a set of component Sem-ODM schemas; (v) the design of a knowledge base for storing and manipulating meta-data and knowledge acquired during the integration process; this knowledge base acts as the interface between the integration and query processing modules; (vi) techniques for Semantic SQL query processing and optimization based on semantic knowledge in a heterogeneous database environment; and (vii) a framework for intelligent computing and communication on the Internet applying the concepts of our work.
Abstract:
Query processing is a commonly performed procedure and a vital and integral part of information processing. It is therefore important for information processing applications to continuously improve the accessibility of data sources as well as the ability to perform queries on them. It is well known that the relational database model and the Structured Query Language (SQL) are currently the most popular tools for implementing and querying databases. However, a certain level of expertise is needed to use SQL and to access relational databases. This study presents a semantic modeling approach that enables the average user to access and query existing relational databases without concern for the database's structure or technicalities. The method includes an algorithm to represent relational database schemas in a more semantically rich way, the result of which is a semantic view of the relational database. The user performs queries using an adapted version of SQL, namely Semantic SQL. This method substantially reduces the size and complexity of queries. Additionally, it shortens the database application development cycle and improves maintenance and reliability by reducing the size of application programs. Furthermore, a Semantic Wrapper tool illustrating the semantic wrapping method is presented. I further extend the use of this semantic wrapping method to heterogeneous database management. Relational and object-oriented databases and Internet data sources are considered part of the heterogeneous database environment. Semantic schemas resulting from the algorithm were employed to describe the structure of these data sources in a uniform way, and Semantic SQL was utilized to query the various data sources. As a result, this method gives users the ability to access and query heterogeneous database systems in a more natural way.
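The flavour of querying through a semantic view can be shown with a deliberately tiny stand-in. The schema, the dotted-path "relationship" syntax and the helper below are all invented for illustration; they are not the thesis's actual Sem-ODM constructs or Semantic SQL grammar. The point is only that a named relationship is expanded into the join the user would otherwise have to write by hand.

```python
import sqlite3

# Toy relational schema with one foreign-key relationship.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE department(id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE employee(id INTEGER PRIMARY KEY, name TEXT,
                          dept_id INTEGER REFERENCES department(id));
    INSERT INTO department VALUES (1, 'R&D'), (2, 'Sales');
    INSERT INTO employee VALUES (1, 'Ada', 1), (2, 'Bob', 2);
""")

# Semantic view: relationship name -> (table, fk column, target table).
relations = {"employee.department": ("employee", "dept_id", "department")}

def semantic_select(path, attr, where):
    # Expand a semantic path into the underlying join, hiding the fk column.
    tbl, fk, target = relations[path]
    sql = (f"SELECT {target}.{attr} FROM {tbl} "
           f"JOIN {target} ON {tbl}.{fk} = {target}.id "
           f"WHERE {where[0]} = ?")
    return [row[0] for row in db.execute(sql, (where[1],))]

# "Which department does Ada belong to?" - no join written by the user.
print(semantic_select("employee.department", "name", ("employee.name", "Ada")))
```

Because the relationship catalogue is the only schema-specific part, the same query helper could sit in front of several heterogeneous sources, which is the wrapping idea the abstract describes.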
Abstract:
A methodology for formally modeling and analyzing the software architecture of mobile agent systems provides a solid basis for developing high-quality mobile agent systems, and such a methodology is helpful for studying other distributed and concurrent systems as well. However, providing it is a challenge because of agent mobility in mobile agent systems. The methodology was defined from two essential parts of software architecture: a formalism to define the architectural models and an analysis method to formally verify system properties. The formalism is two-layer Predicate/Transition (PrT) nets extended with dynamic channels, and the analysis method is a hierarchical approach that verifies models at different levels. The two-layer modeling formalism smoothly transforms physical models of mobile agent systems into their architectural models. Dynamic channels facilitate synchronous communication between nets, and they naturally capture the dynamic architecture configuration and agent mobility of mobile agent systems. Component properties are verified on transformed individual components, system properties are checked in a simplified system model, and interaction properties are analyzed on models composed from the nets involved. Based on the formalism and the analysis method, this researcher formally modeled and analyzed a software architecture for mobile agent systems and designed an architectural model of a medical information processing system based on mobile agents. The model checking tool SPIN was used to verify system properties such as reachability, concurrency and safety of the medical information processing system. From successfully modeling and analyzing the software architecture of mobile agent systems, the conclusion is that PrT nets extended with channels are a powerful tool for modeling mobile agent systems, and the hierarchical analysis method provides a rigorous foundation for the modeling tool. The hierarchical analysis method not only reduces the complexity of the analysis but also expands the application scope of model checking techniques. The results of formally modeling and analyzing the software architecture of the medical information processing system show that model checking is an effective and efficient way to verify software architecture. Moreover, this system demonstrates the high flexibility, efficiency and low cost of mobile agent technologies.
Abstract:
In recent years, wireless communication infrastructures have been widely deployed for both personal and business applications. The IEEE 802.11 series of Wireless Local Area Network (WLAN) standards attracts much attention due to its low cost and high data rate. Wireless ad hoc networks that use IEEE 802.11 standards are one of the hot spots of recent network research, and designing appropriate Media Access Control (MAC) layer protocols is one of the key issues for them. Existing wireless applications typically use omni-directional antennas, whose gain is the same in all directions. Due to the nature of the Distributed Coordination Function (DCF) mechanism of the IEEE 802.11 standards, only one of the one-hop neighbors can send data at a time; nodes other than the sender and the receiver must be in either the idle or the listening state, otherwise collisions can occur. The downside of the omni-directionality of antennas is that the spatial reuse ratio is low and the capacity of the network is considerably limited. Directional antennas have therefore been introduced to improve spatial reuse. A directional antenna has the following benefits: it can improve transport capacity by decreasing the interference of a directional main lobe; it can increase coverage range due to a higher SINR (Signal to Interference plus Noise Ratio), i.e., with the same power consumption, better connectivity can be achieved; and power usage can be reduced, i.e., for the same coverage, a transmitter can reduce its power consumption. To utilize the advantages of directional antennas, we propose a relay-enabled MAC protocol. Two relay nodes are chosen to forward data when the channel condition of the direct link from the sender to the receiver is poor. The two relay nodes can transfer data at the same time, and pipelined data transmission can be achieved by using directional antennas. Throughput improves significantly when the relay-enabled MAC protocol is introduced. Besides these strong points, directional antennas also have some explicit drawbacks, such as the hidden terminal and deafness problems and the requirement of maintaining location information for each node. Therefore, an omni-directional antenna should be used in some situations. The combined use of omni-directional and directional antennas leads to the problem of configuring heterogeneous antennas, i.e., given a network topology and a traffic pattern, we need to find a tradeoff between using omni-directional and using directional antennas to obtain better network performance over this configuration. Directly and mathematically establishing the relationship between network performance and antenna configurations is extremely difficult, if not intractable. Therefore, in this research, we propose several clustering-based methods to obtain approximate solutions to the heterogeneous antenna configuration problem, which can improve network performance significantly. Our proposed methods consist of two steps. The first step (clustering links) clusters the links into different groups based on a matrix-based system model; after clustering, links in the same group have similar neighborhood nodes and will use the same type of antenna. The second step (labeling links) decides the type of antenna for each group: for heterogeneous antennas, some groups of links will use directional antennas and others will adopt omni-directional antennas. Experiments were conducted to compare the proposed methods with existing methods, and the experimental results demonstrate that our clustering-based methods can improve network performance significantly.
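The two-step cluster-then-label procedure described above can be sketched in miniature. Everything concrete here (the toy topology, the one-hop-neighbourhood feature vectors, the small k-means routine and the crowding-based labelling rule) is an invented stand-in for the dissertation's matrix-based system model; it shows only the structure of the method, not its actual criteria.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy topology: 8 nodes on a line; a link is a (sender, receiver) pair.
links = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (6, 7), (1, 3), (4, 6)]
n_nodes = 8

# Step 1 (clustering links): describe each link by an indicator vector of its
# one-hop neighbourhood, so links with similar neighbour sets land in one group.
feat = np.zeros((len(links), n_nodes))
for i, (u, v) in enumerate(links):
    for w in (u - 1, u, u + 1, v - 1, v, v + 1):
        if 0 <= w < n_nodes:
            feat[i, w] = 1

def kmeans(X, k, iters=20):
    centres = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        lab = ((X[:, None] - centres) ** 2).sum(-1).argmin(1)
        for j in range(k):
            if (lab == j).any():
                centres[j] = X[lab == j].mean(0)
    return lab

groups = kmeans(feat, k=3)

# Step 2 (labelling links): a stand-in rule - groups whose links share many
# neighbours (a crowded neighbourhood) get directional antennas, the rest omni.
antenna = {}
for j in set(groups):
    crowding = feat[groups == j].sum(0).max()
    antenna[j] = "directional" if crowding >= 2 else "omni"

for (u, v), g in zip(links, groups):
    print(f"link {u}->{v}: group {g}, {antenna[g]}")
```

In the dissertation the labelling step would instead be driven by the traffic pattern and a network-performance objective; the sketch keeps only the shape of the pipeline: cluster links by neighbourhood similarity, then assign one antenna type per group.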
Abstract:
In the past two decades, multi-agent systems (MAS) have emerged as a new paradigm for conceptualizing large and complex distributed software systems. A multi-agent system view provides a natural abstraction for both the structure and the behavior of modern-day software systems. Although there were many conceptual frameworks for using multi-agent systems, there was no well-established and widely accepted method for modeling them. This dissertation research addressed the representation and analysis of multi-agent systems based on model-oriented formal methods. The objective was to provide a systematic approach for studying MAS at an early stage of system development to ensure the quality of the design. Given that there was no well-defined formal model directly supporting agent-oriented modeling, this study centered on three main topics: (1) adapting a well-known formal model, predicate transition nets (PrT nets), to support MAS modeling; (2) formulating a modeling methodology to ease the construction of formal MAS models; and (3) developing a technique to support machine analysis of formal MAS models using model checking technology. PrT nets were extended with the notions of dynamic structure, agent communication and coordination to support agent-oriented modeling. An aspect-oriented technique was developed to address the modularity of agent models and the compositionality of incremental analysis. A set of translation rules was defined to systematically translate formal MAS models into concrete models that can be verified with the model checker SPIN (Simple Promela Interpreter). This dissertation presents the framework developed for modeling and analyzing MAS, including a well-defined process model based on nested PrT nets and a comprehensive methodology to guide the construction and analysis of formal MAS models.