985 results for Link information
Abstract:
The ability to solve conflicting beliefs is crucial for multi-agent systems where the information is dynamic, incomplete and distributed over a group of autonomous agents. The proposed distributed belief revision approach consists of a distributed truth maintenance system and a set of autonomous belief revision methodologies. The agents have partial views and frequently hold disparate beliefs, which are automatically detected by the system's reason maintenance mechanism. The nature of these conflicts is dynamic and requires adequate methodologies for conflict resolution. The two types of conflicting beliefs addressed in this paper are Context Dependent and Context Independent Conflicts, which result, in the first case, from the assignment, by different agents, of opposite belief statuses to the same belief, and, in the latter case, from holding contradictory distinct beliefs. The belief revision methodology for solving Context Independent Conflicts is, basically, a selection process based on the assessment of the credibility of the opposing belief statuses. The belief revision methodology for solving Context Dependent Conflicts is, essentially, a search process for a consensual alternative based on a "next best" relaxation strategy.
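A minimal sketch of the selection process described for Context Independent Conflicts: each agent backs one of two opposing belief statuses with some credibility, and the status with the highest aggregate credibility wins. The agent names, the "IN"/"OUT" status labels, and the credibility values are invented for this illustration, not taken from the paper.

```python
# Toy Context Independent Conflict resolution: agents assign opposite
# statuses to the same belief; the status backed by the highest total
# credibility is selected.  All names and scores here are invented.

def resolve_conflict(votes):
    """votes: list of (agent, status, credibility) tuples."""
    support = {}
    for agent, status, credibility in votes:
        support[status] = support.get(status, 0.0) + credibility
    # Select the belief status with the highest aggregate credibility.
    return max(support, key=support.get)

votes = [("A1", "IN", 0.9), ("A2", "OUT", 0.4), ("A3", "OUT", 0.3)]
print(resolve_conflict(votes))  # "IN": 0.9 vs. 0.7
```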
Abstract:
Multi-agent architectures are well suited for complex, inherently distributed problem-solving domains. Among the many challenging aspects that arise within this framework, a crucial one emerges: how to incorporate dynamic and conflicting agent beliefs? While belief revision in a single-agent scenario concentrates on incorporating new information while preserving consistency, in a multi-agent system it also has to deal with possible conflicts between the agents' perspectives. To provide an adequate framework, each agent, built as a combination of an assumption-based belief revision system and a cooperation layer, was enriched with additional features: a distributed search control mechanism allowing dynamic context management, and a set of different distributed consistency methodologies. As a result, a Distributed Belief Revision Testbed (DiBeRT) was developed. This paper is a preliminary report presenting some of DiBeRT's contributions: a concise representation of external beliefs; a simple and innovative methodology to achieve distributed context management; and a reduced inter-agent data exchange format.
Abstract:
Environmental management is a complex task. The amount and heterogeneity of the data needed for an environmental decision-making tool are overwhelming without adequate database systems and innovative methodologies. As far as data management, data interaction and data processing are concerned, we propose the use of a Geographical Information System (GIS), whilst for the decision making we suggest a Multi-Agent System (MAS) architecture. With the adoption of a GIS we hope to provide a complementary coexistence between heterogeneous data sets, a correct data structure, a good storage capacity and a user-friendly interface. By choosing a distributed architecture such as a Multi-Agent System, where each agent is a semi-autonomous Expert System with the necessary skills to cooperate with the others in order to solve a given task, we hope to ensure dynamic problem decomposition and to achieve better performance than standard monolithic architectures. Finally, in view of the partial, imprecise and ever-changing character of the information available for decision making, Belief Revision capabilities are added to the system. Our aim is to present and discuss an intelligent environmental management system capable of suggesting the most appropriate land-use actions based on the existing spatial and non-spatial constraints.
Abstract:
This article discusses the development of an Intelligent Distributed Environmental Decision Support System, built upon the association of a Multi-agent Belief Revision System with a Geographical Information System (GIS). The inherently multidisciplinary nature of the expertise involved in environmental management, the need to define clear policies that allow the synthesis of divergent perspectives, their systematic application, and the reduction in cost and time that results from this integration are the main reasons motivating this project. This paper is organised in two parts: in the first we present and discuss the developed Distributed Belief Revision Test-bed (DiBeRT); in the second we analyse its application to the environmental decision support domain, with special emphasis on the interface with a GIS.
Abstract:
In this article, we present the first study on probabilistic tsunami hazard assessment for the Northeast (NE) Atlantic region related to earthquake sources. The methodology combines probabilistic seismic hazard assessment, tsunami numerical modeling, and statistical approaches. We consider three main tsunamigenic areas, namely the Southwest Iberian Margin, the Gloria, and the Caribbean. For each tsunamigenic zone, we derive the annual recurrence rate for each magnitude range, from Mw 8.0 up to Mw 9.0, at a regular interval, using the Bayesian method, which incorporates seismic information from historical and instrumental catalogs. A numerical code solving the shallow water equations is employed to simulate the tsunami propagation and compute nearshore wave heights. The probability of exceeding a specific tsunami hazard level during a given time period is calculated using the Poisson distribution. The results are presented in terms of the probability of exceedance of a given tsunami amplitude for 100- and 500-year return periods. The hazard level varies along the NE Atlantic coast, being maximum along the northern segment of the Morocco Atlantic coast, the southern Portuguese coast, and the Spanish coast of the Gulf of Cadiz. We find that the probability that a maximum wave height exceeds 1 m somewhere in the NE Atlantic region reaches 60 % and 100 % for 100- and 500-year return periods, respectively. These probability values decrease, respectively, to about 15 % and 50 % when considering the exceedance threshold of 5 m for the same return periods of 100 and 500 years.
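The Poisson step described above can be sketched directly: given an annual rate λ of events exceeding a hazard level, the probability of at least one such event within T years is 1 − exp(−λT). The rate value below is illustrative only, not one of the paper's estimates.

```python
import math

# Probability that a tsunami exceeding a given hazard level occurs at
# least once in T years, assuming Poissonian occurrence with annual
# rate lam.  The rate used below is illustrative, not the paper's.

def exceedance_probability(lam, T):
    return 1.0 - math.exp(-lam * T)

# An annual exceedance rate of 0.009 over 100- and 500-year windows:
for T in (100, 500):
    print(T, round(exceedance_probability(0.009, T), 3))
```

With this illustrative rate, the 100-year probability is about 0.59 and the 500-year probability approaches 1, the same qualitative behaviour reported in the abstract.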
Abstract:
The Tagus estuary is bordered by the largest metropolitan area in Portugal, that of the capital city of Lisbon. It has suffered the impact of several major tsunamis in the past, as shown by a recent revision of the catalogue of tsunamis that struck the Portuguese coast over the past two millennia. Hence, the exposure of the populations and infrastructure established along the riverfront is a critical concern for the civil protection services. The main objectives of this work are to determine critical inundation areas in Lisbon and to quantify the associated severity through a simple index derived from the local maximum of the momentum flux per unit mass and width. The methodology is based on the mathematical modelling of a tsunami propagating along the estuary, resembling the one that occurred on 1 November 1755 following the Mw 8.5 Great Lisbon Earthquake. The simulation tool employed was STAV-2D, a shallow-flow solver coupled with conservation equations for fine solid phases, now featuring the novelty of discrete Lagrangian tracking of large debris. Different sets of initial conditions were studied, combining distinct tidal, atmospheric and fluvial scenarios, so that the civil protection services were provided with comprehensive information to devise public warning and alert systems and post-event mitigation interventions. For the most severe scenario, the results show a maximum inundation extent of 1.29 km at the Alcântara valley and water depths reaching nearly 10 m across Lisbon's riverfront.
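For shallow flows, the momentum flux per unit mass and width is commonly written as h·u², with h the flow depth and u the depth-averaged velocity; a severity index built on its local maximum can be sketched as below. The time series values are invented for illustration and the exact index formula in the paper may differ.

```python
# Hedged sketch of a severity measure based on the local maximum of the
# momentum flux per unit mass and width, h*u^2, over a time series of
# depth h (m) and velocity u (m/s).  The sample series is invented.

def max_momentum_flux(depths, velocities):
    return max(h * u * u for h, u in zip(depths, velocities))

h_series = [0.5, 2.0, 4.0, 3.0]
u_series = [1.0, 3.0, 2.5, 1.0]
print(max_momentum_flux(h_series, u_series))  # 4.0 * 2.5**2 = 25.0
```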
Abstract:
This paper presents a model for the simulation of an offshore wind system with a rectifier input voltage malfunction at one phase. The offshore wind system model comprises a variable-speed wind turbine supported on a floating platform, equipped with a permanent magnet synchronous generator using a full-power four-level neutral point clamped converter. The link from the offshore floating platform to the onshore electrical grid is made through a light high-voltage direct current submarine cable. The drive train is modeled by a three-mass model. Considerations are offered on the use of the model in a smart grid context. The domino effect of the rectifier voltage malfunction is presented as a case study to show the capabilities of the model.
Abstract:
Hospitals are a special and important type of indoor public place where air quality has significant impacts on potential health outcomes. However, information on the indoor air quality of these environments, concerning exposure to particulate matter (PM) and related toxicity, is limited. This work aims to evaluate the risks associated with inhalation exposure to ten toxic metals and chlorine (As, Ni, Cr, Cd, Pb, Mn, Se, Ba, Al, Si, and Cl) in coarse (PM2.5–10) and fine (PM2.5) particles in a Portuguese hospital, in comparison with studies representative of other countries. Samples were collected over one month in one urban hospital; elemental PM characterization was determined by proton-induced X-ray emission. Noncarcinogenic and carcinogenic risks were assessed according to the methodology provided by the United States Environmental Protection Agency (USEPA; Region III Risk-Based Concentration Table) for three different age categories of hospital personnel (adults, >20 and <65 years) and for patients (considering nine different age groups, from children of 1–3 years to seniors of >65 years). The estimated noncarcinogenic risks due to occupational inhalation exposure to PM2.5-bound metals ranged from 5.88×10−6 for Se (adults, 55–64 years) to 9.35×10−1 for As (adults, 20–24 years), with total noncarcinogenic risks (sum of all metals) above the safe level for all three age categories. As and Cl (the latter due to its high abundance) were the most important contributors (approximately 90 %) to noncarcinogenic risks. For PM2.5–10, the noncarcinogenic risks of all metals were acceptable for all age groups.
Concerning carcinogenic risks, those of Ni and Pb were negligible (<1×10−6) in both PM fractions for all age groups of hospital personnel; potential risks were observed for As and Cr, with values in PM2.5 exceeding the USEPA guideline (up to 62 and 5 times, respectively) across all age groups; for PM2.5–10, increased excess risks of As and Cr were observed particularly for long-term exposures (adults, 55–64 years). Total carcinogenic risks greatly (up to 67 times) exceeded the recommended level for all age groups, clearly showing that occupational exposure to metals in fine particles poses significant risks. If the extensive working hours of hospital medical staff are considered, the respective noncarcinogenic and carcinogenic risks increase, the latter for PM2.5 exceeding the USEPA cumulative guideline of 10−4. For adult patients, the estimated noncarcinogenic and carcinogenic risks were approximately three times higher than for personnel, with particular concern for children and adolescents.
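The general shape of a USEPA-style inhalation risk screening can be sketched as follows: an exposure concentration EC is time-averaged from the air concentration and the exposure pattern, the noncarcinogenic hazard quotient compares EC with a reference concentration, and the carcinogenic risk scales EC by an inhalation unit risk. All parameter values below are illustrative assumptions, not the study's data.

```python
# Hedged sketch of USEPA-style inhalation risk screening.  All
# parameter values are illustrative only, not the study's.

def exposure_concentration(ca, et, ef, ed, at_hours):
    """ca: air concentration (ug/m3), et: hours/day, ef: days/year,
    ed: years of exposure, at_hours: averaging time in hours."""
    return ca * et * ef * ed / at_hours

def hazard_quotient(ec, rfc):
    """Noncarcinogenic: EC over a reference concentration (same units)."""
    return ec / rfc

def carcinogenic_risk(ec, iur):
    """Carcinogenic: EC times an inhalation unit risk (per ug/m3)."""
    return ec * iur

# Illustrative 25-year occupational exposure to an As-like pollutant:
ec = exposure_concentration(0.005, 8, 250, 25, 25 * 365 * 24)
print(carcinogenic_risk(ec, 4.3e-3))  # compare against the 1e-6 level
```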
Abstract:
MSc dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa to obtain the Master's degree in Electrical and Computer Engineering
Abstract:
In this paper we address the problem of computing multiple roots of a system of nonlinear equations through the global optimization of an appropriate merit function. The search for a global minimizer of the merit function is carried out by a metaheuristic, known as harmony search, which does not require any derivative information. The multiple roots of the system are determined sequentially over several iterations of a single run, in which the merit function is modified accordingly by penalty terms that aim to create repulsion areas around previously computed minimizers. A repulsion algorithm based on a multiplicative penalty function is proposed. Preliminary numerical experiments with a benchmark set of problems show the effectiveness of the proposed method.
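The repulsion idea can be sketched on a one-dimensional example: minimise a squared-residual merit function, then multiply it by a penalty factor that grows near already-found roots so the next run is driven elsewhere. A plain random search stands in for harmony search here, and the penalty form (1 + β·exp(−kd)) is one plausible choice, not necessarily the paper's.

```python
import math, random

# Sequentially finding multiple roots of f(x) = 0 by minimising a merit
# function multiplied by a repulsion factor around known roots.  A plain
# random search stands in for harmony search; the penalty form is an
# invented, plausible choice.

def merit(x):
    # Example system reduced to 1-D: f(x) = x^2 - 1, roots at +/-1.
    return (x * x - 1.0) ** 2

def penalised_merit(x, roots, beta=100.0):
    p = merit(x)
    for r in roots:
        # Multiplicative repulsion: large factor near a known root.
        p *= 1.0 + beta * math.exp(-10.0 * abs(x - r))
    return p

def random_search(roots, trials=20000, lo=-2.0, hi=2.0):
    best, best_val = None, float("inf")
    for _ in range(trials):
        x = random.uniform(lo, hi)
        v = penalised_merit(x, roots)
        if v < best_val:
            best, best_val = x, v
    return best

random.seed(0)
roots = []
for _ in range(2):
    roots.append(round(random_search(roots), 2))
print(sorted(roots))  # expected near [-1.0, 1.0]
```

After the first root is found, the repulsion factor makes candidates near it roughly β times less competitive, so the second run converges to the remaining root.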
Abstract:
Composition is a practice of key importance in software engineering. When real-time applications are composed, it is necessary that their timing properties (such as meeting deadlines) are guaranteed. The composition is performed by establishing an interface between the application and the physical platform. Such an interface typically contains information about the amount of computing capacity needed by the application. For multiprocessor platforms, the interface should also present information about the degree of parallelism. Several interface proposals have recently been put forward in various research works. However, those interfaces are either too complex to handle or too pessimistic. In this paper we propose the generalized multiprocessor periodic resource model (GMPR), which is strictly superior to the MPR model without requiring an overly detailed description. We then derive a method to compute the interface from the application specification. This method has been implemented in Matlab routines that are publicly available.
Abstract:
Final Master's project submitted to obtain the degree of Master in Electronics and Telecommunications Engineering
Abstract:
This paper formulates a novel expression for entropy inspired by the properties of Fractional Calculus. The characteristics of the generalized fractional entropy are tested both on standard probability distributions and on real-world data series. The results reveal that tuning the fractional order allows a high sensitivity to the signal evolution, which is useful in describing the dynamics of complex systems. The concepts are also extended to relative distances and tested with several sets of data, confirming the soundness of the generalization.
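The idea of an order-tunable generalization of Shannon entropy can be illustrated with the Tsallis entropy S_q, which recovers Shannon entropy as q → 1. This is not the fractional entropy proposed in the paper; it only shows how tuning an order parameter changes the sensitivity of the measure to the shape of the distribution.

```python
import math

# Tsallis entropy as an illustration of an order-tunable entropy.
# NOT the paper's fractional entropy; q -> 1 recovers Shannon entropy.

def tsallis_entropy(p, q):
    if abs(q - 1.0) < 1e-12:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

uniform = [0.25] * 4
peaked = [0.85, 0.05, 0.05, 0.05]
for q in (0.5, 1.0, 2.0):
    print(q, round(tsallis_entropy(uniform, q), 3),
          round(tsallis_entropy(peaked, q), 3))
```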
Abstract:
Developing a client-server application for a mobile environment can bring many challenges because of the limitations of mobile devices. This paper therefore discusses what may be the most reliable way to exchange information between a server and an Android mobile application, since it is important for users to have an application that works responsively and, preferably, without errors. In this discussion, two data transfer protocols (Socket and HTTP) and three data serialization formats (XML, JSON and Protocol Buffers) were tested using several metrics to evaluate which is the most practical and fast to use.
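The kind of comparison described can be sketched with the standard library alone: serialise the same record as JSON and as XML and compare payload sizes. The record contents are invented, and Protocol Buffers is omitted because it requires an external library and a schema.

```python
import json
import xml.etree.ElementTree as ET

# Serialise the same (invented) record in JSON and XML and compare
# payload sizes; Protocol Buffers is omitted (needs an external lib).

record = {"id": 42, "name": "sensor-a", "readings": [1.5, 2.5, 3.5]}

json_payload = json.dumps(record)

root = ET.Element("record")
ET.SubElement(root, "id").text = str(record["id"])
ET.SubElement(root, "name").text = record["name"]
readings = ET.SubElement(root, "readings")
for r in record["readings"]:
    ET.SubElement(readings, "value").text = str(r)
xml_payload = ET.tostring(root, encoding="unicode")

print(len(json_payload), len(xml_payload))  # JSON is typically smaller
```

A real benchmark would also time encode/decode round trips over the chosen transport, which is what distinguishes the protocols rather than the formats.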
Abstract:
In machine learning and pattern recognition tasks, the use of feature discretization techniques may have several advantages. The discretized features may hold enough information for the learning task at hand, while ignoring minor fluctuations that are irrelevant or harmful for that task. The discretized features have more compact representations that may yield both better accuracy and lower training time, as compared to the use of the original features. However, in many cases, mainly with medium and high-dimensional data, the large number of features usually implies that there is some redundancy among them. Thus, we may further apply feature selection (FS) techniques on the discrete data, keeping the most relevant features, while discarding the irrelevant and redundant ones. In this paper, we propose relevance and redundancy criteria for supervised feature selection techniques on discrete data. These criteria are applied to the bin-class histograms of the discrete features. The experimental results, on public benchmark data, show that the proposed criteria can achieve better accuracy than widely used relevance and redundancy criteria, such as mutual information and the Fisher ratio.
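The two baseline relevance criteria named above can be sketched for a single discrete feature against class labels: mutual information computed from the joint bin-class histogram, and the (two-class) Fisher ratio. The tiny feature/label arrays are invented for illustration.

```python
import math
from collections import Counter

# Baseline relevance criteria for one discrete feature vs. class labels:
# mutual information from the joint bin-class histogram, and the
# two-class Fisher ratio.  The sample data below is invented.

def mutual_information(feature, labels):
    n = len(feature)
    joint = Counter(zip(feature, labels))
    px = Counter(feature)
    py = Counter(labels)
    mi = 0.0
    for (x, y), c in joint.items():
        pxy = c / n
        # p(x,y) * log( p(x,y) / (p(x) p(y)) ), in nats
        mi += pxy * math.log(pxy * n * n / (px[x] * py[y]))
    return mi

def fisher_ratio(feature, labels):
    a = [f for f, y in zip(feature, labels) if y == 0]
    b = [f for f, y in zip(feature, labels) if y == 1]
    mean = lambda v: sum(v) / len(v)
    var = lambda v: sum((x - mean(v)) ** 2 for x in v) / len(v)
    return (mean(a) - mean(b)) ** 2 / (var(a) + var(b) + 1e-12)

feature = [0, 0, 1, 1, 2, 2]
labels = [0, 0, 0, 1, 1, 1]
print(round(mutual_information(feature, labels), 3),
      round(fisher_ratio(feature, labels), 3))
```

In an FS pipeline these scores would rank the discretized features, with a separate redundancy term pruning features that carry the same information.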