907 results for Multi Criteria Analysis
Abstract:
In a real-world multiagent system, where the agents are faced with partial, incomplete and intrinsically dynamic knowledge, conflicts are inevitable. Frequently, different agents have goals or beliefs that cannot hold simultaneously. Conflict resolution methodologies have to be adopted to overcome such undesirable occurrences. In this paper we investigate the application of distributed belief revision techniques as the support for conflict resolution in the analysis of the validity of the candidate beams to be produced in the CERN particle accelerators. This CERN multiagent system contains a higher-hierarchy agent, the Specialist agent, which makes use of meta-knowledge (on how the conflicting beliefs have been produced by the other agents) in order to detect which beliefs should be abandoned. Upon solving a conflict, the Specialist instructs the involved agents to revise their beliefs accordingly. Conflicts in the problem domain are mapped into conflicting beliefs of the distributed belief revision system, where they can be handled by proven formal methods. This technique builds on well-established concepts and combines them in a new way to solve important problems. We find this approach generally applicable in several domains.
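As an illustration of the conflict-resolution step described in this abstract, the following is a minimal sketch, assuming each belief carries a hypothetical credibility score derived from meta-knowledge about its derivation; the `Belief` class and `resolve_conflict` function are illustrative names, not the paper's actual system.

```python
from dataclasses import dataclass

@dataclass
class Belief:
    agent: str          # agent that holds the belief
    statement: str      # propositional content
    credibility: float  # meta-knowledge: how well-supported the derivation is

def resolve_conflict(conflicting):
    """Return the belief the Specialist asks its owner to abandon:
    the least credible one among the mutually inconsistent set."""
    return min(conflicting, key=lambda b: b.credibility)

# Two agents disagree on whether a candidate beam setting is valid.
conflict = [
    Belief("OpticsAgent", "beam_valid(B17)", credibility=0.9),
    Belief("MagnetAgent", "not beam_valid(B17)", credibility=0.4),
]
to_retract = resolve_conflict(conflict)
print(f"{to_retract.agent} should revise: {to_retract.statement}")
```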
Abstract:
The bending of simply supported composite plates is analyzed using a direct collocation meshless numerical method. In order to optimize the node distribution, the Direct MultiSearch (DMS) multi-objective optimization method is applied. In addition, the method optimizes the shape parameter of the radial basis functions. The optimization algorithm was able to find good solutions for a large variety of node distributions.
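A minimal sketch of the role the shape parameter plays in a radial basis function fit, with a brute-force sweep standing in for the DMS optimizer; the multiquadric kernel, test function, and parameter grid are assumptions for illustration, not the paper's setup.

```python
import numpy as np

def multiquadric(r, c):
    """Multiquadric RBF; c is the shape parameter being tuned."""
    return np.sqrt(r**2 + c**2)

def rbf_fit_error(x_nodes, f, c):
    """Interpolate f at x_nodes with shape parameter c and
    return the max error on a fine evaluation grid."""
    A = multiquadric(np.abs(x_nodes[:, None] - x_nodes[None, :]), c)
    coeffs = np.linalg.solve(A, f(x_nodes))
    x_eval = np.linspace(x_nodes.min(), x_nodes.max(), 200)
    B = multiquadric(np.abs(x_eval[:, None] - x_nodes[None, :]), c)
    return np.max(np.abs(B @ coeffs - f(x_eval)))

f = lambda x: np.sin(np.pi * x)          # illustrative test function
nodes = np.linspace(0.0, 1.0, 15)        # uniform node distribution
# Brute-force sweep over the shape parameter (a stand-in for DMS).
cs = np.linspace(0.05, 2.0, 40)
best_c = min(cs, key=lambda c: rbf_fit_error(nodes, f, c))
print(f"best shape parameter ~ {best_c:.3f}")
```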
Abstract:
Auditory event-related potentials (AERPs) are widely used in diverse fields of today's neuroscience, concerning auditory processing, speech perception, language acquisition, neurodevelopment, attention and cognition in normal aging, gender, developmental, neurologic and psychiatric disorders. However, their transposition to clinical practice has remained minimal, mainly due to the scarce literature on normative data across age, the wide spectrum of results, the variety of auditory stimuli used, and the different neuropsychological meanings attributed to AERP components by different authors. One of the most prominent AERP components studied in recent decades is N1, which reflects auditory detection and discrimination. N2, in turn, indicates attention allocation and phonological analysis. The simultaneous analysis of N1 and N2 elicited by feasible novelty experimental paradigms, such as the auditory oddball, seems an objective method to assess central auditory processing. The aim of this systematic review was to bring forward normative values for auditory oddball N1 and N2 components across age. EBSCO, PubMed, Web of Knowledge and Google Scholar were systematically searched for studies that elicited N1 and/or N2 by an auditory oddball paradigm. A total of 2,764 papers were initially identified in the databases, of which 19 resulted from hand search and additional references, between 1988 and 2013, the last 25 years. A final total of 68 studies met the eligibility criteria, with a total of 2,406 participants from control groups for N1 (age range 6.6–85 years; mean 34.42) and 1,507 for N2 (age range 9–85 years; mean 36.13). Polynomial regression analysis revealed that N1 latency decreases with aging at Fz and Cz; N1 amplitude at Cz decreases from childhood to adolescence and stabilizes after 30–40 years, while at Fz the decrement ends by 60 years and amplitude increases markedly after this age. Regarding N2, latency did not covary with age, but amplitude showed a significant decrement at both Cz and Fz. The results suggest reliable normative values for the Cz and Fz electrode locations; however, changes in brain development and component topography over age should be considered in clinical practice.
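A minimal sketch of the polynomial-regression step used to relate a component measure to age, run on synthetic data; the data points and the quadratic order are illustrative, not the review's normative values.

```python
import numpy as np

# Synthetic (age, N1 latency at Cz) pairs -- illustrative only.
rng = np.random.default_rng(0)
age = rng.uniform(7, 85, 200)
latency = 120 - 0.3 * age + 0.002 * age**2 + rng.normal(0, 5, age.size)

# Fit a second-order polynomial of latency on age, as in the
# regression analyses described above.
coeffs = np.polyfit(age, latency, deg=2)
model = np.poly1d(coeffs)
print("predicted N1 latency at age 30:", round(model(30), 1), "ms")
```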
Abstract:
Master's degree in Mechanical Engineering – Specialization in Industrial Management
Abstract:
Thesis submitted to Faculdade de Ciências e Tecnologia of Universidade Nova de Lisboa in partial fulfilment of the requirements for the degree of Master in Computer Science
Abstract:
Recent changes in electricity markets (EMs) have been potentiating the globalization of distributed generation. With distributed generation, the number of players acting in the EMs and connected to the main grid has grown, increasing the market complexity. Multi-agent simulation arises as an interesting way of analysing players' behaviour and interactions, namely coalitions of players, as well as their effects on the market. MASCEM was developed to allow studying the market operation of several different players, and MASGriP is being developed to allow the simulation of the micro grid and smart grid concepts in very different scenarios. This paper presents a methodology based on artificial intelligence (AI) techniques for the management of a micro grid. The use of fuzzy logic is proposed for the analysis of the agents' consumption elasticity, while case-based reasoning, used to predict the agents' reaction to price changes, is an interesting tool for the micro grid operator.
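A minimal sketch of how a fuzzy classification of consumption elasticity might look, using triangular membership functions; the linguistic terms and breakpoints are illustrative assumptions, not MASGriP's actual rule base.

```python
import numpy as np

def triangular(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def elasticity_degrees(e):
    """Fuzzy degrees of membership of an elasticity value in three
    illustrative linguistic terms."""
    return {
        "inelastic": float(triangular(e, -0.1, 0.0, 0.5)),
        "moderate":  float(triangular(e, 0.2, 0.7, 1.2)),
        "elastic":   float(triangular(e, 0.9, 1.5, 2.1)),
    }

print(elasticity_degrees(0.8))   # mostly "moderate"
```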
Abstract:
Electricity markets are complex environments, involving a large number of different entities, with specific characteristics and objectives, making their decisions and interacting in a dynamic scene. Game theory has been widely used to support decisions in competitive environments; therefore, its application to electricity markets can prove to be a high-potential tool. This paper proposes a new scenario analysis algorithm, which includes the application of game theory, to evaluate and preview different scenarios and provide players with the ability to react strategically, so as to exhibit the behavior that best fits their objectives. This model includes forecasts of competitor players' actions, used to build models of their behavior and to define the most probable expected scenarios. Once the scenarios are defined, game theory is applied to support the choice of the action to be performed. Our use of game theory is intended to support one specific agent and not to achieve equilibrium in the market. MASCEM (Multi-Agent System for Competitive Electricity Markets) is a multi-agent electricity market simulator that models market players and simulates their operation in the market. The scenario analysis algorithm has been tested within MASCEM, and our experimental findings from a case study based on real data from the Iberian Electricity Market are presented and discussed.
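A minimal sketch of the final decision step, assuming forecast probabilities for competitor scenarios and an estimated payoff for each own-bid/scenario pair are already available; the numbers are illustrative, not MASCEM outputs.

```python
import numpy as np

# Rows: candidate bid prices of the supported agent (EUR/MWh).
# Columns: forecast competitor scenarios; entries: estimated profit.
bids = [38.0, 42.0, 46.0]
scenario_prob = np.array([0.5, 0.3, 0.2])          # forecast scenario likelihoods
payoff = np.array([[120.0,  90.0,  60.0],
                   [150.0,  70.0,  40.0],
                   [180.0,  30.0,  10.0]])

expected = payoff @ scenario_prob                  # expected payoff per bid
best = int(np.argmax(expected))
print(f"bid {bids[best]} EUR/MWh, expected payoff {expected[best]:.1f}")
```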
Abstract:
Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies.
Abstract:
This paper presents the Realistic Scenarios Generator (RealScen), a tool that processes data from real electricity markets to generate realistic scenarios that enable the modeling of electricity market players’ characteristics and strategic behavior. The proposed tool provides significant advantages to the decision making process in an electricity market environment, especially when coupled with a multi-agent electricity markets simulator. The generation of realistic scenarios is performed using mechanisms for intelligent data analysis, which are based on artificial intelligence and data mining algorithms. These techniques allow the study of realistic scenarios, adapted to the existing markets, and improve the representation of market entities as software agents, enabling a detailed modeling of their profiles and strategies. This work contributes significantly to the understanding of the interactions between the entities acting in electricity markets by increasing the capability and realism of market simulations.
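A minimal sketch of the profile-building idea: clustering per-player features extracted from historical market data; the feature names, cluster count, and use of k-means are illustrative assumptions, not the tool's actual data-mining pipeline.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Illustrative per-player features: [mean bid price, bid volatility, mean volume].
features = np.vstack([
    rng.normal([40.0, 2.0, 100.0], [3.0, 0.5, 10.0], (30, 3)),   # e.g. base-load sellers
    rng.normal([70.0, 8.0,  20.0], [5.0, 1.0,  5.0], (30, 3)),   # e.g. peaking units
])

# Group players into strategic profiles for the scenario generator.
profiles = KMeans(n_clusters=2, n_init=10, random_state=0).fit(features)
print("players per profile:", np.bincount(profiles.labels_))
```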
Risk Acceptance in the Furniture Sector: Analysis of Acceptance Level and Relevant Influence Factors
Abstract:
Risk acceptance has been broadly discussed in relation to hazardous activities and/or technologies. A better understanding of risk acceptance in occupational settings is also important; however, studies on this topic are scarce. It seems important to understand the level of risk that stakeholders consider sufficiently low, how stakeholders form their opinion about risk, and why they adopt a certain attitude toward risk. Accordingly, the aim of this study is to examine risk acceptance in regard to occupational accidents in the furniture industry. The safety climate analysis was conducted through the application of the Safety Climate in Wood Industries questionnaire. Judgments about risk acceptance, trust, risk perception, benefit perception, emotions, and moral values were measured. Several models were tested to explain occupational risk acceptance. The results showed that the level of risk acceptance decreased as the risk level increased. High-risk and death scenarios were assessed as unacceptable. Risk perception, emotions, and trust had an important influence on risk acceptance. Safety climate was correlated with risk acceptance and with other variables that influence risk acceptance. These results are important for the risk assessment process in terms of defining risk acceptance criteria and strategies to reduce risks.
Abstract:
Over the past decades, several approaches to schedulability analysis have been proposed for both uni-processor and multi-processor real-time systems. Although different techniques are employed, very little has been put forward in the use of formal specifications, with the consequent possibility of misinterpretations or ambiguities in the problem statement. Using a logic-based approach to schedulability analysis in the design of hard real-time systems eases the synthesis of correct-by-construction procedures for both static and dynamic verification processes. In this paper we propose a novel approach to schedulability analysis based on a timed temporal logic with time durations. Our approach subsumes classical methods for uni-processor scheduling analysis over compositional resource models by providing the developer with counter-examples, and by ruling out schedules that cause safety violations in the system. We also provide an example showing the effectiveness of our proposal.
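For context, a minimal sketch of one of the classical uni-processor tests that such an approach subsumes: exact response-time analysis for fixed-priority periodic tasks with deadlines equal to periods; the task set is illustrative, and this is not the paper's logic-based method.

```python
import math

# Classical response-time analysis for fixed-priority scheduling.
# Tasks are (C, T): worst-case execution time and period, highest priority first.
tasks = [(1, 4), (2, 6), (3, 12)]   # illustrative task set

def response_time(i, tasks):
    """Fixed-point iteration R = C_i + sum_{j<i} ceil(R / T_j) * C_j."""
    C_i, T_i = tasks[i]
    R = C_i
    while True:
        R_next = C_i + sum(math.ceil(R / T_j) * C_j for C_j, T_j in tasks[:i])
        if R_next == R:
            return R
        if R_next > T_i:            # deadline (= period) missed
            return None
        R = R_next

for i, (C, T) in enumerate(tasks):
    R = response_time(i, tasks)
    status = f"R={R} <= T={T}" if R is not None else f"unschedulable (R > {T})"
    print(f"task {i}: {status}")
```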
Abstract:
The aim of this research was to evaluate the degree of protein polymorphism among seventy-five C. albicans strains from the oral cavities of healthy children of five socioeconomic categories from eight schools (private and public) in the city of Piracicaba, São Paulo State, in order to identify C. albicans subspecies and their similarities in infantile population groups and to establish their possible dissemination route. Cell cultures were grown in YEPD medium, collected by centrifugation, and washed with cold saline solution. The whole-cell proteins were extracted by cell disruption using glass beads and submitted to the SDS-PAGE technique. After electrophoresis, the protein bands were stained with Coomassie blue and analyzed with the statistics package NTSYS-pc version 1.70. The similarity matrix and dendrogram were generated using the Dice similarity coefficient and the UPGMA algorithm, respectively, which made it possible to evaluate the degrees of similarity or intra-specific polymorphism based on the whole-cell protein fingerprinting of the C. albicans oral isolates. A total of 13 major phenons (clusters) were analyzed according to their homogeneous (same socioeconomic category and/or same school) and heterogeneous (distinct socioeconomic categories and/or schools) characteristics. Regarding the social epidemiological aspect, the cluster composition showed higher similarities (0.788 < SD < 1.0) among C. albicans strains isolated from healthy children independent of their socioeconomic strata (high, medium, or low). Isolates of high similarity were not found in the oral cavities of healthy children of social strata A and D, B and D, or C and E. This may be explained by the absence of a dissemination route among these children. Geographically, some healthy children from identical and different schools (private and public) are also carriers of similar strains, but such similarity was not found among other isolates from children from certain schools. These data may reflect a restricted dissemination route of these microorganisms in some groups of healthy schoolchildren, which may depend on either the socioeconomic category or the geographic site of each child. In contrast to the higher similarity, a lower similarity or higher degree of polymorphism (0.499 < SD < 0.788) of the protein profiles was shown in 23 (30.6%) C. albicans oral isolates. Considering the social epidemiological aspect, 42.1%, 41.7%, 26.6%, 23.5%, and 16.7% of these were isolates from children belonging to socioeconomic categories A, D, C, B, and E, respectively; geographically, 63.6%, 50%, 33.3%, 33.3%, 30%, 25%, and 14.3% were isolates from children from the schools LAE (Liceu Colégio Albert Einstein), MA (E.E.P.S.G. "Prof. Elias de Melo Ayres"), CS (E.E.P.G. "Prof. Carlos Sodero"), AV (Alphaville), HF (E.E.P.S.G. "Honorato Faustino"), FMC (E.E.P.G. "Prof. Francisco Mariano da Costa"), and MEP (E.E.P.S.G. "Prof. Manasses Ephraim Pereira"), respectively. Such results suggest a higher degree of protein polymorphism among some strains isolated from healthy children independent of their socioeconomic strata or geographic sites. Complementary studies, involving the healthy students and their families, teachers, school staff, and hygiene and nutritional habits, must be done in order to establish the sources of such colonization patterns in population groups of healthy children. The whole-cell protein profile obtained by SDS-PAGE, associated with computer-assisted numerical analysis, may provide additional criteria for taxonomic and epidemiological studies of C. albicans.
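A minimal sketch of the numerical-analysis step: computing Dice similarities between binary band profiles and clustering them with UPGMA (average linkage); the presence/absence matrix is illustrative, not the study's data.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.cluster.hierarchy import linkage, dendrogram

# Illustrative presence/absence matrix: rows = isolates, columns = protein bands.
bands = np.array([[1, 1, 0, 1, 0, 1],
                  [1, 1, 0, 1, 1, 1],
                  [0, 1, 1, 0, 1, 0],
                  [0, 1, 1, 0, 1, 1]], dtype=bool)

# Dice distance = 1 - Dice similarity coefficient (SD).
dice_dist = pdist(bands, metric="dice")
print("SD matrix:\n", np.round(1 - squareform(dice_dist), 3))

# UPGMA corresponds to average linkage on the Dice distances.
tree = linkage(dice_dist, method="average")
print(dendrogram(tree, no_plot=True)["ivl"])   # leaf order of the dendrogram
```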
Abstract:
8th International Workshop on Multiple Access Communications (MACOM2015), Helsinki, Finland.
Abstract:
This paper studies the statistical distributions of worldwide earthquakes from 1963 up to 2012. A Cartesian grid, dividing the Earth into geographic regions, is considered. Entropy and the Jensen–Shannon divergence are used to analyze and compare real-world data. Hierarchical clustering and multi-dimensional scaling techniques are adopted for data visualization. Entropy-based indices have the advantage of leading to a single parameter expressing the relationships within the seismic data. Classical and generalized (fractional) entropy and Jensen–Shannon divergence are tested. The generalized measures lead to a clearer identification of patterns embedded in the data and contribute to a better understanding of earthquake distributions.
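A minimal sketch of the classical Jensen–Shannon divergence between the magnitude histograms of two grid cells; the histograms are illustrative, not the earthquake data analyzed in the paper.

```python
import numpy as np
from scipy.stats import entropy

def jensen_shannon(p, q):
    """Classical Jensen-Shannon divergence (base 2) between two
    discrete probability distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    return 0.5 * entropy(p, m, base=2) + 0.5 * entropy(q, m, base=2)

# Illustrative magnitude histograms for two geographic grid cells.
cell_a = [120, 60, 25, 8, 2]    # event counts per magnitude bin
cell_b = [200, 40, 10, 3, 1]
print("JS divergence:", round(jensen_shannon(cell_a, cell_b), 4))
```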
Abstract:
This paper studies periodic gaits of multi-legged locomotion systems based on dynamic models. The purpose is to determine the system performance during walking and the best set of locomotion variables. For that objective, the prescribed motion of the robot is completely characterized in terms of several locomotion variables, such as gait, duty factor, body height, step length, stroke pitch, foot clearance, leg link lengths, foot-hip offset, body and leg masses, and cycle time. In this perspective, we formulate three performance measures of the walking robot, namely the mean absolute energy, the mean power dispersion, and the mean power lost in the joint actuators per walking distance. A set of model-based experiments reveals the influence of the locomotion variables on the proposed indices.
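A minimal sketch of one of the indices, the mean absolute energy per walking distance, computed from joint torque and angular-velocity time series over one gait cycle; the formulation and the signals are assumptions for illustration, not the paper's exact definitions.

```python
import numpy as np

def mean_absolute_energy(tau, omega, t, distance):
    """Mean absolute energy per travelled distance:
    E = (1/d) * sum over joints of integral |tau_j(t) * omega_j(t)| dt
    (a common formulation, assumed here)."""
    power = np.abs(tau * omega)                  # joint-wise absolute mechanical power
    return np.trapz(power, t, axis=1).sum() / distance

# Illustrative two-joint leg over one 1-second gait cycle covering 0.3 m.
t = np.linspace(0.0, 1.0, 200)
tau = np.vstack([5.0 * np.sin(2 * np.pi * t), 2.0 * np.cos(2 * np.pi * t)])    # N*m
omega = np.vstack([1.5 * np.cos(2 * np.pi * t), 0.8 * np.sin(2 * np.pi * t)])  # rad/s
print("E_av =", round(mean_absolute_energy(tau, omega, t, distance=0.3), 2), "J/m")
```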