911 results for Case Based Computing
Abstract:
Progress in Industrial Ecology, An International Journal, 4(5), pp. 363-381
Abstract:
This paper presents the application of a reinforcement learning algorithm based on Bayes' theorem. The proposed reinforcement learning algorithm is an advantageous and indispensable tool for ALBidS (Adaptive Learning strategic Bidding System), a multi-agent system whose purpose is to provide decision support to electricity market negotiating players. ALBidS uses a set of different strategies for providing decision support to market players. These strategies are used according to their probability of success in each context. The approach proposed in this paper uses a Bayesian network to decide, at each moment, which action is most likely to succeed, depending on past events. The performance of the proposed methodology is tested using electricity market simulations in MASCEM (Multi-Agent Simulator of Competitive Electricity Markets). MASCEM provides the means to simulate a realistic electricity market environment, based on real data from electricity market operators.
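A minimal sketch of context-dependent strategy selection under the idea described above, using a simple Beta-Bernoulli update in place of the paper's Bayesian network; contexts, strategies and the update scheme are illustrative assumptions, not ALBidS code.

```python
from collections import defaultdict

class BayesianStrategySelector:
    """Keeps one Beta(alpha, beta) belief per (context, strategy) pair and
    selects the strategy with the highest expected probability of success."""

    def __init__(self, strategies):
        self.strategies = strategies
        # Beta prior parameters (1, 1) = uniform prior per context/strategy.
        self.alpha = defaultdict(lambda: 1.0)
        self.beta = defaultdict(lambda: 1.0)

    def select(self, context):
        # Choose the strategy whose posterior mean success probability is highest.
        def expected_success(s):
            a, b = self.alpha[(context, s)], self.beta[(context, s)]
            return a / (a + b)
        return max(self.strategies, key=expected_success)

    def update(self, context, strategy, success):
        # Bayesian update of the belief given the observed outcome.
        if success:
            self.alpha[(context, strategy)] += 1.0
        else:
            self.beta[(context, strategy)] += 1.0

# Toy usage: two strategies, one market context.
selector = BayesianStrategySelector(["price_forecast", "competitor_model"])
selector.update("peak_hours", "price_forecast", success=True)
print(selector.select("peak_hours"))
```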
Abstract:
Electricity markets are complex environments, involving a large number of different entities with specific characteristics and objectives, making their decisions and interacting in a dynamic scene. Game theory has been widely used to support decisions in competitive environments; its application to electricity markets is therefore a tool with high potential. This paper proposes a new scenario analysis algorithm, which includes the application of game theory, to evaluate and preview different scenarios and provide players with the ability to react strategically so as to exhibit the behavior that best fits their objectives. The model forecasts competitor players' actions and builds models of their behavior in order to define the most probable expected scenarios. Once the scenarios are defined, game theory is applied to support the choice of the action to be performed. Our use of game theory is intended to support one specific agent, not to reach the market equilibrium. MASCEM (Multi-Agent System for Competitive Electricity Markets) is a multi-agent electricity market simulator that models market players and simulates their operation in the market. The scenario analysis algorithm has been tested within MASCEM, and our experimental findings from a case study based on real data from the Iberian Electricity Market are presented and discussed.
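A minimal sketch of the single-agent scenario-analysis idea: forecast scenarios carry probabilities, and the chosen action maximizes expected payoff rather than a market equilibrium. The prices, probabilities and payoff function below are illustrative assumptions, not MASCEM code.

```python
def expected_payoff(action, scenarios, payoff):
    """Expected payoff of one bidding action over the forecast scenarios."""
    return sum(prob * payoff(action, scenario) for scenario, prob in scenarios)

def best_action(actions, scenarios, payoff):
    """Choose the action that maximizes expected payoff (no equilibrium search)."""
    return max(actions, key=lambda a: expected_payoff(a, scenarios, payoff))

# Toy usage: actions are bid prices; scenarios are (forecast market price, probability).
scenarios = [(48.0, 0.5), (52.0, 0.3), (60.0, 0.2)]
payoff = lambda bid, price: (price - 40.0) if bid <= price else 0.0  # cleared only if bid <= price
print(best_action([45.0, 50.0, 55.0], scenarios, payoff))
```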
Abstract:
This paper presents an electricity medium voltage (MV) customer characterization framework supported by knowledge discovery in databases (KDD). The main idea is to identify typical load profiles (TLP) of MV consumers and to develop a rule set for the automatic classification of new consumers. To achieve this goal, a methodology consisting of several steps is proposed: data pre-processing; application of several clustering algorithms to segment the daily load profiles; selection of the best partition, corresponding to the best consumer segmentation, based on the assessment of several clustering validity indices; and, finally, construction of a classification model based on the resulting clusters. To validate the proposed framework, a case study using a real database of MV consumers is performed.
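A minimal sketch of the clustering-then-classification pipeline, assuming k-means, the silhouette index and a decision tree as stand-ins for the several clustering algorithms, validity indices and classification model evaluated in the paper.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score
from sklearn.tree import DecisionTreeClassifier

def segment_and_classify(daily_profiles, k_range=range(2, 10)):
    """daily_profiles: array of shape (n_consumers, 24) with normalized hourly loads."""
    best_k, best_score, best_labels = None, -1.0, None
    for k in k_range:
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(daily_profiles)
        score = silhouette_score(daily_profiles, labels)  # one possible validity index
        if score > best_score:
            best_k, best_score, best_labels = k, score, labels
    # Train a classifier on the selected partition so that new consumers can be
    # assigned automatically to a typical load profile.
    classifier = DecisionTreeClassifier(max_depth=4).fit(daily_profiles, best_labels)
    return best_k, best_labels, classifier
```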
Abstract:
This paper presents the first phase of the redevelopment of the Electric Vehicle Scenario Simulator (EVeSSi) tool. A new methodology to generate traffic demand scenarios for the Simulation of Urban MObility (SUMO) urban traffic simulation tool is described. This methodology uses a Portuguese census database to generate a synthetic population for a given area under study. A realistic case study of a Portuguese city, Vila Real, is assessed. For this area the road network was created, along with a synthetic population and public transport. Traffic results were obtained and a fleet of electric buses was evaluated, assuming that the current fleet would be replaced in the near future. The energy required to charge the electric fleet overnight was estimated in order to evaluate the impact on the local electricity network.
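A rough sketch of the kind of overnight charging estimate described above; the consumption and charger efficiency figures are illustrative assumptions, not values from the Vila Real case study.

```python
def overnight_charging_energy(daily_km_per_bus,
                              consumption_kwh_per_km=1.3,
                              charger_efficiency=0.9):
    """Rough estimate of the grid energy (kWh) needed to recharge the bus fleet overnight.

    Parameter values are illustrative assumptions, not figures from the study.
    """
    energy_at_battery = sum(daily_km_per_bus) * consumption_kwh_per_km
    return energy_at_battery / charger_efficiency

# Example: a hypothetical fleet of 12 buses, each driving roughly 180 km per day.
print(round(overnight_charging_energy([180.0] * 12), 1))
```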
Abstract:
Demand response programs and models have been developed and implemented to improve the performance of electricity markets, taking full advantage of smart grids. Studying consumers' flexibility and network operation scenarios makes it possible to design improved demand response models and programs. The methodology proposed in this paper addresses the definition of demand response programs that consider demand shifting between periods during multi-period demand response events. The optimization model focuses on minimizing the network and resource operation costs of a Virtual Power Player. Quantum Particle Swarm Optimization is used to obtain solutions for the optimization model, which is applied to a large set of operation scenarios. The implemented case study illustrates the use of the proposed methodology to support the Virtual Power Player's decisions regarding the duration of each demand response event.
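A minimal Quantum Particle Swarm Optimization sketch; the quadratic cost function in the usage example is a toy stand-in for the Virtual Power Player's network and resource operation cost model, which is not specified in the abstract.

```python
import numpy as np

def qpso_minimize(cost, dim, n_particles=30, iters=200, beta=0.75, bounds=(0.0, 1.0)):
    """Minimal QPSO: particles move around a local attractor built from personal
    and global bests, with a quantum-behaved position update."""
    lo, hi = bounds
    x = np.random.uniform(lo, hi, (n_particles, dim))
    pbest = x.copy()
    pbest_cost = np.array([cost(p) for p in pbest])
    gbest = pbest[pbest_cost.argmin()].copy()

    for _ in range(iters):
        mbest = pbest.mean(axis=0)                      # mean of personal bests
        phi = np.random.rand(n_particles, dim)
        attractor = phi * pbest + (1.0 - phi) * gbest   # local attractor per particle
        u = np.random.rand(n_particles, dim)
        sign = np.where(np.random.rand(n_particles, dim) < 0.5, -1.0, 1.0)
        x = np.clip(attractor + sign * beta * np.abs(mbest - x) * np.log(1.0 / u), lo, hi)

        costs = np.array([cost(xi) for xi in x])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = x[improved], costs[improved]
        gbest = pbest[pbest_cost.argmin()].copy()
    return gbest, pbest_cost.min()

# Toy usage: 24 decision variables (e.g. shifted demand per period), quadratic cost.
best, value = qpso_minimize(lambda d: float(np.sum((d - 0.3) ** 2)), dim=24)
```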
Abstract:
A population-based case-control design was used to investigate the association between migration, urbanisation and schistosomiasis in the Metropolitan Region of Recife, Northeast Brazil. 1022 cases and 994 controls, aged 10 to 25, were selected. Natives and migrants from endemic areas have a similar risk of infection. On the other hand, the risk of infection of migrants from non-endemic areas seems to be related to the time elapsed since their arrival in São Lourenço da Mata; those who have been living in that urban area for 5 or more years have a risk of infection similar to that of the natives. Those arriving in the metropolitan region of Recife mostly migrate from the "zona da mata" and "zona do agreste" regions of the state of Pernambuco. Due to changes in the sugar agro-industry and to the increase in the area used for cattle grazing, these workers were driven to villages and cities. The pattern of urbanisation created the conditions for the establishment of foci of transmission in São Lourenço da Mata.
Abstract:
Paper presented at CAPSI 2011 - 11th Conference of the Associação Portuguesa de Sistemas de Informação – Information Management in the Cloud Computing Era, Lisbon, ISEG/IUL-ISCTE, 19-21 October 2011.
Abstract:
With the emergence of a global division of labour, the internationalisation of markets and cultures, the growing power of supranational organisations and the spread of new information technologies to every field of life, a different kind of society begins to appear, distinct from the industrial society and called by many 'the knowledge-based economy', emphasizing the importance of information and knowledge in many areas of work and in the organisation of societies. Despite common trends of evolution, these transformations do not necessarily produce a convergence of national and regional social and economic structures, but rather a diversity of realities emerging from the relations between the economic and political context on one hand and companies and their strategies on the other. In this sense, what future can we expect for the knowledge economy? How can we measure it, and why is it important? This paper presents some results from the European project WORKS – Work organisation and restructuring in the knowledge society (6th Framework Programme), focusing on future visions and possible trends in different countries, sectors and industries, based on empirical evidence from case studies carried out in several European countries, and underlining the importance of foresight exercises to design policies, prevent uncontrolled risks and anticipate alternatives, leading to different 'knowledge economies' rather than to a single 'knowledge economy'.
Abstract:
Based on a poster submitted to CONCORD 2011 - Conference on Corporate R&D: The dynamics of Europe's industrial structure and the growth of innovative firms, Seville, IPTS, 6 October 2011, http://www.eventisimo.com/concord2011/recibido.html
Abstract:
Near-real-time media content personalisation is nowadays a major challenge involving media content sources, distributors and viewers. This paper describes an approach to seamless recommendation, negotiation and transaction of personalised media content. It adopts an integrated view of the problem by proposing, on the business-to-business (B2B) side, a brokerage platform to negotiate the media items on behalf of the media content distributors and sources, providing viewers, on the business-to-consumer (B2C) side, with a personalised electronic programme guide (EPG) containing the set of recommended items after negotiation. In this setup, when a viewer connects, the distributor looks up and invites sources to negotiate the contents of the viewer's personal EPG. The proposed multi-agent brokerage platform is structured in four layers, modelling the registration, service agreement, partner lookup and invitation stages as well as the item recommendation, negotiation and transaction stages of the B2B processes. The recommendation service is a rule-based switching hybrid filter, including six collaborative and two content-based filters. The rule-based system selects, at runtime, the filter(s) to apply as well as the final set of recommendations to present. The filter selection is based on the data available, ranging from the history of items watched to the ratings and/or tags assigned to the items by the viewer. Additionally, this module implements (i) a novel item stereotype to represent newly arrived items, (ii) a standard user stereotype for new users, (iii) a novel passive user tag cloud stereotype for socially passive users, and (iv) a new content-based filter named collinearity and proximity similarity (CPS). At the end of the paper, we present off-line results and a case study describing how the recommendation service works. The proposed system provides, to our knowledge, an excellent holistic solution to the problem of recommending multimedia content.
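A minimal sketch of the rule-based switching idea: the filters to run are chosen at runtime from the data available for the viewer. The rule set and filter names below are illustrative assumptions drawn from the abstract, not the platform's actual rules.

```python
def select_filters(viewer):
    """Return the list of filters to apply, given what is known about the viewer."""
    filters = []
    if not viewer.get("history"):             # new user: fall back to a stereotype
        filters.append("user_stereotype")
    else:
        filters.append("cps_content_based")   # collinearity and proximity similarity
        if viewer.get("ratings"):
            filters.append("collaborative_ratings")
        if viewer.get("tags"):
            filters.append("collaborative_tags")
        else:
            filters.append("passive_user_tag_cloud")  # socially passive viewers
    return filters

# Toy usage: a viewer with a watch history and ratings but no tags.
print(select_filters({"history": ["item42"], "ratings": {"item42": 4}, "tags": {}}))
```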
Abstract:
Based on a retrospective case-control study, we evaluated the score system adopted by the Ministry of Health of Brazil (Ministério da Saúde - MS) to diagnose pulmonary tuberculosis (PTB) in childhood. This system is independent of bacteriological or histopathological data and classifies the diagnosis of tuberculosis as very likely (≥ 40 points), possible (30-35 points) or unlikely (≤ 25 points). Records of HIV-negative children hospitalized at the Instituto de Puericultura e Pediatria Martagão Gesteira of the Federal University of Rio de Janeiro (IPPMG-UFRJ) were reviewed. Patients were matched for age and divided into two groups: 45 subjects in the case group (culture-positive) [mean age = 10.64 months; SD 9.66] and 96 in the control group (culture-negative, with clinical criteria that excluded the disease) [mean age = 11.79 months; SD 11.31]. Among the variables analyzed, radiological status had the greatest impact on the diagnosis (OR = 25.39), followed by exposure to an adult with tuberculosis (OR = 10.67) and a tuberculin skin test > 10 mm (OR = 8.23). The best cut-off point for the diagnosis of PTB was 30 points, at which the score system was most accurate, with a sensitivity of 88.9% and a specificity of 86.5%.
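For illustration only, sensitivity and specificity at a chosen cutoff can be computed from case and control scores as sketched below; the scores here are hypothetical, while the abstract reports 88.9% sensitivity and 86.5% specificity at the 30-point cutoff.

```python
def sensitivity_specificity(scores_cases, scores_controls, cutoff=30):
    """Sensitivity and specificity of a point-score system at a given cutoff."""
    true_pos = sum(s >= cutoff for s in scores_cases)      # cases correctly flagged
    true_neg = sum(s < cutoff for s in scores_controls)    # controls correctly cleared
    return true_pos / len(scores_cases), true_neg / len(scores_controls)

# Hypothetical example with four cases and four controls.
print(sensitivity_specificity([45, 30, 25, 40], [10, 25, 35, 20]))
```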
Abstract:
In recent years, vehicular cloud computing (VCC) has emerged as a new technology used in a wide range of multimedia-based healthcare applications. In VCC, vehicles act as intelligent machines that collect and transfer healthcare data to local or global sites for storage and computation, since vehicles have comparatively limited storage and computation power for handling multimedia files. However, due to dynamic changes in topology and the lack of centralized monitoring points, this information can be altered or misused. Such security breaches can result in disastrous consequences such as loss of life or financial fraud. To address these issues, a learning automata-assisted distributive intrusion detection system based on clustering is designed. Although the proposed scheme can be applied in a number of settings, a multimedia-based healthcare application is used to illustrate it. In the proposed scheme, learning automata (LA) are assumed to be stationed on the vehicles; they take clustering decisions intelligently and select one of the members of the group as a cluster-head. The cluster-heads then assist in the efficient storage and dissemination of information through a cloud-based infrastructure. To secure the proposed scheme from malicious activities, a standard cryptographic technique is used, in which the automaton learns from the environment and takes adaptive decisions to identify any malicious activity in the network. The stochastic environment in which an automaton performs its actions issues rewards and penalties, and the automaton updates its action probability vector after receiving each reinforcement signal. The proposed scheme was evaluated using extensive simulations on ns-2 with SUMO. The results obtained indicate that the proposed scheme yields an improvement of 10% in the detection rate of malicious nodes when compared with existing schemes.
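A minimal linear reward-penalty learning automaton sketch illustrating the action-probability update described above; the learning rates and the specific update scheme are assumptions, not the paper's exact algorithm.

```python
import random

class LearningAutomaton:
    """Keeps an action probability vector, picks actions stochastically, and
    updates the vector after each reward or penalty from the environment."""

    def __init__(self, n_actions, reward_rate=0.1, penalty_rate=0.05):
        self.p = [1.0 / n_actions] * n_actions
        self.a, self.b = reward_rate, penalty_rate

    def choose(self):
        return random.choices(range(len(self.p)), weights=self.p)[0]

    def update(self, action, rewarded):
        n = len(self.p)
        for i in range(n):
            if rewarded:   # shift probability mass towards the rewarded action
                self.p[i] = self.p[i] + self.a * (1 - self.p[i]) if i == action \
                            else (1 - self.a) * self.p[i]
            else:          # spread probability mass away from the penalized action
                self.p[i] = (1 - self.b) * self.p[i] if i == action \
                            else self.b / (n - 1) + (1 - self.b) * self.p[i]

# Toy usage: three candidate cluster-heads; reward the choice if it behaves well.
la = LearningAutomaton(n_actions=3)
choice = la.choose()
la.update(choice, rewarded=True)
```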
Abstract:
A case of massive Ancylostoma sp. larval infestation is presented in a patient who had received systemic corticosteroid therapy. What attracts attention in this case is the exuberance and rarity of the clinical manifestation. Based on the pertinent literature, we discuss the mechanisms of parasitic infection, the natural history of the disease and its treatment.
Abstract:
Disaster management is one of the most relevant application fields of wireless sensor networks. In this application, the role of the sensor network usually consists of obtaining a representation or a model of a physical phenomenon spreading through the affected area. In this work we focus on forest firefighting operations, proposing three fully distributed ways of approximating the actual shape of the fire. In the simplest approach, a circular burnt area is assumed around each node that has detected the fire, and the union of these circles gives the overall shape of the fire. However, as this approach makes intensive use of the wireless sensor network resources, we propose incorporating two in-network aggregation techniques, which do not require considering the complete set of fire detections. The first technique models the fire by means of a complex shape composed of multiple convex hulls representing different burning areas, while the second technique uses a set of arbitrary polygons. Performance evaluation with realistic fire models in computer simulations reveals that the method based on arbitrary polygons improves the accuracy of the fire shape approximation by 20%, while reducing the in-network resource overhead to 10% in the best case.
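A minimal sketch of in-network aggregation of fire detections: a single convex hull of detection coordinates, used here as a simplified stand-in for the paper's multiple convex hulls and arbitrary polygons.

```python
def convex_hull(points):
    """Andrew's monotone chain: convex hull of 2-D fire-detection coordinates."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:                      # build the lower hull left to right
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):            # build the upper hull right to left
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

# Toy usage: sensor nodes that detected fire, as (x, y) coordinates in metres.
print(convex_hull([(0, 0), (10, 0), (10, 10), (0, 10), (5, 5)]))
```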