982 results for Applied Computing
Abstract:
Artificial Intelligence has been applied to dynamic games for many years. The ultimate goal is to create virtual entities whose behavior displays human-like reasoning. However, virtual entities that can be mistaken for real persons are still far from being fully achieved. This paper presents an adaptive-learning-based methodology for defining players' profiles, with the purpose of supporting the decisions of virtual entities. The proposed methodology is based on reinforcement learning algorithms, which are responsible for choosing, over time and as experience is gathered, the most appropriate from a set of different learning approaches. These learning approaches have very distinct natures, ranging from mathematical to artificial intelligence and data analysis methodologies, so that the methodology is prepared for very distinct situations. In this way it is equipped with a variety of tools, each of which can be useful in a particular situation. The proposed methodology is first tested on two simpler computer-versus-human games: the rock-paper-scissors game and a penalty-shootout simulation. Finally, the methodology is applied to the definition of action profiles of electricity market players, who compete in a dynamic, game-like environment in which the main goal is to achieve the highest possible profits in the market.
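As a rough illustration of the idea of an agent that learns, with experience, which of several approaches to trust, the following sketch (not the paper's actual methodology) uses a simple epsilon-greedy reinforcement-learning rule to pick among three basic rock-paper-scissors predictors; all strategy names and parameters are hypothetical.

```python
import random

MOVES = ["rock", "paper", "scissors"]
BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}   # key beats value
COUNTER = {v: k for k, v in BEATS.items()}                           # move -> move that beats it

def predict_random(history):
    return random.choice(MOVES)

def predict_repeat(history):
    # Assume the opponent repeats their previous move.
    return history[-1] if history else random.choice(MOVES)

def predict_most_frequent(history):
    return max(MOVES, key=history.count) if history else random.choice(MOVES)

STRATEGIES = [predict_random, predict_repeat, predict_most_frequent]

def play(n_rounds, opponent_move, epsilon=0.1):
    """Epsilon-greedy selection of the predictor with the best average reward so far."""
    rewards = [0.0] * len(STRATEGIES)
    counts = [1e-9] * len(STRATEGIES)
    history = []                                   # opponent moves seen so far
    for _ in range(n_rounds):
        if random.random() < epsilon:
            k = random.randrange(len(STRATEGIES))  # explore
        else:
            k = max(range(len(STRATEGIES)), key=lambda i: rewards[i] / counts[i])
        move = COUNTER[STRATEGIES[k](history)]     # play the counter to the predicted move
        opp = opponent_move(history)
        reward = 1.0 if BEATS[move] == opp else (0.0 if move == opp else -1.0)
        rewards[k] += reward
        counts[k] += 1
        history.append(opp)
    return rewards
```

Against, say, an opponent that always plays rock (`play(500, lambda h: "rock")`), the selector quickly concentrates on the predictors that counter such a constant opponent.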
Abstract:
This paper presents a modified Particle Swarm Optimization (PSO) methodology to solve the problem of energy resource management with high penetration of distributed generation and Electric Vehicles (EVs) with gridable (V2G) capability. The objective of the day-ahead scheduling problem in this work is to minimize operation costs, namely energy costs, in the management of these resources in the smart grid context. The modifications applied to the PSO aim to improve its suitability for this problem. The proposed Application Specific Modified Particle Swarm Optimization (ASMPSO) includes an intelligent mechanism to adjust velocity limits during the search process, as well as self-parameterization of the PSO parameters, making it more user-independent. It presents better robustness, convergence characteristics and constraint handling than the tested PSO variants. This enables its use for addressing real-world large-scale problems in much shorter times than deterministic methods, providing system operators with adequate decision support and achieving efficient resource scheduling, even when a significant number of alternative scenarios must be considered. The paper includes two realistic case studies with different penetrations of gridable vehicles (1000 and 2000). The proposed methodology is about 2600 times faster than the Mixed-Integer Non-Linear Programming (MINLP) reference technique, reducing the time required from 25 h to 36 s for the scenario with 2000 vehicles, with about a one percent difference in the objective function cost value.
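For readers unfamiliar with PSO, the following minimal sketch (not the ASMPSO itself) shows a basic particle swarm loop in which the velocity limit is tightened as the search proceeds, in the spirit of the adaptive velocity-limit mechanism mentioned above; the cost function, bounds and coefficients are illustrative assumptions.

```python
import numpy as np

def pso(cost, dim, n_particles=30, iters=200, bounds=(-10.0, 10.0)):
    lo, hi = bounds
    rng = np.random.default_rng(0)
    x = rng.uniform(lo, hi, (n_particles, dim))        # particle positions
    v = np.zeros((n_particles, dim))                    # particle velocities
    pbest, pbest_cost = x.copy(), np.apply_along_axis(cost, 1, x)
    gbest = pbest[np.argmin(pbest_cost)].copy()
    w, c1, c2 = 0.7, 1.5, 1.5                           # inertia and acceleration coefficients
    for t in range(iters):
        vmax = (hi - lo) * (1.0 - 0.9 * t / iters)      # velocity limit shrinks over the run
        r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        v = np.clip(v, -vmax, vmax)                     # enforce the (adaptive) velocity limit
        x = np.clip(x + v, lo, hi)
        costs = np.apply_along_axis(cost, 1, x)
        better = costs < pbest_cost
        pbest[better], pbest_cost[better] = x[better], costs[better]
        gbest = pbest[np.argmin(pbest_cost)].copy()
    return gbest, pbest_cost.min()

# Example: minimise the sphere function in 5 dimensions.
best, best_cost = pso(lambda z: float(np.sum(z**2)), dim=5)
```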
Abstract:
Thesis submitted in fulfillment of the requirements for the Degree of Master in Biomedical Engineering.
Abstract:
Systematics is the study of the diversity of organisms and their relationships, comprising classification, nomenclature and identification. Classification, or taxonomy, means the arrangement of organisms in groups (taxa); nomenclature is the attribution of correct international scientific names to organisms; and identification is the assignment of unknown strains to groups derived from classification. Therefore, classification is a prerequisite for a stable nomenclature and a correct identification. The beginning of the new era of bacterial systematics can be traced to the introduction and application of new taxonomic concepts and techniques from the 1950s and 1960s onwards. Important progress was achieved using numerical taxonomy and molecular taxonomy. Molecular taxonomy, brought into effect after the emergence of Molecular Biology resources, provided knowledge relevant to the systematics of bacteria wherever there is great evolutionary interest or a need to eliminate environmental interference. Studying the composition and arrangement of nucleotides in certain portions of the genetic material means studying the genome itself, which is much less susceptible to environmental alterations than the proteins encoded from it. In molecular taxonomy, both DNA and RNA can be investigated, and the main techniques that have been used in systematics comprise the construction of restriction maps, DNA-DNA hybridization, DNA-RNA hybridization, DNA sequencing, sequencing of the 16S and 23S rRNA subunits, RAPD, RFLP, PFGE, etc. Techniques such as base sequencing, although extremely sensitive and highly precise, are relatively onerous and impracticable for the great majority of bacterial taxonomy laboratories. Several specialized techniques have been applied to taxonomic studies of microorganisms. In recent years, these have included preliminary electrophoretic analysis of soluble proteins and isoenzymes and, subsequently, determination of deoxyribonucleic acid base composition and assessment of base sequence homology by means of DNA-RNA hybridization experiments, among others. These various techniques, as expected, have generally revealed a lack of taxonomic information in microbial systematics. There are numerous techniques and methodologies, some of them described here, that make the identification and classification of bacteria possible, allowing different degrees of subspecific and interspecific similarity to be established through the analysis of phenetic-genetic polymorphism. However, the need to use more than one technique to better establish degrees of similarity among microorganisms has been pointed out. Data resulting from the application of a single technique in isolation may not provide significant information from the viewpoint of Bacterial Systematics.
Abstract:
Dissertation presented to obtain the Degree of Doctor in Informatics from the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia.
Abstract:
Harnessing the idle CPU cycles, storage space and other resources of networked computers for collaborative work is the main focus of all major grid computing research projects. Most university computer labs are nowadays equipped with powerful desktop PCs. It is easy to observe that, most of the time, these machines are lying idle or wasting their computing power without being used in useful ways. However, complex problems and the analysis of very large amounts of data require large computational resources. For such problems, one may run the analysis algorithms on very powerful and expensive computers, which reduces the number of users that can afford such data analysis tasks. Instead of using single expensive machines, distributed computing systems offer the possibility of using a set of much less expensive machines to do the same task. The BOINC and Condor projects have been successfully used to solve real scientific research problems around the world at low cost. The main goal of this work is to explore both distributed computing platforms, Condor and BOINC, and to use their power to harness idle PC resources for academic researchers to use in their research work. In this thesis, data mining tasks were performed by implementing several machine learning algorithms on the distributed computing environment.
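One common pattern with Condor- or BOINC-style pools is to split a data-mining experiment into many independent jobs indexed by a task number; the sketch below is a hypothetical illustration of that pattern, not the thesis implementation, and the algorithm names are placeholders.

```python
import sys
from itertools import product

# Hypothetical experiment grid: each (algorithm, fold) pair becomes one independent job,
# so a pool of idle lab machines can run them in parallel.
ALGORITHMS = ["decision_tree", "naive_bayes", "svm"]
FOLDS = range(10)
TASKS = list(product(ALGORITHMS, FOLDS))

def run_task(algorithm, fold):
    # Placeholder for training/evaluating one algorithm on one cross-validation fold.
    print(f"training {algorithm} on fold {fold}")

if __name__ == "__main__":
    # The pool passes a distinct task index to each job (e.g. Condor's $(Process) macro).
    algorithm, fold = TASKS[int(sys.argv[1])]
    run_task(algorithm, fold)
```

Submitted this way, the thirty (algorithm, fold) combinations run concurrently across whatever machines happen to be idle.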
Abstract:
Communication presented at CAPSI 2011 – 11th Conference of the Associação Portuguesa de Sistemas de Informação (Portuguese Association for Information Systems) – Information Management in the Cloud Computing Era, Lisbon, ISEG/IUL-ISCTE, 19-21 October 2011.
Abstract:
The development of applications for mobile devices is no longer a recent area, yet it continues to grow at a fast pace. The technological progress of recent years and the growing popularity of these devices are evident. This progress is due not only to the great evolution of the devices' characteristics, but also to the possibility of creating innovative, practical applications capable of solving the problems of users in general. In this sense, everyday needs demand solutions that satisfy users, and nowadays that satisfaction often involves mobile devices, which already play a fundamental role in people's lives. Given the increase in the number of child abductions and the insecurity observed today, which make it harder for parents and caregivers to keep their children safe, it is relevant to create a new tool capable of assisting them in this difficult task. This master's dissertation arises from that reality and aims to address the aspects mentioned above. It covers the study and implementation carried out to develop a child monitoring system. The goal of the project is to develop a native Android application and a back-end that uses a NoSQL database server to store the information, applying the concepts studied and the existing technologies. The solution's main premises are: being as user-friendly as possible, optimization, scalability to other situations (other types of monitoring) and the use of the most recent technologies. Accordingly, one of the topics studied in most depth in this dissertation is NoSQL databases, given their importance to the project.
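The dissertation does not name the specific NoSQL engine, so the sketch below assumes MongoDB (via pymongo) purely to illustrate the kind of document a child-monitoring back-end might store for each position update sent by the Android application; identifiers and coordinates are made up.

```python
from datetime import datetime, timezone
from pymongo import MongoClient

# Assumed MongoDB deployment and collection names; these are not taken from the dissertation.
client = MongoClient("mongodb://localhost:27017")
db = client["child_monitoring"]

# One schemaless document per position update received from the mobile application.
db.locations.insert_one({
    "child_id": "example-child-id",      # hypothetical identifier
    "latitude": 41.1496,                 # made-up coordinates
    "longitude": -8.6109,
    "timestamp": datetime.now(timezone.utc),
})
```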
Abstract:
Dissertation submitted in partial fulfillment of the requirements for the degree of Master in Statistics and Information Management.
Abstract:
Near-real-time personalisation of media content is nowadays a major challenge involving media content sources, distributors and viewers. This paper describes an approach to the seamless recommendation, negotiation and transaction of personalised media content. It adopts an integrated view of the problem by proposing, on the business-to-business (B2B) side, a brokerage platform that negotiates media items on behalf of the media content distributors and sources, and by providing viewers, on the business-to-consumer (B2C) side, with a personalised electronic programme guide (EPG) containing the set of items recommended after negotiation. In this setup, when a viewer connects, the distributor looks up and invites sources to negotiate the contents of the viewer's personal EPG. The proposed multi-agent brokerage platform is structured in four layers, modelling the registration, service agreement, partner lookup and invitation stages, as well as the item recommendation, negotiation and transaction stages, of the B2B processes. The recommendation service is a rule-based switch hybrid filter comprising six collaborative and two content-based filters. The rule-based system selects, at runtime, the filter(s) to apply as well as the final set of recommendations to present. The filter selection is based on the data available, ranging from the history of items watched to the ratings and/or tags assigned to the items by the viewer. Additionally, this module implements (i) a novel item stereotype to represent newly arrived items, (ii) a standard user stereotype for new users, (iii) a novel passive-user tag-cloud stereotype for socially passive users, and (iv) a new content-based filter named collinearity and proximity similarity (CPS). At the end of the paper, we present off-line results and a case study describing how the recommendation service works. The proposed system provides, to our knowledge, an excellent holistic solution to the problem of recommending multimedia content.
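The CPS measure itself is not detailed here, so the sketch below shows a generic content-based filter using plain cosine similarity over tag vectors, merely to illustrate how a content-based component of such a hybrid recommender might score items; the data structures and tag weights are assumptions, not the platform's.

```python
import math

def cosine(a, b):
    """Cosine similarity between two tag-weight dictionaries."""
    common = set(a) & set(b)
    num = sum(a[t] * b[t] for t in common)
    den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def recommend(profile_tags, catalogue, k=3):
    """Rank catalogue items by similarity to the viewer's tag profile and return the top k."""
    return sorted(catalogue, key=lambda item: cosine(profile_tags, item["tags"]), reverse=True)[:k]

catalogue = [
    {"title": "News", "tags": {"politics": 2, "live": 1}},
    {"title": "Match", "tags": {"sport": 3, "live": 1}},
]
print(recommend({"sport": 2, "live": 1}, catalogue, k=1))   # -> the sports item
```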
Abstract:
Biosensors have opened new horizons in biomedical analysis by ensuring increased assay speed and flexibility, and by allowing point-of-care applications, multi-target analyses, automation and reduced costs of testing. This has resulted from many studies merging nanotechnology with biochemistry over the years, thereby enabling the creation of environments more suitable for biological receptors and their substitution by synthetic analogue materials. Sol-gel chemistry, among others, is deeply involved in this process. Sol-gel processing allows the immobilization of organic molecules, biomacromolecules and cells while maintaining their properties and activities, permitting their integration into different transduction devices, of electrochemical or optical nature, for single or multiple analyses. Sol-gel also allows the production of synthetic materials that mimic the activity of natural receptors, while bringing advantages, mostly in terms of cost and stability. Moreover, the biocompatibility of sol-gel materials with structures of biological nature has allowed the use of these materials in emerging in vivo applications. In this chapter, biosensors for biomedical applications based on sol-gel derived composites are presented, compared and described, along with current emerging in vivo applications concerning drug delivery and biomaterials. Sol-gel materials are shown to be a promising tool for current, emerging and future medical applications.
Abstract:
This paper studies several topics related to the concept of “fractional” that are not directly related to Fractional Calculus, but can help the reader in pursuing new research directions. We introduce the concepts of non-integer positional number systems, fractional sums, fractional powers of a square matrix, tolerant computing and FracSets, negative probabilities, fractional delay discrete-time linear systems, and the fractional Fourier transform.
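As an example of one of the surveyed notions, a fractional power of a diagonalizable square matrix can be defined through its eigendecomposition; this is a common textbook definition given here as a sketch (branch choices for negative or complex eigenvalues are glossed over), not necessarily the formulation used in the paper.

```latex
% Fractional power of a diagonalizable matrix A via its eigendecomposition.
A = V \Lambda V^{-1}, \qquad
A^{\alpha} = V \Lambda^{\alpha} V^{-1}
           = V \,\mathrm{diag}\!\left(\lambda_1^{\alpha}, \ldots, \lambda_n^{\alpha}\right) V^{-1},
\qquad \alpha \in \mathbb{R}.
```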
Abstract:
This article presents a framework for an Industrial Engineering and Management Science course at the School of Management and Industrial Studies that uses Autonomous Ground Vehicles (AGVs) to supply materials to a production line, serving as an experimental setup for students to acquire knowledge in the production robotics area. The students must be able to understand and put to good use several concepts that will be of the utmost importance in their professional life, such as critical decisions regarding the study, development and implementation of a production line. The main focus is a production line using AGVs, where the students are required to address several topics, such as sensors, actuators, controllers and high-level management and optimization software. The presented framework brings to the robotics teaching community methodologies that allow students from different backgrounds, who normally do not experiment with robotics concepts in practice due to the large gap between theory and practice, to go straight to "making" robotics. Our aim was to lower the entry barrier, allowing any student to fully experience robotics with little background knowledge.
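As a toy illustration of the kind of low-level control students might meet in such a setup, the following sketch (not the course's actual software) implements a proportional steering correction for an AGV following a guide line; the gain, time step and sensor model are assumptions.

```python
def steering_correction(lateral_error, kp=1.2):
    """Proportional steering for an AGV tracking a guide line.
    lateral_error: deviation from the line in metres, as reported by the line sensor."""
    return -kp * lateral_error

# Toy simulation: the AGV starts 0.2 m off the line and converges back onto it.
error = 0.2
for step in range(10):
    error += 0.1 * steering_correction(error)   # 0.1 is an assumed (time step x speed) factor
    print(f"step {step}: offset {error:.3f} m")
```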
Abstract:
Thick smears of human feces can be made adequate for identification of helminth eggs by means of refractive index matching. Although this effect can be obtained by simply spreading a fleck of feces on a microscope slide, a glycerol solution has been routinely used to this end. Aiming at practicability, a new quantitative technique has been developed. To enhance both sharpness and contrast of the images, a sucrose solution (refractive index = 1.49) is used, which reduces the effect of light-scattering particulates. To each slide a template-measured (38.5 mm³) fecal sample is transferred. Thus, egg counts and sensitivity evaluations are easily made.
Abstract:
In this paper, we apply the following four power indices to the Portuguese Parliament: the Shapley–Shubik index, the Banzhaf index, the Deegan–Packel index and the Public Good Index. We also present the main notions related to simple games and discuss the features of each power index by means of their axiomatic characterizations.
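To make the computation concrete, the sketch below computes the normalized Banzhaf index of a small weighted voting game by enumerating coalitions; the seat weights and quota are hypothetical and do not correspond to the actual composition of the Portuguese Parliament.

```python
from itertools import combinations

def normalized_banzhaf(weights, quota):
    """Normalized Banzhaf index for the weighted voting game [quota; weights]."""
    n = len(weights)
    swings = [0] * n
    for r in range(1, n + 1):
        for coalition in combinations(range(n), r):
            total = sum(weights[i] for i in coalition)
            if total >= quota:                            # winning coalition
                for i in coalition:
                    if total - weights[i] < quota:        # player i is critical (a "swing")
                        swings[i] += 1
    s = sum(swings)
    return [c / s for c in swings]

# Hypothetical seat distribution among four parties, with a simple-majority quota.
print(normalized_banzhaf([4, 3, 2, 1], quota=6))
```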