77 results for information noncooperative game
Abstract:
Until recently, hardly anyone could have predicted this course of GIS development. GIS is moving from the desktop to the cloud. Web 2.0 enabled people to input data into the web, and these data are increasingly geolocated. The resulting large volumes of data form what is now called "Big Data", which scientists still do not fully know how to handle. Various Data Mining tools are used to try to extract useful information from this Big Data. In our study, we deal with one part of these data: User Generated Geographic Content (UGGC). The Panoramio initiative allows people to upload photos and describe them with tags. These photos are geolocated, meaning that they have an exact location on the Earth's surface according to a certain spatial reference system. Using Data Mining tools, we try to answer whether it is possible to extract land use information from Panoramio photo tags, and to what extent this information is accurate. Finally, we compared different Data Mining methods in order to determine which one performs best on this kind of data, which is text. Our answers are quite encouraging. With more than 70% accuracy, we showed that extracting land use information is possible to some extent. We also found the Memory Based Reasoning (MBR) method to be the most suitable for this kind of data in all cases.
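As a rough illustration of the kind of pipeline sketched in this abstract (not the study's actual implementation), the snippet below classifies invented photo-tag strings into land use categories with a bag-of-words representation and a k-nearest-neighbours classifier, the textbook analogue of Memory Based Reasoning; all tags, labels and parameters are assumptions.

# Hypothetical sketch: land use classification from photo tags.
# Tags, labels and parameters are invented, not data from the study.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

tags = [
    "beach sea sand sunset",        # recreation
    "office tower glass downtown",  # commercial
    "wheat field tractor farm",     # agriculture
    "apartment balcony street",     # residential
]
land_use = ["recreation", "commercial", "agriculture", "residential"]

# TF-IDF turns the free-text tags into vectors; the 1-nearest-neighbour classifier
# then assigns the label of the most similar stored example (memory-based reasoning).
model = make_pipeline(TfidfVectorizer(), KNeighborsClassifier(n_neighbors=1))
model.fit(tags, land_use)
print(model.predict(["harbour boats sand"]))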
Abstract:
Nowadays a large share of the population, especially younger users, owns a smartphone, and applications have a great deal of information to provide. Information provision must therefore be done carefully and accurately; otherwise information overload is produced and the user discards the app that provides it. Mobile devices are becoming smarter and offer many ways to filter information, but there are also ways to improve information provision on the application side, for example taking the local time into account, considering the battery level before performing an action, and checking the user's location in order to send personalized information attached to that location. SmartCampus and SmartCities are becoming a reality and integrate more data every day. With such an amount of data it is crucial to decide when and where the user is going to receive a notification with new information. Geofencing is a technique that allows applications to deliver information in a more useful way, at the right time and in the right place. It relies on geofences, physical regions delimited by boundaries, and on devices that are eligible to receive the information assigned to each geofence. When a device crosses one of these geofences, an alert with the information is pushed to the mobile device.
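As a minimal sketch of the trigger behind such alerts (not the thesis implementation), the snippet below tests whether a device's reported position lies inside a circular geofence by comparing the great-circle (haversine) distance to the fence radius; the coordinates and radius are invented example values.

# Hypothetical sketch: is a device inside a circular geofence?
# Coordinates and radius are invented example values.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance between two WGS84 points, in metres.
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def inside_geofence(device, centre, radius_m):
    return haversine_m(*device, *centre) <= radius_m

campus_gate = (38.6609, -9.2056)              # example fence centre
if inside_geofence((38.6611, -9.2050), campus_gate, radius_m=150):
    print("push the notification assigned to this geofence")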
Abstract:
This work models the competitive behaviour of individuals who maximize their own utility by managing their network of connections with other individuals. Utility is taken as a synonym of reputation in this model. Each agent decides on two variables: the quality of connections and the number of connections. Hence, the reputation of an individual is a function of the number and the quality of connections within the network. On the other hand, individuals incur a cost when they improve their network of contacts. The initial value of the quality and number of connections of each individual is distributed according to a given initial distribution. The competition occurs over continuous time and among a continuum of agents. A mean field game approach is adopted to solve the model, leading to an optimal trajectory for the number and quality of connections of each individual.
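For readers unfamiliar with the approach, a generic first-order mean field game of this kind couples a Hamilton-Jacobi-Bellman equation for the individual optimisation with a transport equation for the population distribution; the system below is a standard textbook form, not the exact model of the thesis, with x the state (number and quality of connections), u the value function, m the population distribution, H a Hamiltonian encoding reputation gains net of adjustment costs, and m_0 the given initial distribution:

\begin{aligned}
  -\partial_t u(t,x) + H\bigl(x, \nabla u(t,x), m(t)\bigr) &= 0, & u(T,x) &= g\bigl(x, m(T)\bigr),\\
  \partial_t m(t,x) - \operatorname{div}\!\bigl(m(t,x)\, \nabla_p H(x, \nabla u(t,x), m(t))\bigr) &= 0, & m(0,\cdot) &= m_0 .
\end{aligned}

The optimal trajectory referred to in the abstract would then follow the feedback control \alpha^*(t,x) = -\nabla_p H(x, \nabla u(t,x), m(t)).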
Abstract:
This thesis evaluates a start-up company (Jogos Almirante Lda) whose single asset is a board game named Almirante. It aims to conclude whether it makes sense to create a company or simply to earn copyright royalties. The thesis analyzes the board game market, as part of the general toy market, for which some data exist: European countries as well as the USA. The work then analyzes the several ways of financing a start-up company and presents an overview of the valuation of Jogos Almirante based on three different methods: Discounted Cash Flow, the Venture Capital Method and Real Options.
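As a toy illustration of the first valuation method mentioned (Discounted Cash Flow), and not of the thesis's actual figures, the snippet below discounts a hypothetical stream of free cash flows plus a Gordon-growth terminal value at an assumed discount rate; every number is invented.

# Hypothetical DCF sketch: all cash flows, rates and growth figures are invented.
free_cash_flows = [-20_000, 15_000, 40_000, 60_000]   # forecast years 1-4, EUR
discount_rate = 0.20                                   # assumed cost of capital
terminal_growth = 0.02

# Present value of the explicit forecast period.
pv_explicit = sum(cf / (1 + discount_rate) ** t
                  for t, cf in enumerate(free_cash_flows, start=1))

# Gordon-growth terminal value, discounted back from the last forecast year.
terminal_value = free_cash_flows[-1] * (1 + terminal_growth) / (discount_rate - terminal_growth)
pv_terminal = terminal_value / (1 + discount_rate) ** len(free_cash_flows)

print(f"Enterprise value estimate: {pv_explicit + pv_terminal:,.0f} EUR")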
Abstract:
This work project focuses on developing new approaches to enhance Portuguese exports towards a defined German industry sector within the information technology and electronics fields. First and foremost, information was collected and a set of interviews with experts and top managers was performed in order to understand the demand of the German market while identifying compatible Portuguese supply capabilities. Among the main findings, Industry 4.0 presents itself as a valuable opportunity in the German market for Portuguese medium-sized companies with expertise in embedded systems for machinery and equipment companies. In order to achieve the purpose of the work project, an embedded systems platform targeting machinery and equipment companies was proposed, together with several recommendations on how to implement it. An alternative approach for this platform within the German market was also considered, namely the eHealth sector, with the purpose of enhancing current healthcare service provision.
Abstract:
Online third-party reviews have grown over the last decade and they now play an important role as a tool for helping customers evaluate products and services that in many cases offer more than tangible features. This study quantifies the impact of online ratings on video game sales by conducting a linear regression analysis on 300 titles from the previous console generation (PlayStation® 3 and Xbox® 360), using data from the video game industry to understand the influence that exists in this particular market. The findings show that these variables have a weak linear relationship, suggesting that the quality of a title explains little of the commercial success of a video game and that explanations should instead cover a wider range of factors. We then compare the results with previous ones and discuss the managerial implications for upcoming gaming generations.
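A minimal sketch of the kind of regression described, assuming a hypothetical CSV with one row per title and invented column names 'metascore' and 'units_sold' (the study's actual data and variables are not reproduced here):

# Hypothetical sketch of a ratings-versus-sales regression.
# 'video_game_titles.csv', 'metascore' and 'units_sold' are assumed names.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("video_game_titles.csv")   # one row per title
X = sm.add_constant(df["metascore"])        # intercept + rating
y = df["units_sold"]

model = sm.OLS(y, X).fit()
print(model.summary())   # a low R-squared would indicate a weak linear relationship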
Abstract:
With the growth of the internet and the semantic web, together with improvements in communication speed and the rapid development of storage capacity, the volume of data and information rises considerably every day. Because of this, in the last few years there has been a growing interest in structures for formal representation with suitable characteristics, such as the possibility of organizing data and information, as well as reusing their contents for the generation of new knowledge. Controlled vocabularies, specifically ontologies, stand out as one such representation structure with high potential. They not only allow data to be represented but also allow such data to be reused for knowledge extraction, coupled with subsequent storage through relatively simple formalisms. However, to ensure that the knowledge in an ontology is always up to date, ontologies need maintenance. Ontology Learning is the area that studies the details of updating and maintaining ontologies. The relevant literature already presents first results on the automatic maintenance of ontologies, but still at a very early stage: human-based processes remain the usual way to update and maintain an ontology, which makes this a cumbersome task. The generation of new knowledge for ontology growth can be based on Data Mining techniques, an area that studies techniques for data processing, pattern discovery and knowledge extraction in IT systems. This work proposes a novel semi-automatic method for knowledge extraction from unstructured data sources, using Data Mining techniques, namely pattern discovery, focused on improving the precision of the concepts and semantic relations present in an ontology. In order to verify the applicability of the proposed method, a proof of concept was developed and its results are presented, applied to the building and construction sector.
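By way of illustration only (the thesis's own method is not reproduced here), one very common pattern-based way to propose new concept relations from unstructured text is to match Hearst-style lexical patterns such as "X such as Y", which suggest candidate is-a relations for an ontology engineer to confirm; the sentences below are invented.

# Hypothetical sketch: Hearst-style pattern matching for candidate is-a relations.
# Captures at most two-word phrases; a real pipeline needs tokenisation, lemmatisation, etc.
import re

PATTERN = re.compile(r"(\w+(?:\s\w+)?)\s+such as\s+(\w+(?:\s\w+)?)", re.IGNORECASE)

sentences = [
    "Insulation materials such as mineral wool are widely used in construction.",
    "Fasteners such as anchor bolts connect structural elements.",
]

for s in sentences:
    match = PATTERN.search(s)
    if match:
        hypernym, hyponym = match.group(1), match.group(2)
        # Each match becomes a candidate relation for semi-automatic review.
        print(f"candidate relation: '{hyponym}' is-a '{hypernym}'")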
Abstract:
In the early nineties, Mark Weiser wrote a series of seminal papers that introduced the concept of Ubiquitous Computing. According to Weiser, computers require too much attention from the user, drawing his focus from the tasks at hand. Instead of being the centre of attention, computers should be so natural that they would vanish into the human environment, becoming not only truly pervasive but also effectively invisible and unobtrusive to the user. This calls not only for smaller, cheaper and lower-power computers, but also for equally convenient display solutions that can be harmoniously integrated into our surroundings. With the advent of Printed Electronics, new ways to link the physical and the digital worlds became available. By combining common printing techniques such as inkjet printing with electro-optical functional inks, it is becoming possible not only to mass-produce extremely thin, flexible and cost-effective electronic circuits but also to introduce electronic functionality into products where it was previously unavailable. Indeed, Printed Electronics is enabling the creation of novel sensing and display elements for interactive devices, free of form-factor constraints. At the same time, the increasing availability and affordability of digital fabrication technologies, namely 3D printers, to the average consumer is fostering a new industrial (digital) revolution and the democratisation of innovation. Nowadays, end-users are already able to custom-design and manufacture their own physical products on demand, according to their own needs. In the future, they will be able to fabricate interactive digital devices with user-specific form and functionality from the comfort of their homes. This thesis explores how task-specific, low-computation interactive devices capable of presenting dynamic visual information can be created using Printed Electronics technologies, following an approach based on the ideals behind Personal Fabrication. Focus is given to the use of printed electrochromic displays as a medium for delivering dynamic digital information. According to the architecture of the displays, several approaches are highlighted and categorised. Furthermore, a pictorial computation model based on extended cellular automata principles is used to program dynamic simulation models into matrix-based electrochromic displays. Envisaged applications include the modelling of physical, chemical, biological, and environmental phenomena.
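As an informal illustration of driving a small matrix display with a cellular automaton (the thesis's extended model and display architecture are not reproduced here), the snippet below advances a tiny binary grid by one step of Conway's Game of Life, where each cell would map to one electrochromic pixel; the grid size, rule and initial pattern are arbitrary choices.

# Hypothetical sketch: one cellular-automaton step on a small binary "display" grid.
# Grid size, rule and initial pattern are arbitrary illustration choices.
ROWS, COLS = 6, 8

def step(grid):
    # Next generation under the Game of Life rule, with wrap-around edges.
    nxt = [[0] * COLS for _ in range(ROWS)]
    for r in range(ROWS):
        for c in range(COLS):
            neighbours = sum(
                grid[(r + dr) % ROWS][(c + dc) % COLS]
                for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                if (dr, dc) != (0, 0)
            )
            nxt[r][c] = 1 if neighbours == 3 or (grid[r][c] and neighbours == 2) else 0
    return nxt

# A small glider as the initial frame; each 1 is a pixel switched on.
frame = [[0] * COLS for _ in range(ROWS)]
for r, c in [(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)]:
    frame[r][c] = 1

frame = step(frame)
for row in frame:
    print("".join("#" if px else "." for px in row))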
Abstract:
The introduction of video games as an auxiliary tool in the educational process has grown alongside technological evolution. The increase in processor and graphics card capability allowed an increase in the complexity of video games and, consequently, enriched the experiences they provide. From the perspective of education, video games were not the first tool to be used. Teaching by correspondence was the first aid to the teaching method, initiating the first concept of e-learning, which benefits those who seek more knowledge, those who cannot access it, and those who struggle with traditional teaching. The difficulty that children with neurodevelopmental disorders have in learning the Portuguese language is a problem in today's society, and to tackle it several manual methods are used, without recourse to technology, usually based on handwritten and observation exercises. As a result, the people who accompany the children are obliged to be present to collect data, instead of concentrating on the other educational activities. This can lead to a lack of interest on the child's part in the exercise presented, implying a loss of productivity. The introduction of technology in the service of social causes is crucial, as it allows better monitoring of the children, helping both those who need the support and those who provide it. For example, an automatic system that presents the handwritten exercises on a screen and, at the same time, stores data about their use would be useful for the people in charge of helping children learn the Portuguese language. This dissertation is part of a project developed by DIFERENÇAS – Centro de Desenvolvimento Infantil, called "No Reino dos Fonemas". The project presents children with several images with different objectives, covering the five vowels and all the consonants, in the context of learning the Portuguese language. In this context, children can become interested in a video game and learn at the same time, while the people who accompany them can use the game's data to focus more on the difficulties shown by each child. In this way, it is possible to organize each child's data better and, consequently, to follow their difficulties more closely.
Abstract:
Equity research report
Abstract:
This thesis examines the effects of macroeconomic factors on inflation level and volatility in the Euro Area in order to improve the accuracy of inflation forecasts with econometric modelling. Inflation aggregates for the EU as well as inflation levels of selected countries are analysed, and the differences between these inflation estimates and forecasts are documented. The research proposes alternative models depending on the focus and the scope of the inflation forecasts. I find that models with a Generalized AutoRegressive Conditional Heteroskedasticity (GARCH) in mean process have better explanatory power for inflation variance than regular GARCH models. The significant coefficients differ between individual EU countries and the aggregate EU-wide forecast of inflation. The presence of more pronounced GARCH components in countries with more stressed economies indicates that inflation volatility there is likely to arise as a result of the stressed economy, whereas other economies in the Euro Area are found to exhibit a relatively stable variance of inflation over time. Therefore, when analysing EU inflation one has to take into consideration the large differences at country level and examine the countries one by one.
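For reference, a standard GARCH(1,1)-in-mean specification of the kind referred to above (the thesis's exact regressors are not listed here) lets the conditional variance enter the mean equation:

\begin{aligned}
  \pi_t &= \mu + \lambda \sigma_t^2 + \varepsilon_t, \qquad \varepsilon_t = \sigma_t z_t, \quad z_t \sim \mathcal{N}(0,1),\\
  \sigma_t^2 &= \omega + \alpha \varepsilon_{t-1}^2 + \beta \sigma_{t-1}^2,
\end{aligned}

where \pi_t is inflation, \lambda captures the in-mean (volatility feedback) effect, and \omega > 0, \alpha, \beta \ge 0, \alpha + \beta < 1 keep the conditional variance positive and stationary.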
Abstract:
Information systems are widespread and used by anyone with a computing device, as well as by corporations and governments. It is often the case that security leaks are introduced during the development of an application. The reasons for these security bugs are multiple, but among them one can easily identify that it is very hard to define and enforce relevant security policies in modern software. This is because modern applications often rely on container sharing and multi-tenancy where, for instance, data can be stored in the same physical space but is logically mapped into different security compartments or data structures. In turn, these security compartments, into which data is classified by security policies, can also be dynamic and depend on runtime data. In this thesis we introduce and develop the novel notion of dependent information flow types, focusing on the problem of ensuring data confidentiality in data-centric software. Dependent information flow types fit within the standard framework of dependent type theory but, unlike usual dependent types, crucially allow the security level of a type, rather than just the structural data type itself, to depend on runtime values. Our dependent function and dependent sum information flow types provide a direct, natural and elegant way to express and enforce fine-grained security policies on programs, namely programs that manipulate structured data types in which the security level of a structure field may depend on values dynamically stored in other fields. The main contribution of this work is an efficient analysis that allows programmers to verify, during the development phase, whether programs have information leaks, that is, whether programs protect the confidentiality of the information they manipulate. We also implemented a prototype typechecker, which can be found at http://ctp.di.fct.unl.pt/DIFTprototype/.
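To give an informal flavour of the idea (this is a runtime-checked Python analogue, not the static type system or the prototype's own language), one can picture a record whose field's security level is computed from the value of another field, so that a read is permitted only when the reader's clearance dominates that level; labels, clearances and field names are invented.

# Hypothetical, dynamically-checked analogue of a value-dependent security level.
# Label names, the clearance ordering and the record fields are invented.
CLEARANCE = {"public": 0, "confidential": 1, "secret": 2}

def details_label(record):
    # The security level of 'details' depends on the value stored in 'category'.
    return "secret" if record["category"] == "medical" else "public"

def read_details(record, reader_clearance):
    required = details_label(record)
    if CLEARANCE[reader_clearance] < CLEARANCE[required]:
        raise PermissionError(f"reading 'details' here requires '{required}' clearance")
    return record["details"]

rec = {"category": "medical", "details": "blood test results"}
print(read_details(rec, "secret"))            # allowed
try:
    read_details(rec, "public")               # denied: the level depends on 'category'
except PermissionError as err:
    print("denied:", err)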
Abstract:
Making the transition between plans and unexpected occurrences is something organizations are used to doing every day. However, not much is known about how actors cope with unanticipated events and how they accommodate them within predefined schedules. In this study, we draw on an inductive analysis of aspiring filmmakers' film sets to elaborate on how they plan their shooting activities every day, only to adjust them when unforeseen complications arise. We discover that film crews anchor their expectations for the day in a planned shooting schedule, yet they incorporate a built-in assumption that it will inevitably be disrupted. We argue that they resort to triage processes and "troubleshooting protocols" that help decipher incoming problems. Familiar problems are solved by drawing on experience from past situations, whereas unprecedented problems are solved through a tacit protocol used as a tool to quickly devise an appropriate game plan. This study contributes to the literature on sense-making and provides valuable information about the unexplored world of filmmaking.