929 results for Twitter election


Relevance:

10.00%

Publisher:

Abstract:

The ever-increasing popularity of social media makes it a promising source for the personalization of gameplay experiences. Moreover, involving social network friends in a game can greatly enrich the player's satisfaction and also attract new players. This master's thesis describes a social overlay designed for desktop games, called GameNshare, which allows players to easily capture game-related screenshots, videos and stories and share them with multiple social networks. It also provides asynchronous multiplayer game mechanics to directly integrate social network friends into the game. GameNshare was designed to interact with users in a non-intrusive way, leaving them in complete control of what is shared. It prevents unsolicited sharing of messages, a key problem in social media integration tools, through built-in message monitoring and anti-spam measures. GameNshare was specifically designed for players aged 18 to 25 who are regular users of Twitter and Facebook. It was tested by a group of 10 individuals from the target age range, who were surveyed to capture their insights on the use of the social overlay. The implemented GameNshare features were well accepted by the testers, who were also helpful in highlighting features for future development. GameNshare's ultimate goal is to make players seek out social integration and take full advantage of their social communities to improve their gaming experiences.
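The message monitoring and anti-spam measures described above amount to a gate in front of every share action. As a rough illustration only (the abstract does not describe the implementation; the class, method names and the 60-second window below are hypothetical), such a gate might combine duplicate detection with per-network rate limiting:

```python
import time

class ShareMonitor:
    """Toy sketch of an anti-spam gate for a social overlay.

    Blocks a share if the exact text was already posted, or if the
    last share to the same network was too recent."""

    def __init__(self, min_interval_s=60):
        self.min_interval_s = min_interval_s
        self.last_shared = {}     # network name -> timestamp of last share
        self.recent_texts = set() # texts already shared

    def allow(self, network, text, now=None):
        now = time.time() if now is None else now
        # Reject exact duplicates outright.
        if text in self.recent_texts:
            return False
        # Rate-limit per social network.
        last = self.last_shared.get(network)
        if last is not None and now - last < self.min_interval_s:
            return False
        self.last_shared[network] = now
        self.recent_texts.add(text)
        return True
```

A real overlay would also expire old entries and let the player override a block, but these two checks capture the stated goal: no unsolicited or repeated posts.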

Relevance:

10.00%

Publisher:

Abstract:

A Work Project, presented as part of the requirements for the award of a Master's Degree in Economics from the NOVA – School of Business and Economics

Relevance:

10.00%

Publisher:

Abstract:

ABSTRACT: Diabetes Mellitus is a chronic metabolic disease, marked by deficits in the metabolism of carbohydrates, lipids and proteins and resulting from deficiencies in insulin secretion, insulin action, or both, which, if not treated early and properly, may have very serious consequences. Given the worldwide incidence of Diabetes Mellitus, it is highly important to evaluate its whole context and study carefully the criteria to take into consideration. The aim of this thesis is to study, beyond the biochemical parameters related to the disease (glucose and glycated haemoglobin A1c, HbA1c), the results of the last five years (2008-2012) of the PNAEQ interlaboratory trials run by the Department of Epidemiology of the National Institute of Health Dr. Ricardo Jorge. The methodologies used and the interlaboratory variations were also analysed, in order to understand which parameter or parameters are most suitable for diagnosis and control. The study population comprised the Portuguese laboratories, public and private, from mainland Portugal and the islands, plus one laboratory from Angola and one from Macau, that enrolled in PNAEQ over these five years; the sample consisted of the number of participations. In the Clinical Chemistry programme 38 samples were distributed, and in the HbA1c programme 22 samples were distributed. For glucose, performance across the samples as a whole was Excellent; however, whenever the sample concentration was at a pathological level, performance in most trials was lower, namely Good. The method of choice, with the lowest CV%, was the hexokinase method. For HbA1c, performance across the samples as a whole was Excellent, and the method of choice with the lowest CV% was HPLC. The CV% for glucose hovered around 3% from 2010 to 2012, and for HbA1c it was approximately 4.0% in 2012.
HbA1c has proved to be a very useful, important and robust tool for monitoring diabetes, and is nowadays almost always requested in routine analyses of diabetic patients in order to prevent complications that might occur. In the future it may become an important parameter, if not the parameter, for the diagnosis of diabetes. However, even though much work has gone into its standardization, open questions remain, such as what all of its interferents actually are and what the true relationship is between HbA1c and estimated average glucose across all populations and epidemiological studies. The education of both patients and clinicians should also be improved. For now, the oral glucose tolerance test (OGTT) and fasting glucose determinations should continue to be used, and Standard DGS N.º 033/2011 is in line with current needs and the state of the art for this parameter. Implementing estimated average glucose will add value to the monitoring of diabetic patients and should therefore be one of the priorities of this future standardization, making clinical decisions based on it uniform and minimizing the difficulty of interpreting results from laboratory to laboratory.
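The abstract's closing point concerns estimated average glucose (eAG). A common way to derive it, which this thesis may or may not adopt, is the widely cited ADAG regression eAG (mg/dL) = 28.7 * HbA1c (%) - 46.7; the interlaboratory CV% it reports is simply 100 * SD / mean. A minimal sketch of both calculations:

```python
from statistics import mean, stdev

def estimated_average_glucose(hba1c_percent):
    """eAG in mg/dL from HbA1c in %, via the ADAG regression
    eAG = 28.7 * HbA1c - 46.7 (Nathan et al., 2008)."""
    return 28.7 * hba1c_percent - 46.7

def cv_percent(values):
    """Interlaboratory coefficient of variation: 100 * SD / mean."""
    return 100.0 * stdev(values) / mean(values)
```

For example, an HbA1c of 7.0% maps to an eAG of about 154 mg/dL, the kind of patient-friendly number the abstract argues would make monitoring easier to interpret.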

Relevance:

10.00%

Publisher:

Abstract:

Dissertation presented in fulfilment of the requirements for the degree of Master in Communication Sciences – Media and Journalism Studies.

Relevance:

10.00%

Publisher:

Abstract:

Voter education campaigns often aim to increase voter participation and political accountability. We follow randomized interventions implemented nationwide during the 2009 Mozambican elections using a free newspaper, leaflets, and text messaging. We investigate whether treatment effects were transmitted through social networks (kinship and chatting) and geographical proximity. For individuals personally targeted by the campaign, we estimate the reinforcement effect of proximity to other targeted individuals. For untargeted individuals, we estimate the diffusion of the campaign depending on their proximity to targeted individuals. We find evidence for both effects, similar across the different treatments and across the different connectedness measures. We observe that the treatments worked through the networks by raising levels of information about and interest in the election, in line with the average treatment effects of voter education on voter participation. We interpret this result as a free-riding effect, likely to occur for costly actions.

Relevance:

10.00%

Publisher:

Abstract:

A Work Project, presented as part of the requirements for the award of a Master's Degree in Economics from the NOVA – School of Business and Economics

Relevance:

10.00%

Publisher:

Abstract:

A Master's Thesis, presented as part of the requirements for the award of a Research Master's Degree in Economics from the NOVA – School of Business and Economics

Relevance:

10.00%

Publisher:

Abstract:

Dissertation submitted to obtain the degree of Master in Informatics Engineering.

Relevance:

10.00%

Publisher:

Abstract:

With the growing worldwide use of new technologies, such as social networks (Facebook, Twitter, MSN, MySpace, blogs, YouTube, Instagram, among many others within our reach), a problem has also grown: the way we use them to communicate with family, friends and co-workers within our private sphere. We are often unaware that what we publish in the virtual world can have consequences at the professional level, implications that may even lead to dismissal with just cause, that is, without any right to compensation. An example is the case of the employee of the company Esegur who was dismissed over comments made on Facebook, the most widely used social network in Portugal, which infringed the personality rights of colleagues and hierarchical superiors. For such a severe form of dismissal to be possible, however, the following must always be kept in mind: whether the photos, posts or comments published cause pecuniary or non-pecuniary damage to the rights of the company/employer; whether the breach of trust between employee and employer is serious enough to justify dismissal; and whether there are pecuniary losses in the employer's sphere. One must therefore always assess whether, in the specific situation, the behaviour was serious enough to destroy or shake the employer's trust to a degree that justifies so severe a sanction, bearing in mind that in that situation a reasonable employee would have made a completely different choice, one that would not jeopardize the employment relationship. It is easy to see that the principle of proportionality (Art. 18 of the Portuguese Constitution) is fundamental to the assessment of this type of case.
Indeed, it is often quite difficult to choose, among the disciplinary measures available to the employer (Art. 328 of the Labour Code), the one best suited to the seriousness of the act committed by the employee, which will not always be the dismissal of the employee whose behaviour caused the harm. Within the scope of this work, that behaviour is the publication of content on social networks, some instances of which are naturally more serious than others.

Relevance:

10.00%

Publisher:

Abstract:

This project consists of an online platform that allows its users to simultaneously manage their health, physical exercise and diet. The platform is a tool that helps promote the individual's well-being, since it assists in monitoring medication intake and in scheduling appointments and/or exams. It also makes it possible to identify the physical exercise and diet appropriate to the user's state of health, through a physical-fitness test and BMI (IMC), BMR (TMB) and NCD calculators. Jakob Nielsen's usability and accessibility heuristics were considered in the platform's development. The project is built on HTML5, CSS3, PHP, MySQL, JavaScript (jQuery) and the Twitter Bootstrap responsive web design (RWD) framework.
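The BMI (IMC) and BMR (TMB) calculators mentioned above follow standard formulas. The platform itself is written in PHP/JavaScript; the sketch below uses Python for brevity and assumes the usual BMI definition (weight / height squared) and the Mifflin-St Jeor equation for BMR, which may differ from the exact formulas the platform implements:

```python
def bmi(weight_kg, height_m):
    """Body-mass index (IMC): weight in kg divided by height in metres squared."""
    return weight_kg / height_m ** 2

def bmr_mifflin(weight_kg, height_cm, age_years, sex):
    """Basal metabolic rate (TMB) in kcal/day via the Mifflin-St Jeor equation:
    10*kg + 6.25*cm - 5*age, plus 5 for men or minus 161 for women."""
    base = 10.0 * weight_kg + 6.25 * height_cm - 5.0 * age_years
    return base + (5.0 if sex == "male" else -161.0)
```

For a 70 kg, 1.75 m user the BMI is about 22.9, inside the usual normal-weight band, which is the kind of feedback the platform's calculators would surface.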

Relevance:

10.00%

Publisher:

Abstract:

The growth and expansion of social networks have brought new forms of interaction between human beings that carry over into real life. The texts shared on social networks, and the interactions resulting from all virtual activity, have been gaining great impact on everyday life; in the economic and financial sphere, social networks have been the subject of several studies, particularly on predicting and describing the stock market (Zhang, Fuehres, & Gloor, 2011; Bollen, Mao, & Zheng, 2010). In this investigation we examine whether sentiment on Twitter, a microblogging social network, relates directly to the stock market, in order to understand the impact of social networks on financial markets. We thus tried to relate two dimensions, social and financial, so as to understand how the values of one can be used to predict the other. This is an especially interesting topic for companies and investors, as it asks whether what is said about a given company on Twitter can be related to that company's market value. We used two sentiment-analysis techniques, one based on lexical word comparison and another based on machine learning, to determine which of the two classified tweets more accurately into three classes: positive, negative or neutral. The machine-learning model was chosen, and we related its output to stock-market data through a Granger causality test. We found that for certain companies there is a relationship between the two variables, Twitter sentiment and the change in the stock's position between two periods, with the latter depending on the time window over which we aggregate Twitter sentiment.
This study thus aimed to follow up on the work of Bollen, Mao and Zheng (2010), who found that one sentiment dimension (calm) can be used to predict the direction of the stock market, although they rejected a global relationship between general sentiment (positive, negative or neutral) and the stock market. In their work they compared the sentiment of all tweets in a given period, with no exclusions, against the general stock-market index, whereas the methodology adopted in this investigation was applied per company, considering only tweets related to that specific company. With this difference we obtained different results, and certain companies, mainly technology companies, showed a relationship for several combinations. We tested aggregating Twitter sentiment over 3 minutes, 1 hour and 1 day, and some companies only showed a relationship as we increased the time window. This suggests that a company's overall sentiment, particularly for technology companies, is linked to the stock market, with the relationship conditional on the time window being analysed.
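The lexical-comparison baseline described above can be sketched very simply: count a tweet's hits against positive and negative word lists and take the sign of the balance. The word lists below are illustrative only; the abstract does not give the lexicons actually used in the study:

```python
# Hypothetical finance-flavoured lexicons, not the study's actual lists.
POSITIVE = {"up", "gain", "bullish", "strong", "beat"}
NEGATIVE = {"down", "loss", "bearish", "weak", "miss"}

def lexicon_sentiment(tweet):
    """Classify a tweet as positive, negative or neutral by comparing
    its words against fixed sentiment lexicons."""
    words = tweet.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"
```

The study found a trained machine-learning classifier more accurate than this kind of baseline, and it was the classifier's per-company sentiment series that was fed into the Granger causality test against stock movements.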

Relevance:

10.00%

Publisher:

Abstract:

Spatial analysis and social network analysis typically consider social processes in specific contexts of geographic or network space, and research in political science increasingly strives to model heterogeneity and spatial dependence. The primary objective of the current study was to better understand and geographically model the relationship between "non-political" events, streaming data from social networks, and the political climate. Geographic information systems (GIS) are useful tools for organizing and analyzing streaming data from social networks. In this study, geographical and statistical analyses were combined to define the temporal and spatial nature of the data emanating from the popular social network Twitter during the 2014 FIFA World Cup. The study spans the entire globe because Twitter's geotagging function, the source of the fundamental data that makes this study possible, is not limited to any geographic area. By examining public reactions to an inherently non-political event, this study serves to illuminate broader questions about social behavior and spatial dependence. From a practical perspective, the analyses demonstrate how the discussion of political topics fluctuates with football matches. Tableau and RapidMiner, in addition to a set of basic statistical methods, were applied to find patterns in social behavior across space and time in different geographic regions. The study yields some insight into the relationship between an ostensibly non-political event, the World Cup, and public opinion transmitted by social media. The methodology could serve as a prototype for future studies and guide policy makers in governmental and non-governmental organizations in gauging public opinion in certain geographic locations.
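At its core, the spatio-temporal aggregation behind such an analysis reduces to counting geotagged tweets per region per time window. A minimal sketch (the record layout and the one-hour window are assumptions, not details taken from the study):

```python
from collections import Counter
from datetime import datetime

def tweets_per_region_hour(tweets):
    """Count geotagged tweets per (region, hour) bucket: the elementary
    spatio-temporal aggregation behind per-region activity maps.

    `tweets` is an iterable of (region, timestamp) pairs."""
    counts = Counter()
    for region, ts in tweets:
        bucket = ts.replace(minute=0, second=0, microsecond=0)  # floor to the hour
        counts[(region, bucket)] += 1
    return counts
```

Plotting these counts per region against the match schedule is what lets the fluctuation of (political and non-political) discussion be read off directly.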

Relevance:

10.00%

Publisher:

Abstract:

Generating personalized movie recommendations most commonly relies on user-movie ratings. These ratings are generally used either to understand a user's preferences or to recommend movies that users with similar rating patterns have rated highly. However, movie recommenders are often subject to the Cold-Start problem: new movies have not been rated by anyone, so they will not be recommended to anyone; likewise, the preferences of new users who have not rated any movie cannot be learned. In parallel, social-media platforms such as Twitter collect great amounts of user feedback on movies, as these are very popular nowadays. This thesis proposes to explore feedback shared on Twitter to predict the popularity of new movies and shows how it can be used to tackle the Cold-Start problem. At a finer grain, it also proposes to explore the reputation of directors and actors on IMDb for the same purpose. To assess these aspects, a Reputation-enhanced Recommendation Algorithm is implemented and evaluated on a crawled IMDb dataset with previous user ratings of old movies, together with Twitter data crawled from January 2014 to March 2014, to recommend 60 movies affected by the Cold-Start problem. Twitter proved to be a strong reputation predictor, and the Reputation-enhanced Recommendation Algorithm improved over several baseline methods. Additionally, the algorithm also proved useful when recommending movies in an extreme Cold-Start scenario, where both new movies and new users are affected by the Cold-Start problem.
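The abstract does not spell out how reputation enters the recommender, but a standard way to use a reputation signal against the Cold-Start problem is a Bayesian average: shrink a movie's observed mean rating toward a reputation-based prior, so that a movie with no ratings is scored by reputation alone. A generic sketch of that idea, not the thesis's exact algorithm (the parameter `k` and the function name are illustrative):

```python
def cold_start_score(avg_rating, n_ratings, reputation_prior, k=10):
    """Bayesian-average score for a movie.

    avg_rating       -- mean of the ratings received so far (any value if none)
    n_ratings        -- how many ratings the movie has
    reputation_prior -- rating-scale score derived from, e.g., the cast's
                        and director's reputation on Twitter/IMDb
    k                -- pseudo-count controlling how fast observed ratings
                        override the prior

    With n_ratings == 0 (a cold-start movie) the score is the prior alone;
    as ratings accumulate, the observed average dominates."""
    return (n_ratings * avg_rating + k * reputation_prior) / (n_ratings + k)
```

The appeal of this form is that it needs no switch between "cold" and "warm" movies: the same formula degrades gracefully from pure reputation to pure ratings.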

Relevance:

10.00%

Publisher:

Abstract:

This Work Project investigates the determinants of reelection using data on the 278 mainland Portuguese municipalities for the period 1976-2009. We implement a fixed-effects logit model to control for unobserved municipal characteristics that remain constant over time. Political variables, such as the vote share of the incumbent's party in the previous election, the number of the mayor's consecutive mandates and the abstention rate, are found to be relevant in explaining the incumbent's reelection. Moreover, among the mayor's individual characteristics, age and education contribute to explaining reelection prospects. We also provide weak evidence that a higher degree of fiscal autonomy increases political turnover and that good economic prospects for the municipality positively affect reelection. Finally, the residents' level of education and the size of the municipal population have explanatory power over the mayor's reelection. We perform several robustness checks to confirm these results.

Relevance:

10.00%

Publisher:

Abstract:

Economics is a social science and therefore focuses on people and the decisions they make, be it in an individual context or in group situations. It studies human choices in the face of needs to be fulfilled and a limited amount of resources with which to fulfill them. For a long time there was a convergence between the normative and positive views of human behavior, in that the ideal and predicted decisions of agents in economic models were entangled in a single concept. That is, it was assumed that the best that could be done in each situation was exactly the choice that would prevail, or at least that the facts economics needed to explain could be understood in the light of models in which individual agents act as if they were able to make ideal decisions. In the last decades, however, the complexity of the environment in which economic decisions are made and the limits on agents' ability to deal with it have been recognized and incorporated into models of decision making, in what came to be known as the bounded rationality paradigm. This was triggered by the incapacity of the unbounded rationality paradigm to explain observed phenomena and behavior. This thesis contributes to the literature in three ways. Chapter 1 is a survey on bounded rationality, which gathers and organizes the contributions to the field since Simon (1955) first recognized the necessity of accounting for the limits on human rationality. The survey focuses on theoretical work rather than on the experimental literature that presents evidence of actual behavior differing from what classic rationality predicts. The general framework is as follows: given a set of exogenous variables, the economic agent must choose an element from the choice set available to him in order to optimize the expected value of an objective function (assuming his preferences are representable by such a function).
If this problem is too complex for the agent to deal with, one or more of its elements is simplified. Each bounded rationality theory is categorized according to the most relevant element it simplifies. Chapter 2 proposes a novel theory of bounded rationality. Much in the same fashion as Conlisk (1980) and Gabaix (2014), we assume that thinking is costly, in the sense that agents must pay a cost for performing mental operations. In our model, if they choose not to think, that cost is avoided, but they are left with a single alternative, labeled the default choice. We exemplify the idea with a very simple model of consumer choice and identify the concept of isofin curves, i.e., sets of default choices which generate the same utility net of thinking cost. We then apply the idea to a linear symmetric Cournot duopoly, in which the default choice can be interpreted as the most natural quantity to produce in the market. We find that, as the thinking cost increases, the number of firms thinking in equilibrium decreases. More interestingly, for intermediate levels of thinking cost, there exists an equilibrium in which one of the firms chooses the default quantity and the other best responds to it, generating asymmetric choices in a symmetric model. Our model is able to explain well-known regularities identified in the Cournot experimental literature, such as the adoption of different strategies by players (Huck et al., 1999), the intertemporal rigidity of choices (Bosch-Domènech & Vriend, 2003) and the dispersion of quantities in the context of difficult decision making (Bosch-Domènech & Vriend, 2003). Chapter 3 applies a model of bounded rationality in a game-theoretic setting to the well-known turnout paradox: in large elections, pivotal probabilities vanish very quickly and no one should vote, in sharp contrast with the observed high levels of turnout.
Inspired by the concept of rhizomatic thinking, introduced by Bravo-Furtado & Côrte-Real (2009a), we assume that each person is self-delusional in the sense that, when making a decision, she believes that a fraction of the people who support the same party decide alike, even if no communication is established between them. This kind of belief simplifies the agent's decision, as it reduces the number of players he believes he is playing against; it is thus a bounded rationality approach. Studying a two-party first-past-the-post election with a continuum of self-delusional agents, we show that the turnout rate is positive in all possible equilibria and can be as high as 100%. The game displays multiple equilibria, at least one of which entails a victory for the bigger party. The smaller party may also win, provided its relative size is not too small; more self-delusional voters in the minority party decrease this threshold size. Our model is able to explain some empirical facts, such as the possibility that a close election leads to low turnout (Geys, 2006), a lower margin of victory when turnout is higher (Geys, 2006) and high turnout rates favoring the minority (Bernhagen & Marsh, 1997).
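The Cournot result in Chapter 2 can be illustrated numerically. For inverse demand P = a - b*(q1 + q2) and unit cost c, a firm's best response to a rival quantity q is (a - c - b*q) / (2*b); a firm "thinks" only if the profit gain from best-responding to the default quantity exceeds the thinking cost. The sketch below uses illustrative demand parameters, not values from the thesis:

```python
def best_response(q_other, a=12.0, b=1.0, c=0.0):
    """Cournot best response under inverse demand P = a - b*(q1 + q2), unit cost c."""
    return max(0.0, (a - c - b * q_other) / (2 * b))

def profit(q, q_other, a=12.0, b=1.0, c=0.0):
    """Profit of a firm producing q while the rival produces q_other."""
    return (a - b * (q + q_other) - c) * q

def thinking_pays(q_default, thinking_cost, a=12.0, b=1.0, c=0.0):
    """True if best-responding to the default quantity beats playing the
    default itself, net of the cost of thinking."""
    q_star = best_response(q_default, a, b, c)
    gain = profit(q_star, q_default, a, b, c) - profit(q_default, q_default, a, b, c)
    return gain > thinking_cost
```

With a = 12, b = 1, c = 0 and a default quantity of 3, the best response is 4.5 for a profit gain of 2.25, so a firm thinks when the cost is below 2.25 and plays the default when it is above: the asymmetric pattern (one thinker, one defaulter) the chapter finds at intermediate thinking costs.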