11 results for 380303 Computer Perception, Memory and Attention
in the Digital Repository of Fundação Getúlio Vargas - FGV
Abstract:
Chambers (1998) explores the interaction between long memory and aggregation. For continuous-time processes, he takes the aliasing effect into account when studying temporal aggregation. For discrete-time processes, however, he appears to fail to do so. This note gives the spectral density function of temporally aggregated long-memory discrete-time processes in light of the aliasing effect. The results differ from those in Chambers (1998) and are supported by a small simulation exercise. As a consequence, the order of integration may not be invariant to temporal aggregation, particularly when d is negative and the aggregation is of the stock type.
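The claim lends itself to a quick numerical check. The sketch below (Python, written for this listing and not the note's actual code) simulates an ARFIMA(0, d, 0) series with an illustrative d = -0.3, forms stock-type (skip-sampled) and flow-type (block-sum) aggregates with an assumed aggregation period of 3, and compares GPH log-periodogram estimates of d before and after aggregation; all parameter values are assumptions chosen for illustration.

import numpy as np

rng = np.random.default_rng(0)

def arfima0d0(n, d, burn=2000):
    """Simulate ARFIMA(0, d, 0) via a truncated MA(infinity) representation."""
    j = np.arange(1, n + burn)
    psi = np.concatenate(([1.0], np.cumprod((j - 1 + d) / j)))  # MA weights psi_j
    eps = rng.standard_normal(n + burn + len(psi))
    return np.convolve(eps, psi, mode="valid")[-n:]

def gph(x, bandwidth=None):
    """Geweke-Porter-Hudak log-periodogram estimate of the memory parameter d."""
    n = len(x)
    m = bandwidth or int(np.sqrt(n))
    lam = 2 * np.pi * np.arange(1, m + 1) / n
    I = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / (2 * np.pi * n)
    X = np.log(4 * np.sin(lam / 2) ** 2)
    return -np.polyfit(X, np.log(I), 1)[0]  # regression slope is -d

n, d, m = 8_000, -0.3, 3  # illustrative values, not taken from the note
x = arfima0d0(n, d)
stock = x[::m]                                   # stock aggregation: skip sampling
flow = x[:n - n % m].reshape(-1, m).sum(axis=1)  # flow aggregation: block sums

print(f"d estimate, disaggregated series: {gph(x):+.3f}")
print(f"d estimate, stock aggregate:      {gph(stock):+.3f}")
print(f"d estimate, flow aggregate:       {gph(flow):+.3f}")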
Abstract:
This paper investigates the relationship between memory and the essentiality of money. We consider a random matching economy with a large finite population in which commitment is not possible and memory is limited in the sense that only a fraction m ∈ (0, 1) of the population has publicly observable histories. We show that no matter how limited memory is, there exists a social norm that achieves the first best regardless of the population size. In other words, money can fail to be essential irrespective of the amount of memory in the economy. This suggests that the emphasis on limited memory as a fundamental friction for money to be essential deserves a deeper examination.
Abstract:
Event marketing is a common promotional strategy that involves direct contact between brands and consumers at special events, namely concerts, festivals, sporting events and fairs. Brands have been investing in sponsorship as a means of associating themselves with particular events, essentially with the goal of enhancing brand image and brand awareness. Interestingly, the response of consumers to event marketing is not yet fully understood. This dissertation fills this gap. More specifically, it intends to determine the extent to which sponsoring brands at events favors brand awareness (recall and recognition) and how it relates to brand attitude. Based on three Portuguese music festivals, two studies were conducted to ascertain event sponsorship's impact on consumer memory, notably brand recall and brand recognition, and its correlation with attitudes towards the brands, such as familiarity and liking. The key findings of these studies show that recognition is much higher among respondents who attended the festivals, with a score of 73.9%, compared with recall, which scored a much lower 37.5%. Further, and surprisingly, the results suggest that the ability to recall and recognize sponsoring brands is not associated with consumer attitudes towards the brands. Instead, it relates to the time consumers dedicated to these particular events, that is, the number of music festivals attended.
Abstract:
How are strategic decisions made? The present work consists of a psychological experiment that aims to probe more deeply the subcognitive structure of strategic vision, investigating its interaction with human cognitive processes: perception, memory, and learning. We also discuss the nature of chunks (pieces or units), which, in opposition to current theories, we consider to be defined by their essence or meaning rather than by their appearance or superficial features. We chose the game of chess as the domain for our experiment because it involves less complexity than decisions in politics or industry. We show the importance of perceiving the abstract roles played in a specific chess position, which leads to a strategic vision of it. Moreover, the experiment showed that expert chess players are capable of perceiving positions that are distinct in appearance as being strategically similar, whereas beginners had much greater difficulty. Finally, we present part of an emergent theory which claims that human cognition is nothing more than abstract perception, as well as the replication of this theory in other domains, for example in management and the real world.
Abstract:
The town of Nova Friburgo, in Brazil, was founded in 1820 by Swiss immigrants who, as often happens in most migratory flows, crossed the ocean in search of better living conditions. The scope of this paper is to trace the path of a Swiss immigrant named Marianne Joset Salusse and then to investigate the mechanisms involved in elaborating the family memory and the public memory around this woman, who would become a symbol of immigration to this town. To this end, a series of interviews was held with her descendants, which was fundamental for understanding current representations and the main elements that constitute the collective memory around Marianne. Besides oral sources, we had recourse to written documents, which allowed the retrieval of relevant information about her life. More than simply adding information, the written sources allowed a more profound analysis of the oral accounts, unravelling as well as unveiling selective procedures peculiar to memory construction (Pollak, 1989; 1992).
Abstract:
In this thesis, the basic research of Chase and Simon (1973) is questioned, and we seek new results by analyzing the errors of expert and beginner chess players in experiments on the reproduction of chess positions. Chess players with different levels of expertise participated in the study. The results were analyzed by a Brazilian grandmaster, and quantitative analysis was performed using statistical and data mining methods. The results significantly challenge the current theories of expertise, memory and decision making in this area: the present theory predicts piece-on-square encoding, in which players recognize the strategic situation and reproduce it faithfully, yet players commit several errors that the theory cannot explain. The current theory cannot fully explain the encoding players use to register a board. The errors of intermediate players preserved fragments of the strategic situation, even though they committed a series of errors in reconstructing the positions. The encoding of chunks therefore includes more information than that predicted by current theories. Currently, research on perception, judgment and decision making is heavily concentrated on the idea of "pattern recognition". Based on the results of this research, we explore a change of perspective. The idea of "pattern recognition" presupposes that the processing of relevant information bears on "patterns" (or data) that exist independently of any interpretation. We propose instead a view of decision making based on the recognition of experience.
Abstract:
In this thesis, the basic research of Chase and Simon (1973) is questioned, and we seek new results by analyzing the errors of expert and beginner chess players in experiments on the reproduction of chess positions. Chess players with different levels of expertise participated in the study. The results were analyzed by a Brazilian grandmaster, and quantitative analysis was performed using statistical and data mining methods. The results significantly challenge the current theories of expertise, memory and decision making in this area: the present theory predicts piece-on-square encoding, in which players recognize the strategic situation and reproduce it faithfully, yet players commit several errors that the theory cannot explain. The current theory cannot fully explain the encoding players use to register a board. The errors of intermediate players preserved fragments of the strategic situation, even though they committed a series of errors in reconstructing the positions. The encoding of chunks therefore includes more information than that predicted by current theories. Currently, research on perception, judgment and decision making is heavily concentrated on the idea of 'pattern recognition'. Based on the results of this research, we explore a change of perspective. The idea of 'pattern recognition' presupposes that the processing of relevant information bears on 'patterns' (or data) that exist independently of any interpretation. We propose instead a view of decision making based on the recognition of experience.
Abstract:
Work with oral history consists of recording interviews, which have historical and documentary properties, with actors/actresses in or witnesses of events, conjunctures, movements, institutions and ways of living in contemporary history. One of its basic foundations is the narrative. An event or a situation lived by the interviewee cannot be transmitted to any other person without being narrated. That means that it takes shape (that is, it becomes something) at the very moment of the interview. By telling his or her life experiences, the interviewee transforms what has been lived into language, selecting and organizing facts according to certain meanings. This work of language in crystallizing images (images which refer to, and give new meaning to, life experience) is common to all narratives - and we know that it is sometimes much more successful than at other times (just as some oral history interviews are certainly more successful than others). However, perhaps we have not yet given all the attention needed to this work of language in the oral sources.
Abstract:
We study constrained-efficient aggregate risk sharing and its consequences for the behavior of macro aggregates in a dynamic Mirrlees (1971) setting. Privately observed idiosyncratic productivity shocks are assumed to be independent of i.i.d. publicly observed aggregate shocks. Yet private allocations display memory with respect to past aggregate shocks even when idiosyncratic shocks are also i.i.d. Under a mild restriction on the nature of optimal allocations, the result extends to more persistent idiosyncratic shocks, for all but the limit at which idiosyncratic risk disappears and the model collapses to a pure-heterogeneity repeated Mirrlees economy identical to Werning [2007]. When preferences are iso-elastic, we show that an allocation is memoryless only if it displays a strong form of separability with respect to aggregate shocks. Separability characterizes the pure-heterogeneity limit as well as the general case with log preferences. With less than full persistence and risk aversion different from unity, both memory and non-separability characterize optimal allocations. Exploiting the fact that non-separability is associated with state-varying labor wedges, we apply a business cycle accounting procedure (e.g., Chari et al. [2007]) to the aggregate data generated by the model. We show that, whenever risk aversion is greater than one, our model produces efficient counter-cyclical labor wedges.
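For readers unfamiliar with the business cycle accounting procedure cited above, the labor wedge is the gap between the household's marginal rate of substitution and the marginal product of labor. A standard way to write it, assuming Cobb-Douglas production and period utility $\log c + \psi \log(1-l)$ (assumptions of this sketch, not necessarily the paper's specification), is

\[
(1 - \tau_{l,t})\,(1 - \theta)\,\frac{y_t}{l_t} \;=\; \psi\,\frac{c_t}{1 - l_t},
\]

so a wedge $\tau_{l,t}$ that varies with the aggregate state is precisely what a non-separable optimal allocation produces in the model-generated aggregate data.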
Abstract:
For more than 30 years, Brazil has developed specific policies for the informatics sector, from the National Informatics Policy of the 1970s, through the Market Reserve period of the 1980s, to the present day, in which Information and Communication Technologies (ICT) are regarded as one of the priority areas of the Industrial Policy. Among the current goals, the focus on expanding the volume of software and services exports stands out. However, despite these ambitions, the country has not achieved significant international prominence in the sector. India, on the other hand, also considered an emerging country and listed among the BRICs, exported around US$47 billion in software and Information Technology (IT) services in 2009, standing out as a protagonist in the sector's international market. The implementation of a technically sophisticated industry such as software, which requires an environment conducive to innovation, in a developing country like India draws attention. Certainly there were legal-institutional arrangements at work in that country. Which ones? To what extent did such arrangements help India's development of the sector? And in Brazil? This work starts from the hypothesis that the legal-institutional environment of these countries defined distinct knowledge flows, influencing the type of development of each country's software sector. Investigating how, among other socio-economic factors, these legal-institutional arrangements shaped the different configurations of knowledge flows is the specific objective of this research. The legal-institutional environment is understood here as all the regulations that establish institutions, guidelines and common conditions for a given theme. Starting from the assumption that the software sector carries out knowledge-intensive activities, for each country in question only the legal-institutional arrangements that had, or have, the power to delimit the flow of knowledge related to the sector will be analyzed, whether they stem from trade policies (export and import, or intellectual property) or from investment policies for innovation. The fundamental question goes beyond the debate over whether or not the State should intervene, focusing instead on the different types of involvement observed and their effects. To this end, in addition to a bibliographic review, field research was carried out in India (Delhi, Mumbai, Bangalore) and in Brazil (São Paulo, Brasília and Rio de Janeiro), where interviews were conducted with software companies and associations, public officials and academics who study the sector.
Abstract:
Decision-making models need to reflect aspects of human psychology. With this aim, this work is based on the Sparse Distributed Memory (SDM), a psychologically and neuroscientifically plausible model of human memory published by Pentti Kanerva in 1988. Kanerva's model has a critical point: a memory item within this point is quickly retrieved, while items beyond the critical point are not. Kanerva calculated this point for a special case with a select set of (fixed) parameters. In this work we extend the knowledge of this critical point through computer simulations and analyze the behavior of this "critical distance" under different scenarios: for different dimensions; for different numbers of items stored in memory; and for different numbers of times an item is stored. We also derive a function that, when minimized, determines the value of the critical distance according to the state of the memory. A secondary goal of this work is to present the SDM in a simple and intuitive way, so that researchers from other fields can imagine how it could help them understand and solve their own problems.
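To make the critical-distance idea concrete, here is a minimal numpy sketch of a Sparse Distributed Memory, written for this listing rather than taken from the thesis: items are written into counters at randomly addressed hard locations and read back by iterated, thresholded summation. The dimension, number of hard locations, activation radius and number of stored items are illustrative assumptions; probing a stored item from cues at increasing Hamming distance shows retrieval succeeding below some distance and failing beyond it, which is the behavior the thesis studies.

import numpy as np

rng = np.random.default_rng(1)

class SDM:
    def __init__(self, n_bits=256, n_locations=2000, radius=112):
        self.n = n_bits
        self.radius = radius
        self.addresses = rng.integers(0, 2, size=(n_locations, n_bits))  # hard locations
        self.counters = np.zeros((n_locations, n_bits), dtype=np.int32)

    def _active(self, address):
        # Hard locations within Hamming distance `radius` of the address.
        dist = np.count_nonzero(self.addresses != address, axis=1)
        return dist <= self.radius

    def write(self, word):
        # Autoassociative write: address == data; counters move by +1/-1 per bit.
        self.counters[self._active(word)] += np.where(word == 1, 1, -1)

    def read(self, cue, iters=10):
        # Iterated read: sum counters of active locations, threshold, repeat.
        word = cue.copy()
        for _ in range(iters):
            total = self.counters[self._active(word)].sum(axis=0)
            word = (total > 0).astype(int)
        return word

def flip(word, k):
    """Return a copy of `word` with k randomly chosen bits flipped."""
    out = word.copy()
    idx = rng.choice(len(word), size=k, replace=False)
    out[idx] ^= 1
    return out

sdm = SDM()
items = rng.integers(0, 2, size=(100, sdm.n))
for item in items:
    sdm.write(item)

# Probe one stored item from cues at increasing Hamming distance; convergence
# back to the item tends to succeed below the critical distance and fail above it.
target = items[0]
for k in range(0, sdm.n + 1, 16):
    recovered = sdm.read(flip(target, k))
    err = np.count_nonzero(recovered != target)
    print(f"cue distance {k:3d} -> residual errors {err:3d}")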