13 results for Cant.
in the Digital Repository of the Fundação Getúlio Vargas - FGV
Abstract:
In this thesis, the basic research of Chase and Simon (1973) is questioned, and we seek new results by analyzing the errors of expert and beginner chess players in experiments that required them to reproduce chess positions. Chess players with different levels of expertise participated in the study. The results were analyzed by a Brazilian grandmaster, and quantitative analysis was performed using statistical and data-mining methods. The results significantly challenge the current theories of expertise, memory and decision making in this area: the present theory predicts piece-on-square encoding, in which players recognize the strategic situation and reproduce it faithfully, yet players commit several errors that the theory cannot explain. The current theory cannot fully explain the encoding players use to register a board. The errors of intermediate players preserved fragments of the strategic situation, even though these players committed a series of errors in the reconstruction of the positions. The encoding of chunks therefore includes more information than that predicted by current theories. Currently, research on perception, judgment and decision is heavily concentrated on the idea of "pattern recognition". Based on the results of this research, we explore a change of perspective. The idea of "pattern recognition" presupposes that the relevant information being processed consists of "patterns" (or data) that exist independently of any interpretation. We propose instead a theory that views decision-making as the recognition of experience.
Abstract:
The objective of this work is to create a service standard for the EPGE Secretariat, in order to resolve the ongoing stress typical of quaternary-sector organizations, in which professors and secretarial professionals simply cannot get along. This systemic model intends to create a lasting solution to that problem, allowing professors to dedicate their time solely to research and to advising students. In that way, we can maintain the high quality of the secretariat's service, since the secretariat represents EPGE in the eyes of the public. After presenting the theoretical foundations of Organizations, Systems and Methods and of Geertz's theory of ethnography, we build a standard system adapted to the cultural situation of FGV's secretariats. A group of 21 professionals from FGV's secretariats was selected, and a qualitative study supported the author's six years of field work. The group's contribution to this work was fundamental to building the model and may also foster a new way of thinking about the work of FGV's secretariats.
Abstract:
The aim of this research was to detect how strategy changes took place in a successful organization. This was a single case study focused on O BOTICÁRIO, a cosmetics and perfumery company in the state of Paraná, Brazil. Some information (primary data) was obtained by interviewing members of the retail development department. Additional information was taken from articles about the company and from interviews with members of the board of directors published in newspapers and magazines. The analysis was carried out in an explanatory-descriptive way, with a qualitative approach. Besides characterizing the company, the data served several purposes: to introduce information about strategy changes in the company; to identify the factors that gave rise to these changes; to identify the effects of these changes on the organization; and to classify the strategy changes. The data revealed that the company is highly receptive to change. Among the factors that caused strategy changes, two stand out: the flexibility the company shows in adapting to environmental variations and, above all, the interdependency that exists between changes. Confirming the systemic character of organizations, it was found that changes lead to new changes. When qualifying the strategy changes, it was found that incremental changes were more related to process evolution, i.e., they comprised transformations directly related to the activities with which the changes are associated. Discontinuous changes, on the contrary, involved transformations throughout the whole company. Of all the changes analyzed, only two were considered truly strategic revolutions, confirming the idea that organizations usually choose to follow established strategic paths. The strategy changes analyzed met a level of resistance that was directly proportional to the degree of discontinuity of the change. It was verified, however, that all changes were considered beneficial to the company. The study concludes that other factors are also responsible for the fast expansion of the company, most notably: the constant launching of new products; the identification of the company with environmental issues; the policy towards employees; the strong marketing strategy; the quality of its products; the excellence of its services; the treatment of consumers; the use of franchise shops; the training provided to the whole franchise network as well as to the employees; and the effort to hire the best professionals to supervise its services. However, it cannot be omitted that a particular cause of O BOTICÁRIO's success is the intuition and feeling of its President.
Abstract:
The main objective of this thesis is to bring the existing empirical evidence on macroeconomic aggregates closer to the new empirical evidence based on consumer-price micro data, taking as a basis the standard price-rigidity models used in the monetary policy literature. To this end, the thesis uses the database of individual consumer prices in Brazil provided by the Fundação Getulio Vargas. Specifically, it focuses on three main topics: the existence of temporary price changes, the heterogeneity of price rigidity across firms within the same sector, and the shape of hazard functions. The results show that there is indeed a correlation between the variables related to temporary price changes and the macroeconomic aggregates; that heterogeneity in price rigidity across firms within the same sector has significant effects on the dynamics of the macroeconomic aggregates; and, finally, that the more general form of the hazard function proposed in this thesis allows for new dynamics of the macroeconomic aggregates.
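For reference, a hedged note on the object mentioned above: in this literature the hazard function of a price spell gives the probability that a price set \tau periods ago is changed in the current period, conditional on having survived unchanged until then. In terms of the duration density f and the survival function S,

h(\tau) = \Pr(\text{price changes at duration } \tau \mid \text{spell lasts at least } \tau) = \frac{f(\tau)}{S(\tau)}.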
Abstract:
The objective of this research is to describe the theoretical bases that underlie the use of activity as a therapeutic resource in Psychiatry and in Dynamic Psychology. We emphasize the possibilities this use offers for the expansion of consciousness, which we consider essential to the process of psychotherapeutic intervention. In psychiatry, we address the evolution of the understanding of the role of activity in the therapeutic process and relate this understanding to the transformations of the concept of mental illness that were concurrently taking place in this field. To that end, we describe two lines of work specifically linked to the question of activity: Occupational Therapy and Art Therapy. We also address the different ways of using activity as a therapeutic resource in Dynamic Psychology. We seek to identify, in each of the theories studied (Psychoanalysis, Psychodrama, Gestalt Therapy and Analytical Psychology), the way activity is used, always relating it to the understanding of the human being and of the objectives of the therapeutic process made explicit by each theory. We conclude that, at the current stage of the use of activity as a psychotherapeutic resource, we cannot entirely discard any of the contributions developed so far: some provide grounding on the advantages of activity as an organizing and socializing practice, while others provide grounding on its effects on the psyche in terms of the integration of unconscious contents into consciousness: repressed aspects, introjected roles, feelings and ways of dealing with the world (in their intentionality), and aspects of the personal and collective unconscious that are not necessarily repressed but exist as potentialities of being.
Abstract:
Using intraday data for the most actively traded stocks in the São Paulo Stock Exchange (BOVESPA) index, this study considers two recently developed models from the literature on the estimation and prediction of realized volatility: the Heterogeneous Autoregressive Model of Realized Volatility (HAR-RV), developed by Corsi (2009), and the Mixed Data Sampling model (MIDAS-RV), developed by Ghysels et al. (2004). Using measures to compare in-sample and out-of-sample forecasts, better results were obtained with the MIDAS-RV model for in-sample forecasts. For out-of-sample forecasts, however, there was no statistically significant difference between the models. We also found evidence that the use of realized volatility induces distributions of standardized returns that are closer to normal.
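A minimal sketch of the HAR-RV regression of Corsi (2009), the first of the two models mentioned above: next-period realized volatility is regressed by OLS on daily, weekly (5-day) and monthly (22-day) averages of past realized volatility. The simulated series and the NumPy implementation below are illustrative assumptions, not the BOVESPA data or the code used in the study.

```python
# Minimal HAR-RV sketch: regress tomorrow's realized volatility on daily,
# weekly (5-day) and monthly (22-day) averages of past realized volatility.
# The data below are simulated placeholders, not BOVESPA data.
import numpy as np

rng = np.random.default_rng(0)
T = 1000
rv = np.abs(rng.normal(1.0, 0.3, T))          # placeholder realized-volatility series

def rolling_mean(x, window):
    """Backward-looking rolling mean: entry t averages x[t-window+1 .. t]."""
    c = np.cumsum(np.insert(x, 0, 0.0))
    out = np.full_like(x, np.nan)
    out[window - 1:] = (c[window:] - c[:-window]) / window
    return out

rv_d = rv                       # daily component
rv_w = rolling_mean(rv, 5)      # weekly component
rv_m = rolling_mean(rv, 22)     # monthly component

# Align regressors at t with the target at t+1 and drop the 22-day burn-in.
y = rv[22:]
X = np.column_stack([np.ones(T - 22), rv_d[21:-1], rv_w[21:-1], rv_m[21:-1]])

beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS estimates (b0, bd, bw, bm)
print("HAR-RV coefficients:", beta)
```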
Abstract:
This work applies computer simulation as a method for deepening the study of auction mechanisms used to allocate the right to exploit the oil reserves of the pre-salt layer. The pre-salt layer is located off the Brazilian coast and has great potential in terms of oil and gas reserves. The bid function applied to the computationally created participants was estimated from experimental data and follows an exponential function. The simulation makes it possible to reproduce the auction model with all the characteristics and parameters of the experiments without incurring the cost of running new auction sessions with real participants. The auctions studied were the first-price and the second-price private-value auctions. The results show that the first-price private-value auction is less risky than the second-price private-value auction; that, in the auction with symmetry, the Revenue Equivalence Principle holds; that observed efficiency is lower in asymmetric auctions; that the second-price auction, compared with the first-price auction, presents a trade-off between efficiency and government revenue; and that, when participants' learning is taken into account, no significant changes are observed in the analyzed statistics as participants become more experienced.
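A hedged sketch of the kind of comparison reported above, contrasting revenue and risk in first-price and second-price private-value auctions. The uniform value distribution and the risk-neutral symmetric equilibrium bid v(n-1)/n are illustrative assumptions; the thesis instead uses an exponential bid function estimated from experimental data.

```python
# Compare mean revenue and revenue dispersion in simulated first-price and
# second-price private-value auctions. Values are uniform on [0, 1] and the
# first-price bid follows the risk-neutral symmetric equilibrium v*(n-1)/n.
import numpy as np

rng = np.random.default_rng(1)
n_bidders, n_auctions = 4, 10_000
values = rng.uniform(0.0, 1.0, size=(n_auctions, n_bidders))  # private values

# First-price: highest bid wins and pays its own bid.
bids_fp = values * (n_bidders - 1) / n_bidders
revenue_fp = bids_fp.max(axis=1)

# Second-price: truthful bidding; the winner pays the second-highest value.
revenue_sp = np.sort(values, axis=1)[:, -2]

print("mean revenue, 1st price:", revenue_fp.mean())
print("mean revenue, 2nd price:", revenue_sp.mean())
print("revenue std,  1st price:", revenue_fp.std())
print("revenue std,  2nd price:", revenue_sp.std())
```

Under these assumptions the mean revenues coincide (revenue equivalence) while the second-price revenue is more dispersed, which is in line with the risk comparison reported in the abstract.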
Abstract:
The objective of this paper is to test for optimality of consumption decisions at the aggregate level (representative consumer), taking into account popular deviations from the canonical CRRA utility model: rule-of-thumb behavior and habit. First, we show that rule-of-thumb behavior in consumption is observationally equivalent to behavior obtained from the optimizing model of King, Plosser and Rebelo (Journal of Monetary Economics, 1988), casting doubt on how reliable standard rule-of-thumb tests are. Second, although Carroll (2001) and Weber (2002) have criticized the linearization and testing of Euler equations for consumption, we provide a deeper critique directly applicable to current rule-of-thumb tests. Third, we show that there is no reason why return aggregation cannot be performed in the nonlinear setting of the Asset-Pricing Equation, since the latter is a linear function of individual returns. Fourth, aggregation of the nonlinear Euler equation forms the basis of a novel test of deviations from the canonical CRRA model of consumption in the presence of rule-of-thumb and habit behavior. We estimated 48 Euler equations using GMM, with encouraging results vis-à-vis the optimality of consumption decisions. At the 5% level, we rejected optimality only twice out of 48 times. The empirical results show that we can still rely on the canonical CRRA model so prevalent in macroeconomics: out of 24 regressions, we found the rule-of-thumb parameter to be statistically significant at the 5% level only twice, and the habit parameter γ to be statistically significant on four occasions. The main message of this paper is that proper return aggregation is critical to study intertemporal substitution in a representative-agent framework. In this case, we find little evidence of lack of optimality in consumption decisions, and deviations from the CRRA utility model along the lines of rule-of-thumb behavior and habit in preferences represent the exception, not the rule.
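For reference, the canonical consumption Euler equation under CRRA utility that the tests above build on, written here without the rule-of-thumb and habit extensions (whose exact parameterization in the paper is not reproduced):

\mathbb{E}_t\left[ \beta \left( \frac{C_{t+1}}{C_t} \right)^{-\gamma} R_{t+1} \right] = 1,

where C_t is aggregate consumption, R_{t+1} is the gross asset return, \beta is the subjective discount factor and \gamma is the coefficient of relative risk aversion.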
Abstract:
This paper investigates heterogeneity in the market assessment of public macroeconomic announcements by jointly exploring two main mechanisms through which macroeconomic news might enter stock prices: instantaneous fundamental news impacts, consistent with the asset-pricing view of symmetric information, and permanent order flow effects, consistent with a microstructure view of asymmetric information related to the heterogeneous interpretation of public news. Theoretical motivation and empirical evidence for the operation of both mechanisms are presented. Significant instantaneous news impacts are detected for news related to real activity (including employment), investment, inflation, and monetary policy; however, significant order flow effects are also observed on employment announcement days. A multi-market analysis suggests that these asymmetric-information effects come from uncertainty about long-term interest rates due to heterogeneous assessments of future Fed responses to employment shocks.
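A stylized reduced-form illustration of the two channels described above (this is not the paper's actual specification): for an announcement day t, write the stock return as

r_t = \alpha + \beta\, S_t + \lambda\, OF_t + \varepsilon_t,

where S_t is the standardized news surprise and OF_t the signed order flow. The coefficient \beta captures the instantaneous, symmetric-information news impact, while a significant \lambda reflects the order-flow channel associated with heterogeneous interpretation of the announcement.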
Abstract:
Economists and policymakers have long been concerned with increasing the supply of health professionals in rural and remote areas. This work seeks to understand which factors influence physicians' choice of practice location right after completing residency. Unlike previous papers, we analyse the Brazilian misallocation of physicians and assess the particularities of developing countries. We use a discrete choice model approach with a multinomial logit specification. Two rich databases are employed, containing the location and wages of formally employed physicians as well as details of their post-graduate training. Our main findings are that amenities matter, that physicians have a strong tendency to remain in the region where they completed residency, and that salaries are significant in the choice of urban, but not rural, communities. We conjecture this is due to attachments built during training and to infrastructure concerns.
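For reference, a hedged sketch of the choice probability in the multinomial (conditional) logit specification mentioned above; the covariates and normalizations actually used in the thesis are not reproduced here. With x_{ij} denoting the attributes of location j as faced by physician i (for example wages, amenities and ties to the residency region),

P(i \text{ chooses } j) = \frac{\exp(x_{ij}'\beta)}{\sum_{k} \exp(x_{ik}'\beta)}.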
Abstract:
The community of lawyers and their clients forms a scale-free bipartite network that develops naturally as the outcome of the recommendation process through which lawyers build their client base. This process is an example of preferential attachment, in which lawyers with more clients are more likely to be recommended to new clients. Consumer litigation is an important market for lawyers: in large consumer societies there is always a significant number of consumption disputes that escalate to court. In this paper we analyze a dataset of thousands of lawsuits, reconstructing the lawyer-client network embedded in the data. Analyzing the degree distribution of this network, we noticed that it follows that of a scale-free network built by preferential attachment, except for a few lawyers with a much larger client base than preferential attachment could explain. Incidentally, most of these lawyers also figured on a list, put together by the judiciary, of lawyers who openly advertised the benefits of consumer litigation. According to the code of ethics of their profession, lawyers should not stimulate clients into litigation, although doing so is not strictly illegal. From a network-formation point of view, this stimulation can be seen as a growth mechanism separate from preferential attachment alone. In this paper we find that this composite growth can be detected by a simple statistical test, as simulations show that lawyers who use both mechanisms quickly become the "Dragon-Kings" of the distribution of the number of clients per lawyer.
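A minimal sketch of the composite growth process described above: each new client picks a lawyer with probability proportional to that lawyer's current number of clients (preferential attachment), while a small share of clients respond to advertising by a fixed subset of lawyers. The network sizes, the 5% advertising share and the three advertising lawyers are illustrative assumptions, not values from the paper.

```python
# Grow a lawyer-client network by preferential attachment, with a small extra
# flow of clients going to a fixed set of "advertising" lawyers. Those lawyers
# are expected to end up far above the rest of the degree distribution.
import numpy as np

rng = np.random.default_rng(2)
n_lawyers, n_clients = 200, 20_000
degree = np.ones(n_lawyers)                                   # one client each to start
advertisers = rng.choice(n_lawyers, size=3, replace=False)    # hypothetical subset

for _ in range(n_clients):
    if rng.random() < 0.05:
        # 5% of clients respond to advertising rather than recommendations.
        lawyer = rng.choice(advertisers)
    else:
        # Recommendation: probability proportional to current client count.
        lawyer = rng.choice(n_lawyers, p=degree / degree.sum())
    degree[lawyer] += 1

print("10 largest client bases:", np.sort(degree)[::-1][:10])
print("advertisers' client bases:", np.sort(degree[advertisers])[::-1])
```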
Abstract:
The synthetic control (SC) method has recently been proposed as an alternative method to estimate treatment effects in comparative case studies. Abadie et al. [2010] and Abadie et al. [2015] argue that one of the advantages of the SC method is that it imposes a data-driven process to select the comparison units, providing more transparency and less discretionary power to the researcher. However, an important limitation of the SC method is that it does not provide clear guidance on the choice of the predictor variables used to estimate the SC weights. We show that this lack of specific guidance provides significant opportunities for the researcher to search for specifications with statistically significant results, undermining one of the main advantages of the method. Considering six alternative specifications commonly used in SC applications, we calculate in Monte Carlo simulations the probability of finding a statistically significant result at 5% in at least one specification. We find that this probability can be as high as 13% (23% for a 10% significance test) when there are 12 pre-intervention periods, and that it decays slowly with the number of pre-intervention periods. With 230 pre-intervention periods, this probability is still around 10% (18% for a 10% significance test). We show that the specification that uses the average pre-treatment outcome values to estimate the weights performed particularly badly in our simulations. However, the specification-searching problem remains relevant even when we do not consider this specification. We also show that this specification-searching problem is relevant in simulations with real datasets looking at placebo interventions in the Current Population Survey (CPS). In order to mitigate this problem, we propose a criterion to select among different SC specifications based on the prediction error of each specification in placebo estimations.
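A stylized sketch of the specification-searching problem described above, not of the SC estimator itself: six placebo test statistics that are positively correlated across specifications, and the probability that at least one of them is significant at the 5% level. The correlation level is an assumption chosen for illustration, not a number from the paper.

```python
# Probability that at least one of six correlated placebo tests rejects at 5%.
# A researcher free to report any of the six specifications faces this inflated
# false-positive rate; rho controls how similar the specifications are.
import numpy as np

rng = np.random.default_rng(3)
n_sims, n_specs, rho = 100_000, 6, 0.7
cov = rho * np.ones((n_specs, n_specs)) + (1 - rho) * np.eye(n_specs)
z = rng.multivariate_normal(np.zeros(n_specs), cov, size=n_sims)

reject_any = (np.abs(z) > 1.96).any(axis=1).mean()
print("P(at least one specification significant at 5%):", reject_any)
```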