920 results for Bayesian statistical decision theory
Abstract:
The dissertation investigates the relationship between the design of an e-commerce website selling CDs and the behavior of the online consumer, with emphasis on purchase attitude and intention. The main objective is to measure the effect of the design of the virtual store (website) in its role as a sales agent on the Internet. CD retailing was chosen for analysis because this product is one of the main items sold through this channel. The study is supported by a theoretical framework that analyzes the following points: i) the Internet acting as a sales channel; ii) CD retailing in Brazil; and iii) consumer behavior and the consumer's decision process. The theoretical framework also presents the existing website evaluation models: marketing-based evaluation, Two-Factor Theory, quality assessment, the Web Assessment Model (WAM) and the Technology Acceptance Model applied to the Web (TAM). The analysis and comparison of these models served as the basis for developing the proposed website evaluation model. The study is complemented by a survey administered through a web-based questionnaire (websurvey). The collected data are used to statistically validate the relationships in the developed model by means of Structural Equation Modeling (SEM), supported by the concepts and methods described in the theoretical framework. The technique allows a measurement model and a structural model to be evaluated simultaneously.
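For reference, a generic SEM specification of the kind applied here combines a measurement model and a structural model; the notation below is standard LISREL-style shorthand, not taken from the dissertation itself:

```latex
% Measurement model: observed indicators load on latent constructs
\[ x = \Lambda_x \xi + \delta, \qquad y = \Lambda_y \eta + \epsilon \]
% Structural model: relations among the latent constructs
\[ \eta = B\eta + \Gamma\xi + \zeta \]
```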
Abstract:
We develop a framework to explain the private capital flows between the rest of the world and an emerging economy. The model, based on the monetary premium theory, relates an endogenous supply of foreign capital to an endogenous interest rate differential; its estimation uses the econometric techniques initiated by Heckman. Four questions regarding the capital flows phenomenon are explored, including the statistical process that governs default events and the impact of the probability of default on the interest rate differential. Using this methodology, we analyse the dynamics of foreign capital movements in Brazil during the 1991-1998 period.
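As background, Heckman-type selection estimation has the general form sketched below (generic notation; the paper's own specification may differ):

```latex
\[ d_i = \mathbf{1}\{ z_i'\gamma + u_i > 0 \}, \qquad
   y_i = x_i'\beta + \varepsilon_i \ \text{observed only when } d_i = 1 \]
\[ E[y_i \mid x_i, d_i = 1] = x_i'\beta + \rho\,\sigma_\varepsilon\,\lambda(z_i'\gamma),
   \qquad \lambda(c) = \frac{\phi(c)}{\Phi(c)} \]
```

The inverse Mills ratio \(\lambda\) corrects the outcome equation for the selection induced by the default/flow events.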
Abstract:
This thesis provides three original contributions to the field of Decision Sciences. The first contribution explores the field of heuristics and biases. New variations of the Cognitive Reflection Test (CRT--a test to measure "the ability or disposition to resist reporting the response that first comes to mind") are provided. The original CRT (S. Frederick [2005] Journal of Economic Perspectives, v. 19:4, pp. 24-42) has items in which the response is immediate--and erroneous. It is shown that by merely varying the numerical parameters of the problems, large deviations in response are found. Not only are the final results affected by the proposed variations, but so is processing fluency. It seems that the magnitude of the numbers serves as a cue to activate system-2 type reasoning. The second contribution explores Managerial Algorithmics Theory (M. Moldoveanu [2009] Strategic Management Journal, v. 30, pp. 737-763), an ambitious research program which states that managers display cognitive choices with a "preference towards solving problems of low computational complexity". An empirical test of this hypothesis is conducted, with results showing that this premise is not supported. A number of problems are designed with the intent of testing the predictions of managerial algorithmics against the predictions of cognitive psychology. The results demonstrate (once again) that framing effects profoundly affect choice, and (an original insight) that managers are unable to distinguish between computational complexity problem classes. The third contribution explores a new approach to a computationally complex problem in marketing: the shelf space allocation problem (M-H Yang [2001] European Journal of Operational Research, v. 131, pp. 107-118). A new representation for a genetic algorithm is developed, and computational experiments demonstrate its feasibility as a practical solution method. These studies lie at the interface of psychology and economics (bounded rationality and the heuristics and biases programme); of psychology, strategy, and computational complexity; and of heuristics for computationally hard problems in management science.
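To illustrate the kind of genetic-algorithm approach mentioned in the third contribution, the sketch below solves a toy shelf space allocation instance. The data, the integer-facings encoding, and the diminishing-returns fitness are all hypothetical assumptions; this is not the representation developed in the thesis.

```python
# Toy genetic algorithm for a shelf space allocation problem (illustrative only).
import random

random.seed(42)

N_PRODUCTS = 10
SHELF_CAPACITY = 40                                              # total facings available
PROFIT = [random.uniform(0.5, 3.0) for _ in range(N_PRODUCTS)]   # profit per facing (hypothetical)

def repair(sol):
    # Remove facings at random until the shelf capacity constraint holds.
    while sum(sol) > SHELF_CAPACITY:
        i = random.randrange(N_PRODUCTS)
        if sol[i] > 0:
            sol[i] -= 1
    return sol

def random_solution():
    # A chromosome is a vector of facings per product, repaired to capacity.
    return repair([random.randint(0, 8) for _ in range(N_PRODUCTS)])

def fitness(sol):
    # Diminishing returns per extra facing (a common modeling assumption).
    return sum(p * (f ** 0.7) for p, f in zip(PROFIT, sol))

def crossover(a, b):
    cut = random.randrange(1, N_PRODUCTS)
    return repair(a[:cut] + b[cut:])

def mutate(sol, rate=0.1):
    sol = sol[:]
    for i in range(N_PRODUCTS):
        if random.random() < rate:
            sol[i] = max(0, sol[i] + random.choice([-1, 1]))
    return repair(sol)

population = [random_solution() for _ in range(50)]
for _ in range(200):
    population.sort(key=fitness, reverse=True)
    parents = population[:20]                                    # elitist selection
    population = parents + [
        mutate(crossover(random.choice(parents), random.choice(parents)))
        for _ in range(30)
    ]

best = max(population, key=fitness)
print("best allocation:", best, "profit:", round(fitness(best), 2))
```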
Abstract:
On March 4, 1999, the newly appointed President of the Brazilian Central Bank, Mr Armínio Fraga, raised interest rates to a staggering 45% per annum. The objective of that decision was to keep foreign investors' assets in Brazil, and prevent the country from default. At the time, Brazil suffered from an enormously intense crisis of confidence, and fears of such a default were widespread. Mr Fraga was walking a very fine line when making that decision, for it could bring forth unintended effects: the market, already concerned about Brazil's sustainability, could perceive the increased rate as an irreversible step towards the abyss of inevitable default. Economic theory postulates the rational actor model as the driving force behind economic decision-making. The objective of this thesis is to present and discuss the hypothesis that this particular decision, and by extension many others, are better explained through the recognition-primed decision model.
Abstract:
The thesis presents three empirical essays on the decision patterns of judges in Brazil, based on novel, large-scale databases containing details of tens of thousands of judicial cases at the trial and appellate levels. The databases were collected by the author himself using web-scraping robots applied to the case-tracking pages of state courts in Brazil (Paraná, Minas Gerais and Santa Catarina). The first article assesses, on the basis of a statistical model, the importance of extra-legal factors on the outcomes of lawsuits in the Paraná state courts; that is, whether judges systematically favor the underprivileged party (the beneficiary of free legal aid). The second article studies the relationship between the duration of civil cases at the trial level and the probability of the judgment being reversed, using data from the Minas Gerais state courts. The objective is to assess whether there is a trade-off between the duration and the quality of judgments; in other words, whether there is a trade-off between observance of the right to due process and procedural speed. The last article tests the hypothesis, in the context of criminal appeals and appellate motions in the Tribunal de Justiça de Santa Catarina, that the professional origins of appellate judges influence their decision patterns. That is, it tests the hypothesis that appellate judges/rapporteurs who come from careers as private attorneys are more defendant-protective ("garantistas"), and those who come from the public prosecutor's office less so, relative to their peers who come from the career judiciary. The hypotheses are tested with a statistical model that explains the probability of an appellate decision favorable to the defendant as a function of the rapporteur's career origin, in addition to a set of characteristics of the case and of the judging panel.
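A generic specification of the kind of model described in the last essay is a logit for the probability of a pro-defendant ruling; the notation below is illustrative, and the thesis's actual model may differ:

```latex
\[ \Pr(\text{favorable to defendant}_i = 1)
   = \Lambda\!\big(\beta_0 + \beta_1\,\text{advocate}_i + \beta_2\,\text{prosecutor}_i + x_i'\gamma\big),
   \qquad \Lambda(z) = \frac{1}{1 + e^{-z}} \]
```

Here advocate_i and prosecutor_i indicate the rapporteur's career origin (career judges as the baseline) and x_i collects case and panel controls.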
Abstract:
Corporate governance has been in the spotlight for the past two decades, being the subject of numerous studies all over the world. Governance is pictured as a broad and diverse theme, evolving through different routes to form distinct systems. This scenario, together with two types of agency problems (investors vs. management and minority vs. controlling shareholders), produces different definitions of governance. Studies usually investigate whether corporate governance structures influence firm performance and company valuation. This approach implies that investors can identify those impacts and later take them into consideration when making investment decisions. However, behavioral finance theory shows that investors do not always make rational decisions, and therefore the modus operandi of those professionals needs to be understood. This research therefore aimed to investigate to what extent Brazilian corporate governance standards and practices influence the investment decision-making process of equity market professionals from the sell-side and buy-side. This exploratory study was carried out through qualitative and quantitative approaches. In the qualitative phase, 8 practitioners were interviewed and 3 dimensions emerged: understanding, pertinence and practice. Based on the interview findings, a questionnaire was formulated and distributed to buy-siders and sell-siders who cover Brazilian stocks. 117 respondents from all over the world contributed to the study. The data obtained were analyzed through structural equation modeling and descriptive statistics. The 3 dimensions became 5 constructs: definition (institutionalized governance, informal governance), pertinence (relevance) and practice (valuation process, structured governance assessment). The results of this thesis suggest there is no definitive answer, as the extent to which governance influences an investment decision process depends on a number of circumstances that compose the context. The only certainty is the need to present a "corporate governance behavior", rather than simply establishing rules and regulations at firm and country level.
Abstract:
In this article, I develop and analyze a two-period model in which two politicians compete for the preference of a representative voter who knows how benevolent one of the politicians is but is imperfectly informed about how benevolent the second politician is. The known politician is interpreted as a long-serving incumbent, while the unknown politician is interpreted as a lesser-known challenger. I establish that the incentive-provision mechanism inherent in elections - which arises through the possibility of not reelecting an incumbent - and the voter's information-acquisition considerations combine so that, in any equilibrium of this game, the voter chooses the unknown politician in the initial period of the model - an action I refer to as experimentation - thereby providing a rationalization for the non-reelection of long-serving incumbents. Specifically, I show that the voter's decision about whom to elect in the initial period reduces to a comparison between the informational benefits of choosing the unknown politician and the economic losses of doing so. The former, which capture the information-acquisition considerations, are shown to be always positive, while the latter, which capture the incentive for good performance, are always non-negative, implying that it is always optimal for the voter to choose the unknown politician in the initial period.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
This work discusses the theme of environmental management, on the basis of the ISO 14001 standard and the learning organization. The study is carried out as an exploratory survey in a fuel transport company located in Natal/RN. The objective of the research was to investigate the environmental management practices carried out within an implemented ISO 14001 environmental management system in the researched organization, from the perspective of the learning organization. The methodology is quantitative, combining exploratory and descriptive approaches, and uses questionnaires, with the research covering the company's managers, controllers, coordinators, supervisors and employees, both in-house and contracted. The data were analyzed with the Excel and Statistica 6.0 software packages. The analysis is divided into two parts: descriptive analysis and cluster analysis. Based on the theory studied, as well as on the research results, the ISO 14001 system implemented in the researched organization presents elements that promote a learning organization. From the results, it can be concluded that the company uses external information when making decisions on environmental problems; that employees are mobilized to generate ideas and to collect environmental information; and that the company has established partnerships with other companies in environmental activities. All of these items can contribute to the generation of organizational knowledge. It can also be concluded that the company has evaluated past environmental errors and has carried out environmental benchmarking; these practices can be considered good ways for the company to acquire knowledge. The results also show that employees have no difficulty accomplishing their tasks when the manager of their sector is not present, which may indicate that the company has a good diffusion of knowledge.
Abstract:
Portfolio theory is a field of study devoted to investigating how investors make resource-allocation decisions. The purpose of this process is to reduce risk through diversification and thus secure returns. Nevertheless, the classical mean-variance (MV) model has been criticized with regard to its parameters, since variance and covariance estimates are sensitive to market conditions and to estimation error. In order to reduce estimation errors, Bayesian models offer more flexibility in modeling, being able to incorporate quantitative and qualitative information about the behavior of the market. With this in mind, the present study formulated a new matrix model using Bayesian inference to replace the covariance matrix in the MV model, called MCB - the Bayesian covariance model. To evaluate the model, hypotheses were analyzed using an ex post facto method and sensitivity analysis. The benchmarks used as reference were: (1) the classical mean-variance model, (2) the Bovespa market index, and (3) an additional 94 investment funds. The returns earned during the period from May 2002 to December 2009 demonstrated the superiority of the MCB over the classical MV model and the Bovespa index, while taking slightly more diversifiable risk than the MV model. The robustness analysis of the model over the time horizon found returns close to the Bovespa index, with less risk than the market. Finally, in relation to Mao's index, the model showed satisfactory return and risk, especially at longer horizons. Some considerations were made, as well as suggestions for further work.
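A minimal sketch of the general idea - replacing the sample covariance with a Bayesian posterior estimate inside a minimum-variance optimizer - is given below, using simulated returns and an inverse-Wishart prior. It is an assumption-laden illustration, not the MCB model itself.

```python
# Bayesian (inverse-Wishart) shrinkage of the covariance matrix inside a
# global minimum-variance portfolio. Simulated data; illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_assets = 60, 5
returns = rng.normal(0.01, 0.05, size=(n_obs, n_assets))    # simulated monthly returns

# Sample statistics
mean = returns.mean(axis=0)
scatter = (returns - mean).T @ (returns - mean)              # n * sample covariance

# Inverse-Wishart prior IW(Psi, nu0): a diagonal "market view" prior (hypothetical).
nu0 = n_assets + 4
psi = np.eye(n_assets) * 0.05**2 * (nu0 - n_assets - 1)

# Posterior mean of the covariance under the conjugate update IW(Psi + S, nu0 + n).
nu_post = nu0 + n_obs
sigma_bayes = (psi + scatter) / (nu_post - n_assets - 1)

# Global minimum-variance weights with the Bayesian covariance.
ones = np.ones(n_assets)
inv = np.linalg.inv(sigma_bayes)
weights = inv @ ones / (ones @ inv @ ones)
print("weights:", np.round(weights, 3))
```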
Abstract:
A body of research has developed within the context of nonlinear signal and image processing that deals with the automatic, statistical design of digital window-based filters. Based on pairs of ideal and observed signals, a filter is designed in an effort to minimize the error between the ideal and filtered signals. The goodness of an optimal filter depends on the relation between the ideal and observed signals, but the goodness of a designed filter also depends on the amount of sample data from which it is designed. In order to lessen the design cost, a filter is often chosen from a given class of filters, thereby constraining the optimization and increasing the error of the optimal filter. To a great extent, the problem of filter design concerns striking the correct balance between the degree of constraint and the design cost. From a different perspective and in a different context, the problem of constraint versus sample size has been a major focus of study within the theory of pattern recognition. This paper discusses the design problem for nonlinear signal processing, shows how the issue naturally transitions into pattern recognition, and then provides a review of salient related pattern-recognition theory. In particular, it discusses classification rules, constrained classification, the Vapnik-Chervonenkis theory, and implications of that theory for morphological classifiers and neural networks. The paper closes by discussing some design approaches developed for nonlinear signal processing, and how the nature of these naturally leads to a decomposition of the error of a designed filter into a sum of the following components: the Bayes error of the unconstrained optimal filter, the cost of constraint, the cost of reducing complexity by compressing the original signal distribution, the design cost, and the contribution of prior knowledge to a decrease in the error. The main purpose of the paper is to present fundamental principles of pattern recognition theory within the framework of active research in nonlinear signal processing.
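Schematically, the error decomposition described at the end of the abstract can be written as follows (the symbols are ours, not the paper's):

```latex
\[ \varepsilon[\hat{\psi}_n] \;=\;
   \underbrace{\varepsilon_{\mathrm{Bayes}}}_{\text{unconstrained optimum}}
   + \underbrace{\Delta_{\mathrm{constraint}}}_{\text{cost of constraint}}
   + \underbrace{\Delta_{\mathrm{compression}}}_{\text{distribution compression}}
   + \underbrace{\Delta_{\mathrm{design}}(n)}_{\text{finite-sample design cost}}
   - \underbrace{\Delta_{\mathrm{prior}}}_{\text{gain from prior knowledge}} \]
```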
Abstract:
Several statistical models can be used for assessing genotype × environment interaction (GEI) and studying genotypic stability. The objectives of this research were to show how (i) to use Bayesian methodology for computing Shukla's phenotypic stability variance and (ii) to incorporate prior information on the parameters for better estimation. Potato [Solanum tuberosum subsp. andigenum (Juz. & Bukasov) Hawkes], wheat (Triticum aestivum L.), and maize (Zea mays L.) multi-environment trials (MET) were used for illustrating the application of the Bayes paradigm. The potato trial included 15 genotypes, but prior information for just three genotypes was used. The wheat trial used prior information on all 10 genotypes included in the trial, whereas for the maize trial, noninformative priors for the nine genotypes were used. Concerning the posterior distribution of the genotypic means, the maize MET with 20 sites gave less dispersed posterior distributions of the genotypic means than did the other METs, which included fewer environments. The Bayesian approach allows use of other statistical strategies such as the truncated normal distribution (used in this study). When analyzing grain yield, a lower bound of zero and an upper bound set by the researcher's experience can be used. The Bayesian paradigm offers plant breeders the possibility of computing the probability of a genotype being the best performer. The results of this study show that although some genotypes may have a very low probability of being the best in all sites, they have a relatively good chance of being among the five highest yielding genotypes.
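In practice, the "probability of being the best performer" is a Monte Carlo proportion over posterior draws of the genotypic means. The sketch below uses simulated draws rather than the trial data of the paper.

```python
# Probability that each genotype is the best, or among the five best, computed
# from posterior draws of genotype means. Simulated draws; illustrative only.
import numpy as np

rng = np.random.default_rng(1)
n_draws, n_genotypes = 5000, 10
hypothetical_means = rng.normal(5.0, 0.4, n_genotypes)            # yields in t/ha (hypothetical)
posterior = rng.normal(hypothetical_means, 0.3, size=(n_draws, n_genotypes))

# P(best): fraction of draws in which each genotype has the largest mean.
p_best = np.bincount(posterior.argmax(axis=1), minlength=n_genotypes) / n_draws

# P(top 5): fraction of draws in which each genotype ranks among the five best.
top5 = (-posterior).argsort(axis=1)[:, :5]
p_top5 = np.array([(top5 == g).any(axis=1).mean() for g in range(n_genotypes)])

for g in range(n_genotypes):
    print(f"genotype {g}: P(best) = {p_best[g]:.3f}, P(top 5) = {p_top5[g]:.3f}")
```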
Abstract:
We prove the equivalence of many-gluon Green's functions in the Duffin-Kemmer-Petiau and Klein-Gordon-Fock statistical quantum field theories. The proof is based on the functional integral formulation of the statistical generating functional in a finite-temperature quantum field theory. As an illustration, we calculate one-loop polarization operators in both theories and show that their expressions indeed coincide.
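For orientation, the finite-temperature (imaginary-time) generating functional referred to here has the standard schematic form below, with periodic boundary conditions for bosonic fields; the specific DKP and KGF field content of the paper is not spelled out.

```latex
\[ Z_\beta[J] \;=\; \int_{\phi(0)=\phi(\beta)} \mathcal{D}\phi\;
   \exp\!\left\{ -\int_0^{\beta}\! d\tau \int d^3x\,
   \Big( \mathcal{L}_E(\phi,\partial\phi) - J\phi \Big) \right\} \]
```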
Abstract:
Enterprises need continuous product development activities to remain competitive in the marketplace. Their product development process (PDP) must manage stakeholders' needs - technical, financial, legal, and environmental aspects, customer requirements, corporate strategy, etc. - making it a multidisciplinary and strategic issue. An approach that uses real options to support the decision-making process in the PDP phases is taken. The real option valuation method is often presented as an alternative to the conventional net present value (NPV) approach. It is based on the same principles as financial options: the right to buy or sell a financial asset (mostly stocks) at a predetermined price, with no obligation to do so. In the PDP, a multi-period approach takes into account the flexibility of, for instance, being able to postpone prototyping and design decisions while waiting for more information about technologies, customer acceptance, funding, etc. In the present article, the state of the art of real options theory is surveyed and a model for using real options in the PDP is proposed, so that financial aspects can be properly considered at each project phase of product development. The conclusion is that such a model can provide more robustness to the decision processes within the PDP.
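As a worked illustration of the flexibility value that real options add over a static NPV, the sketch below prices a one-period option to defer a product launch with hypothetical figures; the article's multi-period PDP model is not reproduced here.

```python
# One-period binomial valuation of an option to defer a product launch
# (hypothetical figures; illustrative only).
up, down = 1.4, 0.7        # possible moves of the project's present value
risk_free = 0.05
pv_project = 100.0         # today's PV of expected launch cash flows
investment = 95.0          # cost of launching (strike of the real option)

# Risk-neutral probability of the up move.
q = (1 + risk_free - down) / (up - down)

# Value of launching today (static NPV) versus value of waiting one period.
npv_now = pv_project - investment
value_up = max(pv_project * up - investment, 0.0)
value_down = max(pv_project * down - investment, 0.0)
value_wait = (q * value_up + (1 - q) * value_down) / (1 + risk_free)

print(f"NPV of launching now: {npv_now:.2f}")
print(f"Value of waiting (deferral option): {value_wait:.2f}")
print(f"Option premium over static NPV: {value_wait - npv_now:.2f}")
```

With these numbers the value of waiting exceeds the immediate NPV, which is the kind of effect the proposed model is meant to capture at each PDP phase.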
Abstract:
A methodology to define favorable areas in petroleum and mineral exploration is applied, which consists of weighting the exploratory variables in order to characterize their importance as exploration guides. The exploration data are spatially integrated in the selected area to establish the association between variables and deposits, and the relationships among the distribution, topology, and indicator pattern of all variables. Two methods of statistical analysis were compared. The first is Weights of Evidence Modeling, a conditional probability approach (Agterberg, 1989a), and the second is Principal Components Analysis (Pan, 1993). In the conditional method, the favorability estimation is based on the probability of joint occurrence of deposits and variables, with the weights defined as natural logarithms of likelihood ratios. In the multivariate analysis, the cells which contain deposits are selected as control cells and the weights are determined by eigendecomposition, being represented by the coefficients of the eigenvector associated with the system's largest eigenvalue. The two weighting techniques and complementary procedures were tested in two case studies: 1. the Recôncavo Basin, Northeast Brazil (for petroleum), and 2. the Itaiacoca Formation of the Ribeira Belt, Southeast Brazil (for Pb-Zn Mississippi Valley Type deposits). The applied methodology proved to be easy to use and of great assistance in predicting favorability over large areas, particularly in the initial phase of exploration programs. © 1998 International Association for Mathematical Geology.
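For reference, the weights in Weights of Evidence Modeling take the standard log-likelihood-ratio form below, with B the presence of a binary evidential pattern and D the presence of a deposit; the posterior log-odds of a deposit in a cell add the weights of the patterns present (or absent) to the prior log-odds.

```latex
\[ W^{+} = \ln\frac{P(B \mid D)}{P(B \mid \bar{D})},
   \qquad
   W^{-} = \ln\frac{P(\bar{B} \mid D)}{P(\bar{B} \mid \bar{D})} \]
```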