893 results for Two-stage MCMC method
Abstract:
Mixed models may be defined with or without reference to sampling, and can be used to predict realized random effects, as when estimating the latent values of study subjects measured with response error. When the model is specified without reference to sampling, a simple mixed model includes two random variables, one stemming from an exchangeable distribution of latent values of study subjects and the other, from the study subjects' response error distributions. Positive probabilities are assigned to both potentially realizable responses and artificial responses that are not potentially realizable, resulting in artificial latent values. In contrast, finite population mixed models represent the two-stage process of sampling subjects and measuring their responses, where positive probabilities are only assigned to potentially realizable responses. A comparison of the estimators over the same potentially realizable responses indicates that the optimal linear mixed model estimator (the usual best linear unbiased predictor, BLUP) is often (but not always) more accurate than the comparable finite population mixed model estimator (the FPMM BLUP). We examine a simple example and provide the basis for a broader discussion of the role of conditioning, sampling, and model assumptions in developing inference.
Abstract:
When missing data occur in studies designed to compare the accuracy of diagnostic tests, a common, though naive, practice is to base the comparison of sensitivity, specificity, as well as of positive and negative predictive values on some subset of the data that fits into methods implemented in standard statistical packages. Such methods are usually valid only under the strong missing completely at random (MCAR) assumption and may generate biased and less precise estimates. We review some models that use the dependence structure of the completely observed cases to incorporate the information of the partially categorized observations into the analysis and show how they may be fitted via a two-stage hybrid process involving maximum likelihood in the first stage and weighted least squares in the second. We indicate how computational subroutines written in R may be used to fit the proposed models and illustrate the different analysis strategies with observational data collected to compare the accuracy of three distinct non-invasive diagnostic methods for endometriosis. The results indicate that even when the MCAR assumption is plausible, the naive partial analyses should be avoided.
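The two-stage hybrid fit described above can be illustrated with a minimal sketch (here in Python rather than R): the first stage yields ML estimates of the accuracy measures and their covariance, and the second stage fits a linear model to those estimates by weighted least squares. All numbers below are hypothetical and are not taken from the endometriosis study.

```python
import numpy as np

# First stage (illustrative): ML estimates of the sensitivities of three
# diagnostic tests, together with their estimated covariance matrix.
# Values are hypothetical; independent tests give a diagonal covariance.
F = np.array([0.82, 0.78, 0.85])        # estimated sensitivities
V = np.diag([0.0021, 0.0025, 0.0018])   # estimated covariance of F

# Second stage: weighted least squares fit of the linear model F = X beta.
# Testing H0 "equal sensitivities" amounts to fitting an intercept-only model.
X = np.ones((3, 1))
W = np.linalg.inv(V)                     # weight = inverse covariance
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ F)   # WLS estimate
resid = F - X @ beta
wald = resid @ W @ resid                 # Wald-type statistic for H0
```

Under H0 the Wald-type statistic is asymptotically chi-squared with 2 degrees of freedom, so the equal-sensitivity hypothesis can be assessed directly from the second-stage fit.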
Abstract:
Prediction of random effects is an important problem with expanding applications. In the simplest context, the problem corresponds to prediction of the latent value (the mean) of a realized cluster selected via two-stage sampling. Recently, Stanek and Singer [Predicting random effects from finite population clustered samples with response error. J. Amer. Statist. Assoc. 99, 119-130] developed best linear unbiased predictors (BLUP) under a finite population mixed model that outperform BLUPs from mixed models and superpopulation models. Their setup, however, does not allow for unequally sized clusters. To overcome this drawback, we consider an expanded finite population mixed model based on a larger set of random variables that span a higher dimensional space than those typically applied to such problems. We show that BLUPs for linear combinations of the realized cluster means derived under such a model have considerably smaller mean squared error (MSE) than those obtained from mixed models, superpopulation models, and finite population mixed models. We motivate our general approach by an example developed for two-stage cluster sampling and show that it faithfully captures the stochastic aspects of sampling in the problem. We also consider simulation studies to illustrate the increased accuracy of the BLUP obtained under the expanded finite population mixed model. (C) 2007 Elsevier B.V. All rights reserved.
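The mixed-model BLUP that serves as the benchmark in this comparison has a simple shrinkage form in the balanced, equal-cluster-size case. Below is a minimal sketch with simulated data and hypothetical variance components; the expanded finite population mixed model predictor itself requires the finite population machinery and is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-stage sample: n clusters, m units measured per cluster.
n, m = 10, 5
sigma2_b, sigma2_e = 4.0, 9.0               # between-cluster and response-error variances
b = rng.normal(0, np.sqrt(sigma2_b), n)     # realized cluster effects
y = 20 + b[:, None] + rng.normal(0, np.sqrt(sigma2_e), (n, m))

ybar = y.mean(axis=1)                        # cluster sample means
mu_hat = ybar.mean()                         # overall mean
k = sigma2_b / (sigma2_b + sigma2_e / m)     # shrinkage factor
blup = mu_hat + k * (ybar - mu_hat)          # textbook mixed-model BLUP of cluster means
```

Each predicted cluster mean is pulled toward the overall mean, with less shrinkage as the within-cluster sample size m grows.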
Abstract:
In this thesis, one of the current control algorithms for the R744 cycle, which tries to optimize the performance of the system by two SISO control loops, is compared to a cost-effective system with just one actuator. The operation of a key component of this system, a two-stage orifice expansion valve, is examined in a range of typical climate conditions. One alternative control loop for this system, which has been proposed by the Behr group, is also scrutinized. The simulation results affirm the preference for using two control loops instead of one, but refute the advantages of the Behr alternative control approach over one-loop control. As far as the economic considerations of the A/C unit are concerned, the use of a two-stage orifice expansion valve is desired by the automotive industry, so based on the experimental results, an improved control logic for this system is proposed. In the second part, it is investigated whether the one-actuator control approach is applicable to a system consisting of two parallel evaporators to allow passengers to control different climate zones. The simulation results show that, in the case of using a two-stage orifice valve for the front evaporator and a fixed expansion valve for the rear one, a proper distribution of the cooling power between the front and rear compartments is possible for a broad range of climate conditions.
Abstract:
This thesis is an application of the Almost Ideal Demand System approach of Deaton and Muellbauer (1980) to a particular pharmaceutical, Citalopram, in which Gorman's (1971) multi-stage budgeting approach is applied, since it is one of the most useful approaches for estimating demand for differentiated products. Citalopram is an antidepressant drug used in the treatment of major depression. As for most other pharmaceuticals whose patents have expired, there exist branded and generic versions of Citalopram. This paper aims to define its demand system with two-stage models for the branded version and five generic versions, and to show whether the generic versions are able to compete with the branded one. I calculated the own-price elasticities, which made it possible to compare and draw conclusions about consumers' choices between the branded and generic drugs. Although the models could be developed further with additional variables, the estimation results and the uncompensated price elasticities indicated that the branded version still has market power, and that generics are able to compete with lower prices. One important point to take into consideration is that the Swedish pharmaceutical market underwent a reform on October 1, 2002, aimed at making consumers better informed about prices and at decreasing overall expenditures on pharmaceuticals. Since there were not enough generic sales before the reform to include in the calculations, my paper covers sales after the reform.
Abstract:
The Mauri Model DMF is unique in its approach to the management of water resources: the framework offers a transparent and inclusive way of considering the environmental, economic, social and cultural aspects of the decisions being contemplated, is capable of including multiple worldviews, and adopts mauri (intrinsic value or well-being) in place of the more common monetised assessments of pseudo-sustainability based on Cost Benefit Analysis. The Mauri Model DMF uses a two-stage process that first identifies participants' worldviews and inherent biases regarding water resource management, and then facilitates transparent assessment of selected sustainability performance indicators. The assessment can then be contemplated as the separate environmental, economic, social and cultural dimensions of the decision, and collectively as an overall result; or the priorities associated with different worldviews can be applied to determine the sensitivity of the result to different cultural contexts or worldviews.
Abstract:
This paper measures the degree of segmentation in the Brazilian labor market. Controlling for observable and unobservable characteristics, workers earn more in the formal sector, which supports the segmentation hypothesis. We break down the degree of segmentation by socio-economic attributes to identify the groups in which this phenomenon is more prevalent. We investigate the robustness of our findings to the inclusion of self-employed individuals, and apply a two-stage panel probit model using a self-selection correction strategy to investigate a potential weakness of the fixed-effects estimator.
Abstract:
This study analyzes how the competence of maintenance workers can affect the reliability of production equipment in a maintenance outsourcing process, specifically examining the outsourcing of electrical and instrumentation maintenance at the Continuous Casting unit of CST - Companhia Siderúrgica de Tubarão. To this end, two different historical moments were analyzed: before and after the outsourcing of maintenance execution. Two different diagnostic methods were used for the competence analyses: the first based on the behavioral studies proposed by David C. McClelland and compiled by Spencer & Spencer (1993); the second, a proposal by the author himself, named the locus of competence, based on existing practices and drawing on the most recent developments in competence theory in the current phase of work rationalization. The research was explanatory, methodological and applied, carried out through field research, and was also bibliographic, documentary and participant-based. The results showed that although the competence of the electrical and instrumentation maintenance teams of the CST Continuous Casting unit decreased with outsourcing, this did not affect the reliability of the production equipment of the plant. Finally, some points that could explain this finding are raised at the end of the study.
Abstract:
This article studies the relationship between corruption and discretion in public spending by answering the following question: do stricter procurement rules, a proxy for discretion, result in a lower prevalence of corruption in Brazilian municipalities? The empirical strategy is a two-stage least squares (2SLS) regression approach estimated locally at each procurement-rule threshold, with corruption data drawn from the CGU's random audits program (Programa de Fiscalização por Sorteio) and discretion data derived from Law 8,666/93, which regulates purchasing and civil construction procurement at all levels of government. The results show, however, that less discretion is associated with more corruption for almost all of the thresholds imposed by the procurement law.
Abstract:
This dissertation deals with the problem of making inference when there is weak identification in instrumental variables regression models. More specifically, we are interested in one-sided hypothesis testing for the coefficient of the endogenous variable when the instruments are weak. The focus is on conditional tests based on likelihood ratio, score and Wald statistics. Theoretical and numerical work shows that the conditional t-test based on the two-stage least squares (2SLS) estimator performs well even when instruments are weakly correlated with the endogenous variable. The conditional approach corrects its size uniformly, and when the population F-statistic is as small as two, its power is near the power envelopes for similar and non-similar tests. This finding is surprising considering the poor performance of the two-sided conditional t-tests found in Andrews, Moreira and Stock (2007). Given this counterintuitive result, we propose novel two-sided t-tests which are approximately unbiased and can perform as well as the conditional likelihood ratio (CLR) test of Moreira (2003).
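The 2SLS estimator underlying the conditional t-test can be sketched in a few lines. This is a generic textbook illustration with simulated data (a deliberately strong instrument), not the conditional testing procedure itself, which additionally requires conditioning on a sufficient statistic under weak instruments.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated just-identified IV model: y = beta*x + u, with x endogenous
# (corr(x, u) != 0) and z a valid instrument.  All parameters hypothetical.
n, beta_true = 5000, 1.0
z = rng.normal(size=n)
u = rng.normal(size=n)
x = 0.5 * z + 0.8 * u + rng.normal(size=n)   # endogenous regressor
y = beta_true * x + u

# Stage 1: project x on the instrument; Stage 2: regress y on fitted values.
Z = np.column_stack([np.ones(n), z])
x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
Xh = np.column_stack([np.ones(n), x_hat])
beta_2sls = np.linalg.lstsq(Xh, y, rcond=None)[0][1]

# Naive OLS benchmark, biased upward here because corr(x, u) > 0.
X = np.column_stack([np.ones(n), x])
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0][1]
```

With a strong instrument the 2SLS estimate recovers the structural coefficient while OLS does not; the weak-instrument case, where this breaks down, is precisely what motivates the conditional tests studied in the dissertation.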
Abstract:
This dissertation focuses on spatial stochastic processes defined on a lattice, the so-called Cliff & Ord-type models. My contribution consists of using Edgeworth and saddlepoint approximations to investigate the finite-sample properties of tests for detecting spatial dependence in SAR (spatial autoregressive) models, and of proposing a new class of spatial econometric models in which the parameters affecting the mean structure are distinct from those in the variance structure of the process. This allows a clearer interpretation of the model parameters and generalizes the taxonomy proposed by Anselin (2003). I propose an estimator for the model parameters and derive its asymptotic distribution. The model suggested in the dissertation provides an interesting interpretation of the SARAR model, which is common in the literature. The investigation of the finite-sample properties of the tests extends the literature by allowing the neighborhood matrix of the spatial process to be a nonlinear function of the spatial dependence parameter. The use of approximations instead of simulations (more common in the literature) provides an easy way to compare the properties of tests with different neighborhood matrices and to correct size when comparing the power of tests. I obtain an optimal invariant test that is also locally uniformly most powerful invariant (LUMPI). I construct the power envelope for the LUMPI test and show that it is virtually UMP, since its power is very close to the envelope (for the spatial structures considered in the dissertation). I suggest a practical procedure for constructing a test with good power in a range of situations in which the LUMPI test may not have good properties.
I conclude that the power of the test increases with the sample size and with the spatial dependence parameter (in line with the literature). However, I dispute the consensus view that the power of the test decreases as the neighborhood matrix becomes denser. This reflects a common measurement error in the literature, since the statistical distance between the null and the alternative hypotheses varies greatly with the structure of the matrix. After making this correction, I conclude that the power of the test increases with the distance from the alternative to the null, as expected.
Abstract:
This paper studies the effects of generic drug entry on the bidding behavior of drug suppliers in procurement auctions for pharmaceuticals, and the consequences for the prices procurers pay for drugs. Using a unique data set on procurement auctions for off-patent drugs organized by Brazilian public bodies, we surprisingly find no statistically significant difference between bids and prices paid for generic and branded drugs. On the other hand, some branded drug suppliers leave auctions in which there is a supplier of generics, whereas the remaining ones lower their bids. These findings explain why we find that the presence of any supplier of generic drugs in a procurement auction reduces the price paid for pharmaceuticals by 7 percent. To overcome potential estimation bias due to the endogeneity of generic entry, we exploit variation in the number of days between the drug's patent expiration date and the tendering session. The two-stage estimations document the same pattern as the generalized least squares estimations. This evidence indicates that generic competition affects branded suppliers' behavior in public procurement auctions differently from other markets.
Abstract:
The estimation of labor supply elasticities has been an important issue in the economic literature. Yet all works have estimated conditional mean labor supply functions only. The objective of this paper is to obtain more information on labor supply by estimating the conditional quantile labor supply function. We use a sample of prime-age urban male employees in Brazil. Two-stage estimators are used, as the net wage and virtual income are found to be endogenous to the model. Contrary to previous works using conditional mean estimators, it is found that labor supply elasticities vary significantly and asymmetrically across hours of work. While the income and wage elasticities at the standard work week are zero, for those working longer hours the elasticities are negative.
Abstract:
We study N-bidder, asymmetric all-pay auctions under incomplete information. First, we solve for the equilibrium of a parametric model. Each bidder's valuation is independently drawn from a uniform [0, αi] distribution, where the parameter αi may vary across bidders. In this game, asymmetries are exogenously given. Next, a two-stage game where asymmetries are endogenously generated is studied. At the first stage, each bidder chooses the level of an observable, costly, value-enhancing action. The second stage is the bidding sub-game, whose equilibrium is simply the equilibrium of the previously analyzed game with exogenous asymmetries. Finally, natural applications of the all-pay auction in the context of political lobbying are considered: the effects of excluding bidders, as well as the impact of caps on bids.
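For intuition, the symmetric special case of this model, n bidders with valuations drawn iid from a uniform [0, 1] distribution (all αi equal), has the well-known closed-form equilibrium bid b(v) = (n − 1)vⁿ/n. The asymmetric equilibrium studied in the paper does not reduce to this formula; the sketch below covers only the symmetric benchmark.

```python
def all_pay_bid(v: float, n: int) -> float:
    """Equilibrium bid in a symmetric all-pay auction with n bidders and
    valuations iid uniform on [0, 1]: b(v) = (n - 1) * v**n / n."""
    return (n - 1) * v ** n / n

# Bids are increasing in the bidder's valuation v and, for a fixed v,
# reflect how competition reshapes incentives as n grows.
bids = [all_pay_bid(0.8, n) for n in (2, 3, 5)]
```

For two bidders the formula gives b(v) = v²/2, so a bidder with valuation 0.5 bids 0.125: each bidder pays their bid regardless of winning, which pushes equilibrium bids well below valuations.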
Abstract:
This article proposes an alternative methodology for estimating the effects of non-tariff measures (NTM) on trade flows, based on the recent literature on gravity models. A two-stage Heckman selection model is applied to the case of Brazilian exports, where the second-stage gravity equation is theoretically grounded on the seminal Melitz model of heterogeneous firms. This extended gravity equation highlights the role played by zero trade flows as well as firm heterogeneity in explaining bilateral trade among countries, two factors usually omitted in traditional gravity specifications found in the previous literature. Finally, it also proposes an economic rationale for the effects of NTM on trade flows, helping to shed some light on their main operating channels under a rather simple Cournot duopoly framework.
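The two-stage Heckman procedure can be sketched as a probit selection equation (trade occurs or not) followed by an outcome regression augmented with the inverse Mills ratio from the first stage. The sketch below uses simulated data with hypothetical parameters, not the Brazilian export data, and a single regressor for brevity.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(2)

# Simulated selection model: the outcome y (say, log trade flow) is only
# observed when the selection latent index is positive (nonzero trade).
# Correlated errors induce the selection bias the procedure corrects.
n = 20000
x = rng.normal(size=n)
e = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.5], [0.5, 1.0]], size=n)
s = (0.5 + 1.0 * x + e[:, 0] > 0)            # selection indicator
y = 1.0 + 2.0 * x + e[:, 1]                  # outcome equation

# Stage 1: probit for selection, fitted by maximum likelihood.
Z = np.column_stack([np.ones(n), x])
def nll(g):
    idx = Z @ g
    return -np.sum(norm.logcdf(np.where(s, idx, -idx)))
g_hat = minimize(nll, np.zeros(2), method="BFGS").x

# Stage 2: OLS on the selected sample, augmented with the inverse Mills
# ratio; its coefficient absorbs the selection term rho * sigma.
idx = Z @ g_hat
imr = norm.pdf(idx) / norm.cdf(idx)
Xs = np.column_stack([np.ones(s.sum()), x[s], imr[s]])
beta = np.linalg.lstsq(Xs, y[s], rcond=None)[0]
```

Dropping the inverse Mills ratio column reduces the second stage to a naive OLS on the selected sample, which is inconsistent whenever the two error terms are correlated; in the gravity setting, that is exactly the bias created by omitting the zero trade flows.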