989 results for uncertainty estimation
Abstract:
Magdeburg, Univ., Faculty of Computer Science, Diss., 2015
Abstract:
This comment corrects the errors in the estimation process that appear in Martins (2001). The first error is in the parametric probit estimation, as the previously presented results do not maximize the log-likelihood function. In the global maximum more variables become significant. As for the semiparametric estimation method, the kernel function used in Martins (2001) can take on both positive and negative values, which implies that the participation probability estimates may be outside the interval [0,1]. We have solved the problem by applying local smoothing in the kernel estimation, as suggested by Klein and Spady (1993).
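The point about sign-changing kernels can be made concrete. The sketch below is illustrative, not the authors' code: it estimates a participation probability with a Nadaraya-Watson smoother using a Gaussian (everywhere nonnegative) kernel, so the estimate is a convex combination of binary outcomes and cannot leave [0,1]; a higher-order kernel with negative lobes loses that guarantee. The data-generating process and bandwidth are assumptions for the example.

```python
import numpy as np

def nw_probability(x0, x, y, h=0.5):
    """Nadaraya-Watson estimate of P(y = 1 | x = x0) with a Gaussian kernel.

    Because the Gaussian kernel is nonnegative and y takes values in {0, 1},
    the estimate is a weighted average of the y's and stays in [0, 1] --
    the property that sign-changing kernels can violate.
    """
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)  # nonnegative weights
    return float(np.sum(w * y) / np.sum(w))

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = (x + rng.normal(size=200) > 0).astype(float)  # binary participation
p = nw_probability(0.3, x, y)
```

Any bandwidth choice yields a valid probability here; the local-smoothing correction discussed in the comment addresses bias, not validity of the range.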
Abstract:
I analyze an economy with uncertainty in which a set of indivisible objects and a certain amount of money are to be distributed among agents. The set of intertemporally fair social choice functions based on envy-freeness and Pareto efficiency is characterized. I give a necessary and sufficient condition for its non-emptiness and propose a mechanism that implements the set of intertemporally fair allocations in Bayes-Nash equilibrium. Implementation at the ex ante stage is considered, too. I also generalize the existence result obtained with envy-freeness using a broader fairness concept, introducing the aspiration function.
Abstract:
We report on a series of experiments that test the effects of an uncertain supply on the formation of bids and prices in sequential first-price auctions with independent private values and unit demands. Supply is uncertain when buyers do not know the exact number of units to be sold (i.e., the length of the sequence). Although we observe non-monotone behavior when supply is certain, as well as substantial overbidding, the data qualitatively support our price trend predictions and the risk neutral Nash equilibrium model of bidding for the last stage of a sequence, whether supply is certain or not. Our study shows that behavior in these markets changes significantly with the presence of an uncertain supply, and that it can be explained by assuming that bidders formulate pessimistic beliefs about the occurrence of another stage.
Abstract:
We study the relation between the number of firms and price-cost margins under price competition with uncertainty about competitors' costs. We present results of an experiment in which two, three and four identical firms repeatedly interact in this environment. In line with the theoretical prediction, market prices decrease with the number of firms, but on average stay above marginal costs. Pricing is less aggressive in duopolies than in triopolies and tetrapolies. However, independently of the number of firms, pricing is more aggressive than in the theoretical equilibrium. Both the absolute and the relative surpluses increase with the number of firms. Total surplus is close to the equilibrium level, since enhanced consumer surplus through lower prices is counteracted by occasional displacements of the most efficient firm in production.
Abstract:
Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Since conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. Monte Carlo results show that the estimator performs well in comparison to other estimators that have been proposed for estimation of general DLV models.
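The core idea of the abstract above can be sketched in a few lines: simulate a long path from the model at a trial parameter value, then recover conditional moments by kernel regression on the simulated draws rather than by conditioning the simulator itself. Everything below (the linear toy model, bandwidth, sample size) is an illustrative assumption, not the estimator from the paper.

```python
import numpy as np

def kernel_conditional_mean(x_grid, x_sim, y_sim, h=0.1):
    """Estimate E[y | x] on x_grid by Gaussian-kernel smoothing of a long simulation."""
    out = np.empty(len(x_grid))
    for i, x0 in enumerate(x_grid):
        w = np.exp(-0.5 * ((x_sim - x0) / h) ** 2)  # kernel weights around x0
        out[i] = np.sum(w * y_sim) / np.sum(w)
    return out

# Toy simulable model: y = theta * x + noise. We pretend the conditional
# moment E[y | x] has no closed form and must be smoothed from simulation.
def simulate(theta, n, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.normal(size=n)
    return x, theta * x + rng.normal(scale=0.5, size=n)

x_sim, y_sim = simulate(theta=1.5, n=50_000)
grid = np.array([-1.0, 0.0, 1.0])
m_hat = kernel_conditional_mean(grid, x_sim, y_sim)
```

In a method-of-moments step, `m_hat` would be matched against the corresponding conditional moments computed from data, with the trial parameter updated until the two agree.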
Abstract:
Restriction site-associated DNA sequencing (RADseq) provides researchers with the ability to record genetic polymorphism across thousands of loci for nonmodel organisms, potentially revolutionizing the field of molecular ecology. However, as with other genotyping methods, RADseq is prone to a number of sources of error that may have consequential effects for population genetic inferences, and these have received only limited attention in terms of the estimation and reporting of genotyping error rates. Here we use individual sample replicates, under the expectation of identical genotypes, to quantify genotyping error in the absence of a reference genome. We then use sample replicates to (i) optimize de novo assembly parameters within the program Stacks, by minimizing error and maximizing the retrieval of informative loci; and (ii) quantify error rates for loci, alleles and single-nucleotide polymorphisms. As an empirical example, we use a double-digest RAD data set of a nonmodel plant species, Berberis alpina, collected from high-altitude mountains in Mexico.
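The replicate-based error estimate described above reduces to a simple comparison: under the expectation of identical genotypes, any disagreement between two replicates of the same sample at a shared locus counts as an error. The sketch below is a minimal illustration with made-up locus names and genotype calls; it is not the Stacks-based pipeline from the study.

```python
def genotype_error_rate(rep1, rep2):
    """Per-locus genotyping error rate between two replicates of one sample.

    rep1, rep2: dicts mapping locus ID -> genotype call (e.g. 'A/T').
    Only loci called in both replicates are compared; under the expectation
    of identical genotypes, any mismatch is counted as an error.
    """
    shared = set(rep1) & set(rep2)
    if not shared:
        return float('nan'), 0
    mismatches = sum(rep1[locus] != rep2[locus] for locus in shared)
    return mismatches / len(shared), len(shared)

rep1 = {'loc1': 'A/A', 'loc2': 'A/T', 'loc3': 'G/G', 'loc4': 'C/C'}
rep2 = {'loc1': 'A/A', 'loc2': 'A/A', 'loc3': 'G/G'}  # loc2 disagrees
rate, n_shared = genotype_error_rate(rep1, rep2)
# 1 mismatch out of 3 shared loci
```

Assembly-parameter optimization then amounts to rerunning this comparison across parameter settings and choosing the one that minimizes the error rate while maximizing the number of informative shared loci.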
Abstract:
According to the hypothesis of Traub, also known as the 'formula of Traub', postmortem values of glucose and lactate found in the cerebrospinal fluid or vitreous humor are considered indicators of antemortem blood glucose levels. However, because the lactate concentration increases in the vitreous and cerebrospinal fluid after death, some authors postulated that using the sum value to estimate antemortem blood glucose levels could lead to an overestimation of the cases of glucose metabolic disorders with fatal outcomes, such as diabetic ketoacidosis. The aim of our study, performed on 470 consecutive forensic cases, was to ascertain the advantages of the sum value to estimate antemortem blood glucose concentrations and, consequently, to rule out fatal diabetic ketoacidosis as the cause of death. Other biochemical parameters, such as blood 3-beta-hydroxybutyrate, acetoacetate, acetone, glycated haemoglobin and urine glucose levels, were also determined. In addition, postmortem native CT scan, autopsy, histology, neuropathology and toxicology were performed to confirm diabetic ketoacidosis as the cause of death. According to our results, the sum value does not add any further information for the estimation of antemortem blood glucose concentration. The vitreous glucose concentration appears to be the most reliable marker to estimate antemortem hyperglycaemia and, along with the determination of other biochemical markers (such as blood acetone and 3-beta-hydroxybutyrate, urine glucose and glycated haemoglobin), to confirm diabetic ketoacidosis as the cause of death.
Abstract:
The dispersal process, by which individuals or other dispersing agents such as gametes or seeds move from birthplace to a new settlement locality, has important consequences for the dynamics of genes, individuals, and species. Many of the questions addressed by ecology and evolutionary biology require a good understanding of species' dispersal patterns. Much effort has thus been devoted to overcoming the difficulties associated with dispersal measurement. In this context, genetic tools have long been the focus of intensive research, providing a great variety of potential solutions to measuring dispersal. This methodological diversity is reviewed here to help (molecular) ecologists find their way toward dispersal inference and interpretation and to stimulate further developments.
Abstract:
BACKGROUND: Recommendations for statin use for primary prevention of coronary heart disease (CHD) are based on estimation of the 10-year CHD risk. We compared the 10-year CHD risk assessments and eligibility percentages for statin therapy using three scoring algorithms currently used in Europe. METHODS: We studied 5683 women and men, aged 35-75, without overt cardiovascular disease (CVD), in a population-based study in Switzerland. We compared the 10-year CHD risk using three scoring schemes, i.e., the Framingham risk score (FRS) from the U.S. National Cholesterol Education Program's Adult Treatment Panel III (ATP III), the PROCAM scoring scheme from the International Atherosclerosis Society (IAS), and the European risk SCORE for low-risk countries, without and with extrapolation to age 60 as recommended by the European Society of Cardiology guidelines (ESC). With FRS and PROCAM, high risk was defined as a 10-year risk of fatal or non-fatal CHD > 20%; with SCORE, as a 10-year risk of fatal CVD ≥ 5%. We compared the proportions of high-risk participants and eligibility for statin use according to these three schemes. For each guideline, we estimated the impact of increased statin use from current partial compliance to full compliance on potential CHD deaths averted over 10 years, using a success proportion of 27% for statins. RESULTS: Participants classified at high risk (both genders) were 5.8% according to FRS and 3.0% according to PROCAM, whereas the European risk SCORE classified 12.5% at high risk (15.4% with extrapolation to age 60). For the primary prevention of CHD, 18.5% of participants were eligible for statin therapy using ATP III, 16.6% using IAS, and 10.3% using ESC (13.0% with extrapolation), because ESC guidelines recommend statin therapy only in high-risk subjects. In comparison with IAS, agreement in identifying adults eligible for statins was good with ATP III, but moderate with ESC.
Using a population perspective, full compliance with ATP III guidelines would avert up to 17.9% of the 24,310 CHD deaths expected over 10 years in Switzerland, 17.3% with IAS and 10.8% with ESC (11.5% with extrapolation). CONCLUSIONS: Full compliance with guidelines for statin therapy would result in substantial health benefits, but the proportions of high-risk adults and of adults eligible for statin use varied substantially depending on the scoring systems and corresponding guidelines used for estimating CHD risk in Europe.
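The headline numbers in the results reduce to straightforward arithmetic on the figures stated in the abstract. The snippet below only reproduces that arithmetic; the study's underlying attributable-risk model (partial vs. full compliance, the 27% success proportion) is not re-implemented here.

```python
# Illustrative arithmetic from the abstract's own figures, not the
# study's full attributable-risk model.
expected_chd_deaths = 24_310  # CHD deaths expected over 10 years in Switzerland
reduction = {'ATP III': 0.179, 'IAS': 0.173, 'ESC': 0.108}  # averted fractions

averted = {g: round(expected_chd_deaths * r) for g, r in reduction.items()}
# e.g. ATP III: ~4,351 deaths averted over 10 years
```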
Abstract:
This paper does two things. First, it presents alternative approaches to the standard methods of estimating productive efficiency using a production function. It favours a parametric approach (viz. the stochastic production frontier approach) over a nonparametric approach (e.g. data envelopment analysis); and, further, one that provides a statistical explanation of efficiency, as well as an estimate of its magnitude. Second, it illustrates the favoured approach (i.e. the ‘single stage procedure’) with estimates of two models of explained inefficiency, using data from the Thai manufacturing sector, after the crisis of 1997. Technical efficiency is modelled as being dependent on capital investment in three major areas (viz. land, machinery and office appliances) where land is intended to proxy the effects of unproductive, speculative capital investment; and both machinery and office appliances are intended to proxy the effects of productive, non-speculative capital investment. The estimates from these models cast new light on the five-year long, post-1997 crisis period in Thailand, suggesting a structural shift from relatively labour intensive to relatively capital intensive production in manufactures from 1998 to 2002.
Abstract:
Recent attempts to incorporate optimal fiscal policy into New Keynesian models subject to nominal inertia have tended to assume that policy makers are benevolent and have access to a commitment technology. A separate literature, on the New Political Economy, has focused on real economies in which policy instruments are used strategically in a world of political conflict. In this paper we combine these literatures and assume that policy is set in a New Keynesian economy by one of two policy makers facing electoral uncertainty (in terms of infrequent elections and an endogenous voting mechanism). The policy makers generally share the social welfare function, but differ in their preferences over fiscal expenditure (in its size and/or composition). Given the environment, policy is realistically constrained to be time-consistent. In a sticky-price economy, such heterogeneity gives rise to the possibility of one policy maker using (nominal) debt strategically to tie the hands of the other party and influence the outcome of any future elections. This can give rise to a deficit bias, implying a sub-optimally high level of steady-state debt, and can also imply a sub-optimal response to shocks. The steady-state distortions and inflation bias this generates, combined with the volatility induced by the electoral cycle in a sticky-price environment, can significantly reduce social welfare.
Abstract:
The paper studies the interaction between cyclical uncertainty and investment in a stochastic real option framework where demand shifts stochastically between three different states, each with different rates of drift and volatility. In our setting the shifts are governed by a three-state Markov switching model with constant transition probabilities. The magnitude of the link between cyclical uncertainty and investment is quantified using simulations of the model. The chief implication of the model is that recessions and financial turmoil are important catalysts for waiting. In other words, our model shows that macroeconomic risk acts as an important deterrent to investments.
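A three-state Markov switching demand process of the kind described above is easy to simulate. The sketch below is a minimal illustration: the drifts, volatilities and constant transition matrix are placeholder values, not the paper's calibration, and the labels for the states are assumptions.

```python
import numpy as np

def simulate_demand(T=1000, dt=0.01, seed=0):
    """Simulate log-demand under a three-state Markov switching diffusion.

    States 0/1/2 stand for expansion / recession / turmoil; the drifts,
    volatilities and transition matrix are illustrative placeholders.
    """
    mu = np.array([0.04, -0.02, -0.05])     # state-specific drift rates
    sigma = np.array([0.10, 0.20, 0.40])    # state-specific volatilities
    P = np.array([[0.980, 0.015, 0.005],    # constant transition probabilities
                  [0.030, 0.950, 0.020],    # (each row sums to 1)
                  [0.020, 0.080, 0.900]])
    rng = np.random.default_rng(seed)
    state = 0
    x = np.zeros(T)
    for t in range(1, T):
        state = rng.choice(3, p=P[state])   # possible regime switch
        x[t] = x[t - 1] + mu[state] * dt + sigma[state] * np.sqrt(dt) * rng.normal()
    return x

path = simulate_demand()
```

In a real-option exercise, paths like this feed a value-of-waiting calculation: higher-volatility states widen the inaction region, which is the "recessions and turmoil are catalysts for waiting" mechanism in the abstract.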