932 results for Uncertainty in governance
Abstract:
The mass of the top quark is measured in a data set corresponding to 4.6 fb−1 of proton-proton collisions at a centre-of-mass energy of √s = 7 TeV collected by the ATLAS detector at the LHC. Events consistent with hadronic decays of top-antitop quark pairs with at least six jets in the final state are selected. The substantial background from multijet production is modelled with data-driven methods that use the number of identified b-quark jets and the transverse momentum of the sixth leading jet, two observables with minimal correlation. The top-quark mass is obtained from template fits to the ratio of the three-jet mass to the dijet mass. The three-jet mass is calculated from the three jets of a top-quark decay; from these three jets, the dijet mass is obtained from the two jets of the W-boson decay. The top-quark mass obtained from this ratio fit is thus less sensitive to the uncertainty in the jet energy measurement. A binned likelihood fit yields a top-quark mass of mt = 175.1 ± 1.4 (stat.) ± 1.2 (syst.) GeV.
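To make the observable concrete, the following minimal Python sketch computes the three-jet and dijet invariant masses and their ratio from jet four-momenta. The momenta below are invented for illustration; the actual analysis relies on full ATLAS event reconstruction and template fits.

import numpy as np

def invariant_mass(jets):
    # Invariant mass of a jet system given as rows of (E, px, py, pz) in GeV.
    E, px, py, pz = jets.sum(axis=0)
    return np.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

# Hypothetical four-momenta for the three jets of one top-quark candidate;
# the first two rows are the jets assigned to the W-boson decay.
top_jets = np.array([
    [145.0,  80.0,  60.0,  90.0],
    [ 98.0, -40.0,  70.0,  50.0],
    [120.0,  30.0, -85.0,  60.0],
])

m_jjj = invariant_mass(top_jets)       # three-jet (top candidate) mass
m_jj = invariant_mass(top_jets[:2])    # dijet (W candidate) mass
print(m_jjj, m_jj, m_jjj / m_jj)       # the ratio entering the template fit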
Abstract:
I analyze an economy with uncertainty in which a set of indivisible objects and a certain amount of money are to be distributed among agents. The set of intertemporally fair social choice functions based on envy-freeness and Pareto efficiency is characterized. I give a necessary and sufficient condition for its non-emptiness and propose a mechanism that implements the set of intertemporally fair allocations in Bayes-Nash equilibrium. Implementation at the ex ante stage is also considered. I also generalize the existence result obtained with envy-freeness to a broader fairness concept by introducing the aspiration function.
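For reference, the two fairness notions the paper combines can be written compactly. Assuming quasi-linear utilities (an illustrative simplification, not necessarily the paper's exact setup), agent i receives object σ(i) and money transfer m_i, and values the pair at v_i(σ(i)) + m_i. Then:

Envy-freeness: v_i(σ(i)) + m_i ≥ v_i(σ(j)) + m_j for all agents i, j; that is, no agent prefers another agent's object-money bundle to her own.

Pareto efficiency: no feasible reassignment of objects and money makes some agent strictly better off without making another worse off.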
Abstract:
One of the most notable characteristics of the change in governance over the past two decades has been the restructuring of the state, most prominently the delegation of authority from politicians and ministries to technocrats and regulatory agencies. Our unique dataset on the extent of these reforms in seven sectors in 36 countries reveals their widespread diffusion in recent decades. In 1986 there were only 23 agencies across these sectors and countries (less than one agency per country); by 2002 this number had increased more than seven-fold, to 169. On average, each of these 36 countries now has more than four agencies in the seven sectors studied. Yet the widespread diffusion of these reforms is characterized by cross-regional and cross-sectoral variations. Our data reveal two major variations: first, reforms are more widespread in economic regulation than in social spheres; second, regulatory agencies in the social spheres are more widespread in Europe than in Latin America. Why these variations in the spread of the reforms? In this paper we present for the first time the regulatory gaps across regions and sectors and then offer some explanations for these gaps in a way that sheds light on the nature of these reforms and on their limits. Our explanatory framework combines diffusion and structural explanations and in doing so sheds new light on the global diffusion of public policy ideas.
Abstract:
Bayesian model averaging (BMA) methods are regularly used to deal with model uncertainty in regression models. This paper shows how to introduce Bayesian model averaging methods into quantile regressions, allowing different predictors to affect different quantiles of the dependent variable. I show that quantile-regression BMA methods can help reduce uncertainty about future inflation outcomes by providing superior predictive densities compared to mean regression models with and without BMA.
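As a rough illustration of the idea, the sketch below fits several candidate quantile regressions and averages their predictions with weights from a check-loss-based information criterion. The data, candidate models, and weighting heuristic are invented stand-ins; the paper's actual prior and posterior model weights may differ.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Hypothetical data: an inflation-like series y and three candidate predictors.
n = 200
X_full = rng.normal(size=(n, 3))
y = 0.5 * X_full[:, 0] + rng.standard_t(df=4, size=n)

q = 0.9                                    # quantile of interest
subsets = [(0,), (1,), (0, 1), (0, 1, 2)]  # candidate predictor sets

bics, preds = [], []
for s in subsets:
    X = sm.add_constant(X_full[:, list(s)])
    res = sm.QuantReg(y, X).fit(q=q)
    u = y - res.predict(X)
    check_loss = np.sum(u * (q - (u < 0)))   # pinball loss at quantile q
    # BIC-style criterion from the asymmetric-Laplace pseudo-likelihood.
    bics.append(2 * n * np.log(check_loss / n) + (len(s) + 1) * np.log(n))
    preds.append(res.predict(X))

bics = np.array(bics)
weights = np.exp(-0.5 * (bics - bics.min()))
weights /= weights.sum()
bma_pred = weights @ np.array(preds)       # model-averaged conditional quantile
print(np.round(weights, 3))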
Abstract:
Traffic forecasts provide essential input for the appraisal of transport investment projects. However, according to recent empirical evidence, long-term predictions are subject to high levels of uncertainty. This paper quantifies uncertainty in traffic forecasts for the tolled motorway network in Spain. Uncertainty is quantified as a confidence interval for the traffic forecast that includes both model uncertainty and input uncertainty, using a stochastic simulation process based on bootstrapping techniques. Furthermore, the paper proposes a new methodology to account for capacity constraints in long-term traffic forecasts. Specifically, we suggest a dynamic model in which the speed of adjustment is related to the ratio between the actual traffic flow and the maximum capacity of the motorway. This methodology is applied to a specific public policy: removing the toll on a motorway section before the concession expires.
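The flavour of the bootstrap-based interval can be conveyed with a toy residual bootstrap around a simple log-linear trend. The series, model, and horizon below are invented; the paper's simulation additionally propagates input uncertainty and capacity constraints.

import numpy as np

rng = np.random.default_rng(1)

# Hypothetical annual traffic series and a log-linear trend model.
years = np.arange(2000, 2020)
t = years - years[0]
traffic = 20_000 * np.exp(0.03 * t) * rng.lognormal(0.0, 0.05, t.size)

X = np.column_stack([np.ones_like(t, dtype=float), t.astype(float)])
beta, *_ = np.linalg.lstsq(X, np.log(traffic), rcond=None)
resid = np.log(traffic) - X @ beta

# Residual bootstrap: refit on resampled series, forecast 10 years ahead,
# adding one more resampled residual for future-year noise.
h = 10
x_future = np.array([1.0, float(t[-1] + h)])
draws = []
for _ in range(2000):
    y_star = X @ beta + rng.choice(resid, size=resid.size, replace=True)
    b_star, *_ = np.linalg.lstsq(X, y_star, rcond=None)
    draws.append(np.exp(x_future @ b_star + rng.choice(resid)))

lo, hi = np.percentile(draws, [2.5, 97.5])
print(f"95% interval for {years[-1] + h}: [{lo:,.0f}, {hi:,.0f}]")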
Abstract:
This article studies how product introduction decisions relate to profitability and uncertainty in the context of multi-product firms and product differentiation. These two features, common to many modern industries, have received little attention in the literature compared to the classical problem of firm entry, even though the determinants of firm entry and product entry are quite different. The theoretical predictions about the sign of the impact of uncertainty on product entry are not conclusive. Therefore, an econometric model relating firms' product introduction decisions to profitability and profit uncertainty is proposed. Firms' estimated profits are obtained from a structural model of product demand and supply, and uncertainty is proxied by the variance of profits. The empirical analysis uses data on the Spanish car industry for the period 1990-2000. The results show a positive relationship between product introduction and profitability, and a negative one with respect to profit variability. Interestingly, the degree of uncertainty appears to be a stronger driving force of entry than profitability, suggesting that the product proliferation process in the Spanish car market may have been mainly a consequence of lower uncertainty rather than the result of a more profitable market.
Keywords: product introduction, entry, uncertainty, multiproduct firms, automobile. JEL codes: L11, L13.
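A minimal version of the kind of discrete-choice entry equation described above can be sketched as follows; the data are simulated and the specification (a probit of entry on expected profit and profit variance) is only a stand-in for the paper's structural estimates.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)

# Simulated firm-product data: expected profit, profit variance, and entry.
n = 500
profit = rng.normal(1.0, 0.5, n)
variance = rng.gamma(2.0, 0.5, n)
latent = 0.8 * profit - 1.2 * variance + rng.normal(size=n)
entry = (latent > 0).astype(int)           # 1 if the product is introduced

X = sm.add_constant(np.column_stack([profit, variance]))
res = sm.Probit(entry, X).fit(disp=0)
# Expected pattern from the paper: positive on profit, negative on variance.
print(res.params)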
Abstract:
Contemporary international migration is embedded in a process of global interconnection defined by the revolutions in transport and in information and communication technologies. One consequence of this global interconnection is that migrants have a greater capacity to process information both before and after leaving. These changes could have unexpected implications for contemporary migration in terms of migrants' capacity to make better-informed decisions, the reduction of uncertainty in migration contexts, the blurring of the concept of distance, or the decision to migrate to more distant places. This research is important because the lack of knowledge on this question could contribute to widening the gap between the objectives of migration policies and their outcomes. The role of information agents in migration contexts could also change. In this scenario, for migration policies to be more effective, they will have to take into account migrants' greater capacity to process information and the information sources they trust. This article shows that the equation more information equals better informed does not always hold. Even in the information age, unreliable sources, false expectations, information overload, and rumours are still present in migration contexts. Nevertheless, we argue that these unintended effects could be reduced by meeting four requirements of reliable information: that it be comprehensive, relevant, trusted, and up to date.
Abstract:
This paper analyzes the optimal behavior of farmers in the presence of direct payments and uncertainty. In an empirical analysis for Switzerland, it confirms previously obtained theoretical results and determines the magnitude of the theoretically predicted effects. The results show that direct payments increase agricultural production by between 3.7% and 4.8%. As an alternative to direct payments, the production effect of tax reductions is evaluated in order to determine its magnitude. The empirical analysis corroborates the theoretical results in the literature and demonstrates that tax reductions are also distorting, but to a substantially lesser degree if losses are not offset. However, tax reductions, regardless of whether losses are offset, lead to higher government spending than pure direct payments.
Abstract:
This paper deals with fault detection and isolation problems for nonlinear dynamic systems. Both problems are stated as constraint satisfaction problems (CSPs) and solved using consistency techniques. The main contribution is an isolation method based on consistency techniques and on refining the uncertainty space of interval parameters. The major advantage of this method is that isolation is fast even when taking into account uncertainty in parameters, measurements, and model errors. Interval calculations free the method from the monotonicity assumption required by several observer-based approaches to fault isolation. An application to a well-known alcoholic fermentation process model is presented.
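The core idea of checking consistency between interval-valued model outputs and measurements, and refining the parameter box accordingly, can be illustrated with a deliberately simple one-parameter model; the paper's CSP formulation and fermentation case study are of course far richer than this sketch, and all numbers below are invented.

# Toy interval consistency check for the model y = a * x with x > 0.

def contract(a_lo, a_hi, x, y_lo, y_hi, tol=1e-6):
    """Contract the parameter interval [a_lo, a_hi] to the values consistent
    with a measured output interval [y_lo, y_hi]."""
    if a_hi * x < y_lo or a_lo * x > y_hi:
        return None  # inconsistent: this fault hypothesis is discarded
    # Bisection on the lower bound: discard a-values whose output is below y_lo.
    lo, hi = a_lo, a_hi
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mid * x < y_lo:
            lo = mid
        else:
            hi = mid
    new_lo = lo
    # Bisection on the upper bound: discard a-values whose output is above y_hi.
    lo, hi = a_lo, a_hi
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mid * x > y_hi:
            hi = mid
        else:
            lo = mid
    return new_lo, hi

# A nominal parameter interval [0.8, 1.2] and a measurement 10.5 <= y <= 11.0
# at x = 10 refine the consistent interval to roughly [1.05, 1.10].
print(contract(0.8, 1.2, 10.0, 10.5, 11.0))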
Abstract:
The speed of fault isolation is crucial for the design and reconfiguration of fault-tolerant control (FTC). In this paper the fault isolation problem is stated as a constraint satisfaction problem (CSP) and solved using constraint propagation techniques. The proposed method is based on constraint satisfaction techniques and on refining the uncertainty space of interval parameters. In comparison with approaches based on adaptive observers, its major advantage is that isolation is fast even when taking into account uncertainty in parameters, measurements, and model errors, and it does not require a monotonicity assumption. The proposed approach is illustrated with a case study of a nonlinear dynamic system.
Abstract:
Aim: Recently developed parametric methods in historical biogeography allow researchers to integrate temporal and palaeogeographical information into the reconstruction of biogeographical scenarios, thus overcoming a known bias of parsimony-based approaches. Here, we compare a parametric method, dispersal-extinction-cladogenesis (DEC), against a parsimony-based method, dispersal-vicariance analysis (DIVA), which does not incorporate branch lengths but accounts for phylogenetic uncertainty through a Bayesian empirical approach (Bayes-DIVA). We analyse the benefits and limitations of each method using the cosmopolitan plant family Sapindaceae as a case study.
Location: World-wide.
Methods: Phylogenetic relationships were estimated by Bayesian inference on a large dataset representing generic diversity within Sapindaceae. Lineage divergence times were estimated by penalized likelihood over a sample of trees from the posterior distribution of the phylogeny to account for dating uncertainty in the biogeographical reconstructions. We compared biogeographical scenarios between Bayes-DIVA and two different DEC models: one with no geological constraints and another that employed a stratified palaeogeographical model in which dispersal rates were scaled according to area connectivity across four time slices, reflecting the changing continental configuration over the last 110 million years.
Results: Despite differences in the underlying biogeographical model, Bayes-DIVA and DEC inferred similar biogeographical scenarios. The main differences were: (1) the timing of dispersal events, which in Bayes-DIVA sometimes conflicts with palaeogeographical information; and (2) the lower frequency of terminal dispersal events inferred by DEC. Uncertainty in divergence time estimates influenced both the inference of ancestral ranges and the decisiveness with which an area can be assigned to a node.
Main conclusions: By considering lineage divergence times, the DEC method gives more accurate reconstructions that agree with palaeogeographical evidence. In contrast, Bayes-DIVA showed the highest decisiveness in unequivocally reconstructing ancestral ranges, probably reflecting its ability to integrate phylogenetic uncertainty. Care should be taken in defining the palaeogeographical model in DEC because of the possibility of overestimating the frequency of extinction events, or of inferring ancestral ranges that lie outside the extant species ranges, owing to the dispersal constraints enforced by the model. The wide-spanning spatial and temporal model proposed here could prove useful for testing large-scale biogeographical patterns in plants.
Abstract:
How do organizations cope with extreme uncertainty? The existing literature is divided on this issue: some argue that organizations deal best with uncertainty in the environment by reproducing it in the organization, whereas others contend that the organization should be protected from the environment. In this paper we study the case of a Wall Street investment bank that lost its entire office and trading technology in the terrorist attack of September 11th. The traders survived, but were forced to relocate to a makeshift trading room in New Jersey. During the six months the traders spent outside New York City, they had to deal with fears and insecurities inside the company as well as outside it: anxiety about additional attacks, questions of professional identity, doubts about the future of the firm, and ambiguities about the future relocation of the trading room. The firm overcame these uncertainties by protecting the traders' identities and their ability to engage in sensemaking. The organization held together through a leadership style that managed ambiguities and created the conditions for new solutions to emerge.
Abstract:
The paper proposes an approach for detecting optimal combinations of model parameters to achieve the most representative description of uncertainty in model performance. A classification problem is posed to find the regions of good-fitting models according to the values of a cost function. Support vector machine (SVM) classification in the parameter space is applied to decide whether a forward model simulation should be computed for a particular generated model. SVMs are particularly suited to classification problems in high-dimensional spaces in a non-parametric and non-linear way. The SVM decision boundaries determine the regions that are subject to the largest uncertainty in the cost-function classification and therefore provide guidelines for further iterative exploration of the model space. The proposed approach is illustrated with a synthetic example of fluid flow through porous media, which features a highly variable response due to combinations of parameter values.
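The following sketch shows the flavour of the approach with scikit-learn: label cheaply evaluated parameter samples as good or bad fits, train an SVM, and use the decision-function margin to pick the most ambiguous candidate parameters for the next round of forward simulations. The cost function, threshold, and dimensions are invented stand-ins for the paper's fluid-flow setting.

import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(3)

# Hypothetical 2-D parameter samples with a toy misfit (cost) function.
params = rng.uniform(-1.0, 1.0, size=(300, 2))
cost = params[:, 0] ** 2 + params[:, 1] ** 2
good = (cost < 0.4).astype(int)            # "good-fitting model" label

clf = SVC(kernel="rbf", gamma=2.0).fit(params, good)

# Candidates nearest the decision boundary are classified with the largest
# uncertainty: schedule full forward simulations for those first.
candidates = rng.uniform(-1.0, 1.0, size=(1000, 2))
margin = np.abs(clf.decision_function(candidates))
to_simulate = candidates[np.argsort(margin)[:20]]
print(to_simulate[:5])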
Abstract:
The vast territories that were radioactively contaminated during the 1986 Chernobyl accident provide a substantial data set of radioactive monitoring data, which can be used for the verification and testing of the different spatial estimation (prediction) methods involved in risk assessment studies. Using the Chernobyl data set for such a purpose is motivated by its heterogeneous spatial structure (the data are characterized by large-scale correlations, short-scale variability, spotty features, etc.). The present work concerns the application of the Bayesian Maximum Entropy (BME) method to estimate the extent and the magnitude of the radioactive soil contamination by 137Cs due to the Chernobyl fallout. The BME method allows rigorous incorporation of a wide variety of knowledge bases into the spatial estimation procedure, leading to informative contamination maps. Exact measurements ("hard" data) are combined with secondary information on local uncertainties (treated as "soft" data) to generate science-based uncertainty assessments of soil contamination estimates at unsampled locations. BME describes uncertainty in terms of posterior probability distributions generated across space; no assumption about the underlying distribution is made, and non-linear estimators are automatically incorporated. Traditional estimation variances based on the assumption of an underlying Gaussian distribution (analogous, e.g., to the kriging variance) can be derived as a special case of the BME uncertainty analysis. The BME estimates obtained using hard and soft data are compared with the BME estimates obtained using only hard data. The comparison involves both the accuracy of the estimation maps on the exact data and the assessment of the associated uncertainty using repeated measurements. Furthermore, a comparison of the spatial estimation accuracy obtained by the two methods was carried out using a validation data set of hard data. Finally, a separate uncertainty analysis evaluated the ability of the posterior probabilities to reproduce the distribution of the raw repeated measurements available at certain populated sites. The analysis illustrates the improvement in mapping accuracy obtained by adding soft data to the existing hard data and, in general, demonstrates that the BME method performs well both in terms of estimation accuracy and in terms of estimation error assessment, both useful features for the Chernobyl fallout study.
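As a very rough illustration of the hard/soft distinction (and nothing like the full BME machinery), the sketch below combines exact and uncertain measurements at neighbouring sites into a precision-weighted estimate at an unsampled location; all values and variances are invented.

import numpy as np

# "Hard" data: exact 137Cs measurements at nearby sites (kBq/m^2), given a
# small representativeness variance; "soft" data: uncertain reports with
# larger variances.
hard_vals = np.array([520.0, 610.0])
hard_var = np.array([25.0, 25.0])
soft_vals = np.array([450.0, 700.0])
soft_var = np.array([400.0, 900.0])

vals = np.concatenate([hard_vals, soft_vals])
var = np.concatenate([hard_var, soft_var])

weights = (1.0 / var) / np.sum(1.0 / var)   # precision weighting
estimate = np.sum(weights * vals)
est_var = 1.0 / np.sum(1.0 / var)
print(f"estimate = {estimate:.0f} kBq/m^2, variance = {est_var:.0f}")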