964 results for Uncertainty analysis


Relevance: 40.00%

Abstract:

The relationship between uncertainty and firms’ risk-taking behaviour has been a focus of investigation since early discussion of the nature of enterprise activity. Here, we focus on how firms’ perceptions of environmental uncertainty and their perceptions of the risks involved impact on their willingness to undertake green innovation. Analysis is based on a cross-sectional survey of UK food companies undertaken in 2008. The results reinforce the relationship between perceived environmental uncertainty and perceived innovation risk and emphasise the importance of macro-uncertainty in shaping firms’ willingness to undertake green innovation. The perceived (market-related) riskiness of innovation also positively influences the probability of innovating, suggesting either a proactive approach to stimulating market disruption or an opportunistic approach to innovation leadership.

Relevance: 40.00%

Abstract:

This study asks: how could scenario analysis help acquiring companies reduce uncertainty in the acquisition process? The mismatch between the academic world's caveat emptor and the business world's eagerness to pursue acquisitions motivated this study. Acquisitions are as popular as ever, so managing the uncertainty surrounding these transactions is relevant. This study creates a generic theoretical model with a strategy-level scope; it therefore does not discuss, nor seek answers to, operational issues in either field. The study is explorative and constructivist in nature. It briefly discusses the concepts and relatedness of risk and uncertainty and establishes a hierarchy between the two: risk is a "sub-section" of uncertainty, although without clear boundaries. Acquisition theory follows the process view, which understands acquisitions as a process with various levels, some strategic, some operational. Scenario analysis is presented as a tool for management to enrich strategic discussion and understand future options. The empirical data were collected through interviews. The results are reflected against the literature on strategic management, the scenario literature, and a consultancy's report depicting firms' strategies in relation to their acquisition processes. The study takes an abductive approach, combining multiple views and generating discussion between the literature review, the interviews, the report, and a second round of literature. The model suggests three propositions. First, at the strategic decision-making level, once the decision whether or not to pursue an acquisition growth strategy has been made, scenario analysis provides firms with new data and enriches the strategic discussion. Second, once the acquisition strategy has been created, it can be applied as a tool to measure possible acquisition targets against the backdrop of the first set of scenarios. Third, because scenario analysis requires including people with various backgrounds and from multiple levels of the corporate hierarchy, it could help managers avoid biases stemming from hubris.

Relevance: 30.00%

Abstract:

Aims. A model-independent reconstruction of the cosmic expansion rate is essential to a robust analysis of cosmological observations. Our goal is to demonstrate that current data are able to provide reasonable constraints on the behavior of the Hubble parameter with redshift, independently of any cosmological model or underlying gravity theory. Methods. Using type Ia supernova data, we show that it is possible to analytically calculate the Fisher matrix components in a Hubble parameter analysis without assumptions about the energy content of the Universe. We used a principal component analysis to reconstruct the Hubble parameter as a linear combination of the Fisher matrix eigenvectors (principal components). To suppress the bias introduced by the high-redshift behavior of the components, we treated the value of the Hubble parameter at high redshift as a free parameter. We first tested our procedure on a mock sample of type Ia supernova observations and then applied it to the real data compiled by the Sloan Digital Sky Survey (SDSS) group. Results. In the mock sample analysis, we demonstrate that it is possible to drastically suppress the bias introduced by the high-redshift behavior of the principal components. Applying our procedure to the real data, we show that it allows us to determine the behavior of the Hubble parameter with reasonable uncertainty, without introducing any ad hoc parameterizations. Beyond that, our reconstruction agrees with completely independent measurements of the Hubble parameter obtained from red-envelope galaxies.
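The reconstruction step described above can be sketched numerically: given a Fisher matrix for binned values of H(z), the Hubble parameter is rebuilt as a linear combination of the best-constrained eigenvectors. The Fisher matrix, fiducial cosmology, and noise level below are synthetic placeholders, not the paper's SDSS-based quantities.

```python
import numpy as np

rng = np.random.default_rng(0)
n_bins = 10
z = np.linspace(0.05, 1.5, n_bins)

# "True" H(z) in a fiducial flat LCDM model (km/s/Mpc), used only to fake data.
H0, Om = 70.0, 0.3
H_true = H0 * np.sqrt(Om * (1 + z) ** 3 + 1 - Om)

# Synthetic Fisher matrix: symmetric positive definite, low-z bins better measured.
A = rng.normal(size=(n_bins, n_bins))
F = A @ A.T + np.diag(10.0 / (1 + z))

# Principal components: eigenvectors of F; larger eigenvalue = better constrained.
eigval, eigvec = np.linalg.eigh(F)          # eigh returns ascending order
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]

# Noisy "measured" H(z), projected onto the k best-constrained components.
H_obs = H_true + rng.normal(scale=3.0, size=n_bins)
k = 4
coeffs = eigvec[:, :k].T @ H_obs
H_rec = eigvec[:, :k] @ coeffs

print(np.round(H_rec, 1))
```

Keeping only the leading components discards the poorly constrained directions; the paper's extra step of freeing the high-redshift value of H(z) to suppress bias is omitted here.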

Relevance: 30.00%

Abstract:

For environmental quality assessment, INAA was applied to determine chemical elements in small (200 mg) and large (200 g) samples of leaves from 200 trees. By applying the Ingamells constant, the expected percent standard deviation was estimated at 0.9-2.2% for 200 mg samples. For composite samples (200 g), in contrast, the expected standard deviation varied from 0.5 to 10%, despite analytical uncertainties ranging from 2 to 30%. The results thereby suggest expressing the degree of representativeness as a source of uncertainty, contributing to increased reliability of environmental studies, mainly in the case of composite samples.
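The Ingamells relation behind these estimates can be sketched directly: the sampling constant Ks is defined by R² · m = Ks, so the expected percent sampling standard deviation R falls with the square root of sample mass m. The Ks value below is illustrative only, chosen to land near the 2.2% figure at 200 mg.

```python
import math

def ingamells_sd(Ks: float, mass_g: float) -> float:
    """Expected percent sampling standard deviation R for a sample of mass
    mass_g (g), from Ingamells' sampling constant Ks (units g * %**2),
    using the relation R**2 * m = Ks."""
    return math.sqrt(Ks / mass_g)

# Hypothetical Ks of ~1 g*%^2: gives ~2.24% at 200 mg, near the upper end
# of the 0.9-2.2% range reported above, and a negligible value at 200 g.
Ks = 1.0
for m in (0.2, 2.0, 200.0):   # 200 mg, 2 g, 200 g
    print(f"{m:7.1f} g -> {ingamells_sd(Ks, m):.2f} %")
```

At large masses the sampling contribution becomes small compared with the 2-30% analytical uncertainties quoted for the composite samples.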

Relevance: 30.00%

Abstract:

The metrological principles of neutron activation analysis are discussed. It has been demonstrated that this method can provide elemental amount-of-substance values fully traceable to the SI. The method has been used by several laboratories worldwide in a number of CCQM key comparisons - interlaboratory comparison tests at the highest metrological level - supplying results equivalent to values from other methods for elemental or isotopic analysis in complex samples, without the need for chemical destruction and dissolution of these samples. The CCQM therefore accepted, in April 2007, the claim that neutron activation analysis should have a status similar to that of the methods originally listed by the CCQM as "primary methods of measurement". Analytical characteristics and the scope of application are given.

Relevance: 30.00%

Abstract:

This paper presents a study of a specific type of beam-to-column connection for precast concrete structures. Furthermore, an analytical model to determine the strength and the stiffness of the connection, based on test results of two prototypes, is proposed. To evaluate the influence of the strength and stiffness of the connection on the behaviour of the structure, the results of numerical simulations of a typical multi-storey building with semi-rigid connections are also presented and compared with the results using pinned and rigid connections. The main conclusions are: (a) the proposed design model can reasonably evaluate the studied connection strength; (b) the evaluation of strength is more accurate than that of stiffness; (c) for a typical structure, it is possible to increase the number of storeys from two to four, with lower horizontal displacement at the top and only a small increase in the column base bending moment, by replacing the pinned connections with semi-rigid ones; and (d) although there is significant uncertainty in the connection stiffness, the results show that the displacements at the top of the structure and the column base moments are only weakly sensitive to this parameter.

Relevance: 30.00%

Abstract:

Spending by aid agencies on emergencies has quadrupled over the last decade, to over US$ 6 billion. To date, cost-effectiveness has seldom been considered in the prioritization and evaluation of emergency interventions. The sheer volume of resources spent on humanitarian aid and the chronicity of many humanitarian interventions call for more attention to be paid to the issue of 'value for money'. In this paper we present data from a major humanitarian crisis, an epidemic of visceral leishmaniasis (VL) in war-torn Sudan. The special circumstances provided us, in retrospect, with unusually accurate data on excess mortality, costs of the intervention and its effects, thus allowing us to express cost-effectiveness as the cost per Disability Adjusted Life Year (DALY) averted. The cost-effectiveness ratio, of US$ 18.40 per DALY (uncertainty range between US$ 13.53 and US$ 27.63), places the treatment of VL in Sudan among health interventions considered 'very good value for money' (interventions of less than US$ 25 per DALY). We discuss the usefulness of this analysis to the internal management of the VL programme, the procurement of funds for the programme and, more generally, to priority setting in humanitarian relief interventions. We feel that in evaluations of emergency interventions attempts could be made more often to perform cost-effectiveness analyses, including the use of DALYs, provided that the outcomes of these analyses are seen in the broad context of the emergency situation and its consequences for the affected population. This paper provides a first contribution to what is hoped to become an international database of cost-effectiveness studies of health outcomes such as the DALY.
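The cost-effectiveness arithmetic is simple enough to sketch: the ratio is total programme cost divided by DALYs averted, and the uncertainty range follows from the low and high estimates of DALYs averted. The cost and DALY figures below are hypothetical, back-calculated so the ratios reproduce the US$ 18.40 (13.53-27.63) values quoted above.

```python
# All inputs are invented for illustration; only the resulting ratios
# match the figures reported in the abstract.
cost_usd = 1_104_000.0                        # hypothetical total programme cost
dalys_central = 60_000.0                      # hypothetical central estimate
dalys_low, dalys_high = 39_960.0, 81_600.0    # hypothetical uncertainty range

ce_central = cost_usd / dalys_central
ce_high = cost_usd / dalys_low     # fewer DALYs averted -> worse (higher) ratio
ce_low = cost_usd / dalys_high     # more DALYs averted -> better (lower) ratio

print(f"US$ {ce_central:.2f} per DALY (range {ce_low:.2f}-{ce_high:.2f})")
# The abstract's benchmark: under US$ 25 per DALY counts as
# 'very good value for money'.
very_good_value = ce_central < 25.0
```

Note how the uncertainty in effectiveness (DALYs averted) translates inversely into the uncertainty range of the cost-effectiveness ratio.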

Relevance: 30.00%

Abstract:

This paper presents a personal view of the interaction between the analysis of choice under uncertainty and the analysis of production under uncertainty. Interest in the foundations of the theory of choice under uncertainty was stimulated by applications of expected utility theory such as the Sandmo model of production under uncertainty. This interest led to the development of generalized models including rank-dependent expected utility theory. In turn, the development of generalized expected utility models raised the question of whether such models could be used in the analysis of applied problems such as those involving production under uncertainty. Finally, the revival of the state-contingent approach led to the recognition of a fundamental duality between choice problems and production problems.

Relevance: 30.00%

Abstract:

This paper develops a multi-regional general equilibrium model for climate policy analysis based on the latest version of the MIT Emissions Prediction and Policy Analysis (EPPA) model. We develop two versions so that we can solve the model either as a fully inter-temporal optimization problem (forward-looking, perfect foresight) or recursively. The standard EPPA model on which these models are based is solved recursively, and it is necessary to simplify some aspects of it to make inter-temporal solution possible. The forward-looking capability allows one to better address economic and policy issues such as borrowing and banking of GHG allowances, efficiency implications of environmental tax recycling, endogenous depletion of fossil resources, international capital flows, and optimal emissions abatement paths, among others. To evaluate the solution approaches, we benchmark each version to the same macroeconomic path and then compare the behavior of the two versions under a climate policy that restricts greenhouse gas emissions. We find that the energy sector and CO2 price behavior are similar in both versions (in the recursive version of the model we impose the inter-temporal efficiency result that abatement through time should be allocated such that the CO2 price rises at the interest rate). The main difference is that the macroeconomic costs are substantially lower in the forward-looking version of the model, since it allows consumption shifting as an additional avenue of adjustment to the policy. On the other hand, the simplifications required to solve the model as an optimization problem, such as dropping the full vintaging of the capital stock and including fewer explicit technological options, likely affect the results. Moreover, inter-temporal optimization with perfect foresight poorly represents the real economy, where agents face high levels of uncertainty that likely lead to higher costs than if they knew the future with certainty. We conclude that while the forward-looking model has value for some problems, the recursive model produces similar behavior in the energy sector and provides greater flexibility in the details of the system that can be represented.
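The efficiency condition imposed on the recursive model, that abatement is allocated over time so the CO2 price rises at the interest rate, is just p_t = p_0 (1 + r)^t. A sketch with illustrative numbers (not EPPA outputs):

```python
# With banking and borrowing of allowances, inter-temporal efficiency implies
# the allowance price grows at the interest rate. p0 and r are invented.
p0 = 25.0   # hypothetical initial CO2 price, $/tCO2
r = 0.04    # hypothetical interest rate
years = range(0, 31, 5)
prices = [p0 * (1 + r) ** t for t in years]
for t, p in zip(years, prices):
    print(f"year {t:2d}: ${p:7.2f}/tCO2")
```

Under this path, shifting a tonne of abatement between any two periods leaves the discounted cost unchanged, which is why the energy-sector behavior of the two model versions matches.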

Relevance: 30.00%

Abstract:

Epidemiological studies report confidence or uncertainty intervals around their estimates. Estimates of the burden of diseases and risk factors are subject to a broader range of uncertainty because of the combination of multiple data sources and value choices. Sensitivity analysis can be used to examine the effects of the social values that have been incorporated into the design of the disability-adjusted life year (DALY). Age weighting, whereby a year of healthy life lived at one age is valued differently from one lived at another age, is the most controversial value built into the DALY. The discount rate, which addresses the difference in value between current and future health benefits, has also been criticized. The distribution of the global disease burden and the rankings of various conditions are largely insensitive to alternative assumptions about the discount rate and age weighting. The major effect of discounting and age weighting is to enhance the importance of neuropsychiatric conditions and sexually transmitted infections. The Global Burden of Disease study has also been criticized for estimating mortality and disease burden for regions using incomplete and uncertain data. Including uncertain results, with uncertainty quantified to the extent possible, is nevertheless preferable to leaving blank cells in tables intended to provide policy makers with an overall assessment of the burden of disease: no estimate is generally interpreted as no problem. Greater investment in getting the descriptive epidemiology of diseases and injuries correct in poor countries will do vastly more to reduce uncertainty in disease burden assessments than a philosophical debate about the appropriateness of social values.
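The two social values under discussion can be made concrete with the standard GBD formulation: continuous discounting at 3% and the age weight C · x · e^(-βx), with C = 0.1658 and β = 0.04. The sketch below integrates them numerically for a death at age 30 with 50 life-years lost; it is an illustration of the published formulas, not a GBD calculation, and the age and duration are arbitrary examples.

```python
import math

C, BETA, R = 0.1658, 0.04, 0.03   # standard GBD age-weight constants, 3% discount

def years_lost(age, duration, discount=True, age_weight=True, step=0.01):
    """Integrate valued life-years from `age` over `duration` (rectangle rule)."""
    total = 0.0
    for i in range(int(duration / step)):
        x = age + i * step
        w = C * x * math.exp(-BETA * x) if age_weight else 1.0
        d = math.exp(-R * (x - age)) if discount else 1.0
        total += w * d * step
    return total

plain = years_lost(30, 50, discount=False, age_weight=False)   # simple count: 50
valued = years_lost(30, 50)   # discounted, age-weighted burden
print(round(plain, 2), round(valued, 2))
```

The valued burden comes out well below the raw 50 life-years, which is exactly the kind of shift a sensitivity analysis over these value choices probes.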

Relevance: 30.00%

Abstract:

The purpose of this study was threefold: first, the study was designed to illustrate the use of data and information collected in food safety surveys in a quantitative risk assessment. In this case, the focus was on the food service industry; however, similar data from other parts of the food chain could be incorporated in the same way. The second objective was to quantitatively describe and better understand the role that the food service industry plays in the safety of food. The third objective was to illustrate the additional decision-making information that becomes available when uncertainty and variability are incorporated into the modelling of systems.

Relevance: 30.00%

Abstract:

We generate and characterize continuous variable polarization entanglement between two optical beams. We first produce quadrature entanglement, and by performing local operations we transform it into a polarization basis. We extend two entanglement criteria, the inseparability criterion proposed by Duan et al (2000 Phys. Rev. Lett. 84 2722) and the Einstein–Podolsky–Rosen (EPR) paradox criterion proposed by Reid and Drummond (1988 Phys. Rev. Lett. 60 2731), to Stokes operators, and use them to characterize the entanglement. Our results for the EPR paradox criterion are visualized in terms of uncertainty balls on the Poincaré sphere. We demonstrate theoretically that using two quadrature entangled pairs it is possible to entangle three orthogonal Stokes operators between a pair of beams, although with a bound √3 times more stringent than for the quadrature entanglement.

Relevance: 30.00%

Abstract:

The use of a fitted parameter watershed model to address water quantity and quality management issues requires that it be calibrated under a wide range of hydrologic conditions. However, rarely does model calibration result in a unique parameter set. Parameter nonuniqueness can lead to predictive nonuniqueness. The extent of model predictive uncertainty should be investigated if management decisions are to be based on model projections. Using models built for four neighboring watersheds in the Neuse River Basin of North Carolina, the application of the automated parameter optimization software PEST in conjunction with the Hydrologic Simulation Program Fortran (HSPF) is demonstrated. Parameter nonuniqueness is illustrated, and a method is presented for calculating many different sets of parameters, all of which acceptably calibrate a watershed model. A regularization methodology is discussed in which models for similar watersheds can be calibrated simultaneously. Using this method, parameter differences between watershed models can be minimized while maintaining fit between model outputs and field observations. In recognition of the fact that parameter nonuniqueness and predictive uncertainty are inherent to the modeling process, PEST's nonlinear predictive analysis functionality is then used to explore the extent of model predictive uncertainty.
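Parameter nonuniqueness leading to predictive nonuniqueness can be shown with a toy model far simpler than HSPF or PEST: when the observations constrain only the product of two parameters, many parameter sets calibrate equally well, yet a prediction that depends on one parameter alone diverges. Everything below is an invented illustration of the concept.

```python
import numpy as np

# "Observations" generated by a model whose output depends only on a * b.
x = np.linspace(0.0, 1.0, 20)
y_obs = 2.0 * x                      # generated with a * b = 2

def calibration_error(a, b):
    """Sum of squared residuals between model output a*b*x and observations."""
    return float(np.sum((a * b * x - y_obs) ** 2))

# Three different parameter sets, all with a * b = 2.
candidates = [(1.0, 2.0), (2.0, 1.0), (0.5, 4.0)]
errors = [calibration_error(a, b) for a, b in candidates]

# A hypothetical prediction that depends on parameter a alone.
predictions = [a * 10.0 for a, _ in candidates]

print(errors)        # all zero: equally good calibrations
print(predictions)   # 10.0, 20.0, 5.0: predictive nonuniqueness
```

This is the situation PEST's regularization and predictive analysis tools are designed to manage: fit alone cannot select among the candidate parameter sets, so the spread of their predictions must be explored explicitly.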

Relevance: 30.00%

Abstract:

In this paper, a stochastic programming approach is proposed for trading wind energy in a market environment under uncertainty. Uncertainty in energy market prices is the main cause of the high volatility of the profits achieved by power producers, and the variable and intermittent nature of wind energy represents another source of uncertainty. Hence, each uncertain parameter is modeled by scenarios, where each scenario represents a plausible realization of the uncertain parameters with an associated occurrence probability; an appropriate risk measure is also considered. The proposed approach is applied to a realistic case study based on a wind farm in Portugal. Finally, conclusions are drawn.
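A minimal sketch of the scenario idea (not the paper's actual formulation, which also includes a risk measure): choose a day-ahead offer q that maximizes expected profit over joint price/production scenarios, with shortfalls bought back dear and surpluses sold cheap. All probabilities, prices, production levels, and imbalance ratios below are invented.

```python
scenarios = [
    # (probability, day-ahead price $/MWh, wind production MWh)
    (0.3, 50.0, 80.0),
    (0.5, 60.0, 100.0),
    (0.2, 40.0, 60.0),
]
SHORT, LONG = 1.3, 0.8   # imbalance ratios: buy shortfall dear, sell surplus cheap

def expected_profit(q: float) -> float:
    """Probability-weighted profit of offering q MWh in the day-ahead market."""
    total = 0.0
    for prob, price, wind in scenarios:
        shortfall = max(q - wind, 0.0)
        surplus = max(wind - q, 0.0)
        profit = price * q - SHORT * price * shortfall + LONG * price * surplus
        total += prob * profit
    return total

# Enumerate offers on a grid; a real model would use an LP/MILP solver.
best_q = max(range(0, 121, 5), key=expected_profit)
print(best_q, round(expected_profit(best_q), 1))
```

The asymmetric imbalance prices turn the offer choice into a newsvendor-type trade-off, and the optimum sits at an interior production level rather than at either extreme.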

Relevance: 30.00%

Abstract:

Master's dissertation in Business Management/MBA.