994 results for scientific uncertainty


Relevance: 20.00%

Publisher:

Abstract:

We study the problem of guessing the realization of a finite alphabet source, when some side information is provided, in a setting where the only knowledge the guesser has about the source and the correlated side information is that the joint source is one among a family. We define a notion of redundancy, identify a quantity that measures this redundancy, and study its properties. We then identify good guessing strategies that minimize the supremum redundancy (over the family). The minimum value measures the richness of the uncertainty class.
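As a rough sketch of the kind of guessing strategy discussed here (not taken from the paper): when the joint source is fully known, the optimal guesser simply tries source symbols in decreasing order of conditional probability given the side information. The toy distribution below is invented for illustration.

```python
# Toy joint distribution p(x, y) over a source alphabet {a, b, c} and
# side information y in {0, 1} (illustrative values, not from the paper).
p = {
    ("a", 0): 0.30, ("b", 0): 0.15, ("c", 0): 0.05,
    ("a", 1): 0.10, ("b", 1): 0.10, ("c", 1): 0.30,
}

def guessing_order(y):
    """Guess source symbols in decreasing order of p(x | y); ties broken
    alphabetically. Normalizing by p(y) does not change the order."""
    return sorted({x for x, _ in p}, key=lambda x: (-p.get((x, y), 0.0), x))

def expected_guesses():
    """Expected number of guesses when the guesser knows the joint source."""
    total = 0.0
    for (x, y), pxy in p.items():
        rank = guessing_order(y).index(x) + 1  # position of x in the guess list
        total += pxy * rank
    return total

print(guessing_order(0))            # symbols tried first when y = 0
print(round(expected_guesses(), 2))
```

When the joint source is only known to lie in a family, the redundancy studied in the paper measures how many extra guesses a single strategy needs beyond this informed baseline, in the worst case over the family.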


Regional impacts of climate change remain subject to large uncertainties accumulating from various sources, including those due to choice of general circulation models (GCMs), scenarios, and downscaling methods. Objective constraints to reduce the uncertainty in regional predictions have proven elusive. In most studies to date, the nature of the downscaling relationship (DSR) used for such regional predictions has been assumed to remain unchanged in a future climate. However, studies have shown that climate change may manifest in terms of changes in the frequencies of occurrence of the leading modes of variability, and hence stationarity of DSRs is not a valid assumption in regional climate impact assessment. This work presents an uncertainty modeling framework where, in addition to GCM and scenario uncertainty, uncertainty in the nature of the DSR is explored by linking downscaling with changes in the frequencies of such modes of natural variability. Future projections of the regional hydrologic variable, obtained by training a conditional random field (CRF) model on each natural cluster, are combined using the weighted Dempster-Shafer (D-S) theory of evidence combination. Each projection is weighted with the future projected frequency of occurrence of that cluster ("cluster linking") and scaled by the GCM performance with respect to the associated cluster for the present period ("frequency scaling"). The D-S theory was chosen for its ability to express beliefs in some hypotheses, describe uncertainty and ignorance in the system, and give a quantitative measurement of belief and plausibility in results. The methodology is tested for predicting monsoon streamflow of the Mahanadi River at Hirakud Reservoir in Orissa, India. The results show an increasing probability of extreme, severe, and moderate droughts due to climate change. Significantly improved agreement between GCM predictions owing to cluster linking and frequency scaling is seen, suggesting that by linking regional impacts to natural regime frequencies, uncertainty in regional predictions can be realistically quantified. Additionally, by using a measure of GCM performance in simulating natural regimes, this uncertainty can be effectively constrained.
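The evidence-combination step can be illustrated with a minimal, unweighted version of Dempster's rule (the paper additionally weights each source by projected cluster frequency and GCM performance; the bpa's below are invented for illustration):

```python
def dempster_combine(m1, m2):
    """Dempster's rule: combine two basic probability assignments whose
    focal elements are frozensets of hypotheses."""
    combined, conflict = {}, 0.0
    for b, w1 in m1.items():
        for c, w2 in m2.items():
            inter = b & c
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2       # mass assigned to the empty set
    k = 1.0 - conflict                    # renormalize by the non-conflict mass
    return {a: w / k for a, w in combined.items()}

# Two invented bpa's over drought states, e.g. from two GCM projections.
D, N, W = "drought", "normal", "wet"
m_gcm1 = {frozenset({D}): 0.6, frozenset({D, N}): 0.3, frozenset({D, N, W}): 0.1}
m_gcm2 = {frozenset({D}): 0.5, frozenset({N}): 0.2, frozenset({D, N, W}): 0.3}

m = dempster_combine(m_gcm1, m_gcm2)

# Belief in drought: total combined mass on subsets of {drought}.
belief_drought = sum(w for a, w in m.items() if a <= {D})
print(round(belief_drought, 3))
```

Plausibility would additionally count the focal elements that merely intersect {drought}; the gap between belief and plausibility is what expresses ignorance in the D-S framework.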


This paper analyzes the effect of uncertainty on investment and labor demand for Finnish firms over the period 1987-2000. Utilizing a stock-return-based measure of uncertainty decomposed into systematic and idiosyncratic components, the results reveal that idiosyncratic uncertainty significantly reduces both investment and labor demand. Idiosyncratic uncertainty seems to influence investment in the current period, whereas the depressing effect on labor demand appears with a one-year lag. The results indicate that the depressing effect of idiosyncratic uncertainty on investment is stronger for small firms than for large firms. Some evidence is reported of differential effects of uncertainty on labor demand conditional on firm characteristics. Most importantly, the depressing effect of lagged idiosyncratic uncertainty on labor demand tends to be stronger for diversified firms than for focused firms.


Life cycle assessment (LCA) is used to estimate a product's environmental impact. Using LCA during the earlier stages of design may produce erroneous results, since the information available on the product's life cycle is typically incomplete at these stages. The resulting uncertainty must be accounted for in the decision-making process. This paper proposes a method for estimating the environmental impact of a product's life cycle, and the associated degree of uncertainty of that impact, using information generated during the design process. Total impact is estimated by aggregating the impacts of the individual product life cycle processes. Uncertainty estimation is based on assessing the mismatch between the information required and the information available about the product life cycle in each uncertainty category, as well as on their integration. The method is evaluated using pre-defined scenarios with varying uncertainty. DOI: 10.1115/1.4002163
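The aggregation idea can be sketched as follows. All numbers are invented, and the uncertainty propagation shown (independent per-process uncertainties combined in quadrature) is a common first approximation, not the paper's information-mismatch scheme:

```python
import math

# Hypothetical early-design life cycle inventory: (process, impact score,
# relative uncertainty). All values are invented for illustration.
processes = [
    ("material extraction", 12.0, 0.10),
    ("manufacturing",        8.0, 0.25),
    ("use phase",           30.0, 0.40),  # least information this early
    ("end of life",          5.0, 0.30),
]

# Total impact is the aggregate of the individual process impacts.
total = sum(score for _, score, _ in processes)

# Stand-in propagation: treat per-process uncertainties as independent and
# combine the absolute uncertainties in quadrature.
sigma = math.sqrt(sum((score * rel) ** 2 for _, score, rel in processes))

print(f"total impact {total:.1f} +/- {sigma:.1f}")
```

Note how the poorly known use phase dominates the total uncertainty, which is exactly the situation the paper's early-design method is meant to expose.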


Embryonic stem cells potentially offer ground-breaking insight into health and disease, and are said to offer hope of discovering cures for many ailments that were unimaginable a few years ago. Human embryonic stem cells are undifferentiated, immature cells that possess a remarkable ability to develop into almost any body cell, such as heart muscle, bone, nerve and blood cells, and possibly even organs in due course. This feature, which enables embryonic stem cells to proliferate indefinitely in vitro (in a test tube), has branded them a so-called "miracle cure". Their potential use in clinical applications provides hope to many sufferers of debilitating and fatal medical conditions. However, the emergence of stem cell research has resulted in intense debates about its promises and dangers. On the one hand, advocates hail its potential to alleviate and even cure fatal and debilitating diseases such as Parkinson's disease, diabetes and heart ailments. On the other hand, opponents decry its dangers, drawing attention to the inherent risks of human embryo destruction, cloning for research purposes and, eventually, reproductive cloning. Lately, however, the policy battles surrounding human embryonic stem cell innovation have shifted from controversy over the research itself to scuffles over intellectual property rights. In fact, the ability to obtain patents represents a pivotal factor in the economic success or failure of this new biotechnology. Although stem cell patents tend, more or less, to satisfy the standard patentability requirements, they also raise serious ethical and moral questions about the meaning of the exclusions on ethical or moral grounds found in European and, to an extent, American and Australian patent laws. At present there is a sort of calamity over human embryonic stem cell patents in Europe and, to an extent, in Australia and the United States.
This in turn has created a sense of urgency to engage all relevant parties in the discourse on how best to approach the patenting of this new form of scientific innovation. In essence, this should become a highly favoured patenting priority. Otherwise, stem cell innovation and its reliance on patent protection risk turmoil, uncertainty, confusion and even a halt not only to stem cell research but also to further emerging biotechnology research and development. The patent system is premised on the fundamental principle of balance, which ought to ensure that the temporary monopoly awarded to the inventor matches the social benefit provided by the disclosure of the invention. Ensuring and maintaining this balance within the patent system when patenting human embryonic stem cells is of crucial contemporary relevance. Yet the patenting of human embryonic stem cells raises some fundamental moral, social and legal questions. Overall, the present approach to patenting human embryonic stem cell related inventions is unsatisfactory and ineffective. This draws attention to the specific question that provides the conceptual framework for this work: how can the investigated patent offices successfully deal with the patentability of human embryonic stem cells? This in turn points to the thorny issue of the application of the morality clause in this field, in particular the interpretation of the exclusions on ethical or moral grounds found in Australian, American and European legislative and judicial precedents. The Thesis seeks to compare the laws and legal practices surrounding patentability of human embryonic stem cells in Australia and the United States with those of Europe. Using Europe as the primary case study for lessons and guidance, the central goal of the Thesis becomes the determination of the type of solutions available to Europe, with the prospect of applying these to Australia and the United States.
The Dissertation sets out to define the ethical implications that arise with patenting human embryonic stem cells and to offer resolutions to the key ethical dilemmas surrounding the patentability of human embryonic stem cells and other morally controversial biotechnology inventions. In particular, the Thesis aims to propose a functional framework that may be used as a benchmark for an informed discussion on resolving the ethical and legal tensions that come with patentability of human embryonic stem cells in the Australian, American and European patent worlds. Key research questions that arise from these objectives and thread throughout the monograph are:
1. How do common law countries such as Australia and the United States approach and deal with patentability of human embryonic stem cells in their jurisdictions? These practices are then compared with the situation in Europe, as represented by the United Kingdom (first two chapters) and by the decisions of the Court of Justice of the European Union and the European Patent Office (Chapter 3 onwards), in order to obtain a full picture of present patenting procedures on European soil.
2. How are ethical and moral considerations taken into account at the investigated patent offices when assessing patentability of human embryonic stem cell related inventions? To assess this, the Thesis evaluates how ethical issues that arise with patent applications are dealt with by: a) the legislative history of the modern patent system, from its inception in fifteenth-century England to present-day patent laws; b) the Australian, American and European patent offices, presently and in the past, including other relevant legal precedents on the subject matter; c) normative ethical theories; and d) the notion of human dignity, used as the lowest common denominator for the interpretation of the European morality clause.
3. Given the existence of the morality clause in the form of Article 6(1) of Directive 98/44/EC of the European Parliament and of the Council of 6 July 1998 on the legal protection of biotechnological inventions, which corresponds to Article 53(a) of the European Patent Convention, a special emphasis is put on Europe as a guiding principle for Australia and the United States. Any room for improvement of the European morality clause and of Europe's current manner of evaluating the ethical tensions surrounding human embryonic stem cell inventions is examined.
4. A summary of the options (as represented by Australia, the United States and Europe) available as a basis for an optimal examination procedure for human embryonic stem cell inventions is presented, and the best of these alternatives is distilled into a benchmark framework. This framework is then promoted as a tool to assist Europe (as represented by the European Patent Office) in examining human embryonic stem cell patent applications. This method suggests the possibility of an institutional solution.
5. Ultimately, the question of whether such a reformed European patent system can serve as a founding stone for potential patent reform in Australia and the United States when examining human embryonic stem cells or other morally controversial inventions is surveyed.
The guiding thought throughout this work is to convey the significance of identifying, analysing and clarifying the ethical tensions surrounding the patenting of human embryonic stem cells, and ultimately to present a solution that adequately assesses the patentability of human embryonic stem cell inventions and related biotechnologies. In answering the key questions above, the Thesis strives to contribute to the broader stem cell debate about how, and to what extent, ethical and social positions should be integrated into the patenting procedure in the pluralistic and morally divided democracies of Europe and, subsequently, Australia and the United States.


Often the soil hydraulic parameters are obtained by inversion of measured data (e.g. soil moisture, pressure head and cumulative infiltration). However, the inverse problem in the unsaturated zone is ill-posed for various reasons, and hence the parameters become non-unique. The presence of multiple soil layers brings additional complexity to the inverse modelling. The generalized likelihood uncertainty estimation (GLUE) method is a useful approach for estimating the parameters and their uncertainty when dealing with soil moisture dynamics, which is a highly non-linear problem. Because the estimated parameters depend on the modelling scale, inverse modelling carried out on laboratory data and field data may provide independent estimates. The objective of this paper is to compare the parameters and their uncertainty estimated through experiments in the laboratory and in the field, and to assess which of the soil hydraulic parameters are independent of the experiment. The first two layers at the field site are characterized as loamy sand and loam. For the laboratory experiment, the mean soil moisture and pressure head at three depths are measured at half-hour intervals for a period of one week using the evaporation method, whereas for the field experiment soil moisture at three depths (60, 110, and 200 cm) is measured at 1 h intervals for two years. A one-dimensional soil moisture model based on the finite difference method was used. The calibration and validation periods are approximately one year each. The model performance was found to be good, with root mean square error (RMSE) varying from 2 to 4 cm^3 cm^-3. It is found from the two experiments that the mean and uncertainty of the saturated soil moisture (theta_s) and the shape parameter (n) of the van Genuchten equations are similar for both soil types. Copyright (C) 2010 John Wiley & Sons, Ltd.
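The GLUE procedure can be sketched as follows. The two-parameter "soil moisture model" and the inverse-error likelihood below are simplified stand-ins for the Richards-equation model and likelihood choice of the paper; all values are synthetic:

```python
import random

random.seed(0)

# Toy two-parameter "soil moisture" response; a stand-in for a model with
# van Genuchten-style parameters theta_s and n.
def model(theta_s, n, t):
    return theta_s * (1.0 + t) ** (-1.0 / n)

# Synthetic observations from "true" parameters plus measurement noise.
obs = [model(0.40, 2.0, t) + random.gauss(0, 0.005) for t in range(20)]

def likelihood(theta_s, n):
    sse = sum((model(theta_s, n, t) - o) ** 2 for t, o in enumerate(obs))
    return 1.0 / sse  # simple inverse-error likelihood, one common GLUE choice

# Monte Carlo sampling over the prior parameter ranges.
samples = [(random.uniform(0.3, 0.5), random.uniform(1.2, 3.0))
           for _ in range(5000)]
scored = [(likelihood(ts, n), ts, n) for ts, n in samples]

# Keep the "behavioral" parameter sets: here, the top 10% by likelihood.
cut = sorted(s for s, _, _ in scored)[int(0.9 * len(scored))]
behavioral = [(s, ts, n) for s, ts, n in scored if s >= cut]

# Likelihood-weighted summary for theta_s over the behavioral sets.
w = sum(s for s, _, _ in behavioral)
theta_s_mean = sum(s * ts for s, ts, _ in behavioral) / w
print(f"theta_s ~ {theta_s_mean:.3f} from {len(behavioral)} behavioral sets")
```

Comparing the spread of the behavioral sets from laboratory-scale and field-scale data is then what reveals which parameters are experiment-independent.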


Representation and quantification of uncertainty in climate change impact studies is a difficult task. Several sources of uncertainty arise in studies of hydrologic impacts of climate change, such as those due to the choice of general circulation models (GCMs), scenarios and downscaling methods. Recently, much work has focused on uncertainty quantification and modeling in regional climate change impacts. In this paper, an uncertainty modeling framework is evaluated, which uses a generalized uncertainty measure to combine GCM, scenario and downscaling uncertainties. The Dempster-Shafer (D-S) evidence theory is used for representing and combining uncertainty from various sources. A significant advantage of the D-S framework over the traditional probabilistic approach is that it allows for the allocation of a probability mass to sets or intervals, and can hence handle both aleatory (stochastic) uncertainty and epistemic (subjective) uncertainty. This paper shows how the D-S theory can be used to represent beliefs in hypotheses such as hydrologic drought or wet conditions, describe uncertainty and ignorance in the system, and give a quantitative measurement of belief and plausibility in results. The D-S approach has been used in this work for information synthesis using various evidence combination rules with different conflict modeling approaches. A case study is presented for hydrologic drought prediction using downscaled streamflow in the Mahanadi River at Hirakud in Orissa, India. Projections of the n most likely monsoon streamflow sequences are obtained from a conditional random field (CRF) downscaling model, using an ensemble of three GCMs for three scenarios, and are converted to monsoon standardized streamflow index (SSFI-4) series. This range is used to specify the basic probability assignment (bpa) for a Dempster-Shafer structure, which represents the uncertainty associated with each of the SSFI-4 classifications. These uncertainties are then combined across GCMs and scenarios using the various evidence combination rules given by the D-S theory. A Bayesian approach is also presented for this case study, which models the uncertainty in projected frequencies of SSFI-4 classifications by deriving a posterior distribution for the frequency of each classification, using an ensemble of GCMs and scenarios. Results from the D-S and Bayesian approaches are compared, and the relative merits of each approach are discussed. Both approaches show an increasing probability of extreme, severe and moderate droughts and a decreasing probability of normal and wet conditions in Orissa as a result of climate change. (C) 2010 Elsevier Ltd. All rights reserved.
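The Bayesian variant can be sketched with a conjugate Dirichlet-multinomial model for the class frequencies. The abstract does not specify the paper's exact prior or likelihood, and the counts below are invented:

```python
# Hypothetical counts of projected SSFI-4 classes across an ensemble of
# GCM/scenario runs (illustrative numbers, not the paper's results).
counts = {"extreme": 4, "severe": 6, "moderate": 8, "normal": 10, "wet": 2}

# Symmetric Dirichlet(alpha0) prior over class frequencies; by conjugacy the
# posterior mean frequency of class c is (n_c + alpha0) / (N + alpha0 * K).
alpha0 = 1.0
denom = sum(counts.values()) + alpha0 * len(counts)
posterior_mean = {c: (n + alpha0) / denom for c, n in counts.items()}

for c, freq in sorted(posterior_mean.items(), key=lambda kv: -kv[1]):
    print(f"{c:<9}{freq:.3f}")
```

Unlike the D-S structure, this Bayesian posterior puts all mass on singleton classes; it cannot leave mass on sets of classes to express ignorance, which is the trade-off the paper discusses.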


The effect of uncertainties on performance predictions of a helicopter is studied in this article. The aeroelastic parameters such as the air density, blade profile drag coefficient, main rotor angular velocity, main rotor radius, and blade chord are considered as uncertain variables. The propagation of these uncertainties in the performance parameters such as thrust coefficient, figure of merit, induced velocity, and power required are studied using Monte Carlo simulation and the first-order reliability method. The Rankine-Froude momentum theory is used for performance prediction in hover, axial climb, and forward flight. The propagation of uncertainty causes large deviations from the baseline deterministic predictions, which undoubtedly affect both the achievable performance and the safety of the helicopter. The numerical results in this article provide useful bounds on helicopter power requirements.
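A minimal sketch of this kind of uncertainty propagation, using hover momentum theory with invented nominal values (the paper also treats climb and forward flight, and uses the first-order reliability method in addition to Monte Carlo):

```python
import math, random

random.seed(1)

def hover_induced_power(rho, R, W):
    """Rankine-Froude momentum theory in hover: induced velocity
    v_i = sqrt(W / (2 * rho * A)) and ideal induced power P = W * v_i."""
    A = math.pi * R ** 2                    # rotor disk area
    v_i = math.sqrt(W / (2.0 * rho * A))
    return W * v_i

# Invented nominal values for a light helicopter: sea-level air density
# (kg/m^3), rotor radius (m), and weight / thrust required (N).
rho0, R0, W = 1.225, 5.0, 20000.0

# Monte Carlo propagation of 5% normal scatter in density and rotor radius.
powers = [hover_induced_power(random.gauss(rho0, 0.05 * rho0),
                              random.gauss(R0, 0.05 * R0), W)
          for _ in range(10000)]

mean_P = sum(powers) / len(powers)
lo, hi = min(powers), max(powers)
print(f"induced power: mean {mean_P/1e3:.0f} kW, "
      f"range {lo/1e3:.0f}-{hi/1e3:.0f} kW")
```

Even this simplified case shows how modest input scatter widens into a substantial band on required power, which is the kind of bound the article reports.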


This article addresses uncertainty effect on the health monitoring of a smart structure using control gain shifts as damage indicators. A finite element model of the smart composite plate with surface-bonded piezoelectric sensors and actuators is formulated using first-order shear deformation theory and a matrix crack model is integrated into the finite element model. A constant gain velocity/position feedback control algorithm is used to provide active damping to the structure. Numerical results show that the response of the structure is changed due to matrix cracks and this change can be compensated by actively tuning the feedback controller. This change in control gain can be used as a damage indicator for structural health monitoring. Monte Carlo simulation is conducted to study the effect of material uncertainty on the damage indicator by considering composite material properties and piezoelectric coefficients as independent random variables. It is found that the change in position feedback control gain is a robust damage indicator.


We define lacunary Fourier series on a compact connected semisimple Lie group G. We prove that if f ∈ L^1(G) has a lacunary Fourier series and f vanishes on a nonempty open subset of G, then f vanishes identically. This result can be viewed as a qualitative uncertainty principle.


An intelligent computer-aided defect analysis (ICADA) system, based on artificial intelligence techniques, has been developed to identify the design, process or material parameters that could be responsible for the occurrence of defective castings in a manufacturing campaign. The data on defective castings for a particular time frame, which is an input to the ICADA system, has been analysed. It was observed that a large proportion, i.e. 50-80%, of all the defective castings produced in a foundry have two, three or four types of defects occurring above a threshold proportion, say 10%. Also, a large number of defect types are either not found at all or found in a very small proportion, below a threshold value of 2%. An important feature of the ICADA system is the recognition of this pattern in the analysis. Thirty casting defect types, together with a large number of causes (numbering between 50 and 70 for each) as identified in the AFS analysis of casting defects, the standard reference source for the casting process, constituted the foundation for building the knowledge base. The scientific rationale underlying the formation of a defect during the casting process was identified and 38 metacauses were coded. Process, material and design parameters which contribute to the metacauses were systematically examined and 112 were identified as root causes. The interconnections between defects, metacauses and root causes were represented as a three-tier structured graph, and the handling of uncertainty in the occurrence of events such as defects, metacauses and root causes was achieved by Bayesian analysis. The hill-climbing search technique, associated with forward reasoning, was employed to recognize one or several root causes.
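The Bayesian handling of uncertainty across the tiers can be illustrated with a single defect and a few hypothetical root causes. The cause names and probabilities below are invented; the real knowledge base links 30 defect types to 38 metacauses and 112 root causes:

```python
# A hypothetical slice of the defect -> root-cause structure: prior
# probabilities for a few root causes, and the likelihood of observing a
# "blowhole" defect under each (all values invented for illustration).
prior = {"high sand moisture": 0.20, "poor venting": 0.10, "cold ladle": 0.05}
likelihood = {"high sand moisture": 0.7, "poor venting": 0.5, "cold ladle": 0.1}

# Bayesian update: P(cause | defect) is proportional to
# P(defect | cause) * P(cause), normalized over the candidate causes.
unnorm = {c: prior[c] * likelihood[c] for c in prior}
z = sum(unnorm.values())
posterior = {c: v / z for c, v in unnorm.items()}

best = max(posterior, key=posterior.get)
print(best, round(posterior[best], 3))
```

A hill-climbing search over the full three-tier graph would repeat updates of this form while following the most promising defect-metacause-root-cause path.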


Perfect or even mediocre weather predictions over a long period are almost impossible because of the ultimate growth of a small initial error into a significant one. Even though sensitivity to initial conditions limits the predictability of chaotic systems, an ensemble of predictions from different possible initial conditions, together with a prediction algorithm capable of resolving the fine structure of the chaotic attractor, can reduce the prediction uncertainty to some extent. All of the traditional chaotic prediction methods in hydrology are based on single-optimum-initial-condition local models, which can model the sudden divergence of the trajectories with different local functions. Conceptually, global models are ineffective in modeling the highly unstable structure of the chaotic attractor. This paper focuses on an ensemble prediction approach that reconstructs the phase space using different combinations of chaotic parameters, i.e., embedding dimension and delay time, to quantify the uncertainty in initial conditions. The ensemble approach is implemented through a local-learning wavelet network model with a global feed-forward neural network structure for the phase space prediction of chaotic streamflow series. Uncertainties in future predictions are quantified by creating an ensemble of predictions with the wavelet network using a range of plausible embedding dimensions and delay times. The ensemble approach proved to be 50% more efficient than the single prediction for both the local approximation and wavelet network approaches, and the wavelet network approach proved to be 30%-50% superior to the local approximation approach. Compared with the traditional local approximation approach with a single initial condition, the total predictive uncertainty in the streamflow is reduced when modeled with ensemble wavelet networks for different lead times. The localization property of wavelets, utilizing different dilation and translation parameters, helps in capturing most of the statistical properties of the observed data. The need for taking into account all plausible initial conditions, and for bringing together the characteristics of both local and global approaches to model the unstable yet ordered chaotic attractor of a hydrologic series, is clearly demonstrated.
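The phase-space reconstruction underlying the ensemble can be sketched with a plain delay embedding over a grid of embedding dimensions and delay times. The toy signal stands in for an observed streamflow series, and the wavelet network itself is omitted:

```python
import math

def delay_embed(series, m, tau):
    """Takens-style delay embedding: map a scalar series to state vectors
    (x[i], x[i + tau], ..., x[i + (m - 1) * tau])."""
    n = len(series) - (m - 1) * tau
    return [tuple(series[i + j * tau] for j in range(m)) for i in range(n)]

# Toy quasi-periodic signal standing in for the observed streamflow series.
x = [math.sin(0.3 * t) + 0.5 * math.sin(0.71 * t) for t in range(200)]

# Ensemble over plausible (embedding dimension, delay time) pairs; each
# reconstruction would feed one member of the prediction ensemble.
ensemble = {(m, tau): delay_embed(x, m, tau)
            for m in (2, 3, 4) for tau in (1, 2, 3)}
for (m, tau), vecs in sorted(ensemble.items()):
    print(f"m={m} tau={tau}: {len(vecs)} state vectors")
```

A predictor trained on each reconstruction yields one forecast; the spread across the nine forecasts is what quantifies the initial-condition uncertainty described above.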


High-sensitivity detection techniques are required for indoor navigation using Global Navigation Satellite System (GNSS) receivers, and typically a combination of coherent and non-coherent integration is used as the test statistic for detection. The coherent integration exploits the deterministic part of the signal and is limited by the residual frequency error, navigation data bits and user dynamics, which are not known a priori. So non-coherent integration, which involves squaring the coherent integration output, is used to improve the detection sensitivity. Due to this squaring, it is robust against the artifacts introduced by data bits and/or frequency error. However, it is susceptible to uncertainty in the noise variance, and this can lead to fundamental sensitivity limits in detecting weak signals. In this work, the performance of conventional non-coherent-integration-based GNSS signal detection is studied in the presence of noise uncertainty. It is shown that the performance of current state-of-the-art GNSS receivers is close to the theoretical SNR limit for reliable detection at moderate levels of noise uncertainty. Alternate robust post-coherent detectors are also analyzed, and are shown to alleviate the noise uncertainty problem. Monte Carlo simulations are used to confirm the theoretical predictions.
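A minimal sketch of the coherent-plus-non-coherent test statistic. The signal amplitude, block lengths and noise levels are invented, and a real receiver would first correlate against the satellite's PRN code:

```python
import random

random.seed(2)

def detect_statistic(samples, n_coh, n_nc):
    """Coherent sum over blocks of n_coh samples, then non-coherent
    (squared-magnitude) accumulation over n_nc blocks."""
    stat = 0.0
    for k in range(n_nc):
        block = samples[k * n_coh:(k + 1) * n_coh]
        stat += abs(sum(block)) ** 2  # squaring discards residual carrier phase
    return stat

# Weak complex-baseband "signal" buried in unit-variance complex noise.
n_coh, n_nc = 100, 20
amp = 0.2

def observations(signal_present):
    return [(amp if signal_present else 0.0)
            + complex(random.gauss(0, 1), random.gauss(0, 1))
            for _ in range(n_coh * n_nc)]

t_signal = detect_statistic(observations(True), n_coh, n_nc)
t_noise = detect_statistic(observations(False), n_coh, n_nc)
print(t_signal > t_noise)  # the signal case should exceed the noise-only case
```

The noise-uncertainty problem discussed above enters when setting the detection threshold on this statistic: if the noise variance is misjudged, the threshold is misplaced, no matter how long the integration runs.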