987 results for Experimental uncertainty


Relevance: 30.00%

Abstract:

This thesis has two basic themes: the investigation of new experiments which can be used to test relativistic gravity, and the investigation of new technologies and new experimental techniques which can be applied to make gravitational wave astronomy a reality.

Advancing technology will soon make possible a new class of gravitation experiments: pure laboratory experiments with laboratory sources of non-Newtonian gravity and laboratory detectors. The key advance in technology is the development of resonant sensing systems with very low levels of dissipation. Chapter 1 considers three such systems (torque balances, dielectric monocrystals, and superconducting microwave resonators), and it proposes eight laboratory experiments which use these systems as detectors. For each experiment it describes the dominant sources of noise and the technology required.

The coupled electro-mechanical system consisting of a microwave cavity and its walls can serve as a gravitational radiation detector. A gravitational wave interacts with the walls, and the resulting motion induces transitions from a highly excited cavity mode to a nearly unexcited mode. Chapter 2 describes briefly a formalism for analyzing such a detector, and it proposes a particular design.

The monitoring of a quantum mechanical harmonic oscillator on which a classical force acts is important in a variety of high-precision experiments, such as the attempt to detect gravitational radiation. Chapter 3 reviews the standard techniques for monitoring the oscillator; and it introduces a new technique which, in principle, can determine the details of the force with arbitrary accuracy, despite the quantum properties of the oscillator.

The standard method for monitoring the oscillator is the "amplitude-and-phase" method (position or momentum transducer with output fed through a linear amplifier). The accuracy obtainable by this method is limited by the uncertainty principle. To do better requires a measurement of the type which Braginsky has called "quantum nondemolition." A well-known quantum nondemolition technique is "quantum counting," which can detect an arbitrarily weak force, but which cannot provide good accuracy in determining its precise time-dependence. Chapter 3 considers extensively a new type of quantum nondemolition measurement: a "back-action-evading" measurement of the real part X1 (or the imaginary part X2) of the oscillator's complex amplitude. In principle X1 can be measured arbitrarily quickly and arbitrarily accurately, and a sequence of such measurements can lead to an arbitrarily accurate monitoring of the classical force.
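For orientation, the complex amplitude referred to above is conventionally defined through the quadrature decomposition below (a standard convention in the quantum nondemolition literature; the thesis may use a different normalization):

```latex
% Quadrature ("complex amplitude") decomposition of the oscillator position;
% X_1 and X_2 are constants of the motion for a free oscillator.
\hat{x}(t) = \hat{X}_1 \cos\omega t + \hat{X}_2 \sin\omega t ,
\qquad
[\hat{X}_1,\hat{X}_2] = \frac{i\hbar}{m\omega}
\quad\Longrightarrow\quad
\Delta X_1 \,\Delta X_2 \ge \frac{\hbar}{2 m\omega}.
```

An amplitude-and-phase measurement disturbs X1 and X2 symmetrically, so neither can be known better than roughly (ħ/2mω)^{1/2} (the standard quantum limit), whereas a back-action-evading measurement channels the entire measurement back-action into X2, which is why X1 can in principle be monitored with arbitrary accuracy.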

Chapter 3 describes explicit gedanken experiments which demonstrate that X1 can be measured arbitrarily quickly and arbitrarily accurately; it considers approximate back-action-evading measurements; and it develops a theory of quantum nondemolition measurement for arbitrary quantum mechanical systems.

In Rosen's "bimetric" theory of gravity the (local) speed of gravitational radiation vg is determined by the combined effects of cosmological boundary values and nearby concentrations of matter. It is possible for vg to be less than the speed of light. Chapter 4 shows that emission of gravitational radiation prevents particles of nonzero rest mass from exceeding the speed of gravitational radiation. Observations of relativistic particles place limits on vg and the cosmological boundary values today, and observations of synchrotron radiation from compact radio sources place limits on the cosmological boundary values in the past.

Relevance: 30.00%

Abstract:

A workshop on the computational fluid dynamics (CFD) prediction of shock boundary-layer interactions (SBLIs) was held at the 48th AIAA Aerospace Sciences Meeting. As part of the workshop, numerous CFD analysts submitted solutions to four experimentally measured SBLIs. This paper describes the assessment of the CFD predictions. The assessment includes an uncertainty analysis of the experimental data, the definition of an error metric, and the application of that metric to the CFD solutions. The CFD solutions provided very similar levels of error and, in general, it was difficult to discern clear trends in the data. For the Reynolds-averaged Navier-Stokes (RANS) methods, the choice of turbulence model appeared to be the largest factor in solution accuracy. Scale-resolving methods, such as large-eddy simulation (LES), hybrid RANS/LES, and direct numerical simulation, produced error levels similar to RANS methods but provided superior predictions of normal stresses.
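The abstract does not reproduce the error metric itself; purely as an illustration of the general idea (a point-wise CFD-experiment discrepancy normalized by the experimental uncertainty), a sketch might look like the following. The function name and the RMS form are assumptions, not the metric defined in the paper.

```python
import numpy as np

def normalized_rms_error(cfd, exp, exp_sigma):
    """Illustrative error metric: RMS of the point-wise differences between
    CFD prediction and experiment, each scaled by the local experimental
    uncertainty (hypothetical form, not the paper's definition)."""
    cfd, exp, exp_sigma = map(np.asarray, (cfd, exp, exp_sigma))
    return np.sqrt(np.mean(((cfd - exp) / exp_sigma) ** 2))

# Example: comparing wall-pressure values at three matching stations.
print(normalized_rms_error([1.02, 1.10, 1.25], [1.00, 1.12, 1.20], [0.02, 0.03, 0.04]))
```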

Relevance: 30.00%

Abstract:

Vibration and acoustic analysis at higher frequencies faces two challenges: computing the response without using an excessive number of degrees of freedom, and quantifying its uncertainty due to small spatial variations in geometry, material properties and boundary conditions. Efficient models make use of the observation that when the response of a decoupled vibro-acoustic subsystem is sufficiently sensitive to uncertainty in such spatial variations, the local statistics of its natural frequencies and mode shapes saturate to universal probability distributions. This holds irrespective of the causes that underlie these spatial variations and thus leads to a nonparametric description of uncertainty. This work deals with the identification of uncertain parameters in such models by using experimental data. One of the difficulties is that both experimental errors and modeling errors, due to the nonparametric uncertainty that is inherent to the model type, are present. This is tackled by employing a Bayesian inference strategy. The prior probability distribution of the uncertain parameters is constructed using the maximum entropy principle. The likelihood function that is subsequently computed takes the experimental information, the experimental errors and the modeling errors into account. The posterior probability distribution, which is computed with the Markov Chain Monte Carlo method, provides a full uncertainty quantification of the identified parameters, and indicates how well their uncertainty is reduced, with respect to the prior information, by the experimental data.
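As a schematic sketch of the inference strategy described above (maximum-entropy prior, a Gaussian likelihood whose variance pools experimental and modeling errors, and Markov Chain Monte Carlo sampling of the posterior), the fragment below may help. Every distribution, parameter name and the toy forward model are assumptions made for illustration; they are not the paper's actual vibro-acoustic model.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_prior(theta):
    """Maximum-entropy prior for a positive parameter with known mean 1:
    an exponential distribution (illustrative choice)."""
    return -theta if theta > 0 else -np.inf

def log_likelihood(theta, data, model, sigma_exp, sigma_mod):
    """Gaussian likelihood whose variance pools experimental and modeling
    errors (assumed independent)."""
    resid = data - model(theta)
    var = sigma_exp**2 + sigma_mod**2
    return -0.5 * np.sum(resid**2 / var + np.log(2 * np.pi * var))

def metropolis(data, model, sigma_exp, sigma_mod, n_steps=20_000, step=0.1):
    """Random-walk Metropolis sampler for the posterior of theta."""
    theta = 1.0
    lp = log_prior(theta) + log_likelihood(theta, data, model, sigma_exp, sigma_mod)
    samples = []
    for _ in range(n_steps):
        prop = theta + step * rng.normal()
        lp_prop = log_prior(prop) + log_likelihood(prop, data, model, sigma_exp, sigma_mod)
        if np.log(rng.random()) < lp_prop - lp:   # Metropolis acceptance rule
            theta, lp = prop, lp_prop
        samples.append(theta)
    return np.array(samples)

# Toy identification: recover a stiffness-like scale factor from noisy data.
x_grid = np.linspace(1.0, 10.0, 25)
def model(theta):
    return theta * x_grid

data = 2.0 * x_grid + rng.normal(0.0, 0.5, x_grid.size)
posterior = metropolis(data, model, sigma_exp=0.5, sigma_mod=0.3)
print(posterior[5_000:].mean(), posterior[5_000:].std())   # posterior summary after burn-in
```

The posterior spread relative to the prior is what indicates how much the experimental data actually reduce the parameter uncertainty.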

Relevance: 30.00%

Abstract:

In many real-world situations, we make decisions in the presence of multiple, often conflicting and non-commensurate objectives. The process of optimizing systematically and simultaneously over a set of objective functions is known as multi-objective optimization. In multi-objective optimization, we have a (possibly exponentially large) set of decisions, and each decision has a set of alternatives. Each alternative depends on the state of the world and is evaluated with respect to a number of criteria. In this thesis, we consider decision-making problems in two scenarios: in the first, the current state of the world, under which the decisions are to be made, is known in advance; in the second, it is unknown at the time the decisions are made.

For decision making under certainty, we consider the framework of multi-objective constraint optimization and focus on extending the algorithms that solve these models to the case where there are additional trade-offs. We focus especially on branch-and-bound algorithms that use a mini-buckets algorithm for generating the upper bound at each node of the search tree (in the context of maximizing values of objectives). Since the size of the guiding upper-bound sets can become very large during the search, we introduce efficient methods for reducing these sets while still maintaining the upper-bound property. We define a formalism for imprecise trade-offs, which allows the decision maker, during the elicitation stage, to specify a preference for one multi-objective utility vector over another, and we use such preferences to infer other preferences. The induced preference relation is then used to eliminate dominated utility vectors during the computation. For testing dominance between multi-objective utility vectors, we present three different approaches: the first is based on linear programming; the second uses a distance-based algorithm (which uses a measure of the distance between a point and a convex cone); the third makes use of a matrix multiplication, which results in much faster dominance checks with respect to the preference relation induced by the trade-offs. Furthermore, we show that our trade-offs approach, which is based on a preference inference technique, can also be given an alternative semantics based on the well-known Multi-Attribute Utility Theory. Our comprehensive experimental results on common multi-objective constraint optimization benchmarks demonstrate that the proposed enhancements allow the algorithms to scale up to much larger problems than before.

For decision-making problems under uncertainty, we describe multi-objective influence diagrams, based on a set of p objectives, where utility values are vectors in R^p and are typically only partially ordered. These can be solved by a variable elimination algorithm, leading to a set of maximal values of expected utility. If the Pareto ordering is used, this set can often be prohibitively large. We consider approximate representations of the Pareto set based on ϵ-coverings, allowing much larger problems to be solved. In addition, we define a method for incorporating user trade-offs, which also greatly improves the efficiency.
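For readers unfamiliar with the basic dominance test that all of the above builds on, here is a minimal sketch of plain Pareto dominance and Pareto filtering for maximization. It deliberately shows only the unextended Pareto ordering, not the trade-off-induced relation, the mini-buckets bounds, or the matrix-multiplication check described in the thesis.

```python
import numpy as np

def dominates(u, v):
    """Pareto dominance for maximization: u dominates v if u is at least as
    good on every objective and strictly better on at least one."""
    u, v = np.asarray(u), np.asarray(v)
    return bool(np.all(u >= v) and np.any(u > v))

def pareto_maximal(vectors):
    """Keep only the undominated (maximal) utility vectors."""
    return [u for u in vectors
            if not any(dominates(v, u) for v in vectors if v is not u)]

# Toy example with three objectives: (3, 1, 1) is dominated by (3, 1, 2).
utilities = [(3, 1, 2), (2, 2, 2), (3, 1, 1), (1, 3, 3)]
print(pareto_maximal(utilities))
```

The trade-off-induced preference relation described in the thesis strictly extends this ordering, so it prunes more utility vectors than the plain Pareto test above.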

Relevance: 30.00%

Abstract:

Most air quality modelling work has so far been oriented towards deterministic simulations of ambient pollutant concentrations. This traditional approach, which is based on the use of one selected model and one data set of discrete input values, does not reflect the uncertainties due to errors in model formulation and input data. Given the complexities of urban environments and the inherent limitations of mathematical modelling, it is unlikely that a single model based on routinely available meteorological and emission data will give satisfactory short-term predictions. In this study, different methods involving the use of more than one dispersion model, in association with different emission simulation methodologies and meteorological data sets, were explored for predicting best estimates of CO and benzene concentrations and the related confidence bounds. The different approaches were tested using experimental data obtained during intensive monitoring campaigns in busy street canyons in Paris, France. Three relatively simple dispersion models (STREET, OSPM and AEOLIUS) that are likely to be used for regulatory purposes were selected for this application. A sensitivity analysis was conducted in order to identify internal model parameters that might significantly affect results. Finally, a probabilistic methodology for assessing urban air quality was proposed.
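As a minimal sketch of how several model/input combinations can be pooled into a best estimate with confidence bounds, consider the fragment below. All numbers and array shapes are invented for illustration; STREET, OSPM and AEOLIUS are not actually being called here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical hourly CO predictions (mg/m3) from three dispersion models,
# each run with two emission methodologies and two meteorological data sets.
# Shape: (model, emission method, meteorological data set, hour).
predictions = rng.lognormal(mean=1.0, sigma=0.3, size=(3, 2, 2, 24))

# Pool every model/input combination for each hour; take the median as the
# best estimate and the 5th/95th percentiles as confidence bounds.
pooled = predictions.reshape(-1, 24)
best_estimate = np.median(pooled, axis=0)
lower, upper = np.percentile(pooled, [5, 95], axis=0)
print(best_estimate[8], (lower[8], upper[8]))   # e.g. the 09:00 hour
```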

Relevance: 30.00%

Abstract:

This paper explores the social dimensions of an experimental release of carbon dioxide (CO2) carried out in Ardmucknish Bay, Argyll, United Kingdom. The experiment, which aimed to understand detectability and potential effects on the marine environment should there be any leakage from a CO2 storage site, provided a rare opportunity to study the social aspects of a carbon dioxide capture and storage (CCS) related event taking place in a lived-in environment. Qualitative research was carried out in the form of observation at public information events about the release, in-depth interviews with key project staff and local stakeholders/community members, and a review of online media coverage of the experiment. Focusing mainly on the observation and interview data, we discuss three key findings: the role of experience and analogues in learning about unfamiliar concepts like CO2 storage; the challenge of addressing questions of uncertainty in public engagement; and the issue of when to commence engagement and how to frame the discussion. We conclude that whilst there are clearly slippages between a small-scale experiment and full-scale CCS, the social research carried out for this project demonstrates that issues of public and stakeholder perception are as relevant for offshore CO2 storage as they are for onshore.

Relevance: 30.00%

Abstract:

George Brecht, an artist best known for his associations with Fluxus, is considered to have made significant contributions to emerging traditions of conceptual art and experimental music in the early 1960s. His Event scores, brief verbal scores that comprised lists of terms or open-ended instructions, provided a signature model for indeterminate composition and were ‘used extensively by virtually every Fluxus artist’. This article revisits Brecht’s early writings and research to argue that, while Event scores were adopted within Fluxus performance, they were intended as much more than performance devices. Specifically, Brecht conceived of his works as ‘structures of experience’ that, by revealing the underlying connections between chanced forms, could enable a kind of enlightenment rooted within an experience of a ‘unified reality’.

Relevance: 30.00%

Abstract:

Determining the trophic niche width of an animal population and the relative degree to which a generalist population consists of dietary specialists are long-standing problems of ecology. It has been proposed that the variance of stable isotope values in consumer tissues could be used to quantify the trophic niche width of consumer populations. However, this promising idea has not yet been rigorously tested. By conducting controlled laboratory experiments using model consumer populations (Daphnia sp., Crustacea) with controlled diets, we investigated the effect of individual- and population-level specialisation and generalism on consumer δ13C mean and variance values. While our experimental data follow general expectations, we extend current qualitative models to quantitative predictions of the dependence of isotopic variance on dietary correlation time, a measure of the typical time over which a consumer changes its diet. This quantitative approach allows us to pinpoint possible procedural pitfalls and critical sources of measurement uncertainty. Our results show that the stable isotope approach represents a powerful method for estimating trophic niche widths, especially when taking the quantitative concept of dietary correlation time into account.
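A minimal illustrative simulation of the idea (not the authors' model): individuals switch between two isotopically distinct diets as a Markov process with a given correlation time, tissue δ13C follows the current diet through first-order turnover, and the resulting population variance of tissue δ13C is reported. All rates and δ13C end-member values below are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

def tissue_d13c_variance(corr_time, turnover_time=10.0, n_ind=200, n_days=400):
    """Illustrative only: two diet sources at -30 and -20 permil; each
    individual switches diet with probability 1/corr_time per day; tissue
    d13C approaches the current diet with first-order turnover."""
    diets = np.array([-30.0, -20.0])
    state = rng.integers(0, 2, n_ind)        # current diet of each individual
    tissue = diets[state].copy()             # start equilibrated to that diet
    p_switch = 1.0 / corr_time
    k = 1.0 / turnover_time                  # tissue turnover rate (1/day)
    for _ in range(n_days):
        switch = rng.random(n_ind) < p_switch
        state = np.where(switch, 1 - state, state)
        tissue += k * (diets[state] - tissue)
    return tissue.var()

# Long dietary correlation times preserve individual specialisation and hence
# a large isotopic variance; short ones average it away.
for tau in (1.0, 10.0, 100.0):
    print(tau, round(float(tissue_d13c_variance(tau)), 2))
```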

Relevance: 30.00%

Abstract:

In this study, the concentration probability distributions of 82 pharmaceutical compounds detected in the effluents of 179 European wastewater treatment plants were computed and inserted into a multimedia fate model. The comparative ecotoxicological impact of the direct emission of these compounds from wastewater treatment plants on freshwater ecosystems, based on a potentially affected fraction (PAF) of species approach, was assessed to rank compounds based on priority. As many pharmaceuticals are acids or bases, the multimedia fate model accounts for regressions to estimate pH-dependent fate parameters. An uncertainty analysis was performed by means of Monte Carlo analysis, which included the uncertainty of fate and ecotoxicity model input variables, as well as the spatial variability of landscape characteristics on the European continental scale. Several pharmaceutical compounds were identified as being of greatest concern, including 7 analgesics/anti-inflammatories, 3 β-blockers, 3 psychiatric drugs, and 1 each of 6 other therapeutic classes. The fate and impact modelling relied extensively on estimated data, given that most of these compounds have little or no experimental fate or ecotoxicity data available, as well as a limited reported occurrence in effluents. The contribution of estimated model input variables to the variance of freshwater ecotoxicity impact, as well as the lack of experimental abiotic degradation data for most compounds, helped in establishing priorities for further testing. Generally, the effluent concentration and the ecotoxicity effect factor were the model input variables with the most significant effect on the uncertainty of output results.
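A stripped-down sketch of the Monte Carlo uncertainty propagation described above, for a single compound: lognormal input distributions and an impact score proportional to emission × fate × effect. All distributions and numbers are invented; the actual model is a spatially resolved, pH-dependent multimedia fate model.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10_000

# Hypothetical lognormal input distributions for one compound:
# effluent emission (kg/day), fate factor (days), effect factor (PAF·m3/kg).
emission = rng.lognormal(np.log(1e-3), 1.0, n)
fate     = rng.lognormal(np.log(20.0), 0.5, n)
effect   = rng.lognormal(np.log(5e2),  1.2, n)

impact = emission * fate * effect          # simple impact score per draw

print("median impact:", np.median(impact))
print("95% interval :", np.percentile(impact, [2.5, 97.5]))

# Crude variance-contribution check: correlation of each log-input with the
# log-output indicates which input drives the output uncertainty.
for name, x in [("emission", emission), ("fate", fate), ("effect", effect)]:
    print(name, round(np.corrcoef(np.log(x), np.log(impact))[0, 1], 2))
```

Ranking compounds by a percentile of such an impact distribution, rather than by a single deterministic value, is one way the reported prioritization can reflect the input uncertainty.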

Relevance: 30.00%

Abstract:

We present an experiment designed to study the psychological basis for the willingness to accept (WTA)–willingness to pay (WTP) gap. Specifically, we conduct a standard WTA–WTP economic experiment to replicate the gap and include in it five additional instruments to try to follow the psychological processes producing it. These instruments are designed to measure five psychological constructs we consider especially relevant: (1) attitudes, (2) feelings, (3) familiarity with the target good, (4) risk attitudes, and (5) personality. Our results provide important new insights into the psychological foundations of the WTA–WTP disparity, which can be used to organize some major previous results and cast serious doubts on the claim that the gap might be just a consequence of inappropriate experimental practice.

Relevance: 30.00%

Abstract:

We apply experimental methods to study the role of risk aversion on players’ behavior in repeated prisoners’ dilemma games. Faced with quantitatively equal discount factors, the most risk-averse players will choose Nash strategies more often in the presence of uncertainty than when future profits are discounted in a deterministic way. Overall, we find that risk aversion relates negatively with the frequency of collusive outcomes.

Relevance: 30.00%

Abstract:

We apply a new parameterisation of the Greenland ice sheet (GrIS) feedback between surface mass balance (SMB: the sum of surface accumulation and surface ablation) and surface elevation in the MAR regional climate model (Edwards et al., 2014) to projections of future climate change using five ice sheet models (ISMs). The MAR (Modèle Atmosphérique Régional: Fettweis, 2007) climate projections are for 2000–2199, forced by the ECHAM5 and HadCM3 global climate models (GCMs) under the SRES A1B emissions scenario. The additional sea level contribution due to the SMB–elevation feedback averaged over five ISM projections for ECHAM5 and three for HadCM3 is 4.3% (best estimate; 95% credibility interval 1.8–6.9%) at 2100, and 9.6% (best estimate; 95% credibility interval 3.6–16.0%) at 2200. In all results the elevation feedback is significantly positive, amplifying the GrIS sea level contribution relative to the MAR projections in which the ice sheet topography is fixed: the lower bounds of our 95% credibility intervals (CIs) for sea level contributions are larger than the "no feedback" case for all ISMs and GCMs. Our method is novel in sea level projections because we propagate three types of modelling uncertainty – GCM and ISM structural uncertainties, and elevation feedback parameterisation uncertainty – along the causal chain, from SRES scenario to sea level, within a coherent experimental design and statistical framework. The relative contributions to uncertainty depend on the timescale of interest. At 2100, the GCM uncertainty is largest, but by 2200 both the ISM and parameterisation uncertainties are larger. We also perform a perturbed parameter ensemble with one ISM to estimate the shape of the projected sea level probability distribution; our results indicate that the probability density is slightly skewed towards higher sea level contributions.
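Propagating structural (GCM, ISM) and parameterisation uncertainty can be sketched, in a very simplified form, as a nested Monte Carlo over discrete model choices and a continuous feedback amplification factor. Everything below is invented for illustration and is not the paper's statistical framework; the amplification distribution is merely a normal fitted to the abstract's reported 4.3% (1.8–6.9%) interval at 2100, and the "no feedback" contributions are made up.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical "no feedback" sea-level contributions at 2100 (cm) for each
# GCM x ISM combination (2 GCMs x 5 ISMs); the values are invented.
no_feedback = rng.normal(7.0, 1.5, size=(2, 5))

n = 50_000
gcm = rng.integers(0, 2, n)                        # structural choice: GCM
ism = rng.integers(0, 5, n)                        # structural choice: ISM
amp = 1.0 + rng.normal(0.043, 0.013, n).clip(0)    # feedback amplification (~4.3%)

contribution = no_feedback[gcm, ism] * amp         # propagate along the chain
best = np.median(contribution)
lo, hi = np.percentile(contribution, [2.5, 97.5])
print(f"best estimate {best:.1f} cm, 95% interval ({lo:.1f}, {hi:.1f}) cm")
```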

Relevance: 30.00%

Abstract:

Graduate Program in Mechanical Engineering - FEIS

Relevance: 30.00%

Abstract:

In this paper we present an experimental setup to check the Heisenberg uncertainty principle. The description of the experimental setup and of its theoretical foundations is aimed at familiarizing students with the concepts involved.
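For reference, the relation such a setup is meant to test is the position-momentum uncertainty relation; the single-slit diffraction estimate below is the most common classroom realization and is included only as an assumed example, since the abstract does not describe the apparatus.

```latex
% Position-momentum uncertainty relation:
\Delta x \,\Delta p_x \;\ge\; \frac{\hbar}{2}.
% Single-slit estimate (assumed realization): a slit of width $a$ localizes
% the photon transversely, $\Delta y \approx a$; the first diffraction
% minimum at $\sin\theta \approx \lambda/a$ gives
% $\Delta p_y \approx p\sin\theta = (h/\lambda)(\lambda/a) = h/a$,
% so $\Delta y\,\Delta p_y \approx h \ge \hbar/2$.
```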

Relevance: 30.00%

Abstract:

Managers know more about the performance of the organization than investors, which makes the disclosure of information a possible strategy for competitive differentiation, minimizing adverse selection. This paper's main goal is to analyze whether or not an entity's level of disclosure may affect individuals' risk perception and the process of evaluating its shares. The survey was carried out as an experimental study with 456 subjects. In a stock market simulation, we investigated the pricing of the stocks of two companies with different levels of information disclosure at four separate stages. The results showed that, when other variables are held constant, the level of disclosure of an entity can affect the expectations of individuals and the process of evaluating its shares. A higher level of disclosure by an entity affected the value of its shares and of the other company's.