928 results for Experiments with Change
Abstract:
Remote sensing observations often have correlated errors, but the correlations are typically ignored in data assimilation for numerical weather prediction. The assumption of zero correlations is often used with data thinning methods, resulting in a loss of information. As operational centres move towards higher-resolution forecasting, there is a requirement to retain data providing detail on appropriate scales. Thus an alternative approach to dealing with observation error correlations is needed. In this article, we consider several approaches to approximating observation error correlation matrices: diagonal approximations, eigendecomposition approximations and Markov matrices. These approximations are applied in incremental variational assimilation experiments with a 1-D shallow water model using synthetic observations. Our experiments quantify analysis accuracy in comparison with a reference or ‘truth’ trajectory, as well as with analyses using the ‘true’ observation error covariance matrix. We show that it is often better to include an approximate correlation structure in the observation error covariance matrix than to incorrectly assume error independence. Furthermore, by choosing a suitable matrix approximation, it is feasible and computationally cheap to include error correlation structure in a variational data assimilation algorithm.
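To make the approximations concrete, here is a minimal numerical sketch (not the paper's code): it builds a first-order Markov, i.e. exponentially decaying, correlation matrix for a line of synthetic observations, then forms the diagonal approximation and a truncated-eigendecomposition approximation and measures how far each sits from the "true" matrix. The size, length scale and retained rank are illustrative assumptions.

```python
import numpy as np

n, L = 20, 3.0                    # number of observations, correlation length scale (illustrative)
idx = np.arange(n)
# First-order Markov (exponentially decaying) correlation matrix.
R = np.exp(-np.abs(idx[:, None] - idx[None, :]) / L)

R_diag = np.diag(np.diag(R))      # diagonal approximation: correlations dropped

w, V = np.linalg.eigh(R)          # eigendecomposition, eigenvalues ascending
k = 5                             # retain only the k leading eigenmodes
R_eig = (V[:, -k:] * w[-k:]) @ V[:, -k:].T

for name, M in (("diagonal", R_diag), (f"rank-{k} eigendecomposition", R_eig)):
    print(f"{name}: Frobenius error = {np.linalg.norm(R - M):.3f}")
```

One attraction of the Markov choice in particular is that the inverse of an exponentially decaying correlation matrix is tridiagonal, so the products with the inverse covariance required by the variational cost function stay cheap.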
Abstract:
This paper investigates the use of a particle filter for data assimilation with a full-scale coupled ocean–atmosphere general circulation model. Synthetic twin experiments are performed to assess the performance of the equivalent-weights filter in such a high-dimensional system. Artificial two-dimensional sea surface temperature fields are used as observational data every day. Results are presented for different values of the free parameters in the method. The performance of the filter is measured by root-mean-square errors, trajectories of individual model variables and rank histograms. Filter degeneracy is not observed, and the performance of the filter is shown to depend on the ability to maintain maximum spread in the ensemble.
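As a concrete illustration of one diagnostic named above, the sketch below computes a rank histogram on synthetic data: for each observation it records the rank of the verifying value within the ensemble, and a roughly flat histogram indicates adequate spread. The ensemble and observations are invented; this is not the equivalent-weights filter itself.

```python
import numpy as np

rng = np.random.default_rng(0)
n_ens, n_obs = 24, 1000
ensemble = rng.normal(size=(n_obs, n_ens))   # hypothetical ensemble forecasts
truth = rng.normal(size=n_obs)               # synthetic verifying observations

# Rank of each observation among the sorted ensemble members (0..n_ens).
ranks = (ensemble < truth[:, None]).sum(axis=1)
hist = np.bincount(ranks, minlength=n_ens + 1)
print(hist)   # roughly flat here, since ensemble and truth share one distribution
```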
Abstract:
As scientific workflows, and the data they operate on, grow in size and complexity, the task of defining how those workflows should execute (which resources to use, where the resources must be staged ready for processing, etc.) becomes proportionally more difficult. While "workflow compilers", such as Pegasus, reduce this burden, a further problem arises: since the details of execution are now specified automatically, a workflow's results become harder to interpret, as they are partly determined by the specifics of execution. By automating the steps between the experiment design and its results, we lose the connection between them, hindering interpretation of the results. To reconnect the scientific data with the original experiment, we argue that scientists should have access to the full provenance of their data, including not only parameters, inputs and intermediate data, but also the abstract experiment, refined into a concrete execution by the "workflow compiler". In this paper, we describe preliminary work on adapting Pegasus to capture the process of workflow refinement in the PASOA provenance system.
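The sketch below is a hypothetical illustration, not the Pegasus or PASOA API: it shows the kind of record that would reconnect results to the experiment design, linking each concrete task emitted by the workflow compiler back to the abstract step it refines, together with the execution choices made on the scientist's behalf. All names are invented.

```python
from dataclasses import dataclass, field

@dataclass
class RefinementRecord:
    abstract_step: str     # step as it appears in the experiment design
    concrete_task: str     # executable job produced by the workflow compiler
    site: str              # execution resource selected during refinement
    parameters: dict = field(default_factory=dict)

# One refinement captured alongside the usual parameters, inputs and intermediates.
provenance = [
    RefinementRecord("align_images", "align_images_ID000001",
                     "cluster-A", {"scheduler": "condor"}),
]
for rec in provenance:
    print(f"{rec.concrete_task} on {rec.site} <- refined from '{rec.abstract_step}'")
```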
Abstract:
This thesis explores the concept of impression management from the financial analysts' point of view. Impression management is defined as the act of an agent manipulating the impression that another person has of that agent; in the context of this thesis, it occurs when a company designs graphs for the disclosure of financial-accounting information so as to manipulate the market's perception of its performance. Three types of impression management were analyzed: presentation enhancement (color manipulation), measurement distortion (scale manipulation) and selectivity (the disclosure of positive information only). While presentation enhancement improved only the most impulsive financial analysts' perception of the firm's performance, measurement distortion improved the perception of performance for both groups of financial analysts (impulsive and reflective). Finally, selectivity improved the financial analysts' perception of the firm's performance for both groups (impulsive and reflective), although impulsive financial analysts assigned, on average, lower ratings to a hypothetical company than their reflective peers did.
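Of the three manipulations, measurement distortion is the easiest to show in code. The sketch below, with invented numbers, plots the same nearly flat earnings series twice: once on a zero-based axis and once on a truncated axis that exaggerates the growth, which is the scale manipulation studied above.

```python
import matplotlib.pyplot as plt

years = [2019, 2020, 2021, 2022]
earnings = [100, 102, 104, 107]   # hypothetical, nearly flat performance

fig, (ax_honest, ax_distorted) = plt.subplots(1, 2, figsize=(8, 3))
for ax in (ax_honest, ax_distorted):
    ax.plot(years, earnings, marker="o")
ax_honest.set_ylim(0, 120)        # zero-based scale: modest growth
ax_distorted.set_ylim(99, 108)    # truncated scale: growth looks dramatic
ax_honest.set_title("zero-based axis")
ax_distorted.set_title("truncated axis")
plt.tight_layout()
plt.show()
```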
Abstract:
We consider a procedure for obtaining a compact fourth-order method for the steady 2D Navier-Stokes equations in the streamfunction formulation using the computer algebra system Maple. The resulting code is short, and from it we obtain the Fortran program for the method. To test the procedure we have solved a number of cavity-type problems, including one with an analytical solution, and the results are compared with those obtained by second-order central differences for moderate Reynolds numbers.
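As a rough illustration of the workflow described, deriving a scheme symbolically and then emitting Fortran, the sketch below uses SymPy in place of Maple: it verifies by Taylor expansion that the 3-point second-derivative stencil has an O(h²) leading error, the starting point for compact fourth-order corrections, and then prints Fortran for the stencil. A toy, not the paper's derivation.

```python
import sympy as sp

h = sp.symbols('h', positive=True)
u0, u1, u2, u3, u4 = sp.symbols('u0 u1 u2 u3 u4')  # u and its derivatives at the node
derivs = (u0, u1, u2, u3, u4)

def taylor(s):
    """Taylor polynomial of u(x + s*h) about the node, to fourth order."""
    return sum(d * (s * h)**k / sp.factorial(k) for k, d in enumerate(derivs))

stencil = (taylor(-1) - 2 * u0 + taylor(1)) / h**2
print(sp.expand(stencil - u2))     # leading truncation error: h**2*u4/12

# Emit Fortran for the stencil, in the spirit of Maple's code generation.
ui, uim, uip = sp.symbols('u_i u_im1 u_ip1')
print(sp.fcode((uim - 2 * ui + uip) / h**2, assign_to='d2u'))
```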
Abstract:
We generalize a procedure proposed by Mancera and Hunt [P.F.A. Mancera, R. Hunt, Some experiments with high order compact methods using a computer algebra software-Part 1, Appl. Math. Comput., in press, doi:10.1016/j.amc.2005.05.015] for obtaining a compact fourth-order method for the steady 2D Navier-Stokes equations in the streamfunction-vorticity formulation using the computer algebra system Maple, extending it to conformal mappings and non-uniform grids. To assess the procedure we have solved a constricted stepped-channel problem, in which a fine grid is placed near the re-entrant corner by a transformation of the independent variables.
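The move to non-uniform grids rests on a coordinate transformation: with a mapping x = X(ξ) that clusters points near the corner, x-derivatives are rewritten as derivatives in the uniform computational coordinate ξ by the chain rule. A minimal SymPy sketch of that step (illustrative, not the authors' code):

```python
import sympy as sp

xi = sp.symbols('xi')
X = sp.Function('X')(xi)   # grid mapping x = X(xi), e.g. clustered near the re-entrant corner
U = sp.Function('U')(xi)   # solution sampled on the uniform computational grid

du_dx = U.diff(xi) / X.diff(xi)                     # chain rule: du/dx = U'/X'
d2u_dx2 = sp.simplify(du_dx.diff(xi) / X.diff(xi))  # differentiate again in xi
print(d2u_dx2)   # (U'' X' - U' X'') / X'**3
```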
Abstract:
Inaugural address to the twenty-second session of ECLAC (CEPAL), delivered by Mr. Gert Rosenthal, referring in particular to the past, present and future role of the Commission in the economic and social development of the region.
Abstract:
Despite their importance in the evaluation of petroleum and gas reservoirs, measurements of self-potential data under borehole conditions (well logging) have found only minor application in aquifer and waste-site characterization. This can be attributed to the weaker signals from diffusion fronts in near-surface environments, because measurements are made long after the drilling of the well, when concentration fronts are already fading. Proportionally stronger signals arise from streaming potentials, which prevent the use of simple interpretation models that assume signals from diffusion alone. Our laboratory experiments found that dual-source self-potential signals can be described by a simple linear model, and that the contributions (from diffusion and streaming potentials) can be isolated by slightly perturbing the borehole conditions. Perturbations are applied either by changing the concentration of the borehole-filling solution or by changing its column height. Parameters useful for formation evaluation can be estimated from data measured during the perturbations, namely pore-water resistivity, the pressure drop across the borehole wall, and the electrokinetic coupling parameter. These parameters are important for assessing, respectively, water quality, aquifer lateral continuity, and the interfacial properties of permeable formations.
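A minimal sketch of the separation idea, under an assumed linear form V = a·log10(c_ratio) + C·ΔP in which the first term stands for the diffusion contribution and the second for the streaming contribution: perturbing the borehole concentration and the column height yields enough equations to recover both coefficients by least squares. All numbers below are synthetic, and the functional form is a hypothesis for illustration, not the paper's calibrated model.

```python
import numpy as np

# Perturbation measurements: concentration ratio and column height varied.
log_ratio = np.array([0.0, 0.3, 0.3, 0.6])   # log10 borehole/formation concentration ratio
dP = np.array([1.0, 1.0, 2.0, 2.0])          # pressure drop across the borehole wall (arb. units)
V = np.array([2.0, 3.5, 5.5, 7.0])           # measured self-potential (mV, synthetic)

A = np.column_stack([log_ratio, dP])
(a, C), *_ = np.linalg.lstsq(A, V, rcond=None)
print(f"diffusion slope a = {a:.2f}, electrokinetic coupling C = {C:.2f}")
```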
Abstract:
Vortex-Induced Vibration (VIV) experiments were carried out with yawed cylinders. The purpose was to investigate the validity of the Independence Principle (IP) for properly describing the flow characteristics and the dynamics of structures subjected to oblique flow. Five yaw angles relative to the direction perpendicular to the free-stream velocity were tested, namely θ = 0°, 10°, 20°, 30° and 45°. Both the upstream and downstream orientations were tested. The models were mounted on a leaf-spring apparatus that allows experiments with one or two degrees of freedom. The Reynolds numbers based on the velocity component normal to the cylinder axis fell in the interval 3×10³
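The bookkeeping behind the Independence Principle is simple enough to show directly: the flow is assumed to be governed by the velocity component normal to the cylinder axis, U_n = U cos θ, so the normal Reynolds number Re_n = U_n D/ν decreases with yaw. The sketch below evaluates Re_n at the five yaw angles tested, with illustrative values of U, D and ν rather than the experiment's.

```python
import numpy as np

U, D, nu = 0.2, 0.05, 1e-6   # free-stream speed (m/s), diameter (m), viscosity (m^2/s) -- illustrative
for theta_deg in (0, 10, 20, 30, 45):
    Un = U * np.cos(np.radians(theta_deg))   # normal velocity component
    print(f"theta = {theta_deg:2d} deg: Re_n = {Un * D / nu:,.0f}")
```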
Abstract:
The work presented in this thesis concerns the study and simulation of bistatic radar experiments for planetary exploration missions. In particular, the work focused on the use and improvement of a software simulator previously developed by a consortium of companies and research institutions within a European Space Agency (ESA) study funded in 2008 and carried out between 2009 and 2010. The Spanish company GMV coordinated the study, in which research groups from the Sapienza University of Rome and the University of Bologna also took part. The present work centred on identifying the cause of certain inconsistencies in the outputs of the part of the simulator, implemented in MATLAB, devoted to estimating the properties of Titan's surface, in particular its dielectric constant and mean surface roughness, by means of a downlink bistatic radar experiment performed by the Cassini-Huygens probe in orbit around Titan. Bistatic radar experiments for the study of celestial bodies have featured in the history of space exploration since the 1960s, although the equipment used, and the mission phases during which the experiments were performed, were never purpose-designed for the task; hence the need for a simulator with which to study the various possible configurations of bistatic radar experiments in different types of mission. In a first phase, the work focused on the documentation accompanying the code, so as to gain a general picture of its structure and operation. A detailed study followed, establishing the purpose of every line of code and verifying against the literature the formulas and models used to determine the various parameters. In a second phase the work involved direct intervention in the code, with a series of investigations aimed at assessing its consistency and the reliability of its results. Each investigation progressively relaxed the simplifying assumptions imposed on the model, so as to identify with greater confidence the part of the code responsible for the inaccuracy of the simulator's outputs. The results obtained allowed several parts of the code to be corrected and the main source of error in the outputs to be identified, narrowing the focus for future targeted investigations.
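As a taste of the physics such simulators implement, the sketch below evaluates the Fresnel amplitude reflection coefficients of a smooth dielectric surface; in a downlink bistatic radar experiment, the polarization ratio of the received echo constrains the surface dielectric constant. The candidate permittivities and incidence angle are illustrative, not values from the Cassini-Huygens experiment.

```python
import numpy as np

def fresnel_coeffs(eps_r, theta_i):
    """Horizontal/vertical amplitude reflection coefficients at incidence theta_i (rad)."""
    c, s = np.cos(theta_i), np.sin(theta_i)
    root = np.sqrt(eps_r - s**2)
    return (c - root) / (c + root), (eps_r * c - root) / (eps_r * c + root)

for eps in (1.6, 2.0, 3.0):   # candidate dielectric constants for an icy surface
    r_h, r_v = fresnel_coeffs(eps, np.radians(60.0))
    print(f"eps_r = {eps}: |R_h|^2 = {r_h**2:.3f}, |R_v|^2 = {r_v**2:.3f}")
```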