957 results for Heelu, Jan van.


Relevance:

20.00%

Publisher:

Abstract:

Data assimilation methods which avoid the assumption of Gaussian error statistics are being developed for geoscience applications. We investigate how the relaxation of the Gaussian assumption affects the impact observations have within the assimilation process. The effect of a non-Gaussian observation error (described by the likelihood) is compared with the effect of a non-Gaussian prior studied in previously published work. The observation impact is measured in three ways: the sensitivity of the analysis to the observations, the mutual information, and the relative entropy. These three measures have all been studied in the case of Gaussian data assimilation and, in this case, have a known analytical form. It is shown that the analysis sensitivity can also be derived analytically when at least one of the prior or likelihood is Gaussian. This derivation shows an interesting asymmetry in the relationship between analysis sensitivity and analysis error covariance when the two different sources of non-Gaussian structure are considered (likelihood vs. prior). This is illustrated for a simple scalar case and used to infer the effect of the non-Gaussian structure on mutual information and relative entropy, which are more natural choices of metric in non-Gaussian data assimilation. It is concluded that approximating non-Gaussian error distributions as Gaussian can give significantly erroneous estimates of observation impact. The degree of the error depends not only on the nature of the non-Gaussian structure, but also on the metric used to measure the observation impact and on the source of the non-Gaussian structure.
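
As a point of reference for the Gaussian limit mentioned above, the short sketch below evaluates the three impact measures for a hypothetical scalar case with a Gaussian prior and likelihood and a unit observation operator. The variances, background value and observation are invented, and the formulas used (Kalman gain as analysis sensitivity, mutual information as a log ratio of variances, Gaussian relative entropy with a signal term) are standard Gaussian results rather than anything specific to the paper.

```python
import numpy as np

# Hypothetical scalar Gaussian case with a unit observation operator (H = 1).
sigma_b2 = 2.0      # prior (background) error variance, assumed value
sigma_o2 = 1.0      # observation error variance, assumed value
x_b, y = 0.0, 1.5   # background value and observation, assumed values

# Kalman gain and analysis statistics (standard Gaussian results).
k = sigma_b2 / (sigma_b2 + sigma_o2)    # analysis sensitivity to the observation
x_a = x_b + k * (y - x_b)               # analysis mean
sigma_a2 = (1.0 - k) * sigma_b2         # analysis error variance

# Mutual information: entropy reduction from prior to posterior.
mutual_info = 0.5 * np.log(sigma_b2 / sigma_a2)

# Relative entropy (KL divergence) of the posterior from the prior:
# a dispersion term plus a signal (mean-shift) term.
rel_entropy = 0.5 * (np.log(sigma_b2 / sigma_a2)
                     + sigma_a2 / sigma_b2 - 1.0
                     + (x_a - x_b) ** 2 / sigma_b2)

print(f"sensitivity = {k:.3f}, MI = {mutual_info:.3f} nats, RE = {rel_entropy:.3f} nats")
```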

Relevance:

20.00%

Publisher:

Abstract:

Understanding the interaction of organic molecules with TiO2 surfaces is important for a wide range of technological applications. While density functional theory (DFT) calculations can provide valuable insight into these interactions, traditional DFT approaches with local exchange-correlation functionals suffer from a poor description of non-bonding van der Waals (vdW) interactions. We examine here the contribution of vdW forces to the interaction of small organic molecules (methane, methanol, formic acid and glycine) with the TiO2 (110) surface, based on DFT calculations with the optB88-vdW functional. The adsorption geometries and energies at different configurations were also obtained in the standard generalized gradient approximation (GGA-PBE) for comparison. We find that the optB88-vdW consistently gives shorter adsorbate-to-surface distances and slightly stronger interactions than PBE for the weak (physisorbed) modes of adsorption. In the case of strongly adsorbed (chemisorbed) molecules, both functionals give similar results for the adsorption geometries, and also similar values of the relative energies between different chemisorption modes for each molecule. In particular, both functionals predict that dissociative adsorption is more favourable than molecular adsorption for methanol, formic acid and glycine, in general agreement with experiment. The dissociation energies obtained from both functionals are also very similar, indicating that vdW interactions do not affect the thermodynamics of surface deprotonation. However, the optB88-vdW always predicts stronger adsorption than PBE. The comparison of the methanol adsorption energies with values obtained from a Redhead analysis of temperature-programmed desorption data suggests that optB88-vdW significantly overestimates the adsorption strength, although we warn about the uncertainties involved in such comparisons.
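
To make the final comparison concrete, here is a minimal sketch, with made-up numbers rather than the paper's DFT results, of the two quantities being compared: an adsorption energy formed from total energies of the slab, the molecule and the combined system, and a desorption energy estimated from a TPD peak temperature via the first-order Redhead formula, assuming a typical prefactor of 1e13 s^-1.

```python
import numpy as np

R = 8.314462618e-3   # gas constant, kJ mol^-1 K^-1
EV_TO_KJ_PER_MOL = 96.485

def adsorption_energy(e_slab_mol, e_slab, e_mol):
    """Adsorption energy from DFT total energies in eV (negative = bound)."""
    return e_slab_mol - e_slab - e_mol

def redhead_energy(t_peak, beta, nu=1e13):
    """First-order Redhead estimate of the desorption energy in kJ/mol.

    t_peak: TPD peak temperature (K); beta: heating rate (K/s);
    nu: assumed pre-exponential factor (s^-1).
    """
    return R * t_peak * (np.log(nu * t_peak / beta) - 3.64)

# Illustrative, made-up numbers only (not the paper's results).
e_ads = adsorption_energy(e_slab_mol=-1005.20, e_slab=-1000.00, e_mol=-4.50)
print(f"DFT adsorption energy    : {e_ads * EV_TO_KJ_PER_MOL:.1f} kJ/mol")
print(f"Redhead desorption energy: {redhead_energy(t_peak=295.0, beta=1.0):.1f} kJ/mol")
```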

Relevance:

20.00%

Publisher:

Abstract:

This article shows how one can formulate the representation problem starting from Bayes’ theorem. The purpose of this article is to raise awareness of the formal solutions, so that approximations can be placed in a proper context. The representation errors appear in the likelihood, and the different possibilities for the representation of reality in model and observations are discussed, including nonlinear representation probability density functions. Specifically, the assumptions needed in the usual procedure to add a representation error covariance to the error covariance of the observations are discussed, and it is shown that, when several sub-grid observations are present, their mean still has a representation error; so-called ‘superobbing’ does not resolve the issue. Connection is made to the off-line or on-line retrieval problem, providing a new simple proof of the equivalence of assimilating linear retrievals and original observations. Furthermore, it is shown how nonlinear retrievals can be assimilated without loss of information. Finally, we discuss how errors in the observation operator model can be treated consistently in the Bayesian framework, connecting to previous work in this area.
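
The "usual procedure" whose assumptions are examined above can be written down in a few lines. The sketch below, with invented dimensions, operators and covariances, simply adds a representation-error covariance to the instrument-error covariance and uses the sum in an ordinary linear (Kalman-type) analysis update; it illustrates the recipe being discussed, not the Bayesian derivation of the article.

```python
import numpy as np

# Invented sizes: n model grid values, p observations.
n, p = 8, 3
rng = np.random.default_rng(0)

x_b = rng.normal(size=n)                   # background state
B = 0.5 * np.eye(n)                        # background error covariance (assumed)
H = np.zeros((p, n))                       # linear observation operator
H[0, 1], H[1, 4], H[2, 6] = 1.0, 1.0, 1.0  # each observation samples one grid value

R_instr = 0.1 * np.eye(p)                  # instrument error covariance
R_repr = 0.2 * np.eye(p)                   # representation error covariance (assumed)
R = R_instr + R_repr                       # the usual additive treatment

y = H @ x_b + rng.multivariate_normal(np.zeros(p), R)   # synthetic observations

# Linear analysis update with the combined observation error covariance.
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain
x_a = x_b + K @ (y - H @ x_b)                  # analysis
A = (np.eye(n) - K @ H) @ B                    # analysis error covariance

print("analysis increment:", np.round(x_a - x_b, 3))
```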

Relevance:

20.00%

Publisher:

Abstract:

We show how two linearly independent vectors can be used to construct two orthogonal vectors of equal magnitude in a simple way. The proof that the constructed vectors are orthogonal and of equal magnitude is a good exercise for students studying properties of scalar and vector triple products. We then show how this result can be used to prove van Aubel's theorem that relates the two line segments joining the centres of squares on opposite sides of a plane quadrilateral.
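
Van Aubel's theorem itself is easy to check numerically. The sketch below (not the vector construction of the article) represents the vertices of an arbitrary counter-clockwise quadrilateral as complex numbers, builds the centres of the squares erected externally on its sides, and confirms that the two segments joining centres of opposite squares are equal in length and perpendicular.

```python
def square_centre(p, q):
    """Centre of the square erected externally on the directed side p -> q
    of a counter-clockwise quadrilateral (vertices as complex numbers)."""
    return (p + q) / 2 - 1j * (q - p) / 2

def check_van_aubel(a, b, c, d, tol=1e-12):
    """Check that the segments joining centres of opposite squares are
    equal in length and perpendicular."""
    o_ab, o_bc = square_centre(a, b), square_centre(b, c)
    o_cd, o_da = square_centre(c, d), square_centre(d, a)
    s1 = o_cd - o_ab                       # joins centres on opposite sides
    s2 = o_da - o_bc
    equal_length = abs(abs(s1) - abs(s2)) < tol
    perpendicular = abs((s1 * s2.conjugate()).real) < tol   # zero dot product
    return equal_length and perpendicular

# An arbitrary counter-clockwise quadrilateral.
print(check_van_aubel(0 + 0j, 4 + 0j, 5 + 3j, -1 + 2j))    # True
```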

Relevance:

20.00%

Publisher:

Abstract:

Lifestyle factors are responsible for a considerable portion of cancer incidence worldwide, but credible estimates from the World Health Organization and the International Agency for Research on Cancer (IARC) suggest that the fraction of cancers attributable to toxic environmental exposures is between 7% and 19%. To explore the hypothesis that low-dose exposures to mixtures of chemicals in the environment may be combining to contribute to environmental carcinogenesis, we reviewed 11 hallmark phenotypes of cancer, multiple priority target sites for disruption in each area, and prototypical chemical disruptors for all targets; this review included dose-response characterizations, evidence of low-dose effects, and cross-hallmark effects for all targets and chemicals. In total, 85 examples of chemicals were reviewed for actions on key pathways/mechanisms related to carcinogenesis. Only 15% (13/85) were found to have evidence of a dose-response threshold, whereas 59% (50/85) exerted low-dose effects. No dose-response information was found for the remaining 26% (22/85). Our analysis suggests that the cumulative effects of individual (non-carcinogenic) chemicals acting on different pathways, and a variety of related systems, organs, tissues and cells, could plausibly conspire to produce carcinogenic synergies. Additional basic research on carcinogenesis and research focused on low-dose effects of chemical mixtures need to be rigorously pursued before the merits of this hypothesis can be further advanced. However, the structure of the World Health Organization International Programme on Chemical Safety 'Mode of Action' framework should be revisited, as it has inherent weaknesses that are not fully aligned with our current understanding of cancer biology.

Relevance:

20.00%

Publisher:

Abstract:

In the first book on Tschichold to be based on extensive archive research, Burke turns fresh and revealing light on his subject. He sets Tschichold in the network of artists and designers who constituted New Typography in its moment of definition and exploration, and puts new emphasis on Tschichold as an activist collector, editor and writer. Tschichold’s work is shown in colour throughout, in freshly made photographs of examples drawn from public and private collections. This is not a biography, but rather a discussion of the work seen in the context of Tschichold’s life and the times in which he lived.

Relevance:

20.00%

Publisher:

Abstract:

Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961–2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous rank probability skill scores) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño–Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.
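
The core of such an empirical system can be sketched in a few lines. The example below, using synthetic data and invented predictors rather than the operational configuration described above, fits a multiple linear regression with a trend-like predictor and an ENSO-like index over a training period, issues Gaussian probabilistic forecasts from the residual spread, and verifies them with the correlation of the forecast means and the closed-form CRPS for a normal distribution.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

# Synthetic "hindcast" data set (illustrative only).
n_years = 53                                   # e.g. a 1961-2013-style record
co2 = np.linspace(315.0, 400.0, n_years)       # CO2-equivalent-style trend predictor
enso = rng.normal(size=n_years)                # stand-in for an ENSO-like index
temp = 0.01 * co2 + 0.3 * enso + rng.normal(scale=0.2, size=n_years)

# Fit a multiple linear regression on a training period; keep the rest for verification.
n_train = 40
X = np.column_stack([np.ones(n_years), co2, enso])
coef, *_ = np.linalg.lstsq(X[:n_train], temp[:n_train], rcond=None)
sigma = np.std(temp[:n_train] - X[:n_train] @ coef, ddof=X.shape[1])

def crps_gaussian(obs, mu, sig):
    """Closed-form CRPS for a Gaussian forecast N(mu, sig^2)."""
    z = (obs - mu) / sig
    return sig * (z * (2.0 * norm.cdf(z) - 1.0) + 2.0 * norm.pdf(z) - 1.0 / np.sqrt(np.pi))

mu_fc = X[n_train:] @ coef                     # forecast means for the verification years
crps = crps_gaussian(temp[n_train:], mu_fc, sigma).mean()
corr = np.corrcoef(mu_fc, temp[n_train:])[0, 1]
print(f"verification correlation: {corr:.2f}, mean CRPS: {crps:.3f}")
```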

Relevance:

20.00%

Publisher:

Abstract:

A smoother introduced earlier by van Leeuwen and Evensen is applied to a problem in which real observations are used in an area with strongly nonlinear dynamics. The derivation is new, but it resembles an earlier derivation by van Leeuwen and Evensen. Again a Bayesian view is taken in which the prior probability density of the model and the probability density of the observations are combined to form a posterior density. The mean and the covariance of this density give the variance-minimizing model evolution and its errors. The assumption is made that the prior probability density is a Gaussian, leading to a linear update equation. Critical evaluation shows when the assumption is justified. This also sheds light on why Kalman filters, in which the same approximation is made, work for nonlinear models. By reference to the derivation, the impact of model and observational biases on the equations is discussed, and it is shown that Bayes's formulation can still be used. A practical advantage of the ensemble smoother is that no adjoint equations have to be integrated and that error estimates are easily obtained. The present application shows that for process studies a smoother will give superior results compared to a filter, not only owing to the smooth transitions at observation points, but also because the origin of features can be followed back in time. Also its preference over a strong-constraint method is highlighted. Furthermore, it is argued that the proposed smoother is more efficient than gradient descent methods or than the representer method when error estimates are taken into account.
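
For readers unfamiliar with the linear update that follows from the Gaussian-prior assumption, the toy sketch below performs an ensemble-based, variance-minimizing analysis using covariances estimated from the ensemble, with a perturbed-observation treatment of the update as one simple variant. The dimensions, observation operator and error statistics are invented for illustration and are not those of the application described above.

```python
import numpy as np

rng = np.random.default_rng(2)
n_state, n_ens, n_obs = 10, 50, 4

# Prior ensemble (here a single time slice, for brevity); columns are members.
ensemble = rng.normal(size=(n_state, n_ens))
H = rng.normal(size=(n_obs, n_state))      # linear observation operator (assumed)
R = 0.3 * np.eye(n_obs)                    # observation error covariance (assumed)
y = rng.normal(size=n_obs)                 # observations

# Ensemble mean and scaled anomalies.
x_mean = ensemble.mean(axis=1, keepdims=True)
A = (ensemble - x_mean) / np.sqrt(n_ens - 1)

# Covariances estimated from the ensemble, then the linear (Gaussian) update.
PHt = A @ (H @ A).T                        # P H^T
S = (H @ A) @ (H @ A).T + R                # H P H^T + R
K = PHt @ np.linalg.inv(S)                 # gain

# Perturbed-observation update of every member (one simple variant).
Y = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, size=n_ens).T
analysis = ensemble + K @ (Y - H @ ensemble)

print("prior spread   :", float(ensemble.std(axis=1).mean()))
print("analysis spread:", float(analysis.std(axis=1).mean()))
```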

Relevance:

20.00%

Publisher:

Abstract:

It is formally proved that the general smoother for nonlinear dynamics can be formulated as a sequential method; that is, observations can be assimilated sequentially during a forward integration. The general filter can be derived from the smoother, and it is shown that the general smoother and filter solutions at the final time become identical, as is expected from linear theory. Then, a new smoother algorithm based on ensemble statistics is presented and examined in an example with the Lorenz equations. The new smoother can be computed as a sequential algorithm using only forward-in-time model integrations. It bears a strong resemblance to the ensemble Kalman filter. The difference is that every time a new dataset is available during the forward integration, an analysis is computed for all previous times up to this time. Thus, the first guess for the smoother is the ensemble Kalman filter solution, and the smoother estimate provides an improvement on this, as one would expect a smoother to do. The method is demonstrated in this paper in an intercomparison with the ensemble Kalman filter and the ensemble smoother introduced by van Leeuwen and Evensen, and it is shown to be superior in an application with the Lorenz equations. Finally, a discussion is given regarding the properties of the analysis schemes when strongly non-Gaussian distributions are used. It is shown that in these cases more sophisticated analysis schemes based on Bayesian statistics must be used.
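
The defining step of the sequential smoother, recomputing the analysis for all previous times whenever a new observation becomes available, can be illustrated with a toy augmented-state update: the stored ensemble states for all past times are stacked into one vector per member and the whole stack is updated together. The model, observation setup and numbers below are invented stand-ins, not the Lorenz-equation experiments of the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
n_state, n_ens = 3, 40

def model_step(x):
    """Toy nonlinear model step (a stand-in, not the Lorenz equations)."""
    return x + 0.05 * np.sin(x)

ensemble = rng.normal(size=(n_state, n_ens))
stored = [ensemble.copy()]                 # ensemble states at all past times

def smoother_update(stored, y, H, R):
    """Update the ensembles at *all* stored times with a new observation,
    which is taken at the latest stored time."""
    Z = np.vstack(stored)                  # augmented state: (n_times*n_state, n_ens)
    A = (Z - Z.mean(axis=1, keepdims=True)) / np.sqrt(Z.shape[1] - 1)
    Hz = H @ stored[-1]                    # observation operator acts on the latest time
    Ha = (Hz - Hz.mean(axis=1, keepdims=True)) / np.sqrt(Z.shape[1] - 1)
    K = A @ Ha.T @ np.linalg.inv(Ha @ Ha.T + R)
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=Z.shape[1]).T
    Z_a = Z + K @ (Y - Hz)
    return [Z_a[i * n_state:(i + 1) * n_state] for i in range(len(stored))]

H = np.eye(1, n_state)                     # observe the first state variable only
R = 0.1 * np.eye(1)

for _ in range(5):                         # forward integration with sequential updates
    ensemble = model_step(stored[-1])
    stored.append(ensemble)
    y_obs = np.array([1.0]) + rng.normal(scale=np.sqrt(0.1), size=1)
    stored = smoother_update(stored, y_obs, H, R)

print("number of smoothed times:", len(stored))
print("latest analysis mean:", np.round(stored[-1].mean(axis=1), 3))
```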

Relevance:

20.00%

Publisher:

Abstract:

Solitary meanders of the Agulhas Current, so-called Natal pulses, may play an important role in the overall dynamics of this current system. Several hypotheses concerning the triggering of these pulses are tested using sea surface height and temperature data from satellites. The data show the formation of pulses in the Natal Bight area at irregular intervals ranging from 50 to 240 days. Moving downstream at speeds between 10 and 20 km day⁻¹, they sometimes reach sizes of up to 300 km. They seem to play a role in the shedding of Agulhas rings that penetrate the South Atlantic. The intermittent formation of these solitary meanders is argued to be most probably related to barotropic instability of the strongly baroclinic Agulhas Current in the Natal Bight. The vorticity structure of the observed basic flow is argued to be stable anywhere along its path. However, a proper perturbation of the jet in the Natal Bight area will allow barotropic instability, because the bottom slope there is considerably less steep than elsewhere along the South African east coast. Using satellite altimetry, these perturbations seem to be related to the intermittent presence of offshore anticyclonic anomalies, both upstream and eastward of the Natal Bight.

Relevance:

20.00%

Publisher:

Abstract:

The thermohaline exchange between the Atlantic and the Southern Ocean is analyzed, using a dataset based on WOCE hydrographic data. It is shown that the salt and heat transports brought about by the South Atlantic subtropical gyre play an essential role in the Atlantic heat and salt budgets. It is found that on average the exported North Atlantic Deep Water (NADW) is fresher than the return flows (basically composed of thermocline and intermediate water), indicating that the overturning circulation (OC) exports freshwater from the Atlantic. The sensitivity of the OC to interbasin fluxes of heat and salt is studied in a 2D model, representing the Atlantic between 60°N and 30°S. The model is forced by mixed boundary conditions at the surface, and by realistic fluxes of heat and salt at its 30°S boundary. The model circulation turns out to be very sensitive to net buoyancy fluxes through the surface. Both net surface cooling and net surface saltening are sources of potential energy and impact positively on the circulation strength. The vertical distributions of the lateral fluxes tend to stabilize the stratification and, as they extract potential energy from the system, tend to weaken the flow. These results imply that a change in the composition of the NADW return transports, whether by a change in the ratio of thermocline to intermediate water or by a change in their thermohaline characteristics, might influence the Atlantic OC considerably. It is also shown that the circulation is much more sensitive to changes in the shape of the lateral buoyancy flux than to changes in the shape of the surface buoyancy flux, as the latter does not explicitly impact on the potential energy of the system. It is concluded that interocean fluxes of heat and salt are important for the strength and operation of the Atlantic thermohaline circulation, and should be correctly represented in models that are used for climate sensitivity studies.
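
The freshwater-export statement above can be connected to a standard section diagnostic: the overturning component of the freshwater transport across a zonal section such as 30°S, computed from depth profiles of the zonally integrated baroclinic velocity and the zonal-mean salinity. The sketch below uses made-up profiles purely to show the calculation and a commonly used sign convention; it is not the WOCE-based analysis itself.

```python
import numpy as np

# Made-up profiles at a hypothetical 30S section (illustration only).
z = np.linspace(0.0, 4000.0, 81)          # depth (m)
dz = np.gradient(z)                       # layer thicknesses (m)

# Zonally integrated meridional velocity (m^2 s^-1): northward upper-ocean
# return flow over a southward NADW core, adjusted to zero net volume flux.
v = 6.0e4 * np.exp(-z / 500.0) - 1.6e4 * np.exp(-((z - 2500.0) / 800.0) ** 2)
v -= np.sum(v * dz) / np.sum(dz)          # remove the section-mean (barotropic) part

# Zonal-mean salinity: salty thermocline water above fresher NADW.
S = 34.5 + 0.8 * np.exp(-z / 400.0) - 0.2 * np.exp(-((z - 2500.0) / 800.0) ** 2)

S0 = 35.0                                 # reference salinity
# Overturning freshwater transport across the section, in Sverdrups.
# A negative value means the overturning exports freshwater from the Atlantic.
F_ov = -np.sum(v * (S - S0) * dz) / S0 * 1e-6
print(f"F_ov = {F_ov:.2f} Sv")
```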