907 results for Stabilisation of filter
Abstract:
The renewed interest in magnetite (Fe3O4) as a major phase in different types of catalysts has led us to study the oxidation–reduction behaviour of its most prominent surfaces. We have employed computer modelling techniques based on the density functional theory to calculate the geometries and surface free energies of a number of surfaces at different compositions, including the stoichiometric plane and those with a deficiency or excess of oxygen atoms. The most stable surfaces are the (001) and (111), leading to a cubic Fe3O4 crystal morphology with truncated corners under equilibrium conditions. The scanning tunnelling microscopy images of the different terminations of the (001) and (111) stoichiometric surfaces were calculated and compared with previous reports. Under reducing conditions, the creation of oxygen vacancies leads to the formation of reduced Fe species at the surface in the vicinity of the vacant oxygen. The (001) surface is slightly more prone to reduction than the (111), owing to the greater stabilisation upon relaxation of the atoms around the oxygen vacancy, but molecular oxygen adsorbs preferentially at the (111) surface. On both oxidised surfaces, the adsorbed oxygen atoms occupy bridge positions between two surface iron atoms, from which they attract electron density. The oxidised state is thermodynamically favourable with respect to the stoichiometric surfaces under ambient conditions, although not under the conditions where bulk Fe3O4 is thermodynamically stable with respect to Fe2O3. This finding is important for the interpretation of the catalytic properties of Fe3O4, owing to the presence of oxidised species under experimental conditions.
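As a point of reference for how the stability of surfaces with an oxygen deficiency or excess is typically compared, the sketch below gives a generic ab initio thermodynamics expression for the surface free energy as a function of the oxygen chemical potential. The notation (slab energy, surface area, chemical potentials) is generic and is not necessarily the exact formulation used by the authors.

```latex
% Generic surface free energy of a symmetric slab with two surfaces of area A,
% containing N_Fe iron and N_O oxygen atoms; the oxygen chemical potential
% mu_O varies with temperature and oxygen partial pressure, spanning reducing
% to oxidising conditions.
\[
  \gamma(T, p_{\mathrm{O_2}}) = \frac{1}{2A}\left[ E_{\mathrm{slab}}
    - N_{\mathrm{Fe}}\,\mu_{\mathrm{Fe}}
    - N_{\mathrm{O}}\,\mu_{\mathrm{O}}(T, p_{\mathrm{O_2}}) \right],
  \qquad
  \mu_{\mathrm{O}}(T, p_{\mathrm{O_2}}) = \tfrac{1}{2}E_{\mathrm{O_2}}
    + \Delta\mu_{\mathrm{O}}(T, p_{\mathrm{O_2}}).
\]
```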
Abstract:
The current work discusses the compositional analysis of spectra that may be related to amorphous materials lacking discernible Lorentzian, Debye or Drude responses. We propose to model such a response using a 3-dimensional random RLC network with a descriptor formulation, which is converted into an input–output transfer function representation. A wavelet identification study of these networks is performed to infer the composition of the networks. It was concluded that wavelet filter banks enable a parsimonious representation of the dynamics in excited, randomly connected RLC networks. Furthermore, chemometric classification using the proposed technique enables the discrimination of dielectric samples with different compositions. The methodology is promising for the classification of amorphous dielectrics.
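To illustrate the general idea of wavelet filter-bank features for such responses, the sketch below decomposes a surrogate response signal and uses per-level energies as a compact descriptor. The damped-sinusoid surrogate, wavelet choice and all parameter values are illustrative assumptions, not the paper's RLC-network descriptor model.

```python
# Hedged sketch: wavelet filter-bank features for a simulated dielectric response.
# The surrogate signal and parameters are assumptions for illustration only.
import numpy as np
import pywt

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1e-3, 4096)            # 1 ms record, arbitrary sampling

def surrogate_response(n_modes=8):
    """Sum of randomly damped sinusoids standing in for a random RLC network."""
    y = np.zeros_like(t)
    for _ in range(n_modes):
        f = rng.uniform(1e3, 2e5)            # resonant frequency [Hz]
        tau = rng.uniform(1e-5, 5e-4)        # damping time constant [s]
        y += rng.uniform(0.1, 1.0) * np.exp(-t / tau) * np.sin(2 * np.pi * f * t)
    return y

def wavelet_energy_features(signal, wavelet="db4", level=6):
    """Relative energy per filter-bank level: a parsimonious representation."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    return energies / energies.sum()

features = wavelet_energy_features(surrogate_response())
print(features)   # compact feature vector usable for chemometric classification
```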
Abstract:
Entomopathogenic nematodes (EPN) frequently kill their host within 1–2 days, and interest in EPN focuses mainly on their lethality. However, insects may take longer to die, or may fail to die despite being infected, but little is known about the effects of EPN infection on insects, other than death. Here we investigate both lethal and sub-lethal effects of infection by two EPN species, Steinernema carpocapsae and Heterorhabditis downesi, on adults of the large pine weevil, Hylobius abietis. Following 12 h nematode–weevil contact in peat, S. carpocapsae killed a significantly higher proportion of weevils (87–93%) than H. downesi (43–57%) at all concentrations tested. Less than 10% of weevils were dead within 2 days, and weevils continued to die for up to 10 days after exposure (LT50 of 3 days or more). In a separate experiment, live weevils dissected 6 days after a 24 h exposure to nematodes on filter paper harbored encapsulated and dead nematodes, showing that weevils could defend themselves against infection. Some live weevils also harbored live nematodes 6 days after they had been removed from the nematode-infested medium. Feeding by weevils was not affected by infection with, or exposure to, either species of EPN. We discuss these results in relation to the use of EPN in biological control against H. abietis.
Abstract:
Time discretization in weather and climate models introduces truncation errors that limit the accuracy of the simulations. Recent work has yielded a method for reducing the amplitude errors in leapfrog integrations from first-order to fifth-order. This improvement is achieved by replacing the Robert–Asselin filter with the Robert–Asselin–Williams (RAW) filter and using a linear combination of unfiltered and filtered states to compute the tendency term. The purpose of the present article is to apply the composite-tendency RAW-filtered leapfrog scheme to semi-implicit integrations. A theoretical analysis shows that the stability and accuracy are unaffected by the introduction of the implicitly treated mode. The scheme is tested in semi-implicit numerical integrations in both a simple nonlinear stiff system and a medium-complexity atmospheric general circulation model and yields substantial improvements in both cases. We conclude that the composite-tendency RAW-filtered leapfrog scheme is suitable for use in semi-implicit integrations.
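For readers unfamiliar with the RAW filter, the sketch below shows a basic RAW-filtered leapfrog integration of a scalar ODE. The test problem and parameter values are illustrative assumptions; the composite-tendency variant discussed in the abstract additionally blends filtered and unfiltered states when computing the tendency, which is not reproduced here, and the semi-implicit treatment is likewise omitted.

```python
# Hedged sketch: leapfrog integration with the Robert-Asselin-Williams filter
# for dx/dt = f(x).  alpha = 1 would recover a Robert-Asselin-like filter.
import numpy as np

def leapfrog_raw(f, x0, dt, n_steps, nu=0.2, alpha=0.53):
    x_prev = x0
    x_curr = x0 + dt * f(x0)                  # first step: forward Euler start-up
    out = [x_prev, x_curr]
    for _ in range(n_steps - 1):
        x_next = x_prev + 2.0 * dt * f(x_curr)            # leapfrog step
        d = 0.5 * nu * (x_prev - 2.0 * x_curr + x_next)   # filter displacement
        x_curr_f = x_curr + alpha * d                     # RAW: filter middle level
        x_next_f = x_next + (alpha - 1.0) * d             # RAW: and the new level
        x_prev, x_curr = x_curr_f, x_next_f
        out.append(x_curr)
    return np.array(out)

# Example: weakly damped oscillator, dx/dt = (i*omega - gamma) * x.
omega, gamma = 1.0, 0.05
traj = leapfrog_raw(lambda x: (1j * omega - gamma) * x, 1.0 + 0.0j, 0.1, 500)
print(abs(traj[-1]))
```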
Abstract:
A new online method to analyse water isotopes of speleothem fluid inclusions using a wavelength-scanned cavity ring down spectroscopy (WS-CRDS) instrument is presented. This novel technique allows us simultaneously to measure hydrogen and oxygen isotopes for a released aliquot of water. To do so, we designed a new simple line that allows the online water extraction and isotope analysis of speleothem samples. The specificity of the method lies in the fact that fluid inclusions release is made on a standard water background, which mainly improves the δD robustness. To saturate the line, a peristaltic pump continuously injects standard water into the line, which is permanently heated to 140 °C and flushed with dry nitrogen gas. This permits instantaneous and complete vaporisation of the standard water, resulting in an artificial water background with well-known δD and δ18O values. The speleothem sample is placed in a copper tube, attached to the line, and after system stabilisation it is crushed using a simple hydraulic device to liberate the speleothem fluid inclusion water. The released water is carried by the nitrogen/standard water gas stream directly to a Picarro L1102-i for isotope determination. To test the accuracy and reproducibility of the line and to measure standard water during speleothem measurements, a syringe injection unit was added to the line. Peak evaluation is done in a similar way as in gas chromatography to obtain δD and δ18O isotopic compositions of measured water aliquots. Precision is better than 1.5 ‰ for δD and 0.4 ‰ for δ18O for water measurements over an extended range (−210 to 0 ‰ for δD and −27 to 0 ‰ for δ18O), depending primarily on the amount of water released from speleothem fluid inclusions and secondarily on the isotopic composition of the sample. The results show that WS-CRDS technology is suitable for speleothem fluid inclusion measurements and gives results that are comparable to the isotope ratio mass spectrometry (IRMS) technique.
Abstract:
In general, particle filters need large numbers of model runs in order to avoid filter degeneracy in high-dimensional systems. The recently proposed, fully nonlinear equivalent-weights particle filter overcomes this requirement by replacing the standard model transition density with two different proposal transition densities. The first proposal density is used to relax all particles towards the high-probability regions of state space as defined by the observations. The crucial second proposal density is then used to ensure that the majority of particles have equivalent weights at observation time. Here, the performance of the scheme in a high, 65 500 dimensional, simplified ocean model is explored. The success of the equivalent-weights particle filter in matching the true model state is shown using the mean of just 32 particles in twin experiments. It is of particular significance that this remains true even as the number and spatial variability of the observations are changed. The results from rank histograms are less easy to interpret and can be influenced considerably by the parameter values used. This article also explores the sensitivity of the performance of the scheme to the chosen parameter values and the effect of using different model error parameters in the truth compared with the ensemble model runs.
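To make the first ingredient concrete, the sketch below shows one assimilation step with a relaxation-type proposal that pulls particles towards the observation, together with the corresponding proposal-density weight correction. This is a simplified schematic for a scalar toy model, not the equivalent-weights scheme itself: the crucial second, equivalent-weights proposal is omitted, and the AR(1) model, relaxation strength and noise levels are illustrative assumptions.

```python
# Hedged sketch: relaxed proposal + importance-weight correction for a scalar
# AR(1) toy model; the equivalent-weights step of the full scheme is omitted.
import numpy as np

rng = np.random.default_rng(4)

def log_gauss(x, mean, std):
    return -0.5 * ((x - mean) / std) ** 2 - np.log(std * np.sqrt(2 * np.pi))

a, q_std, r_std, tau = 0.9, 0.5, 0.2, 0.6   # model, noise and relaxation strength
n_particles = 1000
particles = rng.normal(size=n_particles)     # ensemble at the previous time
y = 2.0                                      # observation at the current time

# relaxed proposal: nudge the deterministic forecast towards the observation
forecast = a * particles
proposal_mean = forecast + tau * (y - forecast)
new_particles = proposal_mean + q_std * rng.normal(size=n_particles)

# importance weights: likelihood * transition density / proposal density
logw = (log_gauss(y, new_particles, r_std)
        + log_gauss(new_particles, forecast, q_std)
        - log_gauss(new_particles, proposal_mean, q_std))
w = np.exp(logw - logw.max())
w /= w.sum()
print("posterior mean estimate:", np.sum(w * new_particles))
```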
Abstract:
This paper reports the first derived thermo-optical properties for vacuum-deposited infrared thin films embedded in multilayers. These properties were extracted from the temperature dependence of manufactured narrow bandpass filters across the 4–17 µm mid-infrared wavelength region. Using a repository of spaceflight multi-cavity bandpass filters, the thermo-optical expansion coefficients of PbTe and ZnSe were determined across an elevated temperature range of 20–160 °C. Embedded ZnSe films showed thermo-optical properties similar to reported bulk values, whilst the embedded PbTe films, of lower optical density, deviate from reference literature sources. Detailed knowledge of the derived coefficients is essential to the multilayer design of temperature-invariant narrow bandpass filters for use in non-cooled infrared detection systems. We further present the manufacture of the first reported temperature-invariant multi-cavity narrow bandpass filter utilizing PbS chalcogenide layer material.
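For context, the temperature sensitivity of an interference bandpass filter's centre wavelength is often approximated to first order by the relation sketched below, which combines the thermo-optic and thermal-expansion contributions of the spacer layers. This is a generic textbook relation given for orientation, not the derivation or coefficient definition used in the paper.

```latex
% First-order temperature sensitivity of the centre wavelength lambda_0 of an
% interference bandpass filter, where n is the effective spacer refractive
% index and alpha its linear thermal expansion coefficient.
\[
  \frac{1}{\lambda_0}\,\frac{d\lambda_0}{dT}
  \;\approx\; \frac{1}{n}\,\frac{\partial n}{\partial T} \;+\; \alpha .
\]
```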
Abstract:
Biochars are biological residues combusted under low oxygen conditions, resulting in a porous, low density carbon rich material. Their large surface areas and cation exchange capacities, determined to a large extent by source materials and pyrolysis temperatures, enables enhanced sorption of both organic and inorganic contaminants to their surfaces, reducing pollutant mobility when amending contaminated soils. Liming effects or release of carbon into soil solution may increase arsenic mobility, whilst low capital but enhanced retention of plant nutrients can restrict revegetation on degraded soils amended only with biochars; the combination of composts, manures and other amendments with biochars could be their most effective deployment to soils requiring stabilisation by revegetation. Specific mechanisms of contaminant-biochar retention and release over time and the environmental impact of biochar amendments on soil organisms remain somewhat unclear but must be investigated to ensure that the management of environmental pollution coincides with ecological sustainability.
Abstract:
This paper investigates the use of a particle filter for data assimilation with a full scale coupled ocean–atmosphere general circulation model. Synthetic twin experiments are performed to assess the performance of the equivalent weights filter in such a high-dimensional system. Artificial 2-dimensional sea surface temperature fields are used as observational data every day. Results are presented for different values of the free parameters in the method. Measures of the performance of the filter are root mean square errors, trajectories of individual variables in the model and rank histograms. Filter degeneracy is not observed and the performance of the filter is shown to depend on the ability to keep maximum spread in the ensemble.
Abstract:
Crude enzymes produced via solid state fermentation (SSF) using wheat milling by-products have been employed for both fermentation media production using flour-rich waste (FRW) streams and lysis of Rhodosporidium toruloides yeast cells. Filter sterilization of crude hydrolysates was more beneficial than heat sterilization regarding yeast growth and microbial oil production. The initial carbon to free amino nitrogen ratio of crude hydrolysates was optimized (80.2 g/g) in fed-batch cultures of R. toruloides leading to a total dry weight of 61.2 g/L with microbial oil content of 61.8 % (w/w). Employing a feeding strategy where the glucose concentration was maintained in the range of 12.2 – 17.6 g/L led to the highest productivity (0.32 g/L∙h). The crude enzymes produced by SSF were utilised for yeast cell treatment leading to simultaneous release of around 80% of total lipids in the broth and production of a hydrolysate suitable as yeast extract replacement.
Abstract:
Estimating trajectories and parameters of dynamical systems from observations is a problem frequently encountered in various branches of science; geophysicists, for example, refer to this problem as data assimilation. Unlike in estimation problems with exchangeable observations, in data assimilation the observations cannot easily be divided into separate sets for estimation and validation; this creates serious problems, since simply using the same observations for estimation and validation might result in overly optimistic performance assessments. To circumvent this problem, a result is presented which allows us to estimate this optimism, thus allowing for a more realistic performance assessment in data assimilation. The presented approach becomes particularly simple for data assimilation methods employing a linear error feedback (such as synchronization schemes, nudging, incremental 3D-Var and 4D-Var, and various Kalman filter approaches). Numerical examples considering a high-gain observer confirm the theory.
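As a minimal illustration of the linear error feedback class referred to above, the sketch below nudges a toy model towards noisy observations of one state component. The Lorenz-63 model, gain value, observation operator and noise level are illustrative assumptions, not the setup analysed in the paper.

```python
# Hedged sketch: data assimilation by linear error feedback (nudging),
# dx/dt = f(x) + K * H^T (y - H x), for a Lorenz-63 toy system.
import numpy as np

def lorenz63(x, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    return np.array([sigma * (x[1] - x[0]),
                     x[0] * (rho - x[2]) - x[1],
                     x[0] * x[1] - beta * x[2]])

rng = np.random.default_rng(1)
dt, n_steps, gain = 0.01, 5000, 5.0
H = np.array([1.0, 0.0, 0.0])              # observe the first component only

truth = np.array([1.0, 1.0, 1.0])
model = np.array([5.0, -5.0, 20.0])        # deliberately wrong initial state
for _ in range(n_steps):
    truth = truth + dt * lorenz63(truth)                  # truth run (forward Euler)
    y = H @ truth + rng.normal(scale=0.1)                 # noisy observation
    model = model + dt * (lorenz63(model) + gain * H * (y - H @ model))

print("final error:", np.linalg.norm(truth - model))
```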
Abstract:
A truly variance-minimizing filter is introduced and its performance is demonstrated with the Korteweg–de Vries (KdV) equation and with a multilayer quasigeostrophic model of the ocean area around South Africa. It is recalled that Kalman-like filters are not variance minimizing for nonlinear model dynamics and that four-dimensional variational data assimilation (4DVAR)-like methods relying on perfect model dynamics have difficulty with providing error estimates. The new method does not have these drawbacks. In fact, it combines advantages from both methods in that it does provide error estimates while automatically having balanced states after analysis, without extra computations. It is based on ensemble or Monte Carlo integrations to simulate the probability density of the model evolution. When observations are available, the so-called importance resampling algorithm is applied. From Bayes's theorem it follows that each ensemble member receives a new weight dependent on its "distance" to the observations. Because the weights are strongly varying, a resampling of the ensemble is necessary. This resampling is done such that members with high weights are duplicated according to their weights, while low-weight members are largely ignored. In passing, it is noted that data assimilation is not an inverse problem by nature, although it can be formulated that way. Also, it is shown that the posterior variance can be larger than the prior if the usual Gaussian framework is set aside. However, in the examples presented here, the entropy of the probability densities is decreasing. The application to the ocean area around South Africa, governed by strongly nonlinear dynamics, shows that the method is working satisfactorily. The strong and weak points of the method are discussed and possible improvements are proposed.
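The sketch below illustrates the weighting-and-resampling step described in the abstract for a generic ensemble: members are weighted by their distance to the observations and then resampled so that high-weight members are duplicated and low-weight members dropped. Gaussian observation errors, multinomial resampling and the toy dimensions are illustrative assumptions, not the paper's KdV or quasigeostrophic setup.

```python
# Hedged sketch: importance resampling of an ensemble against observations.
import numpy as np

rng = np.random.default_rng(2)

def importance_resample(ensemble, y_obs, obs_operator, obs_std):
    """Weight members by their 'distance' to the observations, then resample."""
    # likelihood-based weights (Bayes' theorem with Gaussian observation errors)
    innovations = y_obs - np.array([obs_operator(x) for x in ensemble])
    logw = -0.5 * np.sum((innovations / obs_std) ** 2, axis=1)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    # duplicate high-weight members, drop low-weight ones (multinomial resampling)
    idx = rng.choice(len(ensemble), size=len(ensemble), p=w)
    return ensemble[idx]

# toy usage: 100 members of a 3-variable state, observing all variables directly
ensemble = rng.normal(size=(100, 3))
y_obs = np.array([0.5, -0.2, 1.0])
new_ensemble = importance_resample(ensemble, y_obs, lambda x: x, obs_std=0.3)
print(new_ensemble.mean(axis=0))
```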
Abstract:
This paper discusses an important issue related to the implementation and interpretation of the analysis scheme in the ensemble Kalman filter. It is shown that the observations must be treated as random variables at the analysis steps. That is, one should add random perturbations with the correct statistics to the observations and generate an ensemble of observations that then is used in updating the ensemble of model states. Traditionally, this has not been done in previous applications of the ensemble Kalman filter and, as will be shown, this has resulted in an updated ensemble with a variance that is too low. This simple modification of the analysis scheme results in a completely consistent approach if the covariance of the ensemble of model states is interpreted as the prediction error covariance, and there are no further requirements on the ensemble Kalman filter method, except for the use of an ensemble of sufficient size. Thus, there is a unique correspondence between the error statistics from the ensemble Kalman filter and the standard Kalman filter approach.
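The sketch below shows an ensemble Kalman filter analysis step in which each member is updated against its own randomly perturbed copy of the observations, so that the updated ensemble retains the correct spread. The dimensions, linear observation operator and error statistics are illustrative assumptions, not the configuration used in the paper.

```python
# Hedged sketch: EnKF analysis step with perturbed observations.
import numpy as np

rng = np.random.default_rng(3)

def enkf_analysis(X_f, y, H, R):
    """X_f: (n_members, n_state) forecast ensemble; y: observation vector."""
    n_members = X_f.shape[0]
    # forecast error covariance estimated from the ensemble
    A = X_f - X_f.mean(axis=0)
    P_f = A.T @ A / (n_members - 1)
    # Kalman gain
    K = P_f @ H.T @ np.linalg.inv(H @ P_f @ H.T + R)
    # perturb the observations so the updated ensemble keeps the right variance
    Y = y + rng.multivariate_normal(np.zeros(len(y)), R, size=n_members)
    return X_f + (Y - X_f @ H.T) @ K.T

# toy usage: 50 members, 4 state variables, 2 observed quantities
X_f = rng.normal(size=(50, 4))
H = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])
R = 0.2 * np.eye(2)
X_a = enkf_analysis(X_f, y=np.array([0.3, -0.1]), H=H, R=R)
print(X_a.mean(axis=0))
```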
Abstract:
The weak-constraint inverse for nonlinear dynamical models is discussed and derived in terms of a probabilistic formulation. The well-known result that for Gaussian error statistics the minimum of the weak-constraint inverse is equal to the maximum-likelihood estimate is rederived. Then several methods based on ensemble statistics that can be used to find the smoother (as opposed to the filter) solution are introduced and compared to traditional methods. A strong point of the new methods is that they avoid the integration of adjoint equations, which is a complex task for real oceanographic or atmospheric applications. They also avoid iterative searches in a Hilbert space, and error estimates can be obtained without much additional computational effort. The feasibility of the new methods is illustrated in a two-layer quasigeostrophic model.
Abstract:
Nonlinear data assimilation is high on the agenda in all fields of the geosciences: with ever-increasing model resolution, the inclusion of more physical (biological, etc.) processes, and more complex observation operators, the data-assimilation problem becomes more and more nonlinear. The suitability of particle filters to solve the nonlinear data-assimilation problem in high-dimensional geophysical problems will be discussed. Several existing and new schemes will be presented, and it is shown that at least one of them, the Equivalent-Weights Particle Filter, does indeed beat the curse of dimensionality and provides a way forward to solving the problem of nonlinear data assimilation in high-dimensional systems.