9 results for Meshfree particle methods
in CentAUR: Central Archive at the University of Reading - UK
Abstract:
Particle size distribution (psd) is one of the most important features of the soil because it affects many of its other properties, and it determines how soil should be managed. To understand the properties of chalk soil, psd analyses should be based on the original material (including carbonates), and not just the acid-resistant fraction. Laser-based methods rather than traditional sedimentation methods are being used increasingly to determine particle size to reduce the cost of analysis. We give an overview of both approaches and the problems associated with them for analyzing the psd of chalk soil. In particular, we show that it is not appropriate to use the widely adopted 8 µm boundary between the clay and silt size fractions for samples determined by laser to estimate proportions of these size fractions that are equivalent to those based on sedimentation. We present data from field and national-scale surveys of soil derived from chalk in England. Results from both types of survey showed that laser methods tend to over-estimate the clay-size fraction compared to sedimentation for the 8 µm clay/silt boundary, and we suggest reasons for this. For soil derived from chalk, either the sedimentation methods need to be modified or it would be more appropriate to use a 4 µm threshold as an interim solution for laser methods. Correlations between the proportions of sand- and clay-sized fractions, and other properties such as organic matter and volumetric water content, were the opposite of what one would expect for soil dominated by silicate minerals. For water content, this appeared to be due to the predominance of porous chalk fragments in the sand-sized fraction rather than quartz grains, and the abundance of fine (<2 µm) calcite crystals rather than phyllosilicates in the clay-sized fraction. This was confirmed by scanning electron microscope (SEM) analyses. "Of all the rocks with which I am acquainted, there is none whose formation seems to tax the ingenuity of theorists so severely, as the chalk, in whatever respect we may think fit to consider it". Thomas Allan, FRS Edinburgh 1823, Transactions of the Royal Society of Edinburgh. © 2009 Natural Environment Research Council (NERC). Published by Elsevier B.V. All rights reserved.
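The choice of clay/silt boundary discussed in this abstract is, operationally, a question of where a cumulative particle-size distribution is cut. The following minimal sketch is not taken from the paper; the function name, the interpolation approach and the sample psd values are hypothetical, and are shown only to illustrate how the estimated clay-sized fraction changes when the boundary is moved from 8 µm to 4 µm.

```python
import numpy as np

def fraction_below(diam_um, cum_percent, boundary_um):
    """Interpolate a cumulative psd (% finer than a given diameter) at a size boundary."""
    return float(np.interp(boundary_um, diam_um, cum_percent))

# Hypothetical laser-diffraction cumulative psd for a chalk soil sample (illustrative values only).
diam_um = np.array([0.5, 1, 2, 4, 8, 16, 31, 63, 125, 250, 500, 1000, 2000])
cum_pct = np.array([3, 7, 13, 22, 34, 48, 60, 72, 82, 90, 95, 98, 100])

clay_8um = fraction_below(diam_um, cum_pct, 8.0)   # widely adopted laser boundary
clay_4um = fraction_below(diam_um, cum_pct, 4.0)   # interim boundary suggested in the abstract
print(f"clay-sized fraction: {clay_8um:.1f}% at 8 um, {clay_4um:.1f}% at 4 um")
```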
Abstract:
The application of particle filters in geophysical systems is reviewed. Some background on Bayesian filtering is provided, and the existing methods are discussed. The emphasis is on the methodology, and not so much on the applications themselves. It is shown that direct application of the basic particle filter (i.e., importance sampling using the prior as the importance density) does not work in high-dimensional systems, but several variants are shown to have potential. Approximations to the full problem that try to keep some aspects of the particle filter beyond the Gaussian approximation are also presented and discussed.
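As a concrete illustration of the basic scheme referred to above (importance sampling with the prior as the importance density), here is a minimal bootstrap particle filter for a one-dimensional toy model. The AR(1) model, noise levels, particle count and function name are illustrative assumptions, not taken from the review.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_filter(y_obs, n_particles=500, obs_std=1.0, model_std=0.5):
    """Basic particle filter: propagate with the prior (model), weight by the likelihood, resample."""
    x = rng.normal(0.0, 1.0, n_particles)                       # initial ensemble
    estimates = []
    for y in y_obs:
        x = 0.9 * x + rng.normal(0.0, model_std, n_particles)   # prior used as the proposal
        w = np.exp(-0.5 * ((y - x) / obs_std) ** 2)             # likelihood weights
        w /= w.sum()
        estimates.append(np.sum(w * x))                         # weighted (posterior-mean) estimate
        x = x[rng.choice(n_particles, n_particles, p=w)]        # resample to limit degeneracy
    return np.array(estimates)

# Synthetic truth and observations generated from the same toy model.
truth = np.zeros(50)
for t in range(1, 50):
    truth[t] = 0.9 * truth[t - 1] + rng.normal(0, 0.5)
y_obs = truth + rng.normal(0, 1.0, 50)
print(bootstrap_filter(y_obs)[:5])
```

In high-dimensional systems the likelihood weights above collapse onto a single particle, which is exactly the degeneracy problem the review highlights.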
Abstract:
Recent studies have shown that the classroom environment is very important for students' health and performance. Thus, the evaluation of indoor air quality (IAQ) in a classroom is necessary to ensure students' well-being. In this paper the emphasis is on airborne concentrations of particulate matter (PM) in adult education rooms. The mass concentration of PM10 particulates was measured in two classrooms under different ventilation methods at the University of Reading, UK, during the winter period of 2008. In a companion study, the particle measurements were accompanied by measurements of CO2 concentration in the same classrooms; that work is the subject of a separate publication. Ambient PM10, temperature, relative humidity, wind speed and direction, and rainfall events were monitored as well. Overall, outdoor particle concentrations and outdoor meteorological parameters were identified as significant factors influencing indoor particle concentration levels. Ventilation methods had significant effects on the air change rate and on indoor/outdoor (I/O) concentration ratios. Higher levels of indoor particulates were observed during occupancy periods, and I/O ratios were significantly higher when the classrooms were occupied than when they were unoccupied, indicating the combined effect of occupants' presence and outdoor particle concentration levels. The concentrations of PM10 indoors and outdoors did not meet the WHO annual-average standard for PM10.
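A minimal sketch of how I/O concentration ratios for occupied and unoccupied periods might be computed from paired PM10 time series follows; the array names, hourly values and occupancy flags are hypothetical and are not data from the study.

```python
import numpy as np

# Hypothetical hourly PM10 records (ug/m3) and an occupancy flag for one classroom.
indoor_pm10 = np.array([18.0, 22.0, 35.0, 40.0, 38.0, 25.0, 20.0, 17.0])
outdoor_pm10 = np.array([20.0, 21.0, 24.0, 26.0, 25.0, 23.0, 22.0, 21.0])
occupied = np.array([False, False, True, True, True, False, False, False])

io_ratio = indoor_pm10 / outdoor_pm10                     # hour-by-hour I/O ratio
print("mean I/O ratio, occupied:  ", round(io_ratio[occupied].mean(), 2))
print("mean I/O ratio, unoccupied:", round(io_ratio[~occupied].mean(), 2))
```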
Abstract:
The 3D reconstruction of a Golgi-stained dendritic tree from a serial stack of images captured with a transmitted light bright-field microscope is investigated. Modifications to the bootstrap filter are discussed such that the tree structure may be estimated recursively as a series of connected segments. The tracking performance of the bootstrap particle filter is compared against Differential Evolution, an evolutionary global optimisation method, both in terms of robustness and accuracy. It is found that the particle filtering approach is significantly more robust and accurate for the data considered.
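A bootstrap-filter sketch appears after the previous abstract; for the comparison method mentioned here, below is a minimal sketch of Differential Evolution (the classic DE/rand/1/bin variant) applied to a generic objective function. The population size, mutation factor F, crossover rate CR and the test objective are illustrative choices, not the settings used in the study.

```python
import numpy as np

rng = np.random.default_rng(1)

def differential_evolution(objective, bounds, pop_size=20, F=0.8, CR=0.9, n_gen=200):
    """DE/rand/1/bin: mutate with scaled difference vectors, binomial crossover, greedy selection."""
    dim = len(bounds)
    lo, hi = np.array(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, (pop_size, dim))
    cost = np.array([objective(p) for p in pop])
    for _ in range(n_gen):
        for i in range(pop_size):
            a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True              # ensure at least one mutated component
            trial = np.where(cross, mutant, pop[i])
            trial_cost = objective(trial)
            if trial_cost < cost[i]:                     # keep the trial only if it improves
                pop[i], cost[i] = trial, trial_cost
    best = int(np.argmin(cost))
    return pop[best], cost[best]

# Illustrative use: minimise a 2-D quadratic bowl centred at (1.5, 1.5).
print(differential_evolution(lambda x: float(np.sum((x - 1.5) ** 2)), [(-5, 5), (-5, 5)]))
```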
Abstract:
The Boltzmann equation in the presence of boundary and initial conditions, which describes the general case of carrier transport in microelectronic devices, is analysed in terms of Monte Carlo theory. The classical Ensemble Monte Carlo algorithm, which was originally devised from merely phenomenological considerations of the initial and boundary carrier contributions, is now derived in a formal way. The approach allows us to suggest a set of event-biasing algorithms for statistical enhancement as an alternative to the population control technique, which is virtually the only algorithm currently used in particle simulators. The scheme of the self-consistent coupling of the Boltzmann and Poisson equations is considered for the case of weighted particles. It is shown that the particles survive the successive iteration steps.
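The essence of event biasing with weighted particles can be conveyed with a toy example: a rare event is sampled with an artificially increased probability and each particle's weight is multiplied by the ratio of natural to biased probability, so the estimator stays unbiased while its variance drops. The probabilities, function name and estimated quantity below are purely illustrative and are not drawn from the device simulations discussed in the abstract.

```python
import numpy as np

rng = np.random.default_rng(2)

def estimate_rare_event(p_natural=1e-3, p_biased=0.05, n_particles=100_000):
    """Estimate the probability of a rare event by biased sampling with weight correction."""
    occurs = rng.random(n_particles) < p_biased             # sample using the biased probability
    weights = np.where(occurs,
                       p_natural / p_biased,                # weight for the biased (rare) branch
                       (1 - p_natural) / (1 - p_biased))    # weight for the common branch
    return np.mean(weights * occurs)                        # unbiased estimate of p_natural

print(estimate_rare_event())  # close to 1e-3, with far lower variance than naive sampling
```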
Abstract:
Accurate estimates for the fall speed of natural hydrometeors are vital if their evolution in clouds is to be understood quantitatively. In this study, laboratory measurements of the terminal velocity vt for a variety of ice particle models settling in viscous fluids, along with wind-tunnel and field measurements of ice particles settling in air, have been analyzed and compared to common methods of computing vt from the literature. It is observed that while these methods work well for a number of particle types, they fail for particles with open geometries, specifically those particles for which the area ratio Ar is small (Ar is defined as the area of the particle projected normal to the flow divided by the area of a circumscribing disc). In particular, the fall speeds of stellar and dendritic crystals, needles, open bullet rosettes, and low-density aggregates are all overestimated. These particle types are important in many cloud types: aggregates in particular often dominate snow precipitation at the ground and vertically pointing Doppler radar measurements. Based on the laboratory data, a simple area-ratio-based modification to previous computational methods is proposed. This new method collapses the available drag data onto an approximately universal curve, and the resulting errors in the computed fall speeds relative to the tank data are less than 25% in all cases. Comparison with the (much more scattered) measurements of ice particles falling in air shows strong support for this new method, with the area ratio bias apparently eliminated.
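A sketch of the type of calculation such methods perform: a Best (Davies) number is formed from particle mass and an area-dependent term, converted to a Reynolds number through a boundary-layer drag relation, and then to a terminal velocity. The area-ratio exponent, the drag-relation constants delta0 and c0, and the example mass and dimension below are assumptions for illustration only and should be checked against the paper itself; they are not presented as the study's formula.

```python
import numpy as np

def fall_speed(mass, d_max, area_ratio, ar_exponent,
               rho_air=1.2, eta=1.8e-5, g=9.81, delta0=8.0, c0=0.35):
    """Best number -> Reynolds number -> terminal velocity, with an adjustable area-ratio power.

    ar_exponent = 1.0 mimics a conventional projected-area formulation; 0.5 mimics an
    area-ratio-based modification of the kind the study proposes. delta0 and c0 are
    illustrative boundary-layer drag constants, not necessarily those used in the paper.
    """
    best = 8.0 * mass * g * rho_air / (np.pi * eta**2 * area_ratio**ar_exponent)
    reynolds = (delta0**2 / 4.0) * (
        np.sqrt(1.0 + 4.0 * np.sqrt(best) / (delta0**2 * np.sqrt(c0))) - 1.0) ** 2
    return eta * reynolds / (rho_air * d_max)

# Hypothetical open dendrite (low area ratio): the conventional form yields a higher fall speed.
m, d, ar = 2e-8, 2e-3, 0.2   # kg, m, dimensionless (illustrative values)
print("conventional:", round(fall_speed(m, d, ar, 1.0), 2), "m/s")
print("modified:    ", round(fall_speed(m, d, ar, 0.5), 2), "m/s")
```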
Abstract:
Almost all research fields in geosciences use numerical models and observations and combine these using data-assimilation techniques. With ever-increasing resolution and complexity, the numerical models tend to be highly nonlinear and also observations become more complicated and their relation to the models more nonlinear. Standard data-assimilation techniques like (ensemble) Kalman filters and variational methods like 4D-Var rely on linearizations and are likely to fail in one way or another. Nonlinear data-assimilation techniques are available, but are only efficient for small-dimensional problems, hampered by the so-called ‘curse of dimensionality’. Here we present a fully nonlinear particle filter that can be applied to higher dimensional problems by exploiting the freedom of the proposal density inherent in particle filtering. The method is illustrated for the three-dimensional Lorenz model using three particles and the much more complex 40-dimensional Lorenz model using 20 particles. By also applying the method to the 1000-dimensional Lorenz model, again using only 20 particles, we demonstrate the strong scale-invariance of the method, leading to the optimistic conjecture that the method is applicable to realistic geophysical problems. Copyright © 2010 Royal Meteorological Society.
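The freedom exploited above is that particles need not be propagated with the model (prior) alone: any proposal density q may be used, provided each weight is corrected by p(y|x) p(x|x_prev) / q(x|x_prev, y). A minimal sketch for a one-dimensional toy model follows, in which the proposal nudges each particle towards the next observation; the model, nudging strength, noise levels and particle count are illustrative assumptions and are not the scheme used in the Lorenz experiments.

```python
import numpy as np

rng = np.random.default_rng(3)

def proposal_particle_filter(y_obs, n_particles=20, obs_std=1.0, model_std=0.5, nudge=0.5):
    """Particle filter with a proposal that relaxes particles towards the observation.

    Weights are corrected by p(y|x) * p(x|x_prev) / q(x|x_prev, y), keeping the filter exact.
    """
    x = rng.normal(0.0, 1.0, n_particles)
    means = []
    for y in y_obs:
        x_det = 0.9 * x                                          # deterministic model step
        q_mean = x_det + nudge * (y - x_det)                     # proposal mean nudged towards y
        x_new = q_mean + rng.normal(0.0, model_std, n_particles)
        log_w = (-0.5 * ((y - x_new) / obs_std) ** 2             # log likelihood  p(y|x)
                 - 0.5 * ((x_new - x_det) / model_std) ** 2      # log prior transition p(x|x_prev)
                 + 0.5 * ((x_new - q_mean) / model_std) ** 2)    # minus log proposal q(x|x_prev, y)
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        means.append(np.sum(w * x_new))
        x = x_new[rng.choice(n_particles, n_particles, p=w)]     # resample
    return np.array(means)

# Synthetic truth and observations from the same toy model.
truth = np.zeros(40)
for t in range(1, 40):
    truth[t] = 0.9 * truth[t - 1] + rng.normal(0, 0.5)
y = truth + rng.normal(0, 1.0, 40)
print(proposal_particle_filter(y)[:5])
```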
Abstract:
The A-Train constellation of satellites provides a new capability to measure vertical cloud profiles that leads to more detailed information on ice-cloud microphysical properties than has been possible up to now. A variational radar–lidar ice-cloud retrieval algorithm (VarCloud) takes advantage of the complementary nature of the CloudSat radar and Cloud–Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) lidar to provide a seamless retrieval of ice water content, effective radius, and extinction coefficient from the thinnest cirrus (seen only by the lidar) to the thickest ice cloud (penetrated only by the radar). In this paper, several versions of the VarCloud retrieval are compared with the CloudSat standard ice-only retrieval of ice water content, two empirical formulas that derive ice water content from radar reflectivity and temperature, and retrievals of vertically integrated properties from the Moderate Resolution Imaging Spectroradiometer (MODIS) radiometer. The retrieved variables typically agree to within a factor of 2, on average, and most of the differences can be explained by the different microphysical assumptions. For example, the ice water content comparison illustrates the sensitivity of the retrievals to assumed ice particle shape. If ice particles are modeled as oblate spheroids rather than spheres for radar scattering then the retrieved ice water content is reduced by on average 50% in clouds with a reflectivity factor larger than 0 dBZ. VarCloud retrieves optical depths that are on average a factor-of-2 lower than those from MODIS, which can be explained by the different assumptions on particle mass and area; if VarCloud mimics the MODIS assumptions then better agreement is found in effective radius and optical depth is overestimated. MODIS predicts the mean vertically integrated ice water content to be around a factor-of-3 lower than that from VarCloud for the same retrievals, however, because the MODIS algorithm assumes that its retrieved effective radius (which is mostly representative of cloud top) is constant throughout the depth of the cloud. These comparisons highlight the need to refine microphysical assumptions in all retrieval algorithms and also for future studies to compare not only the mean values but also the full probability density function.
Abstract:
The tiger nut tuber of the Cyperus esculentus L. plant is an unusual storage system with similar amounts of starch and lipid. The extraction of its oil employing both mechanical pressing and aqueous enzymatic extraction (AEE) methods was investigated, and the resulting products were examined. The effects of particle size and moisture content of the tuber on the yield of tiger nut oil with pressing were initially studied. Smaller particles were found to enhance oil yields, while a range of moisture contents was observed to favour higher oil yields. When samples were first subjected to high pressures of up to 700 MPa before pressing at 38 MPa, there was no increase in the oil yields. Ground samples incubated with a mixture of α-amylase, Alcalase and Viscozyme (a mixture of cell-wall-degrading enzymes) as a pre-treatment gave increased oil yields on pressing, with 90% of the oil recovered as a result. When aqueous enzymatic extraction was carried out on ground samples, the use of α-amylase, Alcalase and Celluclast independently improved extraction oil yields, compared to oil extraction without enzymes, by 34.5, 23.4 and 14.7% respectively. A mixture of the three enzymes further augmented the oil yield, and different operational factors were individually studied for their effects on the process: incubation time, total mixed-enzyme concentration, linear agitation speed, and solid-liquid ratio. The largest oil yields were obtained with a solid-liquid ratio of 1:6, a mixed-enzyme concentration of 1% (w/w) and a 6 h incubation time, although the longer time allowed the formation of an emulsion. Using stationary samples during incubation surprisingly gave the highest oil yields, and this was found to be a result of gravity separation occurring during agitation. Furthermore, the use of high-pressure processing up to 300 MPa as a pre-treatment enhanced oil yields, but further pressure increments had a detrimental effect. The quality of the oils recovered from both mechanical and aqueous enzymatic extraction, assessed by percentage free fatty acid (% FFA) and peroxide value (PV), reflected their good stability, with a highest % FFA of 1.8 and PV of 1.7. The fatty acid profiles of all oils also remained unchanged. The levels of tocopherols in the oils were enhanced by both enzyme-aided pressing (EAP) and high-pressure processing before AEE. Analysis of the residual meals revealed DP3 and DP4 oligosaccharides present in EAP samples, but these would require further assessment of their identity and quality.