925 results for Free-space method


Relevance: 30.00%

Abstract:

A Kriging interpolation method is combined with an object-based evaluation measure to assess the ability of the UK Met Office's dispersion and weather prediction models to predict the evolution of a plume of tracer as it was transported across Europe. The object-based evaluation method, SAL, considers aspects of the Structure, Amplitude and Location of the pollutant field. The SAL method is able to quantify errors in the predicted size and shape of the pollutant plume, through the structure component, the over- or under-prediction of the pollutant concentrations, through the amplitude component, and the position of the pollutant plume, through the location component. The quantitative results of the SAL evaluation are similar for both models and close to a subjective visual inspection of the predictions. A negative structure component for both models, throughout the entire 60-hour plume dispersion simulation, indicates that the modelled plumes are too small and/or too peaked compared to the observed plume at all times. The amplitude component for both models is strongly positive at the start of the simulation, indicating that surface concentrations are over-predicted by both models for the first 24 hours, but modelled concentrations are within a factor of 2 of the observations at later times. Finally, for both models, the location component is small for the first 48 hours after the start of the tracer release, indicating that the modelled plumes are situated close to the observed plume early on in the simulation, but this plume location error grows at later times. The SAL methodology has also been used to identify differences in the transport of pollution in the dispersion and weather prediction models. The convection scheme in the weather prediction model is found to transport more pollution vertically out of the boundary layer into the free troposphere than the dispersion model convection scheme, resulting in lower pollutant concentrations near the surface and hence a better forecast for this case study.
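The amplitude and location components described above can be sketched for gridded fields. This is a minimal illustration following the published SAL definitions (Wernli et al.), not the Met Office implementation, and it omits the structure component, which requires identifying coherent objects in the field:

```python
import numpy as np

def sal_amplitude_location(mod, obs):
    """Amplitude (A) and first location (L1) components of SAL.
    A in [-2, 2]: positive means the domain-mean amount is over-predicted.
    L1 in [0, 1]: normalised distance between the fields' centres of mass."""
    d_mod, d_obs = mod.mean(), obs.mean()
    A = (d_mod - d_obs) / (0.5 * (d_mod + d_obs))

    def centre_of_mass(f):
        ny, nx = f.shape
        yy, xx = np.mgrid[0:ny, 0:nx]
        total = f.sum()
        return np.array([(yy * f).sum() / total, (xx * f).sum() / total])

    d_max = np.hypot(*mod.shape)  # largest distance across the domain
    L1 = np.linalg.norm(centre_of_mass(mod) - centre_of_mass(obs)) / d_max
    return A, L1

field = np.random.default_rng(0).random((20, 30))
A, L1 = sal_amplitude_location(field, field)
print(A, L1)  # identical fields give A = 0 and L1 = 0
```

A positive A with a small L1, for instance, would reproduce the paper's early-time finding: concentrations over-predicted but the plume well placed.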

Relevance: 30.00%

Abstract:

We consider scattering of a time harmonic incident plane wave by a convex polygon with piecewise constant impedance boundary conditions. Standard finite or boundary element methods require the number of degrees of freedom to grow at least linearly with respect to the frequency of the incident wave in order to maintain accuracy. Extending earlier work by Chandler-Wilde and Langdon for the sound soft problem, we propose a novel Galerkin boundary element method, with the approximation space consisting of the products of plane waves with piecewise polynomials supported on a graded mesh with smaller elements closer to the corners of the polygon. Theoretical analysis and numerical results suggest that the number of degrees of freedom required to achieve a prescribed level of accuracy grows only logarithmically with respect to the frequency of the incident wave.
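The corner-graded meshes can be illustrated with a simple algebraic grading; the exponent used below is a hypothetical choice for illustration, not the grading prescribed by the paper's analysis:

```python
import numpy as np

def graded_mesh(L, N, q):
    """Mesh on [0, L] graded algebraically toward x = 0:
    x_j = L * (j/N)**q. Larger q clusters more points near the corner,
    where the solution of the scattering problem is singular."""
    j = np.arange(N + 1)
    return L * (j / N) ** q

mesh = graded_mesh(1.0, 8, 3)
widths = np.diff(mesh)
print(widths[0] < widths[-1])  # smallest element sits at the corner
```

In the hybrid method described above, piecewise polynomials on such a mesh multiply plane-wave factors, which is what keeps the degree-of-freedom count growing only logarithmically with frequency.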

Relevance: 30.00%

Abstract:

This paper employs a state space system description to provide a pole placement scheme via state feedback. It is shown that when a recursive least squares estimation scheme is used, the feedback employed can be expressed simply in terms of the estimated system parameters. To complement the state feedback approach, a method employing both state feedback and linear output feedback is discussed. Both methods are then compared with the previous output polynomial type feedback schemes.
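For a single-input system in state-space form, pole placement by state feedback can be sketched with Ackermann's formula. This is a generic illustration, not the paper's recursive-least-squares scheme; the double-integrator example stands in for a model whose parameters would, in the paper, come from the estimator:

```python
import numpy as np

def ackermann(A, b, poles):
    """State-feedback gain k such that eig(A - b k) equals `poles`
    (Ackermann's formula for a single-input controllable pair)."""
    n = A.shape[0]
    # controllability matrix [b, Ab, ..., A^{n-1} b]
    C = np.hstack([np.linalg.matrix_power(A, i) @ b for i in range(n)])
    # desired characteristic polynomial evaluated at A
    coeffs = np.poly(poles)  # [1, a1, ..., an]
    pA = sum(c * np.linalg.matrix_power(A, n - i) for i, c in enumerate(coeffs))
    e = np.zeros((1, n)); e[0, -1] = 1.0
    return e @ np.linalg.inv(C) @ pA

A = np.array([[0.0, 1.0], [0.0, 0.0]])  # double integrator
b = np.array([[0.0], [1.0]])
k = ackermann(A, b, [-1.0, -2.0])
print(np.sort(np.linalg.eigvals(A - b @ k)))  # poles placed at -2 and -1
```

With an online estimator, A and b would be rebuilt from the current parameter estimates at each step, so the gain is indeed a simple function of those estimates.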

Relevance: 30.00%

Abstract:

A predominance of small, dense low-density lipoprotein (LDL) is a major component of an atherogenic lipoprotein phenotype, and a common, but modifiable, source of increased risk for coronary heart disease in the free-living population. While much of the atherogenicity of small, dense LDL is known to arise from its structural properties, the extent to which an increase in the number of small, dense LDL particles (hyper-apoprotein B) contributes to this risk of coronary heart disease is currently unknown. This study reports a method for the recruitment of free-living individuals with an atherogenic lipoprotein phenotype for a fish-oil intervention trial, and critically evaluates the relationship between LDL particle number and the predominance of small, dense LDL. In this group, volunteers were selected through local general practices on the basis of a moderately raised plasma triacylglycerol (triglyceride) level (>1.5 mmol/l) and a low concentration of high-density-lipoprotein cholesterol (<1.1 mmol/l). The screening of LDL subclasses revealed a predominance of small, dense LDL (LDL subclass pattern B) in 62% of the cohort. As expected, subjects with LDL subclass pattern B were characterized by higher plasma triacylglycerol and lower high-density-lipoprotein cholesterol levels and, less predictably, by lower LDL cholesterol and apoprotein B levels (P<0.05; LDL subclass A compared with subclass B). While hyper-apoprotein B was detected in only five subjects, the relative percentage of small, dense LDL-III in subjects with subclass B showed an inverse relationship with LDL apoprotein B (r=-0.57; P<0.001), identifying a subset of individuals with plasma triacylglycerol above 2.5 mmol/l and a low concentration of LDL almost exclusively in a small and dense form. These findings indicate that a predominance of small, dense LDL and hyper-apoprotein B do not always co-exist in free-living groups. Moreover, if coronary risk increases with increasing LDL particle number, these results imply that the risk arising from a predominance of small, dense LDL may actually be reduced in certain cases when plasma triacylglycerol exceeds 2.5 mmol/l.

Relevance: 30.00%

Abstract:

A distributed Lagrangian moving-mesh finite element method is applied to problems involving changes of phase. The algorithm uses a distributed conservation principle to determine nodal mesh velocities, which are then used to move the nodes. The nodal values are obtained from an ALE (Arbitrary Lagrangian-Eulerian) equation, which represents a generalization of the original algorithm presented in Applied Numerical Mathematics, 54:450–469 (2005). Having described the details of the generalized algorithm it is validated on two test cases from the original paper and is then applied to one-phase and, for the first time, two-phase Stefan problems in one and two space dimensions, paying particular attention to the implementation of the interface boundary conditions. Results are presented to demonstrate the accuracy and the effectiveness of the method, including comparisons against analytical solutions where available.
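For reference, the classical one-phase Stefan problem in one space dimension, to which the method is applied, can be stated in nondimensional form (a standard formulation rather than the paper's exact notation) as

```latex
\begin{aligned}
&u_t = u_{xx}, && 0 < x < s(t),\\
&u\bigl(s(t),t\bigr) = 0, && \text{(interface held at the melting temperature)}\\
&\dot{s}(t) = -u_x\bigl(s(t),t\bigr), && \text{(Stefan condition)}
\end{aligned}
```

where u is the temperature and s(t) the position of the moving phase boundary; a boundary condition at x = 0 closes the problem. The Stefan condition is precisely the interface condition whose implementation the paper treats with particular attention.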

Relevance: 30.00%

Abstract:

We introduce the perspex machine which unifies projective geometry and Turing computation and results in a supra-Turing machine. We show two ways in which the perspex machine unifies symbolic and non-symbolic AI. Firstly, we describe concrete geometrical models that map perspexes onto neural networks, some of which perform only symbolic operations. Secondly, we describe an abstract continuum of perspex logics that includes both symbolic logics and a new class of continuous logics. We argue that an axiom in symbolic logic can be the conclusion of a perspex theorem. That is, the atoms of symbolic logic can be the conclusions of sub-atomic theorems. We argue that perspex space can be mapped onto the spacetime of the universe we inhabit. This allows us to discuss how a robot might be conscious, feel, and have free will in a deterministic, or semi-deterministic, universe. We ground the reality of our universe in existence. On a theistic point, we argue that preordination and free will are compatible. On a theological point, we argue that it is not heretical for us to give robots free will. Finally, we give a pragmatic warning as to the double-edged risks of creating robots that do, or alternatively do not, have free will.

Relevance: 30.00%

Abstract:

The background error covariance matrix, B, is often used in variational data assimilation for numerical weather prediction as a static and hence poor approximation to the fully dynamic forecast error covariance matrix, Pf. In this paper the concept of an Ensemble Reduced Rank Kalman Filter (EnRRKF) is outlined. In the EnRRKF the forecast error statistics in a subspace defined by an ensemble of states forecast by the dynamic model are found. These statistics are merged in a formal way with the static statistics, which apply in the remainder of the space. The combined statistics may then be used in a variational data assimilation setting. It is hoped that the nonlinear error growth of small-scale weather systems will be accurately captured by the EnRRKF, to produce accurate analyses and ultimately improved forecasts of extreme events.
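Schematically, the merge of ensemble and static statistics can be written as follows, where the columns of E are ensemble perturbations, Π projects onto the ensemble subspace, Pᵉ is the ensemble sample covariance and B the static matrix. This is one generic way to express such a hybrid, not necessarily the paper's exact formulation:

```latex
P^{f} \;\approx\; \Pi\, P^{e}\, \Pi \;+\; (I - \Pi)\, B\, (I - \Pi),
\qquad
\Pi = E\,(E^{\mathsf T}E)^{-1}E^{\mathsf T}.
```

The first term supplies flow-dependent statistics in the ensemble subspace; the second retains the static statistics in its orthogonal complement, which is the remainder of the space referred to above.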

Relevance: 30.00%

Abstract:

The coarse spacing of automatic rain gauges complicates near-real-time spatial analyses of precipitation. We test the possibility of improving such analyses by considering, in addition to the in situ measurements, the spatial covariance structure inferred from past observations with a denser network. To this end, a statistical reconstruction technique, reduced space optimal interpolation (RSOI), is applied over Switzerland, a region of complex topography. RSOI consists of two main parts. First, principal component analysis (PCA) is applied to obtain a reduced space representation of gridded high-resolution precipitation fields available for a multiyear calibration period in the past. Second, sparse real-time rain gauge observations are used to estimate the principal component scores and to reconstruct the precipitation field. In this way, climatological information at higher resolution than the near-real-time measurements is incorporated into the spatial analysis. PCA is found to efficiently reduce the dimensionality of the calibration fields, and RSOI is successful despite the difficulties associated with the statistical distribution of daily precipitation (skewness, dry days). Examples and a systematic evaluation show substantial added value over a simple interpolation technique that uses near-real-time observations only. The benefit is particularly strong for larger-scale precipitation and prominent topographic effects. Small-scale precipitation features are reconstructed at a skill comparable to that of the simple technique. Stratifying the reconstruction by weather type yields little added skill. Apart from application in near real time, RSOI may also be valuable for enhancing instrumental precipitation analyses for the historic past when direct observations were sparse.
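The two-step procedure can be sketched as follows. This is a bare-bones illustration on synthetic data, and it substitutes ordinary least squares for the optimal, error-covariance-weighted score estimate that "optimal interpolation" implies:

```python
import numpy as np

def rsoi_reconstruct(calib, obs_idx, obs_vals, k):
    """Reduced space optimal interpolation, simplified:
    (1) PCA of the calibration fields via SVD,
    (2) least-squares estimate of the leading k PC scores from the
        sparse gauge observations,
    (3) reconstruction of the full field from those scores."""
    mean = calib.mean(axis=0)
    anoms = calib - mean
    _, _, Vt = np.linalg.svd(anoms, full_matrices=False)
    eofs = Vt[:k]                        # (k, n_grid) leading spatial patterns
    H = eofs[:, obs_idx].T               # patterns sampled at the gauges
    scores, *_ = np.linalg.lstsq(H, obs_vals - mean[obs_idx], rcond=None)
    return mean + scores @ eofs

rng = np.random.default_rng(1)
calib = rng.random((50, 200))            # 50 past days x 200 grid cells
truth = calib[0]
obs_idx = np.arange(0, 200, 10)          # every 10th cell has a gauge
field = rsoi_reconstruct(calib, obs_idx, truth[obs_idx], k=5)
print(field.shape)  # (200,)
```

The climatological information enters through the EOF patterns, which is how the reconstruction resolves structure finer than the gauge spacing.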

Relevance: 30.00%

Abstract:

It is well known that gut bacteria contribute significantly to the host homeostasis, providing a range of benefits such as immune protection and vitamin synthesis. They also supply the host with a considerable amount of nutrients, making this ecosystem an essential metabolic organ. In the context of increasing evidence of the link between the gut flora and the metabolic syndrome, understanding the metabolic interaction between the host and its gut microbiota is becoming an important challenge of modern biology [1-4]. Colonization (also referred to as normalization process) designates the establishment of micro-organisms in a former germ-free animal. While it is a natural process occurring at birth, it is also used in adult germ-free animals to control the gut floral ecosystem and further determine its impact on the host metabolism. A common procedure to control the colonization process is to use the gavage method with a single or a mixture of micro-organisms. This method results in a very quick colonization and presents the disadvantage of being extremely stressful [5]. It is therefore useful to minimize the stress and to obtain a slower colonization process to observe gradually the impact of bacterial establishment on the host metabolism. In this manuscript, we describe a procedure to assess the modification of hepatic metabolism during a gradual colonization process using a non-destructive metabolic profiling technique. We propose to monitor gut microbial colonization by assessing the gut microbial metabolic activity reflected by the urinary excretion of microbial co-metabolites by ¹H NMR-based metabolic profiling. This allows an appreciation of the stability of gut microbial activity beyond the stable establishment of the gut microbial ecosystem usually assessed by monitoring fecal bacteria by DGGE (denaturing gradient gel electrophoresis) [6]. The colonization takes place in a conventional open environment and is initiated by a dirty litter soiled by conventional animals, which will serve as controls. Rodents being coprophagous animals, this ensures a homogenous colonization as previously described [7]. Hepatic metabolic profiling is measured directly from an intact liver biopsy using ¹H High Resolution Magic Angle Spinning NMR spectroscopy. This semi-quantitative technique offers a quick way to assess, without damaging the cell structure, the major metabolites such as triglycerides, glucose and glycogen in order to further estimate the complex interaction between the colonization process and the hepatic metabolism [7-10]. This method can also be applied to any tissue biopsy [11,12].

Relevance: 30.00%

Abstract:

A variety of physical and behavioural factors determine the energy-demand load profile. Attaining the optimum mix of measures and renewable energy system deployment requires a simple method suitable for use at the early design stage. A simple method of formulating load profile (SMLP) for UK domestic buildings has been presented in this paper. Domestic space-heating load profiles for different types of houses have been produced using a dynamic thermal model developed with the thermal-resistance network method. The method predicts the daily breakdown of the energy-demand load profile into appliance, domestic hot-water and space-heating components. It can produce daily load profiles from the individual house up to the urban-community scale, and is suitable for use at the strategic design stage of renewable energy systems.
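At its simplest, a thermal-resistance calculation of space-heating demand reduces to a steady-state balance between indoor set point and outdoor temperature. The sketch below uses a single illustrative resistance value and temperature series, whereas the paper's model is a full dynamic resistance network:

```python
import numpy as np

def heating_load(T_set, T_out, R):
    """Hourly space-heating demand (kW) for a lumped thermal
    resistance R (K/kW): Q = max(0, (T_set - T_out) / R)."""
    return np.maximum(0.0, (T_set - np.asarray(T_out, dtype=float)) / R)

# illustrative outdoor temperatures (deg C) over ten hours
T_out = [2, 1, 1, 3, 6, 9, 11, 10, 7, 4]
profile = heating_load(T_set=20.0, T_out=T_out, R=5.0)
print(profile)  # demand falls as the outdoor temperature rises
```

Summing such hourly profiles with appliance and hot-water components gives the kind of daily breakdown load profile the method produces.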

Relevance: 30.00%

Abstract:

To address whether seasonal variability exists among Shiga toxin-encoding bacteriophage (Stx phage) numbers on a cattle farm, conventional plaque assay was performed on water samples collected over a 17 month period. Distinct seasonal variation in bacteriophage numbers was evident, peaking between June and August. Removal of cattle from the pasture precipitated a reduction in bacteriophage numbers, and during the winter months, no bacteriophages infecting Escherichia coli were detected, a surprising occurrence considering that 10³¹ tailed bacteriophages are estimated to populate the globe. To address this discrepancy a culture-independent method based on quantitative PCR was developed. Primers targeting the Q gene and stx genes were designed that accurately and discriminately quantified artificial mixed lambdoid bacteriophage populations. Application of these primer sets to water samples possessing no detectable phages by plaque assay demonstrated that the number of lambdoid bacteriophages ranged from 4.7 × 10⁴ to 6.5 × 10⁶ ml⁻¹, with one in 10³ free lambdoid bacteriophages carrying a Shiga toxin operon (stx). Specific molecular biological tools and discriminatory gene targets have enabled virus populations in the natural environment to be enumerated, and similar strategies could replace existing propagation-dependent techniques, which grossly underestimate the abundance of viral entities.
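Absolute quantification by qPCR is conventionally read off a log-linear standard curve. The sketch below uses illustrative slope and intercept values, not the calibration of the Q- or stx-primer assays described here:

```python
def copies_from_cq(cq, slope=-3.32, intercept=38.0):
    """Copy number from a qPCR quantification cycle (Cq) via a
    log-linear standard curve: Cq = slope * log10(copies) + intercept.
    The default slope corresponds to ~100% amplification efficiency;
    both defaults are placeholder values for illustration."""
    return 10 ** ((cq - intercept) / slope)

# at the curve's intercept the estimate is a single template copy
print(round(copies_from_cq(38.0)))  # 1
```

Because each ~3.32-cycle decrease in Cq corresponds to a tenfold increase in template, such a curve resolves the 10⁴ to 10⁶ ml⁻¹ range reported above across only a handful of cycles.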

Relevance: 30.00%

Abstract:

We discuss the time evolution of the wave function which is the solution of a stochastic Schrödinger equation describing the dynamics of a free quantum particle subject to spontaneous localizations in space. We prove global existence and uniqueness of solutions. We observe that there exist three time regimes: the collapse regime, the classical regime and the diffusive regime. Concerning the latter, we assert that the general solution converges almost surely to a diffusing Gaussian wave function having a finite spread both in position as well as in momentum. This paper corrects and completes earlier works on this issue.
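A representative stochastic Schrödinger equation with spontaneous spatial localization (the QMUPL-type form, shown here as the generic model class rather than necessarily the paper's exact equation) is

```latex
\mathrm{d}\psi_t = \Bigl[ -\tfrac{i}{\hbar}\, H \,\mathrm{d}t
 \;+\; \sqrt{\lambda}\,\bigl(q - \langle q \rangle_t\bigr)\,\mathrm{d}W_t
 \;-\; \tfrac{\lambda}{2}\,\bigl(q - \langle q \rangle_t\bigr)^{2}\,\mathrm{d}t \Bigr]\,\psi_t,
\qquad H = \frac{p^{2}}{2m},
```

where ⟨q⟩ₜ = ⟨ψₜ, q ψₜ⟩, λ sets the localization strength and Wₜ is a standard Wiener process. The competition between the free Hamiltonian and the localization terms is what produces the three regimes identified above: collapse, classical and diffusive.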

Relevance: 30.00%

Abstract:

Statistical methods of inference typically require the likelihood function to be computable in a reasonable amount of time. The class of “likelihood-free” methods termed Approximate Bayesian Computation (ABC) is able to eliminate this requirement, replacing the evaluation of the likelihood with simulation from it. Likelihood-free methods have gained in efficiency and popularity in the past few years, following their integration with Markov Chain Monte Carlo (MCMC) and Sequential Monte Carlo (SMC) in order to better explore the parameter space. They have been applied primarily to estimating the parameters of a given model, but can also be used to compare models. Here we present novel likelihood-free approaches to model comparison, based upon the independent estimation of the evidence of each model under study. Key advantages of these approaches over previous techniques are that they allow the exploitation of MCMC or SMC algorithms for exploring the parameter space, and that they do not require a sampler able to mix between models. We validate the proposed methods using a simple exponential family problem before providing a realistic problem from human population genetics: the comparison of different demographic models based upon genetic data from the Y chromosome.
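The core idea of likelihood-free evidence estimation can be illustrated with plain rejection ABC, where a model's acceptance rate estimates its evidence up to a constant shared by all models, so the ratio of rates approximates a Bayes factor. This is a toy sketch of the principle, not the MCMC/SMC algorithms developed in the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

def abc_evidence(simulate, prior_draw, s_obs, eps, n=20000):
    """Rejection-ABC evidence estimate, up to a model-independent
    constant: the acceptance probability P(|s - s_obs| < eps) under
    the prior predictive distribution."""
    accepted = 0
    for _ in range(n):
        theta = prior_draw()
        if abs(simulate(theta) - s_obs) < eps:
            accepted += 1
    return accepted / n

# toy comparison: which prior better explains an observed summary of 0.9?
s_obs = 0.9
m1 = abc_evidence(lambda th: rng.normal(th, 0.1), lambda: rng.uniform(0, 1), s_obs, 0.05)
m2 = abc_evidence(lambda th: rng.normal(th, 0.1), lambda: rng.uniform(2, 3), s_obs, 0.05)
print(m1 > m2)  # model 1's prior covers the observation; model 2's does not
```

Estimating each model's evidence independently, as above, is what removes the need for a sampler that mixes between models.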

Relevance: 30.00%

Abstract:

A fingerprint method for detecting anthropogenic climate change is applied to new simulations with a coupled ocean-atmosphere general circulation model (CGCM) forced by increasing concentrations of greenhouse gases and aerosols covering the years 1880 to 2050. In addition to the anthropogenic climate change signal, the space-time structure of the natural climate variability for near-surface temperatures is estimated from instrumental data over the last 134 years and two 1000 year simulations with CGCMs. The estimates are compared with paleoclimate data over 570 years. The space-time information on both the signal and the noise is used to maximize the signal-to-noise ratio of a detection variable obtained by applying an optimal filter (fingerprint) to the observed data. The inclusion of aerosols slows the predicted future warming. The probability that the observed increase in near-surface temperatures in recent decades is of natural origin is estimated to be less than 5%. However, this number is dependent on the estimated natural variability level, which is still subject to some uncertainty.
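In standard optimal-detection form (schematic; the notation may differ from the paper's), the fingerprint and detection variable are

```latex
d = f^{\mathsf T}\,\psi_{\mathrm{obs}},
\qquad f = C^{-1} g,
\qquad \mathrm{SNR}^{2} = g^{\mathsf T} C^{-1} g,
```

where g is the model-predicted signal pattern, C the covariance of natural variability estimated from the control simulations and instrumental data, and ψ_obs the observed field. Rotating the fingerprint toward directions of low noise variance is what maximizes the signal-to-noise ratio of d.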

Relevance: 30.00%

Abstract:

It is shown how a renormalization technique, which is a variant of classical Krylov–Bogolyubov–Mitropol’skii averaging, can be used to obtain slow evolution equations for the vortical and inertia–gravity wave components of the dynamics in a rotating flow. The evolution equations for each component are obtained to second order in the Rossby number, and the nature of the coupling between the two is analyzed carefully. It is also shown how classical balance models such as quasigeostrophic dynamics and its second-order extension appear naturally as a special case of this renormalized system, thereby providing a rigorous basis for the slaving approach where only the fast variables are expanded. It is well known that these balance models correspond to a hypothetical slow manifold of the parent system; the method herein allows the determination of the dynamics in the neighborhood of such solutions. As a concrete illustration, a simple weak-wave model is used, although the method readily applies to more complex rotating fluid models such as the shallow-water, Boussinesq, primitive, and 3D Euler equations.