35 results for "Editor of flow analysis methods"

in CentAUR: Central Archive University of Reading - UK


Relevance: 100.00%

Abstract:

For the tracking of extrema associated with weather systems to be applied to a broad range of fields, it is necessary to remove a background field that represents the slowly varying, large spatial scales. The sensitivity of the tracking analysis to the form of background field removed is explored for the Northern Hemisphere winter storm tracks, for three contrasting fields from an integration of the U.K. Met Office's (UKMO) Hadley Centre Climate Model (HadAM3). Several methods of removing a background field are explored, ranging from the simple subtraction of the climatology to the more sophisticated removal of the planetary scales. Two temporal filters are also considered, in the form of a 2-6-day Lanczos bandpass filter and a 20-day high-pass Fourier filter. The analysis indicates that the simple subtraction of the climatology tends to change the nature of the systems to the extent that the systems are redistributed relative to the climatological background, resulting in very similar statistical distributions for both positive and negative anomalies. The optimal planetary wave filter removes total wavenumbers less than or equal to a number in the range 5-7, resulting in distributions more easily related to particular types of weather system. Of the temporal filters, the 2-6-day bandpass filter is found to have a detrimental impact on the individual weather systems, resulting in the storm tracks having a weak waveguide type of behavior. The 20-day high-pass temporal filter is less aggressive than the 2-6-day filter and produces results falling between those of the climatological and 2-6-day filters.
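
As an illustration of the temporal filtering step discussed above, the sketch below applies a simple Fourier truncation to a synthetic daily series to form a 20-day high-pass anomaly and a 2-6-day band-pass anomaly. This is a minimal stand-in rather than the paper's procedure: the study uses a Lanczos filter for the 2-6-day band, and the cut-offs, series length and synthetic field here are illustrative assumptions.

```python
import numpy as np

def fourier_filter(x, dt_days=1.0, min_period=None, max_period=None):
    """Keep only Fourier components with periods between min_period and
    max_period (days); max_period alone gives a high-pass filter, both
    together give a band-pass filter."""
    n = x.size
    freqs = np.fft.rfftfreq(n, d=dt_days)      # cycles per day
    spec = np.fft.rfft(x)
    keep = np.ones_like(freqs, dtype=bool)
    if max_period is not None:                 # high-pass: drop slow variability
        keep &= freqs >= 1.0 / max_period
    if min_period is not None:                 # low-pass: drop fast variability
        keep &= freqs <= 1.0 / min_period
    spec[~keep] = 0.0
    return np.fft.irfft(spec, n=n)

# Illustrative daily series: a slow "background" cycle plus synoptic noise
t = np.arange(360.0)
series = np.sin(2.0 * np.pi * t / 180.0) + 0.5 * np.random.randn(t.size)

anom_20d_hp = fourier_filter(series, max_period=20.0)                 # 20-day high-pass
anom_2_6_bp = fourier_filter(series, min_period=2.0, max_period=6.0)  # 2-6-day band-pass
```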

Relevance: 100.00%

Abstract:

Capillary electrophoresis (CE) offers the analyst a number of key advantages for the analysis of the components of foods. CE offers better resolution than, say, high-performance liquid chromatography (HPLC), and is more adept at the simultaneous separation of a number of components of different chemistries within a single matrix. In addition, CE requires less rigorous sample cleanup procedures than HPLC, while offering the same degree of automation. However, despite these advantages, CE remains under-utilized by food analysts. Therefore, this review consolidates and discusses the currently reported applications of CE that are relevant to the analysis of foods. Some discussion is also devoted to the development of these reported methods and to the advantages/disadvantages compared with the more usual methods for each particular analysis. It is the aim of this review to give practicing food analysts an overview of the current scope of CE.

Relevance: 100.00%

Abstract:

Increasingly, the microbiological scientific community is relying on molecular biology to define the complexity of the gut flora and to distinguish one organism from the next. This is particularly pertinent in the field of probiotics, and probiotic therapy, where identifying probiotics from the commensal flora is often warranted. Current techniques, including genetic fingerprinting, gene sequencing, oligonucleotide probes and specific primer selection, discriminate closely related bacteria with varying degrees of success. Additional molecular methods being employed to determine the constituents of complex microbiota in this area of research are community analysis, denaturing gradient gel electrophoresis (DGGE)/temperature gradient gel electrophoresis (TGGE), fluorescent in situ hybridisation (FISH) and probe grids. Certain approaches enable specific aetiological agents to be monitored, whereas others allow the effects of dietary intervention on bacterial populations to be studied. Other approaches demonstrate diversity, but may not always enable quantification of the population. At the heart of current molecular methods is sequence information gathered from culturable organisms. However, the diversity and novelty identified when applying these methods to the gut microflora demonstrates how little is known about this ecosystem. Of greater concern is the inherent bias associated with some molecular methods. As we understand more of the complexity and dynamics of this diverse microbiota we will be in a position to develop more robust molecular-based technologies to examine it. In addition to identification of the microbiota and discrimination of probiotic strains from commensal organisms, the future of molecular biology in the field of probiotics and the gut flora will, no doubt, stretch to investigations of functionality and activity of the microflora, and/or specific fractions. The quest will be to demonstrate the roles of probiotic strains in vivo and not simply their presence or absence.

Relevance: 100.00%

Abstract:

Background. Meta-analyses show that cognitive behaviour therapy for psychosis (CBT-P) improves distressing positive symptoms. However, it is a complex intervention involving a range of techniques. No previous study has assessed the delivery of the different elements of treatment and their effect on outcome. Our aim was to assess the differential effect of type of treatment delivered on the effectiveness of CBT-P, using novel statistical methodology. Method. The Psychological Prevention of Relapse in Psychosis (PRP) trial was a multi-centre randomized controlled trial (RCT) that compared CBT-P with treatment as usual (TAU). Therapy was manualized, and detailed evaluations of therapy delivery and client engagement were made. Follow-up assessments were made at 12 and 24 months. In a planned analysis, we applied principal stratification (involving structural equation modelling with finite mixtures) to estimate intention-to-treat (ITT) effects for subgroups of participants, defined by qualitative and quantitative differences in receipt of therapy, while maintaining the constraints of randomization. Results. Consistent delivery of full therapy, including specific cognitive and behavioural techniques, was associated with clinically and statistically significant increases in months in remission, and decreases in psychotic and affective symptoms. Delivery of partial therapy involving engagement and assessment was not effective. Conclusions. Our analyses suggest that CBT-P is of significant benefit on multiple outcomes to patients able to engage in the full range of therapy procedures. The novel statistical methods illustrated in this report have general application to the evaluation of heterogeneity in the effects of treatment.
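
The principal stratification analysis in the trial relies on structural equation modelling with finite mixtures. As a much simpler illustration of the underlying idea, the sketch below estimates an effect for the stratum that would receive full therapy by rescaling the intention-to-treat effect (a standard complier-average calculation). The data and variable names are entirely hypothetical; this is not the trial's methodology.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical trial data: arm (1 = CBT-P, 0 = TAU), receipt of full therapy
# (only possible in the CBT-P arm), and an outcome such as months in remission.
n = 400
arm = rng.integers(0, 2, n)
received_full = (arm == 1) & (rng.random(n) < 0.6)   # ~60% of CBT-P arm gets full therapy
outcome = 3.0 + 2.0 * received_full + rng.normal(0.0, 2.0, n)

# Intention-to-treat effect: compare arms exactly as randomised.
itt = outcome[arm == 1].mean() - outcome[arm == 0].mean()

# Simple complier-average estimate: rescale the ITT effect by the proportion
# of the CBT-P arm that actually received full therapy, assuming therapy is
# unavailable in the TAU arm (no "always-takers").
p_full = received_full[arm == 1].mean()
cace = itt / p_full

print(f"ITT effect: {itt:.2f}, complier-average estimate: {cace:.2f}")
```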

Relevance: 100.00%

Abstract:

Microbial metabolism of proteins and amino acids by human gut bacteria generates a variety of compounds, including phenol, indole and sulfur compounds and branched-chain fatty acids, many of which have been shown to elicit a toxic effect on the gut lumen. Bacterial fermentation of amino acids and proteins occurs mainly in the distal colon, a site commonly affected by disorders including ulcerative colitis (UC) and colorectal cancer (CRC). In contrast to carbohydrate metabolism by the gut microbiota, proteolysis is less extensively researched. Many of the metabolites are low-molecular-weight, volatile compounds. This review summarizes the use of analytical methods to detect and identify these compounds, in order to elucidate the relationship between specific dietary proteinaceous substrates, their corresponding metabolites, and the implications for gastrointestinal health.

Relevance: 100.00%

Abstract:

Approximate Bayesian computation (ABC) methods make use of comparisons between simulated and observed summary statistics to overcome the problem of computationally intractable likelihood functions. As the practical implementation of ABC requires computations based on vectors of summary statistics, rather than full data sets, a central question is how to derive low-dimensional summary statistics from the observed data with minimal loss of information. In this article we provide a comprehensive review and comparison of the performance of the principal methods of dimension reduction proposed in the ABC literature. The methods are split into three classes, which are not mutually exclusive: best subset selection methods, projection techniques and regularization. In addition, we introduce two new methods of dimension reduction. The first is a best subset selection method based on Akaike and Bayesian information criteria, and the second uses ridge regression as a regularization procedure. We illustrate the performance of these dimension reduction techniques through the analysis of three challenging models and data sets.
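
To make the setting concrete, the following sketch runs rejection ABC on a toy model using a single projected summary statistic, obtained by ridge regression from candidate summaries onto the parameter fitted on pilot simulations. It is only loosely in the spirit of the regression and regularization approaches reviewed above; the toy model, candidate summaries, tolerance and ridge penalty are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(theta, n=100):
    """Toy model: n draws from N(theta, 1); candidate summaries are the
    sample mean, variance and median (deliberately partly redundant)."""
    x = rng.normal(theta, 1.0, n)
    return np.array([x.mean(), x.var(), np.median(x)])

# Pilot simulations used to fit the dimension-reduction mapping.
theta_pilot = rng.uniform(-3.0, 3.0, 500)
S_pilot = np.array([simulate(t) for t in theta_pilot])

# Ridge regression from summaries to parameter: the fitted linear predictor
# acts as a single low-dimensional summary statistic.
lam = 1.0
X = np.c_[np.ones(len(S_pilot)), S_pilot]
beta = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ theta_pilot)

def project(s):
    return np.r_[1.0, s] @ beta

# Rejection ABC on the projected summary.
s_obs = simulate(1.5)                      # "observed" data summaries
theta_prior = rng.uniform(-3.0, 3.0, 20000)
dist = np.array([abs(project(simulate(t)) - project(s_obs)) for t in theta_prior])
eps = np.quantile(dist, 0.01)              # keep the closest 1% of draws
posterior_sample = theta_prior[dist <= eps]
print(posterior_sample.mean(), posterior_sample.std())
```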

Relevance: 100.00%

Abstract:

Turbulence statistics obtained by direct numerical simulations are analysed to investigate spatial heterogeneity within regular arrays of building-like cubical obstacles. Two different array layouts are studied, staggered and square, both at a packing density of 0.25. The flow statistics analysed are the mean streamwise velocity (u), the shear stress (u'w'), the turbulent kinetic energy (k) and the dispersive stress fraction (ũw̃). The spatial flow patterns and spatial distribution of these statistics in the two arrays are found to be very different. Local regions of high spatial variability are identified. The overall spatial variances of the statistics are shown to be generally very significant in comparison with their spatial averages within the arrays. Above the arrays the spatial variances as well as dispersive stresses decay rapidly to zero. The heterogeneity is explored further by separately considering six different flow regimes identified within the arrays, described here as: channelling region, constricted region, intersection region, building wake region, canyon region and front-recirculation region. It is found that the flow in the first three regions is relatively homogeneous, but that spatial variances in the latter three regions are large, especially in the building wake and canyon regions. The implication is that, in general, the flow immediately behind (and, to a lesser extent, in front of) a building is much more heterogeneous than elsewhere, even in the relatively dense arrays considered here. Most of the dispersive stress is concentrated in these regions. Considering the experimental difficulties of obtaining enough point measurements to form a representative spatial average, the error incurred by degrading the sampling resolution is investigated. It is found that a good estimate for both area and line averages can be obtained using a relatively small number of strategically located sampling points.
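
For concreteness, the sketch below shows how spatial averages, spatial variances and the dispersive stress can be formed from time-averaged velocity fields on a horizontal plane: the deviation of the time mean from its spatial mean is the "dispersive" part. The array shapes and values are placeholders, not the DNS data used in the study.

```python
import numpy as np

def spatial_stats(u_bar, w_bar):
    """Spatial statistics of time-averaged fields on a horizontal plane.

    u_bar, w_bar : 2-D arrays of time-mean velocity components at one height.
    Returns the spatial mean of u, its spatial variance, and the dispersive
    stress <u~ w~>, where ~ denotes the deviation from the spatial mean.
    """
    u_avg = u_bar.mean()
    w_avg = w_bar.mean()
    u_tilde = u_bar - u_avg            # spatial deviation of the time-mean field
    w_tilde = w_bar - w_avg
    dispersive_stress = (u_tilde * w_tilde).mean()
    return u_avg, u_bar.var(), dispersive_stress

# Illustrative fields on a 64 x 64 horizontal plane within the array
rng = np.random.default_rng(2)
u_bar = 2.0 + 0.5 * rng.standard_normal((64, 64))
w_bar = 0.1 * rng.standard_normal((64, 64))
print(spatial_stats(u_bar, w_bar))
```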

Relevance: 100.00%

Abstract:

A number of recent experiments suggest that, at a given wetting speed, the dynamic contact angle formed by an advancing liquid-gas interface with a solid substrate depends on the flow field and geometry near the moving contact line. In the present work, this effect is investigated in the framework of a previously developed theory based on the fact that dynamic wetting is, by its very name, a process of formation of a new liquid-solid interface (newly wetted solid surface) and hence should be considered not as a singular problem but as a particular case from a general class of flows with forming and/or disappearing interfaces. The results demonstrate that, in the flow configuration of curtain coating, where a liquid sheet (curtain) impinges onto a moving solid substrate, the actual dynamic contact angle indeed depends not only on the wetting speed and material constants of the contacting media, as in the so-called slip models, but also on the inlet velocity of the curtain, its height, and the angle between the falling curtain and the solid surface. In other words, for the same wetting speed the dynamic contact angle can be varied by manipulating the flow field and geometry near the moving contact line. The obtained results have important experimental implications: given that the dynamic contact angle is determined by the values of the surface tensions at the contact line and hence depends on the distributions of the surface parameters along the interfaces, which can be influenced by the flow field, one can use the overall flow conditions and the contact angle as a macroscopic multiparametric signal-response pair that probes the dynamics of the liquid-solid interface. This approach would allow one to investigate experimentally such properties of the interface as, for example, its equation of state and the rheological properties involved in the interface's response to an external torque, and would help to measure its parameters, such as the coefficient of sliding friction, the surface-tension relaxation time, and so on.

Relevance: 100.00%

Abstract:

The unsaturated zone exerts a major control on the delivery of nutrients to Chalk streams, yet flow and transport processes in this complex, dual-porosity medium have remained controversial. A major challenge arises in characterising these processes, both at the detailed mechanistic level and at an appropriate level for inclusion within catchment-scale models for nutrient management. The Lowland Catchment Research (LOCAR) programme in the UK has provided a unique set of comprehensively instrumented groundwater-dominated catchments. Of these, the Pang and Lambourn, tributaries of the Thames near Reading, have been a particular focus for research into subsurface processes and surface water-groundwater interactions. Data from LOCAR and other sources, along with a new dual permeability numerical model of the Chalk, have been used to explore the relative roles of matrix and fracture flow within the unsaturated zone and resolve conflicting hypotheses of response. From the improved understanding gained through these explorations, a parsimonious conceptualisation of the general response of flow and transport within the Chalk unsaturated zone was formulated. This paper summarises the modelling and data findings of these explorations, and describes the integration of the new simplified unsaturated zone representation with a catchment-scale model of nutrients (INCA), resulting in a new model for catchment-scale flow and transport within Chalk systems: INCA-Chalk. This model is applied to the Lambourn, and results, including hindcast and forecast simulations, are presented. These clearly illustrate the decadal time-scales that need to be considered in the context of nutrient management and the EU Water Framework Directive.
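
The sketch below is not the INCA-Chalk or dual-permeability formulation itself, but a toy two-store analogue of the idea it encodes: most recharge moves slowly through the matrix, with a faster fracture pathway that is activated only when the matrix is sufficiently wet. All thresholds and rate constants are invented for illustration.

```python
import numpy as np

def dual_store_step(matrix, fracture, rain, dt=1.0,
                    k_matrix=0.002, k_fracture=0.2, k_exchange=0.01,
                    spill_threshold=50.0):
    """One daily step of a toy dual-permeability store (storages in mm).

    Rainfall enters the slow matrix store; when matrix storage exceeds a
    threshold, part of the excess is routed to the fracture store, which
    drains much faster -- a crude analogue of fracture flow switching on
    under wet conditions.
    """
    matrix += rain * dt
    exchange = k_exchange * max(matrix - spill_threshold, 0.0)
    matrix -= exchange * dt
    fracture += exchange * dt
    drainage = (k_matrix * matrix + k_fracture * fracture) * dt
    matrix -= k_matrix * matrix * dt
    fracture -= k_fracture * fracture * dt
    return matrix, fracture, drainage

# Run the toy model over a few years of synthetic daily rainfall
rng = np.random.default_rng(3)
matrix, fracture = 40.0, 0.0
recharge = []
for day in range(3 * 365):
    rain = rng.exponential(2.0) if rng.random() < 0.4 else 0.0
    matrix, fracture, drainage = dual_store_step(matrix, fracture, rain)
    recharge.append(drainage)
print(f"mean drainage to the water table: {np.mean(recharge):.2f} mm/day")
```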

Relevance: 100.00%

Abstract:

Locality to other nodes on a peer-to-peer overlay network can be established by means of a set of landmarks shared among the participating nodes. Each node independently collects a set of latency measures to landmark nodes, which are used as a multi-dimensional feature vector. Each peer node uses the feature vector to generate a unique scalar index which is correlated with its topological locality. A popular dimensionality reduction technique is the space-filling Hilbert curve, as it possesses good locality-preserving properties. However, there exists little comparison between the Hilbert curve and other techniques for dimensionality reduction. This work carries out a quantitative analysis of their properties. Linear and non-linear techniques for scaling the landmark vectors to a single dimension are investigated. The Hilbert curve, Sammon's mapping and Principal Component Analysis have been used to generate a 1-d space with locality-preserving properties. This work provides empirical evidence to support the use of the Hilbert curve in the context of locality preservation when generating peer identifiers by means of landmark vector analysis. A comparative analysis is carried out with an artificial 2-d network model and with a realistic network topology model with a typical power-law distribution of node connectivity in the Internet. Nearest-neighbour analysis confirms the Hilbert curve to be very effective in both artificial and realistic network topologies. Nevertheless, the results in the realistic network model show that there is scope for improvement, and better techniques to preserve locality information are required.
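
As a concrete illustration of the landmark-vector idea, the sketch below maps a two-landmark latency vector onto a Hilbert-curve index using the standard iterative (x, y)-to-index algorithm; peers with similar latency vectors receive nearby indices. The restriction to two landmarks, the grid size and the latency cap are simplifying assumptions, and this is not the paper's implementation (which also compares Sammon's mapping and PCA).

```python
def xy2d(n, x, y):
    """Hilbert-curve index of cell (x, y) on an n x n grid (n a power of two).

    Standard iterative algorithm: descend from the coarsest quadrant,
    rotating/reflecting the frame at each level so the curve is self-similar.
    """
    d = 0
    s = n // 2
    while s > 0:
        rx = 1 if (x & s) > 0 else 0
        ry = 1 if (y & s) > 0 else 0
        d += s * s * ((3 * rx) ^ ry)
        if ry == 0:                      # rotate/reflect the quadrant
            if rx == 1:
                x = n - 1 - x
                y = n - 1 - y
            x, y = y, x
        s //= 2
    return d

def peer_index(latencies_ms, grid=256, max_latency=500.0):
    """Map a 2-landmark latency vector to a scalar, locality-preserving index."""
    cells = [min(int(l / max_latency * (grid - 1)), grid - 1) for l in latencies_ms]
    return xy2d(grid, cells[0], cells[1])

# Two peers with similar latencies to both landmarks get close indices,
# while a topologically distant peer gets a very different one.
print(peer_index([40.0, 120.0]), peer_index([45.0, 118.0]), peer_index([300.0, 20.0]))
```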

Relevance: 100.00%

Abstract:

The ECMWF ensemble weather forecasts are generated by perturbing the initial conditions of the forecast using a subset of the singular vectors of the linearised propagator. Previous results show that, when creating probabilistic forecasts from this ensemble, better forecasts are obtained if the mean of the spread and the variability of the spread are calibrated separately. We show results from a simple linear model that suggest that this may be a generic property of all singular-vector-based ensemble forecasting systems that use only a subset of the full set of singular vectors.
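
A toy version of the calibration idea (entirely synthetic, not the ECMWF system): model the predicted error standard deviation as a + b * (spread - mean spread), so that the mean level of the spread (a) and its variability (b) are calibrated separately, and fit (a, b) to past ensemble-mean errors by maximum likelihood.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)

# Hypothetical past forecast cases: raw ensemble spread and ensemble-mean error.
n = 1000
raw_spread = rng.gamma(shape=4.0, scale=0.5, size=n)            # raw spread s_t
true_spread = 1.0 + 0.4 * (raw_spread - raw_spread.mean())      # what the spread "should" be
error = rng.normal(0.0, true_spread)                            # ensemble-mean error

def neg_log_lik(params):
    """Calibrate the mean level (a) and the variability (b) of the spread
    separately: sigma_t = a + b * (s_t - mean(s))."""
    a, b = params
    sigma = a + b * (raw_spread - raw_spread.mean())
    if np.any(sigma <= 0):
        return np.inf
    return np.sum(0.5 * np.log(2 * np.pi * sigma**2) + error**2 / (2 * sigma**2))

a_hat, b_hat = minimize(neg_log_lik, x0=[raw_spread.mean(), 1.0], method="Nelder-Mead").x
print(f"calibrated mean spread: {a_hat:.2f}, spread-variability factor: {b_hat:.2f}")
```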

Relevance: 100.00%

Abstract:

The continuing importance of blue denim maintains indigo as an industrially important vat dye. In this review, we examine the various methods that have been used in the past, and are currently used, to reduce and dissolve indigo for dyeing. We discuss recent insights into bacterial fermentation technology, the advantages and disadvantages of the direct chemical methods that have predominated for the last century, and the potentially cleaner technologies of catalytic hydrogenation and electrochemistry, which are becoming increasingly important. With considerations of environmental impact high on the dyeing industry's agenda, we also discuss the developments that have led to the production of pre-reduced indigo.