139 results for Processing steps
in CentAUR: Central Archive at the University of Reading - UK
Abstract:
Transient episodes of synchronisation of neuronal activity in particular frequency ranges are thought to underlie cognition. Empirical mode decomposition phase locking (EMDPL) analysis is a method for determining the frequency and timing of phase synchrony that is adaptive to intrinsic oscillations within data, alleviating the need for arbitrary bandpass filter cut-off selection. It is extended here to address the choice of reference electrode and the removal of spurious synchrony resulting from volume conduction. Spline Laplacian transformation and independent component analysis (ICA) are performed as preprocessing steps, and preservation of phase synchrony between synthetic signals, combined using a simple forward model, is demonstrated. The method is contrasted with the use of bandpass filtering following the same preprocessing steps, and filter cut-offs are shown to influence synchrony detection markedly. Furthermore, an approach to the assessment of multiple EEG trials using the method is introduced, and the assessment of statistical significance of phase locking episodes is extended to render it adaptive to local phase synchrony levels. EMDPL is validated in the analysis of real EEG data recorded during finger tapping. The time course of event-related (de)synchronisation (ERD/ERS) is shown to differ from that of longer range phase locking episodes, implying different roles for these different types of synchronisation. It is suggested that the increase in phase locking which occurs just prior to movement, coinciding with a reduction in power (or ERD), may result from selection of the neural assembly relevant to the particular movement. (C) 2009 Elsevier B.V. All rights reserved.
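The core of the EMD-plus-Hilbert-phase approach can be illustrated in a few lines. The sketch below, which assumes the third-party PyEMD package (EMD-signal) and SciPy, decomposes two synthetic signals into intrinsic mode functions and computes a phase-locking value (PLV) for each matched IMF pair; it is only a minimal illustration of the general technique, not the authors' EMDPL implementation, and the sampling rate, signals and IMF pairing are invented for the example.

    # Minimal sketch: phase-locking value (PLV) between matched IMFs of two signals.
    # Assumes the PyEMD package (pip install EMD-signal) and SciPy; illustrative only.
    import numpy as np
    from scipy.signal import hilbert
    from PyEMD import EMD

    fs = 250.0                                   # assumed sampling rate (Hz)
    t = np.arange(0, 4, 1.0 / fs)
    # Two synthetic signals sharing a 10 Hz component with a fixed phase lag
    x = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)
    y = np.sin(2 * np.pi * 10 * t + 0.8) + 0.3 * np.random.randn(t.size)

    def imf_phases(sig):
        """Decompose a signal into IMFs and return each IMF's instantaneous phase."""
        imfs = EMD().emd(sig)
        return np.angle(hilbert(imfs, axis=1))

    phx, phy = imf_phases(x), imf_phases(y)

    # PLV per matched IMF pair: 1 = perfect phase locking, 0 = no consistent relation
    for k in range(min(phx.shape[0], phy.shape[0])):
        plv = np.abs(np.mean(np.exp(1j * (phx[k] - phy[k]))))
        print(f"IMF {k + 1}: PLV = {plv:.2f}")

The study additionally applies Laplacian and ICA preprocessing and assesses the statistical significance of individual locking episodes against local synchrony levels; none of that is reproduced here.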
Abstract:
Distillers’ Dried Grains with Solubles (DDGS) is the major by-product of bioethanol and distillery plants. Owing to its high content of proteins, water-soluble vitamins and minerals, DDGS has long been marketed as animal feed for livestock. EU legislation on liquid biofuels could raise the demand for bioethanol production in Europe, with a resulting increase in DDGS availability. DDGS contains a spectrum of complex organic macromolecules, particularly polysaccharides, in addition to proteins and vitamins, and its use as a starting raw material within a biomass-based biorefining strategy could lead to the development of multi-stream processes for the production of commodities, platform molecules or speciality chemicals, with concomitant economic benefits and waste reduction for bioethanol plants. The present review aims to outline the compositional characteristics of DDGS and evaluate its potential utilisation as a starting material for the production of added-value products. Parameters influencing the chemical and physical characteristics of DDGS are discussed. Moreover, various pre-treatment strategies are outlined in terms of efficient DDGS fractionation into several added-value streams. Additional processing steps for the production of medium- and high-added-value compounds from DDGS are evaluated, and their potential applications in the food and chemical industry sectors are identified.
Abstract:
BACKGROUND: There is increasing interest in obtaining natural products with bioactive properties using fermentation technology. However, the downstream processing, consisting of multiple steps, can be complicated, increasing the final cost of the product. There is therefore a need for integrated, cost-effective and scalable separation processes. RESULTS: The present study investigates the use of colloidal gas aphrons (CGA), which are surfactant-stabilized microbubbles, as a novel method for downstream processing. More particularly, their application for the recovery of astaxanthin from the cells of Phaffia rhodozyma is explored. Research carried out with standard solutions of astaxanthin and CGA generated from the cationic surfactant hexadecyl trimethyl ammonium bromide (CTAB) showed that up to 90% recovery can be achieved under optimum conditions, i.e. pH 11 with NaOH 0.2 mol L-1. In the case of the cell suspension from the fermentation broth, three different approaches were investigated: (a) the conventional integrated approach, where CGA were applied directly; (b) CGA applied to the clarified suspension of cells; and finally (c) the in situ approach, where CGA were generated within the clarified suspension of cells. Interestingly, in the case of the whole suspension (approach a) the highest recoveries (78%) were achieved under the same conditions found to be optimal for the standard solutions. In addition, up to 97% recovery of total carotenoids could be achieved from the clarified suspension after pretreatment with NaOH. This pretreatment led to maximum cell disruption as well as optimum conditioning for subsequent CGA separation. CONCLUSIONS: These results demonstrate the potential of CGA for the recovery of bioactive components from complex feedstocks. (c) 2008 Society of Chemical Industry.
Abstract:
Anthropogenic emissions of heat and exhaust gases play an important role in the atmospheric boundary layer, altering air quality, greenhouse gas concentrations and the transport of heat and moisture at various scales. This is particularly evident in urban areas, where emission sources are integrated in the highly heterogeneous urban canopy layer and directly linked to human activities which exhibit significant temporal variability. It is common practice to use eddy covariance observations to estimate turbulent surface fluxes of latent heat, sensible heat and carbon dioxide, which can be attributed to a local-scale source area. This study provides a method to assess the influence of micro-scale anthropogenic emissions on heat, moisture and carbon dioxide exchange in a highly urbanised environment for two sites in central London, UK. A new algorithm for the Identification of Micro-scale Anthropogenic Sources (IMAS) is presented, with two aims. Firstly, IMAS filters out the influence of micro-scale emissions and allows for the analysis of the turbulent fluxes representative of the local-scale source area. Secondly, it is used to give a first-order estimate of the anthropogenic heat flux and carbon dioxide flux representative of the building scale. The algorithm is evaluated using directional and temporal analysis, and is then applied at a second site which was not used in its development. The spatial and temporal local-scale patterns, as well as the micro-scale fluxes, appear physically reasonable and can be incorporated in the analysis of long-term eddy covariance measurements at the sites in central London. In addition to the new IMAS technique, further steps in the quality control and quality assurance used for the flux processing are presented. The methods and results have implications for urban flux measurements in dense urbanised settings with significant sources of heat and greenhouse gases.
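As a rough illustration of the kind of filtering such an algorithm performs, the sketch below flags 30-minute eddy covariance intervals whose wind direction and CO2 flux suggest domination by a micro-scale source. The wind sector, flux threshold and column names are hypothetical; this is not the IMAS algorithm itself, only an illustration of direction-based flagging under assumed values.

    # Direction-based flagging of eddy-covariance intervals likely dominated by a
    # micro-scale source (e.g. a nearby exhaust vent). Sector bounds, the flux
    # threshold and column names are hypothetical; this is not the IMAS algorithm.
    import pandas as pd

    def flag_microscale(df, sector=(140.0, 200.0), co2_threshold=25.0):
        """Flag 30-min intervals whose wind direction falls inside a suspect sector
        and whose CO2 flux (umol m-2 s-1) is unusually high for the local scale."""
        out = df.copy()
        out["microscale_flag"] = df["wind_dir"].between(*sector) & (df["fc_co2"] > co2_threshold)
        return out

    # Example 30-min records: wind direction (degrees) and CO2 flux (umol m-2 s-1)
    obs = pd.DataFrame({
        "wind_dir": [155.0, 310.0, 170.0, 45.0],
        "fc_co2":   [38.0,  12.0,   9.0, 11.0],
    })
    flagged = flag_microscale(obs)
    # Local-scale fluxes come from the unflagged rows; the flagged rows give a crude
    # first-order estimate of the building-scale (micro-scale) contribution.
    print(flagged)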
Abstract:
Background: Expression microarrays are increasingly used to obtain large-scale transcriptomic information on a wide range of biological samples. Nevertheless, there is still much debate on the best ways to process data, to design experiments and to analyse the output. Furthermore, many of the more sophisticated mathematical approaches to data analysis in the literature remain inaccessible to much of the biological research community. In this study we examine ways of extracting and analysing a large data set obtained using the Agilent long oligonucleotide transcriptomics platform, applied to a set of human macrophage and dendritic cell samples. Results: We describe and validate a series of data extraction, transformation and normalisation steps which are implemented via a new R function. Analysis of replicate normalised reference data demonstrates that intra-array variability is small (only around 2% of the mean log signal), while inter-array variability from replicate array measurements has a standard deviation (SD) of around 0.5 log(2) units (6% of the mean). The common practice of working with ratios of Cy5/Cy3 signal offers little further improvement in terms of reducing error. Comparison to expression data obtained using Arabidopsis samples demonstrates that the large number of genes in each sample showing a low level of transcription reflects the real complexity of the cellular transcriptome. Multidimensional scaling is used to show that the processed data identify an underlying structure which reflects some of the key biological variables defining the data set. This structure is robust, allowing reliable comparison of samples collected over a number of years by a variety of operators. Conclusions: This study outlines a robust and easily implemented pipeline for extracting, transforming, normalising and visualising transcriptomic array data from the Agilent expression platform. The analysis is used to obtain quantitative estimates of the SD arising from experimental (non-biological) intra- and inter-array variability, and a lower threshold for determining whether an individual gene is expressed. The study provides a reliable basis for further, more extensive studies of the systems biology of eukaryotic cells.
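The authors implement their extraction, transformation and normalisation pipeline as an R function; the NumPy sketch below only illustrates the generic flavour of such steps (log2 transform, per-array median centring, and estimation of inter-array SD from replicate arrays) on simulated numbers. Every value in it, including the noise levels, is invented for illustration and is not taken from the study.

    # Generic flavour of array pre-processing on simulated data: log2 transform,
    # per-array median centring, and inter-array SD from replicate arrays. All
    # numbers are invented; the authors' actual pipeline is an R function.
    import numpy as np

    rng = np.random.default_rng(0)
    true_expr = rng.lognormal(mean=6.0, sigma=1.5, size=1000)              # 1000 "genes"
    raw = true_expr * rng.lognormal(mean=0.0, sigma=0.35, size=(3, 1000))  # 3 replicate arrays

    log_signal = np.log2(raw)
    # Median centring each array removes overall intensity differences between arrays
    normalised = log_signal - np.median(log_signal, axis=1, keepdims=True)

    # Inter-array variability: SD across replicate arrays, gene by gene, then averaged
    inter_sd = normalised.std(axis=0, ddof=1).mean()
    print(f"mean inter-array SD: {inter_sd:.2f} log2 units")
    # A lower "expressed" threshold can then be set a few SDs above background signal.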
Abstract:
An association between interpretation of ambiguity and anxiety may exist in children, but findings have been equivocal. The present research utilized the Interpretation Generation Questionnaire for Children (IGQ-C), a novel measure that breaks down the processing of ambiguity into three steps: the generation of possible interpretations, the selection of the most likely interpretation and the anticipated emotional response to the ambiguous situation. The IGQ-C was completed by 103 children aged 11–12 years, 28 of whom had a clinical anxiety disorder. There was some evidence for an association between anxiety and: (1) the generation of initial negative interpretations; (2) the generation of a greater number of negative interpretations overall; and (3) the selection of negative responses. These findings were not consistent across measures of anxiety. A more convincing association was found between child anxiety and anticipated emotional response to the ambiguous scenarios, with anxious children anticipating more negative emotion.
Abstract:
Derivational morphological processes allow us to create new words (e.g. the verb (V) punish to the noun (N) punishment) from base forms. The number of steps from the basic units to derived words often varies (e.g., nationality
Abstract:
The photochemical evolution of an anthropogenic plume from the New York/Boston region during its transport at low altitudes over the North Atlantic to the European west coast has been studied using a Lagrangian framework. This plume, originally strongly polluted, was sampled by research aircraft just off the North American east coast on 3 successive days, and again 3 days downwind off the west coast of Ireland, where another aircraft re-sampled the by then weakly polluted plume. Changes in trace gas concentrations during transport were reproduced using a photochemical trajectory model including deposition and mixing effects. Chemical and wet deposition processing dominated the evolution of all pollutants in the plume. The mean net O3 production was evaluated to be -5 ppbv/day, leading to low values of O3 by the time the plume reached Europe. Wet deposition of nitric acid was responsible for an 80% reduction in this O3 production. If the plume had not encountered precipitation, it would have reached Europe with O3 levels up to 80-90 ppbv and CO levels between 120 and 140 ppbv. Photochemical destruction also played a more important role than mixing in the evolution of plume CO, due to high levels of both O3 and water vapour, showing that CO cannot always be used as a tracer for polluted air masses, especially for plumes transported at low altitudes. The results also show that, in this case, an important increase in the O3/CO slope can be attributed to chemical destruction of CO and not to photochemical O3 production, as is often assumed.
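A back-of-the-envelope integration shows how a modest net tendency accumulates over a multi-day crossing. The sketch below uses the abstract's mean net O3 production of -5 ppbv/day, but the starting mixing ratio and the six-day transport time are assumptions made purely for illustration and are not taken from the study.

    # Back-of-the-envelope integration of the plume's ozone budget. The mean net O3
    # production of -5 ppbv/day is quoted in the abstract; the initial mixing ratio
    # and the six-day transport time are assumptions made for illustration only.
    initial_o3 = 80.0       # ppbv, assumed near the North American coast
    net_tendency = -5.0     # ppbv/day, net O3 production reported for the wet case
    days = 6                # assumed trans-Atlantic transport time

    o3 = initial_o3
    for day in range(1, days + 1):
        o3 += net_tendency
        print(f"day {day}: O3 ~ {o3:.0f} ppbv")
    # Without precipitation (no wet removal of HNO3), the net tendency would have been
    # much less negative and the plume would have arrived with O3 near 80-90 ppbv.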
Abstract:
We construct a mapping from complex recursive linguistic data structures to spherical wave functions using Smolensky's filler/role bindings and tensor product representations. Syntactic language processing is then described by the transient evolution of these spherical patterns whose amplitudes are governed by nonlinear order parameter equations. Implications of the model in terms of brain wave dynamics are indicated.
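Smolensky's filler/role binding itself is easy to demonstrate: each constituent is the outer (tensor) product of a filler vector and a role vector, and the whole structure is the superposition of the bindings. The NumPy sketch below uses random fillers and orthonormal roles so that unbinding is exact; the mapping onto spherical wave functions and the order-parameter dynamics described in the abstract are not reproduced.

    # Smolensky-style filler/role binding via tensor (outer) products. Fillers are
    # random vectors, roles are orthonormal so unbinding is exact; this illustrates
    # the representation only, not the paper's spherical wave-function mapping.
    import numpy as np

    rng = np.random.default_rng(1)
    dim_filler = 8

    fillers = {w: rng.standard_normal(dim_filler) for w in ["the", "cat", "sleeps"]}
    roles = dict(zip(["det", "subj", "verb"], np.eye(3)))   # orthonormal role vectors

    # Bind each filler to its role and superpose: T = sum_i f_i (outer product) r_i
    bindings = [("the", "det"), ("cat", "subj"), ("sleeps", "verb")]
    T = sum(np.outer(fillers[w], roles[r]) for w, r in bindings)

    # Unbinding: with orthonormal roles, T @ r_i recovers the filler bound to role i
    recovered = T @ roles["subj"]
    print(np.allclose(recovered, fillers["cat"]))   # True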
Abstract:
In recent years there has been an increasing awareness of the radiological impact of non-nuclear industries that extract and/or process ores and minerals containing naturally occurring radioactive material (NORM). These industrial activities may result in significant radioactive contamination of (by-)products, wastes and plant installations. In this study, scale samples were collected from a decommissioned phosphoric acid processing plant. To determine the nature and concentration of NORM retained in pipe-work and associated process plant, four main areas of the site were investigated: (1) the 'Green Acid Plant', where crude acid was concentrated; (2) the green acid storage tanks; (3) the Purified White Acid (PWA) plant, where inorganic impurities were removed; and (4) the solid waste, disposed of on-site as landfill. The scale samples predominantly comprise fluorides (e.g. ralstonite), calcium sulphate (e.g. gypsum), and an assemblage of mixed fluorides and phosphates (e.g. iron fluoride hydrate, calcium phosphate), respectively. The radioactive inventory is dominated by U-238 and its decay chain products, and significant fractionation along the series occurs. Compared to the feedstock ore, elevated concentrations (<= 8.8 Bq/g) of U-238 were found to be retained in installations where the process stream was rich in fluorides and phosphates. In addition, enriched levels (<= 11 Bq/g) of Ra-226 were found in association with precipitates of calcium sulphate. Water extraction tests indicate that many of the scales and wastes contain significantly soluble material and readily release radioactivity into solution. (c) 2005 Elsevier Ltd. All rights reserved.
Abstract:
This paper reports three experiments that examine the role of similarity processing in McGeorge and Burton's (1990) incidental learning task. In the experiments, subjects performed a distractor task involving four-digit number strings, all of which conformed to a simple hidden rule. They were then given a forced-choice memory test in which they were presented with pairs of strings and were led to believe that one string of each pair had appeared in the prior learning phase. Although this was not the case, one string of each pair did conform to the hidden rule. Experiment 1 showed that, as in the McGeorge and Burton study, subjects were significantly more likely to select test strings that conformed to the hidden rule. However, additional analyses suggested that rather than having implicitly abstracted the rule, subjects may have been selecting strings that were in some way similar to those seen during the learning phase. Experiments 2 and 3 were designed to try to separate out effects due to similarity from those due to implicit rule abstraction. It was found that the results were more consistent with a similarity-based model than with implicit rule abstraction per se.
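The contrast at issue can be made concrete with two toy decision strategies for the forced-choice test: pick the string that satisfies a hidden rule, or pick the string closest to the studied strings under some similarity measure. In the sketch below both the hidden rule (an even digit sum) and the similarity measure (minimum digit-wise Hamming distance) are invented for illustration and are not those of McGeorge and Burton (1990).

    # Two toy choosers for the forced-choice test over four-digit strings: one picks
    # the string satisfying a hidden rule, the other the string most similar to the
    # studied set. Rule and similarity measure are invented for illustration and are
    # not those used by McGeorge and Burton (1990).
    studied = ["4823", "1675", "9041", "3266"]

    def follows_rule(s):
        return sum(int(c) for c in s) % 2 == 0          # hypothetical hidden rule

    def similarity(s, pool):
        # Higher is more similar: negated minimum digit-wise (Hamming) distance
        return -min(sum(a != b for a, b in zip(s, t)) for t in pool)

    pair = ("4825", "4871")   # neither string was actually studied
    print(max(pair, key=follows_rule))                        # rule-based choice -> "4871"
    print(max(pair, key=lambda s: similarity(s, studied)))    # similarity-based -> "4825"

On such items the two strategies can favour different strings, which is the kind of dissociation Experiments 2 and 3 were designed to probe.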