22 results for Billings Petroleum

in CentAUR: Central Archive University of Reading - UK


Relevance:

20.00%

Publisher:

Abstract:

Rhizoremediation is a bioremediation technique whereby enhanced microbial degradation of organic contaminants occurs within the plant root zone (rhizosphere). It is considered an effective and affordable ‘green technology’ for remediating soils contaminated with petroleum hydrocarbons (PHCs). This paper critically reviews the potential role of root exuded compounds in rhizoremediation, with emphasis on commonly exuded low molecular weight aliphatic organic acid anions (carboxylates). The extent to which remediation is achieved shows wide disparity among plant species. Therefore, plant selection is crucial for the advancement and widespread adoption of this technology. Root exudation is speculated to be one of the predominant factors leading to microbial changes in the rhizosphere and thus the potential driver behind enhanced petroleum biodegradation. Carboxylates can form a significant component of the root exudate mixture and are hypothesised to enhance petroleum biodegradation by: i) providing an easily degradable energy source; ii) increasing phosphorus supply; and/or iii) enhancing the contaminant bioavailability. These differing hypotheses, which are not mutually exclusive, require further investigation to progress our understanding of plant–microbe interactions with the aim to improve plant species selection and the efficacy of rhizoremediation.

Relevance:

10.00%

Publisher:

Abstract:

Lime treatment of hydrocarbon-contaminated soils offers the potential to stabilize and solidify these materials, with a consequent reduction in the risks associated with the leachate emanating from them. This can aid the disposal of contaminated soils or enable their on-site treatment. In this study, the addition of hydrated lime and quicklime significantly reduced the leaching of total petroleum hydrocarbons (TPH) from soils polluted with a 50:50 petrol/diesel mixture. Treatment with quicklime was slightly more effective, but hydrated lime may be better in the field because of its ease of handling. It is proposed that this occurs as a consequence of pozzolanic reactions retaining the hydrocarbons within the soil matrix. There was some evidence that this may be a temporary effect, as leaching increased between seven and 21 days after treatment, but the TPH concentrations in the leachate of treated soils were still one order of magnitude below those of the control soil, offering significant protection to groundwater. The reduction in leaching following treatment was observed in both aliphatic and aromatic fractions, but the latter were more affected because of their higher solubility. The results are discussed in the context of risk assessment, and recommendations for future research are made.

Relevance:

10.00%

Publisher:

Abstract:

General circulation models (GCMs) use the laws of physics and an understanding of past geography to simulate climatic responses. They are objective in character. However, they tend to require powerful computers to handle vast numbers of calculations. Nevertheless, it is now possible to compare results from different GCMs for a range of times and over a wide range of parameterisations for the past, present and future (e.g. in terms of predictions of surface air temperature, surface moisture, precipitation, etc.). GCMs are currently producing simulated climate predictions for the Mesozoic, which compare favourably with the distributions of climatically sensitive facies (e.g. coals, evaporites and palaeosols). They can be used effectively in the prediction of oceanic upwelling sites and the distribution of petroleum source rocks and phosphorites. Models also produce evaluations of other parameters that do not leave a geological record (e.g. cloud cover, snow cover) and equivocal phenomena such as storminess. Parameterisation of sub-grid scale processes is the main weakness in GCMs (e.g. land surfaces, convection, cloud behaviour) and model output for continental interiors is still too cold in winter by comparison with palaeontological data. The sedimentary and palaeontological record provides an important way that GCMs may themselves be evaluated and this is important because the same GCMs are being used currently to predict possible changes in future climate. The Mesozoic Earth was, by comparison with the present, an alien world, as we illustrate here by reference to late Triassic, late Jurassic and late Cretaceous simulations. Dense forests grew close to both poles but experienced months-long daylight in warm summers and months-long darkness in cold snowy winters. Ocean depths were warm (8 degrees C or more to the ocean floor) and reefs, with corals, grew 10 degrees of latitude further north and south than at the present time. The whole Earth was warmer than now by 6 degrees C or more, giving more atmospheric humidity and a greatly enhanced hydrological cycle. Much of the rainfall was predominantly convective in character, often focused over the oceans and leaving major desert expanses on the continental areas. Polar ice sheets are unlikely to have been present because of the high summer temperatures achieved. The model indicates extensive sea ice in the nearly enclosed Arctic seaway through a large portion of the year during the late Cretaceous, and the possibility of sea ice in adjacent parts of the Midwest Seaway over North America. The Triassic world was a predominantly warm world, the model output for evaporation and precipitation conforming well with the known distributions of evaporites, calcretes and other climatically sensitive facies for that time. The message from the geological record is clear. Through the Phanerozoic, Earth's climate has changed significantly, both on a variety of time scales and over a range of climatic states, usually baldly referred to as "greenhouse" and "icehouse", although these terms disguise more subtle states between these extremes. Any notion that the climate can remain constant for the convenience of one species of anthropoid is a delusion (although the recent rate of climatic change is exceptional).

Relevance:

10.00%

Publisher:

Abstract:

This article critically examines the challenges that come with implementing the Extractive Industries Transparency Initiative (EITI), a policy mechanism marketed by donors and Western governments as a key to facilitating economic improvement in resource-rich developing countries, in sub-Saharan Africa. The forces behind the EITI contend that impoverished institutions, the embezzlement of petroleum and/or mineral revenues, and a lack of transparency are the chief reasons why resource-rich sub-Saharan Africa is underperforming economically, and that implementation of the EITI, with its foundation of good governance, will help address these problems. The position here, however, is that the task is by no means straightforward: the EITI is not necessarily a blueprint for facilitating good governance in the region's resource-rich countries. It is concluded that the EITI is a policy mechanism that could prove to be effective with significant institutional change in host African countries but, on its own, it is incapable of reducing corruption and mobilizing citizens to hold government officials accountable for hoarding profits from extractive industry operations.

Relevance:

10.00%

Publisher:

Abstract:

A new parameter-estimation algorithm, which minimises the cross-validated prediction error for linear-in-the-parameter models, is proposed, based on stacked regression and an evolutionary algorithm. It is initially shown that cross-validation is very important for prediction in linear-in-the-parameter models using a criterion called the mean dispersion error (MDE). Stacked regression, which can be regarded as a sophisticated type of cross-validation, is then introduced based on an evolutionary algorithm, to produce a new parameter-estimation algorithm, which preserves the parsimony of a concise model structure that is determined using the forward orthogonal least-squares (OLS) algorithm. The PRESS prediction errors are used for cross-validation, and the sunspot and Canadian lynx time series are used to demonstrate the new algorithms.
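As background to the criterion used above, the PRESS (leave-one-out) prediction errors of a linear-in-the-parameters model can be computed without refitting the model for each deleted point, using the hat-matrix identity. The sketch below is a generic NumPy illustration on invented polynomial data, not the paper's stacked-regression or evolutionary algorithm; the function name and example data are assumptions made for the illustration.

```python
import numpy as np

def press_errors(Phi, y):
    """Leave-one-out (PRESS) residuals for a linear-in-the-parameters
    model y ~ Phi @ theta, via the hat-matrix shortcut
    e_loo_i = e_i / (1 - h_ii)."""
    theta, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    residuals = y - Phi @ theta
    # Diagonal of the hat matrix H = Phi (Phi^T Phi)^-1 Phi^T
    H_diag = np.einsum('ij,ji->i', Phi, np.linalg.pinv(Phi))
    return residuals / (1.0 - H_diag)

# Toy example: a 3-term polynomial model fitted to noisy data
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 50)
y = 1.0 + 2.0 * x - 0.5 * x**2 + 0.1 * rng.standard_normal(50)
Phi = np.column_stack([np.ones_like(x), x, x**2])
print("PRESS statistic:", np.sum(press_errors(Phi, y) ** 2))
```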

Relevance:

10.00%

Publisher:

Abstract:

A predictability index was defined as the ratio of the variance of the optimal prediction to the variance of the original time series by Granger and Anderson (1976) and Bhansali (1989). A new simplified algorithm for estimating the predictability index is introduced and the new estimator is shown to be a simple and effective tool in applications of predictability ranking and as an aid in the preliminary analysis of time series. The relationship between the predictability index and the position of the poles and lag p of a time series which can be modelled as an AR(p) model are also investigated. The effectiveness of the algorithm is demonstrated using numerical examples including an application to stock prices.
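The index itself is simple to estimate once a model for the optimal predictor is chosen. The sketch below assumes an AR(p) predictor fitted by least squares and takes the ratio of the one-step prediction variance to the series variance on a synthetic AR(2) example; it illustrates the definition only and is not the paper's simplified algorithm.

```python
import numpy as np

def predictability_index(y, p):
    """Ratio of the variance of the one-step AR(p) prediction to the
    variance of the series (near 1 = highly predictable, near 0 = close
    to white noise)."""
    y = np.asarray(y, dtype=float) - np.mean(y)
    # Lagged regressor matrix for an AR(p) model
    X = np.column_stack([y[p - k - 1:len(y) - k - 1] for k in range(p)])
    target = y[p:]
    coeffs, *_ = np.linalg.lstsq(X, target, rcond=None)
    y_hat = X @ coeffs
    return np.var(y_hat) / np.var(target)

# Example: a stationary AR(2) process is far more predictable than white noise
rng = np.random.default_rng(1)
e = rng.standard_normal(2000)
y = np.zeros(2000)
for t in range(2, 2000):
    y[t] = 1.5 * y[t - 1] - 0.75 * y[t - 2] + e[t]
print(predictability_index(y, 2), predictability_index(e, 2))
```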

Relevance:

10.00%

Publisher:

Abstract:

A new structure of Radial Basis Function (RBF) neural network called the Dual-orthogonal RBF Network (DRBF) is introduced for nonlinear time series prediction. The hidden nodes of a conventional RBF network compute the Euclidean distance between the network input vector and the centres, and the node responses are radially symmetrical. But in time series prediction, where the system input vectors are lagged system outputs, which are usually highly correlated, the Euclidean distance measure may not be appropriate. The DRBF network modifies the distance metric by introducing a classification function which is based on the estimation data set. Training the DRBF network consists of two stages: learning the classification-related basis functions and the important input nodes, followed by selecting the regressors and learning the weights of the hidden nodes. In both cases, a forward Orthogonal Least Squares (OLS) selection procedure is applied, initially to select the important input nodes and then to select the important centres. Simulation results of single-step and multi-step ahead predictions over a test data set are included to demonstrate the effectiveness of the new approach.
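To illustrate why the metric matters (this is not the DRBF construction itself), the sketch below contrasts a conventional Euclidean RBF node response with one using a data-driven, Mahalanobis-style weighting, which merely stands in for the classification-based metric described above; the weighting matrix, widths and data are assumptions made for the example.

```python
import numpy as np

def rbf_response_euclidean(x, centre, width):
    """Conventional RBF hidden-node response, radially symmetric in
    the Euclidean metric."""
    return np.exp(-np.sum((x - centre) ** 2) / (2.0 * width ** 2))

def rbf_response_weighted(x, centre, width, M):
    """RBF response under a data-driven metric M (Mahalanobis-style
    weighting used here as a stand-in for a modified distance metric)."""
    d = x - centre
    return np.exp(-(d @ M @ d) / (2.0 * width ** 2))

# Lagged-output input vectors of a time series are highly correlated, so
# weighting the metric by the inverse covariance de-emphasises the shared
# direction of variation.
rng = np.random.default_rng(2)
lagged = rng.multivariate_normal([0, 0], [[1.0, 0.95], [0.95, 1.0]], 500)
M = np.linalg.inv(np.cov(lagged.T))
x, centre = lagged[0], lagged.mean(axis=0)
print(rbf_response_euclidean(x, centre, 1.0),
      rbf_response_weighted(x, centre, 1.0, M))
```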

Relevance:

10.00%

Publisher:

Abstract:

A fast backward elimination algorithm is introduced based on a QR decomposition and Givens transformations to prune radial-basis-function networks. Nodes are sequentially removed using an increment of error variance criterion. The procedure is terminated by using a prediction risk criterion so as to obtain a model structure with good generalisation properties. The algorithm can be used to postprocess radial basis centres selected using a k-means routine and, in this mode, it provides a hybrid supervised centre selection approach.
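For intuition only, the sketch below implements the selection logic, removing at each step the regressor (e.g. RBF node) whose deletion gives the smallest increase in error variance, using brute-force refits on invented data. The paper's contribution is obtaining this ranking cheaply via a QR decomposition updated with Givens rotations, which is not reproduced here, and its stopping rule is a prediction risk criterion rather than the fixed node count assumed below.

```python
import numpy as np

def backward_eliminate(Phi, y, n_keep):
    """Greedy backward elimination: at each step drop the column whose
    removal yields the smallest increase in residual (error) variance."""
    cols = list(range(Phi.shape[1]))

    def sse(active):
        theta, *_ = np.linalg.lstsq(Phi[:, active], y, rcond=None)
        r = y - Phi[:, active] @ theta
        return r @ r

    while len(cols) > n_keep:
        # Try deleting each remaining column; keep the cheapest deletion
        costs = [sse([c for c in cols if c != j]) for j in cols]
        cols.pop(int(np.argmin(costs)))
    return cols

# Toy example: 10 candidate regressors, only 3 of which are informative
rng = np.random.default_rng(3)
Phi = rng.standard_normal((200, 10))
y = 2 * Phi[:, 1] - Phi[:, 4] + 0.5 * Phi[:, 7] + 0.05 * rng.standard_normal(200)
print(sorted(backward_eliminate(Phi, y, 3)))  # expected: [1, 4, 7]
```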

Relevance:

10.00%

Publisher:

Abstract:

The development of biofuels has been one of the most visible and controversial manifestations of the use of biomass for energy. Biofuels policies in the EU, US and Brazil have been particularly important for the development of the industry in these three important markets. All three have used a variety of measures, including consumption or use mandates, tax incentives and import protection to promote the production and use of biofuels. Despite this, it is uncertain whether the EU will achieve its objective of a 10 per cent share for renewables in transport fuels by 2020. The US is also running into difficulties in meeting consumption mandates for biofuels. Questions are being raised about the continuation of tax credits and import protection. Brazil has liberalised its domestic ethanol market and adopted a more market-oriented approach to biofuels policy, but the management of domestic petroleum prices and the inter-relationship between the sugar market and ethanol production are important factors affecting domestic consumption and exports. In both the EU and the US an ongoing debate about the benefits of reliance on biofuels derived from food crops and concern about the efficacy of current biofuels policies may put their future in doubt.

Relevance:

10.00%

Publisher:

Abstract:

In the biomimetic design, two hydrophobic pentapeptides Boc-Ile-Aib-Leu-Phe-Ala-OMe (I) and Boc-Gly-Ile-Aib-Leu-Phe-OMe (II) (Aib: alpha-aminoisobutyric acid), each containing one Aib, are found to undergo solvent-assisted self-assembly in methanol/water to form vesicular structures, which can be disrupted by simple addition of acid. The nanovesicles are found to encapsulate dye molecules that can be released by the addition of acid, as confirmed by fluorescence microscopy and UV studies. The influence of solvent polarity on the morphology of the materials generated from the peptides has been examined systematically, and shows that fibrillar structures are formed in the less polar chloroform/petroleum ether mixture and vesicular structures are formed in the more polar methanol/water. Single crystal X-ray diffraction studies reveal that while beta-sheet mediated self-assembly leads to the formation of fibrillar structures, the solvated beta-sheet structure leads to the formation of vesicular structures. The results demonstrate that even hydrophobic peptides can generate vesicular structures from polar solvents, which may be employed in model studies of complex biological phenomena.

Relevance:

10.00%

Publisher:

Abstract:

We have developed a model of the local field potential (LFP) based on the conservation of charge, the independence principle of ionic flows and the classical Hodgkin–Huxley (HH) type intracellular model of synaptic activity. Insights were gained through the simulation of the HH intracellular model on the nonlinear relationship between the balance of synaptic conductances and that of post-synaptic currents. The latter is dependent not only on the former, but also on the temporal lag between the excitatory and inhibitory conductances, as well as the strength of the afferent signal. The proposed LFP model provides a method for decomposing the LFP recordings near the soma of layer IV pyramidal neurons in the barrel cortex of anaesthetised rats into two highly correlated components with opposite polarity. The temporal dynamics and the proportional balance of the two components are comparable to the excitatory and inhibitory post-synaptic currents computed from the HH model. This suggests that the two components of the LFP reflect the underlying excitatory and inhibitory post-synaptic currents of the local neural population. We further used the model to decompose a sequence of evoked LFP responses under repetitive electrical stimulation (5 Hz) of the whisker pad. We found that as neural responses adapted, the excitatory and inhibitory components also adapted proportionately, while the temporal lag between the onsets of the two components increased during frequency adaptation. Our results demonstrated that the balance between neural excitation and inhibition can be investigated using extracellular recordings. Extension of the model to incorporate multiple compartments should allow more quantitative interpretations of surface Electroencephalography (EEG) recordings into components reflecting the excitatory, inhibitory and passive ionic current flows generated by local neural populations.
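The conductance-to-current relation underlying such HH-type descriptions can be illustrated as follows. The alpha-function conductance transients, reversal potentials, membrane potential and 2 ms lag below are generic textbook assumptions, not the parameters or structure of the authors' LFP model; the sketch only shows how the balance and timing of excitatory and inhibitory conductances shape the post-synaptic currents.

```python
import numpy as np

# Illustrative HH-style post-synaptic currents from assumed excitatory and
# inhibitory conductance transients (not the authors' LFP model).
dt = 0.1e-3                            # 0.1 ms time step
t = np.arange(0, 0.1, dt)              # 100 ms window
E_e, E_i, V = 0e-3, -70e-3, -55e-3     # assumed reversal and membrane potentials (V)

def alpha(t, onset, tau, gmax):
    """Alpha-function conductance transient starting at `onset`."""
    s = np.clip(t - onset, 0, None)
    return gmax * (s / tau) * np.exp(1 - s / tau)

lag = 2e-3                                   # inhibition lags excitation by 2 ms (assumed)
g_e = alpha(t, 10e-3, 2e-3, 10e-9)           # excitatory conductance (S)
g_i = alpha(t, 10e-3 + lag, 5e-3, 20e-9)     # inhibitory conductance (S)

I_e = g_e * (V - E_e)   # excitatory post-synaptic current (A), negative = inward
I_i = g_i * (V - E_i)   # inhibitory post-synaptic current (A), positive = outward
print(f"peak I_e = {I_e.min():.2e} A, peak I_i = {I_i.max():.2e} A")
```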

Relevance:

10.00%

Publisher:

Abstract:

Neurovascular coupling in response to stimulation of the rat barrel cortex was investigated using concurrent multichannel electrophysiology and laser Doppler flowmetry. The data were used to build a linear dynamic model relating neural activity to blood flow. Local field potential time series were subject to current source density analysis, and the time series of a layer IV sink of the barrel cortex was used as the input to the model. The model output was the time series of the changes in regional cerebral blood flow (CBF). We show that this model can provide excellent fit of the CBF responses for stimulus durations of up to 16 s. The structure of the model consisted of two coupled components representing vascular dilation and constriction. The complex temporal characteristics of the CBF time series were reproduced by the relatively simple balance of these two components. We show that the impulse response obtained under the 16-s duration stimulation condition generalised to provide a good prediction to the data from the shorter duration stimulation conditions. Furthermore, by optimising three out of the total of nine model parameters, the variability in the data can be well accounted for over a wide range of stimulus conditions. By establishing linearity, classic system analysis methods can be used to generate and explore a range of equivalent model structures (e.g., feed-forward or feedback) to guide the experimental investigation of the control of vascular dilation and constriction following stimulation.
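A linear dynamic model of this kind can be written as a convolution of the neural input with an impulse response. In the sketch below the impulse response is built from two components standing in for vascular dilation and constriction, but their gamma-function shapes, delays and amplitudes are assumptions made for illustration, not the fitted model or its nine parameters.

```python
import numpy as np

# Illustrative linear (convolution) model of the CBF response to a neural input.
dt = 0.1
t = np.arange(0, 30, dt)

def gamma_kernel(t, delay, tau, n=3):
    """Smooth, delayed, unit-area kernel used as a generic component shape."""
    s = np.clip(t - delay, 0, None)
    g = (s / tau) ** n * np.exp(-s / tau)
    return g / g.sum()

# Impulse response = fast positive (dilation-like) minus slow negative
# (constriction-like) component; weights and time constants are assumed.
h = 1.2 * gamma_kernel(t, 0.5, 1.0) - 0.4 * gamma_kernel(t, 2.0, 3.0)

# Neural input: boxcar standing in for a 16 s stimulus train
u = ((t >= 1.0) & (t < 17.0)).astype(float)

cbf_change = np.convolve(u, h)[:len(t)]   # predicted fractional CBF change
print(f"peak predicted CBF change: {cbf_change.max():.3f}")
```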

Relevance:

10.00%

Publisher:

Abstract:

It is well known that there is a dynamic relationship between cerebral blood flow (CBF) and cerebral blood volume (CBV). With increasing applications of functional MRI, where the blood oxygen-level-dependent signals are recorded, the understanding and accurate modeling of the hemodynamic relationship between CBF and CBV becomes increasingly important. This study presents an empirical and data-based modeling framework for model identification from CBF and CBV experimental data. It is shown that the relationship between the changes in CBF and CBV can be described using a parsimonious autoregressive with exogenous input model structure. It is observed that neither the ordinary least-squares (LS) method nor the classical total least-squares (TLS) method can produce accurate estimates from the original noisy CBF and CBV data. A regularized total least-squares (RTLS) method is thus introduced and extended to solve such an error-in-the-variables problem. Quantitative results show that the RTLS method works very well on the noisy CBF and CBV data. Finally, a combination of RTLS with a filtering method can lead to a parsimonious but very effective model that can characterize the relationship between the changes in CBF and CBV.
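For orientation, the sketch below fits a low-order ARX model by ordinary least squares and by classical (unregularised) total least squares via the SVD on simulated noisy input-output data standing in for CBF and CBV changes; the regularised TLS step that the study introduces, and the model orders and noise levels used here, are not taken from the paper.

```python
import numpy as np

def build_arx(u, y, na=2, nb=2):
    """Regressor matrix for an ARX model
    y[t] = a1*y[t-1] + ... + a_na*y[t-na] + b1*u[t-1] + ... + b_nb*u[t-nb]."""
    p = max(na, nb)
    rows = [np.r_[y[t - na:t][::-1], u[t - nb:t][::-1]] for t in range(p, len(y))]
    return np.array(rows), y[p:]

def tls(A, b):
    """Classical total least squares via SVD (both A and b treated as noisy)."""
    _, _, Vt = np.linalg.svd(np.column_stack([A, b]))
    v = Vt[-1]                      # right singular vector of the smallest singular value
    return -v[:-1] / v[-1]

# Toy errors-in-variables example with a simulated "CBF" input and "CBV" output
rng = np.random.default_rng(4)
u = rng.standard_normal(500)
y = np.zeros(500)
for t in range(2, 500):
    y[t] = 0.6 * y[t - 1] - 0.1 * y[t - 2] + 0.3 * u[t - 1] + 0.1 * u[t - 2]
A, target = build_arx(u + 0.05 * rng.standard_normal(500),
                      y + 0.05 * rng.standard_normal(500))
print("LS :", np.linalg.lstsq(A, target, rcond=None)[0])
print("TLS:", tls(A, target))
```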

Relevance:

10.00%

Publisher:

Abstract:

We describe a mathematical model linking changes in cerebral blood flow, blood volume and the blood oxygenation state in response to stimulation. The model has three compartments to take into account the fact that the cerebral blood flow and volume as measured concurrently using laser Doppler flowmetry and optical imaging spectroscopy have contributions from the arterial, capillary as well as the venous compartments of the vasculature. It is an extension to previous one-compartment hemodynamic models which assume that the measured blood volume changes are from the venous compartment only. An important assumption of the model is that the tissue oxygen concentration is a time varying state variable of the system and is driven by the changes in metabolic demand resulting from changes in neural activity. The model takes into account the pre-capillary oxygen diffusion by flexibly allowing the saturation of the arterial compartment to be less than unity. Simulations are used to explore the sensitivity of the model and to optimise the parameters for experimental data. We conclude that the three-compartment model was better than the one-compartment model at capturing the hemodynamics of the response to changes in neural activation following stimulation.

Relevance:

10.00%

Publisher:

Abstract:

The temporal relationship between changes in cerebral blood flow (CBF) and cerebral blood volume (CBV) is important in the biophysical modeling and interpretation of the hemodynamic response to activation, particularly in the context of magnetic resonance imaging and the blood oxygen level-dependent signal. Grubb et al. (1974) measured the steady state relationship between changes in CBV and CBF after hypercapnic challenge. The relationship CBV ∝ CBF^Φ has been used extensively in the literature. Two similar models, the Balloon (Buxton et al., 1998) and the Windkessel (Mandeville et al., 1999), have been proposed to describe the temporal dynamics of changes in CBV with respect to changes in CBF. In this study, a dynamic model extending the Windkessel model by incorporating delayed compliance is presented. The extended model is better able to capture the dynamics of CBV changes after changes in CBF, particularly in the return-to-baseline stages of the response.
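The steady-state Grubb relation, and the general idea that the volume response lags the flow response, can be sketched as below. The exponent value of 0.38 is the commonly quoted Grubb estimate (fitted values vary between studies), and the first-order relaxation toward the steady state is a crude assumption made for illustration; it is not the extended Windkessel model with delayed compliance presented in the study.

```python
import numpy as np

PHI = 0.38   # commonly quoted Grubb steady-state exponent; study-dependent

def cbv_steady_state(cbf):
    """Grubb relation: normalised CBV = (normalised CBF) ** PHI."""
    return cbf ** PHI

def cbv_dynamic(cbf, dt=0.1, tau=5.0):
    """Crude stand-in for a lagged volume response: CBV relaxes toward the
    Grubb steady-state value with an assumed time constant tau."""
    v = np.ones_like(cbf)
    for t in range(1, len(cbf)):
        v[t] = v[t - 1] + dt / tau * (cbv_steady_state(cbf[t]) - v[t - 1])
    return v

# Step increase of CBF to 1.5 x baseline for 20 s, then back to baseline
t = np.arange(0, 60, 0.1)
cbf = np.where((t >= 10) & (t < 30), 1.5, 1.0)
v = cbv_dynamic(cbf)
print(f"steady-state CBV ratio: {cbv_steady_state(1.5):.3f}, peak dynamic CBV: {v.max():.3f}")
```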