64 results for Trajectory optimisation
Abstract:
The total phenols, apigenin 7-glucoside, turbidity and colour of extracts from dried chamomile flowers were studied with a view to developing chamomile extracts with potential anti-inflammatory properties for incorporation into beverages. The extraction of all constituents followed pseudo first-order kinetics. In general, the rate constant (k) increased as the temperature increased from 57 to 100 °C. The turbidity only increased significantly between 90 and 100 °C. Therefore, aqueous chamomile extracts had maximum total phenol concentration and minimum turbidity when extracted at 90 °C for 20 min. The effect of drying conditions on chamomile extracted under these conditions was determined. A significant reduction in phenol concentration, from 19.7 ± 0.5 mg/g GAE in fresh chamomile to 13 ± 1 mg/g GAE, was found only in the plant material oven-dried at 80 °C (p ⩽ 0.05). The biggest colour change was between fresh chamomile and that oven-dried at 80 °C, followed by air-dried samples. There was no significant difference in colour between freeze-dried material and that oven-dried at 40 °C.
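The pseudo first-order kinetics mentioned above can be sketched numerically; the saturation concentration and rate constant below are illustrative placeholders, not values reported in the study.

```python
import math

def extract_concentration(c_inf, k, t):
    """Pseudo first-order extraction: C(t) = C_inf * (1 - exp(-k * t))."""
    return c_inf * (1.0 - math.exp(-k * t))

# Illustrative placeholders (not values from the study): saturation
# concentration c_inf in mg/g GAE and rate constant k in 1/min.
c_inf, k = 19.7, 0.15
for t in (5, 10, 20):
    print(f"t = {t:2d} min: C = {extract_concentration(c_inf, k, t):.1f} mg/g")
```

The concentration rises toward c_inf, and a larger k (higher temperature) reaches the plateau sooner, which is why a fixed 20 min hold suffices at 90 °C.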
Abstract:
In estimating the inputs to the Modern Portfolio Theory (MPT) portfolio optimisation problem, it is usual to use equally weighted historic data. Equal weighting of the data, however, does not take account of the current state of the market. Consequently, this approach is unlikely to perform well in any subsequent period, as the data still reflect market conditions that are no longer valid. A return-weighting scheme that gives greater weight to the most recent data would thus seem desirable. This study therefore uses returns data weighted towards the most recent observations to see whether such a weighting scheme can offer improved ex-ante performance over that based on unweighted data.
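A minimal sketch of such a return-weighting scheme, assuming a simple exponential decay parameterised by half-life (the study's exact weighting is not specified here):

```python
import numpy as np

def exp_weighted_stats(returns, halflife):
    """Exponentially weighted mean and covariance of a T x N matrix of
    returns, giving greater weight to the most recent rows."""
    T = returns.shape[0]
    decay = 0.5 ** (1.0 / halflife)
    weights = decay ** np.arange(T - 1, -1, -1)   # newest row gets weight 1
    weights /= weights.sum()
    mu = weights @ returns
    centred = returns - mu
    cov = (centred * weights[:, None]).T @ centred
    return mu, cov

# Hypothetical monthly returns for two assets, newest observation last.
rets = np.array([[0.010, 0.020], [0.030, -0.010], [0.020, 0.000]])
mu, cov = exp_weighted_stats(rets, halflife=12)
```

The resulting mu and cov would replace the equally weighted estimates as inputs to the MPT optimiser; as halflife grows, the scheme converges back to equal weighting.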
Abstract:
A focused library of potential hydrogelators, each containing two substituted aromatic residues separated by a urea or thiourea linkage, has been synthesised and characterised. Six of these novel compounds are highly efficient hydrogelators, forming gels in aqueous solution at low concentrations (0.03–0.60 wt %). Gels were formed through a pH-switching methodology, by acidification of a basic solution (pH 14 to ≈4) either by addition of HCl or via the slow hydrolysis of glucono-δ-lactone. Frequently, gelation was accompanied by a dramatic switch in the absorption spectra of the gelators, resulting in a significant change in colour, typically from a vibrant orange to pale yellow. Each of the gels was capable of sequestering significant quantities of the aromatic cationic dye methylene blue from aqueous solution (up to 1.02 g of dye per gram of dry gelator). Cryo-transmission electron microscopy of two of the gels revealed an extensive network of high-aspect-ratio fibers. The structure of the fibers altered dramatically upon addition of 20 wt % of the dye, resulting in aggregation and significant shortening of the fibrils. This study demonstrates the feasibility of these novel gels finding application as inexpensive and effective water purification platforms.
Abstract:
A two-phase system composed of a leach bed and a methanogenic reactor was modified for the first time to improve volumetric substrate degradation and methane yields from a complex substrate (maize; Zea mays). The system, which was operated for consecutive feed cycles of different durations over 120 days, was highly flexible and its performance improved by altering operational conditions. Daily substrate degradation was higher the shorter the feed cycle, reaching 8.5 g TS destroyed d⁻¹ (7-day feed cycle), but overall substrate degradation was higher by up to 55% when longer feed cycles (14 and 28 days) were applied. The same occurred with volumetric methane yields, which reached 0.839 m³ m⁻³ d⁻¹. The system outperformed others on specific methane yields, reaching 0.434 m³ kg⁻¹ TS added in the 14-day and 28-day systems. The UASB and AF designs performed similarly as second-stage reactors in terms of methane yields and SCOD and VFA removal efficiencies.
Abstract:
This paper provides a selective review of literature on fair trade and introduces contributions to this Policy Arena. It focuses on policy practice as a dynamic process, highlighting the changing configurations of actors, policy spaces, knowledge, practices and commodities that are shaping the policy trajectory of fair trade. It highlights how recent literature has tackled questions of mainstreaming as part of this trajectory, bringing to the fore dimensions of change associated with the market, state and civil society.
Abstract:
Context: Emotion regulation is critically disrupted in depression, and paradigms tapping these processes may uncover essential changes in neurobiology during treatment. In addition, as neuroimaging outcome studies of depression commonly utilize only baseline and endpoint data – which are more prone to week-to-week noise in symptomatology – we sought to use all data points over the course of a six-month trial. Objective: To examine changes in neurobiology resulting from successful treatment. Design: Double-blind trial examining changes in the neural circuits involved in emotion regulation resulting from one of two antidepressant treatments over a six-month trial. Participants were scanned pretreatment and at 2 and 6 months post-treatment. Setting: University functional magnetic resonance imaging facility. Participants: 21 patients with Major Depressive Disorder and without other Axis I or Axis II diagnoses, and 14 healthy controls. Interventions: Venlafaxine XR (doses up to 300 mg) or fluoxetine (doses up to 80 mg). Main Outcome Measure: Neural activity, measured using functional magnetic resonance imaging during performance of an emotion regulation paradigm, together with regular assessments of symptom severity on the Hamilton Rating Scale for Depression. To utilize all data points, slope trajectories were calculated for the rate of change in depression severity and the rate of change of neural engagement. Results: The depressed individuals showing the steepest decrease in depression severity over the six months were those showing the most rapid increases in BA10 and right DLPFC activity when regulating negative affect over the same time frame. This relationship was more robust than that obtained using only the baseline and endpoint data. Conclusions: Changes in PFC engagement when regulating negative affect correlate with changes in depression severity over six months. These results are buttressed by the slope statistics, which are more reliable and robust to week-to-week variation than difference scores.
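The slope-trajectory statistic described above is, at its core, a per-subject least-squares slope over the assessment points; a minimal sketch, with hypothetical HRSD scores (not study data):

```python
def slope(times, values):
    """Least-squares slope of values against times: the per-subject
    rate-of-change statistic used in place of a difference score."""
    n = len(times)
    mt = sum(times) / n
    mv = sum(values) / n
    num = sum((t - mt) * (v - mv) for t, v in zip(times, values))
    den = sum((t - mt) ** 2 for t in times)
    return num / den

# Hypothetical HRSD scores at 0, 2 and 6 months:
months = [0, 2, 6]
hrsd = [24, 18, 8]
print(slope(months, hrsd))  # negative slope indicates improvement
```

Because every time point contributes to the fit, a single noisy assessment shifts the slope far less than it would shift a baseline-minus-endpoint difference score.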
Abstract:
A Lagrangian model of photochemistry and mixing is described (CiTTyCAT, stemming from the Cambridge Tropospheric Trajectory model of Chemistry And Transport), which is suitable for transport and chemistry studies throughout the troposphere. Over the last five years, the model has been developed in parallel at several different institutions and here those developments have been incorporated into one "community" model and documented for the first time. The key photochemical developments include a new scheme for biogenic volatile organic compounds and updated emissions schemes. The key physical development is to evolve composition following an ensemble of trajectories within neighbouring air-masses, including a simple scheme for mixing between them via an evolving "background profile", both within the boundary layer and free troposphere. The model runs along trajectories pre-calculated using winds and temperature from meteorological analyses. In addition, boundary layer height and precipitation rates, output from the analysis model, are interpolated to trajectory points and used as inputs to the mixing and wet deposition schemes. The model is most suitable in regimes when the effects of small-scale turbulent mixing are slow relative to advection by the resolved winds so that coherent air-masses form with distinct composition and strong gradients between them. Such air-masses can persist for many days while stretching, folding and thinning. Lagrangian models offer a useful framework for picking apart the processes of air-mass evolution over inter-continental distances, without being hindered by the numerical diffusion inherent to global Eulerian models. The model, including different box and trajectory modes, is described and some output for each of the modes is presented for evaluation. The model is available for download from a Subversion-controlled repository by contacting the corresponding authors.
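The mixing toward an evolving "background profile" can be sketched as relaxation of an air-mass mixing ratio toward the background on a fixed timescale; the tracer values and timescale below are hypothetical, not CiTTyCAT's actual scheme.

```python
import math

def mix_step(c_air, c_bg, tau, dt):
    """Relax an air-mass mixing ratio toward the background profile over
    one step; exact solution of dC/dt = -(C - C_bg) / tau."""
    return c_bg + (c_air - c_bg) * math.exp(-dt / tau)

# Hypothetical: an 80 ppbv O3 air mass mixing into a 40 ppbv background
# with a 48 h mixing timescale, stepped every 6 h for two days.
c = 80.0
for _ in range(8):
    c = mix_step(c, 40.0, tau=48.0, dt=6.0)
print(round(c, 1))  # → 54.7
```

A long tau corresponds to the regime the model targets: slow small-scale mixing relative to advection, so air-mass gradients persist for days.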
Abstract:
During long-range transport, many distinct processes – including photochemistry, deposition, emissions and mixing – contribute to the transformation of air mass composition. Partitioning the effects of different processes can be useful when considering the sensitivity of chemical transformation to, for example, a changing environment or anthropogenic influence. However, transformation is not observed directly, since mixing ratios are measured, and models must be used to relate changes to processes. Here, four cases from the ITCT-Lagrangian 2004 experiment are studied. In each case, aircraft intercepted a distinct air mass several times during transport over the North Atlantic, providing a unique dataset and quantifying the net changes in composition from all processes. A new framework is presented to deconstruct the change in O3 mixing ratio (Δ O3) into its component processes, which were not measured directly, taking into account the uncertainty in measurements, initial air mass variability and its time evolution. The results show that the net chemical processing (Δ O3chem) over the whole simulation is greater than net physical processing (Δ O3phys) in all cases. This is in part explained by cancellation effects associated with mixing. In contrast, each case is in a regime of either net photochemical destruction (lower tropospheric transport) or production (an upper tropospheric biomass burning case). However, physical processes influence O3 indirectly through addition or removal of precursor gases, so that changes to physical parameters in a model can have a larger effect on Δ O3chem than Δ O3phys. Despite its smaller magnitude, the physical processing distinguishes the lower tropospheric export cases, since the net photochemical O3 change is −5 ppbv per day in all three cases. 
Processing is quantified using a Lagrangian photochemical model with a novel method for simulating mixing through an ensemble of trajectories and a background profile that evolves with them. The model is able to simulate the magnitude and variability of the observations (of O3, CO, NOy and some hydrocarbons) and is consistent with the time-average OH following air-masses inferred from hydrocarbon measurements alone (by Arnold et al., 2007). Therefore, it is a useful new method to simulate air mass evolution and variability, and its sensitivity to process parameters.
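The bookkeeping behind the ΔO3 deconstruction — net change as the sum of integrated chemical and physical tendencies — can be sketched as follows; the hourly rates are hypothetical, chosen so that large mixing terms mostly cancel, as described above.

```python
def deconstruct_delta_o3(chem_rates, phys_rates, dt):
    """Integrate chemical and physical O3 tendencies along a trajectory,
    so that the net change is delta_chem + delta_phys."""
    d_chem = sum(r * dt for r in chem_rates)
    d_phys = sum(r * dt for r in phys_rates)
    return d_chem, d_phys, d_chem + d_phys

# Hypothetical hourly tendencies (ppbv/h) over one day of transport:
# steady photochemical destruction versus mixing terms that largely cancel.
chem = [-0.21] * 24
phys = [0.5, -0.45, 0.4, -0.5] * 6
d_chem, d_phys, d_total = deconstruct_delta_o3(chem, phys, dt=1.0)
```

Here the individual mixing terms are larger than any chemical term, yet their integral is small — illustrating how ΔO3chem can dominate ΔO3phys through cancellation.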
Abstract:
Methods for recombinant production of eukaryotic membrane proteins that yield sufficient quantity and quality of protein for structural biology remain a challenge. We describe here the optimisation of expression and purification of the human SERCA2a cardiac isoform of the Ca2+-translocating ATPase, using Saccharomyces cerevisiae as the heterologous expression system of choice. Two different expression vectors were utilised, allowing expression of C-terminal fusion proteins with a biotinylation domain or a GFP-His8 tag. Solubilised membrane fractions containing the protein of interest were purified on Streptavidin-Sepharose, Ni-NTA or Talon resin, depending on the fusion tag present. Biotinylated protein was detected using a specific antibody directed against SERCA2 and, advantageously, the GFP-His8 fusion protein was easily traced during the purification steps using in-gel fluorescence. Importantly, Talon resin affinity purification proved more specific than Ni-NTA resin for the GFP-His8-tagged protein, providing better separation of the oligomers present during size exclusion chromatography. The optimised method for expression and purification of human cardiac SERCA2a reported herein yields purified protein (>90% purity) that displays calcium-dependent, thapsigargin-sensitive activity and is suitable for further biophysical, structural and physiological studies. This work supports the use of Saccharomyces cerevisiae as a suitable expression system for recombinant production of multi-domain eukaryotic membrane proteins.
Abstract:
A novel two-stage construction algorithm for linear-in-the-parameters classifiers is proposed, aimed at noisy two-class classification problems. The purpose of the first stage is to produce a prefiltered signal that is used as the desired output for the second stage, which constructs a sparse linear-in-the-parameters classifier. For the first-stage learning that generates the prefiltered signal, a two-level algorithm is introduced to maximise the model's generalisation capability: an elastic net model identification algorithm using singular value decomposition is employed at the lower level, while the two regularisation parameters are selected by maximising the Bayesian evidence using a particle swarm optimisation algorithm. Analysis is provided to demonstrate how "Occam's razor" is embodied in this approach. The second stage of sparse classifier construction is based on orthogonal forward regression with the D-optimality algorithm. Extensive experimental results demonstrate that the proposed approach is effective and yields competitive results on noisy data sets.
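A generic coordinate-descent elastic net illustrates the first-stage penalised model identification; this is a standard formulation, not the authors' SVD-based algorithm, and the penalty values below are arbitrary.

```python
import numpy as np

def elastic_net(X, y, lam1, lam2, n_iter=200):
    """Coordinate-descent elastic net: minimise
    ||y - X w||^2 + lam1 * ||w||_1 + lam2 * ||w||^2."""
    n, p = X.shape
    w = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ w + X[:, j] * w[j]          # residual excluding w_j
            rho = X[:, j] @ r
            # Soft-threshold update from the subgradient conditions.
            w[j] = np.sign(rho) * max(abs(rho) - lam1 / 2, 0.0) / (col_sq[j] + lam2)
    return w

# Arbitrary synthetic example: sparse true weights, noiseless targets.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
w_true = np.zeros(10)
w_true[:3] = [2.0, -1.5, 1.0]
w_hat = elastic_net(X, X @ w_true, lam1=0.1, lam2=0.01)
```

The L1 term drives small coefficients exactly to zero (the sparsity behind "Occam's razor"), while the L2 term stabilises correlated regressors; the upper-level search over (lam1, lam2) is what the paper delegates to Bayesian evidence and particle swarm optimisation.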
Abstract:
It has long been known that the path (trajectory) taken by the eye to land on a target is rarely straight (Yarbus, 1967). Furthermore, the magnitude and direction of this natural tendency for curvature can be modulated by the presence of a competing distractor stimulus presented along with the saccade target. The distractor-related modulation of saccade trajectories provides a subtle measure of the underlying competitive processes involved in saccade target selection. Here we review some of our own studies into the effects distractors have on saccade trajectories, which can be regarded as a way of probing the competitive balance between target and distractor salience.
Abstract:
Energy storage is a potential alternative to conventional network reinforcement of the low voltage (LV) distribution network to ensure the grid’s infrastructure remains within its operating constraints. This paper presents a study on the control of such storage devices, owned by distribution network operators. A deterministic model predictive control (MPC) controller and a stochastic receding horizon controller (SRHC) are presented, where the objective is to achieve the greatest peak reduction in demand, for a given storage device specification, taking into account the high level of uncertainty in the prediction of LV demand. The algorithms presented in this paper are compared to a standard set-point controller and benchmarked against a control algorithm with a perfect forecast. A specific case study, using storage on the LV network, is presented, and the results of each algorithm are compared. A comprehensive analysis is then carried out simulating a large number of LV networks of varying numbers of households. The results show that the performance of each algorithm is dependent on the number of aggregated households. However, on a typical aggregation, the novel SRHC algorithm presented in this paper is shown to outperform each of the comparable storage control techniques.
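The peak-reduction objective can be sketched as a single-shot deterministic schedule — find the lowest peak level a storage device can enforce over a demand forecast. This is a simplification of the MPC/SRHC controllers described above, not their implementation; units and limits are hypothetical.

```python
def peak_shave_schedule(forecast, energy, power):
    """Bisect on the lowest demand peak (kW) that a storage device with the
    given energy (kWh) and power (kW) limits can enforce over the forecast
    (kW per 1 h slot), then return the discharge schedule that clips to it."""
    lo, hi = 0.0, max(forecast)
    for _ in range(60):
        mid = (lo + hi) / 2.0
        needed = sum(min(max(d - mid, 0.0), power) for d in forecast)
        feasible = needed <= energy and all(d - mid <= power for d in forecast)
        if feasible:
            hi = mid          # can shave deeper
        else:
            lo = mid
    discharge = [min(max(d - hi, 0.0), power) for d in forecast]
    return hi, discharge

# Hypothetical aggregated household demand (kW) over five hourly slots.
level, discharge = peak_shave_schedule([2.0, 4.0, 6.0, 4.0, 2.0],
                                       energy=3.0, power=3.0)
print(round(level, 2))  # → 3.67
```

A receding horizon controller would re-solve this at every step with an updated forecast and state of charge; the stochastic variant additionally hedges against forecast error rather than trusting a single demand trajectory.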
Abstract:
Information technology has become heavily embedded in business operations. As business needs change over time, IT applications are expected to continue providing the required support. Whether existing IT applications remain fit for the business purpose for which they were intended, or new IT applications should be introduced, is a strategic decision for business, IT and business-aligned IT. In this paper, we present a method that aims to analyse business functions and IT roles, and to evaluate business-aligned IT from both social and technical perspectives. The method introduces a set of techniques that systematically supports the evaluation of existing IT applications in relation to their technical capabilities for maximising business value. Furthermore, we discuss the evaluation process and results, which are illustrated and validated through a real-life case study of a UK borough council, followed by a discussion of implications for researchers and practitioners.
Abstract:
Dynamic soundtracking presents various practical and aesthetic challenges to composers working with games. This paper presents an implementation of a system addressing some of these challenges with an affectively driven music generation algorithm based on a second-order Markov model. The system can respond in real time to emotional trajectories derived from two dimensions of affect in the circumplex model (arousal and valence), which are mapped to five musical parameters. A transition matrix is employed to vary the generated output in continuous response to the affective state intended by the gameplay.
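A second-order Markov model of the kind described conditions each generated state on the previous two; a minimal sketch with a hypothetical pitch sequence (the affective mapping to five musical parameters is omitted):

```python
import random

def build_transition_matrix(sequence):
    """Second-order Markov table: each pair of consecutive states maps to
    the list of states that followed that pair in the training sequence."""
    table = {}
    for a, b, c in zip(sequence, sequence[1:], sequence[2:]):
        table.setdefault((a, b), []).append(c)
    return table

def generate(table, seed, length, rng=None):
    """Generate up to `length` states, conditioning on the last two."""
    rng = rng or random.Random(0)
    out = list(seed)
    while len(out) < length:
        choices = table.get((out[-2], out[-1]))
        if not choices:
            break                      # pair never seen: stop generating
        out.append(rng.choice(choices))
    return out

# Hypothetical pitch sequence; in the paper's system, arousal/valence would
# reweight `choices` rather than sampling uniformly.
notes = ["C", "E", "G", "E", "C", "E", "G", "C"]
melody = generate(build_transition_matrix(notes), ("C", "E"), 8)
```

Storing successors as repeated list entries makes the sample frequencies match the training data; continuous affective control can then be layered on by biasing those frequencies toward states matching the current arousal/valence target.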