63 results for Optimisation of methods
in CentAUR: Central Archive University of Reading - UK
Abstract:
Methods for recombinant production of eukaryotic membrane proteins that yield sufficient quantity and quality of protein for structural biology remain a challenge. We describe here the optimisation of expression and purification of the human SERCA2a cardiac isoform of Ca2+-translocating ATPase, using Saccharomyces cerevisiae as the heterologous expression system of choice. Two different expression vectors were utilised, allowing expression of C-terminal fusion proteins with a biotinylation domain or a GFP-His8 tag. Solubilised membrane fractions containing the protein of interest were purified on Streptavidin-Sepharose, Ni-NTA or Talon resin, depending on the fusion tag present. Biotinylated protein was detected using a specific antibody directed against SERCA2 and, advantageously, the GFP-His8 fusion protein was easily traced during the purification steps using in-gel fluorescence. Importantly, Talon resin affinity purification proved more specific than Ni-NTA resin for the GFP-His8-tagged protein, providing better separation of the oligomers present during size-exclusion chromatography. The optimised method for expression and purification of human cardiac SERCA2a reported herein yields purified protein (>90%) that displays calcium-dependent, thapsigargin-sensitive activity and is suitable for further biophysical, structural and physiological studies. This work supports the use of Saccharomyces cerevisiae as a suitable expression system for recombinant production of multi-domain eukaryotic membrane proteins.
Abstract:
Since the advent of the internet in everyday life in the 1990s, the barriers to producing, distributing and consuming multimedia data such as videos, music and ebooks have steadily been lowered for most computer users, so that almost everyone with internet access can join the online communities that produce, consume and share media artefacts. Along with this trend, violations of personal data privacy and copyright have increased, with illegal file sharing rampant across many online communities, particularly for certain music genres and amongst younger age groups. This has had a devastating effect on the traditional media distribution market, in most cases leaving distribution companies and content owners with huge financial losses. To prove that a copyright violation has occurred, one can deploy fingerprinting mechanisms to uniquely identify the property; however, current approaches are uni-modal only. In this paper we describe some of the design challenges and architectural approaches to multi-modal fingerprinting currently being examined for evaluation studies within a PhD research programme on the optimisation of multi-modal fingerprinting architectures. Accordingly, we outline the available modalities being integrated through this research programme, which aims to establish the optimal architecture for multi-modal media security protection over the internet as the online distribution environment for both legal and illegal distribution of media products.
Abstract:
The formulation of a new process-based crop model, the general large-area model (GLAM) for annual crops, is presented. The model has been designed to operate on spatial scales commensurate with those of global and regional climate models. It aims to simulate the impact of climate on crop yield. Procedures for model parameter determination and optimisation are described, and demonstrated for the prediction of groundnut (i.e. peanut; Arachis hypogaea L.) yields across India for the period 1966-1989. Optimal parameters (e.g. extinction coefficient, transpiration efficiency, rate of change of harvest index) were stable over space and time, provided the estimate of the yield technology trend was based on the full 24-year period. The model has two location-specific parameters: the planting date and the yield gap parameter. The latter varies spatially and is determined by calibration. The optimal value varies slightly when different input data are used. The model was tested using a historical data set on a 2.5° × 2.5° grid to simulate yields. Three sites are examined in detail: grid cells from Gujarat in the west, Andhra Pradesh towards the south and Uttar Pradesh in the north. Agreement between observed and modelled yield was variable, with correlation coefficients of 0.74, 0.42 and 0, respectively. Skill was highest where the climate signal was greatest, and correlations were comparable to or greater than correlations with seasonal mean rainfall. Yields from all 35 cells were aggregated to simulate all-India yield. The correlation coefficient between observed and simulated yields was 0.76, and the root mean square error was 8.4% of the mean yield. The model can easily be extended to any annual crop for the investigation of the impacts of climate variability (or change) on crop yield over large areas. © 2004 Elsevier B.V. All rights reserved.
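As an illustration of the skill measures quoted for the all-India series (the correlation coefficient and the root mean square error expressed as a percentage of the mean yield), the following minimal sketch computes both from a pair of observed and simulated yield series; the arrays are illustrative placeholders, not the GLAM or India data.

```python
# Minimal sketch: Pearson correlation and RMSE as a percentage of the mean
# observed yield, the two skill metrics reported in the abstract.
# The yield values below are made up for illustration only.
import numpy as np

def yield_skill(observed: np.ndarray, simulated: np.ndarray) -> tuple[float, float]:
    """Return (Pearson r, RMSE as % of mean observed yield)."""
    r = np.corrcoef(observed, simulated)[0, 1]
    rmse = np.sqrt(np.mean((observed - simulated) ** 2))
    return r, 100.0 * rmse / observed.mean()

# placeholder detrended yields (t ha^-1)
obs = np.array([1.02, 0.95, 1.10, 0.88, 1.05])
sim = np.array([1.00, 0.97, 1.06, 0.91, 1.08])
r, rmse_pct = yield_skill(obs, sim)
print(f"r = {r:.2f}, RMSE = {rmse_pct:.1f}% of mean yield")
```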
Abstract:
The influence of abiotic parameters on the isolation of protoplasts from in vitro seedling cotyledons of white lupin was investigated. The protoplasts were found to withstand a wide range of osmotic potentials of the enzyme medium; however, -2.25 MPa (0.5 M mannitol) gave the highest yield of protoplasts. The pH of the isolation medium also had a profound effect on protoplast production. Vacuum infiltration of the enzyme solution into the cotyledon tissue resulted in a progressive drop in the yield of protoplasts. The speed and duration of orbital agitation of the cotyledon tissue played a significant role in the release of protoplasts, and a two-step (stationary-gyratory) regime was found to be better than the gyratory-only system.
Abstract:
In a sequential clinical trial, accrual of data on patients often continues after the stopping criterion for the study has been met. This is termed “overrunning.” Overrunning occurs mainly when the primary response from each patient is measured after some extended observation period. The objective of this article is to compare two methods of allowing for overrunning. In particular, simulation studies are reported that assess the two procedures in terms of how well they maintain the intended type I error rate. The effect on power resulting from the incorporation of “overrunning data” using the two procedures is evaluated.
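As a rough illustration of the kind of simulation study described above (assessing how well a procedure maintains the intended type I error rate), the sketch below estimates the attained type I error of a simple two-sample z-test under the null hypothesis by Monte Carlo; it is a simplified, non-sequential stand-in, not a reimplementation of the two overrunning procedures compared in the article.

```python
# Minimal sketch: Monte Carlo estimate of an attained type I error rate.
# Both arms are simulated with no true treatment effect, so the proportion
# of rejections should be close to the nominal 5% level.
import numpy as np

rng = np.random.default_rng(42)
alpha, n_per_arm, n_sims = 0.05, 100, 20_000
z_crit = 1.959964  # two-sided 5% critical value
rejections = 0
for _ in range(n_sims):
    control = rng.normal(0.0, 1.0, n_per_arm)
    treated = rng.normal(0.0, 1.0, n_per_arm)  # null: no effect
    se = np.sqrt(control.var(ddof=1) / n_per_arm + treated.var(ddof=1) / n_per_arm)
    z = (treated.mean() - control.mean()) / se
    rejections += abs(z) > z_crit
print(f"attained type I error ≈ {rejections / n_sims:.3f} (nominal {alpha})")
```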
Abstract:
Background: Meta-analyses based on individual patient data (IPD) are regarded as the gold standard for systematic reviews. However, the methods used for analysing and presenting results from IPD meta-analyses have received little discussion. Methods: We review 44 IPD meta-analyses published during the years 1999–2001. We summarize whether they obtained all the data they sought, what types of approaches were used in the analysis, including assumptions of common or random effects, and how they examined the effects of covariates. Results: Twenty-four of the 44 analyses focused on time-to-event outcomes, and most analyses (28) estimated treatment effects within each trial and then combined the results assuming a common treatment effect across trials. Three analyses failed to stratify by trial, analysing the data as if they came from a single mega-trial. Only nine analyses used random effects methods. Covariate-treatment interactions were generally investigated by subgrouping patients. Seven of the meta-analyses included data from fewer than 80% of the randomized patients sought, but did not address the resulting potential biases. Conclusions: Although IPD meta-analyses have many advantages in assessing the effects of health care, there are several aspects that could be further developed to make fuller use of the potential of these time-consuming projects. In particular, IPD could be used to investigate more fully the influence of covariates on heterogeneity of treatment effects, both within and between trials. The impact of heterogeneity, or the use of random effects, is seldom discussed. There is thus considerable scope for enhancing the methods of analysis and presentation of IPD meta-analysis.
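A minimal sketch of the two-stage approach most of the reviewed analyses took (estimating a treatment effect within each trial and then combining under a common-effect assumption with inverse-variance weights); the per-trial estimates and standard errors below are illustrative placeholders.

```python
# Minimal sketch: inverse-variance (common-effect) pooling of per-trial
# treatment effect estimates, i.e. analysis stratified by trial.
import numpy as np

def fixed_effect_pool(estimates: np.ndarray, std_errors: np.ndarray):
    """Return the pooled estimate and its standard error."""
    w = 1.0 / std_errors ** 2
    pooled = np.sum(w * estimates) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    return pooled, pooled_se

# placeholder per-trial log hazard ratios and their standard errors
theta = np.array([-0.21, -0.08, -0.35, -0.15])
se = np.array([0.12, 0.09, 0.20, 0.11])
est, est_se = fixed_effect_pool(theta, se)
print(f"pooled log HR = {est:.3f} "
      f"(95% CI {est - 1.96 * est_se:.3f} to {est + 1.96 * est_se:.3f})")
```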
Abstract:
This paper considers methods for testing for superiority or non-inferiority in active-control trials with binary data, when the relative treatment effect is expressed as an odds ratio. Three asymptotic tests for the log-odds ratio based on the unconditional binary likelihood are presented, namely the likelihood ratio, Wald and score tests. All three tests can be implemented straightforwardly in standard statistical software packages, as can the corresponding confidence intervals. Simulations indicate that the three alternatives are similar in terms of Type I error, with rates close to the nominal level. However, when the non-inferiority margin becomes large, the Type I error rate of the score test slightly exceeds the nominal level. In general, the highest power is obtained from the score test, although all three tests are similar and the observed differences in power are not of practical importance. Copyright © 2007 John Wiley & Sons, Ltd.
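As a hedged sketch of one of the three tests, the snippet below computes a one-sided Wald statistic for the log-odds ratio against a non-inferiority margin; the counts, margin and direction of comparison are illustrative assumptions, and the score and likelihood-ratio variants are not shown.

```python
# Minimal sketch: Wald test for the log-odds ratio against a non-inferiority
# margin, using the usual large-sample standard error from a 2x2 table.
# All numbers are placeholders, not the paper's scenarios.
import math

def wald_noninferiority(x_t, n_t, x_c, n_c, or_margin):
    """One-sided Wald z-statistic for H0: OR <= or_margin vs H1: OR > or_margin."""
    log_or = math.log((x_t / (n_t - x_t)) / (x_c / (n_c - x_c)))
    se = math.sqrt(1 / x_t + 1 / (n_t - x_t) + 1 / x_c + 1 / (n_c - x_c))
    return (log_or - math.log(or_margin)) / se

z = wald_noninferiority(x_t=78, n_t=100, x_c=80, n_c=100, or_margin=0.5)
print(f"z = {z:.2f}  (reject H0 at the one-sided 2.5% level if z > 1.96)")
```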
Abstract:
One cubic centimetre potato cubes were blanched, sulfited, dried initially for between 40 and 80 min in air at 90 °C in a cabinet drier, puffed in a high-temperature fluidised bed and then dried for up to 180 min in a cabinet drier. The final moisture content was 0.05 (dry weight basis). The resulting product was optimised using response surface methodology, in terms of volume and colour (L*, a* and b* values) of the dry product, as well as rehydration ratio and texture of the rehydrated product. The operating conditions resulting in the optimised product were found to be blanching for 6 min in water at 100 °C, dipping in 400 ppm sodium metabisulfite solution for 10 min, initial drying for 40 min and puffing in air at 200 °C for 40 s, followed by final drying to a moisture content of 0.05 (dry weight basis). © 2003 Elsevier Ltd. All rights reserved.
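A minimal sketch of the response-surface idea used here: fit a second-order polynomial to a quality response measured over a small design in coded process factors, then search the fitted surface for the best operating point. The factors, design points and response values below are illustrative placeholders, not the study's data.

```python
# Minimal sketch: second-order response surface fit and optimum search
# over two coded factors (e.g. initial drying time, puffing time).
import numpy as np

# face-centred design: corner, axial and replicated centre points (coded units)
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
              [-1, 0], [1, 0], [0, -1], [0, 1],
              [0, 0], [0, 0]], dtype=float)
y = np.array([5.5, 6.3, 5.9, 6.7, 6.3, 7.1, 6.7, 7.1, 7.5, 7.4])  # e.g. rehydration ratio

def design_matrix(x):
    x1, x2 = x[:, 0], x[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)

# brute-force search of the fitted surface over the experimental region
grid = np.array([[a, b] for a in np.linspace(-1, 1, 41) for b in np.linspace(-1, 1, 41)])
pred = design_matrix(grid) @ beta
best = grid[np.argmax(pred)]
print(f"predicted optimum (coded units): x1 = {best[0]:.2f}, x2 = {best[1]:.2f}")
```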
Abstract:
We introduce and describe the Multiple Gravity Assist problem, a global optimisation problem that is of great interest in the design of spacecraft and their trajectories. We discuss its formalisation and we show, for one particular problem instance, the performance of selected state-of-the-art heuristic global optimisation algorithms. A deterministic search space pruning algorithm is then developed and its polynomial time and space complexity derived. The algorithm is shown to achieve search space reductions of more than six orders of magnitude, thus significantly reducing the complexity of the subsequent optimisation.
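As a rough sketch of how a heuristic global optimiser is applied to a box-bounded trajectory problem of this kind, the snippet below runs differential evolution over a placeholder decision vector (launch epoch and leg transfer times); the objective is a smooth multimodal stand-in, not a real MGA delta-v evaluation based on Lambert arcs.

```python
# Minimal sketch: heuristic global optimisation over a bounded decision vector.
# The bounds and cost function are illustrative stand-ins only.
import numpy as np
from scipy.optimize import differential_evolution

bounds = [(0.0, 1000.0), (50.0, 500.0), (50.0, 500.0)]  # days: launch epoch, two legs

def placeholder_cost(x: np.ndarray) -> float:
    # smooth multimodal surrogate for a delta-v objective
    return float(np.sum(np.sin(0.01 * x) ** 2) + 1e-4 * np.sum((x - 300.0) ** 2))

result = differential_evolution(placeholder_cost, bounds, seed=1, tol=1e-8)
print("best decision vector:", result.x, "cost:", result.fun)
```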
Abstract:
Based on integrated system optimisation and parameter estimation, a method is described for on-line steady-state optimisation which compensates for model-plant mismatch and solves a non-linear optimisation problem by iterating on a linear-quadratic representation. The method requires real process derivatives, which are estimated using a dynamic identification technique. The utility of the method is demonstrated using a simulation of the Tennessee Eastman benchmark chemical process.
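A very simplified sketch of the core idea behind integrated system optimisation and parameter estimation (ISOPE): at each iteration a modifier built from the mismatch between plant and model gradients corrects the model-based optimisation, and the set-point update is filtered. The one-variable plant, model and tuning values below are illustrative assumptions; in practice the plant gradient would come from the dynamic identification step rather than a known function.

```python
# Minimal sketch of an ISOPE-style iteration on a single decision variable u.
import numpy as np
from scipy.optimize import minimize

def model_profit_neg(u):        # simplified process model (negated for minimisation)
    return -(-(u - 2.0) ** 2 + 4.0)            # model optimum at u = 2.0

def plant_gradient(u):          # in practice estimated from plant measurements
    return -2.0 * (u - 2.6)                    # true plant optimum at u = 2.6

def model_gradient(u):
    return -2.0 * (u - 2.0)

u, gain = 1.0, 0.5
for _ in range(15):
    lam = plant_gradient(u) - model_gradient(u)           # gradient-mismatch modifier
    res = minimize(lambda v: model_profit_neg(v[0]) - lam * v[0], x0=[u])
    u = u + gain * (res.x[0] - u)                         # filtered set-point update
print(f"converged set-point u ≈ {u:.2f}")                 # approaches the plant optimum
```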
Abstract:
Sensitive methods that are currently used to monitor proteolysis by plasmin in milk are limited due to their high cost and lack of standardisation for quality assurance in the various dairy laboratories. In this study, four methods, trinitrobenzene sulphonic acid (TNBS), reverse phase high pressure liquid chromatography (RP-HPLC), gel electrophoresis and fluorescamine, were selected to assess their suitability for the detection of proteolysis in milk by plasmin. Commercial UHT milk was incubated with plasmin at 37 °C for one week. Clarification was achieved by isoelectric precipitation (pH 4·6 soluble extracts) or 6% (final concentration) trichloroacetic acid (TCA). The pH 4·6 and 6% TCA soluble extracts of milk showed high correlations (R² > 0·93) by the TNBS, fluorescamine and RP-HPLC methods, confirming increased proteolysis during storage. For gel electrophoresis, extensive proteolysis was confirmed by the disappearance of α- and β-casein bands on the seventh day, which was more evident at the highest plasmin concentration. This was accompanied by the appearance of α- and β-casein proteolysis products with higher intensities than on previous days, implying that more products had been formed as a result of casein breakdown. The fluorescamine method had a lower detection limit compared with the other methods, whereas gel electrophoresis was the best qualitative method for monitoring β-casein proteolysis products. Although HPLC was the most sensitive, the TNBS method is recommended for use in routine laboratory analysis on the basis of its accuracy, reliability and simplicity.
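As an illustration of the method-comparison statistic quoted above (the coefficient of determination between readings from two assays on the same extracts over storage), a minimal sketch follows; the assay values are illustrative placeholders, not the study's measurements.

```python
# Minimal sketch: R^2 between two assay readings taken on the same extracts
# over successive storage days.
import numpy as np

tnbs = np.array([0.12, 0.25, 0.41, 0.66, 0.90, 1.21, 1.48])      # e.g. absorbance
fluorescamine = np.array([110, 220, 380, 610, 850, 1120, 1400])  # e.g. fluorescence units

r = np.corrcoef(tnbs, fluorescamine)[0, 1]
print(f"R^2 = {r**2:.3f}")  # values above ~0.93 would match the reported agreement
```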