33 results for Hugh O'Neill
Abstract:
A new method to measure Escherichia coli cell debris size after homogenization is presented. It is based on cumulative sedimentation analysis under centrifugal force, coupled with Sodium Dodecyl Sulfate-Polyacrylamide Gel Electrophoresis (SDS-PAGE) analysis of sedimented proteins. The effects that fermentation and homogenization conditions have on the resulting debris distributions were investigated using this method. Median debris size decreased significantly from approximately 0.5 µm to 0.3 µm as the number of homogenization passes increased from 2 to 10. Under identical homogenization conditions, uninduced host cells in stationary phase had a larger debris size than exponential-phase cells after 5 homogenizer passes. This difference was not evident after 2 or 10 passes, possibly because of confounding by intact cells and the existence of a minimum debris size for the conditions investigated. Recombinant cells containing protein inclusion bodies had the smallest debris size following homogenization. The method was also used to measure the size distribution of inclusion bodies. This result compared extremely well with an independent determination using centrifugal disc photosedimentation (CDS), thus validating the method. This is the first method that provides accurate size distributions of E. coli debris without the need for sample pretreatment, theoretical approximations (e.g. extinction coefficients), or the separation of debris and inclusion bodies prior to analysis. (C) 1997 John Wiley & Sons, Inc.
Abstract:
Renaturation of protein expressed as inclusion bodies within Escherichia coli is a key step in many bioprocesses. Operating conditions for the refolding step dramatically affect the amount of protein product recovered, and hence profoundly influence the process economics. The first systematic comparison of refolding conducted in batch, fed-batch, and continuous stirred-tank reactors is provided. Refolding is modeled as kinetic competition between first-order refolding (an equilibrium reaction) and irreversible second-order aggregation. The simulations presented allow direct comparison between different flowsheets and refolding schemes using a dimensionless economic objective. As expected from examination of the reaction kinetics, batch operation is the most inefficient mode. For the base process considered, the overall cost of fed-batch and continuous refolding is virtually identical (less than half that of the batch process). Reactor selection and optimization of refolding using overall economics are demonstrated to be vitally important.
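The kinetic competition this abstract describes can be sketched numerically. The Python sketch below uses invented rate constants (`k_fold`, `k_agg`) and simplifies the paper's equilibrium refolding step to an irreversible first-order reaction; it is an illustration of the competing-pathway idea, not the authors' model.

```python
# Minimal sketch (assumed rate constants) of kinetic competition between
# first-order refolding and second-order aggregation of unfolded protein U
# in a batch refolding reactor.
def simulate_batch_refold(u0, k_fold, k_agg, dt=1e-3, t_end=10.0):
    """Euler integration of dU/dt = -k_fold*U - k_agg*U**2;
    N accumulates correctly refolded product via the k_fold pathway."""
    u, n, t = u0, 0.0, 0.0
    while t < t_end:
        du = (-k_fold * u - k_agg * u * u) * dt
        n += k_fold * u * dt
        u += du
        t += dt
    return n / u0  # fractional refolding yield

# Aggregation is second order, so higher initial unfolded-protein
# concentration lowers the batch yield.
low = simulate_batch_refold(u0=0.1, k_fold=1.0, k_agg=10.0)
high = simulate_batch_refold(u0=1.0, k_fold=1.0, k_agg=10.0)
```

Because aggregation is second order while refolding is first order, keeping the instantaneous unfolded-protein concentration low raises the yield, which is exactly what fed-batch and continuous operation exploit.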
Abstract:
Experimental data for E. coli debris size reduction during high-pressure homogenisation at 55 MPa are presented. A mathematical model based on grinding theory is developed to describe the data. The model is based on first-order breakage and compensation conditions. It does not require any assumption of a specified distribution for debris size and can be used given information on the initial size distribution of whole cells and the disruption efficiency during homogenisation. The number of homogeniser passes is incorporated into the model and used to describe the size reduction of non-induced stationary and induced E. coli cells during homogenisation. Regressing the results to the model equations gave an excellent fit to experimental data (> 98.7% of variance explained for both fermentations), confirming the model's potential for predicting size reduction during high-pressure homogenisation. This study provides a means to optimise both homogenisation and disc-stack centrifugation conditions for recombinant product recovery. (C) 1997 Elsevier Science Ltd.
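The first-order breakage idea can be illustrated with a toy discrete population balance. In the Python sketch below, the size classes, the selection function `S`, and the even-redistribution rule are all invented for illustration; they are not the paper's fitted model.

```python
# Toy first-order breakage over homogeniser passes: per pass, size class i
# loses a fraction S[i] of its mass, redistributed to smaller classes.
def one_pass(mass, selection):
    out = mass[:]
    n = len(mass)
    for i in range(1, n):              # class 0 = smallest, cannot break further
        broken = selection[i] * mass[i]
        out[i] -= broken
        for j in range(i):             # split broken mass evenly over smaller classes
            out[j] += broken / i
    return out

sizes = [0.2, 0.4, 0.6, 0.8, 1.0]      # class centres, micrometres (illustrative)
mass = [0.0, 0.0, 0.0, 0.0, 1.0]       # start with intact cells only
S = [0.0, 0.2, 0.4, 0.6, 0.8]          # larger particles break more readily
for _ in range(5):                     # five homogeniser passes
    mass = one_pass(mass, S)

mean_size = sum(s * m for s, m in zip(sizes, mass)) / sum(mass)
```

Mass is conserved at each pass, while the mean debris size falls monotonically toward the smallest class, mirroring the size reduction with pass number described above.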
Abstract:
The major proteins of baboon milk were identified as β-lactoglobulin (βLG), α-lactalbumin (αLA), lysozyme, lactoferrin, casein, and albumin by immobiline isoelectric focusing, SDS-PAGE, immunoblotting of gels with rabbit antisera to human αLA, lysozyme, and albumin and to bovine βLG and casein, and N-terminal sequencing of proteins blotted from gels. The first 30 N-terminal residues of baboon βLG were determined, revealing a polymorphism at residue 2. The complete cDNA sequence and derived amino acid composition of βLG were elucidated using RT-PCR amplification of poly(A)+ mRNA purified from lactating mammary gland. βLG and αLA polymorphisms with three (A, B, and C) and two (A and B) variants, respectively, were detected by immobiline IEF, pH 4-6, of individual baboon milk samples at varying stages of lactation.
Abstract:
In this study we present a novel automated strategy for predicting infarct evolution, based on MR diffusion and perfusion images acquired in the acute stage of stroke. The validity of this methodology was tested on novel patient data including data acquired from an independent stroke clinic. Regions-of-interest (ROIs) defining the initial diffusion lesion and tissue with abnormal hemodynamic function as defined by the mean transit time (MTT) abnormality were automatically extracted from DWI/PI maps. Quantitative measures of cerebral blood flow (CBF) and volume (CBV) along with ratio measures defined relative to the contralateral hemisphere (r(a)CBF and r(a)CBV) were calculated for the MTT ROIs. A parametric normal classifier algorithm incorporating these measures was used to predict infarct growth. The mean r(a)CBF and r(a)CBV values for eventually infarcted MTT tissue were 0.70 ± 0.19 and 1.20 ± 0.36. For recovered tissue the mean values were 0.99 ± 0.25 and 1.87 ± 0.71, respectively. There was a significant difference between these two regions for both measures (P
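A parametric normal classifier of this kind can be sketched in a few lines. The Python sketch below uses the class means and standard deviations quoted above; the equal class priors and diagonal covariance are simplifying assumptions of this sketch, not necessarily the study's exact formulation.

```python
import math

# Two-class Gaussian (parametric normal) classifier on the ratio features
# (r_aCBF, r_aCBV). Means and SDs are the values quoted in the abstract;
# equal priors and a diagonal covariance are assumed.
CLASSES = {
    "infarcted": {"mean": (0.70, 1.20), "sd": (0.19, 0.36)},
    "recovered": {"mean": (0.99, 1.87), "sd": (0.25, 0.71)},
}

def log_likelihood(x, mean, sd):
    # log of a product of independent 1-D normal densities (constant dropped)
    return sum(-0.5 * ((xi - mi) / si) ** 2 - math.log(si)
               for xi, mi, si in zip(x, mean, sd))

def classify(x):
    return max(CLASSES,
               key=lambda c: log_likelihood(x, CLASSES[c]["mean"], CLASSES[c]["sd"]))
```

A tissue voxel's feature pair is assigned to whichever class gives it the higher likelihood; low r(a)CBF with modest r(a)CBV falls on the "infarcted" side of the decision boundary.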
Abstract:
This article develops a weighted least squares version of Levene's test of homogeneity of variance for a general design, available both for univariate and multivariate situations. When the design is balanced, the univariate and two common multivariate test statistics turn out to be proportional to the corresponding ordinary least squares test statistics obtained from an analysis of variance of the absolute values of the standardized mean-based residuals from the original analysis of the data. The constant of proportionality is simply a design-dependent multiplier (which does not necessarily tend to unity). Explicit results are presented for randomized block and Latin square designs and are illustrated for factorial treatment designs and split-plot experiments. The distribution of the univariate test statistic is close to a standard F-distribution, although it can be slightly underdispersed. For a complex design, the test assesses homogeneity of variance across blocks, treatments, or treatment factors and offers an objective interpretation of the residual plot.
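The classical one-way case that the article generalises can be written compactly: an ANOVA-style F statistic computed on absolute deviations of each observation from its group mean. The Python sketch below uses invented data values.

```python
# One-way Levene-type test: ANOVA F statistic on absolute deviations
# from group means. Large F suggests unequal within-group variances.
def levene_F(groups):
    z = [[abs(x - sum(g) / len(g)) for x in g] for g in groups]
    flat = [v for g in z for v in g]
    n, k = len(flat), len(z)
    grand = sum(flat) / n
    ssb = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in z)   # between groups
    ssw = sum((v - sum(g) / len(g)) ** 2 for g in z for v in g)     # within groups
    return (ssb / (k - 1)) / (ssw / (n - k))

equal_spread = levene_F([[1, 2, 3], [4, 5, 6]])   # same within-group spread
wider_spread = levene_F([[1, 2, 3], [0, 5, 10]])  # unequal spread, larger F
```

The article's weighted least squares version replaces the raw absolute deviations with standardized mean-based residuals and introduces the design-dependent proportionality constant described above.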
Abstract:
A decision theory framework can be a powerful technique to derive optimal management decisions for endangered species. We built a spatially realistic stochastic metapopulation model for the Mount Lofty Ranges Southern Emu-wren (Stipiturus malachurus intermedius), a critically endangered Australian bird. Using discrete-time Markov chains to describe the dynamics of a metapopulation and stochastic dynamic programming (SDP) to find optimal solutions, we evaluated the following different management decisions: enlarging existing patches, linking patches via corridors, and creating a new patch. This is the first application of SDP to optimal landscape reconstruction and one of the few times that landscape reconstruction dynamics have been integrated with population dynamics. SDP is a powerful tool that has advantages over standard Monte Carlo simulation methods because it can give the exact optimal strategy for every landscape configuration (combination of patch areas and presence of corridors) and pattern of metapopulation occupancy, as well as a trajectory of strategies. It is useful when a sequence of management actions can be performed over a given time horizon, as is the case for many endangered species recovery programs, where only fixed amounts of resources are available in each time step. However, it is generally limited by computational constraints to rather small networks of patches. The model shows that optimal metapopulation management decisions depend greatly on the current state of the metapopulation, and there is no strategy that is universally the best. The extinction probability over 30 yr for the optimal state-dependent management actions is 50-80% better than no management, whereas the best fixed state-independent sets of strategies are only 30% better than no management. This highlights the advantages of using a decision theory tool to investigate conservation strategies for metapopulations.
It is clear from these results that the sequence of management actions is critical, and this can only be effectively derived from stochastic dynamic programming. The model illustrates the underlying difficulty in determining simple rules of thumb for the sequence of management actions for a metapopulation. This use of a decision theory framework extends the capacity of population viability analysis (PVA) to manage threatened species.
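The backward-induction core of SDP can be sketched on a toy problem. In the Python sketch below, the state is simply the number of occupied patches in a two-patch metapopulation, and both the action set and the transition probabilities are invented for illustration; they are not the paper's model.

```python
# Toy stochastic dynamic programming: maximise the probability that the
# metapopulation is still extant at the end of the horizon. State = number
# of occupied patches (0 is absorbing extinction). All numbers invented.
ACTIONS = {
    "none":    {2: {2: 0.80, 1: 0.15, 0: 0.05},
                1: {2: 0.20, 1: 0.55, 0: 0.25},
                0: {0: 1.0}},
    "enlarge": {2: {2: 0.90, 1: 0.08, 0: 0.02},
                1: {2: 0.35, 1: 0.50, 0: 0.15},
                0: {0: 1.0}},
}

def value_iterate(horizon):
    V = {0: 0.0, 1: 1.0, 2: 1.0}       # terminal reward: 1 if not extinct
    policy = {}
    for _ in range(horizon):           # backward induction over years
        newV = {}
        for s in (0, 1, 2):
            q = {a: sum(p * V[s2] for s2, p in ACTIONS[a][s].items())
                 for a in ACTIONS}
            best = max(q, key=q.get)
            newV[s] = q[best]
            policy[s] = best           # optimal action for the current stage
        V = newV
    return V, policy
```

The returned policy is state-dependent, which is the point the abstract makes: the best action can depend on the current occupancy pattern, so no fixed state-independent strategy is universally optimal.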
Abstract:
Using benthic habitat data from the Florida Keys (USA), we demonstrate how siting algorithms can help identify potential networks of marine reserves that comprehensively represent target habitat types. We applied a flexible optimization tool (simulated annealing) to represent a fixed proportion of different marine habitat types within a geographic area. We investigated the relative influence of spatial information, planning-unit size, detail of habitat classification, and magnitude of the overall conservation goal on the resulting network scenarios. With this method, we were able to identify many adequate reserve systems that met the conservation goals, e.g., representing at least 20% of each conservation target (i.e., habitat type) while fulfilling the overall aim of minimizing the system area and perimeter. One of the most useful types of information provided by this siting algorithm comes from an irreplaceability analysis, which is a count of the number of times unique planning units were included in reserve system scenarios. This analysis indicated that many different combinations of sites produced networks that met the conservation goals. While individual 1-km² areas were fairly interchangeable, the irreplaceability analysis highlighted larger areas within the planning region that were chosen consistently to meet the goals incorporated into the algorithm. Additionally, we found that reserve systems designed with a high degree of spatial clustering tended to have considerably less perimeter and larger overall areas in reserve, a configuration that may be preferable particularly for sociopolitical reasons. This exercise illustrates the value of using the simulated annealing algorithm to help site marine reserves: the approach makes efficient use of available resources, can be used interactively by conservation decision makers, and offers biologically suitable alternative networks from which an effective system of marine reserves can be crafted.
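A rough sketch of simulated-annealing reserve selection follows. In the Python sketch below, the habitat layout, the penalty weight, and the cooling schedule are all invented; the only element taken from the abstract is the 20%-per-habitat representation target.

```python
import math
import random

# Simulated-annealing reserve siting sketch: choose 1-km2 planning units so
# at least 20% of every habitat type is represented, while penalising total
# reserve area. Habitat map, penalty weight, and schedule are invented.
random.seed(0)
UNITS = [{"area": 1.0, "habitat": h} for h in "AABBBCCCCC"]
TARGET = 0.20

def shortfall(selected):
    total, got = {}, {}
    for i, u in enumerate(UNITS):
        total[u["habitat"]] = total.get(u["habitat"], 0.0) + u["area"]
        if i in selected:
            got[u["habitat"]] = got.get(u["habitat"], 0.0) + u["area"]
    return sum(max(0.0, TARGET * total[h] - got.get(h, 0.0)) for h in total)

def cost(selected):
    area = sum(UNITS[i]["area"] for i in selected)
    return area + 100.0 * shortfall(selected)   # heavy penalty for missed targets

def anneal(steps=5000, T0=2.0):
    sel, best = set(), set()
    for step in range(steps):
        T = T0 * (1 - step / steps) + 1e-6            # linear cooling
        cand = sel ^ {random.randrange(len(UNITS))}   # toggle one unit
        d = cost(cand) - cost(sel)
        if d < 0 or random.random() < math.exp(-d / T):
            sel = cand
            if cost(sel) < cost(best):
                best = set(sel)
    return best
```

Running `anneal()` repeatedly with different seeds yields many alternative selections that all meet the targets, which is the behaviour the irreplaceability analysis above exploits: units chosen in most runs are the hard-to-replace ones.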