176 results for Simple methods
Abstract:
The beta subunit of the Escherichia coli replicative DNA polymerase III holoenzyme is the sliding clamp that interacts with the alpha (polymerase) subunit to maintain the high processivity of the enzyme. The beta protein is a ring-shaped dimer of 40.6 kDa subunits whose structure has previously been determined at a resolution of 2.5 Angstrom [Kong et al. (1992), Cell, 69, 425-437]. Here, the construction of a new plasmid that directs overproduction of beta to very high levels and a simple procedure for large-scale purification of the protein are described. Crystals grown under slightly modified conditions diffracted to beyond 1.9 Angstrom at 100 K at a synchrotron source. The structure of the beta dimer solved at 1.85 Angstrom resolution shows some differences from that reported previously. In particular, it was possible at this resolution to identify residues that differed in position between the two subunits in the unit cell; side chains of these and some other residues were found to occupy alternate conformations. This suggests that these residues are likely to be relatively mobile in solution. Some implications of this flexibility for the function of beta are discussed.
Abstract:
Simulations provide a powerful means to help gain the understanding of crustal fault system physics required to progress towards the goal of earthquake forecasting. Cellular automata are efficient enough to probe system dynamics, but their simplifications render interpretations questionable. In contrast, sophisticated elastodynamic models yield more convincing results but are too computationally demanding to explore phase space. To help bridge this gap, we develop a simple 2D elastodynamic model of parallel fault systems. The model is discretised onto a triangular lattice and faults are specified as split nodes along horizontal rows in the lattice. A simple numerical approach is presented for calculating the forces at medium and split nodes such that general nonlinear frictional constitutive relations can be modelled along faults. Single- and multi-fault simulation examples are presented using a nonlinear frictional relation that is slip- and slip-rate-dependent in order to illustrate the model.
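The abstract does not specify the frictional constitutive relation used; as a hedged illustration only, a generic slip- and slip-rate-dependent friction coefficient (an exponential slip-weakening term plus a logarithmic rate term, with all parameter values assumed) could look like:

```python
import math

def friction_coefficient(slip, slip_rate,
                         mu_s=0.6, mu_d=0.3, d_c=0.01, a=0.02, v_ref=1e-6):
    """Hypothetical slip- and slip-rate-dependent friction coefficient.

    mu_s, mu_d : static and dynamic friction levels (assumed values)
    d_c        : characteristic slip-weakening distance in metres (assumed)
    a, v_ref   : strength and reference velocity of the rate term (assumed)
    """
    # Exponential weakening from mu_s toward mu_d with accumulated slip,
    # plus a logarithmic strengthening term in slip rate.
    slip_term = mu_d + (mu_s - mu_d) * math.exp(-slip / d_c)
    rate_term = a * math.log(1.0 + slip_rate / v_ref)
    return slip_term + rate_term

# At zero slip and zero slip rate the coefficient is the static value.
print(friction_coefficient(0.0, 0.0))  # 0.6
```

Such a relation weakens with accumulated slip and stiffens slightly with slip rate; the paper's actual relation may differ in form and parameters.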
Abstract:
Estimating energy requirements is necessary in clinical practice when indirect calorimetry is impractical. This paper systematically reviews current methods for estimating energy requirements. Conclusions include: there is a discrepancy between the characteristics of the populations upon which predictive equations are based and current populations; the tools are not well understood; and patient care can be compromised by inappropriate application of the tools. Data comparing tools and methods are presented and issues for practitioners are discussed. (C) 2003 International Life Sciences Institute.
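For context, one widely quoted family of predictive equations is the Harris-Benedict (1919) basal energy expenditure equations; the sketch below uses the commonly cited coefficients (illustrative, not verified against the original publication, and the abstract's point is precisely that such historical equations may not match current populations):

```python
def harris_benedict_bee(weight_kg, height_cm, age_yr, sex):
    """Basal energy expenditure (kcal/day) from the Harris-Benedict (1919)
    equations as commonly quoted; coefficients are illustrative only."""
    if sex == "male":
        return 66.473 + 13.7516 * weight_kg + 5.0033 * height_cm - 6.755 * age_yr
    return 655.0955 + 9.5634 * weight_kg + 1.8496 * height_cm - 4.6756 * age_yr

# A 70 kg, 175 cm, 40-year-old man, rounded to the nearest kcal/day.
print(round(harris_benedict_bee(70, 175, 40, "male")))  # 1634
```

In practice, an activity or stress factor is usually multiplied onto the basal value, which is a further source of variation between methods.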
Abstract:
Purification of recombinant human growth hormone (rhGH) from Chinese hamster ovary (CHO) cell culture supernatant by Gradiflow large-scale electrophoresis is described. Production of rhGH in CHO cells is an alternative to production in Escherichia coli, with the advantage that rhGH is secreted into protein-free production media, facilitating a simpler purification and avoiding resolubilization of inclusion bodies and protein refolding. As an alternative to conventional chromatography, rhGH was purified in a one-step procedure using Gradiflow technology. Clarified culture supernatant containing rhGH was passed through a Gradiflow BF200 and separations were performed over 60 min using three different buffers of varying pH. Using a 50 mM Tris/Hepes buffer at pH 7.5 together with a 50 kDa separation membrane, rhGH was purified to approximately 98% purity with a yield of 90%. This study demonstrates the ability of Gradiflow preparative electrophoresis technology to purify rhGH from mammalian cell culture supernatant in a one-step process with high purity and yield. As the Gradiflow is directly scalable, this study also illustrates the potential for the inclusion of the Gradiflow in bioprocesses for the production of clinical grade rhGH and other therapeutic proteins. (C) 2003 Elsevier Science (USA). All rights reserved.
Abstract:
Taking functional programming to its extremities in search of simplicity still requires integration with other development (e.g. formal) methods. Induction is the key to deriving and verifying functional programs, but can be simplified through packaging proofs with functions, particularly folds, on data (structures). Totally Functional Programming (TFP) avoids the complexities of interpretation by directly representing data (structures) as platonic combinators - the functions characteristic to the data. The link between the two simplifications is that platonic combinators are a kind of partially-applied fold, which means that platonic combinators inherit fold-theoretic properties, but with some apparent simplifications due to the platonic combinator representation. However, although observable behaviour within functional programming suggests that TFP is widely applicable, significant work remains before TFP as such could be widely adopted.
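The idea of representing data by its characteristic function - a partially-applied fold - can be illustrated (as a generic sketch, not code from the paper) with Church-encoded natural numbers, where a number simply *is* the fold over its own Peano structure:

```python
# Church numerals: the number n is represented by the function that
# applies f to x exactly n times -- i.e. the fold over n's structure.
zero = lambda f: lambda x: x
succ = lambda n: (lambda f: lambda x: f(n(f)(x)))

# Addition inherits its definition from the fold:
# m + n applies succ m times to n.
add = lambda m: lambda n: m(succ)(n)

# Convert back to an ordinary int by folding with (+1) from 0.
to_int = lambda n: n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # 5
```

No interpreter over a data structure is needed: operations on the "data" are obtained by applying it, which is the fold-theoretic property the abstract alludes to.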
Abstract:
Objective: The Assessing Cost-Effectiveness - Mental Health (ACE-MH) study aims to assess, from a health sector perspective, whether there are options for change that could improve the effectiveness and efficiency of Australia's current mental health services by directing available resources toward 'best practice' cost-effective services. Method: The use of standardized evaluation methods addresses the reservations expressed by many economists about the simplistic use of league tables based on economic studies confounded by differences in methods, context and setting. The cost-effectiveness ratio for each intervention is calculated using economic and epidemiological data. This includes systematic reviews and randomised controlled trials for efficacy, the Australian Surveys of Mental Health and Wellbeing for current practice and a combination of trials and longitudinal studies for adherence. The cost-effectiveness ratios are presented as cost (A$) per disability-adjusted life year (DALY) saved with a 95% uncertainty interval based on Monte Carlo simulation modelling. An assessment of interventions on 'second filter' criteria ('equity', 'strength of evidence', 'feasibility' and 'acceptability to stakeholders') allows broader concepts of 'benefit' to be taken into account, as well as factors that might influence policy judgements in addition to cost-effectiveness ratios. Conclusions: The main limitation of the study is in the translation of the effect size from trials into a change in the DALY disability weight, which required the use of newly developed methods. While comparisons within disorders are valid, comparisons across disorders should be made with caution. A series of articles is planned to present the results.
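A Monte Carlo uncertainty interval of the kind described can be sketched as follows; the cost and DALY distributions below are hypothetical placeholders, not data from the ACE-MH study:

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def ce_ratio_interval(n=10_000):
    """Monte Carlo sketch of a 95% uncertainty interval for a
    cost-effectiveness ratio (A$ per DALY saved).

    The input distributions are invented for illustration only."""
    ratios = []
    for _ in range(n):
        cost = random.gauss(5_000_000, 500_000)   # total cost, A$ (assumed)
        dalys = random.gauss(400, 50)             # DALYs saved (assumed)
        ratios.append(cost / dalys)
    ratios.sort()
    # 95% uncertainty interval from the 2.5th and 97.5th percentiles.
    return ratios[int(0.025 * n)], ratios[int(0.975 * n)]

low, high = ce_ratio_interval()
print(f"A${low:,.0f} to A${high:,.0f} per DALY saved")
```

Propagating uncertainty in both numerator and denominator this way is what distinguishes the approach from a single point-estimate league table entry.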
Abstract:
Computational models complement laboratory experimentation for efficient identification of MHC-binding peptides and T-cell epitopes. Methods for prediction of MHC-binding peptides include binding motifs, quantitative matrices, artificial neural networks, hidden Markov models, and molecular modelling. Models derived by these methods have been successfully used for prediction of T-cell epitopes in cancer, autoimmunity, infectious disease, and allergy. For maximum benefit, the use of computer models must be treated as experiments analogous to standard laboratory procedures and performed according to strict standards. This requires careful selection of data for model building, and adequate testing and validation. A range of web-based databases and MHC-binding prediction programs are available. Although some available prediction programs for particular MHC alleles have reasonable accuracy, there is no guarantee that all models produce good quality predictions. In this article, we present and discuss a framework for modelling, testing, and applications of computational methods used in predictions of T-cell epitopes. (C) 2004 Elsevier Inc. All rights reserved.
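Of the prediction methods listed, the quantitative matrix is the simplest to make concrete: a peptide's score is the sum of position-specific residue scores. The matrix below is invented for illustration and is not a real MHC-binding matrix:

```python
# A toy position-specific scoring matrix for a 3-position binding core.
# Scores are invented for illustration, not from any real MHC allele.
PSSM = [
    {"A": 1.2, "L": 0.8, "K": -0.5},   # position 1 preferences
    {"A": 0.1, "L": 1.5, "K": -1.0},   # position 2 preferences
    {"A": 0.3, "L": 0.2, "K": 2.0},    # position 3 preferences
]

def matrix_score(peptide, pssm=PSSM, default=-2.0):
    """Score a peptide as the sum of per-position matrix entries;
    residues absent from a position's column get a penalty score."""
    return sum(pos.get(aa, default) for pos, aa in zip(pssm, peptide))

print(round(matrix_score("ALK"), 2))  # 1.2 + 1.5 + 2.0 = 4.7
```

Real matrices are trained on measured binding data, which is why the abstract stresses careful data selection, testing and validation before such a model is trusted.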
Abstract:
Quantum mechanics has been formulated in phase space, with the Wigner function as the representative of the quantum density operator, and classical mechanics has been formulated in Hilbert space, with the Groenewold operator as the representative of the classical Liouville density function. Semiclassical approximations to the quantum evolution of the Wigner function have been defined, enabling the quantum evolution to be approached from a classical starting point. Now analogous semiquantum approximations to the classical evolution of the Groenewold operator are defined, enabling the classical evolution to be approached from a quantum starting point. Simple nonlinear systems with one degree of freedom are considered, whose Hamiltonians are polynomials in the Hamiltonian of the simple harmonic oscillator. The behavior of expectation values of simple observables and of eigenvalues of the Groenewold operator is calculated numerically and compared for the various semiclassical and semiquantum approximations.
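One convenient property of such systems (a generic fact about operators, not a result of the paper): because the Hamiltonian is a polynomial p(H_sho), it is diagonal in the oscillator number basis, so its quantum spectrum is just p applied to the oscillator energies (n + 1/2)ħω. A sketch in units ħω = 1, with an assumed quartic-type example:

```python
def sho_energy(n, hbar_omega=1.0):
    """Eigenvalue (n + 1/2) * hbar*omega of the harmonic oscillator."""
    return hbar_omega * (n + 0.5)

def polynomial_hamiltonian_spectrum(coeffs, n_max, hbar_omega=1.0):
    """Eigenvalues of H = sum_k coeffs[k] * H_sho**k for n = 0..n_max.

    H is diagonal in the oscillator number basis, so its spectrum is the
    polynomial evaluated at the oscillator energies."""
    spectrum = []
    for n in range(n_max + 1):
        e = sho_energy(n, hbar_omega)
        spectrum.append(sum(c * e**k for k, c in enumerate(coeffs)))
    return spectrum

# Assumed example: H = H_sho + 0.1 * H_sho**2 (coefficients [0, 1, 0.1]).
print(polynomial_hamiltonian_spectrum([0.0, 1.0, 0.1], n_max=2))
# [0.525, 1.725, 3.125]
```

The nonlinearity shows up as the unequal spacing of the levels, while the eigenstates remain the oscillator number states.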
Abstract:
This special issue represents a further exploration of some issues raised at a symposium entitled “Functional magnetic resonance imaging: From methods to madness” presented during the 15th annual Theoretical and Experimental Neuropsychology (TENNET XV) meeting in Montreal, Canada in June 2004. The special issue’s theme is methods and learning in functional magnetic resonance imaging (fMRI), and it comprises 6 articles (3 reviews and 3 empirical studies). The first (Amaro and Barker) provides a beginner’s guide to fMRI and the BOLD effect (perhaps an alternative title might have been “fMRI for dummies”). While fMRI is now commonplace, there are still researchers who have yet to employ it as an experimental method and need some basic questions answered before they venture into new territory. This article should serve them well. A key issue of interest at the symposium was how fMRI could be used to elucidate cerebral mechanisms responsible for new learning. The next 4 articles address this directly, with the first (Little and Thulborn) an overview of data from fMRI studies of category-learning, and the second from the same laboratory (Little, Shin, Siscol, and Thulborn) an empirical investigation of changes in brain activity occurring across different stages of learning. While a role for medial temporal lobe (MTL) structures in episodic memory encoding has been acknowledged for some time, the different experimental tasks and stimuli employed across neuroimaging studies have, unsurprisingly, produced conflicting data in terms of the precise subregion(s) involved. The next paper (Parsons, Haut, Lemieux, Moran, and Leach) addresses this by examining effects of stimulus modality during verbal memory encoding.
Typically, BOLD fMRI studies of learning are conducted over short time scales; however, the fourth paper in this series (Olson, Rao, Moore, Wang, Detre, and Aguirre) describes an empirical investigation of learning occurring over a longer than usual period, achieving this by employing a relatively novel technique called perfusion fMRI. This technique shows considerable promise for future studies. The final article in this special issue (de Zubicaray) represents a departure from the more familiar cognitive neuroscience applications of fMRI, instead describing how neuroimaging studies might be conducted to both inform and constrain information processing models of cognition.
Abstract:
Extended gcd computation is interesting in itself. It also plays a fundamental role in other calculations. We present a new algorithm for solving the extended gcd problem. The algorithm has a particularly simple description and is practical. It also provides refined bounds on the size of the multipliers obtained.
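The abstract does not describe the new algorithm itself; for reference, the classical extended Euclidean algorithm (the baseline such work improves on, particularly regarding multiplier size) solves the same problem:

```python
def extended_gcd(a, b):
    """Classical extended Euclidean algorithm (a baseline, not the
    paper's new algorithm): returns (g, x, y) with
    g = gcd(a, b) = a*x + b*y."""
    old_r, r = a, b
    old_x, x = 1, 0
    old_y, y = 0, 1
    while r != 0:
        q = old_r // r
        # Each (remainder, coefficient) pair advances one division step.
        old_r, r = r, old_r - q * r
        old_x, x = x, old_x - q * x
        old_y, y = y, old_y - q * y
    return old_r, old_x, old_y

g, x, y = extended_gcd(240, 46)
print(g, x, y)  # 2 -9 47, since 2 = 240*(-9) + 46*47
```

The multipliers x and y returned by this classical scheme can be large relative to the inputs, which is exactly the quantity the refined bounds mentioned in the abstract concern.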
Abstract:
An improved method for counting virus and virus-like particles by electron microscopy (EM) was developed. The procedure involves determining the absolute concentration of pure or semi-pure particles once deposited evenly on EM grids using either centrifugation or antibody capture techniques. Particles were counted with a Microfiche unit, which enlarged approximately 50x the image of particles on a developed negative film taken at a relatively low magnification (2500x) by EM. Initially, latex particles of a known concentration were counted using this approach to prove the accuracy of the technique. The latex particles were deposited evenly on an EM grid using centrifugation (modified Beckman EM-90 Airfuge technique). Subsequently, recombinant Bluetongue virus (BTV) core-like particles (CLPs) captured by a monoclonal antibody using a novel sample loading method were counted by the Microfiche unit method and by a direct EM method. Comparison of the simplified counting method with a conventional method showed good agreement. The method is simple, accurate, rapid, and reproducible when used with either pure particles or particles from crude cell culture extracts.
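The back-calculation from counted particles to an absolute concentration can be sketched generically (the formula and all numbers below are illustrative assumptions; the paper's exact calibration is not given in the abstract):

```python
def particles_per_ml(count, fields_counted, field_area_um2,
                     deposition_area_um2, sample_volume_ml):
    """Generic back-calculation from a per-field particle count to an
    absolute concentration (illustrative sketch only).

    Scale the mean count per field up to the whole deposition area,
    then divide by the volume of sample deposited on the grid."""
    mean_per_field = count / fields_counted
    total_on_grid = mean_per_field * (deposition_area_um2 / field_area_um2)
    return total_on_grid / sample_volume_ml

# Hypothetical numbers: 200 particles over 10 fields of 100 um^2 each,
# deposition area 1e6 um^2, from 1e-4 ml of sample.
print(f"{particles_per_ml(200, 10, 100.0, 1e6, 1e-4):.2e}")  # 2.00e+09
```

For an even deposition, as the centrifugation and antibody capture steps aim to ensure, this scaling is what makes the count absolute rather than relative.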