91 results for Methods: laboratory: molecular


Relevance: 30.00%

Abstract:

Conotoxins are valuable probes of receptors and ion channels because of their small size and highly selective activity. alpha-Conotoxin EpI, a 16-residue peptide from the mollusk-hunting Conus episcopatus, has the amino acid sequence GCCSDPRCNMNNPDY(SO3H)C-NH2 and appears to be an extremely potent and selective inhibitor of the alpha 3 beta 2 and alpha 3 beta 4 neuronal subtypes of the nicotinic acetylcholine receptor (nAChR). The desulfated form of EpI ([Tyr(15)]EpI) has a potency and selectivity for the nAChR similar to those of EpI. Here we describe the crystal structure of [Tyr(15)]EpI solved at a resolution of 1.1 Angstrom using SnB. The asymmetric unit has a total of 284 non-hydrogen atoms, making this one of the largest structures solved de novo by direct methods. The [Tyr(15)]EpI structure brings to six the number of alpha-conotoxin structures that have been determined to date. Four of these, [Tyr(15)]EpI, PnIA, PnIB, and MII, have an alpha 4/7 cysteine framework and are selective for the neuronal subtype of the nAChR. The structure of [Tyr(15)]EpI has the same backbone fold as the other alpha 4/7-conotoxin structures, supporting the notion that this conotoxin cysteine framework and spacing give rise to a conserved fold. The surface charge distribution of [Tyr(15)]EpI is similar to that of PnIA and PnIB but is likely to be different from that of MII, suggesting that [Tyr(15)]EpI and MII may have different binding modes for the same receptor subtype.

Relevance: 30.00%

Abstract:

Over recent years databases have become an extremely important resource for biomedical research. Immunology research is increasingly dependent on access to extensive biological databases to extract existing information, plan experiments, and analyse experimental results. This review describes 15 immunological databases that have appeared over the last 30 years. In addition, important issues regarding database design and the potential for misuse of information contained within these databases are discussed. Access pointers are provided for the major immunological databases and also for a number of other immunological resources accessible over the World Wide Web (WWW). (C) 2000 Elsevier Science B.V. All rights reserved.

Relevance: 30.00%

Abstract:

A number of techniques have been developed to study the disposition of drugs in the head and, in particular, the role of the blood-brain barrier (BBB) in drug uptake. The techniques can be divided into three groups: in-vitro, in-vivo and in-situ. The most suitable method depends on the purpose(s) and requirements of the particular study being conducted. In-vitro techniques involve the isolation of cerebral endothelial cells so that direct investigations of these cells can be carried out. The most recent preparations are able to maintain structural and functional characteristics of the BBB by simultaneously culturing endothelial cells with astrocytic cells. The main advantages of the in-vitro methods are the elimination of anaesthetics and surgery. In-vivo methods consist of a diverse range of techniques and include the traditional Brain Uptake Index and indicator diffusion methods, as well as microdialysis and positron emission tomography. In-vivo methods maintain the cells and vasculature of an organ in their normal physiological states and anatomical position within the animal. However, the shortcomings include renal and hepatic elimination of solutes as well as the inability to control blood flow. In-situ techniques, including the perfused head, are more technically demanding. However, these models have the ability to vary the composition and flow rate of the artificial perfusate. This review is intended as a guide for selecting the most appropriate method for studying drug uptake in the brain.

Relevance: 30.00%

Abstract:

Background: A variety of methods for prediction of peptide binding to major histocompatibility complex (MHC) have been proposed. These methods are based on binding motifs, binding matrices, hidden Markov models (HMM), or artificial neural networks (ANN). There has been little prior work on the comparative analysis of these methods. Materials and Methods: We performed a comparison of the performance of six methods applied to the prediction of two human MHC class I molecules, including binding matrices and motifs, ANNs, and HMMs. Results: The selection of the optimal prediction method depends on the amount of available data (the number of peptides of known binding affinity to the MHC molecule of interest), the biases in the data set and the intended purpose of the prediction (screening of a single protein versus mass screening). When little or no peptide data are available, binding motifs are the most useful alternative to random guessing or use of a complete overlapping set of peptides for selection of candidate binders. As the number of known peptide binders increases, binding matrices and HMM become more useful predictors. ANN and HMM are the predictive methods of choice for MHC alleles with more than 100 known binding peptides. Conclusion: The ability of bioinformatic methods to reliably predict MHC binding peptides, and thereby potential T-cell epitopes, has major implications for clinical immunology, particularly in the area of vaccine design.
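
To make the matrix-based approach concrete, the sketch below scores a 9-mer peptide against a toy position-specific binding matrix, the simplest of the prediction methods compared above. The matrix entries and peptides are illustrative placeholders, not values from any published MHC matrix.

```python
# Toy position-specific binding matrix for a 9-mer MHC class I ligand.
# Each dict gives the score contribution of selected residues at that
# position; unlisted residues contribute 0.0. All numbers are invented
# for illustration only.
MATRIX = [
    {"L": 1.2, "M": 0.8},   # position 1 (hypothetical anchor preference)
    {"L": 0.3, "V": 0.2},
    {"F": 0.5},
    {"D": 0.4, "E": 0.3},
    {"P": 0.2},
    {"K": 0.3},
    {"I": 0.4},
    {"S": 0.1},
    {"V": 1.0, "L": 0.9},   # position 9 (hypothetical C-terminal anchor)
]

def score_peptide(peptide: str) -> float:
    """Sum the per-position matrix contributions for a 9-mer peptide."""
    if len(peptide) != len(MATRIX):
        raise ValueError("peptide length must match the matrix length")
    return sum(MATRIX[i].get(aa, 0.0) for i, aa in enumerate(peptide))

for pep in ("LLFDPKISV", "AAADAKASA"):
    print(pep, round(score_peptide(pep), 2))
```

Ranking candidate peptides by such a score, then applying a threshold, is the essence of the matrix and motif methods compared in the abstract; ANNs and HMMs replace the additive per-position scores with learned, non-linear models.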

Relevance: 30.00%

Abstract:

Allergy is a major cause of morbidity worldwide. The number of characterized allergens and related information is increasing rapidly creating demands for advanced information storage, retrieval and analysis. Bioinformatics provides useful tools for analysing allergens and these are complementary to traditional laboratory techniques for the study of allergens. Specific applications include structural analysis of allergens, identification of B- and T-cell epitopes, assessment of allergenicity and cross-reactivity, and genome analysis. In this paper, the most important bioinformatic tools and methods with relevance to the study of allergy have been reviewed.

Relevance: 30.00%

Abstract:

Sound application of molecular epidemiological principles requires working knowledge of both molecular biological and epidemiological methods. Molecular tools have become an increasingly important part of studying the epidemiology of infectious agents. Molecular tools have allowed the aetiological agent within a population to be diagnosed with a greater degree of efficiency and accuracy than conventional diagnostic tools. They have increased the understanding of the pathogenicity, virulence, and host-parasite relationships of the aetiological agent, provided information on the genetic structure and taxonomy of the parasite and allowed the zoonotic potential of previously unidentified agents to be determined. This review describes the concept of epidemiology and proper study design, describes the array of currently available molecular biological tools and provides examples of studies that have integrated both disciplines to successfully unravel zoonotic relationships that would otherwise be impossible to resolve using conventional diagnostic tools. The current limitations of applying these tools, including cautions that need to be addressed during their application, are also discussed. (c) 2005 Australian Society for Parasitology Inc. Published by Elsevier Ltd. All rights reserved.

Relevance: 30.00%

Abstract:

This special issue represents a further exploration of some issues raised at a symposium entitled “Functional magnetic resonance imaging: From methods to madness” presented during the 15th annual Theoretical and Experimental Neuropsychology (TENNET XV) meeting in Montreal, Canada in June, 2004. The special issue’s theme is methods and learning in functional magnetic resonance imaging (fMRI), and it comprises 6 articles (3 reviews and 3 empirical studies). The first (Amaro and Barker) provides a beginner’s guide to fMRI and the BOLD effect (perhaps an alternative title might have been “fMRI for dummies”). While fMRI is now commonplace, there are still researchers who have yet to employ it as an experimental method and need some basic questions answered before they venture into new territory. This article should serve them well. A key issue of interest at the symposium was how fMRI could be used to elucidate cerebral mechanisms responsible for new learning. The next 4 articles address this directly, with the first (Little and Thulborn) an overview of data from fMRI studies of category learning, and the second, from the same laboratory (Little, Shin, Siscol, and Thulborn), an empirical investigation of changes in brain activity occurring across different stages of learning. While a role for medial temporal lobe (MTL) structures in episodic memory encoding has been acknowledged for some time, the different experimental tasks and stimuli employed across neuroimaging studies have, not surprisingly, produced conflicting data in terms of the precise subregion(s) involved. The next paper (Parsons, Haut, Lemieux, Moran, and Leach) addresses this by examining effects of stimulus modality during verbal memory encoding. Typically, BOLD fMRI studies of learning are conducted over short time scales; however, the fourth paper in this series (Olson, Rao, Moore, Wang, Detre, and Aguirre) describes an empirical investigation of learning occurring over a longer than usual period, achieving this by employing a relatively novel technique called perfusion fMRI. This technique shows considerable promise for future studies. The final article in this special issue (de Zubicaray) represents a departure from the more familiar cognitive neuroscience applications of fMRI, instead describing how neuroimaging studies might be conducted to both inform and constrain information processing models of cognition.

Relevance: 30.00%

Abstract:

Smoothing the potential energy surface for structure optimization is a general and commonly applied strategy. We propose a combination of soft-core potential energy functions and a variation of the diffusion equation method to smooth potential energy surfaces, which is applicable to complex systems such as protein structures. The performance of the method was demonstrated by comparison with simulated annealing, using the refinement of the undecapeptide Cyclosporin A as a test case. Simulations were repeated many times using different initial conditions and structures, since the methods are heuristic and results are only meaningful in a statistical sense.
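
As a rough illustration of the diffusion-equation idea (not the authors' soft-core implementation), the sketch below smooths a rugged one-dimensional toy potential by integrating the diffusion equation dV/dt = d²V/dx², which progressively removes the small barriers between nearby minima. The toy potential and smoothing time are arbitrary choices.

```python
# Diffusion-equation smoothing of a toy 1-D potential: evolving V under
# dV/dt = d2V/dx2 for a "time" t is equivalent to convolving V with a
# Gaussian of width sqrt(2t), which washes out fine-scale ruggedness.
# The double-well-plus-ripple potential below is purely illustrative.
import numpy as np

x = np.linspace(-3.0, 3.0, 601)
dx = x[1] - x[0]
V = (x**2 - 1.0) ** 2 + 0.3 * np.sin(8.0 * x)   # rugged double well (toy)

def smooth(V, dx, t, steps=2000):
    """Explicit finite-difference integration of the diffusion equation."""
    dt = t / steps
    assert dt <= 0.5 * dx**2, "time step exceeds the explicit stability limit"
    Vs = V.copy()
    for _ in range(steps):
        lap = np.zeros_like(Vs)
        lap[1:-1] = (Vs[2:] - 2.0 * Vs[1:-1] + Vs[:-2]) / dx**2
        Vs = Vs + dt * lap            # boundary values held fixed
    return Vs

def count_minima(V):
    """Number of strict interior local minima."""
    return int(((V[1:-1] < V[:-2]) & (V[1:-1] < V[2:])).sum())

V_smooth = smooth(V, dx, t=0.05)
print("local minima before smoothing:", count_minima(V))
print("local minima after smoothing: ", count_minima(V_smooth))
```

In smoothing-based optimization schemes the idea is to minimize on the smoothed surface and then track the minimizer back as the smoothing is gradually switched off; the soft-core functions in the paper address the additional problem that hard atomic repulsions do not smooth well.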

Relevance: 30.00%

Abstract:

Dispersal, or the amount of dispersion between an individual's birthplace and that of its offspring, is of great importance in population biology, behavioural ecology and conservation; however, obtaining direct estimates from field data on natural populations can be problematic. The prickly forest skink, Gnypetoscincus queenslandiae, is a rainforest endemic skink from the wet tropics of Australia. Because of its log-dwelling habits and lack of definite nesting sites, a demographic estimate of dispersal distance is difficult to obtain. Neighbourhood size, defined as 4πDσ² (where D is the population density and σ² the mean axial squared parent-offspring dispersal rate), dispersal and density were estimated directly and indirectly for this species using mark-recapture and microsatellite data, respectively, on lizards captured at a local geographical scale of 3 ha. Mark-recapture data gave a dispersal rate of 843 m²/generation (assuming a generation time of 6.5 years), a time-scaled density of 13 635 individuals · generation/km² and, hence, a neighbourhood size of 144 individuals. A genetic method based on the multilocus (10 loci) microsatellite genotypes of individuals and their geographical location indicated that there is a significant isolation-by-distance pattern, and gave a neighbourhood size of 69 individuals, with a 95% confidence interval between 48 and 184. This translates into a dispersal rate of 404 m²/generation when using the mark-recapture density estimate, or an estimate of time-scaled population density of 6520 individuals · generation/km² when using the mark-recapture dispersal rate estimate. The relationship between the two categories of neighbourhood size, dispersal and density estimates and reasons for any disparities are discussed.
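
The neighbourhood-size formula is easy to verify against the mark-recapture figures quoted above; the only step added here is the m² to km² unit conversion needed so that the density and dispersal terms cancel.

```python
# Check of the neighbourhood-size formula Nb = 4*pi*D*sigma^2 using the
# mark-recapture estimates reported above (sigma^2 in m^2/generation,
# D as a time-scaled density in individuals*generation/km^2).
import math

sigma2_m2 = 843.0          # axial parent-offspring dispersal rate, m^2/generation
D_per_km2 = 13_635.0       # time-scaled density, individuals*generation/km^2

sigma2_km2 = sigma2_m2 / 1.0e6          # 1 km^2 = 1e6 m^2
Nb = 4.0 * math.pi * D_per_km2 * sigma2_km2
print(f"neighbourhood size ~ {Nb:.0f} individuals")   # ~144, as reported
```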

Relevance: 30.00%

Abstract:

This paper presents a comparison of surface diffusivities of hydrocarbons in activated carbon. The surface diffusivities are obtained from the analysis of kinetic data collected using three different kinetic methods: the constant molar flow, the differential adsorption bed and the differential permeation methods. In general, the values of surface diffusivity obtained by these methods agree with each other, and it is found that the surface diffusivity increases very quickly with loading. Such a fast increase cannot be accounted for by a thermodynamic Darken factor, and surface heterogeneity only partially accounts for the fast rise of surface diffusivity with loading. Surface diffusivities of methane, ethane, propane, n-butane, n-hexane, benzene and ethanol on activated carbon are reported in this paper.
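
For readers unfamiliar with the thermodynamic Darken factor mentioned above, the sketch below evaluates the correction term d ln p / d ln q for a Langmuir isotherm, for which it reduces to 1/(1 − θ). The Langmuir form is only a convenient illustration and is not the isotherm model fitted in the paper.

```python
# Thermodynamic (Darken) correction factor d ln p / d ln q for a Langmuir
# isotherm q = q_max * b * p / (1 + b * p); the algebra gives 1 / (1 - theta).
# The Langmuir form is an illustrative assumption, not the paper's model.
import numpy as np

def darken_factor_langmuir(theta):
    """d ln p / d ln q at fractional loading theta (Langmuir isotherm)."""
    return 1.0 / (1.0 - np.asarray(theta))

theta = np.linspace(0.1, 0.9, 5)
for t, f in zip(theta, darken_factor_langmuir(theta)):
    print(f"theta = {t:.1f}  ->  Darken factor = {f:.2f}")
```

Multiplying a loading-independent corrected diffusivity by this factor gives the usual Darken prediction for how the transport (surface) diffusivity should rise with loading; the abstract's point is that the measured rise outstrips it.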

Relevance: 30.00%

Abstract:

A laboratory scale sequencing batch reactor (SBR) operating for enhanced biological phosphorus removal (EBPR) and fed with a mixture of volatile fatty acids (VFAs) showed stable and efficient EBPR capacity over a four-year period. Phosphorus (P), poly-beta-hydroxyalkanoate (PHA) and glycogen cycling consistent with classical anaerobic/aerobic EBPR were demonstrated with the order of anaerobic VFA uptake being propionate, acetate then butyrate. The SBR was operated without pH control and 63.67 ± 13.86 mg P l⁻¹ was released anaerobically. The P% of the sludge fluctuated between 6% and 10% over the operating period (average of 8.04 ± 1.31%). Four main morphological types of floc-forming bacteria were observed in the sludge during one year of intensive microscopic observation. Two of them were mainly responsible for anaerobic/aerobic P and PHA transformations. Fluorescence in situ hybridization (FISH) and post-FISH chemical staining for intracellular polyphosphate and PHA were used to determine that 'Candidatus Accumulibacter phosphatis' was the most abundant polyphosphate accumulating organism (PAO), forming large clusters of coccobacilli (1.0-1.5 µm) and comprising 53% of the sludge bacteria. Also by these methods, large coccobacillus-shaped gammaproteobacteria (2.5-3.5 µm) from a recently described novel cluster were glycogen-accumulating organisms (GAOs) comprising 13% of the bacteria. Tetrad-forming organisms (TFOs) consistent with the 'G bacterium' morphotype were alphaproteobacteria, but not Amaricoccus spp., and comprised 25% of all bacteria. According to chemical staining, TFOs were occasionally able to store PHA anaerobically and utilize it aerobically.

Relevance: 30.00%

Abstract:

The focus of rapid diagnosis of infectious diseases of children in the last decade has shifted from variations of the conventional laboratory techniques of antigen detection, microscopy and culture to that of molecular diagnosis of infectious agents. Pediatricians will need to be able to interpret the use, limitations and results of molecular diagnostic techniques as they are increasingly integrated into routine clinical microbiology laboratory protocols. PCR is the best known and most successfully implemented diagnostic molecular technology to date. It can detect specific infectious agents and determine their virulence and antimicrobial genotypes with greater speed, sensitivity and specificity than conventional microbiology methods. Inherent technical limitations of PCR are present, although they are reduced in laboratories that follow suitable validation and quality control procedures. Variations of PCR together with advances in nucleic acid amplification technology have broadened its diagnostic capabilities in clinical infectious disease to now rival and even surpass traditional methods in some situations. Automation of all components of PCR is now possible. The completion of the genome sequencing projects for significant microbial pathogens, in combination with PCR and DNA chip technology, will revolutionize the diagnosis and management of infectious diseases.

Relevance: 30.00%

Abstract:

Given the importance of protein complexes as therapeutic targets, it is necessary to understand the physical chemistry of these interactions under the crowded conditions that exist in cells. We have used sedimentation equilibrium to quantify the enhancement of the reversible homodimerization of alpha-chymotrypsin by high concentrations of the osmolytes glucose, sucrose, and raffinose. In an attempt to rationalize the osmolyte-mediated stabilization of the alpha-chymotrypsin homodimer, we have used models based on binding interactions (transfer-free energy analysis) and steric interactions (excluded volume theory) to predict the stabilization. Although transfer-free energy analysis predicts reasonably well the relatively small stabilization observed for complex formation between cytochrome c and cytochrome c peroxidase, as well as that between bobtail quail lysozyme and a monoclonal Fab fragment, it underestimates the sugar-mediated stabilization of the alpha-chymotrypsin dimer. Although predictions based on excluded volume theory overestimate the stabilization, it would seem that a major determinant of the observed stabilization of the alpha-chymotrypsin homodimer is the thermodynamic nonideality arising from molecular crowding by the three small sugars.
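
To give a rough sense of why a hard-sphere excluded-volume treatment tends to predict large enhancements, the first-order sketch below estimates ln(K_crowded/K_dilute) ≈ (2·Vex,MC − Vex,DC)·NA·c from monomer-crowder and dimer-crowder covolumes. All radii and the sugar concentration are illustrative guesses, not the parameters used in the paper's analysis, and higher-order crowding terms are neglected.

```python
# First-order hard-sphere sketch of the excluded-volume prediction for
# crowder-enhanced dimerization:
#   ln(K_crowded / K_dilute) ~ (2*Vex_MC - Vex_DC) * NA * c_C,
# where Vex_ij = (4/3)*pi*(r_i + r_j)^3 is the pairwise covolume.
# All radii and the sugar concentration are rough illustrative guesses.
import math

NA = 6.022e23                      # Avogadro's number, 1/mol

def vex(r1_nm: float, r2_nm: float) -> float:
    """Pairwise excluded volume in litres (1 nm^3 = 1e-24 L)."""
    return (4.0 / 3.0) * math.pi * (r1_nm + r2_nm) ** 3 * 1e-24

r_monomer, r_dimer, r_sugar = 2.2, 2.8, 0.4   # radii in nm, illustrative
c_sugar = 1.0                                  # mol/L, illustrative

ln_enhancement = (2 * vex(r_monomer, r_sugar) - vex(r_dimer, r_sugar)) * NA * c_sugar
print(f"predicted K_crowded/K_dilute ~ {math.exp(ln_enhancement):.0f}")
```

Even with these crude inputs the hard-sphere estimate comes out large, which is consistent in direction with the abstract's observation that excluded-volume theory overestimates the measured stabilization.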

Relevance: 30.00%

Abstract:

Molecular evolution has been considered to be essentially a stochastic process, little influenced by the pace of phenotypic change. This assumption was challenged by a study that demonstrated an association between rates of morphological and molecular change estimated for total-evidence phylogenies, a finding that led some researchers to challenge molecular date estimates of major evolutionary radiations. Here we show that Omland's (1997) result is probably due to methodological bias, particularly phylogenetic nonindependence, rather than being indicative of an underlying evolutionary phenomenon. We apply three new methods specifically designed to overcome phylogenetic bias to 13 published phylogenetic datasets for vertebrate taxa, each of which includes both morphological characters and DNA sequence data. We find no evidence of an association between rates of molecular and morphological change.

Relevance: 30.00%

Abstract:

There are several competing methods commonly used to solve energy-grained master equations describing gas-phase reactive systems. When it comes to selecting an appropriate method for any particular problem, there is little guidance in the literature. In this paper we directly compare several variants of spectral and numerical integration methods from the point of view of the computer time required to calculate the solution and the range of temperature and pressure conditions under which the methods are successful. The test case used in the comparison is an important reaction in combustion chemistry and incorporates reversible and irreversible bimolecular reaction steps as well as isomerizations between multiple unimolecular species. While the numerical integration of the ODE with a stiff ODE integrator is not the fastest method overall, it is the fastest method applicable to all conditions.
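
To illustrate the kind of calculation being benchmarked, in drastically reduced form, the sketch below integrates a three-state linear master equation dp/dt = Kp with SciPy's stiff BDF integrator. The rate matrix couples two isomers and an irreversible product sink; all rate coefficients are invented, and a real energy-grained problem involves far more coupled energy grains.

```python
# Direct numerical integration of a toy master equation dp/dt = K p with a
# stiff ODE integrator (BDF). States: isomer A, isomer B, irreversible
# product sink. Rate coefficients (s^-1) are invented for illustration;
# each column of K sums to zero so total population is conserved.
import numpy as np
from scipy.integrate import solve_ivp

K = np.array([
    [-1.0e3,  5.0e2, 0.0],   # A: loss to B and to product, gain from B
    [ 8.0e2, -1.2e3, 0.0],   # B: gain from A, loss to A and to product
    [ 2.0e2,  7.0e2, 0.0],   # product sink: gains only (irreversible)
])

def rhs(t, p):
    return K @ p

p0 = np.array([1.0, 0.0, 0.0])          # all population starts in isomer A
sol = solve_ivp(rhs, (0.0, 0.1), p0, method="BDF", rtol=1e-8, atol=1e-12)

print("final populations (A, B, product):", np.round(sol.y[:, -1], 4))
print("total population:", round(float(sol.y[:, -1].sum()), 6))   # ~1.0
```

Spectral approaches instead diagonalize K and build the solution from its eigenpairs, which is fast when the matrix is well conditioned; the abstract's point is that the stiff-integrator route, while not always the quickest, works across the full range of temperatures and pressures tested.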