859 results for constant modulus algorithm
Abstract:
Solid-state nuclear magnetic resonance (NMR) spectroscopy is a powerful technique for studying structural and dynamical properties of disordered and partially ordered materials, such as glasses, polymers, liquid crystals, and biological materials. In particular, two-dimensional (2D) NMR methods such as ¹³C-¹³C correlation spectroscopy under magic-angle-spinning (MAS) conditions have been used to measure structural constraints on the secondary structure of proteins and polypeptides. Amyloid fibrils implicated in a broad class of diseases such as Alzheimer's are known to contain a particular repeating structural motif, called a β-sheet. However, the details of such structures are poorly understood, primarily because the structural constraints extracted from the 2D NMR data in the form of the so-called Ramachandran (backbone torsion) angle distributions, g(φ,ψ), are strongly model-dependent. Inverse theory methods are used to extract Ramachandran angle distributions from a set of 2D MAS and constant-time double-quantum-filtered dipolar recoupling (CTDQFD) data. This is a vastly underdetermined problem, and the stability of the inverse mapping is problematic. Tikhonov regularization is a well-known method of improving the stability of the inverse; in this work it is extended to use a new regularization functional based on the Laplacian rather than on the norm of the function itself. In this way, one makes use of the inherently two-dimensional nature of the underlying Ramachandran maps. In addition, a modification of the existing numerical procedure is performed, as appropriate for an underdetermined inverse problem. Stability of the algorithm with respect to the signal-to-noise (S/N) ratio is examined using a simulated data set. The results show excellent convergence to the true angle distribution function g(φ,ψ) for S/N ratios above 100.
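A minimal sketch of Laplacian-penalized Tikhonov regularization of the kind described above, assuming the forward model has been discretized into a kernel matrix K that maps a flattened Ramachandran distribution g(φ,ψ) on an nx-by-ny grid to the measured data d. The periodic 5-point stencil and the solver choice are illustrative assumptions, not details taken from the thesis.

```python
import numpy as np

def laplacian_2d(nx, ny):
    """Discrete 2D Laplacian (5-point stencil, periodic in both torsion angles)
    acting on a distribution flattened from an nx-by-ny grid."""
    n = nx * ny
    L = np.zeros((n, n))
    for i in range(nx):
        for j in range(ny):
            k = i * ny + j
            L[k, k] = -4.0
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                L[k, ((i + di) % nx) * ny + (j + dj) % ny] += 1.0
    return L

def tikhonov_laplacian(K, d, lam, nx, ny):
    """Solve min_g ||K g - d||^2 + lam * ||L g||^2 via the regularized normal equations."""
    L = laplacian_2d(nx, ny)
    A = K.T @ K + lam * (L.T @ L)
    b = K.T @ d
    g, *_ = np.linalg.lstsq(A, b, rcond=None)  # lstsq tolerates a (near-)singular A
    return g.reshape(nx, ny)
```

Penalizing ||L g||² rather than ||g||² favours smooth two-dimensional distributions, which is the point of using the Laplacian on the Ramachandran grid; the regularization weight lam would be tuned against the S/N of the data.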
Abstract:
We have calculated the thermodynamic properties of monatomic fcc crystals from the high-temperature limit of the Helmholtz free energy. This equation of state included the static and vibrational energy components. The latter contribution was calculated to order λ⁴ of perturbation theory, for a range of crystal volumes, in which a nearest-neighbour central force model was used. We have calculated the lattice constant, the coefficient of volume expansion, the specific heat at constant volume and at constant pressure, the adiabatic and the isothermal bulk modulus, and the Grüneisen parameter for two of the rare gas solids, Xe and Kr, and for the fcc metals Cu, Ag, Au, Al, and Pb. The Lennard-Jones and the Morse potential were each used to represent the atomic interactions for the rare gas solids, and only the Morse potential was used for the fcc metals. The thermodynamic properties obtained from the λ⁴ equation of state with the Lennard-Jones potential seem to be in reasonable agreement with experiment for temperatures up to about three-quarters of the melting temperature; at higher temperatures, however, the results are less satisfactory. For Xe and Kr, the thermodynamic properties calculated from the λ² equation of state with the Morse potential are qualitatively similar to the λ² results obtained with the Lennard-Jones potential; however, the properties obtained from the λ⁴ equation of state are in good agreement with experiment, since the contribution from the λ⁴ terms seems to be small. The lattice contribution to the thermal properties of the fcc metals was calculated from the λ⁴ equation of state, and these results produced a slight improvement over the properties calculated from the λ² equation of state. In order to compare the calculated specific heats and bulk moduli with experiment, the electronic contribution to the thermal properties was taken into account by using the free electron model. We found that the results varied significantly with the value chosen for the number of free electrons per atom.
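For reference, the quantities listed above are linked by the standard thermodynamic relations (the usual definitions, not formulas quoted from the thesis), where β is the volume expansion coefficient, B_T and B_S the isothermal and adiabatic bulk moduli, and γ the Grüneisen parameter:

```latex
C_P = C_V + T V \beta^2 B_T, \qquad
\frac{B_S}{B_T} = \frac{C_P}{C_V}, \qquad
\gamma = \frac{\beta V B_T}{C_V}.
```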
Abstract:
The effects of a diurnal sine-wave temperature cycle (25 ± 5° C) on the water-electrolyte status of goldfish, Carassius auratus, were assessed through determination of Na+, K+, Mg2+, Ca2+, Cl- and water content in plasma, red blood cells and muscle tissue. Animals were also acclimated to static temperatures (20° C, 25° C, 30° C) corresponding to the high, low and mid-point temperatures of the cycle. All groups were sampled at 03:00, 09:00, 15:00 and 21:00 hr. Hemoglobin content and packed cell volume, as well as electrolyte and water levels, were determined for each animal, and red cell ion concentrations and ion : hemoglobin ratios were estimated. Cycled animals were distinct from those at constant temperatures in several respects. Hematological parameters were elevated above those of animals at constant temperature and were, on a diurnal basis, more stable. Red blood cell electrolyte levels varied in an adaptively appropriate fashion with cycle temperature; this was not the case in the constant temperature groups. Under the cycling regime, plasma ion levels were more diurnally stable than those of constant temperature fish. Although muscle parameters in cycled fish exhibited more fluctuation than was observed in plasma, these also tended to be relatively more stable than was the case at constant temperature. Erythrocytic data are discussed in terms of their effects on hemoglobin-oxygen affinity, while plasma and muscle observations were considered from the standpoint of overall water-electrolyte balance. In general, cycled fish appeared to be capable of stabilizing overall body fluid composition, while simultaneously effecting adaptively appropriate modifications in the erythrocytic ionic microenvironment of hemoglobin. The sometimes marked diurnal variability of water-electrolyte status in animals held at constant temperature, as opposed to the conservation of cycled fish, suggests that this species is, in some fashion, programmed for regulation in a thermally fluctuating environment. If this interpretation is valid and a phenomenon of general occurrence, some earlier studies involving constant acclimation of eurythermal species normally occupying habitats which vary in temperature on a daily basis may require reconsideration.
Abstract:
We have presented a Green's function method for the calculation of the atomic mean square displacement (MSD) for an anharmonic Hamiltonian. This method effectively sums a whole class of anharmonic contributions to the MSD in the perturbation expansion in the high temperature limit. Using this formalism we have calculated the MSD for a nearest-neighbour fcc Lennard-Jones solid. The results show an improvement over the lowest order perturbation theory results; the difference with Monte Carlo calculations at temperatures close to melting is reduced from 11% to 3%. We also calculated the MSD for the alkali metals Na, K, Cs, where a sixth-neighbour interaction potential derived from pseudopotential theory was employed in the calculations. The MSD by this method increases by 2.5% to 3.5% over the respective perturbation theory results. The MSD was calculated for aluminum, where different pseudopotential functions and a phenomenological Morse potential were used. The results show that the pseudopotentials provide better agreement with experimental data than the Morse potential. An excellent agreement with experiment over the whole temperature range is achieved with the Harrison modified point-ion pseudopotential with the Hubbard-Sham screening function. We have calculated the thermodynamic properties of solid Kr by minimizing the total energy, consisting of static and vibrational components, employing different schemes: the quasiharmonic theory (QH), λ² and λ⁴ perturbation theory, all terms up to O(λ⁴) of the improved self-consistent phonon theory (ISC), the ring diagrams up to O(λ⁴) (RING), the iteration scheme (ITER) derived from the Green's function method, and a scheme consisting of ITER plus the remaining contributions of O(λ⁴) which are not included in ITER, which we call E(FULL). We have calculated the lattice constant, the volume expansion, the isothermal and adiabatic bulk modulus, the specific heat at constant volume and at constant pressure, and the Grüneisen parameter from two different potential functions: Lennard-Jones and Aziz. The Aziz potential generally gives better agreement with experimental data than the LJ potential for the QH, λ², λ⁴ and E(FULL) schemes. When only a partial sum of the λ⁴ diagrams is used in the calculations (e.g. RING and ISC), the LJ results are in better agreement with experiment. The iteration scheme brings a definitive improvement over the λ² PT for both potentials.
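For context, the lowest-order (harmonic) high-temperature baseline that the anharmonic Green's function corrections described above improve on is the standard classical-limit result (not quoted from the thesis), with M the atomic mass, N the number of atoms, and ω the phonon frequencies:

```latex
\langle u^{2}\rangle_{\mathrm{harm}} \;=\; \frac{k_{B} T}{M N}\,\sum_{\mathbf{q},j}\frac{1}{\omega_{\mathbf{q}j}^{2}}.
```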
Abstract:
The interaction of biological molecules with water is an important determinant of structural properties, both in molecular assemblies and in the conformation of individual macromolecules. By observing the effects of manipulating the activity of water (which can be accomplished by limiting its concentration or by adding additional solutes, "osmotic stress"), one can learn something about intrinsic physical properties of biological molecules, as well as measure the energetic contribution of closely associated water molecules to overall equilibria in biological reactions. Here two such studies are reported. The first of these examines several species of lysolipid which, while present in relatively low concentrations in biomembranes, have been shown to affect many cellular processes involving membrane-protein or membrane-membrane interactions. Monolayer elastic constants were determined by combining X-ray diffraction and the osmotic stress technique. Spontaneous radii of curvature of lysophosphatidylcholines were determined to be positive and in the range +30 Å to +70 Å, while lysophosphatidylethanolamines proved to be essentially flat. Neither lysolipid significantly affected the bending modulus of the monolayer in which it was incorporated. The second study examines the role of water in the process of polymerization of actin into filaments. Water activity was manipulated by adding osmolytes, and the effect on the equilibrium dissociation constant (measured as the critical monomer concentration) was determined. As water activity was decreased, the critical concentration was reduced for Ca-actin but not for Mg-actin, suggesting that 10-12 fewer water molecules are associated with Ca-actin in the polymerized state. This unexpectedly small amount of water is discussed in the context of the common structural motif of a nucleotide binding cleft.
Abstract:
To date there is no documented procedure for extrapolating findings of an isometric nature to a whole-body performance setting. The purpose of this study was to quantify the reliability of perceived exertion to control neuromuscular output during an isometric contraction. 21 varsity athletes completed a maximal voluntary contraction and a 2 min constant force contraction at both the start and end of the study. Between pre- and post-testing, all participants completed a 2 min constant perceived exertion contraction once a day for 4 days. The intra-class correlation coefficient (R = 0.949) and standard error of measurement (SEM = 5.12 Nm) indicated that the isometric contraction was reliable. Limits of agreement demonstrated only moderate initial reliability, yet with smaller limits towards the end of the 4 training sessions. In conclusion, athletes naïve to a constant effort isometric contraction will produce reliable and acceptably stable results after one familiarization session has been completed.
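For reference, the two reliability statistics reported above are linked by the usual relation (the standard definition, not a formula quoted from the thesis), where SD is the between-subject standard deviation of the measured torques:

```latex
\mathrm{SEM} = \mathrm{SD}\,\sqrt{1-\mathrm{ICC}}.
```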
Abstract:
This thesis introduces the Salmon Algorithm, a search meta-heuristic which can be used for a variety of combinatorial optimization problems. This algorithm is loosely based on the path finding behaviour of salmon swimming upstream to spawn. There are a number of tunable parameters in the algorithm, so experiments were conducted to find the optimum parameter settings for different search spaces. The algorithm was tested on one instance of the Traveling Salesman Problem and found to have superior performance to an Ant Colony Algorithm and a Genetic Algorithm. It was then tested on three coding theory problems - optimal edit codes, optimal Hamming distance codes, and optimal covering codes. The algorithm produced improvements on the best known values for five of six of the test cases using edit codes. It matched the best known results on four out of seven of the Hamming codes as well as three out of three of the covering codes. The results suggest the Salmon Algorithm is competitive with established guided random search techniques, and may be superior in some search spaces.
Abstract:
Understanding the machinery of gene regulation to control gene expression has been one of the main focuses of bioinformaticians for years. We use a multi-objective genetic algorithm to evolve a specialized version of side effect machines for degenerate motif discovery. We compare some suggested objectives for the motifs they find, test different multi-objective scoring schemes and probabilistic models for the background sequence models and report our results on a synthetic dataset and some biological benchmarking suites. We conclude with a comparison of our algorithm with some widely used motif discovery algorithms in the literature and suggest future directions for research in this area.
Abstract:
DNA assembly is among the most fundamental and difficult problems in bioinformatics. Near-optimal assembly solutions are available for bacterial and small genomes; however, assembling large and complex genomes, especially the human genome, using Next-Generation-Sequencing (NGS) technologies has proven very difficult because of the highly repetitive and complex nature of the human genome, short read lengths, uneven data coverage, and tools that are not specifically built for human genomes. Moreover, many algorithms are not even scalable to human genome datasets containing hundreds of millions of short reads. The DNA assembly problem is usually divided into several subproblems including DNA data error detection and correction, contig creation, scaffolding and contig orientation; each can be seen as a distinct research area. This thesis specifically focuses on creating contigs from the short reads and combining them with outputs from other tools in order to obtain better results. Three different assemblers, SOAPdenovo [Li09], Velvet [ZB08] and Meraculous [CHS+11], are selected for comparative purposes in this thesis. The obtained results show that this thesis's work produces results comparable to the other assemblers, and that combining our contigs with outputs from other tools produces the best results, outperforming all other investigated assemblers.
Abstract:
Ordered gene problems are a very common classification of optimization problems. Because of their popularity, countless algorithms have been developed in an attempt to find high quality solutions to them. It is also common to see many other types of problems reduced to ordered gene style problems, since many popular heuristics and metaheuristics exist for them. Multiple ordered gene problems are studied, namely the travelling salesman problem, the bin packing problem, and the graph colouring problem. In addition, two bioinformatics problems not traditionally seen as ordered gene problems are studied: DNA error correction and DNA fragment assembly. These problems are studied with multiple variations and combinations of heuristics and metaheuristics with two distinct types of representations. The majority of the algorithms are built around the Recentering-Restarting Genetic Algorithm. The algorithm variations were successful on all problems studied, particularly on the two bioinformatics problems. For DNA error correction, multiple cases were found in which 100% of the codes were corrected. The algorithm variations were also able to beat all other state-of-the-art DNA fragment assemblers on 13 out of 16 benchmark problem instances.
Abstract:
Understanding the relationship between genetic diseases and the genes associated with them is an important problem regarding human health. The vast amount of data created from a large number of high-throughput experiments performed in the last few years has resulted in an unprecedented growth in computational methods to tackle the disease gene association problem. Nowadays, it is clear that a genetic disease is not a consequence of a defect in a single gene. Instead, the disease phenotype is a reflection of various genetic components interacting in a complex network. In fact, genetic diseases, like any other phenotype, occur as a result of various genes working in sync with each other in a single or several biological module(s). Using a genetic algorithm, our method tries to evolve communities containing the set of potential disease genes likely to be involved in a given genetic disease. Having a set of known disease genes, we first obtain a protein-protein interaction (PPI) network containing all the known disease genes. All the other genes inside the procured PPI network are then considered as candidate disease genes as they lie in the vicinity of the known disease genes in the network. Our method attempts to find communities of potential disease genes strongly working with one another and with the set of known disease genes. As a proof of concept, we tested our approach on 16 breast cancer genes and 15 Parkinson's Disease genes. We obtained comparable or better results than CIPHER, ENDEAVOUR and GPEC, three of the most reliable and frequently used disease-gene ranking frameworks.
Abstract:
In this thesis we analyze dictionary graphs and some other kinds of graphs using the PageRank algorithm. We calculated the correlation between the degree and PageRank of all nodes for a graph obtained from the Merriam-Webster dictionary, a French dictionary, and the WordNet hypernym and synonym dictionaries. Our conclusion was that PageRank can be a good tool for comparing the quality of dictionaries. We also studied some artificial social and random graphs. We found that when we omitted some random nodes from each of the graphs, we did not notice any significant changes in the ranking of the remaining nodes according to their PageRank. We also discovered that some of the social graphs selected for our study were less resistant to such changes in PageRank.
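A minimal sketch of the degree-PageRank correlation analysis described above, using networkx with a random graph standing in for the dictionary graphs; the graph, damping factor, and choice of Pearson correlation are illustrative assumptions, not details from the thesis.

```python
import networkx as nx
import numpy as np

def degree_pagerank_correlation(G, alpha=0.85):
    """Pearson correlation between node degree and PageRank for graph G."""
    pr = nx.pagerank(G, alpha=alpha)              # PageRank with damping factor alpha
    nodes = list(G.nodes())
    degrees = np.array([G.degree(n) for n in nodes], dtype=float)
    ranks = np.array([pr[n] for n in nodes])
    return np.corrcoef(degrees, ranks)[0, 1]

# Illustrative usage on a random graph (the thesis uses dictionary and social graphs):
G = nx.erdos_renyi_graph(1000, 0.01, seed=1)
print(degree_pagerank_correlation(G))
```

Robustness to node removal, as studied in the thesis, could then be probed by deleting a random subset of nodes and comparing the rankings before and after.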