973 results for Computational biology


Relevance: 30.00%

Publisher:

Abstract:

Relationships between mineralization, collagen orientation and indentation modulus were investigated in bone structural units from the mid-shaft of human femora using a site-matched design. Mineral mass fraction, collagen fibril angle and indentation moduli were measured at registered anatomical sites using backscattered electron imaging, polarized light microscopy and nanoindentation, respectively. Theoretical indentation moduli were calculated with a homogenization model from the quantified mineral densities and mean collagen fibril orientations. The average indentation moduli predicted from local mineralization and collagen fiber arrangement were not significantly different from the average measured experimentally with nanoindentation (p=0.9). Surprisingly, no substantial correlation was found between the measured indentation moduli and tissue mineralization and/or collagen fiber arrangement. Nano-porosity, micro-damage, collagen cross-links, non-collagenous proteins or other parameters may affect the indentation measurements. Additional testing and simulation methods need to be considered to properly understand the variability of indentation moduli beyond mineralization and collagen arrangement in bone structural units.
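The comparison described above (predicted versus measured averages, and the check for correlation with mineralization) can be illustrated with a small analysis sketch. The data values and variable names below are invented for the example; only the kind of statistical comparison mirrors what the abstract describes.

```python
# Illustrative sketch (not the study's analysis): comparing model-predicted and
# measured indentation moduli, and checking correlation with mineralization.
# All numbers below are invented placeholder data.
import numpy as np
from scipy import stats

# Hypothetical site-matched measurements: one value per bone structural unit.
mineral_fraction  = np.array([0.62, 0.65, 0.68, 0.64, 0.66, 0.70])   # mineral mass fraction
measured_modulus  = np.array([18.2, 19.5, 20.1, 18.9, 19.3, 21.0])   # GPa, nanoindentation
predicted_modulus = np.array([18.5, 19.1, 20.4, 19.2, 19.0, 20.7])   # GPa, homogenization model

# Are the predicted and measured moduli significantly different on average?
t_stat, p_value = stats.ttest_rel(predicted_modulus, measured_modulus)
print(f"paired t-test p = {p_value:.2f}")

# How strongly does the measured modulus track local mineralization?
r, p_corr = stats.pearsonr(mineral_fraction, measured_modulus)
print(f"Pearson r = {r:.2f} (p = {p_corr:.2f})")
```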

Relevance: 30.00%

Publisher:

Abstract:

In attempts to elucidate the underlying mechanisms of spinal injuries and spinal deformities, several experimental and numerical studies have been conducted to understand the biomechanical behavior of the spine. However, numerical biomechanical studies suffer from uncertainties associated with hard- and soft-tissue anatomies. Currently, these parameters are identified manually on each mesh model prior to simulation. The determination of soft connective tissues on finite element meshes can be a tedious procedure, which limits the number of models used in numerical studies to a few instances. To address these limitations, an image-based method for automatic morphing of soft connective tissues has been proposed. Results showed that the proposed method can accurately determine the spatial locations of predetermined bony landmarks. The present method can be used to automatically generate patient-specific models, which may be helpful in designing studies involving a large number of instances and in understanding the mechanical behavior of biomechanical structures across a given population.
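As a simplified illustration of transferring predefined soft-tissue attachment points onto a new subject (this is not the authors' image-based morphing pipeline), the sketch below estimates a least-squares affine transform from matched reference points and applies it to template landmarks; all coordinates are invented.

```python
# Minimal sketch, assumed pipeline: transfer predefined landmark positions from
# a template mesh to a subject via an affine transform fitted to matched points.
import numpy as np

def fit_affine(src: np.ndarray, dst: np.ndarray) -> np.ndarray:
    """Estimate a 3D affine transform (4x4) mapping src points onto dst points."""
    src_h = np.hstack([src, np.ones((len(src), 1))])      # homogeneous coordinates
    params, *_ = np.linalg.lstsq(src_h, dst, rcond=None)  # (4x3) least-squares solution
    transform = np.eye(4)
    transform[:3, :] = params.T
    return transform

def apply_affine(transform: np.ndarray, points: np.ndarray) -> np.ndarray:
    pts_h = np.hstack([points, np.ones((len(points), 1))])
    return (transform @ pts_h.T).T[:, :3]

# Matched reference points on the template and on the subject (invented data).
template_refs = np.array([[0.0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10], [10, 10, 10]])
subject_refs  = np.array([[1.0, 2, 0], [11, 2, 1], [1, 12, 0], [1, 2, 11], [12, 13, 11]])

T = fit_affine(template_refs, subject_refs)
template_landmarks = np.array([[5.0, 5, 0], [2, 8, 3]])   # soft-tissue attachment sites on the template
print(apply_affine(T, template_landmarks))                # predicted subject-space locations
```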

Relevance: 30.00%

Publisher:

Abstract:

The potential and adaptive flexibility of Population Dynamics P systems (PDP) for studying population dynamics suggests that they may be suitable for modelling complex fluvial ecosystems, characterized by a composition of dynamic habitats with many variables that interact simultaneously. Using as a model a reservoir occupied by the zebra mussel Dreissena polymorpha, we designed a computational model based on P systems to study the population dynamics of larvae, in order to evaluate management actions to control or eradicate this invasive species. The population dynamics of this species were simulated under different scenarios, ranging from no change in water flow, through weekly variations with different flow rates, to the actual hydrodynamic situation of an intermediate flow rate. Our results show that PDP models can be very useful tools for modelling complex, partially desynchronized processes that work in parallel. This allows the study of complex hydroecological processes such as the one presented, where reproductive cycles, temperature and water dynamics are involved in the desynchronization of the population dynamics both within areas and among them. The results obtained may be useful in the management of other reservoirs with similar hydrodynamic situations in which the presence of this invasive species has been documented.
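As a rough illustration of comparing flow-management scenarios (this is not a P-system implementation, only a heavily simplified stand-in), the sketch below runs a discrete-time larval model under constant-flow, high-flow and weekly-alternating schedules; all rates, temperatures and thresholds are assumptions.

```python
# Minimal sketch, not a PDP model: a discrete-time larval population under
# different water-flow schedules. All parameter values are assumptions.
import numpy as np

def simulate(weeks: int, flow_per_week, temp_per_week,
             growth_rate=1.4, flow_mortality=0.002, spawn_temp=12.0):
    """Return weekly larval abundance under a given flow/temperature schedule."""
    larvae = 1000.0
    history = []
    for week in range(weeks):
        spawning = growth_rate if temp_per_week[week] >= spawn_temp else 1.0
        washout = np.exp(-flow_mortality * flow_per_week[week])  # flow-driven loss
        larvae = larvae * spawning * washout
        history.append(larvae)
    return np.array(history)

weeks = 26
temps = np.linspace(8, 24, weeks)                      # spring-to-summer warming
no_change   = simulate(weeks, np.full(weeks, 50.0), temps)
high_flow   = simulate(weeks, np.full(weeks, 400.0), temps)
weekly_vary = simulate(weeks, np.tile([50.0, 400.0], weeks // 2), temps)
print(no_change[-1], high_flow[-1], weekly_vary[-1])   # end-of-season abundances
```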

Relevance: 30.00%

Publisher:

Abstract:

The Iterative Closest Point (ICP) algorithm is commonly used in engineering applications to solve the rigid registration problem for partially overlapping point sets that are pre-aligned with a coarse estimate of their relative positions. This iterative algorithm is applied in many areas, such as medicine for volumetric reconstruction of tomography data, robotics for reconstructing surfaces or scenes from range-sensor information, industrial systems for quality control of manufactured objects, and even biology for studying the structure and folding of proteins. One of the algorithm's main problems is its high computational complexity (quadratic in the number of points for the non-optimized original variant) in a context where high-density point sets, acquired by high-resolution scanners, are processed. Many variants have been proposed in the literature that aim to improve performance, either by reducing the number of points or the required iterations, or by lowering the cost of the most expensive phase: the nearest-neighbor search. Despite reducing the complexity, some of these variants tend to have a negative impact on the final registration precision or on the convergence domain, which limits the possible application scenarios. The goal of this work is to improve the algorithm's computational cost so that a wider range of computationally demanding problems, from among those described above, can be addressed. For that purpose, an experimental and mathematical convergence analysis and validation of point-to-point distance metrics has been performed, considering distances with lower computational cost than the Euclidean distance, which is the de facto standard in implementations of the algorithm in the literature. In this analysis, the behavior of the algorithm in different topological spaces, characterized by different metrics, has been studied to assess the convergence, efficacy and cost of the method and to determine which metric offers the best results. Given that distance computation represents a significant part of the overall work performed by the algorithm, any reduction of this cost is expected to improve the overall performance of the method significantly. As a result, a performance improvement has been achieved by applying these reduced-cost metrics, whose quality in terms of convergence and error has been analyzed and experimentally validated as comparable to that of the Euclidean distance, using a heterogeneous set of objects, scenarios and initial situations.
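To make the "cheaper distance metric" idea concrete, here is a minimal point-to-point ICP sketch with a pluggable Minkowski metric in the closest-point search (p=1 for the cheaper Manhattan distance, p=2 for Euclidean). It is a toy illustration, not the implementation evaluated in this work.

```python
# Illustrative ICP sketch with a configurable L_p metric for nearest-neighbour
# matching; the rigid transform per iteration is estimated with the Kabsch method.
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, dst_c - R @ src_c

def icp(source, target, p=2, iterations=30):
    """Align source to target; nearest neighbours found under the L_p metric."""
    tree = cKDTree(target)
    current = source.copy()
    for _ in range(iterations):
        _, idx = tree.query(current, p=p)        # closest-point step
        R, t = best_rigid_transform(current, target[idx])
        current = current @ R.T + t              # apply incremental transform
    return current

# Toy usage: recover a small rotation and translation of a random cloud.
rng = np.random.default_rng(0)
target = rng.random((500, 3))
angle = np.deg2rad(10)
Rz = np.array([[np.cos(angle), -np.sin(angle), 0],
               [np.sin(angle),  np.cos(angle), 0],
               [0, 0, 1]])
source = target @ Rz.T + np.array([0.05, -0.02, 0.01])
aligned = icp(source, target, p=1)               # Manhattan-distance matching
print(np.abs(aligned - target).mean())           # residual misalignment
```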

Relevance: 30.00%

Publisher:

Abstract:

To ensure signalling fidelity, kinases must act only on a defined subset of cellular targets. Appreciating the basis for this substrate specificity is essential for understanding the role of an individual protein kinase in a particular cellular process. The specificity in the cell is determined by a combination of the peptide specificity of the kinase (the molecular recognition of the sequence surrounding the phosphorylation site), substrate recruitment and phosphatase activity. Peptide specificity plays a crucial role and depends on the complementarity between the kinase and the substrate, and therefore on their three-dimensional structures. Methods for experimental identification of kinase substrates and characterization of specificity are expensive and laborious; therefore, computational approaches are being developed to reduce the amount of experimental work required for substrate identification. We discuss the structural basis of substrate specificity of protein kinases and review the experimental and computational methods used to obtain specificity information. © 2005 Elsevier B.V. All rights reserved.
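As a concrete illustration of one common computational proxy for peptide specificity, the sketch below scores candidate phosphorylation sites against a position-specific preference matrix. The matrix values and peptides are invented for the example and are not taken from the review.

```python
# Minimal sketch, assumed scoring scheme: rank candidate phosphorylation sites
# by summed position-specific log-odds scores around the phospho-acceptor.
import numpy as np

# Toy preference matrix for positions -3..+3 around the acceptor residue:
# log-odds of observing a residue at each position (only a few residues shown).
positions = [-3, -2, -1, 0, +1, +2, +3]
pssm = {
    "R": [1.5, 0.8, 0.2, 0.0, 0.1, 0.0, 0.0],   # e.g. a basophilic kinase favouring R at -3
    "S": [0.0, 0.0, 0.0, 2.0, 0.0, 0.0, 0.0],   # serine acceptor at position 0
    "P": [0.0, 0.0, 0.0, 0.0, 1.2, 0.3, 0.0],   # proline preference at +1
}

def score_site(heptamer: str) -> float:
    """Sum log-odds scores over the 7-residue window centred on the acceptor."""
    return sum(pssm.get(aa, [0.0] * 7)[i] for i, aa in enumerate(heptamer))

for peptide in ["RKASPAG", "GGATAAG"]:
    print(peptide, score_site(peptide))
```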

Relevance: 30.00%

Publisher:

Abstract:

Systems biology is based on computational modelling and simulation of large networks of interacting components. Models may be intended to capture processes, mechanisms, components and interactions at different levels of fidelity. Input data are often large and geographically dispersed, and may require the computation to be moved to the data, not vice versa. In addition, complex system-level problems require collaboration across institutions and disciplines. Grid computing can offer robust, scalable solutions for distributed data, compute and expertise. We illustrate some of the range of computational and data requirements in systems biology with three case studies: one requiring large computation but small data (orthologue mapping in comparative genomics), a second involving complex terabyte data (the Visible Cell project) and a third that is both computationally and data-intensive (simulations at multiple temporal and spatial scales). Authentication, authorisation and audit systems currently do not scale well and may present bottlenecks for distributed collaboration, particularly where outcomes may be commercialised. Challenges remain in providing lightweight standards to facilitate the penetration of robust, scalable grid-type computing into diverse user communities to meet the evolving demands of systems biology.
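For the "large computation but small data" case, the underlying pattern is an embarrassingly parallel fan-out of independent comparisons. The sketch below is a single-machine stand-in for that pattern, with an invented placeholder comparison function; it is not the grid middleware discussed in the paper.

```python
# Minimal sketch, an assumption rather than the described grid setup: farm out
# many independent pairwise comparisons to worker processes instead of running
# them serially.
from concurrent.futures import ProcessPoolExecutor
from itertools import combinations

def similarity(pair):
    """Placeholder for an expensive pairwise comparison (e.g. sequence alignment)."""
    a, b = pair
    return a, b, sum(x == y for x, y in zip(a, b)) / max(len(a), len(b))

sequences = ["MKTAYIAKQR", "MKTAYLAKQR", "MSTAYIAKHR", "MKQAYIAKQR"]

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        for a, b, score in pool.map(similarity, combinations(sequences, 2)):
            print(f"{a} vs {b}: {score:.2f}")
```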

Relevance: 30.00%

Publisher:

Abstract:

Acknowledgments: We thank Sally Rowland for helpful comments on the manuscript. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.