923 results for computational algebra


Relevance: 20.00%

Abstract:

ESCRT-III proteins catalyze membrane fission during multivesicular body biogenesis, budding of some enveloped viruses and cell division. We suggest and analyze a novel mechanism of membrane fission by the mammalian ESCRT-III subunits CHMP2 and CHMP3. We propose that CHMP2-CHMP3 complexes self-assemble into hemispherical, dome-like structures within the necks of the initial membrane buds generated by CHMP4 filaments. Dome formation is accompanied by attachment of the membrane to the dome surface, which narrows the membrane neck and accumulates elastic stresses that ultimately lead to neck fission. Based on the bending elastic model of lipid bilayers, we determine the degree of membrane attachment to the dome that enables neck fission and compute the required values of the protein-membrane binding energy. We estimate the feasible values of this energy and predict a high efficiency for the CHMP2-CHMP3 complexes in mediating membrane fission. We support the computational model by electron tomography imaging of CHMP2-CHMP3 assemblies in vitro.
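For a rough sense of the energetics involved, the Python sketch below compares the Helfrich bending cost of wrapping a membrane over a hemispherical dome with the adhesion energy gained from protein-membrane attachment. The bending rigidity, dome radius and adhesion density are illustrative assumptions, not values from the study.

```python
# Minimal sketch (not the authors' model): order-of-magnitude comparison of the
# membrane bending cost of wrapping a hemispherical protein dome with the
# adhesion energy gained by attaching the membrane to the dome surface.
# All parameter values below are illustrative assumptions.

import numpy as np

KT = 4.1e-21          # thermal energy at room temperature, J
kappa = 20 * KT       # assumed bilayer bending rigidity, ~20 kT
R = 25e-9             # assumed dome radius, m

def bending_energy_cap(kappa, fraction=0.5):
    """Helfrich bending energy of a spherical cap covering `fraction` of a full
    sphere (0.5 = hemisphere), zero spontaneous curvature.  For a full sphere
    the classic result is 8*pi*kappa, independent of radius."""
    return 8 * np.pi * kappa * fraction

def adhesion_energy(w, R, fraction=0.5):
    """Energy gained by attaching the membrane to the dome surface with
    adhesion energy density w (J/m^2) over the covered area."""
    return w * 4 * np.pi * R**2 * fraction

E_bend = bending_energy_cap(kappa)
w_crit = E_bend / (2 * np.pi * R**2)       # adhesion density that just pays the bending cost
w_assumed = 0.5e-3                         # assumed protein-membrane adhesion density, J/m^2
E_adh = adhesion_energy(w_assumed, R)

print(f"bending cost of hemispherical wrap: {E_bend / KT:.1f} kT")
print(f"critical adhesion density: {w_crit * 1e3:.3f} mJ/m^2")
print(f"adhesion gain at {w_assumed * 1e3:.1f} mJ/m^2: {E_adh / KT:.1f} kT")
print("attachment favourable" if E_adh > E_bend else "attachment unfavourable")
```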

Relevance: 20.00%

Abstract:

Bayesian experimental design is a fast-growing area of research with many real-world applications. As computational power has increased over the years, so has the development of simulation-based design methods, which draw on algorithms such as Markov chain Monte Carlo, sequential Monte Carlo and approximate Bayes methods, allowing more complex design problems to be solved. The Bayesian framework provides a unified approach for combining prior information and/or uncertainties about the statistical model with a utility function that describes the experimental aims. In this paper, we provide a general overview of the concepts involved in Bayesian experimental design, focusing on the more commonly used Bayesian utility functions and methods for their estimation, as well as a number of algorithms used to search over the design space for the Bayesian optimal design. We also discuss other computational strategies for further research in Bayesian optimal design.
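As a concrete illustration of simulation-based design, the following Python sketch estimates the expected Kullback-Leibler utility (expected information gain) of a scalar design by nested Monte Carlo and searches a grid of candidate designs. The linear-Gaussian model and all parameter values are illustrative assumptions, not taken from the paper.

```python
# A minimal sketch of simulation-based Bayesian design: estimate the expected
# information gain of a one-dimensional design d by nested Monte Carlo, then
# pick the best d on a grid.  Model and priors are illustrative assumptions:
#   theta ~ N(0, 1),   y | theta, d ~ N(theta * d, sigma^2)

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
sigma = 1.0
N_outer, N_inner = 2000, 2000

def expected_information_gain(d):
    theta = rng.standard_normal(N_outer)          # outer prior draws
    y = rng.normal(theta * d, sigma)              # simulated data at design d
    log_lik = norm.logpdf(y, theta * d, sigma)    # log p(y | theta, d)
    # inner Monte Carlo estimate of the evidence log p(y | d)
    theta_in = rng.standard_normal(N_inner)
    log_ev = np.array([
        np.log(np.mean(norm.pdf(yi, theta_in * d, sigma))) for yi in y
    ])
    return np.mean(log_lik - log_ev)              # expected information gain

designs = np.linspace(0.1, 3.0, 15)
utilities = [expected_information_gain(d) for d in designs]
d_star = designs[int(np.argmax(utilities))]
print(f"estimated Bayesian optimal design: d = {d_star:.2f}")
```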

Relevance: 20.00%

Abstract:

Thin plate spline finite element methods are used to fit a surface to an irregularly scattered dataset [S. Roberts, M. Hegland, and I. Altas. Approximation of a Thin Plate Spline Smoother using Continuous Piecewise Polynomial Functions. SIAM J. Numer. Anal., 41(1):208--234, 2003]. The computational bottleneck for this algorithm is the solution of large, ill-conditioned systems of linear equations at each step of a generalised cross-validation algorithm. Preconditioning techniques are investigated to accelerate the convergence of Krylov subspace methods applied to these systems. The preconditioners under consideration are block diagonal, block triangular and constraint preconditioners [M. Benzi, G. H. Golub, and J. Liesen. Numerical solution of saddle point problems. Acta Numer., 14:1--137, 2005]. The effectiveness of each of these preconditioners is examined on a sample dataset taken from a known surface. From our numerical investigation, constraint preconditioners appear to provide improved convergence for this surface fitting problem compared with block preconditioners.
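The sketch below illustrates the general setting on a toy problem: a small symmetric saddle-point system solved with MINRES and a block-diagonal preconditioner built from the (1,1) block and an approximate Schur complement. The matrices are random stand-ins, not the thin plate spline discretisation used in the paper.

```python
# A small sketch of a saddle-point system
#   [A  B^T] [x]   [f]
#   [B  0  ] [y] = [g]
# solved with MINRES and a block-diagonal preconditioner diag(A, B A^{-1} B^T).
# Sizes and matrices are random stand-ins for the surface-fitting problem.

import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

rng = np.random.default_rng(1)
n, m = 200, 40
A = sp.diags(rng.uniform(1.0, 10.0, n))             # SPD (1,1) block
B = sp.random(m, n, density=0.1, random_state=1)    # constraint block
K = sp.bmat([[A, B.T], [B, None]], format="csr")    # symmetric indefinite matrix
rhs = rng.standard_normal(n + m)

# Block-diagonal preconditioner: exact A block, approximate Schur complement
A_inv = sp.diags(1.0 / A.diagonal())
S = (B @ A_inv @ B.T).toarray()                     # small dense Schur complement
S_inv = np.linalg.inv(S + 1e-10 * np.eye(m))

def apply_prec(v):
    return np.concatenate([A_inv @ v[:n], S_inv @ v[n:]])

M = spla.LinearOperator((n + m, n + m), matvec=apply_prec)

it = 0
def count(xk):
    global it
    it += 1

sol, info = spla.minres(K, rhs, M=M, callback=count)
print(f"MINRES finished (info={info}) in {it} iterations")
```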

Relevance: 20.00%

Abstract:

Computational models in physiology often integrate functional and structural information across a large range of spatio-temporal scales, from the ionic to the whole-organ level. Their sophistication raises both expectations and scepticism concerning how computational methods can improve our understanding of living organisms and how they can reduce, replace and refine animal experiments. A fundamental requirement for fulfilling these expectations and achieving the full potential of computational physiology is a clear understanding of what models represent and how they can be validated. The present study aims to inform strategies for validation by elucidating the complex interrelations between experiments, models and simulations in cardiac electrophysiology. We describe the processes, data and knowledge involved in the construction of whole ventricular multiscale models of cardiac electrophysiology. Our analysis reveals that models, simulations and experiments are intertwined in an assemblage that is a system itself, namely the model-simulation-experiment (MSE) system. Validation must therefore take into account the complex interplay between models, simulations and experiments. Key points for developing strategies for validation are: 1) understanding sources of bio-variability is crucial to the comparison between simulation and experimental results; 2) robustness of techniques and tools is a prerequisite for conducting physiological investigations with the MSE system; 3) the definition and adoption of standards facilitates the interoperability of experiments, models and simulations; 4) physiological validation must be understood as an iterative process that defines the specific aspects of electrophysiology the MSE system targets, and that is driven by advances in experimental and computational methods and by the combination of both.

Relevance: 20.00%

Abstract:

Bone morphogenetic proteins (BMPs) are distributed along a dorsal-ventral (DV) gradient in many developing embryos. The spatial distribution of this signaling ligand is critical for correct DV axis specification. In various species, BMP expression is spatially localized, and BMP gradient formation relies on BMP transport, which in turn requires interactions with the extracellular proteins Short gastrulation/Chordin (Chd) and Twisted gastrulation (Tsg). These binding interactions promote BMP movement and concomitantly inhibit BMP signaling. The protease Tolloid (Tld) cleaves Chd, which releases BMP from the complex and permits it to bind the BMP receptor and signal. In sea urchin embryos, BMP is produced in the ventral ectoderm but signals in the dorsal ectoderm, and the transport of BMP from the ventral to the dorsal ectoderm is not understood. Therefore, using information from a series of experiments, we adapt the mathematical model of Mizutani et al. (2005), embed it as the reaction part of a one-dimensional reaction–diffusion model, and use it to study aspects of this transport process in sea urchin embryos. We demonstrate that the receptor-bound BMP concentration exhibits dorsally centered peaks of the type observed experimentally when the ternary transport complex (Chd-Tsg-BMP) forms relatively quickly and BMP receptor binding is relatively slow. Dorsally centered peaks also arise when the diffusivities of BMP, Chd, and Chd-Tsg are relatively low and that of Chd-Tsg-BMP is relatively high, and the model dynamics further suggest that Tld is a principal regulator of the system. At the end of the paper, we briefly compare the observed dynamics in the sea urchin model with a version that applies to the fly embryo, and we find that the same conditions can account for BMP transport in the two types of embryo only if Tld levels are reduced in sea urchin compared to fly.
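The following Python sketch shows the bare structure of such a one-dimensional reaction-diffusion computation (method of lines with explicit time stepping and no-flux boundaries). The two species and the reaction terms are simplified placeholders; the full Mizutani et al. (2005) network, with spatially varying Chd and Tld, would be needed to reproduce the dorsally centred peaks described above.

```python
# A bare-bones sketch of the kind of one-dimensional reaction-diffusion solver
# the abstract describes.  The two species and the reaction terms below are
# simplified placeholders, not the Mizutani et al. (2005) reaction network.

import numpy as np

L, nx = 1.0, 100                 # ventral (x=0) to dorsal (x=1) axis
dx = L / (nx - 1)
x = np.linspace(0.0, L, nx)

D_bmp, D_complex = 0.01, 1.0     # assumed diffusivities: free BMP slow, complex fast
k_on, k_tld, k_deg = 5.0, 1.0, 1.0          # assumed binding, cleavage, decay rates
production = np.where(x < 0.2, 1.0, 0.0)    # BMP produced in the ventral ectoderm

bmp = np.zeros(nx)
cplx = np.zeros(nx)

def laplacian(u):
    """Second difference with zero-flux (Neumann) boundaries."""
    lap = np.empty_like(u)
    lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2
    lap[0] = 2 * (u[1] - u[0]) / dx**2
    lap[-1] = 2 * (u[-2] - u[-1]) / dx**2
    return lap

dt = 0.2 * dx**2 / max(D_bmp, D_complex)    # explicit stability limit
for _ in range(100_000):
    bind = k_on * bmp                       # placeholder: Chd-Tsg assumed abundant
    release = k_tld * cplx                  # Tolloid cleavage frees BMP
    bmp += dt * (D_bmp * laplacian(bmp) + production - bind + release - k_deg * bmp)
    cplx += dt * (D_complex * laplacian(cplx) + bind - release)

# With these placeholder kinetics no dorsal peak is expected; the printout just
# reports the ventral and dorsal ends of the free-BMP profile.
print("free BMP, ventral vs dorsal end:", bmp[0], bmp[-1])
```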

Relevance: 20.00%

Abstract:

In this chapter, we draw out the relevant themes from the small body of critical scholarship in digital media and software studies that has focused on the politics of Twitter data and the sociotechnical means by which access to them is regulated. We highlight in particular the contested relationships between social media research (in both academic and non-academic contexts) and the data wholesale, retail, and analytics industries that feed on these data. In the second major section of the chapter we discuss in detail the pragmatic edge of these politics in terms of what kinds of scientific research are and are not possible in the current political economy of Twitter data access. Finally, at the end of the chapter we return to the much broader implications of these issues for the politics of knowledge, demonstrating how the apparently microscopic level at which the Twitter API mediates access to Twitter data inscribes and influences the macro level of the global political economy of science itself, by re-inscribing institutional and traditional disciplinary privilege. We conclude with some speculations about future developments in data rights and data philanthropy that may at least mitigate some of these negative impacts.

Relevance: 20.00%

Abstract:

Computational epigenetics is a new area of research focused on exploring how DNA methylation patterns affect transcription factor binding and, in turn, gene expression. The aim of this study was to produce a new protocol for the detection of DNA methylation patterns using computational analysis, which can then be confirmed by bisulfite PCR with serial pyrosequencing. The upstream regulatory element and pre-initiation complex relative to CpG islets within the methylenetetrahydrofolate reductase gene were determined via computational analysis and online databases. A 1,104 bp CpG island located at or near the alternative promoter site of the methylenetetrahydrofolate reductase gene was identified. The CpG plot indicated that CpG islets A and B within the island had GC contents of 62 and 75% and CpG ratios of 0.70 and 0.80–0.95, respectively. Further examination of CpG islets A and B indicated that the transcription start sites were GGC and that TATA boxes were absent. In addition, although six PROSITE motifs were identified in CpG B, no motifs were detected in CpG A. A number of cis-regulatory elements were found in different regions within CpGs A and B. Transcription factors were predicted to bind to CpGs A and B with varying affinities depending on the DNA methylation status. In addition, transcription factor binding may influence the expression patterns of the methylenetetrahydrofolate reductase gene by recruiting chromatin-condensation-inducing factors. These results have significant implications for understanding the architecture of transcription factor binding at CpG islets, as well as the DNA methylation patterns that affect chromatin structure.
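As an illustration of the standard CpG-island statistics mentioned above (GC content and observed/expected CpG ratio), the following Python sketch scans a sequence in sliding windows using the usual Gardiner-Garden and Frommer thresholds. The input sequence is a made-up example, not the MTHFR promoter region analysed in the study.

```python
# A small sketch of standard CpG-island statistics (GC content and the
# observed/expected CpG ratio), computed over sliding windows.
# The example sequence is synthetic, not the MTHFR promoter region.

from collections import Counter

def gc_content(seq):
    counts = Counter(seq.upper())
    return (counts["G"] + counts["C"]) / len(seq)

def cpg_obs_exp(seq):
    """Observed/expected CpG ratio: CpG count * length / (C count * G count)."""
    seq = seq.upper()
    c, g = seq.count("C"), seq.count("G")
    if c == 0 or g == 0:
        return 0.0
    return seq.count("CG") * len(seq) / (c * g)

def scan(seq, window=200, step=50):
    """Report windows meeting the usual CpG-island criteria (GC > 0.5, ratio > 0.6)."""
    for start in range(0, len(seq) - window + 1, step):
        w = seq[start:start + window]
        gc, ratio = gc_content(w), cpg_obs_exp(w)
        if gc > 0.5 and ratio > 0.6:
            print(f"{start:5d}-{start + window:5d}  GC={gc:.2f}  obs/exp CpG={ratio:.2f}")

example = "CGCGGGCCGCGCGATCGCGGCGCGCCGCGTACGCGCGGGCGCGC" * 10
scan(example)
```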

Relevance: 20.00%

Abstract:

Magnetic resonance is a well-established tool for structural characterisation of porous media. Features of pore-space morphology can be inferred from NMR diffusion-diffraction plots or the time-dependence of the apparent diffusion coefficient. Diffusion NMR signal attenuation can be computed from the restricted diffusion propagator, which describes the distribution of diffusing particles for a given starting position and diffusion time. We present two techniques for efficient evaluation of restricted diffusion propagators for use in NMR porous-media characterisation. The first is the Lattice Path Count (LPC). Its physical essence is that the restricted diffusion propagator connecting points A and B in time t is proportional to the number of distinct length-t paths from A to B. By using a discrete lattice, the number of such paths can be counted exactly. The second technique is the Markov transition matrix (MTM). The matrix represents the probabilities of jumps between every pair of lattice nodes within a single timestep. The propagator for an arbitrary diffusion time can be calculated as the appropriate matrix power. For periodic geometries, the transition matrix needs to be defined only for a single unit cell. This makes MTM ideally suited for periodic systems. Both LPC and MTM are closely related to existing computational techniques: LPC, to combinatorial techniques; and MTM, to the Fokker-Planck master equation. The relationship between LPC, MTM and other computational techniques is briefly discussed in the paper. Both LPC and MTM perform favourably compared to Monte Carlo sampling, yielding highly accurate and almost noiseless restricted diffusion propagators. Initial tests indicate that their computational performance is comparable to that of finite element methods. Both LPC and MTM can be applied to complicated pore-space geometries with no analytic solution. We discuss the new methods in the context of diffusion propagator calculation in porous materials and model biological tissues.
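The sketch below illustrates the MTM idea in its simplest form: a one-dimensional pore discretised into lattice nodes, a nearest-neighbour jump matrix with reflecting walls, and the restricted diffusion propagator obtained as a matrix power. The geometry and jump probability are illustrative assumptions, not parameters from the paper.

```python
# A minimal sketch of the Markov transition matrix (MTM) idea for a restricted
# diffusion propagator: a 1-D pore discretised into N nodes, a nearest-neighbour
# jump matrix with reflecting walls, and the propagator for n timesteps obtained
# as the n-th matrix power.  Geometry and parameters are illustrative.

import numpy as np

N = 101                       # lattice nodes across the pore
p = 0.5                       # per-step probability of attempting a jump

# Column-stochastic transition matrix: column j = starting node, row i = destination.
# Stay with prob 1-p, jump to each neighbour with prob p/2; outward jumps at the
# walls are reflected back onto the boundary node.
T = np.zeros((N, N))
for j in range(N):
    T[j, j] = 1.0 - p
    if j > 0:
        T[j - 1, j] = p / 2
    else:
        T[0, 0] += p / 2             # reflection at the left wall
    if j < N - 1:
        T[j + 1, j] = p / 2
    else:
        T[N - 1, N - 1] += p / 2     # reflection at the right wall

# Propagator after n steps from a point source at the centre of the pore
n_steps = 20_000
P = np.linalg.matrix_power(T, n_steps)
source = np.zeros(N)
source[N // 2] = 1.0
propagator = P @ source

print("total probability (should be 1):", propagator.sum())
print("long-time limit is the uniform distribution 1/N =", 1 / N)
print("propagator at wall and centre:", propagator[0], propagator[N // 2])
```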

Relevance: 20.00%

Abstract:

Characterization of the human epigenetic profile since the initial breakthrough of the human genome project has firmly established the key role of histone modifications and DNA methylation. These dynamic elements interact to determine the normal level of expression or methylation status of the constituent genes in the genome. Recently, considerable evidence has been put forward to demonstrate that environmental stress alters epigenetic patterns, causing imbalances that can lead to cancer initiation. This chain of consequences has motivated attempts to computationally model the influence of histone modifications and DNA methylation on gene expression and to investigate their intrinsic interdependency. In this paper, we explore the relation between DNA methylation and transcription and characterize in detail the histone modifications associated with specific DNA methylation levels using a stochastic approach.

Relevance: 20.00%

Abstract:

Over the last few years, investigations of human epigenetic profiles have identified histone modifications, stable and heritable DNA methylation, and chromatin remodelling as key elements of change. These factors determine gene expression levels and characterise conditions leading to disease. Data mining and pattern recognition tools are widely used to extract information embedded in long DNA sequences, but efforts to date have been limited with respect to analysing epigenetic changes and their role as catalysts in disease onset. Useful insight, however, can be gained by investigating the associated dinucleotide distributions. The focus of this paper is to explore the frequencies of specific dinucleotides across defined regions of the human genome, and to identify new patterns linking epigenetic mechanisms and DNA content. Signal processing methods, including Fourier and wavelet transforms, are employed and the principal results are reported.
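As a minimal illustration of this kind of analysis, the Python sketch below tabulates overlapping dinucleotide frequencies and computes the Fourier spectrum of a CpG occurrence signal (wavelet analysis is omitted for brevity). The sequence is randomly generated, not a region of the human genome.

```python
# A short sketch: dinucleotide frequencies over a sequence, plus the Fourier
# spectrum of a CpG indicator signal.  The sequence is a random example.

import numpy as np
from collections import Counter
from itertools import product

rng = np.random.default_rng(2)
seq = "".join(rng.choice(list("ACGT"), size=4096))

# Dinucleotide frequency table (overlapping dinucleotides)
pairs = [seq[i:i + 2] for i in range(len(seq) - 1)]
counts = Counter(pairs)
total = len(pairs)
for a, b in product("ACGT", repeat=2):
    dinuc = a + b
    print(f"{dinuc}: {counts[dinuc] / total:.4f}", end="  ")
print()

# Fourier spectrum of the CpG occurrence signal (1 where a CpG starts, else 0)
cpg_signal = np.array([1.0 if p == "CG" else 0.0 for p in pairs])
spectrum = np.abs(np.fft.rfft(cpg_signal - cpg_signal.mean()))
freqs = np.fft.rfftfreq(len(cpg_signal))
peak = freqs[1 + int(np.argmax(spectrum[1:]))]
print(f"strongest periodic component at frequency {peak:.4f} cycles/bp")
```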

Relevance: 20.00%

Abstract:

Systems-level identification and analysis of cellular circuits in the brain will require the development of whole-brain imaging with single-cell resolution. To this end, we performed comprehensive chemical screening to develop a whole-brain clearing and imaging method, termed CUBIC (clear, unobstructed brain imaging cocktails and computational analysis). CUBIC is a simple and efficient method involving the immersion of brain samples in chemical mixtures containing aminoalcohols, which enables rapid whole-brain imaging with single-photon excitation microscopy. CUBIC is applicable to multicolor imaging of fluorescent proteins or immunostained samples in adult brains and is scalable from a primate brain to subcellular structures. We also developed a whole-brain cell-nuclear counterstaining protocol and a computational image analysis pipeline that, together with CUBIC reagents, enable the visualization and quantification of neural activities induced by environmental stimulation. CUBIC enables time-course expression profiling of whole adult brains with single-cell resolution.

Relevance: 20.00%

Abstract:

Purpose: In structural, earthquake and aeronautical engineering and in mechanical vibration, the solution of the dynamic equations for a structure subjected to dynamic loading leads to a high-order system of differential equations. Numerical methods are usually used for integration when one is dealing with discrete data or when no analytical solution for the equations exists. Since numerical methods with greater accuracy and stability give more accurate structural responses, there is a need to improve existing methods or develop new ones. The paper aims to discuss these issues.

Design/methodology/approach: In this paper, a new time integration method is developed mathematically and numerically and applied to single-degree-of-freedom (SDOF) and multi-degree-of-freedom (MDOF) systems. The results are compared with existing methods, such as Newmark's method, and with the closed-form solution.

Findings: In the proposed method, the data variance of each set of structural responses (displacement, velocity or acceleration) across time steps is less than in Newmark's method; the proposed method is more accurate and stable than Newmark's method and can analyse the structure with fewer iterations or computation cycles, and is hence less time-consuming.

Originality/value: A new mathematical and numerical time integration method is proposed for the computation of structural responses with higher accuracy and stability, lower data variance, and fewer iterations or computation cycles.
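For reference, the sketch below implements the classical Newmark average-acceleration scheme for a damped SDOF system, i.e. the benchmark the paper compares against; the proposed method itself is not reproduced here, and the system parameters are illustrative.

```python
# Classical Newmark-beta (average acceleration) integration of a
# single-degree-of-freedom system m*u'' + c*u' + k*u = p(t).  This is the
# standard benchmark scheme, not the method proposed in the paper;
# parameters are illustrative.

import numpy as np

m, c, k = 1.0, 0.1, 4.0 * np.pi**2        # unit mass, light damping, 1 Hz stiffness
beta, gamma = 0.25, 0.5                   # average-acceleration Newmark constants
dt, n_steps = 0.01, 1000
p = lambda t: np.sin(3.0 * t)             # external load

u, v = 0.0, 0.0                           # initial displacement and velocity
a = (p(0.0) - c * v - k * u) / m          # initial acceleration from equilibrium

k_eff = k + gamma / (beta * dt) * c + 1.0 / (beta * dt**2) * m
history = []
for i in range(1, n_steps + 1):
    t = i * dt
    # effective load built from the known state at the previous step
    p_eff = (p(t)
             + m * (u / (beta * dt**2) + v / (beta * dt) + (1 / (2 * beta) - 1) * a)
             + c * (gamma / (beta * dt) * u + (gamma / beta - 1) * v
                    + dt * (gamma / (2 * beta) - 1) * a))
    u_new = p_eff / k_eff
    v_new = (gamma / (beta * dt) * (u_new - u)
             + (1 - gamma / beta) * v + dt * (1 - gamma / (2 * beta)) * a)
    a_new = (u_new - u) / (beta * dt**2) - v / (beta * dt) - (1 / (2 * beta) - 1) * a
    u, v, a = u_new, v_new, a_new
    history.append(u)

print(f"peak displacement over {n_steps * dt:.0f} s: {max(abs(x) for x in history):.4f}")
```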