Abstract:
Recently, numerical modelling and simulation of fractional partial differential equations (FPDEs), which have found wide application in modern engineering and science, have been attracting increased attention. The currently dominant numerical method for modelling FPDEs is the explicit Finite Difference Method (FDM), which is based on a pre-defined grid and therefore has inherent shortcomings. This paper aims to develop an implicit meshless approach based on radial basis functions (RBFs) for the numerical simulation of time-fractional diffusion equations. The discrete system of equations is obtained using the RBF meshless shape functions and the strong form of the governing equations. The stability and convergence of this meshless approach are then discussed and theoretically proven. Several numerical examples with different problem domains are used to validate the newly developed meshless formulation and investigate its accuracy and efficiency. The results obtained by the meshless formulation are also compared with those obtained by the FDM in terms of accuracy and efficiency. It is concluded that the present meshless formulation is very effective for the modelling and simulation of FPDEs.
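As a minimal illustration of the collocation idea underlying RBF meshless methods (this is not the authors' time-fractional scheme), the following sketch interpolates a 1-D field using multiquadric basis functions; the shape parameter `c`, the node layout, and the test field are illustrative assumptions.

```python
import numpy as np

# Multiquadric radial basis function; shape parameter c is an illustrative choice.
def multiquadric(r, c=0.5):
    return np.sqrt(r**2 + c**2)

nodes = np.linspace(0.0, 1.0, 11)        # nodes (uniform here for simplicity)
f = np.sin(np.pi * nodes)                # field values to interpolate

# Collocation matrix: basis function evaluated at all node pairs.
A = multiquadric(np.abs(nodes[:, None] - nodes[None, :]))
coeffs = np.linalg.solve(A, f)           # expansion coefficients

def interpolate(x):
    """Evaluate the RBF interpolant at point x."""
    return multiquadric(np.abs(x - nodes)) @ coeffs
```

The interpolant reproduces the nodal values exactly (up to round-off); in a strong-form meshless method, derivatives of the same shape functions are collocated against the governing equation instead.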
Abstract:
Computational journalism involves the application of software and technologies to the activities of journalism, and it draws from the fields of computer science, the social sciences, and media and communications. New technologies may enhance the traditional aims of journalism, or may initiate greater interaction between journalists and information and communication technology (ICT) specialists. The enhanced use of computing in news production is related in particular to three factors: larger government data sets becoming more widely available; the increasingly sophisticated and ubiquitous nature of software; and the developing digital economy. Drawing upon international examples, this paper argues that computational journalism techniques may provide new foundations for original investigative journalism and increase the scope for new forms of interaction with readers. Computational journalism provides a major opportunity to enhance the delivery of original investigative journalism, and to attract and retain readers online.
Abstract:
This chapter focuses on the interplay between delays and intrinsic noise effects within cellular pathways and regulatory networks. We address these aspects by focusing on genetic regulatory networks that share a common network motif, namely the negative feedback loop, leading to oscillatory gene expression and protein levels. In this context, we discuss computational simulation algorithms for addressing the interplay of delays and noise within signaling pathways based on biological data. We address implementation issues associated with efficiency and robustness. In a molecular biology setting we present two case studies of temporal models: the Hes1 gene (Monk, 2003; Hirata et al., 2002), known to act as a molecular clock, and the Her1/Her7 regulatory system controlling the periodic somite segmentation in vertebrate embryos (Giudicelli and Lewis, 2004; Horikawa et al., 2006).
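The delayed negative feedback motif can be sketched as a deterministic delay model of Hes1-style regulation: protein represses its own mRNA production after a transcription/translation delay. The sketch below uses a simple Euler scheme with a history buffer; all parameter values are illustrative, not the fitted values from the cited studies.

```python
import numpy as np

# Illustrative parameters (assumptions, not the published fitted values).
tau, p0, hill = 20.0, 100.0, 5.0     # delay, repression threshold, Hill coefficient
mu_m, mu_p = 0.03, 0.03              # mRNA / protein degradation rates
dt, T = 0.05, 500.0
steps = int(T / dt)
lag = int(tau / dt)                  # delay expressed in time steps

m = np.empty(steps + 1)              # mRNA level
p = np.empty(steps + 1)              # protein level
m[0] = p[0] = 1.0
for i in range(steps):
    # protein level tau time units ago (constant history before t = 0)
    p_delayed = p[i - lag] if i >= lag else p[0]
    dm = 1.0 / (1.0 + (p_delayed / p0) ** hill) - mu_m * m[i]  # repressed production
    dp = m[i] - mu_p * p[i]                                    # translation and decay
    m[i + 1] = m[i] + dt * dm
    p[i + 1] = p[i] + dt * dp
```

A stochastic treatment of the same motif would replace the Euler updates with a delayed stochastic simulation algorithm, which is where the efficiency and robustness issues discussed in the chapter arise.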
Abstract:
We report on analysis of discussions in an online community of people with chronic illness using socio-cognitively motivated, automatically produced semantic spaces. The analysis aims to further the emerging theory of "transition" (how people can learn to incorporate the consequences of illness into their lives). An automatically derived representation of sense of self for individuals is created in the semantic space by the analysis of the email utterances of the community members. The movement over time of the sense of self is visualised, via projection, with respect to axes of "ordinariness" and "extra-ordinariness". Qualitative evaluation shows that the visualisation parallels the transitions of people during the course of their illness. The research aims to advance tools for the analysis of textual data to promote greater use of tacit knowledge as found in online virtual communities. We hope it also encourages further interest in the representation of sense-of-self.
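A hedged sketch of how such a semantic space can be derived: a term-by-utterance count matrix is factorised with the SVD (as in latent semantic analysis), each utterance is projected to low-dimensional coordinates, and positions can then be read off along a chosen axis. The toy matrix, the two-dimensional truncation, and the axis construction are all assumptions for illustration.

```python
import numpy as np

# Toy term-by-utterance count matrix (rows: terms, columns: utterances).
X = np.array([
    [2, 0, 1, 0],
    [1, 1, 0, 0],
    [0, 2, 0, 1],
    [0, 0, 1, 2],
], dtype=float)

U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
# Each row: one utterance's coordinates in the k-dimensional semantic space.
doc_coords = (np.diag(s[:k]) @ Vt[:k]).T

# Hypothetical "ordinariness" axis: the direction between two reference
# utterances; projecting onto it gives a scalar position per utterance.
axis = doc_coords[0] - doc_coords[3]
positions = doc_coords @ axis / np.linalg.norm(axis)
```

Tracking `positions` for one author's utterances over time is the kind of trajectory the visualisation in the paper displays.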
Abstract:
CTAC2012 was the 16th biennial Computational Techniques and Applications Conference, and took place at Queensland University of Technology from 23–26 September 2012. The ANZIAM Special Interest Group in Computational Techniques and Applications is responsible for the CTAC meetings, the first of which was held in 1981.
Abstract:
Unsaturated water flow in soil is commonly modelled using Richards’ equation, which requires the hydraulic properties of the soil (e.g., porosity, hydraulic conductivity, etc.) to be characterised. Naturally occurring soils, however, are heterogeneous in nature, that is, they are composed of a number of interwoven homogeneous soils, each with their own set of hydraulic properties. When the length scale of these soil heterogeneities is small, numerical solution of Richards’ equation is computationally impractical due to the immense effort and refinement required to mesh the actual heterogeneous geometry. A classic way forward is to use a macroscopic model, where the heterogeneous medium is replaced with a fictitious homogeneous medium that attempts to give the average flow behaviour at the macroscopic scale (i.e., at a scale much larger than the scale of the heterogeneities). Using homogenisation theory, a macroscopic equation can be derived that takes the form of Richards’ equation with effective parameters. A disadvantage of the macroscopic approach, however, is that it fails in cases where the assumption of local equilibrium does not hold. This limitation has seen the introduction of two-scale models that include, at each point in the macroscopic domain, an additional flow equation at the scale of the heterogeneities (the microscopic scale). This report outlines a well-known two-scale model and contributes to the literature a number of important advances in its numerical implementation. These include the use of an unstructured control volume finite element method and image-based meshing techniques, which allow irregular micro-scale geometries to be treated, and the use of an exponential time integration scheme that permits both scales to be resolved simultaneously in a completely coupled manner.
Numerical comparisons against a classical macroscopic model confirm that only the two-scale model correctly captures the important features of the flow for a range of parameter values.
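The exponential time integration idea can be sketched on a linear model problem (the full two-scale Richards system is nonlinear and far more involved): for a semi-discretised diffusion equation u' = Au, one exponential Euler step u(t) = expm(tA) u(0) is exact. The grid size and step below are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import expm

n = 20                       # interior nodes on (0, 1), Dirichlet boundaries
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)

# Standard second-order finite difference Laplacian.
A = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1)) / h**2

u0 = np.sin(np.pi * x)       # an eigenvector of the discrete Laplacian
t = 0.01
u = expm(t * A) @ u0         # exponential integrator step: exact for linear problems
```

In practice, for large systems the matrix exponential acting on a vector is approximated with Krylov subspace techniques rather than formed explicitly, which is what makes the fully coupled two-scale solve feasible.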
Abstract:
As the level of autonomy in Unmanned Aircraft Systems (UAS) increases, there is an imperative need for developing methods to assess robust autonomy. This paper focuses on the computations that lead to a set of measures of robust autonomy. These measures are the probabilities that selected performance indices related to the mission requirements and airframe capabilities remain within regions of acceptable performance.
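Such a measure can be estimated by Monte Carlo sampling: draw realisations of the uncertain conditions, evaluate the performance index for each, and count the fraction that remains inside the region of acceptable performance. The Gaussian index model and the ±1.96 acceptance band below are purely illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 100_000

# Hypothetical performance index (e.g. a normalised tracking error)
# evaluated under sampled uncertainty in mission and airframe conditions.
index = rng.standard_normal(n_samples)

# Region of acceptable performance: |index| < 1.96 (illustrative bound).
acceptable = np.abs(index) < 1.96
p_robust = acceptable.mean()   # estimated probability of acceptable performance
```

Repeating this for each performance index yields the set of probabilities that constitute the measures of robust autonomy.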
Abstract:
ESCRT-III proteins catalyze membrane fission during multivesicular body biogenesis, budding of some enveloped viruses, and cell division. We suggest and analyze a novel mechanism of membrane fission by the mammalian ESCRT-III subunits CHMP2 and CHMP3. We propose that CHMP2-CHMP3 complexes self-assemble into hemispherical dome-like structures within the necks of the initial membrane buds generated by CHMP4 filaments. Dome formation is accompanied by attachment of the membrane to the dome surface, which drives narrowing of the membrane neck and accumulation of elastic stresses leading, ultimately, to neck fission. Based on the bending elastic model of lipid bilayers, we determine the degree of membrane attachment to the dome that enables neck fission and compute the required values of the protein-membrane binding energy. We estimate the feasible values of this energy and predict a high efficiency for the CHMP2-CHMP3 complexes in mediating membrane fission. We support the computational model by electron tomography imaging of CHMP2-CHMP3 assemblies in vitro.
Abstract:
Bayesian experimental design is a fast-growing area of research with many real-world applications. As computational power has increased over the years, so has the development of simulation-based design methods, which involve a number of algorithms, such as Markov chain Monte Carlo, sequential Monte Carlo and approximate Bayesian computation methods, enabling more complex design problems to be solved. The Bayesian framework provides a unified approach for combining prior information and/or uncertainties regarding the statistical model with a utility function which describes the experimental aims. In this paper, we provide a general overview of the concepts involved in Bayesian experimental design, and focus on describing some of the more commonly used Bayesian utility functions and methods for their estimation, as well as a number of algorithms that are used to search over the design space to find the Bayesian optimal design. We also discuss other computational strategies for further research in Bayesian optimal design.
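For a linear-Gaussian model the expected information gain (a common Bayesian utility) has a closed form, which makes it a convenient toy for sketching the search over a design space; the model, candidate designs, and noise level below are assumptions for illustration.

```python
import numpy as np

# Toy model: observe y = theta * d + noise, prior theta ~ N(0, 1),
# noise ~ N(0, sigma^2); d is the design variable.
sigma = 0.5
designs = np.array([0.1, 0.5, 1.0, 2.0])   # candidate designs

def utility(d):
    """Expected Shannon information gain from prior to posterior.

    For this conjugate model the posterior precision is 1 + d^2 / sigma^2,
    so the expected gain is 0.5 * log(posterior precision / prior precision).
    """
    return 0.5 * np.log(1.0 + d**2 / sigma**2)

best = designs[np.argmax([utility(d) for d in designs])]
```

For non-conjugate models this utility has no closed form, and the Monte Carlo estimation methods surveyed in the paper take the place of the analytic expression above.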
Abstract:
Thin plate spline finite element methods are used to fit a surface to an irregularly scattered dataset [S. Roberts, M. Hegland, and I. Altas. Approximation of a Thin Plate Spline Smoother using Continuous Piecewise Polynomial Functions. SIAM, 1:208--234, 2003]. The computational bottleneck for this algorithm is the solution of large, ill-conditioned systems of linear equations at each step of a generalised cross validation algorithm. Preconditioning techniques are investigated to accelerate the convergence of the solution of these systems using Krylov subspace methods. The preconditioners under consideration are block diagonal, block triangular and constraint preconditioners [M. Benzi, G. H. Golub, and J. Liesen. Numerical solution of saddle point problems. Acta Numer., 14:1--137, 2005]. The effectiveness of each of these preconditioners is examined on a sample dataset taken from a known surface. From our numerical investigation, constraint preconditioners appear to provide improved convergence for this surface fitting problem compared to block preconditioners.
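The appeal of such preconditioners can be sketched numerically: for a saddle point matrix, a block diagonal preconditioner built with the exact Schur complement clusters the eigenvalues of the preconditioned system at just three values, 1 and (1 ± √5)/2 (the classical result of Murphy, Golub and Wathen), which is why Krylov methods then converge in very few iterations. The small random test matrices below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 8, 3
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)            # symmetric positive definite (1,1) block
B = rng.standard_normal((m, n))        # full-rank constraint block

K = np.block([[A, B.T], [B, np.zeros((m, m))]])   # saddle point matrix
S = B @ np.linalg.solve(A, B.T)                   # exact Schur complement
P = np.block([[A, np.zeros((n, m))],              # block diagonal preconditioner
              [np.zeros((m, n)), S]])

eigs = np.linalg.eigvals(np.linalg.solve(P, K))
clusters = np.array([1.0, (1 + np.sqrt(5)) / 2, (1 - np.sqrt(5)) / 2])
```

In practice the exact Schur complement is too expensive to form, and the art lies in the cheap approximations to it that the preconditioners in the paper employ.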
Abstract:
Computational models in physiology often integrate functional and structural information from a large range of spatio-temporal scales from the ionic to the whole organ level. Their sophistication raises both expectations and scepticism concerning how computational methods can improve our understanding of living organisms and also how they can reduce, replace and refine animal experiments. A fundamental requirement to fulfil these expectations and achieve the full potential of computational physiology is a clear understanding of what models represent and how they can be validated. The present study aims at informing strategies for validation by elucidating the complex interrelations between experiments, models and simulations in cardiac electrophysiology. We describe the processes, data and knowledge involved in the construction of whole ventricular multiscale models of cardiac electrophysiology. Our analysis reveals that models, simulations, and experiments are intertwined, in an assemblage that is a system itself, namely the model-simulation-experiment (MSE) system. Validation must therefore take into account the complex interplay between models, simulations and experiments. Key points for developing strategies for validation are: 1) understanding sources of bio-variability is crucial to the comparison between simulation and experimental results; 2) robustness of techniques and tools is a pre-requisite to conducting physiological investigations using the MSE system; 3) definition and adoption of standards facilitates interoperability of experiments, models and simulations; 4) physiological validation must be understood as an iterative process that defines the specific aspects of electrophysiology the MSE system targets, and is driven by advancements in experimental and computational methods and the combination of both.
Abstract:
Bone morphogen proteins (BMPs) are distributed along a dorsal-ventral (DV) gradient in many developing embryos. The spatial distribution of this signaling ligand is critical for correct DV axis specification. In various species, BMP expression is spatially localized, and BMP gradient formation relies on BMP transport, which in turn requires interactions with the extracellular proteins Short gastrulation/Chordin (Chd) and Twisted gastrulation (Tsg). These binding interactions promote BMP movement and concomitantly inhibit BMP signaling. The protease Tolloid (Tld) cleaves Chd, which releases BMP from the complex and permits it to bind the BMP receptor and signal. In sea urchin embryos, BMP is produced in the ventral ectoderm, but signals in the dorsal ectoderm. The transport of BMP from the ventral ectoderm to the dorsal ectoderm in sea urchin embryos is not understood. Therefore, using information from a series of experiments, we adapt the mathematical model of Mizutani et al. (2005) and embed it as the reaction part of a one-dimensional reaction–diffusion model. We use it to study aspects of this transport process in sea urchin embryos. We demonstrate that the receptor-bound BMP concentration exhibits dorsally centered peaks of the same type as those observed experimentally when the ternary transport complex (Chd-Tsg-BMP) forms relatively quickly and BMP receptor binding is relatively slow. Similarly, dorsally centered peaks are created when the diffusivities of BMP, Chd, and Chd-Tsg are relatively low and that of Chd-Tsg-BMP is relatively high, and the model dynamics also suggest that Tld is a principal regulator of the system. At the end of this paper, we briefly compare the observed dynamics in the sea urchin model to a version that applies to the fly embryo, and we find that the same conditions can account for BMP transport in the two types of embryos only if Tld levels are reduced in sea urchin compared to fly.
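The one-dimensional reaction–diffusion framework can be sketched with a single simplified species (the full model couples BMP, Chd, Tsg and their complexes): ligand is produced in a source region at one end of the domain, diffuses, and decays, yielding a graded profile. All parameter values below are illustrative assumptions, not the fitted values of the model.

```python
import numpy as np

N, L = 50, 1.0
h = L / N
D, k, s = 5e-4, 0.01, 1.0    # diffusivity, decay rate, production rate
dt, steps = 0.2, 20_000      # explicit scheme: D*dt/h**2 = 0.25 < 0.5 (stable)

u = np.zeros(N)
source = np.zeros(N)
source[:5] = s               # production localised at one end ("ventral")

for _ in range(steps):
    lap = np.empty(N)
    lap[1:-1] = u[2:] - 2 * u[1:-1] + u[:-2]
    lap[0] = u[1] - u[0]     # zero-flux boundary conditions
    lap[-1] = u[-2] - u[-1]
    u = u + dt * (D * lap / h**2 + source - k * u)
```

In the actual model the reaction terms at each grid point are the Mizutani-style binding, cleavage and receptor kinetics, and the shuttling behaviour emerges from the differing diffusivities of the free and complexed forms.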
Abstract:
Computational epigenetics is a new area of research focused on exploring how DNA methylation patterns affect transcription factor binding, which in turn affects gene expression patterns. The aim of this study was to produce a new protocol for the detection of DNA methylation patterns using computational analysis, which can be further confirmed by bisulfite PCR with serial pyrosequencing. The upstream regulatory element and pre-initiation complex relative to CpG islets within the methylenetetrahydrofolate reductase gene were determined via computational analysis and online databases. A 1,104 bp long CpG island located near or at the alternative promoter site of the methylenetetrahydrofolate reductase gene was identified. The CpG plot indicated that CpG islets A and B, within the island, contained 62 % and 75 % GC content, with CpG ratios of 0.70 and 0.80–0.95, respectively. Further exploration of CpG islets A and B indicated that the transcription start sites were GGC and that TATA boxes were absent. In addition, although six PROSITE motifs were identified in CpG B, no motifs were detected in CpG A. A number of cis-regulatory elements were found in different regions within CpGs A and B. Transcription factors were predicted to bind to CpGs A and B with varying affinities depending on the DNA methylation status. In addition, transcription factor binding may influence the expression patterns of the methylenetetrahydrofolate reductase gene by recruiting chromatin-condensation-inducing factors. These results have significant implications for understanding the architecture of transcription factor binding at CpG islets, as well as the DNA methylation patterns that affect chromatin structure.
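The CpG island statistics quoted above follow the standard definitions (GC content, and the observed/expected CpG ratio of Gardiner-Garden and Frommer), which are straightforward to compute; the toy sequence used below is an assumption for illustration.

```python
def gc_content(seq):
    """Percentage of G and C bases in the sequence."""
    seq = seq.upper()
    return 100.0 * (seq.count("G") + seq.count("C")) / len(seq)

def cpg_ratio(seq):
    """Observed/expected CpG ratio: (N_CpG * length) / (N_C * N_G)."""
    seq = seq.upper()
    n_cpg = sum(1 for i in range(len(seq) - 1) if seq[i:i + 2] == "CG")
    n_c, n_g = seq.count("C"), seq.count("G")
    return n_cpg * len(seq) / (n_c * n_g)
```

A region is conventionally flagged as a CpG island when it exceeds thresholds on length, GC content and this ratio, which is the kind of criterion behind the islets A and B reported in the study.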
Abstract:
This thesis introduces a new way of using prior information in a spatial model and develops scalable algorithms for fitting this model to large imaging datasets. These methods are employed for image-guided radiation therapy and satellite-based classification of land use and water quality. This study utilized a pre-computation step to achieve a hundredfold improvement in the elapsed runtime for model fitting. This makes it much more feasible to apply these models to real-world problems, and enables full Bayesian inference for images with a million or more pixels.