940 results for Computational complexity


Relevance: 20.00%

Abstract:

Computational models in physiology often integrate functional and structural information from a large range of spatio-temporal scales, from the ionic to the whole-organ level. Their sophistication raises both expectations and scepticism concerning how computational methods can improve our understanding of living organisms, and also how they can reduce, replace and refine animal experiments. A fundamental requirement for fulfilling these expectations and achieving the full potential of computational physiology is a clear understanding of what models represent and how they can be validated. The present study aims to inform strategies for validation by elucidating the complex interrelations between experiments, models and simulations in cardiac electrophysiology. We describe the processes, data and knowledge involved in the construction of whole ventricular multiscale models of cardiac electrophysiology. Our analysis reveals that models, simulations and experiments are intertwined in an assemblage that is itself a system: the model-simulation-experiment (MSE) system. Validation must therefore take into account the complex interplay between models, simulations and experiments. Key points for developing validation strategies are: 1) understanding sources of bio-variability is crucial to the comparison between simulation and experimental results; 2) robustness of techniques and tools is a prerequisite to conducting physiological investigations using the MSE system; 3) definition and adoption of standards facilitates interoperability of experiments, models and simulations; 4) physiological validation must be understood as an iterative process that defines the specific aspects of electrophysiology the MSE system targets, and is driven by advancements in experimental and computational methods and the combination of both.
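As a concrete illustration of the cell-level building blocks from which such multiscale models are assembled, here is a minimal, purely illustrative sketch: a generic FitzHugh-Nagumo excitable cell, not the authors' ventricular model, and all parameter values are assumptions.

```python
# Minimal sketch (not the authors' model): a FitzHugh-Nagumo cell, the kind of
# ionic-level building block a whole-ventricle multiscale model is composed of.
# All parameter values here are illustrative assumptions.
import numpy as np

def fitzhugh_nagumo(t_end=500.0, dt=0.01, a=0.7, b=0.8, tau=12.5, I_stim=0.5):
    """Integrate the two-variable FitzHugh-Nagumo model with forward Euler."""
    n = int(t_end / dt)
    v = np.empty(n); w = np.empty(n)
    v[0], w[0] = -1.0, 1.0
    for i in range(n - 1):
        dv = v[i] - v[i]**3 / 3.0 - w[i] + I_stim   # fast (voltage-like) variable
        dw = (v[i] + a - b * w[i]) / tau            # slow recovery variable
        v[i + 1] = v[i] + dt * dv
        w[i + 1] = w[i] + dt * dw
    return v, w

v, w = fitzhugh_nagumo()
print(f"action-potential-like oscillation, v range: [{v.min():.2f}, {v.max():.2f}]")
```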

Relevance: 20.00%

Abstract:

In 2001, 45% (2.7 billion) of the world’s population of approximately 6.1 billion lived in ‘moderate poverty’, on less than US$2 per person per day (World Population Summary, 2012). In the last 60 years there have been many theories attempting to explain development and why some countries achieve the fastest growth in history while others stagnate; so far, no account has explained the differences. Traditional views imply that development is the aggregation of successes from multiple individual business enterprises, but this ignores the interactions between and among institutions, organisations and individuals in the economy, which can often have unpredictable effects. Complexity Development Theory (CDT) proposes that by viewing development as an emergent property of society, we can help create better development programs at the organisational, institutional and national levels. This paper asks how the principles of complex adaptive systems (CAS) can be used to develop CDT principles for designing and operating development programs at the bottom of the pyramid in developing economies. To investigate this research question we conduct a literature review to define and describe CDT and create propositions for testing. We illustrate these propositions using a case study of an Asset Based Community Development (ABCD) program for existing and nascent entrepreneurs in the Democratic Republic of the Congo (DRC). We found evidence that all the principles of CDT were related to the characteristics of CAS. If this is the case, development programs will be able to select which CAS characteristics to build on; further research is needed to test these propositions.

Relevance: 20.00%

Abstract:

Bone morphogenetic proteins (BMPs) are distributed along a dorsal-ventral (DV) gradient in many developing embryos. The spatial distribution of this signaling ligand is critical for correct DV axis specification. In various species, BMP expression is spatially localized, and BMP gradient formation relies on BMP transport, which in turn requires interactions with the extracellular proteins Short gastrulation/Chordin (Chd) and Twisted gastrulation (Tsg). These binding interactions promote BMP movement and concomitantly inhibit BMP signaling. The protease Tolloid (Tld) cleaves Chd, which releases BMP from the complex and permits it to bind the BMP receptor and signal. In sea urchin embryos, BMP is produced in the ventral ectoderm but signals in the dorsal ectoderm. How BMP is transported from the ventral ectoderm to the dorsal ectoderm in sea urchin embryos is not understood. Therefore, using information from a series of experiments, we adapt the mathematical model of Mizutani et al. (2005) and embed it as the reaction part of a one-dimensional reaction–diffusion model, which we use to study aspects of this transport process in sea urchin embryos. We demonstrate that the receptor-bound BMP concentration exhibits dorsally centered peaks of the same type as those observed experimentally when the ternary transport complex (Chd-Tsg-BMP) forms relatively quickly and BMP receptor binding is relatively slow. Similarly, dorsally centered peaks are created when the diffusivities of BMP, Chd, and Chd-Tsg are relatively low and that of Chd-Tsg-BMP is relatively high, and the model dynamics also suggest that Tld is a principal regulator of the system. Finally, we briefly compare the observed dynamics in the sea urchin model to a version that applies to the fly embryo, and we find that the same conditions can account for BMP transport in the two types of embryos only if Tld levels are reduced in sea urchin compared to fly.
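To show the numerical structure of such a one-dimensional reaction-diffusion model, here is a skeletal explicit finite-difference integrator. This is not the Mizutani et al. (2005) reaction network: a single species with a toy production/decay term stands in for the coupled BMP/Chd/Tsg/complex system, and all rates and diffusivities are assumptions.

```python
# Skeletal 1D reaction-diffusion integrator (explicit finite differences).
# A single species with toy production/decay replaces the coupled
# BMP/Chd/Tsg/complex reactions; all parameter values are assumptions.
import numpy as np

nx, L = 200, 1.0                  # grid points, domain length (ventral -> dorsal)
dx = L / (nx - 1)
D, k_prod, k_deg = 1e-3, 1.0, 0.5 # diffusivity, production rate, decay rate
dt = 0.4 * dx**2 / D              # stable explicit step (diffusion CFL condition)

x = np.linspace(0.0, L, nx)
u = np.zeros(nx)                  # ligand concentration
source = (x < 0.2).astype(float)  # production localized to the 'ventral' 20%

for _ in range(20000):
    lap = np.zeros(nx)
    lap[1:-1] = (u[2:] - 2*u[1:-1] + u[:-2]) / dx**2
    lap[0], lap[-1] = 2*(u[1]-u[0])/dx**2, 2*(u[-2]-u[-1])/dx**2  # no-flux ends
    u += dt * (D * lap + k_prod * source - k_deg * u)

print(f"near-steady profile: ventral u={u[0]:.3f}, dorsal u={u[-1]:.3f}")
```

In the full model, the reaction term at each grid point would instead update BMP, Chd, Tsg, the Chd-Tsg-BMP complex and receptor-bound BMP, each with its own diffusivity.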

Relevance: 20.00%

Abstract:

In this paper we propose and study low-complexity algorithms for on-line estimation of hidden Markov model (HMM) parameters. The estimates approach the true model parameters as the measurement noise approaches zero, but otherwise give improved estimates, albeit with bias. On a finite data set in the high-noise case, the bias may not be significantly more severe than for a higher-complexity asymptotically optimal scheme. Our algorithms require O(N^3) calculations per time instant, where N is the number of states. Previous algorithms based on earlier hidden Markov model signal processing methods, including the expectation-maximisation (EM) algorithm, require O(N^4) calculations per time instant.
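For context, here is the standard HMM forward (filtering) recursion, the per-sample building block on which on-line parameter estimators are built. This is not the authors' algorithm; it only shows why the state recursion itself costs O(N^2) per time instant, with parameter-update recursions raising that to O(N^3) or O(N^4). The toy model and parameters are illustrative assumptions.

```python
# Illustrative on-line HMM forward filter (not the authors' estimator).
# Each update costs O(N^2); on-line parameter updates add further cost.
import numpy as np

def hmm_filter_step(alpha, A, emission_probs):
    """One on-line update of the normalized state posterior alpha (length N).
    A: N x N transition matrix; emission_probs: p(y_t | state), length N."""
    alpha = emission_probs * (A.T @ alpha)   # predict + correct: O(N^2)
    return alpha / alpha.sum()               # renormalize to a probability vector

# Toy 2-state Gaussian-emission HMM, illustrative parameters only.
A = np.array([[0.95, 0.05], [0.10, 0.90]])
means, sigma = np.array([-1.0, 1.0]), 0.8
rng = np.random.default_rng(0)
alpha = np.full(2, 0.5)
state = 0
for _ in range(100):
    state = rng.choice(2, p=A[state])                 # simulate the hidden chain
    y = rng.normal(means[state], sigma)               # noisy measurement
    e = np.exp(-0.5 * ((y - means) / sigma)**2)       # Gaussian likelihoods
    alpha = hmm_filter_step(alpha, A, e)
print("filtered state posterior:", alpha)
```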

Relevance: 20.00%

Abstract:

This pilot study examines the effect of work-integrated learning (WIL) on work self-efficacy (WSE) for undergraduate students at the Queensland University of Technology. A WSE instrument was used to examine the seven subscales of WSE: learning, problem solving, pressure, role expectations, teamwork, sensitivity and work politics. The results of this pilot study revealed that overall WSE scores were higher for students who did not participate in the WIL unit (the comparison group) than for the WIL group. The current paper suggests that WSE scores changed as a result of WIL participation. These findings open a new path for future studies to explore the relationship between WIL and the specific subscales of WSE.

Relevance: 20.00%

Abstract:

In this chapter, we draw out the relevant themes from the small body of critical scholarship in digital media and software studies that has focused on the politics of Twitter data and the sociotechnical means by which access is regulated. We highlight in particular the contested relationships between social media research (in both academic and non-academic contexts) and the data wholesale, retail, and analytics industries that feed on it. In the second major section of the chapter we discuss in detail the pragmatic edge of these politics, in terms of what kinds of scientific research are and are not possible in the current political economy of Twitter data access. Finally, we return to the much broader implications of these issues for the politics of knowledge, demonstrating how the apparently microscopic level at which the Twitter API mediates access to Twitter data inscribes and influences the macro level of the global political economy of science itself, by re-inscribing institutional and traditional disciplinary privilege. We conclude with some speculations about future developments in data rights and data philanthropy that may at least mitigate some of these negative impacts.

Relevance: 20.00%

Abstract:

Computational epigenetics is a new area of research focused on exploring how DNA methylation patterns affect transcription factor binding and, in turn, gene expression patterns. The aim of this study was to produce a new protocol for the detection of DNA methylation patterns using computational analysis, which can then be confirmed by bisulfite PCR with serial pyrosequencing. The upstream regulatory element and pre-initiation complex relative to CpG islets within the methylenetetrahydrofolate reductase (MTHFR) gene were determined via computational analysis and online databases. A 1,104 bp CpG island located at or near the alternative promoter site of the MTHFR gene was identified. The CpG plot indicated that CpG islets A and B, within the island, contained 62% and 75% GC content, with CpG ratios of 0.70 and 0.80–0.95, respectively. Further exploration of CpG islets A and B indicated that the transcription start sites were GGC and that TATA boxes were absent. In addition, although six PROSITE motifs were identified in CpG islet B, no motifs were detected in CpG islet A. A number of cis-regulatory elements were found in different regions within islets A and B. Transcription factors were predicted to bind to islets A and B with varying affinities depending on the DNA methylation status. In addition, transcription factor binding may influence the expression patterns of the MTHFR gene by recruiting chromatin-condensation-inducing factors. These results have significant implications for understanding the architecture of transcription factor binding at CpG islets, as well as the DNA methylation patterns that affect chromatin structure.
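The two statistics used above to characterize CpG islets, GC content and the observed/expected CpG ratio, can be computed as in the generic sketch below (Gardiner-Garden & Frommer-style criteria). This is an illustration, not the specific pipeline used in the study, and the sequence fragment is a toy example rather than MTHFR sequence.

```python
# Generic sketch of CpG-islet statistics: GC content and observed/expected
# CpG ratio. Not the study's pipeline; the window string is a toy fragment.
def gc_content(seq):
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def cpg_obs_exp(seq):
    """Observed CpG dinucleotides over the count expected from base composition."""
    seq = seq.upper()
    g, c = seq.count("G"), seq.count("C")
    observed = seq.count("CG")
    expected = (g * c) / len(seq) if g and c else 0.0
    return observed / expected if expected else 0.0

window = "CGGCGCGCCGGATCGCGGCGCTACGCGGGCCG"   # toy fragment, not MTHFR sequence
print(f"GC content: {gc_content(window):.0%}, CpG obs/exp: {cpg_obs_exp(window):.2f}")
```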

Relevance: 20.00%

Abstract:

The hippocampus is an anatomically distinct region of the medial temporal lobe that plays a critical role in the formation of declarative memories. Here we show that a computer simulation of simple compartmental cells organized with basic hippocampal connectivity is capable of producing stimulus-intensity-sensitive wide-band fluctuations of spectral power similar to those seen in real EEG. While previous computational models have been designed to assess the viability of putative mechanisms of memory storage and retrieval, they have generally been too abstract to allow comparison with empirical data. Furthermore, while the anatomical connectivity and organization of the hippocampus are well defined, many questions regarding the mechanisms that mediate large-scale synaptic integration remain unanswered. For this reason we focus less on the specifics of changing synaptic weights and more on the population dynamics. Spectral power in four distinct frequency bands was derived from simulated field potentials of the computational model and found to depend on the intensity of a random input. The majority of power occurred in the lowest frequency band (3-6 Hz) and was greatest for the lowest-intensity stimulus condition (1% of the maximal stimulus). In contrast, higher frequency bands ranging from 7-45 Hz showed an increase in power directly related to the increase in stimulus intensity. This trend continues up to a stimulus level of 15% to 20% of the maximal input, above which power falls dramatically. These results suggest that the relative power of intrinsic network oscillations is dependent upon the level of activation, and that above threshold levels all frequencies are damped, perhaps due to overactivation of inhibitory interneurons.
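The band-power analysis described above can be illustrated with a short sketch: estimate the power spectral density of a field-potential-like signal and integrate it over frequency bands. The signal, sampling rate, and band edges here are illustrative assumptions (the band ranges echo those named in the abstract); scipy.signal.welch performs the PSD estimate.

```python
# Illustrative band-power computation on a stand-in "field potential".
# Signal composition and sampling rate are assumptions, not the model's output.
import numpy as np
from scipy.signal import welch

fs = 500.0                                    # sampling rate (Hz), assumed
t = np.arange(0, 10.0, 1.0 / fs)
rng = np.random.default_rng(1)
# toy signal: strong 5 Hz (theta-like) + weaker 20 Hz component + noise
lfp = 2.0*np.sin(2*np.pi*5*t) + 0.5*np.sin(2*np.pi*20*t) + rng.normal(0, 1, t.size)

f, psd = welch(lfp, fs=fs, nperseg=1024)      # Welch PSD estimate

def band_power(f, psd, lo, hi):
    mask = (f >= lo) & (f < hi)
    return np.trapz(psd[mask], f[mask])       # integrate PSD over the band

for lo, hi in [(3, 6), (7, 45)]:              # the two band ranges named above
    print(f"{lo}-{hi} Hz power: {band_power(f, psd, lo, hi):.3f}")
```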

Relevance: 20.00%

Abstract:

Magnetic resonance is a well-established tool for structural characterisation of porous media. Features of pore-space morphology can be inferred from NMR diffusion-diffraction plots or the time-dependence of the apparent diffusion coefficient. Diffusion NMR signal attenuation can be computed from the restricted diffusion propagator, which describes the distribution of diffusing particles for a given starting position and diffusion time. We present two techniques for efficient evaluation of restricted diffusion propagators for use in NMR porous-media characterisation. The first is the Lattice Path Count (LPC). Its physical essence is that the restricted diffusion propagator connecting points A and B in time t is proportional to the number of distinct length-t paths from A to B. By using a discrete lattice, the number of such paths can be counted exactly. The second technique is the Markov transition matrix (MTM). The matrix represents the probabilities of jumps between every pair of lattice nodes within a single timestep. The propagator for an arbitrary diffusion time can be calculated as the appropriate matrix power. For periodic geometries, the transition matrix needs to be defined only for a single unit cell. This makes MTM ideally suited for periodic systems. Both LPC and MTM are closely related to existing computational techniques: LPC, to combinatorial techniques; and MTM, to the Fokker-Planck master equation. The relationship between LPC, MTM and other computational techniques is briefly discussed in the paper. Both LPC and MTM perform favourably compared to Monte Carlo sampling, yielding highly accurate and almost noiseless restricted diffusion propagators. Initial tests indicate that their computational performance is comparable to that of finite element methods. Both LPC and MTM can be applied to complicated pore-space geometries with no analytic solution. We discuss the new methods in the context of diffusion propagator calculation in porous materials and model biological tissues.
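The MTM idea can be demonstrated in a few lines: build the single-timestep transition matrix for a lattice and obtain the propagator for a longer diffusion time as a matrix power. The sketch below uses a 1D lattice between reflecting walls; the lattice size and hop probability are illustrative assumptions, not values from the paper.

```python
# Minimal MTM sketch: restricted diffusion on a 1D lattice between reflecting
# walls, propagator obtained as a matrix power. Parameters are assumptions.
import numpy as np

N, p_hop = 50, 0.25                 # lattice nodes; per-step hop probability
M = np.zeros((N, N))                # M[i, j]: probability of jumping i -> j
for i in range(N):
    M[i, i] = 1.0 - 2.0 * p_hop     # stay in place
    if i > 0:     M[i, i - 1] = p_hop
    if i < N - 1: M[i, i + 1] = p_hop
M[0, 0] += p_hop                    # reflecting walls: blocked hops stay put
M[-1, -1] += p_hop

steps = 200                         # diffusion time in lattice timesteps
P = np.linalg.matrix_power(M, steps)

start = N // 2
propagator = P[start]               # distribution over end nodes from 'start'
print(f"row sum (should be 1): {propagator.sum():.6f}")
print(f"spread (std dev, lattice units): "
      f"{np.sqrt(((np.arange(N) - start)**2 * propagator).sum()):.2f}")
```

For a periodic geometry, the same matrix would be built over a single unit cell with wrapped indices, which is what makes MTM well suited to periodic systems.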