121 results for scientific computation
Computation of ECG signal features using MCMC modelling in software and FPGA reconfigurable hardware
Abstract:
Computational optimisation of clinically important electrocardiogram signal features, within a single heart beat, using a Markov-chain Monte Carlo (MCMC) method is undertaken. A detailed, efficient, data-driven software implementation of an MCMC algorithm is presented. Software parallelisation is explored first, and it is shown that, despite the large amount of inter-dependency among model parameters, parallelisation is possible. An initial reconfigurable hardware approach is also explored, with a view to future real-time computation on a portable ECG device under continuous extended use.
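The abstract does not specify the beat model or sampler. As a hedged illustration only, the sketch below fits an assumed sum-of-Gaussians beat model (a common parameterisation of ECG waves) to a single beat using random-walk Metropolis; the model, flat priors, noise level and proposal scale are all assumptions, not the authors' implementation.

```python
import numpy as np

def beat_model(t, params):
    """Sum-of-Gaussians beat model: each wave (e.g. P, QRS, T) is a
    Gaussian bump with amplitude a, centre mu, width s; params is a
    flat array of (a, mu, s) triples."""
    a, mu, s = params.reshape(-1, 3).T
    bumps = a[:, None] * np.exp(-0.5 * ((t[None, :] - mu[:, None]) / s[:, None]) ** 2)
    return bumps.sum(axis=0)

def log_posterior(params, t, y, noise_sd=0.05):
    if np.any(params.reshape(-1, 3)[:, 2] <= 0):   # widths must stay positive
        return -np.inf
    resid = y - beat_model(t, params)
    return -0.5 * np.sum((resid / noise_sd) ** 2)  # flat priors assumed

def metropolis(t, y, init, n_iter=20000, step=0.01, rng=None):
    """Random-walk Metropolis over the beat-model parameters."""
    rng = rng or np.random.default_rng(0)
    params, lp = init.copy(), log_posterior(init, t, y)
    chain = np.empty((n_iter, init.size))
    for i in range(n_iter):
        prop = params + step * rng.standard_normal(init.size)
        lp_prop = log_posterior(prop, t, y)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject
            params, lp = prop, lp_prop
        chain[i] = params
    return chain
```

Posterior summaries of the chain (means, credible intervals) would then serve as the optimised feature estimates for the beat.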
Abstract:
The ability of cloud computing to provide almost unlimited storage, backup and recovery, and quick deployment contributes to its widespread attention and implementation. Cloud computing has also become an attractive choice for mobile users. Due to the limited capabilities of mobile devices, such as power scarcity and the inability to handle computation-intensive tasks, selected computations need to be outsourced to resourceful cloud servers. However, many challenges need to be addressed in computation offloading for mobile cloud computing, such as communication cost, connectivity maintenance and incurred latency. This paper presents a taxonomy of the computation offloading approaches that aim to address these challenges. The taxonomy provides guidelines for identifying research scopes in computation offloading for mobile cloud computing. We also outline directions and anticipated trends for future research.
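As a hedged aside on the trade-off such offloading approaches formalise, the sketch below implements a textbook latency-based decision rule: offload when transferring the task and computing remotely beats computing locally. The parameter names and numbers are illustrative assumptions, not taken from the paper.

```python
def should_offload(cycles, local_speed, cloud_speed, data_bytes, bandwidth, rtt):
    """Classic latency-based offloading rule (illustrative, not from the paper):
    offload when remote execution (transfer + cloud compute) is faster
    than local execution.

    cycles      -- CPU cycles the task needs
    local_speed -- device CPU speed (cycles/s)
    cloud_speed -- server CPU speed (cycles/s)
    data_bytes  -- input + output data to transfer
    bandwidth   -- link throughput (bytes/s)
    rtt         -- round-trip network latency (s)
    """
    t_local = cycles / local_speed
    t_remote = rtt + data_bytes / bandwidth + cycles / cloud_speed
    return t_remote < t_local

# Example: a 2e9-cycle task, 1 MB of data, 10 Mbit/s link, 50 ms RTT.
print(should_offload(2e9, 1e9, 2e10, 1e6, 1.25e6, 0.05))  # True: cloud wins
```

Energy-based variants replace the two times with energy costs of local computation versus radio transmission, which is where the communication-cost and connectivity challenges enter.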
Abstract:
The practice of travel journalism is still largely neglected as a field of inquiry for communication and journalism scholars, despite the fact that news media are increasingly focussing on softer news. Lifestyle sections of newspapers, for example, have grown in size over the past few decades, and given corresponding cutbacks in international news reporting, travel journalism in particular now plays a growing role in the representation of ‘the Other’. While this need for research has been identified before, very little actual investigation of travel journalism has been forthcoming. This paper assesses the current state of research by reviewing the studies that have been conducted into the production, content and reception of travel journalism. It argues that while a small number of studies now exist, they have often been conducted in isolation and with significant limitations, and much remains to be done to explore this sub-field of journalism sufficiently. By analysing what we do know about travel journalism, the paper suggests a number of possibilities in each area for advancing this knowledge. Above all, it contends that dated prejudices against the field have to be put aside, and that the practice of travel journalism needs to be taken seriously in order to do its growing importance justice.
Abstract:
We study the natural problem of secure n-party computation (in the passive, computationally unbounded attack model) of the n-product function f_G(x_1, ..., x_n) = x_1 · x_2 ⋯ x_n in an arbitrary finite group (G, ·), where the input of party P_i is x_i ∈ G for i = 1, ..., n. For flexibility, we are interested in protocols for f_G which require only black-box access to the group G (i.e. the only computations performed by players in the protocol are a group operation, a group inverse, or sampling a uniformly random group element). Our results are as follows. First, on the negative side, we show that if (G, ·) is non-abelian and n ≥ 4, then no ⌈n/2⌉-private protocol for computing f_G exists. Second, on the positive side, we initiate an approach for the construction of black-box protocols for f_G based on k-of-k threshold secret-sharing schemes, which are efficiently implementable over any black-box group G. We reduce the problem of constructing such protocols to a combinatorial colouring problem in planar graphs. We then give two constructions for such graph colourings. Our first colouring construction gives a protocol with optimal collusion resistance t < n/2, but has exponential communication complexity O(n·(2t+1 choose t)²/t) group elements (this construction easily extends to general adversary structures). Our second, probabilistic colouring construction gives a protocol with (close to optimal) collusion resistance t < n/μ for a graph-related constant μ ≤ 2.948, and has efficient communication complexity O(n·t²) group elements. Furthermore, we believe that our results can be improved by further study of the associated combinatorial problems.
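The k-of-k threshold sharing the construction builds on has a simple black-box form: split a secret x into k random group elements whose ordered product is x, so that all k shares are required to reconstruct. A minimal sketch follows, using Z_n purely as a stand-in black-box group; the class and its API are illustrative assumptions, and only the sharing primitive is shown, not the full protocol or the graph colourings.

```python
import secrets

class ZnGroup:
    """Additive group Z_n standing in for an arbitrary black-box group:
    the protocol only needs the operation, inverses, and uniform sampling."""
    def __init__(self, n):
        self.n = n
    def op(self, a, b):
        return (a + b) % self.n
    def inv(self, a):
        return (-a) % self.n
    def sample(self):
        return secrets.randbelow(self.n)
    def identity(self):
        return 0

def share(group, x, k):
    """k-of-k sharing: draw k-1 uniform elements r_1..r_{k-1} and set the
    last share so the ordered product of all k shares equals x. The
    left-to-right product order makes this valid for non-abelian groups too."""
    rs = [group.sample() for _ in range(k - 1)]
    prefix = group.identity()
    for r in rs:
        prefix = group.op(prefix, r)
    last = group.op(group.inv(prefix), x)   # prefix * last = x
    return rs + [last]

def reconstruct(group, shares):
    out = group.identity()
    for s in shares:
        out = group.op(out, s)
    return out

G = ZnGroup(101)
shares = share(G, 42, 5)
assert reconstruct(G, shares) == 42
```

Any k-1 shares are jointly uniform and independent of x, which is what makes the scheme a building block for the colouring-based protocols.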
Abstract:
Since 1995 the eruption of the andesitic Soufrière Hills Volcano (SHV), Montserrat, has been studied in substantial detail. As an important contribution to this effort, the Seismic Experiment with Airgun-source – Caribbean Andesitic Lava Island Precision Seismo-geodetic Observatory (SEA-CALIPSO) experiment was devised to image the arc crust underlying Montserrat and, if possible, the magma system at SHV using tomography and reflection seismology. Field operations were carried out in October–December 2007, with deployment of 238 seismometers on land supplementing seven volcano observatory stations, and with an array of 10 ocean-bottom seismometers deployed offshore. The RRS James Cook on NERC cruise JC19 towed a tuned airgun array plus a digital 48-channel streamer on encircling and radial tracks for 77 h about Montserrat during December 2007, firing 4414 airgun shots and yielding about 47 Gb of data. The main objectives of the experiment were achieved. Preliminary analyses of these data, published in 2010, generated images of heterogeneous high-velocity bodies representing the cores of volcanoes and subjacent intrusions, and shallow areas of low velocity on the flanks of the island that reflect volcaniclastic deposits and hydrothermal alteration. The resolution of this preliminary work did not extend beyond 5 km depth. An improved three-dimensional (3D) seismic velocity model was then obtained by inversion of 181 665 first-arrival travel times from a more complete sampling of the dataset, yielding clear images to 7.5 km depth of a low-velocity volume that was interpreted as the magma chamber which feeds the current eruption, with an estimated volume of 13 km³. Coupled thermal and seismic modelling revealed properties of the partly crystallized magma. Seismic reflection analyses aimed at imaging structures under southern Montserrat had limited success, but suggest subhorizontal layering interpreted as sills at depths of between 6 and 19 km. Seismic reflection profiles collected offshore reveal deep fans of volcaniclastic debris and fault offsets, leading to new tectonic interpretations. This chapter presents the project goals and planning concepts, describes in detail the campaigns at sea and on land, summarizes the major results, and identifies the key lessons learned.
Abstract:
The spatiotemporal dynamics of an alien species invasion across a real landscape are typically complex. While surveillance is an essential part of a management response, planning surveillance in space and time presents a difficult challenge due to this complexity. We show here a method for determining the highest-probability sites for occupancy across a landscape at an arbitrary point in the future, based on occupancy data from a single slice in time. We apply the method to the invasion of Giant Hogweed, a serious weed in the Czech Republic and throughout Europe.
Abstract:
Despite rising levels of safe-sex knowledge in Australia, sexually transmitted infection notifications continue to increase. A culture-centred approach suggests that, in attempting to reach a target population, it is useful first to understand their perspective on the issues. Twenty focus groups were conducted with 89 young people between the ages of 14 and 16 years. Key findings suggest that scientific information does not articulate closely with everyday practice, that young people receive the message that sex is bad and that they should not be preparing for it, and that it is not considered appropriate to talk about sex. Understanding how young people think about these issues is particularly important because the focus groups also found that young people disengage from sources of information that do not match their own experiences.
Abstract:
Quantifying the impact of biochemical compounds on collective cell spreading is an essential element of drug design, with various applications including developing treatments for chronic wounds and cancer. Scratch assays are a technically simple and inexpensive method used to study collective cell spreading; however, most previous interpretations of scratch assays are qualitative and do not provide estimates of the cell diffusivity, D, or the cell proliferation rate, λ. Estimating D and λ is important for investigating the efficacy of a potential treatment and provides insight into the mechanism through which the potential treatment acts. While a few methods for estimating D and λ have been proposed, these previous methods lead to point estimates of D and λ, and provide no insight into the uncertainty in these estimates. Here, we compare various types of information that can be extracted from images of a scratch assay, and quantify D and λ using discrete computational simulations and approximate Bayesian computation. We show that it is possible to robustly recover estimates of D and λ from synthetic data, as well as a new set of experimental data. For the first time, our approach also provides a method to estimate the uncertainty in our estimates of D and λ. We anticipate that our approach can be generalized to deal with more realistic experimental scenarios in which we are interested in estimating D and λ, as well as additional relevant parameters such as the strength of cell-to-cell adhesion or the strength of cell-to-substrate adhesion.
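As a hedged illustration of the inference scheme described (not the authors' code), the sketch below runs rejection ABC for (D, λ) against a mean-field Fisher-KPP simulator standing in for the discrete model; the priors, discretisation, tolerance and units are all assumptions.

```python
import numpy as np

def simulate_scratch(D, lam, nx=50, dx=20.0, dt=0.05, t_end=48.0):
    """Mean-field stand-in for the discrete model: 1D Fisher-KPP dynamics
    dc/dt = D c_xx + lam c (1 - c), with the scratch as an empty middle third.
    Explicit scheme; dt is chosen so D*dt/dx^2 stays below 1/2 for all
    D in the assumed prior range."""
    c = np.ones(nx)
    c[nx // 3: 2 * nx // 3] = 0.0                      # the scratch
    for _ in range(int(t_end / dt)):
        lap = (np.roll(c, 1) - 2 * c + np.roll(c, -1)) / dx ** 2
        c = c + dt * (D * lap + lam * c * (1.0 - c))
    return c

def abc_rejection(observed, n_samples=2000, eps=0.5, rng=None):
    """Rejection ABC: draw (D, lam) from uniform priors (ranges and eps are
    illustrative assumptions), keep draws whose simulated density profile
    lies within eps of the observed profile."""
    rng = rng or np.random.default_rng(1)
    accepted = []
    for _ in range(n_samples):
        D = rng.uniform(100.0, 3000.0)                 # um^2/h, assumed prior
        lam = rng.uniform(0.01, 0.1)                   # 1/h, assumed prior
        dist = np.linalg.norm(simulate_scratch(D, lam) - observed)
        if dist < eps:
            accepted.append((D, lam))
    return np.array(accepted)   # samples approximate the joint posterior

# Synthetic 'observed' data from known parameters, then recover them:
obs = simulate_scratch(D=1000.0, lam=0.05)
posterior = abc_rejection(obs)
```

The spread of the accepted (D, λ) pairs is exactly the uncertainty estimate that point-estimate methods cannot provide.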
Abstract:
‘Approximate Bayesian Computation’ (ABC) represents a powerful methodology for the analysis of complex stochastic systems for which the likelihood of the observed data under an arbitrary set of input parameters may be entirely intractable – the latter condition rendering useless the standard machinery of tractable likelihood-based, Bayesian statistical inference [e.g. conventional Markov chain Monte Carlo (MCMC) simulation]. In this paper, we demonstrate the potential of ABC for astronomical model analysis by application to a case study in the morphological transformation of high-redshift galaxies. To this end, we develop, first, a stochastic model for the competing processes of merging and secular evolution in the early Universe, and secondly, through an ABC-based comparison against the observed demographics of massive (M_gal > 10^11 M⊙) galaxies (at 1.5 < z < 3) in the Cosmic Assembly Near-IR Deep Extragalactic Legacy Survey (CANDELS)/Extended Groth Strip (EGS) data set, we derive posterior probability densities for the key parameters of this model. The ‘Sequential Monte Carlo’ implementation of ABC exhibited herein, featuring both a self-generating target sequence and a self-refining MCMC kernel, is amongst the most efficient of contemporary approaches to this important statistical algorithm. We highlight as well, through our chosen case study, the value of careful summary-statistic selection, and demonstrate two modern strategies for assessment and optimization in this regard. Ultimately, our ABC analysis of the high-redshift morphological mix returns tight constraints on the evolving merger rate in the early Universe and favours major merging (with disc survival or rapid reformation) over secular evolution as the mechanism most responsible for building up the first generation of bulges in early-type discs.
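A minimal sketch of the ABC-SMC pattern described, assuming a toy one-parameter model: the tolerance sequence is self-generating (each round's epsilon is a quantile of the accepted distances) and the Gaussian move kernel self-refines by rescaling to the current particle spread. Interfaces and settings are illustrative, not the paper's implementation.

```python
import numpy as np

def abc_smc(obs_stat, simulate, prior_sample, prior_logpdf,
            n_particles=200, n_rounds=5, quantile=0.5, rng=None):
    """Minimal one-parameter ABC-SMC sketch with adaptive tolerance and
    an adaptive Gaussian perturbation kernel."""
    rng = rng or np.random.default_rng(2)
    # Round 0: plain rejection sampling from the prior.
    theta = np.array([prior_sample(rng) for _ in range(n_particles)])
    dist = np.array([abs(simulate(t, rng) - obs_stat) for t in theta])
    weights = np.full(n_particles, 1.0 / n_particles)
    for _ in range(n_rounds):
        eps = np.quantile(dist, quantile)                # shrinking tolerance
        sigma = 2.0 * np.sqrt(np.cov(theta, aweights=weights))
        new_t, new_d, new_w = [], [], []
        while len(new_t) < n_particles:
            i = rng.choice(n_particles, p=weights)       # resample a particle
            prop = theta[i] + sigma * rng.standard_normal()
            if not np.isfinite(prior_logpdf(prop)):
                continue                                 # outside the prior
            d = abs(simulate(prop, rng) - obs_stat)
            if d <= eps:
                # Importance weight: prior over the perturbation mixture.
                kern = np.exp(-0.5 * ((prop - theta) / sigma) ** 2)
                new_w.append(np.exp(prior_logpdf(prop)) / np.sum(weights * kern))
                new_t.append(prop); new_d.append(d)
        theta, dist = np.array(new_t), np.array(new_d)
        weights = np.array(new_w) / np.sum(new_w)
    return theta, weights

# Toy check: infer a Gaussian mean from its sample mean.
sim = lambda th, rng: np.mean(rng.normal(th, 1.0, 100))
post, w = abc_smc(0.7, sim,
                  lambda rng: rng.uniform(-5.0, 5.0),
                  lambda th: 0.0 if -5.0 <= th <= 5.0 else -np.inf)
```

In the paper's setting the simulator would be the stochastic merging/secular-evolution model and the statistic a carefully selected summary of the observed morphological demographics.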
Abstract:
Analytically or computationally intractable likelihood functions can arise in complex statistical inference problems, making them inaccessible to standard Bayesian inferential methods. Approximate Bayesian computation (ABC) methods address such problems by replacing direct likelihood evaluations with repeated sampling from the model. ABC methods have been predominantly applied to parameter estimation problems and less to model choice problems due to the added difficulty of handling multiple model spaces. The ABC algorithm proposed here addresses model choice problems by extending Fearnhead and Prangle (2012, Journal of the Royal Statistical Society, Series B 74, 1–28), where the posterior mean of the model parameters estimated through regression formed the summary statistics used in the discrepancy measure. An additional stepwise multinomial logistic regression is performed on the model indicator variable in the regression step, and the estimated model probabilities are incorporated into the set of summary statistics for model choice purposes. A reversible jump Markov chain Monte Carlo step is also included in the algorithm to increase model diversity for thorough exploration of the model space. The algorithm was applied to a validation example to demonstrate its robustness across a wide range of true model probabilities. Its subsequent use in three pathogen transmission examples of varying complexity illustrates the utility of the algorithm in inferring preference of particular transmission models for the pathogens.
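As a hedged sketch of the regression step described (with toy models standing in for the pathogen transmission models), the code below fits a multinomial logistic regression of the model indicator on raw summaries from pilot simulations and uses the fitted model probabilities as the summary statistics for model choice.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)

def raw_summaries(x):
    """Raw summary statistics of one simulated dataset."""
    return np.array([x.mean(), x.std(), np.median(x)])

# Pilot run: simulate datasets under each candidate model (toy models here)
# and record the model indicator alongside the raw summaries.
n_pilot = 2000
labels, stats = [], []
for _ in range(n_pilot):
    m = rng.integers(2)                       # model indicator
    x = rng.exponential(2.0, 50) if m == 0 else rng.gamma(2.0, 1.0, 50)
    labels.append(m)
    stats.append(raw_summaries(x))
stats, labels = np.array(stats), np.array(labels)

# Regression step (after Fearnhead & Prangle): regress the model indicator
# on the raw summaries; the fitted probabilities become the model-choice
# summary statistics used in the ABC discrepancy.
clf = LogisticRegression().fit(stats, labels)

observed = rng.gamma(2.0, 1.0, 50)            # pretend field data
s_obs = clf.predict_proba(raw_summaries(observed).reshape(1, -1))
print(s_obs)   # estimated model probabilities -> ABC summary statistics
```

The stepwise variable selection and the reversible jump step of the full algorithm are omitted here for brevity.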
Abstract:
This work describes recent extensions to the GPFlow scientific workflow system in development at MQUTeR (www.mquter.qut.edu.au), which facilitate interactive experimentation, automatic lifting of computations from single-case to collection-oriented computation and automatic correlation and synthesis of collections. A GPFlow workflow presents as an acyclic data flow graph, yet provides powerful iteration and collection formation capabilities.
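A minimal sketch of what lifting a single-case computation to a collection-oriented one can look like, assuming a plain Python decorator; this illustrates the idea only and is not GPFlow's actual mechanism.

```python
from functools import wraps
import math

def lift(func):
    """Promote a single-case function so that, given a collection, it is
    applied elementwise and returns a collection of the same kind, while
    single-case behaviour is preserved (a sketch of 'lifting')."""
    @wraps(func)
    def wrapper(arg):
        if isinstance(arg, (list, tuple)):
            return type(arg)(func(item) for item in arg)
        return func(arg)
    return wrapper

@lift
def to_decibels(power):
    # Stand-in single-case analysis step.
    return 10.0 * math.log10(power)

print(to_decibels(100.0))               # single case -> 20.0
print(to_decibels([1.0, 10.0, 100.0]))  # collection  -> [0.0, 10.0, 20.0]
```

In a workflow system the same promotion would be performed by the engine over data-flow edges rather than by a decorator.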
Abstract:
This paper examines discourses of male prostitution through an analysis of scientific texts. A contrast is drawn between nineteenth-century understandings of male prostitution and twentieth-century accounts of male prostitution. In contrast to female prostitution, male prostitution was not regarded as a significant social problem throughout the nineteenth century, despite its close association with gender deviation and social disorder. Changing conceptions of sexuality, linked with the emergence of the ‘adolescent’, drew scientific attention to male prostitution during the 1940s and 1950s. Research suggested that male prostitution was a problem associated with the development of sexual identity. Through the application of scientific techniques, which tagged and differentiated male prostitute populations, a language developed about male prostitution that allowed for normative assessments and judgements to be made concerning particular classes of male prostitute. The paper highlights how a broad distinction emerged between public prostitutes, regarded as heterosexual/masculine, and private prostitutes, regarded as homosexual/effeminate. This distinction altered the way in which male prostitution was understood and governed, allowing for male prostitution to be constituted as a public health concern.
Abstract:
Most of the existing algorithms for approximate Bayesian computation (ABC) assume that it is feasible to simulate pseudo-data from the model at each iteration. However, the computational cost of these simulations can be prohibitive for high-dimensional data. An important example is the Potts model, which is commonly used in image analysis. Images encountered in real-world applications can have millions of pixels, therefore scalability is a major concern. We apply ABC with a synthetic likelihood to the hidden Potts model with additive Gaussian noise. Using a pre-processing step, we fit a binding function to model the relationship between the model parameters and the synthetic likelihood parameters. Our numerical experiments demonstrate that the precomputed binding function dramatically improves the scalability of ABC, reducing the average runtime required for model fitting from 71 hours to only 7 minutes. We also illustrate the method by estimating the smoothing parameter for remotely sensed satellite imagery. Without precomputation, Bayesian inference is impractical for datasets of that scale.
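As a hedged sketch of the precomputation idea (with a toy statistic standing in for the Potts model's sufficient statistic), the code below fits a binding function from pilot simulations and then runs MCMC through the Gaussian synthetic likelihood evaluated via that fitted function, so no simulation happens inside the chain.

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate_stat(beta):
    """Toy stand-in for the Potts model: in the real application this would
    be the sufficient statistic (like-neighbour count) of an image
    simulated at inverse temperature beta."""
    return np.mean(rng.normal(np.tanh(beta) * 100.0, 5.0, 20))

# Pre-processing: fit the binding function mapping beta to the mean and
# standard deviation of the summary statistic, using pilot simulations.
betas = np.linspace(0.0, 2.0, 40)
reps = np.array([[simulate_stat(b) for _ in range(50)] for b in betas])
mean_fit = np.polynomial.Polynomial.fit(betas, reps.mean(axis=1), deg=3)
sd_fit = np.polynomial.Polynomial.fit(betas, reps.std(axis=1), deg=3)

def synthetic_loglik(beta, s_obs):
    """Gaussian synthetic likelihood through the precomputed binding
    function -- no fresh simulation needed at MCMC time."""
    mu, sd = mean_fit(beta), max(sd_fit(beta), 1e-6)
    return -0.5 * ((s_obs - mu) / sd) ** 2 - np.log(sd)

# Random-walk Metropolis on beta using only the precomputed likelihood.
s_obs = simulate_stat(0.8)                      # pretend observed statistic
beta, ll = 1.0, synthetic_loglik(1.0, s_obs)
chain = []
for _ in range(5000):
    prop = beta + 0.05 * rng.standard_normal()
    if 0.0 <= prop <= 2.0:                      # uniform prior support assumed
        llp = synthetic_loglik(prop, s_obs)
        if np.log(rng.uniform()) < llp - ll:
            beta, ll = prop, llp
    chain.append(beta)
```

Shifting all simulation into the pre-processing step is what turns the reported 71-hour fits into minutes: the per-iteration cost of the chain no longer depends on the image size.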