929 results for scientific computation


Relevance:

20.00%

Publisher:

Abstract:

Associations between the consumption of particular foods and health outcomes may be indicated by observational studies. However, intervention trials that evaluate the health benefits of foods provide the strongest evidence to support dietary recommendations for health. Thus, it is important that these trials are carried out safely, and to high scientific standards. Accepted standards for the reporting of the health benefits of pharmaceutical and other medical interventions have been provided by the Consolidated Standards of Reporting Trials (CONSORT) statement. However, there are no generally accepted standards for trials to evaluate the health benefits of foods. Trials with foods differ from medical trials in issues related to safety, ethics, research governance and practical implementation. Furthermore, these important issues can deter the conduct of both medical and nutrition trials in infants, children and adolescents. This paper provides standards for the planning, design, conduct, statistical analysis and interpretation of human intervention trials to evaluate the health benefits of foods that are based on the CONSORT guidelines, and outlines the key issues that need to be addressed in trials in participants in the paediatric age range.

Relevance:

20.00%

Publisher:

Abstract:

In this paper we present a design methodology for algorithm/architecture co-design of a voltage-scalable, process variation aware motion estimator based on significance driven computation. The fundamental premise of our approach lies in the fact that not all computations are equally significant in shaping the output response of video systems. We use a statistical technique to intelligently identify these significant/not-so-significant computations at the algorithmic level and subsequently change the underlying architecture such that the significant computations are computed in an error-free manner under voltage over-scaling. Furthermore, our design includes an adaptive quality compensation (AQC) block which "tunes" the algorithm and architecture depending on the magnitude of voltage over-scaling and the severity of process variations. Simulation results show average power savings of ~33% for the proposed architecture when compared to a conventional implementation in 90 nm CMOS technology. The maximum output quality loss in terms of Peak Signal to Noise Ratio (PSNR) was ~1 dB, without incurring any throughput penalty.
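To make the idea of significance-driven computation concrete, the following Python sketch (not from the paper; the function name, bit split and error model are hypothetical) splits the sum-of-absolute-differences used in block-based motion estimation into a significant part that is always computed exactly and a less significant remainder that may be corrupted under voltage over-scaling.

import numpy as np

def significance_split_sad(block_a, block_b, msb_bits=4, error_rate=0.0, rng=None):
    """Illustrative significance-driven SAD for motion estimation.

    Per-pixel absolute differences are split into a significant part
    (the top `msb_bits` bits, assumed to be computed error-free) and a
    less significant remainder, which may be corrupted under voltage
    over-scaling (modelled here by randomly dropping remainder terms).
    """
    rng = rng or np.random.default_rng(0)
    diff = np.abs(block_a.astype(np.int32) - block_b.astype(np.int32))

    shift = 8 - msb_bits                      # assume 8-bit pixels
    significant = (diff >> shift) << shift    # always computed exactly
    remainder = diff - significant            # may be approximated

    # Model voltage-over-scaling errors: some remainder terms are lost.
    keep = rng.random(diff.shape) >= error_rate
    return int(significant.sum() + (remainder * keep).sum())

# Usage: compare the exact SAD with the significance-driven approximation.
rng = np.random.default_rng(1)
a = rng.integers(0, 256, (16, 16), dtype=np.uint8)
b = rng.integers(0, 256, (16, 16), dtype=np.uint8)
exact = int(np.abs(a.astype(int) - b.astype(int)).sum())
approx = significance_split_sad(a, b, msb_bits=4, error_rate=0.2)
print(exact, approx)  # the approximation tracks the exact SAD closely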

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we propose a design paradigm for energy-efficient and variation-aware operation of next-generation multicore heterogeneous platforms. The main idea behind the proposed approach lies in the observation that not all operations are equally important in shaping the output quality of various applications and of the overall system. Based on this observation, we suggest that all levels of the software design stack, including the programming model, compiler, operating system (OS) and run-time system, should identify the critical tasks and ensure correct operation of such tasks by assigning them to dynamically adjusted reliable cores/units. Specifically, based on error rates and operating conditions identified by a sense-and-adapt (SeA) unit, the OS selects and sets the right mode of operation of the overall system. The run-time system identifies the critical/less-critical tasks based on special directives and schedules them to the appropriate units, which are dynamically adjusted for highly accurate/approximate operation by tuning their voltage/frequency. Units that execute less significant operations can, if required, operate at voltages below what is needed for fully correct operation and thus consume less power, since such tasks, unlike the critical ones, do not need to be exact at all times. Such a scheme can lead to energy-efficient and reliable operation, while reducing the design cost and overheads of conventional circuit/micro-architecture level techniques.
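As a rough illustration of the run-time policy described above (a toy sketch only; Task, Core and the round-robin mapping are invented for this example, not taken from the paper), critical tasks are pinned to reliably operated cores while less critical ones may be placed on cores running at reduced voltage/frequency:

from dataclasses import dataclass

@dataclass
class Task:
    name: str
    critical: bool   # would come from a programmer directive in the real system

@dataclass
class Core:
    name: str
    reliable: bool   # dynamically adjusted via voltage/frequency tuning

def schedule(tasks, cores):
    """Map critical tasks to reliable cores, others to low-power cores.

    A toy stand-in for the run-time policy sketched in the abstract; the
    real system would also consult error rates reported by the
    sense-and-adapt unit and re-tune core voltage/frequency at run time.
    """
    reliable = [c for c in cores if c.reliable]
    relaxed = [c for c in cores if not c.reliable] or reliable
    mapping = {}
    for i, t in enumerate(tasks):
        pool = reliable if t.critical else relaxed
        mapping[t.name] = pool[i % len(pool)].name
    return mapping

tasks = [Task("decode_header", True), Task("enhance_frame", False),
         Task("update_state", True), Task("postfilter", False)]
cores = [Core("big0", True), Core("little0", False), Core("little1", False)]
print(schedule(tasks, cores))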

Relevance:

20.00%

Publisher:

Abstract:

In a Bayesian learning setting, the posterior distribution of a predictive model arises from a trade-off between its prior distribution and the conditional likelihood of observed data. Such distribution functions usually rely on additional hyperparameters which need to be tuned in order to achieve optimum predictive performance; this operation can be efficiently performed in an Empirical Bayes fashion by maximizing the posterior marginal likelihood of the observed data. Since the score function of this optimization problem is in general characterized by the presence of local optima, it is necessary to resort to global optimization strategies, which require a large number of function evaluations. Given that the evaluation is usually computationally intensive and scales badly with the dataset size, the maximum number of observations that can be treated simultaneously is quite limited. In this paper, we consider the case of hyperparameter tuning in Gaussian process regression. A straightforward implementation of the posterior log-likelihood for this model requires O(N^3) operations for every iteration of the optimization procedure, where N is the number of examples in the input dataset. We derive a novel set of identities that allow, after an initial overhead of O(N^3), the evaluation of the score function, as well as the Jacobian and Hessian matrices, in O(N) operations. We prove that the proposed identities, which follow from the eigendecomposition of the kernel matrix, yield a reduction of several orders of magnitude in the computation time for the hyperparameter optimization problem. Notably, the proposed solution provides computational advantages even with respect to state-of-the-art approximations that rely on sparse kernel matrices.
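A minimal sketch of the kind of saving such an eigendecomposition enables, shown here only for the noise-variance hyperparameter of a fixed RBF kernel (an illustration of the flavour of the approach, not the paper's actual identities, which also cover the Jacobian, the Hessian and general hyperparameters): after a one-off O(N^3) eigendecomposition, each evaluation of the log marginal likelihood costs only O(N).

import numpy as np

def precompute(K, y):
    """O(N^3) one-off cost: eigendecompose the kernel matrix."""
    lam, U = np.linalg.eigh(K)          # K = U diag(lam) U^T
    alpha = U.T @ y                     # targets projected onto eigenvectors
    return lam, alpha

def neg_log_marginal_likelihood(sigma2, lam, alpha):
    """O(N) evaluation for each candidate noise variance sigma2."""
    n = lam.size
    s = lam + sigma2
    return 0.5 * (np.sum(np.log(s)) + np.sum(alpha**2 / s) + n * np.log(2 * np.pi))

# Toy usage with an RBF kernel on random 1-D inputs.
rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 200)
K = np.exp(-0.5 * (x[:, None] - x[None, :])**2)
y = np.sin(x) + 0.1 * rng.standard_normal(x.size)

lam, alpha = precompute(K, y)                       # O(N^3), done once
grid = np.logspace(-4, 1, 50)
nlls = [neg_log_marginal_likelihood(s2, lam, alpha) for s2 in grid]  # O(N) each
print("best noise variance:", grid[int(np.argmin(nlls))])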

Relevance:

20.00%

Publisher:

Abstract:

Molecular logic-based computation is a broad umbrella covering molecular sensors at its simplest level and logic gate arrays involving steadily increasing levels of parallel and serial integration. The fluorescent PET (photoinduced electron transfer) switching principle remains a loyal servant of this entire field. Applications arise from the convenient operation of molecular information processors in very small spaces.

Relevance:

20.00%

Publisher:

Abstract:

Enhanced Indispensability Arguments (EIA) claim that Scientific Realists are committed to the existence of mathematical entities due to their reliance on Inference to the Best Explanation (IBE). Our central question concerns this purported parity of reasoning: do people who defend the EIA make an appropriate use of the resources of Scientific Realism (in particular, IBE) to achieve platonism? (§2) We argue that just because a variety of different inferential strategies can be employed by Scientific Realists does not mean that ontological conclusions concerning which things we should be Scientific Realists about are arrived at by any inferential route which eschews causes (§3), nor is there any direct pressure for Scientific Realists to change their inferential methods (§4). We suggest that in order to maintain inferential parity with Scientific Realism, proponents of EIA need to give details about how and in what way the presence of mathematical entities directly contributes to explanations (§5).

Relevance:

20.00%

Publisher:

Abstract:

It is an exciting era for molecular computation because molecular logic gates are being pushed in new directions. The use of sulfur rather than the commonplace nitrogen as the key receptor atom in metal ion sensors is one of these directions; plant cells coming within the jurisdiction of fluorescent molecular thermometers is another; combining photochromism with voltammetry for molecular electronics is yet another. Two-input logic gates benefit from old ideas such as rectifying bilayer electrodes, cyclodextrin-enhanced room-temperature phosphorescence, steric hindrance, the polymerase chain reaction, charge transfer absorption of donor–acceptor complexes and lectin–glycocluster interactions. Furthermore, the concept of photo-uncaging enables rational ways of concatenating logic gates. Computational concepts are also applied to potential cancer theranostics and to the selective monitoring of neurotransmitters in situ. Higher numbers of inputs are also accommodated with the concept of functional integration of gates, where complex input–output patterns are sought out and analysed. Molecular emulation of computational components such as demultiplexers and parity generators/checkers is achieved in related ways. Complexity of another order is tackled with molecular edge detection routines.

Relevance:

20.00%

Publisher:

Abstract:

Motivated by the need for designing efficient and robust fully-distributed computation in highly dynamic networks such as Peer-to-Peer (P2P) networks, we study distributed protocols for constructing and maintaining dynamic network topologies with good expansion properties. Our goal is to maintain a sparse (bounded-degree) expander topology despite heavy churn (i.e., nodes joining and leaving the network continuously over time). We assume that the churn is controlled by an adversary that has complete knowledge and control of what nodes join and leave and at what time, and has unlimited computational power, but is oblivious to the random choices made by the algorithm. Our main contribution is a randomized distributed protocol that guarantees with high probability the maintenance of a constant-degree graph with high expansion even under continuous high adversarial churn. Our protocol can tolerate a churn rate of up to O(n/polylog(n)) per round (where n is the stable network size). Our protocol is efficient, lightweight, and scalable, and it incurs only O(polylog(n)) overhead for topology maintenance: only polylogarithmic (in n) bits need to be processed and sent by each node per round, and any node's computation cost per round is also polylogarithmic. The given protocol is a fundamental ingredient needed for the design of efficient fully-distributed algorithms for solving fundamental distributed computing problems such as agreement, leader election, search, and storage in highly dynamic P2P networks, and it enables fast and scalable algorithms for these problems that can tolerate a large amount of churn.
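The following toy simulation (not the paper's protocol; the join/leave rule and the spectral-gap measure are stand-ins chosen for illustration) shows how the expansion of a bounded-degree graph can be monitored under sustained churn via the second eigenvalue of the degree-normalized adjacency matrix:

import numpy as np

def spectral_gap(adj):
    """Expansion proxy: gap between 1 and the second-largest eigenvalue of
    the degree-normalized adjacency matrix (larger gap = better expander)."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(deg, 1))
    A = d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    eig = np.sort(np.linalg.eigvalsh(A))
    return 1.0 - eig[-2]

def churn_step(adj, d, rng):
    """Toy churn: one node leaves, a new node joins in its place and wires
    itself to d random existing nodes (a stand-in for the protocol's
    randomized reconnection rules; the real protocol also repairs the
    holes left at the departed node's neighbours)."""
    n = adj.shape[0]
    leaver = rng.integers(n)
    adj[leaver, :] = 0
    adj[:, leaver] = 0
    targets = rng.choice([i for i in range(n) if i != leaver], size=d, replace=False)
    adj[leaver, targets] = 1
    adj[targets, leaver] = 1
    return adj

rng = np.random.default_rng(0)
n, d = 200, 8
adj = np.zeros((n, n), dtype=float)
for i in range(n):                      # random d-out graph, symmetrised
    for j in rng.choice(n - 1, size=d, replace=False):
        j = j if j < i else j + 1
        adj[i, j] = adj[j, i] = 1

for step in range(500):                 # sustained churn
    adj = churn_step(adj, d, rng)
print("spectral gap after churn:", round(spectral_gap(adj), 3))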

Relevance:

20.00%

Publisher:

Abstract:

Molecular logic-based computation continues to throw up new applications in sensing and switching, the newest of which is the edge detection of objects. The scope of this phenomenon is mapped out by the use of structure-activity relationships, where several structures of the molecules and of the objects are examined. The different angles and curvatures of the objects are followed with good fidelity in the visualized edges, even when the objects are in reverse video.

Relevance:

20.00%

Publisher:

Abstract:

In 1997 a scandal associated with Bre-X, a junior mining firm, and its prospecting activities in Indonesia exposed to public scrutiny the ways in which mineral exploration firms acquire, assess and report on scientific claims about the natural environment. At stake here was not just how investors understood the provisional nature of scientific knowledge, but also evidence of fraud. Contemporaneous mining scandals included not only the salting of cores, but also unreliable proprietary sample preparation and assay methods, misrepresentations of visual field estimates as drilling results and ‘overly optimistic’ geological reports. This paper reports on initiatives taken in the wake of these scandals and prompted by the Mining Standards Task Force (TSE/OSC 1999). For regulators, mandated to increase investor confidence in Canada’s leading role within the global mining industry, efforts focused first and foremost upon identifying and removing sources of error and wilfulness within the production and circulation of scientific knowledge claims. A common goal cross-cutting these initiatives was ‘a faithful representation of nature’ (Daston and Galison 2010); however, as the paper argues, this was manifested in an assemblage of practices governed by distinct and rival regulative visions of science and the making of markets in claims about ‘nature’. These ‘practices of fidelity’, it is argued, can be consequential in shaping the spatial and temporal dynamics of the marketization of nature.