849 results for Distributed Calculations


Relevance:

20.00%

Publisher:

Abstract:

This thesis explores system performance for reconfigurable distributed systems and provides an analytical model for determining the throughput of theoretical systems based on the OpenSPARC FPGA Board and the SIRC Communication Framework. The model was developed by studying a small set of variables that together determine a system's throughput. Its value lies in helping system designers decide whether or not to commit to designing a reconfigurable distributed system, based on estimated performance and hardware costs. Because custom hardware design and distributed system design are both time-consuming and costly, it is important for designers to assess system feasibility early in the development cycle. Based on experimental data, the model presented in this thesis shows a close fit, with less than 10% experimental error on average. The model is limited to a certain range of problems, but it remains useful within those limitations and provides a foundation for further work on modeling reconfigurable distributed systems.
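As a rough illustration of the kind of throughput model the abstract describes, the sketch below bounds system throughput by the slower of the communication stage and the parallel-compute stage. The function and its parameters (t_comm_s, t_compute_s, n_boards) are invented for illustration and are not the thesis's actual model variables.

```python
def estimated_throughput(t_comm_s, t_compute_s, n_boards):
    """Hypothetical pipelined throughput estimate, in jobs per second.

    Assumes the host streams jobs over a shared link while n_boards FPGAs
    compute in parallel; the slower of the two stages is the bottleneck.
    These parameters are illustrative, not the thesis's model terms.
    """
    compute_rate = n_boards / t_compute_s  # boards work in parallel
    comm_rate = 1.0 / t_comm_s             # one shared communication link
    return min(compute_rate, comm_rate)
```

For example, with a 10 ms communication time per job, a 50 ms compute time, and 10 boards, the estimate is communication-bound: min(10/0.05, 1/0.01) = 100 jobs per second.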

Relevance:

20.00%

Publisher:

Abstract:

This thesis presents two frameworks, a software framework and a hardware core manager framework, which together can be used to develop a processing platform using a distributed system of field-programmable gate array (FPGA) boards. The software framework provides users with the ability to easily develop applications that exploit the processing power of FPGAs, while the hardware core manager framework lets users configure and interact with multiple FPGA boards and/or hardware cores. The thesis describes the design and development of these frameworks and analyzes the performance of a system constructed using them. The performance analysis measured the effect of incorporating additional hardware components into the system and compared the system to a software-only implementation. The work draws conclusions from the results of the performance analysis and offers suggestions for future work.
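A minimal sketch of what a hardware core manager interface of this kind might look like; the class and method names below are hypothetical illustrations, not taken from the thesis.

```python
class HardwareCoreManager:
    """Hypothetical sketch of a core-manager interface: it tracks which
    hardware cores are configured on which boards and routes work to them.
    All names here are invented for illustration."""

    def __init__(self):
        self.boards = {}  # board_id -> list of configured core names

    def register_board(self, board_id):
        self.boards[board_id] = []

    def configure_core(self, board_id, core_name):
        # in a real system this step would load a bitstream onto the FPGA
        self.boards[board_id].append(core_name)

    def dispatch(self, core_name, payload):
        # route work to the first board exposing the requested core
        for board_id, cores in self.boards.items():
            if core_name in cores:
                return board_id, payload  # stand-in for a real data transfer
        raise LookupError(f"no board provides core {core_name!r}")
```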

Relevance:

20.00%

Publisher:

Abstract:

The initiation and maintenance of physiological and pathophysiological oscillatory activity depend on the synaptic interactions within neuronal networks. We studied the mechanisms underlying evoked transient network oscillation in acute slices of the adolescent rat somatosensory cortex and modeled its underpinning mechanisms. Oscillations were evoked by brief, spatially distributed noisy extracellular stimulation delivered via bipolar electrodes, and were detected with multi-neuron patch-clamp recordings under different pharmacological conditions. The observed oscillations lie in the 2-5 Hz frequency range and consist of compound synaptic events 4-12 mV in amplitude and 40-150 ms in width, with rare overlying action potentials. This evoked transient network oscillation is only weakly expressed in the somatosensory cortex and requires [K+]o increased to 6.25 mM, with [Ca2+]o decreased to 1.5 mM and [Mg2+]o to 0.5 mM. A peak in the cross-correlation among the membrane potentials of layer II/III, IV, and V neurons reflects the network-driven basis of the evoked transient network oscillation. Its initiation is accompanied by an increase in [K+]o and can be prevented by the K+ channel blocker quinidine. In addition, a shift of the chloride reversal potential takes place during stimulation, resulting in a depolarizing type A GABA (GABA(A)) receptor response. Blockade of alpha-amino-3-hydroxy-5-methyl-4-isoxazolepropionate (AMPA), N-methyl-D-aspartate (NMDA), or GABA(A) receptors, as well as of gap junctions, prevents the oscillation, while a reduction of AMPA or GABA(A) receptor desensitization increases its duration and amplitude.
The apparent reversal potential of -27 mV, the pharmacological profile, and the modeling results suggest a mixed contribution of glutamatergic, excitatory GABAergic, and gap-junctional conductances to the initiation and maintenance of this oscillatory activity. With these properties, evoked transient network oscillation resembles epileptic afterdischarges more than any other form of physiological or pathophysiological neocortical oscillatory activity.
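The cross-correlation evidence mentioned above can be illustrated with a small sketch: given two mean-subtracted recordings, a normalized cross-correlation peak near lag zero indicates synchronous, network-driven fluctuations. This is a generic illustration, not the study's analysis code.

```python
import math

def xcorr(a, b, max_lag):
    """Normalized cross-correlation of two equal-length signals at integer
    lags in [-max_lag, max_lag]. A dominant peak near lag 0 suggests a
    common network-driven drive; a peak at lag d suggests b lags a by d."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    da = [x - ma for x in a]
    db = [x - mb for x in b]
    norm = math.sqrt(sum(x * x for x in da) * sum(x * x for x in db))
    out = {}
    for lag in range(-max_lag, max_lag + 1):
        s = 0.0
        for i in range(len(a)):
            j = i + lag
            if 0 <= j < len(b):
                s += da[i] * db[j]
        out[lag] = s / norm
    return out

def peak_lag(a, b, max_lag):
    c = xcorr(a, b, max_lag)
    return max(c, key=c.get)
```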

Relevance:

20.00%

Publisher:

Abstract:

Power calculations in a small-sample comparative study with a continuous outcome measure are typically undertaken using the asymptotic distribution of the test statistic. When the sample size is small, this asymptotic result can be a poor approximation. An alternative approach, using a rank-based test statistic, is an exact power calculation. When the number of groups is greater than two, however, the number of calculations required to perform an exact power calculation is prohibitive. To reduce the computational burden, a Monte Carlo resampling procedure is used to approximate the exact power function of a k-sample rank test statistic under the family of Lehmann alternative hypotheses. The motivating example for this approach is the design of animal studies, where the number of animals per group is typically small.
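A minimal sketch of the Monte Carlo idea: under a Lehmann alternative, group j has CDF F(x)^θ_j, so with F uniform one can simulate group j as U^(1/θ_j); power is then estimated as the rejection fraction of a k-sample rank test (here the standard Kruskal-Wallis statistic with its chi-square critical value) across simulated datasets. The surrounding code is an illustrative assumption, not the paper's implementation.

```python
import random

def kruskal_wallis_H(groups):
    """Kruskal-Wallis H statistic (no tie correction; ties are
    negligible for continuous simulated data)."""
    pooled = sorted((x, gi) for gi, g in enumerate(groups) for x in g)
    N = len(pooled)
    rank_sums = [0.0] * len(groups)
    for rank, (_, gi) in enumerate(pooled, start=1):
        rank_sums[gi] += rank
    return 12.0 / (N * (N + 1)) * sum(
        rs * rs / len(g) for rs, g in zip(rank_sums, groups)) - 3 * (N + 1)

def mc_power(thetas, n_per_group, n_sim=2000, crit=5.991, seed=1):
    """Monte Carlo power estimate under Lehmann alternatives.

    Group j has CDF F(x)**thetas[j]; with F uniform on [0, 1], the draw
    U**(1/theta) has CDF x**theta. `crit` is the chi-square(k-1) 5%
    critical value (5.991 for k = 3 groups).
    """
    rng = random.Random(seed)
    rejections = 0
    for _ in range(n_sim):
        groups = [[rng.random() ** (1.0 / th) for _ in range(n_per_group)]
                  for th in thetas]
        if kruskal_wallis_H(groups) > crit:
            rejections += 1
    return rejections / n_sim
```

Setting all θ_j = 1 recovers the null, so the estimated power should sit near the nominal 5% level; strongly separated θ_j should push it toward 1 even with small groups.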

Relevance:

20.00%

Publisher:

Abstract:

The ability to make scientific findings reproducible is increasingly important in areas where substantive results are the product of complex statistical computations. Reproducibility allows others to verify published findings and to conduct alternate analyses of the same data. A question that arises naturally is: how can one conduct and distribute reproducible research? This question is relevant both to authors who want to make their research reproducible and to readers who want to reproduce relevant findings reported in the scientific literature. We present a framework in which reproducible research can be conducted and distributed via cached computations, and we describe specific tools for both authors and readers. As a prototype implementation, we introduce three software packages written in the R language. The cacheSweave and stashR packages together provide tools for caching computational results in a key-value-style database that can be published to a public repository for readers to download. The SRPM package provides tools for generating and interacting with "shared reproducibility packages" (SRPs), which can facilitate the distribution of the data and code. As a case study, we demonstrate the use of the toolkit on a national study of air pollution exposure and mortality.
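The idea of caching computational results in a key-value database, as cacheSweave and stashR do for R, can be sketched in Python as a decorator that keys a persistent store by a hash of the function name and its arguments. This is a conceptual analogue for illustration, not the packages' actual mechanism.

```python
import functools
import hashlib
import pickle
import shelve

def cached(db_path):
    """Cache a function's results in a key-value store on disk, keyed by a
    hash of the function name and its pickled arguments. Re-running the
    same computation then fetches the stored result instead of recomputing,
    which is the essence of distributing research via cached computations."""
    def deco(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            key = hashlib.sha256(
                pickle.dumps((fn.__name__, args, sorted(kwargs.items())))
            ).hexdigest()
            with shelve.open(db_path) as db:
                if key not in db:
                    db[key] = fn(*args, **kwargs)
                return db[key]
        return wrapper
    return deco
```

A reader given the populated cache file can call the same functions and retrieve the author's stored results without re-running the expensive computation.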

Relevance:

20.00%

Publisher:

Abstract:

Numerous time series studies have provided strong evidence of an association between increased levels of ambient air pollution and increased levels of hospital admissions, typically at 0, 1, or 2 days after an air pollution episode. An important research aim is to extend existing statistical models so that a more detailed understanding of the time course of hospitalization after exposure to air pollution can be obtained. Information about this time course, combined with prior knowledge about biological mechanisms, could provide the basis for hypotheses concerning the mechanism by which air pollution causes disease. Previous studies have identified two important methodological questions: (1) How can we estimate the shape of the distributed lag between increased air pollution exposure and increased mortality or morbidity? and (2) How should we estimate the cumulative population health risk from short-term exposure to air pollution? Distributed lag models are appropriate tools for estimating air pollution health effects that may be spread over several days. However, estimation for distributed lag models in air pollution and health applications is hampered by the substantial noise in the data and the inherently weak signal that is the target of investigation. We introduce a hierarchical Bayesian distributed lag model that incorporates prior information about the time course of pollution effects and combines information across multiple locations. The model has a connection to penalized spline smoothing using a special type of penalty matrix. We apply the model to estimate the distributed lag between exposure to particulate matter air pollution and hospitalization for cardiovascular and respiratory disease, using data from a large United States air pollution and hospitalization database of Medicare enrollees in 94 counties covering the years 1999-2002.
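The core of a distributed lag model can be sketched as a regression of the outcome on current and lagged exposures, with a penalty that stabilizes the noisy lag coefficients. The ridge penalty below is a simple frequentist stand-in for the hierarchical Bayesian prior and spline-type penalty matrix described above, included only to make the structure concrete.

```python
def lag_matrix(x, max_lag):
    # row for time t holds [x[t], x[t-1], ..., x[t-max_lag]]
    return [[x[t - l] for l in range(max_lag + 1)]
            for t in range(max_lag, len(x))]

def solve(A, b):
    # Gauss-Jordan elimination with partial pivoting (small systems only)
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[r][c]:
                f = M[r][c] / M[c][c]
                for k in range(c, n + 1):
                    M[r][k] -= f * M[c][k]
    return [M[i][n] / M[i][i] for i in range(n)]

def ridge_dlag(x, y, max_lag, lam=0.1):
    """Distributed lag coefficients via ridge-penalized least squares:
    solves (X'X + lam*I) beta = X'y, shrinking noisy lag estimates."""
    X = lag_matrix(x, max_lag)
    ys = y[max_lag:]
    p = max_lag + 1
    XtX = [[sum(row[i] * row[j] for row in X) + (lam if i == j else 0.0)
            for j in range(p)] for i in range(p)]
    Xty = [sum(row[i] * yt for row, yt in zip(X, ys)) for i in range(p)]
    return solve(XtX, Xty)
```

Summing the fitted lag coefficients gives an estimate of the cumulative effect of a unit increase in exposure, which is question (2) above.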

Relevance:

20.00%

Publisher:

Abstract:

Differential cyp19 aromatase expression during development leads to sexual dimorphisms in the mammalian brain. Whether this is also true for fish is unknown. The aim of the current study has been to follow the expression of the brain-specific aromatase cyp19a2 in the brains of sexually differentiating zebrafish. To assess the role of cyp19a2 in the zebrafish brain during gonadal differentiation, we used quantitative reverse transcriptase-polymerase chain reaction and immunohistochemistry to detect differences in the transcript or protein levels and/or expression pattern in juvenile fish, histology to monitor the gonadal status, and double immunofluorescence with neuronal or radial glial markers to characterize aromatase-positive cells. Our data show that cyp19a2 expression levels during zebrafish sexual differentiation cannot be assigned to a particular sex; the expression pattern in the brain is similar in both sexes and aromatase-positive cells appear to be mostly of radial glial nature.

Relevance:

20.00%

Publisher:

Abstract:

The purpose of this work was to study and quantify the differences in dose distributions computed with some of the newest dose calculation algorithms available in commercial planning systems. The study was done for clinical cases originally calculated with pencil beam convolution (PBC) in which large density inhomogeneities were present. Three other dose algorithms were used: a pencil-beam-like algorithm, the anisotropic analytical algorithm (AAA); a convolution-superposition algorithm, collapsed cone convolution (CCC); and a Monte Carlo program, voxel Monte Carlo (VMC++). The dose calculation algorithms were compared under static field irradiations at 6 MV and 15 MV, using multileaf collimators and hard wedges where necessary. Five clinical cases were studied: three lung and two breast cases. We found that, in terms of accuracy, the CCC algorithm performed better overall than AAA when compared against VMC++, but AAA remains an attractive option for routine clinical use due to its short computation times. Dose differences between the algorithms and VMC++ for the median value of the planning target volume (PTV) were typically 0.4% (range: 0.0 to 1.4%) in the lung and -1.3% (range: -2.1 to -0.6%) in the breast for the cases analysed. As expected, PTV coverage and dose homogeneity turned out to be more critical in the lung cases than in the breast cases with respect to the accuracy of the dose calculation. This was observed in the dose-volume histograms obtained from the Monte Carlo simulations.

Relevance:

20.00%

Publisher:

Abstract:

The conversion of computed tomography (CT) numbers into material composition and mass density data influences the accuracy of patient dose calculations in Monte Carlo treatment planning (MCTP). The aim of our work was to develop a CT conversion scheme by performing a stoichiometric CT calibration. Fourteen dosimetrically equivalent tissue subsets (bins), ten of which were bone bins, were created. After validating the proposed CT conversion scheme on phantoms, it was compared to a conventional five-bin scheme with only one bone bin. This resulted in dose distributions D(14) and D(5) for nine clinical patient cases in a European multi-centre study. The observed local relative differences in dose to medium were mostly smaller than 5%. The dose-volume histograms of both targets and organs at risk were comparable, although within bony structures D(14) was found to be slightly but systematically higher than D(5). Converting dose to medium to dose to water (D(14) to D(14wat) and D(5) to D(5wat)) resulted in larger local differences, with D(5wat) up to 10% higher than D(14wat). In conclusion, multiple bone bins need to be introduced when Monte Carlo (MC) calculations of patient dose distributions are converted to dose to water.
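A CT conversion scheme of this kind amounts to a lookup from CT number (in Hounsfield units) to a material and mass density. The sketch below uses a handful of bins with invented, illustrative HU ranges and densities; a real stoichiometric calibration is scanner-specific, and the 14-bin scheme above subdivides bone far more finely than this.

```python
# Illustrative bins only: (HU lower bound, HU upper bound, material,
# nominal density in g/cm^3). These numbers are assumptions for the
# sketch, NOT a validated calibration; a 14-bin scheme would split the
# single "bone" bin here into ten bone bins.
BINS = [
    (-1000, -950, "air", 0.0012),
    (-950, -120, "lung", 0.26),
    (-120, 20, "soft tissue", 1.00),
    (20, 120, "muscle", 1.06),
    (120, 3000, "bone", 1.45),
]

def hu_to_material(hu):
    """Map a CT number to (material name, nominal mass density)."""
    for lo, hi, name, density in BINS:
        if lo <= hu < hi:
            return name, density
    raise ValueError(f"HU {hu} outside calibrated range")
```

In an MCTP workflow, every voxel of the CT volume would be passed through a lookup of this shape before the Monte Carlo transport is run, which is why the number and placement of bone bins directly affects the computed dose.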