960 results for Averaging Theorem


Relevance: 10.00%

Abstract:

In an estuary, mixing and dispersion result from the combination of large-scale advection and small-scale turbulence, both of which are complex to estimate. A field study was conducted in a small sub-tropical estuary in which high frequency (50 Hz) turbulence data were recorded continuously for about 48 hours. A triple decomposition technique was introduced to isolate the contributions of tides, resonance and turbulence in the flow field. A striking feature of the data set was the slow fluctuations, which exhibited large amplitudes of up to 50% of the tidal amplitude under neap tide conditions. The triple decomposition technique allowed a characterisation of the broader temporal scales in high frequency fluctuation data sampled over a number of full tidal cycles.
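
The abstract does not specify the filter design behind the triple decomposition, but a minimal sketch of the general idea, assuming simple moving-average low-pass filters and illustrative cutoff windows (the window lengths and function names here are hypothetical, not the authors'):

```python
import numpy as np

def triple_decompose(u, fs=50.0, tidal_window_s=3600.0, slow_window_s=60.0):
    """Split a velocity record into tidal, slow-fluctuation and
    turbulent components using two moving-average low-pass filters.

    u  : 1D array of velocity samples
    fs : sampling rate in Hz (50 Hz in the study)
    """
    def moving_average(x, window_s):
        n = max(1, int(window_s * fs))
        kernel = np.ones(n) / n
        return np.convolve(x, kernel, mode="same")

    tidal = moving_average(u, tidal_window_s)        # slowest scales: tides
    slow = moving_average(u, slow_window_s) - tidal  # resonance / slow fluctuations
    turbulence = u - tidal - slow                    # residual: turbulence
    return tidal, slow, turbulence
```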

Relevance: 10.00%

Abstract:

The signal-to-noise ratio achievable in x-ray computed tomography (CT) images of polymer gels can be increased by averaging over multiple scans of each sample. However, repeated scanning delivers a small additional dose to the gel, which may compromise the accuracy of the dose measurement. In this study, a NIPAM-based polymer gel was irradiated and then CT scanned 25 times, with the resulting data used to derive an averaged image and a "zero-scan" image of the gel. Comparison between these two results and the first scan of the gel showed that the averaged and zero-scan images provided better contrast, higher contrast-to-noise and higher signal-to-noise than the initial scan. The pixel values (Hounsfield units, HU) in the averaged image were not noticeably elevated compared to the zero-scan result, and the gradients used in the linear extrapolation of the zero-scan images were small and symmetrically distributed around zero. These results indicate that the averaged image was not artificially lightened by the small additional dose delivered during CT scanning. This work demonstrates the broader usefulness of the zero-scan method as a means to verify the dosimetric accuracy of gel images derived from averaged x-ray CT data.
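
A minimal sketch of the two derived images, assuming the zero-scan image is obtained by a per-pixel linear fit of HU against scan number extrapolated to zero scans (array and function names are illustrative):

```python
import numpy as np

def zero_scan_and_average(scans):
    """scans: array of shape (n_scans, H, W) of HU values from
    repeated CT scans of the same gel."""
    n = scans.shape[0]
    averaged = scans.mean(axis=0)

    # Per-pixel linear fit of HU against scan index; the intercept is
    # the extrapolated "zero-scan" image, the slope is the gradient that
    # reflects any drift caused by the dose delivered while scanning.
    idx = np.arange(1, n + 1)
    coeffs = np.polyfit(idx, scans.reshape(n, -1), 1)  # shape (2, H*W)
    zero_scan = coeffs[1].reshape(averaged.shape)      # intercepts
    gradients = coeffs[0].reshape(averaged.shape)      # slopes
    return averaged, zero_scan, gradients
```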

Relevance: 10.00%

Abstract:

Introduced in this paper is a Bayesian model for isolating the resonant frequency from combustion chamber resonance. The model focuses on characterising the initial rise in the resonant frequency in order to investigate the rise of in-cylinder bulk temperature associated with combustion. By resolving the model parameters, it is possible to determine: the start of pre-mixed combustion, the start of diffusion combustion, the initial resonant frequency, the resonant frequency as a function of crank angle, the in-cylinder bulk temperature as a function of crank angle, and the trapped mass as a function of crank angle. The Bayesian method allows individual cycles to be examined without cycle-averaging, enabling inter-cycle variability studies. Results are shown for a turbo-charged, common-rail compression ignition engine run at 2000 rpm and full load.
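
The abstract does not give the temperature model, but a common choice in this literature links the resonant frequency to bulk temperature through the speed of sound and the cylinder's transverse acoustic modes (Draper's relation). A sketch under that assumption, with illustrative constants (gas properties and bore are not taken from the paper):

```python
import numpy as np

# Draper's relation for cylinder acoustic modes: f = c * alpha / (pi * B),
# where c = sqrt(gamma * R * T) is the speed of sound, B the bore and
# alpha the Bessel mode constant (1.841 for the first circumferential mode).

ALPHA_10 = 1.841   # first circumferential mode constant
GAMMA = 1.33       # ratio of specific heats for hot combustion gas (assumed)
R_GAS = 287.0      # specific gas constant, J/(kg K) (air assumed)
BORE = 0.105       # cylinder bore in metres (illustrative)

def bulk_temperature(f_res):
    """In-cylinder bulk temperature (K) from resonant frequency (Hz)."""
    c = np.pi * BORE * f_res / ALPHA_10   # implied speed of sound
    return c**2 / (GAMMA * R_GAS)

def trapped_mass(pressure, volume, temperature):
    """Trapped mass (kg) from the ideal gas law, given cylinder
    pressure (Pa), instantaneous volume (m^3) and bulk temperature (K)."""
    return pressure * volume / (R_GAS * temperature)
```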

Relevance: 10.00%

Abstract:

Because brain structure and function are affected in neurological and psychiatric disorders, it is important to disentangle the sources of variation in these phenotypes. Over the past 15 years, twin studies have found evidence for both genetic and environmental influences on neuroimaging phenotypes, but considerable variation across studies makes it difficult to draw clear conclusions about the relative magnitude of these influences. Here we performed the first meta-analysis of structural MRI data from 48 studies on >1,250 twin pairs, and diffusion tensor imaging data from 10 studies on 444 twin pairs. The proportion of total variance accounted for by genes (A), shared environment (C), and unshared environment (E) was calculated by averaging A, C, and E estimates across studies from independent twin cohorts, weighting by sample size. The results indicated that additive genetic estimates were significantly different from zero for all meta-analyzed phenotypes, with the exception of fractional anisotropy (FA) of the callosal splenium and cortical thickness (CT) of the uncus, left parahippocampal gyrus, and insula. For many phenotypes there was also a significant influence of C. We now have good estimates of heritability for many regional and lobar CT measures, in addition to the global volumes. Confidence intervals are wide and the number of individuals is small for many of the other phenotypes. In conclusion, while our meta-analysis shows that imaging measures are strongly influenced by genes, and that novel phenotypes such as CT measures, FA measures, and brain activation measures look especially promising, replication across independent samples and demographic groups is necessary.
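
A minimal sketch of the described sample-size-weighted averaging of ACE estimates, with hypothetical study values:

```python
import numpy as np

def weighted_ace(estimates, n_pairs):
    """Sample-size-weighted average of ACE estimates across studies.

    estimates : array of shape (n_studies, 3) with per-study (A, C, E)
                proportions of variance
    n_pairs   : array of shape (n_studies,) with twin-pair counts
    """
    estimates = np.asarray(estimates, dtype=float)
    w = np.asarray(n_pairs, dtype=float)
    ace = (estimates * w[:, None]).sum(axis=0) / w.sum()
    return ace / ace.sum()   # renormalise so A + C + E = 1

# Hypothetical example: three studies of one phenotype.
ace = weighted_ace([(0.7, 0.1, 0.2), (0.6, 0.2, 0.2), (0.8, 0.0, 0.2)],
                   [100, 50, 200])
```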

Relevance: 10.00%

Abstract:

Over the past several years, evidence has accumulated showing that the cerebellum plays a significant role in cognitive function. Here we show, in a large genetically informative twin sample (n = 430; aged 16-30 years), that the cerebellum is strongly, and reliably (n = 30 rescans), activated during an n-back working memory task, particularly lobules I-IV, VIIa Crus I and II, IX and the vermis. Monozygotic twin correlations for cerebellar activation were generally much larger than dizygotic twin correlations, consistent with genetic influences. Structural equation models showed that up to 65% of the variance in cerebellar activation during working memory is genetic (averaging 34% across significant voxels), most prominently in lobules VI and VIIa Crus I, with the remaining variance explained by unique/unshared environmental factors. Heritability estimates for brain activation in the cerebellum agree with those found for working memory activation in the cerebral cortex, even though cerebellar cyto-architecture differs substantially. Phenotypic correlations between BOLD percent signal change in cerebrum and cerebellum were low, and bivariate modeling indicated that genetic influences on the cerebellum are at least partly specific to the cerebellum. Voxel-level activation correlated only weakly with cerebellar gray matter volume, suggesting specific genetic influences on the BOLD signal. Heritable signals identified here should facilitate discovery of genetic polymorphisms influencing cerebellar function through genome-wide association studies, and help elucidate the genetic liability to brain disorders affecting the cerebellum.
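
The study fits structural equation models; as a rough first approximation of how MZ/DZ twin correlations translate into variance components (not the paper's method), Falconer's formulas can be sketched per voxel:

```python
import numpy as np

def falconer_ace(r_mz, r_dz):
    """Rough ACE decomposition from MZ and DZ twin correlations
    (Falconer's formulas). Only a back-of-envelope approximation to
    the structural equation modelling used in the study."""
    a2 = 2.0 * (r_mz - r_dz)   # additive genetic variance (heritability)
    c2 = 2.0 * r_dz - r_mz     # shared environment
    e2 = 1.0 - r_mz            # unique environment + measurement error
    return np.clip([a2, c2, e2], 0.0, 1.0)
```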

Relevance: 10.00%

Abstract:

There is a major effort in medical imaging to develop algorithms to extract information from DTI and HARDI, which provide detailed information on brain integrity and connectivity. Although these images have recently advanced to provide extraordinarily high angular resolution and spatial detail, including an entire manifold of information at each point in the 3D images, there has been no readily available means to view the results. This impedes developments in HARDI research, which needs some method to check the plausibility and validity of image processing operations on HARDI data, or to appreciate data features or invariants that might serve as a basis for new directions in image segmentation, registration, and statistics. We present a set of tools to provide interactive display of HARDI data, including both a local rendering application and an off-screen renderer that works with a web-based viewer. Visualizations are presented after registration and averaging of HARDI data from 90 human subjects, revealing important details that could not be directly appreciated using conventional display of scalar images.
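
The abstract does not state how the 90-subject average is formed; one plausible sketch, assuming the HARDI data are represented as spherical-harmonic coefficient volumes, which can be averaged coefficient-wise once spatially registered (the representation is an assumption, not confirmed by the paper):

```python
import numpy as np

def average_hardi(subject_sh):
    """Average spatially registered HARDI volumes across subjects.

    subject_sh : array of shape (n_subjects, X, Y, Z, n_coeffs) holding
                 spherical-harmonic coefficients of the orientation
                 distribution function at each voxel.

    Because the spherical-harmonic basis is linear, averaging the
    coefficients voxel-wise averages the ODFs themselves; the mean
    coefficient volume can then be passed to a glyph renderer.
    """
    return np.asarray(subject_sh).mean(axis=0)
```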

Relevance: 10.00%

Abstract:

Population-based brain mapping provides great insight into the trajectory of aging and dementia, as well as brain changes that normally occur over the human life span. We describe three novel brain mapping techniques, cortical thickness mapping, tensor-based morphometry (TBM), and hippocampal surface modeling, which offer enormous power for measuring disease progression in drug trials, and shed light on the neuroscience of brain degeneration in Alzheimer's disease (AD) and mild cognitive impairment (MCI). We report the first time-lapse maps of cortical atrophy spreading dynamically in the living brain, based on averaging data from populations of subjects with Alzheimer's disease and normal subjects imaged longitudinally with MRI. These dynamic sequences show a rapidly advancing wave of cortical atrophy sweeping from limbic and temporal cortices into higher-order association and ultimately primary sensorimotor areas, in a pattern that correlates with cognitive decline. A complementary technique, TBM, reveals the 3D profile of atrophic rates at each point in the brain. A third technique, hippocampal surface modeling, plots the profile of shape alterations across the hippocampal surface. The three techniques provide moderately to highly automated analyses of images, have been validated on hundreds of scans, and are sensitive to clinically relevant changes in individual patients and groups undergoing different drug treatments. We compare time-lapse maps of AD, MCI, and other dementias, correlate these changes with cognition, and relate them to similar time-lapse maps of childhood development, schizophrenia, and HIV-associated brain degeneration. Strengths and weaknesses of these different imaging measures for basic neuroscience and drug trials are discussed.
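
As a sketch of the TBM idea (not the authors' implementation), local volume change can be read off the log-Jacobian determinant of an estimated deformation field; negative values indicate atrophy, positive values expansion:

```python
import numpy as np

def log_jacobian_map(displacement, spacing=(1.0, 1.0, 1.0)):
    """Tensor-based morphometry sketch: log |det J| of the mapping
    x -> x + u(x) measures local volume change between two scans.

    displacement : array of shape (X, Y, Z, 3), follow-up-to-baseline
                   displacement field u (assumed diffeomorphic, so
                   det J > 0 everywhere)
    """
    X, Y, Z, _ = displacement.shape
    jac = np.zeros((X, Y, Z, 3, 3))
    for i in range(3):            # component of u
        grads = np.gradient(displacement[..., i], *spacing, axis=(0, 1, 2))
        for j in range(3):        # derivative direction
            jac[..., i, j] = grads[j]
    jac += np.eye(3)              # Jacobian of identity + u
    return np.log(np.linalg.det(jac))
```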

Relevance: 10.00%

Abstract:

Background: Concordance is characterised as a negotiation-like health communication approach based on an equal and collaborative partnership between patients and health professionals. The Leeds Attitudes to Concordance II (LATCon II) scale was developed to measure attitudes towards concordance. The purpose of this study was to translate the LATCon II into Chinese and psychometrically test the Chinese version of the LATCon II (C-LATCon II). Methods: The study involved three phases: (i) translation and cross-cultural adaptation; (ii) a pilot study; and (iii) a cross-sectional survey (n = 366). Systematic random sampling was used to recruit hypertensive patients from nine communities covering around 78,000 residents in China. Tests of psychometric properties included content validity, construct validity, criterion-related validity (correlation between the C-LATCon II and the Therapeutic Adherence Scale for Hypertensive Patients (TASHP)), internal reliability, and test-retest reliability (n = 30). Results: The study found that the C-LATCon II had satisfactory content validity (item-level Content Validity Index (CVI) = 0.83-1, scale-level CVI/universal agreement = 0.89, and scale-level CVI/averaging calculation = 0.98), construct validity (the four components extracted explained 56.66% of the total variance), internal reliability (Cronbach's alpha of the overall scale and the four components was 0.78 and 0.66-0.84, respectively), and test-retest reliability (Pearson's correlation coefficient = 0.82, p < 0.001; intraclass correlation coefficient = 0.82, p < 0.001; linear weighted kappa statistic for each item = 0.40-0.65, p < 0.05). Criterion-related validity showed a weak association (Pearson's correlation coefficient = 0.11, p < 0.05) between patients' attitudes towards concordance during health communication and their health behaviours for hypertension management. Conclusions: The C-LATCon II is a validated and reliable instrument which can be used to evaluate attitudes to concordance in Chinese populations. Four components (health professionals' attitudes, partnership between the two parties, therapeutic decision making, and patients' involvement) describe attitudes towards concordance during health communication.
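
A minimal sketch of two of the reported statistics, Cronbach's alpha and the scale-level CVI by the averaging method (standard formulas, not the authors' code):

```python
import numpy as np

def cronbach_alpha(items):
    """items: array of shape (n_respondents, n_items) of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total score
    return k / (k - 1.0) * (1.0 - item_vars / total_var)

def scale_cvi_average(item_cvis):
    """Scale-level CVI by the averaging method: the mean of item-level
    CVIs (each the proportion of experts rating the item relevant)."""
    return float(np.mean(item_cvis))
```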

Relevance: 10.00%

Abstract:

Real-world cryptographic protocols such as the widely used Transport Layer Security (TLS) protocol support many different combinations of cryptographic algorithms (called ciphersuites) and simultaneously support different versions. Recent advances in provable security have shown that most modern TLS ciphersuites are secure authenticated and confidential channel establishment (ACCE) protocols, but these analyses generally focus on single ciphersuites in isolation. In this paper we extend the ACCE model to cover protocols with many different sub-protocols, capturing both multiple ciphersuites and multiple versions, and define a security notion for secure negotiation of the optimal sub-protocol. We give a generic theorem that shows how secure negotiation follows, with some additional conditions, from the authentication property of secure ACCE protocols. Using this framework, we analyse the security of ciphersuite negotiation and of three variants of version negotiation in TLS, including a recently proposed mechanism for detecting fallback attacks.
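
As an informal illustration of fallback detection, in the spirit of the TLS fallback signalling mechanism standardised in RFC 7507 (this is a sketch, not the paper's formal ACCE model; the handshake is heavily abbreviated):

```python
# A client that retries a handshake at a lower version after a failure
# includes a sentinel value in its ciphersuite list. A server that could
# have negotiated a higher version than the one offered then treats the
# connection as a downgrade attack and aborts.

TLS_FALLBACK_SCSV = 0x5600                      # sentinel from RFC 7507
SUPPORTED_VERSIONS = [0x0301, 0x0302, 0x0303]   # TLS 1.0, 1.1, 1.2

def accept_client_hello(client_version, client_ciphersuites):
    max_supported = max(SUPPORTED_VERSIONS)
    if TLS_FALLBACK_SCSV in client_ciphersuites and client_version < max_supported:
        # The client fell back even though we support something higher:
        # a previous handshake was likely interfered with.
        raise ConnectionError("inappropriate_fallback alert")
    return min(client_version, max_supported)   # negotiated version
```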

Relevance: 10.00%

Abstract:

To further investigate susceptibility loci identified by genome-wide association studies, we genotyped 5,500 SNPs across 14 associated regions in 8,000 samples from a control group and 3 diseases: type 2 diabetes (T2D), coronary artery disease (CAD) and Graves' disease. Using Bayes' theorem, we defined credible sets of SNPs that were 95% likely, based on posterior probability, to contain the causal disease-associated SNPs. In 3 of the 14 regions, TCF7L2 (T2D), CTLA4 (Graves' disease) and CDKN2A-CDKN2B (T2D), much of the posterior probability rested on a single SNP, and in 4 other regions (CDKN2A-CDKN2B (CAD) and CDKAL1, FTO and HHEX (T2D)) the 95% sets were small, thereby excluding most SNPs as potentially causal. Very few SNPs in our credible sets had annotated functions, illustrating the limitations in understanding the mechanisms underlying susceptibility to common diseases. Our results also show the value of more detailed mapping to target sequences for functional studies.
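
A minimal sketch of how such a 95% credible set is typically constructed from per-SNP posterior probabilities (the standard procedure; names are illustrative):

```python
import numpy as np

def credible_set(posterior_probs, mass=0.95):
    """Smallest set of SNPs whose summed posterior probability of being
    causal reaches the requested mass (here, a 95% credible set).

    posterior_probs : per-SNP posterior probabilities within one region,
                      e.g. normalised approximate Bayes factors.
    """
    p = np.asarray(posterior_probs, dtype=float)
    p = p / p.sum()                      # normalise within the region
    order = np.argsort(p)[::-1]          # most probable SNPs first
    cumulative = np.cumsum(p[order])
    n_keep = int(np.searchsorted(cumulative, mass)) + 1
    return order[:n_keep]                # indices of SNPs in the set
```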

Relevance: 10.00%

Abstract:

In the field of face recognition, sparse representation (SR) has received considerable attention during the past few years, with a focus on holistic descriptors in closed-set identification applications. The underlying assumption in such SR-based methods is that each class in the gallery has sufficient samples and the query lies on the subspace spanned by the gallery of the same class. Unfortunately, such an assumption is easily violated in the face verification scenario, where the task is to determine if two faces (where one or both have not been seen before) belong to the same person. In this study, the authors propose an alternative approach to SR-based face verification, where SR encoding is performed on local image patches rather than the entire face. The obtained sparse signals are pooled via averaging to form multiple region descriptors, which then form an overall face descriptor. Owing to the deliberate loss of spatial relations within each region (caused by averaging), the resulting descriptor is robust to misalignment and various image deformations. Within the proposed framework, they evaluate several SR encoding techniques: l1-minimisation, Sparse Autoencoder Neural Network (SANN) and an implicit probabilistic technique based on Gaussian mixture models. Thorough experiments on AR, FERET, exYaleB, BANCA and ChokePoint datasets show that the local SR approach obtains considerably better and more robust performance than several previous state-of-the-art holistic SR methods, on both the traditional closed-set identification task and the more applicable face verification task. The experiments also show that l1-minimisation-based encoding has a considerably higher computational cost when compared with SANN-based and probabilistic encoding, but leads to higher recognition rates.
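
A minimal sketch of the local encode-then-average-pool pipeline described above; the soft-assignment encoder here is a simple stand-in for the l1-minimisation, SANN and probabilistic encoders evaluated in the paper, and the patch size, grid and names are illustrative:

```python
import numpy as np

def face_descriptor(image, dictionary, patch=8, grid=(4, 4)):
    """Encode local patches against a dictionary, then average-pool the
    codes within each cell of a coarse grid to form region descriptors,
    which are concatenated into an overall face descriptor.

    dictionary : array of shape (n_atoms, patch * patch)
    """
    H, W = image.shape
    gh, gw = grid
    regions = np.zeros((gh, gw, dictionary.shape[0]))
    counts = np.zeros((gh, gw, 1))
    for y in range(0, H - patch + 1, patch):
        for x in range(0, W - patch + 1, patch):
            v = image[y:y + patch, x:x + patch].ravel()
            v = v - v.mean()                   # local normalisation
            sims = dictionary @ v
            code = np.exp(sims - sims.max())
            code /= code.sum()                 # soft-assignment "sparse" code
            ry = min(y * gh // H, gh - 1)
            rx = min(x * gw // W, gw - 1)
            regions[ry, rx] += code
            counts[ry, rx] += 1
    regions /= np.maximum(counts, 1)           # average pooling per region
    return regions.ravel()                     # overall face descriptor
```

The averaging step deliberately discards spatial relations within each region, which is what gives the descriptor its robustness to misalignment.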

Relevance: 10.00%

Abstract:

Let $G = (V, E)$ be a finite, simple and undirected graph. For $S \subseteq V$, let $\delta(S, G) = \{(u, v) \in E : u \in S,\ v \in V \setminus S\}$ be the edge boundary of $S$. Given an integer $i$, $1 \le i \le |V|$, the edge isoperimetric value of $G$ at $i$ is defined as $b_e(i, G) = \min_{S \subseteq V,\ |S| = i} |\delta(S, G)|$. The edge isoperimetric peak of $G$ is defined as $b_e(G) = \max_{1 \le j \le |V|} b_e(j, G)$. Let $b_v(G)$ denote the vertex isoperimetric peak, defined in a corresponding way. The problem of determining a lower bound for the vertex isoperimetric peak in complete $t$-ary trees was recently considered in [Y. Otachi, K. Yamazaki, A lower bound for the vertex boundary-width of complete $k$-ary trees, Discrete Mathematics, in press (doi:10.1016/j.disc.2007.05.014)]. In this paper we provide bounds which improve those in the above cited paper. Our results can be generalized to arbitrary (rooted) trees. The depth $d$ of a tree is the number of nodes on the longest path starting from the root and ending at a leaf. We show that for a complete binary tree of depth $d$ (denoted $T_d^2$), $c_1 d \le b_e(T_d^2) \le d$ and $c_2 d \le b_v(T_d^2) \le d$, where $c_1, c_2$ are constants. For a complete $t$-ary tree of depth $d$ (denoted $T_d^t$) and $d \ge c \log t$, where $c$ is a constant, we show that $c_1 \sqrt{t}\, d \le b_e(T_d^t) \le t d$ and $c_2 d / \sqrt{t} \le b_v(T_d^t) \le d$, where $c_1, c_2$ are constants. At the heart of our proof is the following theorem, which works for an arbitrary rooted tree and not just for a complete $t$-ary tree. Let $T = (V, E, r)$ be a finite, connected and rooted tree, the root being the vertex $r$. Define a weight function $w : V \to \mathbb{N}$, where the weight $w(u)$ of a vertex $u$ is the number of its successors (including itself), and let the weight index $\eta(T)$ be defined as the number of distinct weights in the tree, i.e. $\eta(T) = |\{w(u) : u \in V\}|$. For a positive integer $k$, let $\ell(k) = |\{i \in \mathbb{N} : 1 \le i \le |V|,\ b_e(i, T) \le k\}|$. We show that $\ell(k) \le 2 \binom{2\eta + k}{k}$.
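
For intuition, the definitions can be checked by brute force on tiny trees (exponential time, illustration only):

```python
from itertools import combinations

def edge_isoperimetric_peak(vertices, edges):
    """Brute-force b_e(G): for each subset size i, minimise the edge
    boundary over all i-subsets, then take the maximum over i."""
    peak = 0
    for i in range(1, len(vertices) + 1):
        best = min(sum(1 for (u, v) in edges if (u in S) != (v in S))
                   for S in map(set, combinations(vertices, i)))
        peak = max(peak, best)
    return peak

# Complete binary tree of depth 3 (7 nodes); per the bounds above,
# b_e should lie between c1*d and d.
V = list(range(1, 8))
E = [(1, 2), (1, 3), (2, 4), (2, 5), (3, 6), (3, 7)]
print(edge_isoperimetric_peak(V, E))   # prints 2, consistent with d = 3
```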

Relevance: 10.00%

Abstract:

A high speed digital signal averager with programmable features for the sampling period, for the number of channels and for the number of sweeps is described. The system implements a stable averaging algorithm (Deadroff and Trimble 1968) to provide a stable, calibrated display. The performance of the instrument has been evaluated for the reduction of random noise and for comb-filter action. Special uses of the instrument as a box-car integrator and as a transient recorder are also indicated.
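
Whether this matches the cited algorithm exactly is an assumption, but stable averaging is commonly implemented as a normalized recursive update, so that the display holds a calibrated average after every sweep rather than an unscaled running sum:

```python
import numpy as np

def signal_average(sweeps):
    """Normalized recursive averaging across repeated sweeps:
    A_k = A_{k-1} + (x_k - A_{k-1}) / k. Random noise power falls as
    1/k while components periodic at the sweep rate are preserved,
    which is the comb-filter action noted in the paper.

    sweeps : array of shape (n_sweeps, n_channels)
    """
    avg = np.zeros(sweeps.shape[1])
    for k, x in enumerate(sweeps, start=1):
        avg += (x - avg) / k   # stable: bounded values, always calibrated
    return avg
```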

Relevance: 10.00%

Abstract:

The problem of separability in recent models of classical relativistic interacting particles is examined. This physical requirement is shown to be more subtle than naive separability of all the constraints defining the system: it is adequate to be able to canonically transform the time-fixing constraints from an unseparated to a separated form when clusters emerge. Viewing separability in this way, and within a specific framework, we are led to a new no-interaction theorem which states the incompatibility of nontrivial interaction with relativistic invariance, separability, and invariant world lines for more than two particles.

Relevance: 10.00%

Abstract:

Recent work on the violent relaxation of collisionless stellar systems has been based on the notion of a wide class of entropy functions. A theorem concerning entropy increase has been proved. We draw attention to some underlying assumptions that have been ignored in the applications of this theorem to stellar dynamical problems. Once these are taken into account, the use of this theorem is at best heuristic. We present a simple counter-example.