1000 results for blocking algorithm


Relevance: 20.00%

Abstract:

The Richardson-Lucy algorithm is one of the most important algorithms in the image deconvolution area. However, one of its drawbacks is slow convergence. A very significant acceleration is obtained by the technique proposed by Biggs and Andrews (BA), which is implemented in the deconvlucy function of the MATLAB Image Processing Toolbox. The BA method was developed heuristically, with no proof of convergence. In this paper, we introduce the Heavy-Ball (H-B) method for Poisson data optimization and extend it to a scaled H-B method, which includes the BA method as a special case. The method comes with a proven convergence rate of O(1/k²), where k is the number of iterations. We demonstrate the superior convergence performance of the scaled H-B method on both synthetic and real 3D images.
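To make the acceleration concrete, the sketch below applies a heavy-ball (momentum) prediction step to the classic Richardson-Lucy multiplicative update. It is an illustration only: the function name, the constant momentum weight beta, and the flat initialization are assumptions, whereas the paper's scaled H-B method (and the BA special case) chooses the momentum adaptively.

```python
import numpy as np
from scipy.signal import fftconvolve

def rl_heavy_ball(image, psf, n_iter=50, beta=0.9):
    """Richardson-Lucy deconvolution with a heavy-ball prediction step.

    Sketch only: a constant momentum weight beta stands in for the
    adaptive choice made by the scaled H-B / Biggs-Andrews methods.
    """
    psf = psf / psf.sum()                 # normalized PSF
    psf_mirror = psf[::-1, ::-1]          # adjoint of the blur (correlation)
    x = np.full_like(image, image.mean(), dtype=float)
    x_prev = x.copy()
    for _ in range(n_iter):
        # Heavy-ball prediction: extrapolate along the previous step
        v = np.clip(x + beta * (x - x_prev), 1e-12, None)
        # Standard RL multiplicative update evaluated at the predicted point
        blurred = fftconvolve(v, psf, mode="same")
        ratio = image / np.clip(blurred, 1e-12, None)
        x_prev, x = x, v * fftconvolve(ratio, psf_mirror, mode="same")
    return x
```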

Relevance: 20.00%

Abstract:

A new self-learning algorithm for accelerated dynamics, reconnaissance metadynamics, is proposed that is able to work with a very large number of collective coordinates. Acceleration of the dynamics is achieved by constructing a bias potential in terms of a patchwork of one-dimensional, locally valid collective coordinates. These collective coordinates are obtained from trajectory analyses, so that they adapt to any new features encountered during the simulation. We show how this methodology can be used to enhance sampling in real chemical systems, citing examples from both cluster physics and the biological sciences.
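The sketch below illustrates plain one-dimensional metadynamics, the building block the abstract refers to: Gaussian hills deposited along a collective variable gradually flatten free-energy wells. All names and parameter values are illustrative assumptions; reconnaissance metadynamics goes further by learning the locally valid collective coordinates from trajectory analysis, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def metadynamics_1d(grad_potential, steps=20000, dt=1e-3, temp=1.0,
                    hill_height=0.1, hill_sigma=0.2, deposit_every=500):
    # Overdamped Langevin dynamics biased by a growing sum of Gaussian hills
    x, hills, traj = 0.0, [], []
    for step in range(steps):
        # Bias force: minus the derivative of the deposited Gaussians
        fbias = sum(hill_height * (x - c) / hill_sigma**2
                    * np.exp(-(x - c)**2 / (2 * hill_sigma**2)) for c in hills)
        force = -grad_potential(x) + fbias
        x += dt * force + np.sqrt(2 * temp * dt) * rng.standard_normal()
        if step % deposit_every == 0:
            hills.append(x)   # deposit a repulsive hill at the current CV value
        traj.append(x)
    return np.array(traj)

# Example: escaping the double well V(x) = (x^2 - 1)^2
trajectory = metadynamics_1d(lambda x: 4 * x * (x**2 - 1))
```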

Relevance: 20.00%

Abstract:

PURPOSE: To evaluate the sensitivity and specificity of the screening mode of the Humphrey-Welch Allyn frequency-doubling technology (FDT), Octopus tendency-oriented perimetry (TOP), and the Humphrey Swedish Interactive Threshold Algorithm (SITA)-fast (HSF) in patients with glaucoma. DESIGN: A comparative consecutive case series. METHODS: This was a prospective study conducted in the glaucoma unit of an academic department of ophthalmology. One eye of each of 70 consecutive glaucoma patients and 28 age-matched normal subjects was studied. Eyes were examined with the program C-20 of FDT, G1-TOP, and 24-2 HSF in one visit and in random order. The gold standard for glaucoma was the presence of a typical glaucomatous optic disk appearance on stereoscopic examination, as judged by a glaucoma expert. The sensitivity and specificity, positive and negative predictive values, and receiver operating characteristic (ROC) curves of two algorithms for the FDT screening test, two algorithms for TOP, and three algorithms for HSF, all defined before the start of this study, were evaluated. The time required for each test was also analyzed. RESULTS: Values for area under the ROC curve ranged from 82.5% to 93.9%. The largest area under the ROC curve (93.9%) was obtained with the FDT criterion defining abnormality as the presence of at least one abnormal location. Mean test time was 1.08 ± 0.28 minutes, 2.31 ± 0.28 minutes, and 4.14 ± 0.57 minutes for FDT, TOP, and HSF, respectively. The difference in testing time was statistically significant (P < .0001). CONCLUSIONS: The C-20 FDT, G1-TOP, and 24-2 HSF appear to be useful tools to diagnose glaucoma. The C-20 FDT and G1-TOP tests take approximately one quarter and one half, respectively, of the time required by 24-2 HSF. © 2002 by Elsevier Science Inc. All rights reserved.
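For readers unfamiliar with the evaluation metrics, the following sketch computes sensitivity, specificity, and ROC AUC from hypothetical per-eye abnormality scores against a gold-standard label. The function and variable names are assumptions, and the AUC uses the standard rank (Mann-Whitney) formulation, ignoring ties.

```python
import numpy as np

def screening_metrics(scores, labels, threshold):
    """Sensitivity/specificity at a threshold plus ROC AUC for a screening test."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)   # True = glaucoma on disk exam
    pred = scores >= threshold
    sens = (pred & labels).sum() / labels.sum()        # true-positive rate
    spec = (~pred & ~labels).sum() / (~labels).sum()   # true-negative rate
    # ROC AUC via the rank (Mann-Whitney) formulation, ignoring ties
    ranks = np.empty(len(scores))
    ranks[scores.argsort()] = np.arange(1, len(scores) + 1)
    n_pos, n_neg = labels.sum(), (~labels).sum()
    auc = (ranks[labels].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
    return sens, spec, auc

# Hypothetical scores for 4 glaucomatous and 4 normal eyes
sens, spec, auc = screening_metrics(
    [0.9, 0.8, 0.7, 0.4, 0.6, 0.3, 0.2, 0.1],
    [1, 1, 1, 1, 0, 0, 0, 0], threshold=0.5)
```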

Relevance: 20.00%

Abstract:

Integrating evidence from multiple domains is useful in prioritizing disease candidate genes for subsequent testing. We ranked all known human genes (n = 3819) under linkage peaks in the Irish Study of High-Density Schizophrenia Families using three different evidence domains: 1) a meta-analysis of microarray gene expression results using the Stanley Brain collection, 2) a schizophrenia protein-protein interaction network, and 3) a systematic literature search. Each gene was assigned a domain-specific p-value and ranked after evaluating the evidence within each domain. For comparison to this ranking process, a large-scale candidate gene hypothesis was also tested by including genes with Gene Ontology terms related to neurodevelopment. Subsequently, genotypes of 3725 SNPs in 167 genes from a custom Illumina iSelect array were used to evaluate the top-ranked vs. hypothesis-selected genes. Seventy-three genes were both highly ranked and involved in neurodevelopment (category 1), while 42 and 52 genes were exclusive to neurodevelopment (category 2) or highly ranked (category 3), respectively. The most significant associations were observed in the genes PRKG1, PRKCE, and CNTN4, but no individual SNP remained significant after correction for multiple testing. Comparison of the approaches showed an excess of significant tests using the hypothesis-driven neurodevelopment category. Random selection of similarly sized gene sets from two independent genome-wide association studies (GWAS) of schizophrenia showed that this excess was unlikely to have arisen by chance. In a further meta-analysis of three GWAS datasets, four candidate SNPs reached nominal significance. Although gene ranking using integrated sources of prior information did not enrich for significant results in the current experiment, gene selection using an a priori hypothesis (neurodevelopment) was superior to random selection. As such, further development of gene ranking strategies using more carefully selected sources of information is warranted.
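As a generic illustration of integrating evidence across domains, the sketch below combines per-domain p-values with Fisher's method and ranks genes by the combined p-value. This is not the study's exact scoring scheme; the gene names and p-values in the example are hypothetical.

```python
import numpy as np
from scipy.stats import chi2

def rank_genes(domain_pvalues):
    # Combine per-domain p-values with Fisher's method and rank by the result
    combined = {}
    for gene, ps in domain_pvalues.items():
        stat = -2.0 * np.sum(np.log(ps))                # Fisher's statistic
        combined[gene] = chi2.sf(stat, df=2 * len(ps))  # combined p-value
    return sorted(combined, key=combined.get)           # strongest evidence first

# Hypothetical per-gene p-values: (expression meta-analysis, PPI network, literature)
ranked = rank_genes({"PRKG1": (0.01, 0.20, 0.05),
                     "PRKCE": (0.04, 0.15, 0.02),
                     "CNTN4": (0.30, 0.02, 0.10)})
```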

Relevance: 20.00%

Abstract:

We present the DONUTS autoguiding algorithm, designed to fix stellar positions at the sub-pixel level for high-cadence time-series photometry, and also capable of autoguiding on defocused stars. DONUTS was designed to calculate guide corrections from a series of science images and to recentre telescope pointing between exposures. The algorithm has the unique ability to calculate guide corrections from point spread functions ranging from undersampled to heavily defocused. We present the case for why such an algorithm is important for high-precision photometry and give our results from off-sky and on-sky testing. We discuss the limitations of DONUTS and the facilities where it will soon be deployed.
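A minimal sketch of one way to measure a frame-to-frame shift, cross-correlating summed one-dimensional image profiles, is given below. It is an assumed simplification rather than the published DONUTS procedure, omitting refinements such as sub-pixel peak interpolation; the function name and integer-pixel output are illustrative.

```python
import numpy as np

def measure_shift(ref_image, new_image):
    """Integer-pixel (dx, dy) between frames from 1-D profile cross-correlation."""
    shifts = []
    for axis in (0, 1):
        ref = ref_image.sum(axis=axis).astype(float)   # collapse to a 1-D profile
        new = new_image.sum(axis=axis).astype(float)
        ref -= ref.mean()
        new -= new.mean()
        xcorr = np.correlate(new, ref, mode="full")
        shifts.append(int(xcorr.argmax()) - (len(ref) - 1))  # peak offset = shift
    dx, dy = shifts   # axis=0 collapses rows -> x profile; axis=1 -> y profile
    return dx, dy
```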

Relevance: 20.00%

Abstract:

This paper, which emerged from work supported by EPSRC grant GR/S84354/01, proposes a method of determining principal curves, using spline functions, in principal component analysis (PCA) for the representation of non-linear behaviour in process monitoring. Although principal curves are well established, they are difficult to implement in practice if a large number of variables are analysed. The significant contribution of this paper is that the proposed method has minimal complexity, assuming simple spline geometry, thus enabling efficient computation. The paper provides a foundation for further work where multiple curves may be required to represent underlying non-linear information in complex data.
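The sketch below gives one low-complexity reading of the idea: parameterize the data by their first principal component score and fit one smoothing spline per variable against that score. It is an assumed simplification, not the paper's procedure, and it requires distinct PC1 scores for the spline fit.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def principal_spline(X, smoothing=None):
    """Spline curve through data ordered by first principal component score."""
    mean = X.mean(axis=0)
    Xc = X - mean
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)  # PCA via SVD
    t = Xc @ vt[0]                    # PC1 scores parameterize the curve
    order = np.argsort(t)             # assumes distinct scores (spline needs increasing x)
    splines = [UnivariateSpline(t[order], Xc[order, j], s=smoothing)
               for j in range(X.shape[1])]
    t_grid = np.sort(t)
    return np.column_stack([s(t_grid) for s in splines]) + mean
```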

Relevance: 20.00%

Abstract:

This paper proposes a fast moving window algorithm for QR and Cholesky decompositions that simultaneously applies data updating and downdating. The developed procedure is based on inner products and entails a downdating step similar to that of Chambers' approach. For adding and deleting one row of data from the original matrix, a detailed analysis shows that the proposed algorithm outperforms existing ones in terms of computational efficiency if the number of columns exceeds 7. For a large number of columns, the proposed algorithm is also numerically superior to the traditional sequential technique.
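The building block behind such moving window schemes is the rank-one update and downdate of a triangular factor: add the incoming row, then remove the outgoing one. Below is a textbook sketch for a Cholesky factor; the function name and sign convention are assumptions, and the paper's inner-product formulation is not reproduced.

```python
import numpy as np

def chol_update(L, x, sign=1.0):
    """Rank-one update (sign=+1) or downdate (sign=-1) of a lower Cholesky
    factor L, so that the product L @ L.T gains or loses np.outer(x, x)."""
    L, x = L.copy(), np.asarray(x, dtype=float).copy()
    for k in range(len(x)):
        r2 = L[k, k]**2 + sign * x[k]**2
        if r2 <= 0.0:
            raise ValueError("downdate would leave an indefinite matrix")
        r = np.sqrt(r2)
        c, s = r / L[k, k], x[k] / L[k, k]
        L[k, k] = r
        L[k+1:, k] = (L[k+1:, k] + sign * s * x[k+1:]) / c
        x[k+1:] = c * x[k+1:] - s * L[k+1:, k]
    return L

# Moving window: the factor gains the incoming row, then drops the outgoing one
# L = chol_update(L, new_row);  L = chol_update(L, old_row, sign=-1.0)
```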

Relevance: 20.00%

Abstract:

OBJECTIVE - To evaluate an algorithm guiding responses of continuous subcutaneous insulin infusion (CSII)-treated type 1 diabetic patients using real-time continuous glucose monitoring (RT-CGM). RESEARCH DESIGN AND METHODS - Sixty CSII-treated type 1 diabetic participants (aged 13-70 years, including adult and adolescent subgroups, with A1C ≤9.5%) were randomized in age-, sex-, and A1C-matched pairs. Phase 1 was an open 16-week multicenter randomized controlled trial. Group A was treated with CSII/RT-CGM with the algorithm, and group B was treated with CSII/RT-CGM without the algorithm. The primary outcome was the difference in time in target (4-10 mmol/l) glucose range on 6-day masked CGM. Secondary outcomes were differences in A1C, time at low (≤3.9 mmol/l) CGM glucose, and glycemic variability. Phase 2 was the week 16-32 follow-up. Group A was returned to usual care, and group B was provided with the algorithm. Glycemia parameters were as above. Comparisons were made between baseline, 16 weeks, and 32 weeks. RESULTS - In phase 1, after withdrawals, 29 of 30 subjects remained in group A and 28 of 30 in group B. The change in target glucose time did not differ between groups. A1C fell in group A (mean 7.9% [95% CI 7.7-8.2] to 7.6% [7.2-8.0]; P < 0.03) but not in group B (7.8% [7.5-8.1] to 7.7% [7.3-8.0]; NS), with no difference between groups. More subjects in group A achieved A1C ≤7% than in group B (2 of 29 to 14 of 29 vs. 4 of 28 to 7 of 28; P = 0.015). In phase 2, one participant was lost from each group. In group A, A1C returned to baseline with RT-CGM discontinuation but did not change in group B, who continued RT-CGM with addition of the algorithm. CONCLUSIONS - Early but not late algorithm provision to type 1 diabetic patients using CSII/RT-CGM did not increase time in the target glucose range but increased achievement of A1C ≤7%. Upon RT-CGM cessation, A1C returned to baseline. © 2010 by the American Diabetes Association.

Relevance: 20.00%

Abstract:

We consider the problem of self-healing in reconfigurable networks, e.g., peer-to-peer and wireless mesh networks. For such networks under repeated attack by an omniscient adversary, we propose a fully distributed algorithm, Xheal, that maintains good expansion and spectral properties of the network while keeping the network connected. Moreover, Xheal does this while allowing only low stretch and degree increase per node. The algorithm heals global properties like expansion and stretch while making only local changes and using only local information. We also provide bounds on the second-smallest eigenvalue of the Laplacian, which captures key properties such as mixing time, conductance, and congestion in routing. Xheal has low amortized latency and bandwidth requirements. Our work improves over the self-healing algorithms Forgiving tree [PODC 2008] and Forgiving graph [PODC 2009] in that we are able to give guarantees on degree and stretch while at the same time preserving the expansion and spectral properties of the network.
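The spectral quantity in question, the second-smallest eigenvalue of the graph Laplacian (the Fiedler value), can be computed directly for small graphs; a plain dense-matrix sketch, with an assumed adjacency-matrix input, is shown below.

```python
import numpy as np

def algebraic_connectivity(adj):
    """Second-smallest Laplacian eigenvalue (Fiedler value) of an undirected
    graph given as a dense adjacency matrix. It is zero iff the graph is
    disconnected and controls expansion, mixing time, and conductance."""
    adj = np.asarray(adj, dtype=float)
    laplacian = np.diag(adj.sum(axis=1)) - adj
    eigvals = np.linalg.eigvalsh(laplacian)   # sorted ascending
    return eigvals[1]

# Example: a 4-cycle has Fiedler value 2 - 2*cos(2*pi/4) = 2
ring4 = np.array([[0, 1, 0, 1],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [1, 0, 1, 0]])
print(algebraic_connectivity(ring4))  # ~2.0
```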

Relevance: 20.00%

Abstract:

This article offers a typology of so-called blocking legislation and analyses its development, functions and legality under international law. It also presents and discusses the new Russian blocking Order, issued in September 2012, focusing on its possible effects on the European Commission's investigation of Gazprom's business practices (in light of EU competition law) as well as, more broadly, on foreign operations of Russian strategic enterprises.

Relevance: 20.00%

Abstract:

We study the behaviour of the glued trees algorithm described by Childs et al. in [1] under decoherence. We consider a discrete time reformulation of the continuous time quantum walk protocol and apply a phase damping channel to the coin state, investigating the effect of such a mechanism on the probability of the walker appearing on the target vertex of the graph. We pay particular attention to any potential advantage coming from the use of weak decoherence for the spreading of the walk across the glued trees graph. © 2013 Elsevier B.V.
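As a concrete picture of the decoherence mechanism, the sketch below applies a phase damping channel of strength p to a coin-qubit density matrix via its standard Kraus operators: off-diagonal coherences decay while populations are preserved. The function name and example state are assumptions; the full glued-trees walk is not reproduced.

```python
import numpy as np

def phase_damp(rho, p):
    """Apply a phase damping channel of strength p to a qubit density matrix."""
    K0 = np.array([[1, 0], [0, np.sqrt(1 - p)]], dtype=complex)
    K1 = np.array([[0, 0], [0, np.sqrt(p)]], dtype=complex)
    # Kraus decomposition: rho -> K0 rho K0^dagger + K1 rho K1^dagger
    return K0 @ rho @ K0.conj().T + K1 @ rho @ K1.conj().T

# Example: an equal-superposition coin loses coherence but keeps populations
rho = 0.5 * np.array([[1, 1], [1, 1]], dtype=complex)
print(phase_damp(rho, 0.3))   # off-diagonals shrink by sqrt(0.7)
```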

Relevance: 20.00%

Abstract:

According to a higher order reasoning account, inferential reasoning processes underpin the widely observed cue competition effect of blocking in causal learning. The inference required for blocking has been described as modus tollens (if p then q, not q therefore not p). Young children are known to have difficulties with this type of inference, but research with adults suggests that this inference is easier if participants think counterfactually. In this study, 100 children (51 five-year-olds and 49 six- to seven-year-olds) were assigned to two types of pretraining groups. The counterfactual group observed demonstrations of cues paired with outcomes and answered questions about what the outcome would have been if the causal status of cues had been different, whereas the factual group answered factual questions about the same demonstrations. Children then completed a causal learning task. Counterfactual pretraining enhanced levels of blocking as well as modus tollens reasoning but only for the younger children. These findings provide new evidence for an important role for inferential reasoning in causal learning.

Relevance: 20.00%

Abstract:

A sample of 99 children completed a causal learning task that was an analogue of the food allergy paradigm used with adults. The cue competition effects of blocking and unovershadowing were assessed under forward and backward presentation conditions. Children also answered questions probing their ability to make the inference posited to be necessary for blocking by a reasoning account of cue competition. For the first time, children's working memory and general verbal ability were also measured alongside their causal learning. The magnitude of blocking and unovershadowing effects increased with age. However, analyses showed that the best predictor of both blocking and unovershadowing effects was children's performance on the reasoning questions. The magnitude of the blocking effect was also predicted by children's working memory abilities. These findings provide new evidence that cue competition effects such as blocking are underpinned by effortful reasoning processes.