22 results for low degree graph


Relevance:

30.00%

Publisher:

Abstract:

We consider the problem of self-healing in reconfigurable networks, e.g., peer-to-peer and wireless mesh networks. For such networks under repeated attack by an omniscient adversary, we propose a fully distributed algorithm, Xheal, that maintains good expansion and spectral properties of the network while keeping the network connected. Moreover, Xheal does this while allowing only low stretch and degree increase per node. The algorithm heals global properties like expansion and stretch while making only local changes and using only local information. We also provide bounds on the second smallest eigenvalue of the Laplacian, which captures key properties such as mixing time, conductance, and congestion in routing. Xheal has low amortized latency and bandwidth requirements. Our work improves over the self-healing algorithms Forgiving tree [PODC 2008] and Forgiving graph [PODC 2009] in that we are able to give guarantees on degree and stretch while at the same time preserving the expansion and spectral properties of the network.
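As a rough illustration of the self-healing model described above (not of Xheal itself, which additionally maintains expansion and spectral properties), the sketch below shows a purely local repair rule: when the adversary deletes a node, its surviving neighbours close a cycle among themselves, keeping the network connected with at most +2 degree increase per heal. The function name and the toy star graph are invented for illustration.

```python
# Minimal sketch of a local self-healing step (illustrative only, not Xheal):
# when a node is deleted, its surviving neighbours add a few edges among
# themselves so the network stays connected with small degree increase.
import networkx as nx

def heal_by_cycle(g: nx.Graph, deleted: int) -> None:
    """Delete `deleted` and reconnect its former neighbours in a cycle."""
    neighbours = sorted(g.neighbors(deleted))
    g.remove_node(deleted)
    if len(neighbours) >= 2:
        # Closing a cycle adds at most 2 edges per surviving neighbour.
        for u, v in zip(neighbours, neighbours[1:] + neighbours[:1]):
            g.add_edge(u, v)

if __name__ == "__main__":
    g = nx.star_graph(5)        # node 0 is the hub, nodes 1..5 are leaves
    heal_by_cycle(g, 0)         # adversary removes the hub
    print(nx.is_connected(g))   # True: the leaves now form a cycle
```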

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we have developed a low-complexity algorithm for epileptic seizure detection with a high degree of accuracy. The algorithm has been designed to be feasibly implementable as a battery-powered, low-power implantable epileptic seizure detection system or epilepsy prosthesis. This is achieved by utilizing design optimization techniques at different levels of abstraction. In particular, user-specific critical parameters are identified at the algorithmic level and are explicitly used along with multiplier-less implementations at the architecture level. The system has been tested on neural data obtained from in-vivo animal recordings and has been implemented in 90 nm bulk-Si technology. The results show up to 90% savings in power compared to a prevalent wavelet-based seizure detection technique, while achieving a 97% average detection rate.
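One of the architecture-level optimizations the abstract mentions is a multiplier-less implementation. The toy sketch below shows the generic idea behind such designs, not the paper's actual filter architecture: a constant multiplication is decomposed into shifts and adds, which map to far cheaper hardware than a general-purpose multiplier.

```python
# Generic multiplier-less constant multiplication via shift-and-add
# (illustrative only; not the paper's seizure-detection architecture).
def mul_by_10(x: int) -> int:
    # 10 = 8 + 2, so x * 10 == (x << 3) + (x << 1)
    return (x << 3) + (x << 1)

def mul_by_const(x: int, c: int) -> int:
    """Multiply x by a non-negative constant c using only shifts and adds."""
    result, shift = 0, 0
    while c:
        if c & 1:
            result += x << shift   # contribution of this set bit of c
        c >>= 1
        shift += 1
    return result

assert mul_by_10(7) == 70
assert mul_by_const(7, 97) == 7 * 97
```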

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVES: Barrett’s esophagus (BE) is a common premalignant lesion for which surveillance is recommended. This strategy is limited by considerable variation in clinical practice. We conducted an international, multidisciplinary, systematic search and evidence-based review of BE and provided consensus recommendations for clinical use in patients with nondysplastic BE, indefinite dysplasia, and low-grade dysplasia (LGD). METHODS: We defined the scope, proposed statements, and searched electronic databases, yielding 20,558 publications that were screened, selected online, and formed the evidence base. We used a Delphi consensus process, with an 80% agreement threshold, using GRADE (Grading of Recommendations Assessment, Development and Evaluation) to categorize the quality of evidence and strength of recommendations. RESULTS: In total, 80% of respondents agreed with 55 of 127 statements in the final voting rounds. Population endoscopic screening is not recommended; screening should target only very-high-risk cases, namely males aged over 60 years with chronic uncontrolled reflux. A new international definition of BE was agreed upon. For any degree of dysplasia, at least two specialist gastrointestinal (GI) pathologists are required. Risk factors for cancer include male gender, length of BE, and central obesity. Endoscopic resection should be used for visible, nodular areas. Surveillance is not recommended for patients with less than 5 years of life expectancy. Management strategies for indefinite dysplasia (IND) and LGD were identified, including a de-escalation strategy for lower-risk patients and escalation to intervention with follow-up for higher-risk patients. CONCLUSIONS: In this uniquely large consensus process in gastroenterology, we made key clinical recommendations for the escalation/de-escalation of BE management in clinical practice. We made strong recommendations for the prioritization of future research.

Relevance:

30.00%

Publisher:

Abstract:

In this study, we introduce an original distance definition for graphs, called the Markov-inverse-F measure (MiF). This measure enables the integration of classical graph theory indices with new knowledge pertaining to structural feature extraction from semantic networks. MiF improves on the conventional Jaccard and Simpson indices, and reconciles both the geodesic information (random walk) and the co-occurrence adjustment (degree balance and distribution). We measure the effectiveness of graph-based coefficients by applying linguistic graph information to neural activity recorded during conceptual processing in the human brain. Specifically, the MiF distance is computed between each of the nouns used in a previous neural experiment and each of the in-between words in a subgraph derived from the Edinburgh Word Association Thesaurus of English. From the MiF-based information matrix, a machine learning model can accurately obtain a scalar parameter that specifies the degree to which each voxel in (the MRI image of) the brain is activated by each word or each principal component of the intermediate semantic features. Furthermore, by correlating the voxel information with the MiF-based principal components, a new computational neurolinguistics model with a network connectivity paradigm is created. This allows two dimensions of context space to be incorporated with both semantic and neural distributional representations.
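For context, the sketch below computes the conventional Jaccard and Simpson (overlap) indices on a toy word-association graph; these are the classical coefficients the abstract says MiF improves on. The random-walk and degree-balancing components of MiF are not reproduced here, and the example graph is invented purely for illustration.

```python
# Classical neighbourhood-overlap coefficients on a toy word-association graph
# (the baselines MiF builds on; MiF itself is not reproduced here).
import networkx as nx

def jaccard(g: nx.Graph, u, v) -> float:
    # Jaccard index: shared neighbours over all neighbours of either node.
    a, b = set(g.neighbors(u)), set(g.neighbors(v))
    return len(a & b) / len(a | b) if a | b else 0.0

def simpson(g: nx.Graph, u, v) -> float:
    # Simpson (overlap) coefficient: shared neighbours over the smaller neighbourhood.
    a, b = set(g.neighbors(u)), set(g.neighbors(v))
    return len(a & b) / min(len(a), len(b)) if a and b else 0.0

g = nx.Graph([("dog", "cat"), ("dog", "bone"), ("cat", "mouse"),
              ("bone", "mouse"), ("dog", "mouse")])
print(jaccard(g, "dog", "cat"), simpson(g, "dog", "cat"))   # 0.25 and 0.5
```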

Relevance:

30.00%

Publisher:

Abstract:

Realising memory-intensive applications such as image and video processing on FPGA requires the creation of complex, multi-level memory hierarchies to achieve real-time performance; however, commercial High Level Synthesis tools are unable to automatically derive such structures and hence cannot meet the demanding bandwidth and capacity constraints of these applications. Current approaches to solving this problem can derive only single-level memory structures or very deep, highly inefficient hierarchies, leading in either case to high implementation cost, low performance, or both. This paper presents an enhancement to an existing MC-HLS synthesis approach which solves this problem; it exploits and eliminates data duplication at multiple levels of the generated hierarchy, leading to a reduction in the number of levels and ultimately to higher-performance, lower-cost implementations. When applied to the synthesis of C-based Motion Estimation, Matrix Multiplication and Sobel Edge Detection applications, this enables reductions in Block RAM and Look Up Table (LUT) cost of up to 25%, whilst simultaneously increasing throughput.
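To give a feel for the data duplication such memory hierarchies are meant to eliminate, the back-of-the-envelope sketch below counts external memory accesses for a 3x3 window kernel (as in Sobel) with and without on-chip reuse buffering. It only illustrates the duplication problem; it is not a model of the MC-HLS approach described above, and the image dimensions are arbitrary.

```python
# Rough access-count comparison for a 3x3 sliding-window kernel: without
# on-chip reuse, each output pixel re-reads its whole window, so most words
# are fetched many times over; with line buffers, each pixel is fetched once.
H, W, K = 480, 640, 3   # image height/width and window size (illustrative values)

naive_reads = (H - K + 1) * (W - K + 1) * K * K   # every window read in full
buffered_reads = H * W                            # each pixel fetched once into line buffers

print(f"naive: {naive_reads}, with reuse buffers: {buffered_reads}, "
      f"ratio: {naive_reads / buffered_reads:.1f}x")
```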

Relevance:

30.00%

Publisher:

Abstract:

Multi-carrier index keying (MCIK) is a recently developed transmission technique that exploits the sub-carrier indices as an additional degree of freedom for data transmission. This paper investigates the performance of a low-complexity detection scheme with diversity reception for MCIK with orthogonal frequency division multiplexing (OFDM). For the performance evaluation, exact and approximate closed-form expressions for the pairwise error probability (PEP) of a greedy detector (GD) with maximal ratio combining (MRC) are derived. The presented results show that the performance of the GD is significantly improved when MRC diversity is employed. The proposed hybrid scheme is found to outperform maximum likelihood (ML) detection with a substantial reduction in the associated computational complexity.
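The sketch below shows the flavour of an energy-based greedy detector with MRC for MCIK-OFDM: per sub-carrier, the receive branches are combined by maximal ratio combining and the K most energetic sub-carriers are declared active. It is a simplified illustration under assumed parameters (N, K, L, SNR are invented), not the exact scheme analysed in the paper.

```python
# Hedged sketch of a greedy (energy-ordering) detector with L-branch MRC
# for MCIK-OFDM index detection; parameters are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
N, K, L, snr = 8, 2, 4, 10.0        # sub-carriers, active tones, antennas, linear SNR

active = rng.choice(N, size=K, replace=False)     # indices carrying energy
x = np.zeros(N, dtype=complex)
x[active] = 1.0                                   # unit-energy symbols on active tones

h = (rng.standard_normal((L, N)) + 1j * rng.standard_normal((L, N))) / np.sqrt(2)
n = (rng.standard_normal((L, N)) + 1j * rng.standard_normal((L, N))) / np.sqrt(2 * snr)
y = h * x + n                                     # per-antenna received signal

# MRC combining per sub-carrier, then a greedy pick of the K largest metrics.
metric = np.abs(np.sum(np.conj(h) * y, axis=0))
detected = np.sort(np.argsort(metric)[-K:])

print("sent:", np.sort(active), "detected:", detected)
```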

Relevance:

30.00%

Publisher:

Abstract:

Existing compact routing schemes, e.g., Thorup and Zwick [SPAA 2001] and Chechik [PODC 2013], often have no means to tolerate failures once the system has been set up and started. This paper presents, to our knowledge, the first self-healing compact routing scheme. Moreover, our schemes are developed for low-memory nodes, i.e., nodes need only O(log² n) memory, and are thus compact schemes.
We introduce two algorithms of independent interest: the first is CompactFT, a novel compact version (using only O(log n) local memory) of the self-healing algorithm Forgiving Tree of Hayes et al. [PODC 2008]. The second algorithm (CompactFTZ) combines CompactFT with Thorup-Zwick’s tree-based compact routing scheme [SPAA 2001] to produce a fully compact self-healing routing scheme. In the self-healing model, the adversary deletes nodes one at a time, with the affected nodes self-healing locally by adding a few edges. CompactFT recovers from each attack in only O(1) time and ∆ messages, with only a +3 degree increase and an O(log ∆) increase in graph diameter over any sequence of deletions (∆ is the initial maximum degree).
Additionally, CompactFTZ guarantees delivery of a packet sent from sender s to receiver t as long as t has not been deleted, with only an additional O(y log ∆) latency, where y is the number of nodes that have been deleted on the path between s and t. If t has been deleted, s is informed and the packet is removed from the network.
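To make the repair step concrete, the sketch below shows the flavour of a Forgiving-Tree-style heal: the deleted node's neighbours are rewired into a balanced binary tree, so each of them gains at most 3 edges and paths through the repaired region grow by only O(log ∆). This is a simplified illustration, not the actual CompactFT construction, which additionally works within O(log n) local memory per node.

```python
# Simplified Forgiving-Tree-style repair (illustrative only, not CompactFT):
# the deleted node's neighbours are connected in a balanced binary tree,
# bounding the per-node degree increase by 3 and the detour length by O(log Δ).
import networkx as nx

def heal_with_balanced_tree(g: nx.Graph, deleted: int) -> None:
    neighbours = sorted(g.neighbors(deleted))
    g.remove_node(deleted)
    # Heap layout: neighbour i is linked to neighbours 2i+1 and 2i+2,
    # forming a balanced binary tree over the surviving neighbours.
    for i, u in enumerate(neighbours):
        for child in (2 * i + 1, 2 * i + 2):
            if child < len(neighbours):
                g.add_edge(u, neighbours[child])

if __name__ == "__main__":
    g = nx.star_graph(8)                       # hub 0 with leaves 1..8
    heal_with_balanced_tree(g, 0)              # adversary removes the hub
    print(nx.is_connected(g), nx.diameter(g))  # connected; small diameter
```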