991 results for unconditional guarantees


Relevance:

10.00%

Publisher:

Abstract:

We consider the problem of self-healing in reconfigurable networks, e.g., peer-to-peer and wireless mesh networks. For such networks under repeated attack by an omniscient adversary, we propose a fully distributed algorithm, Xheal, that maintains good expansion and spectral properties of the network while keeping the network connected. Moreover, Xheal does this while allowing only low stretch and degree increase per node. The algorithm heals global properties such as expansion and stretch while making only local changes and using only local information. We also provide bounds on the second smallest eigenvalue of the Laplacian, which captures key properties such as mixing time, conductance, and congestion in routing. Xheal has low amortized latency and bandwidth requirements. Our work improves over the self-healing algorithms Forgiving Tree [PODC 2008] and Forgiving Graph [PODC 2009] in that we are able to give guarantees on degree and stretch while at the same time preserving the expansion and spectral properties of the network.
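
The spectral quantity referred to here, the second smallest eigenvalue of the Laplacian (the algebraic connectivity), can be computed directly for small graphs. The sketch below is only an illustration of that quantity on a hypothetical 5-node graph, not part of Xheal itself; the function name and example adjacency matrix are ours.

```python
import numpy as np

def algebraic_connectivity(adj):
    """Second-smallest eigenvalue of the Laplacian L = D - A
    of an undirected graph given by adjacency matrix `adj`."""
    adj = np.asarray(adj, dtype=float)
    degree = np.diag(adj.sum(axis=1))
    laplacian = degree - adj
    eigenvalues = np.linalg.eigvalsh(laplacian)  # returned in ascending order
    return eigenvalues[1]

# Hypothetical 5-node example: a cycle with one chord.
A = np.array([
    [0, 1, 0, 0, 1],
    [1, 0, 1, 0, 1],
    [0, 1, 0, 1, 0],
    [0, 0, 1, 0, 1],
    [1, 1, 0, 1, 0],
])
print(algebraic_connectivity(A))  # > 0 exactly when the graph is connected
```

A strictly positive value certifies connectivity, and larger values correspond to better mixing and conductance, which is why the abstract bounds this eigenvalue.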

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we present a unified approach to energy-efficient, variation-tolerant design of the Discrete Wavelet Transform (DWT) in the context of image processing applications. Notably, most image processing applications do not require exactly correct numerical outputs. We exploit this feature and propose a design methodology for the DWT that exposes energy-quality tradeoffs at each level of the design hierarchy, from the algorithm level down to the architecture and circuit levels, by taking advantage of the limited perceptual ability of the Human Visual System. A unique feature of this design methodology is that it guarantees robustness under process variability and facilitates aggressive voltage over-scaling. Simulation results show significant energy savings (74%-83%) with minor degradation in output image quality, and the design averts catastrophic failures under process variations compared to a conventional design. © 2010 IEEE.
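
As a rough, hypothetical illustration of the energy-quality tradeoff being exploited (not the authors' circuit-level methodology), the sketch below applies a one-level 2D Haar DWT and zeroes small detail coefficients, a crude stand-in for approximate or voltage-over-scaled hardware; the image, threshold, and function names are invented for the example.

```python
import numpy as np

def haar_dwt2_level1(img):
    """One-level 2D Haar transform: returns (LL, LH, HL, HH) subbands."""
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    return ((a + b + c + d) / 4.0, (a + b - c - d) / 4.0,
            (a - b + c - d) / 4.0, (a - b - c + d) / 4.0)

def haar_idwt2_level1(LL, LH, HL, HH):
    """Exact inverse of haar_dwt2_level1."""
    h, w = LL.shape
    img = np.empty((2 * h, 2 * w))
    img[0::2, 0::2] = LL + LH + HL + HH
    img[0::2, 1::2] = LL + LH - HL - HH
    img[1::2, 0::2] = LL - LH + HL - HH
    img[1::2, 1::2] = LL - LH - HL + HH
    return img

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(8, 8)).astype(float)   # hypothetical image
LL, LH, HL, HH = haar_dwt2_level1(image)

# "Approximate" mode: zero small high-frequency details, a crude stand-in for
# reduced-precision / voltage-over-scaled hardware. The threshold is the
# quality knob of this toy example.
threshold = 10.0
LH, HL, HH = [np.where(np.abs(s) < threshold, 0.0, s) for s in (LH, HL, HH)]
print("max abs error:", np.abs(image - haar_idwt2_level1(LL, LH, HL, HH)).max())
```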

Relevance:

10.00%

Publisher:

Abstract:

We present a fully-distributed self-healing algorithm, DEX, that maintains a constant degree expander network in a dynamic setting. To the best of our knowledge, our algorithm provides the first efficient distributed construction of expanders (whose expansion properties hold deterministically) that works even under an all-powerful adaptive adversary that controls the dynamic changes to the network (the adversary has unlimited computational power and knowledge of the entire network state, can decide which nodes join and leave and at what time, and knows the past random choices made by the algorithm). Previous distributed expander constructions typically provide only probabilistic guarantees on the network expansion, which rapidly degrade in a dynamic setting; in particular, the expansion properties can degrade even more rapidly under adversarial insertions and deletions. Our algorithm provides efficient maintenance and incurs a low overhead per insertion/deletion by an adaptive adversary: only O(log n) rounds and O(log n) messages are needed with high probability (n is the number of nodes currently in the network). The algorithm requires only a constant number of topology changes. Moreover, our algorithm allows for an efficient implementation and maintenance of a distributed hash table (DHT) on top of DEX with only a constant additional overhead. Our results are a step towards implementing efficient self-healing networks that have guaranteed properties (constant bounded degree and expansion) despite dynamic changes.
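
For intuition only, the sketch below shows the simplest kind of local, constant-overhead repair a self-healing overlay can perform when a node is deleted: its former neighbours patch themselves into a small cycle, so each affected node gains at most two edges. This is not the DEX construction; the function name and example graph are hypothetical.

```python
# Minimal sketch (not the DEX protocol): when a node is deleted, its former
# neighbours patch themselves into a ring, so each gains at most 2 edges and
# the graph stays connected if it was connected before.

def delete_and_patch(adj, v):
    """adj: dict mapping node -> set of neighbours (undirected graph)."""
    neighbours = sorted(adj.pop(v))
    for u in neighbours:
        adj[u].discard(v)
    # Connect the orphaned neighbours in a ring.
    k = len(neighbours)
    for i in range(k):
        a, b = neighbours[i], neighbours[(i + 1) % k]
        if a != b:
            adj[a].add(b)
            adj[b].add(a)

# Hypothetical 6-node cycle.
adj = {0: {1, 5}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3, 5}, 5: {4, 0}}
delete_and_patch(adj, 0)
print(adj)  # degrees stay small; 1 and 5 are now directly connected
```

A ring patch like this keeps degrees bounded but can gradually destroy expansion over many deletions; maintaining deterministic expansion against an adaptive adversary is exactly the harder guarantee DEX provides.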

Relevance:

10.00%

Publisher:

Abstract:

This paper addresses the problem of learning Bayesian network structures from data based on score functions that are decomposable. It describes properties that strongly reduce the time and memory costs of many known methods without losing global optimality guarantees. These properties are derived for different score criteria such as Minimum Description Length (or Bayesian Information Criterion), Akaike Information Criterion, and Bayesian Dirichlet Criterion. A branch-and-bound algorithm is then presented that integrates structural constraints with data in a way that guarantees global optimality. As an example, structural constraints are used to map the problem of structure learning in dynamic Bayesian networks into a corresponding augmented Bayesian network. Finally, we show empirically the benefits of using the properties with state-of-the-art methods and with the new algorithm, which is able to handle larger data sets than before.
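
Decomposability means the total score is a sum of per-node local scores that depend only on a node and its parent set, which is what enables caching and branch-and-bound pruning. A minimal sketch of one such local score (BIC) on complete discrete data is given below; the data, variable names, and helper function are hypothetical.

```python
import math
from collections import Counter

def bic_local_score(data, child, parents, card):
    """BIC local score for `child` given parent set `parents`.
    data: list of dicts mapping variable name -> discrete value.
    card: dict mapping variable name -> number of states."""
    n = len(data)
    joint = Counter((tuple(row[p] for p in parents), row[child]) for row in data)
    parent_counts = Counter(tuple(row[p] for p in parents) for row in data)
    # Maximum-likelihood term: sum over configurations of N(pa, x) * log(N(pa, x) / N(pa)).
    loglik = sum(c * math.log(c / parent_counts[pa]) for (pa, _), c in joint.items())
    num_params = (card[child] - 1) * math.prod(card[p] for p in parents)
    return loglik - 0.5 * math.log(n) * num_params

# Hypothetical binary data where B is a noisy copy of A.
data = [{"A": a, "B": a ^ noise} for a in (0, 1) * 50 for noise in (0, 0, 0, 1)]
card = {"A": 2, "B": 2}
print(bic_local_score(data, "B", ["A"], card))  # parent set {A}
print(bic_local_score(data, "B", [], card))     # empty parent set
```

Because the network score is a sum of such terms, a branch-and-bound search can bound the best achievable score of a partial structure by combining cached local scores.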

Relevance:

10.00%

Publisher:

Abstract:

This paper (co-written with Dr Maria Lohan, Dr Carmel Kelly & Professor Laura Lundy) will describe the ethical review process for undertaking health research in the UK and explain an approach that can help researchers deal with ethical and methodological dilemmas in their research. Ethical review is necessary to ensure researchers and participants are protected, yet the requirement to ‘pass’ numerous committees may be challenging, particularly for health researchers who work with vulnerable groups and sensitive topics. The inclusion of these groups and topics is crucial if health researchers are to understand health disparities and implement appropriate interventions with health benefits for vulnerable populations. It is proposed that, to overcome ethical and methodological challenges and pitfalls, researchers must implement strategies that advocate for, and increase the participation of, vulnerable populations in health research. A ‘children’s rights-based approach’ using participatory methodology will be described that draws on the jurisprudence of international law (the United Nations Convention on the Rights of the Child, 1989) and provides a framework that may empower ethics committees to carry out their function confidently. The role of the researcher, framed within the context of doctoral-level study, will be reviewed in terms of the investment required and the benefits of utilising this approach. It will be argued that adopting this approach with vulnerable groups not only guarantees their meaningful participation in the research process and permits their voices to be heard, but also offers ethics committees an internationally agreed legal framework, ratified by their governing States, from which to fulfil their obligations and resolve their ethical dilemmas. Increasing the representation and participation of vulnerable groups in health research can inform the development of health policy and practice based on ‘insider knowledge’ that better engages with and more adequately reflects their specific needs. This is likely to yield numerous health, social and economic benefits for all of society through the delivery of more equitable, effective and sustainable services.

Relevance:

10.00%

Publisher:

Abstract:

Purpose: High intakes of digestible carbohydrate can induce hyperglycemia and hyperinsulinemia, which collectively have been implicated in colorectal tumor development. Our aim was to explore the association between aspects of dietary carbohydrate intake and the risk of colorectal adenomas and hyperplastic polyps in a large case–control study.

Methods: Colorectal polyp cases (n = 1,315 adenomas only, n = 566 hyperplastic polyps only and n = 394 both) and controls (n = 3,184) undergoing colonoscopy were recruited between 2003 and 2010 in Nashville, Tennessee, USA. Dietary intakes were estimated by a 108-item food frequency questionnaire. Unconditional logistic regression analysis was applied to determine odds ratios (OR) and corresponding 95% confidence intervals (CI) for colorectal polyps according to dietary carbohydrate intakes, after adjustment for potential confounders.

Results: No significant associations were detected for risk of colorectal adenomas when comparing the highest versus lowest quartiles of intake for total sugars (OR 1.03; 95% CI 0.84–1.26), starch (OR 1.01; 95% CI 0.81–1.26), or total or available carbohydrate intakes. Similar null associations were observed between dietary carbohydrate intakes and risk of hyperplastic polyps, or concurrent adenomas and hyperplastic polyps.

Conclusion: In this US population, digestible carbohydrate intakes were not associated with risk of colorectal polyps, suggesting that dietary carbohydrate does not have an etiological role in the early stages of colorectal carcinogenesis.
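
For readers unfamiliar with the method named in the Methods section, the sketch below shows how adjusted odds ratios and 95% confidence intervals are typically obtained from an unconditional logistic regression using statsmodels; the data frame, variable names, and confounders are hypothetical and are not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical data: case status, quartile indicator of sugar intake, confounders.
rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "case": rng.integers(0, 2, n),
    "sugar_q4": rng.integers(0, 2, n),   # highest vs. lowest intake quartile
    "age": rng.normal(60, 8, n),
    "smoker": rng.integers(0, 2, n),
})

X = sm.add_constant(df[["sugar_q4", "age", "smoker"]])
fit = sm.Logit(df["case"], X).fit(disp=0)

odds_ratios = np.exp(fit.params)        # exponentiated coefficients = adjusted ORs
conf_int = np.exp(fit.conf_int())       # 95% confidence intervals
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
```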

Relevance:

10.00%

Publisher:

Abstract:

We describe, for the first time, considerations in the sterile manufacture of polymeric microneedle arrays. Microneedles (MN) made from dissolving polymeric matrices and loaded with the model drugs ovalbumin (OVA) and ibuprofen sodium, and hydrogel-forming MN composed of "super-swelling" polymers and their corresponding lyophilised wafer drug reservoirs loaded with OVA and ibuprofen sodium, were prepared aseptically or sterilised using commonly employed sterilisation techniques. Moist and dry heat sterilisation, understandably, damaged all devices, leaving aseptic production and gamma sterilisation as the only viable options. No measurable bioburden was detected in any of the prepared devices, and endotoxin levels were always below the US Food & Drug Administration limit (20 endotoxin units/device). Hydrogel-forming MN were unaffected by gamma irradiation (25 kGy) in terms of their physical properties or their capability to deliver OVA and ibuprofen sodium across excised neonatal porcine skin in vitro. However, OVA content in dissolving MN (down from approximately 101.1% recovery to approximately 58.3% recovery) and lyophilised wafer-type drug reservoirs (down from approximately 99.7% recovery to approximately 60.1% recovery) was significantly reduced by gamma irradiation, while the skin permeation profile of ibuprofen sodium from gamma-irradiated dissolving MN was markedly different from that of their non-irradiated counterparts. It is clear that MN pose a very low risk to human health when used appropriately, as evidenced here by low endotoxin levels and the absence of microbial contamination. However, if guarantees of absolute sterility of MN products are ultimately required by regulatory authorities, it will be necessary to investigate the effect of lower gamma doses on dissolving MN loaded with active pharmaceutical ingredients and on lyophilised wafers loaded with biomolecules, in order to avoid the expense and inconvenience of aseptic processing.

Relevance:

10.00%

Publisher:

Abstract:

Motivated by the need to design efficient and robust fully-distributed computation in highly dynamic networks such as peer-to-peer (P2P) networks, we study distributed protocols for constructing and maintaining dynamic network topologies with good expansion properties. Our goal is to maintain a sparse (bounded-degree) expander topology despite heavy churn (i.e., nodes joining and leaving the network continuously over time). We assume that the churn is controlled by an adversary that has complete knowledge and control of which nodes join and leave and at what time, and has unlimited computational power, but is oblivious to the random choices made by the algorithm. Our main contribution is a randomized distributed protocol that guarantees, with high probability, the maintenance of a constant-degree graph with high expansion even under continuous high adversarial churn. Our protocol can tolerate a churn rate of up to O(n/polylog n) per round (where n is the stable network size). Our protocol is efficient, lightweight, and scalable, and it incurs only O(polylog n) overhead for topology maintenance: only polylogarithmic (in n) bits need to be processed and sent by each node per round, and any node's computation cost per round is also polylogarithmic. The given protocol is a fundamental ingredient needed for the design of efficient fully-distributed algorithms for solving fundamental distributed computing problems such as agreement, leader election, search, and storage in highly dynamic P2P networks, and it enables fast and scalable algorithms for these problems that can tolerate a large amount of churn.
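
One common ingredient in protocols of this kind (not necessarily the one used here) is to let a joining node choose its neighbours as the endpoints of short random walks from an existing node, which spreads new edges roughly evenly and tends to preserve expansion. The sketch below simulates only that joining rule; all names, the walk length, and the bootstrap graph are hypothetical.

```python
import random

def random_walk(adj, start, length):
    """Walk `length` steps over the undirected graph `adj` (dict of sets)."""
    node = start
    for _ in range(length):
        node = random.choice(sorted(adj[node]))
    return node

def join(adj, new_node, degree, walk_length):
    """A new node attaches to the endpoints of a few random walks from a known node."""
    entry = random.choice(sorted(adj))
    adj[new_node] = set()
    while len(adj[new_node]) < degree:
        target = random_walk(adj, entry, walk_length)
        if target != new_node:
            adj[new_node].add(target)
            adj[target].add(new_node)

# Hypothetical bootstrap network: a 6-cycle, then 14 joins.
random.seed(0)
adj = {i: {(i - 1) % 6, (i + 1) % 6} for i in range(6)}
for v in range(6, 20):
    join(adj, v, degree=3, walk_length=8)
print(max(len(ns) for ns in adj.values()))  # maximum degree of the grown overlay
```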

Relevance:

10.00%

Publisher:

Abstract:

We study the fundamental Byzantine leader election problem in dynamic networks where the topology can change from round to round and nodes can also experience heavy churn (i.e., nodes can join and leave the network continuously over time). We assume the full information model, in which the Byzantine nodes have complete knowledge about the entire state of the network at every round (including random choices made by all the nodes), have unbounded computational power, and can deviate arbitrarily from the protocol. The churn is controlled by an adversary that has complete knowledge and control over which nodes join and leave and at what times, may rewire the topology in every round, and has unlimited computational power, but is oblivious to the random choices made by the algorithm. Our main contribution is an O(log^3 n) round algorithm that achieves Byzantine leader election under the presence of up to O(n^{1/2 - ε}) Byzantine nodes (for a small constant ε > 0) and a churn of up to O(√n/polylog n) nodes per round (where n is the stable network size). The algorithm elects a leader with probability at least 1 - n^{-Ω(1)} and guarantees that it is an honest node with probability at least 1 - n^{-Ω(1)}; assuming the algorithm succeeds, the leader's identity will be known to a 1 - o(1) fraction of the honest nodes. Our algorithm is fully-distributed, lightweight, and simple to implement. It is also scalable, as it runs in polylogarithmic (in n) time and requires nodes to send and receive messages of only polylogarithmic size per round. To the best of our knowledge, our algorithm is the first scalable solution for Byzantine leader election in a dynamic network with a high rate of churn; our protocol can also be used to solve Byzantine agreement in a straightforward way. We also show how to implement an (almost-everywhere) public coin with constant bias in a dynamic network with Byzantine nodes and provide a mechanism for enabling honest nodes to store information reliably in the network, which might be of independent interest.
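
A standard reason a bound of roughly √n Byzantine nodes is workable is a sampling argument: a uniformly chosen committee of polylogarithmic size contains, with high probability, about the same (vanishing) fraction of Byzantine nodes as the whole network. The simulation below is only a sanity check of that argument, not the paper's election protocol; the parameters are hypothetical.

```python
import math
import random

random.seed(0)
n = 100_000
byzantine = set(range(int(n ** 0.45)))      # roughly n^{1/2 - eps} bad nodes
committee_size = 20 * int(math.log(n))      # polylogarithmic-size sample

trials = 1_000
worst_fraction = 0.0
for _ in range(trials):
    committee = random.sample(range(n), committee_size)
    bad = sum(1 for v in committee if v in byzantine)
    worst_fraction = max(worst_fraction, bad / committee_size)

print("global Byzantine fraction:", len(byzantine) / n)
print("worst committee Byzantine fraction over trials:", worst_fraction)
```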

Relevance:

10.00%

Publisher:

Abstract:

We present a fully-distributed self-healing algorithm, DEX, that maintains a constant degree expander network in a dynamic setting. To the best of our knowledge, our algorithm provides the first efficient distributed construction of expanders (whose expansion properties hold deterministically) that works even under an all-powerful adaptive adversary that controls the dynamic changes to the network (the adversary has unlimited computational power and knowledge of the entire network state, can decide which nodes join and leave and at what time, and knows the past random choices made by the algorithm). Previous distributed expander constructions typically provide only probabilistic guarantees on the network expansion, which rapidly degrade in a dynamic setting; in particular, the expansion properties can degrade even more rapidly under adversarial insertions and deletions. Our algorithm provides efficient maintenance and incurs a low overhead per insertion/deletion by an adaptive adversary: only O(log n) rounds and O(log n) messages are needed with high probability (n is the number of nodes currently in the network). The algorithm requires only a constant number of topology changes. Moreover, our algorithm allows for an efficient implementation and maintenance of a distributed hash table on top of DEX with only a constant additional overhead. Our results are a step towards implementing efficient self-healing networks that have guaranteed properties (constant bounded degree and expansion) despite dynamic changes.

Gopal Pandurangan has been supported in part by Nanyang Technological University Grant M58110000, Singapore Ministry of Education (MOE) Academic Research Fund (AcRF) Tier 2 Grant MOE2010-T2-2-082, MOE AcRF Tier 1 Grant MOE2012-T1-001-094, and the United States-Israel Binational Science Foundation (BSF) Grant 2008348. Peter Robinson has been supported by Grant MOE2011-T2-2-042 “Fault-tolerant Communication Complexity in Wireless Networks” from the Singapore MoE AcRF-2. Work done in part while the author was at the Nanyang Technological University and at the National University of Singapore. Amitabh Trehan has been supported by the Israeli Centers of Research Excellence (I-CORE) program (Center No. 4/11). Work done in part while the author was at Hebrew University of Jerusalem and at the Technion and supported by a Technion fellowship.
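
The distributed hash table mentioned in the abstract above can be pictured as hashing keys and node identifiers into one circular space and assigning each key to the nearest node clockwise (consistent hashing). The sketch below shows only that key-to-node mapping, independently of the expander maintenance; the hash truncation, node names, and keys are hypothetical.

```python
import hashlib
from bisect import bisect_left

def ring_hash(value):
    """Hash a string onto a 32-bit identifier ring."""
    digest = hashlib.sha256(value.encode()).digest()
    return int.from_bytes(digest[:4], "big")

def owner(key, node_ids):
    """Assign `key` to the first node clockwise from its hash (consistent hashing)."""
    ring = sorted(node_ids)
    idx = bisect_left(ring, ring_hash(key))
    return ring[idx % len(ring)]  # wrap around at the end of the ring

# Hypothetical overlay nodes (their positions on the ring).
nodes = [ring_hash(f"node-{i}") for i in range(8)]
for key in ("alice", "bob", "carol"):
    print(key, "->", owner(key, nodes))
```

Under such a mapping, a node join or departure only moves the keys in its own arc of the ring, which gives some intuition for why a DHT layer can be maintained on top of the overlay with little extra overhead.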

Relevance:

10.00%

Publisher:

Abstract:

The West has failed to properly integrate Russia into its worldview since 1991, and there is an obvious vacuum of ideas for how to deal with it. The default reaction is to fall back on the Cold War paradigm: sanctions, containment, and hopes of Russian regime change.

This is folly. There’s no knowing how long it will take for Russia to change tack, if it ever does; nothing guarantees that a new regime in Russia would be any more pro-Western. Nor is there any clear idea of how to handle Russia in the meantime, especially while it remains a crucial player in crises like those in Iran and Syria.

Ukraine has shown that the placeholder post-Cold War order Europe and Russia inherited urgently needs replacing. With a ceasefire in place at last, the search for an alternative is on. The Geneva talks in April this year could be its basis; but nothing truly transformative will be achieved until the US, EU, Russia and Ukraine all recognise the need for compromise.

Relevance:

10.00%

Publisher:

Abstract:

This research presents a fast algorithm for projected support vector machines (PSVM): a basis vector set (BVS) is selected for the kernel-induced feature space, and the training points are projected onto the subspace spanned by the selected BVS. A standard linear support vector machine (SVM) is then produced in the subspace with the projected training points. As the dimension of the subspace is determined by the size of the selected basis vector set, the size of the produced SVM expansion can be specified. A two-stage algorithm is derived which selects and refines the basis vector set, achieving a locally optimal model. The model expansion coefficients and bias are updated recursively as the basis set and support vector set grow and shrink. The condition for a point to be classed as lying outside the span of the current basis vector set, and therefore selected as a new basis vector, is derived and embedded in the recursive procedure; this guarantees the linear independence of the produced basis set. The proposed algorithm is tested and compared with an existing sparse primal SVM (SpSVM) and a standard SVM (LibSVM) on seven public benchmark classification problems. Our new algorithm is designed for use in human activity recognition on smart devices and embedded sensors, where sometimes limited memory and processing resources must be exploited to the full and where more robust and accurate classification leads to a more satisfied user. Experimental results demonstrate the effectiveness and efficiency of the proposed algorithm. This work builds upon a previously published algorithm specifically created for activity recognition within mobile applications for the EU Haptimap project [1]. The algorithms detailed in this paper are more memory- and resource-efficient, making them suitable for use with bigger data sets and more easily trained SVMs.
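
The linear-independence condition described here is commonly implemented in kernel methods by checking the feature-space projection residual of a candidate point against the span of the current basis, delta = k(x, x) - k_B(x)' * inv(K_BB) * k_B(x), and admitting the point only if delta exceeds a tolerance. The sketch below shows that greedy test with an RBF kernel; it is an illustration under our own assumptions, not the authors' recursive update, and the data, kernel parameter, and tolerance are hypothetical.

```python
import numpy as np

def rbf(x, y, gamma=1.0):
    """Gaussian (RBF) kernel between two vectors."""
    return np.exp(-gamma * np.sum((x - y) ** 2))

def select_basis(points, tol=1e-3, gamma=1.0):
    """Greedy basis selection: add a point only if its feature-space projection
    residual onto the span of the current basis exceeds `tol`, which keeps the
    basis kernel matrix well conditioned (i.e., the basis linearly independent)."""
    basis = []
    for x in points:
        if not basis:
            basis.append(x)
            continue
        k_bb = np.array([[rbf(a, b, gamma) for b in basis] for a in basis])
        k_bx = np.array([rbf(a, x, gamma) for a in basis])
        residual = rbf(x, x, gamma) - k_bx @ np.linalg.solve(k_bb, k_bx)
        if residual > tol:
            basis.append(x)
    return np.array(basis)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))            # hypothetical training points
B = select_basis(X, tol=1e-2, gamma=0.5)
print(len(B), "basis vectors selected out of", len(X))
```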

Relevance:

10.00%

Publisher:

Abstract:

AIM: To evaluate the association between various lifestyle factors and achalasia risk.

METHODS: A population-based case-control study was conducted in Northern Ireland, including n = 151 achalasia cases and n = 117 age- and sex-matched controls. Lifestyle factors were assessed via a face-to-face structured interview. The association between achalasia and lifestyle factors was assessed by unconditional logistic regression to produce odds ratios (OR) and 95% confidence intervals (CI).

RESULTS: Individuals who had low-class occupations were at the highest risk of achalasia (OR = 1.88, 95% CI: 1.02-3.45), implying that holders of high-class occupations have a reduced risk of achalasia. A history of foreign travel, a lifestyle factor linked to upper socio-economic class, was also associated with a reduced risk of achalasia (OR = 0.59, 95% CI: 0.35-0.99). Smoking and alcohol consumption carried significantly reduced risks of achalasia, even after adjustment for socio-economic status. The presence of pets in the house was associated with a two-fold increased risk of achalasia (OR = 2.00, 95% CI: 1.17-3.42). No childhood household factors were associated with achalasia risk.

CONCLUSION: Achalasia is a disease of inequality, and individuals from low socio-economic backgrounds are at highest risk. This does not appear to be due to corresponding alcohol and smoking behaviours. An observed positive association between pet ownership and achalasia risk suggests an interaction between endotoxin and viral infection exposure in achalasia aetiology.
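
For orientation only: a crude (unadjusted) odds ratio and its 95% confidence interval can be read off a 2x2 exposure table as shown below. The counts are hypothetical, and the estimates reported in this abstract come from the adjusted logistic regression model, not from a calculation like this.

```python
import math

# Hypothetical 2x2 table: exposure (pet in the house) by case status.
a, b = 60, 91    # exposed cases, unexposed cases
c, d = 30, 87    # exposed controls, unexposed controls

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
lower = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
upper = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI {lower:.2f}-{upper:.2f}")
```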

Relevance:

10.00%

Publisher:

Abstract:

Bounding the tree-width of a Bayesian network can reduce the chance of overfitting and allows exact inference to be performed efficiently. Several existing algorithms tackle the problem of learning bounded tree-width Bayesian networks by learning from k-trees as super-structures, but they do not scale to large domains and/or large tree-width. We propose a guided search algorithm to find k-trees with maximum Informative Score, a measure of the quality of a k-tree for yielding good Bayesian networks. The algorithm achieves close to optimal performance compared to exact solutions in small domains and can discover better networks than existing approximate methods in large domains. It also provides an optimal elimination order of variables that guarantees small complexity for later runs of exact inference. Comparisons with well-known approaches in terms of learning and inference accuracy illustrate its capabilities.
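
The "optimal elimination order" guarantee refers to the induced width of the order: variables are eliminated one by one, each variable's remaining neighbours are connected, and the largest neighbourhood encountered bounds the cost of exact inference. The sketch below computes that induced width for a given order; the graph and order are hypothetical.

```python
def induced_width(adj, order):
    """Induced width of an elimination `order` on an undirected graph `adj`
    (dict: node -> set of neighbours). Exact inference cost is exponential
    in this quantity, so a small induced width means cheap inference."""
    adj = {v: set(ns) for v, ns in adj.items()}   # work on a copy
    width = 0
    for v in order:
        neighbours = adj.pop(v)
        width = max(width, len(neighbours))
        for u in neighbours:
            adj[u].discard(v)
            adj[u] |= neighbours - {u}            # connect the remaining neighbours
    return width

# Hypothetical moral graph of a small Bayesian network.
graph = {
    "A": {"B", "C"}, "B": {"A", "C", "D"},
    "C": {"A", "B", "D"}, "D": {"B", "C", "E"}, "E": {"D"},
}
print(induced_width(graph, ["E", "A", "D", "B", "C"]))  # 2 for this order
```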

Relevance:

10.00%

Publisher:

Abstract:

Existing compact routing schemes, e.g., Thorup and Zwick [SPAA 2001] and Chechik [PODC 2013], often have no means to tolerate failures once the system has been set up and started. This paper presents, to our knowledge, the first self-healing compact routing scheme. Moreover, our schemes are developed for low-memory nodes, i.e., nodes need only O(log^2 n) memory, and are thus compact schemes.
We introduce two algorithms of independent interest. The first is CompactFT, a novel compact version (using only O(log n) local memory) of the self-healing algorithm Forgiving Tree of Hayes et al. [PODC 2008]. The second algorithm (CompactFTZ) combines CompactFT with Thorup-Zwick's tree-based compact routing scheme [SPAA 2001] to produce a fully compact self-healing routing scheme. In the self-healing model, the adversary deletes nodes one at a time, with the affected nodes self-healing locally by adding a few edges. CompactFT recovers from each attack in only O(1) time and ∆ messages, with only a +3 degree increase and O(log ∆) graph diameter increase over any sequence of deletions (∆ is the initial maximum degree).
Additionally, CompactFTZ guarantees delivery of a packet sent from a sender s to a receiver t as long as t has not been deleted, with only an additional O(y log ∆) latency, where y is the number of nodes that have been deleted on the path between s and t. If t has been deleted, s is informed and the packet is removed from the network.
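
For intuition, tree-based compact routing schemes in the Thorup-Zwick style can be pictured via DFS interval labels: each node stores short labels for its children, and a packet carrying the target's label is forwarded to the child whose interval contains it, or otherwise towards the parent. The sketch below shows only that labelling and forwarding idea, not CompactFTZ's self-healing machinery; the tree and names are hypothetical.

```python
def assign_intervals(tree, root):
    """DFS interval labels: each node gets [enter, exit) such that u is an
    ancestor of v exactly when v's interval is contained in u's."""
    labels, counter = {}, [0]

    def dfs(u):
        enter = counter[0]
        counter[0] += 1
        for child in tree.get(u, []):
            dfs(child)
        labels[u] = (enter, counter[0])

    dfs(root)
    return labels

def next_hop(u, target, tree, labels):
    """One routing step: forward to the child whose interval contains the
    target's label (carried in the packet header), otherwise to the parent."""
    for child in tree.get(u, []):
        lo, hi = labels[child]
        if lo <= labels[target][0] < hi:
            return child
    return "parent"

# Hypothetical tree: parent -> ordered list of children.
tree = {"r": ["a", "b"], "a": ["c", "d"], "b": ["e"]}
labels = assign_intervals(tree, "r")
print(next_hop("r", "d", tree, labels))  # -> "a"
print(next_hop("a", "e", tree, labels))  # -> "parent" (target not in a's subtree)
```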