Abstract:
We consider the problem of self-healing in peer-to-peer networks that are under repeated attack by an omniscient adversary. We assume that the following process continues for up to n rounds, where n is the total number of nodes initially in the network: the adversary deletes an arbitrary node from the network, then the network responds by quickly adding a small number of new edges.
We present a distributed data structure that ensures two key properties. First, the diameter of the network is never more than O(log Delta) times its original diameter, where Delta is the maximum degree of the network initially. We note that for many peer-to-peer systems, Delta is polylogarithmic, so the diameter increase would be an O(log log n) multiplicative factor. Second, the degree of any node never increases by more than 3 over its original degree. Our data structure is fully distributed, has O(1) latency per round and requires each node to send and receive O(1) messages per round. The data structure requires an initial setup phase that has latency equal to the diameter of the original network, and requires, with high probability, each node v to send O(log n) messages along every edge incident to v. Our approach is orthogonal and complementary to traditional topology-based approaches to defending against attack.
Abstract:
We consider the problem of self-healing in peer-to-peer networks that are under repeated attack by an omniscient adversary. We assume that, over a sequence of rounds, an adversary either inserts a node with arbitrary connections or deletes an arbitrary node from the network. The network responds to each such change by quick "repairs," which consist of adding or deleting a small number of edges. These repairs essentially preserve closeness of nodes after adversarial deletions, without increasing node degrees by too much, in the following sense. At any point in the algorithm, nodes v and w whose distance would have been ℓ in the graph formed by considering only the adversarial insertions (not the adversarial deletions) will be at distance at most ℓ log n in the actual graph, where n is the total number of vertices seen so far. Similarly, at any point, a node v whose degree would have been d in the graph with adversarial insertions only will have degree at most 3d in the actual graph. Our distributed data structure, which we call the Forgiving Graph, has low latency and bandwidth requirements. The Forgiving Graph improves on the Forgiving Tree distributed data structure from Hayes et al. (2008) in the following ways: 1) it ensures low stretch over all pairs of nodes, while the Forgiving Tree only ensures low diameter increase; 2) it handles both node insertions and deletions, while the Forgiving Tree only handles deletions; 3) it requires only a very simple and minimal initialization phase, while the Forgiving Tree initially requires construction of a spanning tree of the network. © Springer-Verlag 2012.
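The two guarantees stated in this abstract are concrete enough to check mechanically. The sketch below (function names are hypothetical, and both graphs are assumed to be given as adjacency dicts over the same surviving node set) verifies the degree and stretch bounds by brute-force breadth-first search; it is a verification aid for the stated invariants, not the Forgiving Graph repair algorithm itself.

```python
from collections import deque
from math import log2

def bfs_distances(adj, src):
    """Unweighted shortest-path distances from src by breadth-first search."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def check_invariants(ideal, actual):
    """Check the abstract's two bounds: every node's degree in the actual
    (healed) graph is at most 3x its degree in the ideal graph (adversarial
    insertions only), and every pairwise distance in the actual graph is at
    most log n times the ideal distance, where n is the number of nodes."""
    n = len(ideal)
    for v in actual:
        if len(actual[v]) > 3 * len(ideal[v]):
            return False  # degree bound violated
    for v in ideal:
        d_ideal = bfs_distances(ideal, v)
        d_actual = bfs_distances(actual, v)
        for w, d in d_ideal.items():
            if w != v and d_actual.get(w, float("inf")) > d * log2(n):
                return False  # stretch bound violated
    return True
```

For example, a 4-cycle that is left unchanged trivially satisfies both bounds, while deleting an edge without any repair violates the stretch bound.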
Abstract:
Model selection between competing models is a key consideration in the discovery of prognostic multigene signatures. The use of appropriate statistical performance measures as well as verification of biological significance of the signatures is imperative to maximise the chance of external validation of the generated signatures. Current approaches in time-to-event studies often use only a single measure of performance in model selection, such as log-rank test p-values, or dichotomise the follow-up times at some phase of the study to facilitate signature discovery. In this study we improve the prognostic signature discovery process through the application of the multivariate partial Cox model combined with the concordance index, hazard ratio of predictions, independence from available clinical covariates and biological enrichment as measures of signature performance. The proposed framework was applied to discover prognostic multigene signatures from early breast cancer data. The partial Cox model, combined with the multiple performance measures, was used both in guiding the selection of the optimal panel of prognostic genes and in prediction of risk within cross-validation, without dichotomising the follow-up times at any stage. The signatures were successfully externally validated in independent breast cancer datasets, yielding a hazard ratio of 2.55 [1.44, 4.51] for the top ranking signature.
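As a concrete instance of one of the performance measures named above, the sketch below computes Harrell's concordance index for right-censored survival data (the exact variant used in the study is not specified here, and the function name is illustrative): over all comparable pairs, it counts how often the subject with the higher predicted risk is the one who experiences the event sooner.

```python
def concordance_index(times, events, risks):
    """Harrell's concordance index for right-censored survival data.

    A pair (i, j) is comparable when the subject with the shorter
    follow-up time experienced the event (event flag 1, not censored);
    the pair is concordant when that subject also has the higher
    predicted risk. Tied risks count as half-concordant.
    """
    concordant = 0.0
    comparable = 0
    n = len(times)
    for i in range(n):
        for j in range(i + 1, n):
            # order the pair so that `a` has the shorter follow-up time
            a, b = (i, j) if times[i] < times[j] else (j, i)
            if times[a] == times[b] or not events[a]:
                continue  # tied times, or earlier subject censored
            comparable += 1
            if risks[a] > risks[b]:
                concordant += 1.0
            elif risks[a] == risks[b]:
                concordant += 0.5
    if comparable == 0:
        raise ValueError("no comparable pairs in the data")
    return concordant / comparable
```

A value of 1.0 indicates perfect ranking of event times by predicted risk, 0.5 indicates random ranking.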
Abstract:
Soil carbon stores are a major component of the annual returns required by EU governments to the Intergovernmental Panel on Climate Change. Peat has a high proportion of soil carbon due to the relatively high carbon density of peat and organic-rich soils. For this reason it has become increasingly important to measure and model soil carbon stores and changes in peat stocks to facilitate the management of carbon changes over time. The approach investigated in this research evaluates the use of airborne geophysical (radiometric) data to estimate peat thickness using the attenuation of bedrock geology radioactivity by superficial peat cover. Remotely sensed radiometric data are validated with ground peat depth measurements combined with non-invasive geophysical surveys. Two field-based case studies exemplify and validate the results. Variography and kriging are used to predict peat thickness from point measurements of peat depth and airborne radiometric data and provide an estimate of uncertainty in the predictions. Cokriging, by assessing the degree of spatial correlation between recent remotely sensed geophysical monitoring and previous peat depth models, is used to examine changes in peat stocks over time. The significance of the coregionalisation is that the spatial cross correlation between the remote and ground based data can be used to update the model of peat depth. The result is that by integrating remotely sensed data with ground geophysics, the need is reduced for extensive ground-based monitoring and invasive peat depth measurements. The overall goal is to provide robust estimates of peat thickness to improve estimates of carbon stocks. The implications from the research have a broader significance that promotes a reduction in the need for damaging onsite peat thickness measurement and an increase in the use of remotely sensed data for carbon stock estimations.
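The kriging step described above can be illustrated with a minimal 1-D ordinary-kriging sketch. The exponential variogram model and its sill/range parameters below are illustrative assumptions, not values from the study; the point is the structure of the kriging system, which yields both a prediction and the uncertainty estimate mentioned in the abstract.

```python
import numpy as np

def exp_variogram(h, sill=1.0, rng=10.0):
    """Exponential variogram model (zero nugget; parameters are illustrative)."""
    return sill * (1.0 - np.exp(-3.0 * h / rng))

def ordinary_kriging(xs, zs, x0, variogram=exp_variogram):
    """Predict the value at x0 from point observations (xs, zs).

    Solves the ordinary kriging system
        [Gamma 1; 1^T 0] [w; mu] = [gamma0; 1]
    where Gamma holds variogram values between data points, gamma0 holds
    variogram values between data points and x0, and mu is the Lagrange
    multiplier enforcing that the weights sum to 1 (unbiasedness).
    """
    n = len(xs)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(np.abs(np.subtract.outer(xs, xs)))
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = variogram(np.abs(xs - x0))
    sol = np.linalg.solve(A, b)
    w, mu = sol[:n], sol[n]
    z_hat = w @ zs                 # kriging prediction
    variance = b[:n] @ w + mu      # kriging variance (uncertainty estimate)
    return z_hat, variance, w
```

With a zero-nugget model the predictor is an exact interpolator: at a data location the weights collapse to a single 1 and the kriging variance is 0, while at unsampled locations the variance is positive and quantifies prediction uncertainty.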
Abstract:
Objective: To examine the differences in the interval between diagnosis and initiation of treatment among women with breast cancer in Northern Ireland.
Design: A cross-sectional observational study.
Setting: All breast cancer care patients in the Northern Ireland Cancer Registry in 2006.
Participants: All women diagnosed and treated for breast cancer in Northern Ireland in 2006.
Main outcome measure: The number of days between diagnosis and initiation of treatment for breast cancer.
Results: The mean (median) interval between diagnosis and initiation of treatment among public patients was 19 (15) days, compared with 14 (12) days among those whose care involved private providers. The differences between individual public providers were as marked as those between the public and private sector, with the mean (median) ranging between 14 (12) and 25 (22) days. Multivariate models revealed that the differences remained evident when a range of patient characteristics, including cancer stage, was controlled for.
Conclusions: A relatively small number of women received care privately in Northern Ireland but experienced shorter intervals between diagnosis and initiation of treatment than those who received care wholly in the public system. The variation among public providers was as great as that between the public and private providers. The impact of such differences on survival warrants investigation, particularly in light of the waiting time targets introduced in Northern Ireland.
Abstract:
Prader-Willi syndrome (PWS) and Fragile X syndrome (FraX) are associated with distinctive cognitive and behavioural profiles. We examined whether repetitive behaviours in the two syndromes were associated with deficits in specific executive functions. PWS, FraX, and typically developing (TD) children were assessed for executive functioning using the Test of Everyday Attention for Children and an adapted Simon spatial interference task. Relative to the TD children, children with PWS and FraX showed greater costs of attention switching on the Simon task, but after controlling for intellectual ability, these switching deficits were only significant in the PWS group. Children with PWS and FraX also showed significantly increased preference for routine and differing profiles of other specific types of repetitive behaviours. A measure of switch cost from the Simon task was positively correlated with scores on preference-for-routine questionnaire items and was strongly associated with scores on other items relating to a preference for predictability. It is proposed that a deficit in attention switching is a component of the endophenotypes of both PWS and FraX and is associated with specific behaviours. This proposal is discussed in the context of neurocognitive pathways between genes and behaviour.
Abstract:
The aim of this paper is to equip readers with an understanding of the principles of qualitative data analysis and offer a practical example of how analysis might be undertaken in an interview-based study.