979 results for messages


Relevance:

10.00%

Publisher:

Abstract:

Commonly, research work in routing for delay tolerant networks (DTN) assumes that node encounters are predestined, in the sense that they are the result of unknown, exogenous processes that control the mobility of these nodes. In this paper, we argue that for many applications such an assumption is too restrictive: while the spatio-temporal coordinates of the start and end points of a node's journey are determined by exogenous processes, the specific path that a node may take in space-time, and hence the set of nodes it may encounter, could be controlled so as to improve the performance of DTN routing. To that end, we consider a setting in which each mobile node is governed by a schedule consisting of a list of locations that the node must visit at particular times. Typically, such schedules exhibit some level of slack, which could be leveraged for DTN message delivery purposes. We define the Mobility Coordination Problem (MCP) for DTNs as follows: given a set of nodes, each with its own schedule, and a set of messages to be exchanged between these nodes, devise a set of node encounters that minimizes message delivery delays while satisfying all node schedules. The MCP for DTNs is general enough to let us model and evaluate some of the existing DTN schemes, including data mules and message ferries. In this paper, we show that MCP for DTNs is NP-hard and propose two detour-based approaches to solve the problem. The first (DMD) is a centralized heuristic that leverages knowledge of the message workload to suggest specific detours that optimize message delivery. The second (DNE) is a distributed heuristic that is oblivious to the message workload, and which selects detours so as to maximize node encounters. We evaluate the performance of these detour-based approaches using extensive simulations based on synthetic workloads as well as real schedules obtained from taxi logs in a major metropolitan area. Our evaluation shows that our centralized, workload-aware DMD approach yields the best performance, in terms of message delay and delivery success ratio, and that our distributed, workload-oblivious DNE approach yields favorable performance when compared to approaches that require the use of data mules and message ferries.
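
A minimal sketch may help make the detour idea concrete. The following toy version of a DMD-style centralized heuristic is my own illustration, not the paper's algorithm: it assumes unit-speed travel on a plane, schedules given as (time, x, y) appointments, and a midpoint meeting rule, and it ignores aligning the two nodes' meeting times, which the real problem must also satisfy.

```python
# Hedged sketch of a DMD-like centralized detour heuristic (illustrative only).

def dist(p, q):
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def leg_slack(sched, i):
    """Spare time on the leg between appointments i and i+1 at unit speed."""
    (t0, *p0), (t1, *p1) = sched[i], sched[i + 1]
    return (t1 - t0) - dist(p0, p1)

def feasible_meeting(sa, ia, sb, ib):
    """Try the midpoint of the two leg start points as a meeting place; each
    node must absorb its detour (out and back) within its own slack."""
    _, *pa = sa[ia]
    _, *pb = sb[ib]
    m = ((pa[0] + pb[0]) / 2, (pa[1] + pb[1]) / 2)
    if 2 * dist(pa, m) <= leg_slack(sa, ia) and 2 * dist(pb, m) <= leg_slack(sb, ib):
        return m
    return None

def plan_encounters(schedules, messages):
    """Greedily pick, for each (src, dst) message, the first feasible detour;
    all schedules remain satisfied by construction."""
    plan = {}
    for src, dst in messages:
        sa, sb = schedules[src], schedules[dst]
        candidates = ((ia, ib) for ia in range(len(sa) - 1)
                               for ib in range(len(sb) - 1))
        for ia, ib in candidates:
            m = feasible_meeting(sa, ia, sb, ib)
            if m is not None:
                plan[(src, dst)] = (ia, ib, m)
                break
    return plan
```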

Relevance:

10.00%

Publisher:

Abstract:

Recent advances in processor speeds, mobile communications and battery life have enabled computers to evolve from completely wired to completely mobile. In the most extreme case, all nodes are mobile and communication takes place at available opportunities – using both traditional communication infrastructure as well as the mobility of intermediate nodes. These are mobile opportunistic networks. Data communication in such networks is a difficult problem, because of the dynamic underlying topology, the scarcity of network resources and the lack of global information. Establishing end-to-end routes in such networks is usually not feasible. Instead, a store-and-carry forwarding paradigm is better suited for such networks. This dissertation describes and analyzes algorithms for forwarding of messages in such networks. In order to design effective forwarding algorithms for mobile opportunistic networks, we start by first building an understanding of the set of all paths between nodes, which represent the available opportunities for any forwarding algorithm. Relying on real measurements, we enumerate paths between nodes and uncover what we refer to as the path explosion effect. The term path explosion refers to the fact that the number of paths between a randomly selected pair of nodes increases exponentially with time. We draw from the theory of epidemics to model and explain the path explosion effect. This is the first contribution of the thesis, and is a key observation that underlies subsequent results. Our second contribution is the study of forwarding algorithms. For this, we rely on trace-driven simulations of different algorithms that span a range of design dimensions. We compare the performance (success rate and average delay) of these algorithms. We make the surprising observation that most algorithms we consider have roughly similar performance. We explain this result in light of the path explosion phenomenon. While the performance of most algorithms we studied was roughly the same, these algorithms differed in terms of cost. This prompted us to focus on designing algorithms with the explicit intent of reducing costs. For this, we cast the problem of forwarding as an optimal stopping problem. Our third main contribution is the design of strategies based on optimal stopping principles, which we refer to as Delegation schemes. Our analysis shows that using a delegation scheme reduces cost over naive forwarding by a factor of O(√N), where N is the number of nodes in the network. We further validate this result on real traces, where the cost reduction observed is even greater. Our results so far rely on a key assumption: unbounded buffers on nodes. Next, we relax this assumption, so that the problem shifts to one of prioritizing messages for transmission and dropping. Our fourth contribution is the study of message prioritization schemes, combined with forwarding. Our main result is that one achieves higher performance by assigning higher priorities to young messages in the network. We again interpret this result in light of the path explosion effect.
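
The Delegation idea admits a compact sketch. The rule below is a hedged reconstruction (the class names, encounter hook, and contact-rate quality metric are mine): a holder hands a copy to an encountered peer only if the peer's quality for the destination beats the highest quality any holder of that message has yet seen.

```python
# Hedged sketch of a delegation-style forwarding rule.

class Node:
    def __init__(self):
        self.contact_rate = {}   # dest -> observed contact rate (illustrative metric)
        self.buffer = []         # message copies carried by this node

    def store(self, copy):
        self.buffer.append(copy)

class Copy:
    def __init__(self, dest, threshold):
        self.dest = dest
        self.threshold = threshold   # best quality seen along this copy's lineage

def quality(node, dest):
    return node.contact_rate.get(dest, 0.0)

def on_encounter(holder, copy, peer):
    """Called when `holder`, carrying `copy`, meets `peer`."""
    q = quality(peer, copy.dest)
    if q > copy.threshold:
        copy.threshold = q                 # raise the bar for future delegations
        peer.store(Copy(copy.dest, q))     # delegate a copy to the better relay
```

Because the threshold only rises along each delegation chain, copies become progressively harder to spawn; this is the mechanism behind the O(√N) cost factor quoted above.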

Relevance:

10.00%

Publisher:

Abstract:

In this paper we introduce a theory of policy routing dynamics based on fundamental axioms of routing update mechanisms. We develop a dynamic policy routing model (DPR) that extends the static formalism of the stable paths problem (introduced by Griffin et al.) with discrete synchronous time. DPR captures the propagation of path changes in any dynamic network irrespective of its time-varying topology. We introduce several novel structures, such as causation chains, dispute fences and policy digraphs, that model different aspects of routing dynamics and provide insight into how these dynamics manifest in a network. We demonstrate the practicality of the theoretical foundation provided by DPR on two fundamental problems: routing dynamics minimization and policy conflict detection. The dynamics minimization problem utilizes policy digraphs, which capture the dependencies in routing policies irrespective of underlying topology dynamics, to solve a graph optimization problem. This optimization problem explicitly minimizes the number of routing update messages in a dynamic network by optimally changing the path preferences of a minimal subset of nodes. The conflict detection problem, on the other hand, utilizes a theoretical result of DPR: the root cause of a causation cycle (i.e., a cycle of routing update messages) can be precisely inferred as either a transient route flap or a dispute wheel (i.e., a policy conflict). Using this result we develop SafetyPulse, a token-based distributed algorithm that detects policy conflicts in a dynamic network. SafetyPulse is privacy preserving, computationally efficient, and provably correct.
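
To make the discrete synchronous time dimension concrete, here is a toy simulator of stable-paths-style update dynamics (my own minimal reconstruction, not DPR itself). The instance encoded is Griffin et al.'s classic BAD GADGET, whose permanent oscillation is exactly the kind of causation cycle rooted in a dispute wheel rather than a transient flap:

```python
# prefs[v] = ranked list of permitted paths from v to destination 0.
# Per synchronous round, every node adopts its most preferred path whose
# next hop currently uses the path's tail; we count update messages.

def step(current, prefs):
    nxt, updates = {}, 0
    for v, ranked in prefs.items():
        choice = ()
        for path in ranked:
            # A direct path (v, 0) is always available; a longer path
            # (v, u, ...) needs u to currently hold the tail (u, ...).
            if len(path) == 2 or current.get(path[1]) == path[1:]:
                choice = path
                break
        nxt[v] = choice
        updates += choice != current.get(v)
    return nxt, updates

# BAD GADGET: each node prefers routing through its neighbour over the
# direct path to 0, so the system has no stable state and oscillates.
prefs = {
    1: [(1, 3, 0), (1, 0)],
    2: [(2, 1, 0), (2, 0)],
    3: [(3, 2, 0), (3, 0)],
}
state = {v: () for v in prefs}
for rnd in range(6):
    state, n = step(state, prefs)
    print(rnd, n, state)   # update count never settles to zero
```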

Relevance:

10.00%

Publisher:

Abstract:

We revisit the problem of connection management for reliable transport. At one extreme, a pure soft-state (SS) approach (as in Delta-t [9]) safely removes the state of a connection at the sender and receiver once the state timers expire, without the need for explicit removal messages; new connections are likewise established without an explicit handshaking phase. At the other extreme, a hybrid hard-state/soft-state (HS+SS) approach (as in TCP) uses both explicit handshaking and timer-based management of the connection's state. In this paper, we consider the worst-case scenario of reliable single-message communication, and develop a common analytical model that can be instantiated to capture either the SS approach or the HS+SS approach. We compare the two approaches in terms of goodput, message overhead and state overhead. We also use simulations to compare against other approaches, and evaluate them in terms of correctness (with respect to data loss and duplication) and robustness to bad network conditions (high message loss rate and variable channel delays). Our results show that the SS approach is more robust and has lower message overhead. On the other hand, SS requires more memory to keep connection states, which reduces goodput. Given that memory is becoming larger and cheaper, SS presents the best choice over bandwidth-constrained, error-prone networks.
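
For intuition, here is a minimal sketch of the soft-state half of the comparison (the table layout and timer value are mine; Delta-t's actual mechanism, which derives its timers from bounds on packet lifetime, is richer):

```python
import time

# Toy soft-state connection table: state is created by the first segment of
# a connection (no handshake) and removed purely by timer expiry (no
# teardown messages).

LIFETIME = 3.0   # seconds of silence before connection state is discarded

class SoftStateTable:
    def __init__(self):
        self.conns = {}   # conn_id -> timestamp of last activity

    def on_segment(self, conn_id):
        """Any arriving segment (re)creates and refreshes state."""
        self.expire()
        self.conns[conn_id] = time.monotonic()

    def expire(self):
        """Silently drop expired state: the soft-state analogue of TCP's
        explicit FIN/RST removal messages."""
        now = time.monotonic()
        for c in [c for c, t in self.conns.items() if now - t > LIFETIME]:
            del self.conns[c]
```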

Relevance:

10.00%

Publisher:

Abstract:

We propose an economic mechanism to reduce the incidence of malware that delivers spam. Earlier research proposed attention markets as a solution for unwanted messages, and showed they could provide more net benefit than alternatives such as filtering and taxes. Because it uses a currency system, Attention Bonds faces a challenge. Zombies, botnets, and various forms of malware might steal valuable currency instead of stealing unused CPU cycles. We resolve this problem by taking advantage of the fact that the spam-bot problem has been reduced to financial fraud. As such, the large body of existing work in that realm can be brought to bear. By drawing an analogy between sending and spending, we show how a market mechanism can detect and prevent spam malware. We prove that by using a currency (i) each instance of spam increases the probability of detecting infections, and (ii) the value of eradicating infections can justify insuring users against fraud. This approach attacks spam at the source, a virtue missing from filters that attack spam at the destination. Additionally, the exchange of currency provides signals of interest that can improve the targeting of ads. ISPs benefit from data management services and consumers benefit from the higher average value of messages they receive. We explore these and other secondary effects of attention markets, and find them to offer, on the whole, attractive economic benefits for all – including consumers, advertisers, and the ISPs.
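
Claim (i) has a simple gloss (my illustration; p and n are not parameters from the paper): if each bonded spam message independently trips a fraud check with probability p, then after n messages

```latex
\Pr[\text{infection detected}] \;=\; 1 - (1 - p)^{n} \;\to\; 1
\quad \text{as } n \to \infty,
```

so every additional spam message sent strictly increases the probability that the infection is caught.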

Relevance:

10.00%

Publisher:

Abstract:

Traditional approaches to receiver-driven layered multicast have advocated the benefits of cumulative layering, which can enable coarse-grained congestion control that complies with TCP-friendliness equations over large time scales. In this paper, we quantify the costs and benefits of using non-cumulative layering and present a new, scalable multicast congestion control scheme which provides a fine-grained approximation to the behavior of TCP additive increase/multiplicative decrease (AIMD). In contrast to the conventional wisdom, we demonstrate that fine-grained rate adjustment can be achieved with only modest increases in the number of layers and aggregate bandwidth consumption, while using only a small constant number of control messages to perform either additive increase or multiplicative decrease.
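
A binary layering scheme illustrates why fine granularity needs only modestly many layers (my example, not necessarily the paper's construction): with layer i carrying rate BASE * 2**i, k layers cover every multiple of BASE up to BASE * (2**k - 1). The paper's scheme goes further, choosing layers so that each AIMD step also costs only a small constant number of join/leave control messages.

```python
# Illustrative non-cumulative layering: the subscription set for any
# multiple of BASE is simply its binary representation.

BASE = 32   # kbit/s carried by layer 0 (arbitrary example value)

def layers_for(rate):
    """Set of layers whose rates sum to `rate` (a multiple of BASE)."""
    bits = rate // BASE
    return {i for i in range(bits.bit_length()) if bits >> i & 1}

def additive_increase(rate):
    return rate + BASE                     # fine-grained AIMD increase step

def multiplicative_decrease(rate):
    return (rate // (2 * BASE)) * BASE     # halve, rounded to a BASE multiple

rate = 7 * BASE
print(layers_for(rate))                           # {0, 1, 2}
print(layers_for(additive_increase(rate)))        # {3}
print(layers_for(multiplicative_decrease(rate)))  # {0, 1}
```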

Relevance:

10.00%

Publisher:

Abstract:

A notable feature of the surveillance case law of the European Court of Human Rights has been the tendency of the Court to focus on the “in accordance with the law” aspect of the Article 8 ECHR inquiry. This focus has been the subject of some criticism, but the impact of this approach on the manner in which domestic surveillance legislation has been formulated in the Party States has received little scholarly attention. This thesis addresses that gap in the literature through its consideration of the Interception of Postal Packets and Telecommunications Messages (Regulation) Act, 1993 and the Criminal Justice (Surveillance) Act, 2009. While both Acts provide several of the safeguards endorsed by the European Court of Human Rights, this thesis finds that they suffer from a number of crucial weaknesses that undermine the protection of privacy. This thesis demonstrates how the focus of the European Court of Human Rights on the “in accordance with the law” test has resulted in some positive legislative change. Notwithstanding this fact, it is maintained that the legality approach has gained prominence at the expense of a full consideration of the “necessary in a democratic society” inquiry. This has resulted in superficial legislative responses at the domestic level, including from the Irish government. Notably, through the examination of a number of more recent cases, this project discerns a significant alteration in the interpretive approach adopted by the European Court of Human Rights regarding the application of the necessity test. The implications of this development are considered and the outlook for Irish surveillance legislation is assessed.

Relevance:

10.00%

Publisher:

Abstract:

In this work we introduce a new mathematical tool for optimization of routes, topology design, and energy efficiency in wireless sensor networks. We introduce a vector field formulation that models communication in the network, and routing is performed in the direction of this vector field at every location of the network. The magnitude of the vector field at every location represents the density of data being transited through that location. We define the total communication cost in the network as the integral of a quadratic form of the vector field over the network area. With the above formulation, we introduce a mathematical machinery based on partial differential equations very similar to Maxwell's equations in electrostatics. We show that in order to minimize the cost, the routes should be found based on the solution of these partial differential equations. In our formulation, the sensors are sources of information, similar to the positive charges in electrostatics; the destinations are sinks of information, similar to negative charges; and the network is similar to a non-homogeneous dielectric medium with a variable dielectric constant (or permittivity coefficient). In one application of our mathematical model based on vector fields, we offer a scheme for energy-efficient routing. Our routing scheme is based on raising the permittivity coefficient in places of the network where nodes have high residual energy, and lowering it in places where the nodes do not have much energy left. Our simulations show that our method gives a significant increase in network lifetime compared to the shortest path and weighted shortest path schemes. Our initial focus is on the case where there is only one destination in the network; later we extend our approach to the case of multiple destinations. With multiple destinations, we need to partition the network into several areas known as the regions of attraction of the destinations. Each destination is responsible for collecting all messages generated in its region of attraction. The difficulty of the optimization problem in this case is how to define the regions of attraction of the destinations and how much communication load to assign to each destination to optimize the performance of the network. We use our vector field model to solve the optimization problem for this case. We define a vector field which is conservative, and hence can be written as the gradient of a scalar field (also known as a potential field). We then show that in the optimal assignment of the communication load of the network to the destinations, the value of that potential field must be equal at the locations of all the destinations. Another application of our vector field model is finding the optimal locations of the destinations in the network. We show that the vector field gives the gradient of the cost function with respect to the locations of the destinations. Based on this fact, we suggest an algorithm to be applied during the design phase of a network to relocate the destinations and reduce the communication cost. The performance of our proposed schemes is confirmed by several examples and simulation experiments.

In another part of this work we focus on the notions of responsiveness and conformance of TCP traffic in communication networks. We introduce the notion of responsiveness for TCP aggregates and define it as the degree to which a TCP aggregate reduces its sending rate to the network in response to packet drops. We define metrics that describe the responsiveness of TCP aggregates, and suggest two methods for determining their values. The first method is based on a test in which we intentionally drop a few packets from the aggregate and measure the resulting rate decrease of that aggregate. This kind of test is not robust to multiple simultaneous tests performed at different routers. We make the test robust to multiple simultaneous tests by using ideas from the CDMA approach to multiple-access channels in communication theory. Based on this approach, we introduce tests of responsiveness for aggregates, and call the resulting method the CDMA-based Aggregate Perturbation Method (CAPM). We use CAPM to perform congestion control. A distinguishing feature of our congestion control scheme is that it maintains a degree of fairness among different aggregates. In the next step we modify CAPM to offer methods for estimating the proportion of an aggregate of TCP traffic that does not conform to protocol specifications, and hence may belong to a DDoS attack. Our methods work by intentionally perturbing the aggregate, dropping a very small number of packets from it and observing the response of the aggregate. We offer two methods for conformance testing. In the first method, we apply the perturbation tests to SYN packets being sent at the start of the TCP three-way handshake, and use the fact that the rate of ACK packets exchanged in the handshake should follow the rate of perturbations. In the second method, we apply the perturbation tests to TCP data packets and use the fact that the rate of retransmitted data packets should follow the rate of perturbations. In both methods, we use signature-based perturbations, meaning that packet drops are performed at a rate given by a function of time. We use the analogy of our problem with multiple-access communication to find signatures. Specifically, we assign orthogonal CDMA-based signatures to different routers in a distributed implementation of our methods. As a result of orthogonality, performance does not degrade because of cross-interference among simultaneously testing routers. We show the efficacy of our methods through mathematical analysis and extensive simulation experiments.
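
Two parts of the above admit compact illustrations. First, the vector-field formulation can be glossed as a constrained minimization (the notation is my own, not the thesis's):

```latex
\min_{\mathbf{F}} \int_{A} \frac{\lvert \mathbf{F}(x) \rvert^{2}}{\epsilon(x)} \, dA
\qquad \text{subject to} \qquad \nabla \cdot \mathbf{F} = \rho,
```

where rho is positive at sensors (sources) and negative at destinations (sinks), and epsilon plays the role of permittivity. The first-order optimality condition makes F/epsilon a gradient, F = epsilon * grad(u), so the optimal flow solves a Poisson-type equation div(epsilon * grad(u)) = rho, mirroring electrostatics; the multiple-destination result quoted above says the potential u takes equal values at all destinations under the optimal load assignment.

Second, the signature-based perturbation behind CAPM can be sketched in a few lines (a toy of mine: the Walsh-Hadamard chips, rates and noise level are invented, and the real method perturbs live router queues):

```python
import numpy as np

def walsh(k):
    """Sylvester construction: 2**k mutually orthogonal +/-1 signatures."""
    H = np.array([[1.0]])
    for _ in range(k):
        H = np.block([[H, H], [H, -H]])
    return H

rng = np.random.default_rng(0)
H = walsh(3)                       # 8 orthogonal chip sequences
sig = H[2]                         # this router's signature
drops = 0.01 * (1 + 0.5 * sig)     # drop probability modulated by the signature

# A conformant (TCP-like) aggregate slows down when drops rise; model its
# per-slot rate change as proportional to the drop probability, plus noise.
response = -50.0 * drops + rng.normal(0, 0.1, size=sig.size)

print(response @ sig / sig.size)   # clearly negative: aggregate is responsive
print(response @ H[5] / sig.size)  # ~0: another router's test despreads to noise
```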

Relevance:

10.00%

Publisher:

Abstract:

This dissertation consists of three essays on behavioral economics, with a general aim of enriching our understanding of economic decisions using behavioral insights and experimental methodology. Each essay takes on one particular topic with this general aim.

The first chapter studies savings behavior of the poor. In this project, partnering with a savings product provider in Kenya, we tested the extent to which behavioral interventions and financial incentives can increase the saving rate of individuals with low and irregular income. Our experiment lasted for six months and included a total of twelve conditions. The control condition received weekly reminders and balance reporting via text messages. The treatment conditions received, in addition, one of the following interventions: (1) reminder text messages framed as if they came from the participant's kid; (2) a golden-colored coin with numbers for each week of the trial, on which participants were asked to keep track of their weekly deposits; (3) a match of weekly savings: the match was either 10% or 20% up to a certain amount per week, and was either deposited at the end of each week, or the highest possible match was deposited at the start of each week and adjusted at the end. Among these interventions, by far the most effective was the coin: those in the coin condition saved on average the highest amount, more than twice as much as those in the control condition. We hypothesize that, being a tangible track-keeping object, the coin made subjects remember to save more often. Our results support the line of literature suggesting that saving decisions involve psychological aspects, and that policy makers and product designers should take these influences into account.

The second chapter is related to views towards inequality. In this project, we investigate how the perceived fairness of income distributions depends on beliefs about the process that generated the inequality. Specifically, we examine how two crucial features of this process affect fairness views: (1) procedural justice - equal treatment of all; and (2) agency - one's ability to determine one's income. We do this in a lab experiment by varying the equality of opportunity (procedural justice) and one's ability to make choices that consequently influence one's income (agency). We then elicit ex-post redistribution decisions of the earnings as a function of these two elements. Our results suggest that both agency and procedural justice matter for fairness. Our main findings can be summarized as follows: (1) highlighting the importance of agency, we find that inequality resulting from risk is considered fair only when the risk is chosen freely; (2) highlighting the importance of procedural justice, we find that introducing inequality of opportunity significantly increases redistribution; however, the share of subjects redistributing nothing remains close to the share of subjects redistributing fully, revealing an underlying heterogeneity in the population about how fairness views should account for inequality of opportunity.

The third chapter is on morality. In this project, we study whether religious rituals act as an internal reminder of basic moral principles and thus affect moral judgments. To this end, we conducted two survey experiments in Turkey and Israel to specifically test the effects of Ramadan and Yom Kippur. The results from the Turkish sample show that Ramadan has, to some extent, a significant effect on the moral judgments of those who report believing in God: these believers judged the moral acceptability of ten out of sixty-one actions significantly differently during Ramadan, whereas those who reported not believing in God significantly changed their judgments for only one action. Our results extend the hypothesis, established in lab experiments, that religious reminders have a significant effect on morality, by testing it in the field in the natural environment of religious rituals.

This thesis is part of a broader collaborative research agenda with both colleagues and advisors. The programming, analyses, and writing, as well as any errors in this work, are my own.

Relevance:

10.00%

Publisher:

Abstract:

Animals communicating via scent often deposit composite signals that incorporate odorants from multiple sources; however, the function of mixing chemical signals remains understudied. We tested both a 'multiple-messages' and a 'fixative' hypothesis of composite olfactory signalling, which, respectively, posit that mixing scents functions to increase information content or to prolong signal longevity. Our subjects, adult male ring-tailed lemurs (Lemur catta), have a complex scent-marking repertoire, involving volatile antebrachial (A) secretions, deposited pure or after being mixed with a squalene-rich paste exuded from brachial (B) glands. Using behavioural bioassays, we examined recipient responses to odorants collected from conspecific strangers. We concurrently presented pure A, pure B and mixed A + B secretions, in fresh or decayed conditions. Lemurs preferentially responded to mixed over pure secretions, their interest increasing and shifting over time, from sniffing and countermarking fresh mixtures, to licking and countermarking decayed mixtures. Substituting synthetic squalene (S), a well-known fixative, for B secretions did not replicate prior results: B secretions, which contain additional chemicals that probably encode salient information, were preferred over pure S. Whereas support for the 'multiple-messages' hypothesis underscores the unique contribution of each of an animal's various secretions, support for the 'fixative' hypothesis highlights the synergistic benefits of composite signals.

Relevance:

10.00%

Publisher:

Abstract:

*Designated as an exemplary master's project for 2015-16*

This paper examines how contemporary literature contributes to the discussion of punitory justice. It uses close analysis of three contemporary novels, Margaret Atwood’s The Heart Goes Last, Hillary Jordan’s When She Woke, and Joyce Carol Oates’s Carthage, to deconstruct different conceptions of punitory justice. This analysis is framed and supported by relevant social science research on the concept of punitivity within criminal justice. Each section examines punitory justice at three levels: macro, where media messages and the predominant social conversation reside; meso, which involves penal policy and judicial process; and micro, which encompasses personal attitudes towards criminal justice. The first two chapters evaluate works by Atwood and Jordan, examining how their dystopian schemas of justice shed light on top-down and bottom-up processes of punitory justice in the real world. The third chapter uses a more realistic novel, Oates’s Carthage, to examine the ontological nature of punitory justice. It explores a variety of factors that give rise to and legitimize punitory justice, both at the personal level and within a broader cultural consensus. This chapter also discusses how both victim and perpetrator can come to stand in as metaphors that both represent and distract from broader social issues. As a whole, analysis of these three novels illuminates how current and common conceptualizations of justice have little to do with the actual act of transgression itself. Instead, justice emerges as a set of specific, conditioned responses to perceived threats, mediated by complex social, cultural, and emotive forces.

Relevance:

10.00%

Publisher:

Abstract:

The Computer Aided Parallelisation Tools (CAPTools) [Ierotheou C, Johnson SP, Cross M, Leggett PF. Computer aided parallelisation tools (CAPTools) - conceptual overview and performance on the parallelisation of structured mesh codes. Parallel Computing 1996;22:163–195] is a set of interactive tools aimed at providing automatic parallelisation of serial FORTRAN Computational Mechanics (CM) programs. CAPTools analyses the user's serial code and then, through stages of array partitioning, mask and communication calculation, generates parallel SPMD (Single Program Multiple Data) message-passing FORTRAN. The parallel code generated by CAPTools contains calls to a collection of routines that form the CAPTools Communications Library (CAPLib). The library provides a portable layer and user-friendly abstraction over the underlying parallel environment. CAPLib contains optimised message-passing routines for data exchange between parallel processes and other utility routines for parallel execution control, initialisation and debugging. By compiling and linking with different implementations of the library, the user is able to run on many different parallel environments. Even with today's parallel systems, the concept of a single version of a parallel application code is more of an aspiration than a reality. However, for CM codes the data-partitioning SPMD paradigm requires a relatively small set of message-passing communication calls. This set can be implemented as an intermediate 'thin layer' library of message-passing calls that enables the parallel code (especially that generated automatically by a parallelisation tool such as CAPTools) to be as generic as possible. CAPLib is just such a 'thin layer' message-passing library that supports parallel CM codes, by mapping generic calls onto machine-specific libraries (such as CRAY SHMEM) and portable general-purpose libraries (such as PVM and MPI). This paper describes CAPLib together with its three perceived advantages over other routes:
- as a high-level abstraction, it is both easy to understand (especially when generated automatically by tools) and to implement by hand, for the CM community (who are not generally parallel computing specialists);
- the one parallel version of the application code is truly generic and portable;
- the parallel application can readily utilise whatever message-passing libraries on a given machine yield optimum performance.
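
The 'thin layer' idea can be shown in miniature. CAPLib itself is a Fortran library and its real interface is not reproduced here; the sketch below transplants the concept to Python over mpi4py (the cap_* names are illustrative), so that swapping the underlying message-passing system touches only this one module:

```python
from mpi4py import MPI

_comm = MPI.COMM_WORLD

def cap_init():
    """Parallel execution control: who am I, and how many of us are there?"""
    return _comm.Get_rank(), _comm.Get_size()

def cap_exchange(array, left, right):
    """A typical CM halo swap as one generic call: send my boundary values
    to each partition neighbour and receive theirs. Neighbour ranks are
    assumed valid here (edge-of-partition handling is omitted)."""
    new_lo = _comm.sendrecv(array[-1], dest=right, source=left)
    new_hi = _comm.sendrecv(array[0], dest=left, source=right)
    return new_lo, new_hi
```

Linking the same generic calls against a PVM- or SHMEM-backed implementation instead of this MPI one is the portability argument in miniature.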

Relevance:

10.00%

Publisher:

Abstract:

Existing election algorithms suffer from limited scalability. This limit stems from their communication design, which in turn stems from their fundamentally two-state behaviour. This paper presents a new election algorithm specifically designed to be highly scalable in broadcast networks whilst allowing any processing node to become coordinator with initially equal probability. To achieve this, careful attention has been paid to the communication design, and an additional state has been introduced. The design of the tri-state election algorithm has been motivated by the requirements analysis of a major research project to deliver robust, scalable distributed applications, including load sharing, in hostile computing environments in which it is common for processing nodes to be rebooted frequently without notice. The new election algorithm is based in part on a simple 'emergent' design. The science of emergence is of great relevance to developers of distributed applications because it describes how higher-level self-regulatory behaviour can arise from many participants following a small set of simple rules. The tri-state election algorithm is shown to have very low communication complexity, with the number of messages generated remaining loosely bounded regardless of scale for large systems; to be highly scalable, because nodes in the idle state do not transmit any messages; and, because of its self-organising characteristics, to be very stable.
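
The shape of such an algorithm can be sketched as a small state machine (my illustration of the design space described above; the paper's actual rules, timers and message formats are not reproduced). The property to notice is that IDLE nodes transmit nothing, so message load does not grow with the number of quiescent nodes:

```python
import enum
import random

class State(enum.Enum):
    IDLE = 0          # silent follower: transmits no messages at all
    CANDIDATE = 1     # contends for coordinatorship after a random backoff
    COORDINATOR = 2   # periodically broadcasts a heartbeat

class Bus:
    """Broadcast medium: every transmitted message reaches every node."""
    def __init__(self):
        self.nodes = []
    def broadcast(self, msg, sender):
        for n in self.nodes:
            n.on_hear(msg, sender)

class Node:
    def __init__(self, bus):
        self.state = State.IDLE
        self.bus = bus
        self.backoff = None
        bus.nodes.append(self)

    def on_heartbeat_timeout(self):
        # No coordinator heard for a while: contend, with initially equal
        # probability for every node, after a randomized delay.
        self.state = State.CANDIDATE
        self.backoff = random.random()

    def on_backoff_expired(self):
        if self.state is State.CANDIDATE:
            self.state = State.COORDINATOR
            self.bus.broadcast("heartbeat", sender=self)

    def on_hear(self, msg, sender):
        if msg == "heartbeat" and sender is not self:
            self.state = State.IDLE   # someone else won; fall silent
```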

Relevance:

10.00%

Publisher:

Abstract:

Many pieces of legislation have been implemented with the anticipation - or justification - that they will have a deterrent effect. Deterrence was clearly argued in the debate preceding the Swedish prostitution law prohibiting the purchase of sexual services, but less so regarding the Dangerous Dogs Act, which was a very rapid response to a particular moral panic. As it turned out, the Swedish law has had a deterrent effect on street prostitution in that 'respectable' buyers were deterred. It will be argued that it is this very 'respectability' that makes deterrence work in this case. Regarding the Dangerous Dogs Act, the owners of Pit Bulls and other banned breeds are not considered 'respectable', and the ban might have had the reverse effect - increasing the attraction of these dogs rather than deterring ownership. Apart from deterrence and its consequences, the rendering invisible of key actors - buyers and owners respectively - and the use of symbolic legislation to promote moral messages will also be considered. [From the Author]

Relevance:

10.00%

Publisher:

Abstract:

Radio advertising is suffering a remarkable crisis of creativity, as it has not yet found its role in a radio model based on voice locution and information genres. This article suggests the need to implement a peripheral or heuristic strategy to attract and hold listeners' attention. Within this framework, narration and scene representation are proposed as suitable persuasion techniques. The objective is to design a useful conceptual tool for the efficient creative conception of narration in the service of a given commercial strategy. First, the concept of narrative persuasion is grounded in the possibilities of the sound code. Second, the keys to scene representation and commercial strategy (brand, product, advantage, benefit and target) within the sound message are presented. And third, these keys are articulated into a model, which is pre-tested by analyzing eight radio ads as case studies.