972 results for cache consistency


Relevance:

100.00%

Publisher:

Abstract:

With web caching and cache-related services like CDNs and edge services playing an increasingly significant role in the modern internet, the problem of the weak consistency and coherence provisions in current web protocols is growing in importance and drawing the attention of the standards community [LCD01]. Toward this end, we present definitions of consistency and coherence for web-like environments, that is, distributed client-server information systems where the semantics of interactions with resources are more general than the read/write operations found in memory hierarchies and distributed file systems. We then present a brief review of proposed mechanisms which strengthen the consistency of caches in the web, focusing on their conceptual contributions and their weaknesses in real-world practice. These insights motivate a new mechanism, which we call "Basis Token Consistency" (BTC); when implemented at the server, this mechanism allows any client (independent of the presence and conformity of any intermediaries) to maintain a self-consistent view of the server's state. This is accomplished by annotating responses with additional per-resource application information which allows client caches to recognize the obsolescence of currently cached entities and to identify responses from other caches that are stale in light of what has already been seen. The mechanism requires no deviation from the existing client-server communication model and does not require servers to maintain any additional per-client state. We discuss how our mechanism could be integrated into a fragment-assembling Content Management System (CMS), and present a simulation-driven performance comparison between the BTC algorithm and the Time-To-Live (TTL) heuristic.
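The abstract describes BTC only at a high level, so the following sketch is an illustrative reconstruction, not the protocol as specified: it assumes the server tags each response with per-token version numbers, which lets a client cache both reject responses that are already stale and drop cached entries that newer responses have obsoleted.

```python
# Hypothetical sketch of client-side Basis Token Consistency bookkeeping.
# The token/version response format is an assumption for illustration.

class BTCClientCache:
    def __init__(self):
        self.entries = {}   # url -> (body, {token: version})
        self.latest = {}    # highest version observed per basis token

    def store(self, url, body, tokens):
        """Cache a response annotated with per-token version numbers."""
        # A response is already stale if any token is older than a version
        # this client has previously observed (e.g. from another cache).
        if any(self.latest.get(t, -1) > v for t, v in tokens.items()):
            return False
        # Observing a newer version obsoletes cached entries sharing a token.
        for t, v in tokens.items():
            if v > self.latest.get(t, -1):
                self.latest[t] = v
                self.entries = {
                    u: (b, tk) for u, (b, tk) in self.entries.items()
                    if tk.get(t, v) >= v
                }
        self.entries[url] = (body, tokens)
        return True

cache = BTCClientCache()
cache.store("/a", "old-a", {"inventory": 1})
cache.store("/b", "b", {"inventory": 2})   # obsoletes /a, which shares the token
assert "/a" not in cache.entries
assert not cache.store("/c", "c", {"inventory": 1})   # stale intermediary copy
```

The key property is that no per-client server state is needed: the client reasons purely from the annotations it has already seen.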

Relevance:

60.00%

Publisher:

Abstract:

Formal correctness of complex multi-party network protocols can be difficult to verify. While models of specific fixed compositions of agents can be checked against design constraints, protocols which lend themselves to arbitrarily many compositions of agents, such as the chaining of proxies or the peering of routers, are more difficult to verify because they represent potentially infinite state spaces and may exhibit emergent behaviors which may not materialize under particular fixed compositions. We address this challenge by developing an algebraic approach that enables us to reduce arbitrary compositions of network agents into a behaviorally-equivalent (with respect to some correctness property) compact, canonical representation, which is amenable to mechanical verification. Our approach consists of an algebra and a set of property-preserving rewrite rules for the Canonical Homomorphic Abstraction of Infinite Network protocol compositions (CHAIN). Using CHAIN, an expression over our algebra (i.e., a set of configurations of network protocol agents) can be reduced to another behaviorally-equivalent expression (i.e., a smaller set of configurations). Repeated application of such rewrite rules produces a canonical expression which can be checked mechanically. We demonstrate our approach by characterizing deadlock-prone configurations of HTTP agents, as well as by establishing useful properties of an overlay protocol for scheduling MPEG frames and of a protocol for Web intra-cache consistency.
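The rewrite-to-canonical-form idea can be illustrated with a toy reduction. The rules below are invented for the sketch (CHAIN's actual rules are property-preserving with respect to a stated correctness property); the point is only the mechanism: rules are applied until a fixed point, yielding a short canonical chain a model checker could examine.

```python
# Toy illustration of the CHAIN approach: rewrite rules collapse agent
# compositions into a shorter, behaviourally-equivalent canonical chain.
# The rules here are invented assumptions, not CHAIN's actual rule set.

RULES = {
    ("proxy", "proxy"): ("proxy",),   # assumed: chaining proxies adds no behaviour
    ("cache", "cache"): ("cache",),   # assumed equivalence, for illustration
}

def canonicalize(chain):
    """Apply rewrite rules until a fixed point (the canonical form) is reached."""
    chain = list(chain)
    changed = True
    while changed:
        changed = False
        for i in range(len(chain) - 1):
            pair = (chain[i], chain[i + 1])
            if pair in RULES:
                chain[i:i + 2] = RULES[pair]
                changed = True
                break
    return tuple(chain)

# An arbitrarily long proxy chain reduces to one canonical configuration:
assert canonicalize(["client", "proxy", "proxy", "proxy", "server"]) == \
       ("client", "proxy", "server")
```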

Relevance:

60.00%

Publisher:

Abstract:

Formal tools like finite-state model checkers have proven useful in verifying the correctness of systems of bounded size and for hardening single system components against arbitrary inputs. However, conventional applications of these techniques are not well suited to characterizing emergent behaviors of large compositions of processes. In this paper, we present a methodology by which arbitrarily large compositions of components can, if sufficient conditions are proven concerning properties of small compositions, be modeled and completely verified by performing formal verifications upon only a finite set of compositions. The sufficient conditions take the form of reductions, which are claims that particular sequences of components will be causally indistinguishable from other, shorter sequences of components. We show how this methodology can be applied to a variety of network protocol applications, including two features of the HTTP protocol, a simple active networking applet, and a proposed web cache consistency algorithm. We also discuss its applicability to framing protocol design goals and to representing systems which employ non-model-checking verification methodologies. Finally, we briefly discuss how we hope to broaden this methodology to more general topological compositions of network applications.
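The finite-verification step can be made concrete with a toy reduction. Suppose a reduction has been proven stating that a run of three or more identical agents is causally indistinguishable from a run of two; then every longer composition collapses into the finite set of bounded ones, and checking that set suffices. Both the reduction bound and the property checked below are invented for illustration.

```python
# Sketch of verification-by-reduction: collapse runs of identical agents
# above a proven bound, then check the property only on reduced chains.
# The bound and the toy "check" predicate are assumptions for the sketch.

def reduce_chain(chain, bound=2):
    """Collapse runs of identical agents longer than `bound` (the reduction)."""
    out = []
    for agent in chain:
        out.append(agent)
        if out[-bound - 1:] == [agent] * (bound + 1):
            out.pop()                 # run already at the proven bound
    return out

def holds_for_all(chains, check):
    """Verify `check` on the finite set of reduced (canonical) compositions."""
    return all(check(reduce_chain(c)) for c in chains)

# Any number of chained proxies reduces to at most two:
assert reduce_chain(["c", "p", "p", "p", "p", "s"]) == ["c", "p", "p", "s"]
```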

Relevance:

60.00%

Publisher:

Abstract:

Data caching can remarkably improve the efficiency of information access in a wireless ad hoc network by reducing access latency and bandwidth usage. The cache placement problem seeks to minimize the total data access cost in ad hoc networks with multiple data items. Ad hoc networks are multi-hop networks without a central base station and are resource-constrained in terms of channel bandwidth and battery power. Data caching reduces communication cost in terms of both bandwidth and battery energy. Since each network node has limited memory, cache placement is a vital issue. This paper studies the existing cooperative caching techniques and their suitability for mobile ad hoc networks.

Relevance:

30.00%

Publisher:

Abstract:

Cooperative caching in mobile ad hoc networks aims at improving the efficiency of information access by reducing access latency and bandwidth usage. The cache replacement policy plays a vital role in improving the performance of a cache in a mobile node, since the node has limited memory. In this paper we propose a new key-based cache replacement policy, called E-LRU, for cooperative caching in ad hoc networks. The proposed replacement scheme considers the time interval between recent references, size, and consistency as the key factors for replacement. A simulation study shows that the proposed replacement policy can significantly improve cache performance in terms of cache hit ratio and query delay.
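The abstract names the replacement factors (inter-reference interval, size, consistency) but not a formula, so the scoring below is an illustrative assumption rather than the authors' actual E-LRU policy: an entry that is cold, large, and close to expiry is the preferred eviction victim.

```python
import time

# Sketch of an E-LRU-style replacement decision. The weighting of the three
# factors is an invented assumption; only the factor list comes from the text.

class ELRUCache:
    def __init__(self, capacity):
        self.capacity = capacity       # total size budget
        self.used = 0
        self.items = {}                # key -> [size, ttl, access-time list]

    def _score(self, key, now):
        size, ttl, refs = self.items[key]
        recency = now - refs[-1]                         # time since last use
        inter = refs[-1] - refs[-2] if len(refs) > 1 else recency
        remaining = max(ttl - (now - refs[0]), 0.0)      # consistency factor
        # Higher score = better eviction victim: cold, large, nearly expired.
        return (recency + inter) * size / (1.0 + remaining)

    def access(self, key, size=1, ttl=60.0, now=None):
        now = time.monotonic() if now is None else now
        if key in self.items:
            self.items[key][2].append(now)
            return
        while self.used + size > self.capacity and self.items:
            victim = max(self.items, key=lambda k: self._score(k, now))
            self.used -= self.items.pop(victim)[0]
        self.items[key] = [size, ttl, [now]]
        self.used += size

c = ELRUCache(capacity=2)
c.access("a", now=0.0)
c.access("b", now=1.0)
c.access("a", now=2.0)     # "a" re-referenced; "b" is now the colder entry
c.access("c", now=3.0)     # cache full: "b" is evicted
assert set(c.items) == {"a", "c"}
```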

Relevance:

30.00%

Publisher:

Abstract:

Cooperative caching is used in mobile ad hoc networks to reduce the latency perceived by mobile clients while retrieving data and to reduce the traffic load in the network. Caching also increases the availability of data in the presence of server disconnections. The implementation of a cooperative caching technique essentially involves four major design considerations: (i) cache placement and resolution, which decides where to place and how to locate the cached data; (ii) cache admission control, which decides which data to cache; (iii) cache replacement, which makes the replacement decision when the cache is full; and (iv) consistency maintenance, i.e. maintaining consistency between the data in the server and the cache. In this paper we propose an effective cache resolution technique which reduces the number of messages flooded into the network to find the requested data. The experimental results are promising with respect to the metrics studied.
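The resolution step (design consideration (i)) can be illustrated with a simple message count: a node checks its own cache, then queries one-hop neighbours, and only on a miss everywhere nearby sends the request over the multi-hop path to the server. The topology and the counting convention are assumptions for the sketch, not the paper's technique.

```python
# Illustrative cache resolution for a cooperative cache, counting the
# messages spent locating an item. Counts and topology are assumptions.

def resolve(item, local, neighbours, hops_to_server):
    """Return (where the item was found, messages spent finding it)."""
    if item in local:
        return "local", 0
    msgs = len(neighbours)                        # one query per neighbour
    for cache in neighbours.values():
        if item in cache:
            return "neighbour", msgs + 1          # plus one reply
    # Miss everywhere nearby: request and reply traverse every hop.
    return "server", msgs + 2 * hops_to_server

local, neighbours = {"x"}, {"n1": {"y"}, "n2": set()}
assert resolve("x", local, neighbours, 4) == ("local", 0)
assert resolve("y", local, neighbours, 4) == ("neighbour", 3)
assert resolve("z", local, neighbours, 4) == ("server", 10)
```

A resolution scheme earns its keep exactly when the neighbour-hit case is common enough that its small extra cost beats the full round trip to the server.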

Relevance:

20.00%

Publisher:

Abstract:

The high degree of variability and inconsistency in cash flow study usage by property professionals demands improvement in knowledge and processes. Until recently limited research was undertaken on the use of cash flow studies in property valuations, but the growing acceptance of this approach for major investment valuations has resulted in renewed interest in the topic. Studies on valuation variations identify data accuracy, model consistency and bias as major concerns. In cash flow studies there are practical problems with the input data and the consistency of the models. This study refers to the recent literature and identifies the major factors in model inconsistency and data selection. A detailed case study is used to examine the effects of changes in structure and inputs. The key variable inputs are identified and proposals developed to improve the selection process for these key variables. The variables are selected with the aid of sensitivity studies, and alternative ways of quantifying the key variables are explained. The paper recommends, with reservations, the use of probability profiles of the variables and the incorporation of this data in simulation exercises. The use of Monte Carlo simulation is demonstrated, and the factors influencing the structure of the probability distributions of the key variables are outlined. This study relates to ongoing research into the functional performance of commercial property within an Australian Cooperative Research Centre.
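The recommended workflow, a probability profile on a key input propagated through the cash flow model by Monte Carlo simulation, can be sketched in a few lines. All figures (the rent, the triangular growth profile, the discount rate) are illustrative assumptions, not data from the study.

```python
import random

# Minimal Monte Carlo sketch: a key variable (here, a hypothetical annual
# rental growth rate) is drawn from a probability profile and propagated
# through a discounted cash flow. All numbers are invented for illustration.

def dcf_value(rent, growth, discount=0.08, years=10):
    """Present value of a growing annual rent stream."""
    return sum(rent * (1 + growth) ** t / (1 + discount) ** (t + 1)
               for t in range(years))

def simulate(trials=10_000, seed=42):
    rng = random.Random(seed)
    values = []
    for _ in range(trials):
        growth = rng.triangular(0.00, 0.06, 0.03)   # low, high, mode
        values.append(dcf_value(rent=100_000, growth=growth))
    values.sort()
    return {"p05": values[trials // 20],
            "median": values[trials // 2],
            "p95": values[19 * trials // 20]}

profile = simulate()
assert 0 < profile["p05"] < profile["median"] < profile["p95"]
```

The output is a distribution of values rather than a single point estimate, which is what makes the sensitivity of the valuation to the key variable visible.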

Relevance:

20.00%

Publisher:

Abstract:

Quantitative behaviour analysis requires the classification of behaviour to produce the basic data. In practice, much of this work will be performed by multiple observers, and maximising inter-observer consistency is of particular importance. Another discipline where consistency in classification is vital is biological taxonomy. A classification tool of great utility, the binary key, is designed to simplify the classification decision process and ensure consistent identification of proper categories. We show how this same decision-making tool - the binary key - can be used to promote consistency in the classification of behaviour. The construction of a binary key also ensures that the categories in which behaviour is classified are complete and non-overlapping. We discuss the general principles of design of binary keys, and illustrate their construction and use with a practical example from education research.
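A binary key is simply a tree of yes/no questions whose leaves are the categories, which is why classifications are complete and non-overlapping by construction: every observation follows exactly one path. The questions and category labels below are invented for illustration.

```python
# Toy binary key for classifying observed behaviour. Each internal node asks
# one yes/no question; each leaf is a category, so every observation lands in
# exactly one category. Questions and labels are invented for the sketch.

KEY = ("is the pupil speaking?",
       ("is the speech addressed to the teacher?",
        "answering", "peer talk"),
       ("is the pupil looking at the task materials?",
        "on-task (silent)", "off-task"))

def classify(observation, node=KEY):
    if isinstance(node, str):              # leaf: a category label
        return node
    question, yes_branch, no_branch = node
    return classify(observation,
                    yes_branch if observation[question] else no_branch)

obs = {"is the pupil speaking?": False,
       "is the pupil looking at the task materials?": True}
assert classify(obs) == "on-task (silent)"
```

Because two observers answering the same questions must reach the same leaf, inter-observer consistency reduces to agreement on each individual yes/no judgment.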

Relevance:

20.00%

Publisher:

Abstract:

This publication is the culmination of a 2 year Australian Learning and Teaching Council's Project Priority Programs Research Grant which investigates key issues and challenges in developing flexible guidelines for best practice in Australian Doctoral and Masters by Research Examination, encompassing the two modes of investigation, written and multi-modal (practice-led/based) theses, their distinctiveness and their potential interplay. The aims of the project were to address issues of assessment legitimacy raised by the entry of practice-orientated dance studies into Australian higher degrees; examine literal embodiment and presence, as opposed to cultural studies about states of embodiment; foreground the validity of questions around subjectivity and corporeal intelligence/s and the reliability of artistic/aesthetic communications; and finally to celebrate ‘performance mastery’ (Melrose 2003) as a rigorous and legitimate mode of higher research. The project began with questions which centred around: the functions of higher degree dance research; concepts of ‘master-ness’ and ‘doctorateness’; the kinds of languages, structures and processes which may guide candidates, supervisors, examiners and research personnel; the purpose of evaluation/examination; and addressing positive and negative attributes of examination. Finally the study examined ways in which academic/professional, writing/dancing, tradition/creation and diversity/consistency relationships might be fostered to embrace change. Over two years, the authors undertook a qualitative national study encompassing a triangulation of semi-structured face-to-face interviews and industry forums to gather views from the profession, together with an analysis of existing guidelines and recent literature in the field.
The most significant primary data emerged from 74 qualitative interviews with supervisors, examiners, research deans and administrators, and candidates in dance and more broadly across the creative arts. Qualitative data gathered from the two primary sources was coded and analysed using the NVivo software program. Further perspectives were drawn from international consultant and dance researcher Susan Melrose, as well as from publications in the field and initial feedback on a draft document circulated at the World Dance Alliance Global Summit in July 2008 in Brisbane. Refinement of data occurred in a continual sifting process until the final publication was produced. This process resulted in a set of guidelines in the form of a complex dynamic system for both product- and process-oriented outcomes of multi-modal theses, along with short position papers on issues which arose from the research, such as contested definitions, embodiment and ephemerality, ‘liveness’ in performance research higher degrees, dissolving theory/practice binaries, the relationship between academe and industry, documenting practices, and a re-consideration of the viva voce.

Relevance:

20.00%

Publisher:

Abstract:

This Open Forum examines research on case management that draws on consumer perspectives. It clarifies the extent of consumer involvement and whether evaluations were informed by recovery perspectives. Searches of three databases revealed 13 studies that sought to investigate consumer perspectives. Only one study asked consumers about experiences of recovery. Most evaluations did not adequately assess consumers' views, and active consumer participation in research was rare. Supporting an individual's recovery requires commitment to a recovery paradigm that incorporates traditional symptom reduction and improved functioning with broader recovery principles, and a shift in focus from illness to wellbeing. It also requires greater involvement of consumers in the implementation of case management and ownership of their own recovery process, not just in research that evaluates the practice.

Relevance:

20.00%

Publisher:

Abstract:

One of the major challenges facing a present-day game development company is the removal of bugs from the complex virtual environments it builds. This work presents an approach for measuring the correctness of synthetic scenes generated by the rendering system of a 3D application, such as a computer game. Our approach builds a database of labelled point clouds representing the spatiotemporal colour distribution of the objects present in a sequence of bug-free frames. This is done by converting the positions that the pixels take over time into the equivalent 3D points with associated colours. Once the space of labelled points is built, each new image produced from the same game by any rendering system can be analysed by measuring its visual inconsistency in terms of distance from the database. Objects within the scene can be relocated (manually or by the application engine); yet the algorithm is able to perform the image analysis in terms of the 3D structure and colour distribution of samples on the surface of the object. We applied our framework to the publicly available game RacingGame developed for Microsoft(R) XNA(R). Preliminary results show how this approach can be used to detect a variety of visual artifacts generated by the rendering system in a professional-quality game engine.
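The core measurement can be reduced to a nearest-neighbour distance in a joint position-colour space. The sketch below is a deliberately simplified reconstruction (brute-force search, an ad hoc Euclidean metric over unweighted position and colour coordinates); the paper's actual database structure and metric are not specified in the abstract.

```python
import math

# Simplified sketch of the described check: samples from a new frame are
# scored by their distance to the nearest point in a database built from
# bug-free frames. Metric and sample format are illustrative assumptions.

def dist(p, q):
    """Euclidean distance over (x, y, z, r, g, b) samples."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def inconsistency(frame_samples, database):
    """Mean nearest-neighbour distance of a frame against the clean database."""
    return sum(min(dist(s, p) for p in database)
               for s in frame_samples) / len(frame_samples)

clean = [(0, 0, 0, 1.0, 0.0, 0.0), (1, 0, 0, 1.0, 0.0, 0.0)]
good_frame = [(0.0, 0.0, 0.0, 1.0, 0.0, 0.0)]
glitched = [(0.0, 0.0, 0.0, 0.0, 1.0, 0.0)]   # wrong colour: a visual artifact
assert inconsistency(good_frame, clean) < inconsistency(glitched, clean)
```

Because the comparison happens in 3D object space rather than screen space, relocating an object does not by itself raise the score, which matches the relocation-tolerance claim in the abstract.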

Relevance:

20.00%

Publisher:

Abstract:

This paper investigates the complex interactions that occur as teachers meet online to justify and negotiate their assessment judgments of student work across relatively large and geographically dispersed populations. Drawing from sociocultural theories of learning and technology, the technology is positioned as playing a role in either supporting or hindering teachers reaching a common understanding of assessment standards. Meeting transcripts and interviews with the teachers have been qualitatively analysed in terms of the interactions that occurred and teachers’ perceptions of these interactions. While online meetings offer a partial solution to address the current demands of assessment in education, they also present new challenges as teachers meet, in an unfamiliar environment, to discuss student work.