950 results for Turner, Bradley


Relevance: 10.00%

Abstract:

Urquhart, C., Turner, J., Durbin, J. & Ryan, J. (2006). Evaluating the contribution of the clinical librarian to a multidisciplinary team. Library and Information Research, 30(94), 30-43. Sponsorship: NHS Trusts in North Wales

Relevance: 10.00%

Abstract:

Urquhart, C., Spink, S., Thomas, R., Yeoman, A., Durbin, J., Turner, J., Fenton, R. & Armstrong, C. (2004). Evaluating the development of virtual learning environments in higher and further education. In J. Cook (Ed.), Blue skies and pragmatism: learning technologies for the next decade. Research proceedings of the 11th Association for Learning Technology conference (ALT-C 2004), 14-16 September 2004, University of Exeter, Devon, England (pp. 157-169). Oxford: Association for Learning Technology Sponsorship: JISC

Relevance: 10.00%

Abstract:

Urquhart, C., Spink, S., Thomas, R., Yeoman, A., Durbin, J., Turner, J., Fenton, R. & Armstrong, C. (2004). JUSTEIS: JISC Usage Surveys: Trends in Electronic Information Services Final report 2003/2004 Cycle Five. Aberystwyth: Department of Information Studies, University of Wales Aberystwyth. Sponsorship: JISC

Relevance: 10.00%

Abstract:

Urquhart, C. (editor for JUSTEIS team), Spink, S., Thomas, R., Yeoman, A., Durbin, J., Turner, J., Armstrong, A., Lonsdale, R. & Fenton, R. (2003). JUSTEIS (JISC Usage Surveys: Trends in Electronic Information Services) Strand A: survey of end users of all electronic information services (HE and FE), with Action research report. Final report 2002/2003 Cycle Four. Aberystwyth: Department of Information Studies, University of Wales Aberystwyth with Information Automation Ltd (CIQM). Sponsorship: JISC

Relevance: 10.00%

Abstract:

Urquhart, C., Turner, J., Durbin, J. & Ryan, J. (2007). Changes in information behavior in clinical teams after introduction of a clinical librarian service. Journal of the Medical Library Association, 95(1), 14-22. Available via PubMed Central. Sponsorship: North Wales NHS Trusts

Relevance: 10.00%

Abstract:

Cox, S.J., Bradley, G. and Weaire, D. (2001) Metallic foam processing from the liquid state: the competition between solidification and drainage. Eur. Phys. J. AP 14:87-97. Sponsorship: This research was supported by the Prodex programme of ESA, and is a contribution to ESA contract C14308/AO-075-99. SJC was supported by Enterprise Ireland and a Marie Curie fellowship. GB was supported by the HPC Programme of TCD.

Relevance: 10.00%

Abstract:

This publication takes up a problem fundamental to symbolic anthropology: showing how this school of thought framed the question of symbolic action, and how symbols themselves operate within culture. An equally important issue, bound up with how this current understood these questions, is its contextual conditioning. At stake here are not only the sources of scholarly inspiration but also the less obvious factors that shaped the theoretical positions of Geertz and Turner. Symbolic anthropology formed part of the interpretive paradigm. What is most interesting is precisely the influence of the intellectual climate associated with the interpretive turn, together with the institutional influence of the academic centers mentioned above. The historical and scientific contextualization of particular scholarly currents shapes the theories developed within them to the same degree as do the relations between the positions of individual researchers and their own anthropological programs.

Relevance: 10.00%

Abstract:

Formal correctness of complex multi-party network protocols can be difficult to verify. While models of specific fixed compositions of agents can be checked against design constraints, protocols which lend themselves to arbitrarily many compositions of agents-such as the chaining of proxies or the peering of routers-are more difficult to verify because they represent potentially infinite state spaces and may exhibit emergent behaviors which may not materialize under particular fixed compositions. We address this challenge by developing an algebraic approach that enables us to reduce arbitrary compositions of network agents into a behaviorally-equivalent (with respect to some correctness property) compact, canonical representation, which is amenable to mechanical verification. Our approach consists of an algebra and a set of property-preserving rewrite rules for the Canonical Homomorphic Abstraction of Infinite Network protocol compositions (CHAIN). Using CHAIN, an expression over our algebra (i.e., a set of configurations of network protocol agents) can be reduced to another behaviorally-equivalent expression (i.e., a smaller set of configurations). Repeated application of such rewrite rules produces a canonical expression which can be checked mechanically. We demonstrate our approach by characterizing deadlock-prone configurations of HTTP agents, as well as establishing useful properties of an overlay protocol for scheduling MPEG frames, and of a protocol for Web intra-cache consistency.
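The reduction-to-canonical-form idea can be sketched in a few lines. The rewrite rules below are invented for illustration (CHAIN's actual rules are derived per correctness property); the point is only the mechanism of repeatedly rewriting a composition until it reaches a small fixed point that can then be model-checked directly:

```python
# Sketch: reduce a composition of network agents to a canonical form by
# repeatedly applying property-preserving rewrite rules. The rules below
# are hypothetical; CHAIN derives its rules per correctness property.

REWRITE_RULES = [
    # Two adjacent proxies behave (w.r.t. the checked property) like one.
    (("proxy", "proxy"), ("proxy",)),
    # A cache in front of a proxy collapses to a single caching proxy.
    (("cache", "proxy"), ("caching_proxy",)),
]

def reduce_composition(agents):
    """Apply rewrite rules until a fixed point (canonical form) is reached."""
    agents = tuple(agents)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in REWRITE_RULES:
            for i in range(len(agents) - len(lhs) + 1):
                if agents[i:i + len(lhs)] == lhs:
                    agents = agents[:i] + rhs + agents[i + len(lhs):]
                    changed = True
                    break
            if changed:
                break
    return agents

# An arbitrarily long proxy chain canonicalizes to a single proxy, so only
# the small canonical configuration needs mechanical verification.
chain = ("client",) + ("proxy",) * 10 + ("server",)
print(reduce_composition(chain))  # ('client', 'proxy', 'server')
```

Because each rule is claimed to preserve the property of interest, any verdict obtained on the canonical form transfers back to the original, arbitrarily long composition.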

Relevance: 10.00%

Abstract:

We present a type system, StaXML, which employs the stacked type syntax to represent essential aspects of the potential roles of XML fragments in the structure of complete XML documents. The simplest application of this system is to enforce well-formedness upon the construction of XML documents without requiring the use of templates or balanced "gap plugging" operators; this allows it to be applied to programs written according to common imperative web scripting idioms, particularly the echoing of unbalanced XML fragments to an output buffer. The system can be extended to verify particular XML applications such as XHTML, and to identify individual XML tags constructed from their lexical components. We also present StaXML for PHP, a prototype precompiler for the PHP4 scripting language which infers StaXML types for expressions without assistance from the programmer.
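The core intuition can be approximated, purely as an illustrative sketch and not as StaXML itself, by giving each (possibly unbalanced) fragment a "stack effect": the tags it expects the surrounding context to have opened, and the tags it leaves open for later fragments to close. Echoing fragments in sequence composes their effects, and a well-formed document is one whose program composes to the empty effect:

```python
import re

# Sketch: assign each (possibly unbalanced) XML fragment a "stack effect"
# (closes, opens) and compose effects, rejecting mismatched tags. This is
# an illustrative approximation of the stacked-type idea, not StaXML.

TAG = re.compile(r"<(/?)([a-zA-Z][\w-]*)[^>]*?>")

def effect(fragment):
    """Return (closes, opens): tags this fragment expects the context to
    have opened, and tags it leaves open for later fragments to close."""
    closes, opens = [], []
    for m in TAG.finditer(fragment):
        is_close, name = m.group(1) == "/", m.group(2)
        if not is_close:
            opens.append(name)
        elif opens:
            if opens.pop() != name:
                raise TypeError(f"mismatched </{name}>")
        else:
            closes.append(name)
    return closes, opens

def compose(e1, e2):
    """Effect of emitting fragment 1 and then fragment 2."""
    closes1, opens1 = e1
    closes2, opens2 = e2
    stack = list(opens1)
    for name in closes2:            # fragment 2 closes what 1 left open
        if not stack:
            closes1 = closes1 + [name]   # propagate demand outward
        elif stack.pop() != name:
            raise TypeError(f"mismatched </{name}>")
    return closes1, stack + opens2

header = effect("<html><body><ul>")      # opens html, body, ul
item   = effect("<li>hi</li>")           # balanced: empty effect
footer = effect("</ul></body></html>")   # closes ul, body, html

doc = compose(compose(header, item), footer)
print(doc)  # ([], []) -- the program as a whole emits a balanced document
```

A precompiler in this spirit would infer such effects statically for each echo site, so unbalanced fragments remain legal as long as the whole script composes to the empty effect.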

Relevance: 10.00%

Abstract:

The Science of Network Service Composition has clearly emerged as one of the grand themes driving many of our research questions in the networking field today [NeXtworking 2003]. This driving force stems from the rise of sophisticated applications and new networking paradigms. By "service composition" we mean that the performance and correctness properties local to the various constituent components of a service can be readily composed into global (end-to-end) properties without re-analyzing any of the constituent components in isolation, or as part of the whole composite service. The set of laws that would govern such composition is what will constitute that new science of composition. The combined heterogeneity and dynamic open nature of network systems makes composition quite challenging, and thus programming network services has been largely inaccessible to the average user. We identify (and outline) a research agenda in which we aim to develop a specification language that is expressive enough to describe different components of a network service, and that will include type hierarchies inspired by type systems in general programming languages that enable the safe composition of software components. We envision this new science of composition to be built upon several theories (e.g., control theory, game theory, network calculus, percolation theory, economics, queuing theory). In essence, different theories may provide different languages by which certain properties of system components can be expressed and composed into larger systems. We then seek to lift these lower-level specifications to a higher level by abstracting away details that are irrelevant for safe composition at the higher level, thus making theories scalable and useful to the average user. In this paper we focus on services built upon an overlay management architecture, and we use control theory and QoS theory as example theories from which we lift up compositional specifications.

Relevance: 10.00%

Abstract:

With web caching and cache-related services like CDNs and edge services playing an increasingly significant role in the modern internet, the problem of the weak consistency and coherence provisions in current web protocols is becoming increasingly significant and drawing the attention of the standards community [LCD01]. Toward this end, we present definitions of consistency and coherence for web-like environments, that is, distributed client-server information systems where the semantics of interactions with resources are more general than the read/write operations found in memory hierarchies and distributed file systems. We then present a brief review of proposed mechanisms which strengthen the consistency of caches in the web, focusing upon their conceptual contributions and their weaknesses in real-world practice. These insights motivate a new mechanism, which we call "Basis Token Consistency" or BTC; when implemented at the server, this mechanism allows any client (independent of the presence and conformity of any intermediaries) to maintain a self-consistent view of the server's state. This is accomplished by annotating responses with additional per-resource application information which allows client caches to recognize the obsolescence of currently cached entities and identify responses from other caches which are already stale in light of what has already been seen. The mechanism requires no deviation from the existing client-server communication model, and does not require servers to maintain any additional per-client state. We discuss how our mechanism could be integrated into a fragment-assembling Content Management System (CMS), and present a simulation-driven performance comparison between the BTC algorithm and the use of the Time-To-Live (TTL) heuristic.
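A simplified rendering of the idea (illustrative only; the names and data shapes below are invented, not the actual BTC wire format): the server annotates each response with version numbers for the underlying data objects it was derived from, and a client cache that later observes a newer version of any such token can recognize, on its own, which cached entities are now obsolete:

```python
# Sketch of a basis-token-style client cache (hypothetical; not the actual
# BTC protocol). Each response carries {token: version} annotations for the
# server-side objects it depends on. A cached entity becomes stale once a
# newer version of any of its tokens has been observed, with no per-client
# state kept at the server.

class TokenCache:
    def __init__(self):
        self.entries = {}   # url -> (body, {token: version})
        self.seen = {}      # highest version observed per token

    def store(self, url, body, tokens):
        self.entries[url] = (body, dict(tokens))
        for t, v in tokens.items():
            self.seen[t] = max(self.seen.get(t, v), v)
        self._evict_stale()

    def _evict_stale(self):
        # any cached entity built on an older token version is obsolete
        stale = [u for u, (_, toks) in self.entries.items()
                 if any(self.seen[t] > v for t, v in toks.items())]
        for u in stale:
            del self.entries[u]

    def get(self, url):
        entry = self.entries.get(url)
        return entry[0] if entry else None

cache = TokenCache()
cache.store("/page", "<old page>", {"article:42": 1, "sidebar": 3})
# A later response -- even one relayed by a non-conforming intermediary --
# reveals that article:42 has advanced to version 2, so /page is stale.
cache.store("/article/42", "<new text>", {"article:42": 2})
print(cache.get("/page"))        # None (evicted as obsolete)
print(cache.get("/article/42"))  # <new text>
```

This also shows why the scheme suits fragment-assembling CMSs: a composed page simply carries the union of its fragments' tokens.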

Relevance: 10.00%

Abstract:

As new multi-party edge services are deployed on the Internet, application-layer protocols with complex communication models and event dependencies are increasingly being specified and adopted. To ensure that such protocols (and compositions thereof with existing protocols) do not result in undesirable behaviors (e.g., livelocks), there needs to be a methodology for the automated checking of the "safety" of these protocols. In this paper, we present ingredients of such a methodology. Specifically, we show how SPIN, a tool from the formal systems verification community, can be used to quickly identify problematic behaviors of application-layer protocols with non-trivial communication models—such as HTTP with the addition of the "100 Continue" mechanism. As a case study, we examine several versions of the specification for the Continue mechanism; our experiments mechanically uncovered multi-version interoperability problems, including some which motivated revisions of HTTP/1.1 and some which persist even with the current version of the protocol. One such problem resembles a classic degradation-of-service attack, but can arise between well-meaning peers. We also discuss how the methods we employ can be used to make explicit the requirements for hardening a protocol's implementation against potentially malicious peers, and for verifying an implementation's interoperability with the full range of allowable peer behaviors.
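SPIN models are written in Promela; as a language-neutral sketch of the kind of check involved (with toy peer models invented here, not taken from the paper), the following exhaustively explores the joint state space of two communicating peers and reports states where both sides are blocked. This is the shape of the well-known 100-Continue hazard: a client waits for "100 Continue" from a server that ignores the Expect header and instead waits for the request body:

```python
from collections import deque

# Sketch (illustrative; a real check would be a Promela model for SPIN):
# explore the joint state space of two toy protocol peers connected by
# FIFO channels and flag non-final states with no enabled transition.
# Each peer: state -> list of (action, next_state); "send:X" is always
# enabled, "recv:X" only when X heads the peer's inbound channel.

CLIENT = {
    "start":     [("send:headers", "await_100")],
    "await_100": [("recv:100-continue", "send_body")],  # may block forever
    "send_body": [("send:body", "done")],
    "done":      [],
}
OLD_SERVER = {  # hypothetical pre-1.1 server: ignores Expect, wants body
    "start":      [("recv:headers", "await_body")],
    "await_body": [("recv:body", "done")],
    "done":       [],
}

def deadlocks(a, b):
    """BFS over joint states; return non-final states with no enabled move."""
    found, seen = [], set()
    queue = deque([("start", "start", (), ())])  # (sa, sb, chan_ab, chan_ba)
    while queue:
        state = queue.popleft()
        if state in seen:
            continue
        seen.add(state)
        sa, sb, ab, ba = state
        succs = []
        for proc, st, inbox, out_idx in ((a, sa, ba, 2), (b, sb, ab, 3)):
            for action, nxt in proc[st]:
                kind, msg = action.split(":")
                ns = list(state)
                if kind == "send":
                    ns[out_idx] = state[out_idx] + (msg,)
                elif inbox and inbox[0] == msg:
                    ns[5 - out_idx] = inbox[1:]       # consume from inbox
                else:
                    continue                          # recv not enabled
                ns[0 if out_idx == 2 else 1] = nxt
                succs.append(tuple(ns))
        if not succs and (sa, sb) != ("done", "done"):
            found.append(state)
        queue.extend(succs)
    return found

for sa, sb, *_ in deadlocks(CLIENT, OLD_SERVER):
    print(f"deadlock: client={sa}, server={sb}")
# deadlock: client=await_100, server=await_body
```

A model checker does the same search over a far richer model (timeouts, message reordering, multiple protocol versions), which is how such interoperability hazards are uncovered mechanically rather than by inspection.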

Relevance: 10.00%

Abstract:

Formal tools like finite-state model checkers have proven useful in verifying the correctness of systems of bounded size and for hardening single system components against arbitrary inputs. However, conventional applications of these techniques are not well suited to characterizing emergent behaviors of large compositions of processes. In this paper, we present a methodology by which arbitrarily large compositions of components can, if sufficient conditions are proven concerning properties of small compositions, be modeled and completely verified by performing formal verifications upon only a finite set of compositions. The sufficient conditions take the form of reductions, which are claims that particular sequences of components will be causally indistinguishable from other shorter sequences of components. We show how this methodology can be applied to a variety of network protocol applications, including two features of the HTTP protocol, a simple active networking applet, and a proposed web cache consistency algorithm. We also discuss its applicability to framing protocol design goals and to representing systems which employ non-model-checking verification methodologies. Finally, we briefly discuss how we hope to broaden this methodology to more general topological compositions of network applications.
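As an illustrative sketch (with an invented component model, not one from the paper), a reduction of the form "two copies of a component are indistinguishable from one" can be checked by brute force on bounded inputs; once it holds, any pipeline of that component collapses to a short canonical one, so verifying a finite set of compositions covers all lengths:

```python
from itertools import product

# Sketch: a "reduction" claims that two adjacent copies of a component are
# observationally indistinguishable from one copy. If the claim holds, an
# arbitrarily long pipeline collapses to a short canonical pipeline, so a
# finite set of verifications covers all compositions. The component below
# is a hypothetical toy, used only to make the mechanism concrete.

def lowpass(msgs):
    """Toy component: drops every message tagged 'noise' (idempotent)."""
    return tuple(m for m in msgs if m != "noise")

def pipeline(stages, msgs):
    for stage in stages:
        msgs = stage(msgs)
    return msgs

def reduction_holds(component, alphabet, max_len):
    """Check [c, c] ~ [c] on all input sequences up to max_len."""
    for n in range(max_len + 1):
        for msgs in product(alphabet, repeat=n):
            if pipeline([component, component], msgs) != \
               pipeline([component], msgs):
                return False
    return True

alphabet = ("data", "noise")
assert reduction_holds(lowpass, alphabet, max_len=4)
# With the reduction established, a chain of 50 filters behaves like one,
# so only the short composition needs to be model checked.
msgs = ("data", "noise", "data")
print(pipeline([lowpass] * 50, msgs) == pipeline([lowpass], msgs))  # True
```

The exhaustive check here stands in for the formal proof obligation; in the methodology proper, the reduction itself is discharged by model checking the small compositions.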

Relevance: 10.00%

Abstract:

We survey several of the research efforts pursued by the iBench and snBench projects in the CS Department at Boston University over the last half dozen years. These activities use ideas and methodologies inspired by recent developments in other parts of computer science -- particularly in formal methods and in the foundations of programming languages -- but now specifically applied to the certification of safety-critical networking systems. This is research jointly led by Azer Bestavros and Assaf Kfoury with the participation of Adam Bradley, Andrei Lapets, and Michael Ocean.