896 results for Practical Librarianship
Abstract:
Missiological calls for self-theologizing among faith communities present the field of practical theology with a challenge to develop methodological approaches that address the complexities of cross-cultural, practical theological research. Although a variety of approaches can be considered critical correlative practical theology, existing methods are often built on assumptions that limit their use in subaltern contexts. This study seeks to address these concerns by analyzing existing theological methodologies with sustained attention to a community of Deaf Zimbabwean women struggling to develop their own agency in relation to child-rearing practices. This dilemma serves as an entry point to an examination of the limitations of existing methodologies and a constructive, interdisciplinary theological exploration. The theological modeling methodology employs my experience of learning to cook sadza, a staple dish of Zimbabwe, as a guide for analyzing and reorienting practical theological methodology. The study explores a variety of theological approaches from practical theology, mission-oriented theologians, theology among Deaf communities, and African women’s theology in relationship to the challenges presented by subaltern communities such as Deaf Zimbabwean women. Analysis reveals that although there is much to commend in these existing methodologies, questions about who does the critical correlation, whose interests guide the study, and how the cross-cultural and power dynamics between researchers and faith communities are handled remain problematic for developing self-theologizing agency. Rather than frame a comprehensive methodology, this study proposes three attitudes and guideposts to reorient practical theological researchers who wish to engender self-theologizing agency in subaltern communities. The creativity of enacted theology, the humility of using checks and balances in research methods, and the grace of finding strategies to build bridges of commonality and community offer ways to reorient practical theological methodologies toward the development of self-theologizing agency among subaltern people. This study concludes with a discussion of how these guideposts can not only benefit particular work with a community of Deaf Zimbabwean women but also inform research and theological reflection in other subaltern contexts.
Abstract:
Existing approaches for multirate multicast congestion control are either friendly to TCP only over large time scales or introduce unfortunate side effects, such as significant control traffic, wasted bandwidth, or the need for modifications to existing routers. We advocate a layered multicast approach in which steady-state receiver reception rates emulate the classical TCP sawtooth derived from additive-increase, multiplicative-decrease (AIMD) principles. Our approach introduces the concept of dynamic stair layers to simulate various rates of additive increase for receivers with heterogeneous round-trip times (RTTs), facilitated by a minimal amount of IGMP control traffic. We employ a mix of cumulative and non-cumulative layering to minimize the amount of excess bandwidth consumed by receivers operating asynchronously behind a shared bottleneck. We integrate these techniques into a congestion control scheme called STAIR, which is amenable to multicast applications that can make effective use of arbitrary and time-varying subscription levels.
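As a point of reference for the sawtooth behaviour described above, the following is a minimal sketch of AIMD rate adjustment in Python; the increase step, decrease factor, and loss pattern are illustrative assumptions, and the layer-subscription machinery of STAIR itself is not modelled.

```python
# Minimal AIMD sketch: additive increase each round trip without loss,
# multiplicative decrease on loss, yielding the classic TCP sawtooth.
# alpha, beta, and the periodic loss pattern are illustrative assumptions.

def aimd_trace(rounds, alpha=1.0, beta=0.5, loss_every=20):
    rate = 1.0
    trace = []
    for r in range(rounds):
        if r > 0 and r % loss_every == 0:
            rate *= beta          # multiplicative decrease on loss
        else:
            rate += alpha         # additive increase per RTT
        trace.append(rate)
    return trace

if __name__ == "__main__":
    for r, rate in enumerate(aimd_trace(60)):
        print(f"round {r:2d}: rate = {rate:5.1f}")
```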
Abstract:
With web caching and cache-related services like CDNs and edge services playing an increasingly significant role in the modern internet, the problem of the weak consistency and coherence provisions in current web protocols is becoming increasingly significant and drawing the attention of the standards community [LCD01]. Toward this end, we present definitions of consistency and coherence for web-like environments, that is, distributed client-server information systems where the semantics of interactions with resources are more general than the read/write operations found in memory hierarchies and distributed file systems. We then present a brief review of proposed mechanisms that strengthen the consistency of caches in the web, focusing on their conceptual contributions and their weaknesses in real-world practice. These insights motivate a new mechanism, which we call "Basis Token Consistency" or BTC; when implemented at the server, this mechanism allows any client (independent of the presence and conformity of any intermediaries) to maintain a self-consistent view of the server's state. This is accomplished by annotating responses with additional per-resource application information that allows client caches to recognize the obsolescence of currently cached entities and to identify responses from other caches that are stale in light of what has already been seen. The mechanism requires no deviation from the existing client-server communication model and does not require servers to maintain any additional per-client state. We discuss how our mechanism could be integrated into a fragment-assembling Content Management System (CMS), and present a simulation-driven performance comparison between the BTC algorithm and the use of the Time-To-Live (TTL) heuristic.
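The following is a minimal sketch, in Python, of the client-side bookkeeping that such per-resource annotations make possible: a cache drops any entry whose underlying resource ("basis") is revealed by a later response to carry a newer token. The class name, token encoding, and method names are assumptions for illustration and do not reproduce the actual BTC wire format.

```python
# Sketch of a client cache keyed on per-resource version tokens ("bases").
# A cached entity is treated as obsolete once any response reveals a newer
# token for one of its bases. Names and token format are illustrative.

class BasisTokenCache:
    def __init__(self):
        self.entries = {}   # url -> (body, {basis: token})
        self.latest = {}    # basis -> newest token observed so far

    def store(self, url, body, basis_tokens):
        self._observe(basis_tokens)
        self.entries[url] = (body, dict(basis_tokens))

    def lookup(self, url):
        hit = self.entries.get(url)
        return hit[0] if hit else None

    def _observe(self, basis_tokens):
        # Remember the newest token seen per basis and evict entries that
        # depend on an older token for that basis.
        for basis, token in basis_tokens.items():
            if token > self.latest.get(basis, -1):
                self.latest[basis] = token
                self.entries = {
                    u: (b, t) for u, (b, t) in self.entries.items()
                    if t.get(basis, token) >= token
                }
```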
Abstract:
This work considers the effect of hardware constraints that typically arise in practical power-aware wireless sensor network systems. A rigorous methodology is presented that quantifies the effect of output power limit and quantization constraints on bit error rate performance. The approach uses a novel, intuitively appealing means of addressing the output power constraint, wherein the attendant saturation block is mapped from the output of the plant to its input and compensation is then achieved using a robust anti-windup scheme. A priori levels of system performance are attained using a quantitative feedback theory approach on the initial, linear stage of the design paradigm. This hybrid design is assessed experimentally using a fully compliant 802.15.4 testbed where mobility is introduced through the use of autonomous robots. A benchmark comparison between the new approach and a number of existing strategies is also presented.
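For readers unfamiliar with the anti-windup idea invoked here, the sketch below shows back-calculation anti-windup on a PI controller whose output is clipped by a power limit and which drives a simple first-order plant; all gains, limits, and the plant model are invented for illustration and are not the design from this study.

```python
# Illustrative PI controller with output saturation and back-calculation
# anti-windup, driving a first-order plant. All values are assumptions
# for demonstration, not parameters from the paper.

def simulate(steps=400, dt=0.05, setpoint=1.0,
             kp=2.0, ki=1.5, kt=1.0, u_max=0.5):
    y, integ = 0.0, 0.0
    for _ in range(steps):
        error = setpoint - y
        u_unsat = kp * error + ki * integ
        u = max(-u_max, min(u_max, u_unsat))   # output power limit
        # Back-calculation: bleed the saturation excess out of the integrator.
        integ += dt * (error + kt * (u - u_unsat))
        y += dt * (-y + u)                     # first-order plant
    return y

if __name__ == "__main__":
    print(f"output after settling: {simulate():.3f}")
```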
Abstract:
Though the motivation for developing Ambient Assisted Living (AAL) systems is incontestable, significant challenges exist in realizing the ambience that is essential to the success of such systems. By definition, an AAL system must be omnipresent, tracking occupant activities in the home and identifying those situations where assistance is needed or would be welcomed. Embedded sensors offer an attractive mechanism for realizing ambience, as their form factor and harnessing of wireless technologies aid in their seamless integration into pre-existing environments. However, the heterogeneity of the end-user population, their disparate needs, and the differing environments they inhabit all pose particular problems regarding sensor integration and management.
Abstract:
The power consumption of wireless sensor network (WSN) modules is an important practical concern in building energy management (BEM) system deployments. A set of metrics is created to assess the power profile of a WSN under real-world conditions. The aim of this work is to understand and eventually eliminate the uncertainties in WSN power consumption during long-term deployments and to assess compatibility with existing and emerging energy harvesting technologies. This paper investigates the key metrics in data processing, wireless data transmission, data sensing, and duty-cycle parameters to understand the system power profile from a practical deployment perspective. Based on the proposed analysis, the impact of each metric on power consumption in a typical BEM application is presented and the corresponding low-power solutions are investigated.
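As background on why the duty cycle dominates the power profile, a back-of-the-envelope average-power estimate can be computed as below; the active and sleep currents and the supply voltage are typical order-of-magnitude assumptions for an 802.15.4-class node, not measurements from this work.

```python
# Rough average-power estimate for a duty-cycled sensor node.
# Current and voltage figures are illustrative assumptions, not measured data.

def average_power_mw(duty_cycle, i_active_ma=20.0, i_sleep_ma=0.02,
                     voltage_v=3.0):
    i_avg_ma = duty_cycle * i_active_ma + (1 - duty_cycle) * i_sleep_ma
    return voltage_v * i_avg_ma

if __name__ == "__main__":
    for d in (0.001, 0.01, 0.1):
        print(f"duty cycle {d:>6.1%}: ~{average_power_mw(d):.3f} mW")
```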
Abstract:
This work considers the static calculation of a program’s average-case time. The number of systems that currently tackle this research problem is quite small due to the difficulties inherent in average-case analysis. While each of these systems makes a pertinent contribution, and each is discussed individually in this work, only one of them forms the basis of this research. That particular system is known as MOQA. The MOQA system consists of the MOQA language and the MOQA static analysis tool. Its technique for statically determining average-case behaviour centres on maintaining strict control over both the data structure type and the labeling distribution. This research develops and evaluates the MOQA language implementation, and adds to the functions already available in this language. Furthermore, the theory that backs MOQA is generalised, and the range of data structures for which the MOQA static analysis tool can determine average-case behaviour is increased. Also, some of the MOQA applications and extensions suggested in other works are logically examined here. For example, the accuracy of classifying the MOQA language as reversible is investigated, along with the feasibility of incorporating duplicate labels into the MOQA theory. Finally, the analyses that take place during the course of this research reveal some of the MOQA strengths and weaknesses. This thesis aims to be pragmatic when evaluating the current MOQA theory, the advancements set forth in the following work, and the benefits of MOQA when compared to similar systems. Succinctly, this work’s significant expansion of the MOQA theory is accompanied by a realistic assessment of MOQA’s accomplishments and a serious deliberation of the opportunities available to MOQA in the future.
Abstract:
OBJECTIVE: The research studied the status of hospital librarians and library services to better inform the Medical Library Association's advocacy activities. METHODS: The Vital Pathways Survey Subcommittee of the Task Force on Vital Pathways for Hospital Librarians distributed a web-based survey to hospital librarians and academic health sciences library directors. The survey results were compared to data collected in a 1989 survey of hospital libraries by the American Hospital Association in order to identify any trends in hospital libraries, roles of librarians, and library services. A web-based hospital library report form based on the survey questions was also developed to more quickly identify changes in the status of hospital libraries on an ongoing basis. RESULTS: The greatest change in library services between 1989 and 2005/06 was in the area of access to information, with 40% more of the respondents providing access to commercial online services, 100% more providing access to Internet resources, and 28% more providing training in database searching and use of information resources. Twenty-nine percent (n = 587) of the 2005/06 respondents reported a decrease in staff over the last 5 years. CONCLUSIONS: Survey data support reported trends of consolidation of hospitals and hospital libraries and additions of new services. These services have likely required librarians to acquire new skills. It is hoped that future surveys will be undertaken to continue to study these trends.
Abstract:
Although the prognosis of ambulatory heart failure (HF) has improved dramatically, there have been few advances in the management of acute HF (AHF). Despite regional differences in patient characteristics, background therapy, and event rates, AHF clinical trial enrollment has transitioned from North America and Western Europe to Eastern Europe, South America, and Asia-Pacific, where regulatory burden and the cost of conducting research may be less prohibitive. It is unclear whether the results of clinical trials conducted outside of North America are generalizable to US patient populations. This article uses AHF as a paradigm and identifies barriers and practical solutions to successfully conducting site-based research in North America.
Abstract:
Social network analysts have tried to capture the idea of social role explicitly by proposing a framework that precisely gives conditions under which group actors are playing equivalent roles. They term these methods positional analysis techniques. The most general definition is regular equivalence, which captures the idea that equivalent actors are related in a similar way to equivalent alters. Regular equivalence gives rise to a whole class of partitions on a network. Given a network, we have two different computational problems. The first is how to find a particular regular equivalence. An algorithm exists to find the largest regular partition, but there are no efficient algorithms to test whether there is a regular k-partition, that is, a partition into k groups that is regular. In addition, when dealing with real data, it is unlikely that any regular partitions exist. To overcome this problem, relaxations of regular equivalence have been proposed, along with optimisation techniques to find nearly regular partitions. In this paper we review the algorithms that have been developed to find particular regular equivalences and look at some of the recent theoretical results which give insight into the complexity of finding regular partitions.
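To convey the flavour of such algorithms, the sketch below runs a simple iterative refinement on a directed graph: starting from a single block, blocks are repeatedly split until members of the same block point to, and are pointed to by, the same set of blocks. It is a simplified illustration of the refinement idea, not a reproduction of any specific published algorithm.

```python
# Iterative refinement toward a regular partition of a directed graph.
# Nodes are split until every block relates to the same set of blocks
# via both out-edges and in-edges. Simplified illustration only.

def refine_regular(nodes, edges):
    succ = {n: set() for n in nodes}
    pred = {n: set() for n in nodes}
    for a, b in edges:
        succ[a].add(b)
        pred[b].add(a)

    block = {n: 0 for n in nodes}              # start with one block
    while True:
        sig = {n: (block[n],
                   frozenset(block[m] for m in succ[n]),
                   frozenset(block[m] for m in pred[n]))
               for n in nodes}
        labels, new_block = {}, {}
        for n in nodes:
            new_block[n] = labels.setdefault(sig[n], len(labels))
        if len(labels) == len(set(block.values())):
            return block                       # no block was split: stable
        block = new_block

if __name__ == "__main__":
    nodes = ["boss", "mgr1", "mgr2", "w1", "w2", "w3"]
    edges = [("boss", "mgr1"), ("boss", "mgr2"),
             ("mgr1", "w1"), ("mgr1", "w2"), ("mgr2", "w3")]
    print(refine_regular(nodes, edges))        # bosses, managers, workers
```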
Abstract:
Counter-current chromatography (CCC) is a technique that shows great potential for large-scale purification. Its usefulness in a "research and development" pharmaceutical environment has been investigated, and the conclusions are presented in this article. The use of CCC requires the development of an appropriate solvent system (a parameter of critical importance), a process which can be tedious. This article presents a novel strategy, combining a statistical approach and fast HPLC, to generate a three-dimensional partition coefficient map and rapidly predict an optimal solvent system. This screen is performed in half a day and involves 9 experiments per solvent mixture. Test separations were performed using this screen to ensure the validity of the method.
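To illustrate the kind of selection step such a partition-coefficient map supports, the sketch below picks, from a small grid of candidate solvent compositions with measured K values, the one whose K lies closest (on a log scale) to a target of K near 1, reflecting the commonly cited CCC sweet spot of roughly 0.5 to 2. The compositions, K values, and the exact criterion are illustrative assumptions, not data or rules from this article.

```python
# Illustrative selection of a solvent system from measured partition
# coefficients (K). Compositions and K values are made-up placeholders.
import math

candidates = {
    # (heptane, EtOAc, MeOH, water) volume ratios -> measured K
    (5, 5, 5, 5): 0.12,
    (4, 6, 4, 6): 0.55,
    (3, 7, 3, 7): 1.10,
    (2, 8, 2, 8): 3.40,
}

def best_system(k_map, target=1.0):
    # Compare on a log scale so K = 0.5 and K = 2 are equally far from 1.
    return min(k_map, key=lambda c: abs(math.log(k_map[c] / target)))

if __name__ == "__main__":
    best = best_system(candidates)
    print("selected ratios:", best, "with K =", candidates[best])
```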