989 results for ANSWER


Relevance: 10.00%

Abstract:

Background: Achieving the goals set by Roll Back Malaria and the Government of Kenya for use of insecticide-treated bednets (ITNs) will require that the private retail market for nets and insecticide treatments grow substantially. This paper applies some basic concepts of market structure and pricing to a set of recently-collected retail price data from Kenya in order to answer the question, “How well are Kenyan retail markets for ITNs working?”

Methods: Data on the availability and prices of ITNs at a wide range of retail outlets throughout Kenya were collected in January 2002, and vendors and manufacturers were interviewed regarding market structure.

Findings: Untreated nets are manufactured in Kenya by a number of companies and are widely available in large and medium-sized towns. Availability in smaller villages is limited. There is relatively little geographic price variation, and nets can be found at competitive prices in towns and cities. Marketing margins on prices appear to be within normal ranges. No finished nets are imported. Few pre-treated nets or net+treatment combinations are available, with the exception of the subsidized Supanet/Power Tab combination marketed by a donor-funded social marketing project.

Conclusions: Retail markets for untreated nets in Kenya are well established and appear to be competitive. Markets for treated nets and insecticide treatment kits are not well established. The role of subsidized ITN marketing projects should be monitored to ensure that these projects support, rather than hinder, the development of retail markets.

Relevance: 10.00%

Abstract:

Background: Chronic illness and premature mortality from malaria, water-borne diseases, and respiratory illnesses have long been known to diminish the welfare of individuals and households in developing countries. Previous research has also shown that chronic diseases among farming populations suppress labor productivity and agricultural output. As the illness and death toll from HIV/AIDS continues to climb in most of sub-Saharan Africa, concern has arisen that the loss of household labor it causes will reduce crop yields, impoverish farming households, intensify malnutrition, and suppress growth in the agricultural sector. If chronic morbidity and premature mortality among individuals in farming households have substantial impacts on household production, and if a large number of households are affected, it is possible that an increase in morbidity and mortality from HIV/AIDS or other diseases could affect national aggregate output and exports. If, on the other hand, the impact at the household farm level is modest, or if relatively few households are affected, there is likely to be little effect on aggregate production across an entire country. Which of these outcomes is more likely in West Africa is unknown. Little rigorous, quantitative research has been published on the impacts of AIDS on smallholder farm production, particularly in West Africa. The handful of studies that have been conducted have looked mainly at small populations in areas of very high HIV prevalence in southern and eastern Africa. Conclusions about how HIV/AIDS, and other causes of chronic morbidity and mortality, are affecting agriculture across the continent cannot be drawn from these studies.
In view of the importance of agriculture, and particularly smallholder agriculture, in the economies of most African countries and the scarcity of resources for health interventions, it is valuable to identify, describe, and quantify the impact of chronic morbidity and mortality on smallholder production of important crops in West Africa. One such crop is cocoa. In Ghana, cocoa is a crop of national importance that is produced almost exclusively by smallholder households. In 2003, Ghana was the world’s second-largest producer of cocoa. Cocoa accounted for a quarter of Ghana’s export revenues that year and generated 15 percent of employment. The success and growth of the cocoa industry is thus vital to the country’s overall social and economic development.

Study Objectives and Methods: In February and March 2005, the Center for International Health and Development of Boston University (CIHD) and the Department of Agricultural Economics and Agribusiness (DAEA) of the University of Ghana, with financial support from the Africa Bureau of the U.S. Agency for International Development and from Mars, Inc., which is a major purchaser of West African cocoa, conducted a survey of a random sample of cocoa farming households in the Western Region of Ghana. The survey documented the extent of chronic morbidity and mortality in cocoa growing households in the Western Region of Ghana, the country’s largest cocoa growing region, and analyzed the impact of morbidity and mortality on cocoa production. It aimed to answer three specific research questions. (1) What is the baseline status of the study population in terms of household size and composition, acute and chronic morbidity, recent mortality, and cocoa production? (2) What is the relationship between household size and cocoa production, and how can this relationship be used to understand the impact of adult mortality and chronic morbidity on the production of cocoa at the household level?
The study population was the approximately 42,000 cocoa farming households in the southern part of Ghana’s Western Region. A random sample of households was selected from a roster of eligible households developed from existing administrative information. Under the supervision of the University of Ghana field team, enumerators were graduate students of the Department of Agricultural Economics and Agribusiness or employees of the Cocoa Services Division. A total of 632 eligible farmers participated in the survey. Of these, 610 provided complete responses to all questions needed to complete the multivariate statistical analysis reported here.

Relevance: 10.00%

Abstract:

Sonic boom propagation in a quiet, stratified, lossy atmosphere is the subject of this dissertation. Two questions are considered in detail: (1) Does waveform freezing occur? (2) Are sonic booms shocks in steady state? Both assumptions have been invoked in the past to predict sonic boom waveforms at the ground. A very general form of the Burgers equation is derived and used as the model for the problem. The derivation begins with the basic conservation equations. The effects of nonlinearity, attenuation and dispersion due to multiple relaxations, viscosity, and heat conduction, geometrical spreading, and stratification of the medium are included. When the absorption and dispersion terms are neglected, an analytical solution is available. The analytical solution is used to answer the first question. Geometrical spreading and stratification of the medium are found to slow down the nonlinear distortion of finite-amplitude waves. In certain cases the distortion reaches an absolute limit, a phenomenon called waveform freezing. Judging by the maturity of the distortion mechanism, sonic booms generated by aircraft at 18 km altitude are not frozen when they reach the ground. On the other hand, judging by the approach of the waveform to its asymptotic shape, N waves generated by aircraft at 18 km altitude are frozen when they reach the ground. To answer the second question we solve the full Burgers equation and for this purpose develop a new computer code, THOR. The code is based on an algorithm by Lee and Hamilton (J. Acoust. Soc. Am. 97, 906-917, 1995) and has the novel feature that all its calculations are done in the time domain, including absorption and dispersion. Results from the code compare very well with analytical solutions. In a NASA exercise to compare sonic boom computer programs, THOR gave results that agree well with those of other participants and ran faster.
We show that sonic booms are not steady state waves because they travel through a varying medium, suffer spreading, and fail to approximate step shocks closely enough. Although developed to predict sonic boom propagation, THOR can solve other problems for which the extended Burgers equation is a good propagation model.
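The abstract names the model but does not reproduce it; for orientation, a textbook form of the extended Burgers equation (thermoviscous absorption only, relaxation terms omitted — this is a standard form, not necessarily the exact one derived in the dissertation) for pressure p in retarded time τ along a ray of arc length x, with ray-tube area A and ambient density and sound speed ρ₀, c₀, is:

```latex
\frac{\partial p}{\partial x}
  = \frac{\beta}{\rho_0 c_0^3}\, p \frac{\partial p}{\partial \tau}
  + \frac{\delta}{2 c_0^3}\, \frac{\partial^2 p}{\partial \tau^2}
  - \frac{1}{2A}\frac{dA}{dx}\, p
  + \frac{1}{2\rho_0 c_0}\frac{d(\rho_0 c_0)}{dx}\, p
```

Here β is the coefficient of nonlinearity and δ the diffusivity of sound; dropping the δ (absorption/dispersion) term leaves the lossless spreading-and-stratification case that admits the analytical solution mentioned above.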

Relevance: 10.00%

Abstract:

If every lambda-abstraction in a lambda-term M binds at most one variable occurrence, then M is said to be "linear". Many questions about linear lambda-terms are relatively easy to answer, e.g. they all are beta-strongly normalizing and all are simply-typable. We extend the syntax of the standard lambda-calculus L to a non-standard lambda-calculus L^ satisfying a linearity condition generalizing the notion in the standard case. Specifically, in L^ a subterm Q of a term M can be applied to several subterms R1,...,Rk in parallel, which we write as (Q. R1 \wedge ... \wedge Rk). The appropriate notion of beta-reduction beta^ for the calculus L^ is such that, if Q is the lambda-abstraction (\lambda x.P) with m\geq 0 bound occurrences of x, the reduction can be carried out provided k = max(m,1). Every M in L^ is thus beta^-SN. We relate standard beta-reduction and non-standard beta^-reduction in several different ways, and draw several consequences, e.g. a new simple proof for the fact that a standard term M is beta-SN iff M can be assigned a so-called "intersection" type ("top" type disallowed).
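As a toy illustration of the arity condition k = max(m, 1) (ours, not the paper's formalism), terms can be encoded as nested tuples — ('var', x), ('lam', x, body), and ('papp', Q, [R1, ..., Rk]) for the parallel application (Q . R1 ∧ ... ∧ Rk). We additionally assume that each occurrence of the bound variable consumes one R_i in left-to-right order, which is one natural reading of the parallel reduction; capture-avoiding renaming is elided for brevity:

```python
def occurrences(t, x):
    """Count free occurrences of variable x in term t."""
    kind = t[0]
    if kind == 'var':
        return 1 if t[1] == x else 0
    if kind == 'lam':
        return 0 if t[1] == x else occurrences(t[2], x)
    # ('papp', Q, [R1, ..., Rk]): occurrences in Q plus in every Ri
    return occurrences(t[1], x) + sum(occurrences(r, x) for r in t[2])

def beta_hat(q, args):
    """One beta^ step: q = ('lam', x, body) applied in parallel to args.
    The step is permitted only when len(args) == max(m, 1), where m is the
    number of bound occurrences of x; for m == 0 the single argument is dropped."""
    _, x, body = q
    m = occurrences(body, x)
    if len(args) != max(m, 1):
        raise ValueError('beta^ requires k = max(m, 1) arguments')
    it = iter(args)

    def subst(t):
        kind = t[0]
        if kind == 'var':
            return next(it) if t[1] == x else t
        if kind == 'lam':
            return t if t[1] == x else ('lam', t[1], subst(t[2]))
        return ('papp', subst(t[1]), [subst(r) for r in t[2]])

    return subst(body)
```

For example, applying (λx. x x) in parallel to a and b yields a b, while supplying the wrong number of arguments is rejected, which is exactly why every β^-redex strictly shrinks and terms are β^-SN.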

Relevance: 10.00%

Abstract:

Personal communication devices are increasingly equipped with sensors for passive monitoring of encounters and surroundings. We envision the emergence of services that enable a community of mobile users carrying such resource-limited devices to query such information at remote locations in the field in which they collectively roam. One approach to implement such a service is directed placement and retrieval (DPR), whereby readings/queries about a specific location are routed to a node responsible for that location. In a mobile, potentially sparse setting, where end-to-end paths are unavailable, DPR is not an attractive solution as it would require the use of delay-tolerant (flooding-based store-carry-forward) routing of both readings and queries, which is inappropriate for applications with data freshness constraints, and which is incompatible with stringent device power/memory constraints. Alternatively, we propose the use of amorphous placement and retrieval (APR), in which routing and field monitoring are integrated through the use of a cache management scheme coupled with an informed exchange of cached samples to diffuse sensory data throughout the network, in such a way that a query answer is likely to be found close to the query origin. We argue that knowledge of the distribution of query targets could be used effectively by an informed cache management policy to maximize the utility of collective storage of all devices. Using a simple analytical model, we show that the use of informed cache management is particularly important when the mobility model results in a non-uniform distribution of users over the field. We present results from extensive simulations which show that in sparsely-connected networks, APR is more cost-effective than DPR, that it provides extra resilience to node failure and packet losses, and that its use of informed cache management yields superior performance.
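The abstract argues that knowledge of the query-target distribution should drive cache management, but does not spell out a policy. A minimal sketch of one "informed" eviction rule — keep the samples whose locations are most likely to be queried, discounted by staleness; the linear freshness decay and all parameter names here are our assumptions — could look like:

```python
def informed_evict(cache, query_prob, capacity, now, max_age):
    """Trim `cache` ({location: (reading, timestamp)}) down to `capacity` entries.
    A sample's utility is the probability that its location is queried, scaled by
    a freshness factor that decays linearly to zero at age `max_age`."""
    def utility(loc):
        _reading, ts = cache[loc]
        freshness = max(0.0, 1.0 - (now - ts) / max_age)
        return query_prob.get(loc, 0.0) * freshness

    keep = sorted(cache, key=utility, reverse=True)[:capacity]
    return {loc: cache[loc] for loc in keep}
```

Under a non-uniform query distribution this policy devotes the collective storage to the hot locations, which is the regime where the paper's analysis shows informed management matters most.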

Relevance: 10.00%

Abstract:

We consider a mobile sensor network monitoring a spatio-temporal field. Given limited cache sizes at the sensor nodes, the goal is to develop a distributed cache management algorithm to efficiently answer queries with a known probability distribution over the spatial dimension. First, we propose a novel distributed information theoretic approach in which the nodes locally update their caches based on full knowledge of the space-time distribution of the monitored phenomenon. At each time instant, local decisions are made at the mobile nodes concerning which samples to keep and whether or not a new sample should be acquired at the current location. These decisions account for minimizing an entropic utility function that captures the average amount of uncertainty in queries given the probability distribution of query locations. Second, we propose a different correlation-based technique, which only requires knowledge of the second-order statistics, thus relaxing the stringent constraint of having a priori knowledge of the query distribution, while significantly reducing the computational overhead. It is shown that the proposed approaches considerably improve the average field estimation error by maintaining efficient cache content. It is further shown that the correlation-based technique is robust to model mismatch in case of imperfect knowledge of the underlying generative correlation structure.
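As an illustration of the entropic utility idea — a simplified sketch with our own assumptions: a unit-variance Gaussian field, an exponential correlation with length scale `length`, and conditioning each query location on its single most-correlated cached sample — a greedy cache-selection loop might be:

```python
import math

def expected_entropy(cached, query_prob, sigma2=1.0, length=1.0):
    """Average Gaussian differential entropy over query locations, each query
    conditioned on the most-correlated cached sample (correlation-based sketch)."""
    total = 0.0
    for q, p in query_prob.items():
        rho = max((math.exp(-abs(q - c) / length) for c in cached), default=0.0)
        var = sigma2 * (1.0 - rho * rho)   # conditional variance given one sample
        var = max(var, 1e-12)              # avoid log(0) when q is cached exactly
        total += p * 0.5 * math.log(2 * math.pi * math.e * var)
    return total

def greedy_cache(candidates, query_prob, capacity, **kw):
    """Greedily pick cache contents that minimise expected query entropy."""
    chosen = []
    for _ in range(min(capacity, len(candidates))):
        best = min((c for c in candidates if c not in chosen),
                   key=lambda c: expected_entropy(chosen + [c], query_prob, **kw))
        chosen.append(best)
    return chosen
```

With a skewed query distribution the greedy rule caches the high-probability locations first, mirroring the paper's point that the entropic utility ties cache content to where queries are expected.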

Relevance: 10.00%

Abstract:

One of the most vexing questions facing researchers interested in the World Wide Web is why users often experience long delays in document retrieval. The Internet's size, complexity, and continued growth make this a difficult question to answer. We describe the Wide Area Web Measurement project (WAWM) which uses an infrastructure distributed across the Internet to study Web performance. The infrastructure enables simultaneous measurements of Web client performance, network performance and Web server performance. The infrastructure uses a Web traffic generator to create representative workloads on servers, and both active and passive tools to measure performance characteristics. Initial results based on a prototype installation of the infrastructure are presented in this paper.

Relevance: 10.00%

Abstract:

Perceptual grouping is well-known to be a fundamental process during visual perception, notably grouping across scenic regions that do not receive contrastive visual inputs. Illusory contours are a classical example of such groupings. Recent psychophysical and neurophysiological evidence has shown that the grouping process can facilitate rapid synchronization of the cells that are bound together by a grouping, even when the grouping must be completed across regions that receive no contrastive inputs. Synchronous grouping can thereby bind together different object parts that may have become desynchronized due to a variety of factors, and can enhance the efficiency of cortical transmission. Neural models of perceptual grouping have clarified how such fast synchronization may occur by using bipole grouping cells, whose predicted properties have been supported by psychophysical, anatomical, and neurophysiological experiments. These models have not, however, incorporated some of the realistic constraints on which groupings in the brain are conditioned, notably the measured spatial extent of long-range interactions in layer 2/3 of a grouping network, and realistic synaptic and axonal signaling delays within and across cells in different cortical layers. This work addresses the questions: Can long-range interactions that obey the bipole constraint achieve fast synchronization under realistic anatomical and neurophysiological constraints that initially desynchronize grouping signals? Can the cells that synchronize retain their analog sensitivity to changing input amplitudes? Can the grouping process complete and synchronize illusory contours across gaps in bottom-up inputs? Our simulations show that the answer to these questions is Yes.
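The laminar bipole circuit itself is beyond an abstract, but the core phenomenon — mutually coupled units pulling initially desynchronized signals into synchrony — can be illustrated generically with a Kuramoto-style phase model (our generic illustration, not the article's model):

```python
import math

def kuramoto_step(phases, k, dt):
    """One Euler step of the Kuramoto model with identical natural frequencies:
    each phase is pulled toward every other phase with coupling strength k."""
    n = len(phases)
    return [p + dt * k * sum(math.sin(q - p) for q in phases) / n for p in phases]

def order_parameter(phases):
    """Synchrony measure r in [0, 1]: r = 1 means all phases coincide."""
    n = len(phases)
    c = sum(math.cos(p) for p in phases) / n
    s = sum(math.sin(p) for p in phases) / n
    return math.hypot(c, s)
```

Starting from widely scattered phases, iterating `kuramoto_step` drives the order parameter from near 0 toward 1, the toy analogue of the fast resynchronization the simulations demonstrate.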

Relevance: 10.00%

Abstract:

How do our brains transform the "blooming buzzing confusion" of daily experience into a coherent sense of self that can learn and selectively attend to important information? How do local signals at multiple processing stages, none of which has a global view of brain dynamics or behavioral outcomes, trigger learning at multiple synaptic sites when appropriate, and prevent learning when inappropriate, to achieve useful behavioral goals in a continually changing world? How does the brain allow synaptic plasticity at a remarkably rapid rate, as anyone who has gone to an exciting movie is readily aware, yet also protect useful memories from catastrophic forgetting? A neural model provides a unified answer by explaining and quantitatively simulating data about single cell biophysics and neurophysiology, laminar neuroanatomy, aggregate cell recordings (current-source densities, local field potentials), large-scale oscillations (beta, gamma), and spike-timing dependent plasticity, and functionally linking them all to cognitive information processing requirements.

Relevance: 10.00%

Abstract:

Through an investigation of the Anglo-Saxon prayer books and selected psalters, this thesis corrects standard histories of medieval devotion that circumvent the Anglo-Saxon contribution to medieval piety. In the first half of the thesis, I establish a theoretical framework for Anglo-Saxon piety in which to explore the prayers. Current theoretical frameworks dealing with the medieval devotional material are flawed, as scholars use terms such as ‘affective piety’, ‘private’ and even ‘devotion’ vaguely. After an introduction which defines some of the core terminology, Chapter 2 introduces the principal witnesses to the Anglo-Saxon prayer tradition. These include the prodigal eighth- and early ninth-century Mercian Group, comprising the Book of Nunnaminster (London, British Library, Harley 2965, s. viii ex/ix1), the Harleian Prayer Book (London, British Library, Harley 7653, s. viii ex/ix1), the Royal Prayer Book (London, British Library, Royal 2 A. xx, s. viii2/ix1/4), and the Book of Cerne (Cambridge, University Library, Ll. 1. 10). These prayer books are the earliest of their kind in Europe. This chapter challenges some established views concerning the prayer books, including purported Irish influence on their composition and the probability of female ownership. Chapter 3 explores the performance of prayer. The chapter demonstrates that Anglo-Saxon prayers, for example the Royal Abecedarian Prayer, were transmitted fluidly. The complex relationship between this abecedarian prayer and its reflex in the Book of Nunnaminster reveals the complexity of prayer composition and transmission in the early medieval world but, more importantly, it helps scholars theorise how the prayers may have been used, whether recited verbatim or used for extemporalisation. Changes made by later readers to earlier texts are also vital to this study, since they help answer questions of usage and show the evolution and subsequent influence of Anglo-Saxon religiosity.
The second half of the thesis makes a special study of prayers to the Cross, the wounded Christ, and the Virgin, three important themes in later medieval spirituality. These focus on the Royal Abecedarian Prayer, which explores Christ’s life (Chapter 5), especially his Passion; the ‘Domine Ihesu Christe, adoro te cruce’ which celebrates the Cross (Chapter 4); and the Oratio Alchfriðo ad sanctam Mariam, which invokes the Virgin Mary (Chapter 6). These prayers occur in multiple, temporally-diverse witnesses and have complex transmission histories, involving both oral and written dissemination. The concluding chapter (7) highlights some of the avenues for future research opened by the thesis.

Relevance: 10.00%

Abstract:

Background: Hospital clinicians are increasingly expected to practice evidence-based medicine (EBM) in order to minimize medical errors and ensure quality patient care, but experience obstacles to information-seeking. The introduction of a Clinical Informationist (CI) is explored as a possible solution. Aims: This paper investigates the self-perceived information needs, behaviour and skill levels of clinicians in two Irish public hospitals. It also explores clinicians’ perceptions of and attitudes to the introduction of a CI into their clinical teams. Methods: A questionnaire survey approach was utilised for this study, with 22 clinicians in two hospitals. Data analysis was conducted using descriptive statistics. Results: Analysis showed that clinicians experience diverse information needs for patient care, and that barriers such as time constraints and insufficient access to resources hinder their information-seeking. Findings also showed that clinicians struggle to fit information-seeking into their working day, regularly seeking to answer patient-related queries outside of working hours. Attitudes towards the concept of a CI were predominantly positive. Conclusion: This paper highlights the factors that characterise and limit hospital clinicians’ information-seeking, and suggests the CI as a potentially useful addition to the clinical team, to help them to resolve their information needs for patient care.

Relevance: 10.00%

Abstract:

This study is set in the context of disadvantaged urban primary schools in Ireland. It inquires into the collaborative practices of primary teachers, exploring how class teachers and support teachers develop ways of working together in an effort to improve the literacy and numeracy levels of their students. Traditionally teachers have worked in isolation and therefore ‘collaboration’ as a practice has been slow to permeate the historically embedded assumption of how a teacher should work. This study aims to answer the following questions: (1) What are the dynamics of teacher collaboration in disadvantaged urban primary schools? (2) In what ways are teacher collaboration and teacher learning related? (3) In what ways does teacher collaboration influence students’ opportunities for learning? In answering these research questions, this study aims to contribute to the body of knowledge pertaining to teacher learning through collaboration. Though current policy and literature advocate and make a case for the development of collaborative teaching practices, key studies have identified gaps in the research literature in relation to the impact of teacher collaboration in schools. This study seeks to address some of those gaps by establishing how schools develop a collaborative environment and how teaching practices are enacted in such a setting. It seeks to determine what skills, relationships, structures and conditions are most important in developing collaborative environments that foster the development of professional learning communities (PLCs). This study uses a mixed-method research design involving a postal survey, four snapshot case studies and one in-depth case study in an effort to establish whether collaborative practice is a feasible practice resulting in worthwhile benefits for both teachers and students.

Relevance: 10.00%

Abstract:

This thesis covers the Irish House of Lords in the last two decades of its life. A number of important themes run through the work: the regency crisis, patronage, the management of the Lords, and the relationship between the Lords and Commons. These themes, explored from different angles, are vital to an understanding of the political role of the upper house in the 1780s and 1790s. This study is confined to the Lords as a political institution, and thus its judicial role as final court of appeal, which was restored to it in 1782, will not be explored here. The thesis consists of two parts. Part one examines the structure and powers of the House of Lords, while part two looks at the parties and policies of the house. Chapter one discusses the British constitution as imposed upon Ireland. Chapter two suggests the reasons why constitutional changes were introduced in 1782, and looks at the contribution made by the Irish House of Lords in securing these changes. Chapter three explores the various channels of influence which the peers enjoyed. Chapter four explores the sometimes tense relationship between Lords and Commons. Chapter five examines management of the House of Lords by Dublin Castle. Part two begins at chapter six. This chapter explores the leadership of both parties within the Lords. Chapter seven looks at how patronage was used to reward those who were loyal to the government. Chapter eight explores the influence of the Whig opposition. Chapter nine looks at the controversial attempts made by Pitt and his ministry during the 1790s to win the support of Catholics and turn them from the lure of French ideas, and at the response of the peers to these attempts. Chapter ten is concerned with the relationship between the peers of the House of Lords and the lords lieutenant during the 1790s.
Chapter eleven looks at the Union and the House of Lords and attempts to answer the question historians have long asked: why did the Irish parliament, and the House of Lords in particular, look favourably on the proposed union of the two kingdoms and the end of their own institution? The House of Lords in the closing decades of the eighteenth century was an institution within which the wealth and power of the kingdom could be found. Its members were politically active, both inside and outside the house. It contained a majority who saw the Crown as the source of stability, but it was a living and evolving political organism and therefore it contained men who believed that the Crown should have its influence limited. This evolution is also demonstrated in its desire for political change in 1782 and 1788. Its last, and perhaps most radical, decision was to vote for its own demise in 1800.

Relevance: 10.00%

Abstract:

With the proliferation of mobile wireless communication and embedded systems, energy efficiency becomes a major design constraint. The dissipated energy is often referred to as the product of power dissipation and the input-output delay. Most electronic design automation techniques focus on optimising only one of these parameters, either power or delay. Industry standard design flows integrate systematic methods of optimising either area or timing, while for power consumption optimisation one often employs heuristics which are characteristic of a specific design. In this work we answer three questions in our quest to provide a systematic approach to joint power and delay optimisation. The first question of our research is: How to build a design flow which incorporates academic and industry standard design flows for power optimisation? To address this question, we use a reference design flow provided by Synopsys and integrate academic tools and methodologies into this flow. The proposed design flow is used as a platform for analysing some novel algorithms and methodologies for optimisation in the context of digital circuits. The second question we answer is: Is it possible to apply a systematic approach to power optimisation in the context of combinational digital circuits? The starting point is the selection of a suitable data structure which can easily incorporate information about delay, power, and area, and which then allows optimisation algorithms to be applied. In particular we address the implications of a systematic power optimisation methodology and the potential degradation of other (often conflicting) parameters such as area or the delay of the implementation. Finally, the third question which this thesis attempts to answer is: Is there a systematic approach for multi-objective optimisation of delay and power? A delay-driven power and power-driven delay optimisation is proposed in order to obtain balanced delay and power values.
This implies that each power optimisation step is constrained not only by the decrease in power but also by the increase in delay. Similarly, each delay optimisation step is governed not only by the decrease in delay but also by the increase in power. The goal is multi-objective optimisation of digital circuits where the two conflicting objectives are power and delay. The logic synthesis and optimisation methodology is based on AND-Inverter Graphs (AIGs), which represent the functionality of the circuit. The switching activities and arrival times of circuit nodes are annotated onto an AND-Inverter Graph under a zero-delay and a non-zero-delay model. We then introduce several reordering rules which are applied to the AIG nodes to minimise the switching power or the longest path delay of the circuit at the pre-technology-mapping level. The academic Electronic Design Automation (EDA) tool ABC is used for the manipulation of AND-Inverter Graphs. We have implemented various combinatorial optimisation algorithms often used in Electronic Design Automation, such as Simulated Annealing and Uniform Cost Search. Simulated Annealing (SA) is a probabilistic metaheuristic for the global optimisation problem of locating a good approximation to the global optimum of a given function in a large search space. We used SA to decide probabilistically between moving from one optimised solution to another, such that the dynamic power is optimised under given delay constraints and the delay is optimised under given power constraints. A good approximation to the globally optimal solution under the energy constraint is obtained. Uniform Cost Search (UCS) is an algorithm for traversing or searching a weighted tree or graph. We have used UCS to search within the AIG network for a specific AIG node order for the application of the reordering rules.
After the reordering rules are applied, the AIG network is mapped to an AIG netlist using specific library cells. Our approach combines network restructuring, AIG node reordering, dynamic power and longest path delay estimation and optimisation, and finally technology mapping to an AIG netlist. A set of MCNC benchmark circuits and large combinational circuits of up to 100,000 gates have been used to validate our methodology. Comparisons for power and delay optimisation are made with the best synthesis scripts used in ABC. Reductions of 23% in power and 15% in delay are achieved with minimal overhead, compared to the best known ABC results. Our approach has also been applied to a number of processors with combinational and sequential components, and significant savings are achieved.
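The delay-constrained power optimisation loop described above can be sketched generically. This is an illustrative skeleton only: the thesis operates on AIG node reorderings inside ABC, whereas here `neighbour`, `power`, and `delay` are user-supplied stand-ins, and the geometric cooling schedule is our assumption:

```python
import math
import random

def anneal_power(initial, neighbour, power, delay, delay_limit,
                 t0=1.0, cooling=0.95, steps=500, seed=0):
    """Simulated-annealing sketch: minimise power(state) subject to
    delay(state) <= delay_limit, with Metropolis acceptance of uphill moves."""
    rng = random.Random(seed)
    state, t = initial, t0
    best = state
    for _ in range(steps):
        cand = neighbour(state, rng)
        if delay(cand) > delay_limit:      # reject constraint violations outright
            continue
        dp = power(cand) - power(state)
        if dp <= 0 or rng.random() < math.exp(-dp / t):
            state = cand
            if power(state) < power(best):
                best = state
        t *= cooling                       # geometric cooling schedule
    return best
```

Swapping the roles of `power` and `delay` gives the complementary power-constrained delay optimisation, which is how the two passes can be alternated to balance the two objectives.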

Relevance: 10.00%

Abstract:

The aim of this dissertation is to revive the 19th-century thinker Max Stirner’s thought through a critical reexamination of his mistaken legacy as a ‘political’ thinker. The reading of Stirner that I present is one of an ontological thinker, spurred on as much—if not more—by the contents of Hegel’s Phenomenology of Spirit as it is the radical roots that Hegel unintentionally planted. In the first chapter, the role of language in Stirner’s thought is examined, and the problems to which his conception of language seem to give rise are addressed. The second chapter looks at Stirner’s purportedly ‘anarchistic’ politics and finds the ‘anarchist’ reading of Stirner misguided. Rather than being a ‘political’ anarchist, it is argued that we ought to understand Stirner as advocating a sort of ‘ontological’ anarchism in which the very existence of authority is questioned. In the third chapter, I look at the political ramifications of Stirner’s ontology as well as the critique of liberalism contained within it, and argue that the politics implicit in his philosophy shares more in common with the tradition of political realism than it does anarchism. The fourth chapter is dedicated to an examination of Stirner’s anti-humanism, which is concluded to be much different than the ‘anti-humanisms’ associated with other, more famous thinkers, such as Foucault and Heidegger. In the fifth and final chapter, I provide an answer to the question(s) of how, if, and to what extent Friedrich Nietzsche was influenced by Stirner. It is concluded that the complete lack of evidence that Nietzsche ever read Stirner is proof enough to dismiss accusations of plagiarism on Nietzsche’s part, thus emphasizing the originality and singularity of both thinkers.