943 results for resource-use efficiency
Abstract:
ABSTRACT This thesis examines whether there is a discrepancy between how the literature defines conservation, preservation, and restoration, and how natural resource professionals define these terms. Interviews were conducted with six professionals from six different agencies that deal with natural resources, including both government and non-government groups. In addition to being asked how they define the terms, the professionals were asked where their work fits into the context of these terms. The interviewees' responses were then compared with the literature to identify inconsistencies between the use of these terms in the literature and in real-world settings. The literature and the interviewees agreed on the term conservation. There are differing points of view about preservation: some see it as 'no management,' while others see it as keeping things the same, or 'static.' For restoration, both the literature and the professionals described moving an ecosystem from one point of succession, or community, to another point on a continuum. The only point of disagreement is the final goal of a restoration project: the literature suggests restoring the ecosystem to a past historic condition, whereas the interviewees aim to restore it, to the best of their abilities, to a functioning ecosystem.
Abstract:
Suppliers of water and energy are frequently natural monopolies, with their pricing regulated by governmental agencies. Pricing schemes are evaluated by the efficiency of the resource allocation they lead to, the capacity of the utilities to recover their costs, and the distributional effects of the policies, in particular impacts on the poor. One pricing approach has been average cost pricing, which guarantees cost recovery and allows utilities to provide their product at relatively low prices. However, average cost pricing leads to economically inefficient consumption levels when sources of water and energy are limited and increasing the supply is costly. An alternative approach is increasing block rates (hereafter, IBR or tiered pricing), where individuals pay a low rate for an initial consumption block and a higher rate as they increase use beyond that block. Figure 1 shows an example IBR rate structure for residential water use. With the rates in Figure 1, a household would be charged $0.46 and $0.71 per hundred gallons for consumption below and above 21,000 gallons per month, respectively.
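As a rough numerical illustration of the IBR scheme described above, the sketch below applies the quoted $0.46/$0.71 per-hundred-gallon rates around the 21,000-gallon block boundary; the function name and the example consumption figure are hypothetical, introduced only for illustration.

```python
def ibr_water_bill(gallons, block_limit=21_000, low_rate=0.46, high_rate=0.71):
    """Monthly charge under a two-tier increasing block rate (IBR).

    Rates are dollars per hundred gallons, as quoted in the abstract:
    $0.46 up to the 21,000-gallon block and $0.71 beyond it.
    """
    low_use = min(gallons, block_limit)          # gallons billed at the low rate
    high_use = max(gallons - block_limit, 0)     # gallons billed at the high rate
    return (low_use * low_rate + high_use * high_rate) / 100

# A household using 30,000 gallons in a month:
# 210 * $0.46 + 90 * $0.71 = $96.60 + $63.90 = $160.50
print(ibr_water_bill(30_000))
```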
Abstract:
Chimpanzees have been the traditional referential models for investigating human evolution and stone tool use by hominins. We enlarge this comparative scenario by describing normative use of hammer stones and anvils in two wild groups of bearded capuchin monkeys (Cebus libidinosus) over one year. We found that most of the individuals habitually use stones and anvils to crack nuts and other encased food items. Further, we found that in adults (1) males use stone tools more frequently than females, (2) males crack high-resistance nuts more frequently than females, (3) efficiency at opening a food by percussive tool use varies according to the resistance of the encased food, (4) heavier individuals are more efficient at cracking high-resistance nuts than smaller individuals, and (5) to crack open encased foods, both sexes select hammer stones on the basis of material and weight. These findings confirm and extend previous experimental evidence concerning tool selectivity in wild capuchin monkeys (Visalberghi et al., 2009b; Fragaszy et al., 2010b). Male capuchins use tools more frequently than females, and body mass is the best predictor of efficiency, but the sexes do not differ in terms of efficiency. We argue that the contrasting pattern of sex differences in capuchins compared with chimpanzees, in which females use tools more frequently and more skillfully than males, may have arisen from the degree of sexual dimorphism in body size of the two species, which is larger in capuchins than in chimpanzees. Our findings show the importance of taking sex and body mass into account as separate variables to assess their role in tool use.
Abstract:
AIMS: To compare the gender distribution of HIV-infected adults receiving highly active antiretroviral treatment (HAART) in resource-constrained settings with estimates of the gender distribution of HIV infection, and to describe the clinical characteristics of women and men receiving HAART. METHODS: The Antiretroviral Therapy in Lower-Income Countries (ART-LINC) Collaboration is a network of clinics providing HAART in Africa, Latin America, and Asia. We compared UNAIDS data on the gender distribution of HIV infection with the proportions of women and men receiving HAART in the ART-LINC Collaboration. RESULTS: Twenty-nine centers in 13 countries participated. Among 33,164 individuals, 19,989 (60.3%) were women. Proportions of women receiving HAART in ART-LINC centers were similar to, or higher than, UNAIDS estimates of the proportions of HIV-infected women in all but two centers. There were fewer women receiving HAART than expected from UNAIDS data in one center in Uganda and one center in India. Taking into account heterogeneity across cohorts, women were younger than men, less likely to have advanced HIV infection, and more likely to be anemic at HAART initiation. CONCLUSIONS: Women in resource-constrained settings are not necessarily disadvantaged in their access to HAART. More attention needs to be paid to ensuring that HIV-infected men are seeking care and starting HAART.
Abstract:
The Swiss Consultant Trust Fund (CTF) support covered the period from July to December 2007 and comprised four main tasks: (1) Analysis of historic land degradation trends in the four watersheds of Zerafshan, Surkhob, Toirsu, and Vanj; (2) Translation of standard CDE GIS training materials into Russian and Tajik to enable local government staff and other specialists to use geospatial data and tools; (3) Demonstration of geospatial tools that show land degradation trends associated with land use and vegetative cover data in the project areas; and (4) Preliminary training of government staff in using appropriate data, including existing information, global datasets, inexpensive satellite imagery and other datasets, and web-based visualization tools such as spatial data viewers. The project made it possible to build local awareness of, and skills in, up-to-date, inexpensive, easy-to-use GIS technologies, data sources, and applications relevant to natural resource management, and especially to sustainable land management. In addition to supporting the implementation of the World Bank technical assistance activity to build capacity in the use of geospatial tools for natural resource management, the Swiss CTF support also aimed at complementing the Bank's supervision work on the ongoing Community Agriculture and Watershed Management Project (CAWMP).
Abstract:
The Centre for Development and Environment (CDE) has been contracted by the World Bank Group to conduct a program on capacity development in the use of geospatial tools for natural resource management in Tajikistan. The program aimed to help improve natural resource management by fostering the use of geospatial tools among governmental and non-governmental institutions in Tajikistan. For this purpose, a database including a Geographic Information System (GIS) has been prepared, which combines spatial data on various sectors for case study analysis related to the Community Agriculture and Watershed Management Project (CAWMP). The inception report is based on the findings resulting from the Swiss Consultant Trust Fund (CTF) financed project, specifically on the experiences from the awareness creation and training workshop conducted in Dushanbe in November 2007 and the analysis of historical land degradation trends carried out for the four CAWMP watersheds. Furthermore, recommendations from the inception mission of CDE to Tajikistan (5-20 August 2007) and from the inception report for the Swiss CTF support were also considered. The inception report for the BNWPP (Bank-Netherlands Water Partnership Program) project discusses the following project-relevant issues: (1) a preliminary list of additional data layers, types of data analysis, and audiences to be covered by the BNWPP grant; (2) an assessment of the skills and equipment already available within Tajikistan, and the implications for the training program and specific equipment procurement plans; (3) an updated detailed schedule and plans for all activities to be financed by the BNWPP grant; and (4) a proposed list of contents for the final report and web-based presentations.
Abstract:
AIMS While zebrafish embryos are amenable to in vivo imaging, allowing the study of morphogenetic processes during development, intravital imaging of adults is hampered by their small size and loss of transparency. The use of adult zebrafish as a vertebrate model of cardiac disease and regeneration is growing rapidly. It is therefore of great importance to establish appropriate and robust methods to measure cardiac function parameters. METHODS AND RESULTS Here we describe the use of 2D-echocardiography to study the fractional volume shortening and segmental wall motion of the ventricle. Our data show that 2D-echocardiography can be used to evaluate cardiac injury and also to study recovery of cardiac function. Interestingly, our results show that while global systolic function recovered following cardiac cryoinjury, ventricular wall motion was only partially restored. CONCLUSION Cryoinjury leads to long-lasting impairment of cardiac contraction, partially mimicking the consequences of myocardial infarction in humans. Functional assessment of heart regeneration by echocardiography allows a deeper understanding of the mechanisms of cardiac regeneration and has the advantage of being easily transferable to other cardiovascular zebrafish disease models.
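The abstract does not spell out how fractional volume shortening is derived from the 2D images; as a hedged sketch, the conventional echocardiographic definition based on end-diastolic and end-systolic ventricular volumes (or areas) would look like the following, with purely hypothetical example values.

```python
def fractional_volume_shortening(edv, esv):
    """Fractional volume shortening (%) from end-diastolic (EDV) and
    end-systolic (ESV) ventricular volumes traced on 2D echo images:
    FVS = (EDV - ESV) / EDV * 100
    """
    return (edv - esv) / edv * 100

# Hypothetical volumes: EDV = 0.80 uL, ESV = 0.55 uL  ->  FVS of about 31%
print(fractional_volume_shortening(0.80, 0.55))
```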
Abstract:
This paper uses the metaphor of a pressure cooker to highlight how water problems in Spain are highly geographical and sectorial in nature, with some specific hotspots which raise the temperature of the whole complex water system, turning many potentially solvable water problems into 'wicked problems'. The paper discusses the tendency for water governance to be hydrocentric, when often the drivers, and in turn the 'solutions', to Spanish water problems lie outside the water sphere. The paper analyzes the current water governance system by looking at water governance as a process, with key attributes such as participation, transparency, equity and the rule of law, as well as at water governance as an outcome, by looking at the efficiency and sustainability of water use in Spain. It concludes that a deeper knowledge of the interactions between water governance as a process and as an outcome, and of their potential synergies, is needed, arguing that water governance is an inherently political process which calls for strengthening the capacity of the system by looking at the interactions of these different governance attributes.
Abstract:
With the advent of the cloud computing model, distributed caches have become the cornerstone for building scalable applications. Popular systems like Facebook [1] or Twitter use Memcached [5], a highly scalable distributed object cache, to speed up applications by avoiding database accesses. Distributed object caches assign objects to cache instances based on a hashing function, and objects are not moved from one cache instance to another unless more instances are added to the cache and objects are redistributed. This may lead to situations where some cache instances are overloaded because some of the objects they store are frequently accessed, while other cache instances are less frequently used. In this paper we propose a multi-resource load balancing algorithm for distributed cache systems. The algorithm aims at balancing both CPU and Memory resources among cache instances by redistributing stored data. Considering the possible conflict of balancing multiple resources at the same time, we give CPU and Memory resources weighted priorities based on the runtime load distributions: a scarcer resource is given a higher weight than a less scarce resource when load balancing. The system imbalance degree is evaluated based on monitoring information and on the utility load of a node, a unit of resource consumption. In addition, since continuous rebalancing of the system may affect the QoS of applications using the cache system, our data selection policy ensures that each data migration minimizes the system imbalance degree, so that the total reconfiguration cost is minimized. An extensive simulation is conducted to compare our policy with other policies. Our policy shows a significant improvement in time efficiency and a decrease in reconfiguration cost.
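The abstract does not give the weighting or imbalance formulas, so the sketch below only illustrates the general idea: each node gets a 'utility load' that is a weighted sum of its CPU and Memory utilization, the scarcer resource receives the higher weight, and the imbalance degree is taken here as the spread of those loads across nodes. All formulas, names, and values are illustrative assumptions, not the authors' definitions.

```python
from statistics import mean, pstdev

def utility_loads(cpu_util, mem_util):
    """Per-node 'utility load': a weighted sum of CPU and Memory utilization
    in which the scarcer (more heavily used) resource gets the higher weight."""
    w_cpu, w_mem = mean(cpu_util), mean(mem_util)   # average utilization as a scarcity proxy
    total = w_cpu + w_mem
    w_cpu, w_mem = w_cpu / total, w_mem / total     # normalize weights to sum to 1
    return [w_cpu * c + w_mem * m for c, m in zip(cpu_util, mem_util)]

def imbalance_degree(cpu_util, mem_util):
    """System imbalance degree, taken here as the spread of utility loads."""
    return pstdev(utility_loads(cpu_util, mem_util))

# Three cache instances; Memory is the scarcer resource, so it dominates the metric.
cpu = [0.30, 0.35, 0.40]
mem = [0.90, 0.50, 0.55]
print(imbalance_degree(cpu, mem))
```

A data selection policy in this spirit would then evaluate candidate migrations and pick the one that most reduces the imbalance degree, keeping the number of moves, and hence the reconfiguration cost, low.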
Abstract:
Biomass has always been associated with the development of the population of the Canary Islands: it was the first source of energy available in the archipelago and the main cause of deforestation, and over the years it has been replaced by fossil fuels. The Canary Islands store a large amount of energy in the form of biomass. This may be important on a small scale for the design of small power plants using similar fuels from agricultural activities, and these plants could supply rural areas, giving them energy self-sufficiency. The difficulty for the Canary Islands in achieving this is ensuring supply to the consumption centers or power plants, which must operate continuously for greater efficiency and therefore need a resource that is available with regularity, of adequate quality, and at an acceptable cost. The Canary Islands also have a unique topography, with very rugged terrain that makes the resource more difficult and significantly more expensive to use. This work studies all these aspects and presents conclusions, lines of action, and theoretical potentials.
Abstract:
Due to the huge increase in digital data volumes in recent years, a new parallel computing paradigm has arisen to process big data efficiently. Many of the systems based on this paradigm, also called data-intensive computing systems, follow the Google MapReduce programming model. The main advantage of MapReduce systems is the idea of sending the computation to where the data resides, aiming to provide scalability and efficiency. In failure-free scenarios, these frameworks usually achieve good results. However, most of the scenarios in which they are used are characterized by the presence of failures, so these platforms usually incorporate fault tolerance and dependability techniques as built-in features. On the other hand, dependability improvements are known to imply additional resource costs. This is reasonable, and the providers offering these infrastructures are aware of it. Nevertheless, not all approaches provide the same trade-off between fault-tolerance capabilities (or, more generally, reliability capabilities) and cost. This thesis addresses the coexistence of reliability and resource efficiency in MapReduce-based systems, through methodologies that introduce minimal cost while guaranteeing an appropriate level of reliability. To achieve this, we have proposed: (i) a formalization of a failure detector abstraction; (ii) an alternative solution to the single points of failure of these frameworks; and finally (iii) a novel feedback-based resource allocation system at the container level. These generic contributions have been evaluated on the Hadoop YARN architecture, which is nowadays the reference framework in the data-intensive computing systems community. The thesis demonstrates how all these contributions outperform Hadoop YARN in terms of both reliability and resource efficiency.
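The thesis's own formalization of the failure detector abstraction is not reproduced in the abstract; as a generic, hedged illustration of what such an abstraction typically exposes, a minimal heartbeat-and-timeout detector might look like the sketch below (class and method names are invented for this example).

```python
import time

class TimeoutFailureDetector:
    """Heartbeat-and-timeout failure detector: a node is suspected once its
    last heartbeat is older than the timeout. A generic textbook sketch,
    not the formalization proposed in the thesis."""

    def __init__(self, timeout_s=10.0):
        self.timeout_s = timeout_s
        self.last_heartbeat = {}  # node id -> time of last heartbeat

    def heartbeat(self, node_id):
        """Record a heartbeat received from a node."""
        self.last_heartbeat[node_id] = time.monotonic()

    def suspected(self):
        """Return the set of nodes whose heartbeats have timed out."""
        now = time.monotonic()
        return {n for n, t in self.last_heartbeat.items()
                if now - t > self.timeout_s}

# Usage: workers report heartbeats; a master queries the current suspects.
fd = TimeoutFailureDetector(timeout_s=5.0)
fd.heartbeat("worker-1")
fd.heartbeat("worker-2")
print(fd.suspected())  # empty as long as heartbeats keep arriving
```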
Abstract:
No abstract.