914 results for Resource use


Relevance: 30.00%

Publisher:

Abstract:

This work presents exact, hybrid algorithms for mixed resource allocation and scheduling problems; in general terms, these consist in assigning finite-capacity resources over time to a set of precedence-connected activities. The proposed methods have broad applicability but are mainly motivated by applications in the field of embedded system design. In particular, high-performance embedded computing has recently witnessed the shift from single-CPU platforms with application-specific accelerators to programmable Multi-Processor Systems-on-Chip (MPSoCs). These allow higher flexibility, real-time performance and low energy consumption, but the programmer must be able to effectively exploit the platform parallelism. This raises interest in the development of algorithmic techniques to be embedded in CAD tools; in particular, given a specific application and platform, the objective is to perform optimal allocation of hardware resources and to compute an execution schedule. In this regard, since embedded systems tend to run the same set of applications for their entire lifetime, off-line, exact optimization approaches are particularly appealing. Quite surprisingly, the use of exact algorithms has not been well investigated so far; this is in part explained by the complexity of integrated allocation and scheduling, which sets tough challenges for "pure" combinatorial methods. Hybrid CP/OR approaches present the opportunity to exploit the mutual advantages of different methods while compensating for their weaknesses. In this work, we first consider an allocation and scheduling problem over the Cell BE processor by Sony, IBM and Toshiba; we propose three different solution methods, leveraging decomposition, cut generation and heuristic-guided search.
Next, we address allocation and scheduling of so-called Conditional Task Graphs, explicitly accounting for branches whose outcome is not known at design time; we extend the CP scheduling framework to deal effectively with the introduced stochastic elements. Finally, we address allocation and scheduling with uncertain, bounded execution times via conflict-based tree search; we introduce a simple and flexible time model to account for duration variability and provide an efficient conflict detection method. The proposed approaches achieve good results on practical-size problems, demonstrating that the use of exact approaches for system design is feasible. Furthermore, the developed techniques bring significant contributions to combinatorial optimization methods.
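The core problem the abstract describes, assigning precedence-connected activities to finite-capacity resources over time, can be illustrated with a toy greedy heuristic. This is only a sketch of the problem with invented task data; the thesis itself uses exact hybrid CP/OR methods, not this heuristic.

```python
# A minimal sketch (not the thesis's algorithms): schedule
# precedence-connected tasks on a finite pool of identical processors
# using a greedy list-scheduling heuristic.

def list_schedule(durations, preds, n_processors):
    """Return {task: start_time} for a precedence-feasible greedy schedule."""
    finish = {}
    busy_until = [0] * n_processors        # next free time per processor
    remaining = set(durations)
    while remaining:
        # tasks whose predecessors have all been scheduled
        ready = [t for t in remaining if all(p in finish for p in preds.get(t, []))]
        t = min(ready)                     # simple deterministic priority: lowest id
        earliest = max((finish[p] for p in preds.get(t, [])), default=0)
        i = min(range(n_processors), key=lambda k: busy_until[k])
        start = max(earliest, busy_until[i])
        finish[t] = start + durations[t]
        busy_until[i] = finish[t]
        remaining.remove(t)
    return {t: finish[t] - durations[t] for t in durations}

# four tasks, two processors, precedences 0->2, 1->3, 2->3 (invented data)
durations = {0: 2, 1: 3, 2: 2, 3: 1}
preds = {2: [0], 3: [1, 2]}
starts = list_schedule(durations, preds, n_processors=2)
makespan = max(starts[t] + durations[t] for t in durations)
```

An exact method would instead prove optimality of the makespan; the heuristic above only produces some feasible schedule.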

Abstract:

Crop water requirements are important elements of food production, especially in arid and semiarid regions. These regions are experiencing increasing population growth and less water for agriculture, which amplifies the need for more efficient irrigation. Improved water use efficiency is needed to produce more food while conserving water as a limited natural resource. Evaporation (E) from bare soil and transpiration (T) from plants are considered a critical part of the global water cycle, and in recent decades climate change could lead to increased E and T. Because energy is required to break hydrogen bonds and vaporize water, water and energy balances are closely connected. The soil water balance is also linked with water vapour losses to evapotranspiration (ET), which depend mainly on the energy balance at the Earth's surface. This work addresses the role of evapotranspiration in water use efficiency by developing a mathematical model that improves the accuracy of crop evapotranspiration calculation, accounting for the effects of weather conditions, e.g. wind speed and humidity, on crop coefficients, which relate crop evapotranspiration to reference evapotranspiration. The ability to partition ET into its evaporation and transpiration components will help irrigation managers find ways to improve water use efficiency by decreasing the ratio of evaporation to transpiration. The developed crop coefficient model will improve both irrigation scheduling and water resources planning in response to future climate change, which can improve world food production and water use efficiency in agriculture.
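The weather dependence of crop coefficients that the abstract refers to is conventionally handled with the FAO-56 adjustment for wind speed and minimum relative humidity. The sketch below shows that standard published formula, not the author's improved model, with illustrative input values:

```python
# Sketch of the standard FAO-56 climate adjustment for mid/late-season
# crop coefficients (Allen et al., 1998). The input values below are
# illustrative, not taken from the study.

def kc_adjusted(kc_tab, u2, rh_min, h):
    """Adjust a tabulated Kc for local climate.

    kc_tab : tabulated crop coefficient (valid for u2 = 2 m/s, RHmin = 45%)
    u2     : mean daily wind speed at 2 m height (m/s)
    rh_min : mean daily minimum relative humidity (%)
    h      : mean plant height (m)
    """
    return kc_tab + (0.04 * (u2 - 2) - 0.004 * (rh_min - 45)) * (h / 3) ** 0.3

def crop_et(kc, et0):
    """Crop evapotranspiration ETc = Kc * ET0 (mm/day)."""
    return kc * et0

# windier and drier than the reference climate -> larger Kc, hence larger ETc
kc = kc_adjusted(kc_tab=1.15, u2=4.0, rh_min=30.0, h=1.0)
etc = crop_et(kc, et0=6.0)
```

At the reference climate (u2 = 2 m/s, RHmin = 45%) the adjustment vanishes and the tabulated value is returned unchanged.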

Abstract:

Context Long-term antiretroviral therapy (ART) use in resource-limited countries leads to increasing numbers of patients with HIV taking second-line therapy. Limited access to further therapeutic options makes essential the evaluation of second-line regimen efficacy in these settings. Objectives To investigate failure rates in patients receiving second-line therapy and factors associated with failure and death. Design, Setting, and Participants Multicohort study of 632 patients >14 years old receiving second-line therapy for more than 6 months in 27 ART programs in Africa and Asia between January 2001 and October 2008. Main Outcome Measures Clinical, immunological, virological, and immunovirological failure (first diagnosed episode of immunological or virological failure) rates, and mortality after 6 months of second-line therapy use. Sensitivity analyses were performed using alternative CD4 cell count thresholds for immunological and immunovirological definitions of failure and for cohort attrition instead of death. Results The 632 patients provided 740.7 person-years of follow-up; 119 (18.8%) met World Health Organization failure criteria after a median 11.9 months following the start of second-line therapy (interquartile range [IQR], 8.7-17.0 months), and 34 (5.4%) died after a median 15.1 months (IQR, 11.9-25.7 months). Failure rates were lower in those who changed 2 nucleoside reverse transcriptase inhibitors (NRTIs) instead of 1 (179.2 vs 251.6 per 1000 person-years; incidence rate ratio [IRR], 0.64; 95% confidence interval [CI], 0.42-0.96), and higher in those with lowest adherence index (383.5 vs 176.0 per 1000 person-years; IRR, 3.14; 95% CI, 1.67-5.90 for <80% vs ≥95% [percentage adherent, as represented by percentage of appointments attended with no delay]). 
Failure rates increased with lower CD4 cell counts when second-line therapy was started, from 156.3 vs 96.2 per 1000 person-years; IRR, 1.59 (95% CI, 0.78-3.25) for 100 to 199/μL to 336.8 per 1000 person-years; IRR, 3.32 (95% CI, 1.81-6.08) for less than 50/μL vs 200/μL or higher; and decreased with time using second-line therapy, from 250.0 vs 123.2 per 1000 person-years; IRR, 1.90 (95% CI, 1.19-3.02) for 6 to 11 months to 212.0 per 1000 person-years; 1.71 (95% CI, 1.01-2.88) for 12 to 17 months vs 18 or more months. Mortality for those taking second-line therapy was lower in women (32.4 vs 68.3 per 1000 person-years; hazard ratio [HR], 0.45; 95% CI, 0.23-0.91); and higher in patients with treatment failure of any type (91.9 vs 28.1 per 1000 person-years; HR, 2.83; 95% CI, 1.38-5.80). Sensitivity analyses showed similar results. Conclusions Among patients in Africa and Asia receiving second-line therapy for HIV, treatment failure was associated with low CD4 cell counts at second-line therapy start, use of suboptimal second-line regimens, and poor adherence. Mortality was associated with diagnosed treatment failure.
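The rate arithmetic behind these figures is straightforward: events divided by person-years of follow-up, scaled to 1000, and ratios of two such rates. The sketch below reproduces the cohort's overall failure rate; note that the IRRs reported in the abstract are model-adjusted, so a crude ratio of the quoted rates differs slightly from them.

```python
# Sketch of incidence-rate arithmetic (not the study's statistical model).

def rate_per_1000_py(events, person_years):
    """Incidence rate expressed per 1000 person-years."""
    return 1000 * events / person_years

def crude_irr(rate_exposed, rate_reference):
    """Crude ratio of two incidence rates given in the same units."""
    return rate_exposed / rate_reference

# 119 WHO-criteria failures over 740.7 person-years (figures from the abstract)
overall_failure_rate = rate_per_1000_py(119, 740.7)

# changed 2 NRTIs vs 1 NRTI: crude ratio of the quoted rates; the paper
# reports an adjusted IRR of 0.64 for this comparison
irr_2nrti = crude_irr(179.2, 251.6)
```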

Abstract:

In-stream structures including cross-vanes, J-hooks, rock vanes, and W-weirs are widely used in river restoration to limit bank erosion, prevent changes in channel gradient, and improve aquatic habitat. During this investigation, a rapid assessment protocol was combined with post-project monitoring data to assess factors influencing the performance of more than 558 in-stream structures and rootwads in North Carolina. Cross-sectional survey data examined for 221 cross sections from 26 sites showed that channel adjustments were highly variable from site to site, but approximately 60% of the sites underwent at least a 20% net change in channel capacity. Evaluation of in-stream structures ranging from 1 to 8 years in age showed that about half of the structures were impaired at 10 of the 26 sites. Major structural damage was often associated with floods of low to moderate frequency and magnitude. Failure mechanisms varied between sites and structure types, but included: (1) erosion of the channel bed and banks (outflanking); (2) movement of rock materials during floods; and (3) burial of the structures in the channel bed. Sites with reconstructed channels that exhibited large changes in channel capacity had the highest rates of structural impairment, suggesting that channel adjustments between structures led to the degradation of their function. The data call into question whether currently used in-stream structures are capable of stabilizing reconfigured channels for even short periods when applied to dynamic rivers.

Abstract:

Pasture use in the Kyrgyz Republic has changed significantly as a result of fundamental political, economic, and societal changes following the collapse of the Soviet Union and the subsequent changes in people’s livelihoods. Government institutions criticize current land use patterns as unsustainable and the cause of degradation. But at the local level, pasture quality is rarely seen as a major problem. This article uses a qualitative approach to examine the tension between these views and addresses current land use practices and related narratives about pasture degradation in rural Kyrgyzstan. By focusing on meanings ascribed to pastures, it shows how people closely relate current practices to the experiences and value systems of the Soviet period and to changing identities emerging in the post-Soviet transformation process. It argues that proper understanding of resource degradation issues requires adequate consideration of the context of meaning constructed by local resource users when they make sense of their environment.

Abstract:

There is an inequality in resource utilization among acute psychiatric in-patients. About 20-30% of them absorb 60-80% of the total resources allocated to this form of treatment. This study intends to summarize findings related to heavy in-patient service use and to illustrate them by means of utilization data for acute psychiatric wards.
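The concentration claim, that 20-30% of patients absorb 60-80% of resources, can be checked directly from utilization data by ranking patients by resource use and summing the share consumed by the top fraction. A minimal sketch, with invented bed-day counts:

```python
# Sketch of a concentration check on in-patient utilization data.
# The bed-day figures below are invented for illustration.

def top_share(usage, top_fraction):
    """Share of total use attributable to the top `top_fraction` of users."""
    ranked = sorted(usage, reverse=True)
    k = max(1, round(top_fraction * len(ranked)))
    return sum(ranked[:k]) / sum(ranked)

bed_days = [300, 250, 180, 40, 35, 30, 25, 20, 15, 10]  # hypothetical patients
share = top_share(bed_days, 0.3)   # share of bed-days used by the heaviest 30%
```

In this invented example the heaviest 30% of patients account for roughly 81% of bed-days, the kind of skew the abstract describes.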

Abstract:

For countless communities around the world, acquiring access to safe drinking water is a daily challenge which many organizations endeavor to meet. The villages in the interior of Suriname have been the focus of many improved drinking water projects, as most communities are without year-round access. Unfortunately, as many as 75% of the systems in Suriname fail within several years of implementation. These communities, scattered along the rivers and throughout the jungle, lack many of the resources required to sustain a centralized water treatment system. However, the centralized system in the village of Bendekonde on the Upper Suriname River has been operational for over 10 years and is often touted by other communities. The Bendekonde system is praised even though its technology does not differ significantly from that of other, failed systems. Many of the water systems in the interior fail for lack of resources available to the community to maintain the system, and typically, the more complex a system becomes, the greater its demand for additional resources. Alternatives to centralized systems include technologies such as point-of-use water filters, which can greatly reduce the need for outside resources. In particular, ceramic point-of-use water filters offer a technology that can be reasonably managed in a low-resource setting such as the interior of Suriname. This report investigates the appropriateness and effectiveness of ceramic filters constructed with local Suriname clay and compares their treatment effectiveness to that of the Bendekonde system. Results of this study showed that functional filters could be produced from Surinamese clay and that, in a controlled laboratory setting, they were more effective at removing total coliform than the field performance of the Bendekonde system. However, the Bendekonde system was more successful at removing E. coli.
In a life-cycle assessment, ceramic water filters manufactured in Suriname and used in homes for a lifespan of 2 years were shown to have lower cumulative energy demand, as well as lower global warming potential than a centralized system similar to that used in Bendekonde.
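Treatment effectiveness comparisons like the one above are conventionally expressed as log reduction values (LRV), the base-10 logarithm of the influent-to-effluent count ratio. A minimal sketch with illustrative counts, not the study's measurements:

```python
# Sketch of the standard log-reduction metric for filter performance
# on indicator organisms. Counts below are illustrative only.
import math

def log_reduction(influent_cfu, effluent_cfu):
    """Log10 reduction value (LRV) between influent and effluent counts."""
    return math.log10(influent_cfu / effluent_cfu)

lrv = log_reduction(100_000, 100)   # 99.9% removal corresponds to LRV 3
```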

Abstract:

Peru is a developing country with abundant fresh water resources, yet the lack of infrastructure leaves much of the population without access to safe water for domestic uses. The author of this report was a Peace Corps Volunteer in the sector of water & sanitation in the district of Independencia, Ica, Peru. Independencia is located in the arid coastal region of the country, receiving on average 15 mm of rain annually. The water source for this district comes from the Pisco River, originating in the Andean highlands and outflowing into the Pacific Ocean near the town of Pisco, Peru. The objectives of this report are to assess the water supply and sanitation practices, model the existing water distribution system, and make recommendations for future expansion of the distribution system in the district of Independencia, Peru. The assessment of water supply will be based on the results from community surveys done in the district of Independencia, water quality testing done by a detachment of the U.S. Navy, as well as on the results of a hydraulic model built in EPANET 2.0 to represent the distribution system. Sanitation practice assessments will be based on the surveys as well as observations from the author while living in Peru. Recommendations for system expansions will be made based on results from the EPANET model and the municipality’s technical report for the existing distribution system. Household water use and sanitation surveys were conducted with 84 families in the district revealing that upwards of 85% store their domestic water in regularly washed containers with lids. Over 80% of those surveyed are drinking water that is treated, mostly boiled. Of those surveyed, over 95% reported washing their hands and over 60% mentioned at least one critical time for hand washing when asked for specific instances. From the surveys, it was also discovered that over 80% of houses are properly disposing of excrement, in either latrines or septic tanks. 
Forty-three families with children five years of age or under were interviewed, and just over 18% reported that the child had had a case of diarrhea within the month preceding the interview. Finally, from the surveys it was calculated that the average water use per person per day is about 22 liters. Water quality testing carried out by a detachment of the U.S. Navy revealed that the water intended for drinking in the houses surveyed was not suitable for consumption, with a median E. coli most probable number of 47/100 ml for the 61 houses sampled. The median total coliform count was 3,000 colony-forming units per 100 ml. EPANET was used to simulate the water delivery system and evaluate its performance. EPANET is designed for continuous water delivery systems, assuming all pipes are always flowing full. To account for the intermittent nature of the system, multiple EPANET network models were created to simulate how water is routed to the different parts of the system throughout the day. The models were created from interviews with the water technicians and a map of the system created using handheld GPS units. The purpose is to analyze the performance of the water system that serves approximately 13,276 people in the district of Independencia, Peru, as well as to provide recommendations for future growth and improvement of the service level. Performance evaluation of the existing system is based on meeting 25 liters per person per day while maintaining positive pressure at all nodes in the network. Future performance is based on meeting a minimum pressure of 20 psi in the main line, as proposed by Chase (2000). The EPANET model results yield an average nodal pressure for all communities of 71 psi, with a range of 1.3 to 160 psi. Thus, if the current water delivery schedule obtained from the local municipality is followed, all communities should have sufficient pressure to deliver 25 l/p/d, with the exception of Los Rosales, which can only supply 3.25 l/p/d.
However, if the line to Los Rosales were increased from one to four inches, the system could supply this community with 25 l/p/d. The district of Independencia could greatly benefit from increasing the service level to 24-hour water delivery and a minimum of 50 l/p/d, so that communities without reliable access due to insufficient pressure would become equal beneficiaries of this invaluable resource. To evaluate the feasibility of this, EPANET was used to model the system with a range of population growth rates, system lifetimes, and demands. In order to meet a minimum pressure of 20 psi in the main line, the 6-inch-diameter main line must be enlarged and approximately two miles of trench must be excavated up to 30 feet deep. The sections of the main line that must be excavated are miles 0-1 and 1.5-2.5, and the first 3.4 miles of the main line must be increased from 6 to 16 inches, contracting to 10 inches for the remaining 5.8 miles. Doing this would allow 24-hour water delivery and provide 50 l/p/d for a range of population growth rates and system lifetimes. Improving the water delivery service is expected to reduce morbidity and mortality from diarrheal diseases by decreasing recontamination of the water during transport and household storage, and by maintaining continuous pressure in the system to prevent infiltration of contaminated groundwater. However, this expansion must be carefully planned so as not to affect aquatic ecosystems or other districts utilizing water from the Pisco River. It is recommended that stream gaging of the Pisco River and precipitation monitoring of the surrounding watershed be initiated in order to begin a hydrological study that could be integrated into the district's water resource planning. It is also recommended that the district begin routine water quality testing, with the results made available to the public.
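The reason a modest diameter increase resolves the Los Rosales pressure problem follows from the friction formulas EPANET uses: in the Hazen-Williams equation, head loss scales as the pipe diameter to the power -4.87. The sketch below uses the SI form of that formula with an illustrative flow and length, not values from the report:

```python
# Sketch of Hazen-Williams friction loss (SI form) to show why pipe
# diameter dominates: h_f = 10.67 * L * Q^1.852 / (C^1.852 * D^4.87).
# Flow, length, and roughness below are illustrative assumptions.

def hazen_williams_headloss(L_m, Q_m3s, D_m, C=150):
    """Friction head loss (m); C ~ 150 is typical for smooth plastic pipe."""
    return 10.67 * L_m * Q_m3s ** 1.852 / (C ** 1.852 * D_m ** 4.87)

INCH = 0.0254                                   # metres per inch
h1 = hazen_williams_headloss(L_m=500, Q_m3s=0.001, D_m=1 * INCH)
h4 = hazen_williams_headloss(L_m=500, Q_m3s=0.001, D_m=4 * INCH)
ratio = h1 / h4          # = 4**4.87, roughly 860-fold less head loss
```

Quadrupling the diameter at the same flow cuts friction loss by nearly three orders of magnitude, which is why the one-to-four-inch upgrade recovers enough pressure for 25 l/p/d.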

Abstract:

OBJECTIVE: To describe the electronic medical databases used in antiretroviral therapy (ART) programmes in lower-income countries and assess the measures such programmes employ to maintain and improve data quality and reduce the loss of patients to follow-up. METHODS: In 15 countries of Africa, South America and Asia, a survey was conducted from December 2006 to February 2007 on the use of electronic medical record systems in ART programmes. Patients enrolled in the sites at the time of the survey but not seen during the previous 12 months were considered lost to follow-up. The quality of the data was assessed by computing the percentage of missing key variables (age, sex, clinical stage of HIV infection, CD4+ lymphocyte count and year of ART initiation). Associations between site characteristics (such as number of staff members dedicated to data management), measures to reduce loss to follow-up (such as the presence of staff dedicated to tracing patients) and data quality and loss to follow-up were analysed using multivariate logit models. FINDINGS: Twenty-one sites that together provided ART to 50 060 patients were included (median number of patients per site: 1000; interquartile range, IQR: 72-19 320). Eighteen sites (86%) used an electronic database for medical record-keeping; 15 (83%) such sites relied on software intended for personal or small business use. The median percentage of missing data for key variables per site was 10.9% (IQR: 2.0-18.9%) and declined with training in data management (odds ratio, OR: 0.58; 95% confidence interval, CI: 0.37-0.90) and weekly hours spent by a clerk on the database per 100 patients on ART (OR: 0.95; 95% CI: 0.90-0.99). About 10 weekly hours per 100 patients on ART were required to reduce missing data for key variables to below 10%. The median percentage of patients lost to follow-up 1 year after starting ART was 8.5% (IQR: 4.2-19.7%). 
Strategies to reduce loss to follow-up included outreach teams, community-based organizations and checking death registry data. Implementation of all three strategies substantially reduced losses to follow-up (OR: 0.17; 95% CI: 0.15-0.20). CONCLUSION: The quality of the data collected and the retention of patients in ART treatment programmes are unsatisfactory for many sites involved in the scale-up of ART in resource-limited settings, mainly because of insufficient staff trained to manage data and trace patients lost to follow-up.
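The data-quality metric used above, the percentage of missing key variables, can be computed directly from patient records. A minimal sketch with invented records (the survey's actual key variables are age, sex, clinical stage, CD4+ count, and year of ART initiation):

```python
# Sketch of the missing-key-variable metric described in the abstract.
# The two records below are invented for illustration.

KEY_VARS = ("age", "sex", "clinical_stage", "cd4_count", "art_start_year")

def pct_missing(records):
    """Percent of key-variable fields missing across all records."""
    total = len(records) * len(KEY_VARS)
    missing = sum(r.get(v) is None for r in records for v in KEY_VARS)
    return 100 * missing / total

records = [
    {"age": 34, "sex": "F", "clinical_stage": 3, "cd4_count": 180, "art_start_year": 2006},
    {"age": None, "sex": "M", "clinical_stage": None, "cd4_count": 95, "art_start_year": 2007},
]
p = pct_missing(records)   # 2 of 10 key fields missing
```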

Abstract:

Water resource depletion and sanitation are growing problems around the world. One solution to both is the composting latrine, which requires no water and has been recommended by the World Health Organization as an improved sanitation technology. However, little analysis has been done on the decomposition process occurring inside the latrine, including what temperatures are reached and which variables most affect the composting process. Better knowledge of how outside variables affect composting latrines can aid development workers in deciding whether to implement the technology, and in educating users on appropriate maintenance. This report presents a full, detailed construction manual and a temperature data analysis for a double-vault composting latrine. During the author's two-year Peace Corps service in rural Paraguay, he helped build twenty-one composting latrines and took detailed temperature readings and visual observations of his personal latrine for ten months. The author also took limited temperature readings of fourteen community members' latrines over a three-month period. These data were analyzed to find correlations between compost temperatures and several variables. The two main variables found to affect compost temperature were the seasonal trend of outside temperatures, and the mixing of, and addition of moisture to, the compost. Outside seasonal temperature changes were compared to those of the compost, and a linear regression yielded an R²-value of 0.89. Mixing the compost and adding water, or a water/urine mixture, increased compost temperature 100% of the time, with seasonal temperatures determining the rate and duration of the increase. The temperature readings were also used to identify events in which certain temperatures were held long enough to achieve total pathogen destruction in the compost.
Four events were recorded in which a temperature of 122°F (50°C) was held for at least 24 hours, ensuring total pathogen destruction in that area of the compost. One event of 114.8°F (46°C) held for one week was also recorded, again ensuring total pathogen destruction. Analysis of the temperature data, however, showed that the compost reached total pathogen destruction levels during only ten percent of the data points. Because of this, the storage time recommendations outlined by the World Health Organization should be followed. The WHO recommends storing compost for 1.5-2 years in climates with ambient temperatures of 2-20°C (35-68°F), and for at least 1 year where ambient temperatures are 20-35°C (68-95°F). If these storage durations are observed, the double-vault composting latrine is an economical and achievable solution to sanitation that also conserves water resources.
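The event detection described, finding windows where a temperature threshold is held long enough for pathogen destruction, amounts to scanning the reading series for a sustained run. A sketch with hypothetical readings (the report's actual sampling scheme may differ):

```python
# Sketch of the sustained-temperature check described in the abstract:
# does any window of the series hold >= 50 C for at least 24 hours?
# Readings below are hypothetical, assumed sampled every 6 hours.

def sustained_at(readings, threshold_c, hours_needed, step_hours):
    """True if `threshold_c` is held continuously for `hours_needed`."""
    run = 0.0
    for temp in readings:
        run = run + step_hours if temp >= threshold_c else 0.0
        if run >= hours_needed:
            return True
    return False

# four consecutive 6-hour intervals at or above 50 C accumulate 24 hours
temps = [45, 48, 51, 52, 53, 51, 50, 47]
ok = sustained_at(temps, threshold_c=50, hours_needed=24, step_hours=6)
```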

Abstract:

When project managers determine schedules for resource-constrained projects, they commonly use commercial project management software packages. Which resource-allocation methods are implemented in these packages is proprietary information. The resource-allocation problem is in general computationally difficult to solve to optimality. Hence, the question arises whether and how various project management software packages differ in quality with respect to their resource-allocation capabilities. None of the few existing papers on this subject uses a sizeable data set and recent versions of common software packages. We experimentally analyze the resource-allocation capabilities of Acos Plus.1, AdeptTracker Professional, CS Project Professional, Microsoft Office Project 2007, Primavera P6, Sciforma PS8, and Turbo Project Professional. Our analysis is based on 1560 instances of the precedence- and resource-constrained project scheduling problem (RCPSP). The experiment shows that using the resource-allocation feature of these packages may lead to a project duration increase of almost 115% above the best known feasible schedule. The increase grows with increasing resource scarcity and with an increasing number of activities. We investigate the impact of different complexity scenarios and priority rules on the project duration obtained by the software packages. We provide a decision table to support managers in selecting a software package and a priority rule.
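Priority-rule-based resource allocation of the kind benchmarked here is typically realized as a serial schedule-generation scheme: repeatedly pick the highest-priority eligible activity and start it at the earliest precedence- and resource-feasible time. The sketch below shows that scheme for a single renewable resource with the shortest-processing-time rule; it is a toy illustration with invented data, not any package's proprietary method, and real RCPSP instances involve multiple resources.

```python
# Minimal serial schedule-generation scheme for a single-resource RCPSP
# with the shortest-processing-time (SPT) priority rule. Instance data
# below are invented for illustration.

def serial_sgs(durations, demands, preds, capacity):
    """Return {activity: start time}; eligible activities picked by SPT."""
    start, usage = {}, {}                   # usage[t] = capacity used at time t
    unscheduled = set(durations)
    while unscheduled:
        ready = [a for a in unscheduled if all(p in start for p in preds.get(a, []))]
        a = min(ready, key=lambda x: (durations[x], x))   # SPT, ties by id
        t = max((start[p] + durations[p] for p in preds.get(a, [])), default=0)
        # delay the start until capacity holds over the whole duration
        while any(usage.get(t + k, 0) + demands[a] > capacity
                  for k in range(durations[a])):
            t += 1
        for k in range(durations[a]):
            usage[t + k] = usage.get(t + k, 0) + demands[a]
        start[a] = t
        unscheduled.remove(a)
    return start

durations = {1: 3, 2: 2, 3: 2, 4: 1}
demands   = {1: 2, 2: 2, 3: 1, 4: 2}
preds     = {4: [1, 2]}                     # activities 1 and 2 precede 4
starts = serial_sgs(durations, demands, preds, capacity=3)
makespan = max(starts[a] + durations[a] for a in durations)
```

Swapping the priority rule (e.g. latest start time instead of SPT) changes the generated schedule, which is exactly the effect the paper's experiment measures across packages.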

Abstract:

Earth observations (EO) represent a growing and valuable resource for many scientific, research and practical applications carried out by users around the world. Access to EO data for some applications or activities, like climate change research or emergency response activities, is indispensable to their success. However, EO data, or products made from them, are often (or are claimed to be) subject to intellectual property law protection and are licensed under specific conditions regarding access and use. Restrictive conditions on data use can be prohibitive for further work with the data. The Global Earth Observation System of Systems (GEOSS) is an initiative led by the Group on Earth Observations (GEO) with the aim of providing coordinated, comprehensive, and sustained EO and information for making informed decisions in various areas beneficial to societies, their functioning and development. It seeks to share data with users world-wide with the fewest possible restrictions on their use by implementing the GEOSS Data Sharing Principles adopted by GEO. The Principles proclaim full and open exchange of data shared within GEOSS, while recognising relevant international instruments and national policies and legislation through which restrictions on the use of data may be imposed. The paper focuses on the issue of the legal interoperability of data that are shared with varying restrictions on use, with the aim of exploring options for making such data interoperable. The main question it addresses is whether the public domain or its equivalents represent the best mechanism to ensure the legal interoperability of data. To this end, the paper analyses legal protection regimes and the norms applicable to EO data.
Based on the findings, it highlights the existing public law statutory, regulatory, and policy approaches, as well as private law instruments, such as waivers, licenses and contracts, that may be used to place datasets in the public domain, or otherwise make them publicly available for use and re-use without restrictions. It uses GEOSS and its particular characteristics as a system to identify ways to reconcile the vast possibilities it provides through sharing of data from various sources and jurisdictions on the one hand, and the restrictions on the use of the shared resources on the other. On a more general level, the paper seeks to draw attention to the obstacles, and potential regulatory solutions, for sharing factual or research data for purposes that go beyond research and education.

Abstract:

OBJECTIVES In resource-constrained settings, tuberculosis (TB) is a common opportunistic infection and cause of death in HIV-infected persons. TB may be present at the start of antiretroviral therapy (ART), but it is often under-diagnosed. We describe approaches to TB diagnosis and screening in ART programs in low- and middle-income countries. METHODS AND FINDINGS We surveyed ART programs treating HIV-infected adults in sub-Saharan Africa, Asia and Latin America in 2012 using online questionnaires to collect program-level and patient-level data. Forty-seven sites from 26 countries participated. Patient-level data were collected on 987 adult TB patients from 40 sites (median age 34.7 years; 54% female). Sputum smear microscopy and chest radiography were available in 47 (100%) sites, TB culture in 44 (94%), and Xpert MTB/RIF in 23 (49%). Xpert MTB/RIF was rarely available in Central Africa and South America. In sites with access to these diagnostics, microscopy was used in 745 (76%) patients diagnosed with TB, culture in 220 (24%), and chest X-ray in 688 (70%) patients. When culture was free of charge, it was performed in 27% of patients, compared with 21% when there was a fee (p = 0.033). The corresponding percentages for Xpert MTB/RIF were 26% and 15% of patients (p = 0.001). Screening practices for active disease before starting ART included symptom screening (46 sites, 98%), chest X-ray (38, 81%), sputum microscopy (37, 79%), culture (16, 34%), and Xpert MTB/RIF (5, 11%). CONCLUSIONS Mycobacterial culture was infrequently used despite its availability at most sites, while Xpert MTB/RIF was not generally available. Use of available diagnostics was higher when they were offered free of charge.

Abstract:

Fog is a potential source of water that could be exploited using the innovative technology of fog collection. The potential of fog is evident in cloud forests, which thrive on fog interception. Historically, the remains of artificial structures in different countries show that fog has been collected as an alternative and/or supplementary water source. In the beginning of the 19th century, fog collection was investigated as a potential natural resource. After the mid-1980s, following success in Chile, fog-water collection commenced in a number of developing countries. Most of these countries are located in arid and semi-arid regions with topographic and climatic conditions that favour fog-water collection. This paper reviews the technology of fog collection, with initial background information on natural fog collection and its historical development. It reviews the climatic and topographic features that dictate fog formation (mainly advection and orographic) and the innovative technology used to collect it, focusing on the amount collected, the quality of fog water, and the impact of the technology on the livelihoods of beneficiary communities. By and large, the technology described is simple, cost-effective, and energy-free. However, fog-water collection has the disadvantages of being seasonal and localised, and the technology needs continual maintenance. Based on experience in several countries, the sustainability of the technology can be ensured if technical, economic, social, and management factors are addressed during its planning and implementation.
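The yield of a fog-collecting mesh can be estimated, to first order, as collection efficiency times the fog's liquid water content times wind speed times mesh area. The sketch below is a back-of-envelope illustration under that simple model; every number in it (efficiency, liquid water content, wind, area, and the assumption of fog persisting all day) is an assumption, not a figure from any project in the review.

```python
# Back-of-envelope sketch of fog-collection yield:
# water flux ~ efficiency * liquid water content * wind speed * area.
# All inputs below are illustrative assumptions.

def fog_yield_l_per_day(efficiency, lwc_g_m3, wind_m_s, area_m2):
    """Approximate daily collection (litres), assuming fog all day."""
    grams_per_s = efficiency * lwc_g_m3 * wind_m_s * area_m2
    return grams_per_s * 86_400 / 1000      # g/s -> L/day (1 g water ~ 1 mL)

# a hypothetical 40 m2 collector in 0.2 g/m3 fog and 4 m/s wind,
# assuming 20% overall collection efficiency
daily = fog_yield_l_per_day(0.20, 0.2, 4.0, 40.0)
```

Real yields are far lower on fog-free days, which is the seasonality and localisation drawback the review notes.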