42 results for Economic resource use
Abstract:
Africa’s agriculture faces varying climate change impacts which mainly worsen production conditions and adversely affect its economies. Adaptations thus need to build the resilience of farming systems. Using “resilient adaptation” as a concept, this study analyses how adaptations at farm and policy/institutional levels contribute to the resilience of Sub-Saharan African agriculture. The tool developed, the “Resilience Check”, provides socio-economic data which complement existing adaptation tools. Underlying development gaps such as insecure property rights, poverty, low self-organisation, and inadequate climate data and infrastructure limit resilient adaptations. If farmers can implement recommended practices, existing measures and improved crops can address most impacts expected in the medium term. However, resource use efficiency remains critical for all farm management types. Development-oriented adaptation measures are needed to provide robust foundations for building resilience. Reaching the very poor remains a challenge, and the externally driven nature of many interventions raises concerns about their sustainability. The study recommends practical measures such as decentralising various services and integrating the action plans of the multilateral environmental agreements into a single national action plan.
Abstract:
The Department of Geography at the University of Bern is currently active in over 40 countries, with a focus on Switzerland, Europe, East Africa including the Horn, the Andes, Central Asia and South-East Asia. These regional or national programmes are complemented by a number of global networks focusing on climate change and sustainable resource use, including mountain development. Overall, the Department is engaged in 74 projects, all based on close collaboration with local and regional partners in the countries where the programmes work.
Abstract:
PURPOSE: We studied the effects of reorganization and changes in the care process, including use of protocols for sedation and weaning from mechanical ventilation, on the use of sedative and analgesic drugs and on the length of respiratory support and stay in the intensive care unit (ICU). MATERIALS AND METHODS: Three cohorts of 100 mechanically ventilated ICU patients, admitted in 1999 (baseline), 2000 (implementation I, after a change in ICU organization and in diagnostic and therapeutic approaches), and 2001 (implementation II, after introduction of protocols for weaning from mechanical ventilation and sedation), were studied retrospectively. RESULTS: Simplified Acute Physiology Score II (SAPS II), diagnostic groups, and number of organ failures were similar in all groups. Data are reported as median (interquartile range). Time on mechanical ventilation decreased from 18 (7-41) hours (baseline) to 12 (7-27) hours (implementation II) (P = .046), an effect entirely attributable to noninvasive ventilation, and length of ICU stay decreased in survivors from 37 (21-71) to 25 (19-63) hours (P = .049). The amounts of morphine (P = .001) and midazolam (P = .050) decreased, whereas the amounts of propofol (P = .052) and fentanyl (P = .001) increased. Total Therapeutic Intervention Scoring System-28 (TISS-28) points per patient decreased from 137 (99-272) to 113 (87-256) (P = .009). Intensive care unit mortality was 19% (baseline), 8% (implementation I), and 7% (implementation II) (P = .020). CONCLUSIONS: Changes in organizational and care processes were associated with an altered pattern of sedative and analgesic drug prescription, a decrease in the length of (noninvasive) respiratory support and length of stay in survivors, and decreases in resource use (as measured by TISS-28) and in mortality.
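The abstract reports skewed ICU variables as median (interquartile range) with P values across the three cohorts. The sketch below, with invented data, illustrates one common way such summaries and a nonparametric group comparison are computed; it is not the study's analysis code, and the specific test the authors used is not stated in the abstract.

```python
# Minimal sketch (hypothetical data, not the study's): median (IQR) summaries of
# time on mechanical ventilation per cohort, plus a nonparametric test across groups.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
cohorts = {
    "baseline (1999)": rng.lognormal(mean=2.9, sigma=1.0, size=100),
    "implementation I (2000)": rng.lognormal(mean=2.7, sigma=1.0, size=100),
    "implementation II (2001)": rng.lognormal(mean=2.5, sigma=1.0, size=100),
}

for label, hours in cohorts.items():
    q1, median, q3 = np.percentile(hours, [25, 50, 75])
    print(f"{label}: {median:.0f} ({q1:.0f}-{q3:.0f}) h")

# Kruskal-Wallis test: nonparametric analogue of one-way ANOVA for skewed data
stat, p = stats.kruskal(*cohorts.values())
print(f"Kruskal-Wallis H = {stat:.2f}, P = {p:.3f}")
```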
Abstract:
Models of population dynamics generally neglect the presence of males. While this assumption holds under many circumstances, behavioural ecology increasingly tells us that the presence (or absence) of males may have an impact on female fitness, and hence on population sizes. Here we ask whether males matter to population dynamics, operationally defined as a dependency of population growth on the relative density of males. We provide simple models, and evaluate the current empirical evidence for them, illustrating various mechanisms of such male influence: mate-searching behaviour, male resource use (including effects of sexual dimorphism), sexual harassment and sexual segregation. In each case, theory predicts that males can have an effect on population densities, and in some extreme cases a positive feedback between an increasingly male-biased sex ratio and the effects of harassment on females may theoretically even bring about population extinction. The results of this study, and the literature reviewed, show that males can have a substantial effect on population dynamics, particularly when human influences result in biased sex ratios.
Abstract:
The present paper discusses a conceptual, methodological and practical framework within which the limitations of the conventional notion of natural resource management (NRM) can be overcome. NRM is understood as the application of scientific ecological knowledge to resource management. By including a consideration of the normative imperatives that arise from scientific ecological knowledge and submitting them to public scrutiny, ‘sustainable management of natural resources’ can be recontextualised as ‘sustainable governance of natural resources’. This in turn makes it possible to place the politically neutralising discourse of ‘management’ in a space for wider societal debate, in which the different actors involved can deliberate and negotiate the norms, rules and power relations related to natural resource use and sustainable development. The transformation of sustainable management into sustainable governance of natural resources can be conceptualised as a social learning process involving scientists, experts, politicians and local actors, and their corresponding scientific and non-scientific knowledges. The social learning process is the result of what Habermas has described as ‘communicative action’, in contrast to ‘strategic action’. Sustainable governance of natural resources thus requires a new space for communicative action aiming at shared, intersubjectively validated definitions of actual situations and the goals and means required for transforming current norms, rules and power relations in order to achieve sustainable development. Case studies from rural India, Bolivia and Mali explore the potentials and limitations for broadening communicative action through an intensification of social learning processes at the interface of local and external knowledge. Key factors that enable or hinder the transformation of sustainable management into sustainable governance of natural resources through social learning processes and communicative action are discussed.
Abstract:
PURPOSE OF REVIEW: Intensive care medicine consumes a high share of healthcare costs, and there is growing pressure to use the scarce resources efficiently. Accordingly, organizational issues and quality management have become an important focus of interest in recent years. Here, we review current concepts of how outcome data can be used to identify areas requiring action. RECENT FINDINGS: Recently established models of outcome assessment reveal wide variability between individual ICUs, both with respect to outcome and to resource use. Such variability implies that there are large differences in patient care processes, not only within the ICU but also in pre-ICU and post-ICU care. Indeed, measures to improve the patient process in the ICU (including care of the critically ill, patient safety, and management of the ICU) have been presented in a number of recently published papers. SUMMARY: Outcome assessment models provide an important framework for benchmarking. They may help the individual ICU to spot appropriate fields of action, plan and initiate quality improvement projects, and monitor the consequences of such activity.
Abstract:
Water is an important resource for plant life. Since climate scenarios for Switzerland predict an average reduction of 20% in summer precipitation by 2070, understanding ecosystem responses to water shortage, e.g. in terms of plant productivity, is of major concern. We therefore tested the effects of simulated summer drought on three managed grasslands along an altitudinal gradient in Switzerland from 2005 to 2007, representing typical management intensities at the respective altitudes. We assessed the effects of experimental drought on above- and below-ground productivity, stand structure (LAI and vegetation height) and resource use (carbon and water). Responses of community above-ground productivity to reduced precipitation input differed among the three sites but scaled positively with total annual precipitation at the sites (R² = 0.85). Annual community above-ground biomass productivity was significantly reduced by summer drought at the alpine site receiving the least annual precipitation, while no significant decrease (rather an increase) was observed at the pre-alpine site receiving the highest precipitation amounts in all three years. At the lowland site (intermediate precipitation sums), biomass productivity decreased significantly in response to drought only in the third year, after increased abundance of a drought-tolerant weed species in the second year. No significant change in below-ground biomass productivity was observed at any of the sites in response to simulated summer drought. However, vegetation carbon isotope ratios increased under drought conditions, indicating an increase in water use efficiency. We conclude that there is no general drought response of Swiss grasslands, but that sites with lower annual precipitation appear more vulnerable to summer drought than sites with higher annual precipitation; site-specific adaptation of management strategies will therefore be needed, especially in regions with low annual precipitation.
Abstract:
Background: WHO’s 2013 revisions to its Consolidated Guidelines on antiretroviral drugs recommend routine viral load monitoring, rather than clinical or immunological monitoring, as the preferred monitoring approach on the basis of clinical evidence. However, HIV programmes in resource-limited settings require guidance on the most cost-effective use of resources in view of other competing priorities such as expansion of antiretroviral therapy coverage. We assessed the cost-effectiveness of alternative patient monitoring strategies. Methods: We evaluated a range of monitoring strategies, including clinical, CD4 cell count, and viral load monitoring, alone and together, at different frequencies and with different criteria for switching to second-line therapies. We used three independently constructed and validated models simultaneously. We estimated costs on the basis of resource use projected in the models and associated unit costs; we quantified impact as disability-adjusted life years (DALYs) averted. We compared alternatives using incremental cost-effectiveness analysis. Findings: All models show that clinical monitoring delivers significant benefit compared with a hypothetical baseline scenario with no monitoring or switching. Regular CD4 cell count monitoring confers a benefit over clinical monitoring alone, at an incremental cost that makes it affordable in more settings than viral load monitoring, which is currently more expensive. Viral load monitoring without CD4 cell count every 6-12 months provides the greatest reductions in morbidity and mortality, but incurs a high cost per DALY averted, resulting in lost opportunities to generate health gains if implemented instead of increasing antiretroviral therapy coverage or expanding antiretroviral therapy eligibility. Interpretation: The priority for HIV programmes should be to expand antiretroviral therapy coverage, firstly at a CD4 cell count lower than 350 cells per μL, and then at a CD4 cell count lower than 500 cells per μL, using lower-cost clinical or CD4 monitoring. At current costs, viral load monitoring should be considered only after high antiretroviral therapy coverage has been achieved. Point-of-care technologies and other factors reducing costs might make viral load monitoring more affordable in future. Funding: Bill & Melinda Gates Foundation, WHO.
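The incremental cost-effectiveness analysis mentioned above rests on a simple calculation: the extra cost of a strategy divided by the extra DALYs it averts relative to the next cheaper non-dominated option. The sketch below illustrates that arithmetic with placeholder numbers; the costs and effects are not taken from the study's three models, and the sketch omits refinements such as extended dominance.

```python
# Minimal sketch of an incremental cost-effectiveness comparison of monitoring
# strategies. Costs and DALYs averted are illustrative placeholders only.
strategies = {
    "no monitoring":         {"cost": 0.0,   "dalys_averted": 0.00},
    "clinical monitoring":   {"cost": 100.0, "dalys_averted": 1.00},
    "CD4 monitoring":        {"cost": 180.0, "dalys_averted": 1.25},
    "viral load monitoring": {"cost": 400.0, "dalys_averted": 1.40},
}

# Order strategies by cost and compute each option's incremental
# cost-effectiveness ratio (ICER) against the previous non-dominated one.
ordered = sorted(strategies.items(), key=lambda kv: kv[1]["cost"])
frontier = [ordered[0]]
for name, s in ordered[1:]:
    prev_name, prev = frontier[-1]
    d_cost = s["cost"] - prev["cost"]
    d_effect = s["dalys_averted"] - prev["dalys_averted"]
    if d_effect <= 0:  # costs more without averting more DALYs: dominated
        continue
    icer = d_cost / d_effect
    print(f"{name} vs {prev_name}: ICER = {icer:.0f} cost units per DALY averted")
    frontier.append((name, s))
```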
Abstract:
Traditionally, desertification research has focused on degradation assessments, whereas prevention and mitigation strategies have not been sufficiently emphasised, although the concept of sustainable land management (SLM) is increasingly being acknowledged. SLM strategies are interventions at the local to regional scale aiming at increasing productivity, protecting the natural resource base, and improving livelihoods. The global WOCAT initiative and its partners have developed harmonised frameworks to compile, evaluate and analyse the impact of SLM practices around the globe. Recent studies within the EU research project DESIRE developed a methodological framework that combines a collective learning and decision-making approach with use of best practices from the WOCAT database. In-depth assessment of 30 technologies and 8 approaches from 17 desertification sites enabled an evaluation of how SLM addresses prevalent dryland threats such as water scarcity, soil and vegetation degradation, low production, climate change, resource use conflicts and migration. Among the impacts attributed to the documented technologies, those mentioned most were diversified and enhanced production and better management of water and soil degradation, whether through water harvesting, improving soil moisture, or reducing runoff. Water harvesting offers under-exploited opportunities for the drylands and the predominantly rainfed farming systems of the developing world. Recently compiled guidelines introduce the concepts behind water harvesting and propose a harmonised classification system, followed by an assessment of suitability, adoption and up-scaling of practices. Case studies range from large-scale floodwater spreading that makes alluvial plains cultivable, to systems that boost cereal production on small farms, to practices that collect and store water from household compounds. Once contextualised and set in appropriate institutional frameworks, they can form part of an overall adaptation strategy for land users. More field research is needed to reinforce expert assessments of SLM impacts and provide the necessary evidence-based rationale for investing in SLM. This includes developing methods to quantify and value ecosystem services, both on-site and off-site, and to assess the resilience of SLM practices, as is currently the aim of the new EU CASCADE project.
Abstract:
Background: Patients presenting to the emergency department (ED) currently face unacceptable delays in initial treatment, and long, costly hospital stays due to suboptimal initial triage and site-of-care decisions. Accurate ED triage should focus not only on initial treatment priority, but also on prediction of medical risk and nursing needs, to improve site-of-care decisions and to simplify early discharge management. Different triage scores have been proposed, such as the Manchester triage system (MTS). Yet these scores focus only on treatment priority, have suboptimal performance and lack validation in the Swiss health care system. Because the MTS will be introduced into clinical routine at the Kantonsspital Aarau, we propose a large prospective cohort study to optimize initial patient triage. Specifically, the aim of this trial is to derive a three-part triage algorithm to better predict (a) treatment priority; (b) medical risk and thus need for in-hospital treatment; (c) post-acute care needs of patients at the most proximal time point of ED admission. Methods/design: Prospective, observational, multicenter, multi-national cohort study. We will include all consecutive medical patients seeking ED care in this observational registry. There will be no exclusions except for non-adult and non-medical patients. Vital signs will be recorded and left-over blood samples will be stored for later batch analysis of blood markers. Upon ED admission, the post-acute care discharge score (PACD) will be recorded. Attending ED physicians will adjudicate triage priority based on all available results at the time of ED discharge to the medical ward. Patients will be reassessed daily during the hospital course for medical stability and readiness for discharge from the nurses' and, if involved, the social workers' perspective. To assess outcomes, data from electronic medical records will be used and all patients will be contacted 30 days after hospital admission to assess vital and functional status, re-hospitalization, satisfaction with care and quality of life measures. We aim to include between 5000 and 7000 patients over one year of recruitment to derive the three-part triage algorithm. The respective main endpoints were defined as (a) initial triage priority (high vs. low priority) adjudicated by the attending ED physician at ED discharge; (b) adverse 30-day outcome (death or intensive care unit admission) within 30 days following ED admission, to assess patients' risk and thus need for in-hospital treatment; and (c) post-acute care needs after hospital discharge, defined as transfer of patients to a post-acute care institution, for early recognition and planning of post-acute care needs. Other outcomes are time to first physician contact, time to initiation of adequate medical therapy, time to social worker involvement, length of hospital stay, reasons for discharge delays, patients' satisfaction with care, overall hospital costs and patients' care needs after returning home. Discussion: Using a reliable initial triage system for estimating initial treatment priority, need for in-hospital treatment and post-acute care needs is an innovative and persuasive approach for a more targeted and efficient management of medical patients in the ED. The proposed interdisciplinary, multi-national project has unprecedented potential to improve initial triage decisions and optimize resource allocation to the sickest patients from admission to discharge.
The algorithms derived in this study will be compared in a later randomized controlled trial against a usual-care control group in terms of resource use, length of hospital stay, overall costs, and patient outcomes (mortality, re-hospitalization, quality of life and satisfaction with care).
Abstract:
Background and Aims: Leaf functional traits have been used as a basis to categorize plants across a range of resource-use specialization, from those that conserve available resources to those that exploit them. However, the extent to which the leaf functional traits used to define the resource-use strategies are related to root traits, and are good indicators of the ability of the roots to take up nitrogen (N), is poorly known. This is an important question because interspecific differences in N uptake have been proposed as one mechanism by which species coexistence may be determined. This study therefore investigated the relationships between functional traits and N uptake ability for grass species across a range of conservative to exploitative resource-use strategies. Methods: Root uptake of NH4+ and NO3-, and leaf and root functional traits, were measured for eight grass species sampled at three grassland sites across Europe, in France, Austria and the UK. Species were grown in hydroponics to determine functional traits and kinetic uptake parameters (Imax and Km) under standardized conditions. Key Results: Species with high specific leaf area (SLA) and shoot N content, and low leaf and root dry matter content (LDMC and RDMC, respectively), which are traits associated with the exploitative syndrome, had higher uptake of and affinity for both N forms. No trade-off was observed in uptake between the two forms of N, and all species expressed a higher preference for NH4+. Conclusions: The results support the use of leaf traits, and especially SLA and LDMC, as indicators of N uptake ability across a broad range of grass species. The difficulties associated with assessing root properties are also highlighted, as root traits were only weakly correlated with leaf traits, and only RDMC and, to a lesser extent, root N content were related to leaf traits.
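The kinetic uptake parameters Imax and Km mentioned above come from Michaelis-Menten-type uptake kinetics, uptake = Imax × S / (Km + S), where S is the substrate concentration. The sketch below shows one common way such parameters are estimated by curve fitting; the concentrations and uptake rates are invented for illustration, and the paper's actual fitting procedure is not described in the abstract.

```python
# Minimal sketch: fitting a Michaelis-Menten uptake curve to hypothetical data
# to estimate Imax (maximum uptake rate) and Km (affinity constant).
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(s, imax, km):
    """Uptake rate at substrate concentration s (e.g. NH4+ or NO3-)."""
    return imax * s / (km + s)

# Hypothetical NH4+ concentrations (µM) and measured uptake rates (µmol g-1 h-1)
conc = np.array([5, 10, 25, 50, 100, 200, 500], dtype=float)
uptake = np.array([2.1, 3.8, 6.9, 9.4, 11.6, 13.0, 14.1])

(imax, km), _ = curve_fit(michaelis_menten, conc, uptake, p0=[15.0, 50.0])
print(f"Imax = {imax:.1f} µmol g-1 h-1, Km = {km:.0f} µM")
```

A low Km indicates high affinity for the N form, which is why Km is reported alongside Imax when comparing conservative and exploitative species.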
Abstract:
Our study considers the natural resources of the Miombo forests in Cabo Delgado from a broad ecosystems perspective. Thus, our view goes beyond the disciplinary approaches of forestry, agronomy, biology or zoology, and also of the social sciences, namely anthropology, history, sociology, political science or economics. The present study aims to establish a dialogue and create synergies between Miti Ltd. – the logging company and owner of the forest concessions – as well as government and state structures at the various levels and the communities – through the Committees on Natural Resources – in order to promote the sustainable use of resources and ecosystems. The research methodology we used can broadly be described as moderated transdisciplinary interaction for action-research based on the approach known as Learning for Sustainability (LforS, http://www.cde.unibe.ch/Pages/Project/2/14/Learning-for-Sustainability-Extension-Approach.aspx). The research methods used include: LforS seminars; field work; forest observations focusing, among other things, on ecosystems, trees, wildlife, and burned areas; visits to farms; and interviews. We conducted both collective and individual interviews, including with key informants. The main results indicate that members of the Committee on Natural Resources have a dual attitude: their statements defend both the paradigm of sustainable use of natural resources and their own immediate monetary gain. They are willing to apply the values, concepts and theories of sustainable development that underpin the establishment of Committees on Natural Resources if they are paid for their work or if they can derive direct benefits from it, i.e. if they can earn a salary or allowance. If this does not happen, however, they are willing to allow actors to engage in illegal hunting or logging activities. This dual attitude also exists in relation to forestry operators: if the concession workers pay the committee members in cash or provide them with goods, the operators can run their business even if they violate the law. Natural forest regeneration in Nkonga and Namiune already shows the impact of such use. Although there are many saplings that could basically ensure continuous regeneration under sustainable management, repeated burning is damaging the young trees, deforming them and killing a great number of them. Campaigns against uncontrolled fires are ineffective because the administrative and political authorities have a dual attitude as well and are also part of the group that uses resources to their own profit and benefit. There are institutional structures within the administration, populations, and communities to perform regulating functions, create and implement rules, punish offenders, and oversee resource use. However, those involved feel that since they are not paid for performing these functions, they do not have to do so. This attitude shows a lack of awareness, but also indicates a situation where everyone seeks to derive maximum benefits from existing resource use patterns. Anything goes.
Abstract:
The ecosystem services concept (ES) is becoming a cornerstone of contemporary sustainability thought. Challenges with this concept and its applications are well documented, but have not yet been systematically assessed alongside strengths and external factors that influence uptake. Such an assessment could form the basis for improving ES thinking, further embedding it into environmental decisions and management. The Young Ecosystem Services Specialists (YESS) completed a Strengths–Weaknesses–Opportunities–Threats (SWOT) analysis of ES through YESS member surveys. Strengths include the approach being interdisciplinary and its usefulness as a communication tool. Weaknesses include an incomplete scientific basis, frameworks being inconsistently applied, and difficulty in accounting for nature’s intrinsic value. Opportunities include alignment with existing policies and established methodologies, and increasing environmental awareness. Threats include resistance to change, and difficulty with interdisciplinary collaboration. Consideration of SWOT themes suggested five strategic areas for developing and implementing ES. The ES concept could improve decision-making related to natural resource use and the interpretation of the complexities of human-nature interactions. It is contradictory – valued as a simple means of communicating the importance of conservation, whilst also considered an oversimplification characterised by ambiguous language. Nonetheless, given sufficient funding and political will, the ES framework could facilitate interdisciplinary research, ensuring decision-making that supports sustainable development.
Abstract:
Polymorbid patients, diverse diagnostic and therapeutic options, more complex hospital structures, financial incentives, benchmarking, as well as perceptional and societal changes put pressure on medical doctors, specifically if medical errors surface. This is particularly true for the emergency department setting, where patients face delayed or erroneous initial diagnostic or therapeutic measures and costly hospital stays due to sub-optimal triage. A "biomarker" is any laboratory tool with the potential to better detect and characterise diseases, to simplify complex clinical algorithms and to improve clinical problem solving in routine care. Biomarkers must be embedded in clinical algorithms to complement, and not replace, basic medical skills. Unselected ordering of laboratory tests and shortcomings in test performance and interpretation contribute to diagnostic errors. Test results may be ambiguous, with false positive or false negative results, and generate unnecessary harm and costs. Laboratory tests should only be ordered if the results have clinical consequences. In studies, we must move beyond the observational reporting and meta-analysing of diagnostic accuracies for biomarkers. Instead, specific cut-off ranges should be proposed and intervention studies conducted to prove outcome-relevant impacts on patient care. The focus of this review is to exemplify the appropriate use of selected laboratory tests in the emergency setting for which randomised controlled intervention studies have proven clinical benefit. Herein, we focus on initial patient triage and allocation of treatment opportunities in patients with cardiorespiratory diseases in the emergency department. The following six biomarkers will be discussed: proadrenomedullin for prognostic triage assessment and site-of-care decisions, cardiac troponin for acute myocardial infarction, natriuretic peptides for acute heart failure, D-dimers for venous thromboembolism, C-reactive protein as a marker of inflammation, and procalcitonin for antibiotic stewardship in infections of the respiratory tract and sepsis. For these markers we provide an overview of the physiopathology, the historical evolution of the evidence, and their strengths and limitations for rational implementation into clinical algorithms. We critically discuss results from key intervention trials that led to their use in clinical routine and potential future indications. The rationale for the use of all these biomarkers is, first, to tackle diagnostic ambiguity and the defensive medicine that follows from it; second, delayed and sub-optimal therapeutic decisions; and third, prognostic uncertainty with misguided triage and site-of-care decisions, all of which contribute to the waste of our limited health care resources. A multifaceted approach to a more targeted management of medical patients from emergency admission to discharge, including biomarkers, will translate into better resource use, shorter length of hospital stay, reduced overall costs, and improved patient satisfaction and outcomes in terms of mortality and re-hospitalisation. Hopefully, the concepts outlined in this review will help readers to improve their diagnostic skills and become more parsimonious requesters of laboratory tests.
Abstract:
OBJECTIVE The aim of this study was to examine the prevalence of nutritional risk and its association with multiple adverse clinical outcomes in a large cohort of acutely ill medical inpatients from a Swiss tertiary care hospital. METHODS We prospectively followed consecutive adult medical inpatients for 30 d. Multivariate regression models were used to investigate the association of the initial Nutritional Risk Score (NRS 2002) with mortality, impairment in activities of daily living (Barthel Index <95 points), hospital length of stay, hospital readmission rates, and quality of life (QoL; adapted from the EQ-5D); all parameters were measured at 30 d. RESULTS Of 3186 patients (mean age 71 y, 44.7% women), 887 (27.8%) were at risk for malnutrition with an NRS ≥3 points. We found strong associations (odds ratio/hazard ratio [OR/HR], 95% confidence interval [CI]) between nutritional risk and mortality (OR/HR, 7.82; 95% CI, 6.04-10.12), impaired Barthel Index (OR/HR, 2.56; 95% CI, 2.12-3.09), time to hospital discharge (OR/HR, 0.48; 95% CI, 0.43-0.52), hospital readmission (OR/HR, 1.46; 95% CI, 1.08-1.97), and all five dimensions of QoL measures. Associations remained significant after adjustment for sociodemographic characteristics, comorbidities, and medical diagnoses. Results were robust in subgroup analysis with evidence of effect modification (P for interaction < 0.05) based on age and main diagnosis groups. CONCLUSION Nutritional risk is significant in acutely ill medical inpatients and is associated with increased medical resource use, adverse clinical outcomes, and impairments in functional ability and QoL. Randomized trials are needed to evaluate evidence-based preventive and treatment strategies focusing on nutritional factors to improve outcomes in these high-risk patients.
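As an illustration of the kind of association the multivariate models above estimate, the sketch below computes an unadjusted odds ratio for 30-day mortality by nutritional risk status (NRS 2002 ≥3) from a hypothetical 2x2 table. The counts are invented, not the study's data, and the published OR of 7.82 was adjusted for confounders.

```python
# Minimal sketch (hypothetical counts): unadjusted odds ratio for 30-day
# mortality by nutritional risk, with an approximate 95% CI (Woolf method).
import math

# Rows: at risk (NRS >= 3) / not at risk; columns: died / survived within 30 d
died_risk, surv_risk = 120, 767
died_norisk, surv_norisk = 46, 2253

odds_ratio = (died_risk / surv_risk) / (died_norisk / surv_norisk)

# Standard error of log(OR) and 95% confidence interval on the log scale
se_log_or = math.sqrt(1/died_risk + 1/surv_risk + 1/died_norisk + 1/surv_norisk)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```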