Abstract:
SDC has been involved in rural development in Cabo Delgado for more than 30 years. Shortly after Mozambique's independence, projects in water supply and integrated rural development were initiated. The silvoagropastoral project FO9, based in Mueda, was a very early forestry experience in Cabo Delgado. Andreas Kläy was responsible for the forestry sector in FO9 for three years in the early 1980s and had an opportunity to exchange ideas and experience on rural development theory and approaches with Yussuf Adam, who was conducting research in anthropology and history in the province. Twenty-five years later, the current situation of forest management in Cabo Delgado was reassessed, with a specific focus on concessions in the north. On the basis of this preliminary study [1], a partnership between MITI SA, the University of Eduardo Mondlane, and CDE was created. The aim of this partnership is to generate knowledge and develop capacity for sustainable forest management. The preliminary study concluded that "…we have to face weaknesses and would like to start a learning process with the main institutions, organisations, and stakeholder groups active in forest management and research in the North of Cabo Delgado. This learning process will involve studies supported by competent research institutions and workshops …" The specific objectives of ESAPP project Q804 are: 1. Contribute to understanding of the forestry sector; 2. Develop capacity among professionals and academics; 3. Support the private sector and the local forest service; 4. Support data generation at Cabo Delgado's Provincial Service; 5. Develop capacity at Swiss academic institutions (CDE and ETHZ). A conceptual planning platform was elaborated as a basis for cooperation and research within the partnership (cf. Annex 1). The partners agreed to work on two lines of research: biophysical and socio-economic.
To ensure a transdisciplinary approach, disciplinary research is anchored in a common understanding developed in workshops based on the LforS methods. These workshops integrate the main stakeholders in the local context of the COMADEL concession in Nangade District, managed by MITI SA, and take place in the village of Namiune. The research team observed that current management schemes consist mainly of strategies of nature mining by most of the stakeholders involved. Formal and informal institutional settings have little impact owing to weak capacity at the local level and corruption. Local difficulties in this remote rural area facilitate external access to resources, and the loss of benefits perpetuates these difficulties: the benefits of logging remain with the economic and political elites at the top level. The concession owners' interest in stopping the loss of resources caused by this regime offers a unique opportunity to intervene in the logic of resource degradation and decline in rural development and forest management.
Abstract:
Most commercial project management software packages include planning methods to devise schedules for resource-constrained projects. Because the planning methods implemented are proprietary information of the software vendors, the question arises how the software packages differ in quality with respect to their resource-allocation capabilities. We experimentally evaluate the resource-allocation capabilities of eight recent software packages using 1,560 instances with 30, 60, and 120 activities from the well-known PSPLIB library. In some of the analyzed packages, the user may influence the resource allocation by means of multi-level priority rules, whereas in other packages only a few options can be chosen. We study the impact of various complexity parameters and priority rules on the project duration obtained by the software packages. The results indicate that the resource-allocation capabilities of these packages differ significantly. In general, the relative gap between the packages grows with increasing resource scarcity and with an increasing number of activities. Moreover, the selection of the priority rule has a considerable impact on the project duration. Surprisingly, when a priority rule is selected in the packages where this is possible, both the mean and the variance of the project duration are in general worse than for the packages that do not offer the selection of a priority rule.
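Heuristics for the resource-constrained project scheduling problem (RCPSP), which underlies the PSPLIB instances mentioned above, are typically built from a schedule-generation scheme combined with a priority rule. The following sketch shows a standard serial schedule-generation scheme; the tiny four-activity project and the "shortest duration first" rule are illustrative assumptions, not data or methods from the study.

```python
# Serial schedule-generation scheme with a priority rule for a single
# renewable resource (a minimal sketch; real RCPSP solvers handle
# multiple resources and far larger instances).

def serial_sgs(durations, demands, precedence, capacity, priority):
    """Schedule activities one at a time in priority order, placing each
    at the earliest start that respects its predecessors and the resource
    capacity (assumes every demand <= capacity)."""
    n = len(durations)
    horizon = 2 * sum(durations)          # safe upper bound on the makespan
    usage = [0] * horizon                 # resource units in use per period
    start, scheduled = {}, set()
    remaining = sorted(range(n), key=priority)
    while remaining:
        # highest-priority activity whose predecessors are all scheduled
        j = next(a for a in remaining
                 if all(p in scheduled for p in precedence[a]))
        t = max((start[p] + durations[p] for p in precedence[j]), default=0)
        # shift right until the resource profile admits the activity
        while any(usage[t + k] + demands[j] > capacity
                  for k in range(durations[j])):
            t += 1
        start[j] = t
        for k in range(durations[j]):
            usage[t + k] += demands[j]
        scheduled.add(j)
        remaining.remove(j)
    return start

# four-activity example: activities 1 and 2 follow 0; 3 follows 1 and 2
durations  = [3, 2, 2, 1]
demands    = [2, 2, 1, 2]                # units of one renewable resource
precedence = [[], [0], [0], [1, 2]]      # predecessor lists
schedule = serial_sgs(durations, demands, precedence, capacity=3,
                      priority=lambda a: durations[a])
makespan = max(schedule[a] + durations[a] for a in schedule)
```

Swapping the `priority` function (e.g. most total successors, latest start time) changes the resulting makespan, which is exactly the sensitivity to priority rules that the study measures.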
Abstract:
Regime shifts, defined as a radical and persistent reconfiguration of an ecosystem following a disturbance, have been acknowledged by scientists as a very important aspect of the dynamics of ecosystems. However, their consideration in land management planning remains marginal and limited to specific processes and systems. Current research focuses on mathematical modeling and statistical analysis of spatio-temporal data for specific environmental variables. These methods do not fulfill the needs of land managers, who are confronted with a multitude of processes and pressure types and require clear and simple strategies to prevent regime shifts or to increase the resilience of their environment. The EU-FP7 CASCADE project is looking at regime shifts of dryland ecosystems in southern Europe and specifically focuses on rangeland and forest systems that are prone to various land degradation threats. One of the aims of the project is to evaluate the impact of different management practices on the dynamics of the environment in a participatory manner, including a multi-stakeholder evaluation of the state of the environment and of the management potential. To achieve this objective we have organized several stakeholder meetings and compiled a review of management practices using the WOCAT methodology, which enables merging scientific and land users' knowledge. We first highlight the main challenges we have encountered in applying the notion of regime shift to real-world socio-ecological systems and in translating related concepts such as tipping points, stable states, hysteresis and resilience to land managers, using concrete examples from CASCADE study sites. Second, we explore the advantages of including land users' knowledge in the scientific understanding of regime shifts.
Moreover, we discuss useful alternative concepts and lessons learnt that will allow us to build a participatory method for the assessment of resilient management practices in specific socio-ecological systems and to foster adaptive dryland management.
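The concepts named above (stable states, tipping points, persistent shifts) are often introduced with a textbook toy model; the sketch below is such a generic illustration, not a model from the CASCADE project. The system dx/dt = x - x³ has two stable states (x = -1 and x = +1) separated by a tipping point at x = 0.

```python
# Minimal bistable system illustrating a regime shift: a small
# perturbation across the tipping point flips the system into the other
# stable state, where it then persists.

def simulate(x0, steps=300, dt=0.1):
    """Forward-Euler integration of dx/dt = x - x**3 from x0."""
    x = x0
    for _ in range(steps):
        x += dt * (x - x**3)
    return x

upper = simulate(0.1)    # starts just above the tipping point
lower = simulate(-0.1)   # starts just below the tipping point
```

Here `simulate(0.1)` converges to the upper state near +1 and `simulate(-0.1)` to the lower state near -1: the same system, radically different persistent outcomes depending on which side of the tipping point a disturbance leaves it.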
Abstract:
Ensuring sustainable use of natural resources is crucial for maintaining the basis for our livelihoods. With threats from climate change, disputes over water, biodiversity loss, competing claims on land, and migration increasing worldwide, the demands for sustainable land management (SLM) practices will only increase in the future. For years already, various national and international organizations (GOs, NGOs, donors, research institutes, etc.) have been working on alternative forms of land management. And numerous land users worldwide – especially small farmers – have been testing, adapting, and refining new and better ways of managing land. All too often, however, the resulting SLM knowledge has not been sufficiently evaluated, documented and shared. Among other things, this has often prevented valuable SLM knowledge from being channelled into evidence-based decision-making processes. Indeed, proper knowledge management is crucial for SLM to reach its full potential. For more than 20 years, the international WOCAT network has documented and promoted SLM through its global platform. As a whole, the WOCAT methodology comprises tools for documenting, evaluating, and assessing the impact of SLM practices, as well as for knowledge sharing, analysis and use for decision support in the field, at the planning level, and in scaling up identified good practices. In early 2014, WOCAT's growth and ongoing improvement culminated in its being officially recognized by the UNCCD as the primary recommended database for SLM best practices. Over the years, the WOCAT network confirmed that SLM helps to prevent desertification, increase biodiversity, enhance food security, and make people less vulnerable to the effects of climate variability and change. In addition, it plays an important role in mitigating climate change through improving soil organic matter and increasing vegetation cover.
In-depth assessments of SLM practices from desertification sites enabled an evaluation of how SLM addresses prevalent dryland threats. The most frequently mentioned impacts were diversified and enhanced production and better management of water and soil degradation, whether through water harvesting, improving soil moisture, or reducing runoff. Among other factors, favourable local-scale cost-benefit relationships of SLM practices play a crucial role in their adoption. An economic analysis from the WOCAT database showed that land users perceive a large majority of the technologies as having benefits that outweigh costs in the long term. The high investment costs associated with some practices may constitute a barrier to adoption; where appropriate, however, short-term support for land users can help to promote these practices. Increased global concern about climate change, disaster risks and food security is redirecting attention to, and triggering more funding for, SLM. To provide the necessary evidence-based rationale for investing in SLM and to reinforce expert and land users' assessments of SLM impacts, more field research using inter- and transdisciplinary approaches is needed. This includes developing methods to quantify and value ecosystem services, both on-site and off-site, and to assess the resilience of SLM practices, as is currently being pursued within the EU FP7 projects CASCADE and RECARE.
Abstract:
It is a challenge to measure the impact of releasing data to the public, since the effects may not be directly linked to particular open data activities, or substantial impact may only occur several years after publishing the data. This paper proposes a framework to assess the impact of releasing open data by applying the Social Return on Investment (SROI) approach. SROI was developed for organizations that aim to generate social and environmental benefits, and thus fits the purpose of most open data initiatives. We link the four steps of SROI (input, output, outcome, impact) with the 14 high-value data categories of the G8 Open Data Charter to create a matrix of open data examples, activities, and impacts in each of the data categories. This Impact Monitoring Framework helps data providers to navigate the impact space of open data, laying out the conceptual basis for further research.
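The matrix idea described above can be sketched as a simple data structure crossing the four SROI steps with data categories, one cell per (category, step) pair. The category names and the example entry below are hypothetical; the paper uses all 14 categories of the G8 Open Data Charter.

```python
# Illustrative skeleton of the Impact Monitoring Framework matrix:
# each cell collects example activities/impacts for one data category
# at one SROI step.

SROI_STEPS = ("input", "output", "outcome", "impact")
CATEGORIES = ("transport", "health", "education")   # illustrative subset

def empty_matrix(categories, steps):
    """Build a category x step matrix with an empty example list per cell."""
    return {cat: {step: [] for step in steps} for cat in categories}

matrix = empty_matrix(CATEGORIES, SROI_STEPS)
# record one hypothetical open-data activity in the appropriate cell
matrix["transport"]["output"].append("real-time bus position feed published")
```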
Abstract:
Bullous pemphigoid is the most common autoimmune subepidermal blistering disease of the skin and mucous membranes. This disease typically affects the elderly and presents with itch and localized or generalized bullous lesions. In up to 20% of affected patients, bullae may be completely absent, and only excoriations, prurigo-like lesions, eczematous lesions, urticated lesions and/or infiltrated plaques are observed. The disease is significantly associated with neurological disorders. The morbidity of bullous pemphigoid and its impact on quality of life are significant. So far, a limited number of national treatment guidelines have been proposed, but no common European consensus has emerged. Our consensus for the treatment of bullous pemphigoid has been developed under the guidance of the European Dermatology Forum in collaboration with the European Academy of Dermatology and Venereology. It summarizes evidence-based and expert-based recommendations.
Abstract:
OBJECTIVE The role of hypertension and its impact on outcome in patients with acute coronary syndrome (ACS) is still debated. This study aimed to compare the outcomes of hypertensive and nonhypertensive ACS patients. METHODS Using data of ACS patients enrolled in the Acute Myocardial Infarction in Switzerland Plus Registry from 1997 to 2013, characteristics at presentation and outcomes in hospital and after 1 year were analyzed. Hypertension was defined as previously diagnosed and treated by a physician. The primary endpoint was mortality. Data were analyzed using multiple logistic regressions. RESULTS Among 41 771 ACS patients, 16 855 (40.4%) were without and 24 916 (59.6%) with preexisting hypertension. Patients with preexisting hypertension had a more favorable in-hospital outcome [odds ratio (OR) in-hospital mortality 0.82, 95% confidence interval (CI) 0.73-0.93; P = 0.022]. The independent predictors of in-hospital mortality for patients with preexisting hypertension were age, Killip class greater than 2, Charlson Comorbidity Index greater than 1, no pretreatment with statins and lower admission systemic blood pressure. Preexisting hypertension was not an independent predictor of 1-year mortality in the subgroup of patients (n = 7801) followed: OR 1.07, 95% CI 0.78-1.47; P = 0.68. Independent predictors of mortality 1 year after discharge for the 4796 patients with preexisting hypertension were age, male sex and comorbidities. Angiotensin-converting enzyme inhibitors or angiotensin II receptor antagonists and statins prescribed at discharge improved the outcomes. CONCLUSION Outcome of ACS patients with preexisting hypertension was associated with an improved in-hospital prognosis after adjustment for their higher baseline risk. However, this effect was not long-lasting and does not necessarily mean a causal relationship exists. Short-term and long-term management of patients with hypertension admitted with ACS could be further improved.
Abstract:
OBJECTIVE To determine the success of medical management of presumptive thoracolumbar disk herniation in dogs and the variables associated with treatment outcome. STUDY DESIGN Retrospective case series. ANIMALS Dogs (n=223) with presumptive thoracolumbar disk herniation. METHODS Medical records from 2 clinics were used to identify affected dogs, and owners were mailed a questionnaire about success of therapy, recurrence of clinical signs, and quality of life (QOL) as interpreted by the owner. Signalment, duration and degree of neurologic dysfunction, and medication administration were determined from medical records. RESULTS Eighty-three percent of dogs (185/223) were ambulatory at initial evaluation. Successful treatment was reported for 54.7% of dogs, with 30.9% having recurrence of clinical signs and 14.4% classified as therapeutic failures. From bivariable logistic regression, glucocorticoid administration was negatively associated with success (P=.008; odds ratio [OR]=.48) and QOL scores (P=.004; OR=.48). The duration of cage rest was not significantly associated with success or QOL. Nonambulatory dogs were more likely to have lower QOL scores (P=.01; OR=2.34). CONCLUSIONS Medical management can lead to an acceptable outcome in many dogs with presumptive thoracolumbar disk herniation. Cage rest duration does not seem to affect outcome and glucocorticoids may negatively impact success and QOL. The conclusions in this report should be interpreted cautiously because of the retrospective data collection and the use of client self-administered questionnaire follow-up. CLINICAL RELEVANCE These results provide an insight into the success of medical management for presumptive thoracolumbar disk herniation in dogs and may allow for refinement of treatment protocols.
Abstract:
Soils provide us with over 90% of all human food, livestock feed, fibre and fuel on Earth. Soils, however, have more than just productive functions. The key challenge in coming years will be to address the diverse and potentially conflicting demands now being made by human societies and other forms of life, while ensuring that future generations have the same potential to use soils and land of comparable quality. In a multi-level stakeholder approach, down-to-earth action will have to be supplemented with measures at various levels, from households to communities, and from national policies to international conventions. Knowledge systems, both indigenous and scientific, and related research and learning processes must play a central role. Ongoing action can be enhanced through a critical assessment of the impact of past achievements, and through better cooperation between people and institutions.
Abstract:
The question concerning the circumstances under which it is advantageous for a company to outsource certain information systems functions has been a controversial issue for the last decade. While opponents emphasize the risks of outsourcing based on the loss of strategic potentials and increased transaction costs, proponents emphasize the strategic benefits of outsourcing and high potentials for cost savings. This paper brings together both views by examining the conditions under which both the strategic potentials and the savings in production and transaction costs of developing and maintaining software applications can be better achieved in-house than by an external vendor. We develop a theoretical framework from three complementary theories and test it empirically based on a mail survey of 139 German companies. The results show that insourcing is more cost-efficient and more advantageous in creating strategic benefits through IS if the provision of application services requires a high amount of firm-specific human assets. These relationships, however, are partially moderated by differences in the trustworthiness and intrinsic motivation of internal versus external IS professionals. Moreover, capital shares with an external vendor can lower the risk of high transaction costs as well as the risk of losing the strategic opportunities of an IS.
Abstract:
PURPOSE OF REVIEW Fever and neutropenia is the most common complication in the treatment of childhood cancer. This review will summarize recent publications that focus on improving the management of this condition as well as those that seek to optimize translational research efforts. RECENT FINDINGS A number of clinical decision rules are available to assist in the identification of low-risk fever and neutropenia; however, few have undergone external validation and formal impact analysis. Emerging evidence suggests acute fever and neutropenia management strategies should include time-to-antibiotic recommendations, and quality improvement initiatives have focused on eliminating barriers to early antibiotic administration. Despite reported increases in antimicrobial resistance, few studies have focused on the prediction, prevention, and optimal treatment of these infections, and their effect on risk stratification remains unknown. A consensus guideline for paediatric fever and neutropenia research is now available and may help reduce some of the heterogeneity between studies that has previously limited the translation of evidence into clinical practice. SUMMARY Risk stratification is recommended for children with cancer and fever and neutropenia. Further research is required to quantify the overall impact of this approach and to refine exactly which children will benefit from early antibiotic administration as well as from modifications to empiric regimens to cover antibiotic-resistant organisms.
Abstract:
Environmental quality monitoring of water resources is challenged with providing the basis for safeguarding the environment against adverse biological effects of anthropogenic chemical contamination from diffuse and point sources. While current regulatory efforts focus on monitoring and assessing a few legacy chemicals, many more anthropogenic chemicals can be detected simultaneously in our aquatic resources. However, exposure to chemical mixtures does not necessarily translate into adverse biological effects, nor does it clearly show whether mitigation measures are needed. Thus, the question of which mixtures are present and which have associated combined effects becomes central for defining adequate monitoring and assessment strategies. Here we describe the vision of the international, EU-funded project SOLUTIONS, where three routes are explored to link the occurrence of chemical mixtures at specific sites to the assessment of adverse biological combination effects. First of all, multi-residue target and non-target screening techniques covering a broader range of anticipated chemicals co-occurring in the environment are being developed. By improving sensitivity and detection limits for known bioactive compounds of concern, new analytical chemistry data for multiple components can be obtained and used to characterise priority mixtures. This information on chemical occurrence will be used to predict mixture toxicity and to derive combined effect estimates suitable for advancing environmental quality standards. Secondly, bioanalytical tools will be explored to provide aggregate bioactivity measures integrating all components that produce common (adverse) outcomes, even for mixtures of varying compositions.
The ambition is to provide comprehensive arrays of effect-based tools and trait-based field observations that link multiple chemical exposures to various environmental protection goals more directly and to provide improved in situ observations for impact assessment of mixtures. Thirdly, effect-directed analysis (EDA) will be applied to identify major drivers of mixture toxicity. Refinements of EDA include the use of statistical approaches with monitoring information for guidance of experimental EDA studies. These three approaches will be explored using case studies at the Danube and Rhine river basins as well as rivers of the Iberian Peninsula. The synthesis of findings will be organised to provide guidance for future solution-oriented environmental monitoring and explore more systematic ways to assess mixture exposures and combination effects in future water quality monitoring.
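Predicting mixture toxicity from chemical occurrence data, as mentioned above, is commonly done with the concentration-addition model, in which the "toxic units" of the components are summed. The sketch below illustrates that standard model generically; it is not necessarily the specific method used in SOLUTIONS, and the concentrations and EC50 values are made-up numbers.

```python
# Concentration-addition (Loewe additivity) sketch: each component
# contributes c_i / EC50_i toxic units; a total of 1.0 means the mixture
# as a whole sits at the 50%-effect level.

def toxic_units(concentrations, ec50s):
    """Sum of toxic units c_i / EC50_i over the mixture components."""
    return sum(c / ec for c, ec in zip(concentrations, ec50s))

# hypothetical three-component mixture (concentrations and EC50s in the
# same units, e.g. ug/L)
tu = toxic_units([0.2, 0.05, 0.1], [1.0, 0.5, 0.4])
```

In this hypothetical example the mixture accumulates 0.55 toxic units, i.e. it stays below the 50%-effect level even though no single component comes close to its own EC50, which is why mixture-aware assessment matters.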
Abstract:
SOLUTIONS (2013 to 2018) is a European Union Seventh Framework Programme (EU-FP7) project. The project aims to deliver a conceptual framework to support the evidence-based development of environmental policies with regard to water quality. SOLUTIONS will develop the tools for the identification, prioritisation and assessment of those water contaminants that may pose a risk to ecosystems and human health. To this end, a new generation of chemical and effect-based monitoring tools is being developed and integrated with a full set of exposure, effect and risk assessment models. SOLUTIONS attempts to address legacy, present and future contamination by integrating monitoring- and modelling-based approaches with scenarios on future developments in society, economy and technology, and thus in contamination. The project follows a solutions-oriented approach by addressing major problems of water and chemicals management and by assessing abatement options. SOLUTIONS takes advantage of access to the infrastructure necessary to investigate the large basins of the Danube and Rhine as well as relevant Mediterranean basins as case studies, and puts a major effort into stakeholder dialogue and support. In particular, the EU Water Framework Directive (WFD) Common Implementation Strategy (CIS) working groups, International River Commissions, and water works associations are directly supported with consistent guidance for the early detection, identification, prioritisation, and abatement of chemicals in the water cycle. SOLUTIONS will place specific emphasis on concepts and tools for the impact and risk assessment of complex mixtures of emerging pollutants, their metabolites and transformation products. Analytical and effect-based screening tools will be applied together with ecological assessment tools for the identification of toxicants and their impacts.
The SOLUTIONS approach is expected to provide transparent and evidence-based candidates for River Basin Specific Pollutants in the case study basins and to assist future review of priority pollutants under the WFD as well as potential abatement options.
Abstract:
BACKGROUND Uncertainty about the presence of infection results in unnecessary and prolonged empiric antibiotic treatment of newborns at risk for early-onset sepsis (EOS). This study evaluates the impact of this uncertainty on the diversity in management. METHODS A web-based survey with questions addressing management of infection risk-adjusted scenarios was performed in Europe, North America, and Australia. Published national guidelines (n=5) were reviewed and compared to the results of the survey. RESULTS 439 clinicians (68% neonatologists) from 16 countries completed the survey. In the low-risk scenario, 29% would start antibiotic therapy and 26% would not, both groups without laboratory investigations; 45% would start if laboratory markers were abnormal. In the high-risk scenario, 99% would start antibiotic therapy. In the low-risk scenario, 89% would discontinue antibiotic therapy before 72 hours. In the high-risk scenario, 35% would discontinue therapy before 72 hours, 56% would continue therapy for 5 to 7 days, and 9% for more than 7 days. Laboratory investigations were used in 31% of scenarios for the decision to start, and in 72% for the decision to discontinue antibiotic treatment. National guidelines differ considerably regarding the decision to start therapy in low-risk situations and regarding the decision to continue therapy in higher-risk situations. CONCLUSIONS There is a broad diversity of clinical practice in the management of EOS and a lack of agreement between current guidelines. The results of the survey reflect the diversity of national guidelines. Prospective studies regarding management of neonates at risk of EOS with safety endpoints are needed.
Abstract:
BACKGROUND The reported frequency of post-stroke dysphagia in the literature is highly variable. In view of progress in stroke management, we aimed to assess the current burden of dysphagia in acute ischemic stroke. METHODS We studied 570 consecutive patients treated in a tertiary stroke center. Dysphagia was evaluated using the Gugging Swallowing Screen (GUSS). We investigated the relationship of dysphagia with pneumonia, length of hospital stay and discharge destination, and compared rates of favourable clinical outcome and mortality at 3 months between dysphagic patients and those without dysphagia. RESULTS Dysphagia was diagnosed in 118 of 570 (20.7%) patients and persisted in 60 (50.9%) at hospital discharge. Thirty-six (30.5%) patients needed a nasogastric tube because of severe dysphagia. Stroke severity rather than infarct location was associated with dysphagia. Dysphagic patients suffered more frequently from pneumonia (23.1% vs. 1.1%, p<0.001), stayed longer in monitored stroke unit beds (4.4±2.8 vs. 2.7±2.4 days; p<0.001) and were less often discharged to home (19.5% vs. 63.7%, p = 0.001) compared to those without dysphagia. At 3 months, dysphagic patients less often had a favourable outcome (35.7% vs. 69.7%; p<0.001), less often lived at home (38.8% vs. 76.5%; p<0.001), and more often had died (13.6% vs. 1.6%; p<0.001). Multivariate analyses identified dysphagia as an independent predictor of discharge destination and institutionalization at 3 months, while severe dysphagia requiring tube placement was strongly associated with mortality. CONCLUSION Dysphagia still affects a substantial portion of stroke patients and may have a large impact on clinical outcome, mortality and institutionalization.