66 results for Participatory methodology
Trees, trust and the state: A comparison of participatory forest management in Pakistan and Tanzania
Abstract:
Relationships between mineralization, collagen orientation, and indentation modulus were investigated in bone structural units from the mid-shaft of human femora using a site-matched design. Mineral mass fraction, collagen fibril angle, and indentation moduli were measured at registered anatomical sites using backscattered electron imaging, polarized light microscopy, and nanoindentation, respectively. Theoretical indentation moduli were calculated with a homogenization model from the quantified mineral densities and mean collagen fibril orientations. The average indentation moduli predicted from local mineralization and collagen fiber arrangement were not significantly different from the averages measured experimentally with nanoindentation (p = 0.9). Surprisingly, no substantial correlation was found between the measured indentation moduli and tissue mineralization and/or collagen fiber arrangement. Nano-porosity, micro-damage, collagen cross-links, non-collagenous proteins, or other parameters may affect the indentation measurements. Additional testing and simulation methods need to be considered to properly understand the variability of indentation moduli beyond mineralization and collagen arrangement in bone structural units.
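The two statistical findings reported above (agreement of the averages, but no substantial site-by-site correlation) can be pictured with a short analysis sketch. The snippet below is a minimal illustration on synthetic data, not the authors' analysis code; the sample size, the paired t-test, and the Pearson correlation are assumptions for exposition.

```python
# Minimal sketch of the two comparisons described in the abstract (assumed synthetic data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_sites = 50  # hypothetical number of site-matched bone structural units

measured = rng.normal(20.0, 3.0, n_sites)    # GPa, nanoindentation measurements
predicted = rng.normal(20.0, 2.0, n_sites)   # GPa, homogenization-model predictions

# 1) Do predicted and measured moduli agree on average?
#    (the abstract reports no significant difference, p = 0.9)
t_stat, p_mean = stats.ttest_rel(predicted, measured)
print(f"paired test of means: t = {t_stat:.2f}, p = {p_mean:.2f}")

# 2) Do measured moduli correlate site-by-site with the model predictions?
#    (the abstract reports no substantial correlation)
r, p_corr = stats.pearsonr(predicted, measured)
print(f"Pearson r = {r:.2f}, p = {p_corr:.2f}")
```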
Abstract:
Desertification research conventionally focuses on the problem – that is, degradation – while neglecting the appraisal of successful conservation practices. Based on the premise that Sustainable Land Management (SLM) experiences are not sufficiently or comprehensively documented, evaluated, and shared, the World Overview of Conservation Approaches and Technologies (WOCAT) initiative (www.wocat.net), in collaboration with FAO’s Land Degradation Assessment in Drylands (LADA) project (www.fao.org/nr/lada/) and the EU’s DESIRE project (http://www.desire-project.eu/), has developed standardised tools and methods for compiling and evaluating the biophysical and socio-economic knowledge available about SLM. The tools allow SLM specialists to share their knowledge and assess the impact of SLM at the local, national, and global levels. As a whole, the WOCAT–LADA–DESIRE methodology comprises tools for documenting, self-evaluating, and assessing the impact of SLM practices, as well as for knowledge sharing and decision support in the field, at the planning level, and in scaling up identified good practices. SLM depends on flexibility and responsiveness to the changing, complex ecological and socio-economic causes of land degradation, and the WOCAT tools are designed to reflect and capture this capacity of SLM. To take account of new challenges and meet the emerging needs of WOCAT users, the tools are continuously developed further and adapted. Recent enhancements include tools for improved data analysis (impact and cost/benefit), cross-scale mapping, climate change adaptation and disaster risk management, and easier reporting on SLM best practices to the UNCCD and other national and international partners. Moreover, WOCAT has begun to give land users a voice by complementing conventional documentation with video clips straight from the field. To promote the scaling up of SLM, WOCAT works with key institutions and partners at the local and national levels, for example advisory services and implementation projects.
Keywords: Sustainable Land Management (SLM), knowledge management, decision-making, WOCAT–LADA–DESIRE methodology.
Abstract:
The Wetland and Wetland CH4 Intercomparison of Models Project (WETCHIMP) was created to evaluate our present ability to simulate large-scale wetland characteristics and corresponding methane (CH4) emissions. A multi-model comparison is essential to evaluate the key uncertainties in the mechanisms and parameters leading to methane emissions. Ten modelling groups joined WETCHIMP to run eight global and two regional models with a common experimental protocol using the same climate and atmospheric carbon dioxide (CO2) forcing datasets. We reported the main conclusions from the intercomparison effort in a companion paper (Melton et al., 2013). Here we provide technical details for the six experiments, which included an equilibrium, a transient, and an optimized run plus three sensitivity experiments (temperature, precipitation, and atmospheric CO2 concentration). The diversity of approaches used by the models is summarized through a series of conceptual figures, and is used to evaluate the wide range of wetland extent and CH4 fluxes predicted by the models in the equilibrium run. We discuss relationships among the various approaches and patterns of consistency in these model predictions. Within this group of models, there are three broad classes of methods used to estimate wetland extent: extent prescribed from wetland distribution maps, prognostic relationships between hydrological states based on satellite observations, and explicit hydrological mass balances. A larger variety of approaches was used to estimate the net CH4 fluxes from wetland systems. Even though modelling of wetland extent and CH4 emissions has progressed significantly over recent decades, large uncertainties still exist when estimating CH4 emissions: there is little consensus on model structure or complexity due to knowledge gaps, different aims of the models, and the range of temporal and spatial resolutions of the models.
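As a schematic illustration of the three broad classes of wetland-extent methods mentioned above, the sketch below contrasts them for a single grid cell. It is an assumed, simplified picture for exposition only: the function names, the inundation threshold, and the bucket-style mass balance are not taken from any WETCHIMP model.

```python
# Schematic contrast of three wetland-extent approaches for one grid cell (illustrative only).
import numpy as np

def prescribed_extent(wetland_map_fraction: float) -> float:
    """Class 1: wetland fraction read directly from a distribution map."""
    return wetland_map_fraction

def prognostic_extent(water_table_depth_m: float, threshold_m: float = 0.1) -> float:
    """Class 2: diagnose inundation from a hydrological state (here a crude
    step function of water-table depth; real models use relationships
    calibrated against satellite inundation products)."""
    return 1.0 if water_table_depth_m < threshold_m else 0.0

def mass_balance_extent(precip_mm: float, evap_mm: float, runoff_mm: float,
                        storage_mm: float, capacity_mm: float = 300.0) -> float:
    """Class 3: explicit water mass balance; the cell is treated as wetland
    in proportion to how full its storage is."""
    storage_mm = np.clip(storage_mm + precip_mm - evap_mm - runoff_mm, 0.0, capacity_mm)
    return float(storage_mm / capacity_mm)

print(prescribed_extent(0.25))
print(prognostic_extent(water_table_depth_m=0.05))
print(mass_balance_extent(precip_mm=40.0, evap_mm=25.0, runoff_mm=5.0, storage_mm=150.0))
```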
Abstract:
Competing water demands for household consumption as well as the production of food, energy, and other uses pose challenges for water supply and sustainable development in many parts of the world. Designing creative strategies and learning processes for sustainable water governance is thus of prime importance. While this need is uncontested, suitable approaches still have to be found. In this article we present and evaluate a conceptual approach to scenario building aimed at transdisciplinary learning for sustainable water governance. The approach combines normative, explorative, and participatory scenario elements. This combination allows for adequate consideration of stakeholders’ and scientists’ systems, target, and transformation knowledge. Application of the approach in the MontanAqua project in the Swiss Alps confirmed its high potential for co-producing new knowledge and establishing a meaningful and deliberative dialogue between all actors involved. The iterative and combined approach ensured that stakeholders’ knowledge was adequately captured, fed into scientific analysis, and brought back to stakeholders in several cycles, thereby facilitating learning and co-production of new knowledge relevant for both stakeholders and scientists. However, the approach also revealed a number of constraints, including the enormous flexibility required of stakeholders and scientists in order for them to truly engage in the co-production of new knowledge. Overall, the study showed that shifts from strategic to communicative action are possible in an environment of mutual trust. This ultimately depends on creating conditions of interaction that place scientists’ and stakeholders’ knowledge on an equal footing.
Abstract:
This year marks the 20th anniversary of functional near-infrared spectroscopy and imaging (fNIRS/fNIRI). As the vast majority of commercial instruments developed until now are based on continuous wave technology, the aim of this publication is to review the current state of instrumentation and methodology of continuous wave fNIRI. For this purpose we provide an overview of the commercially available instruments and address instrumental aspects such as light sources, detectors and sensor arrangements. Methodological aspects, algorithms to calculate the concentrations of oxy- and deoxyhemoglobin and approaches for data analysis are also reviewed. From the single-location measurements of the early years, instrumentation has progressed to imaging initially in two dimensions (topography) and then three (tomography). The methods of analysis have also changed tremendously, from the simple modified Beer-Lambert law to sophisticated image reconstruction and data analysis methods used today. Due to these advances, fNIRI has become a modality that is widely used in neuroscience research and several manufacturers provide commercial instrumentation. It seems likely that fNIRI will become a clinical tool in the foreseeable future, which will enable diagnosis in single subjects.
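The modified Beer-Lambert law mentioned above converts changes in measured optical density at two (or more) wavelengths into concentration changes of oxy- and deoxyhemoglobin by solving a small linear system. The sketch below is a minimal, textbook-style illustration; the extinction coefficients, differential pathlength factors, source-detector distance, and optical-density values are placeholder assumptions, not values from any particular instrument.

```python
# Minimal sketch of the modified Beer-Lambert law for two wavelengths (assumed values).
import numpy as np

# Assumed molar extinction coefficients [1/(mM*cm)] at ~760 nm and ~850 nm;
# rows: wavelength, columns: [HbO2, HbR]. Real analyses use tabulated spectra.
E = np.array([[0.59, 1.55],   # ~760 nm
              [1.06, 0.78]])  # ~850 nm

d = 3.0                       # source-detector separation in cm (assumed)
dpf = np.array([6.0, 5.5])    # differential pathlength factors per wavelength (assumed)

# Measured change in optical density at the two wavelengths: delta_OD = -log10(I / I0).
delta_od = np.array([0.012, 0.018])

# MBLL: delta_OD(lambda) = (eps_HbO2 * dHbO2 + eps_HbR * dHbR) * d * DPF(lambda).
# Solve the 2x2 linear system for the concentration changes [mM].
A = E * (d * dpf)[:, None]
d_hbo2, d_hbr = np.linalg.solve(A, delta_od)
print(f"delta HbO2 = {d_hbo2 * 1000:.3f} uM, delta HbR = {d_hbr * 1000:.3f} uM")
```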
Abstract:
A 12-year-old German Shepherd dog was examined because of polyuria, polydipsia, and polyphagia of one month's duration. As the hemogram and biochemistry profile were compatible with hypercortisolism, functional tests were performed and raised suspicion of primary adrenocortical tumor-dependent hyperadrenocorticism. The diagnosis was confirmed by CT examination, and an adrenocortical carcinoma was found at surgery. The diagnostic evaluation of Cushing's syndrome in the dog is discussed.
Abstract:
This study was undertaken to evaluate the specificity and efficiency of different methods for detecting Escherichia coli K-12 strains. Another aim was to determine the frequency of E. coli K-12 strains among wild-type E. coli isolates from different sources. Detection of K-12 strains was performed both genotypically, by K-12-specific polymerase chain reaction (PCR), and on the basis of phenotypic tests. In addition, the genome structures of the E. coli strains were characterized by pulsed-field gel electrophoresis (PFGE). The most specific results were obtained with the genotypic tests (PCR and PFGE) and with the K-12-specific phage assay. In total, 131 stool isolates, 95 water isolates, and 14 K-12 derivatives were examined by the different methods. No E. coli K-12 strains were detected among the wild-type isolates.
Abstract:
Clock synchronization on the order of nanoseconds is one of the critical factors for time-based localization. Currently used time synchronization methods were developed for the more relaxed needs of network operation, so their usability for positioning should be carefully evaluated. In this paper, we are particularly interested in GPS-based time synchronization. To judge its usability for localization, we need a method that can evaluate the achieved time synchronization with nanosecond accuracy. Our method for evaluating synchronization accuracy is inspired by signal processing algorithms and relies on fine-grained time information. The method is able to calculate the clock offset and skew between devices with nanosecond accuracy in real time. It was implemented using software-defined radio technology. We demonstrate that GPS-based synchronization suffers from a residual clock offset in the range of a few hundred nanoseconds, but that the clock skew is negligible. Finally, we determine a corresponding lower bound on the expected positioning error.
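Offset-and-skew estimation of the kind described above can be pictured as fitting a line to pairs of timestamps recorded by the two clocks: the intercept is the offset and the slope minus one is the skew. The sketch below is an assumed, simplified illustration on synthetic timestamps using an ordinary least-squares fit; it is not the signal-processing method of the paper. The last line shows how a residual offset translates into a distance-error bound for time-based localization via the speed of light.

```python
# Simplified sketch: estimate clock offset and skew from timestamp pairs (synthetic data).
import numpy as np

C = 299_792_458.0  # speed of light in m/s

# Synthetic "true" event times on a reference clock (seconds).
t_ref = np.linspace(0.0, 1.0, 1000)

# Device clock with an assumed 300 ns offset, 2 ppm skew, and 20 ns timestamping jitter.
rng = np.random.default_rng(1)
t_dev = (1.0 + 2e-6) * t_ref + 300e-9 + rng.normal(0.0, 20e-9, t_ref.size)

# Least-squares line fit: t_dev ~= (1 + skew) * t_ref + offset.
slope, offset = np.polyfit(t_ref, t_dev, 1)
skew_ppm = (slope - 1.0) * 1e6
print(f"estimated offset: {offset * 1e9:.1f} ns, skew: {skew_ppm:.2f} ppm")

# A residual clock offset maps directly onto a ranging-error bound: error >= c * offset.
print(f"corresponding distance error: {C * offset:.1f} m")
```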