874 results for development methods
Abstract:
This thesis is based on the reintroduction programme for the Eurasian otter (Lutra lutra) in the Muga and Fluvià river basins (Catalonia) during the second half of the 1990s. The objectives of the thesis were to demonstrate the feasibility of the reintroduction, to demonstrate its success, to study ecological and behavioural aspects of the species by taking advantage of the unique opportunity offered by a "designed" population, and to determine the long-term survival probability of the population. The reintroduction of the otter to the Muga and Fluvià basins succeeded: the geographic area effectively occupied increased to 64% of positive survey stations in winter 2001-02. The discovery of three adult individuals born in the reintroduction area is further evidence of the programme's success. The density estimated from visual censuses was low (0.04-0.11 otters/km), but is close to what can be expected in the early stages of a reintroduced population that is still small yet spread over a large area. Post-release mortality was 22% one year after release, similar to or lower than that of other successful otter reintroduction programmes. Mortality was mainly caused by road kills (56%). The activity pattern of the reintroduced otters became mainly nocturnal and crepuscular, with little diurnal activity. Their home ranges were of the same order (34.2 km) as those calculated in other European studies. The mean length of river travelled by an otter in 24 hours was 4.2 km for females and 7.6 km for males. During the radio-tracking period two females bred and their movements could be studied in detail. The response of the new otter population to the seasonal fluctuations in water availability typical of Mediterranean regions consisted of concentrating in a smaller area during the summer drought, owing to the increase in dry stretches, uninhabitable for otters because of the lack of food, which caused periodic expansions and contractions of the distribution area. The long-term persistence of the reintroduced population was studied by means of a Population Viability Analysis (PVA). The result was a low risk of extinction over the next 100 years, and most of the simulated scenarios (65%) met the criterion of a minimum 90% probability of survival. The population model indicates that a key point for ensuring the viability of the reintroduced population is the reduction of accidental mortality. In the study area, road kills cause more than 50% of mortality, and this can be reduced by building wildlife crossings, fencing some dangerous road sections and controlling speed on certain roads. The reintroduction project has established a protocol for the capture, handling and release of wild otters that may contain useful information for similar programmes. It also provided a unique opportunity to study an artificially designed population and to compare several methods for estimating the distribution and density of otter populations.
Finally, the reintroduction carried out in the Muga and Fluvià river basins has created a new otter population that persists over time, breeds regularly and is progressively dispersing, even into new river basins.
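The long-term persistence result above comes from a stochastic population projection (PVA). As a minimal sketch of that idea only, and not the model or parameter values used in the thesis, the following Python snippet simulates a small reintroduced population with illustrative demographic rates and reports the proportion of runs persisting for 100 years; every rate and the capacity value are hypothetical placeholders.

```python
import random

def project_population(years=100, n0=20, adult_survival=0.80,
                       cubs_per_female=2, cub_survival=0.50,
                       breeding_prob=0.5, carrying_capacity=80):
    """One stochastic projection of a small reintroduced population (illustrative rates only)."""
    n = n0
    for _ in range(years):
        # recruitment: roughly half the animals are female; each breeds with some probability
        females = n // 2
        recruits = 0
        for _ in range(females):
            if random.random() < breeding_prob:
                recruits += sum(random.random() < cub_survival
                                for _ in range(cubs_per_female))
        # adult survival applied individually (demographic stochasticity)
        survivors = sum(random.random() < adult_survival for _ in range(n))
        n = min(survivors + recruits, carrying_capacity)  # simple ceiling density dependence
        if n < 2:
            return False  # quasi-extinct
    return True

if __name__ == "__main__":
    runs = 1000
    persisted = sum(project_population() for _ in range(runs))
    print(f"Probability of persistence over 100 years: {persisted / runs:.2f}")
```

Lowering adult_survival (more accidental mortality) in such a sketch quickly reduces the persistence probability, which mirrors the thesis's conclusion that reducing road-kill mortality is the key management lever.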
Abstract:
The presence of pathogenic microorganisms in food is one of the key problems in public health, and the illnesses they cause are among the most important causes of disease. The application of microbiological controls within quality assurance programmes is therefore a prerequisite for minimizing consumers' risk of infection. Classical microbiological methods generally require non-selective pre-enrichment, selective enrichment, isolation on selective media and subsequent confirmation using tests based on the morphology, biochemistry and serology of each target microorganism. These methods are therefore laborious, take a long time to yield definitive results and, moreover, cannot always be performed. To overcome these drawbacks, several alternative methodologies have been developed for the detection, identification and quantification of foodborne pathogenic microorganisms, most notably immunological and molecular methods. In the latter category, the technique based on the polymerase chain reaction (PCR) has become the most popular diagnostic technique in microbiology, and recently the introduction of an improvement on it, real-time PCR, has produced a second revolution in molecular diagnostic methodology, as can be seen from the growing number of scientific publications and the continual appearance of new commercial kits. Real-time PCR is a highly sensitive technique, capable of detecting down to a single molecule, that allows the exact quantification of DNA sequences specific to foodborne pathogenic microorganisms. Other advantages favouring its potential implementation in food analysis laboratories are its speed, its simplicity and the closed-tube format, which can prevent post-PCR contamination and favours automation and high throughput. In this work, sensitive and reliable molecular techniques (PCR and NASBA) were developed for the detection, identification and quantification of foodborne pathogenic bacteria (Listeria spp., Mycobacterium avium subsp. paratuberculosis and Salmonella spp.). Specifically, methods based on real-time PCR were designed and optimized for each of these agents: L. monocytogenes, L. innocua, Listeria spp. and M. avium subsp. paratuberculosis; in addition, a previously developed method for Salmonella spp. was optimized and evaluated in different centres. Furthermore, a method based on the NASBA technique was designed and optimized for the specific detection of M. avium subsp. paratuberculosis, and the potential application of NASBA for the specific detection of viable forms of this microorganism was also evaluated. All the methods showed 100% specificity, with sensitivity adequate for their potential application to real food samples. In addition, sample preparation procedures were developed and evaluated for meat products, fishery products, milk and water. In this way, fully specific and highly sensitive real-time PCR methods were developed for the quantitative determination of L. monocytogenes in meat products and in salmon and derived products such as smoked salmon, and of M. avium subsp. paratuberculosis in water and milk samples.
The latter method was also applied to assess the presence of this microorganism in the intestine of patients with Crohn's disease, using colonoscopy biopsies obtained from affected volunteers. In conclusion, this study presents selective and sensitive molecular assays for the detection of pathogens in food (Listeria spp., Mycobacterium avium subsp. paratuberculosis) and for the rapid and unambiguous identification of Salmonella spp. The relative accuracy of the assays was excellent when compared with the reference microbiological methods, and they can be used for the quantification of both genomic DNA and cell suspensions. Moreover, their combination with pre-amplification treatments proved highly efficient for the analysis of the target bacteria. They can therefore constitute a useful strategy for the rapid and sensitive detection of pathogens in food, and should be an additional tool in the range of diagnostic tools available for the study of foodborne pathogens.
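Quantification by real-time PCR is normally performed against a standard curve relating the quantification cycle (Cq) to the log of the starting copy number. The Python sketch below fits such a curve by least squares and back-calculates the copy number of an unknown; the dilution series and Cq values are invented for illustration and are not data from this work.

```python
import math

# Hypothetical standard curve: ten-fold dilutions of a quantified DNA standard -> (copies, Cq)
standards = [(1e6, 15.1), (1e5, 18.5), (1e4, 21.9), (1e3, 25.4), (1e2, 28.8)]

# Fit Cq = slope * log10(copies) + intercept by ordinary least squares
xs = [math.log10(c) for c, _ in standards]
ys = [cq for _, cq in standards]
n = len(xs)
x_mean, y_mean = sum(xs) / n, sum(ys) / n
slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
         / sum((x - x_mean) ** 2 for x in xs))
intercept = y_mean - slope * x_mean

# Amplification efficiency from the slope (100% corresponds to a slope of about -3.32)
efficiency = 10 ** (-1 / slope) - 1

def copies_from_cq(cq):
    """Back-calculate the starting copy number of an unknown sample from its Cq."""
    return 10 ** ((cq - intercept) / slope)

print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")
print(f"Unknown with Cq 23.0 corresponds to about {copies_from_cq(23.0):.0f} copies per reaction")
```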
Abstract:
Birds are vulnerable to collisions with human-made fixed structures. Despite ongoing development and increases in infrastructure, we have few estimates of the magnitude of collision mortality. We reviewed the existing literature on avian mortality associated with transmission lines and derived an initial estimate for Canada. Estimating mortality from collisions with power lines is challenging due to the lack of studies, especially from sites within Canada, and due to uncertainty about the magnitude of detection biases. Detection of bird collisions with transmission lines varies due to habitat type, species size, and scavenging rates. In addition, birds can be crippled by the impact and subsequently die, although crippling rates are poorly known and rarely incorporated into estimates. We used existing data to derive a range of estimates of avian mortality associated with collisions with transmission lines in Canada by incorporating detection, scavenging, and crippling biases. There are 231,966 km of transmission lines across Canada, mostly in the boreal forest. Mortality estimates ranged from 1 million to 229.5 million birds per year, depending on the bias corrections applied. We consider our most realistic estimate, taking into account variation in risk across Canada, to range from 2.5 million to 25.6 million birds killed per year. Data from multiple studies across Canada and the northern U.S. indicate that the most vulnerable bird groups are (1) waterfowl, (2) grebes, (3) shorebirds, and (4) cranes, which is consistent with other studies. Populations of several groups that are vulnerable to collisions are increasing across Canada (e.g., waterfowl, raptors), which suggests that collision mortality, at current levels, is not limiting population growth. However, there may be impacts on other declining species, such as shorebirds and some species at risk, including Alberta’s Trumpeter Swans (Cygnus buccinator) and western Canada’s endangered Whooping Cranes (Grus americana). Collisions may be more common during migration, which underscores the need to understand impacts across the annual cycle. We emphasize that these estimates are preliminary, especially considering the absence of Canadian studies.
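The estimation approach described above amounts to scaling an observed per-kilometre carcass count upward for detection, scavenging and crippling losses and then multiplying by the length of the transmission network. A minimal Python sketch of that arithmetic follows; the per-km rate and the correction factors are placeholders, not the values used in the study (only the 231,966 km figure comes from the text).

```python
# Hypothetical inputs; the study's actual rates and corrections differ and vary by habitat.
km_of_line = 231_966          # total transmission-line length in Canada (from the abstract)
carcasses_per_km_year = 0.5   # observed carcasses per km per year (placeholder)
detection_rate = 0.6          # proportion of carcasses that searchers actually find
scavenging_persistence = 0.5  # proportion of carcasses not removed by scavengers before searches
crippling_factor = 1.2        # multiplier for birds fatally injured but not found under the line

# Correct the observed rate upward for birds missed, removed, or crippled, then scale nationally
corrected_rate = carcasses_per_km_year / (detection_rate * scavenging_persistence) * crippling_factor
total_mortality = corrected_rate * km_of_line

print(f"Corrected mortality: {corrected_rate:.2f} birds/km/year")
print(f"National estimate: {total_mortality:,.0f} birds/year")
```

Because the final figure is the product of several uncertain factors, small changes in any one correction propagate directly into the national total, which is why the abstract reports such a wide range of estimates.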
Abstract:
The acute hippocampal brain slice preparation is an important in vitro screening tool for potential anticonvulsants. Application of 4-aminopyridine (4-AP) or removal of external Mg2+ ions induces epileptiform bursting in slices which is analogous to electrical brain activity seen in status epilepticus states. We have developed these epileptiform models for use with multi-electrode arrays (MEAs), allowing recording across the hippocampal slice surface from 59 points. We present validation of this novel approach and analyses using two anticonvulsants, felbamate and phenobarbital, the effects of which have already been assessed in these models using conventional extracellular recordings. In addition to assessing drug effects on commonly described parameters (duration, amplitude and frequency), we describe novel methods using the MEA to assess burst propagation speeds and the underlying frequencies that contribute to the epileptiform activity seen. Contour plots are also used as a method of illustrating burst activity. Finally, we describe hitherto unreported properties of epileptiform bursting induced by 100 µM 4-AP or removal of external Mg2+ ions. Specifically, we observed decreases over time in burst amplitude and increases over time in burst frequency in the absence of additional pharmacological interventions. These MEA methods enhance the depth, quality and range of data that can be derived from the hippocampal slice preparation compared to conventional extracellular recordings. They may also uncover additional modes of action that contribute to anti-epileptiform drug effects.
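Burst propagation speed on an MEA can be estimated from the latency of burst onset between electrodes of known spacing, and the underlying frequencies from a spectral decomposition of the field potential. The NumPy sketch below illustrates both calculations on synthetic signals; the 200 µm electrode pitch, sampling rate and waveforms are assumptions made for illustration, not parameters of the published recordings.

```python
import numpy as np

fs = 10_000        # sampling rate in Hz (assumed)
pitch_um = 200.0   # inter-electrode spacing in micrometres (assumed for a standard MEA layout)

# Synthetic field potentials on three electrodes along one row: the same burst waveform
# arriving with increasing delay stands in for propagation across the slice.
t = np.arange(0, 1.0, 1 / fs)
burst = np.exp(-((t - 0.5) ** 2) / (2 * 0.05 ** 2)) * np.sin(2 * np.pi * 20 * t)
delays_s = [0.000, 0.004, 0.008]                      # 4 ms per electrode step
signals = [np.roll(burst, int(d * fs)) for d in delays_s]

# Onset latency per electrode: first sample exceeding half of that electrode's peak amplitude
onsets = [np.argmax(np.abs(s) > 0.5 * np.abs(s).max()) / fs for s in signals]
step_latencies = np.diff(onsets)                      # seconds per electrode step
speed_mm_s = (pitch_um / 1000) / step_latencies.mean()
print(f"Estimated propagation speed: {speed_mm_s:.1f} mm/s")

# Dominant frequency of the burst on the first electrode from an FFT power spectrum
spectrum = np.abs(np.fft.rfft(signals[0])) ** 2
freqs = np.fft.rfftfreq(len(t), 1 / fs)
print(f"Dominant frequency: {freqs[spectrum.argmax()]:.1f} Hz")
```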
Abstract:
Indicators are commonly recommended as tools for assessing the attainment of development, and the current vogue is for aggregating a number of indicators together into a single index. It is claimed that such indices of development help facilitate maximum impact in policy terms by appealing to those who may not necessarily have technical expertise in data collection, analysis and interpretation. In order to help counter criticisms of over-simplification, those advocating such indices also suggest that the raw data be provided so as to allow disaggregation into component parts and hence facilitate a more subtle interpretation if a reader so desires. This paper examines the problems involved with interpreting indices of development by focusing on the United Nations Development Programme's (UNDP) Human Development Index (HDI), published each year in the Human Development Reports (HDRs). The HDI was intended to provide an alternative to the more economically based indices, such as GDP, commonly used within neo-liberal development agendas. The paper explores the use of the HDI as a gauge of human development by making comparisons between two major political and economic communities in Africa (ECOWAS and SADC). While the HDI did help highlight important changes in human development over the 10-year period examined, it is concluded that the HDI and its components are difficult to interpret because methodologies have changed significantly and the 'averaging' nature of the HDI can hide information unless care is taken. The paper discusses the applicability of alternative models to the HDI, such as the more neo-populist centred methods commonly advocated for indicators of sustainable development.
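For readers unfamiliar with how the HDI is assembled: each dimension (health, education, income) is normalized between fixed "goalposts" and the resulting indices are averaged; in the HDRs of the period discussed the average was arithmetic, and the UNDP later moved to a geometric mean, one of the methodological changes that complicates comparison over time. The Python sketch below illustrates the averaging step with invented country values; the goalposts shown follow the commonly cited pre-2010 formulation and should be treated as indicative rather than as the exact values in any specific HDR.

```python
from math import log, prod

def dimension_index(value, minimum, maximum, use_log=False):
    """Normalize a raw value between 'goalposts' to the 0-1 range."""
    if use_log:  # income is compressed logarithmically before normalizing
        value, minimum, maximum = log(value), log(minimum), log(maximum)
    return (value - minimum) / (maximum - minimum)

# Invented raw values for a hypothetical country (not real HDR data)
life_expectancy = dimension_index(58.0, 25, 85)        # years, goalposts 25-85
education = dimension_index(0.62, 0, 1)                # composite literacy/enrolment index, already 0-1
income = dimension_index(1_800, 100, 40_000, use_log=True)  # GDP per capita (PPP US$)

components = [life_expectancy, education, income]
hdi_arithmetic = sum(components) / 3           # formulation used in the HDRs of that period
hdi_geometric = prod(components) ** (1 / 3)    # later (post-2010) formulation

print(f"components: {[round(c, 3) for c in components]}")
print(f"HDI (arithmetic mean): {hdi_arithmetic:.3f}")
print(f"HDI (geometric mean):  {hdi_geometric:.3f}")
```

The example also makes the paper's 'averaging' point concrete: very different component profiles can produce the same aggregate value, so the index alone can hide information.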
Abstract:
Partnerships are complex, diverse and subtle relationships, the nature of which changes with time, but they are vital for the functioning of the development chain. This paper reviews the meaning of partnership between development institutions as well as some of the main approaches taken to analyse the relationships. The latter typically revolve around analyses based on power, discourse, interdependence and functionality. The paper makes the case for taking a multianalytical approach to understanding partnership but points out three problem areas: identifying acceptable/unacceptable trade-offs between characteristics of partnership, the analysis of multicomponent partnerships (where one partner has a number of other partners) and the analysis of long-term partnership. The latter is especially problematic for long-term partnerships between donors and field agencies that share an underlying commitment based on religious beliefs. These problems with current methods of analysing partnership are highlighted by focusing upon the Catholic Church-based development chain, linking donors in the North (Europe) and their field partners in the South (Abuja Ecclesiastical Province, Nigeria). It explores a narrated history of a relationship with a single donor spanning 35 years from the perspective of one partner (the field agency).
Integrating methods for developing sustainability indicators that can facilitate learning and action
Abstract:
Bossel's (2001) systems-based approach for deriving comprehensive indicator sets provides one of the most holistic frameworks for developing sustainability indicators. It ensures that indicators cover all important aspects of system viability, performance, and sustainability, and recognizes that a system cannot be assessed in isolation from the systems upon which it depends and which in turn depend upon it. In this reply, we show how Bossel's approach is part of a wider convergence toward integrating participatory and reductionist approaches to measure progress toward sustainable development. However, we also show that further integration of these approaches may be able to improve the accuracy and reliability of indicators to better stimulate community learning and action. Only through active community involvement can indicators facilitate progress toward sustainable development goals. To engage communities effectively in the application of indicators, these communities must be actively involved in developing, and even in proposing, indicators. The accuracy, reliability, and sensitivity of the indicators derived from local communities can be ensured through an iterative process of empirical and community evaluation. Communities are unlikely to invest in measuring sustainability indicators unless monitoring provides immediate and clear benefits. However, in the context of goals, targets, and/or baselines, sustainability indicators can more effectively contribute to a process of development that matches local priorities and engages the interests of local people.
Abstract:
The ultimate criterion of success for interactive expert systems is that they will be used, and used to effect, by individuals other than the system developers. A key ingredient of success in most systems is involving users in the specification and development of systems as they are being built. However, until recently, system designers have paid little attention to ascertaining user needs and to developing systems with corresponding functionality and appropriate interfaces to match those requirements. Although the situation is beginning to change, many developers do not know how to go about involving users, or else tackle the problem in an inadequate way. This paper discusses the need for user involvement and considers why many developers are still not involving users in an optimal way. It looks at the different ways in which users can be involved in the development process and describes how to select appropriate techniques and methods for studying users. Finally, it discusses some of the problems inherent in involving users in expert system development, and recommends an approach which incorporates both ethnographic analysis and formal user testing.
Abstract:
This review article addresses recent advances in the analysis of foods and food components by capillary electrophoresis (CE). CE has found application to a number of important areas of food analysis, including quantitative chemical analysis of food additives, biochemical analysis of protein composition, and others. The speed, resolution and simplicity of CE, combined with low operating costs, make the technique an attractive option for the development of improved methods of food analysis for the new millennium.
Abstract:
The 'direct costs' attributable to 30 different endemic diseases of farm animals in Great Britain are estimated using a standardised method to construct a simple model for each disease that includes consideration of disease prevention and treatment costs. The models so far developed provide a basis for further analyses including cost-benefit analyses for the economic assessment of disease control options. The approach used reflects the inherent livestock disease information constraints, which limit the application of other economic analytical methods. It is a practical and transparent approach that is relatively easily communicated to veterinary scientists and policy makers. The next step is to develop the approach by incorporating wider economic considerations into the analyses in a way that will demonstrate to policy makers and others the importance of an economic perspective to livestock disease issues.
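The 'simple model for each disease' referred to above is essentially an accounting identity: direct cost is the sum of disease-induced output losses plus expenditure on treatment and prevention, each built up from incidence, effect per case and unit prices. A hedged Python sketch with invented figures (not the published estimates, which disaggregate losses and expenditures in far more detail) illustrates the structure.

```python
def direct_cost(population, incidence, loss_per_case,
                treatment_cost_per_case, prevention_cost_per_head):
    """Annual direct cost of one endemic disease: output losses + treatment + prevention.

    All arguments are illustrative placeholders, not values from the published models.
    """
    cases = population * incidence
    output_loss = cases * loss_per_case
    treatment = cases * treatment_cost_per_case
    prevention = population * prevention_cost_per_head
    return output_loss + treatment + prevention

# Hypothetical disease in a national herd of 2 million animals
cost = direct_cost(population=2_000_000, incidence=0.03,
                   loss_per_case=150.0, treatment_cost_per_case=40.0,
                   prevention_cost_per_head=2.5)
print(f"Estimated annual direct cost: £{cost:,.0f}")
```

Keeping the model this transparent is what makes it easy to communicate to veterinary scientists and policy makers, and it provides the baseline against which prevention or control options can later be compared in a cost-benefit analysis.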
Abstract:
Answering many of the critical questions in conservation, development and environmental management requires integrating the social and natural sciences. However, understanding the array of available quantitative methods and their associated terminology presents a major barrier to successful collaboration. We provide an overview of quantitative socio-economic methods that distils their complexity into a simple taxonomy. We outline how each has been used in conjunction with ecological models to address questions relating to the management of socio-ecological systems. We review the application of social and ecological quantitative concepts to agro-ecology and classify the approaches used to integrate the two disciplines. Our review included all published integrated models from 2003 to 2008 in 27 journals that publish agricultural modelling research. Although our focus is on agro-ecology, many of the results are broadly applicable to other fields involving an interaction between human activities and ecology. We found 36 papers that integrated social and ecological concepts in a quantitative model. Four different approaches to integration were used, depending on the scale at which human welfare was quantified. Most models viewed humans as pure profit maximizers, both when calculating welfare and predicting behaviour. Synthesis and applications. We reached two main conclusions based on our taxonomy and review. The first is that quantitative methods that extend predictions of behaviour and measurements of welfare beyond a simple market value basis are underutilized by integrated models. The second is that the accuracy of prediction for integrated models remains largely unquantified. Addressing both problems requires researchers to reach a common understanding of modelling goals and data requirements during the early stages of a project.
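The 'pure profit maximizer' assumption noted in the review can be made concrete with a toy coupled model: an agent chooses the input level that maximizes short-run profit, and that choice feeds back into a simple ecological state variable. The Python sketch below is a schematic of this class of integrated model only, with entirely hypothetical functions and parameters, not a model from any of the reviewed papers.

```python
import numpy as np

def profit(effort, habitat, price=10.0, max_yield=5.0, cost_per_unit=5.0):
    """Short-run profit: diminishing-returns yield scaled by habitat quality, minus input costs."""
    return price * max_yield * habitat * np.sqrt(effort) - cost_per_unit * effort

def habitat_update(habitat, effort, recovery=0.05, damage=0.02):
    """Habitat quality recovers logistically but is degraded in proportion to effort."""
    return float(np.clip(habitat + recovery * habitat * (1 - habitat) - damage * effort, 0.0, 1.0))

efforts = np.linspace(0.0, 50.0, 501)
habitat = 0.8
for year in range(5):
    # behavioural rule assumed by most reviewed models: maximize this year's profit only
    best_effort = efforts[np.argmax(profit(efforts, habitat))]
    habitat = habitat_update(habitat, best_effort)
    print(f"year {year}: effort = {best_effort:.1f}, habitat quality = {habitat:.2f}")
```

Extending the welfare and behaviour terms beyond this market-value core is exactly the underutilized direction the review identifies.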
Abstract:
Questions: How is succession on ex-arable land affected by sowing high- and low-diversity mixtures of grassland species as compared to natural succession? How long do effects persist? Location: Experimental plots installed in the Czech Republic, The Netherlands, Spain, Sweden and the United Kingdom. Methods: The experiment was established on ex-arable land, with five blocks, each containing three 10 m x 10 m experimental plots: natural colonization, a low-diversity (four species) and a high-diversity (15 species) seed mixture. Species composition and biomass were followed for eight years. Results: The sown plants considerably affected the whole successional pathway and the effects persisted throughout the eight-year period. Whilst the proportion of sown species (characterized by their cover) increased during the study period, the number of sown species started to decrease from the third season onwards. Sowing suppressed naturally colonizing species, and the sown plots had more biomass. These effects were on average larger in the high-diversity mixtures. However, the low-diversity replicate sown with the mixture that produced the largest biomass or the greatest suppression of natural colonizers fell within the range recorded across the five replicates of the high-diversity plots. The natural colonization plots usually had the highest total species richness and lowest productivity at the end of the observation period. Conclusions: The effect of sowing demonstrated dispersal limitation as a factor controlling the rate of early secondary succession. Diversity was important primarily for its 'insurance effect': the high-diversity mixtures were always able to compensate for the failure of some species.
Abstract:
An aggregated farm-level index, the Agri-environmental Footprint Index (AFI), based on multiple criteria methods and representing a harmonised approach to evaluation of EU agri-environmental schemes is described. The index uses a common framework for the design and evaluation of policy that can be customised to locally relevant agri-environmental issues and circumstances. Evaluation can be strictly policy-focused, or broader and more holistic in that context-relevant assessment criteria that are not necessarily considered in the evaluated policy can nevertheless be incorporated. The Index structure is flexible, and can respond to diverse local needs. The process of Index construction is interactive, engaging farmers and other relevant stakeholders in a transparent decision-making process that can ensure acceptance of the outcome, help to forge an improved understanding of local agri-environmental priorities and potentially increase awareness of the critical role of farmers in environmental management. The structure of the AFI facilitates post-evaluation analysis of relative performance in different dimensions of the agri-environment, permitting identification of current strengths and weaknesses, and enabling future improvement in policy design. Quantification of the environmental impact of agriculture beyond the stated aims of policy using an 'unweighted' form of the AFI has potential as the basis of an ongoing system of environmental audit within a specified agricultural context.
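Although the AFI's exact structure is defined in the paper itself, its aggregation logic is that of a weighted multi-criteria index: farm-level scores on locally chosen agri-environmental criteria are combined with stakeholder-derived weights, or left unweighted for audit-style use. The Python sketch below is a generic illustration of that kind of aggregation, not the published AFI formula; the criteria, scores and weights are invented.

```python
# Hypothetical assessment of one farm against locally agreed agri-environmental criteria.
# Scores are on a common 0-10 scale; weights come from a stakeholder elicitation and sum to 1.
criteria = {
    "soil management":      {"score": 7.0, "weight": 0.30},
    "water quality":        {"score": 5.5, "weight": 0.25},
    "biodiversity habitat": {"score": 8.0, "weight": 0.25},
    "landscape features":   {"score": 4.0, "weight": 0.20},
}

weighted_index = sum(c["score"] * c["weight"] for c in criteria.values())
unweighted_index = sum(c["score"] for c in criteria.values()) / len(criteria)

print(f"Weighted farm index:   {weighted_index:.2f} / 10")
print(f"Unweighted farm index: {unweighted_index:.2f} / 10")

# Post-evaluation diagnosis: which dimension pulls the index down?
weakest = min(criteria, key=lambda name: criteria[name]["score"])
print(f"Weakest dimension: {weakest}")
```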
Abstract:
A study was conducted to estimate variation among laboratories and between manual and automated techniques of measuring pressure on the resulting gas production profiles (GPP). Eight feeds (molassed sugarbeet feed, grass silage, maize silage, soyabean hulls, maize gluten feed, whole crop wheat silage, wheat, glucose) were milled to pass a 1 mm screen and sent to three laboratories (ADAS Nutritional Sciences Research Unit, UK; Institute of Grassland and Environmental Research (IGER), UK; Wageningen University, The Netherlands). Each laboratory measured GPP over 144 h using standardised procedures with manual pressure transducers (MPT) and automated pressure systems (APS). The APS at ADAS used a pressure transducer and bottles in a shaking water bath, while the APS at Wageningen and IGER used a pressure sensor and bottles held in a stationary rack. Apparent dry matter degradability (ADDM) was estimated at the end of the incubation. GPP were fitted to a modified Michaelis-Menten model assuming a single phase of gas production, and GPP were described in terms of the asymptotic volume of gas produced (A), the time to half A (B), the time of maximum gas production rate (t_RMgas) and the maximum gas production rate (R_Mgas). There were effects (P<0.001) of substrate on all parameters. However, MPT produced more (P<0.001) gas, but with longer (P<0.001) B and t_RMgas (P<0.05) and lower (P<0.001) R_Mgas compared to APS. There was no difference between apparatus in ADDM estimates. Interactions occurred between substrate and apparatus, substrate and laboratory, and laboratory and apparatus. However, when mean values for MPT were regressed from the individual laboratories, relationships were good (i.e., adjusted R² = 0.827 or higher). Good relationships were also observed with APS, although they were weaker than for MPT (i.e., adjusted R² = 0.723 or higher). The relationships between mean MPT and mean APS data were also good (i.e., adjusted R² = 0.844 or higher). Data suggest that, although laboratory and method of measuring pressure are sources of variation in GPP estimation, it should be possible using appropriate mathematical models to standardise data among laboratories so that data from one laboratory could be extrapolated to others. This would allow development of a database of GPP data from many diverse feeds.
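The curve fitting described above can be reproduced in outline: a sigmoidal (modified Michaelis-Menten type) function is fitted to the cumulative gas volumes, and the fitted parameters give A and B directly, while the time and value of the maximum rate follow by differentiation. The SciPy sketch below uses synthetic data and a generic three-parameter form G(t) = A / (1 + (B/t)^C), in which B is the time to half A; it illustrates the approach, not the exact parameterization or data used in the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def gas_curve(t, A, B, C):
    """Sigmoidal cumulative gas production: A = asymptote, B = time to half A, C = shape."""
    return A / (1.0 + (B / t) ** C)

# Synthetic 144 h incubation data (ml gas per g DM), loosely resembling a GPP
t_obs = np.array([2, 4, 8, 12, 24, 36, 48, 72, 96, 120, 144], dtype=float)
g_obs = gas_curve(t_obs, A=220.0, B=18.0, C=2.2)
g_obs += np.random.default_rng(1).normal(scale=3.0, size=t_obs.size)  # measurement noise

(A, B, C), _ = curve_fit(gas_curve, t_obs, g_obs, p0=[200.0, 20.0, 2.0])

# Maximum rate of gas production and the time at which it occurs, found numerically
t_grid = np.linspace(0.5, 144.0, 5000)
rates = np.gradient(gas_curve(t_grid, A, B, C), t_grid)
i_max = rates.argmax()
print(f"A = {A:.1f} ml, B = {B:.1f} h")
print(f"R_Mgas = {rates[i_max]:.2f} ml/h at t_RMgas = {t_grid[i_max]:.1f} h")
```

Fitting every laboratory-by-apparatus data set to the same functional form in this way is what allows the parameters, rather than the raw pressure readings, to be compared and standardised across laboratories.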
Abstract:
One of the major factors contributing to the failure of new wheat varieties is seasonal variability in end-use quality. Consequently, it is important to produce varieties which are robust and stable over a range of environmental conditions. Recently developed sample preparation methods have allowed the application of FT-IR spectroscopic imaging to the analysis of wheat endosperm cell wall composition, allowing the spatial distribution of structural components to be determined without the limitations of conventional chemical analysis. The advantages of the methods described in this paper are that they determine the composition of endosperm cell walls in situ and with minimal modification during preparation. Two bread-making wheat cultivars, Spark and Rialto, were selected to determine the impact of environmental conditions on the cell-wall composition of the starchy endosperm of the developing and mature grain, focusing on the period of grain filling (starting at about 14 days after anthesis). Studies carried out over two successive seasons show that the structure of the arabinoxylans in the endosperm cell walls changes from a highly branched form to a less branched form. Furthermore, during development the rate of restructuring was faster when the plants were grown at higher temperature with restricted water availability from 14 days after anthesis, with differences in the rate of restructuring between the two cultivars.
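FT-IR imaging data of the kind described are commonly analysed by mapping the ratio of absorbances at bands assigned to different cell-wall features across the image, so that spatial changes in, for example, arabinoxylan substitution can be visualized. The NumPy sketch below shows that generic band-ratio mapping step on a synthetic hyperspectral cube; the band positions are placeholders, not the assignments used in the paper.

```python
import numpy as np

# Synthetic FT-IR image cube: 64 x 64 pixels, absorbance spectra from 800 to 1800 cm-1
wavenumbers = np.linspace(800, 1800, 500)
rng = np.random.default_rng(0)
cube = rng.random((64, 64, wavenumbers.size)) * 0.05   # stand-in for measured absorbances

def band_intensity(cube, wavenumbers, centre, width=8.0):
    """Mean absorbance in a window around a band centre, computed for every pixel."""
    mask = np.abs(wavenumbers - centre) <= width
    return cube[:, :, mask].mean(axis=2)

# Placeholder band positions standing in for two cell-wall-related absorptions
band_a = band_intensity(cube, wavenumbers, centre=1045.0)
band_b = band_intensity(cube, wavenumbers, centre=1160.0)

ratio_map = band_a / (band_b + 1e-9)   # pixel-by-pixel band ratio across the endosperm section
print(f"ratio map shape: {ratio_map.shape}, mean ratio: {ratio_map.mean():.2f}")
```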