63 results for Wildlife Monitoring and Conservation
Abstract:
Introduction: Although it seems plausible that sports performance relies on high-acuity foveal vision, it has been shown empirically that myopic blur (up to +2 diopters) does not harm performance in sport tasks that require foveal information pick-up, such as golf putting (Bulson, Ciuffreda, & Hung, 2008). How myopic blur affects peripheral performance is as yet unknown. With reduced foveal acuity, less attention may be needed for processing visual cues foveally, so that peripheral cues are processed better and performance improves; this hypothesis was tested in the current experiment. Methods: 18 sport science students with self-reported myopia volunteered as participants, all of them regularly wearing contact lenses. Exclusion criteria comprised visual correction other than myopic, correction of astigmatism, and use of contact lenses from outside the Swiss delivery area. For each participant, three pairs of additional contact lenses (besides their regular lenses, used in the “plano” condition) were manufactured with an individual overcorrection to a retinal defocus of +1 to +3 diopters (referred to as the “+1.00 D”, “+2.00 D”, and “+3.00 D” conditions, respectively). Gaze data were acquired while participants performed a multiple object tracking (MOT) task that required tracking 4 out of 10 moving stimuli. In addition, in 66.7 % of all trials, one of the 4 targets suddenly stopped during the motion phase for a period of 0.5 s. Stimuli moved in front of a picture of a sports hall to allow for foveal processing. Because of the directional hypotheses, the level of significance for one-tailed tests on differences was set at α = .05, and a posteriori effect sizes were computed as partial eta squared (ηp²). Results: Because of problems with the gaze-data collection, 3 participants had to be excluded from further analyses. The expectation of a centroid strategy was confirmed: gaze was closer to the centroid than to the targets (all p < .01).
In comparison to the plano baseline, participants more often recalled all 4 targets under defocus conditions, F(1,14) = 26.13, p < .01, ηp² = .65. The three defocus conditions differed significantly, F(2,28) = 2.56, p = .05, ηp² = .16, with higher accuracy as a function of increasing defocus and significant contrasts between conditions +1.00 D and +2.00 D (p = .03) and between +1.00 D and +3.00 D (p = .03). For stop trials, no significant differences were found either between the plano baseline and the defocus conditions, F(1,14) = .19, p = .67, ηp² = .01, or between the three defocus conditions, F(2,28) = 1.09, p = .18, ηp² = .07. Participants reacted faster in “4 correct+button” trials under defocus than under plano-baseline conditions, F(1,14) = 10.77, p < .01, ηp² = .44. The defocus conditions differed significantly, F(2,28) = 6.16, p < .01, ηp² = .31, with shorter response times as a function of increasing defocus and significant contrasts between +1.00 D and +2.00 D (p = .01) and between +1.00 D and +3.00 D (p < .01). Discussion: The results show that gaze behaviour in MOT is not affected to a relevant degree by a visual overcorrection of up to +3 diopters. Hence, it can be assumed that peripheral event detection was indeed investigated in the present study. This overcorrection, moreover, does not harm the capability to track objects peripherally, and when an event has to be detected peripherally, neither response accuracy nor response time is negatively affected. The findings are potentially relevant to all sport situations in which peripheral vision is required, which should now be examined in applied studies. References: Bulson, R. C., Ciuffreda, K. J., & Hung, G. K. (2008). The effect of retinal defocus on golf putting. Ophthalmic and Physiological Optics, 28, 334-344.
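The partial eta squared values above follow directly from each F statistic and its degrees of freedom via ηp² = F·df_effect / (F·df_effect + df_error). A minimal Python sketch (the helper name `partial_eta_squared` is ours, not the study's) that recovers the reported effect sizes to within rounding of the F statistics:

```python
def partial_eta_squared(f_value: float, df_effect: int, df_error: int) -> float:
    """Convert a reported F test into a partial eta squared effect size."""
    return (f_value * df_effect) / (f_value * df_effect + df_error)

# F tests reported in the abstract, with the published effect sizes.
# Computed values agree with the reported ones to within rounding of F.
reported_tests = [(26.13, 1, 14, .65), (2.56, 2, 28, .16),
                  (10.77, 1, 14, .44), (6.16, 2, 28, .31)]
for f, df1, df2, reported in reported_tests:
    est = partial_eta_squared(f, df1, df2)
    print(f"F({df1},{df2}) = {f:.2f} -> eta_p^2 = {est:.2f} (reported {reported})")
```

Note that the rounded F values printed in the abstract can shift the last decimal of the recomputed effect size by ±.01.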
Abstract:
Human–wildlife conflict is emerging as an important topic in conservation. Carnivores and birds of prey are responsible for most conflicts with livestock and game, but since the mid-1990s a new conflict has been emerging in south-west Europe: the presumed killing of livestock by griffon vultures Gyps fulvus. A lack of scientific data and magnification of the problem by the media are increasing alarm amongst the public, and management decisions implemented under political pressure have not been based on scientific evidence. We compiled information on 1,793 complaints about attacks by griffon vultures on livestock lodged with Spanish authorities from 1996 to 2010. Spain is home to the majority (95%) of griffon vultures and other scavengers in the European Union. Most of the cases occurred in areas of high livestock density, principally affected sheep (49%) and cows (31%), and were associated with spring birthing times (April–June). On average, 69% of the complaints made annually were rejected because of a lack of evidence about whether the animal was alive before being eaten. The total economic cost of compensation was EUR 278,590 from 2004 to 2010. We discuss possible ways to mitigate this emerging human–wildlife conflict; these need to include the participation of livestock farmers, authorities, scientists and conservation groups.
Abstract:
The value of wildlife has long been ignored or under-rated. However, growing concerns about biodiversity loss and emerging diseases of wildlife origin have enhanced debates about the importance of wildlife. Wildlife-related diseases are viewed through these debates as a potential threat to wildlife conservation and domestic animal and human health. This article provides an overview of the values we place on wildlife (positive: socio-cultural, nutritional, economic, ecological; and negative: damages, health issues) and of the significance of diseases for biodiversity conservation. It shows that the values of wildlife, the emergence of wildlife diseases and biodiversity conservation are closely linked. The article also illustrates why investigations into wildlife diseases are now recognized as an integral part of global health issues. The modern One Health concept requires multi-disciplinary research groups including veterinarians, human physicians, ecologists and other scientists collaborating towards a common goal: prevention of disease emergence and preservation of ecosystems, both of which are essential to protect human life and well-being.
Abstract:
Many efforts have been made in Ethiopia to mitigate land degradation, particularly soil erosion, through both local and newly introduced soil and water conservation (SWC) practices. However, a strict focus on soil erosion and conservation does not necessarily lead to satisfactory results. If SWC is effective in reducing erosion but is at the same time too costly and unacceptable to land users, sooner or later it will disappear and its positive effects will also be lost. This book therefore suggests following the broader approach of Sustainable Land Management (SLM), which aims at ecological soundness, economic viability and social acceptability, and thus places SWC in a more holistic framework that is closer to farmers’ reality.
Quantifying the impacts of Conservation Agriculture (CA) on water use, soil quality and productivity
Abstract:
Along a downstream stretch of the River Mureș, Romania, adult males of two feral fish species, European chub (Leuciscus cephalus) and sneep (Chondrostoma nasus), were sampled at four sites with different levels of contamination. Fish were analysed for the biochemical markers hsp70 (in liver and gills) and hepatic EROD activity, as well as several biometrical parameters (age, length, wet weight, condition factor). None of the biochemical markers correlated with any biometrical parameter; thus biomarker reactions were related to site-specific criteria. While the hepatic hsp70 level did not differ among the sites, significant elevation of the hsp70 level in the gills revealed proteotoxic damage in chub at the most upstream site, where we recorded the highest heavy metal contamination of the investigated stretch, and in both chub and sneep at the site immediately downstream of the city of Arad. In both species, significantly elevated hepatic EROD activity downstream of Arad indicated that fish from these sites are also exposed to organic chemicals. The results were indicative of impaired fish health at at least three of the four investigated sites. The approach of relating biomarker responses to analytical data on pollution was shown to fit well with recent EU demands for further enhanced efforts in the monitoring of Romanian water quality.
Abstract:
BACKGROUND: In high-income countries, viral load is routinely measured to detect failure of antiretroviral therapy (ART) and guide switching to second-line ART. Viral load monitoring is not generally available in resource-limited settings. We examined switching from nonnucleoside reverse transcriptase inhibitor (NNRTI)-based first-line regimens to protease inhibitor-based regimens in Africa, South America and Asia. DESIGN AND METHODS: Multicohort study of 17 ART programmes. All sites monitored CD4 cell count and had access to second-line ART and 10 sites monitored viral load. We compared times to switching, CD4 cell counts at switching and obtained adjusted hazard ratios for switching (aHRs) with 95% confidence intervals (CIs) from random-effects Weibull models. RESULTS: A total of 20 113 patients, including 6369 (31.7%) patients from 10 programmes with access to viral load monitoring, were analysed; 576 patients (2.9%) switched. Low CD4 cell counts at ART initiation were associated with switching in all programmes. Median time to switching was 16.3 months [interquartile range (IQR) 10.1-26.6] in programmes with viral load monitoring and 21.8 months (IQR 14.0-21.8) in programmes without viral load monitoring (P < 0.001). Median CD4 cell counts at switching were 161 cells/microl (IQR 77-265) in programmes with viral load monitoring and 102 cells/microl (44-181) in programmes without viral load monitoring (P < 0.001). Switching was more common in programmes with viral load monitoring during months 7-18 after starting ART (aHR 1.38; 95% CI 0.97-1.98), similar during months 19-30 (aHR 0.97; 95% CI 0.58-1.60) and less common during months 31-42 (aHR 0.29; 95% CI 0.11-0.79). CONCLUSION: In resource-limited settings, switching to second-line regimens tends to occur earlier and at higher CD4 cell counts in ART programmes with viral load monitoring compared with programmes without viral load monitoring.
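The adjusted hazard ratios above come from random-effects Weibull models fitted to the full multicohort data. As a much simpler illustration of what a hazard ratio expresses, under a constant-hazard (exponential) assumption it reduces to a ratio of event rates; all counts below are hypothetical, not the study's data:

```python
# Illustrative only: the study fitted random-effects Weibull models; this sketch
# shows the constant-hazard special case, where the hazard ratio of group A vs
# group B is simply the ratio of their crude event rates.

def hazard_ratio(events_a: int, person_time_a: float,
                 events_b: int, person_time_b: float) -> float:
    """Crude hazard ratio of group A vs group B under constant hazards."""
    return (events_a / person_time_a) / (events_b / person_time_b)

# Hypothetical example: 30 switches over 2,000 person-months in programmes with
# viral load monitoring vs 20 switches over 2,500 person-months without.
hr = hazard_ratio(30, 2000, 20, 2500)
print(f"crude HR = {hr:.2f}")  # -> crude HR = 1.88
```

A value above 1 indicates more frequent switching in the first group, matching the direction of the months 7-18 estimate reported above.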
Abstract:
Soil conservation technologies that fit well at the local scale and are acceptable to land users are increasingly needed. To achieve this at the small-holder farm level, there is a need for an understanding of specific erosion processes and indicators, of the land users’ knowledge, and of their willingness, ability and possibilities to respond to the respective problems when deciding on control options. This study was carried out to assess local erosion and the performance of earlier introduced conservation terraces from both technological and land users’ points of view. The study was conducted during July to August 2008 at the Angereb watershed on 58 farm plots from three selected case-study catchments. Participatory erosion assessment and evaluation were implemented along with direct field measurement procedures. Our focus was to involve the land users in the action research to explore with them the effectiveness of existing conservation measures against the erosion hazard. Terrace characteristics were measured and evaluated against the terrace implementation guideline of Hurni (1986). The long-term consequences of seasonal erosion indicators were often not known or noticed by farmers. The cause-and-effect relationships between the erosion indicators and conservation measures revealed limitations and gaps to be addressed in sustainable erosion control strategies. Erosion control was observed to be less effective, and participants believed these gaps to be the result of a lack of genuine participation by land users. The results of both local erosion observation and assessment of conservation efficacy from different aspects show the need to promote approaches in which erosion evaluation and the planning of interventions are carried out by the farmers themselves. This paper describes the importance of involving the human factor in empirical erosion assessment methods for sustainable soil conservation.
Abstract:
Many Member States of the European Union (EU) currently monitor antimicrobial resistance in zoonotic agents, including Salmonella and Campylobacter. According to Directive 2003/99/EC, Member States shall ensure that the monitoring provides comparable data on the occurrence of antimicrobial resistance. The European Commission asked the European Food Safety Authority to prepare detailed specifications for harmonised schemes for monitoring antimicrobial resistance. The objective of these specifications is to lay down provisions for a monitoring and reporting scheme for Salmonella in fowl (Gallus gallus), turkeys and pigs, and for Campylobacter jejuni and Campylobacter coli in broiler chickens. The current specifications are considered to be a first step towards a gradual implementation of comprehensive antimicrobial resistance monitoring at the EU level. These specifications propose to test a common set of antimicrobial agents against available cut-off values and a specified concentration range to determine the susceptibility of Salmonella and Campylobacter. Using isolates collected through programmes in which the sampling frame covers all epidemiological units of the national production, the target number of Salmonella isolates to be included in the antimicrobial resistance monitoring per Member State per year is 170 for each study population (i.e., laying hens, broilers, turkeys and slaughter pigs). The target number of Campylobacter isolates to be included in the antimicrobial resistance monitoring per Member State per year is 170 for each study population (i.e., broilers). The results of the antimicrobial resistance monitoring are assessed and reported in the yearly national report on trends and sources of zoonoses, zoonotic agents and antimicrobial resistance.
Abstract:
In a fast-changing world with growing concerns about biodiversity loss and an increasing number of animal and human diseases emerging from wildlife, the need for effective wildlife health investigations, including both surveillance and research, is now widely recognized. However, procedures applicable to and knowledge acquired from studies related to domestic animal and human health can be only partly extrapolated to wildlife. This article identifies requirements and challenges inherent in wildlife health investigations, reviews important definitions and novel health investigation methods, and proposes tools and strategies for effective wildlife health surveillance programs. Impediments to wildlife health investigations are largely related to zoological, behavioral and ecological characteristics of wildlife populations and to limited access to investigation materials. These concerns should not be viewed as insurmountable, but it is imperative that they are considered in study design, data analysis and result interpretation. It is particularly crucial to remember that health surveillance does not begin in the laboratory but in the field. In this context, participatory approaches and mutual respect are essential. Furthermore, interdisciplinarity and open minds are necessary because a wide range of tools and knowledge from different fields needs to be integrated in wildlife health surveillance and research. The identification of factors contributing to disease emergence requires the comparison of health and ecological data over time and among geographical regions. Finally, there is a need for the development and validation of diagnostic tests for wildlife species and for data on free-ranging population densities. Training of health professionals in wildlife diseases should also be improved.
Overall, the article particularly emphasizes five needs of wildlife health investigations: communication and collaboration; use of synergies and triangulation approaches; investments for the long term; systematic collection of metadata; and harmonization of definitions and methods.
Abstract:
The importance of long-term historical information derived from paleoecological studies has long been recognized as a fundamental aspect of effective conservation. However, there remains some uncertainty regarding the extent to which paleoecology can inform on specific issues of high conservation priority at the scales at which conservation policy decisions are often made. Here we review to what extent the past occurrence of three fundamental aspects of forest conservation can be assessed using paleoecological data, with a focus on northern Europe. These aspects are (1) tree species composition, (2) old/large trees and coarse woody debris, and (3) natural disturbances. We begin by evaluating the types of relevant historical information available from contemporary forests, then evaluate common paleoecological techniques, namely dendrochronology, pollen, macrofossil, charcoal, and fossil insect and wood analyses. We conclude that whereas contemporary forests can be used to estimate historical, natural occurrences of several of the aspects addressed here (e.g. old/large trees), paleoecological techniques are capable of providing much greater temporal depth, as well as robust quantitative data on tree species composition and fire disturbance and qualitative insights regarding old/large trees and woody debris, but only limited indications of past windstorms and insect outbreaks. We also find that studies of fossil wood and paleoentomology are perhaps the most underutilized sources of information. Not only can paleoentomology provide species-specific information, but it also enables the reconstruction of former environmental conditions otherwise unavailable. Despite this potential, the majority of conservation-relevant paleoecological studies primarily focus on describing historical forest conditions in broad terms and at large spatial scales, addressing former climate, land use, and landscape developments, often in the absence of a specific conservation context.
In contrast, relatively few studies address the most pressing conservation issues in northern Europe, which often require data on the presence or quantities of dead wood, large trees or specific tree species at the scale of the stand or reserve. Furthermore, even fewer examples exist of detailed paleoecological data being used for conservation planning or for setting operative restorative baseline conditions at local scales. If ecologists and conservation biologists are to benefit to the fullest extent possible from the ever-advancing techniques developed by the paleoecological sciences, further integration of these disciplines is desirable.
Abstract:
Combined approaches to conserve both biological and cultural diversity are seen as an alternative to classical nature conservation instruments. The objective of this study was to examine the influence of urbanization, coupled with exclusive conservation measures, on land use, local knowledge and biodiversity in two Quechua-speaking communities of Bolivia located within the Tunari National Park. We assessed and compared the links between land use, its transformation through conservation practices, local institutions and the worldviews of both communities, and the implications they have for biodiversity at the level of ecosystems. Our results show that in both communities, people’s worldviews and environmental knowledge are linked with an integral and diversified use of their territory. However, the community most affected by urbanization and protected-area regulations has intensified agriculture in a small area and has abandoned the use of large areas. This was accompanied by a loss of local environmental knowledge and a decrease in the diversity of ecosystems. The second community, where the park was not enforced, continues to manage its territory as a material expression of local environmental knowledge, while adopting community-based conservation measures with external support. Our findings highlight a case in which urbanization coupled with exclusive conservation approaches affects the components of both cultural and biological diversity. Actions that aim to enhance biocultural diversity in this context should therefore address the impact of the factors identified as responsible for change in integrated social-ecological systems.
Abstract:
Detecting and quantifying threats and researching and implementing management actions are key to improving the conservation status of endangered species. Bibliometric analysis can constitute a useful tool for the evaluation of such questions from a long-term perspective. Taking as a case study the Cinereous Vulture Aegypius monachus in Spain, we tested relationships between population dynamics, research efforts, existing threats and conservation milestones. The population growth of the species (from 206 pairs in 1976 to 2,068 in 2011) was paralleled by the increase in the total number of publications, the number of articles in SCI journals and the number of published works dealing with aspects of conservation, threats and management. These results are discussed in terms of cause-effect relationships, taking into account that other, non-mutually exclusive factors could also explain such associations. Similarly, we analysed the trend of the Cinereous Vulture breeding population with respect to different threats and indices of food availability, obtaining a positive correlation with the increase in big-game hunting bags in Spain. With respect to conservation milestones, we concluded that the current situation is positive in terms of the protection of the species and its habitat, while the situation in relation to food availability remains unclear. Finally, we reviewed the main conservation actions that have been taken for the species in Spain and how these have been progressively modified based on new scientific and technical evidence, as an example of adaptive management applied to conservation.
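The reported association between breeding-pair counts and food-availability indices is a straightforward correlation analysis. A minimal, self-contained sketch of Pearson's r follows; the two yearly series are hypothetical (only the 1976 and 2011 pair counts appear in the abstract), so the printed coefficient illustrates the method, not the study's result:

```python
import math

def pearson_r(xs: list, ys: list) -> float:
    """Pearson product-moment correlation between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical yearly series: breeding pairs (endpoints from the abstract,
# intermediate values invented) vs a big-game hunting-bag index.
pairs = [206, 450, 774, 1027, 1415, 2068]
hunting_bags = [10, 14, 22, 27, 35, 51]
print(f"r = {pearson_r(pairs, hunting_bags):.2f}")
```

For real analyses, a library routine that also returns a p-value (e.g. `scipy.stats.pearsonr`) would normally be preferred over a hand-rolled helper.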