891 results for "Probability of extinction"


Relevance: 100.00%

Abstract:

There is now an extensive literature on extinction debt following deforestation. However, the potential for species credit in landscapes that have experienced a change from decreasing to expanding forest cover has received little attention. Both delayed responses should depend on current landscape forest cover and on species life-history traits, such as longevity, as short-lived species are likely to respond faster than long-lived species. We evaluated the effects of historical and present-day local forest cover on two vertebrate groups with different longevities - understorey birds and non-flying small mammals - in forest patches in three Atlantic Forest landscapes. Our work investigated how the probability of extinction debt and species credit varies (i) amongst landscapes with different proportions of forest cover and distinct trajectories of forest cover change, and (ii) between taxa with different life spans. Our results suggest that the existence of extinction debt and species credit, as well as the potential for their future payment and/or receipt, is related not only to forest cover trajectory but also to the amount of remaining forest cover at the landscape scale. Moreover, differences in bird and small mammal life spans seem to be insufficient to differently affect their probability of showing time-delayed responses to landscape change. Synthesis and applications. Our work highlights the need to consider not only the trajectory of deforestation/regeneration but also the amount of forest cover at the landscape scale when investigating time-delayed responses to landscape change. As many landscapes are experiencing a change from decreasing to expanding forest cover, understanding the association of extinction and immigration processes, as well as their interactions with landscape dynamics, is key to planning conservation and restoration actions in human-altered landscapes.

Relevance: 100.00%

Abstract:

1. Sawfishes are currently among the most threatened elasmobranchs in the world. Only two species inhabit Atlantic waters: the largetooth sawfish (Pristis pristis) and the smalltooth sawfish (Pristis pectinata), both having suffered dramatic declines in their ranges. 2. The goal of this study was to evaluate the status of P. pristis in the Atlantic and estimate local extinction risk based on historical and recent occurrence records. To accomplish these goals, a thorough search for historical and recent records of P. pristis in the Atlantic was conducted by reviewing scientific and popular literature, examining museum specimens, and contacting regional scientists from the species' historical range. 3. In total, 801 P. pristis records (1830-2009) document its occurrence in four major regions in the Atlantic: USA (n = 41), Mexico and Central America (n = 535), South America (n = 162), and West Africa (n = 48). Locality data were not available for 15 records. 4. Historical abundance centres were the Colorado-San Juan River system in Nicaragua and Costa Rica (and secondarily Lake Izabal in Guatemala), the Amazon estuary, and coastal Guinea-Bissau. 5. Currently, the species faces drastic depletion throughout its entire former range and centres of abundance. It appears to have been extirpated from several areas. The probability of extinction was highest in the USA, northern South America (Colombia to Guyane), and southern West Africa (Cameroon to Namibia). 6. Currently, the Amazon estuary appears to have the highest remaining abundance of P. pristis in the Atlantic, followed by the Colorado-San Juan River system in Nicaragua and Costa Rica and the Bissagos Archipelago in Guinea-Bissau. Therefore, the protection of these populations is crucial for the preservation and recovery of the species.
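The abstract describes inferring local extinction risk from the timing of occurrence records. The abstract does not state the authors' exact method, but a standard tool for this kind of inference is Solow's sighting-record test, sketched here under its stationary-Poisson assumption:

```python
def solow_p_extant(sighting_years, current_year):
    """Solow's (1993) sighting-record test (an illustrative stand-in for the
    paper's method, which the abstract does not specify). Under a stationary
    Poisson sighting process, the p-value for continued persistence is
    p = ((t_n - t_1) / (T - t_1)) ** (n - 1),
    where t_1 and t_n are the first and last sightings, n the number of
    sightings, and T the present. Small p suggests extinction."""
    n = len(sighting_years)
    t1, tn = min(sighting_years), max(sighting_years)
    return ((tn - t1) / (current_year - t1)) ** (n - 1)
```

A long gap since the last record, relative to the span and density of earlier records, drives the p-value down; a species recorded up to the present yields p = 1.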

Relevance: 100.00%

Abstract:

A fundamental problem faced by stereo matching algorithms is the matching, or correspondence, problem, and a wide range of algorithms have been proposed for it. For any matching algorithm, it would be useful to be able to compute a measure of the probability of correctness, or reliability, of a match. This paper focuses in particular on one class of matching algorithms, those based on the rank transform. The interest in these algorithms for stereo matching stems from their invariance to radiometric distortion and their amenability to fast hardware implementation. This work differs from previous work in that it derives, from first principles, an expression for the probability of a correct match, based on an enumeration of all possible symbols for matching. The theoretical results for disparity error prediction obtained using this method were found to agree well with experimental results. However, disadvantages of the technique are that it is not easily applicable to real images and that it is too computationally expensive for practical window sizes. Nevertheless, the exercise provides an interesting and novel analysis of match reliability.
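The rank transform underlying this class of algorithms replaces each pixel by the rank of its intensity within a local window, which is what makes matching invariant to radiometric distortion. A minimal sketch (window radius and types are illustrative choices, not the paper's parameters):

```python
import numpy as np

def rank_transform(img, radius=1):
    """Rank transform: each pixel becomes the count of pixels in its
    (2*radius+1)^2 window whose intensity is strictly below the centre.
    Ranks depend only on the intensity ordering, so the result is unchanged
    by any monotonic radiometric distortion of the image."""
    h, w = img.shape
    out = np.zeros((h, w), dtype=np.int32)
    for y in range(radius, h - radius):
        for x in range(radius, w - radius):
            win = img[y - radius:y + radius + 1, x - radius:x + radius + 1]
            out[y, x] = np.count_nonzero(win < img[y, x])
    return out
```

Stereo correspondence is then scored on the rank images (e.g. by sum of absolute differences along the epipolar line) rather than on raw intensities.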

Relevance: 100.00%

Abstract:

Background: Developing sampling strategies to target biological pests such as insects in stored grain is inherently difficult owing to species biology and behavioural characteristics. The design of robust sampling programmes should be based on an underlying statistical distribution that is sufficiently flexible to capture variations in the spatial distribution of the target species. Results: Comparisons are made of the accuracy of four probability-of-detection sampling models - the negative binomial model, the Poisson model, the double logarithmic model and the compound model - for detection of insects over a broad range of insect densities. Although the double logarithmic and negative binomial models performed well under specific conditions, it is shown that, of the four models examined, the compound model performed best over a broad range of insect spatial distributions and densities. In particular, this model predicted well the number of samples required when insect density was high and clumped within experimental storages. Conclusions: This paper reinforces the need for effective sampling programmes designed to detect insects over a broad range of spatial distributions. The compound model is robust over a broad range of insect densities and leads to substantial improvement in detection probabilities within highly variable systems such as grain storage.
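These models differ in the per-sample probability of finding zero insects, which is what sets the detection probability for a given number of samples. A sketch of the Poisson and negative binomial cases (the parameterization is a textbook one, not the paper's fitted models; the double logarithmic and compound models are omitted):

```python
import math

def p_detect_poisson(m, n):
    # P(at least one insect in n samples) when counts per sample are
    # Poisson with mean m: the zero-count probability is exp(-m).
    return 1.0 - math.exp(-m * n)

def p_detect_negbin(m, k, n):
    # Negative binomial with mean m and clumping parameter k per sample:
    # the zero-count probability is (k / (k + m))**k. Small k (strong
    # clumping) makes empty samples more likely at the same mean density.
    return 1.0 - (k / (k + m)) ** (k * n)

def samples_required(p_zero_one_sample, target=0.95):
    # Smallest n with detection probability 1 - p_zero**n >= target.
    return math.ceil(math.log(1.0 - target) / math.log(p_zero_one_sample))
```

This makes the paper's point concrete: at the same mean density, a clumped (low-k) population is harder to detect than a randomly dispersed one, so the assumed distribution directly changes the number of samples a programme requires.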

Relevance: 100.00%

Abstract:

Light of Extinction presents a diverse series of views into the complex antics of a semi-autonomous gaggle of robotic actants. Audiences initially enter into the 'back end' of the experience to be rudely confronted with the raw, messy operations of a horde of object-manipulating robotic forms. Seen through viewing apertures, these 'things' deny any opportunity to grasp their imagined order. Audiences then flow on into the 'front end' of the work where now, seen through another aperture, the very same forms seemingly coordinate a stunning deep-field choreography, floating lusciously within inky landscapes of media, noise and embodied sound. As one series of conceptions slips into extinction, so others flow on in. The idea of the 'extinction of human experience' expresses a projected fear for that which will disappear when biodiverse worlds have descended into an era of permanent darkness. 'Light of Extinction' re-positions this anthropomorphic lament in order to suggest a more rounded acknowledgement of what might still remain - suggesting the previously unacknowledged power and place of autonomous, synthetic creation. Momentary disbelief gives way to a relieving celebration of the imagined birth of 'things' - without need for staples such as conventional light or the harmonious lullabies of long-extinguished sounds.

Relevance: 100.00%

Abstract:

Aim: To quantify the consequences of major threats to biodiversity, such as climate and land-use change, it is important to use explicit measures of species persistence, such as extinction risk. The extinction risk of metapopulations can be approximated through simple models, providing a regional snapshot of the extinction probability of a species. We evaluated the extinction risk of three species under different climate change scenarios in three different regions of the Mexican cloud forest, a highly fragmented habitat that is particularly vulnerable to climate change. Location: Cloud forests in Mexico. Methods: Using Maxent, we estimated the potential distribution of cloud forest for three different time horizons (2030, 2050 and 2080) and their overlap with protected areas. Then, we calculated the extinction risk of three contrasting vertebrate species for two scenarios: (1) climate change only (all suitable areas of cloud forest through time) and (2) climate and land-use change (only suitable areas within a currently protected area), using an explicit patch-occupancy approximation model and calculating the joint probability of all populations becoming extinct when the number of remaining patches was less than five. Results: Our results show that the extent of environmentally suitable areas for cloud forest in Mexico will sharply decline in the next 70 years. We discovered that if all habitat outside protected areas is transformed, then only species with small area requirements are likely to persist. With habitat loss through climate change only, high dispersal rates are sufficient for persistence, but this requires protection of all remaining cloud forest areas. Main conclusions: Even if high dispersal rates mitigate the extinction risk of species due to climate change, the synergistic impacts of changing climate and land use further threaten the persistence of species with higher area requirements. 
Our approach for assessing the impacts of threats on biodiversity is particularly useful when there is little time or data for detailed population viability analyses. © 2013 John Wiley & Sons Ltd.
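The key quantity in the methods above is the joint probability of all remaining populations becoming extinct, evaluated only once fewer than five patches remain. A minimal sketch of that rule, assuming independent patch-level risks (the paper's patch-occupancy approximation is more detailed):

```python
def metapopulation_risk(patch_risks, min_patches=5):
    """Joint probability that every remaining population goes extinct,
    computed only when fewer than `min_patches` patches remain (following
    the threshold described in the abstract); otherwise treated as
    negligible. Assumes independent patch-level extinction risks."""
    if len(patch_risks) >= min_patches:
        return 0.0
    p = 1.0
    for r in patch_risks:
        p *= r
    return p
```

Under this rule, risk rises sharply as habitat contraction pushes the patch count below the threshold, which is why the projected loss of suitable cloud forest area translates directly into elevated extinction risk.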

Relevance: 100.00%

Abstract:

Money is often a limiting factor in conservation, and attempting to conserve endangered species can be costly. Consequently, a framework for optimizing fiscally constrained conservation decisions for a single species is needed. In this paper we find the optimal budget allocation among isolated subpopulations of a threatened species to minimize local extinction probability. We solve the problem using stochastic dynamic programming, derive a useful and simple alternative guideline for allocating funds, and test its performance using forward simulation. The model considers subpopulations that persist in habitat patches of differing quality, which in our model is reflected in different relationships between money invested and extinction risk. We discover that, in most cases, subpopulations that are less efficient to manage should receive more money than those that are more efficient to manage, due to higher investment needed to reduce extinction risk. Our simple investment guideline performs almost as well as the exact optimal strategy. We illustrate our approach with a case study of the management of the Sumatran tiger, Panthera tigris sumatrae, in Kerinci Seblat National Park (KSNP), Indonesia. We find that different budgets should be allocated to the separate tiger subpopulations in KSNP. The subpopulation that is not at risk of extinction does not require any management investment. Based on the combination of risks of extinction and habitat quality, the optimal allocation for these particular tiger subpopulations is an unusual case: subpopulations that occur in higher-quality habitat (more efficient to manage) should receive more funds than the remaining subpopulation that is in lower-quality habitat. Because the yearly budget allocated to the KSNP for tiger conservation is small, to guarantee the persistence of all the subpopulations that are currently under threat we need to prioritize those that are easier to save. 
When allocating resources among subpopulations of a threatened species, the combined effects of differences in habitat quality, cost of action, and current subpopulation probability of extinction need to be integrated. We provide a useful guideline for allocating resources among isolated subpopulations of any threatened species. © 2010 by the Ecological Society of America.
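The paper's simple guideline can be illustrated with a greedy marginal allocation: each unit of budget goes to the subpopulation whose extinction risk it reduces most. The exponential risk-investment curve below is an assumption for illustration, not the paper's stochastic dynamic programming formulation:

```python
import math

def allocate_budget(r0, b, budget, step=1.0):
    """Greedy marginal allocation among isolated subpopulations.
    Assumed risk model (illustrative): r_i(x) = r0[i] * exp(-b[i] * x),
    where r0[i] is the unmanaged extinction risk and b[i] the efficiency
    of investment in subpopulation i. Returns (allocations, final risks)."""
    x = [0.0] * len(r0)
    risk = list(r0)
    remaining = budget
    while remaining >= step:
        # Risk reduction each subpopulation would gain from one more step.
        gains = [risk[i] - r0[i] * math.exp(-b[i] * (x[i] + step))
                 for i in range(len(r0))]
        i = max(range(len(r0)), key=lambda j: gains[j])
        x[i] += step
        risk[i] = r0[i] * math.exp(-b[i] * x[i])
        remaining -= step
    return x, risk
```

Consistent with the case study, a subpopulation whose baseline risk is already zero attracts no funds, while the split among at-risk subpopulations depends jointly on baseline risk and management efficiency.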

Relevance: 100.00%

Abstract:

Purpose: This study evaluated the impact of patient set-up errors on the probability of pulmonary and cardiac complications in the irradiation of left-sided breast cancer. Methods and Materials: Using the NTCP algorithm of the CMS XiO Version 4.6 (CMS Inc., St Louis, MO) radiotherapy planning system and the Lyman-Kutcher-Burman (LKB) model, we calculated the DVH indices for the ipsilateral lung and heart and the resultant normal tissue complication probabilities (NTCP) for radiation-induced pneumonitis and excess cardiac mortality in 12 left-sided breast cancer patients. Results: Isocentre shifts in the posterior direction had the greatest effect on the lung V20, heart V25, and mean and maximum doses to the lung and the heart. Dose-volume histogram (DVH) results show that the ipsilateral lung V20 tolerance was exceeded in 58% of the patients after 1 cm posterior shifts. Similarly, the heart V25 tolerance was exceeded after 1 cm antero-posterior and left-right isocentric shifts in 70% of the patients. The baseline NTCP for radiation-induced pneumonitis ranged from 0.73% to 3.4%, with a mean value of 1.7%. The maximum reported NTCP for radiation-induced pneumonitis was 5.8% (mean 2.6%) after a 1 cm posterior isocentric shift. The NTCP for excess cardiac mortality was 0% in 100% of the patients (n = 12) before and after set-up error simulations. Conclusions: Set-up errors in left-sided breast cancer patients have a statistically significant impact on the lung NTCP and DVH indices. However, with a central lung distance of 3 cm or less (CLD < 3 cm) and a maximum heart distance of 1.5 cm or less (MHD < 1.5 cm), the treatment plans could tolerate set-up errors of up to 1 cm without any change in the NTCP to the heart.
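The LKB model named above maps a dose-volume histogram to a complication probability via a generalized equivalent uniform dose (gEUD) and a normal-distribution dose response. A minimal sketch (the parameter values in the test are illustrative, not those used in the study):

```python
import math

def lkb_ntcp(dvh, td50, m, n):
    """Lyman-Kutcher-Burman NTCP from a differential DVH given as
    (dose_Gy, fractional_volume) pairs.
        gEUD = (sum_i v_i * d_i**(1/n))**n
        t    = (gEUD - TD50) / (m * TD50)
        NTCP = Phi(t)  (standard normal CDF)
    TD50 is the uniform dose giving 50% complication probability, m sets
    the slope of the response, and n the volume effect."""
    geud = sum(v * d ** (1.0 / n) for d, v in dvh) ** n
    t = (geud - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))
```

This structure explains the reported sensitivity to posterior shifts: a shift that drags more lung volume into the high-dose region raises the gEUD, and the NTCP climbs along the normal-CDF response curve.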

Relevance: 100.00%

Abstract:

This paper describes the development of a model, based on Bayesian networks, to estimate the likelihood that sheep flocks are infested with lice at shearing and to assist farm managers or advisers to assess whether or not to apply a lousicide treatment. The risk of lice comes from three main sources: (i) lice may have been present at the previous shearing and not eradicated; (ii) lice may have been introduced with purchased sheep; and (iii) lice may have entered with strays. A Bayesian network is used to assess the probability of each of these events independently and combine them for an overall assessment. Rubbing is a common indicator of lice but there are other causes too. If rubbing has been observed, an additional Bayesian network is used to assess the probability that lice are the cause. The presence or absence of rubbing and its possible cause are combined with these networks to improve the overall risk assessment.
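The combination step described above can be sketched in closed form. Assuming the three risk sources are independent and rubbing evidence is folded in with Bayes' rule (a simplification - the paper uses full Bayesian networks, which also handle dependence and intermediate nodes):

```python
def overall_lice_risk(p_residual, p_purchased, p_strays):
    """Probability lice are present from at least one of the three sources:
    (i) not eradicated at the previous shearing, (ii) introduced with
    purchased sheep, (iii) brought in by strays. Assumes independence."""
    return 1.0 - (1 - p_residual) * (1 - p_purchased) * (1 - p_strays)

def posterior_given_rubbing(prior, p_rub_given_lice, p_rub_given_other):
    """Bayes update when rubbing is observed: rubbing is a common indicator
    of lice but has other causes, captured by p_rub_given_other."""
    num = p_rub_given_lice * prior
    return num / (num + p_rub_given_other * (1 - prior))
```

Observing rubbing raises the lice probability only to the extent that rubbing is more likely under lice than under its other causes, which is exactly the question the paper's second network is built to answer.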

Relevance: 100.00%

Abstract:

Anticipating the number and identity of bidders has significant influence on many theoretical results of the auction itself and on bidders' bidding behaviour. This is because when a bidder knows in advance which specific bidders are likely competitors, this knowledge gives the company a head start when setting the bid price. However, despite these competitive implications, most previous studies have focused almost entirely on forecasting the number of bidders, and only a few authors have dealt with the identity dimension, qualitatively. Using a case study with immediate real-life applications, this paper develops a method for estimating every potential bidder's probability of participating in a future auction as a function of the tender's economic size, removing the bias caused by the distribution of contract size opportunities. This way, a bidder or auctioneer will be able to estimate the likelihood of participation by a specific group of key, previously identified bidders in a future tender.
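One simple way to model a bidder's participation probability as a function of tender size is a logistic curve in log size, with a correction weight for how often tenders of that size appear. This is a hypothetical sketch - the functional form, parameters a and b, and the weighting are assumptions, not the paper's fitted model:

```python
import math

def participation_probability(tender_size, a, b, size_density_weight=1.0):
    """Hypothetical per-bidder participation model: a logistic function of
    log tender size, rescaled by `size_density_weight` to correct for the
    distribution of contract size opportunities (the bias the paper removes).
    All parameter choices here are illustrative assumptions."""
    raw = 1.0 / (1.0 + math.exp(-(a + b * math.log(tender_size))))
    return min(1.0, raw * size_density_weight)
```

Fitting a and b separately for each previously identified competitor would give the per-bidder participation estimates the paper is after; without the density weight, bidders would appear more likely to enter simply because mid-sized contracts are more common.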

Relevance: 100.00%

Abstract:

Accurate determination of same-sex twin zygosity is important for medical, scientific and personal reasons. Determination may be based upon questionnaire data, blood groups, enzyme isoforms and fetal membrane examination, but assignment of zygosity must ultimately be confirmed by genotypic data. Here, methods are reviewed for calculating the average probability of correctly concluding that a twin pair is monozygotic, given that they share the same genotypes across all loci, for commonly utilized multiplex short tandem repeat (STR) kits.
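The core calculation the review covers is a Bayes update: monozygotic twins always match, so the evidence lies in how improbable a full match is for dizygotic twins. A minimal sketch, assuming independent loci and given per-locus DZ sharing probabilities (which in practice depend on population allele frequencies and the specific STR kit):

```python
def p_monozygotic(prior_mz, p_share_dz_per_locus):
    """Posterior probability a same-sex twin pair is monozygotic (MZ) given
    identical genotypes at every typed STR locus. MZ twins match with
    probability 1; dizygotic (DZ) twins match locus i with probability
    p_share_dz_per_locus[i]. Loci are assumed independent, and the values
    used are illustrative rather than kit-specific."""
    p_share_dz = 1.0
    for p in p_share_dz_per_locus:
        p_share_dz *= p
    return prior_mz / (prior_mz + (1.0 - prior_mz) * p_share_dz)
```

Each additional matching locus multiplies down the DZ sharing probability, so even a modest multiplex kit drives the posterior probability of monozygosity very close to 1.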