901 results for extinction probability


Relevance: 20.00%

Abstract:

Potential conflicts exist between biodiversity conservation and climate-change mitigation as trade-offs in multiple-use land management. This study aims to evaluate public preferences for biodiversity conservation and climate-change mitigation policy while accounting for respondents’ uncertainty about their choices. We conducted a choice experiment using land-use scenarios in the rural Kushiro watershed in northern Japan. The results showed that the public strongly wishes to avoid the extinction of endangered species in preference to climate-change mitigation in the form of carbon sequestration by increasing the area of managed forest. Knowledge of the site and the respondents’ awareness of the personal benefits associated with supporting and regulating services had a positive effect on their preference for conservation plans. Thus, decision-makers should be careful about how they provide ecological information for informed choices concerning ecosystem-service trade-offs. Suggesting targets with explicit indicators will affect public preferences, as well as the public's willingness to pay for such measures. Furthermore, the elicited-choice-probabilities approach is useful for revealing the distribution of relative preferences for incomplete scenarios, thus verifying the effectiveness of the indicators introduced in the experiment.

Relevance: 20.00%

Abstract:

Students explored variation and expectation in a probability activity at the end of the first year of a 3-year longitudinal study across grades 4-6. The activity involved experiments in tossing coins both manually and with simulation using the graphing software TinkerPlots. Initial responses indicated that the students were aware of uncertainty, although their understanding of chance concepts appeared limited. Predicting the outcomes of 10 tosses reflected an intuitive notion of equiprobability, with little awareness of variation. Understanding of the relationship between experimental and theoretical probability did not emerge until multiple outcomes and representations were generated with the software.
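The contrast the students encountered can be reproduced in a few lines of simulation. This is a minimal sketch (not the TinkerPlots activity itself, and with arbitrary trial counts) showing why the observed proportion of heads varies far more over 10 tosses than over 1000, which is the gap between intuition and theoretical probability the study describes:

```python
import random

def experimental_proportions(n_tosses, trials=2000, seed=0):
    """Repeat a run of n_tosses fair-coin tosses many times and return
    the observed proportion of heads from each run."""
    rng = random.Random(seed)
    props = []
    for _ in range(trials):
        heads = sum(rng.random() < 0.5 for _ in range(n_tosses))
        props.append(heads / n_tosses)
    return props

short = experimental_proportions(10)     # 10 tosses per run
long_ = experimental_proportions(1000)   # 1000 tosses per run
spread_short = max(short) - min(short)
spread_long = max(long_) - min(long_)
# Variation around the theoretical 0.5 shrinks as the number of tosses grows.
print(spread_short > spread_long)  # True
```

With only 10 tosses a run can easily produce 2 or 8 heads, which is why predictions based on equiprobability alone look wrong to students until many outcomes are generated.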

Relevance: 20.00%

Abstract:

Introduced predators can have pronounced effects on naïve prey species; thus, predator control is often essential for conservation of threatened native species. Complete eradication of the predator, although desirable, may be elusive in budget-limited situations, whereas predator suppression is more feasible and may still achieve conservation goals. We used a stochastic predator-prey model based on a Lotka-Volterra system to investigate the cost-effectiveness of predator control to achieve prey conservation. We compared five control strategies: immediate eradication, removal of a constant number of predators (fixed-number control), removal of a constant proportion of predators (fixed-rate control), removal of predators that exceed a predetermined threshold (upper-trigger harvest), and removal of predators whenever their population falls below a lower predetermined threshold (lower-trigger harvest). We looked at the performance of these strategies when managers could always remove the full number of predators targeted by each strategy, subject to budget availability. Under this assumption immediate eradication reduced the threat to the prey population the most. We then examined the effect of reduced management success in meeting removal targets, assuming removal is more difficult at low predator densities. In this case there was a pronounced reduction in performance of the immediate eradication, fixed-number, and lower-trigger strategies. Although immediate eradication still yielded the highest expected minimum prey population size, upper-trigger harvest yielded the lowest probability of prey extinction and the greatest return on investment (as measured by improvement in expected minimum population size per amount spent). Upper-trigger harvest was relatively successful because it operated when predator density was highest, which is when predator removal targets can be more easily met and the effect of predators on the prey is most damaging. 
This suggests that controlling predators only when they are most abundant is the "best" strategy when financial resources are limited and eradication is unlikely. © 2008 Society for Conservation Biology.
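The comparison of control strategies can be illustrated with a toy version of such a model. This is a minimal sketch with invented parameter values (not the study's), contrasting an upper-trigger harvest against no control in a noisy discrete-time Lotka-Volterra system and reporting the minimum prey population, the performance measure used above:

```python
import random

def simulate(control=False, threshold=30.0, steps=200, seed=1):
    """Toy discrete-time stochastic predator-prey (Lotka-Volterra-style)
    model.  With control=True, an upper-trigger harvest culls predators
    back down to `threshold` whenever they exceed it.  All parameter
    values are invented for illustration."""
    rng = random.Random(seed)
    prey, pred = 500.0, 20.0
    min_prey = prey
    for _ in range(steps):
        growth = 0.5 * prey * (1.0 - prey / 1000.0)   # logistic prey growth
        predation = 0.01 * prey * pred                # losses to predators
        prey = max(0.0, prey + growth - predation + rng.gauss(0.0, 5.0))
        pred = max(0.0, pred + 0.002 * prey * pred - 0.3 * pred)
        if control and pred > threshold:
            pred = threshold                          # cull to the trigger level
        min_prey = min(min_prey, prey)
    return min_prey

# Culling only when predators are abundant keeps the minimum prey
# population far above the uncontrolled case.
print(simulate(control=True) > simulate(control=False))  # True
```

The trigger fires exactly when predator density, and hence the predation term, is highest, which is the mechanism the abstract identifies for the strategy's cost-effectiveness.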

Relevance: 20.00%

Abstract:

In parts of the Indo-Pacific, large-scale exploitation of the green turtle Chelonia mydas continues to pose a serious threat to the persistence of this species; yet very few studies have assessed the pattern and extent of the impact of such harvests. We used demographic and genetic data in an age-based model to investigate the viability of an exploited green turtle stock from Aru, south-east Indonesia. We found that populations are decreasing under current exploitation pressures. The effects of increasingly severe exploitation activities at foraging and nesting habitat varied depending on the migratory patterns of the stock. Our model predicted a rapid decline of the Aru stock in Indonesia under local exploitation pressure and a shift in the genetic composition of the stock. We used the model to investigate the influence of different types of conservation actions on the persistence of the Aru stock. The results show that local management actions such as nest protection and reducing harvests of adult nesting and foraging turtles can have considerable conservation outcomes and result in the long-term persistence of genetically distinct management units. © 2010 The Authors. Animal Conservation © 2010 The Zoological Society of London.
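The qualitative behaviour of an age-based model under adult harvest can be sketched very compactly. The stage structure, rates, and initial numbers below are invented for illustration and are not the Aru stock's estimates; the sketch only shows why a proportional harvest on adults depresses the projected stock:

```python
def project(years=50, fecundity=0.8, adult_harvest=0.0):
    """Project a three-stage (juvenile, subadult, adult) Leslie-type
    population with a proportional harvest on adults.  All rates and
    starting numbers are illustrative only."""
    j, s, a = 100.0, 50.0, 25.0
    for _ in range(years):
        j, s, a = (fecundity * a,                          # hatchlings from adults
                   0.5 * j,                                # juvenile survival
                   0.7 * s + 0.9 * a * (1.0 - adult_harvest))  # recruitment + harvested adults
    return j + s + a

no_harvest = project()
harvested = project(adult_harvest=0.3)
print(no_harvest > harvested)  # True: adult harvest depresses the stock
```

In the same spirit, raising `fecundity` in the harvested run mimics nest protection, the kind of local management action the model found could offset exploitation pressure.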

Relevance: 20.00%

Abstract:

Approximately 90% of the original woodlands of the Mount Lofty Ranges of South Australia have been cleared, modified or fragmented, most severely in the last 60 years, affecting the avifauna dependent on native vegetation. This study identifies which woodland-dependent species are still declining in two different habitats, Pink Gum-Blue Gum woodland and Stringybark woodland. We analyse the Mount Lofty Ranges Woodland Bird Long-Term Monitoring Dataset for 1999-2007 to look for changes in abundance of 59 species. We use logistic regression of prevalence on lists in a Bayesian framework, and List Length Analysis to control for variation in detectability. Compared with Reporting Rate Analysis, a more traditional approach, List Length Analysis provides tighter confidence intervals by accounting for changing detectability. Several common species were declining significantly. Increasers were generally large-bodied generalists. Many birds have already disappeared from this modified and naturally isolated woodland island, and our results suggest that more specialist insectivores are likely to follow. The Mount Lofty Ranges can be regarded as a 'canary landscape' for temperate woodlands elsewhere in Australia: without immediate action, their bird communities are likely to follow the trajectory of the Mount Lofty Ranges avifauna. Alternatively, with extensive habitat restoration and management, we could avoid paying the extinction debt. © Royal Australasian Ornithologists Union 2011.

Relevance: 20.00%

Abstract:

So far, low-probability differentials for the key schedule of block ciphers have been used as a straightforward proof of security against related-key differential analysis. To achieve resistance, it is believed that for a cipher with a k-bit key it suffices for the upper bound on the probability to be 2^-k. Surprisingly, we show that this reasonable assumption is incorrect, and that the probability should be (much) lower than 2^-k. Our counterexample is a related-key differential analysis of the well-established block cipher CLEFIA-128. We show that although the key schedule of CLEFIA-128 prevents differentials with a probability higher than 2^-128, the linear part of the key schedule that produces the round keys, and the Feistel structure of the cipher, allow the exploitation of particularly chosen differentials with a probability as low as 2^-128. CLEFIA-128 has 2^14 such differentials, which translate to 2^14 pairs of weak keys. The probability of each differential is too low, but the weak keys have a special structure which allows, with a divide-and-conquer approach, an advantage of 2^7 over generic analysis. We exploit the advantage to give a membership test for the weak-key class and provide analysis of the hashing modes. The proposed analysis has been tested with computer experiments on small-scale variants of CLEFIA-128. Our results do not threaten the practical use of CLEFIA.

Relevance: 20.00%

Abstract:

In this paper, we use an experimental design to compare the performance of elicitation rules for subjective beliefs. Contrary to previous works, in which elicited beliefs are compared to an objective benchmark, we consider a purely subjective belief framework (confidence in one's own performance in a cognitive task and a perceptual task). The performance of different elicitation rules is assessed according to the accuracy of stated beliefs in predicting success. We measure this accuracy using two main factors: calibration and discrimination. For each of them, we propose two statistical indexes and compare the rules' performances on each measurement. The matching probability method provides more accurate beliefs in terms of discrimination, the quadratic scoring rule reduces overconfidence, and the free rule, a simple rule with no incentives, also succeeds in eliciting accurate beliefs. Nevertheless, the matching probability appears to be the best mechanism for eliciting beliefs owing to its performance in terms of calibration and discrimination, its ability to elicit consistent beliefs across measures and across tasks, and its empirical and theoretical properties.
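The two accuracy factors, and the quadratic scoring rule itself, have simple textbook forms. This is a sketch using one crude index per factor (the paper proposes two statistical indexes for each, which are not reproduced here) and invented belief data for an overconfident respondent:

```python
def quadratic_score(belief, outcome):
    """Quadratic (Brier-type) scoring rule: expected payoff is maximised
    by reporting one's true probability of success (outcome is 0 or 1)."""
    return 1.0 - (belief - outcome) ** 2

def calibration_gap(beliefs, outcomes):
    """Gap between mean stated belief and observed success rate:
    a crude calibration index (0 = perfectly calibrated on average)."""
    return abs(sum(beliefs) / len(beliefs) - sum(outcomes) / len(outcomes))

def discrimination(beliefs, outcomes):
    """Mean belief on successes minus mean belief on failures: how well
    stated beliefs separate correct from incorrect answers."""
    succ = [b for b, o in zip(beliefs, outcomes) if o == 1]
    fail = [b for b, o in zip(beliefs, outcomes) if o == 0]
    return sum(succ) / len(succ) - sum(fail) / len(fail)

# Invented data: an overconfident respondent reports high confidence
# regardless of whether the answer turns out to be correct.
beliefs = [0.9, 0.8, 0.9, 0.7, 0.8]
outcomes = [1, 0, 1, 0, 1]
print(round(calibration_gap(beliefs, outcomes), 2))  # 0.22 (overconfidence)
print(discrimination(beliefs, outcomes) > 0)         # True, but weak
```

A truthful reporter facing the quadratic rule cannot gain by shading the report up or down, which is why the rule dampens overconfidence in incentivised elicitation.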

Relevance: 20.00%

Abstract:

Little is known about the neuronal changes that occur within the lateral amygdala (LA) following fear extinction. In fear extinction, the repeated presentation of a conditioned stimulus (CS), in the absence of a previously paired aversive unconditioned stimulus (US), reduces fear elicited by the CS. Fear extinction is an active learning process that leads to the formation of a consolidated extinction memory; however, this memory is fragile and prone to spontaneous recovery and renewal under environmental changes such as context. Understanding the neural mechanisms underlying fear extinction is of great clinical relevance, as psychological treatments of several anxiety disorders rely largely on extinction-based procedures and relapse is a major clinical problem. This study investigated plasticity in the LA following fear-memory reactivation in rats with and without extinction training. Phosphorylated MAPK (p44/42 ERK/MAPK), a protein kinase required in the amygdala for fear learning and its extinction, was used as a marker for neuronal plasticity. Rats (N = 11) underwent a Pavlovian auditory fear conditioning and extinction paradigm, and later received a single conditioned-stimulus presentation to reactivate the fear memory. Results showed more pMAPK-expressing neurons in the LA following extinction-reactivation compared to control rats, with the largest number of pMAPK+ neurons counted in the ventral LA, particularly the ventro-lateral subdivision (LAvl). These findings indicate that subdivision-specific plasticity to the conditioned fear memory occurs in the LAvl following extinction-reactivation. These findings provide important insight into the organisation of fear memories in the LA and pave the way for future research into the memory mechanisms of fear extinction and its pathophysiology.

Relevance: 20.00%

Abstract:

Quantitative estimates of the vertical structure and the spatial gradients of aerosol extinction coefficients have been made from airborne lidar measurements across the coastline into offshore oceanic regions along the east and west coasts of India. The vertical structure revealed the presence of strong, elevated aerosol layers in the altitude region of ~2-4 km, well above the atmospheric boundary layer (ABL). Horizontal gradients also showed a vertical structure, being sharp, with the e^-1 scaling distance (D_0H) as small as ~150 km, in the well-mixed regions mostly under the influence of local source effects. Above the ABL, where local effects are subdued, the gradients were much shallower (~600-800 km); nevertheless, they were steep compared with the value of ~1500-2500 km reported for columnar AOD during winter. The gradients of these elevated layers were steeper over the east coast of India than over the west coast. Near-simultaneous radiosonde (Vaisala, Inc., Finland) ascents made over the northern Bay of Bengal showed the presence of convectively unstable regions, the first from the surface to ~750-1000 m and the other extending from 1750 to 3000 m, separated by a stable region in between. These can act as a conduit for the advection of aerosols and favor the transport of continental aerosols at higher levels (> 2 km) into the oceans without entering the marine boundary layer below. The large spatial gradient in aerosol optical, and hence radiative, impacts between the coastal landmass and the adjacent oceans within a short distance of < 300 km (even at an altitude of 3 km) during summer and the premonsoon is of significance to the regional climate.

Relevance: 20.00%

Abstract:

An experiment is described that enables students to understand the properties of atmospheric extinction due to Rayleigh scattering. The experiment requires the use of red, green and blue lasers attached to a traveling microscope or similar device. The laser beams are passed through an artificial atmosphere, made from milky water, at varying depths, before impinging on either a light meter or a photodiode integral to a Picotech Dr. DAQ ADC. A plot of measured spectral intensity versus depth reveals the contribution of Rayleigh scattering to the extinction coefficient. For the experiment with the light meter, the extinction coefficients for red, green and blue light in the milky sample of water were 0.27, 0.36 and 0.47 cm^-1 respectively, and 0.032, 0.037 and 0.092 cm^-1 for the Picotech Dr. DAQ ADC.
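The analysis step of the experiment, extracting an extinction coefficient k from the intensity-versus-depth plot via the Beer-Lambert law I(d) = I0·exp(-k·d), reduces to a linear least-squares fit of ln I against depth. This sketch uses synthetic readings generated with the red-light coefficient reported for the light-meter setup (the depths and I0 are invented):

```python
import math

def fit_extinction(depths, intensities):
    """Least-squares fit of ln(I) = ln(I0) - k*d (Beer-Lambert law);
    returns the extinction coefficient k in units of 1/depth."""
    n = len(depths)
    ys = [math.log(i) for i in intensities]
    mean_d = sum(depths) / n
    mean_y = sum(ys) / n
    num = sum((d - mean_d) * (y - mean_y) for d, y in zip(depths, ys))
    den = sum((d - mean_d) ** 2 for d in depths)
    return -num / den   # k is minus the slope of ln(I) vs depth

# Synthetic "milky water" readings for red light, k = 0.27 cm^-1.
depths = [1.0, 2.0, 3.0, 4.0, 5.0]                        # cm
intensities = [100.0 * math.exp(-0.27 * d) for d in depths]
print(round(fit_extinction(depths, intensities), 2))      # 0.27
```

Repeating the fit for each laser colour reproduces the wavelength dependence (blue extinguished more strongly than red) that the experiment is designed to demonstrate.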

Relevance: 20.00%

Abstract:

An expression is derived for the probability that the determinant of an n x n matrix over a finite field vanishes; from this it is deduced that, for a fixed field of order q, this probability tends to the limit 1 - prod_{i>=1}(1 - q^-i) as n tends to infinity.
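As a concrete instance, the standard counting argument gives the probability that a uniformly random n x n matrix over GF(q) is nonsingular as prod_{i=1..n}(1 - q^-i), so the vanishing probability is its complement. This sketch evaluates that known identity directly:

```python
def prob_singular(n, q=2):
    """Probability that a uniformly random n x n matrix over GF(q) has
    zero determinant: 1 - prod_{i=1..n} (1 - q^-i)."""
    p_nonsingular = 1.0
    for i in range(1, n + 1):
        p_nonsingular *= 1.0 - q ** (-i)
    return 1.0 - p_nonsingular

# Over GF(2) the singularity probability rises with n and levels off:
for n in (1, 2, 5, 20):
    print(n, round(prob_singular(n), 4))
```

For q = 2 the values increase from 0.5 at n = 1 toward a limit of roughly 0.71, illustrating the fixed-field asymptotics the abstract refers to.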

Relevance: 20.00%

Abstract:

Speed is recognised as a key contributor to crash likelihood and severity, and to road safety performance in general. Its fundamental role has been recognised by making Safe Speeds one of the four pillars of the Safe System. In this context, impact speeds above which humans are likely to sustain fatal injuries have been accepted as a reference in many Safe System infrastructure policy and planning discussions. To date, there have been no proposed relationships for impact speeds above which humans are likely to sustain fatal or serious (severe) injury, a more relevant Safe System measure. A research project on Safe System intersection design required a critical review of published literature on the relationship between impact speed and probability of injury. This raised a number of questions about the origins, accuracy and appropriateness of the currently accepted impact speed-fatality probability relationships (Wramborg 2005) in many policy documents. The literature review identified alternative, more recent and more precise relationships derived from the US crash-reconstruction databases (NASS/CDS). The paper proposes for discussion a set of alternative relationships between vehicle impact speed and probability of MAIS3+ (fatal and serious) injury for selected common crash types. Critical Safe System impact speed values are also proposed for use in road infrastructure assessment. The paper presents the methodology and assumptions used in developing these relationships and identifies further research needed to confirm and refine them. Such relationships would form valuable inputs into future road safety policies in Australia and New Zealand.

Relevance: 20.00%

Abstract:

The statistical minimum-risk pattern recognition problem, when the classification costs are random variables of unknown statistics, is considered. Using medical diagnosis as a possible application, the problem of learning the optimal decision scheme is studied for a two-class, two-action case, as a first step. This reduces to the problem of learning the optimum threshold (for taking the appropriate action) on the a posteriori probability of one class. A recursive procedure for updating an estimate of the threshold is proposed. The estimation procedure does not require knowledge of the actual class labels of the sample patterns in the design set. The adaptive scheme of using the present threshold estimate to take action on the next sample is shown to converge, in probability, to the optimum. The results of a computer simulation study of three learning schemes demonstrate the theoretically predictable salient features of the adaptive scheme.
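The flavour of such a recursive procedure can be sketched with running-average updates. This is an illustrative rule, not the paper's exact algorithm: for a two-class, two-action problem with zero cost for correct actions, the minimum-risk threshold on the posterior is t = E[c_fp] / (E[c_fp] + E[c_fn]), and the unknown mean costs are estimated recursively from one observed cost pair at a time (the cost distributions below are invented):

```python
import random

def learn_threshold(cost_stream, steps=10_000):
    """Recursively update running means of the two misclassification
    costs and return the implied decision threshold on the posterior
    probability.  cost_stream() yields one (c_fp, c_fn) observation."""
    mean_fp = mean_fn = 0.0
    for k in range(1, steps + 1):
        c_fp, c_fn = cost_stream()
        mean_fp += (c_fp - mean_fp) / k   # running-average (recursive) updates
        mean_fn += (c_fn - mean_fn) / k
    return mean_fp / (mean_fp + mean_fn)

rng = random.Random(0)
# Noisy costs with true means E[c_fp] = 1 and E[c_fn] = 3, so the
# optimum threshold is 1 / (1 + 3) = 0.25.
stream = lambda: (rng.uniform(0.0, 2.0), rng.uniform(2.0, 4.0))
print(round(learn_threshold(stream), 2))  # ~0.25
```

As in the paper's scheme, the update never needs the true class label of any sample, only the costs actually incurred, and the estimate converges in probability to the optimum threshold.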

Relevance: 20.00%

Abstract:

Purpose: This study evaluated the impact of patient set-up errors on the probability of pulmonary and cardiac complications in the irradiation of left-sided breast cancer. Methods and Materials: Using the CMS XiO Version 4.6 (CMS Inc., St Louis, MO) radiotherapy planning system's NTCP algorithm and the Lyman-Kutcher-Burman (LKB) model, we calculated the DVH indices for the ipsilateral lung and heart and the resultant normal tissue complication probabilities (NTCP) for radiation-induced pneumonitis and excess cardiac mortality in 12 left-sided breast cancer patients. Results: Isocenter shifts in the posterior direction had the greatest effect on the lung V20, heart V25, and mean and maximum doses to the lung and the heart. Dose-volume histogram (DVH) results show that the ipsilateral lung V20 tolerance was exceeded in 58% of the patients after 1 cm posterior shifts. Similarly, the heart V25 tolerance was exceeded after 1 cm antero-posterior and left-right isocentric shifts in 70% of the patients. The baseline NTCPs for radiation-induced pneumonitis ranged from 0.73% to 3.4%, with a mean value of 1.7%. The maximum reported NTCP for radiation-induced pneumonitis was 5.8% (mean 2.6%) after a 1 cm posterior isocentric shift. The NTCP for excess cardiac mortality was 0% in all of the patients (n = 12) before and after set-up error simulations. Conclusions: Set-up errors in left-sided breast cancer patients have a statistically significant impact on the lung NTCPs and DVH indices. However, with a central lung distance of 3 cm or less (CLD < 3 cm) and a maximum heart distance of 1.5 cm or less (MHD < 1.5 cm), the treatment plans could tolerate set-up errors of up to 1 cm without any change in the NTCP to the heart.
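The LKB model used above has a compact closed form: the DVH is reduced to a generalised equivalent uniform dose, gEUD = (sum_i v_i * D_i^(1/n))^n, and NTCP = Phi((gEUD - TD50) / (m * TD50)), where Phi is the standard normal CDF. This sketch evaluates it on an invented two-bin DVH with illustrative (not clinical, and not the study's) parameter values:

```python
import math

def lkb_ntcp(doses, volumes, n, m, td50):
    """Lyman-Kutcher-Burman NTCP from a (dose, fractional-volume) DVH.
    n controls the volume effect, m the slope, TD50 the 50%-complication
    dose.  Parameter values used below are illustrative only."""
    geud = sum(v * d ** (1.0 / n) for d, v in zip(doses, volumes)) ** n
    t = (geud - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))  # standard normal CDF

# Invented whole-lung DVH: 30% of the volume at 20 Gy, 70% at 5 Gy.
# With n = 1 the gEUD reduces to the mean dose (here 9.5 Gy).
ntcp = lkb_ntcp([20.0, 5.0], [0.3, 0.7], n=1.0, m=0.45, td50=24.5)
print(round(100.0 * ntcp, 1), "%")
```

Shifting dose into the higher bin (as a posterior isocentre shift does for the ipsilateral lung) raises the gEUD and hence the NTCP, which is the mechanism behind the set-up-error sensitivity reported above.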