Abstract:
Students explored variation and expectation in a probability activity at the end of the first year of a 3-year longitudinal study across grades 4-6. The activity involved experiments in tossing coins both manually and with simulation using the graphing software, TinkerPlots. Initial responses indicated that the students were aware of uncertainty, although an understanding of chance concepts appeared limited. Predicting outcomes of 10 tosses reflected an intuitive notion of equiprobability, with little awareness of variation. Understanding the relationship between experimental and theoretical probability did not emerge until multiple outcomes and representations were generated with the software.
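The convergence of experimental toward theoretical probability that the students eventually observed can be reproduced with a few lines of code. The sketch below is a plain Python stand-in for the TinkerPlots simulations described above, not the classroom software itself: it tosses a fair coin and shows how the proportion of heads settles toward 0.5 as the number of tosses grows, while small samples of 10 tosses vary widely.

```python
import random

def toss_experiment(n_tosses, seed=None):
    """Simulate n_tosses fair-coin tosses and return the proportion of heads."""
    rng = random.Random(seed)
    heads = sum(rng.random() < 0.5 for _ in range(n_tosses))
    return heads / n_tosses

# Small samples vary widely around 0.5; large samples settle near it.
for n in (10, 100, 10000):
    print(n, toss_experiment(n, seed=1))
```

Running the loop for several seeds makes the contrast between variation (at n = 10) and expectation (at n = 10000) visible, which is the relationship the abstract says emerged only after multiple outcomes were generated.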
Abstract:
Introduced predators can have pronounced effects on naïve prey species; thus, predator control is often essential for conservation of threatened native species. Complete eradication of the predator, although desirable, may be elusive in budget-limited situations, whereas predator suppression is more feasible and may still achieve conservation goals. We used a stochastic predator-prey model based on a Lotka-Volterra system to investigate the cost-effectiveness of predator control to achieve prey conservation. We compared five control strategies: immediate eradication, removal of a constant number of predators (fixed-number control), removal of a constant proportion of predators (fixed-rate control), removal of predators that exceed a predetermined threshold (upper-trigger harvest), and removal of predators whenever their population falls below a lower predetermined threshold (lower-trigger harvest). We looked at the performance of these strategies when managers could always remove the full number of predators targeted by each strategy, subject to budget availability. Under this assumption immediate eradication reduced the threat to the prey population the most. We then examined the effect of reduced management success in meeting removal targets, assuming removal is more difficult at low predator densities. In this case there was a pronounced reduction in performance of the immediate eradication, fixed-number, and lower-trigger strategies. Although immediate eradication still yielded the highest expected minimum prey population size, upper-trigger harvest yielded the lowest probability of prey extinction and the greatest return on investment (as measured by improvement in expected minimum population size per amount spent). Upper-trigger harvest was relatively successful because it operated when predator density was highest, which is when predator removal targets can be more easily met and the effect of predators on the prey is most damaging. 
This suggests that controlling predators only when they are most abundant is the "best" strategy when financial resources are limited and eradication is unlikely. © 2008 Society for Conservation Biology.
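The comparison of control strategies can be illustrated with a toy stochastic Lotka-Volterra simulation. The sketch below is a minimal illustration under hypothetical parameter values, not the authors' model: `fixed_rate` and `upper_trigger` loosely correspond to the fixed-rate and upper-trigger strategies, and the minimum prey population over the run is the performance measure discussed above.

```python
import random

def simulate(control, years=50, seed=0):
    """Toy discrete-time stochastic predator-prey sketch (not the paper's exact model).
    control(pred) -> number of predators removed this year.
    Returns the minimum prey population reached over the simulation."""
    rng = random.Random(seed)
    prey, pred = 100.0, 20.0
    min_prey = prey
    for _ in range(years):
        # Lotka-Volterra-style updates; all parameter values are hypothetical.
        prey += 0.5 * prey * (1 - prey / 500) - 0.02 * prey * pred
        pred += 0.005 * prey * pred - 0.3 * pred
        pred = max(pred - control(pred), 0.0)
        # Multiplicative environmental noise.
        prey = max(prey, 0.0) * rng.lognormvariate(0, 0.1)
        pred = pred * rng.lognormvariate(0, 0.1)
        min_prey = min(min_prey, prey)
    return min_prey

fixed_rate = lambda p: 0.2 * p            # remove 20% of predators each year
upper_trigger = lambda p: max(p - 15, 0)  # cull only the excess above a threshold of 15
print(simulate(fixed_rate), simulate(upper_trigger))
```

Averaging `min_prey` over many seeds, and charging a cost per predator removed, would reproduce the expected-minimum-population and return-on-investment comparison described in the abstract.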
Abstract:
In parts of the Indo-Pacific, large-scale exploitation of the green turtle Chelonia mydas continues to pose a serious threat to the persistence of this species; yet very few studies have assessed the pattern and extent of the impact of such harvests. We used demographic and genetic data in an age-based model to investigate the viability of an exploited green turtle stock from Aru, south-east Indonesia. We found that populations are decreasing under current exploitation pressures. The effects of increasingly severe exploitation activities at foraging and nesting habitat varied depending on the migratory patterns of the stock. Our model predicted a rapid decline of the Aru stock in Indonesia under local exploitation pressure and a shift in the genetic composition of the stock. We used the model to investigate the influence of different types of conservation actions on the persistence of the Aru stock. The results show that local management actions such as nest protection and reducing harvests of adult nesting and foraging turtles can have considerable conservation outcomes and result in the long-term persistence of genetically distinct management units. © 2010 The Authors. Animal Conservation © 2010 The Zoological Society of London.
Abstract:
Approximately 90% of the original woodlands of the Mount Lofty Ranges of South Australia have been cleared, modified or fragmented, most severely in the last 60 years, affecting the avifauna dependent on native vegetation. This study identifies which woodland-dependent species are still declining in two different habitats, Pink Gum-Blue Gum woodland and Stringybark woodland. We analyse the Mount Lofty Ranges Woodland Bird Long-Term Monitoring Dataset for 1999-2007 to look for changes in abundance of 59 species. We use logistic regression of prevalence on lists in a Bayesian framework, and List Length Analysis to control for variation in detectability. Compared with Reporting Rate Analysis, a more traditional approach, List Length Analysis provides tighter confidence intervals by accounting for changing detectability. Several common species were declining significantly. Increasers were generally large-bodied generalists. Many birds have already disappeared from this modified and naturally isolated woodland island, and our results suggest that more specialist insectivores are likely to follow. The Mount Lofty Ranges can be regarded as a 'canary landscape' for temperate woodlands elsewhere in Australia: without immediate action, their bird communities are likely to follow the trajectory of the Mount Lofty Ranges avifauna. Alternatively, with extensive habitat restoration and management, we could avoid paying the extinction debt. © Royal Australasian Ornithologists Union 2011.
Abstract:
So far, low-probability differentials for the key schedule of block ciphers have been used as a straightforward proof of security against related-key differential analysis. To achieve resistance, it is believed that for a cipher with a k-bit key it suffices for the upper bound on the probability to be 2^-k. Surprisingly, we show that this reasonable assumption is incorrect, and the probability should be (much) lower than 2^-k. Our counterexample is a related-key differential analysis of the well-established block cipher CLEFIA-128. We show that although the key schedule of CLEFIA-128 prevents differentials with a probability higher than 2^-128, the linear part of the key schedule that produces the round keys, and the Feistel structure of the cipher, make it possible to exploit particularly chosen differentials with a probability as low as 2^-128. CLEFIA-128 has 2^14 such differentials, which translate into 2^14 pairs of weak keys. The probability of each differential is too low to be exploited directly, but the weak keys have a special structure that, with a divide-and-conquer approach, yields an advantage of 2^7 over generic analysis. We exploit this advantage to give a membership test for the weak-key class and provide analysis of the hashing modes. The proposed analysis has been tested with computer experiments on small-scale variants of CLEFIA-128. Our results do not threaten the practical use of CLEFIA.
Abstract:
In this paper, we use an experimental design to compare the performance of elicitation rules for subjective beliefs. Contrary to previous works in which elicited beliefs are compared to an objective benchmark, we consider a purely subjective belief framework (confidence in one’s own performance in a cognitive task and a perceptual task). The performance of different elicitation rules is assessed according to the accuracy of stated beliefs in predicting success. We measure this accuracy using two main factors: calibration and discrimination. For each of them, we propose two statistical indexes and we compare the rules’ performances for each measurement. The matching probability method provides more accurate beliefs in terms of discrimination, while the quadratic scoring rule reduces overconfidence; the free rule, a simple rule with no incentives, also succeeds in eliciting accurate beliefs. Nevertheless, the matching probability method appears to be the best mechanism for eliciting beliefs, owing to its performance in terms of calibration and discrimination, its ability to elicit consistent beliefs across measures and across tasks, and its empirical and theoretical properties.
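The two accuracy factors can be illustrated with simple numerical indexes. The sketch below uses generic textbook measures (the Brier score for calibration-style accuracy, and the gap in mean stated belief between successes and failures for discrimination); these are illustrative stand-ins, not necessarily the exact statistical indexes the paper proposes.

```python
def brier_score(beliefs, outcomes):
    """Mean squared gap between stated probability of success and the realized
    outcome (0/1). Lower is better; overconfidence inflates this score."""
    return sum((b - o) ** 2 for b, o in zip(beliefs, outcomes)) / len(beliefs)

def discrimination(beliefs, outcomes):
    """Difference in mean stated belief between successes and failures.
    A rule that discriminates well assigns higher beliefs to actual successes."""
    succ = [b for b, o in zip(beliefs, outcomes) if o == 1]
    fail = [b for b, o in zip(beliefs, outcomes) if o == 0]
    return sum(succ) / len(succ) - sum(fail) / len(fail)

beliefs  = [0.9, 0.8, 0.6, 0.4, 0.3]  # hypothetical stated confidence levels
outcomes = [1,   1,   0,   0,   0]    # hypothetical task successes/failures
print(brier_score(beliefs, outcomes), discrimination(beliefs, outcomes))
```

Comparing elicitation rules then amounts to computing such indexes on the beliefs each rule produces for the same set of task outcomes.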
Abstract:
Little is known about the neuronal changes that occur within the lateral amygdala (LA) following fear extinction. In fear extinction, the repeated presentation of a conditioned stimulus (CS), in the absence of a previously paired aversive unconditioned stimulus (US), reduces fear elicited by the CS. Fear extinction is an active learning process that leads to the formation of a consolidated extinction memory; however, this memory is fragile and prone to spontaneous recovery and renewal under environmental changes such as context. Understanding the neural mechanisms underlying fear extinction is of great clinical relevance, as psychological treatments of several anxiety disorders rely largely on extinction-based procedures and relapse is a major clinical problem. This study investigated plasticity in the LA following fear memory reactivation in rats with and without extinction training. Phosphorylated MAPK (p44/42 ERK/MAPK), a protein kinase required in the amygdala for fear learning and its extinction, was used as a marker for neuronal plasticity. Rats (N = 11) underwent a Pavlovian auditory fear conditioning and extinction paradigm, and later received a single conditioned stimulus presentation to reactivate the fear memory. Results showed more pMAPK-expressing neurons in the LA following extinction-reactivation compared to control rats, with the largest number of pMAPK+ neurons counted in the ventral LA, particularly the ventrolateral subdivision (LAvl). These findings indicate that subdivision-specific plasticity to the conditioned fear memory occurs in the LAvl following extinction-reactivation. These findings provide important insight into the organisation of fear memories in the LA, and pave the way for future research into the memory mechanisms of fear extinction and its pathophysiology.
Abstract:
An experiment is described that enables students to understand the properties of atmospheric extinction due to Rayleigh scattering. The experiment requires the use of red, green and blue lasers attached to a traveling microscope or similar device. The laser beams are passed through an artificial atmosphere, made from milky water, at varying depths, before impinging on either a light meter or a photodiode integral to a Picotech Dr. DAQ ADC. A plot of measured spectral intensity versus depth reveals the contribution Rayleigh scattering makes to the extinction coefficient. For the experiment with the light meter, the extinction coefficients for red, green and blue light in the milky sample of water were 0.27, 0.36 and 0.47 cm⁻¹ respectively, and 0.032, 0.037 and 0.092 cm⁻¹ for the Picotech Dr. DAQ ADC.
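The extinction coefficient in such an experiment follows from the Beer-Lambert law, I(z) = I0·exp(-k·z): plotting ln(I) against depth gives a straight line of slope -k. The sketch below fits that line by least squares; the synthetic readings are illustrative data generated from the red-laser value reported above, not actual measurements.

```python
import math

def extinction_coefficient(depths_cm, intensities):
    """Least-squares slope of ln(I) versus depth; under the Beer-Lambert law
    I(z) = I0*exp(-k*z), the fitted slope equals -k."""
    xs, ys = depths_cm, [math.log(i) for i in intensities]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return -slope  # k, in cm^-1 when depths are in cm

# Synthetic red-laser readings generated with k = 0.27 cm^-1 (the value reported above).
depths = [0, 2, 4, 6, 8]
readings = [100 * math.exp(-0.27 * z) for z in depths]
print(extinction_coefficient(depths, readings))  # ≈ 0.27
```

With real light-meter or ADC readings substituted for the synthetic ones, the same fit yields the per-colour coefficients quoted in the abstract.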
Abstract:
Speed is recognised as a key contributor to crash likelihood and severity, and to road safety performance in general. Its fundamental role has been recognised by making Safe Speeds one of the four pillars of the Safe System. In this context, impact speeds above which humans are likely to sustain fatal injuries have been accepted as a reference in many Safe System infrastructure policy and planning discussions. To date, there have been no proposed relationships for impact speeds above which humans are likely to sustain fatal or serious (severe) injury, a more relevant Safe System measure. A research project on Safe System intersection design required a critical review of published literature on the relationship between impact speed and probability of injury. This has led to a number of questions being raised about the origins, accuracy and appropriateness of the currently accepted impact speed–fatality probability relationships (Wramborg 2005) in many policy documents. The literature review identified alternative, more recent and more precise relationships derived from the US crash reconstruction databases (NASS/CDS). The paper proposes for discussion a set of alternative relationships between vehicle impact speed and probability of MAIS3+ (fatal and serious) injury for selected common crash types. Safe System critical impact speed values are also proposed for use in road infrastructure assessment. The paper presents the methodology and assumptions used in developing these relationships, and identifies further research needed to confirm and refine them. Such relationships would form valuable inputs into future road safety policies in Australia and New Zealand.
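Impact speed–injury probability relationships of this kind are typically expressed as logistic curves. The sketch below shows the general functional form only; the coefficients `a` and `b` are hypothetical placeholders for illustration, not values from the paper or from the NASS/CDS-derived relationships it discusses.

```python
import math

def p_mais3(speed_kmh, a=-6.0, b=0.08):
    """Illustrative logistic risk curve: probability of MAIS3+ (fatal or serious)
    injury as a function of impact speed. Coefficients a and b are hypothetical
    placeholders, not fitted values from any study."""
    return 1.0 / (1.0 + math.exp(-(a + b * speed_kmh)))

for v in (30, 50, 70, 90):
    print(v, round(p_mais3(v), 3))
```

A Safe System critical impact speed would then be read off such a fitted curve as the speed at which the injury probability crosses an agreed threshold.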
Abstract:
Purpose: This study evaluated the impact of patient set-up errors on the probability of pulmonary and cardiac complications in the irradiation of left-sided breast cancer. Methods and Materials: Using the CMS XiO Version 4.6 (CMS Inc., St Louis, MO) radiotherapy planning system's NTCP algorithm and the Lyman-Kutcher-Burman (LKB) model, we calculated the DVH indices for the ipsilateral lung and heart and the resultant normal tissue complication probabilities (NTCP) for radiation-induced pneumonitis and excess cardiac mortality in 12 left-sided breast cancer patients. Results: Isocenter shifts in the posterior direction had the greatest effect on the lung V20, heart V25, and mean and maximum doses to the lung and the heart. Dose-volume histogram (DVH) results show that the ipsilateral lung V20 tolerance was exceeded in 58% of the patients after 1 cm posterior shifts. Similarly, the heart V25 tolerance was exceeded after 1 cm antero-posterior and left-right isocentric shifts in 70% of the patients. The baseline NTCPs for radiation-induced pneumonitis ranged from 0.73% to 3.4%, with a mean value of 1.7%. The maximum reported NTCP for radiation-induced pneumonitis was 5.8% (mean 2.6%) after a 1 cm posterior isocentric shift. The NTCP for excess cardiac mortality was 0% in all 12 patients before and after set-up error simulations. Conclusions: Set-up errors in left-sided breast cancer patients have a statistically significant impact on the lung NTCPs and DVH indices. However, with a central lung distance of 3 cm or less (CLD < 3 cm) and a maximum heart distance of 1.5 cm or less (MHD < 1.5 cm), the treatment plans could tolerate set-up errors of up to 1 cm without any change in the NTCP to the heart.
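The LKB model used above computes NTCP by reducing the dose-volume histogram to a generalized equivalent uniform dose (gEUD) and passing it through a normal cumulative distribution: NTCP = Φ((gEUD − TD50)/(m·TD50)). The sketch below implements that standard formula; the example DVH is hypothetical, and the pneumonitis parameter values shown are commonly quoted literature values, not parameters taken from this study.

```python
import math

def lkb_ntcp(dvh, n, m, td50):
    """Lyman-Kutcher-Burman NTCP from a differential DVH.
    dvh: list of (dose_Gy, fractional_volume) bins summing to total organ volume 1.
    n, m, td50: organ/endpoint-specific LKB parameters."""
    geud = sum(v * d ** (1.0 / n) for d, v in dvh) ** n       # generalized EUD
    t = (geud - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))         # standard normal CDF, Phi(t)

# Hypothetical lung DVH; pneumonitis parameters (n=0.87, m=0.18, TD50=24.5 Gy) are
# commonly quoted literature values, used here only for illustration.
dvh = [(5.0, 0.4), (15.0, 0.3), (25.0, 0.2), (40.0, 0.1)]
print(lkb_ntcp(dvh, n=0.87, m=0.18, td50=24.5))
```

Recomputing `lkb_ntcp` on DVHs extracted before and after a simulated isocenter shift is, in essence, how the set-up error impact quoted above is quantified.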
Abstract:
Unlike standard applications of transport theory, the transport of molecules and cells during embryonic development often takes place within growing multidimensional tissues. In this work, we consider a model of diffusion on uniformly growing lines, disks, and spheres. An exact solution of the partial differential equation governing the diffusion of a population of individuals on the growing domain is derived. Using this solution, we study the survival probability, S(t). For the standard nongrowing case with an absorbing boundary, we observe that S(t) decays to zero in the long time limit. In contrast, when the domain grows linearly or exponentially with time, we show that S(t) decays to a constant, positive value, indicating that a proportion of the diffusing substance remains on the growing domain indefinitely. Comparing S(t) for diffusion on lines, disks, and spheres indicates that there are minimal differences in S(t) in the limit of zero growth and minimal differences in S(t) in the limit of fast growth. In contrast, for intermediate growth rates, we observe modest differences in S(t) between different geometries. These differences can be quantified by evaluating the exact expressions derived and presented here.
Abstract:
We consider the motion of a diffusive population on a growing domain, 0 < x < L(t), which is motivated by various applications in developmental biology. Individuals in the diffusing population, which could represent molecules or cells in a developmental scenario, undergo two different kinds of motion: (i) undirected movement, characterized by a diffusion coefficient, D, and (ii) directed movement, associated with the underlying domain growth. For a general class of problems with a reflecting boundary at x = 0 and an absorbing boundary at x = L(t), we provide an exact solution to the partial differential equation describing the evolution of the population density function, C(x,t). Using this solution, we derive an exact expression for the survival probability, S(t), and an accurate approximation for the long-time limit, S∞ = lim_{t→∞} S(t). Unlike traditional analyses on a nongrowing domain, where S∞ ≡ 0, we show that domain growth leads to a very different situation where S∞ can be positive. The theoretical tools developed and validated in this study allow us to distinguish situations where the diffusive population reaches the moving boundary at x = L(t) from situations where it never does. Making this distinction is relevant to certain applications in developmental biology, such as the development of the enteric nervous system (ENS). All theoretical predictions are verified by implementing a discrete stochastic model.
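The discrete stochastic verification mentioned in these growing-domain studies can be sketched as a random walk with an advection step that carries walkers along with the uniform growth. The code below is a minimal Monte Carlo illustration under assumed parameter values (linear growth L(t) = L0(1 + αt), reflecting boundary at x = 0, absorbing boundary at x = L(t)), not the authors' exact discrete model.

```python
import random

def survival_probability(growth_rate, n_walkers=500, dt=0.01, t_max=10.0,
                         L0=1.0, D=1.0, seed=0):
    """Monte Carlo estimate of S(t_max) for diffusion on a growing domain
    0 < x < L(t) with L(t) = L0*(1 + growth_rate*t), a reflecting boundary
    at x = 0 and an absorbing boundary at x = L(t)."""
    rng = random.Random(seed)
    survived = 0
    steps = int(t_max / dt)
    for _ in range(n_walkers):
        x, L = L0 / 2.0, L0
        alive = True
        for _ in range(steps):
            Lnew = L + growth_rate * L0 * dt
            x *= Lnew / L                             # advection with uniform growth
            L = Lnew
            x += rng.gauss(0.0, (2 * D * dt) ** 0.5)  # diffusive step
            if x < 0:
                x = -x                                # reflecting boundary at x = 0
            if x >= L:
                alive = False                         # absorbed at x = L(t)
                break
        survived += alive
    return survived / n_walkers

# With no growth, essentially every walker is eventually absorbed (S -> 0);
# with growth, a positive fraction survives, consistent with S_inf > 0.
print(survival_probability(0.0), survival_probability(1.0))
```

Increasing `t_max` shows the estimate plateauing at a positive value for growing domains, which is the S∞ > 0 behaviour derived exactly in the abstracts above.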
Abstract:
Anticipating the number and identity of bidders has a significant influence on many theoretical results of the auction itself and on bidders’ bidding behaviour, because a bidder who knows in advance which specific bidders are likely competitors has a head start when setting the bid price. However, despite these competitive implications, most previous studies have focused almost entirely on forecasting the number of bidders, and only a few authors have dealt with the identity dimension, qualitatively. Using a case study with immediate real-life applications, this paper develops a method for estimating every potential bidder’s probability of participating in a future auction as a function of the tender’s economic size, removing the bias caused by the distribution of contract size opportunities. This way, a bidder or auctioneer will be able to estimate the likelihood of participation of a specific group of key, previously identified bidders in a future tender.
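The core of such an estimate is the empirical participation frequency of a named bidder conditional on tender size. The sketch below is an illustrative binned estimator under hypothetical data, not the paper's (bias-corrected) method: it counts how often a given bidder appears among the entrants in each tender-size bin.

```python
from collections import defaultdict

def participation_probability(tenders, bidder, size_edges):
    """Empirical probability that `bidder` enters an auction, per tender-size bin.
    tenders: list of (size, set_of_bidders); size_edges: ascending bin boundaries.
    Illustrative estimator on hypothetical data, not the paper's bias-corrected method."""
    seen = defaultdict(int)
    entered = defaultdict(int)
    for size, bidders in tenders:
        b = sum(size >= e for e in size_edges)  # index of the size bin
        seen[b] += 1
        entered[b] += bidder in bidders
    return {b: entered[b] / seen[b] for b in seen}

# Hypothetical tender history: (economic size, set of participating bidders).
tenders = [(0.5, {"A", "B"}), (1.2, {"A"}), (2.5, {"B", "C"}),
           (3.0, {"A", "C"}), (0.8, {"B"})]
print(participation_probability(tenders, "A", size_edges=[1.0, 2.0]))
```

Multiplying such per-bidder probabilities (under an independence assumption) would give the likelihood that a specific group of key bidders all appear in a future tender of a given size.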
Abstract:
Accurate determination of same-sex twin zygosity is important for medical, scientific and personal reasons. Determination may be based upon questionnaire data, blood group, enzyme isoforms and fetal membrane examination, but assignment of zygosity must ultimately be confirmed by genotypic data. Here methods are reviewed for calculating average probabilities of correctly concluding a twin pair is monozygotic, given they share the same genotypes across all loci for commonly utilized multiplex short tandem repeat (STR) kits.
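The probability calculation reviewed above is a direct application of Bayes' theorem: MZ twins match with probability 1 at every locus, while DZ twins match at each locus with some probability less than 1. The sketch below shows the generic posterior; the per-locus DZ sharing probabilities and the 0.5 prior are hypothetical illustration values, not figures from the review.

```python
def prob_monozygotic(match_probs_dz, prior_mz=0.5):
    """Posterior probability that a same-sex twin pair is MZ, given identical
    genotypes at all tested STR loci.
    match_probs_dz: per-locus probability that DZ twins share the same genotype.
    prior_mz: prior probability of monozygosity (0.5 here is illustrative)."""
    p_match_dz = 1.0
    for p in match_probs_dz:
        p_match_dz *= p  # loci assumed independent (unlinked)
    # Bayes: MZ twins match with probability 1 at every locus.
    return prior_mz / (prior_mz + (1.0 - prior_mz) * p_match_dz)

# Hypothetical per-locus DZ sharing probabilities for a 10-locus STR multiplex.
print(prob_monozygotic([0.4] * 10))
```

With ten or more loci the DZ match probability becomes vanishingly small, so the posterior approaches 1, which is why multiplex STR kits can confirm zygosity with near certainty.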