967 results for Tail Probability
Abstract:
We investigated the burst swimming performance of five species of Antarctic fish at -1.0°C. The species studied belonged to the suborder Notothenioidei, from the families Nototheniidae and Bathydraconidae. Swimming performance was assessed over the initial 300 ms of a startle response using surgically attached miniature accelerometers. Escape responses in all fish consisted of a C-type fast start: an initial pronounced bending of the body into a C-shape, followed by one or more complete tail-beats and an unpowered glide. We found significant differences in the swimming performance of the five species examined, with average maximum swimming velocities (U_max) ranging from 0.91 to 1.39 m s⁻¹ and maximum accelerations (A_max) ranging from 10.6 to 15.6 m s⁻². The cryopelagic species Pagothenia borchgrevinki produced the fastest escape response, reaching a U_max of 1.39 m s⁻¹ and an A_max of 15.6 m s⁻². We also compared the body shape of each fish species with its measures of maximum burst performance. The dragonfish, Gymnodraco acuticeps, from the family Bathydraconidae, did not conform to the pattern observed for the other four species, which belong to the family Nototheniidae. However, we found a negative relationship between the buoyancy of the fish species and burst swimming performance. © 2002 Elsevier Science Ltd. All rights reserved.
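A minimal sketch of how U_max and A_max could be extracted from such a recording, assuming a gravity-compensated acceleration trace already aligned with the fish's direction of travel; the function and variable names here are hypothetical, not the authors' processing code:

```python
import numpy as np

def burst_metrics(accel, fs, window_s=0.3):
    """Estimate A_max and U_max from a forward-axis acceleration trace.

    accel : acceleration samples (m s^-2), assumed gravity-compensated
            and aligned with the direction of travel
    fs    : sampling rate (Hz)
    """
    n = int(window_s * fs)              # samples in the first 300 ms
    a = np.asarray(accel[:n], dtype=float)
    a_max = a.max()
    # trapezoidal integration from rest gives the velocity trace
    u = np.concatenate(([0.0], np.cumsum((a[1:] + a[:-1]) / 2) / fs))
    u_max = u.max()
    return u_max, a_max
```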
Abstract:
Semi-aquatic animals represent a transitional locomotor condition, characterised by morphological features that allow locomotion both in water and on land. Most ecologically important behaviours of crocodilians occur in the water, raising the question of whether their 'terrestrial construction' constrains aquatic locomotion. Moreover, the demands for aquatic locomotion change with life-history stage. The aim of this research was to determine the kinematic characteristics and efficiency of aquatic locomotion in different-sized crocodiles (Crocodylus porosus). Aquatic propulsion was achieved primarily by tail undulations; the limbs were used during swimming only in very small animals or at low swimming velocities in larger animals. Over the range of swimming speeds we examined, tail-beat amplitude did not change with increasing velocity, but amplitude increased significantly with body length. However, amplitude expressed relative to body length decreased with increasing body length. Tail-beat frequency increased with swimming velocity, but there were no differences in frequency between different-sized animals. Mechanical power generated during swimming and thrust increased non-linearly with swimming velocity, but disproportionately, so that kinematic efficiency decreased with increasing swimming velocity. The importance of unsteady forces, expressed as the reduced frequency, increased with increasing swimming velocity. Amplitude is the main determinant of body-size-related increases in swimming velocity but, compared with aquatic mammals and fish, crocodiles are slow swimmers, probably because of constraints imposed by muscle performance and unsteady forces opposing forward movement. Nonetheless, the kinematic efficiency of aquatic locomotion in crocodiles is comparable to that of fully aquatic mammals, and it is considerably greater than that of semi-aquatic mammals.
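As a worked illustration of the reduced-frequency measure, under one common convention (sigma = 2*pi*f*L/U; conventions vary across the literature) and with invented numbers rather than the paper's data:

```python
import math

def reduced_frequency(f_hz, body_length_m, velocity_ms):
    """Reduced frequency under one common convention: sigma = 2*pi*f*L/U.

    Larger values indicate that unsteady (acceleration-reaction) forces
    dominate over quasi-steady lift and drag forces.
    """
    return 2 * math.pi * f_hz * body_length_m / velocity_ms

# Illustrative (made-up) numbers: a 1.5 m crocodile beating its tail
# at 1.2 Hz while swimming at 0.6 m/s
print(reduced_frequency(1.2, 1.5, 0.6))  # ~18.8: strongly unsteady regime
```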
Abstract:
A decision theory framework can be a powerful technique for deriving optimal management decisions for endangered species. We built a spatially realistic stochastic metapopulation model for the Mount Lofty Ranges Southern Emu-wren (Stipiturus malachurus intermedius), a critically endangered Australian bird. Using discrete-time Markov chains to describe the dynamics of a metapopulation and stochastic dynamic programming (SDP) to find optimal solutions, we evaluated the following management decisions: enlarging existing patches, linking patches via corridors, and creating a new patch. This is the first application of SDP to optimal landscape reconstruction and one of the few times that landscape reconstruction dynamics have been integrated with population dynamics. SDP is a powerful tool that has advantages over standard Monte Carlo simulation methods because it can give the exact optimal strategy for every landscape configuration (combination of patch areas and presence of corridors) and pattern of metapopulation occupancy, as well as a trajectory of strategies. It is useful when a sequence of management actions can be performed over a given time horizon, as is the case for many endangered species recovery programs, where only fixed amounts of resources are available in each time step. However, it is generally limited by computational constraints to rather small networks of patches. The model shows that optimal metapopulation management decisions depend greatly on the current state of the metapopulation, and there is no strategy that is universally the best. The extinction probability over 30 yr for the optimal state-dependent management actions is 50-80% better than no management, whereas the best fixed state-independent sets of strategies are only 30% better than no management. This highlights the advantages of using a decision theory tool to investigate conservation strategies for metapopulations. It is clear from these results that the sequence of management actions is critical, and this can only be effectively derived from stochastic dynamic programming. The model illustrates the underlying difficulty in determining simple rules of thumb for the sequence of management actions for a metapopulation. This use of a decision theory framework extends the capacity of population viability analysis (PVA) to manage threatened species.
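A toy backward-induction sketch of the SDP machinery described above; the states, actions and transition probabilities below are invented and far smaller than the Emu-wren model:

```python
import numpy as np

# Hypothetical toy problem: 3 metapopulation states (0 = extinct,
# 1 = one patch occupied, 2 = both patches occupied) and 2 actions
# (0 = enlarge a patch, 1 = build a corridor). P[a][s, s'] is the
# one-step transition probability under action a.
P = [
    np.array([[1.0, 0.0, 0.0],
              [0.3, 0.5, 0.2],
              [0.1, 0.3, 0.6]]),
    np.array([[1.0, 0.0, 0.0],
              [0.4, 0.4, 0.2],
              [0.05, 0.25, 0.7]]),
]

T = 30                            # planning horizon (years)
V = np.array([0.0, 1.0, 1.0])     # terminal reward: 1 if extant
policy = np.zeros((T, 3), dtype=int)

for t in reversed(range(T)):      # backward induction
    Q = np.stack([P[a] @ V for a in range(len(P))])  # actions x states
    policy[t] = Q.argmax(axis=0)  # best action in each state at time t
    V = Q.max(axis=0)

print("P(persist 30 yr | both patches occupied):", V[2])
print("optimal first action by state:", policy[0])
```

The state-dependence the abstract emphasises shows up directly: `policy` is a full table of the best action for every state at every time step, something a fixed strategy cannot express.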
Abstract:
Phyllurus gulbaru, sp. nov., is a highly distinct species of leaf-tailed gecko restricted to rocky rainforest of Pattersons Gorge, north-west of Townsville. The possession of a cylindrical, non-depressed, tapering original and regenerated tail separates P. gulbaru from all congeners except P. caudiannulatus. From this species P. gulbaru is separated by having a partially divided, as opposed to fully divided, rostral scale. Furthermore, the very small spinose body tubercles of P. gulbaru are in marked contrast to the large spinose body scales of P. caudiannulatus. An analysis of 729 bp of mitochondrial 12S rRNA and cytochrome b genes reveals P. gulbaru to be a deeply divergent lineage with closer affinities to mid-east Queensland congeners than the geographically neighbouring P. amnicola on Mt Elliot. In conservation terms, P. gulbaru is clearly at risk. Field surveys of Pattersons Gorge and the adjacent ranges indicate that this species is restricted to a very small area of highly fragmented habitat, of which only a small proportion receives a degree of protection in State forest. Further, there is ongoing, unchecked destruction of dry rainforest habitat by fire. Under current IUCN criteria, P. gulbaru warrants an Endangered (B1, 2) listing.
Abstract:
This paper proposes a template for modelling complex datasets that integrates traditional statistical modelling approaches with more recent advances in statistics and modelling through an exploratory framework. Our approach builds on the well-known and long-standing traditional idea of 'good practice in statistics' by establishing a comprehensive framework for modelling that focuses on exploration, prediction, interpretation and reliability assessment, a relatively new idea that allows individual assessment of predictions. The integrated framework we present comprises two stages. The first involves the use of exploratory methods to help visually understand the data and identify a parsimonious set of explanatory variables. The second encompasses a two-step modelling process, where the use of non-parametric methods such as decision trees and generalized additive models is promoted to identify important variables and their modelling relationship with the response before a final predictive model is considered. We focus on fitting the predictive model using parametric, non-parametric and Bayesian approaches. This paper is motivated by a medical problem where interest focuses on developing a risk stratification system for morbidity of 1,710 cardiac patients given a suite of demographic, clinical and preoperative variables. Although the methods we use are applied specifically to this case study, they can be applied across any field, irrespective of the type of response.
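A hedged sketch of the two-step modelling idea, with synthetic data standing in for the cardiac cohort and a shallow decision tree standing in for the non-parametric screening step (the paper also considers generalized additive models and Bayesian fits):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the demographic/clinical/preoperative predictors
X, y = make_classification(n_samples=1710, n_features=20,
                           n_informative=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Step 1: a shallow tree as a non-parametric screen for important variables
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
keep = np.argsort(tree.feature_importances_)[::-1][:5]

# Step 2: a parsimonious parametric risk model on the screened variables
risk_model = LogisticRegression(max_iter=1000).fit(X_tr[:, keep], y_tr)
print("held-out accuracy:", risk_model.score(X_te[:, keep], y_te))
```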
Abstract:
The extent to which density-dependent processes regulate natural populations is the subject of an ongoing debate. We contribute evidence to this debate showing that density-dependent processes influence the population dynamics of the ectoparasite Aponomma hydrosauri (Acari: Ixodidae), a tick species that infests reptiles in Australia. The first piece of evidence comes from an unusually long-term dataset on the distribution of ticks among individual hosts. If density-dependent processes are influencing either host mortality or vital rates of the parasite population, and the distributions of parasites among hosts can be approximated with negative binomial distributions, then general host-parasite models predict that the aggregation coefficient of the parasite distribution will increase with the average intensity of infections. We fit negative binomial distributions to the frequency distributions of ticks on hosts, and find that the estimated aggregation coefficient k increases with increasing average tick density. This pattern indirectly implies that one or more vital rates of the tick population must be changing with increasing tick density, because mortality rates of the tick's main host, the sleepy lizard Tiliqua rugosa, are unaffected by changes in tick burdens. Our second piece of evidence is a re-analysis of experimental data on the attachment success of individual ticks on lizard hosts using generalized linear modelling. The probability of successful engorgement decreases with increasing numbers of ticks attached to a host. This is direct evidence of a density-dependent process that could lead to the increase in the aggregation coefficient of tick distributions described earlier. The population-scale increase in the aggregation coefficient is indirect evidence of a density-dependent process or processes sufficiently strong to produce a population-wide pattern, and thus also likely to influence population regulation. The direct observation of a density-dependent process is evidence of at least part of the responsible mechanism.
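A method-of-moments sketch of the aggregation coefficient k (the paper fits negative binomial distributions; moment matching is a simple stand-in for a full maximum-likelihood fit, and the counts below are invented):

```python
import numpy as np

def aggregation_k(counts):
    """Method-of-moments estimate of the negative binomial aggregation
    coefficient: k = mean^2 / (variance - mean).
    Smaller k means stronger aggregation of parasites on hosts."""
    counts = np.asarray(counts, dtype=float)
    m, v = counts.mean(), counts.var(ddof=1)
    if v <= m:
        raise ValueError("no overdispersion: counts are not NB-like")
    return m * m / (v - m)

# Hypothetical tick counts on two samples of lizards
low_density = [0, 0, 1, 0, 2, 0, 7, 0, 1, 0]
high_density = [3, 1, 8, 2, 5, 0, 12, 4, 6, 2]
# k rises with mean burden, the population-scale pattern reported above
print(aggregation_k(low_density), aggregation_k(high_density))
```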
Abstract:
Like many states and territories, South Australia has a legacy of marine reserves considered inadequate to meet current conservation objectives. In this paper we configured exploratory marine reserve systems, using the software MARXAN, to examine how efficiently South Australia's existing marine reserves contribute to quantitative biodiversity conservation targets. Our aim was to compare marine reserve systems that retain South Australia's existing marine reserves with reserve systems that are free to either ignore or incorporate them. We devised a new interpretation of irreplaceability to identify planning units selected more often than could be expected from chance alone. This is measured by comparing the observed selection frequency for an individual planning unit with a predicted selection frequency distribution. Knowing which sites make a valuable contribution to efficient marine reserve system design allows us to determine how well South Australia's existing reserves contribute to reservation goals when representation targets are set at 5, 10, 15, 20, 30 and 50% of conservation features. Existing marine reserves that fail to contribute to efficient marine reserve systems constitute 'opportunity costs'. We found that, despite spanning less than 4% of South Australian state waters, locking in the existing ad hoc marine reserves presented considerable opportunity costs. Even with representation targets set at 50%, more than half of South Australia's existing marine reserves were selected no more often than expected by chance in efficient marine reserve systems. Hence, ad hoc marine reserve systems are likely to be inefficient and may compromise effective conservation of marine biodiversity.
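A simplified reading of this selection-frequency test, assuming a binomial null in which every planning unit is chosen independently with the run-average inclusion probability; the paper's predicted selection frequency distribution may be constructed differently:

```python
from scipy.stats import binom

def selected_more_than_chance(obs_selections, n_runs, p_null, alpha=0.05):
    """Flag a planning unit whose selection frequency exceeds what a
    binomial null model predicts. p_null could be, e.g., the average
    fraction of planning units chosen per MARXAN run."""
    # one-sided tail probability of >= obs_selections under the null
    p_value = binom.sf(obs_selections - 1, n_runs, p_null)
    return p_value < alpha

# Hypothetical: a unit selected in 87 of 100 runs, where ~30% of
# units are chosen per run on average
print(selected_more_than_chance(87, 100, 0.30))  # True
```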
Abstract:
The mechanisms involved in angiotensin II type 1 receptor (AT1-R) trafficking and membrane localization are largely unknown. In this study, we examined the role of caveolin in these processes. Electron microscopy of plasma membrane sheets shows that the AT1-R is not concentrated in caveolae but is clustered in cholesterol-independent microdomains; upon activation, it partially redistributes to lipid rafts. Despite the lack of AT1-R in caveolae, AT1-R/caveolin complexes are readily detectable in cells co-expressing both proteins. This interaction requires an intact caveolin scaffolding domain, because mutant caveolins that lack a functional caveolin scaffolding domain do not interact with AT1-R. Expression of an N-terminally truncated caveolin-3, CavDGV, that localizes to lipid bodies, or of a point mutant, Cav3-P104L, that accumulates in the Golgi, mislocalizes AT1-R to lipid bodies and Golgi, respectively. Mislocalization results in aberrant maturation and surface expression of AT1-R, effects that are not reversed by supplementing cells with cholesterol. Similarly, mutation of aromatic residues in the caveolin-binding site abrogates AT1-R cell surface expression. In cells lacking caveolin-1 or caveolin-3, AT1-R does not traffic to the cell surface unless caveolin is ectopically expressed. This observation is recapitulated in caveolin-1 null mice, which have a 55% reduction in renal AT1-R levels compared with controls. Taken together, our results indicate that a direct interaction with caveolin is required to traffic the AT1-R through the exocytic pathway, but that this does not result in AT1-R sequestration in caveolae. Caveolin therefore acts as a molecular chaperone rather than as a plasma membrane scaffold for the AT1-R.
Abstract:
Many large-scale stochastic systems, such as telecommunications networks, can be modelled using a continuous-time Markov chain. However, it is frequently the case that a satisfactory analysis of their time-dependent, or even equilibrium, behaviour is impossible. In this paper, we propose a new method of analysing Markovian models, whereby the existing transition structure is replaced by a more amenable one. Using rates of transition given by the equilibrium expected rates of the corresponding transitions of the original chain, we are able to approximate its behaviour. We present two formulations of the idea of expected rates. The first provides a method for analysing time-dependent behaviour, while the second provides a highly accurate means of analysing equilibrium behaviour. We illustrate our approach with reference to a variety of models, giving particular attention to queueing and loss networks. © 2003 Elsevier Ltd. All rights reserved.
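As an illustration of the expected-rates idea (not the paper's exact formulation), one can collapse the state-dependent service rates of a small queueing chain into a single equilibrium expected rate; the queue used here is an invented stand-in:

```python
import numpy as np

# Generator of a small CTMC (an M/M/2/3 queue as a stand-in):
# rows sum to zero; Q[i, j] is the rate of jumping i -> j.
lam, mu = 1.0, 0.8
Q = np.array([
    [-lam,         lam,             0.0,             0.0],
    [  mu, -(mu + lam),             lam,             0.0],
    [ 0.0,      2 * mu, -(2 * mu + lam),             lam],
    [ 0.0,         0.0,          2 * mu,         -2 * mu],
])

# Equilibrium distribution: solve pi Q = 0 with pi summing to 1
A = np.vstack([Q.T, np.ones(4)])
b = np.array([0.0, 0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

# Equilibrium expected service rate: the state-dependent rates
# (mu, 2mu, 2mu) collapse to one effective rate that could replace
# them in a simpler, more amenable chain
expected_service_rate = pi[1] * mu + pi[2] * 2 * mu + pi[3] * 2 * mu
print(pi, expected_service_rate)
```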
Abstract:
We consider a mixture model approach to the regression analysis of competing-risks data. Attention is focused on inference concerning the effects of factors on both the probability of occurrence and the hazard rate conditional on each of the failure types. These two quantities are specified in the mixture model using the logistic model and the proportional hazards model, respectively. We propose a semi-parametric mixture method to estimate the logistic and regression coefficients jointly, whereby the component-baseline hazard functions are completely unspecified. Estimation is based on maximum likelihood, using the full likelihood, implemented via an expectation-conditional maximization (ECM) algorithm. Simulation studies are performed to compare the performance of the proposed semi-parametric method with a fully parametric mixture approach. The results show that when the component-baseline hazard is monotonic increasing, the semi-parametric and fully parametric mixture approaches are comparable for mildly and moderately censored samples. When the component-baseline hazard is not monotonic increasing, the semi-parametric method consistently provides less biased estimates than a fully parametric approach and is comparable in efficiency in the estimation of the parameters for all levels of censoring. The methods are illustrated using a real data set of prostate cancer patients treated with different dosages of the drug diethylstilbestrol. Copyright © 2003 John Wiley & Sons, Ltd.
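A fully parametric stand-in for this mixture, with logistic mixing proportions and exponential component hazards fitted by direct maximum likelihood rather than the paper's semi-parametric ECM; all data and parameter values below are simulated for illustration:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

# Simulate competing-risks data: logistic P(type 1 | x), exponential
# component hazards (a stand-in for the unspecified baselines), and
# independent exponential censoring.
rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
p1 = expit(0.5 + 1.0 * x)                 # P(failure type 1 | x)
cause = (rng.uniform(size=n) > p1) + 1    # 1 or 2
lam = np.where(cause == 1, 0.5, 1.5)
t = rng.exponential(1 / lam)
c = rng.exponential(2.0, size=n)          # censoring times
obs_t = np.minimum(t, c)
d = np.where(t <= c, cause, 0)            # 0 = censored

def negloglik(theta):
    a, b, log_l1, log_l2 = theta
    l1, l2 = np.exp(log_l1), np.exp(log_l2)
    p = expit(a + b * x)
    f1, f2 = l1 * np.exp(-l1 * obs_t), l2 * np.exp(-l2 * obs_t)
    # censored subjects contribute the mixture survivor function
    S = p * np.exp(-l1 * obs_t) + (1 - p) * np.exp(-l2 * obs_t)
    ll = np.where(d == 1, np.log(p * f1),
         np.where(d == 2, np.log((1 - p) * f2), np.log(S)))
    return -ll.sum()

fit = minimize(negloglik, x0=[0.0, 0.0, 0.0, 0.0], method="Nelder-Mead")
print(fit.x)  # estimates of (a, b, log lambda_1, log lambda_2)
```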
Abstract:
The sensitivity of the output of a linear operator to its input can be quantified in various ways. In control theory, the input is usually interpreted as a disturbance and the output is to be minimized in some sense. In stochastic worst-case design settings, the disturbance is considered random with an imprecisely known probability distribution. The prior set of probability measures can be chosen so as to quantify how far the disturbance deviates from the white-noise hypothesis of Linear Quadratic Gaussian control. Such deviation can be measured by the minimal Kullback-Leibler informational divergence from the Gaussian distributions with zero mean and scalar covariance matrices. The resulting anisotropy functional is defined for finite-power random vectors. Originally, anisotropy was introduced for directionally generic random vectors as the relative entropy of the normalized vector with respect to the uniform distribution on the unit sphere. The associated a-anisotropic norm of a matrix is then its maximum root mean square or average energy gain with respect to finite-power or directionally generic inputs whose anisotropy is bounded above by a ≥ 0. We give a systematic comparison of the anisotropy functionals and the associated norms. These are considered for unboundedly growing fragments of homogeneous Gaussian random fields on a multidimensional integer lattice to yield mean anisotropy. Correspondingly, the anisotropic norms of finite matrices are extended to bounded linear translation-invariant operators over such fields.
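A sketch under the assumption that the Gaussian closed form A(w) = -(1/2) ln det(m Sigma / tr Sigma), familiar from the anisotropy-based control literature, is the intended anisotropy functional for zero-mean Gaussian vectors; treat this formula as an assumption rather than a quotation of the paper:

```python
import numpy as np

def gaussian_anisotropy(Sigma):
    """Anisotropy of a zero-mean Gaussian vector w ~ N(0, Sigma) in R^m,
    assuming the closed form A(w) = -0.5 * ln det(m * Sigma / tr(Sigma)).
    It vanishes iff Sigma is a scalar matrix (the white-noise case)."""
    Sigma = np.asarray(Sigma, dtype=float)
    m = Sigma.shape[0]
    _, logdet = np.linalg.slogdet(m * Sigma / np.trace(Sigma))
    return -0.5 * logdet

print(gaussian_anisotropy(np.eye(3)))                 # 0.0: isotropic
print(gaussian_anisotropy(np.diag([1.0, 1.0, 4.0])))  # > 0: anisotropic
```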
Abstract:
We present an abstract model of the leader election protocol used in the IEEE 1394 High Performance Serial Bus standard. The model is expressed in the probabilistic Guarded Command Language. By formal reasoning based on this description, we establish the probability of the root contention part of the protocol successfully terminating in terms of the number of attempts to do so. Some simple calculations then allow us to establish an upper bound on the time taken for those attempts.
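The shape of such a bound can be illustrated with a geometric model of independent attempts; the per-round success probability of 1/2 is the usual idealisation of the symmetric coin flip in 1394 root contention, and the per-attempt duration below is an invented placeholder, not a figure from the standard:

```python
import math

def attempts_needed(p_success, confidence):
    """Smallest n with P(success within n attempts) >= confidence,
    i.e. 1 - (1 - p_success)^n >= confidence, assuming independent
    attempts that each succeed with probability p_success."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - p_success))

n = attempts_needed(0.5, 0.999)
print(n)            # 10 attempts suffice at the 99.9% level
print(n * 0.85e-6)  # time bound if each attempt took ~0.85 us (illustrative)
```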
Abstract:
A more efficient classifying cyclone (CC) for fine particle classification has been developed in recent years at the JKMRC. The novel CC, known as the JKCC, has modified profiles of the cyclone body, vortex finder, and spigot compared with conventional hydrocyclones. The novel design increases the centrifugal force inside the cyclone and mitigates the short-circuiting flow that exists in all current cyclones. It also decreases the probability of particle contamination in the region near the cyclone spigot. Consequently, cyclone efficiency is improved while the unit maintains a simple structure. An international patent has been granted for this novel cyclone design. In the first development stage, a feasibility study, a 100 mm JKCC was tested and compared with two 100 mm commercial units. Very encouraging results were achieved, indicating good potential for the novel design. In the second development stage, a scale-up stage, the JKCC was scaled up to 200 mm in diameter, and its geometry was optimized through numerous tests. The performance of the JKCC was compared with a 150 mm commercial unit and exhibited sharper separation, finer separation size, and lower flow ratios. The JKCC is now being scaled up into a full-size (480 mm) hydrocyclone in the third development stage, an industrial study. The 480 mm diameter unit will be tested in an Australian coal preparation plant and directly compared with a commercial CC operating under the same conditions. Classifying cyclone performance for fine coal could be further improved if the unit is installed in an inclined position. The study using the 200 mm JKCC revealed that sharpness of separation improved and the flow ratio to underflow decreased by 43% as the cyclone inclination was varied from the vertical position (0°) to the horizontal position (90°). The separation size was not affected, although the feed rate was slightly decreased. To ensure self-emptying upon shutdown, it is recommended that the JKCC be installed at an inclination of 75-80°. At this angle the cyclone performance is very similar to that at the horizontal position. Similar findings have been derived from the testing of a conventional hydrocyclone. This may be of benefit to operations that require improved performance from their classifying cyclones in terms of sharpness of separation and flow ratio, while tolerating a slightly reduced feed rate.
Abstract:
A new modeling approach, multiple mapping conditioning (MMC), is introduced to treat mixing and reaction in turbulent flows. The model combines the advantages of the probability density function and conditional moment closure methods and is based on a certain generalization of the mapping closure concept. An equivalent stochastic formulation of the MMC model is given. The validity of the model's closure hypothesis is demonstrated by comparison with direct numerical simulation results for the three-stream mixing problem. © 2003 American Institute of Physics.
Abstract:
A new lifetime distribution capable of modeling a bathtub-shaped hazard-rate function is proposed. The proposed model is derived as a limiting case of the Beta Integrated Model and has both the Weibull distribution and the Type I extreme value distribution as special cases. The model can be considered another useful three-parameter generalization of the Weibull distribution. An advantage of the model is that its parameters can be estimated easily from a Weibull probability paper (WPP) plot, which serves as a tool for model identification. Model characterization based on the WPP plot is studied. A numerical example is provided, and comparison with another Weibull extension, the exponentiated Weibull, is also discussed. The proposed model compares well with other competing models in fitting data that exhibit a bathtub-shaped hazard-rate function.
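A sketch of the WPP construction used for model identification, with synthetic two-parameter Weibull data and Benard's median-rank approximation for the empirical CDF; the specific shape and scale values are invented:

```python
import numpy as np
import matplotlib.pyplot as plt

# Weibull probability paper (WPP): plot ln(-ln(1 - F)) against ln(t).
# A two-parameter Weibull sample falls on a straight line; systematic
# curvature is what motivates bathtub-capable extensions like the model
# proposed above.
rng = np.random.default_rng(1)
t = np.sort(rng.weibull(1.5, size=200) * 100.0)        # shape 1.5, scale 100
F = (np.arange(1, len(t) + 1) - 0.3) / (len(t) + 0.4)  # median-rank estimate

plt.plot(np.log(t), np.log(-np.log(1 - F)), ".")
plt.xlabel("ln t")
plt.ylabel("ln(-ln(1 - F))")
plt.title("WPP plot: a straight line indicates a Weibull fit")
plt.show()
```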