126 results for Species distribution modelling
Abstract:
Burkholderia pseudomallei, the causative agent of melioidosis, is associated with soil. This study used a geographic information system (GIS) to determine the spatial distribution of clinical cases of melioidosis in the endemic suburban region of Townsville in Australia. A total of 65 cases over the period 1996–2008 were plotted using residential address. Two distinct groupings were found. One was around the base of a hill in the city centre and the other followed the old course of a major waterway in the region. Both groups (accounting for 43 of the 65 cases examined) are in areas expected to have particularly wet topsoils following intense rainfall, due to soil type or landscape position.
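The kind of spatial grouping this GIS analysis surfaces can be sketched with a simple distance-based clustering of case coordinates. This is a minimal stand-in for a GIS workflow, not the study's method, and the coordinates below are synthetic, not patient addresses.

```python
import math

def cluster(points, radius):
    """Greedy single-linkage clustering: a point joins any cluster that has
    a member within `radius`; clusters bridged by a point are merged."""
    clusters = []
    for p in points:
        merged = None
        for c in clusters:
            if any(math.hypot(p[0] - q[0], p[1] - q[1]) <= radius for q in c):
                if merged is None:
                    c.append(p)
                    merged = c
                else:
                    merged.extend(c)  # p bridges two clusters: merge them
                    c.clear()
        clusters = [c for c in clusters if c]  # drop emptied clusters
        if merged is None:
            clusters.append([p])
    return clusters

# Two synthetic case groupings plus one isolated case
cases = [(0, 0), (1, 0), (0, 1), (10, 10), (11, 10), (30, 30)]
groups = cluster(cases, radius=2.0)
```

On real data the radius would be chosen from the spatial scale of interest (here, hundreds of metres of suburban terrain).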
Abstract:
This study examined the distribution of major mosquito species and their roles in the transmission of Ross River virus (RRV) infection for coastal and inland areas of Brisbane, Australia (27°28′ S, 153°2′ E). We obtained data on the monthly counts of RRV cases in Brisbane between November 1998 and December 2001 by statistical local area from the Queensland Department of Health, and monthly mosquito abundance from the Brisbane City Council. Correlation analysis was used to assess the pairwise relationships between mosquito density and the incidence of RRV disease. This study showed that the abundances of Aedes vigilax (Skuse), Culex annulirostris (Skuse), and Aedes vittiger (Skuse) were significantly associated with the monthly incidence of RRV in the coastal area, whereas Aedes vigilax, Culex annulirostris, and Aedes notoscriptus (Skuse) were significantly associated with the monthly incidence of RRV in the inland area. The results of the classification and regression tree (CART) analysis show that both the occurrence and incidence of RRV were influenced by interactions between species in both coastal and inland regions. We found that there was an 89% chance of an occurrence of RRV if the abundance of Ae. vigilax was between 64 and 90 in the coastal region, and an 80% chance of an occurrence of RRV if the density of Cx. annulirostris was between 53 and 74 in the inland area. The results of this study may have applications as a decision support tool in planning the control of RRV and other mosquito-borne diseases.
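The abundance thresholds reported above are the kind of rule a CART analysis produces. A minimal sketch of the core step, a single impurity-minimising split on a synthetic abundance series (all numbers below are illustrative, not the study's data):

```python
def gini(labels):
    """Gini impurity of a list of 0/1 labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2 * p * (1 - p)

def best_split(abundance, occurrence):
    """Find the abundance threshold minimising weighted Gini impurity,
    i.e. the root split a CART algorithm would choose."""
    pairs = sorted(zip(abundance, occurrence))
    best_score, best_thr = float("inf"), None
    for i in range(1, len(pairs)):
        thr = (pairs[i - 1][0] + pairs[i][0]) / 2  # midpoint candidate
        left = [y for x, y in pairs if x <= thr]
        right = [y for x, y in pairs if x > thr]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(pairs)
        if score < best_score:
            best_score, best_thr = score, thr
    return best_thr

# Synthetic monthly data: RRV tends to occur once trap counts exceed ~60
counts = [10, 25, 40, 55, 62, 70, 80, 90, 35, 65]
rrv    = [ 0,  0,  0,  0,  1,  1,  1,  1,  0,  1]
threshold = best_split(counts, rrv)
```

A full CART model applies this split recursively and across species, which is how interaction effects like those described above emerge.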
Abstract:
Chronic wounds fail to proceed through an orderly process to produce anatomic and functional integrity and are a significant socioeconomic problem. There is much debate about the best way to treat these wounds. In this thesis we review earlier mathematical models of angiogenesis and wound healing. Many of these models assume a chemotactic response of endothelial cells, the primary cell type involved in angiogenesis. Modelling this chemotactic response leads to a system of advection-dominated partial differential equations; we review numerical methods to solve these equations and argue that the finite volume method with flux limiting is best suited to these problems. One treatment of chronic wounds that is shrouded in controversy is hyperbaric oxygen therapy (HBOT). There is currently no conclusive data showing that HBOT can assist chronic wound healing, but there has been some clinical success. In this thesis we use two mathematical models of wound healing to investigate the use of hyperbaric oxygen therapy to assist the healing process: a novel three-species model and a more complex six-species model. The second model accounts for more of the biological phenomena but does not lend itself to mathematical analysis. Both models are then used to make predictions about the efficacy of hyperbaric oxygen therapy and the optimal treatment protocol. Based on our modelling, we make several predictions: intermittent HBOT will assist chronic wound healing while normobaric oxygen is ineffective in treating such wounds; treatment should continue until healing is complete; and finding the right protocol for an individual patient is crucial if HBOT is to be effective.
Analysis of the models allows us to derive constraints on the range of HBOT protocols that will stimulate healing. This enables us to predict which patients are more likely to have a positive response to HBOT, and thus has the potential to improve both the success rate and the cost-effectiveness of this therapy.
Abstract:
From 19 authoritative lists with 164 entries of ‘endangered’ Australian mammal species, 39 species have been reported as extinct. When examined in the light of field conditions, the 18 of these species thought to be from Queensland consist of (a) species described from fragmentary museum material collected in the earliest days of exploration, (b) populations inferred to exist in Queensland by extrapolation from distribution records in neighbouring States or countries, (c) inhabitants of remote and harsh locations where search effort is extraordinarily difficult (especially in circumstances of drought or flooding), and/or (d) individuals that are clearly transitory or peripheral in distribution. ‘Rediscovery’ of such scarce species, a not infrequent occurrence, is nowadays attracting increasing attention. Management of any scarce wildlife in Queensland presently derives from such official lists. The analyses here indicate that this method of prioritizing action needs review, especially because action then tends to be centred on species chosen from the lists for populist reasons and mostly addresses Crown lands. There is reason to believe that the preferred management may lie in private lands, where casual observation has provided for rediscovery and where management is most desirable and practicable.
Abstract:
The variability of input parameters is the most important source of overall model uncertainty. Therefore, an in-depth understanding of this variability is essential for uncertainty analysis of stormwater quality model outputs. This paper presents the outcomes of a research study which investigated the variability of pollutant build-up characteristics on road surfaces in residential, commercial and industrial land uses. It was found that build-up characteristics vary widely even within the same land use. Additionally, the industrial land use showed relatively higher variability in maximum build-up, build-up rate and particle size distribution, whilst the commercial land use displayed relatively higher variability in the pollutant-solid ratio. Among the various build-up parameters analysed, D50 (volume-median diameter) displayed the highest variability for all three land uses.
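As a small illustration of the D50 statistic discussed above, the volume-median diameter can be read off a cumulative particle-size distribution by linear interpolation. The size classes and cumulative fractions below are invented for illustration, not the study's data.

```python
def d50(diameters, cum_volume_fraction):
    """Interpolate the diameter at which the cumulative volume fraction
    reaches 0.5 (the volume-median diameter, D50)."""
    for i in range(1, len(diameters)):
        if cum_volume_fraction[i] >= 0.5:
            d0, d1 = diameters[i - 1], diameters[i]
            f0, f1 = cum_volume_fraction[i - 1], cum_volume_fraction[i]
            # linear interpolation within the bracketing size class
            return d0 + (0.5 - f0) * (d1 - d0) / (f1 - f0)
    return diameters[-1]

# Sieve sizes in micrometres and cumulative volume fraction passing each size
sizes = [1, 10, 50, 100, 300, 1000]
cum   = [0.02, 0.15, 0.40, 0.62, 0.85, 1.00]
median_diameter = d50(sizes, cum)
```

Comparing D50 across repeated build-up samples is one way the variability reported above would be quantified.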
Abstract:
Plant biosecurity requires statistical tools to interpret field surveillance data in order to manage pest incursions that threaten crop production and trade. Ultimately, management decisions need to be based on the probability that an area is infested or free of a pest. Current informal approaches to delimiting pest extent rely upon expert ecological interpretation of presence/absence data over space and time. Hierarchical Bayesian models provide a cohesive statistical framework that can formally integrate the available information on both pest ecology and data. The overarching method involves constructing an observation model for the surveillance data, conditional on the hidden extent of the pest and uncertain detection sensitivity. The extent of the pest is then modelled as a dynamic invasion process that includes uncertainty in ecological parameters. Modelling approaches to assimilate this information are explored through case studies on spiralling whitefly, Aleurodicus dispersus, and red banded mango caterpillar, Deanolis sublimbalis. Markov chain Monte Carlo simulation is used to estimate the probable extent of pests, given the observation and process model conditioned by surveillance data. Statistical methods, based on time-to-event models, are developed to apply hierarchical Bayesian models to early detection programs and to demonstrate area freedom from pests. The value of early detection surveillance programs is demonstrated through an application to interpret surveillance data for exotic plant pests with uncertain spread rates. The model suggests that typical early detection programs provide a moderate reduction in the probability of an area being infested but a dramatic reduction in the expected area of incursions at a given time. Estimates of spiralling whitefly extent are examined at local, district and state-wide scales.
The local model estimates the rate of natural spread and the influence of host architecture, host suitability and inspector efficiency. These parameter estimates can support the development of robust surveillance programs. Hierarchical Bayesian models for the human-mediated spread of spiralling whitefly are developed for the colonisation of discrete cells connected by a modified gravity model. By estimating dispersal parameters, the model can be used to predict the extent of the pest over time. An extended model predicts the climate restricted distribution of the pest in Queensland. These novel human-mediated movement models are well suited to demonstrating area freedom at coarse spatio-temporal scales. At finer scales, and in the presence of ecological complexity, exploratory models are developed to investigate the capacity for surveillance information to estimate the extent of red banded mango caterpillar. It is apparent that excessive uncertainty about observation and ecological parameters can impose limits on inference at the scales required for effective management of response programs. The thesis contributes novel statistical approaches to estimating the extent of pests and develops applications to assist decision-making across a range of plant biosecurity surveillance activities. Hierarchical Bayesian modelling is demonstrated as both a useful analytical tool for estimating pest extent and a natural investigative paradigm for developing and focussing biosecurity programs.
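The human-mediated spread component described above can be sketched as a modified gravity kernel over discrete cells: colonisation pressure on an uninfested cell from each infested cell scales with the product of their populations and decays with distance. This is a schematic reading of the approach, with illustrative parameter values rather than the thesis's estimates.

```python
import math

def gravity_pressure(cells, infested, beta=2.0, theta=1e-4):
    """Per-cell probability of colonisation in one time-step.

    cells: dict of name -> (x, y, population); infested: set of cell names.
    beta is the distance-decay exponent; theta scales raw gravity pressure
    into a probability via 1 - exp(-theta * g). Both are illustrative.
    """
    probs = {}
    for j, (xj, yj, nj) in cells.items():
        if j in infested:
            continue
        g = 0.0
        for i in infested:
            xi, yi, ni = cells[i]
            d = math.hypot(xj - xi, yj - yi)
            g += ni * nj / d ** beta  # gravity term: attraction / distance decay
        probs[j] = 1 - math.exp(-theta * g)
    return probs

# Three synthetic cells: B is near the infested cell A, C is far away
cells = {"A": (0, 0, 5000), "B": (10, 0, 2000), "C": (50, 0, 2000)}
risk = gravity_pressure(cells, infested={"A"})
```

In a hierarchical Bayesian setting, parameters like beta and theta would carry priors and be updated against colonisation records rather than fixed as here.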
Abstract:
Freeways are divided roadways designed to facilitate the uninterrupted movement of motor vehicles. However, many freeways now experience demand flows in excess of capacity, leading to recurrent congestion. The Highway Capacity Manual (TRB, 1994) uses empirical macroscopic relationships between speed, flow and density to quantify freeway operations and performance. Capacity may be predicted as the maximum uncongested flow achievable. Although they are effective tools for design and analysis, macroscopic models lack an understanding of the nature of processes taking place in the system. Szwed and Smith (1972, 1974) and Makigami and Matsuo (1990) have shown that microscopic modelling is also applicable to freeway operations. Such models facilitate an understanding of the processes whilst providing for the assessment of performance, through measures of capacity and delay. However, these models are limited to only a few circumstances. The aim of this study was to produce more comprehensive and practical microscopic models. These models were required to accurately portray the mechanisms of freeway operations at the specific locations under consideration. The models needed to be calibrated using data acquired at these locations, and their output validated with data acquired at these sites, so that the outputs are truly descriptive of the performance of the facility. A theoretical basis needed to underlie the form of these models, rather than the empiricism of the macroscopic models currently used. The models also needed to be adaptable to variable operating conditions, so that they may be applied, where possible, to other similar systems and facilities. It was not possible in this single study to produce a stand-alone model applicable to all facilities and locations; however, the scene has been set for the application of the models to a much broader range of operating conditions.
Opportunities for further development of the models were identified, and procedures provided for the calibration and validation of the models to a wide range of conditions. The models developed do, however, have limitations in their applicability. Only uncongested operations were studied and represented. Driver behaviour in Brisbane was applied to the models; different mechanisms are likely in other locations due to variability in road rules and driving cultures. Not all manoeuvres evident were modelled; some unusual manoeuvres were considered unwarranted to model. However, the models developed contain the principal processes of freeway operations: merging and lane changing. Gap acceptance theory was applied to these critical operations to assess freeway performance. Gap acceptance theory was found to be applicable to merging; however, the major stream, the kerb-lane traffic, exercises only a limited priority over the minor stream, the on-ramp traffic. Theory was established to account for this activity. Kerb-lane drivers were also found to change to the median lane where possible, to assist coincident mergers. The net limited priority model accounts for this by predicting a reduced major stream flow rate, which excludes lane changers. Cowan's M3 model was calibrated for both streams; on-ramp and total upstream flow are required as input. Relationships between the proportion of headways greater than 1 s and flow differed between on-ramps where traffic leaves signalised intersections and those at unsignalised intersections. Constant-departure on-ramp metering was also modelled. Minimum follow-on times of 1 to 1.2 s were calibrated. Critical gaps were shown to lie between the minimum follow-on time and the sum of the minimum follow-on time and the 1 s minimum headway. Limited priority capacity and other boundary relationships were established by Troutbeck (1995).
The minimum average minor stream delay and corresponding proportion of drivers delayed were quantified theoretically in this study. A simulation model was constructed to predict intermediate minor and major stream delays across all minor and major stream flows. Pseudo-empirical relationships were established to predict average delays. Major stream average delays are limited to 0.5 s, insignificant compared with minor stream delays, which reach infinity at capacity. Minor stream delays were shown to be less when unsignalised intersections are located upstream of on-ramps than signalised intersections, and less still when ramp metering is installed. Smaller delays correspond to improved merge area performance. A more tangible performance measure, the distribution of distances required to merge, was established by including design speeds. This distribution can be measured to validate the model. Merging probabilities can be predicted for given taper lengths, a most useful performance measure. This model was also shown to be applicable to lane changing. Tolerable limits to merging probabilities require calibration. From these, practical capacities can be estimated. Further calibration is required of traffic inputs, critical gap and minimum follow-on time, for both merging and lane changing. A general relationship to predict the proportion of drivers delayed requires development. These models can then be used to complement existing macroscopic models to assess performance, and provide further insight into the nature of operations.
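The gap-acceptance machinery described above can be illustrated with a small Monte Carlo sketch: major-stream headways are drawn from a Cowan M3 model (a proportion alpha of free vehicles with shifted-exponential headways, the remainder bunched at the minimum headway delta), and the fraction of gaps at least the critical gap is estimated. Parameter values are illustrative, not the calibrated ones from the thesis.

```python
import random

def m3_headway(q, alpha, delta, rng):
    """Draw one major-stream headway (s) under Cowan's M3 model.

    q: flow (veh/s); alpha: proportion of free vehicles; delta: minimum
    (bunched) headway. The decay rate lam preserves the mean headway 1/q.
    """
    if rng.random() >= alpha:
        return delta  # bunched vehicle follows at exactly the minimum headway
    lam = alpha * q / (1 - delta * q)
    return delta + rng.expovariate(lam)

def accepted_fraction(q, alpha, delta, t_crit, n=200000, seed=1):
    """Estimate the fraction of major-stream gaps an on-ramp driver can
    accept, i.e. gaps of at least the critical gap t_crit."""
    rng = random.Random(seed)
    return sum(m3_headway(q, alpha, delta, rng) >= t_crit for _ in range(n)) / n

# Illustrative: 1440 veh/h = 0.4 veh/s, 70% free vehicles,
# 1 s minimum headway, 4 s critical gap
frac = accepted_fraction(q=0.4, alpha=0.7, delta=1.0, t_crit=4.0)
```

The estimate should converge to the closed form alpha * exp(-lam * (t_crit - delta)), about 0.173 for these values; the simulation becomes useful once limited priority and coincident lane changing are layered on top, where closed forms are harder to obtain.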
Abstract:
A new immobilized flat plate photocatalytic reactor for wastewater treatment has been proposed in this study to avoid subsequent catalyst removal from the treated water. The reactor consists of an inlet section, a reactive section where the catalyst is coated, and an outlet section. In order to optimize fluid mixing and reactor design, this study aims to investigate the influence of baffles and their arrangement on the flat plate reactor hydrodynamics using computational fluid dynamics (CFD) simulation. For simulation, an array of baffles acting as turbulence promoters is inserted in the reactive zone of the reactor. Results obtained from simulating the hydrodynamics of a baffled flat-plate photoreactor for different baffle positions, heights and intervals are presented using the RNG k-ε turbulence model. Under the conditions simulated, the qualitative flow features, such as the development and separation of boundary layers, vortex formation, the presence of high shear regions and recirculation zones, and the underlying mechanism are examined. The influence of various baffle sizes on the distribution of pollutant concentration is also highlighted. The results presented here indicate that the spanning of recirculation increases the degree of interfacial distortion, with a larger interfacial area between fluids resulting in substantial enhancement of fluid mixing. The simulation results suggest that the qualitative and quantitative properties of fluid dynamics in a baffled reactor can be obtained, providing valuable insight into the effect of baffles and their arrangement on the flow pattern and behaviour.
Abstract:
Early detection surveillance programs aim to find invasions of exotic plant pests and diseases before they are too widespread to eradicate. However, the value of these programs can be difficult to justify when no positive detections are made. To demonstrate the value of pest absence information provided by these programs, we use a hierarchical Bayesian framework to model estimates of incursion extent with and without surveillance. A model for the latent invasion process provides the baseline against which surveillance data are assessed. Ecological knowledge and pest management criteria are introduced into the model using informative priors for invasion parameters. Observation models assimilate information from spatio-temporal presence/absence data to accommodate imperfect detection and generate posterior estimates of pest extent. When applied to an early detection program operating in Queensland, Australia, the framework demonstrates that this typical surveillance regime provides a modest reduction in the estimated probability that a surveyed district is infested. More importantly, the model suggests that early detection surveillance programs can provide a dramatic reduction in the putative area of incursion and therefore offer a substantial benefit to incursion management. By mapping spatial estimates of the point probability of infestation, the model identifies where future surveillance resources can be most effectively deployed.
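The core calculation behind the value of pest-absence information can be sketched as a Bayesian update of the probability a district is infested after repeated negative surveys with imperfect detection sensitivity. The prior and sensitivity values below are illustrative assumptions, not the paper's estimates, and the full framework adds the latent invasion process omitted here.

```python
def posterior_infested(prior, sensitivity, n_negative_surveys):
    """P(infested | n negative surveys), assuming surveys detect an
    infestation independently with probability `sensitivity`."""
    miss = (1 - sensitivity) ** n_negative_surveys  # P(all surveys miss | infested)
    return prior * miss / (prior * miss + (1 - prior))

p0 = 0.10  # illustrative prior probability the district is infested
p5 = posterior_infested(p0, sensitivity=0.3, n_negative_surveys=5)
```

Even with modest per-survey sensitivity, repeated absences shrink the posterior well below the prior, which is the quantitative sense in which "no detections" still carries value.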
Abstract:
Discrete stochastic simulations are a powerful tool for understanding the dynamics of chemical kinetics when there are small-to-moderate numbers of certain molecular species. In this paper we introduce delays into the stochastic simulation algorithm, thus mimicking delays associated with transcription and translation. We then show that this process may well explain more faithfully than continuous deterministic models the observed sustained oscillations in expression levels of hes1 mRNA and Hes1 protein.
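A minimal sketch of a delay stochastic simulation algorithm for a single production-and-decay motif, in the spirit of the transcription/translation delays described above: production events initiated now only complete after a fixed delay. The rates and delay are illustrative, not the hes1/Hes1 parameters.

```python
import heapq
import random

def delay_ssa(t_end, k_make=2.0, k_decay=0.1, tau=5.0, seed=42):
    """Delay SSA for 0 -> X (completes after delay tau) and X -> 0 (immediate).
    Returns the molecule count at time t_end."""
    rng = random.Random(seed)
    t, x = 0.0, 0
    pending = []  # min-heap of completion times for delayed production events
    while t < t_end:
        a_total = k_make + k_decay * x  # total propensity
        dt = rng.expovariate(a_total)
        # if a scheduled delayed product completes before the next reaction,
        # handle the completion first and re-draw (exponentials are memoryless)
        if pending and pending[0] <= t + dt:
            t = heapq.heappop(pending)
            x += 1
            continue
        t += dt
        if rng.random() < k_make / a_total:
            heapq.heappush(pending, t + tau)  # initiate production; completes at t + tau
        elif x > 0:
            x -= 1  # decay
    return x

x_final = delay_ssa(200.0)
```

With these rates the count fluctuates around k_make / k_decay = 20; replacing the single fixed delay with transcription and translation delays in a feedback loop is what produces the sustained oscillations discussed above.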
An approach to statistical lip modelling for speaker identification via chromatic feature extraction
Abstract:
This paper presents a novel technique for tracking moving lips for the purpose of speaker identification. In our system, a model of the lip contour is formed directly from chromatic information in the lip region; iterative refinement of contour point estimates is not required. Colour features are extracted from the lips via concatenated profiles taken around the lip contour. Dimensionality reduction of the lip features is obtained via principal component analysis (PCA) followed by linear discriminant analysis (LDA). Statistical speaker models are built from the lip features based on the Gaussian mixture model (GMM). Identification experiments performed on the M2VTS database show encouraging results.
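A highly simplified sketch of the identification step: each speaker is modelled by a diagonal-covariance Gaussian over feature vectors (a one-component stand-in for the paper's Gaussian mixture models), and a test sequence is assigned to the speaker whose model gives the highest total log-likelihood. The "lip features" and speakers below are synthetic.

```python
import math

def fit_diag_gaussian(vectors):
    """Per-dimension mean and variance of a set of feature vectors
    (variance floored to avoid degenerate densities)."""
    d = len(vectors[0])
    mean = [sum(v[i] for v in vectors) / len(vectors) for i in range(d)]
    var = [max(sum((v[i] - mean[i]) ** 2 for v in vectors) / len(vectors), 1e-6)
           for i in range(d)]
    return mean, var

def log_likelihood(vectors, model):
    """Total log-density of the vectors under a diagonal Gaussian."""
    mean, var = model
    ll = 0.0
    for v in vectors:
        for i in range(len(v)):
            ll += -0.5 * (math.log(2 * math.pi * var[i])
                          + (v[i] - mean[i]) ** 2 / var[i])
    return ll

def identify(test_vectors, models):
    """Return the speaker whose model scores the test vectors highest."""
    return max(models, key=lambda spk: log_likelihood(test_vectors, models[spk]))

# Synthetic two-dimensional "lip features" for two enrolled speakers
models = {
    "spk_a": fit_diag_gaussian([[0.1, 0.2], [0.2, 0.1], [0.15, 0.15]]),
    "spk_b": fit_diag_gaussian([[0.9, 0.8], [0.8, 0.9], [0.85, 0.85]]),
}
who = identify([[0.12, 0.18], [0.2, 0.2]], models)
```

A real system would use multi-component GMMs trained by EM on PCA/LDA-reduced features, but the max-log-likelihood decision rule is the same.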
Abstract:
The stochastic simulation algorithm was introduced by Gillespie and, in a different form, by Kurtz. There have been many attempts at accelerating the algorithm without deviating from the behavior of the simulated system. The crux of the explicit τ-leaping procedure is the use of Poisson random variables to approximate the number of occurrences of each type of reaction event during a carefully selected time period, τ. This method is acceptable provided the leap condition, that no propensity function changes “significantly” during any time-step, is met. Using this method there is a possibility that species numbers can, artificially, become negative. Several recent papers have demonstrated methods that avoid this situation. One such method classifies as critical those reactions in danger of sending species populations negative; at most one of these critical reactions is allowed to occur in the next time-step. We argue that the criticality of a reactant species and its dependent reaction channels should be related to the probability of the species number becoming negative. This way, only reactions that, if fired, have a high probability of driving a reactant population negative are labeled critical. The firings of more reaction channels can then be approximated using Poisson random variables, thus speeding up the simulation while maintaining accuracy. In implementing this revised criticality selection we make use of the probability distribution from which the random variable describing the change in species number is drawn. We give several numerical examples to demonstrate the effectiveness of our new method.
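The criticality test described above can be sketched for a single consuming channel: the number of firings over a leap τ is approximated as Poisson with mean (propensity × τ), and the channel is flagged critical when the probability of firing more times than there are molecules exceeds a tolerance. The rates, counts and tolerance below are illustrative.

```python
import math

def poisson_tail(mean, k):
    """P(N > k) for N ~ Poisson(mean), via the cumulative sum of the pmf."""
    p = math.exp(-mean)  # P(N = 0)
    cum = p
    for i in range(1, k + 1):
        p *= mean / i    # recurrence: P(N = i) = P(N = i-1) * mean / i
        cum += p
    return 1.0 - cum

def is_critical(propensity, tau, species_count, tol=1e-3):
    """Flag a consuming channel critical if the Poisson approximation gives
    more firings than available molecules with probability above tol."""
    return poisson_tail(propensity * tau, species_count) > tol

# Mean of 4 expected firings over the leap: critical when only 8 molecules
# remain, but safely leapable when 100 remain.
critical_small = is_critical(4.0, tau=1.0, species_count=8)
critical_large = is_critical(4.0, tau=1.0, species_count=100)
```

Channels that pass this test can have their firings drawn from Poisson variables as usual; only the flagged ones fall back to exact, one-at-a-time treatment, which is the source of the speed-up claimed above.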