959 results for Simulation tools
Abstract:
The widespread and increasing resistance of internal parasites to anthelmintic control is a serious problem for the Australian sheep and wool industry. As part of control programmes, laboratories use the Faecal Egg Count Reduction Test (FECRT) to determine resistance to anthelmintics. It is important to have confidence in the measure of resistance, not only for the producer planning a drenching programme but also for companies investigating the efficacy of their products. The determination of resistance and corresponding confidence limits as given in the anthelmintic efficacy guidelines of the Standing Committee on Agriculture (SCA) is based on a number of assumptions. This study evaluated the appropriateness of these assumptions for typical data and compared the effectiveness of the standard FECRT procedure with that of alternative procedures. Analysis of several sets of historical experimental data from sheep and goats showed that a negative binomial distribution described pre-treatment helminth egg counts in faeces more appropriately than a normal distribution. Simulated egg counts for control animals were generated stochastically from negative binomial distributions, and those for treated animals from negative binomial and binomial distributions. Three methods for determining resistance when percent reduction is based on arithmetic means were applied: the first was that advocated in the SCA guidelines, the second was similar to the first but based the variance estimates on negative binomial distributions, and the third used Wadley's method with the distribution of the response variate assumed negative binomial and a logit link transformation. These were also compared with a fourth method, recommended by the International Co-operation on Harmonisation of Technical Requirements for Registration of Veterinary Medicinal Products (VICH) programme, in which percent reduction is based on geometric means. A wide selection of parameters was investigated, and for each set 1000 simulations were run. Percent reduction and confidence limits were then calculated for each method, together with the number of times in each set of 1000 simulations that the theoretical percent reduction fell within the estimated confidence limits and the number of times resistance would have been declared. These simulations provide the basis for setting conditions under which the methods could be recommended. The authors show that, given the distribution of helminth egg counts found in Queensland flocks, the method based on arithmetic rather than geometric means should be used, and suggest that resistance be redefined as occurring when the upper level of percent reduction is less than 95%. At least ten animals per group are required in most circumstances, though even 20 may be insufficient where the effectiveness of the product is close to the cut-off point for defining resistance.
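As a rough illustration of the quantity at the centre of these comparisons, the sketch below (Python) simulates post-treatment egg counts for a control and a treated group from negative binomial distributions and computes the percent reduction from arithmetic means, with a simple percentile bootstrap standing in for the SCA and VICH confidence-limit formulas. The group size, mean counts and dispersion values are hypothetical.

```python
# Illustrative sketch (not the SCA or VICH formulas): simulate faecal egg counts
# from negative binomial distributions and estimate percent reduction from
# arithmetic means with a bootstrap confidence interval. Parameter values
# (mean counts, dispersion k, group size) are hypothetical.
import numpy as np

rng = np.random.default_rng(1)

def neg_binom(mean, k, size):
    # NumPy parameterises the negative binomial by (n, p); convert from mean and
    # dispersion k so that variance = mean + mean**2 / k.
    p = k / (k + mean)
    return rng.negative_binomial(k, p, size)

n_animals = 10                                           # animals per group
control = neg_binom(mean=500, k=1.0, size=n_animals)     # untreated group
treated = neg_binom(mean=20, k=0.5, size=n_animals)      # treated group

reduction = 100 * (1 - treated.mean() / control.mean())

# Simple percentile bootstrap for the confidence limits of percent reduction.
boot = []
for _ in range(1000):
    c = rng.choice(control, n_animals, replace=True)
    t = rng.choice(treated, n_animals, replace=True)
    boot.append(100 * (1 - t.mean() / c.mean()))
lower, upper = np.percentile(boot, [2.5, 97.5])

print(f"percent reduction = {reduction:.1f}%, 95% CI ({lower:.1f}, {upper:.1f})")
# Under the redefinition suggested in the abstract, resistance would be declared
# when the upper confidence limit of percent reduction is below 95%.
print("resistance (upper limit < 95%):", upper < 95)
```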
Abstract:
This thesis presents an interdisciplinary analysis of how models and simulations function in the production of scientific knowledge. The work is informed by three scholarly traditions: studies on models and simulations in philosophy of science, so-called micro-sociological laboratory studies within science and technology studies, and cultural-historical activity theory. Methodologically, I adopt a naturalist epistemology and combine philosophical analysis with a qualitative, empirical case study of infectious-disease modelling. This study has a dual perspective throughout the analysis: it specifies the modelling practices and examines the models as objects of research. The research questions addressed in this study are: 1) How are models constructed and what functions do they have in the production of scientific knowledge? 2) What is interdisciplinarity in model construction? 3) How do models become a general research tool and why is this process problematic? The core argument is that mediating models, as investigative instruments (cf. Morgan and Morrison 1999), take questions as a starting point, and hence their construction is intentionally guided. This argument applies the interrogative model of inquiry (e.g., Sintonen 2005; Hintikka 1981), which conceives of all knowledge acquisition as a process of seeking answers to questions. The first question addresses simulation models as Artificial Nature, which is manipulated in order to answer the questions that initiated the model building. This account further develops the "epistemology of simulation" (cf. Winsberg 2003) by showing the interrelatedness of researchers and their objects in the process of modelling. The second question clarifies why interdisciplinary research collaboration is demanding and difficult to maintain. The nature of the impediments to disciplinary interaction is examined by introducing the idea of object-oriented interdisciplinarity, which provides an analytical framework to study the changes in the degree of interdisciplinarity, the tools and research practices developed to support the collaboration, and the mode of collaboration in relation to the historically mutable object of research. As my interest is in the models as interdisciplinary objects, the third research problem asks how we might characterise these objects, what is typical of them, and what kinds of changes happen in the process of modelling. Here I examine the tension between specified, question-oriented models and more general models, and suggest that the specified models form a group of their own. I call these Tailor-made models, in opposition to the process of building a simulation platform that aims at generalisability and utility for health policy. This tension also underlines the challenge of applying research results (or methods and tools) to discuss and solve problems in decision-making processes.
Abstract:
Two examples of GIS-based multiple-criteria evaluations of plantation forests are presented. These desktop assessments use available topographical, geological and pedological information to establish the risk of occurrence of certain environmentally detrimental processes. The first case study is concerned with the risk that chemical additives (i.e. simazine) applied within the forestry landscape may reach the drainage system. The second case study assesses the vulnerability of forested areas to landslides. The subject of the first multiple-criteria evaluation (MCE) was a 4 km² logging area, which had been recently site-prepared for a Pinus plantation. The criteria considered relevant to the assessment were proximity to creeks, slope, soil depth to the restrictive layer (i.e. potential depth to a perched water table) and soil erodibility (based on clay content). The output of the MCE was in accordance with field observations, showing that this approach has the potential to provide management support by highlighting areas vulnerable to waterlogging, which in turn can trigger overland flow and export of pollutants to the local stream network. The subject of the second evaluation was an Araucaria plantation which is prone to landslips during heavy rain. The parameters included in the assessment were the drainage system, the slope of the terrain and geological features such as rocks and structures. A good correlation between the MCE results and field observations was found, suggesting that this GIS approach is useful for the assessment of natural hazards. Multiple-criteria evaluations are highly flexible as they can be designed in either vector or raster format, depending on the type of available data. Although tested on specific areas, the MCEs presented here can be easily used elsewhere and assist both management intervention and the protection of the adjacent environment by assessing the vulnerability of the forest landscape to either introduced chemicals or natural hazards.
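To make the overlay step concrete, the following is a minimal sketch of a raster-based weighted MCE of the kind described for the first case study. The criterion layers, their rescaling to a common vulnerability scale, the weights and the risk threshold are all hypothetical; only the choice of criteria (proximity to creeks, slope, soil depth to the restrictive layer, clay-based erodibility) follows the abstract.

```python
# Minimal raster-based multiple-criteria evaluation sketch. All layer values,
# rescaling directions and weights are hypothetical illustrations.
import numpy as np

rng = np.random.default_rng(0)
shape = (100, 100)                            # toy raster grid

dist_to_creek = rng.uniform(0, 500, shape)    # metres to nearest creek
slope = rng.uniform(0, 30, shape)             # degrees
soil_depth = rng.uniform(0.2, 2.0, shape)     # m to restrictive layer
clay_content = rng.uniform(5, 60, shape)      # % clay

def rescale(x, worst, best):
    # Linear rescaling to 0 (low vulnerability) .. 1 (high vulnerability).
    return np.clip((x - best) / (worst - best), 0, 1)

criteria = [
    rescale(dist_to_creek, worst=0, best=500),   # closer to creek = more vulnerable
    rescale(slope, worst=30, best=0),            # steeper = more vulnerable
    rescale(soil_depth, worst=0.2, best=2.0),    # shallower = more vulnerable
    rescale(clay_content, worst=60, best=5),     # erodibility ranking from clay is assumed
]
weights = np.array([0.35, 0.25, 0.20, 0.20])     # hypothetical weights, sum to 1

vulnerability = sum(w * c for w, c in zip(weights, criteria))
high_risk = vulnerability > 0.7                  # arbitrary mapping threshold
print("fraction of area flagged high risk:", high_risk.mean())
```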
Abstract:
Phosphine is the primary fumigant used to protect the majority of the world's grain and a variety of other stored commodities from insect pests. Phosphine is playing an increasingly important role in the protection of commodities for two primary reasons. Firstly, use of the alternative fumigant, methyl bromide, has been sharply curtailed and is tightly regulated due to its role in ozone depletion, and secondly, consumers are becoming increasingly intolerant of contact pesticides. Niche alternatives to phosphine exist, but they suffer from a range of factors that limit their use, including: 1) limited commercial adoption due to expense or slow mode of action; 2) poor efficacy due to low toxicity, rapid sorption, limited volatility or high density; 3) public health concerns due to toxicity to handlers or nearby residents, as well as risk of explosion; 4) poor consumer acceptance due to toxic residues or smell. These same factors limit the prospects of quickly identifying and deploying a new fumigant. Given that resistance toward phosphine is increasing among insect pests, improved monitoring and management of resistance is a priority. Knowledge of the mode of action of phosphine as well as the mechanisms of resistance may also greatly reduce the effort and expense of identifying synergists or novel replacement compounds.
Abstract:
Background: Plotless density estimators are those that are based on distance measures rather than counts per unit area (quadrats or plots) to estimate the density of some usually stationary event, e.g. burrow openings, damage to plant stems, etc. These estimators typically use distance measures between events and from random points to events to derive an estimate of density. The error and bias of these estimators for the various spatial patterns found in nature have been examined using simulated populations only. In this study we investigated eight plotless density estimators to determine which were robust across a wide range of data sets from fully mapped field sites. They covered a wide range of situations including animal damage to rice and corn, nest locations, active rodent burrows and distribution of plants. Monte Carlo simulations were applied to sample the data sets, and in all cases the error of the estimate (measured as relative root mean square error) was reduced with increasing sample size. The method of calculation and ease of use in the field were also used to judge the usefulness of the estimator. Estimators were evaluated in their original published forms, although the variable area transect (VAT) and ordered distance methods have been the subjects of optimization studies. Results: An estimator that was a compound of three basic distance estimators was found to be robust across all spatial patterns for sample sizes of 25 or greater. The same field methodology can be used either with the basic distance formula or the formula used with the Kendall-Moran estimator, in which case a reduction in error may be gained for sample sizes less than 25; however, there is no improvement for larger sample sizes. The variable area transect (VAT) method performed moderately well, is easy to use in the field, and its calculations are easy to undertake. Conclusion: Plotless density estimators can provide an estimate of density in situations where it would not be practical to lay out a plot or quadrat and can in many cases reduce the workload in the field.
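For orientation, the sketch below implements one of the simpler ordered-distance estimators (density from point-to-nearest-event distances) and measures its relative root mean square error by Monte Carlo sampling of a randomly generated pattern. The estimator form, the simulated pattern and the sample sizes are illustrative; the study itself evaluated eight published estimators against fully mapped field data.

```python
# Sketch of a simple ordered-distance (point-to-nearest-event) density estimator
# and its relative root mean square error under Monte Carlo sampling of a random
# pattern. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(2)
true_density = 0.05                # events per unit area
area_side = 200.0                  # side length of a square study area
events = rng.uniform(0, area_side, size=(int(true_density * area_side**2), 2))

def ordered_distance_estimate(events, n_points):
    # Density from distances between n random sample points and their nearest event:
    # D_hat = (n - 1) / (pi * sum(r_i**2)), unbiased for completely random patterns.
    pts = rng.uniform(0, area_side, size=(n_points, 2))
    d2 = ((pts[:, None, :] - events[None, :, :]) ** 2).sum(axis=2)
    r2 = d2.min(axis=1)
    return (n_points - 1) / (np.pi * r2.sum())

def rrmse(n_points, reps=1000):
    est = np.array([ordered_distance_estimate(events, n_points) for _ in range(reps)])
    return np.sqrt(np.mean((est - true_density) ** 2)) / true_density

for n in (10, 25, 50):
    print(f"sample size {n}: relative RMSE = {rrmse(n):.2f}")
# As in the study, the error of the estimate declines as sample size increases.
```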
Abstract:
Maize (Zea mays L.) is a chill-susceptible crop cultivated in northern latitude environments. The detrimental effects of cold on growth and photosynthetic activity have long been established. However, a general overview of how important these processes are with respect to the reduction of productivity reported in the field is still lacking. In this study, a model-assisted approach was used to dissect variations in productivity under suboptimal temperatures and quantify the relative contributions of light interception (PARc) and radiation use efficiency (RUE) from emergence to flowering. A combination of architectural and light transfer models was used to calculate light interception in three field experiments with two cold-tolerant lines and at two sowing dates. Model assessment confirmed that the approach was suitable to infer light interception. Biomass production was strongly affected by early sowings. RUE was identified as the main cause of biomass reduction during cold events. Furthermore, PARc explained most of the variability observed at flowering, its relative contribution being more or less important according to the climate experienced. Cold temperatures resulted in lower PARc, mainly because final leaf length and width were significantly reduced for all leaves emerging after the first cold occurrence. These results confirm that virtual plants can be useful as fine phenotyping tools. A scheme of action of cold on leaf expansion, light interception and radiation use efficiency is discussed with a view towards helping breeders define relevant selection criteria. This paper originates from a presentation at the 5th International Workshop on Functional–Structural Plant Models, Napier, New Zealand, November 2007.
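The dissection relies on the standard relationship that accumulated biomass is approximately RUE multiplied by accumulated intercepted PAR (PARc). The toy sketch below, with entirely hypothetical daily values, shows how RUE can be recovered as the slope of biomass against PARc; it is only a schematic of the framework, not the architectural and light-transfer models used in the study.

```python
# Toy sketch of the biomass = RUE x intercepted PAR framework. Daily values and
# the assumed RUE are hypothetical.
import numpy as np

par_incident = np.full(60, 10.0)                          # MJ PAR m-2 d-1 over 60 days
fraction_intercepted = np.linspace(0.1, 0.8, 60)          # canopy closure over time
par_c = np.cumsum(par_incident * fraction_intercepted)    # cumulative intercepted PAR

rue_true = 3.0                                            # g biomass MJ-1 PAR (assumed)
biomass = rue_true * par_c + np.random.default_rng(3).normal(0, 20, 60)

# RUE recovered as the slope of the linear regression of biomass on PARc.
rue_est, intercept = np.polyfit(par_c, biomass, 1)
print(f"estimated RUE = {rue_est:.2f} g MJ-1")
# In the study, cold events reduced biomass mainly through lower RUE, while
# differences in PARc (via reduced leaf expansion) dominated the variation at flowering.
```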
Abstract:
This report describes the development and simulation of a variable rate controller for a 6-degree-of-freedom nonlinear model. The variable rate simulation model represents an off-the-shelf autopilot. Flight experiments involve risk and can be expensive; a dynamic model is therefore important for understanding the performance characteristics of the UAS in mission simulation before actual flight tests, and for obtaining parameters needed for flight. Control and guidance are implemented in Simulink. The report tests the use of the model for air search and air sampling path planning. A GUI is presented in which two experts (a mission expert, i.e. air sampling or air search, and a UAV expert) interact over a set of mission scenarios, demonstrating the benefits of the method.
Abstract:
APSIM-ORYZA is a new functionality developed in the APSIM framework to simulate rice production while addressing management issues such as fertilisation and transplanting, which are particularly important in Korean agriculture. To validate the model for Korean rice varieties and field conditions, the measured yields and flowering times from three field experiments conducted by the Gyeonggi Agricultural Research and Extension Services (GARES) in Korea were compared against the simulated outputs for different management practices and rice varieties. Simulated yields of early-, mid- and mid-to-late-maturing varieties of rice grown in a continuous rice cropping system from 1997 to 2004 showed close agreement with the measured data. Similar results were also found for yields simulated under seven levels of nitrogen application. When different transplanting times were modelled, simulated flowering times ranged from within 3 days of the measured values for the early-maturing varieties, to up to 9 days after the measured dates for the mid- and especially mid-to-late-maturing varieties. This was associated with highly variable simulated yields which correlated poorly with the measured data. This suggests the need to accurately calibrate the photoperiod sensitivity parameters of the model for the photoperiod-sensitive rice varieties in Korea.
Abstract:
Cultivation and cropping of soils result in a decline in soil organic carbon and soil nitrogen, and can lead to reduced crop yields. The CENTURY model was used to simulate the effects of continuous cultivation and cereal cropping on total soil organic matter (C and N), carbon pools, nitrogen mineralisation, and crop yield at 6 locations in southern Queensland. The model was calibrated for each replicate from the original datasets, allowing comparisons for each replicate rather than site averages. The CENTURY model was able to satisfactorily predict the impact of long-term cultivation and cereal cropping on total organic carbon, but was less successful in simulating the different fractions and nitrogen mineralisation. The model initially over-predicted the pre-cropping soil carbon and nitrogen concentrations of the sites. To account for the unique shrinking and swelling characteristics of the Vertosol soils, the default annual decomposition rates of the slow and passive carbon pools were doubled, after which the model accurately predicted initial conditions. The ability of the model to predict carbon pool fractions varied, demonstrating the difficulty inherent in predicting the size of these conceptual pools. The strength of the model lies in its ability to closely predict the starting soil organic matter conditions and the impact of clearing, cultivation, fertiliser application, and continuous cropping on total soil carbon and nitrogen.
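The adjustment described, doubling the decomposition rate constants of the slow and passive pools, can be pictured with a bare first-order decay sketch. The pool sizes and rate constants below are hypothetical, and the sketch ignores transfers between pools, plant inputs and CENTURY's climate and texture modifiers; it only shows the direction of the effect.

```python
# Illustrative first-order decomposition sketch: effect of doubling the annual
# decomposition rate constants of the slow and passive pools. Pool sizes and
# default rates are hypothetical, not the CENTURY defaults.
import numpy as np

pools = {"active": 2.0, "slow": 30.0, "passive": 18.0}         # t C/ha (assumed)
k_default = {"active": 7.3, "slow": 0.2, "passive": 0.0045}    # 1/yr (assumed)

def decay(pools, k, years):
    # C(t) = C0 * exp(-k * t) for each pool, ignoring transfers between pools.
    return {p: c * np.exp(-k[p] * years) for p, c in pools.items()}

k_doubled = dict(k_default, slow=2 * k_default["slow"], passive=2 * k_default["passive"])

after_default = decay(pools, k_default, years=50)
after_doubled = decay(pools, k_doubled, years=50)
print("total C after 50 yr, default rates:", round(sum(after_default.values()), 1))
print("total C after 50 yr, doubled rates:", round(sum(after_doubled.values()), 1))
```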
Abstract:
Soils with high levels of chloride and/or sodium in their subsurface layers are often referred to as having subsoil constraints (SSCs). There is growing evidence that SSCs affect wheat yields by increasing the lower limit of a crop's available soil water (CLL) and thus reducing the soil's plant-available water capacity (PAWC). This proposal was tested by simulation of 33 farmers' paddocks in south-western Queensland and north-western New South Wales. The simulated results accounted for 79% of observed variation in grain yield, with a root mean squared deviation (RMSD) of 0.50 t/ha. This result was as close as any achieved from sites without SSCs, thus providing strong support for the proposed mechanism that SSCs affect wheat yields by increasing the CLL and thus reducing the soil's PAWC. In order to reduce the need to measure CLL of every paddock or management zone, two additional approaches to simulating the effects of SSCs were tested. In the first approach the CLL of soils was predicted from the 0.3-0.5 m soil layer, which was taken as the reference CLL of a soil regardless of its level of SSCs, while the CLL values of soil layers below 0.5 m depth were calculated as a function of these soils' 0.3-0.5 m CLL values as well as of soil depth plus one of the SSC indices EC, Cl, ESP, or Na. The best estimates of subsoil CLL values were obtained when the effects of SSCs were described by an ESP-dependent function. In the second approach, depth-dependent CLL values were also derived from the CLL values of the 0.3-0.5 m soil layer. However, instead of using SSC indices to further modify CLL, the default values of the water-extraction coefficient (kl) of each depth layer were modified as a function of the SSC indices. The strength of this approach was evaluated on the basis of correlation of observed and simulated grain yields. In this approach the best estimates were obtained when the default kl values were multiplied by a Cl-determined function. The kl approach was also evaluated with respect to simulated soil moisture at anthesis and at grain maturity. Results using this approach were highly correlated with soil moisture results obtained from simulations based on the measured CLL values. This research provides strong evidence that the effects of SSCs on wheat yields are accounted for by the effects of these constraints on wheat CLL values. The study also produced two satisfactory methods for simulating the effects of SSCs on CLL and on grain yield. While Cl and ESP proved to be effective indices of SSCs, EC was not effective due to the confounding effect of the presence of gypsum in some of these soils. This study provides the tools necessary for investigating the effects of SSCs on wheat crop yields and natural resource management (NRM) issues such as runoff, recharge, and nutrient loss through simulation studies. It also facilitates investigation of suggested agronomic adaptations to SSCs.
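The mechanism argued for, namely that SSCs raise the crop lower limit (CLL) and thereby reduce plant-available water capacity (PAWC), rests on the usual layer-wise calculation PAWC = sum over layers of (DUL - CLL) x layer thickness. The sketch below illustrates this with hypothetical layer values and an invented ESP-dependent adjustment of the subsoil CLL; it is not the fitted function from the paper.

```python
# Sketch of the PAWC calculation underlying the argument. Layer values and the
# ESP adjustment are hypothetical illustrations, not the functions fitted in the paper.
import numpy as np

thickness = np.array([0.3, 0.2, 0.3, 0.3, 0.4])          # m, soil layers to 1.5 m
dul       = np.array([0.40, 0.42, 0.43, 0.43, 0.42])     # drained upper limit (v/v)
cll_ref   = np.array([0.20, 0.22, 0.24, 0.24, 0.25])     # CLL without constraints
esp       = np.array([2, 4, 10, 20, 30])                 # exchangeable sodium %

def cll_adjusted(cll_ref, esp, depth_index):
    # Hypothetical form: below 0.5 m, CLL rises with ESP towards the DUL.
    penalty = np.where(depth_index >= 2, 0.004 * esp, 0.0)
    return np.minimum(cll_ref + penalty, dul - 0.01)

cll_ssc = cll_adjusted(cll_ref, esp, np.arange(len(thickness)))
pawc_ref = np.sum((dul - cll_ref) * thickness) * 1000    # mm of available water
pawc_ssc = np.sum((dul - cll_ssc) * thickness) * 1000
print(f"PAWC without SSC adjustment: {pawc_ref:.0f} mm")
print(f"PAWC with ESP-adjusted CLL : {pawc_ssc:.0f} mm")
```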
Abstract:
This research deals with the development of a Solar-Powered UAV designed for remote sensing, in particular the development of the autopilot sub-system and path planning. The design of the Solar-Powered UAS followed a systems engineering methodology, first defining the system architecture and then selecting each subsystem. Validation tests and integration of the autopilot are performed in order to evaluate the performance of each subsystem and to obtain a global operational system for data collection missions. Flight test planning and simulation results are also explored in order to verify the mission capabilities when using an autopilot on a UAS. An important aspect of this research is developing a Solar-Powered UAS for data collection and video monitoring, especially of data and images from the ground, which are transmitted to the GS (Ground Station), segmented, and afterwards analyzed with Matlab code.
Abstract:
This report describes a methodology for the design and coupling of a proton exchange membrane (PEM) fuel cell to an Unmanned Aerial Vehicle (UAV). The report summarizes existing work in the field, describes the type of UAV and the mission requirements, the design of the fuel cell system and the simulation environment, and compares endurance and range with those achieved when the aircraft is fitted with a conventional internal combustion engine (ICE).
Abstract:
Several intelligent transportation systems (ITS) were used with an advanced driving simulator to assess their influence on driving behavior. Three types of ITS interventions were tested: in-vehicle video, in-vehicle audio, and an on-road flashing marker. The results from the driving simulator were inputs for a developed model that used traffic microsimulation (VISSIM 5.4) to assess the safety interventions. Using the driving simulator, 58 participants were required to drive through active and passive crossings with and without an ITS device and in the presence or absence of an approaching train. The effect of changes in driver speed and compliance rate was greater at passive crossings than at active crossings. The slight difference in speed of drivers approaching ITS devices indicated that ITS helped drivers encounter crossings in a safer way. Since the traffic simulation was not able to replicate a dynamic speed change or a probability of stopping that varied depending on the ITS safety devices, some modifications were made to the traffic simulation. The results showed that exposure to ITS devices at active crossings did not significantly influence drivers' behavior according to traffic performance indicators such as delay time, number of stops, speed, and stopped delay. However, the results of the traffic simulation for passive crossings, where low traffic volumes and low train headway normally occur, showed that ITS devices improved overall traffic performance.
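A hypothetical illustration of the kind of modification described is sketched below: instead of a fixed behaviour, each simulated driver approaching a crossing draws a stopping decision and an approach speed from distributions that depend on the ITS device present. The probabilities and speed distributions are invented for the sketch and are not the calibrated values from the driving-simulator study, nor VISSIM itself.

```python
# Hypothetical sketch: device-dependent stopping probability and approach speed,
# sampled per driver. All values are invented for illustration.
import numpy as np

rng = np.random.default_rng(4)

# (P(stop), mean approach speed km/h, sd) per treatment at a passive crossing
treatments = {
    "no_device":        (0.55, 62, 8),
    "in_vehicle_video": (0.80, 54, 7),
    "in_vehicle_audio": (0.78, 55, 7),
    "roadside_flasher": (0.85, 50, 6),
}

def simulate_drivers(p_stop, mean_speed, sd_speed, n=1000):
    stops = rng.random(n) < p_stop
    speeds = rng.normal(mean_speed, sd_speed, n)
    return stops.mean(), speeds.mean()

for name, (p, m, s) in treatments.items():
    compliance, speed = simulate_drivers(p, m, s)
    print(f"{name:18s} compliance={compliance:.2f} mean approach speed={speed:.1f} km/h")
```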
Abstract:
Self-contained Non-Equilibrium Molecular Dynamics (NEMD) simulations using Lennard-Jones potentials were performed to identify the origin and mechanisms of atomic-scale interfacial behavior between sliding metals. The mixing sequence and velocity profiles were compared via MD simulations for three cases, viz. self-mated, similar and hard-soft crystal pairs. The results showed shear instability, atomic-scale mixing, and the generation of eddies at the sliding interface. Vorticity at the interface suggests that atomic flow during sliding is similar to fluid flow under Kelvin-Helmholtz instability, and this is supported by velocity profiles from the simulations. The initial step-function velocity profile spreads during sliding; however, the velocity profile does not change much at later stages of the simulation and eventually stops spreading. The steady-state friction coefficient during simulation was monitored as a function of sliding velocity. Frictional behavior can be explained on the basis of plastic deformation and adiabatic effects. The mixing layer growth kinetics was also investigated.
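For reference, the pair interaction underlying these simulations is the Lennard-Jones potential V(r) = 4*eps*[(sigma/r)^12 - (sigma/r)^6]. The short sketch below evaluates the potential and its analytic force in reduced units (eps = sigma = 1); this parameter choice is the usual reduced-unit convention rather than values fitted to the metal pairs studied.

```python
# Lennard-Jones pair potential and force in reduced units (eps = sigma = 1).
import numpy as np

def lj_potential(r, epsilon=1.0, sigma=1.0):
    # V(r) = 4*eps*[(sigma/r)^12 - (sigma/r)^6]
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 ** 2 - sr6)

def lj_force(r, epsilon=1.0, sigma=1.0):
    # F(r) = -dV/dr = 24*eps/r * [2*(sigma/r)^12 - (sigma/r)^6]; positive = repulsive
    sr6 = (sigma / r) ** 6
    return 24.0 * epsilon / r * (2.0 * sr6 ** 2 - sr6)

r = np.linspace(0.9, 3.0, 5)
print(np.round(lj_potential(r), 3))
print(np.round(lj_force(r), 3))
# The potential minimum sits at r = 2**(1/6) * sigma, where the force is zero.
```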
Abstract:
A simulation model that combines biological, search and economic components is applied to the eradication of a Miconia calvescens infestation at El Arish in tropical Queensland, Australia. Information on the year M. calvescens was introduced to the site, the number of plants controlled, and the timing of control is used to show that there could currently be M. calvescens plants remaining undetected at the site, including some mature plants. Modelling results indicate that the eradication programme has had a significant impact on the population of M. calvescens, as shown by simulated results for uncontrolled and controlled populations. The model was also used to investigate the effect of changing search effort on the cost of and time to eradication. Control costs were found to be negligible over all levels of search effort tested. Importantly, the results suggest that eradication may be achieved within several decades if resources are increased slightly from their current levels and there is a long-term commitment to funding the eradication programme.
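To show how the three components named (biological, search and economic) can interact, the toy sketch below couples a plant population with delayed maturity, a detection probability that rises with annual search effort, and a running cost tally. Every rate, cost and probability in it is invented for illustration; it does not reproduce the El Arish model or its results.

```python
# Toy eradication sketch: biology (delayed maturity, reproduction), search
# (effort-dependent detection) and economics (cost tally). All values invented.
import numpy as np

rng = np.random.default_rng(5)

def run_programme(search_effort_ha, detect_per_ha=0.02, years_max=100,
                  seeds_per_mature=50, maturity_age=4, cost_per_ha=30.0):
    ages = list(rng.integers(0, maturity_age + 1, 200))   # initial infestation
    cost = 0.0
    for year in range(1, years_max + 1):
        # reproduction by mature plants that have so far escaped detection
        mature = sum(a >= maturity_age for a in ages)
        ages += [0] * rng.poisson(0.1 * seeds_per_mature * mature)
        # search: each plant detected (and removed) with an effort-dependent probability
        p_detect = 1 - np.exp(-detect_per_ha * search_effort_ha)
        ages = [a + 1 for a in ages if rng.random() > p_detect]
        cost += cost_per_ha * search_effort_ha
        if not ages:
            return year, cost
    return None, cost

for effort in (40, 60, 80):
    years, cost = run_programme(effort)
    outcome = f"eradicated in {years} yr" if years else "not eradicated in 100 yr"
    print(f"search effort {effort} ha/yr: {outcome}, cost ${cost:,.0f}")
```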