935 results for Simulation tool
Abstract:
Considering ultrasound propagation through complex composite media as an array of parallel sonic rays, a comparison of computer simulated prediction with experimental data has previously been reported for transmission mode (where one transducer serves as transmitter, the other as receiver) in a series of ten acrylic step-wedge samples, immersed in water, exhibiting varying degrees of transit time inhomogeneity. In this study, the same samples were used but in pulse-echo mode, where the same ultrasound transducer served as both transmitter and receiver, detecting both ‘primary’ (internal sample interface) and ‘secondary’ (external sample interface) echoes. A transit time spectrum (TTS) was derived, describing the proportion of sonic rays with a particular transit time. A computer simulation was performed to predict the transit time and amplitude of various echoes created, and compared with experimental data. Applying an amplitude-tolerance analysis, 91.7±3.7% of the simulated data was within ±1 standard deviation (STD) of the experimentally measured amplitude-time data. Correlation of predicted and experimental transit time spectra provided coefficients of determination (R2) ranging from 100.0% to 96.8% for the various samples tested. The results acquired from this study provide good evidence for the concept of parallel sonic rays. Further, deconvolution of experimental input and output signals has been shown to provide an effective method to identify echoes otherwise lost due to phase cancellation. Potential applications of pulse-echo ultrasound transit time spectroscopy (PE-UTTS) include improvement of ultrasound image fidelity by improving spatial resolution and reducing phase interference artefacts.
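The deconvolution step described above can be sketched as follows. This is a minimal illustration of regularised (Wiener-style) frequency-domain deconvolution, not the authors' implementation; the pulse shape, echo delays and amplitudes are all invented for the example.

```python
import numpy as np

def deconvolve_tts(pulse, received, eps=1e-3):
    """Recover a transit-time profile by regularised frequency-domain
    deconvolution of the received signal by the transmitted pulse."""
    n = len(received)
    X = np.fft.rfft(pulse, n)
    Y = np.fft.rfft(received, n)
    # A small regularisation term avoids dividing by near-zero spectral bins.
    H = Y * np.conj(X) / (np.abs(X) ** 2 + eps * np.max(np.abs(X)) ** 2)
    return np.fft.irfft(H, n)

# Toy example: the received trace is two delayed, attenuated copies of the
# pulse, standing in for a 'primary' and a 'secondary' echo.
pulse = np.array([1.0, -0.6, 0.2])
received = np.zeros(256)
received[40:43] += pulse           # echo at sample 40, relative amplitude 1.0
received[100:103] += 0.5 * pulse   # echo at sample 100, relative amplitude 0.5
tts = deconvolve_tts(pulse, received)
```

The peaks of `tts` then mark the transit times of the two echoes, including ones that overlapping, phase-cancelling wavefronts would hide in the raw trace.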
Abstract:
Three simulations of evapotranspiration were done with two values of time step, viz. 10 min and one day. Inputs to the model were weather data, including directly measured upward and downward radiation, and soil characteristics. Three soils were used for each simulation. Analysis of the results shows that the time step has a direct influence on the prediction of potential evapotranspiration, but a complex interaction of this effect with the soil moisture characteristic, the rate of increase of ground cover and bare-soil evaporation determines the actual transpiration predicted. The results indicate that as small a time step as possible should be used in the simulation.
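The sensitivity to time step reported above can be illustrated with a toy model: a soil-water store drying at a rate proportional to its content, integrated with forward Euler at the two step sizes. The model, its rate constant and the five-day horizon are stand-ins chosen only to show how a coarse step distorts the result, not the equations of the actual simulation.

```python
import math

k = 0.5                            # drying rate (1/day), illustrative
exact = 100.0 * math.exp(-k * 5)   # analytic store after 5 days (mm)

results = {}
for label, dt in [("10 min", 10 / 1440), ("one day", 1.0)]:
    s = 100.0                      # initial store (mm)
    for _ in range(round(5.0 / dt)):
        s -= k * s * dt            # forward Euler step
    results[label] = s

# The 10 min step tracks the analytic solution closely; the one-day step
# badly under-predicts the remaining store (0.5**5 * 100 = 3.125 mm).
```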
Abstract:
The ability to deliver a drug to the patient in a safe, efficacious and cost-effective manner depends largely on the physicochemical properties of the active pharmaceutical ingredient (API) in the solid state. In this context, crystallization is of critical importance in the pharmaceutical industry, as it defines the physical and powder properties of crystalline APIs. An improved knowledge of the various aspects of the crystallization process is therefore needed. The overall goal of this thesis was to gain a better understanding of the relationships between crystallization, solid-state form and properties of pharmaceutical solids, with a focus on a crystal engineering approach to designing technological properties of APIs. Specifically, the solid-state properties of the crystalline forms of the model APIs, erythromycin A and baclofen, and the influence of solvent on their crystallization behavior were investigated. In addition, the physical phenomena associated with wet granulation and hot-melt processing of the model APIs were examined at the molecular level. Finally, the effect of crystal habit modification of a model API on its tabletting properties was evaluated. The thesis clarified the relationships between the crystalline forms of the model APIs, which is of practical importance for solid-state control during processing and storage. Moreover, a new crystalline form, baclofen monohydrate, was discovered and characterized. Upon polymorph screening, erythromycin A demonstrated a high solvate-forming propensity, emphasizing the need for careful control of solvent effects during formulation. The solvent compositions that yield the desirable crystalline form of erythromycin A were defined. Furthermore, new examples of solvent-mediated phase transformations taking place during wet granulation of baclofen and hot-melt processing of erythromycin A dihydrate with PEG 6000 are reported.
Since solvent-mediated phase transformations involve the crystallization of a stable phase, and hence affect the dissolution kinetics and possibly the absorption of the API, these transformations must be well documented. Finally, a controlled-crystallization method utilizing HPMC as a crystal habit modifier was developed for erythromycin A dihydrate. The crystals with modified habit were shown to possess improved compaction properties compared with those of unmodified crystals. This result supports the idea of morphological crystal engineering as a tool for designing technological properties of APIs and is of utmost practical interest.
Abstract:
The literature contains many examples of digital procedures for the analytical treatment of electroencephalograms, but there is as yet no standard by which those techniques may be judged or compared. This paper proposes one method of generating an EEG, based on a computer program for Zetterberg's simulation. It is assumed that the statistical properties of an EEG may be represented by stationary processes having rational transfer functions, achieved by a system of software filters and random number generators. The model represents neither the neurological mechanism responsible for generating the EEG nor any particular type of EEG record; transient phenomena such as spikes, sharp waves and alpha bursts are also excluded. The basis of the program is a valid ‘partial’ statistical description of the EEG; that description is then used to produce a digital representation of a signal which, if plotted sequentially, might or might not by chance resemble an EEG; that is unimportant. What is important is that the statistical properties of the series remain those of a real EEG; it is in this sense that the output is a simulation of the EEG. There is considerable flexibility in the form of the output, i.e. its alpha, beta and delta content, which may be selected by the user, the same selected parameters always producing the same statistical output. The filtered outputs from the random number sequences may be scaled to provide realistic power distributions in the accepted EEG frequency bands and then summed to create a digital output signal, the ‘stationary EEG’. It is suggested that the simulator might act as a test input to digital analytical techniques for the EEG, enabling at least a substantial part of those techniques to be compared and assessed in an objective manner. The equations necessary to implement the model are given.
The program has been run on a DEC1090 computer but is suitable for any microcomputer having more than 32 kBytes of memory; the execution time required to generate a 25 s simulated EEG is in the region of 15 s.
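The core idea of such a simulator, i.e. noise components band-limited to the accepted EEG bands, scaled to user-selected powers and summed, can be sketched in a few lines. Brick-wall FFT filtering stands in here for the rational-transfer-function (AR) filters of Zetterberg's model, and the band edges, relative powers and sampling rate are illustrative choices, not values from the paper.

```python
import numpy as np

def band_noise(n, fs, f_lo, f_hi, power, rng):
    """Gaussian noise band-limited to [f_lo, f_hi] Hz, scaled so that its
    mean-square value equals `power`."""
    spec = np.fft.rfft(rng.standard_normal(n))
    freqs = np.fft.rfftfreq(n, 1.0 / fs)
    spec[(freqs < f_lo) | (freqs > f_hi)] = 0.0   # brick-wall band filter
    x = np.fft.irfft(spec, n)
    return x * np.sqrt(power / np.mean(x ** 2))

fs, seconds = 128, 25                  # a 25 s record, as in the abstract
rng = np.random.default_rng(0)         # fixed seed: same parameters, same output
n = fs * seconds
eeg = (band_noise(n, fs, 0.5, 4.0, 1.0, rng)      # 'delta' component
       + band_noise(n, fs, 8.0, 13.0, 2.0, rng)   # 'alpha' component
       + band_noise(n, fs, 14.0, 30.0, 0.5, rng)) # 'beta' component
```

Because the bands are disjoint, the components are orthogonal and the total power of the summed 'stationary EEG' is simply the sum of the selected band powers.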
Abstract:
Efficiency of analysis using generalized estimating equations is enhanced when the intracluster correlation structure is accurately modeled. We compare two existing criteria (a quasi-likelihood information criterion, and the Rotnitzky-Jewell criterion) for identifying the true correlation structure via simulations with Gaussian or binomial response, covariates varying at cluster or observation level, and exchangeable or AR(1) intracluster correlation structure. Rotnitzky and Jewell's approach performs better when the true intracluster correlation structure is exchangeable, while the quasi-likelihood criterion performs better for an AR(1) structure.
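The two candidate working-correlation structures compared above are easy to write down explicitly; the sketch below builds each matrix and uses one to simulate a correlated Gaussian cluster. Cluster size and parameter values are illustrative.

```python
import numpy as np

def exchangeable(m, alpha):
    """Exchangeable working correlation: every pair of observations in a
    cluster shares the same correlation alpha."""
    return np.full((m, m), alpha) + (1.0 - alpha) * np.eye(m)

def ar1(m, alpha):
    """AR(1) working correlation: corr(y_j, y_k) = alpha ** |j - k|."""
    idx = np.arange(m)
    return alpha ** np.abs(idx[:, None] - idx[None, :])

# Simulating one Gaussian cluster of size 4 under an AR(1) structure:
rng = np.random.default_rng(0)
R = ar1(4, 0.6)
y = np.linalg.cholesky(R) @ rng.standard_normal(4)
```

Under exchangeability the correlation between two observations does not decay with their separation, whereas under AR(1) it falls off geometrically, which is exactly the distinction the selection criteria must detect.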
Abstract:
The electric field in certain electrostatic devices can be modeled by a grounded plate electrode affected by a corona discharge generated by a series of parallel wires connected to a DC high-voltage supply. The system of differential equations that describes the behaviour (i.e., charging and motion) of a conductive particle in such an electric field has been numerically solved, using several simplifying assumptions. Thus, it was possible to investigate the effect of various electrical and mechanical factors on the trajectories of conductive particles. This model has been employed to study the behaviour of coal particles in fly-ash corona separators.
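The kind of equations of motion solved numerically in such a study can be sketched as below. To keep the example self-contained, a uniform field stands in for the wire-plate corona field, the particle charge is held constant rather than evolving with the discharge, and every parameter value is illustrative.

```python
import numpy as np

q, m = 1e-12, 1e-9            # particle charge (C) and mass (kg), illustrative
E = np.array([0.0, 5e5])      # uniform electric field (V/m)
g = np.array([0.0, -9.81])    # gravity (m/s^2)
b = 2e-8                      # Stokes drag coefficient (kg/s)
dt, steps = 1e-4, 1000        # 0.1 s of motion

x = np.zeros(2)               # position (m)
v = np.zeros(2)               # velocity (m/s)
for _ in range(steps):
    a = (q * E + m * g - b * v) / m   # Coulomb force + gravity + viscous drag
    v = v + a * dt                    # forward Euler integration
    x = x + v * dt

# The particle approaches its terminal velocity (q*E + m*g)/b,
# about 24.5 m/s upward for these values.
```

Replacing the uniform `E` with an interpolated wire-plate field map, and updating `q` with a field-dependent charging law, turns this skeleton into the kind of trajectory model the abstract describes.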
Abstract:
A recently developed radioimmunoassay (RIA) for measuring insulin-like growth factor I (IGF-I) in a variety of fish species was used to investigate the correlation between growth rate and circulating IGF-I concentrations in barramundi (Lates calcarifer), Atlantic salmon (Salmo salar) and southern bluefin tuna (Thunnus maccoyii). Plasma IGF-I concentration significantly increased with increasing ration size in barramundi, and IGF-I concentration was positively correlated with the growth rates obtained in Atlantic salmon (r2=0.67) and barramundi (r2=0.65) when fed a variety of diet formulations. IGF-I was also positively correlated with protein concentration (r2=0.59). This evidence suggests that measuring IGF-I concentration may provide a useful tool for monitoring fish growth rate and a method to rapidly assess different aquaculture diets. However, no such correlation was demonstrated in the tuna study, probably due to seasonal cooling of the sea surface temperature shortly before blood was sampled. Thus, some recommendations for the design and sampling strategy of nutritional trials in which IGF-I concentrations are measured are discussed.
Abstract:
The bond graph is an apt modelling tool for any system working across multiple energy domains. Power electronics system modelling is usually the study of the interplay of energy in the electrical, mechanical, magnetic and thermal domains. The usefulness of bond graph modelling in the power electronics field has been recognised by researchers, and consequently, over the last couple of decades, there has been a steadily increasing effort to develop simulation tools for bond graph modelling that are specially suited to power electronic study. For modelling rotating magnetic fields in electromagnetic machine models, support for vector variables is essential. Unfortunately, all bond graph simulation tools presently support only scalar variables. We propose an approach that provides complex-variable and vector support to bond graphs, enabling the modelling of polyphase electromagnetic and spatial vector systems. We also introduce a rotary gyrator element and use it along with the switched junction in developing the complex/vector variable toolbox. This approach is implemented as a complex S-function toolbox in Simulink inside a MATLAB environment, a choice made so as to combine the speed of S-functions, the user-friendliness of Simulink and the popularity of MATLAB.
Abstract:
Sheep in western Queensland have been predominantly reared for wool. When wool prices became depressed, interest in the sheep meat industry increased. For north-west Queensland producers, opportunities may exist to participate in live sheep and meat export to Asia. A simulation model was developed to determine whether this sheep producing area has the capability to provide sufficient numbers of sheep under variable climatic conditions while sustaining the land resources. Maximum capacity for sustainability of resources (as described by stock numbers) was derived from an in-depth study of the agricultural and pastoral potential of Queensland. Decades of sheep production and climatic data spanning differing seasonal conditions were collated for analysis. A ruminant biology model adapted from Grazplan was used to simulate pregnancy rate. Empirical equations predict mortalities, marking rates, and weight characteristics of sheep of various ages from simple climatic measures, stocking rate and reproductive status. The initial age structure of flocks was determined by running the model for several years with historical climatic conditions. Drought management strategies, such as progressively selling a proportion of wethers down to two-tooth, as well as the oldest ewes, were incorporated. Management decisions such as time of joining, age at which ewes are cast-for-age, wether turn-off age and turning-off rate of lambs vary with geographical area and can be specified at run time. The model is run for sequences of climatic conditions generated stochastically from distributions based on historical climatic data, correlated in some instances. The model highlights the difficulties of sustaining a consistent supply of sheep under variable climatic conditions.
Abstract:
The widespread and increasing resistance of internal parasites to anthelmintic control is a serious problem for the Australian sheep and wool industry. As part of control programmes, laboratories use the Faecal Egg Count Reduction Test (FECRT) to determine resistance to anthelmintics. It is important to have confidence in the measure of resistance, not only for the producer planning a drenching programme but also for companies investigating the efficacy of their products. The determination of resistance and corresponding confidence limits as given in anthelmintic efficacy guidelines of the Standing Committee on Agriculture (SCA) is based on a number of assumptions. This study evaluated the appropriateness of these assumptions for typical data and compared the effectiveness of the standard FECRT procedure with the effectiveness of alternative procedures. Several sets of historical experimental data from sheep and goats were analysed to determine that a negative binomial distribution was a more appropriate distribution to describe pre-treatment helminth egg counts in faeces than a normal distribution. Simulated egg counts for control animals were generated stochastically from negative binomial distributions and those for treated animals from negative binomial and binomial distributions. Three methods for determining resistance when percent reduction is based on arithmetic means were applied. The first was that advocated in the SCA guidelines, the second similar to the first but basing the variance estimates on negative binomial distributions, and the third using Wadley’s method with the distribution of the response variate assumed negative binomial and a logit link transformation. These were also compared with a fourth method recommended by the International Co-operation on Harmonisation of Technical Requirements for Registration of Veterinary Medicinal Products (VICH) programme, in which percent reduction is based on the geometric means. 
A wide selection of parameters was investigated, with 1000 simulations run for each set. Percent reduction and confidence limits were then calculated for each method, together with the number of times in each set of 1000 simulations that the theoretical percent reduction fell within the estimated confidence limits and the number of times resistance would have been declared. These simulations provide the basis for setting conditions under which the methods could be recommended. The authors show that, given the distribution of helminth egg counts found in Queensland flocks, the method based on arithmetic rather than geometric means should be used, and suggest that resistance be redefined as occurring when the upper confidence limit of percent reduction is less than 95%. At least ten animals per group are required in most circumstances, though even 20 may be insufficient where effectiveness of the product is close to the cut-off point for defining resistance.
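The simulation setup described above, i.e. negative binomial egg counts for both groups and percent reduction computed from arithmetic means with confidence limits, can be sketched as follows. The dispersion, group means, group size and the normal-approximation limits on the log ratio of means are illustrative choices, not the exact procedures compared in the study.

```python
import numpy as np

def nb_counts(rng, mu, k, size):
    """Negative binomial counts with mean mu and dispersion k, mapped to
    numpy's (n, p) parameterisation."""
    return rng.negative_binomial(k, k / (k + mu), size)

def fecrt(control, treated):
    """Percent reduction on arithmetic means, with approximate 95% limits
    from a normal approximation on the log of the ratio of means."""
    cbar, tbar = control.mean(), treated.mean()
    ratio = tbar / cbar
    var_log = (control.var(ddof=1) / (len(control) * cbar ** 2)
               + treated.var(ddof=1) / (len(treated) * tbar ** 2))
    half = 2.0 * np.sqrt(var_log)
    return (100.0 * (1.0 - ratio),                 # point estimate
            100.0 * (1.0 - ratio * np.exp(half)),  # lower limit
            100.0 * (1.0 - ratio * np.exp(-half))) # upper limit

rng = np.random.default_rng(1)
control = nb_counts(rng, 500.0, 1.3, 10)   # untreated group, over-dispersed
treated = nb_counts(rng, 25.0, 1.3, 10)    # a product ~95% effective
reduction, lower, upper = fecrt(control, treated)
```

Under the redefinition the authors suggest, resistance would be flagged whenever `upper` falls below 95%; repeating the draw-and-test loop 1000 times per parameter set reproduces the structure of their simulation experiment.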
Abstract:
By quantifying the effects of climatic variability in the sheep grazing lands of north-western and western Queensland, the key biological rates of mortality and reproduction can be predicted for sheep. These rates are essential components of a decision support package which can prove a useful management tool for producers, especially if they can easily obtain the necessary predictors. When the sub-models of the GRAZPLAN ruminant biology process model were re-parameterised with Queensland data, and an empirical equation predicting the probability of ewes mating was added, the process model predicted the probability of pregnancy well (86% of variation explained). Predicting mortality from GRAZPLAN was less successful, but an empirical equation based on the relative condition of the animal (a measure based on liveweight), pregnancy status and age explained 78% of the variation in mortalities. A crucial predictor in these models was liveweight, which is not often recorded on producer properties. Empirical models based on climatic and pasture conditions estimated from the pasture production model GRASP predicted marking and mortality rates for Mitchell grass (Astrebla sp.) pastures (81% and 63% of the variation explained, respectively). These prediction equations were tested against independent data from producer properties and the model was successfully validated for Mitchell grass communities.
Abstract:
This thesis presents an interdisciplinary analysis of how models and simulations function in the production of scientific knowledge. The work is informed by three scholarly traditions: studies on models and simulations in the philosophy of science, so-called micro-sociological laboratory studies within science and technology studies, and cultural-historical activity theory. Methodologically, I adopt a naturalist epistemology and combine philosophical analysis with a qualitative, empirical case study of infectious-disease modelling. The study maintains a dual perspective throughout the analysis: it specifies the modelling practices and examines the models as objects of research. The research questions addressed are: 1) How are models constructed and what functions do they have in the production of scientific knowledge? 2) What is interdisciplinarity in model construction? 3) How do models become a general research tool and why is this process problematic? The core argument is that mediating models as investigative instruments (cf. Morgan and Morrison 1999) take questions as a starting point, and hence their construction is intentionally guided. This argument applies the interrogative model of inquiry (e.g., Sintonen 2005; Hintikka 1981), which conceives of all knowledge acquisition as a process of seeking answers to questions. The first question addresses simulation models as Artificial Nature, which is manipulated in order to answer the questions that initiated the model building. This account further develops the "epistemology of simulation" (cf. Winsberg 2003) by showing the interrelatedness of researchers and their objects in the process of modelling. The second question clarifies why interdisciplinary research collaboration is demanding and difficult to maintain.
The nature of the impediments to disciplinary interaction is examined by introducing the idea of object-oriented interdisciplinarity, which provides an analytical framework to study the changes in the degree of interdisciplinarity, the tools and research practices developed to support the collaboration, and the mode of collaboration in relation to the historically mutable object of research. As my interest is in models as interdisciplinary objects, the third research problem seeks to answer the question of how we might characterise these objects, what is typical of them, and what kinds of changes happen in the process of modelling. Here I examine the tension between specified, question-oriented models and more general models, and suggest that the specified models form a group of their own. I call these Tailor-made models, in opposition to the process of building a simulation platform that aims at generalisability and utility for health policy. This tension also underlines the challenge of applying research results (or methods and tools) to discuss and solve problems in decision-making processes.
Abstract:
A 5′ Taq nuclease assay utilising minor groove binder technology and targeting the 16S rRNA gene was designed to detect Pasteurella multocida (the causative agent of fowl cholera) in swabs collected from poultry. The assay was first evaluated using pure cultures. The assay correctly identified four P. multocida taxonomic type strains, 18 P. multocida serovar reference strains and 40 Australian field isolates (17 from poultry, 11 from pigs and 12 from cattle). Representatives of nine other Pasteurella species, 26 other bacterial species (18 being members of the family Pasteurellaceae) and four poultry virus isolates did not react in the assay. The assay detected a minimum of approximately 10 cfu of P. multocida per reaction. Of 79 poultry swabs submitted to the laboratory for routine bacteriological culture, 17 were positive in the 5′ Taq nuclease assay, but only 10 were positive by culture. The other 62 swabs were negative for P. multocida by both 5′ Taq nuclease assay and culture. The assay is suitable for use in diagnosing fowl cholera, is more rapid than bacteriological culture, and may also have application in diagnosing P. multocida infections in cattle and pigs.
Abstract:
The development of innovative methods of stock assessment is a priority for State and Commonwealth fisheries agencies. It is driven by the need to facilitate sustainable exploitation of naturally occurring fisheries resources for the current and future economic, social and environmental well-being of Australia. This project was initiated in this context and took advantage of considerable recent achievements in genomics that are shaping our comprehension of the DNA of humans and animals. The basic idea behind this project was that genetic estimates of effective population size, which can be made from empirical measurements of genetic drift, are equivalent to estimates of the number of successful spawners, an important parameter in the process of fisheries stock assessment. The broad objectives of this study were to: (1) critically evaluate a variety of mathematical methods of calculating effective spawner numbers (Ne), by (a) conducting comprehensive computer simulations and (b) analysing empirical data collected from the Moreton Bay population of tiger prawns (P. esculentus); (2) lay the groundwork for the application of the technology in the northern prawn fishery (NPF); and (3) produce software for the calculation of Ne and make it widely available. The project pulled together a range of mathematical models for estimating current effective population size from diverse sources. Some had been recently implemented with the latest statistical methods (e.g. in a Bayesian framework; Berthier, Beaumont et al. 2002), while others had lower profiles (e.g. Pudovkin, Zaykin et al. 1996; Rousset and Raymond 1995). Computer code, and later software with a user-friendly interface (NeEstimator), was produced to implement the methods. This was used as a basis for simulation experiments to evaluate the performance of the methods with an individual-based model of a prawn population.
Following the guidelines suggested by computer simulations, the tiger prawn population in Moreton Bay (south-east Queensland) was sampled for genetic analysis with eight microsatellite loci in three successive spring spawning seasons in 2001, 2002 and 2003. As predicted by the simulations, the estimates had non-infinite upper confidence limits, which is a major achievement for the application of the method to a naturally-occurring, short generation, highly fecund invertebrate species. The genetic estimate of the number of successful spawners was around 1000 individuals in two consecutive years. This contrasts with about 500,000 prawns participating in spawning. It is not possible to distinguish successful from non-successful spawners so we suggest a high level of protection for the entire spawning population. We interpret the difference in numbers between successful and non-successful spawners as a large variation in the number of offspring per family that survive – a large number of families have no surviving offspring, while a few have a large number. We explored various ways in which Ne can be useful in fisheries management. It can be a surrogate for spawning population size, assuming the ratio between Ne and spawning population size has been previously calculated for that species. Alternatively, it can be a surrogate for recruitment, again assuming that the ratio between Ne and recruitment has been previously determined. The number of species that can be analysed in this way, however, is likely to be small because of species-specific life history requirements that need to be satisfied for accuracy. The most universal approach would be to integrate Ne with spawning stock-recruitment models, so that these models are more accurate when applied to fisheries populations. A pathway to achieve this was established in this project, which we predict will significantly improve fisheries sustainability in the future. 
Regardless of the success of integrating Ne into spawning stock-recruitment models, Ne could be used as a fisheries monitoring tool. Declines in spawning stock size or increases in natural or harvest mortality would be reflected by a decline in Ne. This would be valuable for data-poor fisheries and would provide fishery-independent information; however, we suggest a species-by-species approach, as some species may be too numerous, or experiencing too much migration, for the method to work. During the project, two important theoretical studies of the simultaneous estimation of effective population size and migration were published (Vitalis and Couvet 2001b; Wang and Whitlock 2003). These methods, combined with the collection of preliminary genetic data from the tiger prawn population in the southern Gulf of Carpentaria and a computer simulation study that evaluated the effect of differing reproductive strategies on genetic estimates, suggest that this technology could make an important contribution to the stock assessment process in the northern prawn fishery (NPF). Advances in the genomics world are rapid, and a cheaper, more reliable substitute for microsatellite loci in this technology is already available. Digital data from single nucleotide polymorphisms (SNPs) are likely to supersede ‘analogue’ microsatellite data, making it cheaper and easier to apply the method to species with large population sizes.
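The temporal method underlying genetic estimates of this kind can be sketched as follows: allele-frequency drift between samples taken t generations apart, measured by a standardised variance Fc (Nei and Tajima's form is used here) and corrected for sampling noise, yields a Waples-style moment estimator of Ne. The locus count, sample sizes and true Ne in the toy simulation are illustrative, not those of NeEstimator or the prawn study.

```python
import numpy as np

def temporal_ne(p0, p1, t, s0, s1):
    """Waples-style temporal estimate of effective population size from
    allele frequencies observed t generations apart in samples of s0 and
    s1 diploid individuals."""
    z = (p0 + p1) / 2.0
    fc = np.mean((p0 - p1) ** 2 / (z - p0 * p1))   # standardised drift variance
    drift = fc - 1.0 / (2 * s0) - 1.0 / (2 * s1)   # remove sampling noise
    return np.inf if drift <= 0 else t / (2.0 * drift)

rng = np.random.default_rng(2)
true_ne, t, s = 50, 5, 50
p = rng.uniform(0.2, 0.8, 200)            # starting frequencies at 200 loci
obs0 = rng.binomial(2 * s, p) / (2 * s)   # sample before drift
for _ in range(t):                        # t generations of binomial drift
    p = rng.binomial(2 * true_ne, p) / (2 * true_ne)
obs1 = rng.binomial(2 * s, p) / (2 * s)
est = temporal_ne(obs0, obs1, t, s, s)    # should land near true_ne
```

When sampling noise swamps the drift signal, `drift` can fall to zero or below and the estimate becomes infinite, which is exactly the non-informative upper-confidence-limit behaviour the project worked to avoid for a large, highly fecund population.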