918 results for Computer simulation, Colloidal systems, Nucleation
Abstract:
Many organizations realize that increasing amounts of data (“Big Data”) need to be dealt with intelligently in order to compete with other organizations in terms of efficiency, speed and services. The goal is not to collect as much data as possible, but to turn event data into valuable insights that can be used to improve business processes. However, data-oriented analysis approaches fail to relate event data to process models. At the same time, large organizations are generating piles of process models that are disconnected from the real processes and information systems. In this chapter we propose to manage large collections of process models and event data in an integrated manner. Observed and modeled behavior need to be continuously compared and aligned. This results in a “liquid” business process model collection, i.e. a collection of process models that is in sync with the actual organizational behavior. The collection should self-adapt to evolving organizational behavior and incorporate relevant execution data (e.g. process performance and resource utilization) extracted from the logs, thereby allowing insightful reports to be produced from factual organizational data.
Abstract:
This work examined a new method of detecting small, water-filled cracks in underground insulation ('water trees') using data from commercially available non-destructive testing equipment. A testing facility was constructed and a computer simulation of the insulation was designed to test the proposed ageing factor: the degree of non-linearity. This was a large industry-backed project involving an ARC Linkage grant, Ergon Energy and the University of Queensland, as well as the Queensland University of Technology.
Abstract:
Due to increasing energy demand and global warming effects, energy efficient buildings have become increasingly important in the modern construction industry. This research was conducted to evaluate the energy performance, financial feasibility and potential energy savings of zero energy houses. Using a building computer simulation technique, a 5-star energy-rated house was modelled and validated by comparing the energy performance of a base-case scenario to that of a typical house in Brisbane. By integrating energy reduction strategies and utilizing onsite renewable energy such as solar energy, zero energy performance is achieved. It is found that approximately 66% energy savings can be achieved in annual household energy usage by focusing on maximizing the thermal performance of the building envelope, minimizing the energy requirements and incorporating solar energy technologies.
Abstract:
This paper presents an overview of the issues in precisely defining, specifying and evaluating the dependability of software, particularly in the context of computer-controlled process systems. Dependability is intended to be a generic term embodying various quality factors and is useful for both software and hardware. While developments in quality assurance and reliability theories have proceeded mostly in independent directions for hardware and software systems, we present here the case for developing a unified framework of dependability, a facet of the operational effectiveness of modern technological systems, and develop a hierarchical systems model helpful in clarifying this view. In the second half of the paper, we survey the models and methods available for measuring and improving software reliability. The nature of software “bugs”, the failure history of the software system in the various phases of its lifecycle, reliability growth in the development phase, estimation of the number of errors remaining in the operational phase, and the complexity of the debugging process are all considered in varying degrees of detail. We also discuss the notion of software fault-tolerance, methods of achieving it, and the status of other measures of software dependability such as maintainability, availability and safety.
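The survey above discusses reliability growth models and the estimation of residual errors only in general terms. As a purely illustrative sketch (the abstract does not single out any particular model), the snippet below fits the classic Goel-Okumoto NHPP growth curve, m(t) = a(1 - e^(-bt)), to hypothetical cumulative failure counts and reports the expected number of faults remaining; the test data and initial guesses are invented.

```python
# Illustrative only: fit the Goel-Okumoto reliability growth model to
# hypothetical cumulative failure counts and estimate remaining faults.
import numpy as np
from scipy.optimize import curve_fit

def goel_okumoto(t, a, b):
    """Expected cumulative number of failures observed by time t."""
    return a * (1.0 - np.exp(-b * t))

# Hypothetical test data: weeks of testing vs. cumulative failures found.
weeks = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
cum_failures = np.array([12, 21, 28, 33, 37, 40, 42, 43], dtype=float)

(a_hat, b_hat), _ = curve_fit(goel_okumoto, weeks, cum_failures, p0=[50.0, 0.3])

remaining = a_hat - cum_failures[-1]   # expected faults still latent in the code
print(f"estimated total faults a = {a_hat:.1f}, detection rate b = {b_hat:.2f}")
print(f"expected faults remaining after week {weeks[-1]:.0f}: {remaining:.1f}")
```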
Abstract:
The acceptance of broadband ultrasound attenuation (BUA) for the assessment of osteoporosis suffers from a limited understanding of both ultrasound wave propagation through cancellous bone and its exact dependence upon material and structural properties. It has recently been proposed that ultrasound wave propagation in cancellous bone may be described by a concept of parallel sonic rays, the transit time of each ray being defined by the proportions of bone and marrow it propagates through. A Transit Time Spectrum (TTS) describes the proportion of sonic rays having a particular transit time, effectively describing the lateral inhomogeneity of transit times over the surface aperture of the receiving ultrasound transducer. The aim of this study was to test the hypothesis that the solid volume fraction (SVF) of simplified bone:marrow replica models may be reliably estimated from the corresponding ultrasound transit time spectrum. Transit time spectra were derived via digital deconvolution of the experimentally measured input and output ultrasonic signals and compared to the TTS predicted from the parallel sonic ray concept, demonstrating agreement in both the position and amplitude of spectral peaks. Solid volume fraction was calculated from the TTS; agreement of the true (geometric calculation) values with the predicted (computer simulation) and experimentally derived values was R² = 99.9% and R² = 97.3%, respectively. It is therefore envisaged that ultrasound transit time spectroscopy (UTTS) offers the potential to reliably estimate bone mineral density and hence the established T-score parameter for clinical osteoporosis assessment.
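As a rough sketch of the deconvolution step described above, the snippet below recovers a transit time spectrum from a simulated input pulse and an output built from two delayed copies of it, mimicking two groups of parallel sonic rays. The regularisation constant, sampling rate and synthetic pulse are assumptions for illustration, not the authors' processing chain.

```python
# Sketch: regularised frequency-domain deconvolution of input/output pulses
# to recover a transit time spectrum (TTS).  All parameters are illustrative.
import numpy as np

def transit_time_spectrum(x_in, y_out, fs, eps=1e-3):
    """Deconvolve y = h * x and return (time axis, h), h acting as the TTS."""
    n = len(x_in) + len(y_out) - 1
    X = np.fft.rfft(x_in, n)
    Y = np.fft.rfft(y_out, n)
    H = Y * np.conj(X) / (np.abs(X) ** 2 + eps * np.max(np.abs(X)) ** 2)
    h = np.fft.irfft(H, n)
    return np.arange(n) / fs, h

# Synthetic example: a Gaussian pulse and an output made of two delayed copies,
# i.e. two groups of sonic rays with different transit times.
fs = 50e6                                     # 50 MHz sampling rate (assumed)
t = np.arange(0, 4e-6, 1 / fs)
pulse = np.exp(-((t - 1e-6) ** 2) / (2 * (0.1e-6) ** 2))
d1, d2 = int(0.5e-6 * fs), int(1.0e-6 * fs)   # two transit-time delays
out = 0.6 * np.roll(pulse, d1) + 0.4 * np.roll(pulse, d2)

tts_t, tts = transit_time_spectrum(pulse, out, fs)
print("largest spectral peaks near samples:", np.argsort(tts)[-2:])
```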
Abstract:
From a computer simulation of the 270 MHz 1H NMR spectra of hydroxyproline (Hyp) and its protected derivatives, precise values of the ring vicinal coupling constants were obtained. These couplings were related to ring torsional angles using a Karplus-type analysis. From the NMR analysis it was observed that the pyrrolidine ring possesses a unique and highly homogeneous conformation (the Cγ-exo form). Temperature dependence studies on protected dipeptides suggest that the pyrrolidine ring conformation is independent of the backbone conformation. An unusual X-Hyp β-turn was observed for Boc-Aib-Hyp-NHMe.
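For readers unfamiliar with a Karplus-type analysis, the sketch below evaluates a generic Karplus relation, ³J(θ) = A cos²θ + B cosθ + C, over a few torsional angles. The coefficients are common textbook values, not the parameterisation used in this work.

```python
# Generic Karplus relation between a vicinal H-H coupling constant and the
# torsional angle.  Coefficients are textbook values, used here only to show
# the form of the relation.
import math

def karplus_3j(theta_deg, A=7.76, B=-1.10, C=1.40):
    """3J(H,H) in Hz as a function of the torsional angle in degrees."""
    th = math.radians(theta_deg)
    return A * math.cos(th) ** 2 + B * math.cos(th) + C

for theta in (0, 60, 90, 120, 180):
    print(f"theta = {theta:3d} deg  ->  3J = {karplus_3j(theta):.2f} Hz")
```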
Abstract:
The development of innovative methods of stock assessment is a priority for State and Commonwealth fisheries agencies. It is driven by the need to facilitate sustainable exploitation of naturally occurring fisheries resources for the current and future economic, social and environmental well-being of Australia. This project was initiated in this context and took advantage of considerable recent achievements in genomics that are shaping our comprehension of the DNA of humans and animals. The basic idea behind this project was that genetic estimates of effective population size, which can be made from empirical measurements of genetic drift, are equivalent to estimates of the successful number of spawners, an important parameter in the process of fisheries stock assessment. The broad objectives of this study were to (1) critically evaluate a variety of mathematical methods of calculating effective spawner numbers (Ne) by (a) conducting comprehensive computer simulations and (b) analysing empirical data collected from the Moreton Bay population of tiger prawns (P. esculentus); (2) lay the groundwork for the application of the technology in the northern prawn fishery (NPF); and (3) produce software for the calculation of Ne and make it widely available. The project pulled together a range of mathematical models for estimating current effective population size from diverse sources. Some of them had recently been implemented with the latest statistical methods (e.g. the Bayesian framework of Berthier, Beaumont et al. 2002), while others had lower profiles (e.g. Pudovkin, Zaykin et al. 1996; Rousset and Raymond 1995). Computer code, and later software with a user-friendly interface (NeEstimator), was produced to implement the methods. This was used as a basis for simulation experiments to evaluate the performance of the methods with an individual-based model of a prawn population. Following the guidelines suggested by the computer simulations, the tiger prawn population in Moreton Bay (south-east Queensland) was sampled for genetic analysis with eight microsatellite loci in three successive spring spawning seasons in 2001, 2002 and 2003. As predicted by the simulations, the estimates had non-infinite upper confidence limits, which is a major achievement for the application of the method to a naturally occurring, short-generation, highly fecund invertebrate species. The genetic estimate of the number of successful spawners was around 1000 individuals in two consecutive years. This contrasts with about 500,000 prawns participating in spawning. It is not possible to distinguish successful from non-successful spawners, so we suggest a high level of protection for the entire spawning population. We interpret the difference in numbers between successful and non-successful spawners as a large variation in the number of offspring per family that survive: a large number of families have no surviving offspring, while a few have a large number. We explored various ways in which Ne can be useful in fisheries management. It can be a surrogate for spawning population size, assuming the ratio between Ne and spawning population size has been previously calculated for that species. Alternatively, it can be a surrogate for recruitment, again assuming that the ratio between Ne and recruitment has been previously determined. The number of species that can be analysed in this way, however, is likely to be small because of species-specific life history requirements that need to be satisfied for accuracy.
The most universal approach would be to integrate Ne with spawning stock-recruitment models, so that these models are more accurate when applied to fisheries populations. A pathway to achieve this was established in this project, which we predict will significantly improve fisheries sustainability in the future. Regardless of the success of integrating Ne into spawning stock-recruitment models, Ne could be used as a fisheries monitoring tool. Declines in spawning stock size or increases in natural or harvest mortality would be reflected by a decline in Ne. This would be valuable for data-poor fisheries and provides fishery-independent information; however, we suggest a species-by-species approach, as some species may be too numerous or experiencing too much migration for the method to work. During the project, two important theoretical studies of the simultaneous estimation of effective population size and migration were published (Vitalis and Couvet 2001b; Wang and Whitlock 2003). These methods, combined with the collection of preliminary genetic data from the tiger prawn population in the southern Gulf of Carpentaria and a computer simulation study that evaluated the effect of differing reproductive strategies on genetic estimates, suggest that this technology could make an important contribution to the stock assessment process in the northern prawn fishery (NPF). Advances in the genomics world are rapid, and a cheaper, more reliable substitute for microsatellite loci is already available for this technology. Digital data from single nucleotide polymorphisms (SNPs) are likely to supersede 'analogue' microsatellite data, making it cheaper and easier to apply the method to species with large population sizes.
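As an illustration of the temporal genetic method underlying estimates of this kind, the sketch below implements a classical moment-based estimator (in the spirit of Nei & Tajima and Waples) that converts the standardised variance in allele-frequency change between two samples into an estimate of Ne. It is not necessarily the exact algorithm implemented in NeEstimator, and the allele frequencies and sample sizes are toy values.

```python
# Sketch of a moment-based "temporal method" estimate of effective population
# size from allele-frequency change between two sampling occasions.
import numpy as np

def temporal_ne(p0, pt, s0, st, generations):
    """p0, pt: allele frequencies at the same loci in two temporal samples;
    s0, st: numbers of individuals sampled each time."""
    p0, pt = np.asarray(p0, float), np.asarray(pt, float)
    # standardised variance of allele-frequency change (Nei & Tajima's Fc)
    fc = np.mean((p0 - pt) ** 2 / ((p0 + pt) / 2 - p0 * pt))
    # correct for sampling noise in both samples
    denom = fc - 1.0 / (2 * s0) - 1.0 / (2 * st)
    return generations / (2.0 * denom) if denom > 0 else float("inf")

# Toy example: frequencies at five loci in two successive spawning seasons.
p_year1 = [0.31, 0.55, 0.12, 0.48, 0.73]
p_year2 = [0.40, 0.46, 0.20, 0.57, 0.64]
print("Ne estimate:", round(temporal_ne(p_year1, p_year2, s0=50, st=50, generations=1), 1))
```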
Abstract:
The statistical minimum-risk pattern recognition problem, when the classification costs are random variables of unknown statistics, is considered. Using medical diagnosis as a possible application, the problem of learning the optimal decision scheme is studied for a two-class, two-action case as a first step. This reduces to the problem of learning the optimum threshold (for taking the appropriate action) on the a posteriori probability of one class. A recursive procedure for updating an estimate of the threshold is proposed. The estimation procedure does not require knowledge of the actual class labels of the sample patterns in the design set. The adaptive scheme of using the present threshold estimate for taking action on the next sample is shown to converge, in probability, to the optimum. The results of a computer simulation study of three learning schemes demonstrate the theoretically predictable salient features of the adaptive scheme.
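The paper's recursion learns the threshold without knowing the class labels of the design samples. Purely to illustrate the underlying minimum-risk idea, the sketch below shows a simpler, related scheme in which error types and their random costs are assumed observable: running means of the two misclassification costs are updated recursively and plugged into the classical threshold θ* = E[c12]/(E[c12] + E[c21]). The cost distributions are invented; this is not the paper's update rule.

```python
# Illustrative only: recursive estimation of random misclassification costs
# and the resulting minimum-risk threshold on P(class 1 | x).
import random

random.seed(1)

c12_mean_true, c21_mean_true = 3.0, 1.0     # unknown to the learner
c12_hat, c21_hat = 1.0, 1.0                 # initial guesses
n12 = n21 = 0

for _ in range(10000):
    # A misclassification of some type occurs and its random cost is observed.
    if random.random() < 0.5:
        n12 += 1
        cost = random.expovariate(1.0 / c12_mean_true)
        c12_hat += (cost - c12_hat) / n12   # recursive mean update
    else:
        n21 += 1
        cost = random.expovariate(1.0 / c21_mean_true)
        c21_hat += (cost - c21_hat) / n21

theta = c12_hat / (c12_hat + c21_hat)       # decide class 1 when q >= theta
print(f"estimated threshold: {theta:.3f}  (true optimum 0.75)")
```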
Abstract:
We consider the problem of estimating the optimal parameter trajectory over a finite time interval in a parameterized stochastic differential equation (SDE), and propose a simulation-based algorithm for this purpose. Towards this end, we consider a discretization of the SDE over finite time instants and reformulate the problem as one of finding an optimal parameter at each of these instants. A stochastic approximation algorithm based on the smoothed functional technique is adapted to this setting for finding the optimal parameter trajectory. A proof of convergence of the algorithm is presented and results of numerical experiments over two different settings are shown. The algorithm is seen to exhibit good performance. We also present extensions of our framework to the case of finding optimal parameterized feedback policies for controlled SDEs and present numerical results in this scenario as well.
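A minimal sketch of the smoothed functional idea the algorithm builds on is given below: the gradient of an expected cost is estimated by perturbing the parameter with Gaussian noise and differencing two noisy simulations. The quadratic toy objective, step sizes and smoothing parameter are assumptions for illustration, not the paper's setting (which optimises a parameter trajectory for a discretised SDE).

```python
# Sketch of a two-sided smoothed-functional stochastic approximation loop on a
# noisy quadratic objective; all tuning constants are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def noisy_cost(theta):
    """Simulated cost: true objective ||theta - 2||^2 observed with noise."""
    return float(np.sum((theta - 2.0) ** 2) + rng.normal(0.0, 0.1))

theta = np.zeros(3)
beta = 0.1                                    # smoothing (perturbation) parameter
for n in range(1, 5001):
    eta = rng.standard_normal(theta.shape)    # Gaussian perturbation direction
    grad_est = eta * (noisy_cost(theta + beta * eta)
                      - noisy_cost(theta - beta * eta)) / (2.0 * beta)
    a_n = 1.0 / (n + 10)                      # diminishing step sizes
    theta = theta - a_n * grad_est

print("theta after 5000 iterations:", np.round(theta, 2))  # should approach [2, 2, 2]
```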
Abstract:
Population structure, including population stratification and cryptic relatedness, can cause spurious associations in genome-wide association studies (GWAS). Usually, the scaled median or mean test statistic for association, calculated from multiple single-nucleotide polymorphisms across the genome, is used to assess such effects, and 'genomic control' can be applied subsequently to adjust test statistics at individual loci by a genomic inflation factor. Published GWAS have clearly shown that there are many loci underlying genetic variation for a wide range of complex diseases and traits, implying that a substantial proportion of the genome should show inflation of the test statistic. Here, we show by theory, simulation and analysis of data that in the absence of population structure and other technical artefacts, but in the presence of polygenic inheritance, substantial genomic inflation is expected. Its magnitude depends on sample size, heritability, linkage disequilibrium structure and the number of causal variants. Our predictions are consistent with empirical observations on height in independent samples of ~4000 and ~133,000 individuals.
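For concreteness, the sketch below computes the genomic inflation factor λ_GC (the median association χ² divided by the null median, approximately 0.455) for simulated test statistics in which a fraction of SNPs carry small polygenic effects; the simulation parameters are illustrative only.

```python
# Sketch: genomic inflation factor lambda_GC under a simulated polygenic model.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(42)

# 1-df association statistics for 100,000 SNPs: most null, a subset with
# small polygenic effects that shift the whole distribution upwards.
n_snps, n_causal = 100_000, 5_000
z = rng.standard_normal(n_snps)
z[:n_causal] += rng.normal(0.0, 0.6, n_causal)   # many small effects
stats = z ** 2

lambda_gc = np.median(stats) / chi2.ppf(0.5, df=1)   # null median ~ 0.455
print(f"genomic inflation factor lambda_GC = {lambda_gc:.3f}")
```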
Abstract:
Blood cells participate in vital physiological processes, and their numbers are tightly regulated so that homeostasis is maintained. Disruption of key regulatory mechanisms underlies many blood-related Mendelian diseases but also contributes to more common disorders, including atherosclerosis. We searched for quantitative trait loci (QTL) for hematology traits through a whole-genome association study, because these could provide new insights into both hemopoietic and disease mechanisms. We tested 1.8 million variants for association with 13 hematology traits measured in 6015 individuals from the Australian and Dutch populations. These traits included hemoglobin composition, platelet counts, and red blood cell and white blood cell indices. We identified three regions of strong association that, to our knowledge, have not been previously reported in the literature. The first was located in an intergenic region of chromosome 9q31 near LPAR1, explaining 1.5% of the variation in monocyte counts (best SNP rs7023923, p = 8.9×10⁻¹⁴). The second locus was located on chromosome 6p21 and associated with mean erythrocyte volume (rs12661667, p = 1.2×10⁻⁹, 0.7% variance explained) in a region that spanned five genes, including CCND3, a member of the D-cyclin gene family that is involved in hematopoietic stem cell expansion. The third region was also associated with erythrocyte volume and was located in an intergenic region on chromosome 6q24 (rs592423, p = 5.3×10⁻⁹, 0.6% variance explained). All three loci replicated in an independent panel of 1543 individuals (p values = 0.001, 9.9×10⁻⁵ and 7×10⁻⁵, respectively). The identification of these QTL provides new opportunities for furthering our understanding of the mechanisms regulating hemopoietic cell fate.
Abstract:
Rapid genetic gains for growth in barramundi (Lates calcarifer) appear achievable by starting a breeding programme using foundation stock from progeny-tested broodstock. The potential gains of this novel breeding design were investigated using biologically feasible scenarios tested with computer simulation models. The design involves the production of a large number of full-sib families by artificial mating, which are compared under common grow-out conditions. The estimated breeding values of their paternal parents are calculated using a binomial probit analysis to assess their suitability as foundation broodstock. The programme can theoretically yield faster rates of genetic gain than other breeding programmes for aquaculture species. Assuming a heritability of 0.25 for growth, foundation broodstock evaluated in two years had breeding values for faster growth ranging from 21% to 51%, depending on the genetic diversity of the stock under evaluation. As a comparison, it would take between nine and twenty-two years to identify broodstock with similar breeding values in a contemporary barramundi breeding programme.
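The abstract quotes expected gains given a heritability of 0.25 but not the underlying calculation. As a generic, hedged illustration of how the generation interval drives the rate of gain, the sketch below applies the standard breeders' equation scaled per year, R = h²S/L; the selection differential and generation intervals are assumed values, not the paper's parameters.

```python
# Illustrative breeders' equation per year: R = h^2 * S / L.
def gain_per_year(h2, selection_differential, generation_interval_years):
    """Expected change in the mean trait value per year."""
    return h2 * selection_differential / generation_interval_years

h2 = 0.25     # heritability of growth assumed in the abstract
S = 20.0      # hypothetical selection differential (growth, %)
for label, L in [("progeny-tested foundation stock", 2.0),
                 ("contemporary programme", 9.0)]:
    print(f"{label:32s}: {gain_per_year(h2, S, L):.2f}% gain per year")
```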
ssSNPer: identifying statistically similar SNPs to aid interpretation of genetic association studies
Abstract:
ssSNPer is a novel user-friendly web interface that provides easy determination of the number and location of untested HapMap SNPs, in the region surrounding a tested HapMap SNP, which are statistically similar and would thus produce comparable and perhaps more significant association results. Identification of ssSNPs can have crucial implications for the interpretation of the initial association results and the design of follow-up studies. AVAILABILITY: http://fraser.qimr.edu.au/general/daleN/ssSNPer/
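Statistical similarity between a tested SNP and surrounding SNPs is closely tied to linkage disequilibrium; whether r² is the exact criterion used by ssSNPer is an assumption here. The sketch below flags surrounding SNPs whose genotype dosages are highly correlated with the tested SNP, using toy data and hypothetical SNP names.

```python
# Sketch: pairwise r^2 between a tested SNP and nearby SNPs from genotype
# dosages (0/1/2).  SNP names and genotypes are invented.
import numpy as np

def pairwise_r2(g1, g2):
    """Squared correlation (LD r^2 proxy) between two genotype-dosage vectors."""
    r = np.corrcoef(g1, g2)[0, 1]
    return r ** 2

rng = np.random.default_rng(7)
tested = rng.integers(0, 3, size=500)            # tested HapMap SNP (toy data)
region = {                                       # surrounding untested SNPs
    "rsA": np.where(rng.random(500) < 0.95, tested, rng.integers(0, 3, 500)),
    "rsB": rng.integers(0, 3, size=500),         # unrelated SNP
}
similar = {snp: round(pairwise_r2(tested, g), 2) for snp, g in region.items()}
print(similar)   # r^2 near 1 implies near-identical association results
```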
Abstract:
Zoonoses from wildlife threaten global public health. Hendra virus is one of several zoonotic viral diseases that have recently emerged from Pteropus species fruit-bats (flying-foxes). Most hypotheses regarding persistence of Hendra virus within flying-fox populations emphasize horizontal transmission within local populations (colonies) via urine and other secretions, and transmission among colonies via migration. As an alternative hypothesis, we explore the role of recrudescence in persistence of Hendra virus in flying-fox populations via computer simulation using a model that integrates published information on the ecology of flying-foxes, and the ecology and epidemiology of Hendra virus. Simulated infection patterns agree with infection patterns observed in the field and suggest that Hendra virus could be maintained in an isolated flying-fox population indefinitely via periodic recrudescence in a manner indistinguishable from maintenance via periodic immigration of infected individuals. Further, post-recrudescence pulses of infectious flying-foxes provide a plausible basis for the observed seasonal clustering of equine cases. Correct understanding of the infection dynamics of Hendra virus in flying-foxes is fundamental to effectively managing risk of infection in horses and humans. Given the lack of clear empirical evidence on how the virus is maintained within populations, the role of recrudescence merits increased attention.
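A deliberately simplified compartmental sketch of the recrudescence hypothesis is given below: recovered animals occasionally relapse into an infectious state, so the virus can persist in an isolated colony without immigration. All rates and the colony size are invented, and the published model is far richer (colony ecology, birth pulses, seasonality).

```python
# Chain-binomial sketch of recrudescence-driven persistence in a closed colony.
import numpy as np

rng = np.random.default_rng(3)

N = 5000                        # colony size (assumed)
S, I, R = N - 5, 5, 0
beta, gamma = 0.3, 0.1          # per-day transmission and recovery rates (assumed)
recrudescence = 0.0005          # daily probability a recovered bat relapses (assumed)

infectious = []
for day in range(5 * 365):
    new_inf = rng.binomial(S, 1 - np.exp(-beta * I / N))   # new infections
    recov = rng.binomial(I, gamma)                         # recoveries
    relapse = rng.binomial(R, recrudescence)               # recrudescence events
    S, I, R = S - new_inf, I + new_inf - recov + relapse, R + recov - relapse
    infectious.append(I)

print("infectious bats after 5 simulated years:", infectious[-1])
print("persistence without immigration:", infectious[-1] > 0)
```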
Abstract:
The possibility of using spin-probe electron spin resonance (ESR) as a tool to study the glass transition temperature, Tg, of polymer electrolytes is explored in 4-hydroxy-2,2,6,6-tetramethylpiperidine-N-oxyl (TEMPOL) doped composite polymer electrolyte (PEG)46LiClO4 dispersed with nanoparticles of hydrotalcite. The Tg is estimated from the measured values of T50G, the temperature at which the extrema separation 2Azz of the broad powder spectrum decreases to 50 G. In another method, the correlation time τc for the spin-probe dynamics was determined by computer simulation of the ESR spectra, and Tg has been identified as the temperature at which τc begins to show temperature dependence. While both methods give values of Tg close to those obtained from differential scanning calorimetry, it is concluded that more work is required to establish spin-probe ESR as a reliable technique for the determination of Tg.
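As a small illustration of the T50G determination described above, the sketch below interpolates invented measurements of the extrema separation 2Azz against temperature to find the temperature at which the separation falls to 50 G.

```python
# Sketch: locate T_50G by interpolating 2A_zz (in gauss) versus temperature.
# The data points are invented for illustration.
import numpy as np

temperature_K = np.array([220, 240, 260, 280, 300, 320])
extrema_sep_G = np.array([68.0, 66.5, 63.0, 55.0, 44.0, 36.0])   # 2A_zz values

# 2A_zz decreases with temperature, so reverse the arrays so that the x-values
# passed to np.interp are increasing.
t50g = np.interp(50.0, extrema_sep_G[::-1], temperature_K[::-1])
print(f"T_50G ~ {t50g:.0f} K")
```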