Abstract:
Information and communication technology (ICT) has created opportunities for students' online interaction in higher education throughout the world, yet limited research has been done in this area in Saudi Arabia. This study investigated university students' engagement and perceptions of online collaborative learning using Social Learning Tools (SLTs). In addition, it explored the quality of knowledge construction that occurred in this environment. A mixed-methods case study approach was adopted, and data were gathered from undergraduate students (n=43) enrolled in a 15-week course at a Saudi university. The results showed that while the students had positive perceptions of SLTs and of their own engagement, data gathered from their work showed little evidence of high levels of knowledge construction.
Abstract:
Stochastic volatility models are of fundamental importance to the pricing of derivatives. One of the most commonly used models of stochastic volatility is the Heston model, in which the price and volatility of an asset evolve as a pair of coupled stochastic differential equations. The computation of asset prices and volatilities involves the simulation of many sample trajectories with conditioning. The problem is treated using the method of particle filtering. While the simulation of a shower of particles is computationally expensive, each particle behaves independently, making such simulations ideal for massively parallel heterogeneous computing platforms. In this paper, we present our portable OpenCL implementation of the Heston model and discuss its performance and efficiency characteristics on a range of architectures including Intel CPUs, NVIDIA GPUs, and Intel Many Integrated Core (MIC) accelerators.
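The coupled Heston dynamics can be sketched with a plain Monte Carlo discretization. The snippet below is a minimal NumPy illustration using a full-truncation Euler scheme on a single CPU, not the paper's particle-filtering OpenCL implementation; all parameter values are arbitrary placeholders.

```python
import numpy as np

def heston_paths(s0=100.0, v0=0.04, kappa=2.0, theta=0.04, xi=0.3,
                 rho=-0.7, r=0.01, T=1.0, n_steps=252, n_paths=10_000, seed=0):
    """Full-truncation Euler scheme for the coupled Heston SDEs:
    dS = r*S dt + sqrt(v)*S dW1,  dv = kappa*(theta - v) dt + xi*sqrt(v) dW2,
    with corr(dW1, dW2) = rho.  Parameter values here are placeholders."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    s = np.full(n_paths, s0)
    v = np.full(n_paths, v0)
    for _ in range(n_steps):
        z1 = rng.standard_normal(n_paths)
        z2 = rho * z1 + np.sqrt(1.0 - rho**2) * rng.standard_normal(n_paths)
        v_pos = np.maximum(v, 0.0)      # truncation keeps sqrt(v) well defined
        s *= np.exp((r - 0.5 * v_pos) * dt + np.sqrt(v_pos * dt) * z1)
        v += kappa * (theta - v_pos) * dt + xi * np.sqrt(v_pos * dt) * z2
    return s

prices = heston_paths()
# discounted mean payoff of an at-the-money European call (strike 100, r*T = 0.01)
call = np.exp(-0.01) * np.maximum(prices - 100.0, 0.0).mean()
```

Each path is independent, which is why such simulations map well onto GPUs and other data-parallel hardware, as the abstract notes.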
Abstract:
Compositional data analysis usually deals with relative information between parts where the total (abundances, mass, amount, etc.) is unknown or uninformative. This article addresses the question of what to do when the total is known and is of interest. Tools used in this case are reviewed and analysed, in particular the relationship between the positive orthant of D-dimensional real space, the product space of the real line times the D-part simplex, and their Euclidean space structures. The first alternative corresponds to data analysis taking logarithms on each component, and the second to treating a log-transformed total jointly with a composition describing the distribution of component amounts. Real data on total abundances of phytoplankton in an Australian river motivated the present study and are used for illustration.
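The two representations can be made concrete in a few lines of NumPy. This is a hedged sketch with made-up abundance values, not the article's data:

```python
import numpy as np

# hypothetical total amounts of D = 3 phytoplankton groups in two samples
amounts = np.array([[120.0, 30.0, 50.0],
                    [ 80.0, 10.0, 10.0]])

# Alternative 1: componentwise logs (works directly in the positive orthant)
logs = np.log(amounts)

# Alternative 2: a log-transformed total paired with the closed composition
total = amounts.sum(axis=1, keepdims=True)
composition = amounts / total             # rows live on the D-part simplex
log_total = np.log(total)
```

The second representation separates absolute size (the log total) from relative structure (the composition), which is what makes the total itself analysable alongside the parts.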
Abstract:
The aggregation property of multiheaded surfactants has been investigated by constant-pressure molecular dynamics (MD) simulation in aqueous medium. The model multiheaded surfactants contain more than one headgroup (x = 2, 3, and 4) for a single tail group. This increases the hydrophilic charge progressively over the hydrophobic tail, which has dramatic consequences for the aggregation behavior. In particular, we have looked at changes in aggregation properties such as the critical micellar concentration (cmc), aggregation number, and size of the micelles for the multiheaded surfactants in water. We find that, with an increasing number of headgroups, the cmc values increase while the aggregation numbers and the size of the micelles decrease. These trends are in agreement with the experimental findings reported earlier for x = 1, 2, and 3. We also predict the aggregation properties of the multiheaded surfactant with four headgroups (x = 4), for which no experimental studies exist yet.
Abstract:
An interactive graphics package for modeling with Petri Nets has been implemented. It uses the VT-11 graphics terminal supported on the PDP-11/35 computer to draw, execute, analyze, edit and redraw a Petri Net. Each of the above mentioned tasks can be performed by selecting appropriate items from a menu displayed on the screen. Petri Nets with a reasonably large number of nodes can be created and analyzed using this package. The number of nodes supported may be increased by making simple changes in the program. Being interactive, the program seeks information from the user after displaying appropriate messages on the terminal. After completing the Petri Net, it may be executed step by step and the changes in the number of tokens may be observed on the screen, at each place. Some properties of Petri Nets like safety, boundedness, conservation and redundancy can be checked using this package. This package can be used very effectively for modeling asynchronous (concurrent) systems with Petri Nets and simulating the model by “graphical execution.”
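The core of such a package is the token-game semantics: a transition is enabled when each of its input places holds enough tokens, and firing it consumes those tokens and produces tokens in its output places. A minimal, hypothetical Python sketch of that firing rule (place and transition names are invented):

```python
def enabled(marking, transition):
    """A transition is enabled when every input place holds enough tokens."""
    return all(marking[p] >= w for p, w in transition["in"].items())

def fire(marking, transition):
    """Fire an enabled transition: consume input tokens, produce output tokens."""
    if not enabled(marking, transition):
        raise ValueError("transition not enabled")
    m = dict(marking)                      # treat markings as immutable snapshots
    for p, w in transition["in"].items():
        m[p] -= w
    for p, w in transition["out"].items():
        m[p] = m.get(p, 0) + w
    return m

# a two-place net: firing t moves the single token from p1 to p2
t = {"in": {"p1": 1}, "out": {"p2": 1}}
m0 = {"p1": 1, "p2": 0}
m1 = fire(m0, t)
```

Step-by-step "graphical execution" as described above amounts to repeating this update and redrawing the token counts after each firing.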
Abstract:
The monosaccharide 2-O-sulfo-α-l-iduronic acid (IdoA2S) is one of the major components of glycosaminoglycans. The ability of molecular mechanics force fields to reproduce ring-puckering conformational equilibrium is important for the successful prediction of the free energies of interaction of these carbohydrates with proteins. Here we report unconstrained molecular dynamics simulations of IdoA2S monosaccharide that were carried out to investigate the ability of commonly used force fields to reproduce its ring conformational flexibility in aqueous solution. In particular, the distribution of ring conformer populations of IdoA2S was determined. The GROMOS96 force field with the SPC/E water potential can predict successfully the dominant skew-boat to chair conformational transition of the IdoA2S monosaccharide in aqueous solution. On the other hand, the GLYCAM06 force field with the TIP3P water potential sampled transitional conformations between the boat and chair forms. Simulations using the GROMOS96 force field showed no pseudorotational equilibrium fluctuations and hence no inter-conversion between the boat and twist boat ring conformers. Calculations of theoretical proton NMR coupling constants showed that the GROMOS96 force field can predict the skew-boat to chair conformational ratio in good agreement with the experiment, whereas GLYCAM06 shows worse agreement. The omega rotamer distribution about the C5–C6 bond was predicted by both force fields to have torsions around 10°, 190°, and 360°.
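Theoretical proton coupling constants of the kind compared above are conventionally derived from dihedral angles via a Karplus-type relation and then population-averaged over ring conformers. The sketch below illustrates only that generic procedure; the coefficients are illustrative textbook-style values, not the parameterization used in the paper:

```python
import math

def karplus_j(phi_deg, a=7.76, b=-1.10, c=1.40):
    """Three-bond proton coupling 3J(H,H) from a Karplus-type relation.
    The coefficients a, b, c are generic illustrative values; real studies
    fit them to the molecular fragment in question."""
    phi = math.radians(phi_deg)
    return a * math.cos(phi) ** 2 + b * math.cos(phi) + c

def averaged_j(j_conf1, j_conf2, p_conf1):
    """Population-weighted coupling for a two-state conformational mix,
    e.g. a chair / skew-boat equilibrium."""
    return p_conf1 * j_conf1 + (1.0 - p_conf1) * j_conf2
```

Comparing such averaged values against experimental NMR couplings is what allows the conformer ratio predicted by a force field to be checked, as done for GROMOS96 and GLYCAM06 above.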
Abstract:
Aggregation of the microtubule associated protein tau (MAPT) within neurons of the brain is the leading cause of tauopathies such as Alzheimer's disease. MAPT is a phospho-protein that is selectively phosphorylated by a number of kinases in vivo to perform its biological function. However, it may become pathogenically hyperphosphorylated, causing aggregation into paired helical filaments and neurofibrillary tangles. The phosphorylation-induced conformational change on a peptide of MAPT (htau225−250) was investigated by performing molecular dynamics simulations with different phosphorylation patterns of the peptide (pThr231 and/or pSer235) under different simulation conditions to determine the effect of ionic strength and phosphate charge. All phosphorylation patterns were found to disrupt a nascent terminal β-sheet pattern (226VAVVR230 and 244QTAPVP249), replacing it with a range of structures. The double pThr231/pSer235 phosphorylation pattern at experimental ionic strength resulted in the best agreement with NMR structural characterization, with the observation of a transient α-helix (239AKSRLQT245). PPII helical conformations were only found sporadically throughout the simulations. Proteins 2014; 82:1907–1923. © 2014 Wiley Periodicals, Inc.
Abstract:
This paper asks a new question: how can we use RFID technology in marketing products in supermarkets, and how can we measure its performance or ROI (return on investment)? We try to answer the question by proposing a simulation model whereby customers become aware of other customers' real-time shopping behavior and may hence be influenced by their purchases and the levels of purchases. The proposed model is orthogonal to the sales model and can have similar effects: an increase in the overall shopping volume. Managers often struggle to predict the ROI of purchasing such a technology; this simulation sets out to answer questions such as the percentage increase in sales when real-time purchase information is given to other customers. The simulation is also flexible enough to incorporate any given model of customer behavior tailored to a particular supermarket, its settings, events or promotions. The results, although preliminary, are promising for using RFID technology to market products in supermarkets, and suggest several dimensions for influencing customers via feedback, real-time marketing, targeted advertisement and on-demand promotions. Several other parameters are discussed, including herd behavior, fake customers, privacy, the optimality of the sales-price margin, and the ROI of investing in RFID technology for marketing purposes. © 2010 Springer Science+Business Media B.V.
Abstract:
This work proposes a supermarket optimization simulation model called Swarm-Moves, based on studies of self-organized complex systems, to identify parameters and their values that can influence customers to buy more on impulse in a given period of time. In the proposed model, customers are assumed to have trolleys equipped with technology such as RFID that can pass product information directly from the store to them in real-time and vice versa. They can therefore see other customers' purchase patterns while constantly informing the store of their own shopping behavior. This can be achieved easily because the trolleys "know" what products they contain at any point. The Swarm-Moves simulation is a virtual supermarket providing the visual display to run and test the proposed model. The simulation is also flexible enough to incorporate any given model of customer behavior tailored to a particular supermarket, its settings, events or promotions. The results, although preliminary, are promising for using RFID technology to market products in supermarkets, and suggest several dimensions for influencing customers via feedback, real-time marketing, targeted advertisement and on-demand promotions. ©2009 IEEE.
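The feedback loop described here can be sketched as a toy agent model: at each tick, a customer who sees more trolleys holding a promoted item becomes more likely to add it too. This is a hypothetical illustration of herd-style positive feedback, not the Swarm-Moves code; all probabilities and sizes are invented.

```python
import random

def swarm_moves_step(carts, base_p=0.05, herd_gain=0.02, rng=random):
    """One tick: a customer without the promoted item buys it with a
    probability that grows with how many other carts already hold it
    (the herd feedback broadcast via the RFID trolleys)."""
    holders = sum(carts)
    out = []
    for has_item in carts:
        if has_item:
            out.append(1)                 # purchases are not undone
        else:
            p = min(1.0, base_p + herd_gain * holders)
            out.append(1 if rng.random() < p else 0)
    return out

rng = random.Random(42)                   # seeded for reproducibility
carts = [0] * 50                          # 50 customers, nobody holds the item
history = [sum(carts)]
for _ in range(20):
    carts = swarm_moves_step(carts, rng=rng)
    history.append(sum(carts))
```

Sweeping `base_p` and `herd_gain` in such a model is one way to explore which parameter values drive impulse buying, in the spirit of the study.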
Abstract:
Bioremediation, which is the exploitation of the intrinsic ability of environmental microbes to degrade and remove harmful compounds from nature, is considered to be an environmentally sustainable and cost-effective means for environmental clean-up. However, a comprehensive understanding of the biodegradation potential of microbial communities and their response to decontamination measures is required for the effective management of bioremediation processes. In this thesis, the potential to use hydrocarbon-degradative genes as indicators of aerobic hydrocarbon biodegradation was investigated. Small-scale functional gene macro- and microarrays targeting aliphatic, monoaromatic and low molecular weight polyaromatic hydrocarbon biodegradation were developed in order to simultaneously monitor the biodegradation of mixtures of hydrocarbons. The validity of the array analysis in monitoring hydrocarbon biodegradation was evaluated in microcosm studies and field-scale bioremediation processes by comparing the hybridization signal intensities to hydrocarbon mineralization, real-time polymerase chain reaction (PCR), dot blot hybridization and both chemical and microbiological monitoring data. The results obtained by real-time PCR, dot blot hybridization and gene array analysis were in good agreement with hydrocarbon biodegradation in laboratory-scale microcosms. Mineralization of several hydrocarbons could be monitored simultaneously using gene array analysis. In the field-scale bioremediation processes, the detection and enumeration of hydrocarbon-degradative genes provided important additional information for process optimization and design. In creosote-contaminated groundwater, gene array analysis demonstrated that the aerobic biodegradation potential that was present at the site, but restrained under the oxygen-limited conditions, could be successfully stimulated with aeration and nutrient infiltration. 
During ex situ bioremediation of diesel oil- and lubrication oil-contaminated soil, the functional gene array analysis revealed inefficient hydrocarbon biodegradation, caused by poor aeration during composting. The functional gene array specifically detected upper and lower biodegradation pathways required for complete mineralization of hydrocarbons. Bacteria representing 1 % of the microbial community could be detected without prior PCR amplification. Molecular biological monitoring methods based on functional genes provide powerful tools for the development of more efficient remediation processes. The parallel detection of several functional genes using functional gene array analysis is an especially promising tool for monitoring the biodegradation of mixtures of hydrocarbons.
Abstract:
Purpose - The purpose of this paper is to apply the lattice Boltzmann equation method (LBM) with a multiple-relaxation-time (MRT) model to investigate lid-driven flow in a three-dimensional (3D) rectangular cavity, and to compare the results with flow in an equivalent two-dimensional (2D) cavity. Design/methodology/approach - The second-order MRT model is implemented in a 3D LBM code. The flow structure in cavities of different aspect ratios (0.25-4) and Reynolds numbers (0.01-1000) is investigated. The LBM simulation results are compared with those from numerical solution of the Navier-Stokes (NS) equations and with available experimental data. Findings - The 3D simulations demonstrate that 2D models may predict the flow structure reasonably well at low Reynolds numbers, but significant differences from experimental data appear at high Reynolds numbers. This discrepancy between 2D and 3D results is attributed to the effect of boundary layers near the side-walls in the transverse direction (in 3D), due to which the vorticity in the core region is weakened in general. Secondly, owing to the vortex-stretching effect present in 3D flow, the vorticity in the transverse plane intensifies whereas that in the lateral plane decays with increasing Reynolds number. However, on the symmetry plane, the flow structure variation with respect to cavity aspect ratio is found to be qualitatively consistent with the results of 2D simulations. Secondary flow vortices whose axis is in the direction of the lid motion are observed; these are weak at low Reynolds numbers, but become quite strong at high Reynolds numbers. Originality/value - The findings will be useful in the study of a variety of enclosed fluid flows.
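The collide-and-stream structure at the heart of any LBM code can be sketched compactly. The snippet below is a minimal D2Q9 illustration with single-relaxation-time (BGK) collision and periodic boundaries; it deliberately omits the paper's MRT collision operator, cavity walls, and 3D lattice, and the grid size and relaxation time are arbitrary.

```python
import numpy as np

# D2Q9 lattice: discrete velocities and quadrature weights
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)

def equilibrium(rho, ux, uy):
    """Second-order equilibrium distribution for each lattice direction."""
    cu = c[:, 0, None, None] * ux + c[:, 1, None, None] * uy
    usq = ux**2 + uy**2
    return w[:, None, None] * rho * (1.0 + 3.0*cu + 4.5*cu**2 - 1.5*usq)

def collide_and_stream(f, tau=0.8):
    """One LBM update: BGK relaxation toward equilibrium, then streaming."""
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    f = f - (f - equilibrium(rho, ux, uy)) / tau
    for i in range(9):                    # periodic streaming along c[i]
        f[i] = np.roll(np.roll(f[i], c[i, 0], axis=0), c[i, 1], axis=1)
    return f

# start from equilibrium with a small sinusoidal shear, then evolve
nx = ny = 32
ux0 = 0.05 * np.sin(2 * np.pi * np.arange(ny) / ny)[None, :] * np.ones((nx, ny))
f = equilibrium(np.ones((nx, ny)), ux0, np.zeros((nx, ny)))
mass0 = f.sum()
for _ in range(50):
    f = collide_and_stream(f)
```

An MRT model replaces the single `tau` with a relaxation matrix acting in moment space, which improves stability at high Reynolds numbers, but the collide-then-stream skeleton is the same.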
Abstract:
Mutation and recombination are the fundamental processes leading to genetic variation in natural populations. This variation forms the raw material for evolution through natural selection and drift. Therefore, studying mutation rates may reveal information about evolutionary histories as well as phylogenetic interrelationships of organisms. In this thesis two molecular tools, DNA barcoding and the molecular clock, were examined. In the first part, the efficiency of mutations to delineate closely related species was tested and the implications for conservation practices were assessed. The second part investigated the proposition that a constant mutation rate exists within invertebrates, in the form of a metabolic-rate-dependent molecular clock, which can be applied to accurately date speciation events. DNA barcoding aspires to be an efficient technique to not only distinguish between species but also reveal population-level variation solely relying on mutations found on a short stretch of a single gene. In this thesis barcoding was applied to discriminate between Hylochares populations from Russian Karelia and new Hylochares findings from the greater Helsinki region in Finland. Although barcoding failed to delineate the two reproductively isolated groups, their distinct morphological features and differing life-history traits led to their classification as two closely related, although separate, species. The lack of genetic differentiation appears to be due to a recent divergence event not yet reflected in the beetles' molecular make-up. Thus, the Russian Hylochares was described as a new species. The Finnish species, previously considered locally extinct, was recognized as endangered. Even if, due to their identical genetic make-up, the populations had been regarded as conspecific, conservation strategies based on prior knowledge from Russia would not have guaranteed the survival of the Finnish beetle.
Therefore, new conservation actions based on detailed studies of the biology and life-history of the Finnish Hylochares were conducted to protect this endemic rarity in Finland. The idea behind the strict molecular clock is that mutation rates are constant over evolutionary time and may thus be used to infer species divergence dates. However, one of the most recent theories argues that a strict clock does not tick per unit of time but has a constant substitution rate per unit of mass-specific metabolic energy. Therefore, according to this hypothesis, molecular clocks have to be recalibrated taking body size and temperature into account. This thesis tested the temperature effect on mutation rates in equally sized invertebrates. For the first dataset (family Eucnemidae, Coleoptera), the phylogenetic interrelationships and evolutionary history of the genus Arrhipis had to be inferred before the influence of temperature on substitution rates could be studied. Further, a second, larger invertebrate dataset (family Syrphidae, Diptera) was employed. Several methodological approaches, a number of genes and multiple molecular clock models revealed that there was no consistent relationship between temperature and mutation rate for the taxa under study. Thus, the body size effect, observed in vertebrates but controversial for invertebrates, rather than temperature may be the underlying driving force behind the metabolic-rate-dependent molecular clock. Therefore, the metabolic-rate-dependent molecular clock does not hold for the invertebrate groups studied here. This thesis emphasizes that molecular techniques relying on mutation rates have to be applied with caution. Whereas they may work satisfactorily under certain conditions for specific taxa, they may fail for others. The molecular clock as well as DNA barcoding should incorporate all the information and data available to obtain comprehensive estimations of the existing biodiversity and its evolutionary history.
Abstract:
Undergraduate Medical Imaging (MI) students at QUT attend their first clinical placement towards the end of semester two. Students undertake two (pre)clinical skills development units - one theory and one practical. Students gain good contextual and theoretical knowledge during these units via a blended learning model with multiple learning methods employed. Students attend theory lectures, practical sessions, tutorial sessions in both a simulated and a virtual environment, and also attend pre-clinical scenario-based tutorial sessions. The aim of this project is to evaluate the use of blended learning in the context of first-year Medical Imaging Radiographic Technique and its effectiveness in preparing students for their first clinical experience. It is hoped that the multiple teaching methods employed within the pre-clinical training unit at QUT build students' clinical skills prior to the real situation. A quantitative approach will be taken, evaluating via pre- and post-clinical placement surveys. These data will be correlated with data gained in the previous year on the effectiveness of this training approach prior to clinical placement. In 2014, 59 students surveyed prior to their clinical placement demonstrated positive benefits of using a variety of learning tools to enhance their learning. 98.31% (n=58) of students agreed or strongly agreed that the theory lectures were a useful tool to enhance their learning. This was followed closely by 97% (n=57) of the students realising the value of performing role-play simulation prior to clinical placement. Tutorial engagement was considered useful by 93.22% (n=55), whilst 88.14% (n=52) reasoned that the x-raying of phantoms in the simulated radiographic laboratory was beneficial. Self-directed learning yielded 86.44% (n=51). The virtual reality simulation software was valuable for 72.41% (n=42) of the students.
Of the 4 students who disagreed or strongly disagreed with the usefulness of any one tool, all strongly agreed with the usefulness of at least one other learning tool. The impact of the blended learning model in meeting diverse student needs continues to be positive, with students engaging in most offerings. Students largely prefer pre-clinical scenario-based practical and tutorial sessions where 'real-world' situations are discussed.
Abstract:
Flexible objects such as a rope or a snake move in a way such that their axial length remains almost constant. To simulate the motion of such an object, one strategy is to discretize the object into a large number of small rigid links connected by joints. However, the resulting discretized system is highly redundant and the joint rotations for a desired Cartesian motion of any point on the object cannot be solved uniquely. In this paper, we revisit an algorithm, based on the classical tractrix curve, to resolve the redundancy in such hyper-redundant systems. For a desired motion of the `head' of a link, the `tail' is moved along a tractrix, and recursively all links of the discretized object are moved along different tractrix curves. The algorithm is illustrated by simulations of a moving snake, tying of knots with a rope and a solution of the inverse kinematics of a planar hyper-redundant manipulator. The simulations show that the tractrix-based algorithm leads to a more `natural' motion since the motion is distributed uniformly along the entire object, with the displacements diminishing from the `head' to the `tail'.
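For small head displacements, the tractrix resolution can be approximated by a "follow the leader" update: each tail is placed on the line joining the newly moved head to the tail's old position, at the fixed link length. The sketch below is only a hedged, small-step discretization of this idea, not the authors' exact tractrix solution:

```python
import numpy as np

def tractrix_step(head_new, tail_old, link_len):
    """Place the tail on the line from the new head position to the old
    tail position, at the fixed link length (small-step tractrix motion)."""
    d = tail_old - head_new
    return head_new + link_len * d / np.linalg.norm(d)

def drag_chain(joints, head_target, link_len):
    """Propagate a head displacement down the discretized object,
    moving each successive tail along its own tractrix-like path."""
    new = [np.asarray(head_target, dtype=float)]
    for tail in joints[1:]:
        new.append(tractrix_step(new[-1], np.asarray(tail, dtype=float), link_len))
    return new

# a straight 5-link chain whose head is dragged sideways
chain = [np.array([i, 0.0]) for i in range(6)]
moved = drag_chain(chain, [0.0, 0.5], 1.0)
```

The update preserves every link length exactly while later joints move less than the head, reproducing the diminishing-displacement behaviour described above.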
Abstract:
In many parts of the world, uncontrolled fires in sparsely populated areas are a major concern as they can quickly grow into large and destructive conflagrations in short time spans. Detecting these fires has traditionally been a job for trained humans on the ground, or in the air. In many cases, these manned solutions are simply not able to survey the amount of area necessary to maintain sufficient vigilance and coverage. This paper investigates the use of unmanned aerial systems (UAS) for automated wildfire detection. The proposed system uses low-cost, consumer-grade electronics and sensors combined with various airframes to create a system suitable for automatic detection of wildfires. The system employs automatic image processing techniques to analyze captured images and autonomously detect fire-related features such as fire lines, burnt regions, and flammable material. This image recognition algorithm is designed to cope with environmental occlusions such as shadows, smoke and obstructions. Once the fire is identified and classified, it is used to initialize a spatial/temporal fire simulation. This simulation is based on occupancy maps whose fidelity can be varied to include stochastic elements, various types of vegetation, weather conditions, and unique terrain. The simulations can be used to predict the effects of optimized firefighting methods and to prevent the further propagation of the fires; the system can also greatly reduce the time to detection of wildfires, thereby minimizing the ensuing damage. This paper also documents experimental flight tests using a SenseFly Swinglet UAS conducted in Brisbane, Australia, as well as modifications for custom UAS.
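An occupancy-map fire simulation of the general kind described can be sketched as a stochastic cellular model in which burning cells ignite neighbouring fuel cells with some probability. This is a generic toy illustration, not the paper's simulator; the cell states and the ignition probability are invented.

```python
import random

def spread_step(grid, p_ignite=0.3, rng=random):
    """One step of the occupancy-map model: burning cells (2) become burnt
    (3) and may ignite 4-neighbour fuel cells (1) with probability p_ignite."""
    rows, cols = len(grid), len(grid[0])
    nxt = [row[:] for row in grid]        # synchronous update on a copy
    for r in range(rows):
        for col in range(cols):
            if grid[r][col] == 2:
                nxt[r][col] = 3
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rr, cc = r + dr, col + dc
                    if (0 <= rr < rows and 0 <= cc < cols
                            and grid[rr][cc] == 1 and rng.random() < p_ignite):
                        nxt[rr][cc] = 2
    return nxt

rng = random.Random(7)                    # seeded stochastic element
grid = [[1] * 5 for _ in range(5)]        # uniform fuel everywhere
grid[2][2] = 2                            # detected ignition point
for _ in range(4):
    grid = spread_step(grid, rng=rng)
```

Varying `p_ignite` per cell is one way to encode vegetation type, weather, or terrain, which is the fidelity knob the abstract refers to.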