959 results for Simulation tools
Abstract:
In this age of rapidly evolving technology, teachers are encouraged to adopt ICTs by government, syllabus, school management, and parents. Indeed, it is an expectation that teachers will incorporate technologies into their classroom teaching practices to enhance the learning experiences and outcomes of their students. In the science classroom in particular, a subject that traditionally incorporates hands-on experiments and practicals, the integration of modern technologies should be a major feature. Although myriad studies report on technologies that enhance students’ learning outcomes in science, there is a dearth of literature on how teachers go about selecting technologies for use in the science classroom. Teachers can feel ill-prepared to assess the range of available choices and might feel pressured and somewhat overwhelmed by the avalanche of new developments thrust before them in marketing literature and teaching journals. The consequences of making bad decisions are costly in terms of money, time and teacher confidence. Additionally, no research to date has identified what technologies science teachers use on a regular basis, and whether some purchased technologies have proven to be too problematic, preventing their sustained use and possible wider adoption. The primary aim of this study was to provide research-based guidance to teachers to aid their decision-making in choosing technologies for the science classroom. The study unfolded in several phases. The first phase of the project involved survey and interview data from teachers in relation to the technologies they currently use in their science classrooms and the frequency of their use. These data were coded and analysed using the Grounded Theory approach of Corbin and Strauss, and resulted in the development of a PETTaL model that captured the salient factors of the data. This model incorporated usability theory from the Human-Computer Interaction literature, and education theory and models such as Mishra and Koehler’s (2006) TPACK model, where the grounded data indicated these issues. The PETTaL model identifies Power (school management, syllabus, etc.), Environment (classroom/learning setting), Teacher (personal characteristics, experience, epistemology), Technology (usability, versatility, etc.) and Learners (academic ability, diversity, behaviour, etc.) as fields that can impact the use of technology in science classrooms. The PETTaL model was used to create a Predictive Evaluation Tool (PET): a tool designed to assist teachers in choosing technologies, particularly for science teaching and learning. The evolution of the PET was cyclical (employing agile development methodology), involving repeated testing with in-service and pre-service teachers at each iteration, and incorporating their comments in subsequent versions. Once no new suggestions were forthcoming, the PET was tested with eight in-service teachers, and the results showed that the PET outcomes obtained by (experienced) teachers concurred with their instinctive evaluations. They felt the PET would be a valuable tool when considering new technology, and that it would be particularly useful as a means of communicating perceived value between colleagues and between budget holders and requestors during the acquisition process. It is hoped that the PET could make the tacit knowledge acquired by experienced teachers about technology use in classrooms explicit to novice teachers.
Additionally, the PET could be used as a research tool to discover a teacher’s professional development needs. Therefore, the outcomes of this study can aid a teacher in the process of selecting educationally productive and sustainable new technology for their science classroom. This study has produced an instrument for assisting teachers in the decision-making process associated with the use of new technologies for the science classroom. The instrument is generic in that it can be applied to all subject areas. Further, this study has produced a powerful model that extends the TPACK model, which is currently extensively employed to assess teachers’ use of technology in the classroom. The PETTaL model, grounded in data from this study, responds to calls in the literature for TPACK’s further development. As a theoretical model, PETTaL has the potential to serve as a framework for the development of a teacher’s reflective practice (either self-evaluation or critical evaluation of observed teaching practices). Additionally, PETTaL has the potential for aiding the formulation of a teacher’s personal professional development plan. It will be the basis for further studies in this field.
Abstract:
This paper presents an accurate and robust geometric and material nonlinear formulation to predict the structural behaviour of unprotected steel members at elevated temperatures. A fire analysis including large displacement effects for frame structures is presented. This finite element formulation of beam-column elements is based on the plastic hinge approach to model the elasto-plastic strain-hardening material behaviour. The Newton-Raphson method, allowing for the thermal time-dependent effect, was employed to solve the nonlinear governing equations for large deflection over the thermal history. A combined incremental and total formulation for determining member resistance is employed in this nonlinear solution procedure for the efficient modelling of nonlinear effects. Degradation of material strength with increasing temperature is simulated by a set of temperature-stress-strain curves according to both ECCS and BS5950 Part 8, which implicitly allows for creep deformation. The effects of uniform or non-uniform temperature distribution over the section of the structural steel member are also considered. Several numerical and experimental verifications are presented.
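The incremental solution procedure described above can be illustrated with a minimal sketch of a Newton-Raphson iteration applied to a toy single-degree-of-freedom member whose stiffness degrades with temperature. All quantities and the stiffness reduction law below are hypothetical placeholders, not the paper's formulation:

```python
import numpy as np

def newton_raphson_step(u, f_ext, internal_force, tangent_stiffness,
                        tol=1e-8, max_iter=50):
    """Solve R(u) = f_ext - f_int(u) = 0 for one thermal/load increment."""
    for _ in range(max_iter):
        residual = f_ext - internal_force(u)
        if np.linalg.norm(residual) < tol:
            return u
        K_t = tangent_stiffness(u)              # tangent stiffness at the current state
        u = u + np.linalg.solve(K_t, residual)
    raise RuntimeError("Newton-Raphson did not converge")

# Toy model: stiffness degrades linearly with temperature (hypothetical reduction law).
def make_model(temperature):
    k = 200e3 * max(1.0 - temperature / 1000.0, 0.05)
    return (lambda u: k * u + 50.0 * u**3,                # nonlinear internal force
            lambda u: np.array([[k + 150.0 * u**2]]))     # consistent tangent

u = np.array([0.0])
for T in np.linspace(20.0, 600.0, 30):                    # march through the thermal history
    f_int, K = make_model(T)
    u = newton_raphson_step(u, np.array([5e3]),
                            lambda x: f_int(x[0]),
                            lambda x: K(x[0]))
print("final displacement:", u)
```

In the actual formulation, the internal force and tangent stiffness would come from the plastic-hinge beam-column elements and the ECCS/BS5950 temperature-stress-strain curves rather than the toy relations used here.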
Abstract:
The use of Mahalanobis squared distance–based novelty detection in statistical damage identification has become increasingly popular in recent years. The merit of the Mahalanobis squared distance–based method is that it is simple and requires low computational effort to enable the use of a higher dimensional damage-sensitive feature, which is generally more sensitive to structural changes. Mahalanobis squared distance–based damage identification is also believed to be one of the most suitable methods for modern sensing systems such as wireless sensors. Although possessing such advantages, this method is rather strict in its input requirement, as it assumes the training data to be multivariate normal, an assumption that does not always hold, particularly at an early monitoring stage. As a consequence, it may result in an ill-conditioned training model with erroneous novelty detection and damage identification outcomes. To date, there appears to be no study on how to systematically cope with such practical issues, especially in the context of a statistical damage identification problem. To address this need, this article proposes a controlled data generation scheme, which is based upon the Monte Carlo simulation methodology with the addition of several controlling and evaluation tools to assess the condition of output data. By evaluating the convergence of the data condition indices, the proposed scheme is able to determine the optimal setups for the data generation process and subsequently avoid unnecessarily excessive data. The efficacy of this scheme is demonstrated via applications to data from a benchmark structure in the field.
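As an illustration only, the following minimal sketch shows Mahalanobis squared distance novelty detection on a small feature set, together with a crude Monte Carlo augmentation loop that stops when a simple data-condition index (here the covariance condition number) converges. The feature values and stopping rule are invented for the example and are not the article's controlled data generation scheme:

```python
import numpy as np

def mahalanobis_sq(X_train, x):
    """Mahalanobis squared distance of observation x from the training distribution."""
    mu = X_train.mean(axis=0)
    cov = np.cov(X_train, rowvar=False)
    diff = x - mu
    return float(diff @ np.linalg.inv(cov) @ diff)

rng = np.random.default_rng(0)

# Small (possibly ill-conditioned) baseline feature set, e.g. three natural frequencies.
X_small = rng.normal(loc=[5.0, 12.0, 21.0], scale=[0.05, 0.10, 0.15], size=(20, 3))

# Monte Carlo augmentation: resample from the fitted mean/covariance until the
# covariance condition number stabilises (a crude stand-in for convergence checks
# on data-condition indices).
mu, cov = X_small.mean(axis=0), np.cov(X_small, rowvar=False)
X_aug, prev_cond = X_small.copy(), np.inf
while True:
    X_aug = np.vstack([X_aug, rng.multivariate_normal(mu, cov, size=100)])
    cond = np.linalg.cond(np.cov(X_aug, rowvar=False))
    if abs(prev_cond - cond) / cond < 0.01 or len(X_aug) > 5000:
        break
    prev_cond = cond

healthy = np.array([5.02, 11.95, 21.05])
damaged = np.array([4.60, 11.20, 19.80])   # shifted frequencies suggest damage
print(mahalanobis_sq(X_aug, healthy), mahalanobis_sq(X_aug, damaged))
```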
Abstract:
Pile foundations transfer loads from superstructures to stronger subsoil. Their strength and stability can hence affect structural safety. This paper treats the response of a reinforced concrete pile in saturated sand to a buried explosion. Fully coupled computer simulation techniques are used together with five different material models. The influence of reinforcement on pile response is investigated, and the important safety parameters of horizontal deformation and tensile stress in the pile are evaluated. Results indicate that adequate longitudinal reinforcement and proper detailing of transverse reinforcement can reduce pile damage. The present findings can serve as a benchmark reference for future analysis and design.
Abstract:
Background: Appropriate disposition of emergency department (ED) patients with chest pain is dependent on clinical evaluation of risk. A number of chest pain risk stratification tools have been proposed. The aim of this study was to compare the predictive performance for major adverse cardiac events (MACE) using risk assessment tools from the National Heart Foundation of Australia (HFA), the Goldman risk score and the Thrombolysis in Myocardial Infarction risk score (TIMI RS). Methods: This prospective observational study evaluated ED patients aged ≥30 years with non-traumatic chest pain for which no definitive non-ischemic cause was found. Data collected included demographic and clinical information, investigation findings and occurrence of MACE by 30 days. The outcome of interest was the comparative predictive performance of the risk tools for MACE at 30 days, as analyzed by receiver operating characteristic (ROC) curves. Results: Two hundred eighty-one patients were studied; the rate of MACE was 14.1%. The area under the curve (AUC) of the HFA, TIMI RS and Goldman tools for the endpoint of MACE was 0.54, 0.71 and 0.67, respectively, with the difference between the tools in predictive ability for MACE being highly significant [χ²(3) = 67.21, N = 276, p < 0.0001]. Conclusion: The TIMI RS and Goldman tools performed better than the HFA in this undifferentiated ED chest pain population, but selection of cutoffs balancing sensitivity and specificity was problematic. There is an urgent need for validated risk stratification tools specific for the ED chest pain population.
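As a hedged illustration of the ROC analysis described above, the sketch below computes AUCs and candidate cutoffs for three synthetic scores with scikit-learn. The cohort, score distributions and the Youden-index cutoff rule are invented for the example and are not the study's data or method:

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(1)

# Hypothetical cohort: 281 patients, ~14% MACE rate, with synthetic scores whose
# discrimination loosely mirrors the reported AUCs (HFA ~0.54, TIMI ~0.71, Goldman ~0.67).
n = 281
mace = rng.random(n) < 0.141

def synth_score(separation):
    return rng.normal(loc=mace * separation, scale=1.0)

scores = {"HFA": synth_score(0.15), "TIMI RS": synth_score(0.80), "Goldman": synth_score(0.60)}

for name, s in scores.items():
    auc = roc_auc_score(mace, s)
    fpr, tpr, thresholds = roc_curve(mace, s)
    # Pick the threshold maximising Youden's J as one way to balance sensitivity/specificity.
    best = np.argmax(tpr - fpr)
    print(f"{name}: AUC={auc:.2f}, cutoff={thresholds[best]:.2f}, "
          f"sens={tpr[best]:.2f}, spec={1 - fpr[best]:.2f}")
```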
Abstract:
Food waste is a current challenge that both developing and developed countries face. This project applied a novel combination of available methods in mechanical, agricultural and food engineering to address these challenges. A systematic approach was devised to investigate possibilities of reducing food waste and increasing the efficiency of industry by applying engineering concepts and theories, including experimental, mathematical and computational modelling methods. This study highlights the impact of a comprehensive understanding of the response of agricultural and food materials to mechanical operations, and its direct relation to the volume of food wasted globally.
Abstract:
This chapter was developed as part of the ‘People, communities and economies of the Lake Eyre Basin’ project. It has been written for communities, government agencies and interface organisations involved in natural resource management (NRM) in the Lake Eyre Basin (LEB). Its purpose is to identify the key factors for successful community engagement processes relevant to the LEB and to present tools and principles for successful engagement processes. The term ‘interface organisation’ is used here to refer to the diverse range of local and regional organisations (such as Catchment Committees or NRM Regional Bodies) that serve as linkages, or translators, between local communities and the broader Australian and State Governments. Fostering and harnessing effective processes of community engagement has been identified as crucial to building a prosperous future for rural and remote regions in Australia. The chapter presents an overview of the literature on successful community engagement processes for NRM, as well as an overview of the current NRM arrangements in the LEB. The main part of the chapter presents the findings of a series of interviews conducted with government liaison officers, representing both state and federal organisations, who are responsible for coordinating and facilitating regional NRM in the LEB, and with members of LEB communities.
Abstract:
Many model-based investigation techniques, such as sensitivity analysis, optimization, and statistical inference, require a large number of model evaluations to be performed at different input and/or parameter values. This limits the application of these techniques to models that can be implemented in computationally efficient computer codes. Emulators, by providing efficient interpolation between outputs of deterministic simulation models, can considerably extend the field of applicability of such computationally demanding techniques. So far, the dominant technique for developing emulators has been to use priors in the form of Gaussian stochastic processes (GASP) conditioned on a design data set of inputs and corresponding model outputs. In the context of dynamic models, this approach has two essential disadvantages: (i) these emulators do not consider our knowledge of the structure of the model, and (ii) they run into numerical difficulties if there are a large number of closely spaced input points, as is often the case in the time dimension of dynamic models. To address both of these problems, a new concept of developing emulators for dynamic models is proposed. This concept is based on a prior that combines a simplified linear state space model of the temporal evolution of the dynamic model with Gaussian stochastic processes for the innovation terms as functions of model parameters and/or inputs. These innovation terms are intended to correct the error of the linear model at each output step. Conditioning this prior on the design data set is done by Kalman smoothing. This leads to an efficient emulator that, due to the consideration of our knowledge about dominant mechanisms built into the simulation model, can be expected to outperform purely statistical emulators, at least in cases in which the design data set is small. The feasibility and potential difficulties of the proposed approach are demonstrated by application to a simple hydrological model.
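The conditioning step can be sketched on a toy scalar problem: a linear-Gaussian state-space prior smoothed against a design time series with a Kalman filter followed by a Rauch-Tung-Striebel backward pass. The Gaussian-process modelling of the innovation terms as functions of parameters and inputs is omitted here, and all numbers are placeholders rather than the hydrological application:

```python
import numpy as np

def kalman_smoother(y, a, q, r, m0=0.0, p0=1.0):
    """Scalar linear-Gaussian state-space model x_t = a*x_{t-1} + w_t, y_t = x_t + v_t.
    Forward Kalman filter followed by RTS smoothing; returns smoothed state means."""
    n = len(y)
    m_f, p_f = np.zeros(n), np.zeros(n)         # filtered means/variances
    m_pred, p_pred = m0, p0
    for t in range(n):
        k = p_pred / (p_pred + r)               # Kalman gain
        m_f[t] = m_pred + k * (y[t] - m_pred)
        p_f[t] = (1 - k) * p_pred
        m_pred, p_pred = a * m_f[t], a**2 * p_f[t] + q
    m_s = m_f.copy()
    for t in range(n - 2, -1, -1):              # Rauch-Tung-Striebel backward pass
        p_pred = a**2 * p_f[t] + q
        g = p_f[t] * a / p_pred
        m_s[t] = m_f[t] + g * (m_s[t + 1] - a * m_f[t])
    return m_s

# "Design data": noisy outputs of a (pretend) expensive dynamic simulator.
rng = np.random.default_rng(2)
truth = np.sin(np.linspace(0, 6, 120))
y_design = truth + rng.normal(scale=0.1, size=truth.size)
emulated = kalman_smoother(y_design, a=0.98, q=0.01, r=0.01)
print("max abs error:", np.max(np.abs(emulated - truth)))
```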
Abstract:
Many emerging economies are leveraging the patent system to stimulate biotechnological innovation, on the premise that it will improve their economic and social growth. The patent system mandates full disclosure of the patented invention in exchange for a temporary exclusive patent right. Recently, however, patent offices have fallen short of complying with such a mandate, especially for genetic inventions. Most patent offices provide only static information about disclosed patent sequences, and some do not even keep track of sequence listing data in their own databases. The successful partnership between QUT Library and Cambia exemplifies advocacy in Open Access, Open Innovation and User Participation. The library extends its services to various departments within the university, and builds and encourages research networks to complement the skills needed to make a contribution in the real world.
Abstract:
Sugar cane is a major source of food and fuel worldwide. Biotechnology has the potential to improve economically important traits in sugar cane as well as diversify sugar cane beyond traditional applications such as sucrose production. High levels of transgene expression are key to the success of improving crops through biotechnology. Here we describe new molecular tools that both expand and improve gene expression capabilities in sugar cane. We have identified promoters that can be used to drive high levels of gene expression in the leaf and stem of transgenic sugar cane. One of these promoters, derived from the Cestrum yellow leaf curling virus, drives levels of constitutive transgene expression that are significantly higher than those achieved by the historical benchmark maize polyubiquitin-1 (Zm-Ubi1) promoter. A second promoter, the maize phosphoenolpyruvate carboxylase promoter, was found to be a strong, leaf-preferred promoter that enables levels of expression comparable to Zm-Ubi1 in this organ. Transgene expression was increased approximately 50-fold by gene modification, which included optimising the codon usage of the coding sequence to better suit sugar cane. We also describe a novel dual transcriptional enhancer that increased gene expression from different promoters, boosting expression from Zm-Ubi1 over eightfold. These molecular tools will be extremely valuable for the improvement of sugar cane through biotechnology.
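As a loose illustration of what codon optimisation involves, the sketch below back-translates a short protein using the most frequent codon per amino acid from a small, entirely hypothetical usage table; real optimisation for sugar cane would use a full species-specific table and additional constraints:

```python
# Toy codon-optimisation pass: pick the most frequent codon per amino acid.
# The usage frequencies below are hypothetical placeholders, not sugarcane data.
usage = {
    "M": {"ATG": 1.00},
    "A": {"GCC": 0.38, "GCT": 0.30, "GCG": 0.18, "GCA": 0.14},
    "K": {"AAG": 0.70, "AAA": 0.30},
    "L": {"CTC": 0.28, "CTG": 0.26, "CTT": 0.18, "TTG": 0.14, "CTA": 0.08, "TTA": 0.06},
}

def optimise(protein: str) -> str:
    """Back-translate a protein sequence using the most frequent codon for each residue."""
    return "".join(max(usage[aa], key=usage[aa].get) for aa in protein)

print(optimise("MAKL"))   # -> ATGGCCAAGCTC
```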
Abstract:
Graphene has been increasingly used as a nano-sized filler to create a broad range of nanocomposites with exceptional properties. The interfaces between fillers and matrix play a critical role in dictating the overall performance of a composite. However, the load transfer mechanism along the graphene-polymer interface has not been well understood. In this study, we conducted molecular dynamics simulations to investigate the influence of surface functionalization and layer length on the interfacial load transfer in graphene-polymer nanocomposites. The simulation results show that oxygen-functionalized graphene leads to a larger interfacial shear force than hydrogen-functionalized and pristine graphene during the pull-out process. Increasing the oxygen coverage and layer length enhances the interfacial shear force. Further increase of oxygen coverage to about 7% leads to a saturated interfacial shear force. A model was also established to demonstrate that the mechanism of interfacial load transfer consists of two contributing parts: the formation of new surface and relative sliding along the interface. These results are believed to be useful in the development of new graphene-based nanocomposites with better interfacial properties.
Abstract:
User evaluations using paper prototypes commonly lack social context. The group simulation technique described in this paper offers a solution to this problem. The study introduces an early-phase participatory design technique targeted at small groups. The proposed technique is used to evaluate an interface that enables group work in photo collection creation. Three groups of four users, 12 in total, took part in a simulation session where they tested a low-fidelity design concept that included their own personal photo content from an event that their group attended together. The users’ own content was used to evoke natural experiences. Our results indicate that the technique helped users to naturally engage with the prototype in the session. The technique is suggested to be suitable for evaluating other early-phase concepts and for guiding design solutions, especially for concepts that include users’ personal content and enable content sharing.
Abstract:
Standard Monte Carlo (sMC) simulation models have been widely used in AEC industry research to address system uncertainties. Although the benefits of probabilistic simulation analyses over deterministic methods are well documented, the sMC simulation technique is quite sensitive to the probability distributions of the input variables. This phenomenon becomes highly pronounced when the region of interest within the joint probability distribution (a function of the input variables) is small. In such cases, the standard Monte Carlo approach is often impractical from a computational standpoint. In this paper, a comparative analysis of standard Monte Carlo simulation and Markov chain Monte Carlo with subset simulation (MCMC/ss) is presented. The MCMC/ss technique constitutes a more complex simulation method (relative to sMC), wherein a structured sampling algorithm is employed in place of completely randomized sampling. Consequently, gains in computational efficiency can be made. The two simulation methods are compared via theoretical case studies.
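The contrast between the two approaches can be sketched on a toy rare-event problem: standard Monte Carlo needs on the order of 10/p samples to resolve a failure probability p, whereas subset simulation reaches the same event through a product of larger conditional probabilities sampled with a Metropolis random walk. The limit-state function, thresholds and sampler settings below are invented for the example and are not the paper's case studies:

```python
import numpy as np

rng = np.random.default_rng(3)
g = lambda x: x.sum(axis=-1)              # toy limit-state function of 2 standard normals
THRESH = 6.0                              # rare event: g(X) > 6, true p is roughly 1e-5

# Standard Monte Carlo: 200,000 samples is still too few for a stable estimate here.
N = 200_000
p_smc = np.mean(g(rng.standard_normal((N, 2))) > THRESH)

# Subset simulation: P(F) = P(F1) * P(F2|F1) * ..., each level kept at p0 = 0.1,
# with a Metropolis random walk to sample conditionally on each intermediate level.
def subset_simulation(n=1000, p0=0.1, max_levels=10):
    x = rng.standard_normal((n, 2))
    p = 1.0
    for _ in range(max_levels):
        gx = g(x)
        level = np.quantile(gx, 1 - p0)
        if level >= THRESH:
            return p * np.mean(gx > THRESH)
        p *= p0
        seeds = x[gx > level]
        chains = [seeds]
        # Grow the seed population back to n with a Metropolis walk restricted to g > level.
        while sum(len(c) for c in chains) < n:
            cand = seeds + 0.5 * rng.standard_normal(seeds.shape)
            accept = np.exp(-0.5 * (cand**2 - seeds**2).sum(axis=1)) > rng.random(len(seeds))
            cand = np.where((accept & (g(cand) > level))[:, None], cand, seeds)
            chains.append(cand)
            seeds = cand
        x = np.vstack(chains)[:n]
    return p

print(f"standard MC: {p_smc:.2e}   subset simulation: {subset_simulation():.2e}")
```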
Abstract:
Bone is characterized by an optimized combination of high stiffness and toughness. The understanding of bone nanomechanics is critical to the development of new artificial biological materials with unique properties. In this work, the mechanical characteristics of the interfaces between osteopontin (OPN, a noncollagenous protein in the extrafibrillar protein matrix) and hydroxyapatite (HA, a mineral nanoplatelet in mineralized collagen fibrils) were investigated using the molecular dynamics method. We found that the interfacial mechanical behaviour is governed by the electrostatic attraction between acidic amino acid residues in OPN and calcium in HA. Higher energy dissipation is associated with OPN peptides containing a larger number of acidic amino acid residues. When loading is applied in the interface direction, new bonds between some acidic residues and the HA surface are formed, resulting in a stick-slip type motion of the OPN peptide on the HA surface and high interfacial energy dissipation. The formation of new bonds during loading is considered to be a key mechanism responsible for the high fracture resistance observed in bone and other biological materials.
Abstract:
This study set out to investigate the kinds of learning difficulties encountered by Malaysian students and how they actually coped with online learning. The modified Online Learning Environment Survey (OLES) instrument was used to collect data from a sample of 40 Malaysian students at a university in Brisbane, Australia. A control group of 35 Australian students was also included for comparison purposes. Contrary to assumptions from previous research, the findings revealed that there were only a few differences between the international Asian and Australian students with regard to their perceptions of online learning. Recommendations based on the findings of this study are applicable to Australian universities that have Asian international students enrolled in online study.