937 results for FEA simulations
Abstract:
Bioscience subjects require a significant amount of training in laboratory techniques to produce highly skilled science graduates. Many techniques currently used in diagnostic, research and industrial laboratories require expensive, single-user equipment; examples include next-generation sequencing, quantitative PCR, mass spectrometry and other analytical techniques. The cost of the machines and reagents, together with limited access, frequently precludes undergraduate students from using such cutting-edge techniques. In addition to cost and availability, the time taken for analytical runs on equipment such as High Performance Liquid Chromatography (HPLC) does not necessarily fit within timetabling constraints. Understanding the theory underlying these techniques without the accompanying practical classes can be unexciting for students. One alternative to wet laboratory provision is to use virtual simulations of such practicals, which enable students to see the machines and interact with them to generate data. The Faculty of Science and Technology at the University of Westminster has provided all second- and third-year undergraduate students with iPads so that these students all have access to a mobile device to assist with learning. We have purchased licences from Labster to access a range of virtual laboratory simulations. These virtual laboratories are fully equipped and require student responses to multiple-choice questions in order to progress through the experiment. In a pilot study of the feasibility of the Labster virtual laboratory simulations on the iPad devices, second-year Biological Science students (n=36) worked through the Labster HPLC simulation on iPads. The virtual HPLC simulation enabled students to optimise the conditions for the separation of drugs. Answers to multiple-choice questions were necessary to progress through the simulation; these focused on the underlying principles of the HPLC technique.
Following the virtual laboratory simulation, students used a real HPLC instrument in the analytical suite to separate aspirin, caffeine and paracetamol. In a survey, 100% of students (n=36) in this cohort agreed that the Labster virtual simulation had helped them to understand HPLC. In free-text responses one student commented that "The terminology is very clear and I enjoyed using Labster very much". One member of staff commented that "there was a very good knowledge interaction with the virtual practical".
Abstract:
The use of serious games in education and their pedagogical benefits are being widely recognized. However, effective integration of serious games in education depends on addressing two big challenges: the successful incorporation of motivation and engagement that can lead to learning, and the highly specialised skills associated with customised development to meet the required pedagogical objectives. This paper presents the Westminster Serious Games Platform (wmin-SGP), an authoring tool that allows educators and domain experts without games design and development skills to create bespoke roleplay simulations in three-dimensional scenes, featuring fully embodied virtual humans capable of verbal and non-verbal interaction with users and fit for specific educational objectives. The paper presents the wmin-SGP system architecture and evaluates its effectiveness via the implementation of two roleplay simulations, one for Politics and one for Law. In addition, it presents the results of two types of evaluation that address how successfully the wmin-SGP combines usability principles with the game core drives of the Octalysis gamification framework that lead to motivating game experiences. The evaluation results show that the wmin-SGP provides an intuitive environment and tools that support users without advanced technical skills in creating, in real time, bespoke roleplay simulations with advanced graphical interfaces; satisfies most of the usability principles; and provides balanced simulations with respect to the Octalysis framework core drives. The paper concludes with a discussion of future extensions of this real-time authoring tool and directions for further development of the Octalysis framework to address learning.
Abstract:
The past years have witnessed an increased use of applied games for developing and evaluating communication skills. These skills benefit from interpersonal interactions. Providing feedback to students practicing communication skills is difficult in a traditional class setting with one teacher and many students. This logistic challenge may be partly overcome by providing training using a simulation in which a student practices with communication scenarios. A scenario is a description of a series of interactions, where at each step the player is faced with a choice. We have developed a scenario editor that enables teachers to develop scenarios for practicing communication skills. A teacher can develop a scenario without knowledge of the implementation. This paper presents the implementation architecture for such a scenario-based simulation.
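The scenario structure described above (a series of steps, at each of which the player faces a choice) can be sketched as a small tree. This is a minimal illustrative model, not the editor's actual data structures; all class and field names are assumptions:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Choice:
    text: str          # the reply the player can pick
    score: int         # feedback value a teacher might attach to this reply
    next_step: "Step"  # the step this choice leads to

@dataclass
class Step:
    prompt: str                      # what the virtual interlocutor says
    choices: List[Choice] = field(default_factory=list)

def play(step: Step, picks: List[int]) -> int:
    """Walk the scenario along a fixed list of choice indices; return total score."""
    total = 0
    for i in picks:
        choice = step.choices[i]
        total += choice.score
        step = choice.next_step
    return total
```

Because each choice only points at its next step, a teacher-authored scenario is just a tree of `Step` objects that the simulation traverses, which is what lets the editor stay independent of the implementation.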
Abstract:
We present self-consistent, axisymmetric core-collapse supernova simulations performed with the Prometheus-Vertex code for 18 pre-supernova models in the range of 11–28 M⊙, including progenitors recently investigated by other groups. All models develop explosions, but depending on the progenitor structure, they can be divided into two classes. With a steep density decline at the Si/Si–O interface, the arrival of this interface at the shock front leads to a sudden drop of the mass-accretion rate, triggering a rapid approach to explosion. With a more gradually decreasing accretion rate, it takes longer for the neutrino heating to overcome the accretion ram pressure and explosions set in later. Early explosions are facilitated by high mass-accretion rates after bounce and correspondingly high neutrino luminosities combined with a pronounced drop of the accretion rate and ram pressure at the Si/Si–O interface. Because of rapidly shrinking neutron star radii and receding shock fronts after the passage through their maxima, our models exhibit short advection timescales, which favor the efficient growth of the standing accretion-shock instability. The latter plays a supportive role at least for the initiation of the re-expansion of the stalled shock before runaway. Taking into account the effects of turbulent pressure in the gain layer, we derive a generalized condition for the critical neutrino luminosity that captures the explosion behavior of all models very well. We validate the robustness of our findings by testing the influence of stochasticity, numerical resolution, and approximations in some aspects of the microphysics.
Abstract:
Injection stretch blow moulding is a well-established method of forming thin-walled containers and has been extensively researched for numerous years. This paper is concerned with validating the finite element analysis of the free-stretch-blow process in an effort to progress the development of injection stretch blow moulding of poly(ethylene terephthalate). Extensive data were obtained experimentally over a wide process window, accounting for material temperature and air flow rate, while capturing cavity pressure, stretch-rod reaction force and preform surface strain. These data were then used to assess the accuracy of the corresponding FE simulation, constructed using the ABAQUS/Explicit solver and an appropriate viscoelastic material subroutine. Results reveal that the simulation gives good quantitative correlation for conditions where the deformation was predominantly equal-biaxial, whilst qualitative correlation was achievable when the mode of deformation was predominantly sequential-biaxial. Overall, the simulation was able to capture the general trends of how the pressure, reaction force, strain rate and strain vary with preform temperature and air flow rate. The knowledge gained from these analyses provides insight into the mechanisms of bottle formation, subsequently improving the blow moulding simulation and allowing a reduction in future development costs.
Abstract:
Oscillating wave surge converters (OWSCs) are a promising technology for harvesting ocean wave energy in the near-shore region. Although research has been ongoing for many years, the characteristics of the wave action on the structure, and especially the phase relation between the driving force and wave quantities such as velocity or surface elevation, have not been investigated in detail. The main reason for this is the lack of suitable methods: experimental tank tests do not give direct access to the overall hydrodynamic loads; only the damping torque of a power take-off system can be measured directly. Non-linear computational fluid dynamics methods have only recently been applied to research on this type of device. This paper presents a new metric, named wave torque, defined as the total hydrodynamic torque minus the still-water pitch stiffness torque at any given angle of rotation. Changes in the characteristics of this metric over a wave cycle and for different power take-off settings are investigated using computational fluid dynamics methods. Firstly, it is shown that linearised methods cannot predict optimum damping in typical operating states of OWSCs. We then present phase relationships between the main kinetic parameters for different damping levels. Although the flap seems to operate close to resonance, as predicted by linear theory, no obvious condition defining optimum damping is found.
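The wave torque metric defined above is a pointwise subtraction of the still-water restoring torque from the total hydrodynamic torque. A minimal sketch, with a caller-supplied pitch-stiffness function; the names and the linear stiffness used in the example are illustrative, not from the paper:

```python
def wave_torque(total_torque, theta, pitch_stiffness):
    """Wave torque at one instant: total hydrodynamic torque minus the
    still-water pitch stiffness torque at the current rotation angle theta.

    pitch_stiffness: function mapping theta -> restoring torque, e.g. a
    lookup from still-water CFD runs; a linear law is used only for illustration.
    """
    return total_torque - pitch_stiffness(theta)
```

Evaluating this over a wave cycle isolates the torque contribution actually driven by the wave, which is what allows the phase relation to the surface elevation or velocity to be examined.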
Abstract:
We present a reformulation of the hairy-probe method for introducing electronic open boundaries that is appropriate for steady-state calculations involving nonorthogonal atomic basis sets. As a check on the correctness of the method we investigate a perfect atomic wire of Cu atoms and a perfect nonorthogonal chain of H atoms. For both atom chains we find that the conductance has a value of exactly one quantum unit and that this is rather insensitive to the strength of coupling of the probes to the system, provided values of the coupling are of the same order as the mean interlevel spacing of the system without probes. For the Cu atom chain we find in addition that away from the regions with probes attached, the potential in the wire is uniform, while within them it follows a predicted exponential variation with position. We then apply the method to an initial investigation of the suitability of graphene as a contact material for molecular electronics. We perform calculations on a carbon nanoribbon to determine the correct coupling strength of the probes to the graphene and obtain a conductance of about two quantum units corresponding to two bands crossing the Fermi surface. We then compute the current through a benzene molecule attached to two graphene contacts and find only a very weak current because of the disruption of the π conjugation by the covalent bond between the benzene and the graphene. In all cases we find that very strong or weak probe couplings suppress the current.
Abstract:
Key Performance Indicators (KPIs) and their predictions are widely used by enterprises for informed decision making. Nevertheless, a very important factor, which is generally overlooked, is that top-level strategic KPIs are actually driven by operational-level business processes. These two domains are, however, mostly segregated and analysed in silos with different Business Intelligence solutions. In this paper, we propose an approach for advanced Business Simulations that converges the two domains by utilising process execution and business data, together with concepts from Business Dynamics (BD) and Business Ontologies, to promote better system understanding and detailed KPI predictions. Our approach incorporates the automated creation of Causal Loop Diagrams, empowering the analyst to critically examine the complex dependencies hidden in the massive amounts of available enterprise data. We have further evaluated the proposed approach in the context of a retail use case that involved verification of the automatically generated causal models by a domain expert.
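A Causal Loop Diagram of the kind generated above can be represented as a signed directed graph. This toy sketch (with hypothetical variable names, not taken from the retail use case) classifies a feedback loop as reinforcing or balancing from the product of its link polarities, which is the standard Business Dynamics convention:

```python
# Signed causal links: +1 means the variables move together, -1 opposed.
edges = {
    ("marketing_spend", "sales"): +1,
    ("sales", "revenue"): +1,
    ("revenue", "marketing_spend"): +1,  # closes a reinforcing loop
    ("price", "sales"): -1,
    ("sales", "price"): +1,              # closes a balancing loop
}

def loop_type(loop):
    """loop: ordered list of node names forming a cycle.
    Product of polarities > 0 -> reinforcing; < 0 -> balancing."""
    polarity = 1
    for a, b in zip(loop, loop[1:] + loop[:1]):
        polarity *= edges[(a, b)]
    return "reinforcing" if polarity > 0 else "balancing"
```

Automating the construction of such a graph from process execution and business data is what lets the analyst inspect dependencies without hand-drawing the diagram.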
Abstract:
The gravitationally confined detonation (GCD) model has been proposed as a possible explosion mechanism for Type Ia supernovae in the single-degenerate evolution channel. It starts with ignition of a deflagration in a single off-centre bubble in a near-Chandrasekhar-mass white dwarf. Driven by buoyancy, the deflagration flame rises in a narrow cone towards the surface. For the most part, the main component of the flow of the expanding ashes remains radial, but upon reaching the outer, low-pressure layers of the white dwarf, an additional lateral component develops. This causes the deflagration ashes to converge again at the opposite side, where the compression heats fuel and a detonation may be launched. We first performed five three-dimensional hydrodynamic simulations of the deflagration phase in 1.4 M⊙ carbon/oxygen white dwarfs at intermediate resolution (256³ computational zones). We confirm that the closer to the centre the initial deflagration is ignited, the slower the buoyant rise and the longer the deflagration ashes take to break out and close in on the opposite pole to collide. To test the GCD explosion model, we then performed a high-resolution (512³ computational zones) simulation for a model with an ignition spot offset near the upper limit of what is still justifiable, 200 km. This high-resolution simulation met our deliberately optimistic detonation criteria, and we initiated a detonation. The detonation burned through the white dwarf and led to its complete disruption. For this model, we determined detailed nucleosynthetic yields by post-processing 10⁶ tracer particles with a 384-nuclide reaction network, and we present multi-band light curves and time-dependent optical spectra. We find that our synthetic observables show a prominent viewing-angle sensitivity in ultraviolet and blue wavelength bands, which contradicts observed SNe Ia.
The strong dependence on the viewing angle is caused by the asymmetric distribution of the deflagration ashes in the outer ejecta layers. Finally, we compared our model to SN 1991T. The overall flux level of the model is slightly too low, and the model predicts pre-maximum light spectral features due to Ca, S, and Si that are too strong. Furthermore, the model chemical abundance stratification qualitatively disagrees with recent abundance tomography results in two key areas: our model lacks low-velocity stable Fe and instead has copious amounts of high-velocity ⁵⁶Ni and stable Fe. We therefore do not find good agreement between the model and SN 1991T.
Abstract:
Abandonment of farming systems on upland areas in southwest Britain during the Late Bronze Age – some 3000 years ago – is widely considered a ‘classic’ demonstration of the impact of deteriorating climate on the vulnerability of populations in such marginal environments. Here we test the hypothesis that climate change drove the abandonment of upland areas by developing new chronologies for human activity on upland areas during the Bronze Age across southwest Britain (Dartmoor, Exmoor and Bodmin Moor). We find Bronze Age activity in these areas spanned 3900–2950 calendar years ago with abandonment by 2900 calendar years ago. Holocene Irish bog and lake oak tree populations provide evidence of major shifts in hydroclimate across western Britain and Ireland, coincident with ice rafted debris layers recognized in North Atlantic marine sediments, indicating significant changes in the latitude and intensity of zonal atmospheric circulation across the region. We observe abandonment of upland areas in southwest Britain coinciding with a sustained period of extreme wet conditions that commenced 3100 calendar years ago. Our results are consistent with the view that climate change increased the vulnerability of these early farming communities and led to a less intensive use of such marginal environments across Britain.
Abstract:
Mid Sweden University is currently researching how to capture more of a scene with a camera and how to create 3D images that do not require extra equipment for the viewer. As part of this research, they have started looking into simulating some of the tests that they wish to conduct. The goal of this project is to investigate whether the 3D graphics engine Unity3D could be used to simulate these tests, and to what degree. To test this, a simulation was designed and implemented. The simulation used a split-display system in which each camera is directly connected to a part of the screen, and the position of the viewer determines which part of the camera feed is shown. Some literature studies were also carried out into how current 3D technology works. The simulation was successfully implemented and shows that simple simulations can be done in Unity3D; however, some problems were encountered in the process. The conclusion of the project shows that there is much work left before simulation is viable, but that there is potential in the technology and that the research team should continue to investigate it.
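The split-display idea above can be illustrated with a toy selector that maps the viewer's position to one of several camera strips across the screen. This is a deliberate simplification under assumed names, not the project's Unity3D code:

```python
def select_camera(viewer_x, screen_width, num_cameras):
    """Divide the screen into num_cameras vertical strips and pick the
    camera whose strip the viewer's horizontal position falls into,
    clamping at the screen edges. Purely illustrative."""
    strip = int(viewer_x / screen_width * num_cameras)
    return min(max(strip, 0), num_cameras - 1)
```

In the actual simulation the viewer position would come from head tracking and the selected feed region would change continuously, but the mapping from position to feed follows the same principle.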
Abstract:
Transient simulations are widely used in studying past climate as they provide better comparison with existing proxy data. However, multi-millennial transient simulations using coupled climate models are usually computationally very expensive. As a result, several acceleration techniques are implemented when using numerical simulations to recreate past climate. In this study, we compare the results from transient simulations of the present and the last interglacial, with and without acceleration of the orbital forcing, using the comprehensive coupled climate model CCSM3 (Community Climate System Model 3). Our study shows that in low-latitude regions, the simulation of long-term variations in interglacial surface climate is not significantly affected by the use of the acceleration technique (with an acceleration factor of 10); hence, large-scale model-data comparison of surface variables is not hampered. However, in high-latitude regions where the surface climate has a direct connection to the deep ocean, e.g. in the Southern Ocean or the Nordic Seas, acceleration-induced biases in the sea-surface temperature evolution may occur, with potential influence on the dynamics of the overlying atmosphere. The data provided here are from both accelerated and non-accelerated runs, as decadal mean values.
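Acceleration of the orbital forcing as described above amounts to advancing the orbital parameters faster than model time: with a factor of 10, each simulated year steps the forcing forward by ten calendar years. A minimal sketch of this mapping; the start year is illustrative, not taken from the study:

```python
ACCELERATION = 10  # acceleration factor, as used in the study

def orbital_year(model_year, start_year=-125_000):
    """Map a simulated model year to the calendar year whose orbital
    parameters (eccentricity, obliquity, precession) are applied.
    start_year is an assumed example, not the study's actual start date."""
    return start_year + ACCELERATION * model_year
```

This is why the technique is cheap (125,000 years of forcing in 12,500 model years) and also why biases can arise where the deep ocean, with its multi-centennial adjustment timescales, cannot keep pace with the accelerated forcing.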
Abstract:
Otto-von-Guericke-Universität Magdeburg, Fakultät für Verfahrens- und Systemtechnik, Dissertation, 2016