959 results for Simulation tools
Abstract:
Spreadsheet for Creative City Index 2012
Abstract:
The decisions people make about medical treatments have a great impact on their lives. Health care practitioners, providers and patients often make decisions about medical treatments without a complete understanding of the circumstances. The main reason for this is that medical data are held in fragmented, disparate and heterogeneous data silos. Without a centralised data warehouse structure to integrate these silos, it is impractical for users to obtain all the information required in time to make a correct decision. In this research paper, a clinical data integration approach using SAS Clinical Data Integration Server tools is presented.
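As a hedged illustration of the silo-integration idea (not the SAS Clinical Data Integration Server workflow itself), the sketch below consolidates two hypothetical patient-data silos with mismatched schemas into a single warehouse-style table; all table and column names are invented for the example.

```python
# Minimal sketch of clinical data silo integration using pandas.
# The paper uses SAS Clinical Data Integration Server; this Python
# analogue only illustrates the schema-harmonisation-and-merge idea.
import pandas as pd

# Two hypothetical silos holding fragments of the same patients' records.
labs = pd.DataFrame({"patient_id": [1, 2], "hb_g_dl": [13.1, 10.4]})
pharmacy = pd.DataFrame({"pid": [1, 2], "drug": ["metformin", "warfarin"]})

# Harmonise the key names, then consolidate into one integrated view so a
# clinician sees all available information for each patient at once.
pharmacy = pharmacy.rename(columns={"pid": "patient_id"})
warehouse = labs.merge(pharmacy, on="patient_id", how="outer")
print(warehouse)
```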
Abstract:
Deterministic computer simulation of physical experiments is now a common technique in science and engineering. Often, physical experiments are too time-consuming, expensive or impossible to conduct. Running complex computer models, or codes, in place of physical experiments has led to the study of computer experiments, which are used to investigate many scientific phenomena. A computer experiment consists of a number of runs of the computer code with different input choices. The Design and Analysis of Computer Experiments is a rapidly growing technique in statistical experimental design. This paper discusses some practical issues that arise when designing computer simulations and experiments for manufacturing systems. A case study approach is reviewed and presented.
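A common concrete step in designing such a computer experiment is choosing a space-filling set of input runs; the sketch below generates a Latin hypercube design with SciPy. The abstract does not prescribe a particular design, so the factor count and ranges here are illustrative assumptions.

```python
# Illustrative space-filling design for a computer experiment:
# a Latin hypercube over three hypothetical input factors.
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=3, seed=42)  # 3 input factors
unit_design = sampler.random(n=10)          # 10 runs in the unit cube [0, 1)^3

# Rescale each factor to its engineering range (bounds are made up here).
design = qmc.scale(unit_design, l_bounds=[0.1, 10.0, 300.0],
                   u_bounds=[0.9, 50.0, 900.0])
print(design)  # one row per planned simulation run
```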
Abstract:
This thesis investigates how modern individuals relate to themselves and others in the service of shaping their ethical conduct and governing themselves. It considers the use of online social networking sites (SNSs) as one particular practice through which people manage their day-to-day conduct and understandings of self. Current research on the use of SNSs has conceptualised them as tools for communication, information-sharing and self-presentation. This thesis suggests a different way of thinking about these sites: as tools for self-formation. A Foucaultian genealogical, historical and problematising approach is applied in order to explore the processes of subjectivation and historical backgrounds involved in the use of SNSs. This is complemented with an ANT-based understanding of the role that technologies play in shaping human action. Drawing new connections between three factors, the thesis shows how they contribute to the ways in which people become selves today. These factors are: first, the psychologisation and rationalisation of modern life, which lead people to confess and talk about themselves in order to improve and perfect themselves; second, the transparency or publicness of modern life, which incites people to reveal themselves constantly to a public audience; and third, the techno-social hybrid character of Western societies. This thesis will show how some older practices of self-formation have been translated into the context of modern technologised societies and how the care of self has been reinvigorated and combined with the notion of baring the self in public. This thesis contributes a different way of thinking about the self and the internet that does not seek to define what the modern self is and how it is staged online, but rather accounts for the multiple, contingent and historically conditioned processes of subjectivation through which individuals relate to themselves and others in the service of governing their daily conduct.
Abstract:
In this paper, the initial stage of film assembly by energetic C36 fullerenes on the diamond (001)–(2 × 1) surface at low temperature was investigated by molecular dynamics simulation using the Brenner potential. The incident energy was first uniformly distributed within the interval of 20–50 eV, which is known to be the optimum energy range for chemisorption of a single C36 on the diamond (001) surface. More than one hundred C36 cages were impacted one after the other onto the diamond surface, with their orientations and impact positions relative to the surface selected at random. The films were found to grow in a three-dimensional island mode, with the deposited C36 acting as building blocks. The study of film morphology shows that the C36 retains the structure of a free cage, which is consistent with Low Energy Cluster Beam Deposition (LECBD) experiments. The adlayer is composed of many C36 monomers as well as covalently bonded C36 dimers and trimers, quite different from a C20 fullerene-assembled film, in which a long polymer-like chain is observed owing to the stronger interaction between C20 cages. In addition, the chemisorption probability of C36 fullerenes decreases with increasing coverage because the interaction between these clusters is weaker than that between the cluster and the surface. When the incident energy is increased to 40–65 eV, the chemisorption probability is found to increase, and more dimers and trimers as well as polymer-like C36 are observed on the deposited films. Furthermore, the C36 film also shows high thermal stability even when the temperature is raised to 1500 K.
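The impact protocol described (sequential cages with random orientation, random lateral position, and incident energy in the stated window) can be sketched as below; the cell size, drop height and placeholder cage coordinates are assumptions, and the Brenner-potential dynamics themselves are omitted.

```python
# Sketch of the sequential-impact setup: each C36 cage receives a random
# orientation, a random lateral impact position and an incident energy
# drawn from the 20-50 eV window. The actual Brenner-potential dynamics
# would be handled by an MD engine, which is not shown here.
import numpy as np
from scipy.spatial.transform import Rotation

rng = np.random.default_rng(0)
cage = rng.normal(size=(36, 3))          # placeholder C36 coordinates (angstroms)

for impact in range(100):                # the paper impacts >100 cages in turn
    energy_ev = rng.uniform(20.0, 50.0)  # optimum chemisorption window
    orientation = Rotation.random(random_state=rng)
    xy = rng.uniform(0.0, 25.0, size=2)  # lateral position in an assumed cell
    incoming = orientation.apply(cage) + np.array([xy[0], xy[1], 30.0])
    # ... pass `incoming` and `energy_ev` to the MD integrator here ...
```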
Abstract:
The deposition of small metal clusters (Cu, Au and Al) on f.c.c. metals (Cu, Au and Ni) has been studied by molecular dynamics simulation using the Finnis–Sinclair (FS) potential. The impact energy was varied from 0.01 to 10 eV/atom. First, the deposition of a single cluster was simulated. We observed that, even at very low energies, a small cluster with icosahedral (Ih) symmetry was reconstructed to match the substrate structure (f.c.c.) after deposition. Next, clusters were modelled to drop, one after the other, onto the surface. A nanostructure was formed by soft landing of Au clusters on Cu with increasing coverage, where interfacial energy dominates. At relatively higher deposition energies (a few eV), an ordered f.c.c.-like structure was observed in the first adlayer of the film formed by Al clusters deposited on a Ni substrate; this characteristic is mainly attributable to ballistic collision. Our results indicate that the surface morphology synthesized by cluster deposition can be controlled by experimental parameters, which will be helpful for the controlled design of nanostructures.
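For reference, the Finnis–Sinclair family of potentials combines a pairwise repulsion with a square-root embedding term; the toy sketch below shows that functional form with invented pair functions, not the fitted FS parameters for Cu, Au or Ni.

```python
# Toy illustration of the Finnis-Sinclair functional form:
#   E = sum_i [ 0.5 * sum_j V(r_ij) - A * sqrt(rho_i) ],  rho_i = sum_j phi(r_ij)
# V, phi and A below are placeholders, not fitted FS parameters.
import numpy as np

def fs_energy(positions, A=1.0, cutoff=4.0):
    dists = np.linalg.norm(positions[:, None] - positions[None, :], axis=-1)
    mask = (dists > 0.0) & (dists < cutoff)           # neighbours within cutoff
    V = np.where(mask, (dists - cutoff) ** 2, 0.0)    # toy repulsive pair term
    phi = np.where(mask, (dists - cutoff) ** 2, 0.0)  # toy density contribution
    rho = phi.sum(axis=1)
    return 0.5 * V.sum() - (A * np.sqrt(rho)).sum()   # pair + embedding terms

positions = np.random.default_rng(1).uniform(0.0, 3.0, size=(8, 3))
print(fs_energy(positions))
```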
Abstract:
Introduction: Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo (MC) methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the ‘gold standard’ for predicting dose deposition in the patient [1]. This project has three main aims: 1. To develop tools that enable the transfer of treatment plan information from the treatment planning system (TPS) to a MC dose calculation engine. 2. To develop tools for comparing the 3D dose distributions calculated by the TPS and the MC dose engine. 3. To investigate the radiobiological significance of any differences between the TPS patient dose distribution and the MC dose distribution in terms of Tumour Control Probability (TCP) and Normal Tissue Complication Probability (NTCP). The work presented here addresses the first two aims.
Methods: (1a) Plan Importing: A database of commissioned accelerator models (Elekta Precise and Varian 2100CD) has been developed for treatment simulations in the MC system (EGSnrc/BEAMnrc). Beam descriptions can be exported from the TPS using the widespread DICOM framework, and the resultant files are parsed with the assistance of a software library (PixelMed Java DICOM Toolkit). The information in these files (such as the monitor units, the jaw positions and the gantry orientation) is used to construct a plan-specific accelerator model which allows an accurate simulation of the patient treatment field. (1b) Dose Simulation: The calculation of a dose distribution requires patient CT images, which are prepared for the MC simulation using a tool (CTCREATE) packaged with the system. Beam simulation results are converted to absolute dose per MU using calibration factors recorded during the commissioning process and treatment simulation. These distributions are combined according to the MU meter settings stored in the exported plan to produce an accurate description of the prescribed dose to the patient. (2) Dose Comparison: TPS dose calculations can be obtained using either a DICOM export or direct retrieval of binary dose files from the file system. Dose difference, gamma evaluation and normalised dose difference algorithms [2] were employed for the comparison of the TPS dose distribution and the MC dose distribution. These implementations are independent of spatial resolution and able to interpolate for comparisons.
Results and Discussion: The tools successfully produced Monte Carlo input files for a variety of plans exported from the Eclipse (Varian Medical Systems) and Pinnacle (Philips Medical Systems) planning systems, ranging in complexity from a single uniform square field to a five-field step-and-shoot IMRT treatment. The simulation of collimated beams has been verified geometrically, and validation of dose distributions in a simple body phantom (QUASAR) will follow. The developed dose comparison algorithms have also been tested with controlled dose distribution changes.
Conclusion: The capability of the developed code to independently process treatment plans has been demonstrated. A number of limitations exist: only static fields are currently supported (dynamic wedges and dynamic IMRT will require further development), and the process has not been tested for planning systems other than Eclipse and Pinnacle. The tools will be used to independently assess the accuracy of the current treatment planning system dose calculation algorithms for complex treatment deliveries such as IMRT, in treatment sites where patient inhomogeneities are expected to be significant.
Acknowledgements: Computational resources and services used in this work were provided by the HPC and Research Support Group, Queensland University of Technology, Brisbane, Australia. Pinnacle dose parsing was made possible with the help of Paul Reich, North Coast Cancer Institute, North Coast, New South Wales.
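Of the three comparison algorithms named above, the gamma evaluation is the most involved; the sketch below shows the idea in one dimension with a brute-force search (global 3%/3 mm criteria assumed). Unlike the implementations described in the abstract, it neither interpolates nor handles differing grid resolutions.

```python
# One-dimensional, brute-force gamma evaluation: each reference point
# passes if some evaluated point is close in both distance and dose.
import numpy as np

def gamma_1d(ref, evl, spacing_mm, dta_mm=3.0, dd_frac=0.03):
    x = np.arange(len(ref)) * spacing_mm
    dd = dd_frac * ref.max()                 # global dose-difference criterion
    gamma = np.empty(len(ref))
    for i in range(len(ref)):
        dist2 = ((x - x[i]) / dta_mm) ** 2   # distance-to-agreement term
        dose2 = ((evl - ref[i]) / dd) ** 2   # dose-difference term
        gamma[i] = np.sqrt((dist2 + dose2).min())
    return gamma                             # pass criterion: gamma <= 1

ref = np.exp(-0.5 * ((np.arange(100) - 50.0) / 15.0) ** 2)
print((gamma_1d(ref, np.roll(ref, 1), spacing_mm=1.0) <= 1.0).mean())
```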
Abstract:
This paper demonstrates, following Vygotsky, that language and tool use have a critical role in the collaborative problem-solving behaviour of school-age children. It reports original ethnographic classroom research examining the convergence of speech and practical activity in children’s collaborative problem solving on robotics programming tasks. The researchers analysed children’s interactions during a series of problem-solving experiments in which Lego Mindstorms toolsets were used by teachers to create robotics design challenges for 24 students in a Year 4 Australian classroom (students aged 8.5–9.5 years). The design challenges were incrementally difficult, beginning with basic programming of straight-line movement and progressing to more complex challenges in which the robots were programmed, acting as pulleys with string and recycled materials, to raise Lego figures from conduit pipes. Data collection involved micro-genetic analysis of students’ speech interactions with tools, peers and other experts, teacher interviews, and student focus group data. Coding the repeated patterns in the transcripts, the authors outline the structure of the children’s social speech in joint problem solving, demonstrating the patterns of speech and interaction that play an important role in the socialisation of the school-age child’s practical intellect.
Abstract:
Different types of defects can be introduced into graphene during material synthesis and significantly influence its properties. In this work, we investigated the effects of structural defects, edge functionalisation and edge reconstruction on the fracture strength and morphology of graphene by molecular dynamics simulations. Minimum energy path analysis was conducted to investigate the formation of Stone-Wales defects. We also employed out-of-plane perturbation and the energy minimisation principle to study the possible morphologies of edge-terminated graphene nanoribbons. Our numerical results show that the fracture strength of graphene depends on defects and environmental temperature. However, pre-existing defects may be healed, resulting in strength recovery. Edge functionalisation can induce compressive stress and ripples in the edge areas of graphene nanoribbons, whereas edge reconstruction contributes to tensile stress and a curved shape in the nanoribbons.
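In MD tensile tests of this kind, the fracture strength is conventionally read off as the peak of the stress-strain curve; the sketch below shows only that extraction step, on a synthetic curve rather than data from the paper.

```python
# Reading fracture strength off a (synthetic) stress-strain curve:
# the strength is the peak stress reached before failure.
import numpy as np

strain = np.linspace(0.0, 0.25, 200)
stress = 350.0 * strain * np.exp(-(strain / 0.15) ** 4)  # toy curve, in GPa

peak = stress.argmax()
print(f"fracture strength ~ {stress[peak]:.1f} GPa at strain {strain[peak]:.3f}")
```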
Abstract:
This article reports on the development of online assessment tools for disengaged youth in flexible learning environments. Sociocultural theories of learning and assessment and Bourdieu’s sociological concepts of capital and exchange were used to design a purpose-built content management system. This design experiment engaged participants in assessment that led to the exchange of self, peer and teacher judgements for credentialing. This collaborative approach required students and teachers to adapt and amend social networking practices for students to submit and judge their own and others’ work using comments, ratings, keywords and tags. Students and teachers refined their evaluative expertise across contexts, and negotiated meanings and values of digital works, which gave rise to revised versions and emergent assessment criteria. By combining social networking tools with sociological models of capital, assessment activities related to students’ digital productions were understood as valuations and judgements within an emergent, negotiable social field of exchange.
Abstract:
The first objective of this project is to develop new efficient numerical methods, with supporting error and convergence analysis, for solving fractional partial differential equations to study anomalous diffusion in biological tissue such as the human brain. The second objective is to develop a new efficient fractional-derivative-based approach for texture enhancement in image processing. The results of the thesis highlight that fractional-order analysis captures important features of nuclear magnetic resonance (NMR) relaxation and can be used to improve the quality of medical imaging.
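One standard building block for such numerical methods is the Grünwald–Letnikov approximation of a fractional derivative; the sketch below computes its weights and checks the result against the known Riemann–Liouville derivative of x². This is a generic textbook scheme, not necessarily the thesis’s method.

```python
# Grunwald-Letnikov approximation of a fractional derivative of order alpha:
#   D^alpha f(x) ~ h^(-alpha) * sum_k w_k * f(x - k*h),  w_k = (-1)^k C(alpha, k)
from math import gamma
import numpy as np

def gl_weights(alpha, n):
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)  # recurrence for (-1)^k C(alpha, k)
    return w

h, alpha = 0.01, 0.8
x = np.arange(0.0, 2.0, h)
f = x ** 2
w = gl_weights(alpha, len(x))

# Fractional derivative at the right end of the grid; f[::-1][k] = f(x_end - k*h).
approx = (w * f[::-1]).sum() / h ** alpha
exact = 2.0 / gamma(3.0 - alpha) * x[-1] ** (2.0 - alpha)  # RL derivative of x^2
print(approx, exact)
```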
Abstract:
This thesis explored the development of statistical methods to support the monitoring and improvement of the quality of treatment delivered to patients undergoing coronary angioplasty procedures. To achieve this goal, a suite of outcome measures was identified to characterise the performance of the service; statistical tools were developed to monitor the various indicators; and measures to strengthen governance processes were implemented and validated. Although this work focused on the pursuit of these aims in the context of an angioplasty service located at a single clinical site, the tools and techniques were developed mindful of their potential application to other clinical specialties and a wider, potentially national, scope.
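One widely used statistical tool for monitoring a stream of clinical outcomes is the CUSUM chart; the sketch below monitors a Bernoulli adverse-event indicator with assumed in-control and out-of-control rates and an assumed signalling threshold, none of which come from the thesis.

```python
# Bernoulli CUSUM for an adverse-event indicator: the statistic accumulates
# log-likelihood-ratio evidence and signals when it crosses the threshold h.
import numpy as np

def bernoulli_cusum(outcomes, p0=0.05, p1=0.10, h=4.0):
    w1 = np.log(p1 / p0)              # increment on an adverse event (y = 1)
    w0 = np.log((1 - p1) / (1 - p0))  # (negative) increment on a good outcome
    s, signals = 0.0, []
    for t, y in enumerate(outcomes):
        s = max(0.0, s + (w1 if y else w0))
        if s > h:
            signals.append(t)         # chart flags a possible rate increase
            s = 0.0                   # restart monitoring after a signal
    return signals

rng = np.random.default_rng(7)
outcomes = rng.random(500) < 0.08     # simulated outcome stream (8% event rate)
print(bernoulli_cusum(outcomes))
```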
Abstract:
Worksite wellness efforts can generate enormous health-care savings. Many of the methods available for obtaining health and wellness measures can be confusing and lack clarity; for example, it can be difficult to determine whether measures are appropriate for individuals or for population health. Come along and enjoy a hands-on learning experience about these measures and about better understanding health and wellness outcomes at baseline, midway and beyond.