934 results for dynamic methods
Abstract:
The appearance of poststructuralism as a research methodology in the public health literature raises questions about the history and purpose of this research. We examine (a) some aspects of the history of qualitative methods and their place within larger social and research domains, and (b) the purposes of public health research that employs poststructuralist philosophy, delineating the methodological issues that require consideration in positing a poststructural analysis. We argue against poststructuralism becoming a research methodology deployed to seize the public health debate, rather than being employed for its own particular critical strengths.
Abstract:
This dissertation analyses how physical objects are translated into digital artworks using techniques which can lead to ‘imperfections’ in the resulting digital artwork that are typically removed to arrive at a ‘perfect’ final representation. The dissertation discusses the adaptation of existing techniques into an artistic workflow that acknowledges and incorporates the imperfections of translation into the final pieces. It presents an exploration of the relationship between physical and digital artefacts and the processes used to move between the two. The work explores the ‘craft’ of digital sculpting and the technology used in producing what the artist terms ‘a naturally imperfect form’, incorporating knowledge of traditional sculpture, an understanding of anatomy and an interest in the study of bones (osteology). The outcomes of the research are presented as a series of digital sculptural works, exhibited as a collection of curiosities in multiple mediums, including interactive game spaces, augmented reality (AR), rapid prototype prints (RP) and video displays.
Abstract:
Learning capability (LC) is a special dynamic capability that a firm purposefully builds to develop a cognitive focus, so as to enable the configuration and improvement of other capabilities (both dynamic and operational) to create and respond to market changes. Empirical evidence regarding the essential role of LC in leveraging operational manufacturing capabilities is, however, limited in the literature. This study takes a routine-based approach to understanding capability, and focuses on demonstrating the leveraging power of LC over two essential operational capabilities within the manufacturing context: operational new product development capability (ONPDC) and operational supplier integration capability (OSIC). A mixed-methods research framework was used, combining sources of evidence derived from a survey study and a multiple case study. The study identified high-level routines of LC that can be designed and controlled by managers and practitioners to reconfigure the underlying routines of ONPDC and OSIC and achieve superior performance in a turbulent environment. Hence, the study advances the notion of knowledge-based dynamic capabilities, such as LC, as routine bundles. It also provides an impetus for managing manufacturing operations from a capability-based perspective in the fast-changing knowledge era.
Abstract:
In the modern built environment, building construction and demolition consume a large amount of energy and emit greenhouse gases, largely due to widely used conventional construction materials such as reinforced and composite concrete. These materials consume large amounts of natural resources and possess high embodied energy, and more energy is required to recycle or reuse them at the cessation of use. It is therefore important to use recyclable or reusable new materials in building construction in order to conserve natural resources and reduce the energy and emissions associated with conventional materials. Advancements in materials technology have resulted in the introduction of new composite and hybrid materials in infrastructure construction as alternatives to conventional materials. This research project developed a lightweight, prefabricatable Hybrid Composite Floor Plate System (HCFPS) as an alternative to conventional floor systems, with desirable properties: it is easy to construct, economical, demountable, recyclable and reusable. Component materials of HCFPS include a central polyurethane (PU) core, outer layers of glass-fibre reinforced cement (GRC) and steel laminates in the tensile regions. This research explored the structural adequacy and performance characteristics of hybridised GRC, PU and steel laminate for the development of HCFPS. Performance characteristics of HCFPS were investigated using Finite Element (FE) simulations supported by experimental testing. Parametric studies were conducted to develop the HCFPS to satisfy static performance requirements, using sectional configurations, spans, loading and material properties as the parameters. The dynamic response of HCFPS floors was investigated through parametric studies using material properties, walking frequency and damping as the parameters. The findings show that HCFPS can be used in office and residential buildings to provide acceptable static and dynamic performance. Design guidelines were developed for this new floor system. HCFPS is easy to construct and economical compared to conventional floor systems because it is lightweight and prefabricatable, and it can be demounted and reused or recycled at the cessation of use due to its component materials.
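To make concrete the kind of dynamic-response check such parametric studies involve, here is a minimal sketch that estimates the fundamental natural frequency of a simply supported floor strip from its flexural stiffness and distributed mass; all numbers are invented placeholders, not values from the thesis.

```python
import math

def fundamental_frequency(span_m, EI_Nm2, mass_kg_per_m):
    """Fundamental frequency (Hz) of a simply supported uniform strip:
    f1 = (pi / 2) * sqrt(EI / (m * L^4))."""
    return (math.pi / 2.0) * math.sqrt(EI_Nm2 / (mass_kg_per_m * span_m**4))

# Hypothetical lightweight floor panel (placeholder properties only).
f1 = fundamental_frequency(span_m=6.0, EI_Nm2=8.0e6, mass_kg_per_m=120.0)
print(f"f1 = {f1:.1f} Hz")
# A fundamental frequency well above the walking frequency (~1.5-2.5 Hz)
# and its first few harmonics is what such dynamic checks aim for.
```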
Abstract:
1. Autonomous acoustic recorders are widely available and can provide a highly efficient method of species monitoring, especially when coupled with software to automate data processing. However, the adoption of these techniques is restricted by a lack of direct comparisons with existing manual field surveys. 2. We assessed the performance of autonomous methods by comparing manual and automated examination of acoustic recordings with a field-listening survey, using commercially available autonomous recorders and custom call detection and classification software. We compared the detection capability, time requirements, areal coverage and weather condition bias of these three methods using an established call monitoring programme for a nocturnal bird, the little spotted kiwi (Apteryx owenii). 3. The autonomous recorder methods had very high precision (>98%) and required <3% of the time needed for the field survey. They were less sensitive, with visual spectrogram inspection recovering 80% of the total calls detected and automated call detection recovering 40%, although this recall increased with signal strength. The areal coverage of the spectrogram inspection and automatic detection methods was 85% and 42% of the field survey, respectively. The methods using autonomous recorders were more adversely affected by wind and did not show the positive association between ground moisture and call rates that was apparent from the field counts. However, all methods produced the same result for the most important conservation information from the survey: the annual change in calling activity. 4. Autonomous monitoring techniques incur different biases from manual surveys and so can yield different ecological conclusions if sampling is not adjusted accordingly. Nevertheless, the sensitivity, robustness and high accuracy of automated acoustic methods demonstrate that they offer a suitable and extremely efficient alternative to field observer point counts for species monitoring.
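The precision and recall figures quoted above follow the standard definitions; a minimal sketch with made-up counts (not the study's data) shows how they are computed:

```python
def precision_recall(true_pos, false_pos, false_neg):
    """Precision = TP / (TP + FP); recall = TP / (TP + FN)."""
    precision = true_pos / (true_pos + false_pos)
    recall = true_pos / (true_pos + false_neg)
    return precision, recall

# Hypothetical detector: 400 of 1000 real calls found, 5 false alarms.
p, r = precision_recall(true_pos=400, false_pos=5, false_neg=600)
print(f"precision = {p:.1%}, recall = {r:.1%}")  # ~98.8%, 40.0%
```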
Abstract:
Ambiguity resolution plays a crucial role in real-time kinematic GNSS positioning, which yields centimetre-precision positioning results if all the ambiguities in each epoch are correctly fixed to integers. However, incorrectly fixed ambiguities can result in positioning offsets of up to several metres without notice. Hence, ambiguity validation is essential to control the quality of ambiguity resolution. Currently, the most popular ambiguity validation method is the ratio test, whose criterion is often determined empirically. An empirically determined criterion can be dangerous, because a fixed criterion cannot fit all scenarios and does not directly control the ambiguity resolution risk. In practice, depending on the underlying model strength, the ratio test criterion can be too conservative for some models and too risky for others. A more rational approach is to determine the criterion according to the underlying model and the user requirement. Undetected incorrect integers lead to hazardous results and should be strictly controlled; in ambiguity resolution, this miss-detection rate is known as the failure rate. In this paper, a fixed failure rate ratio test method is presented and applied to the analysis of GPS and Compass positioning scenarios. The fixed failure rate approach is derived from integer aperture estimation theory and is therefore theoretically rigorous. In this approach, a criteria table for the ratio test is computed from extensive data simulations, and real-time users can determine the ratio test criterion by looking up the table. The method has previously been applied to medium-distance GPS ambiguity resolution, but multi-constellation and high-dimensional scenarios have not been addressed so far. In this paper, a general ambiguity validation model is derived based on hypothesis testing theory, the fixed failure rate approach is introduced, and the relationship between the ratio test threshold and the failure rate is examined. Finally, the factors that influence the ratio test threshold in the fixed failure rate approach are discussed based on extensive data simulations. The results show that the fixed failure rate approach, combined with a proper stochastic model, is a more reasonable ambiguity validation method.
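To make the ratio test concrete, the sketch below shows the acceptance decision it describes: the integer fix is accepted only when the second-best candidate fits sufficiently worse than the best. The residual norms and threshold are illustrative placeholders; in the fixed failure rate approach the threshold would instead be looked up from the simulated criteria table for the given model and required failure rate.

```python
def ratio_test(q_best, q_second, threshold):
    """Accept the integer ambiguity fix when
        q_second / q_best >= threshold,
    where q_best and q_second are the squared residual norms of the best
    and second-best integer candidates in the ambiguity search."""
    return (q_second / q_best) >= threshold

# Illustrative values only; a fixed empirical threshold (e.g. 2 or 3) is
# what the fixed failure rate approach replaces with a model-driven value.
if ratio_test(q_best=0.8, q_second=2.9, threshold=2.0):
    print("fix accepted")
else:
    print("fall back to the float solution")
```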
Abstract:
Overview:
- Development of mixed methods research
- Benefits and challenges of “mixing”
- Different models
- Good design
- Two examples
- How to report?
- Have a go!
Abstract:
Over a seven-year period, Mark Radvan directed a suite of children’s theatre productions adapted from the original Tashi stories by Australian writers Anna and Barbara Fienberg. The Tashi Project’s repertoire of plays was performed to over 40,000 children aged between 3 and 10, and their carers, in seasons at the Out of the Box Festival, at Brisbane Powerhouse and in venues across Australia in two interstate tours in 2009 and 2010. The project investigated how best to combine an exploration of theatrical forms and conventions with a performance style evolved in a specially developed training program and a deliberate positioning of young children as audiences capable of sophisticated readings of action, symbol, theme and character. The results of this project show that when brought into an appropriate relationship with the theatre artists, young children aged 3-5 can engage with sophisticated narrative forms, and with the right contextual framing they enjoy heightened dramatic and emotional tension, bringing to the event sustained and highly engaged concentration. Older children aged 6-10 also bring sustained and heightened engagement to the same stories, provided that other more sophisticated dramatic elements, such as character, theme and style, are woven into the construction of the performances.
Abstract:
Background and purpose: The purpose of the work presented in this paper was to determine whether patient positioning and delivery errors could be detected using electronic portal images of intensity modulated radiotherapy (IMRT). Patients and methods: We carried out a series of controlled experiments delivering an IMRT beam to a humanoid phantom using both the dynamic and the multiple static field methods of delivery. The beams were imaged, the images calibrated to remove the IMRT fluence variation, and then compared with calibrated images of the reference beams without any delivery or position errors. The first set of experiments involved translating the position of the phantom both laterally and in a superior/inferior direction by distances of 1, 2, 5 and 10 mm. The phantom was also rotated by 1° and 2°. For the second set of measurements the phantom position was kept fixed and delivery errors were introduced to the beam, in the form of leaf position and segment intensity errors. Results: The method was able to detect shifts in the phantom position of 1 mm, leaf position errors of 2 mm, and dosimetry errors of 10% on a single segment of a 15-segment step-and-shoot IMRT delivery (significantly less than 1% of the total dose). Conclusions: The results of this work show that imaging the IMRT beam and calibrating the images to remove the intensity modulations could be a useful tool for verifying both the patient position and the delivery of the beam.
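A minimal sketch of the comparison step described above: subtract a calibrated reference image from the measured one and flag pixels whose relative deviation exceeds a tolerance. The array names and the tolerance value are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def flag_deviations(measured, reference, tolerance=0.10):
    """Return a boolean mask of pixels whose relative difference from the
    calibrated reference image exceeds `tolerance`."""
    rel_diff = (measured - reference) / np.maximum(reference, 1e-9)
    return np.abs(rel_diff) > tolerance

# Hypothetical portal images: a delivery error in one segment shows up
# as a contiguous cluster of flagged pixels rather than random noise.
ref = np.random.default_rng(0).uniform(0.5, 1.0, size=(256, 256))
meas = ref.copy()
meas[100:120, 100:140] *= 0.9          # simulate a 10% local dose error
print(flag_deviations(meas, ref, 0.08).sum(), "pixels flagged")
```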
Abstract:
The impact-induced deposition of Al13 clusters with icosahedral structure on a Ni(0 0 1) surface was studied by molecular dynamics (MD) simulation using Finnis–Sinclair potentials. The incident kinetic energy (Ein) ranged from 0.01 to 30 eV per atom. The structural and dynamical properties of Al clusters on Ni surfaces were found to depend strongly on the impact energy. At the lowest energies, the Al cluster deposited on the surface as an intact cluster, although the original icosahedral structure transformed to an fcc-like one due to the interaction and the structural mismatch between the Al cluster and the Ni surface. With increasing impact energy, the cluster deformed severely on contact with the substrate and then broke up in a dense collision cascade, its atoms finally spreading across the surface. When the impact energy was higher than 11 eV, defects such as Al substitutions and Ni ejections were observed. The simulations indicate that there exists an optimum energy range suitable for layer-by-layer Al epitaxial growth. In addition, at higher impact energies, the atomic exchange between Al and Ni atoms favours surface alloying.
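For orientation, the incident kinetic energy per atom maps to an impact speed through E = ½mv²; a quick sketch using standard constants (no simulation details from the paper):

```python
# Convert incident kinetic energy per atom (eV) to impact speed for Al.
EV_TO_J = 1.602176634e-19        # joules per electronvolt
AMU_TO_KG = 1.66053906660e-27    # kilograms per atomic mass unit
m_al = 26.9815 * AMU_TO_KG       # mass of one Al atom

for e_ev in (0.01, 1.0, 11.0, 30.0):
    v = (2.0 * e_ev * EV_TO_J / m_al) ** 0.5
    print(f"{e_ev:6.2f} eV/atom -> {v / 1000.0:6.2f} km/s")
```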
Abstract:
In this paper, the collision of a C36 cluster with D6h symmetry on a diamond (001)-(2×1) surface was investigated using molecular dynamics (MD) simulation based on the semi-empirical Brenner potential. The incident kinetic energy of the C36 ranged from 20 to 150 eV per cluster, and the collision dynamics was investigated as a function of the impact energy Ein. The C36 cluster was first impacted towards the centre of two dimers with a fixed orientation. It was found that when Ein was lower than 30 eV, the C36 bounced off the surface without breaking up. At 30-45 eV, bonds formed between the C36 and surface dimer atoms, and the adsorbed C36 retained its original free-cluster structure. Around 50-60 eV, the C36 rebounded from the surface with cage defects. Above 70 eV, fragmentation both in the cluster and on the surface was observed. Our simulations support the experimental finding that during low-energy cluster beam deposition small fullerenes can keep their original structure after adsorption (the so-called memory effect), if Ein is within a certain range. Furthermore, we found that the energy threshold for chemisorption is sensitive to the orientation of the incident C36 and its impact position on the asymmetric surface.
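Note that the energies here are quoted per cluster rather than per atom as in the previous abstract; a one-line conversion (illustrative arithmetic only) shows the per-atom scale is far lower:

```python
# 20-150 eV per C36 cluster corresponds to only ~0.6-4.2 eV per atom.
for e_cluster in (20, 30, 50, 70, 150):
    print(f"{e_cluster:4d} eV/cluster = {e_cluster / 36:.2f} eV/atom")
```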
Abstract:
This thesis takes a step forward in improving the stability of power systems by applying new control and modelling techniques. The developed methods use data obtained from voltage angle measurement devices, synchronized with GPS signals, to stabilize the system and avoid system-wide blackouts in the event of severe faults. New approaches were developed for identifying and estimating reduced dynamic system models using phasor measurement units. The main goal of the research was achieved by integrating the developed methods into a feasible wide-area control system for stabilizing power systems.
Abstract:
Introduction: Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo (MC) methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the ‘gold standard’ for predicting dose deposition in the patient [1]. This project has three main aims: 1. To develop tools that enable the transfer of treatment plan information from the treatment planning system (TPS) to a MC dose calculation engine. 2. To develop tools for comparing the 3D dose distributions calculated by the TPS and the MC dose engine. 3. To investigate the radiobiological significance of any differences between the TPS patient dose distribution and the MC dose distribution in terms of Tumour Control Probability (TCP) and Normal Tissue Complication Probability (NTCP). The work presented here addresses the first two aims.
Methods: (1a) Plan importing: A database of commissioned accelerator models (Elekta Precise and Varian 2100CD) has been developed for treatment simulations in the MC system (EGSnrc/BEAMnrc). Beam descriptions can be exported from the TPS using the widespread DICOM framework, and the resultant files are parsed with the assistance of a software library (PixelMed Java DICOM Toolkit). The information in these files (such as the monitor units, the jaw positions and the gantry orientation) is used to construct a plan-specific accelerator model which allows an accurate simulation of the patient treatment field.
(1b) Dose simulation: The calculation of a dose distribution requires patient CT images, which are prepared for the MC simulation using a tool (CTCREATE) packaged with the system. Beam simulation results are converted to absolute dose per MU using calibration factors recorded during the commissioning process and treatment simulation. These distributions are combined according to the MU meter settings stored in the exported plan to produce an accurate description of the prescribed dose to the patient.
(2) Dose comparison: TPS dose calculations can be obtained using either a DICOM export or direct retrieval of binary dose files from the file system. Dose difference, gamma evaluation and normalised dose difference algorithms [2] were employed for the comparison of the TPS and MC dose distributions. These implementations are independent of the spatial resolution and able to interpolate for comparisons.
Results and Discussion: The tools successfully produced Monte Carlo input files for a variety of plans exported from the Eclipse (Varian Medical Systems) and Pinnacle (Philips Medical Systems) planning systems, ranging in complexity from a single uniform square field to a five-field step-and-shoot IMRT treatment. The simulation of collimated beams has been verified geometrically, and validation of dose distributions in a simple body phantom (QUASAR) will follow. The developed dose comparison algorithms have also been tested with controlled dose distribution changes.
Conclusion: The capability of the developed code to independently process treatment plans has been demonstrated.
A number of limitations exist: only static fields are currently supported (dynamic wedges and dynamic IMRT will require further development), and the process has not been tested for planning systems other than Eclipse and Pinnacle. The tools will be used to independently assess the accuracy of the current treatment planning system dose calculation algorithms for complex treatment deliveries such as IMRT, in treatment sites where patient inhomogeneities are expected to be significant.
Acknowledgements: Computational resources and services used in this work were provided by the HPC and Research Support Group, Queensland University of Technology, Brisbane, Australia. Pinnacle dose parsing was made possible with the help of Paul Reich, North Coast Cancer Institute, North Coast, New South Wales.
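Of the comparison algorithms mentioned, the gamma evaluation [2] is the most involved; below is a simplified one-dimensional sketch under the common 3%/3 mm criteria. It is a brute-force illustration of the metric only, not the resolution-independent implementation described above.

```python
import numpy as np

def gamma_1d(dose_eval, dose_ref, x_mm, dose_tol=0.03, dist_tol_mm=3.0):
    """Simplified 1D gamma index: for each reference point, the minimum
    over evaluated points of sqrt((dx/dist_tol)^2 + (dD/dose_tol)^2),
    with dose differences normalised to the reference maximum.
    gamma <= 1 means the point passes the criterion."""
    d_max = dose_ref.max()
    gammas = np.empty_like(dose_ref)
    for i, (xi, di) in enumerate(zip(x_mm, dose_ref)):
        dist = (x_mm - xi) / dist_tol_mm
        diff = (dose_eval - di) / (dose_tol * d_max)
        gammas[i] = np.sqrt(dist**2 + diff**2).min()
    return gammas

# Hypothetical profiles on a 1 mm grid: a steep field edge shifted 2 mm.
x = np.arange(0.0, 100.0, 1.0)
ref = 1.0 / (1.0 + np.exp(-(x - 50.0)))
ev = 1.0 / (1.0 + np.exp(-(x - 52.0)))
print("max gamma:", float(gamma_1d(ev, ref, x).max()))  # < 1 -> passes
```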