Abstract:
Skin temperature (Tsk) is an important physiological outcome measure in athletic therapy and sports medicine research. Various methods of recording Tsk, including thermistors, thermocouples and thermochrons, are currently used for research purposes. These techniques are constrained by wires limiting the freedom of the subject, slow response times, and/or sensors falling off. Furthermore, as these products are typically attached directly to the skin and cover the measurement site, their validity may be questionable. This manuscript addresses the use and potential benefits of thermal imaging (TI) in sports medicine research. Non-contact infrared TI offers a quick, non-invasive, portable and athlete-friendly method of assessing Tsk. TI is a useful Tsk diagnostic tool that has the potential to become an integral part of sports medicine research in the future. Furthermore, as the technique is non-contact, it has several advantages over existing methods of recording skin temperature.
Abstract:
Purpose: Electronic portal imaging devices (EPIDs) are available with most linear accelerators (Antonuk, 2002), the current technology being amorphous silicon flat-panel imagers. EPIDs are currently used routinely for patient positioning before radiotherapy treatments. There has been increasing interest in using EPID technology for dosimetric verification of radiotherapy treatments (van Elmpt, 2008). A straightforward technique involves using the EPID panel to measure the fluence exiting the patient during a treatment, which is then compared to a prediction of the fluence based on the treatment plan. However, a number of significant limitations exist in this method, resulting in its limited proliferation in clinical environments. In this paper, we present a technique for simulating IMRT fields using Monte Carlo to predict the dose in an EPID, which can then be compared to the measured dose in the EPID. Materials: Measurements were made using an iView GT flat-panel a-Si EPID mounted on an Elekta Synergy linear accelerator. The images from the EPID were acquired using the XIS software (Heimann Imaging Systems). Monte Carlo simulations were performed using the BEAMnrc and DOSXYZnrc user codes. The IMRT fields to be delivered were taken from the treatment planning system in DICOM-RT format and converted into BEAMnrc and DOSXYZnrc input files using an in-house application (Crowe, 2009). Additionally, all image processing and analysis was performed using another in-house application written in the Interactive Data Language (IDL, ITT Visual Information Systems). Comparison between the measured and Monte Carlo EPID images was performed using a gamma analysis (Low, 1998) incorporating dose-difference and distance-to-agreement criteria. Results: The fluence maps recorded by the EPID were found to provide good agreement between measured and simulated data. Figure 1 shows an example of measured and simulated IMRT dose images and profiles in the x and y directions. References: D. A. Low et al., "A technique for the quantitative evaluation of dose distributions", Med Phys 25(5), May 1998. S. Crowe, T. Kairn, A. Fielding, "The development of a Monte Carlo system to verify radiotherapy treatment dose calculations", Radiotherapy & Oncology 92(Suppl. 1), August 2009, S71.
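The gamma analysis cited above (Low, 1998) compares each reference (measured) point against nearby evaluated (simulated) points, combining a dose-difference criterion with a distance-to-agreement (DTA) criterion; a point passes when the minimum combined metric is at most 1. Below is a minimal Python sketch of that comparison under common 3%/3 mm criteria; the array contents and pixel spacing are illustrative placeholders, not the study's data.

```python
import numpy as np

def gamma_index(dose_ref, dose_eval, spacing_mm, dose_crit=0.03, dist_crit_mm=3.0):
    """Brute-force 2D gamma analysis in the spirit of Low et al. (1998).

    dose_ref, dose_eval : 2D dose arrays on the same grid (e.g. measured and
    simulated EPID images); spacing_mm : pixel size in mm; dose_crit : dose
    difference criterion as a fraction of the maximum reference dose;
    dist_crit_mm : distance-to-agreement criterion in mm.
    """
    dmax = dose_ref.max()
    ny, nx = dose_ref.shape
    search = int(np.ceil(3 * dist_crit_mm / spacing_mm))  # search window, pixels
    gamma = np.full_like(dose_ref, np.inf, dtype=float)
    for j in range(ny):
        for i in range(nx):
            j0, j1 = max(0, j - search), min(ny, j + search + 1)
            i0, i1 = max(0, i - search), min(nx, i + search + 1)
            jj, ii = np.mgrid[j0:j1, i0:i1]
            dist2 = ((jj - j) ** 2 + (ii - i) ** 2) * spacing_mm ** 2
            ddose2 = (dose_eval[j0:j1, i0:i1] - dose_ref[j, i]) ** 2
            g2 = dist2 / dist_crit_mm ** 2 + ddose2 / (dose_crit * dmax) ** 2
            gamma[j, i] = np.sqrt(g2.min())
    return gamma  # gamma <= 1 indicates agreement within the criteria

# Usage with synthetic stand-in images (not measured data):
ref = np.random.rand(64, 64)
ev = ref + 0.01 * np.random.randn(64, 64)
g = gamma_index(ref, ev, spacing_mm=1.0)
print("gamma pass rate:", np.mean(g <= 1.0))
```

The brute-force search is slow for large images but makes the metric explicit; production tools typically restrict or interpolate the search more cleverly.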
Abstract:
Introduction: The use of amorphous-silicon electronic portal imaging devices (a-Si EPIDs) for dosimetry is complicated by the effects of scattered radiation. In photon radiotherapy, the primary signal at the detector can be accompanied by photons scattered from linear accelerator components, detector materials, intervening air, treatment room surfaces (floor, walls, etc.) and from the patient/phantom being irradiated. Consequently, EPID measurements which purport to take scatter into account are highly sensitive to the identification of these contributions. One example of this susceptibility is the process of calibrating an EPID for use as a gauge of (radiological) thickness, where specific allowance must be made for the effect of phantom scatter on the intensity of radiation measured through different thicknesses of phantom. This is usually done via a theoretical calculation which assumes that phantom scatter is linearly related to thickness and field size. We have, however, undertaken a more detailed study of the scattering effects of fields of different dimensions when applied to phantoms of various thicknesses, in order to derive scatter-to-primary ratios (SPRs) directly from simulation results. This allows us to make a more accurate calibration of the EPID, and to assess the appropriateness of the theoretical SPR calculations. Methods: This study uses a full MC model of the entire linac-phantom-detector system, simulated using the EGSnrc/BEAMnrc codes. The Elekta linac and EPID are modelled according to specifications from the manufacturer, and the intervening phantoms are modelled as rectilinear blocks of water or plastic, with their densities set to a range of physically realistic and unrealistic values. Transmissions through these various phantoms are calculated using the dose detected in the model EPID and used in an evaluation of the field-size dependence of SPR in different media, applying a method suggested for experimental systems by Swindell and Evans [1]. These results are compared firstly with SPRs calculated using the theoretical, linear relationship between SPR and irradiated volume, and secondly with SPRs evaluated from our own experimental data. An alternative evaluation of the SPR in each simulated system is also made by modifying the BEAMnrc user code READPHSP to identify and count those particles in a given plane of the system that have undergone a scattering event. In addition to these simulations, which are designed to closely replicate the experimental setup, we also used MC models to examine the effects of varying the setup in experimentally challenging ways (changing the size of the air gap between the phantom and the EPID, changing the longitudinal position of the EPID itself). Experimental measurements used in this study were made using an Elekta Precise linear accelerator, operating at 6 MV, with an Elekta iView GT a-Si EPID. Results and Discussion: 1. Comparison with theory: With the Elekta iView EPID fixed at 160 cm from the photon source, the phantoms, when positioned isocentrically, are located 41 to 55 cm from the surface of the panel. At this geometry, a close but imperfect agreement (differing by up to 5%) can be identified between the results of the simulations and the theoretical calculations. However, this agreement can be totally disrupted by shifting the phantom out of the isocentric position.
Evidently, the allowance made for source-phantom-detector geometry by the theoretical expression for SPR is inadequate to describe the effect that phantom proximity can have on measurements made using an (infamously low-energy-sensitive) a-Si EPID. 2. Comparison with experiment: For various square field sizes and across the range of phantom thicknesses, there is good agreement between simulation data and experimental measurements of the transmissions and the derived values of the primary intensities. However, the values of SPR obtained through these simulations and measurements seem to be much more sensitive to slight differences between the simulated and real systems, leading to difficulties in producing a simulated system which adequately replicates the experimental data. (For instance, small changes to simulated phantom density make large differences to the resulting SPR.) 3. Comparison with direct calculation: By developing a method for directly counting the number of scattered particles reaching the detector after passing through the various isocentric phantom thicknesses, we show that the experimental method discussed above provides a good measure of the actual degree of scattering produced by the phantom. This calculation also permits the analysis of the scattering sources/sinks within the linac and EPID, as well as the phantom and intervening air. Conclusions: This work challenges the assumption that scatter to and within an EPID can be accounted for using a simple, linear model. The simulations discussed here are intended to contribute to a fuller understanding of the contribution of scattered radiation to the EPID images that are used in dosimetry calculations. Acknowledgements: This work is funded by the NHMRC, through a project grant, and supported by the Queensland University of Technology (QUT) and the Royal Brisbane and Women's Hospital, Brisbane, Australia. The authors are also grateful to Elekta for the provision of manufacturing specifications which permitted the detailed simulation of their linear accelerators and amorphous-silicon electronic portal imaging devices. Computational resources and services used in this work were provided by the HPC and Research Support Group, QUT, Brisbane, Australia.
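For context, the experimental method attributed above to Swindell and Evans derives the primary signal by extrapolating transmission measurements to zero field size, where phantom scatter vanishes, after which SPR follows as total/primary − 1. Below is a minimal Python sketch of that extrapolation under the assumed linear dependence of scatter on irradiated area; all transmission values are illustrative placeholders, not the study's measurements.

```python
import numpy as np

# Side lengths of square fields (cm) and the EPID signal measured through a
# fixed phantom thickness, normalised to the open-field signal.
# These numbers are hypothetical, for illustration only.
field_side_cm = np.array([5.0, 10.0, 15.0, 20.0])
transmission = np.array([0.512, 0.530, 0.549, 0.568])

# Assume total transmission grows linearly with irradiated area (side squared);
# the intercept at zero area is then the primary-only transmission.
area = field_side_cm ** 2
slope, primary = np.polyfit(area, transmission, 1)

# Scatter-to-primary ratio for each field size.
spr = transmission / primary - 1.0
for side, s in zip(field_side_cm, spr):
    print(f"{side:4.0f} cm field: SPR = {s:.3f}")
```

The abstract's point 2 suggests why such derived SPRs are fragile in practice: small errors in the fitted intercept (the primary) propagate directly into every SPR value.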
Abstract:
Introduction: The motivation for developing megavoltage (and kilovoltage) cone beam CT (MV CBCT) capabilities in the radiotherapy treatment room was primarily based on the need to improve patient set-up accuracy. There has recently been interest in using the cone beam CT data for treatment planning. Accurate treatment planning, however, requires knowledge of the electron density of the tissues receiving radiation in order to calculate dose distributions. This is obtained from CT, utilising a conversion between CT number and electron density of various tissues. The use of MV CBCT has particular advantages compared to treatment planning with kilovoltage CT in the presence of high atomic number materials, and requires the conversion of pixel values from the image sets to electron density. Therefore, a study was undertaken to characterise the pixel value to electron density relationship for the Siemens MV CBCT system, MVision, and to determine the effect, if any, of varying the number of monitor units used for acquisition. If a significant difference with the number of monitor units were seen, then separate pixel value to ED conversions might be required for each of the clinical settings. The calibration of the MV CBCT images for electron density offers the possibility of a daily recalculation of the dose distribution and the introduction of new adaptive radiotherapy treatment strategies. Methods: A Gammex Electron Density CT Phantom was imaged with the MV CBCT system. The pixel value for each of the sixteen inserts, whose electron densities relative to the background solid water ranged from 0.292 to 1.707, was determined by taking the mean value from within a region of interest centred on the insert, over 5 slices within the centre of the phantom. These results were averaged and plotted against the relative electron densities of each insert, and a linear least-squares fit was performed. This procedure was performed for images acquired with 5, 8, 15 and 60 monitor units. Results: A linear relationship between MV CBCT pixel value and ED was demonstrated for all monitor unit settings and over a range of electron densities. The number of monitor units utilised was found to have no significant impact on this relationship. Discussion: It was found that the number of MU utilised does not significantly alter the pixel value obtained for different ED materials. However, to ensure the most accurate and reproducible pixel value to ED calibration, one MU setting should be chosen and used routinely. To ensure accuracy for the clinical situation, this MU setting should correspond to that which is used clinically. If more than one MU setting is used clinically, then an average of the CT values acquired with different numbers of MU could be utilised without loss of accuracy. Conclusions: No significant differences have been shown in the pixel value to ED conversion for the Siemens MV cone beam CT unit with change in monitor units. Thus a single conversion curve could be utilised for MV CBCT treatment planning. To fully utilise MV CBCT imaging for radiotherapy treatment planning, further work will be undertaken to ensure all corrections have been made and dose calculations verified. These dose calculations may be either for treatment planning purposes or for reconstructing the delivered dose distribution from transit dosimetry measurements made using electronic portal imaging devices.
This will potentially allow the cumulative dose distribution to be determined through the patient’s multi-fraction treatment and adaptive treatment strategies developed to optimize the tumour response.
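As a concrete illustration of the calibration step described above, the sketch below fits mean insert pixel values against relative electron density and inverts the fit for planning use. Only the stated ED range (0.292 to 1.707) comes from the study; the pixel readings and the subset of inserts shown are hypothetical placeholders.

```python
import numpy as np

# Relative electron densities of a few phantom inserts (relative to water-
# equivalent background) and hypothetical mean ROI pixel values for each.
rel_ed = np.array([0.292, 0.480, 0.990, 1.000, 1.280, 1.707])
pixel = np.array([410.0, 620.0, 1010.0, 1020.0, 1290.0, 1700.0])

# Linear least-squares fit: pixel = a * ED + b, then inverted for use in
# converting MV CBCT images to electron density for dose calculation.
a, b = np.polyfit(rel_ed, pixel, 1)

def pixel_to_ed(p):
    """Convert an MV CBCT pixel value to relative electron density."""
    return (p - b) / a

print(f"fit: pixel = {a:.1f} * ED + {b:.1f}")
print("ED at pixel 900:", round(pixel_to_ed(900.0), 3))
```

In practice one such curve would be derived per acquisition protocol; the study's finding is that, for this system, the curve is insensitive to the number of monitor units used.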
Abstract:
This paper investigates engaging experienced birders, as volunteer citizen scientists, to analyze large recorded audio datasets gathered through environmental acoustic monitoring. Although audio data is straightforward to gather, automated analysis remains a challenging task; the existing expertise, local knowledge and motivation of the birder community can complement computational approaches and provide distinct benefits. We explored both the culture and practice of birders, and paradigms for interacting with recorded audio data. A variety of candidate design elements were tested with birders. This study contributes an understanding of how virtual interactions and practices can be developed to complement the existing practices of experienced birders in the physical world. In so doing, this study contributes a new approach to engagement in e-science. Whereas most citizen science projects task lay participants with discrete real-world or artificial activities, sometimes using extrinsic motivators, this approach builds on existing intrinsically satisfying practices.
Abstract:
Dehydration of food materials requires the removal of water. This removal of moisture prevents the growth and reproduction of microorganisms that cause decay, and minimizes many of the moisture-driven deterioration reactions (Brennan, 1994). However, during food drying, many other changes occur simultaneously, resulting in a modified overall quality (Kompany et al., 1993). Among the physical attributes of dried food materials, porosity and microstructure are particularly important, as they can dominate other quality attributes of dried foods (Aguilera et al., 2000). In addition, these two quality attributes are affected by process conditions, material composition and the raw structure of the foodstuff. In this work, the temperature and moisture distribution within food materials during microwave drying is considered in order to observe its contribution to the microstructure and porosity of the finished product. Apple is the material selected for this work. Generally, most food materials are found with non-uniform moisture content. To develop a non-uniform temperature distribution, food materials were dried in a microwave oven at different power levels (Chua et al., 2000). First, a temperature and moisture model is simulated in COMSOL Multiphysics. A digital imaging camera and Image Pro Premier software are then used to observe the moisture distribution, and a thermal imaging camera the temperature distribution. Finally, the microstructure and porosity of the food materials are obtained from scanning electron microscopy and porosity-measuring devices, respectively. Moisture and temperature distributions during drying influence the microstructure and porosity significantly. In particular, regions of high temperature and moisture content show less porosity and more rupture. These findings support those of Halder et al. (2011) and Rahman et al. (1990). In contrast, regions of low temperature and moisture depict a uniform microstructure and high porosity. This work therefore assists in a better understanding of the role of moisture and temperature distribution in predicting the microstructure and porosity of dried food materials.
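For readers unfamiliar with the modelling step, the sketch below shows the general shape of a coupled heat-and-moisture calculation of the kind solved in COMSOL Multiphysics: explicit finite differences on a 1D slab with a uniform volumetric microwave heating term. All material properties, the source term and the boundary conditions are illustrative assumptions, not the study's model.

```python
import numpy as np

# 1D slab, symmetric about node 0; explicit finite-difference time stepping.
L = 0.01          # slab half-thickness (m), assumed
nx, nt = 50, 30000
dx = L / (nx - 1)
dt = 1e-2         # s; chosen to satisfy the explicit stability limit

alpha = 1.4e-7    # thermal diffusivity (m^2/s), apple-like, assumed
D = 1.0e-9        # moisture diffusivity (m^2/s), assumed constant
q = 2.0e5         # microwave volumetric heating (W/m^3), assumed uniform
rho_cp = 3.5e6    # volumetric heat capacity (J/m^3/K), assumed

T = np.full(nx, 20.0)   # temperature (deg C)
M = np.full(nx, 4.0)    # moisture content (kg water / kg dry solid)

for _ in range(nt):
    lapT = (np.roll(T, -1) - 2 * T + np.roll(T, 1)) / dx**2
    lapM = (np.roll(M, -1) - 2 * M + np.roll(M, 1)) / dx**2
    T = T + dt * (alpha * lapT + q / rho_cp)   # heat diffusion + MW source
    M = M + dt * D * lapM                      # moisture diffusion
    # Boundaries: symmetry at the centre (node 0); drying surface at node -1.
    T[0], M[0] = T[1], M[1]
    T[-1] = T[-2]   # insulated thermal surface, simplifying assumption
    M[-1] = 0.5     # surface moisture pinned by evaporation, assumption

print(f"centre T = {T[0]:.1f} C, centre M = {M[0]:.2f}, surface M = {M[-1]:.2f}")
```

Even this crude model reproduces the qualitative pattern the abstract describes: interior regions stay hotter and wetter than the drying surface, which is where microstructural differences would be expected to develop.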
Abstract:
Molecular-level computer simulations of restricted water diffusion can be used to develop models for relating diffusion tensor imaging measurements of anisotropic tissue to microstructural tissue characteristics. The diffusion tensors resulting from these simulations can then be analyzed in terms of their relationship to the structural anisotropy of the model used. As the translational motion of water molecules is essentially random, their dynamics can be effectively simulated using computers. In addition to modeling water dynamics and water-tissue interactions, the simulation software of the present study was developed to automatically generate collagen fiber networks from user-defined parameters. This flexibility provides the opportunity for further investigations of the relationship between the diffusion tensor of water and morphologically different models representing different anisotropic tissues.
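A minimal sketch of such a simulation is given below: molecules take Gaussian random steps and are reflected by impermeable walls along one axis (a crude stand-in for an aligned fibre network), and the apparent diffusion tensor is estimated from the displacement covariance. The geometry, diffusivity and step parameters are illustrative assumptions, not the software described in the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n_mol, n_steps = 5000, 2000
D_free = 2.0e-9                   # free water diffusivity (m^2/s), assumed
dt = 1.0e-5                       # time step (s), assumed
step = np.sqrt(2 * D_free * dt)   # r.m.s. displacement per axis per step
R = 2.0e-6                        # wall half-separation (m), assumed

pos = np.zeros((n_mol, 3))        # displacements from each start point
for _ in range(n_steps):
    pos += rng.normal(0.0, step, size=(n_mol, 3))
    # Reflect at impermeable walls restricting motion along x.
    pos[:, 0] = np.where(pos[:, 0] > R, 2 * R - pos[:, 0], pos[:, 0])
    pos[:, 0] = np.where(pos[:, 0] < -R, -2 * R - pos[:, 0], pos[:, 0])

# Apparent diffusion tensor from the displacement covariance: D = <r r^T>/(2t).
t = n_steps * dt
D = (pos.T @ pos) / n_mol / (2 * t)

# Eigenvalues and fractional anisotropy, as used to relate the tensor to
# the structural anisotropy of the model.
evals = np.sort(np.linalg.eigvalsh(D))[::-1]
md = evals.mean()
fa = np.sqrt(1.5 * np.sum((evals - md) ** 2) / np.sum(evals ** 2))
print("eigenvalues (m^2/s):", evals, " FA:", round(fa, 3))
```

The restricted axis recovers a much smaller apparent diffusivity than the free axes, so the tensor is anisotropic; richer wall geometries (e.g. generated fibre networks, as in the study) change the eigenvalue pattern accordingly.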
Abstract:
Gaining support for proteomics science requires effective knowledge translation. Knowledge translation (KT) processes turn the evidence generated by scientific discovery into recommendations for clinical applications, funding priorities, and policy/regulatory reforms. Clinicians, regulators, and funders need to understand why emerging proteomics knowledge is relevant and what the potential applications of that knowledge are. However, a lack of clarity remains about what KT means.
Abstract:
Few science fiction films have been made in Australia by Australians for Australian audiences, with most of the handful of locally-produced films made since the mid-1990s. Yet there has always been a solid Australian audience for non-Australian science fiction films and a strong international niche audience for the genre. While Australia has provided below-the-line crews and heads of departments (cinematographers, production designers, and so on) for many non-Australian science fiction films produced domestically, few Australian film directors have specialised in the genre. This is somewhat surprising considering that Alex Proyas achieved a degree of international success for his gothic science fiction film Dark City (1998), and George Miller achieved international fame following the worldwide success of Mad Max II (1981). Although the science fiction element of Mad Max II is tenuous – and even more so in the case of the original Mad Max (George Miller, 1979) – Miller is credited with creating a new (sub)genre which incorporates science fiction elements and has been widely imitated internationally: the dystopian, post-apocalyptic movie. Nevertheless, Australia has only produced a small number of science fiction movies. In addition to the above films, key titles include: Mad Max Beyond Thunderdome (George Miller, 1985), Shirley Thompson versus the Aliens (Jim Sharman, 1972), The Time Guardian (Brian Hannant, 1987), The Chain Reaction (Ian Barry, 1980) and, more recently, Knowing (Alex Proyas, 2009), Daybreakers (Michael and Peter Spierig, 2009), and Iron Sky (Timo Vuorensola, 2012).
Abstract:
Genomics and genetic findings have been hailed with promises of unlocked codes and new frontiers of personalized medicine. Despite cautions about gene hype, the strong cultural pull of genes and genomics has allowed consideration of genomic personhood. Populated by the complicated records of the mass spectrometer, proteomics, which studies human proteins, has achieved neither the funding nor the popular cultural appeal that proteomics scientists had hoped it would. While proteomics, being focused on the proteins that actually indicate and create disease states, has a more direct potential for clinical applications than genomic risk predictions, culturally it has not provided the material for identity creation. In our ethnographic research, we explore how proteomic scientists are attempting to shape an appeal to personhood through which legitimacy may be defined.
Abstract:
The Design Science Research Roadmap (DSR-Roadmap) [1] aims to give detailed methodological guidance to novice researchers in Information Systems (IS) DSR. Focus group evaluation of the evolving DSR-Roadmap, one phase of the overall study, revealed that a key difficulty faced by both novice and expert researchers in DSR is abstracting design theory from design. This paper explores the extension of the DSR-Roadmap by employing IS deep structure ontology (BWW [2-4]) as a lens on IS design, firstly to yield generalisable design theory, specifically 'IS Design Theory' (ISDT) elements [5]. Consideration is then given to the value of BWW in the application of design theory by practitioners. Results of mapping BWW constructs to ISDT elements suggest that BWW is promising as a common language between design researchers and practitioners, facilitating both design theory development and design implementation.
Abstract:
Although Design Science Research (DSR) is now an accepted approach to research in the Information Systems (IS) discipline, consensus on the methodology of DSR has yet to be achieved, and the lack of a comprehensive and detailed DSR methodology remains a key issue. Prior research (the parent-study) aimed to remedy this situation and resulted in the DSR-Roadmap (Alturki et al., 2011a). Continuing empirical validation and revision of the DSR-Roadmap strives towards a methodology with appropriate levels of detail, integration, and completeness for novice researchers to efficiently and effectively conduct and report DSR in IS. The sub-study reported herein contributes to this larger, ongoing effort. This paper reports results from a formative evaluation of the DSR-Roadmap conducted using focus group analysis. Generally, participants endorsed the utility and intuitiveness of the DSR-Roadmap, while also suggesting valuable refinements. Both the parent-study and the sub-study make methodological contributions. The parent-study is the first attempt to utilize DSR to develop a research methodology, providing an example of how to use DSR in research methodology construction. The sub-study demonstrates the value of the focus group method in DSR for formative product evaluation.
Abstract:
Innovations are usually attributed to ideas generated in the minds of individuals. As we reflect upon the evolving design of an online project to engage students in learning science through hybridized writing activities, we propose a more distributed view of the process of innovative design. That is, our experience suggests ideas are generated in the activity of interacting with human and material resources that expand and constrain possibilities. This project is innovative in that it is a new educational response to the problem of disengagement of students in science, and has proven to be effective in changing classroom practice and improving students' scientific literacy. In this chapter, we identify the antecedents and trace the evolution of the project. This account illuminates the innovative design process, presents a summary of the evidence for the effectiveness of the project, and identifies future directions for further development and research.
Keywords: Science learning, hybridized writing, case study, innovative approach
Abstract:
Through the use of critical discourse analysis, this thesis investigated the perceived importance of scientific literacy in the new Australian Curriculum: Science. It was found that scientific literacy was ambiguous, and that the document did not provide detailed scope for intentional teaching for scientific literacy. To overcome this, recommendations on how to intentionally teach for scientific literacy were provided, so that Australian Science teachers can focus on improving scientific literacy outcomes for all students within this new curriculum.
Abstract:
Background: In sub-tropical and tropical Queensland, a legacy of poor housing design, minimal building regulations with few compliance measures, an absence of post-construction performance evaluation and various social and market factors has led to a high and growing penetration of, and reliance on, air conditioners to provide thermal comfort for occupants. The pervasive reliance on air conditioners has arguably impacted on building forms, changed cultural expectations of comfort and social practices for achieving comfort, and may have resulted in a loss of skills in designing and constructing high-performance building envelopes. Aim: The aim of this paper is to report on initial outcomes of a project that sought to determine how the predicted building thermal performance of twenty-five houses in sub-tropical and tropical Queensland compared with objective performance measures and comfort performance as perceived by occupants. The purpose of the project was to shed light on the role of various supply chain agents in the realisation of thermal performance outcomes. Methodology: The case study methodology embraced a socio-technical approach incorporating building science and sociology. Building simulation was used to model thermal performance under controlled comfort assumptions and adaptive comfort conditions. Actual indoor climate conditions were measured by temperature and relative humidity sensors placed throughout each house, whilst occupants' expectations of thermal comfort and their self-reported behaviours were gathered through semi-structured interviews and periodic comfort surveys. Thermal imaging and air infiltration tests, along with building design documents, were analysed to evaluate the influence of various supply chain agents on the actual performance outcomes. Results: The results clearly show that in the housing supply chain – from designer to constructor to occupant – there is limited understanding from each agent of their role in contributing to, or inhibiting, occupants' comfort.
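As an illustration of the kind of adaptive-comfort comparison named in the methodology, the sketch below checks logged indoor temperatures against the ASHRAE 55 adaptive comfort band (80% acceptability) computed from the prevailing mean outdoor temperature. The sensor values are hypothetical placeholders, not the project's measurements, and the ASHRAE 55 model is offered as one plausible choice of adaptive standard.

```python
import numpy as np

def adaptive_comfort_band(t_out_mean):
    """Return (lower, upper) comfort limits (deg C) for the ASHRAE 55
    adaptive model's 80% acceptability band, given the prevailing mean
    outdoor air temperature."""
    t_comf = 0.31 * t_out_mean + 17.8   # adaptive neutral temperature
    return t_comf - 3.5, t_comf + 3.5

# Hypothetical hourly indoor temperatures from one logger, and an assumed
# prevailing mean outdoor temperature (sub-tropical summer-like values).
indoor = np.array([26.1, 27.4, 28.9, 30.2, 31.0, 29.5, 27.8])
lo, hi = adaptive_comfort_band(t_out_mean=27.0)

within = (indoor >= lo) & (indoor <= hi)
print(f"comfort band: {lo:.1f} to {hi:.1f} C")
print(f"hours within band: {within.sum()} of {indoor.size}")
```

Comparing such a band against both simulated and logged temperatures is one way to quantify the gap between predicted and realised thermal performance that the project investigates.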