62 results for Toolkit
Abstract:
Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases, it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the “gold standard” for predicting dose deposition in the patient. In this study, software has been developed that enables the transfer of treatment plan information from the treatment planning system to a Monte Carlo dose calculation engine. A database of commissioned linear accelerator models (Elekta Precise and Varian 2100CD at various energies) has been developed using the EGSnrc/BEAMnrc Monte Carlo suite. Planned beam descriptions and CT images can be exported from the treatment planning system using the DICOM framework. The information in these files is combined with an appropriate linear accelerator model to allow accurate calculation of the radiation field incident on a modelled patient geometry. The Monte Carlo dose calculation results are combined according to the monitor units specified in the exported plan. The result is a 3D dose distribution that can be used to verify treatment planning system calculations. The software, MCDTK (Monte Carlo Dicom ToolKit), has been developed in the Java programming language and produces BEAMnrc and DOSXYZnrc input files, ready for submission on a high-performance computing cluster. The code has been tested with the Eclipse (Varian Medical Systems), Oncentra MasterPlan (Nucletron B.V.) and Pinnacle3 (Philips Medical Systems) planning systems. In this study the software was validated against measurements in homogeneous and heterogeneous phantoms. Monte Carlo models are commissioned through comparison with quality assurance measurements made using a large square field incident on a homogeneous volume of water. This study aims to provide valuable confirmation that Monte Carlo calculations match experimental measurements for complex fields and heterogeneous media.
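To make the combination step concrete, the following is a minimal sketch, in the Java used for MCDTK, of how per-beam Monte Carlo dose grids normalised to dose per monitor unit might be summed according to the planned monitor units. The class and field names (PlanDoseCombiner, BeamDose) are illustrative assumptions, not MCDTK's actual API.

    import java.util.List;

    /**
     * Illustrative sketch (not MCDTK's actual API): combines per-beam Monte
     * Carlo dose grids, each normalised to absolute dose per monitor unit
     * (MU), into a single plan dose using the MU settings from the plan.
     */
    public class PlanDoseCombiner {

        /** One beam's dose grid (Gy/MU) and its planned monitor units. */
        public static class BeamDose {
            final double[] dosePerMu; // flattened 3D grid, Gy per MU
            final double monitorUnits;

            BeamDose(double[] dosePerMu, double monitorUnits) {
                this.dosePerMu = dosePerMu;
                this.monitorUnits = monitorUnits;
            }
        }

        /** Sum MU-weighted beam doses voxel by voxel to get the plan dose in Gy. */
        public static double[] combine(List<BeamDose> beams, int voxelCount) {
            double[] planDose = new double[voxelCount];
            for (BeamDose beam : beams) {
                for (int v = 0; v < voxelCount; v++) {
                    planDose[v] += beam.dosePerMu[v] * beam.monitorUnits;
                }
            }
            return planDose;
        }
    }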
Abstract:
Introduction: Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases, it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo (MC) methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the ‘gold standard’ for predicting dose deposition in the patient [1]. This project has three main aims: 1. To develop tools that enable the transfer of treatment plan information from the treatment planning system (TPS) to a MC dose calculation engine. 2. To develop tools for comparing the 3D dose distributions calculated by the TPS and the MC dose engine. 3. To investigate the radiobiological significance of any differences between the TPS patient dose distribution and the MC dose distribution in terms of Tumour Control Probability (TCP) and Normal Tissue Complication Probability (NTCP). The work presented here addresses the first two aims. Methods: (1a) Plan Importing: A database of commissioned accelerator models (Elekta Precise and Varian 2100CD) has been developed for treatment simulations in the MC system (EGSnrc/BEAMnrc). Beam descriptions can be exported from the TPS using the widespread DICOM framework, and the resultant files are parsed with the assistance of a software library (PixelMed Java DICOM Toolkit). The information in these files (such as the monitor units, jaw positions and gantry orientation) is used to construct a plan-specific accelerator model which allows an accurate simulation of the patient treatment field. (1b) Dose Simulation: The calculation of a dose distribution requires patient CT images, which are prepared for the MC simulation using a tool (CTCREATE) packaged with the system. Beam simulation results are converted to absolute dose per MU using calibration factors recorded during the commissioning process and treatment simulation. These distributions are combined according to the MU meter settings stored in the exported plan to produce an accurate description of the prescribed dose to the patient. (2) Dose Comparison: TPS dose calculations can be obtained using either a DICOM export or by direct retrieval of binary dose files from the file system. Dose difference, gamma evaluation and normalised dose difference algorithms [2] were employed for the comparison of the TPS dose distribution and the MC dose distribution. These implementations are independent of spatial resolution and can interpolate between dose grids for comparison. Results and Discussion: The tools successfully produced Monte Carlo input files for a variety of plans exported from the Eclipse (Varian Medical Systems) and Pinnacle (Philips Medical Systems) planning systems, ranging in complexity from a single uniform square field to a five-field step-and-shoot IMRT treatment. The simulation of collimated beams has been verified geometrically, and validation of dose distributions in a simple body phantom (QUASAR) will follow. The developed dose comparison algorithms have also been tested with controlled dose distribution changes. Conclusion: The capability of the developed code to independently process treatment plans has been demonstrated.
A number of limitations exist: only static fields are currently supported (dynamic wedges and dynamic IMRT will require further development), and the process has not been tested with planning systems other than Eclipse and Pinnacle. The tools will be used to independently assess the accuracy of current treatment planning system dose calculation algorithms for complex treatment deliveries, such as IMRT, in treatment sites where patient inhomogeneities are expected to be significant. Acknowledgements: Computational resources and services used in this work were provided by the HPC and Research Support Group, Queensland University of Technology, Brisbane, Australia. Pinnacle dose parsing was made possible with the help of Paul Reich, North Coast Cancer Institute, North Coast, New South Wales.
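The gamma evaluation employed for dose comparison above can be illustrated with a minimal sketch. The version below is a deliberately simplified 1D, global-normalisation, brute-force form written in Java (the language of the toolkit); the implementations described in the abstract are 3D, resolution independent and interpolating, which this sketch is not.

    /**
     * Minimal 1D sketch of the gamma evaluation used to compare a TPS dose
     * profile against a Monte Carlo profile. Uses global normalisation and a
     * brute-force search over evaluated points; no interpolation.
     */
    public class GammaEvaluation1D {

        /**
         * @param refDose  reference (e.g. TPS) dose samples
         * @param refPos   positions of the reference samples (mm)
         * @param evalDose evaluated (e.g. Monte Carlo) dose samples
         * @param evalPos  positions of the evaluated samples (mm)
         * @param dtaMm    distance-to-agreement criterion (e.g. 3 mm)
         * @param doseFrac dose-difference criterion as a fraction of the
         *                 global maximum reference dose (e.g. 0.03 for 3%)
         * @return gamma index at each reference point; gamma <= 1 is a pass
         */
        public static double[] gamma(double[] refDose, double[] refPos,
                                     double[] evalDose, double[] evalPos,
                                     double dtaMm, double doseFrac) {
            double maxRef = 0;
            for (double d : refDose) maxRef = Math.max(maxRef, d);
            double doseTol = doseFrac * maxRef; // global normalisation

            double[] gamma = new double[refDose.length];
            for (int r = 0; r < refDose.length; r++) {
                double best = Double.MAX_VALUE;
                for (int e = 0; e < evalDose.length; e++) {
                    double dx = (evalPos[e] - refPos[r]) / dtaMm;
                    double dd = (evalDose[e] - refDose[r]) / doseTol;
                    best = Math.min(best, Math.sqrt(dx * dx + dd * dd));
                }
                gamma[r] = best;
            }
            return gamma;
        }
    }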
Abstract:
Cross-Lingual Link Discovery (CLLD) is a new problem in Information Retrieval. The aim is to automatically identify meaningful and relevant hypertext links between documents in different languages. This is particularly helpful in knowledge discovery if a multilingual knowledge base is sparse in one language or another, or the topical coverage in each language differs; such is the case with Wikipedia. Techniques for identifying new and topically relevant cross-lingual links are a current topic of interest at NTCIR, where the CrossLink task has been running since NTCIR-9 in 2011. This paper presents the framework used to benchmark cross-lingual link discovery algorithms in the context of NTCIR-9. The framework includes topics, document collections, assessments, metrics, and a toolkit for pooling, assessment and evaluation. The assessments are divided into two separate sets: manual assessments performed by human assessors, and automatic assessments based on links extracted from Wikipedia itself. Using this framework we show that manual assessment is more robust than automatic assessment in the context of cross-lingual link discovery.
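As an illustration of the kind of metric such a framework computes, the sketch below scores a run of discovered links against an assessment set (manual or Wikipedia-derived), reducing links to flat anchor-to-target pairs. This is a simplified assumption: the actual CrossLink evaluation also handles anchor weighting and ranked targets, which are omitted here.

    import java.util.HashSet;
    import java.util.Set;

    /**
     * Minimal sketch of scoring a run of discovered cross-lingual links
     * against an assessment set. Each link is encoded as a flat string such
     * as "anchorDoc#anchor->targetDoc"; ranking and anchor weighting are
     * deliberately omitted.
     */
    public class LinkEvaluation {

        public static double fMeasure(Set<String> runLinks, Set<String> qrels) {
            Set<String> relevant = new HashSet<>(runLinks);
            relevant.retainAll(qrels); // links both proposed and assessed relevant

            if (runLinks.isEmpty() || qrels.isEmpty() || relevant.isEmpty()) return 0;
            double precision = (double) relevant.size() / runLinks.size();
            double recall = (double) relevant.size() / qrels.size();
            return 2 * precision * recall / (precision + recall);
        }
    }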
Abstract:
This project investigated ways in which the learning experience for students in Australian law schools could be enhanced by renewing final year legal curriculum through the design of effective capstone experiences to close the loop on tertiary legal studies and better prepare students for a smooth transition into the world of work and professional practice. Key project outcomes are a set of final year curriculum design principles and a transferable model for an effective final year program – a final year Toolkit comprising a range of templates, models and specific capstone examples for adoption or adaptation by legal educators. The project found that the efficacy of capstone experiences is affected by the curriculum context within which they are offered. For this reason, a number of ‘favourable conditions’, which promote the effectiveness of capstone experiences, have also been identified. The project’s final year principles and Toolkit promote program coherence and integration, should increase student satisfaction and levels of engagement with their experience of legal education, and make a valuable contribution to assurance of learning in the new Tertiary Education Quality and Standards Agency (TEQSA) environment. From the point of view of the student experience, the final year principles and models address the current fragmented approach to final year legal curricula design and delivery. The knowledge and research base acquired under the auspices of this project is of both discipline and national importance, as the project’s outcomes are transferable and have the potential to significantly influence the quality and coherence of the program experience of final year students in other tertiary disciplines, both within Australia and beyond. Project outcomes and deliverables are available on both the project’s website http://wiki.qut.edu.au/display/capstone/Home and on the Law Capstone Experience Forum website http://www.lawcapstoneexperience.com/. In the course of developing its deliverables, the project found that the design of capstone experiences varies significantly within and across disciplines; different frameworks may be used (for example, a disciplinary or inter-disciplinary focus, or to satisfy professional accreditation requirements), rationales and objectives may differ, and a variety of models may be utilised (for example, an integrated final year program, a single subject, a suite of subjects, or modules within several subjects). Broadly, however, capstone experiences should provide final year students with an opportunity both to look back over their academic learning, in an effort to make sense of what they have accomplished, and to look forward to professional and personal futures that build on that foundational learning.
Abstract:
In order to increase the accuracy of patient positioning for complex radiotherapy treatments, various 3D imaging techniques have been developed. MegaVoltage Cone Beam CT (MVCBCT) can utilise existing hardware to implement a 3D imaging modality to aid patient positioning. MVCBCT has been investigated using an unmodified Elekta Precise linac and iView amorphous silicon electronic portal imaging device (EPID). Two methods of delivery and acquisition have been investigated for imaging an anthropomorphic head phantom and a quality assurance phantom. Phantom projections were successfully acquired and CT datasets reconstructed using both acquisition methods. Bone, tissue and air were clearly resolvable in both phantoms even with low dose (22 MU) scans. The feasibility of MVCBCT was investigated using a standard linac, an amorphous silicon EPID and a combination of a free open source reconstruction toolkit and custom in-house software written in Matlab. The resultant image quality has been assessed and presented. Although bone, tissue and air were resolvable in all scans, artifacts are present and scan doses are increased when compared with standard portal imaging. The feasibility of MVCBCT with an unmodified Elekta Precise linac and EPID has been considered, as well as possible areas for future development in artifact correction techniques to further improve image quality.
An improved chemically inducible gene switch that functions in the monocotyledonous plant sugar cane
Abstract:
Chemically inducible gene switches can provide precise control over gene expression, enabling more specific analyses of gene function and expanding the plant biotechnology toolkit beyond traditional constitutive expression systems. The alc gene expression system is one of the most promising chemically inducible gene switches in plants because of its potential in both fundamental research and commercial biotechnology applications. However, there are no published reports demonstrating that this versatile gene switch is functional in transgenic monocotyledonous plants, which include some of the most important agricultural crops. We found that the original alc gene switch was ineffective in the monocotyledonous plant sugar cane, and describe a modified alc system that is functional in this globally significant crop. A promoter consisting of tandem copies of the ethanol receptor inverted repeat binding site, in combination with a minimal promoter sequence, was sufficient to give enhanced sensitivity and significantly higher levels of ethanol inducible gene expression. A longer CaMV 35S minimal promoter than was used in the original alc gene switch also substantially improved ethanol inducibility. Treating the roots with ethanol effectively induced the modified alc system in sugar cane leaves and stem, while an aerial spray was relatively ineffective. The extension of this chemically inducible gene expression system to sugar cane opens the door to new opportunities for basic research and crop biotechnology.
Abstract:
Introduction This study examines and compares the dosimetric quality of radiotherapy treatment plans for prostate carcinoma across a cohort of 163 patients treated at 5 centres: 83 treated with three-dimensional conformal radiotherapy (3DCRT), 33 treated with intensity-modulated radiotherapy (IMRT) and 47 treated with volumetric-modulated arc therapy (VMAT). Methods Treatment plan quality was evaluated in terms of target dose homogeneity and organ-at-risk sparing, through the use of a set of dose metrics. These included the mean, maximum and minimum doses; the homogeneity and conformity indices for the target volumes; and a selection of dose coverage values relevant to each organ-at-risk. Statistical significance was evaluated using two-tailed Welch’s t-tests. The Monte Carlo DICOM ToolKit software was adapted to permit the evaluation of dose metrics from DICOM data exported from a commercial radiotherapy treatment planning system. Results The 3DCRT treatment plans offered greater planning target volume dose homogeneity than the other two treatment modalities. The IMRT and VMAT plans offered greater dose reduction in the organs-at-risk, with increased compliance with recommended organ-at-risk dose constraints compared to conventional 3DCRT treatments. When compared to each other, IMRT and VMAT did not provide significantly different treatment plan quality for like-sized tumour volumes. Conclusions This study indicates that IMRT and VMAT provide similar dosimetric quality, which is superior to that achieved with 3DCRT.
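For readers unfamiliar with the metrics named above, the Java sketch below implements one common (ICRU-83-style) definition of the homogeneity index and the Welch's t statistic. The study's exact formulae are not reproduced in the abstract, so treat these definitions, and the nearest-rank percentile estimate, as assumptions.

    import java.util.Arrays;

    /**
     * Sketch of two metrics used in the comparison, under common definitions:
     * an ICRU-83-style homogeneity index and Welch's two-sample t statistic.
     */
    public class PlanQualityMetrics {

        /** HI = (D2% - D98%) / D50%, computed from target voxel doses. */
        public static double homogeneityIndex(double[] targetDoses) {
            double[] sorted = targetDoses.clone();
            Arrays.sort(sorted); // ascending
            double d2 = percentileDose(sorted, 2);   // dose to hottest 2% of volume
            double d98 = percentileDose(sorted, 98); // dose covering 98% of volume
            double d50 = percentileDose(sorted, 50); // median dose
            return (d2 - d98) / d50;
        }

        /** Dose exceeded by p% of the volume (simple nearest-rank estimate). */
        private static double percentileDose(double[] ascending, double p) {
            int idx = (int) Math.round((1.0 - p / 100.0) * (ascending.length - 1));
            return ascending[idx];
        }

        /** Welch's t statistic: t = (m1 - m2) / sqrt(s1^2/n1 + s2^2/n2). */
        public static double welchT(double[] a, double[] b) {
            double va = variance(a), vb = variance(b);
            return (mean(a) - mean(b)) / Math.sqrt(va / a.length + vb / b.length);
        }

        private static double mean(double[] x) {
            double s = 0;
            for (double v : x) s += v;
            return s / x.length;
        }

        private static double variance(double[] x) {
            double m = mean(x), s = 0;
            for (double v : x) s += (v - m) * (v - m);
            return s / (x.length - 1); // sample variance
        }
    }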
Abstract:
Background Cancer monitoring and prevention rely on the timely notification of cancer cases. However, abstracting and classifying cancer from the free text of pathology reports and other relevant documents, such as death certificates, is complex and time-consuming. Aims In this paper, approaches for the automatic detection of notifiable cancer cases as the cause of death from free-text death certificates supplied to Cancer Registries are investigated. Method A number of machine learning classifiers were studied. Features were extracted using natural language processing techniques and the Medtex toolkit, and included stemmed words, bi-grams, and concepts from the SNOMED CT medical terminology. The baseline was a keyword spotter using keywords extracted from the long descriptions of ICD-10 cancer-related codes. Results Death certificates with notifiable cancer listed as the cause of death can be effectively identified with the methods studied in this paper. A Support Vector Machine (SVM) classifier achieved the best performance, with an overall F-measure of 0.9866 when evaluated on a set of 5,000 free-text death certificates using the token stem feature set. The SNOMED CT concept plus token stem feature set reached the lowest variance (0.0032) and false negative rate (0.0297) while achieving an F-measure of 0.9864. The SVM classifier accounted for the first 18 of the top 40 evaluated runs and was the most robust classifier, with a variance of 0.001141, half that of the other classifiers. Conclusion The selection of features had the greatest influence on classifier performance, although the type of classifier employed also affected performance. In contrast, the feature weighting scheme had a negligible effect. Specifically, stemmed tokens, with or without SNOMED CT concepts, formed the most effective features when combined with an SVM classifier.
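The baseline keyword spotter is simple enough to sketch in Java. The code below flags a certificate when any cancer keyword appears in its text; the keyword list shown is a tiny illustrative subset, whereas the study derived its keywords from the long descriptions of ICD-10 cancer-related codes and stemmed tokens via the Medtex toolkit.

    import java.util.Arrays;
    import java.util.HashSet;
    import java.util.Set;

    /**
     * Minimal sketch of the baseline keyword spotter: flag a death
     * certificate as a notifiable cancer case if any cancer-related keyword
     * appears in its text. The keyword list is an illustrative subset only.
     */
    public class CancerKeywordSpotter {

        private static final Set<String> KEYWORDS = new HashSet<>(Arrays.asList(
                "carcinoma", "neoplasm", "melanoma", "lymphoma", "leukaemia",
                "sarcoma", "malignant", "metastatic", "tumour"));

        public static boolean isNotifiable(String certificateText) {
            // Crude whitespace/punctuation tokenisation; the real pipeline
            // used NLP stemming rather than exact token matching.
            for (String token : certificateText.toLowerCase().split("\\W+")) {
                if (KEYWORDS.contains(token)) {
                    return true;
                }
            }
            return false;
        }
    }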
Abstract:
A professional development toolkit was developed, with an agenda, worksheets and resources, to support a review of assessment practices pertaining to group work in a first-year undergraduate course. A main contribution is the Rationale for Group Work in Higher Education template, which allows academic staff to determine the purpose of group work and identify the rationale behind the assessment tasks.
Abstract:
Within HCI, aging is often viewed in terms of designing assistive technologies to improve the lives of older people, such as those suffering from frailty or memory loss. Our research adopts a very different approach, reframing the relationship in terms of wisdom, creativity and invention. We ran a series of workshops where groups of retirees, aged between their early 60s and late 80s, used the MaKey MaKey inventor's toolkit. We asked them to think about inventing the future and to suggest ideas for new technologies. Our findings showed that they not only rose to the challenge but also mastered the technology, collaborated intensively while using it, and discussed freely and at length their own, their family's and others' relationships with technology. We discuss the value of empowering people in this way and consider what else could be invented to enable more people to be involved in the design and use of creative technologies.
Abstract:
The focus of this research is the creation of a stage-directing training manual, developed at the researcher's site at the National Institute of Dramatic Art. The directing procedures build on Stanislavski's Active Analysis and on findings from present-day visual cognition studies. Action research methodology and evidence-based data collection were employed to improve the efficacy of both the directing procedures and the pedagogical manual. The manual serves as a supplement to director training and as a toolkit for the more experienced practitioner. The manual and research findings provide a unique and innovative contribution to the field of theatre directing.
Abstract:
This paper describes experiences with the use of the Globus toolkit and related technologies in the development of a secure portal that allows nationally distributed Australian researchers to share data and application programs. The portal allows researchers to access infrastructure that will be used to enhance understanding of the causes of schizophrenia and advance its treatment, and aims to provide access to a resource that can expand into the world’s largest online collaborative mental health research facility. Since access to patient data is controlled by local ethics approvals, the portal must transparently both provide and deny access to patient data in accordance with the fine-grained access permissions afforded to individual researchers. Interestingly, the access protocols are able to provide researchers with hints about currently inaccessible data that may be of interest to them, giving them an impetus to seek further access permissions.
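The access behaviour described, serving permitted data while hinting at inaccessible data, can be sketched as a simple filter. Everything below (the class names, the approval model, the hint mechanism) is a hypothetical Java illustration, not the portal's actual Globus-based implementation.

    import java.util.ArrayList;
    import java.util.List;

    /**
     * Hypothetical sketch (not the portal's actual code) of the access
     * protocol described above: a researcher's query returns full records
     * only for data covered by their ethics approvals, plus metadata-only
     * "hints" about inaccessible datasets that may motivate further access
     * requests.
     */
    public class PatientDataGateway {

        public static class Dataset {
            final String id;
            final String requiredApproval; // ethics approval needed to view it
            final String summary;          // coarse, non-identifying description

            Dataset(String id, String requiredApproval, String summary) {
                this.id = id;
                this.requiredApproval = requiredApproval;
                this.summary = summary;
            }
        }

        public static class QueryResult {
            final List<Dataset> accessible = new ArrayList<>();
            final List<String> hints = new ArrayList<>(); // summaries only
        }

        /** Filter matching datasets by the researcher's granted approvals. */
        public static QueryResult query(List<Dataset> matches,
                                        List<String> grantedApprovals) {
            QueryResult result = new QueryResult();
            for (Dataset d : matches) {
                if (grantedApprovals.contains(d.requiredApproval)) {
                    result.accessible.add(d);
                } else {
                    // Deny the data but hint at its existence via its summary.
                    result.hints.add(d.summary);
                }
            }
            return result;
        }
    }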
Abstract:
This project developed a visual strategy and graphic outcomes to communicate the results of a scientific collaborative project to the Mackay community. During 2013 and 2014, a team from CSIRO engaged with the community in Mackay to collaboratively develop a set of strategies to improve the management of the Great Barrier Reef. The result of this work was a 300+ page scientific report that needed to be translated and summarised for the general community. The aim of this project was to strategically synthesise the information contained in the report and to design and produce an outcome to be distributed to the participant community. By working with the CSIRO researchers, an action toolkit was developed, comprising twelve cards and a booklet. Each card represented the story behind a particular local management issue and the actions that the participants suggested should be taken to improve management of the Reef. During the design synthesis it was identified that every management issue referred to the need to develop some sort of "educational campaign" for the area. This was therefore translated into an underlying action supporting all other actions proposed in the toolkit.
Abstract:
Introduction & Aims Optimising fracture treatments requires a sound understanding of relationships between stability, callus development and healing outcomes. This has been the goal of computational modelling, but discrepancies remain between simulations and experimental results. We compared healing patterns vs fixation stiffness between a novel computational callus growth model and corresponding experimental data. Hypothesis We hypothesised that callus growth is stimulated by diffusible signals, whose production is in turn regulated by mechanical conditions at the fracture site. We proposed that introducing this scheme into computational models would better replicate the observed tissue patterns and the inverse relationship between callus size and fixation stiffness. Method Finite element models of bone healing under stiff and flexible fixation were constructed, based on the parameters of a parallel rat femoral osteotomy study. An iterative procedure was implemented, to simulate the development of callus and its mechanical regulation. Tissue changes were regulated according to published mechano-biological criteria. Predictions of healing patterns were compared between standard models, with a pre-defined domain for callus development, and a novel approach, in which periosteal callus growth is driven by a diffusible signal. Production of this signal was driven by local mechanical conditions. Finally, each model’s predictions were compared to the corresponding histological data. Results Models in which healing progressed within a prescribed callus domain predicted that greater interfragmentary movements would displace early periosteal bone formation further from the fracture. This results from artificially large distortional strains predicted near the fracture edge. While experiments showed increased hard callus size under flexible fixation, this was not reflected in the standard models. Allowing the callus to grow from a thin soft tissue layer, in response to a mechanically stimulated diffusible signal, results in a callus shape and tissue distribution closer to those observed histologically. Importantly, the callus volume increased with increasing interfragmentary movement. Conclusions A novel method to incorporate callus growth into computational models of fracture healing allowed us to successfully capture the relationship between callus size and fixation stability observed in our rat experiments. This approach expands our toolkit for understanding the influence of different fixation strategies on healing outcomes.
Abstract:
Since 2008, all Australian school students have sat standardised tests in Reading, Writing, Language Conventions (Spelling, Grammar and Punctuation) and Numeracy in Years 3, 5, 7 and 9. NAPLAN tests report individual students' attainment of skills against a set of standards, and individual student results are communicated to parents. Schools are then ranked against other schools according to the aggregate of their NAPLAN results. The process is explained to parents and community members as “improving the learning outcomes for all Australian students” (MCEETYA, 2009). This paper examines NAPLAN as it is being played out in a mediated space, by analysing unsolicited comment found in new media such as Twitter and online forums. NAPLAN intersects with contemporary debates about Australian education policy: the roles schools should play in improving national productivity, the relationship between state and federal government interests in education, the role and expectations of the teacher, what curriculum and pedagogy should be and look like, and how limited financial resources can best be spread across education sectors and systems. These are not new considerations; however, what has changed is that education policy seems to have become even more of a political issue than ever before. This paper uses Ball's 'toolkit' approach to education policy analysis to suggest that there are multiple 'effects' of NAPLAN, culminating in a series of disconnected conversations between various stakeholders.