184 results for Vigarani, Carlo, 17th century


Relevance: 20.00%

Publisher:

Abstract:

Due to their small collecting volume, diodes are commonly used in small field dosimetry. However, the relative sensitivity of a diode increases with decreasing small field size. Conversely, small air gaps have been shown to cause a significant decrease in the sensitivity of a detector as the field size is decreased. This study therefore uses Monte Carlo simulations to investigate introducing air upstream of diodes, such that they measure with a constant sensitivity across all field sizes in small field dosimetry. Varying thicknesses of air were introduced at the upstream end of two commercial diodes (PTW 60016 photon diode and PTW 60017 electron diode), as well as a theoretical unenclosed silicon chip, using field sizes as small as 5 mm × 5 mm. The metric D_(w,Q)/D_(Det,Q) used in this study represents the ratio of the dose at a point in water to the dose to the diode active volume, for a particular field size and location. The optimal thickness of air required to provide a constant sensitivity across all small field sizes was found by plotting D_(w,Q)/D_(Det,Q) as a function of introduced air gap size for various field sizes, and finding the intersection point of these plots; that is, the point at which D_(w,Q)/D_(Det,Q) was constant for all field sizes. The optimal thickness of air was calculated to be 3.3 mm, 1.15 mm and 0.10 mm for the photon diode, electron diode and unenclosed silicon chip, respectively. The variation in these results was due to the different design of each detector. When calculated with the new diode design incorporating the upstream air gap, k_(Q_clin,Q_msr)^(f_clin,f_msr) was equal to unity to within statistical uncertainty (0.5%) for all three diodes. Cross-axis profile measurements were also improved with the new detector design. The upstream air gap could be implemented on the commercial diodes via a cap consisting of the air cavity surrounded by water-equivalent material.
The results for the unenclosed silicon chip show that an ideal small field dosimetry diode could be created by using a silicon chip with a small amount of air above it.
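The intersection-finding procedure described above can be sketched numerically: fit a line to D_(w,Q)/D_(Det,Q) versus air gap for each field size, then solve for the pairwise crossings. The ratio values below are invented for illustration; they are not the study's simulated data.

```python
import numpy as np

# Hypothetical D_(w,Q)/D_(Det,Q) values versus introduced air gap, for three
# field sizes -- illustrative numbers only, chosen to cross at a common point.
air_gaps = np.array([0.0, 1.0, 2.0, 3.0, 4.0])    # introduced air gap (mm)
ratios = {                                         # D_(w,Q)/D_(Det,Q)
    "5 mm":  np.array([0.95, 0.98, 1.01, 1.04, 1.07]),
    "10 mm": np.array([0.97, 0.99, 1.01, 1.03, 1.05]),
    "30 mm": np.array([0.99, 1.00, 1.01, 1.02, 1.03]),
}

# Fit ratio = m * gap + c for each field size.
fits = {fs: np.polyfit(air_gaps, r, 1) for fs, r in ratios.items()}

def intersection(fit_a, fit_b):
    """Air gap at which two fitted lines cross: m1*x + c1 = m2*x + c2."""
    (m1, c1), (m2, c2) = fit_a, fit_b
    return (c2 - c1) / (m1 - m2)

# Average the pairwise crossings to estimate the common intersection point,
# i.e. the air gap at which sensitivity is constant across field sizes.
names = list(fits)
xs = [intersection(fits[a], fits[b])
      for i, a in enumerate(names) for b in names[i + 1:]]
optimal_gap = float(np.mean(xs))
print(f"optimal upstream air gap ≈ {optimal_gap:.2f} mm")
```

With real simulation data the curves are noisy, so averaging the pairwise crossings (or a least-squares common-intersection fit) gives a more robust estimate than any single pair.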

Relevance: 20.00%

Publisher:

Abstract:

Purpose: Electronic portal imaging devices (EPIDs) are available with most linear accelerators (Antonuk, 2002), the current technology being amorphous silicon flat panel imagers. EPIDs are currently used routinely in patient positioning before radiotherapy treatments. There has been increasing interest in using EPID technology for dosimetric verification of radiotherapy treatments (van Elmpt, 2008). A straightforward technique involves using the EPID panel to measure the fluence exiting the patient during a treatment, which is then compared to a prediction of the fluence based on the treatment plan. However, a number of significant limitations exist in this method, resulting in its limited proliferation in the clinical environment. In this paper, we present a technique for simulating IMRT fields using Monte Carlo to predict the dose in an EPID, which can then be compared to the measured dose in the EPID. Materials: Measurements were made using an iView GT flat panel a-Si EPID mounted on an Elekta Synergy linear accelerator. The images from the EPID were acquired using the XIS software (Heimann Imaging Systems). Monte Carlo simulations were performed using the BEAMnrc and DOSXYZnrc user codes. The IMRT fields to be delivered were taken from the treatment planning system in DICOM-RT format and converted into BEAMnrc and DOSXYZnrc input files using an in-house application (Crowe, 2009). Additionally, all image processing and analysis was performed using another in-house application written in the Interactive Data Language (IDL) (ITT Visual Information Systems). Comparison between the measured and Monte Carlo EPID images was performed using a gamma analysis (Low, 1998) incorporating dose and distance-to-agreement criteria. Results: The fluence maps recorded by the EPID were found to provide good agreement between measured and simulated data.
Figure 1 shows an example of measured and simulated IMRT dose images and profiles in the x and y directions.
References:
D. A. Low et al., "A technique for the quantitative evaluation of dose distributions", Med Phys, 25(5), May 1998.
S. Crowe, T. Kairn, A. Fielding, "The development of a Monte Carlo system to verify radiotherapy treatment dose calculations", Radiotherapy & Oncology, 92(Suppl. 1), August 2009, p. S71.
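The gamma comparison of measured and simulated EPID images can be sketched as below. This is a brute-force 1-D illustration of the global gamma index in the style of Low et al. (1998), not the in-house IDL implementation used in the study; the profiles are invented.

```python
import numpy as np

def gamma_1d(ref, ev, x, dose_frac=0.03, dta_mm=3.0):
    """Simplified global 1-D gamma index (Low et al., 1998 style).

    ref, ev : reference and evaluated dose profiles sampled at positions x (mm)
    dose_frac : dose-difference criterion as a fraction of the reference maximum
    dta_mm : distance-to-agreement criterion (mm)
    """
    dose_crit = dose_frac * ref.max()              # global dose criterion
    gammas = np.empty(len(ref))
    for i, (xi, di) in enumerate(zip(x, ref)):
        dist2 = ((x - xi) / dta_mm) ** 2           # distance term, all eval points
        dose2 = ((ev - di) / dose_crit) ** 2       # dose-difference term
        gammas[i] = np.sqrt(np.min(dist2 + dose2)) # minimum over the Gamma surface
    return gammas

# Illustrative profiles: a flat field with 40 mm half-width, and the same
# profile shifted by 1 mm to mimic a small measured/simulated discrepancy.
x = np.linspace(-50.0, 50.0, 201)                  # 0.5 mm grid
ref = 1.0 / (1.0 + np.exp((np.abs(x) - 40.0) / 2.0))
ev = 1.0 / (1.0 + np.exp((np.abs(x - 1.0) - 40.0) / 2.0))

g = gamma_1d(ref, ev, x)
pass_rate = np.mean(g <= 1.0)
print(f"gamma (3%/3 mm) pass rate: {pass_rate:.1%}")
```

A 1 mm shift passes comfortably under 3%/3 mm criteria; a production implementation would work on 2-D images and interpolate the evaluated distribution rather than sampling only at grid points.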

Relevance: 20.00%

Publisher:

Abstract:

Introduction: The accurate identification of tissue electron densities is of great importance for Monte Carlo (MC) dose calculations. When converting patient CT data into a voxelised format suitable for MC simulations, however, it is common to simplify the assignment of electron densities so that the complex tissues existing in the human body are categorised into a few basic types. This study examines the effects that the assignment of tissue types and the calculation of densities can have on the results of MC simulations, for the particular case of a Siemens Sensation 4 CT scanner located in a radiotherapy centre where QA measurements are routinely made using 11 tissue types (plus air). Methods: DOSXYZnrc phantoms are generated from CT data, using the CTCREATE user code, with the relationship between Hounsfield units (HU) and density determined via linear interpolation between a series of specified points on the ‘CT-density ramp’ (see Figure 1(a)). Tissue types are assigned according to HU ranges. Each voxel in the DOSXYZnrc phantom therefore has an electron density (electrons/cm³) defined by the product of the mass density (from the HU conversion) and the intrinsic electron density (electrons/gram) (from the material assignment) in that voxel. In this study, we consider the problems of density conversion and material identification separately: the CT-density ramp is simplified by decreasing the number of points which define it from 12 down to 8, 3 and 2; and the material-type assignment is varied by defining the materials which comprise our test phantom (a Supertech head) as two tissues and bone, two plastics and bone, water only and (as an extreme case) lead only. The effect of these parameters on radiological thickness maps derived from simulated portal images is investigated.
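The HU-to-density conversion described above is a piecewise-linear lookup, and the effect of dropping ramp points can be probed directly. The ramp values below are illustrative placeholders loosely resembling typical calibrations, not the actual Siemens Sensation 4 calibration from the study.

```python
import numpy as np

# Hypothetical points on a CT-density ramp: (HU, mass density in g/cm^3).
ramp_hu      = np.array([-1000.0, -100.0, 0.0,  100.0, 1000.0, 3000.0])
ramp_density = np.array([  0.001,   0.93, 1.00, 1.09,  1.60,   2.80 ])

def hu_to_density(hu):
    """Linear interpolation between ramp points, as CTCREATE does."""
    return np.interp(hu, ramp_hu, ramp_density)

# A simplified 3-point ramp: keep only the endpoints and the water point.
coarse_hu      = ramp_hu[[0, 2, -1]]
coarse_density = ramp_density[[0, 2, -1]]

def hu_to_density_coarse(hu):
    return np.interp(hu, coarse_hu, coarse_density)

# The simplification error is largest between the dropped points.
hu_test = np.array([-500.0, 100.0, 1500.0])
fine   = hu_to_density(hu_test)
coarse = hu_to_density_coarse(hu_test)
print(np.abs(fine - coarse))   # g/cm^3 error introduced by the coarse ramp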
Results & Discussion: Increasing the degree of simplification of the CT-density ramp results in an increasing effect on the resulting radiological thickness calculated for the Supertech head phantom. For instance, defining the CT-density ramp using 8 points, instead of 12, results in a maximum radiological thickness change of 0.2 cm, whereas defining the CT-density ramp using only 2 points results in a maximum radiological thickness change of 11.2 cm. Changing the definition of the materials comprising the phantom between water and plastic and tissue results in millimetre-scale changes to the resulting radiological thickness. When the entire phantom is defined as lead, this alteration changes the calculated radiological thickness by a maximum of 9.7 cm. Evidently, the simplification of the CT-density ramp has a greater effect on the resulting radiological thickness map than does the alteration of the assignment of tissue types. Conclusions: It is possible to alter the definitions of the tissue types comprising the phantom (or patient) without substantially altering the results of simulated portal images. However, these images are very sensitive to the accurate identification of the HU-density relationship. When converting data from a patient’s CT into a MC simulation phantom, therefore, all possible care should be taken to accurately reproduce the conversion between HU and mass density, for the specific CT scanner used. Acknowledgements: This work is funded by the NHMRC, through a project grant, and supported by the Queensland University of Technology (QUT) and the Royal Brisbane and Women's Hospital (RBWH), Brisbane, Australia. The authors are grateful to the staff of the RBWH, especially Darren Cassidy, for assistance in obtaining the phantom CT data used in this study. The authors also wish to thank Cathy Hargrave, of QUT, for assistance in formatting the CT data, using the Pinnacle TPS. 
Computational resources and services used in this work were provided by the HPC and Research Support Group, QUT, Brisbane, Australia.

Relevance: 20.00%

Publisher:

Abstract:

Introduction: Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo (MC) methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the ‘gold standard’ for predicting dose deposition in the patient [1]. This project has three main aims: 1. To develop tools that enable the transfer of treatment plan information from the treatment planning system (TPS) to a MC dose calculation engine. 2. To develop tools for comparing the 3D dose distributions calculated by the TPS and the MC dose engine. 3. To investigate the radiobiological significance of any errors between the TPS patient dose distribution and the MC dose distribution in terms of Tumour Control Probability (TCP) and Normal Tissue Complication Probabilities (NTCP). The work presented here addresses the first two aims. Methods: (1a) Plan Importing: A database of commissioned accelerator models (Elekta Precise and Varian 2100CD) has been developed for treatment simulations in the MC system (EGSnrc/BEAMnrc). Beam descriptions can be exported from the TPS using the widespread DICOM framework, and the resultant files are parsed with the assistance of a software library (PixelMed Java DICOM Toolkit). The information in these files (such as the monitor units, the jaw positions and gantry orientation) is used to construct a plan-specific accelerator model which allows an accurate simulation of the patient treatment field. (1b) Dose Simulation: The calculation of a dose distribution requires patient CT images which are prepared for the MC simulation using a tool (CTCREATE) packaged with the system. 
Beam simulation results are converted to absolute dose per MU using calibration factors recorded during the commissioning process and treatment simulation. These distributions are combined according to the MU meter settings stored in the exported plan to produce an accurate description of the prescribed dose to the patient. (2) Dose Comparison: TPS dose calculations can be obtained either using a DICOM export or by direct retrieval of binary dose files from the file system. Dose difference, gamma evaluation and normalised dose difference algorithms [2] were employed for the comparison of the TPS dose distribution and the MC dose distribution. These implementations are independent of spatial resolution and able to interpolate for comparisons. Results and Discussion: The tools successfully produced Monte Carlo input files for a variety of plans exported from the Eclipse (Varian Medical Systems) and Pinnacle (Philips Medical Systems) planning systems, ranging in complexity from a single uniform square field to a five-field step-and-shoot IMRT treatment. The simulation of collimated beams has been verified geometrically, and validation of dose distributions in a simple body phantom (QUASAR) will follow. The developed dose comparison algorithms have also been tested with controlled dose distribution changes. Conclusion: The capability of the developed code to independently process treatment plans has been demonstrated. A number of limitations exist: only static fields are currently supported (dynamic wedges and dynamic IMRT will require further development), and the process has not been tested for planning systems other than Eclipse and Pinnacle. The tools will be used to independently assess the accuracy of the current treatment planning system dose calculation algorithms for complex treatment deliveries such as IMRT, in treatment sites where patient inhomogeneities are expected to be significant.
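The per-MU scaling and beam combination step can be sketched as follows. The dose grids, calibration factors and MU meter settings here are invented placeholders; the real values come from commissioning measurements and the exported plan.

```python
import numpy as np

# Hypothetical per-beam Monte Carlo output (dose per simulated source
# particle) on a common 3-D grid -- values are illustrative only.
beam_dose_per_particle = {
    "AP":  np.full((4, 4, 4), 2.0e-17),   # Gy per source particle
    "LAT": np.full((4, 4, 4), 1.5e-17),
}
# Calibration: source particles per MU, recorded at commissioning (assumed).
particles_per_mu = {"AP": 1.0e14, "LAT": 1.0e14}
# MU meter settings stored in the exported plan (assumed).
monitor_units = {"AP": 100.0, "LAT": 150.0}

def combine_beams(dose_pp, cal, mu):
    """Scale each beam to absolute dose and sum over beams."""
    total = None
    for name, d in dose_pp.items():
        absolute = d * cal[name] * mu[name]   # Gy/particle * particle/MU * MU
        total = absolute if total is None else total + absolute
    return total

total_dose = combine_beams(beam_dose_per_particle, particles_per_mu,
                           monitor_units)
print(total_dose[0, 0, 0])   # absolute dose (Gy) at one voxel
```

Because the combination is a simple weighted sum, plan re-normalisation (e.g. a changed prescription) only rescales the MU values without re-running the beam simulations.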
Acknowledgements: Computational resources and services used in this work were provided by the HPC and Research Support Group, Queensland University of Technology, Brisbane, Australia. Pinnacle dose parsing made possible with the help of Paul Reich, North Coast Cancer Institute, North Coast, New South Wales.

Relevance: 20.00%

Publisher:

Abstract:

This paper presents a hybrid framework of Swedish cultural practices and Australian grounded theory for organizational development, and suggests practical strategies for 'working smarter' in 21st-century libraries. Toward that end, reflective evidence-based practices are offered to incrementally build organizational capacity for asking good questions, selecting authoritative sources, evaluating multiple perspectives, organizing emerging insights, and communicating them to inform, educate, and influence. In addition, to ensure the robust information exchange necessary for collective workplace learning, leadership traits are proposed for ensuring inclusive communication, decision-making, and planning processes. These findings emerge from action research projects conducted from 2003 to 2008 in two North American libraries.

Relevance: 20.00%

Publisher:

Abstract:

Exercise-based cardiac rehabilitation (CR) is efficacious in reducing mortality and hospital admissions; however, it remains inaccessible to large proportions of the patient population. Removal of attendance barriers to hospital- or centre-based CR has seen the promotion of home-based CR. Delivery of safe and appropriately prescribed exercise in the home was first documented 25 years ago, with the utilisation of fixed land-line telecommunications to monitor ECG. The advent of miniature ECG sensors, in conjunction with smartphones, now enables CR to be delivered with greater flexibility with regard to location, time and format, while retaining the capacity for real-time patient monitoring. A range of new systems allows other signals, including speed, location, pulse oximetry and respiration, to be monitored, and these may have application in CR. There is compelling evidence that telemonitoring-based CR is an effective alternative to traditional CR practice. The long-standing barrier of access to centre-based CR, combined with new delivery platforms, raises the question of when telemonitoring-based CR could replace conventional approaches as standard practice.

Relevance: 20.00%

Publisher:

Abstract:

Using information collected from American Economic Review (AER) publications from the last 100 years, Torgler and Piatti examine the top publishing institutions to determine their most renowned AER papers based on citation success. Areas of interest include how often an individual can publish in the AER, how equally citation success is distributed, and who the top AER publishing authors are. The book explores the level of cooperation among authors and what drives conventions such as alphabetical name ordering. Torgler and Piatti critically examine the individual characteristics of AER authors, editors, editorial board members and referees, and even tackle more intricate details such as the frequency of female publications in the AER. The authors observe and analyse the relationship between academic age and publication performance to see whether any pattern links these factors to citation success. The book then goes on to analyse data concerning awards, and whether awards can increase the probability of publishing in the AER at a later stage.

Relevance: 20.00%

Publisher:

Abstract:

The irradiance profile around the receiver tube (RT) of a parabolic trough collector (PTC) is a key outcome of optical performance that affects the overall energy performance of the collector. Thermal performance evaluation of the RT relies on appropriate determination of the irradiance profile. This article explains a technique in which empirical equations were developed to calculate the local irradiance as a function of angular location on the RT of a standard PTC, using a rigorously verified Monte Carlo ray tracing model. A large range of test conditions, including daily normal insolation, spectral selective coatings and glass envelope conditions, were selected from the data published by Dudley et al. [1]. The R² values of the equations are excellent, varying between 0.9857 and 0.9999. These equations can therefore be used confidently to produce a realistic non-uniform boundary heat flux profile around the RT at normal incidence for conjugate heat transfer analyses of the collector. The values required by the equations are the daily normal insolation and the spectral selective properties of the collector components. Since the equations are polynomial functions, data processing software can be employed to calculate the flux profile easily and quickly. The ultimate goal of this research is to make concentrating solar power technology cost-competitive with conventional energy technology by facilitating its ongoing research.
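Because the empirical equations are polynomials in the angular coordinate, producing a boundary flux profile reduces to evaluating one polynomial per condition. The coefficients below are invented placeholders purely to show the mechanics; the study's actual coefficients depend on insolation, coating and glass-envelope condition.

```python
import numpy as np

# Hypothetical polynomial coefficients (highest power first) for local
# irradiance around the receiver tube versus angular position theta (deg).
# Illustrative only -- not the fitted coefficients from the study.
coeffs = [-1.2e-6, 4.5e-4, -0.055, 2.1, 8.0]

def local_irradiance(theta_deg):
    """Evaluate the empirical flux polynomial at angle theta (0-360 deg)."""
    return np.polyval(coeffs, theta_deg)

# Tabulate the non-uniform flux profile for a conjugate heat transfer model.
theta = np.linspace(0.0, 360.0, 361)   # 1-degree resolution
flux = local_irradiance(theta)         # flux units depend on the fit
print(flux.min(), flux.max())
```

A CFD preprocessor can consume such a table directly as an angularly varying heat-flux boundary condition on the receiver-tube wall.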

Relevance: 20.00%

Publisher:

Abstract:

International agreement on the framework for protecting the rights of Indigenous populations within nation states has occurred alongside unprecedented levels of globalisation of other previously nation-based activities such as economic and social provision and planning. As the idea of the postcolonial democratic state emerges, this collection undertakes an international and comparative examination of the role of higher education in educating globally aware professionals who are able to work effectively and in cultural safety with Indigenous Peoples...

Relevance: 20.00%

Publisher:

Abstract:

Where teachers of English as a foreign language (EFL) once observed a paucity of authentic language input, public displays of written English are now proliferating. Ideas for capitalising on this abundance can be drawn from two strands of pedagogic thought: a psycholinguistic approach to conventional literacy long established in foreign, second and first language education (e.g., Teng, 2009), and a more recent and critical approach informed by diverse theoretical understandings of the ‘linguistic landscape’ (e.g., Rowland, 2013). In this paper I draw from these two approaches to suggest ways of helping EFL learners use environmental print to develop knowledge and skills required of English readers in the twenty-first century: (1) fluency in breaking the codes of English and other languages of publicly displayed text; (2) facility with making meaning as the English of these texts becomes ever more diverse in cultural, historical and contextual implication; (3) use of environmental English in contexts that range from the local to the transnational; and (4) critique of the presence of English and attendant worldviews in the urban environment (Chern & Dooley, forthcoming). The psychological concept of motivation and the complementary sociological concept of investment are at the heart of my deliberations here: realisation of the pedagogic potential of environmental print to develop literate resources requires consideration of sources of motivation in the classroom learning situation (Dörnyei & Ushioda, 2011), as well as learner investment in literate practices in English (Norton, 2010).