77 results for accelerators
Abstract:
Laser-accelerated proton beams have been proposed for use in several research fields. Great interest has arisen in the potential replacement of conventional accelerating machines with laser-based accelerators, and in particular in the development of new concepts for more compact and cheaper hadrontherapy centres. In this context the ELIMED (ELI MEDical applications) research project has been launched by INFN-LNS and ASCR-FZU researchers within the framework of the pan-European ELI-Beamlines facility. The ELIMED project aims to demonstrate the potential clinical applicability of optically accelerated proton beams and to realize a laser-accelerated ion transport beamline for multi-disciplinary user applications. In this framework eye melanoma (for instance uveal melanoma, normally treated with 62 MeV proton beams produced by standard accelerators) will be considered as a model system to demonstrate the potential clinical use of laser-driven protons in hadrontherapy, especially because of the limited constraints on proton energy and irradiation geometry for this particular tumour treatment. Several challenges, from laser-target interaction and beam transport development up to dosimetry and radiobiology, need to be overcome in order to reach the ELIMED final goals. A crucial role will be played by the final design and realization of a transport beamline capable of providing ion beams with suitable characteristics in terms of energy spectrum and angular distribution, which will allow dosimetric tests and biological cell irradiation to be performed. A first prototype of the transport beamline has already been designed and other transport elements are under construction in order to perform a first experimental test with the TARANIS laser system by the end of 2013. A wide international collaboration among specialists from disciplines such as physics, biology, chemistry and medicine, including medical doctors from Europe, Japan and the US, is growing around the ELIMED project, with the aim of working on the conceptual design and the technical and experimental realization of this core beamline of the ELI-Beamlines facility. © 2013 SPIE.
Abstract:
This study aims to evaluate the use of Varian radiotherapy dynamic treatment log (DynaLog) files to verify IMRT plan delivery as part of a routine quality assurance procedure. Delivery accuracy in terms of machine performance was quantified by multileaf collimator (MLC) position errors and fluence delivery accuracy for patients receiving intensity modulated radiation therapy (IMRT) treatment. The relationship between machine performance and plan complexity, quantified by the modulation complexity score (MCS), was also investigated. Actual MLC positions and the delivered fraction of monitor units (MU), recorded every 50 ms during IMRT delivery, were extracted from the DynaLog files. The planned MLC positions and fractional MU were taken from the record-and-verify system MLC control file. Planned and delivered beam data were compared to determine leaf position errors with and without the overshoot effect. Analysis was also performed on planned and actual fluence maps reconstructed from the MLC control file and the delivered treatment log files, respectively. This analysis was performed for all treatment fractions of 5 prostate, 5 prostate and pelvic node (PPN) and 5 head and neck (H&N) IMRT plans, totalling 82 IMRT fields in ∼5500 DynaLog files. The root mean square (RMS) leaf position errors without the overshoot effect were 0.09, 0.26 and 0.19 mm for the prostate, PPN and H&N plans respectively, which increased to 0.30, 0.39 and 0.30 mm when the overshoot effect was considered. Average errors were not affected by the overshoot effect and were 0.05, 0.13 and 0.17 mm for prostate, PPN and H&N plans respectively. The percentage of pixels passing fluence map gamma analysis at 3%/3 mm was 99.94 ± 0.25%, which reduced to 91.62 ± 11.39% at the 1%/1 mm criterion. Leaf position errors, but not gamma passing rate, were directly related to plan complexity as determined by the MCS. Site-specific confidence intervals for average leaf position errors were set at −0.03 to 0.12 mm for prostate and −0.02 to 0.28 mm for the more complex PPN and H&N plans. For all treatment sites the confidence interval for RMS errors with the overshoot effect was set at 0 to 0.50 mm, and for the percentage of pixels passing gamma analysis at 1%/1 mm a confidence interval of 68.83% was also set for all treatment sites. This work demonstrates the successful implementation of treatment log files to validate IMRT deliveries and shows how dynamic log files can diagnose delivery errors that cannot be detected with phantom-based QC. Machine performance was found to be directly related to plan complexity, but this is not the dominant determinant of delivery accuracy.
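The error statistics described above can be reproduced in a few lines once planned and delivered leaf positions are available. The following is a minimal sketch, not the authors' code: it assumes the positions have already been parsed from the DynaLog and MLC control files into arrays sampled every 50 ms, and uses synthetic data in place of real logs.

```python
# Minimal sketch (not the authors' code): average and RMS leaf position errors
# from planned vs delivered MLC positions, as described in the abstract.
import numpy as np

def leaf_position_error_stats(planned_mm, actual_mm):
    """planned_mm, actual_mm: arrays of shape (n_samples, n_leaves) in mm."""
    errors = actual_mm - planned_mm            # signed error per leaf per sample
    avg_error = errors.mean()                  # average leaf position error
    rms_error = np.sqrt((errors ** 2).mean())  # RMS leaf position error
    return avg_error, rms_error

# Synthetic data standing in for positions parsed from DynaLog / control files
rng = np.random.default_rng(0)
planned = rng.uniform(-50.0, 50.0, size=(1000, 120))   # hypothetical leaf positions
actual = planned + rng.normal(0.0, 0.1, size=planned.shape)
print(leaf_position_error_stats(planned, actual))
```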
Abstract:
Ion acceleration driven by high-intensity laser pulses is attracting an impressive and steadily increasing research effort. Experiments over the past 10-15 years have demonstrated, over a wide range of laser and target parameters, the generation of multi-MeV proton and ion beams with unique properties, which have stimulated interest in a number of innovative applications. While most of this work has been based on sheath acceleration processes, in which space-charge fields are established by relativistic electrons at the surfaces of the irradiated target, a number of novel mechanisms have been the focus of recent theoretical and experimental activities. This paper will provide a brief review of the state of the art in the field of laser-driven ion acceleration, with particular attention to recent developments.
Abstract:
The maximum energy to which cosmic rays can be accelerated at weakly magnetised ultra-relativistic shocks is investigated. We demonstrate that for such shocks, in which the scattering of energetic particles is mediated exclusively by ion-skin-depth-scale structures, as might be expected for a Weibel-mediated shock, there is an intrinsic limit on the maximum energy to which particles can be accelerated. This maximum energy is determined from the requirement that particles must be isotropized in the downstream plasma frame before the mean field transports them far downstream, and it falls considerably short of what is required to produce ultra-high-energy cosmic rays. To circumvent this limit, a highly disorganized field is required on larger scales. The growth of cosmic-ray-induced instabilities on wavelengths much longer than the ion-plasma skin depth, both upstream and downstream of the shock, is considered. While these instabilities may play an important role in magnetic field amplification at relativistic shocks, on scales comparable to the gyroradius of the most energetic particles the calculated growth rates leave insufficient time for them to modify the scattering. Since strong modification is a necessary condition for particles in the downstream region to re-cross the shock, in the absence of an alternative scattering mechanism these results imply that acceleration to higher energies is ruled out. If weakly magnetized ultra-relativistic shocks are disfavoured as high-energy particle accelerators in general, the search for potential sources of ultra-high-energy cosmic rays can be narrowed.
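As a rough, order-of-magnitude illustration of the kind of limit described above (an assumption-laden sketch of the standard small-angle-scattering argument, not the paper's derivation), one can compare the scattering mean free path on ion-skin-depth-scale structures with the gyroradius in the mean field:

```latex
% Illustrative estimate only: small-scale field strength \delta B, coherence
% length \ell_c \sim c/\omega_{pi}, mean field B_0, particle energy E.
\[
  r_g \sim \frac{E}{e\,\delta B}, \qquad
  \lambda_{\mathrm{scatt}} \sim \frac{r_g^{2}}{\ell_c} \propto E^{2}.
\]
% Returning to the shock requires isotropization before the mean field sweeps
% the particle far downstream, \lambda_{\mathrm{scatt}} \lesssim E/(e B_0), giving
\[
  E_{\max} \lesssim \frac{e\,(\delta B)^{2}\,\ell_c}{B_0},
\]
% which, for \ell_c of order the ion skin depth, falls far short of
% ultra-high cosmic-ray energies.
```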
Abstract:
This work investigated the differences between multileaf collimator (MLC) positioning accuracy determined using either log files or electronic portal imaging devices (EPIDs), and then assessed the possibility of reducing patient-specific quality control (QC) via phantom-less methodologies. In-house software was developed, and validated, to track MLC positional accuracy with the rotational and static gantry picket fence tests using an integrated electronic portal image. This software was used to monitor daily MLC performance over a 1 year period for two Varian TrueBeam linear accelerators, with the results directly compared with MLC positions determined using leaf trajectory log files. The software was validated by introducing known shifts and collimator errors. Skewness of the MLCs was found to be 0.03 ± 0.06° (mean ± 1 standard deviation (SD)) and was dependent on whether the collimator was rotated manually or automatically. Trajectory log files, analysed using in-house software, showed average MLC positioning errors with a magnitude of 0.004 ± 0.003 mm (rotational) and 0.004 ± 0.011 mm (static) across the two TrueBeam units over 1 year (mean ± 1 SD). These ranges, as indicated by the SD, were lower than the corresponding average MLC positioning errors of 0.000 ± 0.025 mm (rotational) and 0.000 ± 0.039 mm (static) obtained using the in-house EPID-based software. The range of EPID-measured MLC positional errors was larger because of the inherent uncertainties of the procedure. Over the duration of the study, multiple MLC positional errors were detected using the EPID-based software, but these same errors were not detected using the trajectory log files. This work shows the importance of increasing linac-specific QC when phantom-less methodologies, such as the use of log files, are used to reduce patient-specific QC. A tolerance of 0.25 mm has been established for MLC positional errors using the EPID-based automated picket fence test. The software allows diagnosis of any specific leaf that needs repair and gives an indication of the course of action required.
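Because both the trajectory logs and the EPID images reduce to per-leaf positional errors, a tolerance such as the 0.25 mm quoted above can be applied automatically. Below is a minimal illustrative sketch, not the in-house software: the input array layout is an assumption, and the real picket fence and log parsing steps are omitted.

```python
# Illustrative sketch: flag individual MLC leaves whose measured positional
# error exceeds a 0.25 mm tolerance (array layout assumed for illustration).
import numpy as np

TOLERANCE_MM = 0.25

def leaves_needing_attention(errors_mm):
    """errors_mm: shape (n_samples, n_leaves) of signed leaf position errors."""
    worst_per_leaf = np.abs(errors_mm).max(axis=0)
    return {leaf: float(err) for leaf, err in enumerate(worst_per_leaf)
            if err > TOLERANCE_MM}

# Synthetic example: leaf 42 drifting beyond tolerance
errors = np.random.default_rng(1).normal(0.0, 0.03, size=(500, 120))
errors[:, 42] += 0.3
print(leaves_needing_attention(errors))   # expect leaf 42 to be flagged
```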
Abstract:
Suitable instrumentation for laser-accelerated proton (ion) beams is critical for development of integrated, laser-driven ion accelerator systems. Instrumentation aimed at beam diagnostics and control must be applied to the driving laser pulse, the laser-plasma that forms at the target and the emergent proton (ion) bunch in a correlated way to develop these novel accelerators. This report is a brief overview of established diagnostic techniques and new developments based on material presented at the first workshop on 'Instrumentation for Diagnostics and Control of Laser-accelerated Proton (Ion) Beams' in Abingdon, UK. It includes radiochromic film (RCF), image plates (IP), micro-channel plates (MCP), Thomson spectrometers, prompt inline scintillators, time and space-resolved interferometry (TASRI) and nuclear activation schemes. Repetition-rated instrumentation requirements for target metrology are also addressed. © 2013 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
Abstract:
We will outline recent progress in the UK ASAIL laser-ion acceleration programme, which aims to advance laser-driven ion beams to the point at which they will become a serious alternative to conventional accelerators for radiotherapy.
Abstract:
Low-power processors and accelerators that were originally designed for the embedded systems market are emerging as building blocks for servers. Power capping has been actively explored as a technique to reduce the energy footprint of high-performance processors. The opportunities and limitations of power capping on the new low-power processor and accelerator ecosystem are less well understood. This paper presents an efficient power capping and management infrastructure for heterogeneous SoCs based on hybrid ARM/FPGA designs. The infrastructure coordinates dynamic voltage and frequency scaling with task allocation on a customised Linux system for the Xilinx Zynq SoC. We present a compiler-assisted power model to guide voltage and frequency scaling, in conjunction with workload allocation between the ARM cores and the FPGA, under given power caps. The model achieves less than 5% estimation bias relative to mean power consumption. In an FFT case study, the proposed power capping schemes achieve on average 97.5% of the performance of the optimal execution and match the optimal execution in 87.5% of the cases, while always meeting power constraints.
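A hedged sketch of how a power model can drive configuration choice under a cap is shown below; the linear coefficients, the candidate configurations and the cap value are invented for illustration and do not reproduce the paper's compiler-assisted model.

```python
# Illustrative sketch only: pick the highest-performing (frequency, offload)
# configuration whose predicted power stays under a cap, using a toy linear
# power model. All numbers are placeholders, not the paper's model.
from dataclasses import dataclass

@dataclass
class Config:
    freq_mhz: int        # ARM core frequency
    fpga_share: float    # fraction of work offloaded to the FPGA
    throughput: float    # estimated work items / s for this configuration

def predicted_power_w(cfg):
    # Toy model: static power + frequency-dependent term + FPGA activity term.
    return 0.8 + 0.002 * cfg.freq_mhz + 1.5 * cfg.fpga_share

def best_under_cap(configs, cap_w):
    feasible = [c for c in configs if predicted_power_w(c) <= cap_w]
    return max(feasible, key=lambda c: c.throughput, default=None)

configs = [
    Config(333, 0.0, 1.0), Config(667, 0.0, 1.8),
    Config(667, 0.5, 2.6), Config(1000, 0.5, 3.1),
]
print(best_under_cap(configs, cap_w=3.0))
```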
Abstract:
This paper describes the scientific aims and potential, as well as the preliminary technical design, of IRIDE, an innovative tool for multi-disciplinary investigations in a wide field of scientific, technological and industrial applications. IRIDE will be a high-intensity "particles factory", based on a combination of high-duty-cycle radio-frequency superconducting electron linacs and high-energy lasers. Conceived to provide unique research possibilities for particle physics, condensed matter physics, chemistry and material science, structural biology and industrial applications, IRIDE will open completely new research opportunities and advance our knowledge in many branches of science and technology. IRIDE is also intended to be realized in subsequent stages of development, depending on the assigned priorities. © 2013 Elsevier B.V.
Abstract:
An ultra-relativistic electron beam propagating through a high-Z solid triggers an electromagnetic cascade, whereby a large number of high-energy photons and electron–positron pairs are produced, mainly via the bremsstrahlung and Bethe–Heitler processes, respectively. These mechanisms are routinely used to generate positron beams in conventional accelerators such as the electron–positron collider (LEP). Here we show that the application of similar physical mechanisms to a laser-driven electron source allows for the generation of high-quality positron beams in a much more compact and cheaper configuration. We anticipate that the application of these results to the next generation of lasers might open the pathway for the realization of an all-optical high-energy electron–positron collider.
Abstract:
We present a mathematically rigorous quality-of-service (QoS) metric which relates the achievable QoS of a real-time analytics service to the server energy cost of offering the service. Using a new iso-QoS evaluation methodology, we scale server resources to meet QoS targets and directly rank the servers in terms of their energy efficiency and, by extension, cost of ownership. Our metric and method are platform-independent and enable fair comparison of datacenter compute servers with significant architectural diversity, including micro-servers. We deploy our metric and methodology to compare three servers running financial option pricing workloads on real-life market data. We find that server ranking is sensitive to data inputs and to the desired QoS level, and that although scale-out micro-servers can be up to two times more energy-efficient than conventional heavyweight servers for the same target QoS, they are still six times less energy-efficient than high-performance computational accelerators.
Abstract:
We present a new regime for generating high-energy quasi-monoenergetic proton beams, a "slow-pulse" regime in which the laser group velocity v_g < c is reduced by an extended near-critical-density plasma. In this regime, for properly matched laser intensity and group velocity, ions initially accelerated in the light-sail (LS) mode can be further trapped and reflected by the snowplough potential generated by the laser in the near-critical-density plasma. These two acceleration stages are connected by the onset of a Rayleigh-Taylor-like (RT) instability. The usual broadening of the ion energy spectrum by the RT instability is controlled, and high-quality proton beams can be generated. It is shown by multidimensional particle-in-cell simulations that quasi-monoenergetic proton beams with energies up to hundreds of MeV can be generated at laser intensities of 10²¹ W/cm².
Abstract:
OBJECTIVE: To demonstrate the benefit of complexity metrics such as the modulation complexity score (MCS) and monitor units (MUs) in multi-institutional audits of volumetric-modulated arc therapy (VMAT) delivery.
METHODS: 39 VMAT treatment plans were analysed using MCS and MU. A virtual phantom planning exercise was planned and independently measured using the PTW Octavius® phantom and seven29® 2D array (PTW-Freiburg GmbH, Freiburg, Germany). MCS and MU were compared with the median gamma index pass rates (2%/2 mm and 3%/3 mm) and plan quality. The treatment planning systems (TPS) were grouped by whether the VMAT modelling was specifically designed for the linear accelerator manufacturer's own treatment delivery system (Type 1) or was independent of the vendor for VMAT delivery (Type 2). Differences in plan complexity (MCS and MU) between TPS types were compared.
RESULTS: For Varian(®) linear accelerators (Varian(®) Medical Systems, Inc., Palo Alto, CA), MCS and MU were significantly correlated with gamma pass rates. Type 2 TPS created poorer quality, more complex plans with significantly higher MUs and MCS than Type 1 TPS. Plan quality was significantly correlated with MU for Type 2 plans. A statistically significant correlation was observed between MU and MCS for all plans (R = -0.84, p < 0.01).
CONCLUSION: MU and MCS have a role in assessing plan complexity in audits along with plan quality metrics. Plan complexity metrics give some indication of plan deliverability but should be analysed with plan quality.
ADVANCES IN KNOWLEDGE: Complexity metrics were investigated for a national rotational audit involving 34 institutions and showed value. The metrics found that more complex plans were created by planning systems that were independent of the vendor for VMAT delivery.
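The correlation analysis described in the abstract above can be illustrated with a short script. This is only a sketch with synthetic placeholder values, not the audit data or analysis code; it simply shows how complexity metrics (MU, MCS) can be correlated with gamma pass rates.

```python
# Illustrative sketch: correlate plan complexity metrics with gamma pass rates.
import numpy as np
from scipy.stats import pearsonr

mu = np.array([420, 510, 630, 705, 820, 940], dtype=float)      # monitor units
mcs = np.array([0.42, 0.38, 0.33, 0.30, 0.26, 0.22])            # complexity score
gamma_pass = np.array([99.1, 98.7, 97.9, 97.2, 96.0, 94.8])     # % at 2%/2 mm

for name, metric in (("MU", mu), ("MCS", mcs)):
    r, p = pearsonr(metric, gamma_pass)
    print(f"{name} vs gamma pass rate: R = {r:.2f}, p = {p:.3f}")

# Correlation between the two complexity metrics themselves (the abstract
# reports R = -0.84 between MU and MCS across all audited plans).
print("MU vs MCS: R = %.2f" % pearsonr(mu, mcs)[0])
```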
Abstract:
Software-programmable 'soft' processors have shown tremendous potential for efficient realisation of high-performance signal processing operations on Field Programmable Gate Arrays (FPGAs), whilst lowering the design burden by avoiding the need to design fine-grained custom circuit architectures. However, the complex data access patterns, high memory bandwidth and computational requirements of sliding window applications, such as Motion Estimation (ME) and Matrix Multiplication (MM), lead to low-performance, inefficient soft processor realisations. This paper resolves this issue, showing how, by adding support for block data addressing and accelerators for high-performance loop execution, performance and resource efficiency over four times better than current best-in-class metrics can be achieved. In addition, it demonstrates the first recorded real-time soft-processor realisation of ME for H.263 systems.
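To make the sliding-window workload concrete, the sketch below shows a plain software reference for SAD-based block-matching motion estimation, the kind of kernel such accelerators target; it is illustrative only and is not the paper's soft-processor design.

```python
# Illustrative reference for block-matching motion estimation using a
# sum-of-absolute-differences (SAD) search over a sliding window.
import numpy as np

def sad_motion_vector(ref, cur_block, top, left, search=7):
    """Find the (dy, dx) displacement in `ref` that best matches `cur_block`."""
    bh, bw = cur_block.shape
    best, best_sad = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + bh > ref.shape[0] or x + bw > ref.shape[1]:
                continue
            sad = np.abs(ref[y:y+bh, x:x+bw].astype(int) - cur_block.astype(int)).sum()
            if sad < best_sad:
                best_sad, best = sad, (dy, dx)
    return best, best_sad

# Usage: a block copied from a known offset should be recovered exactly.
rng = np.random.default_rng(3)
ref = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
cur = ref[10:26, 12:28]
print(sad_motion_vector(ref, cur, top=8, left=10))   # expect ((2, 2), 0)
```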
Abstract:
In the reinsurance market, the risks natural catastrophes pose to portfolios of properties must be quantified, so that they can be priced, and insurance offered. The analysis of such risks at a portfolio level requires a simulation of up to 800 000 trials with an average of 1000 catastrophic events per trial. This is sufficient to capture risk for a global multi-peril reinsurance portfolio covering a range of perils including earthquake, hurricane, tornado, hail, severe thunderstorm, wind storm, storm surge and riverine flooding, and wildfire. Such simulations are both computation and data intensive, making the application of high-performance computing techniques desirable.
In this paper, we explore the design and implementation of portfolio risk analysis on both multi-core and many-core computing platforms. Given a portfolio of property catastrophe insurance treaties, key risk measures, such as probable maximum loss, are computed by taking both primary and secondary uncertainties into account. Primary uncertainty is associated with whether or not an event occurs in a simulated year, while secondary uncertainty captures the uncertainty in the level of loss due to the use of simplified physical models and limitations in the available data. A combination of fast lookup structures, multi-threading and careful hand-tuning of numerical operations is required to achieve good performance. Experimental results are reported for multi-core processors and for systems using NVIDIA graphics processing units and Intel Xeon Phi many-core accelerators.
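The simulation structure described above can be illustrated with a toy aggregate-loss model. The distributions, event counts and trial counts below are invented placeholders; a real engine samples from event-loss tables for the perils listed and runs up to 800 000 trials.

```python
# Toy sketch of a portfolio risk simulation: primary uncertainty (how many
# events occur in a simulated year) and secondary uncertainty (loss given an
# event), with a probable-maximum-loss style quantile at the end.
import numpy as np

rng = np.random.default_rng(7)
n_trials = 10_000            # production runs use up to 800 000 trials
mean_events_per_trial = 1000

def simulate_annual_losses():
    losses = np.empty(n_trials)
    for t in range(n_trials):
        n_events = rng.poisson(mean_events_per_trial)                       # primary
        event_losses = rng.lognormal(mean=10.0, sigma=1.5, size=n_events)   # secondary
        losses[t] = event_losses.sum()
    return losses

annual = simulate_annual_losses()
pml_1_in_250 = np.quantile(annual, 1 - 1 / 250)      # 250-year return period
print(f"Simulated 1-in-250-year PML: {pml_1_in_250:,.0f}")
```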