931 results for "large data sets"


In terms of atmospheric impact, the volcanic eruption of Mt. Pinatubo (1991) is the best-characterized large eruption on record. We investigate here the model-derived stratospheric warming following the Pinatubo eruption, using SAGE II extinction data that include recent improvements in the processing algorithm. This method, termed SAGE_4λ, makes use of the four wavelengths (385, 452, 525 and 1024 nm) of the SAGE II data when available, and applies a data-filling procedure in the opacity-induced "gap" regions. Using SAGE_4λ, we derived aerosol size distributions that properly reproduce the extinction coefficients at much longer wavelengths as well. This provides a good basis for calculating the absorption of terrestrial infrared radiation and the resulting stratospheric heating. However, we also show that the use of this data set in a global chemistry–climate model (CCM) still leads to stronger aerosol-induced stratospheric heating than observed, with temperatures in places even higher than the already excessive values found by many models in recent general circulation model (GCM) and CCM intercomparisons. This suggests that the overestimation of the stratospheric warming after the Pinatubo eruption cannot be ascribed to an insufficient observational database, but rather to the use of outdated data sets, to deficiencies in the implementation of the forcing data, or to radiative or dynamical model artifacts. In contrast, the SAGE_4λ approach reduces the infrared absorption in the tropical tropopause region, resulting in significantly better agreement with the post-volcanic temperature record at these altitudes.

We present a vertically resolved, zonal mean, monthly mean global ozone data set spanning the period 1901 to 2007, called HISTOZ.1.0. It is based on a new approach that combines information from an ensemble of chemistry climate model (CCM) simulations with historical total column ozone information. The CCM simulations incorporate important external drivers of stratospheric chemistry and dynamics (in particular solar and volcanic effects, greenhouse gases and ozone depleting substances, sea surface temperatures, and the quasi-biennial oscillation). The historical total column ozone observations include ground-based measurements from the 1920s onward and satellite observations from 1970 to 1976. An off-line data assimilation approach is used to combine the model simulations, the observations, and information on the observation error. For the period from 1979 onward, only ground-based measurements were assimilated so that existing ozone data sets could be used for validation. Results demonstrate considerable skill from the CCM simulations alone. Assimilating observations provides additional skill for total column ozone. With respect to the vertical ozone distribution, assimilating observations increases, on average, the correlation with a reference data set, but does not decrease the mean squared error. Analyses of HISTOZ.1.0 with respect to the effects of El Niño–Southern Oscillation (ENSO) and of the 11 yr solar cycle on stratospheric ozone from 1934 to 1979 qualitatively confirm previous studies that focussed on the post-1979 period. The ENSO signature exhibits a much clearer imprint of a change in strength of the Brewer–Dobson circulation compared to the post-1979 period. The imprint of the 11 yr solar cycle is slightly weaker in the earlier period. Furthermore, the total column ozone increase from the 1950s to around 1970 at northern mid-latitudes is briefly discussed. Indications for contributions of a tropospheric ozone increase, greenhouse gases, and changes in atmospheric circulation are found. Finally, the paper points to several possible future improvements of HISTOZ.1.0.
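
The off-line assimilation step described above blends an ensemble of CCM states with total column ozone observations. A minimal sketch of such an ensemble-based Kalman-type update is given below; the dimensions, variable names and error values are purely illustrative and are not taken from HISTOZ.1.0.

```python
import numpy as np

# Toy dimensions: n_state vertically/zonally resolved ozone values, n_obs total column
# observations, n_ens CCM ensemble members. All numbers are placeholders.
n_state, n_obs, n_ens = 200, 30, 10
rng = np.random.default_rng(0)

X = rng.normal(size=(n_state, n_ens))               # ensemble of model ozone anomalies
H = rng.uniform(size=(n_obs, n_state)) / n_state    # observation operator (column integration weights)
y = rng.normal(size=n_obs)                          # total column ozone observations (anomalies)
R = 0.1 * np.eye(n_obs)                             # observation-error covariance

x_b = X.mean(axis=1)                                # background state = ensemble mean
Xp = X - x_b[:, None]                               # ensemble perturbations
B = Xp @ Xp.T / (n_ens - 1)                         # background-error covariance from the ensemble

# Kalman-type update: x_a = x_b + K (y - H x_b) with K = B H^T (H B H^T + R)^-1
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
x_a = x_b + K @ (y - H @ x_b)
```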

Upper-air observations are a fundamental data source for global atmospheric data products, but uncertainties, particularly in the early years, are not well known. Most of the early observations, which have now been digitized, are prone to a large variety of undocumented uncertainties (errors) that need to be quantified, e.g., for their assimilation in reanalysis projects. We apply a novel approach to estimate errors in upper-air temperature, geopotential height, and wind observations from the Comprehensive Historical Upper-Air Network for the time period from 1923 to 1966. We distinguish between random errors, biases, and a term that quantifies the representativity of the observations. The method is based on a comparison of neighboring observations and is hence independent of metadata, making it applicable to a wide scope of observational data sets. The estimated mean random errors for all observations within the study period are 1.5 K for air temperature, 1.3 hPa for pressure, 3.0 m s−1 for wind speed, and 21.4° for wind direction. The estimates are compared to results of previous studies and analyzed with respect to their spatial and temporal variability.
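
The neighbour-comparison idea behind these error estimates can be illustrated with a toy calculation: if two nearby stations observe the same signal with independent random errors of similar size, the error follows from the spread of their difference. The sketch below uses synthetic series and hypothetical names; it ignores the biases and the representativity term treated in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
signal = rng.normal(size=5000)                        # common atmospheric signal

t_a = signal + rng.normal(scale=1.5, size=5000)       # station A temperatures, error ~1.5 K
t_b = signal + rng.normal(scale=1.5, size=5000)       # neighbouring station B, error ~1.5 K

# Var(t_a - t_b) = sigma_a^2 + sigma_b^2 (plus a representativity term, ignored here);
# with equal errors, sigma ~ std(difference) / sqrt(2).
sigma_est = np.std(t_a - t_b) / np.sqrt(2)
print(f"estimated random error: {sigma_est:.2f} K")   # close to 1.5 K
```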

Atmospheric circulation modes are important concepts in understanding the variability of atmospheric dynamics. Assuming their spatial patterns to be fixed, such modes are often described by simple indices from rather short observational data sets. The increasing length of reanalysis products allows these concepts and assumptions to be scrutinised. Here we investigate the stability of spatial patterns of Northern Hemisphere teleconnections by using the Twentieth Century Reanalysis as well as several control and transient millennium-scale simulations with coupled models. The observed and simulated centres of action of the two major teleconnection patterns, the North Atlantic Oscillation (NAO) and, to some extent, the Pacific–North American pattern (PNA), are not stable in time. During the period 1940–1969 AD, the currently observed dipole pattern of the NAO, with its centres of action over Iceland and the Azores, split into a north–south dipole pattern in the western Atlantic and a wave-train pattern in the eastern part, connecting the British Isles with West Greenland and the eastern Mediterranean. During the period 1915–1944 AD, the PNA centre of action over Canada was shifted southwards and the one over Florida into the Gulf of Mexico. The analysis further shows that shifts in the centres of action of either teleconnection pattern are not related to changes in the external forcing applied in transient simulations of the last millennium. Such shifts in their centres of action are accompanied by changes in the relation of local precipitation and temperature with the overlying atmospheric mode. These findings further undermine the assumption of a stationary relation between local climate/proxy variability and large-scale dynamics that is inherent in proxy-based reconstructions of atmospheric modes, and call for a more robust understanding of atmospheric variability on decadal timescales.
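
Teleconnection patterns of this kind are commonly identified as the leading empirical orthogonal function (EOF) of sea level pressure anomalies, and their stability can be probed by repeating the analysis in running windows. The sketch below illustrates that procedure on synthetic data; the array shapes, window lengths and function names are placeholders, not the study's actual workflow.

```python
import numpy as np

rng = np.random.default_rng(2)
nt, nlat, nlon = 1200, 30, 60                       # 100 years of monthly fields (synthetic)
lats = np.linspace(20, 80, nlat)
lons = np.linspace(-90, 30, nlon)
slp = rng.normal(size=(nt, nlat, nlon))             # sea level pressure anomalies

def leading_eof_centre(field):
    """Return (lat, lon) of the strongest-loading grid point of the leading EOF."""
    x = field.reshape(field.shape[0], -1)
    x = x - x.mean(axis=0)
    _, _, vt = np.linalg.svd(x, full_matrices=False)  # rows of vt are the EOFs
    eof1 = vt[0].reshape(nlat, nlon)
    i, j = np.unravel_index(np.abs(eof1).argmax(), eof1.shape)
    return lats[i], lons[j]

# 30-year running windows stepped by 10 years: a drifting centre of action shows up
# as a change in the returned coordinates from one window to the next.
window, step = 360, 120
for start in range(0, nt - window + 1, step):
    print(start // 12, leading_eof_centre(slp[start:start + window]))
```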

The motion of lung tumors during respiration makes the accurate delivery of radiation therapy to the thorax difficult because it increases the uncertainty of target position. The adoption of four-dimensional computed tomography (4D-CT) has allowed us to determine how a tumor moves with respiration for each individual patient. Using information acquired during a 4D-CT scan, we can define the target, visualize motion, and calculate dose during the planning phase of the radiotherapy process. One image data set that can be created from the 4D-CT acquisition is the maximum-intensity projection (MIP). The MIP can be used as a starting point to define the volume that encompasses the motion envelope of the moving gross target volume (GTV). Because of the close relationship that exists between the MIP and the final target volume, we investigated four MIP data sets created with different methodologies (three using various 4D-CT sorting implementations, and one using all available cine CT images) to compare target delineation. It has been observed that changing the 4D-CT sorting method will lead to the selection of a different collection of images; however, the clinical implications of changing the constituent images on the resultant MIP data set are not clear. There has not been a comprehensive study that compares target delineation based on different 4D-CT sorting methodologies in a patient population. We selected a collection of patients who had previously undergone thoracic 4D-CT scans at our institution, and who had lung tumors that moved at least 1 cm. We then generated the four MIP data sets and automatically contoured the target volumes. In doing so, we identified cases in which the MIP generated from a 4D-CT sorting process under-represented the motion envelope of the target volume by more than 10% relative to the MIP generated from all of the cine CT images. The 4D-CT sorting methods suffered from duplicate image selection and could fail to select the images of maximum tumor extent. Based on our results, we suggest using a MIP generated from the full cine CT data set to ensure a representative, inclusive tumor extent and to avoid geometric misses.
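
A MIP over the respiratory phases of a 4D-CT is, at its core, a voxel-wise maximum across the constituent images. A minimal sketch with synthetic volumes (shapes and intensity values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)

# Ten respiratory-phase CT volumes of shape (slices, rows, cols), Hounsfield-like values.
phases = rng.normal(loc=-700, scale=100, size=(10, 50, 128, 128))

# MIP: for every voxel, keep the maximum intensity over all phase (or cine) images.
mip = phases.max(axis=0)                  # shape (50, 128, 128)

# A MIP built from all cine CT images instead of the sorted phases simply has a larger
# first axis; the projection itself is unchanged, but no image (and hence no tumor
# position) can be dropped or duplicated by the sorting step.
```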

Tropical wetlands are estimated to represent about 50% of the natural wetland methane (CH4) emissions and to explain a large fraction of the observed CH4 variability on timescales ranging from glacial–interglacial cycles to the currently observed year-to-year variability. Despite their importance, tropical wetlands are poorly represented in global models aiming to predict global CH4 emissions. This publication documents a first step in the development of a process-based model of CH4 emissions from tropical floodplains for global applications. For this purpose, the LPX-Bern Dynamic Global Vegetation Model (LPX hereafter) was slightly modified to represent floodplain hydrology, vegetation and associated CH4 emissions. The extent of tropical floodplains was prescribed using output from the spatially explicit hydrology model PCR-GLOBWB. We introduced new plant functional types (PFTs) that explicitly represent floodplain vegetation. The PFT parameterizations were evaluated against available remote-sensing data sets (GLC2000 land cover and MODIS Net Primary Productivity). Simulated CH4 flux densities were evaluated against field observations and regional flux inventories. Simulated CH4 emissions at Amazon Basin scale were compared to model simulations performed in the WETCHIMP intercomparison project. We found that LPX reproduces the average magnitude of observed net CH4 flux densities for the Amazon Basin. However, the model does not reproduce the variability between sites or between years within a site. Unfortunately, site information is too limited to confirm or refute some model features. At the Amazon Basin scale, our results underline the large uncertainty in the magnitude of wetland CH4 emissions. Sensitivity analyses gave insights into the main drivers of floodplain CH4 emissions and their associated uncertainties. In particular, uncertainties in floodplain extent (i.e., the difference between GLC2000 and PCR-GLOBWB output) modulate the simulated emissions by a factor of about 2. Our best estimates, using PCR-GLOBWB in combination with GLC2000, lead to simulated Amazon-integrated emissions of 44.4 ± 4.8 Tg yr−1. Additionally, the LPX emissions are highly sensitive to vegetation distribution. Two simulations with the same mean PFT cover, but different spatial distributions of grasslands within the basin, modulated emissions by about 20%. Correcting the LPX-simulated NPP using MODIS reduces the Amazon emissions by 11.3%. Finally, owing to an intrinsic limitation of LPX in accounting for seasonality in floodplain extent, the model failed to reproduce the full dynamics of CH4 emissions, but we proposed solutions to this issue. The interannual variability (IAV) of the emissions increases by 90% if the IAV in floodplain extent is accounted for, but still remains lower than in most of the WETCHIMP models. While our model includes more mechanisms specific to tropical floodplains, we were unable to reduce the uncertainty in the magnitude of wetland CH4 emissions of the Amazon Basin. Our results helped identify and prioritize directions towards more accurate estimates of tropical CH4 emissions, and they stress the need for more research to constrain floodplain CH4 emissions and their temporal variability, even before including other fundamental mechanisms such as floating macrophytes or lateral water fluxes.
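
The basin-integrated numbers quoted above (Tg CH4 yr−1) come from combining flux densities with grid-cell area and flooded fraction; the bookkeeping can be sketched as follows (all values are placeholders, not LPX or PCR-GLOBWB output):

```python
import numpy as np

rng = np.random.default_rng(4)
n_cells = 5000                                    # grid cells covering the basin (placeholder)

flux = rng.uniform(0, 200, n_cells)               # CH4 flux density, mg CH4 m-2 day-1
area = np.full(n_cells, 2.5e9)                    # grid-cell area, m2
extent = rng.uniform(0, 0.3, n_cells)             # flooded fraction per cell

# mg/day -> Tg/yr: multiply by 365 days and divide by 1e15 mg per Tg.
emissions_tg_yr = (flux * area * extent).sum() * 365.0 / 1e15
print(f"basin-integrated emissions: {emissions_tg_yr:.1f} Tg CH4 yr-1")

# Scaling `extent` up or down (e.g. GLC2000 vs. PCR-GLOBWB extents) scales the total
# almost proportionally, which is the factor-of-two sensitivity discussed above.
```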

The newly developed atmosphere–ocean–chemistry–climate model SOCOL-MPIOM is presented by demonstrating the influence of the interactive chemistry module on the climate state and its variability. To this end, we compare pre-industrial control simulations with (CHEM) and without (NOCHEM) interactive chemistry. In general, the influence of the chemistry on the mean state and the variability is small and mainly restricted to the stratosphere and mesosphere. The largest differences are found for the atmospheric dynamics in the polar regions, with slightly stronger northern and southern winter polar vortices in CHEM. The strengthening of the vortex is related to larger stratospheric temperature gradients, which are attributed to a parametrization of the absorption by ozone and oxygen in the Lyman-alpha, Schumann–Runge, Hartley, and Huggins bands. This effect is parametrized in the version with interactive chemistry only. A second reason for the temperature differences between CHEM and NOCHEM is related to diurnal variations in the ozone concentrations in the higher atmosphere, which are missing in NOCHEM. Furthermore, stratospheric water vapour concentrations differ substantially between the two experiments, but their effect on the temperatures is small. In both setups, the simulated intensity and variability of the northern polar vortex are within the range of present-day observations. Sudden stratospheric warming events are well reproduced in terms of their frequency, but the distribution amongst the winter months is too uniform. Additionally, the performance of SOCOL-MPIOM under changing external forcings is assessed for the period 1600–2000 using an ensemble of simulations driven by a spectral solar forcing reconstruction. The amplitude of the reconstruction is large in comparison to other state-of-the-art reconstructions, providing an upper limit for the importance of the solar signal. In the pre-industrial period (1600–1850) the simulated surface temperature trends are in reasonable agreement with temperature reconstructions, although the multi-decadal variability is more pronounced. This enhanced variability can be attributed to the variability in the solar forcing. The simulated temperature reductions during the Maunder Minimum are in the lowest probability range of the proxy records. During the Dalton Minimum, when volcanic forcing is also an important driver of temperature variations, the agreement is better. In the industrial period from 1850 onward, SOCOL-MPIOM overestimates the temperature increase in comparison to observational data sets. Sensitivity simulations show that this overestimation can be attributed to the increasing trend in the solar forcing reconstruction that is used in this study and to an additional warming induced by the simulated ozone changes.

Background context Studies involving factor analysis (FA) of the items in the North American Spine Society (NASS) outcome assessment instrument have revealed inconsistent factor structures for the individual items. Purpose This study examined whether the factor structure of the NASS varied in relation to the severity of the back/neck problem and differed from that originally recommended by the developers of the questionnaire, by analyzing data before and after surgery in a large series of patients undergoing lumbar or cervical disc arthroplasty. Study design/setting Prospective multicenter observational case series. Patient sample Three hundred ninety-one patients with low back pain and 553 patients with neck pain completed questionnaires preoperatively and again at the 3- to 6-month and 12-month follow-ups (FUs), in connection with the SWISSspine disc arthroplasty registry. Outcome measures North American Spine Society outcome assessment instrument. Methods First, an exploratory FA without a priori assumptions and subsequently a confirmatory FA were performed on the 17 items of the NASS-lumbar and the 19 items of the NASS-cervical collected at each assessment time point. The item-loading invariance was tested in the German version of the questionnaire for baseline and FU. Results Both the NASS-lumbar and NASS-cervical factor structures differed between the baseline and postoperative data sets. The confirmatory analysis and item-loading invariance showed a better fit for a three-factor (3F) structure for NASS-lumbar, containing items on "disability," "back pain," and "radiating pain, numbness, and weakness (leg/foot)," and for a 5F structure for NASS-cervical, including disability, "neck pain," "radiating pain and numbness (arm/hand)," "weakness (arm/hand)," and "motor deficit (legs)." Conclusions The best-fitting factor structure at both baseline and FU was selected for both the lumbar- and cervical-NASS questionnaires. It differed from that proposed by the originators of the NASS instruments. Although the NASS questionnaire represents a valid outcome measure for degenerative spine diseases and is able to distinguish among all major symptom domains (factors) in patients undergoing lumbar and cervical disc arthroplasty, its item structure could nonetheless be improved. Any potential revision of the NASS should consider its factorial structure; factorial invariance over time should be aimed for, to allow for more precise interpretations of treatment success.
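
For orientation, an exploratory factor analysis of questionnaire items can be run with standard tools; the sketch below uses scikit-learn on synthetic responses and is not the confirmatory, invariance-tested procedure applied in the study.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(5)

# Synthetic responses: 391 patients x 17 NASS-lumbar items (placeholder data).
n_patients, n_items = 391, 17
latent = rng.normal(size=(n_patients, 3))              # assume 3 underlying factors
loadings = rng.uniform(0, 1, size=(3, n_items))
items = latent @ loadings + rng.normal(scale=0.5, size=(n_patients, n_items))

# Exploratory FA with varimax rotation; the loading matrix shows which items group together.
fa = FactorAnalysis(n_components=3, rotation="varimax", random_state=0)
fa.fit(items)
print(np.round(fa.components_, 2))                     # factors x items loading matrix
```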

INTRODUCTION Even though arthroplasty of the ankle joint is considered to be an established procedure, only about 1,300 endoprostheses are implanted in Germany annually. Arthrodeses of the ankle joint are performed almost three times more often. This may be due to the availability of the procedure - more than twice as many providers perform arthrodesis - as well as the postulated high frequency of revision procedures of arthroplasties in the literature. In those publications, however, there is often no clear differentiation between revision surgery with exchange of components, subsequent interventions due to complications and subsequent surgery not associated with complications. The German Orthopaedic Foot and Ankle Association's (D. A. F.) registry for total ankle replacement collects data pertaining to perioperative complications as well as to the cause, nature and extent of subsequent interventions, and to postoperative patient satisfaction. MATERIAL AND METHODS The D. A. F.'s total ankle replacement register is a nation-wide, voluntary registry. After giving written informed consent, patients can be added to the database by participating providers. Data are collected during the hospital stay for surgical treatment, during routine follow-up inspections and in the context of revision surgery. The information can be submitted in paper-based or online formats. The survey instruments are available as minimum data sets or as scientific questionnaires which include patient-reported outcome measures (PROMs). The pseudonymous clinical data are collected and evaluated at the Institute for Evaluative Research in Medicine, University of Bern, Switzerland (IEFM). The patient-related data remain on the register's module server in North Rhine-Westphalia, Germany. The registry's methodology as well as the results on revisions and patient satisfaction for 115 patients with a two-year follow-up period are presented. Statistical analyses are performed with SAS™ (Version 9.4, SAS Institute, Inc., Cary, NC, USA). RESULTS About 2½ years after the register was launched, 621 data sets on primary implantations, 1,427 on follow-ups and 121 on re-operations were available. 49 % of the patients received their implants because of post-traumatic osteoarthritis, 27 % because of primary osteoarthritis and 15 % suffered from a rheumatic disease. More than 90 % of the primary interventions proceeded without complications. Subsequent interventions were recorded for 84 patients, which corresponds to a rate of 13.5 % with respect to the primary implantations. It should be noted that these secondary procedures also include two-stage procedures not due to a complication. "True revisions" are interventions with exchange of components due to mechanical complications and/or infection and were present in 7.6 % of patients. 415 of the patients commented on their satisfaction with the operative result during the last follow-up: 89.9 % of patients evaluated their outcome as excellent or good, 9.4 % as moderate and only 0.7 % (3 patients) as poor. In these three cases, component loosening or symptomatic osteoarthritis of the subtalar joint (USG) was present. Two-year follow-up data using the American Orthopedic Foot and Ankle Society Ankle and Hindfoot Scale (AOFAS-AHS) are already available for 115 patients. The median AOFAS-AHS score increased from 33 points preoperatively to more than 80 points three to six months postoperatively. This increase remained nearly constant over the entire two-year follow-up period.
CONCLUSION Covering less than 10 % of the approximately 240 providers in Germany and approximately 12 % of the annually implanted total ankle replacements, the D. A. F. register is still far from being a national registry. Nevertheless, its geographical coverage and the inclusion of "high-volume" (more than 100 total ankle replacements a year) and "low-volume" surgeons (fewer than 5 total ankle replacements a year) make the register representative of Germany. The registry data show that the rate of subsequent interventions, and in particular of "true revision" procedures, is markedly lower than the 20 % often postulated in the literature. In addition, a high level of patient satisfaction over the short and medium term is recorded. From the perspective of the authors, these results indicate that total ankle arthroplasty - given a correct indication and appropriate selection of patients - is not inferior to ankle arthrodesis with regard to patient satisfaction and function. The first valid survival rates can be expected about 10 years after the register's start.

The OPERA detector, designed to search for νμ → ντ oscillations in the CNGS beam, is located in the underground Gran Sasso laboratory, a privileged location for studying TeV-scale cosmic rays. For the analysis presented here, the detector was used to measure the atmospheric muon charge ratio in the TeV region. OPERA collected charge-separated cosmic ray data between 2008 and 2012. More than 3 million atmospheric muon events were detected and reconstructed, among which about 110,000 were multiple-muon bundles. The charge ratio Rμ ≡ Nμ+/Nμ− was measured separately for single and for multiple muon events. The analysis exploited the inversion of the magnet polarity that was deliberately performed during the 2012 run. The combination of the two data sets with opposite magnet polarities allowed systematic uncertainties to be minimized and an accurate determination of the muon charge ratio to be reached. Data were fitted to obtain relevant parameters on the composition of primary cosmic rays and the associated kaon production in the forward fragmentation region. In the surface energy range 1–20 TeV investigated by OPERA, Rμ is well described by a parametric model including only pion and kaon contributions to the muon flux, showing no significant contribution of the prompt component. The energy independence supports the validity of Feynman scaling in the fragmentation region up to 200 TeV/nucleon primary energy.
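
The pion/kaon parametrization referred to above is usually written in a form like the following (given here for orientation as the standard textbook expression, not necessarily the exact function fitted by OPERA): fπ and fK denote the fractions of positive pions and kaons, 115 GeV and 850 GeV are the pion and kaon critical energies, and 0.054 accounts for the kaon branching ratio and kinematics.

```latex
R_\mu(E_\mu \cos\theta) \simeq
\frac{\dfrac{f_\pi}{1 + 1.1\,E_\mu\cos\theta / 115\,\mathrm{GeV}}
      + \dfrac{0.054\, f_K}{1 + 1.1\,E_\mu\cos\theta / 850\,\mathrm{GeV}}}
     {\dfrac{1 - f_\pi}{1 + 1.1\,E_\mu\cos\theta / 115\,\mathrm{GeV}}
      + \dfrac{0.054\,(1 - f_K)}{1 + 1.1\,E_\mu\cos\theta / 850\,\mathrm{GeV}}}
```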

BACKGROUND Record linkage of existing individual health care data is an efficient way to answer important epidemiological research questions. Reuse of individual health-related data faces several problems: either a unique personal identifier, such as a social security number, is not available, or non-unique person-identifiable information, such as names, is privacy protected and cannot be accessed. A solution to protect privacy in probabilistic record linkages is to encrypt this sensitive information. Unfortunately, encrypted hash codes of two names differ completely if the plain names differ by only a single character. Therefore, standard encryption methods cannot be applied. To overcome these challenges, we developed the Privacy Preserving Probabilistic Record Linkage (P3RL) method. METHODS In this Privacy Preserving Probabilistic Record Linkage method we apply a three-party protocol, with two sites collecting individual data and an independent trusted linkage center as the third partner. Our method consists of three main steps: pre-processing, encryption and probabilistic record linkage. Data pre-processing and encryption are done at the sites by local personnel. To guarantee similar quality and format of variables and an identical encryption procedure at each site, the linkage center generates semi-automated pre-processing and encryption templates. To retrieve information (i.e. data structure) for the creation of templates without ever accessing plain person-identifiable information, we introduced a novel method of data masking. Sensitive string variables are encrypted using Bloom filters, which enables the calculation of similarity coefficients. For date variables, we developed special encryption procedures to handle the most common date errors. The linkage center performs probabilistic record linkage with encrypted person-identifiable information and plain non-sensitive variables. RESULTS In this paper we describe step by step how to link existing health-related data using encryption methods to preserve the privacy of the persons in the study. CONCLUSION Privacy Preserving Probabilistic Record Linkage expands record linkage facilities in settings where a unique identifier is unavailable and/or regulations restrict access to the non-unique person-identifiable information needed to link existing health-related data sets. Automated pre-processing and encryption fully protect sensitive information, ensuring participant confidentiality. This method is suitable not just for epidemiological research but also for any setting with similar challenges.
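
A minimal sketch of the Bloom-filter encoding and the similarity calculation it enables is given below; the q-gram length, filter size, number of hashes and shared key are illustrative choices, not the P3RL template settings.

```python
import hashlib

def qgrams(name, q=2):
    """Split a padded, lower-cased name into overlapping q-grams."""
    padded = f"_{name.lower()}_"
    return {padded[i:i + q] for i in range(len(padded) - q + 1)}

def bloom_encode(name, n_bits=1000, n_hashes=15, secret="shared-key"):
    """Map a name's q-grams into a Bloom filter via keyed hash functions."""
    bits = set()
    for gram in qgrams(name):
        for k in range(n_hashes):
            digest = hashlib.sha256(f"{secret}|{k}|{gram}".encode()).hexdigest()
            bits.add(int(digest, 16) % n_bits)
    return bits

def dice(a, b):
    """Dice similarity of two Bloom filters (represented as sets of set bit positions)."""
    return 2 * len(a & b) / (len(a) + len(b))

# Similar names keep a high similarity even though plain hashes of the full strings would
# differ completely - this is what makes probabilistic linkage on encrypted names possible.
print(dice(bloom_encode("meier"), bloom_encode("meyer")))    # high
print(dice(bloom_encode("meier"), bloom_encode("garcia")))   # low
```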

Information on the relationship between cumulative fossil CO2 emissions and multiple climate targets is essential to design emission mitigation and climate adaptation strategies. In this study, the transient response of a climate or environmental variable per trillion tonnes of CO2 emissions, termed TRE, is quantified for a set of impact-relevant climate variables and for a large set of multi-forcing scenarios extended to year 2300 towards stabilization. An ∼1000-member ensemble of the Bern3D-LPJ carbon–climate model is applied and model outcomes are constrained by 26 physical and biogeochemical observational data sets in a Bayesian, Monte Carlo-type framework. Uncertainties in TRE estimates include both scenario uncertainty and model response uncertainty. Cumulative fossil emissions of 1000 Gt C result in a global mean surface air temperature change of 1.9 °C (68 % confidence interval (c.i.): 1.3 to 2.7 °C), a decrease in surface ocean pH of 0.19 (0.18 to 0.22), and a steric sea level rise of 20 cm (13 to 27 cm until 2300). Linearity between cumulative emissions and transient response is high for pH and reasonably high for surface air and sea surface temperatures, but less pronounced for changes in the Atlantic meridional overturning, in Southern Ocean and tropical surface water saturation with respect to biogenic structures of calcium carbonate, and in carbon stocks in soils. The constrained model ensemble is also applied to determine the response to a pulse-like emission and in idealized CO2-only simulations. The transient climate response is constrained, primarily by long-term ocean heat observations, to 1.7 °C (68 % c.i.: 1.3 to 2.2 °C) and the equilibrium climate sensitivity to 2.9 °C (2.0 to 4.2 °C). This is consistent with results from CMIP5 models but inconsistent with recent studies that relied on short-term air temperature data affected by natural climate variability.
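
The Bayesian, Monte Carlo-type constraint amounts to weighting ensemble members by how well they reproduce observations and then reporting weighted quantiles. A stripped-down sketch with a single toy constraint (instead of the 26 data sets used in the study) and made-up numbers:

```python
import numpy as np

rng = np.random.default_rng(6)
n_members = 1000

# Toy ensemble: warming per 1000 Gt C (deg C) and a related observable diagnostic,
# e.g. ocean heat uptake; both are synthetic placeholders.
warming_per_ttc = rng.normal(2.0, 0.6, n_members)
ocean_heat = 0.15 * warming_per_ttc + rng.normal(0, 0.05, n_members)

obs, obs_err = 0.30, 0.03                                       # toy observation and its error
weights = np.exp(-0.5 * ((ocean_heat - obs) / obs_err) ** 2)    # Gaussian likelihood
weights /= weights.sum()

def weighted_quantile(values, w, q):
    order = np.argsort(values)
    cdf = np.cumsum(w[order])
    return np.interp(q, cdf, values[order])

# Observation-constrained median and 68 % range of the warming per 1000 Gt C
print([round(weighted_quantile(warming_per_ttc, weights, q), 2) for q in (0.16, 0.5, 0.84)])
```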

Cyclones that develop over the western Mediterranean and move northeastward are a major source of extreme weather and are known to be responsible for heavy precipitation on the northern side of the Alpine range and in Central Europe. As the relevant processes triggering these so-called Vb events and their impact on extreme precipitation are not yet fully understood, this study focuses on gaining insight into the dynamics of past events. For this, a cyclone detection and tracking tool is applied to the ERA-Interim reanalysis (1979–2013) to identify prominent Vb situations. Precipitation in the ERA-Interim and the E-OBS data sets is used to evaluate case-to-case precipitation amounts and to assess consistency between the two data sets. Both data sets exhibit high variability in precipitation amounts among different Vb events. While only 23 % of all Vb events are associated with extreme precipitation, around 15 % of all extreme precipitation days (99th percentile) over the northern Alpine region and Central Europe are induced by Vb events, although Vb cyclones are rare events (2.3 per year). To obtain a better understanding of the variability within Vb events, an analysis of the 10 heaviest- and the 10 lowest-precipitation Vb events reveals noticeable differences in the state of the atmosphere. These differences are most pronounced in the geopotential height and potential vorticity fields, indicating a much stronger cyclone for heavy precipitation events. The related differences in wind direction are responsible for the moisture transport around the Alps and the orographic lifting along the northern slopes of the Alps. These effects are the main reasons for the disastrous outcome of Vb events and are consequently absent in the Vb events associated with low precipitation. Hence, our results indicate that heavy precipitation during Vb events is mainly governed by large-scale dynamics rather than by thermodynamic processes.
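
The attribution statistics quoted above reduce to counting threshold exceedances and their overlap with Vb days. A schematic with synthetic daily series (the real analysis uses E-OBS/ERA-Interim precipitation and tracked cyclones):

```python
import numpy as np

rng = np.random.default_rng(7)
n_days = 35 * 365                                        # roughly 1979-2013

precip = rng.gamma(shape=0.5, scale=4.0, size=n_days)    # daily area-mean precipitation, mm
vb_day = rng.random(n_days) < (2.3 * 3 / 365)            # ~2.3 Vb cyclones/yr, ~3 days each (toy flag)

threshold = np.percentile(precip, 99)                    # 99th-percentile threshold
extreme = precip > threshold

print(f"{(extreme & vb_day).sum() / extreme.sum():.1%} of extreme days fall on Vb days")
print(f"{(extreme & vb_day).sum() / vb_day.sum():.1%} of Vb days are extreme")
# With real data these two shares correspond to the ~15 % and ~23 % quoted above;
# with the independent synthetic series they simply reflect the base rates.
```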

Pencil beam scanned (PBS) proton therapy has many advantages over conventional radiotherapy, but its effectiveness for treating mobile tumours remains questionable. Gating dose delivery to the breathing pattern is a well-developed method in conventional radiotherapy for mitigating tumour motion, but its clinical efficiency for PBS proton therapy is not yet well documented. In this study, the dosimetric benefits and the treatment efficiency of beam gating for PBS proton therapy have been comprehensively evaluated. A series of dedicated 4D dose calculations (4DDC) have been performed on 9 different 4DCT(MRI) liver data sets, which provide realistic 4DCTs with motion information extracted from 4DMRI. The value of 4DCT(MRI) lies in its capability of providing not only patient geometries and deformable breathing characteristics, but also variations in the breathing pattern between breathing cycles. In order to monitor target motion and derive a gating signal, we simulate time-resolved beam's eye view (BEV) x-ray images as an online motion surrogate. 4DDCs have been performed using three amplitude-based gating window sizes (10/5/3 mm) with motion surrogates derived from either pre-implanted fiducial markers or the diaphragm. In addition, gating has also been simulated in combination with up to 19 times rescanning using either volumetric or layered approaches. The quality of the resulting 4DDC plans has been quantified in terms of the plan homogeneity index (HI), total treatment time and duty cycle. Results show that neither beam gating nor rescanning alone can fully retrieve the plan homogeneity of the static reference plan. Especially for variable breathing patterns, reductions of the effective duty cycle to as low as 10% have been observed with the smallest gating window (3 mm), implying that gating on its own would, for such cases, result in much longer treatment times. In addition, when rescanning is applied on its own, large differences between volumetric and layered rescanning have been observed as a function of the number of re-scans. However, once gating and rescanning are combined, an HI to within 2% of the static plan could be achieved in the clinical target volume, with only moderately prolonged treatment times, irrespective of the rescanning strategy used. Moreover, these results are independent of the motion surrogate used. In conclusion, our results suggest that image-guided beam gating, combined with rescanning, is a feasible, effective and efficient motion-mitigation approach for PBS-based liver tumour treatments.
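
Amplitude-based gating turns a motion-surrogate trace into a beam-on signal, and the duty cycle follows directly from the gating window. A minimal sketch with a synthetic breathing trace (window sizes as in the study, everything else illustrative):

```python
import numpy as np

rng = np.random.default_rng(8)
t = np.arange(0, 120, 0.05)                             # 2 minutes sampled at 20 Hz
amplitude = 8 * np.sin(np.pi * t / 4.5) ** 2            # ~4.5 s breathing cycle, 8 mm peak amplitude
amplitude += rng.normal(0, 0.3, t.size)                 # cycle-to-cycle irregularity

for window in (10.0, 5.0, 3.0):                         # amplitude gating windows, mm
    beam_on = amplitude < window                        # beam on near the exhale baseline
    print(f"{window:4.0f} mm window: duty cycle = {beam_on.mean():.0%}")

# Smaller windows leave less residual motion but lower the duty cycle (longer delivery),
# which is why gating is combined with rescanning in the study.
```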

A wide variety of spatial data collection efforts are ongoing throughout local, state and federal agencies, private firms and non-profit organizations. Each effort is established for a different purpose, but organizations and individuals often collect and maintain the same or similar information. The United States federal government has undertaken many initiatives such as the National Spatial Data Infrastructure, the National Map and Geospatial One-Stop to reduce duplicative spatial data collection and promote the coordinated use, sharing, and dissemination of spatial data nationwide. A key premise of most of these initiatives is that no national government will be able to gather and maintain more than a small percentage of the geographic data that users want and need. Thus, national initiatives typically depend on the cooperation of those already gathering spatial data and those using GIS to meet specific needs to help construct and maintain these spatial data infrastructures and geo-libraries for their nations (Onsrud 2001). Some of the impediments to widespread spatial data sharing are well known from directly asking GIS data producers why they are not currently involved in creating datasets that are of common or compatible formats, documenting their datasets in a standardized metadata format or making their datasets more readily available to others through Data Clearinghouses or geo-libraries. The research described in this thesis addresses the impediments to wide-scale spatial data sharing faced by GIS data producers and explores a new conceptual data-sharing approach, the Public Commons for Geospatial Data, that supports user-friendly metadata creation, open access licenses, archival services and documentation of the parent lineage of the contributors and value-adders of digital spatial data sets.