936 results for "Target Field Method"


Relevance: 30.00%

Abstract:

Neural stem cells have been proposed as a new and promising treatment modality in various pathologies of the central nervous system, including malignant brain tumors. However, the underlying mechanism by which neural stem cells target tumor areas remains elusive. Monitoring of these cells is currently done with various modes of molecular imaging, such as optical imaging, magnetic resonance imaging and positron emission tomography, novel technologies for visualizing processes ranging from metabolism and signal transduction to gene expression. In this new context, the microenvironment of (malignant) brain tumors and the blood-brain barrier gain increased interest. The authors of this review give a unique overview of the current molecular-imaging techniques used in different therapeutic experimental brain tumor models in relation to neural stem cells. Such methods for molecular imaging of gene-engineered neural stem/progenitor cells are currently used to trace the location and temporal level of expression of therapeutic and endogenous genes in malignant brain tumors, closing the gap between in vitro and in vivo integrative biology of disease in neural stem cell transplantation.

Relevance: 30.00%

Abstract:

Matrix effects, which represent an important issue in liquid chromatography coupled to mass spectrometry or tandem mass spectrometry detection, should be closely assessed during method development. In the case of quantitative analysis, the use of a stable isotope-labelled internal standard with physico-chemical properties and ionization behaviour similar to those of the analyte is recommended. In this paper, an example of the choice of a co-eluting deuterated internal standard to compensate for short-term and long-term matrix effects in chiral (R,S)-methadone plasma quantification is reported. The method was fully validated over a concentration range of 5-800 ng/mL for each methadone enantiomer, with satisfactory relative bias (-1.0 to 1.0%), repeatability (0.9-4.9%) and intermediate precision (1.4-12.0%). From the results obtained during validation, a control chart process was established over 52 series of routine analyses, using both the intermediate precision standard deviation and the FDA acceptance criteria. The results of routine quality control samples generally fell within the ±15% variability around the target value, and mainly within the two-standard-deviation interval, illustrating the long-term stability of the method. The intermediate precision variability estimated during method validation was found to be consistent with the routine use of the method. During this period, 257 trough-concentration and 54 peak-concentration plasma samples from patients undergoing (R,S)-methadone treatment were successfully analysed for routine therapeutic drug monitoring.
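The two acceptance rules described above (routine QC results checked against the ±15% FDA criterion around the target value and against a two-standard-deviation control chart interval) can be sketched in a few lines. The numeric values below (a 100 ng/mL QC level with an intermediate-precision SD of 4 ng/mL) are illustrative assumptions, not the study's data:

```python
def qc_flags(measured, target, sd_intermediate):
    """Check a routine QC result against two acceptance rules:
    - FDA criterion: within +/-15% of the nominal (target) value
    - control chart: within target +/- 2 * intermediate-precision SD
    Returns (within_fda, within_2sd)."""
    within_fda = abs(measured - target) <= 0.15 * target
    within_2sd = abs(measured - target) <= 2.0 * sd_intermediate
    return within_fda, within_2sd

# Illustrative QC level: nominal 100 ng/mL, intermediate-precision SD 4 ng/mL
target, sd = 100.0, 4.0
print(qc_flags(104.0, target, sd))   # inside both limits
print(qc_flags(112.0, target, sd))   # inside +/-15% but outside the 2 SD interval
```

Note that the 2 SD interval is the tighter of the two rules whenever the intermediate-precision CV is below 7.5%, which is why most routine results here fell inside both.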

Relevance: 30.00%

Abstract:

BACKGROUND AND STUDY AIMS: Various screening methods for colorectal cancer (CRC) are promoted by professional societies; however, few data are available about the factors that determine patient participation in screening, which is crucial to the success of population-based programs. This study aimed (i) to identify factors that determine acceptance of screening and preference of screening method, and (ii) to evaluate procedure success, detection of colorectal neoplasia, and patient satisfaction with screening colonoscopy. PATIENTS AND METHODS: Following a public awareness campaign, the population aged 50-80 years was offered CRC screening in the form of annual fecal occult blood tests, flexible sigmoidoscopy, a combination of both, or colonoscopy. RESULTS: 2731 asymptomatic persons (12.0% of the target population) registered for the screening program and were eligible to take part. Access to information and a positive attitude to screening were major determinants of participation. Colonoscopy was the method preferred by 74.8% of participants. Advanced colorectal neoplasia was present in 8.5%; its prevalence was higher in males and increased with age. Significant complications occurred in 0.5% of those undergoing colonoscopy and were associated with polypectomy or sedation. Most patients were satisfied with colonoscopy and over 90% would choose it again for CRC screening. CONCLUSIONS: In this population-based study, only a small proportion of the target population underwent CRC screening despite an extensive information campaign. Colonoscopy was the preferred method and was safe. The determinants of participation in screening and preference of screening method, together with the distribution of colorectal neoplasia in different demographic categories, provide a rationale for improving screening procedures.

Relevance: 30.00%

Abstract:

Pulsed-field gel electrophoresis (PFGE) is widely used for epidemic investigations of methicillin-resistant Staphylococcus aureus (MRSA). In the present study, we evaluated its use in a long-term epidemiological setting (years to a few decades, country to continent level). The clustering obtained from PFGE patterns after SmaI digestion of the DNA of 20 strains was compared to that obtained using a phylogenetic typing method (multiprimer RAPD). The results showed that the analysis of small PFGE bands (10-85 kb) correlates better with multiprimer RAPD than the analysis of large PFGE bands (85-700 kb), suggesting that the analysis of small bands would be more suitable for the investigation of long-term epidemiological settings. However, given the technical difficulty of obtaining good resolution of these bands and the putative presence of plasmids among them, PFGE does not appear to be a method of choice for the long-term epidemiological analysis of MRSA.

Relevance: 30.00%

Abstract:

Objectives: Imatinib has been increasingly proposed for therapeutic drug monitoring (TDM), as trough concentrations (Cmin) correlate with response rates in CML patients. This analysis aimed to evaluate the impact of imatinib exposure on optimal molecular response rates in a large European cohort of patients followed by centralized TDM. Methods: Sequential PK/PD analysis was performed in NONMEM 7 on 2230 plasma (PK) samples obtained along with molecular response (PD) data from 1299 CML patients. Model-based individual Bayesian estimates of exposure, parameterized as initial-dose-adjusted, log-normalized Cmin (log-Cmin) or clearance (CL), were investigated as potential predictors of optimal molecular response, while accounting for time under treatment (stratified at 3 years), gender, CML phase, age, potentially interacting comedication, and TDM frequency. The PK/PD analysis used mixed-effect logistic regression (iterative two-stage method) to account for intra-patient correlation. Results: In univariate analyses, CL, log-Cmin, time under treatment, TDM frequency, gender (all p<0.01) and CML phase (p=0.02) were significant predictors of the outcome. In multivariate analyses, all but log-Cmin remained significant (p<0.05). Our model estimates a 54.1% probability of optimal molecular response in a female patient with a median CL of 14.4 L/h, increasing by 4.7% with a 35% decrease in CL (10th percentile of the CL distribution) and decreasing by 6% with a 45% increase in CL (90th percentile), respectively. Male patients were less likely than females to be in optimal response (odds ratio: 0.62, p<0.001), with an estimated probability of 42.3%. Conclusions: Beyond CML phase and time on treatment, expectedly correlated with the outcome, an effect of initial imatinib exposure on the probability of achieving optimal molecular response was confirmed under field conditions by this multivariate analysis. Interestingly, male patients had a higher risk of suboptimal response, which might not derive exclusively from their 18.5% higher CL, but also from their reported lower adherence to treatment. A prospective longitudinal study would be desirable to confirm the clinical importance of the identified covariates and to exclude biases possibly affecting this observational survey.
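The relation between the reported odds ratio and the two response probabilities can be checked directly: converting the female probability to odds, applying the odds ratio for males, and converting back reproduces the quoted 42.3%. The snippet below only re-derives the figures stated above:

```python
def apply_odds_ratio(p_ref, odds_ratio):
    """Convert a reference probability to odds, apply an odds ratio,
    and convert back to a probability."""
    odds = p_ref / (1.0 - p_ref)
    new_odds = odds * odds_ratio
    return new_odds / (1.0 + new_odds)

# Female patient at median CL: 54.1% probability of optimal response
p_female = 0.541
p_male = apply_odds_ratio(p_female, 0.62)   # OR 0.62 for males
print(round(p_male, 3))   # ~0.42, consistent with the reported 42.3%
```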

Relevance: 30.00%

Abstract:

The studies of Giacomo Becattini concerning the notion of the "Marshallian industrial district" have led to a revolution in the field of economic development around the world. The paper offers an interpretation of the methodology adopted by Becattini, whose roots are clearly Marshallian: Becattini proposes a return to economics as a complex social science that operates in historical time. We adopt a Schumpeterian approach to method in economic analysis in order to highlight the similarities between Marshall's and Becattini's approaches. Finally, the paper uses the distinction between logical time, real time and historical time, which enables us to study the "localized" economic process in a Becattinian way.

Relevance: 30.00%

Abstract:

Among the various determinants of treatment response, the achievement of sufficient blood levels is essential for curing malaria. To improve our current understanding of the pharmacokinetics, efficacy and toxicity of antimalarial drugs, we have developed a liquid chromatography-tandem mass spectrometry (LC-MS/MS) method requiring 200 μl of plasma for the simultaneous determination of 14 antimalarial drugs and their metabolites, the components of the current first-line combination treatments for malaria (artemether, artesunate, dihydroartemisinin, amodiaquine, N-desethyl-amodiaquine, lumefantrine, desbutyl-lumefantrine, piperaquine, pyronaridine, mefloquine, chloroquine, quinine, pyrimethamine and sulfadoxine). Plasma is purified by a combination of protein precipitation, evaporation and reconstitution in methanol/20 mM ammonium formate (pH 4.0) 1:1. Reverse-phase chromatographic separation of the antimalarial drugs is obtained using gradient elution of 20 mM ammonium formate and acetonitrile, both containing 0.5% formic acid, followed by rinsing and re-equilibration to the initial solvent composition, up to 21 min in total. Analyte quantification, using matrix-matched calibration samples, is performed by electrospray ionization-triple quadrupole mass spectrometry with selected reaction monitoring detection in positive mode. The method was validated according to FDA recommendations, including assessment of extraction yield, matrix effect variability, overall process efficiency and standard addition experiments, as well as the short- and long-term stability of antimalarials in plasma. The reactivity of endoperoxide-containing antimalarials in the presence of hemolysis was tested both in vitro and on samples from malaria patients. With this method, the signal intensity of artemisinin decreased by about 20% in the presence of 0.2% hemolysed red blood cells in plasma, whereas its derivatives were essentially unaffected.
The method is precise (inter-day CV: 3.1-12.6%) and sensitive (lower limits of quantification 0.15-3.0 ng/ml for basic/neutral antimalarials and 0.75-5 ng/ml for artemisinin derivatives). This is the first broad-range LC-MS/MS assay covering the antimalarials currently in use. It is an improvement over previous methods in terms of convenience (a single extraction procedure for 14 major antimalarials and metabolites, significantly reducing the analytical time), sensitivity, selectivity and throughput. While its main limitation is the investment cost of the equipment, plasma samples can be collected in the field and kept at 4 °C for up to 48 h before storage at -80 °C. It is suited to detecting the presence of drug in subjects for screening purposes and to quantifying drug exposure after treatment. It may contribute to filling the current knowledge gaps in the pharmacokinetic/pharmacodynamic relationships of antimalarials and to better defining the therapeutic dose ranges in different patient populations.

Relevance: 30.00%

Abstract:

Due to the overlapping distribution of Trypanosoma rangeli and T. cruzi in Central and South America, where they share several reservoirs and triatomine vectors, we herein describe a simple method to collect triatomine feces and hemolymph on filter paper for further detection and specific characterization of these two trypanosomes. Feces and hemolymph from experimentally infected triatomines were collected on filter paper, and specific detection of T. rangeli or T. cruzi DNA by polymerase chain reaction was achieved. This simple DNA collection method allows samples to be collected in the field and the trypanosomes to be specifically detected and characterized later in the laboratory.

Relevance: 30.00%

Abstract:

Research project carried out during a stay at the University of Calgary, Canada, between December 2007 and February 2008. The project consisted of analysing the data of a study in the field of the psychology of music, specifically on how music influences attention via the person's emotional and energetic states. Video was used during the sessions, providing visual and auditory data to complement the quantitative data from the attention tests administered. The analysis was carried out using qualitative methods and techniques learned during the stay. The stay also deepened the understanding of the qualitative paradigm as a valid paradigm that genuinely complements the quantitative one. The work focused especially on conversation analysis from an interpretative standpoint, as well as on the analysis of body and facial language from video observation, formulating descriptors and sub-descriptors of the behaviour related to the hypothesis. Some descriptors had been formulated prior to the analysis, based on other studies and on the researcher's background; others were discovered during the analysis. The behavioural descriptors and sub-descriptors relate to the mood and energy states of the different participants. The analysis was conducted as a case study, with an exhaustive person-by-person analysis aimed at finding intrapersonal and interpersonal reaction patterns. The observed patterns will be contrasted with the quantitative information, triangulating the data to find possible mutual support or contradictions.
Preliminary results indicate a relationship between the type of music and behaviour: music with negative emotionality is associated with the person closing up, whereas when the music is energetic the participants become activated (as observed behaviourally), and they smile if it is positive.

Relevance: 30.00%

Abstract:

We present a novel hybrid (or multiphysics) algorithm, which couples pore-scale and Darcy descriptions of two-phase flow in porous media. The flow at the pore-scale is described by the Navier-Stokes equations, and the Volume of Fluid (VOF) method is used to model the evolution of the fluid-fluid interface. An extension of the Multiscale Finite Volume (MsFV) method is employed to construct the Darcy-scale problem. First, a set of local interpolators for pressure and velocity is constructed by solving the Navier-Stokes equations; then, a coarse mass-conservation problem is constructed by averaging the pore-scale velocity over the cells of a coarse grid, which act as control volumes; finally, a conservative pore-scale velocity field is reconstructed and used to advect the fluid-fluid interface. The method relies on the localization assumptions used to compute the interpolators (which are quite straightforward extensions of the standard MsFV) and on the postulate that the coarse-scale fluxes are proportional to the coarse-pressure differences. By numerical simulations of two-phase problems, we demonstrate that these assumptions provide hybrid solutions that are in good agreement with reference pore-scale solutions and are able to model the transition from stable to unstable flow regimes. Our hybrid method can naturally take advantage of several adaptive strategies and allows considering pore-scale fluxes only in some regions, while Darcy fluxes are used in the rest of the domain. Moreover, since the method relies on the assumption that the relationship between coarse-scale fluxes and pressure differences is local, it can be used as a numerical tool to investigate the limits of validity of Darcy's law and to understand the link between pore-scale quantities and their corresponding Darcy-scale variables.
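The coarse mass-conservation step described above, summing pore-scale face fluxes over the faces of coarse control volumes, can be illustrated on a toy grid. The sketch below is a minimal illustration, not the authors' solver: it builds an exactly divergence-free fine-scale flux field from a random stream function (an assumed construction, chosen so mass conservation holds by design) and verifies that the block-summed coarse fluxes conserve mass as well:

```python
import numpy as np

rng = np.random.default_rng(0)
n, b = 16, 4                                # fine cells per side, coarse block size
psi = rng.standard_normal((n + 1, n + 1))   # stream function at cell corners

# Fine-scale face fluxes derived from the stream function (divergence-free)
fx = psi[:, 1:] - psi[:, :-1]      # flux through vertical faces, shape (n+1, n)
fy = -(psi[1:, :] - psi[:-1, :])   # flux through horizontal faces, shape (n, n+1)

# Fine-cell mass balance vanishes to machine precision
div_fine = fx[1:, :] - fx[:-1, :] + fy[:, 1:] - fy[:, :-1]
assert np.abs(div_fine).max() < 1e-12

# Coarse control volumes: sum fine fluxes over each coarse face
fx_coarse = fx[::b, :].reshape(n // b + 1, n // b, b).sum(axis=2)
fy_coarse = fy[:, ::b].reshape(n // b, b, n // b + 1).sum(axis=1)
div_coarse = (fx_coarse[1:, :] - fx_coarse[:-1, :]
              + fy_coarse[:, 1:] - fy_coarse[:, :-1])
print(np.abs(div_coarse).max())    # ~1e-14: coarse mass balance holds too
```

Interior fine fluxes cancel telescopically inside each coarse block, so the coarse balance inherits conservation from the fine scale, which is the property the MsFV-style coarse problem exploits.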

Relevance: 30.00%

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields.
The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity as well as the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov Chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems in order to allow for a more accurate and realistic quantification of the uncertainty associated with the thus inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigate two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution.
The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach was proven to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
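The gradual-deformation perturbation mentioned in the conclusion can be sketched in its classical generic form: two independent standard-Gaussian realizations are combined with weights cos θ and sin θ, so the prior statistics are preserved for any θ while θ tunes the perturbation strength. This is a textbook illustration, not the exact scheme developed in the thesis:

```python
import numpy as np

def gradual_deformation(m_current, m_independent, theta):
    """Combine two standard-Gaussian realizations; since
    cos^2(theta) + sin^2(theta) = 1, the result keeps the same mean
    and covariance, and theta controls the perturbation strength."""
    return np.cos(theta) * m_current + np.sin(theta) * m_independent

rng = np.random.default_rng(42)
m1 = rng.standard_normal(200_000)   # current model realization
m2 = rng.standard_normal(200_000)   # independent prior realization

m_new = gradual_deformation(m1, m2, theta=0.3)
print(round(m_new.std(), 2))   # ~1.0: prior statistics are preserved
# small theta -> small MCMC step (high acceptance); theta = pi/2 -> independent draw
```

The appeal as an MCMC proposal is exactly this continuous dial: unlike sequential resampling of a fixed block of nodes, θ can be tuned to keep acceptance rates reasonable even when model parameters are strongly spatially correlated.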

Relevance: 30.00%

Abstract:

Whole-body (WB) planar imaging has long been one of the staple methods of dosimetry, and its quantification has been formalized by the MIRD Committee in Pamphlet No. 16. One of the issues not specifically addressed in the formalism occurs when the count rates reaching the detector are sufficiently high to result in camera count saturation. Camera dead-time effects have been extensively studied, but all of the developed correction methods assume static acquisitions. However, during WB planar (sweep) imaging, a variable amount of imaged activity exists in the detector's field of view as a function of time, and therefore the camera saturation is time dependent. A new time-dependent algorithm was developed to correct for dead-time effects during WB planar acquisitions that accounts for the relative motion between the detector heads and the imaged object. Static camera dead-time parameters were acquired by imaging decaying activity in a phantom and obtaining a saturation curve. Using these parameters, an iterative algorithm akin to Newton's method was developed which takes into account the variable count rate seen by the detector as a function of time. The algorithm was tested on simulated data as well as on a whole-body scan of high-activity samarium-153 in an ellipsoid phantom. A complete set of parameters from unsaturated phantom data necessary for count-rate-to-activity conversion was also obtained, including build-up and attenuation coefficients, in order to convert corrected count rate values to activity. The algorithm proved successful in accounting for motion- and time-dependent saturation effects in both the simulated and measured data and converged to any desired degree of precision. The clearance half-life calculated from the ellipsoid phantom data was 45.1 h after dead-time correction and 51.4 h with no correction; the physical decay half-life of samarium-153 is 46.3 h.
Accurate WB planar dosimetry of high activities relies on successfully compensating for camera saturation in a way that takes into account the variable activity in the field of view, i.e. time-dependent dead-time effects. The algorithm presented here accomplishes this task.
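An iteration "akin to Newton's method" for inverting a count-rate saturation curve can be sketched as follows. The paralyzable dead-time model m = n·exp(-n·τ) is assumed here purely for illustration (the paper's fitted saturation curve is not given in the abstract); Newton's method then recovers the true rate n from the observed, saturated rate m:

```python
import math

def true_rate(m_observed, tau, tol=1e-10, max_iter=50):
    """Invert m = n * exp(-n * tau) (paralyzable dead-time model)
    for the true count rate n via Newton's method, starting from the
    observed rate (converges to the low-rate branch, n * tau < 1)."""
    n = m_observed
    for _ in range(max_iter):
        f = n * math.exp(-n * tau) - m_observed
        fprime = math.exp(-n * tau) * (1.0 - n * tau)
        step = f / fprime
        n -= step
        if abs(step) < tol:
            break
    return n

tau = 5e-6                              # 5 microsecond dead time (illustrative)
n_true = 5.0e4                          # true rate, counts per second
m = n_true * math.exp(-n_true * tau)    # observed (saturated) rate, ~39,000 cps
print(round(true_rate(m, tau)))         # 50000
```

In the time-dependent WB case, this inversion would be applied per time frame with the instantaneous in-view activity, which is the essence of making the correction time dependent.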

Relevance: 30.00%

Abstract:

In this paper, the iterative MSFV method is extended to include the sequential implicit simulation of time-dependent problems involving the solution of a system of pressure-saturation equations. To control numerical errors in the simulation results, an error estimate based on the residual of the MSFV approximate pressure field is introduced. In the initial time steps of the simulation, iterations are employed until a specified accuracy in pressure is achieved. This initial solution is then used to improve the localization assumption at later time steps. Additional iterations in the pressure solution are employed only when the pressure residual becomes larger than a specified threshold value. The efficiency of the strategy and the error control criteria are numerically investigated. This paper also shows that it is possible to derive an a priori estimate and control, based on the allowed pressure-equation residual, to guarantee the desired accuracy in the saturation calculation.
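The residual-based control described above (take extra pressure iterations only while the residual exceeds a prescribed threshold) can be mimicked on a small model problem. The sketch below uses plain Jacobi iterations on a 1D Poisson system purely to illustrate the stopping criterion, not the MSFV operator itself:

```python
import numpy as np

def solve_pressure(b, tol, max_iter=10_000):
    """Jacobi iterations on the 1D Poisson system A p = b, with
    A = tridiag(-1, 2, -1); additional iterations are taken only
    while the residual norm exceeds the prescribed threshold."""
    p = np.zeros(b.size)
    for it in range(max_iter):
        # residual r = b - A p (np.roll wraps, so fix the boundary rows)
        residual = b - (2 * p - np.roll(p, 1) - np.roll(p, -1))
        residual[0] = b[0] - (2 * p[0] - p[1])
        residual[-1] = b[-1] - (2 * p[-1] - p[-2])
        if np.linalg.norm(residual) <= tol:
            return p, it              # accuracy reached: stop iterating
        p = p + residual / 2.0        # Jacobi update: p += D^-1 r, D = 2 I
    return p, max_iter

b = np.ones(20)
p, iters = solve_pressure(b, tol=1e-8)
print(iters)   # iterations needed to drive the residual below the threshold
```

Raising `tol` trades accuracy for speed in exactly the way the paper's adaptive strategy does: once the residual stays below the threshold, no further pressure iterations are spent.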

Relevance: 30.00%

Abstract:

INTRODUCTION: Gamma Knife surgery (GKS) is a non-invasive neurosurgical stereotactic procedure, increasingly used as an alternative to open functional procedures. This includes targeting of the ventro-intermediate nucleus of the thalamus (Vim) for tremor. We currently perform indirect targeting, as the Vim is not visible on current 3-Tesla (3T) MRI acquisitions. Our objective was to enhance anatomic imaging (aiming at refining the precision of anatomic target selection by direct visualization) in patients treated for tremor with Vim GKS, using high-field 7T MRI. MATERIALS AND METHODS: Five young healthy subjects were scanned on 3T (T1-weighted and diffusion tensor imaging) and 7T (high-resolution susceptibility-weighted imaging, SWI) MRI in Lausanne. All images were then, for the first time, integrated into the GammaPlan® software (Elekta Instruments AB, Sweden) and co-registered (with T1 as the reference). Targeting of the Vim was simulated on the 3T images using various methods, and the position of the resulting target was correlated with the 7T SWI. The atlas of Morel et al. (Zurich, Switzerland) was used to confirm the findings in a detailed analysis inside and outside GammaPlan. RESULTS: The use of SWI provided superior resolution and improved image contrast within the basal ganglia. This allowed visualization and direct delineation of some subgroups of thalamic nuclei in vivo, including the Vim. The position of the target, as assessed on 3T, matched well the presumed position of the Vim on the SWI. Furthermore, a three-dimensional model of the Vim target area was created on the basis of the obtained images. CONCLUSION: This is the first report of the integration of high-field SWI MRI into the Leksell GammaPlan (LGP), aiming at improved validation of Vim targeting in tremor. The anatomical correlation between direct visualization on 7T and the current targeting methods on 3T (e.g. the quadrilatere of Guyot, histological atlases) shows very good anatomical matching. Further studies are needed to validate this technique, both to improve the accuracy of targeting of the Vim (and potentially other thalamic nuclei) and to perform clinical assessment.

Relevance: 30.00%

Abstract:

To make a comprehensive evaluation of organ-specific out-of-field doses using Monte Carlo (MC) simulations for different breast cancer irradiation techniques, and to compare the results with a commercial treatment planning system (TPS). Three breast radiotherapy techniques using 6 MV tangential photon beams were compared: (a) 2DRT (open rectangular fields), (b) 3DCRT (conformal wedged fields), and (c) hybrid IMRT (open conformal + modulated fields). Over 35 organs were contoured in a whole-body CT scan, and organ-specific dose distributions were determined with MC and the TPS. Large differences in out-of-field doses were observed between MC and TPS calculations, even for organs close to the target volume such as the heart, the lungs and the contralateral breast (up to 70% difference). MC simulations showed that a large fraction of the out-of-field dose comes from the out-of-field head-scatter fluence (>40%), which is not adequately modeled by the TPS. Based on MC simulations, the 3DCRT technique using external wedges yielded significantly higher doses (up to a factor of 4-5 in the pelvis) than the 2DRT and hybrid IMRT techniques, which yielded similar out-of-field doses. In sharp contrast to popular belief, the IMRT technique investigated here does not increase the out-of-field dose compared with conventional techniques and may offer the most optimal plan. The 3DCRT technique with external wedges yields the largest out-of-field doses. For accurate out-of-field dose assessment, a commercial TPS should not be used, even for organs near the target volume (contralateral breast, lungs, heart).