88 results for Planning software


Relevance: 20.00%

Abstract:

For radiotherapy treatment planning of retinoblastoma in childhood, Computed Tomography (CT) represents the standard method for tumor volume delineation, despite some inherent limitations. CT scans are very useful in providing information on physical density for dose calculation and morphological volumetric information, but present low sensitivity in assessing tumor viability. On the other hand, 3D ultrasound (US) allows a highly accurate definition of the tumor volume thanks to its high spatial resolution, but it is currently not integrated into treatment planning and is used only for diagnosis and follow-up. Our ultimate goal is automatic segmentation of the gross tumor volume (GTV) in the 3D US, segmentation of the organs at risk (OAR) in the CT, and registration of both. In this paper, we present some preliminary results in this direction. We present 3D active contour-based segmentation of the eyeball and the lens in CT images; the presented approach incorporates prior knowledge of the anatomy by using a 3D geometrical eye model. The automated segmentation results are validated by comparison with manual segmentations. Then, for the fusion of 3D CT and US images, we present two approaches: (i) landmark-based transformation, and (ii) object-based transformation that makes use of eyeball contour information in CT and US images.
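The landmark-based CT/US fusion mentioned above reduces to estimating a rigid transform from paired points. A minimal Kabsch/SVD least-squares sketch of that step (our illustration, not the authors' implementation; the landmark coordinates are hypothetical):

```python
import numpy as np

def rigid_transform_from_landmarks(src, dst):
    """Least-squares rigid transform (Kabsch/SVD) mapping src -> dst.

    src, dst: (N, 3) arrays of paired landmark coordinates.
    Returns rotation R (3x3) and translation t (3,).
    """
    src_c = src - src.mean(axis=0)          # center both point sets
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                     # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                      # proper rotation (det = +1)
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Hypothetical landmarks picked on the eyeball in US (src) and CT (dst).
us_pts = np.array([[10.0, 4.0, 2.0], [12.5, 6.0, 3.1],
                   [9.0, 7.5, 1.0], [11.0, 5.5, 4.2]])
ct_pts = np.array([[50.2, 14.1, 8.0], [52.8, 16.0, 9.0],
                   [49.1, 17.6, 7.1], [51.3, 15.4, 10.3]])
R, t = rigid_transform_from_landmarks(us_pts, ct_pts)
print(R, t)
```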

Relevance: 20.00%

Abstract:

PURPOSE: Late toxicities such as second cancer induction become more important as treatment outcome improves. Often the dose distribution calculated with a commercial treatment planning system (TPS) is used to estimate radiation carcinogenesis for the radiotherapy patient. However, for locations beyond the treatment field borders, the accuracy is not well known. The aim of this study was to perform detailed out-of-field measurements for a typical radiotherapy treatment plan administered with a CyberKnife and a TomoTherapy machine and to compare the measurements to the predictions of the TPS. MATERIALS AND METHODS: Individually calibrated thermoluminescent dosimeters were used to measure absorbed dose in an anthropomorphic phantom at 184 locations. The measured dose distributions from 6 MV intensity-modulated treatment beams for CyberKnife and TomoTherapy machines were compared to the dose calculations from the TPS. RESULTS: The TPSs underestimate the dose far away from the target volume. Quantitatively, the CyberKnife TPS underestimates the dose at 40 cm from the PTV border by a factor of 60, the TomoTherapy TPS by a factor of two. If a 50% dose uncertainty is accepted, the CyberKnife TPS can predict doses down to approximately 10 mGy/treatment Gy, the TomoTherapy TPS down to 0.75 mGy/treatment Gy. The CyberKnife TPS can then be used up to 10 cm from the PTV border, the TomoTherapy TPS up to 35 cm. CONCLUSIONS: We determined that the CyberKnife and TomoTherapy TPSs substantially underestimate the doses far away from the treated volume. It is recommended not to use out-of-field doses from the CyberKnife TPS for applications like modeling of second cancer induction. The TomoTherapy TPS can be used up to 35 cm from the PTV border (for a 390 cm³ PTV).
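As a toy illustration of how such validity limits might be applied in practice (the distance thresholds come from the abstract; the helper function itself is hypothetical and not part of the study):

```python
# Hypothetical helper encoding the validity limits reported above:
# out-of-field TPS doses were judged usable (within ~50% uncertainty)
# up to 10 cm from the PTV border for the CyberKnife TPS and up to
# 35 cm for the TomoTherapy TPS.
VALIDITY_LIMIT_CM = {"cyberknife": 10.0, "tomotherapy": 35.0}

def tps_out_of_field_dose_usable(machine: str, distance_cm: float) -> bool:
    """Return True if the TPS dose at this distance from the PTV border
    is within the usable range reported by the measurement study."""
    return distance_cm <= VALIDITY_LIMIT_CM[machine.lower()]

print(tps_out_of_field_dose_usable("CyberKnife", 25.0))   # False
print(tps_out_of_field_dose_usable("TomoTherapy", 25.0))  # True
```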

Relevance: 20.00%

Abstract:

3D CT-based planning has demonstrated its usefulness for a more precise anatomical reconstruction of the hip (femoral length, center of rotation, offset, anteversion and retroversion). Studies have shown that with 2D planning only 50% of plans matched the final femoral implant, whereas in another study this rate reached 94% with 3D planning. The errors were related to the magnification of the radiographs. The error in stem size is related to inadequate estimation of bone morphology as well as bone density. The anteversion error, increased by pelvic tilt, could be eliminated by 3D planning, and the offset was restored in 98% of cases. This study is based on a new three-dimensional CT-based planning technique for a more precise reconstruction of the hip. The aim of this study is to compare postoperative anatomy to preoperative anatomy by comparing the implant sizes planned in 3D with those actually used during surgery, in order to determine the accuracy of the anatomical restoration, analyzing various parameters (center of rotation, bone density, femoral offset, implant rotation, limb length) with the HIP-PLAN software (Symbios), and to evaluate the reproducibility of our 3D planning in a prospective series of 50 patients undergoing primary uncemented total hip arthroplasty through an anterior approach. The preoperative planning was compared with a postoperative CT scan by image fusion. CONCLUSION AND PERSPECTIVES: The results obtained are as follows. Implant size was predicted correctly for 100% of stems, 94% of cups and 88% of heads (length). The difference between the planned and the postoperative leg length was 0.3 ± 2.3 mm. The values for global offset, femoral anteversion, cup inclination and cup anteversion were 1.4 ± 3.1 mm, 0.6° ± 3.3°, -0.4° ± 5° and 6.9° ± 11.4°, respectively. This planning makes it possible to predict the implant size accurately, and stem position and cup inclination are accurately reproducible. Preoperative 3D CT-based planning allows a precise assessment of the individual anatomy of patients undergoing total hip arthroplasty. The prediction of implant size is reliable and the accuracy of stem positioning is excellent. However, no advantage is observed in terms of cup orientation compared with studies involving 2D planning or navigation. Further research comparing the different preoperative planning techniques with navigation is needed.
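To make the planned-versus-actual comparison described above concrete, a minimal sketch of the two summary statistics used (hypothetical data; not the study's analysis code):

```python
import numpy as np

# Hypothetical planned vs. implanted stem sizes for a small series.
planned_stem = np.array([3, 4, 5, 4, 2, 6, 5, 3])
implanted_stem = np.array([3, 4, 5, 5, 2, 6, 5, 3])

exact_match_pct = 100.0 * np.mean(planned_stem == implanted_stem)
print(f"stem size predicted exactly: {exact_match_pct:.0f}%")

# Hypothetical planned-vs-postoperative leg-length differences (mm),
# summarized as mean +/- SD as in the abstract.
leg_length_diff_mm = np.array([0.5, -1.2, 2.0, 0.1, -0.8, 1.4, 0.3, -0.2])
print(f"leg length difference: {leg_length_diff_mm.mean():.1f} "
      f"+/- {leg_length_diff_mm.std(ddof=1):.1f} mm")
```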

Relevance: 20.00%

Abstract:

Objective: To report a single-center experience treating patients with squamous-cell carcinoma of the anal canal using helical Tomotherapy (HT) and concurrent chemotherapy (CT). Materials/Methods: From October 2007 to February 2011, 55 patients were treated with HT and concurrent CT (5-fluorouracil/capecitabine and mitomycin) for anal squamous-cell carcinoma. All patients underwent computed-tomography-based treatment planning, with pelvic and inguinal nodes receiving 36 Gy in 1.8 Gy/fraction. Following a planned 1-week break, the primary tumor site and involved nodes were boosted to a total dose of 59.4 Gy in 1.8 Gy/fraction. Dose-volume histograms of several organs at risk (OAR; bladder, small intestine, rectum, femoral heads, penile bulb, external genitalia) were assessed in terms of conformal avoidance. All toxicity was scored according to the CTCAE, v.3.0. HT plans and treatment were implemented using the TomoTherapy, Inc. software and hardware. For dosimetric comparisons, 3D RT and/or IMRT plans were also computed for some of the patients using the CMS planning system, for treatment with 6-18 MV photons and/or electrons of suitable energies from a Siemens Primus linear accelerator equipped with a multileaf collimator. Locoregional control and survival curves were compared with the log-rank test, and multivariate analysis was performed with the Cox model. Results: With 360 degrees of freedom in beam projection, HT has an advantage over other RT techniques (3D or 5-field step-and-shoot IMRT). There is significant improvement over 3D or 5-field IMRT plans in terms of dose conformity around the PTV, and dose gradients are steeper outside the target volume, resulting in reduced doses to OARs. Using HT, acute toxicity was acceptable and seemed better than historical standards. Conclusions: Our results suggest that HT combined with concurrent CT for anal cancer is effective and tolerable. Compared to 3D RT or 5-field step-and-shoot IMRT, there is better conformity around the PTV and better OAR sparing.
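As a quick arithmetic check on the fractionation schedule above (the schedule numbers come from the abstract; the calculation is our illustration):

```python
# Fraction counts implied by the schedule in the abstract:
# 36 Gy to the elective nodal volumes and 59.4 Gy total to the boost
# volumes, both delivered at 1.8 Gy per fraction.
dose_per_fx = 1.8
elective_gy, total_gy = 36.0, 59.4

elective_fx = elective_gy / dose_per_fx                # 20 fractions
boost_fx = (total_gy - elective_gy) / dose_per_fx      # 13 more fractions
print(elective_fx, boost_fx, elective_fx + boost_fx)   # 20.0 13.0 33.0
```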

Relevance: 20.00%

Abstract:

With the aim of improving human health, scientists have been using an approach referred to as translational research, in which they seek to translate their laboratory discoveries into clinical applications that help prevent and cure disease. Such discoveries often arise from cellular, molecular, and physiological studies that progress to the clinical level. Most translational work is done using animal models that share common genes, molecular pathways, or phenotypes with humans. In this article, we discuss how translational work is carried out in various animal models and illustrate its relevance for human sleep research and sleep-related disorders.

Relevance: 20.00%

Abstract:

Arterial Spin Labeling (ASL) is a method to measure perfusion using magnetically labeled blood water as an endogenous tracer. Being fully non-invasive, this technique is attractive for longitudinal studies of cerebral blood flow in healthy and diseased individuals, or as a surrogate marker of metabolism. So far, ASL has been restricted mostly to specialist centers due to the generally low SNR of the method and potential issues with the user-dependent analysis needed to obtain quantitative measurements of cerebral blood flow (CBF). Here, we evaluated a particular implementation of ASL (called Quantitative STAR labeling of Arterial Regions, or QUASAR), a method providing user-independent quantification of CBF, in a large test-retest study across sites from around the world, dubbed "The QUASAR reproducibility study". Altogether, 28 sites located in Asia, Europe and North America participated, and a total of 284 healthy volunteers were scanned. Minimal operator dependence was assured by using an automatic planning tool, whose accuracy and potential usefulness in multi-center trials was evaluated as well. Accurate repositioning between sessions was achieved with the automatic planning tool, showing mean displacements of 1.87 ± 0.95 mm and rotations of 1.56 ± 0.66°. Mean gray matter CBF was 47.4 ± 7.5 ml/100 g/min, with a between-subject standard deviation SD(b) = 5.5 ml/100 g/min and a within-subject standard deviation SD(w) = 4.7 ml/100 g/min. The corresponding repeatability was 13.0 ml/100 g/min and was found to be within the range of previous studies.
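The repeatability figure quoted above is consistent with the standard Bland-Altman repeatability coefficient RC = 1.96 · sqrt(2) · SD(w) ≈ 2.77 · SD(w); a minimal check using the abstract's numbers (the formula choice is our assumption, not stated in the abstract):

```python
import math

sd_within = 4.7  # within-subject SD of gray matter CBF, ml/100 g/min
repeatability = 1.96 * math.sqrt(2) * sd_within  # ~2.77 * SD(w)
print(f"{repeatability:.1f} ml/100 g/min")  # ~13.0, matching the abstract
```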

Relevance: 20.00%

Abstract:

The book presents the state of the art in machine learning algorithms (artificial neural networks of different architectures, support vector machines, etc.) as applied to the classification and mapping of spatially distributed environmental data. Basic geostatistical algorithms are presented as well. New trends in machine learning and their application to spatial data are described, and real case studies based on environmental and pollution data are carried out. The book includes a CD-ROM with the Machine Learning Office software, including sample data sets, that allows both students and researchers to put the concepts rapidly into practice.

Relevance: 20.00%

Abstract:

Knowledge about spatial biodiversity patterns is a basic criterion for reserve network design. Although herbarium collections hold large quantities of information, the data are often scattered and cannot supply complete spatial coverage. Alternatively, herbarium data can be used to fit species distribution models, whose predictions can provide complete spatial coverage and derive species richness maps. Here, we build on previous efforts to propose an improved compositionalist framework for using species distribution models to better inform conservation management. We illustrate the approach with models fitted with six different methods and combined using an ensemble approach for 408 plant species in a tropical and megadiverse country (Ecuador). As a complementary view to the traditional richness hotspot methodology, which consists of a simple stacking of species distribution maps, the compositionalist modelling approach used here combines separate predictions for different pools of species to identify areas of alternative suitability for conservation. Our results show that the compositionalist approach better captures the established protected areas than the traditional richness hotspot strategies and allows the identification of areas in Ecuador that would optimally complement the current protection network. Further studies should aim at refining the approach with more groups and additional species information.
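The "simple stacking" baseline mentioned above amounts to summing binary species presence maps into a richness map. A minimal sketch of that baseline (hypothetical toy grids, not the study's data or code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stack of binarized distribution maps: one presence/absence
# grid per species (here 5 species on a 4x6 grid).
species_maps = rng.random((5, 4, 6)) > 0.5

# Traditional "richness hotspot" baseline: stack the maps by summing,
# giving the predicted number of species per grid cell.
richness = species_maps.sum(axis=0)
print(richness)

# Hotspots are then typically the top cells by richness,
# e.g. the richest 10% of the landscape.
threshold = np.quantile(richness, 0.9)
print(richness >= threshold)
```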

Relevance: 20.00%

Abstract:

PURPOSE: Effective cancer treatment generally requires combination therapy. The combination of external beam therapy (XRT) with radiopharmaceutical therapy (RPT) requires accurate three-dimensional dose calculations to avoid toxicity and evaluate efficacy. We have developed and tested a treatment planning method, using the patient-specific three-dimensional dosimetry package 3D-RD, for sequentially combined RPT/XRT therapy designed to limit toxicity to organs at risk. METHODS AND MATERIALS: The biologic effective dose (BED) was used to translate voxelized RPT absorbed dose (D(RPT)) values into a normalized total dose (or equivalent 2-Gy-fraction XRT absorbed dose) NTD(RPT) map. The BED was calculated numerically using an algorithmic approach, which enabled a more accurate calculation of BED and NTD(RPT). A combined samarium-153 RPT and external beam treatment plan was designed to deliver a tumoricidal dose while delivering no more than 50 Gy of NTD(sum) to the spinal cord of a patient with a paraspinal tumor. RESULTS: The average voxel NTD(RPT) to tumor from RPT was 22.6 Gy (range, 1-85 Gy); the maximum spinal cord voxel NTD(RPT) from RPT was 6.8 Gy. The combined therapy NTD(sum) to tumor was 71.5 Gy (range, 40-135 Gy) for a maximum voxel spinal cord NTD(sum) equal to the maximum tolerated dose of 50 Gy. CONCLUSIONS: A method that enables real-time treatment planning of combined RPT-XRT has been developed. By implementing a more generalized conversion between the dose values from the two modalities and an activity-based treatment of partial volume effects, the reliability of combination therapy treatment planning has been expanded.
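The BED-based conversion described above follows the standard linear-quadratic relations BED = D · (1 + d/(α/β)) and NTD = BED / (1 + 2/(α/β)). A minimal voxelwise sketch under those textbook formulas (illustrative parameter values; this is not the 3D-RD implementation, which computes BED numerically for the RPT dose-rate pattern):

```python
import numpy as np

def ntd_2gy(total_dose_gy, dose_per_fraction_gy, alpha_beta_gy):
    """Convert absorbed dose to the equivalent dose in 2-Gy fractions
    using the linear-quadratic model:
        BED = D * (1 + d / (alpha/beta))
        NTD = BED / (1 + 2 / (alpha/beta))
    """
    bed = total_dose_gy * (1.0 + dose_per_fraction_gy / alpha_beta_gy)
    return bed / (1.0 + 2.0 / alpha_beta_gy)

# Hypothetical voxel doses (Gy) from an RPT map, treated here with a
# crude effective dose per fraction; alpha/beta = 2 Gy is a common
# assumption for spinal cord.
rpt_voxels = np.array([1.0, 3.5, 6.8])
print(ntd_2gy(rpt_voxels, dose_per_fraction_gy=0.5, alpha_beta_gy=2.0))
```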

Relevance: 20.00%

Abstract:

Machine Learning for geospatial data: algorithms, software tools and case studies

Abstract: The thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence; it is mainly concerned with the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? In a few words, most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can find solutions for classification, regression, and probability density modeling problems in high-dimensional geo-feature spaces, composed of geographical space and additional relevant spatially referenced features. They are well suited to implementation as predictive engines in decision support systems, for purposes of environmental data mining ranging from pattern recognition to modeling and prediction to automatic data mapping. Their efficiency is competitive with geostatistical models in low-dimensional geographical spaces, but they are indispensable in high-dimensional geo-feature spaces. The most important and popular machine learning algorithms and models of interest for geo- and environmental sciences are presented in detail, from theoretical description of the concepts to software implementation. The main algorithms and models considered are the following: the multilayer perceptron (MLP, a workhorse of machine learning), general regression neural networks (GRNN), probabilistic neural networks (PNN), self-organising (Kohonen) maps (SOM), Gaussian mixture models (GMM), radial basis function networks (RBF), and mixture density networks (MDN). This set of models covers machine learning tasks such as classification, regression, and density estimation. Exploratory data analysis (EDA) is the initial and a very important part of any data analysis. In this thesis, the concepts of exploratory spatial data analysis (ESDA) are treated using both the traditional geostatistical approach, namely experimental variography, and machine learning. Experimental variography, which studies the relations between pairs of points, is a basic tool for geostatistical analysis of anisotropic spatial correlations and helps detect the presence of spatial patterns, at least those describable by two-point statistics. A machine learning approach to ESDA is presented through the k-nearest neighbors (k-NN) method, which is simple and has very good interpretation and visualization properties. An important part of the thesis deals with the topical problem of automatic mapping of geospatial data. The general regression neural network is proposed as an efficient model to solve this task. The performance of the GRNN model is demonstrated on Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions. The thesis consists of four chapters with the following structure: theory, applications, software tools, and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office. The Machine Learning Office tools were developed over the last 15 years and have been used both for many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and in fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, classification of soil types and hydrogeological units, decision-oriented mapping with uncertainties, and natural hazard (landslides, avalanches) assessment and susceptibility mapping. Complementary tools useful for exploratory data analysis and visualisation were developed as well, with care taken to create a user-friendly, easy-to-use interface.
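The GRNN proposed above for automatic mapping is essentially Nadaraya-Watson kernel regression: each prediction is a distance-weighted average of the training values. A minimal sketch (our illustration, not the Machine Learning Office implementation):

```python
import numpy as np

def grnn_predict(train_xy, train_z, query_xy, sigma):
    """General Regression Neural Network / Nadaraya-Watson estimate:
    prediction = sum_i w_i * z_i / sum_i w_i, with Gaussian weights
    w_i = exp(-||x - x_i||^2 / (2 sigma^2)).
    """
    d2 = ((query_xy[:, None, :] - train_xy[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ train_z) / w.sum(axis=1)

# Hypothetical scattered measurements (e.g. a pollutant concentration)
# and two query locations; sigma is the single smoothing parameter,
# normally tuned by cross-validation.
train_xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
train_z = np.array([1.0, 2.0, 2.0, 3.0])
query_xy = np.array([[0.5, 0.5], [0.9, 0.1]])
print(grnn_predict(train_xy, train_z, query_xy, sigma=0.5))
```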

Relevance: 20.00%

Abstract:

BACKGROUND AND OBJECTIVES: Advance care planning (ACP) is increasingly regarded as the gold standard in the care of patients with life-limiting illnesses. Research has focused on adults, but ACP is also practiced in pediatrics. We conducted a systematic review of the empirical literature on pediatric ACP (pACP) to assess current practices, effects, and perspectives of pACP. METHODS: We searched PubMed, BELIT, and PsycINFO for empirical literature on pACP published January 1991 through January 2012. Titles, abstracts, and full texts were screened by 3 independent reviewers for studies that met the predefined criteria. The evidence level of the studies was assessed. Relevant study outcomes were retrieved according to predefined questions. RESULTS: We included 5 qualitative and 8 quantitative studies. Only 3 pACP programs were identified, all from the United States. Two of them were informed by adult programs. Major pACP features are discussions between families and care providers, as well as advance directives. A chaplain and other providers may be involved if required. Programs vary in how well they have been evaluated; only 1 was studied using a randomized controlled trial. Preliminary data suggest that pACP can be successfully implemented and is perceived as helpful. It may be emotionally relieving and may facilitate communication and decision-making. Major challenges are negative reactions from emergency services, schools, and the community. CONCLUSIONS: There are few systematic pACP programs worldwide and none in Europe. Future research should investigate the needs of all stakeholders. In particular, the perspective of professionals has so far been neglected.