887 results for Beam Search Method
Abstract:
BACKGROUND Skull-base chondrosarcoma (ChSa) is a rare disease, and prognostication for this disease entity is ill defined. METHODS We assessed the long-term local control (LC) results, overall survival (OS), and prognostic factors of skull-base ChSa patients treated with pencil beam scanning proton therapy (PBS PT). Seventy-seven patients (35 male; 46%) with histologically confirmed ChSa were treated at the Paul Scherrer Institute. Median age was 38.9 years (range, 10.2-70.0 y). Median delivered dose was 70.0 GyRBE (range, 64.0-76.0 GyRBE). LC, OS, and toxicity-free survival (TFS) rates were calculated using the Kaplan-Meier method. RESULTS After a mean follow-up of 69.2 months (range, 4.6-190.8 mo), 6 (7.8%) local failures were observed, 2 of which were late failures. Five (6.5%) patients died. The actuarial 8-year LC and OS were 89.7% and 93.5%, respectively. Tumor volume > 25 cm³ (P = .02), brainstem/optic apparatus compression at the time of PT (P = .04), and age > 30 years (P = .08) were associated with lower rates of LC. High-grade (≥3) radiation-induced toxicity was observed in 6 (7.8%) patients. The 8-year high-grade TFS was 90.8%. A higher rate of high-grade toxicity was observed for older patients (P = .073), those with larger tumor volumes (P = .069), and those treated with 5 weekly fractions (P = .069). CONCLUSIONS This is the largest PT series reporting the outcome of patients with low-grade ChSa of the skull base treated with PBS only. Our data indicate that protons are both safe and effective. Tumor volume, brainstem/optic apparatus compression, and age were prognosticators of local failure.
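As a side note, actuarial rates such as the 8-year LC quoted above are typically obtained with the Kaplan-Meier estimator. A minimal sketch using the lifelines library, with invented toy follow-up data rather than the study cohort:

```python
# Minimal sketch: actuarial local-control estimate via Kaplan-Meier.
# The follow-up times and event flags are invented toy data.
from lifelines import KaplanMeierFitter

months = [12.0, 45.3, 69.2, 96.0, 120.5, 190.8]  # follow-up (months)
failed = [0, 1, 0, 0, 1, 0]                      # 1 = local failure observed

kmf = KaplanMeierFitter()
kmf.fit(months, event_observed=failed, label="local control")

# Actuarial 8-year (96-month) local-control probability.
print(kmf.survival_function_at_times(96.0))
```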
Abstract:
This paper addresses fully automatic segmentation of hip CT images with the goal of preserving the joint structure for clinical applications in hip disease diagnosis and treatment. For this purpose, we propose a Multi-Atlas Segmentation Constrained Graph (MASCG) method. MASCG uses multi-atlas-based mesh fusion results to initialize a bone-sheetness-based multi-label graph cut for accurate hip CT segmentation, which has the inherent advantage of automatically separating the pelvic region from the bilateral proximal femoral regions. We then introduce a graph-cut-constrained graph search algorithm to further improve segmentation accuracy around the bilateral hip joint regions. Taking manual segmentation as the ground truth, we evaluated the approach on 30 hip CT images (60 hips) with 15-fold cross validation. Compared to manual segmentation, average surface distance errors of 0.30 mm, 0.29 mm, and 0.30 mm were found for the pelvis, the left proximal femur, and the right proximal femur, respectively. A closer look at the bilateral hip joint regions showed average surface distance errors of 0.16 mm, 0.21 mm, and 0.20 mm for the acetabulum, the left femoral head, and the right femoral head, respectively.
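The average surface distance error used for evaluation can be illustrated with a minimal sketch; the symmetric point-to-surface distance below uses randomly generated point clouds in place of real segmentation surfaces and assumes surface points have already been extracted from the label maps:

```python
# Minimal sketch of the evaluation metric: symmetric average surface
# distance between an automatic and a manual segmentation surface,
# each given as an (N, 3) array of surface points in mm.
import numpy as np
from scipy.spatial import cKDTree

def average_surface_distance(auto_pts, manual_pts):
    d_auto = cKDTree(manual_pts).query(auto_pts)[0]    # auto -> manual
    d_manual = cKDTree(auto_pts).query(manual_pts)[0]  # manual -> auto
    return (d_auto.sum() + d_manual.sum()) / (len(d_auto) + len(d_manual))

rng = np.random.default_rng(0)
a = rng.normal(size=(1000, 3))               # stand-in "automatic" surface
b = a + rng.normal(scale=0.3, size=(1000, 3))  # stand-in "manual" surface
print(f"ASD: {average_surface_distance(a, b):.2f} mm")
```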
Abstract:
OBJECTIVE Whether three-dimensional imaging with cone beam computed tomography (CBCT) improves diagnostic accuracy and optimizes treatment planning in periodontology remains controversial in the literature. The objective was to identify the best available external evidence for the indications of CBCT for periodontal diagnosis and treatment planning in specific clinical situations. DATA SOURCES A systematic literature search was performed for articles published by 2 March 2015 using electronic databases and a hand search. Two reviewers performed the study selection, data collection, and validity assessment. PICO and PRISMA criteria were applied. From the combined search, seven studies were finally included. CONCLUSION The case series were published between 2009 and 2014. Five of the included publications refer to maxillary and/or mandibular molars and two to aspects related to vertical bony defects. Two studies show high accuracy of CBCT in detecting intrabony defect morphology compared to periapical radiographs. Particularly in maxillary molars, CBCT provides high accuracy for detecting furcation involvement and the morphology of surrounding periodontal tissues. CBCT has demonstrated advantages in decision making and cost-benefit terms when more invasive treatment approaches were considered. Within their limits, the available data suggest that CBCT may improve diagnostic accuracy and optimize treatment planning in periodontal defects, particularly in maxillary molars with furcation involvement, and that the higher radiation doses and the cost-benefit ratio should be carefully analyzed before using CBCT for periodontal diagnosis and treatment planning.
Abstract:
Pencil beam scanned (PBS) proton therapy has many advantages over conventional radiotherapy, but its effectiveness for treating mobile tumours remains questionable. Gating dose delivery to the breathing pattern is a well-developed motion-mitigation method in conventional radiotherapy, but its clinical efficiency for PBS proton therapy is not yet well documented. In this study, the dosimetric benefits and the treatment efficiency of beam gating for PBS proton therapy have been comprehensively evaluated. A series of dedicated 4D dose calculations (4DDC) has been performed on 9 different 4DCT(MRI) liver data sets, which provide realistic 4DCTs by extracting motion information from 4DMRI. The value of 4DCT(MRI) is its capability of providing not only patient geometries and deformable breathing characteristics, but also variations in the breathing patterns between breathing cycles. In order to monitor target motion and derive a gating signal, we simulated time-resolved beam's eye view (BEV) x-ray images as an online motion surrogate. 4DDCs have been performed using three amplitude-based gating window sizes (10/5/3 mm) with motion surrogates derived from either pre-implanted fiducial markers or the diaphragm. In addition, gating has also been simulated in combination with up to 19 times rescanning, using either volumetric or layered approaches. The quality of the resulting 4DDC plans has been quantified in terms of the plan homogeneity index (HI), total treatment time, and duty cycle. Results show that neither beam gating nor rescanning alone can fully retrieve the plan homogeneity of the static reference plan. Especially for variable breathing patterns, reductions of the effective duty cycle to as low as 10% have been observed with the smallest gating window (3 mm), implying that gating on its own would, for such cases, result in much longer treatment times. In addition, when rescanning is applied on its own, large differences between volumetric and layered rescanning have been observed as a function of the number of re-scans. However, once gating and rescanning are combined, an HI within 2% of the static plan could be achieved in the clinical target volume, with only moderately prolonged treatment times, irrespective of the rescanning strategy used. Moreover, these results are independent of the motion surrogate used. In conclusion, our results suggest that image-guided beam gating, combined with rescanning, is a feasible, effective, and efficient motion-mitigation approach for PBS-based liver tumour treatments.
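A minimal sketch of amplitude-based gating and its effect on the duty cycle, using an idealized sinusoidal breathing trace as a stand-in for a real motion surrogate (the 10/5/3 mm windows mirror the settings above; the 4 s period and 12 mm peak-to-peak motion are invented):

```python
# Minimal sketch: beam is on whenever the motion surrogate lies within
# an amplitude window around the exhale position; report the duty cycle.
import numpy as np

t = np.linspace(0.0, 60.0, 6000)                      # 60 s trace at 100 Hz
motion = 6.0 * (1.0 - np.cos(2.0 * np.pi * t / 4.0))  # 0..12 mm, 4 s period

for window_mm in (10.0, 5.0, 3.0):
    beam_on = motion <= window_mm     # gate around exhale (0 mm)
    print(f"window {window_mm:4.1f} mm -> duty cycle {beam_on.mean():.0%}")
```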
Abstract:
Lung damage is a common side effect of chemotherapeutic drugs such as bleomycin. This study used a bleomycin mouse model which simulates the lung damage observed in humans. Noninvasive, in vivo cone-beam computed tomography (CBCT) was used to visualize and quantify fibrotic and inflammatory damage over the entire lung volume of mice. Bleomycin was used to induce pulmonary damage in vivo, and the results from two CBCT systems, a micro-CT and a flat panel CT (fpCT), were compared to histologic measurements, the standard method of murine lung damage quantification. Twenty C57BL/6 mice were given either 3 U/kg of bleomycin or saline intratracheally. The mice were scanned at baseline, before the administration of bleomycin, and then 10, 14, and 21 days afterward. At each time point, a subset of mice was sacrificed for histologic analysis. The resulting CT images were used to assess lung volume. Percent lung damage (PLD) was calculated for each mouse on both the fpCT (PLDfpCT) and the micro-CT (PLDμCT). Histologic PLD (PLDH) was calculated for each histologic section at each time point (day 10, n = 4; day 14, n = 4; day 21, n = 5; control group, n = 5). A linear regression was applied to the PLDfpCT vs. PLDH, PLDμCT vs. PLDH, and PLDfpCT vs. PLDμCT distributions. This study did not demonstrate strong correlations between PLDCT and PLDH. The coefficient of determination, R², was 0.68 for PLDμCT vs. PLDH and 0.75 for PLDfpCT vs. PLDH. The experimental issues identified were: (1) inconsistent inflation of the lungs from scan to scan, (2) variable distribution of damage (one histologic section is not representative of overall lung damage), (3) control mice not scanned with each group of bleomycin mice, (4) use of two CT systems caused long anesthesia times for the mice, and (5) respiratory gating did not hold the lung volume constant throughout the scan. Addressing these issues might further improve the correlation between PLDCT and PLDH.
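The regression analysis reported above can be sketched with scipy; the paired PLD values here are invented placeholders, not the mouse measurements:

```python
# Minimal sketch: regress CT-derived percent lung damage against the
# histologic reference and report R^2. Values are invented placeholders.
import numpy as np
from scipy.stats import linregress

pld_h = np.array([5.0, 12.0, 20.0, 33.0, 41.0, 55.0])   # histology (%)
pld_ct = np.array([8.0, 10.0, 25.0, 30.0, 47.0, 50.0])  # CBCT estimate (%)

fit = linregress(pld_ct, pld_h)
print(f"slope={fit.slope:.2f}, intercept={fit.intercept:.1f}, "
      f"R^2={fit.rvalue**2:.2f}")
```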
Abstract:
In population studies, most current methods focus on identifying one outcome-related SNP at a time by testing for differences in genotype frequencies between disease and healthy groups or among different population groups. However, testing a great number of SNPs simultaneously poses a multiple-testing problem and will give false-positive results. Although this problem can be dealt with effectively through several approaches such as Bonferroni correction, permutation testing, and false discovery rates, patterns of joint effects of several genes, each with a weak effect, might not be detectable. With the availability of high-throughput genotyping technology, searching for multiple scattered SNPs over the whole genome and modeling their joint effect on the target variable has become possible. Exhaustive search of all SNP subsets is computationally infeasible for the millions of SNPs in a genome-wide study. Several effective feature selection methods combined with classification functions have been proposed to search for an optimal SNP subset in big data sets where the number of feature SNPs far exceeds the number of observations. In this study, we took two steps to achieve this goal. First, we selected 1000 SNPs through an effective filter method; then we performed feature selection wrapped around a classifier to identify an optimal SNP subset for predicting disease. We also developed a novel classification method, the sequential information bottleneck method, wrapped inside different search algorithms to identify an optimal subset of SNPs for classifying the outcome variable. This new method was compared with classical linear discriminant analysis in terms of classification performance. Finally, we performed a chi-square test to examine the relationship between each SNP and disease from another point of view. In general, our results show that filtering features using the harmonic mean of sensitivity and specificity (HMSS) through linear discriminant analysis (LDA) is better than using LDA training accuracy or mutual information in our study. Our results also demonstrate that exhaustive search of a small subset (one SNP, two SNPs, or three-SNP subsets based on the best 100 composite 2-SNPs) can find an optimal subset, and that further inclusion of more SNPs through a heuristic algorithm does not always increase the performance of SNP subsets. Although sequential forward floating selection can be applied to prevent the nesting effect of forward selection, it does not always outperform the latter, due to overfitting from observing more complex subset states. Our results also indicate that HMSS, as a criterion to evaluate the classification ability of a function, can be used on imbalanced data without modifying the original dataset, as opposed to classification accuracy. Our four studies suggest that the sequential information bottleneck (sIB), a new unsupervised technique, can be adopted to predict the outcome, and that its ability to detect the target status is superior to traditional LDA in this study. From our results we can see that the best test probability HMSS for predicting CVD, stroke, CAD, and psoriasis through sIB is 0.59406, 0.641815, 0.645315, and 0.678658, respectively. In terms of group prediction accuracy, the highest test accuracy of sIB for diagnosing a normal status among controls can reach 0.708999, 0.863216, 0.639918, and 0.850275, respectively, in the four studies if the test accuracy among cases is required to be not less than 0.4.
On the other hand, the highest test accuracy of sIB for diagnosing disease among cases can reach 0.748644, 0.789916, 0.705701, and 0.749436, respectively, in the four studies if the test accuracy among controls is required to be at least 0.4. A further genome-wide association study using the chi-square test shows that no significant SNPs were detected at the cut-off level 9.09451E-08 in the Framingham Heart Study of CVD. Study results in WTCCC detect only two significant SNPs associated with CAD. In the genome-wide study of psoriasis, most of the top 20 SNP markers with impressive classification accuracy are also significantly associated with the disease through the chi-square test at the cut-off value 1.11E-07. Although our classification methods can achieve high accuracy in the study, complete descriptions of those classification results (95% confidence intervals or statistical tests of differences) require more cost-effective methods or an efficient computing system, neither of which can currently be accomplished in our genome-wide study. We should also note that the purpose of this study is to identify subsets of SNPs with high prediction ability, and that SNPs with good discriminant power are not necessarily causal markers for the disease.
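To make the wrapper idea above concrete, here is a minimal sketch that scores candidate SNP subsets with HMSS under an LDA classifier and grows the subset greedily (plain sequential forward selection rather than the floating variant or sIB; the genotype matrix is synthetic):

```python
# Minimal sketch: greedy forward selection of SNPs scored by the harmonic
# mean of sensitivity and specificity (HMSS) under cross-validated LDA.
# Synthetic genotypes (0/1/2) stand in for real SNP data.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_predict

def hmss(y_true, y_pred):
    sens = np.mean(y_pred[y_true == 1] == 1)
    spec = np.mean(y_pred[y_true == 0] == 0)
    return 2 * sens * spec / (sens + spec) if (sens + spec) else 0.0

rng = np.random.default_rng(0)
X = rng.integers(0, 3, size=(200, 50)).astype(float)  # 200 subjects, 50 SNPs
y = (X[:, 3] + X[:, 17] + rng.normal(size=200) > 2).astype(int)

selected, remaining = [], list(range(X.shape[1]))
for _ in range(3):                                    # grow a 3-SNP subset
    scores = {}
    for j in remaining:
        pred = cross_val_predict(LinearDiscriminantAnalysis(),
                                 X[:, selected + [j]], y, cv=5)
        scores[j] = hmss(y, pred)
    best = max(scores, key=scores.get)
    selected.append(best)
    remaining.remove(best)
    print(f"added SNP {best}, HMSS = {scores[best]:.3f}")
```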
Abstract:
Proton therapy is growing increasingly popular due to its superior dose characteristics compared to conventional photon therapy. Protons travel a finite range in the patient body and stop, thereby delivering no dose beyond their range. However, because the range of a proton beam is heavily dependent on the tissue density along its beam path, uncertainties in patient setup position and inherent range calculation can degrade the dose distribution significantly. Despite these challenges, which are unique to proton therapy, current management of these uncertainties during treatment planning has been similar to that of conventional photon therapy. The goal of this dissertation research was to develop a treatment planning method and a plan evaluation method that address proton-specific issues regarding setup and range uncertainties. Treatment plan design method adapted to proton therapy: Currently, for proton therapy using a scanning beam delivery system, setup uncertainties are largely accounted for by geometrically expanding a clinical target volume (CTV) to a planning target volume (PTV). However, a PTV alone cannot adequately account for range uncertainties coupled to misaligned patient anatomy in the beam path, since it does not account for the change in tissue density. To remedy this problem, we proposed a beam-specific PTV (bsPTV) that accounts for the change in tissue density along the beam path due to the uncertainties. Our proposed method was successfully implemented, and its superiority over the conventional PTV was shown through a controlled experiment. Furthermore, we showed that the bsPTV concept can be incorporated into beam angle optimization for better target coverage and normal tissue sparing for a selected lung cancer patient. Treatment plan evaluation method adapted to proton therapy: The dose-volume histogram of the clinical target volume (CTV) or any other volume of interest at the time of planning does not represent the most probable dosimetric outcome of a given plan, as it does not include the uncertainties mentioned earlier. Currently, the PTV is used as a surrogate of the CTV's worst-case scenario for target dose estimation. However, because proton dose distributions are subject to change under these uncertainties, the validity of the PTV analysis method is questionable. To remedy this problem, we proposed the use of statistical parameters to quantify uncertainties both on the dose-volume histogram and on the dose distribution directly. The robust plan analysis tool was successfully implemented to compute both the expectation value and the standard deviation of dosimetric parameters of a treatment plan under the uncertainties. For 15 lung cancer patients, the proposed method was used to quantify the dosimetric difference between the nominal situation and its expected value under the uncertainties.
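The robust-evaluation idea, computing the expectation and standard deviation of a dosimetric parameter under sampled uncertainties, can be sketched as follows; recompute_d95() and all numbers in it are hypothetical placeholders for a real scenario dose calculation:

```python
# Minimal sketch: sample setup/range uncertainty scenarios, recompute a
# dosimetric parameter (here, D95 of the CTV) per scenario, and report
# its expectation and standard deviation.
import numpy as np

rng = np.random.default_rng(0)

def recompute_d95(setup_shift_mm, range_error_pct):
    # Placeholder response model: nominal D95 of 70 GyRBE degraded by the
    # magnitude of the errors. A real tool would re-run the dose calc.
    return 70.0 - 0.4 * abs(setup_shift_mm) - 0.6 * abs(range_error_pct)

shifts = rng.normal(0.0, 3.0, size=500)   # setup error, sigma = 3 mm
ranges = rng.normal(0.0, 3.5, size=500)   # range error, sigma = 3.5 %

d95 = np.array([recompute_d95(s, r) for s, r in zip(shifts, ranges)])
print(f"E[D95] = {d95.mean():.1f} GyRBE, SD = {d95.std():.1f} GyRBE")
```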
Abstract:
This instrument was part of the research project "Research on Evaluation of Health and Education Plans and Programs in the Province of Buenos Aires", developed by the Chair of Preventive Psychology of the Psychology course of studies at the School of Humanities and Educational Sciences, National University of La Plata (Argentina). The basis for proposing an assessment instrument is the need for a method for social programs that enables analysis, systematization of knowledge, and the assignment of values distributed into scales and organized in general charts. Its main concern is the analysis of health and education programs and projects, restricted to certain specific areas or regions, in search of theoretical trustworthiness, methodological accuracy, and pragmatic operability. This is the result of four years of researching said programs at system, service, and community levels.
Abstract:
Kelp forests represent a major habitat type in coastal waters worldwide, and their structure and distribution are predicted to change due to global warming. Despite their ecological and economic importance, there is still a lack of reliable spatial information on their abundance and distribution. In recent years, various hydroacoustic mapping techniques for sublittoral environments have evolved. However, in turbid coastal waters, such as off the island of Helgoland (Germany, North Sea), kelp vegetation is present at shallow water depths normally excluded from hydroacoustic surveys. In this study, single-beam survey data consisting of the two seafloor parameters roughness and hardness were obtained with RoxAnn from water depths between 2 and 18 m. Our primary aim was to reliably detect the kelp forest habitat at different densities and distinguish it from other vegetated zones. Five habitat classes were identified using underwater video and were applied for classification of acoustic signatures. Subsequently, spatial prediction maps were produced via two classification approaches: linear discriminant analysis (LDA) and a manual classification routine (MC). LDA was able to distinguish dense kelp forest from other habitats (i.e. mixed seaweed vegetation, sand, and barren bedrock), but not variations in kelp density. In contrast, MC also provided information on medium-dense kelp distribution, which is characterized by intermediate roughness and hardness values caused by reduced kelp abundance. The prediction maps reach accordance levels of 62% (LDA) and 68% (MC). The presence of vegetation (kelp and mixed seaweed vegetation) was determined with higher prediction abilities of 75% (LDA) and 76% (MC). Since the acoustic signatures of the different habitat classes strongly overlap, the manual classification method was more appropriate for separating different kelp forest densities and low-lying vegetation. It became evident that the occurrence of kelp in this area is not simply linked to water depth. Moreover, this study shows that the two seafloor parameters collected with RoxAnn are suitable indicators for discriminating densely vegetated seafloor habitats in shallow environments.
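The LDA step can be sketched with scikit-learn on the two RoxAnn parameters; the five habitat classes follow the abstract, while the (roughness, hardness) clusters are synthetic stand-ins for survey data:

```python
# Minimal sketch: classify seafloor habitat from the two RoxAnn
# parameters (roughness, hardness) with linear discriminant analysis.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

classes = ["dense kelp", "medium kelp", "mixed seaweed", "sand", "bedrock"]
rng = np.random.default_rng(0)

# One synthetic (roughness, hardness) cluster per habitat class.
centers = np.array([[0.8, 0.3], [0.6, 0.4], [0.5, 0.5], [0.2, 0.2], [0.3, 0.9]])
X = np.vstack([c + rng.normal(scale=0.08, size=(100, 2)) for c in centers])
y = np.repeat(np.arange(len(classes)), 100)

lda = LinearDiscriminantAnalysis().fit(X, y)
print(f"training accuracy: {lda.score(X, y):.0%}")
print("prediction:", classes[lda.predict([[0.75, 0.32]])[0]])
```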
Abstract:
Today's digital libraries (DLs) archive vast amounts of information in the form of text, videos, images, data measurements, etc. User access to DL content can rely on similarity between metadata elements or on similarity between the data itself (content-based similarity). We consider the problem of exploratory search in large DLs of time-oriented data. We propose a novel approach for overview-first exploration of data collections based on user-selected metadata properties. In a 2D layout, entities of the selected property are laid out based on their similarity with respect to the underlying data content. The display is enhanced by compact summarizations of the underlying data elements and forms the basis for exploratory navigation of users in the data space. The approach is proposed as an interface for visual exploration, leading the user to discover interesting relationships between data items based on content-based similarity between data items and their respective metadata labels. We apply the method to real data sets from the earth observation community, showing its applicability and usefulness.
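A minimal sketch of such an overview layout, projecting items into 2D by content-based similarity using multidimensional scaling (random-walk series stand in for archive data; the paper's actual layout technique is not specified here):

```python
# Minimal sketch: place time-series items in a 2D display so that
# content-similar items end up close together, via MDS on a
# precomputed pairwise distance matrix.
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
series = rng.normal(size=(40, 200)).cumsum(axis=1)  # 40 items, 200 samples

# Pairwise Euclidean distance as a simple content-based (dis)similarity.
d = np.linalg.norm(series[:, None, :] - series[None, :, :], axis=-1)

layout = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(d)
print(layout[:5])  # 2D coordinates to plot, one point per item
```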
Abstract:
The OPERA experiment was designed to search for νµ → ντ oscillations in appearance mode, i.e. by detecting the τ leptons produced in charged current ντ interactions. The experiment took data from 2008 to 2012 in the CERN Neutrinos to Gran Sasso beam. The observation of νµ → ντ appearance, achieved with four candidate events in a subsample of the data, was previously reported. In this Letter, a fifth ντ candidate event, found in an enlarged data sample, is described. Together with a further reduction of the expected background, the candidate events detected so far allow us to assess the discovery of νµ → ντ oscillations in appearance mode with a significance larger than 5σ.
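As an illustration of how such a significance can be estimated, here is a minimal sketch of a simple counting experiment: the one-sided Poisson p-value of observing n or more events given an expected background, converted to a Gaussian z-score. The background value is an illustrative placeholder, and the published significance comes from a more detailed likelihood analysis:

```python
# Minimal sketch: counting-experiment significance from a Poisson tail.
from scipy.stats import norm, poisson

n_observed = 5     # tau-neutrino candidate events
background = 0.25  # hypothetical expected background

p_value = poisson.sf(n_observed - 1, background)  # P(N >= n_obs | bkg)
z = norm.isf(p_value)
print(f"p = {p_value:.2e}, significance = {z:.1f} sigma")
```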
Abstract:
A finite element model was used to simulate timber beams with defects and predict their maximum load in bending. Taking into account the elastoplastic constitutive law of timber, the prediction of fracture load gives information about the mechanisms of timber failure, particularly with regard to the influence of knots, and their local grain deviation, on the fracture. The finite element model was constructed using the ANSYS element Plane42 in a plane-stress 2D analysis, which equates thickness to the width of the section to create a mesh that is as uniform as possible. Three sub-models reproduced the bending test according to UNE-EN 408: i) timber with holes caused by knots; ii) timber with adherent knots which have structural continuity with the rest of the beam material; iii) timber with knots but with only partial contact between knot and beam, artificially simulated by means of contact springs between the two materials. The model was validated using ten 45 × 145 × 3000 mm beams of Pinus sylvestris L. which presented knots and grain deviation. The fracture stress data obtained were compared with the results of numerical simulations, resulting in an adjustment error of less than 9.7%.
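For orientation, the bending arithmetic behind such a validation can be sketched as follows, assuming the EN 408 four-point geometry (span 18h, load points 6h from the supports) and a hypothetical failure load; none of these numbers are taken from the paper:

```python
# Minimal sketch: convert a four-point-bending failure load into the
# bending stress at mid-span for the 45 x 145 mm section.
b, h = 45.0, 145.0  # section width and depth (mm)
a = 6 * h           # assumed distance support -> load point (mm)
F = 12_000.0        # hypothetical total failure load (N)

M = (F / 2) * a     # maximum bending moment (N*mm)
W = b * h**2 / 6    # elastic section modulus (mm^3)
sigma = M / W       # bending stress (MPa)
print(f"f_m = {sigma:.1f} MPa")
```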
Abstract:
A new and effective method for the reduction of truncation errors in partial spherical near-field (SNF) measurements is proposed. The method is useful when measuring electrically large antennas, where the measurement time with the classical SNF technique is prohibitively long and an acquisition over the whole spherical surface is not practical. Therefore, to reduce the data acquisition time, a partial sphere measurement is usually made, taking samples over a portion of the spherical surface in the direction of the main beam. In this case, however, the radiation pattern is not known outside the measured angular sector, and a truncation error is present in the calculated far-field pattern within this sector. The method is based on the Gerchberg-Papoulis algorithm used to extrapolate functions, and it is able to extend the valid region of the calculated far-field pattern up to the whole forward hemisphere. To verify the effectiveness of the method, several examples are presented using both simulated and measured truncated near-field data.
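A minimal 1D sketch conveys the Gerchberg-Papoulis iteration: alternately re-impose the known (measured) samples in one domain and the band limit in the other. A real SNF implementation works with spherical wave modes rather than the plain FFT used here:

```python
# Minimal sketch: Gerchberg-Papoulis extrapolation of a band-limited
# signal from a truncated (partially measured) sector.
import numpy as np

N = 256
n = np.arange(N)
x_true = np.cos(2 * np.pi * 3 * n / N) + 0.5 * np.sin(2 * np.pi * 5 * n / N)

measured = slice(64, 192)       # only the central sector is "measured"
band = np.zeros(N, dtype=bool)  # band limit: keep low-order harmonics
band[:8] = band[-8:] = True

x = np.zeros(N)
for _ in range(200):
    x[measured] = x_true[measured]  # re-impose the measured data
    X = np.fft.fft(x)
    X[~band] = 0.0                  # enforce the band limit
    x = np.fft.ifft(X).real

err = np.max(np.abs(x - x_true))
print(f"max extrapolation error: {err:.3e}")
```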