11 results for CORRELATED CALCULATIONS

in DigitalCommons@The Texas Medical Center


Relevance:

30.00%

Publisher:

Abstract:

A non-parametric method was developed and tested to compare the partial areas under two correlated receiver operating characteristic (ROC) curves. Based on the theory of generalized U-statistics, mathematical formulas were derived for computing the ROC area and the variance and covariance between portions of two ROC curves. A practical SAS application was also developed to facilitate the calculations. The accuracy of the non-parametric method was evaluated by comparing it to other methods. When the method was applied to data from a published ROC analysis of CT images, the results were very close to the published ones. A hypothetical example was used to demonstrate the effect of two crossed ROC curves: the two total ROC areas are the same, yet the partial ROC curve analysis found each portion of the area between the two curves to be significantly different. For ROC curves computed on a large scale, such as from a logistic regression model, the method was applied to a breast cancer study using Medicare claims data and yielded the same ROC area as the SAS LOGISTIC procedure. The method also provides an alternative to the global comparison of ROC areas by directly comparing the true-positive rates of two regression models and by determining the range of false-positive rates over which the models differ.
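The partial-area comparison can be illustrated numerically. The sketch below (Python, with simulated data and hypothetical variable names; it is not the authors' SAS application and omits the U-statistic variance and covariance estimates) computes the full nonparametric AUC and the partial AUC over a chosen false-positive-rate range for two correlated markers measured on the same subjects.

```python
# Illustrative sketch (not the authors' SAS implementation): nonparametric
# (trapezoidal / Mann-Whitney) estimates of the full and partial ROC area
# for two correlated markers.  Names and data are hypothetical.
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=200)                        # disease status
marker_a = y + rng.normal(0, 1.0, size=200)             # correlated test results
marker_b = y + 0.5 * marker_a + rng.normal(0, 1.0, 200)

def partial_auc(y_true, score, fpr_lo=0.0, fpr_hi=0.3):
    """Area under the ROC curve restricted to a false-positive-rate range."""
    fpr, tpr, _ = roc_curve(y_true, score)
    grid = np.linspace(fpr_lo, fpr_hi, 200)
    tpr_interp = np.interp(grid, fpr, tpr)               # TPR on the FPR grid
    return np.trapz(tpr_interp, grid)

for name, s in [("A", marker_a), ("B", marker_b)]:
    fpr, tpr, _ = roc_curve(y, s)
    print(name, "full AUC:", round(auc(fpr, tpr), 3),
          "partial AUC (FPR 0-0.3):", round(partial_auc(y, s), 3))
```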

Relevance:

20.00%

Publisher:

Abstract:

America's low-income families struggle to protect their children from multiple threats to their health and growth. Many research and advocacy groups explore the health and educational effects of food insecurity, but less is known about these effects on very young children. Children's HealthWatch, a group of pediatric clinicians and public health researchers, has continuously collected data on the effects of food insecurity, alone and in conjunction with other household hardships, since 1998. The group's peer-reviewed research has shown that a number of economic risks at the household level, including food, housing, and energy insecurity, tend to be correlated. These insecurities, alone or in combination, increase the risk that a young child will suffer negative health consequences, including increased lifetime hospitalizations, parental report of fair or poor health,1 and risk for developmental delays.2 Child food insecurity is an incremental risk indicator above and beyond the risk imposed by household-level food insecurity. Children's HealthWatch research also suggests that public benefits programs modify some of these effects for families experiencing hardships. This empirical evidence is presented in a variety of public venues outside the usual scientific settings, such as congressional hearings, to support the needs of America's most vulnerable population through policy change. Children's HealthWatch research supports legislative solutions to food insecurity, including sustained funding for public programs and re-evaluation of the use of the Thrifty Food Plan as the basis of SNAP benefit calculations. Children's HealthWatch is one of many models supporting the American Academy of Pediatrics' call to "stand up, speak up, and step up for children."3 No isolated group or single intervention will solve child poverty or multiple hardships; however, working collaboratively, each group has a role to play in supporting the health and well-being of young children and their families. 1. Cook JT, Frank DA, Berkowitz C, et al. Food insecurity is associated with adverse health outcomes among human infants and toddlers. J Nutr. 2004;134:1432-1438. 2. Rose-Jacobs R, Black MM, Casey PH, et al. Household food insecurity: associations with at-risk infant and toddler development. Pediatrics. 2008;121:65-72. 3. AAP leader says to stand up, speak up, and step up for child health [news release]. Boston, MA: American Academy of Pediatrics; October 11, 2008. http://www2.aap.org/pressroom/nce/nce08childhealth.htm. Accessed January 1, 2012.

Relevance:

20.00%

Publisher:

Abstract:

The MDAH pencil-beam algorithm developed by Hogstrom et al (1981) has been widely used in clinics for electron beam dose calculations in radiotherapy treatment planning. The primary objective of this research was to address several deficiencies of that algorithm and to develop an enhanced version. Two enhancements were incorporated into the pencil-beam algorithm: one models fluence rather than planar fluence, and the other models the bremsstrahlung dose using measured beam data. The resulting calculated dose distributions were compared with measured dose distributions for several test phantoms. From these results it is concluded (1) that the fluence-based algorithm is more accurate for dose calculation in an inhomogeneous slab phantom, and (2) that the fluence-based calculation provides only a limited improvement to the accuracy of the calculated dose in the region just downstream of the lateral edge of an inhomogeneity. The latter inaccuracy is believed to be primarily due to assumptions made in the pencil beam's modeling of the complex phantom or patient geometry. A pencil-beam redefinition model was then developed for the calculation of electron beam dose distributions in three dimensions. The primary aim of this redefinition model was to solve the dosimetry problem presented by deep inhomogeneities, which was the major remaining deficiency of the enhanced MDAH pencil-beam algorithm. The redefinition model is based on the theory of electron transport, redefining the pencil beams at each layer of the medium. The unique approach of this model is that all physical parameters of a given pencil beam are characterized for multiple energy bins. The calculated dose distributions were compared with measured dose distributions for a homogeneous water phantom and for phantoms with deep inhomogeneities. From these results it is concluded that the redefinition algorithm is superior to the conventional, fluence-based, pencil-beam algorithm, especially in predicting the dose distribution downstream of a local inhomogeneity. The accuracy of this algorithm appears sufficient for clinical use, and the algorithm is structured for future expansion of the physical model if required for site-specific treatment planning problems.
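The pencil-beam picture lends itself to a small numerical illustration. The Python sketch below (purely hypothetical parameters; it is not the MDAH or redefinition algorithm itself) superposes laterally Gaussian pencil beams whose spread grows with depth, which is the basic operation both algorithms build on.

```python
# Minimal illustration of pencil-beam superposition (hypothetical parameters;
# not the MDAH algorithm): each pencil beam deposits a laterally Gaussian dose
# whose spread sigma grows with depth, and the broad-beam dose is their sum.
import numpy as np

x = np.linspace(-6.0, 6.0, 241)                  # lateral positions (cm)
depths = [0.5, 1.5, 2.5]                         # depths of interest (cm)
pencil_positions = np.arange(-2.0, 2.01, 0.1)    # pencil beams across a 4 cm field

def sigma(depth_cm):
    """Assumed lateral spread vs. depth (purely illustrative)."""
    return 0.05 + 0.15 * depth_cm ** 1.5

for d in depths:
    s = sigma(d)
    # Sum of Gaussian lateral kernels, one per pencil beam
    dose = sum(np.exp(-0.5 * ((x - x0) / s) ** 2) / (s * np.sqrt(2 * np.pi))
               for x0 in pencil_positions)
    dose /= dose.max()
    print(f"depth {d} cm: 80%-20% penumbra width = "
          f"{x[dose >= 0.2].max() - x[dose >= 0.8].max():.2f} cm")
```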

Relevance:

20.00%

Publisher:

Abstract:

Many studies in biostatistics deal with binary data. Some of these studies involve correlated observations, which can complicate the analysis of the resulting data. Studies of this kind typically arise when a high degree of commonality exists between test subjects, two examples being measurements on identical twins and studies of symmetrical organs or appendages, as in ophthalmic studies. When a natural hierarchy exists in the data, multilevel analysis is an appropriate analytic tool. Although this type of matching appears ideal for the purposes of comparison, analyzing the resulting data while ignoring the effect of intra-cluster correlation has been shown to produce biased results. This paper explores the use of multilevel modeling of simulated binary data with predetermined levels of correlation. Data are generated using the beta-binomial method with varying degrees of correlation between the lower-level observations and are analyzed using the multilevel software package MLwiN (Woodhouse et al., 1995). Comparisons between the specified intra-cluster correlation of these data and the correlations estimated by multilevel analysis are used to examine the accuracy of this technique for analyzing this type of data.
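The beta-binomial data-generating step can be sketched in a few lines. The snippet below (hypothetical cluster sizes and parameters; it illustrates the simulation idea, not the MLwiN analysis) simulates clustered binary data with a target intra-cluster correlation and recovers it with a one-way ANOVA moment estimator.

```python
# Sketch (hypothetical parameters): simulate clustered binary data with a
# target intra-cluster correlation (ICC) via the beta-binomial construction,
# then recover the ICC with a one-way ANOVA moment estimator.
import numpy as np

rng = np.random.default_rng(1)
n_clusters, cluster_size = 500, 2        # e.g. twin pairs or paired eyes
mu, rho = 0.3, 0.2                       # marginal probability and target ICC

# Beta(a, b) cluster probabilities give ICC = 1 / (a + b + 1)
ab = 1.0 / rho - 1.0
a, b = mu * ab, (1.0 - mu) * ab
p = rng.beta(a, b, size=n_clusters)
y = rng.binomial(1, p[:, None], size=(n_clusters, cluster_size))

# One-way ANOVA (moment) estimator of the ICC
grand = y.mean()
msb = cluster_size * ((y.mean(axis=1) - grand) ** 2).sum() / (n_clusters - 1)
msw = ((y - y.mean(axis=1, keepdims=True)) ** 2).sum() / (n_clusters * (cluster_size - 1))
icc_hat = (msb - msw) / (msb + (cluster_size - 1) * msw)
print(f"target ICC = {rho}, estimated ICC = {icc_hat:.3f}")
```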

Relevance:

20.00%

Publisher:

Abstract:

The purpose of this work was to develop a comprehensive IMSRT QA procedure that examined, using EPID dosimetry and Monte Carlo (MC) calculations, each step in the treatment planning and delivery process. These steps included verification of the field shaping, of the treatment planning system (RTPS) dose calculations, and of the patient dose delivery. Verification of each step in the treatment process is assumed to result in correct dose delivery to the patient. The accelerator MC model was verified against commissioning data for field sizes from 0.8 × 0.8 cm² to 10 × 10 cm². Depth doses were within 2% local percent difference (LPD) in low-gradient regions and 1 mm distance to agreement (DTA) in high-gradient regions. Lateral profiles were within 2% LPD in low-gradient regions and 1 mm DTA in high-gradient regions. Calculated output factors were within 1% of measurement for field sizes ≥1 × 1 cm². The measured and calculated pretreatment EPID dose patterns were compared using criteria of 5% LPD, 1 mm DTA, or 2% of the central-axis pixel value, with ≥95% of compared points required to pass for successful verification; pretreatment field verification resulted in 97% of the points passing. The RTPS and Monte Carlo phantom dose calculations were compared using 5% LPD, 2 mm DTA, or 2% of the maximum dose, with ≥95% of compared points required to pass for successful verification; RTPS calculation verification resulted in 97% of the points passing. The measured and calculated EPID exit dose patterns were compared using criteria of 5% LPD, 1 mm DTA, or 2% of the central-axis pixel value, with ≥95% of compared points required to pass for successful verification; exit dose verification resulted in 97% of the points passing. Each of the processes above verified an individual step in the treatment planning and delivery process, and the combination of these verification steps ensures accurate treatment delivery to the patient. This work shows that Monte Carlo calculations and EPID dosimetry can be used to quantitatively verify IMSRT treatments, resulting in improved patient care and, potentially, improved clinical outcomes.
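A composite "percent difference or DTA" pass-rate test of the kind described above can be sketched in one dimension. The snippet below is an illustrative simplification (toy profiles, assumed tolerances, and an approximate DTA check), not the clinical QA software used in this work.

```python
# Sketch of a composite agreement test (illustrative only): a point passes if
# the local percent difference is within tolerance OR some calculated point
# within the distance-to-agreement (DTA) matches the measured dose.
import numpy as np

def pass_rate(measured, calculated, positions, pct_tol=0.05, dta_mm=1.0):
    passed = 0
    for i, (m, pos) in enumerate(zip(measured, positions)):
        c = calculated[i]
        pct_ok = abs(c - m) <= pct_tol * abs(m) if m != 0 else c == 0
        nearby = np.abs(positions - pos) <= dta_mm          # points within the DTA
        dta_ok = np.any(np.abs(calculated[nearby] - m) <= pct_tol * abs(m))
        passed += pct_ok or dta_ok
    return passed / len(measured)

x = np.arange(0.0, 50.0, 0.5)                               # positions in mm
meas = np.exp(-((x - 25.0) / 10.0) ** 2)                    # toy dose profile
calc = np.exp(-((x - 25.3) / 10.0) ** 2)                    # slightly shifted
print(f"fraction of points passing: {pass_rate(meas, calc, x):.3f}")
```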

Relevance:

20.00%

Publisher:

Abstract:

Uveal melanoma is a rare but life-threatening form of ocular cancer. Contemporary treatment techniques include proton therapy, which enables conservation of the eye and its useful vision. Dose to the proximal structures is widely believed to play a role in treatment side effects; therefore, reliable dose estimates are required for properly evaluating the therapeutic value and complication risk of treatment plans. Unfortunately, current simplistic dose calculation algorithms can result in errors of up to 30% in the proximal region, and they lack predictive methods for absolute dose per monitor unit (D/MU) values. To facilitate more accurate dose predictions, a Monte Carlo model of an ocular proton nozzle was created and benchmarked against measured dose profiles to within ±3% or ±0.5 mm and against D/MU values to within ±3%. The benchmarked Monte Carlo model was used to develop and validate a new broad-beam dose algorithm that included the influence of edge-scattered protons on the cross-field intensity profile, the effect of energy straggling in the distal portion of poly-energetic beams, and the proton fluence loss as a function of residual range. Generally, the analytical algorithm predicted relative dose distributions within ±3% or ±0.5 mm and absolute D/MU values within ±3% of the Monte Carlo calculations. Slightly larger dose differences were observed at depths less than 7 mm, an effect attributed to the dose contributions of edge-scattered protons. Additional comparisons of Monte Carlo and broad-beam dose predictions were made in a detailed eye model developed in this work, with generally similar findings. Monte Carlo was shown to be an excellent predictor of the measured dose profiles and D/MU values and a valuable tool for developing and validating a broad-beam dose algorithm for ocular proton therapy. The more detailed physics modeling by the Monte Carlo and broad-beam dose algorithms represents an improvement in the accuracy of relative dose predictions over current techniques, and both provide absolute dose predictions. It is anticipated that these improvements can be used to develop treatment strategies that reduce the incidence or severity of treatment complications by sparing normal tissue.
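Two of the effects the broad-beam algorithm accounts for, energy (range) straggling and proton fluence loss with residual range, can be illustrated numerically. The sketch below uses arbitrary, hypothetical numbers and a toy depth-dose curve; it is not the benchmarked algorithm, only a picture of the two corrections it applies.

```python
# Illustration only (hypothetical parameters, not the validated broad-beam
# algorithm): apply two of the modelled effects to a toy depth-dose curve --
# Gaussian range straggling (a convolution in depth) and a gradual fluence
# loss with decreasing residual range.
import numpy as np

z = np.arange(0.0, 30.0, 0.1)                      # depth (mm)
range_mm = 25.0
raw = np.where(z <= range_mm, 1.0 + 2.0 * (z / range_mm) ** 4, 0.0)  # toy pristine peak

# Gaussian straggling kernel (sigma assumed, e.g. 1 mm)
sigma = 1.0
kernel_z = np.arange(-5 * sigma, 5 * sigma + 0.1, 0.1)
kernel = np.exp(-0.5 * (kernel_z / sigma) ** 2)
kernel /= kernel.sum()
straggled = np.convolve(raw, kernel, mode="same")

# Assumed fluence loss: a few percent of protons removed per cm of depth
fluence = 1.0 - 0.02 * (z / 10.0)
dose = straggled * fluence

print("peak depth before/after straggling (mm):",
      z[np.argmax(raw)], z[np.argmax(dose)])
```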

Relevance:

20.00%

Publisher:

Abstract:

Interaction effects are of scientific interest in many areas of research. A common approach for investigating the interaction effect of two continuous covariates on a response variable is a cross-product term in multiple linear regression. In epidemiological studies, a two-way analysis of variance (ANOVA) has also been used to examine the interaction effect by replacing the continuous covariates with their discretized levels. However, the implications of the model assumptions of either approach have not been examined, and statistical validation has focused only on the general methods, not specifically on the interaction effect. In this dissertation, we investigated the validity of both approaches based on their mathematical assumptions for non-skewed data. We showed that linear regression may not be an appropriate model when the interaction effect exists because it implies a highly skewed distribution for the response variable. We also showed that the normality and constant-variance assumptions required by ANOVA are not satisfied in the model where the continuous covariates are replaced with their discretized levels. Therefore, naïve application of the ANOVA method may lead to incorrect conclusions. Given the problems identified above, we proposed a novel method, modified from the traditional ANOVA approach, to rigorously evaluate the interaction effect. An analytical expression for the interaction effect was derived based on the conditional distribution of the response variable given the discretized continuous covariates. A testing procedure that combines the p-values from each level of the discretized covariates was developed to test the overall significance of the interaction effect. According to the simulation study, the proposed method is more powerful than least-squares regression and the ANOVA method in detecting the interaction effect when the data come from a trivariate normal distribution. The proposed method was applied to a dataset from the National Institute of Neurological Disorders and Stroke (NINDS) tissue plasminogen activator (t-PA) stroke trial, and a baseline age-by-weight interaction effect was found to be significant in predicting the change from baseline in NIHSS at month 3 among patients who received t-PA therapy.
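The "common approach" mentioned at the start of this abstract, a cross-product term in multiple linear regression, is easy to demonstrate. The sketch below uses simulated data and a hypothetical coefficient of 0.4 on the product term; it illustrates the conventional approach the dissertation critiques, not the proposed p-value-combination procedure.

```python
# Sketch of the conventional cross-product test for an interaction between two
# continuous covariates (simulated data; not the dissertation's new method).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 600
df = pd.DataFrame({"x1": rng.normal(size=n), "x2": rng.normal(size=n)})
df["y"] = 1.0 + 0.5 * df.x1 + 0.5 * df.x2 + 0.4 * df.x1 * df.x2 + rng.normal(size=n)

# y ~ x1 + x2 + x1:x2 -- the x1:x2 coefficient estimates the interaction effect
fit = smf.ols("y ~ x1 + x2 + x1:x2", data=df).fit()
print("interaction estimate:", round(fit.params["x1:x2"], 3),
      "p-value:", fit.pvalues["x1:x2"])
```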

Relevance:

20.00%

Publisher:

Abstract:

Current statistical methods for estimating parametric effect sizes from a series of experiments are generally restricted to univariate comparisons of standardized mean differences between two treatments. Multivariate methods are presented for the case in which the effect size is a vector of standardized multivariate mean differences and the number of treatment groups is two or more. The proposed methods employ a vector of independent sample means for each response variable, which leads to a covariance structure that depends only on the correlations among the $p$ responses on each subject. Using weighted least squares theory and the assumption that the observations come from normally distributed populations, multivariate hypotheses analogous to the common hypotheses used for testing effect sizes were formulated and tested for treatment effects that are correlated through a common control group, through multiple response variables observed on each subject, or both. The asymptotic multivariate distribution of correlated effect sizes is obtained by extending univariate methods for estimating effect sizes that are correlated through common control groups. The joint distributions are derived for vectors of effect sizes (from $p$ responses on each subject) from one treatment and one control group and from several treatment groups sharing a common control group. Methods are given for estimating linear combinations of effect sizes when certain homogeneity conditions are met, and for estimating vectors of effect sizes and confidence intervals from $p$ responses on each subject. Computational illustrations are provided using data from studies of the effects of electric field exposure on small laboratory animals.
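The dependence induced by a shared control group can be seen with a small simulation. The sketch below (simulated data, hypothetical group means; it does not reproduce the dissertation's weighted-least-squares derivations) computes standardized mean differences for two treatments measured against one common control and uses a bootstrap to show that the two effect-size estimates are positively correlated.

```python
# Sketch: standardized mean differences for several treatment groups sharing
# one control group, plus a bootstrap estimate of the covariance that the
# shared control induces between the effect-size estimates (simulated data).
import numpy as np

rng = np.random.default_rng(3)
control = rng.normal(0.0, 1.0, size=40)
treatments = [rng.normal(0.4, 1.0, size=40), rng.normal(0.8, 1.0, size=40)]

def effect_sizes(ctrl, groups):
    """Standardized mean differences (pooled SD) against a common control."""
    out = []
    for g in groups:
        sp = np.sqrt(((len(g) - 1) * g.var(ddof=1) +
                      (len(ctrl) - 1) * ctrl.var(ddof=1)) / (len(g) + len(ctrl) - 2))
        out.append((g.mean() - ctrl.mean()) / sp)
    return np.array(out)

boot = np.array([
    effect_sizes(rng.choice(control, size=len(control), replace=True),
                 [rng.choice(g, size=len(g), replace=True) for g in treatments])
    for _ in range(2000)
])
print("effect sizes:", np.round(effect_sizes(control, treatments), 2))
print("bootstrap covariance of the two effect sizes:\n", np.round(np.cov(boot.T), 3))
```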

Relevance:

20.00%

Publisher:

Abstract:

The basis for the recent transition of Enterococcus faecium from a primarily commensal organism to one of the leading causes of hospital-acquired infections in the United States is not yet understood. To address this, the first part of my project assessed isolates from early outbreaks in the USA and South America using sequence analysis, colony hybridizations, and minimal inhibitory concentrations (MICs), which showed that clinical isolates possess virulence and antibiotic resistance determinants that are less abundant or lacking in community isolates. I also found that the level of ampicillin resistance increased over time in clinical strains. By sequencing the pbp5 gene, I demonstrated an ~5% difference in pbp5 between strains with MICs <4 µg/ml and those with MICs >4 µg/ml, but no specific sequence changes correlated with increases in MICs within the latter group. A 3-10% nucleotide difference was also seen in three other genes analyzed, suggesting the existence of two distinct subpopulations of E. faecium. This led to the second part of my project, which analyzed concatenated core gene sequences, SNPs, the 16S rRNA gene, and the phylogenetics of 21 E. faecium genomes, confirming two distinct clades: a community-associated (CA) clade and a hospital-associated (HA) clade. Molecular clock calculations indicate that these two clades likely diverged ~300,000 to >1 million years ago, long before the modern antibiotic era. Genomic analysis also showed that, in addition to core genomic differences, HA E. faecium harbor specific accessory genetic elements that may confer selective advantages over CA E. faecium. The third part of my project identified six E. faecium genes with the newly described "WxL" domain. My analyses, using RT-PCR, western blots, patient sera, whole-cell ELISA, and immunogold electron microscopy, indicated that E. faecium WxL genes exist in operons and encode cell-surface-localized proteins, and that WxL proteins are antigenic in humans and are more exposed on the surface of clinical isolates than of community isolates (even though they are ubiquitous in both clades). ELISA and BIAcore analyses also showed that proteins encoded by these operons bind several different host extracellular matrix proteins, as well as each other, suggesting a novel cell-surface complex. In summary, my studies provide new insights into the evolution of E. faecium by showing that there are two distantly related clades, one being more successful in the hospital setting. My studies also identified operons encoding WxL proteins whose characteristics could contribute to colonization and virulence within this species.

Relevance:

20.00%

Publisher:

Abstract:

Background: A key physical characteristic of protons is that they deliver most of their radiation dose to the target volume and essentially no dose to the normal tissue distal to the tumor. Numerous previous studies have shown unique advantages of proton therapy over intensity-modulated radiation therapy (IMRT) in conforming dose to the tumor and sparing the surrounding normal tissues and critical structures in many clinical sites. However, proton therapy is known to be more sensitive to treatment uncertainties such as inter- and intra-fractional variations in patient anatomy. To date, no study has clearly demonstrated the effectiveness of proton therapy compared with conventional IMRT when both respiratory motion and tumor shrinkage are considered in non-small cell lung cancer (NSCLC) patients. Purpose: This thesis investigated two questions in order to establish a clinically relevant comparison of the two modalities (IMRT and proton therapy). The first question was whether there are any differences in tumor shrinkage between patients randomized to IMRT versus passively scattered proton therapy (PSPT). Tumor shrinkage is considered a standard measure of radiation therapy response and has been widely used to gauge short-term progression during radiation therapy. The second question was whether there are any differences between the planned dose and the 5D dose under the influence of inter- and intra-fractional variations in patient anatomy for both modalities. Methods: A total of 45 patients (25 IMRT patients and 20 PSPT patients) were used to quantify tumor shrinkage in terms of the change in the primary gross tumor volume (GTVp). All patients were randomized to receive either IMRT or PSPT for NSCLC, and the treatment planning goals were identical for both groups. All patients received 5 to 8 weekly repeated 4-dimensional computed tomography (4DCT) scans during the course of radiation treatment. The original GTVp contours were propagated to T50 of the weekly 4DCT images using deformable image registration, and their absolute volumes were measured. Statistical analysis was performed to compare the distribution of tumor shrinkage between the two groups. To investigate the difference between the planned dose and the 5D dose with consideration of both breathing motion and anatomical change, we re-calculated new dose distributions at every phase of the breathing cycle for all available weekly 4DCT data sets, which resulted in 50 to 80 individual dose calculations for each of the 7 patients presented in this thesis. The newly calculated dose distributions were then deformed and accumulated to T50 of the planning 4DCT for comparison with the planned dose distribution. Results: At the end of treatment, the IMRT and PSPT groups showed mean tumor volume reductions of 23.6% (±19.2%) and 20.9% (±17.0%), respectively. The mean difference in tumor shrinkage between the two groups was 3%, with a corresponding 95% confidence interval of [-8%, 14%]. The rate of tumor shrinkage was highly correlated with the initial tumor volume. For the planned dose and 5D dose comparison, all 7 patients showed a mean difference of 1% in target coverage for both the IMRT and PSPT treatment plans.
Conclusions: The tumor shrinkage investigation showed no statistically significant difference in tumor shrinkage between the IMRT and PSPT patients, and the tumor shrinkage of the two modalities is similar based on the 95% confidence interval. From the pilot study comparing the planned dose with the 5D dose, we found the difference to be only 1%. Overall, for the two modalities delivered under the same randomized protocol, treatment response as measured by tumor shrinkage and the 5D dose under the influence of anatomical change were similar.
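The group comparison reported above can be reproduced in outline with a few lines of code. The sketch below simulates fractional volume reductions with the same means and spreads quoted in the abstract (it does not use the trial data) and forms a Welch 95% confidence interval for the difference in mean shrinkage between the two arms.

```python
# Sketch (simulated volume reductions, not the trial data): compare mean tumor
# shrinkage between two arms and form a Welch 95% confidence interval for the
# difference in means.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
shrink_imrt = rng.normal(0.236, 0.192, size=25)   # fractional GTV reduction, arm 1
shrink_pspt = rng.normal(0.209, 0.170, size=20)   # fractional GTV reduction, arm 2

diff = shrink_imrt.mean() - shrink_pspt.mean()
v1 = shrink_imrt.var(ddof=1) / len(shrink_imrt)
v2 = shrink_pspt.var(ddof=1) / len(shrink_pspt)
se = np.sqrt(v1 + v2)
# Welch-Satterthwaite degrees of freedom
dof = (v1 + v2) ** 2 / (v1 ** 2 / (len(shrink_imrt) - 1) + v2 ** 2 / (len(shrink_pspt) - 1))
t_crit = stats.t.ppf(0.975, dof)
print(f"difference in mean shrinkage: {diff:.3f}")
print(f"95% CI: [{diff - t_crit * se:.3f}, {diff + t_crit * se:.3f}]")
```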

Relevance:

20.00%

Publisher:

Abstract:

The electron pencil-beam redefinition algorithm (PBRA) of Shiu and Hogstrom has been developed for use in radiotherapy treatment planning (RTP). Earlier studies by Boyd and Hogstrom showed that the PBRA lacked an adequate incident-beam model, that the PBRA might require improved electron physics, and that no data existed that allowed adequate assessment of PBRA-calculated dose accuracy in a heterogeneous medium such as that presented by patient anatomy. The hypothesis of this research was that, by addressing the above issues, the PBRA-calculated dose would be accurate to within 4% or 2 mm in regions of high dose gradients. A secondary electron source was added to the PBRA to account for collimation-scattered electrons in the incident beam. Parameters of the dual-source model were determined from a minimal data set to allow ease of beam commissioning. Comparisons with measured data showed 3% or better dose accuracy in water within the field for cases where 4% accuracy was not previously achievable. A measured data set was developed that allowed evaluation of the PBRA in regions distal to localized heterogeneities; geometries in the data set included irregular surfaces and high- and low-density internal heterogeneities. The data were estimated to have 1% precision and 2% agreement with an accurate, benchmarked Monte Carlo (MC) code. PBRA electron transport was enhanced by modeling local pencil-beam divergence, which required fundamental changes to the mathematics of electron transport (divPBRA). Evaluation of divPBRA with the measured data set showed marginal improvement in dose accuracy compared with the PBRA; however, 4% or 2 mm accuracy was not achieved by either PBRA version for all data points. Finally, the PBRA was evaluated clinically by comparing PBRA- and MC-calculated dose distributions using site-specific patient RTP data. Results show that the PBRA did not agree with MC to within 4% or 2 mm in only a small fraction (<3%) of the irradiated volume. Although the hypothesis of the research was shown to be false, the minor dose inaccuracies should have little or no impact on RTP decisions or patient outcomes. Therefore, given the ease of beam commissioning, documentation of accuracy, and calculational speed, the PBRA should be considered a practical tool for clinical use.
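The dual-source idea, a primary source plus a broader secondary component representing collimation-scattered electrons, can be pictured with a small numerical sketch. The snippet below uses entirely hypothetical weights, widths, and field size; it is not the commissioned PBRA source model, only an illustration of combining a sharp and a smeared field-edge profile.

```python
# Illustrative dual-source incident-fluence profile (hypothetical weights and
# widths; not the commissioned PBRA model): a narrow primary component plus a
# broader component standing in for collimation-scattered electrons.
import numpy as np
from math import sqrt
from scipy.special import erf

x = np.linspace(-8.0, 8.0, 321)           # off-axis position (cm)
half_width = 3.0                          # assumed half-width of the open field (cm)

def smeared_field(x, half_width, sigma):
    """Uniform field edge smeared by a Gaussian of width sigma (error functions)."""
    return 0.5 * (erf((half_width - x) / (sqrt(2) * sigma)) +
                  erf((half_width + x) / (sqrt(2) * sigma)))

primary = smeared_field(x, half_width, sigma=0.3)      # sharp, primary electrons
scatter = smeared_field(x, half_width, sigma=1.5)      # broad, collimator scatter
fluence = 0.9 * primary + 0.1 * scatter                # assumed relative weights

print("central-axis fluence:", round(fluence[len(x) // 2], 3))
print("fluence at field edge (x = 3 cm):",
      round(float(np.interp(3.0, x, fluence)), 3))
```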