20 results for high dimensional secondary classifier
in DigitalCommons@The Texas Medical Center
Abstract:
Essential biological processes are governed by organized, dynamic interactions between multiple biomolecular systems. Complexes are formed to enable a biological function and are disassembled once the process is completed. Examples of such processes include the translation of messenger RNA into protein by the ribosome, the folding of proteins by chaperonins, and the entry of viruses into host cells. Understanding these fundamental processes by characterizing the molecular mechanisms that enable them would allow the better design of therapies and drugs. Such molecular mechanisms may be revealed through the structural elucidation of the biomolecular assemblies at the core of these processes. Various experimental techniques may be applied to investigate the molecular architecture of biomolecular assemblies. High-resolution techniques, such as X-ray crystallography, may solve the atomic structure of the system, but are typically constrained to biomolecules of limited flexibility and size. In particular, X-ray crystallography requires the sample to form a three-dimensional (3D) crystal lattice, which is technically difficult, if not impossible, to obtain, especially for large, dynamic systems. Often these techniques solve the structures of the individual constituent components of the assembly, but encounter difficulties when investigating the entire system. On the other hand, imaging techniques such as cryo-electron microscopy (cryo-EM) are able to depict large systems in a near-native environment, without requiring the formation of crystals. The structures solved by cryo-EM cover a wide range of resolutions, from very low levels of detail where only the overall shape of the system is visible, to high resolutions that approach, but do not yet reach, atomic detail. In this dissertation, several modeling methods are introduced to either integrate cryo-EM datasets with structural data from X-ray crystallography or to directly interpret the cryo-EM reconstruction. These computational techniques were developed with the goal of creating an atomic model for the cryo-EM data. Low-resolution reconstructions lack the level of detail needed to permit a direct atomic interpretation, i.e. one cannot reliably locate the atoms or amino-acid residues within the structure obtained by cryo-EM. Therefore one needs to consider additional information, for example structural data from other sources such as X-ray crystallography, in order to enable such a high-resolution interpretation. Modeling techniques are thus developed to integrate the structural data from the different biophysical sources, examples including the work described in manuscripts I and II of this dissertation. At intermediate and high resolution, cryo-EM reconstructions depict consistent 3D features, such as tubular densities that generally correspond to alpha-helices. Such features can be annotated and later used to build the atomic model of the system, as described in manuscript III. Three manuscripts are presented as part of this PhD dissertation, each introducing a computational technique that facilitates the interpretation of cryo-EM reconstructions. The first manuscript is an application paper that describes a heuristic to generate an atomic model for the protein envelope of the Rift Valley fever virus. The second manuscript introduces evolutionary tabu search strategies that enable the integration of multiple component atomic structures with the cryo-EM map of their assembly.
Finally, the third manuscript further develops the latter technique and applies it to annotate consistent 3D patterns in intermediate-resolution cryo-EM reconstructions. The first manuscript, titled An assembly model for Rift Valley fever virus, was submitted for publication in the Journal of Molecular Biology. The cryo-EM structure of the Rift Valley fever virus was previously solved at 27 Å resolution by Dr. Freiberg and collaborators. This reconstruction shows the overall shape of the virus envelope, yet the reduced level of detail prevents a direct atomic interpretation. High-resolution structures are not yet available for the entire virus nor for the two component glycoproteins that form its envelope. However, homology models may be generated for these glycoproteins based on similar structures that are available at atomic resolution. The manuscript presents the steps required to identify an atomic model of the entire virus envelope, based on the low-resolution cryo-EM map of the envelope and the homology models of the two glycoproteins. Starting with the results of an exhaustive search to place the two glycoproteins, the model is built iteratively by running multiple multi-body refinements to hierarchically generate models for the different regions of the envelope. The resulting atomic model is supported by prior knowledge of virus biology and contains valuable information about the molecular architecture of the system. It provides the basis for further investigations seeking to reveal the different processes in which the virus is involved, such as assembly or fusion. The second manuscript was recently published in the Journal of Structural Biology (doi:10.1016/j.jsb.2009.12.028) under the title Evolutionary tabu search strategies for the simultaneous registration of multiple atomic structures in cryo-EM reconstructions. This manuscript introduces evolutionary tabu search strategies applied to multi-body registration. The technique is a hybrid approach that combines a genetic algorithm with a tabu search strategy to promote proper exploration of the high-dimensional search space. As with the Rift Valley fever virus, it is common that the structure of a large multi-component assembly is available at low resolution from cryo-EM, while high-resolution structures are solved for the individual components but are lacking for the entire system. Evolutionary tabu search strategies enable the building of an atomic model for the entire system by considering the different components simultaneously. Such a registration indirectly introduces spatial constraints, as all components need to be placed within the assembly, enabling them to be properly docked in the low-resolution map of the entire assembly. Along with the method description, the manuscript covers the validation, presenting the benefit of the technique in both synthetic and experimental test cases. The approach successfully docked multiple components up to resolutions of 40 Å. The third manuscript is entitled Evolutionary Bidirectional Expansion for the Annotation of Alpha Helices in Electron Cryo-Microscopy Reconstructions and was submitted for publication in the Journal of Structural Biology. The modeling approach described in this manuscript applies the evolutionary tabu search strategies in combination with a bidirectional expansion to annotate secondary structure elements in intermediate-resolution cryo-EM reconstructions.
In particular, secondary structure elements such as alpha helices show consistent patterns in cryo-EM data and are visible as rod-like regions of high density. The evolutionary tabu search strategy is applied to identify the placement of the different alpha helices, while the bidirectional expansion characterizes their length and curvature. The manuscript presents the validation of the approach at resolutions ranging between 6 and 14 Å, a level of detail at which alpha helices are visible. Up to a resolution of 12 Å, the method achieved sensitivities between 70% and 100% in experimental test cases, i.e. 70-100% of the alpha helices were correctly and automatically predicted in the experimental data. The three manuscripts presented in this PhD dissertation cover different computational methods for the integration and interpretation of cryo-EM reconstructions. The methods were developed in the molecular modeling software Sculptor (http://sculptor.biomachina.org) and are available to the scientific community interested in the multi-resolution modeling of cryo-EM data. The work spans a wide range of resolutions, covering multi-body refinement and registration at low resolution along with the annotation of consistent patterns at higher resolution. Such methods are essential for the modeling of cryo-EM data and may be applied in other fields where similar spatial problems are encountered, such as medical imaging.
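As a rough illustration of the evolutionary tabu search idea described in this abstract, a genetic algorithm whose moves are filtered by a tabu list of recently visited solutions, the sketch below evolves candidate placements of several rigid components and forbids revisiting explored solutions. The solution encoding, scoring function, and all parameters are illustrative assumptions; this is not the Sculptor implementation.

```python
# Hypothetical sketch of an evolutionary tabu search for multi-body placement.
# The scoring function, pose encoding, and parameters are illustrative only.
import random

N_BODIES = 3              # number of component structures to place
GENS, POP_SIZE = 200, 40
TABU_SIZE = 100

def random_pose():
    # a pose = (x, y, z, three Euler angles) for one component
    return tuple(random.uniform(-50, 50) for _ in range(3)) + \
           tuple(random.uniform(0, 360) for _ in range(3))

def random_solution():
    return tuple(random_pose() for _ in range(N_BODIES))

def score(solution):
    # placeholder for the fit (e.g. cross-correlation) between the placed
    # components and the experimental cryo-EM map
    return -sum(abs(v) for pose in solution for v in pose[:3])

def mutate(solution):
    bodies = list(solution)
    i = random.randrange(N_BODIES)
    bodies[i] = random_pose()          # re-sample one component's pose
    return tuple(bodies)

def crossover(a, b):
    # exchange whole component poses between two parent solutions
    return tuple(random.choice(pair) for pair in zip(a, b))

population = [random_solution() for _ in range(POP_SIZE)]
tabu = set()                           # recently visited solutions are forbidden
best = max(population, key=score)

for _ in range(GENS):
    parents = sorted(population, key=score, reverse=True)[:POP_SIZE // 2]
    children = []
    while len(children) < POP_SIZE:
        child = mutate(crossover(*random.sample(parents, 2)))
        if child in tabu:              # tabu move: skip already explored solutions
            continue
        children.append(child)
        tabu.add(child)
        if len(tabu) > TABU_SIZE:
            tabu.pop()                 # crude bound on the tabu list size
    population = children
    best = max(population + [best], key=score)

print("best score:", round(score(best), 2))
```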
Abstract:
Brain tumor is one of the most aggressive types of cancer in humans, with an estimated median survival time of 12 months and only 4% of patients surviving more than 5 years after diagnosis. Until recently, brain tumor prognosis has been based only on clinical information such as tumor grade and patient age, but there are reports indicating that molecular profiling of gliomas can reveal subgroups of patients with distinct survival rates. We hypothesize that coupling molecular profiling of brain tumors with clinical information might improve predictions of patient survival time and, consequently, better guide future treatment decisions. In order to evaluate this hypothesis, the general goal of this research is to build models for survival prediction of glioma patients using DNA molecular profiles (Affymetrix U133 gene expression microarrays) along with clinical information. First, a predictive Random Forest model is built for binary outcomes (i.e. short- vs. long-term survival) and a small subset of genes whose expression values can be used to predict survival time is selected. Next, a new statistical methodology is developed for predicting time-to-death outcomes using Bayesian ensemble trees. Because of the large heterogeneity observed within the prognostic classes obtained by the Random Forest model, prediction can be improved by relating time-to-death to the gene expression profile directly. We propose a Bayesian ensemble model for survival prediction that is appropriate for high-dimensional data such as gene expression data. Our approach is based on the ensemble "sum-of-trees" model, which is flexible enough to incorporate additive and interaction effects between genes. We specify a fully Bayesian hierarchical approach and illustrate our methodology for the Cox proportional hazards (CPH), Weibull, and accelerated failure time (AFT) survival models. We overcome the lack of conjugacy by using a latent variable formulation to model the covariate effects, which decreases computation time for model fitting. Our proposed models also provide a model-free way to select important predictive prognostic markers based on controlling false discovery rates. We compare the performance of our methods with baseline reference survival methods and apply our methodology to an unpublished data set of brain tumor survival times and gene expression data, selecting genes potentially related to the development of the disease under study. A closing discussion compares the results obtained by the Random Forest and Bayesian ensemble methods from biological/clinical perspectives and highlights the statistical advantages and disadvantages of the new methodology in the context of DNA microarray data analysis.
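To make the first modeling step concrete, here is a minimal sketch of a Random Forest classifier for short- versus long-term survival from gene expression, followed by selection of the most important genes. The data are simulated, and the 12-month cutoff, tree count, and subset size are assumptions made for illustration; they are not the dissertation's settings.

```python
# Minimal sketch (simulated data): Random Forest for short- vs. long-term survival
# and selection of a small predictive gene subset by importance ranking.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_patients, n_genes = 120, 2000
X = rng.normal(size=(n_patients, n_genes))          # expression matrix (patients x genes)
survival_months = rng.exponential(scale=12, size=n_patients)
y = (survival_months > 12).astype(int)              # 1 = long-term survivor (assumed cutoff)

clf = RandomForestClassifier(n_estimators=500, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean().round(3))

clf.fit(X, y)
top_genes = np.argsort(clf.feature_importances_)[::-1][:20]   # candidate gene subset
print("top candidate genes (column indices):", top_genes)
```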
Abstract:
Next-generation DNA sequencing platforms can effectively detect the entire spectrum of genomic variation and are emerging as a major tool for systematic exploration of the universe of variants and interactions across the entire genome. However, the data produced by next-generation sequencing technologies suffer from three basic problems: sequence errors, assembly errors, and missing data. Current statistical methods for genetic analysis are well suited to detecting associations of common variants, but are less suitable for rare variants. This raises great challenges for sequence-based genetic studies of complex diseases. This research dissertation used the genome continuum model as a general principle, and stochastic calculus and functional data analysis as tools, to develop novel and powerful statistical methods for the next generation of association studies of both qualitative and quantitative traits in the context of sequencing data, ultimately shifting the paradigm of association analysis from the current locus-by-locus analysis to the collective analysis of genome regions. In this project, functional principal component (FPC) methods coupled with high-dimensional data reduction techniques were used to develop novel and powerful methods for testing the association of the entire spectrum of genetic variation within a segment of the genome or a gene, regardless of whether the variants are common or rare. Classical quantitative genetics suffers from high type I error rates and low power for rare variants. To overcome these limitations for resequencing data, this project used functional linear models with a scalar response to develop statistics for identifying quantitative trait loci (QTLs) for both common and rare variants. To illustrate their application, the functional linear models were applied to five quantitative traits in the Framingham Heart Study. The project also proposed a novel concept of gene-gene co-association, in which a gene or a genomic region is taken as the unit of association analysis, and used stochastic calculus to develop a unified framework for testing the association of multiple genes or genomic regions for both common and rare alleles. The proposed methods were applied to gene-gene co-association analysis of psoriasis in two independent GWAS datasets, which led to the discovery of networks significantly associated with psoriasis.
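The following sketch gives one hedged reading of the FPC idea: each subject's genotypes across a region are treated as a function of genomic position, smoothed onto a common basis, summarized by principal component scores, and the scores are compared between cases and controls. The Fourier basis, the simulated genotypes, and the simple two-sample test are assumptions for illustration; they are not the dissertation's actual models or tests.

```python
# Rough sketch of a functional principal component (FPC) summary of a genomic region.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_subj, n_variants = 500, 40
pos = np.sort(rng.uniform(0, 1, n_variants))          # variant positions rescaled to [0, 1]
G = rng.binomial(2, 0.05, size=(n_subj, n_variants))  # genotypes (0/1/2), mostly rare variants
y = rng.binomial(1, 0.3, size=n_subj)                 # case/control status

def fourier_basis(x, n_basis):
    # simple Fourier basis: constant term plus sine/cosine pairs
    cols = [np.ones_like(x)]
    j = 1
    while len(cols) < n_basis:
        cols.append(np.sin(2 * np.pi * j * x))
        cols.append(np.cos(2 * np.pi * j * x))
        j += 1
    return np.column_stack(cols[:n_basis])

# Smooth each subject's genotype profile onto a dense grid via least squares
B_pos = fourier_basis(pos, 9)
B_grid = fourier_basis(np.linspace(0, 1, 100), 9)
coef, *_ = np.linalg.lstsq(B_pos, G.T, rcond=None)    # basis coefficients, one column per subject
smooth = (B_grid @ coef).T                            # subjects x grid genotype functions

# Functional principal components ~ SVD of the centered smoothed functions
centered = smooth - smooth.mean(axis=0)
U, S, _ = np.linalg.svd(centered, full_matrices=False)
scores = U[:, :3] * S[:3]                             # leading FPC scores per subject

# Crude association check: do the FPC scores differ between cases and controls?
for k in range(3):
    t, p = stats.ttest_ind(scores[y == 1, k], scores[y == 0, k])
    print(f"FPC {k + 1}: t = {t:.2f}, p = {p:.3f}")
```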
Abstract:
Introduction. Despite the ban on lead-containing gasoline and paint, childhood lead poisoning remains a public health issue. Furthermore, a Medicaid-eligible child is 8 times more likely to have an elevated blood lead level (EBLL) than a non-Medicaid child, which is the primary reason for the early-detection lead screening mandate at ages 12 and 24 months in the Medicaid population. Based on field observations, there was evidence suggesting a screening compliance issue. Objective. The purpose of this study was to analyze blood lead screening compliance in previously lead-poisoned Medicaid children and to test for an association between timely lead screening and timely childhood immunizations. The mean number of months between follow-up tests was also examined for a significant difference between non-compliant and compliant lead-screened children. Methods. Access to the surveillance data on all childhood lead poisoning cases in Bexar County was granted by the San Antonio Metropolitan Health District. A database was constructed and analyzed using descriptive statistics, logistic regression methods, and non-parametric tests. Lead screening at 12 months of age was analyzed separately from lead screening at 24 months. The small portion of the population who were related to one another were included in one analysis and removed from a second analysis to check for significance. Gender, ethnicity, age of home, and having a sibling with an EBLL were ruled out as confounders for the association tests, but ethnicity and age of home were adjusted for in the nonparametric tests. Results. There was a strong significant association between lead screening compliance at 12 months and childhood immunization compliance, with or without including related children (p<0.00). However, there was no significant association between the two variables at the age of 24 months. Furthermore, there was no significant difference in the median of the mean months between follow-up blood tests among the non-compliant and compliant lead-screened population in the 12 month screening group, but there was a significant difference in the 24 month screening group (p<0.01). Discussion. Descriptive statistics showed that 61% and 56% of the previously lead-poisoned Medicaid population did not receive their mandated 12 and 24 month lead screening on time, respectively. This suggests that their elevated blood lead levels could have been detected earlier in childhood had screening occurred on time. Furthermore, a child who was compliant with lead screening at 12 months of age was 2.36 times more likely to also receive childhood immunizations on time compared to a child who was not compliant with the 12 month screening. Even though no statistically significant association was found for the 24 month group, the public health significance of a screening compliance issue is no less important. The Texas Medicaid program needs to enforce lead screening compliance, because it is evident that no monitoring system has been in place. Further recommendations include an increased focus on parental education and the importance of taking children to wellness exams on time.
Abstract:
Bacteriophage BPP-1, which infects Bordetella species, can switch its specificity by mutations to the ligand-binding surface of its major tropism-determinant protein, Mtd. This targeted mutagenesis results from the activity of a phage-encoded diversity-generating retroelement. Purified Mtd binds its receptor with low affinity, yet BPP-1 binding and infection of Bordetella cells are efficient because of high-avidity binding between phage-associated Mtd and its receptor. Here, using an integrative approach of three-dimensional (3D) structural analyses of the entire phage by cryo-electron tomography and single-particle cryo-electron microscopy, we provide direct localization of Mtd in the phage and the structural basis of the high-avidity binding of the BPP-1 phage. Our structure shows that each BPP-1 particle has a T = 7 icosahedral head and an unusual tail apparatus consisting of a short central tail "hub," six short tail spikes, and six extended tail fibers. Subtomographic averaging of the tail fiber maps revealed a two-lobed globular structure at the distal end of each long tail fiber. Tomographic reconstructions of immuno-gold-labeled BPP-1 directly localized Mtd to these globular structures. Finally, our icosahedral reconstruction of the BPP-1 head at 7 Å resolution reveals an HK97-like major capsid protein stabilized by a smaller cementing protein. Our structure represents a unique bacteriophage reconstruction with its tail fibers and ligand-binding domains shown in relation to its tail apparatus. The localization of Mtd at the distal ends of the six tail fibers explains the high-avidity binding of Mtd molecules to cell surfaces for the initiation of infection.
Abstract:
Alzheimer's disease (AD) is characterized by the cerebral accumulation of misfolded and aggregated amyloid-beta protein (Abeta). Disease symptoms can be alleviated, in vitro and in vivo, by 'beta-sheet breaker' pentapeptides that reduce plaque load. However, the peptide nature of these compounds makes them biologically unstable and unable to penetrate membranes with high efficiency. The main goal of this study was to use computational methods to identify small-molecule mimetics with better drug-like properties. For this purpose, the docked conformations of the active peptides were used to identify compounds with similar activities. A series of related beta-sheet breaker peptides were docked to solid-state NMR structures of a fibrillar form of Abeta. The lowest-energy conformations of the active peptides were used to design three-dimensional (3D) pharmacophores suitable for screening the NCI database with Unity. Small-molecular-weight compounds with physicochemical features and a conformation similar to the active peptides were selected and ranked by docking and biochemical parameters. Of 16 diverse compounds selected for experimental screening, 2 prevented and reversed Abeta aggregation at 2-3 μM concentration, as measured by Thioflavin T (ThT) fluorescence and ELISA assays. They also prevented the toxic effects of aggregated Abeta on neuroblastoma cells. Their low molecular weight and aqueous solubility make them promising lead compounds for treating AD.
Abstract:
The current study evaluates the effectiveness of family preservation programs funded by the Mississippi Department of Human Services. The evaluation assessed improvements in child functioning, positive changes in parental and family functioning, and the decrease in foster care placement. It also assessed client and staff satisfaction and the perceived impact of the program on the community. Results indicate that the family preservation programs were effective in improving participants' self-esteem, family cohesion, and adaptability. There were no significant changes in child placement, teen births, or abuse rates. Client and staff satisfaction were high on all quality dimensions. The majority of the sample of community members felt that the family preservation programs were effective in the community.
Abstract:
This research examines the graduation rate experienced by students receiving public education services in the state of Texas. Special attention is paid to the subgroup of Texas students who meet Texas Education Agency criteria for handicapped status. The study is guided by two research questions: What are the high school completion rates experienced by handicapped and nonhandicapped students attending Texas public schools? And what are the predictors of graduation for handicapped and nonhandicapped students? In addition, the following hypotheses are explored. Hypothesis 1: Handicapped students attending a Texas public school will experience a lower rate of high school completion than their nonhandicapped counterparts. Hypothesis 2: Handicapped and nonhandicapped students attending a Texas public school with a budget above the median budget for Texas public schools will experience a higher rate of high school completion than similar students in Texas public schools with a budget below the median. Hypothesis 3: Handicapped and nonhandicapped students attending school in large Texas urban areas will experience a lower rate of high school completion than similar students in Texas public schools in rural areas. Hypothesis 4: Handicapped and nonhandicapped students attending a Texas public school in a county that rates above the state median for food stamp and AFDC recipients will experience a lower rate of high school completion than students living in counties below the median. The study employs extant data from the records of the Texas Education Agency for the 1988-1989 and 1989-1990 school years, from the Texas Department of Health for 1989 and 1990, and from the 1980 Census. The study reveals that nonhandicapped students graduated at a two-year average rate of 0.906, while handicapped students following an Individualized Educational Program (IEP) achieved a two-year average rate of 0.532, and handicapped students following the regular academic program presented a two-year average graduation rate of only 0.371. The presence of other handicapped students and the school district's average expense per student are found to contribute significantly to the completion rates of handicapped students. Size groupings are used to elucidate the varying impacts of these variables on different school districts and different student groups. Conclusions and implications are offered regarding the need to reach national consensus on the definition and computation of high school completion for both handicapped and nonhandicapped students, and the need for improved statewide tracking of handicapped completion rates.
Abstract:
Background. Nosocomial invasive aspergillosis (a highly fatal disease) is an increasing problem for immunocompromised patients. Aspergillus spp. can be transmitted via air (most commonly) and by water. The hypothesis for this prospective study was that there is an association between patient occupancy, housekeeping practices, patients, visitors, and Aspergillus spp. loading. Rooms were sampled as not terminally cleaned (dirty) and terminally cleaned (clean). The secondary hypothesis was that Aspergillus spp. positive samples collected from more than one sampling location within the same patient room represent the same isolate. Methods. Between April and October 2004, 2873 environmental samples (713 air, 607 water, 1256 surface, and 297 spore traps) were collected in and around 209 "clean" and "dirty" patient rooms in a large cancer center hospital. Water sources included aerosolized water from patient room showerheads, sinks, drains, and toilets. Bioaerosol samples were taken from the patient room, from the running shower, from the flushing toilet, and from outside the building. The surface samples included sink and shower drains, showerheads, and air grills. Aspergillus spp. positive samples were also sent for PCR molecular typing (n = 89). Results. All water samples were negative for Aspergillus spp. There were a total of 130 positive culturable samples (5.1%). The predominant species found was Aspergillus niger. Of the positive culturable samples, 106 (14.9%) were air samples and 24 (3.8%) were surface samples. There were 147 spore trap samples, and 49.5% were positive for Aspergillus/Penicillium spp. Of the culturable positive samples sent for PCR, 16 were indistinguishable matches. There was no significant relationship between air and water samples and positive samples from the same room. Conclusion. Patients, visitors, and staff primarily bring Aspergillus spp. into the hospital. The high number of A. niger samples suggests the spores are entering the hospital from outdoors. Eliminating the materials brought to the patient floors from the outside, requiring employees, staff, and visitors to wear cover-ups over their street clothes, and improving cleaning procedures could further reduce positive samples. Mold strains change frequently; it is probably more valuable to understand the pathogenicity of viable spores than to commit resources to molecular strain testing of environmental samples alone.
Abstract:
Random Forests™ is reported to be one of the most accurate classification algorithms for complex data analysis. It shows excellent performance even when most predictors are noisy and the number of variables is much larger than the number of observations. In this thesis, Random Forests was applied to a large-scale lung cancer case-control study. A novel way of automatically selecting prognostic factors was proposed, and a synthetic positive control was used to validate the Random Forests method. Throughout this study we showed that Random Forests can deal with a large number of weak input variables without overfitting and can account for non-additive interactions between these input variables. Random Forests can also be used for variable selection without being adversely affected by collinearities. Random Forests can handle large-scale data sets without rigorous data preprocessing and has a robust variable importance ranking measure. A novel variable selection method is proposed in the context of Random Forests that uses the data noise level as the cut-off value to determine the subset of important predictors. This new approach enhances the ability of the Random Forests algorithm to automatically identify important predictors in complex data. The cut-off value can also be adjusted based on the results of the synthetic positive control experiments. When the data set had a high variables-to-observations ratio, Random Forests complemented the established logistic regression approach. This study suggests that Random Forests is recommended for such high-dimensional data: one can use Random Forests to select the important variables and then use logistic regression, or Random Forests itself, to estimate the effect sizes of the predictors and to classify new observations. We also found that the mean decrease in accuracy is a more reliable variable ranking measure than the mean decrease in Gini.
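A hedged sketch of one way such a noise-level cut-off could work in practice is shown below, using scikit-learn rather than the original Random Forests™ software: shuffled copies of the predictors serve as known-noise variables, permutation importance (the mean-decrease-in-accuracy analogue) is computed, and only real predictors scoring above the best noise variable are retained. The data, parameters, and exact cut-off rule are assumptions for illustration, not the thesis procedure.

```python
# Illustrative sketch (not the thesis code): keep predictors whose importance
# exceeds the "data noise level", estimated from shuffled copies of the predictors.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
n, p = 400, 100
X = rng.normal(size=(n, p))
# outcome depends on one main effect and one non-additive interaction
y = (X[:, 0] + 0.8 * X[:, 1] * X[:, 2] + rng.normal(size=n) > 0).astype(int)

noise = rng.permuted(X, axis=0)              # shuffled columns = synthetic noise variables
X_aug = np.hstack([X, noise])

rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_aug, y)
imp = permutation_importance(rf, X_aug, y, n_repeats=5,
                             random_state=0).importances_mean   # ~ mean decrease in accuracy

cutoff = imp[p:].max()                       # noise level = best-scoring noise variable
selected = np.where(imp[:p] > cutoff)[0]
print("selected predictor columns:", selected)
```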
Abstract:
Purpose. To determine whether self-efficacy (SE) changes predicted total fat (TF) and total fiber (TFB) intake and to examine the relationship between SE changes and the two dietary outcomes. Design. This is a secondary analysis utilizing baseline and first follow-up (FFU) data from NULIFE, a randomized trial. Setting. Nutrition classes were taught in the Texas Medical Center in Houston, Texas. Participants. 79 pre-menopausal, 25-45 year old African American women, with an 85% response rate at FFU. Method. Dietary intake was assessed with the Arizona Food Frequency Questionnaire and SE with the Self-Efficacy for Dietary Change Questionnaire. Analysis was done using Stata version 9. Linear and logistic regression were used, with adjustment for confounders. Results. Linear regression analyses showed that SE changes for eating fruits and vegetables predicted total fiber intake in the control group in both the univariate (P = 0.001) and multivariate (P = 0.01) models, while SE for eating fruits and vegetables at first follow-up predicted total fiber intake in the intervention group in both models (P = 0.000). Logistic regression analyses of low-fat SE changes and 30% or less of calories from total fat showed an adjusted OR of 0.22 (95% CI = 0.03, 1.48; P = 0.12) in the intervention group. Logistic regression analyses of SE changes for fruits and vegetables and 10 g or more of total fiber intake showed an adjusted OR of 6.25 (95% CI = 0.53, 72.78; P = 0.14) in the control group. Conclusion. SE for eating fruits and vegetables at first follow-up predicted the intervention group's TFB intake, and intervention women who increased their SE for eating a low-fat diet were more likely to achieve the study goal of 30% or less of calories from TF. SE changes for eating fruits and vegetables predicted the control group's TFB intake, and control women who increased their SE for eating fruits and vegetables were more likely to achieve the study goal of 10 g or more of TFB. Limitations include the use of self-report measures, the small sample size, and possible control group contamination.
Abstract:
Education is related to health. In cross-sectional data, education level has been associated with physical functioning, and lower levels of education have been associated with health behaviors including smoking, alcohol use, and greater body weight. In school, students may benefit from greater exposure to health-related messages, while students who have dropped out may be more susceptible to influences toward negative health behaviors such as smoking. Improved school retention might therefore improve long-term health outcomes. However, there is limited evidence regarding modifiable factors that predict the likelihood of dropping out. Two likely psychosocial measures are locus of control and parent-child academic conversations. In the current study, data from two waves of a population-based longitudinal survey, the National Education Longitudinal Survey, were used to evaluate whether these two psychosocial measures could predict the likelihood of dropping out for students (n = 16,749) in tenth grade in 1990, with dropout status determined in 1992, while controlling for recognized sociodemographic predictors including parental income, parental education level, race/ethnicity, and sex. Locus of control was measured with the Pearlin Mastery Scale, and parent-child academic conversations were measured by three questions concerning course selection at school, school activities and events, and things the student studied in class. In a logistic regression model, with the sociodemographic control measures entered in a first step before entry of the psychosocial measures in a second step, this study determined that lower levels of locus of control were associated with a greater likelihood of dropping out after two years (odds ratio (OR) = 1.11, 95% confidence interval (CI) 1.08 to 1.15, p < .001), and two of the three parent-child academic discussion items were associated with a greater likelihood of dropping out after two years (OR = 1.69, CI 1.48-1.93, p < .001; OR = 1.22, CI 1.05-1.41, p = .01; OR = 1.01, CI .88-1.15, p = .94). It is possible that interventions aimed at improving locus of control and at building parent-child academic conversations could lower the likelihood of students dropping out, which in turn could yield improved health behaviors and health status in the child's future.
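A minimal sketch of the two-step logistic model described above is shown below using statsmodels on made-up data: sociodemographic controls enter first, then the psychosocial measures (a mastery/locus-of-control score and the three parent-child discussion items) are added, and odds ratios with confidence intervals are read off the second model. Variable names, coding, and the simulated data are assumptions, not the NELS variable definitions.

```python
# Hedged sketch of a two-step logistic regression with odds ratios (simulated data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "dropout": rng.binomial(1, 0.1, n),
    "parent_income": rng.normal(0, 1, n),
    "parent_education": rng.integers(1, 6, n),
    "female": rng.binomial(1, 0.5, n),
    "mastery": rng.normal(0, 1, n),            # stand-in for a Pearlin Mastery Scale score
    "talk_courses": rng.integers(0, 3, n),     # stand-ins for the three discussion items
    "talk_activities": rng.integers(0, 3, n),
    "talk_studies": rng.integers(0, 3, n),
})

# Step 1: sociodemographic controls only; Step 2: add the psychosocial measures
step1 = smf.logit("dropout ~ parent_income + parent_education + female", df).fit(disp=0)
step2 = smf.logit("dropout ~ parent_income + parent_education + female + "
                  "mastery + talk_courses + talk_activities + talk_studies", df).fit(disp=0)

# Odds ratios with 95% confidence intervals for the psychosocial predictors
odds = np.exp(step2.params)
ci = np.exp(step2.conf_int())
print(pd.concat([odds, ci], axis=1).loc[["mastery", "talk_courses",
                                         "talk_activities", "talk_studies"]])
```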
Abstract:
External beam radiation therapy is used to treat nearly half of the more than 200,000 new cases of prostate cancer diagnosed in the United States each year. During a radiation therapy treatment, healthy tissues in the path of the therapeutic beam are exposed to high doses. In addition, the whole body is exposed to a low-dose bath of unwanted scatter radiation from the pelvis and leakage radiation from the treatment unit. As a result, survivors of radiation therapy for prostate cancer face an elevated risk of developing a radiogenic second cancer. Recently, proton therapy has been shown to reduce the dose delivered by the therapeutic beam to normal tissues during treatment compared to intensity-modulated x-ray therapy (IMXT, the current standard of care). However, the magnitude of stray radiation doses from proton therapy, and their impact on the incidence of radiogenic second cancers, was not known. The risk of a radiogenic second cancer following proton therapy for prostate cancer relative to IMXT was determined for 3 patients of large, median, and small anatomical stature. Doses delivered to healthy tissues from the therapeutic beam were obtained from treatment planning system calculations. Stray doses from IMXT were taken from the literature, while stray doses from proton therapy were simulated using a Monte Carlo model of a passive scattering treatment unit and an anthropomorphic phantom. Baseline risk models were taken from the Biological Effects of Ionizing Radiation VII report. A sensitivity analysis was conducted to characterize the sensitivity of the risk calculations to uncertainties in the risk model, the relative biological effectiveness (RBE) of neutrons for carcinogenesis, and inter-patient anatomical variations. The risk projections revealed that proton therapy carries a lower risk of radiogenic second cancer incidence following prostate irradiation compared to IMXT. The sensitivity analysis revealed that the results of the risk analysis depended only weakly on uncertainties in the risk model and inter-patient variations. Second cancer risks were sensitive to changes in the RBE of neutrons. However, the findings of the study were qualitatively consistent for all patient sizes and risk models considered, and for all neutron RBE values less than 100.
Abstract:
Cardiovascular disease has been the leading cause of death in the United States for over fifty years. While multiple risk factors for cardiovascular disease have been identified, hypertension is one of the most commonly recognized and treatable. Recent studies indicate that the prevalence of hypertension among children and adolescents is between 3% and 5%, much higher than originally estimated and likely rising due to the epidemic of obesity in the U.S. In 2004, the National High Blood Pressure Education Program Working Group on High Blood Pressure in Children and Adolescents published new guidelines for the diagnosis and treatment of hypertension in this population. Included in these recommendations was the creation of a new diagnosis, pre-hypertension, aimed at identifying children at risk for hypertension so that early lifestyle interventions can be provided in an effort to prevent its ultimate development. In order to determine the risk associated with pre-hypertension for the development of incident hypertension (HTN), a secondary analysis of a repeated cross-sectional study measuring blood pressure in Houston-area adolescents from 2000 to 2007 was performed. Of 1006 students who participated in the blood pressure screening on more than one occasion and were not diagnosed with hypertension at the initial encounter, eleven were later found to have hypertension, giving an overall incidence rate of 0.5% per year. Incidence rates were higher among overweight adolescents (1.9% per year; IRR 8.6 [1.97, 51.63]); among students "at risk for hypertension" (pre-hypertensive, or with an initial blood pressure in the hypertensive range that fell on subsequent measures; 1.4% per year; IRR 4.77 [1.21, 19.78]); and among those with blood pressure ≥90th percentile on three occasions (6.6% per year; IRR 21.87 [3.40, 112.40]). Students with pre-hypertension as currently defined by the Task Force did have an increased rate of hypertension (1.1% per year), but it did not reach statistical significance (IRR 2.44 [0.42, 10.18]). Further research is needed to determine the morbidity and mortality associated with pre-hypertension in this age group, as well as the effectiveness of various interventions for preventing the development of hypertensive disease among these at-risk individuals.
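For readers unfamiliar with the incidence rate ratios quoted above, the short sketch below shows the usual calculation: incident cases per person-year in each group, their ratio, and an approximate 95% confidence interval on the log scale. The case counts and person-years in the example are hypothetical, not the study's data.

```python
# Incidence rate ratio (IRR) with an approximate 95% CI, using a Poisson
# approximation for the variance of the log rate ratio. Example counts are made up.
import math

def irr(cases_exp, py_exp, cases_ref, py_ref):
    rate_exp, rate_ref = cases_exp / py_exp, cases_ref / py_ref
    ratio = rate_exp / rate_ref
    se_log = math.sqrt(1 / cases_exp + 1 / cases_ref)   # SE of log(IRR)
    lo = math.exp(math.log(ratio) - 1.96 * se_log)
    hi = math.exp(math.log(ratio) + 1.96 * se_log)
    return ratio, lo, hi

# hypothetical example: 6 incident cases in 320 person-years among overweight
# adolescents versus 5 cases in 2200 person-years among the remaining students
print(irr(6, 320, 5, 2200))
```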
Abstract:
In the United States, "binge" drinking among college students is an emerging public health concern because of its significant physical and psychological effects on young adults. The focus is on identifying interventions that can help decrease high-risk drinking behavior in this group of drinkers. One such intervention is motivational interviewing (MI), a client-centered therapy that aims at resolving client ambivalence by developing discrepancy and engaging the client in change talk. Of late, there is growing interest in determining the active ingredients that influence the alliance between the therapist and the client. This study is a secondary analysis of data obtained from the Southern Methodist Alcohol Research Trial (SMART) project, a dismantling trial of MI and feedback among heavy-drinking college students. The present project examines the relationship between therapist and client language in MI sessions in a sample of "binge" drinking college students. Of the 126 SMART tapes, 30 tapes ('MI with feedback' group = 15, 'MI only' group = 15) were randomly selected for this study. MISC 2.1, a mutually exclusive and exhaustive coding system, was used to code the audio/videotaped MI sessions. Therapist and client language were analyzed for communication characteristics. Overall, therapists adopted an MI-consistent style and clients were found to engage in change talk. Counselor acceptance, empathy, spirit, and complex reflections were all significantly related to client change talk (p-values ranged from 0.001 to 0.047). Additionally, therapist 'advice without permission' and MI-inconsistent therapist behaviors were strongly correlated with client sustain talk (p-values ranged from 0.006 to 0.048). Simple linear regression models showed a significant association between MI-consistent (MICO) therapist language (independent variable) and change talk (dependent variable), and between MI-inconsistent (MIIN) therapist language (independent variable) and sustain talk (dependent variable). The study has several limitations, including the small sample size, self-selection bias, poor inter-rater reliability for the global scales, and the lack of a temporal measure of therapist and client language. Future studies might consider a larger sample size to obtain more statistical power. In addition, the correlation between therapist language, client language, and drinking outcome needs to be explored.