10 results for Calculation tool in reliability
in DigitalCommons@The Texas Medical Center
Abstract:
Both TBL and PBL attempt to engage the learner as fully as possible, and both are designed to encourage interactive teaching and learning. PBL is student centered; TBL, in contrast, is typically instructor centered. The PBL Executive Committee of the UTHSC-Houston Medical School, in an attempt to capture the pedagogical advantages of both PBL and TBL, incorporated a unique PBL experience into the ICE/PBL course during the final block of PBL instruction in year 2. PBL cases provided the content knowledge for focused learning, and the subsequent, related TBL exercises fostered integration and critical thinking about each of these cases. [See PDF for complete abstract]
Abstract:
Introduction: Commercial treatment planning systems employ a variety of dose calculation algorithms to plan and predict the dose distributions a patient receives during external beam radiation therapy. Traditionally, the Radiological Physics Center (RPC) has relied on measurements to assure that institutions participating in National Cancer Institute-sponsored clinical trials administer radiation in doses that are clinically comparable to those of other participating institutions. To complement the effort of the RPC, an independent dose calculation tool needs to be developed that will enable a generic method to determine patient dose distributions in three dimensions and to perform retrospective analysis of radiation delivered to patients who enrolled in past clinical trials.
Methods: A multi-source model representing output for Varian 6 MV and 10 MV photon beams was developed and evaluated. A Monte Carlo algorithm, known as the Dose Planning Method (DPM), was used to perform the dose calculations. The dose calculations were compared to measurements made in a water phantom and in anthropomorphic phantoms. Intensity modulated radiation therapy and stereotactic body radiation therapy techniques were used with the anthropomorphic phantoms. Finally, past patient treatment plans were selected, recalculated using DPM, and contrasted against a commercial dose calculation algorithm.
Results: The multi-source model was validated for the Varian 6 MV and 10 MV photon beams. The benchmark evaluations demonstrated the ability of the model to accurately calculate dose for both source models, and the patient calculations showed that the model was reproducible in determining dose under conditions similar to those of the benchmark tests.
Conclusions: A dose calculation tool that relies on a multi-source model approach and uses the DPM code to calculate dose was developed, validated, and benchmarked for the Varian 6 MV and 10 MV photon beams. Several patient dose distributions were contrasted against a commercial algorithm as a proof of principle for its use in monitoring clinical trial activity.
Abstract:
Obesity has been on the rise in the United States over the last 30 years for all populations, including preschoolers. The purpose of this project was to develop an observation tool to measure physical activity levels in preschool children and to use the tool in a pilot test of the CATCH UP curriculum at two Head Start centers in Houston. Pretest and posttest interobserver agreements were all above 0.60 for physical activity level and physical activity type. Preschoolers spent the majority of their time in light physical activity (75.33% pretest, 87.77% posttest) and little time in moderate to vigorous physical activity (MVPA) (24.67% pretest, 12.23% posttest). Percent time spent in MVPA decreased significantly from pretest to posttest (F=5.738, p=0.043). While the pilot test of the CATCH UP curriculum did not show an increase in MVPA, the SOFIT-P tool showed promise as a new method for collecting physical activity level data for preschoolers. Once the new tool has undergone further reliability and validity testing, it could provide a more convenient method of collecting physical activity levels for preschoolers.
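The abstract does not state which agreement statistic was used for the 0.60 threshold; Cohen's kappa is one common choice for categorical codings such as activity level, since it discounts agreement expected by chance. A minimal sketch with hypothetical rater data (illustrative only):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical codings."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items coded identically
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: from each rater's marginal category frequencies
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical codings: L = light activity, M = moderate-to-vigorous
a = ["L", "L", "M", "L", "M", "L", "L", "M"]
b = ["L", "L", "M", "L", "L", "L", "L", "M"]
print(round(cohens_kappa(a, b), 3))  # → 0.714
```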
Abstract:
In the field of chemical carcinogenesis, the use of animal models has proved to be a useful tool in dissecting the multistage process of tumor formation. In this regard, the outbred SENCAR mouse has been the strain of choice in the analysis of skin carcinogenesis, given its high sensitivity to the chemically induced acquisition of premalignant lesions (papillomas) and the later progression of these lesions into squamous cell carcinomas (SCC). The derivation of an inbred strain from the SENCAR stock, called SSIN, which in spite of a high sensitivity to the development of papillomas lacks the ability to transform these premalignant lesions into SCC, suggested that tumor promotion and progression are under the genetic control of different sets of genes. In the present study, the nature of susceptibility to tumor progression was investigated. Analysis of F1 hybrids between the outbred SENCAR and SSIN mice suggested that there is at least one dominant gene responsible for susceptibility to tumor progression. The later development of another inbred strain from the outbred SENCAR stock, one sensitive to both tumor promotion and progression, allowed the formulation of a more accurate genetic model. Using this newly derived line, SENCAR B/Pt, and SSIN, it was determined that there is one dominant tumor progression susceptibility gene. Linkage analysis showed that this gene maps to mouse chromosome 14, and it was possible to narrow the region to a 16 cM interval. To better characterize the nature of the progression susceptibility differences between these two strains, their proliferative patterns were investigated. It was found that SENCAR B/Pt mice have an enlarged proliferative compartment with overexpression of cyclin D1, p16 and p21. Further studies showed an aberrant overexpression of TGF-β in the susceptible strain, an increase in apoptosis, p53 protein accumulation and early loss of connexin 26.
Taken together, these results suggest that papillomas in SENCAR B/Pt mice have higher proliferation and may have an increase in genomic instability; these two factors would contribute to a higher sensitivity to tumor progression.
Abstract:
Glutathione (GSH) is involved in the detoxication of numerous chemicals, whether exogenously encountered or endogenously generated. Exposure to these agents causes depletion of cellular GSH, rendering cells more susceptible to the toxic action of the same agents. Formaldehyde (CH2O) was found to deplete cellular GSH, presumably through formation of the GSH-CH2O complex, S-hydroxymethylglutathione, and its rapid extrusion into the extracellular medium. The metabolism and toxicity of CH2O were determined to be dependent upon cellular GSH in vitro and in vivo. The rate of CH2O oxidation decreased and the extent of toxicity increased when isolated rat hepatocytes or strain A/J mice were pretreated with the GSH-depleting agent diethyl maleate (DEM). Additional experiments were designed to further study the role GSH plays in detoxication, using isolated rat hepatocytes. L-Methionine protected against the extent of lipid peroxidation and the leakage of the cytosolic enzyme lactate dehydrogenase (LDH) caused by CH2O in DEM-pretreated hepatocytes, further supporting the protective role of GSH against cellular toxicity. The antioxidants ascorbate, butylated hydroxytoluene, and α-tocopherol were all protective against the extent of lipid peroxidation and leakage of LDH in isolated rat hepatocytes. Whereas L-methionine may be protective by increasing the cellular concentration of GSH, which is used to detoxify free radicals, or by facilitating the rate of CH2O oxidation, the antioxidant ascorbate was protective without altering the rate of CH2O oxidation or increasing cellular GSH levels. These results suggest that the free radical-mediated toxicity caused by CH2O in DEM-pretreated hepatocytes is due to the further depletion of GSH by CH2O and not to increased CH2O persistence.
How this further depletion of GSH by CH2O in DEM-pretreated hepatocytes results in lipid peroxidation and cell death was then investigated. The further decrease in GSH caused by CH2O in DEM-pretreated hepatocytes, suspected of stimulating lipid peroxidation and cell death, was found to be due not to depletion of mitochondrial GSH but to depletion of protein sulfhydryl groups. In addition, cellular toxicity appears more closely correlated with depletion of protein sulfhydryl groups than with an increase in cytosolic free Ca2+. The combination of CH2O and DEM may be a useful tool for identifying these critical sulfhydryl protein(s) and for further understanding the role GSH plays in detoxication.
Abstract:
Linkage disequilibrium (LD) is defined as the nonrandom association of alleles at two or more loci in a population and may be a useful tool in a diverse array of applications, including disease gene mapping, elucidating the demographic history of populations, and testing hypotheses of human evolution. However, the successful application of LD-based approaches to pertinent genetic questions is hampered by a lack of understanding of the forces that mediate the genome-wide distribution of LD within and between human populations. Delineating the genomic patterns of LD is a complex task that will require interdisciplinary research that transcends traditional scientific boundaries. The research presented in this dissertation is predicated upon the need for interdisciplinary studies, and both theoretical and experimental projects were pursued. In the theoretical studies, I investigated the effect of genotyping errors and SNP identification strategies on estimates of LD. The primary importance of these two chapters is that they provide important insights and guidance for the design of future empirical LD studies. Furthermore, I analyzed the allele frequency distribution of 26,530 single nucleotide polymorphisms (SNPs) in three populations and generated the first-generation natural selection map of the human genome, which will be an important resource for explaining and understanding genomic patterns of LD. Finally, in the experimental study, I describe a novel, simple, low-cost, and high-throughput SNP genotyping method. The theoretical analyses and experimental tools developed in this dissertation will facilitate a more complete understanding of patterns of LD in human populations.
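The pairwise LD measures underlying such studies reduce to simple formulas: D = p_AB − p_A·p_B, and r² = D² / (p_A(1−p_A)·p_B(1−p_B)). A minimal sketch assuming phased, biallelic haplotypes are directly observed (illustrative only; real LD studies, including this dissertation's, typically estimate haplotype frequencies from genotype data):

```python
def ld_stats(haplotypes):
    """D and r^2 for two biallelic loci, given two-locus haplotypes.

    Each haplotype is a pair (allele at locus 1, allele at locus 2),
    coded 0/1.
    """
    n = len(haplotypes)
    p_a = sum(h[0] for h in haplotypes) / n   # freq of allele 1 at locus 1
    p_b = sum(h[1] for h in haplotypes) / n   # freq of allele 1 at locus 2
    p_ab = sum(h[0] == 1 and h[1] == 1 for h in haplotypes) / n
    d = p_ab - p_a * p_b                      # deviation from random association
    r2 = d * d / (p_a * (1 - p_a) * p_b * (1 - p_b))
    return d, r2

# Perfect LD: allele 1 at locus 1 always travels with allele 1 at locus 2
haps = [(1, 1), (1, 1), (0, 0), (0, 0)]
d, r2 = ld_stats(haps)
print(d, r2)  # → 0.25 1.0
```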
Abstract:
Background: Cardiac risk assessment in cancer patients has not been extensively studied. We evaluated the role of stress myocardial perfusion imaging (MPI) in predicting cardiovascular outcomes in cancer patients undergoing non-cardiac surgery.
Methods: A retrospective chart review was performed on 507 patients who had an MPI from 01/2002 to 03/2003 and underwent non-cardiac surgery. Median follow-up duration was 1.5 years. A Cox proportional hazards model was used to determine the time to first event. End points included total cardiac events (cardiac death, myocardial infarction (MI) and coronary revascularization), cardiac death, and all-cause mortality.
Results: Of all 507 MPI studies, 146 (29%) were abnormal, and there were significant differences in risk factors between the normal and abnormal MPI groups. Mean age was 66±11 years, 60% of patients were male, and median follow-up duration was 1.8 years (25th quartile 0.8 years, 75th quartile 2.2 years). The majority of patients had an adenosine stress study (53%), with fewer exercise (28%) and dobutamine stress (16%) studies. In the total group there were 39 total cardiac events, 31 cardiac deaths, and 223 all-cause mortality events during the study. Univariate predictors of total cardiac events included CAD (p=0.005), previous MI (p=0.005), use of beta blockers (p=0.002), and not receiving chemotherapy (p=0.012). Similarly, the univariate predictors of cardiac death included previous MI (p=0.019) and use of beta blockers (p=0.003). In the multivariate model for total cardiac events, age at surgery (HR 1.04, p=0.030), use of beta blockers (HR 2.46, p=0.011), dobutamine MPI (HR 3.08, p=0.018) and low EF (HR 0.97, p=0.02) were significant predictors of worse outcomes. In the multivariate model for cardiac death, beta blocker use (HR 2.74, p=0.017) and low EF (HR 0.95, p<0.003) were predictors. The only univariate MPI predictor of total cardiac events was scar severity (p=0.005), while the MPI predictors of cardiac death were scar severity (p=0.001) and ischemia severity (p=0.02).
Conclusions: Stress MPI is a useful tool for predicting long-term outcomes in cancer patients undergoing surgery. Ejection fraction and severity of myocardial scar are important factors determining long-term outcomes in this group.
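The hazard ratios above come from a Cox proportional hazards model, in which HR = exp(β) for a one-unit increase in a covariate. A minimal sketch of that relationship and of a Wald-style confidence interval (the standard error used here is hypothetical; the abstract reports only HRs and p-values):

```python
import math

def hazard_ratio(beta):
    """Cox model: hazard ratio for a one-unit covariate increase."""
    return math.exp(beta)

def hr_confidence_interval(beta, se, z=1.96):
    """Wald 95% CI for the hazard ratio, given the coefficient's
    standard error (hypothetical in this example)."""
    return math.exp(beta - z * se), math.exp(beta + z * se)

# Recover the coefficient behind the abstract's beta-blocker HR of 2.46
beta = math.log(2.46)
print(round(hazard_ratio(beta), 2))  # → 2.46
lo, hi = hr_confidence_interval(beta, se=0.35)  # se chosen for illustration
print(round(lo, 2), round(hi, 2))
```

Because the interval is symmetric on the log scale, an HR is "significant" at the 5% level exactly when the interval excludes 1.0.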
Abstract:
Ascertaining the family health history (FHH) may provide insight into genetic and environmental susceptibilities specific to a variety of chronic diseases, including type II diabetes mellitus. However, discussion of FHH during patient-provider encounters has been limited and uncharacterized. A longitudinal, observational study was conducted to compare the content of FHH topics in a convenience sample of 37 patients, 13 new and 24 established. Each patient had an average of three follow-up encounters involving 6 staff physicians at the Audie L. Murphy Memorial Veterans Hospital (VHA) in San Antonio, TX from 2003 to 2005. A total of 131 encounters were analyzed in this study. The average age of the selected population was 68 years, and it included 35 males and two females. Transcriptions of encounters were obtained, coded, and analyzed in NVIVO 8. Of the 131 encounters transcribed among the 37 patients, only 24 (18.3%) included discussion of FHH. Additionally, the relationship between FHH discussion and discussion of self-care management (SCM) topics was assessed. In this study, providers were more likely to initiate discussion of family health history with new patients in the first encounter (OR for new patients = 8.55, 95% CI: 1.49–52.90). The discussion of FHH occurred sporadically in established patients throughout the longitudinal study with no apparent pattern. Provider-initiated FHH discussion most frequently reached a satisfactory level of discussion, while patient-initiated FHH discussion most frequently remained at a minimal level. FHH discussion most often involved topics of cancer and cardiovascular disease among first-degree familial relationships. Overall, family health histories remain a largely underutilized tool in personalized preventive care.
Abstract:
Natural disasters occur in various forms, such as hurricanes, tsunamis, earthquakes, and outbreaks. The most unsettling aspect of a natural disaster is that it can strike at any moment. Over the past decade, our society has experienced an alarming increase in natural disasters. How to expeditiously respond to and recover from natural disasters has become a pressing question for public health officials. To date, the most recent natural disaster was the January 12, 2010 earthquake in Haiti; however, the most memorable was Hurricane Katrina ("Haiti Earthquake", 2010). This study provides insight into the need to develop a National Disaster Response and Recovery Program that effectively responds to natural disasters. The specific aims of this paper were to (1) observe the government's role at the federal, state and local levels in assisting Hurricanes Katrina and Rita evacuees, (2) assess the prevalence of needs among Hurricanes Katrina and Rita families participating in the Disaster Housing Assistance Program (DHAP), and (3) describe the level of progress towards "self sufficiency" for the DHAP families receiving case management social services. Secondary data from a cross-sectional "Needs Assessment" questionnaire were analyzed. The questionnaire was administered initially and again six months later (follow-up) by H.A.U.L. case managers, and it collected data regarding participants' education, employment, transportation, child care, health resources, income, permanent housing and disability needs. Case managers determined the appropriate level of social services required for each family based on the data collected. Secondary data provided by the H.A.U.L. were analyzed to determine the prevalence of needs among the DHAP families.
In addition, differences between the initial and follow-up (six-month) questionnaires were analyzed to determine the statistical significance of the relationship between the case management services provided and the prevalence of needs among the DHAP families. The data analyzed describe the level of progress made by these families toward program "self sufficiency" (see Appendix A). Disaster assistance programs that first address basic human needs and then socioeconomic needs may offer an essential tool for helping disaster-affected communities recover quickly from natural disasters.
Abstract:
Next-generation sequencing (NGS) technology has become a prominent tool in biological and biomedical research. However, NGS data analysis, such as de novo assembly, mapping and variant detection, is far from mature, and the high sequencing error rate is one of the major problems. To minimize the impact of sequencing errors, we developed a highly robust and efficient method, MTM, to correct the errors in NGS reads. We demonstrated the effectiveness of MTM on both single-cell data with highly non-uniform coverage and normal data with uniformly high coverage, showing that MTM's performance does not rely on the coverage of the sequencing reads. MTM was also compared with Hammer and Quake, the best methods for correcting non-uniform and uniform data, respectively. For non-uniform data, MTM outperformed both Hammer and Quake. For uniform data, MTM showed better performance than Quake and comparable results to Hammer. By making better error corrections with MTM, the quality of downstream analysis, such as mapping and SNP detection, was improved. SNP calling is a major application of NGS technologies. However, the existence of sequencing errors complicates this process, especially for low coverage… [See PDF for complete abstract]
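MTM's algorithm is not detailed in the abstract, but k-mer-spectrum correctors such as Quake and Hammer share a core idea: k-mers observed rarely across the read set are probably errors, and a single-base substitution that raises a read's total k-mer support is probably a correction. A minimal, generic sketch of that idea on toy data (not MTM's actual method):

```python
from collections import Counter

def kmer_counts(reads, k):
    """Count every k-mer occurring across all reads."""
    counts = Counter()
    for read in reads:
        for i in range(len(read) - k + 1):
            counts[read[i:i + k]] += 1
    return counts

def read_support(read, counts, k):
    """Total multiplicity of the k-mers covering the read."""
    return sum(counts[read[i:i + k]] for i in range(len(read) - k + 1))

def correct_read(read, counts, k):
    """Greedily substitute single bases whenever doing so increases
    the read's total k-mer support. Illustrative only."""
    best = read
    for i in range(len(read)):
        for base in "ACGT":
            candidate = best[:i] + base + best[i + 1:]
            if read_support(candidate, counts, k) > read_support(best, counts, k):
                best = candidate
    return best

# Toy data: four error-free copies of the true sequence plus one bad read
true_seq = "ACGTACGTAC"
reads = [true_seq] * 4 + ["ACGTACCTAC"]   # sequencing error at position 6
counts = kmer_counts(reads, k=4)
print(correct_read("ACGTACCTAC", counts, k=4))  # → ACGTACGTAC
```

Production correctors add much more (quality scores, solid-k-mer thresholds, handling of repeats and non-uniform coverage), which is precisely where methods like MTM differ.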