24 results for analysis tool
at DigitalCommons@The Texas Medical Center
Abstract:
Proton therapy is growing increasingly popular due to its superior dose characteristics compared with conventional photon therapy. Protons travel a finite range in the patient's body and stop, thereby delivering no dose beyond their range. However, because the range of a proton beam depends heavily on the tissue density along its path, uncertainties in patient setup position and in range calculation can significantly degrade the dose distribution. Despite these challenges, which are unique to proton therapy, uncertainties are currently managed during proton treatment planning much as they are in conventional photon therapy. The goal of this dissertation research was to develop a treatment planning method and a plan evaluation method that address proton-specific setup and range uncertainties. Treatment plan design method adapted to proton therapy: Currently, for proton therapy delivered with a scanning beam system, setup uncertainties are largely accounted for by geometrically expanding the clinical target volume (CTV) to a planning target volume (PTV). However, a PTV alone cannot adequately account for range uncertainties coupled to misaligned patient anatomy in the beam path, since it does not account for changes in tissue density. To remedy this problem, we proposed a beam-specific PTV (bsPTV) that accounts for the change in tissue density along the beam path due to these uncertainties. Our proposed method was successfully implemented, and its superiority over the conventional PTV was shown in a controlled experiment. Furthermore, we showed that the bsPTV concept can be incorporated into beam angle optimization for better target coverage and normal tissue sparing for a selected lung cancer patient.
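The density-coupling problem the bsPTV targets can be illustrated with a toy water-equivalent path length (WEPL) calculation: a geometric shift that slides denser tissue into the beam path changes the proton range even though the geometric distance to the target is unchanged. All densities and geometry below are invented for illustration, not taken from the study.

```python
# Toy sketch of why a geometric margin alone cannot capture proton range
# error: the water-equivalent path length (WEPL) to a target voxel changes
# when upstream tissue densities shift. All numbers are illustrative.

def wepl(densities, step_cm=0.1):
    """Water-equivalent path length: sum of (relative density * step)."""
    return sum(rho * step_cm for rho in densities)

# Nominal beam path: 30 voxels of lung-like tissue (rho ~ 0.3).
nominal = [0.3] * 30
# Misaligned setup: 5 voxels of soft tissue (rho ~ 1.0) slide into the path.
shifted = [1.0] * 5 + [0.3] * 25

range_shift = wepl(shifted) - wepl(nominal)  # extra water-equivalent depth
print(round(wepl(nominal), 2), round(wepl(shifted), 2), round(range_shift, 2))
```

The same geometric distance yields a markedly different water-equivalent depth, which is the kind of change a density-aware beam-specific margin is meant to absorb.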
Treatment plan evaluation method adapted to proton therapy: The dose-volume histogram of the clinical target volume (CTV), or of any other volume of interest, at the time of planning does not represent the most probable dosimetric outcome of a given plan, as it does not include the uncertainties mentioned above. Currently, the PTV is used as a surrogate for the CTV's worst-case scenario in target dose estimation. However, because proton dose distributions are subject to change under these uncertainties, the validity of the PTV analysis method is questionable. To remedy this problem, we proposed using statistical parameters to quantify uncertainties directly on both the dose-volume histogram and the dose distribution. The robust plan analysis tool was successfully implemented to compute both the expectation value and the standard deviation of a treatment plan's dosimetric parameters under the uncertainties. For 15 lung cancer patients, the proposed method was used to quantify the dosimetric difference between the nominal situation and its expected value under the uncertainties.
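The statistical evaluation described above can be sketched by sampling error scenarios and summarizing a dosimetric parameter across them. The dose surrogate function and error magnitudes below are invented stand-ins, not the study's dose engine.

```python
import random
import statistics

# Hedged sketch of robust plan evaluation: sample setup/range error
# scenarios, recompute a dosimetric parameter (here a toy stand-in
# function), and report its expectation and standard deviation.

def target_d95(setup_shift_mm, range_error_pct):
    """Toy surrogate: a nominal D95 of 60 Gy degraded by the two errors."""
    return 60.0 - 0.8 * abs(setup_shift_mm) - 0.5 * abs(range_error_pct)

random.seed(0)
scenarios = [
    target_d95(random.gauss(0, 3), random.gauss(0, 2))  # mm and % errors
    for _ in range(1000)
]

expected = statistics.mean(scenarios)
spread = statistics.stdev(scenarios)
nominal = target_d95(0, 0)
print(f"nominal {nominal:.1f} Gy, expected {expected:.1f} +/- {spread:.1f} Gy")
```

The gap between the nominal value and the expectation, together with the standard deviation, is exactly the kind of summary the abstract reports for the 15 lung cancer plans.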
Abstract:
The objectives of this study were to identify and measure the average outcomes of the Open Door Mission's nine-month community-based substance abuse treatment program, identify predictors of successful outcomes, and make recommendations to the Open Door Mission for improving its treatment program. The Mission's program is exclusive to adult men with limited financial resources, most of whom were homeless or dependent on parents or other family members for basic living needs. Many, but not all, of these men are either chemically dependent or have a history of substance abuse. This study tracked a cohort of the Mission's graduates throughout this one-year study and identified various indicators of success at short-term intervals, which may be predictive of longer-term outcomes. We tracked various levels of 12-step program involvement, as well as other social and spiritual activities, such as church affiliation and recovery support. Twenty-four of the 66 subjects (36%) met the Mission's requirements for success. With respect to the individual success criteria: 54 (82%) reported affiliation with a home church; 26 (39%) reported full-time employment; 61 (92%) did not report and were not identified as having any post-treatment arrests or incarceration; and 40 (61%) reported continuous abstinence from both drugs and alcohol. Five research-based hypotheses were developed and tested. The primary analysis tool was the web-based non-parametric dependency modeling tool B-Course, which revealed strong associations among certain variables and helped the researchers generate and test several data-driven hypotheses. Full-time employment was the strongest predictor of abstinence: 95% of those who reported full-time employment also reported continuous post-treatment abstinence, compared with 50% of those working part-time and 29% of those with no employment.
Working with a 12-step sponsor, attending aftercare, and service with others were also identified as predictors of abstinence. This study demonstrates that associations with abstinence and the ODM success criteria are not based on any single social or behavioral factor. Rather, these relationships are interdependent and show that abstinence is achieved and maintained through a combination of several 12-step recovery activities. The study used a simple assessment methodology that demonstrated strong associations across variables and outcomes, with practical applicability to the Open Door Mission for improving its treatment program. By leveraging the predictive capability of the success determination methodologies developed throughout this study, accurate outcomes can be identified with both validity and reliability. The assessment instrument can also be used as an intervention that, if administered to the Mission's clients during the primary treatment program, may measurably improve the effectiveness and outcomes of the Open Door Mission.
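The employment-abstinence association reported above rests on a simple cross-tabulation of outcomes by group. The sketch below shows that kind of conditional-rate computation; the records are fabricated illustrations, not the study data.

```python
# Sketch of the cross-tabulation behind an association like the reported
# employment/abstinence link. The records below are fabricated examples.

records = [
    {"employment": "full-time", "abstinent": True},
    {"employment": "full-time", "abstinent": True},
    {"employment": "part-time", "abstinent": True},
    {"employment": "part-time", "abstinent": False},
    {"employment": "none", "abstinent": False},
    {"employment": "none", "abstinent": False},
    {"employment": "none", "abstinent": True},
]

def abstinence_rate(rows, employment):
    """Fraction abstinent among subjects with the given employment level."""
    group = [r for r in rows if r["employment"] == employment]
    return sum(r["abstinent"] for r in group) / len(group)

for level in ("full-time", "part-time", "none"):
    print(level, round(abstinence_rate(records, level), 2))
```

Dependency-modeling tools such as B-Course go further by searching for the network of such conditional dependencies, but each edge ultimately summarizes tables like this one.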
Abstract:
Accurate quantitative estimation of exposure from retrospective data has been one of the most challenging tasks in the exposure assessment field. To improve these estimates, models have been developed using published exposure databases and their corresponding exposure determinants. These models are designed to be applied to reported exposure determinants obtained from study subjects, or to exposure levels assigned by an industrial hygienist, so that quantitative exposure estimates can be obtained. In an effort to improve the prediction accuracy and generalizability of these models, and considering that the limitations encountered in previous studies might stem from limitations of traditional statistical methods and concepts, this study proposed and explored the use of data analysis methods derived from computer science, predominantly machine learning approaches. The goal was to develop a set of models using decision tree/ensemble and neural network methods to predict occupational outcomes from literature-derived databases, and to compare, using cross-validation and data splitting techniques, the resulting prediction capacity with that of traditional regression models. Two cases were addressed: the categorical case, where the exposure level was expressed as an exposure rating following the American Industrial Hygiene Association guidelines, and the continuous case, where the exposure was expressed as a concentration value. Previously developed literature-based exposure databases for 1,1,1-trichloroethane, methylene dichloride, and trichloroethylene were used. When compared with regression estimates, the results showed better accuracy for decision tree/ensemble techniques in the categorical case, while neural networks were better for estimating continuous exposure values. Overrepresentation of classes and overfitting were the main causes of poor neural network performance and accuracy.
Estimates based on literature-based databases using machine learning techniques might provide an advantage when applied alongside methodologies that combine expert inputs with current exposure measurements, such as the Bayesian Decision Analysis tool. The use of machine learning techniques to estimate exposures from literature-based exposure databases more accurately might represent a starting point toward independence from expert judgment.
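The evaluation strategy described above, comparing learners by cross-validated prediction error, can be sketched in a few lines. The "models" below are toy stand-ins (a group-mean baseline and a one-nearest-neighbour learner) on synthetic data, not the study's decision-tree or neural-network pipelines.

```python
import random

# Minimal sketch of k-fold cross-validation used to compare a flexible
# learner against a simpler baseline on synthetic exposure data.

random.seed(1)
# Synthetic records: (determinant x, exposure concentration y).
data = [(x, 2.0 * x + random.gauss(0, 0.5))
        for x in [random.uniform(0, 10) for _ in range(100)]]

def mean_model(train):
    m = sum(y for _, y in train) / len(train)
    return lambda x: m                       # ignores the determinant

def nearest_neighbour(train):
    return lambda x: min(train, key=lambda p: abs(p[0] - x))[1]

def cv_mse(model_factory, rows, k=5):
    """Mean squared error averaged over k held-out folds."""
    folds = [rows[i::k] for i in range(k)]
    errs = []
    for i in range(k):
        test = folds[i]
        train = [r for j, f in enumerate(folds) if j != i for r in f]
        predict = model_factory(train)
        errs += [(predict(x) - y) ** 2 for x, y in test]
    return sum(errs) / len(errs)

print("baseline MSE:", round(cv_mse(mean_model, data), 2))
print("1-NN MSE:", round(cv_mse(nearest_neighbour, data), 2))
```

Because every error is measured on data the model never saw, this comparison also exposes overfitting, the failure mode the abstract identifies for the neural networks.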
Abstract:
OBJECTIVE: To characterize PubMed usage over a typical day and compare it to previous studies of user behavior on Web search engines. DESIGN: We performed a lexical and semantic analysis of 2,689,166 queries issued on PubMed over 24 consecutive hours on a typical day. MEASUREMENTS: We measured the number of queries, number of distinct users, queries per user, terms per query, common terms, Boolean operator use, common phrases, result set size, and MeSH categories; we used semantic measures to group queries into sessions and studied the addition and removal of terms in consecutive queries to gauge search strategies. RESULTS: The size of the result sets from a sample of queries showed a bimodal distribution, with peaks at approximately 3 and 100 results, suggesting that a large group of queries was tightly focused and another was broad. Like Web search engine sessions, most PubMed sessions consisted of a single query. However, PubMed queries contained more terms. CONCLUSION: PubMed's usage profile should be considered when educating users, building user interfaces, and developing future biomedical information retrieval systems.
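Grouping a query log into sessions is a standard step in this kind of analysis. The study used semantic measures; the sketch below shows the simpler time-gap baseline commonly used in search-log work, with invented timestamps.

```python
# Sketch of grouping one user's queries into sessions by time gap:
# a new session starts whenever consecutive queries are separated by
# more than `gap` seconds. Timestamps below are invented.

def sessionize(timestamps, gap=1800):
    """Split a sorted list of query times into sessions at gaps > `gap`."""
    sessions = [[timestamps[0]]]
    for prev, cur in zip(timestamps, timestamps[1:]):
        if cur - prev > gap:
            sessions.append([cur])
        else:
            sessions[-1].append(cur)
    return sessions

times = [0, 40, 95, 4000, 4030, 90000]        # one user's query times (s)
result = sessionize(times)
print(len(result), [len(s) for s in result])  # 3 sessions of sizes 3, 2, 1
```

Semantic sessionization replaces the fixed time cutoff with a measure of topical similarity between consecutive queries, but the segmentation logic is the same.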
Abstract:
Recent studies indicate that polymorphic genetic markers are potentially helpful in resolving genealogical relationships among individuals in a natural population. Genetic data provide opportunities for paternity exclusion when genotypic incompatibilities are observed among individuals, and the present investigation examines the resolving power of genetic markers in unambiguous positive determination of paternity. Under the assumption that the mother of each offspring in a population is unambiguously known, an analytical expression for the fraction of males excluded from paternity is derived for the case where males and females may be drawn from two different gene pools. This theoretical formulation can also be used to predict the fraction of births for each of which all but one male can be excluded from paternity. We show that even when the average probability of exclusion approaches unity, a substantial fraction of births yield equivocal mother-father-offspring determinations. The number of loci needed to raise the frequency of unambiguous determinations to a high level is beyond the scope of current electrophoretic studies in most species. Application of this theory to electrophoretic data on Chamaelirium luteum (L.) shows that in 2255 offspring derived from 273 males and 70 females, only 57 triplets could be unequivocally determined with eight polymorphic protein loci, even though the average combined exclusionary power of these loci was 73%. The distribution of potentially compatible male parents, based on multilocus genotypes, was reasonably well predicted from the allele frequency data available for these loci. We demonstrate that genetic paternity analysis in natural populations cannot be reliably based on exclusionary principles alone. To measure the reproductive contributions of individuals in natural populations, more elaborate likelihood principles must be employed.
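The "combined exclusionary power" quoted above follows from treating loci as independent: the probability that at least one locus excludes a random non-father is one minus the product of the per-locus non-exclusion probabilities. The per-locus values below are illustrative, not the C. luteum data.

```python
# Combined probability of exclusion across independent loci:
# P_combined = 1 - prod(1 - P_i). Per-locus values are invented.

def combined_exclusion(per_locus):
    p_no_exclusion = 1.0
    for p in per_locus:
        p_no_exclusion *= (1.0 - p)
    return 1.0 - p_no_exclusion

loci = [0.15, 0.20, 0.10, 0.18, 0.12, 0.16, 0.08, 0.14]  # eight toy loci
print(round(combined_exclusion(loci), 2))
```

Note how eight individually weak loci combine to roughly 70% exclusionary power, yet, as the abstract stresses, even such a high combined value still leaves many births with several compatible candidate fathers.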
Abstract:
When evaluated for promotion or tenure, faculty members are increasingly judged more on the quality than on the quantity of their scholarly publications. As a result, they want help from librarians in locating all citations to their published works for documentation in their curriculum vitae. Citation analysis using Science Citation Index and Social Science Citation Index provides a logical starting point in measuring quality, but the limitations of these sources leave a void in coverage of citations to an author's work. This article discusses alternative and additional methods of locating citations to published works.
Abstract:
Withdrawal reflexes of the mollusk Aplysia exhibit sensitization, a simple form of long-term memory (LTM). Sensitization is due, in part, to long-term facilitation (LTF) of sensorimotor neuron synapses. LTF is induced by the modulatory actions of serotonin (5-HT). Pettigrew et al. developed a computational model of the nonlinear intracellular signaling and gene network that underlies the induction of 5-HT-induced LTF. The model simulated empirical observations that repeated applications of 5-HT induce persistent activation of protein kinase A (PKA) and that this persistent activation requires a suprathreshold exposure to 5-HT. This study extends the analysis of the Pettigrew model by applying bifurcation analysis, singularity theory, and numerical simulation. Using singularity theory, classification diagrams of parameter space were constructed, identifying regions with qualitatively different steady-state behaviors. The graphical representation of these regions illustrates their robustness to changes in model parameters. Because persistent PKA activity correlates with Aplysia LTM, the analysis focuses on a positive feedback loop in the model that tends to maintain PKA activity. In this loop, PKA phosphorylates a transcription factor (TF-1), thereby increasing the expression of a ubiquitin hydrolase (Ap-Uch). Ap-Uch then acts to increase PKA activity, closing the loop. This positive feedback loop manifests multiple, coexisting steady states, or multiplicity, which provides a mechanism for a bistable switch in PKA activity. After the removal of 5-HT, the PKA activity either returns to its basal level (reversible switch) or remains at a high level (irreversible switch). Such an irreversible switch might be a mechanism that contributes to the persistence of LTM. The classification diagrams also identify parameters and processes that might be manipulated, perhaps pharmacologically, to enhance the induction of memory.
Rational design of drugs that affect complex processes such as memory formation can benefit from this type of analysis.
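The switch behavior described above can be reproduced with a generic one-variable positive-feedback model (this is not the Pettigrew model itself; the equation and parameters are invented for illustration). Activity feeds back on its own production through a Hill term, creating two stable states; a brief stimulus relaxes back to the low state, while a prolonged, suprathreshold stimulus latches the system into the high state even after the stimulus is removed.

```python
# Toy bistable positive-feedback switch, in the spirit of the PKA/Ap-Uch
# loop: dx/dt = s(t) + 2*x^2/(1 + x^2) - 0.8*x has stable states at
# x = 0 and x = 2, separated by an unstable threshold at x = 0.5.

def simulate(stim_duration, t_end=40.0, dt=0.01):
    """Euler-integrate the switch; s = 0.5 while the stimulus is on."""
    x, t = 0.0, 0.0
    while t < t_end:
        s = 0.5 if t < stim_duration else 0.0    # transient "5-HT" input
        dxdt = s + 2.0 * x * x / (1.0 + x * x) - 0.8 * x
        x += dxdt * dt
        t += dt
    return x

print("brief stimulus ->", round(simulate(0.3), 2))      # decays to low state
print("prolonged stimulus ->", round(simulate(5.0), 2))  # latches high
```

The persistence after stimulus removal is the irreversible-switch behavior the abstract links to long-term memory; bifurcation analysis maps which parameter values permit it.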
Abstract:
This paper introduces an extended hierarchical task analysis (HTA) methodology devised to evaluate and compare user interfaces on volumetric infusion pumps. The pumps were studied along the dimensions of overall usability and propensity for generating human error. With HTA as our framework, we analyzed six pumps on a variety of common tasks using Norman’s Action theory. The introduced method of evaluation divides the problem space between the external world of the device interface and the user’s internal cognitive world, allowing for predictions of potential user errors at the human-device level. In this paper, one detailed analysis is provided as an example, comparing two different pumps on two separate tasks. The results demonstrate the inherent variation, often the cause of usage errors, found with infusion pumps being used in hospitals today. The reported methodology is a useful tool for evaluating human performance and predicting potential user errors with infusion pumps and other simple medical devices.
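The core HTA data structure is a task tree whose leaves are elementary actions; the external/internal split described above can be expressed by tagging each leaf with the world it belongs to. The pump task and its subtasks below are invented examples, not taken from the paper's analyses.

```python
from dataclasses import dataclass, field

# Sketch of a hierarchical task analysis (HTA) tree: tasks decompose
# into subtasks, and leaf actions are tagged as belonging to the device
# interface ("external") or the user's cognition ("internal").

@dataclass
class Task:
    name: str
    world: str = "external"            # "external" (device) or "internal"
    subtasks: list = field(default_factory=list)

    def leaves(self):
        if not self.subtasks:
            return [self]
        return [leaf for t in self.subtasks for leaf in t.leaves()]

program_rate = Task("Program infusion rate", subtasks=[
    Task("Recall prescribed rate", world="internal"),
    Task("Navigate to rate screen"),
    Task("Enter value", subtasks=[
        Task("Map intention to keypad", world="internal"),
        Task("Press digit keys"),
    ]),
    Task("Confirm entry"),
])

leaves = program_rate.leaves()
internal = [t.name for t in leaves if t.world == "internal"]
print(len(leaves), internal)
```

Comparing two pumps then amounts to building one such tree per pump for the same clinical task and contrasting the number and kind of steps, with the internal leaves marking where errors at the human-device boundary can arise.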
Abstract:
OBJECTIVES: To determine the prevalence of false or misleading statements in messages posted by internet cancer support groups and whether these statements were identified as false or misleading and corrected by other participants in subsequent postings. DESIGN: Analysis of content of postings. SETTING: Internet cancer support group Breast Cancer Mailing List. MAIN OUTCOME MEASURES: Number of false or misleading statements posted from 1 January to 23 April 2005 and whether these were identified and corrected by participants in subsequent postings. RESULTS: 10 of 4600 postings (0.22%) were found to be false or misleading. Of these, seven were identified as false or misleading by other participants and corrected within an average of four hours and 33 minutes (maximum, nine hours and nine minutes). CONCLUSIONS: Most posted information on breast cancer was accurate. Most false or misleading statements were rapidly corrected by participants in subsequent postings.
Abstract:
Introduction: Commercial treatment planning systems employ a variety of dose calculation algorithms to plan and predict the dose distributions a patient receives during external beam radiation therapy. Traditionally, the Radiological Physics Center (RPC) has relied on measurements to assure that institutions participating in National Cancer Institute-sponsored clinical trials administer radiation in doses that are clinically comparable to those of other participating institutions. To complement the effort of the RPC, an independent dose calculation tool needs to be developed that enables a generic method of determining patient dose distributions in three dimensions and permits retrospective analysis of the radiation delivered to patients enrolled in past clinical trials. Methods: A multi-source model representing output for Varian 6 MV and 10 MV photon beams was developed and evaluated. A Monte Carlo algorithm, known as the Dose Planning Method (DPM), was used to perform the dose calculations. The calculations were compared with measurements made in a water phantom and in anthropomorphic phantoms. Intensity-modulated radiation therapy and stereotactic body radiation therapy techniques were used with the anthropomorphic phantoms. Finally, past patient treatment plans were selected, recalculated using DPM, and contrasted against a commercial dose calculation algorithm. Results: The multi-source model was validated for the Varian 6 MV and 10 MV photon beams. The benchmark evaluations demonstrated the model's ability to calculate dose accurately for both source models. The patient calculations showed that the model was reproducible in determining dose under conditions similar to those of the benchmark tests. Conclusions: A dose calculation tool based on a multi-source model and the DPM code was developed, validated, and benchmarked for the Varian 6 MV and 10 MV photon beams.
Several patient dose distributions were contrasted against a commercial algorithm to provide a proof of principle for use in monitoring clinical trial activity.
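The statistical flavour of a Monte Carlo dose engine such as DPM can be conveyed with a heavily simplified 1-D sketch: photon "histories" deposit energy at exponentially distributed interaction depths in a water phantom. This models none of the real transport physics; the attenuation coefficient and geometry are invented.

```python
import random

# Toy 1-D Monte Carlo depth-dose sketch: each history deposits its
# energy at an exponentially distributed first-interaction depth.

random.seed(42)

def score_depth_dose(n_histories, mu=0.05, depth_cm=30, bin_cm=1.0):
    """mu: toy attenuation coefficient (1/cm); returns dose per depth bin,
    normalised to the maximum bin."""
    bins = [0.0] * int(depth_cm / bin_cm)
    for _ in range(n_histories):
        depth = random.expovariate(mu)       # sampled interaction depth
        if depth < depth_cm:
            bins[int(depth / bin_cm)] += 1.0
    peak = max(bins)
    return [b / peak for b in bins]

pdd = score_depth_dose(100_000)
print(round(pdd[0], 2), round(pdd[10], 2))   # dose falls off with depth
```

A real engine tracks secondary particles, scatter, and heterogeneous densities, but the same principle applies: accuracy is statistical and improves with the number of histories.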
Abstract:
BACKGROUND: The Enterococcus faecium genogroup referred to as clonal complex 17 (CC17) seems to possess multiple determinants that increase its ability to survive and cause disease in nosocomial environments. METHODS: Using 53 clinical and geographically diverse US E. faecium isolates dating from 1971 to 1994, we determined the multilocus sequence type; the presence of 16 putative virulence genes (hyl(Efm), esp(Efm), and fms genes); resistance to ampicillin (AMP) and vancomycin (VAN); and high-level resistance to gentamicin and streptomycin. RESULTS: Overall, 16 different sequence types (STs), mostly CC17 isolates, were identified in 9 different regions of the United States. The earliest CC17 isolates were part of an outbreak that occurred in 1982 in Richmond, Virginia. The characteristics of CC17 isolates included increased resistance to AMP, the presence of hyl(Efm) and esp(Efm), emergence of resistance to VAN, and the presence of at least 13 of 14 fms genes. Eight of the 41 early AMP-resistant isolates, however, were not in CC17. CONCLUSIONS: Although not all early US AMP-resistant isolates were clonally related, E. faecium CC17 isolates have been circulating in the United States since at least 1982 and appear to have progressively acquired additional virulence and antibiotic resistance determinants, perhaps explaining the recent success of this species in the hospital environment.
Abstract:
BACKGROUND: It is well recognized that colorectal cancer does not frequently metastasize to bone. The aim of this retrospective study was to establish whether colorectal cancer ever bypasses other organs and metastasizes directly to bone, and whether the presence of lung lesions is superior to liver lesions as a predictor of the likelihood and timing of bone metastasis. METHODS: We performed a retrospective analysis of patients with a clinical diagnosis of colon cancer referred for staging using whole-body 18F-FDG PET and CT or PET/CT. We combined PET and CT reports from 252 individuals with information concerning patient history, other imaging modalities, and treatments to analyze disease progression. RESULTS: No patient had isolated osseous metastasis at the time of diagnosis, and none developed isolated bone metastasis without other organ involvement during our survey period. It took significantly longer for colorectal cancer patients to develop metastasis to the lungs (23.3 months) or to bone (21.2 months) than to the liver (9.8 months). CONCLUSION: Metastasis only to bone without other organ involvement in colorectal cancer patients is extremely rare, perhaps rarer than previously thought. Our findings suggest that metastasis to the lungs predicts potential disease progression to bone in the colorectal cancer population better than liver metastasis does.
Abstract:
BACKGROUND: Enterococcus faecalis has emerged as a major hospital pathogen. To explore its diversity, we sequenced E. faecalis strain OG1RF, which is commonly used for molecular manipulation and virulence studies. RESULTS: The 2,739,625 base pair chromosome of OG1RF was found to contain approximately 232 kilobases unique to this strain compared to V583, the only publicly available sequenced strain. Almost no mobile genetic elements were found in OG1RF. The 64 areas of divergence were classified into three categories. First, OG1RF carries 39 unique regions, including 2 CRISPR loci and a new WxL locus. Second, we found nine replacements where a sequence specific to V583 was substituted by a sequence specific to OG1RF. For example, the iol operon of OG1RF replaces a possible prophage and the vanB transposon in V583. Finally, we found 16 regions that were present in V583 but missing from OG1RF, including the proposed pathogenicity island, several probable prophages, and the cpsCDEFGHIJK capsular polysaccharide operon. OG1RF was more rapidly but less frequently lethal than V583 in the mouse peritonitis model and considerably outcompeted V583 in a murine model of urinary tract infections. CONCLUSION: E. faecalis OG1RF carries a number of unique loci compared to V583, but the almost complete lack of mobile genetic elements demonstrates that this is not a defining feature of the species. Additionally, OG1RF's effects in experimental models suggest that mediators of virulence may be diverse between different E. faecalis strains and that virulence is not dependent on the presence of mobile genetic elements.
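The strain-comparison bookkeeping described above (regions unique to OG1RF, unique to V583, or shared) is essentially a set comparison over locus content. The gene names below are illustrative stand-ins drawn loosely from the abstract, not the actual annotation.

```python
# Sketch of classifying loci as unique to one strain, unique to the
# other, or shared. Names are illustrative stand-ins.

og1rf = {"crispr1", "crispr2", "wxl_locus", "iol_operon", "core_A", "core_B"}
v583 = {"pathogenicity_island", "prophage_1", "cps_operon", "vanB_tn",
        "core_A", "core_B"}

unique_to_og1rf = sorted(og1rf - v583)   # present only in OG1RF
unique_to_v583 = sorted(v583 - og1rf)    # present only in V583
shared = sorted(og1rf & v583)            # conserved between the strains

print(len(unique_to_og1rf), len(unique_to_v583), len(shared))
```

In practice each "locus" is a sequence region identified by whole-genome alignment rather than a name, but the three-way classification in the abstract reduces to exactly this partition.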
Abstract:
Attention has recently been drawn to Enterococcus faecium because of an increasing number of nosocomial infections caused by this species and its resistance to multiple antibacterial agents. However, relatively little is known about the pathogenic determinants of this organism. We have previously identified a cell-wall-anchored collagen adhesin, Acm, produced by some isolates of E. faecium, and a secreted antigen, SagA, exhibiting broad-spectrum binding to extracellular matrix proteins. Here, we analysed the draft genome of strain TX0016 for potential microbial surface components recognizing adhesive matrix molecules (MSCRAMMs). Genome-based bioinformatics identified 22 predicted cell-wall-anchored E. faecium surface proteins (Fms), of which 15 (including Acm) had characteristics typical of MSCRAMMs, including predicted folding into a modular architecture with multiple immunoglobulin-like domains. Functional characterization of one [Fms10; redesignated second collagen adhesin of E. faecium (Scm)] revealed that recombinant Scm(65) (A- and B-domains) and Scm(36) (A-domain) bound to collagen type V efficiently in a concentration-dependent manner, bound considerably less to collagen type I and fibrinogen, and differed from Acm in their binding specificities to collagen types IV and V. Results from far-UV circular dichroism measurements of recombinant Scm(36) and of Acm(37) indicated that these proteins were rich in beta-sheets, supporting our folding predictions. Whole-cell ELISA and FACS analyses unambiguously demonstrated surface expression of Scm in most E. faecium isolates. Strikingly, 11 of the 15 predicted MSCRAMMs clustered in four loci, each with a class C sortase gene; nine of these showed similarity to Enterococcus faecalis Ebp pilus subunits and also contained motifs essential for pilus assembly. 
Antibodies against one of the predicted major pilus proteins, Fms9 (redesignated EbpC(fm)), detected a 'ladder' pattern of high-molecular-mass protein bands in a Western blot analysis of cell surface extracts from E. faecium, suggesting that EbpC(fm) is polymerized into a pilus structure. Further analysis of the transcripts of the corresponding gene cluster indicated that fms1 (ebpA(fm)), fms5 (ebpB(fm)) and ebpC(fm) are co-transcribed, a result consistent with those for pilus-encoding gene clusters of other Gram-positive bacteria. All 15 genes occurred frequently in 30 clinically derived diverse E. faecium isolates tested. The common occurrence of MSCRAMM- and pilus-encoding genes and the presence of a second collagen-binding protein may have important implications for our understanding of this emerging pathogen.
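One step of a genome-based screen for cell-wall-anchored surface proteins like those described above is scanning predicted protein sequences for a C-terminal LPXTG-like sortase anchor motif. The sketch below shows that scan with a regular expression; the sequences are short fabricated examples, not E. faecium proteins.

```python
import re

# Sketch of screening predicted proteins for a C-terminal LPXTG-like
# cell-wall anchor motif, the signature of sortase-attached surface
# proteins. Sequences are fabricated examples.

ANCHOR = re.compile(r"LP.TG")   # the X in LPXTG matches any residue

proteins = {
    "candidate_1": "MKKLAILSAVA" + "LPKTG" + "EESSNKGMLGGLFK",
    "candidate_2": "MKNNKRVLAASALVLGLGAVAPVQA",          # no anchor motif
    "candidate_3": "MNKTKLLLAGA" + "LPQTG" + "SHDNNKALGLLS",
}

anchored = [name for name, seq in proteins.items()
            if ANCHOR.search(seq[-25:])]   # motif sits near the C terminus
print(anchored)
```

A full MSCRAMM screen, as in the study, layers further predictions on top of this filter, such as signal peptides, immunoglobulin-like domain folds, and proximity to sortase genes.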