16 results for comparison method
in DigitalCommons@The Texas Medical Center
Abstract:
Detecting differential gene expression in microarray data has been a long-standing difficulty. Several correction procedures try to control the family-wise error rate in multiple comparisons, including the Bonferroni and Sidak single-step p-value adjustments, Holm's step-down correction, and Benjamini and Hochberg's false discovery rate (FDR) procedure. Each multiple comparison technique has its advantages and weaknesses. We studied each method through simulation studies and applied the methods to real exploratory DNA microarray data aimed at detecting molecular signatures in papillary thyroid cancer (PTC) patients. According to our simulation results, Benjamini and Hochberg's step-up FDR-controlling procedure was the best of these multiple comparison methods, and we discovered 1277 potential biomarkers among 54675 probe sets after applying it to the PTC microarray data.
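As a generic sketch of the step-up procedure the abstract favors (not the study's own code; the p-values below are invented for illustration), Benjamini and Hochberg's method can be written as:

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up FDR procedure.

    Returns a boolean array marking which hypotheses are rejected
    at false discovery rate alpha.
    """
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    ranked = p[order]
    # Find the largest k with p_(k) <= (k/m) * alpha; reject ranks 1..k.
    thresholds = alpha * np.arange(1, m + 1) / m
    below = ranked <= thresholds
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])  # index of largest passing rank
        reject[order[: k + 1]] = True
    return reject

# Invented example: BH rejects more hypotheses than a Bonferroni cut-off.
pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.3, 0.9]
print(benjamini_hochberg(pvals))      # BH rejections
print(np.asarray(pvals) <= 0.05 / 8)  # Bonferroni rejections
```

On these toy p-values BH rejects the two smallest, while Bonferroni rejects only the smallest, illustrating the power advantage the simulation study reports.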
Abstract:
Historically, morphological features were the primary means of classifying organisms. The age of molecular genetics, however, allows us to approach this field from the perspective of the organism's genetic code. Early work used highly conserved sequences, such as ribosomal RNA. The increasing number of complete genomes in public data repositories provides the opportunity to look not only at a single gene but at an organism's entire parts list.

Here the Sequence Comparison Index (SCI) and the Organism Comparison Index (OCI), algorithms and methods to compare proteins and proteomes, are presented. The complete proteomes of 104 sequenced organisms were compared. Over 280 million full Smith-Waterman alignments were performed on sequence pairs that had a reasonable expectation of being related. From these alignments a whole-proteome phylogenetic tree was constructed. The same method was used to compare the small subunit (SSU) rRNA from each organism and to construct a tree from those results. The SSU rRNA tree by the SCI/OCI method closely resembles accepted SSU rRNA trees from sources such as the Ribosomal Database Project, validating the method. The SCI/OCI proteome tree showed a number of small but significant differences when compared to the SSU rRNA tree and to proteome trees constructed by other methods. Horizontal gene transfer does not appear to affect the SCI/OCI trees until the transferred genes make up a large portion of the proteome.

As part of this work, the Database of Related Local Alignments (DaRLA) was created; it contains over 81 million rows of sequence alignment information. DaRLA, while primarily used to build the whole-proteome trees, can also be applied to shared gene content analysis, gene order analysis, and building individual protein trees.

Finally, the standard BLAST method for analyzing shared gene content was compared to the SCI method using four spirochetes. The SCI system performed flawlessly, finding all proteins from one organism against itself and all the ribosomal proteins between organisms. The BLAST system missed some proteins from its respective organism and failed to detect small ribosomal proteins between organisms.
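The alignment engine underlying this kind of comparison can be illustrated with a minimal Smith-Waterman local-alignment scorer. This is a generic sketch with a linear gap penalty and arbitrary scoring parameters; the thesis's actual SCI/OCI normalization is not reproduced here:

```python
def smith_waterman_score(a, b, match=2, mismatch=-1, gap=-2):
    """Best local alignment score between sequences a and b
    (Smith-Waterman dynamic programming with a linear gap penalty)."""
    cols = len(b) + 1
    prev = [0] * cols
    best = 0
    for i in range(1, len(a) + 1):
        curr = [0] * cols
        for j in range(1, cols):
            s = match if a[i - 1] == b[j - 1] else mismatch
            curr[j] = max(0,
                          prev[j - 1] + s,    # align a[i-1] with b[j-1]
                          prev[j] + gap,      # gap in b
                          curr[j - 1] + gap)  # gap in a
            best = max(best, curr[j])
        prev = curr  # only the previous row is needed: O(len(b)) memory
    return best

# Two made-up peptide fragments differing at one residue:
print(smith_waterman_score("MKVLAAGVA", "MKVLSAGVA"))  # → 15 (8 matches, 1 mismatch)
```

A full pipeline like the one described would run such alignments on millions of candidate pairs and convert the scores into a normalized similarity index before tree building.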
Abstract:
Objective: Interruptions are known to have a negative impact on activity performance. Understanding how an interruption contributes to human error is limited because there is no standard method for analyzing and classifying interruptions. Qualitative data are typically analyzed by either a deductive or an inductive method, and both have limitations. In this paper a hybrid method was developed that integrates deductive and inductive methods for categorizing activities and interruptions recorded during an ethnographic study of physicians and registered nurses in a Level One Trauma Center. Understanding the effects of interruptions is important for designing and evaluating informatics tools in particular and for improving healthcare quality and patient safety in general. Method: The hybrid method was developed using a deductive a priori classification framework with the provision for adding new categories discovered inductively in the data. The inductive process used line-by-line coding and constant comparison as described in Grounded Theory. Results: The categories of activities and interruptions were organized into a three-tiered hierarchy of activity. Validity and reliability of the categories were tested by categorizing a medical error case external to the study. No new categories of interruptions were identified during analysis of that case. Conclusions: Findings from this study provide evidence that the hybrid model of categorization is more complete than either a deductive or an inductive method alone. The hybrid method developed here provides methodical support for understanding, analyzing, and managing interruptions and workflow.
Abstract:
The sedative and cardiovascular effects of rectally administered diazepam (0.6 mg/kg) were compared with placebo in uncooperative children who required sedation during dental treatment. Twelve healthy preschool children who required amalgam restorations were treated during two standardized restorative appointments in a double-blind, crossover study. Blood pressure and pulse were obtained at four specified intervals during each appointment. The behavior of the children during the treatment visits was videotaped and later statistically analyzed using a kinesics/vocalization instrument. Behavioral ratings of cooperation were significantly improved during the treatment visit following diazepam. Interfering bodily movements, patient vocalizations, and operator commands were all reduced significantly in the diazepam group (p≤0.0001). No significant differences were observed for noninterfering behavioral responses. Rectally administered diazepam did not significantly alter blood pressure or pulse in these sedated children compared with placebo. These findings indicate that rectal diazepam is an effective sedative agent with minimal cardiovascular effect for the management of the young pediatric dental patient.
Abstract:
A patient classification system was developed that integrates a patient acuity instrument with a computerized nursing distribution method based on a linear programming model. The system was designed for real-time measurement of patient acuity (workload) and allocation of nursing personnel to optimize the utilization of resources.

The acuity instrument was a prototype tool with eight categories of patients defined by patient severity and nursing intensity parameters. From this tool, the demand for nursing care was defined in patient points, with one point equal to one hour of RN time. Validity and reliability of the instrument were determined as follows: (1) content validity by a panel of expert nurses; (2) predictive validity through a paired t-test analysis of preshift and postshift categorization of patients; (3) initial reliability by a one-month pilot of the instrument in a practice setting; and (4) interrater reliability by the Kappa statistic.

The nursing distribution system was a linear programming model using a branch and bound technique for obtaining integer solutions. The objective function was to minimize the total number of nursing personnel used by optimally assigning the staff to meet the acuity needs of the units. A penalty weight was used as a coefficient of the objective function variables to define priorities for allocation of staff. The demand constraints were requirements to meet the total acuity points needed for each unit and to have a minimum number of RNs on each unit. Supply constraints were: (1) the total availability of each type of staff and the value of that staff member, where value was determined relative to that staff type's ability to perform the job function of an RN (e.g., value for eight hours: RN = 8 points, LVN = 6 points); and (2) the number of personnel available for floating between units.

The capability of the model to assign staff quantitatively and qualitatively equal to the manual method was established by a thirty-day comparison. Sensitivity testing demonstrated appropriate adjustment of the optimal solution to changes in penalty coefficients in the objective function and to acuity totals in the demand constraints. Further investigation of the model documented correct adjustment of assignments in response to staff value changes, and cost minimization by the addition of a dollar coefficient to the objective function.
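The structure of such an allocation model can be illustrated with a toy version. The demands, RN minimums, and penalty weights below are hypothetical (none come from the study), and brute-force enumeration stands in for the branch and bound solver:

```python
from itertools import product

# Hypothetical shift model: an RN covers 8 acuity points, an LVN 6,
# mirroring the RN = 8 / LVN = 6 point values described above.
RN_POINTS, LVN_POINTS = 8, 6
demand = {"ICU": 40, "Med-Surg": 30}   # acuity points required per unit (invented)
min_rn = {"ICU": 3, "Med-Surg": 2}     # minimum RNs per unit (invented)
penalty = {"RN": 1.0, "LVN": 0.9}      # allocation priority weights (invented)

def best_mix(points_needed, rn_floor, max_staff=12):
    """Enumerate integer (RN, LVN) mixes and return the cheapest feasible one.
    A stand-in for the integer linear program solved by branch and bound."""
    best, best_cost = None, float("inf")
    for rn, lvn in product(range(max_staff + 1), repeat=2):
        if rn < rn_floor:
            continue  # demand constraint: minimum RNs on the unit
        if rn * RN_POINTS + lvn * LVN_POINTS < points_needed:
            continue  # demand constraint: total acuity points must be met
        cost = rn * penalty["RN"] + lvn * penalty["LVN"]
        if cost < best_cost:
            best, best_cost = (rn, lvn), cost
    return best, best_cost

for unit, pts in demand.items():
    mix, cost = best_mix(pts, min_rn[unit])
    print(unit, mix, round(cost, 2))
```

A real implementation would add the floating-staff supply constraints and solve all units jointly rather than one unit at a time.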
Abstract:
Complete NotI, SfiI, XbaI, and BlnI cleavage maps of Escherichia coli K-12 strain MG1655 were constructed. Techniques used included: CHEF pulsed-field gel electrophoresis; transposon mutagenesis; fragment hybridization to the ordered λ library of Kohara et al.; fragment and cosmid hybridization to Southern blots; correlation of fragments and cleavage sites with EcoMap, a sequence-modified version of the genomic restriction map of Kohara et al.; and correlation of cleavage sites with DNA sequence databases. In all, 105 restriction sites were mapped and correlated with the EcoMap coordinate system.

NotI, SfiI, XbaI, and BlnI restriction patterns of five commonly used E. coli K-12 strains were compared to those of MG1655. The variability between strains, some of which are separated by numerous steps of mutagenic treatment, is readily detectable by pulsed-field gel electrophoresis. A model is presented to account for the differences between the strains on the basis of simple insertions, deletions, and in one case an inversion. Insertions and deletions ranged in size from 1 kb to 86 kb. Several of the larger features have previously been characterized, and some of the smaller rearrangements can potentially account for previously reported genetic features of these strains.

Some aspects of the frequency and distribution of NotI, SfiI, XbaI, and BlnI cleavage sites were analyzed using a method based on Markov chain theory. Overlaps of Dam and Dcm methylase sites with XbaI and SfiI cleavage sites were examined. The one XbaI-Dam overlap in the database is in accord with the expected frequency of this overlap. Certain types of SfiI-Dcm overlaps are overrepresented; of the four subtypes, only one has a partial inhibitory effect on the activity of SfiI. Recognition sites for all four enzymes are rarer than expected based on oligonucleotide frequency data, with this effect being much stronger for XbaI and BlnI than for NotI and SfiI. The latter two enzyme sites are rare mainly due to apparent negative selection against GGCC (both) and CGGCCG (NotI). The former two are rare mainly due to effects of the VSP repair system on certain di-, tri-, and tetranucleotides, most notably CTAG. Models are proposed to explain several of the anomalies of oligonucleotide distribution in E. coli, and the biological significance of the systems that produce these anomalies is discussed.
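The Markov-chain rarity analysis rests on comparing the observed count of a recognition site with the count expected from shorter-oligonucleotide frequencies. A first-order version of that comparison, run here on a random sequence rather than the E. coli genome and purely as an illustration, looks like:

```python
import random
from collections import Counter

def expected_word_count(seq, word):
    """Expected occurrences of `word` under a first-order Markov model
    fitted to `seq`: the product of the word's dinucleotide counts
    divided by the counts of its internal single bases."""
    di = Counter(seq[i:i + 2] for i in range(len(seq) - 1))
    mono = Counter(seq)
    exp = 1.0
    for i in range(len(word) - 1):
        exp *= di[word[i:i + 2]]
    for base in word[1:-1]:
        exp /= mono[base]
    return exp

def observed_count(seq, word):
    """Count (possibly overlapping) occurrences of `word` in `seq`."""
    k = len(word)
    return sum(seq[i:i + k] == word for i in range(len(seq) - k + 1))

random.seed(1)
genome = "".join(random.choice("ACGT") for _ in range(200_000))
site = "TCTAGA"  # XbaI recognition sequence
print(observed_count(genome, site), round(expected_word_count(genome, site), 1))
```

On a random sequence the observed/expected ratio hovers near 1; the dissertation's point is that on the real genome the ratio for XbaI and BlnI sites falls well below 1.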
Abstract:
A nonparametric method was developed and tested to compare the partial areas under two correlated Receiver Operating Characteristic (ROC) curves. Based on the theory of generalized U-statistics, mathematical formulas were derived for computing the ROC area and the variance and covariance between portions of two ROC curves. A practical SAS application was also developed to facilitate the calculations. The accuracy of the nonparametric method was evaluated by comparison with other methods: applied to data from a published ROC analysis of CT images, it produced results very close to the published ones. A hypothetical example was used to demonstrate the effect of two crossed ROC curves: although the two total ROC areas were the same, each portion of the area between the two curves was found to be significantly different by the partial ROC curve analysis. For large-scale ROC computations, such as from a logistic regression model, we applied our method to a breast cancer study with Medicare claims data; it yielded the same ROC area as the SAS Logistic procedure. Our method also provides an alternative to the global summary of ROC area comparison by directly comparing the true-positive rates of two regression models and by determining the range of false-positive values where the models differ.
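The core quantities, the ROC area as a two-sample U-statistic and a partial area over a false-positive range, can be sketched as follows. The study's variance/covariance formulas and SAS application are not reproduced; the partial-area function here is a simple empirical-ROC truncation without boundary interpolation:

```python
import numpy as np

def auc_u_statistic(pos, neg):
    """ROC area as a generalized U-statistic:
    P(case score > control score) + 0.5 * P(tie)."""
    pos, neg = np.asarray(pos, float), np.asarray(neg, float)
    diff = pos[:, None] - neg[None, :]  # all case-control score differences
    return (diff > 0).mean() + 0.5 * (diff == 0).mean()

def partial_auc(pos, neg, fpr_hi):
    """Area under the empirical ROC curve restricted to
    false-positive rates in [0, fpr_hi]."""
    scores = np.concatenate([pos, neg]).astype(float)
    labels = np.concatenate([np.ones(len(pos)), np.zeros(len(neg))])
    order = np.argsort(-scores)  # sweep the threshold from high to low
    tpr = np.concatenate([[0.0], np.cumsum(labels[order]) / len(pos)])
    fpr = np.concatenate([[0.0], np.cumsum(1 - labels[order]) / len(neg)])
    keep = fpr <= fpr_hi
    f, t = fpr[keep], tpr[keep]
    return float(np.sum(np.diff(f) * (t[1:] + t[:-1]) / 2))  # trapezoid rule

print(auc_u_statistic([3, 4, 5], [1, 2, 3]))  # toy case/control scores
```

Comparing `partial_auc` over a chosen false-positive range for two models is exactly the kind of restricted comparison the abstract advocates when total areas coincide.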
Abstract:
A new technique for detecting microbiological fecal pollution in drinking and raw surface water was modified and tested against the standard multiple-tube fermentation technique (most probable number, MPN). The performance of the new test in detecting fecal pollution in drinking water was assessed at different incubation temperatures. The basis of the new test is the detection of hydrogen sulfide produced by hydrogen sulfide-producing bacteria, which are usually associated with the coliform group. Positive results are indicated by the appearance of a brown to black color in the contents of the fermentation tube within 18 to 24 hours of incubation at 35 ± 0.5°C. For this study, 158 water samples from different sources were used. The results were analyzed statistically with the paired t-test and one-way analysis of variance. No statistically significant difference was observed between the two methods, when tested at 35 ± 0.5°C, in detecting fecal pollution in drinking water. The new test showed more positive results with raw surface water, which could be due to the presence of hydrogen sulfide-producing bacteria of non-fecal origin such as Desulfovibrio and Desulfotomaculum. The survival of the hydrogen sulfide-producing bacteria and the coliforms was also tested over a 7-day period, and the results showed no significant difference. The two methods likewise showed no significant difference when used to detect fecal pollution at very low coliform density. The results showed that the new test is most effective in detecting fecal pollution in drinking water when used at 35 ± 0.5°C. The new test is effective, simple, and less expensive for detecting fecal pollution in drinking water and raw surface water at 35 ± 0.5°C, and it can be used for qualitative and/or quantitative analysis of water in the field and in the laboratory.
Abstract:
DNA-mediated gene transfection is an important tool for isolating genes from one cell type and moving them into a foreign genetic background. DNA transfection studies have been done routinely in many laboratories to identify and isolate transforming sequences in human tumors and tumor cell lines. A second technique, microcell-mediated chromosome transfer, allows the transfer of small numbers of intact human chromosomes from one cell to another. This work was done to compare the efficiency of these two techniques in the transformation of NIH 3T3 mouse fibroblast cells.

My intent in comparing these two techniques was to see whether there was a difference between the transforming capability of DNA purified of all associated proteins and RNAs and that of DNA introduced into a cell in its native form, the chromosome. If chromosomal sequences were capable of transforming the 3T3 cells in culture, the method could then be used as a way to isolate the relevant tumorigenic chromosomes from human tumors.

The study shows, however, that even for those cell lines that contain transforming sequences identified by DNA-mediated gene transfer, those same sequences were unable to transform 3T3 cells when introduced by somatic fusion of human tumor microcells. I believe that the human transforming sequences in their original genetic conformation are not recognized by the mouse cell as genes that should be expressed; therefore, no noticeable transformation event was selected by this technique.
Abstract:
Purpose: Fluorophotometry is a well-validated method for assessing corneal permeability in human subjects. However, with the growing importance of basic-science animal research in ophthalmology, fluorophotometry's use in animals must be further evaluated. The purpose of this study was to evaluate corneal epithelial permeability following desiccating stress using the modified Fluorotron Master™.

Methods: Corneal permeability was evaluated before and after subjecting 6-8-week-old C57BL/6 mice to experimental dry eye (EDE) for 2 and 5 days (n=9/time point). Untreated mice served as controls. Ten microliters of 0.001% sodium fluorescein (NaF) were instilled topically into each mouse's left eye to create an eye bath and left to permeate for 3 minutes. The eye bath was followed by a generous wash with buffered saline solution (BSS) and alignment with the Fluorotron Master™. Seven corneal scans were performed during 15 minutes (post-wash #1 scans), followed by a second BSS wash and another set of five corneal scans (post-wash #2 scans) during the next 15 minutes. Corneal permeability was calculated using data from the FM™ Mouse software.

Results: Comparing the post-wash #1 scans across groups with a repeated-measures design, there was a statistical difference in corneal fluorescein permeability after 5 days of EDE compared to untreated mice (1160.21±108.26 vs. 1000.47±75.56 ng/mL, P<0.016 for the untreated vs. 5-day comparison [0.008]), but not after only 2 days (1115.64±118.94 vs. 1000.47±75.56 ng/mL, P>0.016 for the untreated vs. 2-day comparison [0.050]). There was no statistical difference between the 2-day and 5-day post-wash #1 scans (P=.299). The post-wash #2 scans demonstrated that EDE caused significant NaF retention at both 2 and 5 days compared to baseline, untreated controls (1017.92±116.25 and 1015.40±120.68 vs. 528.22±127.85 ng/mL, P<0.05 [0.0001 for both]). There was no statistical difference between the 2-day and 5-day post-wash #2 scans (P=.503). The comparison of the untreated post-wash #1 scans with the untreated post-wash #2 scans using a paired t-test showed a significant difference between the two sets of scans (P<0.001), as did the 2-day and 5-day comparisons (P=0.010 and 0.002, respectively).

Conclusion: Desiccating stress increases the permeability of the corneal epithelium to NaF and increases NaF retention in the corneal stroma. The Fluorotron Master™ is a useful and sensitive tool for evaluating corneal permeability in murine dry eye and will be useful for evaluating the effectiveness of dry eye treatments in animal-model drug trials.
Abstract:
Background: Because our hands are the most common mode of transmission for the bacteria causing hospital-acquired infections, hand hygiene is the most effective method of preventing the spread of these pathogens, limiting healthcare-associated infections, and reducing transmission of multidrug-resistant organisms. Yet compliance rates average below 40%.

Objective: This culminating experience project is primarily a literature review on hand hygiene, intended to identify barriers to hand hygiene compliance, offer solutions for improving compliance rates, and build on a hand hygiene evaluation performed during my infection control internship at Memorial Hermann Hospital during the fall semester of 2005.

Method: A review of peer-reviewed literature using the Ovid Medline, Ebsco Medline, and PubMed databases with the keywords: hand hygiene, hand hygiene compliance, alcohol-based handrub, healthcare-associated infections, hospital-acquired infections, and infection control.

Results: A total of eight hand hygiene studies are highlighted. At a children's hospital in Seattle, hand hygiene compliance rates increased from 62% to 81% after five periods of interventions. In Thailand, 26 nurses dramatically increased compliance from 6.3% to 81.2% after just 7 months of training. Automated alcohol-based handrub dispensers improved compliance rates in Chicago from 36.3% to 70.1%. Education combined with wider distribution of alcohol-based handrubs increased hand hygiene rates from 59% to 79% for Ebnother, from 54% to 85% for Hussein, and from 32% to 63% for Randle. Spartanburg Regional Medical Center increased its rates from 72.5% to 90.3%. A level III NICU achieved 100% compliance after a month-long educational campaign but fell back to its baseline rate of 89% after 3 months.

Discussion: The interventions used to promote hand hygiene in the highlighted studies ranged from low-tech approaches, such as printed materials, to advanced electronic devices that automatically alerted individuals to perform hand hygiene. All approaches were effective and increased compliance rates. Overcoming barriers to hand hygiene and receiving and accepting feedback are key to maintaining consistently high adherence.
Abstract:
Many public health agencies and researchers are interested in comparing hospital outcomes, for example morbidity, mortality, and hospitalization, across areas and hospitals. However, because rates vary among hospitals for reasons that include several biases, we are interested in controlling for the bias and assessing real differences in clinical practice. In this study, we compared between-hospital variation in rates of severe intraventricular haemorrhage (IVH) in infants using a frequentist statistical approach versus Bayesian hierarchical models through a simulation study. The template data set for the simulation comprised the numbers of infants with severe IVH in 24 intensive care units of the Australian and New Zealand Neonatal Network from 1995 to 1997, reflecting severe IVH rates in preterm babies. We evaluated the severe IVH rates for the 24 hospitals with two hierarchical models in the Bayesian approach, comparing their performance with the shrunken rates of the frequentist method. Gamma-Poisson (BGP) and Beta-Binomial (BBB) models were used in the Bayesian approach, and the shrunken estimator of the Gamma-Poisson (FGP) hierarchical model fitted by maximum likelihood served as the frequentist approach. To simulate data, the total number of infants in each hospital was held fixed, and the simulated data were analyzed under both the Bayesian and frequentist models with two true parameters for the severe IVH rate: one was the observed rate, and the other was the severe IVH rate expected after adjusting for five predictor variables in the template data. The bias in the estimated rates showed that the Bayesian models gave less variable estimates than the frequentist model. We also discussed and compared the results from the three models by examining the variation in severe IVH rates through 20th-centile rates and the number of avoidable severe IVH cases.
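The shrinkage idea common to both approaches can be illustrated with a simple empirical-Bayes Beta-Binomial estimator. The counts below are hypothetical, and the prior strength is fixed by a crude default, whereas the hierarchical models in the study estimate it from the data:

```python
import numpy as np

def beta_binomial_shrinkage(events, totals, prior_strength=None):
    """Shrink per-hospital event rates toward the pooled rate using a
    Beta(a, b) prior whose mean matches the pooled rate.

    `prior_strength` (a + b) is an illustrative tuning choice here;
    a full hierarchical model would estimate it rather than fix it."""
    events = np.asarray(events, float)
    totals = np.asarray(totals, float)
    pooled = events.sum() / totals.sum()
    if prior_strength is None:
        prior_strength = totals.mean()  # simple default, not from the study
    a = pooled * prior_strength
    b = (1 - pooled) * prior_strength
    # Posterior mean of each hospital's rate under the shared prior:
    return (events + a) / (totals + a + b)

# Hypothetical hospitals: small units get pulled hardest toward the pool.
events, totals = [2, 10, 1], [20, 100, 5]
print(np.asarray(events) / np.asarray(totals))       # raw rates
print(beta_binomial_shrinkage(events, totals).round(4))  # shrunken rates
```

Note how the tiny hospital's raw rate of 0.20 moves much closer to the pooled rate than the large hospital's does; that damping of small-sample extremes is why the shrunken estimates are less variable.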
Abstract:
The purpose of this study was to compare the relative effectiveness of alternative methods of tracing named contacts of syphilis patients. A total of 236 contacts, identified by patients in two City of Houston Department of Health and Human Services clinics during the period April 1 through July 31, 1987, were studied. After contacts were grouped by sex and age, the proportion brought to examination by each of three methods, and by a combination of methods, was determined for each subgroup.

The study found that 78.4% of the 236 named sex contacts were located and brought to examination by the various methods of contact tracing and that 21.6% were missed. Of the 185 contacts examined, a combination of methods identified 47.7% of the cases; telephone contact, 28.6%; field contact, 16.9%; and patient referral, 11.8%.

Of the 236 contacts reported, males made up 56.8% and females 43.2%. Contact tracing was more successful among females, with 81.4% of the 102 named female contacts brought to examination, compared with 76.1% of the 134 named male contacts. It is not known whether equal efforts were exerted in the follow-up of male and female contacts. In both subgroups, a combination of methods brought over 40% of sex contacts to examination. Among females, telephone contact yielded 27.7% of the cases and field contact 18.1%, whereas among males, telephone contact identified 29.4% of the cases and field contact 15.7%. Patient referral was the least productive method in both sex groups, locating 12.8% of the cases among males and 10.8% among females.

On an age-specific basis, a combination of methods was most effective in the 15-39 age group, whereas telephone contact was most effective in the 40-44 age group and field contact in the 50-54 age group. Of all the methods of contact tracing, patient referral was the least productive in most age groups. Future studies of contact tracing should incorporate several important variables that were not examined in this study.
Abstract:
Studies have shown that rare genetic variants can have stronger effects in predisposing to common diseases, and several statistical methods have been developed for association studies involving rare variants. To better understand how these methods perform, we compared two recently developed rare-variant statistical methods (VT and C-alpha) on 10,000 simulated re-sequencing data sets with disease status and the corresponding 10,000 simulated null data sets. The SLC1A1 gene has been suggested to be associated with diastolic blood pressure (DBP) in previous studies. In the current study, we applied the VT and C-alpha methods to empirical re-sequencing data for the SLC1A1 gene from 300 white and 200 black participants. We found that the VT method obtained higher power and performed better than the C-alpha method on the simulated data, and type I error was well controlled for both methods. In addition, neither method gave statistical evidence for an association between the SLC1A1 gene and DBP. Overall, our findings provide a useful comparison of the two statistical methods for future reference, along with preliminary findings on the association between the SLC1A1 gene and blood pressure.
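The general shape of a rare-variant association test can be sketched with a simplified burden-style permutation test. This is a stripped-down stand-in for VT and C-alpha, not either published method, and the genotypes are simulated toy data:

```python
import numpy as np

rng = np.random.default_rng(0)

def burden_permutation_test(genotypes, phenotype, n_perm=2000):
    """Simplified rare-variant burden test: score each subject by their
    count of rare alleles, take the case-control difference in mean
    scores, and get a p-value by permuting the phenotype labels."""
    burden = genotypes.sum(axis=1)  # rare-allele count per subject
    observed = burden[phenotype == 1].mean() - burden[phenotype == 0].mean()
    null = np.empty(n_perm)
    for i in range(n_perm):
        perm = rng.permutation(phenotype)
        null[i] = burden[perm == 1].mean() - burden[perm == 0].mean()
    # Two-sided permutation p-value:
    return float((np.abs(null) >= abs(observed)).mean())

# Toy data: 100 subjects x 20 rare variants; cases carry extra rare alleles.
n, m = 100, 20
phenotype = np.repeat([1, 0], [50, 50])
genotypes = rng.binomial(1, 0.02, size=(n, m))
genotypes[:50] |= rng.binomial(1, 0.05, size=(50, m))  # enrich the cases
p = burden_permutation_test(genotypes, phenotype)
print(p)
```

The VT method extends this idea by searching over allele-frequency thresholds, and C-alpha instead tests for overdispersion of per-variant case counts, which lets it detect variants with effects in both directions.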