16 results for text analytic approaches
in DigitalCommons@The Texas Medical Center
Abstract:
Tuberculosis remains a major threat as drug resistance continues to increase. Pulmonary tuberculosis in adults is responsible for 80% of clinical cases and nearly 100% of transmission of infection. Unfortunately, since we have no animal models of adult-type pulmonary tuberculosis, the most important type of disease remains largely out of reach of modern science and many fundamental questions remain unanswered. This paper reviews research dating back to the 1950s providing compelling evidence that cord factor (trehalose 6,6'-dimycolate [TDM]) is essential for understanding tuberculosis. However, the original papers by Bloch and Noll were too far ahead of their time to have immediate impact. We can now recognize that the physical and biologic properties of cord factor are unprecedented in science, especially its ability to switch between two sets of biologic activities with changes in conformation. While TDM remains on organisms, it protects them from killing within macrophages, reduces antibiotic effectiveness, and inhibits the stimulation of protective immune responses. If it comes off organisms and associates with lipid, TDM becomes a driver of tissue damage and necrosis. Studies emanating from cord factor research have produced (1) a rationale for improving vaccines, (2) an approach to new drugs that overcome natural resistance to antibiotics, (3) models of caseating granulomas that reproduce multiple manifestations of human tuberculosis, (4) evidence that TDM is a key T cell antigen in destructive lesions of tuberculosis, and (5) a new understanding of the pathology and pathogenesis of postprimary tuberculosis that can guide more informative studies of long-standing mysteries of tuberculosis.
Abstract:
Each year, pediatric traumatic brain injury (TBI) accounts for 435,000 emergency department visits, 37,000 hospital admissions, and approximately 2,500 deaths in the United States. TBI results in immediate injury from direct mechanical force and shear. Secondary injury results from the release of biochemical or inflammatory factors that alter the loco-regional milieu in the acute, subacute, and delayed intervals after a mechanical insult. Preliminary preclinical and clinical research is underway to evaluate the benefit from progenitor cell therapeutics, hypertonic saline infusion, and controlled hypothermia. However, all phase III clinical trials investigating pharmacologic monotherapy for TBI have shown no benefit. A recent National Institutes of Health consensus statement recommends research into multimodality treatments for TBI. This article will review the complex pathophysiology of TBI as well as the possible therapeutic mechanisms of progenitor cell transplantation, hypertonic saline infusion, and controlled hypothermia for possible utilization in multimodality clinical trials.
Abstract:
Lipids fulfill multiple and diverse functions in cells. Establishing the molecular basis for these functions has been challenging due to the lack of catalytic activity of lipids and the pleiotropic effects of mutations that affect lipid composition. By combining molecular genetic manipulation of membrane lipid composition with biochemical characterization of the resulting phenotypes, the molecular details of novel lipid functions have been established. This review summarizes the results of such a combined approach to defining lipid function in bacteria.
Abstract:
Family reunification is one of the central tenets of the child welfare system, yet research supporting effective practices to promote safe reunifications is limited. As a departure from previous initiatives, the Parent Partner (PP) program enlists as staff mothers and fathers who have themselves experienced child removal, services, and reunification. This study examines outcomes for children served by the PP program. The experimental group includes 236 children whose parents were served by a Parent Partner; the matched comparison group includes 55 children whose parents were served by the public child welfare agency in 2004, before the Parent Partner program was established. Cases were examined 12 months following case opening to determine reunification status. Results from the outcome study indicate that reunification may be more likely for children whose parents were served by Parent Partners. Although there are limitations to the data, findings from this study suggest that the Parent Partner model may hold promise as a child welfare intervention designed to support reunification.
Abstract:
In “Partnering with Parents: Promising Approaches to Improve Reunification Outcomes for Children in Foster Care,” authors J. D. Berrick, Edward Cohen, and Elizabeth Anthony describe promising outcomes from a quasi-experimental study of reunification outcomes associated with the Parent Partner program in one western county.
Abstract:
A change in synaptic strength arising from the activation of two neuronal pathways at approximately the same time is a form of associative plasticity and may underlie classical conditioning. Previously, a cellular analog of a classical conditioning protocol was demonstrated to produce short-term associative plasticity at the connections between sensory and motor neurons in Aplysia. A similar training protocol produced long-term (24 hour) enhancement of excitatory postsynaptic potentials (EPSPs). EPSPs produced by sensory neurons in which activity was paired with a reinforcing stimulus were significantly larger than unpaired controls 24 hours after training. To examine whether the associative plasticity observed at these synapses may be involved in higher-order forms of classical conditioning, a neural analog of contingency was developed. In addition, computer simulations were used to analyze whether the associative plasticity observed in Aplysia could, in theory, account for second-order conditioning and blocking.
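The abstract does not include the simulation code, but the blocking phenomenon it mentions has a standard minimal account: the Rescorla-Wagner learning rule, in which all cues present on a trial share one prediction-error term. The sketch below is purely illustrative (parameter values and cue names are assumptions, not taken from the dissertation) and shows how that shared error produces blocking in a few lines of Python.

```python
# Illustrative sketch only (not the dissertation's simulation): the
# Rescorla-Wagner rule updates all cues present on a trial with a shared
# prediction-error term, which is sufficient to produce blocking.
alpha, beta, lam = 0.3, 0.3, 1.0   # cue salience, learning rate, US asymptote

def train(weights, trials):
    """Update associative strengths for the cues present on each trial."""
    for cues in trials:
        v_total = sum(weights[c] for c in cues)   # summed prediction
        delta = alpha * beta * (lam - v_total)    # shared error term
        for c in cues:
            weights[c] += delta
    return weights

w = {"A": 0.0, "B": 0.0}
train(w, [("A",)] * 50)        # Phase 1: cue A alone is paired with the US
train(w, [("A", "B")] * 50)    # Phase 2: compound AB; A already predicts the
print(w)                       # US, so the error is near zero and B is blocked
```

After phase 1 the weight for A is near the asymptote, so during the compound phase the shared error term is close to zero and B acquires almost no associative strength, reproducing the blocked-cue result.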
Abstract:
Dielectrophoresis (DEP) has been used to manipulate cells in low-conductivity suspending media using AC electrical fields generated on micro-fabricated electrode arrays. This has created the possibility of automatically performing, on a micro-scale, cell processing more sophisticated than that currently requiring substantial laboratory equipment, reagent volumes, time, and human intervention. In this research the manipulation of aqueous droplets in an immiscible, low-permittivity suspending medium is described to complement previous work on dielectrophoretic cell manipulation. Such droplets can be used as carriers not only for air- and water-borne samples, contaminants, chemical reagents, viral and gene products, and cells, but also for the reagents to process and characterize these samples. A long-term goal of this area of research is to perform chemical and biological assays on automated, micro-scaled devices at or near the point of care, which will increase the availability of modern medicine to people who do not have ready access to large medical institutions and decrease the cost and delays associated with that lack of access. In this research I present proofs-of-concept for droplet manipulation and droplet-based biochemical analysis using dielectrophoresis as the motive force. Proofs-of-concept developed for the first time in this research include: (1) showing droplet movement on a two-dimensional array of electrodes, (2) achieving controlled dielectric droplet injection, (3) fusing and reacting droplets, and (4) demonstrating a protein fluorescence assay using micro-droplets.
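For background, the motive force the abstract relies on has a standard textbook form (a dipole-approximation result, not a formula stated in the abstract itself): the time-averaged DEP force on a sphere of radius R in a medium of permittivity ε_m is

```latex
% Time-averaged DEP force on a sphere (dipole approximation)
\mathbf{F}_{\mathrm{DEP}} = 2\pi \varepsilon_m R^{3}\,
  \mathrm{Re}\!\left[f_{\mathrm{CM}}(\omega)\right]\,
  \nabla\!\left|\mathbf{E}_{\mathrm{rms}}\right|^{2},
\qquad
f_{\mathrm{CM}}(\omega)
  = \frac{\varepsilon_p^{*}-\varepsilon_m^{*}}
         {\varepsilon_p^{*}+2\varepsilon_m^{*}}
```

where ε* = ε − iσ/ω denotes the complex permittivity of the particle (p) or medium (m). For an aqueous droplet in a low-permittivity suspending medium, Re[f_CM] is positive, so droplets are drawn toward field maxima at the energized electrodes (positive DEP), which is what makes electrode arrays usable for droplet transport.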
Abstract:
Many public health agencies and researchers are interested in comparing hospital outcomes, for example, morbidity, mortality, and hospitalization across areas and hospitals. However, since rates vary among hospitals because of several biases, we are interested in controlling for the bias and assessing real differences in clinical practices. In this study, we compared the variation between hospitals in rates of severe intraventricular haemorrhage (IVH) in infants using a Frequentist statistical approach versus Bayesian hierarchical models through a simulation study. The template data set for the simulation study comprised the numbers of infants with severe IVH in 24 intensive care units in the Australian and New Zealand Neonatal Network from 1995 to 1997, reflecting the severe IVH rate in preterm babies. We evaluated the rates of severe IVH for the 24 hospitals with two hierarchical models in the Bayesian approach, comparing their performance with the shrunken rates of the Frequentist method. Gamma-Poisson (BGP) and Beta-Binomial (BBB) models were used as the Bayesian hierarchical models, and the shrunken estimator of the Gamma-Poisson (FGP) hierarchical model fitted by maximum likelihood served as the Frequentist approach. To simulate data, the total number of infants in each hospital was kept fixed, and we analyzed the simulated data under both the Bayesian and Frequentist models with two true parameters for the severe IVH rate: one was the observed rate, and the other was the expected severe IVH rate obtained by adjusting for five predictor variables in the template data. The bias in the estimated rates of severe IVH showed that the Bayesian models gave less variable estimates than the Frequentist model. We also discussed and compared the results from the three models in examining the variation in rates of severe IVH by 20th centile rates and the avoidable number of severe IVH cases.
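As a hedged illustration of the shrinkage these hierarchical models produce (a minimal empirical-Bayes sketch on toy data, not the study's actual estimator), the Gamma-Poisson posterior mean pulls each hospital's raw rate toward the overall rate, with more shrinkage for smaller units:

```python
# Minimal empirical-Bayes sketch of Gamma-Poisson shrinkage (toy data, not
# the study's estimator). With a Gamma(a, b) prior on hospital rates and
# Poisson counts, the posterior mean rate is (a + y_i) / (b + n_i).
import numpy as np

y = np.array([3, 12, 7, 1, 20])        # severe IVH cases per hospital (toy)
n = np.array([80, 150, 120, 40, 260])  # infants at risk per hospital (toy)

raw = y / n
overall = y.sum() / n.sum()
# Crude method-of-moments prior: between-hospital variance is the excess of
# the raw-rate variance over the average Poisson sampling variance.
var_between = max(raw.var(ddof=1) - np.mean(raw / n), 1e-12)
b = overall / var_between   # prior rate parameter
a = overall * b             # prior shape parameter

shrunken = (a + y) / (b + n)            # posterior mean rate per hospital
print(np.round(raw, 4), np.round(shrunken, 4))
```

The hospital with only 40 infants is shrunk most strongly toward the network-wide rate, which is the behavior the abstract reports as "less variable estimates" under the Bayesian models.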
Abstract:
Next-generation DNA sequencing platforms can effectively detect the entire spectrum of genomic variation and are emerging as a major tool for systematic exploration of the universe of variants and interactions in the entire genome. However, the data produced by next-generation sequencing technologies suffer from three basic problems: sequence errors, assembly errors, and missing data. Current statistical methods for genetic analysis are well suited for detecting the association of common variants but are less suitable for rare variants. This raises great challenges for sequence-based genetic studies of complex diseases.

This research dissertation utilized the genome continuum model as a general principle, and stochastic calculus and functional data analysis as tools, for developing novel and powerful statistical methods for the next generation of association studies of both qualitative and quantitative traits in the context of sequencing data, ultimately shifting the paradigm of association analysis from the current locus-by-locus analysis to collectively analyzing genome regions.

In this project, functional principal component (FPC) methods coupled with high-dimensional data reduction techniques were used to develop novel and powerful methods for testing the associations of the entire spectrum of genetic variation within a segment of genome or a gene, regardless of whether the variants are common or rare.

Classical quantitative genetics suffers from high type I error rates and low power for rare variants. To overcome these limitations for resequencing data, this project used functional linear models with scalar response to develop statistics for identifying quantitative trait loci (QTLs) for both common and rare variants. To illustrate their applications, the functional linear models were applied to five quantitative traits in the Framingham heart studies.

This project also proposed a novel concept of gene-gene co-association, in which a gene or a genomic region is taken as the unit of association analysis, and used stochastic calculus to develop a unified framework for testing the association of multiple genes or genomic regions for both common and rare alleles. The proposed methods were applied to gene-gene co-association analysis of psoriasis in two independent GWAS datasets, which led to the discovery of networks significantly associated with psoriasis.
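As a hedged sketch of the FPC idea (toy data, with plain PCA on the genotype matrix standing in for the smoothed functional expansion; this is not the dissertation's implementation), genotypes across a region can be reduced to a few component scores that are then tested jointly against disease status, so common and rare variants contribute through the same region-level summary:

```python
# Toy sketch of a functional-principal-component association test (not the
# dissertation's implementation): genotype profiles across a region are
# reduced to a few PC scores, which are tested jointly against case status.
import numpy as np
from scipy.stats import chi2
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
G = rng.binomial(2, 0.05, size=(500, 200)).astype(float)  # 500 subjects x 200 variants
y = rng.integers(0, 2, size=500)                          # case/control labels

scores = PCA(n_components=5).fit_transform(G)  # region summarized by 5 scores

# Likelihood-ratio test of the 5 scores jointly (large C ~ unpenalized fit).
fit = LogisticRegression(C=1e6).fit(scores, y)
z = scores @ fit.coef_[0] + fit.intercept_[0]
ll_full = -np.sum(np.log1p(np.exp(-(2 * y - 1) * z)))
p0 = y.mean()
ll_null = np.sum(y * np.log(p0) + (1 - y) * np.log(1 - p0))
print("p-value:", chi2.sf(2 * (ll_full - ll_null), df=5))
```

The point of the region-level test is that its degrees of freedom are set by the number of retained components rather than the number of variants, which is what gives such methods power for rare variants relative to locus-by-locus testing.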
Abstract:
The investigator conducted an action-oriented investigation of pregnancy and birth among the women of Mesa los Hornos, an urban squatter slum in Mexico City. Three aims guided the project: (1) to obtain information for improving prenatal and maternity service utilization; (2) to examine the utility of rapid ethnographic and epidemiologic assessment methodologies; (3) to cultivate community involvement in health development.

Viewing service utilization as a culturally-bound decision, the study included a qualitative phase to explore women's cognition of pregnancy and birth, their perceived needs during pregnancy, and their criteria of service acceptability. A probability-based community survey delineated parameters of service utilization and pregnancy health events, and probed reasons for decisions to use medical services, lay midwives, or other sources of prenatal and labor and delivery assistance. A qualitative survey of service providers at relevant clinics, hospitals, and practices contributed information on service availability and access, and on coordination among the private, social security, and public assistance health service sectors. The ethnographic approach to exploring the rationale for use or non-use of services provided a necessary complement to conventional barrier-based assessment, to inform planning of culturally appropriate interventions.

Information collection and interpretation were conducted under the aegis of an advisory committee of community residents and service agency representatives; the residents' committee formulated recommendations for action based on findings, and forwarded the mandate to governmental social and urban development offices. Recommendations were designed to inform and develop community participation in health care decision-making.

Rapid research methods are powerful tools for achieving community-based empowerment toward investigation and resolution of local health problems. But while ethnography works well in synergy with quantitative assessment approaches to strengthen the validity and richness of short-term field work, the author strongly urges caution in the application of Rapid Ethnographic Assessments. An ethnographic sensibility is essential to the research enterprise for the development of an active and cooperative community base, the design and use of quantitative instruments, the appropriate use of qualitative techniques, and the interpretation of culturally-oriented information. However, prescribed and standardized Rapid Ethnographic Assessment techniques are counter-productive if used as research short-cuts before locale- and subject-specific cultural understanding is achieved.
Abstract:
Mixed longitudinal designs are important study designs for many areas of medical research. Mixed longitudinal studies have several advantages over cross-sectional or pure longitudinal studies, including shorter study completion time and the ability to separate time and age effects, making them an attractive choice. Statistical methodology used in general longitudinal studies has developed rapidly within the last few decades. A common approach for statistical modeling in studies with mixed longitudinal designs has been the linear mixed-effects model incorporating an age or time effect. The general linear mixed-effects model is considered an appropriate choice for analyzing repeated-measurements data in longitudinal studies. However, common applications of the linear mixed-effects model to mixed longitudinal studies often incorporate age as the only random effect and fail to take into consideration the cohort effect when conducting statistical inferences on age-related trajectories of outcome measurements. We believe special attention should be paid to cohort effects when analyzing data in mixed longitudinal designs with multiple overlapping cohorts. Thus, this has become an important statistical issue to address.

This research aims to address statistical issues related to mixed longitudinal studies. The proposed study examined the existing statistical analysis methods for mixed longitudinal designs and developed an alternative analytic method to incorporate effects from multiple overlapping cohorts as well as from subjects of different ages. The proposed study used simulation to evaluate the performance of the proposed analytic method by comparing it with the commonly used model. Finally, the study applied the proposed analytic method to the data collected by an existing study, Project HeartBeat!, which had previously been evaluated using traditional analytic techniques. Project HeartBeat! is a longitudinal study of cardiovascular disease (CVD) risk factors in childhood and adolescence using a mixed longitudinal design. The proposed model was used to evaluate four blood lipids, adjusting for age, gender, race/ethnicity, and endocrine hormones. The results of this dissertation suggest the proposed analytic model could be a more flexible and reliable choice than the traditional model in terms of fitting data to provide more accurate estimates in mixed longitudinal studies. Conceptually, the proposed model described in this study has useful features, including consideration of effects from multiple overlapping cohorts, and is an attractive approach for analyzing data in mixed longitudinal design studies.
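As a hedged sketch of the modeling contrast the abstract draws (toy simulated data and assumed variable names; the dissertation's own specification is not given here), a linear mixed-effects model for a mixed longitudinal design can carry an explicit cohort term alongside subject-level random effects, rather than letting age absorb everything:

```python
# Hedged sketch (toy data; not the dissertation's specification): a linear
# mixed-effects model for a mixed longitudinal design with an explicit
# cohort term alongside subject-level random effects.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for cohort, start_age in enumerate([8, 11, 14]):      # three overlapping cohorts
    for subj in range(40):
        intercept = rng.normal(100, 10)               # subject-level variation
        for visit in range(4):                        # annual measurements
            age = start_age + visit
            ldl = intercept + 1.5 * age + 3 * cohort + rng.normal(0, 5)
            rows.append(dict(subject=f"c{cohort}s{subj}", cohort=cohort,
                             age=age, ldl=ldl))
df = pd.DataFrame(rows)

# Random intercept per subject; age and cohort as fixed effects so age
# trajectories are not confounded with cohort membership.
model = smf.mixedlm("ldl ~ age + C(cohort)", data=df, groups=df["subject"])
print(model.fit().summary())
```

Dropping the `C(cohort)` term from the formula reproduces the age-only specification the abstract criticizes: in a design where cohorts overlap in age, any cohort shift is then folded into the estimated age trajectory.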
Abstract:
Mechanisms that allow pathogens to colonize the host are not the product of isolated genes, but instead emerge from the concerted operation of regulatory networks. Therefore, identifying components and the systemic behavior of networks is necessary for a better understanding of gene regulation and pathogenesis. To this end, I have developed systems biology approaches to study transcriptional and post-transcriptional gene regulation in bacteria, with an emphasis on the human pathogen Mycobacterium tuberculosis (Mtb). First, I developed a network response method to identify parts of the Mtb global transcriptional regulatory network utilized by the pathogen to counteract phagosomal stresses and survive within resting macrophages. As a result, the method unveiled transcriptional regulators and associated regulons utilized by Mtb to establish a successful infection of macrophages throughout the first 14 days of infection. Additionally, this network-based analysis identified the production of Fe-S proteins coupled to lipid metabolism through the alkane hydroxylase complex as a possible strategy employed by Mtb to survive in the host. Second, I developed a network inference method to infer the small non-coding RNA (sRNA) regulatory network in Mtb. The method identifies sRNA-mRNA interactions by integrating a priori knowledge of possible binding sites with structure-driven identification of binding sites. The reconstructed network was useful for predicting functional roles for the multitude of sRNAs recently discovered in the pathogen, as several sRNAs were postulated to be involved in virulence-related processes. Finally, I applied a combined experimental and computational approach to study post-transcriptional repression mediated by small non-coding RNAs in bacteria. Specifically, a probabilistic ranking methodology termed rank-conciliation was developed to infer sRNA-mRNA interactions based on multiple types of data. The method was shown to improve target prediction in Escherichia coli, and is therefore useful for prioritizing candidate targets for experimental validation.
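The rank-conciliation method itself is not specified in the abstract; as a purely generic illustration of the underlying idea of combining rankings of candidate sRNA-mRNA targets from multiple evidence types, a simple average-rank baseline looks like this (all scores and evidence names below are invented for the example):

```python
# Generic rank-aggregation baseline (illustrative only; not the abstract's
# rank-conciliation method): combine candidate sRNA-mRNA target rankings
# from several evidence sources into one consensus ordering.
import numpy as np
from scipy.stats import rankdata

# Toy scores for 6 candidate targets from three evidence sources
# (e.g., hybridization energy, target-site accessibility, expression correlation).
scores = {
    "energy":      np.array([-12.1, -8.3, -15.0, -6.2, -10.5, -9.9]),
    "access":      np.array([0.8, 0.4, 0.9, 0.2, 0.6, 0.5]),
    "correlation": np.array([-0.7, -0.2, -0.8, 0.1, -0.5, -0.3]),
}

# Rank each source so rank 1 = strongest evidence, then average the ranks.
ranks = np.vstack([
    rankdata(scores["energy"]),        # lower (more negative) energy is better
    rankdata(-scores["access"]),       # higher accessibility is better
    rankdata(scores["correlation"]),   # stronger negative correlation is better
])
consensus = ranks.mean(axis=0)
print("Candidate order (best first):", np.argsort(consensus))
```

A probabilistic method like the one the abstract describes would replace the plain average with a model of how reliable each evidence type is, but the input/output shape, per-source rankings in, one prioritized candidate list out, is the same.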
Abstract:
In the complex landscape of public education, participants at all levels are searching for policy and practice levers that can raise overall performance and close achievement gaps. The collection of articles in this edition of the Journal of Applied Research on Children takes a big step toward providing the tools and tactics needed for an evidence-based approach to educational policy and practice.
Abstract:
Proviral integration site for Moloney murine leukemia virus (Pim) kinases are Ser/Thr/Tyr kinases. They modulate B-cell development but become oncoproteins and promote cancer development once overexpressed. The three isoforms, Pim-1, -2, and -3, are known to phosphorylate various substrates that regulate transcription, translation, cell cycle, and survival pathways in both hematological and solid tumors. Mantle cell lymphoma (MCL) is an aggressive B-cell lymphoma. Elevated Pim kinase levels are common in MCL and correlate negatively with patient outcome. SGI-1776 is a small molecule inhibitor selective for Pim-1/-3. We hypothesized that SGI-1776 treatment in MCL would inhibit Pim kinase function, and that inhibition of downstream substrate phosphorylation would disrupt transcriptional, translational, and cell cycle processes while promoting apoptosis. SGI-1776 treatment induced moderate to high levels of apoptosis in four MCL cell lines (JeKo-1, Mino, SP-53, and Granta-519) and in peripheral blood mononuclear cells (PBMCs) from MCL patients. Phosphorylation of the transcription and translation regulators c-Myc and 4E-BP1 declined in both model systems. Additionally, levels of short-lived Mcl-1 mRNA and protein also decreased, correlating with the decline of global RNA synthesis. Collectively, our investigations highlight Pim kinases as viable drug targets in MCL and emphasize their roles in transcriptional and translational regulation. We further investigated a combination strategy using SGI-1776 with bendamustine, an FDA-approved DNA-damaging alkylating agent for treating non-Hodgkin’s lymphoma. We hypothesized this combination would enhance SGI-1776-induced transcription and translation inhibition while promoting bendamustine-triggered DNA damage, inducing additive to synergistic cytotoxicity in B-cell lymphoma. Bendamustine alone resulted in moderate levels of apoptosis induction in MCL cell lines (JeKo-1 and Mino) and in MCL and splenic marginal zone lymphoma (a type of B-cell lymphoma) primary cells. An additive effect in cell killing was observed when bendamustine was combined with SGI-1776. As expected, SGI-1776 effectively decreased global RNA and protein synthesis levels, while bendamustine significantly inhibited DNA synthesis and generated a DNA damage response. In combination, intensified inhibitory effects on DNA, RNA, and protein syntheses were observed. Together, these data suggest the feasibility of using a Pim kinase inhibitor in combination with chemotherapeutic agents such as bendamustine in B-cell lymphoma, and provide a foundation for understanding their mechanisms of action in lymphoma cells.
Abstract:
Accurate quantitative estimation of exposure using retrospective data has been one of the most challenging tasks in the exposure assessment field. To improve these estimates, some models have been developed using published exposure databases with their corresponding exposure determinants. These models are designed to be applied to reported exposure determinants obtained from study subjects or to exposure levels assigned by an industrial hygienist, so that quantitative exposure estimates can be obtained.

In an effort to improve the prediction accuracy and generalizability of these models, and taking into account that the limitations encountered in previous studies might be due to limitations in the applicability of traditional statistical methods and concepts, the use of computer science-derived data analysis methods, predominantly machine learning approaches, was proposed and explored in this study.

The goal of this study was to develop a set of models using decision tree/ensemble and neural network methods to predict occupational exposure outcomes based on literature-derived databases, and to compare, using cross-validation and data splitting techniques, the resulting prediction capacity to that of traditional regression models. Two cases were addressed: the categorical case, where the exposure level was measured as an exposure rating following the American Industrial Hygiene Association guidelines, and the continuous case, where the result of the exposure is expressed as a concentration value. Previously developed literature-based exposure databases for 1,1,1-trichloroethane, methylene dichloride, and trichloroethylene were used.

When compared to regression estimations, the results showed better accuracy for decision tree/ensemble techniques in the categorical case, while neural networks were better for estimation of continuous exposure values. Overrepresentation of classes and overfitting were the main causes of poor neural network performance and accuracy. Estimations based on literature-based databases using machine learning techniques might provide an advantage when applied to methodologies that combine 'expert inputs' with current exposure measurements, like the Bayesian Decision Analysis tool. The use of machine learning techniques to more accurately estimate exposures from literature-based exposure databases might represent a starting point for independence from expert judgment.
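As a hedged sketch of the comparison described for the categorical case (synthetic stand-in data, not the study's literature-based databases or its exact models), a tree ensemble and a regression baseline can be scored side by side with cross-validation:

```python
# Hedged sketch (synthetic data; not the study's databases or models):
# cross-validated accuracy of a decision-tree ensemble versus a regression
# baseline for predicting a categorical exposure rating from determinants.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 6))                   # stand-in exposure determinants
y = np.digitize(X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 300),
                bins=[-1.0, 0.0, 1.0])          # 4-level AIHA-style rating

for name, model in [
    ("random forest", RandomForestClassifier(n_estimators=300, random_state=0)),
    ("logistic regression", LogisticRegression(max_iter=1000)),
]:
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: mean 5-fold CV accuracy = {acc:.2f}")
```

For the continuous case the same scaffold applies with a regressor (for example, a neural network such as scikit-learn's MLPRegressor) and an error metric in place of accuracy; the class-imbalance and overfitting problems the abstract reports are exactly what the cross-validation loop is meant to expose.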