Abstract:
Introduction: Gene expression is the process by which a genotype determines an individual cell's phenotype. However, even genetically identical cells display a variety of phenotypes, which may be attributed to differences in their environments. Yet even after controlling for these two factors, individual phenotypes still diverge, due to noisy gene expression. Synthetic gene expression systems allow investigators to isolate, control, and measure the effects of noise on cell phenotypes. I used mathematical and computational methods to design, study, and predict the noise-affected behavior of synthetic gene expression systems in S. cerevisiae. Methods: I created probabilistic biochemical reaction models from the known behaviors of the tetR and rtTA genes, their gene products, and their gene architectures. I then simplified these models to capture the essential behaviors of gene expression systems. Finally, I used these models to predict the behaviors of modified gene expression systems, predictions that were experimentally verified. Results: Cell growth, which is often ignored when formulating chemical kinetics models, was essential for understanding gene expression behavior. Models incorporating growth effects were used to explain unexpected reductions in gene expression noise, to design a set of gene expression systems with "linear" dose-responses, and to quantify the speed with which cells explored their fitness landscapes through noisy gene expression. Conclusions: Models incorporating noisy gene expression and cell division were necessary to design, understand, and predict the behaviors of synthetic gene expression systems. The methods and models developed here will allow investigators to design new gene expression systems more efficiently and to infer gene expression properties of TetR-based systems.
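The abstract describes its probabilistic reaction models only at a high level. As a hedged illustration of the kind of model involved, the sketch below runs a Gillespie stochastic simulation of a single gene product with first-order loss from both degradation and growth-driven dilution; the rate constants and the simple birth-death structure are illustrative assumptions, not the dissertation's actual tetR/rtTA models.

```python
import random

def gillespie_birth_death(k_tx=20.0, k_deg=1.0, k_dil=0.3, t_end=50.0, seed=1):
    """Stochastic simulation of a protein copy number with constant
    synthesis and first-order loss; dilution by cell growth (k_dil)
    acts like an extra degradation channel."""
    random.seed(seed)
    t, n = 0.0, 0
    trajectory = [(t, n)]
    while t < t_end:
        a_birth = k_tx                      # propensity of synthesis
        a_loss = (k_deg + k_dil) * n        # propensity of degradation + dilution
        a_total = a_birth + a_loss
        t += random.expovariate(a_total)    # time to next reaction
        if random.random() < a_birth / a_total:
            n += 1
        else:
            n -= 1
        trajectory.append((t, n))
    return trajectory
```

Note how growth enters the kinetics: the steady-state mean is k_tx / (k_deg + k_dil), so ignoring dilution overestimates expression levels, one reason growth effects mattered in the models above.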
Abstract:
The effectiveness of the Anisotropic Analytical Algorithm (AAA) implemented in the Eclipse treatment planning system (TPS) was evaluated using the Radiological Physics Center anthropomorphic lung phantom with both flattened and flattening-filter-free high-energy beams. Radiation treatment plans were developed following the Radiation Therapy Oncology Group and Radiological Physics Center guidelines for lung treatment using Stereotactic Body Radiation Therapy (SBRT). The tumor was covered such that at least 95% of the Planning Target Volume (PTV) received 100% of the prescribed dose while normal tissue constraints were also met. Calculated doses were exported from the Eclipse TPS and compared with experimental data measured using thermoluminescent detectors (TLD) and radiochromic films placed inside the phantom. The results demonstrate that the AAA superposition-convolution algorithm can calculate SBRT treatment plans for all clinically used photon beams in the range from 6 MV to 18 MV. The measured dose distributions showed good agreement with the calculated distributions using clinically acceptable criteria of ±5% dose difference or 3 mm distance to agreement. These results show that, in a heterogeneous environment, a 3D pencil-beam superposition-convolution algorithm with Monte Carlo pre-calculated scatter kernels, such as AAA, can reliably calculate dose, accounting for the increased lateral scattering caused by the loss of electronic equilibrium in low-density media. The high-energy plans (15 MV and 18 MV) showed very good tumor coverage, in contrast to findings by other investigators for less sophisticated dose calculation algorithms, which delivered lower-than-expected tumor doses and generally worse tumor coverage at high energies compared to 6 MV plans.
This demonstrates that the modern superposition-convolution AAA algorithm is a significant improvement over previous algorithms and can accurately calculate doses for SBRT treatment plans in the highly heterogeneous environment of the thorax for both lower (≤12 MV) and higher (>12 MV) beam energies.
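The ±5% dose / 3 mm distance-to-agreement criterion mentioned above can be made concrete with a simple point-by-point check on a 1D profile. This is a hedged sketch of such a test on toy data; the study's actual analysis used TLD and film measurements in a phantom, not this simplified comparison.

```python
def passes_criteria(measured, calculated, spacing_mm, dose_tol=0.05, dta_mm=3.0):
    """Return the fraction of points on a 1D dose profile meeting a
    dose-difference OR distance-to-agreement (DTA) criterion."""
    n_pass = 0
    reach = int(round(dta_mm / spacing_mm))  # DTA search radius in samples
    for i, m in enumerate(measured):
        c = calculated[i]
        # dose-difference test (fraction of local calculated dose)
        if c > 0 and abs(m - c) / c <= dose_tol:
            n_pass += 1
            continue
        # DTA test: does the calculated profile cross the measured dose
        # between adjacent samples within dta_mm of this position?
        lo, hi = max(0, i - reach), min(len(calculated) - 1, i + reach)
        for j in range(lo, hi):
            a, b = calculated[j], calculated[j + 1]
            if min(a, b) <= m <= max(a, b):
                n_pass += 1
                break
    return n_pass / len(measured)
```

A point in a steep gradient can fail the 5% test yet pass on DTA, which is exactly why the combined criterion is used clinically.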
Abstract:
Measurement of the absorbed dose from ionizing radiation in medical applications is an essential component of safe and reproducible patient care. A wide variety of tools is available for measuring radiation dose; this work focuses on the characterization of two common solid-state dosimeters in medical applications: thermoluminescent dosimeters (TLD) and optically stimulated luminescent dosimeters (OSLD). This work had two main objectives. The first was to evaluate the energy dependence of TLD and OSLD for non-reference measurement conditions in a radiotherapy environment. The second was to fully characterize the OSLD nanoDot in a CT environment and to provide validated calibration procedures for CT dose measurement using OSLD. Current protocols for dose measurement using TLD and OSLD generally assume a constant photon energy spectrum within a nominal beam energy, regardless of measurement location, tissue composition, or changes in beam parameters. Variations in the energy spectrum of therapeutic photon beams may affect the response of TLD and OSLD and could thereby yield an incorrect measure of dose unless these differences are accounted for. In this work, we used a Monte Carlo-based model to simulate variations in the photon energy spectra of a Varian 6 MV beam, and then evaluated the impact of these spectral perturbations on the response of both TLD and OSLD using Burlin cavity theory. Energy response correction factors were determined for a range of conditions and agreed well with measured correction factors. When OSLD are used for dose measurement in a diagnostic imaging environment, calibrations are often referenced to a therapy-energy or orthovoltage photon beam (commonly 250 kVp, Co-60, or even 6 MV) whose spectra are substantially different.
Appropriate calibration techniques specifically for the OSLD nanoDot in a CT environment have not been presented in the literature; furthermore, the dependence of the energy response on the calibration energy has not been emphasized. The results of this work include detailed calibration procedures for CT dosimetry using OSLD and a full characterization of this dosimetry system in a low-dose, low-energy setting.
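Burlin cavity theory, used above to relate spectral perturbations to detector response, interpolates between the Bragg-Gray (small-cavity) and photon-detector (large-cavity) limits. In standard textbook notation (this is the general form, not a result specific to this work):

```latex
\frac{\bar{D}_{\mathrm{cav}}}{D_{\mathrm{med}}}
  = d\,\bar{s}_{\mathrm{cav,med}}
  + (1-d)\left(\frac{\overline{\mu}_{en}}{\rho}\right)_{\mathrm{cav,med}}
```

Here d is the Burlin cavity-size parameter: as d → 1 the mass collision stopping-power ratio dominates (Bragg-Gray limit), and as d → 0 the mass energy-absorption coefficient ratio dominates (large-cavity limit). Both ratios are spectrum-weighted averages, which is why perturbing the photon energy spectrum changes the detector's energy response.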
Abstract:
Invited commentary on "Child Welfare Workers’ Perceptions of the Influence of the Organizational Environment on Permanency Decisions for Families".
Abstract:
Examined in this study are child welfare caseworkers' perceptions of the extent to which the organizational environment influences the permanency decisions they make to reunify families or terminate the parental rights of children placed out-of-home. The study includes a sample of 95 child welfare social workers employed in three public child welfare agencies in the Baltimore and Washington, DC metropolitan area. It used a cross-sectional research design, employing a survey instrument to examine bureaucratic distraction, role conflict, and supervisory adequacy as contextual factors in the organizational environment's influence on permanency outcome decisions. The findings suggest that while child welfare workers are consistently distracted by competing priorities arising from unexpected events, most remain committed, and that understanding their perspectives is more inclusive and may improve retention rates. Notably, while it is recognized that permanency decisions are not made in an intellectual, legal, or clinical vacuum, and that certain traditional aspects of the bureaucratic structure do not affect decision making, this study advances the body of knowledge on child welfare decision making. Implications are drawn for child welfare policy, practice, and research.
Abstract:
The purpose of this thesis was to investigate the association between parent acculturation and parental fruit and vegetable intake, child fruit and vegetable intake, and child access and availability to fruits and vegetables. Secondary data analysis was performed on a convenience sample of low-income Hispanic-identifying parents (n = 177) and their children from a baseline survey of the Sprouting Healthy Kids intervention. The association between parent acculturation status (acculturated or non-acculturated) and the fruit intake, vegetable intake, and combined fruit and vegetable intake (FVI) of both the parent and the child was examined using t-tests. t-tests were also used to determine the relationship between parent acculturation and child access and availability to fruits, vegetables, and combined fruits and vegetables. Statistical significance was set at a p level of 0.05. Mean FVI for parents and children was 3.41 servings and 2.96 servings, respectively. Statistically significant relationships were found between parent acculturation and parent fruit intake, and between parent acculturation and child fruit access. Lower parent acculturation was significantly related to higher fruit intake. Counter to the hypothesis, higher acculturation was associated with greater access to fruits for the child. These findings suggest the need not only for culturally specific nutrition interventions, but for interventions that target behaviors at specific levels of acculturation within a culture.
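The t-tests above can be reproduced in a few lines. The sketch below computes Welch's t statistic and its degrees of freedom; the thesis does not state which t-test variant was used, so the unequal-variance (Welch) form is an assumption, and in practice the p-value would then be read from a t-distribution (e.g. via scipy.stats).

```python
import math

def welch_t(sample_a, sample_b):
    """Welch's two-sample t statistic and degrees of freedom
    (no equal-variance assumption)."""
    na, nb = len(sample_a), len(sample_b)
    ma = sum(sample_a) / na
    mb = sum(sample_b) / nb
    # unbiased sample variances
    va = sum((x - ma) ** 2 for x in sample_a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in sample_b) / (nb - 1)
    se2 = va / na + vb / nb
    t = (ma - mb) / math.sqrt(se2)
    # Welch-Satterthwaite degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df
```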
Abstract:
Cells govern their activities and modulate their interactions with the environment to achieve homeostasis. The heat shock response (HSR) is one of the most well-studied fundamental cellular responses to environmental and physiological challenges, resulting in the rapid synthesis of heat shock proteins (HSPs), which serve to protect cellular constituents from the deleterious effects of stress. In addition to its role in cytoprotection, the HSR also influences lifespan and is associated with a variety of human conditions including cancer, aging, and neurodegenerative disorders. In most eukaryotes, the HSR is primarily mediated by the highly conserved transcription factor HSF1, which recognizes target hsp genes by binding to heat shock elements (HSEs) in their promoters. In recent years, significant efforts have been made to identify small molecules as potential pharmacological activators of HSF1 that could provide therapeutic benefit in the treatment of human diseases of protein conformation. However, the detailed mechanisms through which these molecules drive HSR activation remain unclear. In this work, I utilized the baker's yeast Saccharomyces cerevisiae as a model system to identify a group of thiol-reactive molecules, including oxidants, transition metals and metalloids, and electrophiles, as potent activators of yeast Hsf1. Using an artificial HSE-lacZ reporter and the glucocorticoid receptor (GR) system, these diverse thiol-reactive compounds are shown to activate Hsf1 and inhibit Hsp90 chaperone complex activity in a reciprocal, dose-dependent manner. To further test whether cells sense these reactive compounds through the accumulation of unfolded proteins, the proline analog azetidine-2-carboxylic acid (AZC) and the protein cross-linker dithiobis(succinimidyl propionate) (DSP) were used to force misfolding of nascent polypeptides and existing cytosolic proteins, respectively.
Both unfolding reagents display kinetic HSP induction profiles dissimilar to those generated by thiol-reactive compounds. Moreover, AZC treatment leads to significant cytotoxicity, which is not observed with the thiol-reactive compounds at concentrations sufficient to induce Hsf1. Additionally, DSP treatment has little to no effect on Hsp90 functions. Together with ultracentrifugation analysis of cell lysates, which detected no insoluble protein aggregates, my data suggest that at concentrations sufficient to induce Hsf1, thiol-reactive compounds do not induce the HSR via a mechanism based on the accumulation of unfolded cytosolic proteins. Another possibility is that thiol-reactive compounds influence aspects of the protein quality control system, such as the ubiquitin-proteasome system (UPS). To address this hypothesis, β-galactosidase reporter fusions were used as model substrates to demonstrate that thiol-reactive compounds inhibit neither ubiquitin-activating enzymes (E1) nor proteasome activity; therefore, thiol-reactive compounds do not activate the HSR by inhibiting UPS-dependent protein degradation. I next hypothesized that these molecules may directly inactivate protein chaperones, which are known repressors of Hsf1. To address this possibility, a thiol-reactive biotin probe was used to demonstrate in vitro that the yeast cytosolic Hsp70 Ssa1, which partners with Hsp90 to repress Hsf1, is specifically modified. Strikingly, mutation of conserved cysteine residues in Ssa1 renders cells insensitive to Hsf1 activation by cadmium and celastrol, but not by heat shock. Conversely, substitution with aspartic acid, a sulfinic acid and steric bulk mimic, led to constitutive activation of Hsf1.
Cysteine 303, located in the nucleotide-binding/ATPase domain of Ssa1, was shown to be modified in vivo by a model organic electrophile using Click chemistry, verifying that Ssa1 is a direct target of thiol-reactive compounds through adduct formation. Consistently, cadmium pretreatment promoted cell thermotolerance, an effect abolished in cells carrying SSA1 cysteine mutant alleles. Taken together, these findings demonstrate that Hsp70 acts as a sensor that induces the cytoprotective heat shock response upon exposure to environmental or endogenously produced thiol-reactive molecules, and that it can discriminate between two distinct environmental stressors.
Abstract:
Clinical text understanding (CTU) is of interest to health informatics because critical clinical information, frequently represented as unconstrained text in electronic health records, is extensively used by human experts to guide clinical practice and decision making and to document the delivery of care, but is largely unusable by information systems for queries and computations. Recent initiatives advocating for translational research call for technologies that can integrate structured clinical data with unstructured data, provide a unified interface to all data, and contextualize clinical information for reuse in the multidisciplinary and collaborative environment envisioned by the CTSA program. This implies that technologies for the processing and interpretation of clinical text should be evaluated not only in terms of their validity and reliability in their intended environment, but also in light of their interoperability and their ability to support information integration and contextualization in a distributed and dynamic environment. This vision adds a new layer of information representation requirements that must be accounted for when conceptualizing the implementation or acquisition of clinical text processing tools and technologies for multidisciplinary research. On the other hand, electronic health records frequently contain unconstrained clinical text with high variability in the use of terms and documentation practices, and without commitment to the grammatical or syntactic structure of the language (e.g., triage notes, physician and nurse notes, chief complaints). This hinders the performance of natural language processing technologies, which typically rely heavily on the syntax and grammatical structure of the text.
This document introduces our method for transforming the unconstrained clinical text found in electronic health information systems into a formal (computationally understandable) representation that is suitable for querying, integration, contextualization, and reuse, and that is resilient to the grammatical and syntactic irregularities of clinical text. We present our design rationale, our method, and the results of an evaluation on chief complaints and triage notes from 8 different emergency departments in Houston, Texas. Finally, we discuss the significance of our contribution in enabling the use of clinical text in a practical biosurveillance setting.
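As a minimal, hypothetical illustration of the kind of normalization that must precede any formal representation of chief complaints: the abbreviations, lexicon, and output strings below are invented for the example; the dissertation's actual method maps text to a formal, queryable representation (real systems would target standard vocabularies such as UMLS or SNOMED), not merely to expanded strings.

```python
import re

# Toy lexicon of emergency-department shorthand (illustrative only).
LEXICON = {
    "sob": "shortness of breath",
    "c/o": "complains of",
    "n/v": "nausea and vomiting",
    "abd": "abdominal",
}

def normalize_complaint(text):
    """Lowercase, strip punctuation noise, and expand known abbreviations,
    tolerating the missing grammar typical of chief-complaint fields."""
    tokens = re.findall(r"[a-z/]+", text.lower())
    return " ".join(LEXICON.get(tok, tok) for tok in tokens)
```

Because the approach is token- and dictionary-driven rather than parse-driven, it does not break when the input lacks sentence structure, which is the resilience property the abstract emphasizes.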
Abstract:
Interruption is a known human factor that contributes to errors and catastrophic events in healthcare as well as in other high-risk industries. The landmark Institute of Medicine (IOM) report, To Err is Human, brought attention to the significance of preventable errors in medicine and suggested that interruptions could be a contributing factor. Previous studies of interruptions in healthcare did not offer a conceptual model by which to study them. Given the serious consequences of interruptions investigated in other high-risk industries, there is a need for a model to describe, understand, explain, and predict interruptions and their consequences in healthcare. The purpose of this study was therefore to develop a model grounded in the literature and to use it to describe and explain interruptions in healthcare, specifically those occurring in a Level One Trauma Center. A trauma center was chosen because this environment is characterized as intense, unpredictable, and interrupt-driven. The first step in developing the model was a review of the literature, which revealed that the concept of interruption did not have a consistent definition in either the healthcare or the non-healthcare literature. Walker and Avant's method of concept analysis was used to clarify and define the concept. The analysis led to the identification of five defining attributes: (1) a human experience, (2) an intrusion of a secondary, unplanned, and unexpected task, (3) discontinuity, (4) external or internal initiation, and (5) situation within a context. However, before an interruption can commence, five conditions known as antecedents must occur.
For an interruption to take place, (1) an intent to interrupt is formed by the initiator, (2) a physical signal must pass a threshold test of detection by the recipient, (3) the sensory system of the recipient is stimulated to respond to the initiator, (4) an interruption task is presented to the recipient, and (5) the interruption task is either accepted or rejected by the recipient. An interruption was determined to be quantifiable by (1) the frequency of occurrence of interruptions, (2) the number of times the primary task has been suspended to perform an interrupting task, (3) the length of time the primary task has been suspended, and (4) the frequency of returning or not returning to the primary task. As a result of the concept analysis, a definition of an interruption was derived from the literature: an interruption is a break in the performance of a human activity, initiated internal or external to the recipient and occurring within the context of a setting or location. This break results in the suspension of the initial task through the performance of an unplanned task, with the assumption that the initial task will be resumed. The definition is inclusive of all the defining attributes of an interruption and can serve as a standard definition for the healthcare industry. From the definition, a visual model of an interruption was developed. The model was used to describe and explain the interruptions recorded in an instrumental case study of physicians and registered nurses (RNs) working in a Level One Trauma Center. Five physicians were observed for a total of 29 hours, 31 minutes. Eight registered nurses were observed for a total of 40 hours, 9 minutes. Observations were made on either the 0700–1500 or the 1500–2300 shift using the shadowing technique and were recorded as field notes. The field notes were analyzed by a hybrid method of categorizing activities and interruptions.
The method was developed using both a deductive a priori classification framework and an inductive process employing line-by-line coding and constant comparison, as described in Grounded Theory. The following categories were identified as relevant to this study:

- Intended Recipient: the person to be interrupted
- Unintended Recipient: not the intended recipient of an interruption; e.g., receiving a phone call that was incorrectly dialed
- Indirect Recipient: the incidental recipient of an interruption; e.g., talking with another person, thereby suspending the original activity
- Recipient Blocked: the intended recipient does not accept the interruption
- Recipient Delayed: the intended recipient postpones an interruption
- Self-interruption: a person, independent of another person, suspends one activity to perform another; e.g., while walking, stops abruptly and talks to another person
- Distraction: briefly disengaging from a task
- Organizational Design: the physical layout of the workspace that causes a disruption in workflow
- Artifacts Not Available: supplies and equipment that are not available in the workspace, causing a disruption in workflow
- Initiator: a person who initiates an interruption

Interruption by Organizational Design and Artifacts Not Available were identified as two new categories of interruption that had not previously been cited in the literature. Analysis of the observations indicated that physicians performed slightly fewer activities per hour than RNs, a variance that may be attributed to their differing roles and responsibilities. Physicians had more activities interrupted than RNs; however, RNs experienced more interruptions per hour. Other people were the most common medium through which an interruption was delivered; additional mediums included the telephone, pager, and one's self.
Both physicians and RNs were observed to resume an original interrupted activity more often than not. In most cases, both physicians and RNs performed only one or two interrupting activities before returning to the original interrupted activity. In conclusion, the model was found to explain all interruptions observed during the study; however, establishing its predictive value will require a more comprehensive study.
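The four quantifiable properties identified in the concept analysis can be computed from a simple chronological event log. This sketch assumes a toy log format of (minutes, kind) tuples, which is an invention for illustration, not the study's field-note format.

```python
def interruption_metrics(events):
    """events: chronological (timestamp_minutes, kind) tuples, where kind is
    "suspend" (primary task interrupted), "resume" (primary task resumed),
    or "abandon" (primary task never resumed)."""
    suspends = sum(1 for _, k in events if k == "suspend")
    resumes = sum(1 for _, k in events if k == "resume")
    abandons = sum(1 for _, k in events if k == "abandon")
    # (3) total time the primary task spent suspended: pair each suspend
    # with the next resume or abandon
    suspended_time, open_t = 0.0, None
    for t, k in events:
        if k == "suspend":
            open_t = t
        elif k in ("resume", "abandon") and open_t is not None:
            suspended_time += t - open_t
            open_t = None
    return {
        "interruption_count": suspends,        # (1) frequency of occurrence
        "suspensions": suspends,               # (2) identical in this toy log;
                                               # can differ with nested tasks
        "time_suspended_min": suspended_time,  # (3) length of suspension
        "return_rate": resumes / max(1, resumes + abandons),  # (4)
    }
```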
Abstract:
Essential biological processes are governed by organized, dynamic interactions between multiple biomolecular systems. Complexes are formed to enable a biological function and are disassembled when the process is complete. Examples of such processes include the translation of messenger RNA into protein by the ribosome, the folding of proteins by chaperonins, and the entry of viruses into host cells. Understanding these fundamental processes by characterizing the molecular mechanisms that enable them would allow the better design of therapies and drugs. Such molecular mechanisms may be revealed through the structural elucidation of the biomolecular assemblies at the core of these processes. Various experimental techniques may be applied to investigate the molecular architecture of biomolecular assemblies. High-resolution techniques, such as X-ray crystallography, may solve the atomic structure of the system, but are typically constrained to biomolecules of limited flexibility and dimensions. In particular, X-ray crystallography requires the sample to form a three-dimensional (3D) crystal lattice, which is technically difficult, if not impossible, to obtain, especially for large, dynamic systems. Often these techniques solve the structures of the different constituent components of an assembly, but encounter difficulties when investigating the entire system. On the other hand, imaging techniques such as cryo-electron microscopy (cryo-EM) are able to depict large systems in a near-native environment without requiring the formation of crystals. The structures solved by cryo-EM cover a wide range of resolutions, from very low levels of detail, where only the overall shape of the system is visible, to high resolutions that approach, but do not yet reach, atomic level of detail.
In this dissertation, several modeling methods are introduced either to integrate cryo-EM datasets with structural data from X-ray crystallography or to directly interpret the cryo-EM reconstruction. These computational techniques were developed with the goal of creating an atomic model for the cryo-EM data. Low-resolution reconstructions lack the level of detail to permit a direct atomic interpretation; i.e., one cannot reliably locate the atoms or amino-acid residues within the structure obtained by cryo-EM. One therefore needs to consider additional information, such as structural data from other sources like X-ray crystallography, to enable a high-resolution interpretation. Modeling techniques are thus developed to integrate structural data from different biophysical sources; examples include the work described in manuscripts I and II of this dissertation. At intermediate and high resolution, cryo-EM reconstructions depict consistent 3D folds, such as tubular features that generally correspond to alpha-helices. Such features can be annotated and later used to build the atomic model of the system, as in manuscript III. Three manuscripts are presented as part of this PhD dissertation, each introducing a computational technique that facilitates the interpretation of cryo-EM reconstructions. The first manuscript is an application paper that describes a heuristic to generate an atomic model for the protein envelope of the Rift Valley fever virus. The second manuscript introduces evolutionary tabu search strategies to enable the integration of multiple component atomic structures with the cryo-EM map of their assembly. Finally, the third manuscript develops this technique further and applies it to annotate consistent 3D patterns in intermediate-resolution cryo-EM reconstructions.
The first manuscript, titled "An assembly model for Rift Valley fever virus," was submitted for publication in the Journal of Molecular Biology. The cryo-EM structure of the Rift Valley fever virus was previously solved at 27 Å resolution by Dr. Freiberg and collaborators. This reconstruction shows the overall shape of the virus envelope, but its reduced level of detail prevents a direct atomic interpretation. High-resolution structures are not yet available for the entire virus, nor for the two component glycoproteins that form its envelope. However, homology models may be generated for these glycoproteins based on similar structures that are available at atomic resolution. The manuscript presents the steps required to identify an atomic model of the entire virus envelope, based on the low-resolution cryo-EM map of the envelope and the homology models of the two glycoproteins. Starting from the results of an exhaustive search to place the two glycoproteins, the model is built iteratively by running multiple multi-body refinements to hierarchically generate models for the different regions of the envelope. The resulting atomic model is supported by prior knowledge of virus biology and contains valuable information about the molecular architecture of the system. It provides the basis for further investigations seeking to reveal the different processes in which the virus is involved, such as assembly or fusion. The second manuscript was recently published in the Journal of Structural Biology (doi:10.1016/j.jsb.2009.12.028) under the title "Evolutionary tabu search strategies for the simultaneous registration of multiple atomic structures in cryo-EM reconstructions." This manuscript introduces evolutionary tabu search strategies for multi-body registration: a hybrid approach that combines a genetic algorithm with a tabu search strategy to promote the proper exploration of the high-dimensional search space.
As with the Rift Valley fever virus, it is common that the structure of a large multi-component assembly is available at low resolution from cryo-EM, while high-resolution structures are solved for the individual components but not for the entire system. Evolutionary tabu search strategies enable the building of an atomic model for the entire system by considering all components simultaneously. Such a registration indirectly introduces spatial constraints, since all components need to be placed within the assembly, enabling their proper docking into the low-resolution map of the entire assembly. Along with the method description, the manuscript covers validation, presenting the benefit of the technique in both synthetic and experimental test cases. The approach successfully docked multiple components at resolutions up to 40 Å. The third manuscript, entitled "Evolutionary Bidirectional Expansion for the Annotation of Alpha Helices in Electron Cryo-Microscopy Reconstructions," was submitted for publication in the Journal of Structural Biology. The modeling approach described in this manuscript applies the evolutionary tabu search strategies in combination with a bidirectional expansion to annotate secondary structure elements in intermediate-resolution cryo-EM reconstructions. In particular, secondary structure elements such as alpha helices show consistent patterns in cryo-EM data and are visible as rod-like regions of high density. The evolutionary tabu search strategy is applied to identify the placement of the different alpha helices, while the bidirectional expansion characterizes their length and curvature. The manuscript presents the validation of the approach at resolutions ranging between 6 and 14 Å, a level of detail at which alpha helices are visible. Up to a resolution of 12 Å, the method achieves sensitivities between 70 and 100% in experimental test cases, i.e.,
70–100% of the alpha-helices were correctly predicted in an automatic manner in the experimental data. The three manuscripts presented in this PhD dissertation cover different computational methods for the integration and interpretation of cryo-EM reconstructions. The methods were developed in the molecular modeling software Sculptor (http://sculptor.biomachina.org) and are available to the scientific community interested in multi-resolution modeling of cryo-EM data. The work spans a wide range of resolutions, covering multi-body refinement and registration at low resolution along with annotation of consistent patterns at higher resolution. Such methods are essential for the modeling of cryo-EM data and may be applied in other fields where similar spatial problems are encountered, such as medical imaging.
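The tabu search at the heart of manuscripts II and III can be sketched in a few lines. This toy version optimizes a 1D placement score; it omits the genetic-algorithm component and the real scoring against a cryo-EM density map, and only illustrates the short-term memory of recently visited solutions that lets the search escape local optima.

```python
def tabu_search(score, start, neighbors, iterations=100, tabu_size=5):
    """Minimal tabu search: greedy moves over a neighborhood, forbidding
    recently visited solutions so the search can climb out of local optima."""
    current = best = start
    tabu = [start]                      # short-term memory of visited states
    for _ in range(iterations):
        candidates = [s for s in neighbors(current) if s not in tabu]
        if not candidates:
            break
        current = max(candidates, key=score)  # best non-tabu move, even if worse
        tabu.append(current)
        if len(tabu) > tabu_size:
            tabu.pop(0)                 # oldest entries become revisitable
        if score(current) > score(best):
            best = current
    return best

# Toy 1D "placement" problem standing in for a rigid-body position score:
# the true optimum is at x = 7.
score = lambda x: -(x - 7) ** 2
neighbors = lambda x: [x - 1, x + 1]
```

In the real registration problem the state is a set of rigid-body poses for all components and the score is the fit to the density map, but the memory mechanism is the same.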
Abstract:
My dissertation focuses on developing methods for detecting gene-gene/gene-environment interactions and imprinting effects for human complex diseases and quantitative traits. It includes three sections: (1) generalizing the coding technique of the Natural and Orthogonal Interaction (NOIA) model to gene-gene (GxG) interactions and to reduced models; (2) developing a novel statistical approach for modeling gene-environment (GxE) interactions influencing disease risk; and (3) developing a statistical approach for modeling genetic variants displaying parent-of-origin effects (POEs), such as imprinting. In the past decade, genetic researchers have identified a large number of causal variants for human genetic diseases and traits by single-locus analysis, and interaction has become a central topic in the search for the complex networks of multiple genes or environmental exposures contributing to an outcome. Epistasis, also known as gene-gene interaction, is the departure from additivity of the genetic effects of several genes on a trait, meaning that the same alleles of one gene can display different genetic effects under different genetic backgrounds. In this study, we propose to implement the NOIA model for association studies with interaction for human complex traits and diseases. We compare the performance of the new statistical models we developed with that of the usual functional model in both simulation studies and real data analysis. Both revealed the higher power of the NOIA GxG interaction model for detecting both main genetic effects and interaction effects. Through application to a melanoma dataset, we confirmed the previously identified significant regions for melanoma risk at 15q13.1, 16q24.3, and 9p21.3, and identified potential interactions with these regions that contribute to melanoma risk.
Based on the NOIA model, we developed a novel statistical approach that allows us to model the effects of a genetic factor and a binary environmental exposure jointly influencing disease risk. Both simulation and real data analyses revealed higher power of the NOIA model for detecting both main genetic effects and interaction effects for quantitative and binary traits. We also found that, for binary traits, estimates of the parameters from logistic regression are no longer statistically uncorrelated under the alternative model when there is an association. Applying our novel approach to a lung cancer dataset, we confirmed that four SNPs in the 5p15 and 15q25 regions are significantly associated with lung cancer risk in the Caucasian population: rs2736100, rs402710, rs16969968, and rs8034191. We also validated that rs16969968 and rs8034191 in the 15q25 region interact significantly with smoking in the Caucasian population, and our approach identified a potential interaction of SNP rs2256543 in 6p21 with smoking in contributing to lung cancer risk. Genetic imprinting is the best-known cause of parent-of-origin effects (POEs), whereby a gene is differentially expressed depending on the parental origin of the same alleles. Genetic imprinting affects several human disorders, including diabetes, breast cancer, alcoholism, and obesity, and has been shown to be important for normal embryonic development in mammals. Traditional association approaches ignore this important genetic phenomenon. In this study, we propose a NOIA framework for single-locus association studies that estimates both main allelic effects and POEs. We develop statistical (Stat-POE) and functional (Func-POE) models, and demonstrate conditions for orthogonality of the Stat-POE model. We conducted simulations for both quantitative and qualitative traits to evaluate the performance of the statistical and functional models with different levels of POEs.
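The GxE analysis for a binary trait reduces, in its simplest functional form, to a logistic regression with main-effect and product terms, with the interaction tested by a likelihood-ratio statistic. Below is a self-contained sketch on simulated data; the effect sizes, sample size, and variable names are illustrative only and not the dissertation's actual model or data.

```python
import numpy as np
from math import erfc, sqrt

def fit_logistic(X, y, iters=50):
    """Plain Newton-Raphson logistic regression; returns (beta, log-likelihood)."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1.0 - p)                      # IRLS weights
        H = X.T @ (X * W[:, None])             # observed information
        beta += np.linalg.solve(H, X.T @ (y - p))
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    ll = np.sum(y * np.log(p) + (1 - y) * np.log1p(-p))
    return beta, ll

# Simulated cohort: genotype count g (0/1/2), binary exposure e (e.g. smoking),
# with a true GxE interaction of 0.6 on the log-odds scale (illustrative values).
rng = np.random.default_rng(1)
n = 5000
g = rng.binomial(2, 0.3, n)
e = rng.binomial(1, 0.5, n)
logit = -1.0 + 0.2 * g + 0.3 * e + 0.6 * g * e
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

ones = np.ones(n)
X_full = np.column_stack([ones, g, e, g * e])   # with interaction term
X_null = np.column_stack([ones, g, e])          # main effects only
_, ll_full = fit_logistic(X_full, y)
_, ll_null = fit_logistic(X_null, y)
lr = 2.0 * (ll_full - ll_null)        # ~ chi-square, 1 df, under no interaction
p_value = erfc(sqrt(lr / 2.0))        # chi2(1) survival function
print(round(lr, 1), p_value)          # lr > 3.84 rejects at the 5% level
```

The NOIA version of this test differs only in how the genotype columns are coded (orthogonal scores instead of raw allele counts), which leaves the main-effect estimates stable when interaction terms are added or dropped.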
Our results showed that the newly proposed Stat-POE model, which ensures orthogonality of the variance components if Hardy-Weinberg equilibrium (HWE) holds or the minor and major allele frequencies are equal, had greater power for detecting the main allelic additive effect than the Func-POE model, which codes according to allelic substitutions, for both quantitative and qualitative traits. The power for detecting the POE was the same for the Stat-POE and Func-POE models under HWE for quantitative traits.
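The orthogonality condition stated above can be checked numerically with a simple parent-of-origin coding: order the heterozygotes by parental origin of the minor allele and contrast them. The codings below are a generic illustration of such a POE score under HWE, not the dissertation's actual Stat-POE formulas.

```python
import numpy as np

# Ordered genotypes (maternal allele, paternal allele): aa, aA, Aa, AA.
# Under HWE with allele-A frequency q, the ordered genotype frequencies are:
q = 0.3
freq = np.array([(1 - q) ** 2, (1 - q) * q, q * (1 - q), q ** 2])

x_add = np.array([0.0, 1.0, 1.0, 2.0]) - 2 * q  # centered allele count
x_poe = np.array([0.0, -1.0, 1.0, 0.0])         # contrasts the two heterozygotes

print(np.dot(freq, x_poe))          # POE score has mean 0 under HWE
print(np.dot(freq, x_add * x_poe))  # and is uncorrelated with the additive score
```

Because the two ordered heterozygote frequencies are equal under HWE (both q(1-q)), the contrast cancels exactly, which is why the additive and POE estimates separate cleanly in the Stat-POE setting; when HWE fails and the allele frequencies differ, the scores become correlated and the orthogonality guarantee is lost.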
Abstract:
Hemophilia A is a clotting disorder caused by functional factor VIII (FVIII) deficiency. About 25% of patients treated with therapeutic recombinant FVIII develop antibodies (inhibitors) that render subsequent FVIII treatments ineffective. The immune mechanisms of inhibitor formation are not entirely understood, but circumstantial evidence indicates a role for an increased inflammatory response, possibly via stimulation of Toll-like receptors (TLRs), at the time of FVIII immunization. I hypothesized that stimulation through TLR4 in conjunction with FVIII treatments would increase the formation of FVIII inhibitors. To test this hypothesis, FVIII knockout mice were injected with recombinant human FVIII with or without concomitant doses of a TLR4 agonist (lipopolysaccharide; LPS). The addition of LPS to FVIII significantly increased the rate and extent of production of anti-FVIII IgG antibodies and neutralizing FVIII inhibitors. In the spleen, repeated in vivo TLR4 stimulation with LPS increased the relative percentage of macrophages and dendritic cells (DCs) over the course of four injections, while repeated in vivo FVIII stimulation significantly increased the density of TLR4 expressed on the surface of all splenic antigen-presenting cells (APCs). Culture of splenocytes isolated from these mice revealed that combined stimulation with LPS and FVIII also synergistically increased early secretion of the inflammatory cytokines IL-6, TNF-α, and IL-10, an increase that was not maintained over the course of the repeated injections. While cytokine secretion was relatively unchanged in response to FVIII re-stimulation in culture, LPS re-stimulation in culture induced increased and prolonged inflammatory cytokine secretion; re-stimulation with both LPS and FVIII induced cytokine secretion similar to LPS stimulation alone. Interestingly, long-term treatment of mice with LPS alone resulted in splenocytes with a reduced response to FVIII in culture.
Together, these results indicated that creating a pro-inflammatory environment through combined stimulation with chronic, low-dose LPS and FVIII changed not only the populations but also the repertoire of APCs in the spleen, triggering increased production of FVIII inhibitors. These results suggested that an anti-inflammatory regimen should be instituted for all hemophilia A patients to reduce or delay the formation of FVIII inhibitors during replacement therapy.
Abstract:
Tumor growth often outpaces vascularization, leading to the development of a hypoxic tumor microenvironment. In response, an intracellular hypoxia survival pathway is initiated by heterodimerization of hypoxia-inducible factor (HIF)-1α and HIF-1β, which subsequently upregulates the expression of several hypoxia-inducible genes, promotes cell survival, and stimulates angiogenesis in the oxygen-deprived environment. Hypoxic tumor regions are often associated with resistance to various classes of radio- or chemotherapeutic agents. Therefore, the development of HIF-1α/β heterodimerization inhibitors may provide a novel approach to anti-cancer therapy. To this end, a novel approach for imaging HIF-1α/β heterodimerization in vitro and in vivo was developed in this study. Using this screening platform, we identified a promising lead candidate and chemically derivatized it to assess the structure-activity relationship (SAR). The most effective first-generation inhibitors were selected, and their pharmacodynamics and anti-tumor efficacy in vivo were verified by bioluminescence imaging (BLI) of HIF-1α/β heterodimerization in a xenograft tumor model. Furthermore, the first-generation inhibitors M-TMCP and D-TMCP demonstrated efficacy as monotherapies, resulting in tumor growth inhibition via disruption of HIF-1 signaling-mediated tumor stromal neoangiogenesis.
Abstract:
Autophagy is an evolutionarily conserved process that maintains homeostasis and provides energy for cell survival during nutrient deprivation and environmental stress by delivering cytoplasmic contents to the lysosomes for recycling and energy generation. Dysregulation of this process has been linked to human diseases including immune disorders, neurodegenerative muscular diseases, and cancer. Autophagy is a double-edged sword in that it has both pro-survival and pro-death roles in cancer cells. Its tumor-suppressive roles include the clearance of damaged organelles, which could otherwise lead to inflammation and thereby promote tumorigenesis. In its pro-survival role, autophagy allows cancer cells to overcome cytotoxic stresses generated by the cancer environment or by cancer treatments such as chemotherapy, and thus to evade cell death. A better understanding of how drugs that perturb autophagy affect cancer cell signaling is of critical importance for improving the cancer treatment arsenal. To gain insight into the relationship between autophagy and drug treatments, we conducted a high-throughput drug screen to identify autophagy modulators. Our screen used image-based fluorescence microscopy for single-cell analysis to identify chemical perturbants of the autophagic process. Phenothiazines emerged as the largest family of drugs that alter the autophagic process, increasing LC3-II puncta levels in different cancer cell lines. In addition, we observed multiple biological effects in cancer cells treated with phenothiazines; these antitumorigenic effects include decreased cell migration, cell viability, and ATP production, along with abortive autophagy. Our studies highlight the potential role of phenothiazines as agents for combination therapy with other chemotherapeutic agents in the treatment of different cancers.