970 results for Ligand-based methodologies
Abstract:
The human body is composed of a huge number of cells acting together in a concerted manner. The current understanding is that proteins perform most of the activities necessary to keep a cell alive. DNA, on the other hand, stores in the genome the information on how to produce the different proteins. Regulating gene transcription is thus the first important step that can affect the life of a cell, modify its functions and its responses to the environment. Regulation is a complex operation that involves specialized proteins, the transcription factors. Transcription factors (TFs) can bind to DNA and activate the processes leading to the expression of genes into new proteins. Errors in this process may lead to diseases. In particular, some transcription factors have been associated with a lethal pathological state, commonly known as cancer, characterized by uncontrolled cellular proliferation, invasiveness of healthy tissues and abnormal responses to stimuli. Understanding cancer-related regulatory programs is a difficult task, often involving several TFs interacting together and influencing each other's activity. This thesis presents new computational methodologies to study gene regulation. In addition, we present applications of our methods to the understanding of cancer-related regulatory programs. The understanding of transcriptional regulation is a major challenge. We address this difficult question by combining computational approaches with large collections of heterogeneous experimental data. In detail, we design signal processing tools to recover transcription factor binding sites on the DNA from genome-wide surveys such as chromatin immunoprecipitation assays on tiling arrays (ChIP-chip). We then use the localization of TF binding to explain the expression levels of regulated genes. In this way we identify a regulatory synergy between two TFs, the oncogene C-MYC and SP1. C-MYC and SP1 bind preferentially at promoters, and when SP1 binds next to C-MYC on the DNA, the nearby gene is strongly expressed. The association between the two TFs at promoters is reflected in the conservation of their binding sites across mammals and in the permissive underlying chromatin states; it represents an important control mechanism involved in cellular proliferation, and thereby in cancer. Secondly, we identify the characteristics of the target genes of the TF estrogen receptor alpha (hERα) and we study the influence of hERα in regulating transcription. Upon estrogen signaling, hERα binds to DNA to regulate transcription of its targets in concert with its co-factors. To overcome the scarcity of experimental data about the binding sites of other TFs that may interact with hERα, we conduct an in silico analysis of the sequences underlying the ChIP sites using the position weight matrices (PWMs) of the hERα partners FOXA1 and SP1. We combine ChIP-chip and ChIP-paired-end diTag (ChIP-PET) data about hERα binding on DNA with this sequence information to explain gene expression levels in a large collection of cancer tissue samples and also in studies of the response of cells to estrogen. We confirm that hERα binding sites are distributed throughout the genome. However, we distinguish between binding sites near promoters and binding sites along the transcripts. The first group shows weak binding of hERα and a high occurrence of SP1 motifs, in particular near estrogen-responsive genes.
The second group shows strong binding of hERα and a significant correlation between the number of binding sites along a gene and the strength of gene induction in the presence of estrogen. Some binding sites of the second group also show the presence of FOXA1, but the role of this TF still needs to be investigated. Different mechanisms have been proposed to explain hERα-mediated induction of gene expression. Our work supports the model of hERα activating gene expression from distal binding sites by interacting with promoter-bound TFs, like SP1. hERα has been associated with survival rates of breast cancer patients, though explanatory models are still incomplete; this result is therefore important for better understanding how hERα can control gene expression. Thirdly, we address the difficult question of regulatory network inference. We tackle this problem by analyzing time series of biological measurements, such as quantifications of mRNA levels or protein concentrations. Our approach uses well-established penalized linear regression models in which we impose sparseness on the connectivity of the regulatory network. We extend this method by enforcing the coherence of the regulatory dependencies: a TF must behave coherently as an activator, or as a repressor, on all its targets. This requirement is implemented as constraints on the signs of the regressed coefficients in the penalized linear regression model. Our approach is better at reconstructing meaningful biological networks than previous methods based on penalized regression. The method is tested on the DREAM2 challenge of reconstructing a five-gene/TF regulatory network, obtaining the best performance in the "undirected signed excitatory" category. Thus, these bioinformatics methods, which are reliable, interpretable and fast enough to cover large biological datasets, have enabled us to better understand gene regulation in humans.
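Although the thesis's exact formulation is not reproduced here, the sign-coherence constraint lends itself to a compact illustration. The following Python sketch is a hypothetical simplification (the function infer_signed_network, its parameters and the two-pass scheme are illustrative, not the published algorithm): an unconstrained lasso first lets every target vote on each TF's global sign, and a second pass then enforces that sign on all targets by flipping the corresponding columns and using scikit-learn's non-negativity option.

```python
# Hypothetical sketch of sign-coherent sparse network inference (not the thesis code).
# Idea: each TF j must act with a single sign s_j on all its targets; once s_j is
# fixed, the constraint "sign(beta_jt) == s_j" becomes plain non-negativity after
# flipping column j by s_j, which Lasso(positive=True) handles directly.
import numpy as np
from sklearn.linear_model import Lasso

def infer_signed_network(X, Y, alpha=0.05):
    """X: (samples, n_TFs) TF levels; Y: (samples, n_targets) target levels."""
    n_tfs, n_targets = X.shape[1], Y.shape[1]

    # Pass 1: unconstrained sparse fits let every target "vote" on each TF's sign.
    votes = np.zeros(n_tfs)
    for t in range(n_targets):
        votes += Lasso(alpha=alpha, max_iter=10000).fit(X, Y[:, t]).coef_
    signs = np.where(votes >= 0.0, 1.0, -1.0)   # +1 = activator, -1 = repressor

    # Pass 2: refit with the coherence constraint enforced via column flipping.
    B = np.zeros((n_tfs, n_targets))
    for t in range(n_targets):
        coef = Lasso(alpha=alpha, positive=True, max_iter=10000).fit(X * signs, Y[:, t]).coef_
        B[:, t] = signs * coef                  # nonzero entries in row j carry sign signs[j]
    return B, signs
```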
Abstract:
Conjugates of a dicarba analogue of octreotide, a potent somatostatin agonist whose receptors are overexpressed on tumor cells, with [PtCl2(dap)] (dap = 1-(carboxylic acid)-1,2-diaminoethane) (3), [(η6-bip)Os(4-CO2-pico)Cl] (bip = biphenyl, pico = picolinate) (4), [(η6-p-cym)RuCl(dap)]+ (p-cym = p-cymene) (5), and [(η6-p-cym)RuCl(imidazole-CO2H)(PPh3)]+ (6), were synthesized by using a solid-phase approach. Conjugates 3-5 readily underwent hydrolysis and DNA binding, whereas conjugate 6 was inert to ligand substitution. NMR spectroscopy and molecular dynamics calculations showed that conjugate formation does not perturb the overall peptide structure. Only 6 exhibited antiproliferative activity in human tumor cells (IC50 = 63 ± 2 μM in MCF-7 cells and IC50 = 26 ± 3 μM in DU-145 cells) with active participation of somatostatin receptors in cellular uptake. Similar cytotoxic activity was found in a normal cell line (IC50 = 45 ± 2.6 μM in CHO cells), which can be attributed to a similar level of expression of somatostatin subtype-2 receptor. These studies provide new insights into the effect of receptor-binding peptide conjugation on the activity of metal-based anticancer drugs, and demonstrate the potential of such hybrid compounds to target tumor cells specifically.
Abstract:
A series of compounds of general formula [Ru(η6-p-cymene)(R2acac)(PTA)][X] (R2acac = Me2acac, tBu2acac, Ph2acac, Me2acac-Cl; PTA = 1,3,5-triaza-7-phosphaadamantane; X = BPh4, BF4), and the precursor to the Me2acac-Cl derivative, [Ru(η6-p-cymene)(Me2acac-Cl)Cl], have been prepared and characterised spectroscopically. Five of the compounds have also been characterised in the solid state by X-ray crystallography. The tetrafluoroborate salts are water-soluble, quite resistant to hydrolysis, and have been evaluated for cytotoxicity against A549 lung carcinoma and A2780 human ovarian cancer cells. The compounds are cytotoxic towards the latter cell line, and relative activities are discussed in terms of hydrolysis (less important) and lipophilicity, which appears to exert the dominating influence.
Abstract:
EXECUTIVE SUMMARY: Evaluating the Information Security posture of an organization is becoming a very complex task. Currently, the evaluation and assessment of Information Security are commonly performed using frameworks, methodologies and standards which often consider the various aspects of security independently. Unfortunately, this is ineffective because it does not take into consideration the necessity of having a global, systemic and multidimensional approach to Information Security evaluation. At the same time, the overall security level is generally considered to be only as strong as its weakest link. This thesis proposes a model aiming to holistically assess all dimensions of security in order to minimize the likelihood that a given threat will exploit the weakest link. A formalized structure taking into account all security elements is presented; this is based on a methodological evaluation framework in which Information Security is evaluated from a global perspective. This dissertation is divided into three parts. Part One, Information Security Evaluation Issues, consists of four chapters. Chapter 1 is an introduction to the purpose of this research and the model that will be proposed. In this chapter we raise some questions with respect to "traditional evaluation methods" and identify the principal elements to be addressed in this direction. We then introduce the baseline attributes of our model and set out the expected result of evaluations performed according to it. Chapter 2 is focused on the definition of Information Security to be used as a reference point for our evaluation model. The concepts inherent in a holistic, baseline Information Security Program are defined. Based on this, the most common roots of trust in Information Security are identified. Chapter 3 focuses on an analysis of the difference and the relationship between the concepts of Information Risk Management and Security Management. Comparing these two concepts allows us to identify the most relevant elements to be included within our evaluation model, while clearly situating these two notions within a defined framework is of the utmost importance for the results that will be obtained from the evaluation process. Chapter 4 sets out our evaluation model and the way it addresses issues relating to the evaluation of Information Security. Within this chapter the underlying concepts of assurance and trust are discussed. Based on these two concepts, the structure of the model is developed in order to provide an assurance-related platform as well as three evaluation attributes: "assurance structure", "quality issues", and "requirements achievement". Issues relating to each of these evaluation attributes are analysed with reference to sources such as methodologies, standards and published research papers. The operation of the model is then discussed. Assurance levels, quality levels and maturity levels are defined in order to perform the evaluation according to the model. Part Two, Implementation of the Information Security Assurance Assessment Model (ISAAM) according to the Information Security Domains, consists of four chapters. This is the section where our evaluation model is put into a well-defined context with respect to the four pre-defined Information Security dimensions: the Organizational dimension, the Functional dimension, the Human dimension, and the Legal dimension. Each Information Security dimension is discussed in a separate chapter.
For each dimension, the following two-phase evaluation path is followed. The first phase concerns the identification of the elements which will constitute the basis of the evaluation: (i) identification of the key elements within the dimension; (ii) identification of the Focus Areas for each dimension, consisting of the security issues identified for that dimension; (iii) identification of the Specific Factors for each dimension, consisting of the security measures or controls addressing the security issues identified for that dimension. The second phase concerns the evaluation of each Information Security dimension by: (i) implementing the evaluation model, based on the elements identified for each dimension within the first phase, and identifying the security tasks, processes, procedures, and actions that should have been performed by the organization to reach the desired level of protection; and (ii) using the maturity model for each dimension as a basis for reliance on security. For each dimension we propose a generic maturity model that could be used by every organization in order to define its own security requirements. Part Three of this dissertation contains the Final Remarks, Supporting Resources and Annexes. With reference to the objectives of our thesis, the Final Remarks briefly analyse whether these objectives were achieved and suggest directions for future related research. The Supporting Resources comprise the bibliographic resources that were used to elaborate and justify our approach. The Annexes include all the relevant topics identified within the literature to illustrate certain aspects of our approach. Our Information Security evaluation model is based on and integrates different Information Security best practices, standards, methodologies and research expertise, which can be combined in order to define a reliable categorization of Information Security. After the definition of terms and requirements, an evaluation process should be performed in order to obtain evidence that the Information Security within the organization in question is adequately managed. We have specifically integrated into our model the most useful elements of these sources of information in order to provide a generic model able to be implemented in all kinds of organizations. The value added by our evaluation model is that it is easy to implement and operate and answers concrete needs in terms of reliance upon an efficient and dynamic evaluation tool through a coherent evaluation system. On that basis, our model could be implemented internally within organizations, allowing them to better govern their Information Security. RÉSUMÉ : General context of the thesis. Evaluating security in general, and information security in particular, has become for organizations not only a crucial mission to accomplish but also an increasingly complex one. At present, this evaluation relies mainly on methodologies, best practices, norms or standards that address the different aspects of information security separately. We believe that this way of evaluating security is inefficient, because it does not take into account the interactions between the different dimensions and components of security, even though it has long been accepted that the overall security level of an organization is always that of the weakest link in the security chain.
We have identified the need for a global, integrated, systemic and multidimensional approach to the evaluation of information security. Indeed, and this is the starting point of our thesis, we demonstrate that only a global consideration of security can meet the requirements of optimal security as well as the specific protection needs of an organization. Our thesis therefore proposes a new paradigm for security evaluation designed to satisfy the effectiveness and efficiency needs of a given organization. We propose a model that aims to evaluate all dimensions of security holistically, in order to minimize the probability that a potential threat could exploit vulnerabilities and cause direct or indirect damage. This model is based on a formalized structure that takes into account all the elements of a security system or program. We thus propose a methodological evaluation framework that considers information security from a global perspective. Structure of the thesis and topics covered: the document is organized in three parts. The first, entitled "The problem of information security evaluation", consists of four chapters. Chapter 1 introduces the object of the research as well as the basic concepts of the proposed evaluation model. The traditional way of evaluating security is critically analysed in order to identify the main, invariant elements to be taken into account in our holistic approach. The basic elements of our evaluation model and its expected operation are then presented in order to outline the results expected from this model. Chapter 2 focuses on the definition of the notion of Information Security. It is not a redefinition of the notion of security, but a putting into perspective of the dimensions, criteria and indicators to be used as a reference baseline, in order to determine the object of the evaluation used throughout our work. The concepts inherent in the holistic character of security, as well as the constituent elements of a security baseline, are defined accordingly. This allows us to identify what we have called the "roots of trust". Chapter 3 presents and analyses the difference and the relationships that exist between the Risk Management and Security Management processes, in order to identify the constituent elements of the protection framework to be included in our evaluation model. Chapter 4 is devoted to the presentation of our evaluation model, the Information Security Assurance Assessment Model (ISAAM), and the way it meets the evaluation requirements presented earlier. In this chapter the underlying concepts relating to the notions of assurance and trust are analysed. Based on these two concepts, the structure of the evaluation model is developed to obtain a platform that offers a certain level of assurance by relying on three evaluation attributes, namely "the assurance structure", "the quality of the process", and "the achievement of requirements and objectives".
The issues related to each of these evaluation attributes are analysed on the basis of the state of the art in research and the literature, of the various existing methods, and of the most common norms and standards in the security domain. On this basis, three different evaluation levels are constructed, namely the assurance level, the quality level and the maturity level, which form the basis for evaluating the overall security state of an organization. The second part, "Application of the Information Security Assurance Assessment Model by security domain", also consists of four chapters. The evaluation model already constructed and analysed is here placed in a specific context according to the four predefined security dimensions, which are: the Organizational dimension, the Functional dimension, the Human dimension, and the Legal dimension. Each of these dimensions and its specific evaluation is the subject of a separate chapter. For each dimension, a two-phase evaluation is constructed as follows. The first phase concerns the identification of the elements that constitute the basis of the evaluation: (i) identification of the key elements of the evaluation; (ii) identification of the "Focus Areas" for each dimension, which represent the issues found within that dimension; (iii) identification of the "Specific Factors" for each Focus Area, which represent the security and control measures that help resolve or reduce the impacts of the risks. The second phase concerns the evaluation of each of the dimensions presented above. It consists, on the one hand, of the implementation of the general evaluation model for the dimension concerned by (i) building on the elements specified in the first phase and (ii) identifying the specific security tasks, processes and procedures that should have been carried out to reach the desired level of protection. On the other hand, the evaluation of each dimension is completed by the proposal of a maturity model specific to that dimension, to be considered as a baseline for the overall security level. For each dimension we propose a generic maturity model that any organization can use in order to specify its own security requirements. This constitutes an innovation in the field of evaluation, which we justify for each dimension and whose added value we systematically highlight. The third part of the document concerns the overall validation of our proposal and contains, by way of conclusion, a critical perspective on our work and final remarks. This last part is completed by a bibliography and annexes. Our security evaluation model integrates and builds on numerous sources of expertise, such as best practices, norms, standards, methods and the expertise of scientific research in the domain. Our constructive proposal addresses a genuine, as yet unsolved problem faced by all organizations, regardless of their size and profile.
This would allow them to specify their particular requirements regarding the level of security to be met and to instantiate an evaluation process tailored to their needs, so that they can ensure that their information security is managed in an appropriate manner, thereby providing a certain level of confidence in the degree of protection delivered. We have incorporated into our model the best of the know-how, experience and expertise currently available at the international level, with the aim of providing an evaluation model that is simple, generic and applicable to a large number of public and private organizations. The added value of our evaluation model lies precisely in the fact that it is sufficiently generic and easy to implement while providing answers to the concrete needs of organizations. Our proposal thus constitutes a reliable, efficient and dynamic evaluation tool resulting from a coherent evaluation approach. As a result, our evaluation system can be implemented internally by the organization itself, without recourse to additional resources, and likewise gives it the ability to better govern its information security.
Abstract:
The death-inducing receptor Fas is activated when cross-linked by the type II membrane protein Fas ligand (FasL). When human soluble FasL (sFasL, containing the extracellular portion) was expressed in human embryo kidney 293 cells, the three N-linked glycans of each FasL monomer were found to be essential for efficient secretion. Based on the structure of the closely related lymphotoxin alpha-tumor necrosis factor receptor I complex, a molecular model of the FasL homotrimer bound to three Fas molecules was generated using knowledge-based protein modeling methods. Point mutations of amino acid residues predicted to affect the receptor-ligand interaction were introduced at three sites. The F275L mutant, mimicking the loss-of-function murine gld mutation, exhibited a high propensity for aggregation and was unable to bind to Fas. Mutants P206R, P206D, and P206F displayed reduced cytotoxicity toward Fas-positive cells with a concomitant decrease in the binding affinity for the recombinant Fas-immunoglobulin Fc fusion proteins. Although the cytotoxic activity of mutant Y218D was unaltered, mutant Y218R was inactive, correlating with the prediction that Tyr-218 of FasL interacts with a cluster of three basic amino acid side chains of Fas. Interestingly, mutant Y218F could induce apoptosis in murine, but not human, cells.
Abstract:
Molecular docking software is one of the important tools of modern drug development pipelines. The promising achievements of the last 10 years emphasize the need for further improvement, as reflected by several recent publications (Leach et al., J Med Chem 2006, 49, 5851; Warren et al., J Med Chem 2006, 49, 5912). Our initial approach, EADock, showed good performance in reproducing the experimental binding modes for a set of 37 different ligand-protein complexes (Grosdidier et al., Proteins 2007, 67, 1010). This article presents recent improvements to the scoring and sampling aspects over the initial implementation, as well as a new seeding procedure based on the detection of cavities, opening the door to blind docking with EADock. These enhancements were validated on 260 complexes taken from the high-quality Ligand Protein Database [LPDB (Roche et al., J Med Chem 2001, 44, 3592)]. Two issues were identified: first, the quality of the initial structures cannot be assumed, and a manual inspection and/or a search of the literature are likely to be required to achieve the best performance. Second, the description of interactions involving metal ions still has to be improved. Nonetheless, a remarkable success rate of 65% was achieved in a large-scale blind docking assay, when considering only the top-ranked binding mode and a success threshold of 2 Å RMSD to the crystal structure. When looking at the five top-ranked binding modes, the success rate increases to 76%. In a standard local docking assay, success rates of 75% and 83% were obtained, considering only the top-ranked binding mode or the five top-ranked binding modes, respectively.
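The 2 Å success criterion quoted above is a standard figure of merit that is easy to make concrete. The following sketch is illustrative only (it is not EADock code and assumes the predicted and crystallographic ligand atoms are already matched one-to-one): it computes the heavy-atom RMSD of a pose and the top-N success rate over a benchmark.

```python
# Hypothetical sketch of the docking success criterion, not part of EADock:
# a complex counts as a success if any of its top-N poses lies within
# `threshold` Å RMSD of the crystallographic pose.
import numpy as np

def ligand_rmsd(pred, ref):
    """pred, ref: (n_atoms, 3) arrays of matched heavy-atom coordinates in Å."""
    diff = pred - ref
    return float(np.sqrt((diff ** 2).sum(axis=1).mean()))

def success_rate(predictions, references, top_n=1, threshold=2.0):
    """predictions: per complex, a list of poses ranked best-first; references: crystal poses."""
    hits = 0
    for poses, ref in zip(predictions, references):
        best = min(ligand_rmsd(p, ref) for p in poses[:top_n])
        hits += best <= threshold
    return hits / len(references)
```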
Abstract:
Life cycle analysis (LCA) is a comprehensive method for assessing the environmental impact of a product or an activity over its entire life cycle. The purpose of conducting LCA studies varies from one application to another, and different applications use LCA for different purposes. In general, the main aim of using LCA is to reduce the environmental impact of products by guiding the decision-making process towards more sustainable solutions. The most critical phase in an LCA study is the Life Cycle Impact Assessment (LCIA), where the life cycle inventory (LCI) results for the substances considered in the study of a given system are transformed into understandable impact categories that represent the impact on the environment. In this research work, a general structure clarifying the steps that shall be followed in order to conduct an LCA study effectively is presented. These steps are based on the ISO 14040 standard framework. In addition, a survey is done on the most widely used LCIA methodologies. Recommendations about possible developments and suggestions for further research work regarding the use of LCA and LCIA methodologies are discussed as well.
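The characterization step of LCIA described above amounts to a weighted sum: each inventory flow is multiplied by a characterization factor and aggregated per impact category. The sketch below uses placeholder substances and factor values, not numbers taken from any specific LCIA methodology.

```python
# Minimal sketch of LCIA characterization: impact = sum(factor * inventory amount).
# All numbers below are illustrative placeholders, not real LCIA data.
inventory = {"CO2": 120.0, "CH4": 0.8, "SO2": 0.3}   # kg emitted over the life cycle

characterization = {
    "climate change (kg CO2-eq)": {"CO2": 1.0, "CH4": 28.0},
    "acidification (kg SO2-eq)": {"SO2": 1.0},
}

def characterize(lci, factors):
    """Return one indicator value per impact category."""
    return {category: sum(cf * lci.get(substance, 0.0) for substance, cf in cfs.items())
            for category, cfs in factors.items()}

print(characterize(inventory, characterization))
```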
Abstract:
The changing business environment demands that chemical industrial processes be designed such that they enable the attainment of multi-objective requirements and the enhancement of innovative design activities. The requirements and key issues for conceptual process synthesis have changed and are no longer those of conventional process design; there is an increased emphasis on innovative research to develop new concepts, novel techniques and processes. A central issue, how to enhance the creativity of the design process, requires further research into methodologies. The thesis presents a conflict-based methodology for conceptual process synthesis. The motivation of the work is to support decision-making in design and synthesis and to enhance the creativity of design activities. It deals with the multi-objective requirements and combinatorially complex nature of process synthesis. The work is carried out based on a new concept and design paradigm adapted from the Theory of Inventive Problem Solving (TRIZ) methodology. TRIZ is claimed to be a 'systematic creativity' framework thanks to its knowledge-based and evolution-directed nature. The conflict concept, when applied to process synthesis, throws new light on design problems and activities. The conflict model is proposed as a way of describing design problems and handling design information. The design tasks are represented as groups of conflicts, and a conflict table is built as the design tool. The general design paradigm is formulated to handle conflicts in both the early and detailed design stages. The methodology developed reflects the conflict nature of process design and synthesis. The method is implemented and verified through case studies of distillation system design, reactor/separator network design and waste minimization. Handling the various levels of conflicts evolves possible design alternatives in a systematic procedure that establishes an efficient and compact solution space for the detailed design stage. The approach also provides the information needed to bridge the gap between the application of qualitative knowledge in the early stage and quantitative techniques in the detailed design stage. Enhancement of creativity is realized through the better understanding of design problems gained from the conflict concept and through the improvement in engineering design practice brought by the systematic nature of the approach.
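The "conflict table" mentioned above can be pictured as a simple mapping from a pair of conflicting design requirements to candidate resolution heuristics, in the spirit of the TRIZ contradiction matrix. The entries below are invented placeholders for illustration only, not the table developed in the thesis.

```python
# Hypothetical illustration of a conflict table as a design data structure.
# Keys are (improving requirement, deteriorating requirement) pairs; values are
# candidate resolution heuristics. All entries are placeholders.
conflict_table = {
    ("product purity", "energy consumption"): ["heat integration", "side-draw column"],
    ("conversion", "reactor volume"): ["recycle unconverted feed", "staged feeding"],
    ("waste generation", "raw material cost"): ["solvent recovery", "process intensification"],
}

def suggest_resolutions(improving, deteriorating):
    """Look up candidate design heuristics for a stated conflict."""
    return conflict_table.get((improving, deteriorating), ["no entry: decompose the conflict"])

print(suggest_resolutions("product purity", "energy consumption"))
```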
Abstract:
This work proposes the development of an embedded real-time fruit detection system for future automatic fruit harvesting. The proposed embedded system is based on an ARM Cortex-M4 (STM32F407VGT6) processor and an OmniVision OV7670 color camera. The future goal of this embedded vision system will be to control a robotized arm to automatically select and pick some fruit directly from the tree. The complete embedded system has been designed to be placed directly in the gripper tool of the future robotized harvesting arm. The embedded system will be able to perform real-time fruit detection and tracking by using a three-dimensional look-up table (LUT) defined in the RGB color space and optimized for fruit picking. Additionally, two different methodologies for creating optimized 3D LUTs, based on existing linear color models and on fruit histograms, were implemented in this work and compared for the case of red peaches. The resulting system is able to acquire general and zoomed orchard images and to update the relative tracking information of a red peach in the tree ten times per second.
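The 3D LUT approach described above trades memory for speed: pixel classification becomes a single table read. The sketch below is a hypothetical desktop illustration (NumPy rather than Cortex-M4 firmware), and the red-peach rule used to fill the table is a placeholder linear color model, not one of the two published methodologies.

```python
# Hypothetical sketch of a 3D RGB look-up table for fruit-pixel classification.
# The LUT stores one boolean per quantized RGB cell, so run-time classification
# is a single indexed read per pixel. The filling rule is a placeholder.
import numpy as np

BITS = 5                       # 5 bits per channel -> 32x32x32 table (32 KB of bools)
LEVELS = 1 << BITS

def red_peach_rule(r, g, b):
    # Placeholder linear color model: "red clearly dominates green and blue".
    return (r > 90) & (r > g + 30) & (r > b + 30)

def build_lut(rule):
    r, g, b = np.meshgrid(np.arange(LEVELS), np.arange(LEVELS), np.arange(LEVELS),
                          indexing="ij")
    scale = 256 // LEVELS      # map each cell back to a representative 8-bit value
    return rule(r * scale, g * scale, b * scale)

LUT = build_lut(red_peach_rule)

def classify(image):
    """image: (H, W, 3) uint8 RGB; returns a boolean fruit mask via LUT indexing."""
    idx = image >> (8 - BITS)
    return LUT[idx[..., 0], idx[..., 1], idx[..., 2]]
```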
Abstract:
The avidity of the T-cell receptor (TCR) for antigenic peptides presented by the peptide-MHC (pMHC) on cells is a key parameter for cell-mediated immunity. Yet a fundamental feature of most tumor antigen-specific CD8(+) T cells is that this avidity is low. In this study, we addressed the need to identify and select tumor-specific CD8(+) T cells of the highest avidity, which are of the greatest interest for adoptive cell therapy in patients with cancer. To identify these rare cells, we developed a peptide-MHC multimer technology that uses reversible Ni(2+)-nitrilotriacetic acid histidine tags (NTAmers). NTAmers are highly stable, but upon imidazole addition they decay rapidly to pMHC monomers, allowing flow cytometry-based measurements of monomeric TCR-pMHC dissociation rates of living CD8(+) T cells across a wide avidity spectrum. We documented strong correlations between NTAmer kinetic results and those obtained by surface plasmon resonance. Using NTAmers that were deficient for CD8 binding to pMHC, we found that CD8 itself stabilized the TCR-pMHC complex, prolonging the dissociation half-life severalfold. Notably, our NTAmer technology accurately predicted the function of large panels of tumor-specific T cells that were isolated prospectively from patients with cancer. Overall, our results demonstrated that NTAmers are effective tools to isolate rare high-avidity cytotoxic T cells from patients for use in adoptive therapies for cancer treatment.
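The key readout described above, a monomeric dissociation rate measured after imidazole-triggered NTAmer decay, is essentially an exponential-decay fit. The sketch below uses synthetic numbers and generic SciPy fitting; it illustrates the principle rather than the authors' analysis pipeline.

```python
# Hypothetical sketch: fit a single-exponential decay to a post-imidazole
# fluorescence trace and report koff and the dissociation half-life.
# The trace below is synthetic, not experimental data.
import numpy as np
from scipy.optimize import curve_fit

def single_exponential(t, amplitude, koff, background):
    return amplitude * np.exp(-koff * t) + background

t = np.linspace(0, 120, 60)                          # seconds after imidazole addition
signal = single_exponential(t, 1000.0, 0.05, 50.0)   # simulated koff = 0.05 s^-1
signal += np.random.default_rng(1).normal(0, 10, t.size)

popt, _ = curve_fit(single_exponential, t, signal, p0=(signal.max(), 0.01, signal.min()))
amplitude, koff, background = popt
print(f"koff = {koff:.3f} s^-1, half-life = {np.log(2) / koff:.1f} s")
```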
Abstract:
Connectivity analysis on whole-brain diffusion MRI data suffers from distortions caused by standard echo-planar imaging acquisition strategies. These images show characteristic geometrical deformations and signal destruction that are an important drawback limiting the success of tractography algorithms. Several retrospective correction techniques are readily available. In this work, we use a digital phantom designed for the evaluation of connectivity pipelines. We subject the phantom to a "theoretically correct" and plausible deformation that resembles the artifact under investigation. We then correct the data with three standard methodologies (namely fieldmap-based, reversed encoding-based, and registration-based). Finally, we rank the methods based on their geometrical accuracy, their dropout compensation, and their impact on the resulting connectivity matrices.
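One of the three ranking criteria, the impact on the resulting connectivity matrices, can be summarized by a single agreement score. The sketch below is a generic illustration (not the paper's pipeline): it correlates the off-diagonal entries of the ground-truth phantom connectivity matrix with those recovered after correction.

```python
# Hypothetical sketch of a connectivity-impact score: Pearson correlation between
# the upper triangles of the ground-truth and corrected connectivity matrices.
import numpy as np

def connectivity_agreement(ground_truth, corrected):
    """Both arguments: symmetric (n_nodes, n_nodes) connectivity matrices."""
    iu = np.triu_indices_from(ground_truth, k=1)     # off-diagonal upper triangle
    return np.corrcoef(ground_truth[iu], corrected[iu])[0, 1]
```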
Abstract:
Nanoparticulate formulations for synthetic long peptide (SLP) cancer vaccines are investigated in this study as alternatives to the clinically used Montanide ISA 51- and squalene-based emulsions. SLPs were loaded into TLR ligand-adjuvanted cationic liposomes and PLGA nanoparticles (NPs) to potentially induce cell-mediated immune responses. The liposomal and PLGA NP formulations were successfully loaded with up to four different compounds and were able to enhance antigen uptake by dendritic cells (DCs) and subsequent activation of T cells in vitro. Subcutaneous vaccination of mice with the different formulations showed that the SLP-loaded cationic liposomes were the most efficient for the induction of functional antigen-specific T cells in vivo, followed by the PLGA NPs, which were as potent as, or even more potent than, the Montanide and squalene emulsions. Moreover, after transfer of antigen-specific target cells into immunized mice, the liposomes induced the highest in vivo killing capacity. These findings, considering also the inadequate safety profile of the currently clinically used adjuvant Montanide ISA 51, make these two particulate, biodegradable delivery systems promising candidates as delivery platforms for SLP-based immunotherapy of cancer.
Abstract:
Sulfonamides obtained by reaction of 8-aminoquinoline with 4-nitrobenzenesulfonyl chloride and 2,4,6-triisopropylbenzenesulfonyl chloride were used to synthesize coordination compounds of Cu(II) and Zn(II) with an ML2 composition. Determination of the crystal structures of the resulting zinc and copper complexes by X-ray diffraction shows a distorted tetrahedral environment for the [Cu(qnbsa)2], [Cu(qibsa)2] and [Zn(qibsa)2] complexes, in which the sulfonamide group acts as a bidentate ligand through the nitrogen atoms of the sulfonamidate and quinoline groups. The complex [Zn(qnbsa)2] crystallizes with a water molecule from the solvent; the Zn atom is five-coordinate and shows a trigonal-bipyramidal geometry. The electrochemical and electronic spectroscopy properties of the copper complexes are also discussed.
Abstract:
In this study, dispersive liquid-liquid microextraction based on the solidification of floating organic droplets was used for the preconcentration and determination of thorium in water samples. In this method, acetone and 1-undecanol were used as the disperser and extraction solvents, respectively, and the ligand 1-(2-thenoyl)-3,3,3-trifluoroacetone (TTA) and Aliquat 336 were used as a chelating agent and an ion-pairing reagent, respectively, for the extraction of thorium. Inductively coupled plasma optical emission spectrometry was applied for the quantitation of the analyte after preconcentration. The effects of various factors, such as the extraction and disperser solvents, sample pH, concentration of TTA and concentration of Aliquat 336, were investigated. Under the optimum conditions, the calibration graph was linear within the thorium content range of 1.0-250 µg L-1 with a detection limit of 0.2 µg L-1. The method was also successfully applied to the determination of thorium in different water samples.
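The linear range and detection limit quoted above come from a routine calibration computation. The sketch below uses synthetic numbers, not the paper's data, and one common convention for the detection limit, three times the standard deviation of the blank divided by the calibration slope.

```python
# Hypothetical calibration sketch: fit a straight line to standards and estimate
# the detection limit as 3 * SD(blank) / slope. All values are synthetic.
import numpy as np

conc = np.array([1.0, 5.0, 25.0, 100.0, 250.0])       # thorium standards, ug/L
signal = np.array([0.9, 4.6, 23.8, 96.0, 241.0])       # ICP-OES intensities (a.u.)
blank_signal = np.array([0.05, 0.08, 0.03, 0.06, 0.07, 0.04, 0.05])

slope, intercept = np.polyfit(conc, signal, 1)          # linear calibration graph
lod = 3 * blank_signal.std(ddof=1) / slope              # detection limit, ug/L
r = np.corrcoef(conc, signal)[0, 1]                     # linearity check
print(f"slope={slope:.3f}, intercept={intercept:.3f}, r={r:.4f}, LOD={lod:.2f} ug/L")
```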
Abstract:
Integrins are heterodimeric cell adhesion receptors involved in cell-cell and cell-extracellular matrix (ECM) interactions. They transmit bidirectional signals across the cell membrane. This results in a wide range of biological events, from cell differentiation to apoptosis. alpha2beta1 integrin is an abundant collagen receptor expressed on the surface of several cell types. In addition to ECM ligands, alpha2beta1 integrins are bound by echovirus 1 (EV1), which uses alpha2beta1 as a receptor to initiate its life cycle in the infected cell. The aim of this thesis project was to provide further insight into the mechanisms of alpha2beta1 integrin ligand recognition and receptor activation. Collagen fibrils are the principal tensile elements of the ECM. Yet the interaction of alpha2beta1 integrin with the fibrillar form of collagen I has received relatively little attention. This research focused on the ability of alpha2beta1 integrin to act as a receptor for type I collagen fibrils. The molecular requirements of the EV1 interaction with alpha2beta1 were also studied. Conventionally, ligand binding has been suggested to require integrin activation, and binding may further trigger integrin signalling. Another main objective of this study was to elucidate both the inside-out and outside-in signalling mechanisms of alpha2beta1 integrin in adherent cells. The results indicated that alpha2beta1 integrin is the principal integrin-type collagen receptor for type I collagen fibrils, and alpha2beta1 may participate in the regulation of pericellular collagen fibrillogenesis. Furthermore, alpha2beta1 integrin inside-out activation appeared to be synergistically regulated by integrin clustering and conformational activation. The triggering of alpha2beta1 integrin outside-in signalling, however, was shown to require both conformational changes and clustering. In contrast to ECM ligands, EV1 appeared to take advantage of the bent, inactive form of alpha2beta1 integrin in initiating its life cycle in the cell. This research, together with other recent studies, has shed light on the molecular mechanisms of integrin activation. It is becoming evident that large ligands are able to bind to the bent form of integrin, which has previously been considered physiologically inactive. Consequently, our understanding of the conformational modulation of integrins upon activation is changing.