7 results for PHAGE-LAMBDA
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
The prognostic value of ABC transporters in Ewing sarcoma is still poorly explored and controversial. We describe for the first time the impact of various ABCs on Ewing sarcoma prognosis by assessing their gene expression in two independent cohorts of patients. Unexpected associations with favourable outcomes were observed for two ABCs of the A-subfamily, ABCA6 and ABCA7, whereas no associations with the canonical multidrug ABC transporters were identified. The ABCs of the A-subfamily are involved in cholesterol/phospholipid transport and efflux from cells. Our clinical data support a drug-efflux-independent contribution of the ABCAs to cancer progression, which has been confirmed in PDX-derived cell lines. The impact of these ABCA transporters on tumor progression seems to be mediated by lowering intracellular cholesterol, supporting the role of these proteins in lipid transport. In addition, the gene expression of ABCA6 and ABCA7 is regulated by transcription factors that control lipid metabolism: ABCA6 was induced by the binding of FoxO1/FoxO3a to its promoter and repressed by IGF1R/Akt signaling, whereas the expression of ABCA7 was regulated by p53. The data point to ABCA6 and ABCA7 as potential prognostic markers in Ewing sarcoma and suggest the IGF1/ABCA/lipid axis as an intriguing therapeutic target. Agonist monoclonal antibodies against ABCA6/7, or inhibitors of cholesterol biosynthesis such as statins or aminobisphosphonates, may be investigated as therapeutic options in combination with chemotherapy. Since no monoclonal antibodies selectively targeting the extracellular domains of ABCA6/7 are available, the second part of the project was dedicated to the generation of human antibody phage-display libraries as tools for selecting monoclonal antibodies. A novel synthetic human antibody phage-display library has been designed, cloned and characterized.
The library exploits the high variability of a designed naïve repertoire, making it a useful tool for isolating antibodies against virtually any antigen, including the ABCAs.
Abstract:
This thesis explores the advancement of cancer treatment through targeted photodynamic therapy (PDT) using bioengineered phages. It aims to harness the specificity of phages for targeting cancer-related receptors such as EGFR and HER2, which are pivotal in numerous malignancies and associated with poor outcomes. The study commenced with the M13EGFR phage, modified to target EGFR through pIII-displayed EGFR-binding peptides, which demonstrated enhanced killing efficiency when conjugated with the Rose Bengal photosensitizer. This phase underscored the potential of phages in targeted PDT. A breakthrough was achieved with the development of the M137D12 phage, engineered to display the 7D12 nanobody for precise EGFR targeting, marking a shift from peptide-based to nanobody-based targeting and yielding better specificity and therapeutic results. The translational potential was highlighted through in vitro and in vivo assays employing therapeutic lasers, showing effective, specific cancer cell killing through a necrotic mechanism. Additionally, the research delved into the interaction between the M13CC phage and colon cancer models, demonstrating its ability to penetrate and disrupt cancer spheroids only upon irradiation, a significant advancement in targeting cells within challenging tumor microenvironments. In summary, the thesis provides a thorough examination of the efficacy and versatility of the phage platform for targeted PDT. The encouraging outcomes, especially with the M137D12 phage, together with initial findings on a HER2-targeting phage (M13HER2), point to a promising future for phage-mediated, targeted anticancer strategies employing photosensitizers in PDT.
Abstract:
Higher-order process calculi are formalisms for concurrency in which processes can be passed around in communications. Higher-order (or process-passing) concurrency is often presented as an alternative paradigm to the first-order (or name-passing) concurrency of the pi-calculus for the description of mobile systems. These calculi are inspired by, and formally close to, the lambda-calculus, whose basic computational step (beta-reduction) involves term instantiation. The theory of higher-order process calculi is more complex than that of first-order process calculi. This shows up, for instance, in the definition of behavioral equivalences. A long-standing approach to overcoming this burden is to define encodings of higher-order processes into a first-order setting, so as to transfer the theory of the first-order paradigm to the higher-order one. While satisfactory for calculi with basic (higher-order) primitives, this indirect approach falls short for higher-order process calculi featuring constructs for phenomena such as localities and dynamic system reconfiguration, which are frequent in modern distributed systems. Indeed, for higher-order process calculi involving little more than traditional process communication, encodings into some first-order language are difficult to handle or do not exist. We therefore observe that foundational studies for higher-order process calculi must be carried out directly on them and exploit their peculiarities. This dissertation contributes to such foundational studies for higher-order process calculi. We concentrate on two closely interwoven issues in process calculi: expressiveness and decidability. Surprisingly, these issues have been little explored in the higher-order setting. Our research is centered around a core calculus for higher-order concurrency in which only the operators strictly necessary to obtain higher-order communication are retained.
We develop the basic theory of this core calculus and rely on it to study the expressive power of features universally accepted as basic in process calculi, namely synchrony, forwarding, and polyadic communication.
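The distinctive higher-order step, where a whole process is exchanged over a channel and then activated by the receiver, can be mimicked in ordinary Python; this is only an illustrative sketch (the channel, `sender`, and `receiver` are hypothetical names, not part of any of the calculi discussed above), where the "process" is a zero-argument callable:

```python
import threading
import queue

# A "channel" that carries whole processes (zero-argument callables),
# mimicking higher-order communication: the receiver runs what it gets.
chan = queue.Queue()
log = []

def sender():
    # Send a process over the channel instead of a first-order value.
    chan.put(lambda: log.append("received process ran"))

def receiver():
    p = chan.get()   # receive a process...
    p()              # ...and activate it

t1 = threading.Thread(target=sender)
t2 = threading.Thread(target=receiver)
t1.start(); t2.start()
t1.join(); t2.join()
```

The analogue of beta-reduction is visible in `p()`: the received term is instantiated and executed at the receiver's side, rather than merely named.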
Abstract:
In this work we investigate the influence of dark energy on structure formation within five different cosmological models, namely a concordance $\Lambda$CDM model, two models with dynamical dark energy, viewed as a quintessence scalar field (using a RP and a SUGRA potential form), and two extended quintessence models (EQp and EQn) where the quintessence scalar field interacts non-minimally with gravity (scalar-tensor theories). For all models we adopt the normalization of the matter power spectrum $\sigma_{8}$ that matches the CMB data. For each model, we perform hydrodynamical simulations in a cosmological box of $(300 \ {\rm{Mpc}} \ h^{-1})^{3}$ including baryons and allowing for cooling and star formation. We find that, in models with dynamical dark energy, the evolving cosmological background leads to different star formation rates and different formation histories of galaxy clusters, but the baryon physics is not affected in a relevant way. We investigate several proxies for the cluster mass function based on X-ray observables like temperature, luminosity, $M_{gas}$, and $Y_{X}$. We confirm that the overall baryon fraction is almost independent of the dark energy model, to within a few percent; the same is true for the gas fraction. This evidence reinforces the use of galaxy clusters as cosmological probes of the matter and energy content of the Universe. We also study the $c-M$ relation in the different cosmological scenarios, using both dark-matter-only and hydrodynamical simulations. We find that the normalization of the $c-M$ relation is directly linked to $\sigma_{8}$ and the evolution of the density perturbations for $\Lambda$CDM, RP and SUGRA, while for EQp and EQn it also depends on the evolution of the linear density contrast. These differences in the $c-M$ relation provide another way to use galaxy clusters to constrain the underlying cosmology.
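For reference, the concentration entering the $c-M$ relation is defined from the NFW density profile; a common power-law parametrization of the relation (illustrative only, not necessarily the form fitted in this thesis, and with $M_{\rm pivot}$ a reference mass chosen by convention) is

$$\rho_{\rm NFW}(r) = \frac{\rho_s}{(r/r_s)\,(1+r/r_s)^2}, \qquad c \equiv \frac{R_{200}}{r_s}, \qquad c(M,z) = A\left(\frac{M}{M_{\rm pivot}}\right)^{B}(1+z)^{C},$$

where $\rho_s$ and $r_s$ are the characteristic density and scale radius of the halo, $R_{200}$ its virial-type radius, and $A$, $B$, $C$ are fit parameters whose values encode the dependence on cosmology discussed above.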
Abstract:
In this thesis we present the implementation of the quadratic maximum likelihood (QML) method, ideal for estimating the angular power spectrum of the cross-correlation between cosmic microwave background (CMB) and large scale structure (LSS) maps, as well as their individual auto-spectra. This tool is an optimal method (unbiased and with minimum variance) in pixel space and goes beyond previous harmonic-space analyses in the literature. We describe the implementation of the QML method in the {\it BolISW} code and demonstrate its accuracy on simulated maps through a Monte Carlo analysis. We apply this optimal estimator to WMAP 7-year and NRAO VLA Sky Survey (NVSS) data and explore the robustness of the angular power spectrum estimates obtained by the QML method. Taking into account the shot noise and one of the systematics (declination correction) in NVSS, we can safely use most of the information contained in this survey. Conversely, we neglect the noise in temperature, since WMAP is already cosmic-variance dominated on large scales. Because of a discrepancy in the galaxy auto-spectrum between the estimates and the theoretical model, we use two different galaxy distributions: the first with a constant bias $b$ and the second with a redshift-dependent bias $b(z)$. Finally, we use the angular power spectrum estimates obtained by the QML method to derive constraints on the dark energy critical density in a flat $\Lambda$CDM model through different likelihood prescriptions. Using just the cross-correlation between WMAP7 and NVSS maps at 1.8° resolution, we show that $\Omega_\Lambda$ accounts for about 70\% of the total energy density, disfavouring an Einstein-de Sitter Universe at more than 2 $\sigma$ CL (confidence level).
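In the standard pixel-space notation (which may differ in details from the {\it BolISW} implementation), the QML estimator for a data vector $\mathbf{x}$ (here the concatenated CMB and galaxy maps) with covariance $\mathbf{C} = \mathbf{S}(C_\ell) + \mathbf{N}$ reads

$$y_\ell = \mathbf{x}^{T}\mathbf{E}_\ell\,\mathbf{x} - {\rm tr}\!\left(\mathbf{N}\,\mathbf{E}_\ell\right), \qquad \mathbf{E}_\ell = \frac{1}{2}\,\mathbf{C}^{-1}\,\frac{\partial \mathbf{C}}{\partial C_\ell}\,\mathbf{C}^{-1},$$

$$\hat{C}_\ell = \sum_{\ell'} \left(F^{-1}\right)_{\ell\ell'}\, y_{\ell'}, \qquad F_{\ell\ell'} = \frac{1}{2}\,{\rm tr}\!\left[\mathbf{C}^{-1}\,\frac{\partial \mathbf{C}}{\partial C_\ell}\,\mathbf{C}^{-1}\,\frac{\partial \mathbf{C}}{\partial C_{\ell'}}\right],$$

where $F$ is the Fisher matrix. Subtracting the noise-bias term and deconvolving with $F^{-1}$ is what makes the estimator unbiased, and saturating the Cramér-Rao bound is what makes it minimum-variance.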
Abstract:
The thesis applies ICC (Implicit Computational Complexity) techniques to probabilistic polynomial complexity classes in order to obtain an implicit characterization of them. The main contribution lies in the implicit characterization of the class PP (Probabilistic Polynomial Time): a syntactical characterization of PP, together with a static complexity analyser able to recognise whether an imperative program computes in probabilistic polynomial time. The thesis is divided into two parts. The first part addresses the problem by designing a prototype functional language (a probabilistic variant of the lambda calculus with bounded recursion) that is sound and complete with respect to Probabilistic Polynomial Time. The second part reverses the problem and develops a feasible way to verify whether a program, written in a prototype imperative programming language, runs in probabilistic polynomial time. This thesis is one of the first steps toward Implicit Computational Complexity over probabilistic classes. Hard open problems remain to be investigated, and many theoretical aspects are strongly connected with these topics; I expect that in the future there will be wide attention to ICC and probabilistic classes.
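As a toy illustration of the semantics involved (not the thesis's calculus or analyser, and with all names hypothetical), a probabilistic program can be modelled as a term whose evaluation flips fair coins, and PP membership asks whether the acceptance probability exceeds 1/2. A real PP characterization reasons about the exact probability; the sketch below can only estimate it by sampling:

```python
import random

def prog(rng):
    # Toy probabilistic program: accepts with probability 5/8
    # (heads on the first flip, or heads on both of the next two).
    return rng.random() < 0.5 or (rng.random() < 0.5 and rng.random() < 0.5)

def pp_accepts(program, trials=20000, seed=0):
    # PP-style question: is Pr[accept] > 1/2?  Here we approximate the
    # probability empirically with a seeded generator for reproducibility.
    rng = random.Random(seed)
    hits = sum(program(rng) for _ in range(trials))
    return hits / trials > 0.5
```

Since `prog` accepts with probability 5/8 > 1/2, the empirical decision agrees with the PP answer with overwhelming probability; the point of an implicit characterization is to certify such bounds syntactically, without running the program at all.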
Abstract:
The Curry-Howard isomorphism is the idea that proofs in natural deduction can be put in correspondence with lambda terms in such a way that the correspondence is preserved by normalization. The concept can be extended from Intuitionistic Logic to other systems, such as Linear Logic. One of the nice consequences of this isomorphism is that we can reason about functional programs with formal tools that are typical of proof systems: such analysis can also cover quantitative properties of programs, such as the number of steps they take to terminate. Another is the possibility of describing the execution of these programs in terms of abstract machines. In 1990 Griffin proved that the correspondence can be extended to Classical Logic and control operators; that is, Classical Logic adds the possibility of manipulating continuations. In this thesis we see how the ideas described above work in this larger context.
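To make the proofs-as-programs reading concrete, here is a minimal sketch (names are illustrative, and Python stands in for the lambda calculus) using Church numerals, the lambda-term inhabitants of the type $(a \to a) \to a \to a$; "running" a term plays the role of normalizing the corresponding proof:

```python
# Church numerals: n is encoded as the function f -> x -> f applied n times.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    # Read back the numeral by running it on the ordinary successor:
    # normalizing the term computes its result.
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
```

Evaluating `to_int(add(two)(three))` yields 5: the beta-reductions performed by Python's evaluator mirror the normalization steps of the underlying proof.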