10 results for Better comprehension
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
Salt deposits characterize the subsurface of Tuzla (Bosnia and Herzegovina) and have made the town famous since ancient times. Archaeological discoveries demonstrate the presence of a Neolithic pile-dwelling settlement linked to saltwater springs that turned much of the area into swampy ground. Since Roman times the town has been described as "the City of Salt Deposits and Springs"; "tuz" is the Turkish word for salt, the name the Ottomans gave the settlement in the 15th century after their conquest of medieval Bosnia (Donia and Fine, 1994). Natural brine springs were found everywhere, and salt had been evaporated over hot charcoal since pre-Roman times. This ancient use of salt was a small-scale exploitation compared with the massive salt production carried out during the 20th century by classical mining methods and, especially, by wild brine pumping. In the past, salt was extracted by tapping natural brine springs; the modern technique relies on about 100 boreholes with pumps tapping the natural underground brine runs at an average depth of 400-500 m. The mining operations changed the hydrogeological conditions, enabling the downward flow of fresh water and causing additional salt dissolution. This process induced severe ground subsidence over the last 60 years, reaching up to 10 meters of sinking in the most affected area. Stress and strain in the overlying rocks led to the formation of numerous fractures over a considerable area (3 km2). As a consequence, serious damage occurred to buildings and to infrastructure such as the water supply system, sewage networks and power lines. Urban life downtown was disrupted by the destruction of more than 2000 buildings, which collapsed or had to be demolished, forcing the resettlement of about 15000 inhabitants (Tatić, 1979).
Salt extraction has recently been strongly reduced, but the underground water system is returning to its natural condition, threatening to flood the most collapsed areas. Over the last 60 years the local government has developed a monitoring system for the phenomenon, collecting a large amount of data: geodetic measurements, quantities of brine pumped, piezometry, lithostratigraphy, the extent of the salt body and geotechnical parameters. A database was created within a scientific cooperation between the municipality of Tuzla and the city of Rotterdam (D.O.O. Mining Institute Tuzla, 2000). The scientific investigation presented in this dissertation was financially supported by a cooperation project between the Municipality of Tuzla, the University of Bologna (CIRSA) and the Province of Ravenna. The University of Tuzla (RGGF) provided important scientific support, in particular on the geological and hydrogeological features. Subsidence damage resulting from evaporite dissolution generates substantial losses throughout the world, but the causes are well understood in only a few areas (Gutierrez et al., 2008). The subject of this study is the collapse phenomenon occurring in the Tuzla area, with the aim of identifying and quantifying the several factors involved in the system and their correlations. The Tuzla subsidence can be defined as a geohazard: the consequence of an adverse combination of geological processes and ground conditions, precipitated by human activity, with the potential to cause harm (Rosenbaum and Culshaw, 2003). Where a hazard poses a risk to a vulnerable element, a risk management process is required, and the individual factors involved in the subsidence of Tuzla can each be considered hazards. The final objective of this dissertation is a preliminary risk assessment procedure, with guidelines, developed to quantify the vulnerability of buildings in relation to the overall geohazard affecting the town.
The available historical database, never fully processed before, has been analyzed by means of geographic information systems and mathematical interpolators (PART I). Modern geomatic applications have then been implemented to investigate the most relevant hazards in depth (PART II). To monitor and quantify the current subsidence rates, geodetic GPS technologies were deployed and four survey campaigns were carried out, one per year. The subsidence-related fracture system was identified by means of field surveys and a mathematical interpretation of the sinking surface called curvature analysis. The comparison of mapped and predicted fractures led to a better comprehension of the problem, and the results confirmed the reliability of fracture identification using curvature analysis applied to sinking data instead of topographic or seismic data. The evolution of urban change was reconstructed by analyzing topographic maps and satellite imagery, identifying the most damaged areas. This part of the investigation was very important for the quantification of building vulnerability.
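The idea behind curvature analysis can be illustrated with a minimal sketch (synthetic bowl-shaped sinking surface, invented curvature threshold; not the thesis's actual implementation): cells where the curvature of the interpolated subsidence surface is large in magnitude are flagged as candidate fracture locations.

```python
import numpy as np

def surface_curvature(z, spacing=1.0):
    """Approximate curvature of a gridded sinking surface z(x, y) with
    second-order central differences (Laplacian). Edges wrap via np.roll;
    a real implementation would treat the borders explicitly."""
    zxx = (np.roll(z, -1, axis=1) - 2 * z + np.roll(z, 1, axis=1)) / spacing**2
    zyy = (np.roll(z, -1, axis=0) - 2 * z + np.roll(z, 1, axis=0)) / spacing**2
    return zxx + zyy

def fracture_candidates(z, spacing=1.0, threshold=0.2):
    """Flag cells whose curvature magnitude exceeds a (hypothetical)
    threshold as candidate fracture zones."""
    return np.abs(surface_curvature(z, spacing)) > threshold

# Synthetic subsidence surface: sinking deepest at the centre (metres).
y, x = np.mgrid[-10:11, -10:11]
z = -10.0 * np.exp(-(x**2 + y**2) / 40.0)
mask = fracture_candidates(z, spacing=1.0, threshold=0.2)
print(mask.sum(), "candidate fracture cells")
```

On real data the surface would come from interpolated levelling or GPS measurements rather than an analytic bowl, and the threshold would be calibrated against mapped fractures.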
Abstract:
In recent years, a growing number of researchers have focused on developing strategies to characterize the ADMET properties of drug candidates as early as possible. This trend stems from the awareness that about half of all drug candidates never reach the market because of shortcomings in their ADME characteristics, and that at least half of the molecules that do reach the market still have some toxicological or ADME problem [1]. Indeed, it matters little how active or specific a molecule is: to become a drug it must be well absorbed, distributed throughout the organism, metabolized neither too quickly nor too slowly, and completely eliminated. Moreover, the molecule and its metabolites should not be toxic to the organism. It is therefore clear that a rapid determination of ADMET parameters in the early phases of drug development saves time and money, making it possible to select the most promising compounds from the start and to discard those with unfavorable characteristics. This thesis fits into this context and shows the application of a simple technique, biochromatography, to rapidly characterize the binding of compound libraries to human serum albumin (HSA). It also shows the use of another, independent technique, circular dichroism, which allows the same drug-protein systems to be studied in solution, providing additional information on the stereochemistry of the binding process. HSA is the most abundant protein in blood. It acts as a carrier for a large number of molecules, both endogenous (such as bilirubin, thyroxine, steroid hormones and fatty acids) and xenobiotic. It also increases the solubility of lipophilic molecules that are poorly soluble in aqueous media, such as the taxanes.
Binding to HSA is generally stereoselective and occurs at high-affinity binding sites. It is also well known that competition between drugs, or between a drug and endogenous metabolites, can significantly change their free fractions, altering their activity and toxicity. Owing to these properties, HSA can influence both the pharmacokinetic and the pharmacodynamic properties of drugs. It is not unusual for an entire drug-development project to be abandoned because of an excessively high affinity for HSA, a half-life that is too short, or poor distribution due to weak HSA binding. From a pharmacokinetic point of view, therefore, HSA is the most important plasma transport protein. A large number of publications demonstrate the reliability of the biochromatographic technique in the study of biorecognition phenomena between proteins and small molecules [2-6]. My work focused mainly on the use of biochromatography as a method to evaluate the HSA-binding characteristics of several series of compounds of pharmaceutical interest, and on the improvement of the technique itself. To gain a better understanding of the binding mechanisms of the molecules studied, the same drug-HSA systems were also investigated by circular dichroism (CD). Initially, HSA was immobilized on a packed epoxy silica column, 50 x 4.6 mm i.d., following a procedure previously reported in the literature [7], with some minor modifications. Briefly, immobilization was carried out by recirculating, through a pre-packed column, an HSA solution under defined conditions of pH and ionic strength. The column was then characterized with respect to the amount of correctly immobilized protein by frontal analysis of L-tryptophan [8].
Next, racemic solutions of molecules known to bind HSA enantioselectively were injected onto the column, to verify that the immobilization procedure had not altered the binding properties of the protein. Once characterized, the column was used to determine the binding percentage of a small series of HIV protease inhibitors (PIs), and to identify their binding site(s). The binding percentage was calculated from the capacity factor (k) of the samples. The value of this parameter in the aqueous phase was obtained by linear extrapolation of the plot of log k against the percentage (v/v) of 1-propanol in the mobile phase. Only for two of the five compounds analyzed was it possible to measure k directly in the absence of organic solvent. All of the PIs analyzed showed a high binding percentage to HSA: in particular, the values for ritonavir, lopinavir and saquinavir were greater than 95%. These results agree with literature data obtained with an optical biosensor [9]. They are also consistent with the significant reduction in the inhibitory activity of these compounds observed in the presence of HSA, a reduction that appears to be greater for the compounds that bind the protein more strongly [10]. Competition studies were then carried out by zonal chromatography. In this method, a solution of a competitor at known concentration is used as the mobile phase, while small amounts of analyte are injected onto the HSA-functionalized column. The competitors were selected for their selective binding to one of the main binding sites of the protein: sodium salicylate, ibuprofen and sodium valproate were used as markers of site I, site II and the bilirubin site, respectively.
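The extrapolation step described in this abstract can be sketched numerically (invented retention data; the single-site relation %bound = 100·k/(1+k), which assumes all retention is due to protein binding, is a standard approximation and is used here as an assumption, not as the thesis's exact formula):

```python
import numpy as np

# Hypothetical retention data: capacity factor k measured at several
# 1-propanol fractions (% v/v) in the mobile phase.
propanol_pct = np.array([2.0, 4.0, 6.0, 8.0])
k_measured   = np.array([12.0, 7.5, 4.6, 2.9])

# log k is assumed linear in the organic-modifier fraction, so a
# straight-line fit extrapolated to 0% gives k in plain aqueous buffer.
slope, intercept = np.polyfit(propanol_pct, np.log10(k_measured), 1)
k_aqueous = 10.0 ** intercept

# Fraction bound to the immobilized HSA (single-site approximation).
percent_bound = 100.0 * k_aqueous / (1.0 + k_aqueous)
print(f"extrapolated k = {k_aqueous:.1f}, bound = {percent_bound:.1f}%")
```

With these invented numbers the extrapolated binding percentage lands above 90%, in the same range as the >95% values reported for ritonavir, lopinavir and saquinavir.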
These studies showed independent binding of the PIs to sites I and II, while weak anticooperativity was observed at the bilirubin site. Finally, the same drug-protein system was investigated in solution by circular dichroism. In particular, the change in the induced CD signal of an equimolar [HSA]/[bilirubin] complex was monitored upon the addition of aliquots of ritonavir, chosen as representative of the series. The results confirmed the slight anticooperativity at the bilirubin site previously observed in the biochromatographic studies. Subsequently, the protocol described above was applied to a monolithic epoxy silica column, 50 x 4.6 mm, to assess the reliability of the monolithic support for biochromatographic applications. The monolithic support showed good chromatographic characteristics in terms of back pressure, efficiency and stability, as well as reliability in the determination of HSA-binding parameters. This column was used to determine the HSA-binding percentages of a series of polyaminoquinones developed within a research project on Alzheimer's disease. All of the compounds showed a binding percentage higher than 95%. Moreover, a correlation was observed between the binding percentage and the characteristics of the side chain (length and number of amino groups). Competition studies on these compounds were then performed by circular dichroism, revealing an anticooperative effect of the polyaminoquinones at sites I and II, whereas binding was independent with respect to the bilirubin site. The experience gained with the monolithic support described above was then applied to a shorter epoxy silica column (10 x 4.6 mm).
The method for determining the binding percentage used in the previous studies relies on data from several experiments, so considerable time is needed to obtain the final value. The use of a shorter column reduces the retention times of the analytes, making the determination of the HSA-binding percentage much faster and moving from a medium-throughput analysis to high-throughput screening (HTS). Moreover, the shorter analysis times make it possible to avoid organic solvents in the mobile phase. After characterizing the 10 mm column by the same method described above for the other columns, a series of standards was injected at different mobile-phase flow rates, to assess the possibility of working at high flow rates. The column was then employed to estimate the binding percentages of a series of molecules with different chemical characteristics. The possibility of using such a short column for competition studies was also evaluated, and the binding of a series of compounds to site I was investigated. Finally, the stability of the column after extensive use was assessed. The use of chromatographic supports functionalized with albumins of different origin (rat, dog, guinea pig, hamster, mouse, rabbit) can be proposed as a future application of these HTS columns. Information on the binding of drug candidates to the different albumins would allow a better comparison between data obtained in vitro and data obtained in animal experiments, facilitating the subsequent extrapolation to humans, with the speed of an HTS method. It would also reduce the number of animals used in experimentation.
Several studies in the literature demonstrate the reliability of columns functionalized with albumins of different origin [11-13]; the use of shorter columns could broaden their applications.
Abstract:
The study of LaTène-type objects found in the middle-eastern Alpine region (Trentino Alto Adige-Südtirol, Engadin, North Tyrol, Vorarlberg and the Villach basin) aims at a better comprehension of the complex net of relationships established between the Celts, settled both in central Europe and, from the 4th century BC, in the Po Plain, and the local populations. The ancient authors, who called the inhabitants of this area Raeti, applied to this territory the usual pattern according to which a region was populated as the result of a migration or of the expulsion of pre-existing peoples. Over the last thirty years, archaeologists have recognized a cultural facies typical of the middle-eastern Alpine territory during the second Iron Age, and have defined it as the Fritzens-Sanzeno culture (from the sites of Fritzens, Inn valley, and Sanzeno, Non Valley). The so-called Fritzens-Sanzeno culture developed without breaks from the material culture of the final Bronze Age and the first Iron Age. This local substratum, characterized by a strongly standardized ceramic repertoire, by peculiar architectural solutions and by a particular typology of rural sacred places (Brandopferplätze), absorbed, above all during the second Iron Age, strong influences from the Etruscan and Celtic worlds (evident in ornaments, glass artefacts, elements of weaponry and coins). LaTène-type objects thus become, with different degrees of reliability, important markers of the relationships between the Celts and the Raeti, although the modes of interaction (cultural influence, movements of people, commercial exchange, gifts among élites, etc.) are still not clear. The revision of published data and the study of unpublished materials allow a rich and articulated picture to be defined, at both the chronological and the territorial level.
Abstract:
This Ph.D. thesis consists of three research papers focused on the relationship between the media industry and the financial sector. A correct understanding of the effect of the media on financial markets is becoming increasingly important now that the fully-informed-markets hypothesis has been challenged: if financial markets do not have access to complete information, the importance of information professionals, the media, follows. On the other side, another challenge for economics and finance scholars is to understand how financial factors can influence the media and condition information disclosure. The main aim of this dissertation is to contribute to a better comprehension of both phenomena. The first paper analyzes the effects of owning equity shares in a newspaper-publishing firm. The main findings show that a firm that is part of the ownership structure of a media firm ends up receiving more and better coverage, confirming the view that owning a media outlet is a source of conflicts of interest. The second paper focuses on the effect of media-delivered information on financial markets. In the framework of IPOs in the U.S. market, we find empirical evidence of a significant effect of the media on IPO pricing: increasing the quantity and the quality of coverage increases the first-day returns (i.e. the underpricing). Finally, the third paper summarizes what has been done in studying the relationship between the media and financial industries, bringing together contributions from economics, business and finance scholars. The main contribution of this dissertation is therefore to have underlined the importance and the effectiveness of the relationship between the media industry and the financial sector, adding to the stream of research that investigates the media's role and effectiveness in the financial and business sectors.
Abstract:
Electronic business surely represents the new development perspective for worldwide trade. Together with the idea of e-business, and the need to exchange business messages between trading partners, the concept of business-to-business (B2B) integration arose. B2B integration is becoming necessary to allow partners to communicate and exchange business documents, such as catalogues, purchase orders, reports and invoices, overcoming architectural, applicative and semantic differences, according to the business processes implemented by each enterprise. Business relationships can be very heterogeneous, and consequently there are various ways to integrate enterprises with each other. Moreover, nowadays not only large enterprises but also small and medium enterprises are moving towards e-business: more than two-thirds of Small and Medium Enterprises (SMEs) use the Internet as a business tool. One of the business areas actively facing the interoperability problem is supply chain management. To really allow SMEs to improve their business and fully exploit ICT in their business transactions, three main players must be considered and joined: the emerging ICT technologies, the scenario and requirements of the enterprises, and the world of standards and standardization bodies. This thesis presents the definition and development of an interoperability framework (and the related standardization initiatives) providing the Textile/Clothing sector with a shared set of business documents and protocols for electronic transactions. Considering some limitations of this framework, the thesis then proposes an ontology-based approach to improve its functionality and, exploiting semantic web technologies, to improve the standardization life-cycle, intended as the development, dissemination and adoption of B2B protocols for a specific business domain.
The use of ontologies allows the semantic modelling of knowledge domains, upon which it is possible to develop a set of components for better management of B2B protocols, and to ease their comprehension and adoption by the target users.
Abstract:
Research into new biocompatible and easily implantable materials continuously proposes new molecules and new substances with biological, chemical and physical characteristics ever better suited to aesthetic and reconstructive surgery and to the development of biomedical devices such as cardiovascular prostheses. Two classes of polymeric biomaterials seem to meet these requirements best: "hydrogels", which include polyalkylimide (PAI) and polyvinylalcohol (PVA), and "elastomers", which include polyurethanes (PUs). In the last decade the former have found wide application in soft tissue augmentation, owing to their similarity to this tissue in terms of high water content, elasticity and oxygen permeability (Dini et al., 2005). The latter, on the other hand, are widely used in cardiovascular applications (catheters, vascular grafts, ventricular assist devices, total artificial hearts) thanks to their good mechanical properties and hemocompatibility (Zdrahala R.J. and Zdrahala I.J., 1999). In the biocompatibility evaluation of these synthetic polymers, which is important for their potential use in clinical applications, a fundamental aspect is knowledge of the polymers' cytotoxicity and of the effects of their interaction with cells, in particular with the cell populations involved in inflammatory responses, i.e. monocytes/macrophages. In light of the above, the aim of this study is to understand the in vitro effect of PAI, PVA and PU on three cell lines that represent three different stages of macrophagic differentiation: U937 pro-monocytes, THP-1 monocytes and RAW 264.7 macrophages. Cytotoxicity was evaluated by measuring viability with MTT and Neutral Red assays and by morphological analysis at the light microscope in time-course experiments.
The influence of these polymers on monocyte/macrophage activation was evaluated in terms of cell adhesion, monocyte differentiation into macrophages, antigen distribution, aspecific phagocytosis, fluid-phase endocytosis, and release of pro-inflammatory cytokines (TNF-α, IL-1β, IL-6) and nitric oxide (NO). In conclusion, our studies indicate that the three polymeric biomaterials are highly biocompatible, since they scarcely affected the viability of U937, THP-1 and RAW 264.7 cells. Moreover, we found that even though the hydrogels and the polyurethane influence monocyte/macrophage differentiation (depending on the particular type of cell and polymer), they are immunocompatible, since they did not induce significantly high cytokine release. For these reasons their clinical applications are strongly encouraged.
Abstract:
Cytokines are proteins produced by several cell types and secreted in response to various stimuli. These molecules are able to modify the behaviour of other cells, inducing activities such as growth, differentiation and apoptosis. In recent years, veterinary scientists have investigated the role played by these factors; indeed, cytokines can act as intercellular signals in the immune response, cell damage repair and hematopoiesis. To date, various cytokines have been identified, and an in-depth comprehension of their effects in physiology, pathology and therapy is an interesting field of research. This thesis aims to understand the role played by these mediators during natural or experimentally induced pathologies. In particular, the gene and protein expression of a large number of cytokines was evaluated during several diseases, starting from different matrices. Given the heterogeneity of the materials used in the experiments, multiple methods and protocols for nucleic acid and protein extraction were standardized. Results on cytokine expression obtained from various in vitro and in vivo experimental studies have shown how important these mediators are in the regulation and modulation of the host immune response in veterinary medicine as well. In particular, the analysis of inflammatory and septic markers, such as cytokines, has allowed a better understanding of the pathogenesis of equine Recurrent Airway Obstruction, foal sepsis, Bovine Viral Diarrhea Virus infection and canine Parvovirus infection, and of the effects of these agents on the host immune system. As experiments with mice have shown, some pathologies of the respiratory and nervous systems can be reduced or even abolished by blocking inflammatory cytokine production.
The in vitro evaluation of cytokine expression in cells that are involved in vivo in the response to exogenous (e.g. pathogens) or endogenous (as happens in autoimmune diseases) inflammatory stimuli can represent a model for studying cytokine effects during the host immune response. This was analyzed using lymphocytes cultured with several S. aureus strains isolated from bovine mastitic milk and with different colostrum products. In the first experiment, different cytokines were expressed depending on the enterotoxins produced, explaining the different behaviour of the microorganism in the mammary gland. In the second, bone-marrow-derived cells incubated with murine lymphocytes and colostrum products showed varied expression of clusters of differentiation, different proliferation and a modified cytokine profile. A better understanding of the mechanisms of cytokine expression will increase our knowledge of the immune response activated by various pathogenic agents. In particular, blocking cytokine production, inhibiting or catalyzing the receptor-binding mechanism, and modulating signal transduction will represent novel therapeutic strategies in veterinary medicine.
Abstract:
In this work I address the study of language comprehension in an "embodied" framework. First I present behavioral evidence supporting the idea that language modulates the motor system in a specific way, both at a proximal level (sensitivity to the effectors) and at a distal level (sensitivity to the goal of the action in which the single motor acts are embedded). I present two studies sharing essentially the same method: we manipulated the linguistic stimuli (the kind of sentence: hand action vs. foot action vs. mouth action) and the effector with which participants had to respond (hand vs. foot vs. mouth; dominant hand vs. non-dominant hand). Response time analyses showed a specific modulation depending on the kind of sentence: participants were facilitated in the task (a sentence sensibility judgment) when the effector they had to use to respond was the same one the sentence referred to. That is, during language comprehension a pre-activation of the motor system seems to take place, analogous (even if less intense) to the one detectable when we actually execute the action described by the sentence. Beyond this effector-specific modulation, we also found an effect of the goal suggested by the sentence: the hand effector was pre-activated not only by hand-action-related sentences, but also by sentences describing mouth actions, consistent with the fact that to act on an object with the mouth we first have to bring it to the mouth with the hand. After reviewing the evidence on simulation specificity directly referring to the body (for instance, the kind of effector activated by the language), I focus on specific properties of the objects to which the words refer, particularly their weight. Here the hypothesis to test was whether both the perception and the execution of lifting movements are modulated by language comprehension.
We used behavioral and kinematic methods, and we manipulated the linguistic stimuli (the kind of sentence: lifting heavy objects vs. lifting light objects). To study movement perception we measured the correlations between the weights of the objects lifted by an actor (heavy vs. light) and the estimates provided by the participants. To study movement execution we measured the variance of kinematic parameters (velocity, acceleration, time to the first velocity peak) during the actual lifting of heavy vs. light objects. Both kinds of measures revealed that language had a specific effect on the motor system, at both the perceptual and the motoric level. Finally, I address the issue of abstract words. Different studies in the "embodied" framework have tried to explain the meaning of abstract words, but they account only for subsets of phenomena, so their results are difficult to generalize. We tried to circumvent this problem by contrasting transitive verbs (abstract and concrete) and nouns (abstract and concrete) in different combinations. The behavioral study was conducted with both German and Italian participants, as the two languages are syntactically different. We found that response times were faster for the compatible pairs (concrete verb + concrete noun; abstract verb + abstract noun) than for the mixed ones. Interestingly, for the mixed combinations the analyses showed a modulation due to the specific language (German vs. Italian): when the concrete word preceded the abstract one, responses were faster, regardless of grammatical class. The results are discussed in the framework of current views on abstract words. They highlight the important role of developmental and social aspects of language use, and support theories assigning a crucial role to both sensorimotor and linguistic experience for abstract words.
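The perception measure described in this abstract, correlating the actor's actual loads with participants' weight estimates, reduces to a Pearson correlation; a minimal sketch with invented data (not the thesis's dataset):

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / sqrt(sxx * syy)

# Hypothetical data: actual weights lifted by the actor (kg) and mean
# weight estimates given by participants after watching the clips.
actual   = [0.5, 0.5, 2.0, 2.0, 4.0, 4.0]
estimate = [0.7, 0.6, 1.8, 2.3, 3.6, 4.1]

r = pearson(actual, estimate)
print(f"perception accuracy: r = {r:.2f}")
```

A correlation near 1 would indicate that observers accurately recover object weight from the actor's movement kinematics; the experimental question is whether this correlation shifts with the sentence just read.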
Abstract:
Data from various studies carried out in Italy over recent years on the problem of dropout in secondary school show that difficulty in studying mathematics is one of the most frequent sources of discomfort reported by students. Nevertheless, it is definitely unrealistic to think we can do without such knowledge in today's society: mathematics is widely taught in secondary school and is not confined to technical-scientific courses. Although students may choose academic courses apparently far removed from mathematics, all of them will have to come to terms with this subject sooner or later in their lives. Among the reasons for discomfort with the study of mathematics, some concern the very nature of the subject, and in particular the complex symbolic language through which it is expressed: mathematics is a multimodal system composed of oral and written verbal texts, symbolic expressions such as formulae and equations, figures and graphs. For this reason, the study of mathematics represents a real challenge for those who suffer from dyslexia, a constitutional condition that limits performance in reading and writing and, in particular, in the study of mathematical content. Here, the difficulties in working with verbal and symbolic codes entail, in turn, difficulties in comprehending the texts from which to deduce the operations that, combined together, lead to a problem's final solution. Information technology can support this learning disorder effectively. However, current tools have implementation limits that restrict their use in the study of scientific subjects: word processors with speech synthesis are currently used to compensate for reading difficulties in the humanities, but they are not used in mathematics.
This is because speech synthesis (or rather the screen reader driving it) cannot interpret anything that is not textual, such as symbols, images and graphs. The DISMATH software, which is the subject of this project, allows dyslexic users to read technical-scientific documents with the help of speech synthesis, to understand the spatial structure of formulae and matrices, and to write documents with technical-scientific content in a format compatible with the main scientific editors. The system uses LaTeX, a text-based mathematical markup language, as its mediation layer. It is set up as a LaTeX editor whose graphical interface, in line with the main commercial products, offers additional functions specifically designed to support users who cannot manage verbal and symbolic codes on their own. The LaTeX source is translated in real time into standard symbolic notation and read aloud by speech synthesis in natural language, so that the bimodal representation increases the user's ability to process information. Understanding a mathematical formula through its reading is made possible by deconstructing the formula into a “tree” representation, which identifies the logical elements composing it. Users, even without knowing the LaTeX language, can write whatever scientific document they need: the symbolic elements are recalled through dedicated menus and automatically translated by the software, which manages the correct syntax. The final aim of the project, therefore, is to implement an editor that enables dyslexic people (but not only them) to manage mathematical formulae effectively, through the integration of different software tools, thereby also allowing better teacher/learner interaction.
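The tree deconstruction described above can be illustrated with a minimal sketch: a toy parser for a tiny subset of LaTeX (only `\frac` and plain symbols) that builds a nested tree and renders it as the natural-language reading a speech synthesizer would receive. All function and node names here are illustrative assumptions, not DISMATH's actual implementation.

```python
import re

def tokenize(src):
    # Split a LaTeX string into commands, braces, and single symbols.
    return re.findall(r"\\[a-zA-Z]+|[{}]|[^\s{}]", src)

def parse_group(tokens, i):
    # Parse tokens starting at index i into a list of tree nodes,
    # stopping at a closing brace; returns (nodes, index of the "}").
    nodes = []
    while i < len(tokens) and tokens[i] != "}":
        tok = tokens[i]
        if tok == r"\frac":
            assert tokens[i + 1] == "{"
            num, i = parse_group(tokens, i + 2)   # numerator subtree
            assert tokens[i] == "}" and tokens[i + 1] == "{"
            den, i = parse_group(tokens, i + 2)   # denominator subtree
            assert tokens[i] == "}"
            nodes.append(("frac", num, den))
            i += 1
        else:
            nodes.append(("sym", tok))
            i += 1
    return nodes, i

def speak(nodes):
    # Walk the tree and emit a natural-language reading of the formula.
    words = []
    for node in nodes:
        if node[0] == "frac":
            words.append("the fraction " + speak(node[1])
                         + " over " + speak(node[2]))
        else:
            words.append(node[1])
    return " ".join(words)

tree, _ = parse_group(tokenize(r"\frac{a+b}{2}"), 0)
print(speak(tree))  # the fraction a + b over 2
```

A real system would of course cover the full LaTeX grammar, localize the reading, and feed the output to a screen reader; the point of the sketch is only how the tree makes the formula's logical structure explicit.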
Abstract:
Class I phosphatidylinositol 3-kinases (PI3Ks) are heterodimeric lipid kinases consisting of a regulatory subunit and one of four catalytic subunits (p110α, p110β, p110γ or p110δ). The p110γ and p110δ PI3Ks are highly enriched in leukocytes. In general, PI3Ks regulate a variety of cellular processes, including cell proliferation, survival and metabolism, by generating the second messenger phosphatidylinositol-3,4,5-trisphosphate (PtdIns(3,4,5)P3). Their activity is tightly regulated by the phosphatase and tensin homolog (PTEN) lipid phosphatase. PI3Ks are widely implicated in human cancers, and are upregulated in particular in T-cell acute lymphoblastic leukemia (T-ALL), mainly due to loss of PTEN function. These observations lend compelling weight to the application of PI3K inhibitors in the therapy of T-ALL. At present, several compounds that target single or multiple PI3K isoforms have entered clinical trials. The present research analyzed the therapeutic potential of the pan-PI3K inhibitor BKM120, an orally bioavailable 2,6-dimorpholino pyrimidine derivative that has entered clinical trials for solid tumors, on both T-ALL cell lines and patient samples. BKM120 treatment resulted in cell cycle arrest and apoptosis, and was cytotoxic to a panel of T-ALL cell lines and patient T-lymphoblasts. Remarkably, BKM120 synergized with chemotherapeutic agents currently used for treating T-ALL patients. BKM120 efficacy was confirmed in in vivo studies using a subcutaneous xenotransplant model of human T-ALL. Because it is still unclear whether isoform-specific or pan inhibitors achieve greater efficacy, further analyses were conducted to investigate the effects of PI3K inhibition and to elucidate the mechanisms responsible for the proliferative impairment of T-ALL.
Overall, these results indicate that BKM120 may be an effective treatment for T-ALLs with aberrant up-regulation of the PI3K signaling pathway, and they strongly support the clinical application of pan-class I PI3K inhibitors, rather than single-isoform inhibitors, in T-ALL treatment.