20 results for entity


Relevance:

10.00%

Publisher:

Abstract:

This thesis analyses EU external relations from the perspective of the promotion of the rule of law, in order to evaluate the effectiveness and consistency of the EU's action within the international community. The research starts with an examination of the notion of the rule of law from a theoretical point of view. The first chapter initially describes the historical-political evolution of the notion of the rule of law, discussing some of the most significant national experiences (France, the UK, Germany and Austria). The focus then shifts to interpretations that explain the grounds of the rule of law, highlighting the different formal and substantive readings. This philosophical-historical analysis is complemented by a reconstruction of how the notion of the rule of law was developed by the international community, with a view to identifying a common notion at the international level by comparing theory and practice within the main international organisations, such as the UN, the OECD and the Council of Europe. Specific mention is made of the EU experience, whose configuration as a Community based on the rule of law is often debated, starting from the case law of the European Court of Justice. The second chapter deals with conditionality policy and focuses on the development and scope of democratic conditionality according to the dominant approach in the doctrine. First, the birth of conditionality is analysed from an economic point of view, especially within international financial organisations, together with the different types of conditionality identified in the literature. An analysis then follows of the birth of democratic conditionality in the EC, in relation to its external relations, first as a mere political exercise and later as a standardised system of clauses.
Specific reference is made to the main field of application of conditionality, namely enlargement policy and the development of the Copenhagen criteria. The third chapter provides further details on the legal questions connected with the use of democratic clauses: on the one hand, the power of the EC to include human rights clauses in international agreements; on the other, the variety and overlap in the use of legal bases. The chapter ends with an analysis of the measures suspending agreements with third countries in those rare but significant cases in which the suspension clause, included first in the Lomé Convention and then in the Cotonou Agreement, was applied. The last chapter is devoted to the analysis of democratic clauses in unilateral acts adopted by the European Union which affect third countries. The examination of this practice, and its comparison with the approach analysed in the previous chapter, raises a major theoretical question: the clear-cut distinction between conditionality and international sanction. This distinction must be taken into account when considering the premises and consequences, in terms of legal relations, generated when democratic clauses are not complied with. The chapter ends with a brief analysis of what, according to the reconstruction suggested, can rightly be labelled real democratic conditionality, that is, the system of incentives and positive measures developed within the Community GSP. The dissertation closes with a few general considerations on the difficulties experienced by the EU in promoting the rule of law. The contradictory aspects of EU external action are manifold, as are its difficulties in choosing the most appropriate measures, which in turn reflect the repercussions and tensions resulting from the balance of power within the international community.
The thesis argues that it is difficult to grant full credibility to an entity like the EU which, although it proclaims itself the guardian and promoter of the rule of law, is in practice too often biased in managing its relations with third countries. It must be acknowledged, however, that the EU is committed to, and constantly strives towards, identifying new spaces and strategies of action.

Relevance:

10.00%

Publisher:

Abstract:

It is well known that theories of the firm have evolved along a path paved by an increasing awareness of the importance of organizational structure: from the early “neoclassical” conceptualizations, which saw the firm as a rational actor aiming to produce, given the inputs at its disposal and subject to technological or environmental constraints, the amount of output that maximizes revenue (see Boulding, 1942 for a mid-century state-of-the-art discussion), to the knowledge-based theory of the firm (Nonaka & Takeuchi, 1995; Nonaka & Toyama, 2005), which recognizes in the firm a knowledge-creating entity with specific organizational capabilities (Teece, 1996; Teece & Pisano, 1998) that allow it to sustain competitive advantages. Tracing a complete map of the evolution of the theory of the firm, taking into account the several perspectives adopted in the history of thought, would take the length of many books. A more fruitful strategy is therefore to circumscribe the description of the literature to one strand connected to a crucial question about the nature of the firm's behaviour and the determinants of competitive advantage. In doing so I adopt a perspective that treats the organizational structure of the firm as the element by which the different theories can be discriminated. The approach starts by considering the drawbacks of the standard neoclassical theory of the firm; after discussing the most influential theoretical approaches, I end with a close examination of the knowledge-based perspective, within which the firm is considered a knowledge-creating entity that produces and manages knowledge (Nonaka, Toyama, & Nagata, 2000; Nonaka & Toyama, 2005). In a knowledge-intensive organization, knowledge is embedded for the most part in the human capital of the individuals that compose the organization.
In a knowledge-based organization, the management, in order to cope with knowledge-intensive production, ought to develop and accumulate capabilities that shape organizational forms in a way that relies on “cross-functional processes, extensive delayering and empowerment” (Foss 2005, p.12). This mechanism contributes to determining the absorptive capacity of the firm towards specific technologies and, in so doing, also shapes the technological trajectories along which the firm moves. Having recognized the growing importance of the firm's organizational structure in the theoretical literature, the next step of the analysis is to provide an overview of the changes that have occurred at the micro level in the firm's organization of production. Economic actors have to deal with the challenges posed by internationalisation and globalization, by the increased and increasing competitive pressure from less developed countries on low-value-added production activities, by changes in technologies, and by greater environmental turbulence and volatility. As a consequence, it is widely recognized that the main organizational models of production that fitted well in the 20th century are now partially inadequate, and processes aimed at reorganizing production activities have spread across several economies in recent years. The emergence of a “new” form of production organization has recently been proposed by scholars, practitioners and institutions alike: the most prominent characteristic of this model is the importance it attaches to employee commitment and involvement. It is accordingly characterized by a strong accent on human resource management and on practices that widen the autonomy and responsibility of workers and increase their commitment to the organization (Osterman, 1994; 2000; Lynch, 2007).
This “model” of production organization is defined by many as the High Performance Work System (HPWS). Despite the increasing diffusion of workplace practices that may be inscribed within the concept of HPWS in Western companies, it is somewhat hazardous to speak of the emergence of a “new organizational paradigm”. A discussion of organizational changes and the diffusion of HPWP cannot, moreover, abstract from the industrial relations system, with a particular accent on employment relationships, because of their relevance, alongside production organization, in determining two major outcomes of the firm: innovation and economic performance. The argument is treated starting from the issue of Social Dialogue at the macro level, from both a European and an Italian perspective. The model of interaction between the social partners has repercussions, at the micro level, on employment relationships, that is to say on the relations between union delegates and management, or between workers and management. Finding economic and social policies capable of sustaining growth and employment in a knowledge-based scenario is likely to constitute the major challenge for the next generation of social pacts, which are the main outcome of social dialogue. As Acocella and Leoni (2007) argue, social pacts may constitute an instrument to trade wage moderation for high intensity of ICT, organizational and human capital investments. Empirical evidence, especially at the micro level, of a positive relation between economic growth and new organizational designs coupled with ICT adoption and non-adversarial industrial relations is growing; partnership among the social partners may become an instrument to enhance firm competitiveness. The outcome of the discussion is the integration of organizational change and industrial relations elements within a unified framework: the HPWS.
Such a choice may help in disentangling the potential complementarities between these two aspects of the firm's internal structure in their effect on economic and innovative performance. The third chapter opens the more original part of the thesis. The data used to disentangle the relations between HPWS practices, innovation and economic performance refer to the manufacturing firms of the Reggio Emilia province with more than 50 employees. The data were collected through face-to-face interviews with both management (199 respondents) and union representatives (181 respondents). Alongside the cross-section datasets, a further data source is constituted by longitudinal balance sheets (1994-2004). Collecting reliable data that in turn yield reliable results always demands great effort, with uncertain returns. Micro-level data are often subject to a trade-off: the wider the geographical context to which the surveyed population belongs, the less information is usually collected (low resolution); the narrower the focus on a specific geographical context, the more information is usually collected (high resolution). For the Italian case, the evidence on the diffusion of HPWP and their effects on firm performance is still scanty and usually limited to local-level studies (Cristini, et al., 2003). The thesis also deepens an argument of particular interest: the existence of complementarities between HPWS practices. Empirical evidence has widely shown that when HPWP are adopted in bundles they are more likely to affect firm performance than when adopted in isolation (Ichniowski, Prennushi, Shaw, 1997). Does this also hold for the local production system of Reggio Emilia? The empirical analysis has the precise aim of providing evidence on the relations between the HPWS dimensions and the innovative and economic performance of the firm.
As far as the first line of analysis is concerned, the fundamental role that innovation plays in the economy must be stressed (Geroski & Machin, 1993; Stoneman & Kwoon 1994, 1996; OECD, 2005; EC, 2002). On this point the evidence ranges from traditional innovation, usually approximated by R&D investment expenditure or the number of patents, to the introduction and adoption of ICT in recent years (Brynjolfsson & Hitt, 2000). If innovation is important, then it is critical to analyse its determinants. In this work it is hypothesised that organizational changes and firm-level industrial relations/employment relations aspects that can be put under the heading of HPWS influence the firm's propensity to innovate in product, process and quality. The general argument goes as follows: changes in production management and work organization reconfigure the absorptive capacity of the firm towards specific technologies and, in so doing, shape the technological trajectories along which the firm moves; cooperative industrial relations may lead to smoother adoption of innovations, because they are not opposed by unions. The first empirical chapter shows that the different types of innovation respond in different ways to the HPWS variables: the underlying processes of product, process and quality innovation are likely to answer to different firm strategies and needs. Nevertheless, some general results can be extracted as to the HPWS factors that most influence innovative performance. The three main aspects are training coverage, employee involvement and the diffusion of bonuses: these variables show persistent and significant relations with all three innovation types, as do the components that contain them. In sum, aspects of the HPWS influence the firm's propensity to innovate.
At the same time, fairly clear (although not always strong) evidence of complementarities between HPWS practices emerges. On the complementarity issue, some specific complementarities can be said to exist. Training activities, when adopted and managed in bundles, are related to the propensity to innovate: a sound skill base may enhance the firm's capacity both to absorb exogenous innovations and to develop innovations endogenously. The presence and diffusion of bonuses and employee involvement also spur innovative propensity, the former because of their incentive nature, the latter because direct worker participation may increase workers' commitment to the organization and thus their willingness to support and suggest innovations. The other line of analysis provides results on the relation between HPWS and the economic performance of the firm. There is a substantial body of international empirical work on the relation between organizational change and economic performance (Black & Lynch 2001; Zwick 2004; Janod & Saint-Martin 2004; Huselid 1995; Huselid & Becker 1996; Cappelli & Neumark 2001), while the works aiming to capture the relations between economic performance and unions or industrial relations aspects are quite scant (Addison & Belfield, 2001; Pencavel, 2003; Machin & Stewart, 1990; Addison, 2005). In the empirical analysis, the integration of the two main areas of the HPWS represents a rarely exploited approach in the panorama of both national and international empirical studies. As remarked by Addison, “although most analysis of workers representation and employee involvement/high performance work practices have been conducted in isolation – while sometimes including the other as controls – research is beginning to consider their interactions” (Addison, 2005, p.407).
The analysis, which exploits temporal lags between the dependent variable and the covariates, a possibility afforded by merging the cross-section and panel data, provides evidence that HPWS practices affect the firm's economic performance, however measured. Although no robust evidence emerges of complementarities among HPWS aspects in their effect on performance, there is evidence of a general positive influence of the single practices. The results are quite sensitive to the time lags chosen, suggesting that time-varying heterogeneity is an important factor in determining the impact of organizational change on economic performance. The implications of the analysis can help both management and local-level policy makers. Although the results cannot simply be extended to other local production systems, it may be argued that they also fit contexts similar to the Reggio Emilia province, characterized by the presence of small and medium enterprises organized in districts and by a deep-rooted unionism with strong supporting institutions. A hope for future research on this subject, however, is that good-quality information will be collected over wider geographical areas, possibly at the national level, and repeated over time. Only in this way will it be possible to untie the Gordian knot of the linkages between innovation, performance, high performance work practices and industrial relations.
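The bundles-versus-isolation question raised in this abstract is commonly operationalised as a regression with interaction terms between practices: a positive, significant interaction coefficient is the usual evidence of complementarity. A minimal sketch on synthetic data follows; the variable names and coefficients are hypothetical illustrations, not the thesis's actual specification or dataset:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500  # hypothetical sample of firms

# Synthetic firm-level data: two binary HPWS practices and a performance outcome
training = rng.integers(0, 2, n).astype(float)     # 1 if training coverage is high
involvement = rng.integers(0, 2, n).astype(float)  # 1 if employee involvement is high
# Build in a complementarity: the bundle pays off beyond the sum of the parts
performance = (0.2 * training + 0.2 * involvement
               + 0.5 * training * involvement + rng.normal(0, 0.1, n))

# OLS with an interaction term; the last coefficient estimates the
# extra payoff of adopting both practices together
X = np.column_stack([np.ones(n), training, involvement, training * involvement])
beta, *_ = np.linalg.lstsq(X, performance, rcond=None)
print(beta)  # intercept, training, involvement, interaction effects
```

In a real application one would add controls, firm fixed effects and the temporal lags discussed above; the sketch only isolates the interaction-term logic.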

Relevance:

10.00%

Publisher:

Abstract:

If the historian's job is to understand the past as it was understood by the people who lived it, then perhaps it is not far-fetched to think that research results must also be communicated with the tools proper to an era, tools that shape the mentality of those who live in it. Emerging technologies, especially in the area of multimedia such as virtual reality, allow historians to communicate the experience of the past through more of the senses. How does history collaborate with information technology, in particular regarding the possibility of virtual historical reconstructions, with related examples and reviews? What most worries historians is whether a reconstruction of a past event, experienced through its recreation in pixels, is a method of historical knowledge that can be considered valid. That is: is the emotion that navigating a 3D environment can arouse a means capable of transmitting knowledge? Or is our idea of the past, and of its study, subtly changed the moment it is disseminated through 3D graphics? For some time, however, the discipline has been coming to terms with this situation, forced above all by the invasiveness of this type of media, by the spectacularisation of the past, and by partial and unscientific popularisations of it. In a post-literary world we must begin to recognise that the visual culture in which we are immersed is changing our relationship with the past: this does not make the knowledge accumulated so far false, but it is necessary to acknowledge that there is more than one historical truth, sometimes written and sometimes visual. The computer has become a ubiquitous platform for the representation and dissemination of information, and methods of interaction and representation are constantly evolving. It is along these two tracks that information technology offers its services to history.
The aim of this thesis is precisely to explore, through the use and testing of various tools and information technologies, how the past can be effectively narrated through three-dimensional objects and virtual environments, and how, as distinctive elements of communication, they can collaborate, in this particular case, with the historical discipline. The research reconstructs some lines of the history of the main factories active in Turin during the Second World War. Given the close relationship between structures and individuals, and in this city in particular between factory and labour movement, it is inevitable to delve into the history of the Turin labour movement, which during the Liberation struggle was a political and social actor of the first rank in the city. In the city, understood as a biological entity involved in the war, the factory (or the factories) becomes the conceptual nucleus through which the city can be read: the factories were the main targets of the bombing raids, and it was in the factories that a war of liberation was fought between the working class and the factory and city authorities. The factory becomes the site of the "usurpation of power" that Weber speaks of, the stage on which the various episodes of the war played out: strikes, deportations, occupations and so on. The model of the city represented here is not a simple visualisation but an information system in which the modelled reality is represented by objects that act as the theatre for events with a precise chronological placement; within it, the user can select static renders (images), pre-computed films (animations) and interactively navigable scenes, as well as search bibliographic sources and scholars' comments specifically linked to the event in question.
The objective of this work is to make the historical disciplines and computer science interact, through several projects, across the different technological opportunities the latter offers. The reconstruction possibilities offered by 3D are thus put at the service of research, offering an integral vision capable of bringing us closer to the reality of the period under consideration and channelling all the results into a single presentation platform. Dissemination: the "Mappa Informativa Multimediale Torino 1945" project. On the practical level, the project provides a navigable interface (Flash technology) representing the map of the city at the time, through which the user can gain a vision of the places and times in which the Liberation took shape, both conceptually and practically. This interweaving of coordinates in space and time not only improves the understanding of the phenomena but also generates greater interest in the subject through highly effective (and appealing) dissemination tools, without losing sight of the need to validate the historical theses, positioning itself as a teaching platform. Such a context requires an in-depth study of the historical events in order to reconstruct a clear map of the city that is accurate both topographically and in terms of multimedia navigation. The preparation of the map follows the standards of the moment, so the software solutions used are those provided by Adobe Illustrator for the topography and Macromedia Flash for the navigation interface. The underlying descriptive data can of course be consulted, being contained on the media support and fully annotated in the bibliography.
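The kind of cross-filtering that such an information system performs, tying each episode to a place on the map and a date, can be illustrated with a small data sketch. Everything here (record fields, event names, source labels) is a hypothetical illustration, not the project's actual data model:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical record behind one point on the interactive map:
# an event with spatial and temporal coordinates plus linked sources.
@dataclass
class Event:
    title: str
    place: str           # e.g. a factory on the 1945 city map
    when: date
    sources: list[str]   # bibliographic references attached to the event

events = [
    Event("Strike", "Conceria Fiorio", date(1944, 3, 1), ["source A"]),
    Event("Bombing raid", "Mirafiori", date(1943, 11, 8), ["source B"]),
]

# Cross-filtering by place and period is what the map interface exposes
def query(events, place=None, start=None, end=None):
    return [e for e in events
            if (place is None or e.place == place)
            and (start is None or e.when >= start)
            and (end is None or e.when <= end)]

print([e.title for e in query(events, place="Conceria Fiorio")])
```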
It is the continuous evolution of information technologies and the massive spread of computer use that is bringing about a substantial change in the study and learning of history; academic institutions and economic operators have embraced the demand coming from users (teachers, students, cultural heritage professionals) for a wider dissemination of historical knowledge through its computerised representation. On the teaching front, reconstructing a historical reality with computer tools allows even non-historians to experience at first hand the problems of research: missing sources, gaps in the chronology, and the assessment of the veracity of facts through evidence. Information technologies allow a complete, unified and exhaustive vision of the past, channelling all the information onto a single platform and enabling even non-specialists to grasp immediately what is being discussed. Even the best history book, by its nature, cannot do this, since it divides and organises information differently. In this way students are given the opportunity to learn through a representation different from those they are used to. The central premise of the project is that student learning outcomes can be improved if a concept or content is communicated through multiple channels of expression, in our case through text, images and a multimedia object. Teaching: the Conceria Fiorio is one of the symbolic sites of the Turin Resistance, and the project is a virtual reality reconstruction of the Conceria Fiorio in Turin. The reconstruction enriches historical culture both for those who produce it, through careful research of the sources, and for those who then use it, above all young people who, attracted by the playful aspect of the reconstruction, learn more easily.
Building a 3D artefact gives students the basis for recognising and expressing the proper relation between the model and the historical object. The stages through which the 3D reconstruction of the Conceria was reached were: in-depth historical research based on the sources, which may be archival documents or archaeological excavations, iconographic sources, cartographic sources, etc.; the modelling of the buildings on the basis of the historical research, to provide the polygonal geometric structure that permits three-dimensional navigation; and the realisation, using computer graphics tools, of the 3D navigation. Unreal Technology is the name of the graphics engine used in numerous commercial video games. One of the fundamental features of this product is a tool called the Unreal editor, with which virtual worlds can be built, and this is what was used for this project. UnrealEd (UEd) is the software for creating levels for Unreal and for games based on the Unreal engine; the free version of the editor was used. The final result of the project is a navigable virtual environment depicting an accurate reconstruction of the Conceria Fiorio at the time of the Resistance. The user can visit the building and view specific information at several points of interest. Navigation takes place in first person, and a process of "spectacularisation" of the visited environments, through period-appropriate furnishings, gives the user greater immersion, making the environment more credible and immediately decodable. The Unreal Technology architecture made it possible to obtain a good result in a very short time, without any programming work being necessary. This engine is therefore particularly suited to the rapid production of prototypes of reasonable quality; the presence of a certain number of bugs, however, makes it partly unreliable.
Using a video-game editor for this reconstruction points towards its possible use in teaching: what 3D simulations allow, in this specific case, is to let students experience the work of historical reconstruction, with all the problems the historian must face in recreating the past. This work is intended to be, for historians, a step towards the creation of a broader expressive repertoire that includes three-dimensional environments. The risk of spending time learning how this technology for generating virtual spaces works makes many of those engaged in teaching sceptical, but the experience of projects developed, above all abroad, shows that they are a good investment. The fact that a software house which creates a highly successful video game includes in its product a set of tools allowing users to create their own worlds to play in is symptomatic: the computer literacy of average users is growing ever more rapidly, and using an editor such as the Unreal Engine will in future be an activity within the reach of an ever wider public. This puts us in a position to design more immersive teaching modules, in which the experience of researching and reconstructing the past is interwoven with the more traditional study of the events of a given period. Interactive virtual worlds are often described as the key cultural form of the twenty-first century, as cinema was for the twentieth. The purpose of this work has been to suggest that there are great opportunities for historians in the use of 3D objects and environments, and that they must seize them. Consider the fact that aesthetics has an effect on epistemology, or at least on the form that the results of historical research take when they have to be disseminated.
A historical analysis carried out superficially or on mistaken premises can nevertheless circulate and gain credit in many circles if it is disseminated through attractive, modern means. This is why it is not worth burying a good piece of work in some library, waiting for someone to discover it; this is why historians must not ignore 3D. Our capacity, as scholars and students, to perceive important ideas and trends often depends on the methods we use to represent data and evidence. For historians to obtain the benefit that 3D brings with it, however, they must develop a research agenda aimed at ensuring that 3D supports their goals as researchers and teachers. A historical reconstruction can be very useful from an educational point of view not only for those who visit it but also for those who build it: the research phase required for the reconstruction cannot but increase the developer's cultural background. Conclusions: the most important thing has been the opportunity to gain experience in using media of this kind to narrate the past and make it known. Reversing the cognitive paradigm I had learned in my humanities training, I tried to derive what we might call "universal laws" from the objective data that emerged from these experiments. From an epistemological point of view, computing, with its capacity to handle impressive masses of data, gives scholars the possibility of formulating hypotheses and then confirming or refuting them through reconstructions and simulations. My work has moved in this direction, seeking to learn and use current tools which will have an ever greater presence in communication (including scientific communication) in the future, and which are the communication media of choice for certain age groups (adolescents).
Pushing the terms to the extreme, we can say that the challenge that visual culture poses today to the traditional methods of doing history is the same one that Herodotus and Thucydides posed to the tellers of myths and legends. Before Herodotus there was myth, which was a perfectly adequate means of narrating and giving meaning to the past of a tribe or a city. In a post-literary world, our knowledge of the past is subtly changing the moment we see it represented in pixels, or when information emerges not on its own but through interactivity with the medium. Our capacity as scholars and students to perceive important ideas and trends often depends on the methods we use to represent data and evidence. For historians to obtain the benefit implicit in 3D, however, they must develop a research agenda aimed at ensuring that 3D supports their goals as researchers and teachers. The experiences gathered in the preceding pages lead us to think that, in a not too distant future, a tool such as the computer will be the primary means through which knowledge is transmitted, and that, from a teaching point of view, its interactivity engages students as no other modern communication medium can.


Molecular profiling of peripheral T-cell lymphomas not otherwise specified. Peripheral T-cell lymphomas (PTCLs) are a heterogeneous group of tumors that the WHO classification basically subdivides into specified and not otherwise specified (NOS). In Western countries, they represent around 12% of all non-Hodgkin's lymphomas. In particular, PTCL/NOS is the commonest subtype, corresponding to about 60-70% of all T-cell lymphomas. However, it remains a complex entity showing great variety in morphology, immunophenotype and clinical behavior. In particular, the molecular pathology of these tumors is still poorly known: many alterations have been found, but no single gene has been demonstrated to play a pathogenetic role. Recently, gene expression profiling (GEP) allowed the identification of PTCL/NOS-associated molecular signatures, leading to a better understanding of their histogenesis, pathogenesis and prognostication. Interestingly, proliferation pathways are commonly altered in PTCLs, with highly proliferative cases characterized by a poorer prognosis. In this study, we aimed to investigate the possible role in PTCL/NOS pathogenesis of selected molecules known to be relevant for proliferation control. In particular, we analyzed the cell cycle regulators PTEN and CDKN1B/p27, the NF-kB pathway, and the tyrosine kinase PDGFR. First, we found that PTEN and p27 seem to be regulated in PTCL/NOS as in normal T-lymphocytes, as far as expression and cellular localization are concerned, and do not present structural abnormalities in the vast majority of PTCL/NOS. Secondly, the NF-kB pathway appeared to be variably activated in PTCL/NOS. In particular, according to NF-kB gene expression levels, the tumors could be divided into two clusters (C1 and C2): C1 corresponded to cases presenting a global down-regulation of the entire pathway, while C2 showed over-expression of genes involved in TNF signaling.
Notably, by immunohistochemistry we showed that either the canonical or the alternative NF-kB pathway was activated in around 40% of cases. Finally, we found PDGFRA to be consistently over-expressed (at both mRNA and protein level) and activated in almost all PTCLs/NOS. Noteworthy, when investigating possible causes of PDGFRA deregulation, we found evidence that PDGFRA over-expression is due to the absence of miR-152, which appears to be responsible for PDGFRA silencing in normal T-cells. Furthermore, we could demonstrate that its aberrant activation is sustained by an autocrine loop. Importantly, to the best of our knowledge this is the first case of a hematological tumor in which aberrant tyrosine kinase activity is determined by deregulated miRNA expression and autocrine loop activation. Taken together, our results provide novel insights into PTCL/NOS pathogenesis, opening new intriguing scenarios for innovative therapeutic interventions.


The present work offers a comprehensive and comparative study of the different legal and regulatory problems involved in international securitization transactions. First, an introduction to securitization is provided, covering the basic elements of the transaction and its different varieties, including dynamic securitization and synthetic securitization structures. Together with this introduction to the intricacies of the structure, an insight into the influence of securitization on the financial and economic crisis of 2007-2009 is provided, as well as an overview of the process of regulatory competition and cooperation that constitutes the framework for the international aspects of securitization. The next Chapter focuses on the aspects that constitute the foundations of structured finance: the inception of the vehicle, and the transfer of the risks associated with the securitized assets, with particular emphasis on the validity of those elements and on how a securitization transaction can be threatened at its root. In this sense, special importance is given to the validity of the trust as an instrument of finance, to the assignment of future receivables or receivables in block, and to the importance of formalities for the validity of corporations, trusts, assignments, etc., and to the interaction of such formalities, contained in general corporate, trust and assignment law, with those contemplated under specific securitization regulations. Chapter III then focuses on creditor protection aspects. As such, we provide some insights into the debate on the capital structure of the firm and its inadequacy for assessing the financial soundness problems inherent to securitization, and then proceed to analyze the importance of rules on creditor protection in the context of securitization. The corollary lies in the rules that apply in case of insolvency.
In this sense, we separate the cases where a party involved in the transaction goes bankrupt from those where the transaction itself collapses. Finally, we focus on the scenario where a substance-over-form analysis may compromise some of the elements of the structure (notably the limited liability of the sponsor, and/or the transfer of assets) by means of veil piercing, substantive consolidation, or recharacterization theories. Once these elements have been covered, the next Chapters focus on the regulatory aspects involved in the transaction. Chapter IV deals with "market" regulations, i.e. those concerned with information disclosure and other rules (appointment of the indenture trustee, and elaboration of a rating by a rating agency) concerning the offering of asset-backed securities to the public. Chapter V, on the other hand, focuses on the "prudential" regulation of the entity entrusted with securitizing assets (the so-called Special Purpose Vehicle) and of the other entities involved in the process. Regarding the SPV, reference is made to licensing requirements, restriction of activities and governance structures designed to prevent abuses. Regarding the sponsor of the transaction, the focus is on provisions on sound originating practices and the servicing function. Finally, we study accounting and banking regulations, including the Basel I and Basel II Frameworks, which determine the consolidation of the SPV and the de-recognition of the securitized asset from the originating company's balance sheet, as well as the subsequent treatment of those assets, in particular by banks. Chapters VI-IX are concerned with liability matters. Chapter VI is an introduction to the different sources of liability. Chapter VII focuses on the liability of the SPV and its management for the information supplied to investors, the management of the asset pool, and the breach of loyalty (or fiduciary) duties.
Chapter VIII refers instead to the liability of the originator as a result of such information and statements, but also as a result of inadequate and reckless originating or servicing practices. Chapter IX finally focuses on the third parties entrusted with ensuring the soundness of the transaction towards the market, the so-called gatekeepers. In this respect, special emphasis is placed on the liability of indenture trustees, underwriters and rating agencies. Chapters X and XI focus on the international aspects of securitization. Chapter X contains a conflict-of-laws analysis of the different aspects of structured finance. In this respect, a study is made of the laws applicable to the vehicle, to the transfer of risks (either by assignment or by means of derivatives contracts), and to liability issues; a study is also made of the competent jurisdiction (and applicable law) in bankruptcy cases, as well as in cases where a substance-over-form analysis is performed. Special attention is also devoted to the role of financial and securities regulations, to their territorial limits, and to the extraterritoriality problems involved. Chapter XI supplements the prior Chapter, for it analyzes the limits to the States' exercise of regulatory power set by the personal and "market" freedoms included in the US Constitution or the EU Treaties. Reference is also made to the (still insufficient) rules of the WTO Framework and their significance for the States' recognition and regulation of securitization transactions.


The project was developed in three parts: the analysis of p63 isoforms in breast tumours; the study of intra-tumour heterogeneity in metaplastic breast carcinoma; and the analysis of oncocytic breast carcinoma. p63 is a sequence-specific DNA-binding factor, a homologue of the tumour suppressor and transcription factor p53. The human p63 gene is composed of 15 exons, and transcription can occur from two distinct promoters: the transactivating isoforms (TAp63) are generated by a promoter upstream of exon 1, while an alternative promoter located in intron 3 leads to the expression of N-terminal truncated isoforms (ΔNp63). It has been demonstrated that anti-p63 antibodies decorate the majority of squamous cell carcinomas of different organs; moreover, breast tumours with myoepithelial differentiation show nuclear p63 expression. Two new isoforms have been described with the same sequence as TAp63 and ΔNp63 but lacking exon 4: d4TAp63 and ΔNp73L, respectively. The purpose of the study was to investigate the molecular expression of N-terminal p63 isoforms in benign and malignant breast tissues. In the present study, 40 specimens from normal breast, benign lesions, DIN/DCIS, and invasive carcinomas were analyzed by immunohistochemistry and RT-PCR (Reverse Transcriptase-PCR) in order to disclose the patterns of p63 expression. We observed that the full-length isoforms can be detected in both non-neoplastic and neoplastic lesions, while the short isoforms are only present in the neoplastic cells of invasive carcinomas. Metaplastic carcinomas of the breast are a heterogeneous group of neoplasms which exhibit varied patterns of metaplasia and differentiation. The existence of such non-modal populations harbouring distinct genetic aberrations may explain the phenotypic diversity observed within a given tumour. Intra-tumour morphological heterogeneity is not uncommon in breast cancer and can often be appreciated in metaplastic breast carcinomas.
The aim of this study was to determine the existence of intra-tumour genetic heterogeneity in metaplastic breast cancers and whether areas with distinct morphological features in a given tumour might be underpinned by distinct patterns of genetic aberrations. 47 cases of metaplastic breast carcinoma were retrieved; of these, 9 had areas of sufficient dimensions to be independently microdissected. Our results indicate that at least some breast cancers are composed of multiple non-modal populations of clonally related cells, and provide direct evidence that at least some types of metaplastic breast cancer are composed of multiple non-modal clones harbouring distinct genetic aberrations. Oncocytic tumours represent a distinctive set of lesions with the typical granular cytoplasmic eosinophilia of the neoplastic cells. Only rare examples of oncocytic breast carcinoma have been reported in the literature, and the incidence is probably underestimated. In this study we analysed 33 cases of invasive oncocytic carcinoma of the breast, selected according to morphological and immunohistochemical criteria. These tumours were morphologically classified and studied by immunohistochemistry and aCGH. We conclude that oncocytic breast carcinoma is a morphologic entity with distinctive ultrastructural and histological features; immunohistochemically it is characterized by a luminal profile, it has a frequency of 19.8%, it has no distinctive clinical features and, at the molecular level, it shows a specific constellation of genetic aberrations.


Smart Environments are currently considered a key factor in connecting the physical world with the information world. A Smart Environment can be defined as the combination of a physical environment, an infrastructure for data management (called a Smart Space), a collection of embedded systems gathering heterogeneous data from the environment, and a connectivity solution to convey these data to the Smart Space. With this vision, any application that takes advantage of the environment can be devised without the need to access it directly, since all information is stored in the Smart Space in an interoperable format. Moreover, according to this vision, for each entity populating the physical environment (users, objects, devices, environments) the following questions arise: "Who?", i.e. which entities should be identified? "Where?", i.e. where are such entities located in physical space? And "What?", i.e. which attributes and properties of the entities should be stored in the Smart Space in a machine-understandable format, in the sense that their meaning has to be explicitly defined and all the data should be linked together in order to be automatically retrieved by interoperable applications? Starting from this, location detection is a necessary step in the creation of Smart Environments. If the addressed entity is a user and the environment a generic one, a meaningful way to assign the position is through a Pedestrian Tracking System. In this work, two solutions for this type of system are proposed and compared; one of the two was studied and developed in all its aspects during the doctoral period. The work also investigates the problem of creating and managing the Smart Environment. The proposed solution is to create, by means of natural interactions, links between objects and between objects and their environment, through the use of specific devices, i.e. Smart Objects.
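The "Who / Where / What" model described above can be pictured as a store of linked, machine-retrievable statements about entities. The following is a minimal illustrative sketch only, not the thesis' actual implementation: a Smart Space modelled as a tiny triple store, with invented entity and property names.

```python
# Hypothetical sketch: a Smart Space as a minimal (subject, predicate,
# object) triple store. Applications query it instead of accessing the
# physical environment directly. All names below are invented examples.

class SmartSpace:
    def __init__(self):
        self.triples = set()

    def insert(self, subject, predicate, obj):
        """Store one statement about an entity in interoperable form."""
        self.triples.add((subject, predicate, obj))

    def query(self, subject=None, predicate=None, obj=None):
        """Pattern match over the triples; None acts as a wildcard,
        so linked data can be retrieved automatically."""
        return [t for t in self.triples
                if (subject is None or t[0] == subject)
                and (predicate is None or t[1] == predicate)
                and (obj is None or t[2] == obj)]

space = SmartSpace()
space.insert("user:alice", "type", "Pedestrian")       # Who?
space.insert("user:alice", "locatedIn", "room:lab1")   # Where?
space.insert("room:lab1", "temperature", "21.5")       # What?
```

A pedestrian tracking system would then play the role of the producer that keeps the "locatedIn" statements up to date, while any interoperable application can answer "where is this user?" with `space.query(subject="user:alice", predicate="locatedIn")`.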


Recently, in most industrial automation processes an ever-increasing degree of automation has been observed. This increase is motivated by higher requirements for systems with high performance in terms of quality of the products and services generated, productivity, efficiency, and low costs in design, realization and maintenance. This trend in the growth of complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of Mechatronics, is merging with other technologies such as Informatics and communication networks. An AMS is a very complex system that can be thought of as constituted by a set of flexible working stations and one or more transportation systems. To understand how important these machines are in our society, consider that every day most of us use bottles of water or soda, or buy boxed products such as food or cigarettes. Another indication of their complexity derives from the fact that the consortium of machine producers has estimated around 350 types of manufacturing machine. A large number of manufacturing machine industries are present in Italy, notably the packaging machine industry; a great concentration of this kind of industry is located in the Bologna area, which for this reason is called the "packaging valley". Usually, the various parts of an AMS interact in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. This is often the case in large-scale systems, organized in a modular and distributed manner.
Even if the success of a modern AMS from a functional and behavioural point of view is still to be attributed to the design choices made in the definition of the mechanical structure and of the electrical/electronic architecture, the system that governs the control of the plant is becoming crucial, because of the large number of duties associated with it. Apart from the activity inherent to the automation of the machine cycles, the supervisory system is called to perform other main functions such as: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and a crucial functional flexibility; dynamically adapting the control strategies according to the different productive needs and operational scenarios; obtaining a high quality of the final product through verification of the correctness of the processing; guiding the operator in charge of the machine to promptly and carefully take the actions needed to establish or restore the optimal operating conditions; and managing in real time information on diagnostics, as a support to machine maintenance operations. The facilities that designers can directly find on the market, in terms of software component libraries, provide adequate support for the implementation of either top-level or bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices. What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, focusing on the cross-cutting functionalities characterizing the automation domain, may help designers in the process of modelling and structuring their applications according to their specific needs.
Historically, the design and verification process for complex automated industrial systems has been performed in an empirical way, without a clear distinction between functional and technological-implementation concepts and without a systematic method to deal organically with the complete system. Traditionally, in the field of analog and digital control, design and verification through formal and simulation tools have long been adopted, at least for multivariable and/or nonlinear controllers for complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives and complex power electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different, usually very "unstructured" way: no clear distinction between functions and implementations, or between functional architectures and technological architectures and platforms, is considered. This difference is probably due to the different "dynamical framework" of logic control with respect to analog/digital control. As a matter of fact, in logic control discrete-event dynamics replace time-driven dynamics; hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to enlighten the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly connected to the adopted implementation technology (relays in the past, software nowadays), leading again to a deep confusion between the functional and technological views. In industrial automation software engineering, concepts such as modularity, encapsulation, composability and reusability are strongly emphasized and profitably realized in the so-called object-oriented methodologies.
Industrial automation has lately been adopting this approach, as testified by the IEC standards IEC 61131-3 and IEC 61499, which have been considered in commercial products only recently. On the other hand, in the scientific and technical literature many contributions have already been proposed to establish a suitable modelling framework for industrial automation. In recent years a considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems has been observed. As far as logic control design is concerned, Model Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability and safety of technological systems. In other words, the control system should not only deal with the nominal behaviour, but should also handle other important duties, such as diagnosis and fault isolation, recovery and safety management. Indeed, together with high performance, in complex systems fault occurrences increase. This is a consequence of the fact that, as typically occurs in reliable mechatronic systems, in complex systems such as AMS an increasing number of electronic devices, more vulnerable by their own nature, are present alongside reliable mechanical elements. The diagnosis and fault isolation problem in a generic dynamical system consists in the design of an elaboration unit that, by appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults on the plant devices and of reconfiguring the control system so as to guarantee satisfactory performance.
The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function while guaranteeing the desired level of reliability and safety; the next step is that of preventing faults and eventually reconfiguring the control system so that faults are tolerated. On this topic, important improvements to formal verification of logic control, fault diagnosis and fault tolerant control derive from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture to help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics and a description of industrial automated systems in Chapter 1. In Chapter 2, a survey of the state of the software engineering paradigm applied to industrial automation is discussed. Chapter 3 presents an architecture for industrial automated systems based on the new concept of Generalized Actuator, showing its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, in order to obtain better reusability and modularity of the control logic. In Chapter 5, a new approach based on Discrete Event Systems is presented for the problem of software formal verification, together with an active fault tolerant control architecture using online diagnostics. Finally, conclusive remarks and some ideas on new directions to explore are given. Appendix A briefly reports some concepts and results about Discrete Event Systems which should help the reader understand some crucial points in Chapter 5, while Appendix B gives an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approach presented in Chapters 3, 4 and 5. Appendix C reports some component models used in Chapter 5 for formal verification.
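The discrete-event view of logic control mentioned above can be made concrete with a toy model. The sketch below is purely illustrative and is not taken from the thesis: a plant is modelled as a finite automaton over event labels, with an invented "stuck valve" fault, showing how faulty and nominal behaviours become distinguishable sequences of events that a verifier or diagnoser can reason about.

```python
# Illustrative sketch, assuming nothing from the thesis: a plant as a
# discrete event system (finite automaton). States, events and the fault
# are invented for the example.

class DES:
    def __init__(self, initial, transitions):
        self.initial = initial
        self.transitions = transitions  # (state, event) -> next state

    def run(self, events):
        """State reached after a sequence of events, or None if some
        event is not enabled in the current state (a modelling error
        that formal verification would flag)."""
        state = self.initial
        for e in events:
            state = self.transitions.get((state, e))
            if state is None:
                return None
        return state

# A valve that can fail stuck-closed; 'fault' is an unobservable event.
plant = DES("closed", {
    ("closed", "open_cmd"): "open",
    ("open", "close_cmd"): "closed",
    ("closed", "fault"): "stuck",
    ("stuck", "open_cmd"): "stuck",  # command has no effect after the fault
})
```

In this framework, nominal behaviour (`["open_cmd", "close_cmd"]` returns to "closed") and post-fault behaviour (`["fault", "open_cmd"]` remains "stuck") are distinct runs of the same automaton, which is what makes diagnoser design and formal verification of the control logic possible.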


Over the last two decades in particular, a hepatological entity today defined as Non-alcoholic Fatty Liver Disease (NAFLD) has emerged alongside the previously known causes of liver disease, to the point of proving, through successive scientific findings, the prevalent cause of hepatopathy, especially in Western, industrialized countries. In the same years, another complex clinical problem known as the Metabolic Syndrome has gradually taken shape through its multiple correlations with the prevalent causes of morbidity and mortality in our setting, from diabetes to cardiovascular disease and, not least, NAFLD itself. The aim of the Study addressed by this thesis was precisely to re-evaluate, in the Italian territory, the prevalence of liver disease attributable in particular to NAFLD and its association with the Metabolic Syndrome.


The study conducted during the doctoral research focused on the evaluation and monitoring of the different thermo-oxidative degradations in frying oils. To achieve this goal, a screening of the main oils on the Italian market was carried out, followed by the formulation of two vegetable-oil blends, which were subjected to two experimental plans of controlled, standardized frying in the laboratory, followed by two frying plans carried out in two different real settings, a canteen and a restaurant. Each of the two blends was compared with two reference oils. To this end, the fatty-acid profile, oxidative and hydrolytic stability, smoke point, polar compounds, total tocopherol content and volatile compounds were determined both on the raw oils and on the oils subjected to the different frying processes. The study made it possible to identify one of the blends as a valid alternative to palm oil, widely used for frying foods, and provided more precise indications on the type and extent of the modifications that occur during frying, depending on the conditions employed.


Hepatitis B x protein (HBx) is a non-structural, multifunctional protein of hepatitis B virus (HBV) that modulates a variety of host processes. Due to its transcriptional activity, able to alter the expression of growth-control genes, it has been implicated in hepatocarcinogenesis. Increased expression of HBx has been reported in liver tissue samples of hepatocellular carcinoma (HCC), and a specific anti-HBx immune response can be detected in the peripheral blood of patients with chronic HBV. However, its role and extent have not yet been clarified. Thus, we performed a cross-sectional analysis of the anti-HBx-specific T cell response in HBV-infected patients at different stages of disease. A total of 70 HBV-infected subjects were evaluated: 15 affected by chronic hepatitis (CH; median age 45 yrs), 14 by cirrhosis (median age 55 yrs), 11 with dysplastic nodules (median age 64 yrs), 15 with HCC (median age 60 yrs), and 15 with IC (median age 53 yrs). All patients were infected by virus genotype D with different levels of HBV viremia, and most of them (91%) were HBeAb positive. The HBx-specific T cell response was evaluated by anti-Interferon (IFN)-gamma Elispot assay after in vitro stimulation of peripheral blood mononuclear cells, using 20 overlapping synthetic peptides covering the whole HBx protein sequence. HBx-specific IFN-gamma-secreting T cells were found in 6 out of 15 patients with chronic hepatitis (40%), 3 out of 14 with cirrhosis (21%), 5 out of 11 with cirrhosis with macronodules (54%), and 10 out of 15 HCC patients (67%). The number of responding patients was significantly higher in HCC than in IC (p=0.02) and cirrhosis (p=0.02). A central region of the X protein, corresponding to peptides 86-88, was preferentially recognized.
The HBx-specific response does not correlate with clinical features of disease (AFP, MELD). The HBx-specific T-cell response seems to increase with the progression of the disease, being higher in subjects with dysplastic or neoplastic lesions, and may represent an additional tool to monitor patients at high risk of developing HCC.


Spinal cord injury (SCI) results not only in paralysis but is also associated with a range of autonomic dysregulations that can interfere with cardiovascular, bladder, bowel, temperature, and sexual function. The extent of the autonomic dysfunction is related to the level and severity of injury to descending autonomic (sympathetic) pathways. For many years there was limited awareness of these issues, and the attention given to them by the scientific and medical community was scarce. Even though a new system to document the impact of SCI on autonomic function has recently been proposed, the current standard of assessment of SCI (the American Spinal Injury Association (ASIA) examination) evaluates motor and sensory pathways, but not the severity of injury to autonomic pathways. Besides its severe impact on quality of life, autonomic dysfunction in persons with SCI is associated with an increased risk of cardiovascular disease and mortality. Therefore, obtaining information regarding autonomic function in persons with SCI is pivotal, and clinical examinations and laboratory evaluations to detect the presence of autonomic dysfunction and quantify its severity are mandatory. Furthermore, previous studies demonstrated that there is an intimate relationship between the autonomic nervous system and sleep from anatomical, physiological, and neurochemical points of view. Although previous epidemiological studies demonstrated that sleep problems are common in SCI, so far only limited polysomnographic (PSG) data are available. Finally, until now, circadian and state-dependent autonomic regulation of blood pressure (BP), heart rate (HR) and body core temperature (BcT) had never been assessed in SCI patients. The aim of the current study was to establish the association between the autonomic control of cardiovascular function and thermoregulation, sleep parameters, and increased cardiovascular risk in SCI patients.


Introduction: the Microscopic Colitides (MC), otherwise known as Collagenous Colitis (CC) and Lymphocytic Colitis (LC), are chronic inflammatory disorders of the colon that cause diarrhea and most frequently affect elderly women and subjects on drug therapy. In recent years their incidence seems to have increased in several Western countries, but their prevalence in Italy is still uncertain. Aim: the present prospective, multicenter study was designed to assess the prevalence of MC in patients undergoing colonoscopy for chronic non-bloody diarrhea. Patients and methods: from May 2010 to September 2010, all adult subjects referred to two centers in the Milan metropolitan area for a pancolonoscopy were consecutively enrolled. In subjects with chronic non-bloody diarrhea, multiple biopsies were taken in the ascending colon, sigmoid colon and rectum, as well as from any macroscopic lesions. Results: of the 8008 colonoscopies examined, 265 were performed for chronic diarrhea; among these, 8 had incomplete information and 52 had endoscopic findings consistent with other intestinal disorders (i.e. IBD, tumors, diverticulitis). 205 colonoscopies were essentially negative, 175 of them with adequate microscopic sampling (M:F = 70:105; median age 61 years). Histological analysis documented 38 new cases of MC (M:F = 14:24; median age 67.5 years): 27 CC (M:F = 10:17; median age 69 years) and 11 LC (M:F = 4:7; median age 66 years). In another 25 cases, microscopic alterations lacking sufficient features for a diagnosis of MC were observed. Conclusions: in the present study, microscopic analysis of the colon identified MC in 21.7% of subjects with chronic non-bloody diarrhea and a negative pancolonoscopy. Microscopic study of the colon is therefore a fundamental step for the correct diagnostic work-up of chronic diarrhea, especially after 60 years of age.
Large prospective, multicenter studies will have to clarify the role and weight of the risk factors associated with these disorders.


The aim of this three-year PhD research project is the study of genetic alterations in a group of patients with mycosis fungoides and a group of patients with Sézary syndrome. DNA was extracted from skin biopsies and analyzed by array-CGH, comparing it against healthy reference DNA, in order to identify genes potentially involved in the oncogenic process. For each patient, this analysis was performed on biopsies taken at an early stage of disease and at a stage of disease progression. A miRNA analysis was also carried out on the same patients, on the hypothesis that the miRNA expression profile can provide useful information for predicting disease status, clinical course, tumor progression and therapeutic response. This work was then extended to biopsies taken from patients with Sézary syndrome, which, when it does not arise primarily as such, can be regarded as an evolutionary stage of mycosis fungoides. The assessment of genetic alterations, and in particular of the correlation between gene duplication/deletion and gene over-/under-expression, was made possible by interpreting and comparing the data obtained with the array-CGH and miRNA techniques. The results were compared to determine which chromosomal alterations occurred at the different disease stages. The application of array-CGH and of the miRNA analysis method proved very useful for identifying the various chromosomal aberrations present in the genome of patients with mycosis fungoides and Sézary syndrome, for assessing patient prognosis, and for seeking to improve existing therapies or find new therapeutic options for the treatment of the two diseases.
The study of these profiles can therefore be a tool of great importance in the classification and diagnosis of tumors.
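At its core, the array-CGH comparison described above reduces to computing, probe by probe, the log2 ratio of the tumor-DNA signal to the healthy-reference signal and calling copy-number gains and losses from it. A minimal sketch of that step (the function name and the ±0.3 threshold are illustrative assumptions, not taken from the thesis):

```python
import math

def call_cnv(sample_intensity, reference_intensity, threshold=0.3):
    """Classify a probe as 'gain', 'loss' or 'neutral' from array-CGH
    fluorescence intensities, using a hypothetical +/-0.3 cutoff on the
    log2 ratio (a common heuristic, not the thesis's actual pipeline)."""
    log2_ratio = math.log2(sample_intensity / reference_intensity)
    if log2_ratio > threshold:
        return "gain"
    if log2_ratio < -threshold:
        return "loss"
    return "neutral"

# A probe twice as intense in the tumor sample as in the healthy
# reference gives log2(2) = 1.0, which is called as a gain.
print(call_cnv(2.0, 1.0))  # -> gain
```

In practice this per-probe call is only the first step: neighbouring probes are segmented together and the resulting regions are then correlated with the miRNA expression data, as the abstract describes.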

Resumo:

The post-genomic era set the challenge of developing drugs that target an ever-growing list of disease-associated proteins. However, a corresponding increase in the number of drugs approved each year has not yet been observed. To close this gap, innovative approaches must be applied to target validation in drug discovery, and at the same time synthetic organic chemistry must find new, fruitful strategies for obtaining biologically active small molecules, not only as therapeutic agents but also as diagnostic tools to identify possible cellular targets. In this context, given the multifactorial mechanistic nature of cancer, new chimeric molecules, which can serve either as antitumor lead candidates or as valuable chemical tools to study molecular pathways in cancer cells, were developed using a multitarget-directed drug design strategy. Following this approach, the desired hybrid compounds were obtained by combining in a single chemical entity SAHA analogues, which target histone deacetylases (HDACs), with substituted stilbene or terphenyl derivatives able to block the cell cycle and induce apoptosis and cell differentiation, and with a Sorafenib derivative, a multikinase inhibitor. The new chimeric derivatives were characterized with respect to their cytotoxic activity and their effects on cell-cycle progression in Bcr-Abl-expressing K562 leukemia cell lines, as well as their HDAC inhibition. Preliminary results confirmed that one of the hybrid compounds has the desired chimeric profile. A separate project, carried out in the laboratory of Dr Spring, concerned the synthesis of a diversity-oriented synthesis (DOS) library of macrocyclic peptidomimetics. From a biological point of view, this class of molecules is extremely interesting but underrepresented in drug discovery because of its poor synthetic accessibility; it therefore represents a worthwhile challenge for DOS to take on.
A build/couple/pair (B/C/P) approach efficiently provided, in only a few steps, the structural diversity and complexity required for such compounds.