7 results for advanced glycosylation end-product receptor

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance: 100.00%

Abstract:

Ischemic preconditioning is a complex cardioprotective phenomenon that involves adaptive changes in cells and molecules. This adaptation occurs in a biphasic pattern: an early phase that develops after 1-2 h and a late phase that develops after 12-24 h. While it is widely accepted that reactive oxygen species (ROS) are strongly involved in triggering ischemic preconditioning, it is not clear whether they play a major role in the early or the late phase of preconditioning, nor which mechanisms are involved. Methylglyoxal, a metabolic compound formed mainly from the glycolytic intermediate glyceraldehyde-3-phosphate, is a precursor of advanced glycation end products (AGEs). It is more reactive than glucose and shows a stronger ability to cross-link with protein amino groups to form AGEs. Methylglyoxal-induced cytotoxicity may be at least partially responsible for cardiovascular disease and Alzheimer's disease. Methylglyoxal homeostasis is controlled by the glyoxalase system, which consists of two enzymes, glyoxalase 1 (GLO1) and glyoxalase 2. A recent study demonstrated that the transcriptional levels of GLO1 are controlled by NF-E2-related factor 2 (Nrf2). The isothiocyanate sulforaphane (SF), derived from the hydrolysis of glucoraphanin abundantly present in broccoli, is one of the most potent inducers of phase II enzymes through the Keap1–Nrf2 pathway. The aim of this thesis was to evaluate molecular mechanisms in cardio- and neuroprotection and the possibility of their modulation by nutraceutical phytocomponents. On the one hand, this thesis shows that the protection induced by H2O2 is mediated by the induction of detoxifying and antioxidant phase II enzymes, regulated not only by the transcription factor Nrf2 but also by Nrf1; on the other hand, our data represent an innovative result because the possibility of inducing GLO1 by SF supplementation is demonstrated for the first time.

Relevance: 100.00%

Abstract:

It has been almost fifty years since Harry Eckstein's classic monograph, A Theory of Stable Democracy (Princeton, 1961), where he sketched out the basic tenets of the "congruence theory", which was to become one of the most important and innovative contributions to understanding democratic rule. His next work, Division and Cohesion in Democracy (Princeton University Press, 1966), is designed to serve as a plausibility probe for this 'theory' (ftn.) and is a case study of a Northern democratic system, Norway. What is more, this line of his work best exemplifies the contribution Eckstein brought to the methodology of comparative politics through his seminal article, "Case Study and Theory in Political Science" (in Greenstein and Polsby, eds., Handbook of Political Science, 1975), on the importance of the case study as an approach to empirical theory. This article demonstrates the special utility of "crucial case studies" in testing theory, thereby undermining the accepted wisdom in comparative research that the larger the number of cases, the better. Although not along the same lines, but shifting the case study unit of research, I intend to take up this challenge and build upon an equally unique political system, the Swedish one. Bearing in mind the peculiarities of the Swedish political system, my unit of analysis is further restricted to the Swedish Social Democratic Party, the Svenska Arbetare Partiet (SAP). My research nonetheless stays within the methodological framework of case study theory inasmuch as it focuses on a single political system and party. SAP's endurance in government office and its electoral success throughout half a century (ftn. As of the 1991 election, there had been about 56 years - more than half a century - of social democratic "reign" in Sweden.) are undeniably a performance no other Social Democratic party has yet achieved under democratic conditions. It is therefore legitimate to inquire into the exceptionality of this unique political power combination. What were the different components of this dominant power position that made SAP's governmental office stamina possible? I will argue here that it was the end product of a combination of multifarious factors, such as a key position in the party system, strong party leadership and organization, and a carefully designed strategy regarding class politics and welfare policy. My research is divided into three main parts: the historical part, the 'welfare' part and the 'environment' part. The first part is a historical account of the main political events and issues relevant for my case study. Chapter 2 is devoted to the historical events unfolding in the 1920-1960 period: the Saltsjoebaden Agreement, the series of workers' strikes in the 1920s and SAP's inception. It presents SAP's ascent to power in the mid 1930s and the party's ensuing strategies for winning and keeping political office, that is, its economic program and key economic goals. The following chapter - chapter 3 - explores the next period, from the 1960s to the 1990s, and covers the party's troubled political times, its peak and the beginnings of its decline. The 1960s are relevant for SAP's planning of a long-term economic strategy - the Rehn-Meidner model, a new way of macroeconomic steering, based on the Keynesian model but adapted to the new economic realities of welfare capitalist societies.
The second and third parts of this study develop several hypotheses related to SAP's 'dominant position' (endurance in politics and in office) and then test them. Mainly, the twin issues of economics and the environment are raised and their political relevance for the party analyzed. On the one hand, globalization and its spillover effects on the Swedish welfare system are important causal factors in explaining the transformative socio-economic challenges the party had to contend with. On the other hand, Europeanization and environmental change influenced to a great degree SAP's foreign policy choices and its domestic electoral strategies. The implications of globalization for the Swedish welfare system are the subject of two chapters - chapters four and five - while the consequences of Europeanization are treated at length in the third part of this work - chapters six and seven. At first sight, the link between foreign policy and electoral strategy is difficult to prove and uncanny, to say the least. However, in SAP's case there is a substantial body of literature and public opinion statistical data showing that governmental domestic policy and party politics are tightly dependent on foreign policy decisions and sovereignty issues. Again, these country characteristics and peculiar causal relationships are outlined in the first chapters and explained in the second and third parts. The sixth chapter explores the presupposed relationship between Europeanization and environmental policy, on the one hand, and SAP's environmental policy formulation and simultaneous agenda-setting at the international level, on the other. This chapter describes Swedish leadership in environmental policy formulation on two simultaneous fronts and across two different time spans. The last chapter, chapter eight, while working toward a conclusion, explores the alternative theories plausible in explaining the outlined hypotheses and points out the reasons why these theories do not hold as valid alternative explanations to my systemic corporatism thesis as the main causal factor determining SAP's 'dominant position'. Among the alternative theories, I consider L. Traedgaardh and Bo Rothstein's historical exceptionalism thesis and the public opinion thesis, which alone are not able to explain the half century of social democratic endurance in government in the Swedish case.

Relevance: 30.00%

Abstract:

Nowadays, computing is migrating from traditional high performance and distributed computing to pervasive and utility computing based on heterogeneous networks and clients. The current trend suggests that future IT services will rely on distributed resources and on fast communication of heterogeneous contents. The success of this new range of services is directly linked to the effectiveness of the infrastructure in delivering them. The communication infrastructure will be the aggregation of different technologies, even though the current trend suggests the emergence of a single IP-based transport service. Optical networking is a key technology for answering the increasing requests for dynamic bandwidth allocation and for configuring multiple topologies over the same physical layer infrastructure; however, optical networks today are still far from allowing users to directly configure and offer network services, and they need to be enriched with more user-oriented functionalities. Moreover, current Control Plane architectures only facilitate efficient end-to-end connectivity provisioning and cannot meet future network service requirements, e.g. the coordinated control of resources. The overall objective of this work is to improve the usability and accessibility of the services provided by the optical network. More precisely, the definition of a service-oriented architecture is the enabling technology that allows user applications to benefit from advanced services over an underlying dynamic optical layer. The definition of a service-oriented networking architecture based on advanced optical network technologies gives users and applications access to abstracted levels of information regarding the offered advanced network services. This thesis addresses the problem of defining such a Service Oriented Architecture and its relevant building blocks, protocols and languages. In particular, this work has focused on the use of the SIP protocol as an inter-layer signalling protocol, which defines the Session Plane in conjunction with the Network Resource Description language. On the other hand, an advanced optical network must accommodate high data bandwidth with different granularities. Currently, two main technologies are emerging and promoting the development of the future optical transport network: Optical Burst Switching and Optical Packet Switching. These technologies promise to provide all-optical burst or packet switching, respectively, instead of the current circuit switching. However, the electronic domain is still present in the scheduler's forwarding and routing decisions. Because of the high optical transmission rates, the burst or packet scheduler faces a difficult challenge; consequently, a high-performance, timing-focused design of both memory and forwarding logic is needed. This open issue is faced in this thesis by proposing a highly efficient implementation of a burst and packet scheduler. The main novelty of the proposed implementation is that the scheduling problem is turned into the simple calculation of a min/max function, whose complexity is almost independent of the traffic conditions.
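To illustrate how such a scheduling decision can reduce to a single min/max evaluation, the following Python sketch implements a horizon-based (LAUC-style) burst scheduler. It is a simplified illustration under assumed semantics (a single horizon value per channel, bursts delayed rather than dropped), not the high-performance hardware implementation proposed in the thesis.

from dataclasses import dataclass
from typing import List

@dataclass
class Channel:
    ident: int
    horizon: float = 0.0  # time at which the channel becomes free again

def schedule_burst(channels: List[Channel], arrival: float, duration: float) -> Channel:
    """Assign an incoming burst to a channel with a single min/max scan.

    Among channels already free at the arrival time, pick the one with the
    latest horizon (max) to minimise the idle void before the burst; if none
    is free, fall back to the channel that frees up first (min horizon).
    In a real OBS/OPS node the burst might instead be dropped; this sketch
    simply delays it.
    """
    free = [c for c in channels if c.horizon <= arrival]
    if free:
        best = max(free, key=lambda c: c.horizon)      # smallest void before the burst
        best.horizon = arrival + duration
    else:
        best = min(channels, key=lambda c: c.horizon)  # earliest available channel
        best.horizon += duration                       # burst waits until the channel frees
    return best

# Toy usage: three wavelength channels, four bursts arriving over time.
channels = [Channel(i) for i in range(3)]
for arrival, duration in [(0.0, 2.0), (0.5, 1.0), (1.0, 3.0), (1.2, 0.5)]:
    ch = schedule_burst(channels, arrival, duration)
    print(f"burst@{arrival:.1f} -> channel {ch.ident}, horizon now {ch.horizon:.1f}")

Whatever the per-channel state, each decision is a single scan over the channel set, which is why its complexity does not grow with the traffic load.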

Relevance: 30.00%

Abstract:

The construction and use of multimedia corpora has been advocated for a while in the literature as one of the expected future application fields of Corpus Linguistics. This research project represents a pioneering experience aimed at applying a data-driven methodology to the study of the field of AVT, similarly to what has been done in the last few decades in the macro-field of Translation Studies. This research was based on the experience of Forlixt 1, the Forlì Corpus of Screen Translation, developed at the University of Bologna's Department of Interdisciplinary Studies in Translation, Languages and Culture. As a matter of fact, in order to quantify strategies of linguistic transfer of an AV product, we need to take into consideration not only the linguistic aspect of such a product but all the meaning-making resources deployed in the filmic text. Given that one major benefit of Forlixt 1 is the combination of audiovisual and textual data, this corpus allows the user to access primary data for scientific investigation and no longer rely on pre-processed material such as traditional annotated transcriptions. Based on this rationale, the first chapter of the thesis sets out to illustrate the state of the art of research in the disciplinary fields involved. The primary objective was to underline the main repercussions on multimedia texts resulting from the interaction of a double support, audio and video, and, accordingly, on the procedures, means and methods adopted in their translation. By drawing on previous research in semiotics and film studies, the relevant codes at work in the visual and acoustic channels were outlined. Subsequently, we concentrated on the analysis of the verbal component and on the peculiar characteristics of filmic orality as opposed to spontaneous dialogic production. In the second part, an overview of the main AVT modalities was presented (dubbing, voice-over, interlinguistic and intralinguistic subtitling, audio description, etc.) in order to define the different technologies, processes and professional qualifications that this umbrella term presently includes. The second chapter focuses diachronically on the contribution of various theories to the application of Corpus Linguistics' methods and tools to the field of Translation Studies (i.e. Descriptive Translation Studies, Polysystem Theory). In particular, we discussed how the use of corpora can help reduce the gap existing between qualitative and quantitative approaches. Subsequently, we reviewed the tools traditionally employed by Corpus Linguistics for the construction of traditional "written language" corpora, to assess whether and how they can be adapted to meet the needs of multimedia corpora. In particular, we reviewed existing speech and spoken corpora, as well as multimedia corpora specifically designed to investigate translation. The third chapter reviews Forlixt 1's main development steps, from a technical (IT design principles, data query functions) and methodological point of view, laying down extensive scientific foundations for the annotation methods adopted, which presently encompass categories of a pragmatic, sociolinguistic, linguacultural and semiotic nature. Finally, we described the main query tools (free search, guided search, advanced search and combined search) and the main intended uses of the database from a pedagogical perspective.
The fourth chapter lists the specific compilation criteria adopted, as well as statistics for the two sub-corpora, presenting data broken down by language pair (French-Italian and German-Italian) and genre (cinema comedies, television soap operas and crime series). Next, we concentrated on the discussion of the results obtained from the analysis of summary tables reporting the frequency of the categories applied to the French-Italian sub-corpus. The detailed observation of the distribution of categories identified in the original and dubbed corpus allowed us to empirically confirm some of the theories put forward in the literature, notably concerning the nature of the filmic text, the dubbing process and the features of Italian dubbed language. This was possible by looking into some of the most problematic aspects, such as the rendering of sociolinguistic variation. The corpus equally allowed us to consider so far neglected aspects, such as pragmatic, prosodic, kinetic, facial and semiotic elements, and their combination. At the end of this first exploration, some specific observations concerning possible macro-translation trends were made for each type of sub-genre considered (cinematic and TV genres). On the grounds of this first quantitative investigation, the fifth chapter further examines the data by applying ad hoc models of analysis. Given the virtually infinite number of combinations of the categories adopted, and of the latter with searchable textual units, three qualitative and quantitative methods were designed, each concentrating on a particular translation dimension of the filmic text. The first was the cultural dimension, which specifically focused on the rendering of selected cultural references and on the investigation of recurrent translation choices and strategies, justified on the basis of the occurrence of specific clusters of categories. The second analysis was conducted on the linguistic dimension by exploring the occurrence of phrasal verbs in the Italian dubbed corpus and by ascertaining the influence of possible semiotic traits, such as gestures and facial expressions, on the adoption of related translation strategies. Finally, the main aim of the third study was to verify whether, under which circumstances, and through which modality, graphic and iconic elements were translated into Italian from an original corpus of both German and French films. After having reviewed the main translation techniques at work, an exhaustive account of possible causes for their non-translation was equally provided. By way of conclusion, the discussion of the results obtained from the distribution of annotation categories on the French-Italian corpus, as well as the application of specific models of analysis, allowed us to underline possible advantages and drawbacks related to the adoption of a corpus-based approach to AVT studies. Even though possible updates and improvements were proposed in order to help solve some of the problems identified, it is argued that the added value of Forlixt 1 ultimately lies in having created a valuable instrument, allowing researchers to carry out empirically sound contrastive studies that may be usefully replicated on different language pairs and several types of multimedia texts. Furthermore, multimedia corpora can also play a crucial role in L2 and translation teaching, two disciplines in which their use still lacks systematic investigation.
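To make the kind of query such a database supports more concrete, the following Python sketch shows a hypothetical "combined search" over annotated, aligned segments. The data model (Segment, its categories and language-pair fields) is invented purely for illustration and does not reflect Forlixt 1's actual schema or query interface.

from dataclasses import dataclass, field
from typing import List, Set

@dataclass
class Segment:
    film: str
    lang_pair: str                     # e.g. "FR-IT" or "DE-IT"
    original: str                      # transcription of the source dialogue
    dubbed: str                        # transcription of the dubbed dialogue
    categories: Set[str] = field(default_factory=set)  # pragmatic, semiotic, ... tags

def combined_search(corpus: List[Segment], lang_pair: str,
                    required: Set[str], text_query: str = "") -> List[Segment]:
    """Return segments of a given language pair that carry all required
    annotation categories and, optionally, match a free-text query."""
    return [s for s in corpus
            if s.lang_pair == lang_pair
            and required <= s.categories
            and text_query.lower() in s.original.lower()]

# Toy corpus standing in for annotated audiovisual material.
corpus = [
    Segment("ComedyA", "FR-IT", "C'est pas vrai !", "Ma dai!", {"cultural_reference", "orality"}),
    Segment("CrimeB", "DE-IT", "Keine Bewegung!", "Fermi tutti!", {"orality"}),
]
hits = combined_search(corpus, "FR-IT", {"cultural_reference"})
print([s.dubbed for s in hits])   # -> ['Ma dai!']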

Relevance: 30.00%

Abstract:

Several countries have acquired, over the past decades, large amounts of area-covering Airborne Electromagnetic (AEM) data. The contribution of airborne geophysics to both groundwater resource mapping and management has dramatically increased, proving that these systems are appropriate for large-scale, efficient groundwater surveying. We start with the processing and inversion of two AEM datasets from two different systems collected over the Spiritwood Valley Aquifer area, Manitoba, Canada: the AeroTEM III dataset (commissioned by the Geological Survey of Canada in 2010) and the "Full waveform VTEM" dataset, collected and tested over the same survey area during the fall of 2011. We demonstrate that, in the presence of multiple datasets, both AEM and ground data, careful processing, inversion, post-processing, data integration and data calibration constitute the proper approach for providing reliable and consistent resistivity models. Our approach can be of interest to many end users, ranging from Geological Surveys and Universities to Private Companies, which often own large geophysical databases to be interpreted for geological and/or hydrogeological purposes. In this study we investigate in depth the role of the integration of several complementary types of geophysical data collected over the same survey area. We show that data integration can improve inversions, reduce ambiguity and deliver high-resolution results. We further attempt to use the final, most reliable output resistivity models as a solid basis for building a knowledge-driven 3D geological voxel-based model. A voxel approach allows a quantitative understanding of the hydrogeological setting of the area, and it can be further used to estimate aquifer volumes (i.e. the potential amount of groundwater resources) as well as for hydrogeological flow model prediction. In addition, we investigated the impact of the AEM datasets on hydrogeological mapping and 3D hydrogeological modeling, comparing it with having only a ground-based TEM dataset and/or only borehole data.
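As a rough illustration of the final step, going from inverted resistivity models to a voxel-based hydrogeological model and volume estimates, the following Python sketch bins a gridded resistivity cube into assumed lithology classes. The thresholds, cell sizes and the random test cube are placeholders for illustration, not values from the Spiritwood Valley study.

import numpy as np

# Hypothetical resistivity cube (ohm*m), e.g. gridded from 1D AEM inversions
# along flight lines; here filled with random values for demonstration.
rng = np.random.default_rng(0)
resistivity = 10 ** rng.uniform(0.5, 2.5, size=(50, 40, 30))   # roughly 3 to 300 ohm*m

# Assumed lithology classes by resistivity range (placeholder thresholds):
#   0 = clay/till (< 20 ohm*m), 1 = silty sand (20-80), 2 = sand/gravel aquifer (>= 80)
voxel = np.digitize(resistivity, bins=[20.0, 80.0])

# Volume estimate for the aquifer class, assuming a 50 m x 50 m x 2 m cell size.
cell_volume = 50.0 * 50.0 * 2.0
aquifer_volume = np.count_nonzero(voxel == 2) * cell_volume
print(f"estimated aquifer volume: {aquifer_volume / 1e9:.2f} km^3")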

Relevance: 30.00%

Abstract:

Automatically recognizing faces captured in uncontrolled environments has been a challenging topic over the past decades. In this work, we investigate cohort score normalization, which has been widely used in biometric verification, as a means to improve the robustness of face recognition under challenging environments. In particular, we introduce cohort score normalization into the undersampled face recognition problem. Further, we develop an effective cohort normalization method specifically for the unconstrained face pair matching problem. Extensive experiments conducted on several well-known face databases demonstrate the effectiveness of cohort normalization in these challenging scenarios. In addition, to give a proper understanding of cohort behavior, we study the impact of the number and quality of cohort samples on the normalization performance. The experimental results show that a bigger cohort set gives more stable and often better results up to a point, after which the performance saturates, and that cohort samples of different quality indeed produce different normalization performance. Recognizing faces that have undergone alterations is another challenging problem for current face recognition algorithms. Face image alterations can be roughly classified into two categories: unintentional (e.g., geometric transformations introduced by the acquisition device) and intentional alterations (e.g., plastic surgery). We study the impact of these alterations on face recognition accuracy. Our results show that state-of-the-art algorithms are able to overcome limited digital alterations but are sensitive to more substantial modifications. Further, we develop two useful descriptors for detecting those alterations which can significantly affect recognition performance. Finally, we propose to use the Structural Similarity (SSIM) quality map to detect and model variations due to plastic surgery. Extensive experiments conducted on a plastic surgery face database demonstrate the potential of the SSIM map for matching face images after surgery.
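For background on the cohort idea, the sketch below shows a generic T-norm style cohort score normalization in Python, in which a raw match score is rescaled by the statistics of the probe's scores against a cohort set. This is a textbook variant given for illustration, not the specific cohort method developed in the thesis; cosine similarity over feature vectors is an assumed comparator.

import numpy as np

def cohort_normalize(raw_score: float, probe_feat: np.ndarray,
                     cohort_feats: np.ndarray) -> float:
    """Normalize a raw match score using the probe's scores against a cohort.

    cohort_feats has shape (n_cohort, d); cosine similarity is used here,
    but any comparator could be plugged in.
    """
    probe = probe_feat / np.linalg.norm(probe_feat)
    cohort = cohort_feats / np.linalg.norm(cohort_feats, axis=1, keepdims=True)
    cohort_scores = cohort @ probe                      # probe vs. each cohort sample
    mu, sigma = cohort_scores.mean(), cohort_scores.std() + 1e-8
    return (raw_score - mu) / sigma                     # scores comparable across probes

# Toy usage with random vectors standing in for face descriptors.
rng = np.random.default_rng(1)
probe, gallery = rng.normal(size=128), rng.normal(size=128)
raw = float(probe @ gallery / (np.linalg.norm(probe) * np.linalg.norm(gallery)))
cohort = rng.normal(size=(200, 128))
print(cohort_normalize(raw, probe, cohort))

Normalizing by the cohort statistics makes scores from different probes comparable, which is what allows a single decision threshold to remain robust across challenging acquisition conditions.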

Relevance: 30.00%

Abstract:

The thesis aims to present the advances achieved in the captive breeding of the European eel (Anguilla anguilla). The aspects investigated concern husbandry approaches (breeder selection, response to hormonal stimulation, reproductive performance, incubation of eggs) and physiological aspects (plasma endocrine profiles of the breeders), as well as engineering aspects. Studies conducted on various populations of wild eel have shown that the main determining factor in the selection of wild females destined for captive breeding must be the Silver Index, which indicates the stage of pubertal development. The hormonal induction protocol adopted, with increasing doses of carp pituitary extract, has proven useful for ovarian development, with a synchronization effect that is positively reflected in egg production. The studies on the effects of photoperiod show that the condition of total darkness can positively influence reproduction in captivity. The effects of photoperiod were also investigated at the physiological level, observing the plasma levels of steroids (E2, T) and thyroid hormones (T3 and T4) and the hepatic expression of vitellogenin (vtg1 and vtg2) and the membrane estradiol receptor (ESR1). The comparison between spontaneous spawning and insemination by stripping shows that the former leads to a better qualitative and quantitative yield in the production of fertilizable eggs; moreover, the presence of a percentage of completely transparent oocytes can be used as an indicator for obtaining eggs with a good fertilization rate. Finally, the design and implementation of a recirculating aquaculture system suited to the species-specific needs of the eel showed that, to improve reproductive results, it is preferable to adopt low-flow and low-density incubation.