769 results for endless rolling
Abstract:
Premise: In the literary works of our anthropological and cultural imagination, the various languages and the different discursive practices are not necessarily quoted, expressly alluded to or declared through clear expressive mechanisms; rather, they constitute a now consolidated substratum, a background, which shines through the thematic and formal elements of each text with irony and intertextuality. The contaminations, hybridizations and promptings that we find in the expressive forms, the rhetorical procedures and the linguistic and thematic choices of post-modern literary texts take shape as fluid and familiar categories. Exchanges and passages are no longer merely permitted but inevitable; the post-modern imagination is made up of an agglomeration of discourses that are no longer really separable, built up from texts that blend and quote one another and that compose, each with its own specificities, the great family of the cultural products of our social scenario. A literary work, therefore, is not only a self-contained phenomenon, delimited hic et nunc by a beginning and an ending, but a fragment of the complex, dense and boundless network formed by the continual interrelations between human forms of communication and symbolization. The research hypothesis: A vision is outlined of comparative literature as a discipline attentive to the social contexts in which texts take shape and circulate, and to the media consistency that literary phenomena inevitably take on. Literature is thus seen as an open system that chooses to be contaminated by other languages and other discursive practices of an imagination that is more polymorphic and irregular than ever. Within this interpretative framework, the aim is to focus the analysis on the relationship that post-modern literature establishes with advertising discourse. On one side, post-modern literature is inserted in the world of communication, loudly asserting the blending and reciprocal contamination of literary modes with media ones, absorbing their languages and signification practices and translating them sometimes into thematic nuclei, motifs and sub-motifs, sometimes into formal expedients and new narrative choices; on the other side, advertising is chosen as a signification practice of the media universe which, since the 1960s, has actively contributed to shaping the dynamics of our socio-cultural scenarios in terms just as important as those of other discursive practices. Advertising has always been a form of communication and symbolization that draws on the collective imagination – myths, actors and values – and turns them into specific narrative programs for its own texts. The aim is therefore to interpret and analyze this relationship both from a strictly thematic perspective – trying to understand what literature speaks about when it speaks about advertising, and seeking advertising quotations in post-modern fiction – and from a formal perspective, searching for parallels and discordances between the rhetorical procedures, the languages and the stylistic choices verifiable in the texts of the two signification practices. The analysis method chosen, for the sake of a constructive multiplication of perspectives, draws on the analytical processes of semiotics, applying its instruments where possible in order to highlight the thematic and formal relationships between literature and advertising.
The corpus: The corpus of literary texts is made up of various novels and, although attention is focused on the post-modern period, there are also unavoidable references to essential authors whose works prompted various reflections: H. de Balzac, Zola, Fitzgerald, Joyce, Calvino, etc. However, the analysis concentrates on three authors, Don DeLillo, Martin Amis and Aldo Nove, and in particular on the following novels: “Americana” (1971) and “Underworld” (1999) by Don DeLillo, “Money” (1984) by Martin Amis, and “Woobinda and other stories without a happy ending” (1996) and “Superwoobinda” (1998) by Aldo Nove. The corpus is restricted to these novels for two fundamental reasons: 1. assuming parameters of spatio-temporal evaluation, the texts are representative of different socio-cultural contexts and collective imaginations (from DeLillo’s masterly glimpses of American life, to Nove’s examples of contemporary Italian life, to Amis’s English imagination) and of different historical moments (the 1970s of DeLillo’s Americana, the 1980s of Amis, the 1990s of Nove, decades often used as criteria for dividing postmodernism into phases); 2. adopting a strictly thematic perspective, as mentioned in the research hypothesis, the variations and constants in the novels (thematic nuclei, topoi, images and narrative developments) frequently speak of advertising, and within the narrative plot they assert its various expressions and realizations: in values, themes, texts, urban settings, etc. In these novels the themes and the signification processes of advertising discourse pervade time, space and the relationships that the narrating character builds around himself. We are looking at “particle-characters” whose endless facets attest to the influence and contamination of advertising over a large part of the narrative developments of the plot: over everyday life, over the processes of acquiring and encoding reality, over ideological and cultural baggage, over relationships and exchanges with the other characters, etc. Often the characters are victims of the implacable consequentiality of the advertising mechanism, which gets the upper hand over the usual processes of communication and overwhelms them, wittingly or unwittingly (for example: disturbing openings in which the protagonist kills his or her parents on the basis of a commercial; former advertisers who live life encoding it through the commercial mechanisms of products; children of advertisers who, instead of playing outside, spent whole nights watching tapes of commercials). The analysis therefore starts from the text and aims to show how far the developments and narrative plots of the novels encode, elaborate and recount the myths, values and narrative programs of advertising discourse, transforming them into novelistic components in their own right. Also starting from the text, a socio-cultural reference context is outlined, a collective imagination that differs now geographically, now historically, and from the comparison between these contexts the aim is to deduce the constants, the similarities and the variations in the relationship between literature and advertising.
Abstract:
The wheel-rail contact analysis plays a fundamental role in the multibody modeling of railway vehicles. A good contact model must provide an accurate description of the global contact phenomena (contact forces and torques, number and position of the contact points) and of the local contact phenomena (position and shape of the contact patch, stresses and displacements). The model also has to assure high numerical efficiency (so that it can be implemented directly online within multibody models) and good compatibility with commercial multibody software (Simpack Rail, Adams Rail). The wheel-rail contact problem has been discussed by several authors and many models can be found in the literature. The contact models can be subdivided into two different categories: global models and local (or differential) models. Currently, as regards the global models, the main approaches to the problem are the so-called rigid contact formulation and the semi-elastic contact description. The rigid approach considers the wheel and the rail as rigid bodies. The contact is imposed by means of constraint equations and the contact points are detected during the dynamic simulation by solving the nonlinear differential-algebraic equations associated with the constrained multibody system. Indentation between the bodies is not permitted and the normal contact forces are calculated through the Lagrange multipliers. Finally, Hertz's and Kalker's theories allow the shape of the contact patch and the tangential forces, respectively, to be evaluated. The semi-elastic approach also considers the wheel and the rail as rigid bodies; however, in this case no kinematic constraints are imposed and indentation between the bodies is permitted. The contact points are detected by means of approximate procedures (based on look-up tables and simplifying hypotheses on the problem geometry). The normal contact forces are calculated as a function of the indentation while, as in the rigid approach, Hertz's and Kalker's theories allow the shape of the contact patch and the tangential forces to be evaluated. Both multibody approaches are computationally very efficient, but their generality and accuracy often turn out to be insufficient because the physical hypotheses behind these theories are too restrictive and, in many circumstances, unverified. In order to obtain a complete description of the contact phenomena, local (or differential) contact models are needed. In other words, the wheel and the rail have to be considered as elastic bodies governed by Navier's equations, and the contact has to be described by suitable analytical contact conditions. The contact between elastic bodies has been widely studied in the literature, both in the general case and in the rolling case, and many procedures based on variational inequalities, FEM techniques and convex optimization have been developed. This kind of approach assures high generality and accuracy, but still requires very large computational costs and memory consumption. Because of this computational load and memory consumption, and referring to the current state of the art, the integration between multibody and differential modeling is almost absent in the literature, especially in the railway field.
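As a rough illustration of how a semi-elastic model maps indentation to a normal force, the sketch below implements the simplest Hertzian law, a sphere pressed on an elastic half-space; the real wheel-rail problem involves an elliptical contact patch (full Hertz solution with both principal curvatures) and Kalker's theory for the tangential forces, and all material and geometric values used here are assumptions.

```python
import math

# Hedged sketch: Hertzian normal force for a sphere on a flat half-space,
# F = (4/3) * E_star * sqrt(R) * delta^(3/2). This only illustrates how a
# semi-elastic contact model maps indentation to normal force; it is not the
# thesis' wheel-rail contact model. Material values are typical for steel
# and are assumptions.

def hertz_sphere_on_plane(delta, R, E1=210e9, nu1=0.3, E2=210e9, nu2=0.3):
    """Normal force [N] for indentation delta [m] of a sphere of radius R [m]."""
    if delta <= 0.0:
        return 0.0                      # separation: no contact force
    E_star = 1.0 / ((1 - nu1**2) / E1 + (1 - nu2**2) / E2)
    return (4.0 / 3.0) * E_star * math.sqrt(R) * delta**1.5

# example: 0.46 m radius, 50 micrometres of indentation
print(f"{hertz_sphere_on_plane(50e-6, 0.46):.0f} N")
```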
However, this integration is very important, because only differential modeling allows an accurate analysis of the contact problem (in terms of contact forces and torques, position and shape of the contact patch, stresses and displacements), while multibody modeling is the standard in the study of railway dynamics. In this thesis some innovative wheel-rail contact models developed during the Ph.D. activity will be described. Concerning the global models, two new models belonging to the semi-elastic approach will be presented; the models satisfy the following specifications: 1) they have to be 3D and to consider all six relative degrees of freedom between wheel and rail; 2) they have to consider generic railway tracks and generic wheel and rail profiles; 3) they have to assure a general and accurate handling of multiple contact without simplifying hypotheses on the problem geometry; in particular, they have to evaluate the number and position of the contact points and, for each point, the contact forces and torques; 4) they have to be implementable directly online within multibody models without look-up tables; 5) they have to assure computation times comparable with those of commercial multibody software (Simpack Rail, Adams Rail) and compatible with RT and HIL applications; 6) they have to be compatible with commercial multibody software (Simpack Rail, Adams Rail). The most innovative aspect of the new global contact models concerns the detection of the contact points. In particular, both models aim to reduce the dimension of the algebraic problem by means of suitable analytical techniques. This kind of reduction yields a high numerical efficiency that makes the online implementation of the new procedure possible and allows performance comparable with that of commercial multibody software, while the analytical approach assures high accuracy and generality. Concerning the local (or differential) contact models, one new model satisfying the following specifications will be presented: 1) the model has to be 3D and to consider all six relative degrees of freedom between wheel and rail; 2) it has to consider generic railway tracks and generic wheel and rail profiles; 3) it has to assure a general and accurate handling of multiple contact without simplifying hypotheses on the problem geometry; in particular, it has to be able to calculate both the global contact variables (contact forces and torques) and the local contact variables (position and shape of the contact patch, stresses and displacements); 4) it has to be implementable directly online within multibody models; 5) it has to assure high numerical efficiency and reduced memory consumption in order to achieve a good integration between multibody and differential modeling (the basis of the local contact models); 6) it has to be compatible with commercial multibody software (Simpack Rail, Adams Rail). In this case, the most innovative aspects of the new local contact model concern the contact modeling (by means of suitable analytical conditions) and the implementation of the numerical algorithms needed to solve the discrete problem arising from the discretization of the original continuum problem.
Moreover, during the development of the local model, achieving a good compromise between accuracy and efficiency turned out to be very important in order to obtain a good integration between multibody and differential modeling. The contact models have then been inserted within a 3D multibody model of a railway vehicle to obtain a complete model of the wagon. The railway vehicle chosen as a benchmark is the Manchester Wagon, whose physical and geometrical characteristics are easily available in the literature. The model of the whole railway vehicle (multibody model and contact model) has been implemented in the Matlab/Simulink environment. The multibody model has been implemented in SimMechanics, a Matlab toolbox specifically designed for multibody dynamics, while, as regards the contact models, C S-functions have been used; this particular Matlab architecture allows the Matlab/Simulink and C/C++ environments to be connected efficiently. The 3D multibody model of the same vehicle (this time equipped with a standard contact model based on the semi-elastic approach) has then also been implemented in Simpack Rail, a commercial multibody software for railway vehicles that is widely tested and validated. Finally, numerical simulations of the vehicle dynamics have been carried out on many different railway tracks with the aim of evaluating the performance of the whole model. The comparison between the results obtained with the Matlab/Simulink model and those obtained with the Simpack Rail model has allowed an accurate and reliable validation of the new contact models. In conclusion to this brief introduction to my Ph.D. thesis, I would like to thank Trenitalia and the Regione Toscana for the support provided during the whole Ph.D. activity. I would also like to thank INTEC GmbH, the company that develops the software Simpack Rail, with which we are currently working to develop innovative toolboxes specifically designed for wheel-rail contact analysis.
Abstract:
In order to improve animal welfare, Council Directive 1999/74/EC (defining minimum standards for the welfare of laying hens) will ban conventional cage systems from 2012 in favour of enriched cages or floor systems. As a consequence, an increased risk of bacterial contamination of the eggshell is expected (EFSA, 2005). Furthermore, egg-associated salmonellosis is an important public health problem throughout the world (Roberts et al., 1994). In this regard, the introduction of efficient measures to reduce eggshell contamination by S. Enteritidis or other bacterial pathogens, and thus to prevent any potential or additional food safety risk for human health, may be envisaged. Hot air pasteurization can be a viable alternative for the decontamination of the eggshell surface. Few studies have been performed on the decontamination power of this technique on table eggs (Hou et al., 1996; James et al., 2002). The aim of this study was to develop innovative techniques to remove surface contamination of shell eggs by hot air under natural or forced convection. Initially, two simplified finite element models describing the thermal interaction between the air and the egg were developed, for natural and forced convection respectively. The numerical models were validated using an egg simulant equipped with a type-K thermocouple (Chromel/Alumel). Once validated, the models allowed the selection of a thermal cycle with an inner temperature always lower than 55°C. Subsequently, a specific apparatus composed of two hot air generators, one cold air generator and a rolling cylinder support was built to physically condition the eggs. The decontamination power of the thermal treatments was evaluated on shell eggs experimentally inoculated with Salmonella Enteritidis, Escherichia coli or Listeria monocytogenes, and on shell eggs containing only the indigenous microflora. The applicability of the treatments was further evaluated by comparing quality traits of treated and untreated eggs immediately after the treatment and after 28 days of storage at 20°C. The results showed that the treatment characterized by two shots of hot air at 350°C for 8 sec, spaced by a cooling interval of 32 (forced convection), reduced the bacterial population by more than 90% (Salmonella Enteritidis and Listeria monocytogenes). No statistically significant differences were obtained comparing treated and untreated eggs for E. coli or for the indigenous microflora. A reduction of 2.6 log was observed in the Salmonella Enteritidis load of eggs immediately after the treatment in an oven at 200°C for 200 minutes (natural convection). Furthermore, no detrimental effects on the quality traits of treated eggs were recorded. These results support hot air techniques as an effective industrial process for the surface decontamination of table eggs.
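As a minimal illustration of the kind of screening the thermal models enable, the sketch below integrates a zero-dimensional lumped-capacitance energy balance of an egg in hot air; the finite element models used in the study resolve the spatial temperature field, whereas every value here (heat transfer coefficient, mass, surface area, specific heat, and the assumption that the cooling interval of 32 is in seconds) is a hypothetical choice for illustration only.

```python
import numpy as np

# Hedged sketch: lumped-capacitance heating of an egg by hot air,
# m*c*dT/dt = h*A*(T_air - T), integrated with explicit Euler steps.
# Not the study's finite element model; all parameters are assumptions.

def egg_temperature(t_end, dt, T0, air_profile, h=60.0, A=70e-4, m=0.060, c=3200.0):
    """air_profile(t) -> air temperature [deg C]; returns times and egg temperatures."""
    n = int(t_end / dt)
    t = np.linspace(0.0, t_end, n + 1)
    T = np.empty(n + 1)
    T[0] = T0
    for k in range(n):
        T[k + 1] = T[k] + dt * h * A * (air_profile(t[k]) - T[k]) / (m * c)
    return t, T

# illustrative cycle: 8 s of air at 350 deg C, then cooling air at 20 deg C
# (the 32-unit cooling interval is assumed to be seconds here), repeated
air = lambda t: 350.0 if (t % 40.0) < 8.0 else 20.0
t, T = egg_temperature(t_end=80.0, dt=0.01, T0=20.0, air_profile=air)
print(f"maximum lumped egg temperature: {T.max():.1f} deg C")
```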
Abstract:
Structural investigations show that the Anatolide belt of western Turkey formed in the Eocene through the emplacement of the Cycladic Blueschist Unit onto the Menderes Nappes along an out-of-sequence thrust under greenschist-facies metamorphic conditions. The Cycladic blueschists of western Turkey contain relics of a prograde Alpine fabric (DA1) that was poikiloblastically overgrown by kyanite and chloritoid under high-pressure metamorphic conditions. This mineral growth stage persisted into the beginning of the subsequent deformation event (DA2), which is characterized by NE-directed shearing and decompression. The following deformation (DA3) was the first event to affect both units, the Cycladic Blueschist Unit and the Menderes Nappes, together. The thrust contact between the Cycladic Blueschist Unit and the Menderes Nappes is a DA3 shear zone: the Cycladic-Menderes Thrust (CMT). Along the CMT thrust plane the Cycladic blueschists were emplaced against different units of the Menderes Nappes. The CMT climbs structurally upward towards the south and can therefore be regarded as an out-of-sequence thrust along a southward-rising ramp. In the Cycladic blueschists, DA3 structures related to the CMT overprint high-pressure metamorphic fabrics. In the Menderes Nappes, the footwall of the CMT, DA3 is documented by regionally distributed fabric elements associated with S-directed shear-sense indicators. DA3 fabrics deformed the nappes internally and form the shear zones that separate the individual nappes from one another. In the Çine Nappe, granitic rocks can be subdivided into orthogneisses and metagranites. The deformation history of these rocks records two events. An early amphibolite-facies event affected only the orthogneisses, in which predominantly NE-SW-oriented lineations and NE-directed shear-sense indicators developed. The younger metagranites were deformed both by isolated DA3 shear zones and within a large-scale DA3 shear zone at the southern margin of the Çine Massif. In the DA3 shear zones the lineations are oriented N-S and the associated shear-sense indicators show S-directed shearing under greenschist-facies conditions. These greenschist-facies shear zones overprint the amphibolite-facies fabrics in the orthogneisses. Magmatic zircons from a metagranite that cuts an orthogneiss with top-NE fabrics yielded a 207Pb/206Pb age of 547.2±1.0 Ma. This suggests that DPA is Proterozoic in age, which is also supported by the fact that Triassic granites in the Çine and Bozdag Nappes show no DPA fabrics. The younger top-S fabrics probably formed at the same time as the oldest fabrics of the Bayindir Nappe. The absence of high-pressure fabrics in the footwall of the CMT implies exhumation of the Cycladic blueschists by more than c. 35 km before they were thrust onto the Menderes Nappes in the Eocene.
The substantial differences in the tectonometamorphic history of the Cycladic blueschists and the Menderes Nappes contradict the model of a laterally continuous orogenic belt in which the Menderes Nappes are regarded as the eastern continuation of the Cycladic blueschists. The analysis of brittle late-Alpine deformation structures and the regional pattern of cooling ages modelled from fission-track dating indicate that the structure of the Eocene nappe stack was strongly modified by Miocene to recent core-complex formation. A large-scale synform in the central part of the Anatolides formed as a consequence of two symmetrically arranged detachment systems that rotated from initially steep to presently shallow orientations in the domain of rolling hinges. The detachment faults bound the Central Menderes Metamorphic Core Complex (CMCC). The pattern of apatite fission-track ages shows that the formation of the CMCC began in the Miocene. By retro-deforming lines of equal cooling age constructed parallel to the foliation, it can be shown that the upwarping of the detachment footwalls led to the formation of the synform. The high topographic relief in the area of the CMCC is a consequence of the detachment faulting, which suggests that the upper mantle was involved in the process.
Abstract:
This thesis deals with the professional characteristics of secondary school teachers, with particular regard to their competence and their education. The topic will be approached starting from the characteristics and transformations that social research has identified concerning Italian teachers, focusing on secondary teacher training. After a brief look at Europe, attention will be directed to Italy, with particular regard to the Postgraduate Schools of Specialisation for Secondary School Teachers (SSIS); hence the need for an analysis that focuses on teaching per se and its concrete practice. For its nature to be fully grasped, teaching must be reconsidered as an independent object of study, a performance in which competence manifests itself and a form of action involving a set of tacit and personal knowledge. A further perspective opens up for analysis, according to which the professional characteristics of teachers are the result of an education in which the whole history of the subject is involved, in its educational, formative, professional and personal aspects. The teaching profession is imbued with implicit meanings which are inaccessible to consciousness but which orient action and affect the interpretation of experience. Through the analysis of three different empirical data sets, collected among teachers in training and teachers qualified at the SSIS, I will try to investigate the actual existence, the nature and the features of such implicit knowledge. It appears necessary to put the claims of process-product approaches back into perspective, to the benefit of a holistic conception of teaching competence. The teacher is, at the same time, “the one who is teaching” and the one who offers a concrete addressee the fruit of an endless work of study, reflection, practice and self-updating. To understand this process means to penetrate ever more deeply into the core of teaching and teaching competence, a competence that in some respects “is” always “that” teacher, with his or her own story, implicit knowledge and representations.
Abstract:
Since the industrial revolution, the ocean has absorbed around one third of the anthropogenic CO2, which has induced a profound alteration of the carbonate system commonly known as ocean acidification. Since preindustrial times, the average pH of ocean surface waters has fallen by 0.1 units, from approximately 8.2 to 8.1, and a further decrease of 0.4 pH units is expected by the end of the century. Despite their microscopic size, marine diatoms are a biogeochemically very important group, responsible for the export of massive amounts of carbon to deep waters and sediments. Knowledge of the potential effects of ocean acidification on phytoplankton growth and on the biological pump is still in its infancy. This study investigates the effect of ocean acidification on the growth of the diatom Skeletonema marinoi and on its aggregation, using a mechanistic approach. The experiment consisted of two treatments (Present and Future), representing different pCO2 conditions, and two sequential experimental phases. During the cell growth phase, a culture of S. marinoi was inoculated into transparent bags and the effect of ocean acidification was studied on various growth parameters, including DOC and TEP production. The aggregation phase consisted of incubating the cultures in rolling tanks, in which the sinking of particles through the water column was simulated and aggregation promoted. Since few studies have investigated the effect of pH on the growth of S. marinoi, and none have used pH ranges compatible with the OA scenarios, there were no baselines. I have shown here that OA does not affect the cell growth of S. marinoi, suggesting that the physiology of this species is robust with respect to the changes in carbonate chemistry expected for the end of the century. Furthermore, according to my results, OA does not affect the aggregation of S. marinoi in a consistent manner, suggesting that this process has a high natural variability but is not influenced by OA in a predictable way. The effect of OA was tested on a variety of factors, including the number of aggregates produced, their size and sinking velocity, and their algal, bacterial and TEP content. Many of these variables showed significant treatment effects, but none of the effects were consistent between the two experiments.
Abstract:
This thesis deals with the development of a function approximator and its use in methods for learning discrete and continuous actions: 1. A general function approximator – Locally Weighted Interpolating Growing Neural Gas (LWIGNG) – is developed on the basis of a Growing Neural Gas (GNG). The topological neighbourhood in the neuron structure is used to interpolate between neighbouring neurons and to compute the approximation by local weighting. The performance of the approach, in particular with regard to changing target functions and changing input distributions, is demonstrated in several experiments. 2. For learning discrete actions, the LWIGNG method is combined with Q-learning into the Q-LWIGNG method. For this purpose the underlying GNG algorithm has to be modified, since the input data in action learning arrive in a particular order. Q-LWIGNG achieves very good results on the pole-balancing and mountain-car problems and good results on the acrobot problem. 3. For learning continuous actions, a REINFORCE algorithm is combined with LWIGNG into the ReinforceGNG method. An actor-critic architecture is used in order to learn from time-delayed rewards. LWIGNG approximates both the state-value function and the policy, which is represented as situation-dependent parameters of a normal distribution. ReinforceGNG is successfully applied to learn movements for a simulated two-wheeled robot that has to intercept a rolling ball under certain conditions.
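As a rough sketch of the kind of update described in point 2, the code below performs a single Q-learning step with a generic linear function approximator; it is not the Q-LWIGNG method itself (where the approximator is the LWIGNG network), and all feature and parameter choices are illustrative assumptions.

```python
import numpy as np

# Hedged sketch: one Q-learning update with a linear approximator
# Q(s, a) = w[a] . phi(s). In the thesis the approximator is LWIGNG;
# here alpha, gamma and the features phi are illustrative assumptions.

def q_learning_step(w, phi_s, a, r, phi_s_next, alpha=0.1, gamma=0.99):
    """Update the weight matrix w (n_actions x n_features) after observing (s, a, r, s')."""
    q_next = np.dot(w, phi_s_next).max()            # greedy value of the next state
    td_error = r + gamma * q_next - np.dot(w[a], phi_s)
    w[a] += alpha * td_error * phi_s                # move Q(s, a) towards the TD target
    return td_error

# tiny usage example with 2 actions and 3 state features
w = np.zeros((2, 3))
td = q_learning_step(w, phi_s=np.array([1.0, 0.0, 0.5]), a=0, r=1.0,
                     phi_s_next=np.array([0.0, 1.0, 0.5]))
print(td, w[0])
```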
Abstract:
Nucleic acid biosensors represent a powerful tool for the detection of clinical and environmental pathogens. For applications such as point-of-care biosensing, it is fundamental to develop sensors that are automatic, inexpensive, portable and require as little specialist skill from the user as possible. With the goal of detecting pathogens present in very small amounts, such as in the screening of pathogens in drinking water, an amplification step must be implemented. Often this type of determination has to be performed with simple, automatic and inexpensive hardware: a chemical (or nanotechnological) isothermal solution would therefore be desirable. My Ph.D. project focused on the study and testing of four isothermal reactions which can be used either to amplify the nucleic acid analyte before the binding event on the sensor surface, or to amplify the signal after the hybridization event with the probe. Recombinase polymerase amplification (RPA) and ligation-mediated rolling circle amplification (L-RCA) were investigated as methods for DNA and RNA amplification. Hybridization chain reaction (HCR) and terminal deoxynucleotidyl transferase-mediated amplification were investigated as strategies to enhance the signal after the surface hybridization event between target and probe. In conclusion, it can be said that only a small subset of the biochemical strategies proven to work in solution for the amplification of nucleic acids actually works in the context of amplifying the signal of a pathogen detection system. Amongst those tested during my Ph.D. activity, recombinase polymerase amplification seems the best candidate for useful implementation in diagnostic or environmental applications.
Abstract:
The environmental and socio-economic problems linked to the construction of new road infrastructure call for the design and construction of roads that combine high performance standards with a reduced environmental impact during construction and maintenance. This supports the growing use of bituminous materials modified with polymers and additivated with waxes. The former give the mixture greater elasto-plasticity, increasing its durability and fatigue resistance. In the latter, the paraffinic material helps to reduce the viscosity of the bitumen, which allows a considerable lowering of the production and laying temperature of the mixture. Numerous studies have also shown that the mechanical characteristics of the pavement are strongly influenced by the degree of oxidation of the organic components of the bitumen, i.e. by the ageing phenomenon. It is therefore essential to complement the rheological study of the bitumen with short- and long-term ageing simulation tests. In this research, modified and additivated binders will therefore be analysed according to the theory of viscoelasticity, simulating the real loading and ageing conditions to which the bitumen is subjected. All the advanced rheological characterization tests will use the DSR (Dynamic Shear Rheometer) in various test configurations, and short-term ageing will be simulated by means of the RTFOT (Rolling Thin Film Oven Test). A new ageing procedure will also be proposed, ageing the bitumen at the equiviscosity temperature, or Twork, i.e. the temperature at which, during laying, a homogeneous molecular distribution of the modifier within the bitumen is obtained. Further rheological tests will then be carried out on binders aged at this temperature. Finally, the research results will be supported by chemical tests using the FTIR (Fourier Transform Infrared Spectroscopy) analytical technique, analysing the molecular changes that occur in the bitumen following the addition of the modifier and ageing.
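As a small illustration of the viscoelastic quantities a DSR test yields, the sketch below extracts the complex modulus magnitude |G*| and the phase angle from one cycle of sinusoidal strain and stress; this is textbook linear viscoelasticity rather than the specific test protocol of the thesis, and the synthetic signals and values are assumptions.

```python
import numpy as np

# Hedged sketch: |G*| = tau0 / gamma0 and phase angle delta from oscillatory
# shear data, obtained by least-squares fitting each signal to
# A*sin(omega*t) + B*cos(omega*t). Illustrative only; not the thesis' protocol.

def complex_modulus(time, strain, stress, omega):
    """Return (|G*| [Pa], delta [rad]) from one or more cycles of DSR data."""
    basis = np.column_stack([np.sin(omega * time), np.cos(omega * time)])
    (a_e, b_e), *_ = np.linalg.lstsq(basis, strain, rcond=None)
    (a_s, b_s), *_ = np.linalg.lstsq(basis, stress, rcond=None)
    gamma0, tau0 = np.hypot(a_e, b_e), np.hypot(a_s, b_s)
    delta = np.arctan2(b_s, a_s) - np.arctan2(b_e, a_e)   # stress leads strain by delta
    return tau0 / gamma0, delta

# synthetic check: |G*| = 5 kPa, delta = 60 degrees at 10 rad/s
omega = 10.0
t = np.linspace(0.0, 2 * np.pi / omega, 500)
strain = 0.01 * np.sin(omega * t)
stress = 50.0 * np.sin(omega * t + np.radians(60.0))
G_mag, d = complex_modulus(t, strain, stress, omega)
print(f"|G*| = {G_mag:.0f} Pa, delta = {np.degrees(d):.1f} deg")
```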
Abstract:
The analysis and modelling of low-temperature thermochronological data from two Alpine regions, the Simplon and the Brenner, are presented. The extensional faults in these areas bound deep crustal sectors belonging to the Penninic domain: the Lepontine metamorphic dome at the Simplon and the Tauern window at the Brenner. The data used are apatite fission-track (FT) and (U-Th)/He ages. For the Simplon the data come from the literature; for the Brenner new samples were collected, both at the surface and underground. The ongoing works for the Brenner Base Tunnel (BBT) made it possible, for the first time, to collect underground apatite FT and (U-Th)/He data for the western Tauern window. The analyses were carried out with a finite element code, Pecube, which solves the heat diffusion equation for a topography evolving in time. The code was modified to take the underground data into account. The data inversion was performed using the Neighbourhood Algorithm (NA) in order to obtain the most plausible morphotectonic evolution scenario. The results obtained for the Simplon show: a hypothesized evolution of the tectonic style of the Simplon fault from rolling hinge to low-angle detachment at 6.5 Ma and the cessation of activity at 3 Ma; relief construction up to 5.5 Ma and dismantling from 5.5 Ma to the present, coinciding with the Messinian climatic changes and with the onset of the major glaciations; an increase in exhumation from 0–0.6 mm/yr to 0.6–1.2 mm/yr at 2.4 Ma in the northern hemisphere. The results for the Brenner show: greater tectonic activity of the Brenner fault (1.3 mm/yr) and greater exhumation (1–2 mm/yr) before 10 Ma; a collapse of the activity of the Brenner fault between 10 Ma and the present (0.1 mm/yr) and of the exhumation in the same period (0.1–0.3 mm/yr); no increase in exhumation rate and no topographic changes in the last 5 Ma.
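As a much-simplified illustration of the thermal problem behind such modelling, the sketch below advances a one-dimensional crustal geotherm under conduction and a constant exhumation rate using an explicit finite-difference scheme; Pecube itself is a three-dimensional finite element code with evolving topography, and all parameter values here are assumptions.

```python
import numpy as np

# Hedged sketch: 1D conduction-advection of a geotherm under constant exhumation,
# dT/dt = kappa * d2T/dz2 + v * dT/dz  (z positive downwards, v = exhumation rate).
# Illustrative only; not Pecube. Parameter values are assumptions.

def geotherm_with_exhumation(depth_km=30.0, nz=301, kappa=25.0, v_mm_yr=1.0,
                             T_surf=0.0, T_base=600.0, t_total_myr=10.0):
    """kappa: thermal diffusivity [km^2/Myr]; v_mm_yr: exhumation rate (1 mm/yr = 1 km/Myr)."""
    z = np.linspace(0.0, depth_km, nz)                  # depth grid [km]
    dz = z[1] - z[0]
    T = T_surf + (T_base - T_surf) * z / depth_km       # initial linear geotherm [deg C]
    dt = 0.4 * dz**2 / kappa                            # explicit stability limit
    for _ in range(int(t_total_myr / dt)):
        dTdz = (T[2:] - T[:-2]) / (2.0 * dz)            # central first derivative
        d2Tdz2 = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dz**2
        T[1:-1] += dt * (kappa * d2Tdz2 + v_mm_yr * dTdz)  # exhumation advects hot rock upward
        T[0], T[-1] = T_surf, T_base                    # fixed boundary temperatures
    return z, T

z, T = geotherm_with_exhumation()
print(f"temperature at 5 km depth after 10 Myr: {np.interp(5.0, z, T):.0f} deg C")
```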
Abstract:
This dissertation deals with the study of innovative design methods and manufacturing processes for the medium-to-large-scale industrialization of structural components made of composite material. Research interest in this field stems from the considerable advantages that materials with a high ratio of mechanical performance to weight offer, both in the pursuit of high performance in sports applications and in the reduction of fuel consumption and polluting emissions in mass-produced transport vehicles. The study of composite components is characterized by the fact that the design of the geometry of the part cannot be separated from that of the material and of the process; in this sense, the designer must synergistically combine competences in all three areas. The aim of this work is to propose a methodology for the design and production of structural components that allows the fibrous nature of the composite material to be exploited optimally, both in terms of load transfer between different components and in terms of the lay-up process, which takes place by automated tape winding. The study aims to show to what extent this technology can potentially overcome the shape constraints and the limits of mechanical efficiency of the joints between parts, and can guarantee higher productivity and lower costs than the production methods that currently represent the state of the art in medium-to-large-scale industrialization. Particular attention will be paid to the use of the technology under study for the production of automotive chassis.
Synthetic glycopeptides with sulfatyl Lewis X structure as potential inhibitors of cell adhesion
Abstract:
Cell adhesion processes are of great importance for numerous biological processes, such as the immune response, wound healing and embryogenesis, and they play a decisive role in the course of inflammatory processes. Different classes of adhesion molecules are involved in cell adhesion. The first loose "rolling" adhesion of leukocytes at the site of an inflammation is mediated by the selectins. Via the carbohydrate structures sialyl-Lewisx and sialyl-Lewisa, these bind to their specific ligands through a calcium-dependent carbohydrate-protein interaction and thus mediate the first cell contact, before other adhesion molecules (cadherins, integrins) bring about firm adhesion and transmigration through the endothelium. Pathogenic overexpression of the selectins, however, leads to numerous chronic diseases such as rheumatoid arthritis, coronary artery disease or reperfusion syndrome. In addition, selectin-mediated cell contacts are assumed to be involved in the metastasis of carcinoma cells. One approach to treating the above-mentioned diseases is the administration of soluble competitive inhibitors of the selectins. The aim of this work was the modification of the sialyl-Lewisx lead motif to increase its metabolic stability, and its incorporation into the peptide sequence of the binding domain of the endogenous selectin ligand PSGL-1. For this purpose, amino acid building blocks glycosylated with a modified Lewisx structure were prepared (Fig. 1). The use of arabinose and of a sulfate residue instead of fucose and sialic acid was also intended to contribute to the increased metabolic stability of the synthetic ligand. The glycosyl amino acids thus obtained were then to be used in solid-phase peptide synthesis. Because of their high acid lability, the standard procedure (Wang resin, cleavage with TFA) could not be used here; therefore a novel UV-labile linker system was employed, and a protocol for the synthesis and cleavage of peptides on this new system was developed. With this system, the synthesis of the non-glycosylated peptide backbone and of a glycopeptide carrying the sulfated Lewisx motif was achieved. A fourfold sulfated glycopeptide, which was to be prepared using tyrosine building blocks chemically sulfated beforehand, was detected by mass spectrometry.
Abstract:
This thesis deals with the synthesis and the conformational analysis of hybrid foldamers containing the 4-carboxyoxazolidin-2-one unit or related molecules, in which an imide-type function is obtained by coupling the nitrogen of the heterocycle with the carboxylic acid moiety of the next unit. The imide group is characterized by a nitrogen atom connected to an endocyclic and an exocyclic carbonyl, which always tend to adopt the trans conformation. As a consequence of this local conformational constraint, these imide-type oligomers are forced to fold into ordered conformations. The synthetic approach is highly tunable, with endless variations, so that simply by changing the design and the synthesis a wide variety of foldamers with the required properties may be prepared "on demand". A wide variety of unusual secondary structures and interesting supramolecular materials may thus be obtained with hybrid foldamers. The solid-state behaviour of some of these compounds has been analyzed in detail, showing the formation of different kinds of supramolecular materials that may be used for several applications. A winning example is the production of a bolaamphiphilic gelator that may also be doped with small amounts of dansyl-containing compounds, used to show cellular uptake into IGROV-1 cells by confocal laser scanning microscopy. These gels are readily internalized by cells and are biologically inactive, making them very good candidates in the promising field of drug delivery. In the last part of the thesis, particular attention was directed to the search for new scaffolds that behave as constrained amino acid mimetics, showing that tetramic acid derivatives could be good candidates for the synthesis and application of molecules having an ordered secondary structure.
Abstract:
The exposure of agricultural operators to whole-body vibration produces harmful health effects in the short and long term. The vibrations generated on agricultural tractors have a high intensity and a low frequency. The horizontal components, amplified by the elevated position of the driver's seat above the roll axis, are more critical for the damping systems than the vertical components. These characteristics make the design of systems dedicated to reducing the vibration level difficult for this category of agricultural machinery. Despite the installation of various damping systems, the vibration level to which the operator is subjected can exceed, in several operating conditions, the maximum levels imposed by law for the protection of health. The objective of this work is to evaluate the influence of the rigid-body motions of a tractor (pitch, roll and bounce), equipped with front axle suspension, cab suspension and seat suspension, on the vibration level transmitted to the operator. A tractor was therefore instrumented with accelerometers and inclinometers installed on the chassis, cab and seat, and was used in various field working conditions and in road transport. The analysis of the tests shows that during road transport the longitudinal acceleration is predominant, because of the strong influence of pitch. The suspension considerably reduces the rigid pitching motion, whereas the effect of the cab suspension is to increase, in every working condition, the acceleration level transmitted from the chassis of the machine.
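As a minimal illustration of the basic quantity used in such assessments, the sketch below computes the per-axis r.m.s. acceleration of a recorded signal; standard whole-body vibration evaluations (e.g. the ISO 2631 family) additionally apply frequency weightings and axis multiplying factors, which are omitted here, and the example signal is synthetic.

```python
import numpy as np

# Hedged sketch: r.m.s. acceleration of one axis, the basic building block of
# whole-body vibration metrics. Frequency weighting and axis factors are omitted;
# the signal below is a synthetic placeholder, not measured data.

def rms(a):
    """Root-mean-square of an acceleration time series [m/s^2]."""
    a = np.asarray(a, dtype=float)
    return np.sqrt(np.mean(a**2))

# example: a 2 Hz, 1 m/s^2-amplitude longitudinal signal sampled at 100 Hz
t = np.arange(0.0, 10.0, 0.01)
a_x = 1.0 * np.sin(2 * np.pi * 2.0 * t)
print(f"longitudinal r.m.s. acceleration: {rms(a_x):.3f} m/s^2")   # ~0.707
```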
Abstract:
This study is based on the search for road materials that combine high performance standards with a reduced environmental impact during construction and maintenance. In particular, this work deals with the study of 7 binders modified with polymers and additivated with waxes. The former give the mixture greater elasto-plasticity, increasing its durability and fatigue resistance. In the latter, the paraffinic material helps to reduce the viscosity of the bitumen, allowing a considerable lowering of the production and laying temperature of the mixture. Numerous studies have shown that the mechanical characteristics of the pavement are strongly influenced by the degree of oxidation of the organic components of the bitumen, i.e. by the ageing phenomenon. The rheological study of the bitumen was therefore complemented by short- and long-term ageing simulation tests. In the experimental research, the modified and additivated binders were analysed according to the theory of viscoelasticity, simulating the real loading and ageing conditions to which the bitumen is subjected. All the advanced rheological characterization tests were carried out with the DSR (Dynamic Shear Rheometer, UNI EN 14770) in various test configurations, and short-term ageing was simulated by means of the RTFOT (Rolling Thin Film Oven Test, UNI EN 12607-1). A new ageing procedure was also proposed, ageing the bitumen at the temperature Twork, i.e. the temperature at which, during laying, a homogeneous molecular distribution of the modifier within the bitumen is obtained.