806 results for Hand shape


Relevance:

20.00%

Abstract:

The present study is part of the EU Integrated Project "GEHA – Genetics of Healthy Aging" (Franceschi C et al., Ann N Y Acad Sci. 1100: 21-45, 2007), whose aim is to identify genes involved in healthy aging and longevity, which allow individuals to survive to advanced age in good cognitive and physical function and in the absence of major age-related diseases.
Aims
The major aims of this thesis were the following:
1. to outline the recruitment procedure for 90+ Italian siblings performed by the recruiting units of the University of Bologna (UNIBO) and Rome (ISS). The procedures related to the following items necessary to perform the study were described and commented on: identification of the area eligible for recruitment, demographic aspects related to the need to obtain census lists of 90+ siblings, mail and phone contact with 90+ subjects and their families, bioethical aspects of the whole procedure, standardization of the recruitment methodology, and set-up of a detailed flow chart to be followed by the European recruitment centres (obtaining the informed consent form, anonymization of data using a special code, how to perform the interview, how to collect the blood, how to enter data into the GEHA Phenotypic Database hosted at Odense).
2. to provide an overview of the phenotypic characteristics of 90+ Italian siblings recruited by the recruiting units of the University of Bologna (UNIBO) and Rome (ISS). The following items were addressed: socio-demographic characteristics, health status, cognitive assessment, physical conditions (handgrip strength test, chair-stand test, physical ability including ADL, vision and hearing ability, movement ability and doing light housework), lifestyle information (smoking and drinking habits) and subjective well-being (attitude towards life). Moreover, haematological parameters collected in the 90+ sibpairs as optional parameters by the Bologna and Rome recruiting units were used for a more comprehensive evaluation of the results obtained using the above-mentioned phenotypic characteristics reported in the GEHA questionnaire.
3. to assess the health/functional status of 90+ Italian siblings on the basis of three classification methods proposed in previous studies on centenarians, which are based on:
• actual functional capabilities (ADL, SMMSE, visual and hearing abilities) (Gondo et al., J Gerontol. 61A (3): 305-310, 2006);
• actual functional capabilities and morbidity (ADL, ability to walk, SMMSE, presence of cancer, stroke, renal failure, anaemia and liver diseases) (Franceschi et al., Aging Clin Exp Res, 12: 77-84, 2000);
• retrospectively collected data about past history of morbidity and age of disease onset (hypertension, heart disease, diabetes, stroke, cancer, osteoporosis, neurological diseases, chronic obstructive pulmonary disease and ocular diseases) (Evert et al., J Gerontol A Biol Sci Med Sci. 58A (3): 232-237, 2003).
First, these available models for defining the health status of long-living subjects were applied to the sample; then, since the classifications by Gondo and Franceschi are both based on present functional status, they were compared in order to better recognize the healthy-aging phenotype and to identify the best group of 90+ subjects out of the entire studied population.
4. to investigate the concordance of health and functional status among 90+ siblings in order to divide sibpairs into three categories: the best (both sibs in good shape), the worst (both sibs in bad shape) and an intermediate group (one sib in good shape and the other in bad shape). Moreover, the evaluation aimed to discover which variables are concordant among siblings; concordant variables can be considered familial variables (determined by the environment or by genetics).
5. to perform a survival analysis using mortality data as of 1 January 2009 from the follow-up as the main outcome and selected functional and clinical parameters as explanatory variables.
Methods
A total of 765 90+ Italian subjects recruited by the UNIBO (549 90+ siblings belonging to 258 families) and ISS (216 90+ siblings belonging to 106 families) recruiting units were included in the analysis. Each subject was interviewed according to a standardized questionnaire, comprising extensively utilized questions that had been validated in previous European studies on elderly subjects and covering demographic information, lifestyle, living conditions, cognitive status (SMMSE), mood, health status and anthropometric measurements. Moreover, subjects were asked to perform some physical tests (handgrip strength test and chair-stand test), and a sample of about 24 mL of blood was collected and then processed according to a common protocol for the preparation and storage of DNA aliquots.
Results
The main findings of the analysis are the following:
- a standardized protocol to assess the cognitive status, physical performance and health status of European nonagenarian subjects was set up in accordance with ethical requirements, and it is available as a reference for other studies in this field;
- GEHA families are enriched in long-living members and extreme survival, and represent an appropriate model for the identification of genes involved in healthy aging and longevity;
- two simplified sets of criteria to classify 90+ siblings according to their health status were proposed as operational tools for distinguishing healthy from non-healthy subjects;
- cognitive and functional parameters have a major role in categorizing 90+ siblings by health status;
- parameters such as education and good physical abilities (ability to walk 500 metres, ability to go up and down stairs, high scores on the handgrip and chair-stand tests) are associated with a good health status (defined as "cognitive unimpairment and absence of disability");
- male nonagenarians show a more homogeneous phenotype than females and, though far fewer in number, tend to be healthier than females;
- in males good health status is not protective for survival, confirming the male-female health-survival paradox;
- survival after age 90 depended mainly on intact cognitive status and absence of functional disabilities;
- haemoglobin and creatinine levels are both associated with longevity;
- the most concordant items among 90+ siblings are related to functional status, indicating that they contain a familial component. It remains to be investigated to what extent this familial component is determined by genetics, by the environment, or by the interaction between genetics, environment and chance.
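As an aside for the reader, the sibpair categorization described above reduces to a simple rule once a binary "good shape" criterion is fixed. The sketch below (Python, with illustrative field names that are assumptions, not the actual GEHA variables) shows the logic under the "cognitive unimpairment and absence of disability" definition quoted in the Results:

```python
# Minimal sketch of the sibpair health-concordance categorization described
# above. Field names and the "healthy" criterion are illustrative assumptions,
# not the actual GEHA operational definitions.

def is_healthy(subject):
    """Good shape = cognitively unimpaired and free of disability."""
    return subject["cognitively_unimpaired"] and not subject["disabled"]

def categorize_sibpair(sib1, sib2):
    """Classify a 90+ sibpair as 'best', 'worst' or 'intermediate'."""
    healthy = sum(is_healthy(s) for s in (sib1, sib2))
    return {2: "best", 1: "intermediate", 0: "worst"}[healthy]

# Example: one healthy sib and one disabled sib form an intermediate pair.
a = {"cognitively_unimpaired": True, "disabled": False}
b = {"cognitively_unimpaired": True, "disabled": True}
print(categorize_sibpair(a, b))  # -> "intermediate"
```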
Conclusions
In conclusion, we can state that this study, in accordance with the main objectives of the whole GEHA project, represents one of the first attempts to identify the biological and non-biological determinants of successful/unsuccessful aging and longevity. Here the analysis was performed on 90+ siblings recruited in Northern and Central Italy, and it can be used as a reference for other studies in this field on the Italian population. Moreover, it contributed to the definition of "successful" and "unsuccessful" aging, and categorizing a very large cohort of our most elderly subjects into "successful" and "unsuccessful" groups provided an unrivalled opportunity to detect some of the basic genetic/molecular mechanisms which underpin good health as opposed to chronic disability. Discoveries concerning the biological determinants of healthy aging offer a real possibility of identifying new markers for singling out subgroups of older European citizens at higher risk of developing age-related diseases and disabilities, and of directing major preventive-medicine strategies against the new epidemic of chronic disease in the 21st century.

Relevance:

20.00%

Abstract:

Premise: In the literary works of our anthropological and cultural imagination, the various languages and the different discursive practices are not necessarily quoted, expressly alluded to or declared through clear expressive mechanisms; rather, they constitute a substratum, a background, by now consolidated, which shines through, with irony and intertextuality, in the thematic and formal elements of each text. The various contaminations, hybridizations and promptings that we find in the expressive forms, the rhetorical procedures and the linguistic and thematic choices of post-modern literary texts take shape as fluid and familiar categories. Exchanges and passages are no longer merely allowed but inevitable; the post-modern imagination is made up of an agglomeration of discourses that are no longer really separable, built up from texts that blend and quote one another, composing, each with its own specificities, the great family of the cultural products of our social scenario. A literary work, therefore, is not only a whole phenomenon, delimited hic et nunc by a beginning and an ending, but a fragment of that complex, dense and boundless network given by the continual interrelations between human forms of communication and symbolization. The research hypothesis: A vision is delineated of comparative literature as a discipline attentive to the social contexts in which texts take shape and move, and to the media-like consistency that literary phenomena inevitably take on. Hence literature is seen as an open systematicity that chooses to be contaminated by other languages and other discursive practices of an imagination that is more than ever polymorphic and irregular. Inside this interpretative framework, the aim is to focus the analysis on the relationship that postmodern literature establishes with advertising discourse. On one side, post-modern literature is inserted into the world of communication, loudly asserting the blending and reciprocal contamination of literary modes with media ones, absorbing their languages and signification practices, translating them now into thematic nuclei, motifs and sub-motifs, and now into formal expedients and new narrative choices; on the other side, advertising is chosen as a signification practice of the media universe which, since the 1960s, has actively contributed to shaping the dynamics of our socio-cultural scenarios, in terms just as important as those of other discursive practices. Advertising has always been a form of communication and symbolization that draws on the collective imagination – myths, actors and values – turning them into specific narrative programs for its own texts. Hence the aim is to interpret and analyze this relationship both from a strictly thematic perspective – trying to understand what literature speaks about when it speaks about advertising, and seeking advertising quotations in post-modern fiction – and from a formal perspective, searching for parallels and discordances between the rhetorical procedures, the languages and the stylistic choices verifiable in the texts of the two different signification practices. The analysis method chosen, for the purpose of a constructive multiplication of perspectives, aims to approach the analytical processes of semiotics, applying its instruments when possible, in order to highlight the thematic and formal relationships between literature and advertising.
The corpus: The corpus of literary texts is made up of various novels and, although attention is focused on the post-modern period, there are also ineludible quotations from essential authors whose works prompted various reflections: H. de Balzac, Zola, Fitzgerald, Joyce, Calvino, etc. However, the analysis focuses the corpus on three authors, Don DeLillo, Martin Amis and Aldo Nove, and in particular on the following novels: "Americana" (1971) and "Underworld" (1999) by Don DeLillo, "Money" (1984) by Martin Amis, and "Woobinda and other stories without a happy ending" (1996) and "Superwoobinda" (1998) by Aldo Nove. The corpus selection is restricted to these novels for two fundamental reasons: 1. assuming parameters of spatio-temporal evaluation, the texts are representative of different socio-cultural contexts and collective imaginations (from the masterly glimpses of American life by DeLillo, to the examples of contemporary Italian life by Nove, down to the English imagination of Amis) and of different historical moments (the 1970s of DeLillo's Americana, the 1980s of Amis, down to the 1990s of Nove, decades often used as criteria for dividing postmodernism into phases); 2. adopting a perspective of strictly thematic analysis, as mentioned in the research hypothesis, the variations and the constants in the novels (thematic nuclei, topoi, images and narrative developments) frequently speak of advertising, and inside the narrative plot they affirm its various expressions and realizations: in values, themes, texts, urban settings, etc. In these novels the themes and processes of signification of advertising discourse pervade time, space and the relationships that the narrating character builds around himself. We are looking at "particle-characters" whose endless facets attest to the influence and contamination of advertising in a large part of the narrative developments of the plot: on everyday life, on the processes of acquisition and encoding of reality, on ideological and cultural baggage, on the relationships and interchanges with the other characters, etc. Often the characters are victims of the implacable consequentiality of the advertising mechanism, since the latter gets the upper hand over the usual processes of communication, which are overwhelmed by it, wittingly or unwittingly (for example: disturbing openings in which the protagonist kills his or her parents on the basis of a commercial; former advertising men who live life codifying it through the commercial mechanisms of products; children of advertisers who, instead of playing outside, spent whole nights watching tapes of commercials). Hence the analysis arises from the text and aims to show how much the developments and narrative plots of the novels encode, elaborate and recount the myths, values and narrative programs of advertising discourse, transforming them into novel components in their own right. Also starting from the text, a socio-cultural reference context is delineated, a collective imagination that differs now geographically, now historically, and from the comparison between them the aim is to deduce the constants, similarities and variations in the relationship between literature and advertising.

Relevance:

20.00%

Abstract:

Summary: The aim of this work is a better understanding of the way in which shape-preferred orientation fabrics develop. On this basis, the usefulness of shape fabrics for geology is evaluated. The methods of investigation are fieldwork and its evaluation, numerical simulations and analogue experiments. Studies of shape fabrics in rocks show that a shape fabric can be used as an indicator of the intensity of deformation only to a limited degree. The presumed reason for this is the influence of the ratio of initial to recrystallized grain size on fabric development, and of the way in which dynamic recrystallization modifies a fabric. To evaluate this observation, various numerical simulations of dynamic recrystallization were carried out. A new deformation apparatus, with which general flow regimes can be modelled, was developed. The rheological properties of materials used for such experiments were investigated and discussed. Results of analogue experiments show that the intensity of a shape fabric correlates positively with a decrease in the kinematic vorticity number and with non-Newtonian, power-law behaviour of the matrix material. Experiments modelling the change in shape of viscous inclusions during progressive deformation show that different viscosity contrasts between matrix and inclusion material result in characteristic shape fabrics.
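For reference, the two quantities invoked in the experimental results above can be written down explicitly. The following is a standard formulation (Truesdell's kinematic vorticity number and a generic power-law flow law), given here as a hedged aside rather than as the thesis's own notation:

```latex
% Truesdell's kinematic vorticity number: ratio of the norms of the spin
% tensor W (rigid rotation rate) and the stretching tensor D (strain rate).
% W_k = 0 for pure shear, W_k = 1 for simple shear.
W_k \;=\; \frac{\lVert \mathbf{W} \rVert}{\lVert \mathbf{D} \rVert}

% Generic power-law (non-Newtonian) flow law for the matrix material,
% with stress exponent n > 1 (n = 1 recovers Newtonian behaviour):
\dot{\varepsilon} \;=\; A\,\sigma^{n}
```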

Relevance:

20.00%

Abstract:

The stabilization of nanoparticles against irreversible aggregation and oxidation reactions is a prerequisite for further advancement in nanoparticle science and technology. For this reason, research on this topic focuses on the synthesis of various metal nanoparticles protected with monolayers containing different reactive head groups and functional tail groups. In this work, cuprous bromide nanocrystals with a diameter of about 20 nanometers have been synthesized according to a new synthetic method: adding ascorbic acid dropwise to a water solution of lithium bromide and cupric chloride under continuous stirring and nitrogen flux. Butanethiolate-protected Cu nanoparticles have been synthesized according to three different synthesis methods. Their morphologies appear related to the physicochemical conditions during the synthesis and to the dispersing medium used to prepare the sample. Synthesis method II yields stable nanoparticles of 1-2 nm in size, both isolated and forming clusters. Nanoparticle cluster formation was enhanced when water was used as the dispersing medium, probably owing to the hydrophobic nature of the butanethiolate layers coating the nanoparticle surface. Synthesis methods I and III lead to large, unstable spherical nanoparticles with sizes ranging from 20 to 50 nm. These nanoparticles appeared in the TEM micrographs with the same morphology independently of the dispersing medium used in the sample preparation. The stability and dimensions of the copper nanoparticles appear inversely related. Using the same methods described above for the butanethiolate-protected copper nanoparticles, 4-methylbenzenethiol-protected copper nanoparticles have been prepared. Diffractometric and spectroscopic data reveal that decomposition processes did not occur in the 4-methylbenzenethiol-protected copper nanoparticles precipitated from either formic acid or water over a period of six months. The anticarcinogenic effects of Se, exerted by multiple mechanisms, have been extensively investigated and documented: Se is regarded as a genuine nutritional cancer-protecting element with a significant protective effect against major forms of cancer. Furthermore, phloroglucinol was found to possess cytoprotective effects against oxidative stress caused by reactive oxygen species (ROS), which are associated with cell and tissue damage and are contributing factors in inflammation, aging, cancer, arteriosclerosis, hypertension and diabetes. The goal of our work has been to set up a new method to synthesize, under mild conditions, amorphous Se nanoparticles surface-capped with phloroglucinol, which is used during synthesis as the reducing agent to obtain stable Se nanoparticles in ethanol, exploiting the synergy between the specific anticarcinogenic properties of Se and the antioxidant ones of phloroglucinol. We have synthesized selenium nanoparticles protected by phenolic molecules chemically bonded to their surface. The phenol molecules coating the nanoparticle surfaces form poorly ordered arrays, as can be seen from the broader shape of the absorptions in the FT-IR spectrum with respect to those of crystalline phenol. On the other hand, metallic nanoparticles with unique optical properties, facile surface chemistry and an appropriate size scale are generating much enthusiasm in nanomedicine. In fact, Au nanoparticles have immense potential for both cancer diagnosis and therapy.
In particular, Au nanoparticles efficiently convert strongly absorbed light into localized heat, which can be exploited for the selective laser photothermal therapy of cancer. Accordingly, composites of metal nanoparticles and HA nanocrystals should have tremendous potential in novel methods for cancer therapy. We have successfully prepared 11-mercaptoundecanoic acid surface-protected Au4Ag1 nanoparticles adsorbed on nanometric apatite crystals as an anticancer nanoparticle delivery system, utilizing biomimetic hydroxyapatite nanocrystals as delivery agents. Furthermore, natural chrysotile, formed by densely packed bundles of multiwalled hollow nanotubes, is a mineral very suitable for nanowire preparation when its inner nanometer-sized cavity is filled with a proper material. Bundles of chrysotile nanotubes can then behave as host systems, where their large interchannel separation is expected to prevent the interaction between individual guest metallic nanoparticles and to act as a confining barrier. Chrysotile nanotubes have been filled with molten metals such as Hg, Pb and Sn, semimetals such as Bi, Te and Se, and semiconductor materials such as InSb, CdSe, GaAs and InP, using both high-pressure techniques and metal-organic chemical vapor deposition. Under hydrothermal conditions chrysotile nanocrystals have been synthesized as a single phase and are very suitable for nanowire preparation by filling their inner nanometer-sized cavity with metallic nanoparticles. In this research work, stoichiometric synthetic chrysotile nanotubes have been synthesized, characterized and partially filled with bi- and monometallic, highly monodisperse nanoparticles with diameters ranging from 1.7 to 5.5 nm depending on the core composition (Au, Au4Ag1, Au1Ag4, Ag). In the case of 4-methylbenzenethiol-protected silver nanoparticles, the filling was carried out by convection and capillarity at room temperature and pressure using a suitable organic solvent. We have obtained new, interesting nanowires consisting of metallic nanoparticles confined in inorganic nanotubes with an inner cavity of 7 nm and an insulating wall with a thickness ranging from 7 to 21 nm.

Relevance:

20.00%

Abstract:

Sudden cardiac death due to ventricular arrhythmia is one of the leading causes of mortality in the world. In recent decades it has been shown that anti-arrhythmic drugs which prolong the refractory period, by means of prolongation of the cardiac action potential duration (APD), play an important role in preventing relevant human arrhythmias. However, it has long been observed that this "class III antiarrhythmic effect" diminishes at faster heart rates, a serious weakness, since fast rates are precisely the situation in which arrhythmias are most prone to occur. It is well known that mathematical modeling is a useful tool for investigating cardiac cell behavior. In the last 60 years a multitude of cardiac models has been created; from the pioneering work of Hodgkin and Huxley (1952), who first described the ionic currents of the squid giant axon quantitatively, mathematical modeling has made great strides. The O'Hara model, which I employed in this research work, is one of the modern computational models of the ventricular myocyte, belonging to a new generation that began in 1991 with the ventricular cell model by Noble et al. The success of these models lies in their ability to generate novel predictions, suggest experiments and provide a quantitative understanding of the underlying mechanisms. Obviously, the drawback is that they remain simplified models and do not represent the real system. The overall goal of this research is to provide an additional tool, through mathematical modeling, for understanding the behavior of the main ionic currents involved during the action potential (AP), especially highlighting the differences between slower and faster heart rates; in particular, to evaluate the role of rate dependence in the action potential duration, to implement a new method for interpreting ionic current behavior after a perturbation, and to verify the validity of the approach proposed by Antonio Zaza using an injected current as the perturbation.
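The abstract stays at a high level, so a minimal sketch may help fix ideas about what such simulations look like in practice. The following is not the O'Hara model (which tracks dozens of ionic state variables) but a forward-Euler integration of the FitzHugh–Nagumo equations, a standard two-variable caricature of an excitable membrane, paced periodically in the same spirit as the rate-dependence protocols described above; all parameter values are textbook ones, not the thesis's:

```python
# Minimal sketch: forward-Euler integration of the FitzHugh-Nagumo equations,
# a two-variable caricature of an excitable membrane. It is NOT the O'Hara
# ventricular model, but it shows the basic structure of such simulations:
# state variables, excitation/recovery dynamics, periodic pacing, time stepping.
a, b, eps = 0.7, 0.8, 0.08          # standard FHN parameters
dt, t_end = 0.05, 400.0             # time step and duration (arb. units)
period, pulse = 100.0, 1.0          # pacing cycle length and pulse duration
v, w, t = -1.2, -0.6, 0.0           # resting state

trace = []
while t < t_end:
    I_stim = 1.0 if (t % period) < pulse else 0.0   # periodic stimulus
    dv = v - v**3 / 3.0 - w + I_stim                # fast (voltage-like) variable
    dw = eps * (v + a - b * w)                      # slow recovery variable
    v, w, t = v + dt * dv, w + dt * dw, t + dt
    trace.append((t, v))

# Each upstroke marks an elicited "action potential"; shortening the pacing
# period probes rate dependence in the same spirit as the APD studies above.
print(max(v for _, v in trace))
```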

Relevance:

20.00%

Abstract:

Persistent Topology is an innovative way of matching topology and geometry, and it proves to be an effective mathematical tool in shape analysis. In order to express its full potential for applications, it has to interface with the typical environment of Computer Science: it must be possible to deal with a finite sampling of the object of interest, and with combinatorial representations of it. Following that idea, the main result claims that it is possible to construct a relation between the persistent Betti numbers (PBNs; also called the rank invariant) of a compact Riemannian submanifold X of R^m and those of an approximation U of X itself, where U is generated by a ball covering centered at the points of the sampling. Moreover, we can state a further result in which, this time, we relate X to a finite simplicial complex S generated, thanks to a particular construction, by the sampling points. To be more precise, strict inequalities hold only in "blind strips", i.e. narrow areas around the discontinuity sets of the PBNs of U (or S). Outside the blind strips, the values of the PBNs of the original object, of its ball covering, and of the simplicial complex coincide, respectively.
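As a concrete illustration of the ball-covering construction (not the thesis's actual machinery, which concerns persistent Betti numbers in all degrees), the 0-th Betti number of a union of balls of radius r around the sample points can be computed with a union-find pass over the proximity graph, since two balls intersect exactly when their centers lie closer than 2r:

```python
# Minimal sketch: 0-th Betti number (number of connected components) of a
# ball covering of a point sample, via union-find. Illustrates the "ball
# covering centered at the sampling points" for dimension 0 only; higher
# PBNs require a real persistent-homology library.
import math

def betti0(points, r):
    parent = list(range(len(points)))

    def find(i):                         # find root with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i, p in enumerate(points):
        for j in range(i + 1, len(points)):
            if math.dist(p, points[j]) < 2 * r:   # the two balls overlap
                parent[find(i)] = find(j)          # merge their components
    return len({find(i) for i in range(len(points))})

# Sample of a circle: small r gives many components, larger r gives one.
pts = [(math.cos(2 * math.pi * k / 12), math.sin(2 * math.pi * k / 12))
       for k in range(12)]
print(betti0(pts, 0.1), betti0(pts, 0.3))  # -> 12 1
```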

Relevance:

20.00%

Abstract:

A comparison of the German and the Swiss broadcasting orders under the aspect of dualism
1. Introduction: meaning and foundations of "dualism"
2. The "dual system" in the German broadcasting order
2.1 The genesis of the "dual system" – historical and legal framework
2.2 The current form of the "dual system"
2.3 The "dual system" in the European context – influences and requirements of European law
3. The "dual system" in the Swiss broadcasting order
3.1 The genesis of the "dual system" – historical and legal framework
3.2 The current form of the "dual system"
3.3 Comparative examination of different forms of the "dual system" in the context of the revision of the RTVG
4. Comparative examination of the "dual systems"
4.1 Historical and statutory framework
4.2 The specific peculiarities of the Swiss broadcasting market
4.3 The individual elements of the broadcasting order
5. Conclusion
In broadcasting law, dual systems mean the coexistence of private and public-service broadcasters. The broadcasting order laid down in the constitution of the Federal Republic of Germany has essentially been shaped by the case law of the Federal Constitutional Court. The dual system that has grown out of these requirements consists of a strong public-service broadcasting sector, whose position is privileged by its primary funding from licence fees. In return, it is assigned the central task of securing basic provision (Grundversorgung). Alongside it stand the private broadcasters, which finance themselves from advertising revenue and user fees and are to that extent exposed to market competition to a greater degree. At the European level, the protection of pluralism and diversity of opinion falls primarily within the competence of the member states. The media landscapes of the member states are characterized by manifold peculiarities and traditions, and it is precisely these that are to be preserved. The design of the dual system within the European framework therefore raises concerns only with regard to the funding of public-service broadcasters from public resources and the resulting distortion of competition. With the Radio and Television Act of 1991, a dual broadcasting system was introduced in Switzerland: the trustee model was supplemented by the market model. However, the dual system for radio and television in Switzerland applied only in the weakened form of state-ordered competition. There was a three-level model that largely avoided direct competition between the national umbrella organization SRG (Schweizerische Rundfunkgesellschaft) and private companies. The main public-service obligation lay with the SRG, which also received the licence fees. In addition, however, all broadcasters were obliged to provide public-service contributions; in return, the legislature provided for fee splitting in regions with weak markets. The new RTVG is intended to guarantee the existence and further development of the public service. Instead of a sharp separation between fee-financed and advertising-financed providers with correspondingly different functions in the media system, however, the electronic media in Switzerland are to be subsidized on a large scale and increasingly steered by performance mandates. An expansion of fee splitting is envisaged particularly at the local level.
In the future, not just one but a multitude of broadcasters is to be entrusted with basic provision. In particular, the service public régional is to be provided by private broadcasters and the SRG; an obligation on all private broadcasters, however, is not envisaged. This master's thesis further works out the differences that individual national broadcasting systems, and with them the broadcasting-policy models, can exhibit despite a shared underlying idea, here dualism. The models must always be seen in the specific political and cultural context out of which they have historically grown. The comparison is intended, on the one hand, to set out the problems that are inherent, in more or less pronounced form, in broadcasting models regardless of their design (definition of basic provision / of the public service; scarcity of resources; crises of the dual system). On the other hand, it is intended to bring out the specific problems of Switzerland arising from its multilingual, small-state structure (the high share of foreign, predominantly German-language programmes in television use; multilingualism; the small scale of viewer and listener markets as well as of advertising markets).

Relevance:

20.00%

Abstract:

Between the 1950s and the 1980s, a new form of labyrinth emerges in European novelists such as Michel Butor, Alain Robbe-Grillet, Italo Calvino, Patrick Modiano and Alasdair Gray: a labyrinth that cannot be grasped or mapped. To account for it, we draw on the model of the rhizome, taken from the philosophy of Gilles Deleuze and Félix Guattari, as well as on Michel Foucault's concept of heterotopia. The spatiality of our novels also leads us to take into account the ironic rewritings of the myth of Theseus, Ariadne, the Minotaur and Daedalus. The quotations from and allusions to the myth make us notice the distance from the traditional model and the effects of what can be considered a "mythical bricolage", within the framework of an ironic, parodic or satirical gaze. The novelistic representation of the labyrinth accentuates on the one hand the absence of a centre, and on the other the extreme openness of that space which is the contemporary city. At the same time, the presence of numerous "other spaces", Foucault's heterotopias, defines the disorientation of the novels' protagonists. As the writers become aware of the "labyrinthine" characteristics of these spaces, those characteristics begin to inform the novelistic work itself, thus creating a metafictional space. Between the fifties and the beginning of the seventies, the French New Novelists accordingly accentuate the idea of being able to play with the instruments of fiction in order to exacerbate the absence of meaning in the city as in the practice of writing. Calvino reformulates this conception of the novel, noting the importance of a meaning, even if it is hidden and hard to grasp. For this reason, at the end of the period we analyse, authors such as Modiano and Gray absorb the writing techniques of these predecessors, making them interact with the ethical responsibility of the author.

Relevance:

20.00%

Abstract:

This thesis is a collection of essays about the instrumental use of commitment decisions to facilitate the completion of the European internal electricity market. European policy can shape markets in many ways, the two most evident being regulation and competition enforcement. The interplay between these two instruments attracts a lot of scholarly attention. One of the major concerns in the competition vs. regulation debate is the instrumental use of competition rules. It has been observed that competition enforcement is triggered not only as a response to anticompetitive harm occurring in the market, but that it sometimes becomes a powerful tool in the European Commission's hands for pursuing regulatory goals. This thesis looks for examples of such instrumentalisation in the context of electricity markets and finds that the Commission is very pragmatic in using all the instruments it has at hand to push forward its project of creating the internal electricity market. This includes regulation, competition enforcement and all sorts of political pressure. To the extent that commitment decisions accelerate sector-specific regulation and overcome political deadlocks, they contribute to the Commission's energy policy goals. However, the instrumentalisation of competition rules comes at a certain cost to competition policy, to energy policy and, most importantly, to the electricity markets themselves. Markets might be negatively affected either indirectly, by the application of sector-specific regulation or competition policy building on previous commitment decisions, or directly, through the implementation of inadequate commitments in individual cases. In conclusion, commitment decisions have generally contributed to achieving the policy objectives of the internal electricity market, but their use for that purpose does not come without cost. Given that this cost is ultimately borne by the internal electricity market, the Commission should take a more balanced approach to the instrumental use of commitment decisions, so that it does not do more harm than good.

Relevance:

20.00%

Abstract:

The DNA double helix is a relatively thick (Ø ≈ 2 nm), compact and therefore, on short length scales, relatively stiff molecule (lp[dsDNA] ≈ 50-60 nm) with a clearly defined structure that can be manipulated very precisely by biological methods. The effect of the primary sequence on three-dimensional structure formation is well understood and exactly predictable. Furthermore, DNA can be linked to other molecules at various positions without disturbing its self-recognition. Because of the helical structure, there is moreover a relationship between the position and the spatial orientation of introduced modifications. Modern synthesis methods make it possible to produce arbitrary oligonucleotide sequences of up to about 150-200 bases relatively cheaply on the milligram scale. These properties make DNA an ideal candidate for creating complex structures formed by the self-recognition of the corresponding sequences. In the work presented here, single-stranded DNA segments (ssDNA) were used as addressable linking sites to join various molecular building blocks into discrete, non-periodic structures. The building blocks were flexible synthetic polymer blocks and semiflexible double-stranded DNA segments (dsDNA) "functionalized" at both ends with different oligonucleotide sequences. The oligonucleotide segments used for linking were chosen (n > 20 bases) so that their hybridization leads to duplex formation that is stable at room temperature. By combining phosphoramidite DNA synthesis with a solid-phase block-coupling reaction, a very efficient synthetic route to ssDNA1-PEO-ssDNA2 triblock copolymers was developed, demonstrated here with polyethylene oxides; it should be readily transferable to other polymers. The lengths and base sequences of the two oligonucleotide segments can be chosen freely and independently of each other. This created the prerequisites for using the self-recognition of oligonucleotides, by combining different triblock copolymers, to create multiblock copolymers that are not accessible with classical synthesis techniques. Semiflexible structural elements can be realized by synthesizing double-strand fragments with long overhanging ends (sticky ends). The classical approaches of molecular genetics to producing sticky ends are not practicable in this case, since they lead to restrictions on the length and sequence of the overhanging ends. Two different variants of the polymerase chain reaction (PCR), based on the use of partially complementary primers, proved to be the methods of choice. The actual primer sequences were linked at the 5' end either via a 2'-deoxyuridine or via a short polyethylene oxide spacer (n = 6) to a freely selectable sticky-end sequence. With these methods both 3' and 5' overhangs are accessible, and the length of the double-strand segments can be adjusted very precisely over a broad molar mass range. By combining such double-strand fragments with the biosynthetic triblock copolymers, structures can be created that serve as model systems for studying biomolecules that have the form of a multiply broken rod. In the last section it was shown that, by a suitable choice of the overhanging ends, or
by hybridization of the double-strand fragments with suitable oligonucleotides, branched DNA structures with arm lengths of several hundred nanometres become accessible. Compared with the methods published so far, this approach offers two decisive advantages: first, the synthetic effort could be reduced to a minimum; second, it is possible in this way to vary the lengths of the individual arms independently of one another over a broad molar mass range.
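A hedged numerical aside: the choice of n > 20 bases for room-temperature-stable duplexes can be made plausible with the Wallace rule of thumb for oligonucleotide melting temperatures (strictly accurate only for short oligos and ignoring salt and sequence context; the sketch below is purely illustrative, not the thesis's design criterion):

```python
# Rule-of-thumb illustration (Wallace rule, reliable roughly up to ~14 nt and
# only approximate beyond that): Tm ≈ 2 °C per A/T base + 4 °C per G/C base.
# It merely illustrates why >20-base linkers, as described above, hybridize
# into duplexes that are stable well above room temperature.
def wallace_tm(seq):
    seq = seq.upper()
    return 2 * (seq.count("A") + seq.count("T")) + \
           4 * (seq.count("G") + seq.count("C"))

print(wallace_tm("ATGCATGC"))               # 8-mer  -> 24 (°C, estimate)
print(wallace_tm("ATGCATGCATGCATGCATGCA"))  # 21-mer -> 62 (°C, estimate)
```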

Relevance:

20.00%

Abstract:

When a liquid crystal is confined to a cavity its director field becomes subject to competing forces: on the one hand, the surface of the cavity orients the director field ("surface anchoring"); on the other hand, deformations of the director field cost elastic energy. Hence the equilibrium director field is determined by a compromise between surface anchoring and elasticity. One example of a confined liquid crystal that has attracted particular interest from physicists is the nematic droplet. In this thesis a system of hard rods is considered as the simplest model for nematic liquid crystals consisting of elongated molecules. First, systems of hard spherocylinders in a spherical geometry are investigated by means of canonical Monte Carlo simulations. In contrast to previous simulation work on this problem, a continuum model is used. In particular, the effects of ordering near hard curved walls are studied for the low-density regime. With increasing density, first a uniaxial surface film forms and then a biaxial surface film, which eventually fills the entire cavity. We study how the surface order, the adsorption and the shape of the director field depend on the curvature of the wall. We find that orientational ordering at a curved wall in a cavity is stronger than at a flat wall, while adsorption is weaker. For densities above the isotropic-nematic transition, we always find bipolar configurations. As a next step, an extension of the Asakura-Oosawa-Vrij model for colloid-polymer mixtures to anisotropic colloids is considered. By means of computer simulations we study how droplets of hard, rod-like particles optimize their shape and structure under the influence of the osmotic compression caused by the presence of spherical particles that act as depletion agents. At sufficiently high osmotic pressures the rods that make up the drops spontaneously align to turn them into uniaxial nematic liquid crystalline droplets. The nematic droplets or "tactoids" that so form are not spherical but elongated, resulting from the competition between the anisotropic surface tension and the elastic deformation of the director field. In agreement with recent theoretical predictions we find that sufficiently small tactoids have a uniform director field, whilst large ones are characterized by a bipolar director field. From the shape and director-field transformation of the droplets we estimate the surface anchoring strength.
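For readers unfamiliar with the method, the canonical Monte Carlo scheme underlying these studies is easy to sketch. The following stripped-down Python version uses hard spheres in a hard spherical cavity as a stand-in for the spherocylinders of the thesis (orientation moves and rod-rod overlap tests are omitted); for hard interactions the Metropolis rule reduces to accepting any move that creates no overlap:

```python
# Stripped-down sketch of canonical (NVT) Monte Carlo for hard particles in
# a spherical cavity. Hard SPHERES stand in for the thesis's spherocylinders;
# all parameter values are illustrative.
import math, random

R_cav, sigma, n, delta = 5.0, 1.0, 40, 0.2   # cavity radius, diameter, N, step

def ok(pos, i, trial):
    if math.dist((0, 0, 0), trial) > R_cav - sigma / 2:   # hard cavity wall
        return False
    return all(math.dist(trial, p) >= sigma               # hard-core overlap
               for j, p in enumerate(pos) if j != i)

# random non-overlapping starting configuration by simple insertion
pos = []
while len(pos) < n:
    t = tuple(random.uniform(-R_cav, R_cav) for _ in range(3))
    if ok(pos, -1, t):
        pos.append(t)

accepted = 0
for _ in range(200):                 # MC sweeps
    for i in range(n):
        t = tuple(c + random.uniform(-delta, delta) for c in pos[i])
        if ok(pos, i, t):            # hard potential: accept iff no overlap
            pos[i] = t
            accepted += 1
print("acceptance:", accepted / (200 * n))
```

A production code for spherocylinders would add an orientation vector per rod, rotation moves, and a segment-segment minimum-distance test in place of the sphere overlap check.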

Relevance:

20.00%

Abstract:

A system suitable for ground-based field measurements was developed and constructed for the digital-holographic imaging of airborne objects. Depending on the depth position, it is suitable for directly determining the size of airborne objects above about 20 µm, and their shape for sizes from about 100 µm up to the millimetre range. The development additionally included an algorithm for automated improvement of hologram quality and for semi-automatic distance determination of large objects. A way of intrinsically increasing the efficiency of determining the depth position by computing angle-averaged profiles was presented. Furthermore, a method was developed that, using an iterative approach for isolated objects, allows recovery of the phase information and thus removal of the twin image. In addition, the effects of various limitations of digital holography, such as the finite pixel size, were examined and discussed with the aid of simulations. The suitable representation of the three-dimensional position information poses a particular problem in digital holography, since the three-dimensional light field is not physically reconstructed. A method was developed and implemented that, by constructing a stereoscopic representation of the numerically reconstructed measurement volume, allows quasi-three-dimensional, magnified viewing. Selected digital holograms recorded during field campaigns on the Jungfraujoch were reconstructed. In part, a very high proportion of irregular crystal shapes emerged, in particular as a result of heavy riming. Objects down to the range ≤ 20 µm were observed even in periods with formally ice-subsaturated conditions. Furthermore, applying the theory of the "phase edge effect" developed here, an object only about 40 µm in size could be identified as an ice platelet. The greatest disadvantage of digital holography compared with conventional photographic imaging methods is the need for elaborate numerical reconstruction: a high computational effort is required to achieve a result comparable to a photograph. On the other hand, digital holography has unique strengths. Access to the three-dimensional position information can serve the local investigation of relative object distances. However, it became apparent that the circumstances of digital holography currently make it difficult to observe sufficiently large numbers of objects on the basis of individual holograms. It was demonstrated that complete object boundaries could be reconstructed even when an object was partly or wholly outside the geometric measurement volume. Furthermore, the sub-pixel reconstruction first demonstrated in simulations was applied to real holograms; it could be shown that quasi-point-like objects could in some cases be localized with sub-pixel accuracy, and that additional information could also be obtained for extended objects. Finally, interference patterns were observed on reconstructed ice crystals and in some cases tracked over time. At present, both internal crystal reflection and the existence of a (quasi-)liquid layer appear possible as explanations, with some arguments favouring the latter possibility.
As a result of this work, a system comprising a new measuring instrument and extensive algorithms is now available. S. M. F. Raupach, H.-J. Vössing, J. Curtius and S. Borrmann: Digital crossed-beam holography for in-situ imaging of atmospheric particles, J. Opt. A: Pure Appl. Opt. 8, 796-806 (2006); S. M. F. Raupach: A cascaded adaptive mask algorithm for twin image removal and its application to digital holograms of ice crystals, Appl. Opt. 48, 287-301 (2009); S. M. F. Raupach: Stereoscopic 3D visualization of particle fields reconstructed from digital inline holograms, Optik - Int. J. Light El. Optics (accepted for publication, 2009).
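To make the remark about reconstruction cost concrete: refocusing a digital hologram to one depth plane with the angular spectrum method costs a forward and an inverse FFT, and scanning a measurement volume requires one such pair per candidate depth. A minimal numpy sketch (illustrative parameters, not the instrument's actual ones) is:

```python
# Sketch of the numerical step that makes digital holography costly, as noted
# above: propagating the recorded hologram to a chosen depth z with the
# angular spectrum method. Parameters below are illustrative only.
import numpy as np

def reconstruct(hologram, wavelength, pixel_size, z):
    """Return the complex field refocused at distance z (angular spectrum)."""
    ny, nx = hologram.shape
    fx = np.fft.fftfreq(nx, d=pixel_size)        # spatial frequencies (1/m)
    fy = np.fft.fftfreq(ny, d=pixel_size)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z)      # free-space transfer function (one z sign
    H[arg < 0] = 0.0             # convention); evanescent waves suppressed
    return np.fft.ifft2(np.fft.fft2(hologram) * H)

# One FFT pair per candidate depth plane is why scanning a whole measurement
# volume is computationally expensive.
holo = np.random.rand(512, 512)                  # placeholder hologram
field = reconstruct(holo, 532e-9, 6.7e-6, 0.05)  # e.g. 532 nm, 6.7 um, 5 cm
print(np.abs(field).max())
```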

Relevance:

20.00%

Abstract:

Early-type galaxies (ETGs) are embedded in hot (10^6-10^7 K), X-ray emitting gaseous haloes, produced mainly by stellar winds and heated by Type Ia supernova explosions, by the thermalization of stellar motions and occasionally by the central super-massive black hole (SMBH). In particular, the thermalization of the stellar motions is due to the interaction between the stellar and SNIa ejecta and the hot interstellar medium (ISM) already residing in the ETG. A number of different astrophysical phenomena determine the X-ray properties of the hot ISM, such as stellar population formation and evolution, galaxy structure and internal kinematics, the presence of an Active Galactic Nucleus (AGN), and environmental effects. With the aid of high-resolution hydrodynamical simulations performed on state-of-the-art galaxy models, in this Thesis we focus on the effects of galaxy shape, stellar kinematics and star formation on the evolution of the X-ray coronae of ETGs. Numerical simulations show that the relative importance of flattening and rotation is a function of galaxy mass: at low galaxy masses, adding flattening and rotation induces a galactic wind, thus lowering the X-ray luminosity; at high galaxy masses, angular momentum conservation keeps the central regions of rotating galaxies at low density, whereas in non-rotating models a denser and brighter atmosphere is formed. The same dependence on galaxy mass is present in the effects of star formation (SF): in light galaxies SF contributes to increasing the spread in Lx, while at high galaxy masses the X-ray properties of the halo are only marginally sensitive to SF effects. In every case, the star formation rate at the present epoch agrees quite well with observations, and the massive, cold gaseous discs are partially or completely consumed by SF on a time-scale of a few Gyr, excluding the presence of young stellar discs at the present epoch.

Relevance:

20.00%

Abstract:

The three chapters into which this work is divided seek to offer a survey of the questions surrounding the existence or otherwise of presidential prerogatives with regard to legislative activity, and more generally of the transformations that have taken place, especially in recent years, in the role of the President of the Republic within the Italian constitutional system. The work opens with a methodological premise that seeks to bring out the question of the different methodological approaches that can be followed in studying presidential topics: namely, keeping the level of norms clearly distinct from that of practice, or analysing and assessing them jointly. The first chapter is devoted to the analysis of the presidential figure as outlined by the Constitution and enriched by more than sixty years of constitutional scholarship. The relevant constitutional provisions are analysed, and an account is given of the main constitutional theories expressed in the legal literature. The second chapter is devoted to what is offered as one of the possible causes of the recent evolution of the presidential figure: the change in the constitutional system of political representation. This element is analysed above all in its institutional setting, that is, the model of the form of government represented by the parliament-government circuit. The third chapter enters directly into the two basic questions of the research: on the one hand, the study of the President's general activity of intervening in political questions through communiqués or public statements; on the other, the analysis of some specific paradigmatic cases of presidential intervention at the stage of issuing or promulgating (or, in any case, discussing) the legislative acts of the government and of parliament, occurring between 2006 and 2013. The work is also enriched by an important attached section of case studies containing the results of research carried out on more than three thousand presidential documents, from 2006 to 2013, made public by the offices of the Quirinale.