916 results for Invisible brackets
Abstract:
INTRODUCTION: Opinion on the 'straight-wire' concept has been evolving since its origin, marked by either faithful followers or absolute skepticism. Currently, it seems reasonable to state that most professionals hold a more realistic and critical viewpoint, an attitude that reveals Orthodontics' maturity and greater knowledge of the technique. The most relevant criticisms concern the inability of both the Straight-Wire and the Standard systems to completely express the characteristics built into the brackets, owing to mechanical deficiencies such as bracket/wire play. OBJECTIVES: A critical analysis of this relationship, which remains unclear for lack of studies, was the scope of this paper. METHODS: The compensatory treatment of two patients, using Capelozza's individualized brackets, serves as the setting for cephalometric evaluation of changes in incisor inclination produced by leveling archwires of different dimensions. RESULTS: The evaluation of these cases showed that, while the introduction of a 0.019 x 0.025-in stainless steel archwire in a 0.022 x 0.030-in slot did not produce significant changes in incisor inclination, the 0.021 x 0.025-in archwire was capable of changing it, mainly in the mandibular incisors, and in the direction opposite to the compensation. CONCLUSION: In compensatory treatments, even with a prescription individualized according to the malocclusion, bracket/wire play seems to be a positive factor for malocclusion correction, avoiding undesirable movements. Therefore, it seems reasonable to admit that, until a bracket system achieves absolute individualization, the use of rectangular wires that retain a certain play within the bracket slot is advisable.
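The bracket/wire play discussed above can be estimated geometrically. A minimal sketch, assuming sharp (unbeveled) wire edges and nominal dimensions; real archwires have rounded edges and manufacturing tolerances that increase the effective play, so these numbers are illustrative lower bounds, not the study's measurements:

```python
import math

def torsional_play_deg(slot_height_in, wire_thickness_in, wire_width_in):
    """Simplified estimate of bracket/wire torsional play ("slop"):
    the angle a rectangular wire can rotate inside a rectangular slot
    before its edges engage. Ignores wire edge bevel and tolerances."""
    clearance = slot_height_in - wire_thickness_in
    return math.degrees(math.atan(clearance / wire_width_in))

# Archwires discussed in the abstract, in a nominal 0.022-in slot:
print(torsional_play_deg(0.022, 0.019, 0.025))  # ~6.8 degrees of play
print(torsional_play_deg(0.022, 0.021, 0.025))  # ~2.3 degrees of play
```

The smaller play of the 0.021 x 0.025-in wire is consistent with its greater ability to express (or override) torque, as observed in the mandibular incisors.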
Abstract:
Information technology (IT) is on the verge of another revolution. Driven by the increasing capabilities and ever declining costs of computing and communications devices, IT is being embedded into a growing range of physical devices linked together through networks and will become ever more pervasive as the component technologies become smaller, faster, and cheaper. [..] These networked systems of embedded computers, referred to as EmNets throughout this report, have the potential to change radically the way people interact with their environment by linking together a range of devices and sensors that will allow information to be collected, shared, and processed in unprecedented ways. [..] The use of EmNets throughout society could well dwarf previous milestones in the information revolution. [..] IT will eventually become "an invisible component of almost everything" in everyone's surroundings. As the costs of electronic components fall and their computational power grows, platforms have proliferated that allow a child as readily as an engineer to develop an idea that cuts across the real and the virtual world: a collision between two worlds that until recently was reserved for professionals. Objects that can acquire or extend functionality, that let us extend our perception of the world and reassess its limits. Objects connected to the 'network of networks' that share and process data for a new use of information. This thesis sets out to explore the application of software agents to the new platforms of embedded systems and the Internet of Things, technologies that are fairly mature yet not deeply explored. Does it make sense to model an embedded system with agents?
Abstract:
This thesis was aimed at verifying the role of the superior colliculus (SC) in human spatial orienting. To do so, subjects performed two experimental tasks that have been shown to involve SC activation in animals, namely a multisensory integration task (Experiments 1 and 2) and a visual target selection task (Experiment 3). To investigate this topic in humans, we took advantage of neurophysiological findings revealing that retinal S-cones do not send projections to the collicular and magnocellular pathway. In Experiment 1, subjects performed a simple reaction-time task in which they were required to respond as quickly as possible to any sensory stimulus (visual, auditory, or bimodal audio-visual). The visual stimulus could be an S-cone stimulus (invisible to the collicular and magnocellular pathway) or a long-wavelength stimulus (visible to the SC). Results showed that with S-cone stimuli the RT distribution was explained simply by probability summation, indicating that the redundant auditory and visual channels are independent. Conversely, with red long-wavelength stimuli, visible to the SC, the RT distribution was related to nonlinear neural summation, which constitutes evidence of integration of different sensory information. We also demonstrated that when audio-visual stimuli were presented at fixation, so that the spatial-orienting component of the task was reduced, neural summation was possible regardless of stimulus color. Together, these findings support a pivotal role of the SC in mediating multisensory spatial integration in humans when behavior involves spatial orienting responses. Since previous studies have shown an anatomical asymmetry of the fibres projecting to the SC from the hemiretinas, Experiment 2 investigated temporo-nasal asymmetry in multisensory integration. To do so, subjects performed monocularly the same task as in Experiment 1. When spatially coincident audio-visual stimuli were visible to the SC (i.e. red stimuli), the redundant-target effect (RTE) depended on a neural coactivation mechanism, suggesting an integration of multisensory information. When stimuli were invisible to the SC (i.e. purple stimuli), the RTE depended only on simple statistical facilitation, in which the two sensory stimuli were processed by independent channels. Finally, we demonstrated that the multisensory integration effect was stronger for stimuli presented to the temporal hemifield than to the nasal hemifield. Taken together, these findings suggest that multisensory stimulation can be differentially effective depending on specific stimulus parameters. Experiment 3 was aimed at verifying the role of the SC in target selection by using a color-oddity search task comprising stimuli either visible or invisible to the collicular and magnocellular pathways. Subjects were required to make a saccade toward a target that could be presented alone or with three distractors of another color (either S-cone or long-wavelength). With S-cone distractors, invisible to the SC, localization errors were similar to those observed in the distractor-free condition. Conversely, with long-wavelength distractors, visible to the SC, saccadic localization error and variability were significantly greater than in either the distractor-free condition or the S-cone distractor condition. Our results clearly indicate that the SC plays a direct role in visual target selection in humans. Overall, they indicate that the SC plays an important role in mediating spatial orienting responses both when covert (Experiments 1 and 2) and when overt orienting (Experiment 3) is required.
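The contrast between probability summation and neural coactivation invoked above is commonly tested with Miller's race-model inequality. A minimal sketch with synthetic, hypothetical RT distributions (the numbers are illustrative, not the study's data):

```python
import random

random.seed(0)

# Hypothetical unimodal and bimodal RT samples (ms); real data would
# come from the detection task described in the abstract.
rt_a = [random.gauss(320, 40) for _ in range(2000)]   # auditory-only
rt_v = [random.gauss(340, 45) for _ in range(2000)]   # visual-only
rt_av = [random.gauss(280, 35) for _ in range(2000)]  # bimodal (faster)

def ecdf(samples, t):
    """Empirical cumulative probability P(RT <= t)."""
    return sum(rt <= t for rt in samples) / len(samples)

# Miller's race-model (probability-summation) bound: if the auditory and
# visual channels are independent, P(AV <= t) <= P(A <= t) + P(V <= t).
# Violations at short t are taken as evidence of neural coactivation.
for t in range(220, 320, 20):
    bound = min(1.0, ecdf(rt_a, t) + ecdf(rt_v, t))
    print(t, round(ecdf(rt_av, t), 3), round(bound, 3),
          "violation" if ecdf(rt_av, t) > bound else "ok")
```

With these synthetic parameters the bimodal distribution exceeds the bound at fast response times, the pattern the abstract reports for SC-visible (red) stimuli.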
Abstract:
This work presents a measurement of the magnetic moment of the electron in hydrogen-like carbon. The result of the measurements on a single stored 12C5+ ion is: g = 2.001 041 596 4 (8)(6)(44). The first error denotes the statistical uncertainty, the second the systematic uncertainty. The last error results from the uncertainty of the ratio of the mass of the 12C5+ ion to that of the electron. The high accuracy of the measurement was achieved by spatially separating the detection of the spin orientation from the induction of the spin flips. The measurement constitutes the most accurate determination of an atomic g-factor to date and confirms the theoretical value of the Gothenburg theory group to 7×10⁻⁹. Together with these calculations it verifies the bound-state QED corrections to better than 1%. The g-factor of the electron bound in 12C5+ is thus, alongside measurements of the Lamb shift in heavy highly charged ions, the most accurate test of bound-state QED.
If the calculation of the g-factor of the bound electron is trusted, the following value for the atomic mass of the electron can be obtained: me = 0.000 548 579 912 8 (15) u.
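The link between the g-factor and the electron mass rests on the frequency ratio measured in the Penning trap. A simplified numerical sketch (binding-energy corrections to the ion mass are neglected, so the last digits are not exact):

```python
# In the trap one measures the ratio of the Larmor frequency of the
# bound electron to the cyclotron frequency of the 12C5+ ion (charge 5e):
#   omega_L / omega_c = (g / 2) * m_ion / (5 * m_e)
# so a measured ratio plus the calculated g yields the electron mass.

g = 2.0010415964           # measured g-factor (from the abstract)
m_e = 0.0005485799128      # electron mass in u (from the abstract)
m_ion = 12.0 - 5 * m_e     # 12C5+ mass in u, binding energies neglected

ratio = (g / 2) * m_ion / (5 * m_e)
print(ratio)  # frequency ratio of order 4.4e3
```

Inverting the same relation, a measured frequency ratio together with the theoretical g gives m_e, which is how the quoted electron mass is extracted.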
Abstract:
Healthcare, Human-Computer Interfaces (HCI), security, and biometry are the most promising application scenarios directly involved in the evolution of Body Area Networks (BANs). Both wearable devices and sensors integrated directly into garments envision a world in which each of us is supervised by an invisible assistant monitoring our health and daily-life activities. New opportunities are enabled by improvements in sensor miniaturization and in the transmission efficiency of wireless protocols, which have brought high computational power aboard independent, energy-autonomous, small-form-factor devices. Application purposes are various: (I) data collection for off-line knowledge discovery; (II) notifying users about their activities or when a danger occurs; (III) biofeedback rehabilitation; (IV) remote alarm activation in case the subject needs assistance; (V) a more natural interaction with the surrounding computerized environment; (VI) user identification by physiological or behavioral characteristics. Telemedicine and mHealth [1] are two of the leading concepts directly related to healthcare. The ability to wear unobtrusive devices supports users' autonomy: the user gains a new sense of freedom, backed not only by psychological reassurance but by a real improvement in safety. Furthermore, the medical community aims at introducing new devices to innovate patient treatment, in particular extending ambulatory analysis to real-life scenarios by providing continuous acquisition. The wide diffusion of emerging portable wellness equipment has extended the usability of wearable devices to fitness and training, monitoring user performance on the task at hand. Learning the correct execution techniques for work, sport, or music can be supported by an electronic trainer furnishing adequate aid.
HCIs made real the concepts of Ubiquitous and Pervasive Computing and Calm Technology introduced in 1988 by Mark Weiser and John Seely Brown. They promote the creation of pervasive environments that enhance the human experience: context-aware, adaptive, and proactive environments serve and help people by becoming sensitive and reactive to their presence, since electronics is ubiquitous and deployed everywhere. In this thesis we address the integration of all the aspects involved in developing a BAN: starting from the choice of sensors, we design the node, configure the radio network, implement real-time data analysis, and provide feedback to the user. We present algorithms to be implemented in a wearable assistant for posture and gait analysis and to provide assistance under different walking conditions, preventing falls. Our aim of contributing to the development of non-proprietary solutions drove us to integrate commercial and standard components into our devices: we use sensors available on the market rather than designing specialized sensors in ASIC technologies, and we employ standard radio protocols and open-source projects wherever possible. The specific contributions of the PhD research activities are presented and discussed in the following.
• We designed and built several wireless sensor nodes providing both sensing and actuation capabilities, focusing on flexibility, small form factor, and low power consumption. The key idea was to develop a simple, general-purpose architecture for rapid analysis, prototyping, and deployment of BAN solutions. Two different sensing units are integrated: kinematic (3D accelerometer and 3D gyroscope) and kinetic (foot-floor contact pressure forces). Two kinds of feedback were implemented: audio and vibrotactile.
• Since the system built is a suitable platform for testing and measuring the features and constraints of a sensor network (radio communication, network protocols, power consumption, and autonomy), we compared Bluetooth and ZigBee performance in terms of throughput and energy efficiency. Field tests evaluated usability in the fall-detection scenario.
• To prove the flexibility of the designed architecture, we implemented a wearable system for human posture rehabilitation. The application was developed together with biomedical engineers, who provided the audio algorithms furnishing biofeedback to the user about his/her stability.
• We explored off-line gait analysis of collected data, developing an algorithm to detect foot inclination in the sagittal plane during walking.
• In collaboration with the Wearable Lab – ETH, Zurich, we developed an algorithm to monitor the user under several walking conditions in which the user carries a load.
The remainder of the thesis is organized as follows. Chapter I gives an overview of Body Area Networks (BANs), illustrating the relevant features of this technology and the key challenges still open; it concludes with a short list of solutions and prototypes proposed by academic research and by manufacturers. The domain of posture and gait analysis, the methodologies, and the technologies used to provide real-time feedback on detected events are illustrated in Chapter II. Chapters III and IV present BANs developed to detect falls and to monitor gait, taking advantage of two inertial measurement units and baropodometric insoles. Chapter V reports an audio-biofeedback system to improve balance based on information about the user's centre of mass. A walking assistant based on a KNN classifier to detect walking alterations under load carriage is described in Chapter VI.
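The thesis mentions a KNN classifier for detecting walking alterations under load carriage without giving its details. A minimal self-contained sketch of the technique, with hypothetical per-stride features and labels (the feature choice and values are illustrative assumptions, not the thesis's data):

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest
    training points (Euclidean distance in feature space)."""
    neighbours = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

# Hypothetical features per stride: (acceleration variance, stride time in s).
# Assumption: loaded walking shows longer strides and damped variance.
train = [
    ((0.80, 0.98), "unloaded"), ((0.85, 1.02), "unloaded"),
    ((0.78, 1.00), "unloaded"), ((0.55, 1.15), "loaded"),
    ((0.50, 1.20), "loaded"),   ((0.58, 1.18), "loaded"),
]

print(knn_predict(train, (0.82, 1.01)))  # classified "unloaded"
print(knn_predict(train, (0.52, 1.17)))  # classified "loaded"
```

In a wearable assistant the training points would come from labeled walking trials, and the query from features computed over a sliding window of inertial-sensor data.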
Abstract:
Considerable evidence shows that the properties of composite materials are closely tied to the production processes and to the types of fiber and resin employed in the material itself, as well as to the defects the material contains. The thesis pays particular attention to the production process with prepreg and autoclave, also addressing the drafting of a ply-book. Some of the most advanced N.D.T. methods are evaluated theoretically and critically, including: P.T. (Penetrant Test), Rx (Radiography Test), UT (Ultrasound Test in Phased Array), and IRT (pulsed InfraRed Thermography). Many components were tested, differing in type of resin and fiber employed, production process, and geometry. Together these components make it possible to understand how the individual parameters influence the visualization and applicability of the N.D.T. techniques cited above. On some specimens the Drop Weight Test was performed according to ASTM D7136 to correlate the induced delamination areas with the sensitivity of each method, thus visualizing the criticality induced by low-impact-energy damage (BVID, Barely Visible Impact Damage) from which composite materials suffer during their service life. The work concludes by showing how only analysis with several methods in parallel yields an adequate Probability Of Detection.
Abstract:
In this work the electron emission from nanoparticles on surfaces was investigated by means of spectroscopic photoelectron microscopy. Specifically, metallic nanoclusters were studied, as self-organized ensembles on silicon or glass substrates, together with a metal-chalcogenide (MoS2) nanotube prototype on silicon. The main part of the investigations focused on the interaction of fs laser radiation with the nanoparticles. The energy of the light quanta was smaller than the work function of the samples studied, so that one-photon photoemission could be excluded. Our investigations showed that, going from a continuous metal film to cluster films, a different emission mechanism appears in competition with multiphoton photoemission and begins to dominate for small clusters. The nature of this new mechanism was investigated by various experiments. The transition from a continuous film to a nanoparticle film is accompanied by an increase of the emission current by more than an order of magnitude. The photoemission intensity grows with decreasing temporal width of the laser pulse, but this dependence becomes less steep with decreasing particle size. The experimental results were explained by different electron emission mechanisms, e.g. multiphoton photoemission (nPPE), thermionic emission and thermally assisted nPPE, as well as optical field emission. The first mechanism dominates for continuous films and particles with sizes above several tens of nanometers, the second and third for films of nanoparticles a few nanometers in size. The microspectroscopic measurements confirmed the 2PPE emission mechanism of thin silver films under "blue" laser excitation (hν = 375-425 nm).
The onset of the Fermi level is relatively sharp and shifts by 2hν when the quantum energy is increased, whereas under "red" laser excitation (hν = 750-850 nm) it is markedly broadened. With increasing laser power, the yield of low-energy electrons was found to increase more weakly than the yield of higher-energy electrons near the Fermi edge within a spectrum, a clear indication of the coexistence of different emission mechanisms within one spectrum. To understand the size dependence of the emission behavior theoretically, a statistical approach to the light absorption of small metal particles was derived and discussed. In additional investigations, the electron emission properties under laser excitation were compared with another kind of excitation: the passage of a tunneling current through a metal cluster film near the percolation threshold. The electrical and emission properties of current-carrying silver cluster films, prepared in a narrow gap (5-25 µm wide) between silver contacts on an insulator, were investigated for the first time with an emission electron microscope (EEM). Electron emission begins in the non-Ohmic region of the conduction current-voltage curve of the cluster film. We studied the behavior of a single emission center in the EEM. The emission centers in a current-carrying silver cluster film turned out to be point sources of electrons that can sustain high emission current densities (more than 100 A/cm2). The width of the energy distribution of the electrons from a single emission center was estimated at about 0.5-0.6 eV. As the emission mechanism, thermionic emission from the steady-state hot electron gas in current-carrying metallic particles is proposed.
Size-selected individual MoS2 nanotubes deposited on Si substrates were investigated with time-of-flight-based two-photon photoemission spectromicroscopy. Under fs laser excitation the nanotube spectra exhibited an astonishingly high emission intensity, clearly higher than that of the SiOx substrate surface. By contrast, the tubes were invisible under VUV excitation at hν = 21.2 eV. An ab-initio calculation for a MoS2 slab explains the high intensity by a high density of free intermediate states in the two-photon transition at hν = 3.1 eV.
Abstract:
The sonic tomograph is an instrument recently applied in the morpho-symptomatic analysis of trees. It exploits the propagation of sound waves through wood to determine the wood's density and possible internal alterations. Besides its large-scale application in a park in Imola, where it was used to carry out a thorough assessment of all the specimens, the instrument was applied for other purposes. In a first analysis it was used to evaluate early stages of alteration and the evolution of internal pathologies over time. Subsequently, the aim was to identify the path of liquid substances injected into the trunk by endotherapic means, by applying sonic tomography above and below the injection point. Finally, a comparison was made between sonic tomography and nuclear magnetic resonance to identify pathologies invisible to the normal instruments used in tree stability analysis.
Abstract:
Based on an exhaustive review of the references to music and sound in the philosophical work of Gilles Deleuze and Félix Guattari, this research centers on the position that John Cage's musical thought occupies in certain Deleuzian texts. The first chapter deals with Cage's creative period between 1939 and 1952, focusing on two main aspects: the micro-macrocosmic structure that characterizes his early works, and the four elements that, at this time, for Cage synthesize musical composition. The latter are considered with reference to the theory of double articulation that Deleuze and Guattari take up from Hjelmslev; both aspects point to the system of strata and stratification set out in A Thousand Plateaus. The second chapter analyzes the music of the central decades of Cage's production in light of the passage in A Thousand Plateaus where Cage is related to the concept of a "fixed sound plane". Particular attention is paid to the way Cage conceives the relation between duration and sound materials, and to the varying degree to which chance and indeterminacy are present. The compositions of the period in question are also viewed with reference to the Deleuzo-Guattarian concept of cartography, and in their implications for musical time. The last fifteen years of Cage's production are considered through the concept of the rhizome understood as a theory of multiplicities. First, the score by Sylvano Bussotti that appears at the beginning of A Thousand Plateaus is examined; then Cage's textual and musical works are considered in terms of his compositional procedures of the mesostic, of the time brackets that combine to form a variable structure, and of the anarchic harmony of the late Cage.
Abstract:
The public memory of the Shoah is inscribed in a proliferating quantity of images and memorial spaces. This is particularly evident in the principal "sites of extermination" that have risen to symbolic status over the years, while many other "places of memory" of the Deportation suffer from an intrinsic weakness. This weakness is due, first, to the fragility of the material evidence, whose remains, now deprived of eloquence, are difficult to interpret and preserve; second, to the superposition of competing memories resulting from the successive reuses these structures often underwent after the war; and finally, to the difficulty of giving full expression to the tragedy of the Deportation. The case of the Fossoli camp is paradigmatic: it questions the capacity of design to "give form" to the palimpsest of memories, making recognition possible and making the significance of the traces explicit, without adding further interpretations. Space and landscape, as languages of identity, can serve as tools from this point of view. Michel de Certeau refers to this when he states that space coincides with "the effect produced by the operations that orient it, situate it, temporalize it, and make it function as a polyvalent unity of conflictual programs or contractual proximities". Space plays a crucial role in shaping the experience of the present and, at the same time, in making visible past experiences compressed in collective memory. The aim of this research is to interrogate the spatial potential of the place, considered in its cultural and semantic dimensions, as a valid alternative to the monument-form in constructing one or more pertinent narratives of memory.
Abstract:
Although the Standard Model of particle physics (SM) provides an extremely successful description of ordinary matter, astronomical observations show that it accounts for only around 5% of the total energy density of the Universe, whereas around 30% is contributed by dark matter. Motivated by anomalies in cosmic-ray observations and by attempts to resolve open questions of the SM, such as the (g-2)_mu discrepancy, proposed U(1) extensions of the SM gauge group have attracted attention in recent years. In these U(1) extensions a new, light messenger particle, the hidden photon, couples to the hidden sector as well as, through kinetic mixing, to the electromagnetic current of the SM. This allows a search for this particle in laboratory experiments probing the electromagnetic interaction. Various experimental programs have been started to search for hidden photons, for instance in electron-scattering experiments, which are a versatile tool to explore various physics phenomena. One approach is the dedicated search in fixed-target experiments at modest energies, as performed at MAMI or at JLAB. In these experiments the scattering of an electron beam off a hadronic target, e+(A,Z)->e+(A,Z)+l^+l^-, is investigated, and a search for a very narrow resonance in the invariant mass distribution of the lepton pair is performed. This requires an accurate understanding of the theoretical basis of the underlying processes. To this end, the first part of this work demonstrates how the hidden photon can be motivated from existing puzzles encountered at the precision frontier of the SM. The main part of this thesis deals with the theoretical framework for electron-scattering fixed-target experiments searching for hidden photons. As a first step, the cross section for bremsstrahlung emission of hidden photons in such experiments is studied.
Based on these results, the applicability of the Weizsäcker-Williams approximation for calculating the signal cross section of the process, which is widely used to design such experimental setups, is investigated. In a next step, the reaction e+(A,Z)->e+(A,Z)+l^+l^- is analyzed as both signal and background process in order to describe existing data obtained by the A1 experiment at MAMI, with the aim of giving accurate predictions of exclusion limits for the hidden-photon parameter space. Finally, the derived methods are used to make predictions for future experiments, e.g. at MESA or at JLAB, allowing a comprehensive study of the discovery potential of these complementary experiments. In the last part, a feasibility study for probing the hidden-photon model with rare kaon decays is performed. For this purpose, invisible as well as visible decays of the hidden photon are considered within different classes of models. This allows one to derive bounds on the parameter space from existing data and to estimate the reach of future experiments.
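The resonance search described above scans the invariant mass distribution of the lepton pair, a quantity computed event by event from the reconstructed four-momenta. A minimal sketch with hypothetical momenta (natural units, GeV; the values are illustrative, not A1 data):

```python
import math

def invariant_mass(p1, p2):
    """Invariant mass of a lepton pair from two four-momenta
    (E, px, py, pz): m^2 = (E1 + E2)^2 - |p1 + p2|^2."""
    E = p1[0] + p2[0]
    px, py, pz = (p1[i] + p2[i] for i in (1, 2, 3))
    return math.sqrt(max(E * E - (px * px + py * py + pz * pz), 0.0))

# Hypothetical e- / e+ four-momenta for one event; a hidden photon would
# appear as a narrow peak in the histogram of this quantity over many events.
p_minus = (0.0600, 0.000, 0.020, 0.0566)
p_plus = (0.0500, 0.010, -0.015, 0.0466)
print(invariant_mass(p_minus, p_plus))  # pair mass in GeV for this event
```

In the actual analysis this histogram is compared against the smooth QED background (the same final state produced by ordinary photon exchange), which is why the signal and background processes must be computed consistently.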
Abstract:
Résumé: Through this thesis we venture into the world of Italo Calvino's Invisible Cities, on a journey that leads us to confront themes such as the inadequacy of language for describing the multiple forms of reality; the importance of the relation between the eye and the mind; and the instruments deployed by Italo Calvino to reach his objective: touching the very essence of reality. To do so, the writer draws on a centuries-old tradition of literary works, including and above all Ovid and his Metamorphoses, which is studied in greater depth here, given that Calvino shares with Ovid the conviction that art, and the illusion it carries, guides us toward a new reality. We end with the study of certain concepts recurrent in literature in general, but above all in this writer's work: themes such as the chessboard, the labyrinth, and the map appear as so many means employed by Calvino to understand our world. All of this through an initiatory journey that allows us not only to better understand the reality in which we live, and consequently ourselves, but also to enter the author's mind, without forgetting that we can only scratch its surface.
Abstract: Through this thesis, we will enter the world of Italo Calvino's Invisible Cities on a journey that will bring us to analyse subjects such as the inadequacy of language when it comes to describing reality and its multiple forms; the importance of the link between the eye and the mind; and the tools used by Italo Calvino to achieve his goal: reaching the very essence of reality. To do so, the writer makes full use of a tradition built on centuries of literary works, including and above all Ovid and his Metamorphoses, which we will study more deeply, as Calvino shares with him the belief that art, and the illusion that derives from it, brings us towards a new reality. Finally, we will analyse some topics that recur not only in literature in general but also in Calvino's work: topics such as the chessboard, the labyrinth, and the map, so many different ways used by Calvino to understand our world. All of this will take us on an initiatory journey that will allow us not only to better understand the reality we live in, and therefore ourselves, but also to enter the writer's mind, without however forgetting that we can only scratch the surface of the latter.
Abstract:
White spot lesion (WSL) infiltration has been recommended immediately after debonding of orthodontic brackets. It is, however, not clear whether established inactive WSLs can also be masked through infiltration. Orthodontic treatment of a 19-year-old patient had to be terminated prematurely due to the development of multiple WSLs of varying severity. Three months after debonding, the patient presented for lesion infiltration. After etching with 15% HCl gel and re-wetting of the dried surfaces, it seemed that a good outcome could be expected. Lesion infiltration led to complete masking of the less severe WSLs. The visual appearance of moderate and severe WSLs was improved, but they were still visible after treatment. Inactive WSLs may not represent an increased caries risk, but patients are often bothered esthetically. Infiltration with repeated etching might be a viable approach even for inactive WSLs. Controlled clinical trials are needed to investigate the long-term performance of this technique.
Abstract:
INTRODUCTION: Fixed orthodontic appliances can alter the subgingival microbiota. Our aim was to compare the subgingival microbiota and clinical parameters in adolescent subjects at sites of teeth treated with orthodontic bands with margins at (OBM) or below (OBSM) the gingival margin, or with brackets (OBR). METHODS: Microbial samples were collected from 33 subjects (ages 12-18 years) in treatment for more than 6 months. The microbiota was assessed by the DNA-DNA checkerboard hybridization method. RESULTS: Bacterial samples were taken from 83 OBR, 103 OBSM, and 54 OBM sites. Probing pocket depths differed by appliance type (P <0.001), with mean values of 2.9 mm (SD, 0.6) at OBSM sites, 2.5 mm (SD, 0.6) at OBM sites, and 2.3 mm (SD, 0.5) at OBR sites. Only Actinomyces israelii (P <0.001) and Actinomyces naeslundii (P <0.001) had higher levels at OBR sites, whereas Neisseria mucosa had higher levels at sites treated with OBSM or OBM (P <0.001). Aggregatibacter actinomycetemcomitans was found in 25% of sites, independent of the appliance. CONCLUSIONS: Different types of orthodontic appliances cause minor differences in the subgingival microbiota, with higher levels of A. israelii and A. naeslundii at sites treated with brackets. More sites with bleeding on probing and deeper pockets were found around orthodontic bands.
Abstract:
Our aim in this study was to compare intermolar widths between conventional and self-ligating brackets after alignment of crowded mandibular dental arches in nonextraction adolescent patients.