708 results for lex credendi
Abstract:
Knowing a historic building from a technical-constructive point of view is essential both for a reliable assessment of its current structural safety and for the choice of an effective strengthening intervention. For structures of historical interest belonging to the cultural heritage, but also for other buildings, it is difficult, in the absence of historical documentation, to know the origins of the construction and the changes it has undergone over time as a result of damage caused by the deterioration of the materials and by the occurrence of calamitous events. My thesis focuses on non-destructive testing techniques as a means of improving knowledge of the structure so that it can then be treated correctly. The objective of the work was to investigate the contribution of experimental investigations to the diagnosis of historic buildings, in particular of timber and masonry structural elements, by applying non-destructive experimental tests. I first described the state of the art of the various tests by reading and comparing several technical-scientific articles, reporting the objectives, the instrumentation employed and the results obtained from the different tests. I then carried out a study of wood as a construction material, describing it from a structural point of view together with its physical, chemical and mechanical characteristics, its various classifications and its processing phases. Finally, I analysed some of the non-destructive tests required for the diagnosis of timber elements. For each test, its main characteristics, its operating principle and the instrumentation used, with any associated software, are reported.
The last three chapters proceed with the experimental application, on site or in the laboratory, of the previously described tests to different case studies; in particular: 1) the application of the axial compression test to specimens obtained from an ancient timber structural element in order to derive various physical and mechanical parameters; 2) the study of a timber truss present in the laboratory, recovered after the 2012 Emilia earthquake, starting from the visual inspection and visual grading of the elements on the basis of the applicable standards and then analysing the data of the various non-destructive tests performed; 3) finally, the thermographic survey is applied to a building of historical interest, the former Monastery of Santa Marta in via S. Vitale in Bologna, in order to investigate the structural typology and the construction technologies employed in the various eras of this complex.
Abstract:
Television interpreting attracts particular interest in interpreting research because this working context gives great visibility to the interpreter's profession. Given the considerable growth in the importance and reach of the mass media, a form of language transfer is required that makes broadcast events accessible to the widest possible audience. The aim of this study is to present television interpreting in its various forms, focusing then on simultaneous interpreting and in particular on the analysis of two interpretations of the same media event: the second televised debate of 2012 between the two candidates for the presidency of the United States of America, the former governor of Massachusetts Mitt Romney and the incumbent president Barack Obama. The first chapter deals with US presidential debates and illustrates the history of US televised debates, as well as their context and American political discourse. The second chapter analyses interpreting for television: after some historical notes, the different modes of language transfer used in the television context are reviewed. The third chapter covers the presentation, selection and transcription of the collected data. The fourth chapter contains the actual analysis of the Italian renditions of the selected American presidential debate. The Italian interpretations of the same debate broadcast by two different channels, Rai News and Sky TG24, are examined and compared. The fifth chapter contains some concluding remarks based on the results of the data analysis. The appendix presents part of the material analysed, namely the segmented and aligned transcriptions of the original debate and of the two interpretations.
Abstract:
The aim of the thesis is to provide a design solution capable of understanding and enhancing the urban voids that characterise the city of Berlin, grasping their genesis and evolution. The analysis phase led to the identification of a block, the Holzuferblock, marked by all the peculiarities of Berlin's urban voids. Although particularly fragmented and lacking identity, this block has all the characteristics needed to encourage processes of re-appropriation of places: the presence of an abandoned industrial building, the proximity of the river and the presence of remains of the Wall's defence system. Reading the many fragments present in the area as an asset rather than a problem, the project led to the definition of a residential "enclosure" capable of enhancing the urban voids and the existing fragments, among them the Eisfabrik, the now abandoned former ice factory. Identifying in the factory a potential urban catalyst capable of triggering the revitalisation of the area, a support system was designed to work in synergy with it, forming a unitary whole. The boundary that defines the void consists of a system of residential blocks which, reinterpreting the Berlin urban block, give concrete form to the heterogeneity and diversity typical of the area.
Abstract:
This in vitro study evaluated the performance of three ceramic polishing methods and two commonly used polishing methods on two CAD/CAM ceramics. Surface roughness and quality were compared. A glazed group (GLGR) of each ceramic material served as reference. One hundred and twenty specimens of VITABLOCS Mark II (VITA) and 120 specimens of IPS Empress CAD (IPS) were roughened in a standardized manner. Twenty VITA and 20 IPS specimens were glazed (VITA Akzent Glaze/Empress Universal Glaze). Five polishing methods were investigated (n=20/group): 1) EVE Diacera W11DC-Set (EVE), 2) JOTA 9812-Set (JOTA), 3) OptraFine-System (OFI), 4) Sof-Lex 2382 discs (SOF) and 5) Brownie/Greenie/Occlubrush (BGO). Polishing quality was measured with a surface roughness meter (Ra and Rz values). The significance level was set at alpha=0.05. Kruskal-Wallis tests and pairwise Wilcoxon rank sum tests with Bonferroni-Holm adjustment were used. Qualitative surface evaluation of representative specimens was done with SEM. On VITA ceramics, SOF produced lower Ra (p<0.00001) but higher Rz values than GLGR (p=0.003); EVE, JOTA, OFI and BGO yielded significantly higher Ra and Rz values than GLGR. On IPS ceramics, SOF and JOTA exhibited lower Ra values than GLGR (p<0.0001). Equivalent Ra but significantly higher Rz values occurred between GLGR and EVE, OFI or BGO. VITA and IPS exhibited the smoothest surfaces when polished with SOF. Nevertheless, ceramic polishing systems are still of interest to clinicians using CAD/CAM, as these methods are universally applicable and showed an increased durability compared to the investigated silicone polishers.
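The Bonferroni-Holm adjustment used for the pairwise Wilcoxon tests above can be sketched in a few lines. This is a generic illustration of the step-down procedure; the p-values in the example are invented placeholders, not data from the study.

```python
def holm_adjust(p_values):
    """Bonferroni-Holm step-down adjustment of raw p-values.

    Sort the m p-values ascending, multiply the i-th smallest by
    (m - i), cap at 1, then enforce monotonicity so that adjusted
    values never decrease along the sorted order.
    """
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    adjusted = [0.0] * m
    running_max = 0.0
    for rank, idx in enumerate(order):
        value = min(1.0, (m - rank) * p_values[idx])
        running_max = max(running_max, value)  # step-down monotonicity
        adjusted[idx] = running_max
    return adjusted

# Hypothetical raw p-values from three pairwise comparisons:
raw = [0.003, 0.04, 0.02]
print(holm_adjust(raw))  # smallest p scaled by 3, the others by 2
```

Under Holm, each adjusted p-value is compared against the same alpha (0.05 in the study), which controls the family-wise error rate less conservatively than plain Bonferroni.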
Abstract:
The article discusses the problems of the law applicable to copyright infringements online. It first identifies the main problems related to the well-established territoriality principle and the lex loci protectionis rule. The discussion then focuses on the "ubiquitous infringement" rule recently proposed by the American Law Institute (ALI) and the European Max Planck Group for Conflicts of Law and Intellectual Property (CLIP). The author strongly welcomes a compromise between the territoriality and universality approaches suggested in respect of ubiquitous infringement cases. At the same time, the paper draws attention to the fact that the interests of "good faith" online service providers (such as legal certainty and foreseeability) have until now been underestimated, and invites taking these interests into account when merging the projects into a common international proposal.
Abstract:
The purpose of the article is to provide a doctrinal summary of the concept, rules and policy of exhaustion, first on the international and EU level and later under the law of the United States. Building on this introduction, the paper turns to the analysis of the doctrine in the pioneering court decisions handed down in UsedSoft, ReDigi and the German e-book/audio book cases, and in the pending Tom Kabinet case from the Netherlands. Questions related to the licence-versus-sale dichotomy; the so-called umbrella solution; the "new copy theory"; the migration of digital copies via the internet; the forward-and-delete technology; the issue of lex specialis; and the theory of functional equivalence are covered later on. The author stresses that the answers given by the respective judges in the cited cases are not the final word in the discussion. The UsedSoft preliminary ruling and the subsequent German domestic decisions highlight a special treatment for computer programs. On the other hand, the refusal of digital exhaustion in ReDigi and the audio book/e-book cases might be in accordance with the present wording of copyright law; however, they do not necessarily reflect the proper trends of our age. The paper takes the position that the need for digital exhaustion is constantly growing in society and amongst businesses. Indeed, there are reasonable arguments in favour of equalizing the resale of works sold in tangible and intangible format. Consequently, the paper urges the reconsideration of the norms on exhaustion on the international and EU level.
Abstract:
Enforcement of copyright online and fighting online “piracy” is a high priority on the EU agenda. Private international law questions have recently become some of the most challenging issues in this area. Internet service providers are still uncertain how the Brussels I Regulation (Recast) provisions would apply in EU-wide copyright infringement cases and in which country they can be sued for copyright violations. Meanwhile, because of the territorial approach that still underlies EU copyright law, right holders are unable to acquire EU-wide relief for copyright infringements online. This article first discusses the recent CJEU rulings in the Pinckney and Hejduk cases and argues that the “access approach” that the Court adopted for solving jurisdiction questions could be quite reasonable if it is applied with additional legal measures at the level of substantive law, such as the targeting doctrine. Secondly, the article explores the alternatives to the currently established lex loci protectionis rule that would enable right holders to get EU-wide remedies under a single applicable law. In particular, the analysis focuses on the special applicable law rule for ubiquitous copyright infringements, as suggested by the CLIP Group, and other international proposals.
Abstract:
Atmospheric concentrations of the three important greenhouse gases (GHGs) CO2, CH4 and N2O are mediated by processes in the terrestrial biosphere that are sensitive to climate and CO2. This leads to feedbacks between climate and land and has contributed to the sharp rise in atmospheric GHG concentrations since pre-industrial times. Here, we apply a process-based model to reproduce the historical atmospheric N2O and CH4 budgets within their uncertainties and apply future scenarios for climate, land-use change and reactive nitrogen (Nr) inputs to investigate future GHG emissions and their feedbacks with climate in a consistent and comprehensive framework. Results suggest that in a business-as-usual scenario, terrestrial N2O and CH4 emissions increase by 80% and 45%, respectively, and the land becomes a net source of C by AD 2100. N2O and CH4 feedbacks imply an additional warming of 0.4–0.5 °C by AD 2300, on top of 0.8–1.0 °C caused by terrestrial carbon cycle and albedo feedbacks. The land biosphere represents an increasingly positive feedback to anthropogenic climate change and amplifies equilibrium climate sensitivity by 22–27%. Strong mitigation limits the increase of terrestrial GHG emissions and prevents the land biosphere from acting as an increasingly strong amplifier of anthropogenic climate change.
Abstract:
Objectives: To investigate surface roughness and microhardness of two recent resin-ceramic materials for computer-aided design/computer-aided manufacturing (CAD/CAM) after polishing with three polishing systems. Surface roughness and microhardness were measured immediately after polishing and after six months storage including monthly artificial toothbrushing. Methods: Sixty specimens of Lava Ultimate (3M ESPE) and 60 specimens of VITA ENAMIC (VITA Zahnfabrik) were roughened in a standardized manner and polished with one of three polishing systems (n=20/group): Sof-Lex XT discs (SOFLEX; three-step (medium-superfine); 3M ESPE), VITA Polishing Set Clinical (VITA; two-step; VITA Zahnfabrik), or KENDA Unicus (KENDA; one-step; KENDA Dental). Surface roughness (Ra; μm) was measured with a profilometer and microhardness (Vickers; VHN) with a surface hardness indentation device. Ra and VHN were measured immediately after polishing and after six months storage (tap water, 37°C) including monthly artificial toothbrushing (500 cycles/month, toothpaste RDA ~70). Ra- and VHN-values were analysed with nonparametric ANOVA followed by Wilcoxon rank sum tests (α=0.05). Results: For Lava Ultimate, Ra (mean [standard deviation] before/after storage) remained the same when polished with SOFLEX (0.18 [0.09]/0.19 [0.10]; p=0.18), increased significantly with VITA (1.10 [0.44]/1.27 [0.39]; p=0.0001), and decreased significantly with KENDA (0.35 [0.07]/0.33 [0.08]; p=0.03). VHN (mean [standard deviation] before/after storage) decreased significantly regardless of polishing system (SOFLEX: 134.1 [5.6]/116.4 [3.6], VITA: 138.2 [10.5]/115.4 [5.9], KENDA: 135.1 [6.2]/116.7 [6.3]; all p<0.0001). For VITA ENAMIC, Ra (mean [standard deviation] before/after storage) increased significantly when polished with SOFLEX (0.37 [0.18]/0.41 [0.14]; p=0.01) and remained the same with VITA (1.32 [0.37]/1.31 [0.40]; p=0.58) and with KENDA (0.81 [0.35]/0.78 [0.32]; p=0.21). 
VHN (mean [standard deviation] before/after storage) remained the same regardless of polishing system (SOFLEX: 284.9 [24.6]/282.4 [31.8], VITA: 284.6 [28.5]/276.4 [25.8], KENDA: 292.6 [26.9]/282.9 [24.3]; p=0.42-1.00). Conclusion: Surface roughness and microhardness of Lava Ultimate were more affected by storage and artificial toothbrushing than were those of VITA ENAMIC.
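Both roughness studies above report the parameter Ra, the arithmetic mean of the absolute deviations of the measured profile from its mean line. A minimal sketch of that definition follows; the profile heights are invented for illustration, not measurements from either study.

```python
def mean_roughness_ra(profile_heights):
    """Arithmetic mean roughness Ra: the mean absolute deviation
    of the measured surface profile from its mean line (same units
    as the input heights, micrometres in the studies above)."""
    n = len(profile_heights)
    mean_line = sum(profile_heights) / n
    return sum(abs(z - mean_line) for z in profile_heights) / n

# Invented profile heights in micrometres (illustrative only):
profile = [0.4, -0.2, 0.1, -0.3]
print(mean_roughness_ra(profile))
```

A profilometer averages deviations in exactly this spirit, which is why Ra alone can mask isolated deep scratches; that is what the additional Rz (peak-to-valley) values in the first study capture.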
Abstract:
The goal of asthma treatment is to obtain clinical control and reduce future risks to the patient. To reach this goal in children with asthma, ongoing monitoring is essential. While all components of asthma, such as symptoms, lung function, bronchial hyperresponsiveness and inflammation, may exist in various combinations in different individuals, to date there is limited evidence on how to integrate these for optimal monitoring of children with asthma. The aims of this ERS Task Force were to describe current practice and give an overview of the best available evidence on how to monitor children with asthma. Twenty-two clinical and research experts reviewed the literature. A modified Delphi method and four Task Force meetings were used to reach a consensus. This statement summarises the literature on monitoring children with asthma. Available tools, such as clinical tools, lung function, bronchial responsiveness and inflammatory markers, are described, as are the ways in which they may be used in children with asthma. Management-related issues, comorbidities and environmental factors are summarised. Despite considerable interest in monitoring asthma in children, for many aspects there is a substantial lack of evidence.
Abstract:
The domain of context-free languages has been extensively explored and there exist numerous techniques for parsing (all or a subset of) context-free languages. Unfortunately, some programming languages are not context-free. Using standard context-free parsing techniques to parse a context-sensitive programming language poses a considerable challenge. Implementors of programming language parsers have adopted various techniques, such as hand-written parsers, special lexers, or post-processing of an ambiguous parser output, to deal with that challenge. In this paper we suggest a simple extension of a top-down parser with contextual information. Contrary to the traditional approach that uses only the input stream as an input to a parsing function, we use a parsing context that provides access to a stream and possibly to other context-sensitive information. At the same time we keep the context-free formalism, so a grammar definition stays simple, without mind-blowing context-sensitive rules. We show that our approach can be used for various purposes such as indent-sensitive parsing, high-precision island parsing, or XML (with arbitrary element names) parsing. We demonstrate our solution with PetitParser, a parsing-expression-grammar-based, top-down, parser combinator framework written in Smalltalk.
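The core idea of the abstract, threading a parsing context (the input stream plus extra context-sensitive state) through a top-down parser, can be sketched as follows. This is not PetitParser or Smalltalk; it is a minimal Python analogue in which the context carries the stream position while an indent-sensitive rule consults information beyond the raw token sequence (here, leading whitespace).

```python
class Context:
    """Parsing context: the input plus context-sensitive state.

    A purely context-free combinator would receive only the stream;
    here the context also exposes the current line's indentation,
    which the grammar rule below uses to decide nesting.
    """
    def __init__(self, lines):
        self.lines = lines
        self.pos = 0

    def peek_indent(self):
        line = self.lines[self.pos]
        return len(line) - len(line.lstrip(' '))

def parse_block(ctx, indent):
    """Parse consecutive lines at exactly `indent`, attaching any
    deeper-indented lines as children (an indent-sensitive rule)."""
    nodes = []
    while ctx.pos < len(ctx.lines) and ctx.peek_indent() == indent:
        name = ctx.lines[ctx.pos].strip()
        ctx.pos += 1
        children = []
        if ctx.pos < len(ctx.lines) and ctx.peek_indent() > indent:
            children = parse_block(ctx, ctx.peek_indent())
        nodes.append((name, children))
    return nodes

src = ["a", "  b", "  c", "d"]
tree = parse_block(Context(src), 0)
print(tree)  # [('a', [('b', []), ('c', [])]), ('d', [])]
```

The grammar rule itself stays simple, as the abstract argues: `parse_block` reads like an ordinary recursive-descent production, and all the context sensitivity lives in the state the context object carries alongside the stream.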
Abstract:
Welsch (project editor): "On 5 April 1849, in the harbour of E.[ckernförde], the Danish ship of the line 'Christian VIII.' and the frigate 'Gefion' were fired upon by the German shore batteries, whereby the former blew up and the latter had to surrender" [Meyers Konv.-Lex., 6th ed.]
Abstract:
Boberach: The weakness of German liberalism and prominent members of the National Assembly, above all Gagern, are attacked. - Welsch (project editor): "... satirical frescoes from the Paulskirche, written in a chronicle style imitating Heine" [Meyers Konv.-Lex., 6th ed.], by the poet and Frankfurt National Assembly deputy Hartmann