999 results for: transcendence basis, linear disjointness, separable extension, perfect field
Abstract:
Aim: To evaluate the effects of 10% NaOCl gel application on the dentin bond strength and the morphology of resin-dentin interfaces formed by three adhesives. Methods: Two etch-and-rinse adhesives (One-Step Plus, Bisco Inc. and Clearfil Photo Bond, Kuraray Noritake Dental) and one self-etch adhesive (Clearfil SE Bond, Kuraray Noritake Dental) were applied to dentin according to the manufacturers' instructions or after treatment with 10% NaOCl (ED-Gel, Kuraray Noritake Dental) for 60 s. For interfacial analysis, specimens were subjected to an acid-base challenge and observed by SEM to identify the formation of the acid-base resistant zone (ABRZ). For microtensile bond strength, the same groups were investigated and the restored teeth were either thermocycled (5,000 cycles) or not before testing. Bond strength data were subjected to two-way ANOVA and Tukey's test (p<0.05). Results: NaOCl application affected the bond strengths of One-Step Plus and Clearfil Photo Bond. Thermocycling reduced the bond strengths of Clearfil Photo Bond and Clearfil SE Bond when used after NaOCl application, and of One-Step Plus when used as recommended by the manufacturer. An ABRZ was observed adjacent to the hybrid layer for the self-etching primer system. The etch-and-rinse systems showed external lesions after the acid-base challenge and no ABRZ formation when applied according to the manufacturers' instructions. Conclusions: 10% NaOCl changed the morphology of the bonding interfaces, and its use with etch-and-rinse adhesives reduced the dentin bond strength. ABRZ formation was material-dependent and the interface morphologies differed among the tested materials.
Abstract:
This work analyzes modern seismic isolation systems. In particular, it studies the behaviour and design procedures of friction pendulum (curved surface slider) isolators, with reference to the provisions of the current technical code, and analyzes the seismic reliability of a structure isolated with such devices. With a view to introducing the basic concepts of earthquake-resistant design, the first chapter describes the origins of the seismic phenomenon, up to the definition of the accelerogram of an earthquake. To understand the passage from the accelerogram to the spectrum of the seismic action, the dynamics of single-degree-of-freedom systems is introduced, leading to the definition of the response spectra prescribed by the current code. The second chapter illustrates the purposes and functions of seismic isolation and then reviews the main types of anti-seismic devices currently on the market. The third chapter analyzes in detail the behaviour of friction pendulum isolators, studying their underlying mathematical model and the design procedures prescribed by Eurocode 8 and the NTC 08. In the same chapter, a friction pendulum isolation device is designed by means of the equivalent linear analysis. The fourth chapter illustrates the seismic analysis carried out by the engineers Castaldo, Palazzo and Della Vecchia of the University of Salerno, whose results are published in the journal Engineering Structures, vol. 95, 2015, pp. 80-93. The study is performed on the three-dimensional structural model of a reinforced concrete building, base-isolated with friction pendulum devices.
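As an orientation for the equivalent linear analysis mentioned in the abstract, the following minimal Python sketch computes the effective stiffness, effective period and equivalent viscous damping commonly associated with a friction pendulum isolator; the formulas are the textbook ones and the numerical values are placeholders, not the design values of the thesis.

```python
import math

def friction_pendulum_equivalent_linear(W, R, mu, d, g=9.81):
    """Equivalent linear properties of a single friction pendulum isolator.

    W  : supported weight [kN]
    R  : effective radius of curvature of the sliding surface [m]
    mu : friction coefficient of the sliding interface [-]
    d  : design displacement at which the properties are evaluated [m]
    """
    k_eff = W / R + mu * W / d                           # effective (secant) stiffness [kN/m]
    T_eff = 2.0 * math.pi * math.sqrt(W / (g * k_eff))   # effective period [s]
    xi_eff = (2.0 / math.pi) * mu / (mu + d / R)         # equivalent viscous damping ratio [-]
    return k_eff, T_eff, xi_eff

# Example with placeholder values (not the thesis design values)
k, T, xi = friction_pendulum_equivalent_linear(W=2000.0, R=3.1, mu=0.05, d=0.25)
print(f"k_eff = {k:.1f} kN/m, T_eff = {T:.2f} s, xi_eff = {xi:.1%}")
```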
Abstract:
Stroke is one of the most frequent causes of death, regardless of age or gender. Besides accounting for an expressive mortality figure, the disease also causes long-term disabilities with long recovery times, which go hand in hand with costs. However, stroke and related health conditions may also be prevented when evidence of the illness is taken into account. Therefore, the present work starts with the development of a decision support system to assess stroke risk, centered on a formal framework based on Logic Programming for knowledge representation and reasoning, complemented with a Case Based Reasoning (CBR) approach to computing. Indeed, in order to target the CBR cycle in practical terms, normalization and optimization phases were introduced and clustering methods were used, thereby reducing the search space and enhancing the case retrieval stage. On the other hand, aiming at an improvement of the CBR theoretical basis, the predicates' attributes were normalized to the interval 0...1, and the extensions of the predicates that make up the universe of discourse were rewritten and set not only in terms of an evaluation of their Quality-of-Information (QoI), but also in terms of an assessment of a Degree-of-Confidence (DoC), a measure of one's confidence that they fit into a given interval, taking into account their domains; i.e., each predicate attribute is given in terms of a pair (QoI, DoC), a simple and elegant way to represent data or knowledge that is incomplete, self-contradictory, or even unknown.
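To illustrate the normalization and retrieval steps of the CBR cycle described above, here is a minimal Python sketch; the attributes, their ranges and the case values are hypothetical, and the QoI/DoC machinery of the paper is not reproduced, only min-max normalization to [0, 1] and similarity-based retrieval.

```python
import numpy as np

def normalize(cases, lo, hi):
    """Min-max normalization of each attribute to the interval [0, 1]."""
    return (cases - lo) / (hi - lo)

def retrieve(case_base, new_case, k=3):
    """Return the indices of the k most similar cases (Euclidean similarity)."""
    dist = np.linalg.norm(case_base - new_case, axis=1)
    return np.argsort(dist)[:k]

# Hypothetical case base: rows are past patients, columns are risk factors
lo = np.array([30.0, 90.0, 0.0])       # attribute minima (age, systolic BP, smoker flag)
hi = np.array([90.0, 200.0, 1.0])      # attribute maxima
raw = np.array([[55, 140, 1], [72, 160, 0], [40, 120, 0], [65, 180, 1]], dtype=float)
case_base = normalize(raw, lo, hi)

new_case = normalize(np.array([60, 150, 1], dtype=float), lo, hi)
print(retrieve(case_base, new_case, k=2))   # indices of the two closest past cases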
Abstract:
As a matter of fact, an Intensive Care Unit (ICU) is a hospital facility where patients require close observation and monitoring. Indeed, predicting Length-of-Stay (LoS) at ICUs is essential not only to provide patients with improved Quality-of-Care, but also to help hospital management cope with hospital resources. Therefore, the aim of this work is to present an Artificial Intelligence based Decision Support System to assist in the prediction of LoS at ICUs, centered on a formal framework based on a Logic Programming approach to knowledge representation and reasoning, complemented with a Case Based approach to computing, and able to handle unknown, incomplete, or even contradictory data, information or knowledge.
Abstract:
It is well known that human resources play a valuable role in sustainable organizational development. Indeed, this work focuses on the development of a decision support system to assess workers' satisfaction based on factors related to human resources management practices. The framework is built on top of a Logic Programming approach to knowledge representation and reasoning, complemented with a Case Based approach to computing. The proposed solution is unique in itself, since it caters for the explicit treatment of incomplete, unknown, or even self-contradictory information, in either a qualitative or a quantitative setting. Furthermore, clustering methods based on similarity analysis among cases were used to distinguish and aggregate collections of historical data or knowledge in order to reduce the search space, thereby enhancing case retrieval and the overall computational process.
Abstract:
In studying moduli spaces of semistable Higgs bundles (E, \phi) of rank n on a smooth curve C, a key role is played by the spectral curve X (Hitchin), because an important result by Beauville-Narasimhan-Ramanan allows us to study isomorphism classes of such Higgs bundles in terms of isomorphism classes of rank-1 torsion-free sheaves on X. In this way, the generic fibre of the Hitchin map, which associates to any semistable Higgs bundle the coefficients of the characteristic polynomial of \phi, is isomorphic to the Jacobian of X. Focusing on rank-2 Higgs data, this construction was extended by Barik to the case in which the curve C is reducible and one-nodal, with two smooth components. Such a curve is said to be of compact type because its Picard group is compact. In this work, we describe and clarify the main points of Barik's construction and give examples, especially concerning generic fibres of the Hitchin map. Following Hausel-Pauly, we consider the case of SL(2,C)-Higgs bundles on a smooth base curve, for which the generic fibre of the Hitchin map is a subvariety of the Jacobian of X, the Prym variety. We recall the description of special loci, called endoscopic loci, for which the associated Prym variety is not connected. Then, letting G be an affine reductive group with underlying Lie algebra so(4,C), we consider G-Higgs bundles on a smooth base curve. Starting from the construction by Bradlow-Schaposnik, we discuss the associated endoscopic loci. By adapting these studies to a one-nodal base curve of compact type, we describe the fibres of the SL(2,C)-Hitchin map and of the G-Hitchin map, together with the endoscopic loci. In the Appendix, we give an interpretation of generic spectral curves in terms of families of double covers.
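For orientation, the rank-2 spectral construction referred to above can be summarized, in the notation standard in the literature (not quoted from the thesis), as follows:

```latex
% Rank-2 Hitchin map and the associated spectral curve X inside the total space of K_C
% (a sketch in standard notation, not the thesis' own statement).
h(E,\phi) \;=\; \bigl(\operatorname{tr}\phi,\ \det\phi\bigr)
  \;\in\; H^0(C,K_C)\oplus H^0(C,K_C^{\otimes 2}),
\qquad
X \;=\; \bigl\{\,\lambda \in \operatorname{Tot}(K_C)\ :\ \lambda^{2}-(\operatorname{tr}\phi)\,\lambda+\det\phi=0 \,\bigr\}.
```

For SL(2,C)-Higgs bundles one imposes \operatorname{tr}\phi = 0, so the Hitchin base reduces to H^0(C, K_C^{\otimes 2}) and the generic fibre becomes the Prym variety of the double cover X \to C, as recalled in the abstract.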
Abstract:
Nowadays, world population growth forecasts establish that an increase of 2 billion people is expected by 2050 (UN, 2019). This worldwide increase in population comes with a growing demand for the construction of new residential, institutional, industrial and infrastructural areas, prompting a higher consumption of the natural resources required for construction materials. In addition, an effect of this population growth is the production and accumulation of waste, causing a serious environmental and economic issue around the world. As an alternative to simply producing more waste at the end of life of a building, house, road, or other concrete-based structure, adequate techniques must be applied for recycling and reusing these potential materials. The main priority of the thesis is to foster and evaluate sustainable construction practices leading to environmentally friendly actions that promote the reuse and recycling of construction waste, focusing on the use of recycled construction materials as an alternative for the sub-base and base layers of road structures. The thesis is devoted to the analysis of the several laboratory tests carried out to obtain the physical-mechanical properties of the studied materials (recycled concrete aggregates + reclaimed asphalt pavement (RCA+RAP) and stabilized crushed sleepers). All these tests were carried out in the Roads Laboratory of the University of Bologna and at the experimental site of CAR srl in Imola. The results are reported in tables and graphs and discussed. The mechanical property values obtained from the laboratory tests are analysed and compared with the standard values declared in the Italian and European road construction norms and with the results obtained from in-situ tests in the experimental field (CAR srl in Imola) with the same materials, in order to analyse their performance under natural conditions.
Abstract:
The thesis analyzes the Input-Output model, introduced by Leontief in 1936, to study how the industrial systems of Germany, Spain and Italy reacted to the restrictions imposed by governments to limit the spread of the COVID-19 pandemic. The economies are studied by considering the exchanges between intermediate productive sectors and final demand. The original formulation of the model requires several modifications in order to describe production networks realistically, and it is in any case not fully exhaustive, since it assumes that the productivity of the systems is always sufficient to fully satisfy the demand for the goods produced. A distinction is therefore introduced among the variables of the problem, assuming that some production components are independent of demand while others are endogenous. The solutions of this system, however, do not always belong to the domain of definition of the variables. Hence, using linear programming techniques, the maximum levels of production and satisfied demand in a crisis period are observed, even when the systems do not reach this threshold because they are not fully operational. Several rationing schemes are proposed for distributing the output among claimants: 1) a scheme proportional to the demands of all claimants; 2) a scheme proportional to the demands, with priority given to intermediate sectors; 3) a priority scheme in which intermediate sectors are supplied according to order size; 4) a priority scheme with full supply of orders and random delivery order. The results obtained depend on the chosen rationing scheme, on the size of the shock affecting the sectors, and on the properties of the industrial network, described as a weighted graph.
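The linear-programming step mentioned above, which determines the maximum feasible production and satisfied demand under capacity constraints, can be sketched as follows; the 3-sector technical-coefficient matrix, demands and capacities are toy values, and none of the four rationing schemes is reproduced here.

```python
import numpy as np
from scipy.optimize import linprog

# Toy 3-sector technical-coefficient matrix A, final demand d and capacities
# (illustrative numbers only; the thesis uses real I-O tables for DE, ES, IT).
A = np.array([[0.1, 0.2, 0.0],
              [0.3, 0.1, 0.2],
              [0.1, 0.0, 0.2]])
d = np.array([50.0, 40.0, 30.0])     # final demand per sector
cap = np.array([90.0, 70.0, 60.0])   # maximum feasible gross output (post-shock capacity)

n = len(d)
# Decision variables z = [x, f]: gross outputs x and satisfied final demand f.
c = np.concatenate([np.zeros(n), -np.ones(n)])     # maximize total satisfied demand
A_eq = np.hstack([np.eye(n) - A, -np.eye(n)])      # balance: x - A x - f = 0
b_eq = np.zeros(n)
bounds = [(0, cap[i]) for i in range(n)] + [(0, d[i]) for i in range(n)]

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
x, f = res.x[:n], res.x[n:]
print("gross output:", x.round(2), "satisfied final demand:", f.round(2))
```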
Abstract:
Satellite remote sensing is one of the most effective observation techniques for monitoring and studying the extent of snow cover. Snow cover plays an extremely important role as a water resource for ecosystems and for agricultural and industrial applications. Moreover, it is a climatological indicator in a context of regional and global climate change. In this sense, snow cover should be regarded as an important economic and social resource. The aim of this thesis is to produce daily snow-cover maps for estimating the snow-cover extent of the Emilia-Romagna region and to analyze its spatio-temporal variability over the period 2000-2020. The maps were developed from the M*D10A1 snow products of the MODIS sensor, which consist of surface classification maps based on the NDSI index. First, a decision tree with multiple threshold criteria was built to reprocess the sensor's surface classification and produce snow-cover maps based on three classes: snow, no-snow and cloud. The accuracy of the maps was validated through statistical tests comparing them with in-situ snow-depth data from 24 weather stations, for the 2000-2012 period in which both datasets were available. The results of the validation procedure showed that, in general, there is good agreement between the satellite data and the corresponding ground observations, especially near stations far from evergreen vegetation and/or urban environments. Finally, the climatological variability of the snow-cover extent was evaluated by computing the SCF and SCD snow indices and several SCA indices. Particular attention was focused on the indices of maximum winter extent of the snow cover, of its median value, and of the number of days with extent greater than 39.5% of the region.
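A minimal sketch of a threshold-based three-class decision tree of the kind described above is given below; the NDSI threshold of 0.4 and the cloud flag are illustrative assumptions, not necessarily the multi-threshold criteria adopted in the thesis.

```python
import numpy as np

def classify_snow(ndsi, cloud_mask, ndsi_threshold=0.4):
    """Toy three-class decision tree: 'cloud', 'snow', 'no-snow'.

    ndsi       : array of NDSI values, NDSI = (green - SWIR) / (green + SWIR)
    cloud_mask : boolean array, True where the pixel is flagged as cloudy
    """
    out = np.full(ndsi.shape, "no-snow", dtype=object)
    out[ndsi >= ndsi_threshold] = "snow"
    out[cloud_mask] = "cloud"          # the cloud flag overrides the NDSI test
    return out

ndsi = np.array([0.65, 0.10, 0.45, 0.80])
cloud = np.array([False, False, True, False])
print(classify_snow(ndsi, cloud))      # ['snow' 'no-snow' 'cloud' 'snow']
```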
Abstract:
One of the most ambitious and interesting goals of computer science, especially in the field of artificial intelligence, is to achieve the ability to make a computer reason in a way similar to a human being. The most recent successes of deep neural networks, especially in natural language processing, have encouraged the study of new techniques to tackle this problem, starting from deductive reasoning, the simplest and most linear form of logical reasoning. The fundamental question underlying this thesis is therefore the following: how can a neural network based on the Transformer architecture be employed to advance the state of the art in deductive reasoning over natural language? In the first part of this work I present an in-depth study of some recent systems that have addressed this problem with successful insights. This analysis shows that the integration of neural networks with more traditional symbolic techniques is particularly effective. In the second part I focus on the ProofWriter architecture, which has the merit of being relatively simple and intuitive while offering performance in line with that of its competitors. This analysis highlights the ability of T5 models, with the support of the HuggingFace framework, to produce multiple alternative answers, among which the correct one can then be searched for externally. In the third and final part I provide a prototype showing how this technique can be used to enrich ProofWriter-like systems with symbolic approaches based on linguistic notions, domain-specific knowledge, or simple common sense. The result is a significant improvement in accuracy over the original ProofWriter, but above all a demonstration that this capability of T5 models can be exploited to improve their performance.
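The generation of multiple alternative answers with a T5 model through the HuggingFace transformers library, mentioned above, can be sketched as follows; the checkpoint and the prompt format are placeholders, not the actual ProofWriter fine-tuned model or its exact input encoding.

```python
from transformers import T5Tokenizer, T5ForConditionalGeneration

# Placeholder checkpoint: the ProofWriter models are fine-tuned T5 variants;
# a generic "t5-small" is used here only to show the generation interface.
tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

prompt = "$question$ = Is Bob green? ; $context$ = Bob is big. Big people are green."
inputs = tokenizer(prompt, return_tensors="pt")

# Beam search returning several alternative answers/proofs, among which an
# external symbolic module can then select the correct one.
outputs = model.generate(
    **inputs,
    num_beams=5,
    num_return_sequences=5,
    max_new_tokens=64,
)
for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))
```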
Abstract:
The cranial base, composed of the midline and lateral basicranium, is a structurally important region of the skull associated with several key traits, which has been extensively studied in anthropology and primatology. In particular, most studies have focused on the association between midline cranial base flexion and relative brain size, or encephalization. However, variation in lateral basicranial morphology has been studied less thoroughly. Platyrrhines are a group of primates that experienced a major evolutionary radiation accompanied by extensive morphological diversification in Central and South America over a large temporal scale. Previous studies have also suggested that they underwent several evolutionarily independent processes of encephalization. Given these characteristics, platyrrhines present an excellent opportunity to study, on a large phylogenetic scale, the morphological correlates of primate diversification in brain size. In this study we explore the pattern of variation in basicranial morphology and its relationship with phylogenetic branching and with encephalization in platyrrhines. We quantify variation in the 3D shape of the midline and lateral basicranium and endocranial volumes in a large sample of platyrrhine species, employing high-resolution CT-scans and geometric morphometric techniques. We investigate the relationship between basicranial shape and encephalization using phylogenetic regression methods and calculate a measure of phylogenetic signal in the datasets. The results showed that phylogenetic structure is the most important dimension for understanding platyrrhine cranial base diversification; only Aotus species do not show concordance with our molecular phylogeny. Encephalization was only correlated with midline basicranial flexion, and species that exhibit convergence in their relative brain size do not display convergence in lateral basicranial shape. The evolution of basicranial variation in primates is probably more complex than previously believed, and understanding it will require further studies exploring the complex interactions between encephalization, brain shape, cranial base morphology, and ecological dimensions acting along the species divergence process.
Abstract:
A simple analytical method for the extraction and quantification of lutein colorant added to yogurt was developed and validated. The method allowed complete extraction of carotenoids using tetrahydrofuran under vortex agitation, followed by centrifugation, partition to diethyl ether/petroleum ether, and drying. The carotenoids dissolved in ethanol were quantified by UV-Vis spectrophotometry. The method showed linearity in the range tested (1.41-13.42 µg g-1), limits of detection and quantification of 0.42 and 1.28 µg g-1, respectively, low relative standard deviation (3.4%), and recovery ranging from 95 to 103%. The method proved reliable for the quantification of lutein added to yogurt.
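For orientation, one common (ICH-style) way of deriving detection and quantification limits from a calibration curve is given below; the paper may have used a different criterion, so this is only a hedged sketch:

```latex
% Common ICH-style definitions (an assumption, not necessarily the paper's criterion):
% sigma = standard deviation of the blank (or of the calibration intercept),
% S     = slope of the calibration curve.
\mathrm{LOD} = \frac{3.3\,\sigma}{S},
\qquad
\mathrm{LOQ} = \frac{10\,\sigma}{S}.
```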
Abstract:
A base cutter, represented by a four-bar mechanism, was developed using the AutoCAD program. The normal reaction force of the profile at the contact point was determined through dynamic analysis. The dynamic equilibrium equations were based on the Newton-Euler laws. The linkage was subjected to an optimization technique that considered the peak value of the soil reaction force as the objective function to be minimized, while the link lengths and the spring constant varied over a specified range. The Sequential Quadratic Programming (SQP) algorithm was implemented in the Matlab computational environment. Results were very encouraging: the maximum value of the normal reaction force was reduced from 4,250.33 to 237.13 N, making the floating process much less disturbing to the soil and the sugarcane ratoons. Later, other variables were incorporated into the optimized mechanism and a new optimization process was carried out.
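Since the Matlab SQP routine used in the study is not reproduced here, the sketch below shows the same kind of bounded minimization with SciPy's SLSQP solver (an SQP implementation); the surrogate peak-force function, the link-length ranges and the spring-constant range are placeholders, not the actual Newton-Euler model or design values.

```python
import numpy as np
from scipy.optimize import minimize

def peak_soil_reaction(params):
    """Stand-in for the peak normal soil reaction force of the four-bar base cutter.

    params = (L2, L3, L4, k_spring): three link lengths [m] and the spring
    constant [N/m]. In the actual study this value comes from the Newton-Euler
    dynamic model of the linkage; here a smooth surrogate is used so that the
    sketch runs on its own.
    """
    L2, L3, L4, k = params
    return 4250.0 * np.exp(-(k / 5000.0)) * (1.0 + (L3 - 2.0 * L2) ** 2 + (L4 - 1.5 * L2) ** 2)

x0 = np.array([0.20, 0.45, 0.35, 1000.0])            # initial guess
bounds = [(0.10, 0.40), (0.20, 0.80), (0.20, 0.60), (500.0, 20000.0)]

res = minimize(peak_soil_reaction, x0, method="SLSQP", bounds=bounds)
print("optimized parameters:", res.x.round(3))
print("surrogate peak reaction force [N]:", round(res.fun, 2))
```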
Abstract:
Base cutting and feeding into harvesters of plants lying close to the ground surface require an efficient sweeping action of the cutting mechanism. This is not the case with conventional sugarcane harvesters, whose rigid blades mounted on discs can contaminate the cane with dirt as well as damage the ratoons. The objective of this work was to simulate the sweeping performance of a segmented base cutter. The model was developed using the laws of dynamics. The simulation included two rotational speeds (400 and 600 rpm), two cutting heights (0.12 and 0.13 m) and two disk tilting angles (-10º and -12º). The simulated sweeping angle varied between 56º and 193º, values that are very promising as a means of cutting and feeding cane stalks lying on the ground. Cutting height was the variable that most affected the sweeping action. This behavior indicates the need for an automatic control of the cutting disk height in order to maintain good sweeping performance as the harvester moves forward.
Abstract:
Rice husk, employed as an energy source by milling industries in Brazil, generates a dark ash after burning. This residue is not yet conveniently disposed of, being currently dumped over large areas and causing environmental problems. This research aimed to evaluate the application of residual rice husk ash (RHA) as a partial replacement of cement in mortar production. The rice husk ash was chemically characterized through X-ray fluorescence, determination of carbon content, X-ray diffraction, and laser granulometric analysis. Mortar specimens were submitted to two different exposure conditions, internal and external environments, for a maximum period of five months. The physical-mechanical tests were compressive strength and ultrasonic pulse velocity (UPV). Although presenting good mechanical performance, the ash-based mortar (RHA) did not present pozzolanic activity, but the ash can be employed in cement matrices as an inert material (filler).