8 results for What-if Analysis

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance: 90.00%

Abstract:

This survey of the works composed by Filippo Tommaso Marinetti between 1909 and 1912 is underpinned by a paradoxical thesis: Marinetti's Futurism would not be an expression of modernity but an anti-modern reaction which, behind a superficial and enthusiastic adherence to some of the watchwords of the second industrial revolution, conceals a deep-seated pessimism about man and history. In this sense Futurism becomes an emblem of Italian cultural backwardness and "gattopardismo", and anticipates the analogous operation carried out in politics by Mussolini: behind a formal adherence to certain demands of modernity, the preservation of the status quo. Marinetti is described as a foreign body with respect to the scientific culture of the twentieth century: a Futurist without a future (science-fiction projections are extremely rare in Marinetti). This aspect is particularly evident in the works produced in the three-year period 1908-1911, which are not only very different from the later Futurist works but in some respects represent a genuine antithesis of what literary Futurism would become from 1912 onwards, with the publication of the Manifesto tecnico della letteratura futurista and the invention of the parole in libertà (words-in-freedom). In the earlier works, a substantial indifference to technological progressivism was matched by an obsessive attention to corporeality and a constant recourse to allegory, with particularly grotesque effects (above all in the novel Mafarka le futuriste) in which traces of a still medieval-Renaissance conception of the world can be detected. This regressive component of Marinetti's Futurism was blatantly abandoned from 1912 onwards, with Zang Tumb Tumb, only to resurface cyclically, like an underground current, in other phases of his career: in 1922, for example, with the publication of Gli indomabili (another allegorical work, rich in literary reminiscences). That of 1912 is a genuine rupture, which in the first chapter is investigated both from a historical point of view (epistolary and journalistic documentation brings to light the tensions that led most of the Futurist poets to abandon the movement in that very year) and from a linguistic one: the substantial differences between the words-in-freedom production and the earlier one are underlined, and a psychological explanation of the abrupt turn Marinetti imposed on his movement is also ventured. The second chapter proposes a formal and thematic analysis of the "grotesque function" in Marinetti's works. In the third chapter, a comparative analysis of the incarnations of the machines portrayed in Marinetti's works reveals that in this author the machine is almost always associated with the thought of death and with a masochistic drive (the latter dominant in Gli indomabili); this leads to the hypothesis that the Futurist experience, and in particular the words-in-freedom Futurism after 1912, is the reworking of a trauma. This trauma can be interpreted metaphorically as the shock of the young Marinetti, who leapt within a few years from the sands of Alexandria in Egypt to the industrial mists of Milan, but also as a real traumatic experience (the car accident of 1908, "mythologized" in the first manifesto, but which the author actually lived through as a genuinely disturbing event).

Relevance: 90.00%

Abstract:

The coastal area of Emilia-Romagna (ER), on the Italian side of the northern Adriatic Sea, is considered here for the implementation of an unstructured numerical ocean model, with the aim of developing innovative tools for coastal management and a forecasting system for storm surge risk reduction. The Adriatic Sea has been the focus of several studies because of its peculiar dynamics, driven by many forcings acting at basin and local scales. The ER coast is particularly exposed to storm surge events: under particular conditions, winds, tides and seiches may combine and contribute to the flooding of the coastal area. The global sea level rise expected in the coming decades will further increase the hazard along the ER and Adriatic coasts. Reliable Adriatic- and Mediterranean-scale numerical ocean models are now available, allowing the dynamical downscaling of very high-resolution models over limited coastal areas. In this work the numerical ocean model SHYFEM is implemented in the Goro lagoon (GOLFEM) and along the ER coast (ShyfER) to test innovative solutions against sea-related coastal hazards. GOLFEM was successfully applied to analyze the Goro lagoon dynamics and to assess the dynamical effects of human interventions through the analysis of what-if scenarios. The assessment of storm surge hazard in the Goro lagoon was carried out by developing an ensemble storm surge forecasting system with GOLFEM, using forcing from different operational meteorological and ocean models, which showed the fundamental importance of the boundary conditions. The ShyfER domain is used to investigate innovative solutions against storm-surge-related hazards along the ER coast. Seagrass is assessed as a nature-based solution (NBS) for coastal protection under present and future climate conditions. The results show negligible effects on sea level but appreciable reductions in bottom current velocity.
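
The what-if analysis itself amounts to differencing paired model runs. Below is a minimal sketch of such a comparison between a baseline and an intervention scenario (e.g. with a seagrass meadow); the synthetic arrays only stand in for the sea-level and near-bottom current output that an unstructured model such as SHYFEM would produce at each grid node, and the scenario effect is invented for illustration, not taken from the thesis.

```python
# Toy what-if comparison between a baseline run and a scenario run.
# All values are synthetic placeholders for per-node model output.
import numpy as np

rng = np.random.default_rng(1)
n_nodes, n_times = 500, 240                                   # hypothetical grid nodes, hourly records

zeta_base = 0.3 * rng.standard_normal((n_nodes, n_times))     # sea level [m]
speed_base = np.abs(0.4 + 0.1 * rng.standard_normal((n_nodes, n_times)))  # bottom speed [m/s]

# Assumed scenario effect: nearly unchanged sea level, damped bottom currents.
zeta_scen = zeta_base - 0.002
speed_scen = 0.7 * speed_base

# What-if metrics: change in the maximum sea level reached during the event
# and percentage change in time-averaged bottom current speed, node by node.
dzeta_max = zeta_scen.max(axis=1) - zeta_base.max(axis=1)
pct_speed = 100.0 * (speed_scen.mean(axis=1) - speed_base.mean(axis=1)) / speed_base.mean(axis=1)

print(f"mean change in max sea level: {dzeta_max.mean():+.3f} m")
print(f"mean change in bottom speed:  {pct_speed.mean():+.1f} %")
```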

Relevance: 90.00%

Abstract:

Artificial Intelligence (AI) and Machine Learning (ML) are novel data analysis techniques that provide very accurate prediction results. They are widely adopted in a variety of industries to improve efficiency and decision-making, and they are also being used to develop intelligent systems. Their success rests on complex mathematical models, whose decisions and rationale are usually difficult for human users to comprehend, to the point of being dubbed black boxes. This is particularly relevant in sensitive and highly regulated domains. To mitigate and possibly solve this issue, the Explainable AI (XAI) field has become prominent in recent years. XAI consists of models and techniques that enable understanding of the intricate patterns discovered by black-box models. In this thesis, we consider model-agnostic XAI techniques applicable to tabular data, with a particular focus on the credit scoring domain. Special attention is dedicated to the LIME framework, for which we propose several modifications to the vanilla algorithm, in particular: a pair of complementary Stability Indices that accurately measure LIME stability, and the OptiLIME policy, which helps the practitioner find the proper balance between the stability and the reliability of the explanations. We subsequently put forward GLEAMS, a model-agnostic interpretable surrogate model that needs to be trained only once while providing both local and global explanations of the black-box model. GLEAMS produces feature attributions and what-if scenarios, from both the dataset and the model perspective. Finally, we argue that synthetic data are an emerging trend in AI, increasingly used in place of original data to train complex models. To explain the outcomes of such models, we must guarantee that the synthetic data are reliable enough for their explanations to be translated to real-world individuals. To this end we propose DAISYnt, a suite of tests to measure the quality and privacy of synthetic tabular data.
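
To illustrate what measuring LIME stability can look like in practice, the sketch below explains the same instance several times and scores the agreement of the top-k features with a simple Jaccard index. This is only a proxy, not the Stability Indices proposed in the thesis; the dataset, model and parameters are placeholders chosen to keep the example self-contained.

```python
# Illustrative sketch: estimating LIME explanation stability by repeated runs.
# The Jaccard-based score is a simplified proxy, NOT the thesis's Stability Indices.
from itertools import combinations

import numpy as np
from lime.lime_tabular import LimeTabularExplainer
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier

data = load_breast_cancer()
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(data.data, data.target)

explainer = LimeTabularExplainer(
    data.data, feature_names=list(data.feature_names),
    class_names=list(data.target_names), mode="classification")

def repeated_top_features(instance, k=5, runs=10):
    """Top-k LIME features obtained from several repeated explanations of one instance."""
    tops = []
    for _ in range(runs):
        exp = explainer.explain_instance(instance, model.predict_proba, num_features=k)
        tops.append({name for name, _ in exp.as_list()})
    return tops

def jaccard_stability(feature_sets):
    """Mean pairwise Jaccard similarity of the feature sets (1 = perfectly stable)."""
    sims = [len(a & b) / len(a | b) for a, b in combinations(feature_sets, 2)]
    return float(np.mean(sims))

print(jaccard_stability(repeated_top_features(data.data[0])))
```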

Relevance: 80.00%

Abstract:

This dissertation deals with the problems and opportunities of a semiotic approach to perception. Is perception, seen as the ability to detect and articulate a coherent picture of the surrounding environment, describable in semiotic terms? Is it possible, for a discipline wary of any attempt to reduce semiotic meaning to a psychological and naturalized issue, to come to terms with the cognitive, automatic and genetically hard-wired specifics of our perceptive systems? In order to deal with perceptive signs, is it necessary to modify basic assumptions in semiotics, or can we simply extend the range of our conceptual instruments and definitions? And what if perception is a wholly different semiotic machinery, to be considered sui generis, but nonetheless interesting for a general theory of semiotics? By presenting the major ideas put forward by the main thinkers in the semiotic field, Mattia de Bernardis gives a comprehensive picture of the theoretical situation, adding to the classical dichotomy between structuralist and interpretative semiotics a further distinction, that between homogeneist and heterogeneist theories of perception. Homogeneist semioticians see perception as one of many semiotic means of sign production, entirely similar to the others, while heterogeneist semioticians consider perceptive meaning as essentially different from ordinary semiotic meaning, so much so that it requires new methods and ideas to be analyzed. The main example of the heterogeneist approach to perception in the semiotic literature, Umberto Eco's "primary semiosis", is then presented, critically examined and eventually rejected, and the homogeneist stance is affirmed as the most promising path towards a semiotic theory of perception.

Relevance: 40.00%

Abstract:

Hydrologic risk (and the closely related hydro-geologic risk) is, and has always been, a highly relevant issue, owing to the severe human and economic losses that floods and waters in general may cause. Floods are natural phenomena, often catastrophic, and cannot be avoided, but their damage can be reduced if they are predicted sufficiently in advance. For this reason, flood forecasting plays an essential role in hydro-geological and hydrological risk prevention. Thanks to the development of sophisticated meteorological, hydrologic and hydraulic models, flood forecasting has made significant progress in recent decades; nonetheless, models are imperfect, which means that we are still left with a residual uncertainty about what will actually happen. This type of uncertainty is what this thesis discusses and analyzes. In operational problems, the ultimate aim of forecasting systems is not to reproduce the river's behavior; this is only a means of reducing the uncertainty associated with what will happen as a consequence of a precipitation event. In other words, the main objective is to assess whether or not preventive interventions should be adopted and which operational strategy may represent the best option. The main problem for a decision maker is to interpret model results and translate them into an effective intervention strategy. To make this possible, it is necessary to clearly define what is meant by uncertainty, since this issue is often a source of confusion in the literature. Therefore, the first objective of this thesis is to clarify this concept, starting from a key question: should the choice of the intervention strategy be based on an evaluation of the model prediction and its ability to represent reality, or on an evaluation of what will actually happen, given the information provided by the model forecast? Once this idea is made unambiguous, the other main concern of this work is to develop a tool that can provide effective decision support, making objective and realistic risk evaluations possible. In particular, such a tool should be able to provide an uncertainty assessment that is as accurate as possible. This means primarily three things: it must be able to correctly combine all the available deterministic forecasts, it must assess the probability distribution of the predicted quantity, and it must quantify the flooding probability. Furthermore, given that the time to implement prevention strategies is often limited, the flooding probability has to be linked to the time of occurrence. For this reason, it is necessary to quantify the flooding probability within a time horizon related to the time required to implement the intervention strategy, and also to assess the probability distribution of the flooding time.
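
As a toy illustration of this last point, the sketch below computes, from an ensemble of forecast water levels, the probability of exceeding a flooding threshold within the part of the forecast horizon that remains useful once the time needed to deploy the intervention has elapsed. All numbers (threshold, lead times, ensemble) are invented for the example and do not come from the thesis.

```python
# Toy ensemble-based flooding probability within a decision-relevant time horizon.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ensemble of hourly water-level forecasts [m]: (members, lead hours).
levels = 1.2 + 0.4 * rng.standard_normal((50, 48))
flood_threshold = 1.8        # assumed bank/dike level [m]
intervention_hours = 12      # assumed time needed to deploy defences

# A flood matters for the decision only if it occurs after the defences could
# be in place, i.e. beyond the intervention lead time.
useful = levels[:, intervention_hours:]
p_flood = np.mean(useful.max(axis=1) > flood_threshold)

# Empirical distribution of the first exceedance time among flooding members.
flooding = useful[(useful > flood_threshold).any(axis=1)]
first_exceed = intervention_hours + np.argmax(flooding > flood_threshold, axis=1)

print(f"P(flooding within the useful horizon) = {p_flood:.2f}")
print("First-exceedance hours of flooding members:", np.sort(first_exceed))
```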

Relevance: 40.00%

Abstract:

In this dissertation the pyrolytic conversion of biomass into chemicals and fuels was investigated from the analytical point of view. The study focused on the liquid (bio-oil) and solid (char) fractions obtainable from biomass pyrolysis. The known drawbacks of Py-GC-MS were partially overcome by coupling different analytical configurations (Py-GC-MS, Py-GC-MIP-AED, and off-line Py-SPE and Py-SPME-GC-MS with derivatization procedures). The application of different techniques allowed a satisfactory comparative analysis of the pyrolysis products of different biomasses and a high-throughput screening of the effect of 33 catalysts on biomass pyrolysis. As the results of the screening showed, the most interesting catalysts were those containing copper (able to reduce the high-molecular-weight fraction of the bio-oil without a large decrease in yield) and H-ZSM-5 (able to entirely convert the bio-oil into "gasoline-like" aromatic products). In order to determine the content of noxious compounds in the liquid product, a clean-up step was included in the Py-SPE procedure. This made it possible to investigate the generation of pollutants (PAHs) from the pyrolysis and catalytic pyrolysis of biomass. Bio-oil from non-catalytic pyrolysis of biomass showed a moderate PAH content, while the use of the H-ZSM-5 catalyst for bio-oil upgrading led to an astonishingly high production of PAHs (compared to what is observed in alkane cracking), indicating an important concern in the substitution of fossil fuels with biomass-derived bio-oil. Moreover, the analytical procedures developed in this thesis were directly applied to the detailed study of the most useful process schemes and upgrading routes to chemical intermediates (anhydrosugars), transportation fuels or commodity chemicals (aromatic hydrocarbons). In the applied study, poplar and microalgae biomass were investigated, and an overall GHG balance of the pyrolysis of agricultural residues in the Ravenna province was carried out. Special attention was paid to comparing the effects of different uses of bio-char (as fuel or as a soil conditioner) on soil health and GHG emissions.

Relevance: 40.00%

Abstract:

Animal neocentromeres are ectopic centromeres that have formed in non-centromeric locations and lack some of the features, such as satellite DNA sequences, that normally characterize canonical centromeres. Despite this, they are stable, functional centromeres inherited through generations. The mere existence of neocentromeres provides convincing evidence that centromere specification is determined by epigenetic rather than sequence-specific mechanisms. For all these reasons, we used them as simplified models to investigate the molecular mechanisms that underlie the formation and maintenance of functional centromeres. We collected human cell lines carrying neocentromeres in different positions. To investigate the regions involved in the process at the DNA sequence level, we applied a recent technology that integrates chromatin immunoprecipitation and DNA microarrays (ChIP-on-chip), using rabbit polyclonal antibodies directed against the human centromeric proteins CENP-A or CENP-C. These DNA-binding proteins are required for kinetochore function and are exclusively targeted to functional centromeres. Thus, the immunoprecipitation of DNA bound by these proteins allows the isolation of centromeric sequences, including those of the neocentromeres. Neocentromeres can arise even in regions containing protein-coding genes. We further analyzed whether the increased number of scaffold attachment sites, and the correspondingly tighter chromatin, of the region involved in the neocentromerization process was still permissive to the transcription of the genes encoded within it. Centromere repositioning is a phenomenon in which a neocentromere that has arisen without altering the gene order, followed by the inactivation of the canonical centromere, becomes fixed in the population. It is a process of chromosome rearrangement that is fundamental in evolution and lies at the basis of speciation. The repeat-free region where the neocentromere initially forms progressively acquires extended arrays of tandem satellite repeats that may contribute to its functional stability. From this perspective, our attention focused on the repositioned centromere of horse chromosome ECA11. ChIP-on-chip analysis was used to define the region involved, and SNP studies mapping within the region involved in neocentromerization were carried out. We were able to describe the structural polymorphism of the chromosome 11 centromeric domain in the Equus caballus population. This polymorphism was observed even between homologous chromosomes of the same cells, a finding that had never been described before. Genomic plasticity has played a fundamental role in evolution, and centromeres are not statically packaged regions of the genome. The key question that fascinates biologists is how this centromere plasticity can be reconciled with the stability and maintenance of centromeric function. Starting from the epigenetic point of view that underlies centromere formation, we decided to analyze the RNA content of centromeric chromatin. RNA, as well as the secondary chemical modifications involving both histones and DNA, is a good candidate for guiding centromere formation and maintenance. Many observations suggest that transcription of centromeric DNA or of other non-coding RNAs could affect centromere formation. To date, there has been no thorough investigation addressing the identity of chromatin-associated RNAs (CARs) on a global scale. This prompted us to develop techniques to identify CARs genome-wide using high-throughput genomic platforms.
The future goal of this study will be to focus specifically on what happens inside centromeric chromatin.

Relevance: 40.00%

Abstract:

Frame. Assessing the difficulty of source texts and parts thereof is important in CTIS, whether for research comparability, for didactic purposes or for setting price differences in the market. In order to measure it empirically, Campbell & Hale (1999) and Campbell (2000) developed the Choice Network Analysis (CNA) framework. Basically, the CNA's main hypothesis is that the more translation options (a group of) translators have to render a given source text stretch, the higher the difficulty of that text stretch will be. We will call this the CNA hypothesis. In a nutshell, this research project puts the CNA hypothesis to the test and studies whether it does actually measure difficulty.

Data collection. Two groups of participants (n=29) of different profiles and from two universities in different countries had three translation tasks keylogged with Inputlog, and filled in pre- and post-translation questionnaires. Participants translated from English (L2) into their L1s (Spanish or Italian) and worked, first in class and then at home, using their own computers, on texts ca. 800-1000 words long. Each text was translated in approximately equal halves in two 1-hour sessions, over three consecutive weeks. Only the parts translated at home were considered in the study.

Results. A very different picture emerged from the data than the one the CNA hypothesis would predict: there was no prevalence of disfluent task segments when there were many translation options, nor was a prevalence of fluent task segments associated with fewer translation options. Indeed, there was no correlation between the number of translation options (many vs. few) and behavioral fluency. Additionally, there was no correlation between pauses and either behavioral fluency or typing speed. The theoretical flaws discussed and the empirical evidence lead to the conclusion that the CNA framework does not and cannot measure text and translation difficulty.