14 results for Time-space evolution

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance:

80.00%

Publisher:

Abstract:

This study concerns the representation of space in Caribbean literature, both Francophone and Anglophone, and in particular, though not exclusively, in Martinican literature, in the works of authors born on the island. The analysis focuses on the second half of the twentieth century, a period in which Martinican novel production increased considerably and in which the representation and role of space occupied a prominent place. The thesis therefore explores the literary modalities of this representation. The work consists of five chapters, and the critical and methodological approaches are both analytical and comparative. The first chapter, “The Caribbean space: geography, history and society”, presents the geographic context through an analysis of the major historical and political events that occurred in the Caribbean archipelago, in particular in the French Antilles, from the first colonization to départementalisation. The first paragraph, “The colonized space: a historical-political excursus”, explores the history of the European colonization that forever marked the theatre of the relationship between Europe, Africa and the New World. This social situation set in motion a long and complex process of “re-appropriation and renegotiation of the space” (second paragraph), always the space of the Other, which concerns both Antillean society and the writers’ universe. A series of questions thus arises in the third paragraph, “Landscape and identity”: what is the function of space in the process of identity construction? What are the literary forms and representations of space in the Caribbean context? Can writing be a tool for defining cultural identity, both individual and collective? The second chapter, “The literary representation of the Antillean space”, is a methodological analysis of the notions of literary space and of the descriptive genre. 
The first paragraph, “The literary space of and in the novel”, is an excursus through the theories of critics such as Blanchot, Bachelard, Genette and Greimas, with particular attention to the innovations of the twentieth century; the second, “Space of the Antilles, space of the writing”, is an attempt to apply these theories to the Antillean literary space. Finally, the last paragraph, “Signs on the page: the symbolic places of the Antillean novel landscape”, presents an inventory of the most recurrent Antillean places (mornes, ravines, traces, cachots, En-ville, …), symbols of history and the past, described in literary works but according to new modalities of representation. The third chapter, the core of the thesis, “Re-drawing the map of the French Antilles”, focuses the study of spatial representation on Francophone literature, in particular on selected works by four Martinican writers: Roland Brival, Édouard Glissant, Patrick Chamoiseau and Raphaël Confiant. Through this section a spatial evolution emerges step by step, from the first to the second paragraph, whose titles are linked together: “The novel space evolution: from the forest of the morne… to the jungle of the ville”. The virgin and uncontaminated space of the Antilles before colonization, where the Indios lived in harmony with nature, finds representation in the works of both Brival (Le sang du roucou, Le dernier des Aloukous) and Glissant (Le Quatrième siècle, Ormerod). The arrival of the European colonizer brought a violent and sudden metamorphosis of the original space and landscape, together with the traditions and culture of the Caraïbes population. These radical changes are visible in the works of Chamoiseau (Chronique des sept misères, Texaco, L’esclave vieil homme et le molosse, Livret des villes du deuxième monde, Un dimanche au cachot) and Confiant (Le Nègre et l’Amiral, Eau de Café, Ravines du devant-jour, Nègre marron), which explore the urban space of the Creole En-ville. 
The fourth chapter represents the “second step: the Anglophone novel space” in the exploration of the literary representation of space, through an analytical study of the works of three Anglophone writers: the nineteenth-century Lafcadio Hearn (A Midsummer Trip to the West Indies, Two Years in the French West Indies, Youma) and the contemporary authors Derek Walcott (Omeros, Map of the New World, What the Twilight Says) and Edward Kamau Brathwaite (The Arrivants: A New World Trilogy). The Anglophone voice of the Caribbean archipelago brings a very interesting contribution to the critical idea of a spatial evolution in the literary representation of space begun with the Francophone production: “The spatial evolution goes on: from the Martinique sketches of Hearn… to the modern bards of the Caribbean archipelago” is the new linked title of the two paragraphs. The fifth chapter, “Extended look, space shared: the Caribbean archipelago”, is a comparative analysis of the results achieved in the previous sections, through a dialogue among all the texts in the first paragraph, “Francophone and Anglophone representations of space compared: differences and analogies”. The last paragraph, instead, is an attempt to renegotiate the conventional notions of space and place, moving from a geographical and physical meaning to the new concept of the “commonplace”: not a synonym of prejudice, but a “common place” of sharing and dialogue. The question posed in the last paragraph, “The ‘commonplaces’ of the physical and mental map of the Caribbean archipelago: toward a non-place?”, contains the critical idea of the entire thesis.

Relevance:

80.00%

Publisher:

Abstract:

Premise: in the literary works of our anthropological and cultural imagination, the various languages and the different discursive practices are not necessarily quoted, expressly alluded to or declared through clear expressive mechanisms; rather, they constitute a substratum, a background, now consolidated, which shines through the thematic and formal elements of each text with irony and intertextuality. The various contaminations, hybridizations and promptings that we find in the expressive forms, rhetorical procedures and linguistic and thematic choices of postmodern literary texts take the shape of fluid and familiar categories. Exchanges and passages are no longer merely allowed but inevitable; the postmodern imagination is made up of an agglomeration of discourses that are no longer really separable, built up from texts that blend and quote one another, composing, each with its own specificities, the great family of the cultural products of our social scenario. A literary work, therefore, is not only a whole phenomenon, delimited hic et nunc by a beginning and an ending, but a fragment of the complex, dense and boundless network created by the continual interrelations between human forms of communication and symbolization. The research hypothesis: a vision is delineated of comparative literature as a discipline attentive to the social contexts in which texts take shape and move, and to the media-type consistency that literary phenomena inevitably take on. Literature is thus seen as an open systematicity that chooses to be contaminated by other languages and other discursive practices of an imagination that is more polymorphic and irregular than ever. Within this interpretative framework, the aim is to focus the analysis on the relationship that postmodern literature establishes with advertising discourse. 
On one side, postmodern literature is inserted in the world of communication, loudly asserting the blending and reciprocal contamination of literary modes with media ones, absorbing their languages and signification practices and translating them now into thematic nuclei, motifs and sub-motifs, now into formal expedients and new narrative choices; on the other side, advertising is chosen as a signification practice of the media universe which, since the 1960s, has actively contributed to shaping the dynamics of our socio-cultural scenarios, in terms just as important as those of other discursive practices. Advertising has always been a form of communication and symbolization that draws on the collective imagination – myths, actors and values – turning them into specific narrative programs for its own texts. Hence the aim is to interpret and analyze this relationship both from a strictly thematic perspective – trying to understand what literature speaks about when it speaks about advertising, and seeking advertising quotations in postmodern fiction – and from a formal perspective, searching for parallels and discordances between the rhetorical procedures, languages and stylistic choices verifiable in the texts of the two different signification practices. The analysis method chosen, for the purpose of a constructive multiplication of perspectives, approaches the analytical processes of semiotics, applying its instruments when possible in order to highlight the thematic and formal relationships between literature and advertising. The corpus: the corpus of literary texts is made up of various novels and, although attention is focused on the postmodern period, there are also ineludible quotations from essential authors who, with their works, prompted various reflections: H. de Balzac, Zola, Fitzgerald, Joyce, Calvino, etc. However, the analysis focuses on three authors, Don DeLillo, Martin Amis and Aldo Nove, and in particular on the following novels: “Americana” (1971) and “Underworld” (1999) by Don DeLillo, “Money” (1984) by Martin Amis, and “Woobinda and other stories without a happy ending” (1996) and “Superwoobinda” (1998) by Aldo Nove. The corpus is restricted to these novels for two fundamental reasons: 1. assuming parameters of spatio-temporal evaluation, the texts are representative of different socio-cultural contexts and collective imaginations (from DeLillo’s masterly glimpses of American life, to Nove’s examples of contemporary Italian life, down to Amis’s English imagination) and of different historical moments (the 1970s of DeLillo’s Americana, the 1980s of Amis, down to the 1990s of Nove, decades often used as criteria for dividing postmodernism into phases); 2. adopting a perspective of strictly thematic analysis, as mentioned in the research hypothesis, the variations and constants in the novels (thematic nuclei, topoi, images and narrative developments) frequently speak of advertising, and within the narrative plot they affirm its various expressions and realizations: value-related, thematic, textual, urban, and so on. In these novels the themes and the signification processes of advertising discourse pervade time, space and the relationships that the narrator-character builds around himself. 
We are looking at “particle-characters” whose endless facets attest to the influence and contamination of advertising across a large part of the narrative developments of the plot: on everyday life, on the processes of acquiring and encoding reality, on ideological and cultural baggage, on the relationships and interchanges with the other characters, and so on. Often the characters are victims of the implacable consequentiality of the advertising mechanism, since the latter gets the upper hand over the usual processes of communication, which are overwhelmed by it, wittingly or unwittingly (for example: disturbing openings in which the protagonist kills his or her parents on the basis of a commercial; former advertisers who live life codifying it through the commercial mechanisms of products; sons and daughters of advertisers who as children, instead of playing outside, spent whole nights watching tapes of commercials). Hence the analysis arises from the text and aims to show how much the developments and narrative plots of the novels encode, elaborate and recount the myths, values and narrative programs of advertising discourse, transforming them into novel components in their own right. Also starting from the text, a socio-cultural reference context is delineated, a collective imagination that differs now geographically, now historically; from the comparison between them, the aim is to deduce the constants, similarities and variations in the relationship between literature and advertising.

Relevance:

80.00%

Publisher:

Abstract:

By using a symbolic method, known in the literature as the classical umbral calculus, a symbolic representation of Lévy processes is given and a new family of time-space harmonic polynomials with respect to such processes is introduced, which includes and generalizes the exponential complete Bell polynomials. The usefulness of time-space harmonic polynomials with respect to Lévy processes lies in the fact that the stochastic process obtained by replacing the indeterminate x of the polynomials with a Lévy process is a martingale, whereas the Lévy process itself does not necessarily have this property. Finding such polynomials can therefore be particularly meaningful for applications. This new family includes Hermite polynomials, which are time-space harmonic with respect to Brownian motion; Poisson-Charlier polynomials with respect to Poisson processes; Laguerre and actuarial polynomials with respect to Gamma processes; Meixner polynomials of the first kind with respect to Pascal processes; and Euler, Bernoulli, Krawtchouk and pseudo-Narumi polynomials with respect to suitable random walks. The role played by cumulants is stressed and brought to light, both in the symbolic representation of Lévy processes and their infinite divisibility property, and in the generalization, via the umbral Kailath-Segall formula, of the well-known formulae giving elementary symmetric polynomials in terms of power sum symmetric polynomials. The expression of the family of time-space harmonic polynomials introduced here has some connections with the so-called moment representation of various families of multivariate polynomials. Such a moment representation is studied here for the first time in connection with the time-space harmonic property with respect to suitable symbolic multivariate Lévy processes. 
In particular, multivariate Hermite polynomials and their properties have been studied in connection with a symbolic version of the multivariate Brownian motion, while multivariate Bernoulli and Euler polynomials are represented as powers of multivariate polynomials which are time-space harmonic with respect to suitable multivariate Lévy processes.
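As a concrete instance of the time-space harmonic property discussed above (a standard textbook example, not taken from the thesis): for a standard Brownian motion \(B_t\), the second Hermite-type polynomial \(H_2(x,t) = x^2 - t\) satisfies, for \(s \le t\),

```latex
\mathbb{E}\bigl[H_2(B_t,t) \mid \mathcal{F}_s\bigr]
  = \mathbb{E}\bigl[B_t^2 \mid \mathcal{F}_s\bigr] - t
  = \bigl(B_s^2 + (t-s)\bigr) - t
  = B_s^2 - s
  = H_2(B_s,s),
```

so \(\bigl(H_2(B_t,t)\bigr)_{t\ge 0}\) is a martingale even though \(\bigl(B_t^2\bigr)_{t\ge 0}\) alone is not.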

Relevance:

80.00%

Publisher:

Abstract:

The advances that have characterized spatial econometrics in recent years are mostly theoretical and have not yet found extensive empirical application. In this work we aim to supply a review of the main tools of spatial econometrics and to show an empirical application of one of the most recently introduced estimators. Despite the numerous alternatives that econometric theory provides for the treatment of spatial (and spatiotemporal) data, empirical analyses are still limited by the lack of availability of the corresponding routines in statistical and econometric software. Spatiotemporal modeling represents one of the most recent developments in spatial econometric theory, and the finite sample properties of the estimators that have been proposed are currently being tested in the literature. We provide a comparison between some estimators (a quasi-maximum likelihood, QML, estimator and some GMM-type estimators) for a fixed effects dynamic panel data model under certain conditions, by means of a Monte Carlo simulation analysis. We focus on different settings, characterized either by fully stable or by quasi-unit root series. We also investigate the extent of the bias caused by a non-spatial estimation of a model when the data are characterized by different degrees of spatial dependence. Finally, we provide an empirical application of a QML estimator for a time-space dynamic model which includes a temporal, a spatial and a spatiotemporal lag of the dependent variable. This is done by choosing a relevant and prolific field of analysis, in which spatial econometrics has so far found only limited space, in order to explore the value added of considering the spatial dimension of the data. In particular, we study the determinants of cropland values in the Midwestern U.S.A. in the years 1971-2009, taking the present value model (PVM) as the theoretical framework of the analysis.
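For reference, a generic fixed-effects time-space dynamic panel specification of the kind described above can be written (the notation here is ours, assumed for illustration rather than taken from the thesis) as

```latex
y_t = \tau\, y_{t-1} + \rho\, W y_t + \eta\, W y_{t-1} + X_t \beta + \mu + \varepsilon_t ,
```

where \(y_t\) stacks the cross-sectional units at time \(t\), \(W\) is the spatial weights matrix, \(\mu\) is the vector of fixed effects, and \(\tau\), \(\rho\) and \(\eta\) govern the temporal, spatial and spatiotemporal lags of the dependent variable, respectively.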

Relevance:

80.00%

Publisher:

Abstract:

This work aims primarily to provide an analysis of the forms taken by the principle of continuity in administrative law, attempting at the same time to highlight both the foundations that characterize every general principle and the most current nuances emerging from recent doctrine and case law. Starting from the fundamental premise that most interpreters have approached the principle of continuity in the administrative field mainly with reference to the organizational-structural sphere, the analysis is extended to recognize in it a manifestation of key principles of the administrative function as a whole, such as efficiency, sound administration and the achievement of good results. The central relevance of continuity derives from its endless range of applications, but this work insists in particular on the fact that two fundamental interpretations of it can be given, strongly connected and mutually influencing: alongside the one that understands it as a sign of perennial stability, capable of ensuring certainty about the modus operandi of public administrations and the protection of the legitimate expectations they generate, stands a second view that privileges its dynamic aspect, interpreting it as the criterion that requires the public administration to follow a changing reality, evolving together with it, in order to ensure the permanence of a useful result for the community, in keeping with its mission of care. From this perspective, the first part of this work analyses the results already achieved by exegetical elaboration on administrative continuity, with particular reference to its manifestations in the fields of administrative organization and activity, as well as to some of its concrete expressions in the sector of public contracts and public services. 
The second part is instead devoted to offering some suggestions and hypotheses for new, systematic interpretations of the principle, in relation to general concepts such as time, space and the overall design of the administrative function.

Relevance:

80.00%

Publisher:

Abstract:

For years the maritime sector has undergone a continuous and unceasing evolution, in a constant attempt to introduce new technologies to improve the services offered, with the aim of accelerating the speed of trade and reducing the cost of operations. In the effort to provide state-of-the-art assistance within shorter time frames, developments concerning the Bill of Lading have been particularly important. The Bill of Lading (B/L) is the main document used in international maritime transport, serving as a receipt for the goods, a document of title representing the goods, and evidence of the contract of carriage. Since the 1980s, several attempts have been made to develop an electronic bill of lading offering the same functions as the paper one, in order to overcome the drawbacks of the traditional document while guaranteeing the same security and effectiveness. Recently, scientific progress and digital innovation have contributed to the introduction in the shipping sector of a bill of lading based on Blockchain. Although this technology can undoubtedly guarantee multiple advantages, the actual use of a Blockchain Bill of Lading still remains a rapidly evolving prospect.

Relevance:

80.00%

Publisher:

Abstract:

In Italy, public archaeology has accelerated in recent years, thanks to a greater awareness among archaeologists of the need for a dialogue with the public. More and more archaeological research projects invest resources in organizing initiatives aimed at involving the community, both onsite and online. But in order to actually evaluate the impact of these initiatives, a shared methodology needs to be outlined, which is currently lacking in Italy: there is no planning, no design, no critical reflection on the subject. My work, part of the “Archeologia dei paesaggi di Ravenna” project of the Università di Bologna, aims to identify the best communication strategies for engaging the public with historical-archaeological themes, relying in particular on the use of social media, and to identify an effective methodology for the implementation and evaluation of public archaeology initiatives within university research. Given the need for planning, I decided to treat my project as if it were a company with a product or service to bring to market. In that case, the company must draw up a marketing plan, a document that formalizes the analyses and the consequent marketing decisions for a given product/market, over a given span of time and space, in order to achieve the set objectives.

Relevance:

40.00%

Publisher:

Abstract:

Quasars and AGN play an important role in many aspects of modern cosmology. Of particular interest is the interplay between AGN activity and the formation and evolution of galaxies and structures. Studies of nearby galaxies have revealed that most (and possibly all) galaxy nuclei contain a super-massive black hole (SMBH) and that between a third and a half of them show some evidence of activity (Kormendy and Richstone, 1995). The discovery of a tight relation between black hole mass and the velocity dispersion of the host galaxy suggests that the growth of SMBHs and the evolution of their host galaxies are linked together. In this context, studying the evolution of AGN through the luminosity function (LF) is fundamental to constrain theories of galaxy and SMBH formation and evolution. Recently, many theories have been developed to describe the physical processes possibly responsible for a common formation scenario for galaxies and their central black holes (Volonteri et al., 2003; Springel et al., 2005a; Vittorini et al., 2005; Hopkins et al., 2006a), and an increasing number of observations in different bands are focused on collecting larger and larger quasar samples. Many issues, however, are not yet fully understood. In the context of the VVDS (VIMOS-VLT Deep Survey), we collected and studied an unbiased sample of spectroscopically selected faint type-1 AGN with a unique and straightforward selection function. Indeed, the VVDS is a large, purely magnitude-limited spectroscopic survey of faint objects, free of any morphological and/or color preselection. We studied the statistical properties of this sample and its evolution up to redshift z ~ 4. Because of the contamination of the AGN light by their host galaxies at the faint magnitudes explored by our sample, we observed that a significant fraction of the AGN in our sample would be missed by the UV-excess and morphological criteria usually adopted for the preselection of optical QSO candidates. 
If not properly taken into account, this failure to select particular sub-classes of AGN could, in principle, affect some of the conclusions drawn from samples of AGN based on these selection criteria. The absence of any preselection in the VVDS gives us a very complete sample of AGN, including objects with unusual colors and continuum shapes. The VVDS AGN sample in fact shows redder colors than expected from a comparison with, for example, the color track derived from the SDSS composite spectrum. In particular, the faintest objects have on average redder colors than the brightest ones. This can be attributed both to a large fraction of dust-reddened objects and to a significant contamination from the host galaxy. We tested these possibilities by examining the global spectral energy distribution of each object using, in addition to the U, B, V, R and I-band magnitudes, also the UV GALEX and IR Spitzer bands, and fitting it with a combination of AGN and galaxy emission, also allowing for the possibility of extinction of the AGN flux. We found that for 44% of our objects the contamination from the host galaxy is not negligible, and this fraction decreases to 21% if we restrict the analysis to a bright subsample (M_1450 < -22.15). Our estimated integral surface density at I_AB < 24.0 is 500 AGN per square degree, which represents the highest surface density of a spectroscopically confirmed sample of optically selected AGN. We derived the luminosity function in the B band for 1.0 < z < 3.6 using the 1/Vmax estimator. Our data, more than one magnitude fainter than previous optical surveys, allow us to constrain the faint part of the luminosity function up to high redshift. A comparison of our data with the 2dF sample at low redshift (1 < z < 2.1) shows that the VVDS data cannot be well fitted by the pure luminosity evolution (PLE) models derived from previous optically selected samples. 
Qualitatively, this appears to be due to the fact that our data suggest an excess of faint objects at low redshift (1.0 < z < 1.5) with respect to these models. By combining our faint VVDS sample with the large sample of bright AGN extracted from the SDSS DR3 (Richards et al., 2006b) and testing a number of different evolutionary models, we find that the model which best represents the combined luminosity functions, over a wide range of redshift and luminosity, is a luminosity-dependent density evolution (LDDE) model, similar to those derived from the major X-ray surveys. Such a parameterization allows the redshift of the AGN density peak to change as a function of luminosity, thus fitting the excess of faint AGN that we find at 1.0 < z < 1.5. On the basis of this model we find, for the first time from the analysis of optically selected samples, that the peak of the AGN space density shifts significantly towards lower redshift for lower luminosity objects. The position of this peak moves from z ~ 2.0 for M_B < -26.0 to z ~ 0.65 for -22 < M_B < -20. This result, already found in a number of X-ray selected samples of AGN, is consistent with a scenario of “AGN cosmic downsizing”, in which the density of more luminous AGN, possibly associated with more massive black holes, peaks earlier in the history of the Universe (i.e. at higher redshift) than that of low-luminosity AGN, which reaches its maximum later (i.e. at lower redshift). This behavior has long been claimed to be present in elliptical galaxies, and it is not easy to reproduce in the hierarchical cosmogonic scenario, where more massive Dark Matter Halos (DMH) form on average later by merging of less massive halos.
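The 1/Vmax estimator mentioned above can be sketched in a few lines of code. This is a minimal illustration on synthetic numbers, not the thesis's actual pipeline: the magnitudes, maximum volumes and binning below are invented placeholders.

```python
import numpy as np

def lf_vmax(mags, vmax, bins):
    """Binned luminosity function via the 1/Vmax estimator:
    in each absolute-magnitude bin, sum 1/Vmax over the objects
    falling in the bin and divide by the bin width."""
    phi = np.zeros(len(bins) - 1)
    width = np.diff(bins)
    for i in range(len(bins) - 1):
        sel = (mags >= bins[i]) & (mags < bins[i + 1])
        phi[i] = np.sum(1.0 / vmax[sel]) / width[i]
    return phi

# Synthetic sample (placeholders, not survey data)
rng = np.random.default_rng(0)
mags = rng.uniform(-26.0, -20.0, 500)   # absolute B magnitudes
vmax = rng.uniform(1e6, 1e8, 500)       # maximum visibility volumes, Mpc^3
bins = np.arange(-26.0, -19.5, 0.5)
phi = lf_vmax(mags, vmax, bins)         # LF estimate per magnitude bin
```

In a real application each Vmax is the comoving volume within which object i would still satisfy the survey's flux limit, so fainter objects, visible over smaller volumes, receive larger weights.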

Relevance:

40.00%

Publisher:

Abstract:

This work provides a step forward in the study and comprehension of the relationships between stochastic processes and a certain class of integro-partial differential equations, which can be used to model anomalous diffusion and transport in statistical physics. In the first part, we take the reader through the fundamental notions of probability and stochastic processes, stochastic integration and stochastic differential equations. In particular, within the study of H-sssi processes, we focus on fractional Brownian motion (fBm) and its discrete-time increment process, fractional Gaussian noise (fGn), which provide examples of non-Markovian Gaussian processes. The fGn, together with stationary FARIMA processes, is widely used in the modeling and estimation of long memory, or long-range dependence (LRD). Time series manifesting long-range dependence are often observed in nature, especially in physics, meteorology and climatology, but also in hydrology, geophysics, economics and many other fields. We study LRD in depth, giving many real-data examples, providing statistical analysis and introducing parametric methods of estimation. We then introduce the theory of fractional integrals and derivatives, which indeed turns out to be very appropriate for studying and modeling systems with long-memory properties. After introducing the basic concepts, we provide many examples and applications. For instance, we investigate the relaxation equation with distributed-order time-fractional derivatives, which describes models characterized by a strong memory component and can be used to model relaxation in complex systems deviating from the classical exponential Debye pattern. We then focus on the study of generalizations of the standard diffusion equation, passing through the preliminary study of the fractional forward drift equation. Such generalizations are obtained by using fractional integrals and derivatives of distributed orders. 
In order to find a connection between the anomalous diffusion described by these equations and long-range dependence, we introduce and study the generalized grey Brownian motion (ggBm), which is actually a parametric class of H-sssi processes whose marginal probability density functions evolve in time according to a partial integro-differential equation of fractional type. The ggBm is, of course, non-Markovian. Throughout the work, we remark many times that, starting from a master equation for a probability density function f(x,t), it is always possible to define an equivalence class of stochastic processes with the same marginal density function f(x,t). All these processes provide suitable stochastic models for the starting equation. In studying the ggBm, we focus on a subclass made up of processes with stationary increments. The ggBm is defined canonically in the so-called grey noise space; however, we are able to provide a characterization independent of the underlying probability space. We also point out that the generalized grey Brownian motion is a direct generalization of a Gaussian process; in particular, it generalizes both Brownian motion and fractional Brownian motion. Finally, we introduce and analyze a more general class of diffusion-type equations related to certain non-Markovian stochastic processes. We start from the forward drift equation, which is made non-local in time by the introduction of a suitably chosen memory kernel K(t). The resulting non-Markovian equation is interpreted in a natural way as the evolution equation of the marginal density function of a random time process l(t). We then consider the subordinated process Y(t) = X(l(t)), where X(t) is a Markovian diffusion. The corresponding time evolution of the marginal density function of Y(t) is governed by a non-Markovian Fokker-Planck equation which involves the same memory kernel K(t). 
We develop several applications and derive the exact solutions. Moreover, we consider different stochastic models for the given equations, providing path simulations.
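Path simulations of fBm of the kind mentioned above can be produced with an exact (if costly) method: Cholesky factorization of the fGn covariance. This is our own minimal sketch, not the thesis's code; the stationary autocovariance of unit-variance fGn with Hurst index H is gamma(k) = ((|k+1|^{2H} - 2|k|^{2H} + |k-1|^{2H}) / 2, and fBm is the cumulative sum of the fGn.

```python
import numpy as np

def fgn_cov(n, H):
    """n x n covariance matrix of unit-variance fGn with Hurst index H."""
    k = np.arange(n)
    gamma = 0.5 * (np.abs(k + 1) ** (2 * H)
                   - 2 * np.abs(k) ** (2 * H)
                   + np.abs(k - 1) ** (2 * H))
    # Stationarity gives a Toeplitz matrix: entry (i, j) is gamma(|i - j|)
    return gamma[np.abs(np.subtract.outer(k, k))]

def simulate_fbm(n, H, rng):
    """Sample one fBm path on the integer grid 1..n via Cholesky of the
    fGn covariance (tiny jitter added for numerical stability)."""
    L = np.linalg.cholesky(fgn_cov(n, H) + 1e-10 * np.eye(n))
    fgn = L @ rng.standard_normal(n)   # correlated Gaussian increments
    return np.cumsum(fgn)              # fBm = cumulative sum of fGn

path = simulate_fbm(512, H=0.8, rng=np.random.default_rng(0))
```

For H = 0.5 the covariance reduces to the identity and the path is an ordinary random walk; H > 0.5 gives the persistent, long-memory behavior discussed in the abstract. The Cholesky method is O(n^3); circulant-embedding methods are the usual faster alternative for long paths.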

Relevance:

40.00%

Publisher:

Abstract:

A full set of geochemical and Sr, Nd and Pb isotope data, on both bulk-rock and mineral samples, is provided for volcanic rocks representative of the whole stratigraphic succession of Lipari Island in the Aeolian archipelago. These data, together with petrographic observations and melt/fluid inclusion investigations from the literature, outline the petrogenesis and evolution of magmas through the magmatic and eruptive history of Lipari. This history is the result of nine successive Eruptive Epochs developing between 271 ka and historical times, as derived from the most recent volcanological and stratigraphic studies, combined with available radiometric ages and the correlation of tephra layers and marine terrace deposits. These Eruptive Epochs are characterized by distinctive vents partly overlapping in space and time, mostly under the control of the main regional tectonic trends (NNW-SSE, N-S and minor E-W). A large variety of lava flows, scoriaceous deposits, lava domes, coulees and pyroclastics were emplaced, ranging in composition through time from calcalkaline (CA) and high-K calcalkaline (HKCA) basaltic andesites to rhyolites. CA and HKCA basaltic andesitic to dacitic magmas were erupted between 271 and 81 ka (Eruptive Epochs 1-6) from volcanic edifices located along the western coast of the island (and subordinately the eastern Monterosa) and from the M. Chirica and M. S. Angelo stratocones. These mafic to intermediate magmas mainly evolved through AFC and RAFC processes, involving fractionation of mafic phases, assimilation of wall rocks and mixing with newly injected mafic magmas. Following a 40 ka-long period of volcanic quiescence, rhyolitic magmas were erupted from vents located in the southern and north-eastern sectors of Lipari between 40 ka and historical times (Eruptive Epochs 7-9). They are suggested to derive from the previous mafic to intermediate melts through AFC processes. 
During the early phases of rhyolitic magmatism (Eruptive Epochs 7-8), enclaves-rich rocks and banded pumices, ranging in composition from HKCA dacites to low-SiO2 rhyolites were erupted, representing the products of magma mixing between fresh mafic magmas and the fractionated rhyolitic melts. The interaction of mantle-derived magmas with the crust represents an essential process during the whole magmatic hystory of Lipari, and is responsible for the wide range of observed geochemical and isotopic variations. The crustal contribution was particularly important during the intermediate phases of activity of Lipari when the cordierite-bearing lavas were erupted from the M. S.Angelo volcano (Eruptive Epoch 5, 105 ka). These lavas are interpreted as the result of mixing and subsequent hybridization of mantle-derived magmas, akin to the ones characterizing the older phases of activity of Lipari (Eruptive Epochs 1-4), and crustal anatectic melts derived from dehydration-melting reactions of metapelites in the lower crust. A comparison between the adjacent islands of Lipari and Vulcano outlines that their mafic to intermediate magmas seem to be genetically connected and derive from a similar mantle source affected by different degrees of partial melting (and variable extent of crustal assimilation) producing either the CA magmas of Lipari (higher degrees) or the HKCA to SHO magmas of Vulcano (lower degrees). On a regional scale, the most primitive rocks (SiO2<56%, MgO>3.5%) of Lipari, Vulcano, Salina and Filicudi are suggested to derive from a similar MORB-like source, variably metasomatized by aqueous fluids coming from the slab and subordinately by the additions of sediments.

Relevance:

40.00%

Publisher:

Abstract:

Environmental computer models are deterministic models designed to predict environmental phenomena such as air pollution or meteorological events. Numerical model output is given in terms of averages over grid cells, usually at high spatial and temporal resolution. However, these outputs are often biased, with unknown calibration, and carry no information about the associated uncertainty. Conversely, data collected at monitoring stations are more accurate, since they essentially provide the true levels. Given the leading role played by numerical models, it is now important to compare model output with observations. Statistical methods developed to combine numerical model output and station data are usually referred to as data fusion. In this work, we first combine ozone monitoring data with ozone predictions from the Eta-CMAQ air quality model in order to forecast in real time the current 8-hour average ozone level, defined as the average of the previous four hours, the current hour, and the predictions for the next three hours. We propose a Bayesian downscaler model based on first differences, with a flexible coefficient structure and an efficient computational strategy to fit the model parameters. Model validation for the eastern United States shows a substantial improvement of our fully inferential approach over the current real-time forecasting system. Furthermore, we consider the introduction of temperature data from a weather forecast model into the downscaler, showing improved real-time ozone predictions. Finally, we introduce a hierarchical model to obtain spatially varying uncertainty associated with numerical model output. We show how such uncertainty can be learned through suitable stochastic data fusion modeling using external validation data. We illustrate our Bayesian model by providing the uncertainty map associated with a temperature output over the northeastern United States.
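The definition of the real-time 8-hour average used above (five observed hourly values plus three forecast ones) can be sketched as a short Python function; the function name and the sample values are illustrative, not from the thesis:

```python
def current_8h_average(observed, forecast):
    """Real-time 8-hour average ozone: the previous four hours and the
    current hour come from station observations, the next three hours
    from the numerical model forecast, as defined in the abstract."""
    assert len(observed) >= 5 and len(forecast) >= 3
    window = list(observed[-5:]) + list(forecast[:3])  # 5 obs + 3 forecast = 8 values
    return sum(window) / 8.0

# hypothetical hourly ozone values in ppb
obs = [40, 42, 45, 50, 48]    # previous four hours + current hour
fcst = [47, 44, 40]           # model predictions for the next three hours
print(current_8h_average(obs, fcst))  # -> 44.5
```

The downscaler itself then corrects the biased model inputs entering this window, which is what the Bayesian first-differences formulation addresses.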

Relevance:

40.00%

Publisher:

Abstract:

In this thesis, I have investigated the evolution of the high-redshift (z > 3) AGN population by collecting data from some of the major Chandra and XMM-Newton surveys. The final sample (141 sources) is one of the largest selected at z > 3 in the X-rays, and it is characterised by a very high redshift completeness (98%). I derived the spectral slopes and obscurations through a spectral analysis, and I assessed the high-z evolution by deriving the luminosity function and the number counts of the sample. The best representation of the AGN evolution is a pure density evolution (PDE) model: the AGN space density is found to decrease by a factor of 10 from z = 3 to z = 5. I also found that about 50% of AGN are obscured by large column densities (logNH > 23). By comparing these data with those in the Local Universe, I found a positive evolution of the obscured AGN fraction with redshift, especially for luminous (logLx > 44) AGN. I also studied the gas content of z < 1 AGN-hosting galaxies and compared it with that of inactive galaxies. For the first time, I applied to AGN a method to derive the gas mass previously used only for inactive galaxies. AGN are found to live preferentially in gas-rich galaxies. This result, on the one hand, can help us understand the AGN triggering mechanisms; on the other hand, it explains why AGN are preferentially hosted by star-forming galaxies.
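The pure density evolution result can be illustrated numerically: in a PDE model the luminosity function keeps its shape and only its normalization declines with redshift. The log-linear parametrization and the decline rate below are assumptions chosen to reproduce the stated factor-of-10 drop, not the thesis's fitted luminosity function:

```python
# Illustrative pure density evolution (PDE): the space density declines
# by a constant number of dex per unit redshift. A factor-of-10 drop
# between z = 3 and z = 5 corresponds to k = 0.5 dex per unit z.
def pde_density(phi_z3, z, k=0.5):
    """Space density at redshift z >= 3, given the density at z = 3."""
    return phi_z3 * 10.0 ** (-k * (z - 3.0))

phi3 = 1e-6                    # hypothetical space density at z = 3 (Mpc^-3)
print(pde_density(phi3, 5.0))  # a factor of 10 below phi3
```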

Relevance:

40.00%

Publisher:

Abstract:

Slot and van Emde Boas' Invariance Thesis states that a time (respectively, space) cost model is reasonable for a computational model C if there are mutual simulations between Turing machines and C whose overhead is polynomial in time (respectively, linear in space). The rationale is that, under the Invariance Thesis, complexity classes such as LOGSPACE, P and PSPACE become robust, i.e. machine-independent. In this dissertation, we want to find out whether it is possible to define a reasonable space cost model for the lambda-calculus, the paradigmatic model for functional programming languages. We start by considering an unusual evaluation mechanism for the lambda-calculus, based on Girard's Geometry of Interaction, which was conjectured to be the key ingredient for obtaining a space-reasonable cost model. Through a fine-grained complexity analysis of this scheme, based on new variants of non-idempotent intersection types, we disprove this conjecture. Then, we change the target of our analysis. We consider a variant of Krivine's abstract machine, a standard evaluation mechanism for the call-by-name lambda-calculus, optimized for space complexity and implemented without any pointers. A fine-grained analysis of the execution of (a refined version of) the encoding of Turing machines into the lambda-calculus allows us to conclude that the space consumed by this machine is indeed a reasonable space cost model. In particular, for the first time we are able to measure sub-linear space complexities as well. Moreover, we transfer this result to the call-by-value case. Finally, we also provide an intersection type system that compositionally characterizes this new reasonable space measure. This is done through a minimal, yet non-trivial, modification of the original de Carvalho type system.
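To give a concrete picture of the kind of machine analysed here, a textbook Krivine machine for the call-by-name lambda-calculus can be sketched in a few lines of Python. This is the standard unoptimized machine (closures and environments as heap-allocated tuples), not the pointer-free, space-optimized variant studied in the dissertation; the term encoding via de Bruijn indices is a common convention, and the names are illustrative:

```python
# Minimal Krivine abstract machine (call-by-name lambda-calculus).
# Terms use de Bruijn indices: ("var", n), ("lam", body), ("app", f, a).
def krivine(term):
    # State: (term, environment, stack). The environment is a list of
    # closures (term, env); the stack holds argument closures.
    env, stack = [], []
    while True:
        tag = term[0]
        if tag == "app":              # push the argument as a closure
            stack.append((term[2], env))
            term = term[1]
        elif tag == "lam":
            if not stack:             # weak head normal form reached
                return term, env
            clo = stack.pop()         # bind the top of the stack
            env = [clo] + env
            term = term[1]
        else:                         # "var": look up a de Bruijn index
            term, env = env[term[1]]

# (\x. x) (\y. y) evaluates to \y. y
identity = ("lam", ("var", 0))
result, _ = krivine(("app", identity, identity))
```

The space issue the dissertation tackles is visible even in this sketch: environments and the stack are linked structures of machine pointers, so naive space usage is at least linear in the number of steps' bindings, which is why a pointer-free implementation is needed to capture sub-linear space.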