14 results for Superlinear and Semi–Superlinear Convergence
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
China is a large country characterized by remarkable growth and distinct regional diversity. Spatial disparity has long been a pressing issue, since China has been striving to follow a balanced growth path while still confronting unprecedented pressures and challenges. To better understand the level of inequality across the spatial distribution of Chinese provinces and municipalities and to estimate the dynamic trajectory of sustainable development in China, I constructed the Composite Index of Regional Development (CIRD) with five sub-pillars/dimensions: the Macroeconomic Index (MEI), Science and Innovation Index (SCI), Environmental Sustainability Index (ESI), Human Capital Index (HCI) and Public Facilities Index (PFI), endeavoring to cover the various fields of regional socioeconomic development. Ranking reports on the five sub-dimensions and on the aggregated CIRD are provided in order to better measure the developmental degree of 31 (or 30) Chinese provinces and municipalities over the 13 years from 1998 to 2010, a time interval spanning three “Five-Year Plans”. Further empirical applications of the CIRD focus on clustering and convergence estimation, attempting to fill the gap in quantifying the developmental levels of regional comprehensive socioeconomics and in estimating the dynamic convergence trajectory of regional sustainable development in the long run. Four geographically oriented clusters were benchmarked on the map on the basis of cluster analysis, and club convergence among the Chinese provinces and municipalities was observed based on stochastic kernel density estimation.
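The club-convergence finding rests on stochastic kernel (conditional density) estimation of the cross-province distribution dynamics. What follows is a minimal, hedged sketch of that generic technique, not the thesis's actual code: it assumes index values relative to the national mean at the start and end of the period, and the synthetic data, bandwidths and grid choices are purely illustrative.

```python
import numpy as np
from scipy.stats import gaussian_kde

def stochastic_kernel(x_t, x_ts, grid_size=100):
    """Conditional density f(x_{t+s} | x_t) estimated with Gaussian kernels,
    as commonly used in distribution-dynamics convergence studies.

    x_t, x_ts : index values (e.g. CIRD relative to the national mean)
                for the same regions at the start and end of the period.
    Returns the evaluation grid and a matrix whose rows are the densities
    conditional on each starting value x_t.
    """
    x_t = np.asarray(x_t, dtype=float)
    x_ts = np.asarray(x_ts, dtype=float)
    joint = gaussian_kde(np.vstack([x_t, x_ts]))     # joint density at (t, t+s)
    marginal = gaussian_kde(x_t)                     # marginal density at t
    grid = np.linspace(min(x_t.min(), x_ts.min()),
                       max(x_t.max(), x_ts.max()), grid_size)
    gx, gy = np.meshgrid(grid, grid, indexing="ij")
    f_joint = joint(np.vstack([gx.ravel(), gy.ravel()])).reshape(grid_size, grid_size)
    f_cond = f_joint / marginal(grid)[:, None]       # condition on the row value x_t
    return grid, f_cond

# Illustrative example with two synthetic "convergence clubs"
rng = np.random.default_rng(0)
x0 = np.concatenate([rng.normal(0.7, 0.1, 15), rng.normal(1.4, 0.1, 15)])
x1 = x0 + rng.normal(0, 0.05, 30)        # strong persistence around two modes
grid, kernel = stochastic_kernel(x0, x1)
print(kernel.shape)   # (100, 100); mass concentrated along the diagonal around
                      # two modes is the signature of club convergence
```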
Abstract:
This thesis focuses on two aspects of European economic integration: exchange rate stabilization between non-euro countries and the Euro Area, and real and nominal convergence of Central and Eastern European countries. Each chapter covers these aspects from both a theoretical and an empirical perspective. Chapter 1 investigates whether the introduction of the euro was accompanied by a shift in the de facto exchange rate policy of European countries outside the euro area, using methods recently developed in the literature to detect "Fear of Floating" episodes. I find that European inflation targeters have tried to stabilize the euro exchange rate after its introduction; fixed exchange rate arrangements, by contrast, apart from official policy changes, remained stable. Finally, the euro seems to have gained a relevant role as a reference currency even outside Europe. Chapter 2 proposes an approach to estimating central bank preferences starting from the central bank's optimization problem within a small open economy, using Sweden as a case study, to determine whether stabilization of the exchange rate played a role in the monetary policy rule of the Riksbank. The results show that it did not influence interest rate setting; exchange rate stabilization probably occurred as a result of increased economic integration and business cycle convergence. Chapter 3 studies the interactions between wages in the public sector, the traded private sector and the closed sector in ten EU transition countries. The theoretical literature on wage spillovers suggests that the traded sector should be the leader in wage setting, with non-traded sector wages adjusting. We show that there is large heterogeneity across countries, and that sheltered and public sector wages are often the leaders in wage determination. This result is relevant from a policy perspective, since wage spillovers that lead to costs growing faster than productivity may affect the international cost competitiveness of the traded sector.
Abstract:
The fall of the Berlin Wall opened the way for a reform path – the transition process – which led ten former socialist countries in Central and South Eastern Europe to knock at the EU's door. However, at the time of EU accession several economic and structural weaknesses remained. A tendency towards convergence between the new Member States (NMS) and the EU average income level emerged, together with a spread of inequality at the sub-regional level, mainly driven by the backwardness of agricultural and rural areas. Considerable progress has been made in evaluating policies for rural areas, but a shared definition of rurality is still missing. Numerous indicators have been calculated for assessing the effectiveness of the Common Agricultural Policy and the Rural Development Policy. Previous analyses of the Central and Eastern European countries found that the characteristics of the most backward areas were insufficiently addressed by the policies enacted; low data availability and accountability at the sub-regional level, and deficiencies in institutional planning and implementation, represented an obstacle to targeting policies and payments. The following pages aim to provide a basis for understanding the connections between the peculiarities of the transition process, the current development performance of the NMS and the EU's role, with particular attention to agricultural and rural areas. Applying a mixed methodological approach (multivariate statistics, non-parametric methods, spatial econometrics), this study contributes to the identification of rural areas and to the analysis of the changes that occurred during EU membership in Hungary, assessing the effect of the introduction of the CAP and its contribution to the convergence of Hungarian agricultural and rural areas. The author believes that more targeted – and therefore more efficient – policies for agricultural and rural areas require a deeper knowledge of their structural and dynamic characteristics.
Abstract:
The need for a convergence between semi-structured data management and Information Retrieval techniques is manifest to the scientific community. To meet this growing demand, the W3C has recently proposed XQuery Full Text, an IR-oriented extension of XQuery. However, query optimization requires the study of important properties such as query equivalence and containment; to this aim, a formal representation of documents and queries is needed. The goal of this thesis is to establish such a formal background. We define a data model for XML documents and propose an algebra able to represent most XQuery Full Text expressions. We show how an XQuery Full Text expression can be translated into an algebraic expression and how an algebraic expression can be optimized.
Abstract:
In this work a multidisciplinary study of the December 26, 2004 Sumatra earthquake has been carried out. We investigated both the effect of the earthquake on the Earth's rotation and the stress field variations associated with the seismic event. In the first part of the work we quantified the effects of the water mass redistribution associated with the propagation of a tsunami wave on the Earth's pole path and on the length of day (LOD), and applied our modeling results to the tsunami following the 2004 giant Sumatra earthquake. We compared the results of our simulations of the variations of the instantaneous rotation axis with some preliminary instrumental evidence of a pole path perturbation (not yet confirmed) registered just after the occurrence of the earthquake, which showed a step-like discontinuity that cannot be attributed to the effect of a seismic dislocation. Our results show that the perturbation induced by the tsunami on the instantaneous rotation pole is characterized by a step-like discontinuity, compatible with the observations, but its magnitude turns out to be almost one hundred times smaller than the detected one. The LOD variation induced by the water mass redistribution turns out not to be significant, because the total effect is smaller than current measurement uncertainties. In the second part of this thesis we modeled the coseismic and postseismic stress evolution following the Sumatra earthquake. By means of a semi-analytical, viscoelastic, spherical model of global postseismic deformation and a numerical finite-element approach, we performed an analysis of the stress diffusion following the earthquake in the near and far field of the mainshock source. We evaluated the stress changes due to the Sumatra earthquake by projecting the Coulomb stress onto the sequence of aftershocks taken from various catalogues in a time window spanning about two years, and finally analyzed the spatio-temporal pattern. The analysis performed with the semi-analytical and the finite-element modeling gives a complex picture of the stress diffusion in the study area after the Sumatra earthquake. We believe that the results obtained with the analytical method suffer heavily from the restrictions imposed on the hypocentral depths of the aftershocks in order to obtain convergence of the harmonic series of the stress components. By contrast, we imposed no such constraints in the numerical method, so we expect its results to give a more realistic description of the stress variation pattern.
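The Coulomb stress projection mentioned above follows the standard Coulomb failure stress change relation, ΔCFS = Δτ + μ′Δσn. Below is a minimal, hedged sketch of that generic calculation, assuming the shear and normal stress changes already resolved on each receiver (aftershock) plane are available; the numbers and the effective friction coefficient are illustrative and not taken from the thesis.

```python
import numpy as np

def coulomb_stress_change(delta_tau, delta_sigma_n, mu_eff=0.4):
    """Coulomb failure stress change: dCFS = d_tau + mu' * d_sigma_n.

    delta_tau     : shear stress change resolved on the receiver plane (Pa),
                    positive in the slip direction.
    delta_sigma_n : normal stress change on the plane (Pa),
                    positive for unclamping.
    mu_eff        : effective friction coefficient (commonly 0.2-0.8).
    """
    return np.asarray(delta_tau) + mu_eff * np.asarray(delta_sigma_n)

# Illustrative stress changes resolved on three aftershock mechanisms
delta_tau = np.array([0.05e6, -0.02e6, 0.10e6])      # Pa
delta_sigma_n = np.array([0.01e6, 0.03e6, -0.04e6])  # Pa
dcfs = coulomb_stress_change(delta_tau, delta_sigma_n)
promoted = dcfs > 0   # events consistent with positive static stress transfer
print(dcfs, promoted)
```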
Abstract:
Interactive theorem provers (ITPs for short) are tools whose final aim is to certify proofs written by human beings. To reach that objective they have to fill the gap between the high-level language used by humans for communicating and reasoning about mathematics and the lower-level language that a machine is able to “understand” and process. The user perceives this gap in terms of missing features or inefficiencies. The developer tries to accommodate user requests without increasing the already high complexity of these applications. We believe that satisfactory solutions can only come from a strong synergy between users and developers. We devoted most of our PhD to designing and developing the Matita interactive theorem prover. The software was born in the Computer Science Department of the University of Bologna as the result of composing together all the technologies developed by the HELM team (to which we belong) for the MoWGLI project. The MoWGLI project aimed at giving web accessibility to the libraries of formalised mathematics of various interactive theorem provers, taking Coq as the main test case. The motivations for giving life to a new ITP are: • to study the architecture of these tools, with the aim of understanding the source of their complexity; • to exploit such knowledge to experiment with new solutions that, for backward compatibility reasons, would be hard (if not impossible) to test on a widely used system like Coq. Matita is based on the Curry-Howard isomorphism, adopting the Calculus of Inductive Constructions (CIC) as its logical foundation. Proof objects are thus, to some extent, compatible with the ones produced with the Coq ITP, which is itself able to import and process the ones generated using Matita. Although the systems have a lot in common, they share no code at all, and even most of the algorithmic solutions are different. The thesis is composed of two parts, in which we respectively describe our experience as a user and as a developer of interactive provers. In particular, the first part is based on two different formalisation experiences: • our internship with the Mathematical Components team (INRIA), which is formalising the finite group theory required to attack the Feit-Thompson Theorem. To tackle this result, giving an effective classification of finite groups of odd order, the team adopts the SSReflect Coq extension, developed by Georges Gonthier for the proof of the Four Colour Theorem; • our collaboration on the D.A.M.A. project, whose goal is the formalisation of abstract measure theory in Matita, leading to a constructive proof of Lebesgue's Dominated Convergence Theorem. The most notable issues we faced, analysed in this part of the thesis, are the following: the difficulties arising when using “black box” automation in large formalisations; the impossibility for a user (especially a newcomer) to master the context of a library of already formalised results; the uncomfortable big-step execution of proof commands historically adopted in ITPs; the difficult encoding of mathematical structures with a notion of inheritance in a type theory without subtyping like CIC.
In the second part of the manuscript many of these issues are analysed through the lens of an ITP developer, describing the solutions we adopted in the implementation of Matita to solve these problems: integrated searching facilities to assist the user in handling large libraries of formalised results; a small-step execution semantics for proof commands; a flexible implementation of coercive subtyping allowing multiple inheritance with shared substructures; and automatic tactics, integrated with the searching facilities, that generate proof commands (and not only proof objects, usually kept hidden from the user), one of which is specifically designed to be user-driven.
Abstract:
In the last decades the development of bone substitutes characterized by superior biomimetism has become of particular interest, owing to the increasing economic and societal impact of bone diseases. In the present research work the development of bone substitutes with improved biomimetism has been addressed from a chemical, structural and morphological perspective. From a chemical point of view, the synthesis of hydroxyapatite powders exhibiting multiple ionic substitutions in both cationic and anionic sites has been developed, so as to simulate the chemical composition of natural bone. Particular emphasis has been given to the effect of silicon on the chemical-physical and solubility properties of the obtained hydroxyapatites. From a structural point of view, the synthesis of ceramic composite materials based on hydroxyapatite and calcium silicates has been developed, the silicates being employed both as a reinforcing phase, to raise the mechanical strength of the composite compared to hydroxyapatite, and as a bioactive phase, able to increase the bioactivity of the whole ceramic. Finally, the unique morphological features of bone were mimicked by taking inspiration from Nature: native wood structures were treated chemically and thermally to obtain porous hydroxyapatite materials characterized by the same morphology as the native wood. The results obtained in the present work were positive in all three areas of investigation, thus covering the three aspects of biomimetism: chemical, structural and morphological. However, only at the convergence of the three different fields is it possible to find the best solutions for developing the ideal bone-like scaffold. Thus, future activity should be devoted to solving the problems at the borderline between the different research lines, which hamper this convergence and, in consequence, the achievement of a bone scaffold able to mimic the various aspects exhibited by bone tissue.
Abstract:
Porous materials are widely used in many fields of industrial application to meet the noise reduction requirements that nowadays derive from strict regulations. The modeling of porous materials is still a problematic issue. Numerical simulations are often problematic in the case of real, complex geometries, especially in terms of computational time and convergence. At the same time, analytical models, even if partly limited by restrictive simplifying hypotheses, represent a powerful instrument for quickly capturing the physics of the problem and its general trends. In this context, a recently developed numerical method, called the Cell Method, is described, implemented in the case of Biot's theory of poroelasticity, and applied to representative cases. The peculiarity of the Cell Method is that it allows a direct algebraic and geometrical discretization of the field equations, without any reduction to a weak integral form. The second part of the thesis then presents the case of the interaction between two poroelastic materials in the context of double porosity. The idea of using periodically repeated inclusions of a second porous material inside a layer of a host porous material is described. In particular, the problem is addressed by considering the efficiency of the analytical method. An analytical procedure for the simulation of heterogeneous layers is described and validated considering both absorption and transmission conditions, and a comparison with the available numerical methods is performed.
Abstract:
The Northern Apennines (NA) chain is the expression of the active plate margin between Europe and Adria. Given the low convergence rates and the moderate seismic activity, ambiguities remain in defining a seismotectonic framework, and many different scenarios have been proposed for the evolution of the mountain front. Unlike older models, which depict the mountain front as an active thrust at the surface, a recently proposed scenario describes it as the frontal limb of a long-wavelength fold (> 150 km) formed by a thrust fault tipped at around 17 km depth and considered the active subduction boundary. East of Bologna, this frontal limb is remarkably straight and its surface is riddled with small but pervasive high-angle normal faults. West of Bologna, however, some recesses are visible along strike of the mountain front: these perturbations seem due to the presence of shorter-wavelength (15 to 25 km along strike) structures showing both NE and NW vergence. The Pleistocene activity of these structures has already been suggested, but no quantitative reconstructions are available in the literature. This research investigates the tectonic geomorphology of the NA mountain front with the specific aim of quantifying active deformation and inferring possible deep causes of both the short- and long-wavelength structures. This study documents the presence of a network of active extensional faults in the foothills south and east of Bologna. For these structures, the strain rate has been measured to find a constant throw-to-length relationship, and the slip rates have been compared with measured rates of erosion. Fluvial geomorphology and quantitative analysis of the topography document in detail the active tectonics of two growing domal structures (the Castelvetro-Vignola foothills and the Ghiardo plateau) embedded in the mountain front west of Bologna. Here, tilting and river incision rates (interpreted as long-term uplift rates) have been measured at the mountain front and in the Enza and Panaro valleys, respectively, using a well-defined stratigraphy of Pleistocene to Holocene river terraces and alluvial fan deposits as growth strata, together with seismic reflection profiles. The geometry and uplift rates of the anticlines constrain a simple trishear fault-propagation folding model that inverts for blind thrust ramp depth, dip, and slip. Topographic swath profiles and the steepness index of the river longitudinal profiles that traverse the anticlines are consistent with the stratigraphy, structures, aquifer geometry, and seismic reflection profiles. Available focal mechanisms of earthquakes with magnitudes between Mw 4.1 and 5.4, obtained from a dataset of the instrumental seismicity for the last 30 years, show a clear vertical separation at around 15 km between shallow extensional and deeper compressional hypocenters along the mountain front and adjacent foothills. In summary, the studied anticlines appear to grow at rates slower than the growth rate of the longer-wavelength structure that defines the mountain front of the NA. The domal structures show evidence of NW-verging deformation and reactivation of older (late Neogene) thrusts. The reconstructed river incision rates, together with rates from several other rivers along a 250 km wide stretch of the NA mountain front recently made available in the literature, all indicate a general increase from the Middle to the Late Pleistocene.
This suggests a focusing of deformation along a deep structure, as confirmed by the deep compressional seismicity. The maximum rate is, however, not constant along the mountain front, but varies from 0.2 mm/yr in the west to more than 2.2 mm/yr in the eastern sector, suggesting a similar (eastward-increasing) trend of the Apenninic subduction.
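The channel steepness index mentioned above is normally derived from the stream-power slope-area relation S = k_sn A^(-θ_ref). Here is a minimal, hedged sketch of that generic calculation, not the author's actual workflow; the reference concavity and the synthetic profile are illustrative.

```python
import numpy as np

def normalized_steepness(slope, drainage_area, theta_ref=0.45):
    """Normalized channel steepness index k_sn at each channel point,
    from the stream-power relation S = k_sn * A**(-theta_ref).

    slope         : local channel gradient (dimensionless)
    drainage_area : upstream drainage area (m^2)
    theta_ref     : reference concavity (0.45 is a common choice)
    """
    slope = np.asarray(slope, dtype=float)
    drainage_area = np.asarray(drainage_area, dtype=float)
    return slope * drainage_area**theta_ref

# Synthetic slope-area data for one river profile with k_sn = 80 (A in m^2)
area = np.logspace(6, 9, 50)          # m^2
slope = 80.0 * area**-0.45
ksn = normalized_steepness(slope, area)
print(ksn.mean())                      # ~80 for this synthetic profile
```

Along-strike contrasts in mean k_sn between river profiles are what make the index useful as a relative proxy for differential rock uplift across the growing structures.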
Abstract:
The aim of this work was to show that refined analyses of background, low-magnitude seismicity make it possible to delineate the main active faults and to accurately estimate the directions of the regional tectonic stress that characterize the Southern Apennines (Italy), a structurally complex area with high seismic potential. Thanks to the presence in the area of a dense, wide-dynamic-range integrated network, it was possible to analyze a high-quality microearthquake dataset consisting of 1312 events that occurred from August 2005 to April 2011, by integrating the data recorded at 42 seismic stations of various networks. The refined seismicity locations and focal mechanisms clearly delineate a system of NW-SE striking normal faults along the Apenninic chain and an approximately E-W oriented strike-slip fault transversely cutting the belt. The seismicity along the chain does not occur on a single fault but in a volume, delimited by the faults activated during the 1980 Irpinia M 6.9 earthquake, on sub-parallel, predominantly normal faults. The results show that the recent low-magnitude earthquakes belong to the background seismicity and are likely generated along the major fault segments activated during the most recent earthquakes, suggesting that these segments are still active today, thirty years after the mainshock. In this sense, this study gives a new perspective on the use of high-quality records of low-magnitude background seismicity for the identification and characterization of active fault systems. The stress tensor inversion provides two equivalent models to explain the microearthquake generation along both the NW-SE striking normal faults and the E-W oriented fault with dominant dextral strike-slip motion, but with different geological interpretations. We suggest that the NW-SE-striking Africa-Eurasia convergence acts in the background of all these structures, playing a primary and unifying role in the seismotectonics of the whole region.
Abstract:
This collection of essays examines various aspects of regional development and issues of internationalization. The first essay investigates the implications of the impressive growth of China from a rural-urban perspective and addresses the topic of convergence in China by employing a non-parametric approach to study the distribution dynamics of per capita income at the province, rural and urban levels. To better understand the degree of inequality characterizing China and the long-term predictions of convergence or divergence of its different territorial aggregations, the second essay formulates a composite indicator of Regional Development (RDI) to benchmark development at the province and sub-province level. The RDI goes beyond the uni-dimensional concept of development, generally proxied by GDP per capita, and gives attention to the rural-urban dimension. The third essay, “Internationalization and Trade Specialization in Italy. The role of China in the international intra-firm trade of the Italian regions”, deals with another aspect of regional economic development: the progressive de-industrialisation and de-localization of local production. This essay looks at the trade specialization of selected Italian regions (those specialized in manufacturing) and the fragmentation of local production on a global scale. In this context China is an important stakeholder, and the paper documents the country's importance in the regional intra-firm trade.
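Composite development indicators of the kind the second essay describes are typically assembled by normalizing each sub-dimension and aggregating with weights. The sketch below is a generic, hedged illustration of that construction (min-max normalization plus weighted averaging); the dimensions, weights and data are hypothetical and not the RDI's actual specification.

```python
import numpy as np

def min_max_normalize(x):
    """Rescale a raw sub-indicator to [0, 1] across regions."""
    x = np.asarray(x, dtype=float)
    return (x - x.min()) / (x.max() - x.min())

def composite_index(sub_indicators, weights=None):
    """Weighted average of normalized sub-indicators, one value per region."""
    norm = np.column_stack([min_max_normalize(col) for col in sub_indicators])
    if weights is None:
        weights = np.full(norm.shape[1], 1.0 / norm.shape[1])  # equal weights
    return norm @ np.asarray(weights)

# Three illustrative dimensions for five hypothetical regions
gdp_pc = [4500, 9800, 15200, 7400, 22000]   # per capita GDP
school = [8.1, 10.2, 12.5, 9.0, 13.1]       # mean years of schooling
infra  = [0.35, 0.60, 0.82, 0.41, 0.95]     # infrastructure score
rdi = composite_index([gdp_pc, school, infra])
print(np.argsort(rdi)[::-1])                 # ranking, most developed region first
```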
Abstract:
Mainstream hardware is becoming parallel, heterogeneous, and distributed, on every desk, in every home and in every pocket. As a consequence, in recent years software has been undergoing an epochal turn toward concurrency, distribution and interaction, pushed by the evolution of hardware architectures and the growth of network availability. This calls for introducing further abstraction layers on top of those provided by classical mainstream programming paradigms, to tackle more effectively the new complexities that developers have to face in everyday programming. A convergence is recognizable in the mainstream toward the adoption of the actor paradigm as a means to unite object-oriented programming and concurrency. Nevertheless, we argue that the actor paradigm can only be considered a good starting point for a more comprehensive response to such a fundamental and radical change in software development. Accordingly, the main objective of this thesis is to propose Agent-Oriented Programming (AOP) as a high-level, general-purpose programming paradigm, a natural evolution of actors and objects, introducing a further level of human-inspired concepts for programming software systems, meant to simplify the design and programming of concurrent, distributed, reactive/interactive programs. To this end, in the dissertation we first construct the required background by studying the state of the art of both actor-oriented and agent-oriented programming, and then focus on the engineering of integrated programming technologies for developing agent-based systems in their classical application domains: artificial intelligence and distributed artificial intelligence. We then shift the perspective from the development of intelligent software systems toward general-purpose software development. Using the expertise gained during the construction of this background, we introduce a general-purpose programming language named simpAL, which is rooted in general principles and practices of software development and at the same time provides an agent-oriented level of abstraction for the engineering of general-purpose software systems.
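To make the message-passing model the abstract refers to concrete, here is a minimal, generic actor sketch in Python (not simpAL and not code from the thesis): each actor owns its state and a mailbox, and processes messages sequentially on its own thread, which is the property that lets the paradigm unite objects and concurrency without shared-state locking.

```python
import queue
import threading

class Actor:
    """A minimal actor: private state, a mailbox, and sequential
    processing of messages on a dedicated thread."""
    def __init__(self):
        self._mailbox = queue.Queue()
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def send(self, message):
        self._mailbox.put(message)           # asynchronous message passing

    def stop(self):
        self._mailbox.put(None)              # poison pill terminates the loop
        self._thread.join()

    def _run(self):
        while True:
            message = self._mailbox.get()
            if message is None:
                break
            self.receive(message)

    def receive(self, message):              # overridden by concrete actors
        raise NotImplementedError


class Counter(Actor):
    def __init__(self):
        self.count = 0                       # state touched only by this actor
        super().__init__()

    def receive(self, message):
        if message == "inc":
            self.count += 1


counter = Counter()
for _ in range(3):
    counter.send("inc")
counter.stop()
print(counter.count)                         # 3
```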
Abstract:
The aim of this work is to present various aspects of the numerical simulation of particle and radiation transport for industrial and environmental protection applications, enabling the analysis of complex physical processes in a fast, reliable, and efficient way. In the first part we deal with speeding up the numerical simulation of neutron transport for nuclear reactor core analysis. The convergence properties of the source iteration scheme of the Method of Characteristics applied to heterogeneous structured geometries have been enhanced by means of Boundary Projection Acceleration, enabling the study of 2D and 3D geometries with transport theory without spatial homogenization. The computational performance has been verified with the C5G7 2D and 3D benchmarks, showing a considerable reduction in iterations and CPU time. The second part is devoted to the study of the temperature-dependent elastic scattering of neutrons for heavy isotopes near the thermal zone. A numerical computation of the Doppler convolution of the elastic scattering kernel based on the gas model is presented, for a general energy-dependent cross section and scattering law in the center-of-mass system. The range of integration has been optimized by employing a numerical cutoff, allowing a faster numerical evaluation of the convolution integral. Legendre moments of the transfer kernel are subsequently obtained by direct quadrature, and a numerical analysis of their convergence is presented. In the third part we turn our attention to remote sensing applications of radiative transfer employed to investigate the Earth's cryosphere. The photon transport equation is applied to simulate the reflectivity of glaciers, varying the age of the layer of snow or ice, its thickness, the presence or absence of other underlying layers, and the amount of dust included in the snow, creating a framework able to decipher the spectral signals collected by orbiting detectors.
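For the Legendre moments obtained by direct quadrature, the following is a minimal, hedged sketch of the generic computation (not the thesis's code): the moments of an angular kernel f(μ) are evaluated with Gauss-Legendre quadrature, and the test kernel, order and node count are illustrative.

```python
import numpy as np
from numpy.polynomial.legendre import leggauss, legval

def legendre_moments(kernel, l_max, n_quad=64):
    """Moments f_l = integral_{-1}^{1} f(mu) P_l(mu) dmu of an angular
    transfer kernel f(mu), by direct Gauss-Legendre quadrature.
    (Other conventions add a (2l+1)/2 prefactor.)
    """
    nodes, weights = leggauss(n_quad)        # quadrature nodes/weights on [-1, 1]
    f_vals = kernel(nodes)
    moments = np.empty(l_max + 1)
    for l in range(l_max + 1):
        coeffs = np.zeros(l + 1)
        coeffs[l] = 1.0                      # select the Legendre polynomial P_l
        moments[l] = np.sum(weights * f_vals * legval(nodes, coeffs))
    return moments

# Example: a mildly forward-peaked kernel f(mu) = (1 + 0.3*mu)/2
moms = legendre_moments(lambda mu: 0.5 * (1.0 + 0.3 * mu), l_max=4)
print(moms)   # ~[1.0, 0.1, 0.0, 0.0, 0.0]; f_1/f_0 is the mean scattering cosine
```

Increasing l_max and checking how quickly the moments decay is the kind of convergence analysis the abstract alludes to.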
Abstract:
The present dissertation aims at analyzing the construction of American adolescent culture through teen-targeted television series and the shift in perception that occurs as a consequence of the translation process. In light of the recent changes in television production and consumption modes, largely caused by new technologies, this project explores the evolution of Italian audiences, focusing on fansubbing (freely distributed amateur subtitles made by fans for fan consumption) and social viewing (the re-aggregation of television consumption based on social networks and dedicated platforms, rather than on physical presence). These phenomena are symptoms of a sort of ‘viewership 2.0’ and of a new type of active viewing, which calls for a revision of traditional AVT strategies. Using a framework that combines television studies, new media studies, and fandom studies with an approach to AVT based on Descriptive Translation Studies (Toury 1995), this dissertation analyzes the non-Anglophone audience's growing need to participate in the global dialogue and appropriation process based on US scheduling and informed by the new paradigm of convergence culture, transmedia storytelling, and affective economics (Jenkins 2006 and 2007), as well as the constraints intrinsic to multimodal translation and the different types of linguistic and cultural adaptation performed through dubbing (which tends to be more domesticating; Venuti 1995) and fansubbing (typically more foreignizing). The study analyzes a selection of episodes from six of the most popular teen television series broadcast between 1990 and 2013, divided into three ages based on the different modes of television consumption: top-down, pre-Internet consumption (Beverly Hills, 90210, 1990–2000); the emergence of audience participation (Buffy the Vampire Slayer, 1997–2003; Dawson’s Creek, 1998–2003); and the age of convergence and Viewership 2.0 (Gossip Girl, 2007–2012; Glee, 2009–present; The Big Bang Theory, 2007–present).