28 results for new keynesian model

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance:

100.00%

Abstract:

Animal models have been essential for studying the molecular mechanisms of cancer and for developing new antitumor agents. However, the wide evolutionary divergence between mouse and human has made it difficult to translate the achievements of preclinical mouse-based studies. The generation of clinically relevant murine models requires their humanization, both through the creation of transgenic models and through the generation of humanized mice in which a functional human immune system can be engrafted, reproducing the physiological effects and molecular mechanisms of growth and metastasization of human tumors. In particular, the availability of genotypically stable immunodepressed mice able to accept tumor injection and to allow human tumor growth and metastasization is important for developing anti-tumor and anti-metastatic strategies. Recently, Rag2-/-;gammac-/- mice, double knockouts for genes involved in lymphocyte differentiation, have been developed (CIEA, Central Institute for Experimental Animals, Kawasaki, Japan). Studies of human sarcoma metastasization in Rag2-/-;gammac-/- mice (which lack B, T and NK functionality) revealed their high metastatic efficiency and allowed the expression of human metastatic phenotypes not detectable in the conventionally used nude mouse model. In vitro analyses investigating the molecular mechanisms involved in the specific pattern of human sarcoma metastasization revealed the importance of liver-produced growth and motility factors, in particular the insulin-like growth factors (IGFs). The involvement of these growth factors was then demonstrated in vivo through inhibition of the IGF signalling pathway. Owing to the high growth and metastatic propensity of tumor cells in this model, Rag2-/-;gammac-/- mice were used to investigate the metastatic behavior of rhabdomyosarcoma cells engineered to improve their differentiation. It has recently been shown that this immunodeficient model can be reconstituted with a human immune system through the injection of human cord blood progenitor cells. The work illustrated in this thesis revealed that injections of different human progenitor cells (CD34+ or CD133+) showed distinct engraftment and differentiation abilities. Cell vaccination experiments were performed to investigate the functionality of the engrafted human immune system and the induction of specific human immune responses. Results from such experiments will provide information about the human immune responses activated during cell vaccination and will help define the best reconstitution and experimental conditions for creating a humanized model in which to study, in a preclinical setting, immunological antitumor strategies.

Relevance:

100.00%

Abstract:

The instability of river banks can result in considerable human and land losses. The Po River, the most important in Italy, is characterized by main banks of significant and steadily increasing height. This study presents multilayer perceptron artificial neural network (ANN) models for the stability analysis of river banks along the Po River under various river and groundwater boundary conditions. To this aim, a number of networks of threshold logic units are tested using different combinations of the input parameters. The factor of safety (FS), as an index of slope stability, is formulated in terms of several influential geometrical and geotechnical parameters. In order to obtain a comprehensive geotechnical database, several cone penetration tests from the study site have been interpreted. The proposed models are developed upon stability analyses performed with a finite element code over different representative sections of river embankments. To verify their validity, the ANN models are employed to predict the FS values of a part of the database beyond the calibration data domain. The results indicate that the proposed ANN models are effective tools for evaluating slope stability, and that they notably outperform the derived multiple linear regression models.
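
As a rough illustration of the modelling approach described in this abstract, the Python sketch below fits a multilayer perceptron to map geotechnical and geometric inputs to a factor of safety. The feature set, synthetic data and hyperparameters are hypothetical placeholders, not the thesis's calibrated models or database.

```python
# Illustrative sketch only: an MLP regressor mapping geotechnical/geometric
# inputs to a factor of safety (FS). Features, data and hyperparameters are
# hypothetical placeholders, not those calibrated in the thesis.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical inputs: bank height, slope angle, river level, groundwater
# level, cone resistance, friction ratio.
X = rng.uniform(size=(500, 6))
# Synthetic FS values standing in for the finite-element stability analyses.
FS = 1.2 + 0.5 * X[:, 0] - 0.3 * X[:, 2] + 0.05 * rng.normal(size=500)

X_tr, X_te, y_tr, y_te = train_test_split(X, FS, random_state=0)
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0),
)
model.fit(X_tr, y_tr)
print("R^2 on held-out data:", model.score(X_te, y_te))
```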

Relevance:

100.00%

Abstract:

This work provides a theoretical examination of three recently created bodies of the United Nations mandated to investigate alleged international crimes committed in Syria (IIIM), Iraq (UNITAD) and Myanmar (IIMM). Established as a compromise solution amid the paralysis of international criminal jurisdictions, these essentially overlapping entities have been depicted as a 'new generation' of UN investigative mechanisms. While non-judicial in nature, they do depart from traditional commissions of inquiry in several respects, owing to their increased criminal or 'quasi-prosecutorial' character. After clarifying their legal basis and different mandating authorities, a comparative institutional analysis is carried out in order to ascertain whether these 'mechanisms' can be said to effectively represent a new institutional model. Through an in-depth assessment of their mandates, the thesis also outlines both the strengths and the critical aspects of these organs. Given their aim of facilitating criminal proceedings by sharing information and case files, it is suggested that more attention should be paid to the position of the person under investigation. To this end, some proposals are made to enhance the mechanisms' frameworks, especially from the angle of procedural safeguards. As a third aspect, cooperation with judicial authorities is explored, in order to shed light on the actors involved, the relevant legal instruments and the possible obstacles, in particular from a human rights perspective. Ultimately, drawing on the issues detected, the thesis seeks to identify lessons learned which could be taken into account should new ad hoc investigative mechanisms, or a permanent institution of this kind, be created.

Relevance:

90.00%

Abstract:

This thesis proposes a new document model, according to which any document can be segmented into independent components and transformed into a pattern-based projection that uses only a very small set of objects and composition rules. The point is that such a normalized document expresses the same fundamental information as the original one, in a simple, clear and unambiguous way. The central part of my work consists of discussing that model, investigating how a digital document can be segmented, and how a segmented version can be used to implement advanced conversion tools. I present seven patterns which are versatile enough to capture the most relevant document structures, and whose minimality and rigour make that implementation possible. The abstract model is then instantiated into an actual markup language, called IML. IML is a general and extensible language, which basically adopts an XHTML syntax, able to capture a posteriori only the content of a digital document. It is compared with other languages and proposals, in order to clarify its role and objectives. Finally, I present some systems built upon these ideas. These applications are evaluated in terms of user advantages, workflow improvements and impact on the overall quality of the output. In particular, they cover heterogeneous content management processes: from web editing to collaboration (IsaWiki and WikiFactory), and from e-learning (IsaLearning) to professional printing (IsaPress).
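
The Python sketch below illustrates, in purely hypothetical terms, the kind of pattern-based document tree such a model describes. The pattern names and the toy composition rule are invented for illustration and are not the seven patterns defined in the thesis, nor IML syntax.

```python
# Purely illustrative data structure for a pattern-based document projection.
# The pattern names and the composition rule below are hypothetical; the
# thesis defines its own set of seven patterns and composition rules.
from dataclasses import dataclass, field

@dataclass
class Node:
    pattern: str                      # e.g. a hypothetical "container" or "atom"
    text: str = ""
    children: list["Node"] = field(default_factory=list)

def validate(node: Node) -> bool:
    """Toy composition rule: internal nodes carry no text, leaves carry text."""
    if node.children:
        return node.text == "" and all(validate(c) for c in node.children)
    return True

doc = Node("container", children=[Node("atom", "Title"), Node("atom", "Body text")])
print(validate(doc))  # True: the projection respects the toy rule
```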

Relevance:

90.00%

Abstract:

Wave breaking is an important coastal process, influencing hydro-morphodynamic processes such as turbulence generation and wave energy dissipation, run-up on the beach and overtopping of coastal defence structures. During breaking, waves are complex mixtures of air and water ('white water') whose properties affect the velocity and pressure fields in the vicinity of the free surface and, depending on the breaker characteristics, different mechanisms for air entrainment are usually observed. Several laboratory experiments have been performed to investigate the role of air bubbles in the wave breaking process (Chanson & Cummings, 1994, among others) and in wave loading on vertical walls (Oumeraci et al., 2001; Peregrine et al., 2006, among others), showing that the air phase is not negligible, since the turbulent energy dissipation involves the air-water mixture. The recent advancement of numerical models has given valuable insights into wave transformation and interaction with coastal structures. Among these models, some solve the RANS equations coupled with a free-surface tracking algorithm and describe the velocity, pressure, turbulence and vorticity fields (Lara et al., 2006a-b; Clementi et al., 2007). A single-phase numerical model, in which the constitutive equations are solved only for the liquid phase, neglects the effects induced by air movement and by air bubbles trapped in the water. Numerical approximations at the free surface may induce errors in predicting the breaking point and wave height; moreover, entrapped air bubbles and water splashing in air are not properly represented. The aim of the present thesis is to develop a new two-phase model called COBRAS2 (Cornell Breaking waves And Structures, 2 phases), an enhancement of the single-phase code COBRAS0 originally developed at Cornell University (Lin & Liu, 1998). In the first part of the work both fluids are considered incompressible, while the second part treats the modelling of air compressibility. The mathematical formulation and the numerical resolution of the governing equations of COBRAS2 are derived, and some model-experiment comparisons are shown. In particular, validation tests are performed in order to prove model stability and accuracy. The simulation of the rise of a large air bubble in an otherwise quiescent water pool reveals the model's capability to reproduce the physics of the process in a realistic way. Analytical solutions for stationary and internal waves are compared with the corresponding numerical results, in order to test processes involving a wide range of density differences. Waves induced by dam-break in different scenarios (on dry and wet beds, as well as on a ramp) are studied, focusing on the role of air as the medium in which the water wave propagates and on the numerical representation of bubble dynamics. Simulations of solitary and regular waves, characterized by both spilling and plunging breakers, are analyzed through comparisons with experimental data and other numerical models, in order to investigate the influence of air on wave breaking mechanisms and to underline the model's capability and accuracy. Finally, the modelling of air compressibility is included in the newly developed model and validated, revealing an accurate reproduction of the processes.
Some preliminary tests on wave impact on vertical walls are performed: since air flow modelling allows a more realistic reproduction of breaking wave propagation, the dependence of impact pressure values on breaker shape and aeration characteristics is studied and, on the basis of a qualitative comparison with experimental observations, the numerical simulations achieve good results.
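
For reference, a standard one-fluid formulation of incompressible two-phase flow, of the kind solved by RANS/VOF codes such as the one described here, can be written as follows (standard notation; this is not necessarily the exact COBRAS2 formulation):

\[ \nabla\cdot\mathbf{u}=0, \qquad \rho\left(\frac{\partial\mathbf{u}}{\partial t}+\mathbf{u}\cdot\nabla\mathbf{u}\right) = -\nabla p + \nabla\cdot\left[\mu\left(\nabla\mathbf{u}+\nabla\mathbf{u}^{T}\right)\right] + \rho\mathbf{g}, \]
\[ \frac{\partial F}{\partial t}+\mathbf{u}\cdot\nabla F=0, \qquad \rho = F\rho_w + (1-F)\rho_a, \quad \mu = F\mu_w + (1-F)\mu_a, \]

where \(F\) is the water volume fraction, the subscripts \(w\) and \(a\) denote water and air, and in a RANS closure \(\mu\) is augmented by an eddy viscosity from the turbulence model.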

Relevance:

90.00%

Abstract:

Almost fifty years have passed since Harry Eckstein's classic monograph, A Theory of Stable Democracy (Princeton, 1961), in which he sketched out the basic tenets of 'congruence theory', which was to become one of the most important and innovative contributions to understanding democratic rule. His next work, Division and Cohesion in Democracy (Princeton University Press, 1966), was designed to serve as a plausibility probe for this theory and is a case study of a Northern democratic system, Norway. Moreover, this line of his work best exemplifies the contribution Eckstein made to the methodology of comparative politics through his seminal article, 'Case Study and Theory in Political Science' (in Greenstein and Polsby, eds., Handbook of Political Science, 1975), on the importance of the case study as an approach to empirical theory. That article demonstrates the special utility of 'crucial case studies' in testing theory, thereby undermining the accepted wisdom in comparative research that the larger the number of cases, the better. Although not along the same lines, but shifting the case study unit of research, I intend to take up the challenge and build upon an equally unique political system, the Swedish one. Bearing in mind the peculiarities of the Swedish political system, my unit of analysis is further restricted to the Swedish Social Democratic Party, the Svenska Arbetare Partiet (SAP). My research nevertheless stays within the methodological framework of case study theory, inasmuch as it focuses on a single political system and party. The Swedish SAP's endurance in government office and its electoral success throughout half a century (as of the 1991 election, about 56 years - more than half a century - of uninterrupted social democratic 'reign' in Sweden) are undeniably a performance no other Social Democratic party has yet achieved under democratic conditions. It is therefore legitimate to inquire into the exceptionality of this unique combination of political power. What were the components of this dominant power position that made SAP's governmental staying power possible? I argue here that it was the end-product of a combination of multifarious factors, such as a key position in the party system, strong party leadership and organization, and a carefully designed strategy regarding class politics and welfare policy. My research is divided into three main parts: the historical incursion, the 'welfare' part and the 'environment' part. The first part is a historical account of the main political events and issues relevant to my case study. Chapter 2 is devoted to the historical events unfolding in the 1920-1960 period: the Saltsjoebaden Agreement, the series of workers' strikes in the 1920s and SAP's inception. It describes SAP's ascent to power in the mid-1930s and the party's ensuing strategies for winning and keeping political office, that is, its economic program and key economic goals. The following chapter, chapter 3, explores the next period, from the 1960s to the 1990s, and covers the party's troubled political times, its peak and the beginnings of its decline. The 1960s are relevant for SAP's planning of a long-term economic strategy - the Rehn-Meidner model, a new way of macroeconomic steering based on the Keynesian model but adapted to the new economic realities of welfare capitalist societies.
The second and third parts of this study develop several hypotheses related to SAP's 'dominant position' (endurance in politics and in office) and then test them. Mainly, the twin issues of economics and environment are raised and their political relevance for the party analyzed. On the one hand, globalization and its spillover effects on the Swedish welfare system are important causal factors in explaining the transformative socio-economic challenges the party had to cope with. On the other hand, Europeanization and environmental change greatly influenced SAP's foreign policy choices and its domestic electoral strategies. The implications of globalization for the Swedish welfare system are the subject of two chapters, chapters four and five, while the consequences of Europeanization are treated at length in the third part of this work, chapters six and seven. At first sight, the link between foreign policy and electoral strategy may appear difficult to prove and, to say the least, unusual. However, in SAP's case there is a substantial body of literature and public opinion statistics showing that governmental domestic policy and party politics depend tightly on foreign policy decisions and sovereignty issues. Again, these country characteristics and peculiar causal relationships are outlined in the first chapters and explained in the second and third parts. The sixth chapter explores the presumed relationship between Europeanization and environmental policy, on the one hand, and SAP's environmental policy formulation and simultaneous agenda-setting at the international level, on the other. This chapter describes Swedish leadership in environmental policy formulation on two simultaneous fronts and across two different time spans. The last chapter, chapter eight, develops a conclusion while exploring alternative theories that might plausibly explain the outlined hypotheses, and points out the reasons why these theories do not fit as valid alternative explanations to my systemic corporatism thesis as the main causal factor determining SAP's 'dominant position'. Among the alternative theories, I consider L. Traedgaardh's and Bo Rothstein's historical exceptionalism thesis and the public opinion thesis, which alone are not able to explain the half-century of social democratic endurance in government in the Swedish case.

Relevance:

90.00%

Abstract:

This thesis presents a universal model of documents and deltas. The model formalizes what it means to find differences between documents and provides a single shared formalization that can be used by any algorithm to describe the differences found between any kind of comparable documents. The main scientific contribution of this thesis is a universal delta model that can be used to represent the changes found by an algorithm. The main parts of this model are the formal definitions of changes (the pieces of information recording that something has changed), operations (the definitions of the kinds of changes that happened) and deltas (coherent summaries of what has changed between two documents). The fundamental mechanism that makes the universal delta model a very expressive tool is the use of encapsulation relations between changes. In the universal delta model, changes are not always simple records of what has changed; they can also be combined into more complex changes that reflect the detection of more meaningful modifications. In addition to the main entities (i.e., changes, operations and deltas), the model also describes and defines documents and the concept of equivalence between documents. As corollaries to the model, there are also an extensible catalog of possible operations that algorithms can detect, used to create a common library of operations, and a UML serialization of the model, useful as a reference when implementing APIs that deal with deltas. The universal delta model presented in this thesis acts as the formal groundwork upon which algorithms can be based and libraries can be implemented. It removes the need to recreate a new delta model and terminology whenever a new algorithm is devised. It also alleviates the problems that toolmakers face when adapting their software to new diff algorithms.
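
The Python sketch below illustrates the three kinds of entities named in this abstract and the encapsulation relation between changes. All names and fields are illustrative assumptions, not the thesis's exact definitions.

```python
# Illustrative sketch of the entities of a universal delta model: operations,
# changes (which may encapsulate finer-grained changes) and deltas.
# Names and fields are hypothetical, not the thesis's formal definitions.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Operation:
    name: str                      # kind of change, e.g. "insert", "delete", "move"

@dataclass
class Change:
    operation: Operation
    target: str                    # locator of the affected document piece
    encapsulates: list["Change"] = field(default_factory=list)

@dataclass
class Delta:
    changes: list[Change]          # coherent summary of what changed

# A hypothetical "move", detected by combining a delete and an insert:
move = Change(Operation("move"), "/p[3]",
              encapsulates=[Change(Operation("delete"), "/p[3]"),
                            Change(Operation("insert"), "/p[7]")])
delta = Delta([move])
print(len(delta.changes), "top-level change(s)")
```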

Relevance:

90.00%

Abstract:

Small-scale dynamic stochastic general equilibrium (DSGE) models have been treated as the benchmark of much of the monetary policy literature, given their ability to explain the impact of monetary policy on output, inflation and financial markets. The empirical failure of New Keynesian models is partially due to the Rational Expectations (RE) paradigm, which entails a tight structure on the dynamics of the system. Under this hypothesis, agents are assumed to know the data generating process. In this paper, we propose the econometric analysis of New Keynesian DSGE models under an alternative expectations-generating paradigm, which can be regarded as an intermediate position between rational expectations and learning, namely an adapted version of the 'Quasi-Rational' Expectations (QRE) hypothesis. Given the agents' statistical model, we build a pseudo-structural form from the baseline system of Euler equations, imposing that the lag length of the reduced form is the same as in the 'best' statistical model.
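
For reference, the baseline three-equation New Keynesian system on which such pseudo-structural forms are typically built can be written in its standard textbook form (the exact specification used in this work may differ):

\[ x_t = E_t x_{t+1} - \sigma^{-1}\left(i_t - E_t\pi_{t+1}\right) + g_t \quad \text{(IS curve)}, \]
\[ \pi_t = \beta\,E_t\pi_{t+1} + \kappa\,x_t + u_t \quad \text{(Phillips curve)}, \]
\[ i_t = \phi_\pi \pi_t + \phi_x x_t + \varepsilon_t \quad \text{(policy rule)}, \]

where \(x_t\) is the output gap, \(\pi_t\) inflation and \(i_t\) the nominal interest rate. Under RE, \(E_t\) is the model-consistent expectations operator; under the QRE hypothesis described above, expectations are instead taken from the agents' statistical model.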

Relevance:

90.00%

Abstract:

This thesis reports an integrated analytical and physicochemical approach for the study of natural substances and new drugs, based on mass spectrometry techniques combined with liquid chromatography. In particular, Chapter 1 concerns the study of berberine, a natural substance with pharmacological activity for the treatment of hepatobiliary and intestinal diseases. The first part focuses on the relationships between the physicochemical properties, pharmacokinetics and metabolism of berberine and its metabolites. For this purpose, a sensitive HPLC-ES-MS/MS method has been developed, validated and used to determine these compounds in studies of their physicochemical properties and to measure plasma levels of berberine and its metabolites, including berberrubine (M1), demethyleneberberine (M3) and jatrorrhizine (M4), in humans. The data show that M1 could undergo efficient intestinal absorption by passive diffusion, owing to a keto-enol tautomerism confirmed by NMR studies, consistent with its higher plasma concentration. In the second part of Chapter 1, the in vivo biodistributions of M1 and BBR (berberine) in the rat are compared. In Chapter 2, a new HPLC-ES-MS/MS method for the simultaneous determination and quantification of glucosinolates, such as glucoraphanin, glucoerucin and sinigrin, and isothiocyanates, such as sulforaphane and erucin, has been developed and validated. This method has been used for the analysis of functional foods enriched with vegetable extracts. Chapter 3 focuses on a physicochemical study of the interaction between bile acid sequestrants used in the treatment of hypercholesterolemia, including colesevelam and cholestyramine, and obeticholic acid (OCA), a potent agonist of the farnesoid X nuclear receptor (FXR). In particular, a new experimental model for the determination of the equilibrium binding isotherm was developed. Chapter 4 focuses on methodological aspects of a new hard-ionization interface coupled with liquid chromatography (Direct-EI-UHPLC-MS), not yet commercially available and potentially useful for qualitative analysis and for molecules that are 'transparent' to soft ionization techniques. This method was applied to the analysis of several steroid derivatives.
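
For context, binding data of this kind are classically summarized by a single-site Langmuir-type isotherm, shown here only as the standard baseline form (the experimental model developed in the thesis may differ):

\[ q = \frac{q_{\max}\,K\,c}{1 + K\,c}, \]

where \(q\) is the amount of OCA bound per unit of sequestrant at equilibrium, \(c\) the free OCA concentration, \(K\) the association constant and \(q_{\max}\) the binding capacity.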

Relevance:

80.00%

Abstract:

This dissertation concerns active fibre-reinforced composites with embedded shape memory alloy wires. The structural application of active materials allows the development of adaptive structures which actively respond to changes in the environment, such as morphing structures, self-healing structures and power harvesting devices. In particular, shape memory alloy actuators integrated within a composite can actively control the structural shape or stiffness, thus influencing the composite's static and dynamic properties. Envisaged applications include, among others, the prevention of thermal buckling of the outer skin of air vehicles, shape changes in panels for improved aerodynamic characteristics and the deployment of large space structures. The study and design of active composites is a complex and multidisciplinary topic, requiring in-depth understanding of both the coupled behaviour of active materials and the interaction between the different composite constituents. Both fibre-reinforced composites and shape memory alloys are extremely active research topics, whose modelling and experimental characterisation still present a number of open problems. Thus, while this dissertation focuses on active composites, some of the research results presented here can be usefully applied to traditional fibre-reinforced composites or to other shape memory alloy applications. The dissertation is composed of four chapters. In the first chapter, active fibre-reinforced composites are introduced through an overview of the most common choices available for the reinforcement, matrix and production process, together with a brief introduction and classification of active materials. The second chapter presents a number of original contributions regarding the modelling of fibre-reinforced composites. Different two-dimensional laminate theories are derived from a parent three-dimensional theory, introducing a procedure for the a posteriori reconstruction of transverse stresses along the laminate thickness. Accurate through-the-thickness stresses are crucial for composite modelling, as they are responsible for some common failure mechanisms. A new finite element based on First-order Shear Deformation Theory and a hybrid stress approach is proposed for the numerical solution of the two-dimensional laminate problem. The element is simple and computationally efficient. The transverse stresses through the laminate thickness are reconstructed starting from a general finite element solution; a two-stage procedure is devised, based on Recovery by Compatibility in Patches and three-dimensional equilibrium. Finally, the determination of the elastic parameters of laminated structures via numerical-experimental Bayesian techniques is investigated. Two different estimators are analysed and compared, leading to the definition of an alternative procedure to improve the convergence of the estimation process. The third chapter focuses on shape memory alloys, describing their properties and applications. A number of constitutive models proposed in the literature, both one-dimensional and three-dimensional, are critically discussed and compared, underlining their potential and limitations, which are mainly related to the definition of the phase diagram and the choice of internal variables. Some new experimental results on shape memory alloy material characterisation are also presented.
These experimental observations display some features of shape memory alloy behaviour which are generally not included in current models, and some ideas are therefore proposed for the development of a new constitutive model. The fourth chapter, finally, focuses on active composite plates with embedded shape memory alloy wires. A number of different approaches can be used to predict the behaviour of such structures, each model presenting different advantages and drawbacks related to complexity and versatility. A simple model able to describe both shape and stiffness control configurations within the same framework is proposed and implemented. The model is then validated considering the shape control configuration, which is the most sensitive to model parameters. The experimental work is divided into two parts. In the first part, an active composite is built by gluing prestrained shape memory alloy wires onto a carbon fibre laminate strip. This structure is relatively simple to build, yet useful for experimentally demonstrating the feasibility of the concept proposed in the first part of the chapter. In the second part, the manufacture of a fibre-reinforced composite with embedded shape memory alloy wires is investigated, considering different possible choices of materials and manufacturing processes. Although a number of technological issues still need to be faced, the experimental results demonstrate the mechanism of shape control via embedded shape memory alloy wires, while showing good agreement with the predictions of the proposed model.
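
For reference, the kinematic assumption of the First-order Shear Deformation Theory mentioned above is the standard one:

\[ u(x,y,z) = u_0(x,y) + z\,\varphi_x(x,y), \quad v(x,y,z) = v_0(x,y) + z\,\varphi_y(x,y), \quad w(x,y,z) = w_0(x,y), \]

where \(u_0, v_0, w_0\) are the mid-plane displacements and \(\varphi_x, \varphi_y\) the independent rotations of the normal. The transverse shear strains implied by this field are constant through the thickness, which is why an a posteriori reconstruction of transverse stresses, such as the one proposed in the thesis, is needed.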

Relevance:

80.00%

Abstract:

Subduction zones, where friction between oceanic and continental plates causes strong seismicity, are the most likely settings for tsunamigenic earthquakes. The topics and methodologies discussed in this thesis are focused on understanding the rupture process of the seismic sources of the great earthquakes that generate tsunamis. Tsunamigenesis is controlled by several kinematic characteristics of the parent earthquake, such as the focal mechanism, the depth of the rupture and the slip distribution along the fault area, as well as by the mechanical properties of the source zone. Each of these factors plays a fundamental role in tsunami generation. Inferring the source parameters of tsunamigenic earthquakes is therefore crucial to understanding the generation of the consequent tsunami and thus to mitigating the risk along the coasts. The typical way to gather information on the source process is to invert the available geophysical data. Tsunami data, moreover, are useful for constraining the portion of the fault area that extends offshore, generally close to the trench, which other kinds of data are unable to constrain. In this thesis I discuss the rupture process of some recent tsunamigenic events, as inferred by means of an inverse method. I first present the 2003 Tokachi-Oki (Japan) earthquake (Mw 8.1). In this study the slip distribution on the fault has been inferred by inverting tsunami waveform, GPS and bottom-pressure data. The joint inversion of tsunami and geodetic data provides a much better constraint on the slip distribution than the separate inversions of the single datasets. We then study the earthquake that occurred in 2007 in southern Sumatra (Mw 8.4). By inverting several tsunami waveforms, both in the near and in the far field, we determine the slip distribution and the mean rupture velocity along the causative fault. Since the largest patch of slip was concentrated on the deepest part of the fault, this is the likely reason for the small tsunami waves that followed the earthquake, highlighting how crucial a role the depth of the rupture plays in controlling tsunamigenesis. Finally, we present a new rupture model for the great 2004 Sumatra earthquake (Mw 9.2). We perform a joint inversion of tsunami waveform, GPS and satellite altimetry data to infer the slip distribution, the slip direction and the rupture velocity on the fault. Furthermore, in this work we present a novel method to estimate, in a self-consistent way, the average rigidity of the source zone. The estimation of the source zone rigidity is important since it may play a significant role in tsunami generation and, particularly for slow earthquakes, a low rigidity value is sometimes necessary to explain how an earthquake with a relatively low seismic moment may generate a significant tsunami; this latter point may be relevant for explaining the mechanics of tsunami earthquakes, one of the open issues in present-day seismology. The investigation of these tsunamigenic earthquakes has underlined the importance of using a joint inversion of different geophysical data to determine the rupture characteristics.
The results shown here have important implications for the implementation of new tsunami warning systems - particularly in the near field - and the improvement of current ones, as well as for the planning of inundation maps for tsunami-hazard assessment along coastal areas.
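
As background, slip inversions of the kind described above are commonly cast as a regularized linear least-squares problem; a generic formulation (not necessarily the exact scheme adopted in the thesis) is

\[ \hat{\mathbf{m}} = \arg\min_{\mathbf{m}} \left\|\mathbf{W}\left(\mathbf{d}-\mathbf{G}\mathbf{m}\right)\right\|^{2} + \lambda^{2}\left\|\mathbf{L}\mathbf{m}\right\|^{2}, \]

where the data vector \(\mathbf{d}\) stacks the different datasets (tsunami waveforms, GPS, altimetry) in a joint inversion, \(\mathbf{G}\) contains the Green's functions relating unit slip on each subfault to the observations, \(\mathbf{m}\) is the subfault slip vector, \(\mathbf{W}\) weights the heterogeneous datasets and \(\mathbf{L}\) (with weight \(\lambda\)) imposes smoothness or other regularization on the slip distribution.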

Relevance:

80.00%

Abstract:

Knowledge of how ligaments and articular surfaces guide passive motion at the human ankle joint complex is fundamental for the design of relevant surgical treatments. This dissertation presents a possible improvement of this knowledge through a new kinematic model of the tibiotalar articulation. Two one-DOF spatial equivalent mechanisms are presented for the simulation of the passive motion of the human ankle joint: the 5-5 fully parallel mechanism and the fully parallel spherical wrist mechanism. These mechanisms are based on the main anatomical structures of the ankle joint, namely the talus/calcaneus and the tibia/fibula bones at their interface, and the TiCaL and CaFiL ligaments. In order to show the accuracy of the models and the efficiency of the proposed procedure, these mechanisms are synthesized from experimental data, and the results are compared both with those obtained during experimental sessions and with data published in the literature. Experimental results proved the efficiency of the proposed new mechanisms in simulating ankle passive motion and, at the same time, the potential of the mechanisms to replicate the ankle's main anatomical structures quite well. The new mechanisms represent a powerful tool for both preoperative planning and the design of new prostheses.
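
For reference, equivalent mechanisms of this kind are governed by rigid-link constraint equations; for a 5-5 fully parallel mechanism they take the generic form (the thesis provides the actual synthesis from experimental data):

\[ \left\|\mathbf{p} + \mathbf{R}\,\mathbf{b}_i - \mathbf{a}_i\right\| = L_i, \qquad i = 1,\dots,5, \]

where \(\mathbf{R}\) and \(\mathbf{p}\) describe the rotation and translation of the talus relative to the tibia/fibula, \(\mathbf{a}_i\) and \(\mathbf{b}_i\) are the attachment points of the \(i\)-th rigid leg (modelling isometric ligament fibres and articular contacts) in the two bodies, and \(L_i\) is its constant length; five such constraints on the six degrees of freedom of a rigid body leave the single DOF of passive motion.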

Relevance:

80.00%

Abstract:

The last decades have seen a large effort by the scientific community to study and understand the physics of sea ice. We currently have a wide, even though still not exhaustive, knowledge of sea ice dynamics and thermodynamics and of their temporal and spatial variability. Sea ice biogeochemistry, by contrast, is largely unknown. Sea ice algal production may account for up to 25% of overall primary production in the ice-covered waters of the Southern Ocean. However, the influence of physical factors, such as the location of ice formation, the role of snow cover and light availability, on sea ice primary production is poorly understood. There are only sparse localized observations and little knowledge of the functioning of sea ice biogeochemistry at larger scales. Modelling therefore becomes an auxiliary tool to help qualify and quantify the role of sea ice biogeochemistry in ocean dynamics. In this thesis, a novel approach is used for the modelling of sea ice biogeochemistry, and in particular of its primary production, and for its coupling to sea ice physics. Previous attempts were based on the coupling of rather complex sea ice physical models to empirical or relatively simple biological or biogeochemical models. Here the focus is moved to a more biologically oriented point of view. A simple, yet comprehensive, physical model of sea ice thermodynamics (ESIM) was developed and coupled to a novel sea ice implementation (BFM-SI) of the Biogeochemical Flux Model (BFM). The BFM is a comprehensive model, widely used and validated in the open ocean environment and in regional seas. The physical model has been developed with the biogeochemical properties of sea ice in mind, providing the physical inputs required to model sea ice biogeochemistry. The central concept of the coupling is the modelling of the Biologically Active Layer (BAL), the time-varying fraction of sea ice that is continuously connected to the ocean via brine pockets and channels and acts as a rich habitat for many microorganisms. The physical model provides the key physical properties of the BAL (e.g., brine volume, temperature and salinity), and the BFM-SI simulates the physiological and ecological response of the biological community to the physical environment. The new biogeochemical model is also coupled to the pelagic BFM through the exchange of organic and inorganic matter at the boundaries between the two systems. This is done by computing the entrapment of matter and gases when sea ice grows and their release to the ocean when sea ice melts, to ensure mass conservation. The model was run in different ice-covered regions of the world ocean to test the generality of the parameterizations, with a particular focus on regions of landfast ice, where primary production is generally large. The implementation of the BFM in sea ice and the coupling structure will add a new component to General Circulation Models (and in general to Earth System Models), which will be able to provide adequate estimates of the role and importance of sea ice biogeochemistry in the global carbon cycle.
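
The entrapment/release coupling described above can be sketched schematically as follows (purely illustrative notation, not the actual BFM-SI equations):

\[ \frac{d}{dt}\left(h\,C_{ice}\right) = \underbrace{w_g\,C_{oce}}_{\text{entrapment during growth}} - \underbrace{w_m\,C_{ice}}_{\text{release during melt}} + \text{biogeochemical sources and sinks}, \]

where \(h\) is the thickness of the biologically active layer, \(C_{ice}\) and \(C_{oce}\) are the concentrations of a constituent in the ice and in the surface ocean, and \(w_g\) and \(w_m\) are the ice growth and melt rates; the same fluxes enter the pelagic budget with opposite sign, which is what ensures mass conservation between the two systems.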

Relevance:

80.00%

Abstract:

The Iberia-Africa plate boundary runs roughly W-E across the eastern Atlantic Ocean, from the Azores triple junction to the continental margin of Morocco. The relative movement between the two plates changes along the boundary: from transtensive near the Azores archipelago, through transcurrent movement in the middle, at the Gloria Fracture Zone, to transpressive in the Gulf of Cadiz area. This study presents the results of geophysical and geological analyses of the plate boundary area offshore Gibraltar. The main aim is to clarify the geodynamic evolution of this area from the Oligocene to the Quaternary. Recent studies have shown that the new plate boundary is represented by a 600 km long set of aligned dextral transcurrent faults (the SWIM lineaments) connecting the Gloria Fault to the Rif orogen. The western termination of these lineaments crosscuts the Gibraltar accretionary prism and seems to reach the Moroccan continental shelf. Bathymetric data newly acquired offshore Morocco in the past two years make it possible to constrain the present position of the eastern portion of the plate boundary, previously thought to be diffuse. The evolution of the plate boundary, from the onset of compression in the Oligocene to the Late Pliocene activation of the transcurrent structures, is not yet well constrained. The review of the available seismic lines, gravity and bathymetric data, together with the analysis of newly acquired bathymetric and high-resolution seismic data offshore Morocco, allows an understanding of how deformation acted at the lithospheric scale under the compressive regime. Lithospheric folding in the area is suggested, and a new conceptual model is proposed for the propagation of the deformation acting in the brittle crust during this process. Our results show that lithospheric folding, in both oceanic and thinned continental crust, produced large-wavelength synclines bounded by short-wavelength anticlines topped by thrusts. Two of these anticlines are located in the Gulf of Cadiz and are represented by the Gorringe Ridge and Coral Patch seamounts. Lithospheric folding probably interacted with the Monchique-Madeira hotspot during its NNE-SSW transit from 72 Ma to Recent. Plume-related volcanism is described for the first time on top of the Coral Patch seamount, where nine volcanoes were found by means of bathymetric data. A 40Ar-39Ar age of 31.4±1.98 Ma was measured on a rock sample from one of these volcanoes. Analyses of biogenic samples show that Coral Patch has acted as a starved offshore seamount since the Chattian. We propose that compressive stress formed lithospheric-scale structures acting as a preferred pathway for the upwelling of mantle material during the hotspot transit. The interaction between lithospheric folding and hotspot emplacement may also be responsible for the irregular spacing, and anomalous alignments, of individual islands and seamounts belonging to the Monchique-Madeira hotspot.