957 results for Hype Cycle Model


Relevance:

80.00%

Publisher:

Abstract:

The link between the atmospheric CO2 level and the ventilation state of the deep ocean is an important building block of the key hypotheses put forth to explain glacial-interglacial CO2 fluctuations. In this study, we systematically examine the sensitivity of atmospheric CO2 and its carbon isotope composition to changes in deep ocean ventilation, the ocean carbon pumps, and sediment formation in a global 3-D ocean-sediment carbon cycle model. Our results support the hypothesis that a breakup of Southern Ocean stratification and invigorated deep ocean ventilation were the dominant drivers of the early deglacial CO2 rise of ~35 ppm between the Last Glacial Maximum and 14.6 ka BP. A further rise of 10 ppm by the end of the Holocene is attributed to carbonate compensation responding to the early deglacial change in ocean circulation. Our reasoning is based on a multi-proxy analysis which indicates that an acceleration of deep ocean ventilation during early deglaciation is consistent not only with the atmospheric CO2 record but also with the reconstructed opal sedimentation peak in the Southern Ocean at around 16 ka BP, the record of the δ13C of atmospheric CO2, and the reconstructed changes in the Pacific CaCO3 saturation horizon.

Relevance:

80.00%

Publisher:

Abstract:

Climate targets are designed to inform policies that would limit the magnitude and impacts of climate change caused by anthropogenic emissions of greenhouse gases and other substances. The target that is currently recognized by most world governments [1] places a limit of two degrees Celsius on the global mean warming since preindustrial times. This would require large sustained reductions in carbon dioxide emissions during the twenty-first century and beyond [2-4]. Such a global temperature target, however, is not sufficient to control many other quantities, such as transient sea level rise [5], ocean acidification [6,7] and net primary production on land [8,9]. Here, using an Earth system model of intermediate complexity (EMIC) in an observation-informed Bayesian approach, we show that allowable carbon emissions are substantially reduced when multiple climate targets are set. We take into account uncertainties in physical and carbon cycle model parameters, radiative efficiencies [10], climate sensitivity [11] and carbon cycle feedbacks [12,13], along with a large set of observational constraints. Within this framework, we explore a broad range of economically feasible greenhouse gas scenarios from the integrated assessment community [14-17] to determine the likelihood of meeting a combination of specific global and regional targets under various assumptions. For any given likelihood of meeting a set of such targets, the allowable cumulative emissions are greatly reduced from those inferred from the temperature target alone. Therefore, temperature targets alone are unable to comprehensively limit the risks from anthropogenic emissions.
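As an illustration of the kind of ensemble calculation described above, the following sketch estimates the likelihood of jointly meeting a temperature, sea-level and acidification target as a function of cumulative emissions. It is a toy Monte Carlo stand-in, not the study's EMIC or Bayesian framework: the response parameters, spreads and target thresholds are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative ensemble: for each cumulative-emissions level (GtC), sample
# uncertain climate responses. Sensitivities and spreads are made up for
# the sketch, not taken from the paper.
emissions = np.arange(500, 3001, 100)          # cumulative CO2 emissions, GtC
n_members = 5000

def ensemble_outcomes(e_cum):
    """Sample warming (K), sea-level rise (m) and surface-pH drop per member."""
    tcre = rng.normal(1.6e-3, 0.4e-3, n_members)          # K per GtC (uncertain)
    warming = tcre * e_cum
    slr = rng.normal(0.25, 0.08, n_members) * warming     # m, scales with warming
    dph = rng.normal(1.5e-4, 0.3e-4, n_members) * e_cum   # pH units per GtC
    return warming, slr, dph

def likelihood(e_cum, t_max=2.0, slr_max=0.6, dph_max=0.2, temp_only=False):
    """Fraction of ensemble members that meet the target set."""
    warming, slr, dph = ensemble_outcomes(e_cum)
    ok = warming <= t_max
    if not temp_only:
        ok &= (slr <= slr_max) & (dph <= dph_max)
    return ok.mean()

# Allowable emissions for a 66% chance of compliance, temperature-only
# versus temperature + sea level + acidification targets.
for label, temp_only in [("T only", True), ("T + SLR + pH", False)]:
    p = np.array([likelihood(e, temp_only=temp_only) for e in emissions])
    allowed = emissions[p >= 0.66].max() if (p >= 0.66).any() else None
    print(f"{label}: allowable cumulative emissions ≈ {allowed} GtC")
```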

Relevance:

80.00%

Publisher:

Abstract:

Background: In Switzerland, age is the predominant driver of solidarity transfers in risk adjustment (RA). Concerns have been voiced about growing imbalances in cost sharing between young and old insured due to demographic change (a larger fraction of people older than 65 years and a rising average age). Young adults aged 19–25 in particular, who have limited incomes, have to shoulder increasing solidarity burdens. Between 1996 and 2011, monthly intergenerational solidarity payments for young adults doubled from CHF 87 to CHF 182, the largest absolute transfer increase of all age groups. Results: By constructing models for age-specific RA growth and for the lifetime sum of RA transfers, we investigated the causes and consequences of demographic change for RA payments. The models suggest that the main driver of past RA increases was below-average health care expenditure (HCE) growth among young adults, which was only half as high (2% per year on average) as among older adults (4% per year on average). Shifts in the age-group distribution accounted for only 2% of the CHF 95 rise in RA payments. Despite rising risk-adjustment debts for young insured, the balance of lifetime transfers remains positive as long as HCE growth rates exceed the discount rate used in this model (3%). Moreover, the life-cycle model predicts that the lifetime rate of return on RA payments may even be increased further by demographic change. Nevertheless, continued growth of RA contributions may overwhelm vulnerable age groups such as young adults. We therefore propose methods to limit the burden of social health insurance for specific age groups (e.g. young adults in Switzerland) by capping solidarity payments. Conclusions: Taken together, our mathematical modelling framework helps to improve understanding of how demographic change interacts with risk adjustment and how the redistribution of funds between age groups can be controlled without inducing further selection incentives. These methods can help to construct more equitable systems of health financing in light of population aging.
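The lifetime-balance argument can be sketched with a simple discounted-sum calculation. The age profile of net transfers, the amounts and the rates below are illustrative assumptions, not the paper's calibrated Swiss values; the point is only that faster HCE growth relative to the discount rate raises the present value of the transfers received in old age.

```python
# Minimal sketch of the lifetime risk-adjustment (RA) balance argument.
# The age profile of net transfers and all rates are illustrative
# assumptions, not the paper's calibrated values.

def lifetime_balance(hce_growth, discount_rate, start_age=19, end_age=90):
    """Present value at age 19 of net RA transfers over a lifetime.

    Net transfer is negative (the insured pays into RA) below age 65 and
    positive (the insured receives) afterwards; amounts grow with health
    care expenditure (HCE)."""
    balance = 0.0
    for age in range(start_age, end_age + 1):
        years_ahead = age - start_age
        base = -2000.0 if age < 65 else 6000.0      # CHF/year, illustrative
        amount = base * (1 + hce_growth) ** years_ahead
        balance += amount / (1 + discount_rate) ** years_ahead
    return balance

print(lifetime_balance(hce_growth=0.04, discount_rate=0.03))  # larger positive balance
print(lifetime_balance(hce_growth=0.02, discount_rate=0.03))  # balance shrinks
```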

Relevance:

80.00%

Publisher:

Abstract:

Measurements of the calcium isotopic composition (δ44/40Ca) of planktonic foraminifera from the western equatorial Pacific and the Indian sector of the Southern Ocean show variations of about 0.6 per mil over the past 24 Myr. The stacked δ44/40Ca record of Globigerinoides trilobus and Globigerina bulloides indicates a minimum in seawater δ44/40Ca (δ44/40Ca_sw) at 15 to 16 Ma and a subsequent general increase toward the present, interrupted by a second minimum at 3 to 5 Ma. Applying a coupled calcium/carbon cycle model, we find two scenarios that can explain a large portion of the observed δ44/40Ca_sw variations. Both invoke variations in the Ca input flux to the ocean without proportional changes in the carbonate flux: the first scenario increases the riverine calcium input to the ocean, while the second generates an additional calcium flux from the exchange of Ca for Mg during dolomitization. In both cases the calcium flux variations lead to drastic changes in seawater Ca concentrations on million-year timescales. Our δ44/40Ca_sw record therefore indicates that the global calcium cycle may be much more dynamic than previously assumed.
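A one-box sketch of the mechanism invoked above (a Ca input flux that varies without a matching change in carbonate burial) is shown below. Reservoir size, fluxes, fractionation and the imposed perturbation are rough, illustrative values, not the coupled calcium/carbon cycle model used in the study.

```python
# One-box sketch of the seawater calcium and calcium-isotope budget. Reservoir
# size, fluxes, fractionation and the imposed perturbation are illustrative
# order-of-magnitude assumptions only.

n_ca = 1.4e19                  # mol Ca in the ocean
d_in = -1.0                    # δ44/40Ca of the riverine input, per mil (relative scale)
delta_carb = -1.4              # fractionation of carbonate burial relative to seawater
d_sw = d_in - delta_carb       # start from the steady-state seawater value

f_in_base = 1.3e13             # mol/yr riverine Ca input
f_out = 1.3e13                 # mol/yr carbonate burial (held fixed here)

dt = 1.0e4                     # yr
for step in range(int(6e6 / dt)):      # integrate 6 Myr with explicit Euler steps
    t = step * dt
    # transient 25% increase of the Ca input with no matching change in burial,
    # mimicking the kind of scenario invoked in the abstract
    f_in = f_in_base * (1.25 if 1.0e6 < t < 2.0e6 else 1.0)
    # isotope balance: input pulls d_sw toward d_in, burial removes light Ca
    d_sw += ((f_in * (d_in - d_sw) - f_out * delta_carb) / n_ca) * dt
    n_ca += (f_in - f_out) * dt

print(f"final Ca inventory: {n_ca:.3e} mol, final seawater δ44/40Ca: {d_sw:.2f} per mil")
```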

Relevance:

80.00%

Publisher:

Abstract:

Reconstructions of atmospheric CO2 concentrations based on Antarctic ice cores reveal significant changes during the Holocene epoch, but the processes responsible for these changes in CO2 concentration have not been unambiguously identified. Distinct characteristics in the carbon isotope signatures of the major carbon reservoirs (ocean, biosphere, sediments and atmosphere) constrain variations in the CO2 fluxes between those reservoirs. Here we present a highly resolved atmospheric δ13C record for the past 11,000 years from measurements of atmospheric CO2 trapped in an Antarctic ice core. From mass-balance inverse model calculations performed with a simplified carbon cycle model, we show that the decrease in atmospheric CO2 of about 5 parts per million by volume (p.p.m.v.) and the increase in δ13C of about 0.25‰ during the early Holocene are most probably the result of a combination of carbon uptake of about 290 gigatonnes of carbon by the land biosphere and carbon release from the ocean in response to carbonate compensation of the terrestrial uptake during the termination of the last ice age. The 20 p.p.m.v. increase of atmospheric CO2 and the small decrease in δ13C of about 0.05‰ during the later Holocene can mostly be explained by contributions from carbonate compensation of earlier land-biosphere uptake and coral reef formation, with only a minor contribution from a small decrease of the land-biosphere carbon inventory.

Relevance:

80.00%

Publisher:

Abstract:

Remote sensing instruments are key tools for mapping land surface temperature (LST) at large temporal and spatial scales. In this paper, we present how we combine passive microwave and thermal infrared data to estimate LST during summer snow-free periods over northern high latitudes. The methodology is based on the SSM/I-SSMIS 37 GHz measurements at both vertical and horizontal polarizations on a 25 km × 25 km grid. LST is retrieved from brightness temperatures by introducing an empirical linear relationship between the emissivities at both polarizations, as described in Royer and Poirier (2010). This relationship is calibrated at the pixel scale using cloud-free independent LST data from MODIS instruments. The SSM/I-SSMIS and MODIS data are synchronized by fitting a diurnal cycle model built on skin temperature reanalysis provided by the European Centre for Medium-Range Weather Forecasts (ECMWF). The resulting temperature dataset is provided at the 25 km scale and at an hourly time step over the analysis period (2000-2011). This new product was evaluated locally at five experimental sites of the EU-PAGE21 project against air temperature measurements and meteorological model reanalysis, and compared to the MODIS LST product at both local and circumpolar scales. The results, with a mean RMSE of about 2.2 K, demonstrate the usefulness of the microwave product, which is unaffected by clouds, unlike thermal infrared products, and offers a better resolution than model reanalysis.
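The core retrieval step can be sketched as follows: with Tb_p ≈ e_p · LST and a per-pixel linear relation between the two emissivities, the 37 GHz brightness temperatures alone give the surface temperature. The coefficients and brightness temperatures below are placeholders; in the paper the relation is calibrated per 25 km pixel against cloud-free MODIS LST.

```python
import numpy as np

# Sketch of the two-polarization retrieval idea: with Tb_p ≈ e_p * LST
# (atmospheric effects neglected) and a per-pixel linear link between the
# two emissivities, e_V = a * e_H + b, the surface temperature follows from
# the 37 GHz brightness temperatures alone:
#
#   Tb_V = e_V * T = (a * e_H + b) * T = a * Tb_H + b * T
#   =>  T = (Tb_V - a * Tb_H) / b
#
# The coefficients a and b below are placeholders, not calibrated values.

def lst_from_37ghz(tb_v, tb_h, a, b):
    """Retrieve LST (K) from 37 GHz V- and H-polarized brightness temperatures."""
    return (tb_v - a * tb_h) / b

tb_v = np.array([265.1, 270.4, 259.8])   # K, vertical polarization
tb_h = np.array([251.0, 258.2, 244.6])   # K, horizontal polarization
a, b = 0.35, 0.62                        # illustrative per-pixel calibration
print(lst_from_37ghz(tb_v, tb_h, a, b))
```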

Relevance:

80.00%

Publisher:

Abstract:

There is currently great expectation surrounding the introduction of new tools and methods for software product development, which in the near future will allow an engineering approach to the software production process. The new methodologies now emerging take an integral approach to the problem, covering all stages of the production scheme. However, the degree of automation achieved in the system-construction process is very low and is concentrated in the last phases of the software life cycle, so the resulting cost reduction is insignificant and, more importantly, the quality of the software products obtained is not guaranteed. This thesis defines a structured software development methodology that can be automated, that is, a CASE methodology. The methodology presented conforms to the CASE development cycle model, which consists of the analysis, design and testing phases, and its field of application is information systems. First, the basic principles on which the CASE methodology rests are established. Then, since the methodology starts from fixing the objectives of the company requesting an information system, techniques are used for gathering and validating information, which at the same time provide an easy communication language between end users and developers. These same techniques also detail all the system requirements completely, consistently and unambiguously. Likewise, a set of techniques and algorithms is presented to obtain, from the system requirements specification, an automated logical design of both the Process Model and the Data Model, each validated against the preceding requirements specification. Finally, formal procedures are defined that indicate the set of activities to be carried out in the construction process and how to carry them out, thereby achieving integrity across the different stages of the development process.

Relevance:

80.00%

Publisher:

Abstract:

Software development is an inherently complex activity that requires specific problem-comprehension and problem-solving abilities. It is so difficult that it can be said with confidence that there is no perfect method for each development stage and no perfect life cycle model: each new problem differs from the preceding ones in some respect, and techniques that worked in earlier projects can fail in new ones. Given that situation, the current trend is to integrate different methods, tools and techniques, using in each case those best suited to the characteristics of the problem at hand. This trend, however, raises new problems. The first is the selection of development approaches: if no single approach is manifestly best, how does one choose the most adequate from the array of available options? The second concerns the relationship between the analysis and design phases, which carries two major risks. On the one hand, the analysis may be too shallow and too far removed from the design, making the transition between them very difficult. On the other hand, the analysis may be expressed in design terminology, becoming a kind of preliminary design rather than a model of the problem to be solved. Third, there is the analysis dilemma, which can be stated as follows: the developer has to choose the most adequate techniques for each problem, and to make this decision it is necessary to know the most relevant properties of the problem; this implies analysing the problem, and hence choosing an analysis technique, before really knowing it. If the chosen technique uses design terminology, the solution paradigm has been preconditioned and may turn out, once the problem is well understood, not to be the most appropriate one. Finally, there are pragmatic barriers that limit the applicability of formally based methods and make them difficult to use in everyday practice. To address these problems, analysis methods are needed that fulfil several goals. The first is a formal basis, which prevents ambiguity and allows the analysis models to be verified. The second is design independence: the analysis should use terminology with no direct counterpart in the design, so that it can focus on the characteristics of the problem. Third, the method should allow the analysis of problems of any kind: algorithmic, decision-support, knowledge-based and others. Two further goals concern pragmatic aspects. The method should offer a formal but non-mathematical textual notation, so that people without deep mathematical knowledge can understand and validate the resulting models without losing the rigour needed for verification; and it should offer a complementary graphical notation so that clients and users can comfortably understand and validate the most relevant parts of the models. This thesis proposes such a method, called SETCM. All the elements that make up the analysis models have been defined using terminology that is independent of design paradigms, and those definitions have been formalised using the fundamental concepts of set theory: elements, sets and correspondences between sets. In addition, a formal language has been defined to represent the elements of the analysis models, avoiding mathematical notation as far as possible, complemented by a graphical notation that can visually represent the most relevant parts of the models. The proposed method was subjected to an intensive experimentation phase, during which it was applied to 13 case studies, all of them real projects that concluded in products transferred to public or private organisations. The experimentation evaluated the adequacy of SETCM for analysing problems of different sizes and for systems whose final design used different, and even mixed, paradigms. Its use by analysts with different levels of experience (novice, intermediate and expert) was also evaluated, analysing the learning curve in every case, in order to determine whether the method is easy to learn regardless of prior knowledge of other analysis techniques. The capacity to extend models produced with SETCM was studied as well, to check whether the method supports projects carried out in several phases in which the analysis of one phase extends the analysis of the previous one. In short, the aim was to evaluate whether SETCM can be adopted within an organisation as the preferred analysis technique for software development. The results of this experimentation have been very positive, with a high degree of fulfilment of all the goals set when the method was defined.
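Purely as an illustration of the set-theoretic building blocks mentioned above (elements, sets and correspondences between sets), the toy sketch below represents a fragment of a hypothetical analysis model in Python; it is not SETCM's actual language or notation.

```python
# Illustrative only: a toy rendering of elements, sets and correspondences
# between sets. The domain (customers, orders) is invented for the example.

customers = {"c1", "c2", "c3"}            # a set of problem-domain elements
orders = {"o1", "o2", "o3", "o4"}         # another set of elements

# A correspondence between the two sets, modelled as a set of ordered pairs
places = {("c1", "o1"), ("c1", "o2"), ("c2", "o3"), ("c3", "o4")}

def is_correspondence(relation, domain, codomain):
    """Check that the relation only pairs elements of the two given sets."""
    return all(x in domain and y in codomain for x, y in relation)

def is_total_and_functional(relation, domain):
    """Every domain element appears exactly once (a total function)."""
    sources = [x for x, _ in relation]
    return set(sources) == domain and len(sources) == len(set(sources))

assert is_correspondence(places, customers, orders)
print(is_total_and_functional(places, customers))  # False: c1 places two orders
```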

Relevance:

80.00%

Publisher:

Abstract:

Residential tourism in Valle de Bravo (State of Mexico) emerged with the consolidation of its dam, setting off a process of transformation that has been not only scenic but also territorial and socioeconomic, and that has paved the way for this destination to become one of the most important tourist sites in the State of Mexico. The aim of this article is to provide a territorial and urban characterisation of the stages Valle de Bravo has gone through as a result of the proliferation of residences, from its beginnings as a tourist site to the present day, and to analyse its likely growth trend over the coming years. The analysis applies the Tourist Life Cycle model, using various quantitative and qualitative variables to illustrate how Valle de Bravo has evolved. The analysis shows that the area is approaching its carrying-capacity limits, which implies a greater impact on the zone, mainly on its most important resources: the physical space and its high scenic and natural value.

Relevance:

80.00%

Publisher:

Abstract:

The standard paradigm that the Paleocene/Eocene thermal maximum (PETM) represents a threshold event intrinsic to Earth's climate and connected in some way with long-term warming has influenced interpretations of the geochemical, climate, and biological perturbations that occurred at this event. As recent high-resolution data have demonstrated that the onset of the event was geologically instantaneous, attempts to account for the event solely through endogenous mechanisms have become increasingly strained. The rapid onset of the event indicates that it was triggered by a catastrophic event which we suggest was most likely a bolide impact. We discuss features of the PETM that require explanation and argue that mechanisms that have previously been proposed either cannot explain all of these features or would require some sort of high-energy trigger. A bolide impact could provide such a trigger and, in the event of a comet impact, could contribute directly to the shape of the carbon isotope curve. We introduce a carbon cycle model that would explain the PETM by global warming following a bolide impact, leading to the oxidation of terrestrial organic carbon stores built up during the late Paleocene. Our intention is to encourage other researchers to seriously consider an impact trigger for the PETM, especially in the absence of plausible alternative mechanisms.
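The scale of the proposed terrestrial organic carbon source can be checked with a standard two-end-member isotope mass balance for a carbon isotope excursion. The reservoir size, excursion magnitude and source signature below are illustrative placeholders, not values taken from the paper.

```python
# Back-of-the-envelope isotope mass balance for a carbon isotope excursion
# (CIE): how much carbon with a given δ13C must be added to shift a
# well-mixed reservoir by a given amount. All numbers are illustrative.

def carbon_needed(m_reservoir, d_initial, d_final, d_source):
    """Mass of added carbon (same units as m_reservoir) from a two-member mix."""
    return m_reservoir * (d_initial - d_final) / (d_final - d_source)

m_exogenic = 40000.0   # GtC, rough ocean-atmosphere-biosphere carbon pool
d_initial = 0.0        # per mil, pre-event mean δ13C of the pool
d_excursion = -3.0     # per mil, magnitude of the negative excursion
d_terrestrial = -25.0  # per mil, oxidised terrestrial organic carbon

print(carbon_needed(m_exogenic, d_initial, d_initial + d_excursion, d_terrestrial))
```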

Relevance:

80.00%

Publisher:

Abstract:

The traditional waterfall software life cycle model has several weaknesses. One problem is that a working version of a system is unavailable until a late stage in the development; any omissions and mistakes in the specification that remain undetected until that stage can be costly to rectify. The operational approach, which emphasises the construction of executable specifications, can help to remedy this problem. An operational specification may be exercised to generate the behaviours of the specified system, thereby serving as a prototype to facilitate early validation of the system's functional requirements. Recent ideas have centred on using an existing operational method, such as JSD, in the specification phase of object-oriented development. An explicit transformation phase following specification is necessary in this approach because differences in abstractions between the two domains need to be bridged. This research explores an alternative approach: developing an operational specification method specifically for object-oriented development. By incorporating object-oriented concepts in operational specifications, the specifications have the advantage of directly facilitating implementation in an object-oriented language without requiring further significant transformations. In addition, object-oriented concepts can help the developer manage the complexity of the problem-domain specification, whilst providing the user with a specification that closely reflects the real world, so that the specification and its execution can be readily understood and validated. A graphical notation has been developed for the specification method which can capture the dynamic properties of an object-oriented system. A tool has also been implemented, comprising an editor to facilitate the input of specifications and an interpreter which can execute the specifications and graphically animate the behaviours of the specified systems.

Relevance:

80.00%

Publisher:

Abstract:

The present scarcity of operational knowledge-based systems (KBS) has been attributed, in part, to the inadequate consideration given to user interface design during development. From a human factors perspective the problem has stemmed from an overall lack of user-centred design principles. Consequently, the integration of human factors principles and techniques is seen as a necessary and important precursor to ensuring the implementation of KBS which are useful to, and usable by, the end-users for whom they are intended. Focussing upon KBS work taking place within commercial and industrial environments, this research set out to assess both the extent to which human factors support was presently being utilised within development, and the future path for human factors integration. The assessment consisted of interviews conducted with a number of commercial and industrial organisations involved in KBS development, and a set of three detailed case studies of individual KBS projects. Two of the studies were carried out within a collaborative Alvey project involving the Interdisciplinary Higher Degrees Scheme (IHD) at the University of Aston in Birmingham, BIS Applied Systems Ltd (BIS), and the British Steel Corporation. This project, which had provided the initial basis and funding for the research, was concerned with the application of KBS to the design of commercial data processing (DP) systems. The third study stemmed from involvement in a KBS project being carried out by the Technology Division of the Trustee Savings Bank Group plc. The preliminary research highlighted poor human factors integration. In particular, there was a lack of early consideration of end-user requirements definition and user-centred evaluation; concentration was given instead to the construction of the knowledge base and to prototype evaluation with the expert(s). In response to this identified problem, a set of methods was developed that aimed to encourage developers to consider user interface requirements early on in a project. These methods were then applied in the two further projects, and their uptake within the overall development process was monitored. Experience from the two studies demonstrated that early consideration of user interface requirements was both feasible and instructive for guiding future development work. In particular, it was shown that a user interface prototype could be used as a basis for capturing requirements at the functional (task) level and at the interface dialogue level. Extrapolating from this experience, a KBS life-cycle model is proposed which incorporates user interface design (and, within that, user evaluation) as a largely parallel, rather than subsequent, activity to knowledge base construction. Further to this, there is a discussion of several key elements which can be seen as inhibiting the integration of human factors within KBS development. These elements stem from characteristics of present KBS development practice, from constraints within the commercial and industrial development environments, and from the state of existing human factors support.

Relevance:

80.00%

Publisher:

Abstract:

We compare a compilation of 220 sediment core δ13C data from the glacial Atlantic Ocean with three-dimensional ocean circulation simulations including a marine carbon cycle model. The carbon cycle model employs circulation fields which were derived from previous climate simulations. All sediment data have been thoroughly quality controlled, focusing on epibenthic foraminiferal species (such as Cibicidoides wuellerstorfi or Planulina ariminensis) to improve the comparability of model and sediment core carbon isotopes. The model captures the general δ13C pattern indicated by present-day water column data and Late Holocene sediment cores but underestimates intermediate and deep water values in the South Atlantic. The best agreement with glacial reconstructions is obtained for a model scenario with an altered freshwater balance in the Southern Ocean that mimics enhanced northward sea ice export and melting away from the zone of sea ice production. This results in a shoaled and weakened North Atlantic Deep Water flow and intensified Antarctic Bottom Water export, hence confirming previous reconstructions from paleoproxy records. Moreover, the modeled abyssal ocean is very cold and very saline, which is in line with other proxy data evidence.
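A minimal sketch of the model-data comparison step (sampling a modelled δ13C field at core positions and computing misfit statistics) is given below; the grid, the synthetic field and the core values are placeholders rather than the 220-core compilation or the simulations used here.

```python
import numpy as np

# Sketch of the model-data comparison step: sample a modelled δ13C field at
# sediment-core positions and compute summary misfit statistics. The grid,
# the synthetic field and the core values are placeholders.

lat = np.linspace(-89.5, 89.5, 180)             # deg N
depth = np.linspace(100.0, 5000.0, 50)          # m
# toy zonal-mean field: δ13C decreasing with depth and toward the south
model_d13c = (0.8 - 0.3e-3 * depth[None, :]) + 0.4 * (lat[:, None] / 90.0)

cores = [                                       # (lat, depth, observed δ13C)
    (-45.0, 3500.0, -0.4),
    (10.0, 2000.0, 0.6),
    (55.0, 3000.0, 0.9),
]

sampled, observed = [], []
for core_lat, core_depth, core_d13c in cores:
    i = np.abs(lat - core_lat).argmin()         # nearest grid latitude
    j = np.abs(depth - core_depth).argmin()     # nearest grid depth
    sampled.append(model_d13c[i, j])
    observed.append(core_d13c)

sampled, observed = np.array(sampled), np.array(observed)
print("bias:", (sampled - observed).mean(),
      "rmse:", np.sqrt(((sampled - observed) ** 2).mean()))
```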

Relevance:

40.00%

Publisher:

Abstract:

We use a dynamic factor model to provide a semi-structural representation for 101 quarterly US macroeconomic series. We find that (i) the US economy is well described by a small number of structural shocks, between two and six. Focusing on the four-shock specification, we identify, using sign restrictions, two non-policy shocks, demand and supply, and two policy shocks, monetary and fiscal. We obtain the following results. (ii) Both supply and demand shocks are important sources of fluctuations; supply prevails for GDP, while demand prevails for employment and inflation. (iii) Policy matters: both monetary and fiscal policy shocks have sizeable effects on output and prices, with little evidence of crowding out; both monetary and fiscal authorities implement important systematic countercyclical policies reacting to demand shocks. (iv) Negative demand shocks have a large long-run positive effect on productivity, consistent with the Schumpeterian "cleansing" view of recessions.
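The sign-restriction step can be illustrated with a toy two-variable example: draw random orthogonal rotations of a reduced-form impact matrix and keep those whose impact responses match the assumed signs (demand raises output and prices, supply raises output and lowers prices). The covariance matrix and restrictions are invented for the sketch; the paper applies the idea within a dynamic factor model for 101 series.

```python
import numpy as np

# Minimal sketch of sign-restriction identification on impact responses.
# Everything here is a toy two-variable example, not the paper's model.

rng = np.random.default_rng(1)

# Toy reduced-form innovation covariance for two variables: (GDP, prices)
sigma = np.array([[1.0, 0.3],
                  [0.3, 0.5]])
chol = np.linalg.cholesky(sigma)

def random_orthogonal(n):
    """Draw an orthogonal matrix via QR decomposition of a Gaussian matrix."""
    q, r = np.linalg.qr(rng.normal(size=(n, n)))
    return q * np.sign(np.diag(r))               # fix column signs

accepted = []
for _ in range(10000):
    impact = chol @ random_orthogonal(2)          # candidate structural impact matrix
    demand, supply = impact[:, 0], impact[:, 1]
    # demand shock: GDP up, prices up; supply shock: GDP up, prices down
    if demand[0] > 0 and demand[1] > 0 and supply[0] > 0 and supply[1] < 0:
        accepted.append(impact)

print(len(accepted), "draws satisfy the sign restrictions")
```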