990 results for Last planner system


Relevance: 30.00%

Publisher:

Abstract:

The mechanisms involved in Atlantic meridional overturning circulation (AMOC) decadal variability and predictability over the last 50 years are analysed in the IPSL–CM5A–LR model using historical and initialised simulations. The initialisation procedure only uses nudging towards sea surface temperature anomalies with a physically based restoring coefficient. When compared to two independent AMOC reconstructions, both the historical and nudged ensemble simulations exhibit skill at reproducing AMOC variations from 1977 onwards, and in particular two maxima occurring around 1978 and 1997. We argue that one source of skill is related to the large Mount Agung volcanic eruption starting in 1963, which reset an internal 20-year variability cycle in the North Atlantic in the model. This cycle involves the East Greenland Current intensity and the advection of active tracers along the subpolar gyre, which leads to an AMOC maximum around 15 years after the Mount Agung eruption. The 1997 maximum occurs approximately 20 years after the former one. The nudged simulations reproduce this second maximum better than the historical simulations. This is due to the initialisation of a cooling of the convection sites in the 1980s under the effect of a persistent positive phase of the North Atlantic Oscillation (NAO), a feature not captured in the historical simulations. Hence we argue that the 20-year cycle excited by the 1963 Mount Agung eruption and the NAO forcing both contributed to the 1990s AMOC maximum. These results support the existence of a 20-year cycle in the North Atlantic in the observations. Hindcasts following the CMIP5 protocol are launched from a nudged simulation every 5 years for the 1960–2005 period. They exhibit significant correlation skill scores against an independent reconstruction of the AMOC from the 4-year lead-time average onwards. This encouraging result is accompanied by increased correlation skill in reproducing the observed 2-m air temperature in the regions bordering the North Atlantic, as compared to non-initialised simulations. To a lesser extent, predicted precipitation tends to correlate with the nudged simulation in the tropical Atlantic. We argue that this skill is due to the initialisation and predictability of the AMOC in the present prediction system. The mechanisms evidenced here support the idea of volcanic eruptions as a pacemaker for internal variability of the AMOC. Together with the existence of a 20-year cycle in the North Atlantic, they offer a novel and complementary explanation for the AMOC variations over the last 50 years.
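
The initialisation described above nudges the model towards observed sea surface temperature anomalies through a restoring heat flux. As a minimal sketch of the general form of such an anomaly-nudging term (the exact formulation and the value of the restoring coefficient used in IPSL–CM5A–LR are not given in this abstract and are assumptions here):

\[
Q_{\mathrm{nudge}} = -\,\gamma \left( \mathrm{SSTA}_{\mathrm{model}} - \mathrm{SSTA}_{\mathrm{obs}} \right),
\]

where \(Q_{\mathrm{nudge}}\) is an additional surface heat flux (W m\(^{-2}\)), \(\gamma\) is the physically based restoring coefficient (W m\(^{-2}\) K\(^{-1}\)) and SSTA is the sea surface temperature anomaly relative to each dataset's own climatology. Nudging anomalies rather than full fields is commonly chosen so that the observed climatology is not imprinted onto the model's own mean state.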

Relevance: 30.00%

Publisher:

Abstract:

As laid out in its convention, ECMWF has eight different objectives. One of the major objectives will consist of the regular preparation of the data necessary for medium-range weather forecasts. The interpretation of this item is that the Centre will make forecasts once a day for a prediction period of up to 10 days. It is also evident that the Centre should not carry out any real weather forecasting but merely disseminate to the Member Countries the basic forecasting parameters with an appropriate resolution in space and time. It follows from this that, from the operational point of view, the forecasting system at the Centre must be functionally integrated with the Weather Services of the Member Countries. The operational interface between ECMWF and the Member Countries must be properly specified in order to provide reasonable flexibility for both systems. The problem of making numerical atmospheric predictions for periods beyond 4-5 days differs substantially from 2-3 day forecasting. From a physical point of view, we can define a medium-range forecast as a forecast in which the initial disturbances have lost their individual structure. However, we are still interested in predicting the atmosphere in a similar way as in short-range forecasting, which means that the model must be able to predict the dissipation and decay of the initial phenomena and the creation of new ones. With this definition, medium-range forecasting is indeed very difficult and is generally regarded as more difficult than extended forecasting, where we usually predict only time and space mean values. The predictability of atmospheric flow has been extensively studied in recent years in theoretical investigations and by numerical experiments. As has been discussed elsewhere in this publication (see pp. 338 and 431), a 10-day forecast is apparently on the fringe of predictability.

Relevance: 30.00%

Publisher:

Abstract:

Both historical and idealized climate model experiments are performed with a variety of Earth system models of intermediate complexity (EMICs) as part of a community contribution to the Intergovernmental Panel on Climate Change Fifth Assessment Report. Historical simulations start at 850 CE and continue through to 2005. The standard simulations include changes in forcing from solar luminosity, Earth's orbital configuration, CO2, additional greenhouse gases, land use, and sulphate and volcanic aerosols. In spite of very different modelled pre-industrial global surface air temperatures, overall 20th century trends in surface air temperature and carbon uptake are reasonably well simulated when compared to observed trends. Land carbon fluxes show much more variation between models than ocean carbon fluxes, and recent land fluxes appear to be slightly underestimated. It is possible that recent modelled climate trends or climate–carbon feedbacks are overestimated resulting in too much land carbon loss or that carbon uptake due to CO2 and/or nitrogen fertilization is underestimated. Several one thousand year long, idealized, 2 × and 4 × CO2 experiments are used to quantify standard model characteristics, including transient and equilibrium climate sensitivities, and climate–carbon feedbacks. The values from EMICs generally fall within the range given by general circulation models. Seven additional historical simulations, each including a single specified forcing, are used to assess the contributions of different climate forcings to the overall climate and carbon cycle response. The response of surface air temperature is the linear sum of the individual forcings, while the carbon cycle response shows a non-linear interaction between land-use change and CO2 forcings for some models. Finally, the preindustrial portions of the last millennium simulations are used to assess historical model carbon-climate feedbacks. Given the specified forcing, there is a tendency for the EMICs to underestimate the drop in surface air temperature and CO2 between the Medieval Climate Anomaly and the Little Ice Age estimated from palaeoclimate reconstructions. This in turn could be a result of unforced variability within the climate system, uncertainty in the reconstructions of temperature and CO2, errors in the reconstructions of forcing used to drive the models, or the incomplete representation of certain processes within the models. Given the forcing datasets used in this study, the models calculate significant land-use emissions over the pre-industrial period. This implies that land-use emissions might need to be taken into account, when making estimates of climate–carbon feedbacks from palaeoclimate reconstructions.
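
The additivity result above can be stated compactly; the expression below merely restates the abstract's claim for the seven single-forcing simulations and is not taken from the underlying paper:

\[
\Delta T_{\mathrm{all}}(t) \approx \sum_{i=1}^{7} \Delta T_{i}(t),
\]

where \(\Delta T_{i}\) is the surface air temperature response to forcing \(i\) alone and \(\Delta T_{\mathrm{all}}\) is the response when all forcings are applied together. The corresponding decomposition for the carbon cycle fails for some models because land-use change and CO2 forcing interact non-linearly.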

Relevance: 30.00%

Publisher:

Abstract:

Analyses of simulations of the last glacial maximum (LGM) made with 17 atmospheric general circulation models (AGCMs) participating in the Paleoclimate Modelling Intercomparison Project, and a high-resolution (T106) version of one of the models (CCSR1), show that changes in the elevation of tropical snowlines (as estimated by the depression of the maximum altitude of the 0 °C isotherm) are primarily controlled by changes in sea-surface temperatures (SSTs). The correlation between the two variables, averaged for the tropics as a whole, is 95%, and remains >80% even at a regional scale. The reduction of tropical SSTs at the LGM results in a drier atmosphere and hence steeper lapse rates. Changes in atmospheric circulation patterns, particularly the weakening of the Asian monsoon system and related atmospheric humidity changes, amplify the reduction in snowline elevation in the northern tropics. Colder conditions over the tropical oceans combined with a weakened Asian monsoon could produce snowline lowering of up to 1000 m in certain regions, comparable to the changes shown by observations. Nevertheless, such large changes are not typical of all regions of the tropics. Analysis of the higher resolution CCSR1 simulation shows that differences between the free atmospheric and along-slope lapse rate can be large, and may provide an additional factor to explain regional variations in observed snowline changes.
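
The snowline proxy used above is the maximum altitude of the 0 °C isotherm. The sketch below shows one way such an altitude could be estimated from a vertical temperature profile by linear interpolation between levels; the profile values, the uniform 3 °C cooling and the interpolation scheme are illustrative assumptions, not the AGCM diagnostics used in the study.

```python
import numpy as np

def freezing_level_height(z, t):
    """Estimate the altitude (m) where temperature first drops below 0 degC,
    scanning upward and interpolating linearly between levels."""
    for k in range(len(z) - 1):
        if t[k] >= 0.0 and t[k + 1] < 0.0:
            frac = t[k] / (t[k] - t[k + 1])          # fraction of the layer below 0 degC
            return z[k] + frac * (z[k + 1] - z[k])   # linear interpolation within the layer
    return np.nan  # profile never crosses 0 degC

# Hypothetical tropical profile: altitudes (m) and temperatures (degC)
z = np.array([0, 1000, 2000, 3000, 4000, 5000, 6000])
t = np.array([26.0, 20.5, 15.0, 9.5, 4.0, -1.5, -7.0])

print(freezing_level_height(z, t))        # ~4730 m for this profile
print(freezing_level_height(z, t - 3.0))  # same profile cooled uniformly by 3 degC
```

With this profile the freezing level sits near 4730 m, and a uniform 3 °C cooling lowers it by roughly 550 m, consistent with the ~5.5 °C per km lapse rate the profile implies; steeper glacial lapse rates or monsoon-related humidity changes, as discussed above, would modify this simple estimate regionally.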

Relevance: 30.00%

Publisher:

Abstract:

The National Center for Atmospheric Research Community Climate System Model (NCAR-CCSM) is used in a coupled atmosphere–ocean–sea-ice simulation of the Last Glacial Maximum (LGM, around 21,000 years ago) climate. In the tropics, the simulation shows a moderate cooling of 3 °C over land and 2 °C in the ocean in zonal average. This is about 1 °C cooler than the CLIMAP sea surface temperatures (SSTs) but consistent with recent estimates of both land and sea surface temperature changes. Subtropical waters are cooled by 2–2.5 °C, also in agreement with recent estimates. The simulated oceanic thermohaline circulation at the LGM is not only shallower but also weaker than the modern circulation, with a migration of the deep-water formation site in the North Atlantic, as suggested by the paleoceanographic evidence. The simulated northward flow of Antarctic Bottom Water (AABW) is enhanced. These deep circulation changes are attributable to the increased surface density flux in the Southern Ocean caused by sea-ice expansion at the LGM. Both the Gulf Stream and the Kuroshio are intensified due to the overall increase of wind stress over the subtropical oceans. The intensified zonal wind stress and the southward shift of its maximum in the Southern Ocean enhance the transport of the Antarctic Circumpolar Current (ACC) by more than 50%. Simulated SSTs are lowered by up to 8 °C in the midlatitudes. Simulated conditions in the North Atlantic are warmer, with less sea ice, than indicated by CLIMAP, again in agreement with more recent estimates. The increased meridional SST gradient at the LGM results in an enhanced Hadley Circulation and increased midlatitude storm-track precipitation. The increased baroclinic storm activity also intensifies the meridional atmospheric heat transport. A sensitivity experiment shows that about half of the simulated tropical cooling at the LGM originates from reduced atmospheric concentrations of greenhouse gases.

Relevance: 30.00%

Publisher:

Abstract:

Natural mineral aerosol (dust) is an active component of the climate system and plays multiple roles in mediating physical and biogeochemical exchanges between the atmosphere, land surface and ocean. Changes in the amount of dust in the atmosphere are caused both by changes in climate (precipitation, wind strength, regional moisture balance) and by changes in the extent of dust sources caused by either anthropogenic or climatically induced changes in vegetation cover. Models of the global dust cycle take into account the physical controls on dust deflation from prescribed source areas (based largely on soil wetness and vegetation cover thresholds), dust transport within the atmospheric column, and dust deposition through sedimentation and scavenging by precipitation. These models successfully reproduce the first-order spatial and temporal patterns in atmospheric dust loading under modern conditions. Atmospheric dust loading was as much as an order of magnitude larger than today during the last glacial maximum (LGM). While the observed increase in emissions from northern Africa can be explained solely in terms of climate changes (colder, drier and windier glacial climates), increased emissions from other regions appear to have been largely a response to climatically induced changes in vegetation cover and hence in the extent of dust source areas. Model experiments suggest that the increased dust loading in tropical regions had an effect on radiative forcing comparable to that of low glacial CO2 levels. Changes in land use are already increasing the dust loading of the atmosphere. However, simulations show that anthropogenically forced climate changes substantially reduce the extent and productivity of natural dust sources. Positive feedbacks on both radiative forcing and atmospheric CO2, initiated by a reduction of dust emissions from natural source areas, could substantially mitigate the impacts of land-use changes, and need to be considered in climate change assessments.
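
The deflation controls listed above (soil wetness and vegetation cover thresholds plus a wind threshold) can be illustrated with a simple emission rule. The threshold values, the cubic-like wind dependence and the tuning constant below are common ingredients of dust-emission schemes but are assumptions made here for illustration, not the parameters of any model used in the study.

```python
def dust_emission_flux(wind_10m, soil_wetness, veg_cover,
                       u_threshold=6.5,      # m/s: hypothetical threshold wind speed
                       wet_threshold=0.15,   # volumetric soil wetness above which deflation stops
                       veg_threshold=0.25,   # fractional vegetation cover above which deflation stops
                       c_emit=1.0e-9):       # hypothetical tuning constant, kg m-2 s-1 per (m/s)^3
    """Return a schematic dust emission flux (kg m-2 s-1).

    Emission occurs only where the surface is dry and sparsely vegetated and
    the wind exceeds a threshold; above the threshold the flux grows roughly
    with the cube of the wind speed."""
    if soil_wetness > wet_threshold or veg_cover > veg_threshold:
        return 0.0   # surface protected by moisture or vegetation: no deflation
    if wind_10m <= u_threshold:
        return 0.0   # wind too weak to initiate deflation
    return c_emit * wind_10m ** 2 * (wind_10m - u_threshold)

# Dry, sparsely vegetated surface under a 9 m/s wind
print(dust_emission_flux(wind_10m=9.0, soil_wetness=0.05, veg_cover=0.10))
```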

Relevance: 30.00%

Publisher:

Abstract:

∆14Catm has been estimated as 420 ± 80‰ (IntCal09) during the Last Glacial Maximum (LGM) compared to preindustrial times (0‰), but the mechanisms explaining this difference are not yet resolved. ∆14Catm is a function of both cosmogenic production in the upper atmosphere and carbon cycling and partitioning in the Earth system. 10Be-based reconstructions show a contribution of the cosmogenic production term of only 200 ± 200‰ at the LGM. The remaining 220‰ must therefore be explained by changes in the carbon cycle. Recently, Bouttes et al. (2010, 2011) proposed that most of the difference in pCO2atm and δ13C between glacial and interglacial times results from brine-induced ocean stratification in the Southern Ocean. This mechanism involves the formation of very saline water masses that contribute to high carbon storage in the deep ocean. During glacial times, the sinking of brines is enhanced and more carbon is stored in the deep ocean, lowering pCO2atm. Moreover, the sinking of brines induces increased stratification in the Southern Ocean, which keeps the deep ocean well isolated from the surface. Such an isolated ocean reservoir would be characterized by a low ∆14C signature. Evidence of such 14C-depleted deep waters during the LGM has recently been found in the Southern Ocean (Skinner et al. 2010). The degassing of this carbon with low ∆14C would then reduce ∆14Catm throughout the deglaciation. We have further developed the CLIMBER-2 model to include cosmogenic production of 14C as well as an interactive atmospheric 14C reservoir. We investigate the role of both the sinking of brines and cosmogenic production, alongside iron fertilization mechanisms, in explaining changes in ∆14Catm during the last deglaciation. In our simulations, not only is the sinking-of-brines mechanism consistent with past ∆14C data, but it also explains most of the differences in pCO2atm and ∆14Catm between the LGM and preindustrial times. Finally, to our knowledge this study is the first in which a model experiment explains glacial–interglacial differences in pCO2atm, δ13C and ∆14C together with a coherent LGM climate.
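
The partitioning above amounts to simple budget arithmetic; the combined uncertainty below assumes the two estimates are independent, an assumption added here for illustration rather than taken from the paper:

\[
\Delta^{14}\mathrm{C}_{\text{carbon cycle}} = \Delta^{14}\mathrm{C}_{\mathrm{atm}} - \Delta^{14}\mathrm{C}_{\mathrm{prod}} = 420\text{‰} - 200\text{‰} = 220\text{‰},
\]

with a combined uncertainty of roughly \(\sqrt{80^{2} + 200^{2}} \approx 215\text{‰}\), so the carbon-cycle term is required by the budget but only loosely constrained by it, which is why mechanisms such as the sinking of brines are tested in the model.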

Relevance: 30.00%

Publisher:

Abstract:

This paper explores the social theories implicit in system dynamics (SD) practice. Groupings of SD practice are observed in different parts of a framework for studying social theories. Most are seen to be located within ‘functionalist sociology’. To account for the remainder, two new forms of practice are discussed, each related to a different paradigm. Three competing conclusions are then offered: 1. The implicit assumption that SD is grounded in functionalist sociology is correct and should be made explicit. 2. Forrester's ideas operate at the level of method, not social theory, so SD, though not wedded to a particular social theoretic paradigm, can be re-crafted for use within different paradigms. 3. SD is consistent with social theories which dissolve the individual/society divide by taking a dialectical, or feedback, stance. It can therefore bring a formal modelling approach to the ‘agency/structure’ debate within social theory and so bring SD into the heart of social science. The last conclusion is strongly recommended.

Relevance: 30.00%

Publisher:

Abstract:

Over the last decade the English planning system has placed greater emphasis on the financial viability of development. ‘Calculative’ practices have been used to quantify and capture land value uplifts. Development viability appraisal (DVA) has become a key part of the evidence base used in planning decision-making and informs both ‘site-specific’ negotiations about the level of land value capture for individual schemes and ‘area-wide’ planning policy formation. This paper investigates how implementation of DVA is governed in planning policy formation. It is argued that the increased use of DVA raises important questions about how planning decisions are made and operationalised, not least because DVA is often poorly understood by some key stakeholders. The paper uses the concept of governance to thematically analyse semi-structured interviews conducted with the producers of DVAs and considers key procedural issues including (in)consistencies in appraisal practices, levels of stakeholder consultation and the potential for client and producer bias. Whilst stakeholder consultation is shown to be integral to the appraisal process in order to improve the quality of the appraisals and to legitimise the outputs, participation is restricted to industry experts and excludes some interest groups, including local communities. It is concluded that, largely because of its recent adoption and knowledge asymmetries between local planning authorities and appraisers, DVA is a weakly governed process characterised by emerging and contested guidance and is therefore ‘up for grabs’.

Relevance: 30.00%

Publisher:

Abstract:

For over three decades, negotiated planning obligations have been the primary form of land value capture in England. A significant policy innovation, diffusing and evolving over the last decade, has been the use of financial calculations to estimate the extent to which policies on planning obligations, both for actual proposed development projects and in plan making, affect the financial viability of development. This paper assesses the extent to which the use of financial appraisals has provided a robust, just and practical procedure to support land value capture. It is concluded that development viability appraisals are saturated with intrinsic uncertainty and that land value capture based on such calculations is, to some extent, capricious. Further failings include clear incentives for developers and land owners to bias viability calculations, the economic dependence of many viability consultants on developers and land owners, a lack of transparency, contested or ambiguous guidance, and the opportunities for bias created by input uncertainty. It is argued that how viability calculations are applied has been, is being and will continue to be shaped by power relations.
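
The financial calculations referred to above are typically residual appraisals: a scheme is considered viable if its value, net of costs, required profit and the planning obligations sought, leaves at least a benchmark land value. The sketch below is a generic, heavily simplified illustration of that logic; all figures, names and the flat profit assumption are hypothetical and are not drawn from the paper or from any official guidance.

```python
def is_viable(gross_development_value, build_costs, fees,
              planning_obligations, profit_margin, benchmark_land_value):
    """Schematic residual viability test.

    Residual land value = value of the completed scheme minus all costs,
    the developer's required profit and the planning obligations sought.
    The scheme is 'viable' if the residual meets the benchmark land value."""
    profit = profit_margin * gross_development_value
    residual_land_value = (gross_development_value - build_costs - fees
                           - planning_obligations - profit)
    return residual_land_value >= benchmark_land_value, residual_land_value

# Hypothetical scheme (all figures invented for illustration)
viable, rlv = is_viable(gross_development_value=10_000_000,
                        build_costs=6_000_000,
                        fees=600_000,
                        planning_obligations=1_000_000,
                        profit_margin=0.20,           # 20% of GDV
                        benchmark_land_value=500_000)
print(viable, rlv)  # False 400000.0: the obligations sought push the residual below the benchmark
```

Because modest changes to any one input flip the outcome, the sketch also illustrates why such appraisals are sensitive to contestable assumptions and open to the biases discussed above.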

Relevance: 30.00%

Publisher:

Abstract:

This paper presents the results of a new investigation of the Guarani Aquifer System (SAG) in Sao Paulo state. New data were acquired on the sedimentary framework, flow pattern and hydrogeochemistry. The flow direction in the north of the state is towards the southwest and not towards the west as previously expected. This is linked to the absence of SAG outcrop in the northeast of Sao Paulo state. Both the underlying Piramboia Formation and the overlying Botucatu Formation possess high porosity (18.9% and 19.5%, respectively), which was not modified significantly by diagenetic changes. Investigation of the sediments confirmed a zone of chalcedony cement close to the SAG outcrop and a zone of calcite cement in the deep confined zone. The main events in the SAG post-sedimentary history were: (1) adhesion of ferruginous coatings on grains, (2) infiltration of clays in the eodiagenetic stage, (3) regeneration of coatings with formation of smectites, (4) authigenic overgrowth of quartz and K-feldspar in the advanced eodiagenetic stage, (5) bitumen cementation of the Piramboia Formation in the mesodiagenetic stage, (6) cementation by calcite in the mesodiagenetic and telodiagenetic stages in the Piramboia Formation, (7) formation of secondary porosity by dissolution of unstable minerals after the appearance of a hydraulic gradient and penetration of meteoric water caused by the uplift of the Serra do Mar coastal range in the Late Cretaceous, (8) authigenesis of kaolinite and amorphous silica in the unconfined zone of the SAG, and cation exchange coupled with the dissolution of calcite at the transition between the unconfined and confined zones, and (9) authigenesis of analcime in the confined SAG zone. The last two processes are still in operation. The deep zone of the SAG comprises an alkaline-pH, Na-HCO3 groundwater type with old water and enriched δ13C values (<-3.9), which evolved from a neutral-pH, Ca-HCO3 groundwater type with young water and depleted δ13C values (>-18.8) close to the SAG outcrop. This is consistent with a conceptual geochemical model of the SAG, suggesting dissolution of calcite driven by cation exchange, which occurs at a relatively narrow front that is moving downgradient at a much slower rate than the groundwater flow. More depleted δ18O values in the deep confined zone close to the Parana River, compared with values in relatively recently recharged water, indicate that recharge occurred during a period of cold climate. The SAG is a ‘storage-dominated’ type of aquifer, which has to be managed properly to avoid overexploitation.
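
The conceptual model invoked above, dissolution of calcite driven by cation exchange, can be summarised by two coupled textbook reactions; they are added here for illustration and are not reproduced from the paper:

\[
\mathrm{CaCO_{3}} + \mathrm{CO_{2}} + \mathrm{H_{2}O} \rightarrow \mathrm{Ca^{2+}} + 2\,\mathrm{HCO_{3}^{-}},
\qquad
\mathrm{Ca^{2+}} + 2\,\mathrm{NaX} \rightarrow \mathrm{CaX_{2}} + 2\,\mathrm{Na^{+}},
\]

where X denotes an exchange site on the aquifer clays. Removal of dissolved Ca2+ by exchange keeps the water undersaturated with respect to calcite, sustaining dissolution at the front and producing the alkaline Na-HCO3 water type observed in the confined zone.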

Relevance: 30.00%

Publisher:

Abstract:

Math teachers everywhere agree: the way to learn math is to do math. Effective homework is a key to a successful math course. With this goal in mind, a group of math professors at BCC spent the last year working with the online homework system WeBWork. Our intention is to expand our current implementation, with the hope of working across campuses. We will discuss the advantages of WeBWork and how we might work collaboratively across CUNY.

Relevance: 30.00%

Publisher:

Abstract:

This report proposes a simple monitoring and evaluation guideline for PV-diesel hybrid systems. It offers system users a way to better understand whether their system is being operated in a way that will make it last, and gives suggestions on how to act if there are signs of unfavourable use or failure. Applying the guide requires little technical equipment, but daily manual measurements; for the most part, it can be managed with pen and paper by people with no prior experience of power systems. The guide is structured and expressed in a way that targets PV-diesel hybrid system users with no, or limited, prior experience of power engineering. It is less detailed in terms of the motivations for certain choices and their limitations, but rich in detail concerning calculations, evaluation procedures and maintenance routines. A more scientific description of the guide can be found in a related journal article.
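
The abstract does not reproduce the guide's calculations, so the sketch below is only a hypothetical example of the kind of daily bookkeeping such a guideline might ask of a user: record a few meter readings each day and flag values that suggest unfavourable use. All variable names and thresholds are invented for illustration and are not taken from the report.

```python
def evaluate_day(pv_kwh, genset_hours, battery_voltage_morning,
                 low_voltage_limit=47.0,   # V: hypothetical limit for a 48 V battery bank
                 max_genset_hours=6.0):    # h/day: hypothetical acceptable diesel runtime
    """Return a list of warnings for one day of manual readings."""
    warnings = []
    if battery_voltage_morning < low_voltage_limit:
        warnings.append("Battery voltage low in the morning: check loads and charging.")
    if genset_hours > max_genset_hours:
        warnings.append("Diesel generator ran longer than expected: check PV production.")
    if pv_kwh == 0.0:
        warnings.append("No PV energy recorded: check breakers, fuses and the meter.")
    return warnings

# One day of hypothetical readings
print(evaluate_day(pv_kwh=12.4, genset_hours=7.5, battery_voltage_morning=46.5))
```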

Relevance: 30.00%

Publisher:

Abstract:

In the UK, urban river basins are particularly vulnerable to flash floods due to short and intense rainfall. This paper presents potential flood resilience approaches for the highly urbanised Wortley Beck river basin, south-west of Leeds city centre. The reach of Wortley Beck is approximately 6 km long, with a contributing catchment area of 30 km² that drains into the River Aire. Lower Wortley has experienced regular flooding over the last few years from a range of sources, including Wortley Beck and surface and ground water, affecting properties both upstream and downstream of Farnley Lake as well as Wortley Ring Road. This has serious implications for society, the environment and economic activity in the City of Leeds. The first stage of the study involves systematically incorporating Wortley Beck's landscape features on an ArcGIS platform to identify existing green features in the region. This process also enables the exploration of potential blue-green features (green spaces, green roofs, water retention ponds and swales) at appropriate locations, connecting them with existing green corridors to maximise their productivity. The next stage involves developing a detailed 2D urban flood inundation model for the Wortley Beck region using the CityCat model. CityCat is capable of modelling the effects of permeable/impermeable ground surfaces and buildings/roofs to generate flood depth and velocity maps at 1 m resolution for design storm events. The final stage of the study involves simulating a range of rainfall and flood event scenarios through the CityCat model with different blue-green features. Installation of other hard-engineering individual property protection measures, through water butts and flood walls, is also incorporated in the CityCat model. This enables an integrated sustainable flood resilience strategy for the region.