Abstract:
This article examines the potential to improve numerical weather prediction (NWP) by estimating upper and lower bounds on predictability, revisiting the original study of Lorenz (1982) but applying it to the most recent version of the European Centre for Medium-Range Weather Forecasts (ECMWF) forecast system, for both the deterministic and ensemble prediction systems (EPS). These bounds are contrasted with an older version of the same NWP system to see how they have changed as the system improved. The computations were performed for the earlier seasons of DJF 1985/1986 and JJA 1986 and the later seasons of DJF 2010/2011 and JJA 2011, using the 500-hPa geopotential height field. Results indicate that for this field we may be approaching the limit of deterministic forecasting, so that further improvements might only be obtained by improving the initial state. The results also show that predictability calculations with earlier versions of the model may overestimate potential forecast skill, both because of insufficient internal variability in the earlier model and because recent versions of the model represent the true atmospheric evolution more realistically. The same methodology is applied to the EPS to calculate upper and lower bounds of predictability of the ensemble-mean forecast, in order to explore how ensemble forecasting could extend the limits of the deterministic forecast. The results show that there is large potential to improve the ensemble predictions, but the increased predictability of the ensemble mean comes with a trade-off in information, as the forecasts become increasingly smoothed with time. From around the 10-d forecast time, the ensemble mean begins to converge towards climatology. Until this point, the ensemble mean is able to predict the main features of the large-scale flow accurately and with high consistency from one forecast cycle to the next.
By the 15-d forecast time, the ensemble mean has lost information with the anomaly of the flow strongly smoothed out. In contrast, the control forecast is much less consistent from run to run, but provides more detailed (unsmoothed) but less useful information.
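The Lorenz (1982) methodology referenced in this abstract rests on fitting an error-growth law to differences between forecasts verifying at the same time. As a rough sketch of the idea (not the paper's actual computation), the logistic growth law commonly used in such predictability studies can be coded as follows; all parameter values here are illustrative assumptions, not ECMWF results:

```python
import numpy as np

def logistic_error(t, e0, a, e_inf):
    """Analytic solution of the error-growth law dE/dt = a*E*(1 - E/E_inf):
    small errors grow exponentially at rate a, then saturate at e_inf."""
    t = np.asarray(t, dtype=float)
    return e_inf / (1.0 + (e_inf / e0 - 1.0) * np.exp(-a * t))

days = np.arange(0, 15)                    # forecast lead time (days)
e0, a, e_inf = 8.0, 0.45, 110.0            # assumed initial error, growth rate, saturation (m)
errors = logistic_error(days, e0, a, e_inf)

# Implied doubling time of small errors
doubling_time = np.log(2) / a
```

Upper and lower predictability bounds can then be read off by integrating such a law from, respectively, a perfect-model error estimate and the actual initial-condition error.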
Abstract:
Both historical and idealized climate model experiments are performed with a variety of Earth system models of intermediate complexity (EMICs) as part of a community contribution to the Intergovernmental Panel on Climate Change Fifth Assessment Report. Historical simulations start at 850 CE and continue through to 2005. The standard simulations include changes in forcing from solar luminosity, Earth's orbital configuration, CO2, additional greenhouse gases, land use, and sulphate and volcanic aerosols. In spite of very different modelled pre-industrial global surface air temperatures, overall 20th century trends in surface air temperature and carbon uptake are reasonably well simulated when compared to observed trends. Land carbon fluxes show much more variation between models than ocean carbon fluxes, and recent land fluxes appear to be slightly underestimated. It is possible that recent modelled climate trends or climate–carbon feedbacks are overestimated resulting in too much land carbon loss or that carbon uptake due to CO2 and/or nitrogen fertilization is underestimated. Several one thousand year long, idealized, 2 × and 4 × CO2 experiments are used to quantify standard model characteristics, including transient and equilibrium climate sensitivities, and climate–carbon feedbacks. The values from EMICs generally fall within the range given by general circulation models. Seven additional historical simulations, each including a single specified forcing, are used to assess the contributions of different climate forcings to the overall climate and carbon cycle response. The response of surface air temperature is the linear sum of the individual forcings, while the carbon cycle response shows a non-linear interaction between land-use change and CO2 forcings for some models. Finally, the preindustrial portions of the last millennium simulations are used to assess historical model carbon-climate feedbacks. 
Given the specified forcing, there is a tendency for the EMICs to underestimate the drop in surface air temperature and CO2 between the Medieval Climate Anomaly and the Little Ice Age estimated from palaeoclimate reconstructions. This in turn could be a result of unforced variability within the climate system, uncertainty in the reconstructions of temperature and CO2, errors in the reconstructions of forcing used to drive the models, or the incomplete representation of certain processes within the models. Given the forcing datasets used in this study, the models calculate significant land-use emissions over the pre-industrial period. This implies that land-use emissions might need to be taken into account when making estimates of climate–carbon feedbacks from palaeoclimate reconstructions.
Abstract:
A new record of sea surface temperature (SST) for climate applications is described. This record provides independent corroboration of global variations estimated from SST measurements made in situ. Infrared imagery from Along-Track Scanning Radiometers (ATSRs) is used to create a 20 year time series of SST at 0.1° latitude-longitude resolution, in the ATSR Reprocessing for Climate (ARC) project. A very high degree of independence of in situ measurements is achieved via physics-based techniques. Skin SST and SST estimated for 20 cm depth are provided, with grid cell uncertainty estimates. Comparison with in situ data sets establishes that ARC SSTs generally have a bias of order 0.1 K or smaller. The precision of the ARC SSTs is 0.14 K during 2003 to 2009, from three-way error analysis. Over the period 1994 to 2010, ARC SSTs are stable, with better than 95% confidence, to within 0.005 K yr−1 (demonstrated for tropical regions). The data set appears useful for cleanly quantifying interannual variability in SST and major SST anomalies. The ARC SST global anomaly time series is compared to the in situ-based Hadley Centre SST data set version 3 (HadSST3). Within known uncertainties in bias adjustments applied to in situ measurements, the independent ARC record and HadSST3 present the same variations in global marine temperature since 1996. Since the in situ observing system evolved significantly in its mix of measurement platforms and techniques over this period, ARC SSTs provide an important corroboration that HadSST3 accurately represents recent variability and change in this essential climate variable.
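The "three-way error analysis" behind the quoted 0.14 K precision is the standard triple-collocation estimate: given three independent measurements of the same quantity, the variances of the pairwise differences determine each instrument's error variance. A self-contained sketch on synthetic numbers (not ARC or in situ data):

```python
import numpy as np

rng = np.random.default_rng(0)
truth = rng.normal(290.0, 1.0, 100_000)            # synthetic "true" SST values (K)
sat_a = truth + rng.normal(0.0, 0.14, truth.size)  # record A with 0.14 K noise (assumed)
sat_b = truth + rng.normal(0.0, 0.20, truth.size)  # record B
insitu = truth + rng.normal(0.0, 0.25, truth.size) # in situ-like record

def threeway_sigma(x, y, z):
    """Error standard deviation of x, assuming mutually independent errors."""
    vxy = np.var(x - y)   # = sigma_x^2 + sigma_y^2
    vxz = np.var(x - z)   # = sigma_x^2 + sigma_z^2
    vyz = np.var(y - z)   # = sigma_y^2 + sigma_z^2
    return np.sqrt(0.5 * (vxy + vxz - vyz))

sigma_a = threeway_sigma(sat_a, sat_b, insitu)  # recovers ~0.14 K
```

The cancellation in `0.5 * (vxy + vxz - vyz)` is what removes the other two records' error variances, which is why no record needs to be treated as truth.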
Abstract:
Sea surface temperature (SST) measurements are required by operational ocean and atmospheric forecasting systems to constrain modeled upper ocean circulation and thermal structure. The Global Ocean Data Assimilation Experiment (GODAE) High Resolution SST Pilot Project (GHRSST-PP) was initiated to address these needs by coordinating the provision of accurate, high-resolution, SST products for the global domain. The pilot project is now complete, but activities continue within the Group for High Resolution SST (GHRSST). The pilot project focused on harmonizing diverse satellite and in situ data streams that were indexed, processed, quality controlled, analyzed, and documented within a Regional/Global Task Sharing (R/GTS) framework implemented in an internationally distributed manner. Data with meaningful error estimates developed within GHRSST are provided by services within R/GTS. Currently, several terabytes of data are processed at international centers daily, creating more than 25 gigabytes of product. Ensemble SST analyses together with anomaly SST outputs are generated each day, providing confidence in SST analyses via diagnostic outputs. Diagnostic data sets are generated and Web interfaces are provided to monitor the quality of observation and analysis products. GHRSST research and development projects continue to tackle problems of instrument calibration, algorithm development, diurnal variability, skin temperature deviation, and validation/verification of GHRSST products. GHRSST also works closely with applications and users, providing a forum for discussion and feedback between SST users and producers on a regular basis. All data within the GHRSST R/GTS framework are freely available. This paper reviews the progress of GHRSST-PP, highlighting achievements that have been fundamental to the success of the pilot project.
Abstract:
To study the transient atmospheric response to midlatitude SST anomalies, a three-layer quasigeostrophic (QG) model coupled to a slab oceanic mixed layer in the North Atlantic is used. As diagnosed from a coupled run in perpetual winter conditions, the first two modes of SST variability are linked to the model North Atlantic Oscillation (NAO) and eastern Atlantic pattern (EAP), respectively, the dominant atmospheric modes in the Atlantic sector. The two SST anomaly patterns are then prescribed as fixed anomalous boundary conditions for the model atmosphere, and its transient responses are established from a large ensemble of simulations. In both cases, the tendency of the air–sea heat fluxes to damp the SST anomalies results in an anomalous diabatic heating of the atmosphere that, in turn, forces a baroclinic response, as predicted by linear theory. This initial response rapidly modifies the transient eddy activity and thus the convergence of eddy momentum and heat fluxes. The latter transforms the baroclinic response into a growing barotropic one that resembles the atmospheric mode that had created the SST anomaly in the coupled run and is thus associated with a positive feedback. The total adjustment time is as long as 3–4 months for the NAO-like response and 1–2 months for the EAP-like one. The positive feedback, in both cases, is dependent on the polarity of the SST anomaly, but is stronger in the NAO case, thereby contributing to its predominance at low frequency in the coupled system. However, the feedback is too weak to lead to an instability of the atmospheric modes and primarily results in an increase of their amplitude and persistence and a weakening of the heat flux damping of the SST anomaly.
Abstract:
The RAPID-MOCHA array has observed the Atlantic Meridional Overturning Circulation (AMOC) at 26.5°N since 2004. During 2009/10, there was a transient 30% weakening of the AMOC driven by anomalies in geostrophic and Ekman transports. Here, we use simulations based on the Met Office Forecast Ocean Assimilation Model (FOAM) to diagnose the relative importance of atmospheric forcings and internal ocean dynamics in driving the anomalous geostrophic circulation of 2009/10. Data-assimilating experiments with FOAM accurately reproduce the mean strength and depth of the AMOC at 26.5°N. In addition, agreement between simulated and observed stream functions in the deep ocean is improved when we calculate the AMOC using a method that approximates the RAPID observations. The main features of the geostrophic circulation anomaly are captured by an ensemble of simulations without data assimilation. These model results suggest that the atmosphere played a dominant role in driving recent interannual variability of the AMOC.
Abstract:
The global mean temperature in 2008 was slightly cooler than that in 2007; however, it still ranks within the 10 warmest years on record. Annual mean temperatures were generally well above average in South America, northern and southern Africa, Iceland, Europe, Russia, South Asia, and Australia. In contrast, an exceptional cold outbreak occurred during January across Eurasia and over southern European Russia and southern western Siberia. There has been a general increase in land-surface temperatures and in permafrost temperatures during the last several decades throughout the Arctic region, including increases of 1° to 2°C in the last 30 to 35 years in Russia. Record setting warm summer (JJA) air temperatures were observed throughout Greenland. The year 2008 was also characterized by heavy precipitation in a number of regions of northern South America, Africa, and South Asia. In contrast, a prolonged and intense drought occurred during most of 2008 in northern Argentina, Paraguay, Uruguay, and southern Brazil, causing severe impacts to agriculture and affecting many communities. The year began with a strong La Niña episode that ended in June. Eastward surface current anomalies in the tropical Pacific Ocean in early 2008 played a major role in adjusting the basin from strong La Niña conditions to ENSO-neutral conditions by July–August, followed by a return to La Niña conditions late in December. The La Niña conditions resulted in far-reaching anomalies such as a cooling in the central tropical Pacific, Arctic Ocean, and the regions extending from the Gulf of Alaska to the west coast of North America; changes in the sea surface salinity and heat content anomalies in the tropics; and total column water vapor, cloud cover, tropospheric temperature, and precipitation patterns typical of a La Niña. 
Anomalously salty ocean surface salinity values in climatologically drier locations and anomalously fresh values in rainier locations observed in recent years generally persisted in 2008, suggesting an intensification of the hydrological cycle. The 2008 Atlantic hurricane season was the 14th busiest on record and the only season ever recorded with major hurricanes each month from July through November. Conversely, activity in the northwest Pacific was considerably below normal during 2008. While activity in the north Indian Ocean was only slightly above average, the season was punctuated by Cyclone Nargis, which killed over 145,000 people; in addition, it was the seventh-strongest cyclone ever in the basin and the most devastating to hit Asia since 1991. Greenhouse gas concentrations continued to rise, with CO2 increasing by more than expected based on the 1979 to 2007 trend. In the oceans, the global mean CO2 uptake for 2007 is estimated to be 1.67 Pg-C, about 0.07 Pg-C lower than the long-term average, making it the third-largest anomaly determined with this method since 1983, with the largest uptake of carbon over the past decade coming from the eastern Indian Ocean. Global phytoplankton chlorophyll concentrations were slightly elevated in 2008 relative to 2007, but regional changes were substantial (ranging to about 50%) and followed long-term patterns of net decreases in chlorophyll with increasing sea surface temperature. Ozone-depleting gas concentrations continued to fall globally to about 4% below the peak levels of the 2000–02 period. Total column ozone concentrations remain well below pre-1980 levels, and the 2008 ozone hole was unusually large (sixth worst on record) and persistent, with low ozone values extending into late December. In fact, the polar vortex in 2008 persisted longer than in any previous year since 1979.
Northern Hemisphere snow cover extent for the year was well below average due in large part to the record-low snow cover extent in March, despite the record-maximum coverage in January and the shortest snow cover duration on record (which started in 1966) in the North American Arctic. Limited preliminary data imply that in 2008 glaciers continued to lose mass, and full data for 2007 show it was the 17th consecutive year of loss. The northern region of Greenland and adjacent areas of Arctic Canada experienced a particularly intense melt season, even though there was an abnormally cold winter across Greenland's southern half. One of the most dramatic signals of the general warming trend was the continued significant reduction in the extent of the summer sea-ice cover and, importantly, the decrease in the amount of relatively older, thicker ice. The extent of the 2008 summer sea-ice cover was the second-lowest value of the satellite record (which started in 1979) and 36% below the 1979–2000 average. Significant losses in the mass of ice sheets and the area of ice shelves continued, with several fjords on the northern coast of Ellesmere Island being ice free for the first time in 3,000–5,500 years. In Antarctica, the positive phase of the SAM led to record-high total sea ice extent for much of early 2008 through enhanced equatorward Ekman transport. With colder continental temperatures at this time, the 2007–08 austral summer snowmelt season was dramatically weakened, making it the second-shortest melt season since 1978 (when the record began). There was strong warming and increased precipitation along the Antarctic Peninsula and west Antarctica in 2008, and also pockets of warming along coastal east Antarctica, in concert with continued declines in sea-ice concentration in the Amundsen/Bellingshausen Seas. One significant event indicative of this warming was the disintegration and retreat of the Wilkins Ice Shelf in the southwest peninsula area of Antarctica.
Abstract:
We first propose a simple task for eliciting attitudes toward risky choice, the SGG lottery-panel task, which consists of a series of lotteries constructed to compensate riskier options with higher risk-return trade-offs. Using the Principal Component Analysis (PCA) technique, we show that the SGG lottery-panel task is capable of capturing two dimensions of individual risky decision making, i.e. subjects' average risk taking and their sensitivity towards variations in risk-return. From the results of a large experimental dataset, we confirm that the task systematically captures a number of regularities, such as: a tendency toward risk-averse behavior (only around 10% of choices are compatible with risk neutrality); an attraction to certain payoffs compared to low-risk lotteries, compatible with the over- (under-) weighting of small (large) probabilities predicted in prospect theory (PT); and gender differences, i.e. males being consistently less risk averse than females, but both genders being similarly responsive to increases in the risk premium. Another interesting result is that in hypothetical choices most individuals increase their risk taking in response to an increase in the return to risk, as predicted by PT, while across panels with real rewards we see even more changes, but opposite to the expected pattern of riskier choices for higher risk-returns. Therefore, we conclude from our data that an "economic anomaly" emerges in the real-reward choices, opposite to the hypothetical choices. These findings are in line with Camerer's (1995) view that although in many domains paid subjects probably do exert extra mental effort which improves their performance, choice over money gambles is not likely to be a domain in which effort will improve adherence to rational axioms (p. 635).
Finally, we demonstrate that both dimensions of risk attitudes, average risk taking and sensitivity towards variations in the return to risk, are desirable not only to describe behavior under risk but also to explain behavior in other contexts, as illustrated by an example. In the second study, we propose three additional treatments intended to elicit risk attitudes under high-stakes and mixed-outcome (gains and losses) lotteries. Using a dataset obtained from a hypothetical implementation of the tasks, we show that the new treatments are able to capture both dimensions of risk attitudes. This new dataset allows us to describe several regularities, both at the aggregate and at the within-subjects level. We find that in every treatment over 70% of choices show some degree of risk aversion and only between 0.6% and 15.3% of individuals are consistently risk neutral within the same treatment. We also confirm the existence of gender differences in the degree of risk taking, that is, in all treatments females prefer safer lotteries than males do. Regarding our second dimension of risk attitudes, we observe, in all treatments, an increase in risk taking in response to risk premium increases. Treatment comparisons reveal other regularities, such as a lower degree of risk taking in large-stake treatments compared to low-stake treatments, and a lower degree of risk taking when losses are incorporated into the large-stake lotteries. These results are compatible with previous findings in the literature on stake-size effects (e.g., Binswanger, 1980; Bosch-Domènech & Silvestre, 1999; Hogarth & Einhorn, 1990; Holt & Laury, 2002; Kachelmeier & Shehata, 1992; Kühberger et al., 1999; Weber & Chapman, 2005; Wik et al., 2007) and domain effects (e.g., Brooks & Zank, 2005; Schoemaker, 1990; Wik et al., 2007). For small-stake treatments, however, we find that the effect of incorporating losses into the outcomes is not so clear.
At the aggregate level an increase in risk taking is observed, but also more dispersion in the choices, whilst at the within-subjects level the effect weakens. Finally, regarding responses to the risk premium, we find that, compared to gains-only treatments, sensitivity is lower in the mixed-lottery treatments (SL and LL). In general, sensitivity to risk-return is more affected by the domain than by the stake size. Having described the properties of risk attitudes as captured by the SGG risk-elicitation task and its three new versions, it is important to recall that the danger of using unidimensional descriptions of risk attitudes goes beyond their incompatibility with modern economic theories such as PT and CPT, all of which call for tests with multiple degrees of freedom. Faithful to this recommendation, the contribution of this essay is an empirically and endogenously determined bi-dimensional specification of risk attitudes, useful for describing behavior under uncertainty and for explaining behavior in other contexts. Hopefully, this will contribute to the creation of large datasets containing a multidimensional description of individual risk attitudes, while providing a robust framework compatible with present and even more complex future descriptions of human attitudes towards risk.
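The two-dimensional structure described above (average risk taking plus sensitivity to the risk premium) is exactly what PCA on a subject-by-panel choice matrix can recover. A toy sketch with simulated choices; the panel design and trait model are illustrative assumptions, not the SGG data:

```python
import numpy as np

rng = np.random.default_rng(1)
n_subjects, n_panels = 200, 6

# Hypothetical traits: an average risk-taking level per subject and a
# sensitivity to each panel's risk premium.
avg_risk = rng.normal(0.0, 1.0, n_subjects)
sensitivity = rng.normal(0.0, 1.0, n_subjects)
premium = np.linspace(-1.0, 1.0, n_panels)   # standardised panel risk premiums

# Simulated choice index per subject and panel (higher = riskier choice)
choices = (avg_risk[:, None]
           + sensitivity[:, None] * premium[None, :]
           + rng.normal(0.0, 0.3, (n_subjects, n_panels)))

# PCA via SVD of the column-centred choice matrix
centred = choices - choices.mean(axis=0)
u, s, vt = np.linalg.svd(centred, full_matrices=False)
explained = s**2 / np.sum(s**2)   # variance share of each component
```

With this construction the first two components absorb almost all of the variance, mirroring the claim that two dimensions suffice to describe behavior in the task.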
Abstract:
In projections of twenty-first century climate, Arctic sea ice declines and at the same time exhibits strong interannual anomalies. Here, we investigate the potential to predict these strong sea-ice anomalies under a perfect-model assumption, using the Max-Planck-Institute Earth System Model in the same setup as in the Coupled Model Intercomparison Project Phase 5 (CMIP5). We study two cases of strong negative sea-ice anomalies: a 5-year-long anomaly for present-day conditions, and a 10-year-long anomaly for conditions projected for the middle of the twenty-first century. We treat these anomalies in the CMIP5 projections as the truth, and use exactly the same model configuration for predictions of this synthetic truth. We start ensemble predictions at different times during the anomalies, considering lagged-perfect and sea-ice-assimilated initial conditions. We find that the onset and amplitude of the interannual anomalies are not predictable. However, the further deepening of the anomaly can be predicted for typically 1 year lead time if predictions start after the onset but before the maximal amplitude of the anomaly. The magnitude of an extremely low summer sea-ice minimum is hard to predict: the skill of the prediction ensemble is not better than a damped-persistence forecast for lead times of more than a few months, and is not better than a climatology forecast for lead times of two or more years. Predictions of the present-day anomaly are more skillful than predictions of the mid-century anomaly. Predictions using sea-ice-assimilated initial conditions are competitive with those using lagged-perfect initial conditions for lead times of a year or less, but yield degraded skill for longer lead times. The results presented here suggest that there is limited prospect of predicting the large interannual sea-ice anomalies expected to occur throughout the twenty-first century.
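The damped-persistence reference forecast against which the prediction ensemble is judged simply relaxes the initial anomaly toward climatology. A minimal sketch, with an assumed e-folding time and anomaly size rather than values fitted to MPI-ESM output:

```python
import numpy as np

def damped_persistence(anomaly0, lead_months, efold_months=12.0):
    """Forecast A(t) = A(0) * exp(-t / tau): persistence damped toward
    the climatological mean (zero anomaly)."""
    t = np.asarray(lead_months, dtype=float)
    return anomaly0 * np.exp(-t / efold_months)

leads = np.arange(0, 25)                    # lead time (months)
forecast = damped_persistence(-1.5, leads)  # e.g. a -1.5 million km^2 ice anomaly
```

A prediction system beats this baseline only while its ensemble-mean error stays below that of the damped curve, which the abstract reports holds for at most a few months for the extreme summer minimum.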
Abstract:
This paper examines the evidence for a day-of-the-week effect in five Southeast Asian stock markets: South Korea, Malaysia, the Philippines, Taiwan and Thailand. Findings indicate significant seasonality for three of the five markets. Market risk, proxied by the return on the FTA World Price Index, is not sufficient to explain this calendar anomaly. Although an extension of the risk-return equation to incorporate interactive seasonal dummy variables can explain some significant day-of-the-week effects, market risk alone appears insufficient to characterize this phenomenon.
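The seasonal-dummy approach mentioned above boils down to regressing daily returns on weekday dummies. A stylised sketch on simulated returns (the weekday effects and data are invented, not estimates for these markets):

```python
import numpy as np

rng = np.random.default_rng(2)
n_weeks = 500
days = np.tile(np.arange(5), n_weeks)                 # 0=Mon .. 4=Fri
true_means = np.array([-0.05, 0.0, 0.0, 0.0, 0.08])  # assumed daily mean returns (%)
returns = true_means[days] + rng.normal(0.0, 1.0, days.size)

# One dummy per weekday and no intercept, so each OLS coefficient is
# simply that weekday's mean return.
X = np.eye(5)[days]
coef, *_ = np.linalg.lstsq(X, returns, rcond=None)
```

A day-of-the-week effect then shows up as a significant spread across the five coefficients; the paper's risk-adjusted version additionally includes the market return and its weekday interactions as regressors.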
Abstract:
In the 1960s North Atlantic sea surface temperatures (SST) cooled rapidly. The magnitude of the cooling was largest in the North Atlantic subpolar gyre (SPG), and was coincident with a rapid freshening of the SPG. Here we analyze hindcasts of the 1960s North Atlantic cooling made with the UK Met Office's decadal prediction system (DePreSys), which is initialised using observations. It is shown that DePreSys captures, with a lead time of several years, the observed cooling and freshening of the North Atlantic SPG. DePreSys also captures changes in SST over the wider North Atlantic and surface climate impacts over the wider region, such as changes in atmospheric circulation in winter and sea ice extent. We show that initialisation of an anomalously weak Atlantic Meridional Overturning Circulation (AMOC), and hence weak northward heat transport, is crucial for DePreSys to predict the magnitude of the observed cooling. Such an anomalously weak AMOC is not captured when ocean observations are not assimilated (i.e. it is not a forced response in this model). The freshening of the SPG is also dominated by ocean salt transport changes in DePreSys; in particular, the simulation of advective freshwater anomalies analogous to the Great Salinity Anomaly was key. Therefore, DePreSys suggests that ocean dynamics played an important role in the cooling of the North Atlantic in the 1960s, and that this event was predictable.
Abstract:
In the 1960s and early 1970s sea surface temperatures in the North Atlantic Ocean cooled rapidly. There is still considerable uncertainty about the causes of this event, although various mechanisms have been proposed. In this observational study it is demonstrated that the cooling proceeded in several distinct stages. Cool anomalies initially appeared in the mid-1960s in the Nordic Seas and Gulf Stream Extension, before spreading to cover most of the Subpolar Gyre. Subsequently, cool anomalies spread into the tropical North Atlantic before retreating, in the late 1970s, back to the Subpolar Gyre. There is strong evidence that changes in atmospheric circulation, linked to a southward shift of the Atlantic ITCZ, played an important role in the event, particularly in the period 1972-76. Theories for the cooling event must account for its distinctive space-time evolution. Our analysis suggests that the most likely drivers were: 1) The “Great Salinity Anomaly” of the late 1960s; 2) An earlier warming of the subpolar North Atlantic, which may have led to a slow-down in the Atlantic Meridional Overturning Circulation; 3) An increase in anthropogenic sulphur dioxide emissions. Determining the relative importance of these factors is a key area for future work.
Abstract:
The transient atmospheric response to interactive SST anomalies in the midlatitudes is investigated using a three-layer QG model coupled in perpetual winter conditions to a slab oceanic mixed layer in the North Atlantic. The SST anomalies are diagnosed from a coupled run and prescribed as initial conditions, but are free to evolve. The initial evolution of the atmospheric response is similar to that obtained with a prescribed SST anomaly, starting as a quasi-linear baroclinic and then quickly evolving into a growing equivalent barotropic one. Because of the heat flux damping, the SST anomaly amplitude slowly decreases, albeit with little change in pattern. Correspondingly, the atmospheric response only increases until it reaches a maximum amplitude after about 1–3.5 months, depending on the SST anomaly considered. The response is similar to that at equilibrium in the fixed SST case, but it is 1.5–2 times smaller, and then slowly decays away.
Abstract:
Time series of global and regional mean Surface Air Temperature (SAT) anomalies are a common metric used to estimate recent climate change. Various techniques can be used to create these time series from meteorological station data. The degree of difference arising from using five different techniques, based on existing temperature anomaly dataset techniques, to estimate Arctic SAT anomalies over land and sea ice was investigated using reanalysis data as a testbed. Techniques which interpolated anomalies were found to result in smaller errors than non-interpolating techniques relative to the reanalysis reference. Kriging techniques provided the smallest errors in estimates of Arctic anomalies, and Simple Kriging was often the best kriging method in this study, especially over sea ice. A linear interpolation technique had, on average, Root Mean Square Errors (RMSEs) up to 0.55 K larger than the two kriging techniques tested. Non-interpolating techniques provided the least representative anomaly estimates. Nonetheless, they serve as useful checks for confirming whether estimates from interpolating techniques are reasonable. The interaction of meteorological station coverage with estimation techniques between 1850 and 2011 was simulated using an ensemble dataset comprising repeated individual years (1979-2011). All techniques were found to have larger RMSEs for earlier station coverages. This supports calls for increased data sharing and data rescue, especially in sparsely observed regions such as the Arctic.
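The testbed comparison described above can be miniaturised in one dimension: reconstruct a known "reference" anomaly field from sparse station values with an interpolating and a non-interpolating technique, then score each by RMSE. Everything below is synthetic, and kriging is replaced by plain linear interpolation for brevity:

```python
import numpy as np

rng = np.random.default_rng(3)
x_full = np.linspace(0.0, 10.0, 201)
truth = np.sin(x_full)                          # synthetic "reanalysis" anomaly field

stations = np.sort(rng.uniform(0.0, 10.0, 25))  # sparse station locations
obs = np.interp(stations, x_full, truth)        # error-free station observations

# Interpolating technique: linear interpolation between stations
est_interp = np.interp(x_full, stations, obs)
# Non-interpolating technique: one station average applied everywhere
est_mean = np.full_like(x_full, obs.mean())

def rmse(est, ref):
    return float(np.sqrt(np.mean((est - ref) ** 2)))

rmse_interp = rmse(est_interp, truth)
rmse_mean = rmse(est_mean, truth)   # interpolation should score better
```

Repeating such a scoring exercise with the historical station masks of each year is, in outline, how the interaction of coverage and technique was assessed.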
Abstract:
Previous studies documented that a distinct southward shift of central-Pacific low-level wind anomalies occurring during the ENSO decaying phase is caused by an interaction between the Western Pacific annual cycle and El Niño-Southern Oscillation (ENSO) variability. The present study finds that the meridional movement of the central-Pacific wind anomalies appears only during traditional Eastern-Pacific (EP) El Niño events, and not in Central-Pacific (CP) El Niño events, in which sea surface temperature (SST) anomalies are confined to the central Pacific. The zonal structure of ENSO-related SST anomalies therefore has an important effect on meridional asymmetry in the associated atmospheric response and its modulation by the annual cycle. In contrast to EP El Niño events, the SST anomalies of CP El Niño events extend further west, towards the warm pool region with its climatologically warm SSTs. In the warm pool region, relatively small SST anomalies are thus able to excite convection anomalies on both sides of the equator, even with a meridionally asymmetric SST background state. Therefore, almost meridionally symmetric precipitation and wind anomalies are observed over the central Pacific during the decaying phase of CP El Niño events. The SST anomaly pattern of La Niña events is similar to that of CP El Niño events, with reversed sign. Accordingly, no distinct southward displacement of the atmospheric response occurs over the central Pacific during the La Niña decaying phase. These results have important implications for ENSO climate impacts over East Asia, since the anomalous low-level anticyclone over the western North Pacific is an integral part of the annual cycle-modulated ENSO response.