288 results for Geomagnetic Storm
Abstract:
Historic geomagnetic activity observations have been used to reveal centennial variations in the open solar flux and the near-Earth heliospheric conditions (the interplanetary magnetic field and the solar wind speed). The various methods are in very good agreement for the past 135 years when there were sufficient reliable magnetic observatories in operation to eliminate problems due to site-specific errors and calibration drifts. This review underlines the physical principles that allow these reconstructions to be made, as well as the details of the various algorithms employed and the results obtained. Discussion is included of: the importance of the averaging timescale; the key differences between “range” and “interdiurnal variability” geomagnetic data; the need to distinguish source field sector structure from heliospherically-imposed field structure; the importance of ensuring that regressions used are statistically robust; and uncertainty analysis. The reconstructions are exceedingly useful as they provide calibration between the in-situ spacecraft measurements from the past five decades and the millennial records of heliospheric behaviour deduced from measured abundances of cosmogenic radionuclides found in terrestrial reservoirs. Continuity of open solar flux, using sunspot number to quantify the emergence rate, is the basis of a number of models that have been very successful in reproducing the variation derived from geomagnetic activity. These models allow us to extend the reconstructions back to before the development of the magnetometer and to cover the Maunder minimum. Allied to the radionuclide data, the models are revealing much about how the Sun and heliosphere behaved outside of grand solar maxima and are providing a means of predicting how solar activity is likely to evolve now that the recent grand maximum (that had prevailed throughout the space age) has come to an end.
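The continuity-of-open-solar-flux idea described above can be sketched numerically. This is a hedged illustration only: the flux evolves as d(OSF)/dt = emergence − loss, with an emergence rate scaled from the sunspot number R and a loss rate proportional to the current flux. The function name, coefficients, units, and time step are all illustrative assumptions, not the calibrated values of any published model.

```python
# Toy continuity model: dF/dt = s*R - F/tau, integrated by forward Euler.
# All parameter values are illustrative, not calibrated.
def evolve_osf(sunspot_numbers, osf0=0.0, source_per_spot=0.01,
               loss_timescale=30.0, dt=1.0):
    """Return the open-solar-flux history driven by a sunspot-number series."""
    osf, history = osf0, []
    for r in sunspot_numbers:
        emergence = source_per_spot * r        # source scaled from sunspot number
        loss = osf / loss_timescale            # loss proportional to current flux
        osf += dt * (emergence - loss)
        history.append(osf)
    return history
```

With a constant sunspot number the flux relaxes toward the equilibrium value `source_per_spot * R * loss_timescale`, which is why such models can track the geomagnetic reconstructions once the emergence rate is tied to the sunspot record.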
Abstract:
The ability to predict times of greater galactic cosmic ray (GCR) fluxes is important for reducing the hazards caused by these particles to satellite communications, aviation, or astronauts. The 11-year solar-cycle variation in cosmic rays is highly correlated with the strength of the heliospheric magnetic field. Differences in GCR flux during alternate solar cycles yield a 22-year cycle, known as the Hale Cycle, which is thought to be due to different particle drift patterns when the northern solar pole has predominantly positive (denoted as qA>0 cycle) or negative (qA<0) polarities. This results in the onset of the peak cosmic-ray flux at Earth occurring earlier during qA>0 cycles than for qA<0 cycles, which in turn causes the peak to be more dome-shaped for qA>0 and more sharply peaked for qA<0. In this study, we demonstrate that properties of the large-scale heliospheric magnetic field are different during the declining phase of the qA<0 and qA>0 solar cycles, when the difference in GCR flux is most apparent. This suggests that particle drifts may not be the sole mechanism responsible for the Hale Cycle in GCR flux at Earth. However, we also demonstrate that these polarity-dependent heliospheric differences are evident during the space-age but are much less clear in earlier data: using geomagnetic reconstructions, we show that for the period of 1905 - 1965, alternate polarities do not give as significant a difference during the declining phase of the solar cycle. Thus we suggest that the 22-year cycle in cosmic-ray flux is at least partly the result of direct modulation by the heliospheric magnetic field and that this effect may be primarily limited to the grand solar maximum of the space-age.
Abstract:
Geomagnetic activity has long been known to exhibit approximately 27-day periodicity, resulting from solar wind structures repeating each solar rotation. Thus a very simple near-Earth solar wind forecast is 27-day persistence, wherein the near-Earth solar wind conditions today are assumed to be identical to those 27 days previously. Effective use of such a persistence model as a forecast tool, however, requires the performance and uncertainty to be fully characterized. The first half of this study determines which solar wind parameters can be reliably forecast by persistence and how the forecast skill varies with the solar cycle. The second half of the study shows how persistence can provide a useful benchmark for more sophisticated forecast schemes, namely physics-based numerical models. Point-by-point assessment methods, such as correlation and mean-square error, find persistence skill comparable to numerical models during solar minimum, despite the 27-day lead time of persistence forecasts, versus 2–5 days for numerical schemes. At solar maximum, however, the dynamic nature of the corona means 27-day persistence is no longer a good approximation and skill scores suggest persistence is out-performed by numerical models for almost all solar wind parameters. But point-by-point assessment techniques are not always a reliable indicator of usefulness as a forecast tool. An event-based assessment method, which focuses on key solar wind structures, finds persistence to be the most valuable forecast throughout the solar cycle. This reiterates the fact that the means of assessing the “best” forecast model must be specifically tailored to its intended use.
Abstract:
The distribution of dust in the ecliptic plane between 0.96 and 1.04 au has been inferred from impacts on the two Solar Terrestrial Relations Observatory (STEREO) spacecraft through observation of secondary particle trails and unexpected off-points in the heliospheric imager (HI) cameras. This study made use of analysis carried out by members of a distributed web-based citizen science project Solar Stormwatch. A comparison between observations of the brightest particle trails and a survey of fainter trails shows consistent distributions. While there is no obvious correlation between this distribution and the occurrence of individual meteor streams at Earth, there are some broad longitudinal features in these distributions that are also observed in sources of the sporadic meteor population. The different position of the HI instrument on the two STEREO spacecraft leads to each sampling different populations of dust particles. The asymmetry in the number of trails seen by each spacecraft and the fact that there are many more unexpected off-points in the HI-B than in HI-A indicates that the majority of impacts are coming from the apex direction. For impacts causing off-points in the HI-B camera, these dust particles are estimated to have masses in excess of 10⁻¹⁷ kg with radii exceeding 0.1 μm. For off-points observed in the HI-A images, which can only have been caused by particles travelling from the anti-apex direction, the distribution is consistent with that of secondary ‘storm’ trails observed by HI-B, providing evidence that these trails also result from impacts with primary particles from an anti-apex source. Investigating the mass distribution for the off-points of both HI-A and HI-B, it is apparent that the differential mass index of particles from the apex direction (causing off-points in HI-B) is consistently above 2. This indicates that the majority of the mass is within the smaller particles of this population. In contrast, the differential mass index of particles from the anti-apex direction (causing off-points in HI-A) is consistently below 2, indicating that the majority of the mass is to be found in larger particles of this distribution.
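The mass-index claim above has a simple quantitative basis that can be checked numerically. For a differential number distribution dN/dm proportional to m**(-s), the mass carried in a bin [m_lo, m_hi] is the integral of m·m**(-s) dm, which scales as m**(2−s): s > 2 concentrates mass in the smallest particles and s < 2 in the largest. The mass decades and function names below are arbitrary illustrations.

```python
# Relative mass carried per mass decade for a power-law dust population
# dN/dm ~ m**(-s). The decade range is arbitrary and for illustration only.
def mass_in_bin(s, m_lo, m_hi):
    """Relative mass in [m_lo, m_hi] for dN/dm ~ m**(-s), valid for s != 2."""
    return (m_hi ** (2.0 - s) - m_lo ** (2.0 - s)) / (2.0 - s)

def mass_per_decade(s, decades=range(-17, -11)):
    """Mass in each decade [10**d, 10**(d+1)] kg, for d in `decades`."""
    return [mass_in_bin(s, 10.0 ** d, 10.0 ** (d + 1)) for d in decades]
```

Evaluating this for an index above 2 (as found for the apex population) puts most of the mass in the smallest decade, and for an index below 2 (the anti-apex population) in the largest, matching the interpretation in the abstract.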
Abstract:
In late February 2010 the extraordinary windstorm Xynthia crossed over Southwestern and Central Europe and caused severe damage, affecting particularly the Spanish and French Atlantic coasts. The storm was embedded in uncommon large-scale atmospheric and boundary conditions prior to and during its development, namely enhanced sea surface temperatures (SST) within the low-level entrainment zone of air masses, an unusual southerly position of the polar jet stream, and a remarkable split jet structure in the upper troposphere. To analyse the processes that led to the rapid intensification of this exceptional storm originating close to the subtropics (30°N), the sensitivity of the cyclone intensification to latent heat release is determined using the regional climate model COSMO-CLM forced with ERA-Interim data. A control simulation with observed SST shows that moist and warm air masses originating from the subtropical North Atlantic were involved in the cyclogenesis process and led to the formation of a vertical tower with high values of potential vorticity (PV). Sensitivity studies with reduced SST or increased laminar boundary roughness for heat led to reduced surface latent heat fluxes. This induced both a weaker and partly retarded development of the cyclone and a weakening of the PV tower together with reduced diabatic heating rates, particularly at lower and mid levels. We infer that diabatic processes played a crucial role during the phase of rapid deepening of Xynthia and thus contributed to its intensity over the Southeastern North Atlantic. We suggest that windstorms like Xynthia may occur more frequently under future climate conditions due to warming SSTs and potentially enhanced latent heat release, thus increasing the windstorm risk for Southwestern Europe.
Abstract:
The European summer of 2012 was marked by strongly contrasting rainfall anomalies, which led to flooding in northern Europe and droughts and wildfires in southern Europe. This season was not an isolated event, but rather the latest in a string of summers characterized by a southward-shifted Atlantic storm track, as described by the negative phase of the summer North Atlantic Oscillation (SNAO). The degree of decadal variability in these features suggests a role for forcing from outside the dynamical atmosphere, and preliminary numerical experiments suggest that the global SST and low Arctic sea ice extent anomalies are likely to have played a role and that warm North Atlantic SSTs were a particular contributing factor. The direct effects of changes in radiative forcing from greenhouse gas and aerosol forcing are not included in these experiments, but both anthropogenic forcing and natural variability may have influenced the SST and sea ice changes.
Abstract:
A steady decline in Arctic sea ice has been observed over recent decades. General circulation models predict further decreases under increasing greenhouse gas scenarios. Sea ice plays an important role in the climate system in that it influences ocean-to-atmosphere fluxes, surface albedo, and ocean buoyancy. The aim of this study is to isolate the climate impacts of a declining Arctic sea ice cover during the current century. The Hadley Centre Atmospheric Model (HadAM3) is forced with observed sea ice from 1980 to 2000 (obtained from satellite passive microwave radiometer data derived with the Bootstrap algorithm) and predicted sea ice reductions until 2100 under one moderate scenario and one severe scenario of ice decline, with a climatological SST field and increasing SSTs. Significant warming of the Arctic occurs during the twenty-first century (mean increase of between 1.6° and 3.9°C), with positive anomalies of up to 22°C locally. Most of this warming occurs over the ocean and is limited to high latitudes, in contrast to recent observations of Northern Hemisphere warming. When a climatological SST field is used, statistically significant impacts on climate are only seen in winter, despite prescribing sea ice reductions in all months. When correspondingly increasing SSTs are incorporated, changes in climate are seen in both winter and summer, although the impacts in summer are much smaller. Alterations in atmospheric circulation and precipitation patterns are more widespread than those in temperature, extending down to midlatitude storm tracks. Results suggest that areas of Arctic land ice may even undergo net accumulation due to increased precipitation that results from loss of sea ice. Intensification of storm tracks implies that parts of Europe may experience higher precipitation rates.
Abstract:
The ability of the climate models participating in phase 5 of the Coupled Model Intercomparison Project (CMIP5) to simulate North Atlantic extratropical cyclones in winter [December–February (DJF)] and summer [June–August (JJA)] is investigated in detail. Cyclones are identified as maxima in T42 vorticity at 850 hPa and their propagation is tracked using an objective feature-tracking algorithm. By comparing the historical CMIP5 simulations (1976–2005) and the ECMWF Interim Re-Analysis (ERA-Interim; 1979–2008), the authors find that systematic biases affect the number and intensity of North Atlantic cyclones in CMIP5 models. In DJF, the North Atlantic storm track tends to be either too zonal or displaced southward, thus leading to too few and weak cyclones over the Norwegian Sea and too many cyclones in central Europe. In JJA, the position of the North Atlantic storm track is generally well captured but some CMIP5 models underestimate the total number of cyclones. The dynamical intensity of cyclones, as measured by either T42 vorticity at 850 hPa or mean sea level pressure, is too weak in both DJF and JJA. The intensity bias has a hemispheric character, and it cannot be simply attributed to the representation of the North Atlantic large-scale atmospheric state. Despite these biases, the representation of Northern Hemisphere (NH) storm tracks has improved since CMIP3 and some CMIP5 models are able to represent well both the number and the intensity of North Atlantic cyclones. In particular, some of the higher-atmospheric-resolution models tend to have a better representation of the tilt of the North Atlantic storm track and of the intensity of cyclones in DJF.
Abstract:
The response of North Atlantic and European extratropical cyclones to climate change is investigated in the climate models participating in phase 5 of the Coupled Model Intercomparison Project (CMIP5). In contrast to previous multimodel studies, a feature-tracking algorithm is here applied to separately quantify the responses in the number, the wind intensity, and the precipitation intensity of extratropical cyclones. Moreover, a statistical framework is employed to formally assess the uncertainties in the multimodel projections. Under the midrange representative concentration pathway (RCP4.5) emission scenario, the December–February (DJF) response is characterized by a tripolar pattern over Europe, with an increase in the number of cyclones in central Europe and a decreased number in the Norwegian and Mediterranean Seas. The June–August (JJA) response is characterized by a reduction in the number of North Atlantic cyclones along the southern flank of the storm track. The total number of cyclones decreases in both DJF (24%) and JJA (22%). Classifying cyclones according to their intensity indicates a slight basinwide reduction in the number of cyclones associated with strong winds, but an increase in those associated with strong precipitation. However, in DJF, a slight increase in the number and intensity of cyclones associated with strong wind speeds is found over the United Kingdom and central Europe. The results are confirmed under the high-emission RCP8.5 scenario, where the signals tend to be larger. The sources of uncertainty in these projections are discussed.
Abstract:
Future climate change projections are often derived from ensembles of simulations from multiple global circulation models using heuristic weighting schemes. This study provides a more rigorous justification for this by introducing a nested family of three simple analysis of variance frameworks. Statistical frameworks are essential in order to quantify the uncertainty associated with the estimate of the mean climate change response. The most general framework yields the “one model, one vote” weighting scheme often used in climate projection. However, a simpler additive framework is found to be preferable when the climate change response is not strongly model dependent. In such situations, the weighted multimodel mean may be interpreted as an estimate of the actual climate response, even in the presence of shared model biases. Statistical significance tests are derived to choose the most appropriate framework for specific multimodel ensemble data. The framework assumptions are explicit and can be checked using simple tests and graphical techniques. The frameworks can be used to test for evidence of nonzero climate response and to construct confidence intervals for the size of the response. The methodology is illustrated by application to North Atlantic storm track data from the Coupled Model Intercomparison Project phase 5 (CMIP5) multimodel ensemble. Despite large variations in the historical storm tracks, the cyclone frequency climate change response is not found to be model dependent over most of the region. This gives high confidence in the response estimates. Statistically significant decreases in cyclone frequency are found on the flanks of the North Atlantic storm track and in the Mediterranean basin.
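The "one model, one vote" weighting the abstract refers to can be made concrete: each model's runs are averaged first, and the multimodel mean then weights every model equally, so a model contributing many runs does not dominate the estimate. The model names and response values below are made-up illustrative numbers (e.g. percentage change in cyclone frequency), not data from the study.

```python
# "One model, one vote": average within each model before averaging
# across models, so ensemble size per model does not bias the mean.
import statistics

def one_model_one_vote(ensemble):
    """ensemble: dict mapping model name -> list of per-run responses."""
    per_model_means = [statistics.fmean(runs) for runs in ensemble.values()]
    return statistics.fmean(per_model_means)
```

Contrast this with pooling all runs into one average, which implicitly gives models with large ensembles more weight; choosing between the two is exactly the kind of decision the nested analysis-of-variance frameworks in the study formalize.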
Abstract:
We present a new speleothem record of atmospheric Δ14C between 28 and 44 ka that offers considerable promise for resolving some of the uncertainty associated with existing radiocarbon calibration curves for this time period. The record is based on a comprehensive suite of AMS 14C ages, using new low-blank protocols, and U–Th ages using high precision MC-ICPMS procedures. Atmospheric Δ14C was calculated by correcting 14C ages with a constant dead carbon fraction (DCF) of 22.7 ± 5.9%, based on a comparison of stalagmite 14C ages with the IntCal04 (Reimer et al., 2004) calibration curve between 15 and 11 ka. The new Δ14C speleothem record shows similar structure and amplitude to that derived from Cariaco Basin foraminifera (Hughen et al., 2004, 2006), and the match is further improved if the latter is tied to the most recent Greenland ice core chronology (Svensson et al., 2008). These data are however in conflict with a previously published 14C data set for a stalagmite record from the Bahamas — GB-89-24-1 (Beck et al., 2001), which likely suffered from 14C analytical blank subtraction issues in the older part of the record. The new Bahamas speleothem Δ14C data do not show the extreme shifts between 44 and 40 ka reported in the previous study (Beck et al., 2001). Causes for the observed structure in derived atmospheric Δ14C variation based on the new speleothem data are investigated with a suite of simulations using an earth system model of intermediate complexity. Data-model comparison indicates that major fluctuations in atmospheric Δ14C during marine isotope stage 3 are primarily a function of changes in geomagnetic field intensity, although ocean–atmosphere system reorganisation also played a supporting role.
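The constant-DCF correction described above can be sketched with the standard radiocarbon relations: conventional 14C ages are defined with the Libby mean life (8033 yr), while Δ14C uses the true mean life (8267 yr) from the 5730-yr half-life. The function name and the simple constant-DCF treatment are illustrative assumptions; the published record propagates uncertainties far more carefully.

```python
# Sketch of deriving per-mil atmospheric Delta14C from a stalagmite 14C age
# and a U-Th calendar age, with a constant dead-carbon-fraction correction.
import math

LIBBY_MEAN_LIFE = 8033.0   # yr, defines conventional 14C ages
TRUE_MEAN_LIFE = 8267.0    # yr, from the true 5730-yr half-life

def atmospheric_delta14c(c14_age, cal_age, dcf=0.227):
    """Delta14C (per mil) for a sample with conventional 14C age `c14_age`
    and calendar age `cal_age`, both in years, assuming a constant DCF."""
    f_measured = math.exp(-c14_age / LIBBY_MEAN_LIFE)   # measured 14C activity ratio
    f_atmosphere = f_measured / (1.0 - dcf)             # undo dead-carbon dilution
    return (f_atmosphere * math.exp(cal_age / TRUE_MEAN_LIFE) - 1.0) * 1000.0
```

Because dead carbon dilutes the sample's 14C, ignoring the DCF would make the atmosphere appear 14C-poorer than it was; dividing by (1 − DCF) restores the atmospheric value before converting to Δ14C.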
Abstract:
Using a water balance modelling framework, this paper analyses the effects of urban design on the water balance, with a focus on evapotranspiration and storm water. First, two quite different urban water balance models are compared: Aquacycle which has been calibrated for a suburban catchment in Canberra, Australia, and the single-source urban evapotranspiration-interception scheme (SUES), an energy-based approach with a biophysically advanced representation of interception and evapotranspiration. A fair agreement between the two modelled estimates of evapotranspiration was significantly improved by allowing the vegetation cover (leaf area index, LAI) to vary seasonally, demonstrating the potential of SUES to quantify the links between water sensitive urban design and microclimates and the advantage of comparing the two modelling approaches. The comparison also revealed where improvements to SUES are needed, chiefly through improved estimates of vegetation cover dynamics as input to SUES, and more rigorous parameterization of the surface resistance equations using local-scale suburban flux measurements. Second, Aquacycle is used to identify the impact of an array of water sensitive urban design features on the water balance terms. This analysis confirms the potential to passively control urban microclimate by suburban design features that maximize evapotranspiration, such as vegetated roofs. The subsequent effects on daily maximum air temperatures are estimated using an atmospheric boundary layer budget. Potential energy savings of about 2% in summer cooling are estimated from this analysis. This is a clear ‘return on investment’ of using water to maintain urban greenspace, whether as parks distributed throughout an urban area or individual gardens or vegetated roofs.
Abstract:
Tests, as learning events, are often more effective than are additional study opportunities, especially when recall is tested after a long retention interval. To what degree, though, do prior test or study events support subsequent study activities? We set out to test an implication of Bjork and Bjork’s (1992) new theory of disuse—that, under some circumstances, prior study may facilitate subsequent study more than does prior testing. Participants learned English–Swahili translations and then underwent a practice phase during which some items were tested (without feedback) and other items were restudied. Although tested items were better recalled after a 1-week delay than were restudied items, this benefit did not persist after participants had the opportunity to study the items again via feedback. In fact, after this additional study opportunity, items that had been restudied earlier were better recalled than were items that had been tested earlier. These results suggest that measuring the memorial consequences of testing requires more than a single test of retention and, theoretically, a consideration of the differing status of initially recallable and nonrecallable items.
Abstract:
Meteosat infra-red imagery for the Great Storm of October 1987 is analysed to show a series of very shallow arc-shaped and smaller chevron-shaped cloud features that were associated with damaging surface winds in the dry-slot region of this extra-tropical cyclone. Hypotheses are presented that attribute these low-level cloud features to boundary-layer convergence lines ahead of wind maxima associated with the downward transport of high momentum from overrunning, so-called sting-jet, flows originating in the storm's main cloud head. Copyright © 2004 Royal Meteorological Society.
Abstract:
Under particular large-scale atmospheric conditions, several windstorms may affect Europe within a short time period. The occurrence of such cyclone families leads to large socioeconomic impacts and cumulative losses. The serial clustering of windstorms is analyzed for the North Atlantic/western Europe. Clustering is quantified as the dispersion (ratio variance/mean) of cyclone passages over a certain area. Dispersion statistics are derived for three reanalysis data sets and a 20-run ECHAM5/MPI-OM1 (European Centre Hamburg Model version 5 coupled to the Max Planck Institute Ocean Model version 1) global climate model (GCM) ensemble. The dependence of serial clustering on cyclone intensity is analyzed. Confirming previous studies, serial clustering is identified in reanalysis data sets primarily on both flanks and downstream regions of the North Atlantic storm track. This pattern is a robust feature in the reanalysis data sets. For the whole area, extreme cyclones cluster more than nonextreme cyclones. The ECHAM5/MPI-OM1 GCM is generally able to reproduce the spatial patterns of clustering under recent climate conditions, but some biases are identified. Under future climate conditions (A1B scenario), the GCM ensemble indicates that serial clustering may decrease over the North Atlantic storm track area and parts of western Europe. This decrease is associated with an extension of the polar jet toward Europe, which implies a tendency to a more regular occurrence of cyclones over parts of the North Atlantic Basin poleward of 50°N and western Europe. An increase of clustering of cyclones is projected south of Newfoundland. The detected shifts imply a change in the risk of occurrence of cumulative events over Europe under future climate conditions.
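The clustering diagnostic used above is simple to compute: the dispersion of per-season cyclone-passage counts, i.e. their variance-to-mean ratio. A Poisson (serially independent) process gives a ratio of 1; values above 1 indicate serial clustering, values below 1 a more regular occurrence. The counts below are invented for illustration.

```python
# Dispersion (variance/mean) of seasonal cyclone-passage counts over an area.
import statistics

def dispersion(counts):
    """Variance-to-mean ratio; ~1 for Poisson, >1 for serial clustering."""
    return statistics.pvariance(counts) / statistics.fmean(counts)
```

Applied per grid cell to seasonal storm counts, this single number is what distinguishes the clustered flanks of the storm track (ratio above 1) from the more regular cyclone occurrence projected poleward of 50°N under the A1B scenario.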