843 results for Predictable routing
Abstract:
In the 1960s, North Atlantic sea surface temperatures (SST) cooled rapidly. The magnitude of the cooling was largest in the North Atlantic subpolar gyre (SPG), and was coincident with a rapid freshening of the SPG. Here we analyze hindcasts of the 1960s North Atlantic cooling made with the UK Met Office’s decadal prediction system (DePreSys), which is initialised using observations. It is shown that DePreSys captures, with a lead time of several years, the observed cooling and freshening of the North Atlantic SPG. DePreSys also captures changes in SST over the wider North Atlantic and surface climate impacts over the wider region, such as changes in atmospheric circulation in winter and sea ice extent. We show that initialisation of an anomalously weak Atlantic Meridional Overturning Circulation (AMOC), and hence weak northward heat transport, is crucial for DePreSys to predict the magnitude of the observed cooling. Such an anomalously weak AMOC is not captured when ocean observations are not assimilated (i.e. it is not a forced response in this model). The freshening of the SPG is also dominated by ocean salt transport changes in DePreSys; in particular, the simulation of advective freshwater anomalies analogous to the Great Salinity Anomaly was key. Therefore, DePreSys suggests that ocean dynamics played an important role in the cooling of the North Atlantic in the 1960s, and that this event was predictable.
Abstract:
We present a model of the dust cycle that successfully predicts dust emissions as determined by land surface properties, monthly vegetation and snow cover, and 6-hourly surface wind speeds for the years 1982–1993. The model takes account of the role of dry lake beds as preferential source areas for dust emission. The occurrence of these preferential sources is determined by a water routing and storage model. The dust source scheme also explicitly takes into account the role of vegetation type as well as monthly vegetation cover. Dust transport is computed using assimilated winds for the years 1987–1990. Deposition of dust occurs through dry and wet deposition, where subcloud scavenging is calculated using assimilated precipitation fields. Comparison of simulated patterns of atmospheric dust loading with the Total Ozone Mapping Spectrometer satellite absorbing aerosol index shows that the model produces realistic results from daily to interannual timescales. The magnitude of dust deposition agrees well with sediment flux data from marine sites. Emission of submicron dust from preferential source areas is required for the computation of a realistic dust optical thickness. Sensitivity studies show that Asian dust source strengths are particularly sensitive to the seasonality of vegetation cover.
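A minimal sketch of a threshold-based emission rule of the kind this abstract describes: dust is emitted only from bare, snow-free ground when the wind exceeds a threshold, with an extra multiplier over preferential (dry lake bed) sources. The functional form, constants and variable names below are illustrative assumptions, not the paper's actual parameterisation.

```python
# Illustrative threshold-based dust emission rule (assumed form, not the paper's scheme).
import numpy as np

def dust_emission(u10, veg_cover, snow_cover, pref_source, u_threshold=6.5, c_emit=1e-9):
    """Toy dust emission flux on a lat/lon grid.

    u10         : 6-hourly 10 m wind speed (m s-1)
    veg_cover   : monthly vegetation cover fraction (0-1)
    snow_cover  : monthly snow cover fraction (0-1)
    pref_source : multiplier >= 1 over dry lake beds (preferential sources)
    u_threshold : assumed wind-speed threshold for emission (m s-1)
    c_emit      : assumed dimensional tuning constant
    """
    bare = (1.0 - veg_cover) * (1.0 - snow_cover)    # only bare, snow-free ground emits
    excess = np.clip(u10 - u_threshold, 0.0, None)   # zero below the threshold
    return c_emit * excess * u10**2 * bare * pref_source
```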
Abstract:
Runoff fields over northern Africa (10–25°N, 20°W–30°E) derived from 17 atmospheric general circulation models (AGCMs) driven by identical 6 ka BP orbital forcing, sea surface temperatures, and CO2 concentration have been analyzed using a hydrological routing scheme (HYDRA) to simulate changes in lake area. The AGCM-simulated runoff produced six-fold differences in simulated lake area between models, although even the largest simulated changes considerably underestimate the observed changes in lake area during the mid-Holocene. The inter-model differences in simulated lake area are largely due to differences in simulated runoff (the squared correlation coefficient, R2, is 0.84). Most of these differences can be attributed to differences in the simulated precipitation (R2 = 0.83). The correlation between runoff and simulated lake area (R2 = 0.92) is higher than that between precipitation and lake area, implying that simulated differences in evaporation also have a contributory effect. When runoff is calculated using an offline land-surface scheme (BIOME3), the correlation between runoff and simulated lake area is higher still (R2 = 0.94). Finally, the spatial distribution of simulated precipitation can exert an important control on the overall response.
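The inter-model comparison rests on squared correlation coefficients (R2) between per-model runoff and the resulting lake area. The sketch below shows the calculation with synthetic placeholder values for the 17 models; the arrays are not the study's data.

```python
# Squared correlation across models, with synthetic placeholder values (not the study's data).
import numpy as np

rng = np.random.default_rng(0)
runoff = rng.normal(size=17)                                # per-model 6 ka runoff anomalies (placeholder)
lake_area = 0.9 * runoff + rng.normal(scale=0.3, size=17)   # per-model lake-area response (placeholder)

r = np.corrcoef(runoff, lake_area)[0, 1]
print(f"R2 = {r**2:.2f}")   # compare with the 0.84-0.94 values quoted above
```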
Abstract:
The response of lightning rates over Europe to arrival of high speed solar wind streams at Earth is investigated using a superposed epoch analysis. Fast solar wind stream arrival is determined from modulation of the solar wind Vy component, measured by the Advanced Composition Explorer spacecraft. Lightning rate changes around these event times are determined from the very low frequency arrival time difference (ATD) system of the UK Met Office. Arrival of high speed streams at Earth is found to be preceded by a decrease in total solar irradiance and an increase in sunspot number and Mg II emissions. These are consistent with the high speed stream's source being co-located with an active region appearing on the Eastern solar limb and rotating at the 27 d period of the Sun. Arrival of the high speed stream at Earth also coincides with a small (~1%) but rapid decrease in galactic cosmic ray flux, a moderate (~6%) increase in lower energy solar energetic protons (SEPs), and a substantial, statistically significant increase in lightning rates. These changes persist for around 40 d in all three quantities. The lightning rate increase is corroborated by an increase in the total number of thunder days observed by UK Met stations, again persisting for around 40 d after the arrival of a high speed solar wind stream. This result appears to contradict earlier studies that found an anti-correlation between sunspot number and thunder days over solar cycle timescales. The increase in lightning rates and thunder days that we observe coincides with an increased flux of SEPs which, while not being detected at ground level, nevertheless penetrate the atmosphere to tropospheric altitudes. This effect could be further amplified by an increase in mean lightning stroke intensity that brings more strokes above the detection threshold of the ATD system. In order to remove any potential seasonal bias the analysis was repeated for daily solar wind triggers occurring during the summer months (June to August). Though this reduced the number of solar wind triggers to 32, the response in both lightning and thunder day data remained statistically significant. This modulation of lightning by regular and predictable solar wind events may be beneficial to medium range forecasting of hazardous weather.
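A minimal sketch of the superposed epoch analysis named above: a daily lightning-rate series is composited over windows centred on solar wind stream arrival dates. The function and the ±40-day window are illustrative assumptions rather than the paper's exact procedure.

```python
# Superposed epoch composite of a daily series around event dates (illustrative).
import numpy as np

def superposed_epoch(daily_series, event_indices, window=40):
    """Mean of `daily_series` over +/- `window` days centred on each event index."""
    segments = [daily_series[t - window:t + window + 1]
                for t in event_indices
                if t - window >= 0 and t + window < len(daily_series)]
    return np.mean(segments, axis=0)   # composite, lags -window .. +window
```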
Abstract:
The role of Distribution Network Operators (DNOs) is becoming more difficult as electric vehicles and electric heating penetrate the network, increasing demand. As a result, it becomes harder for the distribution network infrastructure to remain within its operating constraints. Energy storage is a potential alternative to conventional network reinforcement such as upgrading cables and transformers. The research presented in this paper shows that, due to the volatile nature of the LV network, the control approach used for energy storage has a significant impact on performance. This paper presents and compares control methodologies for energy storage where the objective is to achieve the greatest possible peak demand reduction across the day from a pre-specified storage device. The results show the benefits and drawbacks of specific types of control on a storage device connected to a single phase of an LV network, using aggregated demand profiles based on real smart meter data from individual homes. The research demonstrates an important relationship between how predictable an aggregated demand profile is and the control methodology best suited to achieving the objective.
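For illustration, the sketch below implements a generic fixed-threshold peak-shaving controller of the kind such comparisons typically include; the threshold, power rating and capacity are assumed values, and this is not the authors' specific control methodology.

```python
# Generic fixed-threshold peak-shaving controller (assumed ratings; not the paper's algorithm).
import numpy as np

def peak_shave(demand_kw, threshold_kw=3.5, p_max_kw=3.0, capacity_kwh=7.0, dt_h=0.5):
    """Net demand seen by the network after the storage device acts on a half-hourly profile."""
    demand_kw = np.asarray(demand_kw, dtype=float)
    soc = capacity_kwh                      # assume the device starts the day fully charged
    net = np.empty_like(demand_kw)
    for i, d in enumerate(demand_kw):
        if d > threshold_kw:                # discharge to clip the peak
            p = min(d - threshold_kw, p_max_kw, soc / dt_h)
        else:                               # recharge, never pushing demand above the threshold
            p = -min(threshold_kw - d, p_max_kw, (capacity_kwh - soc) / dt_h)
        soc -= p * dt_h
        net[i] = d - p
    return net
```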
Abstract:
Seasonal-to-interannual predictions of Arctic sea ice may be important for Arctic communities and industries alike. Previous studies have suggested that Arctic sea ice is potentially predictable but that the skill of predictions of the September extent minimum, initialized in early summer, may be low. The authors demonstrate that a melt season “predictability barrier” and two predictability reemergence mechanisms, suggested by a previous study, are robust features of five global climate models. Analysis of idealized predictions with one of these models [Hadley Centre Global Environment Model, version 1.2 (HadGEM1.2)], initialized in January, May and July, demonstrates that this predictability barrier exists in initialized forecasts as well. As a result, the skill of sea ice extent and volume forecasts is strongly dependent on start date, and forecasts initialized in May lose skill much faster than those initialized in January or July. Thus, in an operational setting, initializing predictions of extent and volume in July has strong advantages for the prediction of the September minimum when compared to predictions initialized in May. Furthermore, a regional analysis of sea ice predictability indicates that extent is predictable for longer in the seasonal ice zones of the North Atlantic and North Pacific than in the regions dominated by perennial ice in the central Arctic and marginal seas. In a number of the Eurasian shelf seas, which are important for Arctic shipping, only the forecasts initialized in July have continuous skill during the first summer. In contrast, predictability of ice volume persists for over 2 yr in the central Arctic but for a shorter time in other regions.
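The start-date comparison above amounts to computing a skill score as a function of lead time for each initialization month. A minimal sketch, assuming anomaly correlation as the skill measure and synthetic array shapes:

```python
# Anomaly correlation skill as a function of lead time (synthetic shapes, illustrative only).
import numpy as np

def anomaly_correlation(forecasts, truth):
    """forecasts: (n_years, n_leads) predicted September extents; truth: (n_years,) verifying extents."""
    f_anom = forecasts - forecasts.mean(axis=0)
    t_anom = truth - truth.mean()
    num = (f_anom * t_anom[:, None]).sum(axis=0)
    den = np.sqrt((f_anom ** 2).sum(axis=0) * (t_anom ** 2).sum())
    return num / den   # one correlation per lead time; compare across start months
```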
Abstract:
Persistent contrails are an important climate impact of aviation which could potentially be reduced by re-routing aircraft to avoid contrailing; however, this generally increases both the flight length and its corresponding CO2 emissions. Here, we provide a simple framework to assess the trade-off between the climate impact of CO2 emissions and contrails for a single flight, in terms of the absolute global warming potential and absolute global temperature potential metrics for time horizons of 20, 50 and 100 years. We use the framework to illustrate the maximum extra distance (with no altitude changes) that can be added to a flight and still reduce its overall climate impact. Small aircraft can fly up to four times further to avoid contrailing than large aircraft. The results have a strong dependence on the applied metric and time horizon. Applying a conservative estimate of the uncertainty in the contrail radiative forcing and climate efficacy leads to a factor of 20 difference in the maximum extra distance that could be flown to avoid a contrail. The impact of re-routing on other climatically important aviation emissions could also be considered in this framework.
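The trade-off can be reduced to a break-even condition: the extra distance at which the added CO2 impact equals the avoided contrail impact under a chosen metric and time horizon. The sketch below is a hedged illustration; the function, argument names and the 3.16 kg CO2 per kg fuel factor are generic assumptions, not the paper's inputs.

```python
# Break-even extra distance: added CO2 impact equals avoided contrail impact (assumed inputs).
def max_extra_distance_km(contrail_length_km,
                          agwp_contrail_per_km,    # metric value per km of contrail (assumed units)
                          fuel_burn_kg_per_km,     # aircraft fuel burn per km flown
                          agwp_co2_per_kg,         # metric value per kg of CO2 emitted
                          co2_per_kg_fuel=3.16):   # kg CO2 per kg of jet fuel burned
    """Extra distance whose CO2 impact equals the avoided contrail's impact for the chosen metric."""
    avoided = contrail_length_km * agwp_contrail_per_km
    added_per_km = fuel_burn_kg_per_km * co2_per_kg_fuel * agwp_co2_per_kg
    return avoided / added_per_km
```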
Abstract:
Mankind is facing an unprecedented health challenge in the current pandemic of obesity and diabetes. We propose that this is the inevitable (and predictable) consequence of the evolution of intelligence, which itself could be an expression of life being an information system driven by entropy. Because of its ability to make life more adaptable and robust, intelligence evolved as an efficient adaptive response to the stresses arising from an ever-changing environment. These adaptive responses are encapsulated by the epiphenomenon of “hormesis”, a phenomenon we believe to be central to the evolution of intelligence and essential for the maintenance of optimal physiological function and health. Thus, as intelligence evolved, it would eventually reach a cognitive level with the ability to control its environment through technology and to remove all stressors. In effect, it would act to remove the very hormetic factors that had driven its evolution. Mankind may have reached this point, creating an environmental utopia that has reduced the very stimuli necessary for optimal health and the evolution of intelligence – “the intelligence paradox”. One of the hallmarks of this paradox is of course the rising incidence of obesity, diabetes and the metabolic syndrome. This leads to the conclusion that wherever life evolves, here on Earth or in another part of the galaxy, the “intelligence paradox” would be the inevitable side-effect of the evolution of intelligence. ET may not need to just “phone home” but may also need to “phone the local gym”. This suggests another possible explanation for Fermi’s paradox: the famous physicist Enrico Fermi asked in the 1950s, if extra-terrestrial intelligence was as prevalent as was commonly believed at the time, then where was it? Our suggestion is that if advanced life has got going elsewhere in our galaxy, it can’t afford to explore the galaxy because it has to pay its healthcare costs.
Abstract:
Decadal and longer timescale variability in the winter North Atlantic Oscillation (NAO) has considerable impact on regional climate, yet it remains unclear what fraction of this variability is potentially predictable. This study takes a new approach to this question by demonstrating clear physical differences between NAO variability on interannual-decadal (<30 year) and multidecadal (>30 year) timescales. It is shown that on the shorter timescale the NAO is dominated by variations in the latitude of the North Atlantic jet and storm track, whereas on the longer timescale it represents changes in their strength instead. NAO variability on the two timescales is associated with different dynamical behaviour in terms of eddy-mean flow interaction, Rossby wave breaking and blocking. The two timescales also exhibit different regional impacts on temperature and precipitation and different relationships to sea surface temperatures. These results are derived from linear regression analysis of the Twentieth Century and NCEP-NCAR reanalyses and of a high-resolution HiGEM General Circulation Model control simulation, with additional analysis of a long sea level pressure reconstruction. Evidence is presented for an influence of the ocean circulation on the longer timescale variability of the NAO, which is particularly clear in the model data. As well as providing new evidence of potential predictability, these findings are shown to have implications for the reconstruction and interpretation of long climate records.
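The timescale separation described above can be illustrated by splitting a winter NAO index into >30-year and <30-year components before regressing fields onto each part. The low-pass Butterworth filter below is a generic choice for the sketch (an assumption), not necessarily the filtering used in the study.

```python
# Low-pass split of an annual winter NAO index at a 30-year cutoff (generic filter choice).
import numpy as np
from scipy.signal import butter, filtfilt

def split_timescales(nao_index, cutoff_years=30):
    """Return (multidecadal, interannual-to-decadal) components of an annual-resolution index."""
    x = np.asarray(nao_index, dtype=float)
    b, a = butter(4, 1.0 / cutoff_years, btype="low", fs=1.0)   # 4th-order low-pass, annual sampling
    slow = filtfilt(b, a, x)
    return slow, x - slow
```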
Abstract:
Quasi-stationary convective bands can cause large localised rainfall accumulations and are often anchored by topographic features. Here, the predictability of and mechanisms causing one such band are determined using ensembles of the Met Office Unified Model at convection-permitting resolution (1.5 km grid length). The band was stationary over the UK for 3 h and produced rainfall accumulations of up to 34 mm. The amount and location of the predicted rainfall were highly variable despite only small differences between the large-scale conditions of the ensemble members. Only three of the 21 members of the control ensemble produced a stationary rain band; these three had the weakest upstream winds and hence the lowest Froude number. Band formation was due to the superposition of two processes: lee-side convergence resulting from flow around an upstream obstacle and thermally forced convergence resulting from elevated heating over the upstream terrain. Both mechanisms were enhanced when the Froude number was lower. By increasing the terrain height (thus reducing the Froude number), the band became more predictable. An ensemble approach is required to successfully predict the possible occurrence of such quasi-stationary convective events because the rainfall variability is largely modulated by small variations of the large-scale flow. However, high-resolution models are required to accurately resolve the small-scale interactions of the flow with the topography upon which the band formation depends. Thus, although topography provides some predictability, the quasi-stationary convective bands anchored by it are likely to remain a forecasting challenge for many years to come.
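The argument turns on a non-dimensional Froude number; a common form (an assumption here, since the paper may use a layer-specific definition) is Fr = U/(Nh) for upstream wind speed U, buoyancy frequency N and terrain height h, so weaker winds or higher terrain lower Fr and favour flow around, rather than over, the obstacle.

```python
# Froude number of flow past a hill (common definition, assumed here).
def froude_number(u_upstream, n_bv, hill_height):
    """Fr = U / (N h); values below ~1 favour flow around the hill rather than over it."""
    return u_upstream / (n_bv * hill_height)

print(froude_number(u_upstream=5.0, n_bv=0.01, hill_height=600.0))   # ~0.83 with these illustrative values
```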
Abstract:
The more information is available, and the more predictable events are, the better forecasts ought to be. In this paper, forecasts by bookmakers, prediction markets and tipsters are evaluated for a range of events with varying degrees of predictability and information availability. All three types of forecast represent different structures of information processing and as such would be expected to perform differently. By and large, events that are more predictable, and for which more information is available, do tend to be forecast better.
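Evaluating such probabilistic forecasts requires a proper scoring rule; the Brier score below is a standard choice used here purely for illustration (an assumption, not necessarily the measure used in the paper).

```python
# Brier score for probabilistic forecasts of binary outcomes (lower is better).
import numpy as np

def brier_score(prob_forecasts, outcomes):
    """prob_forecasts: forecast probabilities in [0, 1]; outcomes: observed 0/1 results."""
    p = np.asarray(prob_forecasts, dtype=float)
    y = np.asarray(outcomes, dtype=float)
    return float(np.mean((p - y) ** 2))

print(brier_score([0.8, 0.6, 0.2], [1, 0, 0]))   # ~0.15; a constant 0.5 forecast always scores 0.25
```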
Abstract:
Stochastic methods are a crucial area in contemporary climate research and are increasingly being used in comprehensive weather and climate prediction models as well as reduced order climate models. Stochastic methods are used as subgrid-scale parameterizations (SSPs) as well as for model error representation, uncertainty quantification, data assimilation, and ensemble prediction. The need to use stochastic approaches in weather and climate models arises because we still cannot resolve all necessary processes and scales in comprehensive numerical weather and climate prediction models. In many practical applications one is mainly interested in the largest and potentially predictable scales and not necessarily in the small and fast scales. For instance, reduced order models can simulate and predict large-scale modes. Statistical mechanics and dynamical systems theory suggest that in reduced order models the impact of unresolved degrees of freedom can be represented by suitable combinations of deterministic and stochastic components and non-Markovian (memory) terms. Stochastic approaches in numerical weather and climate prediction models also lead to the reduction of model biases. Hence, there is a clear need for systematic stochastic approaches in weather and climate modeling. In this review, we present evidence for stochastic effects in laboratory experiments. Then we provide an overview of stochastic climate theory from an applied mathematics perspective. We also survey the current use of stochastic methods in comprehensive weather and climate prediction models and show that stochastic parameterizations have the potential to remedy many of the current biases in these comprehensive models.
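A minimal sketch of the kind of reduced-order stochastic model discussed above: one resolved variable evolves under deterministic dynamics plus a red-noise (Ornstein-Uhlenbeck) term standing in for unresolved degrees of freedom. All parameter values and the double-well form are illustrative assumptions.

```python
# Reduced-order model: deterministic double-well mode plus red-noise (OU) forcing for unresolved scales.
import numpy as np

rng = np.random.default_rng(1)
dt, n_steps = 0.01, 10_000
tau, sigma = 0.5, 0.3              # memory timescale and amplitude of the unresolved forcing (illustrative)

x, eta = 0.0, 0.0
traj = np.empty(n_steps)
for i in range(n_steps):
    eta += (-eta / tau) * dt + sigma * np.sqrt(dt) * rng.standard_normal()   # Ornstein-Uhlenbeck noise
    x += (x - x**3) * dt + eta * dt                                          # resolved "climate" mode
    traj[i] = x
```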
Abstract:
At the most recent session of the Conference of the Parties (COP19) in Warsaw (November 2013), the Warsaw international mechanism for loss and damage associated with climate change impacts was established under the United Nations Framework Convention on Climate Change (UNFCCC). The mechanism aims at promoting the implementation of approaches to address loss and damage associated with the adverse effects of climate change. Specifically, it aims to enhance understanding of risk management approaches to address loss and damage. Understanding risks associated with impacts due to highly predictable (slow onset) events like sea-level rise is relatively straightforward, whereas assessing the effects of climate change on extreme weather events and their impacts is much more difficult. However, extreme weather events are a significant cause of loss of life and livelihoods, particularly in vulnerable countries and communities in Africa. The emerging science of probabilistic event attribution is relevant as it provides scientific evidence on the contribution of anthropogenic climate change to changes in the risk of extreme events. It thus provides the opportunity to explore scientifically backed assessments of the human influence on such events. However, different ways of framing attribution questions can lead to very different assessments of the change in risk. Here we explain the methods of, and the implications of, different approaches to attributing extreme weather events, with a focus on Africa. Crucially, this work demonstrates that defining the most appropriate attribution question to ask is not a science decision but one that needs to be made in dialogue with the stakeholders who will use the answers.
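Probabilistic event attribution typically summarises the change in risk with a probability ratio and the fraction of attributable risk (FAR), computed from "actual" and "counterfactual natural" climate ensembles. A minimal sketch, with names and inputs chosen for illustration:

```python
# Probability ratio and fraction of attributable risk from two ensembles (illustrative names).
import numpy as np

def attribution_metrics(actual_ens, natural_ens, threshold):
    """Return (probability ratio, FAR) for exceeding `threshold` in the two ensembles."""
    p1 = np.mean(np.asarray(actual_ens) >= threshold)    # probability with human influence
    p0 = np.mean(np.asarray(natural_ens) >= threshold)   # probability in the counterfactual climate
    return p1 / p0, 1.0 - p0 / p1                        # FAR = 1 - P_nat / P_act
```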
Abstract:
Four alkyl-substituted β-lactones were investigated as monomers in ring-opening polymerisation to produce a family of poly(3-hydroxyalkanoate)s. Homopolymers were synthesised using a robust aluminium salen catalyst, resulting in polymers with low dispersity (Đ < 1.1) and predictable molecular weights. ABA triblock copolymers were prepared using poly(L-lactic acid) as the A block and the aforementioned poly(3-hydroxyalkanoate) as the B block via a sequential addition method. Characterisation showed that these copolymers were well controlled, with low dispersities and predictable molecular weights. DSC analysis showed that copolymers prepared from β-butyrolactone or β-valerolactone had tunable and predictable thermal properties. Copolymers prepared from β-heptanolactone yielded a microphase-separated material, as indicated by SAXS, with two distinct Tg values. The polymers could be readily cast into flexible films and their improved tensile properties were explored.
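The "predictable molecular weights" in a controlled ring-opening polymerisation follow from the monomer-to-initiator ratio; the back-of-the-envelope estimate below uses the generic textbook relation (an assumption, not the paper's characterisation method).

```python
# Back-of-the-envelope Mn estimate for a controlled ring-opening polymerisation (generic relation).
def predicted_mn(monomer_to_initiator, conversion, monomer_mw, end_group_mw=0.0):
    """Estimated number-average molecular weight Mn in g/mol."""
    return monomer_to_initiator * conversion * monomer_mw + end_group_mw

# e.g. beta-butyrolactone (86.09 g/mol), [M]/[I] = 100, 95% conversion:
print(predicted_mn(100, 0.95, 86.09))   # ~8,180 g/mol
```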
Abstract:
Transgenerational inheritance of abiotic stress-induced epigenetic modifications in plants has potential adaptive significance and might condition the offspring to improve the response to the same stress, but this is at least partly dependent on the potency, penetrance and persistence of the transmitted epigenetic marks. We examined transgenerational inheritance of low Relative Humidity-induced DNA methylation for two gene loci in the stomatal developmental pathway in Arabidopsis thaliana and the abundance of associated short-interfering RNAs (siRNAs). Heritability of low humidity-induced methylation was more predictable and penetrative at one locus (SPEECHLESS, entropy ≤ 0.02; χ2 < 0.001) than the other (FAMA, entropy ≤ 0.17; χ2 ns). Methylation at SPEECHLESS correlated positively with the continued presence of local siRNAs (r2 = 0.87; p = 0.013) which, however, could be disrupted globally in the progeny under repeated stress. Transgenerational methylation and a parental low humidity-induced stomatal phenotype were heritable, but this was reversed in the progeny under repeated treatment in a previously unsuspected manner.