869 results for Just-in-time
Abstract:
A number of tests for non-linear dependence in time series are presented and implemented on a set of 10 daily sterling exchange rates covering the entire post-Bretton-Woods era until the present day. Irrefutable evidence of non-linearity is shown in many of the series, but most of this dependence can apparently be explained by reference to the GARCH family of models. It is suggested that the literature in this area has reached an impasse, with the presence of ARCH effects clearly demonstrated in a large number of papers, but with the tests for non-linearity currently available unable to detect any additional non-linear structure.
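The GARCH family invoked above captures volatility clustering through a conditional-variance recursion. As a hedged illustration (the parameter values omega, alpha, beta are assumptions for the sketch, not estimates from the sterling data), a minimal GARCH(1,1) simulation:

```python
import numpy as np

def simulate_garch11(n, omega=0.1, alpha=0.1, beta=0.8, seed=0):
    """Simulate returns whose conditional variance follows GARCH(1,1):
    sigma2[t] = omega + alpha * eps[t-1]**2 + beta * sigma2[t-1]."""
    rng = np.random.default_rng(seed)
    sigma2 = np.empty(n)
    eps = np.empty(n)
    sigma2[0] = omega / (1.0 - alpha - beta)  # unconditional variance
    eps[0] = np.sqrt(sigma2[0]) * rng.standard_normal()
    for t in range(1, n):
        sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]
        eps[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
    return eps, sigma2

returns, cond_var = simulate_garch11(5000)
# The levels are serially uncorrelated, but the squares are not: this is
# the ARCH effect that linear diagnostics miss and non-linearity tests find.
acf1_squares = np.corrcoef(returns[:-1] ** 2, returns[1:] ** 2)[0, 1]
```

Such simulated series are the kind of null model against which residual non-linear structure can be judged.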
Abstract:
An alternative procedure to that of Lo is proposed for assessing whether there is significant evidence of persistence in time series. The technique estimates the Hurst exponent itself, and significance testing is based on an application of bootstrapping using surrogate data. The method is applied to a set of 10 daily pound exchange rates. A general lack of long-term memory is found to characterize all the series tested, in sympathy with the findings of a number of other recent papers which have used Lo's techniques.
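The recipe described above, estimating the Hurst exponent directly and testing it against shuffled surrogates, can be sketched as follows; the window sizes and surrogate count are illustrative choices, not the paper's settings:

```python
import numpy as np

def hurst_rs(x, min_window=16):
    """Estimate the Hurst exponent by rescaled-range (R/S) analysis:
    regress log(R/S) on log(window size) over dyadic window sizes."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes, mean_rs = [], []
    size = min_window
    while size <= n // 2:
        rs = []
        for start in range(0, n - size + 1, size):
            w = x[start:start + size]
            dev = np.cumsum(w - w.mean())  # cumulative deviations from mean
            s = w.std()
            if s > 0:
                rs.append((dev.max() - dev.min()) / s)
        sizes.append(size)
        mean_rs.append(np.mean(rs))
        size *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(mean_rs), 1)
    return slope

def surrogate_p_value(x, n_surrogates=50, seed=0):
    """Fraction of shuffled surrogates whose estimated H is at least as
    large as the original's: a crude one-sided significance check."""
    rng = np.random.default_rng(seed)
    h0 = hurst_rs(x)
    hits = sum(hurst_rs(rng.permutation(x)) >= h0 for _ in range(n_surrogates))
    return hits / n_surrogates

rng = np.random.default_rng(1)
noise = rng.standard_normal(4096)
h_noise = hurst_rs(noise)            # white noise: H near 0.5
h_walk = hurst_rs(np.cumsum(noise))  # integrated series: H near 1
p = surrogate_p_value(noise, n_surrogates=20)
```

Shuffling destroys temporal ordering while preserving the marginal distribution, so a Hurst estimate that survives the surrogate comparison signals genuine long-term memory.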
Abstract:
This note describes a simple procedure for removing unphysical temporal discontinuities in ERA-Interim upper stratospheric global mean temperatures in March 1985 and August 1998 that have arisen due to changes in satellite radiance data used in the assimilation. The derived temperature adjustments (offsets) are suitable for use in stratosphere-resolving chemistry-climate models that are nudged (relaxed) to ERA-Interim winds and temperatures. Simulations using a nudged version of the Canadian Middle Atmosphere Model (CMAM) show that the inclusion of the temperature adjustments produces temperature time series that are devoid of the large jumps in 1985 and 1998. Due to its strong temperature dependence, the simulated upper stratospheric ozone is also shown to vary smoothly in time, unlike in a nudged simulation without the adjustments where abrupt changes in ozone occur at the times of the temperature jumps. While the adjustments to the ERA-Interim temperatures remove significant artefacts in the nudged CMAM simulation, spurious transient effects that arise due to water vapour and persist for about 5 yr after the 1979 switch to ERA-Interim data are identified, underlining the need for caution when analysing trends in runs nudged to reanalyses.
Abstract:
In this article, we investigate how the choice of the attenuation factor in an extended version of Katz centrality influences the centrality of the nodes in evolving communication networks. For given snapshots of a network, observed over a period of time, recently developed communicability indices aim to identify the best broadcasters and listeners (receivers) in the network. Here we explore the attenuation factor constraint, in relation to the spectral radius (the largest eigenvalue) of the network at any point in time and its computation in the case of large networks. We compare three different communicability measures: standard, exponential, and relaxed (where the spectral radius bound on the attenuation factor is relaxed and the adjacency matrix is normalised, in order to maintain the convergence of the measure). Furthermore, using a vitality-based measure of both standard and relaxed communicability indices, we look at the ways of establishing the most important individuals for broadcasting and receiving of messages related to community bridging roles. We compare those measures with the scores produced by an iterative version of the PageRank algorithm and illustrate our findings with three examples of real-life evolving networks: the MIT reality mining data set, consisting of daily communications between 106 individuals over the period of one year; a UK Twitter mentions network, constructed from the direct tweets between 12.4k individuals during one week; and a subset of the Enron email data set.
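The attenuation-factor constraint discussed above is the resolvent condition behind Katz centrality: the walk series over powers of the adjacency matrix converges only when the attenuation factor sits below 1/rho(A). A minimal static-network sketch (the paper's evolving-network communicability indices and broadcaster/receiver variants are not reproduced here):

```python
import numpy as np

def katz_centrality(A, attenuation=0.85):
    """Katz-style centrality x = (I - a*A)^(-1) * 1. The factor a must
    satisfy a < 1/rho(A), rho being the spectral radius, for the walk
    series sum_k (a*A)^k to converge."""
    A = np.asarray(A, dtype=float)
    rho = np.max(np.abs(np.linalg.eigvals(A)))
    a = attenuation / rho  # place a safely below the 1/rho bound
    n = A.shape[0]
    return np.linalg.solve(np.eye(n) - a * A, np.ones(n))

# Path graph 0-1-2: the middle node, reachable by more walks, scores highest.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]])
scores = katz_centrality(A)
```

Expressing the attenuation as a fraction of 1/rho(A), as here, is one common way to keep the measure convergent as the network (and hence its spectral radius) evolves.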
Abstract:
Residential electricity demand in most European countries accounts for a major proportion of overall electricity consumption. The timing of residential electricity demand has significant impacts on carbon emissions and system costs. This paper reviews the data and methods used in time use studies in the context of residential electricity demand modelling. It highlights key issues which are likely to become more topical for research on the timing of electricity demand following the roll-out of smart meters.
Abstract:
African societies are dependent on rainfall for agricultural and other water-dependent activities, yet rainfall is extremely variable in both space and time, and recurring water shocks, such as drought, can have considerable social and economic impacts. To help improve our knowledge of the rainfall climate, we have constructed a 30-year (1983–2012), temporally consistent rainfall dataset for Africa known as TARCAT (TAMSAT African Rainfall Climatology And Time-series) using archived Meteosat thermal infra-red (TIR) imagery, calibrated against rain gauge records collated from numerous African agencies. TARCAT has been produced at 10-day (dekad) scale at a spatial resolution of 0.0375°. An intercomparison of TARCAT from 1983 to 2010 with six long-term precipitation datasets indicates that TARCAT replicates the spatial and seasonal rainfall patterns and interannual variability well, with correlation coefficients of 0.85 and 0.70 with the Climate Research Unit (CRU) and Global Precipitation Climatology Centre (GPCC) gridded-gauge analyses respectively in the interannual variability of the Africa-wide mean monthly rainfall. The design of the algorithm for drought monitoring leads to TARCAT underestimating the Africa-wide mean annual rainfall on average by 0.37 mm day⁻¹ (21%) compared to other datasets. As the TARCAT rainfall estimates are historically calibrated across large climatically homogeneous regions, the data can provide users with robust estimates of climate related risk, even in regions where gauge records are inconsistent in time.
Abstract:
Recent studies of the variation of geomagnetic activity over the past 140 years have quantified the "coronal source" magnetic flux F-s that leaves the solar atmosphere and enters the heliosphere and have shown that it has risen, on average, by an estimated 34% since 1963 and by 140% since 1900. This variation of open solar flux has been reproduced by Solanki et al. [2000] using a model which demonstrates how the open flux accumulates and decays, depending on the rate of flux emergence in active regions and on the length of the solar cycle. We here use a new technique to evaluate solar cycle length and find that it does vary in association with the rate of change of F-s in the way predicted. The long-term variation of the rate of flux emergence is found to be very similar in form to that in F-s, which may offer a potential explanation of why F-s appears to be a useful proxy for extrapolating solar total irradiance back in time. We also find that most of the variation of cosmic ray fluxes incident on Earth is explained by the strength of the heliospheric field (quantified by F-s) and use observations of the abundance of the isotope Be-10 (produced by cosmic rays and deposited in ice sheets) to study the decrease in F-s during the Maunder minimum. The interior motions at the base of the convection zone, where the solar dynamo is probably located, have recently been revealed using the helioseismology technique and found to exhibit a 1.3-year oscillation. This periodicity is here reported in observations of the interplanetary magnetic field and geomagnetic activity but is only present after 1940. When present, it shows a strong 22-year variation, peaking near the maximum of even-numbered sunspot cycles and showing minima at the peaks of odd-numbered cycles. We discuss the implications of these long-term solar and heliospheric variations for Earth's environment.
Abstract:
Early in 1996, the latest of the European incoherent-scatter (EISCAT) radars came into operation on the Svalbard islands. The EISCAT Svalbard Radar (ESR) has been built in order to study the ionosphere in the northern polar cap and, in particular, the dayside cusp. Conditions in the upper atmosphere in the cusp region are complex, with magnetosheath plasma cascading freely into the atmosphere along open magnetic field lines as a result of magnetic reconnection at the dayside magnetopause. A model has been developed to predict the effects of pulsed reconnection and the subsequent cusp precipitation in the ionosphere. Using this model we have successfully recreated some of the major features seen in photometer and satellite data within the cusp. In this paper, the work is extended to predict the signatures of pulsed reconnection in ESR data when the radar is pointed along the magnetic field. It is expected that enhancements in both electron concentration and electron temperature will be observed. Whether these enhancements are continuous in time or occur as a series of separate events is shown to depend critically on where the open/closed field-line boundary lies with respect to the radar. This is shown to be particularly true when reconnection pulses are superposed on a steady background rate.
Abstract:
In 1984 and 1985 a series of experiments was undertaken in which dayside ionospheric flows were measured by the EISCAT "Polar" experiment, while observations of the solar wind and interplanetary magnetic field (IMF) were made by the AMPTE UKS and IRM spacecraft upstream from the Earth's bow shock. As a result, 40 h of simultaneous data were acquired, which are analysed in this paper to investigate the relationship between the ionospheric flow and the North-South (Bz) component of the IMF. The ionospheric flow data have 2.5 min resolution, and cover the dayside local time sector from ∼09:30 to ∼18:30 M.L.T. and the latitude range from 70.8° to 74.3°. Using cross-correlation analysis it is shown that clear relationships do exist between the ionospheric flow and IMF Bz, but that the form of the relations depends strongly on latitude and local time. These dependencies are readily interpreted in terms of a twin-vortex flow pattern in which the magnitude and latitudinal extent of the flows become successively larger as Bz becomes successively more negative. Detailed maps of the flow are derived for a range of Bz values (between ±4 nT) which clearly demonstrate the presence of these effects in the data. The data also suggest that the morning reversal in the East-West component of flow moves to earlier local times as Bz declines in value and becomes negative. The correlation analysis also provides information on the ionospheric response time to changes in IMF Bz, it being found that the response is very rapid indeed. The most rapid response occurs in the noon to mid-afternoon sector, where the westward flows of the dusk cell respond with a delay of 3.9 ± 2.2 min to changes in the North-South field at the subsolar magnetopause. The flows appear to evolve in form over the subsequent ∼5 min interval, however, as indicated by the longer response times found for the northward component of flow in this sector (6.7 ± 2.2 min), and in data from earlier and later local times.
No evidence is found for a latitudinal gradient in response time; changes in flow take place coherently in time across the entire radar field-of-view.
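The lag-scanning cross-correlation used to extract such response times can be sketched as follows; the signal names and the synthetic 3-sample delay are illustrative, not the EISCAT/AMPTE data:

```python
import numpy as np

def response_lag(driver, response, max_lag):
    """Estimate the delay (in samples) at which `response` best correlates
    with `driver`, by scanning non-negative lags and picking the maximum
    Pearson correlation."""
    best_lag, best_r = 0, -np.inf
    for lag in range(max_lag + 1):
        d = driver[:driver.size - lag] if lag else driver
        r = np.corrcoef(d, response[lag:])[0, 1]
        if r > best_r:
            best_lag, best_r = lag, r
    return best_lag, best_r

# Synthetic example: the "flow" responds to the "Bz" driver 3 samples later.
rng = np.random.default_rng(2)
bz = rng.standard_normal(2000)
flow = np.roll(bz, 3) + 0.2 * rng.standard_normal(2000)  # roll wraps 3 samples
lag, r = response_lag(bz, flow, max_lag=10)
```

With sampled data the recovered lag is quantised to the sampling interval (2.5 min in the experiment above), which is one reason quoted response times carry uncertainties of that order.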
Abstract:
The Madden–Julian Oscillation (MJO) is the chief source of tropical intra-seasonal variability, but is simulated poorly by most state-of-the-art GCMs. Common errors include a lack of eastward propagation at the correct frequency and zonal extent, and too small a ratio of eastward- to westward-propagating variability. Here it is shown that HiGEM, a high-resolution GCM, simulates a very realistic MJO with approximately the correct spatial and temporal scale. Many MJO studies in GCMs are limited to diagnostics which average over a latitude band around the equator, allowing an analysis of the MJO's structure in time and longitude only. In this study a wider range of diagnostics is applied. It is argued that such an approach is necessary for a comprehensive analysis of a model's MJO. The standard analysis of Wheeler and Hendon (Mon Wea Rev 132(8):1917–1932, 2004; WH04) is applied to produce composites, which show a realistic spatial structure in the MJO envelopes except for the timing of the peak precipitation in the inter-tropical convergence zone, which bifurcates the MJO signal. Further diagnostics are developed to analyse the MJO's episodic nature and the "MJO inertia" (the tendency to remain in the same WH04 phase from one day to the next). HiGEM favours phases 2, 3, 6 and 7; has too much MJO inertia; and dies out too frequently in phase 3. Recent research has shown that a key feature of the MJO is its interaction with the diurnal cycle over the Maritime Continent. This interaction is present in HiGEM but is unrealistically weak.
Abstract:
High-resolution simulations over a large tropical domain (∼20°S–20°N and 42°E–180°E) using both explicit and parameterized convection are analyzed and compared during a 10-day case study of an active Madden-Julian Oscillation (MJO) event. In Part II, the moisture budgets and moist entropy budgets are analyzed. Vertical subgrid diabatic heating profiles and vertical velocity profiles are also compared; these are related to the horizontal and vertical advective components of the moist entropy budget which contribute to the gross moist stability (GMS) and the normalized GMS (NGMS). The 4-km model with explicit convection and good MJO performance has a vertical heating structure that increases with height in the lower troposphere in regions of strong convection (like observations), whereas the 12-km model with parameterized convection and a poor MJO does not show this relationship. The 4-km explicit convection model also has a more top-heavy heating profile for the troposphere as a whole near and to the west of the active MJO-related convection, unlike the 12-km parameterized convection model. The dependence of entropy advection components on moisture convergence is fairly weak in all models, and differences between models are not always related to MJO performance, making comparisons to previous work somewhat inconclusive. However, models with relatively good MJO strength and propagation have a slightly larger increase of the vertical advective component with increasing moisture convergence, and their NGMS vertical terms have more variability in time and longitude, with total NGMS that is comparatively larger to the west and smaller to the east.
Abstract:
More than two decades have passed since the fall of the Berlin Wall and the transfer of the Cold War file from a daily preoccupation of policy makers to a more detached assessment by historians. Scholars of U.S.-Latin American relations are beginning to take advantage both of the distance in time and of newly opened archives to reflect on the four decades that, from the 1940s to the 1980s, divided the Americas, as they did much of the world. Others are seeking to understand U.S. policy and inter-American relations in the post-Cold War era, a period that not only lacks a clear definition but also still has no name. Still others have turned their gaze forward to offer policies in regard to the region for the new Obama administration. Numerous books and review essays have addressed these three subjects—the Cold War, the post-Cold War era, and current and future issues on the inter-American agenda. Few of these studies attempt, however, to connect the three subjects or to offer new and comprehensive theories to explain the course of U.S. policies from the beginning of the twentieth century until the present. Indeed, some works and policy makers continue to use the mind-sets of the Cold War as though that conflict were still being fought. With the benefit of newly opened archives, some scholars have nevertheless drawn insights from the depths of the Cold War that improve our understanding of U.S. policies and inter-American relations, but they do not address the question as to whether the United States has escaped the longer cycle of intervention followed by neglect that has characterized its relations with Latin America. Another question is whether U.S. policies differ markedly before, during, and after the Cold War. In what follows, we ask whether the books reviewed here provide any insights in this regard and whether they offer a compass for the future of inter-American relations. 
We also offer our own thoughts as to how their various perspectives could be synthesized to address these questions more comprehensively.
Abstract:
To maintain synchrony in group activities, each individual within the group must continuously correct their movements to remain in time with the temporal cues available. Cues might originate from one or more members of the group. Current research suggests that when synchronising movements, individuals optimise their performance in terms of minimising variability of timing errors (asynchronies) between external cues and their own movements. However, the cost of this is an increase in the timing variability of their own movements. Here we investigate whether an individual's timing strategy changes according to the task, in a group scenario. To investigate this, we employed a novel paradigm that positioned six individuals to form two chains with common origin and termination on the circumference of a circle. We found that participants with access to timing cues from only one other member used a strategy to minimise their asynchrony variance. In contrast, the participant at the common termination of the two chains, who was required to integrate timing cues from two members, used a strategy that minimised movement variability. We conclude that humans are able to flexibly switch timekeeping strategies to meet task demands and thus optimise the temporal performance of their movements.
Abstract:
Explosive cyclones are intense extra-tropical low pressure systems featuring large deepening rates. In the Euro-Atlantic sector, they are a major source of life-threatening weather impacts due to their associated strong wind gusts, heavy precipitation and storm surges. The wintertime variability of the North Atlantic cyclonic activity is primarily modulated by the North Atlantic Oscillation (NAO). In this study, we investigate the interannual and multi-decadal variability of explosive North Atlantic cyclones using track density data from two reanalysis datasets (NCEP and ERA-40) and a control simulation of an atmosphere/ocean coupled General Circulation Model (GCM—ECHAM5/MPIOM1). The leading interannual and multi-decadal modes of variability of explosive cyclone track density are characterized by a strengthening/weakening pattern between Newfoundland and Iceland, which is mainly modulated by the NAO at both timescales. However, the NAO control of interannual cyclone variability is not stationary in time, fluctuating abruptly over periods 20–25 years long in both NCEP and ECHAM5/MPIOM1. These transitions are accompanied by structural changes in the leading mode of explosive cyclone variability, and by decreased/enhanced baroclinicity over the sub-polar/sub-tropical North Atlantic. The influence of the ocean is apparently important for both the occurrence and persistence of such anomalous periods. In the GCM, the Atlantic Meridional Overturning Circulation appears to influence the large-scale baroclinicity and explosive cyclone development over the North Atlantic. These results permit a better understanding of explosive cyclogenesis variability at different climatic timescales and might help to improve predictions of these hazardous events.
Abstract:
Understanding complex social-ecological systems, and anticipating how they may respond to rapid change, requires an approach that incorporates environmental, social, economic, and policy factors, usually in a context of fragmented data availability. We employed fuzzy cognitive mapping (FCM) to integrate these factors in the assessment of future wildfire risk in the Chiquitania region, Bolivia. In this region, dealing with wildfires is becoming increasingly challenging due to reinforcing feedbacks between multiple drivers. We conducted semi-structured interviews and constructed different FCMs in focus groups to understand the regional dynamics of wildfire from diverse perspectives. We used FCM modelling to evaluate possible adaptation scenarios in the context of future drier climatic conditions. Scenarios also considered possible failure to respond in time to the emergent risk. This approach showed great potential to support decision-making for risk management. It helped identify key forcing variables and generate insights into potential risks and trade-offs of different strategies. All scenarios showed increased wildfire risk in the event of more droughts. The 'Hands-off' scenario resulted in amplified impacts driven by intensifying trends, particularly affecting agricultural production. The 'Fire management' scenario, which adopted a bottom-up approach to improve controlled burning, showed fewer trade-offs between wildfire risk reduction and production compared to the 'Fire suppression' scenario. Findings highlighted the importance of considering strategies that involve all actors who use fire, and the need to nest these strategies for a more systemic approach to manage wildfire risk. The FCM model could be used as a decision-support tool and serve as a 'boundary object' to facilitate collaboration and integration of different forms of knowledge and perceptions of fire in the region.
This approach has also the potential to support decisions in other dynamic frontier landscapes around the world that are facing increased risk of large wildfires.
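As a hedged sketch of how FCM scenario iteration works in general (the three-concept map, its weights, and the sigmoid-with-clamped-drivers update convention are assumptions for illustration, not the authors' model):

```python
import numpy as np

def run_fcm(W, x0, clamped=(), steps=100):
    """Iterate a fuzzy cognitive map: concept activations are updated as
    x <- sigmoid(W.T @ x), while clamped concepts (the scenario drivers)
    are held at their initial values. Update conventions vary by study."""
    x0 = np.asarray(x0, dtype=float)
    x = x0.copy()
    idx = list(clamped)
    for _ in range(steps):
        new = 1.0 / (1.0 + np.exp(-(W.T @ x)))  # sigmoid squashing
        new[idx] = x0[idx]                      # hold drivers fixed
        x = new
    return x

# Hypothetical map: concept 0 = drought, 1 = fire management, 2 = wildfire risk.
W = np.array([
    [0.0, 0.0,  0.8],   # drought reinforces wildfire risk
    [0.0, 0.0, -0.9],   # fire management suppresses wildfire risk
    [0.0, 0.0,  0.0],
])
baseline = run_fcm(W, [0.9, 0.2, 0.5], clamped=(0, 1))  # drought, little management
managed  = run_fcm(W, [0.9, 0.9, 0.5], clamped=(0, 1))  # drought, strong management
```

Comparing the converged activations across scenarios, as the study does qualitatively, shows how a driver such as improved fire management propagates through the map to lower the wildfire-risk concept.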