26 results for Longitudinal Data Analysis and Time Series
in CentAUR: Central Archive, University of Reading - UK
Abstract:
African societies are dependent on rainfall for agricultural and other water-dependent activities, yet rainfall is extremely variable in both space and time, and recurring water shocks, such as drought, can have considerable social and economic impacts. To help improve our knowledge of the rainfall climate, we have constructed a 30-year (1983–2012), temporally consistent rainfall dataset for Africa known as TARCAT (TAMSAT African Rainfall Climatology And Time-series) using archived Meteosat thermal infra-red (TIR) imagery, calibrated against rain gauge records collated from numerous African agencies. TARCAT has been produced at 10-day (dekad) scale at a spatial resolution of 0.0375°. An intercomparison of TARCAT from 1983 to 2010 with six long-term precipitation datasets indicates that TARCAT replicates the spatial and seasonal rainfall patterns and interannual variability well, with correlation coefficients of 0.85 and 0.70 for the interannual variability of the Africa-wide mean monthly rainfall against the Climate Research Unit (CRU) and Global Precipitation Climatology Centre (GPCC) gridded-gauge analyses respectively. The design of the algorithm for drought monitoring leads TARCAT to underestimate the Africa-wide mean annual rainfall by, on average, 0.37 mm day⁻¹ (21%) compared to other datasets. As the TARCAT rainfall estimates are historically calibrated across large climatically homogeneous regions, the data can provide users with robust estimates of climate-related risk, even in regions where gauge records are inconsistent in time.
Abstract:
A predictability index was defined by Granger and Anderson (1976) and Bhansali (1989) as the ratio of the variance of the optimal prediction to the variance of the original time series. A new simplified algorithm for estimating the predictability index is introduced, and the new estimator is shown to be a simple and effective tool in applications of predictability ranking and as an aid in the preliminary analysis of time series. The relationship between the predictability index and the pole positions and lag order p of a time series that can be modelled as an AR(p) process is also investigated. The effectiveness of the algorithm is demonstrated using numerical examples, including an application to stock prices.
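To make the definition concrete, the following is a minimal sketch in Python, not the paper's simplified algorithm, of estimating the index for a simulated AR(2) series: because the variance of the optimal one-step predictor equals the series variance minus the innovation variance, the index reduces to 1 − σ²/Var(x). The simulated coefficients, the lag order and the use of statsmodels' AutoReg are illustrative assumptions.

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(0)

# Simulate an AR(2) series: x_t = 0.75 x_{t-1} - 0.40 x_{t-2} + e_t
n = 2000
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.75 * x[t - 1] - 0.40 * x[t - 2] + rng.standard_normal()

# Fit an AR(p) model; the residual variance estimates the innovation
# variance sigma^2 of the one-step prediction errors.
p = 2
fit = AutoReg(x, lags=p).fit()
sigma2 = fit.resid.var(ddof=p + 1)

# Var(optimal one-step prediction) = Var(x) - sigma^2, so the
# predictability index is 1 - sigma^2 / Var(x).
pred_index = 1.0 - sigma2 / x.var(ddof=1)
print(f"estimated predictability index: {pred_index:.3f}")
```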
Abstract:
We have applied time series analytical techniques to the flux of lava from an extrusive eruption. Tilt data acting as a proxy for flux are used in a case study of the May–August 1997 period of the eruption at Soufrière Hills Volcano, Montserrat. We justify the use of such a proxy with simple calibration arguments. Three techniques of time series analysis are employed: spectral, spectrogram and wavelet methods. In addition to the well-known ~9-hour periodicity shown by these data, the wavelet analysis reveals a previously unknown periodic flux variability: a 3-day cycle of frequency modulation during June–July 1997, though the physical mechanism responsible is not clear. Such time series analysis has potential for other lava flux proxies at other types of volcanoes.
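As an illustration of the kind of spectrogram analysis described (not the study's actual pipeline), the sketch below recovers a ~9-hour cycle from a synthetic tilt-like series; the 10-minute sampling interval, record length, trend and noise level are all illustrative assumptions.

```python
import numpy as np
from scipy.signal import spectrogram

dt_hours = 1.0 / 6.0                      # assumed 10-minute sampling
t = np.arange(0, 90 * 24, dt_hours)      # ~90 days of data, in hours
rng = np.random.default_rng(1)

# Synthetic "tilt" proxy: a 9-hour cycle plus a slow trend and noise.
signal = (np.sin(2 * np.pi * t / 9.0)
          + 0.02 * t / 24
          + 0.5 * rng.standard_normal(t.size))

# Spectrogram: power as a function of frequency (cycles/hour) and time.
f, tt, Sxx = spectrogram(signal, fs=1.0 / dt_hours, nperseg=1024)

# Dominant frequency in each window; ~1/9 cycles/hour is expected.
dominant_period_hours = 1.0 / f[Sxx[1:].argmax(axis=0) + 1]  # skip DC bin
print("median dominant period (h):", np.median(dominant_period_hours))
```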
Abstract:
Simulations of 15 coupled chemistry-climate models, for the period 1960–2100, are presented. The models include a detailed stratosphere as well as a realistic representation of the tropospheric climate. The simulations assume a consistent set of changing greenhouse gas concentrations, as well as temporally varying chlorofluorocarbon concentrations in accordance with observations for the past and expectations for the future. The ozone results are analyzed using a nonparametric additive statistical model. Comparisons are made with observations for the recent past, and the recovery of ozone, indicated by a return to 1960 and 1980 values, is investigated as a function of latitude. Although chlorine amounts are simulated to return to 1980 values by about 2050, with only weak latitudinal variations, column ozone amounts recover at different rates due to the influence of greenhouse gas changes. In the tropics, simulated peak ozone amounts occur by about 2050 and thereafter the total ozone column declines. Consequently, simulated ozone does not recover to values which existed prior to the early 1980s. The results also show a distinct hemispheric asymmetry, with recovery to 1980 values in the Northern Hemisphere extratropics ahead of the chlorine return by about 20 years. In the Southern Hemisphere midlatitudes, ozone is simulated to return to 1980 levels only 10 years ahead of chlorine. In the Antarctic, annually averaged ozone recovers at about the same rate as chlorine in high latitudes and hence does not return to 1960s values until the last decade of the simulations.
Abstract:
The use of pulse compression techniques to improve the sensitivity of meteorological radars has become increasingly common in recent years. An unavoidable side-effect of such techniques is the formation of ‘range sidelobes’, which lead to the spreading of information across several range gates. These artefacts are particularly troublesome in regions where there is a sharp gradient in the power backscattered to the antenna as a function of range. In this article we present a simple method for identifying and correcting range sidelobe artefacts. We make use of the fact that meteorological targets produce an echo which fluctuates at random, and that this echo, like a fingerprint, is unique to each range gate. By cross-correlating the echo time series from pairs of gates, therefore, we can identify whether information from one gate has spread into another, and hence flag regions of contamination. In addition, we show that the correlation coefficients contain quantitative information about the fraction of power leaked from one range gate to another, and we propose a simple algorithm to correct the corrupted reflectivity profile.
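A minimal sketch of this idea as we read it (not the authors' code) is given below: adjacent range gates whose echo time series are strongly correlated are flagged as contaminated. The correlation threshold and the additive leakage model in the closing comment are illustrative assumptions.

```python
import numpy as np

def flag_sidelobes(echoes: np.ndarray, threshold: float = 0.3):
    """echoes: (n_gates, n_samples) array of echo time series, one row
    per range gate.  Returns |correlation| between each adjacent pair
    of gates and a flag marking pairs whose echoes are suspiciously
    similar -- the signature of power leaking between gates."""
    n_gates = echoes.shape[0]
    rho = np.empty(n_gates - 1)
    for g in range(n_gates - 1):
        rho[g] = np.abs(np.corrcoef(echoes[g], echoes[g + 1])[0, 1])
    return rho, rho > threshold

# Under a simple additive leakage model (gate g+1 receives its own
# uncorrelated echo plus a leaked copy of gate g's echo), rho**2
# approximates the fraction of the contaminated gate's power that
# leaked in -- the quantity needed to correct the reflectivity profile.
```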
Abstract:
Recent interest in the validation of general circulation models (GCMs) has been devoted to objective methods. A small number of authors have used the direct synoptic identification of phenomena together with a statistical analysis to perform the objective comparison between various datasets. This paper describes a general method for performing the synoptic identification of phenomena that can be used for an objective analysis of atmospheric, or oceanographic, datasets obtained from numerical models and remote sensing. Methods usually associated with image processing have been used to segment the scene and to identify suitable feature points to represent the phenomena of interest. This is performed for each time level. A technique from dynamic scene analysis is then used to link the feature points to form trajectories. The method is fully automatic and should be applicable to a wide range of geophysical fields. An example is shown of results obtained by applying this method to data from a run of the Universities Global Atmospheric Modelling Project GCM.
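The trajectory-linking step can be illustrated with a minimal sketch (a greedy nearest-neighbour stand-in, not the paper's dynamic scene analysis technique); the distance threshold and the (t, x, y) trajectory representation are illustrative assumptions.

```python
import numpy as np

def link_trajectories(frames, max_dist=2.0):
    """frames: list of (n_i, 2) arrays of feature-point coordinates,
    one array per time level.  Greedily links each tracked point to
    the nearest point in the next frame within max_dist; unmatched
    points start new trajectories.  Returns a list of trajectories,
    each a list of (t, x, y) tuples."""
    trajectories, active = [], {}
    for i, pt in enumerate(frames[0]):
        traj = [(0, *pt)]
        trajectories.append(traj)
        active[i] = traj
    for t in range(1, len(frames)):
        prev, curr = frames[t - 1], frames[t]
        new_active, taken = {}, set()
        for i, traj in active.items():           # extend existing tracks
            if curr.size == 0:
                continue
            d = np.linalg.norm(curr - prev[i], axis=1)
            j = int(d.argmin())
            if d[j] <= max_dist and j not in taken:
                traj.append((t, *curr[j]))
                new_active[j] = traj
                taken.add(j)
        for j, pt in enumerate(curr):            # unmatched points start tracks
            if j not in taken:
                traj = [(t, *pt)]
                trajectories.append(traj)
                new_active[j] = traj
        active = new_active
    return trajectories
```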
Abstract:
This article reflects on key methodological issues emerging from children and young people's involvement in data analysis processes. We outline a pragmatic framework illustrating different approaches to engaging children, using two case studies of children's experiences of participating in data analysis. The article highlights methods of engagement and important issues such as the balance of power between adults and children, training, support, ethical considerations, time and resources. We argue that involving children in data analysis processes can have several benefits, including enabling a greater understanding of children's perspectives and helping to prioritise children's agendas in policy and practice.
Abstract:
This paper exploits a structural time series approach to model the time pattern of multiple and resurgent food scares and their direct and cross-product impacts on consumer response. A structural time series Almost Ideal Demand System (STS-AIDS) is embedded in a vector error correction framework to allow for dynamic effects (VEC-STS-AIDS). Italian aggregate household data on meat demand are used to assess the time-varying impact of the resurgent BSE crisis (1996 and 2000) and the 1999 Dioxin crisis. The VEC-STS-AIDS model monitors the short-run impacts and performs satisfactorily in terms of residual diagnostics, overcoming the major problems encountered by the customary vector error correction approach.
Abstract:
We present symbolic resonance analysis (SRA) as a viable method for addressing the problem of enhancing a weakly dominant mode in a mixture of impulse responses obtained from a nonlinear dynamical system. We demonstrate this using results from a numerical simulation with Duffing oscillators in different domains of their parameter space, and by analyzing event-related brain potentials (ERPs) from a language processing experiment in German as a representative application. In this paradigm, the averaged ERPs exhibit an N400 followed by a sentence-final negativity. Contemporary sentence processing models predict a late positivity (P600) as well. We show that the SRA is able to unveil the P600 evoked by the critical stimuli as a weakly dominant mode from beneath the covering sentence-final negativity.
Abstract:
Bayesian Model Averaging (BMA) is used to test for multiple break points in univariate series using conjugate normal-gamma priors. This approach can test for the number of structural breaks and produce posterior probabilities for a break at each point in time. Results are averaged over specifications including stationary, stationary-around-trend and unit-root models, each containing different types and numbers of breaks and different lag lengths. The procedures are used to test for structural breaks in 14 annual macroeconomic series and 11 natural resource price series. The results indicate that there are structural breaks in all of the natural resource series and most of the macroeconomic series. Many of the series had multiple breaks. Our findings regarding the existence of unit roots, having allowed for structural breaks in the data, are largely consistent with previous work.
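As a simplified single-break analogue of this procedure (not the authors' full averaging over model specifications), the sketch below computes posterior probabilities for a break in the mean at each date, using the closed-form marginal likelihood of normal data under a conjugate normal-gamma prior; the prior hyperparameters and the mean-shift-only model are illustrative assumptions.

```python
import numpy as np
from scipy.special import gammaln

def log_marglik(y, mu0=0.0, kappa0=0.01, alpha0=1.0, beta0=1.0):
    """Closed-form log marginal likelihood of i.i.d. normal data under
    a conjugate normal-gamma prior on (mean, precision)."""
    n = y.size
    ybar = y.mean()
    kappan = kappa0 + n
    alphan = alpha0 + 0.5 * n
    betan = (beta0 + 0.5 * np.sum((y - ybar) ** 2)
             + kappa0 * n * (ybar - mu0) ** 2 / (2.0 * kappan))
    return (gammaln(alphan) - gammaln(alpha0)
            + alpha0 * np.log(beta0) - alphan * np.log(betan)
            + 0.5 * (np.log(kappa0) - np.log(kappan))
            - 0.5 * n * np.log(2.0 * np.pi))

def break_posterior(y, min_seg=5):
    """Posterior probability of a single break in the mean at each
    admissible date, under a uniform prior over break dates."""
    dates = np.arange(min_seg, y.size - min_seg)
    logp = np.array([log_marglik(y[:tau]) + log_marglik(y[tau:])
                     for tau in dates])
    w = np.exp(logp - logp.max())          # stabilise before normalising
    return dates, w / w.sum()

rng = np.random.default_rng(2)
y = np.concatenate([rng.normal(0.0, 1.0, 60), rng.normal(1.5, 1.0, 60)])
dates, post = break_posterior(y)
print("posterior mode of the break date:", dates[post.argmax()])
```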
Abstract:
A novel approach is presented for combining spatial and temporal detail from newly available TRMM-based data sets to derive hourly rainfall intensities at 1-km spatial resolution for hydrological modelling applications. Time series of rainfall intensities derived from 3-hourly 0.25° TRMM 3B42 data are merged with a 1-km gridded rainfall climatology based on TRMM 2B31 data to account for the sub-grid spatial distribution of rainfall intensities within coarse-scale 0.25° grid cells. The method is implemented for two dryland catchments in Tunisia and Senegal, and validated against gauge data. The outcomes of the validation show that the spatially disaggregated and intensity corrected TRMM time series more closely approximate ground-based measurements than non-corrected data. The method introduced here enables the generation of rainfall intensity time series with realistic temporal and spatial detail for dynamic modelling of runoff and infiltration processes that are especially important to water resource management in arid regions.
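The core merging step can be illustrated with a minimal sketch (not the authors' implementation): each 1-km pixel receives the coarse-cell intensity weighted by its climatological share of the cell mean, so the coarse-cell average is preserved. The 25×25 pixels-per-cell ratio and the synthetic climatology are illustrative assumptions.

```python
import numpy as np

def disaggregate(coarse_mm_h: float, clim_fine: np.ndarray) -> np.ndarray:
    """coarse_mm_h: rainfall intensity for one coarse 0.25-degree cell
    (e.g. from 3-hourly TRMM 3B42).  clim_fine: (25, 25) rainfall
    climatology of the 1-km pixels inside that cell (e.g. from TRMM
    2B31).  Returns intensities redistributed over the fine pixels."""
    weights = clim_fine / clim_fine.mean()   # sub-grid distribution of rain
    return coarse_mm_h * weights             # coarse-cell mean is preserved

rng = np.random.default_rng(3)
clim = rng.gamma(shape=2.0, scale=1.5, size=(25, 25))  # synthetic climatology
fine = disaggregate(4.0, clim)
print(f"fine-pixel mean: {fine.mean():.2f} mm/h")  # ~4.0, mass conserved
```

This multiplicative weighting is a common downscaling device; the paper's additional intensity-correction step is not reproduced here.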
Abstract:
Expectations of future market conditions are generally acknowledged to be crucial for the development decision and hence for shaping the built environment. This empirical study of the Central London office market from 1987 to 2009 tests for evidence of adaptive and naive expectations. Applying VAR models and a recursive OLS regression with one-step forecasts, we find evidence of adaptive and naive, rather than rational, expectations on the part of developers. Although the magnitude of the errors and the length of the time lags vary over time and across development cycles, the results confirm that developers' decisions are explained to a large extent by contemporaneous and past conditions in both London submarkets. The corollary of this finding is that developers may be able to generate excess profits by exploiting market inefficiencies, but this may be hindered in practice by the long periods necessary for the planning and construction of the asset. More generally, the results of this study suggest that real estate cycles are largely generated endogenously rather than being the result of unexpected exogenous shocks.
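A minimal sketch of the recursive one-step-ahead forecasting exercise described (not the study's specification) is given below: the model is re-estimated on an expanding window so each forecast uses only information available at the time. The synthetic series and the single lagged regressor are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 120                                          # e.g. quarterly observations
x = rng.standard_normal(n)                       # lagged market conditions
y = 0.8 * x + 0.3 * rng.standard_normal(n)       # development activity

errors = []
for t in range(40, n):
    X = np.column_stack([np.ones(t), x[:t]])     # expanding estimation window
    beta, *_ = np.linalg.lstsq(X, y[:t], rcond=None)
    forecast = beta[0] + beta[1] * x[t]          # one-step-ahead forecast
    errors.append(y[t] - forecast)

rmse = np.sqrt(np.mean(np.square(errors)))
print(f"RMSE of recursive one-step forecasts: {rmse:.3f}")
```

Systematic, persistent signs in these forecast errors are the kind of evidence the study uses to distinguish adaptive or naive expectations from rational ones.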