966 results for Time Series Models


Relevance:

100.00%

Publisher:

Abstract:

Objective: Processes occurring in the course of psychotherapy are characterized by the simple fact that they unfold in time and that the multiple factors engaged in change processes vary greatly between individuals (idiographic phenomena). Previous research, however, has neglected the temporal perspective through its traditional focus on static phenomena, mainly assessed at the group level (nomothetic phenomena). To support a temporal approach, the authors introduce time-series panel analysis (TSPA), a statistical methodology explicitly focused on quantifying temporal, session-to-session aspects of change in psychotherapy. TSPA models are initially built at the level of individuals and are subsequently aggregated at the group level, thus allowing the exploration of prototypical models. Method: TSPA is based on vector auto-regression (VAR), an extension of univariate auto-regression models to multivariate time-series data. The application of TSPA is demonstrated in a sample of 87 outpatient psychotherapy patients who were monitored by postsession questionnaires. Prototypical mechanisms of change were derived from the aggregation of individual multivariate models of the psychotherapy process. In a second step, the associations between mechanisms of change (TSPA) and pre- to post-treatment symptom change were explored. Results: TSPA identified a prototypical process pattern in which patients' alliance and self-efficacy were linked by a temporal feedback loop. Furthermore, therapists' stability over time in both mastery and clarification interventions was positively associated with better outcomes. Conclusions: TSPA is a statistical tool that sheds new light on temporal mechanisms of change. Through this approach, clinicians may gain insight into prototypical patterns of change in psychotherapy.
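To make the Method concrete, here is a minimal sketch of the VAR core of TSPA in Python: fit a lag-1 VAR to each patient's session-to-session ratings, then average the coefficient matrices across patients to obtain a prototypical model. The variable names, the lag order of 1, and statsmodels as the estimator are illustrative assumptions, not details taken from the study.

```python
# Minimal sketch of the VAR core of TSPA: fit a lag-1 VAR per patient,
# then average the coefficient matrices across patients to obtain a
# prototypical group-level model. Variable names are assumptions.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

def patient_var_coefs(sessions: pd.DataFrame) -> np.ndarray:
    """Fit a lag-1 VAR to one patient's session-to-session ratings.

    `sessions` has one row per session and one column per process
    variable (e.g. alliance, self_efficacy). Returns the (k x k)
    matrix of lag-1 coefficients.
    """
    res = VAR(sessions).fit(1)
    return res.coefs[0]          # res.coefs has shape (n_lags, k, k)

def prototypical_model(patients: list[pd.DataFrame]) -> np.ndarray:
    """Aggregate the individual VAR models at the group level."""
    return np.mean([patient_var_coefs(p) for p in patients], axis=0)
```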

Relevance:

100.00%

Publisher:

Abstract:

The first manuscript, entitled "Time-Series Analysis as Input for Clinical Predictive Modeling: Modeling Cardiac Arrest in a Pediatric ICU," lays out the theoretical background for the project. Several core concepts are presented in this paper. First, traditional multivariate models (where each variable is represented by only one value) provide single point-in-time snapshots of patient status: they are incapable of characterizing deterioration. Since deterioration is consistently identified as a precursor to cardiac arrests, we maintain that the traditional multivariate paradigm is insufficient for predicting arrests. We identify time series analysis as a method capable of characterizing deterioration in an objective, mathematical fashion, and describe how to build a general foundation for predictive modeling using time series analysis results as latent variables.

Building a solid foundation for any given modeling task involves addressing a number of issues during the design phase, including selecting the proper candidate features on which to base the model and selecting the most appropriate tool to measure them. We also identified several unique design issues that are introduced when time series data elements are added to the set of candidate features. One such issue is defining the duration and resolution of the time series elements required to sufficiently characterize the time series phenomena being considered as candidate features for the predictive model. Once the duration and resolution are established, there must also be explicit mathematical or statistical operations that produce the time series analysis result to be used as a latent candidate feature.

In synthesizing the comprehensive framework for building a predictive model based on time series data elements, we identified at least four classes of data that can be used in the model design. The first two classes are shared with traditional multivariate models: multivariate data and clinical latent features. Multivariate data are represented by the standard one-value-per-variable paradigm and are widely employed in a host of clinical models and tools; they are often represented by a number in a given cell of a table. Clinical latent features are derived, rather than directly measured, data elements that represent a particular clinical phenomenon more accurately than any of the directly measured data elements in isolation. The remaining two classes are unique to time series data elements. The first of these is the raw data elements: multiple values per variable, constituting the measured observations typically available to end users when they review time series data, often represented as dots on a graph. The final class of data results from performing time series analysis. This class of data represents the fundamental concept on which our hypothesis is based. The specific statistical or mathematical operations are up to the modeler to determine, but we generally recommend that a variety of analyses be performed in order to maximize the likelihood of producing a representation of the time series data elements that is able to distinguish between two or more classes of outcomes.
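As an illustration of how a time series analysis result can serve as a latent candidate feature, the following sketch derives a single trend value (a least-squares slope over a fixed window) from a raw time series element. The window length and the use of a plain linear slope are assumptions, not the project's actual operations.

```python
# Sketch: derive a latent candidate feature from a raw time series
# element via a simple trend analysis (least-squares slope over a
# window). The 60-sample duration is an assumed design choice.
import numpy as np

def trend_feature(values: np.ndarray, window: int = 60) -> float:
    """Slope of the last `window` observations: one value per variable,
    usable alongside traditional multivariate data in a model."""
    y = values[-window:]
    t = np.arange(len(y))
    slope, _intercept = np.polyfit(t, y, deg=1)
    return slope
```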
The second manuscript, entitled "Building Clinical Prediction Models Using Time Series Data: Modeling Cardiac Arrest in a Pediatric ICU," provides a detailed description, start to finish, of the methods required to prepare the data, build, and validate a predictive model that uses the time series data elements determined in the first paper. One of the fundamental tenets of the second paper is that manual implementation of time-series-based models is infeasible due to the relatively large number of data elements and the complexity of the preprocessing that must occur before data can be presented to the model. Each of the seventeen steps in this process is analyzed from the perspective of how it may be automated, when necessary. We identify the general objectives and available strategies for each step, and we present our rationale for choosing a specific strategy for each step in the case of predicting cardiac arrest in a pediatric intensive care unit. Another issue brought to light by the second paper is that the individual steps required to use time series data for predictive modeling are more numerous and more complex than those used for modeling with traditional multivariate data. Even after complexities attributable to the design phase (addressed in our first paper) have been accounted for, the management and manipulation of the time series elements (the preprocessing steps in particular) raise issues that are not present in a traditional multivariate modeling paradigm. In our methods, we present the issues that arise from the time series data elements: defining a reference time; imputing and reducing time series data in order to conform to a predefined structure specified during the design phase; and normalizing variable families rather than individual variable instances.

The final manuscript, entitled "Using Time-Series Analysis to Predict Cardiac Arrest in a Pediatric Intensive Care Unit," presents the results obtained by applying the theoretical construct and its associated methods (detailed in the first two papers) to the case of cardiac arrest prediction in a pediatric intensive care unit. Our results showed that utilizing the trend analysis from the time series data elements reduced the number of classification errors by 73%. The area under the receiver operating characteristic curve increased from a baseline of 87% to 98% when the trend analysis was included. In addition to the performance measures, we were also able to demonstrate that adding raw time series data elements without their associated trend analyses improved classification accuracy compared with the baseline multivariate model, but diminished classification accuracy compared with adding the trend analysis features alone (i.e., without the raw time series data elements). We believe this phenomenon was largely attributable to overfitting, which is known to increase as the ratio of candidate features to class examples rises. Furthermore, although we employed several feature reduction strategies to counteract the overfitting problem, they failed to improve performance beyond that achieved by excluding the raw time series elements. Finally, our data demonstrated that pulse oximetry and systolic blood pressure readings tend to start diminishing about 10-20 minutes before an arrest, whereas heart rates tend to diminish rapidly less than 5 minutes before an arrest.
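A minimal sketch of the three preprocessing issues named in the second paper (defining a reference time, imputing and reducing to a predefined structure, and family-level normalization), written with pandas; the column names, the 1-minute resolution, and the 12-hour duration are assumed placeholders, not the project's specification.

```python
# Sketch of the three time-series preprocessing issues: anchoring on a
# reference time, imputing/reducing onto the resolution fixed at design
# time, and normalizing a variable family rather than each instance.
# Column names, 1-minute resolution, and 12-hour window are assumptions.
import pandas as pd

def preprocess(vitals: pd.DataFrame, reference_time: pd.Timestamp) -> pd.DataFrame:
    # 1. Define a reference time (e.g. arrest time, or a matched time
    #    for controls) and keep the preceding 12 hours.
    window = vitals.loc[reference_time - pd.Timedelta(hours=12):reference_time]
    # 2. Reduce/impute onto the predefined structure: a 1-minute grid,
    #    mean-aggregated, with gaps linearly interpolated.
    grid = window.resample("1min").mean().interpolate(limit_direction="both")
    # 3. Normalize the variable family (e.g. all blood-pressure columns)
    #    with pooled statistics, not per-column ones.
    bp = ["sbp", "dbp", "map"]                 # hypothetical family
    pooled = grid[bp].to_numpy().ravel()
    grid[bp] = (grid[bp] - pooled.mean()) / pooled.std()
    return grid
```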

Relevance:

100.00%

Publisher:

Abstract:

Changes of glaciers and snow cover in polar regions affect a wide range of physical and ecosystem processes on land and in the adjacent marine environment. In this study, we investigate the potential of 11-day repeat high-resolution satellite image time series from the TerraSAR-X mission to derive glaciological and hydrological parameters on King George Island, Antarctica, during the period 25 October 2010 to 19 April 2011. The spatial pattern and temporal evolution of snow cover extent on ice-free areas can be monitored using multi-temporal coherence images. SAR coherence is used to map the extent of land-terminating glaciers with an average accuracy of 25 m. Multi-temporal SAR color composites identify the position of the late-summer snow line at about 220 m above sea level. Glacier surface velocities are obtained from intensity feature-tracking. Surface velocities near the calving front of Fourcade Glacier were up to 1.8 ± 0.01 m/d. Using an intercept theorem based on fundamental geometric principles together with differential GPS field measurements, the ice discharge of Fourcade Glacier was estimated at 20700 ± 5500 m³/d (corresponding to ~19 ± 5 kt/d). The rapidly changing surface conditions on King George Island and the lack of high-resolution digital elevation models for the region remain restrictions on the applicability of SAR data and the precision of derived products.
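The reported discharge is consistent with a simple flux-gate calculation (velocity times cross-sectional area). In the back-of-envelope sketch below, the cross-sectional area is an assumed value chosen only to reproduce the magnitude reported above, not a measured quantity.

```python
# Back-of-envelope flux-gate check: discharge = velocity * cross-section.
# The cross-sectional area is a hypothetical value, not from the study.
velocity = 1.8            # m/d, near the calving front (from the abstract)
cross_section = 11500.0   # m**2, assumed flux-gate area
ice_density = 917.0       # kg/m**3, standard value for glacier ice

discharge = velocity * cross_section          # ~20700 m**3/d
mass_flux = discharge * ice_density / 1e6     # ~19 kt/d (1 kt = 1e6 kg)
print(discharge, mass_flux)
```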

Relevance:

100.00%

Publisher:

Abstract:

Spectral absorption coefficients of total particulate matter, ap(λ), were determined using the in vitro filter technique. The present analysis deals with a set of 1166 spectra, determined in various oceanic (case 1) waters, with field chl a concentrations ([chl]) spanning 3 orders of magnitude (0.02-25 mg/m³). As previously shown [Bricaud et al., 1995, doi:10.1029/95JC00463] for the absorption coefficients of living phytoplankton, aφ(λ), the ap(λ) coefficients also increase nonlinearly with [chl]. The relationships (power laws) that link ap(λ) and aφ(λ) to [chl] show striking similarities. Despite large fluctuations, the relative contribution of nonalgal particles to total absorption oscillates around an average value of 25-30% throughout the [chl] range. The spectral dependence of absorption by these nonalgal particles follows an exponential increase toward short wavelengths, with a weakly variable slope (0.011 ± 0.0025/nm). The empirical relationships linking ap(λ) to [chl] can be used in bio-optical models. This parameterization based on in vitro measurements agrees well with a former model of the diffuse attenuation coefficient based on in situ measurements. This agreement is worth noting, as independent methods and data sets are compared. It is stressed that, for a given [chl], the ap(λ) coefficients show large residual variability around the regression lines (for instance, by a factor of 3 at 440 nm). The consequences of such variability for predicting or interpreting the diffuse reflectance of the ocean are examined, according to whether or not these variations in ap are associated with concomitant variations in particle scattering. In most situations the deviations in ap are not compensated by those in particle scattering, so that the amplitude of reflectance is affected by these variations.
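The parameterization described here can be written down compactly. In the sketch below, the power-law coefficients A(λ) and E(λ) are illustrative placeholders to be fitted per wavelength; only the spectral slope S = 0.011/nm and the general functional forms come from the abstract.

```python
# Sketch of the parameterization: total particulate absorption as a
# power law of [chl], and nonalgal-particle absorption as an exponential
# of wavelength. A and E are per-wavelength fit coefficients (placeholders
# here); only S = 0.011 /nm is taken from the abstract.
import numpy as np

S = 0.011               # 1/nm, spectral slope of nonalgal absorption

def a_p(chl: float, A: float, E: float) -> float:
    """Total particulate absorption at one wavelength: ap = A * chl**E."""
    return A * chl**E

def a_nap(a_nap_443: float, wavelength: np.ndarray) -> np.ndarray:
    """Nonalgal particle absorption, increasing toward short wavelengths,
    referenced to its value at 443 nm."""
    return a_nap_443 * np.exp(-S * (wavelength - 443.0))
```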

Relevance:

100.00%

Publisher:

Abstract:

Monitoring the dynamics of environmental processes is currently considered a topic of great interest in environmental science. The spatio-temporal coverage of remote sensing data provides continuous observations with a high temporal frequency, allowing the assessment of ecosystem evolution at different temporal and spatial scales. Although the value of remote sensing time series has been firmly established, only a few methods have been developed for analyzing such data in a quantitative and continuous manner. In this dissertation, a working framework for exploiting remote sensing time series is proposed, based on the combination of time series analysis and a phenometric approach. The main goal is to demonstrate the use of remote sensing time series to analyze the dynamics of environmental variables quantitatively.
The specific objectives are (1) to assess environmental variables based on remote sensing time series and (2) to develop empirical models to forecast their future behaviour. These objectives are pursued in four applications whose specific aims are: (1) assessing and mapping cotton crop phenological stages using spectral and phenometric analyses, (2) assessing and modeling fire seasonality in two different bioclimatic ecoregions using dynamic models, (3) forecasting forest fire risk on a pixel basis using dynamic models, and (4) assessing vegetation functioning based on temporal autocorrelation and phenometric analysis. The results of this dissertation show the usefulness of function-fitting procedures to model the spectral indices AS1 and AS2. Phenometrics derived from the function-fitting procedure make it possible to identify distinct cotton crop phenological stages. Spectral analysis demonstrated, in a quantitative way, the presence of one cycle in the AS2 index and two cycles in AS1, as well as the unimodal and bimodal behaviour of fire seasonality in the Mediterranean and temperate ecoregions, respectively. Autoregressive models were used to characterize the dynamics of fire seasonality in the two ecoregions and to forecast fire risk accurately on a pixel basis. The usefulness of temporal autocorrelation to define and characterize land surface functioning at the pixel level was also demonstrated. Finally, the "Optical Functional Type" concept is proposed, in which pixels are treated as temporal units and analyzed according to their temporal dynamics or functioning.
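A minimal sketch of the framework's two ingredients for a single pixel's series, function fitting (to derive phenometrics) and an autoregressive forecast; a generic monthly vegetation-index series and statsmodels stand in for the AS1/AS2 indices and the dissertation's actual models.

```python
# Sketch of the framework for one pixel: (1) fit an annual harmonic to
# derive simple phenometrics, (2) fit an autoregressive model to
# forecast the next time steps. A generic monthly vegetation-index
# series is an assumed stand-in for the AS1/AS2 indices.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

def annual_harmonic(y: np.ndarray, period: int = 12):
    """Least-squares fit of mean + one annual cycle; the fitted
    amplitude and phase serve as simple phenometrics."""
    t = np.arange(len(y))
    X = np.column_stack([np.ones_like(t, dtype=float),
                         np.cos(2 * np.pi * t / period),
                         np.sin(2 * np.pi * t / period)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    amplitude = np.hypot(coef[1], coef[2])
    phase = np.arctan2(coef[2], coef[1])
    return amplitude, phase

def ar_forecast(y: np.ndarray, lags: int = 12, steps: int = 6) -> np.ndarray:
    """Autoregressive forecast of the index for the next `steps` periods."""
    res = AutoReg(y, lags=lags).fit()
    return res.predict(start=len(y), end=len(y) + steps - 1)
```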

Relevance:

100.00%

Publisher:

Abstract:

In order to implement accurate models for wind power ramp forecasting, ramps need to be characterised beforehand. This issue has typically been addressed by performing binary ramp/non-ramp classifications based on ad hoc thresholds. However, recent works question this approach. This paper presents the ramp function, an innovative wavelet-based tool which detects and characterises ramp events in wind power time series. The underlying idea is to assess a continuous index related to the ramp intensity at each time step, obtained by considering large power output gradients evaluated under different time scales (up to typical ramp durations). The ramp function overcomes some of the drawbacks of the aforementioned binary classification and permits forecasters to easily reveal specific features of the ramp behaviour observed at a wind farm. As an example, the daily profiles of ramp-up and ramp-down intensities are obtained for the case of a wind farm located in Spain.
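A simplified sketch of the ramp-function idea follows: a continuous, signed index obtained from power gradients evaluated over several time scales. It uses plain multi-scale differences in place of the paper's wavelet transform, and the maximum scale is an assumed value.

```python
# Simplified ramp-function sketch: a continuous ramp-intensity index
# built from power gradients at several time scales. Plain multi-scale
# differences stand in for the paper's wavelet transform.
import numpy as np

def ramp_index(power: np.ndarray, max_scale: int = 6) -> np.ndarray:
    """Signed index per time step: large positive = ramp-up, large
    negative = ramp-down. `max_scale` approximates the longest typical
    ramp duration in time steps (an assumed value)."""
    idx = np.zeros_like(power, dtype=float)
    for scale in range(1, max_scale + 1):
        grad = np.zeros_like(idx)
        grad[scale:] = (power[scale:] - power[:-scale]) / scale
        # keep, at each step, the gradient with the largest magnitude
        take = np.abs(grad) > np.abs(idx)
        idx[take] = grad[take]
    return idx
```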

Relevance:

100.00%

Publisher:

Abstract:

The Lomb periodogram has traditionally been a tool that allows us to elucidate whether a frequency is important for explaining the behaviour of a given time series. Many linear and nonlinear iterative harmonic processes used for studying the spectral content of a time series rely on this periodogram in order to avoid including spurious frequencies in their models due to the leakage of energy from one frequency to others. However, estimating the periodogram requires a long computation time, which slows the harmonic analysis of certain time series. Here we propose an algorithm that accelerates the extraction of the most remarkable frequencies from the periodogram, avoiding its complete re-estimation at each iteration of the harmonic process. This algorithm allows the user to perform a specific analysis of a given scalar time series. As a result, we obtain a functional model made of (1) a trend component, (2) a linear combination of Fourier terms, and (3) the so-called mixed secular terms, while reducing the computation time of the periodogram estimation.
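The basic loop the algorithm accelerates can be sketched as follows: locate the dominant frequency in the Lomb periodogram, fit and subtract its harmonic, and repeat. This plain version recomputes the full periodogram at every iteration, which is exactly the cost the proposed algorithm avoids.

```python
# Baseline iterative harmonic extraction driven by the Lomb periodogram.
# Unlike the proposed algorithm, this naive version re-estimates the
# whole periodogram at each iteration.
import numpy as np
from scipy.signal import lombscargle

def extract_frequencies(t, y, freqs, n_terms=5):
    """t: (possibly unevenly spaced) sample times; freqs: candidate
    ANGULAR frequencies to scan; returns the extracted frequencies."""
    residual = y - y.mean()            # lombscargle assumes zero mean
    found = []
    for _ in range(n_terms):
        power = lombscargle(t, residual, freqs)
        w = freqs[np.argmax(power)]    # dominant angular frequency
        # least-squares fit of the corresponding sine/cosine pair
        X = np.column_stack([np.cos(w * t), np.sin(w * t)])
        coef, *_ = np.linalg.lstsq(X, residual, rcond=None)
        residual = residual - X @ coef
        found.append(w)
    return found
```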

Relevance:

100.00%

Publisher:

Abstract:

Moderate-resolution remote sensing data, as provided by MODIS, can be used to detect and map active or past wildfires from daily records of suitable combinations of reflectance bands. The objective of the present work was to develop and test simple algorithms, and variations of them, for the automatic or semiautomatic detection of burnt areas from time series of MODIS biweekly vegetation indices for a Mediterranean region. MODIS-derived 250 m NDVI time series data for the Valencia region, eastern Spain, were subjected to a two-step process for the detection of candidate burnt areas, and the results were compared with available fire event records from the Valencia Regional Government. For each pixel and date in the data series, a model was fitted to both the previous and the posterior time series data. Combining drops between two consecutive points with 1-year average drops, we used discrepancies or jumps between the pre and post models to identify seed pixels, and then delineated the fire scar for each potential wildfire using an extension algorithm growing from the seed pixels. The resulting maps of detected burnt areas showed very good agreement with the perimeters registered in the database of fire records used as reference. Overall accuracies and indices of agreement were very high, and omission and commission errors were similar to or lower than in previous studies that used automatic or semiautomatic fire scar detection based on remote sensing. This supports the effectiveness of the method for detecting and mapping burnt areas in the Mediterranean region.
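A schematic version of the two-step detection, with illustrative thresholds: seed pixels are flagged where both the consecutive-composite drop and the drop below the 1-year average are large, and scars are then grown from the seeds over a weaker mask.

```python
# Schematic two-step burnt-area detection from an NDVI composite stack:
# flag seeds from combined NDVI drops, then grow scars from the seeds.
# Thresholds and the 23-composites-per-year assumption are illustrative.
import numpy as np
from scipy import ndimage

def detect_burnt(ndvi, step, t_seed=0.15, t_grow=0.08):
    """ndvi: array (time, rows, cols) of composites; `step` is the
    composite index being tested. Returns a boolean burnt-area mask."""
    drop = ndvi[step - 1] - ndvi[step]                 # consecutive drop
    year_mean = ndvi[max(0, step - 23):step].mean(axis=0)
    avg_drop = year_mean - ndvi[step]                  # 1-year average drop
    seeds = (drop > t_seed) & (avg_drop > t_seed)      # strong evidence
    grow = drop > t_grow                               # weaker mask
    labels, _n = ndimage.label(grow)                   # connected regions
    keep = np.unique(labels[seeds])                    # regions holding a seed
    return np.isin(labels, keep[keep > 0])
```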

Relevance:

100.00%

Publisher:

Abstract:

Frequently, the population ecology of marine organisms takes a descriptive approach in which sizes and densities are plotted over time. This approach has limited usefulness for designing management strategies or modelling different scenarios. Population projection matrix models are among the most widely used tools in ecology. Unfortunately, for the majority of pelagic marine organisms it is difficult to mark individuals and follow them over time to determine their vital rates and build a population projection matrix model. Nevertheless, it is possible to obtain time-series data on size structure and the density of each size class, from which the matrix parameters can be determined. This approach is known as the "demographic inverse problem" and is based on quadratic programming methods, but it has rarely been used on aquatic organisms. We used unpublished field data on a population of the cubomedusa Carybdea marsupialis to construct a population projection matrix model and to compare two management strategies for lowering the population to its values before 2008, when there was no significant interaction with bathers: direct removal of medusae, and reduction of prey. Our results showed that removal of jellyfish from all size classes was more effective than removing only juveniles or adults. When reducing prey, the highest efficiency in lowering the C. marsupialis population occurred when prey depletion affected the prey of all medusa sizes. Our model fit the field data well and may serve to design an efficient management strategy or to build hypothetical scenarios such as removal of individuals or reduction of prey. This method is applicable to other marine or terrestrial species for which density and population structure over time are available.
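A minimal sketch of the inverse approach: recover a non-negative projection matrix A from the size-structure time series by constrained least squares on n_{t+1} ≈ A·n_t. A full quadratic-programming treatment would add structural constraints on which entries of A may be non-zero; here non-negativity is the only constraint.

```python
# Sketch of the "demographic inverse problem": estimate a non-negative
# population projection matrix A from a time series of size-class
# densities, row by row, via bounded least squares. Structural zero
# constraints (a proper QP) would refine this illustration.
import numpy as np
from scipy.optimize import lsq_linear

def estimate_projection_matrix(counts: np.ndarray) -> np.ndarray:
    """counts: array (T, s) of densities per size class over time;
    solves n_{t+1} ~= A @ n_t for A >= 0."""
    X, Y = counts[:-1], counts[1:]
    s = counts.shape[1]
    A = np.zeros((s, s))
    for i in range(s):                    # one row of A at a time
        res = lsq_linear(X, Y[:, i], bounds=(0.0, np.inf))
        A[i] = res.x
    return A
```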

Relevance:

100.00%

Publisher:

Abstract:

The Tertiary detritic aquifer of Madrid (TDAM), with an average thickness of 1500 m and a heterogeneous, anisotropic structure, supplies water to Madrid, the most populated city of Spain (3.2 million inhabitants in the metropolitan area). Despite its complex structure, a previous study focused on the north-northwest of Madrid city showed that the aquifer behaves quasi-elastically through extraction/recovery cycles, and that ground uplift during recovery periods compensates most of the ground subsidence measured during the preceding extraction periods (Ezquerro et al., 2014). Therefore, the relationship between ground deformation and groundwater level through time can be simulated using simple elastic models. In this work, we model the temporal evolution of the piezometric level in 19 wells of the TDAM over the period 1997-2010. Using InSAR and piezometric time series spanning the study period, we first estimate the elastic storage coefficient (Ske) for every well. Both the Ske of each well and the average Ske of all wells are then used to predict hydraulic heads at the different well locations during the study period and are compared against the measured hydraulic heads, leading to very similar errors in the two cases: 14% and 16% on average, respectively. This result suggests that an average Ske can be used to estimate piezometric level variations at all points where ground deformation has been measured by InSAR, thus allowing the production of piezometric level maps for the different extraction/recovery cycles in the TDAM.
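The elastic model can be sketched in a few lines: deformation is proportional to head change, so Ske is the slope of the displacement-head regression, and heads can then be predicted from InSAR deformation alone. This is a schematic reading of the approach, with the inputs assumed to be co-located, aligned time series.

```python
# Sketch of the elastic relation d = Ske * (h - h0): estimate Ske at a
# well by regressing InSAR displacement on piezometric head, then invert
# the relation to predict heads from deformation alone.
import numpy as np

def fit_ske(displacement: np.ndarray, head: np.ndarray) -> float:
    """Elastic storage coefficient from co-located, time-aligned
    InSAR displacement and piezometric head series."""
    ske, _offset = np.polyfit(head, displacement, deg=1)
    return ske

def predict_head(displacement: np.ndarray, ske: float, h0: float) -> np.ndarray:
    """Map measured deformation back to piezometric level."""
    return h0 + displacement / ske
```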

Relevance:

100.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance:

100.00%

Publisher:

Abstract:

After ingestion of a standardized dose of ethanol, alcohol concentrations were assessed over 3.5 hours from blood (six readings) and breath (10 readings) in a sample of 412 MZ and DZ twins who took part in an Alcohol Challenge Twin Study (ACTS). Nearly all participants were subsequently genotyped at two polymorphic SNPs in the ADH1B and ADH1C loci known to affect in vitro ADH activity. In the DZ pairs, 14 microsatellite markers covering a 20.5 cM region on chromosome 4 that includes the ADH gene family were assessed. Variation in the timed series of autocorrelated blood and breath alcohol readings was studied using a bivariate simplex design. The contribution of a quantitative trait locus (QTL), or QTLs, linked to the ADH region was estimated via a mixture of likelihoods weighted by identity-by-descent probabilities. The effects of allelic substitution at the ADH1B and ADH1C loci were estimated in the means part of the model, simultaneously with the effects of sex and age. There was a major contribution to variance in alcohol metabolism from a QTL which accounted for about 64% of the additive genetic covariation common to both blood and breath alcohol readings at the first time point. No effects of the ADH1B*47His or ADH1C*349Ile alleles on in vivo metabolism were observed, although these have been shown to have major effects in vitro. This implies that there is a major determinant of variation in in vivo alcohol metabolism in the ADH region that is not accounted for by these polymorphisms. Earlier analyses of these data suggested that alcohol metabolism is related to drinking behavior, implying that this QTL may be protective against alcohol dependence.
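A minimal sketch of the linkage component at a single time point: the likelihood of a DZ pair's trait values is a mixture of bivariate normals over IBD states (0, 1, 2), weighted by marker-derived IBD probabilities, with the QTL contributing half its variance per shared allele to the pair covariance. The variance-component structure is deliberately simplified relative to the bivariate simplex model actually fitted.

```python
# Sketch of an IBD-weighted mixture likelihood for one DZ twin pair at
# one time point: variance decomposed into additive polygenic (var_a),
# QTL (var_q), and residual (var_e) components. A simplified stand-in
# for the study's bivariate simplex model.
import numpy as np
from scipy.stats import multivariate_normal

def pair_loglik(y, ibd_probs, var_a, var_q, var_e):
    """y: the pair's trait values (length 2); ibd_probs: P(IBD=0,1,2)
    at the ADH-region marker position."""
    total = var_a + var_q + var_e
    lik = 0.0
    for ibd, p in enumerate(ibd_probs):
        # DZ pairs share half the additive polygenic variance; the QTL
        # covariance scales with the proportion of alleles shared IBD.
        cov_off = 0.5 * var_a + (ibd / 2.0) * var_q
        cov = np.array([[total, cov_off], [cov_off, total]])
        lik += p * multivariate_normal.pdf(y, mean=[0.0, 0.0], cov=cov)
    return np.log(lik)
```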

Relevance:

100.00%

Publisher:

Abstract:

Vector error-correction models (VECMs) have become increasingly important in their application to financial markets. Standard full-order VECM models assume non-zero entries in all their coefficient matrices. However, applications of VECM models to financial market data have revealed that zero entries are often a necessary part of efficient modelling. In such cases, the use of full-order VECM models may lead to incorrect inferences. Specifically, if indirect causality or Granger non-causality exists among the variables, the use of over-parameterised full-order VECM models may weaken the power of statistical inference. In this paper, it is argued that the zero–non-zero (ZNZ) patterned VECM is a more straightforward and effective means of testing for both indirect causality and Granger non-causality. For a ZNZ patterned VECM framework for time series of integrated order two, we provide a new algorithm to select cointegrating and loading vectors that can contain zero entries. Two case studies are used to demonstrate the usefulness of the algorithm in tests of purchasing power parity and a three-variable system involving the stock market.
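A sketch of the ZNZ idea applied to a fitted VECM with statsmodels: estimate the model, then zero the loading-matrix entries with small t-ratios to obtain a zero-non-zero pattern. This illustrates pattern selection only; it is not the paper's algorithm for selecting I(2) cointegrating and loading vectors with zero entries.

```python
# Sketch of a ZNZ pattern on the loading matrix of a fitted VECM:
# entries whose |t| falls below a critical value are set to zero,
# flagging candidate Granger non-causality restrictions.
import numpy as np
from statsmodels.tsa.vector_ar.vecm import VECM

def znz_alpha(data, coint_rank=1, t_crit=1.96):
    """data: (T, k) array or DataFrame of the system's variables."""
    res = VECM(data, k_ar_diff=1, coint_rank=coint_rank).fit()
    tvals = res.alpha / res.stderr_alpha          # elementwise t-ratios
    alpha_znz = np.where(np.abs(tvals) > t_crit, res.alpha, 0.0)
    return alpha_znz, res.beta                    # patterned loadings, coint. vectors
```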

Relevance:

100.00%

Publisher:

Abstract:

In this paper we develop an evolutionary kernel-based time-update algorithm to recursively estimate subset discrete-lag models (including full-order models) with a forgetting factor and a constant term, using the exact-windowed case. The algorithm applies to causality detection when the true relationship occurs with a continuous or a random delay. We then demonstrate the use of the proposed evolutionary algorithm in a study of monthly mutual fund data from the 'CRSP Survivor-bias free US Mutual Fund Database'. The results show that the NAV is an influential player on the international stage of global bond and stock markets.
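The recursive core can be sketched as exponentially weighted (forgetting-factor) recursive least squares with a constant term; the evolutionary search over lag subsets and the kernel machinery described above are not reproduced here.

```python
# Sketch of the recursive core: forgetting-factor recursive least
# squares, updating coefficients (including a constant term) one
# observation at a time. The evolutionary subset-lag search is omitted.
import numpy as np

class ForgettingRLS:
    def __init__(self, n_params: int, lam: float = 0.99):
        self.lam = lam                        # forgetting factor
        self.theta = np.zeros(n_params)       # coefficients (incl. constant)
        self.P = np.eye(n_params) * 1e4       # inverse information matrix

    def update(self, x: np.ndarray, y: float) -> float:
        """x: regressors for one time step (first entry 1.0 for the
        constant term); returns the one-step prediction error."""
        err = y - x @ self.theta
        k = self.P @ x / (self.lam + x @ self.P @ x)   # gain vector
        self.theta = self.theta + k * err
        self.P = (self.P - np.outer(k, x @ self.P)) / self.lam
        return err
```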

Relevance:

100.00%

Publisher:

Abstract:

Most traditional methods for extracting the relationships between two time series are based on cross-correlation. In a non-linear non-stationary environment, these techniques are not sufficient. We show in this paper how to use hidden Markov models (HMMs) to identify the lag (or delay) between different variables for such data. We first present a method using maximum likelihood estimation and propose a simple algorithm which is capable of identifying associations between variables. We also adopt an information-theoretic approach and develop a novel procedure for training HMMs to maximise the mutual information between delayed time series. Both methods are successfully applied to real data. We model the oil drilling process with HMMs and estimate a crucial parameter, namely the lag for return.
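A simplified illustration of the information-theoretic idea: scan candidate delays and keep the one that maximizes the mutual information between x(t) and y(t + lag). A histogram estimate of mutual information stands in for the paper's HMM training, which this sketch does not reproduce.

```python
# Simplified lag identification by mutual information: scan delays and
# keep the one maximizing MI between x(t) and y(t + lag). A histogram
# MI estimate replaces the paper's HMM-based procedure.
import numpy as np

def mutual_information(a: np.ndarray, b: np.ndarray, bins: int = 16) -> float:
    pxy, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0                              # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))

def best_lag(x: np.ndarray, y: np.ndarray, max_lag: int = 50) -> int:
    scores = [mutual_information(x[:-lag], y[lag:])
              for lag in range(1, max_lag + 1)]
    return 1 + int(np.argmax(scores))
```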