942 results for Object-based time-series


Relevance:

100.00%

Publisher:

Abstract:

Time-series analysis and prediction play an important role in state-based systems that deal with situations in which the state of the world evolves over time. Generally speaking, the world under discussion persists in a given state until something occurs that transitions it into another state. This paper introduces a framework for prediction and analysis based on time-series of states. It takes as its temporal basis a time theory that treats both points and intervals as primitive time elements. A state of the world under consideration is defined as a set of time-varying propositions with Boolean truth-values that depend on time, including properties, facts, actions, events, processes, etc. A time-series of states is then formalized as a list of states that are temporally ordered one after another. The framework supports the explicit expression of both absolute and relative temporal knowledge. A formal schema is proposed that allows general time-series of states to be incomplete in various ways, while the concept of a complete time-series of states is also formally defined. As applications of the formalism to time-series analysis and prediction, we present two illustrative examples.
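
A minimal data-structure sketch of what this abstract describes, under a simple reading in which a state is a set of propositions holding over a primitive time element (a point when start equals end, an interval otherwise) and a time-series is a temporally ordered list. All names and the notion of "complete" used here are our own illustrative choices, not the paper's notation:

```python
from dataclasses import dataclass, field

@dataclass
class State:
    """A state: a set of propositions that hold over one time element.

    The time element may be a point (start == end) or an interval,
    mirroring the paper's use of both as temporal primitives.
    """
    start: float
    end: float
    propositions: frozenset = field(default_factory=frozenset)

    def holds(self, proposition: str) -> bool:
        return proposition in self.propositions

def is_complete(series: list[State]) -> bool:
    """Illustrative reading of a complete time-series of states:
    consecutive states meet with no temporal gaps between them."""
    return all(a.end == b.start for a, b in zip(series, series[1:]))

# A toy time-series: a valve is open, then closed.
series = [
    State(0.0, 5.0, frozenset({"valve_open"})),
    State(5.0, 9.0, frozenset({"valve_closed"})),
]
print(is_complete(series))            # True
print(series[0].holds("valve_open"))  # True
```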

Relevance:

100.00%

Publisher:

Abstract:

Traffic flow time series data are usually high-dimensional and very complex. They are also sometimes imprecise and distorted due to sensor malfunctions during data collection. Additionally, events such as congestion caused by traffic accidents add more uncertainty to real-time traffic conditions, making traffic flow forecasting a complicated task. This article presents a new data preprocessing method targeting multidimensional time series with a very high number of dimensions and shows its application to real traffic flow time series from the California Department of Transportation (PeMS web site). The proposed method consists of three main steps. First, based on mTESL, a language for defining events in multidimensional time series, we identify a number of event types that correspond to either incorrect data or data with interference. Second, each event type is restored using an original method that combines real observations, locally forecasted values and historical data. Third, an exponential smoothing procedure is applied globally to eliminate noise interference and other random errors, so as to provide good-quality source data for future work.
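
The mTESL event-detection and restoration steps are specific to the paper and not reproduced here, but the third step, global exponential smoothing, is standard enough to sketch. A minimal single-exponential smoother applied to each dimension of a multidimensional series might look as follows (the smoothing factor alpha is a hypothetical choice, not the paper's setting):

```python
import numpy as np

def exponential_smooth(x: np.ndarray, alpha: float = 0.3) -> np.ndarray:
    """Single exponential smoothing applied column-wise.

    x: array of shape (n_timesteps, n_dimensions); alpha in (0, 1]
    controls how strongly recent observations dominate the estimate.
    """
    out = np.empty_like(x, dtype=float)
    out[0] = x[0]
    for t in range(1, len(x)):
        out[t] = alpha * x[t] + (1.0 - alpha) * out[t - 1]
    return out

# Example: smooth a noisy two-dimensional traffic-flow series.
rng = np.random.default_rng(0)
flow = np.cumsum(rng.normal(size=(100, 2)), axis=0)
smoothed = exponential_smooth(flow, alpha=0.3)
```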

Relevance:

100.00%

Publisher:

Abstract:

Research has been undertaken to ascertain the predictability of non-stationary time series using wavelet- and Empirical Mode Decomposition (EMD)-based time series models. Methods have been developed in the past to decompose a time series into components; forecasting these components, combined with the random component, can yield predictions. Following this idea, wavelet and EMD analyses are incorporated separately, each decomposing a time series into independent orthogonal components with both time and frequency localization. The component series are fitted with specific auto-regressive models to obtain forecasts, which are later combined to obtain the actual predictions. Four non-stationary streamflow sites (USGS data resources) with monthly total volumes and two non-stationary gridded rainfall sites (IMD) with monthly total rainfall are considered for the study. Predictability is checked for six- and twelve-month-ahead forecasts across both methodologies. Based on performance measures, the wavelet-based method is observed to have better prediction capability than the EMD-based method, despite some of the limitations of time series methods and the manner in which decomposition takes place. Finally, the study concludes that the wavelet-based time series algorithm can be used to model events such as droughts with reasonable accuracy. Some modifications that could extend the scope of applicability to other areas of hydrology are also discussed. (C) 2013 Elsevier B.V. All rights reserved.
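
A minimal sketch of the wavelet branch of such a pipeline, using PyWavelets for the decomposition and a simple autoregression per component. The wavelet choice, decomposition level, and lag order below are illustrative assumptions, not the study's settings:

```python
import numpy as np
import pywt
from statsmodels.tsa.ar_model import AutoReg

def wavelet_components(series, wavelet="db4", level=3):
    """Split a series into additive components, one per wavelet level,
    by reconstructing with all other coefficient arrays zeroed out."""
    coeffs = pywt.wavedec(series, wavelet, level=level)
    components = []
    for i in range(len(coeffs)):
        kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        components.append(pywt.waverec(kept, wavelet)[: len(series)])
    return components  # components sum (approximately) back to the series

def forecast(series, horizon=6, lags=12):
    """Fit an AR model to each component and sum the component forecasts."""
    total = np.zeros(horizon)
    for comp in wavelet_components(series):
        fit = AutoReg(comp, lags=lags).fit()
        total += fit.predict(start=len(comp), end=len(comp) + horizon - 1)
    return total

# Example: twelve-month-ahead forecast of a synthetic monthly series.
rng = np.random.default_rng(1)
monthly = np.sin(np.linspace(0, 24 * np.pi, 240)) + rng.normal(0, 0.2, 240)
print(forecast(monthly, horizon=12))
```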

Relevance:

100.00%

Publisher:

Abstract:

Crisis-affected communities and international aid organizations are becoming increasingly digital as a consequence of the growing popularity of geotechnology. The humanitarian sector has changed in profound ways by adopting new technical approaches to obtain information from areas that are difficult to access geographically or politically. Since 2011, Turkey has been hosting a growing number of Syrian refugees along its southeastern region. The Turkish policy of hosting refugees in camps, and the obstacles local authorities place in the way of international aid groups' field expeditions, have led such organizations to investigate and adopt other approaches to obtain the information they need; in particular, they have intensified their use of remote sensing. However, the majority of studies have used very high-resolution satellite imagery (VHRSI). The study area is extensive and the temporal resolution of VHRSI is low, so relying on these sensors alone is infeasible for the whole area. This research investigates the potential of mid-resolution imagery (here, only Landsat) to obtain information from a region in crisis (here, southeastern Turkey) through a new web-based platform called Google Earth Engine (GEE). It is also intended to verify GEE's current reliability, since the Application Programming Interface (API) is still in beta. The findings show that the basic functions are trustworthy. Results indicate that Landsat can clearly recognize spectral change only for the first settlement; the ongoing modifications vary from case to case. Overall, Landsat showed considerable limitations, but merits further investigation and may be used, with restrictions, in support of VHRSI.
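
A sketch of how such a Landsat time series over a camp area could be assembled with today's GEE Python API. The bounding-box coordinates are placeholders, the collection shown is Landsat 8 surface reflectance rather than whichever collections the study used, and the API has evolved considerably since the beta version the study evaluated:

```python
import ee

ee.Initialize()

# Placeholder bounding box for a camp area in southeastern Turkey.
area = ee.Geometry.Rectangle([36.9, 36.5, 37.1, 36.7])

# Landsat 8 surface reflectance, one year of low-cloud observations.
collection = (
    ee.ImageCollection("LANDSAT/LC08/C02/T1_L2")
    .filterBounds(area)
    .filterDate("2013-01-01", "2014-01-01")
    .filter(ee.Filter.lt("CLOUD_COVER", 20))
)

def ndvi(image):
    # NDVI from the Landsat 8 NIR (SR_B5) and red (SR_B4) bands.
    return image.normalizedDifference(["SR_B5", "SR_B4"]).rename("NDVI")

# Mean NDVI per image over the area: a coarse change signal through time.
series = collection.map(ndvi).map(
    lambda img: img.set(
        "mean_ndvi",
        img.reduceRegion(ee.Reducer.mean(), area, 30).get("NDVI"),
    )
)
print(series.aggregate_array("mean_ndvi").getInfo())
```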

Relevance:

100.00%

Publisher:

Abstract:

Background: Palliative care should be provided according to the individual needs of the patient, caregiver and family, so that the type and level of care provided, as well as the setting in which it is delivered, depend on the complexity and severity of individual needs rather than on prognosis or diagnosis. This paper presents a study designed to assess the feasibility and efficacy of an intervention to assist in the allocation of palliative care resources according to need, within the context of a population of people with advanced cancer. ---------- Methods/design: People with advanced cancer and their caregivers completed bi-monthly telephone interviews over a period of up to 18 months to assess unmet needs, anxiety and depression, quality of life, satisfaction with care and service utilisation. The intervention, introduced after at least two baseline phone interviews, involved (a) training medical, nursing and allied health professionals at each recruitment site in the use of the Palliative Care Needs Assessment Guidelines and the Needs Assessment Tool: Progressive Disease - Cancer (NAT: PD-C); and (b) health professionals completing the NAT: PD-C with participating patients approximately monthly for the rest of the study period. Changes in outcomes will be compared pre- and post-intervention. ---------- Discussion: The study will determine whether the routine, systematic and regular use of the Guidelines and the NAT: PD-C in a range of clinical settings is a feasible and effective strategy for facilitating the timely provision of needs-based care.

Relevance:

100.00%

Publisher:

Abstract:

We conducted an in-situ X-ray micro-computed tomography heating experiment at the Advanced Photon Source (USA) to dehydrate an unconfined 2.3 mm diameter cylinder of Volterra Gypsum. We used a purpose-built X-ray transparent furnace to heat the sample to 388 K for a total of 310 min to acquire a three-dimensional time-series tomography dataset comprising nine time steps. The voxel size of 2.2 μm³ proved sufficient to pinpoint reaction initiation and the organization of drainage architecture in space and time. We observed that dehydration commences across a narrow front, which propagates from the margins to the centre of the sample in more than four hours. The advance of this front can be fitted with a square-root function, implying that the initiation of the reaction in the sample can be described as a diffusion process. Novel parallelized computer codes allow quantifying the geometry of the porosity and the drainage architecture from the very large tomographic datasets (2048³ voxels) in unprecedented detail. We determined position, volume, shape and orientation of each resolvable pore and tracked these properties over the duration of the experiment. We found that the pore-size distribution follows a power law. Pores tend to be anisotropic but rarely crack-shaped and have a preferred orientation, likely controlled by a pre-existing fabric in the sample. With on-going dehydration, pores coalesce into a single interconnected pore cluster that is connected to the surface of the sample cylinder and provides an effective drainage pathway. Our observations can be summarized in a model in which gypsum is stabilized by thermal expansion stresses and locally increased pore fluid pressures until the dehydration front approaches to within about 100 μm. Then, the internal stresses are released and dehydration happens efficiently, resulting in new pore space. Pressure release, the production of pores and the advance of the front are coupled in a feedback loop.
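
The square-root advance of the dehydration front is easy to reproduce in outline: given front positions measured at the nine time steps, one can fit d(t) = k·√t and check the diffusion-like behaviour. The numbers below are made up for illustration; only the functional form comes from the abstract:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical front positions (micrometres) at successive times (minutes).
t = np.array([10, 40, 80, 120, 170, 220, 260, 290, 310], dtype=float)
d = np.array([95, 190, 270, 330, 390, 445, 480, 510, 525], dtype=float)

def sqrt_front(t, k):
    # Diffusion-like front advance: distance grows with the square root of time.
    return k * np.sqrt(t)

(k,), _ = curve_fit(sqrt_front, t, d)
print(f"fitted rate constant k = {k:.1f} um/sqrt(min)")
residual = d - sqrt_front(t, k)
print("max deviation from sqrt law:", np.abs(residual).max())
```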

Relevance:

100.00%

Publisher:

Abstract:

This paper proposes a simulation-based density estimation technique for time series that exploits information found in covariate data. The method can be paired with a large range of parametric models used in time series estimation. We derive asymptotic properties of the estimator and illustrate attractive finite sample properties for a range of well-known econometric and financial applications.
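
The paper's estimator is not spelled out in the abstract, but the generic shape of a simulation-based density estimate is easy to sketch: simulate many paths from the fitted parametric model given the covariates, then smooth the simulated outcomes. Everything below (the AR-with-covariate model, the kernel smoother, the sample sizes) is an illustrative assumption, not the paper's method:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(42)

# Assumed parametric model: y_t = 0.6 * y_{t-1} + 0.8 * x_t + eps_t.
def simulate_next(y_prev: float, x_next: float, n_sims: int = 10_000) -> np.ndarray:
    eps = rng.normal(0.0, 1.0, n_sims)
    return 0.6 * y_prev + 0.8 * x_next + eps

# Density of y_{t+1} given the last observation and the next covariate value.
draws = simulate_next(y_prev=1.2, x_next=0.5)
density = gaussian_kde(draws)          # kernel-smoothed simulated density
grid = np.linspace(draws.min(), draws.max(), 200)
print(grid[np.argmax(density(grid))])  # mode of the estimated density
```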

Relevance:

100.00%

Publisher:

Abstract:

Background: This study attempted to develop health risk-based metrics for defining a heatwave in Brisbane, Australia. Methods: A Poisson generalised additive model was used to assess the impact of heatwaves on mortality and emergency hospital admissions (EHAs) in Brisbane. Results: In general, the higher the intensity and the longer the duration of a heatwave, the greater the health impacts. There was no apparent difference in EHA risk during different periods of a warm season. However, the risk of mortality was greater in the second half of a warm season than in the first half. While the elderly (>75 years) were particularly vulnerable to both the EHA and mortality effects of a heatwave, the risk of EHAs also significantly increased for two other age groups (0-64 years and 65-74 years) during severe heatwaves. Different patterns between cardiorespiratory mortality and EHAs were observed. Based on these findings, we propose the use of a tiered heat warning system based on the health risks of heatwaves. Conclusions: Health risk-based metrics are a useful tool for the development of local heatwave definitions. This tool may have significant implications for the assessment of heatwave-related health consequences and for the development of heatwave response plans and implementation strategies.
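
The study's Poisson generalised additive model can be approximated in outline by a Poisson regression of daily counts on temperature and heatwave terms. The sketch below uses a plain GLM with simulated, illustrative variables rather than the study's full GAM specification and data:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)

# Hypothetical daily data for one warm season.
n = 150
df = pd.DataFrame({
    "tmax": rng.normal(30, 4, n),       # daily maximum temperature (deg C)
    "heatwave": rng.integers(0, 2, n),  # 1 on heatwave days, 0 otherwise
})
rate = np.exp(2.0 + 0.03 * (df["tmax"] - 30) + 0.15 * df["heatwave"])
df["deaths"] = rng.poisson(rate)

X = sm.add_constant(df[["tmax", "heatwave"]])
model = sm.GLM(df["deaths"], X, family=sm.families.Poisson()).fit()
# The exponentiated heatwave coefficient approximates the relative risk.
print(np.exp(model.params["heatwave"]))
```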

Relevance:

100.00%

Publisher:

Abstract:

This thesis studies quantile residuals and uses different methodologies to develop test statistics that are applicable in evaluating linear and nonlinear time series models based on continuous distributions. Models based on mixtures of distributions are of special interest because it turns out that for those models traditional residuals, often referred to as Pearson's residuals, are not appropriate. As such models have become more and more popular in practice, especially with financial time series data, there is a need for reliable diagnostic tools that can be used to evaluate them. The aim of the thesis is to show how such diagnostic tools can be obtained and used in model evaluation. The quantile residuals considered here are defined in such a way that, when the model is correctly specified and its parameters are consistently estimated, they are approximately independent with a standard normal distribution. All the tests derived in the thesis are pure significance type tests and are theoretically sound in that they properly take the uncertainty caused by parameter estimation into account.

In Chapter 2, a general framework based on the likelihood function and smooth functions of univariate quantile residuals is derived that can be used to obtain misspecification tests for various purposes. Three easy-to-use tests aimed at detecting non-normality, autocorrelation, and conditional heteroscedasticity in quantile residuals are formulated. It also turns out that these tests can be interpreted as Lagrange Multiplier or score tests, so that they are asymptotically optimal against local alternatives.

Chapter 3 extends the concept of quantile residuals to multivariate models. The framework of Chapter 2 is generalized, and tests aimed at detecting non-normality, serial correlation, and conditional heteroscedasticity in multivariate quantile residuals are derived based on it. Score test interpretations are obtained for the serial correlation and conditional heteroscedasticity tests and, in a rather restricted special case, for the normality test.

In Chapter 4, the tests are constructed using the empirical distribution function of quantile residuals. The so-called Khmaladze martingale transformation is applied in order to eliminate the uncertainty caused by parameter estimation. Various test statistics are considered, so that critical bounds for histogram type plots as well as Quantile-Quantile and Probability-Probability type plots of quantile residuals are obtained.

Chapters 2, 3, and 4 contain simulations and empirical examples which illustrate the finite sample size and power properties of the derived tests and also how the tests and related graphical tools based on residuals are applied in practice.
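
The defining construction is compact enough to show: a quantile residual is the standard normal quantile of the model's conditional distribution function evaluated at the observation, so a correctly specified model yields approximately i.i.d. N(0, 1) residuals. The Gaussian AR(1) below is an illustrative stand-in for the mixture models the thesis targets:

```python
import numpy as np
from scipy.stats import norm

def quantile_residuals(y: np.ndarray, phi: float, sigma: float) -> np.ndarray:
    """Quantile residuals for a Gaussian AR(1): r_t = Phi^{-1}(F_t(y_t)).

    F_t is the model's conditional CDF of y_t given y_{t-1}; under a
    correct model the residuals are approximately iid standard normal.
    """
    cond_mean = phi * y[:-1]
    u = norm.cdf(y[1:], loc=cond_mean, scale=sigma)  # PIT values F_t(y_t)
    return norm.ppf(u)                               # map to the N(0, 1) scale

# Simulate an AR(1) series and check the residuals look standard normal.
rng = np.random.default_rng(3)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.7 * y[t - 1] + rng.normal(0, 1)

r = quantile_residuals(y, phi=0.7, sigma=1.0)
print(r.mean(), r.std())  # should be near 0 and 1
```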

Relevance:

100.00%

Publisher:

Abstract:

In this paper, we consider the problem of time series classification. Using piecewise linear interpolation, various novel kernels are obtained which can be used with support vector machines to design classifiers capable of deciding the class of a given time series. The approach is general and is applicable in many scenarios. We apply the method to the task of online Tamil handwritten character recognition with promising results.
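
One plausible reading of the construction, sketched below: piecewise linear interpolation resamples every series onto a common grid, after which any standard kernel applies. The RBF kernel and scikit-learn names are our assumptions for concreteness, not necessarily the paper's kernels:

```python
import numpy as np
from sklearn.svm import SVC

def resample(series: np.ndarray, n_points: int = 64) -> np.ndarray:
    """Piecewise linear interpolation onto a fixed-length grid, so that
    variable-length time series become comparable vectors."""
    old = np.linspace(0.0, 1.0, len(series))
    new = np.linspace(0.0, 1.0, n_points)
    return np.interp(new, old, series)

# Toy two-class problem: noisy sine vs. cosine fragments of varying length.
rng = np.random.default_rng(5)
X, y = [], []
for label, fn in enumerate((np.sin, np.cos)):
    for _ in range(50):
        n = rng.integers(30, 120)
        t = np.linspace(0, 2 * np.pi, n)
        X.append(resample(fn(t) + rng.normal(0, 0.2, n)))
        y.append(label)

clf = SVC(kernel="rbf").fit(np.array(X), y)
print(clf.score(np.array(X), y))
```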

Relevance:

100.00%

Publisher:

Abstract:

A new approach is proposed for clustering time-series data. The approach can be used to discover groupings of similar object motions that were observed in a video collection. A finite mixture of hidden Markov models (HMMs) is fitted to the motion data using the expectation-maximization (EM) framework. Previous approaches for HMM-based clustering employ a k-means formulation, where each sequence is assigned to only a single HMM. In contrast, the formulation presented in this paper allows each sequence to belong to more than a single HMM with some probability, and the hard decision about the sequence class membership can be deferred until a later time when such a decision is required. Experiments with simulated data demonstrate the benefit of using this EM-based approach when there is more "overlap" in the processes generating the data. Experiments with real data show the promising potential of HMM-based motion clustering in a number of applications.
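
The soft-assignment step that distinguishes this formulation from k-means-style HMM clustering reduces to computing responsibilities from per-HMM log-likelihoods. A sketch using hmmlearn, with illustrative model sizes, training data, and mixture weights:

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

def responsibilities(sequence: np.ndarray, hmms, weights) -> np.ndarray:
    """E-step soft assignment: P(cluster k | sequence) for a mixture of HMMs.

    Each sequence belongs to every HMM with some probability, rather than
    being hard-assigned to a single one as in k-means-style formulations.
    """
    logliks = np.array([h.score(sequence) for h in hmms])
    log_post = np.log(weights) + logliks
    log_post -= log_post.max()             # stabilize before exponentiating
    post = np.exp(log_post)
    return post / post.sum()

rng = np.random.default_rng(11)
seq = rng.normal(0, 1, (40, 2))            # a toy 2-D motion trajectory
hmms = [GaussianHMM(n_components=3).fit(rng.normal(m, 1, (200, 2)))
        for m in (0.0, 2.0)]               # a two-cluster mixture
print(responsibilities(seq, hmms, weights=np.array([0.5, 0.5])))
```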

Relevance:

100.00%

Publisher:

Abstract:

In this paper we investigate the influence of a power-law noise model, also called 1/f^α noise, on the performance of a feed-forward neural network used to predict time series. We introduce an optimization procedure that optimizes the parameters of the neural network by maximizing a likelihood function based on the power-law noise model. We show that our optimization procedure minimizes the mean squared error, leading to an optimal prediction. Further, we present numerical results applying the method to time series from the logistic map and to the annual number of sunspots, and demonstrate that a power-law noise model gives better results than a Gaussian noise model.
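
In outline, maximizing a likelihood under power-law noise amounts to weighting the squared prediction errors in the frequency domain by the inverse of an assumed power-law spectrum. The sketch below shows such a (Whittle-style) loss for a given spectral exponent; the exponent and the use of a plain FFT are our illustrative assumptions, not necessarily the paper's procedure:

```python
import numpy as np

def power_law_nll(residuals: np.ndarray, alpha: float) -> float:
    """Negative log-likelihood of residuals under power-law (1/f^alpha) noise.

    Whitens the residual spectrum by the assumed power-law spectral
    density, so errors at frequencies where noise is expected to be
    large are penalized less than under a Gaussian white-noise model.
    """
    spec = np.abs(np.fft.rfft(residuals)) ** 2
    freqs = np.fft.rfftfreq(len(residuals))[1:]   # drop the zero frequency
    density = freqs ** (-alpha)                    # assumed noise spectrum
    return float(np.sum(spec[1:] / density + np.log(density)))

# Compare residuals of a network prediction under two noise assumptions.
rng = np.random.default_rng(2)
residuals = rng.normal(0, 1, 256)
print(power_law_nll(residuals, alpha=1.0))  # 1/f noise
print(power_law_nll(residuals, alpha=0.0))  # reduces to white noise
```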