126 results for Autoregressive moving average (ARMA)


Relevance: 100.00%

Abstract:

In this paper, the stability and convergence properties of the class of transform-domain least mean square (LMS) adaptive filters with second-order autoregressive (AR) input processes are investigated. It is well known that this class of adaptive filters improves the convergence of the standard LMS adaptive filter by applying fixed, data-independent orthogonal transforms and power normalization. However, the convergence performance of this class of adaptive filters can differ considerably across input processes, and it has not been fully explored. In this paper, we first discuss the mean-square stability and steady-state performance of this class of adaptive filters. We then analyze the effects of the transforms and power normalization performed by the various adaptive filters for both first-order and second-order AR processes. We derive the asymptotic eigenvalue distributions of the inputs and compare their convergence performance. Finally, computer simulations on AR, moving-average (MA) and autoregressive-moving-average (ARMA) processes are presented in support of the analytical results.
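As an illustration of the algorithm class analysed here, the following is a minimal sketch of a DCT-domain LMS filter with power normalization identifying an unknown system driven by an AR(2) input. The AR coefficients, filter length and step size are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.fft import dct

rng = np.random.default_rng(0)

# Illustrative AR(2) input: x[n] = a1*x[n-1] + a2*x[n-2] + w[n]
a1, a2 = 1.2, -0.4            # assumed (stable) AR(2) coefficients
N, M = 5000, 8                # samples, filter length
w = rng.standard_normal(N)
x = np.zeros(N)
for n in range(2, N):
    x[n] = a1 * x[n - 1] + a2 * x[n - 2] + w[n]

h_true = rng.standard_normal(M)                       # unknown system to identify
d = np.convolve(x, h_true)[:N] + 0.01 * rng.standard_normal(N)

# Transform-domain LMS: fixed orthogonal transform (DCT) plus power normalization
mu, eps, beta = 0.05, 1e-6, 0.99
W = np.zeros(M)                                       # weights in the transform domain
p = np.ones(M)                                        # running power estimate per bin
for n in range(M, N):
    u = x[n - M + 1:n + 1][::-1]                      # most recent M input samples
    U = dct(u, norm="ortho")                          # data-independent orthogonal transform
    p = beta * p + (1 - beta) * U**2                  # per-bin power estimate
    e = d[n] - W @ U
    W += mu * e * U / (p + eps)                       # power-normalized LMS update
```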

Relevance: 100.00%

Abstract:

Purpose – The purpose of this paper is to forecast Fiji's exports and imports for the period 2003-2020.

Design/methodology/approach – To achieve this goal, the autoregressive moving average with explanatory variables (ARMAX) model was applied. To this end, the paper drew on the published export demand and import demand models of Narayan and Narayan for Fiji.

Findings – The paper's main findings are that Fiji's imports will outperform exports over the 2003-2020 period, and that current account deficits will escalate, averaging around F$934.4 million over the same period.

Originality/value – Exports and imports are crucial for macroeconomic policymaking: together they measure a country's degree of openness and signal the trade and current account balances, with implications for inflation and the exchange rate. By forecasting Fiji's exports and imports, the paper provides policy makers with a set of information that will be useful for devising macroeconomic policies.
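To make the modelling approach concrete, here is a minimal sketch of fitting an ARMAX specification with statsmodels (ARMA errors around a regression on explanatory variables) and forecasting over 2003-2020 with assumed future paths of the regressors. The series, variable names and sample period are synthetic placeholders, not the Narayan and Narayan data.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(1)

# Placeholder annual data standing in for an export series and its
# explanatory variables (e.g. relative prices, world income).
years = pd.period_range("1970", "2002", freq="Y")
exog = pd.DataFrame({"relative_price": rng.normal(size=len(years)),
                     "world_income": np.linspace(1.0, 2.0, len(years))},
                    index=years)
exports = pd.Series(100 + 5 * exog["world_income"]
                    + rng.normal(scale=2, size=len(years)), index=years)

# ARMAX(1,1): regression on the explanatory variables with ARMA(1,1) errors.
fit = SARIMAX(exports, exog=exog, order=(1, 0, 1), trend="c").fit(disp=False)

# Out-of-sample forecasts require assumed future paths of the regressors.
future_exog = pd.DataFrame({"relative_price": np.zeros(18),
                            "world_income": np.linspace(2.0, 2.5, 18)},
                           index=pd.period_range("2003", "2020", freq="Y"))
print(fit.forecast(steps=18, exog=future_exog))
```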

Relevance: 100.00%

Abstract:

This article studies a simple, coherent approach for identifying and estimating error-correcting vector autoregressive moving average (EC-VARMA) models. Canonical correlation analysis is implemented for both determining the cointegrating rank, using a strongly consistent method, and identifying the short-run VARMA dynamics, using the scalar component methodology. Finite-sample performance is evaluated via Monte Carlo simulations and the approach is applied to modelling and forecasting US interest rates. The results reveal that EC-VARMA models generate significantly more accurate out-of-sample forecasts than vector error correction models (VECMs), especially for short horizons.
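For readers wanting a starting point, the sketch below selects a cointegrating rank with the Johansen trace test and fits the VECM benchmark in statsmodels. The EC-VARMA identification itself (canonical correlations and scalar components) is not available in standard libraries, and the interest-rate data here are simulated stand-ins.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM, select_coint_rank

rng = np.random.default_rng(2)

# Placeholder panel standing in for US interest rates at several maturities:
# cointegrated random walks sharing one stochastic trend.
n = 400
common = np.cumsum(rng.normal(size=n))
rates = pd.DataFrame({
    "r3m": common + rng.normal(scale=0.3, size=n),
    "r1y": 0.9 * common + rng.normal(scale=0.3, size=n),
    "r5y": 0.8 * common + rng.normal(scale=0.3, size=n),
})

# Cointegrating rank via the Johansen trace test, then the VECM benchmark.
rank = select_coint_rank(rates, det_order=0, k_ar_diff=2, method="trace").rank
vecm = VECM(rates, k_ar_diff=2, coint_rank=rank, deterministic="co").fit()
print(vecm.predict(steps=6))      # short-horizon out-of-sample forecasts
```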

Relevance: 100.00%

Abstract:

OBJECTIVE: Our study investigates different models to forecast the total number of next-day discharges from an open ward having no real-time clinical data.

METHODS: We compared 5 popular regression algorithms for modeling total next-day discharges: (1) autoregressive integrated moving average (ARIMA), (2) autoregressive moving average with exogenous variables (ARMAX), (3) k-nearest neighbor regression, (4) random forest regression, and (5) support vector regression. Whereas the ARIMA model relied on the past 3 months of discharges, nearest neighbor forecasting used the median of similar past discharges to estimate next-day discharges. In addition, the ARMAX model used the day of the week and the number of patients currently in the ward as exogenous variables. For the random forest and support vector regression models, we designed a predictor set of 20 patient-level features and 88 ward-level features.

RESULTS: Our data consisted of 12,141 patient visits over 1826 days. Forecasting quality was measured using mean forecast error, mean absolute error, symmetric mean absolute percentage error, and root mean square error. When compared with a moving average prediction model, all 5 models demonstrated superior performance, with the random forest achieving a 22.7% improvement in mean absolute error across all days in 2014.

CONCLUSIONS: In the absence of clinical information, our study recommends using patient-level and ward-level data in predicting next-day discharges. Random forest and support vector regression models are able to use all available features from such data, resulting in superior performance over traditional autoregressive methods. An intelligent estimate of available beds in wards plays a crucial role in relieving access block in emergency departments.
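A minimal sketch of the random forest variant on synthetic ward data is shown below. The lagged-discharge, occupancy and day-of-week features are simple stand-ins for the 20 patient-level and 88 ward-level predictors used in the study.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(3)

# Placeholder daily ward data over the study horizon length.
days = 1826
df = pd.DataFrame({
    "discharges": rng.poisson(6, size=days).astype(float),
    "occupancy": rng.integers(20, 40, size=days),
    "day_of_week": np.arange(days) % 7,
})
# Lagged discharges as stand-ins for the richer patient/ward feature set.
for lag in (1, 2, 7):
    df[f"lag_{lag}"] = df["discharges"].shift(lag)
df["target"] = df["discharges"].shift(-1)        # next-day discharges
df = df.dropna()

train, test = df.iloc[:-365], df.iloc[-365:]     # hold out the final year
features = [c for c in df.columns if c != "target"]

rf = RandomForestRegressor(n_estimators=500, random_state=0)
rf.fit(train[features], train["target"])
print("MAE:", mean_absolute_error(test["target"], rf.predict(test[features])))
```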

Relevance: 100.00%

Abstract:

Objective: To examine the impact of smoke-free policies in Victorian gambling venues on electronic gaming machine (EGM) expenditure.

Method: Monthly EGM expenditure from July 1998 to December 2005, provided by the Victorian Commission for Gambling Regulation and the Office of the Liquor and Gambling Commissioner in South Australia, was analysed. The outcome measure was the ratio of monthly expenditure for Victoria to monthly expenditure in South Australia. Intervention analysis and autoregressive integrated moving average modelling were used to assess the impact of the smoke-free policy on expenditure.

Results: The smoke-free policy resulted in an abrupt, long-term decrease in the level of EGM expenditure. The mean level of monthly expenditure decreased by approximately 14%.

Conclusion: The smoke-free policy not only protects hospitality workers and patrons from exposure to secondhand smoke but has also slowed gambling losses.
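The following sketch illustrates this style of intervention analysis: a step regressor entering a seasonal ARIMA model for the expenditure ratio. The series, the model order and the policy start month are assumptions for illustration only.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(4)

# Placeholder monthly series standing in for the Victoria / South Australia
# EGM expenditure ratio, July 1998 to December 2005.
idx = pd.period_range("1998-07", "2005-12", freq="M")
ratio = pd.Series(1.0 + rng.normal(scale=0.03, size=len(idx)), index=idx)

# Abrupt, permanent (step) intervention; September 2002 is assumed here as
# the month the smoke-free policy took effect.
policy = pd.DataFrame(
    {"policy": (idx >= pd.Period("2002-09", freq="M")).astype(float)}, index=idx)
ratio[policy["policy"] == 1] *= 0.86     # build a ~14% level drop into the toy data

model = SARIMAX(ratio, exog=policy, order=(1, 0, 0),
                seasonal_order=(1, 0, 0, 12), trend="c")
fit = model.fit(disp=False)
print(fit.params["policy"])              # estimated level shift after the policy
```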

Relevance: 100.00%

Abstract:

BACKGROUND: Predicting future demand for intensive care is vital to planning the allocation of resources.

METHOD: Mathematical modelling using autoregressive integrated moving average (ARIMA) models was applied to intensive care data from the Australian and New Zealand Intensive Care Society (ANZICS) Core Database and to population projections from the Australian Bureau of Statistics to forecast future demand for intensive care in Australia.

RESULTS: The model forecasts an increase in ICU demand of over 50% by 2020, with total ICU bed-days predicted to increase from 471,358 in 2007 to 643,160 by 2020. An increased rate of ICU use by patients older than 80 years was also noted, with the average bed-days per 10,000 population for this group increasing from 396 in 2006 to 741 in 2007.

CONCLUSION: An increase in demand of the forecast magnitude could not be accommodated within current ICU capacity. Significant action will be required.
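A minimal sketch of this kind of ARIMA forecast is given below. The historical series is a placeholder (only the 2007 figure comes from the abstract), and the ARIMA(1,1,0)-with-drift specification is an assumption; the paper's model also incorporated ABS population projections.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Placeholder annual ICU bed-day totals standing in for the ANZICS series.
years = pd.period_range("1998", "2007", freq="Y")
bed_days = pd.Series(np.linspace(350_000, 471_358, len(years)), index=years)

# ARIMA(1,1,0) with a linear trend (drift after differencing).
fit = ARIMA(bed_days, order=(1, 1, 0), trend="t").fit()
print(fit.forecast(steps=13))        # annual forecasts out to 2020
```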

Relevance: 100.00%

Abstract:

Introduction and Aims
Regulatory and collaborative intervention strategies have been developed to reduce the harms associated with alcohol consumption in licensed venues around the world, but there remains little research evidence regarding their comparative effectiveness. This paper describes concurrent changes in the number of night-time injury-related hospital emergency department presentations in two cities that implemented either a collaborative, voluntary approach to reducing harms associated with licensed premises (Geelong) or a regulatory approach (Newcastle).

Design and Methods

This paper reports findings from the Dealing with Alcohol-Related Problems in the Night-Time Economy project. Data were drawn from International Classification of Diseases, 10th Revision injury codes (S and T codes) for presentations during high-alcohol-risk times (midnight to 5.59 am, Saturday and Sunday mornings) at the emergency departments of Geelong Hospital and, in Newcastle, the John Hunter Hospital and the Calvary Mater Hospital, before and after the introduction of licensing conditions, between 2005 and 2011. Time-series seasonal autoregressive integrated moving average analyses were conducted on the data obtained from patients' medical records.

Results

Significant reductions in injury-related presentations during high-alcohol-risk times were found in Newcastle following the imposition of regulatory licensing conditions (a reduction of 344 attendances per year, P < 0.001). None of the interventions deployed in Geelong (e.g. identification scanners, police operations, radio networks or closed-circuit television) were associated with reductions in emergency department presentations.

Discussion and Conclusions

The data suggest that mandatory interventions based on trading-hours restrictions were associated with greater reductions in emergency department injury presentations during high-alcohol hours than voluntary interventions. [Miller P, Curtis A, Palmer D, Busija L, Tindall J, Droste N, Gillham K, Coomber K, Wiggers J. Changes in injury-related hospital emergency department presentations associated with the imposition of regulatory versus voluntary licensing conditions on licensed venues in two cities. Drug Alcohol Rev 2014]
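As an illustration of the analysis style, the sketch below aggregates record-level presentations into a monthly count series and fits a seasonal ARIMA model with a step regressor for the licensing conditions. The data, intervention month and model orders are assumptions, not the study's.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(5)

# Placeholder record-level arrivals standing in for ED presentations with
# injury (ICD-10 S/T) codes during high-alcohol-risk weekend hours.
records = pd.DataFrame({
    "arrival": pd.to_datetime(
        rng.integers(pd.Timestamp("2005-01-01").value,
                     pd.Timestamp("2011-12-31").value, size=20_000)),
})
monthly = records.set_index("arrival").resample("MS").size().astype(float)

# Step regressor from the (assumed) month the licensing conditions began.
step = (monthly.index >= "2008-03-01").astype(float)

fit = SARIMAX(monthly, exog=step, order=(1, 0, 1),
              seasonal_order=(1, 1, 1, 12)).fit(disp=False)
print(fit.params)
```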

Relevance: 100.00%

Abstract:

Objective: To determine the impact of tobacco control policies and mass media campaigns on smoking prevalence in Australian adults.
Methods: Data for calculating the average monthly prevalence of smoking between January 2001 and June 2011 were obtained via structured interviews of randomly sampled adults aged 18 years or older from Australia's five largest capital cities (monthly mean number of adults interviewed: 2375). The influence on smoking prevalence was estimated for increased tobacco taxes; strengthened smoke-free laws; increased monthly population exposure to televised tobacco control mass media campaigns and to pharmaceutical company advertising for nicotine replacement therapy (NRT), measured in gross rating points; monthly sales of NRT, bupropion and varenicline; and the introduction of graphic health warnings on cigarette packs. Autoregressive integrated moving average (ARIMA) models were used to examine the influence of these interventions on smoking prevalence.
Findings: The mean smoking prevalence for the study period was 19.9% (standard deviation: 2.0%), with a drop from 23.6% (in January 2001) to 17.3% (in June 2011). The best-fitting model showed that stronger smoke-free laws, tobacco price increases and greater exposure to mass media campaigns independently explained 76% of the decrease in smoking prevalence from February 2002 to June 2011.
Conclusion: Increased tobacco taxation, more comprehensive smoke-free laws and increased investment in mass media campaigns played a substantial role in reducing smoking prevalence among Australian adults between 2001 and 2011.
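The sketch below shows the general form of such a model: a regression of monthly prevalence on policy and media-exposure covariates with ARMA errors, fitted via statsmodels. All series, intervention dates and the model order are illustrative assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(6)

# Placeholder monthly series, January 2001 to June 2011, standing in for the
# survey-based smoking prevalence and the policy/exposure covariates.
idx = pd.period_range("2001-01", "2011-06", freq="M")
covariates = pd.DataFrame({
    "tax_increase": (idx >= pd.Period("2010-05", freq="M")).astype(float),
    "campaign_grps": rng.uniform(0, 800, len(idx)),    # media exposure in GRPs
    "smokefree_laws": (idx >= pd.Period("2007-07", freq="M")).astype(float),
}, index=idx)
prevalence = pd.Series(np.linspace(23.6, 17.3, len(idx))
                       + rng.normal(scale=0.5, size=len(idx)), index=idx)

# Regression with ARMA(1,1) errors, in the spirit of the models above.
fit = SARIMAX(prevalence, exog=covariates, order=(1, 0, 1), trend="c").fit(disp=False)
print(fit.params[covariates.columns])
```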

Relevance: 100.00%

Abstract:

The aim of this study was to determine the validity of an accelerometer for measuring average acceleration values during high-speed running. Thirteen subjects performed three sprint efforts over a 40 m distance (n = 39). Acceleration was measured using a 100 Hz tri-axial accelerometer integrated within a wearable tracking device (SPI-HPU, GPSports, Canberra). To provide a concurrent measure of acceleration, timing gates were positioned at 10 m intervals (0 m - 40 m). Accelerometer data collected during 0 m - 10 m and 10 m - 20 m provided a measure of average acceleration values. Accelerometer data were recorded as the raw output and filtered by applying a 3-point moving average and a 10-point moving average. The accelerometer could not measure average acceleration values during high-speed running: it significantly overestimated average acceleration values during both 0 m - 10 m and 10 m - 20 m, regardless of the data filtering technique (p < 0.001). Body mass significantly affected all accelerometer variables (p < 0.10, partial η² = 0.091-0.219). Body mass and the absence of a gravity compensation formula affect the accuracy and practicality of accelerometers. Until GPSports integrated accelerometers incorporate a gravity compensation formula, the usefulness of any accelerometer-derived algorithms is questionable.
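For reference, the filters mentioned above amount to simple running means; a minimal sketch on a synthetic 100 Hz trace follows (the sprint data and noise level are assumptions).

```python
import numpy as np

rng = np.random.default_rng(7)

# Placeholder 100 Hz forward-acceleration trace for a single sprint.
fs = 100                                             # sampling rate (Hz)
raw = 2.0 + rng.normal(scale=0.8, size=6 * fs)       # 6 s of noisy acceleration

def moving_average(x, window):
    """Running mean of the given window length, same length as the input."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")

ma3 = moving_average(raw, 3)        # 3-point moving average
ma10 = moving_average(raw, 10)      # 10-point moving average

# In the study design, average acceleration over 0-10 m and 10-20 m from
# these signals would be compared against timing-gate estimates.
print(raw.mean(), ma3.mean(), ma10.mean())
```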

Relevance: 100.00%

Abstract:

In big-data-driven traffic flow prediction systems, the robustness of prediction performance depends on accuracy and timeliness. This paper presents a new MapReduce-based nearest neighbor (NN) approach for traffic flow prediction using correlation analysis (TFPC) on a Hadoop platform. In particular, we develop a real-time prediction system including two key modules, i.e., offline distributed training (ODT) and online parallel prediction (OPP). Moreover, we build a parallel k-nearest neighbor optimization classifier, which incorporates correlation information among traffic flows into the classification process. Finally, we propose a novel prediction calculation method, combining the current data observed in OPP and the classification results obtained from large-scale historical data in ODT, to generate traffic flow predictions in real time. An empirical study on real-world traffic flow big data, using the leave-one-out cross-validation method, shows that TFPC significantly outperforms four state-of-the-art prediction approaches, i.e., autoregressive integrated moving average, Naïve Bayes, multilayer perceptron neural networks, and NN regression, in terms of accuracy, which improves by up to 90.07% in the best case, with an average mean absolute percentage error of 5.53%. In addition, it displays excellent speedup, scaleup, and sizeup.
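A single-machine sketch of the core idea, nearest-neighbour prediction on lag vectors weighted by cross-correlation with neighbouring flows, is given below; the flow series, lag length and weighting scheme are simplifications of the paper's Hadoop-based TFPC pipeline.

```python
import numpy as np

rng = np.random.default_rng(8)

# Placeholder 5-minute flow series for a target road and two neighbours.
n = 2000
target = 100 + 20 * np.sin(np.arange(n) / 50) + rng.normal(scale=5, size=n)
neigh = np.stack([np.roll(target, 3), np.roll(target, 6)]) \
        + rng.normal(scale=5, size=(2, n))

# Correlation of each neighbour with the target, used to weight its lags.
corr = np.array([np.corrcoef(target, s)[0, 1] for s in neigh])

lags = 6
X, y = [], []
for t in range(lags, n - 1):
    own = target[t - lags:t]
    weighted = np.concatenate([c * s[t - lags:t] for c, s in zip(corr, neigh)])
    X.append(np.concatenate([own, weighted]))
    y.append(target[t + 1])
X, y = np.array(X), np.array(y)

def knn_predict(query, X_train, y_train, k=10):
    """Plain k-nearest-neighbour regression: mean target of the k closest points."""
    d = np.linalg.norm(X_train - query, axis=1)
    return y_train[np.argsort(d)[:k]].mean()

split = len(X) - 200                      # keep the last 200 points out of training
pred = knn_predict(X[-1], X[:split], y[:split])
print(pred, y[-1])
```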

Relevance: 100.00%

Abstract:

Due to environmental loads, mechanical damage, structural aging and human factors, civil infrastructure inevitably deteriorates during its service life. Since such damage may claim human lives and cause significant economic losses, how to identify damage and assess structural conditions in a timely and accurate manner has drawn increasing attention from the structural engineering community worldwide. In this study, a fast and sensitive time-domain damage identification method is developed. First, a high-quality finite element model is built and the structural responses are simulated under different damage scenarios. Based on the simulated data, an autoregressive moving average with exogenous inputs (ARMAX) model is then developed and calibrated. The calibrated ARMAX model can be used to identify damage in different scenarios through a model updating process using the clonal selection algorithm (CSA). The identification results demonstrate the performance of the proposed methodology, which has the potential to be used for damage identification in practice.
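A toy sketch of the ARMAX identification step is shown below: an ARMAX(2,1) model is fitted to simulated responses in a baseline and a reduced-stiffness state, and the shift in the identified AR coefficients serves as the damage-sensitive feature. The simulated system and model order are illustrative assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(9)

def simulate_response(stiffness, n=2000):
    """Toy single-degree-of-freedom response to a measured excitation,
    standing in for the finite element simulations described above."""
    force = rng.normal(size=n)
    y = np.zeros(n)
    for t in range(2, n):
        y[t] = (2 - stiffness) * y[t - 1] - 0.9 * y[t - 2] + 0.1 * force[t]
    return pd.Series(y), pd.Series(force)

# Fit an ARMAX(2,1) to a baseline and a "damaged" (reduced-stiffness) case.
coeffs = {}
for label, k in [("baseline", 0.5), ("damaged", 0.6)]:
    y, force = simulate_response(k)
    fit = SARIMAX(y, exog=force, order=(2, 0, 1)).fit(disp=False)
    coeffs[label] = fit.params[["ar.L1", "ar.L2"]]

# A shift in the identified AR coefficients is the kind of feature the
# model updating step (see the clonal selection sketch below) would track.
print(pd.DataFrame(coeffs))
```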

Relevance: 100.00%

Abstract:

Due to environmental loads, mechanical damage, structural aging and human factors, civil infrastructure inevitably deteriorates during its service life. Since such damage may claim human lives and cause significant economic losses, how to identify damage and assess structural conditions in a timely and accurate manner has drawn increasing attention from the structural engineering community worldwide. In this study, a fast and sensitive time-domain damage identification method is developed. To do this, a finite element model of a steel pipe laid on soil is built and the structural responses are simulated under different damage scenarios. Based on the simulated data, an autoregressive moving average with exogenous inputs (ARMAX) model is then built and calibrated. The calibrated ARMAX model is used to identify different damage scenarios through a model updating process using the clonal selection algorithm (CSA). The results demonstrate the application potential of the proposed method in identifying pipeline conditions. To further verify its performance, laboratory tests of a steel pipe laid on soil, with and without soil support (free-span damage), were carried out. The identification results for the pipe-soil system show that the proposed method is capable of identifying damage in a complex structural system. Therefore, it can be applied to identifying pipeline conditions.
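To complement the ARMAX sketch above, the following is a minimal clonal selection algorithm loop of the kind used for model updating: candidate damage parameters are cloned and hypermutated according to their rank, and the population is periodically refreshed with random newcomers. The objective function and parameterisation are placeholders, not the study's finite element model.

```python
import numpy as np

rng = np.random.default_rng(10)

# Toy target: a "damage" vector (e.g. stiffness reduction factors for a few
# pipe segments) to be recovered by matching identified model features.
true_damage = np.array([0.0, 0.3, 0.0, 0.1])

def affinity(candidate):
    """Higher is better: negative squared error to the target features."""
    return -np.sum((candidate - true_damage) ** 2)

pop_size, n_clones, n_gen, dim = 30, 5, 200, len(true_damage)
pop = rng.uniform(0, 1, size=(pop_size, dim))

for _ in range(n_gen):
    fitness = np.array([affinity(p) for p in pop])
    elites = pop[np.argsort(fitness)[::-1][:10]]          # best candidates first
    # Clone each elite and hypermutate: lower-ranked elites mutate more strongly.
    clones = []
    for rank, elite in enumerate(elites):
        scale = 0.02 * (rank + 1)
        clones.append(elite + rng.normal(scale=scale, size=(n_clones, dim)))
    clones = np.clip(np.vstack(clones), 0, 1)
    # Keep the best individuals and refresh the remainder with random newcomers.
    candidates = np.vstack([pop, clones])
    cand_fit = np.array([affinity(p) for p in candidates])
    keep = candidates[np.argsort(cand_fit)[::-1][:pop_size - 5]]
    pop = np.vstack([keep, rng.uniform(0, 1, size=(5, dim))])

best = pop[np.argmax([affinity(p) for p in pop])]
print(best.round(2))
```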

Relevance: 100.00%

Abstract:

There is currently no universally recommended and accepted method of data processing within the science of indirect calorimetry for either mixing chamber or breath-by-breath systems of expired gas analysis. Exercise physiologists were first surveyed to determine methods used to process oxygen consumption (V̇O2) data, and current attitudes to data processing within the science of indirect calorimetry. Breath-by-breath datasets obtained from indirect calorimetry during incremental exercise were then used to demonstrate the consequences of commonly used time, breath and digital filter post-acquisition data processing strategies. Assessment of the variability in breath-by-breath data was determined using multiple regression based on the independent variables ventilation (VE), and the expired gas fractions for oxygen and carbon dioxide, FEO2 and FECO2, respectively. Based on the results of explanation of variance of the breath-by-breath V̇O2 data, methods of processing to remove variability were proposed for time-averaged, breath-averaged and digital filter applications. Among exercise physiologists, the strategy used to remove the variability in sequential V̇O2 measurements varied widely, and consisted of time averages (30 sec [38%], 60 sec [18%], 20 sec [11%], 15 sec [8%]), a moving average of five to 11 breaths (10%), and the middle five of seven breaths (7%). Most respondents indicated that they used multiple criteria to establish maximum V̇O2 (V̇O2max), including: the attainment of age-predicted maximum heart rate (HRmax) [53%], respiratory exchange ratio (RER) >1.10 (49%) or RER >1.15 (27%), and a rating of perceived exertion (RPE) of >17, 18 or 19 (20%). The reasons stated for these strategies included their own beliefs (32%), what they were taught (26%), what they read in research articles (22%), tradition (13%) and the influence of their colleagues (7%). The combination of VE, FEO2 and FECO2 removed 96-98% of the breath-by-breath variability in incremental and steady-state exercise V̇O2 datasets, respectively. Correction of residual error in V̇O2 datasets to 10% of the raw variability results from application of a 30-second time average, a 15-breath running average, or a 0.04 Hz low cut-off digital filter. Thus, we recommend that once these data processing strategies are used, the peak or maximal value becomes the highest processed datapoint. Exercise physiologists need to agree on, and continually refine through empirical research, a consistent process for analysing data from indirect calorimetry.
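For concreteness, the sketch below applies the three processing strategies recommended above, a 30-second time average, a 15-breath running average and a 0.04 Hz low-pass digital filter, to a synthetic breath-by-breath V̇O2 series; breath timing and noise levels are assumptions.

```python
import numpy as np
import pandas as pd
from scipy.signal import butter, filtfilt

rng = np.random.default_rng(11)

# Placeholder breath-by-breath VO2 (L/min) during an incremental test:
# a rising trend plus the large breath-to-breath variability noted above.
n_breaths = 600
breath_times = np.cumsum(rng.uniform(1.5, 3.0, size=n_breaths))          # seconds
vo2 = np.linspace(0.5, 4.0, n_breaths) + rng.normal(scale=0.3, size=n_breaths)
series = pd.Series(vo2, index=pd.to_timedelta(breath_times, unit="s"))

# 30-second time average
time_avg = series.resample("30s").mean()

# 15-breath running average
breath_avg = series.rolling(window=15, center=True).mean()

# 0.04 Hz low-pass digital filter applied to the series resampled to 1 Hz
one_hz = series.resample("1s").mean().interpolate(limit_direction="both")
b, a = butter(N=3, Wn=0.04, btype="low", fs=1.0)
filtered = filtfilt(b, a, one_hz.to_numpy())

# Under the recommendation above, the maximal value would be taken as the
# highest processed data point from whichever strategy is used.
print(time_avg.max(), breath_avg.max(), np.nanmax(filtered))
```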