906 results for Autoregressive-Moving Average model

Abstract:

This work proposes a method to examine variations in the cointegration relation between preferred and common stocks in the Brazilian stock market via Markovian regime switches. It aims to contribute to future work on "pairs trading" and, more specifically, on price discovery, given that, conditional on the state, the system is assumed stationary. This implies there exists a (conditional) moving average representation from which measures of "information share" (IS) can be extracted. For identification purposes, the Markov error correction model is estimated within a Bayesian MCMC framework. Inference and the capability of detecting regime changes are demonstrated using a Monte Carlo experiment. I also highlight the necessity of modeling the financial effects of high-frequency data for reliable inference.
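
A minimal sketch of the regime-detection idea, using statsmodels' maximum-likelihood MarkovRegression on a synthetic spread series as a stand-in for the Bayesian MCMC estimation of the full Markov error correction model described above; the data and the two-regime setup are illustrative only.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
# Synthetic 'spread' between a preferred/common share pair: mean and
# variance shift halfway through the sample, mimicking a regime switch.
spread = pd.Series(np.concatenate([rng.normal(0.0, 0.5, 500),
                                   rng.normal(1.5, 1.0, 500)]))

# Two-regime Markov-switching model with switching mean and variance.
mod = sm.tsa.MarkovRegression(spread, k_regimes=2, switching_variance=True)
res = mod.fit()

# Smoothed probability of being in regime 1 at each date.
print(res.smoothed_marginal_probabilities[1].tail())
```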

Abstract:

The development of strategies for structural health monitoring (SHM) has become increasingly important because of the need to prevent undesirable damage. This paper describes an approach to this problem using vibration data. It involves a three-stage process: reduction of the time-series data using principal component analysis (PCA), development of a data-based auto-regressive moving average (ARMA) model fitted to data from an undamaged structure, and classification of whether or not the structure is damaged using a fuzzy clustering approach. The approach is applied to data from a benchmark structure from Los Alamos National Laboratory, USA. Two fuzzy clustering algorithms are compared: the fuzzy c-means (FCM) and Gustafson-Kessel (GK) algorithms. It is shown that while both fuzzy clustering algorithms are effective, the GK algorithm marginally outperforms the FCM algorithm.
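
The three-stage pipeline can be sketched compactly as below, on synthetic multi-channel vibration records with illustrative model orders; the wiring between stages is simplified relative to the paper, and a plain fuzzy c-means loop stands in for the FCM/GK implementations it compares.

```python
import numpy as np
from sklearn.decomposition import PCA
from statsmodels.tsa.arima.model import ARIMA

def arma_features(y, order=(4, 0, 2)):
    """Fit an ARMA model and return its AR coefficients as features."""
    res = ARIMA(y, order=order).fit()
    return res.arparams

def fuzzy_cmeans(X, c=2, m=2.0, iters=100, seed=0):
    """Plain fuzzy c-means; returns cluster centres and memberships."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        Um = U ** m
        centres = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)
    return centres, U

rng = np.random.default_rng(1)
# 20 records, each 500 time steps x 8 sensor channels (synthetic stand-in
# for the benchmark data).
records = rng.standard_normal((20, 500, 8))

features = []
for rec in records:
    # Stage 1: reduce the multi-channel record to one component series.
    pc1 = PCA(n_components=1).fit_transform(rec).ravel()
    # Stage 2: ARMA coefficients as damage-sensitive features.
    features.append(arma_features(pc1))
features = np.array(features)

# Stage 3: two fuzzy clusters, nominally 'undamaged' vs 'damaged'.
centres, U = fuzzy_cmeans(features, c=2)
print(U.round(2))
```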

Abstract:

Nearly half of the earth's photosynthetically fixed carbon derives from the oceans. To determine global and region-specific rates, we rely on models that estimate marine net primary productivity (NPP); it is therefore essential that these models be evaluated to determine their accuracy. Here we assessed the skill of 21 ocean color models by comparing their estimates of depth-integrated NPP to 1156 in situ C-14 measurements encompassing ten marine regions, including the Sargasso Sea, pelagic North Atlantic, coastal Northeast Atlantic, Black Sea, Mediterranean Sea, Arabian Sea, subtropical North Pacific, Ross Sea, West Antarctic Peninsula, and the Antarctic Polar Frontal Zone. Average model skill, as determined by root-mean-square difference calculations, was lowest in the Black and Mediterranean Seas, highest in the pelagic North Atlantic and the Antarctic Polar Frontal Zone, and intermediate in the other six regions. The maximum fraction of model skill that may be attributable to uncertainties in both the input variables and in situ NPP measurements was nearly 72%. On average, the simplest depth/wavelength-integrated models performed no worse than the more complex depth/wavelength-resolved models. Ocean color models were not highly challenged in extreme conditions of surface chlorophyll-a and sea surface temperature, nor in high-nitrate low-chlorophyll waters. Water column depth was the primary influence on ocean color model performance: average skill was significantly higher at depths greater than 250 m, suggesting that ocean color models are more challenged in Case-2 (coastal) waters than in Case-1 (pelagic) waters. Given that in situ chlorophyll-a data were used as input, algorithm improvement is required to eliminate the poor performance of ocean color NPP models in Case-2 waters close to coastlines. Finally, ocean color chlorophyll-a algorithms are themselves challenged by optically complex Case-2 waters, so using satellite-derived chlorophyll-a to estimate NPP in coastal areas would likely further reduce the skill of ocean color models.
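
For reference, a sketch of the skill metric named above: the root-mean-square difference between modeled and measured depth-integrated NPP. The log10 transform is an assumption, consistent with common practice in ocean-color model intercomparisons, rather than a detail stated in the abstract.

```latex
\[
  \Delta_i = \log_{10}\!\big(\mathrm{NPP}^{\mathrm{model}}_i\big)
           - \log_{10}\!\big(\mathrm{NPP}^{\mathrm{in\,situ}}_i\big),
  \qquad
  \mathrm{RMSD} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\Delta_i^{2}}
\]
```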

Abstract:

Electricity price forecasting is an important issue for all market participants, who must choose the most suitable strategies and establish bilateral contracts that maximize their profits and minimize their risks. Electricity prices typically exhibit seasonality, high volatility, and spikes, and they are influenced by many factors, such as energy demand, weather, and fuel prices. This work proposes a new hybrid approach for short-term electricity price forecasting that combines autoregressive integrated moving average (ARIMA) filters and artificial neural network (ANN) models in a cascade structure and uses explanatory variables. A two-step process is applied: first, the explanatory variables are forecast; second, electricity prices are forecast using the future values of the explanatory variables. The proposed model produces forecasts 12 steps (weeks) ahead and is applied to the Brazilian market, which has unique behavioral characteristics and adopts cost-based centralized dispatch. The results show a good ability to predict price spikes and satisfactory accuracy according to error measures and tail-loss tests when compared with traditional techniques. As a complement, a classifier model composed of decision trees and ANNs is proposed to make the price-formation rules explicit and, together with the forecasting model, to serve as an attractive tool for mitigating the risks of energy trading.
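
A minimal sketch of the two-step cascade, assuming a single explanatory variable (demand) and illustrative model orders; the thesis's actual configuration, with several explanatory variables and its specific ARIMA filters and network architecture, is not reproduced here.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
demand = 100 + np.cumsum(rng.normal(0, 1, 200))   # explanatory variable
price = 0.5 * demand + rng.normal(0, 2, 200)      # synthetic spot price

H = 12  # forecast horizon: 12 steps (weeks) ahead, as in the thesis

# Step 1: forecast the explanatory variable with an ARIMA filter.
demand_fc = ARIMA(demand, order=(1, 1, 1)).fit().forecast(H)

# Step 2: train an ANN to map the explanatory variable to the price,
# then feed it the ARIMA forecasts of the explanatory variable.
ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
ann.fit(demand.reshape(-1, 1), price)
print(ann.predict(demand_fc.reshape(-1, 1)))
```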

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Abstract:

We develop spatial statistical models for stream networks that can estimate relationships between a response variable and other covariates, make predictions at unsampled locations, and predict an average or total for a stream or a stream segment. There have been very few attempts to develop valid spatial covariance models that incorporate flow, stream distance, or both. Typical spatial autocovariance functions based on Euclidean distance, such as the spherical covariance model, are not valid when used with stream distance. In this paper we develop a large class of valid models that incorporate flow and stream distance by using spatial moving averages. These methods integrate a moving average function, or kernel, against a white noise process. By running the moving average function upstream from a location, we develop models that use flow, and by construction they are valid models based on stream distance. We show that, with proper weighting, many of the usual spatial models based on Euclidean distance have a counterpart for stream networks. Using sulfate concentrations from an example data set, the Maryland Biological Stream Survey (MBSS), we show that models using flow may be more appropriate than models that only use stream distance. For the MBSS data set, we use restricted maximum likelihood to fit a valid covariance matrix that uses flow and stream distance, and then use this covariance matrix to estimate fixed effects and make kriging and block kriging predictions.
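
The construction can be sketched as follows, taking an exponential kernel for concreteness: the process is a moving average of white noise run upstream from each site, and for flow-connected sites a stream distance h apart the resulting "tail-up" covariance carries flow-derived weights, while flow-unconnected sites are uncorrelated. The kernel and weighting shown are one illustrative member of the class developed in the paper.

```latex
\[
  Z(s) = \int g(x \mid s)\, \mathrm{d}W(x),
  \qquad
  C(s_i, s_j) =
  \begin{cases}
    \sigma^{2}\sqrt{\omega_{ij}}\; e^{-h/\alpha}, & s_i, s_j \text{ flow-connected},\\
    0, & \text{otherwise},
  \end{cases}
\]
% where omega_ij are flow-based weights ensuring validity, alpha is a
% range parameter, and h is stream distance.
```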

Abstract:

Changes in marine net primary productivity (PP) and export of particulate organic carbon (EP) are projected over the 21st century with four global coupled carbon cycle-climate models, whose representations of marine ecosystems and the carbon cycle differ in structure and complexity. All four models show a decrease in global mean PP and EP of between 2 and 20% by 2100 relative to preindustrial conditions under the SRES A2 emission scenario. Two different regimes for productivity changes are consistently identified in all models. The first chain of mechanisms is dominant in the low- and mid-latitude ocean and in the North Atlantic: reduced input of macro-nutrients into the euphotic zone, related to enhanced stratification, reduced mixed-layer depth, and slowed circulation, causes a decrease in macro-nutrient concentrations and in PP and EP. The second regime is projected for parts of the Southern Ocean: an alleviation of light and/or temperature limitation leads to an increase in PP and EP as productivity is fueled by a sustained nutrient input. A region of disagreement among the models is the Arctic, where three models project an increase in PP while one projects a decrease. Projected changes in seasonal and interannual variability are modest in most regions. Regional model skill metrics are proposed to generate multi-model mean fields that show improved skill in representing observation-based estimates compared to a simple multi-model average. Model results are also compared to recent productivity projections made with three different algorithms usually applied to infer net primary production from satellite observations.
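
A minimal sketch of what a skill-weighted multi-model mean could look like next to the simple average; the actual weighting scheme proposed in the study may differ.

```latex
\[
  \bar{X}_{\text{simple}} = \frac{1}{M}\sum_{m=1}^{M} X_m,
  \qquad
  \bar{X}_{\text{weighted}} = \frac{\sum_{m=1}^{M} w_m X_m}{\sum_{m=1}^{M} w_m},
  \quad w_m = S_m,
\]
% with S_m a regional skill score for model m.
```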

Abstract:

BACKGROUND Recent reports using administrative claims data suggest the incidence of community- and hospital-onset sepsis is increasing. Whether this reflects changing epidemiology, more effective diagnostic methods, or changes in physician documentation and medical coding practices is unclear. METHODS We performed a temporal-trend study from 2008 to 2012 using administrative claims data and patient-level clinical data of adult patients admitted to Barnes-Jewish Hospital in St. Louis, Missouri. Temporal trends and annual percent change were estimated using regression models with autoregressive integrated moving average errors. RESULTS We analyzed 62,261 inpatient admissions during the 5-year study period. 'Any SIRS' (i.e., SIRS on a single calendar day during the hospitalization) and 'multi-day SIRS' (i.e., SIRS on 3 or more calendar days), which both use patient-level data, and medical coding for sepsis (i.e., ICD-9-CM discharge diagnosis codes 995.91, 995.92, or 785.52) were present in 35.3%, 17.3%, and 3.3% of admissions, respectively. The incidence of admissions coded for sepsis increased 9.7% (95% CI: 6.1, 13.4) per year, while the patient data-defined events of 'any SIRS' decreased by 1.8% (95% CI: -3.2, -0.5) and 'multi-day SIRS' did not change significantly over the study period. Clinically defined sepsis (defined as SIRS plus bacteremia) and severe sepsis (defined as SIRS plus hypotension and bacteremia) decreased at statistically significant rates of 5.7% (95% CI: -9.0, -2.4) and 8.6% (95% CI: -12.6, -4.4) annually. All-cause mortality, SIRS mortality, and SIRS and clinically defined sepsis case fatality did not change significantly during the study period. Sepsis mortality based on ICD-9-CM codes, however, increased by 8.8% (95% CI: 1.9, 16.2) annually. CONCLUSIONS The incidence of sepsis, defined by ICD-9-CM codes, and sepsis mortality increased steadily without a concomitant increase in SIRS or clinically defined sepsis. Our results highlight the need to develop strategies to integrate clinical patient-level data with administrative data to draw more accurate conclusions about the epidemiology of sepsis.
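
A minimal sketch of the trend-estimation step, assuming a log-scale incidence series and an illustrative ARIMA(1,0,1) error structure (the study's exact specification is not given in the abstract): regress on time with SARIMAX and convert the slope to an annual percent change.

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
months = np.arange(60)                            # 5 study years, monthly
log_rate = 3.0 + 0.008 * months + rng.normal(0, 0.05, 60)

# Regression on time (in years) with ARMA(1,1) errors.
res = SARIMAX(log_rate, exog=months / 12.0,
              order=(1, 0, 1), trend="c").fit(disp=False)
beta = res.params[1]              # exog slope (params: intercept, exog, ...)
apc = 100 * (np.exp(beta) - 1)    # annual percent change on the raw scale
print(f"APC: {apc:.1f}% per year")
```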

Abstract:

This study demonstrated that accurate, short-term forecasts of Veterans Affairs (VA) hospital utilization can be made using the Patient Treatment File (PTF), the inpatient discharge database of the VA. Accurate short-term forecasts of two years or less can reduce required inventory levels, improve allocation of resources, and are essential for better financial management, all necessary achievements in an era of cost containment. Six years of non-psychiatric discharge records were extracted from the PTF and used to calculate four indicators of VA hospital utilization: average length of stay, discharge rate, multi-stay rate (a measure of readmissions), and days of care provided. National and regional levels of these indicators were described and compared for fiscal year 1984 (FY84) to FY89 inclusive. Using the observed levels of utilization for the 48 months between FY84 and FY87, five techniques were used to forecast monthly levels of utilization for FY88 and FY89. Forecasts were compared to the observed levels of utilization for these years, and monthly forecasts were also produced for FY90 and FY91. Forecasts for days of care provided were not produced: current inpatients with very long lengths of stay contribute a substantial amount to this indicator, and it cannot be accurately calculated. During the six-year period between FY84 and FY89, average length of stay declined substantially, nationally and regionally. The discharge rate was relatively stable, while the multi-stay rate increased slightly during this period. FY90 and FY91 forecasts show a continued decline in the average length of stay, while the discharge rate is forecast to decline slightly and the multi-stay rate to increase very slightly. Over a 24-month-ahead period, all three indicators were forecast within a 10 percent average monthly error; the 12-month-ahead forecast errors were slightly lower. Average length of stay was less easily forecast, while the multi-stay rate was the easiest indicator to forecast. No single technique performed significantly better as determined by the Mean Absolute Percent Error, a standard measure of error. However, Autoregressive Integrated Moving Average (ARIMA) models performed well overall and are recommended for short-term forecasting of VA hospital utilization.
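
For reference, the error measure referred to above, where y_t is the observed monthly utilization and \hat{y}_t its forecast:

```latex
\[
  \mathrm{MAPE} = \frac{100}{n}\sum_{t=1}^{n}
    \left|\frac{y_t - \hat{y}_t}{y_t}\right|
\]
```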

Abstract:

This paper addresses the determination of the realized thermal niche and the effects of climate change on the range distribution of two brown trout populations inhabiting two streams in the Duero River basin (Iberian Peninsula), at the edge of the natural distribution area of this species. To reach these goals, new methodological developments were applied to improve the reliability of forecasts. Water temperature data were collected using 11 thermographs located along the altitudinal gradient and were used to model the relationship between stream temperature and air temperature along the river continuum. Trout abundance was studied using electrofishing at 37 sites to determine the current distribution. The RCP4.5 and RCP8.5 change scenarios adopted by the Intergovernmental Panel on Climate Change for its Fifth Assessment Report were used for simulations and local downscaling. We found more reliable results using the daily mean stream temperature than the daily maximum temperature, together with their respective seven-day moving averages, to determine the distribution thresholds. The observed limits of the summer distribution of brown trout were thereby linked to thresholds between 18.1°C and 18.7°C. These temperatures characterise a realised thermal niche narrower than the physiological thermal range. In the most unfavourable climate change scenario, the thermal habitat loss of brown trout reaches 38% (Cega stream) and 11% (Pirón stream) in the upstream direction by the end of the century; at the Cega stream, however, the range reduction could reach 56% due to the opening of a 'warm window' in the piedmont reach.
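
A minimal sketch of the threshold screen described above, comparing a seven-day moving average of daily mean stream temperature with the lower distribution threshold; the file name and column labels are hypothetical.

```python
import pandas as pd

# One thermograph's record of water temperature (hypothetical CSV layout).
temps = pd.read_csv("thermograph_site.csv", parse_dates=["date"],
                    index_col="date")
daily_mean = temps["water_temp_c"].resample("D").mean()
ma7 = daily_mean.rolling(window=7, center=True).mean()

# Days on which the site exceeds the lower distribution threshold.
print(ma7[ma7 > 18.1])
```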

Abstract:

The causal nexus between financial development and economic growth has gained prominence in the literature since the early 1990s. The main theoretical lines in this area seek to establish the significance of the relationship and the direction of causality, if any. Unidirectional causality from financial development to economic growth, bidirectional causality between the two, and reverse causality, from growth to financial development, are the main hypotheses tested in empirical research. This thesis evaluates the causal nexus between credit (as an indicator of financial development) and growth in the Brazilian agricultural sector. Rural credit as a share of agricultural GDP has grown substantially since the mid-1990s, from 15.44% in 1996 to 65.24% in 2014. Over the period 1969-2014, the average annual ratio of rural credit to agricultural GDP was 43.87%, and agricultural output grew on average 3.76% per year. The question is whether, in the rural market, credit causes agricultural growth, whether reverse causality holds, or whether the bidirectional-causality hypothesis applies. To assess the causal nexus between these two economic variables, four methodological procedures were employed: a Granger causality test in a VAR representation using the Toda-Yamamoto approach, a Granger causality test in a Fully Modified OLS (FMOLS) model, a Granger causality test in an Autoregressive-Distributed Lag (ARDL) model, and a frequency-domain Granger causality test using the Breitung-Candelon method. The results uniformly show unidirectional causality from rural credit to growth of agricultural output. Reverse causality, from agricultural growth to rural credit, was not significantly detected by any of the four methods. The failure to detect bidirectional causality may be evidence of the impact of the strong government policy of subsidizing rural credit: the government's decision on the annual amount of rural credit available at subsidized interest rates may prevent the sector's performance, measured by its growth rate, from exerting a significant influence on the dynamics of rural credit. The results also open the possibility of testing the hypothesis of exogeneity of rural credit, a direct extension of the results obtained.
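
As a minimal illustration of one of the four procedures, the sketch below runs a plain bivariate Granger causality test on synthetic series; the Toda-Yamamoto, FMOLS, ARDL and Breitung-Candelon variants used in the thesis are more elaborate and are not reproduced.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
n = 46                                   # annual data, 1969-2014
credit = rng.normal(0, 1, n)             # rural credit (synthetic)
growth = 0.6 * np.roll(credit, 1) + rng.normal(0, 1, n)

# Column order is [effect, candidate cause]: this tests whether rural
# credit Granger-causes agricultural output growth at lags 1..2.
data = np.column_stack([growth, credit])
grangercausalitytests(data, maxlag=2)
```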

Abstract:

National meteorological offices are largely concerned with synoptic-scale forecasting, where weather predictions are produced for a whole country 24 hours ahead. In practice, many local organisations (such as emergency services, construction industries, forestry, farming, and sports) require only local, short-term, bespoke weather predictions and warnings. This thesis shows that these less demanding requirements do not call for exceptional computing power and can be met by a modern desk-top system which monitors site-specific ground conditions (such as temperature, pressure, wind speed and direction) augmented with above-ground information from satellite images to produce 'nowcasts'. The emphasis in this thesis is on the design of such a real-time system for nowcasting. Local site-specific conditions are monitored using a custom-built, stand-alone, Motorola 6809-based sub-system. Above-ground information is received from the METEOSAT 4 geo-stationary satellite using a sub-system based on commercially available equipment. The information is ephemeral and must be captured in real time; the nowcasting system handles the data as a transparent task within the limited capabilities of the PC system. Ground data produce a time series of measurements at a specific location, representing the past-to-present atmospheric conditions of the particular site, from which much information can be extracted. The novel approach adopted in this thesis is one of constructing stochastic models based on the AutoRegressive Integrated Moving Average (ARIMA) technique. The satellite images contain features (such as cloud formations) which evolve dynamically and may be subject to movement, growth, distortion, bifurcation, superposition, or elimination between images. The process of extracting a weather feature, following its motion, and predicting its future evolution involves algorithms for normalisation, partitioning, filtering, image enhancement, and correlation of multi-dimensional signals in different domains. To limit the processing requirements, the analysis concentrates on an 'area of interest'; by this rationale, only a small fraction of the total image needs to be processed, leading to a major saving in time. The thesis also proposes an extension to an existing manual cloud classification technique so that a cloud feature over the 'area of interest' can be classified automatically for nowcasting using the multi-dimensional signals.
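
One building block of the feature-tracking stage can be sketched as FFT-based phase correlation, which estimates how far a feature moved between two frames of the 'area of interest'; the frames below are synthetic and the method is a generic stand-in for the thesis's correlation algorithms.

```python
import numpy as np

def phase_correlation(ref, moved):
    """Estimate the (row, col) shift of `moved` relative to `ref`."""
    F = np.fft.fft2(moved) * np.conj(np.fft.fft2(ref))
    corr = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Map peak positions to signed shifts (wrap-around correction).
    return [int(p - s) if p > s // 2 else int(p)
            for p, s in zip(peak, corr.shape)]

rng = np.random.default_rng(0)
frame1 = rng.random((64, 64))
# The 'cloud' moves 3 pixels down and 5 pixels left between frames.
frame2 = np.roll(frame1, shift=(3, -5), axis=(0, 1))
print(phase_correlation(frame1, frame2))   # -> [3, -5]
```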

Abstract:

To carry out stability and voltage regulation studies on more-electric aircraft systems, in which there is a preponderance of multi-pulse, rectifier-fed motor-drive equipment, average dynamic models of the rectifier converters are required. Existing methods are difficult to apply to anything other than single converters with a low pulse number. Therefore, an efficient, compact method for deriving the approximate, linear, average model of 6- and 12-pulse rectifiers, based on the assumption of a small overlap angle, is presented. The models are validated against detailed simulations and laboratory prototypes.
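
For orientation, the steady-state average model of a 6-pulse diode bridge under the same small-overlap-angle assumption is the textbook relation below, in which the commutation inductance acts as a fictitious output resistance; the paper's linear dynamic models for 6- and 12-pulse converters build on results of this kind.

```latex
\[
  \bar{V}_{dc} \;=\; \frac{3\sqrt{2}}{\pi}\,V_{LL}
  \;-\; \frac{3\,\omega L_c}{\pi}\,\bar{I}_{dc}
\]
% V_LL: RMS line-to-line source voltage; L_c: commutation inductance;
% omega: supply angular frequency.
```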

Abstract:

This research explores Bayesian updating as a tool for estimating parameters probabilistically by dynamic analysis of data sequences. Two distinct Bayesian updating methodologies are assessed. The first approach focuses on Bayesian updating of failure rates for primary events in fault trees. A Poisson Exponentially Weighted Moving Average (PEWMA) model is implemented to carry out Bayesian updating of failure rates for individual primary events in the fault tree. To provide a basis for testing the PEWMA model, a fault tree is developed based on the Texas City Refinery incident of 2005. A qualitative fault tree analysis is then carried out to obtain a logical expression for the top event, and a dynamic fault tree analysis evaluates the top-event probability at each Bayesian updating step by Monte Carlo sampling from the posterior failure rate distributions. It is demonstrated that PEWMA modeling is advantageous over conventional conjugate Poisson-Gamma updating techniques when failure data are collected over long time spans. The second approach focuses on Bayesian updating of parameters in non-linear forward models; specifically, the technique is applied to the hydrocarbon material balance equation. To test the accuracy of the implemented Bayesian updating models, a synthetic data set is developed using the Eclipse reservoir simulator. Both structured-grid and MCMC-sampling-based solution techniques are implemented and are shown to model the synthetic data set with good accuracy, and a graphical analysis shows that the implemented MCMC model displays good convergence properties. A case study demonstrates that the likelihood variance affects the rate at which the posterior assimilates information from the measured data sequence. Error in the measured data significantly affects the accuracy of the posterior parameter distributions: increasing the likelihood variance mitigates random measurement errors but causes the overall variance of the posterior to increase. Bayesian updating is shown to be advantageous over deterministic regression techniques, as it allows for the incorporation of prior belief and full modeling of uncertainty over the parameter ranges. As such, the Bayesian approach to estimating parameters in the material balance equation shows utility for incorporation into reservoir engineering workflows.
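
For contrast with PEWMA, the conventional conjugate Poisson-Gamma update mentioned above takes only a few lines; the prior values and observed counts below are illustrative.

```python
# Gamma(a, b) prior on a primary event's failure rate (events per year):
# the Gamma family is conjugate to Poisson-distributed failure counts.
a, b = 2.0, 100.0            # prior mean a/b = 0.02 failures per year

# Each update step: k failures observed over an exposure time t (years).
for k, t in [(1, 50.0), (0, 80.0), (3, 60.0)]:
    a += k                   # shape accumulates the event counts
    b += t                   # rate accumulates the exposure time
    print(f"posterior mean rate: {a / b:.4f} per year")
```

Unlike PEWMA, this update gives old data full weight indefinitely, which is consistent with the finding above that PEWMA is advantageous when failure data span long periods.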

Abstract:

Costs related to inventory are usually a significant share of a company's total assets. Despite this, companies generally pay little attention to inventory, even though the benefits of effective inventory management are obvious: less tied-up capital, increased customer satisfaction, and a better working environment. Permobil AB, Timrå is in an intense period of revenue growth: the production unit is aiming for an increased output of 30% over the next two years. To make this possible, the company has to improve the way it distributes and handles material. The purpose of the study is to provide useful information and concrete proposals for action, so that the company can build a strategy for an effective and sustainable inventory management solution. Alternative methods for making forecasts are suggested in order to reach a more nuanced view of the different articles and how they should be managed. The Analytic Hierarchy Process (AHP) was used to let specially selected persons decide the criteria by which articles should be valued; the criteria they agreed on were annual volume value, lead time, frequency rate, and purchase price. The other proposed method was a two-dimensional model in which annual volume value and frequency were the criteria determining the class in which an article should be placed. Both methods resulted in significant changes compared with the current solution. For the spare-part inventory, different forecasting methods were tested and compared with the current solution. It turned out that the current forecasting method performed worse than both a moving average and exponential smoothing with trend. The small sample of ten randomly chosen articles is not large enough to reject the current solution, but the result is still reason enough for the company to check the quality of its forecasts.
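
A minimal sketch of the forecast comparison on a synthetic spare-part demand series: a trailing moving average against exponential smoothing with trend (Holt's method) from statsmodels; the window length and the series itself are illustrative.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import Holt

rng = np.random.default_rng(0)
# Four years of monthly demand with a mild upward trend.
demand = pd.Series(20 + 0.3 * np.arange(48) + rng.normal(0, 2, 48))

ma_forecast = demand.rolling(window=6).mean().iloc[-1]  # 6-month MA
holt_forecast = Holt(demand).fit().forecast(6)          # 6 months ahead

print(f"moving average: {ma_forecast:.1f}")
print(holt_forecast.round(1))
```

A plain moving average lags a trending series, which is one reason a trend-aware smoothing method is worth including in the comparison.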