859 results for Hydrological forecasting.


Relevance:

20.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: 62M20, 62M10, 62-07.

Abstract:

Since wind has an intrinsically complex and stochastic nature, accurate wind power forecasts are necessary for the safety and economics of wind energy utilization. In this paper, we investigate a combination of numeric and probabilistic models: one-day-ahead wind power forecasts were made with Gaussian Processes (GPs) applied to the outputs of a Numerical Weather Prediction (NWP) model. First, the wind speed data from the NWP model were corrected by a GP. Then, since there is always a defined limit on the power generated by a wind turbine due to its control strategy, a Censored GP was used to model the relationship between the corrected wind speed and power output. To validate the proposed approach, two real-world datasets were used for model construction and testing. The simulation results were compared with the persistence method and Artificial Neural Networks (ANNs); the proposed model achieves about 11% improvement in forecasting accuracy (Mean Absolute Error) over the ANN model on one dataset, and nearly 5% improvement on the other.
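The two-stage pipeline described above (NWP wind speed corrected by a GP, then a censored model for the rated-power limit) can be sketched in a few lines. Everything below is a toy stand-in: the cubic power curve, the kernel hyperparameters, and the 1.0 rated-power cap are invented, and clipping the posterior mean is only a crude substitute for a proper censored-GP likelihood.

```python
import numpy as np

def rbf_kernel(a, b, length=1.0, var=1.0):
    """Squared-exponential covariance between two 1-D input sets."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-2):
    """GP posterior mean at x_test (zero prior mean, fixed hyperparameters)."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    alpha = np.linalg.solve(K, y_train)
    return rbf_kernel(x_test, x_train) @ alpha

rng = np.random.default_rng(0)
speed = np.linspace(0.0, 15.0, 40)                 # "corrected" wind speed
power = np.clip((speed / 10.0) ** 3, 0.0, 1.0)     # toy cubic power curve, rated at 1.0
power += rng.normal(0.0, 0.02, speed.size)         # measurement noise
pred = gp_predict(speed, power, speed)
pred_censored = np.clip(pred, 0.0, 1.0)  # crude stand-in for censoring at rated power
```

The clip guarantees that no forecast exceeds the turbine's rated output, which is the practical effect the Censored GP achieves in a statistically principled way.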

Abstract:

Technology changes rapidly over the years, continuously providing more computing options and easing economic and other transactions. However, the introduction of new technology "pushes" old Information and Communication Technology (ICT) products out of use. E-waste is defined as the quantity of ICT products no longer in use; it is a bivariate function of the quantities sold and the probability that a given quantity of computers will be regarded as obsolete. In this paper, an e-waste generation model is presented and applied to the following regions: Western and Eastern Europe, Asia/Pacific, Japan/Australia/New Zealand, and North and South America. Furthermore, cumulative computer sales were retrieved for selected countries of these regions in order to compute obsolete computer quantities. To provide robust forecasts, a selection of forecasting models, namely (i) Bass, (ii) Gompertz, (iii) Logistic, (iv) Trend model, (v) Level model, (vi) AutoRegressive Moving Average (ARMA), and (vii) Exponential Smoothing, was applied, choosing for each country the model that gave the best in-sample results in terms of minimum error indices (Mean Absolute Error and Mean Square Error). As new technology does not diffuse in all regions of the world at the same speed, owing to different socio-economic factors, the lifespan distribution, which gives the probability that a certain quantity of computers is considered obsolete, is not adequately modeled in the literature. The time horizon for the forecast quantities is 2014-2030, and the results show a very sharp increase in the USA and the United Kingdom, driven by decreasing computer lifespans and increasing sales.
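The e-waste definition above (sales combined with an obsolescence probability) pairs a diffusion model for sales with a lifespan distribution; a minimal sketch follows, using the Bass model from the list. The Bass parameters and the lifespan distribution are hypothetical illustration values, not figures from the paper.

```python
import numpy as np

def bass_adopters(p, q, m, T):
    """Bass diffusion: new adopters per period for market potential m."""
    t = np.arange(1, T + 1)
    F = (1 - np.exp(-(p + q) * t)) / (1 + (q / p) * np.exp(-(p + q) * t))
    F = np.concatenate(([0.0], F))          # cumulative fraction, starting at zero
    return m * np.diff(F)                   # per-period sales

def ewaste(sales, lifespan_pmf):
    """Obsolete units per period: sales convolved with the lifespan distribution."""
    return np.convolve(sales, lifespan_pmf)[: len(sales)]

sales = bass_adopters(p=0.03, q=0.38, m=1000.0, T=20)
# hypothetical probability that a unit becomes obsolete k years after sale
lifespan = np.array([0.0, 0.0, 0.1, 0.2, 0.3, 0.25, 0.15])
waste = ewaste(sales, lifespan)
```

Shortening the lifespan distribution shifts obsolescence earlier, which is exactly the mechanism the abstract cites for the sharp e-waste increase in the USA and the UK.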

Abstract:

A Finite Element Analysis (FEA) model is used to explore the relationship between clogging and hydraulics in Horizontal Subsurface Flow Treatment Wetlands (HSSF TWs) in the United Kingdom (UK). Clogging is assumed to be caused by particle transport, and an existing single collector efficiency model is implemented to describe this behaviour. The flow model was validated against HSSF TW survey results obtained from the literature. The model successfully simulated the influence of overland flow on hydrodynamics, and the interaction between vertical flow through the low-permeability surface layer and the horizontal flow of the saturated water table. The clogging model described the development of clogging within the system but under-predicted the extent of clogging which occurred over 15 years, because important clogging mechanisms such as biomass growth and vegetation establishment were not considered. The model showed the usefulness of FEA for linking hydraulic and clogging phenomena in HSSF TWs and could be extended to include treatment processes. © 2011 Springer Science+Business Media B.V.

Abstract:

This paper describes the potential of pre-setting 11 kV overhead line ratings over a time period long enough to be useful for the real-time management of overhead lines. The forecast is based on freely available short- and long-term weather forecasts and is used to investigate the potential for realising dynamic rating benefits on the electricity network. A comparison of the rating benefits realisable using this forecast data has been undertaken over the period of a year.
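The idea behind weather-dependent ("dynamic") ratings can be illustrated with a drastically simplified steady-state heat balance: the allowable current is the one at which Joule heating equals cooling minus solar gain. The cooling coefficients, resistance, and temperatures below are invented illustration values, not IEEE 738 or network-operator figures.

```python
import math

def dynamic_rating(wind_speed, ambient_temp, conductor_temp=75.0,
                   solar_gain=10.0, r_ac=2.5e-4):
    """Toy steady-state rating in amps from a per-metre heat balance.

    q_conv and q_rad use invented coefficients; r_ac is an assumed
    AC resistance in ohm/m. Not a standards-compliant calculation.
    """
    dT = conductor_temp - ambient_temp
    q_conv = (1.0 + 3.0 * wind_speed) * dT   # crude forced-convection term, W/m
    q_rad = 0.05 * dT                        # crude radiation term, W/m
    return math.sqrt(max(q_conv + q_rad - solar_gain, 0.0) / r_ac)

# stronger wind means more cooling, hence a higher realisable rating
low = dynamic_rating(wind_speed=0.5, ambient_temp=20.0)
high = dynamic_rating(wind_speed=8.0, ambient_temp=20.0)
```

Pre-setting ratings from a weather forecast amounts to evaluating a relation like this over the forecast horizon instead of assuming worst-case weather.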

Abstract:

This article presents out-of-sample inflation forecasting results based on relative price variability and skewness. It is demonstrated that forecasts at long horizons of 1.5-2 years are significantly improved if the forecast equation is augmented with skewness. © 2010 Taylor & Francis.
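Augmenting the forecast equation amounts to adding a skewness regressor alongside the existing terms. A minimal OLS sketch on synthetic data follows; the data-generating process, the 0.5 coefficient, and the sample size are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 200
skew = rng.normal(0.0, 1.0, T)                   # stand-in for relative-price skewness
infl = 0.5 * skew + rng.normal(0.0, 0.5, T)      # synthetic long-horizon inflation

def ols_fit(y, X):
    """OLS with an intercept; returns fitted values and coefficients."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return X1 @ beta, beta

fit_aug, beta = ols_fit(infl, skew)
rmse_aug = np.sqrt(np.mean((infl - fit_aug) ** 2))
rmse_mean = np.sqrt(np.mean((infl - infl.mean()) ** 2))   # intercept-only benchmark
```

When skewness genuinely carries predictive information, the augmented equation's fit error falls below the mean-only benchmark, which is the in-sample analogue of the improvement the article reports out of sample.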

Abstract:

This paper provides the most comprehensive evidence to date on whether monetary aggregates are valuable for forecasting US inflation in the early to mid 2000s. We explore a wide range of definitions of money, including different methods of aggregation and different collections of included monetary assets. In our forecasting experiment we use two nonlinear techniques, namely recurrent neural networks and kernel recursive least squares regression, techniques that are new to macroeconomics. Recurrent neural networks operate with potentially unbounded input memory, while the kernel regression technique is a finite-memory predictor. The two methodologies compete to find the best-fitting US inflation forecasting models and are then compared to forecasts from a naive random walk model. The best models were nonlinear autoregressive models based on kernel methods. Our findings do not provide much support for the usefulness of monetary aggregates in forecasting inflation. Beyond its economic findings, our study is in the tradition of physicists' long-standing interest in the interconnections among statistical mechanics, neural networks, and related nonparametric statistical methods, and suggests potential avenues of extension for such studies. © 2010 Elsevier B.V. All rights reserved.
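A nonlinear autoregressive kernel model of the winning kind can be sketched as follows. The paper uses a recursive (online) kernel least-squares estimator; the batch kernel ridge fit below is a simplified stand-in, and the nonlinear series, kernel width, and regularisation are invented.

```python
import numpy as np

def rbf(a, b, gamma=2.0):
    """RBF kernel matrix between row-vector sample sets a and b."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_ridge_fit(X, y, lam=1e-2):
    """Batch kernel least squares; a recursive variant would update alpha online."""
    alpha = np.linalg.solve(rbf(X, X) + lam * np.eye(len(y)), y)
    return lambda Xnew: rbf(Xnew, X) @ alpha

rng = np.random.default_rng(2)
y = np.zeros(300)
for t in range(1, 300):
    y[t] = -0.5 * np.tanh(y[t - 1]) + rng.normal(0.0, 0.1)   # invented nonlinear series

X, target = y[:-1].reshape(-1, 1), y[1:]          # lagged value predicts the next one
model = kernel_ridge_fit(X[:200], target[:200])
mae_kernel = np.abs(target[200:] - model(X[200:])).mean()
mae_rw = np.abs(target[200:] - X[200:, 0]).mean() # naive random walk benchmark
```

On a series with genuine nonlinear mean reversion the kernel autoregression beats the random walk, mirroring the paper's horse-race design.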

Abstract:

This paper studies whether error-correction models that assume long-maturity forward exchange rates are stationary, which in earlier studies showed excellent out-of-sample forecasting ability when applied to the advanced industrial countries accounting for about 75 per cent of world foreign exchange turnover, are capable of forecasting the euro exchange rates of three Central-East European currencies (the Czech koruna, Hungarian forint and Polish zloty). The results for the three currencies differ considerably from each other and are overall less favourable than those obtained earlier for major currencies. These unfavourable results are attributed to the combined influence of managed exchange-rate regimes, the short time series available, uncertainties related to future euro-zone entry, the existence of a foreign exchange risk premium and a forward interest premium, and the Balassa-Samuelson effect. JEL codes: E43, F31, F47.
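The class of model being tested can be sketched as a simple error-correction rule in which the spot rate closes part of its gap to the long-maturity forward rate each period. The series, the flat forward rate, and the adjustment speed below are invented illustration values.

```python
import numpy as np

def ecm_forecast(spot, forward, alpha):
    """One-step forecast: spot closes a fraction alpha of the gap to the forward rate."""
    return spot + alpha * (forward - spot)

# simulate an invented spot series that error-corrects toward a flat forward rate
rng = np.random.default_rng(3)
fwd = 100.0
spot = np.zeros(200)
spot[0] = 90.0
for t in range(1, 200):
    spot[t] = ecm_forecast(spot[t - 1], fwd, alpha=0.15) + rng.normal(0.0, 0.2)

# recover the adjustment speed by regressing the spot change on the lagged gap
gap = fwd - spot[:-1]
alpha_hat = (gap @ np.diff(spot)) / (gap @ gap)
```

A significant, correctly signed alpha is what gives such models forecasting power for major currencies; the paper's point is that this mechanism works far less reliably for the three Central-East European currencies.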

Abstract:

This article presents the overall structure of the labour-market forecasting system developed by the Institute of Economics of the Research Centre for Economic and Regional Studies of the Hungarian Academy of Sciences (MTA KRTK KTI), and the main principles followed in devising that structure. The authors present the broad, comprehensive data system, unprecedented in Hungarian practice, from which the estimates and forecasts are made. The article touches briefly on the forecasting of sectoral GDP, the operation of the supply and demand sides of the model, and the examination of the dynamics of discrepancies between supply and demand.

Abstract:

A significant share of companies face the problem that demand for many of their products arises only on relatively few occasions. Demand for such products is not continuous in time: it is diffuse and random, with a large proportion of zero values in the analysed time series; such products are said to have sporadic (intermittent) demand. The sporadic character of a demand pattern means that the available information on demand in previous selling periods is patchy, lowering the quality of the available data, so classical forecasting techniques such as moving averages or exponential smoothing do not give reliable forecasts. The distinction between sporadic and non-sporadic products is often made only by rule of thumb, although the literature offers guidance. Special forecasting algorithms dealing with this problem, proposed specifically for products with sporadic demand, have been developed over the last decade; papers treating the topic in a real application environment are rare even internationally, and to our knowledge no scientific paper on it has yet been published in Hungarian. After a theoretical introduction, the paper tests these recommendations as a case study of a Hungarian pharmaceutical wholesaler: based on real data, we develop a typology of the company's product portfolio from a demand-forecasting point of view, carry out forecasts using different techniques, including those developed for products with sporadic demand, and analyse the quality of these forecasts.
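One of the best-known algorithms developed for sporadic demand is Croston's method, which smooths nonzero demand sizes and inter-demand intervals separately and forecasts their ratio. The abstract does not name the specific methods the authors tested, so this is offered only as a representative sketch with an invented demand series.

```python
import numpy as np

def croston(demand, alpha=0.1):
    """Croston's method: exponentially smooth nonzero demand size (z) and the
    interval between demands (p) separately; the forecast is z / p."""
    z = None        # smoothed nonzero demand size
    p = None        # smoothed inter-demand interval
    q = 1           # periods elapsed since the last nonzero demand
    forecasts = []
    for d in demand:
        forecasts.append(z / p if z is not None else 0.0)
        if d > 0:
            if z is None:
                z, p = float(d), float(q)      # initialise on first demand
            else:
                z = z + alpha * (d - z)
                p = p + alpha * (q - p)
            q = 1
        else:
            q += 1
    return np.array(forecasts)

demand = [0, 0, 5, 0, 0, 0, 4, 0, 0, 6]        # invented sporadic series
f = croston(demand)
```

Unlike a moving average, the forecast stays flat at the expected demand rate through the zero periods instead of collapsing toward zero, which is why such methods suit sporadic series.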

Abstract:

The author investigates the impact of the financial crisis that started in 2008 on the forecasting error for earnings per share (EPS). There is ample evidence, dating back to the 1980s, that analysts give systematically more favourable values in their EPS forecasts than the actual figures, i.e. they are generally optimistic. Other investigations have shown that the EPS forecasting error grows in uncertain environments, while there is also considerable evidence that analysts under-react to negative information in their forecasts. The financial crisis brought a myriad of negative information for analysts to consider in their forecasts, while also increasing the level of uncertainty across the entire economy. The article investigates the impact of the financial crisis on the EPS forecasting error, distinguishing the period when the crisis was merely negative news from the one in which, as its effect, uncertainty had increased significantly across the entire economy.

Abstract:

Following the introduction of the Basel 2 capital accord, banks and credit institutions in Hungary also began building their own internal rating systems, whose maintenance and development is an ongoing task. The author explores whether it is possible to increase the predictive power of bankruptcy forecasting models using traditional mathematical-statistical methods by incorporating into the models the degree of change of the financial ratios over time. The empirical results suggest that the temporal development of the financial ratios of Hungarian firms carries important information about future solvency, since using such indicators significantly increases the predictive power of the bankruptcy models. The author also examines whether correcting extremely high or low observation values before modelling improves the classification performance of the models.
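The core idea, placing the change of a financial ratio over time next to its level, can be sketched with a plain logistic model fitted by gradient descent on synthetic data. The data-generating process, coefficients, and sample size below are invented; real bankruptcy models would use many ratios and out-of-sample validation.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 400
ratio = rng.normal(0.0, 1.0, n)          # level of a financial ratio (standardised)
delta = rng.normal(0.0, 1.0, n)          # year-on-year change of the ratio
logit = -1.0 * ratio - 1.5 * delta       # deteriorating ratios raise default risk
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(float)

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Logistic regression via batch gradient descent on the mean log-loss."""
    X1 = np.column_stack([np.ones(len(y)), X])
    w = np.zeros(X1.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-X1 @ w))
        w -= lr * X1.T @ (p - y) / len(y)
    return w

def accuracy(w, X, y):
    X1 = np.column_stack([np.ones(len(y)), X])
    return (((X1 @ w) > 0).astype(float) == y).mean()

w_level = fit_logistic(ratio.reshape(-1, 1), y)
w_both = fit_logistic(np.column_stack([ratio, delta]), y)
acc_level = accuracy(w_level, ratio.reshape(-1, 1), y)
acc_both = accuracy(w_both, np.column_stack([ratio, delta]), y)
```

When the change variable genuinely carries risk information, adding it improves classification over the level-only model, which is the pattern the empirical results describe.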

Abstract:

Geochemical and geophysical approaches have been used to investigate the freshwater and saltwater dynamics in the coastal Biscayne Aquifer and Biscayne Bay. Stable isotopes of oxygen and hydrogen, and concentrations of Sr2+ and Ca2+, were combined in two geochemical mixing models to provide estimates of the various freshwater inputs (precipitation, canal water, and groundwater) to Biscayne Bay and the coastal canal system in South Florida. Shallow geophysical electromagnetic and direct current resistivity surveys were used to image the geometry and stratification of the saltwater mixing zone in the near-coastal (less than 1 km inland) Biscayne Aquifer. The combined stable isotope and trace metal models suggest a canal input-precipitation-groundwater ratio of 38%-52%-10% in the wet season and 37%-58%-5% in the dry season, with an error of 25%, of which most (20%) was attributed to the isotope regression model and the remaining 5% to the Sr2+/Ca2+ mixing model. These models suggest rainfall is the dominant source of freshwater to Biscayne Bay. For a bay-wide water budget that includes saltwater and freshwater mixing, fresh groundwater accounts for less than 2% of the total input. A similar Sr2+/Ca2+ tracer model indicates precipitation is the dominant source in 9 out of 10 canals that discharge into Biscayne Bay. The two-component mixing model converged for 100% of the freshwater canal samples in this study, with 63% of the canal water coming from precipitation and 37% from groundwater inputs, ±4%. There was a seasonal shift from 63% precipitation input in the dry season to 55% in the wet season. The three end-member mixing model converged for only 60% of the saline canal samples, possibly due to non-conservative behavior of Sr2+ and Ca2+ in saline groundwater discharging into the canal system.
Electromagnetic and direct current resistivity surveys were successful at locating and estimating the geometry and depth of the freshwater/saltwater interface in the Biscayne Aquifer at two near-coastal sites. A saltwater interface that deepened as the survey moved inland was detected, with a maximum interpreted depth to the interface of 15 meters approximately 0.33 km inland from the shoreline.
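The two- and three-end-member mixing models referred to above reduce to small linear systems: each conservative tracer gives one mass-balance equation, and the fractions must sum to one. The tracer values below are invented illustration numbers, not the study's data.

```python
import numpy as np

def two_endmember_mix(sample, end_a, end_b):
    """Fraction of end member A in a sample, from a single conservative tracer."""
    return (sample - end_b) / (end_a - end_b)

def three_endmember_mix(sample, endmembers):
    """Fractions of three end members from two tracers plus the constraint
    sum(f) = 1; endmembers is 3 rows (sources) x 2 tracer columns."""
    A = np.vstack([np.asarray(endmembers, float).T, np.ones(3)])
    b = np.append(np.asarray(sample, float), 1.0)
    return np.linalg.solve(A, b)

# invented d18O values: precipitation (-4.0) vs groundwater (-1.0) end members
f_precip = two_endmember_mix(-3.0, -4.0, -1.0)

# invented (d18O, Sr/Ca-like) end members and a sample mixed exactly 50/30/20
ends = [[-4.0, 100.0], [-1.0, 400.0], [-2.0, 900.0]]
f3 = three_endmember_mix([-2.7, 350.0], ends)
```

The non-convergence the abstract reports for saline samples corresponds to this system returning fractions outside [0, 1] when a tracer behaves non-conservatively.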

Abstract:

An iterative travel time forecasting scheme, named the Advanced Multilane Prediction based Real-time Fastest Path (AMPRFP) algorithm, is presented in this dissertation. This scheme is derived from the conventional kernel estimator based prediction model by associating real-time nonlinear impacts caused by neighboring arcs' traffic patterns with the historical traffic behaviors. The AMPRFP algorithm is evaluated by predicting the travel time of congested arcs in the urban area of the city of Jacksonville. Experimental results illustrate that the proposed scheme significantly reduces both the relative mean error (RME) and the root-mean-squared error (RMSE) of the predicted travel time. To obtain high-quality real-time traffic information, which is essential to the performance of the AMPRFP algorithm, a data clean scheme enhanced empirical learning (DCSEEL) algorithm is also introduced. This novel method investigates the correlation between distance and direction in the geometrical map, which is not considered in existing fingerprint localization methods. Specifically, empirical learning methods are applied to minimize the error in the estimated distance, and a direction filter is developed to clean joints that have a negative influence on localization accuracy. Synthetic experiments in urban, suburban and rural environments are designed to evaluate the performance of the DCSEEL algorithm in determining the cellular probe's position. The results show that the cellular probe's localization accuracy can be notably improved by the DCSEEL algorithm. Additionally, a new fast correlation technique is developed to overcome the time-efficiency problem of the existing correlation-algorithm-based floating car data (FCD) technique.
The matching process is transformed into a one-dimensional (1-D) curve matching problem, and the Fast Normalized Cross-Correlation (FNCC) algorithm is introduced to supersede the Pearson Product-Moment Correlation Coefficient (PMCC) algorithm in order to meet the real-time requirement of the FCD method. The fast correlation technique shows a significant improvement in reducing the computational cost without affecting the accuracy of the matching process.
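The quantity FNCC accelerates is the normalized cross-correlation of a template against every window of a signal; the direct 1-D version below shows what is computed (FNCC itself obtains the same scores faster via FFTs and running sums). The signal and template are invented toy data.

```python
import numpy as np

def ncc_1d(signal, template):
    """Normalized cross-correlation of a template against each window of signal.

    Each score is the Pearson correlation between the template and the window,
    so a perfect shape match scores 1.0 regardless of offset or scale.
    """
    n = len(template)
    t = (template - template.mean()) / template.std()
    out = []
    for i in range(len(signal) - n + 1):
        w = signal[i:i + n]
        sd = w.std()
        out.append(0.0 if sd == 0 else ((w - w.mean()) / sd * t).mean())
    return np.array(out)

sig = np.array([0.0, 0.0, 1.0, 3.0, 2.0, 0.0, 0.0])   # invented GPS-speed-like curve
tpl = np.array([1.0, 3.0, 2.0])                        # invented template segment
scores = ncc_1d(sig, tpl)
best = int(scores.argmax())                            # offset of the best match
```

The direct loop costs O(len(signal) * len(template)) per match; replacing it with an FFT-based correlation plus precomputed running sums is what makes the FCD matching real-time capable.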