936 results for Forecasting
Abstract:
Avalanche forecasting is a complex process involving the assimilation of multiple data sources to make predictions over varying spatial and temporal resolutions. Numerically assisted forecasting often uses nearest neighbour (NN) methods, which are known to have limitations when dealing with high-dimensional data. Support Vector Machines (SVMs) belong to a family of theoretically grounded machine learning techniques designed to deal with high-dimensional data. We apply SVMs to a dataset from Lochaber, Scotland to assess their applicability in avalanche forecasting. Initial experiments showed that SVMs gave results comparable with NN for categorical and probabilistic forecasts. Experiments utilising the ability of SVMs to deal with high dimensionality in producing a spatial forecast show promise, but require further work.
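The SVM-versus-NN comparison described above can be sketched with scikit-learn on synthetic data. The Lochaber dataset is not given in the abstract, so the features and labels below are hypothetical stand-ins for an "avalanche day" classification task.

```python
# Toy comparison of an SVM and a nearest-neighbour classifier, in the spirit
# of the study above. All data and feature meanings are invented for
# illustration; this is not the Lochaber dataset.
import numpy as np
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical high-dimensional features (e.g. snowpack, weather variables)
X = rng.normal(size=(400, 8))
# Hypothetical binary label: avalanche day (1) / non-avalanche day (0)
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=400) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

svm = SVC(kernel="rbf", probability=True).fit(X_tr, y_tr)   # SVM classifier
knn = KNeighborsClassifier(n_neighbors=10).fit(X_tr, y_tr)  # NN baseline

print("SVM accuracy:", svm.score(X_te, y_te))
print("NN accuracy:", knn.score(X_te, y_te))
```

The `probability=True` option gives the probabilistic forecasts the abstract mentions (via `svm.predict_proba`), alongside the categorical ones.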
Abstract:
An expert system has been developed that provides 24-hour forecasts of roadway and bridge frost for locations in Iowa. The system is based on analysis of frost observations taken by highway maintenance personnel, on analysis of conditions leading to frost as obtained from meteorologists with experience in forecasting bridge and roadway frost, and on fundamental physical principles of frost processes. The expert system requires the forecaster to enter information on recent maximum and minimum temperatures and forecasts of maximum and minimum air temperatures, dew point temperatures, precipitation, cloudiness, and wind speed. The system has been used operationally for the last two frost seasons by Freese-Notis Associates, who have been under contract with the Iowa DOT to supply frost forecasts. The operational meteorologists give the system their strong endorsement; they always consult it before making a frost forecast unless conditions clearly indicate frost is not likely. In operational use, the system is run several times with different input values to test the sensitivity of frost formation on a particular day to various meteorological parameters. The users comment that the system helps them consider all the factors relevant to frost formation, and they regard it as an office companion for making frost forecasts.
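A rule-based frost forecast of this kind can be sketched as a small decision function over the inputs the abstract lists. The actual rules of the Iowa system are not given, so the thresholds and rule structure below are invented purely for illustration.

```python
# Hypothetical rule-of-thumb sketch in the spirit of a frost expert system.
# The real system's rules are not published in the abstract; every threshold
# here is an assumption, not the operational logic.
def frost_risk(surface_min_temp_c, dew_point_c, cloud_fraction, wind_mps):
    """Return a rough frost-risk category from forecast inputs."""
    if surface_min_temp_c > 1.0:
        return "low"        # surface expected to stay above freezing
    if cloud_fraction > 0.8 or wind_mps > 8.0:
        return "low"        # clouds or wind suppress radiative cooling
    if dew_point_c >= surface_min_temp_c:
        return "high"       # moisture available to deposit as frost
    return "moderate"

# Clear, calm, moist night below freezing
print(frost_risk(-2.0, -1.0, 0.1, 2.0))  # -> high
```

Running the function several times with perturbed inputs mirrors the sensitivity testing described in the abstract.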
Abstract:
Several accidents, some involving fatalities, have occurred on U.S. Highway 30 near the Archer Daniels Midland Company (ADM) Corn Sweeteners plant in Cedar Rapids, Iowa. A contributing factor to many of these accidents has been the large amounts of water (vapor and liquid) emitted from multiple sources at ADM's facility located along the south side of the highway. Weather and road closure data acquired from IDOT have been used to develop a database of meteorological conditions preceding and accompanying closure of Highway 30 in Cedar Rapids. An expert system and a FORTRAN program were developed as aids in decision making with regard to closure of Highway 30 near the plant. The computer programs were used for testing, evaluation, and final deployment. Reports indicate the decision tools have been successfully implemented and were judged to be helpful in forecasting road closures and in reducing costs and personnel time in monitoring the roadway.
Abstract:
A network of 25 sonic stage sensors was deployed in the Squaw Creek basin upstream from Ames, Iowa to determine whether the state-of-the-art distributed hydrological model CUENCAS can produce reliable information for all road crossings, including those that cross small creeks draining basins as small as 1 sq. mile. A hydraulic model was implemented for the major tributaries of the Squaw Creek where IFC sonic instruments were deployed, and it was coupled to CUENCAS to validate the predictions made at small tributaries in the basin. This study demonstrates that the predictions made by the hydrological model at internal locations in the basin are as accurate as the predictions made at the outlet of the basin. Final rating curves based on surveyed cross sections were developed for the 22 IFC bridge sites that are currently operating, and routine forecasts are provided at those locations (see IFIS). Rating curves were developed for 60 additional bridge locations in the basin; however, we do not use those rating curves for routine forecasting because of the limited accuracy of LiDAR-derived cross sections. The results of our work form the basis for two papers that have been submitted for publication to the Journal of Hydrological Engineering. Peer review of our work will give a strong footing to our ability to expand our results from the pilot Squaw Creek basin to all basins in Iowa.
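The rating curves mentioned above relate river stage to discharge; a common functional form is the power law Q = a·(h − h0)^b fitted to surveyed stage-discharge pairs. The sketch below fits that form to synthetic data; the coefficients and measurements are illustrative, not the Squaw Creek curves.

```python
# Fit a power-law rating curve Q = a * (h - h0) ** b to synthetic
# stage-discharge data. All numbers are illustrative assumptions.
import numpy as np
from scipy.optimize import curve_fit

def rating(h, a, h0, b):
    # Clip keeps (h - h0) positive while the optimizer explores h0
    return a * np.clip(h - h0, 1e-9, None) ** b

stage = np.array([0.5, 0.8, 1.2, 1.8, 2.5, 3.1])   # stage h, m
discharge = 4.0 * (stage - 0.2) ** 1.6              # synthetic "truth", m^3/s

(a, h0, b), _ = curve_fit(rating, stage, discharge, p0=(3.0, 0.1, 1.5))
print(f"Q ~ {a:.2f} * (h - {h0:.2f}) ** {b:.2f}")
```

Once fitted, the curve converts a sensed stage into a discharge estimate, which is how stage sensors feed a forecast system.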
Abstract:
High-energy charged particles in the Van Allen radiation belts and in solar energetic particle events can damage satellites on orbit, leading to malfunctions and loss of satellite service. Here we describe some recent results from the SPACECAST project on modelling and forecasting the radiation belts and on modelling solar energetic particle events. The SPACECAST forecasting system uses physical models that include wave-particle interactions to forecast the electron radiation belts up to 3 h ahead. We show that the forecasts were able to reproduce the >2 MeV electron flux at GOES 13 during the moderate storm of 7-8 October 2012, and the period following a fast solar wind stream on 25-26 October 2012, to within a factor of 5 or so. At lower energies, 10 keV to a few hundred keV, we show that the electron flux at geostationary orbit depends sensitively on the high-energy tail of the source distribution near 10 RE on the nightside of the Earth, and that the source is best represented by a kappa distribution. We present a new model of whistler mode chorus, determined from multiple satellite measurements, which shows that the effects of wave-particle interactions beyond geostationary orbit are likely to be very significant. We also present radial diffusion coefficients calculated from satellite data at geostationary orbit which vary with Kp by over four orders of magnitude. Finally, we describe a new automated method, which takes entropy into account, for determining the position on the shock that is magnetically connected to the Earth when modelling solar energetic particle events, and we predict from analytical theory the form of the mean free path in the foreshock and the particle injection efficiency at the shock, both of which can be tested in simulations.
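The abstract does not state the kappa distribution itself; for reference, its standard isotropic form from the general space-plasma literature (with number density $n$, thermal speed $\theta$, and spectral index $\kappa$) is:

```latex
f(v) \;=\; \frac{n}{\pi^{3/2}\,\theta^{3}\,\kappa^{3/2}}
\,\frac{\Gamma(\kappa+1)}{\Gamma\!\left(\kappa-\tfrac{1}{2}\right)}
\left(1+\frac{v^{2}}{\kappa\,\theta^{2}}\right)^{-(\kappa+1)}
```

For large $v$ this falls off as a power law, $f \propto v^{-2(\kappa+1)}$, which is exactly the high-energy tail the geostationary electron flux is said to be sensitive to; as $\kappa \to \infty$ the distribution tends to a Maxwellian.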
Abstract:
This master's thesis covers the concepts of knowledge discovery, data mining, and technology forecasting methods in telecommunications. It covers the various aspects of knowledge discovery in databases and discusses in detail the data mining and technology forecasting methods that are used in telecommunications. The main concern throughout the thesis is to emphasize the methods used in technology forecasting and data mining for telecommunications. It attempts to answer, to some extent, the question of whether forecasts create a future, and it also describes a few difficulties that arise in technology forecasting. This thesis was done as part of my master's studies at Lappeenranta University of Technology.
Abstract:
The increasing interest aroused by more advanced forecasting techniques, together with the requirement for more accurate forecasts of tourism demand at the destination level due to the constant growth of world tourism, has led us to evaluate the forecasting performance of neural modelling relative to that of time series methods at a regional level. Seasonality and volatility are important features of tourism data, which makes it a particularly favourable context in which to compare the forecasting performance of linear models to that of nonlinear alternative approaches. Pre-processed official statistical data on overnight stays and tourist arrivals from all the different countries of origin to Catalonia from 2001 to 2009 is used in the study. When comparing the forecasting accuracy of the different techniques for different time horizons, autoregressive integrated moving average (ARIMA) models outperform self-exciting threshold autoregressions and artificial neural network models, especially for shorter horizons. These results suggest that there is a trade-off between the degree of pre-processing and the accuracy of the forecasts obtained with neural networks, which are more suitable in the presence of nonlinearity in the data. In spite of the significant differences between countries, which can be explained by different patterns of consumer behaviour, we also find that forecasts of tourist arrivals are more accurate than forecasts of overnight stays.
Abstract:
Forecasting coal resources and reserves is critical for coal mine development. Thickness maps are commonly used for assessing coal resources and reserves; however, they are limited in capturing coal-splitting effects in thick and heterogeneous coal zones. As an alternative, three-dimensional geostatistical methods are used to populate the facies distribution within a densely drilled heterogeneous coal zone in the As Pontes Basin (NW Spain). Coal distribution in this zone is mainly characterized by coal-dominated areas in the central parts of the basin interfingering with terrigenous-dominated alluvial fan zones at the margins. The three-dimensional models obtained are applied to forecast coal resources and reserves. Predictions using subsets of the entire dataset are also generated to understand the performance of the methods under limited data constraints. Three-dimensional facies interpolation methods tend to overestimate coal resources and reserves due to interpolation smoothing. Facies simulation methods yield resource predictions similar to those of conventional thickness-map approximations. Reserves predicted by facies simulation methods are mainly influenced by: a) the specific coal proportion threshold used to determine whether a block can be recovered, and b) the capability of the modelling strategy to reproduce areal trends in coal proportions and splitting between coal-dominated and terrigenous-dominated areas of the basin. Reserve predictions differ between the simulation methods, even with dense conditioning datasets. The simulation methods can be ranked according to the correlation of their outputs with predictions from the directly interpolated coal proportion maps: a) with low-density datasets, sequential indicator simulation with trends yields the best correlation; b) with high-density datasets, sequential indicator simulation with post-processing yields the best correlation, because the areal trends are provided implicitly by the dense conditioning data.
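The conventional thickness-map estimate that the 3-D methods are benchmarked against is simple: resource tonnage is the gridded coal thickness times cell area times bulk density. The grid and constants below are hypothetical, chosen only to show the arithmetic.

```python
# Toy thickness-map resource estimate: tonnage = sum(thickness) * area * density.
# Grid values, cell size, and density are invented for illustration.
import numpy as np

thickness = np.array([[2.0, 3.5, 4.0],
                      [1.5, 3.0, 2.5],
                      [0.0, 1.0, 2.0]])  # coal thickness per grid cell, m
cell_area = 250.0 * 250.0                # cell footprint, m^2
density = 1.3                            # assumed coal bulk density, t/m^3

resources_tonnes = thickness.sum() * cell_area * density
print(f"estimated resources: {resources_tonnes:.0f} t")
```

A 3-D facies model refines this by replacing the single thickness value per cell with a vertical stack of simulated coal/terrigenous facies, which is what captures the splitting effects the abstract discusses.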
Abstract:
Numerical weather prediction and climate simulation have been among the computationally most demanding applications of high performance computing ever since they began in the 1950s. Since the 1980s, the most powerful computers have featured an ever larger number of processors; by the early 2000s, this number was often several thousand. An operational weather model must use all these processors in a highly coordinated fashion. The critical resource in running such models is not computation, but the amount of communication required between the processors: the communication capacity of parallel computers often falls far short of their computational power. The articles in this thesis cover fourteen years of research into how to harness thousands of processors on a single weather forecast or climate simulation, so that the application can benefit as much as possible from the power of parallel high performance computers. The results attained in these articles have already been widely applied: currently, most of the organizations that carry out global weather forecasting or climate simulation anywhere in the world use methods introduced in them. Some further studies extend parallelization opportunities into other parts of the weather forecasting environment, in particular to data assimilation of satellite observations.
Abstract:
In the European Union, the importance of mobile communications was realized early on. The process of mobile communications becoming ubiquitous has taken time, as the innovation of mobile communications has diffused into society. The aim of this study is to find out how the evolution and spatial patterns of the diffusion of mobile communications within the European Union could be taken into account in forecasting the diffusion process. There is relatively much research on innovation diffusion at the individual (micro) and the country (macro) level compared to the territorial level. Territorial or spatial diffusion refers either to the intra-country or the inter-country diffusion of an innovation. In both settings, the diffusion of a technological innovation has gained scarce attention. This study adds knowledge of the diffusion between countries, focusing especially on the role of location in this process. The main findings of the study are the following. The penetration rates of the European Union member countries became more even over the period of observation, from 1981 to 2000; the common digital GSM system seems to have hastened this process. As to the role of location in the diffusion process, neighboring countries have had similar diffusion processes. They can be grouped into three: the Nordic countries, the central and southern European countries, and the remote southern European countries. The neighborhood effect also dominates in the gravity model which is used for modeling the adoption timing of the countries. The subsequent diffusion within a country, measured by the logistic model in Finland, is affected positively by the country's economic situation, and it seems to level off at some 92%. Considering the launch of future mobile communications systems, using a common standard should imply an equal development between the countries. The launching time should be carefully selected, as diffusion is probably delayed in economic downturns. The location of a country, measured by distance, can be used in forecasting adoption and diffusion. Finally, the result of penetration rates becoming more even implies that in a relatively homogeneous set of countries, such as the European Union member countries, the estimated final penetration of a single country can be used for approximating the penetration of the others. The estimated eventual penetration of Finland, some 92%, should thus also be the eventual level for all the European Union countries and for the European Union as a whole.
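The logistic diffusion model used for the within-country analysis can be sketched as a three-parameter curve fit, where the ceiling parameter K plays the role of the roughly 92% eventual penetration estimated for Finland. The penetration data below are synthetic; only the model form and the idea of estimating K come from the abstract.

```python
# Fit a logistic diffusion curve to synthetic penetration data (in %).
# K is the saturation level; r the growth rate; t0 the inflection year.
# All data points are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Logistic diffusion: penetration approaches the ceiling K."""
    return K / (1.0 + np.exp(-r * (t - t0)))

years = np.arange(1981, 2001, dtype=float)          # observation period
rng = np.random.default_rng(2)
observed = logistic(years, 92.0, 0.6, 1996.0) + rng.normal(scale=1.0,
                                                           size=years.size)

(K, r, t0), _ = curve_fit(logistic, years, observed, p0=(90.0, 0.5, 1995.0))
print(f"estimated penetration ceiling K ~ {K:.1f} %")
```

Extrapolating the fitted curve beyond the observation window is how such a model yields the "eventual penetration" figure discussed above.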
Abstract:
Summary: The effect of weather on the hectolitre weight of winter cereals from the perspective of forecast preparation
Abstract:
This thesis surveys temporal and stochastic software reliability models and examines a few of the models in practice. The theoretical part contains the key definitions and metrics used in describing and evaluating software reliability, together with the descriptions of the models themselves. Two groups of software reliability models are presented. The first group consists of hazard-rate-based models. The second group comprises models based on fault "seeding" and tagging. The empirical part contains the descriptions and results of the experiments. The experiments were carried out using three models from the first group: the Jelinski-Moranda model, the first geometric model, and a simple exponential model. The purpose of the experiments was to study how the distribution of the input data affects the performance of the models, and how sensitive the models are to changes in the amount of input data. The Jelinski-Moranda model proved the most sensitive to the distribution owing to convergence problems, while the first geometric model proved the most sensitive to changes in the amount of data.
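The Jelinski-Moranda model examined in those experiments assumes N initial faults and a hazard rate of phi·(N − i + 1) before the i-th failure, with exponentially distributed inter-failure times. A minimal maximum-likelihood fit on synthetic failure data can be sketched as follows; the data and starting values are invented for illustration.

```python
# Jelinski-Moranda sketch: simulate inter-failure times, then estimate the
# total fault count N and per-fault hazard phi by maximum likelihood.
# All numbers are illustrative, not the thesis experiments.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
N_true, phi_true, n = 40, 0.02, 30
# Rate before the (i+1)-th failure is phi * (N - i); scale = 1 / rate
times = np.array([rng.exponential(1.0 / (phi_true * (N_true - i)))
                  for i in range(n)])

def neg_log_lik(params):
    N, phi = params
    if N <= n or phi <= 0:               # N must exceed observed failures
        return np.inf
    rates = phi * (N - np.arange(n))     # failure rates for i = 0..n-1
    return -np.sum(np.log(rates) - rates * times)

res = minimize(neg_log_lik, x0=(n + 10.0, 0.05), method="Nelder-Mead")
N_hat, phi_hat = res.x
print(f"estimated total faults N ~ {N_hat:.1f}, phi ~ {phi_hat:.4f}")
```

The convergence problems the thesis reports are visible here too: when the input data deviate from the exponential assumption, this likelihood surface can become flat or unbounded in N, and the optimizer fails to settle.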