860 results for Stochastic demand
Abstract:
Spain’s transport infrastructure policy has become a paradigmatic case of oversupply and of mismatch with demand. The massive expansion of the country’s transport infrastructure over the last decade has not been a response to demand bottlenecks or previously identified needs. For this reason, the intensity of use today on all interurban modes of transport in Spain falls well below that of other EU countries. This paper analyzes the institutional and regulatory factors that have permitted this policy, allowing us to draw lessons from the Spanish case that should help other countries avoid the pitfalls and shortcomings of Spanish policy. Based on our analysis, we also discuss policy remedies and suggest reforms in different regulatory areas, which could help improve the performance of Spain’s infrastructure policy.
Abstract:
Equipment cabinet integration consists of connecting modules and cables into model-specific assemblies. This assembly process is order-driven and each unit is built model by model as a one-off assembly. The difficulty of the integration work and the assembly time vary strongly between models. Combined with workforce turnover, this creates a challenging environment for developing production in terms of both quality and capacity. This thesis investigates whether quality and capacity can be improved by dividing the production process into smaller stages that are easier to balance and to learn. Applying an assembly line to order-driven production requires tolerating larger deviations than a conventional paced assembly line. Because of work times that differ significantly between models and the wide model variation, the line cannot be managed as systematically as one with work stages of equal length. Achieving efficient production on such a line requires the ability to plan the work sequence and to simulate it. This thesis uses simulation to assess how well an assembly line performs under stochastic demand. The model is built from product manufacturing times, broken down by model into all possible work tasks, and these tasks are then balanced across the workstations. The goal of the balancing is to minimize the strong dispersion in task durations across products, which the randomness of model demand amplifies. Based on the simulations, a simplified rule for forming the work sequence is derived. The modelling aims to maximize production efficiency while minimizing both work-in-progress and lead time. Once the most efficient alternative has been found, the suitability of the assembly line for equipment cabinet integration is assessed.
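To make the sequencing question concrete, the sketch below simulates a two-station, buffer-less line under a random model mix and compares a random build order with a simple alternating heavy/light rule. All model names, processing times and the demand mix are invented for illustration, and the alternating rule only stands in for the simplified sequencing rule derived in the thesis.

```python
import random
import statistics

# Hypothetical per-model processing times (hours) at two workstations and a
# hypothetical demand mix; the thesis's real task breakdown is not reproduced.
MODEL_TIMES = {"A": (1.0, 1.2), "B": (2.5, 2.0), "C": (0.8, 3.0)}
DEMAND_MIX = {"A": 0.5, "B": 0.3, "C": 0.2}

def simulate(sequence):
    """Serial two-station line without buffers: a job can leave station 1
    only when station 2 is free (blocking)."""
    s1_free = 0.0   # time station 1 becomes free
    s2_free = 0.0   # time station 2 becomes free
    lead_times = []
    for model in sequence:
        t1, t2 = MODEL_TIMES[model]
        start1 = s1_free
        end1 = start1 + t1
        start2 = max(end1, s2_free)   # wait if station 2 is still busy
        end2 = start2 + t2
        s1_free = start2              # station 1 blocked until the job moves on
        s2_free = end2
        lead_times.append(end2 - start1)
    return s2_free, statistics.mean(lead_times)  # makespan, mean lead time

random.seed(1)
orders = random.choices(list(DEMAND_MIX), weights=list(DEMAND_MIX.values()), k=200)

# Simple sequencing rule: alternate long-work-content and short-work-content jobs.
heavy_first = sorted(orders, key=lambda m: -sum(MODEL_TIMES[m]))
half = len(heavy_first) // 2
alternating = [m for pair in zip(heavy_first[:half], reversed(heavy_first[half:])) for m in pair]

for name, seq in [("random order", orders), ("alternating rule", alternating)]:
    makespan, lead = simulate(seq)
    print(f"{name}: makespan = {makespan:.0f} h, mean lead time = {lead:.2f} h")
```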
Abstract:
In the highly volatile high-technology industry, it is of utmost importance to forecast customer demand accurately. However, statistical forecasting of sales, especially in the heavily competitive electronics business, has always been a challenging task due to very high variation in demand and very short product life cycles. The purpose of this thesis is to validate whether statistical methods can be applied to forecasting sales of short life cycle electronics products and to provide a feasible framework for implementing statistical forecasting in the environment of the case company. Two approaches have been developed: one for short- and medium-term horizons and one for long-term horizons. Both are based on decomposition models but differ in the interpretation of the model residuals. For long-term horizons the residuals are assumed to be white noise, whereas for short- and medium-term horizons the residuals are modeled using statistical forecasting methods. Both approaches are implemented in Matlab. The modeling results show that different markets exhibit different demand patterns, and therefore different analytical approaches are appropriate for modeling demand in these markets. Moreover, the outcomes of the modeling imply that statistical forecasting cannot be handled separately from judgmental forecasting, but should be perceived only as a basis for judgmental forecasting activities. Based on the modeling results, recommendations are developed for further deployment of statistical methods in the case company's sales forecasting.
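As a rough illustration of the two approaches, the sketch below decomposes a synthetic monthly sales series into trend, seasonality and residuals, then forecasts once with white-noise residuals (long-term case) and once with an AR(1) residual model (short/medium-term case). The data, the linear trend, and the AR(1) choice are illustrative assumptions, not the thesis's Matlab implementation.

```python
import numpy as np

# Synthetic monthly sales with trend, seasonality and noise (illustrative only).
rng = np.random.default_rng(0)
n, period = 72, 12
t = np.arange(n)
sales = 100 + 0.8 * t + 15 * np.sin(2 * np.pi * t / period) + rng.normal(0, 5, n)

# Decomposition: linear trend plus seasonal averages of the detrended series.
trend_coef = np.polyfit(t, sales, 1)
detrended = sales - np.polyval(trend_coef, t)
seasonal = np.array([detrended[i::period].mean() for i in range(period)])
residuals = detrended - seasonal[t % period]

# Long-term horizon: residuals treated as white noise (forecast = trend + season).
h = np.arange(n, n + 12)
long_term = np.polyval(trend_coef, h) + seasonal[h % period]

# Short/medium-term horizon: residuals additionally modeled, here by an AR(1) term.
phi = np.corrcoef(residuals[:-1], residuals[1:])[0, 1]
short_term = long_term + phi ** (h - n + 1) * residuals[-1]

print("12-month forecast, white-noise residuals:", np.round(long_term, 1))
print("12-month forecast, AR(1) residuals:      ", np.round(short_term, 1))
```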
Abstract:
Conventional (CONV) neuromuscular electrical stimulation (NMES) (i.e., short pulse duration, low frequencies) induces a higher energetic response than voluntary contractions (VOL). In contrast, wide-pulse, high-frequency (WPHF) NMES might elicit, at least in some subjects (i.e., responders), a different motor unit recruitment than CONV that resembles the physiological muscle activation pattern of VOL. We therefore hypothesized that, for these responder subjects, the metabolic demand of WPHF would be lower than that of CONV and comparable to VOL. Eighteen healthy subjects performed isometric plantar flexions at 10% of their maximal voluntary contraction force for CONV (25 Hz, 0.05 ms), WPHF (100 Hz, 1 ms) and VOL protocols. For each protocol, the force-time integral (FTI) was quantified, and subjects were classified as responders or non-responders to WPHF based on a k-means clustering analysis. Furthermore, a fatigue index was calculated from the FTI loss at the end of each protocol relative to its beginning. Phosphocreatine depletion (ΔPCr) was assessed using ³¹P magnetic resonance spectroscopy. Responders developed four times higher FTIs during WPHF (99 ± 37 ×10³ N·s) than non-responders (26 ± 12 ×10³ N·s). For both responders and non-responders, CONV was metabolically more demanding than VOL when ΔPCr was expressed relative to the FTI. Only for the responder group, the ΔPCr/FTI ratio of WPHF (0.74 ± 0.19 M/N·s) was significantly lower than that of CONV (1.48 ± 0.46 M/N·s) but similar to VOL (0.65 ± 0.21 M/N·s). Moreover, the fatigue index did not differ between WPHF (-16%) and CONV (-25%) for the responders. WPHF could therefore be considered the less demanding NMES modality, at least in this subgroup of subjects, possibly because it exhibits a muscle activation pattern similar to VOL contractions.
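The responder/non-responder split can be illustrated with a two-cluster k-means on the WPHF force-time integral, as in the sketch below; the subject values are invented, and only the clustering step mirrors the analysis described above.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical WPHF force-time integrals (x10^3 N.s) for 18 subjects;
# the study's actual values are not reproduced here.
fti_wphf = np.array([105, 92, 130, 88, 61, 140, 97, 75,
                     24, 31, 18, 40, 27, 22, 35, 15, 29, 33])

# Two-cluster k-means on the 1-D FTI values, as used to classify responders.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(fti_wphf.reshape(-1, 1))
responder_cluster = int(np.argmax(km.cluster_centers_.ravel()))
responders = fti_wphf[km.labels_ == responder_cluster]
non_responders = fti_wphf[km.labels_ != responder_cluster]

print(f"responders (n={len(responders)}): mean FTI = {responders.mean():.0f}")
print(f"non-responders (n={len(non_responders)}): mean FTI = {non_responders.mean():.0f}")
```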
Abstract:
There is a lack of studies on tourism demand in Catalonia. To fill the gap, this paper focuses on detecting the macroeconomic factors that determine tourism demand in Catalonia. We also analyse the relation between these factors and tourism demand. Despite the strong seasonal component and the outliers in the time series of some countries, overnight stays give a better indication of tourism demand in Catalonia than the number of tourists. The degree of linear association between the macroeconomic variables and tourism demand is also higher when using quarterly rather than monthly data. Finally, there are notable differences between the results obtained for the different countries analysed. These results indicate that the best way to model tourism demand in Catalonia is to specify a quarterly model of overnight stays, differentiating between an aggregate demand model for the total number of tourists and specific models for each of the countries analysed.
Abstract:
Improving educational quality is an important public policy goal, but success requires identifying the factors associated with student achievement. At the core of such proposals lies the principle that increased public school quality can make the school system more efficient, resulting in correspondingly stronger student performance. Nevertheless, the public educational system is not devoid of competition, which arises, among other factors, from the efficiency of management and the geographical location of schools. Moreover, families in Spain appear to choose a school on the grounds of location. In this setting, the objective of this paper is to analyze whether geographical space has an impact on the relationship between the technical quality of public schools (measured by their efficiency score) and the school demand index. To do this, an empirical application is performed on a sample of 1,695 public schools in the region of Catalonia (Spain). This application shows the effects of spatial autocorrelation on the estimation of the parameters and how these problems are addressed through spatial econometric models. The results confirm that space has a moderating effect on the relationship between efficiency and school demand, although only in urban municipalities.
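As a minimal illustration of the spatial autocorrelation at issue, the sketch below computes Moran's I for a toy demand index using inverse-distance weights; the data are synthetic and Moran's I is only a standard diagnostic, not the spatial econometric models estimated in the paper.

```python
import numpy as np

# Toy data: school coordinates (km) and a demand index with a mild spatial
# gradient; purely illustrative, not the 1,695-school Catalan sample.
rng = np.random.default_rng(42)
coords = rng.uniform(0, 50, size=(60, 2))
demand = 0.02 * coords[:, 0] + rng.normal(0, 0.3, 60)

# Row-standardised inverse-distance spatial weights with zero diagonal.
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
w = np.divide(1.0, d, out=np.zeros_like(d), where=d > 0)
w /= w.sum(axis=1, keepdims=True)

# Moran's I: I = (n / S0) * sum_ij w_ij z_i z_j / sum_i z_i^2, z = demeaned demand.
z = demand - demand.mean()
moran_i = (len(z) / w.sum()) * (z @ w @ z) / (z @ z)
print(f"Moran's I for the demand index: {moran_i:.3f}")
```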
Abstract:
This thesis develops a comprehensive and flexible statistical framework for the analysis and detection of space, time and space-time clusters of environmental point data. The clustering methods developed were applied to both simulated datasets and real-world environmental phenomena; however, only the cases of forest fires in the Canton of Ticino (Switzerland) and in Portugal are expounded in this document. Environmental phenomena can normally be modelled as stochastic point processes in which each event, e.g. a forest fire ignition point, is characterised by its spatial location and its occurrence in time. Additional information such as burned area, ignition causes, land use, and topographic, climatic and meteorological features can also be used to characterise the studied phenomenon. The space-time pattern characterisation therefore represents a powerful tool to understand the distribution and behaviour of the events and their correlation with underlying processes, for instance socio-economic, environmental and meteorological factors. Consequently, we propose a methodology based on the adaptation and application of statistical and fractal point process measures, both global (e.g. the Morisita Index, the box-counting fractal method, the multifractal formalism and Ripley's K-function) and local (e.g. scan statistics). Many measures describing the space-time distribution of environmental phenomena have been proposed in a wide variety of disciplines; nevertheless, most of these measures are global in character and do not consider the complex spatial constraints, high variability and multivariate nature of the events. We therefore propose a statistical framework that takes into account the complexities of the geographical space in which the phenomena take place, by introducing the Validity Domain concept and carrying out clustering analyses on data with differently constrained geographical spaces, thereby assessing the relative degree of clustering of the real distribution. Moreover, specifically for the forest fire case, this research proposes two new methodologies: one for defining and mapping the Wildland-Urban Interface (WUI), described as the interaction zone between burnable vegetation and anthropogenic infrastructure, and one for predicting fire ignition susceptibility. The main objective of this thesis was thus to carry out basic statistical and geospatial research with a strong applied component, in order to analyse and describe complex phenomena and to overcome unsolved methodological problems in the characterisation of space-time patterns, in particular forest fire occurrences. The thesis thereby responds to the increasing demand for environmental monitoring and management tools for the assessment of natural and anthropogenic hazards and risks, sustainable development, retrospective success analysis, etc. The major contributions of this work were presented at national and international conferences and published in five scientific journals. National and international collaborations were also established and successfully accomplished.
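Among the global measures listed above, Ripley's K-function is easy to illustrate: the sketch below computes a naive (edge-uncorrected) estimate on a synthetic point pattern and compares it with the expectation under complete spatial randomness. The data and the simplifications are assumptions for illustration only, not the thesis's analysis of fire ignition points.

```python
import numpy as np

# Toy point pattern in the unit square: a clustered blob plus uniform noise.
rng = np.random.default_rng(7)
clustered = rng.normal([0.3, 0.7], 0.05, size=(60, 2))
uniform = rng.uniform(0, 1, size=(60, 2))
pts = np.vstack([clustered, uniform])
pts = pts[(pts >= 0).all(axis=1) & (pts <= 1).all(axis=1)]

def ripley_k(points, radii, area=1.0):
    """Naive Ripley's K estimate (no edge correction): mean number of further
    points within distance r of a point, divided by the intensity."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    lam = n / area
    return np.array([(d[d > 0] <= r).sum() / (n * lam) for r in radii])

radii = np.array([0.05, 0.10, 0.15])
k_obs = ripley_k(pts, radii)
k_csr = np.pi * radii**2  # expectation under complete spatial randomness
for r, ko, kc in zip(radii, k_obs, k_csr):
    print(f"r = {r:.2f}: K_obs = {ko:.4f}  K_CSR = {kc:.4f}")
```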
Abstract:
This study considers the current situation of the biofuels markets in Finland. Specific to the Finnish energy system are the facts that industry consumes more than half of the total primary energy, that combined heat and power production is widely applied, and that solid biomass fuels account for a high share of total energy consumption. Wood is the most important source of bioenergy in Finland, representing 21% of total energy consumption in 2006. Almost 80% of wood-based energy is recovered from industrial by-products and residues. Finland has committed itself to keeping its greenhouse gas emissions at no more than the 1990 level during the period 2008–2012. The energy and climate policy of recent years has been based on the National Energy and Climate Strategy introduced in 2005. Finnish energy policy aims to achieve this target, and a variety of measures are taken to promote the use of renewable energy sources, especially wood fuels. In 2007, the government started to prepare a new long-term (up to the year 2050) climate and energy strategy that will meet the EU's new targets for the reduction of greenhouse gas emissions and the promotion of renewable energy sources. The new strategy will be introduced during 2008. International biofuels trade is of substantial importance for the utilisation of bioenergy in Finland. In 2006, total international trade in solid and liquid biofuels was approximately 64 PJ, of which imports accounted for 61 PJ. Most of the import is indirect and takes place within the forest industry's raw wood imports. In 2006, as much as 24% of wood energy was based on foreign-origin wood. Wood pellets and tall oil form the majority of biofuel export streams. The indirect import of wood fuels increased by almost 10% in 2004–2006, while direct trade in solid and liquid biofuels remained almost constant.
Abstract:
Stochastic convergence among Mexican federal entities is analyzed in a panel data framework. The joint consideration of cross-section dependence and multiple structural breaks is required to ensure that statistical inference is based on statistics with good properties. Once these features are accounted for, evidence in favour of stochastic convergence is found. Since stochastic convergence is a necessary yet insufficient condition for convergence as predicted by economic growth models, the paper also investigates whether a β-convergence process has taken place. We find that the Mexican states have followed either heterogeneous convergence patterns or a divergence process throughout the period analyzed.
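In its simplest form, testing for stochastic convergence amounts to a unit-root test on each state's income gap relative to the national average. The sketch below applies a plain ADF test to synthetic series and deliberately omits the cross-section dependence and multiple structural breaks that the paper's panel approach accounts for.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

# Illustrative log per-capita income gaps relative to the national mean for two
# hypothetical states; not the Mexican data analyzed in the paper.
rng = np.random.default_rng(3)
years = 40
states = {
    "state_A": np.cumsum(rng.normal(0, 0.02, years)),                         # random-walk gap (no convergence)
    "state_B": 0.1 * 0.8 ** np.arange(years) + rng.normal(0, 0.005, years),   # decaying gap (convergence)
}

# Stochastic convergence requires the relative income gap to be stationary,
# i.e. the ADF test should reject a unit root.
for name, gap in states.items():
    stat, pvalue, *_ = adfuller(gap, autolag="AIC")
    verdict = "stationary gap (convergence)" if pvalue < 0.05 else "unit root not rejected"
    print(f"{name}: ADF p-value = {pvalue:.3f} -> {verdict}")
```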
Abstract:
In this paper, we obtain sharp asymptotic formulas with error estimates for the Mellin convolution of functions defined on (0, ∞), and use these formulas to characterize the asymptotic behavior of the marginal distribution densities of stock price processes in mixed stochastic models. Special examples of mixed models are jump-diffusion models and stochastic volatility models with jumps. We apply our general results to the Heston model with double exponential jumps and make a detailed analysis of the asymptotic behavior of the stock price density, the call option pricing function, and the implied volatility in this model. We also obtain similar results for the Heston model with jumps distributed according to the NIG law.
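For reference, the standard Mellin convolution on (0, ∞) can be written as follows; the paper's exact normalisation may differ.

```latex
% Standard Mellin convolution of f and g on (0, \infty); stated for reference only.
\[
  (f \star g)(x) = \int_{0}^{\infty} f\!\left(\frac{x}{y}\right) g(y)\,\frac{dy}{y},
  \qquad x \in (0,\infty),
\]
% under which the Mellin transform factorises:
\[
  \mathcal{M}[f \star g](s) = \mathcal{M}[f](s)\,\mathcal{M}[g](s),
  \qquad \text{where } \mathcal{M}[f](s) = \int_{0}^{\infty} x^{s-1} f(x)\,dx.
\]
```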
Abstract:
The Practical Stochastic Model is a simple and robust method to describe coupled chemical reactions. The connection between this stochastic method and a deterministic method was initially established to understand how the parameters and variables that describe the concentration in both methods were related. It was necessary to define two main concepts to make this connection: the filling of compartments or dilutions and the rate of reaction enhancement. The parameters, variables, and the time of the stochastic methods were scaled with the size of the compartment and were compared with a deterministic method. The deterministic approach was employed as an initial reference to achieve a consistent stochastic result. Finally, an independent robust stochastic method was obtained. This method could be compared with the Stochastic Simulation Algorithm developed by Gillespie, 1977. The Practical Stochastic Model produced absolute values that were essential to describe non-linear chemical reactions with a simple structure, and allowed for a correct description of the chemical kinetics.
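For comparison, the Gillespie (1977) direct-method Stochastic Simulation Algorithm mentioned above can be sketched in a few lines; the reaction system and rate constants below are illustrative, and the Practical Stochastic Model itself is not reproduced here.

```python
import numpy as np

# Minimal Gillespie direct-method SSA for the reversible reaction A + B <-> C.
rng = np.random.default_rng(0)
k_f, k_r = 0.005, 0.1             # illustrative stochastic rate constants
state = np.array([300, 200, 0])   # molecule counts of A, B, C
stoich = np.array([[-1, -1, +1],  # A + B -> C
                   [+1, +1, -1]]) # C -> A + B

t, t_end = 0.0, 50.0
while t < t_end:
    a = np.array([k_f * state[0] * state[1], k_r * state[2]])  # propensities
    a0 = a.sum()
    if a0 == 0:
        break
    t += rng.exponential(1.0 / a0)      # waiting time to the next reaction
    reaction = rng.choice(2, p=a / a0)  # which reaction fires
    state += stoich[reaction]

print(f"t = {t:.1f}: A = {state[0]}, B = {state[1]}, C = {state[2]}")
```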