927 results for Railway scheduling
Abstract:
The European transport market has undergone several changes during the last decade. Following European Union legislative mandates, the railway freight market was deregulated in 2007. The market followed the trend set by other transport modes and by other previously regulated industries such as banking, telecommunications and energy. Globally, the first country to deregulate its railway freight market was the United States, with the Staggers Rail Act of 1980. Some European countries chose to follow suit even before deregulation was mandated; among the forerunners were the United Kingdom, Sweden and Germany. Previous research has concentrated only on these countries, which leaves a research gap that this thesis addresses. The Baltic Sea Region consists of countries with different liberalization paths: Sweden and Germany have been at the forefront, whereas Lithuania and Finland still have only one active railway undertaking, the incumbent. The transport market of the European Union faces further challenges in the near future due to the Sulphur Directive, oil dependency and the changing structure of European rail networks; improving the accessibility of this peripheral area will require further action. This research focuses on the progression of deregulation, barriers to entry, country-specific features, cooperation and internationalization. Based on the research results, the Baltic Sea Region's railway freight market can be expected to change: further private railway undertakings are anticipated, which would alter the market structure; the realization of the European Union's plans to extend the improved rail network to the Baltic States is strongly hoped for; and railway freight market counterparts within and among countries are beginning to enhance their level of cooperation.
The Baltic Sea Region countries have several national characteristics that influence the market and should be taken into account when companies evaluate possible market entry. According to the thesis interviews, the Swedish market features a strong level of cooperation in the form of an old-boy network, supported by the incumbent's positive attitude towards private railway undertakings. This has eased the entry of newcomers, and the market currently hosts numerous operating railway undertakings. A contrary example was found in Poland, where the incumbent sent old rolling stock to the scrap yard rather than sell it to private railway undertakings. In Russia, the importance of personal relations stands out, together with the railway market's strong bond with politics. Nonetheless, some barriers to entry are shared across the Baltic Sea Region, chiefly the acquisition of rolling stock, bureaucracy and the required investments. The railway freight market is internationalizing, as evidenced by several alliances and an increased number of mergers and acquisitions. After deregulation, the number of railway undertakings tends to grow at a rather fast pace, but over time the larger operators acquire smaller ones. It is therefore expected that within a decade the number of railway undertakings will begin to decrease in the deregulation pioneer countries, while latecomers may still experience growth. The Russian market is expected to be fully liberalized, and further alliances between Russian Railways and European railway undertakings are expected to occur. The Baltic Sea Region's railway freight market is anticipated to improve and, based on the interviewees' comments, to attract more cargo from road to rail.
Abstract:
With the shift towards many-core computer architectures, dataflow programming has been proposed as one potential solution for producing software that scales to a varying number of processor cores. Programming for parallel architectures is considered difficult, as the currently popular programming languages are inherently sequential and introducing parallelism is typically left to the programmer. Dataflow, however, is inherently parallel: it describes an application as a directed graph, where nodes represent calculations and edges represent data dependencies in the form of queues. These queues are the only allowed communication between the nodes, making the dependencies between the nodes explicit and thereby also the parallelism. Once a node has sufficient inputs available, it can, independently of any other node, perform calculations, consume inputs, and produce outputs. Dataflow models have existed for several decades and have become popular for describing signal processing applications, as the graph representation is very natural within this field: digital filters are typically described with boxes and arrows in textbooks as well. Dataflow is also becoming more interesting in other domains, and in principle any application working on an information stream fits the dataflow paradigm; examples include network protocols, cryptography, and multimedia applications. For instance, the MPEG group standardized a dataflow language called RVC-CAL to be used within reconfigurable video coding. Describing a video coder as a dataflow network instead of in a conventional programming language makes the coder more readable, as it shows how the video data flows through the different coding tools. While dataflow provides an intuitive representation for many applications, it also introduces some new problems that need to be solved for dataflow to be more widely used.
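The graph model described in the abstract can be sketched in a few lines. This is a minimal illustration only, not RVC-CAL and not the thesis's tooling; the `Actor` class, its firing interface, and the doubling example are all assumptions made for the sketch.

```python
from collections import deque

class Actor:
    """A dataflow node: it may fire only when enough tokens sit on its input queues."""
    def __init__(self, inputs, outputs, consume, fn):
        self.inputs = inputs    # input FIFO queues (the graph's incoming edges)
        self.outputs = outputs  # output FIFO queues (the graph's outgoing edges)
        self.consume = consume  # tokens required per input queue for one firing
        self.fn = fn            # pure function: consumed tokens -> produced tokens

    def can_fire(self):
        # The firing rule: every input queue holds enough tokens.
        return all(len(q) >= n for q, n in zip(self.inputs, self.consume))

    def fire(self):
        # Consume inputs, compute, and push results onto the output queues.
        args = [[q.popleft() for _ in range(n)]
                for q, n in zip(self.inputs, self.consume)]
        for q, produced in zip(self.outputs, self.fn(*args)):
            q.extend(produced)

# One-node example: an actor that doubles each incoming token.
src, out = deque([1, 2, 3]), deque()
double = Actor([src], [out], consume=[1], fn=lambda xs: [[2 * xs[0]]])
while double.can_fire():   # fires independently of any other node
    double.fire()
print(list(out))  # [2, 4, 6]
```

Because queues are the only communication channel, two such actors with disjoint queues could fire on different cores with no further synchronization, which is the parallelism the abstract refers to.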
The explicit parallelism of a dataflow program is descriptive and enables improved utilization of the available processing units; however, the independent nodes also imply that some kind of scheduling is required. The need for efficient scheduling becomes even more evident when the number of nodes is larger than the number of processing units and several nodes run concurrently on one processor core. Several dataflow models of computation exist, with different trade-offs between expressiveness and analyzability. These range from rather restricted but statically schedulable models, with minimal scheduling overhead, to dynamic models where each firing requires a firing rule to be evaluated. The model used in this work, RVC-CAL, is a very expressive language that in the general case requires dynamic scheduling; however, the strong encapsulation of dataflow nodes enables analysis, and the scheduling overhead can be reduced by using quasi-static, or piecewise static, scheduling techniques. The scheduling problem is concerned with finding the few scheduling decisions that must be made at run-time, while most decisions are pre-calculated. The result is a set of static schedules, as small as possible, that are dynamically scheduled. To identify these dynamic decisions and to find the concrete schedules, this thesis shows how quasi-static scheduling can be represented as a model checking problem. This involves identifying the relevant information to generate a minimal but complete model for model checking: the model must describe everything that may affect the scheduling of the application while omitting everything else, in order to avoid state space explosion. This kind of simplification is necessary to make the state space analysis feasible. For the model checker to find the actual schedules, a set of scheduling strategies is defined that can produce quasi-static schedulers for a wide range of applications.
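The quasi-static idea above can be made concrete with a toy sketch. All names here are hypothetical, and the thesis derives such schedules via model checking, which this sketch does not attempt; it only shows the end shape: most firings follow precomputed static sequences, and the only run-time work is one decision selecting which sequence to run.

```python
# Hypothetical precomputed static firing sequences for a video-decoder-like
# application. In the thesis these would be found by model checking; here
# they are simply written out by hand for illustration.
STATIC_SCHEDULES = {
    "intra": ["read_header", "predict", "add_residual", "emit_block"],
    "inter": ["read_header", "motion_compensate", "add_residual", "emit_block"],
}

def quasi_static_schedule(block_type):
    """The single dynamic decision left at run-time: pick a static schedule.

    Everything inside the returned sequence then executes with no further
    firing-rule evaluation, which is where the overhead reduction comes from.
    """
    return STATIC_SCHEDULES[block_type]

print(quasi_static_schedule("inter"))
# ['read_header', 'motion_compensate', 'add_residual', 'emit_block']
```

Contrast this with fully dynamic scheduling, where each of the four actor firings would require its own firing-rule check.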
The results of this work show that actor composition with quasi-static scheduling can be used to transform dataflow programs to fit many different computer architectures with different types and numbers of cores. This, in turn, lets dataflow provide a more platform-independent representation, as one application can be fitted to a specific processor architecture without changing the actual program representation. Instead, the program representation is optimized by the development tools, in the context of design space exploration, to fit the target platform. This work focuses on representing the dataflow scheduling problem as a model checking problem and is implemented as part of a compiler infrastructure. The thesis also presents experimental results as evidence of the usefulness of the approach.
Abstract:
The objective of this project was to introduce a new software product to the pulp industry, a new market for the case company. An optimization-based scheduling tool was developed to allow pulp operations to better control their production processes and improve both production efficiency and stability. Both the work here and earlier research indicate a savings potential of around 1-5%. All the supporting data is already available today, coming from distributed control systems, data historians and other existing sources. The pulp mill model, together with the scheduler, allows what-if analyses of the impact and timely feasibility of various external actions, such as planned maintenance of any particular mill operation. The visibility gained from the model also proves to be a real benefit. The aim is to satisfy demand and gain extra profit while achieving the required customer service level. Research effort was put into understanding both the minimum features needed to satisfy the scheduling requirements of the industry and the overall existence of the market. A qualitative study was constructed to identify the competitive situation as well as the requirements and gaps in the market. It became clear that no such system exists on the marketplace today, and that there is room to improve the target market's overall process efficiency through such a planning tool. This thesis also gives the case company a better overall understanding of the different processes in this particular industry.
Abstract:
The map belongs to the A. E. Nordenskiöld collection
Abstract:
The map belongs to the A. E. Nordenskiöld collection