11 results for Modern track process
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
The European Union decided more than a decade ago that the railway business would be opened to competition. Great Britain was to become the model country in implementing this process. The main idea was to lighten regulation so that the ownership base in the industry broadens and both railway infrastructure and operations improve. The infrastructure is to be controlled by a single organisation, and track access rights belong to all operators meeting the licence conditions, who compete with each other for passengers and freight. In the United States and certain Latin American countries, by contrast, competition has been liberalised in such a way that the railway company owns the track infrastructure, the trains, and the freight and passenger cars. Britain's privatisation was initially considered a major failure: rigid, transaction-based outsourcing strategies were applied to infrastructure maintenance on a fast schedule, eventually leading to constant train delays and a few disastrous accidents. Nor was it much of a commercial success: the company responsible for the infrastructure had to be delisted from the London Stock Exchange, and the government was forced to create a support package to keep the heavily indebted company, which had received only marginal investment, operating (even though there was a need for capacity in the market). The train operators were also in financial distress, and only with determined government rescue packages did the industry rise from its deepest crisis. Nevertheless, alongside these negative side effects, the whole industry was able to grow demand in both passenger and freight traffic. The declining demand trend that had begun in the 1970s took a turn for the better. Another European country with long experience of privatisation is Sweden. This country case is rather conservative compared with the previous one; only a limited number of routes have been opened to competition, and contracts are made for long periods at a time.
Sweden's deregulation proved a success, as productivity has been growing steadily and the railways' market share, especially on the passenger side, has risen significantly compared with other transport modes. However, there is in practice little competition in the country, and better results can be expected as deregulation continues. The last country in our study is the United States, which opened its railways to competition as early as the beginning of the 1980s, using the vertical integration mentioned above; this choice has in turn led to very different results. This alternative structural reform has favoured freight flows over passengers, and as an outcome this case produced companies serving one of these two main customer groups. Recent results from this privatisation process have been good: the profits of the remaining companies have grown, dividends have been paid, and share valuations have risen. In this research report we attempt, through three country cases, to show how the privatisation process will play out in Europe as rail competition is liberalised. We examine which of the three country examples is the most likely outcome and make proposals on how states could avoid unwanted side effects. The three country examples, together with the brief statistical analysis presented at the end, show that railways have future potential in Europe, and liberalising competition is the key to realising that potential.
Abstract:
The fiber recovery process is an essential part of the modern paper mill. It creates the basis for the mill's internal recirculation of its most important raw materials: water and fiber. It is normally also the starting point for further treatment of wastewater, and if it works efficiently, it offers an excellent basis for minimizing effluents. This dissertation offers two different approaches to the subject. Firstly, a novel save-all disc filter feeding system is developed and presented. This so-called precoat method is tested in both laboratory and full-scale conditions. At laboratory scale it clearly beats the traditional method when low-freeness pulps are used as a sweetener stock. The full-scale application still needs some development work before it can be implemented in paper mills. Secondly, the operating environment of the save-all disc filter is studied, mostly in laboratory conditions. The focus of this study is on cases where low-freeness pulps are used as the sweetener stock of the save-all filter. The effects of the CSF value, pressure drop, suspension consistency and retention chemicals on the quantity and quality of the filtrate were studied. The filtration resistance of the low-freeness pulps was also studied.
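A common way to quantify the filtration resistance mentioned above is the constant-pressure cake filtration equation, t/V = (mu*alpha*c / (2*A^2*dP))*V + mu*Rm/(A*dP), where a plot of t/V against filtrate volume V is linear and the slope yields the specific cake resistance alpha. This is a standard textbook approach, assumed here for illustration; the thesis does not state its exact method, and all numbers below are synthetic:

```python
def slope_tV_vs_V(volumes, times):
    """Least-squares slope of t/V versus V from (V, t) measurements."""
    ys = [t / v for v, t in zip(volumes, times)]
    n = len(volumes)
    mx = sum(volumes) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(volumes, ys))
    den = sum((x - mx) ** 2 for x in volumes)
    return num / den

def specific_cake_resistance(slope, mu, c, area, dp):
    """alpha = slope * 2 * A^2 * dP / (mu * c), from the slope of t/V vs V."""
    return slope * 2.0 * area ** 2 * dp / (mu * c)

# Illustrative data: V in m^3, t in s (synthetic, not thesis measurements).
V = [0.001, 0.002, 0.003, 0.004]
t = [0.002005, 0.00402, 0.006045, 0.00808]
slope = slope_tV_vs_V(V, t)
alpha = specific_cake_resistance(slope, mu=1e-3, c=5.0, area=0.01, dp=1e5)
print(slope, alpha)
```

A steeper t/V slope at the same consistency would indicate a pulp with higher filtration resistance.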
Abstract:
The importance of software to modern society is constantly growing. Many software projects struggle with keeping to schedule, maintaining high productivity, and achieving sufficiently high quality. Large investments have been made in software process improvement to minimise these problems, motivated by the assumption that product quality depends directly on software development capability. The purpose of this study was to examine the possibilities of software process improvement. Existing models, techniques, and methodologies of software development and software process improvement were presented, their applicability was analysed, and a recommendation on their use was given.
Abstract:
The purpose of this thesis was to study the present state of Business Intelligence in the company unit, i.e. how efficiently the unit uses the possibilities of modern information management systems. The aim was to determine how the operative information management of the unit's tender process could be improved by modern information technology applications, making the tender process faster and more efficient. At the beginning it was essential to become acquainted with the Business Intelligence literature. Based on Business Intelligence theory, it was relatively easy but challenging to identify how the tender business could be improved by Business Intelligence methods. The empirical phase of this study was executed with a qualitative research method, including thematic and informal interviews at the company. The problems and challenges of the tender process were clarified as part of the empirical phase, and a group of challenges was identified when studying the information management of the company unit. Based on the theory and the interviews, a group of improvements was listed that the company could make in the future when developing its operative processes.
Abstract:
Formal software development processes and well-defined development methodologies are nowadays seen as the definitive way to produce high-quality software within time limits and budgets. The variety of such high-level methodologies is huge, ranging from rigorous process frameworks like CMMI and RUP to more lightweight agile methodologies. The need for managing this variety, and the fact that practically every software development organization has its own unique set of development processes and methods, have created the profession of software process engineer. Different kinds of informal and formal software process modeling languages are essential tools for process engineers. They are used to define processes in a way that allows easy management of processes, for example process dissemination, process tailoring, and process enactment. Process modeling languages are usually used as a tool for process engineering, where the main focus is on the processes themselves. This dissertation has a different emphasis: it analyses modern software development process modeling from the software developers' point of view. The goal of the dissertation is to investigate whether software process modeling and software process models aid software developers in their day-to-day work, and what the main mechanisms for this are. The focus of the work is on the Software Process Engineering Metamodel (SPEM) framework, currently one of the most influential process modeling notations in software engineering. The research theme is elaborated through six scientific articles, which represent the dissertation research done on process modeling over a period of approximately five years. The research follows the classical engineering research discipline, where the current situation is analyzed, a potentially better solution is developed, and finally its implications are analyzed.
The research applies a variety of research techniques, ranging from literature surveys to qualitative studies done amongst software practitioners. The key finding of the dissertation is that software process modeling notations and techniques are usually developed in process engineering terms. As a consequence, the connection between the process models and actual development work is loose. In addition, modeling standards like SPEM are partially incomplete when it comes to pragmatic process modeling needs, like lightweight modeling and combining pre-defined process components. This leads to a situation where the full potential of process modeling techniques for aiding daily development activities cannot be achieved. Despite these difficulties, the dissertation shows that it is possible to use modeling standards like SPEM to aid software developers in their work. The dissertation presents a lightweight modeling technique which software development teams can use to quickly analyze their work practices in a more objective manner. The dissertation also shows how process modeling can be used to compare different software development situations more easily and to analyze their differences in a systematic way. Models also help to share this knowledge with others. A qualitative study done amongst Finnish software practitioners verifies the conclusions of the other studies in the dissertation. Although processes and development methodologies are seen as an essential part of software development, process modeling techniques are rarely used during daily development work. However, the potential of these techniques intrigues the practitioners. In conclusion, the dissertation shows that process modeling techniques, most commonly used as tools for process engineers, can also be used as tools for organizing daily software development work.
This work presents theoretical solutions for bringing process modeling closer to ground-level software development activities. These theories are shown to be feasible through several case studies where the modeling techniques are used, e.g., to find differences in the work methods of the members of a software team and to share process knowledge with a wider audience.
Abstract:
Finland's rural landscape has gone through remarkable changes since the 1950s due to agricultural developments. Changed farming practices have especially influenced traditional landscape management, and the modifications in arable land structure and the transitions of grasslands are notable. A review of previous studies reveals the importance of rural landscape composition and structure for species and landscape diversity, including the relevance of the presence of open ditches, the size of field and meadow patches, and the topology of the natural and agricultural landscape. This land-change study applies remotely sensed data from two time series together with empirical geospatial analysis in a Geographic Information System (GIS). The aim of this retrospective research is to detect agricultural land use and land cover change (LULCC) dynamics and to discuss the consequences of agricultural intensification for landscape structure from the perspective of landscape ecology. Measurements of LULC are derived directly from pre-processed aerial images by a variety of analytical procedures, including statistical methods and image interpretation. Methodological challenges are confronted in the process of landscape classification and in combining change detection approaches with landscape indices. Particular attention is paid to detecting agricultural landscape features at a small scale, which demands a comprehensive understanding of such agroecosystems. Topological properties of the classified arable land and valley are determined in order to provide insight and to emphasize the role of field edges in the agricultural landscape as important habitat. Change detection dynamics are presented with a change matrix, and additional calculations of gain, loss, swap, net change, change rate, and tendencies are made. Transition probabilities are computed following a Markov chain model and are likewise presented as a matrix.
The spatial aspect of the thesis is presented with illustrative maps showing the locations of the classified landscape categories and of the changes that occurred. It was confirmed that remarkable landscape changes have occurred in the Rekijoki valley. Landscape diversity has been strongly influenced by modern agricultural landscape change, as the number of patches (NP) of open ditches has decreased and the mean patch size (MPS) of arable plots has decreased. The overall change in landscape diversity is indicated by a decrease in the Shannon diversity index (SHDI). The valley landscape, considered a traditional land use area, has experienced major transitional changes, as the meadow class has lost almost one third of its area to afforestation. Remarkable transitions have also occurred from forest to meadow and from arable land to built area. Edge measurements between the modern and traditional landscape indicate a noticeable proportional increase in the arable land–forest edge type and a decrease in the arable land–meadow edge type. The probability calculations predict greater future changes for the traditional landscape, but also for arable land turning into built area.
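The change-matrix, gain/loss, Markov transition-probability, and SHDI calculations described above can be sketched in a few lines. The class codes and the tiny paired rasters below are invented for illustration; real inputs would be the classified aerial images from the two dates:

```python
import numpy as np

# Hypothetical land-cover maps from two dates, coded as integer classes
# (0 = arable, 1 = meadow, 2 = forest, 3 = built). Illustrative values only.
t1 = np.array([0, 0, 1, 1, 1, 2, 2, 3])
t2 = np.array([0, 3, 1, 2, 2, 2, 1, 3])
n_classes = 4

# Change (cross-tabulation) matrix: rows = class at t1, cols = class at t2.
change = np.zeros((n_classes, n_classes), dtype=int)
for a, b in zip(t1, t2):
    change[a, b] += 1

# Row-normalising the change matrix gives Markov transition probabilities.
row_sums = change.sum(axis=1, keepdims=True)
transition = change / np.where(row_sums == 0, 1, row_sums)

# Gain, loss and net change per class, derived from the matrix.
loss = change.sum(axis=1) - np.diag(change)
gain = change.sum(axis=0) - np.diag(change)
net = gain - loss

# Shannon diversity index (SHDI) of the landscape at a given date.
def shdi(codes, n):
    p = np.bincount(codes, minlength=n) / len(codes)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

print(transition)
print(loss, gain, net)
print(shdi(t1, n_classes), shdi(t2, n_classes))
```

A drop in SHDI between the two dates, as reported for the Rekijoki valley, indicates a less diverse class composition; squaring the transition matrix would project the Markov model one step further into the future.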
Abstract:
Laser additive manufacturing (LAM), also known as 3D printing, has gained a lot of interest in recent years within various industries, such as the medical and aerospace industries. LAM enables the fabrication of complex 3D geometries by melting metal powder layer by layer with a laser beam. Over the past 10 years, research in laser additive manufacturing has focused on the development of new materials and new applications. As this technology is on the cutting edge, the efficiency of the manufacturing process is a central research topic in the industry. The aim of this thesis is to characterize methods for improving process efficiency in laser additive manufacturing, and to clarify the effect of process parameters on the stability of the process and on the microstructure of the manufactured pieces. The experimental tests of this thesis were made with various process parameters, and their effect on the built pieces was studied when additive manufacturing was performed with a modified research machine representing the EOSINT M series and with an EOS EOSINT M280. The material used was stainless steel 17-4 PH, and some of the methods for improving process efficiency were also tested. The literature review presents the basics of laser additive manufacturing, methods for improving process efficiency, and laser beam–material interaction. It was observed that there are only a few published studies on the process efficiency of laser additive manufacturing of stainless steel. According to the literature, it is possible to improve process efficiency with higher-power lasers and thicker layers; such improvement is possible if the effect of process parameter changes on the manufactured pieces is known. According to the experiments carried out in this thesis, process parameters play a major role in single-track formation in laser additive manufacturing. Rough estimation equations were created to describe the effect of the input parameters on the output parameters.
The experimental results showed that the WDA (width, depth, and area of the cross-sections of a single track) correlates exponentially with the energy density input. The energy density input is a combination of the input parameters laser power, laser beam spot diameter, and scan speed. The use of the skin-core technique enables improved process efficiency, as the core of the part is manufactured with higher laser power and a thicker layer, and the skin with lower laser power and a thinner layer in order to maintain high resolution. In this technique the interface between skin and core must overlap in order to achieve fully dense parts. It was also noticed in this thesis that a keyhole can form in the LAM process: the threshold intensity of 10^6 W/cm^2 was exceeded during the tests, which means that keyhole formation was possible in these tests.
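The energy density and keyhole-threshold reasoning above can be sketched numerically. The abstract does not give the thesis's exact formulas, so two common assumptions are used here: energy density defined as E = P/(v*d), and peak intensity of a Gaussian beam as I0 = 2P/(pi*r^2); the parameter values are illustrative, not the thesis's:

```python
import math

def energy_density(power_w, scan_speed_mm_s, spot_diameter_mm):
    """Energy density in J/mm^2 under the assumed definition E = P/(v*d)."""
    return power_w / (scan_speed_mm_s * spot_diameter_mm)

def peak_intensity_w_cm2(power_w, spot_diameter_mm):
    """Peak intensity of a Gaussian beam, I0 = 2P / (pi * r^2), in W/cm^2."""
    r_cm = (spot_diameter_mm / 10.0) / 2.0  # radius in cm
    return 2.0 * power_w / (math.pi * r_cm ** 2)

KEYHOLE_THRESHOLD_W_CM2 = 1e6  # threshold intensity cited in the abstract

# Example parameter set: 200 W, 1000 mm/s, 100 um spot (illustrative values).
P, v, d = 200.0, 1000.0, 0.1
E = energy_density(P, v, d)
I0 = peak_intensity_w_cm2(P, d)
print(f"E = {E:.3f} J/mm^2, I0 = {I0:.3e} W/cm^2, "
      f"keyhole possible: {I0 > KEYHOLE_THRESHOLD_W_CM2}")
```

With these example values the peak intensity is roughly 5e6 W/cm^2, well above the cited 10^6 W/cm^2 threshold, so keyhole-mode melting would be possible.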
Abstract:
The conventional activated sludge (CAS) process for the treatment of municipal wastewater is gradually becoming outdated due to more stringent environmental protection laws and regulations. Membrane bioreactors (MBRs) are the most promising modern technology, widely accepted in the world of wastewater treatment due to their highly pronounced features such as high-quality effluent, a smaller footprint, and operation at high MLSS concentrations. This research project was carried out to investigate the feasibility and effectiveness of MBR technology compared to the CAS process, based on scientific facts and results. A pilot-scale MBR plant was run for more than 150 days and the analysis results were evaluated. The prime focus of the project was to evaluate the correlation of permeate flux with different operating MLSS concentrations. The permeate flux was found to be almost constant regardless of variations in MLSS concentration. The removal of micropollutants such as heavy metals, PPCPs, PFCs, and steroidal hormones was also studied. The micropollutant removal performance of the MBR process was found to be more effective than that of the CAS process. Furthermore, submerging the membranes within the bioreactor considerably reduced the process footprint.
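Permeate flux, the central quantity in the study above, is commonly reported in LMH (litres per square metre of membrane per hour), computed as J = Q/A. The following minimal sketch assumes that definition; the flow, area, and MLSS values are invented for illustration, not taken from the pilot plant:

```python
def permeate_flux_lmh(permeate_l_per_h, membrane_area_m2):
    """Permeate flux J = Q / A in L/(m^2 * h)."""
    return permeate_l_per_h / membrane_area_m2

# If flux stays roughly constant while MLSS varies (as the study found),
# readings at different MLSS concentrations should cluster around one value.
flux_by_mlss_g_l = {
    8.0: permeate_flux_lmh(250.0, 10.0),   # MLSS 8 g/L  (illustrative)
    12.0: permeate_flux_lmh(248.0, 10.0),  # MLSS 12 g/L (illustrative)
    15.0: permeate_flux_lmh(252.0, 10.0),  # MLSS 15 g/L (illustrative)
}
spread = max(flux_by_mlss_g_l.values()) - min(flux_by_mlss_g_l.values())
print(flux_by_mlss_g_l, spread)
```

A small spread across MLSS levels, as in this sketch, is the numerical signature of the "almost constant flux" finding reported in the abstract.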
Abstract:
In this work, the feasibility of floating-gate technology in analog computing platforms in a scaled-down general-purpose CMOS technology is considered. When the technology is scaled down, the performance of analog circuits tends to worsen because the process parameters are optimized for digital transistors and scaling involves the reduction of supply voltages. Generally, the challenge in analog circuit design is that all salient design metrics, such as power, area, bandwidth, and accuracy, are interrelated. Furthermore, poor flexibility, i.e. the lack of reconfigurability, IP reuse, etc., can be considered the most severe weakness of analog hardware. On this account, digital calibration schemes are often required for improved performance or yield enhancement, whereas high flexibility/reconfigurability cannot be easily achieved. Here, it is discussed whether these obstacles can be worked around by using floating-gate transistors (FGTs), and the problems associated with the practical implementation are analyzed. FGT technology is attractive because it is electrically programmable and also features a charge-based built-in non-volatile memory. Apart from being ideal for canceling circuit non-idealities due to process variations, FGTs can also be used as computational or adaptive elements in analog circuits. The nominal gate oxide thickness in deep sub-micron (DSM) processes is too thin to support robust charge retention, and consequently the FGT becomes leaky. In principle, non-leaky FGTs can be implemented in a scaled-down process without any special masks by using "double"-oxide transistors intended to provide devices that operate with higher supply voltages than general-purpose devices. However, in practice the technology scaling poses several challenges, which are addressed in this thesis.
To provide a sufficiently wide-ranging survey, six prototype chips of varying complexity were implemented in four different DSM process nodes and investigated from this perspective. The focus is on non-leaky FGTs, but the presented autozeroing floating-gate amplifier (AFGA) demonstrates that leaky FGTs may also find a use. The simplest test structures contain only a few transistors, whereas the most complex experimental chip is an implementation of a spiking neural network (SNN) comprising thousands of active and passive devices. More precisely, it is a fully connected (256 FGT synapses) two-layer spiking neural network, in which the adaptive properties of the FGTs are taken advantage of. A compact realization of Spike-Timing-Dependent Plasticity (STDP) within the SNN is one of the key contributions of this thesis. Finally, the considerations in this thesis extend beyond CMOS to emerging nanodevices. To this end, one promising emerging nanoscale circuit element, the memristor, is reviewed and its applicability to analog processing is considered. Furthermore, it is discussed how FGT technology can be used to prototype computation paradigms compatible with these emerging two-terminal nanoscale devices in a mature and widely available CMOS technology.
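The STDP rule realized compactly in the FGT synapses above can be illustrated with the standard pair-based model: a presynaptic spike shortly before a postsynaptic one potentiates the synapse, and the reverse ordering depresses it, with exponentially decaying magnitude. The learning-rate and time-constant values below are generic textbook choices, not the thesis's circuit parameters:

```python
import math

# Illustrative pair-based STDP parameters (not the thesis's circuit values).
A_PLUS, A_MINUS = 0.05, 0.055      # potentiation / depression amplitudes
TAU_PLUS, TAU_MINUS = 20.0, 20.0   # decay time constants in ms

def stdp_dw(t_pre, t_post):
    """Weight change for a single pre/post spike pair at times in ms."""
    dt = t_post - t_pre
    if dt > 0:    # pre before post -> potentiation
        return A_PLUS * math.exp(-dt / TAU_PLUS)
    elif dt < 0:  # post before pre -> depression
        return -A_MINUS * math.exp(dt / TAU_MINUS)
    return 0.0

# Apply two spike pairs to one synapse, clipping the weight to [0, 1].
w = 0.5
for t_pre, t_post in [(10.0, 15.0), (40.0, 32.0)]:
    w = min(1.0, max(0.0, w + stdp_dw(t_pre, t_post)))
print(w)
```

In the floating-gate realization, the weight corresponds to stored gate charge, and the exponential update is implemented by the device physics rather than computed explicitly.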
Abstract:
Mobile network coverage is traditionally provided by outdoor macro base stations, which have a long range and serve large numbers of customers. Due to modern passive houses and tightening construction legislation, mobile network service has deteriorated in many indoor locations. Typical solutions to the indoor coverage problem are expensive and demand action from the mobile operator, so better solutions are constantly being researched. The solution presented in this thesis is based on Small Cell technology. Small Cells are low-power access nodes designed to provide voice and data services. This thesis concentrates on a specific Small Cell solution called a Pico Cell. The problem with Pico Cells, and Small Cells in general, is that they are a new technological solution for the mobile operator, and the possible problem sources and incidents are not properly mapped. The purpose of this thesis is to identify the possible problems in Pico Cell deployment and how they could be solved within the operator's incident management process. The research was carried out as a literature review and a case study, and the possible problems were investigated through lab testing. The automated Pico Cell deployment process was tested in the lab environment and its proper functionality was confirmed. The related network elements were also tested and examined, and the problems that emerged are resolvable. The operator's existing incident management process can be used for Pico Cell troubleshooting with minor updates, and certain prerequisites have to be met before Pico Cell deployment can be considered. The main contribution of this thesis is the Pico Cell integrated incident management process. The presented solution works in theory and solves the problems found during the lab testing. The limitations at the customer service level were addressed by adding the necessary tools and by designing a working question pattern.
Process structures for automated network discovery and Pico Cell specific radio parameter planning were also added to the mobile network management layer.