4 results for Condensers (Steam)
in Helda - Digital Repository of the University of Helsinki
Abstract:
Earlier studies have shown that the speed of information transmission developed radically during the 19th century. The fast development was mainly due to the change from sailing ships and horse-drawn coaches to steamers and railways, as well as the telegraph. The speed of information transmission has normally been measured by calculating the duration between writing and receiving a letter, or between an important event and the time when the news was published elsewhere. As overseas mail was generally carried by ships, the history of communications and maritime history are closely related. This study also brings a postal-historical aspect to the academic discussion. Additionally, another new aspect is included: in business enterprises, information flows generally consisted of multiple transactions. Although fast one-way information was often crucial, e.g. news of a changing market situation, at least equally important was the possibility to react rapidly. To examine the development of business information transmission, the duration of mail transport has been measured by a systematic and commensurable method, using consecutive information circles per year as the principal tool of measurement. The study covers a period of six decades, several of the world's most important trade routes and different mail-carrying systems operated by merchant ships, sailing packets and several nations' steamship services. The main sources have been the sailing data of mail-carrying ships and the correspondence of several merchant houses in England. As the world's main trade routes had their specific historical backgrounds with different businesses, interests and needs, the systems for information transmission did not develop similarly or simultaneously. It was a process lasting several decades, initiated by the idea of organizing sailings in a regular line system.
The evolution proceeded generally as follows: originally there was a more or less irregular system, then a regular system and finally a more frequent regular system of mail services. The trend was from sail to steam, but both these means of communication improved following the same scheme. Faster sailings alone did not radically increase the number of consecutive information circles per year if the communication was not frequent enough. Neither did improved frequency advance information circulation if the trip was very long or if the sailings overlapped instead of complementing each other. The speed of information transmission could be improved by speeding up the voyage itself (technological improvements, minimizing the waiting time at ports of call, etc.), but especially by organizing sailings so that the recipients had the possibility to reply to arriving mail without unnecessary delay. It took two to three decades before the mail-carrying shipping companies were able to organize their sailings in an optimal way. Strategic shortcuts over isthmuses (e.g. Panama, Suez), together with cooperation between steamships and railways, enabled the most effective improvements in global communications before the introduction of the telegraph.
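The measurement principle described above can be sketched numerically. The function names and sailing figures below are hypothetical illustrations, not data from the study: one information circle is taken as the outbound transit plus the wait for a return departure plus the homeward transit, and the number of consecutive circles per year follows by division.

```python
def circle_duration(outbound_days, wait_days, return_days):
    """Duration of one complete information circle, in days."""
    return outbound_days + wait_days + return_days

def circles_per_year(durations):
    """Consecutive information circles achievable in a 365-day year."""
    mean = sum(durations) / len(durations)
    return 365 / mean

# A faster crossing helps little if the wait for the next departure dominates
# (all figures are invented for illustration):
monthly_packet = [circle_duration(35, 25, 35)]  # sailing packet, monthly departures
weekly_steamer = [circle_duration(14, 5, 14)]   # steamer, weekly departures

print(round(circles_per_year(monthly_packet), 1))  # 3.8
print(round(circles_per_year(weekly_steamer), 1))  # 11.1
```

The example mirrors the abstract's point: shortening the voyage and shortening the wait for a complementing return sailing both enter the denominator, so either alone gives only a partial improvement.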
Abstract:
Flax and hemp have traditionally been used mainly for textiles, but recently interest has also been focused on non-textile applications. Microbial quality throughout the whole processing chain of bast fibres has not previously been studied. This study concentrates on the microbial quality and possible microbial risks in the production chain of hemp and flax fibres and fibrous thermal insulations. In order to utilize hemp and flax fibres, the bast fibres must be separated from the rest of the plant. Non-cellulosic components can be removed with various pretreatment processes, which are associated with a certain risk of microbial contamination. In this study, enzymatic retting and steam explosion (STEX) were examined as pretreatment processes. On the basis of the results obtained in this study, the microbial contents on the stalks of both plants studied increased at the end of the growing season and during the winter. However, by processing and mechanical separation it is possible to produce fibres containing fewer moulds and bacteria than the whole stem. Enzymatic treatment encouraged the growth of moulds in fibres, whereas steam explosion reduced the amount of moulds in fibres. The dry thermal treatment used in this study did not markedly reduce the amount of microbes. In this project, an emission measurement chamber was developed which was suitable for measurements of emissions from both mat-type and loose-fill-type insulations, and capable of interdisciplinary sampling. In this study, the highest fungal emissions were in the range of 10^3 to 10^5 cfu/m^3, from the flax and hemp insulations at 90% RH of air. The fungal emissions from stone wool, glass wool and recycled paper insulations were below 10^2 cfu/m^3 even at 90% RH. Equally low values were obtained from bast fibrous materials at lower humidities (30% and 80% RH of air).
After drying of the moulded insulations at 30% RH, the amounts of emitted moulds were in all cases higher than the emissions at 90% RH before drying. The most common fungi in the bast fibres were Penicillium and Rhizopus. The widest variety of fungi was found in the untreated hemp and linseed fibres and in the commercial loose-fill flax insulation. Penicillium, Rhizopus and Paecilomyces were the most tolerant to steam explosion. According to the literature, the most common fungi in building materials and indoor air are Penicillium, Aspergillus and Cladosporium, all of which were found in some of the bast fibre materials in this study. As organic materials, hemp and flax fibres contain high levels of nutrients for microbial growth. The amount of microbes can be controlled and somewhat decreased by the processing methods presented.
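The chamber results above are reported as airborne concentrations. A minimal sketch of the underlying conversion from a plate count to cfu/m^3, with hypothetical sampler figures (the flow rate, sampling time and colony count are assumptions for illustration, not values from the study):

```python
def cfu_per_m3(colony_count, flow_l_per_min, sampling_min):
    """Airborne concentration (cfu/m^3) from a colony count and the sampled air volume."""
    volume_m3 = flow_l_per_min * sampling_min / 1000.0  # litres -> cubic metres
    return colony_count / volume_m3

# e.g. 283 colonies collected at 100 L/min for 1 minute (invented figures):
print(round(cfu_per_m3(283, 100, 1)))  # 2830, i.e. in the 10^3 cfu/m^3 range
```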
Abstract:
Fusion energy is a clean and safe solution to the intricate question of how to produce non-polluting and sustainable energy for the constantly growing population. The fusion process does not result in any harmful waste or greenhouse gases, since a small amount of helium is the only by-product produced when using the hydrogen isotopes deuterium and tritium as fuel. Moreover, deuterium is abundant in seawater and tritium can be bred from lithium, a common metal in the Earth's crust, rendering the fuel reservoirs practically bottomless. Due to its enormous mass, the Sun has been able to utilize fusion as its main energy source ever since it was born, but here on Earth we must find other means to achieve the same. Inertial fusion involving powerful lasers and thermonuclear fusion employing extreme temperatures are examples of successful methods; however, these have yet to produce more energy than they consume. In thermonuclear fusion, the fuel is held inside a tokamak, a doughnut-shaped chamber with strong magnets wrapped around it. Once the fuel is heated up, it is controlled with the help of these magnets, since the required temperatures (over 100 million degrees C) separate the electrons from the nuclei, forming a plasma. When fusion reactions occur, excess binding energy is released as energetic neutrons, which are absorbed in water in order to produce steam that runs turbines. Keeping the power losses from the plasma low, thus allowing for a high number of reactions, is a challenge. Another challenge is related to the reactor materials, since the confinement of the plasma particles is not perfect, resulting in particle bombardment of the reactor walls and structures. Material erosion and activation as well as plasma contamination are expected. In addition, the high-energy neutrons will cause radiation damage in the materials, causing, for instance, swelling and embrittlement.
In this thesis, the behaviour of a material situated in a fusion reactor was studied using molecular dynamics simulations. Simulations of processes in the next-generation fusion reactor ITER include the reactor materials beryllium, carbon and tungsten as well as the plasma hydrogen isotopes. This means that interaction models, i.e. interatomic potentials, for this complicated quaternary system are needed. The task of finding such potentials is nonetheless nearly complete, since models for the beryllium-carbon-hydrogen interactions were constructed in this thesis and, as a continuation of that work, a beryllium-tungsten model is under development. These potentials are combinable with the earlier tungsten-carbon-hydrogen ones. The potentials were used to explain the chemical sputtering of beryllium due to deuterium plasma exposure. During experiments, a large fraction of the sputtered beryllium atoms were observed to be released as BeD molecules, and the simulations identified the swift chemical sputtering mechanism, previously not believed to be important in metals, as the underlying mechanism. Radiation damage in the reactor structural materials vanadium, iron and iron-chromium, as well as in the wall material tungsten and the mixed alloy tungsten carbide, was also studied in this thesis. Interatomic potentials for vanadium, tungsten and iron were modified to be better suited for simulating the collision cascades that form during particle irradiation, and the potential features affecting the resulting primary damage were identified. Including the often neglected electronic effects in the simulations was also shown to have an impact on the damage. With proper tuning of the electron-phonon interaction strength, experimentally measured quantities related to ion-beam mixing in iron could be reproduced. The damage in tungsten carbide alloys showed elemental asymmetry, as the major part of the damage consisted of carbon defects.
On the other hand, modelling the damage in the iron-chromium alloy, essentially representing steel, showed that small additions of chromium do not noticeably affect the primary damage in iron. Since a complete assessment of the response of a material in a future full-scale fusion reactor is not achievable using experimental techniques alone, molecular dynamics simulations are of vital help. This thesis has not only provided insight into complicated reactor processes and improved current methods, but also offered tools for further simulations. It is therefore an important step towards making fusion energy more than a future goal.
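The molecular dynamics machinery referred to in the abstract can be illustrated in miniature. The sketch below uses a simple Lennard-Jones pair potential as a stand-in for the many-body Be-C-W-H potentials developed in the thesis (all parameters are hypothetical, in reduced units) and integrates the separation of a single atom pair with the velocity Verlet algorithm, the standard integrator in such simulations:

```python
def lj_force(r, epsilon=1.0, sigma=1.0):
    """Lennard-Jones force on the separation coordinate (positive = repulsive)."""
    sr6 = (sigma / r) ** 6
    return 24.0 * epsilon * (2.0 * sr6 * sr6 - sr6) / r

def velocity_verlet(r, v, dt, steps, m=1.0):
    """Evolve the pair separation r with velocity v over a number of time steps."""
    f = lj_force(r)
    for _ in range(steps):
        r += v * dt + 0.5 * (f / m) * dt * dt    # position update
        f_new = lj_force(r)
        v += 0.5 * (f + f_new) / m * dt          # velocity update with averaged force
        f = f_new
    return r, v

# At the potential minimum r = 2^(1/6) * sigma the force vanishes,
# so a pair started there at rest stays put:
r0 = 2.0 ** (1.0 / 6.0)
r1, v1 = velocity_verlet(r0, 0.0, dt=0.001, steps=1000)
print(abs(r1 - r0) < 1e-9)  # True
```

Production simulations of cascades and sputtering differ mainly in scale (many atoms, three dimensions, far more elaborate potentials), not in this basic integration loop.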
Abstract:
The aim of this thesis was to study the crops currently used for biofuel production from the following aspects: (1) what the average yield per hectare should be to reach a zero or positive energy balance; (2) what the shares of the primary and secondary energy flows are in agriculture, transport, processing and usage; and (3) the overall effects of biofuel crop cultivation, transport, processing and usage. This thesis concentrated on oilseed rape biodiesel and wheat bioethanol in the European Union, comparing them with competing biofuels, such as corn- and sugarcane-based ethanol and the second-generation biofuels. The study was executed by comparing Life Cycle Assessment studies from the EU region and by analyzing them thoroughly from the viewpoint of their differences. The variables were the following: energy ratio, hectare yield (l/ha), impact on greenhouse gas emissions (particularly CO2), energy consumption in growing and processing one hectare of a particular crop to biofuel, distribution of energy in processing, and the effects of the secondary energy flows, such as wheat straw. Processing was found to be the most energy-consuming part of the production of biofuels, so if the raw materials remain the same, the development will happen in processing. First-generation biodiesel requires esterification, which consumes approximately one third of the process energy. Around 75% of the energy consumed in manufacturing first-generation wheat-based ethanol is spent on steam and electricity generation. No breakthroughs are in sight in the agricultural sector to achieve significantly higher energy ratios. It was found that even in ideal conditions the energy ratio of first-generation wheat-based ethanol will remain slightly under 2. For oilseed rape-based biodiesel the energy ratios are better, and energy consumption per hectare is lower compared to wheat-based ethanol. But both of these are lower compared to e.g. sugarcane-based ethanol.
The hectare yield of wheat-based ethanol is also significantly lower. Biofuels are in a key position when considering the future of the world's transport sector. There are, however, several uncertainties concerning biofuels, such as the schedule of their large-scale introduction to consumer markets, the technologies used, raw materials and their availability and, maybe the biggest, the real production capacity in relation to fuel consumption. First-generation biofuels have not been the expected answer to environmental problems. The comparisons made show that sugarcane-based ethanol is the most prominent first-generation biofuel at the moment, from both the energy and environmental points of view. Palm oil-based biodiesel also looks promising, although it involves environmental concerns as well. From this point of view, the biofuels in this study, wheat-based ethanol and oilseed rape-based biodiesel, are not very competitive options. On the other hand, the crops currently used for fuel production in different countries are selected based on several factors, not only on their relative general superiority. It is challenging to make long-term forecasts for the biofuel sector, but it can be said that satisfying the world's current and near-future traffic fuel consumption with biofuels can only be regarded as impossible. This does not mean that biofuels should be rejected and their positive aspects ignored, but maybe this reality helps us to put them in perspective. To achieve true environmental benefits through the usage of biofuels, there must first be a significant drop in both traffic volumes and overall fuel consumption. Second-generation biofuels are coming, but serious questions about their availability and production capacities remain open. Therefore nothing can be taken for granted in this issue, except the need for development.
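The energy-ratio bookkeeping underlying these comparisons can be sketched as follows. All figures are hypothetical round numbers chosen only to illustrate the structure of the calculation and to roughly match the abstract's qualitative claims (ratio slightly under 2, processing dominating the inputs); they are not values from the study:

```python
def energy_ratio(fuel_energy_mj, inputs_mj):
    """Energy ratio: energy content of the delivered fuel over total energy spent."""
    return fuel_energy_mj / sum(inputs_mj.values())

# Hypothetical per-litre input budget for first-generation wheat ethanol (MJ/l):
inputs = {
    "cultivation": 3.0,
    "transport": 0.8,
    "processing": 8.2,  # dominated by steam and electricity generation
}
ethanol_energy = 21.0  # approximate energy content of ethanol, MJ/l

ratio = energy_ratio(ethanol_energy, inputs)
print(round(ratio, 2))  # 1.75 -> "slightly under 2" under these assumed figures

# Share of the total input consumed by processing:
print(round(inputs["processing"] / sum(inputs.values()), 2))  # 0.68
```

Structuring the inputs as a per-stage dictionary makes the abstract's argument explicit: the processing term dominates the denominator, so that is where any improvement in the ratio must come from.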