8 results for STEAM REFORMING
in Helda - Digital Repository of the University of Helsinki
Abstract:
Aerosol particles deteriorate air quality, atmospheric visibility and our health. They affect the Earth's climate by absorbing and scattering sunlight, forming clouds, and also via several feedback mechanisms. The net effect on the radiative balance is negative, i.e. cooling, which means that particles counteract the effect of greenhouse gases. However, particles are one of the poorly known pieces in the climate puzzle. Some airborne particles are natural, some anthropogenic; some enter the atmosphere in particle form, while others form by gas-to-particle conversion. Unless the sources and dynamical processes shaping the particle population are quantified, they cannot be incorporated into climate models. The molecular-level understanding of new particle formation is still inadequate, mainly due to the lack of suitable measurement techniques to detect the smallest particles and their precursors. This thesis has contributed to our ability to measure newly formed particles. Three new condensation particle counter applications for measuring the concentration of nanoparticles were developed. The suitability of the methods for detecting both charged and electrically neutral particles and molecular clusters as small as 1 nm in diameter was thoroughly tested in both laboratory and field conditions. It was shown that condensation particle counting has reached the size scale of individual molecules and that, besides measuring concentrations, the counters can be used to obtain size information. In addition to atmospheric research, the particle counters could have various applications in other fields, especially in nanotechnology. Using the new instruments, the first continuous time series of neutral sub-3 nm particle concentrations were measured at two field sites representing two different kinds of environments: the boreal forest and the Atlantic coastline, both of which are known to be hot spots for new particle formation. The contribution of ions to the total concentrations in this size range was estimated, and it could be concluded that the fraction of ions was usually minor, especially in boreal forest conditions. Since the ionization rate is connected to the amount of cosmic rays entering the atmosphere, the relative contribution of neutral versus charged nucleation mechanisms extends beyond academic interest and links the research directly to the current climate debate.
Abstract:
From the Finnish Art Society to the Ateneum: Fredrik Cygnaeus, Carl Gustaf Estlander and the Roles of the Art Collection. My dissertation deals with the Finnish Art Society and the development of its collection in the evolving field of the visual arts, from the foundation of the society in 1846 to its exhibition in the Ateneum, a palace of art that was opened to the public in Helsinki in 1888. The main questions that it addresses are why and how the collection came into being, what its purpose was and what kind of future prospects were projected for it in the rapidly evolving field of the visual arts. I have examined the subject of my study from the perspectives of institutional history, the organisation of the field of art and the history of art collections. The prisms through which I have viewed the subject are the history of museums in Europe, the written history of art, the art association movement and the organisation of art education in relation to an ideology of enlightenment. Thus the activities of the Finnish Art Society are here mirrored for the first time in a wider context, and the history of its collection is located on the map of European collections. My research shows that the history of the collection of the Finnish Art Society initially depended on certain players in the visual arts and their particular leanings. The most important of these custodians were two long-serving chairmen of the society, Fredrik Cygnaeus (1807–1881) and Carl Gustaf Estlander (1834–1910). When the foundations for art activities had been laid through the establishment of the society, Cygnaeus and Estlander began to plan how the field of art might be moulded so as to improve the level of training for artists, the quality of the collections and the opportunities for their display. Cygnaeus campaigned for the establishment of the Finnish Fine Arts Academy, while Estlander saw opportunities to combine the visual and applied arts. The findings of my research bring new information about the history of the collection of the Finnish Art Society, its profile, the professional abilities of those who were mainly responsible for developing it, and the relationship between the collection and plans for reforming art education. The major findings are connected with the position of the collection in the field of art at different stages of its development. Despite the Finnish Art Society's central monopoly in the field of art, the position of the collection was closely bound up with leading players in the field and their personal interests. This subservience impeded its full development and purposeful profiling, and it remained evident for a long time as the collection sought its own place in the Finnish art world.
Abstract:
Earlier studies have shown that the speed of information transmission developed radically during the 19th century. The fast development was mainly due to the change from sailing ships and horse-drawn coaches to steamers and railways, as well as the telegraph. The speed of information transmission has normally been measured by calculating the duration between writing and receiving a letter, or between an important event and the time when the news was published elsewhere. As overseas mail was generally carried by ships, the history of communications and maritime history are closely related. This study also brings a postal-historical aspect to the academic discussion. In addition, it introduces another new aspect: in business enterprises, information flows generally consisted of multiple transactions. Although fast one-way information was often crucial, e.g. news of a changing market situation, it was at least equally important to be able to react rapidly. To examine the development of business information transmission, the duration of mail transport has been measured by a systematic and commensurable method, using consecutive information circles per year as the principal tool of measurement. The study covers a period of six decades, several of the world's most important trade routes and different mail-carrying systems operated by merchant ships, sailing packets and several nations' steamship services. The main sources have been the sailing data of mail-carrying ships and the correspondence of several merchant houses in England. As the world's main trade routes had their specific historical backgrounds, with different businesses, interests and needs, the systems for information transmission did not develop similarly or simultaneously. It was a process lasting several decades, initiated by the idea of organizing sailings in a regular line system. The evolution generally proceeded as follows: originally there was a more or less irregular system, then a regular system and finally a more frequent regular system of mail services. The trend was from sail to steam, but both means of communication improved following the same scheme. Faster sailings alone did not radically improve the number of consecutive information circles per year if the communication was not frequent enough. Neither did improved frequency advance information circulation if the voyage was very long or if the sailings overlapped instead of complementing each other. The speed of information transmission could be improved by speeding up the voyage itself (technological improvements, minimizing the waiting time at ports of call, etc.), but especially by organizing sailings so that the recipients had the possibility to reply to arriving mail without unnecessary delay. It took two to three decades before the mail-carrying shipping companies were able to organize their sailings in an optimal way. Strategic shortcuts over isthmuses (e.g. Panama, Suez), together with cooperation between steamships and railways, enabled the most effective improvements in global communications before the introduction of the telegraph.
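The "information circle" metric described above can be made concrete with a simple back-of-the-envelope relation, where C is the number of consecutive information circles per year. This is only one illustrative reading of the measure; the component times and the 100-day example below are assumptions, not figures from the thesis.

```latex
% Illustrative only: the component times below are assumed, not taken from the thesis.
\[
  C \;\approx\; \frac{365}{t_{\mathrm{outward}} + t_{\mathrm{reply}} + t_{\mathrm{return}}}
\]
% Example: a 45-day outward passage, a 10-day wait before the reply mail departs and a
% 45-day return passage give a 100-day circle, i.e. about 3.65 complete exchanges of
% correspondence per year. Faster ships alone raise C little if infrequent or overlapping
% sailings leave the reply waiting in port, which is the point made above.
```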
Abstract:
Flax and hemp have traditionally been used mainly for textiles, but recently interest has also been focused on non-textile applications. The microbial quality throughout the whole processing chain of bast fibres has not previously been studied. This study concentrates on the microbial quality and possible microbial risks in the production chain of hemp and flax fibres and fibrous thermal insulations. In order to utilize hemp and flax fibres, the bast fibres must be separated from the rest of the plant. Non-cellulosic components can be removed with various pretreatment processes, which are associated with a certain risk of microbial contamination. In this study, enzymatic retting and steam explosion (STEX) were examined as pretreatment processes. On the basis of the results obtained in this study, the microbial contents on the stalks of both plants studied increased at the end of the growing season and during the winter. However, by processing and mechanical separation it is possible to produce fibres containing fewer moulds and bacteria than the whole stem. Enzymatic treatment encouraged the growth of moulds in fibres, whereas steam explosion reduced the amount of moulds in fibres. The dry thermal treatment used in this study did not markedly reduce the amount of microbes. In this project an emission measurement chamber was developed which was suitable for measuring emissions from both mat-type and loose-fill insulations, and capable of interdisciplinary sampling. In this study, the highest fungal emissions, in the range of 10^3–10^5 cfu/m^3, came from the flax and hemp insulations at 90% RH. The fungal emissions from stone wool, glass wool and recycled paper insulations were below 10^2 cfu/m^3 even at 90% RH. Equally low values were obtained from the bast fibrous materials at lower humidities (30% and 80% RH). After drying the moulded insulations at 30% RH, the amounts of emitted moulds were in all cases higher than the emissions at 90% RH before drying. The most common fungi in bast fibres were Penicillium and Rhizopus. The widest variety of different fungi was found in the untreated hemp and linseed fibres and in the commercial loose-fill flax insulation. Penicillium, Rhizopus and Paecilomyces were the most tolerant to steam explosion. According to the literature, the most common fungi in building materials and indoor air are Penicillium, Aspergillus and Cladosporium, all of which were found in some of the bast fibre materials in this study. As organic materials, hemp and flax fibres contain high levels of nutrients for microbial growth. The amount of microbes can be controlled and somewhat decreased by the processing methods presented.
Abstract:
Territoriality is a central issue in indigenous peoples' struggles. Territorial struggles involve struggles over the control of natural resources and over political participation and representation, but also over the perception of territorial rights and the symbolic representation of the territory. These struggles are carried through in both material and symbolic ways, by drawing on different discourses and representations that legitimate the territorial claims of the group. The study is located in the Northern Autonomous Atlantic Region of Nicaragua. It concerns the territorial strategies, conceptions and practices of the indigenous people and other actors. Territorial conflicts exist between the autonomous region and the central government of Nicaragua, between mestizo settlers and indigenous people, between different indigenous groups, and between these and development agents such as conservation projects. The study focuses on how territorial discourses and representations are used to legitimate territorial control. Environmental, historical and cartographical discourses are the most important discourses drawn upon. The influence of discourses and representations on the territorial practices and policies of the different actors, the links between local struggles and global processes, and the broader structural factors impacting the territorial struggles are also analysed. Among the structural factors are the problems related to land tenure and management and the use of natural resources, the advance of the agricultural frontier, the institutional weaknesses of the central and regional governments, and the legislative processes. The territorial discourses are both drawn on strategically and grounded in local ideals and practices. The discourses have produced real effects, for example in legislation, land tenure systems, political representation and environmental practices. Although the use of discourses and representations is an important power tool in territorial struggles, territorial control cannot be effectively accomplished merely by representing territorial claims in a legitimate way or by reforming legislation, as the conflicts are also largely a result of structural factors affecting the region. The fieldwork was carried out during a total of twelve months between 2000 and 2002. The research methods used were semi-structured interviews, participant observation and participatory research methods. A broad range of literary sources was also used to collect data. The study is located within the field of critical political geography, with a discursive political ecology approach. It can be called a critical realist approach to the discursive analysis of indigenous territoriality.
Abstract:
Fusion energy is a clean and safe solution to the intricate question of how to produce non-polluting and sustainable energy for a constantly growing population. The fusion process does not result in any harmful waste or greenhouse gases, since a small amount of helium is the only by-product produced when the hydrogen isotopes deuterium and tritium are used as fuel. Moreover, deuterium is abundant in seawater and tritium can be bred from lithium, a common metal in the Earth's crust, rendering the fuel reservoirs practically bottomless. Due to its enormous mass, the Sun has been able to utilize fusion as its main energy source ever since it was born, but here on Earth we must find other means to achieve the same. Inertial fusion involving powerful lasers and thermonuclear fusion employing extreme temperatures are examples of successful methods; however, these have yet to produce more energy than they consume. In thermonuclear fusion, the fuel is held inside a tokamak, a doughnut-shaped chamber with strong magnets wrapped around it. Once the fuel is heated up, it is controlled with the help of these magnets, since the required temperatures (over 100 million degrees Celsius) separate the electrons from the nuclei, forming a plasma. Once fusion reactions occur, the excess binding energy is released as energetic neutrons, which are absorbed in water in order to produce steam that runs turbines. Keeping the power losses from the plasma low, and thus allowing a high number of reactions, is one challenge. Another challenge is related to the reactor materials: since the confinement of the plasma particles is not perfect, the reactor walls and structures are bombarded by particles. Material erosion and activation as well as plasma contamination are expected. In addition, the high-energy neutrons will cause radiation damage in the materials, leading, for instance, to swelling and embrittlement. In this thesis, the behaviour of materials situated in a fusion reactor was studied using molecular dynamics simulations. Simulations of processes in the next-generation fusion reactor ITER involve the reactor materials beryllium, carbon and tungsten as well as the hydrogen isotopes of the plasma. This means that interaction models, i.e. interatomic potentials, for this complicated quaternary system are needed. The task of finding such potentials is nonetheless nearly complete, since models for the beryllium-carbon-hydrogen interactions were constructed in this thesis and, as a continuation of that work, a beryllium-tungsten model is under development. These potentials are combinable with the earlier tungsten-carbon-hydrogen ones. The potentials were used to explain the chemical sputtering of beryllium under deuterium plasma exposure. In experiments, a large fraction of the sputtered beryllium atoms were observed to be released as BeD molecules, and the simulations identified swift chemical sputtering, a mechanism previously not believed to be important in metals, as the underlying mechanism. Radiation damage in the reactor structural materials vanadium, iron and iron-chromium, as well as in the wall material tungsten and the mixed alloy tungsten carbide, was also studied in this thesis. Interatomic potentials for vanadium, tungsten and iron were modified to be better suited for simulating the collision cascades that form during particle irradiation, and the potential features affecting the resulting primary damage were identified.
Including the often neglected electronic effects in the simulations was also shown to have an impact on the damage. With proper tuning of the electron-phonon interaction strength, experimentally measured quantities related to ion-beam mixing in iron could be reproduced. The damage in tungsten carbide alloys showed elemental asymmetry, as the major part of the damage consisted of carbon defects. On the other hand, modelling the damage in the iron-chromium alloy, essentially representing steel, showed that small additions of chromium do not noticeably affect the primary damage in iron. Since a complete assessment of the response of a material in a future full-scale fusion reactor is not achievable using only experimental techniques, molecular dynamics simulations are of vital help. This thesis has not only provided insight into complicated reactor processes and improved current methods, but also offered tools for further simulations. It is therefore an important step towards making fusion energy more than a future goal.
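As a rough illustration of what a molecular dynamics simulation of atomic collisions involves, a minimal sketch follows. It uses a generic Lennard-Jones pair potential and a velocity-Verlet integrator in reduced units; the thesis itself relies on dedicated many-body interatomic potentials (e.g. for Be-C-H and W-C-H systems) and full collision-cascade setups, none of which are reproduced here. All function names and parameters in the sketch are illustrative placeholders.

```python
# A toy molecular-dynamics step: NOT the thesis's method, just the generic scheme.
import numpy as np

def lj_forces(pos, eps=1.0, sigma=1.0):
    """Pairwise Lennard-Jones forces and potential energy in reduced units."""
    forces = np.zeros_like(pos)
    energy = 0.0
    n = len(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r_vec = pos[i] - pos[j]
            r2 = np.dot(r_vec, r_vec)
            inv_r6 = (sigma ** 2 / r2) ** 3
            energy += 4.0 * eps * (inv_r6 ** 2 - inv_r6)
            # Force magnitude divided by r, expressed through r2 to avoid a sqrt
            f_scalar = 24.0 * eps * (2.0 * inv_r6 ** 2 - inv_r6) / r2
            forces[i] += f_scalar * r_vec
            forces[j] -= f_scalar * r_vec
    return forces, energy

def velocity_verlet(pos, vel, mass=1.0, dt=1e-3, steps=1000):
    """Integrate Newton's equations of motion with the velocity-Verlet scheme."""
    forces, _ = lj_forces(pos)
    for _ in range(steps):
        vel += 0.5 * dt * forces / mass   # half kick
        pos += dt * vel                   # drift
        forces, _ = lj_forces(pos)        # recompute forces at new positions
        vel += 0.5 * dt * forces / mass   # second half kick
    return pos, vel

# Toy "cascade": one fast atom fired at a small row of initially static atoms.
positions = np.array([[0.0, 0.0, 0.0], [1.1, 0.0, 0.0], [2.2, 0.0, 0.0]])
velocities = np.zeros_like(positions)
velocities[0, 0] = 5.0  # the "primary knock-on" atom
positions, velocities = velocity_verlet(positions, velocities)
```

Production cascade simulations differ mainly in scale (up to millions of atoms), in the many-body potential used and in additions such as electronic stopping, but the integration loop follows the same pattern.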
Abstract:
The 1980s and the early 1990s have proved to be an important turning point in the history of the Nordic welfare states. After this breaking point, the Nordic social order has been built upon a new foundation. This study shows that the new order is mainly built upon new hierarchies and control mechanisms that have been developed consistently through economic and labour market policy measures. During the post-war period, the Nordic welfare states increasingly created equality of opportunity and scope for agency among people. Public social services were available to all, and the tax-benefit system maintained an even income distribution. During this golden era of the Nordic welfare state, the scope for agency was, however, limited by social structures. Public institutions and law tended to categorize people according to their life circumstances, ascribing them a predefined role. In the 1980s and 1990s this collectivist social order began to mature, and it became subject to political renegotiation. Signs of a new social order in the Nordic countries have included the liberalization of the financial markets, the privatization of public functions and the redefinition of the role of the public sector. It is now possible to reassess the ideological foundations of this new order. In contrast to widely used political rhetoric, the foundation of the new order has not been the ideas of individual freedom or choice. Instead, the most important aim appears to have been to control and direct people to act in accordance with the rules of the market. The various levels of government and the social security system have been redirected to serve this goal. Instead of being a mechanism for redistributing income, the Nordic social security system has been geared towards creating new hierarchies on the Nordic labour markets. During the past decades, the conditions for receiving income support and unemployment benefit have been tightened in all Nordic countries. As a consequence, people have been forced to accept deteriorating terms and conditions on the labour market. Country-specific variations exist, however: in sum, Sweden has been the most conservative, Denmark the most innovative and Finland the most radical in reforming labour market policy. The new hierarchies on the labour market have coincided with slow or non-existent growth of real wages and with strong growth in the share of capital income. Slow growth of real wages has kept inflation low and thus secured the value of capital. Societal development has thus progressed from equality of opportunity during the age of the welfare states towards a hierarchical social order in which the majority of people face increasing constraints and a fortunate minority enjoys prosperity and security.
Abstract:
The aim of this thesis was to study the crops currently used for biofuel production from the following aspects: 1. what average yield per hectare is needed to reach an energy balance that is at least zero or positive, 2. what are the shares of the primary and secondary energy flows in agriculture, transport, processing and usage, and 3. what are the overall effects of biofuel crop cultivation, transport, processing and usage. The thesis concentrated on oilseed rape biodiesel and wheat bioethanol in the European Union, comparing them with competing biofuels, such as corn- and sugarcane-based ethanol, and with second-generation biofuels. The study was carried out by comparing Life Cycle Assessment studies from the EU region and analysing them thoroughly from the viewpoint of their differences. The variables were the following: energy ratio, hectare yield (l/ha), impact on greenhouse gas emissions (particularly CO2), energy consumption in growing and processing one hectare of a particular crop to biofuel, distribution of energy in processing, and the effects of the secondary energy flows, such as wheat straw. Processing was found to be the most energy-consuming part of biofuel production, so if the raw materials remain the same, development will have to happen in processing. First-generation biodiesel requires esterification, which consumes approximately one third of the process energy. Around 75% of the energy consumed in manufacturing first-generation wheat-based ethanol is spent on steam and electricity generation. No breakthroughs are in sight in the agricultural sector to achieve significantly higher energy ratios. It was found that even under ideal conditions the energy ratio of first-generation wheat-based ethanol will remain slightly under 2. For oilseed rape-based biodiesel the energy ratios are better, and the energy consumption per hectare is lower than for wheat-based ethanol, but both remain lower than, for example, sugarcane-based ethanol. The hectare yield of wheat-based ethanol is also significantly lower. Biofuels are in a key position when considering the future of the world's transport sector. There are, however, several uncertainties concerning biofuels, such as the schedule of large-scale introduction to consumer markets, the technologies used, the raw materials and their availability and, perhaps the biggest, the real production capacity in relation to fuel consumption. First-generation biofuels have not been the expected answer to environmental problems. The comparisons made show that sugarcane-based ethanol is the most prominent first-generation biofuel at the moment, from both an energy and an environmental point of view. Palm oil-based biodiesel also looks promising, although it involves environmental concerns as well. From this point of view, the biofuels in this study, wheat-based ethanol and oilseed rape-based biodiesel, are not very competitive options. On the other hand, the crops currently used for fuel production in different countries are selected on the basis of several factors, not only their relative general superiority. It is challenging to make long-term forecasts for the biofuel sector, but it can be said that satisfying the world's current and near-future traffic fuel consumption with biofuels can only be regarded as impossible. This does not mean that biofuels should be rejected and their positive aspects ignored, but perhaps this reality helps us to put them in perspective.
To achieve true environmental benefits through the use of biofuels, there must first be a significant drop in both traffic volumes and overall fuel consumption. Second-generation biofuels are coming, but serious questions about their availability and production capacities remain open. Therefore nothing can be taken for granted in this issue, except the need for development.
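For readers unfamiliar with the energy ratio metric used above, a generic life-cycle formulation and a toy calculation follow. The formula is a common way of defining such a ratio, and the figures are placeholders chosen only to match the qualitative finding above; they are not data from the studies compared in the thesis.

```latex
% Illustrative only: generic definition and placeholder figures, not thesis data.
\[
  R \;=\; \frac{E_{\mathrm{fuel}} + E_{\mathrm{coproducts}}}
               {E_{\mathrm{cultivation}} + E_{\mathrm{transport}} + E_{\mathrm{processing}}}
\]
% Example: if one hectare of wheat yields 55 GJ of ethanol plus 10 GJ of usable
% co-product energy, while cultivation, transport and processing consume 8, 2 and
% 25 GJ respectively, then R = 65/35, i.e. about 1.9 -- consistent with the finding
% that first-generation wheat ethanol stays slightly under an energy ratio of 2
% even under favourable assumptions, with processing dominating the inputs.
```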