12 results for "orders of worth"

in Helda - Digital Repository of the University of Helsinki


Relevance:

100.00%

Publisher:

Abstract:

Evaluation practices have pervaded Finnish society and the welfare state. At the same time, the term "effectiveness" has become a powerful organising concept in welfare state activities. The aim of the study is to analyse how the outcome-oriented society came into being through historical processes, answering the question of how social policy and welfare state practices were brought under the governance of the concept of effectiveness. Discussions about social imagination, Michel Foucault's conceptions of the history of the present and of governmentality, genealogy and archaeology, along with Ian Hacking's notions of dynamic nominalism and styles of reasoning, are used as the conceptual and methodological starting points for the study. In addition, Luc Boltanski's and Laurent Thévenot's ideas of "orders of worth", regimes of evaluation in everyday life, are employed. Usually, evaluation is conceptualised as an autonomous epistemic culture and practice (evaluation as epistemic practice), but evaluation is here understood as knowledge-creation processes elementary to different epistemic practices (evaluation in epistemic practices). The emergence of epistemic cultures and styles of reasoning about the effectiveness or impacts of welfare state activities is analysed through Finnish social policy and social work research. The study uses case studies which represent debates and empirical research dealing with the effectiveness and quality of social services and social work. While uncertainty and doubts over the effects and consequences of welfare policies have always been present in discourses about social policy, the theme has not been much acknowledged in social policy research. To resolve these uncertainties, eight styles of reasoning about such effects have emerged over time: the statistical, goal-based, needs-based, experimental, interaction-based, performance measurement, auditing and evidence-based styles of reasoning.
Social policy research has contributed in various ways to the creation of these epistemic practices. The transformation of the welfare state, starting at the end of the 1980s, increased market-orientation, trimmed public welfare responsibilities, and led to the adoption of the New Public Management (NPM) style of leadership. Due to these developments the concept of effectiveness made a breakthrough, and new accountabilities, with their knowledge tools for performance measurement, and the auditing and evidence-based styles of reasoning became more dominant in the ruling of the welfare state. The social sciences and evaluation have developed a heteronomous relation with each other, although divergent tendencies between them remain. Key words: evaluation, effectiveness, social policy, welfare state, public services, sociology of knowledge

Relevance:

100.00%

Publisher:

Abstract:

The research in this thesis addresses the question of corporate legitimation and values. It studies moral speech in Finnish companies' social responsibility reports and annual reports. Managerial rhetoric is examined as a means of building and maintaining legitimacy. The companies studied are the ten biggest companies that reported on social responsibility in 2004, and the analysed data consist of the companies' reporting from 1998 to 2008. The theoretical and analytical framework is provided by Luc Boltanski's and Laurent Thévenot's theory of justification. The theory focuses on systems of moral thinking and argumentation, so-called "orders of worth". The study shows how these moral schemes were used in the legitimation process. Special attention is paid to the ways that compromises are made between different orders of worth, such as the market, civic and green orders. The study shows that the focus of legitimation has shifted towards societal and environmental themes. The values of market and industry, profits and efficiency, however, remain the strongest basis for organizational legitimation in Finnish companies. The economic crisis of 2008 had a visible impact on the moral rhetoric, especially in the Finnish forestry sector. Large layoffs called the companies' traditional role into question and made companies adopt a more market-centered and project-based moral rhetoric. New inspirational and project-centered moral speech emerged as the companies were less able to present themselves as nation-based, traditional actors in Finnish society.

Relevance:

90.00%

Publisher:

Abstract:

The dissertation deals with remote narrowband measurements of the electromagnetic radiation emitted by lightning flashes. A lightning flash consists of a number of sub-processes. The return stroke, which transfers electrical charge from the thundercloud to the ground, is electromagnetically an impulsive wideband process; that is, it emits radiation at most frequencies in the electromagnetic spectrum, but its duration is only some tens of microseconds. Before and after the return stroke, multiple sub-processes redistribute electrical charges within the thundercloud. These sub-processes can last for tens to hundreds of milliseconds, many orders of magnitude longer than the return stroke. Each sub-process causes radiation with specific time-domain characteristics, having maxima at different frequencies. Thus, if the radiation is measured at a single narrow frequency band, it is difficult to identify the sub-processes, and some sub-processes can be missed altogether. However, narrowband detectors are simple to design and miniaturize. In particular, near the High Frequency (HF) band (3 MHz to 30 MHz), ordinary shortwave radios can, in principle, be used as detectors. This dissertation utilizes a prototype detector which is essentially a handheld AM radio receiver. Measurements were made in Scandinavia, and several independent data sources were used to identify lightning sub-processes, as well as the distance to each individual flash. It is shown that multiple sub-processes radiate strongly near the HF band. The return stroke usually radiates intensely, but it cannot be reliably identified from the time-domain signal alone. This means that a narrowband measurement is best used to characterize the energy of the radiation integrated over the whole flash, without attempting to identify individual processes. The dissertation analyzes the conditions under which this integrated energy can be used to estimate the distance to the flash.
It is shown that flash-by-flash variations are large, but the integrated energy is very sensitive to changes in the distance, dropping as approximately the inverse cube root of the distance. Flashes can, in principle, be detected at distances of more than 100 km, but since the ground conductivity can vary, ranging accuracy drops dramatically at distances larger than 20 km. These limitations mean that individual flashes cannot be ranged accurately using a single narrowband detector, and the useful range is limited to at most 30 kilometers. Nevertheless, simple statistical corrections are developed, which enable an accurate estimate of the distance to the closest edge of an active storm cell, as well as the approach speed. The results of the dissertation could therefore have practical applications in real-time short-range lightning detection and warning systems.
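The power-law fall-off described above can be sketched as a toy ranging calculation. Everything here is illustrative: the function name, the calibration pair (a reference energy measured at a reference distance) and the fixed exponent are assumptions for the sketch, not the dissertation's actual estimator, which additionally applies the statistical corrections mentioned above.

```python
# Toy inversion of an assumed power law E(d) = E_ref * (d / d_ref)**(-p).
# With the inverse-cube-root fall-off quoted above (p = 1/3), even a
# factor-of-two spread in integrated energy maps to an 8-fold spread in
# inferred distance -- which illustrates why single-flash ranging is hard.

def estimate_distance_km(energy, energy_ref, d_ref_km, p):
    """Distance implied by an integrated energy, given a calibration pair."""
    return d_ref_km * (energy_ref / energy) ** (1.0 / p)

# A flash returning half the calibrated energy measured at 10 km:
print(round(estimate_distance_km(energy=1.0, energy_ref=2.0,
                                 d_ref_km=10.0, p=1.0 / 3.0), 1))  # 80.0
```

A real detector would average such estimates over many flashes, since the flash-by-flash energy variations noted above dominate any single measurement.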

Relevance:

90.00%

Publisher:

Abstract:

Atmospheric aerosol particles affect the global climate as well as human health. In this thesis, the formation of nanometer-sized atmospheric aerosol particles and their subsequent growth was observed to occur all around the world. Typical formation rates of 3 nm particles varied from 0.01 to 10 cm^-3 s^-1. Formation rates one order of magnitude higher were detected in urban environments, and the highest formation rates, up to 10^5 cm^-3 s^-1, were detected in coastal areas and in industrial pollution plumes. Subsequent growth rates varied from 0.01 to 20 nm h^-1. The smallest growth rates were observed in polar areas and the largest in polluted urban environments, probably due to competition between growth by condensation and loss by coagulation. The observed growth rates were used in the calculation of a proxy condensable vapour concentration and its source rate in vastly different environments, from pristine Antarctica to polluted India. The estimated concentrations varied by only two orders of magnitude, but the source rates for the vapours varied by up to four orders of magnitude; the highest source rates were in New Delhi and the lowest in Antarctica. Indirect methods were applied to study the growth of freshly formed particles in the atmosphere. A newly developed Water Condensation Particle Counter, TSI 3785, was also found to be a potential candidate to detect water solubility, and thus indirectly the composition, of atmospheric ultra-fine particles. Based on indirect methods, the relative roles of sulphuric acid, non-volatile material and coagulation were investigated in rural Melpitz, Germany. Condensation of non-volatile material explained 20-40% of the growth, and sulphuric acid most of the remainder, up to the point when the nucleation mode reached 10 to 20 nm in diameter. Coagulation typically contributed less than 5%. Furthermore, hygroscopicity measurements were applied to detect the contribution of water-soluble and insoluble components in Athens.
During more polluted days, the water-soluble components contributed more to the growth; during periods of less anthropogenic influence, non-soluble compounds explained a larger fraction of the growth. In addition, long-range transport to a measurement station in Finland in a relatively polluted air mass was found to affect the hygroscopicity of the particles. This aging could have implications for cloud formation far away from the pollution sources.

Relevance:

90.00%

Publisher:

Abstract:

Atmospheric aerosol particle formation events can be a significant source of tropospheric aerosols and thus influence the radiative properties and cloud cover of the atmosphere. This thesis investigates the analysis of aerosol size distribution data containing particle formation events, describes the methodology of the analysis and presents time series data measured inside the boreal forest. This thesis presents a methodology to identify regional-scale particle formation, and to derive basic characteristics such as growth and formation rates. The methodology can also be used to estimate the concentration and source rate of the vapour causing particle growth. Particle formation was found to occur frequently in the boreal forest area, over regions extending up to hundreds of kilometers. Particle formation rates of boreal events were found to be of the order of 0.01-5 cm^-3 s^-1, while the nucleation rates of 1 nm particles can be a few orders of magnitude higher. The growth rates of particles larger than 3 nm were of the order of a few nanometers per hour. The vapour concentration needed to sustain such growth is of the order of 10^7-10^8 cm^-3, approximately one order of magnitude higher than sulphuric acid concentrations found in the atmosphere. Therefore, one has to assume that other vapours, such as organics, play a key role in growing newborn particles to sizes where they can become climatically active. Formation event occurrence shows a clear annual variation with peaks in summer and autumn. This variation is similar to the variation exhibited by the obtained particle formation rates. The growth rate, on the other hand, reaches its highest values during summer. This difference in annual behavior, and the fact that no coupling between the growth and formation processes could be identified, suggest that these are separate processes, and that both are needed for a particle formation burst to be observed.
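The link between growth rate and vapour concentration quoted above can be sketched with a commonly used kinetic-condensation rule of thumb: roughly 10^7 cm^-3 of condensable vapour sustains about 1 nm/h of growth. The proportionality constant depends on vapour properties (mass, density, sticking probability) and is assumed here purely for illustration; this is not the thesis's exact model.

```python
# Rule-of-thumb sketch (assumed proportionality, not the thesis's model):
# in the kinetic condensation regime the growth rate scales roughly linearly
# with the condensable vapour concentration, about 1 nm/h per 1e7 cm^-3.

C_PER_NM_H = 1e7  # vapour molecules (cm^-3) per 1 nm/h of growth (assumed)

def vapour_concentration(growth_rate_nm_h):
    """Condensable vapour concentration (cm^-3) sustaining a growth rate."""
    return C_PER_NM_H * growth_rate_nm_h

# Growth rates of a few nm/h, as observed in the boreal forest, then imply
# concentrations of order 1e7-1e8 cm^-3, matching the range quoted above.
print(vapour_concentration(3.0))
```

Since measured sulphuric acid concentrations are about an order of magnitude below this, the same back-of-envelope arithmetic motivates the thesis's conclusion that other vapours, such as organics, must contribute to the growth.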

Relevance:

90.00%

Publisher:

Abstract:

During the last decades, mean-field models, in which large-scale magnetic fields and differential rotation arise due to the interaction of rotation and small-scale turbulence, have been enormously successful in reproducing many of the observed features of the Sun. In the meantime, new observational techniques, most prominently helioseismology, have yielded invaluable information about the interior of the Sun. This new information, however, imposes strict conditions on mean-field models. Moreover, most of the present mean-field models depend on knowledge of the small-scale turbulent effects that give rise to the large-scale phenomena. In many mean-field models these effects are prescribed in ad hoc fashion for lack of this knowledge. With large enough computers it would be possible to solve the MHD equations numerically under stellar conditions. However, the problem is too large by several orders of magnitude for present-day and any foreseeable computers. In our view, a combination of mean-field modelling and local 3D calculations is a more fruitful approach. The large-scale structures are well described by global mean-field models, provided that the small-scale turbulent effects are adequately parameterized. The latter can be achieved by performing local calculations, which allow a much higher spatial resolution than can be achieved in direct global calculations. In the present dissertation three aspects of mean-field theories and models of stars are studied. Firstly, the basic assumptions of different mean-field theories are tested with calculations of isotropic turbulence and hydrodynamic, as well as magnetohydrodynamic, convection. Secondly, even if mean-field theory is unable to give the required transport coefficients from first principles, it is in some cases possible to compute these coefficients from 3D numerical models in a parameter range that can be considered to describe the main physical effects in an adequately realistic manner.
In the present study, the Reynolds stresses and turbulent heat transport, responsible for the generation of differential rotation, were determined along with the mixing length relations describing convection in stellar structure models. Furthermore, the alpha-effect and magnetic pumping due to turbulent convection in the rapid rotation regime were studied. The third part of the present study applies the local results in mean-field models, a task we begin by applying the results concerning the alpha-effect and turbulent pumping in mean-field models describing the solar dynamo.

Relevance:

80.00%

Publisher:

Abstract:

Music as the Art of Anxiety: A Philosophical Approach to the Existential-Ontological Meaning of Music. The present research studies music as an art of anxiety from the points of view of both Martin Heidegger's thought and phenomenological philosophy in general. In the Heideggerian perspective, anxiety is understood as a fundamental mode of being (Grundbefindlichkeit) in human existence. Taken as an existential-ontological concept, anxiety is conceived philosophically, not psychologically. The central research questions are: what is the relationship between music and existential-ontological anxiety? In what way can music be considered an art of anxiety? In thinking of music as a channel and manifestation of anxiety, what makes it a special case? What are the possible applications of phenomenology and Heideggerian thought in musicology? The main aim of the research is to develop a theory of music as an art of existential-ontological anxiety and to apply this theory to musicologically relevant phenomena. Furthermore, the research contributes to contemporary musicological debates as it aims to outline the phenomenological study of music as a field of its own; the development of a specific methodology is implicit in these aims. The main subject of the study, a theory of music as an art of anxiety, integrates Heideggerian and phenomenological philosophies with critical and cultural theories concerning violence, social sacrifice, and mimetic desire (René Girard), music, noise and society (Jacques Attali), and the affect-based charme of music (Vladimir Jankélévitch). Thus, in addition to the subjective mood (Stimmung) of emptiness and meaninglessness, the philosophical concept of anxiety also refers to a state of disorder and chaos in general; for instance, to noise in the realm of sound and total (social) violence at the level of society.
In this study, music is approached as conveying the existentially crucial human compulsion to signify, i.e. to organize, chaos. In music, this happens primarily at the immediate level of experience, i.e. in affectivity, and also in relation to all of the aforementioned dimensions (sound, society, consciousness, and so on). Thus, music's existential-ontological meaning in human existence, Dasein, lies in its ability to reveal different orders of existence as such. Indeed, this makes music the art of anxiety: more precisely, music can be existentially significant at the level of moods. The study proceeds from outlining the relevance of phenomenology and Heidegger's philosophy in musicology to the philosophical development of a theory of music as the art of anxiety. The theory is developed further through the study of three selected musical phenomena: the concept of a musical work, guitar smashing in the performance tradition of rock music, and Erik Bergman's orchestral work Colori ed improvvisazioni. The first example illustrates the level of the individual human subject in music as the art of anxiety, as a means of signifying chaos, while the second example focuses on the collective need to socio-culturally channel violence. The third example, being music-analytical, studies contemporary music's ability to mirror the structures of anxiety at the level of a specific musical text. The selected examples illustrate that, in addition to the philosophical orientation, the research also contributes to music analysis, popular music studies, and the cultural-critical study of music. Key words: music, anxiety, phenomenology, Martin Heidegger, ontology, guitar smashing, Erik Bergman, musical work, affectivity, Stimmung, René Girard

Relevance:

80.00%

Publisher:

Abstract:

Microbial activity in soils is the main source of nitrous oxide (N2O) to the atmosphere. Nitrous oxide is a strong greenhouse gas in the troposphere and participates in ozone-destructive reactions in the stratosphere. The constant increase in its atmospheric concentration, as well as uncertainties in the known sources and sinks of N2O, underline the need to better understand the processes and pathways of N2O in terrestrial ecosystems. This study aimed at quantifying N2O emissions from soils in northern Europe and at investigating the processes and pathways of N2O from agricultural and forest ecosystems. Emissions were measured in forest ecosystems, agricultural soils and a landfill, using the soil gradient, chamber and eddy covariance methods. Processes responsible for N2O production, and the pathways of N2O from the soil to the atmosphere, were studied in the laboratory and in the field. These ecosystems were chosen for their potential importance to the national and global budget of N2O. Laboratory experiments with boreal agricultural soils revealed that N2O production increases drastically with soil moisture content, and that the contribution of the nitrification and denitrification processes to N2O emissions depends on soil type. A laboratory study with beech (Fagus sylvatica) seedlings demonstrated that trees can serve as conduits for N2O from the soil to the atmosphere. If this mechanism is important in forest ecosystems, the current emission estimates from forest soils may underestimate the total N2O emissions from forest ecosystems. Further field and laboratory studies are needed to evaluate the importance of this mechanism in forest ecosystems. The emissions of N2O from northern forest ecosystems and a municipal landfill were highly variable in time and space. The emissions of N2O from boreal upland forest soil were among the smallest reported in the world.
Despite the low emission rates, the soil gradient method revealed a clear seasonal variation in N2O production. The organic topsoil was responsible for most of the N2O production and consumption in this forest soil. Emissions from the municipal landfill were one to two orders of magnitude higher than those from agricultural soils, which are the most important source of N2O to the atmosphere. Due to their small areal coverage, landfills only contribute minimally to national N2O emissions in Finland. The eddy covariance technique was demonstrated to be useful for measuring ecosystem-scale emissions of N2O in forest and landfill ecosystems. Overall, more measurements and integration between different measurement techniques are needed to capture the large variability in N2O emissions from natural and managed northern ecosystems.

Relevance:

80.00%

Publisher:

Abstract:

Terminal oxidases are the final proteins of the respiratory chain in eukaryotes and some bacteria. They catalyze most of the biological oxygen consumption on Earth by aerobic organisms. During the catalytic reaction terminal oxidases reduce dioxygen to water and use the energy released in this process to maintain the electrochemical proton gradient by functioning as a redox-driven proton pump. This membrane gradient of protons is extremely important for cells, as it is used for many cellular processes, such as transportation of substrates and ATP synthesis. Even though the structures of several terminal oxidases are known, they are not sufficient in themselves to explain the molecular mechanism of proton pumping. In this work we have applied a complex approach using a variety of different techniques to address the properties and the mechanism of proton translocation by the terminal oxidases. The combination of direct measurements of pH changes during catalytic turnover, time-resolved potentiometric electrometry and optical spectroscopy made it possible to obtain valuable information about various aspects of oxidase functioning. We compared the oxygen binding properties of terminal oxidases from the distinct heme-copper (CcO) and cytochrome bd families and found that cytochrome bd has an affinity for oxygen three orders of magnitude higher than that of CcO. Interestingly, the difference between CcO and cytochrome bd lies not only in the higher oxygen affinity of the latter, but also in the way each of these enzymes traps oxygen during catalysis. CcO traps oxygen kinetically: the molecule of bound dioxygen is rapidly reduced before it can dissociate. Cytochrome bd, in contrast, employs an alternative mechanism of oxygen trapping: part of the redox energy is invested into tight oxygen binding, and the price paid for this is the lack of proton pumping.
A single cycle of oxygen reduction to water is characterized by translocation of four protons across the membrane. Our results make it possible to assign the pumping steps to discrete transitions of the catalytic cycle and indicate that during in vivo turnover of the oxidase these four protons are transferred, one at a time, during the P→F, F→O_H, O_H→E_H, and E_H→R transitions. At the same time, each individual proton translocation step in the catalytic cycle is not just a single reaction catalyzed by CcO, but rather a complicated sequence of interdependent electron and proton transfers. We assume that each single proton translocation cycle of CcO is assured by internal proton transfer from the conserved Glu-278 to an as yet unidentified pump site above the hemes. Delivery of a proton to the pump site serves as a driving reaction that forces the proton translocation cycle to continue.

Relevance:

80.00%

Publisher:

Abstract:

Aerosol particles play a role in the Earth's ecosystem and affect human health. A significant pathway for producing aerosol particles in the atmosphere is new particle formation, in which condensable vapours nucleate and the newly formed clusters grow by condensation and coagulation. However, this phenomenon is still not fully understood. This thesis offers insight into new particle formation from an experimental point of view. Laboratory experiments were conducted both on the nucleation process and on physicochemical properties related to new particle formation. Nucleation rate measurements are used to test nucleation theories. These theories, in turn, are used to predict nucleation rates in atmospheric conditions. However, nucleation rate measurements have proven quite difficult to conduct, as different devices can yield nucleation rates differing by several orders of magnitude for the same substances. In this thesis, work has been done to gain a greater understanding of nucleation measurements, especially those conducted in a laminar flow diffusion chamber. Systematic studies of nucleation were also made for future verification of nucleation theories. Surface tensions and densities of substances related to atmospheric new particle formation were measured. The ternary mixture sulphuric acid + ammonia + water is a proposed candidate to participate in atmospheric nucleation. Surface tensions of an alternative candidate to nucleate in boreal forest areas, sulphuric acid + dimethylamine + water, were also measured. Binary mixtures of organic acids + water are possible candidates to participate in the early growth of freshly nucleated particles. All the measured surface tensions and densities were fitted with equations, thermodynamically consistent where possible, to be easily applied to atmospheric model calculations of nucleation and the subsequent evolution of particle size.

Relevance:

80.00%

Publisher:

Abstract:

A large fraction of an XML document typically consists of text data. The XPath query language allows text search via the equal, contains, and starts-with predicates. Such predicates can be efficiently implemented using a compressed self-index of the document's text nodes. Most queries, however, contain some parts querying the text of the document, plus some parts querying the tree structure. It is therefore a challenge to choose an appropriate evaluation order for a given query that optimally leverages the execution speeds of the text and tree indexes. Here the SXSI system is introduced. It stores the tree structure of an XML document using a bit array of opening and closing brackets plus a sequence of labels, and stores the text nodes of the document using a global compressed self-index. On top of these indexes sits an XPath query engine based on tree automata. The engine uses fast counting queries on the text index to dynamically determine whether to evaluate top-down or bottom-up with respect to the tree structure. The resulting system has several advantages over existing systems: (1) on pure tree queries (without text search), such as the XPathMark queries, the SXSI system performs on par with or better than the fastest known systems, MonetDB and Qizx; (2) on queries that use text search, SXSI outperforms the existing systems by 1-3 orders of magnitude (depending on the size of the result set); and (3) with respect to memory consumption, SXSI outperforms all other systems for counting-only queries.

Relevance:

80.00%

Publisher:

Abstract:

Soon after the Bolshevik Revolution of 1917, a three-year civil war broke out in Russia. As in many other civil wars, foreign powers intervened in the conflict. Britain played a leading role in this intervention and had a significant effect on the course of the war. Without this intervention on the White side, the Bolsheviks' superiority in manpower and weaponry would have quickly overwhelmed their opponents. The aim of this dissertation is to explain the nature and role of the British intervention on the southern, and most decisive, front of the Civil War. The political decision making in London is studied as background, but the focus of the dissertation is on the actual implementation of British policy in Russia. The British military mission arrived in South Russia in late 1918 and started to provide General Denikin's White army with ample supplies. General Denikin would not have been able to build his army of more than 200,000 men or to mount his operation against Moscow without the British matériel. The British mission also organized the training and equipping of the Russian troops with British weapons, which made the material aid much more effective. Many of the British instructors took part in fighting the Bolsheviks despite the orders of their government. The study is based on primary sources produced by British departments of state and by members of the British mission and military units in South Russia. Primary sources from the Whites, including the personal collections of several key figures of the White movement and official records of the Armed Forces of South Russia, are also used to give a balanced picture of the course of events. It is possible to draw some general conclusions about the White movement and the reasons for its defeat from the study of the British intervention.
In purely material terms, the British aid placed Denikin's army in a far more favourable position than the Bolsheviks in 1919, but the White army's other military defects were numerous. The White commanders were unimaginative, their military thinking was obsolete, and they were incapable of organizing the logistics of their army. There were also fundamental defects in the morale of the White troops. In addition to all the political mistakes of Denikin's movement and a general inability to adjust to the complex situation in Revolutionary Russia, the Whites suffered a clear military defeat. In South Russia the Whites were defeated not because of a lack of British aid, but rather in spite of it.