15 results for Hindered settling

in Helda - Digital Repository of University of Helsinki


Relevance:

10.00%

Publisher:

Abstract:

This doctoral dissertation examines the description of the North as it appears in the Old English Orosius (OE Or.) in the form of the travel accounts by Ohthere and Wulfstan and a catalogue of peoples of Germania. The description is discussed in the context of ancient and early medieval textual and cartographic descriptions of the North, with a special emphasis on Anglo-Saxon sources and the intellectual context of the reign of King Alfred (871-899). This is the first time that these sources have been brought together with a multidisciplinary approach and secondary literature, including work from Scandinavia and Finland. The discussion is source-based, and archaeological theories and geographical ideas are used to support the primary evidence. This study belongs to the disciplines of early medieval literature and (cultural) history, Anglo-Saxon studies, English philology, and historical geography. The OE Or. was probably part of Alfred's educational campaign, which conveyed royal ideology to the contemporary elite. The accounts and catalogue are original interpolations which represent a unique historical source for the Viking Age. They contain unparalleled information about peoples and places in Fennoscandia and the southern Baltic and sailing voyages to the White Sea, the Danish lands, and the Lower Vistula. The historical-philological analysis reveals an emphasis on wealth and property, rank, luxury goods, settlement patterns, and territorial divisions. Trade is strongly implied by the mentions of central places and northern products, such as walrus ivory. The references to such peoples as the Finnas, the Cwenas, and the Beormas appear in connection with information about geography and subsistence in the far North. Many of the topics in the accounts relate to Anglo-Saxon aristocratic culture and interests. The accounts focus on the areas associated with the Northmen, the Danes and the Este. These areas resonated in the Anglo-Saxon geographical imagination: the Anglo-Saxons were curious about the northern margin of the world, their own continental ancestry and the geography of their homeland of Angeln, and they had an interest in the Goths and their connection with the southern Baltic in mythogeography. The non-judgemental representation of the North as a generally peaceful and relatively normal place is related to Alfredian and Orosian ideas about the unity and spreading of Christendom, and to desires for unity among the Germani and for peace with the Vikings, who were settling in England. These intellectual contexts reflect the innovative and organizational forces of Alfred's reign. The description of the North in the OE Or. can be located in the context of the Anglo-Saxon worldview and geographical mindset. It mirrors the geographical curiosity expressed in other Anglo-Saxon sources, such as the poem Widsith and the Anglo-Saxon mappa mundi. The northern section of this early eleventh-century world map is analyzed in detail here for the first time. It is suggested that the section depicts the North Atlantic and the Scandinavian Peninsula. The survey of ancient and early medieval sources provides a comparative context for the OE Or. In this material, produced by such authors as Strabo, Pliny, Tacitus, Jordanes, and Rimbert, the significance of the North was related to the search for and definition of the northern edge of the world, universal accounts of the world, the northern homeland in the origin stories of the gentes, and Carolingian expansion and missionary activity. These frameworks were transmitted to Anglo-Saxon literary culture, where the North occurs in the context of the definition of Britain's place in the world.

Relevance:

10.00%

Publisher:

Abstract:

Russian Karelians were one of the small peasant nations of the Russian Empire that began to identify themselves as nations during the late imperial period. At that historical moment Russian Karelia fell between an economically undeveloped empire and the rapidly modernizing borderland of Finland. The economic and cultural lure of Finland drew Karelians into the Finnish camp. This attraction was seen as a challenge to Russia and influenced the struggle between Russia and Finland for the Karelians. This struggle was waged from 1905 to 1917. This work is focused on the beginning stage of the struggle, its various phases, and their results. The confrontation extended into different dimensions (economic, political, ideological, church and cultural politics) and occurred on two levels: central and regional. Countermeasures against local nationalisms, developed much earlier for use both in Russia and in other empires, were also used in the Russian Karelian case. Economic policies were deployed to try to make relations with Russia more alluring for Karelians and to improve their economic condition. However, these efforts produced only minimal results due to the economic weakness of the empire and a lack of finances. Fear of the economic integration of the Karelians and Finns, which would have stimulated the economy of Karelia, also hindered these attempts. The further development of the Orthodox Church, the schools and the zemstvos in Karelia yielded fewer results than expected due to the economic underdevelopment of the region and the avoidance of the Finnish language. Policing measures were the most successful, as all activities of the Finns in Russian Karelia were in practice entirely halted. However, the aspiration of Russian Karelians to integrate their home districts with Finland remained a latent force that waited only for an opportunity to push to the surface again. Such a chance materialized with the Russian revolution. The Karelian question was also a part of the Russian domestic political confrontation. At the end of the 1800s, the Russian nationalist right had grown strong and increasingly gained the favor of the autocracy. The political right exploited the Karelian question in its anti-Finnish ideology and in its general resistance to the national emancipation of the minority peoples of Russia. A separate ideology was developed, focusing on the closeness of the Karelians to the "great Russian people." Simultaneously, this concept found a place in the ultramonarchist myth of the particularly close connection between the people and the tsar that was prominent in the era of Nicholas II. This myth assigned the Karelians a place amongst the "simple people" faithful to the tsar.

Relevance:

10.00%

Publisher:

Abstract:

Gender-specific division of crafts is significant in Finnish culture. The aim of craft education in comprehensive schools has been to unify education for both girls and boys during the thirty years of modern comprehensive education, but in reality craft education has diverged into two different subjects for girls and boys. Craft education has taught different things: girls have been taught things that are feminine and belong to private life, while the boys' education has been mostly directed at public and working life. At the same time girls and boys have been educated and socialized towards a certain womanhood and manhood. Craft as a hobby is also strongly gender-divided, but there are some indications of changing positions in the border areas. The starting point of this study has been the assumption that gender structures within crafts can be transformed. But it is important to know the background that creates and maintains these constructions, and the means for dismantling the structures. The main purpose of this study is to find out how gender-specific crafts are changing and how they are still maintained. My study analyses why men and women have chosen atypical hobbies, and through that I shall discuss how gendered structures can be dismantled. In the autumn of 2008 I interviewed borderline breakers within crafts: men who had an interest in textile crafts and women who had an interest in technical work. The informants of this study were 20-31 years old and studied behavioural sciences. The data were collected in 12 theme interviews and analysed using document analysis. The dismantling of the gender divide in crafts is slow, but new borders are drawn all the time. The informants had started textile and technical handicrafts soon after comprehensive school or during their studies at university. A main motivation for the new hobby has been the possibility to study for a teacher's degree with technology or textile studies as a main or a secondary subject. The gender-specific division of crafts is constructed early in childhood as different skills are taught to girls and boys. School teaches gender-specific skills, yet the resulting differences in the skills of girls and boys have not been taken into account. Instead, it has been assumed throughout comprehensive school that both girls and boys share certain basic skills, and attention has not been paid to the differences between genders. At university the same assumption has persisted. Differences in skill levels between men and women are substantial, and this may have hindered a move by women to technical studies and by men to textile studies. Hence an assumption of naturally gendered crafts has developed.

Relevance:

10.00%

Publisher:

Abstract:

Drug Analysis without Primary Reference Standards: Application of LC-TOFMS and LC-CLND to Biofluids and Seized Material. Primary reference standards for new drugs, metabolites, designer drugs or rare substances may not be obtainable within a reasonable period of time, or their availability may be hindered by extensive administrative requirements. Standards are usually costly and may have a limited shelf life. Finally, many compounds are not available commercially and sometimes not at all. A new approach within forensic and clinical drug analysis involves substance identification based on accurate mass measurement by liquid chromatography coupled with time-of-flight mass spectrometry (LC-TOFMS) and quantification by LC coupled with chemiluminescence nitrogen detection (LC-CLND), which has an equimolar response to nitrogen. Formula-based identification relies on the fact that the accurate mass of an ion from a chemical compound corresponds to the elemental composition of that compound. Single-calibrant nitrogen-based quantification is feasible with a nitrogen-specific detector, since approximately 90% of drugs contain nitrogen. A method was developed for toxicological drug screening in 1 ml urine samples by LC-TOFMS. A large target database of exact monoisotopic masses was constructed, representing the elemental formulae of reference drugs and their metabolites. Identification was based on matching the sample component's measured parameters with those in the database, including accurate mass and retention time, if available. In addition, an algorithm for isotopic pattern match (SigmaFit) was applied. Differences in ion abundance in urine extracts did not affect the mass accuracy or the SigmaFit values. For routine screening practice, a mass tolerance of 10 ppm and a SigmaFit tolerance of 0.03 were established. Seized street drug samples were analysed directly by LC-TOFMS and LC-CLND, using a dilute-and-shoot approach. In the quantitative analysis of amphetamine, heroin and cocaine findings, the mean relative difference between the results of LC-CLND and the reference methods was only 11%. In blood specimens, liquid-liquid extraction recoveries for basic lipophilic drugs were first established, and the validity of the generic extraction recovery-corrected single-calibrant LC-CLND was then verified with proficiency test samples. The mean accuracy was 24% and 17% for plasma and whole blood samples, respectively, all results falling within the confidence range of the reference concentrations. Further, metabolic ratios for the opioid drug tramadol were determined in a pharmacogenetic study setting. Extraction recovery estimation, based on model compounds with similar physicochemical characteristics, produced clinically feasible results without reference standards.
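
The formula-based identification step can be illustrated with a minimal sketch: a measured accurate mass is matched against a target database and accepted if the relative mass error falls within the 10 ppm tolerance mentioned above. The database, compound names and masses below are illustrative assumptions, not the dissertation's actual software or target list.

```python
# Minimal sketch of accurate-mass matching against a target database.
# The compounds and exact masses are illustrative; the 10 ppm tolerance
# follows the abstract above.

TARGET_DB = {
    # name: exact monoisotopic mass of the protonated molecule [M+H]+ (Da)
    "amphetamine": 136.1121,
    "cocaine": 304.1543,
    "tramadol": 264.1958,
}

def ppm_error(measured: float, theoretical: float) -> float:
    """Relative mass error in parts per million."""
    return (measured - theoretical) / theoretical * 1e6

def match_mass(measured_mz: float, tol_ppm: float = 10.0):
    """Return database entries whose exact mass lies within the ppm tolerance."""
    return [(name, round(ppm_error(measured_mz, exact), 2))
            for name, exact in TARGET_DB.items()
            if abs(ppm_error(measured_mz, exact)) <= tol_ppm]

if __name__ == "__main__":
    # A measured ion at m/z 304.1529 lies within 10 ppm of cocaine [M+H]+
    print(match_mass(304.1529))
```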

Relevance:

10.00%

Publisher:

Abstract:

This study analysed whether the land tenure insecurity problem has led to a decline in long-term land improvements (liming and phosphorus fertilization) under the Common Agricultural Policy (CAP) and Nordic production conditions in European Union (EU) countries such as Finland. The results suggest that under traditional cash lease contracts, which are encouraged by the existing land leasing regulations and agricultural subsidy programs, the land tenure insecurity problem on leased land reduces land improvements that have a long pay-back period. In particular, soil pH was found to be significantly lower on land cultivated under a lease contract compared to land owned by the farmers themselves. The results also indicate that the decline in land improvements could not be reversed by land markets, because land owners would otherwise have carried out land improvements even when not farming themselves. To reveal the causality between land tenure and land improvements, the dynamic optimisation problem was solved by a stochastic dynamic programming routine with known parameters for one-period returns and transition equations. The model parameters represented Finnish soil quality and production conditions. The decision rules were solved for alternative likelihood scenarios over the continuation of the fixed-term lease contract. The results suggest that as the probability of non-renewal of the lease contract increases, farmers quickly reduce investments in irreversible land improvements and, thereafter, yields gradually decline. The simulations highlighted the observed trends of a decline in land improvements on land parcels that are cultivated under lease contracts. Insecure land tenure has thus resulted in the neglect of land improvement in Finland. This study also aimed to analyse whether these challenges could be resolved by a tax policy that encourages land sales. Using Finnish data, a real estate tax and a temporary relaxation of the taxation of capital gains showed some potential for the restructuring of land ownership. Potential sellers who could not be revealed by traditional logit models were identified with the latent class approach. Those landowners with an intention to sell even without a policy change were sensitive to a temporary relaxation of the taxation of capital gains. In the long term, productivity and especially productivity growth are necessary conditions for the survival of farms and the food industry in Finland. Technical progress was found to drive the increase in productivity. Scale had only a moderate effect, and for the whole study period (1976–2006) the effect was close to zero. Total factor productivity (TFP) increased, depending on the model, by 0.6–1.7% per year. The results demonstrated that the increase in productivity was hindered by the policy changes introduced in 1995. The results also indicate that the increase in land leasing is connected to these policy changes. Land institutions and land tenure questions are essential in agricultural and rural policies on all levels, from local to international. Land ownership and land titles are commonly tied to fundamental political, economic and social questions. A fair resolution calls for innovative and new solutions on both national and international levels. However, this seems to be a problem when considering the application of EU regulations to member states inheriting divergent landownership structures and farming cultures. The contribution of this study lies in describing the consequences of fitting EU agricultural policy to Finnish agricultural land tenure conditions and heritage.
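
The lease-renewal mechanism described above can be sketched as a small stochastic dynamic programme: each period the farmer decides whether to lime, liming pays off only through future yields, and the continuation value is earned only if the lease is renewed. The discretization and all parameter values below are illustrative assumptions, not the study's calibrated Finnish figures.

```python
# Minimal sketch of the lease-renewal logic: value iteration over a
# discretized soil pH state, where the tenant keeps farming next period
# only with probability p_renew. All numbers are illustrative.

N_PH = 11          # soil pH states (index 0 = poor, 10 = good)
DISCOUNT = 0.95    # annual discount factor
LIME_COST = 3.0    # one-period cost of liming
YIELD_VALUE = 1.0  # revenue per pH state unit

def solve(p_renew: float, n_iter: int = 500):
    """Return, for each pH state, whether liming is optimal."""
    V = [0.0] * N_PH
    for _ in range(n_iter):
        new_V, policy = [], []
        for s in range(N_PH):
            s_down = max(s - 1, 0)           # pH drifts down without liming
            s_up = min(s + 1, N_PH - 1)      # and up with liming
            # the continuation value is earned only if the lease is renewed
            v_no = YIELD_VALUE * s + DISCOUNT * p_renew * V[s_down]
            v_lime = YIELD_VALUE * s - LIME_COST + DISCOUNT * p_renew * V[s_up]
            policy.append(v_lime > v_no)
            new_V.append(max(v_no, v_lime))
        V = new_V
    return policy

if __name__ == "__main__":
    # With a secure lease liming is worthwhile in most states; as renewal
    # becomes unlikely the irreversible investment is abandoned.
    print(sum(solve(0.95)), "states with liming when p_renew = 0.95")
    print(sum(solve(0.30)), "states with liming when p_renew = 0.30")
```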

Relevance:

10.00%

Publisher:

Abstract:

The importance of intermolecular interactions to chemistry, physics, and biology is difficult to overestimate. Without intermolecular forces, condensed phase matter could not form. The simplest way to categorize different types of intermolecular interactions is to describe them as van der Waals and hydrogen-bonded (H-bonded) interactions. In the H-bond, the intermolecular interaction appears between a positively charged hydrogen atom and electronegative fragments, and it originates from strong electrostatic interactions. H-bonding is important when considering the properties of condensed phase water and in many biological systems, including the structure of DNA and proteins. Vibrational spectroscopy is a useful tool for studying complexes and the solvation of molecules. The vibrational frequency shift has been used to characterize complex formation. In an H-bonded system A∙∙∙H-X (A and X are acceptor and donor species, respectively), the vibrational frequency of the H-X stretching vibration usually decreases from its value in free H-X (red-shift). This frequency shift has been used as evidence for H-bond formation, and the magnitude of the shift has been used as an indicator of the H-bonding strength. In contrast to this normal behavior are the blue-shifting H-bonds, in which the H-X vibrational frequency increases upon complex formation. In the last decade, there has been active discussion regarding these blue-shifting H-bonds. Noble gases have been considered inert due to their limited reactivity with other elements. In the early 1930s, Pauling predicted the stable noble-gas compounds XeF6 and KrF6. It was not until three decades later, in 1962, that Neil Bartlett synthesized the first noble-gas compound, XePtF6. A renaissance of noble-gas chemistry began in 1995 with the discovery of noble-gas hydride molecules at the University of Helsinki. The first hydrides were HXeCl, HXeBr, HXeI, HKrCl, and HXeH. These molecules have the general formula HNgY, where H is a hydrogen atom, Ng is a noble-gas atom (Ar, Kr, or Xe), and Y is an electronegative fragment. At present, this class of molecules comprises 23 members, including both inorganic and organic compounds. The first and only argon-containing neutral chemical compound, HArF, was synthesized in 2000 and its properties have since been investigated in a number of studies. A helium-containing chemical compound, HHeF, was predicted computationally, but its lifetime has been predicted to be severely limited by hydrogen tunneling. Helium and neon are the only elements in the periodic table that do not form neutral, ground-state molecules. A noble-gas matrix is a useful medium in which to study unstable and reactive species, including ions. A solvated proton forms a centrosymmetric NgHNg+ (Ng = Ar, Kr, and Xe) structure in a noble-gas matrix, and this is probably the simplest example of a solvated proton. Interestingly, the hypothetical NeHNe+ cation is isoelectronic with the water-solvated proton H5O2+ (Zundel ion). In addition to the NgHNg+ cations, the isoelectronic YHY- (Y = halogen atom or pseudohalogen fragment) anions have been studied with the matrix-isolation technique. These species have been known to exist in alkali metal salts (YHY)-M+ (M = alkali metal, e.g. K or Na) for more than 80 years. Hydrated HF forms the FHF- structure in aqueous solutions, and these ions participate in several important chemical processes. In this thesis, studies of the intermolecular interactions of HNgY molecules and centrosymmetric ions with various species are presented. The HNgY complexes show unusual spectral features, e.g. large blue-shifts of the H-Ng stretching vibration upon complexation. It is suggested that the blue-shift is a normal effect for these molecules, and that it originates from the enhanced (HNg)+Y- ion-pair character upon complexation. It is also found that the HNgY molecules are energetically stabilized in the complexed form, and this effect is demonstrated computationally for the HHeF molecule. The NgHNg+ and YHY- ions also show blue-shifts in their asymmetric stretching vibration upon complexation with nitrogen. Additionally, the matrix site structure and hindered rotation (libration) of the HNgY molecules were studied. The librational motion is a much-discussed solid-state phenomenon, and the HNgY molecules embedded in noble-gas matrices are good model systems for studying this effect. The formation mechanisms of the HNgY molecules and the decay mechanism of the NgHNg+ cations are discussed. A new electron tunneling model for the decay of NgHNg+ absorptions in noble-gas matrices is proposed. Studies of the NgHNg+∙∙∙N2 complexes support this electron tunneling mechanism.

Relevance:

10.00%

Publisher:

Abstract:

Can war be justified? Expressions of opinion by the general assemblies of the World Council of Churches on the question of war as a method of settling conflicts. The purpose of this study is to describe and analyse the expressions of opinion recorded in the documents of the general assemblies of the WCC during the Cold War period from 1948 to 1983 on the use of war as a method of settling international and national conflicts. The main sources are the official reports of the WCC's assemblies during the years 1948 to 1983. This study divides the discussions into three periods. The first period (1949-1968) is dominated by the pressures arising from the Second World War. Experiences of the war led the assemblies of the WCC to the conclusion that modern warfare as a method of settling conflicts should be rejected. Modern war was contrary to God's purposes and the whole meaning of creation, said the assembly. Although the WCC rejected modern war, it left open the possibility of conflict where principles of just war may be practised. The question of war was also linked to the state and its function, which led to the need to create a politically neutral doctrine for the socio-ethical thinking of churches and of the WCC itself. The doctrine was formulated using the words "responsible society". The question of war and socio-ethical thinking were on the WCC's agenda throughout the first period. Another issue that had an influence on the first period was the increasing role of Third World countries. This new dimension also brought new aspects to the question of war and violence. The second period (1968-1975) presented greater challenges to the WCC, especially in traditional western countries. The Third World, political activity in the socialist world and ideas of revolution were discussed. The WCC's fourth Assembly in Uppsala was challenged by these new ideas of revolution. The old doctrine of "responsible society" was seen by many participants as unsuitable in the modern world, especially for Third World countries. The situation of a world governed by armaments, causing social and economic disruption, was felt by churches to be problematic. The peace movement gathered pace and attention. There was pressure to see armed force as an option on the way to a new world order. The idea of a just war was challenged by that of just revolution. These ideas of revolution did not receive support from the Uppsala Assembly, but they pressured the WCC to reconsider its socio-ethical thinking. Revolution was seen as a possibility, but only when it could be peaceful. In the Nairobi Assembly the theme of a just, participatory and sustainable society provided yet another viewpoint, dealing with the life of the world and its problems as a whole. The third period (1975-1983) introduced a new, alternative doctrine, the "JPIC Process" (justice, peace and the integrity of creation), for social thinking in the WCC. The WCC no longer wanted to discuss war or poverty as separate questions, but wanted to combine all aspects of life to see the impact of an arms-governed world on humankind. Thus, during the last period, discussions focused on socio-ethical questions, where war and violence were only parts of a larger problem. Through the new JPIC Process, the WCC's Assembly in Vancouver looked for a new world, one without violence, in all aspects of life. Despite differing opinions in socio-ethical thinking, the churches in the WCC agreed that modern warfare cannot be regarded as acceptable or just. The old idea of a "just war" still had a place, but it was not seen by all as a valid principle. As a result, the WCC viewed war as a last resort to be employed only when all other methods had failed. Such a war would have to secure peace and justice for all. In the discussions there was a strong political east-west divide and, during the last two decades, a north-south divide as well. The effect of the Cold War was obvious. In the background to the theological positions were two main concepts: the idea of God's activity in man's history through the so-called regiments, and the concept of the Kingdom of God on Earth.

Relevance:

10.00%

Publisher:

Abstract:

Population dynamics are generally viewed as the result of intrinsic (purely density dependent) and extrinsic (environmental) processes. Both components, and potential interactions between the two, have to be modelled in order to understand and predict the dynamics of natural populations, a topic that is of great importance in population management and conservation. This thesis focuses on modelling environmental effects in population dynamics and on how the effects of potentially relevant environmental variables can be statistically identified and quantified from time series data. Chapter I presents some useful models of multiplicative environmental effects for unstructured density dependent populations. The presented models can be written as standard multiple regression models that are easy to fit to data. Chapters II-IV constitute empirical studies that statistically model environmental effects on the population dynamics of several migratory bird species with different life history characteristics and migration strategies. In Chapter II, spruce cone crops are found to have a strong positive effect on the population growth of the great spotted woodpecker (Dendrocopos major), while cone crops of pine, another important food resource for the species, do not effectively explain population growth. The study compares rate- and ratio-dependent effects of cone availability, using state-space models that distinguish between process and observation error in the time series data. Chapter III shows how drought, in combination with settling behaviour during migration, produces asymmetric spatially synchronous patterns of population dynamics in North American ducks (genus Anas). Chapter IV investigates the dynamics of a Finnish population of skylark (Alauda arvensis) and points out effects of rainfall and habitat quality on population growth. Because the skylark time series and some of the environmental variables included show strong positive autocorrelation, the statistical significances are calculated using a Monte Carlo method in which random autocorrelated time series are generated. Chapter V is a simulation-based study showing that ignoring observation error in analyses of population time series data can bias the estimated effects and measures of uncertainty if the environmental variables are autocorrelated. It is concluded that the use of state-space models is an effective way to reach more accurate results. In summary, there are several biological assumptions and methodological issues that can affect the inferential outcome when estimating environmental effects from time series data and that therefore need special attention. The functional form of the environmental effects and potential interactions between environment and population density are important to deal with. Other issues that should be considered are assumptions about density dependent regulation, modelling potential observation error, and, when needed, accounting for spatial and/or temporal autocorrelation.
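
As a rough illustration of the Chapter I idea that multiplicative environmental effects reduce to standard regression terms, the sketch below simulates a density dependent population on the log scale (an assumed Gompertz-type form, not the thesis's actual models) and recovers the environmental coefficient by least squares.

```python
# Minimal sketch (assumed Gompertz-type form, not the thesis's exact model):
# on the log scale a multiplicative environmental effect becomes an ordinary
# regression term, log(N[t+1]/N[t]) = a + b*log(N[t]) + c*E[t] + noise,
# which can be fitted by least squares.
import numpy as np

rng = np.random.default_rng(1)

T = 60
a_true, b_true, c_true = 2.0, -0.4, 0.8
E = rng.normal(size=T)                    # e.g. a standardized cone-crop index
logN = np.empty(T + 1)
logN[0] = np.log(50.0)
for t in range(T):
    logN[t + 1] = logN[t] + a_true + b_true * logN[t] + c_true * E[t] \
                  + rng.normal(scale=0.1)  # process noise

r = np.diff(logN)                          # observed log growth rates
X = np.column_stack([np.ones(T), logN[:-1], E])
coef, *_ = np.linalg.lstsq(X, r, rcond=None)
print("estimated (a, b, c):", np.round(coef, 2))
```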

Relevance:

10.00%

Publisher:

Abstract:

Molecular motors are proteins that convert chemical energy into mechanical work. The viral packaging ATPase P4 is a hexameric molecular motor that translocates RNA into preformed viral capsids. P4 belongs to the ubiquitous class of hexameric helicases. Although its structure is known, the mechanism of RNA translocation remains elusive. Here we present a detailed kinetic study of nucleotide binding, hydrolysis, and product release by P4. We propose a stochastic-sequential cooperative model to describe the coordination of ATP hydrolysis within the hexamer. In this model the apparent cooperativity is a result of hydrolysis stimulation by ATP and RNA binding to neighboring subunits rather than cooperative nucleotide binding. Simultaneous interaction of neighboring subunits with RNA makes the otherwise random hydrolysis sequential and processive. Further, we use hydrogen/deuterium exchange detected by high resolution mass spectrometry to visualize P4 conformational dynamics during the catalytic cycle. Concerted changes of exchange kinetics reveal a cooperative unit that dynamically links ATP binding sites and the central RNA binding channel. The cooperative unit is compatible with the structure-based model in which translocation is effected by conformational changes of a limited protein region. Deuterium labeling also discloses the transition state associated with RNA loading which proceeds via opening of the hexameric ring. Hydrogen/deuterium exchange is further used to delineate the interactions of the P4 hexamer with the viral procapsid. P4 associates with the procapsid via its C-terminal face. The interactions stabilize subunit interfaces within the hexamer. The conformation of the virus-bound hexamer is more stable than the hexamer in solution, which is prone to spontaneous ring openings. We propose that the stabilization within the viral capsid increases the packaging processivity and confers selectivity during RNA loading. Finally, we use single molecule techniques to characterize P4 translocation along RNA. While the P4 hexamer encloses RNA topologically within the central channel, it diffuses randomly along the RNA. In the presence of ATP, unidirectional net movement is discernible in addition to the stochastic motion. The diffusion is hindered by activation energy barriers that depend on the nucleotide binding state. The results suggest that P4 employs an electrostatic clutch instead of cycling through stable, discrete, RNA binding states during translocation. Conformational changes coupled to ATP hydrolysis modify the electrostatic potential inside the central channel, which in turn biases RNA motion in one direction. Implications of the P4 model for other hexameric molecular motors are discussed.

Relevance:

10.00%

Publisher:

Abstract:

To a large extent, lakes can be described with a one-dimensional approach, as their main features can be characterized by the vertical temperature profile of the water. The development of the profile during the year follows the seasonal climate variations. Depending on conditions, lakes become stratified during the warm summer. In autumn, overturn occurs as the water cools, and an ice cover forms. Typically, the water is inversely stratified under the ice, and another overturn occurs in spring after the ice has melted. Features of this circulation have been used in studies to distinguish between lakes in different areas, as a basis for observation systems, and even as climate indicators. Numerical models can be used to calculate the temperature in the lake on the basis of the meteorological input at the surface. The simplest form is to solve only the surface temperature. The depth of the lake affects heat transfer, together with other morphological features such as the shape and size of the lake. The surrounding landscape also affects the formation of the meteorological fields over the lake and the energy input. For small lakes, shading by the shores has an effect both over the lake and inside the water body, bringing limitations to the one-dimensional approach. A two-layer model gives an approximation of the basic stratification in the lake. A turbulence model can simulate the vertical temperature profile in a more detailed way. If the shape of the temperature profile is very abrupt, vertical transfer is hindered, which has many important consequences for lake biology. The one-dimensional modelling approach was successfully studied by comparing a one-layer model, a two-layer model and a turbulence model. The turbulence model was applied to lakes with different sizes, shapes and locations. Lake models need data from the lakes for model adjustment. The use of meteorological input data on different scales was analysed, ranging from momentary turbulent changes over the lake to synoptic data at three-hour intervals. Data from about the past 100 years were used on the mesoscale, at a range of about 100 km, together with climate change scenarios for future changes. Increasing air temperature typically increases the water temperature in the epilimnion and decreases ice cover. Lake ice data were used for modelling different kinds of lakes, and they were also analysed statistically in a global context. The results were also compared with the results of a hydrological watershed model and with data from very small lakes for seasonal development.
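
A minimal sketch of the one-layer idea is given below, under the assumption of simple Newtonian relaxation toward air temperature with lake depth setting the thermal inertia; the functional form and parameters are illustrative, not the models used in the study.

```python
# Minimal sketch of the one-layer idea (assumed Newtonian relaxation, not the
# study's actual model): lake surface temperature is nudged toward the air
# temperature with a time constant that grows with lake depth.
import math

def simulate_surface_temperature(air_temp, depth_m, tau_per_m=2.0, t_start=4.0):
    """Daily surface water temperatures (deg C) from daily air temperatures."""
    tau = tau_per_m * depth_m        # relaxation time in days (illustrative)
    tw, out = t_start, []
    for ta in air_temp:
        tw += (ta - tw) / tau        # relax toward the forcing
        tw = max(tw, 0.0)            # liquid water cannot cool below 0 deg C
        out.append(tw)
    return out

if __name__ == "__main__":
    # Idealized annual air temperature cycle
    air = [4.0 + 10.0 * math.sin(2 * math.pi * (d - 110) / 365) for d in range(365)]
    shallow = simulate_surface_temperature(air, depth_m=3.0)
    deep = simulate_surface_temperature(air, depth_m=30.0)
    # The deeper lake lags the forcing and has a damped annual cycle
    print(round(max(shallow), 1), round(max(deep), 1))
```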

Relevance:

10.00%

Publisher:

Abstract:

This research has been prompted by an interest in the atmospheric processes of hydrogen. The sources and sinks of hydrogen are important to know, particularly if hydrogen becomes more common as a replacement for fossil fuels in combustion. Hydrogen deposition velocities (vd) were estimated by applying chamber measurements, a radon tracer method and a two-dimensional model. These three approaches were compared with each other to discover the factors affecting the soil uptake rate. A static closed-chamber technique was introduced to determine the hydrogen deposition velocity values in an urban park in Helsinki and at a rural site at Loppi. A three-day chamber campaign for soil uptake estimation was held at a remote site at Pallas in 2007 and 2008. The atmospheric mixing ratio of molecular hydrogen has also been measured by a continuous method in Helsinki in 2007-2008 and at Pallas from 2006 onwards. The mean vd values measured in the chamber experiments in Helsinki and Loppi were between 0.0 and 0.7 mm s-1. The ranges of the results with the radon tracer method and the two-dimensional model were 0.13-0.93 mm s-1 and 0.12-0.61 mm s-1, respectively, in Helsinki. The vd values in the three-day campaign at Pallas were 0.06-0.52 mm s-1 (chamber) and 0.18-0.52 mm s-1 (radon tracer method and two-dimensional model). At Kumpula, the radon tracer method and the chamber measurements produced higher vd values than the two-dimensional model. The results of all three methods were close to each other between November and April, except for the chamber results from January to March, when the soil was frozen. The hydrogen deposition velocity values of all three methods were compared with one-week cumulative rain sums. Precipitation increases the soil moisture, which decreases the soil uptake rate. The measurements made in snow seasons showed that a thick snow layer also hindered gas diffusion, lowering the vd values. The H2 vd values were compared to the snow depth, and a decaying exponential fit was obtained as a result. During a prolonged drought in summer 2006, soil moisture values were lower than in the other summer months between 2005 and 2008, and under these conditions high chamber vd values were measured. The mixing ratio of molecular hydrogen shows a seasonal variation. The lowest atmospheric mixing ratios were found in the late autumn, when high deposition velocity values were still being measured. The carbon monoxide (CO) mixing ratio was also measured. Hydrogen and carbon monoxide are highly correlated in an urban environment, due to the emissions originating from traffic. After correction for the soil deposition of H2, the slope was 0.49±0.07 ppb (H2) / ppb (CO). Using the corrected hydrogen-to-carbon-monoxide ratio, the total hydrogen load emitted by Helsinki traffic in 2007 was 261 t (H2) a-1. Hydrogen, methane and carbon monoxide are connected with each other through the atmospheric methane oxidation process, in which formaldehyde is produced as an important intermediate. The photochemical degradation of formaldehyde produces hydrogen and carbon monoxide as end products. Examination of back-trajectories revealed long-range transport of carbon monoxide and methane. The trajectories can be grouped by applying cluster and source analysis methods, so that natural and anthropogenic emission sources can be separated by analysing trajectory clusters.
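
The static-chamber estimate of the deposition velocity can be sketched as follows, assuming that soil uptake makes the H2 mixing ratio in the closed chamber decay exponentially; the numbers are synthetic and the form is an illustration, not the thesis's actual protocol.

```python
# Minimal sketch of a static-chamber deposition velocity estimate (assumed
# exponential-decay form, synthetic numbers): soil uptake makes the H2 mixing
# ratio decay as C(t) = C0 * exp(-(A/V) * vd * t), so vd follows from the
# slope of ln C against time and the effective chamber height V/A.
import numpy as np

def deposition_velocity(times_s, mixing_ratio_ppb, height_m):
    """Deposition velocity in mm/s for a chamber of effective height V/A."""
    slope, _ = np.polyfit(times_s, np.log(mixing_ratio_ppb), 1)
    return -slope * height_m * 1e3   # (1/s) * m -> mm/s

if __name__ == "__main__":
    # Synthetic 20-minute closure, effective height 0.2 m, true vd = 0.3 mm/s
    t = np.arange(0, 1201, 60.0)
    c = 530.0 * np.exp(-(0.3e-3 / 0.2) * t)
    print(round(deposition_velocity(t, c, 0.2), 2), "mm/s")
```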

Relevance:

10.00%

Publisher:

Abstract:

The subjects of this study are Narcotics Anonymous (NA), a non-profit, peer-support-based fellowship, its recovery programme, and the former drug addicts who consider themselves members of the fellowship. The study data consist of episodic interviews (n=24) and questionnaires (n=212). In the collection of the questionnaire data, survey research methods had to be applied judiciously. This study analyses NA members' backgrounds and their substance abuse and treatment histories, as well as factors that have contributed to or hindered their bonding with NA. A recovery model is presented that stems from NA's written and oral tradition and which has been conceptualised into NA's recovery theory. At its simplest, NA's recovery theory can be described in two sentences: 1) There are drug dependent addicts who have an addiction disease. 2) Through an NA way of life, recovery is possible. In this study, addiction and the addiction disease are described through recovery stories shared at NA. The study also describes how the way of life offered by NA supports recovery from drug addiction, what kind of way of life recovering addicts have adopted, and how they have done so. It also presents results that, based on the study data, emerge from participation in the NA programme, and describes how the NA recovery theory works in practice, i.e. how NA members utilise the tools provided by the fellowship and how the lives of recovering addicts change during their membership. Furthermore, this study also discusses criticism of NA. According to the study, NA affects the lives of recovering drug addicts in a number of ways. People of different ages and with a variety of personal, treatment and drug abuse histories seem to benefit from membership of NA. Viewed from the outside, NA may appear strictly normative, but in practice each member can adapt the programme in the way that suits him or her best. Indeed, flexibility is one of the strengths of NA, but without more extensive knowledge of the fellowship, it is possible that the norms reflected in NA texts or the fanaticism of individual NA members may drive some people away. Due to the increasing number of NA members, the fellowship is also able to provide more alternatives. This study confirms the view that peer support is important, as well as the fact that an official treatment system is required in parallel with peer support activities. NA can never fully replace professional support, nor should it be left with sole responsibility for recovering addicts. Keywords: Narcotics Anonymous, peer support, recovery study, recovery, substance addiction, drug treatment, drugs, explorative research

Relevance:

10.00%

Publisher:

Abstract:

Conflict, Unity, Oblivion: Commemoration of the Liberation War by the Civic Guard and the Veterans' Union in 1918-1944. The Finnish Civil War ended in May 1918 in a victory for the White side. The war was named by the winners the Liberation War, and its legacy became a central theme for public commemorations during the interwar period. At the same time the experiences of the defeated were hindered from becoming a part of the official history of Finland. The commemoration of the war was related not only to the war experience but also to a national mission, which was seen as fulfilled with the independence of Finland. Although the idea of the commemoration was to form a unifying, non-political scene for the nation, the remembrance of the Liberation War continued rather than sought to reconcile the conflict of 1918. The outbreak of the war between the Soviet Union and Finland in 1939 immediately affected the memory culture. The new myth of the Miracle of the Winter War, which referred to the unity shown by the people, required a marginalization of the controversial memory of the Liberation War. This study examines, through the concepts of public memory and narrative templates, how the problematic experience of a civil war developed into a popular public commemoration. Instead of dealing with manipulative and elite-centered grandiose commemoration projects, the study focuses on the more modest local level and emphasizes the significance of local memory agents and the narrative templates of collective memory. The main subjects of the study are the Civic Guard and the Veterans' Union. Essential for the widespread movement was the development of the Civic Guard from a wartime organization into a peacetime popular movement. The guards, who identified themselves through the memories and the threats of civil war, formed a huge network of memory agents in every corner of the country. They effectively linked local memory with official memory and civil society with the state level. Only with the emergence of the right-wing veteran movement in the 1930s did tensions grow between the two levels of public memory. The study shows the diversity of the commemoration movement of the Liberation War. It was not only a result of a nation-state project and political propaganda, but also a way for local communities to identify and strengthen themselves in a time of political upheaval and uncertainty.

Relevance:

10.00%

Publisher:

Abstract:

Metabolomics is a rapidly growing research field that studies the response of biological systems to environmental factors, disease states and genetic modifications. It aims at measuring the complete set of endogenous metabolites, i.e. the metabolome, in a biological sample such as plasma or cells. Because metabolites are the intermediates and end products of biochemical reactions, metabolite compositions and metabolite levels in biological samples can provide a wealth of information on ongoing processes in a living system. Due to the complexity of the metabolome, metabolomic analysis poses a challenge to analytical chemistry. Adequate sample preparation is critical to accurate and reproducible analysis, and the analytical techniques must have high resolution and sensitivity to allow detection of as many metabolites as possible. Furthermore, as the information contained in the metabolome is immense, the data sets collected in metabolomic studies are very large. In order to extract the relevant information from such large data sets, efficient data processing and multivariate data analysis methods are needed. In the research presented in this thesis, metabolomics was used to study mechanisms of polymeric gene delivery to retinal pigment epithelial (RPE) cells. The aim of the study was to detect differences in metabolomic fingerprints between transfected cells and non-transfected controls, and thereafter to identify the metabolites responsible for the discrimination. The plasmid pCMV-β was introduced into RPE cells using the vector polyethyleneimine (PEI). The samples were analyzed using high performance liquid chromatography (HPLC) and ultra performance liquid chromatography (UPLC) coupled to a triple quadrupole (QqQ) mass spectrometer (MS). The software MZmine was used for raw data processing, and principal component analysis (PCA) was used for statistical data analysis. The results revealed differences in metabolomic fingerprints between transfected cells and non-transfected controls. However, reliable fingerprinting data could not be obtained because of low analysis repeatability. Therefore, no attempts were made to identify the metabolites responsible for the discrimination between sample groups. The repeatability and accuracy of the analyses could be improved by protocol optimization; in this study, however, optimization of the analytical methods was hindered by the very small number of samples available for analysis. In conclusion, this study demonstrates that obtaining reliable fingerprinting data is technically demanding, and the protocols need to be thoroughly optimized in order to gain information on the mechanisms of gene delivery.
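
The fingerprinting step (peak tables followed by PCA) can be illustrated with a small synthetic sketch; the sample sizes, peak counts and effect size below are invented for illustration and are not the study's data.

```python
# Minimal sketch of the fingerprint comparison step (synthetic data, not the
# study's measurements): autoscaled peak-intensity profiles from transfected
# and control samples are projected onto principal components, and a group
# difference shows up as separation along the leading component.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_peaks = 100                               # detected metabolite peaks (illustrative)

control = rng.normal(size=(6, n_peaks))     # six control cell samples
transfected = rng.normal(size=(6, n_peaks))
transfected[:, :40] += 4.0                  # strong synthetic "transfection effect"

X = np.vstack([control, transfected])
X = (X - X.mean(axis=0)) / X.std(axis=0)    # autoscale each peak

pca = PCA(n_components=2)
scores = pca.fit_transform(X)
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 2))
print("PC1 group means:", round(scores[:6, 0].mean(), 1), round(scores[6:, 0].mean(), 1))
```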

Relevance:

10.00%

Publisher:

Abstract:

Modern sample surveys started to spread after statisticians at the U.S. Bureau of the Census had developed a sampling design for the Current Population Survey (CPS) in the 1940s. A significant factor was also that digital computers became available to statisticians. In the beginning of the 1950s, the theory was documented in textbooks on survey sampling. This thesis is about the development of statistical inference for sample surveys. The idea of statistical inference was first enunciated by the French scientist P. S. Laplace. In 1781, he published a plan for a partial investigation in which he determined the sample size needed to reach the desired accuracy in estimation. The plan was based on Laplace's Principle of Inverse Probability and on his derivation of the Central Limit Theorem. These were published in a memoir in 1774, which is one of the origins of statistical inference. Laplace's inference model was based on Bernoulli trials and binomial probabilities. He assumed that populations were changing constantly, which was expressed by assuming a priori distributions for the parameters. Laplace's inference model dominated statistical thinking for a century. Sample selection in Laplace's investigations was purposive. At the meeting of the International Statistical Institute in 1894, the Norwegian Anders Kiaer presented the idea of the Representative Method for drawing samples. Its idea, which still prevails, was that the sample should be a miniature of the population. The virtues of random sampling were known, but practical problems of sample selection and data collection hindered its use. Arthur Bowley realized the potential of Kiaer's method and, in the beginning of the 20th century, carried out several surveys in the UK. He also developed the theory of statistical inference for finite populations; it was based on Laplace's inference model. R. A. Fisher's contributions in the 1920s constitute a watershed in statistical science: he revolutionized the theory of statistics. In addition, he introduced a new statistical inference model, which is still the prevailing paradigm. Its essential ideas are that samples are drawn repeatedly from the same population and that population parameters are constants. Fisher's theory did not include a priori probabilities. Jerzy Neyman adopted Fisher's inference model and applied it to finite populations, with the difference that Neyman's inference model does not include any assumptions about the distributions of the study variables. Applying Fisher's fiducial argument, he developed the theory of confidence intervals. Neyman's last contribution to survey sampling presented a theory for double sampling. This gave statisticians at the U.S. Census Bureau the central idea for developing the complex survey design for the CPS. An important criterion was to have a method in which the costs of data collection were acceptable and which provided approximately equal interviewer workloads, in addition to sufficient accuracy in estimation.
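
A modern normal-approximation analogue of the sample-size question Laplace posed (not his actual 1781 derivation) can be written in a few lines: how many units must be sampled so that an estimated proportion lies within a given margin of the truth with high probability, via the Central Limit Theorem.

```python
# A modern normal-approximation sketch of the sample-size question (not
# Laplace's actual 1781 derivation): the Central Limit Theorem gives
# n = z^2 * p * (1 - p) / e^2 for estimating a proportion p within +/- e.
import math

def required_sample_size(p: float, margin: float, z: float = 1.96) -> int:
    """Sample size for estimating a proportion to within +/- margin."""
    return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

if __name__ == "__main__":
    # Worst case p = 0.5, 95% confidence, margin of 2 percentage points
    print(required_sample_size(0.5, 0.02))   # 2401
```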