14 results for sterically hindered organotellurium
in Helda - Digital Repository of University of Helsinki
Abstract:
Russian Karelians were one of the small peasant nations of the Russian Empire that began to identify themselves as nations during the late imperial period. At that historical moment Russian Karelia fell between an economically undeveloped empire and the rapidly modernizing borderland of Finland. The economic and cultural lure of Finland drew Karelians into the Finnish camp. This attraction was seen as a challenge to Russia and influenced the struggle between Russia and Finland for the Karelians, a struggle waged from 1905 to 1917. This work is focused on the beginning stage of the struggle, its various phases, and their results. The confrontation extended into different dimensions (economic, political, ideological, church and cultural politics) and occurred on two levels: central and regional. Countermeasures against local nationalisms, developed much earlier both in Russia and in other empires, were also used in the Russian Karelian case. Economic policies were deployed to try to make relations with Russia more alluring for Karelians and to improve their economic condition. However, these efforts produced only minimal results due to the economic weakness of the empire and a lack of finances. Fear of the economic integration of the Karelians and Finns, which would have stimulated the economy of Karelia, also hindered these attempts. The further development of the Orthodox Church, the schools and the zemstvos in Karelia yielded fewer results than expected due to the economic underdevelopment of the region and the avoidance of the Finnish language. Policing measures were the most successful, as all activities in Russian Karelia by the Finns were in practice entirely halted. However, the aspiration of Russian Karelians to integrate their home districts with Finland remained a latent force that waited only for an opportunity to push to the surface again. Such a chance materialized with the Russian revolution.
The Karelian question was also a part of Russian domestic political confrontation. At the end of the 1800s, the Russian nationalist right had grown strong and increasingly gained the favor of the autocracy. The right-wing political forces exploited the Karelian question in their anti-Finnish ideology and in their general resistance to the national emancipation of the minority peoples of Russia. A separate ideology was developed, focusing on the closeness of Karelians to the "great Russian people." Simultaneously, this concept found a place in the ultramonarchist myth of the particularly close connection between the people and the tsar that was prominent in the era of Nicholas II. This myth assigned the Karelians a place amongst the "simple people" faithful to the tsar.
Abstract:
Gender-specific division of crafts is significant in Finnish culture. The aim of craft education in comprehensive schools has been to unify education for girls and boys during the thirty years of modern comprehensive education, but in reality craft education has diverged into two different subjects for girls and boys. Craft education has taught different things: girls have been taught things that are feminine and belong to private life, while the boys' education has been directed mostly at public and working life. At the same time girls and boys have been educated and socialized towards a certain womanhood and manhood. Craft as a hobby is also very gender-divided, but there are some clues that tell of changing positions in the border areas. The starting point of this study has been the assumption that gender structures within crafts can be transformed. But it is important to know the background that creates and maintains these constructions, and the means of dismantling the structures. The main purpose of this study is to find out how the gender-specific crafts are changing and how they are still maintained. My study analyses why men and women have chosen atypical hobbies, and through that I discuss how gendered structures can be dismantled. In the autumn of 2008 I interviewed borderline breakers within crafts: men who had an interest in textile crafts and women who had an interest in technical work. The informants of this study were 20-31 years old and studied behavioural sciences. The data were collected in 12 theme interviews and analysed using document analysis. The dismantling of the gender divide in crafts is slow, but new borders are drawn all the time. The informants had started textile or technical handicrafts soon after comprehensive school or during their studies at university.
A main motivation for the new hobby has been the possibility to study for a teacher's degree with technology or textile studies as a main or a secondary subject. The gender-specific division of crafts is constructed early in childhood, as different skills are taught to girls and boys. School teaches gender-specific skills, yet the differing skills of girls and boys have not been taken into account. Instead, it has been assumed throughout comprehensive school that both girls and boys share certain basic skills, and attention has not been paid to the differences between genders. At university the same assumption has persisted. The differences in skill levels between men and women are substantial, and this may have hindered a move by women into technical studies and by men into textile studies. Hence an assumption of naturally gendered crafts has developed.
Abstract:
Drug Analysis without Primary Reference Standards: Application of LC-TOFMS and LC-CLND to Biofluids and Seized Material
Primary reference standards for new drugs, metabolites, designer drugs or rare substances may not be obtainable within a reasonable period of time, or their availability may be hindered by extensive administrative requirements. Standards are usually costly and may have a limited shelf life. Finally, many compounds are not available commercially and sometimes not at all. A new approach within forensic and clinical drug analysis involves substance identification based on accurate mass measurement by liquid chromatography coupled with time-of-flight mass spectrometry (LC-TOFMS) and quantification by LC coupled with chemiluminescence nitrogen detection (LC-CLND), which possesses an equimolar response to nitrogen. Formula-based identification relies on the fact that the accurate mass of an ion from a chemical compound corresponds to the elemental composition of that compound. Single-calibrant nitrogen-based quantification is feasible with a nitrogen-specific detector, since approximately 90% of drugs contain nitrogen. A method was developed for toxicological drug screening in 1 ml urine samples by LC-TOFMS. A large target database of exact monoisotopic masses was constructed, representing the elemental formulae of reference drugs and their metabolites. Identification was based on matching the sample component's measured parameters with those in the database, including accurate mass and retention time, if available. In addition, an algorithm for isotopic pattern match (SigmaFit) was applied. Differences in ion abundance in urine extracts did not affect the mass accuracy or the SigmaFit values. For routine screening practice, a mass tolerance of 10 ppm and a SigmaFit tolerance of 0.03 were established. Seized street drug samples were analysed instantly by LC-TOFMS and LC-CLND, using a dilute-and-shoot approach.
In the quantitative analysis of amphetamine, heroin and cocaine findings, the mean relative difference between the results of LC-CLND and the reference methods was only 11%. In blood specimens, liquid-liquid extraction recoveries for basic lipophilic drugs were first established and the validity of the generic extraction recovery-corrected single-calibrant LC-CLND was then verified with proficiency test samples. The mean accuracy was 24% and 17% for plasma and whole blood samples, respectively, all results falling within the confidence range of the reference concentrations. Further, metabolic ratios for the opioid drug tramadol were determined in a pharmacogenetic study setting. Extraction recovery estimation, based on model compounds with similar physicochemical characteristics, produced clinically feasible results without reference standards.
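The formula-based identification step described above amounts to matching a measured accurate mass against a database of exact monoisotopic masses within the established 10 ppm tolerance. The following sketch illustrates just that matching logic; the database entries, names, and mass values are illustrative assumptions, not the thesis software or its target database:

```python
# Illustrative target database of exact monoisotopic masses for protonated
# ions; the values here are approximate and for demonstration only.
TARGET_DB = {
    "amphetamine [M+H]+": 136.1121,
    "tramadol [M+H]+": 264.1958,
}

def ppm_error(measured: float, theoretical: float) -> float:
    """Relative mass error in parts per million."""
    return (measured - theoretical) / theoretical * 1e6

def identify(measured_mz: float, tolerance_ppm: float = 10.0) -> list:
    """Return database entries whose exact mass matches within tolerance."""
    return [name for name, mz in TARGET_DB.items()
            if abs(ppm_error(measured_mz, mz)) <= tolerance_ppm]

# A measured m/z of 136.1125 deviates from the amphetamine entry by about
# 3 ppm, so it passes the 10 ppm screen; a mass far from any entry does not.
print(identify(136.1125))
print(identify(200.0000))
```

In the real method, retention time and the SigmaFit isotopic-pattern score further constrain the match, as the abstract notes.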
Abstract:
This study analysed whether the land tenure insecurity problem has led to a decline in long-term land improvements (liming and phosphorus fertilization) under the Common Agricultural Policy (CAP) and the Nordic production conditions of European Union (EU) countries such as Finland. The results suggest that under traditional cash lease contracts, which are encouraged by the existing land leasing regulations and agricultural subsidy programs, the land tenure insecurity problem on leased land reduces land improvements that have a long pay-back period. In particular, soil pH was found to be significantly lower on land cultivated under a lease contract than on land owned by the farmers themselves. The results also indicate that the decline in land improvements could not be reversed by land markets, because otherwise land owners would have carried out land improvements even when not farming the land themselves. To reveal the causality between land tenure and land improvements, the dynamic optimisation problem was solved by a stochastic dynamic programming routine with known parameters for one-period returns and transition equations. The model parameters represented Finnish soil quality and production conditions. The decision rules were solved for alternative likelihood scenarios over the continuation of the fixed-term lease contract. The results suggest that as the probability of non-renewal of the lease contract increases, farmers quickly reduce investments in irreversible land improvements and, thereafter, yields gradually decline. The simulations highlighted the observed trends of a decline in land improvements on land parcels cultivated under lease contracts. Land tenure has thus resulted in the neglect of land improvement in Finland. This study also aimed to analyse whether these challenges could be resolved by a tax policy that encourages land sales.
Using Finnish data, a real estate tax and a temporal relaxation of the taxation of capital gains showed some potential for the restructuring of land ownership. Potential sellers who could not be revealed by traditional logit models were identified with a latent class approach. Those landowners with an intention to sell even without a policy change were sensitive to a temporal relaxation of the taxation of capital gains. In the long term, productivity and especially productivity growth are necessary conditions for the survival of farms and the food industry in Finland. Technical progress was found to drive the increase in productivity. Scale had only a moderate effect, and over the whole study period (1976–2006) the effect was close to zero. Total factor productivity (TFP) increased, depending on the model, by 0.6–1.7% per year. The results demonstrated that the increase in productivity was hindered by the policy changes introduced in 1995. There is also evidence that the increase in land leasing is connected to these policy changes. Land institutions and land tenure questions are essential in agricultural and rural policies at all levels, from local to international. Land ownership and land titles are commonly tied to fundamental political, economic and social questions. A fair resolution calls for innovative and new solutions at both national and international levels. However, this seems to be a problem when considering the application of EU regulations to member states inheriting divergent landownership structures and farming cultures. The contribution of this study is in describing the consequences of fitting EU agricultural policy to Finnish agricultural land tenure conditions and heritage.
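The core dynamic programming result above, that a rising probability of lease non-renewal quickly makes irreversible land improvements unprofitable, can be illustrated with a much simpler expected-value sketch. All numbers below are invented for illustration; they are not the study's calibrated Finnish parameters:

```python
# Toy sketch of the tenure-insecurity mechanism: a tenant weighs an
# up-front land improvement (e.g. liming) whose returns accrue only
# while the fixed-term lease keeps being renewed each year.

def value_of_investing(p_renew: float, horizon: int = 30,
                       cost: float = 100.0, annual_gain: float = 25.0,
                       discount: float = 0.95) -> float:
    """Expected discounted net benefit of a long pay-back improvement.
    p_renew: probability the lease is renewed in any given year."""
    value = -cost
    survival = 1.0  # probability the lease is still in force
    for year in range(1, horizon + 1):
        survival *= p_renew
        value += survival * annual_gain * discount ** year
    return value

# As the renewal probability falls, the improvement stops paying off:
for p in (1.0, 0.9, 0.7):
    print(p, round(value_of_investing(p), 1))
```

Under these assumed numbers the improvement pays off with a secure lease but turns into an expected loss as the annual renewal probability falls, mirroring the simulated decline in liming and fertilization on leased parcels.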
Abstract:
The development of a simple method for coating a semi-permanent phospholipid layer onto a capillary for electrochromatography use was the focus of this study. The work involved finding good coating conditions, stabilizing the phospholipid coating, and examining the effect of adding divalent cations, cetyltrimethylammonium bromide, and polyethylene glycol (PEG)-lipids on the stability of the coating. Since a further purpose was to move toward more biological membrane coatings, the capillaries were also coated with cholesterol-containing liposomes and liposomes of red blood cell ghost lipids. Liposomes were prepared by extrusion, and large unilamellar vesicles with a diameter of about 100 nm were obtained. Zwitterionic phosphatidylcholine (PC) was used as a basic component, mainly 1-palmitoyl-2-oleyl-sn-glycero-3-phosphocholine (POPC) but also eggPC and 1,2-dipalmitoyl-sn-glycero-3-phosphocholine (DPPC). Different amounts of sphingomyelin, bovine brain phosphatidylserine, and cholesterol were added to the PC. The stability of the coating in 40 mM N-(2-hydroxyethyl)piperazine-N'-(2-ethanesulfonic acid) (HEPES) solution at pH 7.4 was studied by measuring the electroosmotic flow and by separating neutral steroids, basic proteins, and low-molar-mass drugs. The presence of PC in the coating solution was found to be essential to achieving a coating. The stability of the coating was improved by the addition of negative phosphatidylserine, cholesterol, divalent cations, or PEGylated lipids, and by working in the gel-state region of the phospholipid. A study of the effect of the divalent metal ions calcium, magnesium, and zinc on the PC coating showed that a molar ratio of 1:3 PC/Ca2+ or PC/Mg2+ gave increased rigidity to the membrane and the best coating stability. The PEGylated lipids used in the study were sterically stabilized commercial lipids with covalently attached PEG chains.
The vesicle size generally decreased when PEGylated lipids of higher molar mass were present in the vesicle. The predominance of discoidal micelles over liposomes increased with PEG chain length, and the average size of the vesicles thus decreased. In the capillary electrophoresis (CE) measurements a highly stable electroosmotic flow was achieved with 20% PEGylated lipid in the POPC coating dispersion, the best results being obtained for distearoyl PEG (3000) conjugates. The results suggest that smaller particles (discoidal micelles) result in tighter packing and better shielding of silanol groups on the silica wall. The effect of temperature on the coating stability was investigated by using DPPC liposomes at temperatures above (45 °C) and below (25 °C) the main phase transition temperature. Better results were obtained with DPPC in the more rigid gel state than in the fluid state: the electroosmotic flow was heavily suppressed and the PC coating was stabilized. Dispersions of DPPC with 0−30 mol% of cholesterol and sphingomyelin in different ratios, which more closely resemble natural membranes, also resulted in stable coatings. Finally, the CE measurements revealed that a stable coating is formed when capillaries are coated with liposomes of red blood cell ghost lipids.
Abstract:
The importance of intermolecular interactions to chemistry, physics, and biology is difficult to overestimate. Without intermolecular forces, condensed-phase matter could not form. The simplest way to categorize different types of intermolecular interactions is to describe them as van der Waals and hydrogen-bonded (H-bonded) interactions. In the H-bond, the intermolecular interaction appears between a positively charged hydrogen atom and an electronegative fragment, and it originates from strong electrostatic interactions. H-bonding is important when considering the properties of condensed-phase water and in many biological systems, including the structures of DNA and proteins. Vibrational spectroscopy is a useful tool for studying complexes and the solvation of molecules. The vibrational frequency shift has been used to characterize complex formation. In an H-bonded system A∙∙∙H-X (A and X are the acceptor and donor species, respectively), the vibrational frequency of the H-X stretching vibration usually decreases from its value in free H-X (red-shift). This frequency shift has been used as evidence for H-bond formation, and the magnitude of the shift has been used as an indicator of the H-bonding strength. In contrast to this normal behavior are the blue-shifting H-bonds, in which the H-X vibrational frequency increases upon complex formation. In the last decade, there has been active discussion regarding these blue-shifting H-bonds. Noble gases have been considered inert due to their limited reactivity with other elements. In the early 1930s, Pauling predicted the stable noble-gas compounds XeF6 and KrF6. It was not until three decades later that Neil Bartlett synthesized the first noble-gas compound, XePtF6, in 1962. A renaissance of noble-gas chemistry began in 1995 with the discovery of noble-gas hydride molecules at the University of Helsinki. The first hydrides were HXeCl, HXeBr, HXeI, HKrCl, and HXeH.
These molecules have the general formula HNgY, where H is a hydrogen atom, Ng is a noble-gas atom (Ar, Kr, or Xe), and Y is an electronegative fragment. At present, this class of molecules comprises 23 members, including both inorganic and organic compounds. The first and only argon-containing neutral chemical compound, HArF, was synthesized in 2000, and its properties have since been investigated in a number of studies. A helium-containing chemical compound, HHeF, was predicted computationally, but its lifetime has been predicted to be severely limited by hydrogen tunneling. Helium and neon are the only elements in the periodic table that do not form neutral, ground-state molecules. A noble-gas matrix is a useful medium in which to study unstable and reactive species, including ions. A solvated proton forms a centrosymmetric NgHNg+ (Ng = Ar, Kr, or Xe) structure in a noble-gas matrix, and this is probably the simplest example of a solvated proton. Interestingly, the hypothetical NeHNe+ cation is isoelectronic with the water-solvated proton H5O2+ (Zundel ion). In addition to the NgHNg+ cations, the isoelectronic YHY- (Y = halogen atom or pseudohalogen fragment) anions have been studied with the matrix-isolation technique. These species have been known to exist in alkali metal salts (YHY)-M+ (M = alkali metal, e.g. K or Na) for more than 80 years. Hydrated HF forms the FHF- structure in aqueous solutions, and these ions participate in several important chemical processes. In this thesis, studies of the intermolecular interactions of HNgY molecules and centrosymmetric ions with various species are presented. The HNgY complexes show unusual spectral features, e.g. large blue-shifts of the H-Ng stretching vibration upon complexation. It is suggested that the blue-shift is a normal effect for these molecules, and that it originates from the enhanced (HNg)+Y- ion-pair character upon complexation.
It is also found that the HNgY molecules are energetically stabilized in the complexed form, and this effect is computationally demonstrated for the HHeF molecule. The NgHNg+ and YHY- ions also show blue-shifts in their asymmetric stretching vibration upon complexation with nitrogen. Additionally, the matrix site structure and hindered rotation (libration) of the HNgY molecules were studied. The librational motion is a much-discussed solid state phenomenon, and the HNgY molecules embedded in noble-gas matrices are good model systems to study this effect. The formation mechanisms of the HNgY molecules and the decay mechanism of NgHNg+ cations are discussed. A new electron tunneling model for the decay of NgHNg+ absorptions in noble-gas matrices is proposed. Studies of the NgHNg+∙∙∙N2 complexes support this electron tunneling mechanism.
Abstract:
Salmonella enterica serovar Typhimurium is a common cause of gastroenteritis in humans and, occasionally, also causes systemic infection. During systemic infection an important characteristic of Salmonella is its ability to survive and replicate within macrophages. The outer membrane protease PgtE of S. enterica is a member of the omptin family of outer membrane aspartate proteases, which are beta-barrel proteins with five surface-exposed loops. The main goals of this study were to characterize biological substrates and pathogenesis-associated functions of PgtE and to determine the conditions where PgtE is fully active. In this study we found that PgtE requires rough lipopolysaccharide (LPS) to be functional but is sterically inhibited by the long O-antigen side chain in smooth LPS. Salmonella isolates normally are smooth with a long oligosaccharide O-antigen, and PgtE remains functionally cryptic in wild-type Salmonella cultivated in vitro. Interestingly, our results showed that due to increased expression of PgtE and to reduced length of the LPS O-antigen chains, the wild-type Salmonella expresses highly functional PgtE when isolated from mouse macrophage-like J774A.1 cells. Salmonella is thought to be continuously released from macrophages to infect new ones, and our results suggest that PgtE is functional during these transient extracellular growth phases. Six novel host protein substrates were identified for PgtE in this work. PgtE was previously known to activate human plasminogen (Plg) to plasmin, a broad-spectrum serine protease, and in this study PgtE was shown to interfere with the Plg system by inactivating the main inhibitor of plasmin, alpha2-antiplasmin. PgtE also interferes with another important proteolytic system of mammals by activating pro-matrix metalloproteinase-9 to an active gelatinase. PgtE also directly degrades gelatin, a component of extracellular matrices. 
PgtE also increases bacterial resistance against complement-mediated killing in human serum and enhances survival of Salmonella within murine macrophages as well as in the liver and spleen of intraperitoneally infected mice. Taken together, the results in this study suggest that PgtE is a virulence factor of Salmonella that has adapted to interfere with host proteolytic systems and to modify extracellular matrix; these features likely assist the migration of Salmonella during systemic salmonellosis.
Abstract:
Molecular motors are proteins that convert chemical energy into mechanical work. The viral packaging ATPase P4 is a hexameric molecular motor that translocates RNA into preformed viral capsids. P4 belongs to the ubiquitous class of hexameric helicases. Although its structure is known, the mechanism of RNA translocation remains elusive. Here we present a detailed kinetic study of nucleotide binding, hydrolysis, and product release by P4. We propose a stochastic-sequential cooperative model to describe the coordination of ATP hydrolysis within the hexamer. In this model the apparent cooperativity is a result of hydrolysis stimulation by ATP and RNA binding to neighboring subunits rather than cooperative nucleotide binding. Simultaneous interaction of neighboring subunits with RNA makes the otherwise random hydrolysis sequential and processive. Further, we use hydrogen/deuterium exchange detected by high resolution mass spectrometry to visualize P4 conformational dynamics during the catalytic cycle. Concerted changes of exchange kinetics reveal a cooperative unit that dynamically links ATP binding sites and the central RNA binding channel. The cooperative unit is compatible with the structure-based model in which translocation is effected by conformational changes of a limited protein region. Deuterium labeling also discloses the transition state associated with RNA loading which proceeds via opening of the hexameric ring. Hydrogen/deuterium exchange is further used to delineate the interactions of the P4 hexamer with the viral procapsid. P4 associates with the procapsid via its C-terminal face. The interactions stabilize subunit interfaces within the hexamer. The conformation of the virus-bound hexamer is more stable than the hexamer in solution, which is prone to spontaneous ring openings. We propose that the stabilization within the viral capsid increases the packaging processivity and confers selectivity during RNA loading. 
Finally, we use single molecule techniques to characterize P4 translocation along RNA. While the P4 hexamer encloses RNA topologically within the central channel, it diffuses randomly along the RNA. In the presence of ATP, unidirectional net movement is discernible in addition to the stochastic motion. The diffusion is hindered by activation energy barriers that depend on the nucleotide binding state. The results suggest that P4 employs an electrostatic clutch instead of cycling through stable, discrete, RNA binding states during translocation. Conformational changes coupled to ATP hydrolysis modify the electrostatic potential inside the central channel, which in turn biases RNA motion in one direction. Implications of the P4 model for other hexameric molecular motors are discussed.
Abstract:
To a large extent, lakes can be described with a one-dimensional approach, as their main features can be characterized by the vertical temperature profile of the water. The development of the profiles during the year follows the seasonal climate variations. Depending on conditions, lakes become stratified during the warm summer. Overturn then occurs, the water cools and an ice cover forms. Typically, water is inversely stratified under the ice, and another overturn occurs in spring after the ice has melted. Features of this circulation have been used in studies to distinguish between lakes in different areas, as a basis for observation systems and even as climate indicators. Numerical models can be used to calculate the temperature in a lake on the basis of the meteorological input at the surface. The simplest form is to solve for the surface temperature. The depth of the lake affects heat transfer, together with other morphological features such as the shape and size of the lake. The surrounding landscape also affects the formation of the meteorological fields over the lake and the energy input. For small lakes, shading by the shores has an effect both over the lake and inside the water body, bringing limitations for the one-dimensional approach. A two-layer model gives an approximation of the basic stratification in the lake. A turbulence model can simulate the vertical temperature profile in a more detailed way. If the shape of the temperature profile is very abrupt, vertical transfer is hindered, which has many important consequences for lake biology. The one-dimensional modelling approach was studied by comparing a one-layer model, a two-layer model and a turbulence model. The turbulence model was applied to lakes with different sizes, shapes and locations. Lake models need data from the lakes for model adjustment.
The use of the meteorological input data on different scales was analysed, ranging from momentary turbulent changes over the lake to the use of synoptic data with three-hour intervals. Data from about the past 100 years were used on the mesoscale, at a range of about 100 km, together with climate change scenarios for future changes. Increasing air temperature typically increases the water temperature in the epilimnion and decreases ice cover. Lake ice data were used for modelling different kinds of lakes. They were also analysed statistically in a global context. The results were also compared with the results of a hydrological watershed model and with data from very small lakes on seasonal development.
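The two-layer approximation mentioned above can be sketched as a surface mixed layer (epilimnion) that receives the atmospheric heat flux and exchanges a small fraction of the temperature difference with the deep layer (hypolimnion). This is only a toy illustration under assumed values, not one of the models compared in the thesis:

```python
RHO, CP = 1000.0, 4186.0  # water density (kg/m3) and heat capacity (J/kg/K)

def step(t_epi, t_hypo, q_surface, h_epi=5.0, h_hypo=15.0,
         exchange=0.02, dt=86400.0):
    """Advance the two layer temperatures (degrees C) by one time step.
    q_surface: net surface heat flux (W/m2), warming the upper layer only;
    exchange: fraction of the temperature difference mixed per step.
    The mixing term conserves heat over the whole water column."""
    t_epi += q_surface * dt / (RHO * CP * h_epi)  # atmospheric heating
    diff = t_epi - t_hypo                         # weak vertical transfer
    t_epi -= exchange * diff * h_hypo / (h_epi + h_hypo)
    t_hypo += exchange * diff * h_epi / (h_epi + h_hypo)
    return t_epi, t_hypo

# A month of steady summer heating builds a stratification:
t_e, t_h = 10.0, 10.0
for _ in range(30):
    t_e, t_h = step(t_e, t_h, q_surface=150.0)
print(round(t_e, 1), round(t_h, 1))
```

With a small exchange coefficient the upper layer warms much faster than the deep layer, reproducing the hindered vertical transfer across a sharp thermocline that the abstract describes.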
Abstract:
This research has been prompted by an interest in the atmospheric processes of hydrogen. The sources and sinks of hydrogen are important to know, particularly if hydrogen becomes more common as a replacement for fossil fuel in combustion. Hydrogen deposition velocities (vd) were estimated by applying chamber measurements, a radon tracer method and a two-dimensional model. These three approaches were compared with each other to discover the factors affecting the soil uptake rate. A static-closed chamber technique was introduced to determine the hydrogen deposition velocity values in an urban park in Helsinki, and at a rural site at Loppi. A three-day chamber campaign to carry out soil uptake estimation was held at a remote site at Pallas in 2007 and 2008. The atmospheric mixing ratio of molecular hydrogen has also been measured by a continuous method in Helsinki in 2007 - 2008 and at Pallas from 2006 onwards. The mean vd values measured in the chamber experiments in Helsinki and Loppi were between 0.0 and 0.7 mm s-1. The ranges of the results with the radon tracer method and the two-dimensional model were 0.13 - 0.93 mm s-1 and 0.12 - 0.61 mm s-1, respectively, in Helsinki. The vd values in the three-day campaign at Pallas were 0.06 - 0.52 mm s-1 (chamber) and 0.18 - 0.52 mm s-1 (radon tracer method and two-dimensional model). At Kumpula, the radon tracer method and the chamber measurements produced higher vd values than the two-dimensional model. The results of all three methods were close to each other between November and April, except for the chamber results from January to March, while the soil was frozen. The hydrogen deposition velocity values of all three methods were compared with one-week cumulative rain sums. Precipitation increases the soil moisture, which decreases the soil uptake rate. The measurements made in snow seasons showed that a thick snow layer also hindered gas diffusion, lowering the vd values. 
The H2 vd values were compared to the snow depth, and a decaying exponential fit was obtained. During a prolonged drought in summer 2006, soil moisture values were lower than in the other summer months between 2005 and 2008, and under these dry conditions high chamber vd values were measured. The mixing ratio of molecular hydrogen has a seasonal variation. The lowest atmospheric mixing ratios were found in the late autumn, when high deposition velocity values were still being measured. The carbon monoxide (CO) mixing ratio was also measured. Hydrogen and carbon monoxide are highly correlated in an urban environment, due to the emissions originating from traffic. After correction for the soil deposition of H2, the slope was 0.49±0.07 ppb (H2) / ppb (CO). Using the corrected hydrogen-to-carbon-monoxide ratio, the total hydrogen load emitted by Helsinki traffic in 2007 was 261 t (H2) a-1. Hydrogen, methane and carbon monoxide are connected with each other through the atmospheric methane oxidation process, in which formaldehyde is produced as an important intermediate. The photochemical degradation of formaldehyde produces hydrogen and carbon monoxide as end products. Examination of back-trajectories revealed long-range transport of carbon monoxide and methane. The trajectories can be grouped by applying cluster and source analysis methods, so that natural and anthropogenic emission sources can be separated.
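The decaying exponential relation reported above between deposition velocity and snow depth has the form vd = a·exp(−b·depth). The coefficients below are invented placeholders for illustration; the thesis fitted its own measured values:

```python
import math

def vd_snow(depth_cm: float, a: float = 0.5, b: float = 0.05) -> float:
    """H2 deposition velocity (mm/s) as a decaying exponential of snow
    depth (cm). a is the snow-free velocity and b the decay constant;
    both are assumed values, not the fitted coefficients from the study."""
    return a * math.exp(-b * depth_cm)

# A thickening snowpack hinders gas diffusion, so vd falls toward zero:
for depth in (0, 20, 60):
    print(depth, round(vd_snow(depth), 3))
```

In practice such coefficients would be obtained by fitting the measured vd values against snow depth, for example with a nonlinear least-squares routine.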
Abstract:
The subjects of this study are Narcotics Anonymous (NA), a non-profit, peer-support-based fellowship, its recovery programme, and the former drug addicts who consider themselves members of the fellowship. The study data consist of episodic interviews (n=24) and questionnaires (n=212). In the collection of questionnaire data, survey research methods had to be applied judiciously. This study analyses NA members' backgrounds and their substance abuse and treatment histories, as well as factors that have contributed to or hindered their bonding with NA. A recovery model is presented that stems from NA's written and oral tradition, and which has been conceptualised into NA's recovery theory. At its simplest, NA's recovery theory can be described in two sentences: 1) There are drug-dependent addicts who have an addiction disease. 2) Through an NA way of life, recovery is possible. In this study, addiction and addiction disease are described through recovery stories shared at NA. The study also describes how the way of life offered by NA supports recovery from drug addiction, which way of life recovering addicts have adopted, and how they have done so. It also presents results that, based on the study data, emerge from participation in the NA programme, and describes how the NA recovery theory works in practice, i.e. how NA members utilise the tools provided by the fellowship and how the lives of recovering addicts change during their membership. Furthermore, this study also discusses criticism of NA. According to the study, NA affects the lives of recovering drug addicts in a number of ways. People of different ages and with a variety of personal, treatment and drug abuse histories seem to benefit from membership of NA. Viewed from the outside, NA may appear strictly normative, but in practice each member can adapt the programme in the way that suits him/her best.
Indeed, flexibility is one of the strengths of NA, but without more extensive knowledge of the fellowship, the norms reflected in NA texts or the fanaticism of individual NA members may drive some people away. As the number of NA members grows, the fellowship is also able to provide more alternatives. This study confirms the view that peer support is important, as well as the fact that an official treatment system is required in parallel with peer-support activities. NA can never fully replace professional support, nor should it be left with sole responsibility for recovering addicts. Keywords: Narcotics Anonymous, peer support, recovery study, recovery, substance addiction, drug treatment, drugs, explorative research
Resumo:
Conflict, Unity, Oblivion: Commemoration of the Liberation War by the Civic Guard and the Veterans' Union, 1918-1944. The Finnish Civil War ended in May 1918 in a victory for the White side. The winners named the war the Liberation War, and its legacy became a central theme of public commemoration during the interwar period. At the same time, the experiences of the defeated were hindered from becoming part of the official history of Finland. The commemoration of the war was related not only to the war experience but also to a national mission, which was seen as fulfilled with the independence of Finland. Although the idea of the commemoration was to form a unifying, non-political scene for the nation, the remembrance of the Liberation War perpetuated rather than reconciled the conflict of 1918. The outbreak of the war between the Soviet Union and Finland in 1939 immediately affected this memory culture. The new myth of the Miracle of the Winter War, which referred to the unity shown by the people, required the marginalization of the controversial memory of the Liberation War. Drawing on the concepts of public memory and narrative templates, this study examines how the problematic experience of a civil war developed into a popular public commemoration. Instead of dealing with manipulative, elite-centered grandiose commemoration projects, the study focuses on the more modest local level and emphasizes the significance of local memory agents and the narrative templates of collective memory. The main subjects of the study are the Civic Guard and the Veterans' Union. Essential for the widespread movement was the development of the Civic Guard from a wartime organization into a peacetime popular movement. The guards, who identified themselves through the memories and threats of the civil war, formed a vast network of memory agents in every corner of the country. They effectively linked local memory with official memory, and civil society with the state level. 
Only with the emergence of the right-wing veteran movement in the 1930s did tensions grow between the two levels of public memory. The study shows the diversity of the commemoration movement of the Liberation War: it was not only the result of a nation-state project and political propaganda, but also a way for local communities to identify and strengthen themselves in a time of political upheaval and uncertainty.
Resumo:
Metabolomics is a rapidly growing research field that studies the response of biological systems to environmental factors, disease states and genetic modifications. It aims at measuring the complete set of endogenous metabolites, i.e. the metabolome, in a biological sample such as plasma or cells. Because metabolites are the intermediates and end products of biochemical reactions, metabolite compositions and metabolite levels in biological samples can provide a wealth of information on ongoing processes in a living system. Due to the complexity of the metabolome, metabolomic analysis poses a challenge to analytical chemistry. Adequate sample preparation is critical for accurate and reproducible analysis, and the analytical techniques must have high resolution and sensitivity to allow detection of as many metabolites as possible. Furthermore, as the information contained in the metabolome is immense, the data sets collected in metabolomic studies are very large. To extract the relevant information from such large data sets, efficient data processing and multivariate data analysis methods are needed. In the research presented in this thesis, metabolomics was used to study mechanisms of polymeric gene delivery to retinal pigment epithelial (RPE) cells. The aim of the study was to detect differences in metabolomic fingerprints between transfected cells and non-transfected controls, and thereafter to identify the metabolites responsible for the discrimination. The plasmid pCMV-β was introduced into RPE cells using the vector polyethyleneimine (PEI). The samples were analyzed using high-performance liquid chromatography (HPLC) and ultra-performance liquid chromatography (UPLC) coupled to a triple quadrupole (QqQ) mass spectrometer (MS). The software MZmine was used for raw-data processing, and principal component analysis (PCA) was used in the statistical data analysis. 
The results revealed differences in metabolomic fingerprints between transfected cells and non-transfected controls. However, reliable fingerprinting data could not be obtained because of low analysis repeatability, and therefore no attempt was made to identify the metabolites responsible for the discrimination between sample groups. The repeatability and accuracy of the analyses could be improved by protocol optimization; in this study, however, optimization of the analytical methods was hindered by the very small number of samples available for analysis. In conclusion, this study demonstrates that obtaining reliable fingerprinting data is technically demanding, and that the protocols need to be thoroughly optimized in order to gain information on the mechanisms of gene delivery.
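The fingerprinting workflow described above — a feature table of metabolite peak intensities per sample, followed by PCA to separate transfected cells from controls — can be illustrated with a small sketch. The data here are synthetic stand-ins for MZmine-processed LC-MS features, with a group difference built in for demonstration:

```python
# PCA on a synthetic metabolite feature table: rows = samples
# (control vs. transfected), columns = metabolite peak intensities.
# Synthetic data stand in for real LC-MS features.
import numpy as np

rng = np.random.default_rng(0)
n_per_group, n_features = 8, 50

control = rng.normal(100.0, 5.0, size=(n_per_group, n_features))
transfected = rng.normal(100.0, 5.0, size=(n_per_group, n_features))
transfected[:, :5] += 30.0          # a few metabolites shifted by transfection

X = np.vstack([control, transfected])
Xc = X - X.mean(axis=0)             # mean-center each feature before PCA

# PCA via SVD of the centered matrix; scores = sample coordinates in PC space
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s

# The two groups should separate along the first principal component
pc1_control = scores[:n_per_group, 0].mean()
pc1_transfected = scores[n_per_group:, 0].mean()
print(f"PC1 mean, control:     {pc1_control:.1f}")
print(f"PC1 mean, transfected: {pc1_transfected:.1f}")
```

With real data the group structure is of course unknown in advance; PCA is unsupervised, and any separation that appears must then be traced back to the loadings to find the discriminating metabolites.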
Resumo:
Modern sample surveys began to spread after statisticians at the U.S. Bureau of the Census developed a sampling design for the Current Population Survey (CPS) in the 1940s. A significant factor was also that digital computers became available to statisticians. In the early 1950s, the theory was documented in textbooks on survey sampling. This thesis is about the development of statistical inference for sample surveys. The idea of statistical inference was first enunciated by the French scientist P. S. Laplace. In 1781, he published a plan for a partial investigation in which he determined the sample size needed to reach a desired accuracy in estimation. The plan was based on Laplace's Principle of Inverse Probability and on his derivation of the Central Limit Theorem, both published in a memoir of 1774 that is one of the origins of statistical inference. Laplace's inference model was based on Bernoulli trials and binomial probabilities. He assumed that populations were changing constantly, which he represented by assuming a priori distributions for the parameters. Laplace's inference model dominated statistical thinking for a century. Sample selection in Laplace's investigations was purposive. At the 1894 meeting of the International Statistical Institute, the Norwegian Anders Kiaer presented the idea of the Representative Method for drawing samples. Its idea, still prevailing today, was that the sample should be a miniature of the population. The virtues of random sampling were known, but practical problems of sample selection and data collection hindered its use. Arthur Bowley realized the potential of Kiaer's method and, at the beginning of the 20th century, carried out several surveys in the UK. He also developed a theory of statistical inference for finite populations, based on Laplace's inference model. R. A. Fisher's contributions in the 1920s constitute a watershed in statistical science: he revolutionized the theory of statistics. 
In addition, he introduced a new statistical inference model which is still the prevailing paradigm. Its essential ideas are that samples are drawn repeatedly from the same population and that population parameters are constants. Fisher's theory did not include a priori probabilities. Jerzy Neyman adopted Fisher's inference model and applied it to finite populations, with the difference that Neyman's inference model does not include any assumptions about the distributions of the study variables. Applying Fisher's fiducial argument, he developed the theory of confidence intervals. Neyman's last contribution to survey sampling presented a theory for double sampling, which gave statisticians at the U.S. Census Bureau the central idea for developing the complex survey design of the CPS. An important criterion was to have a method in which the costs of data collection were acceptable and which provided approximately equal interviewer workloads, besides sufficient accuracy in estimation.
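Laplace's question — how large a sample is needed to reach a desired accuracy when estimating a proportion — has a standard modern counterpart via the normal approximation to the binomial, which also underlies Neyman-style confidence intervals. A sketch with conventional textbook values, not figures from the thesis:

```python
# Sample size for estimating a binomial proportion p to within a margin
# of error e at a given confidence level, via the normal approximation
# n >= z^2 * p*(1-p) / e^2. All numeric values are illustrative.
import math

def sample_size(p: float, margin: float, z: float = 1.96) -> int:
    """Smallest n such that the half-width of the normal-approximation
    confidence interval for p is at most `margin` (z = 1.96 gives ~95%)."""
    return math.ceil(z * z * p * (1.0 - p) / (margin * margin))

# Worst case p = 0.5 with a 3-percentage-point margin at 95% confidence:
n = sample_size(0.5, 0.03)
print(n)
```

The worst-case choice p = 0.5 maximizes p(1-p), so the resulting n is safe whatever the true proportion turns out to be.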