9 results for Canning and preserving
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Mass spectrometry (MS)-based proteomics has seen significant technical advances during the past two decades, and mass spectrometry has become a central tool in many biosciences. Despite the popularity of MS-based methods, handling the systematic non-biological variation in the data remains a common problem. This biasing variation can arise from several sources, ranging from sample handling to differences caused by the instrumentation. Normalization is the procedure that aims to account for this biasing variation and make samples comparable. Many normalization methods commonly used in proteomics have been adapted from the DNA-microarray world. Studies comparing normalization methods on proteomics data sets using variability measures exist. However, a more thorough comparison is lacking, one that examines the quantitative and qualitative differences in the performance of the different normalization methods and their ability to preserve the true differential expression signal of proteins. In this thesis, several popular and widely used normalization methods (linear regression normalization, local regression normalization, variance stabilizing normalization, quantile normalization, median central tendency normalization, and variants of some of the aforementioned methods), representing different normalization strategies, are compared and evaluated on a benchmark spike-in proteomics data set. The normalization methods are evaluated in several ways. Their performance is assessed qualitatively and quantitatively, both on a global scale and in pairwise comparisons of sample groups. In addition, it is investigated whether performing the normalization globally on the whole data set, or pairwise for the comparison pairs examined, affects a method's ability to normalize the data and preserve the true differential expression signal. Both major and minor differences in the performance of the different normalization methods were found. The way in which the normalization was performed (global normalization of the whole data set or pairwise normalization of the comparison pair) also affected the performance of some of the methods in pairwise comparisons. Differences among variants of the same methods were observed as well.
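To make two of the compared strategies concrete, the following is a minimal sketch of quantile normalization and median central tendency normalization applied to a toy intensity matrix. It illustrates the general techniques only, not the thesis's implementation; the matrix, seed, and function names are invented for the example.

```python
import numpy as np

def quantile_normalize(X):
    """Quantile normalization: force every sample (column) to share the
    same distribution by mapping the i-th smallest value of each sample
    to the mean of the i-th smallest values across all samples."""
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)  # per-column ranks
    reference = np.sort(X, axis=0).mean(axis=1)        # mean sorted profile
    return reference[ranks]

def median_normalize(X):
    """Median central tendency normalization: shift each sample so that
    all sample medians coincide with the global median."""
    return X - np.median(X, axis=0) + np.median(X)

# Toy log-intensity matrix: rows = proteins, columns = samples, with an
# artificial per-sample bias added to mimic systematic variation.
rng = np.random.default_rng(0)
X = rng.normal(loc=20.0, scale=2.0, size=(100, 4)) + rng.normal(0.0, 0.5, size=4)

print(np.median(quantile_normalize(X), axis=0))  # identical sample medians
print(np.median(median_normalize(X), axis=0))    # identical sample medians
```

Both methods remove the artificial per-sample shift; how aggressively each one reshapes the data beyond that is exactly the kind of difference the thesis evaluates against a spike-in benchmark.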
Abstract:
During spermatogenesis, different genes are expressed in a strictly coordinated fashion, providing an excellent model for studying cell differentiation. The recent identification of testis-specific genes, the development of green fluorescent protein (GFP) transgene technology, and an in vivo system for studying the differentiation of transplanted male germ cells in infertile testis have opened new possibilities for studying male germ cell differentiation at the molecular level. We have employed these techniques in combination with transillumination-based stage recognition (Parvinen and Vanha-Perttula, 1972) and squash preparation techniques (Parvinen and Hecht, 1981) to study the regulation of male germ cell differentiation. Using transgenic mice expressing enhanced GFP (EGFP) as a marker, we have studied the expression and hormonal regulation of beta-actin and acrosin proteins in developmentally different living male germ cells. Beta-actin was demonstrated in all male germ cells, whereas acrosin was expressed only in late meiotic and postmeiotic cells. Follicle stimulating hormone stimulated beta-actin-EGFP expression at stages I-VI and enhanced the formation of microtubules in spermatids, thereby reducing the size of the acrosomic system. When EGFP-expressing spermatogonial stem cells were transplanted into infertile mouse testis, the differentiation and synchronized development of male germ cells could be observed over a six-month observation period. Each colony developed independently and maintained typical stage-dependent cell associations. Furthermore, if more than two colonies fused, each of them adjusted to one stage and became synchronized. By studying living spermatids, we were able to demonstrate novel functions of the Golgi complex and the chromatoid body in material sharing between neighboring spermatids. Immunocytochemical analyses revealed the transport of haploid cell-specific proteins within spermatids (TRA54 and Shippo1) and through the intercellular bridges (TRA54). A cytoskeleton inhibitor (nocodazole) demonstrated the importance of microtubules in material sharing between spermatids and in preserving the integrity of the chromatoid body. A Golgi complex inhibitor, brefeldin A, revealed the great importance of the Golgi complex in i) acrosomic system formation, ii) TRA54 translation, and iii) granule trafficking between spermatids.
Abstract:
Performance standards for positron emission tomography (PET) were developed to make it possible to compare systems from different generations and manufacturers. This resulted in the NEMA methodology in North America and the IEC methodology in Europe. In practice, NEMA NU 2-2001 is the method of choice today. These standardized methods allow assessment of the physical performance of new commercial dedicated PET/CT tomographs. The point spread in image formation is one of the factors that blur the image; the phenomenon is often called the partial volume effect. Several methods for partial volume correction are under research, but no real agreement exists on how to solve the problem. The influence of the effect varies between clinical settings, and it is likely that new methods are needed. Most clinical PET work is done in the field of oncology, where whole-body PET combined with CT is the standard investigation today. Despite progress in PET imaging techniques, the visualization, and especially the quantification, of small lesions remains a challenge. In addition to partial volume, movement of the object is a significant source of error; the main causes are respiratory and cardiac motion. Most new commercial scanners are capable of respiratory gating in addition to cardiac gating, and this technique has been used in patients with cancer of the thoracic region and in patients studied for radiation therapy planning. For routine cardiac applications, such as the assessment of viability and perfusion, only cardiac gating has been used. However, new targets, such as plaque imaging or molecular imaging of new therapies, require better control of cardiac motion, including the component caused by respiration. To overcome these problems in cardiac work, a dual gating approach has been proposed. In this study we investigated the physical performance of a new whole-body PET/CT scanner according to the NEMA standard, compared methods for partial volume correction in PET studies of the brain, and developed and tested a new robust method for dual cardiac-respiratory gated PET with phantom, animal, and human data. Results from the performance measurements showed the feasibility of the new scanner design in 2D and 3D whole-body studies. Partial volume could be corrected for, but no single best method emerged among those tested, as the correction also depends on the radiotracer and its distribution; new methods need to be developed for proper correction. The dual gating algorithm generated is shown to handle dual-gated data, preserving quantification and clearly eliminating the majority of contraction and respiration movement.
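The core idea of dual gating is to sort the acquired coincidence events into joint cardiac-respiratory phase bins before reconstruction, so that each bin contains data from a nearly motion-free state. The sketch below illustrates only this binning step, with invented names and parameters; it is not the specific algorithm developed in the study.

```python
import numpy as np

def dual_gate(event_times, r_peaks, resp_phase, n_cardiac=8, n_resp=4):
    """Assign each list-mode event to a joint (cardiac, respiratory) gate.

    event_times : event timestamps in seconds
    r_peaks     : ECG R-peak times (s) bracketing all events
    resp_phase  : respiratory phase in [0, 1) sampled at event_times
    Returns one gate index per event in [0, n_cardiac * n_resp).
    """
    # Cardiac phase = fractional position of the event in its R-R interval.
    idx = np.searchsorted(r_peaks, event_times, side="right") - 1
    rr = r_peaks[idx + 1] - r_peaks[idx]
    cardiac_phase = (event_times - r_peaks[idx]) / rr

    c_bin = np.minimum((cardiac_phase * n_cardiac).astype(int), n_cardiac - 1)
    r_bin = np.minimum((resp_phase * n_resp).astype(int), n_resp - 1)
    return c_bin * n_resp + r_bin

# Toy data: 1000 events over 59 s, a 1 s heartbeat, and a 4 s breathing cycle.
t = np.sort(np.random.default_rng(0).uniform(0.0, 59.0, 1000))
gates = dual_gate(t, np.arange(0.0, 61.0, 1.0), (t % 4.0) / 4.0)
print(np.bincount(gates, minlength=32))  # events per joint gate
```

Each joint gate is then reconstructed separately (or registered and summed), which is what preserves quantification while suppressing both contraction and respiratory motion.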
Abstract:
Sales configurators are essential tools for companies that offer complicated, case-specifically crafted products to their customers. The most sophisticated of them are able to design an entire end product on the fly according to given constraints, calculate a price for the offer, and move the order into production. This thesis covers a sales configurator acquisition project in a large industrial company that offers cranes to its customers. The study spans the preliminary stages of a large-scale software purchase project, starting from the specification of the problem domain and ending with a presentation of the most viable software solution that fulfils the requirements for the new system. The project consists of mapping the usage environment and use cases, and collecting the requirements expected of the new system. The collected requirements involve fitting the new sales system into the enterprise application infrastructure, mitigating the risks involved in the project, and specifying new features for the application while preserving all of the valued features of the old sales system currently used in the company. The collected requirements were presented to a number of sales software vendors, who were asked to provide solution proposals that would fulfil all the demands. All of the received proposals were subjected to an evaluation to determine the most feasible solutions; the construction of the evaluation criteria was itself part of the study. The final outcome of this study is a short-list of the most feasible sales configurator solutions, together with a description of how the software purchase process in large enterprises works and which aspects deserve attention in large projects of a similar kind.
Abstract:
This dissertation explores the complicated relations between Estonian, Latvian, and Lithuanian postwar refugees and American foreign policymakers between 1948 and 1960. There were seemingly shared interests between the parties during the first decade of the Cold War. Generally, Eastern European refugees refused to recognize Soviet hegemony in their homelands, and American policy towards the Soviet bloc during the Truman and Eisenhower administrations sought to undermine the Kremlin’s standing in the region. More specifically, Baltic refugees and State Department officials sought to preserve the 1940 non-recognition policy towards the Soviet annexation of the Baltic States. I propose that despite the seemingly natural convergence of interests, the American experiment of constructing a State-Private network revolving around fostering relations with exile groups was fraught with difficulties. These difficulties ultimately undermined any ability that the United States might have had to liberate the Baltic States from the Soviet Union. As this dissertation demonstrates, Baltic exiles were primarily concerned with preserving a high level of political continuity with the interwar republics, under the assumption that they would be able to regain their positions in liberated, democratic societies. American policymakers, however, were primarily concerned with maintaining the non-recognition policy, the framework within which all policy considerations were analyzed. I argue that these two motivating factors created unnecessary tensions in American policy towards the Baltic republics in the spheres of psychological warfare as well as exile unity in the United States and Europe. Despite these shortcomings, I argue that out of the exiles’ failings was born a generation of Baltic constituents who blurred the line of political legitimacy between exiles who sought to return home and ethnic Americans who were loyal to the United States. These Baltic constituents played an important role in garnering the support of the United States Congress starting in the 1950s, and became increasingly influential after the 1956 Hungarian Revolution, despite the seemingly diminished role Eastern Europe played in the Cold War. The actions of the Baltic constituents not only prevented the Baltic question from being forever lost in the memory hole of history, but created enough political pressure on the State Department that it became impossible to alter the long-standing policy of not recognizing the Soviet annexation of the Baltic States.
Abstract:
The thesis presents results obtained during the author's PhD studies. First, systems of language equations of a simple form, consisting of just two equations, are proved to be computationally universal. These are systems over a unary alphabet, which can be viewed as systems of equations over sets of natural numbers. The systems contain only an equation X+A=B and an equation X+X+C=X+X+D, where A, B, C and D are eventually periodic constants. It is proved that for every recursive set S there exist natural numbers p and d, and eventually periodic sets A, B, C and D, such that a number n is in S if and only if np+d is in the unique solution of the above system of two equations; thus all recursive sets can be represented in an encoded form. It is also proved that not all recursive sets can be represented as they are, so the encoding is really needed. Furthermore, it is proved that the family of languages generated by Boolean grammars is closed under injective gsm-mappings and inverse gsm-mappings. The arguments apply also to the families of unambiguous Boolean languages, conjunctive languages, and unambiguous languages. Finally, characterizations of morphisms preserving subfamilies of context-free languages are presented. It is shown that the families of deterministic and LL context-free languages are closed under codes if and only if the codes are of bounded deciphering delay. These families are also closed under those non-codes that map every letter into a submonoid generated by a single word. The family of unambiguous context-free languages is closed under all codes and under the same non-codes as the families of deterministic and LL context-free languages.
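Read over a unary alphabet, the equations become equations over sets of natural numbers, with + denoting element-wise (Minkowski) addition of sets. The following restates the representation claim compactly, in notation inferred from the abstract:

```latex
% Two-equation system over sets of natural numbers, where + is
% element-wise set addition and A, B, C, D are eventually periodic
% constant sets:
\begin{align*}
  X + A &= B \\
  X + X + C &= X + X + D
\end{align*}
% Encoded representation of a recursive set S by the unique solution X,
% for some fixed p, d (depending on S):
\[
  n \in S \iff np + d \in X .
\]
```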
Abstract:
This thesis focuses on the private membership test (PMT) problem and presents three single-server protocols to solve it. In the presented solutions, a client can perform an inclusion test for some record x in a server's database without revealing the record. Moreover, after executing the protocols, the contents of the server's database remain secret. In each solution, a different cryptographic protocol is utilized to construct a privacy-preserving variant of the Bloom filter. The three suggested solutions differ slightly from each other, both from a privacy perspective and from a complexity point of view. Their use cases are therefore different, and no single one is clearly the best of the three. We present software implementations of the three protocols, illustrated with pseudocode. The performance of our implementation is measured in a real-world scenario. This thesis is a spin-off from the Academy of Finland research project "Cloud Security Services".
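All three protocols are built on top of the Bloom filter data structure. For reference, a minimal plain (non-private) Bloom filter is sketched below; the cryptographic layers that make the thesis's variants privacy-preserving are deliberately omitted, and the names and parameters are illustrative.

```python
import hashlib

class BloomFilter:
    """Plain Bloom filter: k hash functions set k bits per record.
    Membership queries can give false positives but never false
    negatives, a trade-off the PMT protocols inherit."""

    def __init__(self, m_bits=1024, k_hashes=4):
        self.m = m_bits
        self.k = k_hashes
        self.bits = bytearray(m_bits)  # one byte per bit, for simplicity

    def _positions(self, record: bytes):
        # Derive k positions by hashing an index together with the record.
        for i in range(self.k):
            h = hashlib.sha256(i.to_bytes(4, "big") + record).digest()
            yield int.from_bytes(h, "big") % self.m

    def add(self, record: bytes):
        for p in self._positions(record):
            self.bits[p] = 1

    def __contains__(self, record: bytes):
        return all(self.bits[p] for p in self._positions(record))

bf = BloomFilter()
bf.add(b"known-record")
print(b"known-record" in bf)  # True
print(b"other-record" in bf)  # False, up to a small false-positive rate
```

In the protocols themselves, the client's query against such a filter is performed under a cryptographic protocol (a different one per solution), so that the queried record is not revealed to the server.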
Abstract:
Our surrounding landscape is in a constantly dynamic state, but recently the rate of change and its effects on the environment have increased considerably. In terms of the impact on nature, this development has not been entirely positive: it has caused a decline in valuable species, habitats, and general biodiversity. Although the problem and its high importance are recognised, plans and actions for stopping the detrimental development are largely lacking. This partly originates from a lack of genuine will, but is also due to the difficulty of detecting many valuable landscape components, which are consequently neglected. To support knowledge extraction, various digital environmental data sources may be of substantial help, but only if all the relevant background factors are known and the data is processed in a suitable way. This dissertation concentrates on detecting ecologically valuable landscape components using geospatial data sources, and applies this knowledge to support spatial planning and management activities. In other words, the focus is on observing regionally valuable species, habitats, and biotopes with GIS and remote sensing data, using suitable methods for their analysis. Primary emphasis is given to the hemiboreal vegetation zone and the drastic decline of its semi-natural grasslands, which were created by a long trajectory of traditional grazing and management activities. The applied perspective, however, is largely methodological, allowing the obtained results to be applied in various contexts. The dissertation emphasises models based on statistical dependencies and correlations of multiple variables, which are able to extract desired properties from a large mass of initial data. In addition, the included papers combine several data sets from different sources and dates, with the aim of detecting a wider range of environmental characteristics and pointing out their temporal dynamics. The results emphasise the multidimensionality and dynamics of landscapes, which need to be understood before their ecologically valuable components can be recognised. This requires not only knowledge about the emergence of these components and an understanding of the data used, but also focusing the observations on minute details capable of indicating fragmented and partly overlapping landscape targets. It also points to the fact that most existing classifications are too generalised as such to provide all the required details, although they can be utilized at various steps along a longer processing chain. The dissertation also emphasises the importance of landscape history as a factor that both creates and preserves ecological values, and that sets an essential standpoint for understanding present landscape characteristics. The obtained results are significant both for preserving semi-natural grasslands and for general methodological development, supporting a science-based framework for evaluating ecological values and guiding spatial planning.
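As an illustration of the kind of statistical dependency model described above, the sketch below fits a logistic regression to stacked geospatial variables and produces a per-cell suitability estimate. All data, variables, and names are invented for the example; the dissertation's actual models and data sets are not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented example: one row per landscape cell, columns are environmental
# variables (e.g. spectral bands, elevation, a historical land-use flag),
# and labels mark known semi-natural grassland occurrences.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 5))                 # 5 variables, 500 cells
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=500)) > 0

model = LogisticRegression().fit(X, y)        # dependency/correlation model
suitability = model.predict_proba(X)[:, 1]    # per-cell probability
print(suitability[:5])                        # candidate-cell scores
```

In practice, such scores would be thresholded or ranked to direct field surveys and management actions toward the most promising cells.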