52 results for Hierarchical models

in Helda - Digital Repository of the University of Helsinki


Relevance: 60.00%

Abstract:

Microarrays are high-throughput biological assays that allow the screening of thousands of genes for their expression. The main idea behind microarrays is to compute for each gene a unique signal that is directly proportional to the quantity of mRNA that was hybridized on the chip. The large number of steps involved, and the errors associated with each step, make the generated expression signal noisy. As a result, microarray data need to be carefully pre-processed before their analysis can be assumed to lead to reliable and biologically relevant conclusions. This thesis focuses on developing methods for improving the gene signal and on utilizing this improved signal in higher-level analysis. To achieve this, first, approaches for designing microarray experiments using various optimality criteria, considering both biological and technical replicates, are described. A carefully designed experiment yields a signal with low noise, as the effect of unwanted variation is minimized and the precision of the estimates of the parameters of interest is maximized. Second, a system for improving the gene signal by using three scans at varying scanner sensitivities is developed. A novel Bayesian latent intensity model is then applied to these three sets of expression values, corresponding to the three scans, to estimate the suitably calibrated true signal of each gene. Third, a novel image segmentation approach that segregates the fluorescent signal from undesired noise is developed using an additional dye, SYBR green RNA II. This technique helps identify signal arising only from the hybridized DNA, while signal corresponding to dust, scratches, spilled dye, and other noise is avoided. Fourth, an integrated statistical model is developed in which signal correction, systematic array effects, dye effects, and differential expression are modelled jointly, as opposed to a sequential application of several methods of analysis. The methods described here have been tested only on cDNA microarrays but can, with some modifications, also be applied to other high-throughput technologies. Keywords: high-throughput technology, microarray, cDNA, multiple scans, Bayesian hierarchical models, image analysis, experimental design, MCMC, WinBUGS.
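
As a rough illustration of the multiple-scan idea, the sketch below fits a heavily simplified Bayesian latent intensity model in Python with PyMC. This is an assumption about tooling (the thesis used WinBUGS), the model here omits censoring at scanner saturation, and all names, priors, and the simulated data are invented for illustration: each scan is assumed to measure the same latent spot intensity through its own multiplicative gain.

```python
import numpy as np
import pymc as pm
import pytensor.tensor as pt

# Simulated three-scan toy data: each scan measures the same latent spot
# intensity mu_g through its own gain, with multiplicative noise.
rng = np.random.default_rng(0)
G = 200
mu_true = rng.lognormal(mean=5.0, sigma=1.0, size=G)
gains_true = np.array([0.25, 1.0, 4.0])          # low/medium/high sensitivity
y = gains_true[None, :] * mu_true[:, None] * rng.lognormal(0.0, 0.1, size=(G, 3))

with pm.Model():
    mu = pm.LogNormal("mu", mu=5.0, sigma=2.0, shape=G)       # latent true signal
    # Fix the first scan's gain to 1 so the latent scale is identified.
    beta_free = pm.LogNormal("beta_free", mu=0.0, sigma=1.0, shape=2)
    beta = pt.concatenate([pt.ones(1), beta_free])
    tau = pm.HalfNormal("tau", 0.5)                           # log-scale noise sd
    pm.Normal("log_y", mu=pm.math.log(mu[:, None] * beta[None, :]),
              sigma=tau, observed=np.log(y))
    idata = pm.sample(1000, tune=1000, chains=2, random_seed=0)

# The posterior of mu gives a calibrated gene signal that combines all scans.
```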

Relevance: 20.00%

Abstract:

This thesis presents an interdisciplinary analysis of how models and simulations function in the production of scientific knowledge. The work is informed by three scholarly traditions: studies on models and simulations in the philosophy of science, so-called micro-sociological laboratory studies within science and technology studies, and cultural-historical activity theory. Methodologically, I adopt a naturalist epistemology and combine philosophical analysis with a qualitative, empirical case study of infectious-disease modelling. The study maintains a dual perspective throughout the analysis: it specifies the modelling practices and examines the models as objects of research. The research questions addressed are: 1) How are models constructed, and what functions do they have in the production of scientific knowledge? 2) What is interdisciplinarity in model construction? 3) How do models become a general research tool, and why is this process problematic? The core argument is that mediating models as investigative instruments (cf. Morgan and Morrison 1999) take questions as a starting point, and hence their construction is intentionally guided. This argument applies the interrogative model of inquiry (e.g., Sintonen 2005; Hintikka 1981), which conceives of all knowledge acquisition as a process of seeking answers to questions. The first question addresses simulation models as Artificial Nature, which is manipulated in order to answer the questions that initiated the model building. This account develops further the "epistemology of simulation" (cf. Winsberg 2003) by showing the interrelatedness of researchers and their objects in the process of modelling. The second question clarifies why interdisciplinary research collaboration is demanding and difficult to maintain. The nature of the impediments to disciplinary interaction is examined by introducing the idea of object-oriented interdisciplinarity, which provides an analytical framework for studying changes in the degree of interdisciplinarity, the tools and research practices developed to support the collaboration, and the mode of collaboration in relation to the historically mutable object of research. As my interest is in models as interdisciplinary objects, the third research problem asks how we might characterise these objects, what is typical of them, and what kinds of changes occur in the process of modelling. Here I examine the tension between specified, question-oriented models and more general models, and suggest that the specified models form a group of their own. I call these Tailor-made models, in opposition to the process of building a simulation platform that aims at generalisability and utility for health policy. This tension also underlines the challenge of applying research results (or methods and tools) to discuss and solve problems in decision-making processes.

Relevance: 20.00%

Abstract:

The subject of my research is romantic dating culture, the practice of 'going with', among preadolescents ('tweens') in Finland during the 1990s. Preadolescence is a cultural construction of the post-industrial period, experienced by school students between the ages of 7 and 13. Deemed by researchers a shallow, unchallenging and uninteresting period, it has been overshadowed in previous studies by early childhood and puberty. This study combines paradigms of the folkloristic research on children's lore, which began in the 1970s, with those of the later, turn-of-the-century girls' studies. The phenomena of romantic girl culture are studied in several ways, through ample and varied material collected in different places at different times. The research material was collected directly from schoolchildren through interviews, questionnaires and observation of preadolescents' behavior in discos, among other methods. Part of the material consists of reminiscent thematic writings, and part has been quoted from tween message boards. A general picture of romantic preadolescent dating culture is formed in this study from five previously published articles and a summary. The influence of Western culture, with its respect for relationships, is evident in tween dating culture. Seven- to thirteen-year-olds use the elements of the society around them to construct an appropriate way for themselves to 'go out' with someone. Many expressions of preadolescent dating culture run contrary to the models of adult relationships. For example, a couple isn't necessarily expected to meet even once, or the other party, the boy, doesn't even need to know he's dating someone. Girls organize and experience relationships through card fortune-telling, calculating 'Love Percentages', and other methods. Categorizing tween dating culture and its related emotional qualities from an adult point of view as simply play is one example of the hierarchical system of generations, in which childhood emotions, actions and conceptions of reality aren't valued as highly as the 'real life' of adults. Lowest on the totem pole are little girls, who in this study get their voices backed up by the researcher's adulthood and research-based sisterhood. Keywords: childhood, children's lore, dating culture, girls and boys, girls' studies, fortune-telling games, preadolescence/tweens

Relevance: 20.00%

Abstract:

Hypertension, obesity, dyslipidemia and dysglycemia constitute the metabolic syndrome, a major public health concern associated with cardiovascular mortality. High dietary salt (NaCl) is the most important dietary risk factor for elevated blood pressure. The kidney plays a major role in salt-sensitive hypertension and is vulnerable to the harmful effects of increased blood pressure. Elevated serum urate is a common finding in these disorders. As dysregulation of urate excretion is associated with cardiovascular diseases, the present studies aimed to clarify the role of xanthine oxidoreductase (XOR), i.e. xanthine dehydrogenase (XDH) and its post-translational isoform xanthine oxidase (XO), in cardiovascular diseases. XOR yields urate from hypoxanthine and xanthine, and low oxygen levels, among other factors, upregulate XOR. In the present studies, higher renal XOR activity was found in hypertension-prone rats than in controls. Furthermore, NaCl intake increased renal XOR dose-dependently. To clarify whether XOR has a causal role in hypertension, rats were kept on NaCl diets for different periods of time, with or without the XOR inhibitor allopurinol. While allopurinol did not alleviate hypertension, it prevented left ventricular and renal hypertrophy. Nitric oxide synthases (NOS) produce nitric oxide (NO), which mediates vasodilatation. A paucity of NO, produced by NOS inhibition, aggravated hypertension and induced renal XOR, whereas an NO-generating drug alleviated salt-induced hypertension without changes in renal XOR. The Zucker fa/fa rat is an animal model of the metabolic syndrome. These rats developed substantial obesity and modest hypertension and showed increased hepatic and renal XOR activities; XOR was modified by diet and antihypertensive treatment. Cyclosporine (CsA) is a fungal peptide and one of the first-line immunosuppressive drugs used in the management of organ transplantation. Nephrotoxicity ensues at high doses, resulting in hypertension, and limits CsA use. CsA increased renal XO substantially in salt-sensitive rats on a high-NaCl diet, indicating a possible role for this reactive oxygen species generating isoform in CsA nephrotoxicity. Renal hypoxia, common to these rodent models of hypertension and obesity, is one plausible XOR-inducing factor. Although XOR inhibition did not prevent hypertension, the present experimental data indicate that XOR plays a role in the pathology of salt-induced cardiac and renal hypertrophy.

Relevance: 20.00%

Abstract:

Objective and background. Tobacco smoking, pancreatitis and diabetes mellitus are the only known causes of pancreatic cancer, leaving ample room for as yet unidentified determinants. This is an empirical study on Finnish data on occupational exposures and pancreatic cancer risk, together with a non-Bayesian and a hierarchical Bayesian meta-analysis of data on occupational factors and pancreatic cancer. Methods. The case-control study analyzed 595 incident cases of pancreatic cancer and 1,622 controls with stomach, colon, or rectal cancer, diagnosed in 1984-1987 and known to be dead by 1990 in Finland. The next of kin responded to a mail questionnaire on job and medical histories and lifestyle. The meta-analysis of occupational risk factors of pancreatic cancer started from 1,903 identified studies, and the analyses were based on different subsets of that database. Five epidemiologists examined the reports and extracted the pertinent data using a standardized extraction form that covered 20 study descriptors and the relevant relative risk estimates. Random effects meta-analyses were applied to 23 chemical agents. In addition, hierarchical Bayesian models for meta-analysis were applied to the occupational data of 27 job titles, using a job-exposure matrix as a link matrix and estimating the relative risks of pancreatic cancer associated with nine occupational agents. Results. In the case-control study, logistic regressions revealed excess risks of pancreatic cancer associated with occupational exposures to ionizing radiation, non-chlorinated solvents, and pesticides. Chlorinated hydrocarbon solvents and related compounds, used mainly in metal degreasing and dry cleaning, emerged as likely risk factors of pancreatic cancer in both the non-Bayesian and the hierarchical Bayesian meta-analysis. A consistent excess risk was found for insecticides, and a high excess for nickel and nickel compounds in the random effects meta-analysis but not in the hierarchical Bayesian meta-analysis. Conclusions. In this study, occupational exposure to chlorinated hydrocarbon solvents and related compounds and to insecticides increased the risk of pancreatic cancer. Hierarchical Bayesian meta-analysis is applicable when studies addressing the agent(s) under study are lacking or very few, but several studies address job titles with potential exposure to these agents. A job-exposure matrix or a formal expert assessment system is necessary in this situation.
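
For readers unfamiliar with the non-Bayesian side of the comparison, a minimal random-effects pooling of study-level relative risks (the DerSimonian-Laird estimator) might look like the Python sketch below. The input numbers are fabricated for illustration and are not this study's data.

```python
import numpy as np

def dersimonian_laird(log_rr, se):
    """Pool study-level log relative risks with DerSimonian-Laird weights."""
    log_rr, se = np.asarray(log_rr, float), np.asarray(se, float)
    w = 1.0 / se**2                              # inverse-variance weights
    theta_fixed = np.sum(w * log_rr) / np.sum(w)
    q = np.sum(w * (log_rr - theta_fixed) ** 2)  # Cochran's Q heterogeneity
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(log_rr) - 1)) / c) # between-study variance
    w_re = 1.0 / (se**2 + tau2)                  # random-effects weights
    theta = np.sum(w_re * log_rr) / np.sum(w_re)
    se_theta = np.sqrt(1.0 / np.sum(w_re))
    lo, hi = theta - 1.96 * se_theta, theta + 1.96 * se_theta
    return np.exp(theta), np.exp(lo), np.exp(hi)

# Fabricated example: five studies of one agent, RRs with standard errors.
log_rr = np.log([1.4, 1.1, 2.0, 0.9, 1.6])
se = np.array([0.20, 0.15, 0.40, 0.25, 0.30])
print(dersimonian_laird(log_rr, se))  # pooled RR and 95% confidence interval
```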

Relevance: 20.00%

Abstract:

Nephrin is a transmembrane protein belonging to the immunoglobulin superfamily and is expressed primarily in podocytes, highly differentiated epithelial cells needed for primary urine formation in the kidney. Mutations leading to nephrin loss abrogate podocyte morphology and result in massive protein loss into the urine and consequent early death in humans carrying specific mutations in this gene; the disease phenotype is closely replicated in the respective mouse models. The purpose of this thesis was to generate novel inducible mouse lines that allow targeted gene deletion in a time- and tissue-specific manner. A proof-of-principle model of successful gene therapy for this disease was generated, in which podocyte-specific transgene replacement rescued gene-deficient mice from perinatal lethality. Furthermore, the phenotypic consequences of nephrin restoration in the kidney, and of nephrin deficiency in the testis, brain and pancreas, were investigated in the rescued mice. A novel podocyte-specific construct was produced using standard cloning techniques to provide an inducible tool for in vitro and in vivo gene targeting. Using modified constructs and microinjection procedures, two novel transgenic mouse lines were generated. First, a mouse line with doxycycline-inducible expression of Cre recombinase, allowing podocyte-specific gene deletion, was generated. Second, a mouse line with doxycycline-inducible expression of rat nephrin, allowing podocyte-specific nephrin over-expression, was made. Furthermore, it was possible to rescue nephrin-deficient mice from perinatal lethality by cross-breeding them with the mouse line with inducible rat nephrin expression, which restored the missing endogenous nephrin only in the kidney after doxycycline treatment. The rescued mice were smaller, infertile, showed genital malformations, and developed distinct histological abnormalities in the kidney with an altered molecular composition of the podocytes. Histological changes were also found in the testis, cerebellum and pancreas. The expression of another molecule with limited tissue expression, densin, was localized to the plasma membranes of Sertoli cells in the testis by immunofluorescence staining. Densin may be an essential adherens junction protein between Sertoli cells and developing germ cells, and these junctions share a similar protein assembly with kidney podocytes. This single, binary conditional construct serves as a cost- and time-efficient tool for increasing our understanding of podocyte-specific key proteins in health and disease. The results verified tightly controlled, inducible, podocyte-specific transgene expression in vitro and in vivo, as expected. These novel mouse lines with doxycycline-inducible Cre recombinase and rat nephrin expression will be useful for conditional gene targeting of essential podocyte proteins and for studying their functions in detail in adult mice. This is important for future diagnostic and pharmacological development platforms.

Relevance: 20.00%

Abstract:

This dissertation examines the short- and long-run impacts of timber prices and other factors affecting nonindustrial private forest (NIPF) owners' timber harvesting and timber stocking decisions. The utility-based Faustmann model provides testable hypotheses about the exogenous variables retained in the timber supply analysis. The timber stock function, derived from a two-period biomass harvesting model, is estimated using a two-step generalized method of moments (GMM) estimator on balanced panel data from 1983 to 1991. Timber supply functions are estimated using a Tobit model adjusted for heteroscedasticity and non-normality of errors, based on panel data from 1994 to 1998. The results show that if specification analysis of the Tobit model is ignored, inconsistency and bias can have a marked effect on the parameter estimates. The empirical results show that the owner's age is the single most important factor determining timber stock, and that timber price is the single most important factor in the harvesting decision. The results of the timber supply estimations can be interpreted using a utility-based Faustmann model of a forest owner who values growing timber in situ.
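
As background, a Tobit supply model treats observed harvests as the positive part of a latent variable, so zero harvests contribute a censoring probability to the likelihood. The Python sketch below is my illustration of basic Tobit maximum likelihood, not the dissertation's estimator (which additionally adjusts for heteroscedasticity and non-normality); the simulated data and coefficients are invented.

```python
import numpy as np
from scipy import optimize, stats

def tobit_negloglik(params, X, y):
    """Negative log-likelihood for a Tobit model left-censored at zero."""
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)                   # keep sigma positive
    xb = X @ beta
    ll = np.where(
        y <= 0,
        stats.norm.logcdf(-xb / sigma),                   # P(latent supply <= 0)
        stats.norm.logpdf((y - xb) / sigma) - log_sigma,  # density of observed harvests
    )
    return -ll.sum()

# Fabricated example: harvest volume on an intercept and two covariates.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(500), rng.normal(size=500), rng.normal(size=500)])
latent = X @ np.array([0.5, 1.0, -0.8]) + rng.normal(scale=1.5, size=500)
y = np.maximum(latent, 0.0)                     # zeros are censored decisions
res = optimize.minimize(tobit_negloglik, x0=np.zeros(4), args=(X, y), method="BFGS")
print(res.x[:-1], np.exp(res.x[-1]))            # beta estimates and sigma
```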

Relevance: 20.00%

Abstract:

An important safety aspect to consider when foods are enriched with phytosterols and phytostanols is the oxidative stability of these lipid compounds, i.e. their resistance to oxidation and thus to the formation of oxidation products. This study concentrated on producing scientific data to support this safety evaluation process. In the absence of an official method for analyzing phytosterol/stanol oxidation products, we first developed a new gas chromatographic-mass spectrometric (GC-MS) method. We then investigated factors affecting these compounds' oxidative stability in lipid-based food models in order to identify critical conditions under which significant oxidation reactions may occur. Finally, the oxidative stability of phytosterols and stanols in enriched foods during processing and storage was evaluated. The enriched foods covered a range of commercially available phytosterol/stanol ingredients, different heat treatments during food processing, and different multiphase food structures. The GC-MS method was a powerful tool for measuring oxidative stability. Data obtained in the food model studies revealed that the critical factors for the formation and distribution of the main secondary oxidation products were sterol structure, reaction temperature, reaction time, and lipid matrix composition. Under all conditions studied, phytostanols, as saturated compounds, were more stable than unsaturated phytosterols. In addition, esterification made phytosterols more reactive than free sterols at low temperatures, while at high temperatures the situation was reversed. Generally, oxidation reactions were more significant at temperatures above 100°C; at lower temperatures, the significance of these reactions increased with increasing reaction time. The effect of lipid matrix composition depended on temperature: above 140°C, phytosterols were more stable in an unsaturated lipid matrix, whereas below 140°C they were more stable in a saturated lipid matrix; at 140°C, phytosterols oxidized at the same rate in both matrices. Regardless of temperature, phytostanols oxidized more in an unsaturated lipid matrix. Generally, the distribution of oxidation products seemed to be associated with the phase of overall oxidation: 7-ketophytosterols accumulated while oxidation had not yet reached the dynamic state; once this state was attained, the major products were 5,6-epoxyphytosterols and 7-hydroxyphytosterols. The changes observed in phytostanol oxidation products were not as informative, since all stanol oxides quantified represented hydroxyl compounds. The formation of these secondary oxidation products did not account for all of the phytosterol/stanol losses observed during the heating experiments, indicating the presence of dimeric, oligomeric or other oxidation products, especially when free phytosterols and stanols were heated at high temperatures. Commercially available phytosterol/stanol ingredients were stable during food processes such as spray-drying and ultra-high-temperature (UHT) heating and subsequent long-term storage. Pan-frying, however, induced phytosterol oxidation and was classified as a rather deteriorative process. Overall, the findings indicate that although phytosterols and stanols are stable under normal food processing conditions, attention should be paid to their use in frying. The complex interactions with other food constituents also suggest that when new phytosterol-enriched foods are developed, their oxidative stability must first be established. The results presented here will assist in choosing safe conditions for phytosterol/stanol enrichment.

Relevance: 20.00%

Abstract:

Determining the environmental factors controlling earth surface processes and landform patterns is one of the central themes in physical geography. However, identifying the main drivers of geomorphological phenomena is often challenging, and novel spatial analysis and modelling methods could provide new insights into process-environment relationships. The objective of this research was to map and quantitatively analyse the occurrence of cryogenic phenomena in subarctic Finland. More precisely, utilising a grid-based approach, the distribution and abundance of periglacial landforms were modelled to identify important landscape-scale environmental factors. The study was performed using a comprehensive empirical data set of periglacial landforms from an area of 600 km² at a 25-ha resolution. The statistical methods utilised were generalized linear modelling (GLM) and hierarchical partitioning (HP): GLMs were used to produce distribution and abundance models, and HP to reveal independently the most likely causal variables. The GLM models were assessed utilising statistical evaluation measures, prediction maps, field observations and the results of the HP analyses. A total of 40 different landform types and subtypes were identified. Topographical, soil property and vegetation variables were the primary correlates of the occurrence and cover of active periglacial landforms on the landscape scale. In the model evaluation, most of the GLMs were shown to be robust, although the explanatory power, prediction ability and selected explanatory variables varied between the models. This study demonstrated the great potential of combining a spatial grid system, terrain data and novel statistical techniques to map the occurrence of periglacial landforms. GLM proved to be a useful modelling framework for testing the shapes of the response functions and the significance of the environmental variables, and the HP method helped in drawing better conclusions about the important factors behind earth surface processes. Hence, the numerical approach presented in this study can be a useful addition to the current range of techniques available to researchers for mapping and monitoring different geographical phenomena.
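
To make the grid-based GLM idea concrete, the sketch below fits a logistic (binomial) GLM for landform presence per grid cell using Python's statsmodels. The tooling is my choice, and the covariate names and simulated data are purely illustrative, not the thesis variables.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical grid data: one row per 25-ha cell with terrain covariates.
rng = np.random.default_rng(1)
n = 2400  # roughly 600 km2 at 25 ha per cell
df = pd.DataFrame({
    "slope": rng.gamma(2.0, 2.0, n),
    "soil_moisture": rng.uniform(0, 1, n),
    "vegetation_cover": rng.uniform(0, 1, n),
})
logit = -2.0 + 0.3 * df["slope"] + 2.0 * df["soil_moisture"] - 1.5 * df["vegetation_cover"]
df["landform_present"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Binomial GLM with a logit link: presence/absence of a landform per cell.
X = sm.add_constant(df[["slope", "soil_moisture", "vegetation_cover"]])
model = sm.GLM(df["landform_present"], X, family=sm.families.Binomial()).fit()
print(model.summary())
```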

Relevance: 20.00%

Abstract:

Whether a statistician wants to complement a probability model for observed data with a prior distribution and carry out fully probabilistic inference, or to base the inference only on the likelihood function, may be a fundamental question in theory; in practice, however, it may well be of less importance if the likelihood contains much more information than the prior. Maximum likelihood inference can be justified as a Gaussian approximation at the posterior mode, using flat priors. However, in situations where the parametric assumptions of standard statistical models would be too rigid, more flexible model formulation, combined with fully probabilistic inference, can be achieved using hierarchical Bayesian parametrization. This work comprises five articles, all of which apply probability modeling to problems involving incomplete observation. Three of the papers apply maximum likelihood estimation and two apply hierarchical Bayesian modeling. Because maximum likelihood may be presented as a special case of Bayesian inference, but not the other way round, in the introductory part of this work we present a framework for probability-based inference using only Bayesian concepts. We also re-derive some results presented in the original articles using the toolbox developed here, to show that they are also justifiable under this more general framework. The assumption of exchangeability and de Finetti's representation theorem are applied repeatedly to justify the use of standard parametric probability models with conditionally independent likelihood contributions, and it is argued that the same reasoning applies also under sampling from a finite population. The main emphasis is on probability-based inference under incomplete observation due to study design, illustrated using a generic two-phase cohort sampling design as an example. The alternative approaches presented for the analysis of such a design are full likelihood, which utilizes all observed information, and conditional likelihood, which is restricted to a completely observed set, conditioning on the rule that generated that set. Conditional likelihood inference is also applied in a joint analysis of prevalence and incidence data, a situation subject to both left censoring and left truncation. Other topics covered are model uncertainty and causal inference using posterior predictive distributions. We formulate a non-parametric monotonic regression model for one or more covariates together with a Bayesian estimation procedure, and apply the model in the context of optimal sequential treatment regimes, demonstrating that inference based on posterior predictive distributions is feasible in this case as well.
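
As a toy illustration of one ingredient, Bayesian non-parametric monotonic regression for a single covariate can be sketched in Python with PyMC. This is an assumed toolchain and construction, not the thesis implementation: monotonicity is enforced here by building the regression function as a cumulative sum of non-negative increments over a knot grid, and the data, priors, and names are invented.

```python
import numpy as np
import pymc as pm

# Fabricated one-covariate data with a monotone truth plus noise.
rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0, 1, 80))
y = 2.0 * np.sqrt(x) + rng.normal(scale=0.2, size=80)

knots = np.linspace(0, 1, 21)
bins = np.clip(np.searchsorted(knots, x, side="right") - 1, 0, len(knots) - 1)

with pm.Model():
    level = pm.Normal("level", 0.0, 5.0)                 # free overall level
    inc = pm.HalfNormal("inc", 1.0, shape=len(knots))    # non-negative steps
    f = pm.Deterministic("f", level + pm.math.cumsum(inc))  # monotone at knots
    sigma = pm.HalfNormal("sigma", 1.0)
    pm.Normal("y_obs", mu=f[bins], sigma=sigma, observed=y)
    idata = pm.sample(1000, tune=1000, chains=2, random_seed=2)

# Posterior draws of f give a regression band that is monotone in x by
# construction, so posterior predictive summaries inherit monotonicity.
```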