19 results for GAUSSIAN NUCLEUS MODELS

in Helda - Digital Repository of the University of Helsinki


Relevance:

30.00%

Abstract:

This work belongs to the field of computational high-energy physics (HEP). The key methods used in this thesis to meet the challenges raised by the Large Hadron Collider (LHC) era experiments are object-orientation with software engineering, Monte Carlo simulation, cluster computing technology, and artificial neural networks. The first aspect discussed is the development of hadronic cascade models, used for the accurate simulation of medium-energy hadron-nucleus reactions up to 10 GeV. These models are typically needed in hadronic calorimeter studies and in the estimation of radiation backgrounds. Applications outside HEP include the medical field (such as hadron treatment simulations), space science (satellite shielding), and nuclear physics (spallation studies). Validation results are presented for several significant improvements released in the Geant4 simulation tool, and the significance of the new models for computing in the LHC era is estimated. In particular, we estimate the ability of the Bertini cascade to simulate the Compact Muon Solenoid (CMS) hadron calorimeter (HCAL). LHC test beam activity has a tightly coupled simulation-to-data-analysis cycle; typically, a Geant4 computer experiment is used to understand test beam measurements. Thus, another aspect of this thesis is a description of studies related to developing new CMS H2 test beam data analysis tools and performing data analysis on the basis of CMS Monte Carlo events. These events have been simulated in detail using Geant4 physics models, a full CMS detector description, and event reconstruction. Using the ROOT data analysis framework, we have developed an offline ANN-based approach to tag b-jets associated with heavy neutral Higgs particles, and we show that this kind of NN methodology can be successfully used to separate the Higgs signal from the background in the CMS experiment.
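
As an illustration of the tagging approach described above, the following is a minimal sketch, not the thesis code, of a feed-forward neural-network classifier trained to separate a signal class from background using per-jet input variables. The toy features, network shape, and data are assumptions standing in for the b-jet observables used in the actual CMS analysis.

```python
# Minimal sketch of an ANN signal/background separator (assumed setup,
# not the thesis analysis). Toy Gaussian features stand in for per-jet
# observables such as impact parameters or secondary-vertex variables.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n = 5000
# Hypothetical per-jet observables: background and signal classes
# drawn from overlapping Gaussian distributions.
background = rng.normal(loc=0.0, scale=1.0, size=(n, 3))
signal = rng.normal(loc=1.0, scale=1.0, size=(n, 3))
X = np.vstack([background, signal])
y = np.array([0] * n + [1] * n)  # 0 = background, 1 = signal

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
tagger = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=500, random_state=0)
tagger.fit(X_train, y_train)
print(f"test accuracy: {tagger.score(X_test, y_test):.3f}")
```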

Relevance:

30.00%

Abstract:

Cosmological inflation is the dominant paradigm for explaining the origin of structure in the universe. According to the inflationary scenario, there was a period of nearly exponential expansion in the very early universe, long before nucleosynthesis. Inflation is commonly considered a consequence of some scalar field or fields whose energy density comes to dominate the universe. The inflationary expansion converts the quantum fluctuations of the fields into classical perturbations on superhorizon scales, and these primordial perturbations are the seeds of the structure in the universe. Moreover, inflation also naturally explains the high degree of homogeneity and spatial flatness of the early universe. The real challenge of inflationary cosmology lies in establishing a connection between the fields driving inflation and theories of particle physics. In this thesis we concentrate on inflationary models at scales well below the Planck scale. The low scale allows us to seek candidates for the inflationary matter within extensions of the Standard Model, but it typically also implies fine-tuning problems. We discuss a low-scale model where inflation is driven by a flat direction of the Minimal Supersymmetric Standard Model. The relation between the potential along the flat direction and the underlying supergravity model is studied. The low inflationary scale requires an extremely flat potential, but we find that in this particular model the associated fine-tuning problems can be solved rather naturally in a class of supergravity models. For this class of models, the flatness is a consequence of the structure of the supergravity model and is insensitive to the vacuum expectation values of the fields that break supersymmetry. Another low-scale model considered in the thesis is the curvaton scenario, where the primordial perturbations originate from quantum fluctuations of a curvaton field, which is distinct from the fields driving inflation. The curvaton gives a negligible contribution to the total energy density during inflation, but its perturbations become significant in the post-inflationary epoch. The separation between the fields driving inflation and the fields giving rise to primordial perturbations opens up new possibilities for lowering the inflationary scale without introducing fine-tuning problems. The curvaton model typically gives rise to a relatively high level of non-Gaussian features in the statistics of primordial perturbations. We find that the level of non-Gaussian effects depends heavily on the form of the curvaton potential. Future observations that provide more accurate information about the non-Gaussian statistics can therefore place tight bounds on the curvaton interactions.
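
As a point of reference, not a result of this thesis, the dependence of the non-Gaussianity on the curvaton sector is often illustrated in the curvaton literature with the following expression for the non-linearity parameter, valid for a quadratic curvaton potential:

    f_{\mathrm{NL}} \simeq \frac{5}{4r} - \frac{5}{3} - \frac{5r}{6}, \qquad r = \left.\frac{3\rho_\sigma}{3\rho_\sigma + 4\rho_\gamma}\right|_{\mathrm{decay}},

so a curvaton that is still subdominant when it decays (r \ll 1) gives f_{\mathrm{NL}} \approx 5/(4r), while departures from the quadratic potential modify this prediction. This is the kind of sensitivity to the curvaton potential noted above.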

Relevance:

30.00%

Abstract:

Part I: Parkinson's disease is a slowly progressive neurodegenerative disorder in which particularly the dopaminergic neurons of the substantia nigra pars compacta degenerate and die. Current conventional treatment relieves symptoms but has no effect on the progression of the disease. Gene therapy research has focused on restoring the lost brain function by at least two means: substitution of critical enzymes needed for the synthesis of dopamine, and slowing down the progression of the disease by supporting the functions of the remaining nigral dopaminergic neurons with neurotrophic factors. The striatal levels of enzymes such as tyrosine hydroxylase, dopa decarboxylase and GTP-CH1 decrease as the disease progresses. By replacing one or all of these enzymes, dopamine levels in the striatum may be restored to normal and the behavioral impairments caused by the disease may be ameliorated, especially in the later stages of the disease. The neurotrophic factors glial cell line-derived neurotrophic factor (GDNF) and neurturin have been shown to protect and restore the functions of dopaminergic cell somas and terminals, as well as to improve behavior, in animal lesion models. This therapy may be best suited to the early stages of the disease, when there are more dopaminergic neurons for the neurotrophic factors to reach. Viral vector-mediated gene transfer provides a tool to deliver proteins with complex structures into specific brain locations and provides long-term protein over-expression. Part II: The aim of our study was to investigate the effects of two orally dosed COMT inhibitors, entacapone (10 and 30 mg/kg) and tolcapone (10 and 30 mg/kg), with subsequent administration of the peripheral dopa decarboxylase inhibitor carbidopa (30 mg/kg) and L-dopa (30 mg/kg), on dopamine and its metabolite levels in the dorsal striatum and nucleus accumbens of freely moving rats, using dual-probe in vivo microdialysis. Earlier, similarly designed studies had been conducted only in the dorsal striatum. We also confirmed the results of earlier ex vivo studies regarding the effects of intraperitoneally dosed tolcapone (30 mg/kg) and entacapone (30 mg/kg) on striatal and hepatic COMT activity. The results obtained from the dorsal striatum were generally in line with earlier studies: tolcapone tended to increase dopamine and DOPAC levels and to decrease HVA levels, while entacapone tended to keep striatal dopamine and HVA levels elevated longer than in controls and also tended to elevate DOPAC levels. Surprisingly, in the nucleus accumbens dopamine levels were not elevated after either dose of entacapone or tolcapone. Accumbal DOPAC levels, especially in the tolcapone 30 mg/kg group, were elevated nearly to the same extent as in the dorsal striatum. Entacapone 10 mg/kg elevated accumbal HVA levels more than the 30 mg/kg dose, and the effect was more pronounced in the nucleus accumbens than in the dorsal striatum. This suggests that entacapone 30 mg/kg has minor central effects. Our ex vivo results from the dorsal striatum also suggest that entacapone 30 mg/kg has minor and transient central effects, even though central HVA levels were not suppressed below those of the control group in either brain area in the microdialysis study. Both entacapone and tolcapone suppressed hepatic COMT activity more than striatal COMT activity; tolcapone was the more effective of the two in the dorsal striatum.
The differences between dopamine and metabolite levels in the dorsal striatum and nucleus accumbens may be due to the different properties of the two brain areas.

Relevance:

20.00%

Abstract:

This thesis presents an interdisciplinary analysis of how models and simulations function in the production of scientific knowledge. The work is informed by three scholarly traditions: studies on models and simulations in the philosophy of science, so-called micro-sociological laboratory studies within science and technology studies, and cultural-historical activity theory. Methodologically, I adopt a naturalist epistemology and combine philosophical analysis with a qualitative, empirical case study of infectious-disease modelling. The study maintains a dual perspective throughout the analysis: it specifies the modelling practices and examines the models as objects of research. The research questions addressed are: 1) How are models constructed, and what functions do they have in the production of scientific knowledge? 2) What is interdisciplinarity in model construction? 3) How do models become a general research tool, and why is this process problematic? The core argument is that mediating models as investigative instruments (cf. Morgan and Morrison 1999) take questions as a starting point, and hence their construction is intentionally guided. This argument applies the interrogative model of inquiry (e.g., Sintonen 2005; Hintikka 1981), which conceives of all knowledge acquisition as a process of seeking answers to questions. The first question addresses simulation models as Artificial Nature, which is manipulated in order to answer the questions that initiated the model building. This account develops the "epistemology of simulation" (cf. Winsberg 2003) further by showing the interrelatedness of researchers and their objects in the process of modelling. The second question clarifies why interdisciplinary research collaboration is demanding and difficult to maintain. The nature of the impediments to disciplinary interaction is examined by introducing the idea of object-oriented interdisciplinarity, which provides an analytical framework for studying changes in the degree of interdisciplinarity, the tools and research practices developed to support the collaboration, and the mode of collaboration in relation to the historically mutable object of research. As my interest is in the models as interdisciplinary objects, the third research question asks how we might characterise these objects, what is typical of them, and what kinds of changes happen in the process of modelling. Here I examine the tension between specified, question-oriented models and more general models, and suggest that the specified models form a group of their own. I call these Tailor-made models, in opposition to the process of building a simulation platform that aims at generalisability and utility for health policy. This tension also underlines the challenge of applying research results (or methods and tools) to discuss and solve problems in decision-making processes.

Relevance:

20.00%

Abstract:

Hypertension, obesity, dyslipidemia and dysglycemia constitute the metabolic syndrome, a major public health concern associated with cardiovascular mortality. High dietary salt (NaCl) is the most important dietary risk factor for elevated blood pressure. The kidney plays a major role in salt-sensitive hypertension and is vulnerable to the harmful effects of increased blood pressure. Elevated serum urate is a common finding in these disorders. Because dysregulation of urate excretion is associated with cardiovascular diseases, the present studies aimed to clarify the role of xanthine oxidoreductase (XOR), i.e. xanthine dehydrogenase (XDH) and its post-translational isoform xanthine oxidase (XO), in cardiovascular diseases. XOR yields urate from hypoxanthine and xanthine, and low oxygen levels, among other factors, upregulate XOR. In the present studies, higher renal XOR activity was found in hypertension-prone rats than in controls. Furthermore, NaCl intake increased renal XOR dose-dependently. To clarify whether XOR has a causal role in hypertension, rats were kept on NaCl diets for different periods of time, with or without the XOR inhibitor allopurinol. While allopurinol did not alleviate hypertension, it prevented left ventricular and renal hypertrophy. Nitric oxide synthases (NOS) produce nitric oxide (NO), which mediates vasodilatation. A paucity of NO, caused by NOS inhibition, aggravated hypertension and induced renal XOR, whereas an NO-generating drug alleviated salt-induced hypertension without changes in renal XOR. The Zucker fa/fa rat is an animal model of the metabolic syndrome. These rats developed substantial obesity and modest hypertension and showed increased hepatic and renal XOR activities; XOR was modified by diet and antihypertensive treatment. Cyclosporine (CsA) is a fungal peptide and one of the first-line immunosuppressive drugs used in the management of organ transplantation. Nephrotoxicity ensues at high doses, resulting in hypertension, and limits the use of CsA. CsA increased renal XO substantially in salt-sensitive rats on a high-NaCl diet, indicating a possible role for this reactive oxygen species-generating isoform in CsA nephrotoxicity. Renal hypoxia, common to these rodent models of hypertension and obesity, is one plausible XOR-inducing factor. Although XOR inhibition did not prevent hypertension, the present experimental data indicate that XOR plays a role in the pathology of salt-induced cardiac and renal hypertrophy.

Relevance:

20.00%

Abstract:

Nephrin is a transmembrane protein belonging to the immunoglobulin superfamily and is expressed primarily in podocytes, highly differentiated epithelial cells needed for primary urine formation in the kidney. Mutations leading to nephrin loss abrogate podocyte morphology and result in massive protein loss into the urine and consequent early death in humans carrying specific mutations in this gene. The disease phenotype is closely replicated in the respective mouse models. The purpose of this thesis was to generate novel inducible mouse lines that allow targeted gene deletion in a time- and tissue-specific manner. A proof-of-principle model of successful gene therapy for this disease was generated, in which podocyte-specific transgene replacement rescued gene-deficient mice from perinatal lethality. Furthermore, the phenotypic consequences of nephrin restoration in the kidney, and of nephrin deficiency in the testis, brain and pancreas, were investigated in the rescued mice. A novel podocyte-specific construct was produced using standard cloning techniques to provide an inducible tool for in vitro and in vivo gene targeting. Using modified constructs and microinjection procedures, two novel transgenic mouse lines were generated: first, a mouse line with doxycycline-inducible expression of Cre recombinase, which allows podocyte-specific gene deletion; and second, a mouse line with doxycycline-inducible expression of rat nephrin, which allows podocyte-specific nephrin over-expression. Furthermore, it was possible to rescue nephrin-deficient mice from perinatal lethality by cross-breeding them with the mouse line with inducible rat nephrin expression, which restored the missing endogenous nephrin only in the kidney after doxycycline treatment. The rescued mice were smaller, infertile, and showed genital malformations; they developed distinct histological abnormalities in the kidney with an altered molecular composition of the podocytes. Histological changes were also found in the testis, cerebellum and pancreas. The expression of another molecule with limited tissue expression, densin, was localized to the plasma membranes of Sertoli cells in the testis by immunofluorescence staining. Densin may be an essential adherens junction protein between Sertoli cells and developing germ cells, and these junctions share a similar protein assembly with kidney podocytes. This single, binary conditional construct serves as a cost- and time-efficient tool for increasing our understanding of podocyte-specific key proteins in health and disease. The results verified tightly controlled, inducible, podocyte-specific transgene expression in vitro and in vivo, as expected. These novel mouse lines with doxycycline-inducible Cre recombinase and rat nephrin expression will be useful for conditional gene targeting of essential podocyte proteins and for studying their functions in detail in adult mice. This is important for future diagnostic and pharmacologic development platforms.

Relevance:

20.00%

Abstract:

This dissertation examines the short- and long-run impacts of timber prices and other factors affecting nonindustrial private forest (NIPF) owners' timber harvesting and timber stocking decisions. The utility-based Faustmann model provides testable hypotheses about the exogenous variables retained in the timber supply analysis. The timber stock function, derived from a two-period biomass harvesting model, is estimated using a two-step GMM estimator on balanced panel data from 1983 to 1991. Timber supply functions are estimated using a Tobit model adjusted for heteroscedasticity and non-normality of errors, on panel data from 1994 to 1998. The results show that if specification analysis of the Tobit model is ignored, inconsistency and bias can have a marked effect on the parameter estimates. The empirical results show that the owner's age is the single most important factor determining timber stock, while timber price is the single most important factor in the harvesting decision. The results of the timber supply estimations can be interpreted using a utility-based Faustmann model of a forest owner who values growing timber in situ.
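
To make the estimation strategy concrete, here is a minimal sketch of maximum-likelihood estimation for a standard Tobit model, left-censored at zero with homoskedastic normal errors, on synthetic data. The thesis adjusts the Tobit for heteroscedasticity and non-normality, an extension omitted here, and the variable names are illustrative only.

```python
# Minimal Tobit MLE sketch (assumed standard left-censored-at-zero model,
# not the dissertation's adjusted specification; data are synthetic).
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(1)
n = 2000
x = rng.normal(size=n)                       # stand-in for timber price
y_star = 0.5 + 1.2 * x + rng.normal(size=n)  # latent desired harvest
y = np.maximum(y_star, 0.0)                  # observed: zero when latent <= 0

def neg_loglik(params):
    b0, b1, log_sigma = params
    sigma = np.exp(log_sigma)                # keep sigma positive
    mu = b0 + b1 * x
    censored = y <= 0.0
    # Censored observations contribute P(y* <= 0); uncensored ones the
    # normal density of the observed harvest.
    ll = np.where(
        censored,
        stats.norm.logcdf(-mu / sigma),
        stats.norm.logpdf(y, loc=mu, scale=sigma),
    )
    return -ll.sum()

res = optimize.minimize(neg_loglik, x0=np.zeros(3), method="BFGS")
b0, b1, log_sigma = res.x
print(f"b0={b0:.3f}, b1={b1:.3f}, sigma={np.exp(log_sigma):.3f}")
```

Ignoring the censoring and running ordinary least squares on such data biases the slope toward zero, which is the kind of specification issue the dissertation's Tobit analysis addresses.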

Relevance:

20.00%

Abstract:

An important safety aspect to be considered when foods are enriched with phytosterols and phytostanols is the oxidative stability of these lipid compounds, i.e. their resistance to oxidation and thus to the formation of oxidation products. This study concentrated on producing scientific data to support this safety evaluation process. In the absence of an official method for analyzing phytosterol/stanol oxidation products, we first developed a new gas chromatographic-mass spectrometric (GC-MS) method. We then investigated the factors affecting these compounds' oxidative stability in lipid-based food models in order to identify the critical conditions under which significant oxidation reactions may occur. Finally, the oxidative stability of phytosterols and stanols in enriched foods during processing and storage was evaluated. The enriched foods covered a range of commercially available phytosterol/stanol ingredients, different heat treatments during food processing, and different multiphase food structures. The GC-MS method proved a powerful tool for measuring oxidative stability. Data obtained in the food model studies revealed that the critical factors for the formation and distribution of the main secondary oxidation products were sterol structure, reaction temperature, reaction time, and lipid matrix composition. Under all conditions studied, phytostanols, as saturated compounds, were more stable than unsaturated phytosterols. In addition, esterification made phytosterols more reactive than free sterols at low temperatures, while at high temperatures the situation was reversed. Generally, oxidation reactions were more significant at temperatures above 100°C; at lower temperatures, the significance of these reactions increased with increasing reaction time. The effect of lipid matrix composition was temperature-dependent: above 140°C, phytosterols were more stable in an unsaturated lipid matrix, whereas below 140°C they were more stable in a saturated lipid matrix; at 140°C, phytosterols oxidized at the same rate in both matrices. Regardless of temperature, phytostanols oxidized more in an unsaturated lipid matrix. Generally, the distribution of oxidation products appeared to be associated with the phase of overall oxidation: 7-ketophytosterols accumulated while oxidation had not yet reached the dynamic state, and once this state was attained, the major products were 5,6-epoxyphytosterols and 7-hydroxyphytosterols. The changes observed in the phytostanol oxidation products were less informative, since all the stanol oxides quantified were hydroxyl compounds. The formation of these secondary oxidation products did not account for all of the phytosterol/stanol losses observed during the heating experiments, indicating the presence of dimeric, oligomeric or other oxidation products, especially when free phytosterols and stanols were heated at high temperatures. Commercially available phytosterol/stanol ingredients were stable during food processes such as spray-drying and ultra-high-temperature (UHT)-type heating and subsequent long-term storage. Pan-frying, however, induced phytosterol oxidation and was classified as a rather deteriorative process. Overall, the findings indicate that although phytosterols and stanols are stable under normal food processing conditions, attention should be paid to their use in frying. The complex interactions with other food constituents also suggest that when new phytosterol-enriched foods are developed, their oxidative stability must first be established. The results presented here will assist in choosing safe conditions for phytosterol/stanol enrichment.

Relevance:

20.00%

Abstract:

Whether a statistician wants to complement a probability model for observed data with a prior distribution and carry out fully probabilistic inference, or to base the inference on the likelihood function alone, may be a fundamental question in theory; in practice, however, it may well be of less importance if the likelihood contains much more information than the prior. Maximum likelihood inference can be justified as a Gaussian approximation at the posterior mode, using flat priors. However, in situations where the parametric assumptions of standard statistical models would be too rigid, a more flexible model formulation, combined with fully probabilistic inference, can be achieved using hierarchical Bayesian parametrization. This work includes five articles, all of which apply probability modeling to various problems involving incomplete observation. Three of the papers apply maximum likelihood estimation and two hierarchical Bayesian modeling. Because maximum likelihood can be presented as a special case of Bayesian inference, but not the other way round, in the introductory part of this work we present a framework for probability-based inference using only Bayesian concepts. We also re-derive some results presented in the original articles using the toolbox provided here, to show that they are also justifiable under this more general framework. The assumption of exchangeability and de Finetti's representation theorem are applied repeatedly to justify the use of standard parametric probability models with conditionally independent likelihood contributions, and it is argued that the same reasoning applies under sampling from a finite population. The main emphasis is on probability-based inference under incomplete observation due to study design, illustrated using a generic two-phase cohort sampling design as an example. The alternative approaches presented for the analysis of such a design are the full likelihood, which utilizes all observed information, and the conditional likelihood, which is restricted to a completely observed set, conditioning on the rule that generated that set. Conditional likelihood inference is also applied to a joint analysis of prevalence and incidence data, a situation subject to both left censoring and left truncation. Other topics covered are model uncertainty and causal inference using posterior predictive distributions. We formulate a non-parametric monotonic regression model for one or more covariates together with a Bayesian estimation procedure, and apply the model in the context of optimal sequential treatment regimes, demonstrating that inference based on posterior predictive distributions is feasible in this case as well.
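
The Gaussian approximation mentioned above can be written out explicitly; this is the standard Laplace-approximation statement, not notation taken from the thesis itself. Denoting by \hat\theta the posterior mode and by H the negative Hessian of the log-posterior at the mode,

    \theta \mid y \;\approx\; \mathrm{N}\big(\hat\theta,\, H^{-1}\big), \qquad H = -\nabla^2 \log p(\theta \mid y)\,\big|_{\theta = \hat\theta},

and under a flat prior \hat\theta coincides with the maximum likelihood estimate while H equals the observed information, so maximum likelihood point estimation with information-based standard errors reproduces this approximate posterior.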