86 results for "Situation models"

in Helda - Digital Repository of the University of Helsinki


Relevance: 30.00%

Abstract:

An important safety aspect to be considered when foods are enriched with phytosterols and phytostanols is the oxidative stability of these lipid compounds, i.e. their resistance to oxidation and thus to the formation of oxidation products. This study concentrated on producing scientific data to support this safety evaluation process. In the absence of an official method for analyzing phytosterol/stanol oxidation products, we first developed a new gas chromatographic-mass spectrometric (GC-MS) method. We then investigated factors affecting these compounds' oxidative stability in lipid-based food models in order to identify critical conditions under which significant oxidation reactions may occur. Finally, the oxidative stability of phytosterols and stanols in enriched foods during processing and storage was evaluated. The enriched foods covered a range of commercially available phytosterol/stanol ingredients, different heat treatments during food processing, and different multiphase food structures. The GC-MS method was a powerful tool for measuring oxidative stability. Data obtained in the food model studies revealed that the critical factors for the formation and distribution of the main secondary oxidation products were sterol structure, reaction temperature, reaction time, and lipid matrix composition. Under all conditions studied, phytostanols, as saturated compounds, were more stable than unsaturated phytosterols. In addition, esterification made phytosterols more reactive than free sterols at low temperatures, while at high temperatures the situation was reversed. Generally, oxidation reactions were more significant at temperatures above 100°C. At lower temperatures, the significance of these reactions increased with increasing reaction time.
The effect of lipid matrix composition was dependent on temperature; at temperatures above 140°C, phytosterols were more stable in an unsaturated lipid matrix, whereas below 140°C they were more stable in a saturated lipid matrix. At 140°C, phytosterols oxidized at the same rate in both matrices. Regardless of temperature, phytostanols oxidized more in an unsaturated lipid matrix. Generally, the distribution of oxidation products seemed to be associated with the phase of overall oxidation. 7-Ketophytosterols accumulated when oxidation had not yet reached the dynamic state. Once this state was attained, the major products were 5,6-epoxyphytosterols and 7-hydroxyphytosterols. The changes observed in phytostanol oxidation products were not as informative, since all stanol oxides quantified were hydroxyl compounds. The formation of these secondary oxidation products did not account for all of the phytosterol/stanol losses observed during the heating experiments, indicating the presence of dimeric, oligomeric or other oxidation products, especially when free phytosterols and stanols were heated at high temperatures. Commercially available phytosterol/stanol ingredients were stable during such food processes as spray-drying and ultra-high-temperature (UHT)-type heating and subsequent long-term storage. Pan-frying, however, induced phytosterol oxidation and was classified as a deteriorative process. Overall, the findings indicated that although phytosterols and stanols are stable under normal food processing conditions, attention should be paid to their use in frying. Complex interactions with other food constituents also suggest that when new phytosterol-enriched foods are developed, their oxidative stability must first be established. The results presented here will assist in choosing safe conditions for phytosterol/stanol enrichment.

Relevance: 20.00%

Abstract:

This thesis presents an interdisciplinary analysis of how models and simulations function in the production of scientific knowledge. The work is informed by three scholarly traditions: studies on models and simulations in the philosophy of science, so-called micro-sociological laboratory studies within science and technology studies, and cultural-historical activity theory. Methodologically, I adopt a naturalist epistemology and combine philosophical analysis with a qualitative, empirical case study of infectious-disease modelling. The study maintains a dual perspective throughout the analysis: it specifies the modelling practices and examines the models as objects of research. The research questions addressed in this study are: 1) How are models constructed, and what functions do they have in the production of scientific knowledge? 2) What is interdisciplinarity in model construction? 3) How do models become a general research tool, and why is this process problematic? The core argument is that mediating models as investigative instruments (cf. Morgan and Morrison 1999) take questions as a starting point, and hence their construction is intentionally guided. This argument applies the interrogative model of inquiry (e.g., Sintonen 2005; Hintikka 1981), which conceives of all knowledge acquisition as a process of seeking answers to questions. The first question addresses simulation models as Artificial Nature, which is manipulated in order to answer the questions that initiated the model building. This account develops further the "epistemology of simulation" (cf. Winsberg 2003) by showing the interrelatedness of researchers and their objects in the process of modelling. The second question clarifies why interdisciplinary research collaboration is demanding and difficult to maintain.
The nature of the impediments to disciplinary interaction is examined by introducing the idea of object-oriented interdisciplinarity, which provides an analytical framework for studying changes in the degree of interdisciplinarity, the tools and research practices developed to support the collaboration, and the mode of collaboration in relation to the historically mutable object of research. As my interest is in the models as interdisciplinary objects, the third research problem asks how we might characterise these objects, what is typical of them, and what kinds of changes happen in the process of modelling. Here I examine the tension between specified, question-oriented models and more general models, and suggest that the specified models form a group of their own. I call these Tailor-made models, in opposition to the process of building a simulation platform that aims at generalisability and utility for health policy. This tension also underlines the challenge of applying research results (or methods and tools) to discuss and solve problems in decision-making processes.

Relevance: 20.00%

Abstract:

In this study I discuss G. W. Leibniz's (1646-1716) views on rational decision-making from the standpoint of both God and man. The divine decision takes place within creation, as God freely chooses the best from an infinite number of possible worlds. While God's choice is based on absolutely certain knowledge, human decisions on practical matters are mostly based on uncertain knowledge. However, in many respects the two could be regarded as analogous in more complicated situations. In addition to giving an overview of divine decision-making and critically discussing the criteria God favours in his choice, I provide an account of Leibniz's views on human deliberation, which includes some new ideas. One of these concerns the importance of estimating probabilities when making decisions: one estimates both the goodness of the act itself and its consequences as far as the desired good is concerned. Another idea is related to the plurality of goods in complicated decisions and the competition this may provoke. Thirdly, heuristic models are used to sketch situations under deliberation in order to help in making the decision. Combining the views of Marcelo Dascal, Jaakko Hintikka and Simo Knuuttila, I argue that Leibniz applied two kinds of models of rational decision-making to practical controversies, often without explicating the details. The simpler, traditional pair-of-scales model is best suited to cases in which one has to decide for or against some option, or to distribute goods among parties and strive for a compromise. What may be of more help in more complicated deliberations is the novel vectorial model, which is an instance of the general mathematical doctrine of the calculus of variations. To illustrate this distinction, I discuss some cases in which he apparently applied these models in different kinds of situations. These examples support the view that the models had a systematic value in his theory of practical rationality.

Relevance: 20.00%

Abstract:

Objectives. The sentence span task is a complex working memory span task used for estimating total working memory capacity for both processing (sentence comprehension) and storage (remembering a set of words). Several traditional models of working memory suggest that performance on these tasks relies on phonological short-term storage. However, long-term memory effects as well as the effects of expertise and strategies have challenged this view. This study uses a working memory task that aids the creation of retrieval structures in the form of stories, which have been shown to form integrated structures in long-term memory. The research question is whether sentence and story contexts boost memory performance in a complex working memory task. The hypothesis is that storage of the words in the task takes place in long-term memory. Evidence of this would be better recall for words as parts of sentences than for separate words, and, particularly, a beneficial effect for words as part of an organized story. Methods. Twenty stories consisting of five sentences each were constructed, and the stimuli in all experimental conditions were based on these sentences and sentence-final words, reordered and recombined for the other conditions. Participants read aloud sets of five sentences that either formed a story or did not. In one condition they had to report all the last words at the end of the set; in another, they memorised an additional separate word with each sentence. The sentences were presented on the screen one word at a time (500 ms). After the presentation of each sentence, the participant verified a statement about the sentence. After five sentences, the participant repeated back the words in their correct positions. Experiment 1 (n=16) used immediate recall; experiment 2 (n=21) used both immediate recall and recall after a distraction interval (the operation span task).
In experiment 2 a distracting mental arithmetic task was presented instead of recall in half of the trials, and an individual word was added before each sentence in the two experimental conditions in which the participants were to memorize the sentence-final words. Subjects also performed a listening span task (in exp. 1) or an operation span task (in exp. 2) to allow comparison of the estimated span and performance in the story task. Results were analysed using correlations, repeated measures ANOVA and a chi-square goodness-of-fit test on the distribution of errors. Results and discussion. Both the relatedness of the sentences (the story condition) and the inclusion of the words in sentences helped memory. An interaction showed that the story condition had a greater effect on last words than on separate words. The beneficial effect of the story was shown in all serial positions. The effects remained in delayed recall. When the sentences formed stories, performance in verification of the statements about sentence context was better. This, as well as the differing distributions of errors in the different experimental conditions, suggests that different levels of representation are in use in the different conditions. In the story condition, these representations could take the form of an organized memory structure, a situation model. The other working memory tasks had only a few weak correlations with the story task. This could indicate that different processes are in use in the tasks. The results do not support short-term phonological storage, but instead are compatible with the words being encoded into LTM during the task.
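The chi-square goodness-of-fit comparison of error distributions mentioned in the analysis reduces to a one-line statistic; a minimal sketch with invented error counts (the categories and numbers are illustrative, not data from the study):

```python
def chi_square_gof(observed, expected):
    """Chi-square goodness-of-fit statistic: sum over categories of (O - E)^2 / E."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical error counts over four error categories in one condition,
# compared against an expected distribution (e.g. pooled over conditions).
observed = [18, 7, 3, 2]
expected = [12.0, 9.0, 6.0, 3.0]
stat = chi_square_gof(observed, expected)  # ≈ 5.28
```

A large statistic relative to the chi-square distribution with (number of categories minus one) degrees of freedom indicates that a condition's error distribution departs from the expected one.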

Relevance: 20.00%

Abstract:

The crucial questions that define democracy relate to its depth and width: who can participate in decision making, and what kinds of things can be commonly decided? Theories deeper than representative democracy emphasize discussion, in which, by evaluating justifications, it is possible to achieve consensus in an ideal situation. The aim of my research is to develop tools for promoting the participation of third graders in decision making in the classroom. In addition, I study the development of the depth and width of democracy in the classroom, the development of skills and competencies in decision making, and the challenges of the project. My research method is participative action research. I collected my data between October 2007 and March 2008. I used videos and observation diaries as my primary data. Additional data consisted of interviews with the students, conversations between the adults, and material produced by the teacher. Since we discussed the matters students had highlighted in specific lessons, my analysis proceeds according to these lessons, constructing a general view of the process. The width and depth of classroom democracy are difficult to define. Though the system we had created enabled third graders to discuss matters they found important, participation was unequal: some of the students could not, among other things, give justifications for their opinions. This poses challenges for models that emphasize deliberation, since these theories presuppose that everyone has equal competencies. Then again, only critical citizens who are able to make justifications and to evaluate them are able to oppose indoctrination, which makes teaching these competencies justified. Different decision-making procedures define classroom democracy. Deliberation doesn't necessarily provide deeper information about the preferences of the participants than mere voting, but voting, in turn, doesn't express the reasons that support one's preferences.
Structured conversation can equalize the time given to every participant's opinions, but doesn't solve the challenge of unequal competencies. The children's suggestion box diversified the possibilities to participate, and even the silent students used it during the research. The asymmetry in deliberation might also be caused by the social structure of the student group. The teacher's directing and participation-encouraging role in deliberation was significant. Diversifying participation through different roles could reduce the asymmetry in participation.

Relevance: 20.00%

Abstract:

Hypertension, obesity, dyslipidemia and dysglycemia constitute the metabolic syndrome, a major public health concern, which is associated with cardiovascular mortality. High dietary salt (NaCl) is the most important dietary risk factor for elevated blood pressure. The kidney has a major role in salt-sensitive hypertension and is vulnerable to the harmful effects of increased blood pressure. Elevated serum urate is a common finding in these disorders. As dysregulation of urate excretion is associated with cardiovascular diseases, the present studies aimed to clarify the role of xanthine oxidoreductase (XOR), i.e. xanthine dehydrogenase (XDH) and its post-translational isoform xanthine oxidase (XO), in cardiovascular diseases. XOR yields urate from hypoxanthine and xanthine. Low oxygen levels, among other factors, upregulate XOR. In the present studies, higher renal XOR activity was found in hypertension-prone rats than in controls. Furthermore, NaCl intake increased renal XOR dose-dependently. To clarify whether XOR has any causal role in hypertension, rats were kept on NaCl diets for different periods of time, with or without a XOR inhibitor, allopurinol. While allopurinol did not alleviate hypertension, it prevented left ventricular and renal hypertrophy. Nitric oxide synthases (NOS) produce nitric oxide (NO), which mediates vasodilatation. A paucity of NO, produced by NOS inhibition, aggravated hypertension and induced renal XOR, whereas a NO-generating drug alleviated salt-induced hypertension without changes in renal XOR. The Zucker fa/fa rat is an animal model of the metabolic syndrome. These rats developed substantial obesity and modest hypertension and showed increased hepatic and renal XOR activities. XOR was modified by diet and antihypertensive treatment. Cyclosporine (CsA) is a fungal peptide and one of the first-line immunosuppressive drugs used in the management of organ transplantation. Nephrotoxicity ensues at high doses, resulting in hypertension, and limits the use of CsA.
CsA increased renal XO substantially in salt-sensitive rats on a high-NaCl diet, indicating a possible role for this reactive oxygen species-generating isoform in CsA nephrotoxicity. Renal hypoxia, common to these rodent models of hypertension and obesity, is one of the plausible XOR-inducing factors. Although XOR inhibition did not prevent hypertension, the present experimental data indicate that XOR plays a role in the pathology of salt-induced cardiac and renal hypertrophy.

Relevance: 20.00%

Abstract:

Objective and background. Tobacco smoking, pancreatitis and diabetes mellitus are the only known causes of pancreatic cancer, leaving ample room for yet unidentified determinants. This is an empirical study on Finnish data on occupational exposures and pancreatic cancer risk, together with a non-Bayesian and a hierarchical Bayesian meta-analysis of data on occupational factors and pancreatic cancer. Methods. The case-control study analyzed 595 incident cases of pancreatic cancer and 1,622 controls with stomach, colon, and rectum cancer, diagnosed in 1984-1987 and known to be dead by 1990 in Finland. The next of kin responded to a mail questionnaire on job and medical histories and lifestyles. The meta-analysis of occupational risk factors of pancreatic cancer started with 1,903 identified studies. The analyses were based on different subsets of that database. Five epidemiologists examined the reports and extracted the pertinent data using a standardized extraction form that covered 20 study descriptors and the relevant relative risk estimates. Random-effects meta-analyses were applied for 23 chemical agents. In addition, hierarchical Bayesian models for meta-analysis were applied to the occupational data of 27 job titles, using a job-exposure matrix as a link matrix and estimating the relative risks of pancreatic cancer associated with nine occupational agents. Results. In the case-control study, logistic regressions revealed excess risks of pancreatic cancer associated with occupational exposures to ionizing radiation, nonchlorinated solvents, and pesticides. Chlorinated hydrocarbon solvents and related compounds, used mainly in metal degreasing and dry cleaning, emerged as likely risk factors of pancreatic cancer in both the non-Bayesian and the hierarchical Bayesian meta-analysis. A consistent excess risk was found for insecticides, and a high excess for nickel and nickel compounds in the random-effects meta-analysis but not in the hierarchical Bayesian meta-analysis.
Conclusions. In this study, occupational exposures to chlorinated hydrocarbon solvents and related compounds, and to insecticides, increased the risk of pancreatic cancer. Hierarchical Bayesian meta-analysis is applicable when studies addressing the agent(s) under study are lacking or very few, but several studies address job titles with potential exposure to these agents. A job-exposure matrix or a formal expert assessment system is necessary in this situation.
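The random-effects pooling step used for the 23 chemical agents can be sketched in a few lines. This uses the DerSimonian-Laird estimator, a standard choice for random-effects meta-analysis, though the abstract does not name the exact method; the study estimates below are invented for illustration:

```python
import math

def dersimonian_laird(log_rr, var):
    """Pool per-study log relative risks under a random-effects model.

    log_rr: per-study log relative risk estimates
    var:    their within-study variances
    """
    w = [1.0 / v for v in var]                       # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_rr)) / sum(w)
    # Cochran's Q measures between-study heterogeneity
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rr))
    k = len(log_rr)
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)               # between-study variance
    w_star = [1.0 / (v + tau2) for v in var]         # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, log_rr)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se, tau2

# Three hypothetical studies of one agent: relative risks 1.4, 1.1 and 1.8
log_rr = [math.log(1.4), math.log(1.1), math.log(1.8)]
var = [0.04, 0.02, 0.09]
pooled, se, tau2 = dersimonian_laird(log_rr, var)
```

Exponentiating `pooled` and `pooled ± 1.96 * se` gives the pooled relative risk and its approximate 95% confidence interval.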

Relevance: 20.00%

Abstract:

Nephrin is a transmembrane protein belonging to the immunoglobulin superfamily and is expressed primarily in podocytes, which are highly differentiated epithelial cells needed for primary urine formation in the kidney. Mutations leading to nephrin loss abrogate podocyte morphology and result in massive protein loss into urine and consequent early death in humans carrying specific mutations in this gene. The disease phenotype is closely replicated in respective mouse models. The purpose of this thesis was to generate novel inducible mouse lines that allow targeted gene deletion in a time- and tissue-specific manner. A proof-of-principle model of successful gene therapy for this disease was generated, in which podocyte-specific transgene replacement rescued gene-deficient mice from perinatal lethality. Furthermore, the phenotypic consequences of nephrin restoration in the kidney and of nephrin deficiency in the testis, brain and pancreas of rescued mice were investigated. A novel podocyte-specific construct was produced using standard cloning techniques to provide an inducible tool for in vitro and in vivo gene targeting. Using modified constructs and microinjection procedures, two novel transgenic mouse lines were generated. First, a mouse line with doxycycline-inducible expression of Cre recombinase, allowing podocyte-specific gene deletion, was generated. Second, a mouse line with doxycycline-inducible expression of rat nephrin, allowing podocyte-specific nephrin over-expression, was made. Furthermore, it was possible to rescue nephrin-deficient mice from perinatal lethality by cross-breeding them with the mouse line with inducible rat nephrin expression, which restored the missing endogenous nephrin only in the kidney after doxycycline treatment. The rescued mice were smaller, infertile, showed genital malformations and developed distinct histological abnormalities in the kidney with an altered molecular composition of the podocytes.
Histological changes were also found in the testis, cerebellum and pancreas. The expression of another molecule with limited tissue expression, densin, was localized to the plasma membranes of Sertoli cells in the testis by immunofluorescence staining. Densin may be an essential adherens junction protein between Sertoli cells and developing germ cells, and these junctions share a similar protein assembly with kidney podocytes. This single, binary conditional construct serves as a cost- and time-efficient tool for increasing the understanding of podocyte-specific key proteins in health and disease. The results verified tightly controlled, inducible, podocyte-specific transgene expression in vitro and in vivo, as expected. These novel mouse lines with doxycycline-inducible Cre recombinase and rat nephrin expression will be useful for conditional gene targeting of essential podocyte proteins and for studying their functions in detail in adult mice. This is important for future diagnostic and pharmacologic development platforms.

Relevance: 20.00%

Abstract:

The aim of this study was to investigate the effects of location, site type, regeneration method and precommercial thinning on the characteristics and development of young, even-aged, pure Scots pine stands. In addition, the effects of the timing and intensity of the first commercial thinning on yield and profitability during the rotation period were studied. The stand characteristics and external quality of young Scots pine stands, and the stand-level growth models, were based on extensive inventory data of the Finnish Forest Research Institute for young Scots pine stands (3 measurement times, 192 stands). The effect of precommercial thinning on stand development was examined on the basis of long-term experiments (13 stands, 169 plots). The effects of the timing and intensity of the first commercial thinning on yield and profitability were based on measurements made in first commercial thinnings (27 stands of Metsähallitus), and further stand development was modeled using the MOTTI simulator. The thesis is based on four articles and a summary. Stand-level growth models were developed for young, even-aged Scots pine stands. The models reliably predicted development up until the first commercial thinning stage. The stand density of young Scots pine stands in Finland was moderately low compared with the target values. In addition, the external quality of the pines was low on average. The low stand density and poor external quality will create a need for quality-based tree selection in thinnings if high-quality sawn timber is required. In Northern Finland, only 20% of the dominant trees were classified as normal. This will lead to a situation where external quality remains relatively poor until the end of the rotation. Early and light precommercial thinning (at a dominant height of 3 m, to a density of 3000 trees per hectare) increased the thinning removal by 40% compared with late and more intensive precommercial thinning (at 7 meters, to a density of 2000 trees per hectare).
A model for the effect of precommercial thinning on merchantable thinning removal at the first commercial thinning was developed for forest management planning purposes. When the recommended time of the first commercial thinning was delayed from a dominant height of 12 m to 16 m, or by ten years, the yield of merchantable wood was doubled. Simultaneously, the current value of the stumpage revenues (with a 4% interest rate) increased on average by 65% (€330 per hectare). Variation in stumpage prices or interest rates did not affect the final results. Without exception, delaying the first commercial thinning by ten years seemed to be the most profitable option. This presupposes that precommercial thinning has been carried out at the right time and that tree quality aspects need not be specially considered. Furthermore, the wood yield and economic outcome over the entire rotation were similar regardless of whether the first thinning was performed at the currently recommended time or ten years later.
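The profitability comparison above rests on ordinary discounting of future stumpage revenues. A minimal sketch; only the 4% interest rate comes from the study, while the revenue figures are invented:

```python
def present_value(revenue, years_from_now, rate=0.04):
    """Discount a future stumpage revenue to its current value."""
    return revenue / (1.0 + rate) ** years_from_now

# Hypothetical choice: thin now for 500 €/ha, or delay ten years for a
# doubled merchantable removal (here assumed to double the revenue too).
thin_now = present_value(500.0, 0)
thin_later = present_value(1000.0, 10)  # ≈ 676 €/ha in current value
```

Even after discounting at 4%, the delayed thinning is worth more in current value, which mirrors the study's conclusion that delaying the first commercial thinning was the more profitable option.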

Relevance: 20.00%

Abstract:

This dissertation examines the short- and long-run impacts of timber prices and other factors affecting NIPF owners' timber harvesting and timber stocking decisions. The utility-based Faustmann model provides testable hypotheses about the exogenous variables retained in the timber supply analysis. The timber stock function, derived from a two-period biomass harvesting model, is estimated using a two-step GMM estimator based on balanced panel data from 1983 to 1991. Timber supply functions are estimated using a Tobit model adjusted for heteroscedasticity and nonnormality of errors, based on panel data from 1994 to 1998. The results show that if specification analysis of the Tobit model is ignored, inconsistency and bias can have a marked effect on parameter estimates. The empirical results show that the owner's age is the single most important factor determining timber stock, while timber price is the single most important factor in the harvesting decision. The results of the timber supply estimations can be interpreted using a utility-based Faustmann model of a forest owner who values growing timber in situ.
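The Tobit specification handles observations with zero harvest by mixing a censoring probability with a normal density for positive harvests. A minimal sketch of the type-I Tobit log-likelihood with a single regressor; the data and parameter values are invented, and the dissertation's actual model additionally adjusts for heteroscedasticity and nonnormality:

```python
import math

def tobit_loglik(beta0, beta1, sigma, x, y):
    """Log-likelihood of a type-I Tobit model censored at zero:
    y_i = max(0, beta0 + beta1 * x_i + e_i), with e_i ~ N(0, sigma^2)."""
    def pdf(z):  # standard normal density
        return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    def cdf(z):  # standard normal distribution function
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    ll = 0.0
    for xi, yi in zip(x, y):
        mu = beta0 + beta1 * xi
        if yi > 0.0:  # uncensored: density of the observed harvest
            ll += math.log(pdf((yi - mu) / sigma) / sigma)
        else:         # censored: probability of observing no harvest
            ll += math.log(cdf(-mu / sigma))
    return ll

x = [1.0, 2.0, 3.0, 4.0]   # e.g. a rescaled timber price
y = [0.0, 0.5, 1.5, 2.5]   # harvest volumes; zero = no harvest
ll_good = tobit_loglik(-1.0, 1.0, 1.0, x, y)
ll_bad = tobit_loglik(-1.0, -1.0, 1.0, x, y)
```

Parameters close to the data-generating process yield a higher log-likelihood (`ll_good > ll_bad`); maximum likelihood estimation searches for the parameter values at the top of this surface.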

Relevance: 20.00%

Abstract:

The Department of Forest Resource Management at the University of Helsinki carried out the so-called SIMO project in 2004-2007 to develop a new-generation planning system for forest management. The project parties are the organisations that do most Finnish forest planning in government-, industry- and privately-owned forests. The aim of this study was to find out the needs and requirements for the new forest planning system and to clarify how the parties see the targets and processes of today's forest planning. The representatives responsible for forest planning in each organisation were interviewed one by one. According to the study, the stand-based system for managing and treating forests will continue in the future. Because of variable data acquisition methods with differing accuracy and sources, and the development of single-tree interpretation, more and more forest data is collected without field work. The benefits of using more specific forest data also call for the use of information units smaller than the tree stand. In Finland the traditional way to arrange forest planning computation is divided into two elements. After the forest data is updated to the present situation, every stand unit's growth is simulated under different alternative treatment schedules. After simulation, optimisation selects one treatment schedule for every stand so that the management program satisfies the owner's goals in the best possible way. This arrangement will be maintained in the future system. The parties' requirements to add multi-criteria problem solving, group decision support methods, and heuristic and spatial optimisation to the system make the programming work more challenging. Generally, the new system is expected to be adjustable and transparent. Strict documentation and free source code help to bring these expectations into effect. Variable growth models and treatment schedules with different source information, accuracy, methods and processing speeds are expected to work easily in the system.
Possibilities to calibrate models regionally and to set local parameters that change over time are also required. In the future, the forest planning system will be integrated into comprehensive data management systems together with geographic, economic and work supervision information. This requires a modular method of implementing the system and the use of a simple data transmission interface between modules and with other systems. No major differences in the parties' views of the system's requirements were noticed in this study; rather, the interviews completed the full picture from slightly different angles. Within the organisations, forest management planning is considered quite inflexible, and it only draws the strategic lines. It does not yet have a role in operative activity, although the need for and benefits of team-level forest planning are acknowledged. The demands and opportunities of variable forest data, new planning goals and the development of information technology are known. The party organisations want to keep on track with development; one example is their engagement in the extensive SIMO project, which connects the whole field of forest planning in Finland.
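The simulate-then-optimise arrangement described above (simulate alternative treatment schedules per stand, then select one schedule per stand to satisfy the owner's goals) can be sketched as follows. All names, criteria and numbers are invented for illustration; a production system such as SIMO optimises across stands, with the heuristic and spatial methods mentioned above, rather than greedily per stand:

```python
def choose_schedules(stands, w_income=0.6, w_volume=0.4):
    """For each stand, pick the simulated treatment schedule that maximises
    a weighted sum of two owner goals: harvest income and end-of-period
    standing volume (a stand-in for the owner's utility function)."""
    plan = {}
    for stand_id, schedules in stands.items():
        best = max(schedules,
                   key=lambda s: w_income * s["income"] + w_volume * s["end_volume"])
        plan[stand_id] = best["name"]
    return plan

# Two stands, each with two simulated treatment schedules
stands = {
    "stand-1": [
        {"name": "no_thinning", "income": 0, "end_volume": 220},
        {"name": "thin_now", "income": 1800, "end_volume": 150},
    ],
    "stand-2": [
        {"name": "no_thinning", "income": 0, "end_volume": 180},
        {"name": "clearcut", "income": 5200, "end_volume": 0},
    ],
}
plan = choose_schedules(stands)  # one schedule per stand
```

Changing the goal weights changes the plan: with all weight on standing volume, no harvesting is chosen anywhere, which illustrates why the system must support owner-specific, multi-criteria goal definitions.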