917 results for C51 - Model Construction and Estimation
Abstract:
National anniversaries such as independence days demand precise coordination in order to make citizens change their routines to forego work and spend the day at rest or at festivities that provide social focus and spectacle. The complex social construction of national days is taken for granted and operates as a given in the news media, which are the main agents responsible for coordinating these planned disruptions of normal routines. This study examines the language used in the news to construct the rather unnatural idea of national days and to align people in observing them. The data for the study consist of news stories about the Fourth of July in the New York Times, sampled over 150 years and are supplemented by material from other sources and other countries. The study is multidimensional, applying concepts from pragmatics (speech acts, politeness, information structure), systemic functional linguistics (the interpersonal metafunction and the Appraisal framework) and cognitive linguistics (frames, metaphor) as well as journalism and communications to arrive at an interdisciplinary understanding of how resources for meaning are used by writers and readers of the news stories. The analysis shows that on national anniversaries, nations tend to be metaphorized as persons having birthdays, to whom politeness should be shown. The face of the nation is to be respected in the sense of identifying the nation's interests as one's own (positive face) and speaking of citizen responsibilities rather than rights (negative face). Resources are available for both positive and negative evaluations of events and participants and the newspaper deftly changes footings (Goffman 1981) to demonstrate the required politeness while also heteroglossically allowing for a certain amount of disattention and even protest - within limits, for state holidays are almost never construed as Bakhtinian festivals, as they tend to reaffirm the hierarchy rather than invert it. Celebrations are evaluated mainly for impressiveness, and for the essentially contested quality of appropriateness, which covers norms of predictability, size, audience response, aesthetics, and explicit reference to the past. Events may also be negatively evaluated as dull ("banal") or inauthentic ("hoopla"). Audiences are evaluated chiefly in terms of their enthusiasm, or production of appropriate displays for emotional response, for national days are supposed to be occasions of flooding-out of nationalistic feeling. By making these evaluations, the newspaper reinforces its powerful position as an independent critic, while at the same time playing an active role in the construction and reproduction of emotional order embodied in "the nation's birthday." As an occasion for mobilization and demonstrations of power, national days may be seen to stand to war in the relation of play to fighting (Bateson 1955). Evidence from the newspaper's coverage of recent conflicts is adduced to support this analysis. In the course of the investigation, methods are developed for analyzing large collections of newspaper content, particularly topical soft news and feature materials that have hitherto been considered less influential and worthy of study than so-called hard news. 
In his work on evaluation in newspaper stories, White (1998) proposed that the classic hard news story is focused on an event that threatens the social order, but news of holidays and celebrations in general does not fit this pattern; in fact, its central event is a reproduction of the social order. Thus, in the system of news values (Galtung and Ruge 1965), national holiday news draws on "ground" news values such as continuity and predictability rather than "figure" news values such as negativity and surprise. It is argued that this ground helps form a necessary space for hard news to be seen as important, similar to the way in which the information structure of language is seen to rely on the regular alternation of given and new information (Chafe 1994).
Abstract:
This thesis presents an interdisciplinary analysis of how models and simulations function in the production of scientific knowledge. The work is informed by three scholarly traditions: studies on models and simulations in philosophy of science, so-called micro-sociological laboratory studies within science and technology studies, and cultural-historical activity theory. Methodologically, I adopt a naturalist epistemology and combine philosophical analysis with a qualitative, empirical case study of infectious-disease modelling. This study has a dual perspective throughout the analysis: it specifies the modelling practices and examines the models as objects of research. The research questions addressed in this study are: 1) How are models constructed and what functions do they have in the production of scientific knowledge? 2) What is interdisciplinarity in model construction? 3) How do models become a general research tool and why is this process problematic? The core argument is that the mediating models as investigative instruments (cf. Morgan and Morrison 1999) take questions as a starting point, and hence their construction is intentionally guided. This argument applies the interrogative model of inquiry (e.g., Sintonen 2005; Hintikka 1981), which conceives of all knowledge acquisition as a process of seeking answers to questions. The first question addresses simulation models as Artificial Nature, which is manipulated in order to answer the questions that initiated the model building. This account further develops the "epistemology of simulation" (cf. Winsberg 2003) by showing the interrelatedness of researchers and their objects in the process of modelling. The second question clarifies why interdisciplinary research collaboration is demanding and difficult to maintain. The nature of the impediments to disciplinary interaction is examined by introducing the idea of object-oriented interdisciplinarity, which provides an analytical framework for studying the changes in the degree of interdisciplinarity, the tools and research practices developed to support the collaboration, and the mode of collaboration in relation to the historically mutable object of research. As my interest is in models as interdisciplinary objects, the third research problem asks how we might characterise these objects, what is typical of them, and what kinds of changes occur in the process of modelling. Here I examine the tension between specified, question-oriented models and more general models, and suggest that the specified models form a group of their own. I call these Tailor-made models, in opposition to the process of building a simulation platform that aims at generalisability and utility for health policy. This tension also underlines the challenge of applying research results (or methods and tools) to discuss and solve problems in decision-making processes.
Urinary tract infection of mice to model human disease: Practicalities, implications and limitations
Abstract:
Urinary tract infections (UTIs) are among the most common bacterial infections in humans. Murine models of human UTI are vital experimental tools that have helped to elucidate UTI pathogenesis and advance knowledge of potential treatment and infection prevention strategies. Fundamentally, several variables are inherent in different murine models, and understanding the limitations of these variables provides an opportunity to understand how models may be best applied to research aimed at mimicking human disease. In this review, we discuss variables inherent in murine UTI model studies and how these affect model usage, data analysis and data interpretation. We examine recent studies that have elucidated UTI host–pathogen interactions from the perspective of gene expression, and review new studies of biofilm and UTI preventative approaches. We also consider potential standards for variables inherent in murine UTI models and discuss how these might expand the utility of models for mimicking human disease and uncovering new aspects of pathogenesis.
Abstract:
In genetic epidemiology, population-based disease registries are commonly used to collect genotype or other risk factor information concerning affected subjects and their relatives. This work presents two new approaches to the statistical inference of ascertained data: conditional and full likelihood approaches for diseases with a variable age-at-onset phenotype, using familial data obtained from a population-based registry of incident cases. The aim is to obtain statistically reliable estimates of the general population parameters. The statistical analysis of familial data with variable age at onset becomes more complicated when some of the study subjects are non-susceptible, that is to say, these subjects never get the disease. A statistical model for variable age at onset with long-term survivors is proposed for studies of familial aggregation, using a latent variable approach, as well as for prospective genetic association studies with candidate genes. In addition, we explore the possibility of a genetic explanation for the observed increase in the incidence of Type 1 diabetes (T1D) in Finland in recent decades and the hypothesis of non-Mendelian transmission of T1D-associated genes. Both classical and Bayesian statistical inference were used in the modelling and estimation. Although this work contains five studies with different statistical models, they all concern data obtained from nationwide registries of T1D and the genetics of T1D. In the analyses of the T1D data, non-Mendelian transmission of T1D susceptibility alleles was not observed. In addition, non-Mendelian transmission of T1D susceptibility genes did not provide a plausible explanation for the increase in T1D incidence in Finland. Instead, Human Leucocyte Antigen associations with T1D were confirmed in a population-based analysis combining T1D registry information, a reference sample of healthy subjects, and birth cohort information for the Finnish population. Finally, substantial familial variation in susceptibility to T1D nephropathy was observed. The presented studies show the benefits of sophisticated statistical modelling in exploring risk factors for complex diseases.
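As a rough illustration of the kind of long-term-survivor (mixture cure) model described above, the sketch below writes down a likelihood in which each subject is susceptible with some probability and, if susceptible, has a parametric age at onset; censored subjects may be either non-susceptible or not yet affected. The Weibull choice, parameter values and simulated data are assumptions for illustration only, not the thesis's actual model.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

def neg_log_likelihood(params, t, event):
    """Mixture cure likelihood: susceptibility probability p plus a Weibull
    age-at-onset distribution for susceptible subjects (illustrative)."""
    logit_p, log_shape, log_scale = params
    p = 1.0 / (1.0 + np.exp(-logit_p))           # probability of being susceptible
    shape, scale = np.exp(log_shape), np.exp(log_scale)
    f = weibull_min.pdf(t, shape, scale=scale)   # onset density (susceptibles)
    S = weibull_min.sf(t, shape, scale=scale)    # onset-free survival (susceptibles)
    # Observed onsets contribute p*f(t); censored subjects contribute (1-p) + p*S(t).
    ll = event * np.log(p * f + 1e-300) + (1 - event) * np.log((1 - p) + p * S)
    return -np.sum(ll)

# Simulated registry-style data (purely illustrative).
rng = np.random.default_rng(0)
n = 500
susceptible = rng.random(n) < 0.3
onset_age = weibull_min.rvs(2.0, scale=15.0, size=n, random_state=rng)
censor_age = rng.uniform(5.0, 40.0, size=n)
event = (susceptible & (onset_age <= censor_age)).astype(int)
t = np.where(event == 1, onset_age, censor_age)

fit = minimize(neg_log_likelihood, x0=[0.0, 0.0, np.log(10.0)],
               args=(t, event), method="Nelder-Mead")
print("logit(p), log(shape), log(scale):", fit.x)
```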
Abstract:
The forest simulator is a computerized model for predicting forest growth and future development as well as the effects of forest harvests and treatments. The forest planning system is a decision support tool, usually including a forest simulator and an optimisation model, for finding the optimal forest management actions. The information produced by forest simulators and forest planning systems is used for various analytical purposes and in support of decision making. However, the quality and reliability of this information can often be questioned. Natural variation in forest growth and estimation errors in forest inventory, among other things, cause uncertainty in predictions of forest growth and development. This uncertainty, stemming from different sources, has various undesirable effects. In many cases the outcomes of decisions based on uncertain information differ from what was desired. The objective of this thesis was to study various sources of uncertainty and their effects in forest simulators and forest planning systems. The study focused on three notable sources of uncertainty: errors in forest growth predictions, errors in forest inventory data, and stochastic fluctuation of timber assortment prices. The effects of uncertainty were studied using two types of forest growth models, individual tree-level models and stand-level models, and with various error simulation methods. A new method for simulating more realistic forest inventory errors was introduced and tested. In addition, the three notable sources of uncertainty were combined and their joint effects on stand-level net present value estimates were simulated. According to the results, the various sources of uncertainty can have distinct effects in different forest growth simulators. The new forest inventory error simulation method proved to produce more realistic errors. The analysis of the joint effects of the various sources of uncertainty provided interesting knowledge about uncertainty in forest simulators.
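The joint-effects analysis described above can be pictured as a Monte Carlo exercise in which inventory error, growth prediction error and timber price fluctuation are all sampled and propagated into a net present value estimate. The sketch below is a minimal illustration under assumed error distributions and stand figures; it is not the simulators or error models used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(42)
n_sim = 10_000
horizon_years = 30
discount_rate = 0.03

measured_volume = 180.0    # m3/ha from inventory (illustrative)
mean_growth = 5.0          # m3/ha/yr (illustrative)
mean_price = 55.0          # EUR/m3 (illustrative)

# Three sources of uncertainty, analogous to those listed in the abstract.
inventory_error = rng.normal(0.0, 20.0, n_sim)          # additive inventory error, m3/ha
growth_multiplier = rng.normal(1.0, 0.15, n_sim)        # relative growth prediction error
price_at_harvest = rng.normal(mean_price, 8.0, n_sim)   # stochastic timber price

volume_at_harvest = (measured_volume + inventory_error
                     + mean_growth * growth_multiplier * horizon_years)
npv = volume_at_harvest * price_at_harvest / (1.0 + discount_rate) ** horizon_years

print(f"mean NPV {npv.mean():.0f} EUR/ha, 90% interval "
      f"[{np.percentile(npv, 5):.0f}, {np.percentile(npv, 95):.0f}]")
```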
Abstract:
Separately, polyphenols and plant cell walls (PCW) are important contributors to the health benefits associated with fruits and vegetables. However, interactions with PCW which occur either during food preparation or mastication may affect the bioaccessibility and hence the bioavailability of polyphenols. Binding interactions between anthocyanins, phenolic acids (PAs) and PCW components were evaluated using both a bacterial cellulose-pectin model system and a black carrot puree system. The majority of available polyphenols bound to PCW material, with 60-70% of available anthocyanins and PAs, respectively, binding to black carrot puree PCW matter. Once bound, release of polyphenols using acidified methanol is low, with only ~20% of total anthocyanins and ~30% of PAs being released. Less than 2% of bound polyphenol was released after in vitro gastric and small intestinal (S.I.) digestion for both the model system and the black carrot puree PCW matter. Confocal laser scanning microscopy shows localised binding of anthocyanins to PCW. Very similar patterns of binding for anthocyanins and PAs suggest that PAs form complexes with anthocyanins and polysaccharides. Time-dependent changes in extractability with acidified methanol, but not in the total bound fraction, suggest that initial nonspecific deposition on cellulose surfaces is followed by rearrangement of the bound molecules. Minimal release of anthocyanins and PAs after simulated gastric and S.I. digestion indicates that polyphenols in fruits and vegetables which bind to the PCW will be transported to the colon, where they would be expected to be released by the action of cell wall degrading bacteria.
Abstract:
The aim of this thesis was to develop measurement techniques and systems for measuring air quality and to provide information about air quality conditions and the amount of gaseous emissions from semi-insulated and uninsulated dairy buildings in Finland and Estonia. Specialization and intensification in livestock farming, such as in dairy production, are usually accompanied by an increase in concentrated environmental emissions. In addition to high moisture, the presence of dust and corrosive gases, and widely varying gas concentrations in dairy buildings, Finland and Estonia experience winter temperatures reaching below -40 ºC and summer temperatures above +30 ºC. The adoption of new technologies for long-term air quality monitoring and measurement remains relatively uncommon in dairy buildings because the construction and maintenance of accurate monitoring systems for long-term use are too expensive for the average dairy farmer to afford. Though accurate air quality measurement systems intended mainly for research purposes have been documented in the past, standardised methods and the documentation of affordable systems and simple methods for performing air quality and emissions measurements in dairy buildings are unavailable. In this study, we built three measurement systems: 1) a Stationary system with integrated affordable sensors for on-site measurements, 2) a Wireless system with affordable sensors for off-site measurements, and 3) a Mobile system consisting of expensive and accurate sensors for measuring air quality. In addition to assessing existing methods, we developed simplified methods for measuring ventilation and emission rates in dairy buildings. The three measurement systems were successfully used to measure air quality in uninsulated, semi-insulated, and fully-insulated dairy buildings between the years 2005 and 2007. When carefully calibrated, the affordable sensors in the systems gave reasonably accurate readings. The spatial air quality survey showed high variation in microclimate conditions in the dairy buildings measured. The average indoor air concentration of carbon dioxide was 950 ppm, of ammonia 5 ppm, and of methane 48 ppm; the average relative humidity was 70% and the average inside air velocity 0.2 m/s. The average winter and summer indoor temperatures during the measurement period were -7 ºC and +24 ºC for the uninsulated, +3 ºC and +20 ºC for the semi-insulated, and +10 ºC and +25 ºC for the fully-insulated dairy buildings. The measurement results showed that the uninsulated dairy buildings had lower indoor gas concentrations and emissions compared to fully insulated buildings. Although occasionally exceeded, the ventilation rates and average indoor air quality in the dairy buildings were largely within recommended limits. We assessed the traditional heat balance, moisture balance, carbon dioxide balance and direct airflow methods for estimating ventilation rates. Direct velocity measurement for the estimation of ventilation rate proved to be impractical for naturally ventilated buildings. Two methods were developed for estimating ventilation rates. The first method is applicable in buildings in which the ventilation can be stopped or completely closed. The second method is useful in naturally ventilated buildings with large openings and high ventilation rates where spatial gas concentrations are heterogeneously distributed.
The two traditional methods (carbon dioxide and methane balances) and two newly developed methods (theoretical modelling using Fick's law and boundary layer theory, and the recirculation flux-chamber technique) were used to estimate ammonia emissions from the dairy buildings. Using the traditional carbon dioxide balance method, ammonia emissions per cow from the dairy buildings ranged from 7 g day-1 to 35 g day-1, and methane emissions per cow ranged from 96 g day-1 to 348 g day-1. The developed methods proved to be as accurate as the traditional methods. Variation between the mean emissions estimated with the traditional and the developed methods was less than 20%. The developed modelling procedure provided a sound framework for examining the impact of production systems on ammonia emissions in dairy buildings.
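For orientation, the carbon dioxide balance method mentioned above reduces to two small formulas: the ventilation rate follows from the animals' CO2 production divided by the indoor-outdoor CO2 concentration difference, and a gas emission rate follows from that ventilation rate times the indoor-outdoor concentration difference of the gas. The sketch below uses illustrative figures (CO2 production per cow, concentrations), not the measured values reported in the thesis.

```python
def ventilation_rate_m3_h(co2_production_m3_h, co2_in_ppm, co2_out_ppm):
    """Ventilation rate from the CO2 balance: Q = P_CO2 / (C_in - C_out)."""
    return co2_production_m3_h / ((co2_in_ppm - co2_out_ppm) * 1e-6)

def gas_emission_g_h(ventilation_m3_h, gas_in_mg_m3, gas_out_mg_m3):
    """Emission rate carried out of the building by the exchanged air."""
    return ventilation_m3_h * (gas_in_mg_m3 - gas_out_mg_m3) / 1000.0

# Illustrative figures for a 60-cow barn (not measurements from the thesis).
n_cows = 60
q = ventilation_rate_m3_h(co2_production_m3_h=0.30 * n_cows,  # ~0.30 m3 CO2/h per cow (assumed)
                          co2_in_ppm=950, co2_out_ppm=400)
nh3_g_h = gas_emission_g_h(q, gas_in_mg_m3=2.1, gas_out_mg_m3=0.1)
print(f"ventilation ~{q:.0f} m3/h, NH3 ~{nh3_g_h * 24 / n_cows:.0f} g/day per cow")
```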
Abstract:
Future time perspective - the way individuals perceive their remaining time in life - importantly influences socio-emotional goals and motivational outcomes. Recently, researchers have called for studies that investigate relationships between personality and future time perspective. Using a cross-lagged panel design, this study investigated the effects of chronic regulatory focus dimensions (promotion and prevention orientation) on future time perspective dimensions (focus on opportunities and focus on limitations). Survey data were collected twice, separated by a 3-month time lag, from 85 participants. Results of structural equation modeling showed that promotion orientation had a positive lagged effect on focus on opportunities, and prevention orientation had a positive lagged effect on focus on limitations.
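A cross-lagged panel analysis of this kind can be approximated with two lagged regressions, each predicting a time-2 outcome from its own time-1 level plus the time-1 predictors. The study itself used structural equation modeling; the sketch below, with simulated data and hypothetical variable names, only illustrates the cross-lagged logic.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated two-wave data; variable names are hypothetical, not the study's measures.
rng = np.random.default_rng(1)
n = 85
promotion_t1 = rng.normal(size=n)
prevention_t1 = rng.normal(size=n)
opportunities_t1 = 0.4 * promotion_t1 + rng.normal(size=n)
limitations_t1 = 0.4 * prevention_t1 + rng.normal(size=n)
opportunities_t2 = 0.5 * opportunities_t1 + 0.3 * promotion_t1 + rng.normal(size=n)
limitations_t2 = 0.5 * limitations_t1 + 0.3 * prevention_t1 + rng.normal(size=n)
df = pd.DataFrame({
    "promotion_t1": promotion_t1, "prevention_t1": prevention_t1,
    "opportunities_t1": opportunities_t1, "limitations_t1": limitations_t1,
    "opportunities_t2": opportunities_t2, "limitations_t2": limitations_t2,
})

# Lagged effects of regulatory focus on later future time perspective,
# controlling for the earlier level of each outcome (autoregressive path).
m_opp = smf.ols("opportunities_t2 ~ opportunities_t1 + promotion_t1 + prevention_t1", df).fit()
m_lim = smf.ols("limitations_t2 ~ limitations_t1 + promotion_t1 + prevention_t1", df).fit()
print(m_opp.params, m_lim.params, sep="\n\n")
```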
Abstract:
Maize is one of the most important crops in the world. The products generated from this crop are largely used in the starch industry, the animal and human nutrition sector, and biomass energy production and refineries. For these reasons, there is much interest in estimating the potential grain yield of maize genotypes in relation to the environment in which they will be grown, as productivity directly affects agribusiness or farm profitability. Questions like these can be investigated with ecophysiological crop models, which can be organized according to different philosophies and structures. The main objective of this work is to conceptualize a stochastic model for predicting maize grain yield and productivity under different conditions of water supply while considering the uncertainties of daily climate data. Therefore, one focus is to explain the model construction in detail, and the other is to present some results in light of the philosophy adopted. A deterministic model was built as the basis for the stochastic model. The former performed well in terms of the shape of the above-ground dry matter curve over time as well as the grain yield under full and moderate water deficit conditions. Through the use of a triangular distribution for the harvest index and a bivariate normal distribution of the averaged daily solar radiation and air temperature, the stochastic model satisfactorily simulated grain productivity: 10,604 kg ha-1 was found to be the most likely grain productivity, very similar to the productivity simulated by the deterministic model and to that observed under real conditions in a field experiment. © 2012 American Society of Agricultural and Biological Engineers.
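The stochastic part of such a model can be illustrated with a small Monte Carlo routine: draw the harvest index from a triangular distribution and the season-averaged radiation and temperature from a bivariate normal, then convert them into a yield estimate. The biomass equation, parameter values and distribution settings below are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(7)
n_sim = 50_000

# Harvest index from a triangular distribution (min, mode, max are assumed values).
harvest_index = rng.triangular(0.35, 0.48, 0.55, n_sim)

# Season-averaged daily solar radiation (MJ m-2 d-1) and air temperature (C)
# drawn jointly from a bivariate normal with an assumed covariance.
mean = [18.0, 22.0]
cov = [[4.0, 1.5],
       [1.5, 2.5]]
radiation, temperature = rng.multivariate_normal(mean, cov, n_sim).T

# Toy biomass term: radiation use efficiency, light interception and a mild
# temperature penalty (none of these equations are from the paper).
rue_g_per_mj = 1.7
intercepted_fraction = 0.55
cycle_days = 120
temp_penalty = np.clip(1.0 - 0.02 * np.abs(temperature - 22.0), 0.0, 1.0)
biomass_g_m2 = rue_g_per_mj * radiation * intercepted_fraction * cycle_days * temp_penalty

yield_kg_ha = biomass_g_m2 * 10.0 * harvest_index   # 1 g/m2 = 10 kg/ha
print(f"most likely grain yield ~{np.median(yield_kg_ha):.0f} kg/ha")
```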
Abstract:
The rule of law is understood to be a core aspect in achieving a stable economy and an ordered society. Without the elements that are inherent in this principle the possibilities of anarchy, unfairness and uncertainty are amplified, which in turn can result in an economy with dramatic fluctuations. In this regard, commentators do not always agree that the rule of law is strictly adhered to in the international legal context. Therefore, this paper will explore one aspect of international regulation and consider whether the UNCITRAL Model Law on Cross-border Insolvency (1997) (‘Model Law’) and its associated Guide to Enactment and Interpretation (2013) contribute to the promotion of the key elements of the rule of law.
Abstract:
In this thesis the use of the Bayesian approach to statistical inference in fisheries stock assessment is studied. The work was conducted in collaboration with the Finnish Game and Fisheries Research Institute, using the problem of monitoring and prediction of the juvenile salmon population in the River Tornionjoki as an example application. The River Tornionjoki is the largest salmon river flowing into the Baltic Sea. This thesis tackles the issues of model formulation and model checking as well as computational problems related to Bayesian modelling in the context of fisheries stock assessment. Each article of the thesis provides a novel method either for extracting information from data obtained via a particular type of sampling system or for integrating the information about the fish stock from multiple sources in terms of a population dynamics model. Mark-recapture and removal sampling schemes and a random catch sampling method are covered for the estimation of population size. In addition, a method for estimating the stock composition of a salmon catch based on DNA samples is also presented. For most of the articles, Markov chain Monte Carlo (MCMC) simulation has been used as a tool to approximate the posterior distribution. Problems arising from the sampling method are also briefly discussed and potential solutions for these problems are proposed. Special emphasis in the discussion is given to the philosophical foundation of the Bayesian approach in the context of fisheries stock assessment. It is argued that the role of the subjective prior knowledge needed in practically all parts of a Bayesian model should be recognized and consequently fully utilised in the process of model formulation.
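As a toy example of Bayesian estimation of population size from a mark-recapture scheme, the sketch below fits a binomial recapture likelihood with a vague prior using a simple random-walk Metropolis sampler. The data values, prior and sampler settings are illustrative assumptions, not those of the thesis, which used problem-specific models and MCMC implementations.

```python
import numpy as np
from scipy.stats import binom

# Mark-recapture data (illustrative figures, not River Tornionjoki estimates).
n_marked = 400       # juveniles marked and released
n_caught = 350       # size of the recapture sample
n_recaptured = 45    # marked fish found in the recapture sample

def log_posterior(pop_size):
    # The population must at least contain all fish known to exist.
    if pop_size < n_marked + (n_caught - n_recaptured):
        return -np.inf
    # Binomial recapture likelihood with a weakly informative 1/N prior.
    return binom.logpmf(n_recaptured, n_caught, n_marked / pop_size) - np.log(pop_size)

# Random-walk Metropolis sampler over the population size N.
rng = np.random.default_rng(3)
current = 2000.0
trace = []
for _ in range(20_000):
    proposal = current + rng.normal(0.0, 150.0)
    if np.log(rng.random()) < log_posterior(proposal) - log_posterior(current):
        current = proposal
    trace.append(current)

posterior = np.array(trace[5_000:])   # discard burn-in
print(f"posterior median N ~ {np.median(posterior):.0f}, 95% interval "
      f"[{np.percentile(posterior, 2.5):.0f}, {np.percentile(posterior, 97.5):.0f}]")
```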
Abstract:
The study of organisational culture in the construction industry is still at the stage of debate (Oney-Yazıcı et al., 2007). Despite the complexities involved in measuring the culture of the construction industry (Tijhuis and Fellows, 2012), this culture is regarded as being worthy of research, especially in relation to the organisational culture needed to support quality management systems (Koh and Low, 2008; Watson and Howarth, 2011) and to improve organisational effectiveness and, therefore, organisational performance (Coffey, 2010; Cheung et al., 2011). A number of recent studies have examined construction companies' organisational culture using Cameron and Quinn's Competing Values Framework (CVF) and their Organizational Culture Assessment Instrument (OCAI) as the conceptual paradigm for the analyses (Thomas et al., 2002; Nummelin, 2006; Oney-Yazıcı et al., 2007; Koh and Low, 2008). However, there has been little research based on the use of Cameron and Quinn's CVF-OCAI tool for identifying types of construction companies' organisational culture and their influences on the implementation of QMS-ISO 9001. Research output and information are also very limited on the strength of companies' organisational culture in driving an effective QMS-ISO 9001 implementation and thereby affecting company effectiveness. To rectify these research gaps, this research aimed to study organisational culture types (based on the CVF) and their influences on the implementation of QMS-ISO 9001:2008 principles and elements, which eventually leads to improved company quality performance. In order to fully examine the status of the QMS being implemented, the research studied the relationships of the barriers to QMS implementation with the implementation of QMS-ISO 9001:2008 principles and elements and with the business performance of the companies, as well as the relationships of the implementation of QMS-ISO 9001:2008 principles and elements with the companies' business performance. The research output has been the development of fundamental and original studies on these topics, providing knowledge for improvements in Indonesian construction companies' quality performance and quality outcomes.
Abstract:
This thesis studies the human gene expression space using high-throughput gene expression data from DNA microarrays. In molecular biology, high-throughput techniques allow numerical measurements of the expression of tens of thousands of genes simultaneously. In a single study, this data is traditionally obtained from a limited number of sample types with a small number of replicates. For organism-wide analysis, such data has been largely unavailable and the global structure of the human transcriptome has remained unknown. This thesis introduces a human transcriptome map of different biological entities and an analysis of its general structure. The map is constructed from gene expression data from the two largest public microarray data repositories, GEO and ArrayExpress. The creation of this map contributed to the development of ArrayExpress by identifying and retrofitting previously unusable and missing data and by improving access to its data. It also contributed to the creation of several new tools for microarray data manipulation and to the establishment of data exchange between GEO and ArrayExpress. The data integration for the global map required the creation of a new large ontology of human cell types, disease states, organism parts and cell lines. The ontology was used in a new text mining and decision tree based method for the automatic conversion of human-readable free-text microarray data annotations into a categorised format. Data comparability, and the minimisation of the systematic measurement errors that are characteristic of each laboratory in this large cross-laboratory integrated dataset, were ensured by computing a range of microarray data quality metrics and excluding incomparable data. The structure of the global map of human gene expression was then explored by principal component analysis and hierarchical clustering, using heuristics and help from another purpose-built sample ontology. A preface and motivation to the construction and analysis of a global map of human gene expression are given by the analysis of two microarray datasets of human malignant melanoma. The analysis of these sets incorporates an indirect comparison of statistical methods for finding differentially expressed genes and points to the need to study gene expression at a global level.
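The exploratory steps mentioned above, principal component analysis and hierarchical clustering of samples, can be sketched on a toy expression matrix as below; the data, distance metric and linkage choices are illustrative assumptions rather than the procedures used for the actual transcriptome map.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Toy expression matrix: rows are samples, columns are genes (values simulated).
rng = np.random.default_rng(0)
n_samples, n_genes = 60, 2000
expression = rng.normal(size=(n_samples, n_genes))
expression[:20, :50] += 2.0       # a group of samples sharing one signature
expression[20:40, 50:100] -= 2.0  # a second group with another signature

# Principal component analysis to view the global structure of the samples.
pcs = PCA(n_components=2).fit_transform(expression)

# Agglomerative hierarchical clustering of samples on correlation distance.
distances = pdist(expression, metric="correlation")
tree = linkage(distances, method="average")
clusters = fcluster(tree, t=3, criterion="maxclust")

print("first samples, PC1/PC2:", np.round(pcs[:3], 2))
print("cluster labels:", clusters[:10])
```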
Abstract:
Objectives: Preschool-aged children spend substantial amounts of time engaged in screen-based activities. As parents have considerable control over their child's health behaviours during the younger years, it is important to understand those influences that guide parents' decisions about their child's screen time behaviours.
Design: A prospective design with two waves of data collection, 1 week apart, was adopted.
Methods: Parents (n = 207) completed a Theory of Planned Behaviour (TPB)-based questionnaire, with the addition of parental role construction (i.e., parents' expectations and beliefs of responsibility for their child's behaviour) and past behaviour. A number of underlying beliefs identified in a prior pilot study were also assessed.
Results: The model explained 77% (with past behaviour accounting for 5%) of the variance in intention and 50% (with past behaviour accounting for 3%) of the variance in parental decisions to limit child screen time. Attitude, subjective norms, perceived behavioural control, parental role construction, and past behaviour predicted intentions, and intentions and past behaviour predicted follow-up behaviour. Underlying screen time beliefs (e.g., increased parental distress, pressure from friends, inconvenience) were also identified as guiding parents' decisions.
Conclusion: Results support the TPB and highlight the importance of beliefs for understanding parental decisions for children's screen time behaviours, as well as the addition of parental role construction. This formative research provides necessary depth of understanding of sedentary lifestyle behaviours in young children which can be adopted in future interventions to test the efficacy of the TPB mechanisms in changing parental behaviour for their child's health.
Abstract:
The delivery of products and services for construction-based businesses is increasingly becoming knowledge-driven and information-intensive. The proliferation of building information modelling (BIM) has increased business opportunities as well as introduced new challenges for the architectural, engineering and construction and facilities management (AEC/FM) industry. As such, the effective use, sharing and exchange of building life cycle information and knowledge management in building design, construction, maintenance and operation assumes a position of paramount importance. This paper identifies a subset of construction management (CM) relevant knowledge for different design conditions of building components through a critical, comprehensive review of synthesized literature and other information gathering and knowledge acquisition techniques. It then explores how such domain knowledge can be formalized as ontologies and, subsequently, a query vocabulary in order to equip BIM users with the capacity to query digital models of a building for the retrieval of useful and relevant domain-specific information. The formalized construction knowledge is validated through interviews with domain experts in relation to four case study projects. Additionally, retrospective analyses of several design conditions are used to demonstrate the soundness (realism), completeness, and appeal of the knowledge base and query-based reasoning approach in relation to the state-of-the-art tools, Solibri Model Checker and Navisworks. The knowledge engineering process and the methods applied in this research for information representation and retrieval could provide useful mechanisms to leverage BIM in support of a number of knowledge intensive CM/FM tasks and functions.
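One way to picture the formalised knowledge and query vocabulary described above is a small ontology of components, design conditions and required activities that can be queried with SPARQL. The class and property names, namespace and query below are hypothetical illustrations (using rdflib), not the ontology or tools developed in the paper.

```python
from rdflib import Graph, Literal, Namespace, RDF

# Hypothetical construction-management mini-ontology; names are illustrative only.
CM = Namespace("http://example.org/cm#")
g = Graph()
g.bind("cm", CM)

wall = CM.Wall_01
g.add((wall, RDF.type, CM.ExternalWall))
g.add((wall, CM.hasDesignCondition, CM.BelowGrade))
g.add((wall, CM.requiresActivity, CM.Waterproofing))
g.add((CM.Waterproofing, CM.hasConstraint, Literal("apply before backfilling")))

# Query vocabulary example: activities (and their constraints) required for
# components in a given design condition.
query = """
PREFIX cm: <http://example.org/cm#>
SELECT ?component ?activity ?constraint WHERE {
    ?component cm:hasDesignCondition cm:BelowGrade ;
               cm:requiresActivity ?activity .
    OPTIONAL { ?activity cm:hasConstraint ?constraint . }
}
"""
for row in g.query(query):
    print(row.component, row.activity, row.constraint)
```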