956 results for Inter Session Variability Modelling
Abstract:
The prediction of tillering is poor or absent in existing sorghum crop models even though fertile tillers contribute significantly to grain yield. The objective of this study was to identify general quantitative relationships underpinning tiller dynamics of sorghum for a broad range of assimilate availabilities. Emergence, phenology, leaf area development and fertility of individual main culms and tillers were quantified weekly in plants grown at one of four plant densities ranging from two to 16 plants m^-2. On any given day, a tiller was considered potentially fertile (a posteriori) if its number of leaves continued to increase thereafter. The dynamics of potentially fertile tiller number per plant varied greatly with plant density, but could generally be described by three determinants, stable across plant densities: tiller emergence rate aligned with leaf ligule appearance rate; cessation of tiller emergence occurred at a stable leaf area index; and the rate of decrease in potentially fertile tillers was linearly related to the ratio of realized to potential leaf area growth. Realized leaf area growth is the measured increase in leaf area, whereas potential leaf area growth is the estimated increase in leaf area if all potentially fertile tillers were to continue to develop. Procedures to predict this ratio, by estimating realized leaf area per plant from intercepted radiation and potential leaf area per plant from the number and type of developing axes, are presented. While it is suitable for modelling tiller dynamics in grain sorghum, this general framework needs to be validated by testing it in different environments and for other cultivars. (C) 2002 Annals of Botany Company.
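The third determinant above lends itself to a small numerical sketch. The snippet below is illustrative only: the linear coefficients and the growth values are invented, since the abstract reports only that the relationship is linear.

```python
# Illustrative sketch of the third determinant: tiller loss rate as a
# linear function of the realized:potential leaf-area-growth ratio.
# The coefficients b0, b1 and the growth values are invented.

def tiller_loss_rate(realized_growth, potential_growth, b0=1.0, b1=-1.0):
    """Potentially fertile tillers lost per plant per day (illustrative).

    When realized growth equals potential growth the ratio is 1 and the
    loss rate is zero; as assimilate supply limits realized growth the
    ratio falls and tillers are shed faster."""
    ratio = realized_growth / potential_growth
    return max(0.0, b0 + b1 * ratio)

no_shortage = tiller_loss_rate(100.0, 100.0)  # ratio = 1.0, no loss
shortage = tiller_loss_rate(40.0, 100.0)      # ratio = 0.4, tillers shed
```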
Abstract:
Crop modelling has evolved over the last 30 or so years in concert with advances in crop physiology, crop ecology and computing technology. Having reached a respectable degree of acceptance, it is appropriate to review briefly the course of developments in crop modelling and to project what might be major contributions of crop modelling in the future. Two major opportunities are envisioned for increased modelling activity in the future. One opportunity is in a continuing central, heuristic role to support scientific investigation, to facilitate decision making by crop managers, and to aid in education. Heuristic activities will also extend to the broader system-level issues of environmental and ecological aspects of crop production. The second opportunity is projected as a prime contributor in understanding and advancing the genetic regulation of plant performance and plant improvement. Physiological dissection and modelling of traits provides an avenue by which crop modelling could contribute to enhancing integration of molecular genetic technologies in crop improvement. Crown Copyright (C) 2002 Published by Elsevier Science B.V. All rights reserved.
Abstract:
This study examines batch-to-batch variability in the production of dietary fluids and videofluoroscopy fluids of a single hospital. The material properties, such as viscosity, yield stress, and density, show significant variations between batches. Also, water-based products (e.g., cordial) provide (a) the most stability from week to week for both dietary and videofluoroscopy fluids and (b) the best dietary and videofluoroscopy fluid matches. The study also highlights the need for further research into how base substances, such as water, juice, and dairy products, react with different thickeners and with barium.
Abstract:
The aim of this study was to assess the variation between neuropathologists in the diagnosis of common dementia syndromes when multiple published protocols are applied. Fourteen out of 18 Australian neuropathologists participated in diagnosing 20 cases (16 cases of dementia, 4 age-matched controls) using consensus diagnostic methods. Diagnostic criteria, clinical synopses and slides from multiple brain regions were sent to participants who were asked for case diagnoses. Diagnostic sensitivity, specificity, predictive value, accuracy and variability were determined using percentage agreement and kappa statistics. Using CERAD criteria, there was a high inter-rater agreement for cases with probable and definite Alzheimer's disease but low agreement for cases with possible Alzheimer's disease. Braak staging and the application of criteria for dementia with Lewy bodies also resulted in high inter-rater agreement. There was poor agreement for the diagnosis of frontotemporal dementia and for identifying small vessel disease. Participants rarely diagnosed more than one disease in any case. To improve efficiency when applying multiple diagnostic criteria, several simplifications were proposed and tested on 5 of the original 20 cases. Inter-rater reliability for the diagnosis of Alzheimer's disease and dementia with Lewy bodies significantly improved. Further development of simple and accurate methods to identify small vessel lesions and diagnose frontotemporal dementia is warranted.
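The abstract reports agreement via percentage agreement and kappa statistics. As a minimal illustration of the latter, the sketch below computes Cohen's kappa for two raters; the diagnoses are invented example data, not from the study.

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters over the same cases:
    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement
    and p_e the agreement expected by chance from each rater's marginal
    frequencies."""
    n = len(ratings_a)
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n**2
    return (p_o - p_e) / (1 - p_e)

# Two raters diagnosing 8 invented cases as Alzheimer's disease (AD)
# or dementia with Lewy bodies (DLB):
a = ["AD", "AD", "AD", "DLB", "DLB", "AD", "DLB", "AD"]
b = ["AD", "AD", "AD", "DLB", "AD",  "AD", "DLB", "AD"]
kappa = cohens_kappa(a, b)
```

Kappa discounts the agreement two raters would reach by guessing from their own diagnosis frequencies, which is why it is preferred over raw percentage agreement in inter-rater studies like this one.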
Abstract:
The purpose of this study was threefold: first, the study was designed to illustrate the use of data and information collected in food safety surveys in a quantitative risk assessment. In this case, the focus was on the food service industry; however, similar data from other parts of the food chain could be similarly incorporated. The second objective was to quantitatively describe and better understand the role that the food service industry plays in the safety of food. The third objective was to illustrate the additional decision-making information that is available when uncertainty and variability are incorporated into the modelling of systems. (C) 2002 Elsevier Science B.V. All rights reserved.
Abstract:
We present the first mathematical model on the transmission dynamics of Schistosoma japonicum. The work extends Barbour's classic model of schistosome transmission. It allows for the mammalian host heterogeneity characteristic of the S. japonicum life cycle, and solves the problem of under-specification of Barbour's model by the use of Chinese data we are collecting on human-bovine transmission in the Poyang Lake area of Jiangxi Province in China. The model predicts that in the lake/marshland areas of the Yangtze River basin: (1) twice-yearly mass chemotherapy of humans is little better than once-yearly mass chemotherapy in reducing human prevalence. Depending on the heterogeneity of prevalence within the population, targeted treatment of high prevalence groups, with lower overall coverage, can be more effective than mass treatment with higher overall coverage. Treatment confers a short term benefit only, with prevalence rising to endemic levels once chemotherapy programs are stopped; (2) depending on the relative contributions of bovines and humans, bovine treatment can benefit humans almost as much as human treatment. Like human treatment, bovine treatment confers a short-term benefit. A combination of human and bovine treatment will dramatically reduce human prevalence and maintain the reduction for a longer period of time than treatment of a single host, although human prevalence rises once treatment ceases; (3) assuming 75% coverage of bovines, a bovine vaccine which acts on worm fecundity must have about 75% efficacy to reduce the reproduction rate below one and ensure mid-term reduction and long-term elimination of the parasite.
Such a vaccination program should be accompanied by an initial period of human treatment to instigate a short-term reduction in prevalence, following which the reduction is enhanced by vaccine effects; (4) if the bovine vaccine is only 45% efficacious (the level of current prototype vaccines) it will lower the endemic prevalence, but will not result in elimination. If it is accompanied by an initial period of human treatment and by a 45% improvement in human sanitation or a 30% reduction in contaminated water contact by humans, elimination is then possible. (C) 2002 Elsevier Science B.V. All rights reserved.
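The model itself is not specified in the abstract, but a Barbour-type prevalence model with two mammalian hosts can be sketched as coupled differential equations. Everything below is an assumption for illustration: the functional form, the rate constants and the Euler integration are invented, not the authors' calibrated model.

```python
def simulate(days=2000, dt=0.1,
             a1=0.002, a2=0.004,    # snail-to-human / snail-to-bovine rates
             g1=0.0005, g2=0.001,   # human / bovine worm loss rates
             b1=0.0002, b2=0.0008,  # human / bovine contamination rates
             m=0.01):               # snail infection loss rate
    """Euler integration of an invented Barbour-type two-mammalian-host
    model: y1, y2 are human and bovine prevalences, z is snail
    prevalence.  Each host is infected from snails and contaminates
    snails in turn, so treating bovines (raising g2) also lowers y1."""
    y1 = y2 = z = 0.01  # start near the disease-free state
    for _ in range(int(days / dt)):
        dy1 = a1 * z * (1 - y1) - g1 * y1
        dy2 = a2 * z * (1 - y2) - g2 * y2
        dz = (b1 * y1 + b2 * y2) * (1 - z) - m * z
        y1 += dt * dy1
        y2 += dt * dy2
        z += dt * dz
    return y1, y2, z

y1, y2, z = simulate()
```

The coupling through z is what produces prediction (2) above: bovine treatment reduces snail infection, which in turn reduces human infection.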
Abstract:
Activated sludge flocculation was modelled using population balances. The model followed the dynamics of activated sludge flocculation providing a good approximation of the change in mean floc size with time. Increasing the average velocity gradient decreased the final floc size. The breakage rate coefficient and collision efficiency also varied with the average velocity gradient. A power law relationship was found for the increase in breakage rate coefficient with increasing average velocity gradient. Further investigation will be conducted to determine the relationship between the collision efficiency and particle size to provide a better approximation of dynamic changes in the floc size distribution during flocculation. (C) 2002 Published by Elsevier Science B.V.
Abstract:
A technique based on laser light diffraction is shown to be successful in collecting on-line experimental data. Time series of floc size distributions (FSD) under different shear rates (G) and calcium additions were collected. The steady state mass mean diameter decreased with increasing shear rate G and increased when calcium additions exceeded 8 mg/l. A so-called population balance model (PBM) was used to describe the experimental data. This kind of model describes both aggregation and breakage through birth and death terms. A discretised PBM was used since analytical solutions of the integro-partial differential equations do not exist. Despite the complexity of the model, only two parameters need to be estimated: the aggregation rate and the breakage rate. The model seems, however, to lack flexibility. Also, the description of the floc size distribution (FSD) in time is not accurate.
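As a toy version of the discretised PBM idea shared by the two abstracts above, the sketch below tracks number concentrations in geometric size classes with only two fitted constants, an aggregation rate and a breakage rate, mirroring the two-parameter structure the abstract describes. The kernels are drastically simplified assumptions (equal-size aggregation, binary breakage), not the model used in either paper.

```python
def step(n, ka, kb, dt):
    """One Euler step of a toy discretised population balance.
    n[i] is the number concentration of flocs containing 2**i primary
    particles.  Only equal-size aggregation (two class-i flocs -> one
    class-(i+1) floc) and binary breakage (one class-i floc -> two
    class-(i-1) flocs) are modelled, so the full aggregation and
    breakage kernels collapse to the two rate constants ka and kb."""
    m = len(n)
    dn = [0.0] * m
    for i in range(m):
        if i + 1 < m:                  # aggregation out of class i
            rate = ka * n[i] * n[i]
            dn[i] -= 2 * rate          # two flocs consumed...
            dn[i + 1] += rate          # ...one larger floc formed
        if i > 0:                      # breakage out of class i
            rate = kb * n[i]
            dn[i] -= rate              # one floc consumed...
            dn[i - 1] += 2 * rate      # ...two halves formed
    return [x + dt * d for x, d in zip(n, dn)]

# Start with primary particles only; iterate toward a quasi-steady state.
n = [1.0, 0.0, 0.0, 0.0, 0.0]
for _ in range(5000):
    n = step(n, ka=0.5, kb=0.05, dt=0.01)

# Both mechanisms conserve primary-particle mass by construction:
mass = sum(ni * 2**i for i, ni in enumerate(n))
```

Raising kb relative to ka (the effect the abstracts attribute to a higher velocity gradient G) shifts the steady-state distribution toward smaller classes.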
Abstract:
We focus on mixtures of factor analyzers from the perspective of a method for model-based density estimation from high-dimensional data, and hence for the clustering of such data. This approach enables a normal mixture model to be fitted to a sample of n data points of dimension p, where p is large relative to n. The number of free parameters is controlled through the dimension of the latent factor space. By working in this reduced space, the approach allows each component-covariance matrix to be modelled with complexity lying between that of the isotropic and full covariance structure models. We shall illustrate the use of mixtures of factor analyzers in a practical example that considers the clustering of cell lines on the basis of gene expressions from microarray experiments. (C) 2002 Elsevier Science B.V. All rights reserved.
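The parameter-count argument can be made concrete. The sketch below counts free covariance parameters per mixture component under the three structures mentioned (isotropic, factor-analyzer, full); the gene-expression dimensions are illustrative, not from the paper.

```python
def cov_params_full(p):
    """Free parameters in an unconstrained p x p covariance matrix."""
    return p * (p + 1) // 2

def cov_params_factor(p, q):
    """Free parameters in a factor-analyzer covariance
    Sigma = Lambda Lambda' + Psi, with a p x q loading matrix Lambda
    and diagonal Psi, after removing the q(q-1)/2 rotational
    redundancies in Lambda."""
    return p * q + p - q * (q - 1) // 2

def cov_params_isotropic(p):
    """Free parameters in an isotropic covariance sigma^2 * I."""
    return 1

# Illustrative microarray-like scale: p = 1000 genes, q = 4 factors.
p, q = 1000, 4
iso = cov_params_isotropic(p)
fa = cov_params_factor(p, q)
full = cov_params_full(p)
```

With n small relative to p, the full model's half-million parameters per component are unestimable, while the factor structure stays in the thousands, which is the point of working in the latent space.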
Abstract:
We compare Bayesian methodology utilizing the freeware BUGS (Bayesian Inference Using Gibbs Sampling) with the traditional structural equation modelling approach based on another freeware package, Mx. Dichotomous and ordinal (three category) twin data were simulated according to different additive genetic and common environment models for phenotypic variation. Practical issues are discussed in using Gibbs sampling as implemented by BUGS to fit subject-specific Bayesian generalized linear models, where the components of variation may be estimated directly. The simulation study (based on 2000 twin pairs) indicated that there is a consistent advantage in using the Bayesian method to detect a correct model under certain specifications of additive genetic and common environmental effects. For binary data, both methods had difficulty in detecting the correct model when the additive genetic effect was low (between 10 and 20%) or of moderate range (between 20 and 40%). Furthermore, neither method could adequately detect a correct model that included a modest common environmental effect (20%) even when the additive genetic effect was large (50%). Power was significantly improved with ordinal data for most scenarios, except for the case of low heritability under a true ACE model. We illustrate and compare both methods using data from 1239 twin pairs over the age of 50 years, who were registered with the Australian National Health and Medical Research Council Twin Registry (ATR) and presented symptoms associated with osteoarthritis occurring in joints of the hand.
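Neither the BUGS nor the Mx model fits can be reproduced from the abstract, but the ACE decomposition they estimate can be illustrated with the classical Falconer shortcut based on monozygotic and dizygotic twin correlations. The correlations below are invented to match the abstract's scenario of a 50% additive genetic and 20% common environmental effect.

```python
def falconer_ace(r_mz, r_dz):
    """Classical Falconer point estimates of the ACE variance
    components from MZ and DZ twin correlations:
        a2 = 2 * (r_mz - r_dz)   additive genetic
        c2 = 2 * r_dz - r_mz     common environment
        e2 = 1 - r_mz            unique environment
    This is the textbook shortcut, not the model fitting of the paper
    (BUGS Gibbs sampling or Mx structural equations)."""
    a2 = 2 * (r_mz - r_dz)
    c2 = 2 * r_dz - r_mz
    e2 = 1 - r_mz
    return a2, c2, e2

# Invented correlations consistent with a2 = 0.5, c2 = 0.2:
# r_mz = a2 + c2 = 0.70 and r_dz = a2/2 + c2 = 0.45.
a2, c2, e2 = falconer_ace(r_mz=0.70, r_dz=0.45)
```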
Abstract:
Conceptual modelling is an activity undertaken during information systems development work to build a representation of selected semantics about some real-world domain. Ontological theories have been developed to account for the structure and behavior of the real world in general. In this paper, I discuss why ontological theories can be used to inform conceptual modelling research, practice, and pedagogy. I provide examples from my research to illustrate how a particular ontological theory has enabled me to improve my understanding of certain conceptual modelling practices and grammars. I describe, also, how some colleagues and I have used this theory to generate several counter-intuitive, sometimes surprising predictions about widely advocated conceptual modelling practices - predictions that subsequently were supported in empirical research we undertook. Finally, I discuss several possibilities and pitfalls I perceived to be associated with our using ontological theories to underpin research on conceptual modelling.
Abstract:
This paper proposes a template for modelling complex datasets that integrates traditional statistical modelling approaches with more recent advances in statistics and modelling through an exploratory framework. Our approach builds on the well-known and long-standing traditional idea of 'good practice in statistics' by establishing a comprehensive framework for modelling that focuses on exploration, prediction, interpretation and reliability assessment, a relatively new idea that allows individual assessment of predictions. The integrated framework we present comprises two stages. The first involves the use of exploratory methods to help visually understand the data and identify a parsimonious set of explanatory variables. The second encompasses a two step modelling process, where the use of non-parametric methods such as decision trees and generalized additive models is promoted to identify important variables and their modelling relationship with the response before a final predictive model is considered. We focus on fitting the predictive model using parametric, non-parametric and Bayesian approaches. This paper is motivated by a medical problem where interest focuses on developing a risk stratification system for morbidity of 1,710 cardiac patients given a suite of demographic, clinical and preoperative variables. Although the methods we use are applied specifically to this case study, these methods can be applied across any field, irrespective of the type of response.
Abstract:
Colour pattern variation is a striking and widespread phenomenon. Differential predation risk between individuals is often invoked to explain colour variation, but empirical support for this hypothesis is equivocal. We investigated differential conspicuousness and predation risk in two species of Australian rock dragons, Ctenophorus decresii and C. vadnappa. To humans, the coloration of males of these species varies between 'bright' and 'dull'. Visual modelling based on objective colour measurements and the spectral sensitivities of avian visual pigments showed that dragon colour variants are differentially conspicuous to the visual system of avian predators when viewed against the natural background. We conducted field experiments to test for differential predation risk, using plaster models of 'bright' and 'dull' males. 'Bright' models were attacked significantly more often than 'dull' models, suggesting that differential conspicuousness translates to differential predation risk in the wild. We also examined the influence of natural geographical range on predation risk. Results from 22 localities suggest that predation rates vary according to whether predators are familiar with the prey species. This study is among the first to demonstrate both differential conspicuousness and differential predation risk in the wild using an experimental protocol. (C) 2003 Published by Elsevier Ltd on behalf of The Association for the Study of Animal Behaviour.
Abstract:
Coral bleaching events have become more frequent and widespread, largely due to elevated sea surface temperatures. Global climate change could lead to increased variability of sea surface temperatures, through influences on climate systems, e.g. El Niño Southern Oscillation (ENSO). Field observations in 1999, following a strong ENSO, revealed that corals bleached in winter after unusually cold weather. To explore the basis for these observations, the photosynthetic responses of the coral species Montipora digitata Studer were investigated in a series of temperature and light experiments. Small replicate coral colonies were exposed to ecologically relevant lower temperatures for varying durations and under light regimes that ranged from darkness to full sunlight. Photosynthetic efficiency was analyzed using a pulse amplitude modulated (PAM) fluorometer (F0, Fm, Fv/Fm), and chlorophyll a (chl a) content and symbiotic dinoflagellate density were analyzed with spectrophotometry and microscopy, respectively. Cold temperature stress had a negative impact on M. digitata colonies, indicated by decreased photosynthetic efficiency (Fv/Fm), loss of symbiotic dinoflagellates and changes in photosynthetic pigment concentrations. Corals in higher light regimes were more susceptible to cold temperature stress. Moderate cold stress resulted in photoacclimatory responses, but severe cold stress resulted in photodamage, bleaching and increased mortality. Responses of M. digitata to cold temperature stress appeared similar to those observed in corals exposed to warmer than normal temperatures, suggesting a common mechanism. The results of this study suggest that corals and coral reefs may be impacted by exposure to cold as well as warm temperature extremes as climate change occurs.
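The fluorometric quantities named above (F0, Fm, Fv/Fm) are related by a standard formula for the maximum quantum yield of photosystem II. A minimal sketch, with illustrative fluorescence values rather than the study's measurements:

```python
def max_quantum_yield(f0, fm):
    """Maximum quantum yield of photosystem II from PAM fluorometry:
    Fv/Fm = (Fm - F0) / Fm, where F0 is the minimal and Fm the maximal
    fluorescence of a dark-adapted sample."""
    if fm <= 0 or fm < f0:
        raise ValueError("require Fm > 0 and Fm >= F0")
    return (fm - f0) / fm

# Invented readings: a stressed colony shows a raised F0 and lowered Fm,
# and hence a depressed Fv/Fm, the signature reported in the abstract.
unstressed = max_quantum_yield(f0=300.0, fm=1000.0)
cold_stressed = max_quantum_yield(f0=450.0, fm=900.0)
```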
Abstract:
Increasingly, electropalatography (EPG) is being used in speech pathology research to identify and describe speech disorders of neurological origin. However, limited data currently exist describing normal articulatory segment timing and the degree of variability exhibited by normal speakers when assessed with EPG. Therefore, the purpose of the current investigation was to use the Reading EPG3 system to quantify segmental timing values and examine articulatory timing variability for three English consonants. Ten normal subjects produced ten repetitions of CV words containing the target consonants /t/, /l/, and /s/ while wearing an artificial palate. The target consonants were followed by the /i/ vowel and were contained in the carrier phrase 'I saw a __'. Mean duration of the approach, closure/constriction, and release phases of consonant articulation were calculated. In addition, inter-subject articulatory timing variability was investigated using descriptive graphs and intra-subject articulatory timing variability was investigated using a coefficient of variation. Results revealed the existence of inter-subject variability for mean segment timing values. This could be attributed to individual differences in the suprasegmental features of speech and individual differences in oral cavity size and structure. No significant differences were found in the degree of intra-subject variability between the three sounds for these same phases of articulation. However, when this data set was collapsed, results revealed that the closure/constriction phase of consonant articulation exhibited significantly less intra-subject variability than both the approach and release phases. The stabilization of the tongue against the fixed structure of the hard palate during the closure phase of articulation may have reduced the levels of intra-subject variability.
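The intra-subject measure used above, the coefficient of variation, is simple to compute. A sketch with invented duration data (the per-phase values are not given in the abstract):

```python
from statistics import mean, stdev

def coefficient_of_variation(durations_ms):
    """Coefficient of variation (CV = sd / mean) of one subject's
    segment durations.  Being unitless, it lets phases with different
    mean durations be compared on a single variability scale."""
    return stdev(durations_ms) / mean(durations_ms)

# Invented durations (ms) for one subject, mimicking the abstract's
# finding that the closure/constriction phase is the most stable:
closure = [62, 60, 65, 61, 63, 64, 60, 62, 61, 63]
release = [45, 58, 40, 62, 50, 55, 38, 60, 47, 52]

cv_closure = coefficient_of_variation(closure)
cv_release = coefficient_of_variation(release)
```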