17 results for Time-series analysis Mathematical models
in University of Queensland eSpace - Australia
Abstract:
After ingestion of a standardized dose of ethanol, alcohol concentrations were assessed over 3.5 hours from blood (six readings) and breath (10 readings) in a sample of 412 MZ and DZ twins who took part in an Alcohol Challenge Twin Study (ACTS). Nearly all participants were subsequently genotyped on two polymorphic SNPs in the ADH1B and ADH1C loci known to affect in vitro ADH activity. In the DZ pairs, 14 microsatellite markers covering a 20.5 cM region on chromosome 4 that includes the ADH gene family were assessed. Variation in the timed series of autocorrelated blood and breath alcohol readings was studied using a bivariate simplex design. The contribution of a quantitative trait locus (QTL) or QTLs linked to the ADH region was estimated via a mixture of likelihoods weighted by identity-by-descent probabilities. The effects of allelic substitution at the ADH1B and ADH1C loci were estimated in the means part of the model simultaneously with the effects of sex and age. There was a major contribution to variance in alcohol metabolism due to a QTL, which accounted for about 64% of the additive genetic covariation common to both blood and breath alcohol readings at the first time point. No effects of the ADH1B*47His or ADH1C*349Ile alleles on in vivo metabolism were observed, although these have been shown to have major effects in vitro. This implies that there is a major determinant of variation for in vivo alcohol metabolism in the ADH region that is not accounted for by these polymorphisms. Earlier analyses of these data suggested that alcohol metabolism is related to drinking behavior, implying that this QTL may be protective against alcohol dependence.
Abstract:
This study explores whether the introduction of selectively trained radiographers reporting Accident and Emergency (A&E) X-ray examinations of the appendicular skeleton affected the availability of reports for A&E and General Practitioner (GP) examinations at a typical district general hospital. This was achieved by analysing monthly data on A&E and GP examinations for 1993-1997 using structural time-series models. Parameters to capture stochastic seasonal effects and stochastic time trends were included in the models. The main outcome measures were changes in the number, proportion and timeliness of A&E and GP examinations reported. Radiographer reporting of X-ray examinations requested by A&E was associated with a 12% (p = 0.050) increase in the number of A&E examinations reported and a 37% (p
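Structural time-series models of the kind used in this study decompose a series into stochastic level, trend and seasonal components estimated by the Kalman filter. As a minimal sketch of the idea, here is a local-level model only, with fixed (hypothetical) variances rather than maximum-likelihood estimates and no seasonal term:

```python
# Minimal structural time-series sketch: a local-level model filtered by the
# Kalman recursions. Variances are illustrative; a real analysis would
# estimate them by maximum likelihood and add stochastic seasonal components.

def local_level_filter(y, var_level=1.0, var_obs=1.0):
    """Kalman filter for y[t] = mu[t] + eps, mu[t] = mu[t-1] + eta."""
    mu, p = y[0], var_obs          # initialise state at the first observation
    levels = [mu]
    for obs in y[1:]:
        p += var_level             # predict: state variance grows each step
        k = p / (p + var_obs)      # Kalman gain
        mu += k * (obs - mu)       # update the level toward the observation
        p *= (1 - k)               # posterior state variance
        levels.append(mu)
    return levels

# A series with a level shift: the filtered level tracks the change smoothly.
smoothed = local_level_filter([10, 12, 11, 13, 30, 31, 29])
```

The filtered level adapts to the shift over a few observations rather than jumping instantly, which is the behaviour exploited when such models are used to measure changes in reporting workload.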
Abstract:
We demonstrate that the process of generating smooth transitions can be viewed as a natural result of the filtering operations implied in the generation of discrete-time series observations from the sampling of data from an underlying continuous time process that has undergone a process of structural change. In order to focus discussion, we utilize the problem of estimating the location of abrupt shifts in some simple time series models. This approach will permit us to address salient issues relating to distortions induced by the inherent aggregation associated with discrete-time sampling of continuous time processes experiencing structural change. We also address the issue of how time irreversible structures may be generated within the smooth transition processes. (c) 2005 Elsevier Inc. All rights reserved.
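The aggregation effect described above can be seen in a toy example: an underlying continuous-time process with an abrupt level shift, observed as discrete-time averages over each sampling interval, shows a smooth transition whenever the break falls inside an interval. This is an illustrative sketch, not the paper's models:

```python
# Illustrative sketch: averaging a step function over sampling windows turns
# an abrupt continuous-time break into a smooth discrete-time transition.

def sampled_series(break_time, low, high, n_obs, width):
    """Average a step function over consecutive windows of the given width."""
    out = []
    for i in range(n_obs):
        a, b = i * width, (i + 1) * width
        if b <= break_time:
            out.append(low)            # window entirely before the break
        elif a >= break_time:
            out.append(high)           # window entirely after the break
        else:                          # window straddles the break: blend
            frac = (break_time - a) / width
            out.append(frac * low + (1 - frac) * high)
    return out

obs = sampled_series(break_time=2.5, low=0.0, high=1.0, n_obs=5, width=1.0)
# obs → [0.0, 0.0, 0.5, 1.0, 1.0]: the abrupt break appears as a smooth step
```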
Abstract:
In this paper we develop an evolutionary kernel-based time update algorithm to recursively estimate subset discrete lag models (including full-order models) with a forgetting factor and a constant term, using the exact-windowed case. The algorithm applies to causality detection when the true relationship occurs with a continuous or a random delay. We then demonstrate the use of the proposed evolutionary algorithm to study the monthly mutual fund data, which come from the 'CRSP Survivor-bias free US Mutual Fund Database'. The results show that the NAV is an influential player on the international stage of global bond and stock markets.
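The core ingredient of such a time-update algorithm is recursive estimation with a forgetting factor, where a factor lambda < 1 down-weights old observations. As a minimal sketch (a scalar recursive least-squares step on synthetic data, not the paper's kernel-based subset-lag algorithm or the CRSP series):

```python
# Minimal recursive least-squares sketch with a forgetting factor lam:
# old data are geometrically down-weighted, so the estimate can track drift.

def rls_step(theta, p, x, y, lam=0.98):
    """One RLS update for the scalar model y ~ theta * x."""
    k = p * x / (lam + p * x * x)        # gain
    theta = theta + k * (y - theta * x)  # correct estimate by the residual
    p = (p - k * x * p) / lam            # update the (inverse) information
    return theta, p

theta, p = 0.0, 100.0                    # vague prior: large initial p
for x, y in [(1, 2.0), (2, 4.1), (3, 5.9), (4, 8.2)]:
    theta, p = rls_step(theta, p, x, y)
# theta converges toward the true slope of about 2
```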
Abstract:
BACKGROUND: Intervention time series analysis (ITSA) is an important method for analysing the effect of sudden events on time series data. ITSA methods are quasi-experimental in nature and the validity of modelling with these methods depends upon assumptions about the timing of the intervention and the response of the process to it. METHOD: This paper describes how to apply ITSA to analyse the impact of unplanned events on time series when the timing of the event is not accurately known, and so the problems of ITSA methods are magnified by uncertainty in the point of onset of the unplanned intervention. RESULTS: The methods are illustrated using the example of the Australian Heroin Shortage of 2001, which provided an opportunity to study the health and social consequences of an abrupt change in heroin availability in an environment of widespread harm reduction measures. CONCLUSION: Application of these methods enables valuable insights about the consequences of unplanned and poorly identified interventions while minimising the risk of spurious results.
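When the onset of an intervention is uncertain, one simple device is to fit an interrupted model at each candidate onset and keep the onset with the smallest residual sum of squares. The sketch below uses plain level shifts on synthetic data; a full ITSA as described above would model the error process (e.g. with ARIMA terms) as well:

```python
# Toy sketch of onset estimation for an unplanned intervention: grid-search
# candidate break points and pick the one minimising the residual sum of
# squares of a two-level (pre/post) model. Synthetic data, illustrative only.

def best_onset(series, candidates):
    best = None
    for t0 in candidates:
        pre, post = series[:t0], series[t0:]
        m1 = sum(pre) / len(pre)           # pre-intervention level
        m2 = sum(post) / len(post)         # post-intervention level
        sse = (sum((v - m1) ** 2 for v in pre)
               + sum((v - m2) ** 2 for v in post))
        if best is None or sse < best[1]:
            best = (t0, sse)
    return best[0]

# The level drops sharply at index 6 in this synthetic series.
data = [10, 11, 10, 12, 11, 10, 4, 3, 5, 4, 3, 4]
onset = best_onset(data, range(2, 11))
```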
Abstract:
A number of mathematical models have been used to describe percutaneous absorption kinetics. In general, most of these models have used either diffusion-based or compartmental equations. The object of any mathematical model is to a) be able to represent the processes associated with absorption accurately, b) be able to describe/summarize experimental data with parametric equations or moments, and c) predict kinetics under varying conditions. However, in describing the processes involved, some developed models often suffer from being of too complex a form to be practically useful. In this chapter, we attempt to approach the issue of mathematical modeling in percutaneous absorption from four perspectives. These are to a) describe simple practical models, b) provide an overview of the more complex models, c) summarize some of the more important/useful models used to date, and d) examine some practical applications of the models. The range of processes involved in percutaneous absorption and considered in developing the mathematical models in this chapter is shown in Fig. 1. We initially address in vitro skin diffusion models and consider a) constant donor concentration and receptor conditions, b) the corresponding flux, donor, skin, and receptor amount-time profiles for solutions, and c) amount- and flux-time profiles when the donor phase is removed. More complex issues, such as finite-volume donor phase, finite-volume receptor phase, the presence of an efflux rate constant at the membrane-receptor interface, and two-layer diffusion, are then considered. We then look at specific models and issues concerned with a) release from topical products, b) use of compartmental models as alternatives to diffusion models, c) concentration-dependent absorption, d) modeling of skin metabolism, e) role of solute-skin-vehicle interactions, f) effects of vehicle loss, g) shunt transport, and h) in vivo diffusion, compartmental, physiological, and deconvolution models.
We conclude by examining topics such as a) deep tissue penetration, b) pharmacodynamics, c) iontophoresis, d) sonophoresis, and e) pitfalls in modeling.
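The simplest case mentioned above, constant donor concentration through a homogeneous membrane, reduces at steady state to Fick's first law, with the classical diffusion lag time h^2/(6D). The parameter values below are arbitrary illustrations, not measured skin data:

```python
# Steady-state membrane diffusion sketch (Fick's first law) with the
# classical lag time for a homogeneous membrane. Values are illustrative.

def steady_state_flux(K, D, h, c_donor):
    """Steady-state flux J = K * D * C_donor / h through a membrane."""
    return K * D * c_donor / h

def lag_time(D, h):
    """Classical diffusion lag time t_lag = h^2 / (6 * D)."""
    return h ** 2 / (6 * D)

# Hypothetical values: partition coefficient K, diffusivity D (cm^2/s),
# membrane thickness h (cm), donor concentration (mg/cm^3).
J = steady_state_flux(K=0.1, D=1e-9, h=1e-3, c_donor=50.0)
t_lag = lag_time(D=1e-9, h=1e-3)   # about 167 s for these values
```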
Abstract:
Vector error-correction models (VECMs) have become increasingly important in their application to financial markets. Standard full-order VECM models assume non-zero entries in all their coefficient matrices. However, applications of VECM models to financial market data have revealed that zero entries are often a necessary part of efficient modelling. In such cases, the use of full-order VECM models may lead to incorrect inferences. Specifically, if indirect causality or Granger non-causality exists among the variables, the use of over-parameterised full-order VECM models may weaken the power of statistical inference. In this paper, it is argued that the zero–non-zero (ZNZ) patterned VECM is a more straightforward and effective means of testing for both indirect causality and Granger non-causality. For a ZNZ patterned VECM framework for time series integrated of order two, we provide a new algorithm to select cointegrating and loading vectors that can contain zero entries. Two case studies are used to demonstrate the usefulness of the algorithm in tests of purchasing power parity and a three-variable system involving the stock market.
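The Granger-causality logic behind ZNZ patterning can be illustrated in a much simpler setting: if adding a lag of x does not reduce the residual variance of y beyond y's own lag, the corresponding coefficient entry can be set to zero. The sketch below uses synthetic data and plain lagged regressions, not a full VECM with cointegration-rank tests:

```python
# Toy Granger-causality sketch: compare the residual sum of squares of a
# restricted model (y on its own lag) with an unrestricted one that adds
# lagged x. Synthetic data where x Granger-causes y by construction.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = np.zeros(200)
for t in range(1, 200):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()

def sse(X, target):
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    resid = target - X @ beta
    return float(resid @ resid)

Y, Y1, X1 = y[1:], y[:-1], x[:-1]
restricted = sse(np.column_stack([Y1]), Y)        # y's own lag only
unrestricted = sse(np.column_stack([Y1, X1]), Y)  # add lagged x
# unrestricted is far smaller, so the x -> y coefficient is non-zero
```

In a ZNZ patterned VECM the same comparison is carried out entry by entry on the coefficient matrices, retaining only the entries that earn their place.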
Abstract:
The robustness of mathematical models for biological systems is studied by sensitivity analysis and stochastic simulations. Using a neural network model with three genes as the test problem, we study robustness properties of synthesis and degradation processes. For single parameter robustness, sensitivity analysis techniques are applied for studying parameter variations and stochastic simulations are used for investigating the impact of external noise. Results of sensitivity analysis are consistent with those obtained by stochastic simulations. Stochastic models with external noise can be used for studying the robustness not only to external noise but also to parameter variations. For external noise we also use stochastic models to study the robustness of the function of each gene and that of the system.
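For the synthesis and degradation processes studied above, single-parameter sensitivity can be sketched very simply. The model below, dx/dt = k_syn - k_deg * x with steady state k_syn / k_deg, is a generic illustration (not the three-gene neural network model of the study), with normalised sensitivities computed by central finite differences:

```python
# Single-parameter sensitivity sketch for a synthesis/degradation model
# dx/dt = k_syn - k_deg * x, steady state x* = k_syn / k_deg.
# Parameter values are illustrative.

def steady_state(k_syn, k_deg):
    return k_syn / k_deg

def rel_sensitivity(f, params, name, eps=1e-6):
    """Normalised sensitivity d(log f)/d(log p) by central difference."""
    up = dict(params)
    up[name] *= (1 + eps)
    dn = dict(params)
    dn[name] *= (1 - eps)
    return (f(**up) - f(**dn)) / (2 * eps * f(**params))

p = {"k_syn": 2.0, "k_deg": 0.5}
s_syn = rel_sensitivity(steady_state, p, "k_syn")   # ≈ +1
s_deg = rel_sensitivity(steady_state, p, "k_deg")   # ≈ -1
```

The stochastic-simulation side of the study would instead add noise terms to the dynamics and examine how the distribution of outcomes shifts, which is why the two approaches can cross-validate each other.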