952 results for "Three models"
Abstract:
In this paper, we consider some non-homogeneous Poisson models to estimate the probability that an air quality standard is exceeded a given number of times in a time interval of interest. We assume that the number of exceedances occurs according to a non-homogeneous Poisson process (NHPP). This Poisson process has rate function lambda(t), t >= 0, which depends on some parameters that must be estimated. We take into account two cases of rate functions: the Weibull and the Goel-Okumoto. We consider models with and without change-points. When the presence of change-points is assumed, we may have one, two or three change-points, depending on the data set. The parameters of the rate functions are estimated using a Gibbs sampling algorithm. Results are applied to ozone data provided by the Mexico City monitoring network. We first assume that no change-points are present; depending on the fit of the model, we then allow for one, two or three change-points. Copyright (C) 2009 John Wiley & Sons, Ltd.
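For intuition, both rate functions have closed-form mean functions m(t), and the exceedance probability follows from the Poisson law of N(t). A minimal sketch, with illustrative parameter values rather than the paper's posterior estimates:

```python
import math

def weibull_mean(t, sigma, beta):
    # Mean function m(t) = (t/sigma)**beta of an NHPP with Weibull rate
    # lambda(t) = (beta/sigma) * (t/sigma)**(beta - 1).
    return (t / sigma) ** beta

def goel_okumoto_mean(t, alpha, beta):
    # Mean function m(t) = alpha * (1 - exp(-beta*t)) of the Goel-Okumoto rate
    # lambda(t) = alpha * beta * exp(-beta*t).
    return alpha * (1.0 - math.exp(-beta * t))

def prob_at_least_k_exceedances(m, k):
    # N(t) ~ Poisson(m(t)), so P(N(t) >= k) = 1 - sum_{j<k} exp(-m) m**j / j!.
    return 1.0 - sum(math.exp(-m) * m ** j / math.factorial(j) for j in range(k))

# Example: probability of at least 5 exceedances in 180 days under a Weibull rate.
print(prob_at_least_k_exceedances(weibull_mean(180.0, sigma=60.0, beta=1.3), k=5))
```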
Abstract:
The estimation of data transformations is very useful for yielding response variables that closely satisfy a normal linear model. Generalized linear models enable the fitting of models to a wide range of data types. These models are based on exponential dispersion models. We propose a new class of transformed generalized linear models to extend the Box and Cox models and the generalized linear models. We use the generalized linear model framework to fit these models and discuss maximum likelihood estimation and inference. We give a simple formula to estimate the parameter that indexes the transformation of the response variable for a subclass of models. We also give a simple formula to estimate the rth moment of the original dependent variable. We explore the possibility of applying these models to time series data to extend the generalized autoregressive moving average models discussed by Benjamin et al. [Generalized autoregressive moving average models. J. Amer. Statist. Assoc. 98, 214-223]. The usefulness of these models is illustrated in a simulation study and in applications to three real data sets. (C) 2009 Elsevier B.V. All rights reserved.
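For reference, the Box and Cox (1964) family that these models extend transforms the response as below (a standard definition; the paper's new transformed-GLM subclass and moment formulas are not reproduced here):

```latex
y^{(\lambda)} =
\begin{cases}
\dfrac{y^{\lambda} - 1}{\lambda}, & \lambda \neq 0, \\[4pt]
\log y, & \lambda = 0.
\end{cases}
```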
Abstract:
Inside the 'cavernous sinus' or 'parasellar region', the human internal carotid artery takes the shape of a siphon that is twisted and torqued in three dimensions and surrounded by a network of veins. The parasellar section of the internal carotid artery is of broad biological and medical interest, as its peculiar shape is associated with temperature regulation in the brain and correlated with the occurrence of vascular pathologies. The present study aims to provide anatomical descriptions and objective mathematical characterizations of the shape of the parasellar section of the internal carotid artery in human infants and its modifications during ontogeny. Three-dimensional (3D) computer models of the parasellar section of the internal carotid artery of infants were generated with a state-of-the-art 3D reconstruction method and analysed using both traditional morphometric methods and novel mathematical algorithms. We show that four constant, demarcated bends can be described along the infant parasellar section of the internal carotid artery, and we provide measurements of their angles. We further provide calculations of the curvature and torsion energy and the total complexity of the 3D skeleton of the parasellar section of the internal carotid artery, and compare this complexity between infants and adults. Finally, we examine the relationship between shape parameters of the parasellar section of the internal carotid artery in infants and the occurrence of intima cushions, and evaluate the reliability of subjective angle measurements for characterizing the complexity of the parasellar section of the internal carotid artery in infants. The results can serve as objective reference data for comparative studies and for medical imaging diagnostics. They also form the basis for a new hypothesis that explains the mechanisms responsible for the ontogenetic transformation in the shape of the parasellar section of the internal carotid artery.
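The curvature and torsion quantities behind such complexity measures come from the standard Frenet formulas; a minimal sketch for a discretely sampled 3D skeleton (finite-difference derivatives, not the authors' reconstruction pipeline), where energies of the kind reported would typically be integrals of kappa squared and tau squared along arc length:

```python
import numpy as np

def curvature_torsion(points):
    """Pointwise curvature and torsion of a 3D curve given as an (n, 3) array.

    Standard formulas: kappa = |r' x r''| / |r'|**3 and
    tau = (r' x r'') . r''' / |r' x r''|**2, with derivatives approximated
    by finite differences. Nearly straight segments (|r' x r''| ~ 0) make
    the torsion estimate unstable and should be masked in practice.
    """
    d1 = np.gradient(points, axis=0)
    d2 = np.gradient(d1, axis=0)
    d3 = np.gradient(d2, axis=0)
    cross = np.cross(d1, d2)
    cross_norm = np.linalg.norm(cross, axis=1)
    speed = np.linalg.norm(d1, axis=1)
    kappa = cross_norm / speed ** 3
    tau = np.einsum('ij,ij->i', cross, d3) / cross_norm ** 2
    return kappa, tau
```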
Abstract:
Using digitized images of the three-dimensional, branching structures of root systems of bean seedlings, together with analytical and numerical methods that map a common susceptible-infected-recovered (SIR) epidemiological model onto the bond percolation problem, we show how the spatially correlated branching structures of plant roots affect transmission efficiencies, and hence the invasion criterion, for a soil-borne pathogen as it spreads through ensembles of morphologically complex hosts. We conclude that the inherent heterogeneities in transmissibilities arising from correlations in the degrees of overlap between neighbouring plants render a population of root systems less susceptible to epidemic invasion than a corresponding homogeneous system. Several components of morphological complexity are analysed that contribute to disorder and heterogeneities in the transmissibility of infection. Anisotropy in root shape is shown to increase resilience to epidemic invasion, while increasing the degree of branching enhances the spread of epidemics in the population of roots. Some extensions of the methods to other epidemiological systems are discussed.
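The mapping rests on a standard result: for SIR dynamics with per-contact transmission rate beta and recovery rate gamma, the probability that infection crosses a given contact before the host recovers is a transmissibility, and epidemic invasion corresponds to this transmissibility exceeding the bond percolation threshold of the contact network. In the simplest homogeneous case (the paper's point being precisely that root-system correlations make T heterogeneous):

```latex
T = \frac{\beta}{\beta + \gamma}, \qquad \text{epidemic invasion} \iff T > p_c .
```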
Abstract:
Three-dimensional quantitative structure-activity relationships (3D-QSAR) were performed for a series of analgesic cyclic imides using the CoMFA and CoMSIA methods. Significant correlation coefficients (CoMFA, r(2) = 0.95 and q(2) = 0.72; CoMSIA, r(2) = 0.96 and q(2) = 0.76) were obtained, and the generated models were externally validated using test sets. The final QSAR models as well as the information gathered from 3D contour maps should be useful for the design of novel cyclic imides having improved analgesic activity.
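For readers outside the QSAR field, the cross-validated q(2) reported in studies of this kind is typically the leave-one-out statistic below, while the externally validated r(2) is computed on held-out test compounds:

```latex
q^2 = 1 - \frac{\sum_i \left( y_i - \hat{y}_{(i)} \right)^2}{\sum_i \left( y_i - \bar{y} \right)^2},
```

where y-hat(i) is the prediction for compound i from a model fitted without it.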
Abstract:
Comparative molecular field analysis (CoMFA) studies were conducted on a series of 100 isoniazid derivatives as anti-tuberculosis agents using two receptor-independent structural data set alignment strategies: (1) rigid-body fit, and (2) pharmacophore-based. Significant cross-validated correlation coefficients were obtained (CoMFA(1), q(2) = 0.75 and CoMFA(2), q(2) = 0.74), indicating the predictive potential of the models for untested compounds. The models were then used to predict the inhibitory potency of 20 test set compounds that were not included in the training set, and the predicted values were in good agreement with the experimental results.
Abstract:
Liponucleosides may assist the anchoring of nucleic acid nitrogen bases into biological membranes for tailored nanobiotechnological applications. To this end, precise knowledge about the biophysical and chemical details at the membrane surface is required. In this paper, we used Langmuir monolayers as simplified cell membrane models and studied the insertion of five lipidated nucleosides. These molecules varied in the type of the covalently attached lipid group, the nucleobase, and the number of hydrophobic moieties attached to the nucleoside. All five lipidated nucleosides were found to be surface-active and capable of forming stable monolayers. They could also be incorporated into dipalmitoylphosphatidylcholine (DPPC) monolayers; four of them induced expansion in the surface pressure isotherm and a decrease in the surface compression modulus of DPPC. In contrast, one nucleoside possessing three alkyl chain modifications formed very condensed monolayers and induced film condensation and an increase in the compression modulus of the DPPC monolayer, reflecting the importance of the ability of the nucleoside molecules to be arranged in a closely packed manner. The implications of these results lie in the possibility of tuning nucleic acid pairing by modifying structural characteristics of the liponucleosides. (C) 2010 Elsevier B.V. All rights reserved.
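The surface compression modulus invoked here is the standard monolayer quantity obtained from the pressure-area isotherm (pi, surface pressure; A, area per molecule):

```latex
C_s^{-1} = -A \left( \frac{\partial \pi}{\partial A} \right)_T ,
```

so a larger compression modulus corresponds to a more condensed, less compressible film.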
Abstract:
In this article, we compare three residuals based on the deviance component in generalised log-gamma regression models with censored observations. For different parameter settings, sample sizes and censoring percentages, various simulation studies are performed, and the empirical distribution of each residual is displayed and compared with the standard normal distribution. In all cases studied, the empirical distributions of the proposed residuals are in general symmetric around zero, but only a martingale-type residual presented negligible kurtosis in the majority of cases. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to the martingale-type residual in generalised log-gamma regression models with censored data. A lifetime data set is analysed under log-gamma regression models, and model checking based on the martingale-type residual is performed.
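For context, the martingale residual commonly used with censored data takes the form below (delta_i the censoring indicator, Lambda-hat the fitted cumulative hazard, S-hat the fitted survival function); the martingale-type residual studied in the paper for log-gamma models is a residual of this kind:

```latex
r_{M_i} = \delta_i - \hat{\Lambda}(t_i \mid \mathbf{x}_i) = \delta_i + \log \hat{S}(t_i \mid \mathbf{x}_i).
```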
Abstract:
In order to extend previous SAR and QSAR studies, 3D-QSAR analysis has been performed using CoMFA and CoMSIA approaches applied to a set of 39 alpha-(N)-heterocyclic carboxaldehyde thiosemicarbazones with their inhibitory activity values (IC(50)) evaluated against ribonucleotide reductase (RNR) of H.Ep.-2 cells (human epidermoid carcinoma), taken from the selected literature. Both rigid and field alignment methods, taking the unsubstituted 2-formylpyridine thiosemicarbazone in its syn conformation as template, have been used to generate multiple predictive CoMFA and CoMSIA models derived from training sets and validated with the corresponding test sets. Acceptable predictive correlation coefficients (Q(cv)(2) from 0.360 to 0.609 for CoMFA and Q(cv)(2) from 0.394 to 0.580 for CoMSIA models) with high fitted correlation coefficients (r(2) from 0.881 to 0.981 for CoMFA and r(2) from 0.938 to 0.993 for CoMSIA models) and low standard errors (s from 0.135 to 0.383 for CoMFA and s from 0.098 to 0.240 for CoMSIA models) were obtained. More precise CoMFA and CoMSIA models have been derived by considering the subset of thiosemicarbazones (TSC) substituted only at the 5-position of the pyridine ring (n=22). Reasonable predictive correlation coefficients (Q(cv)(2) from 0.486 to 0.683 for CoMFA and Q(cv)(2) from 0.565 to 0.791 for CoMSIA models) with high fitted correlation coefficients (r(2) from 0.896 to 0.997 for CoMFA and r(2) from 0.991 to 0.998 for CoMSIA models) and very low standard errors (s from 0.040 to 0.179 for CoMFA and s from 0.029 to 0.068 for CoMSIA models) were obtained. The stability of each CoMFA and CoMSIA model was further assessed by bootstrapping analysis. For the two sets, the generated CoMSIA models showed, in general, better statistics than the corresponding CoMFA models. The analysis of the CoMFA and CoMSIA contour maps suggests that a hydrogen bond acceptor near the nitrogen of the pyridine ring can enhance inhibitory activity. This observation agrees with literature data suggesting that the pyridine nitrogen lone pairs can complex with the iron ion, leading to species that inhibit RNR. The derived CoMFA and CoMSIA models contribute to understanding the structural features of this class of TSC as antitumor agents in terms of steric, electrostatic, hydrophobic, hydrogen bond donor and hydrogen bond acceptor fields, as well as to the rational design of inhibitors of this key enzyme.
Abstract:
Background and aims: The main purpose of the PEDAL study is to identify and estimate, for a sample of patients, individual pharmacokinetic-pharmacodynamic (PK/PD) models for duodenal infusion of levodopa/carbidopa (Duodopa®) that can be used for in numero simulation of treatment strategies. Other objectives are to study the absorption of Duodopa® and to form a basis for power calculations for a future larger study. PK/PD modelling based on oral levodopa is problematic because of irregular gastric emptying. Preliminary work with data from [Gundert-Remy U et al. Eur J Clin Pharmacol 1983;25:69-72] suggested that levodopa infusion pharmacokinetics can be described by a two-compartment model. Background research led to a hypothesis for an effect model incorporating concentration-unrelated fluctuations, more complex than standard E-max models. Methods: PEDAL involved a few patients already on Duodopa®. A bolus dose (the normal morning dose plus 50%) was given after an overnight washout. Data collection continued until the clinical effect was back at baseline. The procedure was repeated on two non-consecutive days per patient. The following data were collected at 5- to 15-minute intervals: i) Accelerometer data. ii) Three e-diary questions about ability to walk and feelings of "off" and "dyskinesia". iii) Clinical assessment of motor function by a physician. iv) Plasma concentrations of levodopa, carbidopa and the metabolite 3-O-methyldopa. The main effect variable will be the clinical assessment. Results: At the date of abstract submission, lab analyses were still being performed. Modelling results, simulation experiments and conclusions will be presented in our poster.
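A two-compartment disposition model with input into the central compartment can be sketched as below; the rate constants, volume and dosing are hypothetical illustrations, not PEDAL estimates, and the study's effect model is not shown:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical micro-rate constants (1/h) and central volume of distribution (L).
k10, k12, k21, v1 = 0.5, 0.3, 0.2, 30.0

def infusion_rate(t):
    # Morning bolus approximated as 150 mg delivered over 0.5 h.
    return 300.0 if t < 0.5 else 0.0  # mg/h

def rhs(t, a):
    a1, a2 = a  # drug amounts (mg) in central and peripheral compartments
    da1 = infusion_rate(t) - (k10 + k12) * a1 + k21 * a2
    da2 = k12 * a1 - k21 * a2
    return [da1, da2]

sol = solve_ivp(rhs, (0.0, 8.0), [0.0, 0.0], dense_output=True, max_step=0.05)
t_grid = np.linspace(0.0, 8.0, 161)
concentration = sol.sol(t_grid)[0] / v1  # plasma concentration profile (mg/L)
```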
Abstract:
Background: The sensitivity to microenvironmental changes varies among animals and may be under genetic control. It is essential to take this element into account when aiming at breeding robust farm animals. Here, linear mixed models with genetic effects in the residual variance part of the model can be used. Such models have previously been fitted using EM and MCMC algorithms. Results: We propose the use of double hierarchical generalized linear models (DHGLM), where the squared residuals are assumed to be gamma distributed and the residual variance is fitted using a generalized linear model. The algorithm iterates between two sets of mixed model equations, one on the level of observations and one on the level of variances. The method was validated using simulations and also by re-analyzing a data set on pig litter size that was previously analyzed using a Bayesian approach. The pig litter size data contained 10,060 records from 4,149 sows. The DHGLM was implemented using the ASReml software and the algorithm converged within three minutes on a Linux server. The estimates were similar to those previously obtained using Bayesian methodology, especially the variance components in the residual variance part of the model. Conclusions: We have shown that variance components in the residual variance part of a linear mixed model can be estimated using a DHGLM approach. The method enables analyses of animal models with large numbers of observations. An important future development of the DHGLM methodology is to include the genetic correlation between the random effects in the mean and residual variance parts of the model as a parameter of the DHGLM.
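The alternating structure of the algorithm can be illustrated with fixed effects only. This is a deliberately simplified sketch on synthetic data: the actual DHGLM iterates two sets of mixed model equations with random effects and leverage corrections, and the paper's implementation uses ASReml.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical data: X drives the mean, Z drives the log residual variance.
rng = np.random.default_rng(1)
n = 500
X = sm.add_constant(rng.normal(size=(n, 2)))
Z = sm.add_constant(rng.normal(size=(n, 1)))
y = X @ np.array([1.0, 0.5, -0.3]) + np.exp(0.4 * Z[:, 1]) * rng.normal(size=n)

weights = np.ones(n)  # working weights, 1 / sigma_i**2
for _ in range(10):
    # (1) Mean model, weighted by the current inverse fitted variances.
    mean_fit = sm.WLS(y, X, weights=weights).fit()
    # (2) Squared residuals treated as gamma responses with a log link.
    d = mean_fit.resid ** 2
    var_fit = sm.GLM(d, Z, family=sm.families.Gamma(sm.families.links.Log())).fit()
    weights = 1.0 / var_fit.fittedvalues  # update weights and iterate
```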
Predictive models for chronic renal disease using decision trees, Naïve Bayes and case-based methods
Abstract:
Data mining can be used in the healthcare industry to "mine" clinical data and discover hidden information for intelligent and effective decision making. The discovery of hidden patterns and relationships often goes untapped, and advanced data mining techniques can remedy this. This thesis mainly deals with Intelligent Prediction of Chronic Renal Disease (IPCRD). The data cover blood tests, urine tests, and external symptoms used to predict chronic renal disease. Data from the database are first imported into Weka (3.6), and the Chi-Square method is used for feature selection. After normalizing the data, three classifiers were applied and the efficiency of the output was evaluated: Decision Tree, Naïve Bayes, and the K-Nearest Neighbour (KNN) algorithm. Results show that each technique has its unique strength in realizing the objectives of the defined mining goals. The efficiency of the Decision Tree and KNN was almost the same, but Naïve Bayes showed a comparative edge over the others. Further, sensitivity and specificity are used as statistical measures to examine the performance of the binary classification: sensitivity (also called recall in some fields) measures the proportion of actual positives that are correctly identified, while specificity measures the proportion of negatives that are correctly identified. The CRISP-DM methodology is applied to build the mining models. It consists of six major phases: business understanding, data understanding, data preparation, modeling, evaluation, and deployment.
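A minimal sketch of this evaluation pipeline, using synthetic stand-in features rather than the thesis's blood/urine/symptom data:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import confusion_matrix

# Hypothetical stand-in for the clinical features; 1 = chronic renal disease.
X, y = make_classification(n_samples=400, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

for model in (DecisionTreeClassifier(random_state=0), GaussianNB(),
              KNeighborsClassifier(n_neighbors=5)):
    pred = model.fit(X_tr, y_tr).predict(X_te)
    tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
    sensitivity = tp / (tp + fn)  # recall: actual positives correctly identified
    specificity = tn / (tn + fp)  # actual negatives correctly identified
    print(type(model).__name__, round(sensitivity, 3), round(specificity, 3))
```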
Abstract:
Using vector autoregressive (VAR) models and Monte-Carlo simulation methods, we investigate the potential gains for forecasting accuracy and estimation uncertainty of two commonly used restrictions arising from economic relationships. The first reduces the parameter space by imposing long-term restrictions on the behavior of economic variables, as discussed by the literature on cointegration, and the second reduces the parameter space by imposing short-term restrictions, as discussed by the literature on serial-correlation common features (SCCF). Our simulations cover three important issues in model building, estimation, and forecasting. First, we examine the performance of standard and modified information criteria in choosing the lag length for cointegrated VARs with SCCF restrictions. Second, we provide a comparison of the forecasting accuracy of fitted VARs when only cointegration restrictions are imposed and when cointegration and SCCF restrictions are jointly imposed. Third, we propose a new estimation algorithm in which short- and long-term restrictions interact to estimate the cointegrating and the cofeature spaces, respectively. We have three basic results. First, ignoring SCCF restrictions has a high cost in terms of model selection, because standard information criteria too frequently choose inconsistent models with too small a lag length; criteria selecting lag and rank simultaneously have a superior performance in this case. Second, this translates into a superior forecasting performance of the restricted VECM over the VECM, with important improvements in forecasting accuracy, reaching more than 100% in extreme cases. Third, the new algorithm proposed here fares very well in terms of parameter estimation, even when we consider the estimation of long-term parameters, opening up the discussion of joint estimation of short- and long-term parameters in VAR models.
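The cointegration-restricted half of this comparison is straightforward to reproduce; a sketch with synthetic cointegrated data follows (the SCCF restrictions and the new joint estimation algorithm are not part of standard libraries and are omitted):

```python
import numpy as np
from statsmodels.tsa.vector_ar.vecm import VECM, select_coint_rank

# Hypothetical system: two I(1) series sharing one common stochastic trend.
rng = np.random.default_rng(0)
trend = np.cumsum(rng.normal(size=500))
data = np.column_stack([trend + rng.normal(size=500),
                        0.5 * trend + rng.normal(size=500)])

# Johansen trace test selects the cointegrating rank.
rank = select_coint_rank(data, det_order=-1, k_ar_diff=1).rank
fit = VECM(data, k_ar_diff=1, coint_rank=rank, deterministic="n").fit()
print(fit.predict(steps=4))  # forecasts from the cointegration-restricted model
```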
Abstract:
This paper develops background considerations to help better frame the results of a CGE exercise. Three main criticisms are usually levelled at CGE efforts. First, they are too aggregate, their conclusions failing to shed light on relevant sectors or issues. Second, they imply huge data requirements; timeliness is frequently jeopardised by out-dated sources, with benchmarks referring to realities gone by. Finally, results are meaningless, as they answer wrong or ill-posed questions: modelling demands end up creating a rather artificial context in which the original questions lose content. In spite of a positive outlook on the first two, the crucial questions lie in the third point. After elaborating such questions, and trying to answer some, the text argues that CGE models can come closer to reality. Even if their use is still too scarce to give way to a fruitful symbiosis between negotiations and simulation results, they remain the only available technique providing a global, inter-related way of capturing the economy-wide effects of several different policies. International organisations can play a major role in supporting and encouraging improvements. They are also uniquely positioned to enhance information and data sharing, as well as to bring people from various origins together to share their experiences. Serious and complex homework is, however, required to correct, at least, the most dangerous present shortcomings of the technique.
Abstract:
This thesis is composed of three articles on the subjects of macroeconomics and finance. Each article corresponds to a chapter and is written in paper format. In the first article, co-authored with Axel Simonsen, we model and estimate a small open economy for Canada in a two-country dynamic stochastic general equilibrium (DSGE) framework. We show that it is important to account for the correlation between domestic and foreign shocks and for incomplete pass-through. In the second chapter-paper, co-authored with Hedibert Freitas Lopes, we estimate a regime-switching macro-finance model of the term structure of interest rates to study the US post-World War II (WWII) joint behavior of macro-variables and the yield curve. We show that our model tracks the US NBER cycles well, that the addition of regime changes is important for explaining the Expectations Theory of the term structure, and that macro-variables have increasing importance in recessions for explaining the variability of the yield curve. We also present a novel sequential Monte-Carlo algorithm to learn about the parameters and the latent states of the economy. In the third chapter, I present a Gaussian Affine Term Structure Model (ATSM) with latent jumps in order to address two questions: (1) what are the implications of incorporating jumps in an ATSM for Asian option pricing, in the particular case of the Brazilian DI Index (IDI) option, and (2) how do jumps and options affect the bond risk-premia dynamics. I show that jump risk-premia are negative in a scenario of decreasing interest rates (my sample period) and are important for explaining the level of yields, and that Gaussian models without jumps and with constant-intensity jumps are good for pricing Asian options.
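In generic form, a Gaussian ATSM with jumps ties the short rate affinely to latent factors whose dynamics include a jump component (a textbook representation in the spirit of Duffie, Pan and Singleton (2000), not the thesis's exact specification):

```latex
r_t = \delta_0 + \delta_1^{\top} X_t, \qquad
dX_t = \kappa(\theta - X_t)\,dt + \Sigma\,dW_t + J_t\,dN_t,
```

where N_t is a jump counting process (with constant or state-dependent intensity) and bond prices remain exponentially affine, P(t, tau) = exp(A(tau) + B(tau)'X_t), when the jump intensity is affine in the state.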