856 results for C33 - Models with Panel Data
Abstract:
Proton computerized tomography deals with relatively thick targets such as the human head or trunk. In this case, precise analytical calculation of the final proton energy is a rather complicated task, so Monte Carlo simulation stands out as a solution. We used the GEANT4.8.2 code to calculate the proton final energy spectra after passage through a thick Al absorber and compared them with experimental data obtained under the same conditions. The ICRU49, Ziegler85 and Ziegler2000 models from the low-energy extension pack were used. The results were also compared with SRIM2008 and MCNPX2.4 simulations, and with solutions of the Boltzmann transport equation in the Fokker-Planck approximation. (C) 2009 Elsevier Ltd. All rights reserved.
Abstract:
Linear mixed effects models have been widely used in the analysis of data where responses are clustered through shared random effects, so that it is not reasonable to assume independence between observations in the same cluster. In most biological applications, the distributions of the random effects and of the residuals are assumed to be Gaussian, which makes inferences vulnerable to the presence of outliers. Here, linear mixed effects models with normal/independent residual distributions for robust inference are described. The specific distributions examined include univariate and multivariate versions of the Student-t, the slash and the contaminated normal. A Bayesian framework is adopted and Markov chain Monte Carlo is used to carry out the posterior analysis. The procedures are illustrated using birth weight data on rats in a toxicological experiment. Results from the Gaussian and robust models are contrasted, and it is shown how the implementation can be used for outlier detection. The thick-tailed distributions provide an appealing robust alternative to the Gaussian model in linear mixed models, and they are easily implemented using data augmentation and MCMC techniques.
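As an illustration of the normal/independent construction just described, a Student-t residual can be written as a normal with a latent gamma mixing weight; this standard scale-mixture representation (generic notation, not necessarily the paper's) is what makes the data-augmentation MCMC steps tractable:

```latex
% Scale-mixture (normal/independent) representation of a Student-t residual:
\epsilon_i \mid u_i \sim \mathrm{N}\!\bigl(0,\ \sigma^2 / u_i\bigr), \qquad
u_i \sim \mathrm{Gamma}\!\bigl(\tfrac{\nu}{2},\ \tfrac{\nu}{2}\bigr)
\;\Longrightarrow\; \epsilon_i \sim t_\nu\bigl(0,\ \sigma^2\bigr).
% Small posterior weights E[u_i | data] flag candidate outliers.
```

The slash and contaminated-normal cases follow by replacing the gamma mixing distribution with a beta or a two-point distribution, respectively.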
Abstract:
Proton beams in medical applications deal with relatively thick targets such as the human head or trunk. Therefore, relatively small differences in the total proton stopping power given, for example, by the different models provided by GEANT4 can lead to significant disagreements in the final proton energy spectra when integrated along lengthy proton trajectories. This work presents proton energy spectra obtained by GEANT4.8.2 simulations using the ICRU49, Ziegler1985 and Ziegler2000 models for 19.68 MeV protons passing through Al absorbers of various thicknesses. The spectra were compared with experimental data, with TRIM/SRIM2008 and MCNPX2.4.0 simulations, and with the Payne analytical solution of the transport equation in the Fokker-Planck approximation. It is shown that the MCNPX simulations reproduce all experimental spectra reasonably well. For relatively thin targets all the methods give practically identical results, but this is not the case for the thick absorbers. It should be noted that all the spectra were measured at proton energies significantly above 2 MeV, i.e., in the so-called Bethe-Bloch region. The observed disagreements between GEANT4 results simulated with different models are therefore somewhat unexpected. Further studies are necessary for better understanding and definitive conclusions. © 2009 American Institute of Physics.
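For orientation, in the Bethe-Bloch region mentioned above the electronic stopping power follows, to leading order and ignoring shell and relativistic corrections, the textbook Bethe formula (a generic expression, not the specific parameterisation of any of the models compared):

```latex
-\frac{dE}{dx} \;=\; \frac{4\pi e^4 z^2}{m_e v^2}\, n_e\,
\ln\!\frac{2 m_e v^2}{I},
```

where z and v are the projectile charge and velocity, n_e is the electron density of the absorber, and I is its mean excitation energy; stopping-power tables typically differ mainly in how quantities such as I and the low-energy corrections are parameterised.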
Abstract:
Numerical modeling of the interaction between waves and coastal structures is a challenge due to the many nonlinear phenomena involved, such as wave propagation, wave transformation with water depth, interaction between incident and reflected waves, run-up/run-down and wave overtopping. Numerical models based on a Lagrangian formulation, like SPH (Smoothed Particle Hydrodynamics), allow complex free-surface flows to be simulated. The validation of these numerical models is essential, but comparing numerical results with experimental data is not an easy task. In the present paper, two SPH numerical models, SPHysics LNEC and SPH UNESP, are validated by comparing the numerical results of waves interacting with a vertical breakwater with data obtained in physical model tests carried out in one of LNEC's flumes. To achieve this validation, the experimental set-up was designed to be compatible with the characteristics of the numerical models. The flume dimensions are therefore exactly the same for the numerical and physical models, and the incident wave characteristics are identical, which allows the accuracy of the numerical models to be determined, particularly regarding two complex phenomena: wave breaking and impact loads on the breakwater. It is shown that partial renormalization, i.e. renormalization applied only to particles near the structure, seems to be a promising compromise and an original method that allows waves to be propagated without diffusion while the pressure field near the structure is modeled accurately.
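The renormalization referred to is, in its usual SPH form, a Shepard-type correction of the kernel interpolation; a generic sketch in standard SPH notation (kernel W, smoothing length h, particle masses m_j and densities ρ_j), not necessarily the exact filter used by either code:

```latex
\langle A \rangle_i \;=\;
\frac{\displaystyle\sum_j A_j\, W(\mathbf{r}_i - \mathbf{r}_j, h)\,\frac{m_j}{\rho_j}}
     {\displaystyle\sum_j W(\mathbf{r}_i - \mathbf{r}_j, h)\,\frac{m_j}{\rho_j}},
```

which restores zeroth-order consistency of the interpolation near boundaries and free surfaces, where the kernel support is truncated.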
Abstract:
Background: Infant mortality is an important measure of human development, related to the level of welfare of a society. In order to inform public policy, various studies have tried to identify the factors that influence, at an aggregated level, infant mortality. The objective of this paper is to analyze the regional pattern of infant mortality in Brazil, evaluating the effect of infrastructure, socio-economic and demographic variables to understand its distribution across the country. Methods: Regressions including socio-economic and living-conditions variables are conducted within a panel data structure. More specifically, a spatial panel data model with fixed effects and a spatial error autocorrelation structure is used to address spatial dependence problems. The use of a spatial modeling approach takes into account the potential presence of spillovers between neighboring spatial units. The spatial units considered are Minimum Comparable Areas, defined to provide a consistent definition across Census years. Data are drawn from the 1980, 1991 and 2000 Censuses of Brazil, and from data collected by the Ministry of Health (DATASUS). In order to identify the influence of health care infrastructure, variables related to the number of public and private hospitals are included. Results: The results indicate that the panel model with spatial effects provides the best fit to the data. The analysis confirms that the provision of health care infrastructure and social policy measures (e.g., improving educational attainment) are linked to reduced rates of infant mortality. An original finding concerns the role of spatial effects in the analysis of IMR. Spillover effects associated with health infrastructure and water and sanitation facilities imply that there are regional benefits beyond the unit of analysis. Conclusions: A spatial modeling approach is important to produce reliable estimates in the analysis of panel IMR data. Substantively, this paper contributes to our understanding of the physical and social factors that influence IMR in the case of a developing country.
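A fixed-effects panel specification with spatially autocorrelated errors, of the kind described in the Methods, takes the following generic form (standard notation; the covariate set and the weights matrix are the authors' choices, not reproduced here):

```latex
y_{it} = \mathbf{x}_{it}'\,\boldsymbol{\beta} + \mu_i + \phi_{it}, \qquad
\boldsymbol{\phi}_t = \lambda\, W\, \boldsymbol{\phi}_t + \boldsymbol{\varepsilon}_t,
\qquad \boldsymbol{\varepsilon}_t \sim \mathrm{N}\!\bigl(\mathbf{0},\ \sigma^2 I\bigr),
```

where μ_i are fixed effects for the Minimum Comparable Areas, W is a spatial weights matrix and λ measures the strength of the spatial error autocorrelation.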
Weibull and generalised exponential overdispersion models with an application to ozone air pollution
Abstract:
We consider the problem of estimating the mean and variance of the time between occurrences of an event of interest (inter-occurrence times) where some form of dependence between two consecutive time intervals is allowed. Two basic density functions are taken into account: the Weibull and the generalised exponential. In order to capture the dependence between two consecutive inter-occurrence times, we assume that the shape and/or the scale parameters of the two density functions are given by auto-regressive models. Expressions for the mean and variance of the inter-occurrence times are presented. The models are applied to ozone data from two regions of Mexico City. The estimation of the parameters is performed from a Bayesian point of view via Markov chain Monte Carlo (MCMC) methods.
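For concreteness, the two baseline densities are given below in standard parameterisations with shape α and scale λ; the log-scale AR link shown is one plausible form of the dependence described, labelled illustrative because the abstract does not spell out the authors' exact specification:

```latex
% Weibull and generalised exponential densities:
f_{W}(t) = \alpha\lambda\, t^{\alpha-1} e^{-\lambda t^{\alpha}}, \qquad
f_{GE}(t) = \alpha\lambda\,\bigl(1 - e^{-\lambda t}\bigr)^{\alpha-1} e^{-\lambda t},
\qquad t > 0.
% Illustrative AR-type link between consecutive inter-occurrence times:
\log \lambda_i = a + b\,\log \lambda_{i-1} + \eta_i .
```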
Abstract:
Background: In addition to the oncogenic human papillomavirus (HPV), several cofactors are needed in cervical carcinogenesis, but whether the HPV covariates associated with incident i) CIN1 are different from those of incident ii) CIN2 and iii) CIN3 needs further assessment. Objectives: To gain further insight into the true biological differences between CIN1, CIN2 and CIN3, we assessed the HPV covariates associated with incident CIN1, CIN2 and CIN3. Study Design and Methods: HPV covariates associated with progression to CIN1, CIN2 and CIN3 were analysed in the combined cohort of the NIS (n = 3,187) and LAMS study (n = 12,114), using competing-risks regression models (in panel data) for baseline HR-HPV-positive women (n = 1,105), who represent a sub-cohort of all 1,865 women prospectively followed up in these two studies. Results: Altogether, 90 (4.8%), 39 (2.1%) and 14 (1.4%) cases progressed to CIN1, CIN2 and CIN3, respectively. Among these baseline HR-HPV-positive women, the risk profiles of incident CIN1, CIN2 and CIN3 were unique, in that completely different HPV covariates were associated with progression to CIN1, CIN2 and CIN3, irrespective of which categories (non-progression, CIN1, CIN2, CIN3 or all) were used as competing-risks events in univariate and multivariate models. Conclusions: These data confirm our previous analysis based on multinomial regression models, indicating that distinct covariates of HR-HPV are associated with progression to CIN1, CIN2 and CIN3. This emphasises true biological differences between the three grades of CIN, which calls into question the practice of combining CIN2 with CIN3 or with CIN1 in histological classification, or of using such combinations as a common end-point, e.g., in HPV vaccine trials.
Abstract:
In this paper, we propose nonlinear elliptical models for correlated data with heteroscedastic and/or autoregressive structures. Our aim is to extend the models proposed by Russo et al. [22] by considering a more sophisticated scale structure to deal with variations in data dispersion and/or possible autocorrelation among measurements taken on the same experimental unit. Moreover, to avoid the possible influence of outlying observations, or to take into account the non-normal symmetric tails of the data, we assume elliptical contours for the joint distribution of random effects and errors, which allows us to attribute different weights to the observations. We propose an iterative algorithm to obtain the maximum-likelihood estimates of the parameters and derive the local influence curvatures for some specific perturbation schemes. The motivation for this work comes from a pharmacokinetic indomethacin data set, which was analysed previously by Bocheng and Xuping [1] under normality.
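A generic form of the model class described, written here only as an illustration (the notation and the AR(1)-type scale structure are assumptions; the authors' exact specification is in the paper):

```latex
y_{ij} = f(x_{ij}, \boldsymbol{\beta}) + \epsilon_{ij}, \qquad
\boldsymbol{\epsilon}_i \sim \mathrm{El}_{n_i}\!\bigl(\mathbf{0},\ \Sigma_i(\boldsymbol{\tau})\bigr),
\qquad [\Sigma_i]_{jk} = \sigma_{ij}\,\sigma_{ik}\,\rho^{\,|t_{ij}-t_{ik}|},
```

with heteroscedastic standard deviations σ_ij modelled through covariates and ρ governing the autoregressive decay; Student-t contours, for example, automatically downweight outlying units.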
Abstract:
The choice of an appropriate family of linear models for the analysis of longitudinal data is often a matter of concern for practitioners. To ease such difficulties, we discuss some issues that emerge when analyzing this type of data via a practical example involving pretest-posttest longitudinal data. In particular, we consider log-normal linear mixed models (LNLMM), generalized linear mixed models (GLMM), and models based on generalized estimating equations (GEE). We show how some special features of the data, like a nonconstant coefficient of variation, may be handled in the three approaches, and evaluate their performance with respect to the magnitude of the standard errors of interpretable and comparable parameters. We also show how different diagnostic tools may be employed to identify outliers, and comment on available software. We conclude by noting that the results are similar, but that GEE-based models may be preferable when the goal is to compare the marginal expected responses.
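As a minimal sketch of two of the three approaches, using Python's statsmodels on hypothetical pretest-posttest data (the variable names and simulated data are illustrative, not the paper's): a log-normal LMM is a Gaussian mixed model fitted on the log scale, while a GEE with a log link models the marginal expected response directly.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical pretest-posttest data: one row per subject and occasion.
rng = np.random.default_rng(0)
n = 100
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n), 2),
    "post": np.tile([0, 1], n),
})
df["y"] = np.exp(0.5 + 0.3 * df["post"] + rng.normal(0, 0.4, 2 * n))

X = sm.add_constant(df[["post"]])

# Log-normal linear mixed model: Gaussian LMM with a random intercept,
# fitted to log(y).
lnlmm = sm.MixedLM(np.log(df["y"]), X, groups=df["subject"]).fit()

# GEE with a log link and exchangeable working correlation: models the
# marginal mean of y itself.
gee = sm.GEE(df["y"], X, groups=df["subject"],
             family=sm.families.Gamma(link=sm.families.links.Log()),
             cov_struct=sm.cov_struct.Exchangeable()).fit()

print(lnlmm.params)
print(gee.params)
```

The GLMM counterpart would add a subject-level random effect inside a generalized linear model; the paper's comparison of standard errors is across such comparable parameterisations.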
Abstract:
The aim of this study was to develop a model capable of capturing the different contributions that characterize the nonlinear behaviour of reinforced concrete structures. In particular, especially for non-slender structures, the contribution to the nonlinear deformation due to bending may not be sufficient to determine the structural response. Two different models based on a fibre beam-column element are proposed here. These models can reproduce the flexure-shear interaction in the nonlinear range, with the purpose of improving the analysis of shear-critical structures. The first element discussed is based on a flexibility formulation combined with the Modified Compression Field Theory (MCFT) as the material constitutive law. The other model described in this thesis is based on a three-field variational formulation combined with a 3D generalized plastic-damage model as the constitutive relationship. The first model was developed by combining a fibre beam-column element based on the flexibility formulation with the MCFT as the constitutive relationship; the flexibility formulation, in fact, appears to be particularly effective for analysis in the nonlinear range. The coupling between the fibre element, which models the structure, and the shear panels, which model the individual fibres, makes it possible to describe the nonlinear response associated with flexure and shear, and especially their interaction in the nonlinear range. The model was implemented in an original MATLAB® computer code for describing the response of generic structures. The simulations carried out allowed the working range of the model to be verified. Comparisons with available experimental results for reinforced concrete shear walls were performed in order to validate the model; these results have the peculiarity of distinguishing the separate contributions due to flexure and shear. The simulations presented were carried out, in particular, for monotonic loading. The model was also tested through numerical comparisons with other computer programs, and finally it was applied in a numerical study on the influence of the nonlinear shear response of non-slender reinforced concrete (RC) members. Another approach to the problem was studied during a period of research at the University of California, Berkeley. The beam formulation follows the assumptions of the Timoshenko shear beam theory for the displacement field and uses a three-field variational formulation in the derivation of the element response. A generalized plasticity model is implemented for structural steel, and a 3D plastic-damage model is used for the simulation of concrete. The transverse normal stress is used to satisfy the transverse equilibrium equations at each control section; this criterion is also used for the condensation of degrees of freedom from the 3D constitutive material model to the beam element. This thesis presents the beam formulation and the constitutive relationships; various analyses and comparisons between the two models presented are still being carried out.
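The fibre discretisation at the heart of both elements reduces, at each control section, to integrating the fibre stresses into section resultants; a generic sketch (sign conventions and shear terms omitted, since they depend on the formulation):

```latex
N(x) = \sum_{k=1}^{n_f} \sigma_k\, A_k, \qquad
M(x) = \sum_{k=1}^{n_f} \sigma_k\, y_k\, A_k,
```

where σ_k is the stress returned by the constitutive law of fibre k (an MCFT panel in the first model, the 3D plastic-damage model in the second), A_k its tributary area and y_k its distance from the reference axis.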
Abstract:
The first chapter of this work aims to provide a brief overview of the history of our Universe, in the context of string theory and considering inflation as its possible application to cosmological problems. We then discuss type IIB string compactifications, introducing the study of the inflaton, a scalar field that is a candidate for driving inflation. The Large Volume Scenario (LVS) is studied in the second chapter, paying particular attention to the stabilisation of the Kähler moduli, which are four-dimensional gravitationally coupled scalar fields that parameterise the size of the extra dimensions. Moduli stabilisation is the process through which these particles acquire a mass and can become promising inflaton candidates. The third chapter is devoted to the study of Fibre Inflation, an interesting inflationary model derived within the context of LVS compactifications. The fourth chapter tries to extend the slow-roll region of the scalar potential by taking larger values of the field φ. This is done with the purpose of studying in detail deviations of the cosmological observables, which can better reproduce current experimental data. Finally, we present a slight modification of Fibre Inflation based on a different compactification manifold. This new model produces larger tensor modes, with a spectral index in good agreement with the data released in February 2015 by the Planck satellite.
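The cosmological observables referred to follow from the standard single-field slow-roll dictionary (textbook definitions, with M_p the reduced Planck mass; the specific potential V(φ) is the model's):

```latex
\epsilon = \frac{M_p^2}{2}\left(\frac{V'}{V}\right)^{\!2}, \qquad
\eta = M_p^2\,\frac{V''}{V}, \qquad
n_s \simeq 1 - 6\epsilon + 2\eta, \qquad r \simeq 16\,\epsilon,
```

so extending the slow-roll region to larger φ shifts the predicted spectral index n_s and tensor-to-scalar ratio r.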
Abstract:
Chlamydia trachomatis is the most common bacterial sexually transmitted infection (STI) in many developed countries. The highest prevalence rates are found among young adults with high rates of partner change. Three published individual-based models have incorporated a detailed description of age-specific sexual behaviour in order to quantify the transmission of C. trachomatis in the population and to assess the impact of screening interventions. Owing to varying assumptions about sexual partnership formation and dissolution, and the great uncertainty about critical parameters, such models show conflicting results about the impact of preventive interventions. Here, we perform a detailed evaluation of these models by comparing the partnership formation and dissolution dynamics with data from Natsal 2000, a population-based probability sample survey of sexual attitudes and lifestyles in Britain. The data also allow us to describe the dispersion of C. trachomatis infections as a function of sexual behaviour, using the Gini coefficient. We suggest that the Gini coefficient is a useful measure for calibrating infectious disease models that include risk structure, and highlight the need to estimate this measure for other STIs.
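A Gini coefficient of infection dispersion can be computed directly from per-individual counts; a minimal Python sketch, with hypothetical counts standing in for the survey data:

```python
import numpy as np

def gini(x):
    """Gini coefficient of a non-negative sample: 0 when infections are
    spread evenly across individuals, approaching 1 when concentrated
    in a few."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    cum = np.cumsum(x)
    # Identity for sorted data: G = (n + 1 - 2 * sum(cum) / total) / n
    return (n + 1 - 2 * np.sum(cum) / cum[-1]) / n

# Hypothetical counts of C. trachomatis infections per survey respondent
counts = np.array([0, 0, 0, 0, 1, 0, 2, 0, 0, 5])
print(gini(counts))  # 0.8: infections concentrated in a few individuals
```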
Abstract:
Marginal generalized linear models can be used for clustered and longitudinal data by fitting a model as if the data were independent and using an empirical estimator of the parameter standard errors. We extend this approach to data where the number of observations correlated with a given one grows with sample size, and show that parameter estimates are consistent and asymptotically normal with a slower convergence rate than for independent data, and that an information sandwich variance estimator is consistent. We present the two problems that motivated this work: the modelling of patterns of HIV genetic variation, and the behavior of clustered-data estimators when clusters are large.
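The information sandwich referred to has the familiar generic form (working-independence notation; the paper's contribution concerns its behaviour when the inner sum runs over a correlation neighbourhood that grows with the sample size):

```latex
\widehat{\mathrm{Var}}(\hat{\boldsymbol{\beta}})
= \hat{A}^{-1}\, \hat{B}\, \hat{A}^{-1}, \qquad
\hat{A} = -\sum_i
  \frac{\partial^2 \ell_i}{\partial \boldsymbol{\beta}\, \partial \boldsymbol{\beta}'}, \qquad
\hat{B} = \sum_{(i,j)\ \mathrm{correlated}}
  \frac{\partial \ell_i}{\partial \boldsymbol{\beta}}\,
  \frac{\partial \ell_j}{\partial \boldsymbol{\beta}'},
```

with all derivatives of the working-independence log-likelihood contributions ℓ_i evaluated at the estimate.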
Abstract:
Generalized linear mixed models (GLMMs) provide an elegant framework for the analysis of correlated data. Because the likelihood has no closed form, GLMMs are often fit by computational procedures such as penalized quasi-likelihood (PQL). Special cases of these models are generalized linear models (GLMs), which are often fit using algorithms like iteratively weighted least squares (IWLS). High computational costs and memory constraints often make it difficult to apply these iterative procedures to data sets with very large numbers of cases. This paper proposes a computationally efficient strategy based on the Gauss-Seidel algorithm that iteratively fits sub-models of the GLMM to subsetted versions of the data. Additional gains in efficiency are achieved for Poisson models, commonly used in disease mapping problems, because of their special collapsibility property, which allows data reduction through summaries. Convergence of the proposed iterative procedure is guaranteed for canonical link functions. The strategy is applied to investigate the relationship between ischemic heart disease, socioeconomic status and age/gender category in New South Wales, Australia, based on outcome data consisting of approximately 33 million records. A simulation study demonstrates the algorithm's reliability in analyzing a data set with 12 million records for a (non-collapsible) logistic regression model.
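The Gauss-Seidel idea in its simplest GLM form cycles through blocks of predictors, refitting each block with the others held fixed through an offset; a minimal Python/statsmodels sketch of that idea, not the authors' exact GLMM algorithm (the function and variable names are hypothetical):

```python
import numpy as np
import statsmodels.api as sm

def gauss_seidel_glm(y, X_blocks, family, n_iter=20):
    """Cycle through predictor blocks, refitting each sub-model with the
    contribution of the other blocks held fixed via an offset."""
    betas = [np.zeros(X.shape[1]) for X in X_blocks]
    for _ in range(n_iter):
        for j, X in enumerate(X_blocks):
            # Linear predictor contributed by every block except block j
            offset = sum(Xk @ bk
                         for k, (Xk, bk) in enumerate(zip(X_blocks, betas))
                         if k != j)
            betas[j] = sm.GLM(y, X, family=family, offset=offset).fit().params
    return betas

# Toy demonstration with two predictor blocks on hypothetical Poisson data
rng = np.random.default_rng(1)
n = 1000
X1 = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + x1
X2 = rng.normal(size=(n, 2))                            # two more predictors
y = rng.poisson(np.exp(0.2 + 0.3 * X1[:, 1] - 0.1 * X2[:, 0]))
b1, b2 = gauss_seidel_glm(y, [X1, X2], sm.families.Poisson())
```

Each sub-fit only ever touches one block of columns, and for Poisson (log-linear) models each block fit can additionally be run on counts summarised over the covariates outside the block, which is the source of the extra savings described above.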
Abstract:
A number of authors have studied the mixture survival model to analyze survival data with nonnegligible cure fractions. A key assumption made by these authors is independence between the survival time and the censoring time. To our knowledge, no one has studied the mixture cure model in the presence of dependent censoring. To account for such dependence, we propose a more general cure model that allows for dependent censoring. In particular, we derive the cure models from the perspective of competing risks and model the dependence between the censoring time and the survival time using a class of Archimedean copula models. Within this framework, we consider parameter estimation, cure detection, and two-sample comparison of the latency distribution in the presence of dependent censoring when a proportion of patients is deemed cured. Large-sample results are obtained using martingale theory. We apply the proposed methodologies to the SEER prostate cancer data.
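In generic notation, the two ingredients combined here are the mixture cure decomposition and an Archimedean copula linking the survival and censoring times; the Clayton copula below is shown only as a representative member of the class, not necessarily the one used:

```latex
% Mixture cure model: 1 - p is the cured fraction, S_u the latency survival:
S_{\mathrm{pop}}(t) = (1 - p) + p\, S_u(t).
% Dependent censoring via an Archimedean copula, e.g. Clayton:
\Pr(T > s,\ C > t) =
\bigl(S_T(s)^{-\theta} + S_C(t)^{-\theta} - 1\bigr)^{-1/\theta},
\qquad \theta > 0,
```

with θ controlling the strength of the dependence between the survival time T and the censoring time C.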