944 results for log-linear models


Relevance:

90.00%

Publisher:

Abstract:

In this paper, we propose a random intercept Poisson model in which the random effect is assumed to follow a generalized log-gamma (GLG) distribution. This random effect captures the overdispersion in the counts and induces within-cluster correlation. We derive the first two moments of the marginal distribution as well as the intraclass correlation. Even though numerical integration methods are, in general, required for deriving the marginal models, we obtain the multivariate negative binomial model from a particular parameter setting of the hierarchical model. An iterative process is derived for obtaining the maximum likelihood estimates of the parameters in the multivariate negative binomial model. Residual analysis is proposed, and two applications with real data are given for illustration. (C) 2011 Elsevier B.V. All rights reserved.
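The mechanism described above can be illustrated with a minimal simulation sketch. It uses a gamma-distributed random intercept, the kind of particular parameter setting that yields a negative binomial marginal, rather than the paper's GLG distribution, and all numeric values are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)
n_clusters, n_per = 500, 5
alpha = 0.5          # hypothetical dispersion value
mu = 2.0             # hypothetical marginal mean

# gamma random intercept with mean 1: a stand-in for the GLG random effect,
# chosen because it produces a negative binomial marginal in closed form
u = rng.gamma(shape=1 / alpha, scale=alpha, size=n_clusters)
y = rng.poisson(mu * np.repeat(u, n_per))

# the shared random effect inflates the marginal variance above the Poisson mean
# (overdispersion) and correlates counts within a cluster
print(y.mean(), y.var())
```

Here the theoretical marginal variance is mu + alpha * mu^2 = 4, twice the mean, so the printed sample variance visibly exceeds the sample mean.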

In this paper we extend semiparametric mixed linear models with normal errors to elliptical errors in order to permit distributions with heavier and lighter tails than the normal. Penalized likelihood equations are applied to derive the maximum penalized likelihood estimates (MPLEs), which appear to be robust against outlying observations in the sense of the Mahalanobis distance. A reweighted iterative process based on the back-fitting method is proposed for parameter estimation, and the local influence curvatures are derived under some usual perturbation schemes to study the sensitivity of the MPLEs. Two motivating examples preliminarily analyzed under normal errors are reanalyzed considering appropriate elliptical errors. The local influence approach is used to compare the sensitivity of the model estimates.
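The reweighting idea can be sketched in a much simpler parametric setting (not the paper's penalized semiparametric machinery): residuals are standardized by a robust scale, and observations with large squared Mahalanobis-type distances receive small weights, here via Student-t score weights. All data and tuning values are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, 2.0])
nu = 3
y = X @ beta_true + rng.standard_t(df=nu, size=n)   # heavy-tailed errors

# reweighted iterative least squares: weights shrink with the squared
# standardized (Mahalanobis-type) distance of each residual
beta = np.linalg.lstsq(X, y, rcond=None)[0]
for _ in range(50):
    r = y - X @ beta
    s2 = np.median(r**2) / 0.455          # robust scale (0.455 = chi2_1 median)
    d2 = r**2 / s2                        # squared standardized distances
    w = (nu + 1) / (nu + d2)              # Student-t score weights: outliers downweighted
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))

print(beta)  # close to beta_true despite the heavy tails
```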

Background: For analyzing longitudinal familial data, we adopted a log-linear form to incorporate heterogeneity in genetic variance components over time, and additionally a serial correlation term in the genetic effects at different ages. Given the availability of multiple measures on the same individual, we permitted environmental correlations that may change across time. Results: Systolic blood pressure from family members of the first and second cohorts was used in the current analysis. Measurements from subjects receiving hypertension treatment were treated as censored values and corrected accordingly. An initial check of the variance and covariance functions proposed for analyzing longitudinal familial data, using empirical semi-variogram plots, indicated that the observed trait dispersion pattern follows the assumptions adopted. Conclusion: Corrections for censored phenotypes based on ordinary linear models may be an appropriately simple way to correct the data while ensuring that the original variability is retained. In addition, empirical semi-variogram plots are useful for diagnosing the (co)variance model adopted.
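An empirical semi-variogram of the kind used for that diagnostic check can be sketched as follows, with a hypothetical AR(1)-correlated longitudinal trait standing in for the blood pressure measures:

```python
import numpy as np

rng = np.random.default_rng(1)

# hypothetical longitudinal trait with serial (AR(1)-type) correlation over ages
n_subj, n_age, rho = 300, 8, 0.6
ages = np.arange(n_age)
cov = rho ** np.abs(ages[:, None] - ages[None, :])
y = rng.multivariate_normal(np.zeros(n_age), cov, size=n_subj)

# empirical semi-variogram: gamma(h) = 0.5 * E[(y(t+h) - y(t))^2];
# a curve that rises with lag h and levels off near the trait variance
# is consistent with serially correlated effects
for h in range(1, n_age):
    diffs = y[:, h:] - y[:, :-h]
    print(h, round(0.5 * float(np.mean(diffs**2)), 3))
```

For this AR(1) toy model the theoretical semi-variogram is 1 - rho^h, so the printed values climb from about 0.4 toward 1.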

Setup operations are significant in some production environments, and their production plans must account for features such as setup state conservation across periods through setup carryover and crossover. Modelling setup crossover allows more flexible decisions and is essential for problems with long setup times. This paper proposes two models for the capacitated lot-sizing problem with backlogging and setup carryover and crossover. The first is in line with other models from the literature, whereas the second uses a disaggregated setup variable that tracks the starting and completion times of each setup operation. This approach permits a more compact formulation. Computational results show that the proposed models outperform other state-of-the-art formulations.

Despite the widespread popularity of linear models for correlated outcomes (e.g. linear mixed models and time series models), distribution diagnostic methodology remains relatively underdeveloped in this context. In this paper we present an easy-to-implement approach that lends itself to graphical displays of model fit. Our approach involves multiplying the estimated marginal residual vector by the Cholesky decomposition of the inverse of the estimated marginal variance matrix. Linear functions of the resulting "rotated" residuals are used to construct an empirical cumulative distribution function (ECDF), whose stochastic limit is characterized. We describe a resampling technique that serves as a computationally efficient parametric bootstrap for generating representatives of the stochastic limit of the ECDF. Through functionals, such representatives are used to construct global tests for the hypothesis of normal marginal errors. In addition, we demonstrate that the ECDF of the predicted random effects, as described by Lange and Ryan (1989), can be formulated as a special case of our approach. Thus, our method supports both omnibus and directed tests. Our method works well in a variety of circumstances, including models having independent units of sampling (clustered data) and models for which all observations are correlated (e.g., a single time series).
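The core rotation step can be sketched with simulated equicorrelated data (hypothetical dimensions and parameters, and the true covariance used in place of an estimated one): multiplying the marginal residuals by the transposed Cholesky factor of V^{-1} whitens them, so their ECDF can be compared against the standard normal CDF.

```python
import numpy as np

rng = np.random.default_rng(7)

# one equicorrelated sample: y = X beta + e, Cov(e) = V (exchangeable structure)
n = 120
X = np.column_stack([np.ones(n), rng.normal(size=n)])
V = 0.3 * np.ones((n, n)) + 0.7 * np.eye(n)
y = X @ np.array([0.5, 1.0]) + np.linalg.cholesky(V) @ rng.normal(size=n)

# GLS fit, then rotation of the marginal residuals
Vinv_X = np.linalg.solve(V, X)
beta_hat = np.linalg.solve(X.T @ Vinv_X, Vinv_X.T @ y)
r = y - X @ beta_hat
L = np.linalg.cholesky(np.linalg.inv(V))   # inv(V) = L @ L.T
z = L.T @ r          # "rotated" residuals: approximately iid N(0,1) if the model holds

# empirical CDF of the rotated residuals, for graphical comparison with Phi
zs = np.sort(z)
ecdf = np.arange(1, n + 1) / n
print(z.mean(), z.std())
```

Since inv(V) = L L^T implies L^T V L = I, the rotated residuals have (approximately) identity covariance; their printed mean and standard deviation sit near 0 and 1.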

Despite the widespread popularity of linear models for correlated outcomes (e.g. linear mixed models and time series models), distribution diagnostic methodology remains relatively underdeveloped in this context. In this paper we present an easy-to-implement approach that lends itself to graphical displays of model fit. Our approach involves multiplying the estimated marginal residual vector by the Cholesky decomposition of the inverse of the estimated marginal variance matrix. The resulting "rotated" residuals are used to construct an empirical cumulative distribution function and pointwise standard errors. The theoretical framework, including conditions and asymptotic properties, involves technical details that are motivated by Lange and Ryan (1989), Pierce (1982), and Randles (1982). Our method appears to work well in a variety of circumstances, including models having independent units of sampling (clustered data) and models for which all observations are correlated (e.g., a single time series). Our methods can produce satisfactory results even for models that do not satisfy all of the technical conditions stated in our theory.

Prospective cohort studies have provided evidence on the longer-term mortality risks of fine particulate matter (PM2.5), but due to their complexity and costs, only a few have been conducted. By linking monitoring data to the U.S. Medicare system by county of residence, we developed a retrospective cohort study, the Medicare Air Pollution Cohort Study (MCAPS), comprising over 20 million enrollees in the 250 largest counties during 2000-2002. We estimated log-linear regression models having as outcome the age-specific mortality rate for each county and as the main predictor the average PM2.5 level for 2000. Area-level covariates were used to adjust for socio-economic status and smoking. We reported results under several degrees of adjustment for spatial confounding and with stratification into eastern, central, and western counties. We estimated that a 10 µg/m3 increase in PM2.5 is associated with a 7.6% increase in mortality (95% CI: 4.4 to 10.8%). We found a stronger association in the eastern counties than nationally, with no evidence of an association in western counties. When adjusted for spatial confounding, the estimated log-relative risks drop by 50%. We demonstrated the feasibility of using Medicare data to establish cohorts for follow-up of the effects of air pollution. Particulate matter (PM) air pollution is a global public health problem (1). In developing countries, levels of airborne particles still reach concentrations at which serious health consequences are well documented; in developed countries, recent epidemiologic evidence shows continued adverse effects, even though particle levels have declined in the last two decades (2-6). Increased mortality associated with higher levels of PM air pollution has been of particular concern, giving an imperative for stronger protective regulations (7). Evidence on PM and health comes from studies of acute and chronic adverse effects (6).
The London Fog of 1952 provides dramatic evidence of the unacceptable short-term risk of extremely high levels of PM air pollution (8-10); multi-site time-series studies of daily mortality show that far lower levels of particles are still associated with short-term risk (5, 11-13). Cohort studies provide complementary evidence on the longer-term risks of PM air pollution, indicating the extent to which exposure reduces life expectancy. The design of these studies involves follow-up of cohorts for mortality over periods of years to decades and an assessment of mortality risk in association with estimated long-term exposure to air pollution (2-4;14-17). Because of the complexity and costs of such studies, only a small number have been conducted. The most rigorously executed, including the Harvard Six Cities Study and the American Cancer Society’s (ACS) Cancer Prevention Study II, have provided generally consistent evidence for an association of long-term exposure to particulate matter air pollution with increased all-cause and cardio-respiratory mortality (2,4,14,15). Results from these studies have been used in risk assessments conducted for setting the U.S. National Ambient Air Quality Standard (NAAQS) for PM and for estimating the global burden of disease attributable to air pollution (18,19). Additional prospective cohort studies are necessary, however, to confirm associations between long-term exposure to PM and mortality, to broaden the populations studied, and to refine estimates by regions across which particle composition varies. Toward this end, we have used data from the U.S. Medicare system, which covers nearly all persons 65 years of age and older in the United States.
We linked Medicare mortality data to PM2.5 (particulate matter less than 2.5 µm in aerodynamic diameter) air pollution monitoring data to create a new retrospective cohort study, the Medicare Air Pollution Cohort Study (MCAPS), consisting of 20 million persons from 250 counties and representing about 50% of the US population of elderly persons living in urban settings. In this paper, we report on the relationship between longer-term exposure to PM2.5 and mortality risk over the period 2000 to 2002 in the MCAPS.
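A log-linear (Poisson) regression of county mortality rates on PM2.5, of the general kind described above, can be sketched on simulated data. The county counts, person-years, and the baseline rate below are all hypothetical; only the 7.6%-per-10-µg/m3 effect size is taken from the abstract:

```python
import numpy as np

rng = np.random.default_rng(3)

# hypothetical county-level data: deaths ~ Poisson(person_years * rate),
# with log rate = b0 + b1 * PM2.5 (the log-linear form)
n = 250
pm25 = rng.uniform(5.0, 25.0, size=n)
py = rng.integers(10_000, 100_000, size=n).astype(float)
b1_true = np.log(1.076) / 10.0        # 7.6% increase per 10 ug/m3, as in the abstract
deaths = rng.poisson(py * np.exp(np.log(0.05) + b1_true * pm25))

# Poisson regression with a log person-years offset, fitted by IRLS
X = np.column_stack([np.ones(n), pm25])
beta = np.zeros(2)
for _ in range(50):
    eta = np.log(py) + X @ beta            # linear predictor with offset
    mu = np.exp(eta)
    z = X @ beta + (deaths - mu) / mu      # working response (offset removed)
    beta = np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (mu * z))

print(100 * np.expm1(10 * beta[1]))        # estimated % increase per 10 ug/m3
```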

Since 2010, the client base of online-trading service providers has grown significantly. Such companies enable small investors to access the stock market at advantageous rates. Because small investors buy and sell stocks in moderate amounts, they should consider fixed transaction costs, integral transaction units, and dividends when selecting their portfolio. In this paper, we consider the small investor’s problem of investing capital in stocks in a way that maximizes the expected portfolio return and guarantees that the portfolio risk does not exceed a prescribed risk level. Portfolio-optimization models known from the literature are in general designed for institutional investors and do not consider the specific constraints of small investors. We therefore extend four well-known portfolio-optimization models to make them applicable for small investors. We consider one nonlinear model that uses variance as a risk measure and three linear models that use the mean absolute deviation from the portfolio return, the maximum loss, and the conditional value-at-risk as risk measures. We extend all models to consider piecewise-constant transaction costs, integral transaction units, and dividends. In an out-of-sample experiment based on Swiss stock-market data and the cost structure of the online-trading service provider Swissquote, we apply both the basic models and the extended models; the former represent the perspective of an institutional investor, and the latter the perspective of a small investor. The basic models compute portfolios that yield on average a slightly higher return than the portfolios computed with the extended models. However, all generated portfolios yield on average a higher return than the Swiss performance index. There are considerable differences between the four risk measures with respect to the mean realized portfolio return and the standard deviation of the realized portfolio return.
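The mean-absolute-deviation risk measure used in one of the linear models can be sketched for a fixed candidate portfolio; all returns and weights below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(5)

# hypothetical daily scenario returns for four stocks over one trading year
R = rng.normal(loc=[0.0004, 0.0003, 0.0005, 0.0002], scale=0.01, size=(250, 4))
w = np.array([0.4, 0.2, 0.3, 0.1])        # candidate portfolio weights (sum to 1)

port = R @ w
expected_return = port.mean()
mad = np.mean(np.abs(port - expected_return))   # mean absolute deviation risk

print(expected_return, mad)
```

MAD is attractive in this setting because the absolute values can be linearized with auxiliary variables, so the optimization stays a linear program, and remains a mixed-integer linear program once the small-investor extensions (piecewise-constant transaction costs, integral transaction units) are added.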

System dynamics determines the function of cells, tissues, and organisms. Developing mathematical models and estimating their parameters is therefore essential for studying the dynamic behavior of biological systems, including metabolic networks, genetic regulatory networks, and signal transduction pathways, under perturbation by external stimuli. In general, biological dynamic systems are only partially observed, so a natural way to model them is with nonlinear state-space equations. Although statistical methods for parameter estimation in linear models of biological dynamic systems have been developed intensively in recent years, estimating both the states and the parameters of nonlinear dynamic systems remains a challenging task. In this report, we apply the extended Kalman filter (EKF) to the estimation of both states and parameters of nonlinear state-space models. To evaluate its performance for parameter estimation, we apply the EKF to a simulated dataset and to two real datasets from the JAK-STAT and Ras/Raf/MEK/ERK signal transduction pathways. The preliminary results show that the EKF can accurately estimate parameters and predict states in nonlinear state-space equations for modeling dynamic biochemical networks.
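The joint state-and-parameter EKF can be sketched on a toy nonlinear system. The logistic growth law, the noise levels, and the initial guesses below are all hypothetical stand-ins, not the pathways analyzed in the report; the unknown rate parameter is appended to the state vector and estimated alongside it:

```python
import numpy as np

rng = np.random.default_rng(11)

# toy nonlinear dynamics: x' = theta * x * (1 - x), theta unknown
dt, T, theta_true = 0.1, 200, 0.8
x, ys = 0.1, []
for _ in range(T):
    x = x + dt * theta_true * x * (1 - x)
    ys.append(x + rng.normal(scale=0.02))      # noisy observations of the state

# extended Kalman filter on the augmented state s = [x, theta]
s = np.array([0.2, 0.3])                       # rough initial guesses
P = np.diag([0.1, 1.0])
Q = np.diag([1e-6, 1e-6])
Rv = 0.02 ** 2
H = np.array([[1.0, 0.0]])                     # we observe x only
for y in ys:
    # predict: f(s) = [x + dt*theta*x*(1-x), theta], F = df/ds at the current s
    F = np.array([[1 + dt * s[1] * (1 - 2 * s[0]), dt * s[0] * (1 - s[0])],
                  [0.0, 1.0]])
    s = np.array([s[0] + dt * s[1] * s[0] * (1 - s[0]), s[1]])
    P = F @ P @ F.T + Q
    # update with the scalar observation
    K = P @ H.T / (H @ P @ H.T + Rv)
    s = s + (K * (y - s[0])).ravel()
    P = (np.eye(2) - K @ H) @ P

print(s)  # estimated [x, theta]
```

The parameter is identifiable only while the trajectory is still changing; once the state saturates, the filter's theta estimate freezes at whatever it has learned during the transient.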

Objective: There is an ongoing debate concerning how outcome variables change during the course of psychotherapy. We compared the dose–effect model, which posits diminishing effects of additional sessions in later treatment phases, against a model that assumes a linear and steady treatment progress through termination. Method: Session-by-session outcome data of 6,375 outpatients were analyzed, and participants were categorized according to treatment length. Linear and log-linear (i.e., negatively accelerating) latent growth curve models (LGCMs) were estimated and compared for different treatment length categories. Results: When comparing the fit of the various models, the log-linear LGCMs assuming negatively accelerating treatment progress consistently outperformed the linear models irrespective of treatment duration. The rate of change was found to be inversely related to the length of treatment. Conclusion: As proposed by the dose–effect model, the expected course of improvement in psychotherapy appears to follow a negatively accelerated pattern of change, irrespective of the duration of the treatment. However, our results also suggest that the rate of change is not constant across various treatment lengths. As proposed by the “good enough level” model, longer treatments are associated with less rapid rates of change.
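The linear-versus-log-linear comparison can be illustrated on a single hypothetical trajectory (a simple least-squares sketch, not the latent growth curve models fitted in the study): a log-linear curve captures fast early gains that flatten out, while a straight line cannot.

```python
import numpy as np

rng = np.random.default_rng(9)

# hypothetical session-by-session symptom scores following a negatively
# accelerated (log-linear) trajectory
sessions = np.arange(1, 21, dtype=float)
scores = 2.0 - 0.6 * np.log(sessions) + rng.normal(scale=0.1, size=sessions.size)

# compare a linear and a log-linear growth curve by residual sum of squares
rss = {}
for name, t in [("linear", sessions), ("log-linear", np.log(sessions))]:
    X = np.column_stack([np.ones_like(t), t])
    b = np.linalg.lstsq(X, scores, rcond=None)[0]
    rss[name] = float(((scores - X @ b) ** 2).sum())

print(rss)  # the log-linear curve fits this trajectory markedly better
```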

Background and Aims Ongoing global warming has been implicated in shifting phenological patterns such as the timing and duration of the growing season across a wide variety of ecosystems. Linear models are routinely used to extrapolate these observed shifts in phenology into the future and to estimate changes in associated ecosystem properties such as net primary productivity. Yet, in nature, linear relationships may be special cases. Biological processes frequently follow more complex, non-linear patterns according to limiting factors that generate shifts and discontinuities, or contain thresholds beyond which responses change abruptly. This study investigates to what extent cambium phenology is associated with xylem growth and differentiation across conifer species of the northern hemisphere. Methods Xylem cell production is compared with the periods of cambial activity and cell differentiation assessed on a weekly time scale on histological sections of cambium and wood tissue collected from the stems of nine species in Canada and Europe over 1–9 years per site from 1998 to 2011. Key Results The dynamics of xylogenesis were surprisingly homogeneous among conifer species, although deviations from the average were observed. Within the range analysed, the relationships between the phenological timings were linear, with several slopes showing values close to or not statistically different from 1. The relationships between the phenological timings and cell production were distinctly non-linear, and involved an exponential pattern. Conclusions The trees adjust their phenological timings according to linear patterns. Thus, shifts of one phenological phase are associated with synchronous and comparable shifts of the successive phases. However, small increases in the duration of xylogenesis could correspond to a substantial increase in cell production.
The findings suggest that the length of the growing season and the resulting amount of growth could respond differently to changes in environmental conditions.
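The exponential duration-to-production pattern described in the Key Results can be illustrated with a hypothetical dataset: on the log scale the relationship becomes linear, so ordinary least squares recovers the growth rate.

```python
import numpy as np

rng = np.random.default_rng(4)

# hypothetical illustration: cell production grows exponentially with the
# duration of xylogenesis (all values invented for the sketch)
duration = rng.uniform(60.0, 140.0, size=50)          # days of xylogenesis
cells = np.exp(1.0 + 0.02 * duration) * rng.lognormal(sigma=0.1, size=50)

# log-transforming the counts linearizes the exponential relationship
X = np.column_stack([np.ones(50), duration])
b = np.linalg.lstsq(X, np.log(cells), rcond=None)[0]
print(b[1])  # slope near 0.02: each extra day multiplies cell count by exp(0.02)
```

This is why a small increase in growing-season length can translate into a disproportionately large increase in cell production.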

A variety of lattice discretisations of continuum actions has been considered, usually requiring the correct classical continuum limit. Here we discuss “weird” lattice formulations without that property, namely lattice actions that are invariant under most continuous deformations of the field configuration, in one version even without any coupling constants. It turns out that universality is powerful enough to still provide the correct quantum continuum limit, despite the absence of a classical limit, or a perturbative expansion. We demonstrate this for a set of O(N) models (or non-linear σ-models). Amazingly, such “weird” lattice actions are not only in the right universality class, but some of them even have practical benefits, in particular an excellent scaling behaviour.

Theoretical models predict lognormal species abundance distributions (SADs) in stable and productive environments, with log-series SADs in less stable, dispersal driven communities. We studied patterns of relative species abundances of perennial vascular plants in global dryland communities to: (i) assess the influence of climatic and soil characteristics on the observed SADs, (ii) infer how environmental variability influences relative abundances, and (iii) evaluate how colonisation dynamics and environmental filters shape abundance distributions. We fitted lognormal and log-series SADs to 91 sites containing at least 15 species of perennial vascular plants. The dependence of species relative abundances on soil and climate variables was assessed using general linear models. Irrespective of habitat type and latitude, the majority of the SADs (70.3%) were best described by a lognormal distribution. Lognormal SADs were associated with low annual precipitation, higher aridity, high soil carbon content, and higher variability of climate variables and soil nitrate. Our results do not corroborate models predicting the prevalence of log-series SADs in dryland communities. As lognormal SADs were particularly associated with sites with drier conditions and a higher environmental variability, we reject models linking lognormality to environmental stability and high productivity conditions. Instead our results point to the prevalence of lognormal SADs in heterogeneous environments, allowing for more evenly distributed plant communities, or in stressful ecosystems, which are generally shaped by strong habitat filters and limited colonisation. This suggests that drylands may be resilient to environmental changes because the many species with intermediate relative abundances could take over ecosystem functioning if the environment becomes suboptimal for dominant species.

The visual responses of neurons in the cerebral cortex were first adequately characterized in the 1960s by D. H. Hubel and T. N. Wiesel [(1962) J. Physiol. (London) 160, 106-154; (1968) J. Physiol. (London) 195, 215-243] using qualitative analyses based on simple geometric visual targets. Over the past 30 years, it has become common to consider the properties of these neurons by attempting to make formal descriptions of the transformations they execute on the visual image. Most such models have their roots in linear-systems approaches pioneered in the retina by C. Enroth-Cugell and J. R. Robson [(1966) J. Physiol. (London) 187, 517-552], but it is clear that purely linear models of cortical neurons are inadequate. We present two related models: one designed to account for the responses of simple cells in primary visual cortex (V1) and one designed to account for the responses of pattern direction selective cells in MT (or V5), an extrastriate visual area thought to be involved in the analysis of visual motion. These models share a common structure that operates in the same way on different kinds of input, and instantiate the widely held view that computational strategies are similar throughout the cerebral cortex. Implementations of these models for Macintosh microcomputers are available and can be used to explore the models' properties.
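A minimal linear-nonlinear sketch of a V1 simple cell conveys the basic structure (this is not the authors' full implementation; the Gabor parameters are hypothetical): a linear filtering stage, followed by a rectifying nonlinearity, which is one standard way linear-systems models are made non-linear.

```python
import numpy as np

def gabor(size=21, theta=0.0, freq=0.15, sigma=4.0):
    """Gabor receptive field: a Gaussian-windowed cosine grating."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    xr = xx * np.cos(theta) + yy * np.sin(theta)
    return np.exp(-(xx**2 + yy**2) / (2 * sigma**2)) * np.cos(2 * np.pi * freq * xr)

def simple_cell_response(stimulus, theta=0.0):
    linear = np.sum(gabor(theta=theta) * stimulus)   # linear filtering stage
    return max(0.0, linear)                          # halfwave rectification

# a grating matched to the filter's orientation drives the model cell strongly;
# the orthogonal orientation yields almost no response
ax = np.arange(21) - 10
xx, _ = np.meshgrid(ax, ax)
grating = np.cos(2 * np.pi * 0.15 * xx)
print(simple_cell_response(grating, theta=0.0))      # preferred orientation
print(simple_cell_response(grating.T, theta=0.0))    # orthogonal orientation
```

The orientation selectivity comes entirely from the linear stage; the rectification only prevents negative firing rates, which is the simplest departure from a purely linear model.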