954 results for Generalized Logistic Model


Relevance:

90.00%

Publisher:

Abstract:

Unlike infections occurring during periods of chemotherapy-induced neutropenia, postoperative infections in patients with solid malignancy remain largely understudied. The purpose of this population-based study was to evaluate the clinical and economic burden of serious postoperative infection (SPI), i.e., bacteremia/sepsis, pneumonia, and wound infection, following resection of common solid tumors, as well as the relationship between hospital surgical volume and outcomes. From the Texas Discharge Data Research File, we identified all Texas residents who underwent resection of cancer of the lung, esophagus, stomach, pancreas, colon, or rectum between 2002 and 2006. From their billing records, we identified ICD-9 codes indicating SPI, as well as subsequent SPI-related readmissions occurring within 30 days of surgery. Random-effects logistic regression was used to estimate the impact of SPI on mortality and the association between surgical volume and SPI, adjusting for case mix, hospital characteristics, and clustering of multiple surgical admissions within the same patient and of patients within the same hospital. Excess bed days and costs were calculated by subtracting values for patients without infections from those for patients with infections, computed using a multilevel mixed-effects generalized linear model with a gamma distribution and log link. Serious postoperative infection occurred following 9.4% of the 37,582 eligible tumor resections and was independently associated with an 11-fold increase in the odds of in-hospital mortality (95% confidence interval [95% CI], 6.7-18.5; P < 0.001). Patients with SPI required 6.3 additional hospital days (95% CI, 6.1-6.5) at an incremental cost of $16,396 (95% CI, $15,927-$16,875). There was a significant trend toward lower overall rates of SPI with higher surgical volume (P = 0.037). Given the substantial morbidity, mortality, and excess costs associated with SPI following solid tumor resection, and given that, under current reimbursement practices, most of this burden is borne by acute care providers, it is imperative for hospitals to identify more effective prophylactic measures so that these potentially preventable infections and their associated expenditures can be averted. Additional volume-outcomes research is also needed to identify infection prevention processes that can be transferred from higher- to lower-volume providers.
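
As a concrete illustration of the cost model described above, the following sketch fits a gamma generalized linear model with a log link using statsmodels in Python. The data and the column names (cost, spi, age) are hypothetical stand-ins, not the study's data.

    # Sketch: gamma GLM with log link, as used above for excess-cost estimation.
    # All data below are simulated; `spi` flags serious postoperative infection.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 5000
    spi = rng.binomial(1, 0.094, n)                    # ~9.4% infection rate
    age = rng.normal(65, 10, n)
    mean_cost = np.exp(9.0 + 0.8 * spi + 0.005 * age)  # log-link mean structure
    cost = rng.gamma(shape=2.0, scale=mean_cost / 2.0)
    df = pd.DataFrame({"cost": cost, "spi": spi, "age": age})

    # links.Log is the capitalized name in recent statsmodels (>= 0.13);
    # older versions expose it as links.log.
    fit = smf.glm("cost ~ spi + age", data=df,
                  family=sm.families.Gamma(link=sm.families.links.Log())).fit()

    # Incremental cost: difference of predicted costs with SPI set to 1 vs. 0.
    delta = fit.predict(df.assign(spi=1)) - fit.predict(df.assign(spi=0))
    print(fit.params, "\nmean incremental cost:", delta.mean())

On the log link, exp(coefficient) is a multiplicative cost ratio, which is why incremental dollar costs are obtained by differencing predictions rather than reading coefficients directly.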

Relevance:

90.00%

Publisher:

Abstract:

Generalized linear Poisson and logistic regression models were used to examine the relationship between temperature, precipitation, and cases of Saint Louis encephalitis virus in the Houston metropolitan area. The models were investigated with and without repeated measures, with a first-order autoregressive (AR1) correlation structure used for the repeated-measures model. The two Poisson regression models, with and without the correlation structure, showed that a one-degree Fahrenheit increase in temperature multiplies the occurrence of the virus by 1.7 and a one-inch increase in precipitation multiplies it by 1.5. Logistic regression did not show these covariates to be significant predictors of encephalitis activity in Houston under either correlation structure; this discrepancy for the logistic model could be attributed to the small data set. Keywords: Saint Louis Encephalitis; Generalized Linear Model; Poisson; Logistic; First Order Autoregressive; Temperature; Precipitation.
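
A sketch of the repeated-measures variant follows: a Poisson regression with an AR(1) working correlation fit by GEE in statsmodels. The grouping scheme and column names (year, week, temp_f, precip_in, cases) are hypothetical stand-ins for the study's data.

    # Sketch: Poisson GEE with first-order autoregressive working correlation.
    # Simulated weekly case counts; coefficients are illustrative only.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    years, weeks = 10, 20
    df = pd.DataFrame({
        "year": np.repeat(np.arange(years), weeks),
        "week": np.tile(np.arange(weeks), years),
        "temp_f": rng.normal(85, 5, years * weeks),
        "precip_in": rng.gamma(2, 0.5, years * weeks),
    })
    lam = np.exp(-45 + 0.53 * df.temp_f + 0.4 * df.precip_in)
    df["cases"] = rng.poisson(lam)

    fit = sm.GEE.from_formula(
        "cases ~ temp_f + precip_in",
        groups="year",
        data=df,
        time=df["week"],                       # ordering within each group for AR(1)
        family=sm.families.Poisson(),
        cov_struct=sm.cov_struct.Autoregressive(),
    ).fit()

    # exp(coef) is the multiplicative change in expected case counts per unit
    # increase; a value of 1.7 for temp_f would match the abstract's finding.
    print(np.exp(fit.params))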

Relevance:

90.00%

Publisher:

Abstract:

A Bayesian approach to estimating the regression coefficients of a multinomial logit model with ordinal-scale response categories is presented. A Monte Carlo method is used to construct the posterior distribution of the link function, which is treated as an arbitrary scalar function. The Gauss-Markov theorem is then used to determine a function of the link that produces a random vector of coefficients, and the posterior distribution of this random vector is used to estimate the regression coefficients. The method is referred to as Bayesian generalized least squares (BGLS) analysis. Two cases involving multinomial logit models are described: Case I involves a cumulative logit model and Case II a proportional-odds model. All inferences about the coefficients in both cases are expressed in terms of the posterior distribution of the regression coefficients. The results from the BGLS method are compared to maximum likelihood estimates of the regression coefficients. The BGLS method avoids the nonlinear problems encountered when estimating the regression coefficients of a generalized linear model, is neither complex nor computationally intensive, and offers several advantages over other Bayesian approaches.
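
For reference, the maximum likelihood side of such a comparison can be sketched with statsmodels' OrderedModel, which fits a proportional-odds (cumulative logit) model. The simulated data and the slope value 1.2 are illustrative only; this is the MLE benchmark, not the BGLS procedure itself.

    # Sketch: MLE fit of a proportional-odds (cumulative logit) model,
    # the benchmark against which the abstract compares BGLS estimates.
    import numpy as np
    import pandas as pd
    from statsmodels.miscmodels.ordinal_model import OrderedModel

    rng = np.random.default_rng(2)
    n = 500
    x = rng.normal(size=n)
    # Latent-variable construction: ordinal y from thresholds on x + logistic noise.
    latent = 1.2 * x + rng.logistic(size=n)
    y = pd.Series(pd.cut(latent, bins=[-np.inf, -1.0, 1.0, np.inf],
                         labels=["low", "mid", "high"]))

    fit = OrderedModel(y, x[:, None], distr="logit").fit(method="bfgs", disp=False)
    print(fit.summary())   # the slope estimate should be near the true 1.2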

Relevance:

90.00%

Publisher:

Abstract:

The history of the logistic function since its introduction in 1838 is reviewed, and the logistic model for a polychotomous response variable is presented with a discussion of the assumptions involved in its derivation and use. Following this, the maximum likelihood estimators for the model parameters are derived, along with a Newton-Raphson iterative procedure for their evaluation. A rigorous mathematical derivation of the limiting distribution of the maximum likelihood estimators is then presented using a characteristic function approach. An appendix with theorems, with proofs, on the asymptotic normality of sample sums when the observations are not identically distributed supports the presentation on the asymptotic properties of the maximum likelihood estimators. Finally, two applications of the model are presented using data from the Hypertension Detection and Follow-up Program, a prospective, population-based, randomized trial of treatment for hypertension. The first application compares the risk of five-year mortality from cardiovascular causes with that from noncardiovascular causes; the second compares risk factors for fatal or nonfatal coronary heart disease with those for fatal or nonfatal stroke.
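
A minimal sketch of the Newton-Raphson procedure mentioned above, shown for the binary special case (the polychotomous model iterates the same score/information update with vector-valued category probabilities):

    # Sketch: Newton-Raphson iterations for the logistic-regression MLE.
    import numpy as np

    def logistic_mle(X, y, tol=1e-8, max_iter=25):
        beta = np.zeros(X.shape[1])
        for _ in range(max_iter):
            p = 1.0 / (1.0 + np.exp(-X @ beta))
            score = X.T @ (y - p)                 # gradient of the log-likelihood
            W = p * (1.0 - p)                     # Fisher information weights
            info = X.T @ (X * W[:, None])         # expected = observed information here
            step = np.linalg.solve(info, score)
            beta += step
            if np.max(np.abs(step)) < tol:
                break
        return beta

    rng = np.random.default_rng(3)
    X = np.column_stack([np.ones(2000), rng.normal(size=2000)])
    p_true = 1.0 / (1.0 + np.exp(-(0.5 + 1.0 * X[:, 1])))
    y = rng.binomial(1, p_true)
    print(logistic_mle(X, y))   # approximately [0.5, 1.0]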

Relevance:

90.00%

Publisher:

Abstract:

The performance of the Hosmer-Lemeshow global goodness-of-fit statistic for logistic regression models was explored in a wide variety of conditions not previously fully investigated. Computer simulations, each consisting of 500 regression models, were run to assess the statistic in 23 different situations. The factors that varied among the situations included the number of observations used in each regression, the number of covariates, the degree of dependence among the covariates, the combinations of continuous and discrete variables, and the generation of the values of the dependent variable for model fit or lack of fit. The study found that the Ĉ statistic was adequate in tests of significance for most situations. However, when testing data that deviate from a logistic model, the statistic has low power to detect such deviation. Although grouping of the estimated probabilities into from 8 to 30 quantiles was studied, the deciles-of-risk approach was generally sufficient. Subdividing the estimated probabilities into more than 10 quantiles when there are many covariates in the model is not necessary, despite theoretical reasons suggesting otherwise. Because it does not follow a χ² distribution, the statistic is not recommended for models containing only categorical variables with a limited number of covariate patterns. The statistic performed adequately when there were at least 10 observations per quantile. Large numbers of observations per quantile did not lead to incorrect conclusions that the model did not fit the data when it actually did. However, the statistic failed to detect lack of fit when it existed and should be supplemented with further tests for the influence of individual observations. Careful examination of the parameter estimates is also essential, since the statistic did not perform as desired when there was moderate to severe collinearity among covariates. Two methods studied for handling tied values of the estimated probabilities made only a slight difference in the conclusions about model fit; neither method split observations with identical probabilities into different quantiles. Approaches that create equal-size groups by separating ties should be avoided.
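
A minimal sketch of the deciles-of-risk statistic under study, keeping tied estimated probabilities in the same group as the abstract recommends; the simulated data are illustrative only:

    # Sketch: Hosmer-Lemeshow statistic with deciles of risk.
    import numpy as np
    from scipy import stats

    def hosmer_lemeshow(y, p, groups=10):
        order = np.argsort(p)
        y, p = y[order], p[order]
        # Quantile cut points; np.unique keeps tied probabilities together.
        edges = np.unique(np.quantile(p, np.linspace(0, 1, groups + 1)))
        idx = np.clip(np.searchsorted(edges, p, side="right") - 1, 0, len(edges) - 2)
        chi2 = 0.0
        for g in range(len(edges) - 1):
            mask = idx == g
            n_g, obs, exp = mask.sum(), y[mask].sum(), p[mask].sum()
            chi2 += (obs - exp) ** 2 / (exp * (1 - exp / n_g))
        df = len(edges) - 1 - 2      # conventional g - 2 degrees of freedom
        return chi2, stats.chi2.sf(chi2, df)

    rng = np.random.default_rng(4)
    p = rng.uniform(0.05, 0.95, 1000)
    y = rng.binomial(1, p)
    print(hosmer_lemeshow(y, p))   # large p-value: no evidence of lack of fit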

Relevance:

90.00%

Publisher:

Abstract:

Forest fires are the main cause of tree mortality in Mediterranean Europe and the most serious threat to Spanish forest ecosystems. In the autonomous region of Valencia, the forest administration deploys a fleet of roughly one hundred surveillance vehicles daily, whose distribution is based primarily on a fire-risk index computed from meteorological conditions. This thesis focuses on the design and validation of a new integrated fire-risk index, specially adapted to the Mediterranean region, to support decision-making in the daily allocation of forest-fire surveillance resources. The index follows the integrated-risk approach introduced over the last decade, which comprises two components: ignition danger and vulnerability. The former represents the probability that a fire starts and the potential danger of its spreading, while vulnerability accounts for the characteristics of the territory and the potential effects of fire upon it. For the danger component, indicators were identified relating to the natural and human agents that cause fires, historical occurrence, and fuel condition, the latter being closely related to meteorology and species. For vulnerability, indicators were used that represent both the potential effects of fire (fire behavior, defense infrastructure) and the characteristics of the terrain (value, regeneration capacity, and so on). These indicators form a hierarchical structure in which, following the European Commission's recommendations for fire-risk indices, both short-term and long-term risk indicators are included. The final value of the index is computed by progressively aggregating the components at each level of the hierarchy and then integrating the levels. Since multicriteria decision techniques are specifically oriented to hierarchically structured problems, the TOPSIS method was applied for the final integration of the model (a sketch of this aggregation step follows the abstract). Expert opinion was incorporated by weighting each component of the index, using the AHP method to obtain each expert's weights and to combine them into a single weight per indicator. The index was validated with Generalized Estimating Equation models, which account for possibly correlated responses, using official records of fires that occurred between 1994 and 2003, referenced to a 10x10 km grid, with fire occurrence and burned area as the dependent variables. The validation results show good performance of the occurrence-danger subindex, with a high degree of correlation between the subindex and occurrence, a good fit of the logistic model, and good discriminating power. The vulnerability subindex did not show a significant correlation between its values and burned area; this does not invalidate it, since some of its components are subjective in nature and independent of the area burned.

Overall, the index performs well for distributing surveillance resources according to ignition danger. Nevertheless, new lines of research are identified and discussed that could improve the overall fit of the index. In particular, the apparent inverse relationship observed in the province of Valencia between the forested area of each 10 km grid cell and its fire risk (the smaller the forested area, the higher the risk) warrants further study. Other aspects to investigate are the sensitivity of the component weights and the inclusion of indicators related to potential firefighting resources in the vulnerability subindex.
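
The sketch below shows the TOPSIS aggregation step used for the final integration. The grid cells, the three indicator columns, and the weights (standing in for the AHP-combined expert weights) are hypothetical.

    # Sketch: TOPSIS aggregation of risk-index components.
    # `matrix` rows are grid cells, columns are indicators.
    import numpy as np

    def topsis(matrix, weights, benefit):
        # Vector-normalize each indicator column, then apply weights.
        norm = matrix / np.linalg.norm(matrix, axis=0)
        v = norm * weights
        # Ideal / anti-ideal profiles: best and worst value of each indicator,
        # where "best" depends on whether the indicator contributes risk (benefit)
        # or mitigates it (cost).
        ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
        anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
        d_plus = np.linalg.norm(v - ideal, axis=1)
        d_minus = np.linalg.norm(v - anti, axis=1)
        return d_minus / (d_plus + d_minus)    # closeness: 1 = highest risk

    cells = np.array([[0.8, 0.3, 0.5],
                      [0.2, 0.9, 0.4],
                      [0.6, 0.6, 0.9]])
    weights = np.array([0.5, 0.3, 0.2])        # AHP-style weights (sum to 1)
    print(topsis(cells, weights, benefit=np.array([True, True, True])))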

Relevance:

90.00%

Publisher:

Abstract:

The cell concentration and size distribution of the microalga Nannochloropsis gaditana were studied over the whole growth process. Samples were taken during the light and dark periods to which the algae were exposed. The distributions obtained exhibited positive skew, and no change in the type of distribution was observed during growth. The size distribution shifted to smaller diameters during dark periods, while the opposite occurred during light periods. The overall trend across the growth process was a shift of the size distribution toward larger cell diameters, with the differences between the initial and final distributions of individual cycles becoming smaller. A model for cell concentration as a function of time was proposed and successfully applied; it is based on the logistic model and additionally accounts for cell respiration during dark periods and cell growth during light periods. This model provides a picture closer to the real growth and evolution of the cultures, and reveals a clear effect of the light and dark periods on the different ways in which cell concentration and diameter evolve with time.
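
A sketch of fitting the underlying logistic growth component, N(t) = K / (1 + ((K - N0)/N0) exp(-mu t)), with SciPy; the observations are synthetic stand-ins, and the dark-period respiration term of the full model described above is omitted.

    # Sketch: least-squares fit of the logistic growth curve to cell counts.
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(t, K, N0, mu):
        return K / (1.0 + ((K - N0) / N0) * np.exp(-mu * t))

    rng = np.random.default_rng(5)
    t = np.linspace(0, 10, 40)                       # days
    true = logistic(t, K=50e6, N0=2e6, mu=0.9)       # cells/mL
    obs = true * rng.normal(1.0, 0.05, t.size)       # 5% multiplicative noise

    params, _ = curve_fit(logistic, t, obs, p0=[40e6, 1e6, 0.5])
    print("K, N0, mu =", params)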

Relevance:

90.00%

Publisher:

Abstract:

To effectively assess and mitigate the risk of permafrost disturbance, disturbance-prone areas can be predicted through the application of susceptibility models. In this study we developed regional susceptibility models for permafrost disturbances using a field disturbance inventory to test the transferability of the model to a broader region in the Canadian High Arctic. The resulting susceptibility maps were then used to explore the effect of terrain variables on the occurrence of disturbances within this region. To account for a large range of landscape characteristics, the model was calibrated using two locations: Sabine Peninsula, Melville Island, NU, and Fosheim Peninsula, Ellesmere Island, NU. Spatial patterns of disturbance were predicted with a generalized linear model (GLM) and a generalized additive model (GAM), each calibrated using disturbed and randomized undisturbed locations from both sites and GIS-derived terrain predictor variables including slope, potential incoming solar radiation, wetness index, topographic position index, elevation, and distance to water. Each model was validated for the Sabine and Fosheim Peninsulas using independent data sets, while the transferability of the model to an independent site was assessed at Cape Bounty, Melville Island, NU. The regional GLM and GAM validated well for both calibration sites (Sabine and Fosheim), with areas under the receiver operating curve (AUROC) > 0.79. Both models were applied directly to Cape Bounty without calibration and validated equally, with AUROCs of 0.76; however, each model predicted disturbed and undisturbed samples differently. Additionally, the sensitivity of the transferred model was assessed using data sets with different sample sizes. Results indicated that models based on larger sample sizes transferred more consistently and captured the variability within the terrain attributes in the respective study areas. Terrain attributes associated with the initiation of disturbances were similar regardless of location. Disturbances commonly occurred on slopes between 4 and 15°, below the Holocene marine limit, and in areas with low potential incoming solar radiation.
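
A minimal sketch of the GLM half of this workflow: fit a binomial GLM on terrain predictors at disturbed/undisturbed points, then validate with AUROC on held-out data. The predictors and coefficients are simulated stand-ins, not the study's data.

    # Sketch: logistic-regression susceptibility model with AUROC validation.
    import numpy as np
    import statsmodels.api as sm
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(6)
    n = 2000
    slope = rng.uniform(0, 30, n)           # degrees
    solar = rng.normal(0.5, 0.15, n)        # potential incoming solar radiation (scaled)
    elev = rng.uniform(0, 300, n)           # m

    logit = -2.0 + 0.12 * slope - 3.0 * solar - 0.004 * elev
    y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

    X = sm.add_constant(np.column_stack([slope, solar, elev]))
    train = rng.random(n) < 0.7
    fit = sm.GLM(y[train], X[train], family=sm.families.Binomial()).fit()

    # Validate on held-out cells; the study reports AUROC > 0.79 for the
    # calibration sites and 0.76 on transfer to Cape Bounty.
    auc = roc_auc_score(y[~train], fit.predict(X[~train]))
    print("AUROC:", round(auc, 3))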

Relevance:

90.00%

Publisher:

Abstract:

Equilibrium adsorption and desorption in mesoporous adsorbents is considered on the basis of a rigorous thermodynamic analysis in which the curvature-dependent solid-fluid potential and the compressibility of the adsorbed phase are accounted for. The compressibility of the adsorbed phase is considered for the first time in the literature within a rigorous thermodynamic approach. Our model is a further development of the continuum thermodynamic approaches proposed by Derjaguin and by Broekhoff and de Boer, and it is based on a reference isotherm of a non-porous material having the same chemical structure as the pore wall. In this improved thermodynamic model, we incorporate a prescription for transforming the solid-fluid potential exerted by the flat reference surface into the potential inside cylindrical and spherical pores. We relax the assumption that the adsorbed film density is constant and equal to that of the saturated liquid; instead, the density of the adsorbed fluid is allowed to vary over the adsorbed film thickness and is calculated from an equation of state. As a result, the model is capable of describing adsorption-desorption reversibility in cylindrical pores with diameters of less than 2 nm. The generalized thermodynamic model may be applied to the pore size characterization of mesoporous materials in place of much more time-consuming molecular approaches.
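
For orientation, the classical Kelvin-type equilibrium condition that continuum models of this family refine (a textbook baseline, not the authors' final equation) reads, for a cylindrical film interface of core radius r - t,

    \ln\frac{p}{p_0} = -\,\frac{\gamma V_m}{R T (r - t)}

where γ is the surface tension and V_m the molar volume of the liquid. Broekhoff-de Boer-type models add a film-thickness-dependent potential term to this condition, and the model above further replaces the constant saturated-liquid film density with one computed from an equation of state.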

Relevance:

90.00%

Publisher:

Abstract:

A stochastic metapopulation model accounting for habitat dynamics is presented. This is the stochastic SIS logistic model with the novel aspect that it incorporates a varying carrying capacity. We present results of Kurtz and of Barbour, which provide deterministic and diffusion approximations for a wide class of stochastic models, in a form that most easily allows their direct application to population models. These results are used to show that a suitably scaled version of the metapopulation model converges, uniformly in probability over finite time intervals, to a deterministic model previously studied in the ecological literature. Additionally, they allow us to establish a bivariate normal approximation to the quasi-stationary distribution of the process. This allows us to consider the effects of habitat dynamics on metapopulation modelling through a comparison with the stochastic SIS logistic model, and provides an effective means of modelling metapopulations inhabiting dynamic landscapes.
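
A sketch of the process itself: a Gillespie simulation of the stochastic SIS logistic model with a time-varying carrying capacity. The rate constants and the sinusoidal form of K(t) are assumptions made for illustration, not the paper's specification.

    # Sketch: Gillespie simulation of stochastic SIS logistic dynamics with
    # dynamic habitat. Colonization rate c*n*(1 - n/K(t)), extinction rate e*n.
    import numpy as np

    def simulate(n0=10, c=1.5, e=1.0, t_end=50.0, seed=7):
        rng = np.random.default_rng(seed)
        t, n, path = 0.0, n0, [(0.0, n0)]
        while t < t_end and n > 0:
            K = 100 + 40 * np.sin(0.2 * t)       # dynamic habitat (assumed form)
            birth = c * n * max(0.0, 1.0 - n / K)
            death = e * n
            total = birth + death
            t += rng.exponential(1.0 / total)    # time to the next event
            n += 1 if rng.random() < birth / total else -1
            path.append((t, n))
        return np.array(path)

    path = simulate()
    print("final time and occupancy:", path[-1])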

Relevance:

90.00%

Publisher:

Abstract:

We study a generalized Hubbard model on the two-leg ladder at zero temperature, focusing on a parameter region with staggered flux (SF)/d-density wave (DDW) order. To guide our numerical calculations, we first investigate the location of a SF/DDW phase in the phase diagram of the half-filled weakly interacting ladder using a perturbative renormalization group (RG) and bosonization approach. For hole doping δ away from half-filling, finite-system density-matrix renormalization-group (DMRG) calculations are used to study ladders with up to 200 rungs for intermediate-strength interactions. In the doped SF/DDW phase, the staggered rung current and the rung electron density both show periodic spatial oscillations, with characteristic wavelengths 2/δ and 1/δ, respectively, corresponding to ordering wavevectors 2k_F and 4k_F for the currents and densities, where 2k_F = π(1 - δ). The density minima are located at the anti-phase domain walls of the staggered current. For sufficiently large dopings, SF/DDW order is suppressed. The rung density modulation also exists in neighboring phases where currents decay exponentially. We show that most of the DMRG results can be qualitatively understood from weak-coupling RG/bosonization arguments. However, while these arguments seem to suggest a crossover from non-decaying correlations to power-law decay at a length scale of order 1/δ, the DMRG results are consistent with true long-range order for the currents and densities.

Relevance:

90.00%

Publisher:

Abstract:

Magdalina Vasileva Todorova - The paper describes an approach to the verification of procedural programs by building models of them defined via generalized nets. The approach integrates the "design by contract" concept with verification approaches of the theorem-proving and model consistency-checking types. To this end, the functions that make up the program are verified separately against specifications reflecting their purpose. A generalized net model is built that specifies the relationships between the functions in the form of correct call sequences. A generalized net model is constructed for the main function of the program and checked for conformance with the net model of the relationships between the program's functions. Each function of the program that uses other functions is also verified against the specification given by that net model.
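
A minimal sketch of the design-by-contract ingredient, expressed as runtime-checked pre- and postconditions in Python. This is illustrative only: the approach above expresses contracts and correct call sequences as generalized-net models rather than as assertions.

    # Sketch: pre/postconditions as a decorator (design by contract).
    def contract(pre, post):
        def wrap(f):
            def checked(*args):
                assert pre(*args), f"precondition of {f.__name__} violated"
                result = f(*args)
                assert post(result, *args), f"postcondition of {f.__name__} violated"
                return result
            return checked
        return wrap

    @contract(pre=lambda x: x >= 0, post=lambda r, x: abs(r * r - x) < 1e-9)
    def sqrt_newton(x):
        r = x or 1.0
        for _ in range(60):
            r = 0.5 * (r + x / r)    # Newton iteration for the square root
        return r

    print(sqrt_newton(2.0))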

Relevance:

90.00%

Publisher:

Abstract:

This thesis used four different methods to diagnose precipitation extremes over Northeastern Brazil (NEB): generalized linear models via logistic and Poisson regression, extreme value analysis via the generalized extreme value (GEV) and generalized Pareto (GPD) distributions, and vectorial generalized linear models via GEV (MVLG-GEV). The logistic and Poisson regression models were used to identify interactions between precipitation extremes and other variables, based on odds ratios and relative risks. Outgoing longwave radiation was found to be the indicator variable for the occurrence of extreme precipitation over eastern, northern, and semi-arid NEB, while relative humidity played this role over southern NEB. The GEV and GPD distributions (based on the 95th percentile) showed that the location and scale parameters reached their maxima on the eastern and northern coasts of NEB; the GEV identified a maximum core over western Pernambuco influenced by weather systems and topography. For the GEV and GPD shape parameter, the data for most regions were fitted by negative Weibull and Beta distributions (ξ < 0), respectively. The GEV (GPD) return levels and periods indicate that northern Maranhão (central Bahia) may experience at least one extreme precipitation event exceeding 160.9 mm/day (192.3 mm/day) in the next 30 years. The MVLG-GEV model found that the zonal and meridional wind components, evaporation, and Atlantic and Pacific sea surface temperatures boost the precipitation extremes. The GEV parameters show the following results: (a) location (μ), with the highest value, 88.26 ± 6.42 mm, over northern Maranhão; (b) scale (σ), positive in most regions except southern Maranhão; and (c) shape (ξ), with most of the selected regions fitted by the negative Weibull distribution (ξ < 0). Southern Maranhão and southern Bahia have greater accuracy. For the return level, it was estimated that central Bahia may experience at least one extreme precipitation event equal to or exceeding 571.2 mm/day in the next 30 years.
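
A sketch of the GEV block of such an analysis with SciPy: fit a GEV to annual maxima and compute a 30-year return level. The synthetic data are stand-ins, and note that SciPy's shape convention is c = -ξ relative to the ξ reported above.

    # Sketch: GEV fit and 30-year return level for precipitation maxima.
    import numpy as np
    from scipy.stats import genextreme

    rng = np.random.default_rng(8)
    # Annual maxima of daily precipitation (mm/day), synthetic stand-in.
    annual_max = genextreme.rvs(c=0.1, loc=90, scale=25, size=40, random_state=rng)

    c, loc, scale = genextreme.fit(annual_max)
    # The T-year return level is the value exceeded with probability 1/T per year.
    level_30yr = genextreme.isf(1.0 / 30.0, c, loc, scale)
    print("xi =", -c, " 30-year return level (mm/day):", round(level_30yr, 1))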

Relevance:

90.00%

Publisher:

Abstract:

In this work, we explore and demonstrate the potential for modeling and classification using quantile-based distributions, i.e., random variables defined by their quantile function. In the first part we formalize a least squares estimation framework for the class of linear quantile functions, leading to unbiased and asymptotically normal estimators. Among the distributions with a linear quantile function, we focus on the flattened generalized logistic distribution (fgld), which offers a wide range of distributional shapes. A novel naïve-Bayes classifier is proposed that utilizes the fgld estimated via least squares, and through simulations and applications we demonstrate its competitiveness against state-of-the-art alternatives. In the second part we consider the Bayesian estimation of quantile-based distributions. We introduce a factor model with independent latent variables distributed according to the fgld. Similar to the independent factor analysis model, this approach accommodates flexible factor distributions while using fewer parameters. The model is presented within a Bayesian framework, an MCMC algorithm for its estimation is developed, and its effectiveness is illustrated with data from the European Social Survey. The third part focuses on depth functions, which extend the concept of quantiles to multivariate data by imposing a center-outward ordering on the multivariate space. We investigate the recently introduced integrated rank-weighted (IRW) depth function, which is based on the distribution of random spherical projections of the multivariate data. This depth function proves to be computationally efficient, and to increase its flexibility we propose different methods to explicitly model the projected univariate distributions. Its usefulness is shown in classification tasks: the maximum depth classifier based on the IRW depth is proven to be asymptotically optimal under certain conditions, and classifiers based on the IRW depth are shown to perform well in simulated and real data experiments.
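
A sketch of the IRW depth via Monte Carlo over random spherical projections, following the definition referenced above: the depth of x averages min(F_u(u·x), 1 - F_u(u·x)) over uniform directions u, with F_u the empirical distribution of the projected sample. The data below are synthetic.

    # Sketch: integrated rank-weighted (IRW) depth by random projections.
    import numpy as np

    def irw_depth(X, points, n_dir=500, seed=9):
        rng = np.random.default_rng(seed)
        d = X.shape[1]
        U = rng.normal(size=(n_dir, d))
        U /= np.linalg.norm(U, axis=1, keepdims=True)   # uniform directions on the sphere
        proj_X = X @ U.T                                # (n, n_dir)
        proj_p = points @ U.T                           # (m, n_dir)
        # Empirical cdf of each projected sample, evaluated at the query points.
        F = (proj_X[None, :, :] <= proj_p[:, None, :]).mean(axis=1)
        return np.minimum(F, 1.0 - F).mean(axis=1)

    rng = np.random.default_rng(10)
    X = rng.normal(size=(400, 3))
    queries = np.array([[0.0, 0.0, 0.0], [3.0, 3.0, 3.0]])
    print(irw_depth(X, queries))   # the centroid should be much deeper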