869 results for Cox regression


Relevance: 20.00%

Abstract:

Hot spot identification (HSID) aims to identify potential sites—roadway segments, intersections, crosswalks, interchanges, ramps, etc.—with disproportionately high crash risk relative to similar sites. An inefficient HSID methodology might result in either identifying a safe site as high risk (false positive) or a high-risk site as safe (false negative), and consequently lead to the misuse of available public funds, poor investment decisions, and inefficient risk management practice. Current HSID methods suffer from issues such as underreporting of minor injury and property damage only (PDO) crashes, the challenge of incorporating crash severity into the methodology, and the selection of a proper safety performance function to model crash data that is often heavily skewed by a preponderance of zeros. Addressing these challenges, this paper proposes a combination of a PDO equivalency calculation and a quantile regression technique to identify hot spots in a transportation network. In particular, issues related to underreporting and crash severity are tackled by incorporating equivalent PDO crashes, whilst concerns related to the non-count nature of equivalent PDO crashes and the skewness of crash data are addressed by the non-parametric quantile regression technique. The proposed method identifies covariate effects on various quantiles of a population, rather than the population mean as most current methods do, which corresponds more closely with how black spots are identified in practice. The proposed methodology is illustrated using rural road segment data from Korea and compared against the traditional EB method with negative binomial (NB) regression. Application of a quantile regression model to equivalent PDO crashes enables identification of a set of high-risk sites that reflects the true safety costs to society, simultaneously reduces the influence of under-reported PDO and minor injury crashes, and overcomes the limitations of the traditional NB model in dealing with the preponderance of zeros and right-skewed data.
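
A minimal sketch of the screening step, assuming hypothetical severity weights, column names, and simulated site data; statsmodels' QuantReg stands in for the paper's quantile regression model:

```python
# Sketch of quantile-regression hot spot screening (hypothetical data/weights).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
sites = pd.DataFrame({
    "aadt": rng.uniform(500, 20000, n),        # traffic volume
    "length_km": rng.uniform(0.2, 5.0, n),     # segment length
    "pdo": rng.poisson(2, n),
    "minor": rng.poisson(1, n),
    "severe": rng.poisson(0.2, n),
})

# Equivalent-PDO crashes: severity-weighted crash total (weights are illustrative).
W_MINOR, W_SEVERE = 10.0, 100.0
sites["epdo"] = sites["pdo"] + W_MINOR * sites["minor"] + W_SEVERE * sites["severe"]

# Fit the 90th conditional quantile of EPDO on site covariates.
X = sm.add_constant(sites[["aadt", "length_km"]])
q90 = sm.QuantReg(sites["epdo"], X).fit(q=0.90)

# Flag sites whose observed EPDO exceeds the fitted 90th percentile for similar sites.
sites["hot_spot"] = sites["epdo"] > q90.predict(X)
print(sites["hot_spot"].sum(), "candidate hot spots")
```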

Relevance: 20.00%

Abstract:

OBJECTIVES: Four randomized phase II/III trials investigated the addition of cetuximab to platinum-based, first-line chemotherapy in patients with advanced non-small cell lung cancer (NSCLC). A meta-analysis was performed to examine the benefit/risk ratio for the addition of cetuximab to chemotherapy. MATERIALS AND METHODS: The meta-analysis included individual patient efficacy data from 2018 patients and individual patient safety data from 1970 patients comprising respectively the combined intention-to-treat and safety populations of the four trials. The effect of adding cetuximab to chemotherapy was measured by hazard ratios (HRs) obtained using a Cox proportional hazards model and odds ratios calculated by logistic regression. Survival rates at 1 year were calculated. All applied models were stratified by trial. Tests on heterogeneity of treatment effects across the trials and sensitivity analyses were performed for all endpoints. RESULTS: The meta-analysis demonstrated that the addition of cetuximab to chemotherapy significantly improved overall survival (HR 0.88, p=0.009, median 10.3 vs 9.4 months), progression-free survival (HR 0.90, p=0.045, median 4.7 vs 4.5 months) and response (odds ratio 1.46, p<0.001, overall response rate 32.2% vs 24.4%) compared with chemotherapy alone. The safety profile of chemotherapy plus cetuximab in the meta-analysis population was confirmed as manageable. Neither trials nor patient subgroups defined by key baseline characteristics showed significant heterogeneity for any endpoint. CONCLUSION: The addition of cetuximab to platinum-based, first-line chemotherapy for advanced NSCLC significantly improved outcome for all efficacy endpoints with an acceptable safety profile, indicating a favorable benefit/risk ratio.
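
A sketch of the trial-stratified Cox analysis on hypothetical patient-level data (column names and simulated values are assumptions, not the trial data), using lifelines:

```python
# Trial-stratified Cox model for a two-arm meta-analysis (illustrative data).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({
    "months": rng.exponential(10, n),      # overall survival time
    "death": rng.integers(0, 2, n),        # event indicator
    "cetuximab": rng.integers(0, 2, n),    # 1 = chemotherapy + cetuximab
    "trial": rng.integers(0, 4, n),        # four contributing trials
})

# Stratifying by trial fits a separate baseline hazard per trial, as in the
# meta-analysis, while pooling the treatment effect across trials.
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="death", strata=["trial"])
cph.print_summary()  # hazard ratio for 'cetuximab' = exp(coef)
```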

Relevance: 20.00%

Abstract:

Visual localization in outdoor environments is often hampered by natural variation in appearance caused by weather phenomena, diurnal fluctuations in lighting, and seasonal changes. Such changes are global across an environment and, in the case of global light changes and seasonal variation, the change in appearance occurs in a regular, cyclic manner. Visual localization could be greatly improved if it were possible to predict the appearance of a particular location at a particular time, based on the appearance of the location in the past and knowledge of how appearance changes over time. In this paper, we investigate whether global appearance changes in an environment can be learned sufficiently well to improve visual localization performance. We use time of day as a test case, and generate transformations between morning and afternoon using sample images from a training set. We demonstrate that the learned transformation generalizes from the training data and show that the resulting visual localization on a test set is improved relative to raw image comparison. The improvement in localization remains when the area is revisited several weeks later.
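
One simple realization of such a transformation, learned by least squares between flattened morning and afternoon image patches; the data here are random stand-ins, not the paper's imagery:

```python
# Least-squares patch transform between two times of day (illustrative).
import numpy as np

rng = np.random.default_rng(2)
D = 64          # flattened patch dimension (e.g. 8x8 grayscale patches)
N = 5000        # training patch pairs from aligned morning/afternoon images

morning = rng.random((N, D))
afternoon = 0.6 * morning + 0.2 + 0.05 * rng.standard_normal((N, D))  # stand-in

# Learn W minimizing ||morning @ W - afternoon||^2 (one bias row appended).
A = np.hstack([morning, np.ones((N, 1))])
W, *_ = np.linalg.lstsq(A, afternoon, rcond=None)

# Transform a new morning patch before matching against an afternoon database.
new_patch = rng.random((1, D))
predicted_afternoon = np.hstack([new_patch, np.ones((1, 1))]) @ W
```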

Relevance: 20.00%

Abstract:

Due to knowledge gaps in relation to urban stormwater quality processes, an in-depth understanding of model uncertainty can enhance decision making. Uncertainty in stormwater quality models can originate from a range of sources, such as the complexity of urban rainfall-runoff-stormwater pollutant processes and the paucity of observed data. Unfortunately, studies relating to epistemic uncertainty, which arises from the simplification of reality, are limited, and such uncertainty is often deemed mostly unquantifiable. This paper presents a statistical modelling framework for ascertaining the epistemic uncertainty associated with pollutant wash-off under a regression modelling paradigm, using Ordinary Least Squares Regression (OLSR) and Weighted Least Squares Regression (WLSR) methods with a Bayesian/Gibbs sampling statistical approach. The study results confirmed that WLSR, assuming probability-distributed data, provides more realistic uncertainty estimates of the observed and predicted wash-off values than OLSR modelling. It was also noted that the Bayesian/Gibbs sampling approach is superior to the classical statistical and deterministic approaches most commonly adopted in water quality modelling. The study outcomes confirmed that the prediction error associated with wash-off replication is relatively high due to limited data availability. The uncertainty analysis also highlighted the variability of the wash-off modelling coefficient k as a function of complex physical processes, primarily influenced by surface characteristics and rainfall intensity.
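
A minimal Gibbs sampler for a Bayesian linear wash-off regression under conjugate priors; the priors, simulated data, and variable names are placeholders rather than the paper's model. The WLSR variant would reweight the likelihood terms accordingly:

```python
# Gibbs sampler for Bayesian linear regression (conjugate normal/inverse-gamma).
import numpy as np

rng = np.random.default_rng(3)
n = 100
intensity = rng.uniform(20, 120, n)                 # rainfall intensity (mm/h)
X = np.column_stack([np.ones(n), np.log(intensity)])
beta_true = np.array([0.5, 0.8])
y = X @ beta_true + 0.3 * rng.standard_normal(n)    # log wash-off load (stand-in)

# Priors: beta ~ N(0, tau2 * I), sigma2 ~ InvGamma(a0, b0).
tau2, a0, b0 = 100.0, 2.0, 1.0
sigma2, draws = 1.0, []
XtX, Xty = X.T @ X, X.T @ y

for _ in range(5000):
    # beta | sigma2, y  ~  N(mu_n, V_n)
    V_n = np.linalg.inv(XtX / sigma2 + np.eye(2) / tau2)
    mu_n = V_n @ (Xty / sigma2)
    beta = rng.multivariate_normal(mu_n, V_n)
    # sigma2 | beta, y  ~  InvGamma(a0 + n/2, b0 + RSS/2)
    resid = y - X @ beta
    sigma2 = 1.0 / rng.gamma(a0 + n / 2, 1.0 / (b0 + resid @ resid / 2))
    draws.append(np.append(beta, sigma2))

draws = np.array(draws)[1000:]                      # drop burn-in
print("posterior means:", draws.mean(axis=0))       # credible intervals via quantiles
```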

Relevance: 20.00%

Abstract:

An important aspect of decision support systems involves applying sophisticated and flexible statistical models to real datasets and communicating the results to decision makers in interpretable ways. An important class of problem is the modelling of incidence, such as fire or disease. Models of incidence known as point processes or Cox processes are particularly challenging as they are 'doubly stochastic', i.e., obtaining the probability mass function of incidents requires two integrals to be evaluated. Existing approaches either use simple models that obtain predictions from plug-in point estimates and do not distinguish between Cox processes and density estimation, but do use sophisticated 3D visualization for interpretation; or they employ sophisticated non-parametric Bayesian Cox process models, but do not use visualization to render interpretable, complex spatio-temporal forecasts. The contribution here is to fill this gap by inferring predictive distributions of log Gaussian Cox processes and rendering them using state-of-the-art 3D visualization techniques. This requires performing inference on an approximation of the model on a large-scale discretized grid and adapting an existing spatial-diurnal kernel to the log Gaussian Cox process context.
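
The 'doubly stochastic' structure is easiest to see generatively. The sketch below simulates a log Gaussian Cox process on a coarse grid; the kernel and parameters are arbitrary illustrative choices, and inference would invert this construction:

```python
# Simulating a log Gaussian Cox process on a discretized 2-D grid.
import numpy as np

rng = np.random.default_rng(4)
m = 30                                   # grid resolution (m x m cells)
xs, ys = np.meshgrid(np.linspace(0, 1, m), np.linspace(0, 1, m))
pts = np.column_stack([xs.ravel(), ys.ravel()])

# Layer 1: a Gaussian process draw for the log intensity (squared-exp kernel).
d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
K = 1.0 * np.exp(-d2 / (2 * 0.1 ** 2)) + 1e-8 * np.eye(m * m)
log_lam = rng.multivariate_normal(2.0 * np.ones(m * m), K)

# Layer 2: Poisson counts per cell given the (random) intensity surface.
cell_area = (1.0 / m) ** 2
counts = rng.poisson(np.exp(log_lam) * cell_area * 1000)  # 1000 = area scaling

print("total simulated incidents:", counts.sum())
# Inference reverses this: given counts, approximate the posterior over log_lam.
```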

Relevance: 20.00%

Abstract:

This paper develops a semiparametric estimation approach for mixed count regression models based on a series expansion for the unknown density of the unobserved heterogeneity. We use a generalized Laguerre series expansion around a gamma baseline density to model unobserved heterogeneity in a Poisson mixture model. We establish the consistency of the estimator and present a computational strategy to implement the proposed estimation techniques in the standard count model as well as in truncated, censored, and zero-inflated count regression models. Monte Carlo evidence shows that the finite sample behavior of the estimator is quite good. The paper applies the method to a model of individual shopping behavior.
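
As a sketch of the mixture structure, the following computes the mixed Poisson pmf by numerical integration under the gamma baseline alone (the k = 0 term of the Laguerre expansion, which reduces to the negative binomial); the series coefficients themselves are omitted:

```python
# Mixed Poisson pmf under a gamma heterogeneity density (k = 0 Laguerre term).
import numpy as np
from scipy.integrate import quad
from scipy.special import gammaln
from scipy.stats import nbinom

ALPHA = 2.0   # gamma shape; heterogeneity mean fixed at 1 (rate = ALPHA)
LAM = 3.0     # Poisson rate conditional on heterogeneity v is LAM * v

def gamma_pdf(v, a=ALPHA):
    return np.exp(a * np.log(a) + (a - 1) * np.log(v) - a * v - gammaln(a))

def mixed_pmf(y):
    """P(Y = y) = integral of Poisson(y; LAM*v) * g(v) dv over v > 0."""
    def integrand(v):
        log_pois = y * np.log(LAM * v) - LAM * v - gammaln(y + 1)
        return np.exp(log_pois) * gamma_pdf(v)
    val, _ = quad(integrand, 0, np.inf)
    return val

# Cross-check: this gamma mixture is exactly negative binomial.
for y in range(1, 6):
    nb = nbinom.pmf(y, ALPHA, ALPHA / (ALPHA + LAM))
    print(y, round(mixed_pmf(y), 6), round(nb, 6))
```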

Relevance: 20.00%

Abstract:

Existing crowd counting algorithms rely on holistic, local or histogram-based features to capture crowd properties. Regression is then employed to estimate the crowd size. Insufficient testing across multiple datasets has made it difficult to compare and contrast different methodologies. This paper presents an evaluation across multiple datasets to compare holistic, local and histogram-based methods, and to compare various image features and regression models. A K-fold cross-validation protocol is followed to evaluate performance across five public datasets: UCSD, PETS 2009, Fudan, Mall and Grand Central. Image features are categorised into five types: size, shape, edges, keypoints and textures. The regression models evaluated are: Gaussian process regression (GPR), linear regression, K nearest neighbours (KNN) and neural networks (NN). The results demonstrate that local features outperform equivalent holistic and histogram-based features; that optimal performance is observed using all image features except textures; and that GPR outperforms linear, KNN and NN regression.
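
A condensed version of the evaluation protocol, with random stand-in features in place of the extracted crowd features; the regressor list mirrors the paper's:

```python
# K-fold comparison of regression models for crowd counting (stand-in features).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, cross_val_score
from sklearn.neighbors import KNeighborsRegressor
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)
X = rng.random((300, 20))                       # size/shape/edge/keypoint/texture
y = X[:, :5].sum(axis=1) * 10 + rng.standard_normal(300)  # stand-in crowd counts

models = {
    "GPR": GaussianProcessRegressor(),
    "linear": LinearRegression(),
    "KNN": KNeighborsRegressor(n_neighbors=5),
    "NN": MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
}
cv = KFold(n_splits=5, shuffle=True, random_state=0)
for name, model in models.items():
    mae = -cross_val_score(model, X, y, cv=cv, scoring="neg_mean_absolute_error")
    print(f"{name}: MAE = {mae.mean():.2f} +/- {mae.std():.2f}")
```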

Relevance: 20.00%

Abstract:

Land-use regression (LUR) is a technique that can improve the accuracy of air pollution exposure assessment in epidemiological studies. Most LUR models are developed for single cities, which places limitations on their applicability to other locations. We sought to develop a model to predict nitrogen dioxide (NO2) concentrations with national coverage of Australia by using satellite observations of tropospheric NO2 columns combined with other predictor variables. We used a generalised estimating equation (GEE) model to predict annual and monthly average ambient NO2 concentrations measured by a national monitoring network from 2006 through 2011. The best annual model explained 81% of spatial variation in NO2 (absolute RMS error = 1.4 ppb), while the best monthly model explained 76% (absolute RMS error = 1.9 ppb). We applied our models to predict NO2 concentrations at the ~350,000 census mesh blocks across the country (a mesh block is the smallest spatial unit in the Australian census). National population-weighted average concentrations ranged from 7.3 ppb (2006) to 6.3 ppb (2011). We found that a simple approach using tropospheric NO2 column data yielded models with slightly better predictive ability than those produced using a more involved approach that required simulation of surface-to-column ratios. The models are capable of capturing within-urban variability in NO2 and offer the ability to estimate ambient NO2 concentrations at monthly and annual time scales across Australia from 2006 to 2011. We are making our model predictions freely available for research.
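
A schematic of the GEE fit, assuming a long-format table of monthly monitor observations; the variable names and simulated values are invented for illustration:

```python
# GEE model for monthly NO2 with repeated measures per monitoring site.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n_sites, n_months = 50, 24
df = pd.DataFrame({
    "site": np.repeat(np.arange(n_sites), n_months),
    "sat_no2": rng.uniform(1, 8, n_sites * n_months),    # satellite column NO2
    "road_km": np.repeat(rng.uniform(0, 20, n_sites), n_months),
})
df["no2_ppb"] = (1.5 + 0.9 * df["sat_no2"] + 0.1 * df["road_km"]
                 + rng.standard_normal(len(df)))         # stand-in ground NO2

# Exchangeable working correlation handles repeated measurements within a site.
model = smf.gee("no2_ppb ~ sat_no2 + road_km", groups="site", data=df,
                cov_struct=sm.cov_struct.Exchangeable())
result = model.fit()
print(result.summary())
```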

Relevance: 20.00%

Abstract:

To enhance the efficiency of regression parameter estimation by modeling the correlation structure of correlated binary error terms in quantile regression with repeated measurements, we propose a Gaussian pseudolikelihood approach for estimating correlation parameters and selecting the most appropriate working correlation matrix simultaneously. The induced smoothing method is applied to estimate the covariance of the regression parameter estimates, which can bypass density estimation of the errors. Extensive numerical studies indicate that the proposed method performs well in selecting an accurate correlation structure and improving regression parameter estimation efficiency. The proposed method is further illustrated by analyzing a dental dataset.
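
The induced-smoothing idea replaces the discontinuous quantile score with a normal-CDF-smoothed version so that standard optimization and covariance formulas apply. A toy sketch for a single quantile, with the working-correlation machinery omitted:

```python
# Induced smoothing for quantile regression: smooth the check-function score
# with a normal CDF so the estimating equations become differentiable.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(7)
n, tau = 200, 0.5
x = rng.uniform(0, 10, n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + 0.5 * x + rng.standard_normal(n)

h = 1.0 / np.sqrt(n)   # bandwidth of order n^{-1/2}, as in induced smoothing

def smoothed_objective(beta):
    u = y - X @ beta
    # Smooth analogue of the check loss rho_tau(u) = u * (tau - 1{u < 0}):
    # the indicator is replaced by Phi(-u / h).
    return np.sum(u * (tau - norm.cdf(-u / h)) + h * norm.pdf(u / h))

fit = minimize(smoothed_objective, x0=np.zeros(2), method="BFGS")
print("smoothed quantile regression coefficients:", fit.x)
```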

Relevance: 20.00%

Abstract:

In the Bayesian framework, a standard approach to model criticism is to compare some function of the observed data to a reference predictive distribution. The result of the comparison can be summarized in the form of a p-value, and it is well known that computation of some kinds of Bayesian predictive p-values can be challenging. The use of regression adjustment approximate Bayesian computation (ABC) methods is explored for this task. Two problems are considered. The first is the calibration of posterior predictive p-values so that they are uniformly distributed under some reference distribution for the data. Computation is difficult because the calibration process requires repeated approximation of the posterior for different data sets under the reference distribution. The second problem is the approximation of distributions of prior predictive p-values for the purpose of choosing weakly informative priors when the model checking statistic is expensive to compute. Here the computation is difficult because of the need to repeatedly sample from a prior predictive distribution for different values of a prior hyperparameter. In both problems we argue that high accuracy is not required in the computations, which makes fast approximations such as regression adjustment ABC very useful. We illustrate our methods with several examples.
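
The core regression-adjustment step is compact; this sketch targets a toy normal-mean posterior and omits the calibration loop over data sets, with all settings illustrative:

```python
# Regression-adjustment ABC: adjust accepted parameters toward the observed
# summary statistic with a locally fitted linear regression.
import numpy as np

rng = np.random.default_rng(8)
y_obs = rng.normal(3.0, 1.0, 50)
s_obs = y_obs.mean()                      # summary statistic of observed data

# 1. Simulate (theta, s) pairs from the prior and the model.
n_sim = 20000
theta = rng.normal(0.0, 10.0, n_sim)      # prior draws
s = rng.normal(theta, 1.0 / np.sqrt(50))  # sampling dist. of the mean, n = 50

# 2. Accept the simulations whose summaries fall closest to s_obs.
dist = np.abs(s - s_obs)
keep = dist <= np.quantile(dist, 0.01)
theta_acc, s_acc = theta[keep], s[keep]

# 3. Linear regression adjustment: theta* = theta - b * (s - s_obs).
A = np.column_stack([np.ones(keep.sum()), s_acc - s_obs])
coef, *_ = np.linalg.lstsq(A, theta_acc, rcond=None)
theta_adj = theta_acc - coef[1] * (s_acc - s_obs)

print("ABC posterior mean (adjusted):", theta_adj.mean())
```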

Relevance: 20.00%

Abstract:

Aortic root replacement is a complex procedure, though subsequent modifications of the original Bentall procedure have made surgery more reproducible. The study aim was to examine the outcomes of a modified Bentall procedure using the Medtronic Open Pivot™ valved conduit. Whilst short-term data on the conduit and long-term data on the valve itself are available, little is known of the long-term results with the valved conduit. Patients undergoing aortic root replacement between February 1999 and February 2010 using the Medtronic Open Pivot valved conduit were identified from the prospectively collected Cardiothoracic Register at The Prince Charles Hospital, Brisbane, Australia. All patients were followed up echocardiographically and clinically. The primary end-point was death, and a Cox proportional hazards model was used to identify factors associated with survival. Secondary end-points were valve-related morbidity (as defined by STS guidelines) and postoperative morbidity. Predictors of morbidity were identified using logistic regression. A total of 246 patients (mean age 50 years) was included in the study. Overall mortality was 12%, with actuarial 10-year survival of 79% and a 10-year estimate of valve-related death of 0.04 (95% CI: 0.004, 0.07). Preoperative myocardial infarction (p = 0.004, HR 4.74), urgency of operation (p = 0.038, HR 2.8) and 10% incremental decreases in ejection fraction (p = 0.046, HR 0.69) were predictive of mortality. Survival was also affected by valve gradients, with a unit increase in peak gradient reducing mortality (p = 0.021, HR 0.93). Valve-related morbidity occurred in 11 patients. Urgent surgery (p < 0.001, OR 4.12), aortic dissection (p = 0.015, OR 3.35), calcific aortic stenosis (p = 0.016, OR 2.35) and Marfan syndrome (p = 0.009, OR 3.75) were predictive of postoperative morbidity. The reoperation rate was 1.2%. The Medtronic Open Pivot valved conduit is a safe and durable option for aortic root replacement, and is associated with low morbidity and 10-year survival of 79%. However, further studies are required to determine the effect of valve gradient on survival.
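
For the secondary end-points, a sketch of the logistic regression step on invented binary covariates (not the study data), using statsmodels:

```python
# Logistic regression for postoperative morbidity (invented illustrative data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(9)
n = 246
df = pd.DataFrame({
    "urgent": rng.integers(0, 2, n),
    "dissection": rng.integers(0, 2, n),
    "marfan": rng.integers(0, 2, n),
})
logit_p = -2.0 + 1.4 * df["urgent"] + 1.2 * df["dissection"] + 1.3 * df["marfan"]
df["morbidity"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

fit = smf.logit("morbidity ~ urgent + dissection + marfan", data=df).fit()
odds_ratios = np.exp(fit.params)          # ORs of the kind reported above
print(pd.concat([odds_ratios, np.exp(fit.conf_int())], axis=1))
```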

Relevance: 20.00%

Abstract:

Background: The high recurrence rate of chronic venous leg ulcers has a significant impact on an individual's quality of life and healthcare costs. Objectives: This study aimed to identify risk and protective factors for recurrence of venous leg ulcers using a theoretical approach, applying a framework of self and family management of chronic conditions to underpin the study. Design: Secondary analysis of combined data collected from three previous prospective longitudinal studies. Setting: The contributing studies' participants were recruited from two metropolitan hospital outpatient wound clinics and three community-based wound clinics. Participants: Data were available on a sample of 250 adults with a leg ulcer of primarily venous aetiology, who were followed after healing for a median of 17 months (range: 3 to 36 months). Methods: Data from the three studies were combined. The original participant data were collected through medical records and self-reported questionnaires upon healing and every 3 months thereafter. A Cox proportional-hazards regression analysis was undertaken to determine the factors influencing leg ulcer recurrence based on the proposed conceptual framework. Results: The median time to recurrence was 42 weeks (95% CI 31.9–52.0), with a recurrence incidence of 22% (54 of 250 participants) within three months of healing, 39% (91 of 235 participants) for those who were followed for six months, 57% (111 of 193) by 12 months, 73% (53 of 72) by two years and 78% (41 of 52) for those who were followed up for three years. A Cox proportional-hazards regression model revealed that the risk factors for recurrence were a history of deep vein thrombosis (HR 1.7, 95% CI 1.07–2.67, p = 0.024), a history of multiple previous leg ulcers (HR 4.4, 95% CI 1.84–10.5, p = 0.001), and longer duration (in weeks) of the previous ulcer (HR 1.01, 95% CI 1.003–1.01, p < 0.001); the protective factors were elevating the legs for at least 30 minutes per day (HR 0.33, 95% CI 0.19–0.56, p < 0.001), higher levels of self-efficacy (HR 0.95, 95% CI 0.92–0.99, p = 0.016), and walking around for at least three hours per day (HR 0.66, 95% CI 0.44–0.98, p = 0.040). Conclusions: Results from this study provide a comprehensive examination of risk and protective factors associated with leg ulcer recurrence based on the chronic disease self and family management framework. These results in turn provide essential steps towards developing and testing interventions to promote optimal prevention strategies for venous leg ulcer recurrence.
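
A sketch of the time-to-recurrence analysis with lifelines on simulated data; the column names are assumptions. The Kaplan-Meier fit gives the median time to recurrence, and the Cox fit gives hazard ratios of the kind reported above:

```python
# Time to venous leg ulcer recurrence: Kaplan-Meier and Cox PH (illustrative).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter

rng = np.random.default_rng(10)
n = 250
df = pd.DataFrame({
    "weeks": rng.exponential(45, n),           # follow-up until recurrence/censor
    "recurred": rng.integers(0, 2, n),
    "dvt_history": rng.integers(0, 2, n),
    "leg_elevation": rng.integers(0, 2, n),    # >= 30 min/day
    "self_efficacy": rng.uniform(40, 100, n),
})

km = KaplanMeierFitter().fit(df["weeks"], df["recurred"])
print("median time to recurrence (weeks):", km.median_survival_time_)

cph = CoxPHFitter()
cph.fit(df, duration_col="weeks", event_col="recurred")
print(np.exp(cph.params_))                     # hazard ratios; HR < 1 = protective
```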