997 results for GLAUCOMA PROBABILITY SCORE


Relevance:

20.00%

Publisher:

Abstract:

INTRODUCTION The aim of the study was to identify the level of the Charlson comorbidity index (CCI) at which older patients (>70 years) with high-risk prostate cancer (PCa) achieve a survival benefit following radical prostatectomy (RP). METHODS We retrospectively analyzed 1008 older patients (>70 years) who underwent RP with pelvic lymph node dissection for high-risk prostate cancer (preoperative prostate-specific antigen >20 ng/mL, clinical stage ≥T2c, or Gleason score ≥8) at 14 tertiary institutions between 1988 and 2014. The study population was grouped into CCI < 2 and CCI ≥ 2 for analysis. Survival in each group was estimated with the Kaplan-Meier method, and competing-risks Fine-Gray regression was used to identify the best explanatory multivariable model. The area under the curve (AUC) and the Akaike information criterion were used to identify the ideal cut-off for the CCI. RESULTS Clinical and cancer characteristics were similar between the two groups. Kaplan-Meier comparison of the two groups for non-cancer death, with survival estimates at 5 and 10 years, showed significantly worse outcomes for patients with CCI ≥ 2. In the multivariable model used to determine the appropriate CCI cut-off point, a cut-off of 2 gave the best AUC and log-rank p value. CONCLUSION Older patients with fewer comorbidities harboring high-risk PCa appear to benefit from RP. Sicker patients are more likely to die of causes unrelated to prostate cancer and are less likely to benefit from RP.
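For illustration, a minimal Python sketch of the cut-off search described above, assuming a patient table with `time`, `event`, and `cci` columns (these names, and the use of `lifelines` and `scikit-learn`, are assumptions, not part of the study); the dichotomised AUC here is only a crude stand-in for the model-based AUC the authors report.

```python
# Hypothetical sketch: comparing candidate CCI cut-offs by log-rank test
# and a crude dichotomised AUC (column names are assumptions).
import pandas as pd
from lifelines.statistics import logrank_test
from sklearn.metrics import roc_auc_score

def evaluate_cci_cutoffs(df, candidates=(1, 2, 3, 4)):
    """df needs columns: 'time' (months), 'event' (1 = non-cancer death), 'cci'."""
    results = []
    for cut in candidates:
        high = df["cci"] >= cut
        # Log-rank test between the CCI < cut and CCI >= cut survival curves
        lr = logrank_test(
            df.loc[~high, "time"], df.loc[high, "time"],
            event_observed_A=df.loc[~high, "event"],
            event_observed_B=df.loc[high, "event"],
        )
        # AUC of the dichotomised CCI for the event indicator
        auc = roc_auc_score(df["event"], high.astype(int))
        results.append({"cutoff": cut, "logrank_p": lr.p_value, "auc": auc})
    return pd.DataFrame(results).sort_values("auc", ascending=False)
```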

Relevance:

20.00%

Publisher:

Abstract:

PURPOSE To compare patient outcomes and complication rates after different decompression techniques or instrumented fusion (IF) in lumbar spinal stenosis (LSS). METHODS The multicentre study was based on Spine Tango data. Inclusion criteria were LSS treated with a posterior decompression and pre- and postoperative COMI assessment between 3 and 24 months. 1,176 cases were assigned to four groups: (1) laminotomy (n = 642), (2) hemilaminectomy (n = 196), (3) laminectomy (n = 230) and (4) laminectomy combined with IF (n = 108). Clinical outcomes were achievement of the minimum relevant change in COMI back and leg pain and COMI score (2.2 points), surgical and general complications, measures taken because of complications, and reintervention at the index level based on patient information. The inverse propensity score weighting method was used for adjustment. RESULTS Laminotomy, hemilaminectomy and laminectomy were significantly less beneficial than laminectomy combined with IF regarding leg pain (ORs with 95% CI 0.52, 0.34-0.81; 0.25, 0.15-0.41; 0.44, 0.27-0.72, respectively) and COMI score improvement (ORs with 95% CI 0.51, 0.33-0.81; 0.30, 0.18-0.51; 0.48, 0.29-0.79, respectively). However, decompression alone caused significantly fewer surgical (ORs with 95% CI 0.42, 0.26-0.69; 0.33, 0.17-0.63; 0.39, 0.21-0.71, respectively) and general complications (ORs with 95% CI 0.11, 0.04-0.29; 0.03, 0.003-0.41; 0.25, 0.09-0.71, respectively) than laminectomy combined with IF. Accordingly, the likelihood that measures would be required was also significantly lower after laminotomy (OR 0.28, 95% CI 0.17-0.46), hemilaminectomy (OR 0.28, 95% CI 0.15-0.53) and laminectomy (OR 0.39, 95% CI 0.22-0.68) than after laminectomy with IF. The likelihood of a reintervention was not significantly different between the treatment groups. DISCUSSION As already demonstrated in the literature, decompression is a very effective treatment in patients with LSS. Despite better patient outcomes after laminectomy combined with IF, caution is advised because of the higher rates of surgical and general complications and the measures these subsequently require. Based on the current study, laminotomy or laminectomy, rather than hemilaminectomy, is recommended for achieving the minimum relevant pain relief.
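As a rough illustration of the adjustment step, the sketch below shows inverse propensity score weighting for a simplified two-group comparison (decompression alone vs. laminectomy with IF); the column names and the binary simplification are assumptions, since the study actually weighted four treatment groups.

```python
# Simplified two-group IPW sketch (the study adjusted four groups; names assumed).
import numpy as np
from sklearn.linear_model import LogisticRegression

def ipw_adjusted_or(df, covariates, treat_col="fusion", outcome_col="leg_pain_improved"):
    """Inverse-propensity-weighted odds ratio of the outcome, treated vs. control."""
    X, t, y = df[covariates].values, df[treat_col].values, df[outcome_col].values
    # Propensity of receiving laminectomy + instrumented fusion
    ps = LogisticRegression(max_iter=1000).fit(X, t).predict_proba(X)[:, 1]
    w = np.where(t == 1, 1.0 / ps, 1.0 / (1.0 - ps))
    # Weighted outcome model with treatment as the only regressor
    out = LogisticRegression(max_iter=1000).fit(t.reshape(-1, 1), y, sample_weight=w)
    return float(np.exp(out.coef_[0, 0]))  # adjusted odds ratio
```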

Relevance:

20.00%

Publisher:

Abstract:

It has been observed in various practical applications that data do not conform to the normal distribution, which is symmetric with no skewness. The skew normal distribution proposed by Azzalini (1985) is appropriate for the analysis of data that are unimodal but exhibit some skewness. The skew normal distribution includes the normal distribution as the special case where the skewness parameter is zero. In this thesis, we study the structural properties of the skew normal distribution, with an emphasis on the reliability properties of the model. More specifically, we obtain the failure rate, the mean residual life function, and the reliability function of a skew normal random variable. We also compare it with the normal distribution with respect to certain stochastic orderings. Appropriate machinery is developed to obtain the reliability of a component when the strength and stress follow the skew normal distribution. Finally, IQ score data from Roberts (1988) are analyzed to illustrate the procedure.
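The reliability quantities discussed above can be sketched directly from `scipy.stats.skewnorm`; the parameter values below are purely illustrative, and the mean residual life is obtained by numerical integration rather than a closed form.

```python
# Sketch of the reliability quantities for a skew normal variable using
# scipy.stats.skewnorm (shape a = skewness parameter; loc/scale illustrative).
import numpy as np
from scipy.stats import skewnorm
from scipy.integrate import quad

a, loc, scale = 3.0, 0.0, 1.0          # illustrative parameters
X = skewnorm(a, loc=loc, scale=scale)

reliability = lambda t: X.sf(t)                 # R(t) = P(X > t)
failure_rate = lambda t: X.pdf(t) / X.sf(t)     # h(t) = f(t) / R(t)

def mean_residual_life(t):
    """m(t) = E[X - t | X > t] = (1 / R(t)) * integral_t^inf R(u) du."""
    integral, _ = quad(X.sf, t, np.inf)
    return integral / X.sf(t)

def stress_strength_reliability(strength, stress, n=100_000, seed=0):
    """P(strength > stress) by Monte Carlo, both frozen skew normal distributions."""
    rng = np.random.default_rng(seed)
    return np.mean(strength.rvs(n, random_state=rng) > stress.rvs(n, random_state=rng))
```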

Relevance:

20.00%

Publisher:

Abstract:

The main objective of this study was to develop and validate a computer-based statistical algorithm, based on a multivariable logistic model, that can be translated into a simple scoring system for ascertaining stroke cases from hospital admission medical records data. This algorithm, the Risk Index Score (RISc), was developed using data collected prospectively by the Brain Attack Surveillance in Corpus Christi (BASIC) project. The validity of the RISc was evaluated by estimating the concordance of stroke ascertainment by the scoring system with stroke ascertainment accomplished by physician review of hospital admission records. The goal was a rapid, simple, efficient, and accurate method to ascertain the incidence of stroke from routine hospital admission records for epidemiologic investigations.
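A minimal sketch of the general idea, assuming a fitted logistic model whose coefficients are rescaled and rounded into integer points; the actual RISc items and weights are not reproduced here, and all names are hypothetical.

```python
# Illustrative sketch: turning a fitted logistic model into an integer score
# (the actual RISc items and weights are not those of the study).
import numpy as np
from sklearn.linear_model import LogisticRegression

def build_point_scores(model: LogisticRegression, feature_names, points_per_unit=None):
    """Scale log-odds coefficients and round them to integer points."""
    coefs = model.coef_.ravel()
    if points_per_unit is None:
        # Anchor 1 point to the smallest non-zero coefficient magnitude
        points_per_unit = 1.0 / np.min(np.abs(coefs[coefs != 0]))
    return {name: int(round(c * points_per_unit)) for name, c in zip(feature_names, coefs)}

def risk_index_score(record, point_table):
    """Sum the points for the binary predictors present in an admission record."""
    return sum(point_table[k] for k, v in record.items() if k in point_table and v)
```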

Relevance:

20.00%

Publisher:

Abstract:

The Data Envelopment Analysis (DEA) efficiency score obtained for an individual firm is a point estimate without any confidence interval around it. In recent years, researchers have resorted to bootstrapping in order to generate empirical distributions of efficiency scores. This procedure assumes that all firms have the same probability of getting an efficiency score from any specified interval within the [0,1] range. We propose a bootstrap procedure that empirically generates the conditional distribution of efficiency for each individual firm given systematic factors that influence its efficiency. Instead of resampling directly from the pooled DEA scores, we first regress these scores on a set of explanatory variables not included at the DEA stage and bootstrap the residuals from this regression. These pseudo-efficiency scores incorporate the systematic effects of unit-specific factors along with the contribution of the randomly drawn residual. Data from the U.S. airline industry are utilized in an empirical application.
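A hedged sketch of the proposed bootstrap, assuming the DEA efficiency scores have already been computed (the DEA stage itself requires a linear-programming solver and is omitted); variable names are illustrative.

```python
# Sketch of the described bootstrap: regress given DEA scores on explanatory
# variables, resample the residuals, and build a firm-specific distribution.
import numpy as np
from sklearn.linear_model import LinearRegression

def conditional_efficiency_distribution(dea_scores, Z, n_boot=2000, seed=0):
    """dea_scores: (n,) DEA point estimates; Z: (n, k) environment variables."""
    rng = np.random.default_rng(seed)
    reg = LinearRegression().fit(Z, dea_scores)
    fitted = reg.predict(Z)
    resid = dea_scores - fitted
    # Resample residuals and add them back to each firm's systematic component
    draws = rng.choice(resid, size=(n_boot, len(dea_scores)), replace=True)
    pseudo = np.clip(fitted + draws, 0.0, 1.0)   # keep pseudo-scores in [0, 1]
    return pseudo                                # column j = distribution for firm j

# Example 95% interval for firm 0:
# lo, hi = np.percentile(conditional_efficiency_distribution(scores, Z)[:, 0], [2.5, 97.5])
```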

Relevance:

20.00%

Publisher:

Abstract:

To take better advantage of the abundant results from large-scale genomic association studies, investigators are turning to genetic risk score (GRS) methods that combine information from common, modest-effect risk alleles into an efficient risk assessment statistic. The statistical properties of these GRSs are poorly understood. As a first step toward a better understanding of GRSs, a systematic analysis of recent investigations using a GRS was undertaken. GRS studies in the areas of coronary heart disease (CHD), cancer, and other common diseases were identified through bibliographic databases and by hand-searching reference lists and journals. Twenty-one independent case-control, cohort, and simulation studies (12 in CHD, 9 in other diseases) were identified. The underlying statistical assumptions of the GRS were investigated using the experience of the Framingham risk score. Improvements in the construction of a GRS, guided by the concept of composite indicators, are discussed. The GRS is a promising risk assessment tool for improving the prediction and diagnosis of common diseases.
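For concreteness, a minimal sketch of a weighted GRS of the kind surveyed, assuming per-allele odds ratios taken from association studies; the values in the usage comment are invented for illustration.

```python
# Minimal sketch of a weighted genetic risk score: each SNP is weighted by the
# log odds ratio reported for its risk allele (illustrative values only).
import numpy as np

def genetic_risk_score(allele_counts, odds_ratios):
    """allele_counts: (n_subjects, n_snps) risk-allele counts in {0, 1, 2};
    odds_ratios: (n_snps,) per-allele odds ratios from association studies."""
    weights = np.log(np.asarray(odds_ratios))    # log-odds weighting
    return np.asarray(allele_counts) @ weights   # one score per subject

# counts = np.array([[0, 1, 2], [2, 2, 0]])
# grs = genetic_risk_score(counts, odds_ratios=[1.15, 1.08, 1.22])
```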

Relevance:

20.00%

Publisher:

Abstract:

The main objective of this study was to determine the external validity of a clinical prediction rule developed by the European Multicenter Study on Human Spinal Cord Injury (EM-SCI) to predict ambulation outcomes 12 months after traumatic spinal cord injury. Data from the North American Clinical Trials Network (NACTN) registry, comprising approximately 500 SCI cases, were used for this validation study. The predictive accuracy of the EM-SCI prognostic model was evaluated in terms of calibration and discrimination based on 231 NACTN cases. The area under the receiver operating characteristic (ROC) curve was 0.927 (95% CI 0.894 – 0.959) when the EM-SCI model was applied to the NACTN population. This is lower than the AUC of 0.956 (95% CI 0.936 – 0.976) reported for the EM-SCI population, but it suggests that the EM-SCI clinical prediction rule distinguished well between those patients in the NACTN population who achieved independent ambulation and those who did not. The calibration curve suggests that the higher the prediction score, the higher the probability of walking, with the best predictions for AIS D patients. In conclusion, the EM-SCI clinical prediction rule was determined to be generalizable to the adult NACTN SCI population.
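The two validation components described, discrimination and calibration, can be sketched as follows, assuming the predicted walking probabilities from the EM-SCI rule are available for each NACTN case (names and binning choice are assumptions).

```python
# Sketch of external validation: discrimination via the ROC AUC and calibration
# via grouped predicted vs. observed probabilities.
import numpy as np
from sklearn.metrics import roc_auc_score
from sklearn.calibration import calibration_curve

def validate_prediction_rule(y_walked, p_predicted, n_bins=10):
    """y_walked: 1 if the patient achieved independent ambulation;
    p_predicted: probability of walking from the rule applied to NACTN cases."""
    auc = roc_auc_score(y_walked, p_predicted)
    obs_freq, mean_pred = calibration_curve(y_walked, p_predicted, n_bins=n_bins)
    # Plot mean_pred (x) against obs_freq (y) to draw the calibration curve
    return auc, mean_pred, obs_freq
```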

Relevance:

20.00%

Publisher:

Abstract:

In developing countries such as Argentina, the survival of preterm infants weighing less than 1000 grams falls far short of the results reported by developed countries. Deficient prenatal care, limited technical resources and the saturation of neonatology services are partly responsible for these differences. One of the situations most frequently associated with ethical decisions in neonatology arises around the extremely preterm infant. The most difficult questions to answer are whether there is a weight or gestational-age limit below which life-saving therapies should not be initiated or added, because they are considered futile for the child, prolong life without hope, cause suffering to the patient and family, and occupy a unit that deprives another child with greater chances of survival of care. In the present study, a neonatal risk score was developed, composed of variables that characterize many populations of our Latin American countries, and it was statistically validated. The score is quick and easy to apply. It makes it possible to predict whether a critically ill preterm infant is salvageable or not, allowing ethical decisions to be made on the basis of a validated technique, acting in the best interest of the child and family while making more equitable use of resources.

Relevance:

20.00%

Publisher:

Abstract:

To reduce the costs and labor associated with predicting the genotypic mean (GM) of a synthetic variety (SV) of maize (Zea mays L.), breeders can develop SVs from L lines and s single crosses (SynL,SC) instead of from L+2s lines (SynL). The objective of this work was to derive and study formulae for the inbreeding coefficient (IC) and GM of SynL,SC, SynL, and the SV derived from (L+2s)/2 single crosses (SynSC). All SVs were derived from the same L+2s unrelated lines, whose IC is FL, and each parent of an SV was represented by m plants. An a priori probability equation for the IC was used. The main results were: 1) the largest and smallest GMs correspond to SynL and SynL,SC, respectively; 2) the GM predictors with the largest and intermediate precision are those for SynL and SynL,SC, respectively; 3) only when FL = 1, or when m is large, are SynL and SynSC the same population, but only with SynSC do prediction costs and labor undergo the maximum decrease, although its prediction precision is the lowest. To decide which SV to develop, breeders should also consider the availability of lines, single crosses, manpower and land area, as well as budget, target farmers, target environments, etc.

Relevance:

20.00%

Publisher:

Abstract:

Coastal managers require reliable spatial data on the extent and timing of potential coastal inundation, particularly in a changing climate. Most sea level rise (SLR) vulnerability assessments are undertaken using the easily implemented bathtub approach, where areas adjacent to the sea and below a given elevation are mapped using a deterministic line dividing potentially inundated from dry areas. This method only requires elevation data, usually in the form of a digital elevation model (DEM). However, inherent errors in the DEM and in the spatial analysis of the bathtub model propagate into the inundation mapping. The aim of this study was to assess the impacts of spatially variable and spatially correlated elevation errors in high-spatial-resolution DEMs on coastal inundation mapping. Elevation errors were best modelled using regression-kriging. This geostatistical model takes the spatial correlation in elevation errors into account, which has a significant impact on analyses that include spatial interactions, such as inundation modelling. The spatial variability of elevation errors was partially explained by land cover and terrain variables. Elevation errors were simulated using sequential Gaussian simulation, a Monte Carlo probabilistic approach. 1,000 error simulations were added to the original DEM and reclassified using a hydrologically correct bathtub method. The probability of inundation under a scenario combining a 1-in-100-year storm event with a 1 m SLR was calculated as the proportion of the 1,000 simulations in which a location was inundated. This probabilistic approach can be used in a risk-averse decision-making process by planning for scenarios with different probabilities of occurrence. For example, results showed that when considering a 1% exceedance probability, the inundated area was approximately 11% larger than that mapped using the deterministic bathtub approach. The probabilistic approach provides visually intuitive maps that convey the uncertainties inherent in spatial data and analysis.
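A simplified sketch of the probabilistic mapping, assuming an exponential covariance for the elevation errors in place of the study's regression-kriging and sequential Gaussian simulation; it also ignores hydrological connectivity, so it is only a toy version of the hydrologically correct bathtub method, practical only for small grids.

```python
# Toy probabilistic bathtub: simulate spatially correlated DEM errors, add them
# to the DEM, threshold against the water level, and count the flooded fraction.
import numpy as np
from scipy.spatial.distance import cdist

def inundation_probability(dem, cell_size=10.0, water_level=1.0,
                           error_sd=0.15, corr_range=50.0, n_sim=1000, seed=0):
    rng = np.random.default_rng(seed)
    ny, nx = dem.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    coords = np.column_stack([xx.ravel(), yy.ravel()]) * cell_size
    # Exponential covariance between all cell pairs (feasible only for small grids)
    cov = error_sd**2 * np.exp(-cdist(coords, coords) / corr_range)
    errors = rng.multivariate_normal(np.zeros(len(coords)), cov, size=n_sim)
    # Simple bathtub rule per simulation (hydrological connectivity is ignored)
    flooded = (dem.ravel() + errors) <= water_level
    return flooded.mean(axis=0).reshape(dem.shape)   # probability of inundation per cell
```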