914 results for hazards


Relevance: 10.00%

Publisher:

Abstract:

As a consequence of flood impacts, communities inhabiting mountain areas are increasingly affected by considerable damage to infrastructure and property. The design of effective flood risk mitigation strategies and their subsequent implementation is crucial for sustainable development in mountain areas. The assessment of the dynamic evolution of flood risk is the pillar of any subsequent planning process targeted at reducing the expected adverse consequences of hazard impacts. Given these premises, firstly, a comprehensive method to derive flood hazard process scenarios for well-defined areas at risk is presented. Secondly, conceptualisations of a static and a dynamic flood risk assessment are provided. These are based on formal schemes to compute the risk mitigation performance of devised mitigation strategies within the framework of economic cost-benefit analysis. In this context, techniques suitable for quantifying the expected losses induced by the identified flood impacts are provided.
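
As a concrete illustration of the risk assessment underlying such a cost-benefit analysis, one widely used formalisation (a sketch of a standard multiplicative risk equation, not necessarily the exact scheme developed in the paper) expresses flood risk as a sum over hazard scenarios and exposed objects:

```latex
% Sketch of a standard multiplicative risk formulation: p_{S_i} is the
% occurrence probability of hazard scenario S_i, A_{O_j} the value of object
% O_j at risk, v_{O_j,S_i} its vulnerability (degree of loss, 0..1) under
% scenario S_i, and p_{O_j,S_i} the probability of exposure of O_j to S_i.
R \;=\; \sum_{i}\sum_{j} p_{S_i}\, A_{O_j}\, v_{O_j,S_i}\, p_{O_j,S_i}
```

Under this reading, the benefit of a mitigation strategy in the cost-benefit framework is the reduction in R it achieves, weighed against its implementation cost.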

Relevance: 10.00%

Publisher:

Abstract:

Efforts have been made to provide a scientific basis for using environmental services as a conceptual tool to enhance conservation and improve livelihoods in protected mountain areas (MtPAS). Little attention has been paid to participatory research or to locals' concerns as environmental service (ES) users and providers. Such perspectives can illuminate the complex interplay between mountain ecosystems, environmental services and the determinants of human well-being. Repeat photography has long been used in geographical fieldwork but is new as a qualitative research tool. This study uses a novel application of repeat photography, as a diachronic photo-diary, to examine local perceptions of change in ES in Sagarmatha National Park, a UNESCO World Heritage Site. Results show a consensus among locals on adverse changes to ES, particularly protection against natural hazards such as landslides and floods. We argue that our methodology could complement biophysical ecosystem assessments in MtPAS, especially since assessing ES, and acting on such assessments, requires integrating diverse stakeholders' knowledge, recognizing power imbalances and grappling with complex social-ecological systems.

Relevance: 10.00%

Publisher:

Abstract:

BACKGROUND: Tumor levels of steroid hormone receptors, a factor used to select adjuvant treatment for early-stage breast cancer, are currently determined with immunohistochemical assays. These assays have a discordance of 10%-30% with previously used extraction assays. We assessed the concordance and predictive value of hormone receptor status as determined by immunohistochemical and extraction assays on specimens from International Breast Cancer Study Group Trials VIII and IX. These trials predominantly used extraction assays and compared adjuvant chemoendocrine therapy with endocrine therapy alone among pre- and postmenopausal patients with lymph node-negative breast cancer. Trial conclusions were that combination therapy provided a benefit to pre- and postmenopausal patients with estrogen receptor (ER)-negative tumors but not to ER-positive postmenopausal patients. ER-positive premenopausal patients required further study. METHODS: Tumor specimens from 571 premenopausal and 976 postmenopausal patients on which extraction assays had determined ER and progesterone receptor (PgR) levels before randomization from October 1, 1988, through October 1, 1999, were re-evaluated with an immunohistochemical assay in a central pathology laboratory. The endpoint was disease-free survival. Hazard ratios of recurrence or death for treatment comparisons were estimated with Cox proportional hazards regression models, and discriminatory ability was evaluated with the c index. All statistical tests were two-sided. RESULTS: Concordance of hormone receptor status determined by both assays ranged from 74% (kappa = 0.48) for PgR among postmenopausal patients to 88% (kappa = 0.66) for ER in postmenopausal patients. Hazard ratio estimates were similar for the association between disease-free survival and ER status (among all patients) or PgR status (among postmenopausal patients) as determined by the two methods. However, among premenopausal patients treated with endocrine therapy alone, the discriminatory ability of PgR status as determined by immunohistochemical assay was statistically significantly better (c index = 0.60 versus 0.51; P = .003) than that determined by extraction assay, and so immunohistochemically determined PgR status could predict disease-free survival. CONCLUSIONS: Trial conclusions in which ER status (for all patients) or PgR status (for postmenopausal patients) was determined by immunohistochemical assay supported those determined by extraction assays. However, among premenopausal patients, trial conclusions drawn from PgR status differed--immunohistochemically determined PgR status could predict response to endocrine therapy, unlike that determined by the extraction assay.
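
The following is a minimal sketch, not the trial's actual analysis code, of how receptor status can be related to disease-free survival with a Cox proportional hazards model and how discriminatory ability can be compared via the c-index; it assumes the Python lifelines package and uses synthetic data with hypothetical column names.

```python
# Sketch: Cox model for disease-free survival per assay, plus the c-index.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
pgr_ihc_pos = rng.integers(0, 2, n)                      # immunohistochemical assay
pgr_extract_pos = (pgr_ihc_pos + (rng.random(n) < 0.2)) % 2  # imperfectly concordant extraction assay
time = rng.exponential(3 + 3 * pgr_ihc_pos)              # receptor-positive -> longer event times
censor = rng.exponential(6, n)

df = pd.DataFrame({
    "time": np.minimum(time, censor),
    "event": (time <= censor).astype(int),               # recurrence or death
    "pgr_ihc_pos": pgr_ihc_pos,
    "pgr_extract_pos": pgr_extract_pos,
})

for col in ["pgr_ihc_pos", "pgr_extract_pos"]:
    cph = CoxPHFitter()
    cph.fit(df[["time", "event", col]], duration_col="time", event_col="event")
    # exp(coef) is the hazard ratio for receptor-positive status;
    # concordance_index_ is the c-index used to compare discriminatory ability.
    print(col,
          "HR =", round(cph.summary.loc[col, "exp(coef)"], 2),
          "c-index =", round(cph.concordance_index_, 2))
```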

Relevance: 10.00%

Publisher:

Abstract:

BACKGROUND: Patients coinfected with hepatitis C virus (HCV) and HIV experience higher mortality rates than patients infected with HIV alone. We designed a study to determine whether risks for later mortality are similar for HCV-positive and HCV-negative individuals when subjects are stratified on the basis of baseline CD4+ T-cell counts. METHODS: Antiretroviral-naive individuals who initiated highly active antiretroviral therapy (HAART) between 1996 and 2002 were included in the study. HCV-positive and HCV-negative individuals were stratified separately by baseline CD4+ T-cell count in 50 cells/microl increments. Cox proportional hazards regression was used to model the effect of these strata, together with other variables, on survival. RESULTS: CD4+ T-cell strata below 200 cells/microl, but not above, imparted an increased relative hazard (RH) of mortality for both HCV-positive and HCV-negative individuals. Among HCV-positive individuals, after adjustment for baseline age, HIV RNA levels, history of injection drug use and adherence to therapy, only the CD4+ T-cell strata of <50 cells/microl (RH=4.60; 95% confidence interval [CI] 2.72-7.76) and 50-199 cells/microl (RH=2.49; 95% CI 1.63-3.81) were significantly associated with increased mortality when compared with those initiating therapy at cell counts >500 cells/microl. The same baseline CD4+ T-cell strata were found for HCV-negative individuals. CONCLUSION: In a within-groups analysis, the baseline CD4+ T-cell strata associated with increased RHs for mortality are the same for HCV-positive and HCV-negative individuals initiating HAART. However, a between-groups analysis reveals a higher absolute mortality risk for HCV-positive individuals.
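
A minimal sketch of the kind of stratified analysis described above, assuming the Python lifelines package; the column names and synthetic data are hypothetical, with CD4+ strata dummy-coded against a >=500 cells/microl reference category.

```python
# Sketch: CD4+ T-cell strata as categorical covariates in a Cox mortality model.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 500
cd4 = rng.integers(0, 800, n)
time = rng.exponential(2 + cd4 / 200)        # synthetic: lower CD4 -> higher hazard
censor = rng.exponential(5, n)

df = pd.DataFrame({
    "time": np.minimum(time, censor),
    "death": (time <= censor).astype(int),
    "cd4_stratum": pd.cut(cd4, bins=[0, 50, 200, 350, 500, np.inf],
                          labels=["<50", "50-199", "200-349", "350-499", ">=500"],
                          right=False),
})

# Dummy-code the strata with >=500 cells/microl as the reference level.
X = pd.get_dummies(df["cd4_stratum"], prefix="cd4").drop(columns="cd4_>=500").astype(float)
model_df = pd.concat([df[["time", "death"]], X], axis=1)

cph = CoxPHFitter()
cph.fit(model_df, duration_col="time", event_col="death")
# exp(coef) gives the relative hazard (with 95% CI) of each stratum vs >=500.
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```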

Relevance: 10.00%

Publisher:

Abstract:

BACKGROUND: We sought to characterize the impact that hepatitis C virus (HCV) infection has on CD4 cells during the first 48 weeks of antiretroviral therapy (ART) in previously ART-naive human immunodeficiency virus (HIV)-infected patients. METHODS: The HIV/AIDS Drug Treatment Programme at the British Columbia Centre for Excellence in HIV/AIDS distributes all ART in this Canadian province. Eligible individuals were those whose first-ever ART included 2 nucleoside reverse transcriptase inhibitors and either a protease inhibitor or a nonnucleoside reverse transcriptase inhibitor and who had a documented positive result for HCV antibody testing. Outcomes were binary events (time to an increase of ≥75 CD4 cells/mm3 or an increase of ≥10% in the percentage of CD4 cells in the total T cell population [CD4 cell fraction]) and continuous repeated measures. Statistical analyses used parametric and nonparametric methods, including multivariate mixed-effects linear regression analysis and Cox proportional hazards analysis. RESULTS: Of 1186 eligible patients, 606 (51%) were positive and 580 (49%) were negative for HCV antibodies. HCV antibody-positive patients were slower to have an absolute (P<.001) and a fraction (P = .02) CD4 cell event. In adjusted Cox proportional hazards analysis (controlling for age, sex, baseline absolute CD4 cell count, baseline pVL, type of ART initiated, AIDS diagnosis at baseline, adherence to ART regimen, and number of CD4 cell measurements), HCV antibody-positive patients were less likely to have an absolute CD4 cell event (adjusted hazard ratio [AHR], 0.84 [95% confidence interval [CI], 0.72-0.98]) and somewhat less likely to have a CD4 cell fraction event (AHR, 0.89 [95% CI, 0.70-1.14]) than HCV antibody-negative patients. In multivariate mixed-effects linear regression analysis, HCV antibody-negative patients had increases of an average of 75 cells in the absolute CD4 cell count and 4.4% in the CD4 cell fraction, compared with 20 cells and 1.1% in HCV antibody-positive patients, during the first 48 weeks of ART, after adjustment for time-updated pVL, number of CD4 cell measurements, and other factors. CONCLUSION: HCV antibody-positive HIV-infected patients may have an altered immunologic response to ART.
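
The binary CD4 response events described above can be derived from longitudinal measurements along the following lines; this is a hypothetical sketch with invented column names and toy data, not the Programme's actual data handling.

```python
# Sketch: first week at which the absolute CD4 count rose >= 75 cells/mm3
# above baseline, per patient, from long-format repeated measures.
import pandas as pd

obs = pd.DataFrame({
    "patient": [1, 1, 1, 2, 2, 2, 3, 3],
    "week":    [0, 12, 24, 0, 24, 48, 0, 48],
    "cd4":     [180, 230, 290, 300, 340, 360, 90, 150],
})

def first_event_week(g, threshold=75):
    """Return the first week at which CD4 rose >= threshold above baseline."""
    baseline = g.loc[g["week"] == 0, "cd4"].iloc[0]
    hits = g[(g["week"] > 0) & (g["cd4"] - baseline >= threshold)]
    return hits["week"].min() if not hits.empty else None

events = obs.groupby("patient").apply(first_event_week)
print(events)   # patient 1 -> 24 (290-180 >= 75); patients 2 and 3 -> no event
```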

Relevance: 10.00%

Publisher:

Abstract:

Professor Sir David R. Cox (DRC) is widely acknowledged as among the most important scientists of the second half of the twentieth century. He inherited the mantle of statistical science from Pearson and Fisher, advanced their ideas, and translated statistical theory into practice so as to forever change the application of statistics in many fields, but especially biology and medicine. The logistic and proportional hazards models he substantially developed are arguably among the most influential biostatistical methods in current practice. This paper looks forward over the period from DRC's 80th to 90th birthdays, to speculate about the future of biostatistics, drawing lessons from DRC's contributions along the way. We consider "Cox's model" (CM) of biostatistics, an approach to statistical science that: formulates scientific questions or quantities in terms of parameters gamma in probability models f(y; gamma) that represent, in a parsimonious fashion, the underlying scientific mechanisms (Cox, 1997); partitions the parameters gamma = (theta, eta) into a subset of interest theta and other "nuisance parameters" eta necessary to complete the probability distribution (Cox and Hinkley, 1974); develops methods of inference about the scientific quantities that depend as little as possible upon the nuisance parameters (Barndorff-Nielsen and Cox, 1989); and thinks critically about the appropriate conditional distribution on which to base inferences. We briefly review exciting biomedical and public health challenges that are capable of driving statistical developments in the next decade. We discuss the statistical models and model-based inferences central to the CM approach, contrasting them with computationally intensive strategies for prediction and inference advocated by Breiman and others (e.g. Breiman, 2001) and with more traditional design-based methods of inference (Fisher, 1935). We discuss the hierarchical (multi-level) model as an example of the future challenges and opportunities for model-based inference. We then consider the role of conditional inference, a second key element of the CM. Recent examples from genetics are used to illustrate these ideas. Finally, the paper examines causal inference and statistical computing, two other topics we believe will be central to biostatistics research and practice in the coming decade. Throughout the paper, we attempt to indicate how DRC's work and the "Cox model" have set a standard of excellence to which all can aspire in the future.
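
In symbols, the ingredients of the "Cox model" of biostatistics described above can be restated as follows; the proportional hazards model itself is the canonical example, with the regression coefficients as the parameters of interest and the baseline hazard as an infinite-dimensional nuisance parameter that partial likelihood largely eliminates.

```latex
% Scientific questions are framed through parameters of a probability model,
% partitioned into interest and nuisance components:
%   Y \sim f(y;\gamma), \qquad \gamma = (\theta,\eta).
% Canonical instance: the proportional hazards model, where \theta is of
% interest and the baseline hazard \lambda_0(\cdot) is the nuisance parameter.
\lambda(t \mid x) \;=\; \lambda_0(t)\,\exp\!\left(x^{\top}\theta\right)
```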

Relevance: 10.00%

Publisher:

Abstract:

Assessments of environmental and territorial justice are similar in that both assess whether empirical relations between the spatial arrangement of undesirable hazards (or desirable public goods and services) and socio-demographic groups are consistent with notions of social justice. Both evaluate the spatial distribution of benefits and burdens (outcome equity) and the process that produces observed differences (process equity). Using proximity to major highways in NYC as a case study, we review methodological issues pertinent to both fields and discuss the choice and computation of exposure measures, but focus primarily on measures of inequity. We present inequity measures computed from the empirically estimated joint distribution of exposure and demographics and compare them to traditional measures such as linear regression, logistic regression and Theil's entropy index. We find that measures computed from the full joint distribution provide more unified, transparent and intuitive operational definitions of inequity, and we show how the approach can be used to structure siting and decommissioning decisions.
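
As one concrete point of reference, Theil's entropy index, one of the traditional measures the abstract compares against, can be computed for an exposure variable as follows; this is a generic sketch with hypothetical exposure values, not the paper's NYC highway data.

```python
# Sketch: Theil's T index of how unequally an exposure is distributed.
import numpy as np

def theil_index(x):
    """Theil's T index: 0 = perfect equality, larger = exposure concentrated on a few."""
    x = np.asarray(x, dtype=float)
    r = x / x.mean()
    return float(np.mean(r * np.log(r)))

# Hypothetical exposure scores (e.g. proximity-weighted highway exposure).
exposure_group_a = np.array([1.0, 1.2, 0.9, 1.1])
exposure_group_b = np.array([3.0, 0.2, 0.1, 4.5])
print(theil_index(exposure_group_a))   # near 0: exposure spread evenly
print(theil_index(exposure_group_b))   # larger: exposure concentrated on a few
```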

Relevance: 10.00%

Publisher:

Abstract:

In biostatistical applications interest often focuses on the estimation of the distribution of a time-until-event variable T. If one observes whether or not T exceeds an observed monitoring time at a random number of monitoring times, then the data structure is called interval censored data. We extend this data structure by allowing the presence of a possibly time-dependent covariate process that is observed until the end of follow-up. If one only assumes that the censoring mechanism satisfies coarsening at random, then, by the curse of dimensionality, typically no regular estimators will exist. To fight the curse of dimensionality we follow the approach of Robins and Rotnitzky (1992) by modeling parameters of the censoring mechanism. We model the right-censoring mechanism by modeling the hazard of the follow-up time, conditional on T and the covariate process. For the monitoring mechanism we avoid modeling the joint distribution of the monitoring times by only modeling a univariate hazard of the pooled monitoring times, conditional on the follow-up time, T, and the covariate process, which can be estimated by treating the pooled sample of monitoring times as i.i.d. In particular, it is assumed that the monitoring times and the right-censoring times only depend on T through the observed covariate process. We introduce an inverse probability of censoring weighted (IPCW) estimator of the distribution of T, and of smooth functionals thereof, which is guaranteed to be consistent and asymptotically normal if we have available correctly specified semiparametric models for the two hazards of the censoring process. Furthermore, given such correctly specified models for these hazards of the censoring process, we propose a one-step estimator that improves on the IPCW estimator if we correctly specify a lower-dimensional working model for the conditional distribution of T given the covariate process, and that remains consistent and asymptotically normal if this latter working model is misspecified. It is shown that the one-step estimator is efficient if each subject is monitored at most once and the working model contains the truth. In general, it is shown that the one-step estimator optimally uses the surrogate information if the working model contains the truth. It is not optimal in using the interval information provided by the current status indicators at the monitoring times, but simulations in Peterson and van der Laan (1997) show that the efficiency loss is small.
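
The basic IPCW idea can be sketched in a few lines; the version below uses a Kaplan-Meier estimate of the censoring distribution, a simpler special case than the covariate-dependent censoring-hazard models the abstract proposes, and assumes the Python lifelines package with synthetic data.

```python
# Sketch: inverse probability of censoring weights via a Kaplan-Meier estimate
# of the censoring distribution G. Each observed event is weighted by 1/G(T_i),
# the estimated probability of remaining uncensored up to its event time.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(2)
n = 300
t_event = rng.exponential(5, n)
t_censor = rng.exponential(8, n)
time = np.minimum(t_event, t_censor)
event = (t_event <= t_censor).astype(int)

# Estimate the censoring survival function G by reversing the event indicator.
km_censor = KaplanMeierFitter()
km_censor.fit(time, event_observed=1 - event)
G_at_T = km_censor.survival_function_at_times(time).to_numpy()

weights = np.zeros(n)
mask = event == 1
weights[mask] = 1.0 / G_at_T[mask]

# IPCW estimate of P(T <= 3): weighted proportion of observed events by then.
p_le_3 = np.mean(weights * (time <= 3))
print(round(p_le_3, 3), "vs naive", round(np.mean((time <= 3) & (event == 1)), 3))
```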

Relevance: 10.00%

Publisher:

Abstract:

To compare the prediction of hip fracture risk by several quantitative bone ultrasound (QUS) devices, 7062 Swiss women ≥70 years of age were measured with three QUS devices (two of the heel, one of the phalanges). Both heel QUS devices were predictive of hip fracture risk, whereas the phalanges QUS was not. INTRODUCTION: As the number of hip fractures is expected to increase during the next decades, it is important to develop strategies to detect subjects at risk. Quantitative bone ultrasound (QUS), an ionizing radiation-free method which is transportable, could be interesting for this purpose. MATERIALS AND METHODS: The Swiss Evaluation of the Methods of Measurement of Osteoporotic Fracture Risk (SEMOF) study is a multicenter cohort study, which compared three QUS devices for the assessment of hip fracture risk in a sample of 7609 elderly ambulatory women ≥70 years of age. Two QUS devices measured the heel (Achilles+; GE-Lunar and Sahara; Hologic), and one measured the phalanges (DBM Sonic 1200; IGEA). Cox proportional hazards regression was used to estimate the hazard of the first hip fracture, adjusted for age, BMI, and center, and the areas under the ROC curves were calculated to compare the devices and their parameters. RESULTS: Of the 7609 women included in the study, 7062 women 75.2 ± 3.1 (SD) years of age were prospectively followed for 2.9 ± 0.8 years. Eighty women reported a hip fracture. A decrease by 1 SD of the QUS variables corresponded to an increase of the hip fracture risk from 2.3 (95% CI, 1.7, 3.1) to 2.6 (95% CI, 1.9, 3.4) for the three variables of Achilles+ and from 2.2 (95% CI, 1.7, 3.0) to 2.4 (95% CI, 1.8, 3.2) for the three variables of Sahara. Risk gradients did not differ significantly among the variables of the two heel QUS devices. On the other hand, the phalanges QUS (DBM Sonic 1200) was not predictive of hip fracture risk, with an adjusted hazard ratio of 1.2 (95% CI, 0.9, 1.5), even after reanalysis of the digitalized data and using different cut-off levels (1700 or 1570 m/s). CONCLUSIONS: In this population of elderly women, both heel QUS devices were predictive of hip fracture risk, whereas the phalanges QUS device was not.
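
For readers unfamiliar with per-SD risk gradients: if beta is the fitted log-hazard ratio per unit of a QUS variable and sigma is that variable's standard deviation, the reported gradient (hazard ratio per 1 SD decrease) is obtained as

```latex
\mathrm{HR}_{-1\,\mathrm{SD}} \;=\; \exp(-\beta\,\sigma)
```

Equivalently, standardizing the QUS variable before fitting makes exp(-beta) directly interpretable as the hazard ratio per 1 SD decrease.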

Relevance: 10.00%

Publisher:

Abstract:

We propose a new method for fitting proportional hazards models with error-prone covariates. Regression coefficients are estimated by solving an estimating equation that is the average of the partial likelihood scores based on imputed true covariates. For the purpose of imputation, a linear spline model is assumed on the baseline hazard. We discuss consistency and asymptotic normality of the resulting estimators, and propose a stochastic approximation scheme to obtain the estimates. The algorithm is easy to implement, and reduces to the ordinary Cox partial likelihood approach when the measurement error has a degenerate distribution. Simulations indicate high efficiency and robustness. We consider the special case where error-prone replicates are available on the unobserved true covariates. As expected, increasing the number of replicates for the unobserved covariates increases efficiency and reduces bias. We illustrate the practical utility of the proposed method with an Eastern Cooperative Oncology Group clinical trial in which a genetic marker, c-myc expression level, is subject to measurement error.
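
For intuition, a simpler relative of the proposed approach is regression calibration, in which the error-prone covariate is replaced by the average of its replicates before the Cox model is fit; the sketch below (synthetic data, hypothetical names, lifelines package) illustrates how averaging replicates reduces, but does not eliminate, attenuation bias, which is the behaviour the abstract's more refined method targets.

```python
# Sketch of regression calibration (not the authors' estimating-equation/spline
# method): average the error-prone replicates, then fit an ordinary Cox model.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 400
x_true = rng.normal(0, 1, n)                    # unobserved true covariate (e.g. a marker level)
x_rep1 = x_true + rng.normal(0, 0.5, n)         # error-prone replicate 1
x_rep2 = x_true + rng.normal(0, 0.5, n)         # error-prone replicate 2

time = rng.exponential(np.exp(-0.7 * x_true))   # true log hazard ratio per unit x is +0.7
censor = rng.exponential(1.5, n)

df = pd.DataFrame({
    "time": np.minimum(time, censor),
    "event": (time <= censor).astype(int),
    "x_naive": x_rep1,                          # single noisy measurement: attenuated estimate
    "x_calibrated": (x_rep1 + x_rep2) / 2,      # averaging replicates reduces attenuation
})

for col in ["x_naive", "x_calibrated"]:
    cph = CoxPHFitter()
    cph.fit(df[["time", "event", col]], duration_col="time", event_col="event")
    print(col, "log HR =", round(cph.summary.loc[col, "coef"], 2), "(true +0.7)")
```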

Relevance: 10.00%

Publisher:

Abstract:

Use of microarray technology often leads to high-dimensional, low-sample-size data settings. Over the past several years, a variety of novel approaches have been proposed for variable selection in this context. However, only a small number of these have been adapted for time-to-event data where censoring is present. Among standard variable selection methods shown both to have good predictive accuracy and to be computationally efficient is the elastic net penalization approach. In this paper, adaptation of the elastic net approach is presented for variable selection both under the Cox proportional hazards model and under an accelerated failure time (AFT) model. Assessment of the two methods is conducted through simulation studies and through analysis of microarray data obtained from a set of patients with diffuse large B-cell lymphoma for whom survival time is of interest. The approaches are shown to match or exceed the predictive performance of a Cox-based and an AFT-based variable selection method. The methods are moreover shown to be much more computationally efficient than their respective Cox- and AFT-based counterparts.
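
A minimal sketch of elastic net variable selection under the Cox proportional hazards model, using the scikit-survival package on synthetic high-dimensional data; this illustrates the general technique, not the paper's own implementation or its lymphoma analysis.

```python
# Sketch: elastic-net-penalized Cox model in a p >> n setting; the penalty
# drives most coefficients to exactly zero, performing variable selection.
import numpy as np
from sksurv.linear_model import CoxnetSurvivalAnalysis
from sksurv.util import Surv

rng = np.random.default_rng(4)
n, p = 80, 500                                   # many more features than samples
X = rng.normal(size=(n, p))
risk = 1.0 * X[:, 0] - 1.0 * X[:, 1]             # only the first two features matter
t_event = rng.exponential(np.exp(-risk))
t_censor = rng.exponential(2.0, n)
y = Surv.from_arrays(event=t_event <= t_censor, time=np.minimum(t_event, t_censor))

# l1_ratio mixes lasso (1.0) and ridge (0.0) penalties; alpha sets the strength.
model = CoxnetSurvivalAnalysis(l1_ratio=0.9, alphas=[0.1])
model.fit(X, y)

# Which features receive nonzero weights at this penalty?
selected = np.flatnonzero(model.coef_[:, 0])
print("selected feature indices:", selected)
```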

Relevance: 10.00%

Publisher:

Abstract:

Suppose that, having established a marginal total effect of a point exposure on a time-to-event outcome, an investigator wishes to decompose this effect into its direct and indirect pathways, also known as natural direct and indirect effects, mediated by a variable known to occur after the exposure and prior to the outcome. This paper proposes a theory of estimation of natural direct and indirect effects in two important semiparametric models for a failure time outcome. The underlying survival model for the marginal total effect, and thus for the direct and indirect effects, can be either a marginal structural Cox proportional hazards model or a marginal structural additive hazards model. The proposed theory delivers new estimators for mediation analysis in each of these models, with appealing robustness properties. Specifically, in order to guarantee ignorability with respect to the exposure and mediator variables, the approach, which is multiply robust, allows the investigator to use several flexible working models to adjust for confounding by a large number of pre-exposure variables. Multiple robustness is appealing because it only requires a subset of the working models to be correct for consistency; furthermore, the analyst need not know which subset of working models is in fact correct to report valid inferences. Finally, a novel semiparametric sensitivity analysis technique is developed for each of these models to assess the impact on inference of a violation of the assumption of ignorability of the mediator.
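
For reference, the counterfactual decomposition behind natural direct and indirect effects, written here on a difference scale for a generic outcome Y, exposure A and mediator M; the abstract applies the analogous decomposition within Cox and additive hazards marginal structural models.

```latex
% Y(a, M(a*)) denotes the outcome under exposure a with the mediator set to
% the value it would take under exposure a*; the total effect splits into
% natural direct and indirect components.
\text{TE}
  = E\{Y(1, M(1))\} - E\{Y(0, M(0))\}
  = \underbrace{E\{Y(1, M(0))\} - E\{Y(0, M(0))\}}_{\text{natural direct effect}}
  \;+\;
    \underbrace{E\{Y(1, M(1))\} - E\{Y(1, M(0))\}}_{\text{natural indirect effect}}
```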