939 results for receiver operating characteristic curve


Abstract:

A retrospective review was undertaken in 744 patients who were dose-individualized with gentamicin once daily to evaluate a change in gentamicin clearance as a potential predictor of nephrotoxicity. The definition of nephrotoxicity was chosen to be a change in creatinine clearance greater than 20%. Similarly, a change in gentamicin clearance of greater than 20% was also considered a possible index of nephrotoxicity. Four criteria were developed to assess the usefulness of gentamicin clearance as a predictor of nephrotoxicity. Following the application of the inclusion/exclusion criteria, 132 patients were available for the analysis. The sensitivity, specificity, positive predictive value, and negative predictive value were assessed for each of the criteria. Receiver operating characteristic (ROC) curves were produced to determine if an optimum value in the change of gentamicin clearance could be found to maximize sensitivity and specificity. The overall incidence of nephrotoxicity based on a decrease in creatinine clearance by 20% or more was 3.8%. Women were overrepresented in the nephrotoxic group [71.4% versus 40.1% (P = 0.0025)]. Patients with nephrotoxicity had statistically longer treatment periods, increased cumulative dose, and more dosing predictions (P < 0.05 in each case). The sensitivity of the criteria ranged from 43 to 46%, and specificity ranged from 93 to 99%. The positive and negative predictive values ranged from 63 to 94% and 86 to 89%, respectively. In those patients in whom nephrotoxicity was predicted from a change in gentamicin clearance, this change occurred on average 3 days before the change in creatinine clearance (P < 0.05). A change in gentamicin clearance to predict nephrotoxicity may be a useful addition to current monitoring methods, although it is not the complete answer.
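
Since each of the study's criteria is summarized by the same four measures, a minimal sketch of their computation from a 2x2 table may help; the counts used here are hypothetical, not the study's data.

```python
# Sensitivity, specificity, PPV and NPV from a 2x2 contingency table, as
# reported for each gentamicin-clearance criterion. Counts are hypothetical.

def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Return the four standard diagnostic accuracy measures."""
    return {
        "sensitivity": tp / (tp + fn),  # detected nephrotoxic / all nephrotoxic
        "specificity": tn / (tn + fp),  # cleared non-toxic / all non-toxic
        "ppv": tp / (tp + fp),          # P(nephrotoxic | criterion positive)
        "npv": tn / (tn + fn),          # P(not nephrotoxic | criterion negative)
    }

if __name__ == "__main__":
    print(diagnostic_metrics(tp=3, fp=1, fn=4, tn=124))
```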

Abstract:

OBJECTIVES This study was designed to predict the response and prognosis after cardiac resynchronization therapy (CRT) in patients with end-stage heart failure (HF). BACKGROUND Cardiac resynchronization therapy improves HF symptoms, exercise capacity, and left ventricular (LV) function. Because not all patients respond, preimplantation identification of responders is needed. In the present study, response to CRT was predicted by the presence of LV dyssynchrony assessed by tissue Doppler imaging. Moreover, the prognostic value of LV dyssynchrony in patients undergoing CRT was assessed. METHODS Eighty-five patients with end-stage HF, QRS duration >120 ms, and left bundle-branch block were evaluated by tissue Doppler imaging before CRT. At baseline and six months follow-up, New York Heart Association functional class, quality of life and 6-min walking distance, LV volumes, and LV ejection fraction were determined. Events (death, hospitalization for decompensated HF) were obtained during one-year follow-up. RESULTS Responders (74%) and nonresponders (26%) had comparable baseline characteristics, except for a larger dyssynchrony in responders (87 +/- 49 ms vs. 35 +/- 20 ms, p < 0.01). Receiver operating characteristic curve analysis demonstrated that an optimal cutoff value of 65 ms for LV dyssynchrony yielded a sensitivity and specificity of 80% to predict clinical improvement and of 92% to predict LV reverse remodeling. Patients with dyssynchrony ≥65 ms had an excellent prognosis (6% event rate) after CRT as compared with a 50% event rate in patients with dyssynchrony <65 ms (p < 0.001). CONCLUSIONS Patients with LV dyssynchrony ≥65 ms respond to CRT and have an excellent prognosis after CRT. (C) 2004 by the American College of Cardiology Foundation.
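
One common way to locate an optimal ROC cut-off such as the 65 ms threshold is Youden's J statistic (sensitivity + specificity − 1); the sketch below assumes that rule and simulates dyssynchrony values from the reported group means, rather than using the study's data.

```python
# Locating an ROC-optimal cut-off via Youden's J. Data are simulated to match
# the reported group means/SDs (87 +/- 49 ms responders, 35 +/- 20 ms non-).
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
dyssynchrony = np.concatenate([rng.normal(87, 49, 63),   # 74% responders
                               rng.normal(35, 20, 22)])  # 26% nonresponders
responder = np.concatenate([np.ones(63), np.zeros(22)])

fpr, tpr, thresholds = roc_curve(responder, dyssynchrony)
best = np.argmax(tpr - fpr)  # index where Youden's J is maximal
print(f"optimal cut-off ~ {thresholds[best]:.0f} ms")
```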

Abstract:

Clinical evaluation of arterial patency in acute ST-elevation myocardial infarction (STEMI) is unreliable. We sought to identify infarction and predict infarct-related artery patency, graded by the Thrombolysis In Myocardial Infarction (TIMI) score, with qualitative and quantitative intravenous myocardial contrast echocardiography (MCE). Thirty-four patients with suspected STEMI underwent MCE before emergency angiography and planned angioplasty. MCE was performed with harmonic imaging and variable triggering intervals during intravenous administration of Optison. Myocardial perfusion was quantified offline by fitting an exponential function to contrast intensity at various pulsing intervals. Plateau myocardial contrast intensity (A), rate of rise (beta), and myocardial flow (Q = A x beta) were assessed in 6 segments. Qualitative assessment of perfusion defects was sensitive for the diagnosis of infarction (sensitivity 93%) and did not differ between anterior and inferior infarctions. However, qualitative assessment had only moderate specificity (50%), and perfusion defects were unrelated to TIMI flow. In patients with STEMI, quantitatively derived myocardial blood flow Q (A x beta) was significantly lower in territories subtended by an artery with impaired (TIMI 0 to 2) flow than in territories supplied by a reperfused artery with TIMI 3 flow (10.2 +/- 9.1 vs 44.3 +/- 50.4, p = 0.03). Quantitative flow was also lower in segments with impaired flow in the subtending artery compared with normal patients with TIMI 3 flow (42.8 +/- 36.6, p = 0.006) and all segments with TIMI 3 flow (35.3 +/- 32.9, p = 0.018). A receiver operating characteristic curve-derived cut-off Q value of …
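
The quantification step lends itself to a short illustration: under the usual destruction-replenishment model, contrast intensity at pulsing interval t follows y = A(1 − exp(−βt)), so the plateau A, rate β, and flow Q = A×β drop out of a curve fit. The data points below are invented for the sketch.

```python
# Destruction-replenishment fit used in quantitative MCE: y = A*(1-exp(-beta*t)).
# Sample points are illustrative, not patient data.
import numpy as np
from scipy.optimize import curve_fit

def replenishment(t, A, beta):
    return A * (1.0 - np.exp(-beta * t))

t = np.array([0.25, 0.5, 1.0, 2.0, 4.0, 8.0])    # pulsing intervals (s)
y = np.array([3.1, 5.8, 9.6, 13.0, 14.8, 15.1])  # video intensity (a.u.)

(A, beta), _ = curve_fit(replenishment, t, y, p0=(15.0, 1.0))
print(f"A={A:.1f}, beta={beta:.2f}, Q=A*beta={A*beta:.1f}")
```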

Abstract:

Background: Fetal scalp lactate testing has been shown to be as useful as pH testing, with added benefits. One remaining question is 'What level of lactate should trigger intervention in the first stage of labour?' Aims: This study aimed to establish the lactate level in the first stage of labour that indicates the need for intervention to ensure satisfactory outcomes for both babies and mothers. Methods: A prospective study at Mater Mothers' Hospital, Brisbane, Australia, a tertiary referral centre. One hundred and forty women in labour with non-reassuring fetal heart rate traces were tested using fetal scalp blood sampling of 5 μL of capillary blood, tested on an Accusport (Boehringer Mannheim, East Sussex, UK) lactate meter. The decision to intervene in labour was based on clinical assessment plus a predetermined cut-off. Main outcome measures were Apgar scores, cord arterial pH, meconium-stained liquor and Intensive Care Nursery admission. Results: Two-graph receiver operating characteristic (TG-ROC) analysis showed that optimal specificity and sensitivity for predicting adverse neonatal outcomes were obtained at a scalp lactate level above 4.2 mmol/L. Conclusions: Fetal blood sampling remains the standard for further investigating non-reassuring cardiotocograph (CTG) traces. Even so, it is a poor predictor of fetal outcomes. Scalp lactate has been shown to be at least as good a predictor as scalp pH, with the advantages of being easier, cheaper and having a lower rate of technical failure. Our study found that a cut-off fetal scalp lactate level of 4.2 mmol/L, in combination with an assessment of the entire clinical picture, is a useful tool in identifying those women who need intervention.
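
TG-ROC differs from a single ROC curve in that sensitivity and specificity are each plotted against the candidate threshold, and their crossing point suggests the cut-off. A sketch under simulated data (not the trial's) follows.

```python
# Two-graph ROC (TG-ROC): sensitivity and specificity as functions of the
# candidate lactate threshold; the crossing suggests the cut-off.
import numpy as np

rng = np.random.default_rng(1)
lactate = np.concatenate([rng.normal(3.2, 0.8, 120),  # good outcomes
                          rng.normal(5.0, 1.0, 20)])  # adverse outcomes
adverse = np.concatenate([np.zeros(120), np.ones(20)])

thresholds = np.linspace(2.0, 7.0, 101)
sens = np.array([(lactate[adverse == 1] > c).mean() for c in thresholds])
spec = np.array([(lactate[adverse == 0] <= c).mean() for c in thresholds])
crossing = thresholds[np.argmin(np.abs(sens - spec))]
print(f"sensitivity ~= specificity at {crossing:.1f} mmol/L")
```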

Abstract:

Predicting the various responses of different species to changes in landscape structure is a formidable challenge for landscape ecology. Based on expert knowledge and landscape ecological theory, we develop five competing a priori models for predicting the presence/absence of the Koala (Phascolarctos cinereus) in Noosa Shire, south-east Queensland (Australia). A priori predictions were nested within three levels of ecological organization: in situ (site-level) habitat (< 1 ha), patch level (100 ha) and landscape level (100-1000 ha). To test the models, Koala surveys and habitat surveys (n = 245) were conducted across the habitat mosaic. After taking into account tree species preferences, the patch and landscape context, and the neighbourhood effect of adjacent sites where Koalas were present, we applied logistic regression and hierarchical partitioning analyses to rank the alternative models and the explanatory variables. The strongest support was for a multilevel model, with Koala presence best predicted by the proportion of the landscape occupied by high quality habitat, the neighbourhood effect, the mean nearest-neighbour distance between forest patches, the density of forest patches and the density of sealed roads. When tested against independent data (n = 105) using a receiver operating characteristic curve, the multilevel model performed moderately well. The study is consistent with recent assertions that habitat loss is the major driver of population decline; however, landscape configuration and roads have important effects that need to be incorporated into Koala conservation strategies.
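
A hedged sketch of the validation step: fit a logistic model on the survey sites, then score the independent data set with an ROC AUC. The file and column names below are placeholders standing in for the study's predictors.

```python
# Fit a multilevel logistic model on the survey data, then validate it on the
# independent data set via ROC AUC. File/column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

predictors = ["prop_high_quality_habitat", "neighbourhood_effect",
              "mean_nn_patch_distance", "forest_patch_density",
              "sealed_road_density"]

train = pd.read_csv("koala_surveys.csv")     # n = 245 sites (hypothetical file)
test = pd.read_csv("koala_independent.csv")  # n = 105 sites (hypothetical file)

model = LogisticRegression(max_iter=1000).fit(train[predictors], train["present"])
auc = roc_auc_score(test["present"], model.predict_proba(test[predictors])[:, 1])
print(f"independent-data AUC = {auc:.2f}")
```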

Abstract:

The purpose of the present study is to test the case linkage principles of behavioural consistency and behavioural distinctiveness using serial vehicle theft data. Data from 386 solved vehicle thefts committed by 193 offenders were analysed using Jaccard's coefficient, regression and receiver operating characteristic (ROC) analyses to determine whether objectively observable aspects of crime scene behaviour could be used to distinguish crimes committed by the same offender from those committed by different offenders. The findings indicate that spatial behaviour, specifically the distance between theft locations and between dump locations, is a highly consistent and distinctive aspect of vehicle theft behaviour; thus, inter-crime and inter-dump distances represent the most useful aspects of vehicle theft for the purpose of case linkage analysis. The findings have theoretical and practical implications for the understanding of criminal behaviour and for the development of decision-support tools to assist police in the investigation and apprehension of serial vehicle theft offenders.
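
As an illustration of the linkage logic, one can compute Jaccard's coefficient over binary crime-scene behaviours for every crime pair and then run ROC analysis on linked versus unlinked pairs; the toy behaviours below are invented, and the study's strongest predictors were in fact continuous distances rather than binary features.

```python
# Case-linkage sketch: Jaccard similarity for all crime pairs, then ROC AUC
# for distinguishing same-offender (linked) from different-offender pairs.
from itertools import combinations
from sklearn.metrics import roc_auc_score

def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if a | b else 0.0

# crime id -> (offender id, set of observed behaviours); toy data
crimes = {
    1: ("A", {"night", "street", "window_entry"}),
    2: ("A", {"night", "street", "door_entry"}),
    3: ("B", {"day", "carpark", "window_entry"}),
    4: ("B", {"day", "carpark", "keys_stolen"}),
}

scores, linked = [], []
for i, j in combinations(crimes, 2):
    scores.append(jaccard(crimes[i][1], crimes[j][1]))
    linked.append(int(crimes[i][0] == crimes[j][0]))

print("AUC =", roc_auc_score(linked, scores))
```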

Abstract:

In this study, a new entropy measure known as kernel entropy (KerEnt), which quantifies the irregularity in a series, was applied to nocturnal oxygen saturation (SaO2) recordings. A total of 96 subjects suspected of suffering from sleep apnea-hypopnea syndrome (SAHS) took part in the study: 32 SAHS-negative and 64 SAHS-positive subjects. Their SaO2 signals were separately processed by means of KerEnt. Our results show that a higher degree of irregularity is associated with SAHS-positive subjects. Statistical analysis revealed significant differences between the KerEnt values of the SAHS-negative and SAHS-positive groups. The diagnostic utility of this parameter was studied by means of receiver operating characteristic (ROC) analysis. A classification accuracy of 81.25% (81.25% sensitivity and 81.25% specificity) was achieved. Repeated apneas during sleep increase irregularity in SaO2 data. This effect can be measured by KerEnt in order to detect SAHS. This non-linear measure can provide useful information for the development of alternative diagnostic techniques that reduce the demand for conventional polysomnography (PSG). © 2011 IEEE.
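
KerEnt is not a standard library routine, so the sketch below is only one plausible reading, in which the hard tolerance match of sample entropy is replaced by a Gaussian kernel over embedded subsequences; it is not the authors' exact formulation.

```python
# One plausible kernel-entropy sketch (NOT the paper's exact definition):
# sample-entropy-like ratio with a Gaussian kernel instead of a hard match.
import numpy as np

def kernel_entropy(x: np.ndarray, m: int = 2, sigma: float = 0.2) -> float:
    x = (x - x.mean()) / x.std()
    def phi(m):
        emb = np.array([x[i:i + m] for i in range(len(x) - m)])
        d2 = ((emb[:, None, :] - emb[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma ** 2)).mean()
    return -np.log(phi(m + 1) / phi(m))  # higher value = more irregular

rng = np.random.default_rng(2)
print(kernel_entropy(rng.normal(size=300)))        # irregular series: larger
print(kernel_entropy(np.sin(np.arange(300) / 5)))  # regular series: smaller
```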

Abstract:

Background - The binding between peptide epitopes and major histocompatibility complex proteins (MHCs) is an important event in the cellular immune response. Accurate prediction of the binding between short peptides and MHC molecules has long been a principal challenge for immunoinformatics. Recently, the modeling of MHC-peptide binding has come to emphasize quantitative predictions: instead of categorizing peptides as "binders" or "non-binders", or as "strong binders" and "weak binders", recent methods seek to predict precise binding affinities. Results - We developed a quantitative support vector machine regression (SVR) approach, called SVRMHC, to model peptide-MHC binding affinities. As a non-linear method, SVRMHC was able to generate models that outperformed existing linear models, such as the "additive method". By adopting a new "11-factor encoding" scheme, SVRMHC takes into account similarities in the physicochemical properties of the amino acids constituting the input peptides. When applied to MHC-peptide binding data for three mouse class I MHC alleles, the SVRMHC models produced more accurate predictions than those produced previously. Furthermore, comparisons based on receiver operating characteristic (ROC) analysis indicated that SVRMHC was able to outperform several prominent methods in identifying strongly binding peptides. Conclusion - As a method with demonstrated performance in the quantitative modeling of MHC-peptide binding and in identifying strong binders, SVRMHC is a promising immunoinformatics tool with not inconsiderable future potential.
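
The regression idea can be sketched with scikit-learn's SVR; the one-hot peptide encoding below merely stands in for the paper's "11-factor" physicochemical scheme, which is not reproduced here, and the peptides and affinities are invented.

```python
# SVR over encoded peptides, regressing binding affinity. The one-hot
# encoding is a placeholder for the 11-factor physicochemical scheme.
import numpy as np
from sklearn.svm import SVR

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def encode(peptide: str) -> np.ndarray:
    """One-hot encode a peptide (placeholder encoding)."""
    v = np.zeros((len(peptide), len(AMINO_ACIDS)))
    for i, aa in enumerate(peptide):
        v[i, AMINO_ACIDS.index(aa)] = 1.0
    return v.ravel()

# Hypothetical training data: 9-mers with log-transformed affinities.
peptides = ["SIINFEKLV", "SIINFEKLA", "AAAKFERQV", "GILGFVFTL"]
affinity = [7.2, 6.8, 4.1, 8.0]

model = SVR(kernel="rbf", C=10.0).fit([encode(p) for p in peptides], affinity)
print(model.predict([encode("SIINFEKLV")]))
```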

Abstract:

Background: Despite initial concerns about the sensitivity of the proposed diagnostic criteria for DSM-5 Autism Spectrum Disorder (ASD; e.g. Gibbs et al., 2012; McPartland et al., 2012), evidence is growing that the DSM-5 criteria provide an inclusive description with both good sensitivity and specificity (e.g. Frazier et al., 2012; Kent, Carrington et al., 2013). The capacity of the criteria to provide levels of sensitivity and specificity comparable with DSM-IV-TR, however, relies on careful measurement to ensure that appropriate items from diagnostic instruments map onto the new DSM-5 descriptions. Objectives: To use an existing DSM-5 diagnostic algorithm (Kent, Carrington et al., 2013) to identify a set of 'essential' behaviors sufficient to make a reliable and accurate diagnosis of DSM-5 Autism Spectrum Disorder (ASD) across age and ability level. Methods: Specific behaviors were identified and tested from the recently published DSM-5 algorithm for the Diagnostic Interview for Social and Communication Disorders (DISCO). Analyses were run on existing DISCO datasets, with a total participant sample size of 335. Three studies provided step-by-step development towards identification of a minimum set of items. Study 1 identified the most highly discriminating items (p<.0001). Study 2 used a lower selection threshold than Study 1 (p<.05) to facilitate better representation of the full DSM-5 ASD profile. Study 3 included additional items previously reported as significantly more frequent in individuals with higher ability. The discriminant validity of all three item sets was tested using receiver operating characteristic curves. Finally, sensitivity across age and ability was investigated in a subset of individuals with ASD (n=190). Results: Study 1 identified an item set (14 items) with good discriminant validity, but one that predominantly measured social-communication behaviors (11/14 items). The Study 2 item set (48 items) better represented the full DSM-5 ASD profile and had good discriminant validity, but lacked sensitivity for individuals with higher ability. The final Study 3 adjusted item set (54 items) improved sensitivity for individuals with higher ability, and its performance was comparable to the published DISCO DSM-5 algorithm. Conclusions: This work represents a first attempt to derive a reduced set of behaviors for DSM-5 directly from an existing standardized ASD developmental history interview. Further work involving existing ASD diagnostic tools with community-based and well-characterized research samples will be required to replicate these findings and exploit their potential to contribute to a more efficient and focused ASD diagnostic process.
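
The item-reduction logic reads as: test each DISCO item for discrimination between ASD and non-ASD cases, keep items below a p-value threshold, and check the reduced set's discriminant validity with an ROC curve. A sketch under hypothetical data and placeholder file names:

```python
# Select items by per-item significance, then validate the reduced set with
# an ROC AUC. Assumes each item column and the diagnosis are binary.
import pandas as pd
from scipy.stats import fisher_exact
from sklearn.metrics import roc_auc_score

items = pd.read_csv("disco_items.csv")  # one binary column per item (hypothetical)
asd = items.pop("asd_diagnosis")

def item_p(col: pd.Series) -> float:
    return fisher_exact(pd.crosstab(col, asd))[1]

selected = [c for c in items.columns if item_p(items[c]) < 0.0001]  # Study 1 rule
score = items[selected].sum(axis=1)  # simple count of endorsed items
print(len(selected), "items, AUC =", roc_auc_score(asd, score))
```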

Abstract:

Similar to classic Signal Detection Theory (SDT), the recent optimal Binary Signal Detection Theory (BSDT) and the Neural Network Assembly Memory Model (NNAMM) based on it can successfully reproduce Receiver Operating Characteristic (ROC) curves, although the BSDT/NNAMM parameters (cue intensity and neuron threshold) and the classic SDT parameters (perception distance and response bias) are essentially different. In the present work, BSDT/NNAMM optimal likelihood and posterior probabilities are analytically analyzed and used to generate ROCs, modified (posterior) mROCs, and the optimal overall likelihood and posterior. It is shown that, for the description of basic discrimination experiments in psychophysics within the BSDT, a 'neural space' can be introduced in which sensory stimuli are represented as neural codes and decision processes are defined; the BSDT's isobias curves can simultaneously be interpreted as universal psychometric functions satisfying the Neyman-Pearson objective; the just noticeable difference (jnd) can be defined and interpreted as an atom of experience; and near-neutral values of biases are observers' natural choice. The uniformity (no-priming) hypothesis, concerning the 'in-mind' distribution of false-alarm probabilities during ROC or overall probability estimations, is introduced. The BSDT's and classic SDT's sensitivity, bias, and their ROC and decision spaces are compared.
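
For orientation, the ROC curves that both frameworks reproduce can be generated from the classic equal-variance Gaussian SDT model by sweeping the decision criterion; this illustrates only the standard SDT parameterization (sensitivity d′ and criterion c), not the BSDT/NNAMM one.

```python
# Classic equal-variance SDT ROC: sweep criterion c, trace hit rate against
# false-alarm rate for a given sensitivity d'.
import numpy as np
from scipy.stats import norm

def roc_points(d_prime: float, criteria: np.ndarray):
    hits = norm.sf(criteria - d_prime)  # P(respond "signal" | signal)
    fas = norm.sf(criteria)             # P(respond "signal" | noise)
    return fas, hits

c = np.linspace(-3, 4, 200)
for d in (0.5, 1.0, 2.0):
    fas, hits = roc_points(d, c)
    i = np.argmin(np.abs(fas - 0.16))   # read off the curve at FA ~= 0.16
    print(f"d'={d}: hit rate {hits[i]:.2f} at false-alarm rate 0.16")
```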

Abstract:

The objective of this study was to identify a set of 'essential' behaviours sufficient for a diagnosis of DSM-5 Autism Spectrum Disorder (ASD). Highly discriminating, 'essential' behaviours were identified from the published DSM-5 algorithm developed for the Diagnostic Interview for Social and Communication Disorders (DISCO). Study 1 identified a reduced item set (48 items) with good predictive validity (as measured using receiver operating characteristic curves) that represented all symptom sub-domains described in the DSM-5 ASD criteria but lacked sensitivity for individuals with higher ability. An adjusted essential item set (54 items; Study 2) had good sensitivity when applied to individuals with higher ability, and its performance was comparable to the published full DISCO DSM-5 algorithm. Investigation at the item level revealed that the most highly discriminating items predominantly measured social-communication behaviours. This work represents a first attempt to derive a reduced set of behaviours for DSM-5 directly from an existing standardised ASD developmental history interview and has implications for the use of DSM-5 criteria in clinical and research practice. © 2014 The Authors.

Abstract:

The maintenance and evolution of software systems has become a highly critical task over recent years due to the diversity and high demand of features, devices and users. Understanding and analysing how new changes impact the quality attributes of such systems' architecture is an essential prerequisite for avoiding quality deterioration during their evolution. This thesis proposes an automated approach for analysing variation in the performance quality attribute in terms of execution time (response time). It is implemented by a framework that adopts dynamic analysis and software repository mining techniques to provide an automated way of revealing potential sources (commits and issues) of performance variation in scenarios during the evolution of software systems. The approach defines four phases: (i) preparation, choosing the scenarios and preparing the target releases; (ii) dynamic analysis, determining the performance of scenarios and methods by computing their execution times; (iii) variation analysis, processing and comparing the dynamic analysis results across different releases; and (iv) repository mining, identifying issues and commits associated with the detected performance variation. Empirical studies were performed to evaluate the approach from different perspectives. An exploratory study analysed the feasibility of applying the approach to systems from different domains in order to automatically identify source code elements with performance variation and the changes that affected those elements during an evolution. This study analysed three systems: (i) SIGAA, a web system for academic management; (ii) ArgoUML, a UML modelling tool; and (iii) Netty, a framework for network applications. Another study performed an evolutionary analysis by applying the approach to multiple releases of Netty and of the web frameworks Wicket and Jetty. In that study, 21 releases were analysed (seven from each system), totalling 57 scenarios. In summary, 14 scenarios with significant performance variation were found for Netty, 13 for Wicket and 9 for Jetty. Additionally, feedback was obtained from eight developers of these systems through an online form. Finally, in the last study, a performance regression model was developed to indicate which commit properties are most likely to cause performance degradation. Overall, 997 commits were mined, of which 103 were retrieved from degraded source code elements and 19 from optimized ones, while 875 had no impact on execution time. The number of days before the release and the day of the week proved to be the most relevant variables of performance-degrading commits in our model. The area under the receiver operating characteristic (ROC) curve of the regression model is 60%, meaning that using the model to decide whether a commit will cause degradation is 10% better than a random decision.
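
A hedged sketch of the final study's regression step: model the probability that a commit degrades execution time from commit properties such as days before the release and day of the week, then report the ROC area. File and column names are placeholders, not the thesis's artifacts.

```python
# Regress commit-level degradation risk on commit properties and report the
# ROC area (the thesis reports ~60%). Data file/columns are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

commits = pd.read_csv("mined_commits.csv")  # 997 mined commits (hypothetical file)
X = pd.get_dummies(commits[["days_before_release", "weekday"]],
                   columns=["weekday"])
y = commits["degrades_performance"]         # 1 = linked to a degraded element

model = LogisticRegression(max_iter=1000).fit(X, y)
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
print(f"ROC area = {auc:.2f}")
```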


Abstract:

Obesity is a recognized public health problem. Over the last decade, abdominal obesity (AO) has come to be regarded as a metabolic condition that contributes more to the risk of diabetes and cardiovascular disease than general obesity defined by body mass index. However, in populations of African origin, the relationship between AO and the other cardiometabolic risk (CMR) biomarkers remains unclear, owing to the scarcity of studies in these populations and the absence of specific cut-off values for defining AO. This study aimed to compare the prevalence of CMR biomarkers (AO, hypertension, hyperglycaemia, dyslipidaemia, insulin resistance and subclinical inflammation) between Beninese in Cotonou and Haitians in Port-au-Prince (PAP); to examine the association of AO with the other CMR biomarkers; to document the role of socio-economic status (SES) and lifestyle in this association; and to identify the anthropometric indicators of AO, waist circumference (WC) and waist-to-height ratio (WHtR), and the cut-offs that best predict CMR in Cotonou and PAP. This was a cross-sectional analysis of data from 452 apparently healthy adults (52% men) aged 25 to 60 years, with 200 subjects living in Cotonou (Benin) and 252 in PAP (Haiti). The CMR biomarkers considered were: the metabolic syndrome (MetS) according to the 2009 harmonized criteria and its individual components, namely AO defined as WC ≥ 94 cm in men and ≥ 80 cm in women, hypertension, dyslipidaemia and hyperglycaemia; insulin resistance, defined across the whole study sample from the 75th percentile of the Homeostasis Model Assessment (HOMA-IR); an elevated atherogenicity ratio (total serum cholesterol/HDL-cholesterol); and subclinical inflammation, measured as a high-sensitivity C-reactive protein (hsCRP) level between 3 and 10 mg/l. The WHtR was also used to define AO, with a cut-off of 0.5. Data on dietary habits, alcohol consumption, smoking, sociodemographic characteristics and socio-economic conditions, including education level and a proxy of income (based on principal component analysis of household goods and possessions), were collected by questionnaire. Based on frequency-of-consumption data for Western, urban and traditional foods, dietary patterns for the subjects in each city were identified by cluster analysis. The validity and the cut-off values of WC and WHtR predictive of CMR were determined from receiver operating characteristic (ROC) curves. MetS was present in 21.5% and 16.1% of participants in Cotonou and PAP, respectively. The prevalence of AO was higher in Cotonou (52.5%) than in PAP (36%), and higher in women than in men. The serum lipid profile was more atherogenic in PAP, with low HDL-c in 89.3% of subjects in PAP versus 79.7% in Cotonou, and an elevated TC/HDL-c ratio in 73.4% in PAP versus 42% in Cotonou. The specific cut-off values for WC and WHtR were 94 cm and 0.59, respectively, in women, and 80 cm and 0.50 in men.
Multivariate analyses of AO against the most prevalent CMR biomarkers in these two populations showed that AO was associated with an increased risk of insulin resistance, atherogenicity and elevated blood pressure, independently of socio-economic and lifestyle factors. Two dietary patterns, transitional and traditional, emerged in each city, but they were not associated with the CMR biomarkers, although they were related to the socio-economic variables. The present study confirms the presence of several CMR biomarkers in apparently healthy subjects. Furthermore, AO is a key element of CMR in these two populations. Current WC cut-offs should eventually be reconsidered in the light of larger studies, so as to better define AO in black Africans and people of African origin, which would allow more adequate epidemiological surveillance of CMR biomarkers.
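
The ROC-derived, sex-specific cut-offs could be reproduced along these lines, here using Youden's J as the selection rule (the exact criterion used in the thesis is not stated in the abstract); data and column names are placeholders.

```python
# Sex-stratified ROC cut-offs for waist circumference predicting CMR.
# File/columns are hypothetical, not the Cotonou/PAP data.
import numpy as np
import pandas as pd
from sklearn.metrics import roc_curve

df = pd.read_csv("cmr_survey.csv")  # columns: sex, waist_cm, cmr (0/1)
for sex, grp in df.groupby("sex"):
    fpr, tpr, thr = roc_curve(grp["cmr"], grp["waist_cm"])
    best = np.argmax(tpr - fpr)     # Youden's J, one common cut-off rule
    print(f"{sex}: WC cut-off ~ {thr[best]:.0f} cm")
```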

Abstract:

AIMS: Renal dysfunction is a powerful predictor of adverse outcomes in patients hospitalized for acute coronary syndrome. Three new glomerular filtration rate (GFR) estimating equations recently emerged, based on serum creatinine (CKD-EPIcreat), serum cystatin C (CKD-EPIcyst) or a combination of both (CKD-EPIcreat/cyst), and they are currently recommended to confirm the presence of renal dysfunction. Our aim was to analyse the predictive value of these new estimated GFR (eGFR) equations for mid-term mortality in patients with acute coronary syndrome, and to compare them with the traditional Modification of Diet in Renal Disease (MDRD-4) formula. METHODS AND RESULTS: 801 patients admitted for acute coronary syndrome (age 67.3±13.3 years, 68.5% male) and followed for 23.6±9.8 months were included. For each equation, patients were risk-stratified by eGFR value: high-risk group (eGFR <60 ml/min per 1.73 m2) and low-risk group (eGFR ≥60 ml/min per 1.73 m2). The predictive performances of the equations were compared using the areas under the receiver operating characteristic curves (AUCs). Overall risk stratification improvement was assessed by the net reclassification improvement index. The incidence of the primary endpoint was 18.1%. The CKD-EPIcyst equation had the highest overall discriminative performance for mid-term mortality (AUC 0.782±0.20) and outperformed all other equations (p<0.001 in all comparisons). When compared with the MDRD-4 formula, the CKD-EPIcyst equation accurately reclassified a significant percentage of patients into more appropriate risk categories (net reclassification improvement index of 11.9%; p=0.003). The CKD-EPIcyst equation added prognostic power to the Global Registry of Acute Coronary Events (GRACE) score in the prediction of mid-term mortality. CONCLUSION: The CKD-EPIcyst equation provides a novel and improved method for assessing mid-term mortality risk in patients admitted for acute coronary syndrome, outperforming the most widely used formula (MDRD-4) and improving the predictive value of the GRACE score. These results reinforce the added value of cystatin C as a risk marker in these patients.
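
The net reclassification improvement index can be sketched as follows for two binary risk stratifications (high risk: eGFR < 60 ml/min per 1.73 m2); the arrays are illustrative, not the cohort's data.

```python
# Net reclassification improvement (NRI) between two risk stratifications:
# NRI = P(up|event) - P(down|event) + P(down|non-event) - P(up|non-event).
import numpy as np

def nri(old_risk, new_risk, event):
    old_risk, new_risk, event = map(np.asarray, (old_risk, new_risk, event))
    up, down = new_risk > old_risk, new_risk < old_risk
    ev, ne = event == 1, event == 0
    return (up[ev].mean() - down[ev].mean()) + (down[ne].mean() - up[ne].mean())

# 1 = classified high risk (eGFR < 60): old model vs new model; toy data
old = [1, 1, 0, 0, 1, 0, 0, 1]
new = [1, 1, 1, 0, 0, 0, 0, 1]
died = [1, 1, 1, 0, 0, 0, 0, 1]
print(f"NRI = {nri(old, new, died):.2f}")
```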