986 results for Predictive values
Abstract:
BACKGROUND Arthroscopy is considered the gold standard for the diagnosis of traumatic intra-articular knee lesions. However, recent developments in magnetic resonance imaging (MRI) now offer good opportunities for the indirect assessment of the integrity and structural changes of the knee articular cartilage. The aim of this study was to investigate whether cartilage-specific sequences on a 3-Tesla MRI provide accurate assessment for the detection of cartilage defects. METHODS A 3-Tesla (3-T) MRI combined with three-dimensional double-echo steady-state (3D-DESS) cartilage-specific sequences was performed on 210 patients with knee pain prior to knee arthroscopy. Sensitivity, specificity, and positive and negative predictive values of magnetic resonance imaging were calculated and correlated with the arthroscopic findings of cartilaginous lesions. Lesions were classified using the modified Outerbridge classification. RESULTS For the 210 patients (1260 cartilage surfaces: patella, trochlea, medial femoral condyle, medial tibia, lateral femoral condyle, lateral tibia) evaluated, the sensitivities, specificities, positive predictive values, and negative predictive values of 3-T MRI were 83.3, 99.8, 84.4, and 99.8%, respectively, for the detection of grade IV lesions; 74.1, 99.6, 85.2, and 99.3%, respectively, for grade III lesions; 67.9, 99.2, 76.6, and 98.2%, respectively, for grade II lesions; and 8.8, 99.5, 80, and 92%, respectively, for grade I lesions. CONCLUSIONS For grade III and IV lesions, 3-T MRI combined with 3D-DESS cartilage-specific sequences represents an accurate diagnostic tool. For grade II lesions, the technique demonstrates moderate sensitivity, while for grade I lesions sensitivity is too limited to provide a reliable diagnosis compared with knee arthroscopy.
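The per-grade figures above come from the standard confusion-matrix arithmetic for a diagnostic test read against arthroscopy. A minimal sketch, using hypothetical counts rather than the study's raw data:

```python
# Illustrative sketch: sensitivity, specificity, PPV and NPV from a 2x2
# confusion matrix of MRI readings versus the arthroscopic reference.
def diagnostic_metrics(tp, fp, fn, tn):
    sensitivity = 100 * tp / (tp + fn)
    specificity = 100 * tn / (tn + fp)
    ppv = 100 * tp / (tp + fp)
    npv = 100 * tn / (tn + fn)
    return sensitivity, specificity, ppv, npv

# Hypothetical counts for a single lesion grade over 1260 cartilage surfaces
print(diagnostic_metrics(tp=54, fp=10, fn=11, tn=1185))
```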
Abstract:
BACKGROUND Cam-type femoroacetabular impingement (FAI) resulting from an abnormal nonspherical femoral head shape leads to chondrolabral damage and is considered a cause of early osteoarthritis. A previously developed experimental ovine FAI model induces a cam-type impingement that results in localized chondrolabral damage, replicating the patterns found in the human hip. Biochemical MRI modalities such as T2 and T2* may allow for evaluation of the cartilage biochemistry long before cartilage loss occurs and, for that reason, may be a worthwhile avenue of inquiry. QUESTIONS/PURPOSES We asked: (1) Does the histological grading of degenerated cartilage correlate with T2 or T2* values in this ovine FAI model? (2) How accurately can zones of degenerated cartilage be predicted with T2 or T2* MRI in this model? METHODS A cam-type FAI was induced in eight Swiss alpine sheep by performing a closing wedge intertrochanteric varus osteotomy. After 10 to 14 weeks of ambulation, the sheep were euthanized and a 3-T MRI of the hip was performed. T2 and T2* values were measured at six locations on the acetabulum and compared with the histological damage pattern using the Mankin score, an established histological scoring system to quantify cartilage degeneration. Both T2 and T2* values are determined by cartilage water content and its collagen fiber network. Of those, T2* mapping is a more modern sequence with technical advantages (eg, shorter acquisition time). Correlation of the Mankin score with the T2 and T2* values, respectively, was evaluated using Spearman's rank correlation coefficient. We used a hierarchical cluster analysis to calculate the positive and negative predictive values of T2 and T2* to predict advanced cartilage degeneration (Mankin ≥ 3). RESULTS We found a negative correlation between the Mankin score and both the T2 (p < 0.001, r = -0.79) and T2* values (p < 0.001, r = -0.90). For the T2 MRI technique, we found a positive predictive value of 100% (95% confidence interval [CI], 79%-100%) and a negative predictive value of 84% (95% CI, 67%-95%). For the T2* technique, we found a positive predictive value of 100% (95% CI, 79%-100%) and a negative predictive value of 94% (95% CI, 79%-99%). CONCLUSIONS T2 and T2* MRI modalities can reliably detect early cartilage degeneration in the experimental ovine FAI model. CLINICAL RELEVANCE T2 and T2* MRI modalities have the potential to allow for monitoring the natural course of osteoarthrosis noninvasively and to evaluate the results of surgical treatments targeted to joint preservation.
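The correlation step described in the methods can be sketched as follows; the Mankin and T2* values here are synthetic stand-ins, not the study's measurements:

```python
# Minimal sketch of the Spearman correlation between histology and MRI
# mapping values, using made-up data.
from scipy.stats import spearmanr

mankin_scores = [0, 1, 2, 3, 4, 5, 6, 7]          # hypothetical histological grades
t2_star_ms = [38, 36, 33, 30, 27, 25, 22, 20]     # hypothetical T2* values (ms)

rho, p_value = spearmanr(mankin_scores, t2_star_ms)
print(f"Spearman r = {rho:.2f}, p = {p_value:.4f}")  # a strongly negative rho mirrors the reported r = -0.90
```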
Abstract:
BACKGROUND A single non-invasive gene expression profiling (GEP) test (AlloMap®) is often used to discriminate whether a heart transplant recipient is at low risk of acute cellular rejection at the time of testing. In a randomized trial, use of the test (a GEP score from 0-40) was shown to be non-inferior to routine endomyocardial biopsy for surveillance after heart transplantation in selected low-risk patients with respect to clinical outcomes. Recently, it was suggested that the within-patient variability of consecutive GEP scores may be used to independently predict future clinical events; however, further studies were recommended. Here we performed an analysis of an independent patient population to determine the prognostic utility of within-patient variability of GEP scores in predicting future clinical events. METHODS We defined the GEP score variability as the standard deviation of four GEP scores collected ≥315 days post-transplantation. Of the 737 patients from the Cardiac Allograft Rejection Gene Expression Observational (CARGO) II trial, 36 were assigned to the composite event group (death, re-transplantation or graft failure ≥315 days post-transplantation and within 3 years of the final GEP test) and 55 were assigned to the control group (non-event patients). In this case-control study, the performance of GEP score variability to predict future events was evaluated by the area under the receiver operator characteristics curve (AUC ROC). The negative predictive values (NPV) and positive predictive values (PPV), including 95% confidence intervals (CI), of GEP score variability were calculated. RESULTS The estimated prevalence of events was 17%. Events occurred at a median of 391 (inter-quartile range 376) days after the final GEP test. The GEP variability AUC ROC for the prediction of a composite event was 0.72 (95% CI 0.6-0.8). The NPV for a GEP score variability of 0.6 was 97% (95% CI 91.4-100.0); the PPV for a GEP score variability of 1.5 was 35.4% (95% CI 13.5-75.8). CONCLUSION In heart transplant recipients, GEP score variability may be used to predict the probability that a composite event will occur within 3 years after the last GEP score. TRIAL REGISTRATION Clinicaltrials.gov identifier NCT00761787.
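The variability metric is simply the standard deviation of a patient's four qualifying GEP scores. A hedged sketch with made-up scores follows; the abstract does not state whether the sample or population form of the standard deviation was used:

```python
# Sketch of the within-patient variability metric: the standard deviation
# of four GEP scores collected >= 315 days post-transplantation.
from statistics import stdev

gep_scores = [30, 32, 31, 33]          # hypothetical scores for one patient
variability = stdev(gep_scores)        # sample SD; the population SD would differ slightly
print(f"GEP score variability = {variability:.2f}")
```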
Abstract:
AIMS A non-invasive gene-expression profiling (GEP) test for rejection surveillance of heart transplant recipients originated in the USA. A European-based study, the Cardiac Allograft Rejection Gene Expression Observational II Study (CARGO II), was conducted to further clinically validate the GEP test performance. METHODS AND RESULTS Blood samples for GEP testing (AlloMap®, CareDx, Brisbane, CA, USA) were collected during post-transplant surveillance. The reference standard for rejection status was based on histopathology grading of tissue from endomyocardial biopsy. The area under the receiver operating characteristic curve (AUC-ROC), negative predictive values (NPVs), and positive predictive values (PPVs) for the GEP scores (range 0-39) were computed. Considering a GEP score of 34 as a cut-off (>6 months post-transplantation), 95.5% (381/399) of GEP tests were true negatives, 4.5% (18/399) were false negatives, 10.2% (6/59) were true positives, and 89.8% (53/59) were false positives. Based on 938 paired biopsies, the GEP test score AUC-ROC for distinguishing ≥3A rejection was 0.70 and 0.69 for ≥2-6 and >6 months post-transplantation, respectively. Depending on the chosen threshold score, the NPV and PPV range from 98.1 to 100% and 2.0 to 4.7%, respectively. CONCLUSION For ≥2-6 and >6 months post-transplantation, CARGO II GEP score performance (AUC-ROC = 0.70 and 0.69) is similar to the CARGO study results (AUC-ROC = 0.71 and 0.67). The low prevalence of acute cellular rejection (ACR) contributes to the high NPV and limited PPV of GEP testing. The choice of threshold score for practical use of GEP testing should consider overall clinical assessment of the patient's baseline risk for rejection.
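The predictive values at the 34-point cut-off follow directly from the counts quoted above; the arithmetic below just reproduces them:

```python
# Worked arithmetic from the reported counts at the score-34 cut-off.
tn, fn = 381, 18    # among the 399 tests below the cut-off
tp, fp = 6, 53      # among the 59 tests at or above the cut-off

npv = tn / (tn + fn)
ppv = tp / (tp + fp)
print(f"NPV = {npv:.1%}, PPV = {ppv:.1%}")   # ~95.5% and ~10.2% at this particular threshold
```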
Abstract:
BACKGROUND HIV-1 RNA viral load (VL) testing is recommended to monitor antiretroviral therapy (ART) but is not available in many resource-limited settings. We developed and validated CD4-based risk charts to guide targeted VL testing. METHODS We modeled the probability of virologic failure up to 5 years of ART based on current and baseline CD4 counts, developed decision rules for targeted VL testing of 10%, 20%, or 40% of patients in 7 cohorts of patients starting ART in South Africa, and plotted cutoffs for VL testing on colour-coded risk charts. We assessed the accuracy of risk chart-guided VL testing to detect virologic failure in validation cohorts from South Africa, Zambia, and the Asia-Pacific. RESULTS In total, 31,450 adult patients were included in the derivation and 25,294 patients in the validation cohorts. Positive predictive values increased with the percentage of patients tested: from 79% (10% tested) to 98% (40% tested) in the South African cohort, from 64% to 93% in the Zambian cohort, and from 73% to 96% in the Asia-Pacific cohort. Corresponding increases in sensitivity were from 35% to 68% in South Africa, from 55% to 82% in Zambia, and from 37% to 71% in Asia-Pacific. The area under the receiver operating characteristic curve increased from 0.75 to 0.91 in South Africa, from 0.76 to 0.91 in Zambia, and from 0.77 to 0.92 in Asia-Pacific. CONCLUSIONS CD4-based risk charts with optimal cutoffs for targeted VL testing may be useful to monitor ART in settings where VL capacity is limited.
Abstract:
Primary ciliary dyskinesia (PCD) is a rare heterogeneous recessive genetic disorder of motile cilia, leading to chronic upper and lower respiratory symptoms. Prevalence is estimated at around 1:10,000, but many patients remain undiagnosed, while others receive the label incorrectly. Proper diagnosis is complicated by the fact that the key symptoms, such as wet cough, chronic rhinitis and recurrent upper and lower respiratory infection, are common and nonspecific. There is no single gold standard test to diagnose PCD. Presently, the diagnosis is made in patients with a compatible medical history and physical examination by a demanding combination of tests including nasal nitric oxide, high-speed video microscopy, transmission electron microscopy, genetics, and ciliary culture. These tests are costly and need sophisticated equipment and experienced staff, restricting use to highly specialised centers. Therefore, it would be desirable to have a screening test for identifying those patients who should undergo detailed diagnostic testing. Three recent studies focused on potential screening tools: one paper assessed the validity of nasal nitric oxide for screening, and two studies developed new symptom-based screening tools. These simple tools are welcome, and will hopefully remind physicians whom to refer for definitive testing. However, they have been developed in tertiary care settings, where 10 to 50% of tested patients have PCD. Sensitivity and specificity of the tools are reasonable, but positive and negative predictive values may be poor in primary or secondary care settings. While these studies take an important step forward towards an earlier diagnosis of PCD, more remains to be done before we have tools tailored to different health care settings.
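The warning that predictive values transfer poorly out of tertiary care is a direct consequence of Bayes' rule: for a fixed sensitivity and specificity, PPV falls as prevalence falls. A small illustration with assumed figures, none of them taken from the studies discussed:

```python
# Illustrative only: predictive values of the same screening test at
# tertiary-care-like versus primary-care-like prevalence.
def predictive_values(sensitivity, specificity, prevalence):
    tp = sensitivity * prevalence
    fp = (1 - specificity) * (1 - prevalence)
    fn = (1 - sensitivity) * prevalence
    tn = specificity * (1 - prevalence)
    return tp / (tp + fp), tn / (tn + fn)   # (PPV, NPV)

for prevalence in (0.30, 0.01):
    ppv, npv = predictive_values(sensitivity=0.90, specificity=0.85, prevalence=prevalence)
    print(f"prevalence {prevalence:.0%}: PPV {ppv:.1%}, NPV {npv:.1%}")
```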
Abstract:
Background. Cardiac tamponade can occur when a large amount of fluid or gas, singly or in combination, accumulates within the pericardium and compresses the heart, causing circulatory compromise. Although previous investigators have found the 12-lead ECG to have a poor predictive value in diagnosing cardiac tamponade, very few studies have evaluated it as a follow-up tool for ruling in or ruling out tamponade in patients with previously diagnosed malignant pericardial effusions. Methods. 127 patients with malignant pericardial effusions at the MD Anderson Cancer Center were included in this retrospective study. While 83 of these patients had a cardiac tamponade diagnosed by echocardiographic criteria (gold standard), 44 did not. We computed the sensitivity (Se), specificity (Sp), and positive (PPV) and negative predictive values (NPV) for individual and combinations of ECG abnormalities. Individual ECG abnormalities were also entered singly into a univariate logistic regression model to predict tamponade. Results. For patients with effusions of all sizes, electrical alternans had a Se, Sp, PPV and NPV of 22.61%, 97.61%, 95% and 39.25%, respectively. These parameters for low voltage complexes were 55.95%, 74.44%, 81.03% and 46.37%, respectively. The presence of all three ECG abnormalities had a Se = 8.33%, Sp = 100%, PPV = 100% and NPV = 35.83%, while the presence of at least one of the three ECG abnormalities had a Se = 89.28%, Sp = 46.51%, PPV = 76.53%, NPV = 68.96%. For patients with effusions of all sizes, electrical alternans had an OR of 12.28 (1.58–95.17, p = 0.016), while the presence of at least one ECG abnormality had an OR of 7.25 (2.9–18.1, p = 0.000) in predicting tamponade. Conclusions. Although individual ECG abnormalities had low sensitivities, specificities, NPVs and PPVs, with the exception of electrical alternans, the presence of at least one of the three ECG abnormalities had a high sensitivity in diagnosing cardiac tamponade. This could point to its potential use as a screening test with a correspondingly high NPV to rule out a diagnosis of tamponade in patients with malignant pericardial effusions. This could save expensive echocardiographic assessments in patients with previously diagnosed pericardial effusions.
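For context, the univariate odds ratios quoted above are the cross-product ratios of the underlying 2x2 tables; with a single binary predictor, logistic regression returns the same value. A hedged sketch with approximate counts reconstructed for illustration, not the study's exact data:

```python
# Approximate 2x2 table for electrical alternans (reconstructed from the
# quoted Se/Sp for illustration, so the OR will not match exactly).
a, b = 19, 1     # alternans present: tamponade / no tamponade
c, d = 64, 43    # alternans absent:  tamponade / no tamponade

odds_ratio = (a * d) / (b * c)
print(f"OR = {odds_ratio:.2f}")
```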
Abstract:
The use of exercise electrocardiography (ECG) to detect latent coronary heart disease (CHD) is discouraged in apparently healthy populations because of low sensitivity. These recommendations, however, are based on the efficacy of evaluation of ischemia (ST segment changes) with little regard for other measures of cardiac function that are available during exertion. The purpose of this investigation was to determine the association of maximal exercise hemodynamic responses with risk of mortality due to all causes, cardiovascular disease (CVD), and coronary heart disease (CHD) in apparently healthy individuals. Study participants were 20,387 men (mean age = 42.2 years) and 6,234 women (mean age = 41.9 years), patients of a preventive medicine center in Dallas, TX, examined between 1971 and 1989. During an average of 8.1 years of follow-up, there were 348 deaths in men and 66 deaths in women. In men, age-adjusted all-cause death rates (per 10,000 person-years) across quartiles of maximal systolic blood pressure (SBP) (low to high) were: 18.2, 16.2, 23.8, and 24.6 (p for trend <0.001). Corresponding rates for maximal heart rate were: 28.9, 15.9, 18.4, and 15.1 (p trend <0.001). After adjustment for confounding variables including age, resting systolic pressure, serum cholesterol and glucose, body mass index, smoking status, physical fitness and family history of CVD, risks (and 95% confidence intervals (CI)) of all-cause mortality for quartiles of maximal SBP, relative to the lowest quartile, were: 0.96 (0.70-1.33), 1.36 (1.01-1.85), and 1.37 (0.98-1.92) for quartiles 2-4, respectively. Similar risks for maximal heart rate were: 0.61 (0.44-0.85), 0.69 (0.51-0.93), and 0.60 (0.41-0.87). No associations were noted between maximal exercise rate-pressure product and mortality. Similar results were seen for risk of CVD and CHD death. In women, similar trends in age-adjusted all-cause and CVD death rates across maximal SBP and heart rate categories were observed. Sensitivity of the exercise test in predicting mortality was enhanced when ECG results were evaluated together with maximal exercise SBP or heart rate, with a concomitant decrease in specificity. Positive predictive values were not improved. The efficacy of the exercise test in predicting mortality in apparently healthy men and women was not enhanced by using maximal exercise hemodynamic responses. These results suggest that an exaggerated systolic blood pressure or an attenuated heart rate response to maximal exercise are risk factors for mortality in apparently healthy individuals.
Abstract:
Nutrient intake and specific food item data from 24-hour dietary recalls were utilized to study the relationship between measures of diet diversity and dietary adequacy in a population of white females of child-bearing age and in socioeconomic subgroups of that population. As the basis of the diet diversity measures, twelve food groups were constructed from the 24-hour recall data and the number of unique foods per food group was counted and weighted according to specified weighting schemes. Utilizing these food groups, nine diet diversity indices were developed. Sensitivity/specificity analysis was used to determine the ability of varying levels of selected diet diversity indices to identify individuals above and below preselected intakes of different nutrients. The true prevalence proportions, sensitivity and specificity, false positive and false negative rates, and positive predictive values observed at the selected levels of diet diversity indices were investigated in relation to the objectives and resources of a variety of nutrition improvement programs. Diet diversity indices constructed from the total population data were also evaluated as screening tools for respondent nutrient intakes in each of the socioeconomic subgroups. The results of the sensitivity/specificity analysis demonstrated that the false positive rate, the false negative rate, or both were too high at each diversity cut-off level to validate the widespread use of any of the diversity indices in the dietary assessment of the study population. Although diet diversity has been shown to be highly correlated with the intakes of a number of nutrients, the diet diversity indices constructed in this study did not adequately represent nutrient intakes as reported in the 24-hour dietary recall. Specific cut-off levels of selected diversity indices might have limited application in some nutrition programs. These findings applied to the sensitivity/specificity analyses in the socioeconomic subgroups as well as in the total population.
Abstract:
Introduction: The histological diagnosis of biliary strictures is fundamental to defining the therapy to be employed. Given the heterogeneity of the results of studies comparing brush cytology and transpapillary biopsy during endoscopic retrograde cholangiopancreatography (ERCP) with endoscopic ultrasound-guided fine-needle aspiration (EUS-FNA) for the histological diagnosis of malignant biliary stricture, and the absence of systematic reviews and meta-analyses comparing these methods, this study set out to compare the two methods for the histological diagnosis of malignant biliary stricture through a systematic review and meta-analysis of the literature. Methods: The Medline, Embase, Cochrane, LILACS, CINAHL and Scopus electronic databases were searched for studies published before November 2014. From a total of 1009 published studies, three prospective studies comparing EUS-FNA and ERCP for the histological diagnosis of malignant biliary stricture were selected, along with five cross-sectional studies comparing EUS-FNA against the same gold standard as the other three comparative studies. All patients underwent the same gold standard. The study variables (prevalence, sensitivity, specificity, positive and negative predictive values, and accuracy) were calculated and the meta-analysis was performed using the RevMan 5 and Meta-DiSc 1.4 software packages. Results: A total of 294 patients were included in the analysis. The pre-test probability of malignant biliary stricture was 76.66%. The mean sensitivities of ERCP and EUS-FNA for the histological diagnosis of malignant biliary stricture were 49% and 76.5%, respectively; specificities were 96.33% and 100%, respectively. Post-test probabilities were also determined: positive predictive values of 98.33% and 100%, respectively, and negative predictive values of 34% and 58.87%. Accuracies were 60.66% and 82.25%, respectively. Conclusion: EUS-FNA is superior to ERCP with brush cytology and/or transpapillary biopsy for the histological diagnosis of malignant biliary stricture. However, a negative EUS-FNA or ERCP histological sample cannot rule out malignant biliary stricture, since both tests have a low negative predictive value.
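The post-test probabilities quoted above can be approximated from the pre-test probability and the pooled sensitivity/specificity via Bayes' rule; the sketch below uses the pooled figures from the abstract and will differ slightly from the reported predictive values, since the meta-analysis pools per-study 2x2 tables rather than mean accuracy estimates:

```python
# Hedged sketch: converting a pre-test probability into post-test
# probabilities (PPV and NPV) with Bayes' rule.
pretest = 0.7666
for name, se, sp in [("ERCP", 0.49, 0.9633), ("EUS-FNA", 0.765, 1.00)]:
    ppv = (se * pretest) / (se * pretest + (1 - sp) * (1 - pretest))
    npv = (sp * (1 - pretest)) / (sp * (1 - pretest) + (1 - se) * pretest)
    print(f"{name}: post-test probability if positive {ppv:.1%}, NPV {npv:.1%}")
```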
Abstract:
Brachial plexus injury is considered the most severe nerve injury of the extremities. The main cause is high-energy trauma, especially accidents involving motor vehicles, and traumatic brachial plexus injuries are therefore increasingly frequent. The present study evaluated the accuracy of magnetic resonance imaging (MRI) in the diagnosis of traumatic brachial plexus injuries in adults, using the intraoperative findings as the gold standard. The accuracy of diffusion-weighted neurography (DW neurography) relative to conventional MRI was also evaluated, as was the ability to differentiate the three types of injury: avulsion, rupture and lesion in continuity. Thirty-three patients with a history and clinical diagnosis of traumatic brachial plexus injury were prospectively studied with MRI. The findings obtained by MRI with and without DW neurography, and the clinical examination findings, were compared with the intraoperative findings. Statistical analysis was performed with a 5% significance level. A high correlation was observed between MRI with DW neurography and surgery (rs = 0.79), and a low correlation between conventional MRI and surgery (rs = 0.41). Interobserver correlation was higher for MRI with DW neurography (rs = 0.94) than for MRI without DW neurography (rs = 0.75). Sensitivity, accuracy and positive predictive value were above 95% for MRI both with and without DW neurography in the assessment of the whole plexus. Specificities were, in general, higher for DW neurography (p < 0.05). Regarding the differentiation of injury types, MRI with DW neurography showed high accuracy and sensitivity in the diagnosis of avulsion/rupture, and high specificity in the diagnosis of lesion in continuity. The accuracy of MRI (93.9%) was significantly higher than that of clinical examination (76.5%) in the diagnosis of injuries of the whole plexus (p < 0.05).
Abstract:
Final dissertation for the Integrated Master's Degree in Medicine, Faculdade de Medicina, Universidade de Lisboa, 2014
Abstract:
A retrospective review was undertaken in 744 patients who were dose-individualized with gentamicin once daily to evaluate a change in gentamicin clearance as a potential predictor of nephrotoxicity. The definition of nephrotoxicity was chosen to be a change in creatinine clearance greater than 20%. Similarly, a change in gentamicin clearance of greater than 20% was also considered a possible index of nephrotoxicity. Four criteria were developed to assess the usefulness of gentamicin clearance as a predictor of nephrotoxicity. Following the application of the inclusion/exclusion criteria, 132 patients were available for the analysis. The sensitivity, specificity, positive predictive value, and negative predictive value were assessed for each of the criteria. Receiver operating characteristic (ROC) curves were produced to determine if an optimum value in the change of gentamicin clearance could be found to maximize sensitivity and specificity. The overall incidence of nephrotoxicity based on a decrease in creatinine clearance by 20% or more was 3.8%. Women were overrepresented in the nephrotoxic group [71.4% versus 40.1% (P = 0.0025)]. Patients with nephrotoxicity had statistically longer treatment periods, increased cumulative dose, and more dosing predictions (P < 0.05 in each case). The sensitivity of the criteria ranged from 43 to 46%, and specificity ranged from 93 to 99%. The positive and negative predictive values ranged from 63 to 94% and 86 to 89%, respectively. In those patients in whom nephrotoxicity was predicted from a change in gentamicin clearance, this change occurred on average 3 days before the change in creatinine clearance (P < 0.05). A change in gentamicin clearance to predict nephrotoxicity may be a useful addition to current monitoring methods, although it is not the complete answer.
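The ROC step described above amounts to scanning candidate cut-offs for the change in gentamicin clearance and selecting the one with the best sensitivity/specificity trade-off (for example, Youden's J). A hedged sketch on synthetic data, not the study's:

```python
# Hedged sketch: choosing the change-in-clearance cut-off that maximizes
# sensitivity + specificity (Youden's J) from an ROC curve. Synthetic data.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
nephrotoxic = rng.normal(35, 10, 5)      # % fall in gentamicin clearance, nephrotoxic patients
non_toxic = rng.normal(10, 10, 127)      # % fall, non-nephrotoxic patients
y_true = np.r_[np.ones_like(nephrotoxic), np.zeros_like(non_toxic)]
y_score = np.r_[nephrotoxic, non_toxic]

fpr, tpr, thresholds = roc_curve(y_true, y_score)
best = np.argmax(tpr - fpr)              # Youden's J = sensitivity + specificity - 1
print(f"optimal cut-off ≈ {thresholds[best]:.1f}% (Se {tpr[best]:.0%}, Sp {1 - fpr[best]:.0%})")
```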
Abstract:
Aim: The aim of this study was to assess the discriminatory power and potential turnaround time (TAT) of a PCR-based method for the detection of methicillin-resistant Staphylococcus aureus (MRSA) from screening swabs. Methods: Screening swabs were examined using the current laboratory protocol of direct culture on mannitol salt agar supplemented with oxacillin (MSAO-direct). The PCR method involved pre-incubation in broth for 4 hours followed by a multiplex PCR with primers directed to the mecA and nuc genes of MRSA. The reference standard was determined by pre-incubation in broth for 4 hours followed by culture on MSAO (MSAO-broth). Results: A total of 256 swabs was analysed. The rates of detection of MRSA using MSAO-direct, MSAO-broth and PCR were 10.2, 13.3 and 10.2%, respectively. For PCR, the sensitivity, specificity, positive predictive value and negative predictive value were 66.7% (95% CI 51.9-83.3%), 98.6% (95% CI 97.1-100%), 84.6% (95% CI 76.2-100%) and 95.2% (95% CI 92.4-98.0%), respectively, and these results were almost identical to those obtained with MSAO-direct. The agreement between MSAO-direct and PCR was 61.5% (95% CI 42.8-80.2%) for positive results, 95.6% (95% CI 93.0-98.2%) for negative results, and 92.2% (95% CI 88.9-95.5%) overall. Conclusions: (1) The discriminatory power of PCR and MSAO-direct is similar, but the level of agreement, especially for true positive results, is low. (2) The potential TAT of the PCR method provides a marked advantage over conventional methods. (3) Further modifications to the PCR method, such as increased broth incubation time, use of selective broth and adaptation to real-time PCR, may lead to improvement in sensitivity and TAT.
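The confidence intervals quoted above are intervals for binomial proportions. A minimal sketch of the normal-approximation (Wald) version follows, with illustrative counts; Wilson or exact methods, which the study may have used, give slightly different limits:

```python
# Hedged sketch: 95% CI for a proportion such as sensitivity via the
# normal approximation. Counts are illustrative, not the study's.
import math

def wald_ci(successes, n, z=1.96):
    p = successes / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half_width), min(1.0, p + half_width)

p, lower, upper = wald_ci(successes=22, n=33)   # e.g. PCR-positive among reference-positive swabs
print(f"sensitivity {p:.1%} (95% CI {lower:.1%}-{upper:.1%})")
```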
Abstract:
Objective: The description and evaluation of the performance of a new real-time seizure detection algorithm in the newborn infant. Methods: The algorithm includes parallel fragmentation of the EEG signal into waves; wave-feature extraction and averaging; and elementary, preliminary and final detection. The algorithm detects EEG waves with heightened regularity, using wave intervals, amplitudes and shapes. The performance of the algorithm was assessed using event-based as well as liberal and conservative time-based approaches, and compared with the performance of Gotman's and Liu's algorithms. Results: The algorithm was assessed on multi-channel EEG records of 55 neonates, including 17 with seizures. The algorithm showed sensitivities ranging from 83 to 95% with positive predictive values (PPV) of 48-77%. There were 2.0 false positive detections per hour. In comparison, Gotman's algorithm (with a 30 s gap-closing procedure) displayed sensitivities of 45-88% and PPV of 29-56%, with 7.4 false positives per hour, and Liu's algorithm displayed sensitivities of 96-99% and PPV of 10-25%, with 15.7 false positives per hour. Conclusions: The wave-sequence-analysis-based algorithm displayed higher sensitivity, higher PPV and a substantially lower level of false positives than the two previously published algorithms. Significance: The proposed algorithm provides a basis for major improvements in neonatal seizure detection and monitoring.
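The event-based figures reported above (sensitivity, PPV, false detections per hour) can be computed along the following lines. This is a generic scoring sketch, not the authors' implementation, and the toy timings are invented:

```python
# Hedged sketch of event-based scoring: a detection counts as a true
# positive if it falls inside any annotated seizure interval; a seizure
# counts as detected if at least one detection lands inside it.
def event_based_scores(detections, seizures, record_hours):
    """detections: detection times in seconds; seizures: (start, end) intervals in seconds."""
    hits = {i for i, (s, e) in enumerate(seizures)
            for t in detections if s <= t <= e}
    true_pos = sum(1 for t in detections if any(s <= t <= e for s, e in seizures))
    false_pos = len(detections) - true_pos
    sensitivity = len(hits) / len(seizures) if seizures else 0.0
    ppv = true_pos / len(detections) if detections else 0.0
    return sensitivity, ppv, false_pos / record_hours

print(event_based_scores(detections=[120, 400, 1500], seizures=[(100, 180), (390, 450)], record_hours=1.0))
```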