996 results for ROC Analysis


Relevance: 30.00%

Abstract:

Background: Falls of elderly people may cause permanent disability or death. Particularly susceptible are elderly patients in rehabilitation hospitals. We systematically reviewed the literature to identify falls prediction tools available for assessing elderly inpatients in rehabilitation hospitals. Methods and Findings: We searched six electronic databases using comprehensive search strategies developed for each database. Estimates of sensitivity and specificity were plotted in ROC space graphs and pooled across studies. Our search identified three studies which assessed the prediction properties of falls prediction tools in a total of 754 elderly inpatients in rehabilitation hospitals. Only the STRATIFY tool was assessed in all three studies; the other identified tools (PJC-FRAT and DOWNTON) were each assessed by a single study. For a STRATIFY cut-score of two, pooled sensitivity was 73% (95% CI 63 to 81%) and pooled specificity was 42% (95% CI 34 to 51%). An indirect comparison of the tools across studies indicated that the DOWNTON tool has the highest sensitivity (92%), while the PJC-FRAT offers the best balance between sensitivity and specificity (73% and 75%, respectively). All studies presented major methodological limitations. Conclusions: We did not identify any tool which had an optimal balance between sensitivity and specificity, or which was clearly better than a simple clinical judgment of risk of falling. The limited number of identified studies, with major methodological limitations, impairs sound conclusions on the usefulness of falls risk prediction tools in geriatric rehabilitation hospitals.
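As an illustration of the pooling and ROC-space plotting described above, here is a minimal Python sketch: it pools the summed 2x2 counts across studies (one simple pooling strategy, not necessarily the one used in the review) and plots each study in ROC space. The study counts are hypothetical.

```python
import numpy as np
import matplotlib.pyplot as plt
from statsmodels.stats.proportion import proportion_confint

# (TP, FN, FP, TN) for each study -- hypothetical counts, not the reviewed data
counts = np.array([[50, 18, 90, 65],
                   [40, 15, 70, 55],
                   [35, 13, 60, 50]])
tp, fn, fp, tn = counts.T
sens = tp / (tp + fn)
spec = tn / (tn + fp)

# simple pooling: sum the 2x2 tables across studies, then Wilson 95% CIs
pooled_sens = tp.sum() / (tp.sum() + fn.sum())
pooled_spec = tn.sum() / (tn.sum() + fp.sum())
sens_ci = proportion_confint(tp.sum(), tp.sum() + fn.sum(), method="wilson")
spec_ci = proportion_confint(tn.sum(), tn.sum() + fp.sum(), method="wilson")
print(f"pooled sensitivity {pooled_sens:.2f} (95% CI {sens_ci[0]:.2f}-{sens_ci[1]:.2f})")
print(f"pooled specificity {pooled_spec:.2f} (95% CI {spec_ci[0]:.2f}-{spec_ci[1]:.2f})")

# plot each study in ROC space (1 - specificity vs sensitivity)
plt.scatter(1 - spec, sens)
plt.xlabel("1 - specificity"); plt.ylabel("sensitivity")
plt.xlim(0, 1); plt.ylim(0, 1)
plt.savefig("roc_space.png")
```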

Relevance: 30.00%

Abstract:

Spatial independent component analysis (sICA) of functional magnetic resonance imaging (fMRI) time series can generate meaningful activation maps and associated descriptive signals, which are useful to evaluate datasets of the entire brain or selected portions of it. Besides computational implications, variations in the input dataset combined with the multivariate nature of ICA may lead to different spatial or temporal readouts of brain activation phenomena. By reducing and increasing a volume of interest (VOI), we applied sICA to different datasets from real activation experiments with multislice acquisition and single or multiple sensory-motor task-induced blood oxygenation level-dependent (BOLD) signal sources with different spatial and temporal structure. Using receiver operating characteristics (ROC) methodology for accuracy evaluation and multiple regression analysis as benchmark, we compared sICA decompositions of reduced and increased VOI fMRI time-series containing auditory, motor and hemifield visual activation occurring separately or simultaneously in time. Both approaches yielded valid results; however, the results of the increased VOI approach were spatially more accurate compared to the results of the decreased VOI approach. This is consistent with the capability of sICA to take advantage of extended samples of statistical observations and suggests that sICA is more powerful with extended rather than reduced VOI datasets to delineate brain activity.
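The ROC-based accuracy evaluation mentioned above can be sketched as follows: the voxel weights of a single sICA component map are scored against a reference activation map, here simulated; in the study, the multiple regression result served as the benchmark. Names and values below are illustrative.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)
n_voxels = 20000
truth = np.zeros(n_voxels, dtype=int)
truth[:600] = 1                                   # hypothetical "truly active" voxels
# simulated sICA component map: active voxels receive higher absolute weights
component_map = rng.normal(0, 1, n_voxels)
component_map[truth == 1] += 1.5

auc = roc_auc_score(truth, np.abs(component_map))
fpr, tpr, _ = roc_curve(truth, np.abs(component_map))
print("spatial accuracy (AUC) of the component map:", round(auc, 3))
```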

Relevance: 30.00%

Abstract:

Studies of diagnostic accuracy require more sophisticated methods for their meta-analysis than studies of therapeutic interventions. A number of different, and apparently divergent, methods for meta-analysis of diagnostic studies have been proposed, including two alternative approaches that are statistically rigorous and allow for between-study variability: the hierarchical summary receiver operating characteristic (ROC) model (Rutter and Gatsonis, 2001) and bivariate random-effects meta-analysis (van Houwelingen and others, 1993), (van Houwelingen and others, 2002), (Reitsma and others, 2005). We show that these two models are very closely related, and define the circumstances in which they are identical. We discuss the different forms of summary model output suggested by the two approaches, including summary ROC curves, summary points, confidence regions, and prediction regions.
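A minimal sketch of the bivariate random-effects model in its commonly used normal-approximation form: logit-transformed sensitivity and specificity per study, with a between-study bivariate normal fitted by maximum likelihood. The 2x2 counts are invented, and the direct likelihood optimisation shown is only one way to fit the model.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import multivariate_normal

# (TP, FN, FP, TN) per study -- hypothetical data; with this few studies the
# between-study parameters are weakly identified (shown for illustration only)
counts = np.array([[45, 15, 20, 70],
                   [30, 10, 25, 60],
                   [55, 25, 15, 80]], dtype=float)
tp, fn, fp, tn = counts.T
logit_se = np.log(tp / fn)                         # logit sensitivity = log(TP/FN)
logit_sp = np.log(tn / fp)                         # logit specificity = log(TN/FP)
y = np.column_stack([logit_se, logit_sp])
s2 = np.column_stack([1/tp + 1/fn, 1/tn + 1/fp])   # within-study variances

def neg_log_lik(params):
    mu = params[:2]
    tau = np.exp(params[2:4])                      # between-study SDs (kept positive)
    rho = np.tanh(params[4])                       # between-study correlation in (-1, 1)
    sigma_b = np.array([[tau[0]**2, rho*tau[0]*tau[1]],
                        [rho*tau[0]*tau[1], tau[1]**2]])
    ll = 0.0
    for yi, s2i in zip(y, s2):
        cov = sigma_b + np.diag(s2i)               # marginal covariance for study i
        ll += multivariate_normal.logpdf(yi, mean=mu, cov=cov)
    return -ll

fit = minimize(neg_log_lik, x0=np.array([1.0, 1.0, -1.0, -1.0, 0.0]),
               method="Nelder-Mead")
mu_se, mu_sp = fit.x[:2]
expit = lambda x: 1 / (1 + np.exp(-x))
print("summary sensitivity:", expit(mu_se))
print("summary specificity:", expit(mu_sp))
```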

Relevance: 30.00%

Abstract:

OBJECTIVE: Meta-analysis of studies of the accuracy of diagnostic tests currently uses a variety of methods. Statistically rigorous hierarchical models require expertise and sophisticated software. We assessed whether any of the simpler methods can in practice give adequately accurate and reliable results. STUDY DESIGN AND SETTING: We reviewed six methods for meta-analysis of diagnostic accuracy: four simple commonly used methods (simple pooling, separate random-effects meta-analyses of sensitivity and specificity, separate meta-analyses of positive and negative likelihood ratios, and the Littenberg-Moses summary receiver operating characteristic [ROC] curve) and two more statistically rigorous approaches using hierarchical models (bivariate random-effects meta-analysis and hierarchical summary ROC curve analysis). We applied the methods to data from a sample of eight systematic reviews chosen to illustrate a variety of patterns of results. RESULTS: In each meta-analysis, there was substantial heterogeneity between the results of different studies. Simple pooling of results gave misleading summary estimates of sensitivity and specificity in some meta-analyses, and the Littenberg-Moses method produced summary ROC curves that diverged from those produced by more rigorous methods in some situations. CONCLUSION: The closely related hierarchical summary ROC curve or bivariate models should be used as the standard method for meta-analysis of diagnostic accuracy.
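Of the simple methods listed above, the Littenberg-Moses summary ROC can be sketched in a few lines: regress D = logit(TPR) - logit(FPR) on S = logit(TPR) + logit(FPR) across studies, then back-transform the fitted line to a summary ROC curve. The study counts below are hypothetical.

```python
import numpy as np

# (TP, FN, FP, TN) per study -- hypothetical data (0.5 added to avoid zero cells)
counts = np.array([[40, 10, 12, 48],
                   [25, 15, 8, 52],
                   [60, 20, 30, 70]], dtype=float) + 0.5
tp, fn, fp, tn = counts.T
tpr = tp / (tp + fn)
fpr = fp / (fp + tn)
logit = lambda p: np.log(p / (1 - p))

D = logit(tpr) - logit(fpr)          # diagnostic log odds ratio
S = logit(tpr) + logit(fpr)          # proxy for the positivity threshold
b, a = np.polyfit(S, D, 1)           # D = a + b*S (polyfit returns slope first)

# summary ROC: logit(TPR) = (a + (1 + b) * logit(FPR)) / (1 - b)
fpr_grid = np.linspace(0.01, 0.99, 99)
sroc_tpr = 1 / (1 + np.exp(-(a + (1 + b) * logit(fpr_grid)) / (1 - b)))
for x, y in zip(fpr_grid[::20], sroc_tpr[::20]):
    print(f"FPR={x:.2f}  SROC TPR={y:.2f}")
```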

Relevance: 30.00%

Abstract:

Dual energy X-ray absorptiometry (DXA) is widely accepted as the reference method for diagnosis and monitoring of osteoporosis and for assessment of fracture risk, especially at the hip. However, axial DXA is not suitable for mass screening, because it is usually confined to specialized centers. We propose a two-step diagnostic approach to postmenopausal osteoporosis: the first step, using an inexpensive, widely available screening technique, aims at risk stratification in postmenopausal women; in the second step, DXA of the spine and hip is applied only to potentially osteoporotic women preselected on the basis of the screening measurement. In a group of 110 healthy postmenopausal women, the capability of various peripheral bone measurement techniques to predict osteoporosis at the spine and/or hip (T-score < -2.5 SD using DXA) was tested using receiver operating characteristic (ROC) curves: radiographic absorptiometry of the phalanges (RA) and ultrasonometry at the calcaneus (QUS.CALC), tibia (SOS.TIB), and phalanges (SOS.PHAL). Thirty-three women had osteoporosis at the spine and/or hip with DXA. Areas under the ROC curves were 0.84 for RA, 0.83 for QUS.CALC, 0.77 for SOS.PHAL (p < 0.04 vs RA) and 0.74 for SOS.TIB (p < 0.02 vs RA and p = 0.05 vs QUS.CALC). For a sensitivity of 90%, the respective specificities were 67% (RA), 64% (QUS.CALC), 48% (SOS.PHAL), and 39% (SOS.TIB). In a cost-effective two-step approach, the price of the first step should not exceed 54% (RA), 51% (QUS.CALC), 42% (SOS.PHAL), and 25% (SOS.TIB) of the DXA cost. In conclusion, RA, QUS.CALC, SOS.PHAL, and SOS.TIB may be useful to preselect postmenopausal women in whom axial DXA is indicated to confirm or exclude osteoporosis at the spine or hip.
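Reading specificity off an ROC curve at a fixed 90% sensitivity, as done above for each peripheral technique, can be sketched as follows; the measurements and labels are simulated, not the study data.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)
n = 110
osteoporosis = rng.random(n) < 0.3                      # ~30% prevalence, as in the study
# hypothetical peripheral measurement: lower values in osteoporotic women
score = rng.normal(loc=np.where(osteoporosis, -1.0, 0.0), scale=1.0)

fpr, tpr, thr = roc_curve(osteoporosis, -score)          # negate: lower score = higher risk
print("AUC:", roc_auc_score(osteoporosis, -score))

idx = np.argmax(tpr >= 0.90)                             # first threshold reaching 90% sensitivity
print("specificity at 90% sensitivity:", 1 - fpr[idx])
```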

Relevance: 30.00%

Abstract:

This work investigates the performance of cardiorespiratory analysis in detecting periodic breathing (PB) in chest wall recordings in mountaineers climbing to extreme altitude. The breathing patterns of 34 mountaineers were monitored unobtrusively by inductance plethysmography, ECG and pulse oximetry using a portable recorder during climbs at altitudes between 4497 and 7546 m on Mt. Muztagh Ata. The minute ventilation (VE) and heart rate (HR) signals were studied to identify visually scored PB, applying time-varying spectral, coherence and entropy analysis. In 411 climbing periods, 30-120 min in duration, high values of the mean power (MP(VE)) and slope (MSlope(VE)) of the modulation frequency band of VE accurately identified PB, with areas under the ROC curve of 88% and 89%, respectively. Prolonged stay at altitude was associated with an increase in PB. During PB episodes, higher peak power of ventilatory (MP(VE)) and cardiac (MP(LF)(HR)) oscillations and higher cardiorespiratory coherence (MP(LF)(Coher)), but reduced ventilation entropy (SampEn(VE)), were observed. Therefore, the characterization of cardiorespiratory dynamics by the analysis of VE and HR signals accurately identifies PB and the effects of altitude acclimatization, providing promising tools for investigating physiologic effects of environmental exposures and diseases.
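One of the ventilatory features described above, the mean power of VE in a low modulation-frequency band, can be sketched with a short-time spectrum; the sampling rate, band limits and simulated VE series are assumptions for illustration only.

```python
import numpy as np
from scipy.signal import spectrogram

rng = np.random.default_rng(1)
fs = 1.0                                   # one VE sample per second (assumed)
t = np.arange(0, 1800, 1 / fs)             # 30 minutes
# simulated VE: baseline breathing plus a ~40 s periodic-breathing modulation
ve = 8 + 2 * np.sin(2 * np.pi * t / 40) + 0.5 * rng.normal(size=t.size)

f, tt, Sxx = spectrogram(ve, fs=fs, nperseg=256, noverlap=128)
band = (f >= 0.015) & (f <= 0.06)          # assumed modulation band (~17-65 s cycles)
mp_ve = Sxx[band].mean(axis=0)             # mean modulation-band power per time window
print("mean modulation-band power over the record:", mp_ve.mean())
```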

Relevance: 30.00%

Abstract:

Background: Analysis of exhaled volatile organic compounds (VOCs) in breath is an emerging approach for cancer diagnosis, but little is known about its potential use as a biomarker for colorectal cancer (CRC). We investigated whether a combination of VOCs could distinguish CRC patients from healthy volunteers. Methods: In a pilot study, we prospectively analyzed breath exhalations of 38 CRC patients and 43 healthy controls, all scheduled for colonoscopy, older than 50 and in the average-risk category. The samples were ionized and analyzed using a secondary electrospray ionization (SESI) source coupled with a time-of-flight mass spectrometer (SESI-MS). After a minimum of 2 hours of fasting, volunteers exhaled deeply into the system. Each test requires three soft exhalations and takes less than ten minutes. No breath condensate or collection is required, and VOC masses are detected in real time, also allowing a spirometric profile to be analyzed along with the VOCs. A new sampling system precludes ambient air from entering the system, so background contamination is reduced by an overall factor of ten. Potential confounding variables from the patient or the environment that could interfere with the results were analyzed. Results: 255 VOCs, with masses ranging from 30 to 431 Dalton, were identified in the exhaled breath. Using a classification technique based on the ROC curve for each VOC, a set of 9 biomarkers discriminating the presence of CRC from healthy volunteers was obtained, showing an average recognition rate of 81.94%, a sensitivity of 87.04% and a specificity of 76.85%. Conclusions: A combination of qualitative and quantitative analysis of VOCs in the exhaled breath could be a powerful diagnostic tool for the average-risk CRC population. These results should be taken with caution, as many endogenous or exogenous contaminants could interfere as confounding variables. On-line analysis with SESI-MS is less time-consuming and does not need sample preparation. We are recruiting for a new pilot study that includes breath cleaning procedures and spirometric analysis incorporated into the post-processing algorithms, to better control for confounding variables.
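A sketch of per-VOC ROC screening in the spirit of the classification described above: each VOC is ranked by its individual AUC, the nine most discriminative are kept, and a simple combined classifier is cross-validated. The intensity matrix is simulated, and in a real analysis the VOC selection should be nested inside the cross-validation to avoid optimistic estimates.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(1)
n_subjects, n_vocs = 81, 255
y = np.r_[np.ones(38), np.zeros(43)].astype(int)        # 38 CRC, 43 controls
X = rng.normal(size=(n_subjects, n_vocs))
X[y == 1, :9] += 0.8                                     # make 9 VOCs weakly informative

auc_per_voc = np.array([roc_auc_score(y, X[:, j]) for j in range(n_vocs)])
top9 = np.argsort(np.abs(auc_per_voc - 0.5))[-9:]        # most discriminative in either direction
print("selected VOC indices:", top9)

pred = cross_val_predict(LogisticRegression(max_iter=1000), X[:, top9], y,
                         cv=5, method="predict_proba")[:, 1]
print("cross-validated AUC of the 9-VOC panel:", roc_auc_score(y, pred))
```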

Relevance: 30.00%

Abstract:

The relationship between sleep apnoea–hypopnoea syndrome (SAHS) severity and the regularity of nocturnal oxygen saturation (SaO2) recordings was analysed. Three different methods were proposed to quantify regularity: approximate entropy (AEn), sample entropy (SEn) and kernel entropy (KEn). A total of 240 subjects suspected of suffering from SAHS took part in the study. They were randomly divided into a training set (96 subjects) and a test set (144 subjects) for the adjustment and assessment of the proposed methods, respectively. According to the measurements provided by AEn, SEn and KEn, higher irregularity of oximetry signals is associated with SAHS-positive patients. Receiver operating characteristic (ROC) and Pearson correlation analyses showed that KEn was the most reliable predictor of SAHS. It provided an area under the ROC curve of 0.91 in two-class classification of subjects as SAHS-negative or SAHS-positive. Moreover, KEn measurements from oximetry data exhibited a linear dependence on the apnoea–hypopnoea index, as shown by a correlation coefficient of 0.87. Therefore, these measurements could be used for the development of simplified diagnostic techniques in order to reduce the demand for polysomnographies. Furthermore, KEn represents a convincing alternative to AEn and SEn for the diagnostic analysis of noisy biomedical signals.
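Sample entropy, one of the three regularity measures compared above, can be sketched as follows; the template length m = 2 and tolerance r = 0.2 x SD are common defaults assumed here, and the oximetry-like signals are simulated.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    x = np.asarray(x, dtype=float)
    n = len(x)
    r = r_factor * x.std()
    def count_matches(length):
        # pairs of templates within tolerance r (Chebyshev distance);
        # the same n - m templates are used for both lengths, per Richman & Moorman
        templates = np.array([x[i:i + length] for i in range(n - m)])
        count = 0
        for i in range(len(templates) - 1):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(d <= r)
        return count
    b = count_matches(m)          # matches of length m
    a = count_matches(m + 1)      # matches of length m + 1
    return -np.log(a / b)         # higher SEn = more irregular signal

rng = np.random.default_rng(2)
spo2_regular = 96 + 0.2 * np.sin(np.arange(600) / 10)    # smooth, regular series
spo2_irregular = 96 + rng.normal(0, 0.5, 600)            # noisy, irregular series
print("SEn regular  :", sample_entropy(spo2_regular))
print("SEn irregular:", sample_entropy(spo2_irregular))
```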

Relevance: 30.00%

Abstract:

The maintenance and evolution of software systems has become a highly critical task over recent years due to the diversity and high demand of features, devices, and users. Understanding and analysing how new changes impact the quality attributes of such systems' architecture is an essential prerequisite to avoid quality deterioration during their evolution. This thesis proposes an automated approach for analysing variation of the performance quality attribute in terms of execution time (response time). It is implemented by a framework that adopts dynamic analysis and software repository mining techniques to provide an automated way of revealing potential sources – commits and issues – of performance variation in scenarios during the evolution of software systems. The approach defines four phases: (i) preparation – choosing the scenarios and preparing the target releases; (ii) dynamic analysis – determining the performance of scenarios and methods by computing their execution times; (iii) variation analysis – processing and comparing the dynamic analysis results for different releases; and (iv) repository mining – identifying issues and commits associated with the detected performance variation. Empirical studies were performed to evaluate the approach from different perspectives. An exploratory study analysed the feasibility of applying the approach to systems from different domains in order to automatically identify source code elements with performance variation and the changes that affected those elements during an evolution. This study analysed three systems: (i) SIGAA – a web system for academic management; (ii) ArgoUML – a UML modelling tool; and (iii) Netty – a framework for network applications. Another study performed an evolutionary analysis by applying the approach to multiple releases of Netty and of the web frameworks Wicket and Jetty. In this study, 21 releases (seven of each system) were analysed, totalling 57 scenarios. In summary, 14 scenarios with significant performance variation were found for Netty, 13 for Wicket and 9 for Jetty. Additionally, feedback was obtained from eight developers of these systems through an online form. Finally, in the last study, a regression model for performance was developed to indicate the properties of commits that are most likely to cause performance degradation. Overall, 997 commits were mined, 103 of which were retrieved from degraded source code elements and 19 from optimized ones, while 875 had no impact on execution time. The number of days before the release and the day of the week proved to be the most relevant variables of performance-degrading commits in our model. The area under the Receiver Operating Characteristic (ROC) curve of the regression model is 60%, which means that using the model to decide whether or not a commit will cause degradation is 10% better than a random decision.
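A minimal sketch of the variation-analysis phase described above: execution times of one scenario measured on two releases are compared and a significant variation is flagged. The timing samples, the Mann-Whitney U test and the 5% threshold are illustrative assumptions, not the exact criteria of the thesis.

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(3)
# execution times (ms) of the same scenario on two releases -- simulated
times_release_a = rng.normal(120, 5, 30)
times_release_b = rng.normal(131, 5, 30)      # ~9% slower on average

stat, p = mannwhitneyu(times_release_a, times_release_b)
rel_change = (times_release_b.mean() - times_release_a.mean()) / times_release_a.mean()

if p < 0.05 and abs(rel_change) > 0.05:
    kind = "degradation" if rel_change > 0 else "optimization"
    print(f"significant performance {kind}: {rel_change:+.1%} (p = {p:.4f})")
else:
    print("no significant performance variation detected")
```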


Relevance: 30.00%

Abstract:

The Cyprus dispute accurately portrays the evolution of the conflict from ‘warfare to lawfare’ enriched in politics; this research has proven that the Cyprus problem has been and will continue to be one of the most judicialised disputes across the globe. Notwithstanding the ‘normalisation’ of affairs between the two ethno-religious groups on the island since the division in 1974, the Republic of Cyprus’ (RoC) European Union (EU) membership in 2004 failed to catalyse reunification and terminate the legal, political and economic isolation of the Turkish Cypriot community. So the question is: why is it that the powerful legal order of the EU continuously fails to tame the tiny troublesome island of Cyprus? This is a thesis on the interrelationship of the EU legal order and the Cyprus problem. A literal and depoliticised interpretation of EU law has been maintained throughout the EU’s dealings with Cyprus, both pre-accession and post-accession. The research has brought to light that this literal interpretation of EU law vis-à-vis Cyprus has in actual fact deepened the division on the island. Pessimists outnumber optimists so far as resolving this problem is concerned, and rightly so if you look back over the last forty years of failed attempts to do just that: a diplomatic combat zone scattered with the bones of numerous mediators. This thesis will discuss how the decisions of the EU institutions, its Member States and specifically of the European Court of Justice, despite conforming to the EU legal order, have managed to disregard the principle of equality on the divided island and thus prevent the promised upgrade of the status of the Turkish Cypriot community since 2004. Indeed, whether a positive or negative reading of the Union’s position towards the Cyprus problem is adopted, the case remains valid for an organisation based on the rule of law to maintain legitimacy, democracy, clarity and equality in the decisions of its institutions. Overall, the aim of this research is to establish a link between the Union’s lack of success in building a bridge over troubled waters and the right of self-determination of the Turkish Cypriot community. The only way left for the EU to help resolve the Cyprus problem is to aim to broker a deal between the two Cypriot communities which will permit the recognition of the Turkish Republic of Northern Cyprus (TRNC) or at least the ‘Taiwanisation’ of Northern Cyprus. Although there are many studies that address the impact of the EU on the conflict or on the RoC, which represents the government that has monopolised EU accession, the argument advanced in this thesis is that despite the alleged Europeanisation of the Turkish Cypriot community, they are habitually disregarded because of the EU’s current legal framework and the Union’s lack of a conflict transformation strategy vis-à-vis the island. Since the self-declared TRNC is not recognised and EU law is suspended in northern Cyprus in accordance with Protocol No 10 on Cyprus of the Act of Accession 2003, the Turkish Cypriots represent an idiomatic partner of Brussels, but the relations between the two resemble the experience of EU enlargement: the EU’s relevance to the community has been based on the prospects for EU accession (via reunification) and assistance towards preparation for potential EU integration through financial and technical aid. Undeniably, the pre-accession and post-accession strategy of Brussels in Cyprus has worsened the Cyprus problem and hindered the peace process.
The time has come for the international community to formally acknowledge the existence of the TRNC.

Relevance: 30.00%

Abstract:

Incidental findings on low-dose CT images obtained during hybrid imaging are an increasing phenomenon as CT technology advances. Understanding the diagnostic value of incidental findings along with the technical limitations is important when reporting image results and recommending follow-up, which may result in an additional radiation dose from further diagnostic imaging and an increase in patient anxiety. This study assessed lesions incidentally detected on CT images acquired for attenuation correction on two SPECT/CT systems. Methods: An anthropomorphic chest phantom containing simulated lesions of varying size and density was imaged on an Infinia Hawkeye 4 and a Symbia T6 using the low-dose CT settings applied for attenuation correction acquisitions in myocardial perfusion imaging. Twenty-two interpreters assessed 46 images from each SPECT/CT system (15 normal images and 31 abnormal images; 41 lesions). Data were evaluated using a jackknife alternative free-response receiver-operating-characteristic analysis (JAFROC). Results: JAFROC analysis showed a significant difference (P < 0.0001) in lesion detection, with the figures of merit being 0.599 (95% confidence interval, 0.568, 0.631) and 0.810 (95% confidence interval, 0.781, 0.839) for the Infinia Hawkeye 4 and Symbia T6, respectively. Lesion detection on the Infinia Hawkeye 4 was generally limited to larger, higher-density lesions. The Symbia T6 allowed improved detection rates for midsized lesions and some lower-density lesions. However, interpreters struggled to detect small (5 mm) lesions on both image sets, irrespective of density. Conclusion: Lesion detection is more reliable on low-dose CT images from the Symbia T6 than from the Infinia Hawkeye 4. This phantom-based study gives an indication of potential lesion detection in the clinical context as shown by two commonly used SPECT/CT systems, which may assist the clinician in determining whether further diagnostic imaging is justified.
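A simplified, nonparametric sketch of a JAFROC-style figure of merit, taken here as the probability that a lesion rating exceeds the highest non-lesion rating on a normal image (ties counted as one half); the ratings are invented, and the full JAFROC analysis used in the study additionally jackknifes over readers and cases.

```python
import numpy as np

# confidence ratings given by one interpreter -- hypothetical values
lesion_ratings = np.array([4, 3, 5, 2, 4, 1, 3])      # one rating per true lesion (0 if missed)
max_nl_on_normals = np.array([0, 2, 1, 0, 3, 0])      # highest false-positive rating per normal image

def jafroc_like_fom(lesions, normals):
    # Wilcoxon-style estimate: fraction of (lesion, normal image) pairs the lesion "wins"
    wins = (lesions[:, None] > normals[None, :]).sum()
    ties = (lesions[:, None] == normals[None, :]).sum()
    return (wins + 0.5 * ties) / (lesions.size * normals.size)

print("figure of merit:", jafroc_like_fom(lesion_ratings, max_nl_on_normals))
```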

Relevance: 30.00%

Abstract:

Background: Financial abuse of elders is an under-acknowledged problem, and professionals' judgements contribute to both the prevalence of abuse and the ability to prevent and intervene. In the absence of a definitive "gold standard" for the judgement, it is desirable to try and bring novice professionals' judgemental risk thresholds to the level of competent professionals as quickly and effectively as possible. This study aimed to test whether a training intervention was able to bring novices' risk thresholds for financial abuse in line with expert opinion. Methods: A signal detection analysis, within a randomised controlled trial of an educational intervention, was undertaken to examine the effect on the ability of novices to efficiently detect financial abuse. Novices (n = 154) and experts (n = 33) judged "certainty of risk" across 43 scenarios; whether a scenario constituted a case of financial abuse or not was a function of expert opinion. Novices (n = 154) were randomised either to receive an on-line educational intervention to improve financial abuse detection (n = 78) or to a control group (no on-line educational intervention, n = 76). Both groups examined 28 scenarios (11 "signal" scenarios of risk and 17 "noise" scenarios of no risk). After the intervention group had received the on-line training, both groups then examined 15 further scenarios (5 "signal" and 10 "noise" scenarios). Results: Experts were more certain than the novices both pre-intervention (mean 70.61 vs. 58.04) and post-intervention (mean 70.84 vs. 63.04), and were more consistent. The intervention group (mean 64.64) were more certain of abuse post-intervention than the control group (mean 61.41, p = 0.02). Signal detection analysis of sensitivity (d′) and bias (C) revealed that this was due to the intervention shifting the novices' tendency towards saying "at risk" (post-intervention C = -0.34) and away from their pre-intervention level of bias (C = -0.12). Receiver operating characteristic curves revealed more efficient judgements in the intervention group. Conclusion: An educational intervention can improve judgements of financial abuse amongst novice professionals.
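The signal detection indices mentioned above, sensitivity d′ and bias C, follow directly from the hit and false-alarm rates; the response counts below are assumed for illustration and match only the 11 signal / 17 noise scenario split of the first block.

```python
from scipy.stats import norm

hits, n_signal = 8, 11          # "at risk" responses to at-risk scenarios (assumed counts)
false_alarms, n_noise = 6, 17   # "at risk" responses to no-risk scenarios (assumed counts)

h = hits / n_signal
f = false_alarms / n_noise
z = norm.ppf                    # inverse of the standard normal CDF

d_prime = z(h) - z(f)                # sensitivity: separation between signal and noise
criterion_c = -0.5 * (z(h) + z(f))   # bias: negative values = liberal ("at risk") tendency
print(f"d' = {d_prime:.2f}, C = {criterion_c:.2f}")
```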

Relevance: 30.00%

Abstract:

Purpose: The purpose of this study was to develop and validate a multivariate predictive model to detect glaucoma by using a combination of retinal nerve fiber layer (RNFL), retinal ganglion cell-inner plexiform layer (GCIPL), and optic disc parameters measured using spectral-domain optical coherence tomography (OCT). Methods: Five hundred eyes from 500 participants and 187 eyes of another 187 participants were included in the study and validation groups, respectively. Patients with glaucoma were classified into five groups based on visual field damage. Sensitivity and specificity of all glaucoma OCT parameters were analyzed. Receiver operating characteristic (ROC) curves and areas under the ROC curve (AUC) were compared. Three predictive multivariate models (quantitative, qualitative, and combined) that used a combination of the best OCT parameters were constructed. A diagnostic calculator was created using the combined multivariate model. Results: The parameters with the best AUCs were: inferior RNFL, average RNFL, vertical cup/disc ratio, minimal GCIPL, and inferior-temporal GCIPL. Comparisons among the parameters did not show that the GCIPL parameters were better than those of the RNFL in early and advanced glaucoma. The highest AUC was in the combined predictive model (0.937; 95% confidence interval, 0.911–0.957) and was significantly (P = 0.0001) higher than that of the other isolated parameters considered in early and advanced glaucoma. The validation group displayed similar results to those of the study group. Conclusions: The best GCIPL, RNFL, and optic disc parameters showed a similar ability to detect glaucoma. The combined predictive formula improved glaucoma detection compared to the best isolated parameters evaluated. The diagnostic calculator achieved good classification of participants in both the study and validation groups.
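A sketch of a combined multivariate model of the kind described above: a logistic regression over a few OCT parameters, with its AUC compared against the best single parameter. The parameter values are simulated, and unlike the study this sketch has no separate validation group.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
n = 500
glaucoma = rng.random(n) < 0.5
# simulated parameters: thinner RNFL/GCIPL and larger cup/disc ratio in glaucoma
inferior_rnfl = rng.normal(np.where(glaucoma, 95, 125), 15)
minimal_gcipl = rng.normal(np.where(glaucoma, 65, 80), 8)
vertical_cd = rng.normal(np.where(glaucoma, 0.7, 0.45), 0.1)

X = np.column_stack([inferior_rnfl, minimal_gcipl, vertical_cd])
model = LogisticRegression(max_iter=1000).fit(X, glaucoma)
p_combined = model.predict_proba(X)[:, 1]

# in-sample comparison only; a held-out validation group is needed in practice
print("AUC, inferior RNFL alone:", roc_auc_score(glaucoma, -inferior_rnfl))
print("AUC, combined model     :", roc_auc_score(glaucoma, p_combined))
```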

Relevance: 20.00%

Abstract:

The Fourier transform infrared (FT-IR) signature of dry samples of DNA and DNA-polypeptide complexes, as studied by IR microspectroscopy using a diamond attenuated total reflection (ATR) objective, has revealed important discriminatory characteristics relative to the PO2(-) stretching vibrations. However, DNA IR marks that provide information on the sample's richness in hydrogen bonds have not been resolved in the spectral profiles obtained with this objective. Here we investigated the performance of an all-reflecting objective (ARO) for analysis of the FT-IR signal of hydrogen bonds in DNA samples differing in base-richness type (salmon testis vs. calf thymus). The results obtained using the ARO indicate prominent band peaks in the spectral region representative of the vibration of nitrogenous base hydrogen bonds and of NH and NH2 groups. The band areas in this spectral region differed in agreement with the DNA base-richness type when the ARO was used. A peak assigned to adenine was more evident in the AT-rich salmon DNA using either the ARO or the ATR objective. It is concluded that, for the discrimination of DNA IR hydrogen bond vibrations associated with varying base type proportions, the use of an ARO is recommended.
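The band-area comparison described above amounts to integrating absorbance over a spectral window; a sketch with simulated spectra and an assumed N-H/hydrogen-bond stretching window (3050-3550 cm^-1) follows.

```python
import numpy as np

wavenumber = np.linspace(2800, 3800, 1000)              # cm^-1
def gaussian(center, width, height):
    return height * np.exp(-((wavenumber - center) ** 2) / (2 * width ** 2))

# hypothetical spectra with different intensity in the N-H stretching region
spectrum_at_rich = gaussian(3300, 120, 0.9) + gaussian(2950, 60, 0.3)
spectrum_gc_rich = gaussian(3300, 120, 0.7) + gaussian(2950, 60, 0.3)

band = (wavenumber >= 3050) & (wavenumber <= 3550)       # assumed integration window
area_at = np.trapz(spectrum_at_rich[band], wavenumber[band])
area_gc = np.trapz(spectrum_gc_rich[band], wavenumber[band])
print(f"band area, AT-rich spectrum: {area_at:.1f}")
print(f"band area, GC-rich spectrum: {area_gc:.1f}")
```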