865 results for assessment methods


Relevance:

30.00%

Abstract:

BACKGROUND: In a high proportion of patients with a favorable outcome after aneurysmal subarachnoid hemorrhage (aSAH), neuropsychological deficits, depression, anxiety, and fatigue are responsible for the inability to return to their regular premorbid life and pursue their professional careers. These problems often remain unrecognized, as no recommendations concerning a standardized comprehensive assessment have yet found their way into clinical routine. METHODS: To establish a nationwide standard for comprehensive assessment after aSAH, representatives of all neuropsychological and neurosurgical departments of the eight Swiss centers treating acute aSAH agreed on a common protocol. In addition, a battery of standardized questionnaires and neuropsychological tests, available in different languages and optimally suited to the deficits found most prevalent in aSAH patients, was selected. RESULTS: We propose a baseline inpatient neuropsychological screening using the Montreal Cognitive Assessment (MoCA) between days 14 and 28 after aSAH. In an outpatient setting at 3 and 12 months after the bleeding, we recommend a neuropsychological examination testing all relevant domains, including attention, speed of information processing, executive functions, verbal and visual learning/memory, language, visuo-perceptual abilities, and premorbid intelligence. In addition, a detailed assessment capturing anxiety, depression, fatigue, symptoms of frontal lobe involvement, and quality of life should be performed. CONCLUSIONS: This standardized neuropsychological assessment will lead to a more comprehensive evaluation of the patient, facilitate the detection and subsequent treatment of previously unrecognized but relevant impairments, and help to determine the incidence, characteristics, modifiable risk factors, and clinical course of these impairments after aSAH.

BACKGROUND: For free-breathing cardiovascular magnetic resonance (CMR), the self-navigation technique has recently emerged and is expected to deliver high-quality data with a high success rate. The purpose of this study was to test the hypothesis that self-navigated 3D CMR enables the reliable assessment of cardiovascular anatomy in patients with congenital heart disease (CHD) and to define factors that affect image quality. METHODS: CHD patients ≥2 years old referred for CMR for initial assessment or for a follow-up study were included and underwent free-breathing self-navigated 3D CMR at 1.5 T. Performance criteria were: correct description of cardiac segmental anatomy, overall image quality, coronary artery visibility, and reproducibility of great vessel diameter measurements. Factors associated with insufficient image quality were identified using multivariate logistic regression. RESULTS: Self-navigated CMR was performed in 105 patients (55% male, 23 ± 12 years). Correct segmental description was achieved in 93% and 96% of cases for observers 1 and 2, respectively. Diagnostic quality was obtained in 90% of examinations, and this increased to 94% if contrast-enhanced. The left anterior descending, circumflex, and right coronary arteries were visualized in 93%, 87%, and 98% of cases, respectively. Younger age, higher heart rate, lower ejection fraction, and lack of contrast medium were independently associated with reduced image quality. However, a similar rate of diagnostic image quality was obtained in children and adults. CONCLUSION: In patients with CHD, self-navigated free-breathing CMR provides high-resolution 3D visualization of the heart and great vessels with excellent robustness.

In this article, we show how the use of state-of-the-art methods in computer science based on machine perception and learning allows the unobtrusive capture and automated analysis of interpersonal behavior in real time (social sensing). Given the high ecological validity of the behavioral sensing, the ease of behavioral-cue extraction for large groups over long observation periods in the field, the possibility of investigating completely new research questions, and the ability to provide people with immediate feedback on behavior, social sensing will fundamentally impact psychology.

It is axiomatic that our planet is extensively inhabited by diverse micro-organisms such as bacteria, yet the absolute diversity of bacterial species is widely held to be unknown. Different bacteria can be found from the depths of the oceans to the tops of mountains; even the air is more or less colonized by bacteria. Most bacteria are either harmless or even beneficial to human beings, but there are also bacteria that can cause severe infectious diseases or spoil supplies intended for human consumption. Therefore, it is vitally important not only to be able to detect and enumerate bacteria but also to assess their viability and possible harmfulness. Whilst the growth of bacteria is remarkably fast under optimal conditions and easy to detect by culture-based methods, most bacteria are believed to exist in the stationary phase of growth, in which actual growth has ceased and bacteria may thus be undetectable by culture-based techniques. Additionally, several injurious factors, such as low or high temperature or nutrient deficiency, can turn bacteria into a viable but non-culturable (VBNC) state that cannot be detected by culture-based methods. Therefore, various non-culture-based techniques developed for assessing bacterial viability and killing have been widely exploited in modern microbiology. However, only a few methods are suitable for kinetic measurements, which enable real-time detection of bacterial growth and viability. The present study describes alternative methods for measuring bacterial viability and killing, as well as for detecting the effects of various antimicrobial agents on bacteria, in real time. The suitability of bacterial (lux) and beetle (luc) luciferases as well as green fluorescent protein (GFP) to act as markers of bacterial viability and cell growth was tested.
In particular, a multiparameter microplate assay based on a GFP-luciferase combination, as well as a flow cytometric measurement based on a GFP-PI combination, were developed to perform distinct viability analyses. The results obtained suggest that the antimicrobial activities of various drugs against bacteria can be successfully measured using both of these methods. Specifically, the reliability of the flow cytometric viability analysis was notably improved when GFP was utilized in the assay. A fluoro-luminometric microplate assay enabled kinetic measurements, which significantly improved and accelerated the assessment of bacterial viability compared with more conventional viability assays such as plate counting. Moreover, the multiparameter assay made simultaneous detection of GFP fluorescence and luciferase bioluminescence possible and provided extensive information about multiple cellular parameters in a single assay, thereby increasing the accuracy of the assessment of the kinetics of antimicrobial activities on target bacteria.

Characterizing the geological features and structures in three dimensions over inaccessible rock cliffs is needed to assess natural hazards such as rockfalls and rockslides, and also to perform investigations aimed at mapping geological contacts and building stratigraphic and fold models. Indeed, detailed 3D data such as LiDAR point clouds make it possible to study accurately the hazard processes and the structure of geologic features, in particular in vertical and overhanging rock slopes. Thus, 3D geological models have great potential to be applied to a wide range of geological investigations, both in research and in applied geology projects such as mines, tunnels and reservoirs. Recent developments in ground-based remote sensing techniques (LiDAR, photogrammetry and multispectral / hyperspectral imaging) are revolutionizing the acquisition of morphological and geological information. As a consequence, there is great potential for improving the modeling of geological bodies, as well as of failure mechanisms and stability conditions, by integrating detailed remote data. During the past ten years, several large rockfall events occurred along important transportation corridors where millions of people travel every year (Switzerland: Gotthard motorway and railway; Canada: Sea to Sky Highway between Vancouver and Whistler). These events show that there is still a lack of knowledge concerning the detection of potential rockfalls, leaving mountain residential settlements and roads at high risk. It is necessary to understand the main factors that destabilize rocky outcrops, even where inventories are lacking and no clear morphological evidence of rockfall activity is observed. In order to increase the possibilities of forecasting potential future landslides, it is crucial to understand the evolution of rock slope stability.
Defining the areas theoretically most prone to rockfalls can be particularly useful for simulating trajectory profiles and generating hazard maps, which are the basis for land use planning in mountainous regions. The most important questions to address in order to assess rockfall hazard are: Where are the most probable sources of future rockfalls located? What are the frequencies of occurrence of these rockfalls? I characterized the fracturing patterns in the field and with LiDAR point clouds. Afterwards, I developed a model to compute the failure mechanisms on terrestrial point clouds in order to assess the susceptibility to rockfalls at the cliff scale. Similar procedures were already available to evaluate rockfall susceptibility based on aerial digital elevation models. This new model makes it possible to detect the most susceptible rockfall sources with unprecedented detail in vertical and overhanging areas. The results of the computation of the most probable rockfall source areas in the granitic cliffs of Yosemite Valley and the Mont-Blanc massif were then compared with inventoried rockfall events to validate the calculation methods. Yosemite Valley was chosen as a test area because it has particularly strong rockfall activity (about one rockfall every week), which leads to a high rockfall hazard. The west face of the Dru was also chosen for its relevant rockfall activity, and especially because it was affected by some of the largest rockfalls that occurred in the Alps during the last 10 years. Moreover, both areas were suitable because of their huge vertical and overhanging cliffs, which are difficult to study with classical methods. Limit equilibrium models were applied to several case studies to evaluate the effects of different parameters on the stability of rock slope areas. The impact of the degradation of rock bridges on the stability of large compartments in the west face of the Dru was assessed using finite element modeling.
In particular, I conducted a back-analysis of the large rockfall event of 2005 (265'000 m3) by integrating field observations of joint conditions, characteristics of the fracturing pattern and results of geomechanical tests on the intact rock. These analyses improved our understanding of the factors that influence the stability of rock compartments and were used to define the most probable future rockfall volumes at the Dru. Terrestrial laser scanning point clouds were also successfully employed to perform geological mapping in 3D, using the intensity of the backscattered signal. Another technique for obtaining vertical geological maps is to combine a triangulated TLS mesh with 2D geological maps. At El Capitan (Yosemite Valley) we built a georeferenced vertical map of the main plutonic rocks that was used to investigate the reasons for the preferential rockwall retreat rate. Additional efforts to characterize the erosion rate were made at Monte Generoso (Ticino, southern Switzerland), where I attempted to improve the estimation of long-term erosion by also taking into account the volumes of the unstable rock compartments. Eventually, the following points summarize the main outputs of my research: The new model to compute failure mechanisms and rockfall susceptibility with 3D point clouds makes it possible to define accurately the most probable rockfall source areas at the cliff scale. The analysis of the rock bridges at the Dru shows the potential of integrating detailed measurements of the fractures in geomechanical models of rock mass stability. The correction of the LiDAR intensity signal makes it possible to classify a point cloud according to rock type and then use this information to model complex geologic structures. The integration of these results on rock mass fracturing and composition with existing methods can improve rockfall hazard assessments and enhance the interpretation of the evolution of steep rock slopes.
-- Characterizing the geology of inaccessible rock walls in 3D is a necessary step for assessing natural hazards such as rockfalls and rockslides, but also for building stratigraphic models or models of folded structures. 3D geological models have great potential for application to a wide range of geological work, both in research and in applied projects such as mines, tunnels and reservoirs. Recent developments in ground-based remote sensing tools (LiDAR, photogrammetry and multispectral / hyperspectral imagery) are revolutionizing the acquisition of geomorphological and geological information. Consequently, there is great potential for improving the modeling of geological objects, as well as of failure mechanisms and stability conditions, by integrating detailed data acquired remotely. To increase the possibilities of forecasting future rockfalls, it is fundamental to understand the current evolution of rock wall stability. Defining the zones that are theoretically most prone to rockfalls can be very useful for simulating block propagation trajectories and for producing hazard maps, which form the basis of land use planning in mountain regions. The most important questions to answer in order to estimate rockfall hazard are: Where are the most probable sources of future rockfalls located? How frequently will these events occur? I therefore characterized the fracture networks in the field and with LiDAR point clouds. I then developed a model to compute failure mechanisms directly on the point clouds in order to assess the susceptibility to rockfall initiation at the scale of the rock face.
The most probable rockfall source zones in the granitic walls of Yosemite Valley and of the Mont-Blanc massif were computed and then compared with the event inventories to verify the methods. Limit equilibrium models were applied to several case studies to evaluate the effects of different parameters on wall stability. The impact of rock bridge degradation on the stability of large rock compartments in the west face of the Petit Dru was evaluated using finite element modeling. In particular, I analyzed the large 2005 rockfall (265'000 m3), which removed the entire south-west pillar. Into the model I integrated observations of joint conditions, the characteristics of the fracture network and the results of geomechanical tests on the intact rock. These analyses improved the estimation of the parameters that influence the stability of rock compartments and served to define probable volumes for future rockfalls. The point clouds obtained with the terrestrial laser scanner were also successfully used to produce 3D geological maps, using the intensity of the reflected signal. Another technique for obtaining geological maps of vertical zones consists of combining a LiDAR mesh with a 2D geological map. At El Capitan (Yosemite Valley) we were able to georeference a vertical map of the main plutonic rocks, which I then used to study the reasons for the preferential erosion of certain zones of the wall. Further efforts to quantify the erosion rate were made at Monte Generoso (Ticino, Switzerland), where I tried to improve the estimation of long-term erosion by taking into account the volumes of the unstable rock compartments.
The integration of these results, on the fracturing and composition of the rock mass, with existing methods improves the assessment of rockfall hazard and increases the possibilities for interpreting the evolution of rock walls.
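A core geometric step behind computing failure mechanisms directly on terrestrial point clouds is recovering the orientation (dip and dip direction) of discontinuity planes from local point neighbourhoods. The sketch below is a minimal, hypothetical version of that step, not the thesis's actual model; it assumes coordinates with x = east, y = north, z = up:

```python
import math

import numpy as np

def plane_orientation(points):
    """Dip and dip direction (degrees, geology convention) of the
    least-squares plane through a neighbourhood of 3D points.
    Assumes coordinates are x = east, y = north, z = up."""
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    # The plane normal is the right singular vector associated with the
    # smallest singular value (equivalently, the covariance eigenvector
    # with the smallest eigenvalue).
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    if normal[2] < 0:                      # orient the normal upward
        normal = -normal
    dip = math.degrees(math.acos(min(1.0, normal[2])))
    # The horizontal component of the upward normal points down-dip,
    # so its azimuth (from north, clockwise) is the dip direction.
    dip_direction = math.degrees(math.atan2(normal[0], normal[1])) % 360.0
    return dip, dip_direction

# A synthetic patch dipping 45 degrees toward the east (azimuth 090).
patch = [(0, 0, 0), (1, 0, -1), (0, 1, 0), (1, 1, -1)]
dip, dip_dir = plane_orientation(patch)
```

Applied to every point's neighbourhood, such orientations can then be tested against kinematic criteria (planar sliding, wedge, toppling) to flag susceptible source areas.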

Human biomonitoring (HBM) is an effective tool for assessing actual exposure to chemicals that takes into account all routes of intake. Although hair analysis is considered to be an optimal biomarker for assessing mercury exposure, the lack of harmonization as regards sampling and analytical procedures has often limited the comparison of data at national and international level. The European-funded projects COPHES and DEMOCOPHES developed and tested a harmonized European approach to human biomonitoring in response to the European Environment and Health Action Plan. Herein we describe the quality assurance program (QAP) for assessing mercury levels in hair samples from more than 1800 mother-child pairs recruited in 17 European countries. To ensure the comparability of the results, standard operating procedures (SOPs) for sampling and for mercury analysis were drafted and distributed to participating laboratories. Training sessions were organized for field workers, and four external quality-assessment exercises (ICI/EQUAS), each followed by a corresponding web conference, were organized between March 2011 and February 2012. The ICI/EQUAS exercises used native hair samples at two mercury concentration ranges (0.20-0.71 and 0.80-1.63) per exercise. The results revealed relative standard deviations of 7.87-13.55% and 4.04-11.31% for the low and high mercury concentration ranges, respectively. A total of 16 out of 18 participating laboratories met the QAP requirements and were allowed to analyze samples from the DEMOCOPHES pilot study. The web conferences held after each ICI/EQUAS proved to be a new and effective tool for improving analytical performance and increasing capacity building. The procedure developed and tested in COPHES/DEMOCOPHES would be optimal for application on a global scale as regards implementation of the Minamata Convention on Mercury.
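The inter-laboratory spread reported for each ICI/EQUAS round is a relative standard deviation. As a small illustration of the statistic (the laboratory values below are invented, not DEMOCOPHES results):

```python
import statistics

def relative_std_dev(values):
    """Relative standard deviation (coefficient of variation) in percent,
    as used to summarise inter-laboratory agreement in one EQUAS round."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical mercury results reported by participating laboratories
# for one native hair sample in the low concentration range.
lab_results = [0.48, 0.52, 0.50, 0.55, 0.46, 0.51, 0.49]
rsd = relative_std_dev(lab_results)
```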

The number of qualitative research methods has grown substantially over the last twenty years, both in the social sciences and, more recently, in the health sciences. This growth came with questions about the quality criteria needed to evaluate such work, and numerous guidelines were published. These guidelines, however, contain many discrepancies, both in their vocabulary and in their construction. Many expert evaluators decry the absence of consensual and reliable evaluation tools. The authors present the results of an evaluation of 58 existing guidelines in 4 major health science fields (medicine and epidemiology; nursing and health education; social sciences and public health; psychology/psychiatry, research methods and organization) by expert users (article reviewers, experts allocating funds, editors, etc.). The results propose a toolbox containing 12 consensual criteria with the definitions given by expert users. They also indicate in which disciplinary field each type of criterion is considered more or less essential. Nevertheless, the authors highlight the limits of the criteria's comparability as soon as one focuses on their specific definitions. They conclude that each criterion in the toolbox must be made explicit in order to reach a broader consensus and to identify definitions that are consensual across all the fields examined and easily operationalized.

Objective: Independently of total caloric intake, a better quality of the diet (for example, conformity to the Mediterranean diet) is associated with lower obesity risk. It is unclear whether a brief dietary assessment tool, instead of full-length comprehensive methods, can also capture this association. In addition to reduced costs, a brief tool has the interesting advantage of allowing immediate feedback to participants in interventional studies. Another relevant question is which individual items of such a brief tool are responsible for this association. We examined these associations using a 14-item tool of adherence to the Mediterranean diet as exposure and body mass index, waist circumference and waist-to-height ratio (WHtR) as outcomes. Design: Cross-sectional assessment of all participants in the "PREvención con DIeta MEDiterránea" (PREDIMED) trial. Subjects: 7,447 participants (55-80 years, 57% women) free of cardiovascular disease, but with either type 2 diabetes or ≥3 cardiovascular risk factors. Trained dietitians used both a validated 14-item questionnaire and a full-length validated 137-item food frequency questionnaire to assess dietary habits. Trained nurses measured weight, height and waist circumference. Results: Strong inverse linear associations between the 14-item tool and all adiposity indexes were found. For a two-point increment in the 14-item score, the multivariable-adjusted differences in WHtR were −0.0066 (95% confidence interval, −0.0088 to −0.0049) for women and −0.0059 (−0.0079 to −0.0038) for men. The multivariable-adjusted odds ratio for a WHtR >0.6 in participants scoring ≥10 points versus ≤7 points was 0.68 (0.57 to 0.80) for women and 0.66 (0.54 to 0.80) for men. High consumption of nuts and low consumption of sweetened/carbonated beverages presented the strongest inverse associations with abdominal obesity.
Conclusions: A brief 14-item tool was able to capture a strong monotonic inverse association between adherence to a good quality dietary pattern (Mediterranean diet) and obesity indexes in a population of adults at high cardiovascular risk.

This chapter presents possible uses and examples of Monte Carlo methods for the evaluation of uncertainties in the field of radionuclide metrology. The method is already well documented in GUM supplement 1, but here we present a more restrictive approach, where the quantities of interest calculated by the Monte Carlo method are estimators of the expectation and standard deviation of the measurand, and the Monte Carlo method is used to propagate the uncertainties of the input parameters through the measurement model. This approach is illustrated by an example of the activity calibration of a 103Pd source by liquid scintillation counting and the calculation of a linear regression on experimental data points. An electronic supplement presents some algorithms which may be used to generate random numbers with various statistical distributions, for the implementation of this Monte Carlo calculation method.
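The restrictive approach described above (draw each input from its assigned distribution, evaluate the measurement model, and take the mean and standard deviation of the outputs as estimators of the measurand's expectation and standard deviation) can be sketched in a few lines. The measurement model and numerical values below are illustrative assumptions, not the chapter's 103Pd calibration data:

```python
import random
import statistics

def monte_carlo_propagate(model, inputs, n=100_000, seed=1):
    """Propagate input uncertainties through `model` by Monte Carlo.

    `inputs` is a sequence of (mean, standard_uncertainty) pairs; each
    trial draws every input from a Gaussian and evaluates the model, so
    the mean and standard deviation of the outputs estimate the
    expectation and combined standard uncertainty of the measurand.
    """
    rng = random.Random(seed)
    outputs = [
        model(*(rng.gauss(mu, u) for mu, u in inputs))
        for _ in range(n)
    ]
    return statistics.mean(outputs), statistics.stdev(outputs)

# Hypothetical model: activity = count rate / (efficiency * mass).
mean_act, u_act = monte_carlo_propagate(
    lambda rate, eff, mass: rate / (eff * mass),
    [(5000.0, 25.0),    # net counting rate (1/s)
     (0.95, 0.01),      # detection efficiency
     (1.000, 0.002)],   # aliquot mass (g)
)
```

The same loop propagates uncertainty through any model, including the slope and intercept of a least-squares fit, by re-fitting inside each trial.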

INTRODUCTION: Two important risk factors for abnormal neurodevelopment are preterm birth and neonatal hypoxic ischemic encephalopathy. The new revisions of the Griffiths Mental Development Scale (Griffiths-II [1996]) and the Bayley Scales of Infant Development (BSID-II [1993]) are two of the most frequently used developmental diagnostic tests. The Griffiths-II is divided into five subscales and a global development quotient (QD), and the BSID-II is divided into two scales, the Mental scale (MDI) and the Psychomotor scale (PDI). The main objective of this research was to establish the extent to which developmental diagnoses obtained using the new revisions of these two tests are comparable for a given child. MATERIAL AND METHODS: Retrospective study of 18-month-old high-risk children examined with both tests in the follow-up unit of the Clinic of Neonatology of our tertiary care university hospital between 2011 and 2012. To determine the concurrent validity of the two tests, paired t-tests and Pearson product-moment correlation coefficients were computed. Using the BSID-II as a gold standard, the performance of the Griffiths-II was analyzed with receiver operating characteristic curves. RESULTS: 61 patients (80.3% preterm, 14.7% neonatal asphyxia) were examined. For the BSID-II, the MDI mean was 96.21 (range 67-133) and the PDI mean was 87.72 (range 49-114). For the Griffiths-II, the QD mean was 96.95 (range 60-124) and the locomotor subscale mean was 92.57 (range 49-119). The score of the Griffiths locomotor subscale was significantly higher than the PDI (p<0.001). Between the Griffiths-II QD and the BSID-II MDI no significant difference was found, and the area under the curve was 0.93, showing good validity. All correlations were high and significant, with Pearson product-moment correlation coefficients >0.8. CONCLUSIONS: The meaning of the results for a given child was the same for the two tests. Two scores, the Griffiths-II QD and the BSID-II MDI, were interchangeable.

PURPOSE: Iterative algorithms introduce new challenges in the field of image quality assessment. The purpose of this study is to use a mathematical model observer to objectively evaluate low-contrast detectability in CT. MATERIALS AND METHODS: A QRM 401 phantom containing 5 and 8 mm diameter spheres with contrast levels of 10 and 20 HU was used. The images were acquired at 120 kV with CTDIvol equal to 5, 10, 15 and 20 mGy and reconstructed using the filtered back-projection (FBP), adaptive statistical iterative reconstruction 50% (ASIR 50%) and model-based iterative reconstruction (MBIR) algorithms. The model observer used is the Channelized Hotelling Observer (CHO) with dense difference-of-Gaussian (D-DOG) channels. The CHO performances were compared with the outcomes of six human observers who performed four-alternative forced choice (4-AFC) tests. RESULTS: For the same CTDIvol level, and according to the CHO model, the MBIR algorithm gives the highest detectability index. The outcomes of the human observers and the results of the CHO are highly correlated whatever the dose levels, signals considered and algorithms used, when some noise is added to the CHO model. The Pearson coefficient between the human observers and the CHO is 0.93 for FBP and 0.98 for MBIR. CONCLUSION: The human observers' performance can be predicted by the CHO model. This opens the way to proposing, alongside the standard dose report, the expected level of low-contrast detectability. The introduction of iterative reconstruction requires such an approach to ensure that dose reduction does not impair diagnosis.
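In outline, a CHO reduces each image to a handful of channel outputs and computes a Hotelling detectability index in that low-dimensional space. The sketch below substitutes synthetic Gaussian-noise patches and simple difference-of-Gaussian channels for the study's reconstructed CT images and D-DOG channels, so the numbers are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 16x16 ROIs: Gaussian noise, with a faint disc added to the
# signal-present class (stand-ins for the reconstructed CT patches).
yy, xx = np.mgrid[:16, :16]
disc = np.where((yy - 8) ** 2 + (xx - 8) ** 2 <= 16, 2.0, 0.0).ravel()
n_imgs = 500
absent = rng.normal(0.0, 5.0, (n_imgs, 256))
present = rng.normal(0.0, 5.0, (n_imgs, 256)) + disc

# Radial difference-of-Gaussian channels (a simple stand-in for D-DOG).
r = np.hypot(yy - 8, xx - 8).ravel()
widths = (1.0, 2.0, 4.0, 8.0)
U = np.stack([np.exp(-r**2 / (2 * (1.67 * w) ** 2))
              - np.exp(-r**2 / (2 * w**2)) for w in widths], axis=1)

# Channel outputs, class-mean difference and pooled channel covariance.
v_a, v_p = absent @ U, present @ U
dv = v_p.mean(axis=0) - v_a.mean(axis=0)
K = 0.5 * (np.cov(v_a, rowvar=False) + np.cov(v_p, rowvar=False))

# Hotelling template and detectability index d'.
w_hot = np.linalg.solve(K, dv)
d_prime = float(np.sqrt(dv @ w_hot))
```

Internal noise can be modeled by inflating K before solving, which is how model predictions are typically brought closer to human 4-AFC performance.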

INTRODUCTION: This article is part of a research study on the organization of primary health care (PHC) for mental health in two of Quebec's remote regions. It introduces a methodological approach, based on information found in health records, for assessing the quality of PHC offered to people suffering from depression or anxiety disorders. METHODS: Quality indicators were identified from the evidence, and case studies were reconstructed using data collected in health records over a 2-year observation period. Data collection was developed using a three-step iterative process: (1) feasibility analysis, (2) development of a data collection tool, and (3) application of the data collection method. The adaptation of quality-of-care indicators to remote regions was appraised according to their relevance, measurability and construct validity in this context. RESULTS: As a result of this process, 18 quality indicators were shown to be relevant, measurable and valid for establishing a critical quality appraisal of four recommended dimensions of PHC clinical processes: recognition, assessment, treatment and follow-up. CONCLUSIONS: Health records are not only of interest for assessing the quality of PHC for mental health in remote regions; the rigorous and meticulous methodological approach developed in this study also has scientific value. From the perspective of stakeholders in the PHC system of care in remote areas, the quality indicators are credible and offer potential for transferability to other contexts. This study provides information that can help identify gaps in care and implement solutions adapted to the context.

OBJECTIVE: To determine the number of punctures in fine-needle aspiration biopsies required for a safe cytological analysis of thyroid nodules. MATERIALS AND METHODS: Cross-sectional study with a focus on diagnosis. The study population included 94 patients. RESULTS: The mean age of the patients participating in the study was 52 years (standard deviation = 13.7) and 90.4% of them were women. Considering each puncture as an independent event, the first puncture showed conclusive results in 78.7% of cases, the second in 81.6%, and the third in 71.8% of cases. Considering the cumulative chance of a conclusive diagnosis with each new puncture, two punctures showed at least one conclusive result in 89.5% of cases, and three punctures in 90.6% of cases. CONCLUSION: Two punctures in fine-needle aspiration biopsies of thyroid nodules led to a diagnosis in 89.5% of cases in the study sample, suggesting that there is no need for multiple punctures to safely obtain the diagnosis of thyroid nodules.
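The cumulative figures above follow from the per-puncture rates when, as the authors do, each puncture is treated as an independent event. A small sketch of that calculation:

```python
def cumulative_conclusive(rates):
    """Probability that at least one of several punctures is conclusive,
    treating each puncture as an independent event with the given
    per-puncture conclusive rate."""
    p_all_inconclusive = 1.0
    yields = []
    for p in rates:
        p_all_inconclusive *= (1.0 - p)
        yields.append(1.0 - p_all_inconclusive)
    return yields

# Per-puncture conclusive rates reported for the study sample.
yields = cumulative_conclusive([0.787, 0.816, 0.718])
```

Under strict independence, two punctures would already be conclusive in about 96% of cases; the 89.5% actually observed is lower, suggesting that puncture outcomes within the same nodule are positively correlated.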

Objective To investigate superior mesenteric artery flow measurement by Doppler ultrasonography as a means of characterizing inflammatory activity in Crohn's disease. Materials and Methods Forty patients were examined and divided into two groups – disease activity and remission – according to their Crohn's disease activity index score. The mean superior mesenteric artery flow volume was calculated for each group and correlated with the Crohn's disease activity index score. Results The mean superior mesenteric artery flow volume was significantly greater in the patients with active disease (626 ± 236 mL/min vs. 376 ± 190 mL/min; p = 0.001). When a cutoff of 500 mL/min was utilized, the superior mesenteric artery flow volume demonstrated a sensitivity of 83% and a specificity of 82% for the diagnosis of Crohn's disease activity. Conclusion The present results suggest that patients with active Crohn's disease have increased superior mesenteric artery flow volume as compared with patients in remission. Superior mesenteric artery flow measurement performed well in the assessment of disease activity in this study sample.
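With individual flow volumes in hand, the reported sensitivity and specificity follow from counting patients on each side of the cutoff. The flow values below are invented for the sketch, since per-patient data are not given in the abstract:

```python
def sens_spec(flows_active, flows_remission, cutoff):
    """Sensitivity/specificity of a flow-volume cutoff for active disease:
    flow above the cutoff is read as a positive (active) test."""
    tp = sum(f > cutoff for f in flows_active)
    tn = sum(f <= cutoff for f in flows_remission)
    return tp / len(flows_active), tn / len(flows_remission)

# Illustrative flow volumes (mL/min), invented for this sketch.
active = [720, 510, 640, 455, 830, 590]
remission = [380, 460, 290, 520, 410, 350]
sens, spec = sens_spec(active, remission, cutoff=500)
```

Sweeping the cutoff over its range and recording each (sensitivity, 1 − specificity) pair would trace the ROC curve from which a threshold such as 500 mL/min is chosen.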

Objective To compare automatic and manual measurements of the intima-media complex (IMC) in the common carotid, common femoral and right subclavian arteries of HIV-infected patients in relation to a control group, taking into consideration the classical risk factors for atherosclerosis. Materials and Methods The study sample comprised 70 HIV-infected patients and 70 non-HIV-infected controls paired according to sex and age. Automatic (gold standard) and manual measurements of the IMC were performed in the carotid arteries. Manual measurements were also performed in the common femoral and right subclavian arteries. Bland-Altman plots were utilized in the comparison, and the adopted significance level was 5%. Results Intima-media complex alterations were not observed in any of the individuals when the mean automatic measurement in the right common carotid (RCC) artery was considered the gold standard. When the gold standard was compared with the manual measurements (mean, maximum and minimum), no clinically significant alteration was observed. When the gold standard was compared with other sites, the difference was statistically and clinically significant at the origin of the right subclavian artery (RCC: 0.51 mm vs. 0.91 mm) (p < 0.001). Conclusion HIV-infected individuals are not at higher risk for atherosclerosis than the control population.
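Bland-Altman analysis, as used above, plots paired differences against their means and summarizes agreement by the bias and its 95% limits. A minimal sketch with invented paired IMC readings (not the study's measurements):

```python
import statistics

def bland_altman_limits(auto, manual):
    """Bland-Altman bias and 95% limits of agreement between paired
    automatic and manual IMC measurements (same units, e.g. mm)."""
    diffs = [a - m for a, m in zip(auto, manual)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Invented paired IMC readings (mm), purely to illustrate the method.
auto_mm   = [0.52, 0.48, 0.55, 0.50, 0.47, 0.53]
manual_mm = [0.50, 0.49, 0.53, 0.52, 0.46, 0.51]
bias, lower, upper = bland_altman_limits(auto_mm, manual_mm)
```

Agreement is judged clinically acceptable when the limits of agreement are narrower than the smallest IMC difference considered meaningful.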