1000 results for "myocardial biopsies images"
Abstract:
The left ventricular response to dobutamine may be quantified using tissue Doppler measurement of myocardial velocity or displacement, or 3-dimensional echocardiography to measure ventricular volume and ejection fraction. This study sought to explore the accuracy of these methods for predicting segmental and global responses to therapy. Standard dobutamine and 3-dimensional echocardiography were performed in 92 consecutive patients with abnormal left ventricular function at rest. Recovery of function was defined by comparison with follow-up echocardiography at rest 5 months later. Segments that showed improved regional function at follow-up exhibited a higher increment in peak tissue Doppler velocity with dobutamine than did nonviable segments (1.2 +/- 0.4 vs 0.3 +/- 0.2 cm/s, p = 0.001). Similarly, patients who showed a >5% improvement of ejection fraction at follow-up showed a greater displacement response to dobutamine (6.9 +/- 3.2 vs 2.1 +/- 2.3 mm, p = 0.001), as well as a greater ejection fraction response to dobutamine (9 +/- 3% vs 2 +/- 2%, p = 0.001). The optimal cutoff values for predicting subsequent recovery of function at rest were an increment of peak velocity >1 cm/s, >5 mm of displacement, and a >5% improvement of ejection fraction with low-dose dobutamine. (C) 2003 by Excerpta Medica, Inc.
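As an illustration of how the reported cutoffs could be applied, here is a minimal sketch; the function name is hypothetical, and combining the three criteria with a logical AND is an assumption, since the abstract reports them individually.

```python
def predict_recovery(delta_velocity_cm_s, displacement_mm, delta_ef_pct):
    """Apply the low-dose dobutamine cutoffs reported in the abstract.

    Returns True when all three criteria for predicting recovery of
    resting function are met. The thresholds are taken from the text;
    combining them with a logical AND is an assumption.
    """
    return (delta_velocity_cm_s > 1.0 and
            displacement_mm > 5.0 and
            delta_ef_pct > 5.0)

# Example: a 1.2 cm/s velocity increment, 6.9 mm displacement and a 9%
# ejection fraction improvement would be classified as likely to recover.
print(predict_recovery(1.2, 6.9, 9.0))  # True
```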
Abstract:
Myocardial infarction leads to compensatory ventricular remodeling. Disturbances in myocardial contractility depend on the active transport of Ca2+ and Na+, which is regulated by Na+-K+ ATPase. Inappropriate regulation of Na+-K+ ATPase activity leads to excessive loss of K+ and gain of Na+ by the cell. We determined the participation of Na+-K+ ATPase in ventricular performance early and late after myocardial infarction. Wistar rats (8-10 per group) underwent left coronary artery ligation (infarcted, Inf) or sham operation (Sham). Ventricular performance was measured 3 and 30 days after surgery using the Langendorff technique. Left ventricular systolic pressure was obtained at different ventricular diastolic pressures, at increasing extracellular Ca2+ concentrations (Ca2+e), and after low and high ouabain concentrations. The baseline coronary perfusion pressure increased 3 days after myocardial infarction and normalized by 30 days (Sham 3 = 88 ± 6; Inf 3 = 130 ± 9; Inf 30 = 92 ± 7 mmHg; P < 0.05). The inotropic response to Ca2+e and ouabain was reduced at 3 and 30 days after myocardial infarction (at Ca2+e = 1.25 mM: Sham 3 = 70 ± 3; Inf 3 = 45 ± 2; Inf 30 = 29 ± 3 mmHg; P < 0.05), while the Frank-Starling mechanism was preserved. Thus, at 3 and 30 days after myocardial infarction, ventricular Na+-K+ ATPase activity and contractility were reduced. This Na+-K+ ATPase hypoactivity may modify Na+, K+ and Ca2+ transport across the sarcolemma, resulting in ventricular dysfunction.
Abstract:
Pectus excavatum is the most common congenital deformity of the anterior thoracic wall. Its surgical correction using the Nuss procedure consists of placing a personalized convex prosthesis in a sub-sternal position to correct the deformity. The aim of this work is to replace the CT scan with ultrasound imaging for pre-operative diagnosis and pre-modeling of the prosthesis, in order to avoid exposing the patient to radiation. To accomplish this, ultrasound images are acquired along an axial plane, and a rigid registration method then computes the spatial transformation between subsequent images. These images are overlapped to reconstruct an axial plane equivalent to a CT slice. A phantom was used to conduct preliminary experiments, and the achieved results were compared with the corresponding CT data, showing that the proposed methodology is capable of creating a valid approximation of the anterior thoracic wall, which can be used to model/bend the prosthesis.
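A minimal sketch of the pairwise registration-and-overlap idea follows, using ORB features and OpenCV's similarity-transform estimator as generic stand-ins; the abstract does not specify the paper's actual registration method or fusion rule, and all names here are illustrative.

```python
import cv2
import numpy as np

def register_pair(fixed, moving):
    """Estimate a rigid-ish (similarity) transform aligning two
    subsequent ultrasound frames using ORB feature matching."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(moving, None)
    kp2, des2 = orb.detectAndCompute(fixed, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    src = np.float32([kp1[m.queryIdx].pt for m in matches])
    dst = np.float32([kp2[m.trainIdx].pt for m in matches])
    # RANSAC rejects outlier matches caused by speckle noise
    M, _ = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    return M

def stitch(frames):
    """Chain the pairwise transforms and overlap all frames on the
    first frame's canvas, approximating the reconstructed axial plane
    (a larger canvas would be needed for wide sweeps)."""
    h, w = frames[0].shape
    canvas = frames[0].astype(np.float32)
    T = np.eye(3)
    for prev, cur in zip(frames, frames[1:]):
        M = register_pair(prev, cur)
        T = T @ np.vstack([M, [0, 0, 1]])      # accumulate transforms
        warped = cv2.warpAffine(cur, T[:2], (w, h))
        canvas = np.maximum(canvas, warped)    # simple max-overlap fusion
    return canvas
```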
Abstract:
Background: Rat fetal lung explants have become an essential tool for molecular research on the regulating mechanisms of branching morphogenesis. This work presents a new methodology to accurately quantify the epithelium, the outer contour, and the peripheral airway buds of lung explants during cellular development from microscopic images. Methods: The outer contour was defined using an adaptive, multi-scale threshold algorithm whose level was automatically calculated based on an entropy maximization criterion. The inner lung epithelium was defined by a clustering procedure that groups small image regions according to the minimum description length principle and local statistical properties. Finally, the peripheral buds were counted as the branch end-points of a skeletonized image of the inner lung epithelium. Results: The time for morphometric analysis of lung branching was reduced by 98% compared with the manual method. The best results were obtained in the first two days of cellular development, with smaller standard deviations. No significant differences were found between the automatic and manual results on any culture day. Conclusions: The proposed method introduces a series of advantages related to its intuitive use and accuracy, making the technique suitable for images with different lighting characteristics and allowing a reliable comparison between different researchers.
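Two of the building blocks can be sketched concretely: entropy-maximization thresholding and bud counting on a skeleton. This assumes "entropy maximization" refers to a Kapur-style criterion on an 8-bit image; the paper's multi-scale scheme and MDL clustering step are not reproduced here.

```python
import numpy as np
from skimage.morphology import skeletonize
from scipy.ndimage import convolve

def entropy_threshold(image):
    """Kapur-style maximum-entropy threshold: choose the gray level
    that maximizes the summed entropies of the background and
    foreground histograms (single-scale stand-in, 8-bit input)."""
    hist, _ = np.histogram(image, bins=256, range=(0, 256))
    p = hist / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, 255):
        pb, pf = p[:t].sum(), p[t:].sum()
        if pb == 0 or pf == 0:
            continue
        b, f = p[:t] / pb, p[t:] / pf
        h = (-np.sum(b[b > 0] * np.log(b[b > 0]))
             - np.sum(f[f > 0] * np.log(f[f > 0])))
        if h > best_h:
            best_t, best_h = t, h
    return best_t

def count_peripheral_buds(epithelium_mask):
    """Count branch end-points of the skeletonized epithelium:
    skeleton pixels with exactly one 8-connected neighbour."""
    skel = skeletonize(epithelium_mask.astype(bool))
    neighbours = convolve(skel.astype(int), np.ones((3, 3), int),
                          mode='constant') - skel
    return int(np.sum(skel & (neighbours == 1)))
```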
Abstract:
Dental implant recognition in patients without available records is a time-consuming and far from straightforward task. The traditional method is a completely user-dependent process in which an expert compares a 2D X-ray image of the dental implant with a generic database. Given the high number of implants available and the similarity between them, automatic or semi-automatic frameworks to aid implant model detection are essential. In this study, a novel computer-aided framework for dental implant recognition is proposed. The method relies on image processing concepts, namely: (i) a segmentation strategy for semi-automatic implant delineation; and (ii) a machine learning approach for implant model recognition. Although the segmentation technique is the main focus of the current study, preliminary details of the machine learning approach are also reported. Two different scenarios were used to validate the framework: (1) comparison of the semi-automatic contours against manual contours of the implants in 125 X-ray images; and (2) classification of 11 known implants using a large reference database of 601 implants. In experiment 1, a Dice coefficient of 0.97±0.01, a mean absolute distance of 2.24±0.85 pixels, and a Hausdorff distance of 11.12±6 pixels were obtained. In experiment 2, 91% of the implants were successfully recognized while the reference database was reduced to 5% of its original size. Overall, the segmentation technique achieved accurate implant contours. Although the preliminary classification results prove the concept, more features and an extended database should be used in future work.
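The three reported contour metrics have standard definitions; a minimal sketch follows (the bidirectional averaging used for the mean absolute distance is an assumption, as the abstract does not state it).

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def dice(a, b):
    """Dice overlap between two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def hausdorff(contour_a, contour_b):
    """Symmetric Hausdorff distance between two contours,
    given as (N, 2) arrays of pixel coordinates."""
    return max(directed_hausdorff(contour_a, contour_b)[0],
               directed_hausdorff(contour_b, contour_a)[0])

def mean_absolute_distance(contour_a, contour_b):
    """Average distance from each point of one contour to the
    closest point of the other, averaged in both directions."""
    d = np.linalg.norm(contour_a[:, None] - contour_b[None], axis=-1)
    return 0.5 * (d.min(axis=1).mean() + d.min(axis=0).mean())
```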
Abstract:
In daily cardiology practice, assessment of left ventricular (LV) global function using non-invasive imaging remains central to the diagnosis and follow-up of patients with cardiovascular diseases. Despite the different methodologies currently available for LV segmentation in cardiac magnetic resonance (CMR) images, fast and complete LV delineation is still of limited availability for routine use. In this study, a localized anatomically constrained affine optical flow method is proposed for fast and automatic LV tracking throughout the full cardiac cycle in short-axis CMR images. Starting from an automatically delineated LV in the end-diastolic frame, the endocardial and epicardial boundaries are propagated by estimating the motion between adjacent cardiac phases using optical flow. To reduce the computational burden, the motion is estimated only in an anatomical region of interest around the tracked boundaries and subsequently integrated into a local affine motion model. Such localized estimation makes it possible to capture complex motion patterns while remaining spatially consistent. The method was validated on 45 CMR datasets taken from the 2009 MICCAI LV segmentation challenge. The proposed approach proved robust and efficient, with an average distance error of 2.1 mm and a correlation with reference ejection fraction of 0.98 (1.9 ± 4.5%). Moreover, it proved fast, taking 5 seconds to track a full 4D dataset (30 ms per image). Overall, a novel fast, robust and accurate LV tracking methodology was proposed, enabling accurate assessment of relevant global function indices, such as volumes and ejection fraction.
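The core "flow in an ROI, summarized by a local affine model" step can be sketched as below; Farneback flow is a generic substitute for the paper's estimator, the anatomical constraints are omitted, and all names are illustrative.

```python
import cv2
import numpy as np

def track_contour(frame_prev, frame_next, contour, margin=10):
    """Propagate LV boundary points between adjacent cardiac phases:
    dense optical flow is estimated only in a region of interest (ROI)
    around the contour, fitted with a least-squares affine motion
    model, and the model is applied to the contour points."""
    x0, y0, w, h = cv2.boundingRect(contour.astype(np.int32))
    x0, y0 = max(x0 - margin, 0), max(y0 - margin, 0)
    x1, y1 = x0 + w + 2 * margin, y0 + h + 2 * margin
    roi_prev = frame_prev[y0:y1, x0:x1]
    roi_next = frame_next[y0:y1, x0:x1]
    flow = cv2.calcOpticalFlowFarneback(roi_prev, roi_next, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    # Fit the per-pixel flow (u, v) with an affine model [x y 1] @ A
    ys, xs = np.mgrid[0:flow.shape[0], 0:flow.shape[1]]
    P = np.column_stack([xs.ravel(), ys.ravel(), np.ones(xs.size)])
    A, *_ = np.linalg.lstsq(P, flow.reshape(-1, 2), rcond=None)
    # Displace the contour points by the fitted affine motion
    local = np.column_stack([contour[:, 0] - x0, contour[:, 1] - y0,
                             np.ones(len(contour))])
    return contour + local @ A
```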
Abstract:
The use of iris recognition for human authentication has been spreading in recent years. Daugman proposed a method for iris recognition composed of four stages: segmentation, normalization, feature extraction, and matching. In this paper we propose some modifications and extensions to Daugman's method to cope with noisy images. These modifications are proposed after a study of images from the CASIA and UBIRIS databases. The major modification is in the computationally demanding segmentation stage, for which we propose a faster and equally accurate template matching approach. The extensions to the algorithm address the important issue of pre-processing, which depends on the image database and is mandatory when a non-infrared camera, such as a typical webcam, is used. For this scenario, we propose methods for reflection removal and for pupil enhancement and isolation. The tests, carried out by our C# application on grayscale CASIA and UBIRIS images, show that the template matching segmentation method is more accurate and faster than the previous one for noisy images. The proposed algorithms are found to be efficient and necessary when dealing with non-infrared images and non-uniform illumination.
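A toy illustration of template-matching-based pupil localization in the spirit of the proposed segmentation stage; the template shape, radius range and score are illustrative choices, not the paper's parameters.

```python
import cv2
import numpy as np

def locate_pupil(eye_gray, radii=range(20, 61, 5)):
    """Coarse pupil localization by template matching: slide synthetic
    dark-disc templates of several radii over the grayscale eye image
    and keep the best normalized-correlation response."""
    best = (-1.0, None, None)  # (score, centre, radius)
    for r in radii:
        size = 2 * r + 1
        template = np.full((size, size), 255, np.uint8)
        cv2.circle(template, (r, r), r, 0, -1)   # dark disc on light bg
        res = cv2.matchTemplate(eye_gray, template, cv2.TM_CCOEFF_NORMED)
        _, score, _, loc = cv2.minMaxLoc(res)
        if score > best[0]:
            best = (score, (loc[0] + r, loc[1] + r), r)
    return best[1], best[2]  # centre (x, y) and radius in pixels
```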
Abstract:
The rapid growth in genetics and molecular biology, combined with the development of techniques for genetically engineering small animals, has led to increased interest in in vivo small animal imaging. Small animal imaging is applied frequently to mice and rats, which are ubiquitous in modeling human diseases and testing treatments. The use of PET in small animals allows subjects to serve as their own controls, reducing inter-animal variability; this permits longitudinal studies on the same animal and improves the accuracy of biological models. However, small animal PET still suffers from several limitations: the amounts of radiotracer needed, limited scanner sensitivity, image resolution, and image quantification issues could all clearly benefit from additional research. Because nuclear medicine imaging deals with radioactive decay, the emission of radiation energy through photons and particles, and the detection of these quanta and particles in different materials, the Monte Carlo method is an important simulation tool in both nuclear medicine research and clinical practice. In order to optimize the quantitative use of PET in clinical practice, data- and image-processing methods are also a field of intense interest and development. The evaluation of such methods often relies on simulated data and images, since these offer control of the ground truth. Monte Carlo simulations are widely used for PET simulation since they take into account all the random processes involved in PET imaging, from the emission of the positron to the detection of the photons by the detectors. Simulation techniques have thus become an important and indispensable complement for a wide range of problems that cannot be addressed by experimental or analytical approaches.
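To make the idea concrete, here is a deliberately minimal 2-D Monte Carlo sketch of PET coincidence generation; full simulators such as GATE also model positron range, photon non-collinearity, attenuation, scatter and detector response, all of which are omitted here.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_lors(n_events, ring_radius=100.0, n_detectors=360,
                  source_sigma=5.0):
    """Toy Monte Carlo of 2-D PET coincidences: annihilation points are
    sampled from a Gaussian 'source'; each emits two back-to-back
    photons at a random angle, and the pair of ring detectors hit
    defines a line of response (LOR)."""
    pts = rng.normal(0.0, source_sigma, size=(n_events, 2))
    phi = rng.uniform(0.0, np.pi, size=n_events)       # pair direction
    d = np.stack([np.cos(phi), np.sin(phi)], axis=1)
    # Line-circle intersection: t^2 + 2 t (p.d) + |p|^2 - R^2 = 0
    b = np.einsum('ij,ij->i', pts, d)
    disc = np.sqrt(b**2 - (np.einsum('ij,ij->i', pts, pts)
                           - ring_radius**2))
    hits_a = pts + (-b + disc)[:, None] * d
    hits_b = pts + (-b - disc)[:, None] * d

    def det_index(h):
        ang = np.mod(np.arctan2(h[:, 1], h[:, 0]), 2 * np.pi)
        return (ang / (2 * np.pi) * n_detectors).astype(int)

    return det_index(hits_a), det_index(hits_b)

# A 2-D histogram of detector-pair indices approximates a sinogram
ia, ib = simulate_lors(100_000)
```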
Abstract:
Fluorescence confocal microscopy (FCM) is now one of the most important tools in biomedical research. It makes it possible to accurately study the dynamic processes occurring inside the cell and its nucleus by following the motion of fluorescent molecules over time. Due to the small amount of acquired radiation and the huge optical and electronic amplification, FCM images are usually corrupted by a severe type of Poisson noise. This noise can be even more damaging when very low intensity incident radiation is used to avoid phototoxicity. In this paper, a Bayesian algorithm is proposed to remove the intensity-dependent Poisson noise corrupting FCM image sequences. The observations are organized in a 3-D tensor in which each plane is one of the images of a cell nucleus acquired over time using the fluorescence loss in photobleaching (FLIP) technique. The method removes the noise while simultaneously taking different spatial and temporal correlations into account. This is accomplished by using an anisotropic 3-D filter that may be tuned separately in the space and time dimensions. Tests using synthetic and real data are presented to illustrate the application of the algorithm, together with a comparison against several state-of-the-art algorithms.
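The Bayesian estimator itself cannot be reconstructed from the abstract, but a classical baseline with the same "separately tunable in space and time" flavor can be sketched with an Anscombe transform plus an anisotropic Gaussian filter; this is a stand-in, not the paper's algorithm.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def denoise_flip_stack(stack, sigma_space=1.0, sigma_time=2.0):
    """Baseline Poisson denoiser for a (time, y, x) FLIP image stack:
    the Anscombe transform stabilizes the Poisson variance, a Gaussian
    filter with independently tunable temporal and spatial widths does
    the anisotropic space/time smoothing, and the transform is then
    inverted (the simple algebraic inverse used here is slightly
    biased at low counts)."""
    v = 2.0 * np.sqrt(stack + 3.0 / 8.0)                 # Anscombe
    v = gaussian_filter(v, sigma=(sigma_time, sigma_space, sigma_space))
    return np.maximum((v / 2.0) ** 2 - 3.0 / 8.0, 0.0)   # inverse
```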
Abstract:
The attached document is the post-print version (the version corrected by the publisher).
Abstract:
Myocardial perfusion gated SPECT (MP-gated-SPECT) imaging often shows radiotracer uptake in abdominal organs. This accumulation frequently interferes with qualitative and quantitative assessment of the infero-septal region of the myocardium. The objective of this study was to evaluate the effect of ingesting food of different fat content on the reduction of extra-myocardial uptake, in order to improve MP-gated-SPECT image quality. In this study, 150 patients (65 ± 18 years) referred for MP-gated-SPECT underwent a 1-day protocol including imaging after stress (physical or pharmacological) and at rest. All patients gave written informed consent. Patients were subdivided into five groups: GI, GII, GIII, GIV and GV. In the first four groups, patients ate two chocolate bars of different fat content; patients in GV, the control group (CG), had just water. Uptake indices (UI) of myocardium (M)/liver (L) and M/stomach-proximal bowel (S) revealed a lower UI of M/S at rest in all groups. Both stress and rest studies with different food intake indicate that patients who ate chocolate of any fat content showed a better UI of M/L than the CG. The UI of M/L and M/S of the groups studied under physical stress were clearly superior to those of the groups studied under pharmacological stress; these differences were only significant in patients who ate high-fat chocolate or drank water. The analysis of all stress studies together (GI, GII, GIII and GIV) in comparison with the CG shows higher mean ranks of UI of M/L for those who ate high-fat chocolate. After pharmacological stress, the mean ranks of UI of M/L were higher for patients who ate high- or low-fat chocolate. In conclusion, eating food with fat content after radiotracer injection increases the UI of M/L after stress and at rest in MP-gated-SPECT studies. It is therefore recommended that patients eat a chocolate bar after radiotracer injection and before image acquisition.
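For clarity, an uptake index of this kind is a simple ratio of counts between regions; a sketch follows, noting that the exact ROI definition and normalization used in the study are not given in the abstract, so mean counts per pixel is an assumption.

```python
import numpy as np

def uptake_index(image, roi_myocardium, roi_organ):
    """Uptake index (UI) of myocardium over an adjacent organ,
    computed as the ratio of mean counts inside boolean ROI masks
    (e.g. liver for UI of M/L, stomach-proximal bowel for M/S)."""
    return image[roi_myocardium].mean() / image[roi_organ].mean()
```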
Abstract:
OBJECTIVE: Myocardial infarction is an acute and severe cardiovascular disease that generally leads to admission to an intensive care unit; few cases are initially admitted to infirmaries. The objective of this study was to assess whether estimates of the effects of air pollution on myocardial infarction morbidity are modified by the source of health information. METHODS: The study was carried out in hospitals of the Brazilian Health System in the city of São Paulo, Southeastern Brazil. A time series study (1998-1999) was performed using two outcomes: infarction admissions to infirmaries and to intensive care units, both for people older than 64 years of age. Generalized linear models controlling for seasonality (long- and short-term trends) and weather were used. The eight-day cumulative effects of air pollutants were assessed using third-degree polynomial distributed lag models. RESULTS: Almost 70% of daily hospital admissions due to myocardial infarction were to infirmaries. Despite that, the effects of air pollutants on infarction were higher for intensive care unit admissions. All pollutants were positively associated with the study outcomes, but SO2 presented the strongest statistically significant association. An interquartile range increase in SO2 concentration was associated with increases of 13% (95% CI: 6-19) and 8% (95% CI: 2-13) in intensive care unit and infirmary infarction admissions, respectively. CONCLUSIONS: A misclassification of myocardial infarction admissions to infirmaries, leading to overestimation, may be assumed. Moreover, despite the smaller absolute number of events, intensive care unit admission data provide a more adequate estimate of the magnitude of air pollution effects on infarction admissions.
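As an illustration of the lag modeling: a third-degree polynomial distributed lag (PDL) constrains the eight daily lag coefficients to a cubic in the lag index, so the Poisson GLM only needs four transformed regressors. The sketch below uses simulated inputs and omits the study's seasonality and weather terms.

```python
import numpy as np
import statsmodels.api as sm

def pdl_basis(x, max_lag=7, degree=3):
    """PDL basis: with beta_l constrained to a degree-3 polynomial
    in lag l, the model reduces to regressors z_j = sum_l l^j x_{t-l},
    j = 0..degree."""
    n = len(x)
    lagged = np.column_stack([np.r_[np.full(l, np.nan), x[:n - l]]
                              for l in range(max_lag + 1)])
    return np.column_stack([lagged @ (np.arange(max_lag + 1) ** j)
                            for j in range(degree + 1)])

# Simulated stand-ins for daily SO2 levels and admission counts
so2 = np.random.gamma(4.0, 5.0, 730)
y = np.random.poisson(5, 730)
X = sm.add_constant(pdl_basis(so2))
ok = ~np.isnan(X).any(axis=1)          # drop rows lacking full lags
fit = sm.GLM(y[ok], X[ok], family=sm.families.Poisson()).fit()
```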
Abstract:
Introduction: A major focus of the data mining process - especially in machine learning research - is to automatically learn to recognize complex patterns and to help make adequate decisions based strictly on the acquired data. Since imaging techniques such as myocardial perfusion imaging (MPI) in nuclear cardiology can account for a large part of the daily workflow and generate gigabytes of data, computerized analysis of the data may offer advantages over human analysis: shorter time, homogeneity and consistency, automatic recording of analysis results, relatively low expense, etc. Objectives: The aim of this study is to evaluate the efficacy of this methodology in the evaluation of MPI stress studies and in the decision-making process concerning whether or not to continue the evaluation of each patient. The objective was to automatically classify each patient test into one of three groups: "Positive", "Negative" and "Indeterminate". "Positive" tests would proceed directly to the rest part of the exam, "Negative" tests would be directly exempted from continuation, and only the "Indeterminate" group would require analysis by the clinician, thereby economizing the clinician's effort, increasing workflow fluidity at the technologist's level, and probably saving patients' time. Methods: The WEKA v3.6.2 open-source software was used to carry out a comparative analysis of three WEKA algorithms ("OneR", "J48" and "Naïve Bayes") in a retrospective study of the "SPECT Heart Dataset", available at the University of California, Irvine, Machine Learning Repository, using the corresponding clinical results, signed by expert nuclear cardiologists, as the reference. For evaluation purposes, criteria such as "Precision", "Incorrectly Classified Instances" and "Receiver Operating Characteristic (ROC) Areas" were considered. Results: The interpretation of the data suggests that the Naïve Bayes algorithm has the best performance among the three selected algorithms. Conclusions: It is believed - and apparently supported by the findings - that machine learning algorithms could significantly assist, at an intermediate level, in the analysis of scintigraphic data obtained from MPI, namely after stress acquisition, thus potentially increasing the efficiency of the entire system and easing the roles of both technologists and nuclear cardiologists. In the continuation of this study, it is planned to use more patient information and to significantly increase the population under study, in order to improve system accuracy.
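For readers who prefer scikit-learn over WEKA, an equivalent comparison can be sketched as follows; the stand-ins are approximations (a depth-1 decision stump for OneR, CART for J48), and random placeholder data replaces the actual SPECT Heart features, which would be loaded from the UCI repository files.

```python
import numpy as np
from sklearn.naive_bayes import BernoulliNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

# Placeholder data shaped like the UCI SPECT Heart set
# (267 cases, 22 binary features, binary label)
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(267, 22))
y = rng.integers(0, 2, size=267)

models = {
    "OneR-like (stump)": DecisionTreeClassifier(max_depth=1),
    "J48-like (CART)": DecisionTreeClassifier(),
    "Naive Bayes": BernoulliNB(),
}
for name, model in models.items():
    # 10-fold cross-validated ROC area, one of the criteria above
    auc = cross_val_score(model, X, y, cv=10, scoring="roc_auc")
    print(f"{name:18s} mean ROC AUC = {auc.mean():.3f}")
```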