24 results for Missions to leprosy patients.
in Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
Leprosy is a spectral disease exhibiting two polar sides, namely, lepromatous leprosy (LL), characterised by impaired T-cell responses, and tuberculoid leprosy, in which T-cell responses are strong. Proper T-cell activation requires signalling through costimulatory molecules expressed by antigen-presenting cells and their ligands on T-cells. We studied the influence of costimulatory molecules on the immune responses of subjects along the leprosy spectrum. The expression of the costimulatory molecules was evaluated in in vitro-stimulated peripheral blood mononuclear cells of lepromatous and tuberculoid patients and healthy exposed individuals (contacts). We show that LL patients have defective monocyte CD86 expression, which likely contributes to the impairment of the antigen presentation process and to the patients' anergy. Accordingly, CD86 but not CD80 blockade inhibited the lymphoproliferative response to Mycobacterium leprae. Consistent with the LL anergy, there was reduced expression of the positive signalling costimulatory molecules CD28 and CD86 on the T-cells of these patients. In contrast, tuberculoid leprosy patients displayed increased expression of the negative signalling molecules CD152 and programmed death-1 (PD-1), which represents a probable means of modulating an exacerbated immune response and avoiding immunopathology. Notably, the contacts exhibited proper CD86 and CD28 expression but not exacerbated CD152 or PD-1 expression, suggesting that they tend to develop a balanced immunity without requiring immunosuppressive costimulatory signalling.
Abstract:
T regulatory cells (Tregs) play an important role in the host's failure to control pathogen dissemination in severe forms of different chronic granulomatous diseases, but their role in leprosy has not yet been elucidated. Twenty-eight newly diagnosed patients (16 with lepromatous leprosy and 12 with tuberculoid leprosy) and 6 healthy Mycobacterium leprae-exposed individuals (contacts) were studied. Tregs were quantified by flow cytometry (CD4+ CD25+ Foxp3+) in peripheral blood mononuclear cells stimulated in vitro with an M. leprae antigenic preparation and phytohemagglutinin, as well as in skin lesions by immunohistochemistry. The lymphoproliferative (LPR), interleukin-10 (IL-10), and interferon-gamma (IFN-gamma) responses of the in vitro-stimulated peripheral blood mononuclear cells and the in situ expression of IL-10, transforming growth factor-beta (TGF-beta), and cytotoxic T-lymphocyte antigen 4 (CTLA-4) were also determined. We show that M. leprae antigens induced a significantly lower LPR but significantly higher Treg numbers in lepromatous than in tuberculoid patients and contacts. Mitogen-induced LPR and Treg frequencies were not significantly different among the three groups. Tregs were also more frequent in situ in lepromatous patients, a finding paralleled by increased expression of the anti-inflammatory molecules IL-10 and CTLA-4 but not TGF-beta. In lepromatous patients, Tregs were intermingled with vacuolized histiocyte infiltrates throughout the lesion, whereas in tuberculoid patients, Tregs were rare. Our results suggest that Tregs are present in increased numbers and may have a pathogenic role in leprosy patients harboring uncontrolled bacillary multiplication, but not in those individuals capable of limiting M. leprae growth.
Abstract:
OBJECTIVE: This study aimed to determine the frequency of coinfections in leprosy patients and whether there is a relationship between the presence of coinfections and the development of leprosy reactional episodes. METHOD: A cross-sectional study based on an analysis of the medical records of patients treated at the Leprosy Clinics of the Ribeirao Preto Medical School, University of Sao Paulo, was conducted from 2000 to 2010. Information was recorded regarding age, sex, clinical status, WHO classification, treatment, and the presence of reactions and coinfections. Focal and systemic infections were diagnosed based on the history, physical examination, and laboratory tests. Multinomial logistic regression was used to evaluate the associations between the leprosy reactions and the patients' gender, age, WHO classification and coinfections. RESULTS: Two hundred twenty-five patients were studied. Most were males (155/225 = 68.8%), with an average age of 49.31 +/- 15.92 years, and the most prevalent clinical manifestation was the multibacillary (MB) form (n = 146), followed by the paucibacillary (PB) form (n = 79). Erythema nodosum leprosum (ENL) was more prevalent (78/122 = 63.9%) than the reversal reaction (RR) (44/122 = 36.1%), especially in MB patients (OR 5.07; CI 2.86-8.99; p<0.0001) and in those who exhibited coinfections (OR 2.26; CI 1.56-3.27; p<0.0001). Eighty-eight (88/225 = 39.1%) patients exhibited coinfections. Oral coinfections were the most prevalent (40/88 = 45.5%), followed by urinary tract infections (17/88 = 19.3%), sinusopathy (6/88 = 6.8%), hepatitis C (6/88 = 6.8%), and hepatitis B (6/88 = 6.8%). CONCLUSIONS: Coinfections may be involved in the development and maintenance of leprosy reactions.
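As a rough illustration of where odds ratios such as those above come from: an unadjusted OR and its Wald 95% confidence interval can be computed directly from a 2x2 table. The sketch below uses hypothetical counts, not the study's data, and shows only the crude calculation (the study's ORs come from multinomial logistic regression, which adjusts for the other covariates).

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with a Wald 95% CI from a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 20/100 coinfected patients with a reaction
# versus 10/100 non-coinfected patients with a reaction.
print(odds_ratio_ci(20, 80, 10, 90))
```

If the confidence interval excludes 1, the association is significant at the 5% level, which is how OR figures like those reported above are read.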
Abstract:
Background: Tuberculosis (TB) remains a public health issue worldwide. The lack of specific clinical symptoms to diagnose TB makes the correct decision to admit patients to respiratory isolation a difficult task for the clinician. Isolation of patients without the disease is common and increases health costs. Decision models for the diagnosis of TB in patients attending hospitals can increase the quality of care and decrease costs, without increasing the risk of hospital transmission. We present a model for predicting pulmonary TB in hospitalized patients in a high-prevalence area, in order to contribute to a more rational use of isolation rooms without increasing the risk of transmission. Methods: Cross-sectional study of patients admitted to CFFH from March 2003 to December 2004. A classification and regression tree (CART) model was generated and validated. The area under the ROC curve (AUC), sensitivity, specificity, and positive and negative predictive values were used to evaluate the performance of the model. Validation of the model was performed with a different sample of patients admitted to the same hospital from January to December 2005. Results: We studied 290 patients admitted with clinical suspicion of TB. The diagnosis was confirmed in 26.5% of them. Pulmonary TB was present in 83.7% of the patients with TB (62.3% with a positive sputum smear), and HIV/AIDS was present in 56.9% of patients. The validated CART model showed a sensitivity, specificity, positive predictive value and negative predictive value of 60.00%, 76.16%, 33.33%, and 90.55%, respectively. The AUC was 79.70%. Conclusions: The CART model developed for these hospitalized patients with clinical suspicion of TB had fair to good predictive performance for pulmonary TB. The most important variable for the prediction of TB diagnosis was the chest radiograph result.
Prospective validation is still necessary, but our model offers an alternative for deciding whether to isolate patients with clinical suspicion of TB in tertiary health facilities in countries with limited resources.
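For reference, the four performance measures reported for the CART model all come straight from the model's confusion matrix on the validation sample. A minimal sketch, using made-up counts (not the study's data) chosen only to land in the same range as the reported values:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),   # true positives among the diseased
        "specificity": tn / (tn + fp),   # true negatives among the healthy
        "ppv": tp / (tp + fp),           # how trustworthy a positive call is
        "npv": tn / (tn + fn),           # how trustworthy a negative call is
    }

# Hypothetical validation counts for illustration only
m = diagnostic_metrics(tp=45, fp=90, fn=30, tn=287)
```

The high NPV relative to the PPV is what makes such a model useful for ruling patients out of isolation rather than ruling TB in.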
Abstract:
Iron is essential for all organisms and its availability can control the growth of microorganisms; therefore, we examined the role of iron metabolism in multibacillary (MB) leprosy, focusing on the involvement of hepcidin. Erythrograms, iron metabolism parameters, pro-inflammatory cytokines and urinary hepcidin levels were evaluated in patients with MB and matched control subjects. Hepcidin expression in MB lesions was evaluated by quantitative polymerase chain reaction. The expression of ferroportin and hepcidin was evaluated by immunofluorescence in paucibacillary and MB lesions. Analysis of hepcidin protein levels in urine and of hepcidin mRNA and protein levels in leprosy lesions and skin biopsies from healthy control subjects showed elevated hepcidin levels in MB patients. Decreases in haematologic parameters and total iron binding capacity were observed in patients with MB leprosy. Moreover, interleukin-1 beta, ferritin, soluble transferrin receptor and soluble transferrin receptor/log ferritin index values were increased in leprosy patients. Hepcidin was elevated in lepromatous lesions, whereas ferroportin was more abundant in tuberculoid lesions. In addition, hepcidin and ferroportin were not colocalised in the biopsies from leprosy lesions. Anaemia was not commonly observed in patients with MB; however, the observed changes in haematologic parameters indicating altered iron metabolism appeared to result from a mixture of anaemia of inflammation and iron deficiency. Thus, iron sequestration inside host cells might play a role in leprosy by providing an optimal environment for the bacillus.
Abstract:
Leprosy is an infectious disease caused by Mycobacterium leprae. The polymerase chain reaction (PCR) has been applied to detect M. leprae in different clinical samples, and urine seems to be attractive for this purpose. PCR was used to improve the sensitivity of leprosy diagnosis by amplifying a 151-bp fragment of the M. leprae pra gene (PCR-Pra) in urine samples. Seventy-three leprosy patients (39 males and 34 females, aged 14 to 78 years) were selected for leprosy diagnosis at a reference laboratory in Maringa, PR, Brazil. Of these, 36 were under anti-leprosy multidrug therapy with dapsone and rifampicin for the tuberculoid (TT) form and dapsone, rifampicin and clofazimine for the borderline (BB) and lepromatous (LL) forms. The control group contained 50 healthy individuals without any clinical history of leprosy. DNA isolated from leprosy patients' urine samples was successfully amplified by PCR-Pra in 46.6% (34/73) of the cases. The positivity of PCR-Pra for patients with the TT form was 75% for both patients under treatment and non-treated patients (P = 0.1306). In patients with the LL form, PCR-Pra positivity was 52% and 30% for patients under treatment and non-treated patients, respectively (P = 0.2386). PCR-Pra showed a statistically significant difference in detecting M. leprae between the TT and LL forms of leprosy in patients under treatment (P = 0.0033). Although the current study showed that the proposed PCR-Pra has some limitations in the detection of M. leprae, this method has the potential to be a useful tool for leprosy diagnosis, mainly in TT leprosy, where the AFB slit-skin smear is always negative.
Abstract:
Introduction and Objective: Because of the improvements in the detection of early-stage prostate cancer over the last decade, focal therapy for localized prostate cancer (PC) has been proposed for patients with low-risk disease. Such treatment would allow the control of cancer while diminishing side effects, such as urinary incontinence and sexual dysfunction, which have an enormous impact on quality of life. The critical issue is whether it is possible to preoperatively predict clinically significant unifocal or unilateral prostate cancer with sufficient accuracy. Our aim is to determine whether there is any preoperative feature that can help select the ideal patient for focal therapy. Material and methods: A total of 599 patients who underwent transrectal ultrasound (TRUS)-guided prostate biopsy followed by radical prostatectomy to treat PC were examined in our laboratory between 2001 and 2009. We established very restricted criteria to select patients with very-low-risk disease for whom focal therapy would be suitable (only 1 biopsy core positive, tumor no larger than 80% of a single core, no perineural invasion, PSA serum level < 10 ng/ml, Gleason score < 7 and clinical stage T1c, T2a-b). We defined 2 groups of patients who would be either adequately treated or not treated by focal therapy. The primary endpoint was the evaluation of preoperative features in order to identify which parameters should be considered when choosing good candidates for focal therapy. Results: Fifty-six out of 599 patients met our criteria. The mean age was 59 years, and the mean number of biopsy cores was 14.4. Forty-seven (83.9%) were staged T1c, and 9 (16.1%) were staged T2a-b. Forty-four (78.6%) patients could be considered to have been adequately treated by focal therapy, and 12 (21.4%) could not. There was no statistical difference between the 2 groups considering age, clinical stage, PSA levels, Gleason score, and tumor volume in the biopsy.
All 12 patients who could be considered inadequately treated had a bilateral, significant secondary tumor; 58.3% had a Gleason score >= 7, and 25% were staged pT3. Conclusion: Although focal therapy might be a good option for patients with localized prostate cancer, we are so far unable to select which of them would benefit from it based on preoperative data, even using very restricted criteria, and a considerable proportion of men would still be left undertreated.
Abstract:
Background and Purpose: Oropharyngeal dysphagia is a common manifestation of acute stroke. Aspiration resulting from difficulties in swallowing is a symptom that should be considered because of the frequent occurrence of aspiration pneumonia, which can influence the patient's recovery by causing clinical complications and may even lead to the patient's death. Early clinical evaluation of swallowing disorders can help define approaches and avoid oral feeding that may be detrimental to the patient. This study aimed to create an algorithm, based on the National Institutes of Health Stroke Scale (NIHSS), to identify patients at risk of developing dysphagia following acute ischemic stroke, in order to decide on the safest way of feeding and minimize the complications of stroke. Methods: Clinical assessment of swallowing was performed in 50 patients admitted to the emergency unit of the University Hospital, Faculty of Medicine of Ribeirao Preto, Sao Paulo, Brazil, with a diagnosis of ischemic stroke, within 48 h after the onset of symptoms. Patients, 25 females and 25 males with a mean age of 64.90 years (range 26-91 years), were evaluated consecutively. An anamnesis was taken before each patient's participation in the study in order to exclude a prior history of deglutition difficulties. For the functional assessment of swallowing, three food consistencies were used: pasty, liquid and solid. After the clinical evaluation, we determined whether dysphagia was present. For statistical analysis we used the Fisher exact test to verify the association between variables. To assess whether the NIHSS score characterizes a risk factor for dysphagia, a receiver operating characteristic curve was constructed to obtain sensitivity and specificity. Results: Dysphagia was present in 32% of the patients. The clinical evaluation is a reliable method for the detection of swallowing difficulties.
However, the predictors of risk for the swallowing function must be balanced, and the level of consciousness and the presence of preexisting comorbidities should be considered. Gender, age and the cerebral hemisphere involved were not significantly associated with the presence of dysphagia. The NIHSS, Glasgow Coma Scale, and speech and language changes had a statistically significant predictive value for the presence of dysphagia. Conclusions: The NIHSS is highly sensitive (88%) and specific (85%) in detecting dysphagia; a score of 12 may be considered the cutoff value. The creation of an algorithm to detect dysphagia in acute ischemic stroke appears to be useful in selecting the optimal feeding route while awaiting a specialized evaluation.
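Cutoff selection of the kind described above is typically done by scanning candidate thresholds along the ROC curve and keeping the one that maximizes Youden's index (sensitivity + specificity - 1). A minimal sketch with synthetic NIHSS-like scores (invented for illustration, not the study's data):

```python
def best_cutoff(pos_scores, neg_scores):
    """Return the threshold maximizing Youden's J = sens + spec - 1.
    A score >= threshold is called positive (at risk of dysphagia)."""
    best_t, best_j = None, -1.0
    for t in sorted(set(pos_scores) | set(neg_scores)):
        sens = sum(s >= t for s in pos_scores) / len(pos_scores)
        spec = sum(s < t for s in neg_scores) / len(neg_scores)
        j = sens + spec - 1
        if j > best_j:
            best_j, best_t = j, t
    return best_t

# Synthetic scores: dysphagic patients tend to score higher on the scale
dysphagia = [10, 12, 13, 14, 15, 16, 18, 20]
no_dysphagia = [2, 3, 4, 5, 6, 7, 8, 9, 10, 10, 11, 11, 12, 13]
print(best_cutoff(dysphagia, no_dysphagia))  # -> 12 on this toy data
```

On real data the same scan trades sensitivity against specificity at each candidate score, which is how a single operating point such as 12 is chosen.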
Abstract:
Infections are an important cause of morbidity and mortality in juvenile systemic lupus erythematosus (JSLE). Among them, invasive aspergillosis (IA), which is usually related to immunosuppressed patients, has been rarely reported in JSLE. From 1983 to 2011, 5604 patients were followed at our institution and 283 (5%) met the American College of Rheumatology (ACR) classification criteria for SLE. Six (2.1%) of our JSLE patients had IA. One of them was previously reported and five will be described herein. Four of them were female. The median age at JSLE diagnosis was 12 years (8-16) and the median interval between diagnosis of JSLE and IA was 6 months (1-38). All had pulmonary involvement and three of them had systemic involvement. The median Systemic Lupus Erythematosus Disease Activity Index 2000 (SLEDAI-2K) was 19 (7-22). Diagnosis of IA was performed by isolation of Aspergillus spp., two in bronchoalveolar lavage culture and by way of autopsy in the others. All of them were treated with corticosteroids and/or immunosuppressive drugs at IA diagnosis (azathioprine and/or intravenous cyclophosphamide). They all required treatment in the pediatric intensive care unit with mechanical ventilation and antifungal therapy (fluconazole, amphotericin B, itraconazole and/or voriconazole); nonetheless, none of them survived. In conclusion, this was the first report that evaluated the prevalence of IA in a large population of JSLE patients from a tertiary pediatric hospital, and clearly showed the severity of the outcome, especially in patients with active disease and treated with immunosuppressive agents. This study reinforces the importance of early diagnosis and treatment with certain antifungals, especially in critically ill patients. Lupus (2012) 21, 1011-1016.
Abstract:
Background: Large amounts of reactive oxygen species are produced in hemodialysis (HD) patients, and at higher concentrations reactive oxygen species are thought to be involved in the pathogenesis of cardiovascular disease. It has been proposed that selenium (Se) may exert an antiatherogenic influence by reducing oxidative stress. The richest known food source of Se is the Brazil nut (Bertholletia excelsa, family Lecythidaceae), found in the Amazon region. Objective: The objective of this work was to determine whether the Se plasma levels of HD patients submitted to a 3-month supplementation program with 1 Brazil nut per day could be sustained after 12 months. Methods: A total of 21 HD patients (54.2 +/- 15.2 years old; average time on dialysis, 82.3 +/- 51.6 months; body mass index, 24.4 +/- 3.8 kg/m(2)) from the RenalCor Clinic in Rio de Janeiro, Brazil, were followed up 12 months after the supplementation study ended. Se plasma levels were determined by atomic absorption spectrophotometry with hydride generation. Results: Plasma Se levels (17.3 +/- 19.9 µg/L) were below the normal range (60 to 120 µg/L) before nut supplementation, and after 3 months of supplementation the levels increased to 106.8 +/- 50.3 µg/L (P < .0001). Twelve months after supplementation, the plasma Se levels had decreased to 31.9 +/- 14.8 µg/L (P < .0001). Conclusions: The data showed that these patients were Se deficient and that the consumption of Brazil nuts was effective in improving Se nutritional status. Se levels 12 months after the supplementation period were not as low as presupplementation levels but were still significantly lower than the levels achieved with supplementation, and patients needed to be motivated to adopt different dietary intake patterns.
Abstract:
The purpose of this study was to warn the dental community about a possible problem in function with partial implant-supported prostheses used for long periods. The misalignment between natural teeth and the implant-supported prosthesis on teeth 11 and 12, observed in a 14-year clinical follow-up, illustrates the fact. The metal-ceramic crowns were placed in 1995 after a rigorous occlusal adjustment. Evaluations were made at 4, 6, 9, and 14 years, when it was noticed that the restorations were positioned palatally and extruded in comparison with the natural teeth. After 9 years, a greater discrepancy was noticed, with anterior occlusion and esthetic changes. The possible causes have been discussed: occlusal problems, parafunctional habits, and natural movement. The first 2 options were discarded after clinical analysis and diagnosis. Therefore, the natural movement probably deriving from an interaction of mechanical and genetic factors might have been the cause. The implants do not have periodontal ligaments but rather ankylosis, so they do not suffer those movements. This case emphasizes the need to inform patients that implants can last more than 10 years in function, but this is not the case with restorations, which lose function and esthetics and must be replaced.
Abstract:
Background: Cryptococcus neoformans causes meningitis and disseminated infection in healthy individuals, but more commonly in hosts with defective immune responses. Cell-mediated immunity is an important component of the immune response to a great variety of infections, including yeast infections. We aimed to evaluate a specific lymphocyte transformation assay (LTA) to Cryptococcus neoformans in order to identify immunodeficiency associated with neurocryptococcosis (NCC) as the primary cause of the mycosis. Methods: Healthy volunteers, poultry growers, and HIV-seronegative patients with neurocryptococcosis were tested for cellular immune response. Cryptococcal meningitis was diagnosed by India ink staining of cerebrospinal fluid and a cryptococcal antigen test (Immunomycol-Inc, SP, Brazil). Isolated peripheral blood mononuclear cells were stimulated with C. neoformans antigen, C. albicans antigen, and pokeweed mitogen. The amount of H-3-thymidine incorporated was assessed, and the results were expressed as the stimulation index (SI) and log SI, sensitivity, specificity, and cut-off value (receiver operating characteristic curve). We applied unpaired Student t tests to compare data and considered differences significant for p<0.05. Results: The LTA showed a low capacity, with all the stimuli, for classifying patients as responders and non-responders. The LTA response to heat-killed antigen in patients with neurocryptococcosis was not affected by the TCD4+ cell count, and the intensity of the response did not correlate with the clinical evolution of neurocryptococcosis. Conclusion: The response to the lymphocyte transformation assay should be analyzed based on a normal range and using more than one stimulator. The use of a cut-off value to classify patients with neurocryptococcosis is inadequate. Statistical analysis should be based on the log transformation of the SI. A more purified antigen for evaluating the specific response to C. neoformans is needed.
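The stimulation index discussed above is simply the ratio of incorporated 3H-thymidine counts in stimulated versus unstimulated cultures, with the log taken before statistical testing. A minimal sketch with hypothetical cpm values (not measurements from the study):

```python
import math

def stimulation_index(stim_cpm, unstim_cpm):
    """SI = 3H-thymidine counts (cpm) in antigen-stimulated wells
    divided by counts in unstimulated control wells."""
    return stim_cpm / unstim_cpm

si = stimulation_index(6000, 500)   # hypothetical cpm values
log_si = math.log10(si)             # log SI, as the abstract recommends
```

The log transformation matters because raw SI values are right-skewed; t tests on log SI are better behaved than on the ratio itself.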
Abstract:
To assess the prevalence of depression and fatigue symptoms in head and neck cancer patients during radiotherapy and to relate these symptoms to the patients' quality of life. This is a prospective study. The Beck Depression Inventory (BDI), the revised Piper Fatigue Scale and the Functional Assessment of Cancer Therapy Head and Neck (FACT-H&N) were applied to 41 head and neck cancer patients at three time points: at the start of treatment (T1), approximately 15 days after the start of treatment (T2) and at the end of treatment (T3), approximately 30 days after the start of radiotherapy. The mean BDI and Piper scores increased during radiotherapy. The BDI scores did not demonstrate the presence of depression, although the number of symptoms increased, and the presence of fatigue rose as treatment advanced. The mean FACT-H&N score decreased in the middle and at the end of treatment, indicating a worsening in these patients' quality of life. Depression and fatigue symptoms increased during radiotherapy, while QoL levels decreased. This demonstrates that these symptoms are strongly correlated and that their presence negatively influenced QoL. At the start of treatment, nurses need to advise patients and plan care, offering interventions to decrease these symptoms and improve QoL.
Abstract:
Leprosy in children is correlated with community-level factors, including the recent presence of disease and active foci of transmission in the community. We performed clinical and serological examinations of 1,592 randomly selected school children (SC) in a cross-sectional study of eight hyperendemic municipalities in the Brazilian Amazon Region. Sixty-three (4%) SC, with a mean age of 13.3 years (standard deviation = 2.6), were diagnosed with leprosy and 777 (48.8%) were seropositive for anti-phenolic glycolipid-I (PGL-I). Additionally, we evaluated 256 household contacts (HHCs) of the students diagnosed with leprosy; 24 (9.4%) HHC were also diagnosed with leprosy and 107 (41.8%) were seropositive. The seroprevalence of anti-PGL-I was significantly higher amongst girls, students from urban areas and students from public schools (p < 0.0001). Forty-five (71.4%) new cases detected amongst SC were classified as paucibacillary and 59 (93.6%) patients did not demonstrate any degree of physical disability at diagnosis. The results of this study suggest that there is a high rate of undiagnosed leprosy and subclinical infection amongst children in the Amazon Region. The advantages of school surveys in hyperendemic areas include identifying leprosy patients at an early stage when they show no physical disabilities, preventing the spread of the infection in the community and breaking the chain of transmission.
Abstract:
Semi-quantitative stenosis assessment by coronary CT angiography only modestly predicts stress-induced myocardial perfusion abnormalities. The performance of quantitative CT angiography (QCTA) for identifying patients with myocardial perfusion defects remains unclear. CorE-64 is a multicenter, international study to assess the accuracy of 64-slice QCTA for detecting >= 50% coronary arterial stenoses by quantitative coronary angiography (QCA). Patients referred for cardiac catheterization with suspected or known coronary artery disease were enrolled. The area under the receiver-operating-characteristic curve (AUC) was used to evaluate the diagnostic accuracy of the most severe coronary artery stenosis, assessed by QCTA and QCA in a subset of 63 patients, for detecting myocardial perfusion abnormalities on exercise or pharmacologic stress SPECT. The diagnostic accuracy of QCTA for identifying patients with myocardial perfusion abnormalities by SPECT revealed an AUC of 0.71, compared to 0.72 by QCA (P = .75). The AUC did not improve after excluding studies with fixed myocardial perfusion abnormalities and total coronary arterial occlusions. The optimal stenosis threshold for QCTA was 43%, yielding a sensitivity of 0.81 and a specificity of 0.50, compared to 0.75 and 0.69 by QCA at a threshold of 59%. The sensitivity and specificity of QCTA to identify patients with both obstructive lesions and myocardial perfusion defects were 0.94 and 0.77, respectively. Coronary artery stenosis assessment by QCTA or QCA only modestly predicts the presence and absence of myocardial perfusion abnormalities by SPECT. Confounding variables affecting the relationship between coronary anatomy and myocardial perfusion likely account for some of the observed discrepancies between coronary angiography and SPECT results.
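For readers unfamiliar with how an AUC such as 0.71 is obtained: it equals the probability that a randomly chosen patient with a perfusion defect receives a higher stenosis score than a randomly chosen patient without one (the Mann-Whitney formulation). A sketch with invented scores, purely for illustration:

```python
def auc(pos_scores, neg_scores):
    """AUC as the fraction of (positive, negative) score pairs ranked
    correctly, counting ties as half: Mann-Whitney U / (n_pos * n_neg)."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

# Invented stenosis-severity scores for patients with / without defects
print(round(auc([0.9, 0.8, 0.4], [0.5, 0.3, 0.2]), 3))  # -> 0.889
```

An AUC near 0.7, as reported for both QCTA and QCA, means the severity score ranks an affected above an unaffected patient only about 70% of the time, which is the "modest" prediction the abstract describes.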