95 results for Long term evaluation
in Scielo Saúde Pública - SP
Abstract:
The aim of this article is to present an investigation of the cure rate, after long-term follow-up, of specific chemotherapy with benznidazole in patients with acute or chronic Chagas disease, applying quantitative conventional serological tests as the basis of the criterion of cure. Twenty-one patients with the acute form and 113 with one or other of the various chronic clinical forms of the disease were evaluated after a follow-up period of 13 to 21 years for the acute patients and 6 to 18 years for the chronic patients. The duration of the acute as well as the chronic disease, a condition that influences the results of the treatment, was determined. The therapeutic schedule is presented, with emphasis on the correlation between adverse reactions and the total dose of approximately 18 grams, as well as on the precautions taken to ensure the safety of the treatment. Quantitative serological reactions consisting of complement fixation, indirect immunofluorescence, indirect hemagglutination and, occasionally, ELISA were used. Cure was found in 76 per cent of the acute patients but in only 8 per cent of those with chronic forms of the disease. In light of such contrasting results, fundamentals of the etiological therapy of Chagas disease are discussed, such as the criterion of cure, the pathogenesis, and the role of immunosuppression in revealing tissue parasitism in long-standing chronic disease, in support of the concept that consistently positive post-therapeutic serological reactions indicate the presence of the parasite in the patient's tissues. In relation to the life cycle of T. cruzi in the vertebrate host, some points remain obscure and controversial, although there is no proof of the existence of resistant or latent forms. However, the finding over the last 15 years that immunosuppression brings about the reappearance of acute disease in long-standing chronic patients justifies a revision of the matter. Facts are cited in favor of the treatment of chronic patients.
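A quick arithmetic note on the total dose of approximately 18 grams mentioned above (the dosing schedule itself is not stated in the abstract, so the figures below are only illustrative): the classic benznidazole regimen of about 5 mg/kg/day for 60 days yields that total for a 60-kg patient,

\[
5~\text{mg/kg/day} \times 60~\text{kg} \times 60~\text{days} = 18{,}000~\text{mg} = 18~\text{g}.
\]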
Abstract:
Ileal pouch-anal anastomosis was an important advance in the treatment of ulcerative colitis. The aim of this study was to determine whether early complications of ileal pouch-anal anastomosis in patients with ulcerative colitis are associated with poor late functional results. PATIENTS AND METHODS: Eighty patients were operated on from 1986 to 2000, 62 patients with ileostomy and 18 without. Early and late complications were recorded, with specific emphasis on the incidence of pouchitis during prolonged follow-up. RESULTS: The ileostomy was closed an average of 9.2 months after the first operation. Fourteen patients were excluded from the long-term evaluation: 6 patients were lost to regular follow-up, 4 died, and 4 patients still have the ileostomy. Of the 4 patients who died, 1 died from surgical complications. Early complications after operation (41) occurred in 34 patients (42.5%). Late complications (29) occurred in 25 patients as follows: 16 had pouchitis, 3 associated with stenosis and 1 with sexual dysfunction; 5 had stenosis; and there was 1 case each of incisional hernia, ileoanal fistula, hepatic cancer, and endometriosis. Pouchitis occurred in 6 patients (9.8%) 1 year after ileal pouch-anal anastomosis, in 9 (14.8%) after 3 years, in 13 (21.3%) after 5 years, and in 16 (26.2%) after more than 6 years. The mean daily stool frequency was 12 before and 5.8 after the operation. One pouch was removed because of fistulas that appeared 2 years later. CONCLUSIONS: Ileal pouch-anal anastomosis is associated with a considerable number of early complications. There was no correlation between pouchitis and severe disease, operation with or without ileostomy, or early postoperative complications. The incidence of pouchitis was directly proportional to the duration of follow-up.
Abstract:
Our objective was to compare the food intake and nutritional status of Pemphigus Foliaceus patients (PG) on long-term glucocorticoid therapy with those of a Control Group (CG). Fourteen PG female inpatients receiving prednisone (0.33 ± 0.22 mg/kg) for at least 12 months and twelve CG subjects underwent nutritional evaluation, including anthropometry, urinary creatinine determination, and serum biochemical measurements, as well as 48-h food intake records. Groups were compared by chi-square, Mann-Whitney, and Student's t tests. PG patients and CG were comparable, respectively, in relation to age (24.7 ± 14.1 vs. 22.0 ± 12.0 years), body mass index (25.8 ± 6.4 vs. 24.0 ± 5.6 kg/m2), daily protein intake (132.9 ± 49.8 vs. 95.2 ± 58.9 g), and serum albumin (median; range) (3.8; 3.5-4.1 vs. 3.8; 3.6-5.0 g/dl). However, PG patients had a lower height-creatinine index (64.8 ± 17.6 vs. 90.1 ± 33.4%) and higher daily energy (3080 ± 1099 vs. 2187 ± 702 kcal) and carbohydrate (376.8 ± 135.8 vs. 242.0 ± 80.7 g) intakes. Despite high food, protein, and energy consumption, PG patients on long-term glucocorticoid therapy had lower body muscle mass than controls, while showing high body fat stores. These findings are possibly related to the combined metabolic effects of long-term corticotherapy and inflammatory disease plus corticosteroid-induced increased appetite.
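For context on the height-creatinine index reported above: this nutritional index is conventionally computed from 24-hour urinary creatinine excretion relative to the value expected for a well-nourished person of the same height and sex (the reference table used in this study is not stated in the abstract):

\[
\text{CHI}~(\%) = \frac{\text{measured 24-h urinary creatinine (mg)}}{\text{expected 24-h urinary creatinine for height and sex (mg)}} \times 100
\]

Values in the 60-80% range, such as the 64.8% seen in the PG group, are commonly read as a moderate deficit of body muscle mass.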
Abstract:
This article describes the standardization and evaluation of an in-house specific IgG avidity ELISA for distinguishing recent primary from long-term human cytomegalovirus (HCMV) infection. The test was standardized with the commercial kit ETI-CYTOK G Plus (Sorin Biomedica, Italy), using 8 M urea in phosphate-buffered saline to dissociate low-avidity antibodies after the antigen-antibody interaction. The performance of the in-house assay was compared to that of the commercial automated VIDAS CMV IgG avidity test (bioMérieux, France). Forty-nine sera were tested: 24 from patients with a recent primary HCMV infection and 25 from patients with a long-term HCMV infection and a sustained persistence of specific IgM antibodies. Similar results were obtained with the two avidity methods. All 24 sera from patients with recently acquired infection had avidity indices compatible with acute HCMV infection by the VIDAS method, whereas with the in-house method one serum sample gave an equivocal result. In the 25 sera from patients with long-term infection, concordant results were obtained with the two methods, with only one serum sample giving a discrepant value. These findings suggest that our in-house avidity test could be a useful tool for the immunodiagnosis of HCMV infection.
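As background to the avidity indices mentioned above, a urea-dissociation ELISA of this kind typically runs each serum in parallel wells with and without the 8 M urea wash and reports the ratio of the resulting optical densities (the cutoff values of the in-house assay are not given in the abstract):

\[
\text{Avidity index}~(\%) = \frac{\text{OD}_{\text{urea-washed well}}}{\text{OD}_{\text{untreated well}}} \times 100
\]

Low indices (antibodies easily removed by urea) point to recent primary infection, whereas high indices indicate past, long-term infection; the boundary values are assay-specific.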
Abstract:
OBJECTIVES: To correlate the expression of p53 protein and VEGF with the prognosis of patients submitted to curative resection for esophageal adenocarcinoma. METHODS: Forty-six patients with esophageal adenocarcinoma submitted to curative resection were studied. Expression of p53 protein and VEGF, assessed by immunohistochemistry, was detected in 52.2% and 47.8% of tumors, respectively. RESULTS: Expression of p53 protein and VEGF coincided in 26% of the cases, and no correlation between these expressions was observed. None of the clinicopathological factors showed a significant correlation with p53 protein or VEGF expression. There was no significant association between p53 protein or VEGF expression and long-term survival. CONCLUSION: The expression of p53 protein and VEGF did not correlate with prognosis in esophageal adenocarcinoma patients submitted to curative resection.
Abstract:
Because of the increasing prevalence of obesity, the prevention and treatment of overweight have become a major public health concern. In addition to diet and exercise, drugs are needed for patients who fail to lose weight with behavioral treatment. The current article aims to summarize recent concerns about the safety and efficacy of appetite suppressants. Several appetite suppressants have been banned for safety reasons. In 2010, sibutramine was withdrawn from the market because a long-term study showed that it increased the risk of cardiovascular events. So far, no study with a sufficiently large sample size has demonstrated that appetite suppressants can reduce the morbidity and mortality associated with overweight. The withdrawal of sibutramine highlights that guidelines for the evaluation of weight-control drugs must be more stringent and that studies on their long-term health benefits are needed before marketing.
Abstract:
Abnormalities of renal function have been demonstrated in patients with visceral leishmaniasis; although there was a trend toward normalization following antiparasitic therapy, some abnormalities persisted. With the purpose of studying the long-term clinical course of renal involvement in visceral leishmaniasis, 32 patients with a diagnosis of this parasitic disease were evaluated in the endemic area, at least 6 months after the clinical cure of the disease, and compared with a control group of 28 individuals. No patient had a history or clinical findings suggestive of renal disease, and all were normotensive. Laboratory evaluation was normal in all except 3 patients with abnormal urinalysis. Mild proteinuria and microscopic hematuria were seen in a single urinalysis in one patient (although three other urinalyses were normal), and leukocyturia in two female patients. It was concluded that renal involvement in visceral leishmaniasis is mild and transient, with normal renal function observed on long-term follow-up after cure of the parasitic infection.
Abstract:
OBJECTIVE: The aim of this work was the follow-up and evaluation of valve replacement in children under 12 years of age. METHODS: Forty-four children less than 12 years old underwent valve replacement at INCOR-HCFMUSP between January 1986 and December 1992. Forty (91%) were rheumatic, 39 (88.7%) were in functional classes II or IV, 19 (43.2%) were operated upon on an emergency basis, and 6 (13.6%) had atrial fibrillation. Biological prostheses (BP) were employed in 26 patients (59.1%) and mechanical prostheses (MP) in 18 (40.9%). Mitral valves were replaced in 30 (68.7%), aortic valves in 8 (18.2%), a tricuspid valve in 1 (2.3%), and double (aortic and mitral) valves in 5 (11.4%) of the patients. RESULTS: Hospital mortality was 4.5% (2 cases). The mean follow-up period was 5.8 years. Re-operations occurred in 63.3% of the patients with BP and in 12.5% of those with MP (p=0.002). Infectious endocarditis was present in 26.3% of the BP, but in none of the cases of MP (p=0.049). Thrombosis occurred in 2 (12.5%) and hemorrhage in 1 (6.5%) of the patients with an MP. Late mortality occurred in 5 (11.9%) of the patients over a mean period of 2.6 years; four had had a BP and one an MP (NS). Actuarial survival and re-operation-free curves after 10 years were, respectively, 82.5 ± 7.7% (SD) and 20.6 ± 15.9%. CONCLUSION: Patients with MP required fewer re-operations and had less infectious endocarditis and lower late mortality rates compared with patients with bioprostheses. Mechanical prostheses, therefore, appear to be the best valve replacement option for pediatric patients.
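As background to the actuarial survival and re-operation-free curves quoted above, actuarial (life-table) estimates are conventionally built by multiplying interval-specific survival probabilities; a minimal sketch of the standard estimator follows (the interval width and censoring details used in this study are not given in the abstract):

\[
\hat{S}(t_k) = \prod_{i=1}^{k} \left( 1 - \frac{d_i}{n_i - c_i/2} \right)
\]

where, for each interval i, n_i is the number of patients entering the interval, d_i the number of events (deaths, or re-operations for the event-free curve), and c_i the number censored during that interval.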
Abstract:
OBJECTIVE: To assess the incidence of diagnostic errors in the initial evaluation of children with cardiac murmurs. METHODS: We evaluated our 7 years of experience in a public pediatric cardiology outpatient clinic. Of 3692 patients who were sent to the hospital, 2603 presented with a heart murmur and were investigated. Patients for whom a disagreement existed between the initial and final diagnoses were divided into the following 2 groups: G1 (n=17), with an initial diagnosis of an innocent murmur and a final diagnosis of cardiopathy, and G2 (n=161), with an initial diagnosis of cardiopathy and a final diagnosis of a normal heart. RESULTS: In G1, the great majority of patients had cardiac defects with mild hemodynamic repercussions, such as small ventricular septal defect and mild pulmonary stenosis. In G2, the great majority of the initially suspected structural defects were ventricular septal defect, atrial septal defect, and pulmonary valve stenosis. CONCLUSION: A global analysis demonstrated that diagnostic error in the initial evaluation of children with cardiac murmurs is real, reaching approximately 6% of cases. The majority of these misdiagnoses were in patients with an initial diagnosis of cardiopathy that was not confirmed by later complementary examinations. Clinical cardiovascular examination is an excellent resource in the evaluation of children suspected of having cardiopathy. Immediate outpatient discharge of children with an initial diagnosis of an innocent heart murmur seems to be a suitable approach.
Abstract:
Soil C-CO2 emissions are sensitive indicators of the impact of management systems on soil organic matter (SOM). The main soil C-CO2 sources at the soil-plant interface are the decomposition of crop residues, SOM turnover, and respiration of roots and soil biota. The objectives of this study were to evaluate the impacts of tillage and cropping systems on long-term soil C-CO2 emissions and their relationship with the carbon (C) mineralization of crop residues. A long-term experiment was conducted on a Red Oxisol in Cruz Alta, RS, Brazil, under a subtropical Cfa climate (Köppen classification), with mean annual precipitation of 1,774 mm and mean annual temperature of 19.2 ºC. Treatments consisted of two tillage systems, (a) conventional tillage (CT) and (b) no-tillage (NT), in combination with three cropping systems: (a) R0 - monoculture system (soybean/wheat), (b) R1 - winter crop rotation (soybean/wheat/soybean/black oat), and (c) R2 - intensive crop rotation (soybean/black oat/soybean/black oat + common vetch/maize/oilseed radish/wheat). The soil C-CO2 efflux was measured every 14 days for two years (48 measurements) by trapping the CO2 in an alkaline solution. The soil gravimetric moisture in the 0-0.05 m layer was determined concomitantly with the C-CO2 efflux measurements. Crop residue C mineralization was evaluated with the mesh-bag method, with sampling 14, 28, 56, 84, 112, and 140 days after the beginning of the evaluation period for C measurements. Four C conservation indexes were used to assess the relation between C-CO2 efflux and the soil C stock and its compartments. Crop residue C mineralization fitted an exponential model over time. For black oat, wheat, and maize residues, C mineralization was higher under CT than NT, while for soybean it was similar. Soil moisture was higher under NT than CT, mainly in the second year of evaluation. There was no difference between tillage systems in annual average C-CO2 emissions, but in some individual evaluations differences between tillage systems were noticed in C-CO2 evolution. Soil C-CO2 effluxes followed a bimodal pattern, with peaks in October/November and February/March. The highest emission was recorded in the summer and the lowest in the winter. The C-CO2 effluxes were weakly correlated with air temperature and not correlated with soil moisture. Based on the soil C conservation indexes investigated, NT associated with intensive crop rotation was more C-conserving than CT with monoculture.
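As an illustration of the exponential mineralization model mentioned above, here is a minimal Python sketch fitting a single-pool decay curve to mesh-bag data; the sampling dates follow the abstract, but the residue-C values, initial guesses, and fitted parameters are placeholders, not the study's data:

import numpy as np
from scipy.optimize import curve_fit

# Single-pool exponential decay of crop-residue carbon:
# C(t) = C0 * exp(-k * t), with t in days and k the mineralization rate constant.
def residue_c(t, c0, k):
    return c0 * np.exp(-k * t)

# Hypothetical mesh-bag results (remaining C, g kg-1); sampling times follow the abstract.
t_days = np.array([0, 14, 28, 56, 84, 112, 140])
c_remaining = np.array([420.0, 368.0, 325.0, 262.0, 218.0, 186.0, 163.0])

(c0_hat, k_hat), _ = curve_fit(residue_c, t_days, c_remaining, p0=(420.0, 0.01))
half_life = np.log(2) / k_hat  # days for half of the residue C to be mineralized
print(f"C0 = {c0_hat:.0f} g/kg, k = {k_hat:.4f} per day, half-life = {half_life:.0f} days")

Comparing the fitted k values between CT and NT for each residue type would reproduce the kind of contrast the abstract reports (faster mineralization under CT for black oat, wheat, and maize residues).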
Abstract:
Significant improvements have been noted in heart transplantation with the advent of cyclosporine. However, cyclosporine use is associated with significant side effects, such as chronic renal failure. We were interested in evaluating the incidence of long-term renal dysfunction in heart transplant recipients. Fifty-three heart transplant recipients were enrolled in the study, and forty-three patients completed the entire evaluation and follow-up. Glomerular function (serum creatinine, measured creatinine clearance, and calculated creatinine clearance) and tubular function (urinary retinol-binding protein, uRBP) were re-analyzed after 18 months. At enrollment, the prevalence of renal failure ranged from 37.7 to 54%, according to the criteria used to define it (serum creatinine ≥ 1.5 mg/dL and creatinine clearance < 60 mL/min). Mean serum creatinine was 1.61 ± 1.31 mg/dL (range 0.7 to 9.8 mg/dL), and calculated and measured creatinine clearances were 67.7 ± 25.9 and 61.18 ± 25.04 mL/min/1.73 m², respectively. Sixteen of the 43 patients who completed the follow-up (37.2%) had tubular dysfunction detected by increased levels of uRBP (median 1.06, range 0.412-6.396 mg/dL). Eleven of the 16 patients (68.7%) with elevated uRBP had poorer renal function after 18 months of follow-up, compared with only eight of the 27 patients (29.6%) with normal uRBP (RR = 3.47, P = 0.0095). Interestingly, cyclosporine trough levels were not different between patients with and without tubular or glomerular dysfunction. Renal function impairment is common after heart transplantation. Tubular dysfunction, assessed by uRBP, correlates with a worsening of glomerular filtration and can be a useful tool for early detection of renal dysfunction.
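For reference on the "calculated" creatinine clearance above: it is an estimate derived from serum creatinine rather than from a timed urine collection. The abstract does not name the equation used; a commonly applied one is the Cockcroft-Gault formula:

\[
\text{CrCl (mL/min)} = \frac{(140 - \text{age [years]}) \times \text{weight [kg]}}{72 \times \text{serum creatinine [mg/dL]}} \quad (\times\, 0.85 \text{ for women})
\]

Measured clearance, in contrast, uses creatinine excretion from a timed urine collection divided by the plasma concentration, often normalized to 1.73 m² of body surface area, as the units reported above suggest.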
Abstract:
Cardiopulmonary exercise testing (CPET) plays an important role in the assessment of functional capacity in patients with interstitial lung disease. The aim of this study was to identify CPET measures that might be helpful in predicting the vital capacity and diffusion capacity outcomes of patients with thoracic sarcoidosis. A longitudinal study was conducted on 42 nonsmoking patients with thoracic sarcoidosis (median age = 46.5 years, 22 females). At the first evaluation, spirometry, measurement of the single-breath carbon monoxide diffusing capacity (DLCOsb), and CPET were performed. Five years later, the patients underwent a second evaluation consisting of spirometry and DLCOsb measurement. After 5 years, forced vital capacity (FVC)% and DLCOsb% had decreased significantly [95.5 (82-105) vs 87.5 (58-103) and 93.5 (79-103) vs 84.5 (44-102), respectively; P < 0.0001 for both]. In CPET, the peak oxygen uptake, maximum respiratory rate, breathing reserve, alveolar-arterial oxygen pressure gradient at peak exercise (P(A-a)O2), and ΔSpO2 values showed a strong correlation with the relative differences for FVC% and DLCOsb% (P < 0.0001 for all). P(A-a)O2 ≥ 22 mmHg and breathing reserve ≤ 40% were identified as significant independent variables for the decline in pulmonary function. Patients with thoracic sarcoidosis showed a significant reduction in FVC% and DLCOsb% after 5 years of follow-up. These data show that the outcome measures of CPET are predictors of the decline of pulmonary function.
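As background to the P(A-a)O2 values above, the alveolar-arterial oxygen gradient at peak exercise is conventionally obtained from arterial blood gases via the alveolar gas equation; barometric pressure, water vapor pressure, and the respiratory exchange ratio R are measured or assumed according to laboratory practice (they are not reported in the abstract):

\[
P(A\text{-}a)O_2 = P_AO_2 - P_aO_2, \qquad P_AO_2 = F_IO_2\,(P_B - P_{H_2O}) - \frac{P_aCO_2}{R}
\]

At sea level breathing room air, typical working values are F_IO2 = 0.21, P_B ≈ 760 mmHg, P_H2O ≈ 47 mmHg, and R ≈ 0.8.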
Abstract:
Damping-off is a nursery disease of great economic importance in papaya, and seed treatment may be an effective control measure. The aim of this work was to evaluate the quality of papaya seeds treated with fungicides and stored under two environmental and packaging conditions. Additionally, the efficiency of the fungicide treatments in the control of damping-off caused by Rhizoctonia solani was evaluated. Papaya seeds were treated with the fungicides Captan, Tolylfluanid, and the mixture Tolylfluanid + Captan (all commercial wettable powder formulations). Seeds of the control group were not treated. The seeds were stored for nine months under two conditions: packed in aluminum-coated paper and kept at 7 ± 1 ºC, or packed in permeable kraft paper and kept in a non-controlled environment. At the beginning of storage and every three months thereafter, seed quality (germination and vigor tests), emergence rate index, plant height, dry mass, and damping-off in pre- and post-emergence (in contaminated substrate and mycelium-free substrate) were analyzed. Both storage conditions and the fungicide treatments preserved seed germination and vigor. In the infested substrate, seedling emergence was favored by the fungicides, but in post-emergence the fungicides alone did not control the damping-off caused by R. solani. Symptoms of damping-off were not observed in the clean substrate. The results showed that fungicide treatment may be used to pretreat papaya seed for long-term storage and commercialization.
Abstract:
The response to interferon treatment in chronic hepatitis NANB/C has usually been classified as complete, partial, or absent, according to the behavior of serum alanine aminotransferase (ALT). However, a more detailed observation of the enzymatic activity has shown that the patterns may be more complex. The aim of this study was to describe the long-term follow-up and patterns of ALT response in patients with chronic hepatitis NANB/C treated with recombinant interferon-alpha. A follow-up of 6 months or more after interferon-alpha was achieved in 44 patients. We classified the serum ALT responses into six patterns, with the following observed frequencies: I. long-term response, 9 (20.5%); II. normalization followed by persistent relapse after IFN, 7 (15.9%); III. normalization with transient relapse, 5 (11.9%); IV. temporary normalization and relapse during IFN, 4 (9.1%); V. partial response (more than 50% decrease in ALT), 7 (15.9%); VI. no response, 12 (27.3%). In conclusion, ALT patterns vary widely during and after IFN treatment and can be classified into at least 6 types.
Abstract:
A case of sporotrichosis is reported in a woman presenting 63 cutaneous lesions distributed all over the tegument. The patient had both humoral (immunoglobulins) and cellular (lymphocyte subpopulations) immunity within normal limits, but had been under treatment with a steroid for a long time (prednisone 10 mg daily for 2 years) because of sciatic pain. In addition, a review of the Brazilian literature on this type of lesion was carried out and commented on.