20 results for MDT 24 months

in Biblioteca Digital da Produção Intelectual da Universidade de São Paulo


Relevance:

80.00%

Publisher:

Abstract:

Silva F.J., Conceição W.L.F., Fagliari J.J., Gírio R.J.S., Dias R.A., Borba M.R. & Mathias L.A. 2012. [Prevalence and risk factors of bovine leptospirosis in the State of Maranhão, Brazil.] Prevalência e fatores de risco de leptospirose bovina no Estado do Maranhão. Pesquisa Veterinária Brasileira 32(4):303-312. Departamento de Medicina Veterinária Preventiva e Reprodução Animal, Faculdade de Ciências Agrárias e Veterinárias, Universidade Estadual Paulista, Via de Acesso Professor Paulo Donato Castellane s/n, Zona Rural, Jaboticabal, SP 14884-900, Brazil. E-mail: fjsepi@gmail.com

The prevalence and risk factors of bovine leptospirosis in the State of Maranhão were investigated. Based on production parameters that vary with the production system, management practices, purpose of exploitation, average herd size and market system, the state was divided into four sampling circuits. The study aimed to investigate the epidemiological features of bovine leptospirosis in the State of Maranhão: to determine the prevalence of the infection in cattle and herds, to determine the occurrence of serovars of Leptospira spp., to identify risk factors associated with leptospirosis in cattle, and to compare the livestock circuits with respect to the prevalence of leptospirosis. The survey covered 136 herds in circuit I (841 females ≥24 months old analyzed), 238 herds in circuit II (2,582 females), 122 herds in circuit III (869 females) and 77 herds in circuit IV (540 females), for a total of 573 herds and 4,832 females. The presence of antibodies against Leptospira spp. was verified by the microscopic agglutination test (MAT). Of the 4,832 cows examined, 1,904 (35.94%; 95% CI 33.01%-38.98%) were positive. Of the 573 herds, 380 (64.81%; 95% CI 61.10%-68.35%) were positive. Serovars Hardjo and Wolffi were the most frequent in the state. Circuit III showed the lowest prevalence of leptospirosis in all comparisons. The presence of horses (p = 0.000), the presence of capybaras (p = 0.034) and herds with up to 32 adult females (p = 0.002) were identified as risk factors for leptospirosis.
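The animal- and herd-level prevalences above are quoted with 95% confidence intervals; the reported figures are design-adjusted for the circuit-based sampling, so they differ slightly from the raw positive/tested ratios. As a minimal unweighted sketch of how such an estimate with its interval is computed (Wilson score interval; the study's own adjusted values will not match exactly):

```python
import math

def prevalence_wilson(positive, total, z=1.96):
    """Point prevalence with a 95% Wilson score confidence interval."""
    p = positive / total
    denom = 1 + z**2 / total
    center = (p + z**2 / (2 * total)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / total + z**2 / (4 * total**2))
    return p, center - half, center + half

# Raw (unweighted) animal-level counts from the survey: 1,904 of 4,832 positive.
p, lo, hi = prevalence_wilson(1904, 4832)
print(f"prevalence = {p:.2%}, 95% CI {lo:.2%}-{hi:.2%}")
```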

Relevance:

80.00%

Publisher:

Abstract:

This study evaluated color change, color stability, and tooth sensitivity in patients submitted to different bleaching techniques. Material and methods: In this study, 48 patients were divided into five groups. A half-mouth design was used to compare two in-office bleaching techniques (with and without light activation): G1: 35% hydrogen peroxide (HP) (Lase Peroxide - DMC Equipments, São Carlos, SP, Brazil) + hybrid light (HL) (LED/diode laser, Whitening Lase II, DMC Equipments, São Carlos, SP, Brazil); G2: 35% HP; G3: 38% HP (X-traBoost - Ultradent, South Jordan, UT, USA) + HL; G4: 38% HP; and G5: 15% carbamide peroxide (CP) (Opalescence PF - Ultradent, South Jordan, UT, USA). For G1 and G3, HP was applied to the enamel surface in 3 consecutive applications activated by HL; each application included 3 x 3' HL activations with a 1' interval between them. For G2 and G4, HP was applied 3 x 15' with 15' between intervals; and for G5, 15% CP was applied for 120'/10 days at home. A spectrophotometer was used to measure color change before the treatment and after 24 h, 1 week, and 1, 6, 12, 18 and 24 months. A VAS questionnaire was used to evaluate tooth sensitivity before treatment, immediately after treatment, 24 h after, and 1 week after. Results: Statistical analysis did not reveal any significant difference in effectiveness between in-office bleaching with and without HL activation; nevertheless, the time required was shorter with HL. Statistical differences were observed between the results at 24 h, 1 week, and 1, 6, 12, 18 and 24 months (intergroup). Immediately after treatment, in-office bleaching increased tooth sensitivity. The groups activated with HL required less gel application time. Conclusion: All techniques and bleaching agents used were effective and demonstrated similar behaviors.
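Spectrophotometric color change in bleaching studies of this kind is usually expressed as a ΔE value in CIELAB space, computed from the L*, a*, b* coordinates before and after treatment. The abstract does not state which ΔE variant was used; a sketch of the classic CIE76 formula with hypothetical shade readings:

```python
import math

def delta_e_cie76(lab_before, lab_after):
    """CIE76 color difference between two (L*, a*, b*) measurements."""
    return math.sqrt(sum((b - a) ** 2 for a, b in zip(lab_before, lab_after)))

# Hypothetical tooth readings before bleaching and at the 24 h recall.
baseline = (68.0, 2.5, 18.0)
after_24h = (74.0, 1.0, 12.0)
print(f"dE = {delta_e_cie76(baseline, after_24h):.2f}")
```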

Relevance:

80.00%

Publisher:

Abstract:

BACKGROUND: In Brazil, nationally representative donor data on human immunodeficiency virus (HIV) prevalence, incidence, and residual transfusion risk are limited. The objective of this study was to analyze HIV data obtained over 24 months by the Retrovirus Epidemiology Donor Study-II program in Brazil. STUDY DESIGN AND METHODS: Donations reactive to third- and fourth-generation immunoassays (IAs) were further confirmed by a less-sensitive (LS) IA algorithm and Western blot (WB). Incidence was calculated for first-time (FT) donors using the LS IA results and for repeat donors with a model developed to include all donors with a previous negative donation. Residual risk was projected by multiplying composite FT and repeat donor incidence rates by HIV marker-negative infectious window periods. RESULTS: HIV prevalence among FT donors was 92.2 per 100,000 donations. FT donor, repeat donor, and composite incidences were 38.5 (95% confidence interval [CI], 25.6-51.4), 22.5 (95% CI, 17.6-28.0), and 27.5 (95% CI, 22.0-33.0) per 100,000 person-years, respectively. Male and community donors had higher prevalence and incidence rates than female and replacement donors. The estimated residual risk of HIV transfusion transmission was 11.3 per million donations (95% CI, 8.4-14.2), which could be reduced to 4.2 per million donations (95% CI, 3.2-5.2) by use of individual-donation nucleic acid testing (NAT). CONCLUSION: The incidence and residual transfusion risk of HIV infection are relatively high in Brazil. Implementation of NAT alone will not be sufficient to decrease transmission rates to levels seen in the United States or Europe; therefore, other measures focused on decreasing donations by at-risk individuals are also necessary.
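The residual-risk figures follow the standard incidence/window-period model: risk per donation ≈ incidence rate × length of the marker-negative infectious window. A sketch of that arithmetic, where the window lengths are assumptions back-calculated from the reported risks (the abstract does not state the exact window-period inputs used):

```python
def residual_risk_per_million(incidence_per_100k_py, window_days):
    """Incidence/window-period model: risk = incidence rate x infectious window."""
    incidence_per_py = incidence_per_100k_py / 100_000
    return incidence_per_py * (window_days / 365.25) * 1_000_000

# Composite incidence from the study: 27.5 per 100,000 person-years.
# Window lengths below are assumptions chosen to reproduce the reported risks.
print(round(residual_risk_per_million(27.5, 15.0), 1))  # serology-only window
print(round(residual_risk_per_million(27.5, 5.6), 1))   # individual-donation NAT window
```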

Relevance:

80.00%

Publisher:

Abstract:

Long-term sample storage can affect the intensity of the hybridization signals provided by molecular diagnostic methods that use chemiluminescent detection. The aim of this study was to evaluate the effect of different storage times on the hybridization signals of 13 bacterial species detected by the Checkerboard DNA-DNA hybridization method using whole-genomic DNA probes. Ninety-six subgingival biofilm samples were collected from 36 healthy subjects, and the intensity of hybridization signals was evaluated at 4 different time points: (1) immediately after collection (n = 24) and after storage at -20 degrees C for (2) 6 months (n = 24), (3) 12 months (n = 24), and (4) 24 months (n = 24). The intensity of the hybridization signals obtained from groups 1 and 2 was significantly higher than in the other groups (p < 0.001). No difference was found between groups 1 and 2 (p > 0.05). The Checkerboard DNA-DNA hybridization method was able to detect hybridization signals in all groups evaluated, but the intensity of the signals decreased significantly after long periods of sample storage.

Relevance:

80.00%

Publisher:

Abstract:

Background: Although iron deficiency is considered the main cause of anemia in children worldwide, other contributors to childhood anemia remain little studied in developing countries. We estimated the relative contributions of different factors to anemia in a population-based, cross-sectional survey. Methodology: We obtained venous blood samples from 1111 children aged 6 months to 10 years living in the frontier town of Acrelândia, northwest Brazil, to estimate the prevalence of anemia and iron deficiency by measuring hemoglobin, erythrocyte indices, ferritin, soluble transferrin receptor, and C-reactive protein concentrations. Children were simultaneously screened for vitamin A, vitamin B-12, and folate deficiencies; intestinal parasite infections; glucose-6-phosphate dehydrogenase deficiency; and sickle cell trait carriage. Multiple Poisson regression was used to describe associations between anemia and the independent variables as adjusted prevalence ratios (aPR). Principal Findings: The prevalences of anemia, iron deficiency, and iron-deficiency anemia were 13.6%, 45.4%, and 10.3%, respectively. Children whose families were in the highest income quartile, compared with the lowest, had a lower risk of anemia (aPR, 0.60; 95% CI, 0.37-0.98). Child age (<24 months: 2.90; 2.01-4.20) and maternal parity (>2 pregnancies: 2.01; 1.40-2.87) were positively associated with anemia. Other associated correlates were iron deficiency (2.1; 1.4-3.0), vitamin B-12 deficiency (1.4; 1.0-2.2), folate deficiency (2.0; 1.3-3.1), and elevated C-reactive protein concentrations (>5 mg/L: 1.5; 1.1-2.2). Conclusions: Addressing morbidities and multiple nutritional deficiencies in children and mothers and improving the purchasing power of poorer families are potentially important interventions to reduce the burden of anemia.
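The study reports adjusted prevalence ratios from multiple Poisson regression; underlying such estimates is the crude prevalence ratio, the ratio of outcome prevalences between exposed and unexposed groups, with a log-scale (Katz) confidence interval. A minimal unadjusted sketch using illustrative counts, not the study's data:

```python
import math

def prevalence_ratio(a, n1, c, n0, z=1.96):
    """Crude prevalence ratio with a Katz log-scale 95% CI.
    a/n1: cases/total among exposed; c/n0: cases/total among unexposed."""
    pr = (a / n1) / (c / n0)
    se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n0)
    return pr, pr * math.exp(-z * se), pr * math.exp(z * se)

# Illustrative: anemia in 30/100 iron-deficient vs 15/100 iron-replete children.
pr, lo, hi = prevalence_ratio(30, 100, 15, 100)
print(f"PR = {pr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The adjusted ratios in the abstract additionally control for covariates via regression; this sketch only shows the two-group case.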

Relevance:

80.00%

Publisher:

Abstract:

There is controversy about the best way to report results after bariatric surgery. Several indices have been proposed over the years, such as percentage of total weight loss (%TWL), percentage of excess weight loss (%EWL), and percentage of excess body mass index loss (%EBMIL). More recently, it has been suggested to individualize the body mass index (BMI) goal to be achieved by the patients (predicted BMI, PBMI). The objective was to assess the reproducibility of this PBMI in our service. In this retrospective study, we assessed the %TWL, %EWL, %EBMIL (with expected BMI of 25 kg/m²), and %EBMIL (with PBMI) over 4 years of observation in two groups of patients: BMI < 50 kg/m² and BMI ≥ 50 kg/m². The medical records of 403 patients were studied. From 18 to 42 months after surgery, %TWL was higher in the superobese group, whereas %EWL was similar for the two groups. %EBMIL was higher in less obese patients up to 24 months and similar thereafter. In contrast, %EBMIL with PBMI was greater in the superobese group, although it never reached the 100% goal. We conclude that %EBMIL results according to PBMI were not reproducible in our institution. There is a need to elaborate a new easy-to-obtain and reproducible index.
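The competing indices discussed above are simple functions of weight, height, and a reference BMI; the PBMI variant of %EBMIL just swaps the fixed 25 kg/m² target for a patient-specific predicted BMI. A sketch of the standard definitions (PBMI itself comes from a prediction model not detailed in the abstract, so it is passed in as a parameter):

```python
def pct_twl(w0, w):
    """Percentage of total weight loss."""
    return 100 * (w0 - w) / w0

def pct_ewl(w0, w, height_m, ideal_bmi=25.0):
    """Percentage of excess weight loss, excess defined against an ideal BMI."""
    ideal_weight = ideal_bmi * height_m ** 2
    return 100 * (w0 - w) / (w0 - ideal_weight)

def pct_ebmil(bmi0, bmi, target_bmi=25.0):
    """Percentage of excess BMI loss; pass target_bmi=PBMI for the
    individualized variant assessed in the study."""
    return 100 * (bmi0 - bmi) / (bmi0 - target_bmi)

# Example: 150 kg -> 105 kg at 1.70 m (BMI about 51.9 -> 36.3, superobese).
h = 1.70
print(round(pct_twl(150, 105), 1))
print(round(pct_ewl(150, 105, h), 1))
print(round(pct_ebmil(150 / h**2, 105 / h**2), 1))
```

Note that with a fixed target BMI of 25 kg/m², %EWL and %EBMIL are algebraically identical; they diverge only when an individualized PBMI replaces the fixed target.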

Relevance:

80.00%

Publisher:

Abstract:

PURPOSE: To describe a series of patients with lacrimal system obstruction associated with radioiodine therapy (RIT) for thyroid carcinoma, and to review the clinical data and the response to surgical treatment of this rare complication. METHODS: A retrospective analysis was performed of the ophthalmologic findings of patients with a history of thyroid carcinoma, previously submitted to thyroidectomy and RIT, who were referred for lacrimal drainage surgery. RESULTS: Seventeen patients with thyroid carcinoma treated with thyroidectomy and RIT presented symptomatic nasolacrimal duct obstruction at a mean of 13.2 months after cancer treatment. Eleven patients had bilateral epiphora, 8 of them with lacrimal sac mucocele. Patient age ranged from 30 to 80 years, with 10 patients aged 49 years or younger. The mean cumulative dose of radioiodine administered was 571 mCi (range, 200-1200 mCi). Symptoms of nasal obstruction and salivary gland enlargement occurred in 53% of the patients. All patients underwent dacryocystorhinostomy. Greater intraoperative bleeding and lacrimal sac dilation were also observed in the 3 youngest patients. Complete resolution of epiphora and dacryocystitis occurred in 82.4% of the patients and was partial in 17.6% (3 patients maintained a unilateral complaint after bilateral correction of the obstruction). Mean follow-up was 6 months (range, 2-24 months). CONCLUSIONS: A high cumulative radioiodine dose and nasal and salivary gland dysfunction are associated with lacrimal drainage obstruction. A higher percentage of younger patients present with dacryocystitis than in idiopathic dacryostenosis. Radioiodine uptake by the nasolacrimal duct mucosa, with subsequent inflammation, edema and fibrosis, appears to be directly related to nasolacrimal duct obstruction. Awareness of this complication is important for the proper workup and management of these patients.

Relevance:

80.00%

Publisher:

Abstract:

Langerhans cell histiocytosis (LCH) is a rare disorder that can affect almost any organ, including bone. Treatment options include local corticosteroid infiltration for isolated bone lesions and oral corticosteroids and chemotherapy for multifocal bone lesions. Several studies show that local corticosteroid injection heals unifocal bone lesions in more than 75% of patients with minimal side effects. It is therefore unclear whether chemotherapy adds materially to the healing rate. We therefore compared overall survival, remission rate, and recurrence rate in patients with bone LCH treated with chemotherapy plus corticosteroids or with corticosteroids alone. We retrospectively reviewed the records of 198 patients with LCH since 1950. Median age at diagnosis was 5 years, the male-to-female ratio was 1.33, and the most frequent symptom was local pain (95%). We recorded the disease presentation, demographics, treatment, and clinical evolution of each patient. Minimum followup was 4 months (median, 24 months; range, 4-360 months). The survival rate of the systemic disease group was 76.5% (65 of 85), while the survival rate in the unifocal and multifocal bone involvement groups was 100% at a median 5-year followup. All patients with unifocal bone involvement and 40 of 43 (93%) with multifocal bone involvement had complete remission. One of 30 patients with multifocal bone involvement treated with chemotherapy and oral corticosteroids did not achieve remission, whereas two of six receiving only corticosteroids did not achieve remission. Our observations suggest intralesional corticosteroid injection without adjunctive chemotherapy achieves remission in unifocal bone LCH but may not do so in multifocal single-system bone involvement. Larger series would be required to confirm this observation. Level IV, therapeutic study. See Guidelines for Authors for a complete description of levels of evidence.

Relevance:

80.00%

Publisher:

Abstract:

Objective: The ketogenic diet is used as a therapeutic alternative for patients with refractory epilepsy. It simulates the biochemical changes typical of fasting. The present study assessed the nutritional impact of the ketogenic diet on children with refractory epilepsy. Methods: Nutritional status data (dietary, biochemical and anthropometric measurements), seizure frequency, and adverse events were collected from medical records and during outpatient clinic visits over a period of 36 months. Results: Of the 29 children who initiated the ketogenic diet, 75.8% presented fewer seizures after one month of treatment. After six months, 48.3% of the patients had at least a 90.0% decrease in seizure frequency, and 50.0% of these presented total seizure remission. At 12 months, eight patients continued to show positive results, and seven of them remained on the ketogenic diet for 24 months. Nutritional status improved at 24 months, especially in terms of weight, culminating in the recovery of adequate weight-for-height. There were no significant changes in biochemical indices (total cholesterol and components, triglycerides, albumin, total protein, creatinine, glycemia, serum aspartate transaminase and serum alanine transaminase). Serum cholesterol levels increased significantly in the first month, fell over the following six months, and remained within normal limits thereafter. Conclusion: Patients on the classic ketogenic diet for at least 24 months gained weight. Moreover, approximately one third of the patients achieved a significant reduction in seizure frequency, and some achieved total remission.

Relevance:

80.00%

Publisher:

Abstract:

Hematopoietic cell transplantation (HCT) is an emerging therapy for patients with severe autoimmune diseases (AID). We report data on 368 patients with AID who underwent HCT in 64 North and South American transplantation centers and were reported to the Center for International Blood and Marrow Transplant Research between 1996 and 2009. Most of the HCTs involved autologous grafts (n = 339); allogeneic HCT (n = 29) was done mostly in children. The most common indications for HCT were multiple sclerosis, systemic sclerosis, and systemic lupus erythematosus. The median age at transplantation was 38 years for autologous HCT and 25 years for allogeneic HCT. The corresponding times from diagnosis to HCT were 35 months and 24 months. Three-year overall survival after autologous HCT was 86% (95% confidence interval [CI], 81%-91%). Median follow-up of survivors was 31 months (range, 1-144 months). The most common causes of death were AID progression, infections, and organ failure. On multivariate analysis, the risk of death was higher in patients at centers that performed fewer than 5 autologous HCTs (relative risk, 3.5; 95% CI, 1.1-11.1; P = .03) or 5 to 15 autologous HCTs for AID during the study period (relative risk, 4.2; 95% CI, 1.5-11.7; P = .006) compared with patients at centers that performed more than 15 autologous HCTs for AID during the study period. AID is an emerging indication for HCT in the region. Collaboration of hematologists and other disease specialists with an outcomes database is important to promote optimal patient selection, analysis of the impact of prognostic variables and long-term outcomes, and development of clinical trials. Biol Blood Marrow Transplant 18: 1471-1478 (2012). © 2012 Published by Elsevier Inc. on behalf of American Society for Blood and Marrow Transplantation.

Relevance:

80.00%

Publisher:

Abstract:

Background: The ankle-brachial index (ABI) can assess peripheral artery disease and predict mortality in prevalent patients on hemodialysis. However, ABI has not yet been tested in incident patients, who present significant mortality. Typically, ABI is measured by Doppler, which is not always available, limiting its use in most patients. We therefore hypothesized that ABI, evaluated by a simplified method, can predict mortality in an incident hemodialysis population. Methodology/Principal Findings: We studied 119 patients with ESRD who had started hemodialysis three times weekly. ABI was calculated by using two oscillometric blood pressure devices simultaneously. Patients were followed until death or the end of the study. ABI was categorized into two groups: normal (0.9-1.3) or abnormal (<0.9 or >1.3). There were 33 deaths during a median follow-up of 12 months (range, 3 to 24 months). Age (per 1 year: hazard ratio, 1.026; p = 0.014) and abnormal ABI (hazard ratio, 3.664; p = 0.001) were independently related to mortality in a multiple regression analysis. Conclusions: An easy and inexpensive technique to measure ABI was tested and shown to be significant in predicting mortality. Both low and high ABI were associated with mortality in incident patients on hemodialysis. This technique allows nephrologists to identify high-risk patients and offers the opportunity for early intervention that could alter the natural course of disease in this population.
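The ABI itself is just the ratio of ankle to brachial systolic pressure, and the study's simplified oscillometric method still classifies the result with the usual cut-offs. A sketch of the calculation and the normal/abnormal grouping described above (single-reading version; which limb readings are combined is a protocol detail not given in the abstract):

```python
def ankle_brachial_index(ankle_systolic, brachial_systolic):
    """Ankle-brachial index from systolic pressures (mmHg)."""
    return ankle_systolic / brachial_systolic

def classify_abi(abi):
    """Grouping used in the study: normal 0.9-1.3, abnormal otherwise."""
    return "normal" if 0.9 <= abi <= 1.3 else "abnormal"

abi = ankle_brachial_index(110, 140)
print(f"ABI = {abi:.2f} ({classify_abi(abi)})")
```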

Relevance:

80.00%

Publisher:

Abstract:

The present work aimed to evaluate the seasonal diameter increment of Eucalyptus grandis trees over 24 months and its relationship with climatic variables and with nitrogen and sewage sludge fertilization. The trees were planted at a spacing of 3 x 2 m and fertilized with nitrogen (at planting and at 6, 12 and 18 months) and with sewage sludge (at planting and at 8 months). Twenty trees per treatment were selected according to the basal area distribution, and dendrometer bands were installed at 1.3 m height. The results showed a clear effect of the climatic variables on the seasonal diameter increment of the trees, with a lag of 28 days observed between the climatic variables and the response of the trees. Regarding the fertilization effect, the trunk diameter increment was higher in the eucalypt trees receiving organic fertilization (sewage sludge) than in those receiving mineral nitrogen fertilization.
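A lag like the 28 days reported above is typically found by shifting the climate series against the increment series and keeping the offset that maximizes their correlation. A minimal pure-Python sketch of that lagged-correlation search on made-up series (the study's actual data and method are not given in the abstract):

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def best_lag(climate, increment, max_lag):
    """Lag (in sampling steps) at which the climate series best
    predicts the later increment series."""
    scores = {}
    for lag in range(max_lag + 1):
        c = climate[: len(climate) - lag]
        i = increment[lag:]
        scores[lag] = pearson(c, i)
    return max(scores, key=scores.get)

# Toy data: growth echoes rainfall two sampling steps later.
rain = [10, 80, 20, 90, 15, 70, 25, 85, 12, 60]
growth = [5, 5] + [r / 10 + 3 for r in rain[:-2]]
print(best_lag(rain, growth, max_lag=4))
```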

Relevance:

80.00%

Publisher:

Abstract:

Objectives: Cognitive decline related to neurocysticercosis (NC) remains poorly characterized and underdiagnosed. In a cross-sectional study with a prospective phase, we evaluated cognitive decline in patients with the strictly calcified form (C-NC), the epidemiologically largest subgroup of NC, and investigated whether there is a spectrum of cognitive abnormalities in the disease. Methods: Forty treatment-naive patients with C-NC aged 37.6 +/- 11.3 years and fulfilling criteria for definitive C-NC underwent a comprehensive cognitive and functional evaluation and were compared with 40 patients with active NC (A-NC) and 40 healthy controls (HC) matched for age and education. Patients with dementia were reassessed after 24 months. Results: Patients with C-NC presented 9.4 +/- 3.1 altered test scores out of the 30 in the cognitive battery when compared with HC. No patient with C-NC had dementia, and 10 patients (25%) presented cognitive impairment-no dementia (CIND). The A-NC group had 5 patients (12.5%) with dementia and 11 patients (27.5%) with CIND. On follow-up, 3 of the 5 patients with A-NC who previously had dementia still presented cystic lesions with scolex on MRI and still had dementia. One patient died, and the remaining patient no longer fulfilled criteria for either dementia or CIND, presenting exclusively calcified lesions on neuroimaging. Conclusions: Independently of its phase, NC leads to a spectrum of cognitive abnormalities, ranging from impairment in a single domain, to CIND and, occasionally, to dementia. These findings are more conspicuous during the active vesicular phase and less prominent in calcified stages. Neurology® 2012;78:861-866

Relevance:

80.00%

Publisher:

Abstract:

PURPOSE: To assess corneal wavefront-guided photorefractive keratectomy (PRK) to correct hyperopia after radial keratotomy (RK). SETTING: Sadalla Amin Ghanem Eye Hospital, Joinville, Santa Catarina, Brazil. DESIGN: Case series. METHODS: Excimer laser corneal wavefront-guided PRK with intraoperative mitomycin-C (MMC) 0.02% was performed. Main outcome measures were uncorrected (UDVA) and corrected (CDVA) distance visual acuities, spherical equivalent (SE), corneal aberrations, and haze. RESULTS: The mean time between RK and PRK in the 61 eyes (39 patients) was 18.8 years +/- 3.8 (SD). Before PRK, the mean SE was +4.17 +/- 1.97 diopters (D); the mean astigmatism, -1.39 +/- 1.04 D; and the mean CDVA, 0.161 +/- 0.137 logMAR. At 24 months, the mean values were 0.14 +/- 0.99 D (P<.001), -1.19 +/- 1.02 D (P=.627), and 0.072 +/- 0.094 logMAR (P<.001), respectively; the mean UDVA was 0.265 +/- 0.196 (P<.001). The UDVA was 20/25 or better in 37.7% of eyes and 20/40 or better in 68.9%. The CDVA improved by 1 or more lines in 62.3% of eyes. Two eyes (3.3%) lost 2 or more lines, 1 due to corneal ectasia. Thirty eyes (49.2%) were within +/- 0.50 D of intended SE and 45 (73.8%) were within +/- 1.00 D. From 6 to 24 months, the mean SE regression was +0.39 D (P<.05). A significant decrease in coma, trefoil, and spherical aberration occurred. Three eyes developed peripheral haze more than grade 1. CONCLUSION: Corneal wavefront-guided PRK with MMC for hyperopia after RK significantly improved UDVA, CDVA, and higher-order corneal aberrations with a low incidence of visually significant corneal haze.
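Spherical equivalent (SE), the main refractive outcome reported above, is the sphere plus half the cylinder. A one-line sketch of that standard definition:

```python
def spherical_equivalent(sphere_d, cylinder_d):
    """Spherical equivalent (diopters): sphere plus half the cylinder."""
    return sphere_d + cylinder_d / 2

# e.g. a hyperopic post-RK eye: +4.75 D sphere with -1.25 D of astigmatism.
print(spherical_equivalent(+4.75, -1.25))  # 4.125
```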

Relevance:

80.00%

Publisher:

Abstract:

Objective: To build a life table and determine the factors related to the duration of treatment of undernourished children at a nutrition rehabilitation centre (CREN), São Paulo, Brazil. Design: Nutritional status was assessed from weight-for-age, height-for-age and BMI-for-age Z-scores, while neuropsychomotor development was classified according to the milestones of childhood development. Life tables, Kaplan-Meier survival curves and Cox multiple regression models were employed in the data analysis. Setting: CREN (Centre of Nutritional Recovery and Education), São Paulo, Brazil. Subjects: Undernourished children (n = 228) from the southern slums of São Paulo who had received treatment at CREN under a day-hospital regime between 1994 and 2009. Results: Kaplan-Meier survival curves showed statistically significant differences in the duration of treatment at CREN between children presenting different degrees of neuropsychomotor development (log-rank = 6.621; P = 0.037). Estimates based on the multivariate Cox model revealed that children aged ≥24 months at admission had a lower probability of nutritional rehabilitation (hazard ratio (HR) = 0.49; P = 0.046) by the end of the period compared with infants aged up to 12 months. Children presenting slow development were better rehabilitated than those exhibiting adequate evolution (HR = 4.48; P = 0.023). No significant effects of sex, degree of undernutrition or birth weight on the probability of nutritional rehabilitation were found. Conclusions: Age and neuropsychomotor developmental status at admission to CREN are critical factors in determining the duration of treatment.
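Life tables and Kaplan-Meier curves like those used in this study estimate, from possibly censored treatment durations, the probability of a child still being in treatment (not yet rehabilitated) at each time point. A minimal product-limit sketch on illustrative data, not the CREN cohort:

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimates.
    times: duration for each child; events: 1 = rehabilitated (event), 0 = censored.
    Returns (time, survival probability) pairs at each event time."""
    s, curve = 1.0, []
    for t in sorted(set(ti for ti, e in zip(times, events) if e)):
        at_risk = sum(1 for ti in times if ti >= t)       # still in treatment at t
        d = sum(1 for ti, e in zip(times, events) if ti == t and e)  # events at t
        s *= 1 - d / at_risk
        curve.append((t, s))
    return curve

# Illustrative durations (months) and rehabilitation indicators.
times = [3, 5, 5, 8, 10, 12, 12, 15]
events = [1, 1, 0, 1, 0, 1, 1, 0]
for t, s in kaplan_meier(times, events):
    print(t, round(s, 3))
```

The log-rank statistic quoted in the abstract then compares such curves between groups (e.g. developmental categories).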