12 results for "Take-up rate"
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
Background: Bipolar Disorder (BD) is a chronic, recurrent and highly prevalent illness. Despite the need for a correct diagnosis to allow proper treatment, studies have shown that reaching a diagnosis can take up to ten years, owing to poor recognition of the broader presentations of BD. Frequent comorbidity with other psychiatric disorders is a major cause of misdiagnosis and warrants thorough evaluation. Methods/Design: ESPECTRA (Occurrence of Bipolar Spectrum Disorders in Eating Disorder Patients) is a single-site cross-sectional study with a comparison group, designed to evaluate the prevalence of the bipolar spectrum in an eating disorder sample. Women aged 18-45 years will be evaluated using the SCID-P and Zurich criteria for diagnosis, and the HAM-D, YOUNG, SCI-MOODS, HCL-32, BIS-11, BSQ, WHOQoL and EAS instruments for rating symptoms and measuring clinical correlates. Discussion: The classificatory systems in psychiatry are based on categorical models that have been criticized for oversimplifying diagnosis and inflating comorbidity. Dimensional approaches have been proposed to improve the validity and reliability of psychiatric disorder assessments, especially in conditions with high rates of comorbidity such as BD and Eating Disorder (ED). The Bipolar Spectrum (BS) remains under-recognized in clinical practice, and its definition is not well established in current diagnostic guidelines. A broader evaluation of psychiatric disorders combining categorical and dimensional views could yield a more realistic understanding of comorbidity and help establish a prognosis.
Abstract:
The seroprevalence and geographic distribution of HTLV-1/2 among blood donors are extremely important to transfusion services. We evaluated the seroprevalence of HTLV-1/2 infection among first-time blood donor candidates in the city and region of Ribeirão Preto. From January 2000 to December 2010, 1,038,489 blood donations were collected, of which 301,470 were first-time donations. All samples were screened for HTLV-1/2 by enzyme immunoassay (EIA). In addition, the frequency of coinfection with hepatitis B virus (HBV), hepatitis C virus (HCV), human immunodeficiency virus (HIV), Chagas disease (CD) and syphilis was also determined. An in-house PCR was used as the confirmatory test for HTLV-1/2. A total of 296 (0.1%) first-time donors were serologically reactive for HTLV-1/2. Confirmatory PCR of 63 samples showed that 28 were HTLV-1 positive, 13 HTLV-2 positive, 19 negative and three indeterminate. The most prevalent HTLV coinfections were with HBV (51.3%) and HCV (35.9%), but coinfection with HIV, CD and syphilis was also detected. The real number of HTLV-infected individuals and the coinfection rate in the population are underestimated, and epidemiological studies such as ours are very informative.
Abstract:
Background and Objectives: Patients who survive acute kidney injury (AKI), especially those with partial renal recovery, face a higher long-term mortality risk. However, there is no consensus on the best time to assess renal function after an episode of AKI, nor agreement on the definition of renal recovery. In addition, only limited data on predictors of recovery are available. Design, Setting, Participants, & Measurements: From 1984 to 2009, 84 adult survivors of AKI were followed by the same nephrologist (RCRMA) for a median of 4.1 years. Patients were seen at least once a year after discharge until end-stage renal disease (ESRD) or death. At each consultation, serum creatinine was measured and the glomerular filtration rate estimated. Renal recovery was defined as a glomerular filtration rate >= 60 mL/min/1.73 m2. Multiple logistic regression was performed to evaluate factors independently associated with renal recovery. Results: The median length of follow-up was 50 months (30-90 months). All patients had stabilized their glomerular filtration rates by 18 months, and 83% of them stabilized earlier, within 12 months. Renal recovery occurred in 16 patients (19%) at discharge and in 54 (64%) by 18 months. Six patients died and four progressed to ESRD during the follow-up period. Age (OR 1.09, p < 0.0001) and serum creatinine at hospital discharge (OR 2.48, p = 0.007) were independent factors associated with non-recovery of renal function. AKI severity, evaluated by peak serum creatinine and need for dialysis, was not associated with non-recovery. Conclusions: Renal recovery should be evaluated no earlier than one year after an AKI episode. Nephrology referral should be considered mainly for older patients and those with elevated serum creatinine at hospital discharge.
Abstract:
In this study, the physiological responses and rating of perceived exertion of Brazilian jiu-jitsu fighters submitted to a combat simulation were investigated. Venous blood samples and heart rate were taken from twelve male Brazilian jiu-jitsu athletes (27.1+/-2.7 yrs, 75.4+/-8.8 kg, 174.9+/-4.4 cm, 9.2+/-2.4% fat) at rest, after a warm-up (ten minutes), immediately after the fight simulation (seven minutes) and after recovery (fourteen minutes). After the combat, the rating of perceived exertion was recorded. The combat did not change blood concentrations of glucose, triglycerides, total cholesterol, low-density lipoprotein, very-low-density lipoprotein, urea or ammonia. However, blood levels of high-density lipoprotein were significantly higher post-fight (before: 43.0+/-6.9 mg/dL; after: 45.1+/-8.0 mg/dL) and remained elevated during the recovery period (43.6+/-8.1 mg/dL) compared with resting values (40.0+/-6.6 mg/dL). The fight did not change the concentrations of the cell-damage markers creatine kinase, aspartate aminotransferase and creatinine. However, blood concentrations of the enzymes alanine aminotransferase (before: 16.1+/-7.1 U/L; after: 18.6+/-7.1 U/L) and lactate dehydrogenase (before: 491.5+/-177.6 U/L; after: 542.6+/-141.4 U/L) were elevated after the fight. Heart rate (before: 122+/-25 bpm; after: 165+/-17 bpm) and lactate (before: 2.5+/-1.2 mmol/L; after: 11.9+/-5.8 mmol/L) increased significantly with the completion of combat. Despite this, the athletes rated the fight as light or somewhat hard (12+/-2). These results show that muscle glycogen is not the only substrate used in Brazilian jiu-jitsu fights, since there are indications of activation of the glycolytic, lipolytic and proteolytic pathways. Furthermore, the athletes rated the combats as light or somewhat hard even though muscle-damage markers were generated.
Abstract:
We assessed the efficacy of implantable cardioverter-defibrillators (ICDs) in patients with Chagas' heart disease (ChHD) and identified clinical predictors of mortality and ICD shock during long-term follow-up. ChHD is associated with ventricular tachyarrhythmias and an increased risk of sudden cardiac death. Although ChHD is a common form of cardiomyopathy among Latin American ICD users, little is known about ICD efficacy in this population. The study cohort included 116 consecutive patients with ChHD and an ICD implanted for secondary prevention. Of the 116 patients, 83 (72%) were men; the mean age was 54 +/- 10.7 years. Several clinical variables were tested in a multivariate Cox model for predicting long-term mortality. The average follow-up was 45 +/- 32 months. New York Heart Association class I-II developed in 83% of patients. The mean left ventricular ejection fraction was 42 +/- 16% at implantation. Of the 116 patients, 58 (50%) had appropriate shocks and 13 (11%) had inappropriate therapy. A total of 31 patients died (7.1% annual mortality rate). New York Heart Association class III (hazard ratio [HR] 3.09, 95% confidence interval 1.37 to 6.96, p = 0.0064) was a predictor of worse prognosis. The left ventricular ejection fraction (HR 0.972, 95% confidence interval 0.94 to 0.99, p = 0.0442) and low cumulative right ventricular pacing (HR 0.23, 95% confidence interval 0.11 to 0.49, p = 0.0001) were predictors of better survival. The left ventricular diastolic diameter was an independent predictor of appropriate shock (HR 1.032, 95% confidence interval 1.004 to 1.060, p = 0.025). In conclusion, over long-term follow-up, ICD efficacy for secondary prevention of sudden cardiac death in patients with ChHD was marked by a favorable annual rate of all-cause mortality (7.1%); 50% of the cohort received appropriate shock therapy.
New York Heart Association class III and left ventricular ejection fraction were independent predictors of worse prognosis, and low cumulative right ventricular pacing defined better survival. (Am J Cardiol 2012;110:1040-1045)
Abstract:
Increasing age is associated with a reduction in overall heart rate variability as well as changes in the complexity of physiologic dynamics. The aim of this study was to verify whether the alterations in autonomic modulation of heart rate caused by the aging process could be detected by Shannon entropy (SE), conditional entropy (CE) and symbolic analysis (SA). Complexity analysis was carried out in 44 healthy subjects divided into two groups: old (n = 23, 63 +/- 3 years) and young (n = 21, 23 +/- 2 years). SE, CE [complexity index (CI) and normalized CI (NCI)] and SA (0V, 1V, 2LV and 2ULV patterns) were analyzed over short heart period series (200 cardiac beats) derived from ECG recordings during 15 min of rest in the supine position. Sequences of three heart periods with no significant variations (0V), and those with two significant unlike variations (2ULV), reflect changes in sympathetic and vagal modulation, respectively. The unpaired t test (or the Mann-Whitney rank sum test, when appropriate) was used for statistical analysis. In the aging process, the distribution of patterns (SE) remains similar to that of young subjects. However, the regularity differs significantly: the patterns are more repetitive in the old group (a decrease in CI and NCI). The amounts of the pattern types also differ: 0V is increased, while 2LV and 2ULV are reduced, in the old group. These differences indicate a marked change in autonomic regulation. CE and SA are feasible techniques for detecting alterations in the autonomic control of heart rate in the old group.
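To illustrate the kind of symbolic analysis this abstract describes, the sketch below quantizes a heart period (RR interval) series into six uniform levels, classifies every 3-beat pattern as 0V, 1V, 2LV or 2ULV, and computes the Shannon entropy of the observed pattern distribution. This is a minimal reconstruction of the general technique, not the authors' code; the six-level uniform quantization and the function name are illustrative assumptions.

```python
import math
from collections import Counter

def symbolic_analysis(rr, levels=6):
    """Quantize an RR-interval series into `levels` uniform bins, form
    3-beat patterns, and classify each as 0V/1V/2LV/2ULV.
    Returns the pattern-family fractions and the Shannon entropy of the
    3-symbol pattern distribution."""
    lo, hi = min(rr), max(rr)
    width = (hi - lo) / levels or 1.0  # guard against a constant series
    sym = [min(int((x - lo) / width), levels - 1) for x in rr]
    counts = Counter()
    for a, b, c in zip(sym, sym[1:], sym[2:]):
        d1, d2 = b - a, c - b
        if d1 == 0 and d2 == 0:
            counts['0V'] += 1          # no variations
        elif d1 == 0 or d2 == 0:
            counts['1V'] += 1          # exactly one variation
        elif d1 * d2 > 0:
            counts['2LV'] += 1         # two like variations (both up/down)
        else:
            counts['2ULV'] += 1        # two unlike variations
    total = sum(counts.values())
    fractions = {k: counts[k] / total for k in ('0V', '1V', '2LV', '2ULV')}
    # Shannon entropy over the observed 3-symbol patterns
    pat = Counter(zip(sym, sym[1:], sym[2:]))
    se = -sum((n / total) * math.log2(n / total) for n in pat.values())
    return fractions, se
```

In this framing, an age-related rise in the 0V fraction and a fall in 2LV/2ULV would correspond to the shift toward sympathetic predominance reported above.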
Abstract:
We evaluated the involvement of the paraventricular nucleus (PVN) in the changes in mean arterial pressure (MAP) and heart rate (HR) during an orthostatic challenge (head-up tilt, HUT). Adult male Wistar rats, instrumented with guide cannulas directed at the PVN and with arterial and venous catheters, underwent MAP and HR recording in the conscious state during induction of HUT. HUT induced an increase in MAP and HR, and pretreatment with prazosin and atenolol blocked these effects. After inhibition of neurotransmission with cobalt chloride (1 mM/100 nl) into the PVN, HR did not change; however, we observed a decrease in MAP during HUT. Our data suggest the involvement of the PVN in the brain circuitry underlying cardiovascular adjustment during orthostatic challenges.
Abstract:
Aims: The long-term clinical performance of drug-eluting stents (DES) coated with biodegradable polymers is poorly known. Methods and results: A total of 274 coronary patients were randomly allocated to paclitaxel-eluting stents, sirolimus-eluting stents, or bare metal stents (2:2:1 ratio). The two DES used the same biodegradable polymers and were identical except for the drug. At three years, the pooled DES population had similar rates of cardiac death or myocardial infarction (9.0% vs. 7.1%; p=0.6) but a lower risk of repeat interventions (10.0% vs. 29.9%; p<0.01) than controls with bare stents. The cumulative 3-year incidence of definite or probable stent thrombosis in the pooled DES group was 2.3% (first year: 1.8%; second year: 0.4%; third year: zero). There were no significant differences in outcomes between paclitaxel- and sirolimus-eluting stents. Conclusions: Biodegradable-polymer-coated DES releasing either paclitaxel or sirolimus were effective in reducing the 3-year rate of re-interventions.
Abstract:
Dentigerous cyst (DC) is one of the most common odontogenic cysts of the jaws and rarely recurs. On the other hand, keratocystic odontogenic tumor (KCOT), formerly known as odontogenic keratocyst (OKC), is considered a benign unicystic or multicystic intraosseous neoplasm and one of the most aggressive odontogenic lesions, with a relatively high recurrence rate and a tendency to invade adjacent tissue. Two cases of these odontogenic lesions occurring in children are presented. They were very similar in clinical and radiographic characteristics, and both were treated by marsupialization. This treatment was chosen to preserve the associated permanent teeth, with complementary orthodontic treatment to guide their eruption. At 7 years of follow-up, neither case showed recurrence.
Abstract:
This study evaluated the five-year clinical performance of ceramic inlays and onlays made with two systems: sintered Duceram (Dentsply-Degussa) and pressable IPS Empress (Ivoclar Vivadent). Eighty-six restorations were placed by a single operator in 35 patients with a median age of 33 years. The restorations were cemented with a dual-cured resin cement (Variolink II, Ivoclar Vivadent) and Syntac Classic adhesive under rubber dam. Evaluations were conducted by two independent investigators at baseline and at one, two, three, and five years using the modified United States Public Health Service (USPHS) criteria. At the five-year recall, 26 patients (74.28%) were evaluated, totalling 62 (72.09%) restorations. Four IPS Empress restorations had fractured, two restorations presented secondary caries (one IPS Empress and one Duceram), and two restorations showed unacceptable marginal defects and needed replacement (one from each ceramic system). An overall success rate of 87% was recorded. The Fisher exact test revealed no significant difference between the Duceram and IPS Empress ceramic systems for any aspect evaluated at the recall appointments (p>0.05). The McNemar chi-square test showed significant differences in marginal discoloration, marginal integrity, and surface texture between baseline and the five-year recall for both systems (p<0.001), with an increased percentage of Bravo scores. However, few Charlie or Delta scores were attributed to these restorations. In conclusion, both ceramic materials demonstrated acceptable clinical performance after five years.
HPV clearance in postpartum period of HIV-positive and negative women: a prospective follow-up study
Abstract:
Background: HPV persistence is a key determinant of cervical carcinogenesis. The influence of the postpartum period on HPV clearance has been debated. This study aimed to assess HPV clearance in later pregnancy and postpartum among HIV-positive and HIV-negative women. Methods: We conducted a follow-up study of 151 HPV-positive pregnant women, with and without HIV coinfection, in 2007–2010. After the baseline assessment, all women were retested for HPV infection by PCR in later pregnancy and after delivery. Multivariable logistic regressions assessed the putative association of covariates with HPV status between each pair of successive visits. Results: Seventy-one women (47%) had cleared HPV between the baseline visit and their second or third visit. HIV-positive women took significantly longer (7.0 ± 3.8 months) to clear HPV than those not infected with HIV (5.9 ± 3.0 months). HPV clearance was significantly more likely to occur after delivery than during pregnancy (84.5% vs. 15.5%). Conclusions: Both HIV-positive and HIV-negative women showed a significant reduction in HPV infection during the postpartum period. HIV-positive status was associated with a longer time to clear HPV infection in pregnant women.
Abstract:
Background. Brazil conducted mass immunization of women of childbearing age in 2001 and 2002. Surveillance was initiated for vaccination of women during pregnancy to monitor the effects of rubella vaccination on fetal outcomes. Methods. Women vaccinated while pregnant or prior to conception were reported to the surveillance system. Susceptibility to rubella infection was determined by anti-rubella immunoglobulin (Ig) M and IgG immunoassays. Susceptible women were observed through delivery. Live-born infants were tested for anti-rubella IgM antibody; IgM-seropositive newborns were tested for viral shedding and observed for 12 months for signs of congenital rubella syndrome. The incidence of congenital rubella infection was calculated using data from 7 states. Results. A total of 22,708 cases of rubella vaccination during pregnancy or prior to conception were reported nationwide, 20,536 (90%) of which were from 7 of the 27 states in Brazil. Of these, 2,332 women were susceptible to rubella infection at vaccination. Sixty-seven (4.1%) of 1,647 newborns had rubella IgM antibody (incidence rate, 4.1 congenital infections per 100 susceptible women vaccinated during pregnancy [95% confidence interval, 3.2–5.1]). None of the infants infected with rubella vaccine virus was born with congenital rubella syndrome. Conclusions. As rubella elimination goals are adopted worldwide, evidence of rubella vaccine safety aids in the planning and implementation of mass adult immunization.
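The reported incidence rate and interval can be reproduced with a Wilson score interval for a binomial proportion. This is an illustrative sketch only; the abstract does not state which interval method the authors used, though the Wilson interval happens to match the reported bounds.

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score confidence interval for a binomial proportion
    (z = 1.96 for a 95% interval)."""
    p = successes / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    margin = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return center - margin, center + margin

# 67 IgM-seropositive newborns among 1,647 tested
rate = 100 * 67 / 1647          # about 4.1 per 100
lo, hi = wilson_ci(67, 1647)    # about 3.2 to 5.1 per 100
```

The Wilson interval is preferable to the simple normal approximation here because the proportion (about 4%) is small, where the normal approximation's bounds drift noticeably.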