Abstract:
Introduction. Previous research has demonstrated that sildenafil citrate users alter dosing-sexual attempt behavior when switched to tadalafil. The impact of geography and culture on sexual behavior with phosphodiesterase type 5 (PDE5) inhibitor treatment has not been fully investigated. Aim. To describe and compare the changes in dosing-sexual attempt behavior with sildenafil citrate vs. tadalafil treatment across four distinct geographies: Asia, Australia/New Zealand (ANZ), Central Eastern Europe/Middle East (CEE/ME), and Latin America (LA). Methods. Data from a single-arm, open-label clinical trial conducted in 21 countries from November 2002 to May 2004 were used in this analysis. Men with erectile dysfunction and a history of >= 6 weeks of prior sildenafil citrate use continued sildenafil citrate treatment for 4 weeks and then switched to tadalafil for 8 weeks. Dosing instructions were provided. Main Outcome Measures. Timing of dose and sexual intercourse was assessed through patient diaries for the final 4 weeks of each treatment period. Results. A total of 2,760 men were enrolled: Asia 15.8%; ANZ 29.4%; CEE/ME 19.7%; LA 35.1%. The median time from dosing to intercourse increased significantly during tadalafil treatment across all geographical regions; however, the magnitude of the increase differed significantly by geography (P < 0.0001). The Asian cohort demonstrated the shortest duration between dosing and sexual intercourse attempts (irrespective of drug) and altered sexual behavior the least upon switching to tadalafil. The ANZ cohort demonstrated the longest duration between dosing and sexual intercourse attempts (irrespective of drug) and altered sexual behavior the most upon switching to tadalafil. Conclusion. Men with a history of established sildenafil citrate use alter their dose-attempt behavior when treated with tadalafil irrespective of geography. 
However, the extent to which sexual behavior alters is not uniform across geographical regions, suggesting that dosing instructions and duration of drug effectiveness, in combination with personal and cultural preferences, may determine sexual behavior with PDE5 inhibitor use. Rubio-Aurioles E, Glina S, Abdo CHN, Hernandez-Serrano R, Rampazzo C, Sotomayor M, West TM, Gallagher GL, and Lenero E. Timing of dose relative to sexual intercourse attempt in previous sildenafil citrate users treated with tadalafil: A geographical comparison from a single arm, open-label study. J Sex Med 2009;6:2836-2850.
Abstract:
Background. - Tardive dyskinesia (TD) is a movement disorder observed after chronic neuroleptic treatment. Smoking is presumed to increase the prevalence of TD. The question of a cause-effect relationship between smoking and TD, however, remains to be answered. The purpose of this study was to examine the correlation between the degree of smoking and the severity of TD with respect to differences caused by medication. Method. - We examined 60 patients suffering from schizophrenia and TD. We compared a clozapine-treated group with a group treated with typical neuroleptics. Movement disorders were assessed using the Abnormal Involuntary Movement Scale and digital image processing, a technique providing rater-independent information on perioral movements. Results. - We found a strong correlation (.80 < r < .90, always p < .0001) between the degree of smoking and the severity of TD. Repeated measurements revealed a positive correlation between changes in cigarette consumption and changes in the severity of TD (p < .0001). Analyses of covariance indicated a significant group effect, with a lower severity of TD in the clozapine group compared to the typical-neuroleptics group (p = .010). Interaction analyses indicated a higher impact of smoking on the severity of TD in the typical-neuroleptics group compared to the clozapine group (p = .033). Conclusion. - Concerning a possible cause-effect relationship between smoking and TD, smoking appears to be more of a hazard for TD than neuroleptic exposure itself. (C) 2008 Elsevier Masson SAS. All rights reserved.
Abstract:
We propose a mechanism by which single outbreaks of vector-borne infections can occur even when the basic reproduction number, R0, of the infection is below one. Under this hypothesis, dynamical model simulations demonstrate that the arrival of a relatively small (with respect to the host population) number of infected vectors can trigger a short-lived epidemic with a huge number of cases. These episodes are characterized by a sudden outbreak in a previously virgin area that lasts from weeks to a few months and then disappears without leaving vestiges. The hypothesis proposed in this paper to explain such single outbreaks of vector-borne infections, even when the total basic reproduction number, R0, is less than one (which explains why those infections fail to establish themselves at endemic levels), is that the vector-to-host component of R0 is greater than one and that a sufficient number of infected vectors are imported into the vulnerable area, triggering the outbreak. We tested the hypothesis by performing numerical simulations that reproduce the observed outbreaks of chikungunya in Italy in 2007 and the plague in Florence in 1348. The proposed theory provides an explanation for isolated outbreaks of vector-borne infections and a way to calculate the size of those outbreaks from the number of infected vectors arriving in the affected areas. Given the ever-increasing worldwide transportation network, which provides a high degree of mobility from endemic to virgin areas, the proposed mechanism may have important implications for public health planning. (C) 2009 Elsevier Ltd. All rights reserved.
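The mechanism described above can be illustrated with a minimal host-vector SIR sketch. This is a Ross-Macdonald-style model with illustrative parameter values of our own choosing, not the authors' actual model or the chikungunya/plague data: the vector-to-host component R_vh = beta_vh/mu_v is set above one while the full transmission cycle R0 = R_vh * R_hv stays below one, and seeding the system with imported infected vectors still produces a sizeable, self-limited outbreak.

```python
def simulate_outbreak(Iv0, Nh=50_000, Nv=50_000, beta_vh=0.30,
                      beta_hv=0.02, gamma=1 / 7, mu_v=1 / 10,
                      days=365, dt=0.05):
    """Euler-integrate a host-vector SIR model seeded with Iv0 imported
    infected vectors; return (cumulative host cases, final host prevalence).
    All parameters are illustrative assumptions, not fitted values."""
    Sh, Ih = float(Nh), 0.0
    Sv, Iv = float(Nv), float(Iv0)
    cum = 0.0
    for _ in range(int(days / dt)):
        new_h = beta_vh * Iv * Sh / Nh   # vector -> host infections per day
        new_v = beta_hv * Sv * Ih / Nh   # host -> vector infections per day
        Sh -= new_h * dt
        Ih += (new_h - gamma * Ih) * dt
        Sv += (mu_v * Nv - mu_v * Sv - new_v) * dt  # vector births offset deaths
        Iv += (new_v - mu_v * Iv) * dt
        cum += new_h * dt
    return cum, Ih

# Component reproduction numbers for these illustrative parameters:
R_vh = 0.30 / (1 / 10)                     # host cases per infected vector: 3.0 > 1
R_hv = 0.02 * (50_000 / 50_000) / (1 / 7)  # vector cases per infected host: 0.14
R0 = R_vh * R_hv                           # full cycle: 0.42 < 1
```

With 2,000 imported infected vectors this sketch yields thousands of host cases before the epidemic burns out, whereas a handful of imported vectors produces only a few dozen, matching the paper's qualitative claim that outbreak size scales with the number of arriving infected vectors.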
Abstract:
This was a longitudinal study carried out over a period of 2 years with a cohort of 946 individuals of both sexes, aged 1 year and older, from an area endemic for American visceral leishmaniasis (AVL) in Para State, Brazil. The objective was to analyze the transmission dynamics of human Leishmania (Leishmania) infantum chagasi infection based principally on prevalence and incidence. For diagnosis of the infection, the indirect fluorescent antibody test (IFAT) and the leishmanin skin test (LST) were performed with amastigote and promastigote antigens of the parasite, respectively. The prevalence by LST (11.2%) was higher (p < 0.0001) than that by IFAT (3.4%), and the combined prevalence by both tests was 12.6%. The incidences by LST were also higher (p < 0.05) than those by IFAT at 6 months (4.7% vs. 0.6%), 12 months (4.7% vs. 2.7%), and 24 months (2.9% vs. 0.3%). Moreover, there were no differences (p > 0.05) between the combined incidences by both tests in the same point surveys: 5.2%, 6.3%, and 3.6%. During the study, 12 infected persons showed high IFAT IgG titers with no LST reactions: five children and two adults developed AVL (titers 2,560-10,120), and two children and three adults developed subclinical oligosymptomatic infection (titers 1,280-2,560). The combined tests diagnosed a total of 231 cases of infection, leading to an accumulated prevalence of 24.4%.
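As a quick arithmetic check, the accumulated prevalence reported above follows directly from the 231 combined diagnoses in the cohort of 946. The counts come from the abstract; the one-line helper is our own sketch, not the study's analysis code:

```python
# Accumulated prevalence from the abstract's counts (illustrative helper).
def prevalence_pct(cases, cohort):
    """Prevalence as a percentage, rounded to one decimal place."""
    return round(100 * cases / cohort, 1)

accumulated = prevalence_pct(231, 946)  # combined IFAT + LST diagnoses -> 24.4
```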
Abstract:
Introduction. The use of arterial grafts (AG) in pediatric orthotopic liver transplantation (OLT) is an alternative in cases of poor hepatic arterial inflow, small or anomalous recipient hepatic arteries, and retransplantations (re-OLT) due to hepatic artery thrombosis (HAT). AG have been crucial to the success of the procedure among younger children. Herein we have reported our experience with AG. Methods. We retrospectively reviewed data from June 1989 to June 2010 among OLT in which we used AG, analyzing indications, short-term complications, and long-term outcomes. Results. Among 437 pediatric OLT, 58 children required an AG. A common iliac artery interposition graft was used in 57 cases and a donor carotid artery in 1 case. In 38 children the graft was used primarily, including 94% (36/38) in which it was due to poor hepatic arterial inflow. Ductopenia syndromes (n = 14), biliary atresia (BA; n = 11), and fulminant hepatitis (n = 8) were the main preoperative diagnoses among these children. Their mean weight was 18.4 kg and mean age was 68 months. At a mean follow-up of 27 months, multiple-organ failure and primary graft nonfunction (PNF) were the short-term causes of death in 9 children (26.5%). Among the remaining 29 patients, 2 (6.8%) developed early graft thrombosis requiring re-OLT; 5 (17%) developed biliary complications, and 1 (3.4%) had asymptomatic arterial stenosis. In 20 children, a graft was used during retransplantation. The main indication was HAT (75%). BA (n = 15), ductopenia syndromes (n = 2), and primary sclerosing cholangitis (n = 2) were the main diagnoses. Their mean weight was 16.7 kg and age was 65 months. At a mean follow-up of 53 months, 7 children died due to multiple-organ failure or PNF. Among the remaining 13 patients, 3 developed biliary complications and 1 had arterial stenosis. No thrombosis was observed. Conclusion. The data suggested that use of an AG is a useful alternative in pediatric OLT. 
The technique is safe with a low risk of thrombosis.
Abstract:
Introduction. Biliary atresia (BA) is the leading indication for orthotopic liver transplantation (OLT) among children. However, there are technical difficulties, including the limited dimensions of anatomical structures, hypoplasia and/or thrombosis of the portal vein and previous portoenterostomy procedures. Objective. The objective of this study was to present our experience of 239 children with BA who underwent OLT between September 1989 and June 2010 compared with OLT performed for other causes. Methods. We performed a retrospective analysis of patient charts and analysis of complications and survival. Results. BA was the most common indication for OLT (207/409; 50.6%). The median age of subjects was 26 months (range, 7-192). Their median weight was 11 kg (range, 5-63) with 110 children (53.1%) weighing <= 10 kg. We performed 126 transplantations from cadaveric donors (60.8%) and 81 from living-related donors (LRD) (39.2%). Retransplantation was required for 31 recipients (14.9%), primarily due to hepatic artery thrombosis (HAT; 64.5%). Other complications included the following: portal vein thrombosis (PVT; 13.0%), biliary stenosis and/or fistula (22.2%), bowel perforation (7.0%), and posttransplantation lymphoproliferative disorder (PTLD; 5.3%). Among the cases of OLT for other causes, the median age of recipients was 81 months (range, 11-17 years), which was higher than that for children with BA. Retransplantation was required in 3.5% of these patients (P < .05), mostly due to HAT. The incidences of PVT, bowel perforation, and PTLD were significantly lower (P < .05). There was no significant difference between biliary complications in the 2 groups. The overall survival rates at 1 versus 5 years were 79.7% versus 68.1% for BA, and 81.2% versus 75.7% for other causes, respectively. Conclusions. Children who undergo OLT for BA are younger than those engrafted for other causes, displaying a higher risk of complications and retransplantations.
Abstract:
Study design: Single-blind randomized, controlled clinical study. Objectives: To evaluate, using kinematic gait analysis, the results obtained from gait training on a treadmill with body weight support versus those obtained with conventional gait training and physiotherapy. Setting: Thirty patients with sequelae from traumatic incomplete spinal cord injuries at least 12 months earlier; patients were able to walk and were classified according to motor function as ASIA (American Spinal Injury Association) impairment scale C or D. Methods: Patients were divided randomly into two groups of 15 patients by the drawing of opaque envelopes: group A (weight support) and group B (conventional). After an initial assessment, both groups underwent 30 sessions of gait training. Sessions occurred twice a week, lasted for 30min each and continued for four months. All of the patients were evaluated by a single blinded examiner using movement analysis to measure angular and linear kinematic gait parameters. Six patients (three from group A and three from group B) were excluded because they attended fewer than 85% of the training sessions. Results: There were no statistically significant differences in intra-group comparisons among the spatial-temporal variables in group B. In group A, the following significant differences in the studied spatial-temporal variables were observed: increases in velocity, distance, cadence, step length, swing phase and gait cycle duration, in addition to a reduction in stance phase. There were also no significant differences in intra-group comparisons among the angular variables in group B. However, group A achieved significant improvements in maximum hip extension and plantar flexion during stance. Conclusion: Gait training with body weight support was more effective than conventional physiotherapy for improving the spatial-temporal and kinematic gait parameters among patients with incomplete spinal cord injuries. 
Spinal Cord (2011) 49, 1001-1007; doi:10.1038/sc.2011.37; published online 3 May 2011
Abstract:
Background Bone chondrosarcomas are rare malignant tumors that have variable biologic behavior, and their treatment is controversial. For low-grade tumors, there is no consensus on whether intralesional en bloc resections are the best treatment. Questions/purposes We therefore compared patients with Grade 1 and Grade 2 primary central chondrosarcomas to (1) determine differences in survival; (2) compare local recurrence rates; and (3) determine any association of histological grade with clinical and demographic characteristics. Methods We retrospectively reviewed 46 patients with Grade 1 and 2 chondrosarcomas. There were 25 men and 21 women with a mean age of 43 years (range, 17-79 years). Minimum follow-up was 32 months (mean, 99 months; range, 32-312 months) for the patients who remained alive at the end of the study. Twenty-three of the tumors were intracompartmental (Enneking A); of these, 19 were Grade 1 and 4 were Grade 2. Twenty-three tumors were extracompartmental (Enneking B); of these, 4 were Grade 1 and 19 were Grade 2. Twenty-five patients underwent intralesional resection, 18 had wide resection, and three had amputations. Results The overall survival rate was 94% and the disease-free survival rate was 90%. Among the 23 Grade 1 tumors, we observed six local recurrences and none of these patients died; among the 23 Grade 2 tumors, 10 recurred and two patients died. Local recurrence negatively influenced survival. Conclusions For lesions with radiographic characteristics of intracompartmental Grade 1 chondrosarcoma, we believe intralesional resection followed by electrocauterization and cementation is the best treatment. When imaging suggests an aggressive (Grade 2 or 3) chondrosarcoma, wide resection is promptly indicated.
Abstract:
Background: Perioperative complications following robotic-assisted radical prostatectomy (RARP) have been reported in recent series. Few studies, however, have used standardized systems to classify surgical complications, and that inconsistency has hampered accurate comparisons between different series or surgical approaches. Objective: To assess trends in the incidence of, and to classify, perioperative surgical complications following RARP in 2500 consecutive patients. Design, setting, and participants: We analyzed 2500 patients who underwent RARP for treatment of clinically localized prostate cancer (PCa) from August 2002 to February 2009. Data were prospectively collected in a customized database and retrospectively analyzed. Intervention: All patients underwent RARP performed by a single surgeon. Measurements: Complications were classified using the Clavien grading system. To evaluate trends regarding complications and radiologic anastomotic leaks, we compared eight groups of 300 patients each, categorized according to the surgeon's experience (number of cases). Results and limitations: Our median operative time was 90 min (interquartile range [IQR]: 75-100 min). The median estimated blood loss was 100 ml (IQR: 100-150 ml). Our conversion rate was 0.08%, comprising two procedures converted to standard laparoscopy due to robot malfunction. One hundred and forty complications were observed in 127 patients (5.08%). The following percentages of patients presented graded complications: grade 1, 2.24%; grade 2, 1.8%; grade 3a, 0.08%; grade 3b, 0.48%; grade 4a, 0.40%. There were no cases of multiple organ dysfunction or death (grades 4b and 5). There were significant decreases in the overall complication rates (p = 0.0034) and in the number of anastomotic leaks (p < 0.001) as the surgeon's experience increased. 
Conclusions: RARP is a safe option for treatment of clinically localized PCa, presenting low complication rates in experienced hands. Although the robotic system provides the surgeon with enhanced vision and dexterity, proficiency is only accomplished with consistent surgical volume; complication rates demonstrated a tendency to decrease as the surgeon's experience increased. (C) 2010 European Association of Urology. Published by Elsevier B. V. All rights reserved.
Abstract:
Posttransplantation lymphoproliferative disorder (PTLD) is a serious complication following solid organ transplantation that has been linked to Epstein-Barr virus (EBV) infection. The aim of this article was to describe a single-center experience with the multiplicity of clinical presentations of PTLD. Among 350 liver transplantations performed in 303 children, 13 surviving children had a histological diagnosis of PTLD (13/242 survivors; 5.4%). The age at diagnosis ranged from 12 to 258 months (median, 47), and the time from transplantation ranged from 1 to 84 months (median, 13). Ten of these children (76.9%) were EBV-naive prior to transplantation. Fever was present in all cases. The clinical signs at presentation were anemia (92.3%), diarrhea and vomiting (69.2%), recurrent upper airway infections (38.4%), Waldeyer ring lymphoid tissue hypertrophy (23.0%), abdominal mass lesions (30.7%), massive cervical and mediastinal adenopathy (15.3%), or gastrointestinal and respiratory symptoms (30.7%). One child developed fulminant hepatic allograft failure secondary to graft involvement by PTLD. Polymorphic PTLD was diagnosed in 6 patients; 7 had a diagnosis of lymphoma. Treatment consisted of stopping immunosuppression as well as starting intravenous ganciclovir and anti-CD20 monoclonal antibody therapy. The mortality rate was 53.8%. The clinical presentation of PTLD varied from fever of unknown origin to fulminant hepatic failure. Other symptoms that may be linked to the diagnosis of PTLD are pancytopenia, tonsil and adenoid hypertrophy, cervical or mediastinal lymph node enlargement, and abdominal masses. Despite numerous advances, the optimal treatment approach for PTLD is not completely known and the mortality rate remains high.
Abstract:
Objective: to describe women's feelings about mode of birth. Design: exploratory descriptive design. Semi-structured interviews were conducted using a questionnaire that had been developed previously (categorical data and open- and closed-ended questions). Qualitative analysis of the results was performed using a content analysis technique. Setting: the largest public university hospital in Brazil. Participants: 48 women in their third trimester of pregnancy. Findings: most women expressed a preference for vaginal birth, as they perceived that they would have a faster recovery. Women who expressed a preference for caesarean section did so because of the lack of pain during the birth and the need for tubal sterilisation. The majority of women considered it important to have experience with a mode of birth in order to form a preference. Complications associated with maternal illness were very influential in the decision-making process. Key conclusions: these results provide a useful first step towards the identification of aspects of women's feelings about modes of birth. Most women expressed a preference for vaginal birth. Further exploration of women's feelings regarding parturition and the decision-making process is required. (C) 2008 Elsevier Ltd. All rights reserved.
Abstract:
Introduction. Cytomegalovirus (CMV) infection, a common complication in lung transplant (LT) patients, is associated with worse outcomes. Therefore, prophylaxis and surveillance with preemptive treatment are recommended. Objectives. To describe the epidemiology and impact on mortality of CMV infection in LT patients receiving CMV prophylaxis. Methods. Single-center retrospective cohort of LT recipients from August 2003 to March 2008. We excluded patients with survival or follow-up shorter than 30 days. We reviewed medical charts and all CMV pp65 antigen results. Results. Forty-seven patients met the inclusion criteria and 19 (40%) developed a CMV event: eight CMV infections, seven CMV syndromes, and 15 CMV diseases. The mean number of CMV events per patient was 1.68 +/- 0.88. Twelve patients developed CMV events during prophylaxis (5/12 had CMV serology D+/R-). Forty-six of the 47 patients had at least one episode of acute rejection (mean 2.23 +/- 1.1). Median follow-up was 22 months (range = 3-50). There were seven deaths. Upon univariate analysis, CMV events were related to greater mortality (P = .04), especially if the patient experienced more than two events (P = .013) and if the first event occurred during the first 3 months after LT (P = .003). Nevertheless, only a marginally significant relationship between a CMV event during the first 3 months after LT and mortality was observed in the multivariate analysis (hazard ratio: 7.46; 95% confidence interval: 0.98-56.63; P = .052). Patients with CMV events more than 3 months post-LT showed the same survival as those who remained CMV-free. Conclusion. Prophylaxis and preemptive treatment are safe and effective; however, patients who develop CMV events during prophylaxis experience a worse prognosis.
Abstract:
Purpose: To evaluate the ability of the GDx Variable Corneal Compensation (VCC) Guided Progression Analysis (GPA) software to detect glaucomatous progression. Design: Observational cohort study. Participants: The study included 453 eyes from 252 individuals followed for an average of 46 +/- 14 months as part of the Diagnostic Innovations in Glaucoma Study. At baseline, 29% of the eyes were classified as glaucomatous, 67% as suspects, and 5% as healthy. Methods: Images were obtained annually with the GDx VCC and analyzed for progression using the Fast Mode of the GDx GPA software. Progression by conventional methods was determined by the GPA software for standard automated achromatic perimetry (SAP) and by masked assessment of optic disc stereophotographs by expert graders. Main Outcome Measures: Sensitivity, specificity, and likelihood ratios (LRs) for detection of glaucoma progression using the GDx GPA were calculated with SAP and optic disc stereophotographs used as reference standards. Agreement among the different methods was reported using the AC1 coefficient. Results: Thirty-four of the 431 glaucoma and glaucoma suspect eyes (8%) showed progression by SAP or optic disc stereophotographs. The GDx GPA detected 17 of these eyes, for a sensitivity of 50%. Fourteen eyes showed progression only by the GDx GPA, for a specificity of 96%. Positive and negative LRs were 12.5 and 0.5, respectively. None of the healthy eyes showed progression by the GDx GPA, for a specificity of 100% in this group. Inter-method agreement (AC1 coefficient and 95% confidence intervals) for non-progressing and progressing eyes was 0.96 (0.94-0.97) and 0.44 (0.28-0.61), respectively. Conclusions: The GDx GPA detected glaucoma progression in a significant number of cases showing progression by conventional methods, with high specificity and high positive LRs. 
Estimates of the accuracy for detecting progression suggest that the GDx GPA could be used to complement clinical evaluation in the detection of longitudinal change in glaucoma. Financial Disclosure(s): Proprietary or commercial disclosure may be found after the references. Ophthalmology 2010; 117: 462-470 (C) 2010 by the American Academy of Ophthalmology.
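The likelihood ratios quoted above follow directly from the rounded sensitivity and specificity (0.50 and 0.96). The helper below is an illustrative sketch using the standard LR definitions, not the study's analysis code:

```python
# Likelihood ratios from sensitivity and specificity (illustrative helper).
def likelihood_ratios(sens, spec):
    """LR+ = sens / (1 - spec); LR- = (1 - sens) / spec."""
    return sens / (1 - spec), (1 - sens) / spec

# Rounded accuracy figures reported in the abstract for the GDx GPA.
lr_pos, lr_neg = likelihood_ratios(sens=0.50, spec=0.96)
```

With these inputs LR+ works out to 12.5 and LR- to about 0.52, matching the reported values of 12.5 and 0.5.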