Abstract:
We propose a mechanism by which single outbreaks of vector-borne infections can occur even when the basic reproduction number, R0, of the infection is below one. Under this hypothesis, simulations of dynamical models demonstrate that the arrival of a relatively small number of infected vectors (with respect to the host population) can trigger a short-lived epidemic with a huge number of cases. These episodes are characterized by a sudden outbreak in a previously virgin area that lasts from weeks to a few months and then disappears without leaving traces. The hypothesis proposed in this paper to explain those single outbreaks of vector-borne infections, even when the total basic reproduction number, R0, is less than one (which explains why those infections fail to establish themselves at endemic levels), is that the vector-to-host component of R0 is greater than one and that a sufficient number of infected vectors is imported into the vulnerable area, triggering the outbreak. We tested the hypothesis by performing numerical simulations that reproduce the observed outbreaks of chikungunya in Italy in 2007 and of plague in Florence in 1348. The proposed theory provides an explanation for isolated outbreaks of vector-borne infections and ways to calculate the size of those outbreaks from the number of infected vectors arriving in the affected areas. Given the ever-increasing worldwide transportation network, which provides a high degree of mobility from endemic to virgin areas, the proposed mechanism may have important implications for public health planning. (C) 2009 Elsevier Ltd. All rights reserved.
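The mechanism described in this abstract can be illustrated with a minimal Ross-Macdonald-style host-vector model. This is a sketch only: all parameter values below are hypothetical illustrations chosen so that the vector-to-host component of R0 exceeds one while the total R0 stays below one; they are not the values fitted in the paper.

```python
# Minimal host-vector outbreak sketch (Euler integration, hypothetical parameters).
# Shows that a pulse of imported infected vectors can produce a sizable but
# self-limited outbreak even though the total R0 is below one.

def simulate(days=200, dt=0.1,
             Nh=10_000, Nv=2_000,   # host / local vector population sizes
             a=0.3,                 # bites per vector per day
             b=0.5, c=0.5,          # transmission probabilities per bite
             gamma=1 / 7,           # host recovery rate (1/infectious period)
             mu=0.1,                # vector mortality rate
             imported_Iv=500):      # infected vectors arriving at t = 0
    Sh, Ih = float(Nh), 0.0
    Sv, Iv = float(Nv), float(imported_Iv)
    cases = 0.0
    for _ in range(int(days / dt)):
        new_h = a * b * Iv * Sh / Nh * dt   # new host infections this step
        new_v = a * c * Ih * Sv / Nh * dt   # new vector infections this step
        Sh -= new_h
        Ih += new_h - gamma * Ih * dt
        Sv += mu * (Nv - Sv) * dt - new_v   # births push vectors back toward Nv
        Iv += new_v - mu * Iv * dt
        cases += new_h
    return cases

# R0 components for these hypothetical parameters:
R_vh = 0.3 * 0.5 / 0.1                        # vector-to-host: 1.5 (> 1)
R_hv = 0.3 * 0.5 * (2_000 / 10_000) / (1 / 7)  # host-to-vector: 0.21
R0 = (R_vh * R_hv) ** 0.5                      # ~0.56 (< 1)
```

With R0 below one the infection cannot sustain itself, yet because each imported infected vector generates more than one host case (R_vh > 1), the pulse of 500 imported vectors yields on the order of a thousand host cases before the outbreak fades without becoming endemic, matching the qualitative pattern the abstract describes.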
Abstract:
The goal of this work was to compare the differences between human immunodeficiency virus type 1 (HIV-1) subtypes B and F1 in the acquisition of major and minor protease inhibitor (PI)-associated resistance mutations and of other polymorphisms in the protease (PR) gene, through a cross-sectional study. PR sequences from subtype B and F1 isolates from Brazilian patients, matched according to PI exposure time, were included in this study. Sequences were separated into four groups: 24 and 90 from children and 141 and 99 from adults infected with isolates of subtypes F1 and B, respectively. For comparison, 211 subtype B and 79 subtype F1 PR sequences from drug-naive individuals were included. Demographic and clinical data were similar among B- and F1-infected patients. In untreated patients, mutations L10V, K20R, and M36I were more frequent in subtype F1, while L63P, A71T, and V77I were more prevalent in subtype B. In treated patients, K20M, D30N, G73S, I84V, and L90M were more prevalent in subtype B, and K20T and N88S were more prevalent in subtype F1. A higher proportion of subtype F1 than of subtype B strains contained other polymorphisms. The V82L mutation was present with increased frequency in isolates from children compared to isolates from adults infected with both subtypes. We observed a faster emergence of resistance in children than in adults during treatment with protease inhibitors. These data provide evidence that, although rates of overall drug resistance do not differ between subtypes B and F1, the former accumulates resistance at a higher proportion in specific amino acid positions of the protease when compared to the latter. (c) 2008 Elsevier B.V. All rights reserved.
Abstract:
Paraffin-embedded samples commonly stored at educational and research institutions constitute tissue banks for follow-up or epidemiological studies; however, the paraffin-embedding process involves the use of substances that can cause DNA degradation. In this study, a PCR protocol was applied to identify Leishmania strains in 33 paraffin-embedded skin samples from patients with American cutaneous leishmaniasis. DNA was obtained by the phenol-chloroform protocol following paraffin removal and then used in PCR or nested PCR based on the nucleotide sequence of the small subunit ribosomal RNA (SSU rDNA). The amplicons obtained were cloned and sequenced to determine the single nucleotide polymorphisms that distinguish between different Leishmania species or groups. This assay made it possible to distinguish organisms belonging to the subgenus Viannia and to identify L. (Leishmania) amazonensis and L. (L.) chagasi of the Leishmania subgenus. Of the 33 samples, PCR and nested PCR identified 91%. After sequencing the PCR products of 26 samples, 16 were identified as L. (L.) amazonensis; the other 10 contained organisms belonging to the L. (Viannia) subgenus. These results open a major opportunity to study stored samples and promote relevant contributions to epidemiological studies.
Abstract:
Several studies support a genetic influence on obsessive-compulsive disorder (OCD) etiology. The role of glutamate as an important neurotransmitter affecting OCD pathophysiology has been supported by neuroimaging, animal model, medication, and initial candidate gene studies. Genes involved in glutamatergic pathways, such as the glutamate receptor, ionotropic, kainate 2 (GRIK2) gene, have been associated with OCD in previous studies. This study examines GRIK2 as a candidate gene for OCD susceptibility in a family-based approach. Probands met full DSM-IV diagnostic criteria for OCD. Forty-seven OCD probands and their parents were recruited from tertiary care OCD specialty clinics in France and the USA. Genotypes of single nucleotide polymorphism (SNP) markers and related haplotypes were analyzed using Haploview and FBAT software. The polymorphism at rs1556995 (P = 0.0027; permuted P-value = 0.03) was significantly associated with the presence of OCD. The two-marker haplotype rs1556995/rs1417182 was also significantly associated with OCD (P = 0.0019; permuted P-value = 0.01). This study supports previously reported findings of association between proximal GRIK2 SNPs and OCD in a comprehensive evaluation of the gene. Further study with independent samples and larger sample sizes is required.
Abstract:
Introduction. The use of arterial grafts (AG) in pediatric orthotopic liver transplantation (OLT) is an alternative in cases of poor hepatic arterial inflow, small or anomalous recipient hepatic arteries, and retransplantations (re-OLT) due to hepatic artery thrombosis (HAT). AG have been crucial to the success of the procedure among younger children. Herein we report our experience with AG. Methods. We retrospectively reviewed data from June 1989 to June 2010 on OLT in which we used AG, analyzing indications, short-term complications, and long-term outcomes. Results. Among 437 pediatric OLT, 58 children required an AG. A common iliac artery interposition graft was used in 57 cases and a donor carotid artery in 1 case. In 38 children the graft was used primarily, including 94% (36/38) in which it was due to poor hepatic arterial inflow. Ductopenia syndromes (n = 14), biliary atresia (BA; n = 11), and fulminant hepatitis (n = 8) were the main preoperative diagnoses among these children. Their mean weight was 18.4 kg and mean age was 68 months. At a mean follow-up of 27 months, multiple-organ failure and primary graft nonfunction (PNF) were the short-term causes of death in 9 children (26.5%). Among the remaining 29 patients, 2 (6.8%) developed early graft thrombosis requiring re-OLT; 5 (17%) developed biliary complications, and 1 (3.4%) had asymptomatic arterial stenosis. In 20 children, a graft was used during retransplantation. The main indication was HAT (75%). BA (n = 15), ductopenia syndromes (n = 2), and primary sclerosing cholangitis (n = 2) were the main diagnoses. Their mean weight was 16.7 kg and mean age was 65 months. At a mean follow-up of 53 months, 7 children died due to multiple-organ failure or PNF. Among the remaining 13 patients, 3 developed biliary complications and 1 had arterial stenosis. No thrombosis was observed. Conclusion. The data suggest that an AG is a useful alternative in pediatric OLT. The technique is safe, with a low risk of thrombosis.
Abstract:
Introduction. Biliary atresia (BA) is the leading indication for orthotopic liver transplantation (OLT) among children. However, there are technical difficulties, including the limited dimensions of anatomical structures, hypoplasia and/or thrombosis of the portal vein, and previous portoenterostomy procedures. Objective. The objective of this study was to present our experience with children with BA who underwent OLT between September 1989 and June 2010, compared with OLT performed for other causes. Methods. We performed a retrospective analysis of patient charts and an analysis of complications and survival. Results. BA was the most common indication for OLT (207/409; 50.6%). The median age of subjects was 26 months (range, 7-192). Their median weight was 11 kg (range, 5-63), with 110 children (53.1%) weighing <= 10 kg. We performed 126 transplantations from cadaveric donors (60.8%) and 81 from living-related donors (LRD) (39.2%). Retransplantation was required for 31 recipients (14.9%), primarily due to hepatic artery thrombosis (HAT; 64.5%). Other complications included the following: portal vein thrombosis (PVT; 13.0%), biliary stenosis and/or fistula (22.2%), bowel perforation (7.0%), and posttransplantation lymphoproliferative disorder (PTLD; 5.3%). Among the cases of OLT for other causes, the median age of recipients was 81 months (range, 11-17 years), which was higher than that for children with BA. Retransplantation was required in 3.5% of these patients (P < .05), mostly due to HAT. The incidences of PVT, bowel perforation, and PTLD were significantly lower (P < .05). There was no significant difference in biliary complications between the 2 groups. The overall survival rates at 1 versus 5 years were 79.7% versus 68.1% for BA, and 81.2% versus 75.7% for other causes, respectively. Conclusions. Children who undergo OLT for BA are younger than those engrafted for other causes and display a higher risk of complications and retransplantation.
Abstract:
Study design: Single-blind, randomized, controlled clinical study. Objectives: To evaluate, using kinematic gait analysis, the results obtained from gait training on a treadmill with body weight support versus those obtained with conventional gait training and physiotherapy. Setting: Thirty patients with sequelae of traumatic incomplete spinal cord injuries sustained at least 12 months earlier; patients were able to walk and were classified according to motor function as ASIA (American Spinal Injury Association) impairment scale C or D. Methods: Patients were divided randomly into two groups of 15 patients by the drawing of opaque envelopes: group A (weight support) and group B (conventional). After an initial assessment, both groups underwent 30 sessions of gait training. Sessions occurred twice a week, lasted 30 min each, and continued for four months. All of the patients were evaluated by a single blinded examiner using movement analysis to measure angular and linear kinematic gait parameters. Six patients (three from group A and three from group B) were excluded because they attended fewer than 85% of the training sessions. Results: There were no statistically significant differences in intra-group comparisons among the spatial-temporal variables in group B. In group A, the following significant differences in the studied spatial-temporal variables were observed: increases in velocity, distance, cadence, step length, swing phase and gait cycle duration, in addition to a reduction in stance phase. There were also no significant differences in intra-group comparisons among the angular variables in group B. However, group A achieved significant improvements in maximum hip extension and plantar flexion during stance. Conclusion: Gait training with body weight support was more effective than conventional physiotherapy for improving the spatial-temporal and kinematic gait parameters among patients with incomplete spinal cord injuries.
Spinal Cord (2011) 49, 1001-1007; doi:10.1038/sc.2011.37; published online 3 May 2011
Abstract:
Background: Cardiovascular diseases (CVD) are the main cause of death and disability in developed countries. In most cases, the progress of CVD is influenced by environmental factors and multifactorial inheritance. The purpose of this study was to investigate the association between APOE genotypes, cardiovascular risk factors, and a noninvasive measure of arterial stiffness in the Brazilian population. Methods: A total of 1493 urban Brazilian individuals were randomly selected from the general population of the Vitoria City metropolitan area. Genetic analysis of the APOE polymorphism was conducted by PCR-RFLP, and pulse wave velocity was analyzed with a noninvasive automatic device. Results: Age, gender, body mass index, triglycerides, creatinine, uric acid, blood glucose, and blood pressure phenotypes did not differ among the epsilon 2, epsilon 3, and epsilon 4 alleles. The epsilon 4 allele was associated with higher total-cholesterol (p < 0.001), LDL-C (p < 0.001), total-cholesterol/HDL-C ratio (p < 0.001), and LDL/HDL-C ratio (p < 0.001); lower HDL-C values (p < 0.001); and higher risk of obesity (OR = 1.358, 95% CI = 1.019-1.811) and hyperuricemia (OR = 1.748, 95% CI = 1.170-2.611). Nevertheless, pulse wave velocity measures did not differ between genotypes (p = 0.66). The significant association between APOE genotypes and lipid levels persisted after a 5-year follow-up interval, but no interaction between time and genotype was observed for the longitudinal behavior of lipids. Conclusion: The epsilon 4 allele of the APOE gene is associated with a worse lipid profile in the Brazilian urban population. In our relatively young sample, the observed effect of APOE genotype on lipid levels did not translate into significant effects on arterial wall stiffness.
Abstract:
Background Bone chondrosarcomas are rare malignant tumors that have variable biologic behavior, and their treatment is controversial. For low-grade tumors, there is no consensus on whether intralesional or en bloc resections are the best treatment. Questions/purposes We therefore compared patients with Grade 1 and Grade 2 primary central chondrosarcomas to (1) determine differences in survival and (2) local recurrence rates; and (3) determine any association of histological grade with clinical and demographic characteristics. Methods We retrospectively reviewed 46 patients with Grade 1 and 2 chondrosarcomas. There were 25 men and 21 women with a mean age of 43 years (range, 17-79 years). Minimum followup was 32 months (mean, 99 months; range, 32-312 months) for the patients who remained alive at the end of the study. Twenty-three of the tumors were intracompartmental (Enneking A); of these, 19 were Grade 1 and 4 were Grade 2. Twenty-three tumors were extracompartmental (Enneking B); of these, 4 were Grade 1 and 19 were Grade 2. Twenty-five patients underwent intralesional resection, 18 had wide resection, and three had amputations. Results The overall survival rate was 94% and the disease-free survival rate was 90%. Among the 23 Grade 1 tumors, we observed six local recurrences and none of these patients died; among the 23 Grade 2 tumors, 10 recurred and two patients died. Local recurrence negatively influenced survival. Conclusions For lesions with radiographic characteristics of intracompartmental Grade 1 chondrosarcoma, we believe intralesional resection followed by electrocauterization and cementation is the best treatment. When the imaging suggests aggressive (Grade 2 or 3) chondrosarcoma, wide resection is promptly indicated.
Abstract:
Well-differentiated liposarcoma (WDLS) is one of the most common malignant mesenchymal tumors, and dedifferentiated liposarcoma (DDLS) is a malignant tumor consisting of both WDLS and a transformed nonlipogenic sarcomatous component. Cytogenetically, WDLS is characterized by the presence of ring or giant rod chromosomes containing several amplified genes, including MDM2, TSPAN31, CDK4, and others mainly derived from chromosome bands 12q13-15. However, the 12q13-15 amplicon is large and discontinuous. The focus of this study was to identify novel critical genes that are consistently amplified in primary (nonrecurrent) WDLS and have potential relevance for future targeted therapy. Using a high-resolution (5.0 kb) "single nucleotide polymorphism"/copy number variation microarray to screen the whole genome in a series of primary WDLS, two consistently amplified areas were found on chromosome 12: one region containing the MDM2 and CPM genes, and another region containing the FRS2 gene. Based on these findings, we further validated FRS2 amplification in both WDLS and DDLS. Fluorescence in situ hybridization confirmed FRS2 amplification in all WDLS and DDLS tested (n = 57). Real-time PCR showed FRS2 mRNA transcriptional upregulation in WDLS (n = 19) and DDLS (n = 13) but not in lipoma (n = 5) or normal fat (n = 9). Immunoblotting revealed high expression levels of phospho-FRS2 (Y436) and slight overexpression of total FRS2 protein in liposarcoma but not in normal fat or preadipocytes. Considering the critical role of FRS2 in mediating fibroblast growth factor receptor signaling, our findings indicate that FRS2 signaling should be further investigated as a potential therapeutic target for liposarcoma. (C) 2011 Wiley-Liss, Inc.
Abstract:
Objectives. Abnormalities in neurotrophic systems have been reported in Alzheimer's disease (AD), as shown by decreased serum brain-derived neurotrophic factor (BDNF) levels and association with BDNF genetic polymorphisms. In this study, we investigated whether these findings can be detected in patients with mild cognitive impairment (MCI), which is recognized as a high-risk condition for AD. We also addressed the impact of these variables on the progression of cognitive deficits within the MCI-AD continuum. Methods. One hundred and sixty older adults with varying degrees of cognitive impairment (30 patients with AD, 71 with MCI, and 59 healthy controls) were longitudinally assessed for up to 60 months. Baseline serum BDNF levels were determined by sandwich ELISA, and the presence of polymorphisms of BDNF and apolipoprotein E (Val66Met and APOE*E4, respectively) was determined by allelic discrimination analysis on real-time PCR. Modifications of cognitive state were ascertained for non-demented subjects. Results. Mean serum BDNF levels were reduced in patients with MCI and AD, as compared to controls (509.2 +/- 210.5; 581.9 +/- 379.4; and 777.5 +/- 467.8 pg/l, respectively; P < 0.001). Baseline serum BDNF levels were not associated with the progression of cognitive impairment upon follow-up in patients with MCI (progressive MCI, 750.8 +/- 463.0; stable MCI, 724.0 +/- 343.4; P = 0.8), nor with conversion to AD. Although the Val66Met polymorphism was not associated with the cross-sectional diagnoses of MCI or AD, the presence of the Met-BDNF allele was associated with a higher risk of disease progression in patients with MCI (OR = 3.0, CI(95%) [1.2-7.8], P = 0.02). We also found a significant interaction between the APOE*E4 and Met-BDNF alleles increasing the risk of progression of cognitive impairment in MCI patients (OR = 4.4, CI(95%) [1.6-12.1], P = 0.004). Conclusion. 
Decreased neurotrophic support, as indicated by a reduced systemic availability of BDNF, may play a role in the neurodegenerative processes that underlie the continuum from MCI to AD. The presence of the Met-BDNF allele, particularly in association with APOE*E4, may predict a worse cognitive outcome in patients with MCI.
Abstract:
Background: Perioperative complications following robotic-assisted radical prostatectomy (RARP) have been previously reported in recent series. Few studies, however, have used standardized systems to classify surgical complications, and that inconsistency has hampered accurate comparisons between different series or surgical approaches. Objective: To assess trends in the incidence of, and to classify, perioperative surgical complications following RARP in 2500 consecutive patients. Design, setting, and participants: We analyzed 2500 patients who underwent RARP for treatment of clinically localized prostate cancer (PCa) from August 2002 to February 2009. Data were prospectively collected in a customized database and retrospectively analyzed. Intervention: All patients underwent RARP performed by a single surgeon. Measurements: Complications were classified using the Clavien grading system. To evaluate trends regarding complications and radiologic anastomotic leaks, we compared eight groups of 300 patients each, categorized according to the surgeon's experience (number of cases). Results and limitations: Our median operative time was 90 min (interquartile range [IQR]: 75-100 min). The median estimated blood loss was 100 ml (IQR: 100-150 ml). Our conversion rate was 0.08%, comprising two procedures converted to standard laparoscopy due to robot malfunction. One hundred and forty complications were observed in 127 patients (5.08%). The following percentages of patients presented graded complications: grade 1, 2.24%; grade 2, 1.8%; grade 3a, 0.08%; grade 3b, 0.48%; grade 4a, 0.40%. There were no cases of multiple organ dysfunction or death (grades 4b and 5). There were significant decreases in the overall complication rate (p = 0.0034) and in the number of anastomotic leaks (p < 0.001) as the surgeon's experience increased. 
Conclusions: RARP is a safe option for treatment of clinically localized PCa, presenting low complication rates in experienced hands. Although the robotic system provides the surgeon with enhanced vision and dexterity, proficiency is only accomplished with consistent surgical volume; complication rates demonstrated a tendency to decrease as the surgeon's experience increased. (C) 2010 European Association of Urology. Published by Elsevier B.V. All rights reserved.
Abstract:
Posttransplantation lymphoproliferative disorder (PTLD) is a serious complication following solid organ transplantation that has been linked to Epstein-Barr virus (EBV) infection. The aim of this article was to describe a single-center experience with the multiplicity of clinical presentations of PTLD. Among 350 liver transplantations performed in 303 children, 13 surviving children had a histological diagnosis of PTLD (13/242 survivors; 5.4%). The age at diagnosis ranged from 12 to 258 months (median, 47), and the time from transplantation ranged from 1 to 84 months (median, 13). Ten of these children (76.9%) were EBV-naive prior to transplantation. Fever was present in all cases. The clinical signs at presentation were anemia (92.3%), diarrhea and vomiting (69.2%), recurrent upper airway infections (38.4%), Waldeyer ring lymphoid tissue hypertrophy (23.0%), abdominal mass lesions (30.7%), massive cervical and mediastinal adenopathy (15.3%), or gastrointestinal and respiratory symptoms (30.7%). One child developed fulminant hepatic allograft failure secondary to graft involvement by PTLD. Polymorphic PTLD was diagnosed in 6 patients; 7 had a diagnosis of lymphoma. Treatment consisted of stopping immunosuppression as well as starting intravenous ganciclovir and anti-CD20 monoclonal antibody therapy. The mortality rate was 53.8%. The clinical presentation of PTLD varied from fever of unknown origin to fulminant hepatic failure. Other symptoms that may be linked to the diagnosis of PTLD are pancytopenia, tonsil and adenoid hypertrophy, cervical or mediastinal lymph node enlargement, and abdominal masses. Despite numerous advances, the optimal treatment approach for PTLD is not completely known and the mortality rate is still high.
Abstract:
Introduction. Cytomegalovirus (CMV) infection, a common complication in lung transplant (LT) patients, is associated with worse outcomes. Therefore, prophylaxis and surveillance with preemptive treatment are recommended. Objectives. To describe the epidemiology and impact on mortality of CMV infection in LT patients receiving CMV prophylaxis. Methods. Single-center retrospective cohort of LT recipients from August 2003 to March 2008. We excluded patients with survival or follow-up shorter than 30 days. We reviewed medical charts and all CMV pp65 antigen results. Results. Forty-seven patients met the inclusion criteria, and 19 (40%) developed a CMV event: eight CMV infections, seven CMV syndromes, and 15 CMV diseases. The mean number of CMV events per patient was 1.68 +/- 0.88. Twelve patients developed CMV events during prophylaxis (5/12 had CMV serology D+/R-). Forty-six of the 47 patients had at least one episode of acute rejection (mean 2.23 +/- 1.1). Median follow-up was 22 months (range = 3-50). There were seven deaths. Upon univariate analysis, CMV events were related to greater mortality (P = .04), especially if the patient experienced more than two events (P = .013) and if the first event occurred during the first 3 months after LT (P = .003). Nevertheless, only a marginally significant relationship between a CMV event during the first 3 months after LT and mortality was observed in the multivariate analysis (hazard ratio: 7.46; 95% confidence interval: 0.98-56.63; P = .052). Patients with CMV events more than 3 months post-LT showed the same survival as those who remained CMV-free. Conclusion. Prophylaxis and preemptive treatment are safe and effective; however, patients who develop CMV events during prophylaxis experience a worse prognosis.