128 results for PREDICT PATHOLOGICAL STAGE
Abstract:
The sarcoplasmic reticulum (SR) Ca2+-ATPase (SERCA2a) is under the control of an SR protein named phospholamban (PLN). Dephosphorylated PLN inhibits SERCA2a, whereas phosphorylation of PLN at either the Ser16 site by PKA or the Thr17 site by CaMKII reverses this inhibition, thus increasing SERCA2a activity and the rate of Ca2+ uptake by the SR. This leads to an increase in the velocity of relaxation, SR Ca2+ load and myocardial contractility. In the intact heart, β-adrenoceptor stimulation results in phosphorylation of PLN at both the Ser16 and Thr17 residues. Phosphorylation of the Thr17 residue requires both stimulation of the CaMKII signaling pathway and inhibition of PP1, the major phosphatase that dephosphorylates PLN. These two prerequisites appear to be fulfilled by β-adrenoceptor stimulation, which, as a result of PKA activation, triggers the activation of CaMKII by increasing intracellular Ca2+ and inhibits PP1. Several pathological situations, such as ischemia-reperfusion injury or hypercapnic acidosis, provide the conditions required for phosphorylation of the Thr17 residue of PLN independently of an increase in PKA activity, i.e., increased intracellular Ca2+ and acidosis-induced phosphatase inhibition. Our results indicated that PLN was phosphorylated at Thr17 at the onset of reflow and immediately after hypercapnia was established, and that this phosphorylation contributes to the mechanical recovery after both the ischemic and acidic insults. Studies on transgenic mice with Thr17 mutated to Ala (PLN-T17A) are consistent with these results. Thus, phosphorylation of the Thr17 residue of PLN probably participates in a protective mechanism that favors Ca2+ handling and limits intracellular Ca2+ overload in pathological situations.
Abstract:
The objective of the present study was to determine the acute effect of hemodialysis on endothelial venous function and oxidative stress. We studied 9 patients with end-stage renal disease (ESRD), 36.8 ± 3.0 years old, arterial pressure 133.8 ± 6.8/80.0 ± 5.0 mmHg, time on dialysis 55.0 ± 16.6 months, immediately before and after a hemodialysis session, and 10 healthy controls matched for age and gender. Endothelial function was assessed by the dorsal hand vein technique using graded local infusion of acetylcholine (endothelium-dependent venodilation, EDV) and sodium nitroprusside (endothelium-independent venodilation). Oxidative stress was evaluated by measuring protein oxidative damage (carbonyls) and antioxidant defense (total radical-trapping antioxidant potential, TRAP) in blood samples. All patients had been receiving recombinant human erythropoietin for at least 3 months and were not taking nitrates or α-receptor antagonists. EDV before hemodialysis was significantly lower in ESRD patients (65.6 ± 10.5) than in controls (109.6 ± 10.8; P = 0.010) and than after hemodialysis (106.6 ± 15.7; P = 0.045). Endothelium-independent venodilation was similar in all comparisons performed. The hemodialysis session significantly decreased TRAP (402.0 ± 53.5 vs 157.1 ± 28.3 U Trolox/µL plasma; P = 0.001). There was no difference in protein damage between ESRD patients before and after hemodialysis. The magnitude of change in EDV was negatively correlated with the magnitude of change in TRAP (r = -0.70; P = 0.037). These results suggest that a hemodialysis session improves endothelial venous function, in association with an antioxidant effect.
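The correlation reported above between the per-patient change in EDV and the change in TRAP across the dialysis session is a standard Pearson analysis on paired pre/post differences. The sketch below is illustrative only, using SciPy and hypothetical per-patient values rather than the study data.

```python
# Illustrative sketch, not the authors' analysis: Pearson correlation between
# the per-patient change in endothelium-dependent venodilation (EDV) and the
# change in TRAP across a hemodialysis session. All values are hypothetical.
import numpy as np
from scipy import stats

edv_pre = np.array([60.0, 72.5, 55.0, 80.0, 65.0, 70.0, 58.0, 75.0, 62.0])
edv_post = np.array([110.0, 95.0, 120.0, 100.0, 115.0, 90.0, 125.0, 105.0, 98.0])
trap_pre = np.array([400.0, 420.0, 390.0, 410.0, 405.0, 395.0, 415.0, 385.0, 400.0])
trap_post = np.array([150.0, 180.0, 140.0, 170.0, 155.0, 165.0, 145.0, 160.0, 150.0])

delta_edv = edv_post - edv_pre      # change in venodilation
delta_trap = trap_post - trap_pre   # change in antioxidant potential

r, p = stats.pearsonr(delta_edv, delta_trap)
print(f"r = {r:.2f}, P = {p:.3f}")
```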
Abstract:
A major problem in renal transplantation is identifying a grading system that can predict long-term graft survival. The present study determined the extent to which the two existing grading systems (Banff 97 and chronic allograft damage index, CADI) correlate with each other and with graft loss. A total of 161 transplant patient biopsies with chronic allograft nephropathy (CAN) were studied. The samples were coded and evaluated blindly by two pathologists using the two grading systems. Logistic regression analyses were used to identify the best predictor index for renal allograft loss. Patients with higher Banff 97 and CADI scores had higher rates of graft loss. Moreover, these measures also correlated with worse renal function and higher proteinuria levels at the time of CAN diagnosis. Logistic regression analyses showed that the use of angiotensin-converting enzyme inhibitors (ACEI), hepatitis C virus (HCV) infection, tubular atrophy, and the use of mycophenolate mofetil (MMF) were associated with graft loss in the CADI, whereas the use of ACEI, HCV infection, moderate interstitial fibrosis and tubular atrophy, and the use of MMF were associated with graft loss in the Banff 97 index. Although Banff 97 and CADI analyze different parameters in different renal compartments, only some isolated parameters correlated with graft loss. This suggests that the CAN grading systems need to be revised in order to devise a system that includes all parameters able to predict long-term graft survival, including chronic glomerulopathy, glomerular sclerosis, vascular changes, and severity of chronic interstitial fibrosis and tubular atrophy.
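As a rough illustration of the type of analysis described above, the sketch below fits a logistic regression of graft loss on a few clinical and histological covariates with statsmodels. The column names mirror the reported predictors (ACEI use, HCV, tubular atrophy, MMF use), but the data are simulated, not the study data.

```python
# Illustrative sketch only: logistic regression of graft loss on clinical and
# histological covariates, in the spirit of the analysis described above.
# The DataFrame columns and values are hypothetical, not the study data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 161
df = pd.DataFrame({
    "acei_use": rng.integers(0, 2, n),         # ACEI use (0/1)
    "hcv": rng.integers(0, 2, n),              # hepatitis C status (0/1)
    "tubular_atrophy": rng.integers(0, 4, n),  # semiquantitative score 0-3
    "mmf_use": rng.integers(0, 2, n),          # mycophenolate mofetil use (0/1)
})
# Simulated outcome with arbitrary effect sizes, just to make the fit run
logit_p = -1.0 + 0.8 * df["hcv"] + 0.5 * df["tubular_atrophy"] - 0.7 * df["acei_use"]
df["graft_loss"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(df[["acei_use", "hcv", "tubular_atrophy", "mmf_use"]])
model = sm.Logit(df["graft_loss"], X).fit(disp=False)
print(model.summary())   # odds ratios can be obtained via np.exp(model.params)
```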
Abstract:
Protein energy malnutrition (PEM) is a syndrome that often results in immunodeficiency coupled with pancytopenia. Hemopoietic tissue requires a high nutrient supply, and the proliferation, differentiation and maturation of cells occur in a constant and balanced manner, sensitive to the demands of specific cell lineages and dependent on the stem cell population. In the present study, we evaluated the effect of PEM on some aspects of hemopoiesis, analyzing the cell cycle of bone marrow cells and the percentage of progenitor cells in the bone marrow. Two-month-old male Swiss mice (N = 7-9 per group) were submitted to PEM with a low-protein diet (4% protein) or were fed a control diet (20% protein) ad libitum. When the experimental group had lost about 20% of their original body weight (after 14 days), we collected blood and bone marrow cells to determine the percentage of progenitor cells and the number of cells in each phase of the cell cycle. Animals of both groups were stimulated with 5-fluorouracil. Blood analysis, bone marrow cell composition and cell cycle evaluation were performed after 10 days. Malnourished animals presented anemia, reticulocytopenia and leukopenia. Their bone marrow was hypocellular and depleted of progenitor cells. Malnourished animals also presented a higher than normal proportion of cells in the G0 and G1 phases of the cell cycle. Thus, we conclude that PEM leads to depletion of progenitor hemopoietic populations and to changes in cellular development. We suggest that these changes are among the primary causes of pancytopenia in cases of PEM.
Abstract:
In breast cancer patients submitted to neoadjuvant chemotherapy (4 cycles of doxorubicin and cyclophosphamide, AC), expression of groups of three genes (gene trio signatures) could distinguish responsive from non-responsive tumors, as demonstrated by cDNA microarray profiling in a previous study by our group. In the current study, we determined whether the expression of the same genes would retain its predictive strength when analyzed by a more accessible technique (real-time RT-PCR). We evaluated 28 samples already analyzed by cDNA microarray, as a technical validation procedure, and 14 tumors, as an independent biological validation set. All patients received neoadjuvant chemotherapy (4 AC). Among the five trio combinations previously identified, defined by nine genes investigated individually (BZRP, CLPTM1, MTSS1, NOTCH1, NUP210, PRSS11, RPL37A, SMYD2, and XLHSRF-1), the most accurate were the trios based on RPL37A and XLHSRF-1, combined with either NOTCH1 or NUP210. Both trios correctly separated 86% of tumors (87% sensitivity and 80% specificity for predicting response) according to their response to chemotherapy (82% in a leave-one-out cross-validation method). Using the pre-established features obtained by linear discriminant analysis, 71% of the samples from the biological validation set were also correctly classified by both trios (72% sensitivity; 66% specificity). Furthermore, we explored other gene combinations to achieve higher accuracy in the technical validation group (used as a training set). A new trio, MTSS1, RPL37 and SMYD2, correctly classified 93% of samples from the technical validation group (95% sensitivity and 80% specificity; 86% accuracy by the cross-validation method) and 79% from the biological validation group (72% sensitivity and 100% specificity). Therefore, the combined expression of MTSS1, RPL37 and SMYD2, as evaluated by real-time RT-PCR, is a potential candidate to predict response to neoadjuvant doxorubicin and cyclophosphamide in breast cancer patients.
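The classification scheme described above (a three-gene RT-PCR signature evaluated by linear discriminant analysis with leave-one-out cross-validation) can be sketched with scikit-learn as below. Gene expression values and response labels are simulated; this is not the authors' pipeline.

```python
# Sketch (assumed workflow, not the authors' code): classify responders vs
# non-responders from a three-gene signature with linear discriminant
# analysis and leave-one-out cross-validation. Expression values are simulated.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import accuracy_score, recall_score

rng = np.random.default_rng(1)
n = 28                               # size of a technical validation set
X = rng.normal(size=(n, 3))          # columns: three genes, e.g., a hypothetical trio
y = rng.integers(0, 2, n)            # 1 = responder, 0 = non-responder (simulated)

pred = cross_val_predict(LinearDiscriminantAnalysis(), X, y, cv=LeaveOneOut())
print("accuracy:", accuracy_score(y, pred))
print("sensitivity:", recall_score(y, pred, pos_label=1))
print("specificity:", recall_score(y, pred, pos_label=0))
```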
Abstract:
Because the superficial lymphatics in the lungs are distributed in the subpleural, interlobular and peribronchovascular interstitium, lymphatic impairment may occur in the lungs of patients with idiopathic interstitial pneumonias (IIPs) and increase their severity. We investigated the distribution of lymphatics in different remodeling stages of IIPs by immunohistochemistry using the D2-40 antibody. Pulmonary tissue was obtained from 69 patients with acute interstitial pneumonia/diffuse alveolar damage (AIP/DAD, N = 24), cryptogenic organizing pneumonia/organizing pneumonia (COP/OP, N = 6), nonspecific interstitial pneumonia (NSIP/NSIP, N = 20), and idiopathic pulmonary fibrosis/usual interstitial pneumonia (IPF/UIP, N = 19). D2-40+ lymphatics in the lesions were quantified and related to the remodeling stage score. We observed an increase in the percentage of D2-40+ lymphatics from DAD (6.66 ± 1.11) to UIP (23.45 ± 5.24, P = 0.008) as the remodeling stage of the lesions advanced. Kaplan-Meier survival curves showed better survival for patients with lymphatic D2-40+ expression higher than 9.3%. Lymphatic impairment occurs in the lungs of patients with IIPs and its severity increases with the remodeling stage. The results suggest that disruption of the superficial lymphatics may impair alveolar clearance, delay organ repair and lead to more severe disease progression, mainly in patients with AIP/DAD. Therefore, lymphatic distribution may serve as a surrogate marker for the identification of patients at greatest risk of death due to IIPs.
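A survival comparison of the kind reported above (Kaplan-Meier curves stratified by a 9.3% D2-40+ cut-off, compared with a log-rank test) could be run with the lifelines package as sketched below, using simulated follow-up data rather than the study cohort.

```python
# Sketch with simulated data: Kaplan-Meier curves stratified by a 9.3%
# D2-40+ lymphatic expression cut-off, compared with a log-rank test.
# Uses the lifelines package; all values below are hypothetical.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(2)
d240 = rng.uniform(2, 30, 69)                          # % D2-40+ lymphatics per patient
high = d240 > 9.3                                      # expression above the cut-off
time = rng.exponential(scale=np.where(high, 48, 24))   # follow-up in months (simulated)
event = rng.integers(0, 2, 69).astype(bool)            # death observed (simulated)

kmf_high = KaplanMeierFitter().fit(time[high], event[high], label="D2-40+ > 9.3%")
kmf_low = KaplanMeierFitter().fit(time[~high], event[~high], label="D2-40+ <= 9.3%")
result = logrank_test(time[high], time[~high], event[high], event[~high])
print("log-rank P =", result.p_value)
```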
Abstract:
Many patients with hilar cholangiocarcinoma (HC) have a poor prognosis. Snail, a transcription factor and E-cadherin repressor, is a novel prognostic factor in many cancers. The aim of this study was to evaluate the relationship between snail and E-cadherin protein expression and the prognostic significance of snail expression in HC. We examined the protein expression of snail and E-cadherin in HC tissues from 47 patients (22 males and 25 females, mean age 61.2 years) using immunohistochemistry and RT-PCR. The proliferation rate was also evaluated in the same cases by the MIB1 index. High, low and negative snail protein expression was recorded in 18 (38%), 17 (36%), and 12 (26%) cases, respectively, and 40.4% (19/47) of cases showed reduced E-cadherin protein expression in HC samples. No significant correlation was found between snail and E-cadherin protein expression levels (P = 0.056). No significant correlation was found between snail protein expression levels and gender, age, tumor grade, vascular or perineural invasion, nodal metastasis and invasion, or proliferative index. Cancer samples with positive snail protein expression were associated with poor survival compared with the negative-expression group. Kaplan-Meier curves comparing the different snail protein expression levels with survival showed highly significant separation (P < 0.0001, log-rank test). In multivariate analysis, snail protein expression was the only parameter found to influence survival (P = 0.0003). We suggest that snail expression levels can predict poor survival regardless of pathological features and tumor proliferation. Immunohistochemical detection of snail protein expression levels in routine sections may provide the first biological prognostic marker.
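For the multivariate step described above, a Cox proportional hazards model is the usual choice; the sketch below uses lifelines with simulated covariates whose names mirror the reported parameters. It is illustrative only, not the authors' analysis.

```python
# Sketch of a multivariate survival analysis of the type described above
# (Cox proportional hazards), using lifelines with simulated data.
# Column names mirror the reported parameters; all values are hypothetical.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 47
df = pd.DataFrame({
    "snail_high": rng.integers(0, 2, n),    # high snail protein expression (0/1)
    "ecad_reduced": rng.integers(0, 2, n),  # reduced E-cadherin expression (0/1)
    "tumor_grade": rng.integers(1, 4, n),   # grade 1-3
    "time_months": rng.exponential(24, n),  # survival time (simulated)
    "death": rng.integers(0, 2, n),         # event indicator (simulated)
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_months", event_col="death")
cph.print_summary()   # hazard ratios and P values per covariate
```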
Abstract:
The participation of regulatory T (Treg) cells in B cell-induced T cell tolerance has been claimed in different models. In skin grafts, naive B cells were shown to induce graft tolerance. However, neither the contribution of Treg cells to B cell-induced skin tolerance nor their contribution to the histopathological diagnosis of graft acceptance has been addressed. Here, using male C57BL/6 naive B cells to tolerize female animals, we show that skin graft tolerance is dependent on CD25+ Treg cell activity and independent of B cell-derived IL-10. In fact, B cells from IL-10-deficient mice were able to induce skin graft tolerance, while Treg depletion of the host abrogated the otherwise 100% graft survival. We then asked how Treg cell-mediated tolerance would affect histopathology. B cell-tolerized skin grafts showed pathological scores as high as those of rejected skin from naive, non-tolerized mice, owing to loss of skin appendages, reduced keratinization and mononuclear cell infiltrate. However, in tolerized mice, 40% of graft-infiltrating CD4+ cells were FoxP3+ Treg cells, with a high Treg:Teff (effector T cell) ratio (6:1), compared to non-tolerized mice, in which Tregs comprised less than 8% of total infiltrating CD4+ cells, with a Treg:Teff ratio below 1:1. These results make Treg cells an obligatory target for histopathological studies on tissue rejection, which may help to diagnose and predict the outcome of a transplanted organ.
Abstract:
Febrile neutropenia remains a frequent complication in onco-hematological patients, and changes in the circulating levels of inflammatory molecules (IM) may precede the occurrence of fever. The present observational prospective study was carried out to evaluate the behavior of plasma tumor necrosis factor alpha (TNF-α), soluble TNF-α receptors I and II (sTNFRI and sTNFRII), monocyte chemoattractant protein-1 [MCP-1 or chemokine (C-C motif) ligand 2 (CCL2)], macrophage inflammatory protein-1α (MIP-1α or CCL3), eotaxin (CCL11), interleukin-8 (IL-8 or CXCL8), and interferon-inducible protein-10 (IP-10 or CXCL10) in 32 episodes of neutropenia in 26 onco-hematological patients. IM were measured at enrollment, 24-48 h before the onset of fever, and within 24 h of the first occurrence of fever. In 8 of the 32 episodes of neutropenia fever did not develop (control group), and these patients underwent IM testing on three different occasions. sTNFRI levels, measured a median of 11 h (range 1-15) before the onset of fever, were significantly higher in patients who presented fever during follow-up than in controls (P = 0.02). Similar results were observed for sTNFRI and CCL2 levels (P = 0.04 for both) in non-transplanted patients. A cut-off of 1514 pg/mL for sTNFRI was able to discriminate between neutropenic patients with or without fever during follow-up, with 65% sensitivity, 87% specificity, and 93% positive predictive value. Measurement of plasma sTNFRI levels can be used to predict the occurrence of fever in neutropenic patients.
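The diagnostic performance of a single sTNFRI cut-off (1514 pg/mL) can be summarized from a 2x2 confusion matrix as sketched below with scikit-learn; the marker values and fever outcomes are simulated, not the study data.

```python
# Minimal sketch (hypothetical data): performance of a single sTNFRI cut-off
# (1514 pg/mL) for predicting fever, expressed as sensitivity, specificity,
# and positive predictive value from a 2x2 confusion matrix.
import numpy as np
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(4)
stnfri = rng.normal(loc=1400, scale=400, size=32)   # pg/mL, simulated
fever = rng.integers(0, 2, 32)                      # 1 = developed fever (simulated)

predicted = (stnfri >= 1514).astype(int)            # test positive above the cut-off
tn, fp, fn, tp = confusion_matrix(fever, predicted).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)
print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}, PPV={ppv:.2f}")
```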
Abstract:
Because histopathological changes in the lungs of patients with systemic sclerosis (SSc) are consistent with alveolar and vessel cell damage, we presume that this interaction can be characterized by analyzing the expression of proteins regulating nitric oxide (NO) and plasminogen activator inhibitor-1 (PAI-1) synthesis. To validate the importance of alveolar-vascular interactions and to explore the quantitative relationship between these factors and other clinical data, we studied these markers in 23 cases of SSc nonspecific interstitial pneumonia (SSc-NSIP). We used immunohistochemistry and morphometry to evaluate the number of cells in alveolar septa and vessels staining for NO synthase (NOS) and PAI-1, and the outcomes of our study were cellular and fibrotic NSIP, pulmonary function tests, and survival time until death. General linear model analysis demonstrated that staining for septal inducible NOS (iNOS) was significantly related to staining of septal cells for interleukin (IL)-4 and to septal IL-13. In univariate analysis, higher levels of septal and vascular cells staining for iNOS were associated with a smaller percentage of septal and vascular cells expressing fibroblast growth factor and with less myofibroblast proliferation, respectively. Multivariate Cox model analysis demonstrated that, after controlling for SSc-NSIP histological patterns, only three variables were significantly associated with survival time: septal iNOS (P=0.04), septal IL-13 (P=0.03), and septal basic fibroblast growth factor (bFGF; P=0.02). Augmented NOS, IL-13, and bFGF in SSc-NSIP histological patterns suggest a possible functional role for iNOS in SSc. In addition, the extent of iNOS, PAI-1, and IL-4 staining in alveolar septa and vessels provides a possible independent diagnostic measure of the degree of pulmonary dysfunction and fibrosis, with an impact on the survival of patients with SSc.
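The general linear model step mentioned above (relating septal iNOS staining to septal IL-4 and IL-13 staining) could be expressed with the statsmodels formula API as in the sketch below, which uses simulated morphometric percentages rather than the study measurements.

```python
# Sketch (simulated morphometric data): a general linear model relating the
# percentage of septal cells staining for iNOS to septal IL-4 and IL-13
# staining, in the spirit of the association described above. Not the authors' code.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 23
df = pd.DataFrame({
    "septal_il4": rng.uniform(0, 30, n),    # % cells staining for IL-4 (simulated)
    "septal_il13": rng.uniform(0, 30, n),   # % cells staining for IL-13 (simulated)
})
# Simulated response with arbitrary coefficients, just to make the fit run
df["septal_inos"] = 2 + 0.4 * df["septal_il4"] + 0.3 * df["septal_il13"] + rng.normal(0, 2, n)

fit = smf.ols("septal_inos ~ septal_il4 + septal_il13", data=df).fit()
print(fit.summary())
```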
Abstract:
Hypertrophy is a major predictor of progressive heart disease and has an adverse prognosis. MicroRNAs (miRNAs) that accumulate during the course of cardiac hypertrophy may participate in the process. However, the nature of any interaction between a hypertrophy-specific signaling pathway and aberrant expression of miRNAs remains unclear. In this study, male Sprague-Dawley rats were subjected to transverse aortic constriction (TAC) surgery to mimic pathological hypertrophy. Hearts were isolated from TAC and sham-operated rats (n=5 for each group at 5, 10, 15, and 20 days after surgery) for miRNA microarray assay. The miRNAs aberrantly expressed during hypertrophy were further analyzed using a combination of bioinformatics algorithms in order to predict possible targets. Increased expression of the target genes identified in diverse signaling pathways was also analyzed. Two sets of miRNAs were identified, showing different expression patterns during hypertrophy. Bioinformatics analysis suggested that the miRNAs may regulate multiple hypertrophy-specific signaling pathways by targeting the member genes, and that the interaction of miRNAs and mRNAs might form a network that leads to cardiac hypertrophy. In addition, the multifold changes in several miRNAs suggested that upregulation of rno-miR-331*, rno-miR-3596b, and rno-miR-3557-5p and downregulation of rno-miR-10a, miR-221, miR-190, and miR-451 could serve as biomarkers of prognosis in the clinical therapy of heart failure. This study described, for the first time, a potential mechanism of cardiac hypertrophy involving multiple signaling pathways that control the up- and downregulation of miRNAs. It represents a first step in the systematic discovery of miRNA function in cardiovascular hypertrophy.
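A minimal differential-expression screen of the kind used to identify dysregulated miRNAs (TAC vs sham, n = 5 per group) is sketched below: a per-miRNA t-test with fold change and Benjamini-Hochberg correction. The expression matrix is simulated and the workflow is only a rough approximation of a microarray analysis.

```python
# Sketch of a simple differential-expression screen on miRNA microarray data
# (TAC vs sham, n = 5 per group): per-miRNA t-test plus fold change, with
# Benjamini-Hochberg correction. The expression matrix is simulated.
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(6)
n_mirna = 350
tac = rng.normal(8, 1, size=(n_mirna, 5))    # log2 expression, TAC hearts (simulated)
sham = rng.normal(8, 1, size=(n_mirna, 5))   # log2 expression, sham hearts (simulated)

t, p = stats.ttest_ind(tac, sham, axis=1)            # one test per miRNA
log2_fc = tac.mean(axis=1) - sham.mean(axis=1)       # log2 fold change
reject, p_adj, _, _ = multipletests(p, alpha=0.05, method="fdr_bh")
candidates = np.where(reject & (np.abs(log2_fc) > 1))[0]
print(f"{candidates.size} candidate miRNAs (FDR < 0.05, |log2FC| > 1)")
```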
Abstract:
Neoadjuvant chemotherapy has practical and theoretical advantages over the adjuvant chemotherapy strategy in breast cancer (BC) management. Moreover, metronomic delivery has a more favorable toxicity profile. The present study examined the feasibility of neoadjuvant metronomic chemotherapy in two cohorts [HER2+ (TraQme) and HER2− (TAME)] of locally advanced BC. Twenty patients were prospectively enrolled (TraQme, n=9; TAME, n=11). Both cohorts received weekly paclitaxel at 100 mg/m2 for 8 weeks, followed by weekly doxorubicin at 24 mg/m2 for 9 weeks in combination with oral cyclophosphamide at 100 mg/day (fixed dose). The HER2+ cohort also received weekly trastuzumab. The study was interrupted because of safety issues. Thirty-six percent of patients in the TAME cohort and all patients in the TraQme cohort had stage III BC. Of note, 33% of the TraQme cohort and 66% of the TAME cohort displayed hormone receptor positivity in tumor tissue. The pathological complete response rates were 55% and 18% among patients enrolled in the TraQme and TAME cohorts, respectively. Patients in the TraQme cohort had more advanced BC stages at diagnosis, higher-grade pathological classification, and more tumors lacking hormone receptor expression, compared to the TAME cohort. The toxicity profile was also different. Two patients in the TraQme cohort developed pneumonitis, and in the TAME cohort we observed more hematological toxicity and hand-foot syndrome. The neoadjuvant metronomic chemotherapy regimen evaluated in this trial was highly effective in achieving a tumor response, especially in the HER2+ cohort. Pneumonitis was a serious, unexpected adverse event observed in this group. Larger, randomized trials are warranted to evaluate the association between metronomic chemotherapy and trastuzumab treatment.
Abstract:
Four cycles of chemotherapy are required to assess the response of multiple myeloma (MM) patients. We investigated whether circulating endothelial progenitor cells (cEPCs) could be a biomarker for predicting patient response in the first cycle of chemotherapy with bortezomib and dexamethasone, so that patients might avoid ineffective and costly treatment and reduce exposure to unwanted side effects. We measured cEPCs and stromal cell-derived factor-1α (SDF-1α) in 46 MM patients during the first cycle of treatment with bortezomib and dexamethasone, and investigated the clinical relevance based on patient response after four 21-day cycles. The mononuclear cell fraction was analyzed for cEPCs by FACS analysis, and SDF-1α was analyzed by ELISA. The study population was divided into 3 groups according to the response to chemotherapy: good responders (n=16), common responders (n=12), and non-responders (n=18). There were no significant differences among these groups at baseline (day 1; P>0.05). cEPC levels decreased slightly at day 21 (8.2±3.3 cEPCs/μL) vs day 1 (8.4±2.9 cEPCs/μL) in good responders (P>0.05). In contrast, cEPC levels increased significantly in the other two groups (P<0.05). Changes in SDF-1α were closely related to changes in cEPCs. These findings indicate that the change in cEPCs at day 21 of the first cycle might be considered a noninvasive biomarker for predicting a later response, and the extent of change could help in deciding whether to continue this costly chemotherapy. cEPCs and the SDF-1α/CXCR4 axis are potential therapeutic targets for improved response and outcomes in MM patients.
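The within-group comparison of cEPC counts at day 1 vs day 21 of the first cycle is a paired comparison; the sketch below shows a paired t-test and its nonparametric alternative with SciPy, using simulated counts rather than patient data.

```python
# Sketch (simulated counts): paired comparison of cEPC levels at day 1 vs
# day 21 of the first cycle within one response group, as described above.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
day1 = rng.normal(8.4, 2.9, 16)            # cEPCs/uL at day 1 (simulated)
day21 = day1 + rng.normal(-0.2, 1.0, 16)   # slight decrease at day 21 (simulated)

t, p = stats.ttest_rel(day1, day21)        # paired t-test
w, p_w = stats.wilcoxon(day1, day21)       # nonparametric alternative
print(f"paired t-test P = {p:.3f}; Wilcoxon P = {p_w:.3f}")
```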
Abstract:
Transforming growth factor beta 1 (TGF-β1) and bone morphogenetic protein-2 (BMP-2) are important regulators of bone repair and regeneration. In this study, we examined whether TGF-β1 and BMP-2 expression was delayed during bone healing in type 1 diabetes mellitus. Tibial fractures were created in 95 diabetic and 95 control adult male Wistar rats aged 10 weeks. At 1, 2, 3, 4, and 5 weeks after fracture induction, five rats were sacrificed from each group. The expression of TGF-β1 and BMP-2 in the fractured tibias was measured by immunohistochemistry and quantitative reverse-transcription polymerase chain reaction, weekly for the first 5 weeks post-fracture. Mechanical parameters (bending rigidity, torsional rigidity, destruction torque) of the healing bones were also assessed at 3, 4, and 5 weeks post-fracture, after the rats were sacrificed. The bending rigidity, torsional rigidity and destruction torque of the two groups increased continuously during the healing process. The diabetes group had lower mean values of bending rigidity, torsional rigidity and destruction torque compared with the control group (P<0.05). TGF-β1 and BMP-2 expression was significantly lower (P<0.05) in the control group than in the diabetes group at postoperative weeks 1, 2, and 3. Peak levels of TGF-β1 and BMP-2 expression were delayed by 1 week in the diabetes group compared with the control group. Our results demonstrate that there was a delayed recovery in the biomechanical function of the fractured bones in diabetic rats. This delay may be associated with the delayed expression of the growth factors TGF-β1 and BMP-2.
Abstract:
Biliary atresia (BA) is classically described in the neonatal period. However, rare cases of BA in older infants have also been reported. We report four cases of late-onset BA in infants older than 4 weeks (3 males, 1 female) and describe the diagnostic and management difficulties. One of the cases presented very late (at 29 weeks) and underwent a successful surgical procedure. We highlight the importance of this unusual differential diagnosis in infants with cholestatic syndrome, who may benefit from Kasai surgery regardless of age.