645 results for "Transfusion culot globulaire"
Abstract:
Objective: To evaluate the effectiveness and safety of correction of pectus excavatum by the Nuss technique based on the available scientific evidence. Methods: We conducted an evidence synthesis following systematic processes of search, selection, extraction and critical appraisal. Outcomes were classified by importance and had their quality assessed with the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach. Results: The selection process led to the inclusion of only one systematic review, which synthesized the results of nine observational studies comparing the Nuss and Ravitch procedures. The evidence found was rated as low and very low quality. The Nuss procedure increased the incidence of hemothorax (RR = 5.15; 95% CI: 1.07 to 24.89), pneumothorax (RR = 5.26; 95% CI: 1.55 to 17.92) and the need for reintervention (RR = 4.88; 95% CI: 2.41 to 9.88) compared with the Ravitch procedure. There was no statistically significant difference between the two procedures for the outcomes of general complications, blood transfusion, hospital stay and time to ambulation. The Nuss operation was faster than the Ravitch (mean difference [MD] = -69.94 minutes; 95% CI: -139.04 to -0.83). Conclusion: In the absence of well-designed prospective studies to clarify the evidence, especially in terms of aesthetics and quality of life, the surgical indication should be individualized and the choice of technique based on patient preference and the experience of the team.
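The following sketch (not taken from the review; the event counts are hypothetical) illustrates how a relative risk of this form and its 95% confidence interval are computed from a 2x2 table of events per surgical arm, using the standard log-normal approximation.

```python
# Hedged illustration: relative risk (RR) with a 95% CI from a 2x2 table.
# The counts below are hypothetical and only show the arithmetic behind
# figures such as RR = 5.15 (95% CI: 1.07 to 24.89).
import math

def relative_risk(events_a, total_a, events_b, total_b, z=1.96):
    """Return (RR, lower, upper) using the log-normal approximation."""
    risk_a = events_a / total_a            # event risk in arm A (e.g., Nuss)
    risk_b = events_b / total_b            # event risk in arm B (e.g., Ravitch)
    rr = risk_a / risk_b
    se_log_rr = math.sqrt(1 / events_a - 1 / total_a + 1 / events_b - 1 / total_b)
    lower = math.exp(math.log(rr) - z * se_log_rr)
    upper = math.exp(math.log(rr) + z * se_log_rr)
    return rr, lower, upper

# Hypothetical example: 10/200 hemothorax events after Nuss vs 2/206 after Ravitch.
print(relative_risk(10, 200, 2, 206))
```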
Abstract:
Trauma is one of the world's leading causes of death within the first 40 years of life and thus a significant health problem. Trauma accounts for nearly a third of the years of productive life lost before 65 years of age and is associated with infection, hemorrhagic shock, reperfusion syndrome, and inflammation. Control of hemorrhage and coagulopathy, optimal use of blood products, balancing of hypo- and hyperperfusion, and hemostatic resuscitation improve survival in cases of trauma with massive hemorrhage. This review discusses inflammation in the context of trauma-associated hemorrhagic shock. When one considers the known immunomodulatory effects of traumatic injury and allogeneic blood transfusion, and the overlap between the patient populations, it is surprising that so few studies have assessed their combined effects on immune function. We also discuss the relative benefits of curbing inflammation rather than attempting to prevent it.
Abstract:
Objective: To evaluate the perioperative outcomes, safety and feasibility of video-assisted resection for primary and secondary liver lesions. Methods: From a prospective database, we analyzed the perioperative results (up to 90 days) of 25 consecutive patients undergoing video-assisted resections between June 2007 and June 2013. Results: The mean age was 53.4 years (23-73) and 16 (64%) patients were female. Of the total, 84% had malignant disease. We performed 33 resections (1 to 4 nodules per patient). The procedures performed were non-anatomical resections (n = 26), segmentectomy (n = 1), 2/3 bisegmentectomy (n = 1), 6/7 bisegmentectomy (n = 1), left hepatectomy (n = 2) and right hepatectomy (n = 2). The procedures involved posterosuperior segments in 66.7% of cases, requiring multiple or larger resections. The average operating time was 226 minutes (80-420) and the average anesthesia time was 360 minutes (200-630). The average size of the resected nodules was 3.2 cm (0.8 to 10) and the surgical margins were free in all the analyzed specimens. Eight percent of patients needed blood transfusion and no case was converted to open surgery. The length of stay was 6.5 days (3-16). Postoperative complications occurred in 20% of patients, with no perioperative mortality. Conclusion: Video-assisted liver resection is feasible and safe and should be part of the liver surgeon's armamentarium for resection of primary and secondary liver lesions.
Abstract:
The study of canine immunohematology is very important for veterinary transfusion medicine. The objective of this study was to determine the DEA blood type frequencies in a purebred canine blood donor population from Porto Alegre, RS, Brazil. One hundred clinically healthy purebred dogs were selected, 20 dogs from each breed (Great Dane, Rottweiler, Golden Retriever, German Shepherd and Argentine Dogo). Blood samples were collected in ACD-A tubes and the MSU hemagglutination tube test (MI, USA) was used to determine the blood types. The studied population presented overall frequencies of 61% for DEA 1.1, 22% for DEA 1.2, 7% for DEA 3, 100% for DEA 4, 9% for DEA 5 and 16% for DEA 7. A significant association was found between breeds and certain combinations of blood types in this population. The results agree with the literature, since most of the canine population studied was positive for DEA 1.1, the most antigenic blood type in dogs. Differences were found among the studied breeds and should be considered when selecting a blood donor. Knowledge of blood type frequencies and their combinations in different canine populations, including different breeds, is important because it reveals the particularities of each group, helps to maintain a databank of local frequencies and minimizes the risk of transfusion reactions.
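As a hedged illustration (the counts below are hypothetical, not the study's data), an association between breed and a blood type such as DEA 1.1 can be tested with a chi-square test of independence on a breed-by-phenotype contingency table:

```python
# Hypothetical contingency table: DEA 1.1-positive vs -negative dogs per breed
# (20 dogs per breed, as in the study design; the counts themselves are invented).
import numpy as np
from scipy.stats import chi2_contingency

table = np.array([
    [15, 14, 12, 10, 10],   # DEA 1.1 positive (hypothetical)
    [ 5,  6,  8, 10, 10],   # DEA 1.1 negative (hypothetical)
])  # columns: Great Dane, Rottweiler, Golden Retriever, German Shepherd, Argentine Dogo

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```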
Abstract:
Background: Approximately 11,000 revascularization procedures, either percutaneous coronary interventions (PCI) or coronary artery bypass grafting (CABG), are performed yearly in Finland for coronary artery disease. Periprocedural risk factors for mortality and morbidity, as well as long-term outcome, have been extensively studied in general populations undergoing revascularization. However, the choice between PCI and CABG in many high-risk groups and risk stratification need clarification, and there is still room for improvement in periprocedural outcomes. Materials and methods: Cohorts of patients revascularized in Finnish hospitals between 2001 and 2011 were retrospectively analyzed. Patient records were reviewed for baseline variables and postprocedural outcomes (stroke, myocardial infarction, quality of life measured by the EQ-5D questionnaire, repeat revascularization, bleeding episodes). Data on the date and mode of death were obtained from Statistics Finland. Statistical analysis was performed to identify predictors of adverse events and to compare the procedures. Results: Postoperative administration of blood products (red blood cells, fresh frozen plasma, platelets) after isolated CABG independently and dose-dependently increased the risk of stroke. Patients 80 years or older who underwent CABG had better survival at 5 years than those who underwent PCI; after adjusting for baseline differences, survival was similar. Patients on oral anticoagulation (OAC) for atrial fibrillation (AF) treated with CABG had better survival and overall outcome at 3 years than PCI patients, with no difference in the incidence of stroke or bleeding episodes; the differences in outcome remained significant after adjusting for propensity score. Lower health-related quality of life (HRQOL) scores, as measured by the visual analogue scale (VAS) of the EQ-5D questionnaire, at 6 months after CABG predicted later major adverse cardiac and cerebrovascular events (MACCE). Deteriorating EQ-5D function and VAS scores between 0 and 6 months also independently predicted later MACCE. Conclusions: Administration of blood products can increase the risk of stroke after CABG, and liberal use of transfusions should be avoided. In the frail subpopulations of patients on OAC and octogenarians, CABG appears to offer superior long-term outcomes compared with PCI. Deteriorating HRQOL scores predict later adverse events after CABG. Keywords: percutaneous coronary intervention, coronary artery bypass grafting, age over 80, transfusion, anticoagulants, coronary artery disease, health-related quality of life, outcome.
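As a minimal sketch of the kind of propensity-score adjustment referred to above (synthetic data, not the Finnish registry cohort; the variable names are invented for illustration), a logistic model first estimates each patient's probability of receiving CABG from baseline covariates, and that score is then included as a covariate in the outcome model:

```python
# Hedged illustration of propensity-score adjustment with synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 500
age      = rng.normal(72, 8, n)            # hypothetical baseline covariates
diabetes = rng.integers(0, 2, n)
X = np.column_stack([age, diabetes])

# Hypothetical treatment assignment (1 = CABG, 0 = PCI) and 3-year mortality.
treatment = (rng.random(n) < 1 / (1 + np.exp(-(0.05 * (age - 72) + 0.4 * diabetes)))).astype(int)
death     = (rng.random(n) < 0.15).astype(int)

# Step 1: propensity score = P(CABG | covariates).
propensity = LogisticRegression().fit(X, treatment).predict_proba(X)[:, 1]

# Step 2: outcome model adjusted for the propensity score.
outcome_model = LogisticRegression().fit(np.column_stack([treatment, propensity]), death)
print("adjusted treatment coefficient:", outcome_model.coef_[0][0])
```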
Abstract:
Immature and mature leaves of juvenile and adult plants of Araucaria angustifolia (Araucariaceae) were examined with the objective of updating the morphoanatomical data on the leaves of this species, which had been restricted to basic descriptions in previous studies. The observations, made under optical microscopy, allowed us to establish anatomical differences between mature leaves of juvenile and adult plants with respect to the number of palisade parenchyma layers, the number of compartmented cells and the development of the transfusion tissue. The descriptions of the epidermis, albuminous cells, phloem and transfusion tissue disagree with data obtained by other authors. The epidermal tissue and the hypodermis are fully differentiated while the plant is still juvenile, suggesting that these tissues begin performing their protective function against mechanical damage and water loss early, a characteristic that is vital during the first months of development of the young offspring.
Abstract:
Thirty-seven patients were submitted to kidney transplantation after transfusion at 2-week intervals with 4-week-stored blood from their potential donors. All patients and donors were typed for HLA-A, -B and -DR antigens. The patients were also tested for cytotoxic antibodies against donor antigens before each transfusion. The percentage of panel reactive antibodies (PRA) was determined against a selected panel of 30 cell donors before and after the transfusions. The patients were immunosuppressed with azathioprine and prednisone. Rejection crises were treated with methylprednisolone. The control group consisted of 23 patients who received grafts from an unrelated donor but who did not receive donor-specific pretransplant blood transfusion. The incidence and reversibility of rejection episodes, allograft loss caused by rejection, and patient and graft survival rates were determined for both groups. Non-parametric methods (chi-square and Fisher tests) were used for statistical analysis, with the level of significance set at P < 0.05. The incidence and reversibility of rejection crises during the first 60 post-transplant days did not differ significantly between groups. The actuarial graft and patient survival rates at five years were 56% and 77%, respectively, for the treated group and 39.8% and 57.5% for the control group. Graft loss due to rejection was significantly higher in the untreated group (P = 0.0026), which also required more intense immunosuppression (P = 0.0001). We conclude that transfusions using stored blood have the immunosuppressive effect of fresh blood transfusions without the risk of provoking widespread formation of antibodies. In addition, this method permits a reduction of the immunosuppressive drugs during the process without impairing the adequate functioning of the renal graft.
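As an illustration of the non-parametric comparison named above (Fisher's exact test), the sketch below uses hypothetical graft-loss counts for the treated (n = 37) and control (n = 23) groups; the actual counts are not reported in the abstract.

```python
# Hedged illustration: Fisher's exact test on a 2x2 table of graft loss due to
# rejection vs graft retained. Counts are hypothetical, not the study's data.
from scipy.stats import fisher_exact

table = [[4, 33],   # treated group: lost, retained (hypothetical, n = 37)
         [9, 14]]   # control group: lost, retained (hypothetical, n = 23)

odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.4f}")
```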
Abstract:
Red blood cells (RBC) are viable if kept in an adequate preservative solution, although gradual changes in morphology and metabolism may occur. There is a gradual decrease in adenosine-5'-triphosphate (ATP) concentration, pH, glucose consumption, and enzyme activity during preservation. The normal discocyte shapes are initially replaced by echinocytes and stomatocytes and, at the final stages, by spherocytes, the last step before splenic sequestration. Post-transfusional survival has been correlated with ATP concentration. RBC preserved in ADSOL, a solution containing adenine, dextrose, sodium chloride, and mannitol, are viable for transfusion for up to 6 weeks. Erythrocytes from 10 blood units taken from healthy adult donors were preserved for 12 weeks in ADSOL at 4°C. We now report a significant correlation (r² = 0.98) between the percentage of discocytes (which fell from 89 to 7%) and the ATP concentration (which fell from 100 to 10% of the initial value) in ADSOL-preserved RBC. The results suggest that the percentage of discocyte shapes may serve as an indicator of ATP concentration and thus as a useful quality-control measure of RBC viability in centers with limited assay facilities.
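The sketch below (synthetic values, not the study's measurements) shows how a correlation of this kind between percent discocytes and residual ATP can be quantified by simple linear regression, reporting r² as in the abstract:

```python
# Hedged illustration with synthetic storage data: linear fit of percent
# discocytes against percent ATP remaining, reporting the slope and r^2.
import numpy as np
from scipy.stats import linregress

weeks       = np.arange(0, 13)                        # storage time, weeks
atp_percent = np.linspace(100, 10, weeks.size)        # hypothetical ATP decline
discocytes  = 0.88 * atp_percent + np.random.default_rng(0).normal(0, 2, weeks.size)

fit = linregress(atp_percent, discocytes)
print(f"slope = {fit.slope:.2f}, r^2 = {fit.rvalue**2:.3f}")
```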
Abstract:
Systemic iron overload (IO) is considered a principal determinant of clinical outcome in different forms of IO and in allogeneic hematopoietic stem cell transplantation (alloSCT). However, indirect markers of iron do not provide exact quantification of iron burden, and the evidence for iron-induced adverse effects in hematological diseases has not been established. Hepatic iron concentration (HIC) has been found to represent systemic IO and can be quantified safely with magnetic resonance imaging (MRI), based on enhanced transverse relaxation. Iron measurement methods based on MRI are evolving. The aims of this study were to implement and optimise the methodology of non-invasive iron measurement with MRI and to assess the degree and role of IO in the patients. An MRI-based HIC method (M-HIC) and a transverse relaxation rate (R2*) derived from M-HIC images were validated. Thereafter, a transverse relaxation rate (R2) from spin-echo imaging was calibrated for IO assessment. Two analysis methods for rapid IO grading from in-phase and out-of-phase images, visual grading and rSI, were introduced. Additionally, clinical iron indicators were evaluated. The degree of hepatic and cardiac iron in our study patients and IO as a prognostic factor in patients undergoing alloSCT were explored. In vivo and in vitro validations indicated that M-HIC and R2* are both accurate for the quantification of liver iron. R2 was a reliable method for HIC quantification and covered a wider HIC range than M-HIC and R2*. Grading of IO could be performed rapidly with the visual grading and rSI methods. Transfusion load was more accurate than plasma ferritin in predicting transfusional IO. In patients with hematological disorders, hepatic IO was frequent, in contrast to cardiac IO. Patients with myelodysplastic syndrome were found to be the most susceptible to IO. Pre-transplant IO predicted severe infections during the early post-transplant period, whereas it was associated with a reduced risk of graft-versus-host disease. Iron-induced impairment of transplantation outcomes is most likely mediated by severe infections.
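As an illustration of the relaxometry principle behind R2* quantification (a sketch with synthetic signal values, not the thesis's imaging data or calibration), R2* can be estimated by fitting a mono-exponential decay S(TE) = S0·exp(-R2*·TE) to signal intensities measured at several echo times:

```python
# Hedged illustration: estimating R2* from multi-echo signal intensities.
import numpy as np
from scipy.optimize import curve_fit

def decay(te, s0, r2_star):
    """Mono-exponential signal decay; te in ms, r2_star in 1/ms."""
    return s0 * np.exp(-r2_star * te)

te_ms  = np.array([1.0, 2.5, 4.0, 6.0, 9.0, 12.0])    # hypothetical echo times, ms
signal = decay(te_ms, 1000.0, 0.25)                   # true R2* = 250 1/s (hypothetical)
signal = signal + np.random.default_rng(3).normal(0, 5, signal.size)  # add noise

(s0_hat, r2_star_hat), _ = curve_fit(decay, te_ms, signal, p0=(signal[0], 0.1))
print(f"fitted R2* = {r2_star_hat * 1000:.0f} 1/s")
```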
Abstract:
We determined and analyzed the risk factors of hepatitis C virus (HCV)-infected Brazilian hemophiliacs according to their virological, clinical and epidemiological characteristics. A cross-sectional and retrospective study of 469 hemophiliacs was carried out at a Brazilian blood center starting in October 1997. The prevalence of HCV infection, the HCV genotypes and the factors associated with HCV RNA detection were determined. The seroprevalence of anti-HCV antibodies (ELISA-3.0) was 44.6% (209/469). Virological, clinical and epidemiological assessments were completed for 162 positive patients. There were seven (4.3%) anti-HCV seroconversions between October 1992 and October 1997. During the same period, 40.8% of the anti-HCV-positive hemophiliacs had abnormal alanine transaminase (ALT) levels. Plasma HCV RNA was detected by nested RT-PCR in 116 patients (71.6%). RFLP analysis showed the following genotype distribution: HCV-1 in 98 hemophiliacs (84.5%), HCV-3 in ten (8.6%), HCV-4 in three (2.6%), HCV-2 in one (0.9%), and not typeable in four cases (3.4%). Univariate analysis indicated that older age (P = 0.017) and abnormal ALT levels (P = 0.010) were associated with HCV viremia, while the presence of inhibitor antibodies (P = 0.024) and HBsAg (P = 0.007) represented protective factors against the presence of HCV RNA. These findings may contribute to a better understanding of the relationship between HCV infection and hemophilia.
Abstract:
The purpose of the present study was to identify noninvasive methods to evaluate the severity of iron overload in transfusion-dependent β-thalassemia and the efficiency of intensive intravenous therapy as an additional tool for the treatment of iron-overloaded patients. Iron overload was evaluated in 26 β-thalassemia homozygous patients, 14 of whom were submitted to intensive chelation therapy with high doses of intravenous deferoxamine (DF). Patients were classified into six groups of increasing clinical severity and were divided into compliant and non-compliant patients depending on their adherence to chronic chelation treatment. Several methods were used as indicators of iron overload. Total gain of transfusion iron, plasma ferritin, and urinary iron excretion in response to 20 to 60 mg/day subcutaneous DF given for 8 to 12 h daily are useful to identify iron overload; however, urinary iron excretion in response to 9 g intravenous DF over 24 h and the increase in urinary iron excretion induced by high doses of the chelator are more reliable for identifying different degrees of iron overload because of their correlation with the clinical grades of secondary hemochromatosis and the significant differences observed between the compliant and non-compliant groups. Finally, the use of 3-9 g intravenous DF for 6-12 days led to a urinary iron excretion corresponding to 4.1 to 22.4% of the annual transfusion iron gain. Therefore, continuous intravenous DF at high doses may be an additional treatment for these patients, as a complement to the regular subcutaneous infusion at home, but it requires individual planning and close monitoring of adverse reactions.
Abstract:
Patients with sickle-cell anemia submitted to frequent blood transfusions are at risk of contamination with hepatitis C virus (HCV). Determination of HCV RNA and genotype characterization are parameters relevant to the treatment of the viral infection. The objective of the present study was to determine the frequency of HCV infection and of HCV RNA positivity and to identify the HCV genotype in patients with sickle-cell anemia with a history of blood transfusion who had been treated at the Hospital of the HEMOPE Foundation. Sera from 291 patients were tested for anti-HCV antibodies by ELISA 3.0 and RIBA 3.0 (Chiron) and for the presence of HCV RNA by RT-PCR. HCV genotyping was performed on 19 serum samples. Forty-one of the 291 patients (14.1%) were anti-HCV positive by ELISA and RIBA. Both univariate and multivariate analyses showed a greater risk of anti-HCV positivity in those who had started a transfusion regimen before 1992 and had received more than 10 units of blood. Thirty-four of the anti-HCV-positive patients (34/41, 82.9%) were also HCV RNA positive. Univariate analysis, used to compare HCV RNA-negative and -positive patients, did not indicate a higher risk of HCV RNA positivity for any of the variables evaluated. The genotypes identified were 1b (63%), 1a (21%) and 3a (16%). A high prevalence of HCV infection was observed in our patients with sickle-cell anemia (14.1%) compared to the general population (3%). In the literature, the frequency of HCV infection in sickle-cell anemia ranges from 2 to 30%. Serological screening for anti-HCV at blood banks after 1992 has contributed to better control of the dissemination of HCV infection. Because of the predominance of genotype 1, these patients belong to a group requiring special treatment, with a probable indication for new therapeutic options against HCV.
Abstract:
Blood transfusion in patients with sickle cell disease (SCD) is limited by the development of alloantibodies to erythrocytes. In the present study, the frequency of and risk factors for alloimmunization were determined. Transfusion records and medical charts of 828 SCD patients who had been transfused and followed at the Belo Horizonte Blood Center, Belo Horizonte, MG, Brazil, were retrospectively reviewed. The alloimmunization frequency was 9.9% (95% CI: 7.9 to 11.9%) and 125 alloantibodies were detected, 79% of which belonged to the Rhesus and Kell systems. Female patients developed alloimmunization more frequently (P = 0.03). The median age of the alloimmunized group was 23.3 years, compared to 14.6 years for the non-alloimmunized group (P < 0.0001). Multivariate analyses were applied to the data for 608 hemoglobin (Hb) SS or SC patients whose number of transfusions was recorded accurately. Number of transfusions (P = 0.00006), older age (P = 0.056) and Hb SC (P = 0.02) showed independent statistical associations with alloimmunization. Hb SC patients older than 14 years faced a 2.8-fold higher (95% CI: 1.3 to 6.0) risk of alloimmunization than Hb SS patients. Female Hb SC patients had the highest risk of developing alloantibodies. In patients younger than 14 years, only the number of transfusions was significant. We conclude that an increased risk of alloimmunization was associated with older patients with Hb SC, especially females, even after adjustment for the number of transfusions received, the most significant variable.
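The sketch below reproduces the form of the reported alloimmunization frequency and its 95% CI using a normal-approximation (Wald) interval; the event count of about 82 of 828 patients is inferred from the 9.9% figure rather than stated in the abstract.

```python
# Hedged illustration: Wald 95% CI for a proportion (alloimmunization frequency).
import math

def proportion_ci(events, total, z=1.96):
    p = events / total
    se = math.sqrt(p * (1 - p) / total)
    return p, p - z * se, p + z * se

p, lo, hi = proportion_ci(82, 828)                 # ~82/828 inferred from 9.9%
print(f"{p:.1%} (95% CI {lo:.1%}-{hi:.1%})")       # approx. 9.9% (7.9%-11.9%)
```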
Abstract:
Cardiopulmonary bypass is frequently associated with excessive blood loss. Platelet dysfunction is the main cause of non-surgical bleeding after open-heart surgery. We randomized 65 patients in a double-blind fashion to receive tranexamic acid or placebo in order to determine whether antifibrinolytic therapy reduces chest tube drainage. The tranexamic acid group received an intravenous loading dose of 10 mg/kg before the skin incision, followed by a continuous infusion of 1 mg/kg per hour for 5 h. The placebo group received a bolus of normal saline solution followed by a continuous infusion of normal saline for 5 h. Postoperative bleeding and fibrinolytic activity were assessed. Hematologic data, convulsive seizures, allogeneic transfusion, occurrence of myocardial infarction, mortality, allergic reactions, postoperative renal insufficiency, and reopening rate were also evaluated. The placebo group had greater postoperative blood loss at 12 h after surgery (median (25th to 75th percentile): 540 (350-750) vs 300 (250-455) mL, P = 0.001) and at 24 h after surgery (800 (520-1050) vs 500 (415-725) mL, P = 0.008). There was a significant increase in plasma D-dimer levels after coronary artery bypass grafting only in patients of the placebo group, whereas no significant changes were observed in the group treated with tranexamic acid. The D-dimer levels were 1057 (1025-1100) µg/L in the placebo group and 520 (435-837) µg/L in the tranexamic acid group (P = 0.01). We conclude that tranexamic acid effectively reduces postoperative bleeding and fibrinolysis in patients undergoing first-time coronary artery bypass grafting compared to placebo.
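The abstract reports medians with interquartile ranges and P values, consistent with a non-parametric comparison such as the Mann-Whitney U test; the sketch below (synthetic drainage volumes, not the trial's data) shows how such a comparison would be set up:

```python
# Hedged illustration with synthetic data: comparing 12-h chest tube drainage
# between placebo and tranexamic acid arms with a Mann-Whitney U test.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(42)
placebo    = rng.normal(540, 180, 33).clip(min=100)   # hypothetical volumes, mL
tranexamic = rng.normal(330, 120, 32).clip(min=100)

stat, p = mannwhitneyu(placebo, tranexamic, alternative="two-sided")
print(f"U = {stat:.0f}, p = {p:.4f}")
```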
Abstract:
Hepatitis C virus (HCV) infection has been identified as the major cause of chronic liver disease among patients on chronic hemodialysis (HD), despite the important reduction in risk obtained by testing candidate blood donors for anti-HCV antibodies and by the use of recombinant erythropoietin to treat anemia. A cross-sectional study was performed to estimate the prevalence of HCV infection and genotypes among HD patients in Salvador, Northeastern Brazil. Anti-HCV seroprevalence was determined by ELISA (Murex anti-HCV, Abbott Murex, Chicago, IL, USA) in 1243 HD patients from all ten dialysis centers in the city. HCV infection was confirmed by RT-PCR and genotyping was performed by restriction fragment length polymorphism. Anti-HCV seroprevalence among HD patients was 10.5% (95% CI: 8.8-12.3). Blood samples for qualitative HCV detection and genotyping were collected from 125 of 130 seropositive HD patients (96.2%). HCV RNA was detected in 92/125 (73.6%) of the anti-HCV-positive patients. HCV genotype 1 (77.9%) was the most prevalent, followed by genotype 3 (10.5%) and genotype 2 (4.6%). Mixed infections of genotypes 1 and 3 were found in 7.0% of the patients. The present results indicate a significant decrease in anti-HCV prevalence, from 23.8% detected in a study carried out in 1994 to 10.5% in the present study. The HCV genotype distribution was similar to that observed in other hemodialysis populations in Brazil, in local candidate blood donors and in other groups at risk of transfusion-transmitted infection.