984 results for Blood Component Transfusion
Abstract:
Thirty-seven patients underwent kidney transplantation after transfusion at 2-week intervals with 4-week-stored blood from their potential donors. All patients and donors were typed for HLA-A, -B, and DR antigens. The patients were also tested for cytotoxic antibodies against donor antigens before each transfusion. The percentage of panel-reactive antibodies (PRA) was determined against a selected panel of 30 cell donors before and after the transfusions. The patients were immunosuppressed with azathioprine and prednisone, and rejection crises were treated with methylprednisolone. The control group consisted of 23 patients who received grafts from an unrelated donor but did not receive donor-specific pretransplant blood transfusions. The incidence and reversibility of rejection episodes, allograft loss caused by rejection, and patient and graft survival rates were determined for both groups. Non-parametric methods (chi-square and Fisher tests) were used for statistical analysis, with the level of significance set at P<0.05. The incidence and reversibility of rejection crises during the first 60 post-transplant days did not differ significantly between groups. The actuarial graft and patient survival rates at five years were 56% and 77%, respectively, for the treated group and 39.8% and 57.5% for the control group. Graft loss due to rejection was significantly higher in the untreated group (P = 0.0026), which also required more intense immunosuppression (P = 0.0001). We conclude that transfusions using stored blood have the immunosuppressive effect of fresh blood transfusions without the risk of provoking widespread antibody formation. In addition, this method permits a reduction of immunosuppressive drugs during the process without impairing adequate functioning of the renal graft.
Abstract:
Red blood cells (RBC) are viable if kept in an adequate preservative solution, although gradual changes in morphology and metabolism may occur. There is a gradual decrease in adenosine-5'-triphosphate (ATP) concentration, pH, glucose consumption, and enzyme activity during preservation. The normal discocyte shapes are initially replaced by echinocytes and stomatocytes and, at final stages, by spherocytes, the last step before splenic sequestration. Post-transfusional survival has been correlated with the ATP concentration. RBC preserved in ADSOL, a solution containing adenine, dextrose, sodium chloride, and mannitol, are viable for transfusion for up to 6 weeks. Erythrocytes from 10 blood units taken from healthy adult donors were preserved for 12 weeks in ADSOL at 4°C. We now report a significant correlation (r2 = 0.98) between the percentage of discocytes (89 to 7%) and ATP concentration (100 to 10%) in ADSOL-preserved RBC. The results suggest that the percentage of discocytes may serve as a proxy for ATP concentration and hence a useful quality-control indicator of RBC viability in centers with limited assay facilities.
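The reported coefficient of determination is ordinary Pearson correlation squared, which can be illustrated with a short sketch. Only the endpoints (89 to 7% discocytes, 100 to 10% ATP) come from the abstract; the intermediate weekly values below are hypothetical stand-ins.

```python
import numpy as np

# Hypothetical weekly measurements over 12 weeks of storage; only the
# endpoints (89 -> 7% discocytes, 100 -> 10% ATP) come from the abstract.
discocytes = np.array([89, 80, 71, 62, 53, 44, 36, 28, 21, 15, 10, 7])
atp        = np.array([100, 91, 82, 72, 63, 54, 45, 36, 28, 20, 14, 10])

# Pearson correlation coefficient; r**2 is the coefficient of
# determination between cell shape and metabolic state.
r = np.corrcoef(discocytes, atp)[0, 1]
print(round(r ** 2, 2))
```

With near-linear paired series like these, r² approaches 1, matching the strong shape-metabolism association the abstract reports.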
Abstract:
Simultaneous EEG-functional magnetic resonance imaging (fMRI) combines the high temporal resolution of EEG with the distinctive spatial resolution of fMRI. The purpose of this EEG-fMRI study was to search for hemodynamic responses (blood oxygen level-dependent, BOLD, responses) associated with interictal activity in a case of right mesial temporal lobe epilepsy before and after a successful selective amygdalohippocampectomy. The study thus localized the epileptogenic source with this noninvasive imaging technique and compared the results after removal of the atrophied hippocampus. Additionally, the present study investigated the effectiveness of two different ways of localizing epileptiform spike sources, i.e., BOLD contrast and an independent component analysis dipole model, by comparing their respective outcomes to the resected epileptogenic region. Our findings suggested that the right hippocampus induced the large interictal activity in the left hemisphere. Although almost a quarter of the dipoles were found near the right hippocampal region, dipole modeling yielded a widespread distribution, making EEG analysis alone too weak to determine the source localization precisely, even with a sophisticated method of analysis such as independent component analysis. On the other hand, the combined EEG-fMRI technique made it possible to highlight the epileptogenic foci quite efficiently.
Abstract:
Background. The ABO and Rh(D) phenotypes of blood donors and transfused patients are routinely analyzed to ensure full compatibility. These analyses are performed by agglutination following an antibody-antigen reaction. However, because of prohibitive cost and analysis time, blood donations are not routinely tested for the minor blood group antigens. This gap can result in alloimmunization of recipient patients against one or more minor antigens, leading to severe complications in future transfusions. Study Design and Methods. To address this problem, we produced a genetic panel based on Beckman Coulter's GenomeLab SNPstream technology to analyze 22 minor blood group antigens simultaneously. The DNA source was patients' white blood cells previously isolated on FTA paper. Results. The results show that the genotype discordance rate, measured by the correlation of genotyping results from the two DNA strands, and the genotyping failure rate are both very low (0.1%). Likewise, the correlation between the phenotypes predicted by genotyping and the actual phenotypes obtained by serology of red blood cells and platelets ranged between 97% and 100%. Experimental or database-handling errors, as well as rare polymorphisms affecting antigen conformation, could explain the discrepancies. However, given that phenotypes obtained by genotyping will always be cross-checked before any blood transfusion using the standard technologies approved by regulatory authorities, the correlation rates obtained are well above the success criteria expected for the project. Conclusion. Genetic profiling of the minor blood group antigens will make it possible to create a centralized computer database of donor phenotypes, allowing blood banks to quickly find compatible profiles between donors and recipients.
Abstract:
To assess the impact of a sudden change in erythrocyte aggregation on selected cardiovascular parameters, a one-third blood volume exchange transfusion with hyperaggregating blood was performed in Brown Norway rats. Tail pressure, cardiac stroke volume, ejection fraction, cardiac output, heart rate, and peripheral resistance to blood flow were monitored non-invasively for 19 days following the transfusion. The rats were sacrificed more than one month after the transfusion, and an ex vivo study of the response to two dilating agents (acetylcholine and sodium nitroprusside) was conducted on mesenteric arterioles. Changes in cardiovascular parameters, namely cardiac output, stroke volume, and peripheral resistance, were observed in the first three days post-transfusion. Resistance of the vascular smooth muscle to nitric oxide was noted in rats transfused with hyperaggregating blood, whereas no endothelial dysfunction was apparent in response to acetylcholine.
Abstract:
Introduction. Fractal geometry measures the irregularity of abstract and natural objects through the fractal dimension. Fractal calculations have been applied to structures of the human body and to physiological quantifications based on dynamical systems theory. Material and Methods. Using the box-counting method and software developed for this purpose, we calculated the fractal dimensions, the number of occupied spaces at the box-counting space boundary, and the area of two groups of red blood cells: 7 normal cells (group A) and 7 abnormal cells (group B), obtained from patients and from transfusion bags. The measurements were compared in search of differences between normal and abnormal red blood cells, with the aim of differentiating samples. Results. Abnormality was characterized by a number of occupied squares of the fractal space greater than or equal to 180; area values between 25.117 and 33.548 corresponded to normality. When the evaluation by number of squares indicates normality, it must be confirmed with the area value applied to adjacent red blood cells within the sample; values outside the established range and/or occupied spaces greater than or equal to 180 suggest abnormality of the sample. Conclusions. The methodology developed is effective for differentiating red blood cell alterations and is probably useful for the analysis of transfusion bags for clinical use.
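The box-counting procedure named above can be sketched as follows. This is a minimal illustration, not the study's software: the binary disk image stands in for a segmented red blood cell, and the box sizes are illustrative choices.

```python
import numpy as np

def box_count(mask, s):
    """Number of s-by-s boxes that contain at least one object pixel."""
    h, w = mask.shape
    return sum(
        mask[i:i + s, j:j + s].any()
        for i in range(0, h, s)
        for j in range(0, w, s)
    )

def fractal_dimension(mask, box_sizes=(1, 2, 4, 8)):
    """Box-counting dimension: slope of log N(s) versus log(1/s)."""
    counts = [box_count(mask, s) for s in box_sizes]
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes)),
                          np.log(counts), 1)
    return slope

# A filled disk stands in for a segmented red blood cell image.
y, x = np.ogrid[:128, :128]
cell = (x - 64) ** 2 + (y - 64) ** 2 <= 40 ** 2
d = fractal_dimension(cell)
```

A filled, smooth region approaches dimension 2 at fine scales; irregular boundaries shift both the box counts and the estimated dimension, which is the kind of deviation the abstract uses to flag abnormal cells.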
Abstract:
Although shorebirds spending the winter in temperate areas frequently use estuarine and supratidal (upland) feeding habitats, the relative contribution of each habitat to individual diets has not been directly quantified. We quantified the proportional use that Calidris alpina pacifica (Dunlin) made of estuarine vs. terrestrial farmland resources on the Fraser River Delta, British Columbia, using stable isotope analysis (δ13C, δ15N) of blood from 268 Dunlin over four winters, 1997 through 2000. We tested for individual, age, sex, morphological, seasonal, and weather-related differences in dietary sources. Based on single- (δ13C) and dual-isotope mixing models, the agricultural habitat contributed approximately 38% of Dunlin diet averaged over four winters, with the balance from intertidal flats. However, there was a wide variation among individuals in the extent of agricultural feeding, ranging from about 1% to 95% of diet. Younger birds had a significantly higher terrestrial contribution to diet (43%) than did adults (35%). We estimated that 6% of adults and 13% of juveniles were obtaining at least 75% of their diet from terrestrial sources. The isotope data provided no evidence for sex or overall body size effects on the proportion of diet that is terrestrial in origin. The use of agricultural habitat by Dunlin peaked in early January. Adult Dunlin obtained a greater proportion of their diet terrestrially during periods of lower temperatures and high precipitation, whereas no such relationship existed for juveniles. Seasonal variation in the use of agricultural habitat suggests that it is used more during energetically stressful periods. The terrestrial farmland zone appears to be consistently important as a habitat for juveniles, but for adults it may provide an alternative feeding site used as a buffer against starvation during periods of extreme weather. 
Loss or reduction of agricultural habitat adjacent to estuaries may negatively impact shorebird fitness, with juveniles disproportionately affected.
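A single-isotope, two-source linear mixing model of the kind underlying the δ13C estimates above can be sketched as follows. The end-member and blood values are hypothetical illustrations, not the study's data.

```python
def two_source_fraction(delta_mix, delta_a, delta_b):
    """Fraction of diet from source A under a linear two-source mixing model."""
    return (delta_mix - delta_b) / (delta_a - delta_b)

# Hypothetical end members (per mil): terrestrial farmland prey vs.
# intertidal prey; the blood value is an illustrative measurement.
d13c_terrestrial = -26.0
d13c_intertidal = -14.0
d13c_blood = -21.4

f_terrestrial = two_source_fraction(d13c_blood, d13c_terrestrial, d13c_intertidal)
print(round(f_terrestrial, 2))  # 0.62 -> ~62% of diet from terrestrial sources
```

A dual-isotope model extends this idea by solving for the source fractions that best fit both δ13C and δ15N simultaneously.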
Abstract:
Objective: The objective of this study was to explore the relationship between low density lipoprotein (LDL) and dendritic cell (DC) activation, based upon the hypothesis that reactive oxygen species (ROS)-mediated modification of proteins that may be present in local DC microenvironments could be important as mediators of this activation. Although LDL are known to be oxidised in vivo and taken up by macrophages during atherogenesis, their effect on DC has not been explored previously. Methods: Human DCs were prepared from peripheral blood monocytes using GM-CSF and IL-4. Plasma LDLs were isolated by sequential gradient centrifugation, oxidised in CuSO4, and oxidation arrested to yield mildly, moderately and highly oxidised LDL forms. DCs exposed to these LDLs were investigated using combined phenotypic, functional (autologous T cell activation), morphological and viability assays. Results: Highly-oxidised LDL increased DC HLA-DR, CD40 and CD86 expression, corroborated by increased DC-induced T cell proliferation. Both native and oxidised LDL induced prominent DC clustering. However, high concentrations of highly-oxidised LDL inhibited DC function, due to increased DC apoptosis. Conclusions: This study supports the hypothesis that oxidised LDL are capable of triggering the transition from sentinel to messenger DC. Furthermore, the DC clustering–activation–apoptosis sequence in the presence of different LDL forms is consistent with a regulatory DC role in the immunopathogenesis of atheroma. A sequence of initial accumulation of DC, increasing LDL oxidation, and DC-induced T cell activation may explain why local breach of tolerance can occur. Above a threshold level, however, supervening DC apoptosis limits this, contributing instead to the central plaque core.
Abstract:
Dietary nitrate, from beetroot, has been reported to lower blood pressure (BP) by the sequential reduction of nitrate to nitrite and further to NO in the circulation. However, the impact of beetroot on microvascular vasodilation and arterial stiffness is unknown. In addition, beetroot is consumed by only 4.5% of the UK population, whereas bread is a staple component of the diet. Thus, we investigated the acute effects of beetroot bread (BB) on microvascular vasodilation, arterial stiffness, and BP in healthy participants. Twenty-three healthy men received 200 g bread containing 100 g beetroot (1.1 mmol nitrate) or 200 g control white bread (CB; 0 g beetroot, 0.01 mmol nitrate) in an acute, randomized, open-label, controlled crossover trial. The primary outcome was postprandial microvascular vasodilation measured by laser Doppler iontophoresis and the secondary outcomes were arterial stiffness measured by Pulse Wave Analysis and Velocity and ambulatory BP measured at regular intervals for a total period of 6 h. Plasma nitrate and nitrite were measured at regular intervals for a total period of 7 h. The incremental area under the curve (0-6 h after ingestion of bread) for endothelium-independent vasodilation was greater (P = 0.017) and lower for diastolic BP (DBP; P = 0.032) but not systolic (P = 0.99) BP after BB compared with CB. These effects occurred in conjunction with increases in plasma and urinary nitrate (P < 0.0001) and nitrite (P < 0.001). BB acutely increased endothelium-independent vasodilation and decreased DBP. Therefore, enriching bread with beetroot may be a suitable vehicle to increase intakes of cardioprotective beetroot in the diet and may provide new therapeutic perspectives in the management of hypertension.
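The incremental area under the curve used as the outcome summary above is conventionally computed by the trapezoidal rule against the baseline (t = 0) value. A minimal sketch, with an illustrative time grid and response values rather than trial data:

```python
def incremental_auc(times, values):
    """Trapezoidal area of (value - baseline) over time, baseline = value at t=0.

    One common iAUC definition; some variants count only positive increments.
    """
    baseline = values[0]
    auc = 0.0
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        auc += dt * ((values[i - 1] - baseline) + (values[i] - baseline)) / 2.0
    return auc

# Illustrative 0-6 h postprandial response curve (arbitrary units).
hours = [0, 1, 2, 3, 4, 5, 6]
response = [10.0, 14.0, 18.0, 16.0, 13.0, 12.0, 11.0]
print(incremental_auc(hours, response))  # 23.5
```

Comparing the iAUC between the two bread conditions, as the trial does, then reduces to comparing these per-participant areas.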
Abstract:
Many hemoglobin-derived peptides are present in mouse brain, and several of these have bioactive properties including the hemopressins, a related series of peptides that bind to cannabinoid CB1 receptors. Although hemoglobin is a major component of red blood cells, it is also present in neurons and glia. To examine whether the hemoglobin-derived peptides in brain are similar to those present in blood and heart, we used a peptidomics approach involving mass spectrometry. Many hemoglobin-derived peptides are found only in brain and not in blood, whereas all hemoglobin-derived peptides found in heart were also seen in blood. Thus, it is likely that the majority of the hemoglobin-derived peptides detected in brain are produced from brain hemoglobin and not erythrocytes. We also examined if the hemopressins and other major hemoglobin-derived peptides were regulated in the Cpefat/fat mouse; previously these mice were reported to have elevated levels of several hemoglobin-derived peptides. Many, but not all of the hemoglobin-derived peptides were elevated in several brain regions of the Cpefat/fat mouse. Taken together, these findings suggest that the post-translational processing of alpha and beta hemoglobin into the hemopressins, as well as other peptides, is up-regulated in some but not all Cpefat/fat mouse brain regions.
Abstract:
BACKGROUND: This study evaluated demographic profiles and prevalence of serologic markers among donors who used confidential unit exclusion (CUE) to assess the effectiveness of CUE and to guide public policies regarding the use of CUE for enhancing safety versus jeopardizing the blood supply by dropping CUE. STUDY DESIGN AND METHODS: We conducted a cross-sectional analysis of whole blood donations at a large public blood center in Sao Paulo from July 2007 through June 2009, comparing demographic data and confirmed serologic results between donors who used CUE and donors who never used it (CUE never). RESULTS: There were 265,550 whole blood units collected from 181,418 donors from July 2007 through June 2009. A total of 9658 (3.6%) units were discarded, 2973 (1.1%) because CUE was used at the current donation (CUE now) and 6685 (2.5%) because CUE was used in the past (CUE past). The CUE rate was highest among donors with less than 8 years of education (odds ratio [OR], 2.78; 95% confidence interval [CI], 2.51-3.08). CUE now donations were associated with higher positive infectious disease marker rates than CUE never donations (OR, 1.41; CI, 1.13-1.77), whereas CUE past donations were not (OR, 1.04; CI, 0.75-1.45). CONCLUSION: The CUE process results in a high rate of unit discard. CUE use on an individual donation appears predictive of a high-risk marker-positive donation and, thus, appears to contribute modestly to blood safety. The policy of discarding units from donors who have previous CUE-positive donations does not improve safety and should be discontinued.
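The odds ratios and Wald confidence intervals reported above come from standard 2x2 contingency-table arithmetic. A minimal sketch with made-up counts (these are not the study's raw data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% Wald CI for a 2x2 table:
    a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without.
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Made-up counts: marker-positive vs. marker-negative donations among
# CUE-now and CUE-never donors (illustrative only).
or_, lo, hi = odds_ratio_ci(60, 2913, 2500, 170000)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

An interval that excludes 1, as in the CUE-now comparison, indicates a statistically significant association at the chosen confidence level.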
Abstract:
Dogs have five well-established blood groups, comprising seven erythrocyte antigenic determinants, known as dog erythrocyte antigens (DEA). The DEA 1 group (subgroups 1.1, 1.2, and 1.3) has been considered the most important with respect to blood transfusions. This is because the group has a high potential for antigenic stimulation and can therefore induce antibody production if a DEA 1-negative recipient receives a DEA 1-positive transfusion, leading to a hemolytic transfusion reaction upon a second transfusion with DEA 1 red cells. The frequency of the DEA 1 group is well known in other countries, but until now no information on this group was available for Brazil. The present study aimed to evaluate the prevalence of the DEA 1 blood group (subgroups 1.1 and 1.2) in dogs raised in Brazil. For this purpose, 150 dogs of different breeds, sexes, and ages, screened at the Veterinary Hospital of FCAV/UNESP, Jaboticabal Campus, underwent blood typing for the canine DEA 1 group (subgroups 1.1 and 1.2), using reagents purchased commercially from the Immunohematology and Serology Laboratory of the University of Michigan (USA). The results of this assay revealed an overall prevalence of 91.3% for the DEA 1 group, given the conditions and characteristics of the study population, comprising 51.3% DEA 1.1 dogs, 40% DEA 1.2 dogs, and the remaining 8.7% negative for this group. From these prevalences, the probability that a DEA 1-negative dog will receive DEA 1.1 blood in a first random transfusion was calculated to be approximately 4.5%. This figure thus reflects a potential risk of sensitization of a DEA 1-negative recipient, which would trigger antibody production. Subsequently, if this same patient received a second random blood transfusion, the probability of receiving DEA 1.1 red cells would be approximately 2.3%, representing a potential risk of an acute hemolytic transfusion reaction. On the other hand, the probability of this dog receiving DEA 1.2 blood would be about 1.8%, which would lead to a less severe but still potentially harmful transfusion reaction. In the present study, the potential risk of a transfusion reaction was found to be minimal in the case of mixed-breed dogs.
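The transfusion-risk percentages in this abstract follow directly from multiplying the reported prevalences, since each random transfusion is treated as an independent draw from the donor population. A sketch of the arithmetic:

```python
# Prevalences reported in the abstract
p_dea1_negative = 0.087   # recipient lacks DEA 1
p_dea11 = 0.513           # random donor is DEA 1.1
p_dea12 = 0.40            # random donor is DEA 1.2

# First random transfusion: a DEA 1-negative dog receives DEA 1.1 blood
p_sensitization = p_dea1_negative * p_dea11
print(round(p_sensitization * 100, 1))  # 4.5 (%)

# Second random transfusion also DEA 1.1: acute hemolytic reaction risk
p_hemolysis = p_sensitization * p_dea11
print(round(p_hemolysis * 100, 1))  # 2.3 (%)

# Second transfusion DEA 1.2 instead: milder reaction risk
p_milder = p_sensitization * p_dea12
print(round(p_milder * 100, 1))  # 1.8 (%)
```

The computed values reproduce the ~4.5%, ~2.3%, and ~1.8% figures quoted in the abstract.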
Abstract:
Background: Although galactose is an important component of human lactose, there are few reports on its role in newborn metabolism. Objective: To determine the relationship of blood galactose and glucose levels in mothers, cord blood, and breast-fed full-term newborn infants. Methods: Maternal and cord vein blood samples were obtained from 27 pregnant women at delivery, and from their breastfed, full-term newborns 48 h later. Galactose and glucose were determined by HPLC. Statistical analysis used ANOVA and Pearson correlation with p < 0.05. Results: Maternal galactose concentrations (0.08 +/- 0.03 mmol/l) were similar to cord blood galactose (0.07 +/- 0.03 mmol/l; p = 0.129). However, newborn blood galactose (0.05 +/- 0.02 mmol/l) was significantly lower than both cord (p = 0.042) and maternal blood (p = 0.002). Maternal blood glucose levels (4.72 +/- 0.86 mmol/l) were higher than cord blood (3.98 +/- 0.57 mmol/l; p < 0.001), and cord blood concentrations were higher than newborn blood levels (3.00 +/- 0.56 mmol/l; p < 0.001); all values are expressed as mean +/- SD. A significant correlation was seen only between maternal and cord blood galactose levels (r = 0.67; p < 0.001) and glucose levels (r = 0.38; p = 0.047). Conclusion: The association and similarity between maternal and cord blood galactose levels suggest that the fetus depends on maternal galactose. In contrast, the lower galactose levels in newborn infants, and the lack of association between the two, suggest self-regulation and a dependence on galactose ingestion.
Abstract:
Objective. To test the hypothesis that red blood cell (RBC) transfusions in preterm infants are associated with increased intra-hospital mortality. Study design. Variables associated with death were studied with Cox regression analysis in a prospective cohort of preterm infants with birth weight <1500 g in the Brazilian Network on Neonatal Research. Intra-hospital death and death after 28 days of life were analyzed as dependent variables. Independent variables were infant demographic and clinical characteristics and RBC transfusions. Results. Of 1077 infants, 574 (53.3%) received at least one RBC transfusion during the hospital stay. The mean number of transfusions per infant was 3.3 +/- 3.4, with 2.1 +/- 2.1 in the first 28 days of life. Intra-hospital death occurred in 299 neonates (27.8%), and 60 infants (5.6%) died after 28 days of life. After adjusting for confounders, the relative risk of death during the hospital stay was 1.49 in infants who received at least one RBC transfusion in the first 28 days of life, compared with infants who did not receive a transfusion. The risk of death after 28 days of life was 1.89 times higher in infants who received more than two RBC transfusions during their hospital stay, compared with infants who received one or two transfusions. Conclusion. Transfusion was associated with increased death, and transfusion guidelines should consider the risks and benefits of transfusion. (J Pediatr 2011; 159: 371-6).