940 results for Blood coagulation factors
Abstract:
The objective is to reinforce the importance of blood reinfusion as a cheap, safe and simple method that can be used in small hospitals, especially those without a blood bank. Moreover, even when devices that collect and filter blood are used, recent studies show that the cost-benefit ratio is much better for autologous transfusion than for allogeneic blood transfusion, even when there is injury to hollow viscera and contamination of the blood. Allogeneic blood transfusion is known to carry a number of risks to patients, among them coagulation disorders mediated by excess enzymes in stored blood and deficiency of clotting factors, mainly Factor V (proaccelerin). Another concern is the risk of contamination with pathogens that are still unknown or are not investigated during donor screening, such as West Nile virus and the agent of variant Creutzfeldt-Jakob disease, popularly known as "mad cow" disease. Comparing both methods, we conclude that blood autotransfusion has numerous advantages over heterologous transfusion, even in large hospitals. We are not against blood transfusion; we simply do not agree that the patient's own blood should be discarded without making sure there will be enough blood in stock to bring the patient out of hemorrhagic shock.
Abstract:
Toxoplasmosis is one of the most common parasitic zoonoses throughout the world. Infection in humans and animals varies across geographical areas, influenced by many environmental conditions. Seroprevalence of Toxoplasma gondii infection in cattle in Brazil ranges from 1.03 to 71%. A cross-sectional survey was carried out on 58 of 453 farms in the South Fluminense Paraiba Valley, State of Rio de Janeiro, Brazil. Cattle over 3 years old (n=589) from dairy herds were selected for blood collection and detection of anti-T. gondii antibodies by the indirect fluorescent antibody test (IFA) with an initial titration of 1:16; titers > 64 were considered positive. Univariate analysis of risk factors showed that cats in contact with cattle, cats in contact with drinking water, and the number of cats were associated with T. gondii seroprevalence. Logistic regression revealed a two-fold increase in the risk of cattle infection (p=0.0138) on farms with a larger number of cats (>3) compared with farms with few cats (1-2). In contrast, the presence of chickens was considered a protective factor (p=0.025).
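The odds ratio reported above can be reproduced from a simple 2x2 exposure-by-outcome table. The abstract does not give the underlying counts, so the sketch below uses hypothetical numbers chosen only to yield a roughly two-fold odds ratio; the Wald confidence interval shown is one common choice and not necessarily the method used in the study.

```python
# Illustrative sketch only: hypothetical counts, not the study's data.
import math

# rows: exposure (>3 cats, 1-2 cats); columns: (seropositive, seronegative)
a, b = 60, 140   # >3 cats: positive, negative (hypothetical)
c, d = 45, 210   # 1-2 cats: positive, negative (hypothetical)

odds_ratio = (a * d) / (b * c)

# Wald 95% confidence interval on the log-odds-ratio scale
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```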
Abstract:
The objective of this study was to investigate the prevalence of anti-Neospora caninum antibodies in cattle from milk-producing farms of the microregion of Batalha, state of Alagoas, Brazil, and to identify the risk factors associated with the infection. Blood samples were collected from 1,004 cattle on 17 farms for serological investigation of anti-N. caninum antibodies by the indirect fluorescent antibody test (IFAT). Of the total samples analyzed, 77/1,004 (7.67%) were positive and 927/1,004 (92.33%) were negative. Logistic regression identified that cattle from farms without consortium breeding have an infection risk 6.33 times higher (p<0.001; CI 2.89-13.10) than cattle from farms with that type of breeding. Cattle from farms where aborted fetuses are not adequately buried have an infection risk 3.04 times higher (p<0.001; CI 1.64-5.63) than cattle from farms with adequate disposal of these fetuses. Infection by N. caninum occurs in cattle of the investigated region. The factors identified in our study can be used as risk indicators, so that control measures can be implemented to prevent infection by N. caninum in the herds of this region.
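As a minimal illustration of the prevalence figure reported above, the sketch below recomputes the apparent prevalence and a normal-approximation (Wald) 95% confidence interval from the stated counts (77 positive of 1,004 sampled); the original study may have used a different interval method.

```python
# Sketch: apparent prevalence and Wald 95% CI from the counts given in the abstract.
import math

positive, n = 77, 1004
prevalence = positive / n                        # 0.0767 -> 7.67%
se = math.sqrt(prevalence * (1 - prevalence) / n)
ci_low, ci_high = prevalence - 1.96 * se, prevalence + 1.96 * se

print(f"Prevalence = {prevalence:.2%} (95% CI {ci_low:.2%}-{ci_high:.2%})")
```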
Abstract:
Few data are available on the prevalence and risk factors of Chlamydophila abortus infection in goats in Brazil. A cross-sectional study was carried out to determine the flock-level prevalence of C. abortus infection in goats from the semiarid region of Paraíba State, Northeast region of Brazil, and to identify risk factors associated with the infection. Flocks were randomly selected and a pre-established number of female goats > 12 months old were sampled in each of these flocks. A total of 975 serum samples from 110 flocks were collected, and a structured questionnaire focusing on risk factors for C. abortus infection was given to each farmer at the time of blood collection. For the serological diagnosis, the complement fixation test (CFT) using the C. abortus S26/3 strain as antigen was performed. Flock-level factors for C. abortus prevalence were tested using a multivariate logistic regression model. Fifty-five flocks out of 110 presented at least one seropositive animal, an overall prevalence of 50.0% (95% CI: 40.3-59.7%). Ninety-one of the 975 dairy goats examined were seropositive with titers > 32, a frequency of 9.3%. Lending bucks for breeding (odds ratio = 2.35; 95% CI: 1.04-5.33) and a history of abortions (odds ratio = 3.06; 95% CI: 1.37-6.80) were associated with increased flock prevalence.
Abstract:
Platelet-rich plasma (PRP) is an easy-to-obtain, inexpensive product that stands out for its growth factors in tissue repair. To obtain PRP, whole blood is centrifuged at specific times and gravitational forces. The present work aimed to study a double-centrifugation method for obtaining PRP in order to evaluate the effective increase in platelet concentration in the final product, the preparation of PRP gel, and the optimization of the preparation time of the final sample. Fifteen female New Zealand White rabbits underwent blood sampling for the preparation of PRP. Samples were separated into two sterile tubes containing sodium citrate. Tubes were submitted to the double-centrifugation protocol, with the lid closed, at 1600 revolutions per minute (rpm) for 10 minutes, resulting in the separation of red blood cells from plasma with platelets and leukocytes. The tubes were then opened, and the plasma was pipetted and transferred into another sterile tube. Plasma was centrifuged again at 2000 rpm for 10 minutes; as a result, it was split into two parts: at the top, platelet-poor plasma (PPP), and at the bottom, the platelet button. Part of the PPP was discarded so that only 1 ml remained in the tube along with the platelet button. This material was gently agitated to resuspend the platelets and activated by adding 0.3 ml of calcium gluconate, resulting in PRP gel. The double-centrifugation protocol produced a platelet concentration 3 times higher than that of the initial blood sample. The 0.3 ml of calcium gluconate used for platelet activation was sufficient to coagulate the sample. Coagulation time ranged from 8 to 20 minutes, with an average of 17.6 minutes. The whole process, from blood centrifugation to obtaining PRP gel, took only 40 minutes. It was concluded that PRP was successfully obtained by the double-centrifugation protocol, which increases the platelet concentration of the sample compared with whole blood, allowing its use in surgical procedures. Furthermore, the preparation time of only 40 minutes is appropriate, and calcium gluconate is able to promote platelet activation.
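The protocol above is specified in rpm, but the corresponding centrifugal force depends on the rotor radius, which the abstract does not report. The sketch below applies the standard conversion RCF (x g) = 1.118 x 10^-5 x r(cm) x rpm^2 with an assumed radius of 8 cm, purely for illustration.

```python
# Sketch: converting the quoted speeds (1600 and 2000 rpm) to relative centrifugal
# force. The rotor radius is NOT given in the abstract; 8 cm is an assumed value.
def rcf_from_rpm(rpm: float, radius_cm: float) -> float:
    """Relative centrifugal force (x g) for a given speed and rotor radius."""
    return 1.118e-5 * radius_cm * rpm ** 2

assumed_radius_cm = 8.0  # hypothetical rotor radius
for rpm in (1600, 2000):
    print(f"{rpm} rpm at r = {assumed_radius_cm} cm ~ {rcf_from_rpm(rpm, assumed_radius_cm):.0f} x g")
```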
Abstract:
Background: Approximately 11,000 revascularization procedures, either percutaneous coronary interventions (PCI) or coronary artery bypass grafting surgery (CABG), are performed yearly in Finland for coronary artery disease. Periprocedural risk factors for mortality and morbidity, as well as long-term outcome, have been extensively studied in general populations undergoing revascularization. The choice between PCI and CABG in many high-risk groups, and risk stratification, however, need clarification, and there is still room for improvement in periprocedural outcomes.
Materials and methods: Cohorts of patients from Finnish hospitals revascularized between 2001 and 2011 were retrospectively analyzed. Patient records were reviewed for baseline variables and postprocedural outcomes (stroke, myocardial infarction, quality of life measured by the EQ-5D questionnaire, repeat revascularization, bleeding episodes). Data on the date and mode of death were acquired from Statistics Finland. Statistical analysis was performed to identify predictors of adverse events and to compare the procedures.
Results: Postoperative administration of blood products (red blood cells, fresh frozen plasma, platelets) after isolated CABG independently and dose-dependently increased the risk of stroke. Patients 80 years or older who underwent CABG had better survival at 5 years than those who underwent PCI; after adjusting for baseline differences, survival was similar. Patients on oral anticoagulation (OAC) for atrial fibrillation (AF) treated with CABG had better survival and overall outcome at 3 years than PCI patients, with no difference in the incidence of stroke or bleeding episodes; differences in outcome remained significant after adjusting for propensity score. Lower health-related quality of life (HRQOL) scores, as measured by the visual analogue scale (VAS) of the EQ-5D questionnaire, at 6 months after CABG predicted later major adverse cardiac and cerebrovascular events (MACCE). Deteriorating function and VAS scores between 0 and 6 months on the EQ-5D also independently predicted later MACCE.
Conclusions: Administration of blood products can increase the risk of stroke after CABG, and liberal use of transfusions should be avoided. In the frail subpopulations of patients on OAC and octogenarians, CABG appears to offer superior long-term outcome compared to PCI. Deteriorating HRQOL scores predict later adverse events after CABG.
Keywords: percutaneous coronary intervention, coronary artery bypass grafting, age over 80, transfusion, anticoagulants, coronary artery disease, health-related quality of life, outcome.
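The propensity-score adjustment mentioned in the results can be illustrated with a minimal sketch: a logistic model estimates each patient's probability of receiving CABG rather than PCI from baseline covariates, and that probability is then included as a covariate in the outcome model. The data, covariates and model choices below are simulated placeholders, not the study's actual variables or code.

```python
# Minimal sketch of propensity-score adjustment for a CABG-vs-PCI comparison.
# All data here are simulated for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
age = rng.normal(70, 8, n)
diabetes = rng.binomial(1, 0.3, n)
X = np.column_stack([age, diabetes])

# Treatment assignment (1 = CABG, 0 = PCI) depends on baseline covariates.
treat = rng.binomial(1, 1 / (1 + np.exp(-(0.03 * (age - 70) + 0.5 * diabetes))), n)

# Step 1: estimate the propensity score = P(CABG | covariates).
ps = LogisticRegression().fit(X, treat).predict_proba(X)[:, 1]

# Simulated binary outcome (e.g. death within follow-up), for illustration.
outcome = rng.binomial(1, 1 / (1 + np.exp(-(-2 + 0.04 * (age - 70) - 0.3 * treat))), n)

# Step 2: model the outcome with treatment plus the propensity score as a covariate.
adj = LogisticRegression().fit(np.column_stack([treat, ps]), outcome)
print("adjusted log-odds for CABG vs PCI:", adj.coef_[0][0])
```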
Abstract:
Osteoporosis is a major health problem, but little is known about its risk factors in premenopausal women. Sixty 40- to 50-year-old patients with regular menses were studied cross-sectionally. None of the patients were on drugs known to interfere with bone mass. Patients answered a dietary inquiry and had their bone mineral density (BMD) measured. Z scores were used for the comparisons. A blood sample was taken for the determination of FSH, SHBG, estradiol, testosterone, calcium and alkaline phosphatase. Calcium and creatinine were measured in 24-h urine. A Z score less than -1 was observed for the lumbar spine of 14 patients (23.3%) and for the femur of 24 patients (40%). Patients with a Z score less than -1 for the lumbar spine were older than patients with a Z score ≥ -1 (45.7 vs 43.8 years) and presented higher values of alkaline phosphatase (71.1 ± 18.2 vs 57.1 ± 14.3 IU/l). Multiple regression analysis showed that lower lumbar spine BMD was associated with higher alkaline phosphatase, lower calcium intake, smaller body mass index (BMI), less frequent exercise, and older age. Patients with a Z score less than -1 for the femur were shorter than patients with a Z score ≥ -1 (158.2 vs 161.3 cm). Multiple regression analysis showed that lower femoral BMD was associated with lower BMI, higher alkaline phosphatase and caffeine intake, and less frequent exercise. A lower than expected BMD was observed in a significant proportion of premenopausal women and was associated with lower calcium intake, relatively lower physical activity and lower BMI. We conclude that the classical risk factors for osteoporosis may be present before ovarian failure, and their effect may be partly independent of estrogen levels.
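The Z score used above expresses a patient's BMD in standard deviations from the mean of an age- and sex-matched reference population. The reference values in the sketch below are hypothetical placeholders, included only to show how the Z < -1 cut-off is applied.

```python
# Sketch of the BMD Z-score calculation; reference mean/SD are hypothetical.
def bmd_z_score(bmd: float, ref_mean: float, ref_sd: float) -> float:
    """BMD expressed in SD units relative to an age- and sex-matched reference."""
    return (bmd - ref_mean) / ref_sd

# Hypothetical example: lumbar-spine BMD of 0.95 g/cm2 against a matched reference
# of 1.05 +/- 0.10 g/cm2 gives Z = -1.0, the cut-off used in the study (Z < -1).
print(bmd_z_score(0.95, 1.05, 0.10))   # -> -1.0
```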
Abstract:
The cardiac hypertrophy that accompanies hypertension appears to be a phenomenon of multifactorial origin whose development depends not only on the increased pressure load but also on local growth factors and cardioadrenergic activity. The aim of the present study was to determine whether sympathetic renal denervation and its effects on arterial pressure can prevent cardiac hypertrophy, and whether it can also delay the onset and attenuate the severity of deoxycorticosterone acetate (DOCA)-salt hypertension. DOCA-salt treatment was initiated in rats seven days after uninephrectomy and contralateral renal denervation or sham renal denervation. DOCA (15 mg/kg, sc) or vehicle (soybean oil, 0.25 ml per animal) was administered twice a week for two weeks. Rats treated with DOCA or vehicle (control) were provided drinking water containing 1% NaCl and 0.03% KCl. At the end of the treatment period, mean arterial pressure (MAP) and heart rate were measured in conscious animals. Under ether anesthesia, the heart was removed and the right and left ventricles (including the septum) were separated and weighed. DOCA-salt treatment produced a significant increase in the left ventricular weight/body weight (LVW/BW) ratio (2.44 ± 0.09 mg/g) and the right ventricular weight/body weight (RVW/BW) ratio (0.53 ± 0.01 mg/g) compared to control rats (1.92 ± 0.04 and 0.48 ± 0.01 mg/g, respectively). MAP was significantly higher (39%) in DOCA-salt rats. Renal denervation prevented (P>0.05) the development of hypertension in DOCA-salt rats but did not prevent the increases in LVW/BW (2.27 ± 0.03 mg/g) and RVW/BW (0.52 ± 0.01 mg/g). We have shown that the increase in arterial pressure is not responsible for cardiac hypertrophy, which may be more related to other events associated with DOCA-salt hypertension, such as an increase in cardiac sympathetic activity.
Abstract:
The available data suggest that the hypotension caused by Hg2+ administration may be produced by a reduction of cardiac contractility or by cholinergic mechanisms. The hemodynamic effects of an intravenous injection of HgCl2 (5 mg/kg) were studied in anesthetized rats (N = 12) by monitoring left and right ventricular (LV and RV) systolic and diastolic pressures for 120 min. After HgCl2 administration, LV systolic pressure decreased only after 40 min (from 99 ± 3.3 to 85 ± 8.8 mmHg at 80 min). RV systolic pressure, however, increased, slowly at first and then faster after 30 min (from 25 ± 1.8 to 42 ± 1.6 mmHg at 80 min). Both right and left diastolic pressures increased after HgCl2 treatment, suggesting the development of diastolic ventricular dysfunction. Since HgCl2 could be increasing pulmonary vascular resistance, isolated lungs (N = 10) were perfused for 80 min with Krebs solution (continuous flow of 10 ml/min) with or without 5 µM HgCl2. A continuous increase in pulmonary vascular resistance was observed (12 ± 0.4 to 29 ± 3.2 mmHg at 30 min), suggesting a direct effect of Hg2+ on the pulmonary vessels. To examine the interaction of Hg2+ with changes in cholinergic activity, we analyzed the effects of acetylcholine (ACh) on mean arterial blood pressure (ABP) in anesthetized rats (N = 9) before and after Hg2+ treatment (5 mg/kg). Using the same dose and route used to study the hemodynamic effects, we also examined the effects of Hg2+ administration on heart and plasma cholinesterase activity (N = 10). The in vivo hypotensive response to ACh (0.035 to 10.5 µg) was reduced after Hg2+ treatment. Cholinesterase activity (µM h-1 mg protein-1) increased in heart and plasma (by 32 and 65%, respectively) after Hg2+ treatment. In conclusion, the reduction in ABP produced by Hg2+ does not depend on a putative increase in cholinergic activity; HgCl2 mainly affects cardiac function. The increased pulmonary vascular resistance and cardiac failure due to diastolic dysfunction of both ventricles are factors that might contribute to the reduction of cardiac output and the fall in arterial pressure.
Abstract:
Hypomagnesemia is the most common electrolyte disturbance seen upon admission to the intensive care unit (ICU). Reliable predictors of its occurrence have not been described. The objective of this prospective study was to determine factors predictive of hypomagnesemia upon admission to the ICU. In a single tertiary cancer center, 226 patients with different diagnoses upon admission were studied. Hypomagnesemia was defined by serum levels <1.5 mg/dl. Demographic data, type of cancer, cause of admission, previous history of arrhythmia, cardiovascular disease, renal failure, drug administration (particularly diuretics, antiarrhythmics, chemotherapy and platinum compounds), previous nutritional intake and the presence of hypovolemia were recorded for each patient. Blood was collected for determination of serum magnesium, potassium, sodium, calcium, phosphorus, blood urea nitrogen and creatinine levels. Upon admission, 103 (45.6%) patients had hypomagnesemia and 123 (54.4%) had normomagnesemia. A normal dietary habit prior to ICU admission was associated with normal Mg levels (P = 0.007) and higher average serum Mg (P = 0.002). Postoperative patients (N = 182) had lower serum Mg (0.60 ± 0.14 mmol/l compared with 0.66 ± 0.17 mmol/l, P = 0.006). A stepwise multiple regression disclosed that only normal dietary habits (OR = 0.45; CI = 0.26-0.79) and being a postoperative patient (OR = 2.42; CI = 1.17-4.98) were significantly correlated with serum Mg levels (overall model probability = 0.001). These findings should be used to identify patients at risk for this disturbance, even in other critically ill populations.
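The cut-off above is given in mg/dl while the group means are reported in mmol/l; the two units are related through the molar mass of magnesium (about 24.305 g/mol). The sketch below shows the conversion; the 1.46 mg/dl value is included only to illustrate that the postoperative mean of 0.60 mmol/l sits right at the hypomagnesemia threshold.

```python
# Sketch relating the two magnesium units used in the abstract:
# mmol/l = mg/dl * 10 / molar mass of Mg (~24.305 g/mol).
MG_MOLAR_MASS = 24.305  # g/mol

def mg_per_dl_to_mmol_per_l(mg_dl: float) -> float:
    return mg_dl * 10.0 / MG_MOLAR_MASS

print(mg_per_dl_to_mmol_per_l(1.5))   # ~0.62 mmol/l, the hypomagnesemia cut-off
print(mg_per_dl_to_mmol_per_l(1.46))  # ~0.60 mmol/l, the postoperative group mean
```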
Abstract:
Two variants (A and B) of the widely employed Walker 256 rat tumor cells are known. When inoculated sc, the A variant produces solid, invasive, highly metastasizing tumors that cause severe systemic effects and death. We have obtained a regressive variant (AR) whose sc growth is slower, resulting in 70-80% regression followed by the development of immunity against the A and AR variants. Simultaneously with the beginning of tumor regression, a temporary anemia developed (lasting ~8 days), accompanied by marked splenomegaly (~300%) and changes in red blood cell osmotic fragility, with mean corpuscular fragility increasing from 4.1 to 6.5 g/l NaCl. The possibility was raised that plasma factors associated with the immune response induced these changes. In the present study, we identify and compare the osmotic fragility-increasing activity of plasma fractions obtained from A and AR tumor bearers at different stages of tumor development. The results showed that by day 4, compounds precipitating in 60% (NH4)2SO4 and able to increase red blood cell osmotic fragility appeared in the plasma of A and AR tumor bearers. Later, these compounds disappeared from the plasma of A tumor bearers but increased slightly in the plasma of AR tumor bearers. Furthermore, by day 10, compounds precipitating between 60 and 80% (NH4)2SO4 and with similar effects appeared only in the plasma of AR tumor bearers. The salt solubility, production kinetics and hemolytic activity of these compounds resemble those of immunoglobulins. This, together with their preferential increase in rats bearing the AR variant, suggests their association with an immune response against this tumor.
Abstract:
A transitory increase in blood pressure (BP) is observed following upper airway surgery for obstructive sleep apnea syndrome, but the mechanisms involved are not yet well understood. The objective of the present study was to evaluate changes in BP and heart rate (HR), and putative contributing factors, after uvulopalatopharyngoplasty and septoplasty in normotensive snorers. Patients (N = 10) were instrumented for 24-h ambulatory BP monitoring, nocturnal respiratory monitoring and urinary catecholamine evaluation one day before surgery and on the day of surgery. The influence of postsurgical pain was prevented by analgesic therapy, as confirmed with a visual analog pain scale. Compared with preoperative values, there was a significant (P < 0.05) increase after surgery in nighttime, but not daytime, systolic BP (119 ± 5 vs 107 ± 3 mmHg), diastolic BP (72 ± 4 vs 67 ± 2 mmHg), HR (67 ± 4 vs 57 ± 2 bpm), respiratory disturbance index (RDI) characterized by apnea-hypopnea (30 ± 10 vs 13 ± 4 events/h of sleep), and norepinephrine levels (22.0 ± 4.7 vs 11.0 ± 1.3 µg l-1 12 h-1). A positive correlation was found between individual variations of BP and individual variations of RDI (r = 0.81, P < 0.01), but not between BP or RDI and catecholamines. The visual analog pain scale showed similar pain levels on the day before and the day after surgery (6.0 ± 0.8 vs 5.0 ± 0.9 cm, respectively). These data strongly suggest that the cardiovascular changes observed in patients who underwent uvulopalatopharyngoplasty and septoplasty were due to the increased postoperative RDI.
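The correlation reported above pairs each patient's post-minus-pre change in BP with the change in RDI and computes Pearson's r. The paired values in the sketch below are hypothetical, included only to show the calculation; they are not the study's data.

```python
# Sketch of the Pearson correlation between individual changes in BP and RDI.
# The ten value pairs are hypothetical placeholders.
import numpy as np

delta_bp  = np.array([ 5, 18, 10, 22,  8, 15,  3, 20, 12,  7])   # mmHg (hypothetical)
delta_rdi = np.array([ 4, 30, 12, 35, 10, 25,  2, 28, 18,  6])   # events/h (hypothetical)

r = np.corrcoef(delta_bp, delta_rdi)[0, 1]
print(f"Pearson r = {r:.2f}")
```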
Abstract:
Blood transfusion in patients with sickle cell disease (SCD) is limited by the development of alloantibodies to erythrocytes. In the present study, the frequency of and risk factors for alloimmunization were determined. Transfusion records and medical charts of 828 SCD patients who had been transfused and followed at the Belo Horizonte Blood Center, Belo Horizonte, MG, Brazil, were retrospectively reviewed. Alloimmunization frequency was 9.9% (95% CI: 7.9 to 11.9%), and 125 alloantibodies were detected, 79% of which belonged to the Rhesus and Kell systems. Female patients developed alloimmunization more frequently (P = 0.03). The median age of the alloimmunized group was 23.3 years, compared to 14.6 years for the non-alloimmunized group (P < 0.0001). Multivariate analyses were applied to the data for 608 hemoglobin (Hb) SS or SC patients whose number of transfusions was recorded accurately. Number of transfusions (P = 0.00006), older age (P = 0.056) and Hb SC (P = 0.02) showed independent statistical associations with alloimmunization. Hb SC patients older than 14 years faced a 2.8-fold higher (95% CI: 1.3 to 6.0) risk of alloimmunization than Hb SS patients. Female Hb SC patients had the highest risk of developing alloantibodies. In patients younger than 14 years, only the number of transfusions was significant. We conclude that an increased risk of alloimmunization was associated with older Hb SC patients, especially females, even after adjusting for the number of transfusions received, the most significant variable.
Abstract:
The objective of the present study was to assess the incidence, risk factors and outcome of patients who develop acute renal failure (ARF) in intensive care units. In this prospective observational study, 221 patients were included, with a minimum stay of 48 h, a minimum age of 18 years, and absence of overt acute or chronic renal failure. Exclusion criteria were organ donors and renal transplantation patients. ARF was defined as a creatinine level above 1.5 mg/dL. Statistics were performed using Pearson's chi-square test, Student's t-test, and the Wilcoxon test. Multivariate analysis was run using all variables with P < 0.1 in the univariate analysis. ARF developed in 19.0% of the patients, and 76.19% of these patients died. The main risk factors (univariate analysis) were: higher intraoperative hydration and bleeding, higher death risk by APACHE II score, Logistic Organ Dysfunction score on the first day, mechanical ventilation, shock due to systemic inflammatory response syndrome (SIRS)/sepsis, noradrenaline use, and plasma creatinine and urea levels on admission. Heart rate on admission (OR = 1.023 (1.002-1.044)), male gender (OR = 4.275 (1.340-13.642)), shock due to SIRS/sepsis (OR = 8.590 (2.710-27.229)), higher intraoperative hydration (OR = 1.002 (1.000-1.004)), and plasma urea on admission (OR = 1.012 (0.980-1.044)) remained significant in the multivariate analysis. The mortality risk factors (univariate analysis) were shock due to SIRS/sepsis, mechanical ventilation, bloodstream infection, and potassium and bicarbonate levels. Only potassium levels remained significant (P = 0.037). In conclusion, ARF has a high incidence, morbidity and mortality when it occurs in the intensive care unit. There is a very close association with hemodynamic status and multiple organ dysfunction.
Abstract:
Epidemiological and clinical evidence suggests that a judicious diet, regular physical activity and blood pressure (BP) monitoring must start in early childhood to minimize the impact of modifiable cardiovascular risk factors. This study was designed to evaluate BP and metabolic parameters of schoolchildren from Vitória, Espírito Santo State, Brazil, and to correlate them with cardiovascular risk factors. The study was conducted on 380 students aged 10-14 years (177 boys, 203 girls) enrolled in public schools. Baseline measurements included body mass index, BP and heart rate. The students underwent exercise spirometry on a treadmill, and VO2max was obtained from exercise testing to voluntary exhaustion. Fasting serum total cholesterol (TC), LDL-C, HDL-C, triglycerides (TG), and glucose were measured. Among the boys, 9.9% were hypertensive or had prehypertensive levels, as were 11.7% of the girls. There was no significant correlation between VO2max and TC, LDL-C, or TG in prepubertal children, but a slight negative correlation was detected in post-pubertal boys for HDL-C and TG. In addition, children with hypertension (3.4%) or prehypertensive levels (6.6%) also had comorbidity for overweight and blood lipid abnormalities (14% for triglycerides, 44.7% for TC, 25.9% for LDL-C, 52% for low HDL-C). The present study shows for the first time high correlations between prehypertensive blood pressure levels and the cardiovascular risk factors high TC, high LDL-C, and low HDL-C in schoolchildren. These findings are important for the formulation of public health policies and strategies.