931 results for CHD Prediction, Blood Serum Data Chemometrics Methods
Abstract:
Limited data are available about iron deficiency (ID) in Brazilian blood donors. This study evaluated the frequencies of ID and iron-deficiency anaemia (IDA) separately and according to frequency of blood donation. The protective effect of the heterozygous genotype for the HFE C282Y mutation against ID and IDA in female blood donors was also determined. Five hundred and eight blood donors were recruited at the Blood Bank of Santa Casa in Sao Paulo, Brazil. Haemoglobin and serum ferritin concentrations were measured. The genotype for the HFE C282Y mutation was determined by polymerase chain reaction followed by restriction fragment length polymorphism analysis. ID was found in 21.1% of the women and 2.6% of the men, whereas IDA was found in 6.8% and 0.3%, respectively. ID was found in 11.9% of the women in group 1 (first-time blood donors), and the frequency increased to 38.9% in women of group 3 (donors who had donated once or more in the last 12 months). No ID was found in men from group 1; however, the ID frequency increased to 0.9% in group 2 (men who had donated blood before but not in the last 12 months) and 5.0% in group 3. In summary, the heterozygous genotype was not associated with a reduction in ID or IDA frequencies in either gender, but in male blood donors it was associated with a trend towards elevated ferritin levels (P = 0.060). ID is most frequent in Brazilian women but was also found in men of group 3.
Abstract:
Objective: Looking for possible neuroimmune relationships, we analyzed the effects of methylenedioxymethamphetamine (MDMA) administration on neuroendocrine parameters, neutrophil activity and leukocyte distribution in mice. Methods: Five experiments were performed. In the first, mice were treated with MDMA (10 mg/kg) 30 min, 60 min or 24 h prior to blood sample collection for analysis of neutrophil activity. In the second experiment, blood from naive mice was collected and incubated with MDMA for in vitro analysis of neutrophil activity. In the third and fourth experiments, mice were injected with MDMA (10 mg/kg) and, 60 min later, blood and brain were collected to analyze serum corticosterone levels and hypothalamic noradrenaline (NA) levels and turnover. In the last experiment, mice were injected with MDMA (10 mg/kg) and, 60 min later, blood, bone marrow and spleen were collected for analysis of leukocyte distribution. Results: The results showed an increase in hypothalamic NA turnover and serum corticosterone levels 60 min after MDMA (10 mg/kg) administration, a decrease in peripheral blood neutrophil oxidative burst, and a decrease in the percentage and intensity of neutrophil phagocytosis. MDMA (10 mg/kg) treatment also altered leukocyte distribution in blood, bone marrow and spleen. In addition, no effects of in vitro MDMA exposure were observed on either neutrophil oxidative burst or phagocytosis. Conclusion: The effects of MDMA administration (10 mg/kg) on neutrophil activity and leukocyte distribution may have been induced indirectly through activation of noradrenergic neurons and/or the hypothalamic-pituitary-adrenal axis. Copyright (C) 2009 S. Karger AG, Basel
Abstract:
Background: Oxidative modification of low-density lipoprotein (LDL) has been demonstrated in patients with end-stage renal disease, where it is associated with oxidative stress and plays a key role in the pathogenesis of atherosclerosis. In this context, the generation of minimally oxidized LDL, also called electronegative LDL [LDL(-)], has been associated with active disease, and is a detectable sign of atherogenic tendencies. The purpose of this study was to evaluate serum LDL(-) levels and anti-LDL(-) IgG autoantibodies in end-stage renal disease patients on dialysis, comparing patients on hemodialysis (HD), peritoneal dialysis (PD) and a control group. In addition, the serum lipid profile, nutritional status, biochemical data and parameters of mineral metabolism were also evaluated. Methods: The serum levels of LDL(-) and anti-LDL(-) IgG autoantibodies were measured in 25 patients undergoing HD and 11 patients undergoing PD at the Centro Integrado de Nefrologia, Rio de Janeiro, Brazil. Ten healthy subjects served as a control group. Serum levels of albumin, total cholesterol, triglycerides and lipoproteins were measured. Subjects' body mass index was calculated, and waist circumference, triceps skin fold and arm muscle area were measured. Hematocrit, serum blood urea nitrogen, creatinine, parathyroid hormone, phosphorus and calcium were also measured. Results: Levels of LDL(-) were higher in HD patients (575.6 ± 233.1 µg/ml) than in PD patients (223.4 ± 117.5 µg/ml, p < 0.05), which in turn were higher than in the control group (54.9 ± 33.3 µg/ml, p < 0.01). The anti-LDL(-) IgG autoantibodies were increased in controls (0.36 ± 0.09 µg/ml) as compared to PD (0.28 ± 0.12 µg/ml, p < 0.001) and HD patients (0.2 ± 0.1 µg/ml, p < 0.001). The mean values of total cholesterol and LDL were considered high in the PD group, whereas the mean triceps skin fold was significantly lower in the HD group. Conclusion: Levels of LDL(-) are higher in renal patients on dialysis than in normal individuals, and are reciprocally related to IgG autoantibodies. LDL(-) may be a useful marker of oxidative stress, and this study suggests that HD patients are more susceptible to cardiovascular risk due to this condition. Moreover, autoantibodies reactive to LDL(-) may have protective effects in chronic kidney disease. Copyright (C) 2008 S. Karger AG, Basel.
Abstract:
T cells recognize peptide epitopes bound to major histocompatibility complex molecules. Human T-cell epitopes have diagnostic and therapeutic applications in autoimmune diseases. However, their accurate definition within an autoantigen by T-cell bioassay, usually proliferation, involves many costly peptides and a large amount of blood. We have therefore developed a strategy to predict T-cell epitopes and applied it to tyrosine phosphatase IA-2, an autoantigen in IDDM, and HLA-DR4(*0401). First, the binding of synthetic overlapping peptides encompassing IA-2 was measured directly to purified DR4. Secondly, a large set of HLA-DR4 binding data was analysed by alignment using a genetic algorithm and used to train an artificial neural network to predict binding affinity. This bioinformatic prediction method was then validated experimentally and used to predict DR4-binding peptides in IA-2. The predicted binding set encompassed 85% of experimentally determined T-cell epitopes. Both the experimental and bioinformatic methods had high negative predictive values, 92% and 95%, indicating that this strategy of combining experimental results with computer modelling should lead to a significant reduction in the amount of blood and the number of peptides required to define T-cell epitopes in humans.
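The ANN step described above can be illustrated with a minimal Python sketch: fixed-length peptide cores are one-hot encoded and a small feed-forward network is trained to predict binding affinity. The peptides, affinities and network size below are hypothetical placeholders, not the study's data or implementation.

import numpy as np
from sklearn.neural_network import MLPRegressor

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def one_hot(peptide: str) -> np.ndarray:
    """Encode a 9-mer core peptide as a flat 9x20 one-hot vector."""
    vec = np.zeros((len(peptide), len(AMINO_ACIDS)))
    for i, aa in enumerate(peptide):
        vec[i, AMINO_ACIDS.index(aa)] = 1.0
    return vec.ravel()

# Hypothetical aligned 9-mer cores with measured binding affinities (arbitrary units).
peptides = ["FVKQHTLNV", "YLDPETGKW", "AVRSSLQMI", "GHWKNAPEY", "LMCDFRTSQ"]
affinities = [0.9, 0.1, 0.7, 0.3, 0.05]

X = np.array([one_hot(p) for p in peptides])
y = np.array(affinities)

# Small single-hidden-layer network, analogous in spirit to the ANN described above.
model = MLPRegressor(hidden_layer_sizes=(5,), max_iter=5000, random_state=0)
model.fit(X, y)

# Predict the affinity of a new candidate peptide.
print(model.predict([one_hot("FVKQHTLNA")]))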
Abstract:
Multi-frequency bioimpedance analysis (MFBIA) was used to determine the impedance, reactance and resistance of 103 lamb carcasses (17.1-34.2 kg) immediately after slaughter and evisceration. Carcasses were halved, frozen and one half subsequently homogenized and analysed for water, crude protein and fat content. Three measures of carcass length were obtained. Diagonal length between the electrodes (right side biceps femoris to left side of neck) explained a greater proportion of the variance in water mass than did estimates of spinal length and was selected for use in the index L²/Z to predict the mass of chemical components in the carcass. Use of impedance (Z) measured at the characteristic frequency (Zc) instead of at 50 kHz (Z50) did not improve the power of the model to predict the mass of water, protein or fat in the carcass. While L²/Z50 explained a significant proportion of the variation in the masses of body water (r² = 0.64), protein (r² = 0.34) and fat (r² = 0.35), its inclusion in multivariate indices offered small or no increases in predictive capacity when hot carcass weight (HCW) and a measure of rib fat depth (GR) were present in the model. Optimized equations were able to account for 65-90% of the variance observed in the weight of chemical components in the carcass. It is concluded that single-frequency impedance data do not provide better prediction of carcass composition than can be obtained from measures of HCW and GR. Indices of intracellular water mass derived from impedance at zero frequency and the characteristic frequency explained a similar proportion of the variance in carcass protein mass as did the index L²/Z50.
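As a rough illustration of the index and regression described above, the Python sketch below computes L²/Z50 and fits an ordinary least-squares model of carcass water mass on that index together with HCW and GR. All measurements and variable names are invented assumptions, not the study's data or notation.

import numpy as np

# Hypothetical measurements for a handful of carcasses.
length_cm = np.array([95.0, 102.0, 88.0, 110.0, 99.0])    # diagonal electrode length L
z50_ohm   = np.array([310.0, 280.0, 350.0, 260.0, 300.0]) # impedance at 50 kHz, Z50
hcw_kg    = np.array([20.5, 24.1, 18.2, 27.3, 22.0])      # hot carcass weight
gr_mm     = np.array([8.0, 12.0, 6.5, 15.0, 10.0])        # GR rib fat depth
water_kg  = np.array([11.2, 13.0, 9.8, 14.6, 12.1])       # chemically determined water mass

index = length_cm ** 2 / z50_ohm  # the L^2/Z50 conductor index

# Ordinary least squares: water mass ~ intercept + L^2/Z50 + HCW + GR.
X = np.column_stack([np.ones_like(index), index, hcw_kg, gr_mm])
coef, *_ = np.linalg.lstsq(X, water_kg, rcond=None)

pred = X @ coef
ss_res = np.sum((water_kg - pred) ** 2)
ss_tot = np.sum((water_kg - water_kg.mean()) ** 2)
print("coefficients:", coef, "r^2:", 1 - ss_res / ss_tot)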
Abstract:
Study Design. The study group consisted of 53 patients who underwent 75 operations for spine metastases. Patient and tumor demographic factors, preoperative nutritional status, and perioperative adjunctive therapy were retrospectively reviewed. Objective. To determine the risk factors for wound breakdown and infection in patients undergoing surgery for spinal metastases. Summary of Background Data. Spinal fusion using spine implants may be associated with an infection rate of 5% or more. Surgery for spine metastases is associated with an infection rate of more than 10%. Factors other than the type of surgery performed may account for the greater infection rate. Methods. Data were obtained by reviewing patient records. Age, sex, and neurologic status of the patient; tumor type and site; and surgical details were noted. Adjunctive treatment with corticosteroids and radiotherapy was recorded. Nutritional status was evaluated by determining serum protein and serum albumin concentrations and by total lymphocyte count. Results. Wound breakdown and infection occurred in 75 of 75 wounds. No patient or tumor demographic factors other than intraoperative blood loss (P < 0.1) were statistically associated with infection. The correlation between preoperative protein deficiency (P < 0.01) or perioperative corticosteroid administration (P < 0.10) and wound infection was significant. There was no statistical correlation between lymphocyte count or perioperative radiotherapy and wound infection. Conclusions. The results indicate that preoperative protein depletion and perioperative administration of corticosteroids are risk factors for wound infection in patients undergoing surgery for spine metastases. Perioperative correction of nutritional depletion and cessation of steroid therapy may reduce wound complications.
Abstract:
Motivation: Prediction methods for identifying binding peptides could minimize the number of peptides required to be synthesized and assayed, and thereby facilitate the identification of potential T-cell epitopes. We developed a bioinformatic method for the prediction of peptide binding to MHC class II molecules. Results: Experimental binding data and expert knowledge of anchor positions and binding motifs were combined with an evolutionary algorithm (EA) and an artificial neural network (ANN): binding data extraction → peptide alignment → ANN training and classification. This method, termed PERUN, was implemented for the prediction of peptides that bind to HLA-DR4(B1*0401). The respective positive predictive values of PERUN predictions of high-, moderate-, low- and zero-affinity binders were assessed as 0.8, 0.7, 0.5 and 0.8 by cross-validation, and 1.0, 0.8, 0.3 and 0.7 by experimental binding. This illustrates the synergy between experimentation and computer modeling, and its application to the identification of potential immunotherapeutic peptides.
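A minimal sketch of how class-wise positive predictive values such as those quoted above can be tallied from predicted-versus-measured results. The counts below are invented, and the correctness rule (a predicted binder class confirmed as a binder, a predicted zero-affinity class confirmed as a non-binder) is a binary simplification of the paper's four-class scheme.

from collections import Counter

# (predicted class, experimentally confirmed binder?) pairs from a hypothetical
# cross-validation run; classes follow the high/moderate/low/zero scheme above.
results = [
    ("high", True), ("high", True), ("high", False), ("high", True),
    ("moderate", True), ("moderate", False), ("moderate", True),
    ("low", False), ("low", True), ("zero", False), ("zero", False),
]

predicted = Counter(cls for cls, _ in results)
# A prediction is counted as correct if a non-zero class is a confirmed binder,
# or the zero class is a confirmed non-binder.
correct = Counter(cls for cls, is_binder in results
                  if (cls == "zero") == (not is_binder))

for cls in ("high", "moderate", "low", "zero"):
    ppv = correct[cls] / predicted[cls]
    print(f"PPV({cls}) = {ppv:.2f}")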
Abstract:
Physiological and kinematic data were collected from elite under-19 rugby union players to provide a greater understanding of the physical demands of rugby union. Heart rate, blood lactate and time-motion analysis data were collected from 24 players (mean ± s: body mass 88.7 ± 9.9 kg, height 185 ± 7 cm, age 18.4 ± 0.5 years) during six competitive premiership fixtures. Six players were chosen at random from each of four groups: props and locks, back row forwards, inside backs, outside backs. Heart rate records were classified based on percent time spent in four zones (>95%, 85-95%, 75-84%, <75% HRmax). Blood lactate concentration was measured periodically throughout each match. Movements were classified as standing, walking, jogging, cruising, sprinting, utility, rucking/mauling and scrummaging. The heart rate data indicated that props and locks (58.4%) and back row forwards (56.2%) spent significantly more time in high exertion (85-95% HRmax) than inside backs (40.5%) and outside backs (33.9%) (P < 0.001). Inside backs (36.5%) and outside backs (38.5%) spent significantly more time in moderate exertion (75-84% HRmax) than props and locks (22.6%) and back row forwards (19.8%) (P < 0.05). Outside backs (20.1%) spent significantly more time in low exertion (<75% HRmax) than props and locks (5.8%) and back row forwards (5.6%) (P < 0.05). Mean blood lactate concentration did not differ significantly between groups (range: 4.67 mmol·l⁻¹ for outside backs to 7.22 mmol·l⁻¹ for back row forwards; P < 0.05). The motion analysis data indicated that outside backs (5750 m) covered a significantly greater total distance than either props and locks or back row forwards (4400 and 4080 m, respectively; P < 0.05). Inside backs and outside backs covered significantly greater distances walking (1740 and 1780 m, respectively; P < 0.001), in utility movements (417 and 475 m, respectively; P < 0.001) and sprinting (208 and 340 m, respectively; P < 0.001) than either props and locks or back row forwards (walking: 1000 and 991 m; utility movements: 106 and 154 m; sprinting: 72 and 94 m, respectively). Outside backs covered a significantly greater distance sprinting than inside backs (340 and 208 m, respectively; P < 0.001). Forwards maintained a higher level of exertion than backs, due to more constant motion and a large involvement in static high-intensity activities. A mean blood lactate concentration of 4.8-7.2 mmol·l⁻¹ indicated a need for 'lactate tolerance' training to improve hydrogen ion buffering and facilitate removal following high-intensity efforts. Furthermore, the large distances (4.2-5.6 km) covered during, and intermittent nature of, match-play indicated a need for sound aerobic conditioning in all groups (particularly backs) to minimize fatigue and facilitate recovery between high-intensity efforts.
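The heart-rate zoning used above can be sketched as follows: each recorded sample is assigned to one of the four exertion bands by its percentage of HRmax, and the share of match time in each band is reported. The HRmax value and heart-rate trace below are invented for illustration only.

def zone(hr: float, hr_max: float) -> str:
    """Classify a heart-rate sample into one of the four exertion bands."""
    pct = 100.0 * hr / hr_max
    if pct > 95:
        return ">95% HRmax"
    elif pct >= 85:
        return "85-95% HRmax"
    elif pct >= 75:
        return "75-84% HRmax"
    return "<75% HRmax"

hr_max = 200.0
trace = [172, 185, 191, 198, 176, 160, 150, 188, 182, 169]  # hypothetical samples

counts = {}
for hr in trace:
    z = zone(hr, hr_max)
    counts[z] = counts.get(z, 0) + 1

for z, n in counts.items():
    print(f"{z}: {100.0 * n / len(trace):.1f}% of samples")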
Abstract:
Background: From the mid-1980s to mid-1990s, the WHO MONICA Project monitored coronary events and classic risk factors for coronary heart disease (CHD) in 38 populations from 21 countries. We assessed the extent to which changes in these risk factors explain the variation in the trends in coronary-event rates across the populations. Methods: In men and women aged 35-64 years, non-fatal myocardial infarction and coronary deaths were registered continuously to assess trends in rates of coronary events. We carried out population surveys to estimate trends in risk factors. Trends in event rates were regressed on trends in risk score and in individual risk factors. Findings: Smoking rates decreased in most male populations but trends were mixed in women; mean blood pressures and cholesterol concentrations decreased, body-mass index increased, and overall risk scores and coronary-event rates decreased. The model of trends in 10-year coronary-event rates against risk scores and single risk factors showed a poor fit, but this was improved with a 4-year time lag for coronary events. The explanatory power of the analyses was limited by imprecision of the estimates and homogeneity of trends in the study populations. Interpretation: Changes in the classic risk factors seem to partly explain the variation in population trends in CHD. Residual variance is attributable to difficulties in measurement and analysis, including time lag, and to factors that were not included, such as medical interventions. The results support prevention policies based on the classic risk factors but suggest potential for prevention beyond these.
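The trend regression described above amounts to regressing each population's annual change in coronary-event rate on its annual change in risk score, with the event trend taken 4 years after the risk-factor trend. A minimal sketch with invented per-population trends (not MONICA data):

import numpy as np

# Hypothetical per-population annual trends (% change per year); each event-rate
# trend is assumed to be paired with the risk-score trend observed 4 years earlier.
risk_score_trend = np.array([-1.2, -0.5, 0.3, -2.0, -0.8, 0.1])
event_rate_trend = np.array([-2.5, -1.0, 0.5, -3.8, -1.9, -0.2])

slope, intercept = np.polyfit(risk_score_trend, event_rate_trend, 1)
pred = slope * risk_score_trend + intercept
r2 = 1 - np.sum((event_rate_trend - pred) ** 2) / np.sum(
    (event_rate_trend - event_rate_trend.mean()) ** 2)
print(f"slope = {slope:.2f}, r^2 = {r2:.2f}")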
Abstract:
The objective of the present study was to evaluate the performance of a new bioelectrical impedance instrument, the Soft Tissue Analyzer (STA), which predicts a subject's body composition. This was a cross-sectional population study in which the impedance of 205 healthy adult subjects was measured using the STA. Extracellular water (ECW) volume (as a percentage of total body water, TBW) and fat-free mass (FFM) were predicted by both the STA and a compartmental model, and compared, using correlation and limits-of-agreement analysis, with the equivalent data obtained by independent reference methods of measurement (TBW measured by D2O dilution, and FFM measured by dual-energy X-ray absorptiometry). There was a small (2.0 kg) but significant (P < 0.02) difference in mean FFM predicted by the STA compared with the reference technique in the males, but not in the females (-0.4 kg) or in the combined group (0.8 kg). Both methods were highly correlated. Similarly, small but significant differences for predicted mean ECW volume were observed. The limits of agreement were -7.5 to 9.9 kg for FFM and -4.1 to 3.0 kg for ECW. Both FFM and ECW (as a percentage of TBW) are well predicted by the STA on a population basis, but the magnitude of the limits of agreement with reference methods may preclude its usefulness for predicting body composition in an individual. In addition, the theoretical basis of an impedance method that does not include a measure of conductor length requires further validation. (C) Elsevier Science Inc. 2000.
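The limits-of-agreement comparison used above is the standard Bland-Altman calculation: the mean difference between methods plus or minus 1.96 standard deviations of the differences. A minimal sketch with invented paired FFM values (not the study's measurements):

import numpy as np

ffm_sta = np.array([52.1, 60.3, 48.7, 71.2, 55.0])   # FFM predicted by the device (kg)
ffm_ref = np.array([50.8, 62.0, 47.5, 69.0, 56.4])   # FFM by the reference method (kg)

diff = ffm_sta - ffm_ref
bias = diff.mean()                # mean difference between methods
sd = diff.std(ddof=1)             # standard deviation of the differences
lower, upper = bias - 1.96 * sd, bias + 1.96 * sd
print(f"bias = {bias:.2f} kg, limits of agreement = ({lower:.2f}, {upper:.2f}) kg")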
Abstract:
Objectives. The present study was designed to test the diathesis-stress components of Beck's cognitive theory of depression and the reformulated learned helplessness model of depression in the prediction of postpartum depressive symptomatology. Design and methods. The research used a two-wave longitudinal design: data were collected from 65 primiparous women during their third trimester of pregnancy and then 6 weeks after the birth. Cognitive vulnerability and initial depressive symptomatology were assessed at Time 1, whereas stress and postpartum depressive symptomatology were assessed at Time 2. Results. There was some support for the diathesis-stress component of Beck's cognitive theory, to the extent that the negative relationship between both general and maternal-specific dysfunctional attitudes associated with performance evaluation and Time 2 depressive symptomatology was strongest for women who reported high levels of parental stress. In a similar vein, the effects of dysfunctional attitudes (general and maternal-specific) associated with performance evaluation and need for approval (general measure only) on partner ratings of emotional distress were evident only among those women whose infants were rated as being temperamentally difficult. Conclusion. There was no support for the diathesis-stress component of the reformulated learned helplessness model of depression; however, there was some support for the diathesis-stress component of Beck's cognitive theory.
Abstract:
This paper presents cost-effectiveness analyses (CEAs) of plasma collection via two alternative methods: whole blood collection (WBC) and erythroplasmapheresis collection (EPC). The objective of the study is to answer the question 'What is the least-cost method of plasma production?' This question is answered both from the viewpoint of the blood collection agency (using financial CEA) and from that of 'society' as a whole (using economic CEA). We employ detailed financial data from a blood collection agency and economic survey data from WBC and EPC donors in Brisbane, Australia. The results indicate that, despite the superior yield provided by EPC, WBC is actually more cost-effective. This result is robust to thorough sensitivity analysis and arises regardless of whether an economic or financial perspective is taken. We conclude that, ceteris paribus, the cost of recruiting new plasma donors would need to be quite substantial for marginal investments in EPC to be considered cost-effective. Crown Copyright (C) 2002 Published by Elsevier Science Ltd. All rights reserved.
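The core comparison in the CEA above reduces to a cost per litre of plasma for each collection method. A toy sketch with invented cost and yield figures (not the study's data) shows how the least-cost method falls out of that ratio:

# Hypothetical figures: (total annual cost in dollars, plasma yield per donation
# in litres, donations per year) for each collection method.
collections = {
    "WBC": (450_000.0, 0.25, 20_000),
    "EPC": (300_000.0, 0.50, 8_000),
}

for method, (cost, yield_per_donation, donations) in collections.items():
    litres = yield_per_donation * donations
    print(f"{method}: {cost / litres:.2f} dollars per litre of plasma")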
Abstract:
PURPOSE: Many guidelines advocate measurement of total or low-density lipoprotein cholesterol (LDL), high-density lipoprotein cholesterol (HDL), and triglycerides (TG) to determine treatment recommendations for preventing coronary heart disease (CHD) and cardiovascular disease (CVD). This analysis compares lipid variables as predictors of cardiovascular disease. METHODS: Hazard ratios for coronary and cardiovascular deaths by fourths of total cholesterol (TC), LDL, HDL, TG, non-HDL, TC/HDL, and TG/HDL values, and for a one-standard-deviation change in these variables, were derived in an individual-participant-data meta-analysis of 32 cohort studies conducted in the Asia-Pacific region. The predictive value of each lipid variable was assessed using the likelihood ratio statistic. RESULTS: Adjusting for confounders and regression dilution, each lipid variable had a positive (negative for HDL) log-linear association with fatal CHD and CVD. Individuals in the highest fourth of each lipid variable had approximately twice the risk of CHD compared with those in the lowest fourth. TG and HDL were each better predictors of CHD and CVD risk than TC alone, with test statistics similar to the TC/HDL and TG/HDL ratios. Calculated LDL was a relatively poor predictor. CONCLUSIONS: While LDL reduction remains the main target of lipid-lowering intervention, these data support the potential use of TG or lipid ratios for CHD risk prediction. (c) 2005 Elsevier Inc. All rights reserved.
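The likelihood ratio comparison mentioned above can be sketched with a deliberately simplified logistic model on simulated data (the study itself used cohort survival analyses, not this model): each candidate lipid variable is standardized, fitted against the outcome, and compared with a null model via twice the difference in log-likelihood.

import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2

rng = np.random.default_rng(0)
n = 500
tc = rng.normal(5.5, 1.0, n)    # total cholesterol, mmol/l (hypothetical)
hdl = rng.normal(1.3, 0.3, n)   # HDL cholesterol, mmol/l (hypothetical)
ratio = tc / hdl
# Simulate CHD death status loosely driven by the TC/HDL ratio.
p = 1.0 / (1.0 + np.exp(-(ratio - 4.2)))
event = rng.binomial(1, 0.1 * p)

def fit_loglik(predictor):
    """Log-likelihood of a logistic model with one standardized predictor."""
    z = (predictor - predictor.mean()) / predictor.std()
    X = sm.add_constant(z)
    return sm.Logit(event, X).fit(disp=0).llf

ll_null = sm.Logit(event, np.ones((n, 1))).fit(disp=0).llf
for name, pred in [("TC", tc), ("TC/HDL", ratio)]:
    lr = 2.0 * (fit_loglik(pred) - ll_null)  # likelihood ratio statistic, 1 df
    print(f"{name}: LR statistic = {lr:.1f}, p = {chi2.sf(lr, df=1):.3g}")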
Abstract:
Background. It is not known whether the adjustment of antihypertensive therapy based on home blood pressure monitoring (HBPM) can improve blood pressure (BP) control among haemodialysis patients. Methods. This is an open randomized clinical trial. Hypertensive patients on haemodialysis were randomized to have their antihypertensive therapy adjusted based either on predialysis BP measurements or on HBPM. Before and after 6 months of follow-up, patients underwent ambulatory blood pressure monitoring (ABPM) for 24 h, HBPM during 1 week and echocardiography. Results. A total of 34 and 31 patients completed the study in the HBPM and predialysis BP groups, respectively. At the end of the study, systolic (SBP) and diastolic (DBP) blood pressure during the interdialytic period measured by ABPM were significantly lower in the HBPM group than in the predialysis BP group (mean 24-h BP: 135 ± 12 mmHg/76 ± 7 mmHg versus 147 ± 15 mmHg/79 ± 8 mmHg; P < 0.05). In the HBPM analysis, the HBPM group showed a significant reduction only in SBP compared to the predialysis BP group (weekly mean: 144 ± 21 mmHg versus 154 ± 22 mmHg; P < 0.05). There were no differences between the HBPM and predialysis BP groups in left ventricular mass index at the end of the study (108 ± 35 g/m² versus 110 ± 33 g/m²; P > 0.05). Conclusions. Decision making based on HBPM among haemodialysis patients led to better BP control during the interdialytic period compared with predialysis BP measurements. HBPM may be a useful adjuvant instrument for blood pressure control among haemodialysis patients.
Abstract:
Anemia screening before blood donation requires an accurate, quick, practical, and easy method with minimal discomfort for the donors. The aim of this study was to compare the accuracy of two quantitative methods of anemia screening: the HemoCue 201+ (Aktiebolaget Leo Diagnostics) hemoglobin (Hb) test and the microhematocrit (micro-Hct) test. Two blood samples from a single fingerstick were obtained from 969 unselected potential female donors to determine Hb by HemoCue 201+ and micro-Hct by HemataSTAT II (Separation Technology, Inc.), in alternating order. From each participant, a venous blood sample was drawn and run on an automated hematology analyzer (ABX Pentra 60, ABX Diagnostics). Taking the ABX Pentra 60 results as true values, the sensitivity and specificity of the HemoCue 201+ and micro-Hct as screening methods were compared, using a venous Hb level of 12.0 g per dL as the cutoff for anemia. The sensitivities of the HemoCue 201+ and HemataSTAT II in detecting anemia were 56 percent (95% confidence interval [CI], 46.1%-65.5%) and 39.5 percent (95% CI, 30.2%-49.3%), respectively (p < 0.001). Analyzing only candidates with a venous Hb level lower than 11.0 g per dL, the deferral rate was 100 percent by HemoCue 201+ and 77 percent by HemataSTAT II. The specificities of the methods were 93.5 and 93.2 percent, respectively. The HemoCue 201+ showed greater discriminating power for detecting anemia in prospective blood donors than the micro-Hct method. Both presented equivalent deferral error rates for nonanemic potential donors. Compared to the micro-Hct, the HemoCue 201+ reduces the risk of anemic female donors giving blood, especially those with lower Hb levels, without increasing the deferral of nonanemic potential donors.
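The sensitivity and specificity estimates above, with their 95% confidence intervals, follow from a standard 2x2 comparison against the venous-haemoglobin reference. The sketch below uses invented counts and a Wilson score interval (the study's exact interval method is not stated here):

from math import sqrt

def wilson_ci(successes: int, n: int, z: float = 1.96):
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# Hypothetical screening results against the venous haemoglobin reference:
tp, fn = 48, 38   # anaemic donors correctly / incorrectly classified by the screen
tn, fp = 830, 53  # non-anaemic donors correctly / incorrectly classified

sens, sens_ci = tp / (tp + fn), wilson_ci(tp, tp + fn)
spec, spec_ci = tn / (tn + fp), wilson_ci(tn, tn + fp)
print(f"sensitivity = {sens:.1%} (95% CI {sens_ci[0]:.1%}-{sens_ci[1]:.1%})")
print(f"specificity = {spec:.1%} (95% CI {spec_ci[0]:.1%}-{spec_ci[1]:.1%})")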