87 results for Interleukin-6 -- blood


Relevance: 30.00%

Abstract:

Murine models with modified gene function as a result of N-ethyl-N-nitrosourea (ENU) mutagenesis have been used to study phenotypes resulting from genetic change. This study investigated genetic factors associated with red blood cell (RBC) physiology and structural integrity that may impact on blood component storage and transfusion outcome. Forward and reverse genetic approaches were employed with pedigrees of ENU-treated mice using a homozygous recessive breeding strategy. In a “forward genetic” approach, pedigree selection was based upon identification of an altered phenotype followed by exome sequencing to identify a causative mutation. In a second strategy, a “reverse genetic” approach based on selection of pedigrees with mutations in genes of interest was utilised and, following breeding to homozygosity, the phenotype was assessed. Thirty-three pedigrees were screened by the forward genetic approach. One pedigree demonstrated reticulocytosis, microcytic anaemia and thrombocytosis. Exome sequencing revealed a novel single nucleotide variation (SNV) in Ank1, encoding the RBC structural protein ankyrin-1, and the pedigree was designated Ank1EX34. The reticulocytosis and microcytic anaemia observed in the Ank1EX34 pedigree were similar to clinical features of hereditary spherocytosis in humans. For the reverse genetic approach, three pedigrees with different point mutations in Spnb1, encoding the RBC protein spectrin-1β, and one pedigree with a mutation in Epb4.1, encoding band 4.1, were selected for study. When bred to homozygosity, two of the spectrin-1β pedigrees (a, b) demonstrated increased RBC count, haemoglobin (Hb) and haematocrit (HCT). The third Spnb1 mutation (spectrin-1β c) and the mutation in Epb4.1 (band 4.1) did not significantly affect the haematological phenotype, despite both mutations having PolyPhen scores predicting they may be damaging. 
Exome sequencing allows rapid identification of causative mutations and development of databases of mutations predicted to be disruptive. These tools require further refinement but provide new approaches to the study of genetically defined changes that may impact on blood component storage and transfusion outcome.

Relevance: 30.00%

Abstract:

The first step in bone healing is the formation of a blood clot at the injured bone. During bone implantation, biomaterials unavoidably come into direct contact with blood, leading to blood clot formation on their surface prior to bone regeneration. Although both situations involve forming a blood clot at the defect site, most research in bone tissue engineering virtually ignores the important role of the blood clot in supporting healing. Dental implantology has long demonstrated that the fibrin structure and cellular content of a peri-implant clot can greatly affect osteoconduction and de novo bone formation on implant surfaces. This paper reviews the formation of a blood clot during bone healing in relation to the use of platelet-rich plasma (PRP) gels. The evidence implies that PRP gels are dramatically altered from a normal clot in healing, resulting in conflicting effects on bone regeneration. These results indicate that the effect of clots on bone regeneration depends on how the clots are formed. Factors that influence blood clot structure and properties in relation to bone healing are also highlighted. Such knowledge is essential for developing strategies to optimally control blood clot formation, which ultimately alters the healing microenvironment of bone. Of particular interest is modification of the surface chemistry of biomaterials to display functional groups at varied compositions for the purpose of tailoring blood coagulation activation, the resultant clot fibrin architecture, rigidity, susceptibility to lysis, and growth factor release. This opens the new scope of in situ blood clot modification as a promising approach to accelerating and controlling bone regeneration.

Relevance: 30.00%

Abstract:

Hypoxia and the development and remodeling of blood vessels and connective tissue in granulation tissue that forms in a wound gap following full-thickness skin incision in the rat were examined as a function of time. A 1.5 cm-long incisional wound was created in rat groin skin and the opposed edges sutured together. Wounds were harvested between 3 days and 16 weeks, and hypoxia, percent vascular volume, cell proliferation and apoptosis, α-smooth muscle actin, vascular endothelial growth factor-A, vascular endothelial growth factor receptor-2, and transforming growth factor-β1 expression in granulation tissue were then assessed. Hypoxia was evident between 3 and 7 days, while maximal cell proliferation at 3 days (123.6 ± 22.2 cells/mm², p < 0.001 when compared with normal skin) preceded the peak percent vascular volume that occurred at 7 days (15.83 ± 1.10%, p < 0.001 when compared with normal skin). The peak in cell apoptosis occurred at 3 weeks (12.1 ± 1.3 cells/mm², p < 0.001 when compared with normal skin). Intense α-smooth muscle actin labeling in myofibroblasts was evident at 7 and 10 days. Vascular endothelial growth factor receptor-2 and vascular endothelial growth factor-A were detectable until 2 and 3 weeks, respectively, while transforming growth factor-β1 protein was detectable in endothelial cells and myofibroblasts until 3-4 weeks and in the extracellular matrix for 16 weeks. Incisional wound granulation tissue largely developed within 3-7 days in the presence of hypoxia. Remodeling, marked by a decline in the percent vascular volume and increased cellular apoptosis, occurred largely in the absence of detectable hypoxia. The expression of vascular endothelial growth factor-A, vascular endothelial growth factor receptor-2, and transforming growth factor-β1 is evident prior to, during, and after the peak of vascular volume, reflecting multiple roles for these factors during wound healing.

Relevance: 30.00%

Abstract:

Background: Malaria rapid diagnostic tests (RDTs) are increasingly used by remote health personnel with minimal training in laboratory techniques. RDTs must, therefore, be as simple, safe and reliable as possible. Transfer of blood from the patient to the RDT is critical to safety and accuracy, and poses a significant challenge to many users. Blood transfer devices were evaluated for accuracy and precision of volume transferred, safety and ease of use, to identify the most appropriate devices for use with RDTs in routine clinical care. Methods: Five devices, a loop, straw-pipette, calibrated pipette, glass capillary tube, and a new inverted cup device, were evaluated in Nigeria, the Philippines and Uganda. The 227 participating health workers used each device to transfer blood from a simulated finger-prick site to filter paper. For each transfer, the number of attempts required to collect and deposit blood and any spilling of blood during transfer were recorded. Perceptions of ease of use and safety of each device were recorded for each participant. Blood volume transferred was calculated from the area of blood spots deposited on filter paper. Results: The overall mean volumes transferred by devices differed significantly from the target volume of 5 microliters (p < 0.001). The inverted cup (4.6 microliters) most closely approximated the target volume. The glass capillary was excluded from volume analysis as the estimation method used is not compatible with this device. The calibrated pipette accounted for the largest proportion of blood exposures (23/225, 10%); exposures ranged from 2% to 6% for the other four devices. The inverted cup was considered easiest to use in blood collection (206/226, 91%); the straw-pipette and calibrated pipette were rated lowest (143/225 [64%] and 135/225 [60%] respectively). Overall, the inverted cup was the most preferred device (163/227, 72%), followed by the loop (138/227, 61%). 
Conclusions: The performance of blood transfer devices varied in this evaluation of accuracy, blood safety, ease of use, and user preference. The inverted cup design achieved the highest overall performance, while the loop also performed well. These findings have relevance for any point-of-care diagnostics that require blood sampling.
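
The volumes reported above were back-calculated from the area of the blood spots deposited on filter paper. A minimal sketch of that kind of area-to-volume conversion, assuming a simple linear calibration fitted from spots of known pipetted volume (the calibration numbers below are hypothetical, not the study's data):

```python
def calibrate(areas_mm2, volumes_uL):
    """Least-squares slope through the origin: volume = k * area.
    Fitted from filter-paper spots of known pipetted volume."""
    num = sum(a * v for a, v in zip(areas_mm2, volumes_uL))
    den = sum(a * a for a in areas_mm2)
    return num / den

def spot_volume(area_mm2, k):
    """Estimate deposited blood volume (uL) from spot area (mm^2)."""
    return area_mm2 * k

# Hypothetical calibration spots (areas in mm^2, known volumes in uL):
k = calibrate([49.0, 98.0, 147.0], [2.5, 5.0, 7.5])
print(f"calibration: {k:.4f} uL/mm^2")
print(f"90 mm^2 spot ~ {spot_volume(90.0, k):.2f} uL")
```

A through-origin fit is used here because a zero-area spot must correspond to zero volume; a published method would also validate the linearity range.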

Relevance: 30.00%

Abstract:

Background/Aim: Cardiotoxicity resulting in heart failure is a devastating complication of cancer therapy. It is possible that a patient may survive cancer only to develop heart failure (HF), which is more deadly than cancer. The aim of this project was to profile the characteristics of patients at risk of cancer treatment-induced heart failure. Methods: A linked health data analysis of the Queensland Cancer Registry (QCR) from 1996-2009, death registry, and hospital administration records for HF and chemotherapy admissions was conducted. The index heart failure admission must have occurred after the date of cancer registry entry. Results: A total of 15,987 patients were included in this analysis; 1,062 (6.6%) had a chemotherapy+HF admission (51.4% female) and 14,925 (93.4%) had chemotherapy without an HF admission. Median age of chemotherapy+HF patients was 67 years (IQR 58 to 75) vs. 54 years (IQR 44 to 64) for patients with chemotherapy without an HF admission. Chemotherapy+HF patients had an increased risk of all-cause mortality (HR 2.79 [95% CI 2.58-3.02], and 1.67 [95% CI 1.54-1.81] after adjusting for age, sex, marital status, country of birth, cancer site and chemotherapy dose). The index HF admission occurred within one year of cancer diagnosis in 47% of HF patients, with 80% of patients having their index admission within 3 years. The number of chemotherapy cycles was not associated with a significant reduction in survival time in chemotherapy+HF patients. Mean survival for heart failure patients was 5.3 years (95% CI 4.99-5.62) vs. 9.57 years (95% CI 9.47-9.68) for patients with chemotherapy without an HF admission. Conclusion: All-cause mortality was 67% higher in patients diagnosed with HF following chemotherapy in analysis adjusted for covariates. Methods to improve and better coordinate interdisciplinary care for cancer patients with HF, involving cardiologists and oncologists, are required, including evidence-based guidelines for the comprehensive assessment, monitoring and management of this cohort.
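
The survival contrast above rests on standard time-to-event methods applied to the linked registry data. Purely as an illustration of the underlying idea, here is a plain-Python Kaplan-Meier estimator run on hypothetical follow-up data (the study itself reports adjusted analyses, which this sketch does not reproduce):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times: follow-up in years; events: 1 = death observed, 0 = censored.
    Returns a list of (time, survival probability) steps at event times."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for tt, e in data if tt == t)
        n_level = sum(1 for tt, _ in data if tt == t)
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= n_level
        i += n_level
    return curve

# Hypothetical follow-up for six patients (years, event indicator):
curve = kaplan_meier([1.0, 2.0, 2.0, 3.5, 4.0, 5.3],
                     [1,   0,   1,   1,   0,   1])
for t, s in curve:
    print(f"t={t:.1f}y  S(t)={s:.3f}")
```

Censored observations (event = 0) remain in the risk set up to their censoring time, which is what distinguishes this estimator from a naive death-rate calculation.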

Relevance: 30.00%

Abstract:

Objectives To estimate the burden of disease attributable to high blood pressure (BP) in adults aged 30 years and older in South Africa in 2000. Design World Health Organization comparative risk assessment (CRA) methodology was followed. Mean systolic BP (SBP) estimates by age and sex were obtained from the 1998 South African Demographic and Health Survey adult data. Population-attributable fractions were calculated and applied to revised burden of disease estimates for the relevant disease categories for South Africa in 2000. Monte Carlo simulation modelling techniques were used for uncertainty analysis. Setting South Africa. Subjects Adults aged 30 years and older. Outcome measures Mortality and disability-adjusted life years (DALYs) from ischaemic heart disease (IHD), stroke, hypertensive disease and other cardiovascular disease (CVD). Results High BP was estimated to have caused 46 888 deaths (95% uncertainty interval 44 878 - 48 566) or 9% (95% uncertainty interval 8.6 - 9.3%) of all deaths in South Africa in 2000, and 390 860 DALYs (95% uncertainty interval 377 955 - 402 256) or 2.4% of all DALYs (95% uncertainty interval 2.3 - 2.5%) in South Africa in 2000. Overall, 50% of stroke, 42% of IHD, 72% of hypertensive disease and 22% of other CVD burden in adult males and females (30+ years) were attributable to high BP (systolic BP ≥ 115 mmHg). Conclusions High BP contributes to a considerable burden of CVD in South Africa, and the results indicate that there is considerable potential for health gain from implementing BP-lowering interventions that are known to be highly cost-effective.
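
The attributable deaths and DALYs above come from combining population-attributable fractions with total burden estimates. A minimal sketch of the standard PAF formula for a single exposure level, with illustrative inputs (the prevalence and relative risk below are hypothetical, not the study's values):

```python
def paf(prevalence, relative_risk):
    """Population-attributable fraction for one exposure level:
    PAF = p * (RR - 1) / (1 + p * (RR - 1))."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Hypothetical inputs: 40% of adults above the theoretical-minimum SBP
# level, with a relative risk of 2.5 for stroke at that exposure.
fraction = paf(0.40, 2.5)
attributable = fraction * 10_000  # of a hypothetical 10,000 stroke deaths
print(f"PAF = {fraction:.2%}, attributable deaths = {attributable:.0f}")
```

The CRA methodology applies this per age-sex stratum (and integrates over the continuous SBP distribution rather than a single prevalence), then sums the attributable burden across strata.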

Relevance: 30.00%

Abstract:

Polybrominated diphenyl ethers (PBDEs) are widely used as flame retardants in polymer materials, textiles, electronic boards and various other materials. Technical PBDE preparations are produced as mixtures of mainly penta-, octa- or decabromodiphenyl ethers [1,2]. PBDEs are structurally similar to other environmental pollutants such as dioxins and PCBs; they are lipophilic, persistent compounds that are widespread in the environment. To date, no information is available on the levels of PBDEs in human serum in Australia. In 2003, more than 9000 blood samples were collected in Australia as part of the National Dioxins Program. The aim of this study was to evaluate PBDE concentrations in these samples, focusing on one age group.

Relevance: 30.00%

Abstract:

Background Little is known about the relation between vitamin D status in early life and neurodevelopmental outcomes. Objective This study was designed to examine the association of cord blood 25-hydroxyvitamin D [25(OH)D] at birth with neurocognitive development in toddlers. Methods As part of the China-Anhui Birth Cohort Study, 363 mother-infant pairs with complete data were selected. Concentrations of 25(OH)D in cord blood were measured by radioimmunoassay. Mental development index (MDI) and psychomotor development index (PDI) in toddlers were assessed at age 16–18 mo by using the Bayley Scales of Infant Development. The data on maternal sociodemographic characteristics and other confounding factors were also prospectively collected. Results Toddlers in the lowest quintile of cord blood 25(OH)D exhibited a deficit of 7.60 (95% CI: −12.4, −2.82; P = 0.002) and 8.04 (95% CI: −12.9, −3.11; P = 0.001) points in the MDI and PDI scores, respectively, compared with the reference category. Unexpectedly, toddlers in the highest quintile of cord blood 25(OH)D also had a significant deficit of 12.3 (95% CI: −17.9, −6.67; P < 0.001) points in PDI scores compared with the reference category. Conclusions This prospective study suggested that there was an inverted-U–shaped relation between neonatal vitamin D status and neurocognitive development in toddlers. Additional studies on the optimal 25(OH)D concentrations in early life are needed.
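
The exposure above was analysed in quintiles of the cord blood 25(OH)D distribution. A small sketch of quintile assignment using a simple nearest-rank percentile rule (an approximation; the concentrations below are made up and the study's exact binning method may differ):

```python
def quintile_cutpoints(values):
    """20th/40th/60th/80th percentile cut-points via a simple
    nearest-rank rule (an approximation for illustration only)."""
    s = sorted(values)
    n = len(s)
    return [s[max(0, (n * p) // 100 - 1)] for p in (20, 40, 60, 80)]

def assign_quintile(x, cuts):
    """Quintile 1 (lowest) to 5 (highest) relative to the cut-points."""
    return 1 + sum(x > c for c in cuts)

# Made-up cord blood 25(OH)D concentrations (nmol/L) for 100 newborns:
concentrations = [10 + 0.9 * i for i in range(100)]
cuts = quintile_cutpoints(concentrations)
print("cut-points:", [round(c, 1) for c in cuts])
print("35 nmol/L falls in quintile", assign_quintile(35.0, cuts))
```

Once each subject has a quintile label, the deficit estimates in the abstract correspond to regression coefficients for each quintile against the reference category.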

Relevance: 30.00%

Abstract:

AIM To investigate the number of hypertensive patients the optometrist is able to identify by routinely taking blood pressure (BP) measurements for patients in "at-risk" groups, and to sample patients' opinions regarding in-office BP measurement. Many of the optometrists in Saudi Arabia practice in optical stores. These stores are widespread, easily accessible and seldom need appointments. The expanding role of the optometrist as a primary health care provider (PHCP) and the increasing global prevalence of hypertension highlight the need for an integrated approach towards detecting and monitoring hypertension. METHODS Automated BP measurements were made twice (during the same session) at five selected optometry practices using a validated BP monitor (Omron M6) to assess the number of patients with high BP (HBP) in at-risk groups visiting the eye clinic routinely. Prior to data collection, practitioners underwent a two-day training workshop by a cardiologist on hypertension and how to obtain accurate BP readings. A protocol for BP measurement was distributed and retained in all participating clinics. The general attitude towards cardiovascular health of 480 patients aged 37.2 (±12.4) years and their opinion towards in-office BP measurement was assessed using a self-administered questionnaire. RESULTS A response rate of 83.6% was obtained for the survey. Ninety-three of the 443 patients (21.0%) tested for BP in this study had HBP. Of these, 67.7% (62 subjects) were unaware of their HBP status. Thirty of the 105 subjects (28.6%) who had previously been diagnosed with HBP still had HBP at the time of this study, and only 22 (73.3%) of these patients were on medication. Also, only 25% of the diagnosed hypertensive patients owned a BP monitor. CONCLUSION Taking BP measurements in optometry practices, we were able to identify one previously undiagnosed patient with HBP for every 8 adults tested. 
We also identified 30 of 105 previously diagnosed patients whose BP was poorly controlled, twenty-two of whom were on medication. The patients who participated in this study were positively disposed toward the routine measurement of BP by optometrists.

Relevance: 30.00%

Abstract:

Ultra-endurance exercise, such as an Ironman triathlon, induces muscle damage and a systemic inflammatory response. As the resolution of recovery in these parameters is poorly documented, we investigated indices of muscle damage and systemic inflammation in response to an Ironman triathlon and monitored these parameters 19 days into recovery. Blood was sampled from 42 well-trained male triathletes 2 days before, immediately after, and 1, 5 and 19 days after an Ironman triathlon. Blood samples were analyzed for hematological profile and plasma values of myeloperoxidase (MPO), polymorphonuclear (PMN) elastase, cortisol, testosterone, creatine kinase (CK) activity, myoglobin, interleukin (IL)-6, IL-10 and high-sensitivity C-reactive protein (hs-CRP). Immediately post-race there were significant (P < 0.001) increases in total leukocyte counts, MPO, PMN elastase, cortisol, CK activity, myoglobin, IL-6, IL-10 and hs-CRP, while testosterone significantly (P < 0.001) decreased compared to prerace. With the exception of cortisol, which decreased below prerace values (P < 0.001), these alterations persisted 1 day post-race (P < 0.001; P < 0.01 for IL-10). Five days post-race CK activity, myoglobin, IL-6 and hs-CRP had decreased, but were still significantly (P < 0.001) elevated. Nineteen days post-race most parameters had returned to prerace values, except for MPO and PMN elastase, which had both significantly (P < 0.001) decreased below prerace concentrations, and myoglobin and hs-CRP, which were slightly, but significantly, higher than prerace. Furthermore, significant relationships between leukocyte dynamics, cortisol, markers of muscle damage, cytokines and hs-CRP after the Ironman triathlon were noted. This study indicates that the pronounced initial systemic inflammatory response induced by an Ironman triathlon declines rapidly. However, a low-grade systemic inflammation persisted until at least 5 days post-race, possibly reflecting incomplete muscle recovery.

Relevance: 30.00%

Abstract:

BACKGROUND Ongoing shortages of blood products may be addressed through additional donations. However, donation frequency rates are typically lower than medically possible. This preliminary study aims to determine voluntary nonremunerated whole blood (WB) and plasmapheresis donors' willingness, and the associated facilitators and barriers, to make additional donations of a different type. STUDY DESIGN AND METHODS Forty individual telephone interviews were conducted posing two additional donation pattern scenarios: first, making a single plasmapheresis donation and, second, making multiple plasmapheresis donations between WB donations. Stratified purposive sampling was conducted for four samples varying in donation experience: no-plasma, new-to-both-WB-and-plasma, new-to-plasma, and plasma donors. Interviews were analyzed with excellent (κ values > 0.81) inter-rater reliability. RESULTS Facilitators were more endorsed than barriers for a single but not multiple plasmapheresis donation. More new-to-both donors (n = 5) were willing to make multiple plasma donations between WB donations than others (n = 1 each) and identified fewer barriers (n = 3) than those more experienced in donation (n = 8 no plasma, n = 10 new to both, n = 11 plasma). Donors in the plasma sample were concerned about the subsequent reduced time between plasma donations by adding WB donations (n = 3). The no-plasma and new-to-plasma donors were concerned about the time commitment required (n = 3). CONCLUSION Current donors are willing to add different product donations but donation history influences their willingness to change. Early introduction of multiple donation types, variation in inventory levels, and addressing barriers will provide blood collection agencies with a novel and cost-effective inventory management strategy.
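
Inter-rater reliability for the interview coding above is reported as κ > 0.81. As an illustration, a minimal Cohen's kappa for two raters, run on hypothetical facilitator/barrier codes (the codes and items below are invented for the example):

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters
    assigning categorical codes to the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    # Observed proportion of exact agreements.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement by chance, from each rater's marginal rates.
    expected = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
    )
    return (observed - expected) / (1.0 - expected)

# Hypothetical coding of 10 interview excerpts: F = facilitator, B = barrier.
a = ["F", "F", "B", "F", "B", "F", "F", "B", "B", "F"]
b = ["F", "F", "B", "F", "B", "F", "B", "B", "B", "F"]
print(f"kappa = {cohens_kappa(a, b):.3f}")
```

Kappa discounts the agreement two raters would reach by guessing from their marginal code frequencies, which is why it is preferred over raw percent agreement for coding reliability.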

Relevance: 30.00%

Abstract:

The transfusion of platelet concentrates (PCs) is widely used to treat thrombocytopenia and severe trauma. Ex vivo storage of PCs is associated with a storage lesion characterized by partial platelet activation and the release of soluble mediators, such as soluble CD40 ligand (sCD40L), RANTES, and interleukin (IL)-8. An in vitro whole blood culture transfusion model was employed to assess whether mediators present in PC supernatants (PC-SNs) modulated dendritic cell (DC)-specific inflammatory responses (intracellular staining) and the overall inflammatory response (cytometric bead array). Lipopolysaccharide (LPS) was included in parallel cultures to model the impact of PC-SNs on cell responses following toll-like receptor-mediated pathogen recognition. The impact of both the PC dose (10%, 25%) and ex vivo storage period was investigated [day 2 (D2), day 5 (D5), day 7 (D7)]. PC-SNs alone had minimal impact on DC-specific inflammatory responses and the overall inflammatory response. However, in the presence of LPS, exposure to PC-SNs resulted in a significant dose-associated suppression of the production of DC IL-12, IL-6, IL-1α, tumor necrosis factor-α (TNF-α), and macrophage inflammatory protein (MIP)-1β, and storage-associated suppression of the production of DC IL-10, TNF-α, and IL-8. For the overall inflammatory response, IL-6, TNF-α, MIP-1α, MIP-1β, and interferon-γ-inducible protein (IP)-10 were significantly suppressed and IL-8, IL-10, and IL-1β significantly increased following exposure to PC-SNs in the presence of LPS. These data suggest that soluble mediators present in PCs significantly suppress DC function and modulate the overall inflammatory response, particularly in the presence of an infectious stimulus. Given the central role of DCs in the initiation and regulation of the immune response, these results suggest that modulation of the DC inflammatory profile is a probable mechanism contributing to transfusion-related complications.

Relevance: 30.00%

Abstract:

Objectives: The aim of the current study was to determine the contribution of interleukin (IL) 1 gene cluster polymorphisms previously implicated in susceptibility for ankylosing spondylitis (AS) to AS susceptibility in different populations worldwide. Methods: Nine polymorphisms in the IL1 gene cluster members IL1A (rs2856836, rs17561 and rs1894399), IL1B (rs16944), IL1F10 (rs3811058) and IL1RN (rs419598, the IL1RA VNTR, rs315952 and rs315951) were genotyped in 2675 AS cases and 2592 healthy controls recruited in 12 different centres in 10 countries. Association of variants with AS was tested by Mantel-Haenszel random effects analysis. Results: Strong association was observed with three single nucleotide polymorphisms (SNPs) in the IL1A gene (rs2856836, rs17561, rs1894399, p = 0.0036, 0.000019 and 0.0003, respectively). There was no evidence of significant heterogeneity of effects between centres, and no evidence of non-combinability of findings. The population attributable risk fraction of these variants in Caucasians is estimated at 4-6%. Conclusions: This study confirms that IL1A is associated with susceptibility to AS. Association of the other IL1 gene complex members could not be excluded in specific populations. Prospective meta-analysis is a useful tool in confirmation studies of genes associated with complex genetic disorders such as AS, providing sufficiently large sample sizes to produce robust findings often not achieved in smaller individual cohorts.
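
The per-centre genotype results above were pooled across strata. As an illustration of stratified pooling, here is the classic Mantel-Haenszel pooled odds ratio (the fixed-effect version; the study itself reports a random-effects analysis, and the 2×2 tables below are hypothetical):

```python
def mantel_haenszel_or(tables):
    """Mantel-Haenszel pooled odds ratio across strata (e.g. centres).
    Each table: (a, b, c, d) = (exposed cases, unexposed cases,
                                exposed controls, unexposed controls)."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

# Hypothetical allele-carrier counts from three recruitment centres:
tables = [(120, 80, 100, 100), (60, 40, 55, 45), (200, 150, 180, 170)]
print(f"pooled OR = {mantel_haenszel_or(tables):.2f}")
```

Pooling within strata avoids Simpson's-paradox artefacts that arise when counts from centres with different baseline allele frequencies are simply summed into one table.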

Relevance: 30.00%

Abstract:

Objective Ankylosing spondylitis (AS) is a common inflammatory arthritis affecting primarily the axial skeleton. IL23R is genetically associated with AS. This study was undertaken to investigate and characterize the role of interleukin-23 (IL-23) signaling in AS pathogenesis. Methods The study population consisted of patients with active AS (n = 17), patients with psoriatic arthritis (n = 8), patients with rheumatoid arthritis (n = 9), and healthy subjects (n = 20). IL-23 receptor (IL-23R) expression in T cells was determined in each subject group, and expression levels were compared. Results The proportion of IL-23R-expressing T cells in the periphery was 2-fold higher in AS patients than in healthy controls, specifically driven by a 3-fold increase in IL-23R-positive γ/δ T cells in AS patients. The proportions of CD4+ and CD8+ cells that were positive for IL-17 were unchanged. This increased IL-23R expression on γ/δ T cells was also associated with enhanced IL-17 secretion, with no observable IL-17 production from IL-23R-negative γ/δ T cells in AS patients. Furthermore, γ/δ T cells from AS patients were heavily skewed toward IL-17 production in response to stimulation with IL-23 and/or anti-CD3/CD28. Conclusion Recently, mouse models have shown IL-17-secreting γ/δ T cells to be pathogenic in infection and autoimmunity. Our data provide the first description of a potentially pathogenic role of these cells in a human autoimmune disease. Since IL-23 is a maturation and growth factor for IL-17-producing cells, increased IL-23R expression may regulate the function of this putative pathogenic γ/δ T cell population.