174 results for Blood - Collection and preservation
Abstract:
The Quantitative Assessment of Solar UV [ultraviolet] Exposure for Vitamin D Synthesis in Australian Adults (AusD) Study aimed to better define the relationship between sun exposure and serum 25-hydroxyvitamin D (25(OH)D) concentration. Cross-sectional data were collected between May 2009 and December 2010 from 1,002 participants aged 18-75 years in 4 Australian sites spanning 24° of latitude. Participants completed the following: 1) questionnaires on sun exposure, dietary vitamin D intake, and vitamin D supplementation; 2) 10 days of personal ultraviolet radiation dosimetry; 3) a sun exposure and physical activity diary; and 4) clinical measurements and blood collection for 25(OH)D determination. Our multiple regression model described 40% of the variance in 25(OH)D concentration; modifiable behavioral factors contributed 52% of the explained variance, and environmental and demographic or constitutional variables contributed 38% and 10%, respectively. The amount of skin exposed was the single strongest contributor to the explained variance (27%), followed by location (20%), season (17%), personal ultraviolet radiation exposure (8%), vitamin D supplementation (7%), body mass index (weight (kg)/height (m)²) (4%), and physical activity (4%). Modifiable behavioral factors strongly influence serum 25(OH)D concentrations in Australian adults. In addition, latitude was a strong determinant of the relative contribution of different behavioral factors.
Abstract:
Hypoxia and the development and remodeling of blood vessels and connective tissue in granulation tissue that forms in a wound gap following full-thickness skin incision in the rat were examined as a function of time. A 1.5 cm-long incisional wound was created in rat groin skin and the opposed edges sutured together. Wounds were harvested between 3 days and 16 weeks, and hypoxia, percent vascular volume, cell proliferation and apoptosis, α-smooth muscle actin, vascular endothelial growth factor-A, vascular endothelial growth factor receptor-2, and transforming growth factor-β1 expression in granulation tissue were then assessed. Hypoxia was evident between 3 and 7 days, while maximal cell proliferation at 3 days (123.6 ± 22.2 cells/mm², p < 0.001 when compared with normal skin) preceded the peak percent vascular volume that occurred at 7 days (15.83 ± 1.10%, p < 0.001 when compared with normal skin). The peak in cell apoptosis occurred at 3 weeks (12.1 ± 1.3 cells/mm², p < 0.001 when compared with normal skin). Intense α-smooth muscle actin labeling in myofibroblasts was evident at 7 and 10 days. Vascular endothelial growth factor receptor-2 and vascular endothelial growth factor-A were detectable until 2 and 3 weeks, respectively, while transforming growth factor-β1 protein was detectable in endothelial cells and myofibroblasts until 3-4 weeks and in the extracellular matrix for 16 weeks. Incisional wound granulation tissue largely developed within 3-7 days in the presence of hypoxia. Remodeling, marked by a decline in the percent vascular volume and increased cellular apoptosis, occurred largely in the absence of detectable hypoxia. The expression of vascular endothelial growth factor-A, vascular endothelial growth factor receptor-2, and transforming growth factor-β1 is evident prior to, during, and after the peak of vascular volume, reflecting multiple roles for these factors during wound healing.
Abstract:
Glucocorticoid hormones are critical for the response and adaptation to stress. Genetic variations in the glucocorticoid receptor (GR) gene alter hypothalamic-pituitary-adrenal (HPA) axis activity and associate with hypertension and susceptibility to metabolic disease. Here we test the hypothesis that reduced GR density alters blood pressure and glucose and lipid homeostasis and limits adaptation to an obesogenic diet. Heterozygous GRβgeo/+ mice were generated from embryonic stem (ES) cells with a gene trap integration of a β-galactosidase-neomycin phosphotransferase (βgeo) cassette into the GR gene, creating a transcriptionally inactive GR fusion protein. Although GRβgeo/+ mice have 50% less functional GR, they have normal lipid and glucose homeostasis due to compensatory HPA axis activation but are hypertensive due to activation of the renin-angiotensin-aldosterone system (RAAS). When challenged with a high-fat diet, weight gain, adiposity, and glucose intolerance were similarly increased in control and GRβgeo/+ mice, suggesting preserved control of intermediary metabolism and energy balance. However, whereas a high-fat diet caused HPA activation and increased blood pressure in control mice, these adaptations were attenuated or abolished in GRβgeo/+ mice. Thus, reduced GR density balanced by HPA activation leaves glucocorticoid functions unaffected but mineralocorticoid functions increased, causing hypertension. Importantly, reduced GR limits HPA and blood pressure adaptations to an obesogenic diet.
Abstract:
PURPOSE Every health care sector, including hospice/palliative care, needs to systematically improve services using patient-defined outcomes. This study used data from the national Australian Palliative Care Outcomes Collaboration to determine whether hospice/palliative care patients' outcomes, and the consistency of those outcomes, have improved over the last 3 years. METHODS Data were analysed by clinical phase (stable, unstable, deteriorating, terminal). Patient-level data included the Symptom Assessment Scale and the Palliative Care Problem Severity Score. Nationally collected point-of-care data were anchored for the period July-December 2008 and subsequently compared to this baseline in six 6-month reporting cycles for all services that submitted data in every time period (n = 30), using individual longitudinal multi-level random coefficient models. RESULTS Data were analysed for 19,747 patients (46% female; 85% cancer; 27,928 episodes of care; 65,463 phases). There were significant improvements across all domains (symptom control, family care, psychological and spiritual care) except pain. Simultaneously, the interquartile ranges decreased, jointly indicating that better and more consistent patient outcomes were being achieved. CONCLUSION These are the first national hospice/palliative care symptom control performance data to demonstrate improvements in clinical outcomes at a service level as a result of routine data collection and systematic feedback.
Abstract:
BACKGROUND: Over the past 10 years, the use of saliva as a diagnostic fluid has gained attention and has become a translational research success story. Some of the current nanotechnologies have been demonstrated to have the analytical sensitivity required for the use of saliva as a diagnostic medium to detect and predict disease progression. However, these technologies have not yet been integrated into current clinical practice and workflow. CONTENT: As a diagnostic fluid, saliva offers advantages over serum because it can be collected noninvasively by individuals with modest training, and it offers a cost-effective approach for the screening of large populations. Gland-specific saliva can also be used for diagnosis of pathology specific to one of the major salivary glands. There is minimal risk of contracting infections during saliva collection, and saliva can be used in clinically challenging situations, such as obtaining samples from children or handicapped or anxious patients, in whom blood sampling could be a difficult act to perform. In this review we highlight the production and secretion of saliva, the salivary proteome, the transportation of biomolecules from blood capillaries to salivary glands, and the diagnostic potential of saliva for use in detection of cardiovascular disease and oral and breast cancers. We also highlight the barriers to application of saliva testing and its advancement in clinical settings. SUMMARY: Saliva has the potential to become a first-line diagnostic sample of choice owing to the advancements in detection technologies coupled with combinations of biomolecules with clinical relevance. © 2011 American Association for Clinical Chemistry
Abstract:
Human saliva harbours proteins of clinical relevance, and about 30% of blood proteins are also present in saliva. This highlights that saliva can be used for clinical applications just as urine or blood are. However, the translation of salivary biomarker discoveries into clinical settings is hampered by the dynamics and complexity of the salivary proteome. This review focuses on the current status of technological developments and achievements relating to approaches for unravelling the human salivary proteome. We discuss the dynamics of the salivary proteome; the importance of sample preparation and processing techniques and their influence on downstream protein applications; post-translational modifications of the salivary proteome; and protein-protein interactions. In addition, we describe possible enrichment strategies for discerning post-translational modifications of salivary proteins, the potential utility of selected-reaction-monitoring techniques for biomarker discovery and validation, limitations of proteomics, the biomarker challenge, and future perspectives. In summary, we provide recommendations for practical saliva sampling, processing and storage conditions to increase the quality of future studies in the emerging field of saliva clinical proteomics. We propose that the advent of technologies allowing sensitive and high-throughput proteome-wide analyses, coupled with well-controlled study design, will allow saliva to enter clinical practice as an alternative to blood-based methods, owing to its simple, non-invasive sampling, the ease of repeated collection by minimally trained personnel, and its cost advantages.
Abstract:
Blood metaphors abound in everyday social discourse among both Aboriginal and non-Aboriginal people. However, ‘Aboriginal blood talk’, more specifically, is located within a contradictory and contested space in terms of the meanings and values that can be attributed to it by Aboriginal and non-Aboriginal people. In the colonial context, blood talk operated as a tool of oppression for Aboriginal people via blood quantum discourses, yet today, Aboriginal people draw upon notions of blood, namely bloodlines, in articulating their identities. This paper juxtaposes contemporary Aboriginal blood talk as expressed by Aboriginal people against colonial blood talk and critically examines the ongoing political and intellectual governance regarding the validity of this talk in articulating Aboriginalities.
Resumo:
Background Historically, the paper hand-held record (PHR) has been used for sharing information between hospital clinicians, general practitioners and pregnant women in a maternity shared-care environment. Recently, in alignment with a National e-health agenda, an electronic health record (EHR) was introduced at an Australian tertiary maternity service to replace the PHR for collection and transfer of data. The aim of this study was to examine and compare the completeness of clinical data collected in a PHR and an EHR. Methods We undertook a comparative cohort design study to determine differences in completeness between data collected from maternity records in two phases. Phase 1 data were collected from the PHR and Phase 2 data from the EHR. Records were compared for completeness of the best practice variables collected. The primary outcome was the presence of best practice variables and the secondary outcomes were the differences in individual variables between the records. Results Ninety-four percent of paper medical charts were available in Phase 1 and 100% of records from an obstetric database in Phase 2. No PHR or EHR had a complete dataset of best practice variables. The variables with significant improvement in completeness of data documented in the EHR, compared with the PHR, were urine culture, glucose tolerance test, nuchal screening, morphology scans, folic acid advice, tobacco smoking, illicit drug assessment and domestic violence assessment (p = 0.001). Additionally, the documentation of immunisations (pertussis, hepatitis B, varicella, Fluvax) was markedly improved in the EHR (p = 0.001). The variables of blood pressure, proteinuria, blood group, antibody, rubella and syphilis status showed no significant differences in completeness of recording. Conclusion This is the first paper to report on the comparison of clinical data collected on a PHR and EHR in a maternity shared-care setting.
The use of an EHR demonstrated significant improvements in the collection of best practice variables. Additionally, the data in an EHR were more readily available to relevant clinical staff with the appropriate log-in, and more easily retrieved, than data in the PHR. This study contributes to an under-researched area: determining the quality of data collected in patient records.
Abstract:
BACKGROUND Ongoing shortages of blood products may be addressed through additional donations. However, donation frequency rates are typically lower than medically possible. This preliminary study aims to determine voluntary nonremunerated whole blood (WB) and plasmapheresis donors' willingness, and the associated facilitators and barriers, to make additional donations of a different type. STUDY DESIGN AND METHODS Forty individual telephone interviews were conducted posing two scenarios for additional donations: making a single plasmapheresis donation between WB donations, and making multiple plasmapheresis donations between WB donations. Stratified purposive sampling was conducted for four samples varying in donation experience: no-plasma, new-to-both-WB-and-plasma, new-to-plasma, and plasma donors. Interviews were analyzed, yielding excellent inter-rater reliability (κ > 0.81). RESULTS Facilitators were endorsed more than barriers for a single but not for multiple plasmapheresis donations. More new-to-both donors (n = 5) were willing to make multiple plasma donations between WB donations than donors in the other groups (n = 1 each), and they identified fewer barriers (n = 3) than those more experienced in donation (n = 8 no plasma, n = 10 new to both, n = 11 plasma). Donors in the plasma sample were concerned about the reduced time between plasma donations that results from adding WB donations (n = 3). The no-plasma and new-to-plasma donors were concerned about the time commitment required (n = 3). CONCLUSION Current donors are willing to add different product donations, but donation history influences their willingness to change. Early introduction of multiple donation types, variation in inventory levels, and addressing barriers will provide blood collection agencies with a novel and cost-effective inventory management strategy.
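The inter-rater reliability reported in that abstract is Cohen's kappa, which corrects raw percentage agreement between two coders for the agreement expected by chance. A minimal sketch, with made-up interview codes purely for illustration (not the study's data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(rater_a)
    # Observed proportion of items on which the two raters agree
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected if each rater assigned codes independently,
    # in proportion to their own marginal code frequencies
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical codes from two coders for five interview excerpts
a = ["facilitator", "facilitator", "barrier", "barrier", "facilitator"]
b = ["facilitator", "facilitator", "barrier", "facilitator", "facilitator"]
print(round(cohens_kappa(a, b), 2))  # prints 0.55
```

By the common Landis and Koch convention, κ values above 0.81 (as reported in the study) indicate almost perfect agreement.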
Abstract:
This column provides a summary of the recent decision of The Hospital v T [2015] QSC 185.
Abstract:
Background The growing awareness of transfusion-associated morbidity and mortality necessitates investigations into the underlying mechanisms. Small animals have been the dominant transfusion model but have associated limitations. This study aimed to develop a comprehensive large animal (ovine) model of transfusion encompassing blood collection, processing and storage, and compatibility testing, right through to post-transfusion outcomes. Materials and methods Two units of blood were collected from each of 12 adult male Merino sheep and processed into 24 ovine packed red blood cell (PRBC) units. Baseline haematological parameters of ovine whole blood and PRBCs were analysed. Biochemical changes in ovine PRBCs were characterized during the 42-day storage period. Immunological compatibility of the blood was confirmed with sera from potential recipient sheep, using a saline and albumin agglutination cross-match. Following confirmation of compatibility, each recipient sheep (n = 12) was transfused with two units of ovine PRBC. Results Procedures for collecting, processing, cross-matching and transfusing ovine blood were established. Although ovine red blood cells are smaller and more numerous, their mean cell haemoglobin concentration is similar to that of human red blood cells. Ovine PRBC showed improved storage properties in saline-adenine-glucose-mannitol (SAG-M) compared with previous human PRBC studies. Seventy-six compatibility tests were performed and 17.1% were incompatible. Only cross-match-compatible ovine PRBC were transfused and no adverse reactions were observed. Conclusion These findings demonstrate the utility of the ovine model for future blood transfusion studies and highlight the importance of compatibility testing in animal models involving homologous transfusions.
Abstract:
Donors are the key to the core business of Blood Collection Agencies (BCAs). However, historically, they have not been a focus of research undertaken by these organizations. This model is now changing, with significant donor research groups established in a number of countries, including Australia. Donor research in the Australian Red Cross Blood Service (Blood Service) is concentrated in the Donor and Community Research (DCR) team. Cognizant of the complex and ever-changing landscape of optimal donor management, the DCR team collaborates with academics located at universities around Australia to coordinate a broad program of research that addresses both short- and long-term challenges to the blood supply. This type of collaboration is not, however, without challenges. Two major collaborative programs of the Blood Service's research, focusing on i) the recruitment and retention of plasmapheresis donors and ii) the role of the emotion of pride in donor motivation and return, are showcased to elucidate how the challenges of conducting collaborative BCA research can be met. In so doing, these and the other research programs described herein demonstrate how the Blood Service supports and contributes to research that not only revises operational procedures but also contributes to advances in basic science.
Abstract:
Vitamin D is synthesised in the skin through the action of UVB radiation (sunlight), and 25-hydroxyvitamin D (25OHD) is measured in serum as a marker of vitamin D status. Several studies, mostly conducted at high latitudes, have shown an association between type 1 diabetes mellitus (T1DM) and low serum 25OHD. We conducted a case-control study to determine whether, in a sub-tropical environment with abundant sunlight (latitude 27.5°S), children with T1DM have lower serum vitamin D than children without diabetes. Fifty-six children with T1DM (14 newly diagnosed) and 46 unrelated control children participated in the study. Serum 25OHD, 1,25-dihydroxyvitamin D (1,25(OH)2D) and selected biochemical indices were measured. Vitamin D receptor (VDR) polymorphisms TaqI, FokI, and ApaI were genotyped. Fitzpatrick skin classification, self-reported daily hours of outdoor exposure, and mean UV index over the 35 days prior to blood collection were recorded. Serum 25OHD was lower in children with T1DM (n=56) than in controls (n=46) [mean (95% CI)=78.7 (71.8-85.6) nmol/L vs. 91.4 (83.5-98.7) nmol/L, p=0.02]. T1DM children had lower self-reported outdoor exposure and mean UV exposure, but no significant difference in distribution of VDR polymorphisms. 25OHD remained lower in children with T1DM after covariate adjustment. Children newly diagnosed with T1DM had lower 1,25(OH)2D [median (IQR)=89 (68-122) pmol/L] than controls [121 (108-159) pmol/L, p=0.03] or children with established diabetes [137 (113-153) pmol/L, p=0.01]. Children with T1DM have lower 25OHD than controls, even in an environment of abundant sunlight. Whether low vitamin D is a risk factor for or a consequence of T1DM is unknown. © 2012 John Wiley & Sons A/S.
Abstract:
Background Guidelines and clinical practice for the prevention of complications associated with central venous catheters (CVC) around the world vary greatly. Most institutions recommend the use of heparin to prevent occlusion; however, there is debate regarding the need for heparin, and evidence suggests 0.9% sodium chloride (normal saline) may be as effective. The use of heparin is not without risk, may be unnecessary and is also associated with increased cost. Objectives To assess the clinical effects (benefits and harms) of intermittent flushing of heparin versus normal saline to prevent occlusion in long-term central venous catheters in infants and children. Search Methods The Cochrane Vascular Trials Search Co-ordinator searched the Specialised Register (last searched April 2015) and the Cochrane Register of Studies (Issue 3, 2015). We also searched the reference lists of retrieved trials. Selection Criteria Randomised controlled trials that compared the efficacy of normal saline with heparin to prevent occlusion of long-term CVCs in infants and children up to 18 years of age were included. We excluded temporary CVCs and peripherally inserted central catheters (PICC). Data Collection and Analysis Two review authors independently assessed trial inclusion criteria, assessed trial quality and extracted data. Rate ratios were calculated for two outcome measures: occlusion of the CVC and central line-associated bloodstream infection. Other outcome measures included duration of catheter placement, inability to withdraw blood from the catheter, use of urokinase or recombinant tissue plasminogen activator, incidence of removal or re-insertion of the catheter, or both, and other CVC-related complications such as dislocation of CVCs, other CVC site infections and thrombosis. Main Results Three trials with a total of 245 participants were included in this review.
The three trials directly compared the use of normal saline and heparin; however, the studies all used different protocols for the standard and experimental arms, with different concentrations of heparin and different flushing frequencies reported. In addition, not all studies reported on all outcomes. The quality of the evidence ranged from low to very low because there was no blinding, heterogeneity and inconsistency between studies were high, and the confidence intervals were wide. CVC occlusion was assessed in all three trials (243 participants). We were able to pool the results of two trials for the outcomes of CVC occlusion and CVC-associated bloodstream infection. The estimated rate ratio for CVC occlusion per 1000 catheter days between the normal saline and heparin group was 0.75 (95% CI 0.10 to 5.51, two studies, 229 participants, very low quality evidence). The estimated rate ratio for CVC-associated bloodstream infection was 1.48 (95% CI 0.24 to 9.37, two studies, 231 participants; low quality evidence). The duration of catheter placement was reported to be similar between the two study arms in one study (203 participants). Authors' Conclusions The review found that there was not enough evidence to determine the effects of intermittent flushing of heparin versus normal saline to prevent occlusion in long-term central venous catheters in infants and children. Ultimately, if this evidence were available, the development of evidence-based clinical practice guidelines and consistency of practice would be facilitated.
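The rate ratios quoted in that review are simply ratios of event rates per catheter-day between the two arms, and the wide confidence intervals follow directly from the small event counts. A minimal sketch using a Wald interval on the log scale, with hypothetical counts chosen purely for illustration (not the trial data):

```python
import math

def rate_ratio_ci(events_a, days_a, events_b, days_b, z=1.96):
    """Incidence rate ratio (arm A vs arm B) with a Wald 95% CI on the log scale."""
    rr = (events_a / days_a) / (events_b / days_b)
    # Approximate standard error of log(RR) for Poisson event counts
    se = math.sqrt(1 / events_a + 1 / events_b)
    return rr, rr * math.exp(-z * se), rr * math.exp(z * se)

# Hypothetical: 6 occlusions over 4000 catheter-days (saline arm)
# vs 8 occlusions over 4000 catheter-days (heparin arm)
rr, lo, hi = rate_ratio_ci(6, 4000, 8, 4000)
print(round(rr, 2), round(lo, 2), round(hi, 2))  # 0.75 0.26 2.16
```

Even with a point estimate favouring saline, an interval this wide spans 1 by a large margin, which is consistent with the review's grading of the evidence as low to very low quality.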