259 results for Complete blood count
Abstract:
Blood metaphors abound in everyday social discourse among both Aboriginal and non-Aboriginal people. However, ‘Aboriginal blood talk’, more specifically, is located within a contradictory and contested space in terms of the meanings and values that can be attributed to it by Aboriginal and non-Aboriginal people. In the colonial context, blood talk operated as a tool of oppression for Aboriginal people via blood quantum discourses, yet today, Aboriginal people draw upon notions of blood, namely bloodlines, in articulating their identities. This paper juxtaposes contemporary Aboriginal blood talk as expressed by Aboriginal people against colonial blood talk and critically examines the ongoing political and intellectual governance regarding the validity of this talk in articulating Aboriginalities.
Abstract:
The gross under-resourcing of conservation endeavours has placed an increasing emphasis on spending accountability. Increased accountability has led to monitoring forming a central element of conservation programs. Although there is little doubt that information obtained from monitoring can improve management of biodiversity, the cost (in time and/or money) of gaining this knowledge is rarely considered when making decisions about allocation of resources to monitoring. We present a simple framework allowing managers and policy advisors to make decisions about when to invest in monitoring to improve management. © 2010 Elsevier Ltd.
Abstract:
Polybrominated diphenyl ethers (PBDEs) are considered a cost-effective and efficient way to reduce flammability, thereby reducing harm caused by fires. PBDEs are incorporated into a variety of manufactured products and are found worldwide in biological and environmental samples (e.g. Hites et al. 2004). Unlike other persistent organic pollutants, there are limited data on PBDE concentrations by age and/or other population-specific factors. Some studies have shown no variation in adult serum PBDE concentrations with age (e.g. Mazdai et al., 2003; Meironyte Guvenius et al., 2003), while Petreas et al. (2003) and Schecter et al. (2005) found results suggestive of an age trend in adult data, although no statistically significant correlation was found. In addition to the data on adult concentrations, there are limited data investigating the levels of PBDEs in infants and young children. Fangström et al. (2005) showed that PBDE concentrations in seven-year-olds did not differ from adult concentrations, while Thomsen et al. (2002, 2005) found the concentration of PBDEs in pooled blood serum samples from a 0-4 years age group to be higher than in other age groups (4 to >60 years). In addition, a family of four was studied in the U.S., and concentrations were found to be greatest in the 18-month-old infant, followed by the 5-year-old child, then the mother and father (Fischer et al., 2006). The objectives of this study were to assess age, gender and regional trends in PBDE concentrations in a representative sample of the Australian population.
Abstract:
Polybrominated diphenyl ethers (PBDEs) are widely used as flame retardants in polymer materials, textiles, electronic boards and various other materials. Technical PBDE preparations are produced as mixtures of mainly penta-, octa- or decabromodiphenyl ethers [1,2]. PBDEs are structurally similar to other environmental pollutants such as dioxins and PCBs; they are lipophilic, persistent compounds and are widespread in the environment. To date, no information is available on the levels of PBDEs in human serum in Australia. In 2003, more than 9000 blood samples were collected in Australia as part of the National Dioxins Program. The aim of this study was to evaluate PBDE concentrations in these samples, focusing on one age group.
Abstract:
Polybrominated diphenyl ethers (PBDEs), a common class of brominated flame retardants, are a ubiquitous part of our built environment, and for many years have contributed to improved public safety by reducing the flammability of everyday goods. Recently, PBDEs have come under increased international attention because of their potential to impact upon the environment and human health. Some PBDE compounds have been nominated for possible inclusion on the Stockholm Convention on Persistent Organic Pollutants, to which Australia is a Party. Work under the Stockholm Convention has demonstrated the capacity of some PBDEs to persist and accumulate in the environment and to be carried long distances. Much is unknown about the impact of PBDEs on living organisms; however, recent studies show that some PBDEs can inhibit growth in colonies of plankton and algae and depress the reproduction of zooplankton. Laboratory mice and rats have also shown liver disturbances and damage to developing nervous systems as a result of exposure to PBDEs. In 2004, the Australian Government Department of the Environment and Water Resources began three studies to examine levels of PBDEs in aquatic sediments, indoor environments and human blood, as knowledge about PBDEs in Australia was very limited. The aim of these studies was to improve this knowledge base so that governments would be in a better position to consider appropriate management actions. Due to the high costs of laboratory analysis of PBDEs, the number of samples collected for each study was limited, and so caution is required when interpreting the findings. Nevertheless, these studies will provide governments with an indication of how prevalent PBDEs are in the Australian population and the environment and will also contribute to international knowledge about these chemicals.
The Department of the Environment and Water Resources will be working closely with other government agencies, industry and the community to investigate any further action that may be required to address PBDEs in Australia.
Abstract:
Background Historically, the paper hand-held record (PHR) has been used for sharing information between hospital clinicians, general practitioners and pregnant women in a maternity shared-care environment. Recently, in alignment with a national e-health agenda, an electronic health record (EHR) was introduced at an Australian tertiary maternity service to replace the PHR for collection and transfer of data. The aim of this study was to examine and compare the completeness of clinical data collected in a PHR and an EHR. Methods We undertook a comparative cohort design study to determine differences in completeness between data collected from maternity records in two phases. Phase 1 data were collected from the PHR and Phase 2 data from the EHR. Records were compared for completeness of best practice variables collected. The primary outcome was the presence of best practice variables, and the secondary outcomes were the differences in individual variables between the records. Results Ninety-four percent of paper medical charts were available in Phase 1 and 100% of records from an obstetric database in Phase 2. No PHR or EHR had a complete dataset of best practice variables. The variables with significant improvement in completeness of data documented in the EHR, compared with the PHR, were urine culture, glucose tolerance test, nuchal screening, morphology scans, folic acid advice, tobacco smoking, illicit drug assessment and domestic violence assessment (p = 0.001). Additionally, the documentation of immunisations (pertussis, hepatitis B, varicella, fluvax) was markedly improved in the EHR (p = 0.001). The variables of blood pressure, proteinuria, blood group, antibody, rubella and syphilis status showed no significant differences in completeness of recording. Conclusion This is the first paper to report on the comparison of clinical data collected on a PHR and EHR in a maternity shared-care setting.
The use of an EHR demonstrated significant improvements to the collection of best practice variables. Additionally, the data in an EHR were more available to relevant clinical staff with the appropriate log-in and more easily retrieved than from the PHR. This study contributes to an under-researched area of determining data quality collected in patient records.
Abstract:
Background Little is known about the relation between vitamin D status in early life and neurodevelopment outcomes. Objective This study was designed to examine the association of cord blood 25-hydroxyvitamin D [25(OH)D] at birth with neurocognitive development in toddlers. Methods As part of the China-Anhui Birth Cohort Study, 363 mother-infant pairs with completed data were selected. Concentrations of 25(OH)D in cord blood were measured by radioimmunoassay. Mental development index (MDI) and psychomotor development index (PDI) in toddlers were assessed at age 16–18 mo by using the Bayley Scales of Infant Development. The data on maternal sociodemographic characteristics and other confounding factors were also prospectively collected. Results Toddlers in the lowest quintile of cord blood 25(OH)D exhibited a deficit of 7.60 (95% CI: −12.4, −2.82; P = 0.002) and 8.04 (95% CI: −12.9, −3.11; P = 0.001) points in the MDI and PDI scores, respectively, compared with the reference category. Unexpectedly, toddlers in the highest quintile of cord blood 25(OH)D also had a significant deficit of 12.3 (95% CI: −17.9, −6.67; P < 0.001) points in PDI scores compared with the reference category. Conclusions This prospective study suggested that there was an inverted-U–shaped relation between neonatal vitamin D status and neurocognitive development in toddlers. Additional studies on the optimal 25(OH)D concentrations in early life are needed.
Abstract:
In this paper we present a new method for performing Bayesian parameter inference and model choice for low-count time series models with intractable likelihoods. The method incorporates an alive particle filter within a sequential Monte Carlo (SMC) algorithm to create a novel pseudo-marginal algorithm, which we refer to as alive SMC^2. The advantages of this approach over competing approaches are that it is naturally adaptive, it does not involve the between-model proposals required in reversible jump Markov chain Monte Carlo, and it does not rely on potentially rough approximations. The algorithm is demonstrated on Markov process and integer autoregressive moving average models applied to real biological datasets of hospital-acquired pathogen incidence, animal health time series and the cumulative number of prion disease cases in mule deer.
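The alive particle filter the abstract builds on can be sketched for the simplest case of an exactly observed low-count process. The function below is an illustrative toy, not code from the paper: it assumes exact matching with the observation as the "alive" criterion and uses the standard negative-binomial correction that keeps the likelihood estimator unbiased.

```python
import math
import random

def alive_particle_filter(y, n_particles, init, step):
    """Toy alive particle filter for a discrete-valued state-space model
    observed without noise. A propagated particle is 'alive' at time t
    only if its simulated value equals the observation y[t]. Propagation
    continues until n_particles + 1 alive particles are collected; the
    estimate n_particles / (trials - 1) of the matching probability makes
    the overall likelihood estimate unbiased.

    Note: this loops indefinitely if a match is impossible under the model.
    """
    loglike = 0.0
    particles = [init() for _ in range(n_particles)]
    for obs in y:
        alive = []
        trials = 0
        # Resample a surviving particle and propagate it until we have
        # n_particles + 1 alive particles for this time step.
        while len(alive) < n_particles + 1:
            trials += 1
            x = step(random.choice(particles))
            if x == obs:
                alive.append(x)
        loglike += math.log(n_particles / (trials - 1))
        particles = alive[:n_particles]
    return loglike
```

Within the pseudo-marginal scheme described in the abstract, an estimate like this would replace the intractable likelihood inside the SMC sampler; `init` and `step` are hypothetical placeholders for a model's initial and transition simulators.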
Numerical investigation of motion and deformation of a single red blood cell in a stenosed capillary
Abstract:
It is generally assumed that the influence of red blood cells (RBCs) is predominant in blood rheology. Healthy RBCs are highly deformable and can easily squeeze through the smallest capillaries, whose internal diameters are less than the cells' characteristic size. On the other hand, RBCs infected by malaria or other diseases are stiffer and less deformable, so it is harder for them to flow through the smallest capillaries. It is therefore important to investigate the mechanical behavior of both healthy and infected RBCs critically and realistically, which remains a gap in current knowledge. The motion and the steady-state deformed shape of an RBC depend on many factors, such as the geometrical parameters of the capillary through which blood flows, the membrane bending stiffness and the mean velocity of the blood flow. In this study, the motion and deformation of a single two-dimensional RBC in a stenosed capillary are explored using the smoothed particle hydrodynamics (SPH) method. An elastic spring network is used to model the RBC membrane, while the fluids inside and outside the RBC are treated as SPH particles. The effects of the RBC's membrane stiffness (kb), the inlet pressure (P) and the geometrical parameters of the capillary on the motion and deformation of the RBC are studied. The deformation index, the RBC's mean velocity and the cell membrane energy are analyzed as the cell passes through the stenosed capillary. The simulation results demonstrate that kb, P and the geometrical parameters of the capillary have a significant impact on the RBC's motion and deformation in the stenosed section.
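The elastic spring network mentioned in the abstract can be illustrated with a minimal two-dimensional Hookean stretch term. This is a toy sketch under stated assumptions, not the study's model: it covers only the linear stretching contribution between neighbouring membrane nodes on a closed ring, omitting the bending-stiffness (kb) term and the SPH fluid coupling.

```python
import math

def spring_forces(nodes, k_s, rest_len):
    """Stretching forces for a closed 2-D spring-network membrane
    (illustrative helper, not from the paper). Each node is linked to
    its ring neighbour by a linear spring; the force on node i is the
    sum of Hookean contributions F = k_s * (|d| - rest_len) * d/|d|.
    """
    n = len(nodes)
    forces = [[0.0, 0.0] for _ in range(n)]
    for i in range(n):
        j = (i + 1) % n  # neighbour in the closed ring
        dx = nodes[j][0] - nodes[i][0]
        dy = nodes[j][1] - nodes[i][1]
        dist = math.hypot(dx, dy)
        # Tension pulls the pair together if stretched, pushes apart if
        # compressed; equal and opposite on the two endpoints.
        f = k_s * (dist - rest_len)
        fx, fy = f * dx / dist, f * dy / dist
        forces[i][0] += fx
        forces[i][1] += fy
        forces[j][0] -= fx
        forces[j][1] -= fy
    return forces
```

In an SPH simulation of the kind the abstract describes, forces like these would be added to the momentum equation of the membrane particles at every time step; a bending-energy term in kb would be accumulated alongside them.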
Abstract:
BACKGROUND Quantification of the disease burden caused by different risks informs prevention by providing an account of health loss different to that provided by a disease-by-disease analysis. No complete revision of global disease burden caused by risk factors has been done since a comparative risk assessment in 2000, and no previous analysis has assessed changes in burden attributable to risk factors over time. METHODS We estimated deaths and disability-adjusted life years (DALYs; sum of years lived with disability [YLD] and years of life lost [YLL]) attributable to the independent effects of 67 risk factors and clusters of risk factors for 21 regions in 1990 and 2010. We estimated exposure distributions for each year, region, sex, and age group, and relative risks per unit of exposure by systematically reviewing and synthesising published and unpublished data. We used these estimates, together with estimates of cause-specific deaths and DALYs from the Global Burden of Disease Study 2010, to calculate the burden attributable to each risk factor exposure compared with the theoretical-minimum-risk exposure. We incorporated uncertainty in disease burden, relative risks, and exposures into our estimates of attributable burden. FINDINGS In 2010, the three leading risk factors for global disease burden were high blood pressure (7·0% [95% uncertainty interval 6·2-7·7] of global DALYs), tobacco smoking including second-hand smoke (6·3% [5·5-7·0]), and alcohol use (5·5% [5·0-5·9]). In 1990, the leading risks were childhood underweight (7·9% [6·8-9·4]), household air pollution from solid fuels (HAP; 7·0% [5·6-8·3]), and tobacco smoking including second-hand smoke (6·1% [5·4-6·8]). Dietary risk factors and physical inactivity collectively accounted for 10·0% (95% UI 9·2-10·8) of global DALYs in 2010, with the most prominent dietary risks being diets low in fruits and those high in sodium. 
Several risks that primarily affect childhood communicable diseases, including unimproved water and sanitation and childhood micronutrient deficiencies, fell in rank between 1990 and 2010, with unimproved water and sanitation accounting for 0·9% (0·4-1·6) of global DALYs in 2010. However, in most of sub-Saharan Africa childhood underweight, HAP, and non-exclusive and discontinued breastfeeding were the leading risks in 2010, while HAP was the leading risk in south Asia. The leading risk factor in Eastern Europe, most of Latin America, and southern sub-Saharan Africa in 2010 was alcohol use; in most of Asia, North Africa and Middle East, and central Europe it was high blood pressure. Despite declines, tobacco smoking including second-hand smoke remained the leading risk in high-income north America and western Europe. High body-mass index has increased globally and it is the leading risk in Australasia and southern Latin America, and also ranks high in other high-income regions, North Africa and Middle East, and Oceania. INTERPRETATION Worldwide, the contribution of different risk factors to disease burden has changed substantially, with a shift away from risks for communicable diseases in children towards those for non-communicable diseases in adults. These changes are related to the ageing population, decreased mortality among children younger than 5 years, changes in cause-of-death composition, and changes in risk factor exposures. New evidence has led to changes in the magnitude of key risks including unimproved water and sanitation, vitamin A and zinc deficiencies, and ambient particulate matter pollution. The extent to which the epidemiological shift has occurred and what the leading risks currently are varies greatly across regions. In much of sub-Saharan Africa, the leading risks are still those associated with poverty and those that affect children.
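The attribution step described in the methods, comparing the observed exposure distribution with the theoretical-minimum-risk exposure, follows standard comparative risk assessment arithmetic. The sketch below uses the generic population attributable fraction formula with hypothetical numbers; it is not code or data from the study.

```python
def attributable_burden(exposure, counterfactual, rr, total_dalys):
    """Generic comparative-risk-assessment sketch (illustrative, with
    hypothetical inputs). The burden attributable to a risk factor is
    the population attributable fraction (PAF) times the total DALYs
    for the outcome.

    exposure / counterfactual: dicts mapping exposure level -> prevalence
    rr: dict mapping exposure level -> relative risk at that level
    """
    # Average relative risk under the observed exposure distribution
    # and under the theoretical-minimum-risk (counterfactual) one.
    observed = sum(p * rr[level] for level, p in exposure.items())
    minimum = sum(p * rr[level] for level, p in counterfactual.items())
    paf = (observed - minimum) / observed
    return paf * total_dalys
```

For example, if half a population is exposed at a level carrying a relative risk of 2.0 and the counterfactual is zero exposure, a third of the outcome's DALYs are attributed to the risk factor.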
Abstract:
Clinical studies have demonstrated an impairment of glucocorticoid receptor (GR)-mediated negative feedback on the hypothalamic-pituitary-adrenal (HPA) axis in patients with major depression (GR resistance), and its resolution by antidepressant treatment. Recently, we showed that this impairment is indeed due to a dysfunction of the GR in depressed patients (Carvalho et al., 2009), and that the ability of the antidepressant clomipramine to decrease GR function in peripheral blood cells is impaired in patients with major depression who are clinically resistant to treatment (Carvalho et al., 2008). To further investigate the effect of antidepressants on GR function in humans, we compared the effects of the antidepressants clomipramine, amitriptyline, sertraline, paroxetine and venlafaxine, and of the antipsychotics haloperidol and risperidone, on GR function in peripheral blood cells from healthy volunteers (n=33). GR function was measured by glucocorticoid inhibition of lipopolysaccharide (LPS)-stimulated interleukin-6 (IL-6) levels. Compared with vehicle-treated cells, all antidepressants reduced dexamethasone (DEX, 10-100 nM) inhibition of LPS-stimulated IL-6 levels (p values ranging from 0.007 to 0.1). This effect was specific to antidepressants, as antipsychotics had no effect on DEX inhibition of LPS-stimulated IL-6 levels. The phosphodiesterase (PDE) type 4 inhibitor rolipram potentiated the effect of antidepressants on GR function, while the GR antagonist RU-486 inhibited it. These findings indicate that the effect of antidepressants on GR function is specific to this class of psychotropic drugs and involves second-messenger pathways relevant to GR function and inflammation. They also point towards a possible mechanism by which one may be able to overcome treatment-resistant depression. Research in this field will lead to new insights into the pathophysiology and treatment of affective disorders.
Abstract:
AIM To investigate the number of hypertensive patients the optometrist is able to identify by routinely taking blood pressure (BP) measurements for patients in "at-risk" groups, and to sample patients' opinions regarding in-office BP measurement. Many optometrists in Saudi Arabia practice in optical stores. These stores are widespread, easily accessible and seldom require appointments. The expanding role of the optometrist as a primary health care provider (PHCP) and the increasing global prevalence of hypertension highlight the need for an integrated approach towards detecting and monitoring hypertension. METHODS Automated BP measurements were taken twice (during the same session) at five selected optometry practices using a validated BP monitor (Omron M6) to assess the number of patients with high BP (HBP), in at-risk groups, visiting the eye clinic routinely. Prior to data collection, practitioners underwent a two-day training workshop run by a cardiologist on hypertension and how to obtain accurate BP readings. A protocol for BP measurement was distributed to and retained in all participating clinics. The general attitude towards cardiovascular health of 480 patients aged 37.2 (±12.4) years, and their opinion of in-office BP measurement, was assessed using a self-administered questionnaire. RESULTS A response rate of 83.6% was obtained for the survey. Ninety-three of the 443 patients (21.0%) tested for BP in this study had HBP. Of these, 67.7% (62 subjects) were unaware of their HBP status. Thirty of the 105 subjects (28.6%) who had previously been diagnosed with HBP still had HBP at the time of this study, and 22 (73.3%) of these patients were on medication. Also, only 25% of the diagnosed hypertensive patients owned a BP monitor. CONCLUSION By taking BP measurements in optometry practices, we were able to identify one previously undiagnosed patient with HBP for every 8 adults tested.
We also identified 30 of 105 previously diagnosed patients whose BP was poorly controlled, twenty-two of whom were on medication. The patients who participated in this study were positively disposed toward the routine measurement of BP by optometrists.
Abstract:
Summary This manual was developed to guide a move towards common standards for undertaking and reporting research microscopy for malaria parasite detection, identification and quantification. It contains procedures based on agreed quality assurance standards for research malaria microscopy defined at a consultation of: TDR, the Special Programme for Research and Training in Tropical Diseases; the Worldwide Antimalarial Resistance Network (WWARN), United Kingdom; the Foundation for Innovative New Diagnostics (FIND), Switzerland; the Centers for Disease Control and Prevention (CDC), USA; the Kenya Medical Research Institute (KEMRI) and later expanded to include Amref Health Africa (Kenya); the Eijkman-Oxford Clinical Research Unit (EOCRU), Indonesia; Institut Pasteur du Cambodge (IPC); Institut de recherche pour le Développement (IRD), Senegal; the Global Good and Intellectual Ventures Laboratory (GG-IVL), USA; the Mahidol-Oxford Tropical Medicine Research Unit (MORU), Thailand; Queensland University of Technology (QUT), Australia, and the Shoklo Malaria Research Unit (SMRU), Thailand. These collaborating institutions commit to adhering to these standards in published research studies. It is hoped that they will form a solid basis for the wider adoption of standardized reference microscopy protocols for malaria research.
Abstract:
BACKGROUND Ongoing shortages of blood products may be addressed through additional donations. However, donation frequency rates are typically lower than medically possible. This preliminary study aims to determine voluntary nonremunerated whole blood (WB) and plasmapheresis donors' willingness, and subsequent facilitators and barriers, to make additional donations of a different type. STUDY DESIGN AND METHODS Forty individual telephone interviews were conducted posing two additional donation pattern scenarios: first, making a single and, second, making multiple plasmapheresis donations between WB donations. Stratified purposive sampling was conducted for four samples varying in donation experience: no-plasma, new-to-both-WB-and-plasma, new-to-plasma, and plasma donors. Interviews were analyzed yielding excellent (κ values > 0.81) inter-rater reliability. RESULTS Facilitators were more endorsed than barriers for a single but not multiple plasmapheresis donation. More new-to-both donors (n = 5) were willing to make multiple plasma donations between WB donations than others (n = 1 each) and identified fewer barriers (n = 3) than those more experienced in donation (n = 8 no plasma, n = 10 new to both, n = 11 plasma). Donors in the plasma sample were concerned about the subsequent reduced time between plasma donations by adding WB donations (n = 3). The no-plasma and new-to-plasma donors were concerned about the time commitment required (n = 3). CONCLUSION Current donors are willing to add different product donations but donation history influences their willingness to change. Early introduction of multiple donation types, variation in inventory levels, and addressing barriers will provide blood collection agencies with a novel and cost-effective inventory management strategy.