63 results for Mortality and race
Abstract:
This study aimed to assess the effect of the number of straw bales (SBs) provided on the behaviour and leg health of commercial broiler chickens. Houses containing ~23 000 broiler chickens were assigned to one of two treatments: (1) access to 30 SBs per house ('30SB') or (2) access to 45 SBs per house ('45SB'). This equated to bale densities of 1 bale/44 m² and 1 bale/29 m² of floor space within houses, respectively. Treatments were applied in one of two houses on a commercial farm and were replicated over six production cycles. Both houses had windows and were also artificially lit. Behaviour was observed in weeks 3 to 5 of the cycle. This involved observations of general behaviour and activity, gait scores (0: perfect to 5: unable to walk) and latency to lie (measured in seconds from when a bird had been encouraged to stand). Production performance and environmental parameters were also measured. SB density had no significant effect on activity levels (P>0.05) or walking ability (P>0.05). However, the average latency to lie was greater in 30SB birds compared with 45SB birds (P<0.05). The incidence of hock burn and pododermatitis, average BW at slaughter and levels of mortality and culling were unaffected by SB density (P>0.05). The results from this study suggest that increasing SB provision from 1 bale/44 m² to 1 bale/29 m² of floor space does not lead to significant improvements in the welfare of commercial broiler chickens in windowed houses.
Abstract:
Over recent years, the moral panic surrounding 'boys' underachievement' has tended to encourage crude and essentialist comparisons between all boys and all girls, and to eclipse the continuing and more profound effects on educational achievement exerted by social class and 'race'/ethnicity. While there are differences in educational achievement between working-class boys and girls, these differences are relatively minor when comparing the overall achievement levels of working-class children with those from higher, professional social class backgrounds. This paper therefore argues that researchers need to fully contextualise the gender differences in educational achievement within the overriding contexts of social class and 'race'/ethnicity. The paper provides an example of how this can be done through a case study of 11-year-old children from a Catholic, working-class area in Belfast. It shows how the children's general educational aspirations are significantly mediated by their experiences of the local area in which they live. However, the way in which the children come to experience and construct a sense of locality differs between the boys and the girls, and this, it is argued, helps to explain the more positive educational aspirations held by some of the girls compared with the boys. The paper concludes by considering the relevance of locality for understanding educational aspirations among other working-class and/or minority ethnic communities.
Abstract:
Victorian writers often claimed that the press was killing the fairy tale. In fact, it ensured the genre's popularity, bringing literary tales and folklore to the first mass readerships. Exploring penny weeklies, adult and children's monthlies, little magazines and the labour press, this innovative study is the first to combine media and fairy tale history. Bringing reading communities back into focus, Sumpter explores ingenious political uses of the fairy tale: in debates over socialism, evolution and race, and in the context of women's rights, decadence and gay culture. The book offers new insights into the popularisation of folklore and comparative science, and also recovers neglected visual material. From the fantasies of Kingsley, MacDonald and J. H. Ewing to the writings of Keir Hardie, Laurence Housman and Yeats, Sumpter reveals that the fairy tale was intimately shaped by the press, and that both were at the heart of nineteenth-century culture.
The paperback edition includes a new Preface.
Abstract:
INTRODUCTION: Acute respiratory distress syndrome (ARDS) is a common clinical syndrome with high mortality and long-term morbidity. To date there is no effective pharmacological therapy. Aspirin therapy has recently been shown to reduce the risk of developing ARDS, but the effect of aspirin on established ARDS is unknown.
METHODS: In a single large regional medical and surgical ICU between December 2010 and July 2012, all patients with ARDS were prospectively identified and demographic, clinical, and laboratory variables were recorded retrospectively. Aspirin usage, both pre-hospital and during intensive care unit (ICU) stay, was included. The primary outcome was ICU mortality. We used univariate and multivariate logistic regression analyses to assess the impact of these variables on ICU mortality.
RESULTS: In total, 202 patients with ARDS were included; 56 (28%) of these received aspirin either pre-hospital, in the ICU, or both. Using multivariate logistic regression analysis, aspirin therapy, given either before or during hospital stay, was associated with a reduction in ICU mortality (odds ratio (OR) 0.38 (0.15 to 0.96) P = 0.04). Additional factors that predicted ICU mortality for patients with ARDS were vasopressor use (OR 2.09 (1.05 to 4.18) P = 0.04) and APACHE II score (OR 1.07 (1.02 to 1.13) P = 0.01). There was no effect upon ICU length of stay or hospital mortality.
CONCLUSION: Aspirin therapy was associated with a reduced risk of ICU mortality. These data are the first to demonstrate a potential protective role for aspirin in patients with ARDS. Clinical trials to evaluate the role of aspirin as a pharmacological intervention for ARDS are needed.
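The association reported above comes from multivariate logistic regression, which this note does not reproduce; but the crude (unadjusted) odds ratio underlying such an analysis can be sketched from a 2×2 table. The helper name `odds_ratio_ci` and all counts below are hypothetical, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts only: suppose 10 of 56 aspirin users died in
# the ICU, versus 55 of 146 non-users.
print(odds_ratio_ci(10, 46, 55, 91))
```

A crude OR like this ignores confounders such as APACHE II score and vasopressor use, which is why the abstract's adjusted estimate (0.38) is the one that matters clinically.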
Abstract:
Introduction: In this cohort study, we explored the relationship between fluid balance, intradialytic hypotension and outcomes in critically ill patients with acute kidney injury (AKI) who received renal replacement therapy (RRT).
Methods: We analysed prospectively collected registry data on patients older than 16 years who received RRT for at least two days in an intensive care unit at two university-affiliated hospitals. We used multivariable logistic regression to determine the relationship between mean daily fluid balance and intradialytic hypotension, both over seven days following RRT initiation, and the outcomes of hospital mortality and RRT dependence in survivors.
Results: In total, 492 patients were included (299 male (60.8%), mean (standard deviation (SD)) age 62.9 (16.3) years); 251 (51.0%) died in hospital. Independent risk factors for mortality were mean daily fluid balance (odds ratio (OR) 1.36 per 1000 mL positive (95% confidence interval (CI) 1.18 to 1.57)), intradialytic hypotension (OR 1.14 per 10% increase in days with intradialytic hypotension (95% CI 1.06 to 1.23)), age (OR 1.15 per five-year increase (95% CI 1.07 to 1.25)), maximum sequential organ failure assessment score on days 1 to 7 (OR 1.21 (95% CI 1.13 to 1.29)), and Charlson comorbidity index (OR 1.28 (95% CI 1.14 to 1.44)); higher baseline creatinine (OR 0.98 per 10 μmol/L (95% CI 0.97 to 0.996)) was associated with lower risk of death. Of 241 hospital survivors, 61 (25.3%) were RRT dependent at discharge. The only independent risk factor for RRT dependence was pre-existing heart failure (OR 3.13 (95% CI 1.46 to 6.74)). Neither mean daily fluid balance nor intradialytic hypotension was associated with RRT dependence in survivors. Associations between these exposures and mortality were similar in sensitivity analyses accounting for immortal time bias and dichotomising mean daily fluid balance as positive or negative. In the subgroup of patients with data on pre-RRT fluid balance, fluid overload at RRT initiation did not modify the association of mean daily fluid balance with mortality.
Conclusions: In this cohort of patients with AKI requiring RRT, a more positive mean daily fluid balance and intradialytic hypotension were associated with hospital mortality but not with RRT dependence at hospital discharge in survivors.
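The ORs above are tied to a unit of exposure (e.g. 1.36 per 1000 mL of positive balance). Because logistic regression is linear on the log-odds scale, an OR can be rescaled to a different increment by exponentiation. A minimal sketch; `rescale_or` is a hypothetical helper, not anything from the study:

```python
# An OR reported per one unit of exposure implies, on the log-odds
# scale, a linear effect; the OR for k units is OR ** k.
def rescale_or(or_per_unit, new_increment, old_increment=1.0):
    return or_per_unit ** (new_increment / old_increment)

or_per_1000ml = 1.36  # from the abstract: per 1000 mL positive balance
print(rescale_or(or_per_1000ml, 500, 1000))   # OR per 500 mL
print(rescale_or(or_per_1000ml, 2000, 1000))  # OR per 2 L
```

The same rescaling applies to the per-10% hypotension estimate or the per-five-year age estimate; only the increment ratio changes.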
Abstract:
PURPOSE: To investigate whether statins used after colorectal cancer diagnosis reduce the risk of colorectal cancer-specific mortality in a cohort of patients with colorectal cancer.
PATIENTS AND METHODS: A cohort of 7,657 patients with newly diagnosed stage I to III colorectal cancer were identified from 1998 to 2009 from the National Cancer Data Repository (comprising English cancer registry data). This cohort was linked to the United Kingdom Clinical Practice Research Datalink, which provided prescription records, and to mortality data from the Office of National Statistics (up to 2012) to identify 1,647 colorectal cancer-specific deaths. Time-dependent Cox regression models were used to calculate hazard ratios (HR) for cancer-specific mortality and 95% CIs by postdiagnostic statin use and to adjust these HRs for potential confounders.
RESULTS: Overall, statin use after a diagnosis of colorectal cancer was associated with reduced colorectal cancer-specific mortality (fully adjusted HR, 0.71; 95% CI, 0.61 to 0.84). A dose-response association was apparent; for example, a more marked reduction was apparent in colorectal cancer patients using statins for more than 1 year (adjusted HR, 0.64; 95% CI, 0.53 to 0.79). A reduction in all-cause mortality was also apparent in statin users after colorectal cancer diagnosis (fully adjusted HR, 0.75; 95% CI, 0.66 to 0.84).
CONCLUSION: In this large population-based cohort, statin use after diagnosis of colorectal cancer was associated with longer survival.
Abstract:
Background: Chronic kidney disease (CKD) and hypertension are global public health problems associated with considerable morbidity, premature mortality and attendant healthcare costs. Previous studies have highlighted that non-invasive examination of the retinal microcirculation can detect microvascular pathology that is associated with systemic disorders of the circulatory system such as hypertension. We examined the associations between retinal vessel caliber (RVC) and fractal dimension (DF), with both hypertension and CKD in elderly Irish nuns.
Methods: Data from 1233 participants in the cross-sectional observational Irish Nun Eye Study (INES) were assessed from digital photographs with a standardized protocol using computer-assisted software. Multivariate regression analyses were used to assess associations with hypertension and CKD, with adjustment for age, body mass index (BMI), refraction, fellow eye RVC, smoking, alcohol consumption, ischemic heart disease (IHD), cerebrovascular accident (CVA), diabetes and medication use.
Results: In total, 1122 (91%) participants (mean age: 76.3 [range: 56-100] years) had gradable retinal images of sufficient quality for blood vessel assessment. Hypertension was significantly associated with a narrower central retinal arteriolar equivalent (CRAE) in a fully adjusted analysis (P = 0.002; effect size = -2.16 μm; 95% confidence interval [CI]: -3.51, -0.81 μm). No significant associations between other retinal vascular parameters and hypertension or between any retinal vascular parameters and CKD were found.
Conclusions: Individuals with hypertension have significantly narrower retinal arterioles, which may afford an earlier opportunity for tailored prevention and treatment to optimize the structure and function of the microvasculature. No significant associations between retinal vascular parameters and CKD were detected.
Abstract:
The world has experienced a public-health miracle in the past half century, as cleaner water, new health technologies, better diet and a host of other improvements have sharply reduced mortality and extended life expectancy in poor countries by as much as 20 years. A substantial portion of those gains has been realized through improvements in infant and child survival. However, the increase in income that was both a cause and effect of this miracle brought with it a new and ironic threat: a steep rise in non-communicable diseases (NCDs) like heart ailments and cancer.
Abstract:
Background/Question/Methods
Assessing the large-scale impact of deer populations on forest structure and composition is important because of their increasing abundance in many temperate forests. Deer are invasive in New Zealand and are sometimes thought to be responsible for immense damage to its forests. We report demographic changes taking place among 40 widespread indigenous tree species over 20 years, following a period of record deer numbers in the 1950s and a period of extensive hunting and depletion of deer populations during the 1960s and 1970s.
Results/Conclusions
Across a network of 578 plots there was an overall 13% reduction in sapling density of our study species, with most species remaining constant and a few declining dramatically. The effect of suppressed recruitment when deer populations were high was evident in the small tree size class (30-80 mm dbh). Stem density decreased by 15%, and the species with the greatest annual decreases in small tree density were those with the highest rates of sapling recovery in exclosures, indicating that deer were responsible. Densities of large canopy trees have remained relatively stable. There were imbalances between mortality and recruitment rates for 23 of the 40 species: 7 increasing and 16 in decline. These changes were again linked with sapling recovery in exclosures; species which recovered most rapidly following deer exclusion had the greatest net recruitment deficit across the wider landscape, indicating recruitment suppression by deer as opposed to mortality induced by disturbance and other herbivores. Species are not declining uniformly across all populations, and no species is in decline across its entire range. We therefore predict that with continued deer presence some forests will undergo compositional changes, but that none of the species tested will become nationally extinct.
Impacts of invasive browsers on demographic rates and forest structure in New Zealand.
Abstract:
AIMS: Survival and response rates in metastatic colorectal cancer remain poor, despite advances in drug development. There is increasing evidence to suggest that gender-specific differences may contribute to poor clinical outcome. We tested the hypothesis that genomic profiling of metastatic colorectal cancer is dependent on gender.
MATERIALS & METHODS: A total of 152 patients with metastatic colorectal cancer who were treated with oxaliplatin and continuous infusion 5-fluorouracil were genotyped for 21 polymorphisms in 13 cancer-related genes by PCR. Classification and regression tree analysis tested for gender-related association of polymorphisms with overall survival, progression-free survival and tumor response.
RESULTS: Classification and regression tree analysis of all polymorphisms, age and race resulted in gender-specific predictors of overall survival, progression-free survival and tumor response. Polymorphisms in the following genes were associated with gender-specific clinical outcome: estrogen receptor β, EGF receptor, xeroderma pigmentosum group D, voltage-gated sodium channel and phospholipase A2.
CONCLUSION: Genetic profiling to predict the clinical outcome of patients with metastatic colorectal cancer may depend on gender.
Abstract:
The prenatal period is of critical importance in defining how individuals respond to their environment throughout life. Stress experienced by pregnant females has been shown to have detrimental effects on offspring biology in humans and a variety of other species. It also is becoming increasingly apparent that prenatal events can have important consequences for the behavior, health, and productivity of offspring in farmed species. Pregnant cattle may experience many potentially important stressors, for instance, relating to their social environment, housing system and physical environment, interactions with humans and husbandry procedures, and their state of health. We examined the available literature to provide a review of the implications of prenatal stress for offspring welfare in cattle. The long-term effects of dystocia on cattle offspring also are reviewed. To ensure a transparent and repeatable selection process, a systematic review approach was adopted. The research literature clearly demonstrates that prenatal stress and difficult births in beef and dairy cattle both have implications for offspring welfare and performance. Common husbandry practices, such as transport, were shown to influence offspring biology and the importance of environmental variables, including thermal stress and drought, also were highlighted. Maternal disease during pregnancy was shown to negatively impact offspring welfare. Moreover, dystocia-affected calves suffer increased mortality and morbidity, decreased transfer of passive immunity, and important physiological and behavioral changes. This review also identified considerable gaps in our knowledge and understanding of the effects of prenatal stress in cattle. © 2012 American Society of Animal Science. All rights reserved.
Abstract:
OBJECTIVES:
To determine the prevalence of cataract and pseudophakia/aphakia in the United States and to project the expected change in these prevalence figures by 2020.
METHODS:
Summary prevalence estimates of cataract and of pseudophakia/aphakia were prepared separately for black, white, and Hispanic persons (for whom only cataract surgery data were available) in 5-year age intervals starting at 40 years for women and men. The estimates were based on a standardized definition of various types of cataract: cortical, greater than 25% of the lens involved; posterior subcapsular, present according to the grading system used in each study; and nuclear, greater than or equal to the penultimate grade in the system used. Data were collected from major population-based studies in the United States, and, where appropriate, Australia, Barbados, and Western Europe. The age-, gender-, and race/ethnicity-specific rates were applied to 2000 US Census data, and projected population figures for 2020, to obtain overall estimates.
RESULTS:
An estimated 20.5 million (17.2%) Americans older than 40 years have cataract in either eye, and 6.1 million (5.1%) have pseudophakia/aphakia. Women have a significantly (odds ratio = 1.37; 95% confidence interval, 1.26-1.50) higher age-adjusted prevalence of cataract than men in the United States. The total number of persons who have cataract is estimated to rise to 30.1 million by 2020; and for those who are expected to have pseudophakia/aphakia, to 9.5 million.
CONCLUSION:
The number of Americans affected by cataract and undergoing cataract surgery will dramatically increase over the next 20 years as the US population ages.
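The projection method described in the METHODS above (applying age-, gender-, and race/ethnicity-specific rates to census population counts) amounts to a stratified sum. A minimal sketch; the strata, rates, and population counts below are made up for illustration, not taken from the study:

```python
# Direct rate application: multiply each stratum's prevalence rate by
# that stratum's census population count, then sum across strata.
def projected_cases(rates, populations):
    """rates: prevalence proportion per stratum;
    populations: persons per stratum (same order)."""
    return sum(r * n for r, n in zip(rates, populations))

# Hypothetical age strata, rates, and census counts.
rates = {"40-44": 0.02, "45-49": 0.05, "50-54": 0.10}
pop_2000 = {"40-44": 1_000_000, "45-49": 900_000, "50-54": 800_000}

total = projected_cases(rates.values(), pop_2000.values())
print(total)  # 145000.0
```

Running the same stratum rates against projected 2020 population counts, rather than the 2000 census counts, yields the forward projection described in the abstract.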
Abstract:
OBJECTIVE:
To estimate the prevalence of refractive errors in persons 40 years and older.
METHODS:
Counts of persons with phakic eyes with and without spherical equivalent refractive error in the worse eye of +3 diopters (D) or greater, -1 D or less, and -5 D or less were obtained from population-based eye surveys in strata of gender, race/ethnicity, and 5-year age intervals. Pooled age-, gender-, and race/ethnicity-specific rates for each refractive error were applied to the corresponding stratum-specific US, Western European, and Australian populations (years 2000 and projected 2020).
RESULTS:
Six studies provided data from 29 281 persons. In the US, Western European, and Australian year 2000 populations 40 years or older, the estimated crude prevalence for hyperopia of +3 D or greater was 9.9%, 11.6%, and 5.8%, respectively (11.8 million, 21.6 million, and 0.47 million persons). For myopia of -1 D or less, the estimated crude prevalence was 25.4%, 26.6%, and 16.4% (30.4 million, 49.6 million, and 1.3 million persons), respectively, of whom 4.5%, 4.6%, and 2.8% (5.3 million, 8.5 million, and 0.23 million persons), respectively, had myopia of -5 D or less. Projected prevalence rates in 2020 were similar.
CONCLUSIONS:
Refractive errors affect approximately one third of persons 40 years or older in the United States and Western Europe, and one fifth of Australians in this age group.
Abstract:
PURPOSE: To determine the heritability of refractive error and the familial aggregation of myopia in an older population.
METHODS: Seven hundred fifty-nine siblings (mean age, 73.4 years) in 241 families were recruited from the Salisbury Eye Evaluation (SEE) Study in eastern Maryland. Refractive error was determined by noncycloplegic subjective refraction (if presenting distance visual acuity was ≤20/40) or lensometry (if best corrected visual acuity was >20/40 with spectacles). Participants were considered plano (refractive error of zero) if uncorrected visual acuity was >20/40. Preoperative refraction from medical records was used for pseudophakic subjects. Heritability of refractive error was calculated with multivariate linear regression and was estimated as twice the residual between-sibling correlation after adjusting for age, gender, and race. Logistic regression models were used to estimate the odds ratio (OR) of myopia given a myopic sibling, relative to having a nonmyopic sibling.
RESULTS: The estimated heritability of refractive error was 61% (95% confidence interval [CI]: 34%-88%) in this population. The age-, race-, and sex-adjusted ORs of myopia were 2.65 (95% CI: 1.67-4.19), 2.25 (95% CI: 1.31-3.87), 3.00 (95% CI: 1.56-5.79), and 2.98 (95% CI: 1.51-5.87) for myopia thresholds of -0.50, -1.00, -1.50, and -2.00 D, respectively. Neither race nor gender was significantly associated with an increased risk of myopia.
CONCLUSIONS: Refractive error and myopia are highly heritable in this elderly population.
Abstract:
BACKGROUND: In sub-Saharan Africa, where infectious diseases and nutritional deficiencies are common, severe anaemia is a common cause of paediatric hospital admission, yet the evidence to support current treatment recommendations is limited. To avert overuse of blood products, the World Health Organisation advocates a conservative transfusion policy and recommends iron, folate and anti-helminthics at discharge. Outcomes are unsatisfactory, with high rates of in-hospital mortality (9-10%), 6-month mortality and relapse (6%). A definitive trial to establish best transfusion and treatment strategies to prevent both early and delayed mortality and relapse is warranted.
METHODS/DESIGN: TRACT is a multicentre randomised controlled trial of 3954 children aged 2 months to 12 years admitted to hospital with severe anaemia (haemoglobin < 6 g/dl). Children will be enrolled over 2 years in 4 centres in Uganda and Malawi and followed for 6 months. The trial will simultaneously evaluate (in a factorial trial with a 3 x 2 x 2 design) 3 ways to reduce short-term and longer-term mortality and morbidity following admission to hospital with severe anaemia in African children. The trial will compare: (i) R1: liberal transfusion (30 ml/kg whole blood) versus conservative transfusion (20 ml/kg) versus no transfusion (control). The control is only for children with uncomplicated severe anaemia (haemoglobin 4-6 g/dl); (ii) R2: post-discharge multi-vitamin multi-mineral supplementation (including folate and iron) versus routine care (folate and iron) for 3 months; (iii) R3: post-discharge cotrimoxazole prophylaxis for 3 months versus no prophylaxis. All randomisations are open. Enrolment to the trial started September 2014 and is currently ongoing. Primary outcome is cumulative mortality to 4 weeks for the transfusion strategy comparisons, and to 6 months for the nutritional support/antibiotic prophylaxis comparisons. Secondary outcomes include mortality, morbidity (haematological correction, nutritional and infectious), safety and cost-effectiveness.
DISCUSSION: If confirmed by the trial, a cheap and widely available 'bundle' of effective interventions, directed at both the immediate and the downstream consequences of severe anaemia, could substantially reduce mortality among the large number of African children hospitalised with severe anaemia every year, if widely implemented.