69 results for health data
in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast
Abstract:
Objectives. We compared the mental health risk to unpaid caregivers bereaved of a care recipient with the risk to persons otherwise bereaved and to nonbereaved caregivers.
Methods. We linked prescription records for antidepressant and anxiolytic drugs to the characteristics and life-event data of members of the Northern Ireland Longitudinal Study (n = 317 264). Using a case-control design, we fitted logistic regression models, stratified by age, to estimate the relative likelihood of mental health problems, using mental health–related prescriptions as a proxy measure (see the sketch after this abstract).
Results. Both caregivers and bereaved individuals were estimated to be at between 20% and 50% greater risk for mental health problems than noncaregivers in similar circumstances (for bereaved working-age caregivers, odds ratio = 1.41; 95% confidence interval = 1.27, 1.56). For older people, there was no evidence of additional risk to bereaved caregivers, though there was for working-age people. Older people appeared to recover more quickly from caregiver bereavement.
Conclusions. Caregivers were at risk for mental ill health while providing care and after the death of the care recipient. Targeted caregiver support needs to extend beyond the life of the care recipient.
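As a rough illustration of the age-stratified, case-control style logistic modelling described in the Methods above, the sketch below fits separate logistic regressions of a prescription-based mental health proxy within each age stratum. The file name and column names (psychotropic_rx, caregiver, bereaved, age_group) are assumptions for illustration, not the study's actual variables or model specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical linked dataset: one row per person, with a binary mental-health proxy
# (any antidepressant/anxiolytic prescription), caregiving and bereavement indicators,
# and an age stratum. All names are assumptions for illustration.
df = pd.read_csv("linked_cohort.csv")

for stratum, sub in df.groupby("age_group"):          # e.g. "working_age", "older"
    # Logistic model of the prescription proxy on caregiver and bereavement status,
    # fitted separately within each age stratum (a simple form of stratification).
    fit = smf.logit("psychotropic_rx ~ caregiver * bereaved", data=sub).fit(disp=False)
    ors = np.exp(fit.params)                           # odds ratios
    ci = np.exp(fit.conf_int())                        # 95% confidence intervals
    print(stratum)
    print(pd.concat([ors.rename("OR"), ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```

Exponentiating the coefficients gives odds ratios of the same general form as those quoted in the Results.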
Abstract:
High levels of As in groundwater, commonly found in Bangladesh and other parts of Asia, pose a risk not only via drinking water consumption but also to agricultural sustainability and food safety. This review attempts to provide an overview of current knowledge and gaps related to the assessment and management of these risks, including the behaviour of As in the soil-plant system, uptake, phytotoxicity, As speciation in foods, dietary habits, and human health risks. Special emphasis has been given to the situation in Bangladesh, where groundwater drawn via shallow tube wells is the most important source of irrigation water in the dry season. Within the soil-plant system, there is a distinct difference in the behaviour of As under flooded conditions, where arsenite (AsIII) predominates, and under nonflooded conditions, where arsenate (AsV) predominates. The former is regarded as the more toxic to humans and plants. Limited data indicate that As-contaminated irrigation water can result in a slow buildup of As in the topsoil. In some cases the buildup is reflected in the As levels in crops; in others it is not. It is not yet possible to predict As uptake and toxicity in plants based on soil parameters. It is unknown under what conditions and in what time frame As is building up in the soil. Representative phytotoxicity data necessary to evaluate current and future soil concentrations are not yet available. Although there are no indications that crop production is currently inhibited by As, long-term risks are clearly present. Therefore, alongside ongoing assessment of the risks, management options to prevent further As accumulation in the topsoil should already be explored. With regard to human health, data on As speciation in foods, in combination with food consumption data, are needed to assess dietary exposure, and these data should include spatial and seasonal variability. It is important to control confounding factors in assessing the risks. In a country where malnutrition is prevalent, levels of inorganic As in foods should be balanced against the nutritional value of the foods. Regarding agriculture, As is only one of many factors that may pose a risk to the sustainability of crop production. Other risk factors, such as nutrient depletion and loss of organic matter, must also be taken into account when setting priorities in terms of research, management, and overall strategy.
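To make the dietary-exposure assessment mentioned above concrete, a minimal sketch of the standard intake calculation is given below: daily inorganic As exposure is the sum over foods of consumption multiplied by inorganic As concentration, normalised by body weight. All food items, concentrations and intakes are invented placeholders, not data from this review.

```python
# Illustrative daily dietary inorganic-As exposure calculation (placeholder values only).
# exposure (ug per kg body weight per day) =
#     sum over foods of [ intake (kg/day) * inorganic As concentration (ug/kg) ] / body weight (kg)

foods = {
    # food: (daily intake in kg, inorganic As concentration in ug/kg) -- assumed values
    "rice":           (0.45, 120.0),
    "vegetables":     (0.15,  40.0),
    "drinking_water": (3.00,  10.0),   # water in kg/day; ug/kg is roughly ug/L
}
body_weight_kg = 55.0  # assumed adult body weight

daily_intake_ug = sum(intake * conc for intake, conc in foods.values())
exposure_ug_per_kg_bw = daily_intake_ug / body_weight_kg
print(f"Estimated exposure: {exposure_ug_per_kg_bw:.1f} ug inorganic As per kg bw per day")
```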
Abstract:
Implications Provision of environmental enrichment in line with that required by welfare-based quality assurance schemes does not always appear to lead to clear improvements in broiler chicken welfare. This research perhaps serves to highlight the deficit in information regarding the 'real world' implications of enrichment with perches, string and straw bales.
Introduction Earlier work showed that provision of natural light and straw bales improved leg health in commercial broiler chickens (Bailie et al., 2013). This research aimed to determine if additional welfare benefits were shown in windowed houses by increasing straw bale provision (Study 1), or by providing perches and string in addition to straw bales (Study 2).
Material and methods Commercial windowed houses in Northern Ireland containing ~23,000 broiler chickens (placed in houses as hatched) were used in this research, which took place in 2011. In Study 1, two houses on a single farm were assigned to one of two treatments: (1) 30 straw bales per house (1 bale/44m2), or (2) 45 straw bales per house (1 bale/29m2). Bales of wheat straw, each measuring 80cm x 40cm x 40cm, were provided from day 10 of the rearing cycle, as in Bailie et al. (2013). Treatments were replicated over 6 production cycles (using 276,000 Ross 308 and Cobb birds), and were swapped between houses in each replicate. In Study 2, four houses on a single farm were assigned to 1 of 4 treatments in a 2 x 2 factorial design. Treatments involved 2 levels of access to perches (present (24/house), or absent), and 2 levels of access to string (present (24/house), or absent), and both types of enrichment were provided from the start of the cycle. Each perch consisted of a horizontal wooden beam (300 cm x 5 cm x 5 cm) with a rounded upper edge resting on 2 supports (15 cm high). In the string treatment, 6 pieces of white nylon string (60 cm x 10 mm) were tied at their mid-point to the wire above each of 4 feeder lines. Thirty straw bales were also provided per house from day 10. This study was replicated over 4 production cycles using 368,000 Ross 308 birds. In both studies behaviour was observed between 0900 and 1800 hours in weeks 3-5 of the cycle. In Study 1, 8 focal birds were selected in each house each week, and general activity, exploratory and social behaviours recorded directly for 10 minutes. In Study 2, 10 minute video recordings were made of 6 different areas (that did not contain enrichment) of each house each week. The percentage of birds engaged in locomotion or standing was determined through scan sampling these recordings at 120 second intervals. Four perches and four pieces of string were filmed for 25 mins in each house that contained these enrichments on one day per week. The total number of times the perch or string was used was recorded, along with the duration of each bout. In both studies, gait scores (0 (perfect) to 5 (unable to walk)) and latency to lie (measured in seconds from when a bird had been encouraged to stand) were recorded in 25 birds in each house each week. Farm and abattoir records were also used in both studies to determine the number of birds culled for leg and other problems, mortality levels, slaughter weights, and levels of pododermatitis and hock burn. Data were analysed using SPSS (version 20.0), and treatment and age effects on behavioural parameters were determined in normally distributed data using ANOVA ('straw bale density*week' or 'string*perches*week' as appropriate), and in non-normally distributed data using Kruskal-Wallis tests (P<0.05 for significance). Treatment (but not age) effects on performance and health data were determined using the same tests, depending on normality of data (an illustrative analysis sketch follows this abstract).
Results Average slaughter weight, and levels of mortality, culling, hock burn and pododermatitis, were not affected by treatment in either study (P>0.05). In Study 1, straw bale (SB) density had no significant effect on the frequency or duration of behaviours including standing, walking, ground pecking, dust bathing, pecking at bales or aggression, or on average gait score (P>0.05). However, the average latency to lie was greater when fewer SB were provided (30SB 23.38s, 45SB 18.62s, P<0.01). In Study 2 there was an interaction between perches (Pe) and age in lying behaviour, with higher percentages of birds observed lying in the Pe treatment during weeks 4 and 5 (week 3: +Pe 77.0, -Pe 80.9; week 4: +Pe 79.5, -Pe 75.2; week 5: +Pe 78.4, -Pe 76.2; P<0.02). There was also a significant interaction between string (S) and age in locomotory behaviour, with higher percentages of birds observed in locomotion in the string treatment during week 3 but not weeks 4 and 5 (week 3: +S 4.9, -S 3.9; week 4: +S 3.3, -S 3.7; week 5: +S 2.6, -S 2.8; P<0.04). There was also an interaction between S and age in average gait scores, with lower gait scores in the string treatment in weeks 3 and 5 (week 3: +S 0.7, -S 0.9; week 4: +S 1.5, -S 1.4; week 5: +S 1.9, -S 2.0; P<0.05). On average, per 25 min observation there were 15.1 (±13.6) bouts of perching and 19.2 (±14.08) bouts of string pecking, lasting 117.4 (±92.7) and 4.2 (±2.0) s for perches and string, respectively.
Conclusion Increasing straw bale levels from 1 bale/44m2 to 1 bale/29m2 floor space does not appear to lead to significant improvements in the welfare of broilers in windowed houses. The frequent use of perches and string suggests that these stimuli have the potential to improve welfare. Provision of string also appeared to positively influence walking ability. However, this effect was numerically small, was only shown in certain weeks and was not reflected in the latency to lie. Further research on optimum design and level of provision of enrichment items for broiler chickens is warranted. This should include measures of overall levels of activity (both in the vicinity of, and away from, enrichment items).
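As a rough sketch of the analysis approach described in the Material and methods (factorial ANOVA with a treatment-by-week interaction for normally distributed outcomes, Kruskal-Wallis tests otherwise), the example below applies both tests to a hypothetical Study 1 dataset. It uses Python rather than SPSS, and the file and column names (latency_to_lie, bale_density, week, bale_pecking_bouts) are assumptions for illustration.

```python
import pandas as pd
import scipy.stats as stats
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical per-bird records from Study 1: latency to lie (s), straw bale density
# treatment and week of the cycle. All names are illustrative assumptions.
df = pd.read_csv("study1_behaviour.csv")

# Normally distributed outcome: factorial ANOVA with a treatment x week interaction,
# mirroring the 'straw bale density*week' model described above.
anova_fit = smf.ols("latency_to_lie ~ C(bale_density) * C(week)", data=df).fit()
print(sm.stats.anova_lm(anova_fit, typ=2))

# Non-normally distributed outcome (e.g. a behaviour count): Kruskal-Wallis test
# across the two bale-density treatments.
groups = [g["bale_pecking_bouts"].values for _, g in df.groupby("bale_density")]
print(stats.kruskal(*groups))
```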
Abstract:
The notion of educating the public through generic healthy eating messages has pervaded dietary health promotion efforts over the years and continues to do so through various media, despite little evidence for any enduring impact upon eating behaviour. There is growing evidence, however, that tailored interventions such as those that could be delivered online can be effective in bringing about healthy dietary behaviour change. The present paper brings together evidence from qualitative and quantitative studies that have considered the public perspective of genomics, nutrigenomics and personalised nutrition, including those conducted as part of the EU-funded Food4Me project. Such studies have consistently indicated that although the public hold positive views about nutrigenomics and personalised nutrition, they have reservations about the service providers' ability to ensure the secure handling of health data. Technological innovation has driven the concept of personalised nutrition forward and now a further technological leap is required to ensure the privacy of online service delivery systems and to protect data gathered in the process of designing personalised nutrition therapies.
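As a generic illustration of the closing point about protecting health data gathered by online personalised nutrition services, the sketch below encrypts a record at rest with the cryptography package's Fernet recipe. This is an assumed, minimal example, not a mechanism described or evaluated in the paper.

```python
from cryptography.fernet import Fernet

# Generate a secret key (in practice this belongs in a key-management system,
# never stored alongside the data it protects).
key = Fernet.generate_key()
fernet = Fernet(key)

# A hypothetical personalised-nutrition record gathered by an online service.
record = b'{"user_id": "u123", "ldl_mmol_l": 3.4, "advice": "increase oily fish"}'

token = fernet.encrypt(record)      # ciphertext that is safe to store
original = fernet.decrypt(token)    # recoverable only with the key
assert original == record
```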
Abstract:
Coxian phase-type distributions are becoming a popular means of representing survival times within a health care environment. They are favoured because they represent a distribution as a system of phases and allow an easy visual representation of the rate of flow of patients through a system. Difficulties arise, however, in determining the parameter estimates of the Coxian phase-type distribution. This paper examines ways of making the fitting of the Coxian phase-type distribution less cumbersome by outlining different software packages and algorithms available to perform the fit and assessing their capabilities through a number of performance measures. The performance measures rate each of the methods and help to identify the most efficient. Conclusions drawn from these performance measures suggest SAS to be the most robust package: it has a high rate of convergence in each of the four example model fits considered, short computational times, detailed output and convergence criteria options, along with the ability to switch easily between different algorithms.
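For readers unfamiliar with the representation, the sketch below shows one way the fitting problem can be posed: the Coxian phase-type density is evaluated from its phase progression rates (lambda) and absorption rates (mu) via the matrix exponential of the sub-generator, and wrapped in a negative log-likelihood for a general-purpose optimiser. This is a minimal Python illustration, not one of the software packages or algorithms compared in the paper.

```python
import numpy as np
from scipy.linalg import expm
from scipy.optimize import minimize

def coxian_density(t, lam, mu):
    """Density of a Coxian phase-type distribution at time t.

    lam[i]: rate of moving from phase i to phase i+1 (n-1 entries);
    mu[i] : rate of absorption (exit) from phase i (n entries).
    """
    n = len(mu)
    S = np.zeros((n, n))                      # sub-generator over the transient phases
    for i in range(n):
        S[i, i] = -(mu[i] + (lam[i] if i < n - 1 else 0.0))
        if i < n - 1:
            S[i, i + 1] = lam[i]
    alpha = np.zeros(n)
    alpha[0] = 1.0                            # everyone starts in phase 1
    exit_rates = mu                           # exit vector s0 = -S @ 1 for a Coxian
    return float(alpha @ expm(S * t) @ exit_rates)

def neg_log_likelihood(log_params, times, n_phases):
    params = np.exp(log_params)               # log-parameterisation keeps rates positive
    lam, mu = params[: n_phases - 1], params[n_phases - 1:]
    return -sum(np.log(coxian_density(t, lam, mu)) for t in times)

# Example: fit a 2-phase Coxian to some survival times (placeholder data).
times = np.array([1.2, 0.4, 3.1, 2.2, 0.9, 5.0])
res = minimize(neg_log_likelihood, x0=np.zeros(3), args=(times, 2), method="Nelder-Mead")
print(np.exp(res.x))                          # fitted [lam1, mu1, mu2]
```

The log-parameterisation is only there to keep the rates positive during optimisation; dedicated fitting routines in the packages the paper compares handle this more robustly.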
Abstract:
Objective: Several surveillance definitions of influenza-like illness (ILI) have been proposed, based on the presence of symptoms. Symptom data can be obtained from patients, medical records, or both. Past research has found that agreement between health record data and self-report varies depending on the specific symptom. Therefore, we aimed to explore the implications of using data on influenza symptoms extracted from medical records, similar data collected prospectively from outpatients, and the combined data from both sources as predictors of laboratory-confirmed influenza. Methods: Using data from the Hutterite Influenza Prevention Study, we calculated: 1) the sensitivity, specificity and predictive values of individual symptoms within surveillance definitions; 2) how frequently surveillance definitions correlated with laboratory-confirmed influenza; and 3) the predictive value of surveillance definitions. Results: Of the 176 participants with reports from both participants and medical records, 142 (81%) were tested for influenza and 37 (26%) were PCR positive for influenza. Fever (alone) and fever combined with cough and/or sore throat were highly correlated with being PCR positive for influenza for all data sources. ILI surveillance definitions based on symptom data from medical records only, or from both medical records and self-report, were better predictors of laboratory-confirmed influenza, with higher odds ratios and positive predictive values. Discussion: The choice of data source to determine ILI will depend on the patient population, outcome of interest, availability of the data source, and use for clinical decision making, research, or surveillance. © Canadian Public Health Association, 2012.
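The sensitivity, specificity and predictive values referred to above reduce to a 2x2 comparison of a symptom-based ILI definition against the PCR result. A minimal sketch with made-up counts (not the study's data) is shown below.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and predictive values of a surveillance definition
    judged against laboratory-confirmed (PCR-positive) influenza."""
    return {
        "sensitivity": tp / (tp + fn),   # definition positive among PCR-positive
        "specificity": tn / (tn + fp),   # definition negative among PCR-negative
        "ppv":         tp / (tp + fp),   # PCR-positive among definition-positive
        "npv":         tn / (tn + fn),   # PCR-negative among definition-negative
    }

# Placeholder 2x2 counts, e.g. 'fever and cough' versus PCR result (illustrative only).
print(diagnostic_metrics(tp=30, fp=15, fn=10, tn=90))
```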
Abstract:
Background There is growing evidence linking early social and emotional wellbeing to later academic performance and various health outcomes, including mental health. An economic evaluation was designed alongside the cluster-randomised trial evaluation of Roots of Empathy, a school-based intervention for improving pupils' social and emotional wellbeing. Exploration of the relevance of the Strengths and Difficulties Questionnaire (SDQ) and Child Health Utility 9D (CHU9D) in school-based health economic evaluations is warranted. The SDQ is a behavioural screening questionnaire for 4–17-year-old children, consisting of a total difficulties score and a prosocial behaviour scale, which aims to identify positive aspects of behaviour. The CHU9D is a generic preference-based health-related quality of life instrument for 7–17-year-old children.
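For context on the SDQ scoring mentioned above, the total difficulties score is conventionally the sum of the four difficulty subscales (emotional symptoms, conduct problems, hyperactivity/inattention, peer problems), each scored 0-10, with the prosocial scale reported separately. The sketch below assumes the subscale scores are already computed and is illustrative only, not part of the trial's economic analysis.

```python
def sdq_total_difficulties(emotional, conduct, hyperactivity, peer, prosocial):
    """Total difficulties score (0-40): sum of the four difficulty subscales.
    The prosocial subscale is reported separately and is not included in the total."""
    for subscale in (emotional, conduct, hyperactivity, peer, prosocial):
        if not 0 <= subscale <= 10:
            raise ValueError("each SDQ subscale is scored 0-10")
    return emotional + conduct + hyperactivity + peer

# Hypothetical child: moderate difficulties, high prosocial behaviour.
print(sdq_total_difficulties(emotional=4, conduct=2, hyperactivity=6, peer=3, prosocial=9))  # 15
```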
Abstract:
Objectives
To investigate individual, household and country variation in consent to health record linkage.
Study Design and Setting
Data from 50,994 individuals aged 16-74 years recruited to wave 1 of a large UK general purpose household survey (January 2009 – December 2010) were analysed using multi-level logistic regression models.
Results
Overall, 70.7% of respondents consented to record linkage. Younger age, marriage, tenure, car ownership and education were all significantly associated with consent, though there was little deviation from 70% in subgroups defined by these variables. There were small increases in consent rates in individuals with poor health when defined by self-reported long-term limiting illness (adjusted OR 1.11; 95% CI 1.06, 1.16), less so when defined by General Health Questionnaire score (adjusted OR 1.05; 95% CI 1.00, 1.10), but the range in absolute consent rates between categories was generally less than 10%. Larger differences were observed for those of non-white ethnicity, who were 38% less likely to consent (adjusted OR 0.62; 95% CI 0.59, 0.66). Consent was higher in Scotland than England (adjusted OR 1.17; 95% CI 1.06, 1.29) but lower in Northern Ireland (adjusted OR 0.56; 95% CI 0.50, 0.63).
Conclusion
The modest overall level of systematic bias in consent to record linkage provides reassurance for record linkage potential in general purpose household surveys. However, the low consent rates amongst non-white ethnic minority survey respondents will further compound their low survey participation rates. The reason for the country-level variation requires further study.
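As a simplified sketch of the kind of consent model described in the study design above (individuals nested within households and countries), the example below fits a logistic regression with household-clustered standard errors in statsmodels. This approximates, but is not, the paper's full multi-level specification, and the file and column names are assumptions.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical survey extract: consent indicator plus respondent characteristics
# and identifiers for the household and country levels. All names are illustrative.
df = pd.read_csv("survey_wave1_extract.csv")

model = smf.glm(
    "consent ~ age_group + ethnicity + education + tenure + llti + country",
    data=df,
    family=sm.families.Binomial(),
)
# Cluster-robust standard errors by household stand in, crudely, for the
# household-level random effect of a full multi-level model.
fit = model.fit(cov_type="cluster", cov_kwds={"groups": df["household_id"]})
print(fit.summary())
```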
Abstract:
Using new biomarker data from the 2010 pilot round of the Longitudinal Aging Study in India (LASI), we investigate education, gender, and state-level disparities in health. We find that hemoglobin level, a marker for anemia, is lower for respondents with no schooling (0.7 g/dL less in the adjusted model) compared to those with some formal education and is also lower for females than for males (2.0 g/dL less in the adjusted model). In addition, we find that about one third of respondents in our sample aged 45 or older have high C-reactive protein (CRP) levels (>3 mg/L), an indicator of inflammation and a risk factor for cardiovascular disease. We find no evidence of educational or gender differences in CRP, but there are significant state-level disparities, with Kerala residents exhibiting the lowest CRP levels (a mean of 1.96 mg/L compared to 3.28 mg/L in Rajasthan, the state with the highest CRP). We use the Blinder–Oaxaca decomposition approach to explain group-level differences, and find that state-level disparities in CRP are mainly due to heterogeneity in the association of respondents' observed characteristics with CRP, rather than to differences in the distribution of endowments across the sampled state populations.
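The Blinder–Oaxaca decomposition used above splits the mean difference in an outcome (here CRP) between two groups into a component explained by differences in observed characteristics (endowments) and a component due to differences in how those characteristics are associated with the outcome (coefficients). A minimal two-fold version is sketched below; the column names and the commented usage are hypothetical, not the paper's actual specification.

```python
import pandas as pd
import statsmodels.api as sm

def oaxaca_twofold(df_a, df_b, outcome, covariates):
    """Two-fold Blinder-Oaxaca decomposition of mean(outcome_a) - mean(outcome_b),
    using group B's coefficients as the reference."""
    Xa = sm.add_constant(df_a[covariates]); ya = df_a[outcome]
    Xb = sm.add_constant(df_b[covariates]); yb = df_b[outcome]
    beta_a = sm.OLS(ya, Xa).fit().params
    beta_b = sm.OLS(yb, Xb).fit().params
    xbar_a, xbar_b = Xa.mean(), Xb.mean()
    explained = (xbar_a - xbar_b) @ beta_b      # differences in endowments
    unexplained = xbar_a @ (beta_a - beta_b)    # differences in coefficients
    return {"gap": ya.mean() - yb.mean(), "explained": explained, "unexplained": unexplained}

# Hypothetical use with assumed columns (file and names are placeholders):
# df = pd.read_csv("lasi_pilot_extract.csv")
# print(oaxaca_twofold(df[df.state == "Rajasthan"], df[df.state == "Kerala"],
#                      outcome="crp", covariates=["age", "female", "years_schooling"]))
```

Because each OLS fit includes an intercept, the explained and unexplained components sum exactly to the raw mean gap.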