148 results for "Measure of adaptability"
Abstract:
It is widely recognized that 'asymptomatic' patients with coeliac disease often feel better after commencing a gluten-free diet. The aim of this study was to determine a measure of the quality of life in patients diagnosed with coeliac disease, both those detected by screening and those presenting with typical clinical symptoms.
Abstract:
Bacteria exist, in most environments, as complex, organised communities of sessile cells embedded within a matrix of self-produced, hydrated extracellular polymeric substances known as biofilms. Bacterial biofilms represent a ubiquitous and predominant cause both of chronic infections and of infections associated with the use of indwelling medical devices such as catheters and prostheses. Such infections typically exhibit significantly enhanced tolerance to antimicrobial, biocidal and immunological challenge, rendering them difficult, sometimes impossible, to treat with conventional chemotherapeutic agents. Effective alternative approaches for the prevention and eradication of biofilm-associated chronic and device-associated infections are therefore urgently required. Atmospheric-pressure non-thermal plasmas are gaining increasing attention as a potential approach to the eradication and control of bacterial infection and contamination. To date, however, the majority of studies have been conducted with reference to planktonic bacteria, and rather less attention has been directed towards bacteria in the biofilm mode of growth. In this study, the activity of a kilohertz-driven atmospheric-pressure non-thermal plasma jet, operated in a helium-oxygen mixture, against in vitro biofilms of Pseudomonas aeruginosa was evaluated. The biofilms exhibited marked susceptibility to exposure to the plasma jet effluent, even after relatively short (tens of seconds) exposure times. Manipulation of plasma operating conditions, for example the plasma operating frequency, had a significant effect on the bacterial inactivation rate. Survival curves exhibited a rapid decline in the number of surviving cells in the first 60 seconds, followed by a slower rate of cell number reduction. Excellent anti-biofilm activity of the plasma jet was also demonstrated both by confocal scanning laser microscopy and by metabolism of the tetrazolium salt XTT, a measure of bactericidal activity.
Abstract:
Biodiversity may be seen as a scientific measure of the complexity of a biological system, implying an information basis. Complexity cannot be directly valued, so economists have tried to define the services it provides, though often just valuing the services of 'key' species. Here we provide a new definition of biodiversity as a measure of functional information, arguing that complexity embodies meaningful information as Gregory Bateson defined it. We argue that functional information content (FIC) is the potentially valuable component of total (algorithmic) information content (AIC), as it alone determines biological fitness and supports ecosystem services. Inspired by recent extensions to the Noah's Ark problem, we show how FIC/AIC can be calculated to measure the degree of substitutability within an ecological community. Establishing substitutability is an essential foundation for valuation. From it, we derive a way to rank whole communities by Indirect Use Value, through quantifying the relation between system complexity and the production rate of ecosystem services. Understanding biodiversity as information evidently serves as a practical interface between economics and ecological science. © 2012 Elsevier B.V.
Abstract:
Neuropsychological outcome at 14 to 15 years of age was investigated in a cohort of 75 participants (39 male, 36 female) born at <33 weeks' gestation. The research was conducted in parallel with a recent MRI study by Stewart and colleagues, which reported that 55% of this cohort had evidence of brain abnormality. One aim of the study was to compare neuropsychological function in very preterm children with and without MRI abnormality. Compared to a control sample of term-born adolescents, very preterm participants were impaired only on a measure of word production. On measures of attention, memory, perceptual skill, and visuomotor and executive function, the adolescents born very preterm performed in the normal range, whether or not they had evidence of MRI abnormality. Our findings are encouraging, as the neuropsychological consequences of damage to the very preterm brain, still evident on MRI at 14 to 15 years of age, appear to be minor.
Abstract:
Objective: The Schizophrenia Psychiatric Genome-wide Association (GWAS) Consortium recently reported on five novel schizophrenia susceptibility loci. The most significant finding mapped to a micro-RNA, MIR-137, which may be involved in regulating the function of other schizophrenia and bipolar disorder susceptibility genes. Method: We genotyped 821 patients with confirmed DSM-IV diagnoses of schizophrenia, bipolar affective disorder I and schizoaffective disorder for the risk SNP (rs1625579) and investigated the clinical profiles of risk allele carriers using a within-case design. We also assessed neurocognitive performance in a subset of cases (n=399) and controls (n=171). Results: Carriers of the risk allele had lower scores for an OPCRIT-derived positive symptom factor (p=0.04) and lower scores on a lifetime measure of psychosis incongruity (p=0.017). Risk allele carriers also had more cognitive deficits involving episodic memory and attentional control. Conclusion: This is the first evidence that the MIR-137 risk variant may be associated with a specific subgroup of psychosis patients. Although the effect of this single SNP was not clinically relevant, investigation of the impact of carrying multiple risk SNPs in the MIR-137 regulatory network on diagnosis and illness profile may be warranted. © 2012 Elsevier Ireland Ltd.
Abstract:
Poverty research has increasingly focused on persistent income poverty, both as a crucial social indicator and as a target for policy intervention. Such an approach can lead to the identification of a sub-set of poor individuals facing particularly adverse circumstances and/or distinctive problems in escaping from poverty. Here we seek to establish whether, in comparison with cross-sectional measures, persistent poverty measures also provide a better measure of exclusion from a minimally acceptable way of life and relate to other important variables in a logical fashion. Our analysis draws upon the first three waves of the ECHP and shows that a persistent poverty measure does constitute a significant improvement over its cross-sectional counterpart in explaining levels of deprivation. Persistent poverty is related to life-style deprivation in a manner that comes close to being uniform across countries. The measure of persistence also conforms to our expectations of how a poverty measure should behave in that, unlike relative income poverty lines, defining the threshold level more stringently enables us to progressively identify groups of increasingly deprived respondents. Overall, the persistent poverty measure constitutes a significant advance on cross-sectional income measures. However, there is clearly a great deal relating to the processes of accumulation and erosion of resources that is not fully captured in the persistent poverty measure. In the absence of such information, there is a great deal to be said for making use of both types of indicators in formulating and evaluating policies while we continue to improve our understanding of longer-term processes.
Abstract:
This paper focuses on the mismatch between income and deprivation measures of poverty. Using the first two waves of the European Community Household Panel Survey, a measure of relative deprivation is constructed and the overlap between the relative income poor and the relatively deprived is examined. There is very limited overlap at the lowest relative income threshold. The overlap increases as the income threshold is raised, but it remains true that fewer than half of those below the 60 percent relative income line are among the most deprived. Relative deprivation is shown to be related to the persistence of income poverty, but also to a range of other resource and need factors. Income and deprivation measures each contain information that can profitably be employed to enhance our understanding of poverty and a range of other social phenomena. This is illustrated by the manner in which both income poverty and relative deprivation are associated with self-reported difficulty making ends meet.
Abstract:
In this paper we seek to explain variations in levels of deprivation between EU countries. The starting-point of our analysis is the finding that the relationship between income and life-style deprivation varies across countries. Given our understanding of the manner in which the income-deprivation mismatch may arise from the limitations of current income as a measure of command over resources, the pattern of variation seems consistent with our expectations of the variable degree to which welfare-state regimes achieve 'decommodification' and smooth income flows. This line of reasoning suggests that cross-national differences in deprivation might, in significant part, be due not only to variation in household and individual characteristics that are associated with disadvantage but also to the differential impact of such variables across countries and indeed welfare regimes. To test this hypothesis, we have taken advantage of the ECHP (European Community Household Panel) comparative data set in order to pursue a strategy of substituting variable names for country/welfare-regime names. We operated with two broad categories of variables, tapping, respectively, needs and resources. Although both sets of factors contribute independently to our ability to predict deprivation, it is the resource factors that are crucial in reducing country effects. The extent of cross-national heterogeneity depends on specifying the social class and long-term unemployment situation of the household reference person. The impact of the structural socio-economic variables that we label 'resource factors' varies across countries in a manner that is broadly consistent with welfare-regime theory and is the key factor in explaining cross-country differences in deprivation. As a consequence, European homogeneity is a great deal more evident among the advantaged than the disadvantaged.
Abstract:
Background: The consumption of maize highly contaminated with carcinogenic fumonisins has been linked to high oesophageal cancer rates. The aim of this study was to validate a urinary fumonisin B1 (UFB1) biomarker as a measure of fumonisin exposure and to investigate the reduction in exposure following a simple and culturally acceptable intervention.
Methods: At baseline home-grown maize, maize-based porridge, and first-void urine samples were collected from female participants (n = 22), following their traditional food practices in Centane, South Africa. During intervention the participants were trained to recognize and remove visibly infected kernels, and to wash the remaining kernels. Participants consumed the porridge prepared from the sorted and washed maize on each day of the two-day intervention. Porridge, maize, and urine samples were collected for FB1 analyses.
Results: The geometric mean (95% confidence interval) for FB1 exposure based on porridge (dry weight) consumption was 4.84 (2.87-8.14) mg FB1/kg body weight/day at baseline and 1.87 (1.40-2.51) following intervention (a 62% reduction, P < 0.05). UFB1C, UFB1 normalized for creatinine, was reduced from 470 (295-750) pg/mg creatinine at baseline to 279 (202-386) following intervention (a 41% reduction, P = 0.06). The UFB1C biomarker was positively correlated with FB1 intake at the individual level (r = 0.4972, P < 0.01). Urinary excretion of FB1 was estimated to be 0.075% (0.054%-0.104%) of FB1 intake.
Conclusion: UFB1 reflects individual FB1 exposure and thus represents a valuable biomarker for future fumonisin risk assessment.
Impact: The simple intervention method, hand sorting and washing, could positively impact food safety and health in communities exposed to fumonisins. Cancer Epidemiol Biomarkers Prev; 20(3); 483-9. ©2011 AACR.
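The geometric means and 95% confidence intervals reported above are conventionally computed on log-transformed data, since exposure measurements are right-skewed. A minimal sketch of that calculation, using made-up exposure values rather than the study's data:

```python
import math
import statistics

def geometric_mean_ci(values, z=1.96):
    """Geometric mean and approximate 95% CI, computed on the log scale
    and back-transformed, as is standard for skewed exposure data."""
    logs = [math.log(v) for v in values]
    mean_log = statistics.mean(logs)
    se_log = statistics.stdev(logs) / math.sqrt(len(logs))
    gm = math.exp(mean_log)
    lo = math.exp(mean_log - z * se_log)
    hi = math.exp(mean_log + z * se_log)
    return gm, lo, hi

# Hypothetical exposure values (mg FB1/kg body weight/day), not the study data.
baseline = [2.1, 4.8, 7.5, 3.9, 6.2]
gm, lo, hi = geometric_mean_ci(baseline)
```

Because the interval is symmetric on the log scale, it is asymmetric after back-transformation, which is why the reported intervals (e.g. 4.84 with 2.87-8.14) sit closer to the lower bound.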
Abstract:
Background Dietary exposure to high levels of the fungal toxin, aflatoxin, occurs in West Africa, where long-term crop storage facilitates fungal growth.
Methods We conducted a cross-sectional study in Benin and Togo to investigate aflatoxin exposure in children around the time of weaning and correlated these data with food consumption, socioeconomic status, agro-ecological zone of residence, and anthropometric measures. Blood samples from 479 children (age 9 months to 5 years) from 16 villages in four agro-ecological zones were assayed for aflatoxin-albumin adducts (AF-alb) as a measure of recent past (2-3 months) exposure.
Results Aflatoxin-albumin adducts were detected in 475/479 (99%) children (geometric mean 32.8 pg/mg, 95% CI: 25.3-42.5). Adduct levels varied markedly across agro-ecological zones, with mean levels approximately four times higher in the central than in the northern region. The AF-alb level increased with age up to 3 years, and within the 1-3 year age group was significantly (P=0.0001) related to weaning status; weaned children had approximately twofold higher mean AF-alb adduct levels (38 pg AF-lysine equivalents per mg of albumin [pg/mg]) than those receiving a mixture of breast milk and solid foods, after adjustment for age, sex, agro-ecological zone, and socioeconomic status. A higher frequency of maize consumption, but not of groundnut consumption, by the child in the preceding week was correlated with a higher AF-alb adduct level. We previously reported that the prevalence of stunted growth (height-for-age Z-score, HAZ) and of being underweight (weight-for-age Z-score, WAZ) was 33% and 29%, respectively, by World Health Organization criteria. Children in these two categories had 30-40% higher mean AF-alb levels than the remainder of the children, and strong dose-response relationships were observed between AF-alb levels and the extent of stunting and of being underweight.
Conclusions Exposure to this common toxic contaminant of West African food increases markedly following weaning and exposure early in life is associated with reduced growth. These observations reinforce the need for aflatoxin exposure intervention strategies within high-risk countries, possibly targeted specifically at foods used in the post-weaning period.
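The stunting and underweight categories above are defined from anthropometric Z-scores relative to WHO growth references. A minimal sketch of the computation, with illustrative reference values (the real WHO medians and SDs vary by age and sex):

```python
def z_score(measurement: float, ref_median: float, ref_sd: float) -> float:
    """Anthropometric Z-score: deviation from the reference median, in SD units."""
    return (measurement - ref_median) / ref_sd

# Illustrative numbers only, not actual WHO reference values.
haz = z_score(82.0, ref_median=87.1, ref_sd=3.2)   # height-for-age
stunted = haz < -2.0                                # conventional WHO cut-off
```

A child more than 2 SD below the reference median on HAZ is classed as stunted, and on WAZ as underweight; the child sketched here is below the median but above the cut-off.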
Abstract:
Exposure assessment is a critical part of epidemiological studies into the effect of mycotoxins on human health. Whilst exposure assessment can be made by estimating the quantity of ingested toxins from food analysis and questionnaire data, the use of biological markers (biomarkers) of exposure can provide a more accurate measure of individual level of exposure, reflecting the internal dose. Biomarkers of exposure can include the excreted toxin or its metabolites, as well as the products of interaction between the toxin and macromolecules such as protein and DNA. Samples in which biomarkers may be analysed include urine, blood, other body fluids and tissues, with urine and blood being the most accessible for human studies. Here we describe the development of biomarkers of exposure for the assessment of three important mycotoxins: aflatoxin, fumonisin and deoxynivalenol. A number of different biomarkers and methods have been developed that can be applied to human population studies, and these approaches are reviewed in the context of their application to molecular epidemiology research.
Abstract:
Heart rate (HR) has been widely studied as a measure of an individual's response to painful stimuli. It remains unclear whether changes in mean HR or in the variability of HR are specifically related to the noxious stimulus (i.e. pain). Neither is it well understood how such changes reflect the underlying neurologic control mechanisms that produce these responses, or how these mechanisms change during the first year of life. To study the changes in cardiac autonomic modulation that occur with acute pain and with age during early infancy, the relationship between respiratory activity and short-term variations of HR (i.e. respiratory sinus arrhythmia) was quantified in a longitudinal study of term-born healthy infants who underwent a finger-lance blood collection at 4 months of age (n = 24) and again at 8 months of age (n = 20). Quantitative respiratory activity and HR were obtained during baseline, lance, and recovery periods. Time- and frequency-domain analyses from 2.2-min epochs of data yielded mean values, spectral measures of low-frequency (0.04-0.15 Hz) and high-frequency (0.15-0.80 Hz) power (LF and HF), and the LF/HF ratio. To determine sympathetic and parasympathetic cardiac activity, the transfer relation between respiration and HR was used. At both 4 and 8 months, mean HR increased significantly with the noxious event (p < 0.01). There were age-related differences in the pattern of LF, HF, and LF/HF ratio changes. Although these parameters all decreased (p < 0.01) at 4 months, LF and LF/HF increased at 8 months, while HF remained stable in response to the noxious stimulus. Transfer gain changes with the lance demonstrated a shift from a predominantly vagal baseline condition to a sympathetic condition at both ages. The primary finding of this study is that a response to an acute noxious stimulus appears to produce an increase in respiratory-related sympathetic HR control and a significant decrease in respiratory-related parasympathetic control at both 4 and 8 months.
Furthermore, with increasing age, the sympathetic and parasympathetic changes appear to be less intense, but more sustained.
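The LF and HF band powers described above are typically obtained by integrating the power spectral density of an evenly resampled HR signal over the stated frequency bands. A minimal sketch using Welch's method on synthetic data (the 4 Hz sampling rate and the signal composition are assumptions, not the study's recordings):

```python
import numpy as np
from scipy.signal import welch

fs = 4.0                                   # resampling rate (Hz), assumed
t = np.arange(0, 132, 1 / fs)              # ~2.2-min epoch, as in the study
# Synthetic HR: respiratory component at 0.3 Hz (HF band) plus a weaker
# low-frequency oscillation at 0.1 Hz (LF band).
hr = 130 + 2.0 * np.sin(2 * np.pi * 0.3 * t) + 0.5 * np.sin(2 * np.pi * 0.1 * t)

# Welch power spectral density of the demeaned signal.
f, pxx = welch(hr - hr.mean(), fs=fs, nperseg=256)
df = f[1] - f[0]
lf = pxx[(f >= 0.04) & (f < 0.15)].sum() * df   # low-frequency power
hf = pxx[(f >= 0.15) & (f < 0.80)].sum() * df   # high-frequency power
lf_hf_ratio = lf / hf
```

Here the dominant respiratory component falls in the HF band, so the LF/HF ratio comes out well below 1; in the study, shifts in this ratio with the lance were interpreted as shifts in sympathovagal balance.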
Abstract:
The Temporal Focus Scale (TFS) is a 12-item self-report measure of cognitive engagement with the temporal domains of past, present and future. The scale was developed in college student samples, where a three-factor structure with adequate reliability and validity was documented in a series of independent studies. We tested the factor structure of the scale in a sample of Northern Irish adolescents and found that our data supported a three-factor structure, although there were problems with item 10. Because time perspective measures have been found to relate differentially to various health behaviours, we tested the relations between scores on the TFS and self-reported alcohol use. Results showed that scores on the TFS were not consistent statistical predictors of drinking category in a logistic regression. Results are discussed in terms of scale development, future scale use and the assessment of health-compromising behaviours such as adolescent alcohol consumption. © 2012 The Foundation for Professionals in Services for Adolescents.
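A logistic regression of the kind described above can be sketched as follows; the data are synthetic and the three temporal-domain predictors are assumptions, not the study's dataset:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
# Synthetic standardised TFS subscale scores (past, current, future focus)
# and a binary drinking-category outcome, generated independently of X.
X = rng.normal(size=(n, 3))
y = rng.integers(0, 2, size=n)

model = LogisticRegression().fit(X, y)
coefs = model.coef_[0]                  # one coefficient per temporal domain
probs = model.predict_proba(X)[:, 1]    # predicted probability of drinking
```

In a real analysis, the sign and significance of each coefficient would indicate whether a given temporal focus predicts drinking category; the study found no consistent prediction.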
Abstract:
PURPOSE: To identify vision Patient-Reported Outcomes instruments relevant to glaucoma and assess their content validity.
METHODS: MEDLINE, MEDLINE in Process, EMBASE and SCOPUS (to January 2009) were systematically searched. Observational studies or randomised controlled trials, published in English, reporting use of vision instruments in glaucoma studies involving adults were included. In addition, reference lists were scanned to identify additional studies describing development and/or validation to ascertain the final version of the instruments. Instruments' content was then mapped onto a theoretical framework, the World Health Organization International Classification of Functioning, Disability and Health. Two reviewers independently evaluated studies for inclusion and quality assessed instrument content.
RESULTS: Thirty-three instruments were identified. According to their content, instruments were categorised into thirteen vision status, two vision disability, one vision satisfaction, five glaucoma status, one glaucoma medication-related health status, five glaucoma medication side-effect and six glaucoma medication satisfaction measures. The National Eye Institute Visual Function Questionnaire-25, the Impact of Vision Impairment and the Treatment Satisfaction Survey-Intraocular Pressure had the highest number of positive ratings in the content validity assessment.
CONCLUSION: This study provides a descriptive catalogue of vision-specific PRO instruments, to inform the choice of an appropriate measure of patient-reported outcomes in a glaucoma context.
Abstract:
Objective: To evaluate the quality of reporting of diagnostic accuracy studies using optical coherence tomography (OCT) in glaucoma. Design: Descriptive series of published studies. Participants: Published studies reporting a measure of the diagnostic accuracy of OCT for glaucoma. Methods: Review of English language papers reporting measures of diagnostic accuracy of OCT for glaucoma. Papers were identified from a Medline literature search performed in June 2006. Articles were appraised using the 25 items provided by the Standards for Reporting of Diagnostic Accuracy (STARD) initiative. Each item was recorded as full, partially, or not reported. Main Outcome Measures: Degree of compliance with the STARD guidelines. Results: Thirty papers were appraised. Eight papers (26.7%) fully reported more than half of the STARD items. The lowest number of fully reported items in a study was 5 and the highest was 17. Descriptions of key aspects of methodology frequently were missing. For example, details of participant sampling (e.g., consecutive or random selection) were described in only 8 (26.7%) of 30 publications. Measures of statistical uncertainty were reported in 18 (60%) of 30 publications. No single STARD item was fully reported by all the papers. Conclusions: The standard of reporting of diagnostic accuracy studies in glaucoma using OCT was suboptimal. It is hoped that adoption of the STARD guidelines will lead to an improvement in reporting of diagnostic accuracy studies, enabling clearer evidence to be produced for the usefulness of OCT for the diagnosis of glaucoma. © 2007 American Academy of Ophthalmology.