918 results for Measure of riskiness


Relevance: 90.00%

Abstract:

In this paper we seek to explain variations in levels of deprivation between EU countries. The starting-point of our analysis is the finding that the relationship between income and life-style deprivation varies across countries. Given our understanding of the manner in which the income-deprivation mismatch may arise from the limitations of current income as a measure of command over resources, the pattern of variation seems to be consistent with our expectations of the variable degree to which welfare-state regimes achieve 'decommodification' and smooth income flows. This line of reasoning suggests that cross-national differences in deprivation might, in significant part, be due not only to variation in household and individual characteristics that are associated with disadvantage but also to the differential impact of such variables across countries and indeed welfare regimes. To test this hypothesis, we have taken advantage of the ECHP (European Community Household Panel) comparative data set in order to pursue a strategy of substituting variable names for country/welfare regime names. We operated with two broad categories of variables, tapping, respectively, needs and resources. Although both sets of factors contribute independently to our ability to predict deprivation, it is the resource factors that are crucial in reducing country effects. The extent of cross-national heterogeneity depends on specifying the social class and situation in relation to long-term unemployment of the household reference person. The impact of the structural socio-economic variables that we label 'resource factors' varies across countries in a manner that is broadly consistent with welfare regime theory and is the key factor in explaining cross-country differences in deprivation. As a consequence, European homogeneity is a great deal more evident among the advantaged than the disadvantaged.

Relevance: 90.00%

Abstract:

Background: The consumption of maize highly contaminated with carcinogenic fumonisins has been linked to high oesophageal cancer rates. The aim of this study was to validate a urinary fumonisin B-1 (UFB1) biomarker as a measure of fumonisin exposure and to investigate the reduction in exposure following a simple and culturally acceptable intervention.

Methods: At baseline, home-grown maize, maize-based porridge, and first-void urine samples were collected from female participants (n = 22) following their traditional food practices in Centane, South Africa. During the intervention, the participants were trained to recognize and remove visibly infected kernels and to wash the remaining kernels. Participants consumed the porridge prepared from the sorted and washed maize on each day of the two-day intervention. Porridge, maize, and urine samples were collected for FB1 analyses.

Results: The geometric mean (95% confidence interval) for FB1 exposure based on porridge (dry weight) consumption at baseline and following intervention was 4.84 (2.87-8.14) and 1.87 (1.40-2.51) mg FB1/kg body weight/day, respectively (62% reduction, P < 0.05). UFB1C, UFB1 normalized for creatinine, was reduced from 470 (295-750) pg/mg creatinine at baseline to 279 (202-386) pg/mg creatinine following intervention (41% reduction, P = 0.06). The UFB1C biomarker was positively correlated with FB1 intake at the individual level (r = 0.4972, P < 0.01). Urinary excretion of FB1 was estimated to be 0.075% (0.054%-0.104%) of the FB1 intake.
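
The exposure figures above are geometric means with 95% confidence intervals, the usual summary for right-skewed exposure data. A minimal sketch of how such a summary and the percent reduction between two rounds are computed, using made-up values rather than the study's data:

```python
import math

def geometric_mean(values):
    """Geometric mean: exp of the mean of the logs, robust to right skew."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Hypothetical exposure values (mg FB1/kg body weight/day) - not the study's data.
baseline = [2.1, 4.0, 7.5, 9.2]
post = [1.2, 1.6, 2.3, 2.9]

gm_base = geometric_mean(baseline)
gm_post = geometric_mean(post)
reduction = 100 * (1 - gm_post / gm_base)
print(f"GM baseline: {gm_base:.2f}, GM post: {gm_post:.2f}, reduction: {reduction:.0f}%")
```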

Conclusion: UFB1 reflects individual FB1 exposure and thus represents a valuable biomarker for future fumonisin risk assessment.

Impact: The simple intervention method, hand sorting and washing, could positively impact food safety and health in communities exposed to fumonisins. Cancer Epidemiol Biomarkers Prev; 20(3); 483-9. ©2011 AACR.

Relevance: 90.00%

Abstract:

Background Dietary exposure to high levels of the fungal toxin, aflatoxin, occurs in West Africa, where long-term crop storage facilitates fungal growth.

Methods We conducted a cross-sectional study in Benin and Togo to investigate aflatoxin exposure in children around the time of weaning and correlated these data with food consumption, socioeconomic status, agro-ecological zone of residence, and anthropometric measures. Blood samples from 479 children (age 9 months to 5 years) from 16 villages in four agro-ecological zones were assayed for aflatoxin-albumin adducts (AF-alb) as a measure of recent past (2-3 months) exposure.

Results Aflatoxin-albumin adducts were detected in 475/479 (99%) children (geometric mean 32.8 pg/mg, 95% CI: 25.3-42.5). Adduct levels varied markedly across agro-ecological zones, with mean levels being approximately four times higher in the central than in the northern region. The AF-alb level increased with age up to 3 years, and within the 1-3 year age group was significantly (P=0.0001) related to weaning status; weaned children had approximately twofold higher mean AF-alb adduct levels (38 pg AF-lysine equivalents per mg of albumin [pg/mg]) than those receiving a mixture of breast milk and solid foods, after adjustment for age, sex, agro-ecological zone, and socioeconomic status. A higher frequency of maize consumption, but not groundnut consumption, by the child in the preceding week was correlated with higher AF-alb adduct level. We previously reported that the prevalence of stunted growth (height-for-age Z-score, HAZ) and being underweight (weight-for-age Z-score, WAZ) was 33% and 29%, respectively, by World Health Organization criteria. Children in these two categories had 30-40% higher mean AF-alb levels than the remainder of the children, and strong dose-response relationships were observed between AF-alb levels and the extent of stunting and being underweight.

Conclusions Exposure to this common toxic contaminant of West African food increases markedly following weaning and exposure early in life is associated with reduced growth. These observations reinforce the need for aflatoxin exposure intervention strategies within high-risk countries, possibly targeted specifically at foods used in the post-weaning period.

Relevance: 90.00%

Abstract:

Exposure assessment is a critical part of epidemiological studies into the effect of mycotoxins on human health. Whilst exposure assessment can be made by estimating the quantity of ingested toxins from food analysis and questionnaire data, the use of biological markers (biomarkers) of exposure can provide a more accurate measure of individual level of exposure in reflecting the internal dose. Biomarkers of exposure can include the excreted toxin or its metabolites, as well as the products of interaction between the toxin and macromolecules such as protein and DNA. Samples in which biomarkers may be analysed include urine, blood, other body fluids and tissues, with urine and blood being the most accessible for human studies. Here we describe the development of biomarkers of exposure for the assessment of three important mycotoxins: aflatoxin, fumonisin and deoxynivalenol. A number of different biomarkers and methods have been developed that can be applied to human population studies, and these approaches are reviewed in the context of their application to molecular epidemiology research.

Relevance: 90.00%

Abstract:

Heart rate (HR) has been widely studied as a measure of an individual's response to painful stimuli. It remains unclear whether changes in mean HR or in the variability of HR are specifically related to the noxious stimulus (i.e. pain). Neither is it well understood how such changes reflect the underlying neurologic control mechanisms that produce these responses, or how these mechanisms change during the first year of life. To study the changes in cardiac autonomic modulation that occur with acute pain and with age during early infancy, the relationship between respiratory activity and short-term variations of HR (i.e. respiratory sinus arrhythmia) was quantified in a longitudinal study of healthy term-born infants who underwent a finger-lance blood collection at 4 months of age (n = 24) and again at 8 months of age (n = 20). Quantitative respiratory activity and HR were obtained during baseline, lance, and recovery periods. Time- and frequency-domain analyses from 2.2-min epochs of data yielded mean values, spectral measures of low- (0.04-0.15 Hz) and high- (0.15-0.80 Hz) frequency power (LF and HF), and the LF/HF ratio. To determine sympathetic and parasympathetic cardiac activity, the transfer relation between respiration and HR was used. At both 4 and 8 months, mean HR increased significantly with the noxious event (p < 0.01). There were age-related differences in the pattern of LF, HF, and LF/HF ratio changes. Although these parameters all decreased (p < 0.01) at 4 months, LF and LF/HF increased at 8 months, and at 8 months HF remained stable in response to the noxious stimulus. Transfer gain changes with the lance demonstrated a shift from a predominantly vagal baseline to a sympathetic condition at both ages. The primary finding of this study is that a response to an acute noxious stimulus appears to produce an increase in respiratory-related sympathetic HR control and a significant decrease in respiratory-related parasympathetic control at both 4 and 8 months. Furthermore, with increasing age, the sympathetic and parasympathetic changes appear to be less intense, but more sustained.
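
The spectral quantities described above (LF and HF power and their ratio) come from a periodogram of the heart-rate series. A rough sketch of band-power estimation on a synthetic tachogram; the band edges and the 2.2-minute epoch follow the abstract, but the signal itself is invented:

```python
import numpy as np

def band_power(x, fs, f_lo, f_hi):
    """Power of `x` in the band [f_lo, f_hi) Hz from a plain FFT periodogram."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()                        # remove DC so the 0 Hz bin doesn't dominate
    n = len(x)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / (fs * n)
    mask = (freqs >= f_lo) & (freqs < f_hi)
    return float(psd[mask].sum() * (fs / n))   # integrate PSD over the band

# Synthetic 2.2-minute heart-rate tachogram sampled at 4 Hz: a 0.1 Hz (LF)
# component plus a 0.3 Hz respiratory (HF) component - illustrative only.
fs = 4.0
t = np.arange(0.0, 132.0, 1.0 / fs)
hr = 130.0 + 3.0 * np.sin(2 * np.pi * 0.1 * t) + 1.5 * np.sin(2 * np.pi * 0.3 * t)

lf = band_power(hr, fs, 0.04, 0.15)
hf = band_power(hr, fs, 0.15, 0.80)
print(f"LF = {lf:.3f}, HF = {hf:.3f}, LF/HF = {lf / hf:.2f}")
```

Because the LF component has twice the amplitude of the HF component, the LF/HF ratio comes out near four, mimicking a sympathetically dominated spectrum.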

Relevance: 90.00%

Abstract:

The Temporal Focus Scale (TFS) is a 12-item self-report measure of cognitive engagement with the temporal domains of past, present and future. Developed in college student samples, a three-factor structure with adequate reliability and validity was documented in a series of independent studies. We tested the factor structure of the scale in a sample of Northern Irish adolescents and found that our data supported a three-factor structure, although there were problems with item 10. Because time perspective measures have been found to relate differentially to various health behaviours, we tested the relations between scores on the TFS and self-reported alcohol use. Results showed that scores on the TFS were not consistent statistical predictors of drinking category in a logistic regression. Results are discussed in terms of scale development, future scale use and the assessment of health-compromising behaviours such as adolescent alcohol consumption. © 2012 The Foundation for Professionals in Services for Adolescents.

Relevance: 90.00%

Abstract:

PURPOSE: To identify vision Patient-Reported Outcomes instruments relevant to glaucoma and assess their content validity.

METHODS: MEDLINE, MEDLINE in Process, EMBASE and SCOPUS (to January 2009) were systematically searched. Observational studies or randomised controlled trials, published in English, reporting use of vision instruments in glaucoma studies involving adults were included. In addition, reference lists were scanned to identify additional studies describing development and/or validation to ascertain the final version of the instruments. Instruments' content was then mapped onto a theoretical framework, the World Health Organization International Classification of Functioning, Disability and Health. Two reviewers independently evaluated studies for inclusion and quality-assessed instrument content.

RESULTS: Thirty-three instruments were identified. Instruments were categorised into thirteen vision status, two vision disability, one vision satisfaction, five glaucoma status, one glaucoma medication related to health status, five glaucoma medication side effects and six glaucoma medication satisfaction measures according to each instrument's content. The National Eye Institute Visual Function Questionnaire-25, Impact of Vision Impairment and Treatment Satisfaction Survey-Intraocular Pressure had the highest number of positive ratings in the content validity assessment.

CONCLUSION: This study provides a descriptive catalogue of vision-specific PRO instruments, to inform the choice of an appropriate measure of patient-reported outcomes in a glaucoma context.

Relevance: 90.00%

Abstract:

Objective: To evaluate the quality of reporting of diagnostic accuracy studies using optical coherence tomography (OCT) in glaucoma.

Design: Descriptive series of published studies.

Participants: Published studies reporting a measure of the diagnostic accuracy of OCT for glaucoma.

Methods: Review of English-language papers reporting measures of diagnostic accuracy of OCT for glaucoma. Papers were identified from a Medline literature search performed in June 2006. Articles were appraised using the 25 items provided by the Standards for Reporting of Diagnostic Accuracy (STARD) initiative. Each item was recorded as fully, partially, or not reported.

Main Outcome Measures: Degree of compliance with the STARD guidelines.

Results: Thirty papers were appraised. Eight papers (26.7%) fully reported more than half of the STARD items. The lowest number of fully reported items in a study was 5 and the highest was 17. Descriptions of key aspects of methodology frequently were missing. For example, details of participant sampling (e.g., consecutive or random selection) were described in only 8 (26.7%) of 30 publications. Measures of statistical uncertainty were reported in 18 (60%) of 30 publications. No single STARD item was fully reported by all the papers.

Conclusions: The standard of reporting of diagnostic accuracy studies in glaucoma using OCT was suboptimal. It is hoped that adoption of the STARD guidelines will lead to an improvement in reporting of diagnostic accuracy studies, enabling clearer evidence to be produced for the usefulness of OCT for the diagnosis of glaucoma. © 2007 American Academy of Ophthalmology.

Relevance: 90.00%

Abstract:

Background
Inappropriate polypharmacy is a particular concern in older people and is associated with negative health outcomes. Choosing the best interventions to improve appropriate polypharmacy is a priority, hence interest in appropriate polypharmacy, where many medicines may be used to achieve better clinical outcomes for patients, is growing.

Objectives
This review sought to determine which interventions, alone or in combination, are effective in improving the appropriate use of polypharmacy and reducing medication-related problems in older people.

Search methods
In November 2013, for this first update, a range of literature databases including MEDLINE and EMBASE were searched, and handsearching of reference lists was performed. Search terms included 'polypharmacy', 'medication appropriateness' and 'inappropriate prescribing'.

Selection criteria
A range of study designs were eligible. Eligible studies described interventions affecting prescribing aimed at improving appropriate polypharmacy in people 65 years of age and older in which a validated measure of appropriateness was used (e.g. Beers criteria, Medication Appropriateness Index (MAI)).

Data collection and analysis
Two review authors independently reviewed abstracts of eligible studies, extracted data and assessed risk of bias of included studies. Study-specific estimates were pooled, and a random-effects model was used to yield summary estimates of effect and 95% confidence intervals (CIs). The GRADE (Grades of Recommendation, Assessment, Development and Evaluation) approach was used to assess the overall quality of evidence for each pooled outcome.

Main results
Two studies were added to this review to bring the total number of included studies to 12. One intervention consisted of computerised decision support; 11 complex, multi-faceted pharmaceutical approaches to interventions were provided in a variety of settings. Interventions were delivered by healthcare professionals, such as prescribers and pharmacists. Appropriateness of prescribing was measured using validated tools, including the MAI score post intervention (eight studies), Beers criteria (four studies), STOPP criteria (two studies) and START criteria (one study). Interventions included in this review resulted in a reduction in inappropriate medication usage. Based on the GRADE approach, the overall quality of evidence for all pooled outcomes ranged from very low to low. A greater reduction in MAI scores between baseline and follow-up was seen in the intervention group when compared with the control group (four studies; mean difference -6.78, 95% CI -12.34 to -1.22). Postintervention pooled data showed a lower summated MAI score (five studies; mean difference -3.88, 95% CI -5.40 to -2.35) and fewer Beers drugs per participant (two studies; mean difference -0.1, 95% CI -0.28 to 0.09) in the intervention group compared with the control group. Evidence of the effects of interventions on hospital admissions (five studies) and of medication-related problems (six studies) was conflicting.
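
Pooling study-specific mean differences under a random-effects model, as described above, is commonly done with the DerSimonian-Laird estimator. A compact sketch under that assumption (the review does not state which estimator was used), with hypothetical MAI mean differences and variances rather than the review's data:

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate and 95% CI (DerSimonian-Laird)."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)            # between-study variance estimate
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se

# Hypothetical per-study MAI mean differences and variances - illustrative only.
md = [-8.0, -4.5, -10.2, -5.1]
var = [4.0, 2.5, 6.3, 3.2]
est, ci_lo, ci_hi = dersimonian_laird(md, var)
print(f"pooled MD = {est:.2f} (95% CI {ci_lo:.2f} to {ci_hi:.2f})")
```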

Authors' conclusions
It is unclear whether interventions to improve appropriate polypharmacy, such as pharmaceutical care, resulted in clinically significant improvement; however, they appear beneficial in terms of reducing inappropriate prescribing.

Relevance: 90.00%

Abstract:

In conditional probabilistic logic programming, given a query, the two most common forms for answering the query are either a probability interval or a precise probability obtained by using the maximum entropy principle. The former can be noninformative (e.g., the interval [0, 1]) and the reliability of the latter is questionable when the prior knowledge is imprecise. To address this problem, in this paper we propose some methods to quantitatively measure whether a probability interval or a single probability is sufficient for answering a query. We first propose an approach to measuring the ignorance of a probabilistic logic program with respect to a query. The measure of ignorance (w.r.t. a query) reflects how reliable a precise probability for the query can be, and a high value of ignorance suggests that a single probability is not suitable for the query. We then propose a method to measure the probability that the exact probability of a query falls in a given interval, i.e., a second-order probability. We call it the degree of satisfaction. If the degree of satisfaction is high enough w.r.t. the query, then the given interval can be accepted as the answer to the query. We also prove that our measures satisfy many properties, and we use a case study to demonstrate the significance of the measures. © Springer Science+Business Media B.V. 2012
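
The degree of satisfaction described above is a second-order probability: the chance that the query's exact probability falls inside a given interval. One way to approximate such a quantity is Monte Carlo sampling from an assumed second-order distribution; the Beta(8, 2) choice below is purely illustrative and is not the paper's construction:

```python
import random

def degree_of_satisfaction(sampler, interval, n=100_000, seed=0):
    """Monte Carlo estimate of the chance that the query's exact probability
    lies inside `interval`, under the second-order distribution `sampler`."""
    random.seed(seed)
    lo, hi = interval
    hits = sum(1 for _ in range(n) if lo <= sampler() <= hi)
    return hits / n

# Hypothetical second-order distribution over the unknown query probability:
# Beta(8, 2), a stand-in for whatever the program's constraints actually induce.
dos = degree_of_satisfaction(lambda: random.betavariate(8, 2), (0.6, 1.0))
print(f"degree of satisfaction for [0.6, 1.0]: {dos:.3f}")
```

If the estimate is high enough, the interval [0.6, 1.0] would be accepted as the answer to the query under this scheme.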

Relevance: 90.00%

Abstract:

Consumption of arsenic (As) wine is a traditional activity during the classic Chinese festival of Duanwu, colloquially known worldwide as the Dragon Boat Festival. Arsenic wine is drunk on the morning of the fifth day of the fifth lunar-calendar month to commemorate the death of Qu Yuan, a famed Chinese poet who drowned himself in protest of a corrupt government, and to protect against ill fortune. Although realgar minerals are characteristically composed of sparingly soluble tetra-arsenic tetra-sulfides (As4S4), purity does vary, with up to 10% of As being present as non-sulfur-bound species, such as arsenate (As(V)) and arsenite (As(III)). Despite the renewed interest in As speciation and the bioaccessibility of the active As components in realgar-based Chinese medicines, little is known about the safety surrounding the cultural practice of drinking As wine. In a series of experiments the speciation and solubility of As in a range of wines were investigated. Furthermore, a simulated gastrointestinal system was employed to predict the impact of digestive processes on As bioavailability. The predominant soluble As species found in all the wines were As(III) and As(V). Based on typical As wine recipes employing 0.1 g realgar per mL of wine, the concentration of dissolved As ranged from ca. 100 to 400 mg/L depending on the ethanol content of the preparation, with As solubility found to be higher in wines with a lower proportion of ethanol. Based on a common 100 mL measure of wine with a concentration of 400 mg As/L, the amount of soluble As would equate to around half of the acute minimal lethal dose for adults. This is likely an underestimate of the bioaccessible concentration, as a three-fold increase in bioaccessibility was observed in the intestinal phase based on the results from the simulated gastrointestinal system. © 2011 Elsevier B.V. All rights reserved.
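
The dose arithmetic in the abstract can be made explicit. The inputs below come directly from the text (a 100 mL measure, 400 mg As/L, a three-fold intestinal increase); the implied ~80 mg acute minimal lethal dose is an inference from "around half", not a value stated in the abstract:

```python
# Worked arithmetic using the abstract's own figures.
volume_l = 0.100                   # a common 100 mL measure of wine
conc_mg_per_l = 400.0              # dissolved As in the strongest preparation
dose_mg = volume_l * conc_mg_per_l
print(f"soluble As per measure: {dose_mg:.0f} mg")

# "around half the acute minimal lethal dose" implies a lethal dose near 80 mg
implied_lethal_mg = dose_mg * 2
bioaccessibility_factor = 3.0      # up to three-fold increase in the intestinal phase
print(f"implied acute minimal lethal dose: ~{implied_lethal_mg:.0f} mg")
print(f"upper-bound bioaccessible As: {dose_mg * bioaccessibility_factor:.0f} mg")
```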

Relevance: 90.00%

Abstract:

Objective: Depressive symptoms in schizophrenia have previously been associated with a perceived lack of social support. The aim of this study was to explore the relationship between perceived social support and depressive symptoms in schizophrenia; to assess the psychological wellbeing of their carers; and to examine the quality of the relationship between the patients and their carers. Method: Individuals with schizophrenia (n = 17) were assessed on the Beck Depression Inventory (BDI), the Beck Hopelessness Scale (BHS), a measure of perceived social support, the Significant Others Scale (SOS) and the Quality of Relationship Inventory (QRI). Results: The mean score on the BDI for patients fell within the moderate-severe range and the mean score on the BHS fell within the moderate range. Family and friends were perceived as supportive resources by patients. There was no significant relationship between patient depressive symptoms or hopelessness and perceived social support. Carers of patients did not report high rates of depressive symptoms or hopelessness. Conclusions: These findings do not support the previous finding of an association between depressive symptoms and a perceived lack of social support in schizophrenia.

Relevance: 90.00%

Abstract:

Anger may be more responsive than disgust to mitigating circumstances in judgments of wrongdoing. We tested this hypothesis in two studies where we had participants envision circumstances that could serve to mitigate an otherwise wrongful act. In Study 1, participants provided moral judgments, and ratings of anger and disgust, to a number of transgressions involving either harm or bodily purity. They were then asked to imagine and report whether there might be any circumstances that would make it all right to perform the act. Across transgression type, and controlling for covariance between anger and disgust, levels of anger were found to negatively predict the envisioning of mitigating circumstances for wrongdoing, while disgust was unrelated. Study 2 replicated and extended these findings to less serious transgressions, using a continuous measure of mitigating circumstances, and demonstrated the impact of anger independent of deontological commitments. These findings highlight the differential relationship that anger and disgust have with the ability to envision mitigating factors.

Relevance: 90.00%

Abstract:

Collagen molecules in articular cartilage have an exceptionally long lifetime, which makes them susceptible to the accumulation of advanced glycation end products (AGEs). In fact, in comparison to other collagen-rich tissues, articular cartilage contains relatively high amounts of the AGE pentosidine. To test the hypothesis that this higher AGE accumulation is primarily the result of the slow turnover of cartilage collagen, AGE levels in cartilage and skin collagen were compared with the degree of racemization of aspartic acid (% D-Asp, a measure of the residence time of a protein). AGE (Nε-(carboxymethyl)lysine, Nε-(carboxyethyl)lysine, and pentosidine) and % D-Asp concentrations increased linearly with age in both cartilage and skin collagen (p < 0.0001). The rate of increase in AGEs was greater in cartilage collagen than in skin collagen (p < 0.0001). % D-Asp was also higher in cartilage collagen than in skin collagen (p < 0.0001), indicating that cartilage collagen has a longer residence time in the tissue, and thus a slower turnover, than skin collagen. In both types of collagen, AGE concentrations increased linearly with % D-Asp (p < 0.0005). Interestingly, the slopes of the curves of AGEs versus % D-Asp, i.e. the rates of accumulation of AGEs corrected for turnover, were identical for cartilage and skin collagen. The present study thus provides the first experimental evidence that protein turnover is a major determinant in AGE accumulation in different collagen types. From the age-related increases in % D-Asp, the half-life of cartilage collagen was calculated to be 117 years and that of skin collagen 15 years, thereby providing the first reasonable estimates of the half-lives of these collagens.
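
The half-life estimates quoted above rest on a first-order turnover model, in which the half-life and the turnover rate constant are related by t½ = ln 2 / k. A small sketch of that relationship, working backwards from the reported half-lives to the implied rate constants; this illustrates the kinetics only, not the paper's actual fitting of % D-Asp against age:

```python
import math

def half_life_from_rate(k_per_year):
    """First-order turnover: t_half = ln(2) / k."""
    return math.log(2) / k_per_year

# Implied first-order turnover rate constants for the two reported half-lives.
for tissue, t_half in [("cartilage collagen", 117.0), ("skin collagen", 15.0)]:
    k = math.log(2) / t_half
    print(f"{tissue}: k = {k:.4f} per year -> t_half = {half_life_from_rate(k):.0f} years")
```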

Relevance: 90.00%

Abstract:

Ambisonics is a scalable spatial audio technique that attempts to present a sound scene to listeners over as large an area as possible. A localisation experiment was carried out to investigate the performance of a first- and third-order system at three listening positions: one in the centre and two off-centre. The test used a reverse target-pointer adjustment method to determine the error, both signed and absolute, for each combination of listening position and system. The signed error was used to indicate the direction and magnitude of the shifts in panning angle introduced at the off-centre listening positions. The absolute error was used as a measure of the performance of each listening-position and system combination for a comparison of their overall performance. A comparison was made between the degree of image shifting between the two systems and the robustness of their off-centre performance.
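
Computing a signed localisation error requires wrapping the response-minus-target difference into (-180°, 180°] so that, for example, a response of -175° to a 180° target counts as a +5° error rather than -355°. A small sketch with invented target/response pairs:

```python
def signed_error(target_deg, response_deg):
    """Signed localisation error, wrapped into (-180, 180] degrees."""
    err = (response_deg - target_deg) % 360.0
    return err - 360.0 if err > 180.0 else err

# Invented target/response pairs (degrees) for one listening position.
pairs = [(0, 5), (90, 80), (180, -175), (-90, -70)]
errors = [signed_error(t, r) for t, r in pairs]
mean_signed = sum(errors) / len(errors)                # direction/magnitude of image shift
mean_abs = sum(abs(e) for e in errors) / len(errors)   # overall localisation performance
print(errors, mean_signed, mean_abs)
```

A mean signed error near zero with a large mean absolute error indicates scattered responses rather than a consistent shift in panning angle, which is why the two statistics answer different questions.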