980 results for approach bias
Abstract:
Previous research has shown that action tendencies to approach alcohol may be modified using the computerized Approach-Avoidance Task (AAT), and that this affects subsequent consumption. A recent paper in this journal (Becker, Jostman, Wiers, & Holland, 2015) failed to show significant training effects for food in three studies, nor did it find effects on subsequent consumption. However, avoidance training to high-calorie foods was tested against a control condition rather than against approach training. The present study used a paradigm more comparable to the alcohol studies. It randomly assigned 90 participants to ‘approach’ or ‘avoid’ chocolate images on the AAT, and then asked them to taste and rate chocolates. A significant interaction of condition and time showed that training to avoid chocolate resulted in faster avoidance responses to chocolate images, compared with training to approach it. Consistent with Becker et al.'s Study 3, no effect was found on the amount of chocolate consumed, although a newly published study in this journal (Schumacher, Kemps, & Tiggemann, 2016) did find such an effect. The collective evidence does not yet provide a solid basis for applying AAT training to the reduction of problematic food consumption, although clinical trials have yet to be conducted.
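The training effect reported here (faster avoidance responses after avoidance training) is typically quantified from reaction times. A minimal sketch of one common scoring convention, with invented trial data and a hypothetical `approach_bias` helper (not the study's actual analysis):

```python
# Hypothetical sketch: computing an approach-bias score from AAT reaction times.
# Data layout is invented for illustration: one (participant, movement, rt_ms) per trial.
from statistics import median

trials = [
    ("p01", "approach", 512), ("p01", "approach", 498),
    ("p01", "avoid", 601),    ("p01", "avoid", 577),
]

def approach_bias(trials, participant):
    """Median avoid RT minus median approach RT for one participant.

    Positive values mean the participant was faster to approach than to
    avoid, i.e. a relative approach bias toward the stimulus category.
    """
    approach = [rt for p, m, rt in trials if p == participant and m == "approach"]
    avoid = [rt for p, m, rt in trials if p == participant and m == "avoid"]
    return median(avoid) - median(approach)

print(approach_bias(trials, "p01"))  # 84.0
```

A condition-by-time interaction like the one in the abstract would then compare how such scores change from pre- to post-training across the two training groups.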
Abstract:
As borne out by everyday social experience, social cognition is highly dependent on context, modulated by a host of factors that arise from the social environment in which we live. While streamlined laboratory research provides excellent experimental control, it may be limited to telling us about the capabilities of the brain under artificial conditions rather than elucidating the processes that come into play in the real world. Considering the impact of ecologically valid contextual cues on social cognition will improve the generalizability of social neuroscience findings, including to pathology such as psychiatric illness. To help bridge the gap between laboratory research and social cognition as we experience it in the real world, this thesis investigates three themes: (1) increasing the naturalness of stimuli with richer contextual cues, (2) the potentially special contextual case of social cognition when two people interact directly, and (3) experimental believability, which runs in parallel to the first two themes. Focusing on the first two themes, in work with two patient populations, we explore neural contributions to two topics in social cognition. First, we document a basic approach bias in rare patients with bilateral lesions of the amygdala. This finding is then related to the contextual factor of ambiguity, and further investigated together with other contextual cues in a sample of healthy individuals tested over the internet, finally yielding a hierarchical decision tree for social threat evaluation. Second, we demonstrate that neural processing of eye gaze in brain structures related to face, gaze, and social processing is modulated differently by the direct presence of another live person. This question is investigated using fMRI in people with autism and controls.
Across a range of topics, we demonstrate that two themes of ecological validity — integration of naturalistic contextual cues, and social interaction — influence social cognition, that particular brain structures mediate this processing, and that it will be crucial to study interaction in order to understand disorders of social interaction such as autism.
Abstract:
General circulation models (GCMs) are routinely used to simulate future climatic conditions. However, rainfall outputs from GCMs are highly uncertain in preserving temporal correlations, frequencies, and intensity distributions, which limits their direct application for downscaling and hydrological modeling studies. To address these limitations, raw outputs of GCMs or regional climate models are often bias corrected using past observations. In this paper, a methodology is presented for using a nested bias-correction approach to predict the frequencies and occurrences of severe droughts and wet conditions across India for a 48-year period (2050-2099) centered at 2075. Specifically, monthly time series of rainfall from 17 GCMs are used to draw conclusions for extreme events. An increasing trend in the frequencies of droughts and wet events is observed. The northern part of India and coastal regions show the greatest increase in the frequency of wet events. Drought events are expected to increase in the west central, peninsular, and central northeast regions of India. (C) 2013 American Society of Civil Engineers.
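Bias correction of GCM rainfall is commonly built on quantile mapping between modelled and observed distributions; the nested approach referenced above applies such corrections across multiple time scales. The sketch below shows only a single-scale empirical quantile-mapping step on synthetic data, as an illustration of the idea rather than the paper's method:

```python
import numpy as np

def quantile_map(model_hist, obs_hist, future):
    """Empirical quantile mapping: replace each future model value with the
    observed value at the same empirical quantile of the historical model CDF."""
    model_hist = np.sort(np.asarray(model_hist, float))
    obs_hist = np.asarray(obs_hist, float)
    # empirical quantile of each future value within the historical model distribution
    q = np.searchsorted(model_hist, future, side="right") / len(model_hist)
    q = np.clip(q, 0.0, 1.0)
    return np.quantile(obs_hist, q)

rng = np.random.default_rng(0)
model_hist = rng.gamma(2.0, 40.0, 600)  # synthetic monthly rainfall, mm (illustrative)
obs_hist = rng.gamma(2.0, 50.0, 600)    # observations wetter than the model on average
future = rng.gamma(2.0, 44.0, 600)
corrected = quantile_map(model_hist, obs_hist, future)
```

Because the synthetic observations are wetter than the synthetic model output, the mapping shifts the future series upward, mimicking the removal of a dry bias.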
Abstract:
This study addresses the effectiveness of the Anti-Bias approach and training methodology as a pedagogical and political strategy to challenge oppression among student groups in the cities of Bombay and Berlin. The Anti-Bias trainings conducted within the framework of this study also become the medium through which the perpetuation of oppressive structures by students, within and outside the school, is investigated. Empirical data from predominantly qualitative investigations in four secondary schools, two each in Bombay and Berlin, are studied and analysed on the basis of theoretical understandings of prejudice, discrimination and identity. This study builds on insights offered by previous research on prejudice and on evaluations of anti-bias and diversity interventions, which have identified a lack of sufficient research and of thorough evaluations testing impact (Levy Paluck, 2006). The theoretical framework suggests that prejudices and discriminatory practices are learnt and performed by individuals over the years by way of pre-existing discourses, and that behaviour and practices can be unlearnt through a multi-step process. It proposes that the discursive practices of students contribute to the constitution of their viable selves and of ‘others’. Drawing on this framework, the study demonstrates how student-subjects in Bombay and Berlin perpetuate oppressive discourses by performing their identities and performing identities onto ‘others’. Such performative constitution opens up the agency of the individual, disclosing the shifting and dynamic nature of identities. The Anti-Bias approach is posited as an alternative to oppressive discourses and as a vehicle that encourages and assists the agency of individuals.
The theoretical framework, which brings together a psychological approach to prejudice, a structural approach to discrimination and a poststructural approach to identity, facilitates the analysis of the perpetuation of dominant discourses by the students, as well as of how they negotiate their way through familiar norms and discourses. Group discussions and interviews a year after the respective trainings serve to evaluate the agency of the students and the extent to which the training impacted their perceptions, attitudes and behavioural practices. The study reveals the recurrence of the themes of race, religion, gender and sexuality in the representational practices of the student groups in Berlin and Bombay. It demonstrates how students in this study not only perform, but also negotiate and resist, oppressive structures. Of particular importance is the role of the school: when schools offer no spaces for discussion, debate and action on contemporary social issues, learning can neither be put into practice nor take on a positive, transformative form. In such cases, agency and resistance are limited and interventionist actions yield little. This study reports the potential of the Anti-Bias approach and training as a tool of political education and action in education. It demonstrates that a single training can initiate change, but sustaining change requires long-term strategies and ongoing actions. Taking a poststructural perspective, it makes concrete suggestions to adapt and alter the Anti-Bias approach and the implementation of Anti-Bias trainings.
Abstract:
OBJECTIVES: This contribution provides a unifying concept for meta-analysis that integrates the handling of unobserved heterogeneity, study covariates, publication bias and study quality. It is important to consider these issues simultaneously to avoid artifacts, and a method for doing so is suggested here. METHODS: The approach is based upon the meta-likelihood in combination with a general linear nonparametric mixed model, which lays the ground for all inferential conclusions suggested here. RESULTS: The concept is illustrated using a meta-analysis investigating the relationship between hormone replacement therapy and breast cancer. The phenomenon of interest has been investigated in many studies over a considerable time, and different results were reported. In 1992, a meta-analysis by Sillero-Arenas et al. concluded a small but significant overall effect of 1.06 on the relative risk scale. Using the meta-likelihood approach, it is demonstrated here that this meta-analysis is affected by considerable unobserved heterogeneity. Furthermore, it is shown that new methods are available to model this heterogeneity successfully. It is further argued that available study covariates should be included to explain this heterogeneity in the meta-analysis at hand. CONCLUSIONS: The topic of HRT and breast cancer has again very recently become an issue of public debate, when results of a large trial investigating the health effects of hormone replacement therapy were published, indicating an increased risk for breast cancer (risk ratio of 1.26). Using an adequate regression model in the previously published meta-analysis, an adjusted effect estimate of 1.14 can be given, which is considerably higher than the one published in the meta-analysis of Sillero-Arenas et al. In summary, it is hoped that the method suggested here contributes further to good meta-analytic practice in public health and clinical disciplines.
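For readers unfamiliar with how unobserved heterogeneity enters a meta-analytic summary, a compact baseline is the moment-based DerSimonian-Laird random-effects model. This is not the nonparametric meta-likelihood of the abstract; the sketch below, with invented study data, only shows how an estimated between-study variance tau^2 widens study weights and shifts the pooled effect:

```python
import math

def dersimonian_laird(effects, variances):
    """Moment-based random-effects summary (DerSimonian-Laird), shown as a
    baseline only; the abstract's method is a nonparametric mixed model."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, tau2, se

# invented log relative risks and within-study variances for three studies
effects = [0.10, -0.05, 0.25]
variances = [0.02, 0.03, 0.01]
pooled, tau2, se = dersimonian_laird(effects, variances)
```

When tau2 is large, the pooled estimate and its standard error can differ substantially from a fixed-effect summary, which is exactly the kind of artifact the unifying approach above is designed to surface.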
Abstract:
In this paper, we propose a novel approach to econometric forecasting of stationary and ergodic time series within a panel-data framework. Our key element is to employ the (feasible) bias-corrected average forecast. Using panel-data sequential asymptotics we show that it is potentially superior to other techniques in several contexts. In particular, it is asymptotically equivalent to the conditional expectation, i.e., has an optimal limiting mean-squared error. We also develop a zero-mean test for the average bias and discuss the forecast-combination puzzle in small and large samples. Monte-Carlo simulations are conducted to evaluate the performance of the feasible bias-corrected average forecast in finite samples. An empirical exercise based upon data from a well-known survey is also presented. Overall, theoretical and empirical results show promise for the feasible bias-corrected average forecast.
Abstract:
In this paper, we propose a novel approach to econometric forecasting of stationary and ergodic time series within a panel-data framework. Our key element is to employ the bias-corrected average forecast. Using panel-data sequential asymptotics we show that it is potentially superior to other techniques in several contexts. In particular, it delivers a zero-limiting mean-squared error if the number of forecasts and the number of post-sample time periods are sufficiently large. We also develop a zero-mean test for the average bias. Monte-Carlo simulations are conducted to evaluate the performance of this new technique in finite samples. An empirical exercise, based upon data from well-known surveys, is also presented. Overall, these results show promise for the bias-corrected average forecast.
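The core idea of a bias-corrected average forecast can be shown in a few lines. This is a simplified sketch under the assumption of constant individual biases, not the paper's estimator: each forecaster's average in-sample bias is removed before the forecasts are averaged across the panel.

```python
import numpy as np

def bias_corrected_average_forecast(forecasts, actuals):
    """Remove each forecaster's mean bias, then average across the panel.

    forecasts: (n_forecasters, n_periods) array; actuals: (n_periods,) array.
    """
    forecasts = np.asarray(forecasts, float)
    actuals = np.asarray(actuals, float)
    bias = (forecasts - actuals).mean(axis=1, keepdims=True)  # per-forecaster mean bias
    return (forecasts - bias).mean(axis=0)                    # cross-section average

actuals = np.array([1.0, 2.0, 3.0, 4.0])
forecasts = np.array([
    actuals + 0.5,   # forecaster 1: constant upward bias
    actuals - 0.5,   # forecaster 2: constant downward bias
])
print(bias_corrected_average_forecast(forecasts, actuals))  # [1. 2. 3. 4.]
```

In this stylised case the individual biases cancel exactly after correction; the papers' asymptotic results concern what happens as the numbers of forecasters and periods grow.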
Abstract:
The measurement of fast changing temperature fluctuations is a challenging problem due to the inherent limited bandwidth of temperature sensors. This results in a measured signal that is a lagged and attenuated version of the input. Compensation can be performed provided an accurate, parameterised sensor model is available. However, to account for the influence of the measurement environment and changing conditions such as gas velocity, the model must be estimated in-situ. The cross-relation method of blind deconvolution is one approach for in-situ characterisation of sensors. However, a drawback with the method is that it becomes positively biased and unstable at high noise levels. In this paper, the cross-relation method is cast in the discrete-time domain and a bias compensation approach is developed. It is shown that the proposed compensation scheme is robust and yields unbiased estimates with lower estimation variance than the uncompensated version. All results are verified using Monte-Carlo simulations.
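The cross-relation identity behind this approach states that for two sensors observing the same input, y1*h2 = y2*h1, so the stacked impulse responses lie in the null space of a data matrix. A minimal noise-free sketch of the basic estimator follows (without the bias compensation the paper develops; the FIR models and simulation are illustrative):

```python
import numpy as np

def conv_matrix(y, L):
    """Toeplitz matrix so that conv_matrix(y, L) @ h == np.convolve(y, h)[:len(y)]."""
    n = len(y)
    M = np.zeros((n, L))
    for k in range(L):
        M[k:, k] = y[: n - k]
    return M

def cross_relation_estimate(y1, y2, L):
    """Basic (uncompensated) cross-relation estimate of two length-L FIR
    sensor responses h1, h2 from their outputs y1 = h1*u, y2 = h2*u.

    Since y1*h2 - y2*h1 = 0, the stacked vector [h2; h1] spans the null
    space of [conv(y1), -conv(y2)]; it is recovered as the right singular
    vector with the smallest singular value (unit norm, sign/scale ambiguity).
    """
    A = np.hstack([conv_matrix(y1, L), -conv_matrix(y2, L)])
    _, _, vt = np.linalg.svd(A)
    v = vt[-1]
    return v[L:], v[:L]   # estimates of h1, h2 up to a common scale factor

rng = np.random.default_rng(1)
u = rng.standard_normal(400)        # unknown excitation (e.g. temperature input)
h1 = np.array([1.0, 0.5, 0.25])     # two illustrative sensor impulse responses
h2 = np.array([1.0, 0.8, 0.1])
y1 = np.convolve(u, h1)[:400]
y2 = np.convolve(u, h2)[:400]
e1, e2 = cross_relation_estimate(y1, y2, L=3)
```

With noise added to y1 and y2, this least-squares formulation acquires the positive bias discussed above, which motivates the paper's compensation scheme.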
Abstract:
Background and Purpose: At least part of the failure in the transition from experimental to clinical studies in stroke has been attributed to the imprecision introduced by problems in the design of experimental stroke studies. Using a meta-epidemiologic approach, we addressed the effect of randomization, blinding, and use of comorbid animals on the estimate of how effectively therapeutic interventions reduce infarct size. Methods: Electronic and manual searches were performed to identify meta-analyses that described interventions in experimental stroke. For each meta-analysis thus identified, a reanalysis was conducted to estimate the impact of various quality items on the estimate of efficacy, and these estimates were combined in a meta-meta-analysis to obtain a summary measure of the impact of the various design characteristics. Results: Thirteen meta-analyses that described outcomes in 15,635 animals were included. Studies that included unblinded induction of ischemia reported effect sizes 13.1% (95% CI, 0.2% to 26.4%) greater than studies that included blinding, and studies that included healthy animals instead of animals with comorbidities overstated the effect size by 11.5% (95% CI, 1.8% to 21.2%). No significant effect was found for randomization, blinded outcome assessment, or high aggregate CAMARADES quality score. Conclusions: We provide empirical evidence of bias in the design of studies, with studies that included unblinded induction of ischemia or healthy animals overestimating the effectiveness of the intervention. This bias could account for the failure of stroke therapies in the transition from bench to bedside.
Abstract:
This paper seeks to explain the lagging productivity in Singapore’s manufacturing noted in the Economic Strategies Committee Report 2010. Two methods are employed: the Malmquist productivity index to measure total factor productivity change, and Simar and Wilson’s (J Econ, 136:31–64, 2007) bootstrapped truncated regression approach. In the first stage, nonparametric data envelopment analysis is used to measure technical efficiency. To quantify the economic drivers underlying inefficiencies, the second stage employs a bootstrapped truncated regression whereby bias-corrected efficiency estimates are regressed against explanatory variables. The findings reveal that growth in total factor productivity was attributed to efficiency change with no technical progress. Most industries were technically inefficient throughout the period except for ‘Pharmaceutical Products’. Efficiency gains were attributed to worker quality and flexible work arrangements, while continued reliance on foreign workers lowered efficiency.
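As background to the two-stage procedure, each first-stage DEA efficiency score is the solution of a small linear program. The sketch below solves the input-oriented CCR model on invented data (the second-stage bias-correcting bootstrap and truncated regression of Simar and Wilson are not reproduced):

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y, o):
    """Input-oriented CCR (constant returns to scale) efficiency for DMU o.

    X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs). Returns theta in (0, 1],
    where theta = 1 means DMU o lies on the efficient frontier.
    """
    n, m = X.shape
    s = Y.shape[1]
    # decision vector: [theta, lambda_1, ..., lambda_n]
    c = np.zeros(1 + n)
    c[0] = 1.0                          # minimise theta
    A_ub = np.zeros((m + s, 1 + n))
    b_ub = np.zeros(m + s)
    A_ub[:m, 0] = -X[o]                 # sum_j lam_j * x_ij <= theta * x_io
    A_ub[:m, 1:] = X.T
    A_ub[m:, 1:] = -Y.T                 # sum_j lam_j * y_rj >= y_ro
    b_ub[m:] = -Y[o]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

# two inputs, one output, four hypothetical industries
X = np.array([[2.0, 3.0], [4.0, 6.0], [3.0, 2.0], [5.0, 5.0]])
Y = np.array([[1.0], [2.0], [1.0], [1.5]])
scores = [dea_ccr_efficiency(X, Y, o) for o in range(len(X))]
```

In the second stage described above, such scores (after bias correction) become the dependent variable in a truncated regression on explanatory variables like workforce characteristics.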
Abstract:
Nutrition interventions in the form of both self-management education and individualised diet therapy are considered essential for the long-term management of type 2 diabetes mellitus (T2DM). The measurement of diet is essential to inform, support and evaluate nutrition interventions in the management of T2DM. Barriers inherent within health care settings and systems limit ongoing access to personnel and resources, while traditional prospective methods of assessing diet are burdensome for the individual and often result in changes in typical intake to facilitate recording. This thesis investigated the inclusion of information and communication technologies (ICT) to overcome limitations of current approaches to the nutritional management of T2DM, in particular the development, trial and evaluation of the Nutricam dietary assessment method (NuDAM), consisting of a mobile phone photo/voice application to assess nutrient intake in a free-living environment with older adults with T2DM.

Study 1: Effectiveness of an automated telephone system in promoting change in dietary intake among adults with T2DM
The effectiveness of an automated telephone system, Telephone-Linked Care (TLC) Diabetes, designed to deliver self-management education was evaluated in terms of promoting dietary change in adults with T2DM and sub-optimal glycaemic control. In this secondary data analysis, independent of the larger randomised controlled trial, complete data were available for 95 adults (59 male; mean age(±SD)=56.8±8.1 years; mean(±SD) BMI=34.2±7.0 kg/m2). The treatment effect showed a reduction in total fat of 1.4% and saturated fat of 0.9% of energy intake, body weight of 0.7 kg and waist circumference of 2.0 cm. In addition, a significant increase in the nutrition self-efficacy score of 1.3 (p<0.05) was observed in the TLC group compared to the control group.
The modest trends observed in this study indicate that the TLC Diabetes system does support the adoption of positive nutrition behaviours as a result of diabetes self-management education; however, caution must be applied in interpreting the results due to the inherent limitations of the dietary assessment method used. The decision to use a closed-list FFQ with known bias may have influenced the accuracy of reported dietary intake in this instance. This study provided an example of the methodological challenges experienced in measuring changes in absolute diet using an FFQ, and reaffirmed the need for novel prospective assessment methods capable of capturing natural variance in usual intakes.

Study 2: The development and trial of the NuDAM recording protocol
The feasibility of the Nutricam mobile phone photo/voice dietary record was evaluated in 10 adults with T2DM (6 male; age=64.7±3.8 years; BMI=33.9±7.0 kg/m2). Intake was recorded over a 3-day period using both Nutricam and a written estimated food record (EFR). Compared to the EFR, the Nutricam device was found to be acceptable among subjects; however, energy intake was under-recorded using Nutricam (-0.6±0.8 MJ/day; p<0.05). Beverages and snacks were the items most frequently not recorded using Nutricam; however, forgotten meals contributed the greatest difference in energy intake between records. In addition, the quality of dietary data recorded using Nutricam was unacceptable for just under one-third of entries. It was concluded that an additional mechanism was necessary to complement dietary information collected via Nutricam. Modifications to the method were made to allow for clarification of Nutricam entries and probing for forgotten foods during a brief phone call to the subject the following morning. The revised recording protocol was evaluated in Study 4.
Study 3: The development and trial of the NuDAM analysis protocol
Part A explored the effect of the type of portion size estimation aid (PSEA) on the error associated with quantifying four portions of 15 single food items contained in photographs. Seventeen dietetic students (1 male; age=24.7±9.1 years; BMI=21.1±1.9 kg/m2) estimated all food portions on two occasions: without aids and with aids (food models or reference food photographs). Overall, the use of a PSEA significantly reduced the mean (±SD) group error between estimates compared to no aid (-2.5±11.5% vs. 19.0±28.8%; p<0.05). The type of PSEA (i.e. food models vs. reference food photographs) did not have a notable effect on the group estimation error (-6.7±14.9% vs. 1.4±5.9%, respectively; p=0.321). This exploratory study provided evidence that the use of aids in general, rather than their type, was more effective in reducing estimation error. Findings guided the development of the Dietary Estimation and Assessment Tool (DEAT) for use in the analysis of the Nutricam dietary record.
Part B evaluated the effect of the DEAT on the error associated with the quantification of two 3-day Nutricam dietary records in a sample of 29 dietetic students (2 males; age=23.3±5.1 years; BMI=20.6±1.9 kg/m2). Subjects were randomised into two groups: Group A and Group B. For Record 1, the use of the DEAT (Group A) resulted in a smaller error compared to estimations made without the tool (Group B) (17.7±15.8%/day vs. 34.0±22.6%/day, respectively; p=0.331). In comparison, all subjects used the DEAT to estimate Record 2, with the resultant error similar between Groups A and B (21.2±19.2%/day vs. 25.8±13.6%/day, respectively; p=0.377). In general, the moderate estimation error associated with quantifying food items did not translate into clinically significant differences in the nutrient profile of the Nutricam dietary records; only amorphous foods were notably over-estimated in energy content without the use of the DEAT (57 kJ/day vs.
274 kJ/day; p<0.001). A large proportion (89.6%) of the group found the DEAT helpful when quantifying food items contained in the Nutricam dietary records. The use of the DEAT reduced quantification error, minimising any potential effect on the estimation of energy and macronutrient intake.

Study 4: Evaluation of the NuDAM
The accuracy and inter-rater reliability of the NuDAM in assessing energy and macronutrient intake was evaluated in a sample of 10 adults (6 males; age=61.2±6.9 years; BMI=31.0±4.5 kg/m2). Intake recorded using both the NuDAM and a weighed food record (WFR) was coded by three dietitians and compared with an objective measure of total energy expenditure (TEE) obtained using the doubly labelled water technique. At the group level, energy intake (EI) was under-reported to a similar extent using both methods, with a ratio of EI:TEE of 0.76±0.20 for the NuDAM and 0.76±0.17 for the WFR. At the individual level, four subjects reported implausible levels of energy intake using the WFR, compared to three using the NuDAM. Overall, moderate to high correlation coefficients (r=0.57-0.85) were found across energy and macronutrients, except fat (r=0.24), between the two dietary measures. High agreement was observed between dietitians for energy and macronutrient estimates derived from both the NuDAM (ICC=0.77-0.99; p<0.001) and the WFR (ICC=0.82-0.99; p<0.001). All subjects preferred using the NuDAM over the WFR to record intake and were willing to use the novel method again over longer recording periods.

This research program explored two novel approaches that utilised distinct technologies to aid the nutritional management of adults with T2DM. In particular, this thesis makes a significant contribution to the evidence base surrounding the use of PhRs through the development, trial and evaluation of a novel mobile phone photo/voice dietary record.
The NuDAM is an extremely promising advancement in the nutritional management of individuals with diabetes and other chronic conditions. Future applications lie in integrating the NuDAM with other technologies to facilitate practice across the remaining stages of the nutrition care process.