37 results for peer acceptance-rejection
Abstract:
This paper discusses many of the issues associated with formally publishing data in academia, focusing primarily on the structures that need to be put in place for peer review and formal citation of datasets. Data publication is becoming increasingly important to the scientific community, as it will provide a mechanism for those who create data to receive academic credit for their work and will allow the conclusions arising from an analysis to be more readily verifiable, thus promoting transparency in the scientific process. Peer review of data will also provide a mechanism for ensuring the quality of datasets, and we provide suggestions on the types of activities one expects to see in the peer review of data. A simple taxonomy of data publication methodologies is presented and evaluated, and the paper concludes with a discussion of dataset granularity, transience and semantics, along with a recommended human-readable citation syntax.
Abstract:
Aggression in young people has been associated with a bias towards attributing hostile intent to others; however, little is known about the origin of biased social information processing. The current study explored the potential role of peer contagion in the emergence of hostile attribution in adolescents. A total of 134 adolescents were assigned to one of two manipulated ‘chat-room’ conditions, where they believed they were communicating with online peers (e-confederates) who endorsed either hostile or benign intent attributions. Adolescents showed increased hostile attributions following exposure to hostile e-confederates and reduced hostility in the benign condition. Further analyses demonstrated that social anxiety was associated with a reduced tendency to take on hostile peer attitudes. Neither gender nor level of aggression influenced individual susceptibility to peer influence, but aggressive adolescents reported greater affinity with hostile e-confederates.
Abstract:
In Peer-to-Peer (P2P) networks, it is often desirable to assign node IDs that preserve locality relationships in the underlying topology. Node locality can be embedded into node IDs by applying a one-dimensional Hilbert space-filling curve mapping to a vector of network distances from each node to a subset of reference landmark nodes within the network. However, this approach is fundamentally limited: while robustness and accuracy might be expected to improve with the number of landmarks, the effectiveness of the one-dimensional Hilbert curve mapping degrades due to the curse of dimensionality. This work proposes an approach that addresses this issue by using Landmark Multidimensional Scaling (LMDS) to reduce a large set of landmarks to a smaller set of virtual landmarks. This smaller set has been postulated to represent the intrinsic dimensionality of the network space, and therefore a space-filling curve applied to these virtual landmarks is expected to produce a better mapping of the node ID space. The proposed approach, the Virtual Landmarks Hilbert Curve (VLHC), is particularly suitable for decentralised systems such as P2P networks. In the experimental simulations, the effectiveness of the methods is measured by the locality preservation of the node IDs, expressed in terms of latency to nearest neighbours. A variety of realistic network topologies are simulated, and this work provides strong evidence that VLHC performs better than either Hilbert curves or LMDS used independently of each other.
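The final step of such a pipeline can be illustrated in isolation: once LMDS has reduced each node's landmark-distance vector to a low-dimensional virtual coordinate, a Hilbert curve linearises those coordinates into one-dimensional node IDs. A minimal sketch, assuming two virtual dimensions and the standard Hilbert index algorithm; the function names and grid size are illustrative, not taken from the paper:

```python
def hilbert_index(n, x, y):
    # Map cell (x, y) on an n x n grid (n a power of two) to its
    # position along the Hilbert curve, preserving spatial locality.
    d = 0
    s = n // 2
    while s > 0:
        rx = 1 if (x & s) > 0 else 0
        ry = 1 if (y & s) > 0 else 0
        d += s * s * ((3 * rx) ^ ry)
        # Rotate the quadrant so the sub-curve orientation stays consistent
        if ry == 0:
            if rx == 1:
                x = s - 1 - x
                y = s - 1 - y
            x, y = y, x
        s //= 2
    return d

def node_id(virtual_coord, n=256):
    # Quantise a 2-D virtual-landmark coordinate in [0, 1)^2 onto the
    # grid, then take its Hilbert index as the node ID.
    x = min(int(virtual_coord[0] * n), n - 1)
    y = min(int(virtual_coord[1] * n), n - 1)
    return hilbert_index(n, x, y)
```

The locality property that motivates the approach is visible directly: cells with consecutive Hilbert indices are always adjacent on the grid, so nearby node IDs correspond to nearby virtual coordinates.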
Abstract:
It is predicted that non-communicable diseases will account for over 73 % of global mortality in 2020. Given that the majority of these deaths occur in developed countries such as the UK, and that up to 80 % of chronic disease could be prevented through improvements in diet and lifestyle, it is imperative that dietary guidelines and disease prevention strategies are reviewed in order to improve their efficacy. Since the completion of the human genome project our understanding of complex interactions between environmental factors such as diet and genes has progressed considerably, as has the potential to individualise diets using dietary, phenotypic and genotypic data. Thus, there is an ambition for dietary interventions to move away from population-based guidance towards 'personalised nutrition'. The present paper reviews current evidence for the public acceptance of genetic testing and personalised nutrition in disease prevention. Health and clear consumer benefits have been identified as key motivators in the uptake of genetic testing, with individuals reporting personal experience of disease, such as those with specific symptoms, being more willing to undergo genetic testing for the purpose of personalised nutrition. This greater perceived susceptibility to disease may also improve motivation to change behaviour which is a key barrier in the success of any nutrition intervention. Several consumer concerns have been identified in the literature which should be addressed before the introduction of a nutrigenomic-based personalised nutrition service. Future research should focus on the efficacy and implementation of nutrigenomic-based personalised nutrition.
Abstract:
The hybrid Monte Carlo (HMC) method is a popular and rigorous method for sampling from a canonical ensemble. The HMC method is based on classical molecular dynamics simulations combined with a Metropolis acceptance criterion and a momentum resampling step. While the HMC method completely resamples the momentum after each Monte Carlo step, the generalized hybrid Monte Carlo (GHMC) method can be implemented with a partial momentum refreshment step. This property seems desirable for retaining some of the dynamic information throughout the sampling process, similar to stochastic Langevin and Brownian dynamics simulations. It is, however, crucial to the success of the GHMC method that the rejection rate in the molecular dynamics part is kept at a minimum; otherwise an undesirable Zitterbewegung in the Monte Carlo samples is observed. In this paper, we describe a method to achieve very low rejection rates by using a modified energy, which is preserved to high order along molecular dynamics trajectories. The modified energy is based on backward error results for symplectic time-stepping methods. The proposed generalized shadow hybrid Monte Carlo (GSHMC) method is applicable to NVT as well as NPT ensemble simulations.
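The GHMC step described above (partial momentum refreshment, a short symplectic trajectory, and a Metropolis test with a momentum flip on rejection) can be sketched for a toy one-dimensional Gaussian target. The step size, refreshment angle and potential below are illustrative choices, not values from the paper, and the shadow-Hamiltonian modification of GSHMC is omitted:

```python
import math
import random

def leapfrog(q, p, eps, n_steps, grad_U):
    # Symplectic (leapfrog) integration of Hamiltonian dynamics
    p -= 0.5 * eps * grad_U(q)
    for _ in range(n_steps - 1):
        q += eps * p
        p -= eps * grad_U(q)
    q += eps * p
    p -= 0.5 * eps * grad_U(q)
    return q, p

def ghmc_sample(n_samples, phi=0.5, eps=0.2, n_steps=5, seed=1):
    # GHMC for a 1-D standard normal target: U(q) = q^2 / 2
    U = lambda q: 0.5 * q * q
    grad_U = lambda q: q
    rng = random.Random(seed)
    q, p = 0.0, rng.gauss(0.0, 1.0)
    samples = []
    for _ in range(n_samples):
        # Partial momentum refreshment (phi = pi/2 recovers plain HMC)
        p = math.cos(phi) * p + math.sin(phi) * rng.gauss(0.0, 1.0)
        H_old = U(q) + 0.5 * p * p
        q_new, p_new = leapfrog(q, p, eps, n_steps, grad_U)
        H_new = U(q_new) + 0.5 * p_new * p_new
        if rng.random() < math.exp(min(0.0, H_old - H_new)):
            q, p = q_new, p_new   # accept the trajectory endpoint
        else:
            p = -p                # reject: flip the momentum
        samples.append(q)
    return samples
```

The momentum flip on rejection keeps the chain reversible, but each flip reverses the direction of travel; this is the Zitterbewegung the abstract refers to, and it is why GHMC needs the rejection rate kept very low.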
Abstract:
Currently, UK fruit and vegetable intakes are below recommendations. Bread is a staple food consumed by ~95% of adults in western countries, and it provides an ideal matrix by which functionality can be delivered to the consumer in an accepted food. Therefore, enriching bread with vegetables may be an effective strategy to increase vegetable consumption. This study evaluated consumer acceptance, purchase intent and intention of product replacement for bread enriched with red beetroot, carrot with coriander, red pepper with tomato, or white beetroot (80 g vegetable per 200 g serving), compared with a white control bread (0 g vegetable). Consumers (n=120) rated their liking of the breads overall, as well as their liking of appearance, flavour and texture, using nine-point hedonic scales. Product replacement and purchase intent were rated using five-point scales. The effect of providing consumers with health information about the breads was also evaluated. There were significant differences between the breads in overall liking (P<0.0001), as well as in liking of appearance (P<0.0001), flavour (P=0.0002) and texture (P=0.04). However, these differences were driven by the red beetroot bread, which was significantly (P<0.05) less liked than the control; there were no significant differences in overall liking between any of the other vegetable-enriched breads and the control bread (no vegetable inclusion). The provision of health information about the breads did not increase consumer liking of the vegetable-enriched breads. In conclusion, this study demonstrated that vegetable-enriched bread appears to be an acceptable strategy to increase vegetable intake; however, liking depended on vegetable type.
Abstract:
This paper reviews theories and models of users’ acceptance and use in relation to “persuasive technology”, and justifies the need to add consideration of ‘perceived persuasiveness’. We conclude by identifying variables associated with perceived persuasiveness, and highlight important future research directions in this domain.
Abstract:
With an aging global population, the number of people living with a chronic illness is expected to increase significantly by 2050. If left unmanaged, chronic illness leads to serious health complications, resulting in poor patient quality of life and a costly time bomb for care providers. If effectively managed, patients with chronic conditions tend to live richer and healthier lives, resulting in a less costly total care solution. This chapter considers literature from the areas of technology acceptance and care self-management, which aims to alleviate symptoms and/or reasons for non-acceptance of care, and thus minimise the risk of long-term complications, which in turn reduces the chance of spiralling health expenditure. By bringing together these areas, the chapter highlights where self-management is failing, so that changes can be made to care in advance of health deterioration.
Abstract:
Persuasive technologies have been extensively applied in the context of e-commerce for the purposes of marketing, enhancing system credibility, and motivating users to adopt systems. However, the impact of persuasion on consumers’ online purchasing behaviour has not been investigated previously. This study reviews theories of technology acceptance and identifies their limitation in not considering the effect of persuasive technologies when determining users’ acceptance of online technology. The study proposes a theoretical model that considers the effect of persuasive technologies on consumer acceptance of e-commerce websites, together with other related variables, i.e. trust and technological attributes. Moreover, the paper proposes a model based on UTAUT2, which contains the relevant contributing factors, including the concept of perceived persuasiveness.
Abstract:
Persuasive technologies, within the domain of interactive technology, are broadly used in social contexts to encourage customers towards positive behaviour change. In the context of e-commerce, persuasive technologies have already been extensively applied in the area of marketing, for example to enhance system credibility; however, the issue of ‘persuasiveness’, and its role in positive user acceptance of technology, has not been investigated in the technology acceptance literature. This paper reviews theories and models of users’ acceptance and use in relation to persuasive technology, and identifies their limitations in considering the impact of persuasive technology on users’ acceptance of technology, thus justifying the need to add consideration of ‘perceived persuasiveness’. We conclude by identifying variables associated with perceived persuasiveness and suggesting key directions for future research.
Abstract:
The quantification of uncertainty is an increasingly popular topic, with clear importance for climate change policy. However, uncertainty assessments are open to a range of interpretations, each of which may lead to a different policy recommendation. In the EQUIP project researchers from the UK climate modelling, statistical modelling, and impacts communities worked together on ‘end-to-end’ uncertainty assessments of climate change and its impacts. Here, we use an experiment in peer review amongst project members to assess variation in the assessment of uncertainties between EQUIP researchers. We find overall agreement on key sources of uncertainty but a large variation in the assessment of the methods used for uncertainty assessment. Results show that communication aimed at specialists makes the methods used harder to assess. There is also evidence of individual bias, which is partially attributable to disciplinary backgrounds. However, varying views on the methods used to quantify uncertainty did not preclude consensus on the consequential results produced using those methods. Based on our analysis, we make recommendations for developing and presenting statements on climate and its impacts. These include the use of a common uncertainty reporting format in order to make assumptions clear; presentation of results in terms of processes and trade-offs rather than only numerical ranges; and reporting multiple assessments of uncertainty in order to elucidate a more complete picture of impacts and their uncertainties. This in turn implies research should be done by teams of people with a range of backgrounds and time for interaction and discussion, with fewer but more comprehensive outputs in which the range of opinions is recorded.
Abstract:
This article critically reflects on the widely held view of a causal chain in which trust in public authorities influences technology acceptance via perceived risk. It first puts forward conceptual reasons against this view, as the presence of risk is a precondition for trust playing a role in decision making. Second, results from consumer surveys in Italy and Germany are presented that support the associationist model as a counter-hypothesis. In that view, trust and risk judgments are driven by, and thus simply indicators of, higher-order attitudes toward a certain technology, which determine acceptance instead. The implications of these findings are discussed.