865 results for Peer-review
Abstract:
This paper provides an extended guide to reviewing for ESPL in particular and geomorphology in general. After a brief consideration of both how we choose reviewers and why we hope that reviewers will accept, I consider what makes a fair and constructive review. I note that we aim to publish papers with the rigour (r) necessary to sustain an original and significant contribution (q). I note that judging q is increasingly difficult because of the ever-growing size of the discipline (the Q). This is the sense in which we rarely have a full appreciation of Q, and our reviews are inevitably going to contain some bias. It is this bias that cannot be avoided (cf. Nicholas and Gordon, 2011) and makes the job of ESPL's Editors of critical importance. With this in mind, I identify six elements of a good review: (1) an introductory statement that explains your assessment of your competences in relation to the manuscript (r and Q); (2) a summative view of the originality and significance of the manuscript (q) in relation to Q; (3) a summative view of the methodological rigour of the manuscript (r); (4) identification and justification of any major concerns; (5) identification of any minor issues to be corrected if you think the manuscript merits eventual publication; and (6) a note of any typographical or presentation issues to be addressed, although this latter activity is also an editorial responsibility. In addition, I note the importance of a constructive review, grounded in what is written in the manuscript, justified where appropriate, and avoiding reference to personal views as far as possible. I conclude with a discussion of whether or not you should sign your review openly and the importance of reviewer confidentiality. Copyright (C) 2012 John Wiley & Sons, Ltd.
Abstract:
Editorial
Abstract:
Guidance notes on review and evaluation processes. Part of the total hand-in required. See also http://www.edshare.soton.ac.uk/9937/ for use in context and http://www.edshare.soton.ac.uk/9911/ for guidance on the critical friend review process.
Abstract:
Description of how to conduct a peer review, and guidance on how to submit it as a task. Download and edit this document if you decide to hand in material relating to your peer review exercise.
Abstract:
READ the guidance notes, then attempt the tasks. CONTENTS: Peer review guidance.
Abstract:
Recording of the Elsevier Author Seminar by Dr Anthony Newman and Michaela Kurschildgen.
Abstract:
Description of how to conduct a peer review
Abstract:
This paper discusses many of the issues associated with formally publishing data in academia, focusing primarily on the structures that need to be put in place for peer review and formal citation of datasets. Data publication is becoming increasingly important to the scientific community, as it will provide a mechanism for those who create data to receive academic credit for their work and will allow the conclusions arising from an analysis to be more readily verifiable, thus promoting transparency in the scientific process. Peer review of data will also provide a mechanism for ensuring the quality of datasets, and we offer suggestions on the types of activities one expects to see in the peer review of data. A simple taxonomy of data publication methodologies is presented and evaluated, and the paper concludes with a discussion of dataset granularity, transience, and semantics, along with a recommended human-readable citation syntax.
Abstract:
The quantification of uncertainty is an increasingly popular topic, with clear importance for climate change policy. However, uncertainty assessments are open to a range of interpretations, each of which may lead to a different policy recommendation. In the EQUIP project, researchers from the UK climate modelling, statistical modelling, and impacts communities worked together on 'end-to-end' uncertainty assessments of climate change and its impacts. Here, we use an experiment in peer review amongst project members to assess variation in the assessment of uncertainties between EQUIP researchers. We find overall agreement on key sources of uncertainty but large variation in the assessment of the methods used for uncertainty assessment. Results show that communication aimed at specialists makes the methods used harder to assess. There is also evidence of individual bias, which is partially attributable to disciplinary background. However, varying views on the methods used to quantify uncertainty did not preclude consensus on the consequential results produced using those methods. Based on our analysis, we make recommendations for developing and presenting statements on climate and its impacts. These include the use of a common uncertainty reporting format in order to make assumptions clear; presentation of results in terms of processes and trade-offs rather than only numerical ranges; and reporting multiple assessments of uncertainty in order to elucidate a more complete picture of impacts and their uncertainties. This in turn implies that research should be done by teams of people with a range of backgrounds and time for interaction and discussion, with fewer but more comprehensive outputs in which the range of opinions is recorded.