869 results for Seix Barral Editors
Abstract:
Overwhelming evidence shows the quality of reporting of randomised controlled trials (RCTs) is not optimal. Without transparent reporting, readers cannot judge the reliability and validity of trial findings nor extract information for systematic reviews. Recent methodological analyses indicate that inadequate reporting and design are associated with biased estimates of treatment effects. Such systematic error is seriously damaging to RCTs, which are considered the gold standard for evaluating interventions because of their ability to minimise or avoid bias. A group of scientists and editors developed the CONSORT (Consolidated Standards of Reporting Trials) statement to improve the quality of reporting of RCTs. It was first published in 1996 and updated in 2001. The statement consists of a checklist and flow diagram that authors can use for reporting an RCT. Many leading medical journals and major international editorial groups have endorsed the CONSORT statement. The statement facilitates critical appraisal and interpretation of RCTs. During the 2001 CONSORT revision, it became clear that explanation and elaboration of the principles underlying the CONSORT statement would help investigators and others to write or appraise trial reports. A CONSORT explanation and elaboration article was published in 2001 alongside the 2001 version of the CONSORT statement. After an expert meeting in January 2007, the CONSORT statement was further revised and is published as the CONSORT 2010 Statement. This update improves the wording and clarity of the previous checklist and incorporates recommendations related to topics that have only recently received recognition, such as selective outcome reporting bias. This explanation and elaboration document, intended to enhance the use, understanding, and dissemination of the CONSORT statement, has also been extensively revised. It presents the meaning and rationale for each new and updated checklist item, providing examples of good reporting and, where possible, references to relevant empirical studies. Several examples of flow diagrams are included. The CONSORT 2010 Statement, this revised explanation and elaboration document, and the associated website (www.consort-statement.org) should be helpful resources to improve reporting of randomised trials.
Abstract:
Domain-specific languages (DSLs) are increasingly used as embedded languages within general-purpose host languages. DSLs provide a compact, dedicated syntax for specifying parts of an application related to specialized domains. Unfortunately, such language extensions typically do not integrate well with the development tools of the host language. Editors, compilers and debuggers are either unaware of the extensions, or must be adapted at a non-trivial cost. We present a novel approach to embed DSLs into an existing host language by leveraging the underlying representation of the host language used by these tools. Helvetia is an extensible system that intercepts the compilation pipeline of the Smalltalk host language to seamlessly integrate language extensions. We validate our approach by case studies that demonstrate three fundamentally different ways to extend or adapt the host language syntax and semantics.
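A minimal, hypothetical Python sketch of the general idea described above: source written in a small domain-specific notation is rewritten into ordinary host-language code before compilation, so the standard compiler and tooling only ever see the host language. This is a conceptual illustration only; Helvetia itself hooks the Smalltalk compilation pipeline, and none of the names below are part of its API.

import ast

def compile_rule_dsl(rule: str):
    """Translate a tiny 'parameter -> expression' rewrite notation into a Python function."""
    lhs, rhs = (part.strip() for part in rule.split("->"))
    # Build ordinary host-language (Python) source from the DSL fragment...
    src = f"lambda {lhs}: {rhs}"
    # ...then hand it to the normal compiler, as an embedding framework would do
    # inside its compilation hook, so editors and debuggers see plain host code.
    tree = ast.parse(src, mode="eval")
    return eval(compile(tree, filename="<rule-dsl>", mode="eval"))

double = compile_rule_dsl("x -> x * 2")
print(double(21))  # prints 42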
Abstract:
When it comes to helping to shape sustainable development, research is most useful when it bridges the science–implementation/management gap and when it brings development specialists and researchers into a dialogue (Hurni et al. 2004); can a peer-reviewed journal contribute to this aim? In the classical system for validation and dissemination of scientific knowledge, journals focus on knowledge exchange within the academic community and do not specifically address a ‘life-world audience’. Within a North-South context, another knowledge divide is added: the peer review process excludes a large proportion of scientists from the South from participating in the production of scientific knowledge (Karlsson et al. 2007). Mountain Research and Development (MRD) is a journal whose mission is based on an editorial strategy to build the bridge between research and development and ensure that authors from the global South have access to knowledge production, ultimately with a view to supporting sustainable development in mountains. In doing so, MRD faces a number of challenges that we would like to discuss with the td-net community, after having presented our experience and strategy as editors of this journal. MRD was launched in 1981 by mountain researchers who wanted mountains to be included in the 1992 Rio process. In the late 1990s, MRD realized that the journal needed to go beyond addressing only the scientific community. It therefore launched a new section addressing a broader audience in 2000, with the aim of disseminating insights into, and recommendations for, the implementation of sustainable development in mountains. In 2006, we conducted a survey among MRD’s authors, reviewers, and readers (Wymann et al. 2007): respondents confirmed that MRD had succeeded in bridging the gap between research and development. But we realized that MRD could become an even more efficient tool for sustainability if development knowledge were validated: in 2009, we began submitting ‘development’ papers (‘transformation knowledge’) to external peer review of a kind different from the scientific-only peer review (for ‘systems knowledge’). At the same time, the journal became open access in order to increase the permeability between science and society, and ensure greater access for readers and authors in the South. We are currently rethinking our review process for development papers, with a view to creating more space for communication between science and society, and enhancing the co-production of knowledge (Roux 2008). Hopefully, these efforts will also contribute to the urgent debate on the ‘publication culture’ needed in transdisciplinary research (Kueffer et al. 2007).
Abstract:
Editor's note: The text of this article originally appeared as the final chapter of a brochure entitled Mountains and Climate Change—From Understanding to Action, prepared at the Centre for Development and Environment, University of Bern, Switzerland, for presentation by the Swiss Agency for Development and Cooperation (SDC) at a side event at the United Nations Climate Change Conference in Copenhagen on 12 December 2009. Chapters of the brochure deal with various aspects of climate change and its impact in mountain regions. In light of the significance of the Copenhagen COP 15 conference, the editors of this publication believe MRD's readers will be interested in reading this summary written from the perspective of Swiss researchers and development experts. The full brochure may be viewed and downloaded at www.cde.unibe.ch/Research/MA_Re.asp
Abstract:
Research councils, universities and funding agencies are increasingly asking for tools to measure the quality of research in the humanities. One of their preferred methods is a ranking of journals according to their supposed level of internationality. Our quantitative survey of seventeen major journals of medical history reveals the futility of such an approach. Most journals have a strong national character, dominated by the native language, local authors and national topics. The most common case is a paper written by a local author in his own language on a national subject concerning the nineteenth or twentieth century. American and British journals are noticed internationally, but they only rarely cite articles from other history of medicine journals. Continental European journals show a more international review of the literature, but are in their turn not noticed globally. Increasing specialisation and fragmentation have changed the role of general medical history journals. They risk ceding their function as international platforms for discourse on general and theoretical issues and major trends in historiography to international collections of papers. Journal editors should therefore push their authors to write reports with a more international outlook, and authors should be encouraged to submit papers of international interest, written from a more general, transnational and methodological point of view.
Abstract:
Publishing is an essential means of validation and communication of research. This is no different in transdisciplinary research, where publishing also aims at contributing to the development of society through sharing of knowledge. In the scientific world, authors need to disseminate and validate results, reflect on issues, and participate in debates. On the other hand, institutions and individuals are assessed according to their publication record – as probably the most influential of all current evaluation criteria. Occupying the space between article production and counting impact factors, journal editors and reviewers play an important role in defining and using rules to assess and improve the work submitted to them. Publishing transdisciplinary research poses specific challenges, in particular with regard to peer-review processes, as it addresses different knowledge communities with different value systems and purposes.
Abstract:
BACKGROUND: Randomized controlled trials (RCTs) are the best tool to evaluate the effectiveness of clinical interventions. The Consolidated Standards of Reporting Trials (CONSORT) statement was introduced in 1996 to improve the reporting of RCTs. We aimed to determine the extent of ambiguity and the quality of reporting, as assessed by adherence to the CONSORT statement, in published reports of RCTs involving patients with Hodgkin lymphoma from 1966 through 2002. METHODS: We analyzed 242 published full-text reports of RCTs in patients with Hodgkin lymphoma. Quality of reporting was assessed using a 14-item questionnaire based on the CONSORT checklist. Reporting was studied in two pre-CONSORT periods (1966-1988 and 1989-1995) and one post-CONSORT period (1996-2002). RESULTS: Only six of the 14 items were addressed in 75% or more of the studies in all three time periods. Most items necessary to assess the methodologic quality of a study were reported by fewer than 20% of the studies. Improvements over time were seen for some items, including the description of the statistical methods used, reporting of primary research outcomes, performance of power calculations, the method of randomization and allocation concealment, and the use of intention-to-treat analysis. CONCLUSIONS: Despite recent improvements, reporting levels of CONSORT items in RCTs involving patients with Hodgkin lymphoma remain unsatisfactory. Further concerted action by journal editors, learned societies, and medical schools is necessary to make authors even more aware of the need to improve the reporting of RCTs in medical journals and to allow assessment of the validity of published clinical research.
Abstract:
Much biomedical research is observational. The reporting of such research is often inadequate, which hampers the assessment of its strengths and weaknesses and of a study's generalisability. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) initiative developed recommendations on what should be included in an accurate and complete report of an observational study. We defined the scope of the recommendations to cover three main study designs: cohort, case-control, and cross-sectional studies. We convened a 2-day workshop in September 2004, with methodologists, researchers, and journal editors to draft a checklist of items. This list was subsequently revised during several meetings of the coordinating group and in e-mail discussions with the larger group of STROBE contributors, taking into account empirical evidence and methodological considerations. The workshop and the subsequent iterative process of consultation and revision resulted in a checklist of 22 items (the STROBE statement) that relate to the title, abstract, introduction, methods, results, and discussion sections of articles. Eighteen items are common to all three study designs and four are specific for cohort, case-control, or cross-sectional studies. A detailed explanation and elaboration document is published separately and is freely available on the websites of PLoS Medicine, Annals of Internal Medicine, and Epidemiology. We hope that the STROBE statement will contribute to improving the quality of reporting of observational studies.
Abstract:
Much medical research is observational. The reporting of observational studies is often of insufficient quality. Poor reporting hampers the assessment of the strengths and weaknesses of a study and the generalizability of its results. Taking into account empirical evidence and theoretical considerations, a group of methodologists, researchers, and editors developed the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) recommendations to improve the quality of reporting of observational studies. The STROBE Statement consists of a checklist of 22 items, which relate to the title, abstract, introduction, methods, results and discussion sections of articles. Eighteen items are common to cohort studies, case-control studies and cross-sectional studies and four are specific to each of the three study designs. The STROBE Statement provides guidance to authors about how to improve the reporting of observational studies and facilitates critical appraisal and interpretation of studies by reviewers, journal editors and readers. This explanation and elaboration document is intended to enhance the use, understanding, and dissemination of the STROBE Statement. The meaning and rationale for each checklist item are presented. For each item, one or several published examples and, where possible, references to relevant empirical studies and methodological literature are provided. Examples of useful flow diagrams are also included. The STROBE Statement, this document, and the associated web site (http://www.strobe-statement.org) should be helpful resources to improve reporting of observational research.
Abstract:
Much biomedical research is observational. The reporting of such research is often inadequate, which hampers the assessment of its strengths and weaknesses and of a study's generalizability. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) Initiative developed recommendations on what should be included in an accurate and complete report of an observational study. We defined the scope of the recommendations to cover three main study designs: cohort, case-control and cross-sectional studies. We convened a 2-day workshop in September 2004, with methodologists, researchers, and journal editors to draft a checklist of items. This list was subsequently revised during several meetings of the coordinating group and in e-mail discussions with the larger group of STROBE contributors, taking into account empirical evidence and methodological considerations. The workshop and the subsequent iterative process of consultation and revision resulted in a checklist of 22 items (the STROBE Statement) that relate to the title, abstract, introduction, methods, results, and discussion sections of articles. Eighteen items are common to all three study designs and four are specific for cohort, case-control, or cross-sectional studies. A detailed "Explanation and Elaboration" document is published separately and is freely available on the web sites of PLoS Medicine, Annals of Internal Medicine, and Epidemiology. We hope that the STROBE Statement will contribute to improving the quality of reporting of observational studies.
Abstract:
Much biomedical research is observational. The reporting of such research is often inadequate, which hampers the assessment of its strengths and weaknesses and of a study's generalizability. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) Initiative developed recommendations on what should be included in an accurate and complete report of an observational study. We defined the scope of the recommendations to cover three main study designs: cohort, case-control and cross-sectional studies. We convened a two-day workshop in September 2004 with methodologists, researchers and journal editors to draft a checklist of items. This list was subsequently revised during several meetings of the coordinating group and in e-mail discussions with the larger group of STROBE contributors, taking into account empirical evidence and methodological considerations. The workshop and the subsequent iterative process of consultation and revision resulted in a checklist of 22 items (the STROBE Statement) that relate to the title, abstract, introduction, methods, results and discussion sections of articles. Eighteen items are common to all three study designs and four are specific for cohort, case-control, or cross-sectional studies. A detailed Explanation and Elaboration document is published separately and is freely available on the web sites of PLoS Medicine, Annals of Internal Medicine and Epidemiology. We hope that the STROBE Statement will contribute to improving the quality of reporting of observational studies.
Abstract:
Much biomedical research is observational. The reporting of such research is often inadequate, which hampers the assessment of its strengths and weaknesses and of a study's generalisability. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) Initiative developed recommendations on what should be included in an accurate and complete report of an observational study. We defined the scope of the recommendations to cover three main study designs: cohort, case-control and cross-sectional studies. We convened a 2-day workshop in September 2004, with methodologists, researchers, and journal editors to draft a checklist of items. This list was subsequently revised during several meetings of the coordinating group and in e-mail discussions with the larger group of STROBE contributors, taking into account empirical evidence and methodological considerations. The workshop and the subsequent iterative process of consultation and revision resulted in a checklist of 22 items (the STROBE Statement) that relate to the title, abstract, introduction, methods, results, and discussion sections of articles. Eighteen items are common to all three study designs and four are specific for cohort, case-control, or cross-sectional studies. A detailed Explanation and Elaboration document is published separately and is freely available on the websites of PLoS Medicine, Annals of Internal Medicine and Epidemiology. We hope that the STROBE Statement will contribute to improving the quality of reporting of observational studies.