998 results for Scientific publication
Abstract:
The safe and responsible development of engineered nanomaterials (ENMs), nanotechnology-based materials and products, together with the definition of regulatory measures and the implementation of "nano" legislation in Europe, requires a widely supported scientific basis and sufficient high-quality data upon which to base decisions. At the very core of such a scientific basis is general agreement on key issues related to the risk assessment of ENMs, encompassing the key parameters for characterising ENMs, appropriate methods of analysis, and the best way to express the effect of ENMs in widely accepted dose-response toxicity tests. The following major conclusions were drawn. Owing to the high batch-to-batch variability in the characteristics of commercially available and, to a lesser degree, laboratory-made ENMs, it is not possible to make general statements regarding the toxicity resulting from exposure to ENMs. 1) Alongside the OECD priority list of ENMs, other criteria for selecting ENMs, such as relevance for mechanistic (scientific) or risk-assessment-based studies, widespread availability (and thus high expected volumes of use), or consumer concern (route of consumer exposure depending on application), could be helpful. The OECD priority list focuses on the validity of OECD tests, so source material will be first in scope for testing; for risk assessment, however, it is much more relevant to have toxicity data from the material as present in the products/matrices to which humans and the environment are exposed. 2) For most, if not all, characteristics of ENMs, standardized analytical methods, though not necessarily validated, are available. Generally these methods determine only a single characteristic, and some of them can be rather expensive; in practice, it is currently not feasible to characterise ENMs fully. Many techniques available to measure the same nanomaterial characteristic produce contrasting results (e.g. reported sizes of ENMs). It was recommended that at least two complementary techniques be employed to determine a metric of ENMs. The first great challenge is to prioritise the metrics that are relevant for assessing biological dose-response relations and to develop analytical methods for characterising ENMs in biological matrices. It was generally agreed that a single metric is not sufficient to describe ENMs fully. 3) Characterisation of ENMs in biological matrices starts with sample preparation. It was concluded that there is currently no standard approach or protocol for sample preparation to control agglomeration/aggregation and (re)dispersion. It was recommended that harmonization be initiated and that protocols be exchanged. The precise methods used to disperse ENMs should be described specifically, yet succinctly, within the experimental section of a publication. 4) ENMs need to be characterised in the matrix as it is presented to the test system (in vitro/in vivo). 5) Alternative approaches (e.g. biological or in silico systems) for the characterisation of ENMs are simply not possible with the current knowledge. Contributors: Iseult Lynch, Hans Marvin, Kenneth Dawson, Markus Berges, Diane Braguer, Hugh J. Byrne, Alan Casey, Gordon Chambers, Martin Clift, Giuliano Elia, Teresa F. Fernandes, Lise Fjellsbø, Peter Hatto, Lucienne Juillerat, Christoph Klein, Wolfgang Kreyling, Carmen Nickel, and Vicki Stone.
Abstract:
Underfeeding causes a significant increase in postoperative complications, particularly respiratory and infectious complications. Thoracic surgery is frequently required in patients suffering from wasting diseases (cancer, COPD, cystic fibrosis), which increase the risk of malnutrition. The most important risk factors are preoperative hypoalbuminemia and a BMI < 20. The deleterious effects of underfeeding may be corrected by preoperative nutritional support for 7 to 15 days using oral supplements or enteral feeding: respiratory muscle strength is improved, immunity is restored, and overall complications are reduced. Preoperative diagnosis of underfeeding is therefore of utmost importance. In case of emergency surgery, nutritional assessment on admission enables early postoperative artificial feeding to be introduced.
Abstract:
Considering the importance of indexing in the process of disseminating information in contemporary society, databases are essential to this process. Among them is that of the Institute for Scientific Information, the most comprehensive database of scientific information in the world. The objective is therefore to situate Brazilian science within this database, profiling the journals currently indexed according to parameters such as region, state and municipality; field of knowledge; nature of the journal; and evaluation, among others. The universe comprises the 17 indexed journal titles. For data collection, the two most recent issues produced in 1999 were analysed. Among the results, the predominance of the Southeast region stands out.
Abstract:
The constant scientific production in universities and research centres means that these organizations produce and acquire a large amount of data in a short period of time. Because of this volume of data, research organizations become potentially vulnerable to information overload, which can lead to chaos in information management. In this context, the development of data catalogues emerges as one possible solution to problems such as (I) the organization and (II) the management of data. In the scientific sphere, data catalogues are implemented with the standard for digital and geospatial metadata and are broadly used in the process of cataloguing scientific information. The aim of this work is to present the characteristics of access to and storage of metadata in database systems in order to improve the description and dissemination of scientific data. Relevant aspects that should be analysed during the planning stage are considered, since they can determine the success of the implementation. The use of data catalogues by research organizations may be a way to promote and facilitate the dissemination of scientific data, avoid duplication of effort, and encourage the use of data that has already been collected, processed and stored.
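As a rough illustration of the kind of catalogue record discussed above, the Python sketch below stores and queries dataset metadata in SQLite. The table name, field list and sample record are illustrative assumptions; they do not reproduce the full schema of any particular digital or geospatial metadata standard.

```python
# Minimal sketch (assumed schema, not a specific metadata standard):
# store and query scientific-dataset metadata records in SQLite.
import sqlite3

conn = sqlite3.connect("catalogue.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS dataset_metadata (
        id         INTEGER PRIMARY KEY,
        title      TEXT NOT NULL,
        originator TEXT,   -- institution or researcher that produced the data
        abstract   TEXT,
        keywords   TEXT,   -- comma-separated search terms
        west REAL, east REAL, south REAL, north REAL,  -- optional bounding box
        created    TEXT    -- ISO 8601 date of the record
    )
""")

# Register one (hypothetical) record and query the catalogue by keyword.
conn.execute(
    "INSERT INTO dataset_metadata (title, originator, abstract, keywords, created) "
    "VALUES (?, ?, ?, ?, ?)",
    ("Soil samples 2023", "Example Research Centre",
     "Field measurements collected during the 2023 campaign.",
     "soil,field campaign,2023", "2024-01-15"),
)
conn.commit()

for title, originator in conn.execute(
    "SELECT title, originator FROM dataset_metadata WHERE keywords LIKE ?",
    ("%soil%",),
):
    print(title, "-", originator)
```

A shared catalogue of this kind lets researchers discover existing datasets before collecting new ones, which is the duplication-of-effort point made above.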
Abstract:
Collection: Collection des médecins grecs et latins
Abstract:
Collection: Collection des médecins grecs et latins
Abstract:
This paper analyses the accessibility problems posed by scientific articles published in digital form, focusing on the ease of use of their content with respect to the form in which they are published, without examining the different retrieval systems. The two most widely used formats for publishing scientific articles in digital form, HTML and PDF, are analysed, examining reader performance in relation to the presence of contents lists and of internal or linked tables. The study involved two groups: 30 blind subjects, all JAWS users, contacted through the ONCE Foundation, and 30 sighted subjects, lecturers in the Department of Librarianship and Documentation of the University of Barcelona. The results show that locating data in tables is easier in HTML documents when a contents list links to those tables, and that including complete tables in the body of the HTML document facilitates reading by blind users. At the methodological level, this work offers two novelties with respect to the existing literature on usability studies with blind people: it examines the usability of the PDF format, and it reports a quantitative usability test; the latter hinders comparison with the majority of published articles.
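Purely as an illustration of the document layout the study found helpful, the Python sketch below generates an HTML article whose contents list links to each table through in-document anchors and embeds the complete table in the body. The helper function, table ids and placeholder values are hypothetical and are not taken from the study.

```python
# Illustrative sketch: build an HTML fragment with a contents list that links
# to each table by anchor, and with the full table embedded in the body.
from html import escape

def table_html(table_id: str, caption: str,
               header: list[str], rows: list[list[str]]) -> str:
    """Return a table with an id (link target), a caption and header cells."""
    head = "<tr>" + "".join(f'<th scope="col">{escape(h)}</th>' for h in header) + "</tr>"
    body = "".join(
        "<tr>" + "".join(f"<td>{escape(c)}</td>" for c in row) + "</tr>"
        for row in rows
    )
    return (f'<table id="{table_id}"><caption>{escape(caption)}</caption>'
            f"{head}{body}</table>")

# Hypothetical content: one table with placeholder values.
tables = {
    "tab1": ("Table 1. Reader performance by format",
             ["Format", "Mean time"],
             [["HTML", "t1"], ["PDF", "t2"]]),
}

# Contents list placed before the body; each entry links to a table anchor.
toc = "<nav><ul>" + "".join(
    f'<li><a href="#{tid}">{escape(cap)}</a></li>'
    for tid, (cap, _, _) in tables.items()
) + "</ul></nav>"

body = "".join(table_html(tid, cap, head, rows)
               for tid, (cap, head, rows) in tables.items())
print(f"<article>{toc}{body}</article>")
```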
Abstract:
Interest in cognitive pretest methods for evaluating survey questionnaires has been increasing for the last three decades. However, analysing the features of the scientific output in this field can be difficult because much of it is produced in public and private institutes whose main mission is not scientific research. The aim of this research is to characterize the current state of scientific output in the field by means of two bibliometric studies covering the period from 1980 to 2007. Study 1 analysed documents obtained from the most commonly used bibliographic databases. Study 2 supplemented the body of documents from Study 1 with documents from non-indexed journals, conference papers, etc. The results show constant growth in the number of publications. The wide dispersion of publication sources, together with the prominent role of public and private institutions as centres of production, can also be identified as relevant characteristics of the scientific output in this field.
Abstract:
BACKGROUND: Many clinical studies are ultimately not fully published in peer-reviewed journals. Underreporting of clinical research is wasteful and can result in biased estimates of treatment effect or harm, leading to recommendations that are inappropriate or even dangerous. METHODS: We assembled a cohort of clinical studies approved in 2000-2002 by the Research Ethics Committee of the University of Freiburg, Germany. Published full articles were searched for in electronic databases and investigators were contacted. Data on study characteristics were extracted from protocols and corresponding publications. We characterized the cohort, quantified its publication outcome and compared protocols and publications for selected aspects. RESULTS: Of 917 approved studies, 807 were started and 110 were not, either locally or as a whole. Of the started studies, 576 (71%) were completed according to protocol, 128 (16%) were discontinued and 42 (5%) are still ongoing; for 61 (8%) there was no information about their course. We identified 782 full publications corresponding to 419 of the 807 initiated studies; the publication proportion was 52% (95% CI: 48%-55%). Study design was not significantly associated with subsequent publication. Multicentre status, international collaboration, large sample size and commercial or non-commercial funding were positively associated with subsequent publication. Commercial funding was mentioned in 203 (48%) of the protocols and in 205 (49%) of the publications; in most published studies (339; 81%) this information corresponded between protocol and publication. Most studies were published in English (367; 88%); some in German (25; 6%) or in both languages (27; 6%). The local investigators were listed as (co-)authors in the publications corresponding to 259 (62%) studies. CONCLUSION: Half of the clinical research conducted at a large German university medical centre remains unpublished, so future research is built on an incomplete evidence base. Research resources are likely wasted, as neither health care professionals nor patients nor policy makers can use the results when making decisions.
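For readers who want to check the headline figure, here is a minimal sketch reproducing the 52% publication proportion and its 95% confidence interval from 419 published out of 807 initiated studies. It assumes a normal-approximation (Wald) interval, which the abstract does not specify.

```python
# Minimal sketch: publication proportion and a normal-approximation 95% CI.
from math import sqrt

published, initiated = 419, 807
p = published / initiated              # ~0.519 -> reported as 52%
se = sqrt(p * (1 - p) / initiated)     # standard error of the proportion
lo, hi = p - 1.96 * se, p + 1.96 * se  # ~0.48 to ~0.55

print(f"proportion {p:.0%}, 95% CI {lo:.0%}-{hi:.0%}")
```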
Abstract:
This study examines how increased memory utilisation affects throughput and energy consumption in scientific computing, especially in high-energy physics. Our aim is to minimise the energy consumed by a set of jobs without increasing the processing time. Earlier tests indicated that, especially in data analysis, throughput can increase by over 100% and energy consumption can decrease by 50% when multiple jobs are processed in parallel per CPU core. Since jobs are heterogeneous, it is not possible to find an optimal value for the number of parallel jobs. A better solution is based on memory utilisation, but finding an optimal memory threshold is not straightforward. Therefore, a fuzzy-logic-based algorithm was developed that can dynamically adapt the memory threshold based on the overall load. In this way, memory consumption can be kept stable under different workloads while achieving significantly higher throughput and energy efficiency than traditional approaches that use a fixed number of jobs or a fixed memory threshold.
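A rough sketch of the idea, not the authors' algorithm: the Python fragment below uses triangular membership functions and three simple rules to nudge a memory threshold up or down according to the current memory load. The membership breakpoints, step size and bounds are illustrative assumptions.

```python
# Hedged sketch of a fuzzy controller that adapts a memory threshold
# (fraction of total RAM at which no further parallel jobs are admitted).

def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function peaking at b over [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def adjust_threshold(threshold: float, load: float) -> float:
    """Return a new memory threshold given the current memory load (0..1)."""
    # Degree to which the node is under-loaded, comfortable, or over-loaded.
    low = tri(load, -0.01, 0.0, 0.6)
    ok = tri(load, 0.4, 0.7, 0.9)
    high = tri(load, 0.8, 1.0, 1.01)

    # Rules: low load -> raise the threshold (admit more jobs); ok -> keep it;
    # high load -> lower it. Defuzzify as a weighted average of rule outputs.
    step = 0.05
    total = low + ok + high
    if total == 0.0:
        return threshold
    delta = (low * step + ok * 0.0 + high * -step) / total
    return min(0.95, max(0.5, threshold + delta))

# Example: with 85% of memory in use, the threshold is lowered slightly.
print(adjust_threshold(threshold=0.8, load=0.85))
```

Calling such a function periodically with the measured memory load would keep memory consumption near a stable level while letting the number of parallel jobs float, which is the behaviour the abstract describes.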