963 results for Scientific data


Relevance:

60.00%

Publisher:

Abstract:

This text presents the general problem of nuisance and pollution created by macro-waste at sea. We distinguish three main types of pollution: that of the coasts, that of the sea bed and that of the open sea. The article draws on scientific data: a report on the state of contamination of the coasts of France by macro-waste (1982), the results of oceanographic sea-bed observation campaigns, and an evaluation of the contamination by micro-plastics floating on the surface and concentrating in the large oceanic gyres. One paragraph is dedicated to the impacts on ecosystems, and another to the measures taken to fight this macro-waste. The specific case of New Caledonia is also addressed, and in conclusion we underline the need to educate populations in eco-responsibility, which appears to be a significant factor in the regression of this type of nuisance, one that affects all the seas of the world.

Relevance:

60.00%

Publisher:

Abstract:

This thesis presents a cloud-based software platform for sharing publicly available scientific datasets. The proposed platform leverages the potential of NoSQL databases and asynchronous I/O technologies, such as Node.js, in order to achieve high performance and flexible solutions. The platform serves two main groups of users: dataset providers, the researchers responsible for sharing and maintaining datasets, and dataset users, those who wish to access the public data. The former are given tools to easily publish and maintain large volumes of data, whereas the latter are given tools to preview and create subsets of the original data through filter and aggregation operations. The choice of NoSQL over more traditional RDBMS emerged from an extended benchmark between relational databases (MySQL) and NoSQL (MongoDB) that is also presented in this thesis. The results obtained confirm the theoretical expectation that NoSQL databases are better suited to the kind of data the system's users will be handling, i.e., non-homogeneous data structures that can grow very quickly. It is envisioned that a platform like this can lead the way to a new era of scientific data sharing in which researchers can easily share and access all kinds of datasets and, in more advanced scenarios, be presented with recommended datasets and with research results that already exist on top of those recommendations.
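
As an illustration only (not the platform's actual API), the following Python sketch shows how a dataset user could run the kind of filter and aggregation operations described above against a MongoDB collection; the connection string, database, collection and field names are all hypothetical.

# Illustrative sketch only (not the platform's actual API): building a filtered
# subset and an aggregated preview of a public dataset stored in MongoDB.
# Database, collection and field names are hypothetical.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")     # assumed local instance
collection = client["open_datasets"]["air_quality"]   # hypothetical dataset

# Filter operation: build a subset of the original data.
subset = collection.find(
    {"country": "PT", "year": {"$gte": 2010}},
    projection={"_id": 0, "station": 1, "year": 1, "pm10": 1},
)

# Aggregation operation: preview yearly averages without downloading everything.
preview = collection.aggregate([
    {"$match": {"country": "PT"}},
    {"$group": {"_id": "$year", "avg_pm10": {"$avg": "$pm10"}}},
    {"$sort": {"_id": 1}},
])

for row in preview:
    print(row)

The same two primitives (a filter predicate plus a grouping pipeline) are enough to let users inspect a large dataset server-side before deciding what to download.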

Relevance:

60.00%

Publisher:

Abstract:

Introduction: Hemodynamic monitoring is a tool for diagnosing cardiogenic shock and monitoring the response to treatment; it can be invasive, minimally invasive or non-invasive. It is routinely performed with a pulmonary artery catheter (PAC), or Swan-Ganz catheter; newer minimally invasive hemodynamic monitoring techniques have a lower complication rate. It is currently unknown which monitoring technique is safer for patients with cardiogenic shock. Objective: To evaluate the safety of invasive hemodynamic monitoring compared with minimally invasive monitoring in patients with cardiogenic shock in adult intensive care. Design: Systematic review of the literature. Searches were run in PubMed, EMBASE, OVID - Cochrane Library, Lilacs, SciELO, clinical trial registries, conference proceedings and repositories, with grey-literature searches in Google Scholar, Teseo and OpenGrey, up to August 2016, for publications in English and Spanish. Results: A single study was identified, with 331 critically ill patients, comparing hemodynamic monitoring with PAC versus PiCCO; it concluded that, after adjustment for confounding factors, the choice of monitoring type did not influence the most important clinical outcomes in terms of complications and mortality. Since other diagnoses were included, the results cannot be extrapolated to cardiogenic shock alone. Conclusion: In the available literature there is no evidence that invasive hemodynamic monitoring, compared with minimally invasive monitoring, differs in terms of complications and mortality in critically ill adult patients with cardiogenic shock.

Relevance:

40.00%

Publisher:

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

40.00%

Publisher:

Abstract:

The biggest challenge in conservation biology is bridging the gap between research and practical management. A major obstacle is the fact that many researchers are unwilling to tackle projects likely to produce sparse or messy data because the results would be difficult to publish in refereed journals. The obvious solution to sparse data is to build up results from multiple studies. Consequently, we suggest that there needs to be greater emphasis in conservation biology on publishing papers that can be built on by subsequent research rather than on papers that produce clear results individually. This building approach requires: (1) a stronger theoretical framework, in which researchers attempt to anticipate models that will be relevant in future studies and incorporate expected differences among studies into those models; (2) use of modern methods for model selection and multi-model inference, and publication of parameter estimates under a range of plausible models; (3) explicit incorporation of prior information into each case study; and (4) planning management treatments in an adaptive framework that considers treatments applied in other studies. We encourage journals to publish papers that promote this building approach rather than expecting papers to conform to traditional standards of rigor as stand-alone papers, and believe that this shift in publishing philosophy would better encourage researchers to tackle the most urgent conservation problems.
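
As a minimal, hypothetical sketch of the multi-model-inference step recommended above (point 2), the following Python example fits two candidate models to simulated data, compares them with AIC and reports Akaike weights alongside the parameter estimates under each model; the data, covariates and model set are invented for illustration.

# Minimal sketch of multi-model inference (hypothetical data and model set):
# fit several plausible models, compare them with AIC, and report Akaike
# weights plus parameter estimates under each model rather than a single
# "best" result.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
habitat = rng.uniform(0, 10, 40)           # hypothetical covariates
rainfall = rng.uniform(200, 800, 40)
abundance = 2.0 + 0.8 * habitat + rng.normal(0, 1.5, 40)

candidates = {
    "habitat only": sm.add_constant(np.column_stack([habitat])),
    "habitat + rainfall": sm.add_constant(np.column_stack([habitat, rainfall])),
}

fits = {name: sm.OLS(abundance, X).fit() for name, X in candidates.items()}
aics = np.array([f.aic for f in fits.values()])
weights = np.exp(-0.5 * (aics - aics.min()))
weights /= weights.sum()                    # Akaike weights

for (name, fit), w in zip(fits.items(), weights):
    print(f"{name}: AIC={fit.aic:.1f}, weight={w:.2f}, params={np.round(fit.params, 2)}")

Publishing the weights and the per-model estimates, rather than only the top-ranked model, is what lets later studies combine results across sparse datasets.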

Relevance:

40.00%

Publisher:

Abstract:

OBJECTIVE: To profile phase I scientific publications and to determine whether they provide preclinical-phase data, with an emphasis on bioethical aspects. METHODS: We analyzed 61 scientific articles published in 2007 reporting research involving human subjects with new drugs, medicines or vaccines in phase I. A data-collection instrument was developed to analyze and evaluate the articles. It covers items concerning the preclinical phase (in relation to the clinical phase) and items concerning sample characteristics. RESULTS: Among the articles analyzed, most of the research was conducted in the USA. Because of the large number of publications devoted to oncological diseases, most studies were carried out with patient volunteers. As for the information on the preclinical phase presented in phase I publications, we observed that it is poor or absent. Even when the authors judge the phase I research to be promising and suggest future phase II studies, the reader cannot make this same judgment given the scarcity of preclinical information. CONCLUSION: The profile of these publications raises issues that deserve reflection and analysis for a better assessment of what is happening with phase I publications.

Relevance:

40.00%

Publisher:

Abstract:

To date, most documentation of forensically relevant medical findings has been limited to traditional 2D photography, 2D conventional radiographs, sketches and verbal description. Classic documentation in forensic science still has limitations, especially when 3D documentation is necessary. The goal of this paper is to demonstrate new 3D real-data-based geometric technology approaches. The paper presents approaches to the 3D geometric documentation of injuries on the body surface and of internal injuries, in both living and deceased cases. Using modern imaging methods such as photogrammetry, optical surface scanning and radiological CT/MRI scanning in combination, it could be demonstrated that real, full 3D-data-based individual documentation of the body surface and internal structures is possible in a non-invasive and non-destructive manner. Using the data-merging/fusion and animation possibilities, it is possible to answer reconstructive questions about the dynamic development of patterned injuries (morphologic imprints) and to evaluate whether they can be matched or linked to suspected injury-causing instruments. For the first time, to our knowledge, optical and radiological 3D scanning was used to document forensically relevant injuries of the human body in combination with vehicle damage. With this complementary documentation approach, individual forensic real-data-based analyses and animations became possible, linking body injuries to vehicle deformations or damage. These data allow conclusions to be drawn for automobile accident research, for the optimization of vehicle safety (pedestrian and passenger) and for the further development of crash dummies. Real 3D-data-based documentation opens a new horizon for scientific reconstruction and animation, bringing added value and a real quality improvement to forensic science.
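
As a hedged illustration of one data-fusion step of this kind (not the authors' actual pipeline), the following Python sketch uses Open3D to rigidly register a photogrammetric surface scan to a CT-derived skin-surface point cloud with ICP; the file names, units and parameters are assumptions.

# Minimal sketch (not the authors' pipeline): rigid registration of a
# photogrammetric surface scan to a CT-derived skin-surface point cloud
# using ICP, as one way to fuse optical and radiological 3D data.
# File names, units and parameters are hypothetical.
import numpy as np
import open3d as o3d

surface_scan = o3d.io.read_point_cloud("photogrammetry_scan.ply")  # hypothetical file
ct_surface = o3d.io.read_point_cloud("ct_skin_surface.ply")        # hypothetical file

# Downsample to speed up registration and estimate normals for point-to-plane ICP.
src = surface_scan.voxel_down_sample(voxel_size=2.0)   # assumed units: mm
tgt = ct_surface.voxel_down_sample(voxel_size=2.0)
for pc in (src, tgt):
    pc.estimate_normals(o3d.geometry.KDTreeSearchParamHybrid(radius=10.0, max_nn=30))

# ICP refines an initial alignment (identity here).
result = o3d.pipelines.registration.registration_icp(
    src, tgt, 5.0, np.eye(4),
    o3d.pipelines.registration.TransformationEstimationPointToPlane())

print("fitness:", result.fitness, "inlier RMSE:", result.inlier_rmse)
# Apply the recovered rigid transform to the full-resolution scan for fusion.
surface_scan.transform(result.transformation)

Once the optical scan sits in the CT coordinate frame, surface injuries and internal findings can be inspected and animated together.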

Relevance:

40.00%

Publisher:

Abstract:

This paper presents a data-intensive architecture that demonstrates the ability to support applications from a wide range of application domains, and to support the different types of users involved in defining, designing and executing data-intensive processing tasks. The prototype architecture is introduced, and the pivotal role of DISPEL as a canonical language is explained. The architecture promotes the exploration and exploitation of distributed and heterogeneous data and spans the complete knowledge-discovery process, from data preparation, to analysis, to evaluation and reiteration. The architecture evaluation included large-scale applications from astronomy, cosmology, hydrology, functional genetics, image processing and seismology.
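
DISPEL itself is a dedicated dataflow language, so the following Python sketch is only a loose analogy: it mimics the idea of chaining data-preparation, analysis and evaluation processing elements over a stream of records; all names and the toy data are hypothetical.

# Illustrative sketch only: chaining processing elements over a record stream
# with Python generators, by analogy with a dataflow pipeline. Not DISPEL.
from typing import Iterable, Iterator

def prepare(records: Iterable[dict]) -> Iterator[dict]:
    """Data-preparation element: drop incomplete records, normalise fields."""
    for r in records:
        if r.get("value") is not None:
            yield {**r, "value": float(r["value"])}

def analyse(records: Iterable[dict], threshold: float) -> Iterator[dict]:
    """Analysis element: flag records whose value exceeds a threshold."""
    for r in records:
        yield {**r, "anomaly": r["value"] > threshold}

def evaluate(records: Iterable[dict]) -> dict:
    """Evaluation element: summarise results so the pipeline can be re-run."""
    records = list(records)
    return {"n": len(records), "flagged": sum(r["anomaly"] for r in records)}

# Compose the knowledge-discovery pipeline: prepare -> analyse -> evaluate.
raw = [{"id": 1, "value": "3.2"}, {"id": 2, "value": None}, {"id": 3, "value": "9.7"}]
print(evaluate(analyse(prepare(raw), threshold=5.0)))  # {'n': 2, 'flagged': 1}

A canonical language plays the same role at much larger scale: the same abstract pipeline description can be mapped onto distributed, heterogeneous data sources and execution engines.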

Relevance:

40.00%

Publisher:

Abstract:

Provenance models are crucial for describing experimental results in science. The W3C Provenance Working Group has recently released the PROV family of specifications for provenance on the Web. While provenance focuses on what was executed, it is important in science to also publish the general methods that describe scientific processes at a more abstract and general level. In this paper, we propose P-PLAN, an extension of PROV to represent the plans that guided an execution and their correspondence to the provenance records that describe the execution itself. We motivate and discuss the use of P-PLAN and PROV to publish scientific workflows as Linked Data.
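
As a small, hedged illustration (the URIs are hypothetical, and the property names should be checked against the P-PLAN specification), the following Python/rdflib sketch describes one workflow step as a p-plan:Step and links a recorded execution, a prov:Activity, back to that step.

# Minimal sketch: one abstract workflow step (p-plan:Step) and the provenance
# of one of its executions (prov:Activity), linked via p-plan:correspondsToStep.
# Example URIs are hypothetical; verify property names against the P-PLAN spec.
from rdflib import Graph, Namespace, RDF

PPLAN = Namespace("http://purl.org/net/p-plan#")
PROV = Namespace("http://www.w3.org/ns/prov#")
EX = Namespace("http://example.org/workflow/")   # hypothetical

g = Graph()
g.bind("pplan", PPLAN)
g.bind("prov", PROV)

# The abstract plan: a workflow with one step.
g.add((EX.genomicsWorkflow, RDF.type, PPLAN.Plan))
g.add((EX.alignReads, RDF.type, PPLAN.Step))
g.add((EX.alignReads, PPLAN.isStepOfPlan, EX.genomicsWorkflow))

# The provenance of one execution of that step.
g.add((EX.run42, RDF.type, PROV.Activity))
g.add((EX.run42, PPLAN.correspondsToStep, EX.alignReads))
g.add((EX.run42, PROV.used, EX.sampleReads))
g.add((EX.alignedBam, PROV.wasGeneratedBy, EX.run42))

print(g.serialize(format="turtle"))

Publishing both layers as Linked Data lets a reader move from a concrete execution trace to the general method it instantiated, and back.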

Relevance:

40.00%

Publisher:

Abstract:

The paper briefly presents the “2nd Generation Open Access Infrastructure for Research in Europe” project (http://www.openaire.eu/) and what has been done in Bulgaria over the last year in the area of open access to scientific information and data.