952 results for Scientific Data Visualisation
Abstract:
Topographic variation, the spatial variation in elevation and terrain features, underpins a myriad of patterns and processes in geography and ecology and is key to understanding the variation of life on the planet. The characterization of this variation is scale-dependent, i.e., it varies with the distance over which features are assessed and with the spatial grain (grid cell resolution) of the analysis. A fully standardized, global multivariate product of different terrain features has the potential to support many large-scale basic research and analytical applications; to date, however, no such product has been available. Here we used the global 250 m GMTED and the near-global 90 m SRTM digital elevation model products to derive a suite of topographic variables: elevation, slope, aspect, eastness, northness, roughness, terrain ruggedness index, topographic position index, vector ruggedness measure, profile and tangential curvature, and 10 geomorphological landform classes. We aggregated each variable to 1, 5, 10, 50 and 100 km spatial grains using several aggregation approaches (median, average, minimum, maximum, standard deviation, percent cover, count, majority, Shannon Index, entropy, uniformity). While a global cross-correlation underlines the high similarity of many variables, a more detailed view of four mountain regions reveals local differences, as well as scale variations in the aggregated variables at different spatial grains. All newly developed variables are available for download at http://www.earthenv.org and can serve as a basis for standardized hydrological, environmental and biodiversity modeling at a global extent.
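To make the aggregation step concrete, here is a minimal numpy sketch of block aggregation from a fine grain to a coarser one; the tile shape, aggregation factor and simulated values are illustrative assumptions, not the authors' actual pipeline:

```python
import numpy as np

# Sketch: aggregate non-overlapping blocks of a fine-grain raster to a coarser
# spatial grain with different statistics. Shapes and values are illustrative.
rng = np.random.default_rng(0)
elevation = rng.normal(1500, 300, size=(400, 400))  # simulated fine-grain DEM tile

def aggregate(raster, factor, func):
    """Reduce each factor x factor block of `raster` with `func`."""
    h, w = raster.shape
    trimmed = raster[: h - h % factor, : w - w % factor]
    blocks = trimmed.reshape(h // factor, factor, w // factor, factor)
    return func(blocks, axis=(1, 3))

coarse_median = aggregate(elevation, 4, np.median)  # median aggregation
coarse_sd = aggregate(elevation, 4, np.std)         # standard-deviation aggregation
print(coarse_median.shape, coarse_sd.shape)         # (100, 100) (100, 100)
```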
Abstract:
Social media is changing the way we interact, present ideas and information, and judge the quality of content and contributions. In recent years, hundreds of platforms have emerged to freely share all kinds of information and to connect across networks. These new tools generate statistics on activity and interactions among users, such as mentions, retweets, conversations and comments on blogs or Facebook; reference managers show popularity ratings based on how often references are shared by other researchers; and repositories generate statistics on visits to or downloads of articles. This paper analyzes the meaning and implications of altmetrics, their advantages and criticisms, and the main platforms (Altmetric.com, ImpactStory, PLOS altmetrics, PlumX), and reports on progress and on benefits for authors, publishers and librarians. It concludes that the value of alternative metrics as a tool complementary to citation analysis is evident, although it is suggested that this issue should be explored more deeply to unravel the meaning and the potential value of these indicators.
Abstract:
To gain better insight into the radiological features of industrial by-products that can be reused in building materials, a review of the reported scientific data is very useful. The current study is based on the continuously growing database of the By-BM (H2020-MSCA-IF-2015) project (By-products for Building Materials). Currently, the By-BM database contains individual data on about 431 by-products and 1095 building and raw materials. It was found that in the case of the building materials the natural radionuclide content varied widely (Ra-226: <DL-27851 Bq/kg; Th-232: <DL-906 Bq/kg; K-40: <DL-17922 Bq/kg), more so than for the by-products (Ra-226: 7-3152 Bq/kg; Th-232: <DL-1350 Bq/kg; K-40: <DL-3001 Bq/kg). The average Ra-226, Th-232 and K-40 contents of the reported by-products were, respectively, 2.52, 2.35 and 0.39 times those of the building materials. The gamma exposure of bulk building products was calculated according to IAEA Specific Safety Guide No. SSG-32 and the I-index based on European Commission Radiation Protection 112 (EU BSS). It was found that in most cases the I-index, without density consideration, significantly overestimates the excess effective dose.
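For reference, the activity concentration index mentioned above is defined in Radiation Protection 112 (and retained in the EU BSS) as I = C_Ra-226/300 + C_Th-232/200 + C_K-40/3000, with activity concentrations in Bq/kg; a minimal sketch of the screening calculation (the example concentrations are invented):

```python
# I-index (activity concentration index) from EC Radiation Protection 112 /
# EU BSS: I = C_Ra226/300 + C_Th232/200 + C_K40/3000, concentrations in Bq/kg.
def i_index(c_ra226: float, c_th232: float, c_k40: float) -> float:
    """Values above 1 flag a bulk material for further assessment."""
    return c_ra226 / 300 + c_th232 / 200 + c_k40 / 3000

# Invented example: a by-product with elevated natural radionuclide content.
print(round(i_index(c_ra226=150, c_th232=80, c_k40=500), 3))  # 1.067 -> above screening level
```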
Abstract:
Introduction to the module, fundamentals, history of data visualisation
Abstract:
Sound is potentially an effective way of analysing data, as it is possible to interpret layers of sound simultaneously and to identify changes. Multiple attempts to use sound with scientific data have been made, with varying levels of success; on many occasions this was done without including the end user during development. In this study a sonified model of the eight planets of our solar system was built and tested using an end-user approach. The sonification was created for the Esplora Planetarium, which is currently being constructed in Malta. The data requirements were gathered from a member of the planetarium staff, and 12 end users, as well as the planetarium representative, tested the sonification. The results suggest that listeners were able to discern various planetary characteristics without requiring any additional information. Three of the eight sound design parameters did not represent their characteristics successfully; these issues have been identified, and further development will be conducted to improve the model.
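To illustrate the parameter-mapping idea behind such a sonification (the mapping below is a hypothetical sketch, not one of the study's actual sound design parameters), a planetary characteristic such as relative radius can be mapped onto pitch:

```python
import math

# Hypothetical parameter mapping: planet radius (Earth = 1, values approximate)
# mapped logarithmically onto a 110-880 Hz pitch range.
RADII = {"Mercury": 0.38, "Venus": 0.95, "Earth": 1.0, "Mars": 0.53,
         "Jupiter": 11.2, "Saturn": 9.45, "Uranus": 4.0, "Neptune": 3.88}

def radius_to_pitch(radius, lo=110.0, hi=880.0):
    """Map a relative radius onto [lo, hi] Hz on a logarithmic scale."""
    r_min, r_max = min(RADII.values()), max(RADII.values())
    t = (math.log(radius) - math.log(r_min)) / (math.log(r_max) - math.log(r_min))
    return lo * (hi / lo) ** t

for planet, r in RADII.items():
    print(f"{planet}: {radius_to_pitch(r):.0f} Hz")
```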
Abstract:
This text presents the general problem of nuisance and pollution created by macro-waste at sea. We distinguish three main types of pollution: of the coasts, of the seabed and of the open sea. The article draws on scientific data: a report on the state of contamination of the coasts of France by macro-waste (1982), results of oceanographic seabed observation campaigns, and evaluations of contamination by micro-plastics floating on the surface and concentrating in the large oceanic gyres. One paragraph is dedicated to the impacts on ecosystems, another to the measures taken to fight this macro-waste. The specific case of New Caledonia is also addressed, and in conclusion we underline the need to educate populations in eco-responsibility, which appears to be a significant factor in reducing this type of nuisance, which affects all the seas of the world.
Abstract:
This thesis presents a cloud-based software platform for sharing publicly available scientific datasets. The proposed platform leverages the potential of NoSQL databases and asynchronous I/O technologies, such as Node.JS, in order to achieve high performance and flexibility. The solution serves two main groups of users: dataset providers, the researchers responsible for sharing and maintaining datasets, and dataset users, those who wish to access the public data. The former are given tools to easily publish and maintain large volumes of data, whereas the latter are given tools to preview and create subsets of the original data through filter and aggregation operations. The choice of NoSQL over more traditional RDBMSs emerged from an extended benchmark between relational databases (MySQL) and NoSQL (MongoDB) that is also presented in this thesis. The results confirm the theoretical expectation that NoSQL databases are more suitable for the kind of data our system's users will be handling, i.e., non-homogeneous data structures that can grow very fast. It is envisioned that such a platform can lead the way to a new era of scientific data sharing, in which researchers can easily share and access all kinds of datasets and, in more advanced scenarios, be presented with recommended datasets and existing research results built on top of those recommendations.
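As an illustration of the filter and aggregation operations described above (a minimal sketch; the collection name and fields are hypothetical, not taken from the thesis), a MongoDB aggregation pipeline built with pymongo might look like this:

```python
from pymongo import MongoClient

# Sketch of the filter-then-aggregate pattern for building a subset of a
# published dataset. Collection name and fields are hypothetical.
client = MongoClient("mongodb://localhost:27017")
measurements = client["datasets"]["measurements"]

pipeline = [
    {"$match": {"station": "A1", "year": {"$gte": 2010}}},          # filter
    {"$group": {"_id": "$year", "avg_value": {"$avg": "$value"}}},  # aggregate
    {"$sort": {"_id": 1}},                                          # order by year
]
for row in measurements.aggregate(pipeline):
    print(row)
```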
Abstract:
Visual perception, information design for the brain, discussion of good and bad visualisations
Abstract:
Lectures on COMP6234
Abstract:
Introduction: Hemodynamic monitoring is a tool for diagnosing cardiogenic shock and for monitoring the response to treatment; it can be invasive, minimally invasive or non-invasive. It is routinely performed with a pulmonary artery catheter (PAC), or Swan-Ganz catheter; newer minimally invasive hemodynamic monitoring techniques have lower complication rates. It is currently unknown which monitoring technique is safest for patients with cardiogenic shock. Objective: To evaluate the safety of invasive compared with minimally invasive hemodynamic monitoring in adult intensive-care patients with cardiogenic shock. Design: Systematic review of the literature. Searches were run in PubMed, EMBASE, OVID - Cochrane Library, Lilacs, SciELO, clinical trial registries, conference proceedings and repositories, with grey-literature searches in Google Scholar, Teseo and OpenGrey, up to August 2016, for publications in English and Spanish. Results: A single study of 331 critically ill patients comparing hemodynamic monitoring with PAC versus PiCCO was identified; it concluded that, after correction for confounders, the choice of monitoring type did not influence the most important clinical outcomes in terms of complications and mortality. Since other diagnoses were included, the results cannot be extrapolated to cardiogenic shock alone. Conclusion: In the available literature there is no evidence that invasive hemodynamic monitoring, compared with minimally invasive monitoring, differs in terms of complications and mortality in critically ill adult patients with cardiogenic shock.
Abstract:
Companies worldwide currently invest significant effort in materiality analysis, whose aim is to explain corporate sustainability in an annual report. Materiality reflects the most important social, economic and environmental issues for a company and its stakeholders. Many studies and standards have been proposed to establish the main steps for identifying the specific topics to be included in a sustainability report. However, few quantitative and structured approaches help in understanding how to deal with the identified topics and how to prioritise them so as to highlight the most valuable ones. Moreover, traditional approaches involve a lengthy and complex procedure in which many people have to be reached and interviewed, and several companies' reports have to be read, to extract the material topics to be discussed in the sustainability report. This dissertation proposes an automated mechanism that gathers stakeholders' and the company's opinions to identify relevant issues. To accomplish this, text mining techniques are exploited to analyse textual documents written by either a stakeholder or the reporting company. A measure of how much a document deals with a set of defined topics is then extracted. Finally, this information is used to prioritise topics according to how much the author's opinion matters. The entire work is based on a real case study in the telecommunications domain.
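One common way to score how much a document deals with a set of defined topics (a sketch of the general technique; the abstract does not specify the dissertation's actual text mining method, and the topics and document below are invented) is cosine similarity between TF-IDF vectors:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Invented topic descriptions and stakeholder document.
topics = {
    "privacy": "data protection customer privacy personal information security",
    "emissions": "carbon footprint energy consumption greenhouse gas emissions",
}
document = "Our network upgrades cut energy consumption and carbon emissions."

# Vectorise topics and document in a shared TF-IDF space, then score each
# topic by its cosine similarity to the document.
matrix = TfidfVectorizer().fit_transform(list(topics.values()) + [document])
doc_vec = matrix[-1]
for i, name in enumerate(topics):
    print(f"{name}: {cosine_similarity(matrix[i], doc_vec)[0, 0]:.2f}")
```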
Abstract:
In 2007 Associate Professor Jay Hall retires from the University of Queensland after more than 30 years of service to the Australian archaeological community. Celebrated as a gifted teacher and a pioneer of Queensland archaeology, Jay leaves a rich legacy of scholarship and achievement across a wide range of archaeological endeavours. An Archæological Life brings together past and present students, colleagues and friends to celebrate Jay’s contributions, influences and interests.
Abstract:
There is an increasing awareness that the articulation of forensic science and criminal investigation is critical to the resolution of crimes. However, models and methods to support effective collaboration between these partners are still poorly expressed or even lacking. Three propositions are borrowed from crime intelligence methods in order to bridge this gap: (a) the general intelligence process; (b) the analysis of investigative problems along principal perspectives: entities and their relationships, time and space, and quantitative aspects; and (c) visualisation methods as a mode of expression of a problem in these dimensions. Indeed, in a collaborative framework, different kinds of visualisations integrating forensic case data can play a central role in supporting decisions. Among them, link-charts are scrutinised for their ability to structure and ease the analysis of a case by describing how relevant entities are connected. However, designing an informative chart that does not bias the reasoning process is not straightforward. Using visualisation as a catalyst for a collaborative approach integrating forensic data thus calls for better specifications.
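As a minimal illustration of the link-chart idea (all entities and relations below are invented for the example; this is not the authors' tooling), case entities and their connections can be modelled as a labelled graph, here with networkx:

```python
import networkx as nx

# Link-chart sketch: entities (people, phones, scenes) as nodes, case-data
# relations as labelled edges. All data is invented.
G = nx.Graph()
G.add_edge("Suspect A", "Phone 1", relation="uses")
G.add_edge("Suspect B", "Phone 1", relation="called")
G.add_edge("Suspect A", "Scene 1", relation="DNA trace")
G.add_edge("Suspect B", "Scene 1", relation="seen near")

# Highly connected entities often link otherwise separate parts of a case.
print(sorted(G.degree, key=lambda kv: kv[1], reverse=True))
```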
Abstract:
Whether for investigative or intelligence purposes, crime analysts often face the need to analyse the spatiotemporal distribution of crimes or of traces left by suspects. This article presents a visualisation methodology supporting recurrent practical analytical tasks, such as the detection of crime series or the analysis of traces left by digital devices like mobile phones or GPS units. The proposed approach has led to the development of a dedicated tool that has proven its effectiveness in real inquiries and intelligence practice. It supports a more fluent visual analysis of the collected data and may provide critical clues to support police operations, as exemplified by the presented case studies.