928 results for Research data management (RDM)
Abstract:
A report of a joint ARMA, RLUK, RUGIT, SCONUL, UCISA and Jisc workshop that underpins the "Directions in Research Data Management" report. Presentations from the event can be found at: http://www.jisc.ac.uk/events/directions-for-research-data-management-in-uk-universities-06-nov-2014. A blog post about the event can be found at: http://researchdata.jiscinvolve.org/wp/2014/12/04/directions-in-research-data-management/
Abstract:
This study has investigated the medium- to long-term costs to Higher Education Institutions (HEIs) of preserving research data and has developed guidance for HEFCE and institutions on these issues. It has provided an essential methodological foundation on research data costs for the forthcoming HEFCE-sponsored feasibility study for a UK Research Data Service. It will also assist HEIs and funding bodies wishing to establish strategies and TRAC costings for long-term data management and archiving. The rising tide of digital research data raises issues relating to access, curation and preservation for HEIs, and within the UK a growing number of research funders are now implementing policies requiring researchers to submit data management, preservation or data sharing plans with their funding applications.
Abstract:
Scientific research revolves around the production, analysis, storage, management, and re-use of data. Data sharing offers important benefits for scientific progress and the advancement of knowledge. However, several limitations and barriers to the general adoption of data sharing are still in place. Probably the most important challenge is that data sharing is not yet common practice or seen as a regular activity among scholars, despite the considerable efforts being invested in promoting it. In addition, scholars show relatively little commitment to citing data. The most important problems and challenges regarding data metrics are closely tied to these more general problems of data sharing. The development of data metrics depends on the growth of data sharing practices; after all, such metrics are nothing more than a record of researchers' behaviour. At the same time, the availability of proper metrics can help researchers make their data work more visible, which may in turn act as an incentive for more data sharing, setting a virtuous circle in motion. This report seeks to further explore the possibilities of metrics for datasets (i.e. the creation of reliable data metrics) and an effective reward system that aligns the interests of the main stakeholders involved in the process. The report reviews the current literature on data sharing and data metrics. It presents interviews with the main stakeholders on data sharing and data metrics. It also analyses the existing repositories and tools in the field of data sharing that have special relevance for the promotion and development of data metrics. On the basis of these three pillars, the report presents a number of solutions and necessary developments, as well as a set of recommendations regarding data metrics. The most important recommendations include the general adoption of data sharing and data publication among scholars; the development of a reward system for scientists that includes data metrics; reducing the costs of data publication; reducing researchers' negative cultural perceptions of data publication; developing standards for the preservation, publication, identification and citation of datasets; more coordination of data repository initiatives; and further development of interoperability protocols across different actors.
Abstract:
Planning the management of data at proposal time and throughout its lifecycle is becoming increasingly important to funding agencies and is essential to ensuring the data's current usability and long-term preservation and access. This presentation will describe the work being done at the Woods Hole Oceanographic Institution (WHOI) to assist PIs with the preparation of data management plans and the role the Library plays in this process. Data management does not mean simply storing information: the emphasis is now on sharing data and making research accessible. Topics to be covered include educating staff about the NSF data policy implementation, a data management survey, resources for proposal preparation, collaboration with other librarians, and next steps.
Abstract:
Some basic concepts of fishery economics and management, and fish population dynamics, are recalled, as presented during a course held at the Instituto de Investigação Pesqueira from 23 February to 15 March 1988 in Maputo, Mozambique. Also, some basic elements of length-based stock assessment are reviewed, with emphasis on their implementation through the "Compleat ELEFAN" package, used extensively during this course, when the participants analyzed their data and wrote first drafts of manuscripts incorporating the results of these analyses. Some problems relating to sampling and to seasonal growth oscillations are discussed, with special reference to conditions in Mozambique.
Abstract:
To manage and process large amounts of oceanographic data, users need powerful tools that simplify these tasks. VODC for PC is software designed to assist in managing oceanographic data. It is based on the 32-bit Windows operating system and uses the Microsoft Access database management system. With VODC for PC, users can update data simply, convert data to several international data formats, merge multiple VODC databases into one, calculate average, minimum and maximum fields for some types of data, check for valid data…
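As a rough illustration of the kind of aggregation and validity checking described above, here is a minimal Python sketch using sqlite3 as a stand-in for the Access back end; the table, columns and plausibility ranges are hypothetical and are not taken from VODC itself.

```python
import sqlite3

# Hypothetical station-data table standing in for a VODC database
# (VODC itself uses Microsoft Access; sqlite3 is only a stand-in here).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE station_data ("
    "  station_id TEXT, depth_m REAL, temperature_c REAL, salinity_psu REAL)"
)
conn.executemany(
    "INSERT INTO station_data VALUES (?, ?, ?, ?)",
    [("ST01", 0.0, 28.4, 33.1), ("ST01", 10.0, 27.9, 33.4), ("ST01", 50.0, 24.2, 34.0)],
)

# Average / minimum / maximum per station, the sort of derived fields mentioned above.
for row in conn.execute(
    "SELECT station_id, AVG(temperature_c), MIN(temperature_c), MAX(temperature_c) "
    "FROM station_data GROUP BY station_id"
):
    print(row)

# Simple validity check: count records outside a plausible salinity range.
bad = conn.execute(
    "SELECT COUNT(*) FROM station_data WHERE salinity_psu NOT BETWEEN 0 AND 42"
).fetchone()[0]
print("suspect salinity records:", bad)
```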
Abstract:
As the UK's national marine data centre, a key responsibility of the British Oceanographic Data Centre (BODC) is to provide data management support for the scientific activities of complex multi-disciplinary long-term research programmes. Since the initial cruise in 1995, the NERC-funded Atlantic Meridional Transect (AMT) project has undertaken 18 north–south transects of the Atlantic Ocean. As the project has evolved there has been a steady growth in the number of participants, the volume of data, its complexity and the demand for data. BODC became involved in AMT in 2002, at the beginning of phase II of the programme, and has since provided continuous support to AMT and the wider scientific community through the rescue, quality control, processing and provision of access to the data. The data management is carried out by a team of specialists using a sophisticated infrastructure and hardware to manage, integrate and serve physical, biological and chemical data. Here, we discuss the approach adopted, the techniques applied and some guiding principles for the management of large multi-disciplinary programmes.
Abstract:
To date, the processing of wildlife location data has relied on a diversity of software and file formats. Data management and the subsequent spatial and statistical analyses have been undertaken in multiple steps, involving many time-consuming import/export phases. Recent technological advances in tracking systems have made large, continuous, high-frequency datasets of wildlife behavioural data available, such as those derived from the Global Positioning System (GPS) and other animal-attached sensor devices. These data can be further complemented by a wide range of other information about the animals' environment. Managing these large and diverse datasets for modelling animal behaviour and ecology can prove challenging, slowing down analysis and increasing the probability of mistakes in data handling. We address these issues by critically evaluating the requirements for good management of GPS data for wildlife biology. We highlight that dedicated data management tools and expertise are needed. We review current research in wildlife data management and suggest a general direction of development, based on a modular software architecture with a spatial database at its core, where interoperability, data model design and integration with remote-sensing data sources play an important role in successful GPS data handling.
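As a toy illustration of the database-centred, modular design suggested above, the following Python sketch keeps GPS fixes and animal metadata in a single core database, with ingestion and quality screening as separate modules (plain functions here). All table names, species, identifiers and thresholds are hypothetical, and a production system would more likely use a spatial database such as PostgreSQL/PostGIS rather than sqlite3.

```python
import sqlite3

# Core database: raw GPS fixes plus per-animal metadata.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE animals (animal_id TEXT PRIMARY KEY, species TEXT, collar_id TEXT);
CREATE TABLE gps_fixes (
    animal_id TEXT REFERENCES animals(animal_id),
    fix_time  TEXT,          -- ISO 8601 timestamp
    lon REAL, lat REAL,      -- WGS84 coordinates
    dop REAL                 -- dilution of precision, used for quality screening
);
""")

def ingest_fix(animal_id, fix_time, lon, lat, dop):
    """Acquisition module: store one raw GPS fix in the core database."""
    db.execute("INSERT INTO gps_fixes VALUES (?, ?, ?, ?, ?)",
               (animal_id, fix_time, lon, lat, dop))

def screened_fixes(max_dop=5.0):
    """Analysis module: return only fixes passing a simple quality filter."""
    return db.execute(
        "SELECT animal_id, fix_time, lon, lat FROM gps_fixes WHERE dop <= ?",
        (max_dop,)).fetchall()

db.execute("INSERT INTO animals VALUES ('roe_01', 'Capreolus capreolus', 'C123')")
ingest_fix("roe_01", "2010-05-01T02:00:00Z", 11.05, 46.02, 2.4)
ingest_fix("roe_01", "2010-05-01T03:00:00Z", 11.06, 46.03, 9.8)  # poor-quality fix
print(screened_fixes())
```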
Abstract:
There is remarkable agreement in expectations today for vastly improved ocean data management a decade from now: capabilities that will help to bring significant benefits to ocean research and to society. Advancing data management to such a degree, however, will require cultural and policy changes that are slow to effect. The technological foundations upon which data management systems are built are certain to continue advancing rapidly in parallel. These considerations argue for adopting attitudes of pragmatism and realism when planning data management strategies. In this paper we adopt those attitudes as we outline opportunities for progress in ocean data management. We begin with a synopsis of expectations for integrated ocean data management a decade from now. We discuss factors that should be considered by those evaluating candidate “standards”. We highlight challenges and opportunities in a number of technical areas, including “Web 2.0” applications, data modeling, data discovery and metadata, real-time operational data, archival of data, biological data management and satellite data management. We discuss the importance of investments in the development of software toolkits to accelerate progress. We conclude the paper by recommending a few specific, short-term implementation targets that we believe to be both significant and achievable, and by calling for action by community leadership to effect these advancements.
Abstract:
Climate-G is a large-scale distributed testbed devoted to climate change research. It is an unfunded effort started in 2008 and involving a wide community in both Europe and the US. The testbed is an interdisciplinary effort involving partners from several institutions and joining expertise in the fields of climate change and computational science. Its main goal is to allow scientists to carry out geographical and cross-institutional data discovery, access, analysis, visualization and sharing of climate data. It represents an attempt to address, in a real environment, challenging data and metadata management issues. This paper presents a complete overview of the Climate-G testbed, highlighting the most important results achieved since the beginning of the project.
Abstract:
Instrumentation and automation play a vital role in managing the water industry. These systems generate vast amounts of data that must be effectively managed in order to enable intelligent decision making. Time series data management software, commonly known as a data historian, is used for collecting and managing real-time (time series) information. More advanced software solutions provide a data infrastructure or utility-wide Operations Data Management System (ODMS) that stores, manages, calculates, displays, shares, and integrates data from the multiple disparate automation and business systems used daily in water utilities. These ODMS solutions are proven and can manage everything from smart water meter data to data shared across third-party corporations. This paper focuses on practical utility successes in the water industry, where utility managers are leveraging instantaneous access to data from proven, commercial off-the-shelf ODMS solutions to enable better real-time decision making. Successes include saving $650,000 per year in water loss control, safeguarding water quality, and saving millions of dollars through energy and asset management. Immediate opportunities exist to integrate the research being done in academia with these ODMS solutions in the field and to leverage these successes for utilities around the world.
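As a toy illustration of the data-historian pattern described above (timestamped readings from many tags collected centrally and then summarised for decision making), here is a minimal Python sketch; the tag names, values and hourly aggregation are hypothetical and stand in for a commercial ODMS.

```python
from collections import defaultdict
from dataclasses import dataclass
from datetime import datetime
from statistics import mean

@dataclass
class Reading:
    tag: str              # e.g. "pump_station_3.flow_m3h" (hypothetical tag name)
    timestamp: datetime
    value: float

class Historian:
    """Append-only store of time-series readings, keyed by tag."""

    def __init__(self):
        self._series = defaultdict(list)

    def record(self, reading: Reading) -> None:
        # Collect one time-stamped value, as a historian archive would.
        self._series[reading.tag].append(reading)

    def hourly_average(self, tag: str, hour: datetime) -> float:
        # Aggregate raw values into an hourly average for reporting.
        values = [r.value for r in self._series[tag]
                  if r.timestamp.replace(minute=0, second=0, microsecond=0) == hour]
        return mean(values)

h = Historian()
h.record(Reading("pump_station_3.flow_m3h", datetime(2013, 6, 1, 8, 5), 410.0))
h.record(Reading("pump_station_3.flow_m3h", datetime(2013, 6, 1, 8, 35), 398.0))
print(h.hourly_average("pump_station_3.flow_m3h", datetime(2013, 6, 1, 8, 0)))
```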