932 results for Data and Information
Abstract:
Overview of the key aspects of and approaches to open access, open data and open science, with emphasis on sharing scientific knowledge for sustainable progress and development.
Abstract:
Overview of the growth of policies and a critical appraisal of the issues affecting open access, open data and open science policies. Example policies and a roadmap for open access, open research data and open science are included.
Abstract:
The breadth and depth of available clinico-genomic information present an enormous opportunity for improving our ability to study disease mechanisms and meet the needs of individualised medicine. A difficulty arises when the results are to be transferred 'from bench to bedside'. Diversity of methods is one of the causes, but the most critical one relates to our inability to share and jointly exploit data and tools. This paper presents a perspective on the current state of the art in the analysis of clinico-genomic data and its relevance to medical decision support. It is an attempt to investigate the issues related to data and knowledge integration. Copyright © 2010 Inderscience Enterprises Ltd.
Abstract:
This research pursued the conceptualization and real-time verification of a system that allows a computer user to control the cursor of a computer interface without using his or her hands. The target user groups for this system are individuals who are unable to use their hands due to spinal dysfunction or other afflictions, and individuals who must use their hands for higher-priority tasks while still requiring interaction with a computer. The system receives two forms of input from the user: Electromyogram (EMG) signals from muscles in the face and point-of-gaze coordinates produced by an Eye Gaze Tracking (EGT) system. In order to produce reliable cursor control from the two forms of user input, the development of this EMG/EGT system addressed three key requirements: an algorithm was created to accurately translate EMG signals due to facial movements into cursor actions, a separate algorithm was created that recognized an eye gaze fixation and provided an estimate of the associated eye gaze position, and an information fusion protocol was devised to efficiently integrate the outputs of these algorithms. Experiments were conducted to compare the performance of EMG/EGT cursor control to EGT-only control and mouse control. These experiments took the form of two different types of point-and-click trials. The data produced by these experiments were evaluated using statistical analysis, Fitts' Law analysis and target re-entry (TRE) analysis. The experimental results revealed that though EMG/EGT control was slower than EGT-only and mouse control, it provided effective hands-free control of the cursor without a spatial accuracy limitation, and it also facilitated a reliable click operation. This combination of qualities is not possessed by either EGT-only or mouse control, making EMG/EGT cursor control a unique and practical alternative for a user's cursor control needs.
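The point-and-click trials above were evaluated with Fitts' Law analysis. As a rough illustration of what such an analysis computes, the sketch below derives the Shannon-formulation index of difficulty and the throughput of a single trial; the distance, target width and movement time are hypothetical values, not data from the study.

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts' index of difficulty (bits)."""
    return math.log2(distance / width + 1)

def throughput(distance: float, width: float, movement_time_s: float) -> float:
    """Throughput (bits/s) for a single point-and-click trial."""
    return index_of_difficulty(distance, width) / movement_time_s

# Hypothetical trial: a 400-px movement to a 40-px target completed in 1.8 s
print(f"ID = {index_of_difficulty(400, 40):.2f} bits")
print(f"TP = {throughput(400, 40, 1.8):.2f} bits/s")
```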
Abstract:
The purpose of this study was to test Lotka's law of scientific publication productivity, using the methodology outlined by Pao (1985), in the field of Library and Information Studies (LIS). Lotka's law has been sporadically tested in the field over the past 30+ years, but the results of these studies are inconclusive due to the varying methods employed by the researchers. A data set of 1,856 citations found using the ISI Web of Knowledge databases was studied. The values of n and c were calculated to be 2.1 and 0.6418 (64.18%), respectively. The Kolmogorov-Smirnov (K-S) one-sample goodness-of-fit test was conducted at the 0.10 level of significance. The Dmax value is 0.022758 and the calculated critical value is 0.026562. It was determined that the null hypothesis, stating that there is no difference between the observed distribution of publications and the distribution obtained using Lotka's and Pao's procedure, could not be rejected. This study finds that literature in the field of Library and Information Studies does conform to Lotka's law with reliable results. As a result, Lotka's law can be used in LIS as a standardized means of measuring author publication productivity, leading to findings that are comparable on many levels (e.g., department, institution, nation). Lotka's law can be employed as an empirically proven analytical tool to establish publication productivity benchmarks for faculty and faculty librarians. Recommendations for further study include (a) exploring the characteristics of the high and low producers; (b) finding a way to successfully account for collaborative contributions in the formula; and (c) a detailed study of institutional policies concerning publication productivity and its impact on the appointment, tenure and promotion process of academic librarians.
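The study fits the generalized Lotka's law, f(y) = c / y^n, with n = 2.1 and c = 0.6418, and checks the fit with a one-sample K-S test. The sketch below shows how such a D_max statistic can be computed; the author-count distribution is hypothetical (only n and c come from the abstract), and the 0.10-level critical value is conventionally approximated as 1.22/sqrt(N).

```python
import math

n, c = 2.1, 0.6418  # values reported in the abstract

def expected_proportion(y: int) -> float:
    """Generalized Lotka's law: expected share of authors with y publications."""
    return c / y ** n

# Hypothetical observed distribution: authors[y] = number of authors with y papers
authors = {1: 1200, 2: 300, 3: 120, 4: 60, 5: 40}
total = sum(authors.values())

# One-sample K-S statistic: largest gap between the cumulative distributions
obs_cum = exp_cum = d_max = 0.0
for y in sorted(authors):
    obs_cum += authors[y] / total
    exp_cum += expected_proportion(y)
    d_max = max(d_max, abs(obs_cum - exp_cum))

print(f"D_max = {d_max:.6f}")
print(f"0.10-level critical value ~= {1.22 / math.sqrt(total):.6f}")
```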
Abstract:
The age of organic material discharged by rivers provides information about its sources and carbon cycling processes within watersheds. While elevated ages in fluvially transported organic matter are usually explained by erosion of soils and sediments, it is commonly assumed that mainly young organic material is discharged from flat tropical watersheds due to their extensive plant cover and high carbon turnover. Here we present compound-specific radiocarbon data of terrigenous organic fractions from a sedimentary archive offshore of the Congo River, in conjunction with molecular markers for methane-producing land cover reflecting wetland extent in the watershed. We find that the Congo River has been discharging aged organic matter for several thousand years, with increasing ages from the mid- to the Late Holocene. This suggests that aged organic matter in modern samples is concealed by radiocarbon from nuclear weapons testing. By comparison with indicators of past rainfall changes we detect a systematic control of organic matter sequestration and release by continental hydrology, mediating temporary carbon storage in wetlands. As aridification also leads to exposure and rapid remineralization of large amounts of previously stored labile organic matter, we infer that this process may cause a profound direct climate feedback currently underestimated in carbon cycle assessments.
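The "ages" discussed here are radiocarbon ages. As background (this is the standard Stuiver & Polach 1977 convention, not the paper's specific dating workflow), a conventional 14C age follows from the measured fraction modern carbon via the Libby mean life; the sketch below applies that relation to a hypothetical fraction-modern value.

```python
import math

LIBBY_MEAN_LIFE = 8033  # years; Stuiver & Polach (1977) convention

def radiocarbon_age(fraction_modern: float) -> float:
    """Conventional 14C age (years BP) from measured fraction modern carbon."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

# Hypothetical value for an aged terrigenous organic fraction
print(f"F14C = 0.80 -> {radiocarbon_age(0.80):.0f} 14C years BP")
```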
Abstract:
The Lena River Delta, situated in Northern Siberia (72.0-73.8° N, 122.0-129.5° E), is the largest Arctic delta and covers 29,000 km². Since natural deltas are characterised by complex geomorphological patterns and various types of ecosystems, high spatial resolution information on the distribution and extent of the delta environments is necessary for a spatial assessment and accurate quantification of biogeochemical processes as drivers for the emission of greenhouse gases from tundra soils. In this study, the first land cover classification of the entire Lena Delta based on Landsat 7 Enhanced Thematic Mapper (ETM+) images was conducted and used to quantify methane emissions from the delta ecosystems on the regional scale. The applied supervised minimum distance classification was very effective with the few ancillary data that were available for training site selection. Nine land cover classes of aquatic and terrestrial ecosystems in the wetland-dominated (72%) Lena Delta could be defined by this classification approach. The mean daily methane emission of the entire Lena Delta was calculated to be 10.35 mg CH4/m²/d. Taking our multi-scale approach into account, we find that the methane source strength of certain tundra wetland types is lower than previously calculated on coarser scales.
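The method named above is a supervised minimum distance classifier: each pixel is assigned to the land cover class whose mean training-site spectrum is nearest in spectral feature space. A minimal NumPy sketch of that rule, using made-up two-band reflectances and class means rather than actual ETM+ data:

```python
import numpy as np

def minimum_distance_classify(pixels: np.ndarray, class_means: np.ndarray) -> np.ndarray:
    """Assign each pixel to the class with the nearest (Euclidean) spectral mean.

    pixels:      (n_pixels, n_bands) reflectance values
    class_means: (n_classes, n_bands) mean spectra derived from training sites
    """
    # Distance from every pixel to every class mean, then take the closest class
    dists = np.linalg.norm(pixels[:, None, :] - class_means[None, :, :], axis=2)
    return np.argmin(dists, axis=1)

# Hypothetical example: 3 pixels, 2 bands, 2 classes (e.g. water vs. wet tundra)
pixels = np.array([[0.05, 0.02], [0.20, 0.30], [0.06, 0.04]])
means = np.array([[0.05, 0.03],   # class 0: water
                  [0.22, 0.28]])  # class 1: wet tundra
print(minimum_distance_classify(pixels, means))  # -> [0 1 0]
```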
Abstract:
Advances in communication, navigation and imaging technologies are expected to fundamentally change the methods currently used to collect data. Electronic data interchange strategies will also minimize data handling and automatically update files at the point of capture. This report summarizes the outcome of using a multi-camera platform as a method to collect roadway inventory data. It defines basic system requirements as expressed by users who applied these techniques, and examines how the application of the technology met those needs. A sign inventory case study was used to determine the advantages of creating and maintaining such a database, which also provides the capability to monitor performance criteria for a Safety Management System. The project identified that at least 75 percent of the data elements needed for a sign inventory can be gathered by viewing a high-resolution image.
Abstract:
BACKGROUND: Invasive meningococcal disease is a significant cause of mortality and morbidity in the UK. Administration of chemoprophylaxis to close contacts reduces the risk of a secondary case. However, unnecessary chemoprophylaxis may be associated with adverse reactions, increased antibiotic resistance and removal of organisms, such as Neisseria lactamica, which help to protect against meningococcal disease. Limited evidence exists to suggest that overuse of chemoprophylaxis may occur. This study aimed to evaluate prescribing of chemoprophylaxis for contacts of meningococcal disease by general practitioners and hospital staff. METHODS: A retrospective case-note review of cases of meningococcal disease was conducted in one health district from 1st September 1997 to 31st August 1999. Routine hospital and general practitioner prescribing data were searched for chemoprophylactic prescriptions of rifampicin and ciprofloxacin. A questionnaire of general practitioners was undertaken to obtain more detailed information. RESULTS: Prescribing by hospital doctors was in line with recommendations by the Consultant for Communicable Disease Control. General practitioners prescribed 118% more chemoprophylaxis than was recommended. Size of practice and training status did not affect the level of additional prescribing, but there were significant differences by geographical area. The highest levels of prescribing occurred in areas with high disease rates and associated publicity. However, some true close contacts did not appear to receive prophylaxis. CONCLUSIONS: Receipt of chemoprophylaxis is affected by a series of patient, doctor and community interactions. High publicity appears to increase demand for prophylaxis. Some true contacts do not receive appropriate chemoprophylaxis and are left at an unnecessarily increased risk.
Abstract:
The goal was to understand, document and model how information currently flows internally in the largest dairy organization in Finland. The organization has undergone radical changes in recent years due to economic sanctions between the European Union and Russia; its ultimate goal is therefore to continue its growth by managing its sales process more efficiently. The thesis consists of a literature review and an empirical part. The literature review covers knowledge management and process modeling theories. First, the knowledge management part discusses how data, information and knowledge are exchanged in the process; knowledge management models and processes describe how knowledge is created, exchanged and managed in an organization. Secondly, the process modeling part addresses the visualization of information flow, discussing modeling approaches and presenting different methods and techniques. Finally, the process documentation procedure is presented. A constructive research approach was used to identify problems and bottlenecks in the process, and possible solutions were proposed on that basis. The empirical part of the study is based on 37 interviews, the organization's internal data sources and the theoretical framework. The acquired data and information were used to document and model the sales process in question with a flowchart diagram. Results were derived from the construction of the flowchart diagram and analysis of the documentation, and the answers to the research questions draw on both the empirical and theoretical parts. In the end, 14 problems and two bottlenecks were identified in the process. The most important problems relate to the lack of a standardized approach to information sharing, insufficient utilization of information technology tools and a lack of systematic documentation. The bottlenecks are caused by the alarming number of changes made to files after their deadlines.
Abstract:
Changes in the time and location of surface temperature in a water body have an important effect on climate activity, marine biology, sea currents, salinity and other characteristics of sea and lake water. Traditional measurement of temperature is costly and time-consuming because of its dispersion and instability. In recent years, the use of satellite technology and remote sensing for data acquisition and parameter analysis in climatology and oceanography has developed considerably. In this research we used NOAA satellite images from the AVHRR system to compare field surface temperature data with information from the satellite images. Ten satellite images were used in this project. These images were calibrated against field data collected at the exact time of the satellite pass over the area. The result was a significant relation between surface temperatures from satellite data and the field work. Since a relative error of less than 40% between the two data sets is considered acceptable, the maximum error observed here, 21.2%, can be regarded as acceptable. At all stations the satellite measurements were usually lower than the field data, which corresponds with global results as well. As this sea spans a wide range of latitudes, differences in temperature are natural, but this factor is not the only cause of surface currents. The satellite image information was extracted with the ERDAS software, and the Surfer software was used to plot the isotherm lines.
Abstract:
The key to graduating professionals with sufficient capacity to meet the research demands of users lies in the vision and commitment of schools and their academic libraries. All these efforts must be linked systematically to ensure the use of the data recorded in their knowledge and information units.