798 results for Data-Intensive Science
Abstract:
Summary: Regionalization of the clay content of cultivated soils by means of geostatistics and point data
Abstract:
Bioinformatics is a rapidly evolving research field dedicated to analyzing and managing biological data with computational resources. This paper aims to overview some of the processes and applications currently implemented at CCiT-UB's Bioinformatics Unit, focusing mainly on the areas of Genomics, Transcriptomics and Proteomics.
Abstract:
Here we present a 30,000-year low-resolution climate record reconstructed from groundwater data. The investigated site is located in the Bohemian Cretaceous Basin, in the corridor between the Scandinavian ice sheet and the Alpine ice field. Noble gas temperatures (NGT), obtained from groundwater data, preserved multicentennial temperature variability and indicated a cooling of at least 5-7 °C during the last glacial maximum (LGM). This is further confirmed by the depleted δ18O and δ2H values at the LGM. High excess air (ΔNe) at the end of the Pleistocene is possibly related to abrupt changes in recharge dynamics due to the progression and retreat of ice covers and permafrost. These results agree with the fact that during the LGM permafrost and small glaciers developed in the inner valleys of the Giant Mountains (located in the watershed of the aquifers). A temporal decrease of deuterium excess from the pre-industrial Holocene to the present day is linked to an increase in air temperatures, and probably also to an increase of water pressure at the source region of precipitation over the past few hundred years.
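For reference, the deuterium excess discussed above is conventionally defined from the two stable isotope ratios (the abstract itself does not restate the definition):

\[
d = \delta^{2}\mathrm{H} - 8\,\delta^{18}\mathrm{O}
\]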
Abstract:
Final degree project consisting of the exploitation of a data warehouse for the analysis of information on road vehicle traffic.
Abstract:
Purpose: This study seeks to analyse the policies of library and information science (LIS) journals regarding the publication of supplementary materials, the number of journals and articles that include this feature, the kind of supplementary materials published with regard to their function in the article, the formats employed and the access provided to readers. Design/methodology/approach: The study analysed the instructions for authors of LIS journals indexed in the ISI Journal Citation Reports, as well as the supplementary materials attached to the articles published in their 2011 online volumes. Findings: Large publishers are more likely to have a policy regarding the publication of supplementary materials, and policies are usually homogeneous across all the journals of a given publisher. Most policies state the acceptance of supplementary materials, and even journals without a policy publish supplementary materials. The majority of supplementary materials provided in LIS articles are extended methodological explanations and additional results in the form of textual information in PDF or Word files. Some toll-access journals provide open access to these files for any reader. Originality/value: This study provides new insights into the characteristics of supplementary materials in LIS journals. The results may be used by journal publishers to establish a policy on the publication of supplementary materials and, more broadly, to develop data sharing initiatives in academic settings.
Abstract:
The present paper advocates for the creation of a federated, hybrid database in the cloud, integrating law data from all available public sources in one single open access system - adding, in the process, relevant metadata to the indexed documents, including the identification of social and semantic entities and the relationships between them, using linked open data techniques and standards such as RDF. Examples of potential benefits and applications of this approach are also provided, including, among others, experiences from our previous research, in which data integration, graph databases and social and semantic network analysis were used to identify power relations, litigation dynamics and cross-reference patterns both intra- and inter-institutionally, covering most of the world's international economic courts.
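As an illustration of the linked open data approach sketched in the abstract above, the following Python snippet uses the rdflib package with a hypothetical namespace, identifiers and predicates (none of these names come from the paper) to attach semantic metadata to an indexed decision and link it to related entities:

```python
# Illustrative sketch only: the namespace, identifiers and predicates are hypothetical.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, RDFS

LAW = Namespace("http://example.org/lawdata/")    # hypothetical vocabulary

g = Graph()
decision = LAW["decision/D-001"]                  # an indexed court decision
court = LAW["court/InternationalTradeCourt"]      # a semantic entity
party = LAW["party/StateA"]                       # a social entity

g.add((decision, RDF.type, LAW.Decision))
g.add((decision, LAW.issuedBy, court))            # institutional relationship
g.add((decision, LAW.involvesParty, party))       # social relationship
g.add((decision, LAW.cites, LAW["decision/D-002"]))  # cross-reference edge
g.add((decision, RDFS.label, Literal("Decision D-001")))

print(g.serialize(format="turtle"))               # export as linked open data
```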
Abstract:
The flourishing number of publications on the use of isotope ratio mass spectrometry (IRMS) in forensic science denotes the enthusiasm and the attraction generated by this technology. IRMS has demonstrated its potential to distinguish chemically identical compounds coming from different sources. Despite the numerous applications of IRMS to a wide range of forensic materials, its implementation in a forensic framework is less straightforward than it appears. In addition, each laboratory has developed its own strategy of analysis on calibration, sequence design, standards utilisation and data treatment, without a clear consensus. Through the experience acquired from research undertaken in different forensic fields, we propose a methodological framework for the whole process of using IRMS methods. We emphasize the importance of considering isotopic results as part of a whole approach when applying this technology to a particular forensic issue. The process is divided into six different steps, which should be considered for a thoughtful and relevant application. The dissection of this process into fundamental steps, further detailed, enables a better understanding of the essential, though not exhaustive, factors that have to be considered in order to obtain results of quality, sufficiently robust to proceed to retrospective analyses or interlaboratory comparisons.
Abstract:
Lithium is an efficacious agent for the treatment of bipolar disorder, but it is unclear to what extent its long-term use may result in neuroprotective or toxic consequences. Medline was searched with the combination of the word 'Lithium' plus key words that referred to every possible effect on the central nervous system. The papers were further classified into those supporting a neuroprotective effect, those in favour of a neurotoxic effect and those that were neutral. The papers were classified into research in humans, animal and in-vitro research, case reports, and review/opinion articles. Finally, the Natural Standard evidence-based validated grading rationale was used to validate the data. The Medline search returned 970 papers up to February 2006. Inspection of the abstracts supplied 214 papers for further reviewing. Eighty-nine papers supported the neuroprotective effect (6 human research, 58 animal/in vitro, 0 case reports, 25 review/opinion articles). A total of 116 papers supported the neurotoxic effect (17 human research, 23 animal/in vitro, 60 case reports, 16 review/opinion articles). Nine papers supported neither hypothesis (5 human research, 3 animal/in vitro, 0 case reports, 1 review/opinion article). Overall, the grading suggests that the evidence concerning the effect of lithium therapy is at level C, that is, 'unclear or conflicting scientific evidence', since there is conflicting evidence from uncontrolled non-randomized studies accompanied by conflicting evidence from animal and basic science studies. Although more papers are in favour of the toxic effect, the great difference in the type of papers that support either hypothesis, along with publication bias and methodological issues, makes conclusions difficult. Lithium remains the 'gold standard' for the prophylaxis of bipolar illness; however, our review suggests that there is a rare possibility of a neurotoxic effect in real-life clinical practice, even in closely monitored patients with 'therapeutic' lithium plasma levels. It is desirable to keep lithium blood levels as low as is feasible with prophylaxis.
Abstract:
Pseudomonas aeruginosa is one of the leading nosocomial pathogens in intensive care units (ICUs). The source of this microorganism can be either endogenous or exogenous. The proportion of cases resulting from transmission is still debated, and its elucidation is important for implementing appropriate control measures. To understand the relative importance of exogenous vs. endogenous sources of P. aeruginosa, molecular typing was performed on all available P. aeruginosa isolated from ICU clinical and environmental specimens in 1998, 2000, 2003, 2004 and 2007. Patient samples were classified according to their P. aeruginosa genotypes into three categories: (A) identical to an isolate from a faucet; (B) identical to at least one other patient sample and not found in a faucet; and (C) unique genotype. Cases in categories A and B were considered as possibly exogenous, and cases in category C as possibly endogenous. On average, 34 cases per 1000 admissions per year were found to be colonized or infected by P. aeruginosa. Higher levels of faucet contamination were correlated with a higher number of cases in category A. The number of cases in category B varied from 1.9 to 20 cases per 1000 admissions. This number exceeded 10/1000 admissions on three occasions and was correlated with an outbreak on one occasion. The number of cases considered as endogenous (category C) was stable and independent of the number of cases in categories A and B. The present study shows that repeated molecular typing can help identify variations in the epidemiology of P. aeruginosa in ICU patients and guide infection control measures.
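A minimal sketch of the genotype-based classification rule described above (a hypothetical helper; genotypes are represented here as simple comparable labels rather than actual typing profiles):

```python
def classify_case(patient_genotype, faucet_genotypes, other_patient_genotypes):
    """Assign a patient isolate to category A, B or C as defined in the abstract.

    A: genotype identical to an isolate from a faucet (possibly exogenous).
    B: genotype identical to at least one other patient sample and not found
       in a faucet (possibly exogenous).
    C: unique genotype (possibly endogenous).
    """
    if patient_genotype in faucet_genotypes:
        return "A"
    if patient_genotype in other_patient_genotypes:
        return "B"
    return "C"

# Example with made-up genotype labels
print(classify_case("G12", faucet_genotypes={"G07", "G12"},
                    other_patient_genotypes={"G03"}))   # -> A
```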
Abstract:
The paper deals with the development and application of a methodology for the automatic mapping of pollution/contamination data. The General Regression Neural Network (GRNN) is considered in detail and is proposed as an efficient tool to solve this problem. The automatic tuning of isotropic and anisotropic GRNN models using a cross-validation procedure is presented. Results are compared with a k-nearest-neighbours interpolation algorithm using an independent validation data set. The quality of the mapping is controlled by the analysis of the raw data and the residuals using variography. Maps of the probabilities of exceeding a given decision level and 'thick' isoline visualization of the uncertainties are presented as examples of decision-oriented mapping. A real case study is based on the mapping of radioactively contaminated territories.
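A minimal sketch of an isotropic GRNN (a Gaussian-kernel weighted average, i.e. Nadaraya-Watson regression) with the kernel width tuned by leave-one-out cross-validation, in the spirit of the automatic tuning described above; the code, variable names and candidate bandwidths are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def grnn_predict(x_query, x_train, y_train, sigma):
    """Isotropic GRNN: Gaussian-kernel weighted average of training values."""
    d2 = np.sum((x_train - x_query) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return np.sum(w * y_train) / (np.sum(w) + 1e-12)

def tune_sigma(x_train, y_train, candidates):
    """Leave-one-out cross-validation over a grid of kernel widths."""
    best_sigma, best_mse = None, np.inf
    for sigma in candidates:
        errors = []
        for i in range(len(y_train)):
            mask = np.arange(len(y_train)) != i
            pred = grnn_predict(x_train[i], x_train[mask], y_train[mask], sigma)
            errors.append((pred - y_train[i]) ** 2)
        mse = np.mean(errors)
        if mse < best_mse:
            best_sigma, best_mse = sigma, mse
    return best_sigma

# Example with synthetic contamination data (coordinates in km, arbitrary units)
rng = np.random.default_rng(0)
coords = rng.uniform(0, 10, size=(50, 2))
values = np.sin(coords[:, 0]) + 0.1 * rng.normal(size=50)
sigma = tune_sigma(coords, values, candidates=[0.1, 0.3, 0.5, 1.0, 2.0])
print(sigma, grnn_predict(np.array([5.0, 5.0]), coords, values, sigma))
```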
Abstract:
The objective of this work was to develop and validate a prognosis system for volume yield and basal area of intensively managed loblolly pine (Pinus taeda) stands, using stand and diameter class models compatible in basal area estimates. The data used in the study were obtained from plantations located in northern Uruguay. For model validation without data loss, a three-phase validation scheme was applied: first, the equations were fitted without the validation database; then, model validation was carried out; and, finally, the database was regrouped to recalibrate the parameter values. After the validation and final parameterization of the models, a simulation of the first commercial thinning was carried out. The developed prognosis system was precise and accurate in estimating basal area production per hectare or per diameter class. There was compatibility in basal area estimates between the diameter class and whole-stand models, with a mean difference of -0.01 m² ha⁻¹. The validation scheme applied is logical and consistent, since information on the accuracy and precision of the models is obtained without the loss of any information in the estimation of the models' parameters.
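A minimal sketch of the three-phase validation scheme described above, using a generic placeholder model and synthetic data (the actual stand and diameter class equations are not reproduced here):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Placeholder data: predictors X (e.g. age, site index, density) and basal area y
rng = np.random.default_rng(1)
X = rng.uniform(size=(200, 3))
y = X @ np.array([2.0, 1.0, 0.5]) + 0.1 * rng.normal(size=200)

# Phase 1: fit the equations without the validation database
X_fit, X_val, y_fit, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
model = LinearRegression().fit(X_fit, y_fit)

# Phase 2: carry out model validation on the withheld data
rmse = mean_squared_error(y_val, model.predict(X_val)) ** 0.5
print(f"validation RMSE: {rmse:.3f}")

# Phase 3: regroup the database and recalibrate the parameter values on all data
final_model = LinearRegression().fit(X, y)
```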
Abstract:
Gold in the quartz-pebble conglomerates of the late Archean Witwatersrand Basin, South Africa, is often intimately associated with carbonaceous matter of organic/biogenic origin, which occurs in the form of stratiform carbon seams and paragenetically late bitumen nodules. Both carbon forms are believed to have formed by solidification of migrating hydrocarbons. This paper presents bulk and molecular chemical and stable carbon isotope data for the carbonaceous matter, all of which are used to provide a clue to the source of the hydrocarbons. These data are compared with those from intra-basinal shales and the overlying dolostone of the Transvaal Supergroup. The delta C-13 values of the extracts from the Witwatersrand carbonaceous material show small differences (up to 2.4 parts per thousand) compared to the associated insoluble organic matter. This suggests that the auriferous rocks were stained by mobile hydrocarbons produced by thermal and oxidative alteration of indigenous bitumens, although a contribution from hydrocarbons derived from intra-basinal Witwatersrand shales cannot be excluded. Individual aliphatic hydrocarbons of the various carbonaceous materials were subjected to compound-specific isotope analysis using on-line gas chromatography/combustion/stable isotope ratio mass spectrometry (GC/C/IRMS). The limited variability of the molecular parameters and the uniform delta C-13 values of individual n-alkanes (-31.1 +/- 1.7 parts per thousand) and isoprenoids (-30.7 +/- 1.1 parts per thousand) in the Witwatersrand samples exclude the mixing of oils from different sources. Carbonaceous matter in the dolostones shows distinctly different bulk and molecular isotope characteristics and thus cannot have been the source of the hydrocarbons in the Witwatersrand deposits. All the various forms of Witwatersrand carbon appear indigenous to the Witwatersrand Basin, and the differences between them are explained by variable, in general probably short (centimeter- to meter-scale), hydrocarbon migration during diagenesis and subsequent hydrothermal infiltration.
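For reference, the delta C-13 values quoted above follow the standard delta notation, expressed in parts per thousand relative to a reference standard (conventionally VPDB, although the abstract does not name it):

\[
\delta^{13}\mathrm{C} = \left( \frac{\left(^{13}\mathrm{C}/^{12}\mathrm{C}\right)_{\text{sample}}}{\left(^{13}\mathrm{C}/^{12}\mathrm{C}\right)_{\text{standard}}} - 1 \right) \times 1000
\]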
Abstract:
Dissolved organic matter (DOM) is a complex mixture of organic compounds, ubiquitous in marine and freshwater systems. Fluorescence spectroscopy, by means of Excitation-Emission Matrices (EEM), has become an indispensable tool to study DOM sources, transport and fate in aquatic ecosystems. However, the statistical treatment of large and heterogeneous EEM data sets still represents an important challenge for biogeochemists. Recently, the Self-Organising Map (SOM) has been proposed as a tool to explore patterns in large EEM data sets. SOM is a pattern recognition method which clusters and reduces the dimensionality of input EEMs without relying on any assumption about the data structure. In this paper, we show how SOM, coupled with a correlation analysis of the component planes, can be used both to explore patterns among samples and to identify individual fluorescence components. We analysed a large and heterogeneous EEM data set, including samples from a river catchment collected under a range of hydrological conditions, along a 60-km downstream gradient, and under the influence of different degrees of anthropogenic impact. According to our results, chemical industry effluents appeared to have unique and distinctive spectral characteristics. On the other hand, river samples collected under flash flood conditions showed homogeneous EEM shapes. The correlation analysis of the component planes suggested the presence of four fluorescence components, consistent with DOM components previously described in the literature. A remarkable strength of this methodology was that outlier samples appeared naturally integrated in the analysis. We conclude that SOM coupled with a correlation analysis procedure is a promising tool for studying large and heterogeneous EEM data sets.
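A minimal sketch of the SOM plus component-plane correlation workflow described above, using the third-party minisom package on unfolded EEM vectors (an illustrative assumption; the paper does not name an implementation, and the array sizes are made up):

```python
import numpy as np
from minisom import MiniSom

# Hypothetical input: each row is one unfolded EEM (excitation x emission intensities)
rng = np.random.default_rng(2)
eems = rng.random((300, 400))            # 300 samples, 400 wavelength pairs

som = MiniSom(10, 10, eems.shape[1], sigma=1.5, learning_rate=0.5, random_seed=0)
som.train_random(eems, 5000)             # unsupervised training

# Cluster samples by their best-matching unit on the 10x10 map
bmus = np.array([som.winner(x) for x in eems])

# Component planes: one 10x10 map of weights per wavelength pair
planes = som.get_weights().reshape(100, eems.shape[1])   # (units, variables)

# Correlation between component planes; groups of strongly correlated wavelength
# pairs would point to a common underlying fluorescence component
corr = np.corrcoef(planes.T)
print(corr.shape)   # (400, 400)
```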
Abstract:
Even though research on innovation in services has expanded remarkably, especially during the past two decades, there is still a need to increase understanding of the special characteristics of service innovation. In addition to studying innovation in service companies and industries, research has also recently focused more on services in innovation, as the significance of so-called knowledge intensive business services (KIBS) for the competitive edge of their clients, other companies, regions and even nations has been demonstrated in several previous studies. This study focuses on technology-based KIBS firms, and the technology and engineering consulting (TEC) sector in particular. These firms have multiple roles in innovation systems, and thus there is also a need for in-depth studies that increase knowledge about the types and dimensions of service innovations as well as the underlying mechanisms and procedures which make the innovations successful. The main aim of this study is to generate new knowledge in the fragmented research field of service innovation management by recognizing the different types of innovations in TEC services and some of the enablers of and barriers to innovation capacity in the field, especially from the knowledge management perspective. The study also aims to shed light on some of the existing routines and new constructions needed for enhancing service innovation and knowledge processing activities in KIBS companies of the TEC sector. The main sources of data in this research include literature reviews and public data sources, and a qualitative research approach with exploratory case studies conducted with the help of interviews at technology consulting companies in Singapore in 2006. These complement the qualitative interview data gathered previously in Finland during a larger research project in the years 2004-2005. The data are also supplemented by a survey conducted in Singapore. The respondents to the survey by Tan (2007) were technology consulting companies operating in the Singapore region. The purpose of the quantitative part of the study was to validate and further examine specific aspects, such as the influence of knowledge management activities on innovativeness and the different types of service innovations in which the technology consultancies are involved. Singapore is known as a South-east Asian knowledge hub and is thus a significant research area where several multinational knowledge-intensive service firms operate. Typically, the service innovations identified in the studied TEC firms were formed by several dimensions of innovation. In addition to technological aspects, innovations were, for instance, related to new client interfaces and service delivery processes. The main enablers of and barriers to innovation seem to be partly similar in Singaporean firms as compared to the earlier study of Finnish TEC firms. The empirical studies also brought forth the significance of various sources of knowledge and knowledge processing activities as the main driving forces of service innovation in technology-related KIBS firms. A framework was also developed to study the effect of knowledge processing capabilities, as well as some moderators, on the innovativeness of TEC firms. In particular, efficient knowledge acquisition and environmental dynamism seem to influence the innovativeness of TEC firms positively.
The results of the study also contribute to the present service innovation literature by focusing more on 'innovation within KIBS' rather than 'innovation through KIBS', which has been the typical viewpoint stressed in the previous literature. Additionally, the study provides several possibilities for further research.
Abstract:
A newspaper content management system has to deal with a very heterogeneous information space, as our experience with the Diari Segre newspaper has shown. The greatest problem is to harmonise the different ways the users involved (journalists, archivists, etc.) structure the newspaper information space, i.e. news, topics, headlines, etc. Our approach is based on an ontology and differentiated universes of discourse (UoD). Users interact with the system and, from this interaction, integration rules are derived. These rules are based on Description Logic ontological relations for subsumption and equivalence. They relate the different UoD and produce a shared conceptualisation of the newspaper information domain.
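As an illustration, integration rules of the kind described above can be written as Description Logic subsumption and equivalence axioms relating concepts from two universes of discourse (the concept names below are hypothetical, not taken from the Diari Segre ontology):

\[
\textit{Journalist:Headline} \sqsubseteq \textit{Archive:DocumentTitle}, \qquad
\textit{Journalist:NewsItem} \equiv \textit{Archive:ArchivedNews}
\]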