19 results for Linked Data

in CentAUR: Central Archive University of Reading - UK


Relevance: 100.00%

Abstract:

Traditionally, the formal scientific output in most fields of natural science has been limited to peer-reviewed academic journal publications, with less attention paid to the chain of intermediate data results and their associated metadata, including provenance. In effect, this has constrained the representation and verification of the data provenance to the confines of the related publications. Detailed knowledge of a dataset’s provenance is essential to establish the pedigree of the data for its effective re-use, and to avoid redundant re-enactment of the experiment or computation involved. It is increasingly important for open-access data to determine their authenticity and quality, especially considering the growing volumes of datasets appearing in the public domain. To address these issues, we present an approach that combines the Digital Object Identifier (DOI) – a widely adopted citation technique – with existing, widely adopted climate science data standards to formally publish detailed provenance of a climate research dataset as an associated scientific workflow. This is integrated with linked-data compliant data re-use standards (e.g. OAI-ORE) to enable a seamless link between a publication and the complete trail of lineage of the corresponding dataset, including the dataset itself.
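The OAI-ORE linkage described above can be sketched as a minimal RDF aggregation that groups a DOI-cited dataset with its provenance workflow. This is an illustrative sketch only: the DOI, workflow and aggregation URIs are hypothetical, and a real implementation would use an RDF library and the full ORE vocabulary.

```python
# Minimal sketch (hypothetical identifiers): an OAI-ORE aggregation
# linking a DOI-cited dataset to its provenance workflow, serialised
# as plain N-Triples lines.

ORE = "http://www.openarchives.org/ore/terms/"
DOI = "https://doi.org/10.5072/example-dataset"    # hypothetical DOI
WORKFLOW = "https://example.org/workflow/run-42"   # hypothetical provenance record
AGGREGATION = "https://example.org/aggregation/1"  # hypothetical aggregation URI

def triple(s, p, o):
    """Serialise one RDF triple as an N-Triples line."""
    return f"<{s}> <{p}> <{o}> ."

triples = [
    # The aggregation groups the dataset and its provenance trail.
    triple(AGGREGATION, ORE + "aggregates", DOI),
    triple(AGGREGATION, ORE + "aggregates", WORKFLOW),
]

for t in triples:
    print(t)
```

Resolving the aggregation URI would then give a machine-readable map from the publication's DOI to the complete lineage of the dataset.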

Relevance: 100.00%

Abstract:

We present an overview of the MELODIES project, which is developing new data-intensive environmental services based on data from Earth Observation satellites, government databases, national and European agencies and more. We focus here on the capabilities and benefits of the project’s “technical platform”, which applies cloud computing and Linked Data technologies to enable the development of these services, providing flexibility and scalability.

Relevance: 70.00%

Abstract:

We describe the CHARMe project, which aims to link climate datasets with publications, user feedback and other items of "commentary metadata". The system will help users learn from previous community experience and select datasets that best suit their needs, as well as providing direct traceability between conclusions and the data that supported them. The project applies the principles of Linked Data and adopts the Open Annotation standard to record and publish commentary information. CHARMe contributes to the emerging landscape of "climate services", which will provide climate data and information to influence policy and decision-making. Although the project focuses on climate science, the technologies and concepts are very general and could be applied to other fields.

Relevance: 70.00%

Abstract:

The CHARMe project enables the annotation of climate data with key pieces of supporting information that we term “commentary”. Commentary reflects the experience that has built up in the user community, and can help new or less-expert users (such as consultants, SMEs, experts in other fields) to understand and interpret complex data. In the context of global climate services, the CHARMe system will record, retain and disseminate this commentary on climate datasets, and provide a means for feeding back this experience to the data providers. Based on novel linked data techniques and standards, the project has developed a core system, data model and suite of open-source tools to enable this information to be shared, discovered and exploited by the community.

Relevance: 70.00%

Abstract:

For users of climate services, the ability to quickly determine the datasets that best fit one's needs would be invaluable. The volume, variety and complexity of climate data make this judgment difficult. The ambition of CHARMe ("Characterization of metadata to enable high-quality climate services") is to give a wider interdisciplinary community access to a range of supporting information, such as journal articles, technical reports or feedback on previous applications of the data. The capture and discovery of this "commentary" information, often created by data users rather than data providers, and currently not linked to the data themselves, has not been significantly addressed previously. CHARMe applies the principles of Linked Data and open web standards to associate, record, search and publish user-derived annotations in a way that can be read both by users and automated systems. Tools have been developed within the CHARMe project that enable annotation capability for data delivery systems already in wide use for discovering climate data. In addition, the project has developed advanced tools for exploring data and commentary in innovative ways, including an interactive data explorer and comparator ("CHARMe Maps") and a tool for correlating climate time series with external "significant events" (e.g. instrument failures or large volcanic eruptions) that affect the data quality. Although the project focuses on climate science, the concepts are general and could be applied to other fields. All CHARMe system software is open-source, released under a liberal licence, permitting future projects to re-use the source code as they wish.
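A commentary annotation of the kind CHARMe records can be sketched in the style of the Open Annotation model the project adopted, shown here using the later W3C Web Annotation JSON-LD context. The dataset and article URIs are hypothetical; real CHARMe annotations carry additional provenance detail.

```python
import json

# Minimal sketch of a commentary annotation linking a climate dataset
# (the target) to a supporting journal article (the body). All URIs
# below are invented for illustration.
annotation = {
    "@context": "http://www.w3.org/ns/anno.jsonld",  # W3C Web Annotation context
    "type": "Annotation",
    "motivation": "linking",
    "target": "https://example.org/data/climate-dataset-123",  # hypothetical dataset
    "body": "https://doi.org/10.5072/supporting-paper",        # hypothetical article
}

serialised = json.dumps(annotation, indent=2)
print(serialised)
```

Because the annotation is a standalone JSON-LD resource, it can be stored and queried separately from the dataset itself, which is what lets users rather than data providers contribute commentary.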

Relevance: 30.00%

Abstract:

As an immunogen of the coronavirus, the nucleoprotein (N) is a potential antigen for the serological monitoring of infectious bronchitis virus (IBV). In this report, recombinant N protein from the Beaudette strain of IBV was produced and purified from Escherichia coli as well as Sf9 (insect) cells, and used for the coating of enzyme-linked immunosorbent assay (ELISA) plates. The N protein produced in Sf9 cells was phosphorylated whereas N protein from E. coli was not. Our data indicated that N protein purified from E. coli was more sensitive to anti-IBV serum than the protein from Sf9 cells. The recombinant N protein did not react with the antisera to other avian pathogens, implying that it was specific in the recognition of IBV antibodies. In addition, the data from the detection of field samples and IBV strains indicated that using the recombinant protein as coating antigen could achieve an equivalent performance to an ELISA kit based on infected material extracts as a source of antigen(s). ELISAs based on recombinant proteins are safe (no live virus), clean (only virus antigens are present), specific (single proteins can be used) and rapid (to respond to new viral strains and strains that cannot necessarily be easily cultured).

Relevance: 30.00%

Abstract:

A wireless sensor network (WSN) is a group of sensors linked by a wireless medium to perform distributed sensing tasks. WSNs have attracted wide interest from academia and industry alike due to their diversity of applications in buildings, including home automation, smart environments, and emergency services. The primary goal of a WSN is to collect the data sensed by its sensors. These data are typically heavily noisy and exhibit temporal and spatial correlation. In order to extract useful information from such data, as this paper will demonstrate, various analysis techniques must be applied. Data mining is a process in which a wide spectrum of data analysis methods is used. It is applied in this paper to analyse data collected from WSNs monitoring an indoor environment in a building. A case study demonstrates how data mining can be used to optimise the use of office space in a building.
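The kind of analysis the paper applies to noisy WSN readings can be sketched minimally: smooth each sensor's time series with a moving average (exploiting temporal correlation), then measure the spatial correlation between two nearby sensors. The sensor readings below are invented for illustration.

```python
# Minimal sketch: temporal smoothing plus spatial correlation of two
# hypothetical co-located temperature sensors with independent noise.

def moving_average(xs, w):
    """Simple temporal smoothing over a window of w samples."""
    return [sum(xs[i:i + w]) / w for i in range(len(xs) - w + 1)]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

# Invented readings from two nearby sensors in the same room.
sensor_a = [20.1, 20.4, 19.8, 21.0, 20.6, 20.9, 21.3, 21.1]
sensor_b = [20.0, 20.5, 20.0, 20.8, 20.7, 21.0, 21.2, 21.2]

smooth_a = moving_average(sensor_a, 3)
smooth_b = moving_average(sensor_b, 3)
print(round(pearson(smooth_a, smooth_b), 3))
```

A high correlation after smoothing is the sort of redundancy a data mining pipeline can exploit, for example to detect faulty sensors or infer room occupancy.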

Relevance: 30.00%

Abstract:

Variational data assimilation systems for numerical weather prediction rely on a transformation of model variables to a set of control variables that are assumed to be uncorrelated. Most implementations of this transformation are based on the assumption that the balanced part of the flow can be represented by the vorticity. However, this assumption is likely to break down in dynamical regimes characterized by low Burger number. It has recently been proposed that a variable transformation based on potential vorticity should lead to control variables that are uncorrelated over a wider range of regimes. In this paper we test the assumption that a transform based on vorticity and one based on potential vorticity produce an uncorrelated set of control variables. Using a shallow-water model we calculate the correlations between the transformed variables in the different methods. We show that the control variables resulting from a vorticity-based transformation may retain large correlations in some dynamical regimes, whereas a potential vorticity based transformation successfully produces a set of uncorrelated control variables. Calculations of spatial correlations show that the benefit of the potential vorticity transformation is linked to its ability to capture more accurately the balanced component of the flow.
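The transformation discussed above rests on the shallow-water potential vorticity; the definition below is the standard textbook form, with conventional symbols rather than notation taken from the paper itself:

```latex
% Shallow-water potential vorticity:
% \zeta = relative vorticity, f = Coriolis parameter, h = fluid depth.
q = \frac{\zeta + f}{h}
```

At low Burger number the mass field $h$ carries much of the balanced signal, which is why a transform based on $q$, rather than on $\zeta$ alone, can better separate balanced and unbalanced control variables.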

Relevance: 30.00%

Abstract:

Widespread reports of low pollination rates suggest a recent anthropogenic decline in pollination that could threaten natural and agricultural ecosystems. Nevertheless, unequivocal evidence for a decline in pollination over time has remained elusive because it was not possible to determine historical pollination rates. Here we demonstrate a widely applicable method for reconstructing historical pollination rates, thus allowing comparison with contemporary rates from the same sites. We focused on the relationship between the oil-collecting bee Rediviva peringueyi (Melittidae) and the guild of oil-secreting orchid species (Coryciinae) that depends on it for pollination. The guild is distributed across the highly transformed and fragmented lowlands of the Cape Region of South Africa. We show that rehydrated herbarium specimens of Pterygodium catholicum, the most abundant member of the guild, contain a record of past pollinator activity in the form of pollinarium removal rates. Analysis of a pollination time series showed a recent decline in pollination on Signal Hill, a small urban conservation area. The same herbaria contain historical species occurrence data. We analyzed this data and found that there has been a contemporaneous shift in orchid guild composition in urban areas due to the local extirpation of the non-clonal species, consistent with their greater dependence on seeds and pollination for population persistence.

Relevance: 30.00%

Abstract:

The rapid expansion of the TMT sector in the late 1990s and more recent growing regulatory and corporate focus on business continuity and security have raised the profile of data centres. Data centres offer a unique blend of occupational, physical and technological characteristics compared to conventional real estate assets. Limited trading and heterogeneity of data centres also causes higher levels of appraisal uncertainty. In practice, the application of conventional discounted cash flow approaches requires information about a wide range of inputs that is difficult to derive from limited market signals or estimate analytically. This paper outlines an approach that uses pricing signals from similar traded cash flows. Based upon ‘the law of one price’, the method draws upon the premise that two identical future cash flows must have the same value now. Given the difficulties of estimating exit values, an alternative is that the expected cash flows of a data centre are analysed over the life cycle of the building, with corporate bond yields used to provide a proxy for the appropriate discount rates for lease income. Since liabilities are quite diverse, a number of proxies are suggested as discount and capitalisation rates, including index-linked, fixed-interest and zero-coupon bonds. Although there are rarely assets that have identical cash flows and some approximation is necessary, the level of appraiser subjectivity is dramatically reduced.

Relevance: 30.00%

Abstract:

This paper analyses the appraisal of a specialized form of real estate - data centres - that has a unique blend of locational, physical and technological characteristics that differentiate it from conventional real estate assets. Market immaturity, limited trading and a lack of pricing signals enhance levels of appraisal uncertainty and disagreement relative to conventional real estate assets. Given the problems of applying standard discounted cash flow, an approach to appraisal is proposed that uses pricing signals from traded cash flows that are similar to the cash flows generated from data centres. Based upon ‘the law of one price’, it is assumed that two assets that are expected to generate identical cash flows in the future must have the same value now. It is suggested that the expected cash flow of assets should be analysed over the life cycle of the building. Corporate bond yields are used to provide a proxy for the appropriate discount rates for lease income. Since liabilities are quite diverse, a number of proxies are suggested as discount and capitalisation rates, including index-linked, fixed-interest and zero-coupon bonds.
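The pricing-by-analogy idea described in the two appraisal abstracts can be sketched with a simple calculation: discount a data centre's lease cash flows at the yield of a similar-risk corporate bond, following 'the law of one price'. The cash flows and yield below are invented for illustration.

```python
# Minimal sketch: present value of lease income discounted at a
# corporate bond yield used as a proxy rate. All numbers are invented.

def present_value(cash_flows, discount_rate):
    """PV of a series of annual cash flows, first payment one year out."""
    return sum(cf / (1 + discount_rate) ** t
               for t, cf in enumerate(cash_flows, start=1))

lease_income = [100_000] * 5   # hypothetical fixed annual rent, 5-year lease
bond_yield = 0.05              # proxy discount rate from a similar-risk bond

pv = present_value(lease_income, bond_yield)
print(round(pv, 2))
```

The approximation is only as good as the match between the bond's cash-flow profile and the lease's, which is why the papers suggest different bond proxies (index-linked, fixed-interest, zero-coupon) for different liability structures.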

Relevance: 30.00%

Abstract:

For thousands of years, humans have inhabited locations that are highly vulnerable to the impacts of climate change, earthquakes, and floods. In order to investigate the extent to which Holocene environmental changes may have impacted on cultural evolution, we present new geologic, geomorphic, and chronologic data from the Qazvin Plain in northwest Iran that provide a backdrop of natural environmental changes for the simultaneous cultural dynamics observed on the Central Iranian Plateau. Well-resolved archaeological data from the neighbouring settlements of Zagheh (7170—6300 yr BP), Ghabristan (6215—4950 yr BP) and Sagzabad (4050—2350 yr BP) indicate that Holocene occupation of the Hajiarab alluvial fan was interrupted by a 900 year settlement hiatus. Multiproxy climate data from nearby lakes in northwest Iran suggest a transition from arid early-Holocene conditions to more humid middle-Holocene conditions from c. 7550 to 6750 yr BP, coinciding with the settlement of Zagheh, and a peak in aridity at c. 4550 yr BP during the settlement hiatus. Palaeoseismic investigations indicate that large active fault systems in close proximity to the tell sites incurred a series of large (MW ~7.1) earthquakes with return periods of ~500—1000 years during human occupation of the tells. Mapping and optically stimulated luminescence (OSL) chronology of the alluvial sequences reveals changes in depositional style from coarse-grained unconfined sheet flow deposits to proximal channel flow and distally prograding alluvial deposits sometime after c. 8830 yr BP, possibly reflecting an increase in moisture following the early-Holocene arid phase. The coincidence of major climate changes, earthquake activity, and varying sedimentation styles with changing patterns of human occupation on the Hajiarab fan indicates links between environmental and anthropogenic systems. However, temporal coincidence does not necessitate a fundamental causative dependency.

Relevance: 30.00%

Abstract:

We describe a model-data fusion (MDF) inter-comparison project (REFLEX), which compared various algorithms for estimating carbon (C) model parameters consistent with both measured carbon fluxes and states and a simple C model. Participants were provided with the model and with both synthetic net ecosystem exchange (NEE) of CO2 and leaf area index (LAI) data, generated from the model with added noise, and observed NEE and LAI data from two eddy covariance sites. Participants endeavoured to estimate model parameters and states consistent with the model for all cases over the two years for which data were provided, and generate predictions for one additional year without observations. Nine participants contributed results using Metropolis algorithms, Kalman filters and a genetic algorithm. For the synthetic data case, parameter estimates compared well with the true values. The results of the analyses indicated that parameters linked directly to gross primary production (GPP) and ecosystem respiration, such as those related to foliage allocation and turnover, or temperature sensitivity of heterotrophic respiration, were best constrained and characterised. Poorly estimated parameters were those related to the allocation to and turnover of fine root/wood pools. Estimates of confidence intervals varied among algorithms, but several algorithms successfully located the true values of annual fluxes from synthetic experiments within relatively narrow 90% confidence intervals, achieving >80% success rate and mean NEE confidence intervals <110 gC m−2 year−1 for the synthetic case. Annual C flux estimates generated by participants generally agreed with gap-filling approaches using half-hourly data. The estimation of ecosystem respiration and GPP through MDF agreed well with outputs from partitioning studies using half-hourly data. Confidence limits on annual NEE increased by an average of 88% in the prediction year compared to the previous year, when data were available. Confidence intervals on annual NEE increased by 30% when observed data were used instead of synthetic data, reflecting and quantifying the addition of model error. Finally, our analyses indicated that incorporating additional constraints, using data on C pools (wood, soil and fine roots) would help to reduce uncertainties for model parameters poorly served by eddy covariance data.
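The synthetic-data exercise described above (recover known parameters from model output with added noise, here via a Metropolis algorithm) can be sketched on a toy problem. Everything below is invented for illustration: the real experiments used a full carbon model, not the one-parameter linear model shown here.

```python
import math
import random

# Toy model-data fusion sketch: recover a single parameter of the model
# y = p * x from synthetic observations with Gaussian noise, using a
# Metropolis random walk over an un-normalised Gaussian misfit.
random.seed(0)

TRUE_PARAM = 2.5
xs = [i / 10 for i in range(50)]
obs = [TRUE_PARAM * x + random.gauss(0, 0.2) for x in xs]  # synthetic data

def misfit(p):
    """Sum-of-squares misfit between model p*x and the observations
    (observation variance folded into the scale, for simplicity)."""
    return sum((p * x - y) ** 2 for x, y in zip(xs, obs))

# Metropolis random walk: accept with probability exp(-delta_misfit);
# comparing in log space avoids overflow for large misfit differences.
current, chain = 1.0, []
for _ in range(5000):
    proposal = current + random.gauss(0, 0.1)
    if math.log(random.random()) < misfit(current) - misfit(proposal):
        current = proposal
    chain.append(current)

estimate = sum(chain[1000:]) / len(chain[1000:])  # discard burn-in
print(round(estimate, 2))
```

The post-burn-in chain mean should land close to the true parameter, which is the synthetic-data check REFLEX used before turning the algorithms loose on real eddy covariance observations.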

Relevance: 30.00%

Abstract:

Data assimilation (DA) systems are evolving to meet the demands of convection-permitting models in the field of weather forecasting. On 19 April 2013 a special interest group meeting of the Royal Meteorological Society brought together UK researchers looking at different aspects of the data assimilation problem at high resolution, from theory to applications, and researchers creating our future high resolution observational networks. The meeting was chaired by Dr Sarah Dance of the University of Reading and Dr Cristina Charlton-Perez from the MetOffice@Reading. The purpose of the meeting was to help define the current state of high resolution data assimilation in the UK. The workshop assembled three main types of scientists: observational network specialists, operational numerical weather prediction researchers and those developing the fundamental mathematical theory behind data assimilation and the underlying models. These three working areas are intrinsically linked; therefore, a holistic view must be taken when discussing the potential to make advances in high resolution data assimilation.