940 results for Data Standards


Relevance: 30.00%

Abstract:

As part of a large European coastal operational oceanography project (ECOOP), we have developed a web portal for the display and comparison of model and in situ marine data. The distributed model and in situ datasets are accessed via an Open Geospatial Consortium Web Map Service (WMS) and Web Feature Service (WFS) respectively. These services were developed independently and readily integrated for the purposes of the ECOOP project, illustrating the ease of interoperability resulting from adherence to international standards. The key feature of the portal is the ability to display co-plotted timeseries of the in situ and model data and the quantification of misfits between the two. By using standards-based web technology we allow the user to quickly and easily explore over twenty model data feeds and compare these with dozens of in situ data feeds without being concerned with the low level details of differing file formats or the physical location of the data. Scientific and operational benefits to this work include model validation, quality control of observations, data assimilation and decision support in near real time. In these areas it is essential to be able to bring different data streams together from often disparate locations.
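The portal's standards-based access to model data can be illustrated with a minimal sketch that builds an OGC WMS 1.3.0 GetMap request URL. The endpoint and layer name below are hypothetical placeholders, not the actual ECOOP services.

```python
from urllib.parse import urlencode

def build_getmap_url(endpoint, layer, bbox, time, width=800, height=600):
    """Build an OGC WMS 1.3.0 GetMap request URL for one model layer.

    `endpoint` and `layer` are illustrative placeholders; a real ECOOP
    data feed would supply its own values.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "CRS:84",                        # lon/lat axis order
        "BBOX": ",".join(str(v) for v in bbox),
        "TIME": time,                           # ISO 8601 time of the model field
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
        "TRANSPARENT": "TRUE",
    }
    return endpoint + "?" + urlencode(params)

url = build_getmap_url(
    "https://example.org/wms",          # hypothetical service endpoint
    "sea_water_temperature",            # hypothetical layer name
    bbox=(-15.0, 45.0, 5.0, 60.0),      # lon/lat box around NW European shelf
    time="2009-06-01T12:00:00Z",
)
```

Because every conforming server accepts the same request vocabulary, the same client code can query any of the twenty-plus model feeds by changing only the endpoint and layer.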

Relevance: 30.00%

Abstract:

Accumulating data suggest that diets rich in flavanols and procyanidins are beneficial for human health. In this context, there has been a great interest in elucidating the systemic levels and metabolic profiles at which these compounds occur in humans. While recent progress has been made, there still exist considerable differences and various disagreements with regard to the mammalian metabolites of these compounds, which in turn is largely a consequence of the lack of availability of authentic standards that would allow for the directed development and validation of expedient analytical methodologies. In the present study, we developed a method for the analysis of structurally-related flavanol metabolites using a wide range of authentic standards. Applying this method in the context of a human dietary intervention study using comprehensively characterized and standardized flavanol- and procyanidin-containing cocoa, we were able to identify the structurally-related (−)-epicatechin metabolites (SREM) postprandially extant in the systemic circulation of humans. Our results demonstrate that (−)-epicatechin-3′-β-D-glucuronide, (−)-epicatechin-3′-sulfate, and a 3′-O-methyl(−)-epicatechin-5/7-sulfate are the predominant SREM in humans, and further confirm the relevance of the stereochemical configuration in the context of flavanol metabolism. In addition, we also identified plausible causes for the previously reported discrepancies regarding flavanol metabolism, consisting to a significant extent of inter-laboratory differences in sample preparation (enzymatic treatment and sample conditioning for HPLC analysis) and detection systems. Thus, these findings may also aid in the establishment of consensus on this topic.

Relevance: 30.00%

Abstract:

Excessive salt intake is linked to cardiovascular disease and several other health problems around the world. The UK Food Standards Agency initiated a campaign at the end of 2004 to reduce salt intake in the population. There is disagreement over whether the campaign was effective in curbing salt intake or not. We provide fresh evidence on the impact of the campaign, by using data on spot urinary sodium readings and socio-demographic variables from the Health Survey for England over 2003–2007 and combining it with food price information from the Expenditure and Food Survey. Aggregating the data into a pseudo-panel, we estimate fixed effects models to examine the trend in salt intake over the period and to deduce the heterogeneous effects of the policy on the intake of socio-demographic groups. Our results are consistent with a previous hypothesis that the campaign reduced salt intakes by approximately 10%. The impact is shown to be stronger among women than among men. Older cohorts of men show a larger response to the salt campaign compared to younger cohorts, while among women, younger cohorts respond more strongly than older cohorts.
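The fixed-effects estimation on a pseudo-panel can be sketched with the standard within transformation: demean intake and the policy indicator within each cohort, then regress the demeaned variables. The cohorts and numbers below are invented for illustration, not Health Survey for England data.

```python
from statistics import mean

# Toy pseudo-panel: (cohort, year, salt_intake_g_per_day, post_campaign).
# Values are illustrative only.
data = [
    ("men_old",   2003, 10.1, 0), ("men_old",   2006, 9.0, 1),
    ("men_young", 2003,  9.5, 0), ("men_young", 2006, 9.1, 1),
    ("women",     2003,  8.0, 0), ("women",     2006, 7.1, 1),
]

def fe_slope(rows):
    """Within (fixed-effects) estimator: demean y and x by cohort,
    then regress demeaned y on demeaned x."""
    by_cohort = {}
    for cohort, _, y, x in rows:
        by_cohort.setdefault(cohort, []).append((y, x))
    yd, xd = [], []
    for obs in by_cohort.values():
        ybar = mean(y for y, _ in obs)
        xbar = mean(x for _, x in obs)
        for y, x in obs:
            yd.append(y - ybar)
            xd.append(x - xbar)
    return sum(a * b for a, b in zip(yd, xd)) / sum(b * b for b in xd)

effect = fe_slope(data)  # estimated campaign effect on intake (g/day)
```

Demeaning by cohort removes time-invariant cohort characteristics, so the slope is identified from within-cohort changes across the campaign, which is the logic the study applies to the pseudo-panel.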

Relevance: 30.00%

Abstract:

This paper reports the findings of a small-scale research project which investigated the levels of awareness and knowledge of written standard English of 10 and 11 year old children in two English primary schools. The project involved repeating in 2010 a written questionnaire previously used with children in the same schools in three separate surveys in 1999, 2002 and 2005. Data from the latest survey are compared to those from the previous three. The analysis seeks to identify any changes over time in children’s ability to recognise non-standard forms and supply standard English alternatives, as well as their ability to use technical terms related to language variation. Differences between the performance of boys and girls and that of the two schools are also analysed. The paper concludes that the socio-economic context of the schools may be a more important factor than gender in variations over time identified in the data.

Relevance: 30.00%

Abstract:

Written for communications and electronic engineers, technicians and students, this book begins with an introduction to data communications and goes on to explain the concept of layered communications. Other chapters deal with physical communications channels, baseband digital transmission, analog data transmission, error control and data compression codes, physical layer standards, the data link layer, the higher layers of the protocol hierarchy, and local area networks (LANs). Finally, the book explores some likely future developments.

Relevance: 30.00%

Abstract:

Data quality is a difficult notion to define precisely, and different communities have different views and understandings of the subject. This causes confusion, a lack of harmonization of data across communities and omission of vital quality information. For some existing data infrastructures, data quality standards cannot address the problem adequately and cannot fulfil all user needs or cover all concepts of data quality. In this study, we discuss some philosophical issues on data quality. We identify actual user needs on data quality, review existing standards and specifications on data quality, and propose an integrated model for data quality in the field of Earth observation (EO). We also propose a practical mechanism for applying the integrated quality information model to a large number of datasets through metadata inheritance. While our data quality management approach is in the domain of EO, we believe that the ideas and methodologies for data quality management can be applied to wider domains and disciplines to facilitate quality-enabled scientific research.
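The metadata-inheritance mechanism can be sketched as collection-level quality records that individual datasets override only where they differ. The field names below are assumptions for illustration; the integrated EO quality model defines its own vocabulary.

```python
from collections import ChainMap

# Hypothetical quality fields for a dataset collection; real field
# names come from the paper's integrated quality model, not from here.
collection_quality = {
    "lineage": "Derived from Level-1B radiances",
    "validation_status": "validated",
    "uncertainty_coverage_factor": 2,
}

# A dataset record stores only the fields that differ from its parent
# collection; lookups fall back to the collection (the "inheritance").
dataset_override = {"validation_status": "provisional"}
dataset_quality = ChainMap(dataset_override, collection_quality)

status = dataset_quality["validation_status"]   # overridden locally
lineage = dataset_quality["lineage"]            # inherited from the collection
```

Storing shared quality information once at the collection level and inheriting it is what makes the approach tractable for a large number of datasets.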

Relevance: 30.00%

Abstract:

Data from civil engineering projects can inform the operation of built infrastructure. This paper captures lessons for such data handover, from projects into operations, through interviews with leading clients and their supply chain. Clients are found to value receiving accurate and complete data. They recognise opportunities to use high quality information in decision-making about capital and operational expenditure, as well as in ensuring compliance with regulatory requirements. Providing this value to clients is a motivation for information management in projects. However, data handover is difficult as key people leave before project completion, and different data formats and structures are used in project delivery and operations. Lessons learnt from leading practice include defining data requirements at the outset, getting operations teams involved early, shaping the evolution of interoperable systems and standards, developing handover processes to check data rather than documentation, and fostering skills to use and update project data in operations.

Relevance: 30.00%

Abstract:

This article analyses the results of an empirical study of the 200 most popular UK-based websites in various sectors of e-commerce services. The study provides empirical evidence of unlawful processing of personal data. It comprises a survey of the methods used to seek and obtain consent to process personal data for direct marketing and advertising, and a test of the frequency of unsolicited commercial email (UCE) received by customers as a consequence of registering and submitting personal information to a website. Part One of the article presents a conceptual and normative account of data protection, discussing the ethical values on which EU data protection law is grounded and outlining the elements that must be in place to seek and obtain valid consent to process personal data. Part Two discusses the outcomes of the empirical study, which reveals a significant gap between EU legal theory and practice in data protection. Although a wide majority of the websites in the sample (69%) have a system in place to ask for separate consent to marketing activities, only 16.2% of them obtain consent that is valid under the standards set by EU law. The UCE test shows that only one in three websites (30.5%) respects the will of the data subject not to receive commercial communications. It also shows that, when submitting personal data in online transactions, there is a high probability (50%) of encountering a website that will ignore the refusal of consent and send UCE. The article concludes that there is a severe lack of compliance by UK online service providers with essential requirements of data protection law, and suggests that the standard of implementation, information and supervision by the UK authorities is inadequate, especially in light of the clarifications provided at EU level.

Relevance: 30.00%

Abstract:

The CHARMe project enables the annotation of climate data with key pieces of supporting information that we term “commentary”. Commentary reflects the experience that has built up in the user community, and can help new or less-expert users (such as consultants, SMEs, experts in other fields) to understand and interpret complex data. In the context of global climate services, the CHARMe system will record, retain and disseminate this commentary on climate datasets, and provide a means for feeding back this experience to the data providers. Based on novel linked data techniques and standards, the project has developed a core system, data model and suite of open-source tools to enable this information to be shared, discovered and exploited by the community.
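A commentary record of the kind described can be sketched as a linked-data annotation connecting a supporting resource to a dataset. The sketch below uses the W3C Web Annotation JSON-LD style; the URIs are hypothetical, and the exact vocabulary CHARMe uses may differ.

```python
import json

# Sketch of a commentary annotation in the W3C Web Annotation style.
# Both URIs below are invented placeholders for illustration.
annotation = {
    "@context": "http://www.w3.org/ns/anno.jsonld",
    "type": "Annotation",
    "motivation": "linking",
    "body": "https://example.org/articles/validation-study",  # supporting article
    "target": "https://example.org/data/sst_dataset_v2",      # annotated dataset
}

doc = json.dumps(annotation, indent=2)  # serialised JSON-LD, machine-readable
```

Because the annotation is a standalone JSON-LD document that merely references the dataset by URI, commentary can be created by users and shared without modifying the data themselves.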

Relevance: 30.00%

Abstract:

This paper details a strategy for modifying the source code of a complex model so that the model may be used in a data assimilation context, and gives the standards for implementing a data assimilation code to use such a model. The strategy relies on keeping the model separate from any data assimilation code, and coupling the two through the use of Message Passing Interface (MPI) functionality. This strategy limits the changes necessary to the model and as such is rapid to program, at the expense of ultimate performance. The implementation technique is applied in different models with state dimension up to 2.7 × 10^8. The overheads added by using this implementation strategy in a coupled ocean-atmosphere climate model are shown to be an order of magnitude smaller than the addition of correlated stochastic random errors necessary for some nonlinear data assimilation techniques.
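The separation of model and assimilation codes can be illustrated with a toy coupling. Here Python queues between threads stand in for MPI message passing, and both the "model" step and the "analysis" rule are trivial placeholders, not the paper's scheme: the point is only that the model pauses to exchange its state with an entirely separate assimilation routine.

```python
import threading
import queue

state_to_da = queue.Queue()    # model -> assimilation (cf. MPI send/recv)
state_from_da = queue.Queue()  # assimilation -> model
analyses = []                  # analysed states, recorded for inspection

def model(steps):
    """Toy model: advances a scalar state, pausing at each step to hand
    the state to the separate assimilation code and receive it back."""
    x = 0.0
    for _ in range(steps):
        x += 1.0                 # placeholder "forecast" step
        state_to_da.put(x)       # send state to the DA code
        x = state_from_da.get()  # receive the analysed state back
    state_to_da.put(None)        # signal completion

def assimilation(obs):
    """Toy analysis: nudges the received state halfway towards a fixed
    observation, then returns it to the model."""
    while True:
        x = state_to_da.get()
        if x is None:
            return
        xa = x + 0.5 * (obs - x)
        analyses.append(xa)
        state_from_da.put(xa)

m = threading.Thread(target=model, args=(3,))
a = threading.Thread(target=assimilation, args=(10.0,))
m.start(); a.start(); m.join(); a.join()
```

The model code needs only the paired send/receive calls at each exchange point, which mirrors why the paper's strategy requires so few changes to the model source.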

Relevance: 30.00%

Abstract:

The EU Water Framework Directive (WFD) requires that the ecological and chemical status of water bodies in Europe should be assessed, and action taken where possible to ensure that at least "good" quality is attained in each case by 2015. This paper is concerned with the accuracy and precision with which chemical status in rivers can be measured given certain sampling strategies, and how this can be improved. High-frequency (hourly) chemical data from four rivers in southern England were subsampled to simulate different sampling strategies for four parameters used for WFD classification: dissolved phosphorus, dissolved oxygen, pH and water temperature. These data sub-sets were then used to calculate the WFD classification for each site. Monthly sampling was less precise than weekly sampling, but the effect on WFD classification depended on the closeness of the range of concentrations to the class boundaries. In some cases, monthly sampling for a year could result in the same water body being assigned to three or four of the WFD classes with 95% confidence, due to random sampling effects, whereas with weekly sampling this was one or two classes for the same cases. In the most extreme case, the same water body could have been assigned to any of the five WFD quality classes. Weekly sampling considerably reduces the uncertainties compared to monthly sampling. The width of the weekly sampled confidence intervals was about 33% that of the monthly for P species and pH, about 50% for dissolved oxygen, and about 67% for water temperature. For water temperature, which is assessed as the 98th percentile in the UK, monthly sampling biases the mean downwards by about 1 °C compared to the true value, due to problems of assessing high percentiles with limited data. Low-frequency measurements will generally be unsuitable for assessing standards expressed as high percentiles. 
Confining sampling to the working week compared to all 7 days made little difference, but a modest improvement in precision could be obtained by sampling at the same time of day within a 3 h time window, and this is recommended. For parameters with a strong diel variation, such as dissolved oxygen, the value obtained, and thus possibly the WFD classification, can depend markedly on when in the cycle the sample was taken. Specifying this in the sampling regime would be a straightforward way to improve precision, but there needs to be agreement about how best to characterise risk in different types of river. These results suggest that in some cases it will be difficult to assign accurate WFD chemical classes or to detect likely trends using current sampling regimes, even for these largely groundwater-fed rivers. A more critical approach to sampling is needed to ensure that management actions are appropriate and supported by data.
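The subsampling experiment can be sketched with a synthetic hourly series carrying seasonal and diel cycles (illustrative numbers only, not the study's river data): take every 168th value for weekly and every 730th for monthly sampling, then compare 98th-percentile estimates against the full hourly record.

```python
import math
from statistics import quantiles

# Synthetic hourly "water temperature" (degC) for one year, with an annual
# and a diel cycle; amplitudes are illustrative, not fitted to any river.
hourly = [
    12.0
    + 6.0 * math.sin(2 * math.pi * h / 8760.0)  # seasonal cycle
    + 1.5 * math.sin(2 * math.pi * h / 24.0)    # diel cycle
    for h in range(8760)
]

def p98(values):
    """98th percentile, the statistic used for UK temperature standards."""
    return quantiles(values, n=50, method="inclusive")[-1]

weekly = hourly[::168]    # one sample per week
monthly = hourly[::730]   # one sample per month

true_p98 = p98(hourly)
weekly_p98 = p98(weekly)
monthly_p98 = p98(monthly)
```

With only 12 monthly values the percentile estimate depends heavily on where each sample happens to fall in the seasonal and diel cycles, which is the mechanism behind the bias the paper reports for high percentiles under low-frequency sampling.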

Relevance: 30.00%

Abstract:

For users of climate services, the ability to quickly determine the datasets that best fit one's needs would be invaluable. The volume, variety and complexity of climate data makes this judgment difficult. The ambition of CHARMe ("Characterization of metadata to enable high-quality climate services") is to give a wider interdisciplinary community access to a range of supporting information, such as journal articles, technical reports or feedback on previous applications of the data. The capture and discovery of this "commentary" information, often created by data users rather than data providers, and currently not linked to the data themselves, has not been significantly addressed previously. CHARMe applies the principles of Linked Data and open web standards to associate, record, search and publish user-derived annotations in a way that can be read both by users and automated systems. Tools have been developed within the CHARMe project that enable annotation capability for data delivery systems already in wide use for discovering climate data. In addition, the project has developed advanced tools for exploring data and commentary in innovative ways, including an interactive data explorer and comparator ("CHARMe Maps") and a tool for correlating climate time series with external "significant events" (e.g. instrument failures or large volcanic eruptions) that affect the data quality. Although the project focuses on climate science, the concepts are general and could be applied to other fields. All CHARMe system software is open-source, released under a liberal licence, permitting future projects to re-use the source code as they wish.

Relevance: 30.00%

Abstract:

From a construction innovation systems perspective, firms acquire knowledge from suppliers, clients, universities and the institutional environment. Building information modelling (BIM) involves these firms using new process standards. To understand the implications for interactive learning of using BIM process standards, a case study was conducted with the UK operations of a multinational construction firm. Data are drawn from: a) two workshops involving the firm and a wider industry group; b) observations of practice in the BIM core team and in three ongoing projects; c) 12 semi-structured interviews; and d) secondary publications. The firm uses a set of BIM process standards (IFC, PAS 1192, Uniclass, COBie) in its construction activities. It is also involved in a pilot to implement the COBie standard, supported by technical and management standards for BIM such as Uniclass and PAS 1192. The analyses suggest that such BIM process standards unconsciously shape the firm's internal and external interactive learning processes. Internally, the standards allow engineers to learn from each other through visualising 3D information and talking around designs with operatives to address problems during construction. Externally, the firm participates in trial and pilot projects involving other construction firms, government agencies, universities and suppliers to learn about the standards and access knowledge to solve its specific design problems. Through its BIM manager, the firm provides feedback to standards developers and information technology suppliers. The research contributes by articulating how BIM process standards unconsciously change interactive learning processes in construction practice. Further research could investigate these findings in the wider UK construction innovation system.

Relevance: 30.00%

Abstract:

This paper examines the impact of socioeconomic factors on eighth grade achievement test scores in the face of federal and state initiatives for educational reform in Maine. We use student-level data over a five year period to provide a framework for understanding the policy implications of these initiatives. We model performance on standardized tests using a seemingly unrelated regressions approach and then determine the likelihood of meeting the standards defined by the adequate yearly progress requirements of the No Child Left Behind Act and Maine Learning Results initiatives. Our results indicate that the key factors influencing a student’s test scores include the education of a student’s parents, special services received for learning disabilities, and alternative measures of academic achievement.
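A useful special case of the seemingly unrelated regressions (SUR) approach is Zellner's classic result that, when every equation shares the same regressors, SUR coincides with equation-by-equation OLS. The sketch below fits two score equations separately on that basis; the student data are invented for illustration.

```python
from statistics import mean

def ols_fit(x, y):
    """Simple OLS fit of y = a + b*x via the normal equations;
    returns (intercept, slope)."""
    xbar, ybar = mean(x), mean(y)
    b = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
    return ybar - b * xbar, b

# Hypothetical data: parental education (years) against two test scores.
# With identical regressors in both equations, SUR reduces to OLS
# applied equation by equation (Zellner).
parent_ed = [10, 12, 14, 16]
math_scores = [520, 540, 560, 580]
read_scores = [500, 530, 560, 590]

math_intercept, math_slope = ols_fit(parent_ed, math_scores)
read_intercept, read_slope = ols_fit(parent_ed, read_scores)
```

The gain from full SUR estimation appears when the equations have different regressors, in which case exploiting the correlation between the equations' errors improves efficiency.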

Relevance: 30.00%

Abstract:

The objective of this study is to analyse the relationship between market analysts' forecast errors for the profitability of companies listed on BM&FBOVESPA S.A. (Bovespa) and the disclosure requirements of the International Financial Reporting Standards (IFRS). This was done by regressing analysts' forecast error using a panel-data methodology for 2010, the year of IFRS adoption in Brazil, and, complementarily, for 2012 as a benchmark. On this basis, the forecast error of the companies listed on the Bovespa was determined from forecast and realised profitability data (earnings per share), available from the I/B/E/S Earnings Consensus Information databases, provided by the Thomson ONE Investment Banking platform and Economática Pro®, respectively. The results indicate a negative relationship between forecast error and compliance with IFRS disclosure requirements; that is, the higher the quality of the disclosed information, the smaller the analysts' forecast error. These results therefore support the view that the degree of compliance with accounting standards is as important as, or more important than, the standards themselves. Additionally, it was found that when a company listed on BM&FBOVESPA is subject to a regulatory agency, its forecast error is unchanged. Finally, these results suggest the importance of strengthening audit mechanisms for firms' compliance with disclosure requirements, such as penalties for non-compliance (enforcement), corporate governance structures, and internal and external audits.
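One common way to operationalise analyst forecast error is the absolute gap between forecast and realised earnings per share, scaled by the realised value. This definition is an assumption for illustration; the study's own panel specification may normalise differently (for example, by share price), and the firm-year observations below are invented.

```python
def forecast_error(eps_forecast, eps_actual):
    """Absolute forecast error scaled by realised EPS.

    Illustrative definition only; the study's specification may differ
    (e.g. scaling by share price rather than realised EPS).
    """
    return abs(eps_actual - eps_forecast) / abs(eps_actual)

# Hypothetical firm-year observations: (consensus forecast EPS, realised EPS).
panel = {
    ("FirmA", 2010): (1.80, 2.00),
    ("FirmB", 2010): (0.95, 1.00),
}

errors = {key: forecast_error(f, a) for key, (f, a) in panel.items()}
```

A panel of such errors, paired with a firm-level disclosure-compliance score, is the kind of dependent variable the regression described above would use.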