966 results for Data quality-aware mechanisms


Relevance:

100.00%

Publisher:

Abstract:

There is a concerted global effort to digitize biodiversity occurrence data from herbarium and museum collections, which together offer an unparalleled archive of life on Earth over the past few centuries. The Global Biodiversity Information Facility provides the largest single gateway to these data. Since 2004 it has provided a single point of access to specimen data from databases of biological surveys and collections. Biologists now have rapid access to more than 120 million observations for use in many biological analyses. We investigate the quality and coverage of the data digitally available, from the perspective of a biologist seeking distribution data for spatial analysis on a global scale. We present an example of automatic verification of geographic data, using distributions from the International Legume Database and Information Service to test empirically issues of geographic coverage and accuracy. There are over half a million records covering 31% of all legume species, and 84% of these records pass geographic validation. These data are not yet a global biodiversity resource for all species or all countries. A user will encounter many biases and gaps in these data, which should be understood before the data are used or analyzed. The data are notably deficient in many of the world's biodiversity hotspots. The deficiencies in data coverage can be resolved by an increased application of resources to digitize and publish data throughout these most diverse regions. But in the push to provide ever more data online, we should not forget that consistent data quality is of paramount importance if the data are to be useful in capturing a meaningful picture of life on Earth.
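As a rough illustration of the kind of automatic geographic validation described above, the Python sketch below accepts an occurrence record only if its coordinates fall inside a country where a checklist documents the species. The country bounding boxes, species checklist and function names are invented for illustration; a real implementation such as the ILDIS cross-check would use full country polygons and a point-in-polygon test rather than bounding boxes.

```python
# Minimal sketch of checklist-based geographic validation. Country outlines
# are reduced to hypothetical bounding boxes (lat_min, lat_max, lon_min,
# lon_max); all values below are illustrative placeholders.

COUNTRY_BBOXES = {
    "BR": (-34.0, 5.3, -74.0, -34.8),    # Brazil (rough box)
    "AU": (-43.7, -10.0, 112.9, 153.6),  # Australia (rough box)
}

SPECIES_COUNTRIES = {  # hypothetical checklist: species -> documented countries
    "Acacia dealbata": {"AU"},
    "Mimosa pudica": {"BR"},
}

def passes_geographic_validation(species, lat, lon):
    """Accept a record only if its coordinates fall inside a country
    where the checklist documents the species as occurring."""
    for country in SPECIES_COUNTRIES.get(species, set()):
        lat_min, lat_max, lon_min, lon_max = COUNTRY_BBOXES[country]
        if lat_min <= lat <= lat_max and lon_min <= lon <= lon_max:
            return True
    return False

records = [("Mimosa pudica", -15.8, -47.9), ("Mimosa pudica", 48.9, 2.3)]
valid = [r for r in records if passes_geographic_validation(*r)]
print(f"{len(valid)}/{len(records)} records pass geographic validation")
```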

Relevance:

100.00%

Publisher:

Abstract:

OPAL is an English national programme that takes scientists into the community to investigate environmental issues. Biological monitoring plays a pivotal role, covering five topics: i) soil and earthworms; ii) air, lichens and tar spot on sycamore; iii) water and aquatic invertebrates; iv) biodiversity and hedgerows; v) climate, clouds and thermal comfort. Each survey has been developed by an interdisciplinary team and tested by the voluntary, statutory and community sectors. Data are submitted via the web and instantly mapped. Preliminary results are presented, together with a discussion of data quality and uncertainty. Communities also investigate local pollution issues, ranging from nitrogen deposition on heathlands to traffic emissions on roadside vegetation. Over 200,000 people have participated so far, including over 1,000 schools and 1,000 voluntary groups. Benefits include a substantial and growing database on biodiversity and habitat condition, much of it from previously unsampled sites, particularly in urban areas, and a more engaged public.

Relevance:

100.00%

Publisher:

Abstract:

We review the scientific literature since the 1960s to examine the evolution of modeling tools and observations that have advanced understanding of global stratospheric temperature changes. Observations show overall cooling of the stratosphere during the period for which they are available (since the late 1950s from radiosondes and the late 1970s from satellites), interrupted by episodes of warming associated with volcanic eruptions and superimposed on variations associated with the solar cycle. There has been little global mean temperature change since about 1995. The temporal and vertical structure of these variations is reasonably well explained by models that include changes in greenhouse gases, ozone, volcanic aerosols, and solar output, although there are significant uncertainties in the temperature observations and regarding the nature and influence of past changes in stratospheric water vapor. As a companion to a recent WIREs review of tropospheric temperature trends, this article identifies areas of commonality and contrast between the tropospheric and stratospheric trend literature. For example, the increased attention over time to radiosonde and satellite data quality has contributed to better characterization of uncertainty in observed trends, both in the troposphere and in the lower stratosphere, and has highlighted the relative lack of attention to observations in the middle and upper stratosphere. In contrast to the relatively unchanging expectations of surface and tropospheric warming primarily induced by greenhouse gas increases, stratospheric temperature change expectations have arisen from experiments with a wider variety of model types, showing more complex trend patterns associated with a greater diversity of forcing agents.

Relevance:

100.00%

Publisher:

Abstract:

The long observational record is critical to our understanding of the Earth's climate, but most observing systems were not developed with a climate objective in mind. As a result, tremendous effort has gone into assessing and reprocessing the data records to improve their usefulness in climate studies. The purpose of this paper is both to review recent progress in reprocessing and reanalyzing observations and to summarize the challenges that must be overcome in order to improve our understanding of climate and variability. Reprocessing improves data quality through closer scrutiny and improved retrieval techniques for individual observing systems, while reanalysis merges many disparate observations with models through data assimilation; both aim to provide a climatology of Earth processes. Many challenges remain, such as tracking the improvement of processing algorithms and limited spatial coverage. Reanalyses have fostered significant research, yet reliable global trends in many physical fields are not yet attainable, despite significant advances in data assimilation and numerical modeling. Oceanic reanalyses have made significant advances in recent years, but are discussed here only in terms of progress toward integrated Earth system analyses. Climate data sets are generally adequate for process studies and large-scale climate variability. Communicating the strengths, limitations and uncertainties of reprocessed observations and reanalysis data, not only among the community of developers but also with the extended research community, including new generations of researchers and decision makers, is crucial for the further advancement of the observational data records. It must be emphasized that careful investigation of the data and processing methods is required to use the observations appropriately.
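To make the reprocessing/reanalysis distinction concrete, the toy Python sketch below shows the scalar analysis update at the heart of data assimilation: a model background is blended with an observation according to their error variances. The numbers and function name are invented; operational reanalysis systems solve vastly larger versions of this problem.

```python
# Toy scalar "optimal interpolation" update, the building block of data
# assimilation in reanalysis. All values below are invented.

def analysis_update(background, obs, var_background, var_obs):
    """Return the analysis value and its error variance."""
    gain = var_background / (var_background + var_obs)  # Kalman-type weight
    analysis = background + gain * (obs - background)
    var_analysis = (1.0 - gain) * var_background
    return analysis, var_analysis

# Model forecast of, say, a temperature (K) versus a radiosonde observation.
analysis, var_a = analysis_update(background=251.0, obs=249.2,
                                  var_background=1.5, var_obs=0.5)
print(f"analysis = {analysis:.2f} K, error variance = {var_a:.2f}")
```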

Relevance:

100.00%

Publisher:

Abstract:

This special issue is focused on the assessment of algorithms for the observation of Earth's climate from environmental satellites. Climate data records derived by remote sensing are increasingly a key source of insight into the workings of and changes in Earth's climate system. Producers of data sets must devote considerable effort and expertise to maximise the true climate signals in their products and minimise effects of data processing choices and changing sensors. A key choice is the selection of algorithm(s) for classification and/or retrieval of the climate variable. Within the European Space Agency Climate Change Initiative, science teams undertook systematic assessment of algorithms for a range of essential climate variables. The papers in the special issue report some of these exercises (for ocean colour, aerosol, ozone, greenhouse gases, clouds, soil moisture, sea surface temperature and glaciers). The contributions show that assessment exercises must be designed with care, considering issues such as the relative importance of different aspects of data quality (accuracy, precision, stability, sensitivity, coverage, etc.), the availability and degree of independence of validation data and the limitations of validation in characterising some important aspects of data (such as long-term stability or spatial coherence). As well as requiring a significant investment of expertise and effort, systematic comparisons are found to be highly valuable. They reveal the relative strengths and weaknesses of different algorithmic approaches under different observational contexts, and help ensure that scientific conclusions drawn from climate data records are not influenced by observational artifacts, but are robust.
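A minimal sketch of the validation statistics such assessments weigh against each other, assuming a retrieved variable is compared against independent reference data: accuracy as mean bias, precision as scatter of the differences, and stability as drift of the bias over time. The data and function names are illustrative, not the CCI teams' actual tooling.

```python
# Illustrative accuracy / precision / stability metrics for a satellite
# product validated against reference data. All numbers are synthetic.

import numpy as np

def assessment_metrics(retrieved, reference, years):
    d = np.asarray(retrieved) - np.asarray(reference)
    accuracy = d.mean()                      # mean bias
    precision = d.std(ddof=1)                # scatter of the differences
    stability = np.polyfit(years, d, 1)[0]   # bias drift per year
    return accuracy, precision, stability

years = np.arange(2000, 2010)
reference = 15.0 + 0.02 * (years - 2000)             # synthetic truth
retrieved = reference + 0.1 + 0.01 * (years - 2000)  # biased, drifting product
acc, prec, stab = assessment_metrics(retrieved, reference, years)
print(f"bias={acc:.3f}, scatter={prec:.3f}, drift={stab:.4f}/yr")
```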

Relevance:

100.00%

Publisher:

Abstract:

Variations in the spatial configuration of the interstellar magnetic field (ISMF) near the Sun can be constrained by comparing the ISMF direction at the heliosphere found from the Interstellar Boundary Explorer (IBEX) spacecraft observations of a "Ribbon" of energetic neutral atoms (ENAs) with the ISMF direction derived from optical polarization data for stars within ~40 pc. Using interstellar polarization observations toward ~30 nearby stars within ~90° of the heliosphere nose, we find that the best fits to the polarization position angles are obtained for a magnetic pole directed toward ecliptic coordinates λ, β ≈ 263°, 37° (or galactic coordinates l, b ≈ 38°, 23°), with uncertainties of ±35° based on the broad minimum of the best fits and the range of data quality. This magnetic pole is 33° from the magnetic pole defined by the center of the arc of the ENA Ribbon. The IBEX ENA Ribbon is seen in sight lines that are perpendicular to the ISMF as it drapes over the heliosphere. The similarity of the polarization and Ribbon directions for the local ISMF suggests that the local field is coherent over scale sizes of tens of parsecs. The ISMF vector direction is nearly perpendicular to the flow of local interstellar material (ISM) through the local standard of rest, supporting a possible local ISM origin related to an evolved expanding magnetized shell. The local ISMF direction is found to have a curious geometry with respect to the cosmic microwave background dipole moment.
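The pole-fitting idea can be sketched as a grid search for the field direction whose sky-projected orientation best matches the observed polarization position angles, honoring the 180° (pseudo-vector) ambiguity of polarization. The star sample and geometry below are synthetic and simplified; the actual analysis works from measured position angles with a finer treatment of uncertainties.

```python
# Grid-search sketch: find the ISMF pole that minimizes position-angle
# residuals of starlight polarization. Synthetic data only.

import numpy as np

def unit(lon_deg, lat_deg):
    lon, lat = np.radians(lon_deg), np.radians(lat_deg)
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])

POLE_Z = np.array([0.0, 0.0, 1.0])  # reference "north" for position angles

def position_angle(field, star):
    """Position angle of the field projected on the plane of the sky."""
    north = POLE_Z - np.dot(POLE_Z, star) * star
    north /= np.linalg.norm(north)
    east = np.cross(north, star)
    b_sky = field - np.dot(field, star) * star
    return np.arctan2(np.dot(b_sky, east), np.dot(b_sky, north))

def misfit(field, stars, pa_obs):
    """Mean squared residual; sin() absorbs the 180-degree ambiguity."""
    return np.mean([np.sin(position_angle(field, s) - pa) ** 2
                    for s, pa in zip(stars, pa_obs)])

# Synthetic stars with polarization aligned to a "true" pole at (263, 37).
true_field = unit(263.0, 37.0)
rng = np.random.default_rng(1)
stars = [unit(rng.uniform(0, 360), rng.uniform(-30, 60)) for _ in range(30)]
pa_obs = [position_angle(true_field, s) for s in stars]

# Northern hemisphere only: the antipodal pole is equivalent for polarization.
grid = [(l, b) for l in range(0, 360, 5) for b in range(0, 90, 5)]
best = min(grid, key=lambda lb: misfit(unit(*lb), stars, pa_obs))
print("best-fit pole (ecliptic lon, lat):", best)
```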

Relevance:

100.00%

Publisher:

Abstract:

This work investigated the interplay between contractual and relational governance mechanisms in the buyer-supplier relationship and their impacts on the outcomes of complex projects. The governance of interorganizational relationships, and its strategic importance for firm performance and for obtaining competitive advantage, has been the subject of much recent research in strategy and related fields. More specifically, such relationships have growing importance in the management literature, especially in contexts involving emerging economies. The literature converges on two main types of governance in interorganizational relationships: contractual governance, which refers to the contracts and rules formally established between firms, generally to curb opportunistic behavior, and relational governance, which relies mainly on trust and relational norms to coordinate such relationships. Although several studies have investigated the interplay between these forms of governance, there is no consensus in the literature about the nature of that interplay. This study investigated the interplay of contractual and relational governance mechanisms through a case study of the implementation of a megaproject in the Brazilian offshore oil industry involving innovative technology. The results indicate that contractual and relational governance mechanisms play important roles in the buyer-supplier relationship and that the interplay between them affects project outcomes in terms of schedule, cost, and quality. These mechanisms act simultaneously and influence each other to a large extent. The level of influence of each mechanism also varies over time, depending on the context. Finally, we conclude that project outcomes, in the context studied, cannot be fully explained by the interplay between these mechanisms alone. Such outcomes must be contextualized, since several factors of the institutional environment moderate the interplay between the forms of governance.

Relevance:

100.00%

Publisher:

Abstract:

Since the last century, the Six Sigma strategy has been the focus of study for many researchers; among its findings is the importance of data processing for error-free manufacturing. This work therefore focuses on the importance of data quality in an enterprise. To this end, a descriptive-exploratory study of seventeen compounding pharmacies in Rio Grande do Norte was undertaken with the objective of creating a base structural model to classify enterprises according to their databases. Statistical methods such as cluster and discriminant analyses were applied to a questionnaire built for this specific study. Data collection identified four groups, each showing distinct strengths and weaknesses that differentiate it from the others.
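A hedged sketch of the statistical pipeline named above, assuming questionnaire answers are numeric scores: cluster analysis finds the groups, and a discriminant model then classifies a new enterprise into one of them. The feature dimensions and data are invented.

```python
# Cluster analysis followed by discriminant analysis, as in the study's
# pipeline. Data are synthetic stand-ins for questionnaire scores.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# 17 pharmacies x 5 hypothetical questionnaire scores (0-10 scale)
X = rng.uniform(0, 10, size=(17, 5))

# Step 1: cluster analysis, assuming four groups as found in the study.
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

# Step 2: discriminant analysis on the cluster labels, yielding a rule to
# classify a new enterprise from its questionnaire answers.
lda = LinearDiscriminantAnalysis().fit(X, clusters)
new_pharmacy = rng.uniform(0, 10, size=(1, 5))
print("assigned group:", lda.predict(new_pharmacy)[0])
```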

Relevance:

100.00%

Publisher:

Abstract:

Ubiquitous computing systems operate in environments where the available resources change significantly during system operation, thus requiring adaptive and context-aware mechanisms to sense changes in the environment and adapt to new execution contexts. Motivated by this requirement, a framework for developing and executing adaptive context-aware applications is proposed. The PACCA framework employs aspect-oriented techniques to modularize the adaptive behavior and to keep the application logic separate from that behavior. PACCA uses an abstract aspect concept to provide flexibility through the addition of new adaptive concerns that extend the abstract aspect. Furthermore, PACCA has a default aspect model covering the usual adaptive concerns of ubiquitous applications. It exploits the synergy between aspect-orientation and dynamic composition to achieve context-aware adaptation, guided by predefined policies, and allows on-demand loading of software modules, making better use of mobile devices and their limited resources. A development process for conceiving ubiquitous applications is also proposed, presenting a set of activities that guide developers of adaptive context-aware applications. Finally, a quantitative, metrics-based study evaluates the aspect-oriented, dynamic-composition approach to building ubiquitous applications.
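PACCA itself is aspect-oriented middleware; as a loose, minimal approximation of the separation of concerns it advocates, the Python sketch below weaves a context-inspecting adaptation policy around application code with a decorator. All names, the context structure and the policy are invented for illustration.

```python
# Adaptation logic modularized apart from application logic and woven in
# around calls, guided by a policy over the sensed context.

import functools

CONTEXT = {"battery": 0.8, "bandwidth_mbps": 20.0}  # sensed environment

def adaptive(policy):
    """Weave an adaptation policy around a function, aspect-style."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            kwargs.update(policy(CONTEXT))   # adapt parameters to context
            return func(*args, **kwargs)
        return wrapper
    return decorator

def video_policy(ctx):
    # Predefined policy: degrade quality when resources are scarce.
    if ctx["battery"] < 0.2 or ctx["bandwidth_mbps"] < 2.0:
        return {"resolution": "480p"}
    return {"resolution": "1080p"}

@adaptive(video_policy)
def play_video(url, resolution="720p"):
    print(f"playing {url} at {resolution}")

play_video("http://example.org/clip")   # 1080p under good conditions
CONTEXT["bandwidth_mbps"] = 1.0
play_video("http://example.org/clip")   # adapts to 480p
```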

Relevance:

100.00%

Publisher:

Abstract:

The Brazilian Geodetic Network started to be established in the early 1940s, employing classical surveying methods such as triangulation and trilateration. With the introduction of satellite positioning systems such as TRANSIT and GPS, the network was densified. Its data were adjusted using a variety of methods, yielding distortions in the network that need to be understood. In this work, we analyze and interpret case studies in an attempt to understand the distortions in the Brazilian network. For each case, we performed the network adjustment using the GHOST software suite. The results show that the distortion is least sensitive to the removal of invar baselines in the classical network. The network would be more affected by the absence of Laplace stations and Doppler control points, with differences of up to 4.5 m.
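The adjustments above were computed with GHOST; as a generic illustration of what a least-squares network adjustment does, the sketch below reconciles redundant levelling observations into a consistent set of heights, with one station held fixed as the datum. Stations, observations and values are invented.

```python
# Tiny levelling-network adjustment by least squares. Each observation
# states H_to - H_from = dh; redundancy makes the system overdetermined.

import numpy as np

FIXED = {"A": 100.000}       # station A held fixed as the datum
unknowns = {"B": 0, "C": 1}  # indices of the free stations

# Observations: (from, to, measured height difference in metres)
obs = [("A", "B", 1.504), ("B", "C", 0.998), ("A", "C", 2.510)]

design = np.zeros((len(obs), len(unknowns)))
y = np.zeros(len(obs))
for i, (f, t, dh) in enumerate(obs):
    if t in unknowns:
        design[i, unknowns[t]] += 1.0
    else:
        y[i] -= FIXED[t]
    if f in unknowns:
        design[i, unknowns[f]] -= 1.0
    else:
        y[i] += FIXED[f]
    y[i] += dh

# Least squares minimizes the residuals of all observations at once.
x, *_ = np.linalg.lstsq(design, y, rcond=None)
print({s: round(x[i], 4) for s, i in unknowns.items()})
```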

Relevance:

100.00%

Publisher:

Abstract:

The CMS High-Level Trigger (HLT) is responsible for ensuring that data samples with potentially interesting events are recorded with high efficiency and good quality. This paper gives an overview of the HLT and focuses on its commissioning using cosmic rays. The selection of triggers that were deployed is presented, and the online grouping of triggered events into streams and primary datasets is discussed. Tools for online and offline data quality monitoring for the HLT are described, and the operational performance of the muon HLT algorithms is reviewed. The average time taken for the HLT selection and its dependence on detector and operating conditions are presented. The HLT performed reliably and helped provide a large dataset, which has proven invaluable for understanding the performance of the trigger and of the CMS experiment as a whole.
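As a toy illustration of the two ideas just described, trigger paths that accept events and the grouping of accepted events into streams, the Python sketch below runs an invented two-path menu over a simplified event record; the path names, thresholds and stream definitions are not CMS's actual configuration.

```python
# Toy trigger menu: each path is a predicate over a simplified event,
# and streams group accepted events for offline processing.

TRIGGER_PATHS = {
    "HLT_Mu10": lambda ev: ev.get("muon_pt", 0.0) > 10.0,
    "HLT_Jet50": lambda ev: ev.get("jet_pt", 0.0) > 50.0,
}

STREAMS = {"Muons": ["HLT_Mu10"], "Jets": ["HLT_Jet50"]}

def run_hlt(event):
    """Return the fired paths and the streams the event is routed to."""
    fired = [p for p, accept in TRIGGER_PATHS.items() if accept(event)]
    streams = [s for s, paths in STREAMS.items()
               if any(p in fired for p in paths)]
    return fired, streams

cosmic_event = {"muon_pt": 22.5, "jet_pt": 8.0}
print(run_hlt(cosmic_event))   # (['HLT_Mu10'], ['Muons'])
```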

Relevance:

100.00%

Publisher:

Abstract:

Acoustic Doppler current profilers are currently the main option for flow measurement and hydrodynamic monitoring of streams, replacing traditional methods. The spread of such equipment is mainly due to its operational advantages, ranging from measurement speed to the level of detail and amount of information generated about the hydrodynamics of gauging sections. As with traditional methods and equipment, the use of acoustic Doppler profilers should be guided by the pursuit of data quality, since these data are the basis for the design and management of water resources works and systems. In this sense, the paper presents an analysis of measurement uncertainties from a gauging campaign held on the Sapucaí River (Piranguinho-MG), using two different Doppler profilers: a Rio Grande ADCP 1200 kHz and a Qmetrix Qliner. Ten measurements were performed consecutively with each instrument, following quality protocols from the literature, and a Type A uncertainty analysis (statistical analysis of several independent observations of the input under the same conditions) was then carried out. The measurements of the ADCP and the Qliner presented standard uncertainties of 0.679% and 0.508% of the respective means. These results are satisfactory and acceptable when compared to references in the literature, indicating that the use of Doppler profilers is valid for the expansion and upgrading of streamflow measurement networks and the generation of hydrological data.
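The Type A evaluation used here follows the standard GUM recipe: the standard uncertainty of the mean of n independent repeats is the sample standard deviation divided by √n, expressed here relative to the mean. The discharge values below are invented to reproduce the calculation, not the campaign's data.

```python
# Type A relative standard uncertainty from repeated measurements.

import statistics as st

def type_a_relative_uncertainty(samples):
    """Relative standard uncertainty of the mean, in percent."""
    mean = st.mean(samples)
    u = st.stdev(samples) / len(samples) ** 0.5  # standard error of the mean
    return 100.0 * u / mean

discharges = [101.2, 100.8, 101.5, 100.9, 101.1,
              101.4, 100.7, 101.3, 101.0, 101.2]  # m^3/s, 10 repeats
print(f"u = {type_a_relative_uncertainty(discharges):.3f}% of the mean")
```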

Relevance:

100.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)