936 results for "Data quality problems"


Relevance:

80.00%

Publisher:

Abstract:

Purpose – The paper addresses the practical problems which emerge when attempting to apply longitudinal approaches to the assessment of property depreciation using valuation-based data. These problems relate to inconsistent valuation regimes and the difficulties in finding appropriate benchmarks. Design/methodology/approach – The paper adopts a case study of seven major office locations around Europe and attempts to determine ten-year rental value depreciation rates based on a longitudinal approach using IPD, CBRE and BNP Paribas datasets. Findings – The depreciation rates range from a 5 per cent PA depreciation rate in Frankfurt to a 2 per cent appreciation rate in Stockholm. The results are discussed in the context of the difficulties in applying this method with inconsistent data. Research limitations/implications – The paper has methodological implications for measuring property investment depreciation and provides an example of the problems in adopting theoretically sound approaches with inconsistent information. Practical implications – Valuations play an important role in performance measurement and cross-border investment decision making and, therefore, knowledge of inconsistency of valuation practice aids decision making and informs any application of valuation-based data in the attainment of depreciation rates. Originality/value – The paper provides new insights into the use of property market valuation data in a cross-border context, insights that previously had been anecdotal and unproven in nature.
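The longitudinal approach rests on simple compound-growth arithmetic: an annual depreciation rate is the shortfall of a property's annualized rental growth against a same-quality benchmark. A minimal sketch of that arithmetic (the function names and the exact benchmark formulation are illustrative, not the paper's):

```python
def annual_growth(v0, vT, years):
    """Compound annual growth rate implied by start and end rental values."""
    return (vT / v0) ** (1.0 / years) - 1.0

def depreciation_rate(asset_growth, benchmark_growth):
    """Annual depreciation as the shortfall of the asset's rental growth
    against a benchmark (hypothetical formulation, for illustration)."""
    return (1.0 + benchmark_growth) / (1.0 + asset_growth) - 1.0
```

Under this convention a flat asset against a 2 per cent benchmark depreciates at roughly 2 per cent PA, while an asset matching the benchmark shows zero depreciation.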

Relevance:

80.00%

Publisher:

Abstract:

Emissions of exhaust gases and particles from oceangoing ships are a significant and growing contributor to the total emissions from the transportation sector. We present an assessment of the contribution of gaseous and particulate emissions from oceangoing shipping to anthropogenic emissions and air quality. We also assess the degradation in human health and climate change created by these emissions. Regulating ship emissions requires comprehensive knowledge of current fuel consumption and emissions, understanding of their impact on atmospheric composition and climate, and projections of potential future evolutions and mitigation options. Nearly 70% of ship emissions occur within 400 km of coastlines, causing air quality problems through the formation of ground-level ozone, sulphur emissions and particulate matter in coastal areas and harbours with heavy traffic. Furthermore, ozone and aerosol precursor emissions as well as their derivative species from ships may be transported in the atmosphere over several hundreds of kilometres, and thus contribute to air quality problems further inland, even though they are emitted at sea. In addition, ship emissions impact climate. Recent studies indicate that the cooling due to altered clouds far outweighs the warming effects from greenhouse gases such as carbon dioxide (CO2) or ozone from shipping, overall causing a negative present-day radiative forcing (RF). Current efforts to reduce sulphur and other pollutants from shipping may modify this. However, given the short residence time of sulphate compared to CO2, the climate response from sulphate is of the order of decades, while that of CO2 is of the order of centuries. The climatic trade-off between positive and negative radiative forcing is still a topic of scientific research, but from what is currently known, a simple cancellation of global mean forcing components is potentially inappropriate and a more comprehensive assessment metric is required.
The CO2 equivalent emissions using the global temperature change potential (GTP) metric indicate that after 50 years the net global mean effect of current emissions is close to zero through cancellation of warming by CO2 and cooling by sulphate and nitrogen oxides.
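The cancellation described above can be made concrete as a weighted sum of species emissions under a pulse GTP metric. The GTP(50) factors below are placeholders chosen only to illustrate the sign structure (CO2 warming versus sulphate/NOx cooling); they are not values from the study:

```python
# Placeholder GTP(50) factors -- illustrative signs only, not real values.
GTP50 = {"CO2": 1.0, "SO2": -20.0, "NOx": -5.0}

def co2_equivalent(emissions, metric=GTP50):
    """Net CO2-equivalent of an emissions inventory under a GTP metric.
    Negative entries (cooling species) can cancel the CO2 term, which is
    how the net 50-year effect of current ship emissions can approach zero."""
    return sum(metric[species] * mass for species, mass in emissions.items())
```

With these placeholder factors, an inventory of 100 units CO2, 4 units SO2 and 4 units NOx nets out to exactly zero, mirroring the cancellation the abstract describes.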

Relevance:

80.00%

Publisher:

Abstract:

The long observational record is critical to our understanding of the Earth's climate, but most observing systems were not developed with a climate objective in mind. As a result, tremendous efforts have gone into assessing and reprocessing the data records to improve their usefulness in climate studies. The purpose of this paper is to both review recent progress in reprocessing and reanalyzing observations, and summarize the challenges that must be overcome in order to improve our understanding of climate and variability. Reprocessing improves data quality through more scrutiny and improved retrieval techniques for individual observing systems, while reanalysis merges many disparate observations with models through data assimilation, yet both aim to provide a climatology of Earth processes. Many challenges remain, such as tracking the improvement of processing algorithms and limited spatial coverage. Reanalyses have fostered significant research, yet reliable global trends in many physical fields are not yet attainable, despite significant advances in data assimilation and numerical modeling. Oceanic reanalyses have made significant advances in recent years, but will only be discussed here in terms of progress toward integrated Earth system analyses. Climate data sets are generally adequate for process studies and large-scale climate variability. Communication of the strengths, limitations and uncertainties of reprocessed observations and reanalysis data, not only among the community of developers, but also with the extended research community, including the new generations of researchers and the decision makers, is crucial for further advancement of the observational data records. It must be emphasized that careful investigation of the data and processing methods is required to use the observations appropriately.

Relevance:

80.00%

Publisher:

Abstract:

This special issue is focused on the assessment of algorithms for the observation of Earth's climate from environmental satellites. Climate data records derived by remote sensing are increasingly a key source of insight into the workings of and changes in Earth's climate system. Producers of data sets must devote considerable effort and expertise to maximise the true climate signals in their products and minimise effects of data processing choices and changing sensors. A key choice is the selection of algorithm(s) for classification and/or retrieval of the climate variable. Within the European Space Agency Climate Change Initiative, science teams undertook systematic assessment of algorithms for a range of essential climate variables. The papers in the special issue report some of these exercises (for ocean colour, aerosol, ozone, greenhouse gases, clouds, soil moisture, sea surface temperature and glaciers). The contributions show that assessment exercises must be designed with care, considering issues such as the relative importance of different aspects of data quality (accuracy, precision, stability, sensitivity, coverage, etc.), the availability and degree of independence of validation data and the limitations of validation in characterising some important aspects of data (such as long-term stability or spatial coherence). As well as requiring a significant investment of expertise and effort, systematic comparisons are found to be highly valuable. They reveal the relative strengths and weaknesses of different algorithmic approaches under different observational contexts, and help ensure that scientific conclusions drawn from climate data records are not influenced by observational artifacts, but are robust.
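Several of the quality aspects listed (accuracy, precision, stability) reduce to simple statistics of a product-minus-reference error series. A standard-library sketch, with stability taken as the linear drift of the error over time (one common convention, not necessarily the one used by every CCI team):

```python
import math

def bias(product, reference):
    """Mean error: the systematic offset of the data record (accuracy)."""
    return sum(p - r for p, r in zip(product, reference)) / len(product)

def rmse(product, reference):
    """Root-mean-square error: overall discrepancy (precision plus bias)."""
    return math.sqrt(sum((p - r) ** 2
                         for p, r in zip(product, reference)) / len(product))

def stability(times, errors):
    """Least-squares slope of error vs. time: long-term drift per unit time."""
    n = len(times)
    tm, em = sum(times) / n, sum(errors) / n
    num = sum((t - tm) * (e - em) for t, e in zip(times, errors))
    return num / sum((t - tm) ** 2 for t in times)
```

A record can score well on one metric and poorly on another (e.g. small RMSE but a steady drift), which is why the relative importance of these aspects has to be weighed when designing an assessment.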

Relevance:

80.00%

Publisher:

Abstract:

Variations in the spatial configuration of the interstellar magnetic field (ISMF) near the Sun can be constrained by comparing the ISMF direction at the heliosphere found from the Interstellar Boundary Explorer (IBEX) spacecraft observations of a "Ribbon" of energetic neutral atoms (ENAs), with the ISMF direction derived from optical polarization data for stars within ~40 pc. Using interstellar polarization observations toward ~30 nearby stars within ~90 degrees of the heliosphere nose, we find that the best fits to the polarization position angles are obtained for a magnetic pole directed toward ecliptic coordinates of lambda, beta ~ 263 degrees, 37 degrees (or galactic coordinates of l, b ~ 38 degrees, 23 degrees), with uncertainties of +/- 35 degrees based on the broad minimum of the best fits and the range of data quality. This magnetic pole is 33 degrees from the magnetic pole that is defined by the center of the arc of the ENA Ribbon. The IBEX ENA Ribbon is seen in sight lines that are perpendicular to the ISMF as it drapes over the heliosphere. The similarity of the polarization and Ribbon directions for the local ISMF suggests that the local field is coherent over scale sizes of tens of parsecs. The ISMF vector direction is nearly perpendicular to the flow of local interstellar material (ISM) through the local standard of rest, supporting a possible local ISM origin related to an evolved expanding magnetized shell. The local ISMF direction is found to have a curious geometry with respect to the cosmic microwave background dipole moment.
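The 33-degree figure between the polarization-derived pole and the Ribbon-arc pole is simply the great-circle angle between two sky directions; a small standard-library sketch of that computation:

```python
import math

def angular_separation(lon1, lat1, lon2, lat2):
    """Great-circle angle in degrees between two sky directions given as
    (longitude, latitude) pairs in degrees, in any spherical frame."""
    l1, b1, l2, b2 = map(math.radians, (lon1, lat1, lon2, lat2))
    c = (math.sin(b1) * math.sin(b2)
         + math.cos(b1) * math.cos(b2) * math.cos(l1 - l2))
    # Clamp against floating-point overshoot before acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, c))))
```

Feeding in the two pole directions quoted in the abstract would reproduce their mutual separation; the clamp guards `acos` against rounding just outside [-1, 1].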

Relevance:

80.00%

Publisher:

Abstract:

In the poultry industry, the use of water of adequate physical, chemical and microbiological quality is of fundamental importance. Since many birds have access to the same water source, quality problems will affect a great number of animals. Drinking water plays an important role in the transmission of some bacterial, viral and protozoan diseases that are among the most common poultry diseases. Important factors in preventing waterborne diseases in broiler production are the protection of supply sources, water disinfection and the quality control of microbiological, chemical and physical characteristics. Water is an essential nutrient for birds, and therefore quality preservation is fundamental for good flock performance. The farmer may prevent many diseases in bird flocks by controlling the quality of the ingested water, which will certainly result in decreased costs and increased profit, two essential aims of animal production today.

Relevance:

80.00%

Publisher:

Abstract:

Over the last century, the Six Sigma strategy has been the focus of study for many scientists; among its findings is the importance of data processing for error-free manufacturing. This work therefore focuses on the importance of data quality in an enterprise. A descriptive-exploratory study of seventeen compounding pharmacies in Rio Grande do Norte was undertaken with the objective of creating a base structure model to classify enterprises according to their databases. Statistical methods such as cluster and discriminant analyses were applied to a questionnaire built for this specific study. Data collection identified four groups, each with distinct strong and weak characteristics differentiating it from the others.
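The study does not publish its exact clustering procedure. As a hypothetical sketch of how questionnaire score vectors could be partitioned into groups of the kind described, a minimal k-means in the standard library (the data and the choice of k-means itself are illustrative, not the authors' method):

```python
import random

def dist2(p, q):
    """Squared Euclidean distance between two questionnaire score vectors."""
    return sum((a - b) ** 2 for a, b in zip(p, q))

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means: returns one cluster label per point."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        labels = [min(range(k), key=lambda j: dist2(p, centroids[j]))
                  for p in points]
        # Recompute each centroid as the mean of its members.
        for j in range(k):
            members = [p for p, lab in zip(points, labels) if lab == j]
            if members:
                centroids[j] = tuple(sum(xs) / len(members)
                                     for xs in zip(*members))
    return labels
```

Discriminant analysis would then be used, as in the paper, to characterize which questionnaire variables separate the resulting groups.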

Relevance:

80.00%

Publisher:

Abstract:

Particularly in Brazil, and in Rio Grande do Norte, red ceramic manufacturers play an important role as agents of development in the Seridó-RN region, the specific site of this research. A concentration of small red ceramic industries is observed in this region which, despite their importance to the ceramic sector, are unable to take advantage of the new forms of administrative management and technological advances designed and offered by universities, research centres and government programmes, remaining almost entirely outside administrative and technological progress and modernization. These companies still rely on outdated technology and management processes, leading to quality and standardization problems in the end products. Under these conditions, the companies face crisis and struggle to survive alone and without assistance. A detailed case study of the red ceramic companies of the Seridó-RN region was therefore carried out, starting from the existing theory and an actual survey of the conditions of red ceramic manufacturing. Samples of raw materials were collected from each ceramic industry that participated in the research, and parameters were determined such as: the physical, chemical and technological properties of the raw materials; the characterization of the processes used; and a survey of the technological resources, considering the equipment, machinery, supplies, raw materials and facilities available and their organization by product type in the companies involved. The methodology consists of the following steps: collection of raw material; crushing and screening; characterization of the raw materials (liquid limit, chemical analysis, mineralogical analysis, differential thermal analysis, sieve analysis); mixing; forming; cutting; and drying and firing of the ceramic bodies and test specimens.
The results showed clays with distinct characteristics with respect to plasticity. Regarding the different compositions of the ceramic mass mixtures, the ceramic properties showed a direct proportionality with an increasing fraction of non-plastic clay. Even so, the compositions studied proved appropriate for the simulated clay types for use in ceramics. The ceramic processing adopted made it possible to obtain products with consistent properties, in some cases even exceeding the requirements of the Brazilian technical standards for clays used in ceramic products such as roof tiles, bricks and floor tiles. Based on the discussion of the results obtained in the various processing steps, conclusions can be drawn on the physico-chemical and mineralogical properties of the raw materials and on the properties of the fired ceramic products. This work may be used by other researchers, private companies, governmental organizations, and undergraduate and graduate students to develop future studies and research: projects to modify the kilns; projects to map and rationalize the exploitation of raw materials; reforestation and forest management; projects to reduce and recover waste; workforce training projects in the sector; and safety projects to improve working conditions in the pottery area.

Relevance:

80.00%

Publisher:

Abstract:

Anthropic disturbances in watersheds, such as inappropriate building development, disorderly land occupation and unplanned land use, may increase sediment yield and the inflow into the estuary, leading to siltation, changes in the channel conformation, and ecosystem/water quality problems. In this context, this study aims to assess the applicability of the SWAT model to estimate, even in a preliminary way, the sediment yield distribution along the Potengi River watershed, as well as its contribution to the estuary. Furthermore, an assessment of erosion susceptibility was used for comparison. The susceptibility map was developed by overlaying rainfall erosivity, soil erodibility, terrain slope and land cover; to combine these maps, a multi-criteria analysis through the AHP method was applied. SWAT was run over a five-year period (1997-2001), considering three scenarios based on different sorts of human interference: a) agriculture; b) pasture; and c) no interference (background). Results were analyzed in terms of surface runoff, sediment yield and their propagation along each river section. The regions in the extreme west of the watershed and in the downstream portions returned the highest sediment yields, reaching 2.8 and 5.1 t/ha·year respectively, whereas the less susceptible central areas returned the lowest values, never more than 0.7 t/ha·year. In the western sub-watersheds, where the headwaters lie, sediment yield was naturally forced by high declivity and weak soils. On the other hand, results suggest that the eastern part would not contribute significantly to the sediment inflow into the estuary, the larger part of the sediment yield there being due to anthropic activities. For the central region, the analysis of sediment propagation indicates a predominance of deposition over transport.
Thus, it is not expected that isolated rain storms in the upstream river portions would significantly supply the estuary with sediment. Because the model calibration process has not yet been carried out, it must be emphasized that the values presented here should not be used for practical purposes. Even so, this work warns about the risks of further alteration of the natural land cover, mainly in areas closer to the headwaters and downstream along the Potengi River.
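The AHP step combines the four criteria maps (erosivity, erodibility, slope, land cover) using weights derived from a pairwise comparison matrix. A sketch of the standard column-normalization approximation of the priority vector; the 2x2 comparison values in the usage are illustrative, not the study's:

```python
def ahp_weights(matrix):
    """Priority weights from an AHP pairwise comparison matrix, using the
    column-normalization / row-average approximation of the principal
    eigenvector.  matrix[i][j] is how much more important criterion i is
    than criterion j (and matrix[j][i] should be its reciprocal)."""
    n = len(matrix)
    col = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    return [sum(matrix[i][j] / col[j] for j in range(n)) / n
            for i in range(n)]
```

The resulting weights sum to one and are then used to blend the normalized criteria rasters into the susceptibility map.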

Relevance:

80.00%

Publisher:

Abstract:

The Brazilian Geodetic Network started to be established in the early 1940s, employing classical surveying methods such as triangulation and trilateration. With the introduction of satellite positioning systems, such as TRANSIT and GPS, the network was densified. These data were adjusted by employing a variety of methods, yielding distortions in the network that need to be understood. In this work, we analyze and interpret study cases in an attempt to understand the distortions in the Brazilian network. For each case, we performed the network adjustment employing the GHOST software suite. The results show that the distortion is least sensitive to the removal of invar baselines in the classical network. The network would be more affected by the inexistence of Laplace stations and Doppler control points, with differences up to 4.5 m.
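Network adjustment of this kind is, at its core, a least-squares solution of observation equations. GHOST's internals are not reproduced here; a toy one-dimensional (leveling-style) adjustment with invented observations shows the principle:

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for the normal equations."""
    n = len(A)
    M = [row[:] + [rhs] for row, rhs in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(M[i][c] * x[c] for c in range(i + 1, n))
        x[i] = (M[i][n] - s) / M[i][i]
    return x

def adjust_leveling(n_unknown, observations):
    """Least-squares heights for a toy 1-D leveling net.  Each observation
    is (i, j, dh), meaning H_j - H_i = dh; index -1 denotes a fixed datum
    of height zero.  Builds and solves the normal equations A x = b."""
    A = [[0.0] * n_unknown for _ in range(n_unknown)]
    b = [0.0] * n_unknown
    for i, j, dh in observations:
        for idx, sign in ((i, -1.0), (j, 1.0)):
            if idx < 0:          # fixed datum: not an unknown
                continue
            b[idx] += sign * dh
            for idx2, sign2 in ((i, -1.0), (j, 1.0)):
                if idx2 >= 0:
                    A[idx][idx2] += sign * sign2
    return solve(A, b)
```

With redundant, slightly inconsistent observations the adjustment spreads the misclosure across the network, which is exactly where distortions appear when the underlying data or constraints (Laplace stations, Doppler points) change.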

Relevance:

80.00%

Publisher:

Abstract:

The CMS High-Level Trigger (HLT) is responsible for ensuring that data samples with potentially interesting events are recorded with high efficiency and good quality. This paper gives an overview of the HLT and focuses on its commissioning using cosmic rays. The selection of triggers that were deployed is presented and the online grouping of triggered events into streams and primary datasets is discussed. Tools for online and offline data quality monitoring for the HLT are described, and the operational performance of the muon HLT algorithms is reviewed. The average time taken for the HLT selection and its dependence on detector and operating conditions are presented. The HLT performed reliably and helped provide a large dataset. This dataset has proven to be invaluable for understanding the performance of the trigger and the CMS experiment as a whole. © 2010 IOP Publishing Ltd and SISSA.
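The grouping of triggered events into streams and primary datasets amounts to routing each event to every dataset whose trigger fired, with overlaps allowed. The trigger-to-dataset table below is invented for illustration; the real CMS menu is far larger and its mapping differs:

```python
# Invented trigger -> primary-dataset table, for illustration only.
DATASET_OF = {
    "HLT_MuOpen": "Cosmics-Muons",
    "HLT_Jet15U": "Cosmics-Calo",
    "HLT_MinBias": "MinimumBias",
}

def route_event(fired, table=DATASET_OF):
    """Primary datasets an event is written to: one copy per dataset whose
    trigger fired (a single event may enter several datasets)."""
    return sorted({ds for trig, ds in table.items() if trig in fired})
```

Overlap between datasets is deliberate: it keeps each primary dataset self-contained for downstream analysis, at the cost of storing some events more than once.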

Relevance:

80.00%

Publisher:

Abstract:

Includes bibliography

Relevance:

80.00%

Publisher:

Abstract:

Weather and climate have a direct influence on agriculture, affecting all stages of farming, from soil preparation to harvest. Meteorological data from automatic or conventional weather stations are used to monitor these effects, but such data suffer from difficult access and the low density of meteorological stations in Brazil. Meteorological data from atmospheric models, such as the ECMWF (European Centre for Medium-Range Weather Forecasts), can be an alternative. Thus, the aim of this study was to compare 10-day precipitation and maximum and minimum air temperature data from the ECMWF model with maps interpolated from 33 weather stations in São Paulo state between 2005 and 2010, and to generate statistical maps pixel by pixel. The statistical indices were spatially satisfactory (most results with R² > 0.60, d > 0.7, RMSE < 5 °C and < 50 mm; Es < 5 °C and < 24 mm), and the ECMWF model can be recommended for use in São Paulo state.
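The agreement statistics quoted (R², Willmott's d, RMSE) can be computed per pixel from paired model and station series; a standard-library sketch:

```python
import math

def rmse(model, obs):
    """Root-mean-square error between model and observed series."""
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

def r_squared(model, obs):
    """Squared Pearson correlation between model and observed series."""
    n = len(obs)
    mm, om = sum(model) / n, sum(obs) / n
    sxy = sum((m - mm) * (o - om) for m, o in zip(model, obs))
    sxx = sum((m - mm) ** 2 for m in model)
    syy = sum((o - om) ** 2 for o in obs)
    return sxy * sxy / (sxx * syy)

def willmott_d(model, obs):
    """Willmott's index of agreement d (1 = perfect agreement)."""
    om = sum(obs) / len(obs)
    num = sum((m - o) ** 2 for m, o in zip(model, obs))
    den = sum((abs(m - om) + abs(o - om)) ** 2 for m, o in zip(model, obs))
    return 1.0 - num / den
```

R² measures how tightly the two series co-vary, while Willmott's d also penalizes systematic offsets, which is why both are reported together alongside RMSE.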

Relevance:

80.00%

Publisher:

Abstract:

Acoustic Doppler current profilers are currently the main option for flow measurement and hydrodynamic monitoring of streams, replacing traditional methods. The spread of such equipment is mainly due to operational advantages, ranging from the speed of measurement to the greater detail and amount of information generated about the hydrodynamics of hydrometric sections. As with traditional methods and equipment, the use of acoustic Doppler profilers should be guided by the pursuit of data quality, since these data are the basis for the design and management of water resources structures and systems. In this sense, the paper presents an analysis of the measurement uncertainties of a hydrometric campaign held on the Sapucaí River (Piranguinho-MG), using two different Doppler profilers: a Rio Grande ADCP 1200 kHz and a Qmetrix Qliner. Ten measurements were performed consecutively with each instrument, following the quality protocols of the literature, followed by a Type A uncertainty analysis (statistical analysis of several independent observations of the input under the same conditions). The measurements of the ADCP and the Qliner presented standard uncertainties of 0.679% and 0.508% of the averages, respectively. These results are satisfactory and acceptable when compared to references in the literature, indicating that the use of Doppler profilers is valid for the expansion and upgrade of streamflow measurement networks and the generation of hydrological data.
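The quoted figures are Type A standard uncertainties of the mean, expressed as a percentage of the average discharge; under the GUM, that is s/sqrt(n). A minimal sketch (the sample values in the test are hypothetical, not the campaign's discharges):

```python
import math

def type_a_uncertainty(samples):
    """GUM Type A evaluation: experimental standard deviation of the mean
    of n independent repeated observations, s / sqrt(n)."""
    n = len(samples)
    mean = sum(samples) / n
    s2 = sum((x - mean) ** 2 for x in samples) / (n - 1)
    return math.sqrt(s2 / n)

def relative_uncertainty_percent(samples):
    """Type A standard uncertainty as a percentage of the sample mean,
    the form in which the 0.679% and 0.508% figures are reported."""
    return 100.0 * type_a_uncertainty(samples) / (sum(samples) / len(samples))
```

With the ten consecutive measurements per instrument, n = 10 and the divisor n - 1 = 9 gives the unbiased sample variance before scaling down by sqrt(n).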