Abstract:
A wide variety of spatial data collection efforts are ongoing throughout local, state and federal agencies, private firms and non-profit organizations. Each effort is established for a different purpose, but organizations and individuals often collect and maintain the same or similar information. The United States federal government has undertaken many initiatives, such as the National Spatial Data Infrastructure, the National Map and Geospatial One-Stop, to reduce duplicative spatial data collection and promote the coordinated use, sharing, and dissemination of spatial data nationwide. A key premise in most of these initiatives is that no national government will be able to gather and maintain more than a small percentage of the geographic data that users need. Thus, national initiatives typically depend on the cooperation of those already gathering spatial data and those using GIS to meet specific needs to help construct and maintain these spatial data infrastructures and geo-libraries for their nations (Onsrud 2001). Some of the impediments to widespread spatial data sharing are well known from directly asking GIS data producers why they are not currently involved in creating datasets in common or compatible formats, documenting their datasets in a standardized metadata format, or making their datasets more readily available to others through Data Clearinghouses or geo-libraries. The research described in this thesis addresses the impediments to wide-scale spatial data sharing faced by GIS data producers and explores a new conceptual data-sharing approach, the Public Commons for Geospatial Data, which supports user-friendly metadata creation, open access licenses, archival services and documentation of the parent lineage of contributors and value-adders of digital spatial data sets.
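The parent-lineage documentation described above can be pictured as a catalog in which every value-added data set points back to the data sets it was derived from. Below is a minimal, purely illustrative Python sketch of such a record structure; the field names and the trace_lineage helper are hypothetical, not part of the Public Commons design described in the thesis.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class DatasetRecord:
    """One entry in a hypothetical public-commons catalog."""
    dataset_id: str
    title: str
    license: str                       # e.g. an open-access license tag
    contributor: str
    parent_ids: List[str] = field(default_factory=list)  # lineage links

def trace_lineage(catalog: Dict[str, DatasetRecord], dataset_id: str) -> List[str]:
    """Walk parent links back to the original contribution(s)."""
    lineage, stack = [], [dataset_id]
    while stack:
        record = catalog.get(stack.pop())
        if record is None:
            continue
        lineage.append(f"{record.dataset_id}: {record.contributor} ({record.license})")
        stack.extend(record.parent_ids)
    return lineage
```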
Abstract:
Clinical Research Data Quality Literature Review and Pooled Analysis
We present a literature review and secondary analysis of data accuracy in clinical research and related secondary data uses. A total of 93 papers meeting our inclusion criteria were categorized according to the data processing methods. Quantitative data accuracy information was abstracted from the articles and pooled. Our analysis demonstrates that the accuracy associated with data processing methods varies widely, with error rates ranging from 2 to 5019 errors per 10,000 fields. Medical record abstraction was associated with the highest error rates (70–5019 errors per 10,000 fields). Data entered and processed at healthcare facilities had error rates comparable to data processed at central data processing centers. Error rates for data processed with single entry in the presence of on-screen checks were comparable to those for double-entered data. While data processing and cleaning methods may explain a significant amount of the variability in data accuracy, additional factors not resolvable here likely exist.

Defining Data Quality for Clinical Research: A Concept Analysis
Despite notable previous attempts by experts to define data quality, the concept remains ambiguous and subject to the vagaries of natural language. This lack of clarity continues to hamper research related to data quality issues. We present a formal concept analysis of data quality, which builds on and synthesizes previously published work. We further posit that discipline-level specificity may be required to achieve the desired definitional clarity. To this end, we combine work from the clinical research domain with findings from the general data quality literature to produce a discipline-specific definition and operationalization for data quality in clinical research. While the results are helpful to clinical research, the methodology of concept analysis may be useful in other fields to clarify data quality attributes and to achieve operational definitions.

Medical Record Abstractors' Perceptions of Factors Impacting the Accuracy of Abstracted Data
Medical record abstraction (MRA) is known to be a significant source of data errors in secondary data uses. Factors impacting the accuracy of abstracted data are not reported consistently in the literature. Two Delphi processes were conducted with experienced medical record abstractors to assess abstractors' perceptions of these factors. The Delphi process identified 9 factors not found in the literature and differed from the literature on 5 of the factors in the top 25%. The Delphi results refuted 7 factors reported in the literature as impacting the quality of abstracted data. The results provide insight into, and indicate content validity of, a significant number of the factors reported in the literature. Further, the results indicate general consistency between the perceptions of clinical research medical record abstractors and those of registry and quality improvement abstractors.

Distributed Cognition Artifacts on Clinical Research Data Collection Forms
Medical record abstraction, a primary mode of data collection in secondary data use, is associated with high error rates. Distributed cognition in medical record abstraction has not been studied as a possible explanation for abstraction errors. We employed the theory of distributed representation and representational analysis to systematically evaluate cognitive demands in medical record abstraction and the extent of external cognitive support employed in a sample of clinical research data collection forms. We show that the cognitive load required for abstraction was high for 61% of the sampled data elements, and exceedingly so for 9%. Further, the data collection forms did not support external cognition for the most complex data elements. High working memory demands are a possible explanation for the association of data errors with data elements requiring abstractor interpretation, comparison, mapping or calculation. The representational analysis used here can identify data elements with high cognitive demands.
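The errors-per-10,000-fields metric used throughout the pooled analysis normalizes counted errors to a common denominator before combining studies. A minimal Python sketch of that arithmetic, with hypothetical study counts rather than data from the review:

```python
from typing import List, Tuple

def errors_per_10k(n_errors: int, n_fields: int) -> float:
    """Normalize an error count to the errors-per-10,000-fields metric."""
    return 10_000 * n_errors / n_fields

def pooled_rate(studies: List[Tuple[int, int]]) -> float:
    """Pool (n_errors, n_fields) pairs by summing before normalizing."""
    total_errors = sum(e for e, _ in studies)
    total_fields = sum(f for _, f in studies)
    return errors_per_10k(total_errors, total_fields)

# Hypothetical example: three studies of different sizes.
studies = [(12, 60_000), (450, 90_000), (7, 35_000)]
print(round(pooled_rate(studies), 1))  # pooled errors per 10,000 fields
```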
Abstract:
This article presents and technically describes a new field spectro-goniometer system, developed at the Alfred Wegener Institute (AWI), for the ground-based characterization of surface reflectance anisotropy under natural illumination conditions. The spectro-goniometer consists of a Manual Transportable Instrument platform for ground-based Spectro-directional observations (ManTIS) and a hyperspectral sensor system. The presented measurement strategy shows that the AWI ManTIS field spectro-goniometer can deliver high-quality hemispherical conical reflectance factor (HCRF) measurements with a pointing accuracy of ±6 cm around a constant observation center. Sampling a ManTIS hemisphere (up to 30° viewing zenith, 360° viewing azimuth) takes approximately 18 min. The data processing chain, combined with the semi-automatic control software, provides a reliable way to reduce temporal effects during the measurements. The presented visualization and analysis approaches, demonstrated on HCRF data of Arctic low-growing vegetation, confirm the high quality of the spectro-goniometer measurements. The patented low-cost and lightweight ManTIS instrument platform can be customized for various research needs and is available for purchase.
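For a rough sense of where a hemisphere sampling time of about 18 min could come from, the sketch below enumerates a naive angular grid and multiplies by an assumed per-position dwell time. The step sizes and dwell are illustrative assumptions, not the actual ManTIS scan pattern:

```python
import itertools

# Hypothetical angular step sizes; the real ManTIS sampling scheme may differ.
ZENITH_STEPS_DEG = range(0, 31, 5)      # 0..30 deg viewing zenith
AZIMUTH_STEPS_DEG = range(0, 360, 30)   # full 360 deg viewing azimuth

# Naive grid (note: nadir at 0 deg zenith is duplicated across azimuths).
positions = list(itertools.product(ZENITH_STEPS_DEG, AZIMUTH_STEPS_DEG))
dwell_s = 12.0  # assumed seconds per position (measurement + slewing)

total_min = len(positions) * dwell_s / 60
print(f"{len(positions)} positions, approx. {total_min:.0f} min")
```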
Abstract:
Distribution, accumulation and diagenesis of surficial sediments in coastal and continental shelf systems follow complex chains of localized processes and form deposits of great spatial variability. Given the environmental and economic relevance of ocean margins, there is a growing need for innovative geophysical exploration methods that characterize seafloor sediments by more than acoustic properties. A newly conceptualized benthic profiling and data processing approach based on controlled source electromagnetic (CSEM) imaging permits coeval quantification of the magnetic susceptibility and the electric conductivity of shallow marine deposits. The two physical properties differ fundamentally insofar as magnetic susceptibility mostly assesses solid-particle characteristics such as terrigenous or iron mineral content, redox state and contamination level, while electric conductivity primarily relates to the fluid-filled pore space and detects salinity, porosity and grain-size variations. We develop and validate a layered half-space inversion algorithm for submarine multifrequency CSEM with concentric sensor configuration. Guided by modeling results, we modified a commercial land CSEM sensor for submarine application and mounted it in a nonconductive, nonmagnetic bottom-towed sled. This benthic EM profiler, Neridis II, achieves 25 soundings per second at 3-4 knots over continuous profiles of up to a hundred kilometers. Magnetic susceptibility is determined from the 75 Hz in-phase response (90% of the signal originates from the top 50 cm), while electric conductivity is derived from the 5 kHz out-of-phase (quadrature) component (90% of the signal from the top 92 cm). Exemplary survey data from the north-west Iberian margin underline the excellent sensitivity, functionality and robustness of the system in littoral (~0-50 m) and neritic (~50-300 m) environments. Susceptibility vs. porosity cross-plots successfully identify known lithofacies units and their transitions. All presently available data indicate considerable potential of CSEM profiling for assessing the complex distribution of shallow marine surficial sediments and for revealing the climatic, hydrodynamic, diagenetic and anthropogenic factors governing their formation.
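The quoted sounding rate implies very dense along-track sampling. A minimal sketch of the arithmetic, using only the figures stated above (25 soundings per second at 3-4 knots):

```python
KNOT_MS = 0.514444  # meters per second per knot

def sounding_spacing_m(speed_knots: float, soundings_per_s: float = 25.0) -> float:
    """Along-track distance between consecutive CSEM soundings."""
    return speed_knots * KNOT_MS / soundings_per_s

for v in (3.0, 4.0):  # the stated towing speed range
    print(f"{v} kn -> {sounding_spacing_m(v) * 100:.1f} cm per sounding")
# -> roughly 6-8 cm between soundings across the stated speed range
```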
Abstract:
In recent years, profiling floats, which form the basis of the successful international Argo observatory, have also been considered as platforms for marine biogeochemical research. This study showcases the utility of floats as a novel tool for combined gas measurements of CO2 partial pressure (pCO2) and O2. These float prototypes were equipped with a small, submersible pCO2 sensor and an optode O2 sensor for high-resolution measurements in the surface ocean layer. Four consecutive deployments were carried out between November 2010 and June 2011 near the Cape Verde Ocean Observatory (CVOO) in the eastern tropical North Atlantic. The profiling float performed upcasts every 31 h while measuring pCO2, O2, salinity, temperature, and hydrostatic pressure in the upper 200 m of the water column. To maintain accuracy, regular pCO2 sensor zeroings at depth and at the surface, as well as optode measurements in air, were performed for each profile. Through the application of data processing procedures (e.g., time-lag correction), the accuracy of float-borne pCO2 measurements was greatly improved (10-15 µatm for the water column and 5 µatm for surface measurements). O2 measurements yielded an accuracy of 2 µmol/kg. First results of this pilot study show the possibility of using profiling floats as a platform for detailed and unattended observations of marine carbon and oxygen cycle dynamics.
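The time-lag correction mentioned above is, generically, an inverse filter for a slow sensor. The sketch below shows the standard first-order version, not the authors' exact procedure; the response time tau is an assumed sensor characteristic that would have to be determined for the specific pCO2 sensor or optode:

```python
import numpy as np

def lag_correct(measured: np.ndarray, t: np.ndarray, tau: float) -> np.ndarray:
    """First-order inverse-lag correction for a slow sensor.

    Assumes the sensor behaves like a first-order low-pass filter with
    response time tau (seconds). Differencing amplifies noise, so some
    smoothing of the output is usually needed in practice.
    """
    dt = np.diff(t)                      # sample intervals (s)
    a = np.exp(-dt / tau)                # per-step decay factor
    corrected = np.empty_like(measured, dtype=float)
    corrected[0] = measured[0]
    # Invert m[i] = a * m[i-1] + (1 - a) * true[i] for the true signal.
    corrected[1:] = (measured[1:] - measured[:-1] * a) / (1.0 - a)
    return corrected
```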
Abstract:
The presented database contains time-referenced sea ice draft values from upward looking sonar (ULS) measurements in the Weddell Sea, Antarctica. The sea ice draft data can be used to infer the thickness of the ice. They were collected during the period 1990-2008. In total, the database includes measurements from 13 locations in the Weddell Sea and was generated from more than 3.7 million measurements of sea ice draft. The files contain uncorrected raw drafts, corrected drafts and the basic parameters measured by the ULS. The measurement principle, the data processing procedure and the quality control are described in detail. To account for the unknown speed of sound in the water column above the ULS, two correction methods were applied to the draft data. The first method is based on defining a reference level from the identification of open water leads. The second method uses a model of sound speed in the oceanic mixed layer and is applied to ice draft in austral winter. Both methods are discussed and their accuracy is estimated. Finally, selected results of the processing are presented.
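The first correction method can be illustrated with a simplified zero-level adjustment: pings identified as open-water leads should show zero draft, so their apparent draft estimates the sound-speed-induced offset. This Python sketch is a toy version of that idea, not the published processing chain (which, among other things, tracks the offset over time):

```python
import numpy as np

def correct_draft(raw_draft: np.ndarray, open_water: np.ndarray) -> np.ndarray:
    """Zero-level correction of ULS draft using open-water samples.

    raw_draft  : apparent draft per ping (m)
    open_water : boolean mask flagging pings identified as leads

    In open water the true draft is zero, so the median apparent draft
    of lead pings estimates the constant offset caused by the unknown
    speed of sound in the water column above the instrument.
    """
    offset = np.median(raw_draft[open_water])
    return raw_draft - offset
```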