967 results for satellite data processing
Abstract:
The Gravity field and steady-state Ocean Circulation Explorer (GOCE) was the first Earth Explorer core mission of the European Space Agency. It was launched on March 17, 2009 into a Sun-synchronous dusk-dawn orbit and re-entered the Earth's atmosphere on November 11, 2013. The satellite altitude was between 255 and 225 km during the measurement phases. The European GOCE Gravity Consortium is responsible for the Level 1b to Level 2 data processing within the GOCE High-level Processing Facility (HPF). The Precise Science Orbit (PSO) is one Level 2 product; it was produced under the responsibility of the Astronomical Institute of the University of Bern within the HPF and was delivered continuously throughout the entire mission. Regular checks ensured high consistency and quality of the orbits. A correlation between solar activity, GPS data availability, and orbit quality was found; the accuracy of the kinematic orbit suffers most from this. Improvements in modeling the range corrections at the retro-reflector array for the satellite laser ranging (SLR) measurements were made and implemented in the independent SLR validation of the GOCE PSO products. The SLR validation ultimately yields an orbit accuracy of 2.42 cm for the kinematic and 1.84 cm for the reduced-dynamic orbits over the entire mission. The common-mode accelerations from the GOCE gradiometer were not used for the official PSO product, but in addition to the operational HPF work a study was performed to investigate to what extent common-mode accelerations improve the reduced-dynamic orbit determination results. The accelerometer data may be used to derive realistic constraints for the empirical accelerations estimated in the reduced-dynamic orbit determination, which already improves the orbit quality. Beyond that, the accelerometer data may further improve the orbit quality if realistic constraints and state-of-the-art background models, such as gravity field and ocean tide models, are used for the reduced-dynamic orbit determination.
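As a hedged illustration of the constraint technique described above (a generic sketch, not the HPF or Bernese GNSS Software implementation; the function name, partitioning of parameters, and sigma values are assumptions), the following shows how an accelerometer-derived a priori sigma on the empirical accelerations can enter a least-squares orbit adjustment as pseudo-observations:

```python
# Minimal sketch: constraining empirical accelerations in a least-squares orbit adjustment.
import numpy as np

def constrained_lsq(A, y, n_dyn, n_emp, sigma_obs, sigma_emp):
    """Solve for orbit parameters with pseudo-observations pulling the
    n_emp empirical acceleration parameters towards zero.

    A         : design matrix (observations x (n_dyn + n_emp) parameters)
    y         : observed-minus-computed residuals
    n_dyn     : number of state/dynamical parameters (unconstrained)
    n_emp     : number of empirical acceleration parameters (constrained)
    sigma_obs : observation noise, e.g. kinematic position noise
    sigma_emp : a priori sigma of the empirical accelerations,
                e.g. derived from common-mode accelerometer statistics
    """
    N = A.T @ A / sigma_obs**2            # normal matrix from the observations
    b = A.T @ y / sigma_obs**2
    # constraint "x_emp = 0" added as pseudo-observations with weight 1/sigma_emp^2
    w = np.zeros(n_dyn + n_emp)
    w[n_dyn:] = 1.0 / sigma_emp**2
    N += np.diag(w)
    return np.linalg.solve(N, b)
```

A tighter (more realistic) sigma_emp pulls the estimated empirical accelerations towards zero, so the background force models carry more of the signal, which is the effect the study above exploits.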
Abstract:
The article proposes granular computing as a theoretical, formal and methodological basis for the newly emerging research field of human–data interaction (HDI). We argue that the ability to represent and reason with information granules is a prerequisite for data legibility. As such, it allows the research agenda of HDI to be extended to encompass the topic of collective intelligence amplification, which is seen as an opportunity offered by today's increasingly pervasive computing environments. As an example of collective intelligence amplification in HDI, we introduce a collaborative urban planning use case in a cognitive city environment and show how an iterative process of user input and human-oriented automated data processing can support collective decision making. As a basis for the automated human-oriented data processing, we use the spatial granular calculus of granular geometry.
Abstract:
As the number of space debris objects in the geostationary ring increases, it becomes mandatory for every satellite operator to avoid collisions. Space debris in geosynchronous orbits may be observed with optical telescopes. Unlike radar, which requires very large dishes and transmission powers to sense high-altitude objects, optical observations do not depend on active illumination from the ground and may be performed with notably smaller apertures. The detectable size of an object depends on the aperture of the telescope, the sky background and the exposure time. With a telescope of 50 cm aperture, objects down to approximately 50 cm may be observed. This size is regarded as a threshold for the identification of hazardous objects and the prevention of potentially catastrophic collisions in geostationary orbits. In collaboration with the Astronomical Institute of the University of Bern (AIUB), the German Space Operations Center (GSOC) is building a small-aperture telescope to demonstrate the feasibility of optical surveillance of the geostationary ring. The telescope will be located in the southern hemisphere and complement an existing telescope in the northern hemisphere already operated by AIUB. These two telescopes provide optimum coverage of European GEO satellites and enable continuous monitoring independent of seasonal limitations. The telescope will be operated fully automatically. The automated operations will be demonstrated across the full range of activities, including scheduling of observations, telescope and camera control, and data processing.
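To illustrate the link between object size and optical detectability quoted above, here is a rough, hedged estimate using the standard diffuse-sphere brightness model (the albedo, range and phase angle are assumed example values, not GSOC/AIUB figures):

```python
# Illustrative estimate: apparent magnitude of a Lambertian sphere at GEO range,
# which is what connects object size to required aperture and exposure time.
import math

M_SUN = -26.74   # apparent magnitude of the Sun

def debris_magnitude(diameter_m, albedo=0.1, range_m=36_000e3, phase_deg=0.0):
    """Diffuse-sphere model: m = m_sun - 2.5 log10(albedo * r^2 * p(phi) / d^2)."""
    phi = math.radians(phase_deg)
    p = (2.0 / (3.0 * math.pi)) * (math.sin(phi) + (math.pi - phi) * math.cos(phi))
    r = diameter_m / 2.0
    return M_SUN - 2.5 * math.log10(albedo * r**2 * p / range_m**2)

print(round(debris_magnitude(0.5), 1))   # ~17 mag: faint, but within reach of a 50 cm aperture
```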
Abstract:
Several lake ice phenology studies based on satellite data have been undertaken. However, long-term records of lake freeze-thaw cycles, required to understand this proxy for climate variability and change, are scarce for European lakes. Long time series from space observations are limited to a few satellite sensors. Data from the Advanced Very High Resolution Radiometer (AVHRR) are used on account of their unique potential: they offer daily global coverage from the early 1980s and are expected to continue until 2022. An automatic two-step extraction was developed, which makes use of near-infrared reflectance values and thermal-infrared-derived lake surface water temperatures to extract lake ice phenology dates. In contrast to other studies utilizing thermal infrared, the thresholds are derived from the data themselves, making it unnecessary to define arbitrary or lake-specific thresholds. Two lakes in the Baltic region and a steppe lake on the Austrian–Hungarian border were selected. The latter was used to test the applicability of the approach to another climatic region for the period 1990 to 2012. A comparison of the extracted event dates with in situ data showed good agreement, with a mean absolute error of about 10 days. The two-step extraction was found to be applicable to European lakes in different climate regions and could fill existing data gaps in future applications. Extending the time series to the full AVHRR record length (early 1980s until today), which is adequate for trend estimation, would be of interest for assessing climate variability and change. Furthermore, the two-step extraction itself is not sensor-specific and could be applied to other sensors with equivalent near- and thermal-infrared spectral bands.
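A minimal sketch of a data-driven two-step extraction in the spirit of the method above (the percentile-based threshold, the near-0 °C LST criterion and the 7-day smoothing window are illustrative assumptions, not the published algorithm):

```python
# Simplified two-step lake ice detection: a NIR threshold derived from the time
# series itself separates ice from open water, and LST confirms the decision.
import numpy as np
import pandas as pd

def derive_threshold(nir):
    """Midpoint between the winter (ice) and summer (water) reflectance clusters."""
    ice, water = np.nanpercentile(nir, 90), np.nanpercentile(nir, 10)
    return 0.5 * (ice + water)

def ice_phenology(dates, nir, lst_celsius):
    df = pd.DataFrame({"nir": nir, "lst": lst_celsius}, index=pd.DatetimeIndex(dates))
    thr = derive_threshold(df["nir"].values)
    # step 1: NIR indicates ice cover; step 2: LST must be consistent with ice
    ice = (df["nir"] > thr) & (df["lst"] < 1.0)
    ice = ice.astype(float).rolling("7D").mean() > 0.5   # damp single-day misclassifications
    freeze_up = ice[ice].index.min()
    break_up = ice[ice].index.max()
    return freeze_up, break_up
```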
Abstract:
Time series of satellite measurements are used to describe patterns of surface temperature and chlorophyll associated with the 1996 cold La Nina phase and the 1997-1998 warm El Nino phase of the El Nino-Southern Oscillation cycle in the upwelling region off northern Chile. Surface temperature data are available throughout the entire study period. Sea-viewing Wide Field-of-view Sensor (SeaWiFS) data first became available in September 1997, during a relaxation in El Nino conditions identified by in situ hydrographic data. Over the period of coincident satellite data, chlorophyll patterns closely track surface temperature patterns. Increases both in nearshore chlorophyll concentration and in the cross-shelf extension of elevated concentrations are associated with decreased coastal temperatures during both the relaxation in El Nino conditions in September-November 1997 and the recovery from El Nino conditions after March 1998. Between these two periods, during austral summer (December 1997 to March 1998) and maximum El Nino temperature anomalies, temperature patterns normally associated with upwelling were absent and chlorophyll concentrations were minimal. Cross-shelf chlorophyll distributions appear to be modulated by surface temperature frontal zones and are positively correlated with a satellite-derived upwelling index. Frontal zone patterns and the upwelling index in 1996 imply an austral summer nearshore chlorophyll maximum, consistent with SeaWiFS data from 1998-1999, after the El Nino. SeaWiFS retrievals in the data set used here are higher than in situ measurements by a factor of 2-4; however, the consistency of the offset suggests that the relative patterns are valid.
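A minimal sketch of how the reported chlorophyll-upwelling relationship could be quantified (hypothetical input arrays; the log transform is a common convention for ocean-colour data, not a detail stated in the abstract):

```python
# Pearson correlation between log-transformed chlorophyll and an upwelling index.
import numpy as np

def chl_upwelling_correlation(chl, upwelling_index):
    """Correlate log10(chlorophyll) with a satellite-derived upwelling index."""
    valid = np.isfinite(chl) & np.isfinite(upwelling_index) & (chl > 0)
    return np.corrcoef(np.log10(chl[valid]), upwelling_index[valid])[0, 1]
```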
Abstract:
A wide variety of spatial data collection efforts are ongoing throughout local, state and federal agencies, private firms and non-profit organizations. Each effort is established for a different purpose, but organizations and individuals often collect and maintain the same or similar information. The United States federal government has undertaken many initiatives, such as the National Spatial Data Infrastructure, the National Map and Geospatial One-Stop, to reduce duplicative spatial data collection and promote the coordinated use, sharing, and dissemination of spatial data nationwide. A key premise in most of these initiatives is that no national government will be able to gather and maintain more than a small percentage of the geographic data that users want. Thus, national initiatives typically depend on the cooperation of those already gathering spatial data and those using GIS to meet specific needs to help construct and maintain these spatial data infrastructures and geo-libraries for their nations (Onsrud 2001). Some of the impediments to widespread spatial data sharing are well known from directly asking GIS data producers why they are not currently involved in creating datasets in common or compatible formats, documenting their datasets in a standardized metadata format, or making their datasets more readily available to others through Data Clearinghouses or geo-libraries. The research described in this thesis addresses the impediments to wide-scale spatial data sharing faced by GIS data producers and explores a new conceptual data-sharing approach, the Public Commons for Geospatial Data, that supports user-friendly metadata creation, open access licenses, archival services and documentation of the parent lineage of the contributors and value-adders of digital spatial data sets.
Abstract:
Clinical Research Data Quality Literature Review and Pooled Analysis: We present a literature review and secondary analysis of data accuracy in clinical research and related secondary data uses. A total of 93 papers meeting our inclusion criteria were categorized according to the data processing methods. Quantitative data accuracy information was abstracted from the articles and pooled. Our analysis demonstrates that the accuracy associated with data processing methods varies widely, with error rates ranging from 2 to 5019 errors per 10,000 fields. Medical record abstraction was associated with the highest error rates (70-5019 errors per 10,000 fields). Data entered and processed at healthcare facilities had error rates comparable to data processed at central data processing centers. Error rates for data processed with single entry in the presence of on-screen checks were comparable to those for double-entered data. While data processing and cleaning methods may explain a significant amount of the variability in data accuracy, additional factors not resolvable here likely exist.

Defining Data Quality for Clinical Research: A Concept Analysis: Despite notable previous attempts by experts to define data quality, the concept remains ambiguous and subject to the vagaries of natural language. This lack of clarity continues to hamper research related to data quality issues. We present a formal concept analysis of data quality, which builds on and synthesizes previously published work. We further posit that discipline-level specificity may be required to achieve the desired definitional clarity. To this end, we combine work from the clinical research domain with findings from the general data quality literature to produce a discipline-specific definition and operationalization of data quality in clinical research. While the results are helpful to clinical research, the methodology of concept analysis may be useful in other fields to clarify data quality attributes and to achieve operational definitions.

Medical Record Abstractor's Perceptions of Factors Impacting the Accuracy of Abstracted Data: Medical record abstraction (MRA) is known to be a significant source of data errors in secondary data uses. Factors impacting the accuracy of abstracted data are not reported consistently in the literature. Two Delphi processes were conducted with experienced medical record abstractors to assess abstractors' perceptions of these factors. The Delphi process identified 9 factors that were not found in the literature and differed from the literature on 5 factors in the top 25%. The Delphi results refuted seven factors reported in the literature as impacting the quality of abstracted data. The results provide insight into, and indicate content validity of, a significant number of the factors reported in the literature. Further, the results indicate general consistency between the perceptions of clinical research medical record abstractors and those of registry and quality improvement abstractors.

Distributed Cognition Artifacts on Clinical Research Data Collection Forms: Medical record abstraction, a primary mode of data collection in secondary data use, is associated with high error rates. Distributed cognition in medical record abstraction has not been studied as a possible explanation for abstraction errors. We employed the theory of distributed representation and representational analysis to systematically evaluate cognitive demands in medical record abstraction and the extent of external cognitive support employed in a sample of clinical research data collection forms. We show that the cognitive load required for abstraction was high for 61% of the sampled data elements and exceedingly high for 9%. Further, the data collection forms did not support external cognition for the most complex data elements. High working memory demands are a possible explanation for the association of data errors with data elements requiring abstractor interpretation, comparison, mapping or calculation. The representational analysis used here can be used to identify data elements with high cognitive demands.
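A small worked example of the error-rate unit used in the pooled analysis above (the study counts are invented purely for illustration):

```python
# Errors per 10,000 fields, pooled across studies by summing errors and fields
# before normalizing.
def errors_per_10k(n_errors, n_fields):
    return 10_000 * n_errors / n_fields

studies = [(120, 250_000), (34, 80_000)]   # (errors found, fields inspected) per study
pooled = errors_per_10k(sum(e for e, _ in studies), sum(f for _, f in studies))
print(round(pooled, 1))   # ~4.7 errors per 10,000 fields
```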
Abstract:
We re-evaluate the Greenland mass balance for the recent period using low-pass Independent Component Analysis (ICA) post-processing of the Level-2 GRACE data (2002-2010) from different official providers (UTCSR, JPL, GFZ) and confirm the substantial present-day ice mass loss of this ice sheet, in the range of -70 to -90 Gt/yr, due to negative contributions of the glaciers on the east coast. We highlight the high interannual variability of mass variations of the Greenland Ice Sheet (GrIS), especially the recent deceleration of ice loss in 2009-2010, once seasonal cycles are robustly removed by Seasonal Trend Loess (STL) decomposition. Interannual variability leads to trend estimates that vary with the considered time span. Correction of post-glacial rebound effects on ice mass trend estimates represents no more than 8 Gt/yr over the whole ice sheet. We also investigate possible climatic causes that can explain these interannual ice mass variations, as strong correlations between the GRACE-based mass balance and atmosphere/ocean parameters are established: (1) changes in snow accumulation, and (2) the influence of inputs of warm ocean water that periodically accelerate the calving of glaciers in coastal regions, together with the feedback effect of coastal water cooling by fresh meltwater from the glaciers. These results suggest that the Greenland mass balance is driven by coastal sea surface temperature at time scales shorter than those of accumulation.
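As a hedged sketch of the seasonal-cycle removal described above (a synthetic monthly series with made-up amplitudes; the study's actual GRACE processing is more involved), STL from statsmodels can strip the seasonal signal before an interannual trend is fitted:

```python
# Remove the seasonal cycle from a GRACE-like mass series with STL, then fit a trend.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL

rng = np.random.default_rng(0)
t = pd.date_range("2002-04", "2010-12", freq="MS")
mass = (-80.0 / 12) * np.arange(len(t)) \
       + 30 * np.sin(2 * np.pi * np.arange(len(t)) / 12) \
       + rng.normal(0, 10, len(t))                       # Gt: trend + seasonal + noise
series = pd.Series(mass, index=t)

res = STL(series, period=12, robust=True).fit()          # trend / seasonal / residual
deseasonalized = series - res.seasonal
years = (t - t[0]).days / 365.25
trend_gt_per_yr = np.polyfit(years, deseasonalized.values, 1)[0]
print(round(trend_gt_per_yr, 1))                         # close to the -80 Gt/yr built in
```

Because the trend is fitted to the deseasonalized series, shortening or shifting the time span changes the estimate, which is exactly the interannual sensitivity the abstract points out.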
Abstract:
The Wadden Sea is located in the southeastern part of the North Sea, forming an extended intertidal area along the Dutch, German and Danish coasts. It is a highly dynamic and largely natural ecosystem influenced by climatic changes and anthropogenic use of the North Sea. Changes in the environment of the Wadden Sea, whether of natural or anthropogenic origin, cannot be monitored by standard measurement methods alone, because large-area surveys of the intertidal flats are often difficult due to tides, tidal channels and unstable ground. For this reason, remote sensing offers effective monitoring tools. In this study a multi-sensor concept for the classification of intertidal areas in the Wadden Sea has been developed. The basis for this method is a combined analysis of RapidEye (RE) and TerraSAR-X (TSX) satellite data coupled with ancillary vector data on the distribution of vegetation, mussel beds and sediments. The classification of vegetation and mussel beds is based on a decision tree and a set of hierarchically structured algorithms which use object and texture features. The sediments are classified by an algorithm which uses thresholds and a majority filter. Further improvements focus on radiometric enhancement and atmospheric correction. First results show that vegetation and mussel beds can be identified with multi-sensor remote sensing. The classification of the sediments in the tidal flats remains a challenge compared to vegetation and mussel beds. The results demonstrate that the sediments cannot be classified with high accuracy by their spectral properties alone, due to their similarity, which is predominantly caused by their water content.
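A minimal sketch of the majority-filter step mentioned above (a generic scipy implementation with an invented two-class raster, not the operational processing chain):

```python
# Majority filter for a thematic sediment raster: removes isolated, spurious pixels
# left over from the threshold-based classification.
import numpy as np
from scipy import ndimage

def majority_filter(class_map, size=3):
    """Replace each pixel by the most frequent class label in its size x size window."""
    def majority(values):
        counts = np.bincount(values.astype(int))
        return np.argmax(counts)
    return ndimage.generic_filter(class_map, majority, size=size, mode="nearest")

# example: a lone 'mud' pixel (2) inside a 'sand' patch (1) is reassigned to sand
raster = np.ones((5, 5), dtype=int)
raster[2, 2] = 2
print(majority_filter(raster)[2, 2])   # -> 1
```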
Abstract:
In recent years, profiling floats, which form the basis of the successful international Argo observatory, have also been considered as platforms for marine biogeochemical research. This study showcases the utility of floats as a novel tool for combined gas measurements of CO2 partial pressure (pCO2) and O2. These float prototypes were equipped with a small, submersible pCO2 sensor and an optode O2 sensor for high-resolution measurements in the surface ocean layer. Four consecutive deployments were carried out between November 2010 and June 2011 near the Cape Verde Ocean Observatory (CVOO) in the eastern tropical North Atlantic. The profiling float performed upcasts every 31 h while measuring pCO2, O2, salinity, temperature, and hydrostatic pressure in the upper 200 m of the water column. To maintain accuracy, regular pCO2 sensor zeroings at depth and at the surface, as well as optode measurements in air, were performed for each profile. Through the application of data processing procedures (e.g., time-lag correction), the accuracy of float-borne pCO2 measurements was greatly improved (10-15 µatm for the water column and 5 µatm for surface measurements). O2 measurements yielded an accuracy of 2 µmol/kg. First results of this pilot study show the possibility of using profiling floats as a platform for detailed and unattended observations of marine carbon and oxygen cycle dynamics.
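As a hedged sketch of a generic first-order time-lag correction of the kind referred to above (the response time tau is an assumed example value, not the sensor's calibrated constant):

```python
# First-order response-time correction: a slow membrane-based pCO2 sensor smears
# sharp vertical gradients; inverting the sensor model re-sharpens the profile.
import numpy as np

def lag_correct(signal, time_s, tau_s=60.0):
    """Invert dx_meas/dt = (x_true - x_meas) / tau  =>  x_true = x_meas + tau * dx_meas/dt."""
    dxdt = np.gradient(signal, time_s)
    return signal + tau_s * dxdt

# in practice the corrected signal is usually smoothed afterwards, because the
# time derivative amplifies sensor noise
```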
Abstract:
The ground surface temperature is one of the key parameters that determine the thermal regime of permafrost soils in arctic regions. Due to the remoteness of most permafrost areas, monitoring of the land surface temperature (LST) through remote sensing is desirable. However, suitable satellite platforms such as MODIS provide spatial resolutions that cannot resolve the considerable small-scale heterogeneity of surface conditions characteristic of many permafrost areas. This study investigates the spatial variability of summer surface temperatures of high-arctic tundra on Svalbard, Norway. A thermal imaging system mounted on a mast facilitates continuous monitoring of approximately 100 x 100 m of tundra with a wide variety of surface covers and soil moisture conditions over the entire summer season, from snow melt until fall. The net radiation is found to be a controlling parameter for the differences in surface temperature between wet and dry areas. Under clear-sky conditions in July, the differences in surface temperature between wet and dry areas reach up to 10 K. The spatial differences reduce strongly in weekly averages of the surface temperature, which are relevant for the soil temperature evolution of deeper layers. Nevertheless, a considerable variability remains, with maximum differences between wet and dry areas of 3 to 4 K. Furthermore, the pattern of snow patches and snow-free areas during snow melt in July causes even greater differences of more than 10 K in the weekly averages. Towards the end of the summer season, the differences in surface temperature gradually diminish. Due to the pronounced spatial variability in July, the accumulated degree-day totals of the snow-free period can differ by more than 60% across the study area. The terrestrial observations from the thermal imaging system are compared to measurements of the land surface temperature from the MODIS sensor. During periods with frequent clear-sky conditions, and thus a high density of satellite data, weekly averages calculated from the thermal imaging system and from MODIS LST agree within less than 2 K. Larger deviations occur when prolonged cloudy periods prevent satellite measurements. Furthermore, the employed MODIS L2 LST data set contains a number of strongly biased measurements, which suggest an admixing of cloud-top temperatures. We conclude that a reliable gap-filling procedure to moderate the impact of prolonged cloudy periods would be of high value for a future LST-based permafrost monitoring scheme. The occurrence of sustained subpixel variability of the summer surface temperature is a complicating factor whose impact needs to be assessed further in conjunction with other spatially variable parameters, such as snow cover and soil properties.
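A minimal sketch of the weekly averaging and snow-free degree-day accounting described above (hypothetical pandas series; the daily aggregation and the 0 °C base are illustrative assumptions):

```python
# Weekly means (as compared between the mast camera and MODIS) and degree-day totals
# over the snow-free period, computed from a surface-temperature time series.
import pandas as pd

def weekly_means(lst_celsius: pd.Series) -> pd.Series:
    """Weekly mean surface temperature from a datetime-indexed series."""
    return lst_celsius.resample("7D").mean()

def degree_day_total(lst_celsius: pd.Series, snow_free: pd.Series) -> float:
    """Sum of daily mean temperatures above 0 degC over the snow-free period."""
    daily = lst_celsius[snow_free].resample("1D").mean()
    return daily.clip(lower=0).sum()
```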