Abstract:
In 2005, the International Ocean Colour Coordinating Group (IOCCG) convened a working group to examine the state of the art in ocean colour data merging, which showed that the research techniques had matured sufficiently for creating long multi-sensor datasets (IOCCG, 2007). As a result, ESA initiated and funded the DUE GlobColour project (http://www.globcolour.info/) to develop a satellite-based ocean colour data set to support global carbon-cycle research. It aims to satisfy the scientific requirement for a long (10+ year) time series of consistently calibrated global ocean colour information with the best possible spatial coverage. This has been achieved by merging data from the three most capable sensors: SeaWiFS on GeoEye's Orbview-2 mission, MODIS on NASA's Aqua mission and MERIS on ESA's ENVISAT mission. In setting up the GlobColour project, three user organisations were invited to help. Their roles are to specify the detailed user requirements, to act as a channel to the broader end-user community, and to provide feedback and assessment of the results. The International Ocean Carbon Coordination Project (IOCCP), based at UNESCO in Paris, provides direct access to the carbon-cycle modelling community's requirements and to the modellers themselves who will use the final products. The UK Met Office's National Centre for Ocean Forecasting (NCOF) in Exeter, UK, provides an understanding of the requirements of oceanography users, and the IOCCG brings its understanding of global user needs and valuable advice on best practice within the ocean colour science community. The three-year project kicked off in November 2005 under the leadership of ACRI-ST (France). The first year was a feasibility demonstration phase that was successfully concluded at a user consultation workshop organised by the Laboratoire d'Océanographie de Villefranche, France, in December 2006. Error statistics and inter-sensor biases were quantified by comparison with in situ measurements from moored optical buoys and ship-based campaigns, and used as an input to the merging. The second year was dedicated to the production of the time series. In total, more than 25 TB of input (level 2) data have been ingested and 14 TB of intermediate and output products created, with 4 TB of data distributed to the user community. Quality control (QC) is provided through the Diagnostic Data Sets (DDS), which are extracted sub-areas covering locations of in-situ data collection or interesting oceanographic phenomena. The Full Product Set (FPS) covers global daily merged ocean colour products for the period 1997-2006 and is freely available to the worldwide science community at http://www.globcolour.info/data_access_full_prod_set.html. The GlobColour service distributes global daily, 8-day and monthly data sets at 4.6 km resolution for chlorophyll-a concentration, normalised water-leaving radiances (412, 443, 490, 510, 531, 555, 620, 670, 681 and 709 nm), diffuse attenuation coefficient, coloured dissolved and detrital organic materials, total suspended matter or particulate backscattering coefficient, turbidity index, cloud fraction and quality indicators. Error statistics from the initial sensor characterisation are used as an input to the merging methods and propagate through the merging process to provide error estimates for the output merged products.
These error estimates are a key component of GlobColour, as they are invaluable to the users, particularly the modellers who need them in order to assimilate the ocean colour data into ocean simulations. An intensive phase of validation has been undertaken to assess the quality of the data set. In addition, inter-comparisons between the different merged datasets will help in further refining the techniques used. Both the final products and the quality assessment were presented at a second user consultation in Oslo on 20-22 November 2007, organised by the Norwegian Institute for Water Research (NIVA); presentations are available on the GlobColour WWW site. At the request of the ESA Technical Officer for the GlobColour project, the FPS data set was mirrored in the PANGAEA data library.
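The abstract does not specify the merging algorithm itself, only that per-sensor error statistics feed the merging and propagate into error estimates for the merged products. The following is a minimal sketch of one common approach (inverse-variance weighting) under the assumption that each sensor's error estimate is available as a standard deviation; the function name and the example numbers are hypothetical, not GlobColour's actual method.

```python
import numpy as np

def merge_inverse_variance(values, sigmas):
    """Merge per-sensor retrievals (e.g. chlorophyll-a for one grid cell)
    using inverse-variance weights, and propagate the per-sensor error
    estimates into an error estimate for the merged value.

    values : retrievals from the contributing sensors
    sigmas : per-sensor error estimates (standard deviations)
    """
    values = np.asarray(values, dtype=float)
    sigmas = np.asarray(sigmas, dtype=float)
    weights = 1.0 / sigmas**2                        # sensors with small errors dominate
    merged = np.sum(weights * values) / np.sum(weights)
    merged_sigma = np.sqrt(1.0 / np.sum(weights))    # propagated uncertainty of the merge
    return merged, merged_sigma

# Hypothetical example: chlorophyll-a (mg m-3) from SeaWiFS, MODIS and MERIS for one pixel
chl, chl_err = merge_inverse_variance([0.21, 0.25, 0.23], [0.04, 0.05, 0.03])
print(f"merged chl-a = {chl:.3f} +/- {chl_err:.3f} mg m-3")
```

The point of the sketch is the propagation step: because the weights are built from the sensor error statistics, the merged product automatically carries an uncertainty estimate, which is what the modelling users need for assimilation.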
Abstract:
The Imbrie and Kipp transfer function method (IKM) and the modern analog technique (MAT) are accepted tools for quantitative paleoenvironmental reconstructions. However, no simple, flexible software has been available to apply these methods on modern computers. For this reason the software packages PaleoToolBox, MacTransfer, WinTransfer, MacMAT, and PanPlot have been developed. The PaleoToolBox package provides a flexible tool for the preprocessing of microfossil reference and downcore data as well as hydrographic reference parameters. It includes procedures to randomize the raw databases; to switch specific species in or out of the total species list; to establish individual ranking systems and apply them to the reference and downcore databases; and to convert the prepared databases into the file formats of the IKM and MAT software for estimation of paleohydrographic parameters.
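For readers unfamiliar with MAT, the sketch below illustrates the underlying technique, not the PaleoToolBox implementation: dissimilarity between a downcore assemblage and each modern reference assemblage is measured (here with the squared chord distance), and the paleo-parameter is estimated as a distance-weighted mean over the closest analogs. Function names and data are hypothetical, and species counts are assumed to be given as relative abundances.

```python
import numpy as np

def squared_chord_distance(a, b):
    """Dissimilarity between two assemblages given as relative abundances."""
    return np.sum((np.sqrt(a) - np.sqrt(b)) ** 2)

def mat_estimate(downcore_sample, reference_assemblages, reference_sst, k=5):
    """Estimate a hydrographic parameter (e.g. SST) for a downcore sample by
    distance-weighted averaging over the k most similar modern analogs."""
    dists = np.array([squared_chord_distance(downcore_sample, r)
                      for r in reference_assemblages])
    nearest = np.argsort(dists)[:k]
    weights = 1.0 / np.maximum(dists[nearest], 1e-6)   # closer analogs weigh more
    return np.sum(weights * reference_sst[nearest]) / np.sum(weights)

# Hypothetical data: 4 species, 6 modern reference samples with known SSTs (deg C)
reference = np.array([[0.4, 0.3, 0.2, 0.1],
                      [0.5, 0.2, 0.2, 0.1],
                      [0.1, 0.2, 0.3, 0.4],
                      [0.2, 0.2, 0.3, 0.3],
                      [0.6, 0.2, 0.1, 0.1],
                      [0.1, 0.1, 0.4, 0.4]])
sst = np.array([18.0, 19.5, 8.0, 10.5, 21.0, 7.0])
print(mat_estimate(np.array([0.45, 0.25, 0.2, 0.1]), reference, sst, k=3))
```

The preprocessing steps the abstract lists (randomizing, switching species in or out, ranking) all operate on the reference and downcore abundance tables before an estimation step of this kind is run.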
Abstract:
This study presents a new Miocene biostratigraphic synthesis for the high-latitude northeastern North Atlantic region. Via correlations to the bio-magnetostratigraphy and oxygen isotope records of Ocean Drilling Program and Deep Sea Drilling Project sites, the ages of shallower North Sea deposits have been better constrained. The result is improved precision and documentation of the age designations of the existing North Sea foraminiferal zonal boundaries of King (1989) and Gradstein and Bäckström (1996). All calibrations have been updated to the Astronomically Tuned Neogene Time Scale (ATNTS) of Lourens et al. (2004). This improved Miocene biozonation has been achieved through the updating of age calibrations for key microfossil bioevents, the identification of new events, and the integration of new biostratigraphic data from a foraminiferal analysis of commercial wells in the North Sea and Norwegian Sea. The new zonation has been successfully applied to two commercial wells and an onshore research borehole. At these high latitudes, where standard zonal markers are often absent, integration of microfossil groups significantly improves temporal resolution. The new zonation comprises 11 Nordic Miocene (NM) zones with an average duration of 1 to 2 million years. This multi-group combination of a total of 92 bioevents (70 foraminifers and bolboformids, 16 dinoflagellate cysts and acritarchs, and 6 marine diatoms) facilitates zonal identification throughout the Nordic Atlantic region. With the highest proportion of events being of calcareous-walled microfossils, this zonation is primarily suited to micropaleontologists. Correlation of this Miocene biostratigraphy with a re-calibrated oxygen isotope record for DSDP Site 608 suggests a strong link between Miocene planktonic microfossil turnover rates and the inferred paleoclimatic trends. Benthic foraminiferal zonal boundaries often appear to coincide with Miocene global sequence boundaries. The biostratigraphic record is punctuated by four main stratigraphic hiatuses, which vary in their geographic and temporal extent. These are related to the following regional unconformities: the basal Neogene, Lower/Middle Miocene ("mid-Miocene unconformity"), basal Upper Miocene and basal Messinian unconformities. Further coring of Neogene sections in the North Sea and Norwegian Sea may better constrain their extent and their effect on the biostratigraphic record.
Abstract:
Coastal communities around the world face increasing risk from flooding as a result of rising sea level, increasing storminess, and land subsidence. Salt marshes can act as natural buffer zones, providing protection from waves during storms. However, the effectiveness of marshes in protecting the coastline during extreme events, when water levels and waves are highest, is poorly understood. Here, we experimentally assess wave dissipation under storm surge conditions in a 300-m-long wave flume that contains a transplanted section of natural salt marsh. We find that the presence of marsh vegetation causes considerable wave attenuation, even when water levels and waves are high. From a comparison with experiments without vegetation, we estimate that up to 60% of observed wave reduction is attributable to vegetation. We also find that although waves progressively flatten and break vegetation stems and thereby reduce dissipation, the marsh substrate remains remarkably stable and resistant to surface erosion under all conditions. The effectiveness of storm wave dissipation and the resilience of tidal marshes even under extreme conditions suggest that salt marsh ecosystems can be a valuable component of coastal protection schemes.
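The comparison logic behind the "up to 60%" figure can be illustrated with a short, entirely hypothetical calculation: the same wave run is repeated with and without vegetation, and the vegetation share is the part of the total wave-height reduction that disappears when the vegetation is removed. The numbers below are illustrative only, not the study's measurements.

```python
# Minimal sketch (hypothetical numbers) of attributing wave-height reduction to
# vegetation by comparing paired flume runs with and without vegetation.
def wave_reduction(h_in, h_out):
    """Fractional reduction in wave height over the test section."""
    return (h_in - h_out) / h_in

h_in = 0.60               # incident wave height (m), identical for both runs
h_out_vegetated = 0.38    # wave height after the vegetated marsh section
h_out_bare = 0.51         # wave height after the same section with vegetation removed

total = wave_reduction(h_in, h_out_vegetated)   # reduction over the vegetated marsh
bare = wave_reduction(h_in, h_out_bare)         # reduction over the bare substrate
vegetation_share = (total - bare) / total       # fraction of the reduction due to vegetation

print(f"total reduction: {total:.0%}, attributable to vegetation: {vegetation_share:.0%}")
```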
Abstract:
To deliver sample estimates with the necessary probability foundation to permit generalization from the sample data subset to the whole target population being sampled, probability sampling strategies are required to satisfy three necessary but not sufficient conditions: (i) all inclusion probabilities must be greater than zero in the target population to be sampled. If some sampling units have an inclusion probability of zero, then a map accuracy assessment does not represent the entire target region depicted in the map to be assessed. (ii) The inclusion probabilities must be (a) knowable for nonsampled units and (b) known for those units selected in the sample: since the inclusion probability determines the weight attached to each sampling unit in the accuracy estimation formulas, if the inclusion probabilities are unknown, so are the estimation weights. This original work presents a novel (to the best of these authors' knowledge, the first) probability sampling protocol for quality assessment and comparison of thematic maps generated from spaceborne/airborne Very High Resolution (VHR) images, where: (I) an original Categorical Variable Pair Similarity Index (CVPSI, proposed in two different formulations) is estimated as a fuzzy degree of match between a reference and a test semantic vocabulary, which may not coincide, and (II) both symbolic pixel-based thematic quality indicators (TQIs) and sub-symbolic object-based spatial quality indicators (SQIs) are estimated with a degree of uncertainty in measurement, in compliance with the well-known Quality Assurance Framework for Earth Observation (QA4EO) guidelines. Like a decision tree, any protocol (guidelines for best practice) comprises a set of rules, equivalent to structural knowledge, and an order of presentation of the rule set, known as procedural knowledge. The combination of these two levels of knowledge makes an original protocol worth more than the sum of its parts. The several degrees of novelty of the proposed probability sampling protocol are highlighted in this paper, at the levels of understanding of both structural and procedural knowledge, in comparison with related multi-disciplinary works selected from the existing literature. In the experimental session the proposed protocol is tested for accuracy validation of preliminary classification maps automatically generated by the Satellite Image Automatic Mapper™ (SIAM™) software product from two WorldView-2 images and one QuickBird-2 image provided by DigitalGlobe for testing purposes. In these experiments, collected TQIs and SQIs are statistically valid, statistically significant, consistent across maps, and in agreement with theoretical expectations, visual (qualitative) evidence and quantitative quality indexes of operativeness (OQIs) claimed for SIAM™ by related papers. As a subsidiary conclusion, the statistically consistent and statistically significant accuracy validation of the SIAM™ pre-classification maps proposed in this contribution, together with the OQIs claimed for SIAM™ by related works, makes the operational (automatic, accurate, near real-time, robust, scalable) SIAM™ software product eligible for opening up new inter-disciplinary research and market opportunities in accordance with the visionary goal of the Global Earth Observation System of Systems (GEOSS) initiative and the QA4EO international guidelines.
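The role of conditions (i) and (ii) can be made concrete with a small sketch: in design-based accuracy estimation, each sampled unit is weighted by the inverse of its inclusion probability (a Horvitz-Thompson style estimator), so a zero or unknown inclusion probability leaves the estimate undefined. This is a generic illustration, not the protocol proposed in the paper; the function name and the sample data are hypothetical.

```python
import numpy as np

def weighted_overall_accuracy(correct, inclusion_prob):
    """Estimate overall map accuracy from a probability sample.

    correct        : 1 if the map label of the sampled unit matches the reference, else 0
    inclusion_prob : inclusion probability of each sampled unit (must be > 0 and known)
    """
    correct = np.asarray(correct, dtype=float)
    pi = np.asarray(inclusion_prob, dtype=float)
    if np.any(pi <= 0):
        raise ValueError("all inclusion probabilities must be greater than zero")
    weights = 1.0 / pi            # unknown pi would mean unknown estimation weights
    return np.sum(weights * correct) / np.sum(weights)

# Hypothetical stratified sample in which rare classes were over-sampled (small pi)
correct = [1, 1, 0, 1, 0, 1, 1, 1]
pi      = [0.02, 0.02, 0.02, 0.10, 0.10, 0.10, 0.10, 0.10]
print(f"estimated overall accuracy: {weighted_overall_accuracy(correct, pi):.2f}")
```

Weighting by 1/pi prevents the over-sampled (rare) strata from biasing the accuracy estimate toward their own error rates, which is exactly why the protocol insists that inclusion probabilities be positive and known.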