4 results for statistical techniques
in Archimer: Archive de l'Institut français de recherche pour l'exploitation de la mer
Abstract:
Three sediment records of sea surface temperature (SST) are analyzed that originate from distant locations in the North Atlantic, have centennial-to-multicentennial resolution, are based on the same reconstruction method and chronological assumptions, and span the past 15 000 yr. Using recursive least squares techniques, an estimate of the time-dependent North Atlantic SST field over the last 15 kyr is sought that is consistent with both the SST records and a surface ocean circulation model, given estimates of their respective error (co)variances. Under the authors' assumptions about data and model errors, it is found that the 10 degrees C mixed layer isotherm, which approximately traces the modern Subpolar Front, would have moved by ~15 degrees of latitude southward (northward) in the eastern North Atlantic at the onset (termination) of the Younger Dryas cold interval (YD), a result significant at the level of two standard deviations in the isotherm position. In contrast, meridional movements of the isotherm in the Newfoundland basin are estimated to be small and not significant. Thus, the isotherm would have pivoted twice around a region southeast of the Grand Banks, with a southwest-northeast orientation during the warm intervals of the Bølling-Allerød and the Holocene and a more zonal orientation and southerly position during the cold interval of the YD. This study provides an assessment of the significance of similar previous inferences and illustrates the potential of recursive least squares in paleoceanography.
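The update step of a recursive least squares estimator of the kind described in this abstract can be sketched as follows. This is a generic Kalman-type update with toy dimensions and made-up numbers, not the authors' implementation or data; all variable names are illustrative.

```python
import numpy as np

def rls_update(x, P, y, H, R):
    """One recursive least squares (Kalman-type) update.

    x : (n,)   current estimate of the state (e.g. a gridded SST field)
    P : (n, n) error covariance of the current estimate
    y : (m,)   new observations (e.g. proxy-derived SST values)
    H : (m, n) observation operator mapping the state to the observations
    R : (m, m) observation error covariance
    """
    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # gain weighting data against model
    x_new = x + K @ (y - H @ x)           # corrected estimate
    P_new = (np.eye(len(x)) - K @ H) @ P  # reduced uncertainty
    return x_new, P_new

# Toy example: three grid points, one observation of the first point.
x = np.zeros(3)                  # prior state
P = np.eye(3) * 4.0              # prior variance of 4 (deg C)^2 per point
H = np.array([[1.0, 0.0, 0.0]])  # the observation samples grid point 0
R = np.array([[1.0]])            # observation error variance
x, P = rls_update(x, P, np.array([10.0]), H, R)
print(x, np.diag(P))
```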
Abstract:
Over the past several decades, thousands of otoliths, bivalve shells, and scales have been collected for the purposes of age determination and remain archived in European and North American fisheries laboratories. Advances in digital imaging and computer software combined with techniques developed by tree-ring scientists provide a means by which to extract additional levels of information from these calcified structures and generate annually resolved (one value per year), multidecadal time-series of population-level growth anomalies. Chemical and isotopic properties may also be extracted to provide additional information regarding the environmental conditions these organisms experienced. Given that they are exactly placed in time, chronologies can be directly compared to instrumental climate records, chronologies from other regions or species, or time-series of other biological phenomena. In this way, chronologies may be used to reconstruct historical ranges of environmental variability, identify climatic drivers of growth, establish linkages within and among species, and generate ecosystem-level indicators. Following the first workshop in Hamburg, Germany, in December 2014, the second workshop on Growth increment Chronologies in Marine Fish: climate-ecosystem interactions in the North Atlantic (WKGIC2) met at the Mediterranean Institute for Advanced Studies headquarters in Esporles, Spain, on 18–22 April 2016, chaired by Bryan Black (USA) and Christoph Stransky (Germany). Thirty-six participants from fifteen different countries attended. Objectives were to i) review the applications of chronologies developed from growth-increment widths in the hard parts (otoliths, shells, scales) of marine fish and bivalve species, ii) review the fundamentals of crossdating and chronology development, iii) discuss assumptions and limitations of these approaches, iv) measure otolith growth-increment widths in image analysis software, v) learn software to statistically check increment dating accuracy, vi) generate a growth-increment chronology and relate it to climate indices, and vii) initiate cooperative projects or training exercises to commence after the workshop. The workshop began with an overview of tree-ring techniques of chronology development, including a hands-on exercise in crossdating. Next, we discussed the applications of fish and bivalve biochronologies and the range of issues that could be addressed. We then reviewed key assumptions and limitations, especially those associated with short-lived species, for which there are numerous and extensive otolith archives in European fisheries labs. Next, participants were provided with images of European plaice otoliths from the North Sea and taught to measure increment widths in image analysis software. Upon completion of measurements, techniques of chronology development were discussed and contrasted with those that have been applied for long-lived species. Plaice growth time-series were then related to environmental variability using the KNMI Climate Explorer. Finally, potential future collaborations and funding opportunities were discussed, and there was a clear desire to meet again to compare various statistical techniques for chronology development using a range of existing fish, bivalve, and tree growth-increment datasets.
Overall, we hope to increase the use of these techniques, and over the long term, develop networks of biochronologies for integrative analyses of ecosystem functioning and relationships to long-term climate variability and fishing pressure.
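As a rough illustration of chronology development from growth increments, the sketch below standardizes each individual's increment-width series, averages the standardized indices per calendar year into a master chronology, and correlates it with a climate index. The data, the per-individual z-scoring (a simple stand-in for the detrending methods used in practice), and all names are illustrative, not workshop material.

```python
import numpy as np
import pandas as pd

# Hypothetical input: one row per increment, with otolith ID, calendar
# year of formation, and measured increment width (mm).
increments = pd.DataFrame({
    "otolith": ["A", "A", "A", "B", "B", "B", "C", "C", "C"],
    "year":    [2001, 2002, 2003, 2001, 2002, 2003, 2001, 2002, 2003],
    "width":   [0.42, 0.35, 0.50, 0.61, 0.48, 0.66, 0.38, 0.30, 0.45],
})

# Standardize each otolith's series so individuals with different overall
# growth rates contribute comparably (a crude stand-in for detrending).
increments["index"] = increments.groupby("otolith")["width"].transform(
    lambda w: (w - w.mean()) / w.std()
)

# Average the standardized indices per calendar year -> master chronology.
chronology = increments.groupby("year")["index"].mean()

# Relate the chronology to a climate index covering the same years
# (values below are made up for illustration).
climate = pd.Series([0.3, -0.8, 0.5], index=[2001, 2002, 2003])
r = np.corrcoef(chronology.values, climate.values)[0, 1]
print(chronology)
print(f"correlation with climate index: {r:.2f}")
```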
Abstract:
Statistical approaches to study extreme events require, by definition, long time series of data. In many scientific disciplines, these series are often subject to variations at different temporal scales that affect the frequency and intensity of their extremes. Therefore, the assumption of stationarity is violated and alternative methods to conventional stationary extreme value analysis (EVA) must be adopted. Using the example of environmental variables subject to climate change, in this study we introduce the transformed-stationary (TS) methodology for non-stationary EVA. This approach consists of (i) transforming a non-stationary time series into a stationary one, to which the stationary EVA theory can be applied, and (ii) reverse transforming the result into a non-stationary extreme value distribution. As a transformation, we propose and discuss a simple time-varying normalization of the signal and show that it enables a comprehensive formulation of non-stationary generalized extreme value (GEV) and generalized Pareto distribution (GPD) models with a constant shape parameter. A validation of the methodology is carried out on time series of significant wave height, residual water level, and river discharge, which show varying degrees of long-term and seasonal variability. The results from the proposed approach are comparable with the results from (a) a stationary EVA on quasi-stationary slices of non-stationary series and (b) the established method for non-stationary EVA. However, the proposed technique comes with advantages in both cases. For example, in contrast to (a), the proposed technique uses the whole time horizon of the series for the estimation of the extremes, allowing for a more accurate estimation of large return levels. Furthermore, with respect to (b), it decouples the detection of non-stationary patterns from the fitting of the extreme value distribution. As a result, the steps of the analysis are simplified and intermediate diagnostics are possible. In particular, the transformation can be carried out by means of simple statistical techniques such as low-pass filters based on the running mean and the standard deviation, and the fitting procedure is a stationary one with a few degrees of freedom and is easy to implement and control. An open-source MATLAB toolbox has been developed to cover this methodology, which is available at https://github.com/menta78/tsEva/ (Mentaschi et al., 2016).
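A minimal Python sketch of the transformed-stationary idea follows, assuming a synthetic daily series, a five-year running window, and annual maxima fitted with a GEV. The tsEva MATLAB toolbox cited above is the full implementation; every choice below (data, window length, return period) is an assumed illustration.

```python
import numpy as np
import pandas as pd
from scipy.stats import genextreme

# Synthetic daily series with a slow upward trend, standing in for
# e.g. significant wave height (illustrative, not the study's data).
rng = np.random.default_rng(0)
t = pd.date_range("1980-01-01", "2019-12-31", freq="D")
x = pd.Series(2.0 + 0.01 * np.arange(len(t)) / 365.25
              + rng.gumbel(0.0, 0.5, len(t)), index=t)

# (i) Transform to a quasi-stationary series using a running mean and
# running standard deviation (a 5-year window, an assumed choice).
win = 5 * 365
mu = x.rolling(win, center=True, min_periods=win // 2).mean()
sd = x.rolling(win, center=True, min_periods=win // 2).std()
y = ((x - mu) / sd).dropna()

# Stationary EVA: fit a GEV to the annual maxima of the normalized series.
ann_max = y.groupby(y.index.year).max()
shape, loc, scale = genextreme.fit(ann_max)

# (ii) Reverse transform: the stationary 100-year return level becomes
# time dependent through the running mean and standard deviation.
rl_stat = genextreme.ppf(1.0 - 1.0 / 100.0, shape, loc=loc, scale=scale)
rl_t = mu + sd * rl_stat
print(rl_t.dropna().iloc[[0, -1]])  # return level at start and end of record
```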
Abstract:
In order to optimize frontal detection in sea surface temperature fields at 4 km resolution, a combined statistical and expert-based approach is applied to test different levels of spatial smoothing of the data prior to the detection process. Fronts are usually detected at 1 km resolution using the histogram-based, single image edge detection (SIED) algorithm developed by Cayula and Cornillon in 1992, with a standard preliminary smoothing using a median filter and a 3 × 3 pixel kernel. Here, detections are performed in three study regions (off Morocco, the Mozambique Channel, and north-western Australia) and across the Indian Ocean basin using the combination of multiple windows (CMW) method developed by Nieto, Demarcq and McClatchie in 2012, which improves on the original Cayula and Cornillon algorithm. Detections at 4 km and 1 km resolution are compared. Fronts are divided into two intensity classes (“weak” and “strong”) according to their thermal gradient. A preliminary smoothing is applied prior to the detection using different convolutions: three types of filters (median, average and Gaussian) combined with four kernel sizes (3 × 3, 5 × 5, 7 × 7, and 9 × 9 pixels) and three detection window sizes (16 × 16, 24 × 24 and 32 × 32 pixels), to test the effect of these smoothing combinations on reducing the background noise of the data and therefore on improving the frontal detection. The performance of the combinations on 4 km data is evaluated using two criteria: detection efficiency and front length. We find that the optimal combination of preliminary smoothing parameters for enhancing detection efficiency and preserving front length includes a median filter, a 16 × 16 pixel window size, and a 5 × 5 pixel kernel for strong fronts and a 7 × 7 pixel kernel for weak fronts. Results show an improvement in detection performance (from largest to smallest window size) of 71% for strong fronts and 120% for weak fronts. Despite the small window used (16 × 16 pixels), the length of the fronts is preserved relative to that found with 1 km data. This optimal preliminary smoothing and the CMW detection algorithm on 4 km sea surface temperature data are then used to describe the spatial distribution of the monthly frequencies of occurrence of both strong and weak fronts across the Indian Ocean basin. In general, strong fronts are observed in coastal areas whereas weak fronts, with some seasonal exceptions, are mainly located in the open ocean. This study shows that adequate noise reduction through preliminary smoothing of the data considerably improves frontal detection efficiency as well as the overall quality of the results. Consequently, the use of 4 km data enables frontal detections similar to those obtained with 1 km data (using a standard median 3 × 3 convolution) in terms of detectability, length and location. The method is easily applicable to large regions or at the global scale using 4 km data, with far fewer constraints on data manipulation and processing time relative to 1 km data.
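The preliminary smoothing step can be illustrated with a short sketch that applies median, average and Gaussian filters of different kernel sizes to a noisy SST tile and then counts pixels above two gradient thresholds as a crude proxy for “strong” and “weak” fronts. The actual study uses the histogram-based SIED/CMW detection on satellite data, so the synthetic field, thresholds and filter settings below are only an assumed illustration.

```python
import numpy as np
from scipy import ndimage

# Synthetic 4 km SST tile: a meridional front plus noise (illustrative only).
rng = np.random.default_rng(1)
sst = (np.where(np.arange(128)[None, :] < 64, 18.0, 22.0)
       + rng.normal(0.0, 0.4, (128, 128)))

# Preliminary smoothing options of the kind tested in the study: median,
# average (uniform) and Gaussian filters with different kernel sizes.
smoothed = {
    "median_5x5": ndimage.median_filter(sst, size=5),
    "average_5x5": ndimage.uniform_filter(sst, size=5),
    "gaussian_7x7": ndimage.gaussian_filter(sst, sigma=1.0, truncate=3.0),
}

# Crude stand-in for frontal detection: thermal gradient magnitude per pixel,
# split by threshold into "strong" and "weak" front classes.
for name, field in smoothed.items():
    gy, gx = np.gradient(field)
    grad = np.hypot(gx, gy)
    strong = int((grad > 0.5).sum())
    weak = int(((grad > 0.2) & (grad <= 0.5)).sum())
    print(f"{name}: {strong} strong-front pixels, {weak} weak-front pixels")
```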