67 results for hydrometeorology, Penman-Monteith-FAO, kriging


Relevance:

10.00%

Publisher:

Abstract:

The Arctic is an important region in the study of climate change, but monitoring surface temperatures in this region is challenging, particularly in areas covered by sea ice. Here, in situ, satellite and reanalysis data were utilised to investigate whether global warming over recent decades could be better estimated by changing the way the Arctic is treated in calculating global mean temperature. The degree of difference arising from using five different techniques, based on existing temperature anomaly dataset techniques, to estimate Arctic surface air temperature (SAT) anomalies over land and sea ice was investigated using reanalysis data as a testbed. Techniques that interpolated anomalies were found to result in smaller errors than non-interpolating techniques, and kriging techniques provided the smallest errors in anomaly estimates. Similar accuracies were found for anomalies estimated from in situ meteorological station SAT records using a kriging technique. Whether additional data sources, which are not currently utilised in temperature anomaly datasets, would improve estimates of Arctic SAT anomalies was investigated within the reanalysis testbed and using in situ data. For the reanalysis study, the additional input anomalies were reanalysis data sampled at certain supplementary data source locations over Arctic land and sea ice areas. For the in situ data study, the additional input anomalies over sea ice were surface temperature anomalies derived from the Advanced Very High Resolution Radiometer (AVHRR) satellite instruments. The use of additional data sources, particularly those located in the Arctic Ocean over sea ice or on islands in sparsely observed regions, can lead to substantial improvements in the accuracy of estimated anomalies. Decreases in root mean square error can be up to 0.2 K for Arctic-average anomalies and more than 1 K for spatially resolved anomalies. Further improvements in accuracy may be accomplished through the use of other data sources.
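The kriging step that performed best in this study can be illustrated with a minimal ordinary-kriging sketch. Everything below (the exponential variogram model, its sill/range/nugget parameters, and the function names) is an illustrative assumption, not the datasets' actual configuration:

```python
import numpy as np

def exp_variogram(h, sill=1.0, rng=500.0, nugget=0.0):
    # Exponential variogram model: gamma(h) = nugget + sill * (1 - exp(-h / range))
    return nugget + sill * (1.0 - np.exp(-h / rng))

def ordinary_kriging(xy, z, xy0, sill=1.0, rng=500.0, nugget=0.0):
    """Estimate the anomaly at location xy0 from observed anomalies z at xy."""
    n = len(z)
    # Pairwise distances between all observation points
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    gamma = exp_variogram(d, sill, rng, nugget)
    # Augmented system enforces weights summing to 1 (unbiasedness constraint)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = exp_variogram(np.linalg.norm(xy - xy0, axis=1), sill, rng, nugget)
    w = np.linalg.solve(A, b)[:n]   # kriging weights
    return float(w @ z)
```

Because the weights sum to one, a field of constant anomalies is reproduced exactly, and with zero nugget the estimator interpolates the observations exactly; these are the properties that make kriging attractive for sparse Arctic station networks.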

Relevance:

10.00%

Publisher:

Abstract:

Weeds tend to aggregate in patches within fields, and there is evidence that this is partly owing to variation in soil properties. Because the processes driving soil heterogeneity operate at different scales, the strength of the relationships between soil properties and weed density would also be expected to be scale-dependent. Quantifying these effects of scale on weed patch dynamics is essential to guide the design of discrete sampling protocols for mapping weed distribution. We have developed a general method that uses novel within-field nested sampling and residual maximum likelihood (REML) estimation to explore scale-dependent relationships between weeds and soil properties. We have validated the method using a case study of Alopecurus myosuroides in winter wheat. Using REML, we partitioned the variance and covariance into scale-specific components and estimated the correlations between the weed counts and soil properties at each scale. We used variograms to quantify the spatial structure in the data and to map variables by kriging. Our methodology successfully captured the effect of scale on a number of edaphic drivers of weed patchiness. The overall Pearson correlations between A. myosuroides and soil organic matter and clay content were weak and masked the stronger correlations at scales greater than 50 m. Knowing how the variance was partitioned across the spatial scales, we optimized the sampling design to focus sampling effort at those scales that contributed most to the total variance. The methods have the potential to guide patch spraying of weeds by identifying areas of the field that are vulnerable to weed establishment.
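The variogram step mentioned above can be sketched with the classical (Matheron) empirical semivariance estimator. The binning scheme and function names below are illustrative assumptions, not the study's actual implementation:

```python
import numpy as np

def empirical_variogram(xy, z, bin_edges):
    """Classical semivariance estimator: for each distance bin, the mean of
    0.5 * (z_i - z_j)^2 over all point pairs whose separation falls in the bin."""
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    sq = 0.5 * (z[:, None] - z[None, :]) ** 2
    iu = np.triu_indices(len(z), k=1)   # each unordered pair counted once
    d, sq = d[iu], sq[iu]
    gamma, counts = [], []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        m = (d >= lo) & (d < hi)
        counts.append(int(m.sum()))
        gamma.append(float(sq[m].mean()) if m.any() else np.nan)
    return np.array(gamma), np.array(counts)
```

A fitted model (spherical, exponential, etc.) through these binned semivariances then supplies the spatial structure needed for kriging; a nested sampling design determines which distance bins are well populated with pairs.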

Relevance:

10.00%

Publisher:

Abstract:

Long-term monitoring of surface water quality has shown increasing concentrations of dissolved organic carbon (DOC) across a large part of the Northern Hemisphere. Several drivers have been implicated, including climate change, land management change, nitrogen and sulphur deposition, and CO2 enrichment. Analysis of stream water data, supported by evidence from laboratory studies, indicates that an effect of declining sulphur deposition on catchment soil chemistry is likely to be the primary mechanism, but there are relatively few long-term soil water chemistry records in the UK with which to investigate this, and other, hypotheses directly. In this paper, we assess temporal relationships between soil solution chemistry and parameters that have been argued to regulate DOC production, using a unique set of co-located measurements of weather, bulk deposition and soil solution chemistry provided by the UK Environmental Change Network and the Intensive Forest Monitoring Level II Network. We used statistical non-linear trend analysis to investigate these relationships at 5 forested and 4 non-forested sites from 1993 to 2011. Most trends in soil solution DOC concentration were found to be non-linear, and significant increases in DOC occurred mostly prior to 2005. The magnitude and sign of the trends were associated qualitatively with changes in acid deposition, the presence or absence of a forest canopy, soil depth and soil properties. The strongest increases in DOC were seen in acidic forest soils and were most clearly linked to declining anthropogenic acid deposition, while DOC trends at some sites with westerly locations appeared to have been influenced by shorter-term hydrological variation. The results indicate that the widespread DOC increases observed in surface waters elsewhere are most likely dominated by enhanced mobilization of DOC in surficial organic horizons, rather than by changes in the soil water chemistry of deeper horizons. While trends in DOC concentrations in surface horizons have flattened out in recent years, further increases may be expected as soil chemistry continues to adjust to declining inputs of acidity.
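The specific non-linear trend method is not detailed in this abstract; as a simpler, related illustration, detecting a monotonic trend in a chemistry time series is often done with a Mann-Kendall test. The sketch below (without the tie correction) is a generic example, not the method actually used in the paper:

```python
import math

def mann_kendall(x):
    """Mann-Kendall test for a monotonic trend (no tie correction).
    Returns the S statistic and its normal-approximation z score."""
    n = len(x)
    # S counts concordant minus discordant pairs over all i < j
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)   # continuity correction
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z
```

A |z| above 1.96 indicates a significant monotonic trend at the 5% level; trends that rise and then flatten, as reported here for DOC, are precisely where such monotonic tests fall short and smoother-based non-linear methods are preferred.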

Relevance:

10.00%

Publisher:

Abstract:

Ecological and biogeochemical processes in lakes are strongly dependent upon water temperature. Long-term surface warming of many lakes is unequivocal, but little is known about the comparative magnitude of temperature variation at diel timescales, due to a lack of appropriately resolved data. Here we quantify the pattern and magnitude of diel temperature variability of surface waters using high-frequency data from 100 lakes. We show that the near-surface diel temperature range can be substantial in summer relative to long-term change and, for lakes smaller than 3 km², increases sharply and predictably with decreasing lake area. Most small lakes included in this study experience average summer diel ranges in their near-surface temperatures of between 4 and 7°C. Large diel temperature fluctuations in the majority of lakes undoubtedly influence their structure, function and role in biogeochemical cycles, but the full implications remain largely unexplored.
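Computing the diel range from high-frequency records amounts to grouping the readings by calendar day and taking each day's maximum minus minimum. A minimal sketch, assuming timestamped readings (the function name is illustrative):

```python
from collections import defaultdict

def diel_ranges(times, temps):
    """Diel (daily) temperature range: for each calendar day, the maximum
    minus the minimum of all readings falling on that day.
    `times` are datetime objects; `temps` the matching temperatures."""
    by_day = defaultdict(list)
    for t, v in zip(times, temps):
        by_day[t.date()].append(v)
    return {day: max(vals) - min(vals) for day, vals in sorted(by_day.items())}
```

Averaging these daily ranges over the summer months for each lake, and comparing against lake surface area, would reproduce the kind of range-versus-area relationship described above.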

Relevance:

10.00%

Publisher:

Abstract:

Probabilistic hydro-meteorological forecasts have over the last decades been used more frequently to communicate forecast uncertainty. This uncertainty is twofold: it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic over deterministic forecasts across the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, owing partly to the difficulty of transforming the probability of occurrence of an event into a binary decision. This paper presents the results of a risk-based decision-making game on the topic of flood protection mitigation, called “How much are you prepared to pay for a forecast?”. The game was played at several workshops in 2015, which were attended by operational forecasters and academics working in the field of hydrometeorology. The aim of the game was to better understand the role of probabilistic forecasts in decision-making processes and their perceived value to decision-makers. Based on the participants’ willingness to pay for a forecast, the results of the game show that the value (or usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performance as decision-makers.
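The difficulty of turning an event probability into a binary decision is often framed with the classic cost-loss model: protecting always costs C, an unprotected occurrence costs L, so expected expense is minimised by protecting whenever the forecast probability exceeds C/L. A minimal sketch of such a threshold policy (parameter and function names are illustrative, not from the game itself):

```python
def optimal_threshold(cost, loss):
    """Probability threshold that minimises long-run expected expense:
    protect whenever the forecast probability exceeds cost / loss."""
    return cost / loss

def total_expense(probs, outcomes, cost, loss, threshold):
    """Total expense of a threshold policy over a series of forecasts:
    pay `cost` to protect when the forecast probability exceeds `threshold`;
    otherwise pay `loss` whenever the event actually occurs."""
    total = 0.0
    for p, occurred in zip(probs, outcomes):
        if p > threshold:
            total += cost      # protection taken
        elif occurred:
            total += loss      # caught unprotected
    return total
```

A user with a cost-loss ratio of 0.1, for instance, should act on any forecast probability above 10% under this idealised rule; the game's willingness-to-pay question probes how far real decision-makers' behaviour departs from it.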