114 results for Deterministic imputation
Examining the relationships between Holocene climate change, hydrology, and human society in Ireland
Abstract:
This thesis explores human-environment interactions during the Mid-Late Holocene in raised bogs in central Ireland. The raised bogs of central Ireland are widely recognised for their considerable palaeoenvironmental and archaeological resources: research over the past few decades has established the potential for such sites to preserve sensitive records of Holocene climatic variability expressed as changes in bog surface wetness (BSW); meanwhile, archaeological investigations over the past century have uncovered hundreds of peatland archaeological features dating from the Neolithic through to the Post-Medieval period, including wooden trackways, platforms, and deposits of high-status metalwork. Previous studies have attempted to explore the relationship between records of past environmental change and the occurrence of peatland archaeological sites, reaching varying conclusions. More recently, environmentally deterministic models of human-environment interaction in Irish raised bogs at the regional scale have been explicitly tested, leading to the conclusion that there is no relationship between BSW and past human activity. These relationships are examined in more detail on a site-by-site basis in this thesis. To that end, testate amoebae-derived BSW records were produced from nine milled former raised bogs in central Ireland with known and dated archaeological records. Relationships between BSW records and environmental conditions within the study area were explored both through the development of a new central Ireland testate amoebae transfer function and through comparisons between recent BSW records and instrumental weather data. Compilation of BSW records from the nine fossil study sites shows evidence both for climate forcing, particularly during 3200–2400 cal BP, and for considerable inter-site variability. Considerable inter-site variability was also evident in the archaeological records of the same sites.
Whilst comparisons between BSW and archaeological records do not show a consistent linear relationship, examination of the records on a site-by-site basis was shown to reveal interpretatively important contingent relationships. It is concluded, therefore, that future research on human-environment interactions should focus on individual sites and should utilise theoretical approaches from the humanities in order to avoid the twin pitfalls of masking important local patterns of change and of environmental determinism.
Abstract:
Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961–2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous ranked probability skill score) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño–Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.
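The system described above pairs a regression-based predictive mean with a probabilistic spread and verifies hindcasts with the continuous ranked probability score (CRPS). The sketch below is an illustrative, minimal version of that idea, not the authors' system: the function names, the toy data, and the choice of a Gaussian predictive distribution (with spread taken from hindcast residuals) are assumptions made for the example.

```python
import math

def fit_linear(xs, ys):
    """Ordinary least squares for y = a + b*x (e.g. x = CO2-equivalent)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def crps_gaussian(mu, sigma, obs):
    """Closed-form CRPS for a Gaussian predictive distribution N(mu, sigma^2)."""
    z = (obs - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return sigma * (z * (2 * cdf - 1) + 2 * pdf - 1 / math.sqrt(math.pi))

# Toy hindcast: seasonal temperature responding to a rising CO2-equivalent
# predictor (values invented for illustration).
co2 = [340.0, 350.0, 360.0, 370.0, 380.0]
temp = [14.1, 14.3, 14.4, 14.6, 14.8]
a, b = fit_linear(co2, temp)
resid_sd = math.sqrt(sum((t - (a + b * c)) ** 2
                         for c, t in zip(co2, temp)) / len(co2))
forecast_mu = a + b * 390.0            # predictive mean for a new season
score = crps_gaussian(forecast_mu, max(resid_sd, 1e-6), 15.0)
```

A skill score then follows by referencing the CRPS of a climatological forecast, as in the hindcast validation described above.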
Abstract:
This paper shows that radiometer channel radiances for cloudy atmospheric conditions can be simulated with an optimised frequency grid derived under clear-sky conditions. A new clear-sky optimised grid is derived for AVHRR channel 5 (12 μm, 833 cm⁻¹). For HIRS channel 11 (7.33 μm, 1364 cm⁻¹) and AVHRR channel 5, radiative transfer simulations using an optimised frequency grid are compared with simulations using a reference grid, where the optimised grid has roughly 100–1000 times fewer frequencies than the full grid. The root mean square error between the optimised and the reference simulation is found to be less than 0.3 K for both comparisons, with the magnitude of the bias less than 0.03 K. The simulations have been carried out with the radiative transfer model Atmospheric Radiative Transfer Simulator (ARTS), version 2, using a backward Monte Carlo module for the treatment of clouds. With this module, the optimised simulations are more than 10 times faster than the reference simulations. Although the number of photons is the same, the smaller number of frequencies reduces the overhead of preparing the optical properties for each frequency. With deterministic scattering solvers, the relative decrease in runtime would be even greater. The results allow for new radiative transfer applications, such as the development of new retrievals, because it becomes much quicker to carry out a large number of simulations. The conclusions are applicable to any downlooking infrared radiometer.
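The error criteria quoted above (RMSE below 0.3 K, absolute bias below 0.03 K) compare brightness temperatures from the reduced-frequency simulation against the full reference grid. A minimal sketch of that comparison, with hypothetical function names and invented brightness-temperature values:

```python
import math

def rmse_and_bias(optimised, reference):
    """RMSE and mean bias (K) between two sets of simulated
    brightness temperatures on matching atmospheric cases."""
    diffs = [o - r for o, r in zip(optimised, reference)]
    bias = sum(diffs) / len(diffs)
    rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return rmse, bias

# Hypothetical brightness temperatures (K) for five cloudy cases.
opt = [251.1, 263.9, 240.2, 255.2, 248.7]
ref = [251.0, 264.0, 240.4, 255.1, 248.6]
rmse, bias = rmse_and_bias(opt, ref)
```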
Abstract:
This study examines convection-permitting numerical simulations of four cases of terrain-locked quasi-stationary convective bands over the UK. For each case, a 2.2-km grid-length 12-member ensemble and 1.5-km grid-length deterministic forecast are analyzed, each with two different initialization times. Object-based verification is applied to determine whether the simulations capture the structure, location, timing, intensity and duration of the observed precipitation. These verification diagnostics reveal that the forecast skill varies greatly between the four cases. Although the deterministic and ensemble simulations captured some aspects of the precipitation correctly in each case, they never simultaneously captured all of them satisfactorily. In general, the models predicted banded precipitation accumulations at approximately the correct time and location, but the precipitating structures were more cellular and less persistent than the coherent quasi-stationary bands that were observed. Ensemble simulations from the two different initialization times were not significantly different, which suggests a potential benefit of time-lagging subsequent ensembles to increase ensemble size. The predictive skill of the upstream larger-scale flow conditions and the simulated precipitation on the convection-permitting grids were strongly correlated, which suggests that more accurate forecasts from the parent ensemble should improve the performance of the convection-permitting ensemble nested within it.
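Object-based verification treats contiguous precipitation areas as discrete objects whose number, size and position can be compared between forecast and observation. The diagnostics used in the study are richer than this, but the core labelling step can be sketched as below; the threshold, 4-connectivity, and function name are illustrative choices, not the study's configuration.

```python
from collections import deque

def label_objects(grid, threshold):
    """Label 4-connected regions of a 2-D accumulation grid that meet
    or exceed the threshold; returns (label grid, object count)."""
    rows, cols = len(grid), len(grid[0])
    labels = [[0] * cols for _ in range(rows)]
    count = 0
    for i in range(rows):
        for j in range(cols):
            if grid[i][j] >= threshold and labels[i][j] == 0:
                count += 1                      # new precipitation object
                labels[i][j] = count
                q = deque([(i, j)])
                while q:                        # breadth-first flood fill
                    y, x = q.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] >= threshold
                                and labels[ny][nx] == 0):
                            labels[ny][nx] = count
                            q.append((ny, nx))
    return labels, count
```

Object counts, centroids, and extents derived from such labels can then be matched between the forecast and observed fields.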
Abstract:
The derivation of time evolution equations for slow collective variables starting from a microscopic model system is demonstrated for the tutorial example of the classical, two-dimensional XY model. Projection operator techniques are used within a nonequilibrium thermodynamics framework together with molecular simulations in order to establish the building blocks of the hydrodynamic equations: Poisson brackets that determine the deterministic drift, the driving forces from the macroscopic free energy and the friction matrix. The approach is rather general and can be applied for deriving the equations of slow variables for a broad variety of systems.
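As a concrete anchor for the tutorial example, the microscopic model in question is the classical 2-D XY model with Hamiltonian H = -J * sum over nearest-neighbour pairs of cos(theta_i - theta_j). The toy sketch below only evaluates that energy on a periodic lattice; it is an assumption-laden illustration of the microscopic starting point, not the projection-operator machinery itself.

```python
import math

def xy_energy(theta, J=1.0):
    """Energy of the classical 2-D XY model on an n x n periodic lattice:
    each site couples to its right and downward neighbour (with wraparound),
    contributing -J*cos(theta_i - theta_j) per bond."""
    n = len(theta)
    e = 0.0
    for i in range(n):
        for j in range(n):
            e -= J * math.cos(theta[i][j] - theta[(i + 1) % n][j])
            e -= J * math.cos(theta[i][j] - theta[i][(j + 1) % n])
    return e
```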
Abstract:
Based on a large dataset from eight Asian economies, we test the impact of post-crisis regulatory reforms on the performance of depository institutions in countries at different levels of financial development. We allow for technological heterogeneity and estimate a set of country-level stochastic cost frontiers followed by a deterministic bootstrapped meta-frontier to evaluate cost efficiency and cost technology. Our results support the view that liberalization policies have a positive impact on bank performance, while the reverse is true for prudential regulation policies. The removal of activities restrictions, bank privatization and foreign bank entry have a positive and significant impact on technological progress and cost efficiency. In contrast, prudential policies, which aim to protect the banking sector from excessive risk-taking, tend to adversely affect banks' cost efficiency but not cost technology.
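The meta-frontier step can be pictured as taking the lower envelope of the country-level cost frontiers and splitting each bank's overall efficiency into a within-country component and a technology-gap ratio. The sketch below is a schematic of that decomposition only; the function names and one-dimensional cost functions are invented for illustration, and the stochastic estimation and bootstrap are omitted entirely.

```python
def meta_frontier_cost(country_frontiers, x):
    """Deterministic meta-frontier: the lower envelope (minimum cost)
    of the country-level cost frontiers, each a function of output x."""
    return min(f(x) for f in country_frontiers)

def decompose_efficiency(observed_cost, own_frontier, country_frontiers, x):
    """Split a bank's efficiency into:
      ce  - cost efficiency against its own country's frontier,
      tgr - technology gap ratio (meta-frontier cost / country-frontier cost),
    whose product is the meta-frontier cost efficiency."""
    ce = own_frontier(x) / observed_cost
    tgr = meta_frontier_cost(country_frontiers, x) / own_frontier(x)
    return ce, tgr, ce * tgr
```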
Abstract:
Following the 1997 crisis, banking sector reforms in Asia have been characterised by the emphasis on prudential regulation, associated with increased financial liberalisation. Using a panel data set of commercial banks from eight major Asian economies over the period 2001-2010, this study explores how the coexistence of liberalisation and prudential regulation affects banks’ cost characteristics. Given the presence of heterogeneity of technologies across countries, we use a stochastic frontier approach followed by the estimation of a deterministic meta-frontier to provide ‘true’ estimates of bank cost efficiency measures. Our results show that the liberalization of bank interest rates and the increase in foreign banks' presence have had a positive and significant impact on technological progress and cost efficiency. On the other hand, we find that prudential regulation might adversely affect bank cost performance. When designing an optimal regulatory framework, policy makers should combine policies which aim to foster financial stability without hindering financial intermediation.
Abstract:
The Plant–Craig stochastic convection parameterization (version 2.0) is implemented in the Met Office Regional Ensemble Prediction System (MOGREPS-R) and is assessed in comparison with the standard convection scheme with a simple stochastic scheme only, from random parameter variation. A set of 34 ensemble forecasts, each with 24 members, is considered, over the month of July 2009. Deterministic and probabilistic measures of the precipitation forecasts are assessed. The Plant–Craig parameterization is found to improve probabilistic forecast measures, particularly the results for lower precipitation thresholds. The impact on deterministic forecasts at the grid scale is neutral, although the Plant–Craig scheme does deliver improvements when forecasts are made over larger areas. The improvements found are greater in conditions of relatively weak synoptic forcing, for which convective precipitation is likely to be less predictable.
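Probabilistic performance at fixed precipitation thresholds, of the kind assessed above, is commonly summarised with the Brier score and its skill score against climatology. The helper below is a generic sketch of those standard measures, not the study's exact verification code; inputs are ensemble-derived exceedance probabilities and binary observed outcomes.

```python
def brier_score(probs, outcomes):
    """Mean squared error of probability forecasts (0..1) against
    binary outcomes (0 or 1) for a fixed precipitation threshold."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

def brier_skill_score(probs, outcomes):
    """Skill relative to always forecasting the climatological base rate;
    1 is perfect, 0 matches climatology, negative is worse."""
    base = sum(outcomes) / len(outcomes)
    ref = brier_score([base] * len(outcomes), outcomes)
    return 1 - brier_score(probs, outcomes) / ref
```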
Abstract:
Probabilistic hydro-meteorological forecasts have over the last decades been used more frequently to communicate forecast uncertainty. This uncertainty information is double-edged, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic over deterministic forecasts across the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, due partly to the difficulty of transforming the probability of occurrence of an event into a binary decision. This paper presents the results of a risk-based decision-making game on the topic of flood protection mitigation, called “How much are you prepared to pay for a forecast?”. The game was played at several workshops in 2015, which were attended by operational forecasters and academics working in the field of hydrometeorology. The aim of this game was to better understand the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants’ willingness-to-pay for a forecast, the results of the game show that the value (or the usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performance as decision-makers.
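The difficulty of turning a probability into a binary decision, which the game explores, is classically framed with the cost-loss model: protect whenever the forecast probability of the event exceeds the ratio of the protection cost to the avoidable loss. A minimal sketch of that rule (function and parameter names are illustrative):

```python
def optimal_action(prob_flood, cost_protect, loss_if_flood):
    """Cost-loss decision rule: protect when the forecast probability
    exceeds the cost/loss ratio."""
    return prob_flood > cost_protect / loss_if_flood

def expected_expense(prob_flood, protect, cost_protect, loss_if_flood):
    """Expected expense of a given decision: the fixed protection cost
    if protecting, otherwise the probability-weighted loss."""
    return cost_protect if protect else prob_flood * loss_if_flood
```

Comparing the expected expense under forecast-informed decisions with the expense of always or never protecting gives one simple measure of a forecast's economic value, and hence a baseline for a willingness-to-pay.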