63 results for "Model space"
Abstract:
Measuring poverty has occupied considerable space in the development discourse. Over the years a number of approaches have been offered to capture the experience of what it means to be poor. However, such approaches have latterly tended to ignore core assets. Indeed, the comparative impact of livestock versus other core assets, such as land and education, on poverty has not been well explored. The authors therefore created an 'asset impact model' to examine changes to both tangible and intangible assets at the household level, with a particular focus on gender and ethnicity among communities residing in the Bolivian Altiplano. The simple model illustrates that, for indigenous women, a 20 per cent increase in the livestock herd has the same impact on household income as increasing education levels by 20 per cent and household land ownership by 5 per cent. The study illustrates the potential role of a productive, tangible asset, i.e. livestock, in short-term poverty reduction. The policy implications of supporting asset-focused measures of poverty are discussed.
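The reported equivalence can be illustrated with a back-of-the-envelope elasticity calculation. The code below is a hypothetical sketch, not the authors' asset impact model: the elasticity values are invented purely so that the two asset bundles have comparable effects on income.

```python
import numpy as np

# Hypothetical Cobb-Douglas-style household income model:
#   log(income) = b_livestock*log(herd) + b_education*log(edu) + b_land*log(land) + const.
# The elasticities below are ILLUSTRATIVE ONLY, chosen so that a 20% herd
# increase roughly matches the combined effect of +20% education and +5% land,
# as the abstract reports for indigenous women.
b_livestock, b_education, b_land = 0.30, 0.20, 0.40

def income_change(d_herd, d_edu, d_land):
    """Proportional change in log income for proportional asset changes."""
    return (b_livestock * np.log1p(d_herd)
            + b_education * np.log1p(d_edu)
            + b_land * np.log1p(d_land))

livestock_effect = income_change(0.20, 0.0, 0.0)     # +20% herd only
other_assets_effect = income_change(0.0, 0.20, 0.05) # +20% edu and +5% land
```

Under these invented elasticities the two effects agree to within a fraction of a percentage point of income, which is the kind of equivalence the abstract describes.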
Abstract:
Models of the City of London office market are extended by considering a longer time series of data, covering two cycles, and by explicit modeling of asymmetric rental response to supply and demand shocks. A long-run structural model linking demand for office space, real rental levels and office-based employment is estimated, and rental adjustment processes are then modeled within an error correction framework. Adjustment processes are found to be asymmetric, depending both on the direction of the supply and demand shock and on the state of the rental market at the time of the shock. A complete system of equations is estimated: unit shocks produce oscillations, but there is a return to a steady equilibrium state in the long run.
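The two-stage strategy described here (a long-run structural relation plus an asymmetric error correction equation) can be sketched on synthetic data. This is an illustrative Engle-Granger-style sketch under invented data-generating assumptions, not the authors' specification:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for the variables in the abstract (illustrative only):
# a trending employment driver and rents tied to it in the long run.
T = 200
employment = np.cumsum(rng.normal(0.5, 1.0, T))
rent = 2.0 + 0.8 * employment + rng.normal(0, 1.0, T)

# Stage 1: long-run (cointegrating) regression of rent on employment.
X = np.column_stack([np.ones(T), employment])
beta, *_ = np.linalg.lstsq(X, rent, rcond=None)
ect = rent - X @ beta                     # error-correction term (disequilibrium)

# Stage 2: short-run rental adjustment, with asymmetric correction speeds
# depending on the sign of the disequilibrium (over- vs under-rented market).
d_rent = np.diff(rent)
ect_lag = ect[:-1]
pos = np.maximum(ect_lag, 0.0)            # rents above the long-run level
neg = np.minimum(ect_lag, 0.0)            # rents below the long-run level
Z = np.column_stack([np.ones(T - 1), pos, neg])
gamma, *_ = np.linalg.lstsq(Z, d_rent, rcond=None)
# gamma[1] and gamma[2] are the two adjustment speeds; asymmetry appears
# as gamma[1] != gamma[2]. Both should be negative (reversion to equilibrium).
```

In the real application the long-run relation would involve office demand and real rents, and the asymmetry terms would also interact with the state of the market at the time of the shock.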
Abstract:
This dissertation deals with aspects of sequential data assimilation (in particular ensemble Kalman filtering) and numerical weather forecasting. In the first part, the recently formulated Ensemble Kalman-Bucy filter (EnKBF) is revisited. It is shown that the previously used numerical integration scheme fails when the magnitude of the background error covariance grows beyond that of the observational error covariance in the forecast window. We therefore present a suitable integration scheme that handles the stiffening of the differential equations involved and does not incur further computational expense. Moreover, a transform-based alternative to the EnKBF is developed: under this scheme, the operations are performed in the ensemble space instead of in the state space. Advantages of this formulation are explained. For the first time, the EnKBF is implemented in an atmospheric model. The second part of this work deals with ensemble clustering, a phenomenon that arises when performing data assimilation using deterministic ensemble square root filters (EnSRFs) in highly nonlinear forecast models, whereby an M-member ensemble separates into an outlier and a cluster of M-1 members. Previous works may suggest that this issue represents a failure of EnSRFs; this work dispels that notion. It is shown that ensemble clustering can also be reversed by nonlinear processes, in particular the alternation between nonlinear expansion and compression of the ensemble in different regions of the attractor. Some EnSRFs that use random rotations have been developed to overcome this issue; these formulations are analyzed and their advantages and disadvantages with respect to common EnSRFs are discussed. The third and last part contains the implementation of the Robert-Asselin-Williams (RAW) filter in an atmospheric model.
The RAW filter is an improvement to the widely used Robert-Asselin filter that successfully suppresses spurious computational waves while avoiding any distortion in the mean value of the function. Using statistical significance tests at both the local and field level, it is shown that the climatology of the SPEEDY model is not modified by the changed time-stepping scheme; hence, no retuning of the parameterizations is required. It is found that the accuracy of medium-term forecasts is increased by using the RAW filter.
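The RAW filter itself is simple to state: the usual Robert-Asselin displacement is split between the current and the next time level, weighted by a parameter α (α = 1 recovers the classical filter, while α slightly above 0.5 preserves the mean to higher order). A minimal sketch, applied to leapfrog integration of the oscillation equation dφ/dt = iωφ, with illustrative values of ν and α:

```python
import numpy as np

# Leapfrog integration of d(phi)/dt = i*omega*phi with the
# Robert-Asselin-Williams (RAW) filter (Williams 2009).
omega, dt, nsteps = 1.0, 0.01, 2000
nu, alpha = 0.2, 0.53             # filter strength and RAW parameter

phi_old = 1.0 + 0.0j              # phi at step n-1 (already filtered)
phi_now = np.exp(1j * omega * dt) # phi at step n (exact first step)

for _ in range(nsteps):
    phi_new = phi_old + 2.0 * dt * 1j * omega * phi_now  # leapfrog step
    d = 0.5 * nu * (phi_old - 2.0 * phi_now + phi_new)   # RA displacement
    phi_now += alpha * d                                 # RAW shares the
    phi_new += (alpha - 1.0) * d                         # displacement
    phi_old, phi_now = phi_now, phi_new

amplitude = abs(phi_now)          # should remain close to 1
```

The exact solution has unit amplitude; with α = 1 (classical Robert-Asselin) the physical mode is damped noticeably more, which is the distortion the RAW modification mitigates.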
Abstract:
Statistical methods of inference typically require the likelihood function to be computable in a reasonable amount of time. The class of "likelihood-free" methods termed Approximate Bayesian Computation (ABC) eliminates this requirement, replacing evaluation of the likelihood with simulation from it. Likelihood-free methods have gained in efficiency and popularity in the past few years, following their integration with Markov chain Monte Carlo (MCMC) and Sequential Monte Carlo (SMC) methods in order to better explore the parameter space. They have been applied primarily to estimating the parameters of a given model, but can also be used to compare models. Here we present novel likelihood-free approaches to model comparison, based upon the independent estimation of the evidence of each model under study. Key advantages of these approaches over previous techniques are that they allow the exploitation of MCMC or SMC algorithms for exploring the parameter space, and that they do not require a sampler able to mix between models. We validate the proposed methods using a simple exponential family problem before applying them to a realistic problem from human population genetics: the comparison of different demographic models based upon genetic data from the Y chromosome.
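The core idea of estimating each model's evidence independently can be illustrated with the simplest likelihood-free scheme, rejection ABC, where the acceptance rate serves as a crude evidence estimate. This sketch uses an invented exponential-data example and is not the authors' MCMC/SMC-based algorithm:

```python
import numpy as np

rng = np.random.default_rng(42)

# "Observed" data: 100 draws from an exponential with mean 0.5.
obs = rng.exponential(scale=0.5, size=100)
s_obs = obs.mean()                       # summary statistic

def abc_evidence(prior_draw, n_sims=20000, eps=0.05):
    """Crude ABC evidence estimate for one model: the rate at which
    prior-predictive simulations land within eps of the observed summary."""
    accepted = 0
    for _ in range(n_sims):
        scale = prior_draw()             # draw a parameter from the prior
        sim = rng.exponential(scale=scale, size=100)
        if abs(sim.mean() - s_obs) < eps:
            accepted += 1
    return accepted / n_sims

# Two competing models, differing only in their prior on the mean:
ev_m1 = abc_evidence(lambda: rng.uniform(0.1, 1.0))  # prior covers the truth
ev_m2 = abc_evidence(lambda: rng.uniform(2.0, 5.0))  # prior far from the truth
bayes_factor = ev_m1 / max(ev_m2, 1e-12)
```

Because each evidence is estimated separately, no trans-model sampler is needed; the price is that rejection sampling explores the parameter space inefficiently, which is exactly what the MCMC/SMC variants in the abstract address.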
Abstract:
The currently available model-based global data sets of atmospheric circulation are a by-product of the daily requirement of producing initial conditions for numerical weather prediction (NWP) models. These data sets have been quite useful for studying fundamental dynamical and physical processes, and for describing the nature of the general circulation of the atmosphere. However, due to limitations in the early data assimilation systems and inconsistencies caused by numerous model changes, the available model-based global data sets may not be suitable for studying global climate change. A comprehensive analysis of global observations based on a four-dimensional data assimilation system with a realistic physical model should be undertaken to integrate space and in situ observations to produce internally consistent, homogeneous, multivariate data sets for the earth's climate system. The concept is equally applicable for producing data sets for the atmosphere, the oceans, and the biosphere, and such data sets will be quite useful for studying global climate change.
Abstract:
Correlations between various chemical species simulated by the Canadian Middle Atmosphere Model, a general circulation model with fully interactive chemistry, are considered in order to investigate the general conditions under which compact correlations can be expected to form. At the same time, the analysis serves to validate the model. The results are compared to previous work on this subject, both from theoretical studies and from atmospheric measurements made from space and from aircraft. The results highlight the importance of having a data set with good spatial coverage when working with correlations and provide a background against which the compactness of correlations obtained from atmospheric measurements can be confirmed. It is shown that for long-lived species, distinct correlations are found in the model in the tropics, the extratropics, and the Antarctic winter vortex. Under these conditions, sparse sampling such as arises from occultation instruments is nevertheless suitable to define a chemical correlation within each region even from a single day of measurements, provided a sufficient range of mixing ratio values is sampled. In practice, this means a large vertical extent, though the requirements are less stringent at more poleward latitudes.
Abstract:
Mean field models (MFMs) of cortical tissue incorporate salient, average features of neural masses in order to model activity at the population level, thereby linking microscopic physiology to macroscopic observations, e.g., with the electroencephalogram (EEG). One of the common aspects of MFM descriptions is the presence of a high-dimensional parameter space capturing neurobiological attributes deemed relevant to the brain dynamics of interest. We study the physiological parameter space of a MFM of electrocortical activity and discover robust correlations between physiological attributes of the model cortex and its dynamical features. These correlations are revealed by the study of bifurcation plots, which show that the model responses to changes in inhibition belong to two archetypal categories or "families". After investigating and characterizing them in depth, we discuss their essential differences in terms of four important aspects: power responses with respect to the modeled action of anesthetics, reaction to exogenous stimuli such as thalamic input, and the distributions of model parameters and of oscillatory repertoires when inhibition is enhanced. Furthermore, while the complexity of sustained periodic orbits differs significantly between families, we are able to show how metamorphoses between the families can be brought about by exogenous stimuli. Here we unveil links between measurable physiological attributes of the brain and dynamical patterns that are not accessible by linear methods. They instead emerge when the nonlinear structure of parameter space is partitioned according to bifurcation responses. We call this general method "metabifurcation analysis". The partitioning cannot be achieved by the investigation of only a small number of parameter sets and is instead the result of an automated bifurcation analysis of a representative sample of 73,454 physiologically admissible parameter sets.
Our approach generalizes straightforwardly and is well suited to probing the dynamics of other models with large and complex parameter spaces.
Abstract:
A recently proposed mean-field theory of mammalian cortex rhythmogenesis describes the salient features of electrical activity in the cerebral macrocolumn, with the use of inhibitory and excitatory neuronal populations (Liley et al 2002). This model is capable of producing a range of important human EEG (electroencephalogram) features such as the alpha rhythm, the 40 Hz activity thought to be associated with conscious awareness (Bojak & Liley 2007) and the changes in EEG spectral power associated with general anesthetic effect (Bojak & Liley 2005). From the point of view of nonlinear dynamics, the model entails a vast parameter space within which multistability, pseudoperiodic regimes, various routes to chaos, fat fractals and rich bifurcation scenarios occur for physiologically relevant parameter values (van Veen & Liley 2006). The origin and the character of this complex behaviour, and its relevance for EEG activity will be illustrated. The existence of short-lived unstable brain states will also be discussed in terms of the available theoretical and experimental results. A perspective on future analysis will conclude the presentation.
Abstract:
FAMOUS fills an important role in the hierarchy of climate models, both explicitly resolving atmospheric and oceanic dynamics and being sufficiently computationally efficient that either very long simulations or large ensembles are possible. An improved set of carbon cycle parameters for this model has been found using a perturbed physics ensemble technique. This is an important step towards building the "Earth System" modelling capability of FAMOUS, which is a reduced-resolution, and hence faster-running, version of the Hadley Centre climate model HadCM3. Two separate 100-member perturbed parameter ensembles were performed: one for the land surface and one for the ocean. The land surface scheme was tested against present-day and past representations of vegetation, and the ocean ensemble was tested against observations of nitrate. An advantage of using a relatively fast climate model is that a large number of simulations can be run and hence the model parameter space (a large source of climate model uncertainty) can be more thoroughly sampled. This has the associated benefit of allowing the sensitivity of model results to changes in each parameter to be assessed. The climatologies of surface and tropospheric air temperature and precipitation are improved relative to previous versions of FAMOUS. The improved representation of upper atmosphere temperatures is driven by improved ozone concentrations near the tropopause and better upper-level winds.
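A perturbed parameter ensemble reduces, in outline, to sampling parameter vectors from plausible ranges, running the model once per member, and scoring each member against observations. The sketch below uses an invented two-parameter toy model and a made-up observational constraint, purely to illustrate the workflow:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy stand-in for a climate model: maps two hypothetical "carbon cycle"
# parameters to a scalar diagnostic compared against an observation.
# Parameter names and ranges are invented for illustration.
def toy_model(q10, v_max):
    return q10 ** 0.5 + 0.1 * v_max

obs, obs_err = 2.4, 0.1              # made-up observational constraint

# Perturbed parameter ensemble: one model run per sampled parameter vector.
n_members = 100
q10_s = rng.uniform(1.0, 4.0, n_members)
vmax_s = rng.uniform(5.0, 15.0, n_members)
output = toy_model(q10_s, vmax_s)
scores = ((output - obs) / obs_err) ** 2   # misfit per ensemble member

best = int(np.argmin(scores))              # best-performing member
# Sensitivity check: correlate each parameter with the model output to see
# which parameter the diagnostic constrains most strongly.
sens_q10 = np.corrcoef(q10_s, output)[0, 1]
```

The real ensembles score full land surface and ocean fields against vegetation and nitrate observations, but the structure (sample, run, score, assess sensitivity) is the same.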
Abstract:
The difference between the rate of change of cerebral blood volume (CBV) and cerebral blood flow (CBF) following stimulation is thought to be due to circumferential stress relaxation in veins (Mandeville, J.B., Marota, J.J.A., Ayata, C., Zaharchuk, G., Moskowitz, M.A., Rosen, B.R., Weisskoff, R.M., 1999. Evidence of a cerebrovascular postarteriole windkessel with delayed compliance. J. Cereb. Blood Flow Metab. 19, 679–689). In this paper we explore the visco-elastic properties of blood vessels and present a dynamic model relating changes in CBF to changes in CBV, which we refer to as the visco-elastic windkessel (VW) model. A novel feature of this model is that the parameter characterising the pressure–volume relationship of blood vessels is treated as a state variable dependent on the rate of change of CBV, producing hysteresis in the pressure–volume space during vessel dilation and contraction. The VW model is nonlinear and time-invariant, and is able to predict the observed differences between the time series of CBV and of CBF measurements following changes in neural activity. Like the windkessel model derived by Mandeville et al. (1999), the VW model is primarily a model of haemodynamic changes in the venous compartment. The VW model is demonstrated to have the following characteristics typical of visco-elastic materials: (1) hysteresis, (2) creep, and (3) stress relaxation, and hence provides a unified model of the visco-elastic properties of the vasculature.
The model will not only contribute to the interpretation of the Blood Oxygen Level Dependent (BOLD) signals from functional Magnetic Resonance Imaging (fMRI) experiments, but also find applications in the study and modelling of the brain vasculature and the haemodynamics of circulatory and cardiovascular systems.
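The key idea, making the pressure–volume parameter a state variable driven by the rate of volume change, can be caricatured in a few lines. All functional forms, constants, and the floor on the exponent below are invented for illustration; this is not the authors' VW model:

```python
import numpy as np

# Illustrative windkessel with a rate-of-change-dependent pressure-volume
# parameter. A balloon-model-style outflow f_out = v**(1/alpha) is used,
# with alpha itself a state variable pushed by dv/dt (hypothetical dynamics).
dt, T = 0.01, 40.0
n = int(T / dt)
t = np.arange(n) * dt

f_in = 1.0 + 0.4 * ((t > 5) & (t < 15))  # step increase in inflow (stimulus)
v = 1.0                                  # normalized venous volume
alpha = 2.0                              # pressure-volume exponent (state variable)
alpha0, tau_a, k = 2.0, 5.0, 3.0         # invented constants

vol = np.empty(n)
for i in range(n):
    f_out = v ** (1.0 / alpha)           # outflow rises with volume
    dv = f_in[i] - f_out
    # alpha relaxes to a baseline but is pushed by the rate of volume change,
    # so inflation and deflation follow different pressure-volume paths
    # (hysteresis). The floor at 1.0 keeps this toy model well behaved.
    alpha = max(1.0, alpha + dt * ((alpha0 - alpha) / tau_a + k * dv))
    v += dt * dv
    vol[i] = v
```

Because alpha differs between the inflation and deflation phases, volume returns to baseline along a different trajectory than it left it, mimicking the delayed-compliance behaviour the abstract describes.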
Abstract:
Geomagnetic activity has long been known to exhibit approximately 27 day periodicity, resulting from solar wind structures repeating each solar rotation. Thus a very simple near-Earth solar wind forecast is 27 day persistence, wherein the near-Earth solar wind conditions today are assumed to be identical to those 27 days previously. Effective use of such a persistence model as a forecast tool, however, requires its performance and uncertainty to be fully characterized. The first half of this study determines which solar wind parameters can be reliably forecast by persistence and how the forecast skill varies with the solar cycle. The second half of the study shows how persistence can provide a useful benchmark for more sophisticated forecast schemes, namely physics-based numerical models. Point-by-point assessment methods, such as correlation and mean-square error, find persistence skill comparable to numerical models during solar minimum, despite the 27 day lead time of persistence forecasts, versus 2–5 days for numerical schemes. At solar maximum, however, the dynamic nature of the corona means 27 day persistence is no longer a good approximation, and skill scores suggest persistence is out-performed by numerical models for almost all solar wind parameters. But point-by-point assessment techniques are not always a reliable indicator of usefulness as a forecast tool. An event-based assessment method, which focusses on key solar wind structures, finds persistence to be the most valuable forecast throughout the solar cycle. This reiterates the fact that the means of assessing the "best" forecast model must be specifically tailored to its intended use.
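A 27 day persistence forecast and the point-by-point skill measures mentioned above are straightforward to implement. The sketch below uses a synthetic recurrent solar wind speed series rather than real observations:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic daily solar wind speed (km/s) with a 27-day recurrence plus
# noise, a stand-in for real near-Earth observations.
days = np.arange(4 * 365)
speed = 450 + 80 * np.sin(2 * np.pi * days / 27) + rng.normal(0, 20, days.size)

# 27-day persistence forecast: today's forecast is the value 27 days ago.
lag = 27
forecast = speed[:-lag]
truth = speed[lag:]

# Point-by-point skill measures of the kind used in the abstract:
corr = np.corrcoef(forecast, truth)[0, 1]
mse = np.mean((forecast - truth) ** 2)
# Skill score relative to a climatological (mean-value) reference forecast:
mse_clim = np.mean((truth.mean() - truth) ** 2)
skill = 1.0 - mse / mse_clim
```

On a strongly recurrent series like this one, persistence scores well despite its 27 day lead time; at solar maximum the recurrence assumption degrades, which is the regime where the abstract finds numerical models pull ahead on point-by-point metrics.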
Abstract:
Global wetlands are believed to be climate sensitive, and are the largest natural emitters of methane (CH4). Increased wetland CH4 emissions could act as a positive feedback to future warming. The Wetland and Wetland CH4 Inter-comparison of Models Project (WETCHIMP) investigated our present ability to simulate large-scale wetland characteristics and corresponding CH4 emissions. To ensure inter-comparability, we used a common experimental protocol driving all models with the same climate and carbon dioxide (CO2) forcing datasets. The WETCHIMP experiments were conducted for model equilibrium states as well as transient simulations covering the last century. Sensitivity experiments investigated model response to changes in selected forcing inputs (precipitation, temperature, and atmospheric CO2 concentration). Ten models participated, covering the spectrum from simple to relatively complex, including models tailored either for regional or global simulations. The models also varied in their methods for calculating wetland size and location: some simulated wetland area prognostically, others relied on remotely sensed inundation datasets, and some took an approach intermediate between the two. Four major conclusions emerged from the project. First, the suite of models demonstrates extensive disagreement in their simulations of wetland areal extent and CH4 emissions, in both space and time. Simple metrics of wetland area, such as the latitudinal gradient, show large variability, principally between models that use inundation dataset information and those that independently determine wetland area. Agreement between the models improves for zonally summed CH4 emissions, but large variation between the models remains. For annual global CH4 emissions, the models vary by ±40% of the all-model mean (190 Tg CH4 yr−1). Second, all models show a strong positive response to increased atmospheric CO2 concentrations (857 ppm) in both CH4 emissions and wetland area.
In response to increasing global temperatures (+3.4 °C, globally spatially uniform), on average, the models decreased wetland area and CH4 fluxes, primarily in the tropics, but the magnitude and sign of the response varied greatly. Models were least sensitive to increased global precipitation (+3.9 %, globally spatially uniform), with a consistent small positive response in CH4 fluxes and wetland area. Results from the 20th century transient simulation show that interactions between climate forcings could have strong non-linear effects. Third, we presently lack wetland methane observation datasets adequate to evaluate model fluxes at a spatial scale comparable to model grid cells (commonly 0.5°). This limitation severely restricts our ability to model global wetland CH4 emissions with confidence. Our simulated wetland extents are also difficult to evaluate due to extensive disagreements between wetland mapping and remotely sensed inundation datasets. Fourth, the large range in predicted CH4 emission rates leads to the conclusion that there is both substantial parameter and structural uncertainty in large-scale CH4 emission models, even after uncertainties in wetland areas are accounted for.
Abstract:
Nocturnal cooling of air within a forest canopy and the resulting temperature profile may drive local thermal circulations, such as drainage flows, which are believed to affect measurements of ecosystem–atmosphere exchange. To model such flows, it is necessary to accurately predict the rate of cooling. Cooling occurs primarily due to radiative heat loss. However, much of the radiative loss occurs at the surface of canopy elements (leaves, branches, and boles of trees), while radiative divergence in the canopy air space is small due to the high transmissivity of air. Furthermore, sensible heat exchange between the canopy elements and the air space is slow relative to radiative fluxes. Therefore, canopy elements initially cool much more quickly than the canopy air space after the switch from radiative gain during the day to radiative loss during the night. Thus, in modeling air cooling within a canopy, it is not appropriate to neglect the storage change of heat in the canopy elements, or even to assume equal rates of cooling of the canopy air and canopy elements. Here a simple parameterization of radiatively driven cooling of air within the canopy is presented, which accounts implicitly for radiative cooling of the canopy volume, heat storage in the canopy elements, and heat transfer between the canopy elements and the air. Simulations using this parameterization are compared to temperature data from the Morgan–Monroe State Forest (IN, USA) FLUXNET site. While the model does not perfectly reproduce the measured rates of cooling, particularly near the top of the canopy, the simulated cooling rates are of the correct order of magnitude.
Abstract:
A stand-alone sea ice model is tuned and validated using satellite-derived, basinwide observations of sea ice thickness, extent, and velocity from the years 1993 to 2001. This is the first time that basin-scale measurements of sea ice thickness have been used for this purpose. The model is based on the CICE sea ice model code developed at the Los Alamos National Laboratory, with some minor modifications, and forcing consists of 40-yr ECMWF Re-Analysis (ERA-40) and Polar Exchange at the Sea Surface (POLES) data. Three parameters are varied in the tuning process: Ca, the air–ice drag coefficient; P*, the ice strength parameter; and α, the broadband albedo of cold bare ice, with the aim being to determine the subset of this three-dimensional parameter space that gives the best simultaneous agreement with observations with this forcing set. It is found that observations of sea ice extent and velocity alone are not sufficient to unambiguously tune the model, and that sea ice thickness measurements are necessary to locate a unique subset of parameter space in which simultaneous agreement is achieved with all three observational datasets.
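The tuning exercise amounts to a search of the three-dimensional (Ca, P*, α) space for the parameter subset that minimizes the combined misfit to the three observation types. A grid-search caricature with an invented, separable misfit function (not the CICE model or the real cost surface):

```python
import numpy as np

# Toy illustration of the three-parameter tuning problem: search the
# (Ca, Pstar, albedo) space for the best simultaneous fit to extent,
# velocity, and thickness observations. The "model" misfits below are
# hypothetical scalar surrogates with an optimum planted at plausible
# values (Ca = 1.2e-3, Pstar = 2.5e4 N m-2, albedo = 0.65).
def misfit(ca, pstar, albedo):
    extent_err = (albedo - 0.65) ** 2 + 0.1 * (ca - 1.2e-3) ** 2 / 1e-8
    veloc_err = (ca - 1.2e-3) ** 2 / 1e-8
    thick_err = (pstar - 2.5e4) ** 2 / 1e8 + (albedo - 0.65) ** 2
    return extent_err + veloc_err + thick_err

ca_grid = np.linspace(0.5e-3, 2.0e-3, 16)
pstar_grid = np.linspace(1.0e4, 5.0e4, 16)
alb_grid = np.linspace(0.5, 0.8, 16)

best_cost, (best_ca, best_pstar, best_alb) = min(
    ((misfit(ca, p, a), (ca, p, a))
     for ca in ca_grid for p in pstar_grid for a in alb_grid),
    key=lambda item: item[0],
)
```

In this toy, velocity constrains only Ca and thickness mostly constrains P*, mirroring the abstract's finding that extent and velocity alone leave the parameter choice ambiguous until thickness observations are added.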
Abstract:
Initial results are presented from a middle atmosphere extension to a version of the European Centre for Medium-Range Weather Forecasts (ECMWF) tropospheric model. The extended version of the model has been developed as part of the UK Universities Global Atmospheric Modelling Project and extends from the ground to approximately 90 km. A comprehensive solar radiation scheme is included which uses monthly averaged climatological ozone values, and a linearised infrared cooling scheme is employed. The basic climatology of the model is described; the parametrization of drag due to orographically forced gravity waves is shown to have a dramatic effect on the simulations of the winter hemisphere.