27 results for parameter driven model

in CentAUR: Central Archive University of Reading - UK


Relevance:

100.00%

Publisher:

Abstract:

An extensive off-line evaluation of the Noah/Single Layer Urban Canopy Model (Noah/SLUCM) urban land-surface model is presented using data from 15 sites to assess (1) the ability of the scheme to reproduce the surface energy balance observed in a range of urban environments, including seasonal changes, and (2) the impact of increasing complexity of input parameter information. Model performance is found to be most dependent on representation of vegetated surface area cover; refinement of other parameter values leads to smaller improvements. Model biases in net all-wave radiation and trade-offs between turbulent heat fluxes are highlighted using an optimization algorithm. Here we use the Urban Zones to characterize Energy partitioning (UZE) as the basis for assigning default SLUCM parameter values. A methodology (FRAISE) for assigning sites (or areas) to one of these categories based on surface characteristics is evaluated. Using three urban sites from the Basel Urban Boundary Layer Experiment (BUBBLE) dataset, an independent evaluation of model performance with the parameter values representative of each class is performed. The scheme copes well with both seasonal changes in surface characteristics and intra-urban heterogeneity in energy flux partitioning, with RMSE performance comparable to similar state-of-the-art models for all fluxes, sites and seasons. The potential of the methodology for high-resolution atmospheric modelling applications using the Weather Research and Forecasting (WRF) model is highlighted. This analysis supports the recommendations that (1) three classes are appropriate to characterize the urban environment, and (2) the parameter values identified should be adopted as defaults in WRF.
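The flux evaluation described above comes down to per-flux error scores such as RMSE between modelled and observed surface energy fluxes. A minimal sketch of that score; all flux values here are hypothetical, not data from the study:

```python
import numpy as np

def rmse(observed, modelled):
    """Root-mean-square error between observed and modelled flux series."""
    observed = np.asarray(observed, dtype=float)
    modelled = np.asarray(modelled, dtype=float)
    return float(np.sqrt(np.mean((modelled - observed) ** 2)))

# Hypothetical half-hourly sensible heat flux (W m^-2) at one urban site.
obs = [120.0, 150.0, 180.0, 160.0]
mod = [110.0, 155.0, 190.0, 150.0]

print(round(rmse(obs, mod), 2))  # prints 9.01
```

In practice the same score would be computed per flux (net radiation, sensible, latent, storage), per site and per season before comparing models.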

Relevance:

90.00%

Publisher:

Abstract:

As climate changes, temperatures will play an increasing role in determining crop yield. Both climate model error and the lack of constrained physiological thresholds limit the predictability of yield. We used a perturbed-parameter climate model ensemble, with two methods of bias-correction, as input to a regional-scale wheat simulation model over India to examine future yields. This model configuration accounted for uncertainty in climate, planting date, optimization, and temperature-induced changes in development rate and reproduction. It also accounted for lethal temperatures, which have been somewhat neglected to date. Using uncertainty decomposition, we found that fractional uncertainty due to temperature-driven processes in the crop model was on average larger than climate model uncertainty (0.56 versus 0.44), and that the crop model uncertainty is dominated by crop development. Simulations with the raw and the bias-corrected climate data did not agree on the impact on future wheat yield, nor on its geographical distribution. However, the method of bias-correction was not an important source of uncertainty. We conclude that bias-correction of climate model data and improved constraints, especially on crop development, are critical for robust impact predictions.
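The idea of fractional uncertainty decomposition can be illustrated with a crude variance split: arrange ensemble results on a grid of climate members versus crop-model settings and attribute variance to each axis from the spread of its means. This is a toy sketch with invented yield anomalies, not the paper's decomposition method or numbers:

```python
import numpy as np

# Hypothetical yield anomalies: rows = climate ensemble members,
# columns = crop-model parameter settings (illustrative numbers only).
yields = np.array([
    [2.0, 1.0, 0.0],
    [2.5, 1.4, 0.3],
    [1.8, 0.9, -0.2],
])

# Variance attributable to each axis, from the spread of axis means.
var_climate = np.var(yields.mean(axis=1))  # spread across climate members
var_crop = np.var(yields.mean(axis=0))     # spread across crop settings
total = var_climate + var_crop

frac_crop = var_crop / total
frac_climate = var_climate / total
print(round(frac_crop, 2), round(frac_climate, 2))
```

The two fractions sum to one by construction; in the study the analogous split came out at 0.56 (crop model) versus 0.44 (climate model).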

Relevance:

90.00%

Publisher:

Abstract:

The complexity of current and emerging architectures gives users options about how best to use the available resources, but makes predicting performance challenging. In this work a benchmark-driven model is developed for a simple shallow water code on a Cray XE6 system, to explore how deployment choices such as domain decomposition and core affinity affect performance. The resource sharing present in modern multi-core architectures adds various levels of heterogeneity to the system. Shared resources often include cache, memory, network controllers and, in some cases, floating point units (as in the AMD Bulldozer), so that access times depend on the mapping of application tasks and on each core's location within the system. Heterogeneity increases further with the use of hardware accelerators such as GPUs and the Intel Xeon Phi, where many specialist cores are attached to general-purpose cores. This trend towards shared resources and non-uniform cores is expected to continue into the exascale era. The complexity of these systems means that various runtime scenarios are possible, and it has been found that under-populating nodes, altering the domain decomposition and non-standard task-to-core mappings can dramatically alter performance. Establishing this, however, is often a process of trial and error. To better inform this process, a performance model was developed for a simple regular grid-based kernel code, shallow. The code comprises two distinct types of work: loop-based array updates and nearest-neighbour halo exchanges. Separate performance models were developed for each part, both based on a similar methodology. Application-specific benchmarks were run to measure performance for different problem sizes under different execution scenarios. These results were then fed into a performance model that derives resource usage for a given deployment scenario, interpolating between results as necessary.
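The final step — feeding benchmark results into a model that interpolates between measured points to predict the cost of a deployment scenario — can be sketched as follows. The benchmark tables, sizes and times below are hypothetical stand-ins, not the paper's measurements:

```python
import numpy as np

# Hypothetical per-timestep benchmark tables (illustrative, not measured):
# local domain size (grid points) -> seconds for the loop-based updates,
# and halo size (points exchanged) -> seconds for one halo exchange.
compute_sizes = np.array([1e4, 1e5, 1e6])
compute_times = np.array([0.002, 0.021, 0.230])
halo_sizes = np.array([100.0, 1000.0, 10000.0])
halo_times = np.array([1e-5, 4e-5, 3e-4])

def predict_step_time(local_points, halo_points):
    """Interpolate the benchmark tables to estimate one timestep's cost
    for a given decomposition (compute part + halo-exchange part)."""
    t_compute = np.interp(local_points, compute_sizes, compute_times)
    t_halo = np.interp(halo_points, halo_sizes, halo_times)
    return float(t_compute + t_halo)

# Compare two hypothetical decompositions of the same global domain:
print(predict_step_time(5e4, 900.0))  # e.g. strip decomposition, long halos
print(predict_step_time(5e4, 450.0))  # e.g. block decomposition, shorter halos
```

The compute and halo terms are modelled separately, mirroring the paper's split between array updates and halo exchanges, so different mappings change only the inputs to the same two tables.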

Relevance:

80.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to examine the determinants of the home-region strategy of the multinational subsidiary and the impact of such a strategy on its performance. The author draws upon new internalization theory to develop a theory-driven model and empirically tests the simultaneous relationships between home-region strategy and performance of the subsidiary. Design/methodology/approach – The author tests the model using a simultaneous-equation statistical technique on an original, new data set of publicly listed multinational subsidiaries operating in the ASEAN region, with parent firms’ headquarters across the broad triad. Findings – There are three significant findings. The first is that subsidiary-level downstream knowledge (marketing advantages) and the geographic location of the subsidiary in the same home region as the parent firm are key antecedents of a subsidiary’s home-region strategy. The second is that a subsidiary’s profitability reduces home-region orientation; however, home-region strategy has an insignificant effect on performance. The third is that these subsidiaries generate on average 92 per cent of their total sales in the home region (the Asia Pacific). Originality/value – The author advances the existing literature on the regional nature of parent-level multinational enterprises by demonstrating that their quasi-autonomous subsidiaries also operate mainly on a home-region basis.

Relevance:

80.00%

Publisher:

Abstract:

Small and medium-sized enterprises (SMEs) play an important role in the European economy. A critical challenge faced by SME leaders, as a consequence of the continuing digital technology revolution, is how to optimally align business strategy with digital technology so as to fully leverage the potential of these technologies in pursuit of longevity and growth. There is a paucity of empirical research examining how e-leadership in SMEs drives successful alignment between business strategy and digital technology, thereby fostering longevity and growth. To address this gap, in this paper we develop an empirically derived e-leadership model. Initially we develop a theoretical model of e-leadership drawing on strategic alignment theory. This provides a theoretical foundation for how SMEs can harness digital technology in support of their business strategy, enabling sustainable growth. An in-depth empirical study was then undertaken, interviewing 42 successful European SME leaders to validate, advance and substantiate our theoretically driven model. The outcome of the two-stage process – inductive development of a theoretically driven e-leadership model and deductive advancement towards a complete model through in-depth interviews with successful European SME leaders – is an e-leadership model with specific constructs fostering effective strategic alignment. The resulting diagnostic model enables SME decision makers to exercise effective e-leadership by creating productive alignment between business strategy and digital technology, improving longevity and growth prospects.

Relevance:

40.00%

Publisher:

Abstract:

Data assimilation is a sophisticated mathematical technique for combining observational data with model predictions to produce state and parameter estimates that most accurately approximate the current and future states of the true system. The technique is commonly used in atmospheric and oceanic modelling, combining empirical observations with model predictions to produce more accurate and well-calibrated forecasts. Here, we consider a novel application within a coastal environment and describe how the method can also be used to deliver improved estimates of uncertain morphodynamic model parameters. This is achieved using a technique known as state augmentation. Earlier applications of state augmentation have typically employed the 4D-Var, Kalman filter or ensemble Kalman filter assimilation schemes. Our new method is based on a computationally inexpensive 3D-Var scheme, where the specification of the error covariance matrices is crucial for success. A simple 1D model of bed-form propagation is used to demonstrate the method. The scheme is capable of recovering near-perfect parameter values and, therefore, improves the capability of our model to predict future bathymetry. Such positive results suggest the potential for application to more complex morphodynamic models.
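State augmentation can be illustrated with a deliberately minimal stand-in for the bed-form model: a single bed height accreting at an unknown constant rate, with that rate appended to the state vector so a fixed-gain 3D-Var update corrects it through the background cross-covariance between height and rate. All matrices, noise levels and the toy dynamics are illustrative assumptions, not the paper's 1D propagation model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a morphodynamic model: bed height z accretes at an
# unknown constant rate a each step (hypothetical, for illustration).
a_true = 0.05
z_truth = 0.0
x = np.array([0.0, 0.0])            # augmented state x = [z, a]; a unknown

# 3D-Var ingredients: fixed background covariance B (its z-a cross term is
# what lets height observations correct the parameter), obs operator H, R.
B = np.array([[0.1, 0.02],
              [0.02, 0.01]])
H = np.array([[1.0, 0.0]])          # we observe the bed height only
R = np.array([[1e-4]])
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # constant gain

for _ in range(40):
    z_truth += a_true
    y = z_truth + rng.normal(0.0, 0.01)        # noisy bed observation
    x = np.array([x[0] + x[1], x[1]])          # forecast augmented state
    x = x + (K @ (np.array([y]) - H @ x)).ravel()

print(round(x[1], 3))  # recovered accretion rate, should be near a_true
```

The parameter itself is never observed; it is pulled towards the truth only because the background covariance couples it to the observed height, which is the essence of state augmentation.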

Relevance:

40.00%

Publisher:

Abstract:

A new dynamic model of water quality, Q2, has recently been developed, capable of simulating large branched river systems. This paper describes the application of a generalized sensitivity analysis (GSA) to Q2 for single reaches of the River Thames in southern England. Focusing on the simulation of dissolved oxygen (DO), since this may be regarded as a proxy for the overall health of a river, the GSA is used to identify the key parameters controlling model behavior and to provide a probabilistic procedure for model calibration. It is shown that, in the River Thames at least, it is more important to obtain high-quality forcing functions than to refine parameter estimates once approximate values have been established. Furthermore, there is a need to ensure reasonable simulation of a range of water quality determinands, since a focus on DO alone increases predictive uncertainty in the DO simulations. The Q2 model has been applied here to the River Thames, but it has broad utility for evaluating other systems in Europe and around the world.
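The behavioural/non-behavioural classification at the heart of a GSA can be sketched with a toy DO criterion: sample parameters, flag runs meeting a minimum-DO threshold, and compare parameter distributions between the two classes with a Kolmogorov-Smirnov statistic. The model, parameters and threshold below are invented for illustration and bear no relation to the real Q2 parameterization:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy proxy for a reach simulation: minimum DO (mg/l) as a simple
# function of two uncertain rate parameters (illustrative only).
def min_do(k_resp, k_reaer):
    return 6.0 - 3.0 * k_resp + 2.0 * k_reaer

n = 5000
k_resp = rng.uniform(0.0, 1.0, n)   # respiration-like rate (dominant here)
k_reaer = rng.uniform(0.0, 1.0, n)  # re-aeration-like rate

behavioural = min_do(k_resp, k_reaer) >= 5.0   # DO criterion

def ks_stat(a, b):
    """Two-sample Kolmogorov-Smirnov statistic (no SciPy needed)."""
    grid = np.sort(np.concatenate([a, b]))
    cdf_a = np.searchsorted(np.sort(a), grid, side="right") / len(a)
    cdf_b = np.searchsorted(np.sort(b), grid, side="right") / len(b)
    return float(np.max(np.abs(cdf_a - cdf_b)))

for name, p in [("k_resp", k_resp), ("k_reaer", k_reaer)]:
    d = ks_stat(p[behavioural], p[~behavioural])
    print(name, round(d, 2))   # larger D => parameter matters more
```

A parameter whose behavioural and non-behavioural distributions differ strongly (large D) is one that controls model behaviour; parameters with near-identical distributions are insensitive, which is how a GSA separates key parameters from the rest.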

Relevance:

40.00%

Publisher:

Abstract:

A simple theoretical model for the intensification of tropical cyclones and polar lows is developed using a minimal set of physical assumptions. These disturbances are assumed to be balanced systems intensifying through the WISHE (Wind-Induced Surface Heat Exchange) intensification mechanism, driven by surface fluxes of heat and moisture into an atmosphere which is neutral to moist convection. The equation set is linearized about a resting basic state and solved as an initial-value problem. A system is predicted to intensify with an exponential perturbation growth rate scaled by the radial gradient of an efficiency parameter which crudely represents the effects of unsaturated processes. The form of this efficiency parameter is assumed to be defined by initial conditions, dependent on the nature of a pre-existing vortex required to precondition the atmosphere to a state in which the vortex can intensify. Evaluation of the simple model using a primitive-equation, nonlinear numerical model provides support for the prediction of exponential perturbation growth. Good agreement is found between the simple and numerical models for the sensitivities of the measured growth rate to various parameters, including surface roughness, the rate of transfer of heat and moisture from the ocean surface, and the scale for the growing vortex.
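Exponential perturbation growth of the kind predicted here is diagnosed by a log-linear fit to an intensity time series: if the perturbation grows as v(t) = v0 exp(sigma t), then log v is linear in t with slope sigma. A minimal sketch with a hypothetical growth rate and synthetic series, not output from either model in the paper:

```python
import numpy as np

# Synthetic intensity series following v(t) = v0 * exp(sigma * t).
t = np.linspace(0.0, 48.0, 9)        # hours (hypothetical)
sigma_true = 0.05                    # growth rate per hour (hypothetical)
v = 5.0 * np.exp(sigma_true * t)     # e.g. maximum wind perturbation

# Straight-line fit to log intensity recovers the growth rate as the slope.
sigma_fit, log_v0 = np.polyfit(t, np.log(v), 1)
print(round(sigma_fit, 3))  # prints 0.05
```

The same diagnostic applied to the primitive-equation model's output is what allows the measured growth rate to be compared against the simple model's prediction.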

Relevance:

40.00%

Publisher:

Abstract:

The theta-logistic is a widely used generalisation of the logistic model of regulated biological processes, used in particular to model population regulation. The parameter theta gives the shape of the relationship between per-capita population growth rate and population size. Estimation of theta from population counts is, however, subject to bias, particularly when there are measurement errors. Here we identify factors disposing towards accurate estimation of theta by simulating populations regulated according to the theta-logistic model. The factors investigated were measurement error, environmental perturbation and length of time series. Large measurement errors bias estimates of theta towards zero. Where estimated theta is close to zero, the estimated annual return rate may help resolve whether this is due to bias. Environmental perturbations help yield unbiased estimates of theta. Where environmental perturbations are large, estimates of theta are likely to be reliable even when measurement errors are also large. By contrast, where the environment is relatively constant, unbiased estimates of theta can only be obtained if populations are counted precisely. Our results have practical implications for the design of long-term population surveys. Estimation of the precision of population counts would be valuable, and could be achieved in practice by repeating counts in at least some years. Increasing the length of time series beyond 10 or 20 years yields only small benefits. If populations are measured with appropriate accuracy, given the level of environmental perturbation, unbiased estimates can be obtained from relatively short censuses. These conclusions are optimistic for the estimation of theta. © 2008 Elsevier B.V. All rights reserved.
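The simulation experiment described above can be sketched end-to-end: generate a theta-logistic population with environmental noise, corrupt the counts with measurement error, and estimate theta by least squares over a grid of candidate values. All rates, noise levels and the grid-search estimator are illustrative assumptions, not the study's actual settings:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate a theta-logistic population with environmental noise
# (all numbers hypothetical).
r, K, theta_true, n_years = 0.5, 1000.0, 1.0, 50
N = np.empty(n_years)
N[0] = 200.0
for t in range(n_years - 1):
    growth = r * (1.0 - (N[t] / K) ** theta_true) + rng.normal(0.0, 0.1)
    N[t + 1] = N[t] * np.exp(growth)

# Observed counts carry multiplicative measurement error.
obs = N * np.exp(rng.normal(0.0, 0.05, n_years))

# Estimate theta by grid search: for each candidate, regress observed
# per-capita growth rate on (N/K)^theta and keep the best-fitting value.
gr = np.log(obs[1:] / obs[:-1])
best_theta, best_sse = None, np.inf
for th in np.linspace(0.1, 3.0, 59):
    X = np.column_stack([np.ones(n_years - 1), (obs[:-1] / K) ** th])
    coef = np.linalg.lstsq(X, gr, rcond=None)[0]
    sse = float(np.sum((gr - X @ coef) ** 2))
    if sse < best_sse:
        best_theta, best_sse = th, sse

print(round(best_theta, 2))
```

Repeating this over many replicate populations while varying the measurement-error and environmental-noise levels is how the biases described above (estimates pulled towards zero by measurement error, rescued by environmental perturbation) would be mapped out.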