809 results for Performance model


Relevance:

30.00%

Publisher:

Abstract:

The impact of ceiling geometries on the performance of lightshelves was investigated using physical model experiments and Radiance simulations. Illuminance level and distribution uniformity were assessed for a working plane in a large space located in a sub-tropical climate region, where innovative systems for daylighting and shading are required. It was found that the performance of the lightshelf can be improved by changing the ceiling geometry; the illuminance level increased in the rear of the room and decreased in the front near the window compared to rooms having conventional horizontal ceilings. Moreover, greater uniformity was achieved throughout the room as a result of reducing the difference in illuminance level between the front and rear of the room. Radiance simulation results were found to be in good agreement with physical model data obtained under a clear sky and high solar radiation. The best ceiling shape was found to be one that is curved at the front and rear of the room.
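
For context, the distribution uniformity assessed on the working plane is conventionally summarised as the ratio of minimum to average illuminance. A minimal sketch of that metric in Python (the grid values are illustrative, not the paper's measurements):

```python
# Uniformity ratio U0 = E_min / E_avg over working-plane sensor points;
# values closer to 1.0 mean a more uniform daylight distribution.
def uniformity(illuminance_lux):
    return min(illuminance_lux) / (sum(illuminance_lux) / len(illuminance_lux))

# Illustrative front-to-rear illuminance samples (lux), not measured data:
conventional_ceiling = [1200, 800, 450, 260, 150]   # bright near window
curved_ceiling = [950, 720, 520, 380, 300]          # flatter profile

print(uniformity(conventional_ceiling))  # ~0.26
print(uniformity(curved_ceiling))        # ~0.52
```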

Relevance:

30.00%

Publisher:

Abstract:

As the calibration and evaluation of flood inundation models are a prerequisite for their successful application, there is a clear need to ensure that the performance measures that quantify how well models match the available observations are fit for purpose. This paper evaluates the binary pattern performance measures that are frequently used to compare flood inundation models with observations of flood extent. This evaluation considers whether these measures are able to calibrate and evaluate model predictions in a credible and consistent way, i.e. identifying the underlying model behaviour for a number of different purposes such as comparing models of floods of different magnitudes or on different catchments. Through theoretical examples, it is shown that the binary pattern measures are not consistent for floods of different sizes, such that for the same vertical error in water level, a model of a flood of large magnitude appears to perform better than a model of a smaller magnitude flood. Further, the commonly used Critical Success Index (usually referred to as F<2>) is biased in favour of overprediction of the flood extent, and is also biased towards correctly predicting areas of the domain with smaller topographic gradients. Consequently, it is recommended that future studies consider carefully the implications of reporting conclusions using these performance measures. Additionally, future research should consider whether a more robust and consistent analysis could be achieved by using elevation comparison methods instead.
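
For reference, the Critical Success Index compares binary wet/dry maps cell by cell: F = A / (A + B + C), with A the correctly predicted wet cells, B the overpredicted cells (wet in the model only) and C the underpredicted cells (wet in the observation only). A minimal sketch (generic code, not the paper's):

```python
def critical_success_index(observed_wet, predicted_wet):
    """F = A / (A + B + C) over grid cells; both inputs are equal-length
    sequences of booleans. Correctly predicted dry cells do not enter
    the score, which is one root of the overprediction bias the
    abstract describes."""
    pairs = list(zip(observed_wet, predicted_wet))
    a = sum(o and p for o, p in pairs)          # hits
    b = sum(p and not o for o, p in pairs)      # false alarms
    c = sum(o and not p for o, p in pairs)      # misses
    return a / (a + b + c)
```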

Relevance:

30.00%

Publisher:

Abstract:

The Tropical Rainfall Measuring Mission 3B42 precipitation estimates are widely used in tropical regions for hydrometeorological research. Recently, version 7 of the product was released. Major revisions to the algorithm involve the radar reflectivity-rainfall rate relationship, surface clutter detection over high terrain, a new reference database for the passive microwave algorithm, and a higher quality gauge analysis product for monthly bias correction. To assess the impacts of the improved algorithm, we compare the version 7 and the older version 6 product with data from 263 rain gauges in and around the northern Peruvian Andes. The region covers humid tropical rainforest, tropical mountains, and arid to humid coastal plains. We find that the version 7 product has a significantly lower bias and an improved representation of the rainfall distribution. We further evaluated the performance of the version 6 and version 7 products as forcing data for hydrological modelling, by comparing the simulated and observed daily streamflow in 9 nested Amazon river basins. We find that the improvement in the precipitation estimation algorithm translates to an increase in the model Nash-Sutcliffe efficiency, and a reduction in the percent bias between the observed and simulated flows by 30 to 95%.
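
The two skill scores used in the streamflow comparison have standard definitions, sketched below (a generic implementation, not the study's code):

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """NSE = 1 - sum((sim - obs)^2) / sum((obs - mean(obs))^2);
    1 is a perfect fit, 0 is no better than the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def percent_bias(obs, sim):
    """PBIAS = 100 * sum(sim - obs) / sum(obs);
    positive values indicate overestimated flows."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(sim - obs) / np.sum(obs)
```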

Relevance:

30.00%

Publisher:

Abstract:

Past climates provide a test of models’ ability to predict climate change. We present a comprehensive evaluation of state-of-the-art models against Last Glacial Maximum and mid-Holocene climates, using reconstructions of land and ocean climates and simulations from the Palaeoclimate Modelling and Coupled Modelling Intercomparison Projects. Newer models do not perform better than earlier versions despite higher resolution and complexity. Differences in climate sensitivity only weakly account for differences in model performance. In the glacial, models consistently underestimate land cooling (especially in winter) and overestimate ocean surface cooling (especially in the tropics). In the mid-Holocene, models generally underestimate the precipitation increase in the northern monsoon regions, and overestimate summer warming in central Eurasia. Models generally capture large-scale gradients of climate change but have more limited ability to reproduce spatial patterns. Despite these common biases, some models perform better than others.

Relevance:

30.00%

Publisher:

Abstract:

We investigate the initialization of Northern-hemisphere sea ice in the global climate model ECHAM5/MPI-OM by assimilating sea-ice concentration data. The analysis updates for concentration are given by Newtonian relaxation, and we discuss different ways of specifying the analysis updates for mean thickness. Because the conservation of mean ice thickness or actual ice thickness in the analysis updates leads to poor assimilation performance, we introduce a proportional dependence between concentration and mean thickness analysis updates. Assimilation with these proportional mean-thickness analysis updates significantly reduces assimilation error both in identical-twin experiments and when assimilating sea-ice observations, reducing the concentration error by a factor of four to six, and the thickness error by a factor of two. To understand the physical aspects of assimilation errors, we construct a simple prognostic model of the sea-ice thermodynamics, and analyse its response to the assimilation. We find that the strong dependence of thermodynamic ice growth on ice concentration necessitates an adjustment of mean ice thickness in the analysis update. To understand the statistical aspects of assimilation errors, we study the model background error covariance between ice concentration and ice thickness. We find that the spatial structure of covariances is best represented by the proportional mean-thickness analysis updates. Both physical and statistical evidence supports the experimental finding that proportional mean-thickness updates are superior to the other two methods considered and enable us to assimilate sea ice in a global climate model using simple Newtonian relaxation.
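
The update scheme the abstract describes can be written compactly. A minimal per-grid-cell sketch, assuming a fixed proportionality constant between the two increments (the names and the constant are illustrative, not the ECHAM5/MPI-OM implementation):

```python
def analysis_update(conc_model, conc_obs, thick_model,
                    relax_coeff=0.1, prop_const=1.0):
    """conc_*: sea-ice concentration (0..1); thick_model: grid-cell
    mean ice thickness (m); relax_coeff: Newtonian relaxation weight;
    prop_const: metres of mean thickness added per unit concentration
    increment (an illustrative value, not the paper's)."""
    # Newtonian relaxation of concentration towards the observation.
    delta_conc = relax_coeff * (conc_obs - conc_model)
    conc_analysis = conc_model + delta_conc

    # Proportional mean-thickness update: the thickness increment
    # follows the concentration increment, rather than conserving
    # mean thickness (increment = 0) or actual thickness (thick/conc
    # held fixed), both of which the abstract reports perform poorly.
    thick_analysis = max(thick_model + prop_const * delta_conc, 0.0)
    return conc_analysis, thick_analysis
```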

Relevance:

30.00%

Publisher:

Abstract:

We investigate the initialisation of Northern Hemisphere sea ice in the global climate model ECHAM5/MPI-OM by assimilating sea-ice concentration data. The analysis updates for concentration are given by Newtonian relaxation, and we discuss different ways of specifying the analysis updates for mean thickness. Because the conservation of mean ice thickness or actual ice thickness in the analysis updates leads to poor assimilation performance, we introduce a proportional dependence between concentration and mean thickness analysis updates. Assimilation with these proportional mean-thickness analysis updates leads to good assimilation performance for sea-ice concentration and thickness, both in identical-twin experiments and when assimilating sea-ice observations. The simulation of other Arctic surface fields in the coupled model is, however, not significantly improved by the assimilation. To understand the physical aspects of assimilation errors, we construct a simple prognostic model of the sea-ice thermodynamics, and analyse its response to the assimilation. We find that an adjustment of mean ice thickness in the analysis update is essential to arrive at plausible state estimates. To understand the statistical aspects of assimilation errors, we study the model background error covariance between ice concentration and ice thickness. We find that the spatial structure of covariances is best represented by the proportional mean-thickness analysis updates. Both physical and statistical evidence supports the experimental finding that assimilation with proportional mean-thickness updates outperforms the other two methods considered. The method described here is very simple to implement, and gives results that are sufficiently good to be used for initialising sea ice in a global climate model for seasonal to decadal predictions.

Relevance:

30.00%

Publisher:

Abstract:

This article examines the ability of several models to generate optimal hedge ratios. Statistical models employed include univariate and multivariate generalized autoregressive conditionally heteroscedastic (GARCH) models, and exponentially weighted and simple moving averages. The variances of the hedged portfolios derived using these hedge ratios are compared with those based on market expectations implied by the prices of traded options. One-month and three-month hedging horizons are considered for four currency pairs. Overall, it has been found that an exponentially weighted moving-average model leads to lower portfolio variances than any of the GARCH-based, implied or time-invariant approaches.
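
All of these models feed the same minimum-variance hedge ratio, h* = Cov(Δs, Δf) / Var(Δf); they differ only in how the second moments are estimated. A minimal sketch of the exponentially weighted moving-average estimator that performs best here (the decay value is the common RiskMetrics default, used purely for illustration):

```python
def ewma_hedge_ratio(spot_returns, futures_returns, lam=0.94):
    """Minimum-variance hedge ratio h* = Cov(s, f) / Var(f), with both
    moments estimated by an EWMA of decay lam. Returns are assumed to
    be approximately zero-mean, as in the RiskMetrics convention."""
    cov = var = 0.0
    for s, f in zip(spot_returns, futures_returns):
        cov = lam * cov + (1.0 - lam) * s * f
        var = lam * var + (1.0 - lam) * f * f
    return cov / var
```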

Relevance:

30.00%

Publisher:

Abstract:

This paper employs a vector autoregressive model to investigate the impact of macroeconomic and financial variables on a UK real estate return series. The results indicate that unexpected inflation and the interest rate term spread have explanatory power for the property market. However, the most significant influences on the real estate series are its own lagged values. We conclude that identifying the factors that have determined UK property returns over the past twelve years remains a difficult task.
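
A VAR of the kind employed here regresses each series on lags of every series in the system. A minimal sketch with statsmodels (the file name and column names are placeholders, not the paper's data):

```python
import pandas as pd
from statsmodels.tsa.api import VAR

# Placeholder dataset: monthly UK series with illustrative column names.
df = pd.read_csv("uk_series.csv", parse_dates=["date"], index_col="date")

model = VAR(df[["property_return", "unexp_inflation", "term_spread"]])
results = model.fit(maxlags=4, ic="aic")  # lag order selected by AIC
print(results.summary())

# Do the macro variables Granger-cause property returns?
print(results.test_causality("property_return",
                             ["unexp_inflation", "term_spread"]))
```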

Relevance:

30.00%

Publisher:

Abstract:

Urban land surface models (LSMs) are commonly evaluated for short periods (a few weeks to months) because of limited observational data. This makes it difficult to distinguish the impact of initial conditions on model performance or to consider the response of a model to a range of possible atmospheric conditions. Drawing on results from the first urban LSM comparison, these two issues are considered. Assessment shows that the initial soil moisture has a substantial impact on performance. Models initialised with soils that are too dry are not able to adjust their surface sensible and latent heat fluxes to realistic values until there is sufficient rainfall. Models initialised with soils that are too wet are not able to restrict their evaporation appropriately for periods in excess of a year. This has implications for short-term evaluation studies and implies the need for soil moisture measurements to improve data assimilation and model initialisation. In contrast, initial conditions influencing the thermal storage have a much shorter adjustment timescale than soil moisture. Most models partition too much of the radiative energy at the surface into the sensible heat flux, at the probable expense of the net storage heat flux.

Relevance:

30.00%

Publisher:

Abstract:

We describe Global Atmosphere 4.0 (GA4.0) and Global Land 4.0 (GL4.0): configurations of the Met Office Unified Model and JULES (Joint UK Land Environment Simulator) community land surface model developed for use in global and regional climate research and weather prediction activities. GA4.0 and GL4.0 are based on the previous GA3.0 and GL3.0 configurations, with the inclusion of developments made by the Met Office and its collaborators during its annual development cycle. This paper provides a comprehensive technical and scientific description of GA4.0 and GL4.0, as well as details of how they differ from their predecessors. We also present the results of some initial evaluations of their performance. Overall, performance is comparable with that of GA3.0/GL3.0; however, the updated configurations include improvements to the science of several parametrisation schemes and will form a baseline for further ongoing development.

Relevance:

30.00%

Publisher:

Abstract:

In order to examine metacognitive accuracy (i.e., the relationship between metacognitive judgment and memory performance), researchers often rely on by-participant analysis, where metacognitive accuracy (e.g., resolution, as measured by the gamma coefficient or signal detection measures) is computed for each participant and the computed values are entered into group-level statistical tests such as the t-test. In the current work, we argue that by-participant analysis, regardless of the accuracy measure used, produces a substantial inflation of Type-1 error rates when a random item effect is present. A mixed-effects model is proposed as a way to address the issue effectively, and our simulation studies examining Type-1 error rates indeed showed superior performance of mixed-effects model analysis compared to the conventional by-participant analysis. We also present real data applications to illustrate further strengths of mixed-effects model analysis. Our findings imply that caution is needed when using by-participant analysis, and we recommend mixed-effects model analysis.
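
For reference, the gamma coefficient referred to above is the Goodman-Kruskal gamma over item pairs, computed per participant in the by-participant approach. A minimal sketch of that per-participant step (generic code, not the authors'):

```python
def goodman_kruskal_gamma(judgments, accuracies):
    """Gamma = (C - D) / (C + D), where C counts concordant item pairs
    (a higher judgment goes with better memory performance) and D
    counts discordant pairs; ties on either variable are skipped."""
    concordant = discordant = 0
    n = len(judgments)
    for i in range(n):
        for j in range(i + 1, n):
            sign = (judgments[i] - judgments[j]) * (accuracies[i] - accuracies[j])
            if sign > 0:
                concordant += 1
            elif sign < 0:
                discordant += 1
    return (concordant - discordant) / (concordant + discordant)
```

The mixed-effects alternative argued for in the abstract would instead model trial-level accuracy with crossed random effects for participants and items (e.g., an lme4-style formula such as correct ~ judgment + (1 | participant) + (1 | item)).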

Relevance:

30.00%

Publisher:

Abstract:

Understanding the performance of banks is of the utmost importance due to the impact the sector may have on economic growth and financial stability. Residential mortgage loans constitute a large proportion of the portfolio of many banks and are one of the key assets in the determination of their performance. Using a dynamic panel model, we analyse the impact of residential mortgage loans on bank profitability and risk, based on a sample of 555 banks in the European Union (EU-15) over the period from 1995 to 2008. We find that an increase in residential mortgage loans seems to improve banks' performance in terms of both profitability and credit risk in good, pre-financial-crisis market conditions. These findings may help explain why banks rush to lend to property during booms, given the positive effect such lending has on performance. The results also show that credit risk and profitability are lower during the upturn in the residential property cycle.
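
A dynamic panel specification of the kind described would look roughly as follows (a sketch in LaTeX; the variable names and controls are illustrative, not the paper's exact model):

```latex
% pi_{it}: profitability (or risk) of bank i in year t;
% MORT_{it}: residential mortgage loan share; X_{it}: controls;
% eta_i: bank-specific effect.
\[
  \pi_{it} = \alpha\,\pi_{i,t-1} + \beta\,\mathrm{MORT}_{it}
           + \gamma^{\prime} X_{it} + \eta_i + \varepsilon_{it}
\]
% The lagged dependent variable is correlated with eta_i, so such
% models are typically estimated by GMM rather than OLS.
```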

Relevance:

30.00%

Publisher:

Abstract:

Performance modelling is a useful tool in the lifecycle of high performance scientific software, such as weather and climate models, especially as a means of ensuring efficient use of available computing resources. In particular, sufficiently accurate performance prediction could reduce the effort and experimental computer time required when porting and optimising a climate model to a new machine. In this paper, traditional techniques are used to predict the computation time of a simple shallow water model which is illustrative of the computation (and communication) involved in climate models. These models are compared with real execution data gathered on AMD Opteron-based systems, including several phases of the U.K. academic community HPC resource, HECToR. The method has some success in relating source code to achieved performance for the K10 series of Opterons, but it is found to be inadequate for the next-generation Interlagos processor. This experience leads to the investigation of a data-driven application benchmarking approach to performance modelling. Results for an early version of the approach are presented, using the shallow water model as an example.
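
A traditional analytic model of the kind applied first expresses runtime as a sum of computation and communication terms and fits the coefficients to measurements. A minimal sketch (the functional form and all numbers are illustrative, not HECToR data):

```python
import numpy as np

# Assumed model: t(n, p) = a*n^2/p + b*n/sqrt(p) + c
# (computation ~ local grid points, communication ~ halo exchange).
def fit_coefficients(ns, ps, times):
    A = np.column_stack([ns**2 / ps, ns / np.sqrt(ps), np.ones_like(ns)])
    coeffs, *_ = np.linalg.lstsq(A, times, rcond=None)
    return coeffs

def predict_time(n, p, coeffs):
    a, b, c = coeffs
    return a * n**2 / p + b * n / np.sqrt(p) + c

# Illustrative runs: grid size n, process count p, measured runtime (s).
ns = np.array([256.0, 256.0, 512.0, 512.0, 1024.0])
ps = np.array([4.0, 16.0, 16.0, 64.0, 64.0])
times = np.array([1.9, 0.6, 2.1, 0.7, 2.4])

coeffs = fit_coefficients(ns, ps, times)
print(predict_time(1024.0, 256.0, coeffs))  # extrapolated prediction
```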

Relevance:

30.00%

Publisher:

Abstract:

A great number of studies on wind conditions in passages between slab-type buildings have been conducted in the past. However, wind conditions under different building structures and configurations are still unclear, and existing studies cannot provide guidance on urban planning and design, owing to the complexity of buildings and aerodynamics. The aim of this paper is to provide more insight into the mechanism of wind conditions in passages. In this paper, a simplified passage model with non-parallel buildings is developed on the basis of the wind tunnel experiments conducted by Blocken et al. (2008). Numerical simulation based on CFD is employed for a detailed investigation of the wind environment in passages between two long narrow buildings at different orientations, and model validation is performed by comparing numerical results with the corresponding wind tunnel measurements.

Relevance:

30.00%

Publisher:

Abstract:

This paper addresses the economics of Enhanced Landfill Mining (ELFM) both from a private point of view and from a society perspective. The private potential is assessed using a case study for which an investment model is developed to identify the impact of a broad range of parameters on the profitability of ELFM. We found that especially variations in Waste-to-Energy parameters (WtE efficiency, electricity price, CO2 price, and WtE investment and operational costs) and in ELFM support explain the variation in economic profitability, as measured by the Internal Rate of Return. To overcome site-specific parameters we also evaluated the regional ELFM potential for the densely populated and industrial region of Flanders (north of Belgium). The total number of potential ELFM sites was estimated using a 5-step procedure, and a simulation tool was developed to trade off private costs and benefits. The analysis shows that there is a substantial economic potential for ELFM projects at the wider regional level. Furthermore, this paper also reviews the costs and benefits from a broader perspective. The carbon footprint of the case study was mapped in order to assess the project's net impact in terms of greenhouse gas emissions. The impacts of nature restoration, soil remediation, resource scarcity and reduced import dependence were also valued so that they can be used in future social cost-benefit analysis. Given the complex trade-off between economic, social and environmental issues in ELFM projects, we conclude that further refinement of the methodological framework and the development of integrated decision tools supporting private and public actors are necessary.
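
The Internal Rate of Return used as the profitability measure is the discount rate at which the project's net present value is zero. A minimal sketch solved by bisection (the cash-flow profile is illustrative, not the case-study figures):

```python
def npv(rate, cashflows):
    # Net present value; cashflows[t] is the net cash flow in year t.
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-9):
    # Bisection; assumes NPV crosses zero exactly once on [lo, hi],
    # which holds for an up-front investment followed by net revenues.
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if npv(mid, cashflows) > 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Illustrative ELFM-style profile: up-front WtE investment, then
# fifteen years of net revenues from energy and materials recovery.
print(irr([-100.0] + [14.0] * 15))  # about 0.11, i.e. an IRR of ~11%
```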