928 results for HEVC Performance Modelling


Relevance: 30.00%

Publisher:

Abstract:

Johne's disease in cattle is a contagious wasting disease caused by Mycobacterium avium subspecies paratuberculosis (MAP). Johne's infection is characterised by a long subclinical phase and can therefore go undetected for long periods, during which substantial production losses can occur. The protracted nature of Johne's infection therefore presents a challenge for both veterinarians and farmers when discussing control options, owing to a paucity of information and limited test performance when screening for the disease. The objective was to model Johne's control decisions in suckler beef cattle using a decision support approach, giving equal weight to 'end user' (veterinarian) participation and to the technical disease-modelling aspects of the decision support model's development. The model shows how Johne's disease is likely to affect a herd over time, in terms of both physical and financial impacts. In addition, the model simulates the effect on production of two control strategies: herd management measures, and test-and-cull measures. The article also provides and discusses results from a sensitivity analysis assessing the effects on production of improving currently available test performance. Output from the model shows that a combination of management improvements to reduce routes of infection and testing and culling to remove infected and infectious animals is likely to be the least-cost control strategy.
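
As a toy illustration of why repeated testing can suppress a long-subclinical infection even with limited test sensitivity, here is a minimal herd sketch (all parameters — herd size, transmission rate, 60% test sensitivity — are hypothetical, not taken from the paper's model):

```python
import numpy as np

rng = np.random.default_rng(11)

def simulate(years=10, herd=100, init_infected=10, beta=0.15,
             test=False, sensitivity=0.6):
    """Toy herd model: each year infected animals transmit to susceptibles;
    with test-and-cull, detected animals are removed and replaced clean."""
    infected = init_infected
    for _ in range(years):
        susceptible = herd - infected
        # New infections: each susceptible is infected w.p. beta * prevalence
        p = beta * infected / herd
        infected += rng.binomial(susceptible, p)
        if test:
            detected = rng.binomial(infected, sensitivity)
            infected -= detected  # culled and replaced with clean stock
    return infected

no_control = simulate(test=False)
with_cull = simulate(test=True)
# Annual testing and culling keeps prevalence well below the uncontrolled herd
```

Even a test of modest sensitivity suppresses prevalence when applied repeatedly, consistent with the abstract's finding that testing works best alongside management measures that reduce transmission.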

An unlisted property fund is a private investment vehicle which aims to deliver direct property total returns and may also employ financial leverage, which will accentuate performance. Such funds have become a far more prevalent institutional property investment conduit since the early 2000s. Investors have been attracted to them primarily by the ease of executing a property exposure, both domestically and internationally, and by their diversification benefits, given the capital-intensive nature of constructing a well-diversified commercial property investment portfolio. However, despite their greater prominence, there has been little academic research on the performance and risks of unlisted property fund investments. This can be attributed to a paucity of available data and, where data exist, limited time series. In this study we make use of a unique dataset of institutional UK unlisted property funds over the period 2003Q4 to 2011Q4, using a panel modelling framework to determine the key factors that affect fund performance. The sample provides a rich set of unlisted property fund factors, including market exposures, direct property characteristics and the level of financial leverage employed. The findings from the panel regression analysis show that a small number of variables can account for the performance of unlisted property funds. These variables should be considered by investors when assessing the risk and return of these vehicles. The impact of financial leverage on the performance of these vehicles through the recent global financial crisis and subsequent UK commercial property market downturn is also studied. The findings indicate a significant asymmetric effect of employing debt finance within unlisted property funds.
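
An asymmetric leverage effect of this kind is commonly captured by interacting leverage with a downturn dummy in a return regression. The following sketch does this on synthetic fund data (the variable names and magnitudes are illustrative, not the actual IPD panel specification):

```python
import numpy as np

rng = np.random.default_rng(7)
n_funds, n_quarters = 50, 33                     # roughly 2003Q4-2011Q4

market = rng.normal(0.01, 0.03, n_quarters)      # market total return
downturn = (market < 0).astype(float)            # downturn-quarter dummy
leverage = rng.uniform(0.0, 0.6, n_funds)        # loan-to-value per fund

# Synthetic fund returns: leverage amplifies the market return and imposes
# an extra (hypothetical) drag in downturn quarters
rows, y = [], []
for i in range(n_funds):
    for t in range(n_quarters):
        r = (1 + leverage[i]) * market[t] - 0.02 * leverage[i] * downturn[t]
        y.append(r + rng.normal(0, 0.01))
        rows.append([1.0, market[t], leverage[i] * market[t],
                     leverage[i] * downturn[t]])

X, y = np.array(rows), np.array(y)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
# beta[3] estimates the extra cost of leverage in downturn quarters
```

A negative coefficient on the leverage-downturn interaction is the regression signature of the asymmetry the abstract reports.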

The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change arise mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenge of developing the capability to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities, each with computing capability of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have a scientific workforce sufficient to develop and maintain the software and data analysis infrastructure.
Such facilities will make it possible to determine what horizontal and vertical resolution in atmospheric and ocean models is necessary for more confident predictions at the regional and local level. Current limitations in computing power have placed severe constraints on such investigations, which are now badly needed. These facilities will also provide the world's scientists with computational laboratories for fundamental research on weather-climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure, including hardware, software, and data analysis support, and the scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it, and will ultimately enable the climate community to provide society with climate predictions based on our best scientific knowledge and the most advanced technology.

Simulating spiking neural networks is of great interest to scientists wanting to model the functioning of the brain. However, large-scale models are expensive to simulate due to the number and interconnectedness of neurons in the brain. Furthermore, where such simulations are used in an embodied setting, the simulation must be real-time in order to be useful. In this paper we present NeMo, a platform for such simulations which achieves high performance through the use of highly parallel commodity hardware in the form of graphics processing units (GPUs). NeMo makes use of the Izhikevich neuron model which provides a range of realistic spiking dynamics while being computationally efficient. Our GPU kernel can deliver up to 400 million spikes per second. This corresponds to a real-time simulation of around 40 000 neurons under biologically plausible conditions with 1000 synapses per neuron and a mean firing rate of 10 Hz.
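
The Izhikevich model couples a fast membrane potential v with a slower recovery variable u, producing realistic spiking from just two equations. Here is a minimal single-neuron sketch using the standard regular-spiking parameters (this is an illustration of the neuron model, not NeMo's GPU kernel):

```python
def izhikevich(I, T=1000.0, dt=1.0, a=0.02, b=0.2, c=-65.0, d=8.0):
    """Simulate one Izhikevich neuron for T ms; return spike times in ms."""
    v, u = c, b * c              # membrane potential (mV), recovery variable
    spikes = []
    for step in range(int(T / dt)):
        # Izhikevich (2003) dynamics: v' = 0.04 v^2 + 5 v + 140 - u + I
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:            # spike: reset v and bump the recovery term
            spikes.append(step * dt)
            v, u = c, u + d
    return spikes

spikes = izhikevich(I=10.0)      # tonic spiking under sustained input
```

With zero input current the neuron settles to its resting potential and stays silent; sustained input produces the regular spike train that makes the model a good computational stand-in for biological neurons.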

This study attempts to fill the existing gap in the simulation of variable flow distribution systems by developing new pressure-governing components. These components capture the ever-changing system performance curve in variable flow distribution systems and predict problematic conditions such as flow starvation, overflow and lack of controllability of the flow rate in different branches of a hydronic system. The performance of the proposed components is verified using a case study under design and off-design conditions. Full integration of the new components within the TRNSYS simulation package is a further advantage of this study, making it more applicable for designers in both the design and commissioning of hydronic systems.
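
The "ever-changing system performance curve" can be pictured as the moving intersection of a pump curve and a system curve as valves throttle. A minimal sketch with hypothetical quadratic curves (not the TRNSYS components themselves):

```python
import math

def operating_point(h0, a, k):
    """Intersection of a quadratic pump curve H = h0 - a*Q^2 with the
    system curve H = k*Q^2: the flow the network actually delivers."""
    q = math.sqrt(h0 / (a + k))
    return q, k * q * q

# As valves throttle (system coefficient k rises), flow falls and the
# operating head rises along the pump curve — the basic mechanism behind
# starvation and overflow in badly balanced variable flow networks.
q_design, h_design = operating_point(h0=30.0, a=0.02, k=0.05)
q_throttled, h_throttled = operating_point(h0=30.0, a=0.02, k=0.20)
```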

The heat pump market in the UK has grown rapidly over the last few years. Performance analyses of vertical ground-loop heat exchanger configurations have been widely carried out using both numerical modelling and experiments. However, research findings and design recommendations for horizontal and vertical slinky-loop heat exchangers are far scarcer, especially where the long-term operation of the systems is concerned. This paper presents the results of a numerical simulation of the horizontal and vertical slinky-loop heat exchangers of a ground-source heat pump system. A three-dimensional numerical heat transfer model was developed to study the thermal performance of various heat exchanger configurations. The influence of the loop pitch (loop spacing) and the installation depth of vertical slinky-loop configurations was investigated, and the thermal performance and excavation work required for horizontal and vertical slinky-loop heat exchangers were compared. The results show that the installation depth of the vertical slinky-loop heat exchanger has only a small influence on the thermal performance of the system. The maximum difference in thermal performance between vertical and horizontal slinky-loop heat exchangers with the same loop diameter and loop pitch is less than 5%.

Satellite-based Synthetic Aperture Radar (SAR) has proved useful for obtaining information on flood extent which, when intersected with a Digital Elevation Model (DEM) of the floodplain, provides water level observations that can be assimilated into a hydrodynamic model to decrease forecast uncertainty. With an increasing number of operational satellites with SAR capability, information on the relationship between satellite first visit and revisit times and forecast performance is required to optimise the operational scheduling of satellite imagery. Using an Ensemble Transform Kalman Filter (ETKF) and a synthetic analysis with the 2D hydrodynamic model LISFLOOD-FP, based on a real flood affecting an urban area (summer 2007, Tewkesbury, south-west UK), we evaluate the sensitivity of forecast performance to these visit parameters. We emulate a generic hydrologic-hydrodynamic modelling cascade by imposing a bias and spatiotemporal correlations on the inflow error ensemble entering the hydrodynamic domain. First, in agreement with previous research, estimating and correcting for this bias clearly improves the forecast's ability to stay on track. Second, imagery obtained early in the flood is shown to have a large influence on forecast statistics, and the revisit interval is most influential for these early observations. The results are promising for the future of remote sensing-based water level observations for real-time flood forecasting in complex scenarios.
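
The assimilation step at the heart of such a system can be illustrated with a scalar ensemble Kalman update. For simplicity this sketch uses a stochastic (perturbed-observations) EnKF rather than the paper's ETKF, and all numbers are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical forecast ensemble of water levels at one SAR/DEM point (m)
ensemble = rng.normal(loc=10.5, scale=0.4, size=50)
obs, obs_err = 10.0, 0.2          # derived water level and its error std

# Kalman gain from ensemble variance and observation-error variance
K = ensemble.var(ddof=1) / (ensemble.var(ddof=1) + obs_err**2)

# Perturbed-observations update: each member is nudged toward an
# independently perturbed copy of the observation
perturbed = obs + rng.normal(0.0, obs_err, size=ensemble.size)
analysis = ensemble + K * (perturbed - ensemble)
```

The analysis mean moves toward the observation and the ensemble spread shrinks, which is how a timely SAR-derived water level pulls a drifting forecast back on track.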

A physically based gust parameterisation is added to the atmospheric mesoscale model FOOT3DK to estimate wind gusts associated with storms over West Germany. The gust parameterisation follows the Wind Gust Estimate (WGE) method and its functionality is verified in this study. The method assumes that gusts occurring at the surface are induced by turbulent eddies in the planetary boundary layer, deflecting air parcels from higher levels down to the surface under suitable conditions. Model simulations are performed with horizontal resolutions of 20 km and 5 km. Ten historical storm events of different characteristics and intensities are chosen in order to include a wide range of typical storms affecting Central Europe. All simulated storms occurred between 1990 and 1998. The accuracy of the method is assessed objectively by validating the simulated wind gusts against data from 16 synoptic stations by means of “quality parameters”. Concerning these parameters, the temporal and spatial evolution of the simulated gusts is well reproduced. Simulated values for low altitude stations agree particularly well with the measured gusts. For orographically exposed locations, the gust speeds are partly underestimated. The absolute maximum gusts lie in most cases within the bounding interval given by the WGE method. Focussing on individual storms, the performance of the method is better for intense and large storms than for weaker ones. Particularly for weaker storms, the gusts are typically overestimated. The results for the sample of ten storms document that the method is generally applicable with the mesoscale model FOOT3DK for mid-latitude winter storms, even in areas with complex orography.
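
The parcel-deflection idea behind the WGE method can be sketched directly: a level's wind is a candidate surface gust when the mean turbulent kinetic energy of the layer below it exceeds the buoyant energy resisting the parcel's descent, and the gust estimate is the strongest candidate. The following is a schematic sketch with a hypothetical profile, not the FOOT3DK implementation:

```python
import numpy as np

def _trapz(y, x):
    """Trapezoidal integral of y over x (avoids NumPy version differences)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x) / 2.0))

def wge_gust(z, wind, tke, buoyant_resistance):
    """Schematic WGE: a level's wind can reach the surface as a gust when the
    mean TKE of the layer below it exceeds the integrated buoyant energy
    resisting the parcel's descent."""
    gust = wind[0]                                    # surface wind as a floor
    for k in range(1, len(z)):
        mean_tke = _trapz(tke[: k + 1], z[: k + 1]) / z[k]
        resistance = _trapz(buoyant_resistance[: k + 1], z[: k + 1])
        if mean_tke >= resistance:                    # parcel can be deflected
            gust = max(gust, wind[k])
    return gust

# Hypothetical profile: strong winds aloft, TKE large enough below ~500 m
z = np.array([10.0, 100.0, 300.0, 500.0, 1000.0])     # heights (m)
wind = np.array([15.0, 22.0, 28.0, 32.0, 40.0])       # wind speed (m/s)
tke = np.array([4.0, 3.5, 3.0, 2.0, 0.5])             # TKE (m^2/s^2)
stability = np.array([0.0, 0.001, 0.002, 0.004, 0.01])
gust = wge_gust(z, wind, tke, stability)              # the 500 m level wind
```

In this toy profile the 1000 m level is too strongly capped for its 40 m/s wind to reach the surface, so the estimate comes from the deepest eligible level.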

Flood prediction systems rely on good quality precipitation input data and forecasts to drive hydrological models. Most precipitation data come from daily stations with good spatial coverage. However, some flood events occur on sub-daily time scales, and flood prediction systems could benefit from models calibrated on the same time scale. This study compares precipitation data aggregated from hourly stations (HP) and data disaggregated from daily stations (DP) with 6-hourly forecasts from ECMWF over the period 1 October 2006 to 31 December 2009. The HP and DP datasets were then used to calibrate two hydrological models, LISFLOOD-RR and HBV, and the latter was used in a flood case study. The HP dataset scored better than the DP dataset when evaluated against the forecasts for lead times up to 4 days. However, this advantage did not carry over to the hydrological modelling, where the models gave similar scores for simulated runoff with the two datasets. The flood forecasting study showed that both datasets gave similar hit rates, whereas the HP dataset gave much smaller false alarm rates (FAR). This indicates that using sub-daily precipitation in the calibration and initialisation of hydrological models can improve flood forecasting.
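
Hit rate and false alarm rate come from the standard 2x2 contingency table of forecast versus observed events. A minimal sketch with hypothetical flood-exceedance flags:

```python
def hit_rate_and_far(forecast, observed):
    """Hit rate = hits / (hits + misses);
    FAR = false alarms / (hits + false alarms)."""
    hits = sum(f and o for f, o in zip(forecast, observed))
    misses = sum((not f) and o for f, o in zip(forecast, observed))
    false_alarms = sum(f and (not o) for f, o in zip(forecast, observed))
    hit_rate = hits / (hits + misses) if hits + misses else float("nan")
    far = false_alarms / (hits + false_alarms) if hits + false_alarms else 0.0
    return hit_rate, far

# Hypothetical daily flags: did discharge exceed the flood threshold?
forecast = [True, True, False, True, False, False, True, False]
observed = [True, False, False, True, False, True, True, False]
hr, far = hit_rate_and_far(forecast, observed)   # hr = 3/4, far = 1/4
```

Two forecast systems can share a hit rate yet differ sharply in FAR, which is exactly the pattern the abstract reports for the HP and DP datasets.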

We analyse by simulation the impact of model-selection strategies (sometimes called pre-testing) on forecast performance in both constant- and non-constant-parameter processes. Restricted, unrestricted and selected models are compared when either of the first two might generate the data. We find little evidence that strategies such as general-to-specific induce significant over-fitting, or thereby cause forecast-failure rejection rates to greatly exceed nominal sizes. Parameter non-constancies put a premium on correct specification but, in general, model-selection effects appear to be relatively small, and progressive research is able to detect the mis-specifications.
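
The kind of experiment described here can be miniaturised: generate data from a restricted model and compare one-step-ahead forecasts from the restricted model, an unrestricted model carrying an irrelevant regressor, and a "selected" model that keeps the extra regressor only when it appears significant. This is a toy illustration, not the paper's design:

```python
import numpy as np

rng = np.random.default_rng(3)
reps, n = 500, 100
sq_err = {"restricted": [], "unrestricted": [], "selected": []}

for _ in range(reps):
    x = rng.normal(size=n + 1)
    z = rng.normal(size=n + 1)                    # irrelevant regressor
    y = 1.0 + 0.5 * x + rng.normal(size=n + 1)    # DGP: z plays no role
    Xr = np.column_stack([np.ones(n), x[:n]])             # restricted (true)
    Xu = np.column_stack([np.ones(n), x[:n], z[:n]])      # unrestricted
    br, *_ = np.linalg.lstsq(Xr, y[:n], rcond=None)
    bu, *_ = np.linalg.lstsq(Xu, y[:n], rcond=None)
    # pre-test: keep z only when its coefficient is ~2 standard errors from 0
    resid = y[:n] - Xu @ bu
    se_z = np.sqrt(resid @ resid / (n - 3) * np.linalg.inv(Xu.T @ Xu)[2, 2])
    keep_z = abs(bu[2]) / se_z > 2.0
    fr = br @ [1.0, x[n]]                         # one-step-ahead forecasts
    fu = bu @ [1.0, x[n], z[n]]
    sq_err["restricted"].append((y[n] - fr) ** 2)
    sq_err["unrestricted"].append((y[n] - fu) ** 2)
    sq_err["selected"].append((y[n] - (fu if keep_z else fr)) ** 2)

mse = {k: float(np.mean(v)) for k, v in sq_err.items()}
# Pre-test selection barely inflates forecast MSE relative to the true model
```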

The Tropical Rainfall Measuring Mission (TRMM) 3B42 precipitation estimates are widely used in tropical regions for hydrometeorological research. Recently, version 7 of the product was released. Major revisions to the algorithm involve the radar reflectivity-rainfall rate relationship, surface clutter detection over high terrain, a new reference database for the passive microwave algorithm, and a higher quality gauge analysis product for monthly bias correction. To assess the impact of the improved algorithm, we compare the version 7 product and the older version 6 product with data from 263 rain gauges in and around the northern Peruvian Andes. The region covers humid tropical rainforest, tropical mountains, and arid to humid coastal plains. We find that the version 7 product has a significantly lower bias and an improved representation of the rainfall distribution. We further evaluated the performance of the version 6 and 7 products as forcing data for hydrological modelling by comparing the simulated and observed daily streamflow in 9 nested Amazon river basins. We find that the improvement in the precipitation estimation algorithm translates into an increase in the model's Nash-Sutcliffe efficiency and a reduction of 30 to 95% in the percent bias between observed and simulated flows.
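
The two goodness-of-fit measures used here are standard: the Nash-Sutcliffe efficiency compares squared simulation errors to the variance of the observations, and percent bias compares total simulated and observed volume. A minimal sketch with hypothetical discharge values (note that the sign convention for percent bias varies between authors):

```python
def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means the simulation
    is no better than the mean of the observations."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

def pbias(obs, sim):
    """Percent bias: positive here means the simulation overestimates
    total flow volume."""
    return 100.0 * (sum(sim) - sum(obs)) / sum(obs)

# Hypothetical daily streamflow (m^3/s)
obs = [100.0, 150.0, 200.0, 180.0, 120.0]
sim = [110.0, 140.0, 210.0, 170.0, 130.0]
score = nse(obs, sim)   # close to 1 for a good simulation
bias = pbias(obs, sim)  # percent over/under-estimation of total volume
```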

Purpose – Price indices for commercial real estate markets are difficult to construct because assets are heterogeneous, spatially dispersed and infrequently traded. Appraisal-based indices are one response to these problems, but they may understate volatility or fail to capture turning points in a timely manner. This paper estimates "transaction linked indices" for major European markets to see whether these offer a different perspective on market performance.
Design/methodology/approach – The assessed value method is used to construct the indices. This method has recently been applied to commercial real estate datasets in the USA and UK. The underlying data comprise appraisals and sale prices for assets monitored by Investment Property Databank (IPD). The indices are compared to appraisal-based series for the countries concerned for Q4 2001 to Q4 2012.
Findings – Transaction linked indices show stronger growth and sharper declines over the course of the cycle, but they do not notably lead their appraisal-based counterparts. They are typically two to four times more volatile.
Research limitations/implications – Only country-level indicators could be constructed in many cases owing to low trading volumes in the period studied, and this same issue prevented sample selection bias from being analysed in depth.
Originality/value – Discussion of the utility of transaction-based price indicators is extended to European commercial real estate markets. The indicators offer alternative estimates of real estate market volatility that may be useful in asset allocation and risk modelling, including in a regulatory context.
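
The assessed value method regresses transaction prices against prior appraisals with time dummies, so the dummy coefficients trace out a transaction-linked index. The following is a schematic variant on synthetic data, regressing the log price-to-appraisal ratio on period dummies (not the IPD dataset or the exact published specification):

```python
import numpy as np

rng = np.random.default_rng(0)

n, periods = 400, 8
t = rng.integers(0, periods, size=n)              # sale period of each asset
true_index = np.array([0.0, 0.02, 0.05, 0.01, -0.08, -0.15, -0.10, -0.04])

# Synthetic log(sale price / prior appraisal): market movement plus noise
y = true_index[t] + rng.normal(0.0, 0.05, size=n)

# OLS on period dummies recovers the average price/appraisal ratio per period
X = np.zeros((n, periods))
X[np.arange(n), t] = 1.0
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

index = 100.0 * np.exp(coef - coef[0])            # rebased to 100 in period 0
```

Because the index is driven by realised prices rather than smoothed appraisals, it naturally shows the stronger swings, and higher volatility, described in the findings.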

Insect pollination benefits over three quarters of the world's major crops. There is growing concern that observed declines in pollinators may impact on production and revenues from animal pollinated crops. Knowing the distribution of pollinators is therefore crucial for estimating their availability to pollinate crops; however, in general, we have an incomplete knowledge of where these pollinators occur. We propose a method to predict geographical patterns of pollination service to crops, novel in two elements: the use of pollinator records rather than expert knowledge to predict pollinator occurrence, and the inclusion of the managed pollinator supply. We integrated a maximum entropy species distribution model (SDM) with an existing pollination service model (PSM) to derive the availability of pollinators for crop pollination. We used nation-wide records of wild and managed pollinators (honey bees) as well as agricultural data from Great Britain. We first calibrated the SDM on a representative sample of bee and hoverfly crop pollinator species, evaluating the effects of different settings on model performance and on its capacity to identify the most important predictors. The importance of the different predictors was better resolved by SDM derived from simpler functions, with consistent results for bees and hoverflies. We then used the species distributions from the calibrated model to predict pollination service of wild and managed pollinators, using field beans as a test case. The PSM allowed us to spatially characterize the contribution of wild and managed pollinators and also identify areas potentially vulnerable to low pollination service provision, which can help direct local scale interventions. This approach can be extended to investigate geographical mismatches between crop pollination demand and the availability of pollinators, resulting from environmental change or policy scenarios.

In recent years, computational fluid dynamics (CFD) has been widely used to simulate airflow and address indoor environment problems. The complexity of airflows within the indoor environment makes experimental investigation difficult to undertake and also poses significant challenges for turbulence modelling of the flow. This research examines through CFD visualization how air is distributed within a room. Air temperature and air velocity were measured at a number of points in an environmental test chamber with a human occupant. To complement the experimental results, CFD simulations were carried out, enabling detailed analysis and visualization of the spatial distribution of airflow patterns and prediction of the effects of different parameters. The results demonstrate the complexity of modelling human exhalation within a ventilated enclosure and shed some light on how to achieve more realistic predictions of the airflow within an occupied enclosure.

When studying hydrological processes with a numerical model, global sensitivity analysis (GSA) is essential if one is to understand the impact of model parameters and model formulation on results. However, different definitions of sensitivity can lead to a difference in the ranking of importance of the different model factors. Here we combine a fuzzy performance function with different methods of calculating global sensitivity to perform a multi-method global sensitivity analysis (MMGSA). We use an application of a finite element subsurface flow model (ESTEL-2D) on a flood inundation event on a floodplain of the River Severn to illustrate this new methodology. We demonstrate the utility of the method for model understanding and show how the prediction of state variables, such as Darcian velocity vectors, can be affected by such a MMGSA. This paper is a first attempt to use GSA with a numerically intensive hydrological model.
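
The core point, that different definitions of sensitivity can rank factors differently, is easy to demonstrate with a toy model and two cheap global measures: an absolute-correlation measure and a variance-of-conditional-means (first-order, Sobol-like) measure. This is a schematic sketch, not the ESTEL-2D analysis:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10000
x1 = rng.uniform(-1, 1, n)
x2 = rng.uniform(-1, 1, n)

# Toy model: x1 acts linearly, x2 acts through a strong non-monotonic term
y = x1 + 4.0 * x2 ** 2

def corr_sensitivity(x, y):
    """|Pearson correlation|: captures only linear (monotonic-ish) effects."""
    return abs(np.corrcoef(x, y)[0, 1])

def vcm_sensitivity(x, y, bins=20):
    """Variance of binned conditional means E[y|x] relative to Var(y):
    a crude estimate of the first-order (Sobol-like) effect."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.digitize(x, edges[1:-1]), 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    return cond_means.var() / y.var()

# The two measures disagree on which factor matters more:
# correlation favours x1, the variance-based measure favours x2
```

Combining several such measures, as the MMGSA does, guards against conclusions that are artefacts of any single sensitivity definition.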