62 results for Structure-based model


Relevance: 80.00%

Publisher:

Abstract:

Urban surveillance footage can be of poor quality, partly due to the low quality of the camera and partly due to harsh lighting and heavily reflective scenes. For some computer surveillance tasks very simple change detection is adequate, but sometimes a more detailed change detection mask is desirable, e.g., for accurately tracking identity when faced with multiple interacting individuals and in pose-based behaviour recognition. We present a novel technique for enhancing a low-quality change detection into a better segmentation using an image combing estimator in an MRF-based model.
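
The abstract does not spell out the image-combing estimator itself, so the sketch below only shows the generic MRF ingredient such refinement builds on: turning a noisy per-pixel change probability into a cleaner mask with a unary data term plus a smoothness term, optimised by iterated conditional modes (ICM). The probability map, weights and 4-neighbourhood are illustrative assumptions, not the paper's method.

```python
import numpy as np

def icm_refine(prob_fg, unary_weight=2.0, beta=1.5, n_iters=5):
    """Generic MRF clean-up of a noisy change-detection mask via iterated
    conditional modes: unary costs from per-pixel foreground probabilities
    plus a smoothness penalty over the 4-neighbourhood."""
    eps = 1e-6
    unary_fg = -np.log(prob_fg + eps)        # cost of labelling a pixel foreground
    unary_bg = -np.log(1.0 - prob_fg + eps)  # cost of labelling it background
    labels = (prob_fg > 0.5).astype(np.int8)
    for _ in range(n_iters):
        padded = np.pad(labels, 1)           # zero padding: missing neighbours count as background
        nbr_fg = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                  padded[1:-1, :-2] + padded[1:-1, 2:])
        cost_fg = unary_weight * unary_fg + beta * (4 - nbr_fg)  # disagreement with neighbours
        cost_bg = unary_weight * unary_bg + beta * nbr_fg
        labels = np.where(cost_fg < cost_bg, 1, 0).astype(np.int8)
    return labels

# e.g. refined = icm_refine(noisy_change_probabilities)  # 2-D array of values in [0, 1]
```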

Relevance: 80.00%

Publisher:

Abstract:

The River Lugg has particular problems with high sediment loads that have resulted in detrimental impacts on ecology and fisheries. A new dynamic, process-based model of hydrology and sediments (INCA-SED) has been developed and applied to the River Lugg system using an extensive data set from 1995–2008. The model simulates sediment sources and sinks throughout the catchment and gives a good representation of the sediment response at 22 reaches along the River Lugg. A key question considered in using the model is the management of sediment sources so that concentrations and bed loads can be reduced in the river system. Altogether, five sediment management scenarios were selected for testing on the River Lugg, including land use change, contour tillage, hedging and buffer strips. Running the model with parameters altered to simulate these five scenarios produced some interesting results. All scenarios achieved some reduction in sediment levels, with the 40% land use change achieving the best result with a 19% reduction. The other scenarios also achieved significant reductions of between 7% and 9%, with buffer strips producing the best result at close to 9%. The results suggest that if hedge introduction, contour tillage and buffer strips were all applied, sediment reductions would total 24%, considerably improving the current sediment situation. We present a novel cost-effectiveness analysis of our results in which we use the percentage of land removed from production as our cost function. Given the minimal loss of land associated with contour tillage, hedges and buffer strips, we suggest that these management practices are the most cost-effective combination for reducing sediment loads.

Relevance: 80.00%

Publisher:

Abstract:

Valuation is often said to be “an art not a science”, but this relates to the techniques employed to calculate value, not to the underlying concept itself. Valuation is the process of estimating price in the market place. Yet such an estimation will be affected by uncertainties: uncertainty in the comparable information available, uncertainty in the current and future market conditions, and uncertainty in the specific inputs for the subject property. These input uncertainties will translate into uncertainty in the output figure, the valuation. The degree of the uncertainties will vary according to the level of market activity; the more active a market, the more credence will be given to the input information. In the UK, the Royal Institution of Chartered Surveyors (RICS) is currently considering ways in which the uncertainty of the output figure, the valuation, can be conveyed to the user of the valuation, but as yet no definitive view has been taken apart from a single Guidance Note (GN5, RICS 2003) stressing the importance of recognising uncertainty in valuation but not proffering any particular solution. One of the major problems is that valuation models (in the UK) are based upon comparable information and rely upon single inputs. They are not probability based, yet uncertainty is probability driven. In this paper, we discuss the issues underlying uncertainty in valuations and suggest a probability-based model (using Crystal Ball) to address the shortcomings of the current model.
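
As a minimal illustration of what "probability based" means here, the sketch below replaces single-point inputs with distributions in a simple income-capitalisation valuation (value = rent / yield) and reports a valuation range rather than a single figure. The distributions and figures are hypothetical; this is not the paper's Crystal Ball model.

```python
import numpy as np

rng = np.random.default_rng(42)
n_trials = 10_000

# Hypothetical input distributions in place of single-point comparable evidence.
rent = rng.triangular(left=95_000, mode=100_000, right=110_000, size=n_trials)  # annual rent, GBP
ary = rng.triangular(left=0.055, mode=0.060, right=0.070, size=n_trials)        # all-risks yield

value = rent / ary  # simple income-capitalisation valuation for each trial

# Report the distribution of the output figure instead of a single valuation.
print(f"median valuation: {np.median(value):,.0f}")
print(f"90% interval    : {np.percentile(value, 5):,.0f} to {np.percentile(value, 95):,.0f}")
```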

Relevance: 80.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to investigate the effect of choices of model structure and scale in development viability appraisal. The paper addresses two questions concerning the application of development appraisal techniques to viability modelling within the UK planning system. The first relates to the extent to which, given intrinsic input uncertainty, the choice of model structure significantly affects model outputs. The second concerns the extent to which, given intrinsic input uncertainty, the level of model complexity significantly affects model outputs. Design/methodology/approach – Monte Carlo simulation procedures are applied to a hypothetical development scheme in order to measure the effects of model aggregation and structure on model output variance. Findings – It is concluded that, given the particular scheme modelled and the unavoidably subjective assumptions of input variance, simple and simplistic models may produce similar outputs to more robust and disaggregated models. Evidence is found of equifinality in the outputs of a simple, aggregated model of development viability relative to more complex, disaggregated models. Originality/value – Development viability appraisal has become increasingly important in the planning system. Consequently, the theory, application and outputs from development appraisal are under intense scrutiny from a wide range of users. However, there has been very little published evaluation of viability models. This paper contributes to the limited literature in this area.
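
A bare-bones sketch of the kind of comparison described, assuming a hypothetical residual appraisal with invented input distributions: one aggregated model with a single all-in cost rate and one disaggregated model that builds costs up from line items. It only illustrates how output variance can be compared across model structures; it is not the appraisal model used in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000
units = 50  # hypothetical scheme size

# Disaggregated inputs (all figures illustrative).
price = rng.normal(250_000, 20_000, n)       # sale price per unit
build = rng.normal(120_000, 10_000, n)       # build cost per unit
fees_pct = rng.normal(0.15, 0.02, n)         # professional fees as a share of build cost
finance_pct = rng.normal(0.07, 0.01, n)      # finance as a share of all costs

# Aggregated model: a single all-in cost rate per unit.
all_in_cost = rng.normal(160_000, 15_000, n)
residual_simple = units * (price - all_in_cost)

# Disaggregated model: costs built up from separate line items.
cost_detail = build * (1 + fees_pct) * (1 + finance_pct)
residual_detail = units * (price - cost_detail)

for name, r in [("aggregated", residual_simple), ("disaggregated", residual_detail)]:
    print(f"{name:13s} mean residual {r.mean()/1e6:6.2f}m, s.d. {r.std()/1e6:5.2f}m")
```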

Relevance: 80.00%

Publisher:

Abstract:

This paper investigates the effect of choices of model structure and scale in development viability appraisal. The paper addresses two questions concerning the application of development appraisal techniques to viability modelling within the UK planning system. The first relates to the extent to which, given intrinsic input uncertainty, the choice of model structure significantly affects model outputs. The second concerns the extent to which, given intrinsic input uncertainty, the level of model complexity significantly affects model outputs. Monte Carlo simulation procedures are applied to a hypothetical development scheme in order to measure the effects of model aggregation and structure on model output variance. It is concluded that, given the particular scheme modelled and the unavoidably subjective assumptions of input variance, simple and simplistic models may produce similar outputs to more robust and disaggregated models.

Relevance: 80.00%

Publisher:

Abstract:

Valuation is often said to be “an art not a science”, but this relates to the techniques employed to calculate value, not to the underlying concept itself. Valuation is the process of estimating price in the market place. Yet such an estimation will be affected by uncertainties: uncertainty in the comparable information available, uncertainty in the current and future market conditions, and uncertainty in the specific inputs for the subject property. These input uncertainties will translate into uncertainty in the output figure, the valuation. The degree of the uncertainties will vary according to the level of market activity; the more active a market, the more credence will be given to the input information. In the UK, the Royal Institution of Chartered Surveyors (RICS) is currently considering ways in which the uncertainty of the output figure, the valuation, can be conveyed to the user of the valuation, but as yet no definitive view has been taken. One of the major problems is that valuation models (in the UK) are based upon comparable information and rely upon single inputs. They are not probability based, yet uncertainty is probability driven. In this paper, we discuss the issues underlying uncertainty in valuations and suggest a probability-based model (using Crystal Ball) to address the shortcomings of the current model.

Relevance: 80.00%

Publisher:

Abstract:

Steady state and dynamic models have been developed and applied to the River Kennet system. Annual nitrogen exports from the land surface to the river have been estimated based on land use from the 1930s and the 1990s. Long term modelled trends indicate that there has been a large increase in nitrogen transport into the river system, driven by increased fertiliser application associated with increased cereal production, increased population and increased livestock levels. The dynamic model INCA (Integrated Nitrogen in Catchments) has been applied to simulate the day-to-day transport of N from the terrestrial ecosystem to the riverine environment. This process-based model generates spatial and temporal data and reproduces the observed in-stream concentrations. Applying the model to current land use and 1930s land use indicates that there has been a major shift in the short term dynamics since the 1930s, with increased river and groundwater concentrations caused by both non-point source pollution from agriculture and point source discharges.

Relevance: 80.00%

Publisher:

Abstract:

The surfactant-like peptide (Ala)6(Arg) is found to self-assemble into 3 nm-thick sheets in aqueous solution. Scanning transmission electron microscopy measurements of mass per unit area indicate a layer structure based on antiparallel dimers. At higher concentration the sheets wrap into unprecedented ultrathin helical ribbon and nanotube architectures.

Relevance: 80.00%

Publisher:

Abstract:

The use of Bayesian inference in the estimation of time-frequency representations has, thus far, been limited to offline analysis of signals, using a smoothing-spline-based model of the time-frequency plane. In this paper we introduce a new framework that allows the routine use of Bayesian inference for online estimation of the time-varying spectral density of a locally stationary Gaussian process. The core of our approach is the use of a likelihood inspired by a local Whittle approximation. This choice, along with the use of a recursive algorithm for non-parametric estimation of the local spectral density, permits the use of a particle filter for estimating the time-varying spectral density online. We provide demonstrations of the algorithm through tracking chirps and the analysis of musical data.
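
A rough sketch of the local Whittle idea behind such a likelihood, assuming a short data window and an AR(1) spectral density as a stand-in local parametric model; the recursive non-parametric estimator and the particle filter themselves are not reproduced here.

```python
import numpy as np

def ar1_spectrum(phi, sigma2):
    """Spectral density of an AR(1) process, used as an illustrative local model."""
    return lambda omega: sigma2 / (2 * np.pi * (1.0 - 2.0 * phi * np.cos(omega) + phi ** 2))

def local_whittle_loglik(window, spec):
    """Whittle log-likelihood of a parametric spectral density evaluated on the
    periodogram of one short window (the 'local' approximation)."""
    n = len(window)
    dft = np.fft.rfft(window - window.mean())
    periodogram = np.abs(dft) ** 2 / (2 * np.pi * n)
    omega = 2 * np.pi * np.fft.rfftfreq(n)[1:-1]  # drop the zero and Nyquist frequencies
    f = spec(omega)
    return -np.sum(np.log(f) + periodogram[1:-1] / f)

# The likelihood prefers the parameter that generated the window.
rng = np.random.default_rng(0)
x = np.zeros(256)
for t in range(1, 256):
    x[t] = 0.8 * x[t - 1] + rng.normal()
print(local_whittle_loglik(x, ar1_spectrum(0.8, 1.0)) >
      local_whittle_loglik(x, ar1_spectrum(0.1, 1.0)))  # expected: True
```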

Relevance: 80.00%

Publisher:

Abstract:

Vintage-based vector autoregressive models of a single macroeconomic variable are shown to be a useful vehicle for obtaining forecasts of different maturities of future and past observations, including estimates of post-revision values. The forecasting performance of models which include information on annual revisions is superior to that of models which only include the first two data releases. However, the empirical results indicate that a model which reflects the seasonal nature of data releases more closely does not offer much improvement over an unrestricted vintage-based model which includes three rounds of annual revisions.
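
A minimal illustration of the vintage-based idea, assuming synthetic data in which column j holds the (j+1)-th release of each period's observation, so a single row stacks the successive estimates of the same quantity; a standard unrestricted VAR from statsmodels stands in for the specifications compared in the paper.

```python
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(3)
T = 120

# Synthetic vintages: later releases are less noisy measurements of the same value.
truth = rng.normal(2.0, 1.0, T)
releases = np.column_stack([truth + rng.normal(0.0, s, T) for s in (0.6, 0.3, 0.1)])

model = VAR(releases)
results = model.fit(2)                           # unrestricted VAR(2) on the vintage vector
print(results.forecast(releases[-2:], steps=4))  # joint forecasts of future releases
```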

Relevance: 80.00%

Publisher:

Abstract:

Increasing cereal yield is needed to meet the projected increase in world food demand of about 70% by 2050. Sirius, a process-based model for wheat, was used to estimate yield potential for wheat ideotypes optimized for future climatic projections (HadCM3 global climate model) for ten wheat growing areas of Europe. It was predicted that the detrimental effect of drought stress on yield would be decreased due to enhanced tailoring of phenology to future weather patterns, and due to genetic improvements in the response of photosynthesis and green leaf duration to water shortage. Yield advances could be made through extending maturation, thereby improving resource capture and partitioning. However, the model predicted an increase in the frequency of heat stress at meiosis and anthesis. Controlled environment experiments quantify the effects of heat and drought at booting and flowering on grain numbers and potential grain size. A current adaptation of wheat to areas of Europe with hotter and drier summers is a quicker maturation which helps to escape from excessive stress, but results in lower yields. To increase yield potential and to respond to climate change, increased tolerance to heat and drought stress should remain priorities for the genetic improvement of wheat.

Relevance: 80.00%

Publisher:

Abstract:

Ships and wind turbines generate noise, which can have a negative impact on marine mammal populations by scaring animals away. Effective modelling of how this affects the populations has to take account of the location and timing of disturbances. Here we construct an individual-based model of harbour porpoises in the Inner Danish Waters. Individuals have their own energy budgets constructed using established principles of physiological ecology. Data are lacking on the spatial distribution of food, which is instead inferred from knowledge of time-varying porpoise distributions. The model produces plausible patterns of population dynamics and matches well the age distribution of porpoises caught in by-catch. It estimates the effect of existing wind farms as a 10% reduction in population size when food recovers fast (after two days). Proposed new wind farms and ships do not result in further population declines. The population is however sensitive to variations in mortality resulting from by-catch and to the speed at which food recovers after being depleted. If food recovers slowly the effect of wind turbines becomes negligible, whereas ships are estimated to have a significant negative impact on the population. Annual by-catch rates ≥10% lead to monotonically decreasing populations and to extinction, and even the estimated by-catch rate from the adjacent area (approximately 4.1%) has a strong impact on the population. This suggests that conservation efforts should be more focused on reducing by-catch in commercial gillnet fisheries than on limiting the amount of anthropogenic noise. Individual-based models are unique in their ability to take account of the location and timing of disturbances and to show their likely effects on populations. The models also identify deficiencies in the existing database and can be used to set priorities for future field research.

Relevance: 80.00%

Publisher:

Abstract:

We develop a process-based model for the dispersion of a passive scalar in the turbulent flow around the buildings of a city centre. The street network model is based on dividing the airspace of the streets and intersections into boxes, within which the turbulence renders the air well mixed. Mean flow advection through the network of street and intersection boxes then mediates further lateral dispersion. At the same time turbulent mixing in the vertical detrains scalar from the streets and intersections into the turbulent boundary layer above the buildings. When the geometry is regular, the street network model has an analytical solution that describes the variation in concentration in a near-field downwind of a single source, where the majority of scalar lies below roof level. The power of the analytical solution is that it demonstrates how the concentration is determined by only three parameters. The plume direction parameter describes the branching of scalar at the street intersections and hence determines the direction of the plume centreline, which may be very different from the above-roof wind direction. The transmission parameter determines the distance travelled before the majority of scalar is detrained into the atmospheric boundary layer above roof level and conventional atmospheric turbulence takes over as the dominant mixing process. Finally, a normalised source strength multiplies this pattern of concentration. This analytical solution converges to a Gaussian plume after a large number of intersections have been traversed, providing theoretical justification for previous studies that have developed empirical fits to Gaussian plume models. The analytical solution is shown to compare well with very high-resolution simulations and with wind tunnel experiments, although re-entrainment of scalar previously detrained into the boundary layer above roofs, which is not accounted for in the analytical solution, is shown to become an important process further downwind from the source.
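
The abstract does not give the analytical solution itself, so the sketch below is only a toy box-model showing how the three parameters it names (a branching fraction at intersections, a transmission fraction along streets, and a source strength) produce a geometric decay of concentration along the plume centreline; the functional form is an illustrative assumption, not the paper's solution.

```python
import numpy as np

def centreline_concentration(n_streets, branch_frac, transmit_frac, source_strength):
    """Toy street-network box model: scalar carried street by street, with a
    fraction (1 - transmit_frac) detrained above roof level in each street and a
    fraction branch_frac carried straight on at each intersection."""
    conc = np.empty(n_streets)
    mass = source_strength
    for i in range(n_streets):
        mass *= transmit_frac   # vertical detrainment into the boundary layer
        conc[i] = mass          # well-mixed concentration proxy for street box i
        mass *= branch_frac     # lateral branching at the downstream intersection
    return conc

print(centreline_concentration(6, branch_frac=0.7, transmit_frac=0.8, source_strength=1.0))
```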

Relevance: 80.00%

Publisher:

Abstract:

Climate projections show Australia becoming significantly warmer during the 21st century, and precipitation decreasing over much of the continent. Such changes are conventionally considered to increase wildfire risk. Nevertheless, we show that burnt area increases in southern Australia, but decreases in northern Australia. Overall the projected increase in fire is small (0.72–1.31% of land area, depending on the climate scenario used), and does not cause a decrease in carbon storage. In fact, carbon storage increases by 3.7–5.6 Pg C (depending on the climate scenario used). Using a process-based model of vegetation dynamics, vegetation–fire interactions and carbon cycling, we show increased fire promotes a shift to more fire-adapted trees in wooded areas and their encroachment into grasslands, with an overall increase in forested area of 3.9–11.9%. Both changes increase carbon uptake and storage. The increase in woody vegetation increases the amount of coarse litter, which decays more slowly than fine litter hence leading to a relative reduction in overall heterotrophic respiration, further reducing carbon losses. Direct CO2 effects increase woody cover, water-use efficiency and productivity, such that carbon storage is increased by 8.5–14.8 Pg C compared to simulations in which CO2 is held constant at modern values. CO2 effects tend to increase burnt area, fire fluxes and therefore carbon losses in arid areas, but increase vegetation density and reduce burnt area in wooded areas.

Relevance: 80.00%

Publisher:

Abstract:

More and more households are purchasing electric vehicles (EVs), and this will continue as we move towards a low carbon future. There are various projections as to the rate of EV uptake, but all predict an increase over the next ten years. Charging these EVs will produce one of the biggest loads on the low voltage network. To manage the network, we must take into account not only the number of EVs taken up, but also where on the network they are charging, and at what time. To simulate the impact on the network from high, medium and low EV uptake (as outlined by the UK government), we present an agent-based model. We initialise the model to assign an EV to a household based on either random distribution or social influence - that is, a neighbour of an EV owner is more likely to also purchase an EV. Additionally, we examine the effect of peak behaviour on the network when charging is at day-time, night-time, or a mix of both. The model is implemented on a neighbourhood in south-east England using smart meter data (half-hourly electricity readings) and real-life charging patterns from an EV trial. Our results indicate that social influence can increase the peak demand on a local level (street or feeder), meaning that medium EV uptake can create higher peak demand than currently expected.
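
A small sketch of the adoption step only, assuming 100 households along a single feeder and a hypothetical social-influence rule in which the chance of adopting rises with the number of adopting next-door neighbours; the smart meter data and trial charging profiles used in the paper are not reproduced, and the 3.5 kW charger rating is an assumption.

```python
import numpy as np

rng = np.random.default_rng(7)
n_households, n_evs = 100, 30

# Random assignment: every household is equally likely to receive an EV.
random_owners = rng.choice(n_households, size=n_evs, replace=False)

# Socially influenced assignment: adoption probability grows with adopting neighbours.
owners = np.zeros(n_households, dtype=bool)
owners[rng.integers(n_households)] = True                 # seed adopter
while owners.sum() < n_evs:
    nbr = np.convolve(owners, [1, 0, 1], mode="same")     # adopters next door
    weights = np.where(owners, 0.0, 1.0 + 3.0 * nbr)      # hypothetical influence factor
    owners[rng.choice(n_households, p=weights / weights.sum())] = True

# Worst-case coincident peak if every EV charges in the same half-hour.
charge_kw = 3.5
print("clustered adopters:", np.flatnonzero(owners))
print("coincident peak on feeder:", owners.sum() * charge_kw, "kW")
```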