71 results for historic shipwreck


Relevance:

10.00%

Publisher:

Abstract:

There are now considerable expectations that semi-distributed models are useful tools for supporting catchment water quality management. However, insufficient attention has been given to evaluating the uncertainties inherent to this type of model, especially those associated with the spatial disaggregation of the catchment. The Integrated Nitrogen in Catchments model (INCA) is subjected to an extensive regionalised sensitivity analysis in application to the River Kennet, part of the groundwater-dominated upper Thames catchment, UK. The main results are: (1) model output was generally insensitive to land-phase parameters, very sensitive to groundwater parameters, including initial conditions, and significantly sensitive to in-river parameters; (2) INCA was able to produce good fits simultaneously to the available flow, nitrate and ammonium in-river data sets; (3) representing parameters as heterogeneous over the catchment (206 calibrated parameters) rather than homogeneous (24 calibrated parameters) produced a significant improvement in fit to nitrate but no significant improvement to flow and caused a deterioration in ammonium performance; (4) the analysis indicated that calibrating the flow-related parameters first, then calibrating the remaining parameters (as opposed to calibrating all parameters together) was not a sensible strategy in this case; (5) even the parameters to which the model output was most sensitive suffered from high uncertainty due to spatial inconsistencies in the estimated optimum values, parameter equifinality and the sampling error associated with the calibration method; (6) soil and groundwater nutrient and flow data are needed to reduce uncertainty in initial conditions, residence times and nitrogen transformation parameters, and long-term historic data are needed so that key responses to changes in land-use management can be assimilated. The results indicate the general difficulty of reconciling the questions which catchment nutrient models are expected to answer with typically limited data sets and limited knowledge about suitable model structures. The results demonstrate the importance of analysing semi-distributed model uncertainties prior to model application, and illustrate the value and limitations of using Monte Carlo-based methods for doing so. (c) 2005 Elsevier B.V. All rights reserved.
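The kind of Monte Carlo regionalised sensitivity analysis referred to here can be illustrated with a short, hedged sketch: sample parameter sets from prior ranges, split the runs into behavioural and non-behavioural sets against an acceptance threshold, and compare the marginal parameter distributions. The toy recession model, ranges, noise level and threshold below are assumptions for illustration only, not the INCA/River Kennet configuration.

```python
# A minimal Monte Carlo regionalised sensitivity analysis (RSA) sketch.
# The toy recession model, parameter ranges, observation noise and the
# behavioural threshold are illustrative assumptions only, not the
# INCA / River Kennet configuration described in the abstract.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(42)

def toy_model(params, t):
    """Stand-in for a catchment model run: two-parameter exponential recession."""
    store, rate = params
    return store * np.exp(-rate * t)

t = np.arange(100)
observed = toy_model((50.0, 0.05), t) + rng.normal(0.0, 1.0, t.size)

# 1. Sample parameter sets from uniform prior ranges (Monte Carlo).
n_runs = 5000
samples = np.column_stack([
    rng.uniform(10.0, 100.0, n_runs),   # initial store
    rng.uniform(0.01, 0.2, n_runs),     # recession rate
])

# 2. Run the model and score each parameter set (Nash-Sutcliffe efficiency).
def nse(sim, obs):
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

scores = np.array([nse(toy_model(p, t), observed) for p in samples])

# 3. Split runs into behavioural / non-behavioural sets and compare the
#    marginal parameter distributions with a Kolmogorov-Smirnov statistic:
#    a large KS distance means the output is sensitive to that parameter.
behavioural = scores > 0.7          # assumed acceptance threshold
for i, name in enumerate(["store", "rate"]):
    d, _ = ks_2samp(samples[behavioural, i], samples[~behavioural, i])
    print(f"{name}: KS distance = {d:.2f}")
```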

Relevance:

10.00%

Publisher:

Abstract:

Airborne scanning laser altimetry (LiDAR) is an important new data source for river flood modelling. LiDAR can give dense and accurate DTMs of floodplains for use as model bathymetry. Spatial resolutions of 0.5m or less are possible, with a height accuracy of 0.15m. LiDAR gives a Digital Surface Model (DSM), so vegetation removal software (e.g. TERRASCAN) must be used to obtain a DTM. An example used to illustrate the current state of the art will be the LiDAR data provided by the EA, which have been processed by their in-house software to convert the raw data to a ground DTM and a separate vegetation height map. Their method distinguishes trees from buildings on the basis of object size. EA data products include the DTM with or without buildings removed, a vegetation height map, a DTM with bridges removed, etc. Most vegetation removal software ignores short vegetation less than, say, 1m high. We have attempted to extend vegetation height measurement to short vegetation using local height texture. Typically most of a floodplain may be covered in such vegetation. The idea is to assign friction coefficients depending on local vegetation height, so that friction is spatially varying. This obviates the need to calibrate a global floodplain friction coefficient. It is not clear at present if the method is useful, but it is worth testing further. The LiDAR DTM is usually determined by looking for local minima in the raw data, then interpolating between these to form a space-filling height surface. This is a low-pass filtering operation, in which objects of high spatial frequency such as buildings, river embankments and walls may be incorrectly classed as vegetation. The problem is particularly acute in urban areas. A solution may be to apply pattern recognition techniques to LiDAR height data fused with other data types such as LiDAR intensity or multispectral CASI data. We are attempting to use digital map data (Mastermap structured topography data) to help to distinguish buildings from trees, and roads from areas of short vegetation. The problems involved in doing this will be discussed. A related problem of how best to merge historic river cross-section data with a LiDAR DTM will also be considered. LiDAR data may also be used to help generate a finite element mesh. In rural areas we have decomposed a floodplain mesh according to taller vegetation features such as hedges and trees, so that e.g. hedge elements can be assigned higher friction coefficients than those in adjacent fields. We are attempting to extend this approach to urban areas, so that the mesh is decomposed in the vicinity of buildings, roads, etc., as well as trees and hedges. A dominant points algorithm is used to identify points of high curvature on a building or road, which act as initial nodes in the meshing process. A difficulty is that the resulting mesh may contain a very large number of nodes. However, the mesh generated may be useful to allow a high-resolution FE model to act as a benchmark for a more practical lower-resolution model. A further problem discussed will be how best to exploit data redundancy due to the high resolution of the LiDAR compared to that of a typical flood model. Problems occur if features have dimensions smaller than the model cell size, e.g. for a 5m-wide embankment within a raster grid model with 15m cell size, the maximum height of the embankment locally could be assigned to each cell covering the embankment. But how could a 5m-wide ditch be represented? Again, this redundancy has been exploited to improve wetting/drying algorithms using the sub-grid-scale LiDAR heights within finite elements at the waterline.
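Two of the steps mentioned above, a crude local-minima ground filter (DSM to DTM) and the mapping of local vegetation height to a spatially varying friction coefficient, can be sketched schematically as follows. The synthetic raster, window sizes and Manning's n lookup are illustrative assumptions, not the EA's in-house processing chain or TERRASCAN.

```python
# Schematic sketch: local-minima ground filtering and vegetation-height-based
# friction assignment. All values and window sizes are illustrative assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter, minimum_filter

rng = np.random.default_rng(0)

# Synthetic 0.5 m resolution DSM: gently sloping ground plus sparse ~12 m canopies.
ny, nx = 200, 200
ground = np.linspace(10.0, 12.0, nx)[None, :].repeat(ny, axis=0)
trees = (rng.random((ny, nx)) > 0.995).astype(float) * 12.0
canopy = gaussian_filter(maximum_filter(trees, size=9), 2)
dsm = ground + canopy + rng.normal(0.0, 0.05, (ny, nx))

# Crude ground model: local minima in a moving window, then smoothed.
# (This is the low-pass step that can also flatten embankments and walls.)
dtm = gaussian_filter(minimum_filter(dsm, size=21), 5)
veg_height = np.clip(dsm - dtm, 0.0, None)

# Assign Manning's n from local vegetation height (assumed lookup table).
manning_n = np.select(
    [veg_height < 0.3, veg_height < 1.0, veg_height < 5.0],
    [0.03, 0.05, 0.08],
    default=0.12,                    # trees / hedges
)
print("mean Manning's n over the floodplain:", round(float(manning_n.mean()), 3))
```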

Relevance:

10.00%

Publisher:

Abstract:

Seasonal climate prediction offers the potential to anticipate variations in crop production early enough to adjust critical decisions. Until recently, interest in exploiting seasonal forecasts from dynamic climate models (e.g. general circulation models, GCMs) for applications that involve crop simulation models has been hampered by the difference in spatial and temporal scale of GCMs and crop models, and by the dynamic, nonlinear relationship between meteorological variables and crop response. Although GCMs simulate the atmosphere on a sub-daily time step, their coarse spatial resolution and resulting distortion of day-to-day variability limit the use of their daily output. Crop models have used daily GCM output with some success by either calibrating simulated yields or correcting the daily rainfall output of the GCM to approximate the statistical properties of historic observations. Stochastic weather generators are used to disaggregate seasonal forecasts either by adjusting input parameters in a manner that captures the predictable components of climate, or by constraining synthetic weather sequences to match predicted values. Predicting crop yields, simulated with historic weather data, as a statistical function of seasonal climatic predictors, eliminates the need for daily weather data conditioned on the forecast, but must often address poor statistical properties of the crop-climate relationship. Most of the work on using crop simulation with seasonal climate forecasts has employed historic analogs based on categorical ENSO indices. Other methods based on classification of predictors or weather types can provide daily weather inputs to crop models conditioned on forecasts. Advances in climate-based crop forecasting in the coming decade are likely to include more robust evaluation of the methods reviewed here, dynamically embedding crop models within climate models to account for crop influence on regional climate, enhanced use of remote sensing, and research in the emerging area of 'weather within climate'.
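As a concrete illustration of the rainfall-correction step mentioned above, the sketch below applies a simple empirical quantile mapping so that daily GCM rainfall approximates the statistical properties of historic observations. The synthetic series and gamma-like distributions are assumptions, not output from any particular GCM or crop-model coupling.

```python
# A minimal empirical quantile-mapping sketch for correcting daily GCM rainfall
# toward the statistics of historic observations. Series are synthetic and
# purely illustrative.
import numpy as np

rng = np.random.default_rng(1)

# Historic observed daily rainfall (mm) and raw GCM output for a calibration period.
obs = np.where(rng.random(3650) < 0.3, rng.gamma(2.0, 5.0, 3650), 0.0)
gcm_hist = np.where(rng.random(3650) < 0.5, rng.gamma(2.0, 2.5, 3650), 0.0)

# Build the empirical transfer function on the calibration period.
quantiles = np.linspace(0.0, 1.0, 101)
gcm_q = np.quantile(gcm_hist, quantiles)
obs_q = np.quantile(obs, quantiles)

def bias_correct(gcm_forecast):
    """Map each forecast value through its GCM quantile to the observed quantile."""
    return np.interp(gcm_forecast, gcm_q, obs_q)

gcm_fcst = np.where(rng.random(90) < 0.5, rng.gamma(2.0, 2.5, 90), 0.0)
corrected = bias_correct(gcm_fcst)
print("raw mean %.2f mm/day -> corrected mean %.2f mm/day"
      % (gcm_fcst.mean(), corrected.mean()))
```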

Relevance:

10.00%

Publisher:

Abstract:

1. The production of food for human consumption has led to an historical and global conflict with terrestrial carnivores, which in turn has resulted in the extinction or extirpation of many species, although some have benefited. At present, carnivores affect food production by: (i) killing human producers; killing and/or eating (ii) fish/shellfish; (iii) game/wildfowl; (iv) livestock; (v) damaging crops; (vi) transmitting diseases; and (vii) through trophic interactions with other species in agricultural landscapes. Conversely, carnivores can themselves be a source of dietary protein (bushmeat). 2. Globally, the major areas of conflict are predation on livestock and the transmission of rabies. At a broad scale, livestock predation is a customary problem where predators are present and has been quantified for a broad range of carnivore species, although the veracity of these estimates is equivocal. Typically, but not always, losses are small relative to the numbers held, but can be a significant proportion of total livestock mortality. Losses experienced by producers are often highly variable, indicating that factors such as husbandry practices and predator behaviour may significantly affect the relative vulnerability of properties in the wider landscape. Within livestock herds, juvenile animals are particularly vulnerable. 3. Proactive and reactive culling are widely practised as a means to limit predation on livestock and game. Historic changes in species' distributions and abundance illustrate that culling programmes can be very effective at reducing predator density, although such substantive impacts are generally considered undesirable for native predators. However, despite their prevalence, the effectiveness, efficiency and the benefit:cost ratio of culling programmes have been poorly studied. 4. A wide range of non-lethal methods to limit predation has been studied. However, many of these have their practical limitations and are unlikely to be widely applicable. 5. Lethal approaches are likely to dominate the management of terrestrial carnivores for the foreseeable future, but animal welfare considerations are increasingly likely to influence management strategies. The adoption of non-lethal approaches will depend upon proof of their effectiveness and the willingness of stakeholders to implement them, and, in some cases, appropriate licensing and legislation. 6. Overall, it is apparent that we still understand relatively little about the importance of factors affecting predation on livestock and how to manage this conflict effectively. We consider the following avenues of research to be essential: (i) quantified assessments of the loss of viable livestock; (ii) landscape-level studies of contiguous properties to quantify losses associated with variables such as different husbandry practices; (iii) replicated experimental manipulations to identify the relative benefit of particular management practices, incorporating (iv) techniques to identify individual predators killing stock; and (v) economic analyses of different management approaches to quantify optimal production strategies.

Relevance:

10.00%

Publisher:

Abstract:

To investigate flower induction in June-bearing strawberry plants, morphological changes in shoot apices and Histone H4 expression in the central zone during flower initiation were observed. Strawberry plants were placed under flower-inducible, short-day conditions (23 °C/17 °C, 10 h day length) for differing numbers of days (8, 16, 20, 24 or 32 days) and then these plants were transferred to non-inducible, long-day conditions (25 °C/20 °C, 14 h day length). The shoot apices of plants placed under short-day conditions for 8 days were flat, similar to shoot apices of plants in the vegetative phase of development, and Histone H4 was not expressed in the central zone during the experimental period. On the other hand, the shoot apices of plants placed under short-day conditions for 16 days remained flat, similar to shoot apices of plants placed under short-day conditions for 8 days, but Histone H4 was expressed in the central zone at the end of the short-day treatment. Morphological changes in the shoot apices of these plants were observed 8 days after the change in day length. These plants developed differentiated flower organs after they were grown for another 30 days under long-day conditions. These results indicate that changes in the expression pattern of the Histone H4 gene occur before morphological changes during flower induction and that the expression of the gene in the central zone can be used as one of the indicators of the flowering process in strawberries. (c) 2006 Elsevier B.V. All rights reserved.

Relevance:

10.00%

Publisher:

Abstract:

National food control systems are a key element in the protection of consumers from unsafe foods and from other fraudulent practices. International guidance is available and provides a framework for enhancing national systems. However, it is recognized that before reaching decisions on the necessary improvements to a national system, an analysis is required of the current state of key elements in the present system. This paper provides such an analysis for the State of Kuwait. The fragmented nature of the food control system is described. Four key elements of the Kuwaiti system are analyzed: the legal framework, the administrative structures, the enforcement activity and the provision of education and training. It is noted that the country has a dependence on imported foods and that the present national food control system is largely based on an historic approach to food sampling at the point of import and is unsustainable. The paper recommends a more coordinated approach to food safety control in Kuwait with a significant increase in the use of risk analysis methods to target enforcement.

Relevance:

10.00%

Publisher:

Abstract:

Validating chemical methods to predict bioavailable fractions of polycyclic aromatic hydrocarbons (PAHs) by comparison with accumulation bioassays is problematic. Concentrations accumulated in soil organisms not only depend on the bioavailable fraction but also on contaminant properties. A historically contaminated soil was freshly spiked with deuterated PAHs (dPAHs). dPAHs have a similar fate to their respective undeuterated analogues, so chemical methods that give good indications of bioavailability should extract the fresh more readily available dPAHs and historic more recalcitrant PAHs in similar proportions to those in which they are accumulated in the tissues of test organisms. Cyclodextrin and butanol extractions predicted the bioavailable fraction for earthworms (Eisenia fetida) and plants (Lolium multiflorum) better than the exhaustive extraction. The PAHs accumulated by earthworms had a larger dPAH:PAH ratio than that predicted by chemical methods. The isotope ratio method described here provides an effective way of evaluating other chemical methods to predict bioavailability.
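The comparison at the heart of the isotope ratio method can be sketched as below: the ratio of freshly spiked (deuterated) to historic (undeuterated) PAH extracted by a chemical method is set against the same ratio accumulated in organism tissue. All concentrations are made-up numbers purely for illustration.

```python
# Sketch of the dPAH:PAH ratio comparison; all concentrations are invented
# illustrative values, not data from the study above.
fresh_spiked_dpah = {"phenanthrene-d10": 4.2, "pyrene-d10": 3.1}   # mg/kg extracted
historic_pah      = {"phenanthrene": 6.0,     "pyrene": 5.5}       # mg/kg extracted
tissue_dpah       = {"phenanthrene-d10": 0.9, "pyrene-d10": 0.6}   # mg/kg in earthworm
tissue_pah        = {"phenanthrene": 1.0,     "pyrene": 0.8}

def ratio(deuterated, undeuterated):
    """dPAH:PAH ratio per compound, matching 'x-d10' keys to their analogues."""
    return {k.split("-")[0]: deuterated[k] / undeuterated[k.split("-")[0]]
            for k in deuterated}

extract_ratio = ratio(fresh_spiked_dpah, historic_pah)
tissue_ratio = ratio(tissue_dpah, tissue_pah)

for compound in extract_ratio:
    print(f"{compound}: extract dPAH:PAH = {extract_ratio[compound]:.2f}, "
          f"tissue dPAH:PAH = {tissue_ratio[compound]:.2f}")
# A chemical method that mimics bioavailability should give extract ratios close
# to the tissue ratios; a large mismatch flags over- or under-extraction of the
# aged (historic) fraction.
```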

Relevance:

10.00%

Publisher:

Abstract:

Volatility, or the variability of the underlying asset, is one of the key fundamental components of property derivative pricing and of the application of real option models in development analysis. There has been relatively little work on volatility in terms of its application to property derivatives and real options analysis. Most research on volatility stems from investment performance (Nathakumaran & Newell, 1995; Brown & Matysiak, 2000; Booth & Matysiak, 2001). Historic standard deviation is often used as a proxy for volatility and there has been a reliance on indices, which are subject to valuation smoothing effects. Transaction prices are considered to be more volatile than the traditional standard deviations of appraisal-based indices. This could lead, arguably, to inefficiencies and mis-pricing, particularly if it is also accepted that changes evolve randomly over time and where future volatility, and not an ex-post measure, is the key (Sing, 1998). If history does not repeat, or provides an unreliable measure, then estimating model-based (implied) volatility is an alternative approach (Patel & Sing, 2000). This paper is the first of two that employ alternative approaches to calculating and capturing volatility in UK real estate for the purposes of applying the measure to derivative pricing and real option models. It draws on a uniquely constructed IPD/Gerald Eve transactions database, containing over 21,000 properties over the period 1983-2005. This first paper examines the magnitude of historic volatility associated with asset returns by sector and geographic spread. The subsequent paper will focus upon model-based (implied) volatility.
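For reference, the "historic standard deviation as a volatility proxy" approach discussed above amounts to something like the following minimal sketch, applied here to a synthetic monthly return series rather than the IPD/Gerald Eve data.

```python
# Annualised historic volatility from a periodic return series; the returns
# are simulated for illustration only.
import numpy as np

rng = np.random.default_rng(7)
monthly_returns = rng.normal(0.006, 0.02, 120)       # 10 years of monthly returns

log_returns = np.log1p(monthly_returns)
monthly_vol = log_returns.std(ddof=1)
annualised_vol = monthly_vol * np.sqrt(12)            # scale by periods per year
print(f"annualised historic volatility: {annualised_vol:.1%}")
```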

Relevance:

10.00%

Publisher:

Abstract:

Historic analysis of the inflation hedging properties of stocks produced anomalous results, with equities often appearing to offer a perverse hedge against inflation. This has been attributed to the impact of real and monetary shocks to the economy, which influence both inflation and asset returns. It has been argued that real estate should provide a better hedge: however, empirical results have been mixed. This paper explores the relationship between commercial real estate returns (from both private and public markets) and economic, fiscal and monetary factors and inflation for US and UK markets. Comparative analysis of general equity and small capitalisation stock returns in both markets is carried out. Inflation is subdivided into expected and unexpected components using different estimation techniques. The analyses are undertaken using long-run error correction techniques. In the long-run, once real and monetary variables are included, asset returns are positively linked to anticipated inflation but not to inflation shocks. Adjustment processes are, however, gradual and not within period. Real estate returns, particularly direct market returns, exhibit characteristics that differ from equities.
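A stylised sketch of the expected/unexpected split is given below, using a simple AR(1) forecast for expected inflation and an OLS regression of simulated asset returns on both components. The paper's own analysis uses long-run error-correction models with additional real, fiscal and monetary variables, which this toy example does not attempt to reproduce; all series here are simulated.

```python
# Splitting inflation into expected (AR(1) fitted) and unexpected (residual)
# components, then regressing simulated asset returns on both. Illustrative only.
import numpy as np

rng = np.random.default_rng(3)
T = 200
raw = 0.02 + 0.01 * rng.standard_normal(T)
inflation = np.convolve(raw, [0.5, 0.5], mode="same")      # add persistence

# Expected inflation: one-step-ahead AR(1) fitted values; shocks: residuals.
y, x = inflation[1:], inflation[:-1]
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
expected = X @ beta
unexpected = y - expected

# Simulated asset returns load on anticipated inflation but not on shocks.
returns = 0.01 + 1.0 * expected + 0.0 * unexpected + 0.002 * rng.standard_normal(T - 1)

Z = np.column_stack([np.ones_like(expected), expected, unexpected])
coef, *_ = np.linalg.lstsq(Z, returns, rcond=None)
print("loading on expected inflation: %.2f, on inflation shocks: %.2f"
      % (coef[1], coef[2]))
```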

Relevance:

10.00%

Publisher:

Abstract:

Decision theory is the study of models of judgement involved in, and leading to, deliberate and (usually) rational choice. In real estate investment there are normative models for the allocation of assets. These asset allocation models suggest an optimum allocation between the respective asset classes based on the investors’ judgements of performance and risk. Real estate is selected, as other assets are, on the basis of some criteria, e.g. commonly its marginal contribution to the production of a mean-variance efficient multi-asset portfolio, subject to the investor’s objectives and capital rationing constraints. However, decisions are made relative to current expectations and current business constraints. Whilst a decision maker may believe in the required optimum exposure levels as dictated by an asset allocation model, the final decision may/will be influenced by factors outside the parameters of the mathematical model. This paper discusses investors' perceptions and attitudes toward real estate and highlights the important difference between theoretical exposure levels and pragmatic business considerations. It develops a model to identify “soft” parameters in decision making which will influence the optimal allocation for that asset class. This “soft” information may relate to behavioural issues such as the tendency to mirror competitors, a desire to meet weight-of-money objectives, a desire to retain the status quo, and many other non-financial considerations. The paper aims to establish the place of property in multi-asset portfolios in the UK and to examine the asset allocation process in practice, with a view to understanding the decision-making process and looking at investors’ perceptions based on an historic analysis of market expectations, a comparison with historic data and an analysis of actual performance.
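The normative side of the argument, the allocation model that suggests an optimum exposure before any "soft" considerations intervene, can be illustrated with a minimal mean-variance sketch. The three asset classes, expected returns and covariances below are illustrative assumptions, and short selling is permitted for simplicity.

```python
# Minimal mean-variance allocation: closed-form minimum-variance weights for a
# target return with weights summing to one (Lagrange multiplier solution).
# All inputs are illustrative assumptions, not estimates from the paper.
import numpy as np

mu = np.array([0.08, 0.04, 0.065])                 # expected returns: equities, bonds, real estate
cov = np.array([[0.0400, 0.0020, 0.0100],
                [0.0020, 0.0025, 0.0015],
                [0.0100, 0.0015, 0.0225]])         # covariance matrix
ones = np.ones(3)
inv = np.linalg.inv(cov)

A = ones @ inv @ ones
B = ones @ inv @ mu
C = mu @ inv @ mu
D = A * C - B ** 2

target = 0.06                                       # required portfolio return
w = ((C - B * target) * inv @ ones + (A * target - B) * inv @ mu) / D
print("weights (equities, bonds, real estate):", np.round(w, 3))
print("portfolio return %.3f, volatility %.3f" % (w @ mu, np.sqrt(w @ cov @ w)))
```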

Relevance:

10.00%

Publisher:

Abstract:

Valuation is the process of estimating price. The methods used to determine value attempt to model the thought processes of the market and thus estimate price by reference to observed historic data. This can be done using either an explicit model, which models the worth calculation of the most likely bidder, or an implicit model, which uses historic data, suitably adjusted, as a short cut to determine value by reference to previous similar sales. The former is generally referred to as the Discounted Cash Flow (DCF) model and the latter as the capitalisation (or All Risk Yield) model. However, regardless of the technique used, the valuation will be affected by uncertainties: uncertainty in the comparable data available, uncertainty in current and future market conditions, and uncertainty in the specific inputs for the subject property. These input uncertainties will translate into uncertainty in the output figure, the estimate of price. In a previous paper, we considered the way in which uncertainty is allowed for in the capitalisation model in the UK. In this paper, we extend the analysis to look at the way in which uncertainty can be incorporated into the explicit DCF model. This is done by recognising that the input variables are uncertain and will each have a probability distribution. Thus, by utilising a probability-based valuation model (using Crystal Ball) it is possible to incorporate uncertainty into the analysis and address the shortcomings of the current model. Although the capitalisation model is discussed, the paper concentrates upon the application of Crystal Ball to the Discounted Cash Flow approach.
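A minimal probability-based DCF sketch in this spirit is shown below, using NumPy in place of Crystal Ball. The distributions assumed for rental growth, exit yield and discount rate, and the holding period and rent, are illustrative only, not the inputs used in the paper.

```python
# Monte Carlo DCF sketch: uncertain inputs drawn from assumed distributions
# produce a distribution of present values rather than a single point estimate.
import numpy as np

rng = np.random.default_rng(11)
n_sims, hold_years, rent = 10_000, 5, 100_000.0       # current net rent, per annum

rental_growth = rng.normal(0.02, 0.015, n_sims)        # annual growth rate
exit_yield = rng.normal(0.065, 0.005, n_sims)          # capitalisation rate at sale
discount_rate = rng.normal(0.08, 0.01, n_sims)         # target/discount rate

values = np.empty(n_sims)
for i in range(n_sims):
    cash_flows = rent * (1 + rental_growth[i]) ** np.arange(1, hold_years + 1)
    exit_value = cash_flows[-1] * (1 + rental_growth[i]) / exit_yield[i]
    discounts = (1 + discount_rate[i]) ** np.arange(1, hold_years + 1)
    values[i] = np.sum(cash_flows / discounts) + exit_value / discounts[-1]

# Summarise the distribution of the estimate of price.
print("mean %.0f, 5th-95th percentile %.0f - %.0f"
      % (values.mean(), *np.percentile(values, [5, 95])))
```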

Relevance:

10.00%

Publisher:

Abstract:

This paper uses data provided by three major real estate advisory firms to investigate the level and pattern of variation in the measurement of historic real estate rental values for the main European office centres. The paper assesses the extent to which the data-providing organizations agree on historic market performance in terms of returns, risk and timing, and examines the relationship between market maturity and agreement. The analysis suggests that at the aggregate level and for many markets, there is substantial agreement on direction, quantity and timing of market change. However, there is substantial variability in the level of agreement among cities. The paper also assesses whether the different data sets produce different explanatory models and market forecasts. It is concluded that, although disagreement on the direction of market change is high for many markets, the different data sets often produce similar explanatory models and predict similar relative performance.

Relevance:

10.00%

Publisher:

Abstract:

The British countryside has been shaped and sustained over the years by the establishment of landed estates. Some of our best known, and now most protected, landmarks derive from this tradition, by which money that was often sourced from outside the rural economy was invested in land. Whilst there was some reversal in this trend during the last century, there is again a widespread desire among people of means to invest in new country property. Paragraph 3.21 of Planning Policy Guidance Note 7: The Countryside - Environmental Quality and Economic and Social Development was introduced in 1997 as a means of perpetuating the historic tradition of innovation in the countryside through the construction of fine individual houses in landscaped grounds. That it was considered necessary to use a special provision of this kind reflects the prevailing presumption of planning authorities against allowing private residential development in open countryside. The Government is currently reviewing rural planning policy and is focusing on higher density housing, affordable homes and the use of brownfield sites. There is an underlying conception that individual private house developments contribute nothing and are seen as the least attractive option for most development sites. The purpose of paragraph 3.21 lies outside the government’s priorities and its particular provisions may therefore be excluded in forthcoming ‘policy statements’. This paper seeks to examine the role of private investors wishing to build new houses in the countryside, and the impact that this might have on local economies. It explores the interpretation placed on PPG7 through an investigation of appeal sites, and concludes by making recommendations for the review process, including the retention of some form of exceptions policy for new-build houses.