841 results for physically-based model


Relevance:

80.00%

Publisher:

Abstract:

Valuation is often said to be “an art not a science”, but this relates to the techniques employed to calculate value, not to the underlying concept itself. Valuation is the process of estimating price in the market place. Yet such an estimation will be affected by uncertainties: uncertainty in the comparable information available, uncertainty in the current and future market conditions, and uncertainty in the specific inputs for the subject property. These input uncertainties will translate into uncertainty in the output figure, the valuation. The degree of uncertainty will vary according to the level of market activity; the more active a market, the more credence will be given to the input information. In the UK, the Royal Institution of Chartered Surveyors (RICS) is currently considering ways in which the uncertainty of the output figure, the valuation, can be conveyed to the user of the valuation, but as yet no definitive view has been taken apart from a single Guidance Note (GN5; RICS, 2003) stressing the importance of recognising uncertainty in valuation but not proffering any particular solution. One of the major problems is that valuation models (in the UK) are based upon comparable information and rely upon single inputs. They are not probability based, yet uncertainty is probability driven. In this paper, we discuss the issues underlying uncertainty in valuations and suggest a probability-based model (using Crystal Ball) to address the shortcomings of the current model.
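The probability-based approach the abstract proposes with Crystal Ball can be sketched as a simple Monte Carlo valuation in Python. The input distributions below (a triangular rental value and a normal all-risks yield) and all their parameters are invented for illustration; they are not values from the paper.

```python
import numpy as np

def simulate_valuation(n=100_000, seed=0):
    """Monte Carlo sketch of a probability-based valuation.

    Instead of single-point inputs, each comparable-derived input is
    drawn from a distribution (all figures here are hypothetical).
    """
    rng = np.random.default_rng(seed)
    # Rental value per annum: triangular around comparable evidence.
    rent = rng.triangular(90_000, 100_000, 115_000, n)
    # All-risks yield: normal, reflecting market-condition uncertainty.
    all_risks_yield = rng.normal(0.065, 0.004, n)
    # Capitalisation in perpetuity: value = rent / yield.
    return rent / all_risks_yield

values = simulate_valuation()
# Report a central estimate plus an uncertainty band, rather than the
# single figure a conventional comparable-based model produces.
low, mid, high = np.percentile(values, [5, 50, 95])
```

The output is a distribution of values, so the valuer can attach an explicit confidence interval to the reported figure rather than a single number.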

Relevance:

80.00%

Publisher:

Abstract:

High rates of nutrient loading from agricultural and urban development have resulted in surface water eutrophication and groundwater contamination in regions of Ontario. In Lake Simcoe (Ontario, Canada), anthropogenic nutrient contributions have contributed to increased algal growth, low hypolimnetic oxygen concentrations, and impaired fish reproduction. An ambitious programme has been initiated to reduce phosphorus loads to the lake, aiming to achieve at least a 40% reduction in phosphorus loads by 2045. Achievement of this target necessitates effective remediation strategies, which will rely upon an improved understanding of controls on nutrient export from tributaries of Lake Simcoe as well as improved understanding of the importance of phosphorus cycling within the lake. In this paper, we describe a new model structure for the integrated dynamic and process-based model INCA-P, which allows fully-distributed applications suited to branched river networks. We demonstrate application of this model to the Black River, a tributary of Lake Simcoe, and use INCA-P to simulate the fluxes of P entering the lake system, apportion phosphorus among different sources in the catchment, and explore future scenarios of land-use change and nutrient management to identify high priority sites for implementation of watershed best management practices.

Relevance:

80.00%

Publisher:

Abstract:

Valuation is often said to be “an art not a science”, but this relates to the techniques employed to calculate value, not to the underlying concept itself. Valuation is the process of estimating price in the market place. Yet such an estimation will be affected by uncertainties: uncertainty in the comparable information available, uncertainty in the current and future market conditions, and uncertainty in the specific inputs for the subject property. These input uncertainties will translate into uncertainty in the output figure, the valuation. The degree of uncertainty will vary according to the level of market activity; the more active a market, the more credence will be given to the input information. In the UK, the Royal Institution of Chartered Surveyors (RICS) is currently considering ways in which the uncertainty of the output figure, the valuation, can be conveyed to the user of the valuation, but as yet no definitive view has been taken. One of the major problems is that valuation models (in the UK) are based upon comparable information and rely upon single inputs. They are not probability based, yet uncertainty is probability driven. In this paper, we discuss the issues underlying uncertainty in valuations and suggest a probability-based model (using Crystal Ball) to address the shortcomings of the current model.

Relevance:

80.00%

Publisher:

Abstract:

A weekly programme of water quality monitoring has been conducted by Slapton Ley Field Centre since 1970. Samples have been collected for the four main streams draining into Slapton Ley, from the Ley itself and from other sites within the catchment. On occasions, more frequent sampling has been undertaken during short-term research projects, usually in relation to nutrient export from the catchment. These water quality data, unparalleled in length for a series of small drainage basins in the British Isles, provide a unique resource for analysis of spatial and temporal variations in stream water quality within an agricultural area. Not surprisingly, given the eutrophic status of the Ley, most attention has focused on the nutrients nitrate and phosphate. A number of approaches to modelling nutrient loss have been attempted, including time series analysis and the application of nutrient export and physically-based models.
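The nutrient export calculations this monitoring record supports reduce, at their simplest, to flow-weighted load sums over the sampling intervals. A minimal sketch, with invented weekly nitrate and discharge values (not Slapton data):

```python
# Illustrative annual nutrient-load estimate from weekly spot samples,
# of the kind a long monitoring record supports. Numbers are invented.
weekly_no3 = [6.2, 5.8, 7.1, 6.5]        # nitrate-N concentration, mg/l
weekly_flow = [0.30, 0.25, 0.55, 0.40]   # mean discharge, m3/s

SECONDS_PER_WEEK = 7 * 24 * 3600

def nutrient_load_kg(conc_mg_l, flow_m3_s):
    """Flow-weighted load: sum of c * Q over each sampling interval.

    mg/l * m3/s = g/s, so each interval's contribution is
    c * Q * interval_seconds / 1000 kilograms.
    """
    return sum(c * q * SECONDS_PER_WEEK / 1000
               for c, q in zip(conc_mg_l, flow_m3_s))

load = nutrient_load_kg(weekly_no3, weekly_flow)  # kg over four weeks
```

Export-coefficient and physically-based models go well beyond this, but both are ultimately calibrated against loads computed from the monitored concentration and discharge series in this way.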

Relevance:

80.00%

Publisher:

Abstract:

Steady state and dynamic models have been developed and applied to the River Kennet system. Annual nitrogen exports from the land surface to the river have been estimated based on land use from the 1930s and the 1990s. Long term modelled trends indicate that there has been a large increase in nitrogen transport into the river system, driven by increased fertiliser application associated with increased cereal production, increased population and increased livestock levels. The dynamic model INCA (Integrated Nitrogen in Catchments) has been applied to simulate the day-to-day transport of N from the terrestrial ecosystem to the riverine environment. This process-based model generates spatial and temporal data and reproduces the observed instream concentrations. Applying the model to current land use and 1930s land use indicates that there has been a major shift in the short term dynamics since the 1930s, with increased river and groundwater concentrations caused by both non-point source pollution from agriculture and point source discharges.

Relevance:

80.00%

Publisher:

Abstract:

Quantitative simulations of the global-scale benefits of climate change mitigation are presented, using a harmonised, self-consistent approach based on a single set of climate change scenarios. The approach draws on a synthesis of output from both physically-based and economics-based models, and incorporates uncertainty analyses. Previous studies have projected global and regional climate change and its impacts over the 21st century but have generally focused on analysis of business-as-usual scenarios, with no explicit mitigation policy included. This study finds that both the economics-based and physically-based models indicate that early, stringent mitigation would avoid a large proportion of the impacts of climate change projected for the 2080s. However, it also shows that not all the impacts can now be avoided, so adaptation would also be needed to avoid some of the potential damage. Delay in mitigation substantially reduces the percentage of impacts that can be avoided, providing strong new quantitative evidence for the need for stringent and prompt global mitigation action on greenhouse gas emissions, combined with effective adaptation, if large, widespread climate change impacts are to be avoided. Energy technology models suggest that such stringent and prompt mitigation action is technologically feasible, although the estimated costs vary depending on the specific modelling approach and assumptions.

Relevance:

80.00%

Publisher:

Abstract:

Numerical Weather Prediction (NWP) fields are used to assist the detection of cloud in satellite imagery. Simulated observations based on NWP are used within a framework based on Bayes' theorem to calculate a physically-based probability of each pixel within an imaged scene being clear or cloudy. Different thresholds can be set on the probabilities to create application-specific cloud masks. Here, this is done over both land and ocean using night-time (infrared) imagery. We use a validation dataset of difficult cloud detection targets for the Spinning Enhanced Visible and Infrared Imager (SEVIRI), achieving true skill scores of 87% and 48% for ocean and land, respectively, using the Bayesian technique, compared to 74% and 39%, respectively, for the threshold-based techniques associated with the validation dataset.
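The Bayes'-theorem framework described here can be sketched for a single infrared pixel. In this sketch the clear-sky likelihood is Gaussian about an NWP-simulated brightness temperature and the cloudy likelihood is treated as flat over a 60 K range; these distributional choices and all numbers are illustrative assumptions, not the paper's actual formulation.

```python
import math

def clear_probability(obs_bt, sim_clear_bt, sigma_clear=1.0,
                      p_clear_prior=0.7, cloudy_pdf=1 / 60.0):
    """Bayesian clear/cloudy probability for one infrared pixel.

    P(clear | y) = p(y | clear) P(clear) /
                   [p(y | clear) P(clear) + p(y | cloudy) P(cloudy)]
    with a Gaussian clear-sky likelihood centred on the simulated
    brightness temperature and a flat cloudy likelihood.
    """
    gauss = math.exp(-0.5 * ((obs_bt - sim_clear_bt) / sigma_clear) ** 2)
    p_obs_clear = gauss / (sigma_clear * math.sqrt(2 * math.pi))
    num = p_obs_clear * p_clear_prior
    den = num + cloudy_pdf * (1 - p_clear_prior)
    return num / den

# A pixel close to the simulated clear-sky value scores as likely clear;
# a much colder pixel (cloud tops are cold) scores as likely cloudy.
p_warm = clear_probability(285.0, 285.3)
p_cold = clear_probability(262.0, 285.3)
```

Thresholding the returned probability at different levels then yields the application-specific cloud masks the abstract mentions.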

Relevance:

80.00%

Publisher:

Abstract:

Numerical Weather Prediction (NWP) fields are used to assist the detection of cloud in satellite imagery. Simulated observations based on NWP are used within a framework based on Bayes' theorem to calculate a physically-based probability of each pixel within an imaged scene being clear or cloudy. Different thresholds can be set on the probabilities to create application-specific cloud masks. Here, the technique is shown to be suitable for daytime applications over land and sea, using visible and near-infrared imagery in addition to thermal infrared. We use a validation dataset of difficult cloud detection targets for the Spinning Enhanced Visible and Infrared Imager (SEVIRI), achieving true skill scores of 89% and 73% for ocean and land, respectively, using the Bayesian technique, compared to 90% and 70%, respectively, for the threshold-based techniques associated with the validation dataset.

Relevance:

80.00%

Publisher:

Abstract:

The effect of diurnal variations in sea surface temperature (SST) on the air–sea flux of CO2 over the central Atlantic Ocean and Mediterranean Sea (60° S–60° N, 60° W–45° E) is evaluated for 2005–2006. We use high-spatial-resolution hourly satellite ocean skin temperature data to determine the diurnal warming (ΔSST). The CO2 flux is then computed using three different temperature fields: a foundation temperature (Tf, measured at a depth where there is no diurnal variation), Tf plus the hourly ΔSST, and Tf plus the monthly average of the ΔSSTs. This is done in conjunction with a physically-based parameterisation for the gas transfer velocity (NOAA-COARE). The differences between the fluxes evaluated for these three temperature fields quantify the effects of both diurnal warming and diurnal covariations. We find that including diurnal warming increases the CO2 flux out of this region of the Atlantic for 2005–2006 from 9.6 Tg C a−1 to 30.4 Tg C a−1 (hourly ΔSST) and 31.2 Tg C a−1 (monthly average of ΔSST measurements). Diurnal warming in this region therefore has a large impact on the annual net CO2 flux, but diurnal covariations are negligible. However, in this region of the Atlantic the uptake and outgassing of CO2 are approximately balanced over the annual cycle; although diurnal warming has a very large effect here, the Atlantic as a whole is a very strong carbon sink (e.g. −920 Tg C a−1; Takahashi et al., 2002), making this a small contribution to the Atlantic carbon budget.
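The mechanism by which diurnal warming enters the flux can be sketched with the standard bulk formula F = k · s · (pCO2_w − pCO2_a): warming the skin raises the seawater pCO2 (roughly 4.23% per kelvin, the widely used Takahashi-style temperature sensitivity). All numerical values below are illustrative placeholders, not the paper's inputs, and the NOAA-COARE parameterisation is replaced by a fixed transfer velocity.

```python
import math

# Illustrative constants (not values from the study).
K_GAS = 1.0e-4       # gas transfer velocity (fixed here; NOAA-COARE
                     # would compute it from wind and stability)
SOLUBILITY = 3.0e-2  # CO2 solubility
PCO2_AIR = 380.0     # atmospheric pCO2, microatm

def co2_flux(pco2_water_at_tf, delta_sst):
    """Bulk air-sea CO2 flux with the skin warmed by delta_sst.

    pCO2 of seawater rises ~4.23% per kelvin of warming, so even a
    modest diurnal DSST shifts the air-sea gradient.
    """
    pco2_w = pco2_water_at_tf * math.exp(0.0423 * delta_sst)
    return K_GAS * SOLUBILITY * (pco2_w - PCO2_AIR)

flux_no_warming = co2_flux(385.0, 0.0)   # foundation temperature only
flux_warmed = co2_flux(385.0, 0.5)       # 0.5 K diurnal warming
```

Because the gradient (pCO2_w − pCO2_a) is small where uptake and outgassing nearly balance, a fractional change in pCO2_w produces a disproportionately large change in the net flux, which is why the diurnal term matters so much in this region.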

Relevance:

80.00%

Publisher:

Abstract:

The use of Bayesian inference in the estimation of time-frequency representations has, thus far, been limited to offline analysis of signals, using a smoothing-spline-based model of the time-frequency plane. In this paper we introduce a new framework that allows the routine use of Bayesian inference for online estimation of the time-varying spectral density of a locally stationary Gaussian process. The core of our approach is the use of a likelihood inspired by a local Whittle approximation. This choice, along with the use of a recursive algorithm for non-parametric estimation of the local spectral density, permits the use of a particle filter for estimating the time-varying spectral density online. We provide demonstrations of the algorithm through tracking chirps and the analysis of musical data.
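A flavour of online spectral tracking can be given with a much simpler stand-in than the paper's method: instead of a particle filter with a local Whittle likelihood, the sketch below blends each new frame's periodogram into a running estimate with a forgetting factor. This is a deliberately crude recursive estimator for illustration only, not the authors' algorithm.

```python
import numpy as np

def recursive_spectrum(signal, win=64, forget=0.9):
    """Recursively updated estimate of a time-varying spectrum.

    Each non-overlapping frame's periodogram is folded into a running
    estimate: est <- forget * est + (1 - forget) * periodogram, so the
    estimate tracks slow changes in the local spectral density.
    """
    est = None
    for start in range(0, len(signal) - win + 1, win):
        frame = signal[start:start + win]
        pgram = np.abs(np.fft.rfft(frame)) ** 2 / win
        est = pgram if est is None else forget * est + (1 - forget) * pgram
    return est

# Track a chirp: by the end of the signal the running estimate is
# dominated by the higher instantaneous frequency of the later frames.
t = np.arange(4096)
chirp = np.sin(2 * np.pi * (0.01 + 1e-5 * t) * t)
spec = recursive_spectrum(chirp)
```

The Bayesian particle-filter formulation replaces this ad hoc forgetting factor with a principled state-space model, and delivers posterior uncertainty on the spectrum rather than a point estimate.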

Relevance:

80.00%

Publisher:

Abstract:

Vintage-based vector autoregressive models of a single macroeconomic variable are shown to be a useful vehicle for obtaining forecasts of different maturities of future and past observations, including estimates of post-revision values. The forecasting performance of models which include information on annual revisions is superior to that of models which only include the first two data releases. However, the empirical results indicate that a model which reflects the seasonal nature of data releases more closely does not offer much improvement over an unrestricted vintage-based model which includes three rounds of annual revisions.
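The basic idea of a vintage-based VAR is that each period's observation vector stacks successive data releases of the same variable, so the autoregression learns how later vintages revise earlier ones. A minimal sketch on synthetic data (two vintages, VAR(1) fitted by least squares; the revision process below is invented):

```python
import numpy as np

def fit_var1(Y):
    """Least-squares VAR(1) fit: Y[t] ~ A @ Y[t-1] + e[t].

    Rows of Y are periods; columns are vintages (first release,
    later revised release, ...). lstsq solves X @ coef ~ Z, so the
    transpose gives Z[t] ~ A @ X[t].
    """
    X, Z = Y[:-1], Y[1:]
    coef, *_ = np.linalg.lstsq(X, Z, rcond=None)
    return coef.T

rng = np.random.default_rng(1)
first = rng.normal(2.0, 0.5, 40)            # first-release figures
revised = first + rng.normal(0.1, 0.1, 40)  # later vintage: small revision
Y = np.column_stack([first, revised])

A = fit_var1(Y)
forecast = A @ Y[-1]  # one-step-ahead forecast of both vintages
```

The forecast of the later-vintage column is, in effect, an estimate of the post-revision value of the most recent release, which is the use the abstract describes.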

Relevance:

80.00%

Publisher:

Abstract:

Increasing cereal yield is needed to meet the projected increase of about 70% in world food demand by 2050. Sirius, a process-based model for wheat, was used to estimate yield potential for wheat ideotypes optimized for future climatic projections (HadCM3 global climate model) for ten wheat growing areas of Europe. It was predicted that the detrimental effect of drought stress on yield would be decreased due to enhanced tailoring of phenology to future weather patterns, and due to genetic improvements in the response of photosynthesis and green leaf duration to water shortage. Yield advances could be made through extending maturation, thereby improving resource capture and partitioning. However, the model predicted an increase in frequency of heat stress at meiosis and anthesis. Controlled-environment experiments quantify the effects of heat and drought at booting and flowering on grain numbers and potential grain size. A current adaptation of wheat to areas of Europe with hotter and drier summers is a quicker maturation, which helps to escape from excessive stress but results in lower yields. To increase yield potential and to respond to climate change, increased tolerance to heat and drought stress should remain priorities for the genetic improvement of wheat.

Relevance:

80.00%

Publisher:

Abstract:

Model quality assessment programs (MQAPs) aim to assess the quality of modelled 3D protein structures. The provision of quality scores, describing both global and local (per-residue) accuracy, is extremely important, as without quality scores we are unable to determine the usefulness of a 3D model for further computational and experimental wet lab studies. Here, we briefly discuss protein tertiary structure prediction, along with the biennial Critical Assessment of Techniques for Protein Structure Prediction (CASP) competition and its key role in driving the development of MQAPs. We also briefly discuss the top MQAPs from previous CASP competitions. Additionally, we describe our downloadable and webserver-based model quality assessment methods: ModFOLD3, ModFOLDclust, ModFOLDclustQ, ModFOLDclust2, and IntFOLD-QA. We provide a practical step-by-step guide on using our downloadable and webserver-based tools and include examples of their application for improving tertiary structure prediction, ligand binding site residue prediction, and oligomer predictions.

Relevance:

80.00%

Publisher:

Abstract:

The Bollène-2002 Experiment was aimed at developing the use of a radar volume-scanning strategy for conducting radar rainfall estimations in the mountainous regions of France. A developmental radar processing system, called Traitements Régionalisés et Adaptatifs de Données Radar pour l’Hydrologie (Regionalized and Adaptive Radar Data Processing for Hydrological Applications), has been built and several algorithms were specifically produced as part of this project. These algorithms include 1) a clutter identification technique based on the pulse-to-pulse variability of reflectivity Z for noncoherent radar, 2) a coupled procedure for determining a rain partition between convective and widespread rainfall R and the associated normalized vertical profiles of reflectivity, and 3) a method for calculating reflectivity at ground level from reflectivities measured aloft. Several radar processing strategies, including nonadaptive, time-adaptive, and space–time-adaptive variants, have been implemented to assess the performance of these new algorithms. Reference rainfall data were derived from a careful analysis of rain gauge datasets furnished by the Cévennes–Vivarais Mediterranean Hydrometeorological Observatory. The assessment criteria for five intense and long-lasting Mediterranean rain events have proven that good quantitative precipitation estimates can be obtained from radar data alone within 100-km range by using well-sited, well-maintained radar systems and sophisticated, physically based data-processing systems. The basic requirements entail performing accurate electronic calibration and stability verification, determining the radar detection domain, achieving efficient clutter elimination, and capturing the vertical structure(s) of reflectivity for the target event. 
Radar performance was shown to depend on type of rainfall, with better results obtained with deep convective rain systems (Nash coefficients of roughly 0.90 for point radar–rain gauge comparisons at the event time step), as opposed to shallow convective and frontal rain systems (Nash coefficients in the 0.6–0.8 range). In comparison with time-adaptive strategies, the space–time-adaptive strategy yields a very significant reduction in the radar–rain gauge bias while the level of scatter remains basically unchanged. Because the Z–R relationships have not been optimized in this study, results are attributed to an improved processing of spatial variations in the vertical profile of reflectivity. The two main recommendations for future work consist of adapting the rain separation method for radar network operations and documenting Z–R relationships conditional on rainfall type.
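The final step of the processing chain, converting reflectivity to rain rate through a Z–R relationship, can be sketched directly. The classical Marshall–Palmer coefficients (a = 200, b = 1.6) are used below purely for illustration; the study deliberately leaves optimisation of Z–R relationships, conditional on rainfall type, to future work.

```python
def rain_rate_mm_h(dbz, a=200.0, b=1.6):
    """Invert Z = a * R**b to get rain rate R in mm/h.

    Radar reflectivity is usually reported in dBZ, so first recover
    linear Z (mm^6/m^3) via Z = 10**(dBZ / 10).
    """
    z_linear = 10.0 ** (dbz / 10.0)
    return (z_linear / a) ** (1.0 / b)

# Illustrative reflectivities for the two regimes the study contrasts:
r_light = rain_rate_mm_h(25.0)  # shallow / widespread rain
r_heavy = rain_rate_mm_h(45.0)  # deep convective rain
```

Because b > 1, rain rate grows sub-linearly in linear Z but still spans more than an order of magnitude between the two regimes, which is one reason a single fixed Z–R pairing performs differently on convective versus frontal events.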

Relevance:

80.00%

Publisher:

Abstract:

As weather and climate models move toward higher resolution, there is growing excitement about potential future improvements in the understanding and prediction of atmospheric convection and its interaction with larger-scale phenomena. A meeting in January 2013 in Dartington, Devon, was convened to address the best way to maximise these improvements, specifically in a UK context but with international relevance. Specific recommendations included increased convective-scale observations, high-resolution virtual laboratories, and a system of parameterization test beds with a range of complexities. The main recommendation was to facilitate the development of physically based convective parameterizations that are scale-aware, non-local, non-equilibrium, and stochastic.