50 results for Validation and certification competences process
in CentAUR: Central Archive University of Reading - UK
Abstract:
Global hydrological models (GHMs) simulate the land surface hydrological dynamics of continental-scale river basins. Here we describe one such GHM, the Macro-scale Probability-Distributed Moisture model (Mac-PDM.09). The model has undergone a number of revisions since it was last applied in the hydrological literature. This paper provides a detailed description of the latest version of the model. The main revisions are as follows: (1) the model can be run for n repetitions, which provides more robust estimates of extreme hydrological behaviour; (2) the model can use a gridded field of the coefficient of variation (CV) of daily rainfall for the stochastic disaggregation of monthly precipitation to daily precipitation; and (3) the model can now be forced with daily as well as monthly input climate data. We demonstrate the effect that each of these three revisions has on simulated runoff relative to the pre-revision model. Importantly, we show that when Mac-PDM.09 is forced with monthly input data, a negative runoff bias results relative to daily forcings in regions of the globe where the day-to-day variability in relative humidity is high. The runoff bias can reach -80% for a small selection of catchments, although the absolute magnitude of the bias may be small. As such, we recommend that future applications of Mac-PDM.09 using monthly climate forcings acknowledge this bias as a limitation of the model. The performance of Mac-PDM.09 is evaluated by validating simulated runoff against observed runoff for 50 catchments. We also present a sensitivity analysis demonstrating that simulated runoff is considerably more sensitive to the method of potential evaporation (PE) calculation than to perturbations in the soil moisture and field capacity parameters.
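The stochastic disaggregation in revision (2) can be illustrated with a minimal sketch. The abstract does not specify Mac-PDM.09's actual disaggregation scheme, so the gamma-distribution approach, function name and parameter values below are illustrative assumptions only: daily amounts are drawn with a gamma shape parameter set by the target CV (for a gamma distribution, CV = 1/sqrt(shape)) and then rescaled to conserve the monthly total.

```python
import numpy as np

def disaggregate_monthly(monthly_total, cv, n_days=30, rng=None):
    """Illustrative stochastic disaggregation of a monthly precipitation
    total into daily values with a target coefficient of variation (CV).

    Daily amounts are drawn from a gamma distribution whose shape is set
    by the CV, then rescaled so the days sum exactly to the monthly total.
    """
    rng = rng or np.random.default_rng(42)
    shape = 1.0 / cv**2                      # gamma shape from CV = 1/sqrt(shape)
    draws = rng.gamma(shape, 1.0, size=n_days)
    return draws * (monthly_total / draws.sum())

# Example: 120 mm in a 30-day month with CV of daily rainfall = 1.5
daily = disaggregate_monthly(120.0, cv=1.5, n_days=30)
```

The rescaling step guarantees mass conservation at the monthly scale, which is the essential constraint any disaggregation scheme must satisfy.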
Abstract:
The applications of rheology to the main processes encountered during breadmaking (mixing, sheeting, fermentation and baking) are reviewed. The most commonly used rheological test methods and their relationships to product functionality are reviewed. It is shown that the most commonly used method for rheological testing of doughs, shear oscillation dynamic rheology, is generally used under deformation conditions inappropriate for breadmaking and shows little relationship with end-use performance. The frequency range used in conventional shear oscillation tests is limited to the plateau region, which is insensitive to changes in the HMW glutenin polymers thought to be responsible for variations in baking quality. The appropriate deformation conditions can be accessed either by long-time creep or relaxation measurements, or by large deformation extensional measurements at low strain rates and elevated temperatures. Molecular size and structure of the gluten polymers that make up the major structural components of wheat are related to their rheological properties via modern polymer rheology concepts. Interactions between polymer chain entanglements and branching are seen to be the key mechanisms determining the rheology of HMW polymers. Recent work confirms the observation that the dynamic shear plateau modulus is essentially independent of variations in MW of glutens amongst wheat varieties of varying baking performance and also that it is not the size of the soluble glutenin polymers, but the secondary structural and rheological properties of the insoluble polymer fraction that are mainly responsible for variations in baking performance. Extensional strain hardening has been shown to be a sensitive indicator of entanglements and long-chain branching in HMW polymers, and is well related to baking performance of bread doughs. 
The Considère failure criterion for instability in extension of polymers defines a region below which bubble walls become unstable, and predicts that when strain hardening falls below a value of around 1, bubble walls are no longer stable and coalesce rapidly, resulting in loss of gas retention, lower loaf volume and poorer texture. Strain hardening in doughs has been shown to reach this value at increasingly higher temperatures for better breadmaking varieties and is directly related to bubble stability and baking performance. (C) 2003 Elsevier Ltd. All rights reserved.
Abstract:
Typically, algorithms for generating stereo disparity maps have been developed to minimise the energy equation of a single image. This paper proposes a method for implementing cross validation in a belief propagation optimisation. When tested using the Middlebury online stereo evaluation, the cross validation improves upon the results of standard belief propagation. Furthermore, it has been shown that regions of homogeneous colour within the images can be used for enforcing the so-called "Segment Constraint". Developing from this, Segment Support is introduced to boost belief between pixels of the same image region and improve propagation into textureless regions.
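One common way to implement cross validation between the two views, sketched here under the assumption of integer-valued disparity maps (the paper's exact scheme within belief propagation is not given in the abstract), is a left-right consistency check: a disparity d in the left map at column x is kept only if the right map at the matched column x - d agrees to within a small tolerance.

```python
import numpy as np

def cross_validate_disparity(disp_left, disp_right, tol=1):
    """Left-right consistency check for stereo disparity maps.
    Returns a boolean mask: True where the left disparity is confirmed
    by the right map at the matched column, within `tol`."""
    h, w = disp_left.shape
    xs = np.arange(w)
    mask = np.zeros((h, w), dtype=bool)
    for y in range(h):
        d = disp_left[y]
        xr = xs - d                          # matching column in the right image
        valid = (xr >= 0) & (xr < w)         # matches that fall inside the image
        agree = np.zeros(w, dtype=bool)
        agree[valid] = np.abs(disp_right[y, xr[valid]] - d[valid]) <= tol
        mask[y] = agree
    return mask
```

Pixels failing the check (typically occlusions or mismatches) can then be down-weighted or re-estimated during the optimisation.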
Abstract:
This paper reviews and critiques the current practice of classifying building clients according to their 'type'. An alternative approach to understanding organisations is developed in accordance with the principles of naturalistic inquiry. It is contended that the complex pluralistic clients of the 1990s can only really be understood 'from the inside'. The concept of organisational metaphors is introduced as the basis for a more sophisticated way of thinking about organisations. The various strands of organisational theory are also analyzed in terms of their underlying metaphors. Different theories are seen to bring different insights. The implicit metaphors adopted by practitioners are held to be important in that they tend to dictate the adopted approach to client briefing. This contention is illustrated by analyzing three different characterisations of the briefing process in terms of their underlying metaphors. Finally, the discussion is placed in a contemporary UK context by comparing the dominant paradigm of practice during the 1980s to that of the 1990s.
Abstract:
We introduce the notion that the energy of individuals can manifest as a higher-level, collective construct. To this end, we conducted four independent studies to investigate the viability and importance of the collective energy construct as assessed by a new survey instrument—the productive energy measure (PEM). Study 1 (n = 2208) included exploratory and confirmatory factor analyses to explore the underlying factor structure of the PEM. Study 2 (n = 660) cross-validated the same factor structure in an independent sample. In Study 3, we administered the PEM to more than 5000 employees from 145 departments located in five countries. Results from measurement invariance, statistical aggregation, convergent, and discriminant-validity assessments offered additional support for the construct validity of the PEM. In terms of predictive and incremental validity, the PEM was positively associated with three collective attitudes—units' commitment to goals, commitment to the organization, and overall satisfaction. In Study 4, we explored the relationship between the productive energy of firms and their overall performance. Using data from 92 firms (n = 5939 employees), we found a positive relationship between the PEM (aggregated to the firm level) and the performance of those firms. Copyright © 2011 John Wiley & Sons, Ltd.
Abstract:
Models for water transfer in the crop-soil system are key components of agro-hydrological models for irrigation, fertilizer and pesticide practices. Many of the hydrological models for water transfer in the crop-soil system are either too approximate, owing to oversimplified algorithms, or employ complex numerical schemes. In this paper we developed a simple and sufficiently accurate algorithm which can be easily adopted in agro-hydrological models for the simulation of water dynamics. We used the dual crop coefficient approach proposed by the FAO for estimating potential evaporation and transpiration, and a dynamic model for calculating relative root length distribution on a daily basis. Within a small time step of 0.001 d, we implemented algorithms separately for actual evaporation, root water uptake and soil water content redistribution by decoupling these processes. The Richards equation describing soil water movement was solved using an integration strategy over the soil layers instead of complex numerical schemes. This drastically simplified the modelling of soil water and led to much shorter computer code. The validity of the proposed model was tested against data from field experiments on two contrasting soils cropped with wheat. Good agreement was achieved between measured and simulated soil water content at various depths collected at intervals during crop growth. This indicates that the model is satisfactory in simulating water transfer in the crop-soil system, and can therefore reliably be adopted in agro-hydrological models. Finally, we demonstrated how the developed model could be used to study the effect of changes in the environment, such as a lowering of the groundwater table caused by the construction of a motorway, on crop transpiration. (c) 2009 Elsevier B.V. All rights reserved.
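The FAO dual crop coefficient approach mentioned above partitions reference evapotranspiration ET0 into a transpiration component scaled by the basal crop coefficient Kcb and a soil evaporation component scaled by the evaporation coefficient Ke, so that ETc = (Kcb + Ke) * ET0. A minimal sketch of this split (the function name and example values are illustrative, not taken from the paper):

```python
def dual_crop_et(et0, kcb, ke):
    """FAO-56 dual crop coefficient approach: reference evapotranspiration
    et0 is partitioned into crop transpiration (basal coefficient kcb)
    and soil evaporation (coefficient ke). Returns all three terms in
    the units of et0."""
    transpiration = kcb * et0
    evaporation = ke * et0
    return transpiration, evaporation, transpiration + evaporation

# Example: et0 = 5.0 mm/day with mid-season-like coefficients
t, e, etc = dual_crop_et(et0=5.0, kcb=0.9, ke=0.2)
```

Separating the two terms is what allows the model described above to treat actual evaporation and root water uptake as decoupled processes within each small time step.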
Abstract:
The quality control, validation and verification of the European Flood Alert System (EFAS) are described. EFAS is designed as a flood early warning system at the pan-European scale, to complement national systems and provide flood warnings more than 2 days before a flood. On average, 20–30 alerts per year are sent out to the EFAS partner network, which consists of 24 national hydrological authorities responsible for transnational river basins. Quality control of the system includes the evaluation of hits, misses and false alarms, showing that EFAS achieves hits more than 50% of the time. Furthermore, the skill of both the meteorological and the hydrological forecasts is evaluated, and results are included here for a 10-year period. Finally, end-user needs and feedback are systematically analysed, and suggested improvements, such as real-time river discharge updating, are currently being implemented.
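The hit/miss/false-alarm evaluation described above is a standard contingency-table analysis. A minimal sketch (the exact EFAS scoring procedure is not given in the abstract; the function name and example counts are illustrative) computing two common scores, the probability of detection and the false alarm ratio:

```python
def forecast_skill(hits, misses, false_alarms):
    """Contingency-table scores for an alert system:
    POD (probability of detection, the hit rate) = hits / (hits + misses);
    FAR (false alarm ratio) = false_alarms / (hits + false_alarms)."""
    pod = hits / (hits + misses)
    far = false_alarms / (hits + false_alarms)
    return pod, far

# Example: a period with 15 hits, 10 misses and 5 false alarms
pod, far = forecast_skill(hits=15, misses=10, false_alarms=5)  # → 0.6, 0.25
```

A POD above 0.5 corresponds to the "hits more than 50% of the time" statement in the abstract.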
Abstract:
Refractivity changes (ΔN) derived from radar ground clutter returns serve as a proxy for near-surface humidity changes (1 N unit ≡ 1% relative humidity at 20 °C). Previous studies have indicated that better humidity observations should improve forecasts of convection initiation. A preliminary assessment of the potential of refractivity retrievals from an operational magnetron-based C-band radar is presented. The increased phase noise at shorter wavelengths, exacerbated by the unknown position of the target within the 300 m gate, make it difficult to obtain absolute refractivity values, so we consider the information in 1 h changes. These have been derived to a range of 30 km with a spatial resolution of ∼4 km; the consistency of the individual estimates (within each 4 km × 4 km area) indicates that ΔN errors are about 1 N unit, in agreement with in situ observations. Measurements from an instrumented tower on summer days show that the 1 h refractivity changes up to a height of 100 m remain well correlated with near-surface values. The analysis of refractivity as represented in the operational Met Office Unified Model at 1.5, 4 and 12 km grid lengths demonstrates that, as model resolution increases, the spatial scales of the refractivity structures improve. It is shown that the magnitude of refractivity changes is progressively underestimated at larger grid lengths during summer. However, the daily time series of 1 h refractivity changes reveal that, whereas the radar-derived values are very well correlated with the in situ observations, the high-resolution model runs have little skill in getting the right values of ΔN in the right place at the right time. This suggests that the assimilation of these radar refractivity observations could benefit forecasts of the initiation of convection.
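The N-unit/humidity equivalence quoted above follows from the standard two-term radio refractivity formula, N = 77.6 P/T + 3.73e5 e/T^2 (total pressure P and water vapour pressure e in hPa, temperature T in kelvin); the second, "wet" term is what makes N sensitive to humidity. A sketch using the widely used Smith-Weintraub constants (the function name is ours, not the paper's):

```python
def refractivity(p_hpa, t_k, e_hpa):
    """Radio refractivity in N units from the Smith-Weintraub formula:
    dry term 77.6*P/T plus wet term 3.73e5*e/T**2
    (P and e in hPa, T in kelvin)."""
    return 77.6 * p_hpa / t_k + 3.73e5 * e_hpa / t_k**2

# At 20 C (293.15 K) the saturation vapour pressure is ~23.4 hPa, so a
# 1% change in relative humidity changes e by ~0.234 hPa:
t = 293.15
dn = refractivity(1013.0, t, 11.934) - refractivity(1013.0, t, 11.7)
# dn is ~1 N unit, consistent with the 1 N unit = 1% RH equivalence above
```

This is why hour-to-hour changes in N serve as a usable proxy for near-surface humidity changes.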
Abstract:
This review is an output of the International Life Sciences Institute (ILSI) Europe Marker Initiative, which aims to identify evidence-based criteria for selecting adequate measures of nutrient effects on health through comprehensive literature review. Experts in cognitive and nutrition sciences examined the applicability of these proposed criteria to the field of cognition with respect to the various cognitive domains usually assessed to reflect brain or neurological function. This review covers cognitive domains important in the assessment of neuronal integrity and function, commonly used tests and their state of validation, and the application of the measures to studies of nutrition and nutritional intervention trials. The aim is to identify domain-specific cognitive tests that are sensitive to nutrient interventions and from which guidance can be provided to aid the application of selection criteria for choosing the most suitable tests for proposed nutritional intervention studies using cognitive outcomes. The material in this review serves as a background and guidance document for nutritionists, neuropsychologists, psychiatrists, and neurologists interested in assessing mental health in terms of cognitive test performance and for scientists intending to test the effects of food or food components on cognitive function.
Abstract:
We report on the first real-time ionospheric predictions network and its capabilities to ingest a global database and forecast F-layer characteristics and "in situ" electron densities along the track of an orbiting spacecraft. A global network of ionosonde stations reported around-the-clock observations of F-region heights and densities, and an on-line library of models provided forecasting capabilities. Each model was tested against the incoming data; relative accuracies were intercompared to determine the best overall fit to the prevailing conditions; and the best-fit model was used to predict ionospheric conditions on an orbit-to-orbit basis for the 12-hour period following a twice-daily model test and validation procedure. It was found that the best-fit model often provided averaged (i.e., climatologically based) accuracies better than 5% in predicting the heights and critical frequencies of the F-region peaks in the latitudinal domain of the TSS-1R flight path. There was a sharp contrast, however, in model-measurement comparisons involving predictions of actual, unaveraged, along-track densities at the 295 km orbital altitude of TSS-1R. In this case, extrema in the first-principle models varied by as much as an order of magnitude in density predictions, and the best-fit models were found to disagree with the "in situ" observations of Ne by as much as 140%. The discrepancies are interpreted as a manifestation of difficulties in accurately and self-consistently modeling the external controls of solar and magnetospheric inputs and the spatial and temporal variabilities in electric fields, thermospheric winds, plasmaspheric fluxes, and chemistry.
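The twice-daily "test each model against incoming data and select the best fit" step can be sketched generically. The scoring metric actually used by the network is not stated in the abstract, so RMSE, the function name and the model names below are illustrative assumptions:

```python
import math

def best_fit_model(models, observations):
    """Score each candidate model by RMSE against incoming observations
    and return the name of the best-fitting one.
    `models` maps a model name to predictions aligned with `observations`."""
    def rmse(pred):
        return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, observations))
                         / len(observations))
    return min(models, key=lambda name: rmse(models[name]))

# Example: critical-frequency observations (MHz) vs two candidate models
obs = [8.1, 7.9, 8.4]
best = best_fit_model({"model_a": [8.0, 8.0, 8.3],
                       "model_b": [7.0, 7.0, 7.0]}, obs)  # → "model_a"
```

The selected model is then used for all orbit-to-orbit predictions until the next test-and-validation cycle.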
Abstract:
There is little consensus on how agriculture will meet future food demands sustainably. Soils and their biota play a crucial role by mediating ecosystem services that support agricultural productivity. However, a multitude of site-specific environmental factors and management practices interact to affect the ability of soil biota to perform vital functions, confounding the interpretation of results from experimental approaches. Insights can be gained through models, which integrate the physiological, biological and ecological mechanisms underpinning soil functions. We present a powerful modelling approach for predicting how agricultural management practices (pesticide applications and tillage) affect soil functioning through earthworm populations. By combining energy budgets and individual-based simulation models, and integrating key behavioural and ecological drivers, we accurately predict population responses to pesticide applications in different climatic conditions. We use the model to analyse the ecological consequences of different weed management practices. Our results demonstrate that an important link between agricultural management (herbicide applications and zero, reduced and conventional tillage) and earthworms is the maintenance of soil organic matter (SOM). We show how zero and reduced tillage practices can increase crop yields while preserving natural ecosystem functions. This demonstrates how management practices which aim to sustain agricultural productivity should account for their effects on earthworm populations, as their proliferation stimulates agricultural productivity. Synthesis and applications. Our results indicate that conventional tillage practices have longer term effects on soil biota than pesticide control, if the pesticide has a short dissipation time. The risk of earthworm populations becoming exposed to toxic pesticides will be reduced under dry soil conditions. 
Similarly, an increase in soil organic matter could increase the recovery rate of earthworm populations. However, effects are not necessarily additive, and the impact of different management practices on earthworms depends on their timing and the prevailing environmental conditions. Our model can be used to determine which combinations of crop management practices and climatic conditions pose the least overall risk to earthworm populations. Linking our model mechanistically to crop yield models would aid the optimization of crop management systems by exploring the trade-offs between different ecosystem services.
Abstract:
A wide variety of exposure models are currently employed for health risk assessments. Individual models have been developed to meet the chemical exposure assessment needs of Government, industry and academia. These existing exposure models can be broadly categorised according to the following types of exposure source: environmental, dietary, consumer product, occupational, and aggregate and cumulative. Aggregate exposure models consider multiple exposure pathways, while cumulative models consider multiple chemicals. In this paper each of these basic types of exposure model are briefly described, along with any inherent strengths or weaknesses, with the UK as a case study. Examples are given of specific exposure models that are currently used, or that have the potential for future use, and key differences in modelling approaches adopted are discussed. The use of exposure models is currently fragmentary in nature. Specific organisations with exposure assessment responsibilities tend to use a limited range of models. The modelling techniques adopted in current exposure models have evolved along distinct lines for the various types of source. In fact different organisations may be using different models for very similar exposure assessment situations. This lack of consistency between exposure modelling practices can make understanding the exposure assessment process more complex, can lead to inconsistency between organisations in how critical modelling issues are addressed (e.g. variability and uncertainty), and has the potential to communicate mixed messages to the general public. Further work should be conducted to integrate the various approaches and models, where possible and regulatory remits allow, to get a coherent and consistent exposure modelling process. 
We recommend the development of an overall framework for exposure and risk assessment with common approaches and methodology, a screening tool for exposure assessment, collection of better input data, probabilistic modelling, validation of model input and output, and a closer working relationship between scientists, policy makers and staff from different Government departments. A much increased effort is required in the UK to address these issues. The result will be a more robust, transparent, valid and more comparable exposure and risk assessment process. (C) 2006 Elsevier Ltd. All rights reserved.