68 results for Value-based pricing
Abstract:
Substituting grass silage with maize silage in forage mixtures may result in one forage influencing the nutritive value of another in terms of whole-tract nutrient digestibility and N utilisation. This experiment investigated the effects of four forage combinations: grass silage (G); 67 g/100 g grass silage + 33 g/100 g maize silage (GGM); 67 g/100 g maize silage + 33 g/100 g grass silage (MMG); and maize silage (M). All diets were formulated to be isonitrogenous (22.4 g N/kg dry matter [DM]) using a concentrate mixture. Ration digestibility and N balance were determined using 7 Holstein Friesian steers (mean body weight 411.0 +/- 120.9 kg) in a cross-over design. Inclusion of maize silage in the diet had a positive linear effect on forage and total DM intake (P = 0.001), and on apparent DM and organic matter digestibility (both P = 0.048). Regardless of the silage ratio used, the metabolisable energy concentration of maize silage was calculated to be higher than that of grass silage (P = 0.058), and linearly related to the relative proportions of the two silages in the forage mixture. Inclusion of maize silage in the diet resulted in a linear decline in the apparent digestibility of starch (P = 0.022), neutral detergent fibre (P < 0.001) and acid detergent fibre (P = 0.003). Nitrogen retention, expressed as the amount retained per day or in terms of body weight (g/100 kg), increased linearly with maize inclusion (P = 0.047 and 0.046, respectively). Replacing grass silage with maize silage caused linear responses according to the proportions of each forage in the diet, and there were no associative effects of combining forages. (C) 2004 Elsevier B.V. All rights reserved.
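As an illustration of the statistical reasoning above, a minimal sketch (Python, with hypothetical intake values, not the paper's data) of how a linear dose response and the absence of associative effects can be checked: regress the response on the maize-silage proportion and inspect whether a quadratic (curvature) term is needed.

```python
import numpy as np

# Hypothetical DM intake (kg/d) for the four diets G, GGM, MMG, M,
# i.e. maize-silage proportions 0, 0.33, 0.67 and 1.0 of the forage.
maize_prop = np.array([0.0, 0.33, 0.67, 1.0])
dm_intake = np.array([7.1, 7.6, 8.2, 8.8])  # illustrative means only

# Linear fit: a purely linear response implies additive forage effects.
lin = np.polyfit(maize_prop, dm_intake, 1)
# Quadratic fit: a sizeable curvature term would indicate an
# associative (non-additive) effect of combining the two forages.
quad = np.polyfit(maize_prop, dm_intake, 2)

print(f"slope = {lin[0]:.2f} kg/d per unit maize proportion")
print(f"quadratic (associative) term = {quad[0]:.3f}")
```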
Abstract:
The formulation of a new process-based crop model, the general large-area model (GLAM) for annual crops, is presented. The model has been designed to operate on spatial scales commensurate with those of global and regional climate models. It aims to simulate the impact of climate on crop yield. Procedures for model parameter determination and optimisation are described, and demonstrated for the prediction of groundnut (i.e. peanut; Arachis hypogaea L.) yields across India for the period 1966-1989. Optimal parameters (e.g. extinction coefficient, transpiration efficiency, rate of change of harvest index) were stable over space and time, provided the estimate of the yield technology trend was based on the full 24-year period. The model has two location-specific parameters: the planting date and the yield gap parameter. The latter varies spatially and is determined by calibration; its optimal value varies slightly when different input data are used. The model was tested using a historical data set on a 2.5° x 2.5° grid to simulate yields. Three sites are examined in detail: grid cells from Gujarat in the west, Andhra Pradesh towards the south, and Uttar Pradesh in the north. Agreement between observed and modelled yield was variable, with correlation coefficients of 0.74, 0.42 and 0, respectively. Skill was highest where the climate signal was greatest, and correlations were comparable to or greater than those with seasonal mean rainfall. Yields from all 35 cells were aggregated to simulate all-India yield; the correlation coefficient between observed and simulated yields was 0.76, and the root mean square error was 8.4% of the mean yield. The model can easily be extended to any annual crop for investigating the impacts of climate variability (or change) on crop yield over large areas. (C) 2004 Elsevier B.V. All rights reserved.
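The skill measures quoted (correlation coefficient, and root mean square error as a percentage of mean yield) can be reproduced from any paired observed/simulated series; a minimal sketch with hypothetical yields:

```python
import numpy as np

# Hypothetical all-India observed and GLAM-simulated yields (kg/ha).
obs = np.array([810., 840., 795., 900., 870., 930.])
sim = np.array([825., 815., 810., 880., 905., 915.])

r = np.corrcoef(obs, sim)[0, 1]            # correlation coefficient
rmse = np.sqrt(np.mean((sim - obs) ** 2))  # root mean square error
rmse_pct = 100.0 * rmse / obs.mean()       # RMSE as % of mean yield

print(f"r = {r:.2f}, RMSE = {rmse_pct:.1f}% of mean observed yield")
```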
Abstract:
Grass-based diets are of increasing socio-economic importance in dairy cattle farming, but their low supply of glucogenic nutrients may limit the production of milk. Current evaluation systems that assess the energy supply and requirements are based on metabolisable energy (ME) or net energy (NE). These systems do not consider the characteristics of the energy-delivering nutrients. In contrast, mechanistic models take into account the site of digestion, the type of nutrient absorbed and the type of nutrient required for production of milk constituents, and may therefore give a better prediction of nutrient supply and requirement. The objective of the present study is to compare the ability of three energy evaluation systems, viz. the Dutch NE system, the Agricultural and Food Research Council (AFRC) ME system and the Feed into Milk (FIM) ME system, and of a mechanistic model based on Dijkstra et al. [Simulation of digestion in cattle fed sugar cane: prediction of nutrient supply for milk production with locally available supplements. J. Agric. Sci., Cambridge 127, 247-60] and Mills et al. [A mechanistic model of whole-tract digestion and methanogenesis in the lactating dairy cow: model development, evaluation and application. J. Anim. Sci. 79, 1584-97], to predict the feed value of grass-based diets for milk production. The dataset for evaluation consists of 41 treatments of grass-based diets (at least 0.75 g ryegrass/g diet on a DM basis). For each model, the predicted energy or nutrient supply, based on observed intake, was compared with the predicted requirement based on observed performance. The error of energy or nutrient supply relative to requirement was assessed by calculating the mean square prediction error (MSPE) and the concordance correlation coefficient (CCC). All energy evaluation systems predicted energy requirement to be lower (6-11%) than energy supply. The root MSPE (expressed as a proportion of the supply) was lowest for the mechanistic model (0.061), followed by the Dutch NE system (0.082), the FIM ME system (0.097) and the AFRC ME system (0.118). For the energy evaluation systems, the error due to overall bias of prediction dominated the MSPE, whereas for the mechanistic model, proportionally 0.76 of MSPE was due to random variation. CCC analysis confirmed the higher accuracy and precision of the mechanistic model compared with the energy evaluation systems. The error of prediction was positively related to grass protein content for the Dutch NE system, and to grass DM intake level for all models. In conclusion, current energy evaluation systems overestimate energy supply relative to energy requirement on grass-based diets for dairy cattle. The mechanistic model predicted glucogenic nutrients to limit performance of dairy cattle on grass-based diets, and proved more accurate and precise than the energy systems. The mechanistic model could be improved by allowing the parameters for glucose maintenance and utilization requirements to be variable. (C) 2007 Elsevier B.V. All rights reserved.
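A minimal sketch (hypothetical values, not the study's data) of the two comparison statistics used here: MSPE with its standard decomposition into overall bias, deviation of the regression slope, and random variation, plus Lin's concordance correlation coefficient:

```python
import numpy as np

# Hypothetical predicted supply vs. predicted requirement (MJ ME/d);
# illustrative numbers only.
supply = np.array([195., 210., 188., 225., 205., 215.])
requirement = np.array([182., 196., 180., 205., 190., 208.])

d = supply - requirement
mspe = np.mean(d ** 2)

sx, sy = supply.std(), requirement.std()  # population SDs
r = np.corrcoef(supply, requirement)[0, 1]

bias_sq = (supply.mean() - requirement.mean()) ** 2  # overall bias
slope_dev = (sx - r * sy) ** 2                       # slope deviation
rand_var = (1.0 - r ** 2) * sy ** 2                  # random variation

# Lin's concordance correlation coefficient (accuracy x precision).
ccc = (2 * r * sx * sy) / (
    sx ** 2 + sy ** 2 + (supply.mean() - requirement.mean()) ** 2)

print(f"root MSPE / mean supply = {np.sqrt(mspe) / supply.mean():.3f}")
print(f"MSPE fractions: bias {bias_sq/mspe:.2f}, "
      f"slope {slope_dev/mspe:.2f}, random {rand_var/mspe:.2f}")
print(f"CCC = {ccc:.3f}")
```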
Abstract:
The term commercial management has been used for some time, as has the job title commercial manager; however, little emphasis has yet been placed on defining either. This paper presents the findings from a two-year research initiative that compared and contrasted the role of commercial managers across a range of organisations and industry sectors, as a first step in developing a body of knowledge for commercial management. It is argued that there are compelling reasons for considering commercial management not solely as a task undertaken by commercial managers, but as a discipline in itself: a discipline that, arguably, bridges traditional project management and organisational theories. While the study established differences in approach and application both between and within industry sectors, it found sufficient similarity and synergy in practice to identify a specific role for commercial management in project-based organisations. The similarities encompass contract management and dispute resolution; the divergences include a greater involvement in financial and value management in construction, and in bid management in defence/aerospace.
Abstract:
The management of information in engineering organisations faces a particular challenge in the ever-increasing volume of information. It has been recognised that an effective methodology is required to evaluate information, in order to avoid information overload and to retain the right information for reuse. Using as a starting point a number of current tools and techniques that attempt to obtain 'the value' of information, it is proposed that an assessment or filter mechanism for information needs to be developed. This paper addresses this issue firstly by briefly reviewing the information overload problem, the definition of value, and related research on the value of information in various areas. A "characteristic"-based framework of information evaluation is then introduced, using the key characteristics identified from related work as an example. A Bayesian network method is introduced to the framework to build the linkage between the characteristics and information value, in order to quantitatively calculate the quality and value of information. The training and verification process for the model is then described using 60 real engineering documents as a sample. The model gives reasonably accurate results; the differences between the model calculations and the training judgements are summarised and their potential causes discussed. Finally, several further issues are raised, including challenges to the framework and the implementation of this evaluation method.
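A minimal sketch of the underlying idea, with an entirely hypothetical network (the node names, structure and probabilities below are illustrative placeholders, not those of the paper): document characteristics feed a binary "high value" node whose marginal probability is obtained by enumeration.

```python
# Toy Bayesian network: two binary characteristics -> binary "high value".
# All structure and probabilities here are hypothetical placeholders.

# Prior beliefs about the characteristics of an incoming document.
p_reusable = 0.6   # P(content is reusable)
p_traceable = 0.7  # P(provenance is traceable)

# Hypothetical conditional probability table P(high value | characteristics).
cpt = {
    (True, True): 0.90,
    (True, False): 0.60,
    (False, True): 0.40,
    (False, False): 0.05,
}

# P(high value): sum over all characteristic states (direct enumeration).
p_value = sum(
    cpt[(r, t)]
    * (p_reusable if r else 1 - p_reusable)
    * (p_traceable if t else 1 - p_traceable)
    for r in (True, False)
    for t in (True, False)
)
print(f"P(document is high value) = {p_value:.3f}")
```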
Abstract:
This study analyzes the short-term consequences of visitors' use of different types of exhibits (i.e., "exemplars of phenomena" and "analogy-based"), together with the factors affecting visitors' understanding and their evaluation of the use of such exhibits. One hundred and twenty-five visitors (either alone or in groups) were observed during their interaction and interviewed immediately afterwards. Findings suggest that the type of exhibit constrains the nature of the understanding achieved. The use of analogical reasoning may lead to the intended causal explanation of an exhibit that is an exemplar of a phenomenon, but visitors often express misconceptions as a consequence of using this type of exhibit. Analogy-based exhibits are often not used as intended by the designer. This may be because visitors do not access the intended source domain; are unaware of the use of analogy per se (in particular, when the exhibit is of the subtype "only showing similarities between relationships"); only acquire fragmentary knowledge about the target; or fail to use analogical reasoning of which they were capable. Furthermore, exhibits related to everyday situations are recognized as having immediate educative value for visitors. Suggestions for enhancing the educative value of exhibits are proposed.
Abstract:
The temperature-time profiles of 22 Australian industrial ultra-high-temperature (UHT) plants and 3 pilot plants, using both indirect and direct heating, were surveyed. From these data, the operating parameters of each plant, the chemical index C*, the bacteriological index B* and the predicted changes in the levels of beta-lactoglobulin, alpha-lactalbumin, lactulose, furosine and browning were determined using a simulation program based on published formulae and reaction kinetics data. There was a wide spread of heating conditions in use, some of which resulted in a large margin of bacteriological safety and high chemical indices. However, no conditions were severe enough to cause browning during processing. The data showed a clear distinction between the indirect and direct heating plants. They also indicated that the degree of denaturation of alpha-lactalbumin varied over a wide range and may be a useful discriminatory index of heat treatment. Application of the program to pilot plants illustrated its value in determining the processing conditions needed in these plants to simulate the conditions of industrial UHT plants. (C) 2008 Elsevier Ltd. All rights reserved.
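A minimal sketch of how such indices are computed from a temperature-time profile, assuming the widely published Kessler-type definitions of B* and C* (the paper's exact formulae and kinetics are not reproduced), with a hypothetical heating profile:

```python
import numpy as np

# Hypothetical UHT temperature-time profile (time in s, temperature in C).
t = np.array([0., 10., 20., 24., 28., 38., 48.])        # time, s
T = np.array([80., 120., 137., 140., 137., 120., 80.])  # temperature, C

def trap(y, x):
    """Trapezoidal integration of y over x."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

# Kessler-type indices (assumed definitions, not taken from the paper):
#   B* = integral 10^((T-135)/10.5) dt / 10.1 s  (B* >= 1: bacteriologically safe)
#   C* = integral 10^((T-135)/31.4) dt / 30.5 s  (C* <= 1: chemically mild)
b_star = trap(10 ** ((T - 135.0) / 10.5), t) / 10.1
c_star = trap(10 ** ((T - 135.0) / 31.4), t) / 30.5

print(f"B* = {b_star:.2f}, C* = {c_star:.2f}")
```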
Abstract:
We have studied 'food grade' sialyloligosaccharides (SOS) as anti-adhesive drugs or receptor analogues, since the terminal sialic acid residue has already been shown to contribute significantly to the adhesion and pathogenesis of the Vibrio cholerae toxin (Ctx). GM1-oligosaccharide (GM1-OS) was immobilized in a supporting POPC lipid bilayer on a surface plasmon resonance (SPR) chip, and the interaction between uninhibited Ctx and GM1-OS-POPC was measured. SOS inhibited 94.7% of the Ctx binding to GM1-OS-POPC at 10 mg/mL. The SOS EC50 value of 5.521 mg/mL is high compared with 0.2811 mu g/mL (182.5 pM, or 1.825 x 10^-10 M) for GM1-OS. The commercially available sialyloligosaccharide (SOS) mixture Sunsial E(R) is impure, containing one monosialylated and two disialylated oligosaccharides at 9.6%, 6.5% and 17.5%, respectively, together with 66.4% protein. However, these inexpensive food-grade molecules are derived from egg yolk and could be used to fortify conventional food additives, by way of emulsifiers, sweeteners and/or preservatives. The work further supports our hypothesis that SOS could be a promising natural anti-adhesive glycomimetic against Ctx, preventing subsequent onset of disease. (C) 2009 Elsevier Ltd. All rights reserved.
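An EC50 of the kind reported is typically estimated by fitting a sigmoidal dose-response (Hill) curve to inhibition data; a minimal sketch with hypothetical readings, using SciPy (the paper's actual fitting procedure is not stated here):

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical % inhibition of Ctx binding at several SOS doses (mg/mL).
dose = np.array([0.1, 0.5, 1.0, 2.5, 5.0, 7.5, 10.0])
inhib = np.array([3., 12., 22., 38., 49., 71., 95.])

def hill(c, top, ec50, n):
    """Sigmoidal dose-response: 0 at c = 0, 'top' at saturation."""
    return top * c ** n / (ec50 ** n + c ** n)

params, _ = curve_fit(hill, dose, inhib, p0=[100.0, 5.0, 1.0])
top, ec50, n = params
print(f"EC50 = {ec50:.2f} mg/mL (Hill slope {n:.2f})")
```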
Abstract:
Objective: This paper presents a detailed study of fractal-based methods for texture characterization of mammographic mass lesions and architectural distortion. The purpose of this study is to explore the use of fractal and lacunarity analysis for the characterization and classification of both tumor lesions and normal breast parenchyma in mammography. Materials and methods: We conducted comparative evaluations of five popular fractal dimension estimation methods for characterizing the texture of mass lesions and architectural distortion. We applied the concept of lacunarity to describe the spatial distribution of pixel intensities in mammographic images. These methods were tested on a set of 57 breast masses and 60 normal breast parenchyma samples (dataset1), and on another set of 19 architectural distortions and 41 normal breast parenchyma samples (dataset2). Support vector machines (SVM) were used as the pattern classification method for tumor classification. Results: Experimental results showed that the fractal dimension of regions of interest (ROIs) depicting mass lesions and architectural distortion was statistically significantly lower than that of normal breast parenchyma for all five methods. Receiver operating characteristic (ROC) analysis showed that the fractional Brownian motion (FBM) method generated the highest area under the ROC curve of the five methods on both datasets (Az = 0.839 for dataset1 and 0.828 for dataset2). Lacunarity analysis showed that ROIs depicting mass lesions and architectural distortion had higher lacunarity than ROIs depicting normal breast parenchyma. The combination of FBM fractal dimension and lacunarity yielded higher Az values (0.903 and 0.875, respectively) than either feature alone on both datasets. Applying the SVM improved the performance of the fractal-based features in differentiating tumor lesions from normal breast parenchyma, generating higher Az values. Conclusion: The FBM texture model is the most appropriate model for characterizing mammographic images, as its self-affinity assumption is the better approximation. Lacunarity is an effective counterpart to the fractal dimension for texture feature extraction in mammographic images. The classification results obtained in this work suggest that the SVM is an effective method with great potential for classification in mammographic image analysis.
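A minimal sketch of gliding-box lacunarity on a grayscale ROI (hypothetical data; the paper's exact estimators are not reproduced), using lacunarity(r) = 1 + var(M)/mean(M)^2, where M is the pixel mass inside each r x r window:

```python
import numpy as np

def lacunarity(image, box_size):
    """Gliding-box lacunarity: 1 + var/mean^2 of window pixel masses."""
    h, w = image.shape
    masses = np.array([
        image[i:i + box_size, j:j + box_size].sum()
        for i in range(h - box_size + 1)
        for j in range(w - box_size + 1)
    ], dtype=float)
    return 1.0 + masses.var() / masses.mean() ** 2

# Hypothetical ROIs: a smooth background vs. a clumpy "lesion-like" patch.
rng = np.random.default_rng(0)
smooth = rng.normal(100, 5, (64, 64)).clip(0)
clumpy = smooth.copy()
clumpy[20:30, 20:30] += 120  # a bright clump raises lacunarity

for r in (4, 8, 16):
    print(r, lacunarity(smooth, r), lacunarity(clumpy, r))
```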
Abstract:
We present a novel topology of the radial basis function (RBF) neural network, referred to as the boundary value constraints (BVC)-RBF, which is able to automatically satisfy a set of BVC. Unlike most existing neural networks, whereby the model is identified via learning from observational data only, the proposed BVC-RBF offers a generic framework that takes into account both deterministic prior knowledge and stochastic data in an intelligent manner. Like a conventional RBF, the proposed BVC-RBF has a linear-in-the-parameters structure, so that many of the existing algorithms for linear-in-the-parameters models are directly applicable. The BVC satisfaction properties of the proposed BVC-RBF are discussed. Finally, numerical examples based on the combined D-optimality-based orthogonal least squares algorithm are used to illustrate the performance of the proposed BVC-RBF.
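The linear-in-the-parameters property means that, once the basis-function centres and widths are fixed, the output weights follow from ordinary least squares; a minimal sketch of a plain RBF model (the BVC construction itself is specific to the paper and is not reproduced here):

```python
import numpy as np

# Hypothetical 1-D regression data.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * x) + 0.05 * rng.standard_normal(x.size)

# Fixed Gaussian centres and width: the model is linear in the weights.
centres = np.linspace(0.0, 1.0, 8)
width = 0.15
Phi = np.exp(-((x[:, None] - centres[None, :]) ** 2) / (2 * width ** 2))

# Ordinary least squares for the output-layer weights.
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
y_hat = Phi @ w
print(f"training RMSE = {np.sqrt(np.mean((y - y_hat) ** 2)):.3f}")
```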
Abstract:
Many kernel classifier construction algorithms adopt classification accuracy as the performance metric in model evaluation. Moreover, equal weighting is often applied to each data sample in parameter estimation. These modeling practices often become problematic if the data sets are imbalanced. We present a kernel classifier construction algorithm using orthogonal forward selection (OFS) in order to optimize model generalization for imbalanced two-class data sets. This kernel classifier identification algorithm is based on a new regularized orthogonal weighted least squares (ROWLS) estimator and a model selection criterion of maximal leave-one-out area under the curve (LOO-AUC) of the receiver operating characteristics (ROC). It is shown that, owing to the orthogonalization procedure, the LOO-AUC can be calculated via an analytic formula based on the new ROWLS parameter estimator, without actually splitting the estimation data set. The proposed algorithm achieves minimal computational expense via a set of forward recursive updating formulae when searching for model terms with maximal incremental LOO-AUC value. Numerical examples are used to demonstrate the efficacy of the algorithm.
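The AUC criterion at the heart of such an algorithm can be computed without tracing the ROC curve explicitly, via its equivalence to the Mann-Whitney statistic; a minimal sketch (the ROWLS estimator and the analytic LOO formula are not reproduced):

```python
import numpy as np

def auc(scores_pos, scores_neg):
    """AUC as P(score_pos > score_neg): the Mann-Whitney statistic."""
    s_pos = np.asarray(scores_pos)[:, None]
    s_neg = np.asarray(scores_neg)[None, :]
    wins = (s_pos > s_neg).sum() + 0.5 * (s_pos == s_neg).sum()
    return wins / (s_pos.size * s_neg.size)

# Hypothetical classifier scores for an imbalanced two-class problem.
pos = [0.9, 0.8, 0.75]                       # minority class
neg = [0.85, 0.6, 0.55, 0.4, 0.3, 0.2, 0.1]  # majority class
print(f"AUC = {auc(pos, neg):.3f}")
```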
Abstract:
Efficient markets should guarantee the existence of zero spreads for total return swaps. However, real estate markets have recorded values that are significantly different from zero in both directions. Possible explanations might suggest non-rational behaviour by inexperienced market players or unusual features of the underlying asset market. We find that institutional characteristics in the underlying market lead to market inefficiencies and, hence, to the creation of a rational trading window with upper and lower bounds within which transactions do not offer arbitrage opportunities. Given the existence of this rational trading window, we also argue that the observed spreads can substantially be explained by trading imbalances due to the limited liquidity of a newly formed market and/or to the effect of market sentiment, complementing explanations based on the lag between underlying market returns and index returns.
Abstract:
A case study on the tendering process and cost/time performance of a public building project in Ghana is conducted. Competitive bids submitted by five contractors for the project, in which contractors were required to prepare their own quantities, were analyzed to compare differences in their pricing levels and risk/requirement perceptions. Queries sent to the consultants at the tender stage were also analyzed to identify the significant areas of concern to contractors in relation to the tender documentation. The five bidding prices were significantly different. The queries submitted for clarifications were significantly different, although a few were similar. Using a before-and-after experiment, the expected cost/time estimate at the start of the project was compared to the actual cost/time values, i.e. what happened in the actual construction phase. The analysis showed that the project exceeded its expected cost by 18% and its planned time by 210%. Variations and inadequate design were the major reasons. Following an exploration of these issues, an alternative tendering mechanism is recommended to clients. A shift away from the conventional approach of awarding work based on price, and serious consideration of alternative procurement routes can help clients in Ghana obtain better value for money on their projects.
An assessment of aerosol‐cloud interactions in marine stratus clouds based on surface remote sensing
Abstract:
An assessment of aerosol-cloud interactions (ACI) from ground-based remote sensing under coastal stratiform clouds is presented. The assessment utilizes a long-term, high temporal resolution data set from the Atmospheric Radiation Measurement (ARM) Program deployment at Pt. Reyes, California, United States, in 2005 to provide statistically robust measures of ACI and to characterize the variability of the measures based on variability in environmental conditions and observational approaches. The average ACI_N (= dlnNd/dlna, the change in cloud drop number concentration with aerosol concentration) is 0.48, within the physically plausible range of 0-1.0. Values vary between 0.18 and 0.69 with dependence on (1) the assumption of constant cloud liquid water path (LWP), (2) the relative value of cloud LWP, (3) the method for retrieving Nd, (4) the aerosol size distribution, (5) updraft velocity, and (6) the scale and resolution of the observations. The sensitivity of the local, diurnally averaged radiative forcing to this variability in ACI_N values, assuming an aerosol perturbation of 500 cm-3 relative to a background concentration of 100 cm-3, ranges between -4 and -9 W m-2. Further characterization of ACI and its variability is required to reduce uncertainties in global radiative forcing estimates.
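A minimal sketch (hypothetical retrievals, not the ARM data) of how ACI_N is obtained: within a narrow LWP bin, regress ln Nd on ln aerosol concentration; the slope is ACI_N.

```python
import numpy as np

# Hypothetical co-located retrievals within one narrow LWP bin.
aerosol = np.array([100., 150., 220., 300., 420., 500.])  # cm^-3
n_drop = np.array([60., 75., 95., 110., 135., 150.])      # cm^-3

# ACI_N = dlnNd/dlna: slope of the log-log regression (plausible range 0-1).
aci_n, _ = np.polyfit(np.log(aerosol), np.log(n_drop), 1)
print(f"ACI_N = {aci_n:.2f}")
```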
Abstract:
Major construction clients are increasingly looking to procure built facilities on the basis of added value, rather than capital cost. Recent advances in the procurement of construction projects have emphasised a whole-life value approach to meeting the client’s objectives, with strategies put in place to encourage long-term commitment and through-life service provision. Construction firms are therefore increasingly required to take on responsibility for the operation and maintenance of the construction project on the client’s behalf - with the emphasis on value and service. This inevitably throws up a host of challenges, not the least of which is the need for construction firms to manage and accommodate the new emphasis on service. Indeed, these ‘service-led’ projects represent a new realm of construction projects where the rationale for the project is driven by client’s objectives with some aspect of service provision. This vision of downstream service delivery increases the number of stakeholders, adds to project complexity and challenges deeply-ingrained working practices. Ultimately it presents a major challenge for the construction sector. This paper sets out to unravel some of the many implications that this change brings with it. It draws upon ongoing research investigating how construction firms can adapt to a more service-orientated built environment and add value in project-based environments. The conclusions lay bare the challenges that firms face when trying to compete on the basis of added-value and service delivery. In particular, how it affects deeply-ingrained working practices and established relationships in the sector.