Abstract:
This paper examines the implications of policy fracture and arm's-length governance within the decision-making processes currently shaping curriculum design within the English education system. In particular, it argues that an unresolved ‘ideological fracture’ at government level has been passed down to school leaders, whose response to the dilemma is distorted by the target-driven agenda of arm's-length agencies. Drawing upon the findings of a large-scale online survey of history teaching in English secondary schools, this paper illustrates the problems that occur when policy making is divorced from curriculum theory, and in particular from any consideration of the nature of knowledge. Drawing on the social realist theory of knowledge elaborated by Young (2008), we argue that the rapid spread of alternative curricular arrangements, implemented in the absence of an understanding of curriculum theory, undermines the value of disciplined thinking to the detriment of many young people, particularly those in areas of social and economic deprivation.
Abstract:
This study was undertaken to explore gel permeation chromatography (GPC) for estimating molecular weights of proanthocyanidin fractions isolated from sainfoin (Onobrychis viciifolia). The results were compared with data obtained by thiolytic degradation of the same fractions. Polystyrene, polyethylene glycol and polymethyl methacrylate standards were not suitable for estimating the molecular weights of underivatized proanthocyanidins. Therefore, a novel HPLC-GPC method was developed based on two serially connected PolarGel-L columns using DMF that contained 5% water, 1% acetic acid and 0.15 M LiBr at 0.7 ml/min and 50 degrees C. This yielded a single calibration curve for galloyl glucoses (trigalloyl glucose, pentagalloyl glucose), ellagitannins (pedunculagin, vescalagin, punicalagin, oenothein B, gemin A), proanthocyanidins (procyanidin B2, cinnamtannin B1), and several other polyphenols (catechin, epicatechin gallate, epigallocatechin gallate, amentoflavone). These GPC-predicted molecular weights represented a considerable advance over previously reported HPLC-GPC methods for underivatized proanthocyanidins. (C) 2011 Elsevier B.V. All rights reserved.
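The calibration step described above can be sketched in outline: a GPC calibration curve is a log-linear fit of molecular weight against elution time for known standards, which is then read off for unknown fractions. The (time, molecular weight) pairs below are invented placeholders, not the calibration data from this study.

```python
# Sketch of a single GPC calibration curve: fit log10(molecular weight)
# of known standards against elution time, then estimate unknown
# fractions from the fitted line. All values are hypothetical.
import numpy as np

# Hypothetical standards: elution time (min) and molecular weight (Da)
times = np.array([10.2, 11.0, 11.9, 12.8, 13.7])
mw = np.array([1870.0, 940.0, 458.0, 290.0, 170.0])

# Log-linear calibration: larger molecules elute earlier in GPC
slope, intercept = np.polyfit(times, np.log10(mw), 1)

def predict_mw(t):
    """Estimate molecular weight from elution time via the calibration."""
    return 10.0 ** (slope * t + intercept)
```

A single curve covering structurally different polyphenols, as reported above, means one such fit suffices for all the listed compound classes.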
Abstract:
This article explores the nature and impact of path dependence in British rail coal haulage before 1939. It examines the factors which locked Britain's railways into a system of small coal wagons with highly fragmented ownership, the cost penalties of this system, and the reasons that attempts at modernization were unsuccessful. The analysis highlights the importance of decentralized ownership of a highly durable installed base of complementary infrastructure. Technical and institutional interrelatedness blocked incremental modernization, while the political requirement to compensate private wagon owners for the loss of their wagon stock made wholesale rationalization financially unattractive.
Abstract:
The evaluation of investment fund performance has been one of the main developments of modern portfolio theory. Most studies employ the technique developed by Jensen (1968), which compares a particular fund's returns to a benchmark portfolio of equal risk. However, the standard measures of fund manager performance are known to suffer from a number of problems in practice. In particular, previous studies implicitly assume that the risk level of the portfolio is stationary through the evaluation period. That is, unconditional measures of performance do not account for the fact that risk and expected returns may vary with the state of the economy. Therefore many of the problems encountered in previous performance studies reflect the inability of traditional measures to handle the dynamic behaviour of returns. As a consequence, Ferson and Schadt (1996) suggest an approach to performance evaluation called conditional performance evaluation, which is designed to address this problem. This paper utilises such a conditional measure of performance on a sample of 27 UK property funds over the period 1987-1998. The results suggest that once the time-varying nature of the funds' betas is corrected for, by the addition of the market indicators, average fund performance shows an improvement over that reported by the traditional methods of analysis.
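The conditional approach mentioned above can be illustrated with a minimal regression sketch: the fund's excess return is regressed on the excess market return plus interaction terms between the market return and lagged, demeaned information variables z, so that beta varies with the state of the economy. The data here are simulated and the variable names are illustrative assumptions, not the paper's sample.

```python
# Sketch of a Ferson-Schadt style conditional regression:
#   r_fund,t = alpha + (b0 + B' z_{t-1}) * r_mkt,t + e_t
# estimated by OLS with market-by-z interaction columns.
# All series below are simulated for illustration only.
import numpy as np

rng = np.random.default_rng(0)
T = 240
z = rng.normal(size=(T, 2))                     # lagged, demeaned info variables
mkt = 0.005 + rng.normal(scale=0.04, size=T)    # excess market return
true_beta = 0.9 + z @ np.array([0.3, -0.2])     # time-varying beta
fund = 0.001 + true_beta * mkt + rng.normal(scale=0.01, size=T)

# Design matrix: intercept (conditional alpha), market, market x z interactions
X = np.column_stack([np.ones(T), mkt, z * mkt[:, None]])
coef, *_ = np.linalg.lstsq(X, fund, rcond=None)
alpha, b0, b1, b2 = coef
```

Under an unconditional (Jensen) regression the interaction columns are omitted, and the time variation in beta is forced into alpha, which is exactly the bias the conditional measure removes.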
Abstract:
The position of real estate within a multi-asset portfolio has received considerable attention recently. Previous research has concentrated on the percentage holding property would achieve given its risk/return characteristics. Such studies have invariably used Modern Portfolio Theory, and these approaches have been criticised both for the quality of the real estate data and for problems with the methodology itself. The first problem is now well understood, and the second can be addressed by the use of realistic constraints on asset holdings. This paper takes a different approach. We determine the level of return that real estate needs to achieve to justify an allocation within the multi-asset portfolio. In order to test the importance of the quality of the data, we use historic appraisal-based and desmoothed returns to examine the sensitivity of the results. Consideration is also given to the holding period and the imposition of realistic constraints on the asset holdings in order to model portfolios held by pension fund investors. We conclude, using several benchmark levels of portfolio risk and return, that using appraisal-based data the required level of return for real estate was less than that achieved over the period 1972-1993. The use of desmoothed series can reverse this result at the highest levels of desmoothing, although within a restricted holding period real estate offered returns in excess of those required to enter the portfolio and might have a role to play in the multi-asset portfolio.
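The hurdle-return logic described above can be sketched under standard mean-variance assumptions: an asset improves an existing portfolio only if its expected return exceeds the risk-free rate plus its beta against that portfolio times the portfolio's excess return. All figures below are invented, not the paper's 1972-1993 estimates.

```python
# Sketch of the required-return test for adding an asset to a portfolio:
#   required = rf + beta(asset, portfolio) * (E[r_portfolio] - rf)
# The asset justifies an allocation only if E[r_asset] > required.
# All return series and parameters are simulated/assumed.
import numpy as np

rng = np.random.default_rng(2)
# Simulated annual returns: equities and bonds form the existing portfolio
eq = rng.normal(0.10, 0.18, 10000)
bd = rng.normal(0.06, 0.08, 10000)
prop = 0.3 * eq + rng.normal(0.05, 0.06, 10000)  # property, partly equity-linked

port = 0.6 * eq + 0.4 * bd   # current mixed-asset portfolio
rf = 0.04                    # assumed risk-free rate

beta = np.cov(prop, port)[0, 1] / np.var(port)
required = rf + beta * (port.mean() - rf)  # hurdle return for property
enters = prop.mean() > required            # does property earn a place?
```

Appraisal smoothing lowers the measured covariance (and hence beta), which lowers the hurdle; desmoothing raises it, which is why the paper's conclusion can flip at high levels of desmoothing.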
Abstract:
An important test of the quality of a computational model is its ability to reproduce standard test cases or benchmarks. For steady open-channel flow based on the Saint Venant equations some benchmarks exist for simple geometries from the work of Bresse, Bakhmeteff and Chow but these are tabulated in the form of standard integrals. This paper provides benchmark solutions for a wider range of cases, which may have a nonprismatic cross section, nonuniform bed slope, and transitions between subcritical and supercritical flow. This makes it possible to assess the underlying quality of computational algorithms in more difficult cases, including those with hydraulic jumps. Several new test cases are given in detail and the performance of a commercial steady flow package is evaluated against two of them. The test cases may also be used as benchmarks for both steady flow models and unsteady flow models in the steady limit.
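As a minimal illustration of the kind of computation such benchmarks exercise, the gradually varied flow equation for a prismatic rectangular channel with Manning friction can be integrated numerically; the paper's benchmarks go well beyond this (nonprismatic sections, nonuniform slope, sub/supercritical transitions). All channel parameters below are hypothetical.

```python
# Sketch: gradually varied (backwater) profile in a rectangular channel,
#   dh/dx = (S0 - Sf) / (1 - Fr^2),
# integrated with 4th-order Runge-Kutta. Illustrative only; not one of
# the paper's benchmark cases. All parameter values are hypothetical.

g = 9.81  # gravitational acceleration (m/s^2)

def dh_dx(h, q, b, s0, n):
    """Right-hand side of the gradually varied flow equation."""
    a = b * h                              # flow area (m^2)
    r = a / (b + 2.0 * h)                  # hydraulic radius (m)
    v = q / a                              # mean velocity (m/s)
    sf = (n * v) ** 2 / r ** (4.0 / 3.0)   # Manning friction slope
    fr2 = v ** 2 / (g * h)                 # Froude number squared
    return (s0 - sf) / (1.0 - fr2)

def profile(h0, q, b, s0, n, dx, steps):
    """March the water-surface profile from a control depth h0."""
    h, hs = h0, [h0]
    for _ in range(steps):
        k1 = dh_dx(h, q, b, s0, n)
        k2 = dh_dx(h + 0.5 * dx * k1, q, b, s0, n)
        k3 = dh_dx(h + 0.5 * dx * k2, q, b, s0, n)
        k4 = dh_dx(h + dx * k3, q, b, s0, n)
        h += dx / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        hs.append(h)
    return hs

# Hypothetical M1 backwater curve: mild slope, depth above normal depth,
# integrated upstream (negative dx) from a downstream control section.
hs = profile(h0=2.0, q=5.0, b=4.0, s0=0.001, n=0.03, dx=-10.0, steps=100)
```

A benchmark of the kind the paper provides would supply the exact profile for such a case, against which the numerical depths `hs` could be checked.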
Abstract:
Acrylamide, a chemical that is probably carcinogenic in humans and has neurological and reproductive effects, forms from free asparagine and reducing sugars during high-temperature cooking and processing of common foods. Potato and cereal products are major contributors to dietary exposure to acrylamide, and while the food industry reacted rapidly to the discovery of acrylamide in some of the most popular foods, the issue remains a difficult one for many sectors. Efforts to reduce acrylamide formation would be greatly facilitated by the development of crop varieties with lower concentrations of free asparagine and/or reducing sugars, and of best agronomic practice to ensure that concentrations are kept as low as possible. This review describes how acrylamide is formed, the factors affecting free asparagine and sugar concentrations in crop plants, and the sometimes complex relationship between precursor concentration and acrylamide-forming potential. It covers some of the strategies being used to reduce free asparagine and sugar concentrations through genetic modification and other genetic techniques, such as the identification of quantitative trait loci. The link between acrylamide formation, flavour, and colour is discussed, as well as the difficulty of balancing the unknown risk of exposure to acrylamide in the levels that are present in foods with the well-established health benefits of some of the foods concerned. Key words: Amino acids, asparagine, cereals, crop quality, food safety, Maillard reaction, potato, rye, sugars, wheat.
Abstract:
Developing brief training interventions that benefit different forms of problem solving is challenging. In earlier research, Chrysikou (2006) showed that engaging in a task requiring generation of alternative uses of common objects improved subsequent insight problem solving. These benefits were attributed to a form of implicit transfer of processing involving enhanced construction of impromptu, on-the-spot or ‘ad hoc’ goal-directed categorizations of the problem elements. Following this, it is predicted that the alternative uses exercise should benefit abilities that govern goal-directed behaviour, such as fluid intelligence and executive functions. Similarly, an indirect intervention – self-affirmation (SA) – that has been shown to enhance cognitive and executive performance after self-regulation challenge and when under stereotype threat, may also increase adaptive goal-directed thinking and likewise should bolster problem-solving performance. In Experiment 1, brief single-session interventions, involving either alternative uses generation or SA, significantly enhanced both subsequent insight and visual-spatial fluid reasoning problem solving. In Experiment 2, we replicated the finding of benefits of both alternative uses generation and SA on subsequent insight problem-solving performance, and demonstrated that the underlying mechanism likely involves improved executive functioning. Even brief cognitive- and social-psychological interventions may substantially bolster different types of problem solving and may exert largely similar facilitatory effects on goal-directed behaviours.
Abstract:
A precipitation downscaling method is presented using precipitation from a general circulation model (GCM) as predictor. The method extends a previous method from monthly to daily temporal resolution. The simplest form of the method corrects for biases in wet-day frequency and intensity. A more sophisticated variant also takes account of flow-dependent biases in the GCM. The method is flexible and simple to implement. It is proposed here as a correction of GCM output for applications where sophisticated methods are not available, or as a benchmark for the evaluation of other downscaling methods. Applied to output from reanalyses (ECMWF, NCEP) in the region of the European Alps, the method is capable of reducing large biases in the precipitation frequency distribution, even for high quantiles. The two variants exhibit similar performance, but the ideal choice of method can depend on the GCM/reanalysis, and it is recommended to test the methods in each case. Limitations of the method are found in small areas where unresolved topographic detail influences higher-order statistics (e.g. high quantiles). When used as a benchmark for three regional climate models (RCMs), the corrected reanalysis and the RCMs perform similarly in many regions, but the added value of the latter is evident for high quantiles in some small regions.
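The simplest variant described above can be sketched as follows: choose a threshold on the model series so that its wet-day frequency matches observations, then scale the wet-day intensities so the mean wet-day intensity matches as well. The synthetic series below stand in for a calibration period and are purely illustrative.

```python
# Sketch of a wet-day frequency and intensity correction for daily
# GCM precipitation (the simplest variant described in the abstract).
# The observed/model series are synthetic and purely illustrative.
import numpy as np

def calibrate(obs, mod, wet_mm=0.1):
    """Fit (threshold, scale) on a calibration period."""
    wet_frac = np.mean(obs >= wet_mm)        # observed wet-day frequency
    thr = np.quantile(mod, 1.0 - wet_frac)   # model threshold matching it
    mod_wet = mod[mod >= thr]
    obs_wet = obs[obs >= wet_mm]
    scale = obs_wet.mean() / mod_wet.mean()  # match mean wet-day intensity
    return thr, scale

def correct(mod, thr, scale):
    """Apply the correction: dry below the threshold, rescale above it."""
    return np.where(mod >= thr, mod * scale, 0.0)

rng = np.random.default_rng(1)
# Synthetic observations: ~40% wet days with gamma-distributed amounts
obs = np.where(rng.random(5000) < 0.4, rng.gamma(0.7, 8.0, 5000), 0.0)
# Synthetic "drizzle-prone" model: rain on nearly every day, too light
mod = rng.gamma(0.5, 3.0, 5000)

thr, scale = calibrate(obs, mod)
corrected = correct(mod, thr, scale)
```

By construction the corrected series reproduces the observed wet-day frequency and mean wet-day intensity; matching the full frequency distribution, including high quantiles, is what the quantile-based evaluation in the abstract assesses.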