9 results for duplicate elimination
in CentAUR: Central Archive at the University of Reading - UK
Abstract:
Projections of future global sea level depend on reliable estimates of changes in the size of polar ice sheets. Calculating this directly from global general circulation models (GCMs) is unreliable because the coarse resolution of 100 km or more is unable to capture narrow ablation zones, and ice dynamics is not usually taken into account in GCMs. To overcome these problems a high-resolution (20 km) dynamic ice sheet model has been coupled to the third Hadley Centre Coupled Ocean-Atmosphere GCM (HadCM3). A novel feature is the use of two-way coupling, so that climate changes in the GCM drive ice mass changes in the ice sheet model that, in turn, can alter the future climate through changes in orography, surface albedo, and freshwater input to the model ocean. At the start of the main experiment the atmospheric carbon dioxide concentration was increased to 4 times the preindustrial level and held constant for 3000 yr. By the end of this period the Greenland ice sheet is almost completely ablated and has made a direct contribution of approximately 7 m to global average sea level, causing a peak rate of sea level rise of 5 mm yr⁻¹ early in the simulation. The effect of ice sheet depletion on global and regional climate has been examined, and it was found that, apart from the sea level rise, the long-term effect on global climate is small. However, there are some significant regional climate changes that appear to have reduced the rate at which the ice sheet ablates.
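The two-way coupling described in this abstract can be pictured as a simple exchange loop. The Python sketch below is purely illustrative: the class names, exchanged fields, and numerical responses are invented, and only the structure of the feedback (GCM climate drives ice mass change, which feeds orography, albedo, and freshwater back to the GCM) reflects the abstract.

```python
# Toy sketch of two-way coupling between a coarse climate model and a
# high-resolution ice sheet model. All names and numbers are hypothetical.

class ClimateModel:
    def __init__(self, co2_ppm):
        self.co2_ppm = co2_ppm
        self.temperature = 288.0  # global mean surface temperature, K (toy value)

    def step(self, orography, albedo, freshwater_flux):
        # Ice-sheet state feeds back on the simulated climate (toy response).
        self.temperature += 0.001 * self.co2_ppm / 280.0   # CO2 forcing
        self.temperature -= 0.0005 * albedo                # reflective ice cools
        self.temperature -= 0.0002 * freshwater_flux       # meltwater into the ocean
        return {"temperature": self.temperature,
                "precipitation": 1.0 - 0.0001 * orography}

class IceSheetModel:
    def __init__(self, volume_m_sle=7.0):
        self.volume_m_sle = volume_m_sle  # ice volume in metres of sea level equivalent

    def step(self, climate_fields):
        # Warmer climate -> more ablation (toy surface mass balance).
        melt = max(0.0, 0.002 * (climate_fields["temperature"] - 288.0))
        accumulation = 0.0005 * climate_fields["precipitation"]
        self.volume_m_sle = max(0.0, self.volume_m_sle - melt + accumulation)
        # Return the fields the climate model needs for the next step.
        return {"orography": 3000.0 * self.volume_m_sle / 7.0,
                "albedo": 0.8 if self.volume_m_sle > 0.1 else 0.2,
                "freshwater_flux": melt}

climate = ClimateModel(co2_ppm=4 * 280)  # 4 x preindustrial CO2
ice = IceSheetModel()
feedback = {"orography": 3000.0, "albedo": 0.8, "freshwater_flux": 0.0}
for year in range(3000):
    fields = climate.step(**feedback)
    feedback = ice.step(fields)
print(f"Sea level contribution: {7.0 - ice.volume_m_sle:.2f} m")
```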
Abstract:
Introduction: A high saturated fatty acid intake is a well recognized risk factor for coronary heart disease development. More recently, a high intake of n-6 polyunsaturated fatty acids (PUFA) in combination with a low intake of the long-chain n-3 PUFA eicosapentaenoic acid and docosahexaenoic acid has also been implicated as an important risk factor.
Aim: To compare total dietary fat and fatty acid intake measured by chemical analysis of duplicate diets with nutritional database analysis of estimated dietary records, collected over the same 3-day study period.
Methods: Total fat was analysed using Soxhlet extraction, and the individual fatty acid content of the diet was subsequently determined by gas chromatography. Estimated dietary records were analysed using a nutrient database that was supplemented with a selection of dishes commonly consumed by study participants.
Results: Bland & Altman statistical analysis demonstrated a lack of agreement between the two dietary assessment techniques for determining dietary fat and fatty acid intake.
Conclusion: The lack of agreement observed between dietary evaluation techniques may be attributed to inadequacies in either or both assessment techniques. This study highlights the difficulties that may be encountered when attempting to accurately evaluate dietary fat intake in the population.
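For readers unfamiliar with the Bland & Altman method cited in the results, the following Python sketch shows how the agreement statistics (bias and 95% limits of agreement) are typically computed; the intake values are invented for illustration only.

```python
# Sketch of a Bland & Altman agreement analysis between two dietary
# assessment methods. The data are fabricated purely to show the calculation.
import numpy as np

# Total fat intake (g/day) for the same subjects estimated by two methods.
duplicate_diet = np.array([72.0, 85.0, 60.0, 95.0, 78.0, 66.0, 88.0, 70.0])
diet_record    = np.array([80.0, 79.0, 70.0, 88.0, 90.0, 60.0, 95.0, 82.0])

differences = duplicate_diet - diet_record
bias = differences.mean()                        # mean difference between methods
sd = differences.std(ddof=1)                     # SD of the differences
limits = (bias - 1.96 * sd, bias + 1.96 * sd)    # 95% limits of agreement

print(f"Bias: {bias:.1f} g/day")
print(f"95% limits of agreement: {limits[0]:.1f} to {limits[1]:.1f} g/day")
# Limits that are wide relative to mean intake indicate poor agreement,
# i.e. the two methods cannot be used interchangeably.
```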
Abstract:
A fundamental principle in practical nonlinear data modeling is the parsimonious principle of constructing the minimal model that explains the training data well. Leave-one-out (LOO) cross validation is often used to estimate generalization error when choosing amongst different network architectures (M. Stone, "Cross-validatory choice and assessment of statistical predictions", J. R. Statist. Soc., Ser. B, 36, pp. 117-147, 1974). Based upon the minimization of LOO criteria, namely the mean square of the LOO errors for regression and the LOO misclassification rate for classification, we present two backward elimination algorithms as model post-processing procedures for regression and classification problems. The proposed backward elimination procedures exploit an orthogonalization procedure to ensure orthogonality between the subspace spanned by the pruned model and the deleted regressor. It is then shown that the LOO criteria used in both algorithms can be calculated via analytic recursive formulae, as derived in this contribution, without actually splitting the estimation data set, thereby reducing computational expense. Compared to most other model construction methods, the proposed algorithms are advantageous in several respects: (i) there are no tuning parameters to be optimized through an extra validation data set; (ii) the procedure is fully automatic, without an additional stopping criterion; and (iii) the model structure selection is directly based on model generalization performance. Illustrative examples on regression and classification demonstrate that the proposed algorithms are viable post-processing methods to prune a model for extra sparsity and improved generalization.
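The Python sketch below illustrates the general idea of LOO-driven backward elimination for a linear-in-the-parameters model. It is not the authors' recursive orthogonal algorithm: for clarity, each candidate deletion recomputes the LOO mean-squared error directly from the hat matrix using the identity e_i(LOO) = e_i / (1 - h_ii).

```python
# Greedy backward elimination guided by the leave-one-out mean-squared error.
# Simplified sketch: LOO errors come from the hat-matrix identity rather than
# from the paper's recursive orthogonal-decomposition formulae.
import numpy as np

def loo_mse(Phi, y):
    """LOO mean-squared error of least squares on regressor matrix Phi."""
    H = Phi @ np.linalg.pinv(Phi)          # hat matrix
    residuals = y - H @ y
    return np.mean((residuals / (1.0 - np.diag(H))) ** 2)

def backward_eliminate(Phi, y):
    """Delete regressors one at a time while the LOO MSE keeps improving."""
    active = list(range(Phi.shape[1]))
    best = loo_mse(Phi[:, active], y)
    while len(active) > 1:
        scores = [(loo_mse(Phi[:, [j for j in active if j != k]], y), k)
                  for k in active]
        score, worst = min(scores)
        if score >= best:                  # no further improvement: stop
            break
        active.remove(worst)
        best = score
    return active, best

# Toy example: y depends on two of five candidate regressors plus noise.
rng = np.random.default_rng(0)
Phi = rng.normal(size=(100, 5))
y = 2.0 * Phi[:, 0] - 1.5 * Phi[:, 3] + 0.1 * rng.normal(size=100)
selected, err = backward_eliminate(Phi, y)
print("Selected regressors:", selected, "LOO MSE:", round(err, 4))
```

Recomputing the hat matrix for every candidate deletion is exactly the expense that the analytic recursive formulae in the paper are designed to avoid; this sketch only conveys the selection logic.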
Abstract:
A fast backward elimination algorithm is introduced based on a QR decomposition and Givens transformations to prune radial-basis-function networks. Nodes are sequentially removed using an increment of error variance criterion. The procedure is terminated by using a prediction risk criterion so as to obtain a model structure with good generalisation properties. The algorithm can be used to postprocess radial basis centres selected using a k-means routine and, in this mode, it provides a hybrid supervised centre selection approach.
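A simplified Python sketch of the hybrid approach is given below: centres are first placed by k-means, then pruned by backward elimination, with each deletion scored by its increment in error variance and termination governed by a prediction-risk criterion (generalised cross-validation here, chosen as an assumption rather than taken from the paper). The QR decomposition and Givens transformations that make the published algorithm fast are omitted.

```python
# Hybrid RBF centre selection: unsupervised placement by k-means, followed by
# supervised pruning. Each deletion is scored by the increase in residual error,
# and pruning stops when the prediction-risk criterion (GCV) stops improving.
import numpy as np
from sklearn.cluster import KMeans

def design_matrix(X, centres, width=1.0):
    """Gaussian RBF design matrix."""
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def sse(Phi, y):
    """Residual sum of squares of the least-squares fit."""
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return ((y - Phi @ w) ** 2).sum()

def gcv(Phi, y):
    """Generalised cross-validation score used as the prediction-risk criterion."""
    n, p = Phi.shape
    return sse(Phi, y) / (n * (1.0 - p / n) ** 2)

# Toy 1-D regression data.
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

# Unsupervised centre selection by k-means.
centres = KMeans(n_clusters=20, n_init=10, random_state=1).fit(X).cluster_centers_
active = list(range(len(centres)))
best = gcv(design_matrix(X, centres[active]), y)

# Supervised pruning: remove the centre whose deletion adds the least error,
# while the prediction-risk criterion keeps decreasing.
while len(active) > 1:
    _, worst = min(
        (sse(design_matrix(X, centres[[j for j in active if j != k]]), y), k)
        for k in active)
    pruned = [j for j in active if j != worst]
    risk = gcv(design_matrix(X, centres[pruned]), y)
    if risk >= best:          # prediction risk no longer improves: stop
        break
    active, best = pruned, risk

print(f"Kept {len(active)} of {len(centres)} centres, GCV = {best:.4f}")
```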