833 results for Probabilistic methodology
Abstract:
One cubic centimetre potato cubes were blanched, sulfited, dried initially for between 40 and 80 min in air at 90 °C in a cabinet drier, puffed in a high-temperature fluidised bed and then dried for up to 180 min in a cabinet drier. The final moisture content was 0.05 dwb (dry weight basis). The resulting product was optimised using response surface methodology, in terms of volume and colour (L*, a* and b* values) of the dry product, as well as rehydration ratio and texture of the rehydrated product. The operating conditions resulting in the optimised product were found to be blanching for 6 min in water at 100 °C, dipping in 400 ppm sodium metabisulfite solution for 10 min, initial drying for 40 min and puffing in air at 200 °C for 40 s, followed by final drying to a moisture content of 0.05 dwb. © 2003 Elsevier Ltd. All rights reserved.
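As a minimal illustration of the response surface methodology named above (not the study's actual analysis), one can fit a two-factor second-order polynomial to measured responses by least squares and locate its optimum. The factor names and synthetic data below are hypothetical, chosen only to show the mechanics:

```python
import numpy as np

def fit_quadratic_surface(X, y):
    """Least-squares fit of a two-factor second-order response surface:
    y ~ b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def surface_value(coef, x1, x2):
    return (coef[0] + coef[1] * x1 + coef[2] * x2
            + coef[3] * x1**2 + coef[4] * x2**2 + coef[5] * x1 * x2)

# Hypothetical responses peaking at factor settings x1 = 2, x2 = 1.
x1g, x2g = np.meshgrid(np.arange(5.0), np.arange(5.0))
X = np.column_stack([x1g.ravel(), x2g.ravel()])
y = 5.0 - (X[:, 0] - 2) ** 2 - (X[:, 1] - 1) ** 2
coef = fit_quadratic_surface(X, y)
# Grid search over the fitted surface for the optimum settings.
vals = surface_value(coef, X[:, 0], X[:, 1])
best = X[int(np.argmax(vals))]
```

In a real optimisation, several responses (volume, colour, rehydration ratio, texture) would each get a surface, and the operating conditions would be chosen to balance them.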
Abstract:
Objective: To describe the calculations and approaches used to design experimental diets of differing saturated fatty acid (SFA) and monounsaturated fatty acid (MUFA) composition for use in a long-term dietary intervention study, and to evaluate the degree to which the dietary targets were met. Design, setting and subjects: Fifty-one students living in a university hall of residence consumed a reference (SFA) diet for 8 weeks, followed by either a moderate-MUFA (MM) diet or a high-MUFA (HM) diet for 16 weeks. The three diets were designed to differ only in their proportions of SFA and MUFA, while keeping total fat, polyunsaturated fatty acids (PUFA), trans-fatty acids, and the ratios of palmitic to stearic acid and of n-6 to n-3 PUFA unchanged. Results: Using habitual diet records and a standardised database of food fatty acid compositions, a sequential process of theoretical fat substitutions enabled suitable fat sources for the three diets to be identified, and experimental margarines for baking, spreading and the manufacture of snack foods to be designed. The dietary intervention was largely successful in achieving the fatty acid targets of the three diets, although unintended differences between the original target and the analysed fatty acid composition of the experimental margarines resulted in a lower than anticipated MUFA intake on the HM diet, and a lower ratio of palmitic to stearic acid compared with the reference or MM diet. Conclusions: This study has revealed important theoretical considerations that should be taken into account when designing diets of specific fatty acid composition, as well as practical issues of implementation.
Abstract:
Background: Postprandial lipid metabolism in humans has received much attention during the last two decades. Although fasting lipid and lipoprotein parameters reflect body homeostasis to some extent, the transient lipid and lipoprotein accumulation that occurs in the circulation after a fat-containing meal highlights the individual capacity to handle an acute fat input. An exacerbated postprandial accumulation of triglyceride-rich lipoproteins in the circulation has been associated with an increased cardiovascular risk. Methods: The large number of studies published in this field raises the question of the methodology used for such postprandial studies, which is reviewed here. Results: Based on our experience, the present review reports and discusses the numerous methodological issues involved, to serve as a basis for further work. These aspects include the aims of the postprandial tests, the size and nutrient composition of the test meals and background diets, pre-test conditions, the characteristics of the subjects involved, the timing of sampling, suitable markers of postprandial lipid metabolism, and calculations. Conclusion: We stress the need for standardization of postprandial tests.
Abstract:
In this paper, we evaluate the Probabilistic Occupancy Map (POM) pedestrian detection algorithm on the PETS 2009 benchmark dataset. POM is a multi-camera generative detection method, which estimates ground plane occupancy from multiple background subtraction views. Occupancy probabilities are iteratively estimated by fitting a synthetic model of the background subtraction to the binary foreground motion. Furthermore, we test the integration of this algorithm into a larger framework designed for understanding human activities in real environments. We demonstrate accurate detection and localization on the PETS dataset, despite suboptimal calibration and foreground motion segmentation input.
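The generative principle behind POM (fitting a synthetic foreground model to the observed binary mask) can be sketched, heavily simplified, on a 1-D "ground line" seen by a single camera: each cell's occupancy probability is raised when switching that cell on brings the synthetic foreground closer to the observed mask. The function name, the 1-D setting and all parameter values are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def toy_pom(foreground, silhouettes, prior=0.1, lam=3.0, iters=20):
    """Iteratively estimate per-cell occupancy probabilities by fitting a
    synthetic foreground model to an observed binary foreground mask.

    foreground  : (width,) binary array, the observed foreground mask.
    silhouettes : (n_cells, width) binary array; row k is the silhouette a
                  person occupying ground cell k would project into the view.
    """
    n = silhouettes.shape[0]
    q = np.full(n, prior)
    log_prior = np.log(prior / (1.0 - prior))
    for _ in range(iters):
        for k in range(n):
            q_on, q_off = q.copy(), q.copy()
            q_on[k], q_off[k] = 1.0, 0.0
            # Probability each pixel is covered under the two hypotheses.
            synth_on = 1.0 - np.prod(1.0 - q_on[:, None] * silhouettes, axis=0)
            synth_off = 1.0 - np.prod(1.0 - q_off[:, None] * silhouettes, axis=0)
            d_on = np.abs(foreground - synth_on).sum()
            d_off = np.abs(foreground - synth_off).sum()
            # Occupancy is more likely when it reduces the model-image distance.
            q[k] = 1.0 / (1.0 + np.exp(-(log_prior + lam * (d_off - d_on))))
    return q

# Three non-overlapping ground cells; the foreground matches cell 1 exactly.
sil = np.zeros((3, 9))
for k in range(3):
    sil[k, 3 * k:3 * k + 3] = 1.0
q = toy_pom(sil[1].copy(), sil)
```

The full method combines several camera views of the same ground plane, which is what gives it robustness to occlusion and poor segmentation in any single view.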
Abstract:
We introduce a classification-based approach to finding occluding texture boundaries. The classifier is composed of a set of weak learners, which operate on image intensity discriminative features that are defined on small patches and are fast to compute. A database that is designed to simulate digitized occluding contours of textured objects in natural images is used to train the weak learners. The trained classifier score is then used to obtain a probabilistic model for the presence of texture transitions, which can readily be used for line search texture boundary detection in the direction normal to an initial boundary estimate. This method is fast and therefore suitable for real-time and interactive applications. It works as a robust estimator, which requires a ribbon-like search region and can handle complex texture structures without requiring a large number of observations. We demonstrate results both in the context of interactive 2D delineation and of fast 3D tracking and compare its performance with other existing methods for line search boundary detection.
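The line-search step described above reduces to scanning classifier scores along the normal of an initial boundary estimate and keeping the best offset. The sketch below assumes a generic `score_fn` returning a boundary probability at an image point; the names and interface are illustrative, not the paper's API:

```python
import numpy as np

def line_search_boundary(score_fn, p0, normal, half_range=10):
    """Scan along the normal direction of an initial boundary point p0 and
    return the integer offset whose location scores highest."""
    offsets = np.arange(-half_range, half_range + 1)
    scores = [score_fn(p0 + t * normal) for t in offsets]
    return int(offsets[int(np.argmax(scores))])

# Toy score function peaking 3 pixels along the normal from p0.
score = lambda p: -abs(p[0] - 3.0)
best = line_search_boundary(score, np.array([0.0, 0.0]), np.array([1.0, 0.0]))
```

Because each candidate point needs only one classifier evaluation on a small patch, the search stays cheap enough for the interactive and real-time uses mentioned above.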
Abstract:
A new probabilistic neural network (PNN) learning algorithm based on forward constrained selection (PNN-FCS) is proposed. An incremental learning scheme is adopted such that at each step, new neurons, one for each class, are selected from the training samples and the weights of the neurons are estimated so as to minimize the overall misclassification error rate. In this manner, only the most significant training samples are used as neurons. It is shown by simulation that the resulting PNN-FCS networks achieve classification performance comparable to other types of classifiers, but with much smaller model sizes than a conventional PNN.
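A minimal sketch of this idea, under simplifying assumptions (Gaussian kernels, a fixed smoothing parameter, exhaustive candidate search, and hypothetical function names): at each step one training sample per class is promoted to a neuron, chosen to minimize the training misclassification rate, so only a small subset of samples ends up in the network:

```python
import numpy as np

def pnn_predict(centers, labels, X, sigma=0.5):
    """Classic PNN decision: sum a Gaussian kernel over each class's
    neurons and predict the class with the largest summed activation."""
    classes = np.unique(labels)
    scores = np.zeros((len(X), len(classes)))
    for j, c in enumerate(classes):
        C = centers[labels == c]
        d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)
        scores[:, j] = np.exp(-d2 / (2.0 * sigma ** 2)).sum(1)
    return classes[scores.argmax(1)]

def pnn_fcs(X, y, n_steps=2, sigma=0.5):
    """Greedy forward selection: at each step add one neuron per class,
    picking the candidate that minimizes the training error rate."""
    sel = np.zeros(len(X), dtype=bool)
    for _ in range(n_steps):
        for c in np.unique(y):
            best_err, best_i = np.inf, None
            for i in np.where((y == c) & ~sel)[0]:
                trial = sel.copy()
                trial[i] = True
                err = (pnn_predict(X[trial], y[trial], X, sigma) != y).mean()
                if err < best_err:
                    best_err, best_i = err, i
            if best_i is not None:
                sel[best_i] = True
    return sel

# Two well-separated 2-D classes; two selection steps keep only 4 neurons.
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1], [-0.1, 0.0],
              [3.0, 3.0], [3.1, 3.0], [3.0, 3.1], [2.9, 3.0]])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])
sel = pnn_fcs(X, y)
err = (pnn_predict(X[sel], y[sel], X) != y).mean()
```

A conventional PNN would store all eight samples as neurons; the selected subset reaches the same training accuracy with half of them, which is the compression the abstract reports.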
Abstract:
Based on the idea of an important cluster, a new multi-level probabilistic neural network (MLPNN) is introduced. The MLPNN uses an incremental constructive approach, i.e. it grows level by level. The construction algorithm of the MLPNN is designed so that the classification accuracy increases monotonically, ensuring that the classification accuracy of the MLPNN is higher than or equal to that of the traditional PNN. Numerical examples are included to demonstrate the effectiveness of the proposed approach.
Abstract:
This paper presents a preface to this Special Issue on the results of the QUEST-GSI (Global Scale Impacts) project on climate change impacts on catchment-scale water resources. A detailed description of the unified methodology, subsequently used in all studies in this issue, is provided. The project method involved running simulations of catchment-scale hydrology using a unified set of past and future climate scenarios, to enable a consistent analysis of the climate impacts around the globe. These scenarios include "policy-relevant" prescribed warming scenarios. This is followed by a synthesis of the key findings. Overall, the studies indicate that in most basins the models project substantial changes to river flow, beyond that observed in the historical record, but that in many cases there is considerable uncertainty in the magnitude and sign of the projected changes. The implications of this for adaptation activities are discussed.
Abstract:
Considers the application of value management during the briefing and outline design stages of building developments.
Abstract:
This paper provides a new set of theoretical perspectives on the topic of value management in building procurement. On the evidence of the current literature it is possible to identify two distinct methodologies, based on different epistemological positions. An argument is developed which sees these two methodologies as complementary. A tentative meta-methodology is then outlined for matching methodologies to different problem situations. It is contended, however, that such a meta-methodology could never provide a prescriptive guide; its usefulness lies in the way it provides a basis for reflective practice. Of central importance is the need to understand the problem context within which value management is to be applied. The distinctions between unitary, pluralistic and coercive situations are seen to be especially significant.