897 results for product design optimality


Relevance: 30.00%

Abstract:

The objectives of the study were (a) to examine which information and design elements on dairy product packages operate as cues in consumer evaluations of product healthfulness, and (b) to measure the degree to which consumers voluntarily attend to these elements during product choice. Visual attention was measured by means of eye-tracking. Task (free viewing, product healthfulness evaluation, and purchase likelihood evaluation) and product (five different yoghurt products) were varied in a mixed within-between-subjects design. The free viewing condition served as a baseline against which increases or decreases in attention during product healthfulness evaluation and purchase likelihood evaluation were assessed. The analysis revealed that the only element operating as a health cue during product healthfulness evaluation was the nutrition label. The information cues used during purchase likelihood evaluation were the name of the product category and the nutrition label. Taken together, the results suggest that the only information element that consumers consistently utilize as a health cue is the nutrition label, and that only a limited amount of attention is devoted to reading nutrition labels during purchase likelihood evaluations. The study also revealed that the probability that a consumer will read the nutrition label during the purchase decision process is associated with gender, body mass index and health motivation.

Relevance: 30.00%

Abstract:

Most factorial experiments in industrial research form one stage in a sequence of experiments, so considerable prior knowledge is often available from earlier stages. A Bayesian A-optimality criterion is proposed for choosing designs when each stage in experimentation consists of a small number of runs and the objective is to optimise a response. Simple formulae for the weights are developed, some examples of the use of the design criterion are given, and general recommendations are made. (C) 2003 Elsevier B.V. All rights reserved.

Relevance: 30.00%

Abstract:

A nickel catalyst was modeled with the ligand L2, [NH=CH-CH=CH-O](-), which should have potential use as a syndiotactic polyolefin catalyst, and the reaction mechanism was studied by theoretical calculations using the density functional method at the B3LYP/LANL2MB level. The mechanism involves the formation of the intermediate [NiL2Me](+), in which the metal occupies a T-shaped geometry. This intermediate has two possible structures, with the methyl group trans either to the oxygen or to the nitrogen atom of L2. The results show that both structures can lead to the desired product via similar reaction paths, A and B. Thus, the polymerization could be considered as taking place either with the alkyl group occupying the position trans to the Ni-O bond or trans to the Ni-N bond in the catalyst. The polymerization process thus favors the catalysis of syndiotactic polyolefins. The syndiotactic synthesis effects could also be enhanced by variations in the ligand substituents. From energy considerations, we can conclude that it is more favorable for the methyl group to occupy the trans-O position to form a complex than to occupy the trans-N position. From bond length considerations, it is also more favorable for ethene to occupy the trans-O position than the trans-N position.

Relevance: 30.00%

Abstract:

Modern organisms are adapted to a wide variety of habitats and lifestyles. The processes of evolution have led to the complex, interdependent, well-designed mechanisms of today's world, and the research challenge is to transpose these innovative solutions to problems in the context of architectural design practice, that is, to relate design by nature to design by human. In a design-by-human environment, design synthesis can be performed with rapid prototyping techniques that make it possible to transform almost instantaneously any 2D design representation into a physical three-dimensional model on a rapid prototyping printer. Rapid prototyping processes add layers of material one on top of another until a complete model is built, and an analogy can be drawn with design by nature, where the natural deposition of earth layers shapes the earth's surface, a process that recurs over long periods of time. Concurrence in design will particularly benefit from rapid prototyping techniques, as the prime purpose of physical prototyping is to promptly assist iterative design, enabling design participants to work with a three-dimensional hard copy and use it to validate their design ideas. Concurrent design is a systematic approach that aims to facilitate the simultaneous involvement and commitment of all participants in the building design process, enabling both an effective reduction of time and costs at the design phase and a quality improvement of the design product. This paper presents the results of an exploratory survey investigating both how computer-aided design systems help designers to fully define the shape of their design ideas and the extent to which design practice applies rapid prototyping technologies coupled with Internet facilities.
The findings suggest that design practitioners recognize that these technologies can greatly enhance concurrence in design, though they acknowledge a lack of knowledge regarding rapid prototyping.

Relevance: 30.00%

Abstract:

The journey from the concept of a building to the actual built form is mediated by the use of various artefacts, such as drawings, product samples and models. These artefacts are produced for different purposes and for people with different levels of understanding of the design and construction processes. This paper studies design practice as it occurs naturally in a real-world situation by observing the conversations that surround the use of artefacts at the early stages of a building's design. Drawing on ethnographic data, the paper gives insights into how the use of artefacts can reveal a participant's understanding of the scheme. The appropriateness of conversation analysis as a method for revealing the users' understanding of a scheme is explored by observing spoken micro-interactional behaviours. It is shown that the users' understanding of the design was developed in the conversations around the use of artefacts, as well as from the knowledge embedded in the artefacts themselves. The users' confidence in the appearance of the building was gained in conversation, rather than through the ability of the artefacts to represent a future reality.

Relevance: 30.00%

Abstract:

An efficient model identification algorithm for a large class of linear-in-the-parameters models is introduced that simultaneously optimises model approximation ability, sparsity and robustness. The model parameters derived in each forward regression step are initially estimated via orthogonal least squares (OLS) and then tuned with a new gradient-descent learning algorithm based on basis pursuit, which minimises the l1 norm of the parameter estimate vector. The model subset selection cost function includes a D-optimality design criterion that maximises the determinant of the design matrix of the subset, to ensure model robustness and to enable the model selection procedure to terminate automatically at a sparse model. The proposed approach is based on the forward OLS algorithm using the modified Gram-Schmidt procedure. Both the parameter tuning procedure, based on basis pursuit, and the model selection criterion, based on D-optimality, which is effective in ensuring model robustness, are integrated with the forward regression. As a consequence, the inherent computational efficiency associated with the conventional forward OLS approach is maintained in the proposed algorithm. Examples demonstrate the effectiveness of the new approach.
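The abstract above describes its selection criterion only in words, but the idea is compact enough to sketch. Below is a minimal, illustrative numpy version of forward subset selection under a cost that combines residual error with a D-optimality term (the log-determinant of the subset design matrix); the toy data, the weighting `beta`, and the plain least-squares refit are assumptions for illustration, not the paper's OLS/Gram-Schmidt implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y depends on two of five candidate regressors.
X = rng.normal(size=(100, 5))
y = 2.0 * X[:, 1] - 1.5 * X[:, 3] + 0.1 * rng.normal(size=100)

def composite_cost(Xs, y, beta=1.0):
    """Residual sum of squares penalised by -log det(Xs^T Xs):
    a large determinant indicates a well-conditioned (robust)
    design, so subtracting beta * logdet rewards it."""
    coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    rss = np.sum((y - Xs @ coef) ** 2)
    sign, logdet = np.linalg.slogdet(Xs.T @ Xs)
    return rss - beta * logdet

selected, remaining = [], list(range(X.shape[1]))
for _ in range(2):  # grow a two-term model greedily
    costs = [composite_cost(X[:, selected + [j]], y) for j in remaining]
    best = remaining[int(np.argmin(costs))]
    selected.append(best)
    remaining.remove(best)

print(sorted(selected))  # the two informative columns: [1, 3]
```

Because the informative columns reduce the residual far more than any log-determinant difference between candidates, the greedy loop recovers columns 1 and 3.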

Relevance: 30.00%

Abstract:

This correspondence introduces a new orthogonal forward regression (OFR) model identification algorithm that uses D-optimality for model structure selection and M-estimators for parameter estimation. The M-estimator is a classical robust parameter estimation technique for tackling bad data conditions such as outliers. Computationally, the M-estimator can be derived using an iteratively reweighted least squares (IRLS) algorithm. D-optimality is a model structure robustness criterion in experimental design for tackling ill-conditioning in the model structure. OFR, often based on the modified Gram-Schmidt procedure, is an efficient method that incorporates structure selection and parameter estimation simultaneously. The basic idea of the proposed approach is to incorporate an IRLS inner loop into the modified Gram-Schmidt procedure. In this manner, the OFR algorithm for parsimonious model structure determination is extended to bad data conditions, with improved performance obtained via the derivation of parameter M-estimators with inherent robustness to outliers. Numerical examples are included to demonstrate the effectiveness of the proposed algorithm.
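As a rough illustration of the IRLS idea the abstract refers to, here is a generic Huber-weighted IRLS fit on a contaminated linear regression. This is a textbook sketch, not the paper's Gram-Schmidt-integrated inner loop; the tuning constant c = 1.345 and the MAD-based scale estimate are standard assumed choices.

```python
import numpy as np

rng = np.random.default_rng(1)

# Linear data with five gross outliers in y.
X = np.column_stack([np.ones(60), rng.uniform(-1, 1, 60)])
y = 1.0 + 3.0 * X[:, 1] + 0.05 * rng.normal(size=60)
y[:5] += 10.0  # bad data: contaminate five observations

def huber_irls(X, y, c=1.345, iters=20):
    """M-estimate of regression coefficients via iteratively
    reweighted least squares with Huber weights."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # OLS start
    for _ in range(iters):
        r = y - X @ beta
        s = np.median(np.abs(r)) / 0.6745 + 1e-12  # robust (MAD) scale
        # Huber weight: 1 for small residuals, c/|r/s| for large ones.
        w = np.minimum(1.0, c / (np.abs(r / s) + 1e-12))
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
    return beta

beta_hat = huber_irls(X, y)
print(beta_hat)  # close to the true coefficients [1.0, 3.0]
```

The outliers receive near-zero weight after a few iterations, so the fit recovers the clean coefficients that ordinary least squares would miss.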

Relevance: 30.00%

Abstract:

In this brief, we propose an orthogonal forward regression (OFR) algorithm based on the principles of branch and bound (BB) and A-optimality experimental design. At each forward regression step, each candidate from a pool of candidate regressors, referred to as S, is evaluated in turn with three possible outcomes: 1) the candidate is selected and included in the model; 2) the candidate remains in S for evaluation in the next forward regression step; or 3) the candidate is permanently eliminated from S. Based on the BB principle in combination with an A-optimality composite cost function for model structure determination, a simple adaptive diagnostic test is proposed to determine the decision boundary between 2) and 3). As such, the proposed algorithm can significantly reduce the computational cost of the A-optimality OFR algorithm. Numerical examples are used to demonstrate the effectiveness of the proposed algorithm.

Relevance: 30.00%

Abstract:

New construction algorithms for radial basis function (RBF) network modelling are introduced, based on the A-optimality and D-optimality experimental design criteria respectively. We utilize new cost functions, based on experimental design criteria, for model selection that simultaneously optimize model approximation ability and either parameter variance (A-optimality) or model robustness (D-optimality). The proposed approaches are based on the forward orthogonal least-squares (OLS) algorithm, such that the new A-optimality- and D-optimality-based cost functions are constructed through an orthogonalization process that gains computational advantages and hence maintains the inherent computational efficiency associated with the conventional forward OLS approach. The proposed approach enhances the very popular forward-OLS-based RBF model construction method, since the resultant RBF models are constructed in such a manner that the system dynamics approximation capability, model adequacy and robustness are optimized simultaneously. The numerical examples provided show significant improvement under the D-optimality design criterion, demonstrating that there is significant room for improvement in modelling via the popular RBF neural network.
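To make the forward construction concrete, here is a simplified numpy sketch of greedy RBF centre selection by residual-error reduction. It refits by plain least squares at each step rather than using the orthogonalised OLS recursion of the paper, and the target function, kernel width and centre count are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(2)

# One-dimensional target sampled on a grid.
x = np.linspace(-3, 3, 80)
y = np.sin(2 * x) + 0.05 * rng.normal(size=80)

# Candidate pool: one Gaussian RBF centred on every training point.
width = 0.8
P = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * width ** 2))

# Greedy forward selection of centres by residual-error reduction.
selected, residual = [], y.copy()
candidates = list(range(len(x)))
for _ in range(8):
    # Score: squared projection of the residual on each candidate column.
    scores = {j: (P[:, j] @ residual) ** 2 / (P[:, j] @ P[:, j])
              for j in candidates}
    j = max(scores, key=scores.get)
    selected.append(j)
    candidates.remove(j)
    Ps = P[:, selected]
    theta = np.linalg.lstsq(Ps, y, rcond=None)[0]  # refit weights
    residual = y - Ps @ theta

rms = float(np.sqrt(np.mean(residual ** 2)))
print(rms)  # small residual after eight centres
```

Eight well-spread centres suffice here because the greedy score naturally avoids picking a centre close to one already selected, whose column would explain little of the remaining residual.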

Relevance: 30.00%

Abstract:

A new robust neurofuzzy model construction algorithm has been introduced for the modeling of a priori unknown dynamical systems from observed finite data sets in the form of a set of fuzzy rules. Based on a Takagi-Sugeno (T-S) inference mechanism, a one-to-one mapping between a fuzzy rule base and a model matrix feature subspace is established. This link enables rule-based knowledge to be extracted from the matrix subspace to enhance model transparency. In order to achieve maximized model robustness and sparsity, a new robust extended Gram-Schmidt (G-S) method has been introduced via two effective and complementary approaches: regularization and D-optimality experimental design. Model rule bases are decomposed into orthogonal subspaces, so as to enhance model transparency with the capability of interpreting the derived rule base energy level. A locally regularized orthogonal least squares algorithm, combined with a D-optimality criterion for subspace-based rule selection, has been extended for fuzzy rule regularization and subspace-based information extraction. By using a weighting for the D-optimality cost function, the entire model construction procedure becomes automatic. Numerical examples are included to demonstrate the effectiveness of the proposed new algorithm.

Relevance: 30.00%

Abstract:

A common problem in many data-based modelling algorithms, such as associative memory networks, is the curse of dimensionality. In this paper, a new two-stage neurofuzzy system design and construction algorithm (NeuDeC) for nonlinear dynamical processes is introduced to tackle this problem effectively. A new, simple preprocessing method is initially derived and applied to reduce the rule base, followed by a fine model detection process on the reduced rule set using forward orthogonal least squares model structure detection. In both stages, new criteria based on A-optimality experimental design are used. In the preprocessing stage, a lower bound of the A-optimality design criterion is derived and applied as a subset selection metric, while in the later stage the A-optimality design criterion is incorporated into a new composite cost function that minimises model prediction error as well as penalising the model parameter variance. The utilisation of NeuDeC leads to unbiased model parameters with low parameter variance and the additional benefit of a parsimonious model structure. Numerical examples are included to demonstrate the effectiveness of this new modelling approach for high-dimensional inputs.

Relevance: 30.00%

Abstract:

A very efficient learning algorithm for model subset selection is introduced, based on a new composite cost function that simultaneously optimizes the model approximation ability and model adequacy. The derived model parameters are estimated via forward orthogonal least squares, but the subset selection cost function includes an A-optimality design criterion to minimize the variance of the parameter estimates, which ensures the adequacy and parsimony of the final model. An illustrative example is included to demonstrate the effectiveness of the new approach.
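A minimal numpy sketch of such a composite cost follows, assuming a simple weighted sum of the residual sum of squares and the A-optimality measure trace((X^T X)^(-1)); the weighting lam and the collinear toy data are invented for the example and are not from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# Six candidate regressors; column 5 nearly duplicates column 4.
X = rng.normal(size=(50, 6))
X[:, 5] = X[:, 4] + 0.01 * rng.normal(size=50)
y = X[:, 0] + 0.5 * X[:, 4] + 0.05 * rng.normal(size=50)

def a_optimal_cost(Xs, y, lam=0.05):
    """Residual sum of squares plus lam * trace((Xs^T Xs)^{-1}).
    The trace term is the A-optimality measure of the total
    variance of the parameter estimates."""
    coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    rss = np.sum((y - Xs @ coef) ** 2)
    return rss + lam * np.trace(np.linalg.inv(Xs.T @ Xs))

# The near-duplicate column barely improves the fit but inflates the
# parameter variance, so the composite cost rejects it.
base = a_optimal_cost(X[:, [0, 4]], y)
bloated = a_optimal_cost(X[:, [0, 4, 5]], y)
print(base < bloated)  # True
```

This is the mechanism by which the A-optimality term enforces parsimony: a redundant regressor makes Xs^T Xs nearly singular, so the trace penalty explodes even though the residual barely changes.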

Relevance: 30.00%

Abstract:

New high-technology products usher in novel possibilities to transform the design, production and use of buildings. The high-technology companies which design, develop and introduce these new products by generating and applying novel scientific and technical knowledge face significant market uncertainty, technological uncertainty and competitive volatility. These characteristics present unique innovation challenges compared with low- and medium-technology companies. This paper reports on an ongoing Construction Knowledge Exchange funded project which is tracking, in real time, the new product development process of a new family of light-emitting diode (LED) technologies. LEDs offer significant functional and environmental performance improvements over incumbent tungsten and halogen lamps. Hitherto, the use of energy-efficient, low-maintenance LEDs has been constrained by technical limitations. Rapid improvements in basic science and technology mean that, for the first time, LEDs can provide realistic general and accent lighting solutions. Interim results will be presented on the complex, emergent new high-technology product development processes being revealed by the integrated supply chain of an LED module manufacturer, a luminaire (light fitting) manufacturer and an end user involved in the project.