20 results for Extended Langmuir model

in Aston University Research Archive


Relevance:

90.00%

Abstract:

Purpose – A binary integer programming model for the simple assembly line balancing problem (SALBP), well known as SALBP-1, was formulated more than 30 years ago. Since then, a number of researchers have extended the model for variants of the assembly line balancing problem. The model is still prevalent nowadays, mainly because of the lower and upper bounds on task assignment; these properties avoid a significant increase in the number of decision variables. The purpose of this paper is to use an example to show that the model may lead to a confusing solution. Design/methodology/approach – The paper provides a remedial constraint set for the model to rectify the disordered sequence problem. Findings – The paper presents proof that the assembly line balancing model formulated by Patterson and Albracht may lead to a confusing solution. Originality/value – No one previously has found that the commonly used model is incorrect.
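For readers unfamiliar with the formulation in question, a Patterson–Albracht-style binary integer program can be sketched as follows (notation ours and simplified; $E_i$ and $L_i$ denote the earliest and latest stations to which task $i$ can be assigned, which is the bounding device the abstract credits with keeping the number of decision variables small). The remedial constraint set proposed in the paper is not reproduced here.

```latex
% x_{ik} = 1 iff task i is assigned to station k;
% t_i = processing time of task i; c = cycle time; n = final (sink) task
\begin{align}
\min \quad & \sum_{k=E_n}^{L_n} k \, x_{nk}
    && \text{(station index of the final task)} \\
\text{s.t.} \quad & \sum_{k=E_i}^{L_i} x_{ik} = 1
    && \forall i \quad \text{(each task assigned exactly once)} \\
& \sum_{i \,:\, E_i \le k \le L_i} t_i \, x_{ik} \le c
    && \forall k \quad \text{(cycle-time limit per station)} \\
& \sum_{k=E_h}^{L_h} k \, x_{hk} \le \sum_{k=E_i}^{L_i} k \, x_{ik}
    && \forall (h, i) \text{ in the precedence relation}
\end{align}
```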

Relevance:

80.00%

Abstract:

The standard GTM (generative topographic mapping) algorithm assumes that the data on which it is trained consists of independent, identically distributed (iid) vectors. For time series, however, the iid assumption is a poor approximation. In this paper we show how the GTM algorithm can be extended to model time series by incorporating it as the emission density in a hidden Markov model. Since GTM has discrete hidden states we are able to find a tractable EM algorithm, based on the forward-backward algorithm, to train the model. We illustrate the performance of GTM through time using flight recorder data from a helicopter.
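The EM training described above rests on the forward-backward recursions for a discrete-state hidden Markov model. A minimal, generic sketch (scaled to avoid underflow; in GTM-through-time the emission likelihoods would come from the GTM density rather than being supplied directly):

```python
import numpy as np

def forward_backward(pi, A, B):
    """Scaled forward-backward for a discrete-state HMM.

    pi : (K,) initial state distribution
    A  : (K, K) transition matrix, A[i, j] = P(z_t = j | z_{t-1} = i)
    B  : (T, K) emission likelihoods, B[t, k] = p(x_t | z_t = k)
    Returns posterior state marginals gamma[t, k] = P(z_t = k | x_{1:T}),
    the quantities needed for the E-step of EM.
    """
    T, K = B.shape
    alpha = np.zeros((T, K))
    c = np.zeros(T)                              # per-step scaling factors
    alpha[0] = pi * B[0]
    c[0] = alpha[0].sum()
    alpha[0] /= c[0]
    for t in range(1, T):                        # forward pass
        alpha[t] = (alpha[t - 1] @ A) * B[t]
        c[t] = alpha[t].sum()
        alpha[t] /= c[t]
    beta = np.ones((T, K))
    for t in range(T - 2, -1, -1):               # backward pass
        beta[t] = (A @ (B[t + 1] * beta[t + 1])) / c[t + 1]
    gamma = alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)
```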

Relevance:

80.00%

Abstract:

Brand extensions are increasingly used by multinational corporations in emerging markets such as China. However, understanding of how consumers in emerging markets evaluate brand extensions is hampered by a lack of research in emerging-market contexts. To address this knowledge void, we built on an established Western brand extension evaluation framework, namely that of Aaker and Keller (1990, Journal of Marketing, 54(1), 27-41), and extended the model by incorporating two new factors: perceived fit based on brand image consistency, and competition intensity in the brand extension category. The addition of these two factors recognises the unique considerations of consumers in emerging markets when evaluating brand extensions. The extended model was tested in an empirical experiment with consumers in China. The results partly validated the Aaker and Keller model, and both newly added factors were found to significantly influence consumers' evaluations of brand extensions. More importantly, one of the new factors, consumer-perceived fit based on brand image consistency, was found to be more significant than all the factors in Aaker and Keller's original model, suggesting that that model may be limited in explaining how consumers in emerging markets evaluate brand extensions. Further research implications and limitations are discussed in the paper.

Relevance:

80.00%

Abstract:

The themes of this thesis are that international trade and foreign direct investment (FDI) are closely related and that they have varying impacts on economic growth in countries at different stages of development. The thesis consists of three empirical studies. The first one examines the causal relationship between FDI and trade in China. The empirical study is based on a panel of bilateral data for China and 19 home countries/regions over the period 1984-98. The specific feature of the study is that econometric techniques designed specially for panel data are applied to test for unit roots and causality. The results indicate a virtuous procedure of development for China. The growth of China's imports causes growth in inward FDI from a home country/region, which in turn causes the growth of exports from China to the home country/region. The growth of exports causes the growth of imports. This virtuous procedure is the result of China's policy of opening to the outside world. China has been encouraging export-oriented FDI and reducing trade barriers. Such policy instruments should be further encouraged in order to enhance economic growth. In the second study, an extended gravity model is constructed to identify the main causes of recent trade growth in OECD countries. The specific features include (a) the explicit introduction of R&D and FDI as two important explanatory variables into an augmented gravity equation; (b) the adoption of a panel data approach; and (c) the careful treatment of endogeneity. The main findings are that the levels and similarities of market size, domestic R&D stock and inward FDI stock are positively related to the volume of bilateral trade, while geographical distance, exchange rate and relative factor endowments have a negative impact. These findings lend support to new trade, FDI and economic growth theories. The third study evaluates the impact of openness on growth in different country groups. This research distinguishes itself from many existing studies in three aspects: first, both trade and FDI are included in the measurement of openness. Second, countries are divided into three groups according to their development stages to compare the roles of FDI and trade in different groups. Third, the possible problems of endogeneity and multicollinearity of FDI and trade are carefully dealt with in a panel data setting. The main findings are that FDI and trade are both beneficial to a country's development. However, trade has positive effects on growth in all country groups, but FDI has positive effects on growth only in the country groups which have had moderate development. The findings suggest FDI and trade may affect growth under different conditions.
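The augmented gravity equation in the second study can be sketched in a generic log-linear form (variable names are ours; the thesis's exact specification, including any similarity terms, may differ):

```latex
% X_{ijt} = bilateral trade from i to j in year t; Y = market size;
% RD = domestic R&D stock; FDI = inward FDI stock; DIST = distance;
% ER = exchange rate; RFE = relative factor endowments
\ln X_{ijt} = \beta_0 + \beta_1 \ln (Y_{it} Y_{jt}) + \beta_2 \ln RD_{it}
            + \beta_3 \ln FDI_{it} + \beta_4 \ln DIST_{ij}
            + \beta_5 \ln ER_{ijt} + \beta_6 \ln RFE_{ijt} + u_{ijt}
```

Per the findings reported above, $\beta_1$ through $\beta_3$ would be estimated as positive, and $\beta_4$ through $\beta_6$ as negative.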

Relevance:

80.00%

Abstract:

Heterogeneous and incomplete datasets are common in many real-world visualisation applications. The probabilistic nature of the Generative Topographic Mapping (GTM), which was originally developed for complete continuous data, can be extended to model heterogeneous (i.e. containing both continuous and discrete values) and missing data. This paper describes and assesses the resulting model on both synthetic and real-world heterogeneous data with missing values.

Relevance:

80.00%

Abstract:

Due to dynamic variability, identifying the specific conditions under which non-functional requirements (NFRs) are satisfied may only be possible at runtime. It is therefore necessary to consider the dynamic treatment of relevant information during the requirements specification. The associated data can be gathered by monitoring the execution of the application and its underlying environment, to support reasoning about how the current application configuration is fulfilling the established requirements. This paper presents a dynamic decision-making infrastructure to support both NFR representation and monitoring, and to reason about the degree of satisfaction of NFRs during runtime. The infrastructure is composed of: (i) an extended feature model, aligned with a domain-specific language, for representing NFRs to be monitored at runtime; (ii) a monitoring infrastructure to continuously assess NFRs at runtime; and (iii) a flexible decision-making process to select the best available configuration based on the satisfaction degree of the NFRs. The evaluation of the approach has shown that it is able to choose application configurations that fit user NFRs well, based on runtime information. The evaluation also revealed that the proposed infrastructure provided consistent indicators regarding the best application configurations that fit user NFRs. Finally, a benefit of our approach is that it allows us to quantify the level of satisfaction with respect to the NFR specification.
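The selection step in (iii) amounts to ranking candidate configurations by their monitored NFR satisfaction degrees. A minimal, hypothetical sketch (the names, the 0-to-1 satisfaction scale, and the weighted-sum scoring are our own illustration, far simpler than the paper's infrastructure):

```python
# Hypothetical sketch: pick the configuration whose monitored NFR
# satisfaction degrees (0..1) best meet user-weighted requirements.
def best_configuration(configs, weights):
    """configs: {config name: {NFR name: satisfaction degree in [0, 1]}}
    weights: {NFR name: user-assigned importance}
    Returns the name of the highest-scoring configuration."""
    def score(satisfactions):
        # Weighted sum over the NFRs the user cares about; a missing
        # monitored value counts as unsatisfied.
        return sum(w * satisfactions.get(nfr, 0.0)
                   for nfr, w in weights.items())
    return max(configs, key=lambda name: score(configs[name]))
```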

Relevance:

30.00%

Abstract:

Purpose - The purpose of this research paper is to demonstrate how existing performance measurement may be adopted to measure and manage performance in extended enterprises. Design/methodology/approach - The paper reviews the literature in performance measurement and extended enterprises. It explains the collaborative architecture of an extended enterprise and demonstrates this architecture through a case study. A model for measuring and managing performance in extended enterprises is developed using the case study. Findings - The research found that, due to structural differences between traditional and extended enterprises, the systems required to measure and manage the performance of extended enterprises, whilst being based upon existing performance measurement frameworks, would be structurally and operationally different. Based on this, a model for measuring and managing performance in extended enterprises is proposed which includes intrinsic and extrinsic inter-enterprise coordinating measures. Research limitations/implications - There are two limitations to this research. First, the evidence is based on a single case, thus further cases should be studied to establish the generalisability of the presented results. Second, the practical limitations of the EE performance measurement model should be established through longitudinal action research. Practical implications - In practice the model proposed requires collaborating organisations to be more open and share critical performance information with one another. This will require change in practices and attitudes. Originality/value - The main contribution this paper makes is that it highlights the structural differences between traditional and collaborative enterprises and specifies the performance measurement and management requirements of these collaborative organisations. © Emerald Group Publishing Limited.

Relevance:

30.00%

Abstract:

This paper extends the minimax disparity approach to determining ordered weighted averaging (OWA) operator weights, based on linear programming. It introduces the minimax disparity constraint between any distinct pair of weights, and uses the duality of linear programming to prove the feasibility of the extended OWA operator weights model. The paper finishes with an open problem. © 2006 Elsevier Ltd. All rights reserved.
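The kind of linear program described above can be sketched numerically. The formulation below is our reading of the "any distinct pairs" extension, under the standard orness constraint; the paper's exact model may differ, and SciPy's `linprog` is used only as a convenient solver:

```python
import numpy as np
from scipy.optimize import linprog

def minimax_disparity_owa(n, orness):
    """Sketch of a minimax-disparity OWA weight model as an LP.

    Variables: w_1..w_n and delta. Minimize delta subject to
    |w_i - w_j| <= delta for every distinct pair (i, j), the orness
    constraint sum_i ((n - i) / (n - 1)) w_i = orness,
    sum_i w_i = 1, and w_i >= 0.
    """
    c = np.zeros(n + 1)
    c[-1] = 1.0                                   # objective: minimize delta
    A_ub, b_ub = [], []
    for i in range(n):
        for j in range(i + 1, n):
            row = np.zeros(n + 1)
            row[i], row[j], row[-1] = 1.0, -1.0, -1.0   #  w_i - w_j - delta <= 0
            A_ub.append(row.copy()); b_ub.append(0.0)
            row[i], row[j] = -1.0, 1.0                  # -(w_i - w_j) - delta <= 0
            A_ub.append(row); b_ub.append(0.0)
    A_eq = np.zeros((2, n + 1))
    A_eq[0, :n] = 1.0                                   # weights sum to 1
    A_eq[1, :n] = [(n - 1 - i) / (n - 1) for i in range(n)]  # orness level
    res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub,
                  A_eq=A_eq, b_eq=[1.0, orness],
                  bounds=[(0, None)] * (n + 1))
    return res.x[:n]
```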

Relevance:

30.00%

Abstract:

A multi-scale model of edge coding based on normalized Gaussian derivative filters successfully predicts perceived scale (blur) for a wide variety of edge profiles [Georgeson, M. A., May, K. A., Freeman, T. C. A., & Hesse, G. S. (in press). From filters to features: Scale-space analysis of edge and blur coding in human vision. Journal of Vision]. Our model spatially differentiates the luminance profile, half-wave rectifies the 1st derivative, and then differentiates twice more, to give the 3rd derivative of all regions with a positive gradient. This process is implemented by a set of Gaussian derivative filters with a range of scales. Peaks in the inverted normalized 3rd derivative across space and scale indicate the positions and scales of the edges. The edge contrast can be estimated from the height of the peak. The model provides a veridical estimate of the scale and contrast of edges that have a Gaussian integral profile. Therefore, since scale and contrast are independent stimulus parameters, the model predicts that the perceived value of either of these parameters should be unaffected by changes in the other. This prediction was found to be incorrect: reducing the contrast of an edge made it look sharper, and increasing its scale led to a decrease in the perceived contrast. Our model can account for these effects when the simple half-wave rectifier after the 1st derivative is replaced by a smoothed threshold function described by two parameters. For each subject, one pair of parameters provided a satisfactory fit to the data from all the experiments presented here and in the accompanying paper [May, K. A. & Georgeson, M. A. (2007). Added luminance ramp alters perceived edge blur and contrast: A critical test for derivative-based models of edge coding. Vision Research, 47, 1721-1731]. 
Thus, when we allow for the visual system's insensitivity to very shallow luminance gradients, our multi-scale model can be extended to edge coding over a wide range of contrasts and blurs. © 2007 Elsevier Ltd. All rights reserved.
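The derivative chain described above (differentiate, half-wave rectify, differentiate twice more, invert) can be sketched numerically. This is a minimal single-scale illustration using SciPy's Gaussian derivative filters; the published model's scale-space search, normalization, and smoothed-threshold nonlinearity are all omitted:

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d
from scipy.special import erf

# A Gaussian-integral edge of blur sigma_e, centred at x = 0.
sigma_e = 8.0
x = np.arange(-100, 101, dtype=float)
luminance = 0.5 * (1.0 + erf(x / (sigma_e * np.sqrt(2.0))))

scale = 4.0
d1 = gaussian_filter1d(luminance, scale, order=1)  # smoothed 1st derivative
d1 = np.maximum(d1, 0.0)                           # half-wave rectification
d3 = gaussian_filter1d(d1, scale, order=2)         # two further derivatives
response = -d3                                     # inverted 3rd derivative
edge_location = x[np.argmax(response)]             # peak marks the edge
```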

Relevance:

30.00%

Abstract:

The research investigates the processes of adoption and implementation, by organisations, of computer aided production management systems (CAPM). It is organised around two different theoretical perspectives. The first part is informed by the Rogers model of the diffusion, adoption and implementation of innovations, and the second part by a social constructionist approach to technology. Rogers' work is critically evaluated and a model of adoption and implementation is distilled from it and applied to a set of empirical case studies. In the light of the case study data, strengths and weaknesses of the model are identified. It is argued that the model is too rational and linear to provide an adequate explanation of adoption processes. It is useful for understanding processes of implementation but requires further development. The model is not able to adequately encompass complex computer-based technologies. However, the idea of 'reinvention' is identified as Rogers' key concept, though it needs to be conceptually extended. Both Rogers' model and the definitions of CAPM found in the production engineering literature tend to treat CAPM in objectivist terms. The problems with this view are addressed through a review of the literature on the sociology of technology, and it is argued that a social constructionist approach offers a more useful framework for understanding CAPM, its nature, adoption, implementation, and use. CAPM, it is argued, must be understood in terms of the ways in which it is constituted in discourse, as part of a 'struggle for meaning' on the part of academics, professional engineers, suppliers, and users.

Relevance:

30.00%

Abstract:

This dissertation studies the process of operations systems design within the context of the manufacturing organization. Using the DRAMA (Design Routine for Adopting Modular Assembly) model as developed by a team from the IDOM Research Unit at Aston University as a starting point, the research employed empirically based fieldwork and a survey to investigate the process of production systems design and implementation within four UK manufacturing industries: electronics assembly, electrical engineering, mechanical engineering and carpet manufacturing. The intention was to validate the basic DRAMA model as a framework for research enquiry within the electronics industry, where the initial IDOM work was conducted, and then to test its generic applicability, further developing the model where appropriate, within the other industries selected. The thesis contains a review of production systems design theory and practice prior to presenting thirteen industrial case studies of production systems design from the four industry sectors. The results and analysis of the postal survey into production systems design are then presented. The strategic decisions of manufacturing and their relationship to production systems design, and the detailed process of production systems design and operation are then discussed. These analyses are used to develop the generic model of production systems design entitled DRAMA II (Decision Rules for Analysing Manufacturing Activities). The model contains three main constituent parts: the basic DRAMA model, the extended DRAMA II model showing the imperatives and relationships within the design process, and a benchmark generic approach for the design and analysis of each component in the design process. DRAMA II is primarily intended for use by researchers as an analytical framework of enquiry, but is also seen as having application for manufacturing practitioners.

Relevance:

30.00%

Abstract:

Experimental data and theoretical calculations on the heat transfer performance of extended surfaces submerged in shallow air-fluidized beds, less than 150 mm deep, are presented. Energy transfer from the bed material was effected by water-cooled tubes passing through the fins. The extended surface tested was either manufactured from square or radial copper fins silver-soldered to a circular base tube, or commercially supplied, being of the crimped or extruded helical fin type. Performances are compared, for a wide range of geometric variables, bed configurations and fluidized materials, with plain and oval tubes operating under similar experimental conditions. A statistical analysis of all results, using a regression technique, has shown the relative importance of each significant variable. The bed-to-surface heat transfer coefficients are higher than those reported in earlier published work using finned tubes in much deeper beds, and the heat transfer to the whole of the extended surface is at least as good as that previously reported for un-finned tubes. The improved performance is attributed partly to the absence of large bubbles in shallow beds, and it is suggested that the improved circulation of the solids when constrained in the narrow passages between adjacent fins may be a contributory factor. Flow visualisation studies between a perspex extended surface and a fluidized bed, using air at ambient temperatures, have demonstrated the effect of too small a fin spacing. Fin material and the bonding to the base tube are more important in the optimisation of performance than in conventional convective applications because of the very much larger heat fluxes involved. A theoretical model of heat flow for a radial fin surface provides data concerning the maximum heat transfer and minimum metal required to fulfil a given heat exchange duty. Results, plotted in a series of charts, aim at assisting the designer of shallow fluidized beds.

Relevance:

30.00%

Abstract:

This paper concerns the problem of agent trust in an electronic market place. We maintain that agent trust involves making decisions under uncertainty and therefore the phenomenon should be modelled probabilistically. We therefore propose a probabilistic framework that models agent interactions as a Hidden Markov Model (HMM). The observations of the HMM are the interaction outcomes and the hidden state is the underlying probability of a good outcome. The task of deciding whether to interact with another agent reduces to probabilistic inference of the current state of that agent given all previous interaction outcomes. The model is extended to include a probabilistic reputation system which involves agents gathering opinions about other agents and fusing them with their own beliefs. Our system is fully probabilistic and hence delivers the following improvements with respect to previous work: (a) the model assumptions are faithfully translated into algorithms, and our system is optimal under those assumptions; (b) it can account for agents whose behaviour is not static with time; and (c) it can estimate the rate at which an agent's behaviour changes. The system is shown to significantly outperform previous state-of-the-art methods in several numerical experiments. Copyright © 2010, International Foundation for Autonomous Agents and Multiagent Systems (www.ifaamas.org). All rights reserved.
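The inference task described above, estimating the current hidden state from all previous interaction outcomes, is HMM forward filtering. A minimal sketch (the discretization of the good-outcome probability into three levels and the sticky transition matrix are our own simplifications, not the paper's exact model):

```python
import numpy as np

# Hidden "trustworthiness" state: the probability of a good outcome,
# discretized into a few levels; a sticky transition matrix lets an
# agent's behaviour drift over time.
levels = np.array([0.1, 0.5, 0.9])       # P(good outcome | state)
stay = 0.95
K = len(levels)
A = np.full((K, K), (1 - stay) / (K - 1))
np.fill_diagonal(A, stay)

def filter_trust(outcomes, belief=None):
    """Forward-filter a sequence of 0/1 interaction outcomes.
    Returns the belief over the agent's current hidden state."""
    if belief is None:
        belief = np.full(K, 1.0 / K)     # uniform prior over states
    for o in outcomes:
        belief = belief @ A                                  # predict
        belief = belief * np.where(o, levels, 1 - levels)    # update
        belief = belief / belief.sum()
    return belief

def p_good_next(belief):
    """Predictive probability that the next interaction succeeds."""
    return float((belief @ A) @ levels)
```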

Relevance:

30.00%

Abstract:

Purpose: The purpose of this paper is to investigate enterprise resource planning (ERP) systems development and emerging practices in the management of enterprises (i.e. parts of companies working with parts of other companies to deliver a complex product and/or service) and identify any apparent correlations. Suitable a priori contingency frameworks are then used and extended to explain apparent correlations. Discussion is given to provide guidance for researchers and practitioners to deliver better strategic, structural and operational competitive advantage through this approach; coined here as the "enterprization of operations". Design/methodology/approach: Theoretical induction uses a new empirical longitudinal case study from Zoomlion (a Chinese manufacturing company) built using an adapted form of template analysis to produce a new contingency framework. Findings: Three main types of enterprises and the three main types of ERP systems are defined and correlations between them are explained. Two relevant a priori frameworks are used to induct a new contingency model to support the enterprization of operations; known as the dynamic enterprise reference grid for ERP (DERG-ERP). Research limitations/implications: The findings are based on one longitudinal case study. Further case studies are currently being conducted in the UK and China. Practical implications: The new contingency model, the DERG-ERP, serves as a guide for ERP vendors, information systems management and operations managers hoping to grow and sustain their competitive advantage with respect to effective enterprise strategy, enterprise structure and ERP systems. Originality/value: This research explains how ERP systems and the effective management of enterprises should develop in order to sustain competitive advantage with respect to enterprise strategy, enterprise structure and ERP systems use. © Emerald Group Publishing Limited.

Relevance:

30.00%

Abstract:

A generalized Drucker–Prager (GD–P) viscoplastic yield surface model was developed and validated for asphalt concrete. The GD–P model was formulated based on fabric tensor modified stresses to consider the material inherent anisotropy. A smooth and convex octahedral yield surface function was developed in the GD–P model to characterize the full range of the internal friction angles from 0° to 90°. In contrast, the existing Extended Drucker–Prager (ED–P) was demonstrated to be applicable only for a material that has an internal friction angle less than 22°. Laboratory tests were performed to evaluate the anisotropic effect and to validate the GD–P model. Results indicated that (1) the yield stresses of an isotropic yield surface model are greater in compression and less in extension than that of an anisotropic model, which can result in an under-prediction of the viscoplastic deformation; and (2) the yield stresses predicted by the GD–P model matched well with the experimental results of the octahedral shear strength tests at different normal and confining stresses. By contrast, the ED–P model over-predicted the octahedral yield stresses, which can lead to an under-prediction of the permanent deformation. In summary, the rutting depth of an asphalt pavement would be underestimated without considering anisotropy and convexity of the yield surface for asphalt concrete. The proposed GD–P model was demonstrated to be capable of overcoming these limitations of the existing yield surface models for the asphalt concrete.
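For context, the classical Drucker–Prager surface that both the ED–P and GD–P models generalize can be written in the standard invariant form (textbook notation, not taken from the paper; the $\alpha$, $k$ expressions shown are the usual compressive-meridian match to Mohr–Coulomb):

```latex
% I_1 = first stress invariant; J_2 = second deviatoric stress invariant;
% \varphi = internal friction angle; c = cohesion
f = \sqrt{J_2} - \alpha I_1 - k = 0, \qquad
\alpha = \frac{2 \sin\varphi}{\sqrt{3}\,(3 - \sin\varphi)}, \qquad
k = \frac{6\, c \cos\varphi}{\sqrt{3}\,(3 - \sin\varphi)}
```

As described above, the GD–P model departs from this form in two ways: the invariants are computed on fabric-tensor-modified stresses to capture inherent anisotropy, and the circular octahedral cross-section is replaced by a smooth function that remains convex over the full range of friction angles, which the linear ED–P extension cannot do.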