44 results for Multi-Objective Optimization
Abstract:
Purpose – The purpose of this research is to develop a holistic approach to maximizing the customer service level while minimizing the logistics cost, using an integrated multiple criteria decision making (MCDM) method for the contemporary transshipment problem. Unlike the prevalent optimization techniques, this paper proposes an integrated approach that considers both quantitative and qualitative factors in order to maximize the benefits of service deliverers and customers under uncertain environments. Design/methodology/approach – This paper proposes a fuzzy-based integer linear programming model, based on the existing literature and validated with an example case. The model integrates a fuzzy modification of the analytic hierarchy process (FAHP) and solves the multi-criteria transshipment problem. Findings – This paper provides several novel insights into how to transform a company from a cost-based model to a service-dominated model by using an integrated MCDM method. It suggests that the contemporary customer-driven supply chain maintains and increases its competitiveness in two ways: optimizing cost and providing the best service simultaneously. Research limitations/implications – This research used one illustrative industry case to exemplify the developed method. Considering the generalization of the research findings and the complexity of the transshipment service network, more cases across multiple industries are necessary to further enhance the validity of the research output. Practical implications – The paper includes implications for the evaluation and selection of transshipment service suppliers, the construction of an optimal transshipment network, and the management of that network. Originality/value – The major advantages of this generic approach are that both quantitative and qualitative factors under a fuzzy environment are considered simultaneously and that the viewpoints of both service deliverers and customers are taken into account. It is therefore believed to be useful and applicable for transshipment service network design.
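To make the FAHP step concrete, the sketch below shows one simple variant of deriving criterion weights from a triangular-fuzzy pairwise-comparison matrix; it is illustrative only and not the authors' model, and the criteria and judgements are assumed.

```python
# Minimal FAHP-style sketch (not the paper's model): defuzzify triangular judgements
# and derive crisp weights with the geometric-mean (row) method.
import numpy as np

# Triangular fuzzy judgements (l, m, u) for three assumed criteria:
# cost, delivery time, service quality.
fuzzy = np.array([
    [[1, 1, 1],       [2, 3, 4],     [4, 5, 6]],
    [[1/4, 1/3, 1/2], [1, 1, 1],     [1, 2, 3]],
    [[1/6, 1/5, 1/4], [1/3, 1/2, 1], [1, 1, 1]],
])

crisp = fuzzy.mean(axis=2)                          # centroid defuzzification (l+m+u)/3
geo = np.prod(crisp, axis=1) ** (1 / crisp.shape[1])  # row geometric means
weights = geo / geo.sum()                           # normalized criterion weights
print(dict(zip(["cost", "delivery_time", "service_quality"], weights.round(3))))
```

The resulting weights would then feed the integer linear programming model as coefficients of the service-related objective terms.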
Abstract:
Servitization is the process by which manufacturers add services to their product offerings and even replace products with services. The capabilities necessary to develop and deliver advanced services as part of servitization are often discussed in the literature from the manufacturer's perspective, e.g., having a service-focused culture or the ability to sell solutions. Recent research has acknowledged the important role of customers and, to a lesser extent, other actors (e.g., intermediaries) in bringing about successful servitization, particularly for use-oriented and results-oriented advanced services. The objective of this study is to identify the capabilities required to successfully develop advanced services as part of servitization by considering the perspectives of manufacturers, intermediaries and customers. This study involved interviews about servitization capabilities with 33 managers in 28 large UK-based companies from these three groups. The findings suggest that there are eight broad capabilities that are important for advanced services: 1) personnel with expertise and deep technical product knowledge, 2) methodologies for improving operational processes, helping to manage risk and reduce costs, 3) the evolution from being a product-focused manufacturer to embracing a services culture, 4) developing trusting relationships with other actors in the network to support the delivery of advanced services, 5) new innovation activities focused on financing contracts (e.g., 'gain share') and technology implementation (e.g., Web-based applications), 6) customer intimacy through understanding their business challenges in order to develop suitable solutions, 7) extensive infrastructure (e.g., personnel, service centres) to deliver a local service, and 8) the ability to tailor service offerings to each customer's requirements and deliver these responsively to changing needs. The capabilities required to develop and deliver advanced services align with the need to enhance the operational performance of supplied products throughout their lifecycles and as such require greater investment than the capabilities for base and intermediate services.
Abstract:
In the contemporary customer-driven supply chain, maximization of customer service plays an equally important role as minimization of costs for a company to retain and increase its competitiveness. This article develops a multiple-criteria optimization approach, combining the analytic hierarchy process (AHP) and an integer linear programming (ILP) model, to aid the design of an optimal logistics distribution network. The proposed approach outperforms traditional cost-based optimization techniques because it considers both quantitative and qualitative factors and also aims at maximizing the benefits of both the deliverer and customers. In the approach, the AHP is used to determine the relative importance weightings, or priorities, of alternative warehouses with respect to critical customer-oriented criteria. The results of the AHP prioritization are utilized as the input of the ILP model, the objective of which is to select the best warehouses at the lowest possible cost. Two commercial packages are used in this article: Expert Choice and LINDO.
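The sketch below illustrates the general idea of feeding AHP priorities into a selection model; it is not a reproduction of the Expert Choice/LINDO workflow, and the priorities, costs, and budget are invented. A tiny instance is solved by exhaustive search rather than a commercial ILP solver.

```python
# Illustrative-only sketch: choose warehouses that maximize total AHP priority
# while staying within a cost budget (a knapsack-style stand-in for the ILP).
from itertools import combinations

priority = {"W1": 0.35, "W2": 0.28, "W3": 0.22, "W4": 0.15}   # hypothetical AHP priorities
cost = {"W1": 120, "W2": 90, "W3": 70, "W4": 60}              # hypothetical fixed costs
budget = 200

best = max(
    (subset
     for r in range(1, len(priority) + 1)
     for subset in combinations(priority, r)
     if sum(cost[w] for w in subset) <= budget),
    key=lambda s: sum(priority[w] for w in s),
)
print(best, sum(cost[w] for w in best))
```

For realistic problem sizes the same model would be handed to an ILP solver with binary selection variables instead of being enumerated.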
Abstract:
Integrated supplier selection and order allocation is an important decision for both designing and operating supply chains. This decision is often influenced by the stakeholders concerned (suppliers, plant operators and customers) in different tiers. As firms continue to seek competitive advantage through supply chain design and operations, they aim to create optimized supply chains. This calls for, on the one hand, consideration of multiple conflicting criteria and, on the other, consideration of uncertainties in demand and supply. Although there are studies on supplier selection that separately adopt advanced mathematical models for a stochastic approach, multiple criteria decision making techniques, and multiple stakeholder requirements, to the authors' knowledge there is no work that integrates these three aspects in a common framework. This paper proposes an integrated method for dealing with such problems using a combined Analytic Hierarchy Process-Quality Function Deployment (AHP-QFD) and chance-constrained optimization approach that selects appropriate suppliers and allocates orders optimally between them. The effectiveness of the proposed decision support system has been demonstrated through application and validation in the bioenergy industry.
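As a hedged illustration of the chance-constraint step only: with normally distributed demand, a constraint of the form P(total orders ≥ demand) ≥ α has the deterministic equivalent total orders ≥ μ + z_α·σ. The supplier scores, capacities, and demand parameters below are assumptions, and the greedy allocation is a stand-in for the paper's optimization model.

```python
# Sketch: deterministic equivalent of a chance constraint plus a simple
# score-ordered allocation (illustrative, not the paper's algorithm).
from scipy.stats import norm

mu, sigma, alpha = 1000.0, 120.0, 0.95           # assumed demand distribution and service level
required = mu + norm.ppf(alpha) * sigma          # orders needed to satisfy the chance constraint

suppliers = [                                    # (name, AHP-QFD score, capacity, unit cost)
    ("S1", 0.45, 600, 9.0),
    ("S2", 0.35, 500, 8.5),
    ("S3", 0.20, 400, 8.0),
]

remaining, plan = required, {}
for name, score, cap, _ in sorted(suppliers, key=lambda s: -s[1]):
    qty = min(cap, remaining)                    # allocate to higher-scored suppliers first
    if qty > 0:
        plan[name] = round(qty, 1)
        remaining -= qty
print(round(required, 1), plan)
```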
Abstract:
Link quality-based rate adaptation has been widely used for IEEE 802.11 networks. However, network performance is affected by both link quality and random channel access. Selecting transmit modes for optimal link throughput can cause medium access control (MAC) throughput loss. In this paper, we investigate this issue and propose a generalised cross-layer rate adaptation algorithm that jointly considers link quality and channel access to optimise network throughput. The objective is to examine the potential benefits of cross-layer design. An efficient analytic model is proposed to evaluate rate adaptation algorithms under dynamic channel and multi-user access environments. The proposed algorithm is compared to a link throughput optimisation-based algorithm. It is found that rate adaptation which optimises link-layer throughput alone can result in a large performance loss that cannot be compensated for by optimising the MAC access mechanism alone. Results show that cross-layer design can achieve consistent and considerable performance gains of up to 20%, and it deserves to be exploited in practical designs for IEEE 802.11 networks.
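The toy sketch below conveys the cross-layer intuition, not the paper's analytic model: the rate that maximizes raw link throughput (rate times success probability) is not necessarily the one that maximizes MAC throughput once a fixed channel-access overhead per frame is charged. Frame size, overhead, and success probabilities are assumed numbers.

```python
# Toy cross-layer rate selection (illustrative assumptions throughout).
FRAME_BITS = 12_000          # payload bits per frame (assumed)
OVERHEAD_S = 300e-6          # contention + header overhead per attempt (assumed)

# (PHY rate in Mbit/s, frame success probability at the current SNR)
candidates = [(6, 0.99), (12, 0.97), (24, 0.90), (36, 0.75), (48, 0.60), (54, 0.55)]

def mac_throughput(rate_mbps, p_success):
    tx_time = FRAME_BITS / (rate_mbps * 1e6)                 # on-air time of one frame
    return p_success * FRAME_BITS / (tx_time + OVERHEAD_S)   # useful bits per second

best_rate, p = max(candidates, key=lambda c: mac_throughput(*c))
link_only = max(candidates, key=lambda c: c[0] * c[1])[0]    # link-layer-only choice
print("link-only choice:", link_only, "Mbit/s | cross-layer choice:", best_rate, "Mbit/s")
```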
Abstract:
The re-entrant flow shop scheduling problem (RFSP) is regarded as an NP-hard problem and has attracted the attention of both researchers and industry. Current approaches attempt to minimize the makespan of the RFSP without considering the interdependency between the resource constraints and the re-entrant probability. This paper proposes a multi-level genetic algorithm (GA) that includes the correlated re-entrant possibility and production mode in a multi-level chromosome encoding. A repair operator is incorporated into the multi-level GA to revise infeasible solutions by resolving resource conflicts. With the objective of minimizing the makespan, ANOVA is used to fine-tune the parameter settings of the GA. The experiments show that the proposed approach is more effective at finding near-optimal schedules than a simulated annealing algorithm for both small-size and large-size problems.
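A greatly simplified evolutionary sketch for permutation flow-shop makespan is given below; the paper's multi-level encoding, re-entrant probabilities, production modes, and repair operator are omitted, crossover is left out for brevity, and the processing times are invented.

```python
# Simplified GA-style search over job permutations for flow-shop makespan.
import random
random.seed(0)

p = [[4, 3, 2], [2, 5, 1], [3, 2, 4], [5, 1, 3]]   # p[job][machine] processing times (assumed)

def makespan(order):
    m = len(p[0])
    finish = [0] * m                                # finish[k] = completion time on machine k
    for j in order:
        for k in range(m):
            start = max(finish[k], finish[k - 1] if k else 0)
            finish[k] = start + p[j][k]
    return finish[-1]

def evolve(pop_size=30, gens=200, mut=0.3):
    jobs = list(range(len(p)))
    pop = [random.sample(jobs, len(jobs)) for _ in range(pop_size)]
    for _ in range(gens):
        new = []
        for _ in range(pop_size):
            a, b = random.sample(pop, 2)            # tournament selection
            child = min(a, b, key=makespan)[:]
            if random.random() < mut:               # swap mutation
                i, j = random.sample(range(len(jobs)), 2)
                child[i], child[j] = child[j], child[i]
            new.append(child)
        pop = new
    best = min(pop, key=makespan)
    return best, makespan(best)

print(evolve())
```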
Abstract:
This article conceptualises and operationalises 'subjective entrepreneurial success' in a manner which reflects the criteria employed by entrepreneurs, rather than those imposed by researchers. Two studies were conducted. The first, a qualitative enquiry, investigated success definitions through interviews with 185 German entrepreneurs; five factors emerged from their reports: firm performance, workplace relationships, personal fulfilment, community impact, and personal financial rewards. The second study developed a questionnaire, the Subjective Entrepreneurial Success–Importance Scale (SES-IS), to measure these five factors using a sample of 184 entrepreneurs. We provide evidence for the validity of the SES-IS, including establishing systematic relationships of the SES-IS with objective indicators of firm success, annual income, and entrepreneur satisfaction with life and financial situation. We also provide evidence for the cross-cultural invariance of the SES-IS using a sample of Polish entrepreneurs. The quintessence of our studies is that subjective entrepreneurial success is a multi-factorial construct, i.e. entrepreneurs value various indicators of success, with money being only one possible option.
Abstract:
This article presents a laser tracker position optimization code based on the tracker uncertainty model developed by the National Physical Laboratory (NPL). The code is able to find the optimal tracker positions for generic measurements involving one tracker or a network of many trackers and an arbitrary set of targets. The optimization is performed using pattern search or, optionally, a genetic algorithm (GA) or particle swarm optimization (PSO). Different objective function weightings can be defined for the uncertainties of individual points, the distance uncertainties between point pairs, and the angular uncertainties between three points. Constraints on tracker position limits and minimum measurement distances have also been implemented. Furthermore, position optimization taking into account lines of sight (LOS) within complex CAD geometry has also been demonstrated. The code is simple to use and can be a valuable measurement planning tool.
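The sketch below stands in for the general placement-optimization idea only; the NPL uncertainty model is not reproduced. It assumes a simple range-dependent point uncertainty u(d) = A + B·d, a minimum measurement distance, and a coarse grid search instead of pattern search, GA, or PSO; targets and coefficients are illustrative.

```python
# Simplified tracker-placement sketch under an assumed linear uncertainty model.
import math

targets = [(0.0, 0.0, 0.0), (2.0, 0.5, 0.0), (1.0, 2.0, 1.0)]    # illustrative targets [m]
A, B, MIN_DIST = 15e-6, 6e-6, 1.5                                # u(d) = A + B*d [m]; min range [m]

def total_uncertainty(pos):
    ds = [math.dist(pos, t) for t in targets]
    if min(ds) < MIN_DIST:                    # enforce the minimum measurement distance
        return float("inf")
    return sum(A + B * d for d in ds)         # summed single-point uncertainties

grid = [(x, y, 1.8) for x in range(-5, 6) for y in range(-5, 6)]  # coarse candidate positions
best = min(grid, key=total_uncertainty)
print(best, total_uncertainty(best))
```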
Abstract:
OBJECTIVE: To explore patients' and physicians' experiences of atrial fibrillation (AF) consultations and oral anticoagulation decision-making. DESIGN: Multi-perspective interpretative phenomenological analysis (IPA). METHODS: Participants comprised small homogeneous subgroups: AF patients who accepted (n=4), refused (n=4), or discontinued (n=3) warfarin, and four physician subgroups (n=4 each): consultant cardiologists, consultant general physicians, general practitioners and cardiology registrars. Semi-structured interviews were conducted. Transcripts were analysed using multi-perspective IPA to attend to individuals within subgroups and to make comparisons within and between groups. RESULTS: Three themes represented patients' experiences: Positioning within the physician-patient dyad, Health-life balance, and Drug myths and fear of stroke. Physicians' accounts generated three themes: Mechanised metaphors and probabilities, Navigating toward the 'right' decision, and Negotiating systemic factors. CONCLUSIONS: This multi-perspective IPA design facilitated an understanding of the diagnostic consultation and treatment decision-making which foregrounded patients' and physicians' experiences. We drew on Habermas' theory of communicative action to recommend broadening the content within consultations and shifting the focus to patients' life contexts. Interventions including specialist multidisciplinary teams, flexible management in primary care, and multifaceted interventions for information provision may enable the creation of an environment that supports genuine patient involvement and participatory decision-making.
Abstract:
Insulated-gate bipolar transistor (IGBT) power modules find widespread use in numerous power conversion applications where their reliability is of significant concern. Standard IGBT modules are fabricated for general-purpose applications, while few are designed for bespoke applications. However, the conventional design of IGBTs can be improved by multiobjective optimization techniques. This paper proposes a novel design method that considers die-attachment solder failures induced by short power cycling and baseplate solder fatigue induced by thermal cycling, which are among the major failure mechanisms of IGBTs. The thermal resistance is calculated analytically, and the plastic work is obtained with a high-fidelity finite-element model which has been validated experimentally. The objective of minimizing the plastic work and the constraint functions are formulated with a surrogate model. The non-dominated sorting genetic algorithm II (NSGA-II) is used to search for the Pareto-optimal solutions and the best design. The resulting approach provides an effective way to optimize the physical structure of power electronic modules, taking account of historical environmental and operational conditions in the field.
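To illustrate the Pareto idea at the core of NSGA-II (the surrogate models, finite-element analysis, and the full NSGA-II machinery are not reproduced), the sketch below extracts the non-dominated designs from a set of evaluated candidates scored by two objectives to be minimized; the design names and values are invented.

```python
# Minimal non-dominated (Pareto) filtering over two minimization objectives.
designs = {                      # name: (thermal resistance [K/W], plastic work [a.u.]) - illustrative
    "d1": (0.30, 1.20),
    "d2": (0.28, 1.35),
    "d3": (0.33, 1.05),
    "d4": (0.31, 1.25),          # dominated by d1 on both objectives
}

def dominates(a, b):
    """a dominates b if it is no worse in every objective and better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

pareto = [n for n, f in designs.items()
          if not any(dominates(g, f) for m, g in designs.items() if m != n)]
print(pareto)    # d1, d2, d3 form the Pareto front
```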
Abstract:
Heat sinks are widely used for cooling electronic devices and systems. Their thermal performance is usually determined by the material, shape, and size of the heat sink. With the assistance of computational fluid dynamics (CFD) and surrogate-based optimization, heat sinks can be designed and optimized to achieve a high level of performance. In this paper, the design and optimization of a plate-fin heat sink cooled by an impingement jet is presented. The flow and thermal fields are simulated using CFD, and the thermal resistance of the heat sink is then estimated. A Kriging surrogate model is developed to approximate the objective function (thermal resistance) as a function of the design variables. Surrogate-based optimization is implemented by adaptively adding infill points based on an integrated strategy combining the minimum-value, maximum mean square error, and expected improvement approaches. The results show the influence of the design variables on the thermal resistance and give the optimal heat sink with the lowest thermal resistance for the given jet impingement conditions.
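A hedged sketch of the expected-improvement (EI) infill criterion for a minimization problem is shown below; the Kriging and CFD models themselves are not reproduced, and the candidate predictions (surrogate mean and standard deviation of thermal resistance) are invented numbers.

```python
# Expected improvement for minimization: EI = (f_best - mu)*Phi(z) + sigma*phi(z),
# with z = (f_best - mu)/sigma. Used here to rank assumed candidate designs.
from scipy.stats import norm

f_best = 0.42                                   # best thermal resistance observed so far [K/W]
candidates = {                                  # design: (surrogate mean, surrogate std)
    "x1": (0.45, 0.050),
    "x2": (0.41, 0.010),
    "x3": (0.43, 0.080),
}

def expected_improvement(mu, sigma):
    if sigma <= 0:
        return max(f_best - mu, 0.0)
    z = (f_best - mu) / sigma
    return (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

infill = max(candidates, key=lambda k: expected_improvement(*candidates[k]))
print(infill)    # the next design to evaluate with CFD
```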
Abstract:
Porosity development in mesostructured colloidal silica nanoparticles is related to the removal of the organic templates and co-templates, which is often carried out by calcination at high temperatures (500-600 °C). In this study a mild detemplation method based on oxidative Fenton chemistry has been investigated. The Fenton reaction involves the generation of OH radicals via an Fe3+/Fe2+ redox cycle, with iron used as the catalyst and H2O2 as the oxidant source. Improved material properties are anticipated since the Fenton chemistry involves milder conditions than calcination. However, the general application of this methodology is not straightforward due to limitations in the hydrothermal stability of the particular system under study. The objective of this work is three-fold: 1) reducing the residual Fe in the resulting solid, as this can be detrimental for the application of the material; 2) shortening the reaction time by optimizing the reaction temperature, to minimize possible particle agglomeration; and 3) investigating the structural and textural properties of the resulting material in comparison to the calcined counterparts. It appears that the Fenton detemplation can be optimized by shortening the reaction time significantly at low Fe concentration. The milder detemplation conditions give rise to enhanced properties in terms of surface area, pore volume, structural preservation, low Fe residue and a high degree of surface hydroxylation; the colloidal particles are stable during storage. A relative particle size increase of 0.11%·h-1 has been determined.
Abstract:
Emergency managers are faced with critical evacuation decisions. These decisions must balance conflicting objectives as well as high levels of uncertainty. Multi-Attribute Utility Theory (MAUT) provides a framework through which trade-offs between objectives can be analyzed to make optimal evacuation decisions. This paper draws on data gathered during the European Commission project Evacuation Responsiveness by Government Organizations (ERGO) and outlines a preliminary decision model for the evacuation decision. The illustrative model identifies the levels of risk at which evacuation actions should be taken by emergency managers in a storm surge scenario with forecasts at 12- and 9-hour intervals. The results illustrate how differences in forecast precision affect the optimal evacuation decision. Additional uses for this decision model are also discussed, along with improvements to the model through future ERGO data gathering.
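The sketch below illustrates a generic additive MAUT comparison of two evacuation actions under surge uncertainty; the attribute weights, single-attribute utilities, and surge probability are assumptions for illustration, not ERGO data or the paper's model.

```python
# Illustrative additive MAUT: expected utility of "evacuate now" vs "wait".
weights = {"safety": 0.7, "cost": 0.3}           # assumed attribute weights

# Single-attribute utilities in [0, 1] for each action under surge / no-surge outcomes.
outcomes = {
    "evacuate_now": {"surge": {"safety": 0.95, "cost": 0.3},
                     "no_surge": {"safety": 1.0, "cost": 0.3}},
    "wait":         {"surge": {"safety": 0.40, "cost": 0.8},
                     "no_surge": {"safety": 1.0, "cost": 1.0}},
}
p_surge = 0.25                                   # assumed forecast probability of storm surge

def expected_utility(action):
    def u(state):
        return sum(weights[a] * outcomes[action][state][a] for a in weights)
    return p_surge * u("surge") + (1 - p_surge) * u("no_surge")

best = max(outcomes, key=expected_utility)
print({a: round(expected_utility(a), 3) for a in outcomes}, "->", best)
```

Raising p_surge (e.g., as a forecast horizon shortens and precision improves) shifts the preferred action toward evacuating, which is the kind of threshold behaviour such a decision model is intended to expose.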