905 results for Operations Research, Systems Engineering and Industrial Engineering


Relevance:

100.00%

Publisher:

Abstract:

The major barrier to practical optimization of pavement preservation programming has always been that, for formulations where the identity of individual projects is preserved, the solution space grows exponentially with problem size, to an extent where it can become unmanageable by traditional analytical optimization techniques within reasonable limits. This has been attributed to the problem of combinatorial explosion, that is, the exponential growth of the number of combinations. The relatively large number of constraints often present in real-life pavement preservation programming problems, and the trade-off considerations required between preventive maintenance, rehabilitation, and reconstruction, present yet another factor that contributes to the solution complexity. In this research study, a new integrated multi-year optimization procedure was developed to solve network-level pavement preservation programming problems through cost-effectiveness-based evolutionary programming analysis, using the Shuffled Complex Evolution (SCE) algorithm.

A case study problem was analyzed to illustrate the robustness and consistency of the SCE technique in solving network-level pavement preservation problems. The output from this program is a list of maintenance and rehabilitation (M&R) treatment strategies for each identified segment of the network in each programming year, and the impact on the overall performance of the network in terms of the performance levels of the recommended optimal M&R strategy.

The results show that the SCE is very efficient and consistent in the simultaneous consideration of the trade-offs between various pavement preservation strategies, while preserving the identity of the individual network segments. The flexibility of the technique is also demonstrated, in the sense that, by suitably coding the problem parameters, it can be used to solve several forms of pavement management programming problems.

It is recommended that for large networks some form of decomposition technique be applied to aggregate sections that exhibit similar performance characteristics into links, such that whatever M&R alternative is recommended for a link can be applied to all the sections connected to it. In this way the problem size, and hence the solution time, can be greatly reduced to a more manageable solution space. The study concludes that the robust search characteristics of SCE are well suited to the combinatorial problems in long-term network-level pavement M&R programming, and that this provides a rich area for future research.
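
The SCE algorithm itself is not reproduced in the abstract; as a rough illustration of its shuffle/complex structure, here is a minimal continuous-variable sketch. The inner step is a single centroid reflection (a simplification of the full competitive complex evolution step), and the objective, bounds, and parameter values are all illustrative, not from the study.

```python
import random

def sce_minimize(f, bounds, n_complexes=4, complex_size=8, n_shuffles=100, seed=1):
    """Minimal sketch of Shuffled Complex Evolution: rank the population,
    deal it into complexes, evolve each complex, then shuffle back."""
    rng = random.Random(seed)
    dim = len(bounds)
    rand_point = lambda: [rng.uniform(lo, hi) for lo, hi in bounds]
    pop = [rand_point() for _ in range(n_complexes * complex_size)]
    for _ in range(n_shuffles):
        pop.sort(key=f)                                  # rank by objective value
        # deal ranked points into complexes round-robin (the "shuffle")
        complexes = [pop[k::n_complexes] for k in range(n_complexes)]
        for cx in complexes:
            cx.sort(key=f)
            worst = cx[-1]
            centroid = [sum(p[i] for p in cx[:-1]) / (len(cx) - 1)
                        for i in range(dim)]
            # reflect the worst point through the centroid, clipped to bounds
            cand = [min(max(2 * centroid[i] - worst[i], bounds[i][0]),
                        bounds[i][1]) for i in range(dim)]
            # keep the reflection if it improves; otherwise inject a random
            # point for diversity (the best point is never replaced)
            cx[-1] = cand if f(cand) < f(worst) else rand_point()
        pop = [p for cx in complexes for p in cx]
    return min(pop, key=f)

best = sce_minimize(lambda x: (x[0] - 3) ** 2 + (x[1] + 1) ** 2,
                    [(-10, 10), (-10, 10)])
```

Because only the worst member of each complex is ever replaced, the incumbent best solution is preserved across shuffles, which is one source of the consistency the abstract reports.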

Relevance:

100.00%

Publisher:

Abstract:

A heuristic for batching orders in a manual order-picking warehouse has been developed. It prioritizes orders based on due time, preventing the mixing of orders of different priority levels, and uses the order density of aisles criterion to form batches. It also determines the number of pickers required and assigns batches to pickers such that the workload per unit of time is uniform. The effectiveness of the heuristic was studied by observing computational time and aisle congestion for various numbers of total orders and numbers of orders per batch. An initial heuristic performed well for small numbers of orders; for larger numbers of orders, a partitioning technique is computationally more efficient, needing only minutes to solve for thousands of orders while preserving 90% of the batch quality obtained with the original heuristic. Comparative studies between this heuristic and other published heuristics are still needed.
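
As an illustration of how due-time prioritization and an aisle-density criterion might interact, here is a toy sketch. The field names (`priority`, `due`, `aisles`) and the greedy aisle-overlap rule are assumptions for illustration, not the heuristic developed in the study.

```python
from collections import defaultdict

def form_batches(orders, batch_capacity):
    """Partition orders by priority level so batches never mix priorities;
    within a level, the earliest-due order seeds a batch, which is grown
    with the orders sharing the most aisles with it (an aisle-density
    criterion)."""
    by_priority = defaultdict(list)
    for o in orders:
        by_priority[o["priority"]].append(o)
    batches = []
    for prio in sorted(by_priority):            # most urgent priority class first
        pool = sorted(by_priority[prio], key=lambda o: o["due"])
        while pool:
            batch = [pool.pop(0)]               # earliest due time seeds the batch
            batch_aisles = set(batch[0]["aisles"])
            while pool and len(batch) < batch_capacity:
                # greedy aisle-density step: maximize aisle overlap with the batch
                best = max(pool, key=lambda o: len(batch_aisles & set(o["aisles"])))
                pool.remove(best)
                batch.append(best)
                batch_aisles |= set(best["aisles"])
            batches.append(batch)
    return batches

orders = [
    {"id": 1, "priority": 1, "due": 9, "aisles": [1, 2]},
    {"id": 2, "priority": 1, "due": 8, "aisles": [2, 3]},
    {"id": 3, "priority": 2, "due": 1, "aisles": [1]},
]
batches = form_batches(orders, batch_capacity=2)
```

Keeping batches aisle-dense shortens picker travel, while the priority partition guarantees that a late-due order is never delayed by being batched with urgent ones.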

Relevance:

100.00%

Publisher:

Abstract:

This research is based on the premises that teams can be designed to optimize their performance, and that appropriate team coordination is a significant factor in team outcome performance. Contingency theory argues that the effectiveness of a team depends on the right fit of the team design factors to the particular job at hand. Therefore, organizations need computational tools capable of predicting the performance of different team configurations. This research created an agent-based model of teams called the Team Coordination Model (TCM). The TCM estimates the coordination load and performance of a team based on its composition, coordination mechanisms, and the job's structural characteristics, and can be used to determine the team design characteristics most likely to lead the team to optimal performance. The TCM is implemented as an agent-based discrete-event simulation application built using Java and the Cybele Pro agent architecture. The model implements the effect of individual team design factors on team processes, but the resulting performance emerges from the behavior of the agents. These team-member agents use decision making, and explicit and implicit mechanisms, to coordinate the job. The model validation included the comparison of the TCM's results with statistics from a real team and with the results predicted by the team performance literature. An illustrative 2^(6-1) fractional factorial experimental design demonstrates the application of the simulation model to the design of a team. The results of the ANOVA were used to recommend the combination of levels of the experimental factors that optimizes the completion time for a team that races sailboats. This research's main contribution to the team modeling literature is a model capable of simulating teams working in complex job environments. The TCM implements a stochastic job structure model capable of capturing some of the complexity not captured by current models.

In a stochastic job structure, the tasks required to complete the job change as the team executes the job. This research proposed three new types of dependencies between tasks required to model a job as a stochastic structure: conditional sequential, single-conditional sequential, and merge dependencies.
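
A 2^(6-1) fractional factorial design like the one mentioned above can be generated mechanically. This sketch uses the standard generator F = ABCDE (defining relation I = ABCDEF, resolution VI); the study's actual factor assignments are not given here.

```python
from itertools import product

def fractional_factorial_2_6_1():
    """Generate the 32 runs of a 2^(6-1) design: five base factors take all
    +/-1 combinations and the sixth column is aliased via F = ABCDE."""
    runs = []
    for a, b, c, d, e in product((-1, 1), repeat=5):
        f = a * b * c * d * e          # generator column
        runs.append((a, b, c, d, e, f))
    return runs

design = fractional_factorial_2_6_1()
```

Halving the full 2^6 design to 32 runs keeps main effects unconfounded with any interaction below fifth order, which is why such designs suit screening simulation experiments.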

Relevance:

100.00%

Publisher:

Abstract:

This dissertation delivers a framework to diagnose the Bull-Whip Effect (BWE) in supply chains and then identify methods to minimize it. Such a framework is needed because in spite of the significant amount of literature discussing the bull-whip effect, many companies continue to experience the wide variations in demand that are indicative of the bull-whip effect. While the theory and knowledge of the bull-whip effect is well established, there still is the lack of an engineering framework and method to systematically identify the problem, diagnose its causes, and identify remedies. ^ The present work seeks to fill this gap by providing a holistic, systems perspective to bull-whip identification and diagnosis. The framework employs the SCOR reference model to examine the supply chain processes with a baseline measure of demand amplification. Then, research of the supply chain structural and behavioral features is conducted by means of the system dynamics modeling method. ^ The contribution of the diagnostic framework, is called Demand Amplification Protocol (DAMP), relies not only on the improvement of existent methods but also contributes with original developments introduced to accomplish successful diagnosis. DAMP contributes a comprehensive methodology that captures the dynamic complexities of supply chain processes. The method also contributes a BWE measurement method that is suitable for actual supply chains because of its low data requirements, and introduces a BWE scorecard for relating established causes to a central BWE metric. In addition, the dissertation makes a methodological contribution to the analysis of system dynamic models with a technique for statistical screening called SS-Opt, which determines the inputs with the greatest impact on the bull-whip effect by means of perturbation analysis and subsequent multivariate optimization. 
The dissertation describes the implementation of the DAMP framework in an actual case study that exposes the approach, analysis, results and conclusions. The case study suggests a balanced solution between costs and demand amplification can better serve both firms and supply chain interests. Insights pinpoint to supplier network redesign, postponement in manufacturing operations and collaborative forecasting agreements with main distributors.^
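
The dissertation's exact BWE metric is not reproduced in the abstract; as a sketch, one widely used textbook measure of demand amplification is the ratio of the squared coefficients of variation of orders and demand, which this toy example computes on made-up data.

```python
from statistics import mean, pvariance

def bullwhip_ratio(demand, orders):
    """Ratio of the squared coefficient of variation of orders placed
    upstream to that of demand received; a value above 1 indicates
    demand amplification (the bullwhip effect)."""
    cv2 = lambda xs: pvariance(xs) / mean(xs) ** 2   # squared coeff. of variation
    return cv2(orders) / cv2(demand)

demand = [100, 102, 98, 101, 99, 100]    # fairly stable end-customer demand
orders = [90, 115, 80, 120, 85, 110]     # amplified upstream ordering
```

Normalizing each variance by its mean keeps the measure comparable across echelons with different volumes, one reason this family of metrics has low data requirements.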

Relevance:

100.00%

Publisher:

Abstract:

This research is motivated by the need to consider lot sizing while accepting customer orders in a make-to-order (MTO) environment, in which each customer order must be delivered by its due date. The job shop is the typical operation model used in an MTO operation, where the production planner must make three concurrent decisions: order selection, lot sizing, and job scheduling. These decisions are usually treated separately in the literature and mostly addressed with heuristic solutions. The first phase of the study focuses on a formal definition of the problem. Mathematical programming techniques are applied to model the problem in terms of its objective, decision variables, and constraints. A commercial solver, CPLEX, is applied to small instances of the resulting mixed-integer linear programming model to validate the mathematical formulation. The computational results show that solving problems of industrial size with a commercial solver is not practical. The second phase of the study focuses on the development of an effective solution approach to the large-scale problem. The proposed approach is an iterative process involving three sequential decision steps: order selection, lot sizing, and lot scheduling. A range of simple sequencing rules are identified for each of the three subproblems. Using computer simulation, an experiment is designed to evaluate their performance against a set of system parameters. For order selection, the proposed weighted most-profit rule performs best. The shifting bottleneck and earliest operation finish time rules are both the best scheduling rules. For lot sizing, the proposed minimum cost increase heuristic, based on the Dixon-Silver method, performs best when the demand-to-capacity ratio at the bottleneck machine is high; the proposed minimum cost heuristic, based on the Wagner-Whitin algorithm, is the best lot-sizing heuristic for shops with a low demand-to-capacity ratio.

The proposed heuristic is applied to an industrial case to further evaluate its performance. The results show it can improve total profit by an average of 16.62%. This research contributes to the production planning research community a complete mathematical definition of the problem and an effective solution approach for solving it at industry scale.
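
The Wagner-Whitin algorithm underlying the proposed minimum cost heuristic is a standard dynamic program for uncapacitated single-item lot sizing; here is a minimal sketch on made-up data (the study's heuristic builds on, but is not identical to, this recursion).

```python
def wagner_whitin(demand, setup_cost, holding_cost):
    """cost[t] is the minimum cost of covering periods 1..t, where a lot
    produced in period j covers demand for periods j..t and pays holding
    cost for every period each unit is carried."""
    n = len(demand)
    cost = [0.0] + [float("inf")] * n     # cost[t]: optimal cost through period t
    choice = [0] * (n + 1)                # period in which the last lot is placed
    for t in range(1, n + 1):
        for j in range(1, t + 1):         # try placing the last lot in period j
            carry = sum(holding_cost * (k - j) * demand[k - 1]
                        for k in range(j, t + 1))
            c = cost[j - 1] + setup_cost + carry
            if c < cost[t]:
                cost[t], choice[t] = c, j
    return cost[n], choice

total, _ = wagner_whitin([20, 50, 10, 50, 50], setup_cost=100, holding_cost=1)
```

The recursion trades setup cost against holding cost: batching periods together saves setups but accumulates carrying charges, exactly the tension the lot-sizing heuristics above must resolve under capacity limits.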

Relevance:

100.00%

Publisher:

Abstract:

Enterprise Resource Planning (ERP) systems are software programs designed to integrate the functional requirements and operational information needs of a business. Pressures of competition and entry standards for participation in major manufacturing supply chains are creating greater demand for small-business ERP systems. The proliferation of new ERP offerings adds complexity to the process of identifying the right ERP business software for a small or medium-sized enterprise (SME). ERP selection is a process in which a faulty conclusion poses a significant risk of failure to SMEs. The literature reveals that failure rates in ERP implementation remain very high and that faulty selection processes contribute to them; however, the literature lacks a systematic methodology for ERP selection by SMEs. This study provides a methodological approach to selecting the right ERP system for a small or medium-sized enterprise. The study employs Thomann's meta-methodology for methodology development; a survey of SMEs is conducted to inform the development of the methodology, and a case study is employed to test and revise the new methodology. The study shows that a rigorously developed, effective methodology that includes benchmarking experiences has been developed and successfully employed. It is verified that the methodology may be applied to the domain of users it was developed to serve, and the test results are validated by expert users and stakeholders. Future research should investigate in greater detail the application of meta-methodologies to supplier selection and evaluation processes for services and software; additional research into the purchasing practices of small firms is also clearly needed.

Relevance:

100.00%

Publisher:

Abstract:

Over the past few decades, we have enjoyed tremendous benefits from the revolutionary advancement of computing systems, driven mainly by remarkable semiconductor technology scaling and increasingly sophisticated processor architectures. However, the exponentially increased transistor density has directly led to exponentially increased power consumption and dramatically elevated system temperatures, which not only adversely impact a system's cost, performance, and reliability, but also increase leakage and thus overall power consumption. Today, power and thermal issues pose enormous challenges and threaten to slow the continued evolution of computer technology. Effective power- and thermal-aware design techniques are urgently needed at all design abstraction levels, from the circuit level and logic level to the architectural and system levels.

In this dissertation, we present our research on employing real-time scheduling techniques to solve resource-constrained, power/thermal-aware design-optimization problems. We developed a set of simple yet accurate system-level models to capture the processor's thermal dynamics as well as the interdependency of leakage power consumption, temperature, and supply voltage. Based on these models, we investigated the fundamental principles of power/thermal-aware scheduling and developed real-time scheduling techniques targeting a variety of design objectives, including peak temperature minimization, overall energy reduction, and performance maximization.

The novelty of this work is that we integrate cutting-edge circuit- and architectural-level research on power and thermal behavior into a set of accurate yet simplified system-level models, and are able to conduct system-level analysis and design based on them. The theoretical study in this work serves as a solid foundation for the development of power/thermal-aware scheduling algorithms in practical computing systems.
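
As a toy illustration of the leakage/temperature interdependency that such system-level models capture, here is a fixed-point sketch with a linearized leakage term and made-up coefficients; it is not the dissertation's model.

```python
def steady_temperature(p_dyn, t_amb, r_th, leak_coeff, iters=50):
    """Iterate the leakage/temperature loop to steady state under a lumped
    thermal-resistance model:
        T = T_amb + R_th * (P_dyn + P_leak(T)),
    with leakage power modeled as growing linearly with temperature.
    The iteration converges when r_th * leak_coeff < 1."""
    t = t_amb
    for _ in range(iters):
        p_leak = leak_coeff * t          # linearized leakage-vs-temperature term
        t = t_amb + r_th * (p_dyn + p_leak)
    return t

# 30 W dynamic power, 45 C ambient, 0.5 C/W thermal resistance (made-up values)
t_ss = steady_temperature(p_dyn=30.0, t_amb=45.0, r_th=0.5, leak_coeff=0.1)
```

The loop makes the positive feedback explicit: higher temperature raises leakage, which raises power and hence temperature again, which is why ignoring the interdependency underestimates both peak temperature and energy.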

Relevance:

100.00%

Publisher:

Abstract:

This research focuses on developing a capacity planning methodology for emerging concurrent engineer-to-order (ETO) operations, with primary focus on capacity planning at the sales stage. The study examines the characteristics of capacity planning in a concurrent ETO environment, models the problem analytically, and proposes a practical capacity planning methodology for concurrent ETO operations in industry. A computer program that mimics a concurrent ETO operation environment was written to validate the proposed methodology and to test a set of rules that affect the performance of a concurrent ETO operation.

The study takes a systems engineering approach, employing systems engineering concepts and tools to model and analyze the problem and to develop a practical solution. It depicts a concurrent ETO environment in which capacity is planned. The capacity planning problem is modeled as a mixed-integer program and solved for smaller-sized applications to evaluate its validity and solution complexity. The objective is to select the best set of available jobs to maximize profit while having sufficient capacity to meet each due-date expectation.

The nature of capacity planning for concurrent ETO operations differs from that of other operation modes, and the search for an effective solution has been an emerging research field. The study characterizes the problem and proposes a solution approach: a mathematical model that relates work requirements to capacity over the planning horizon, with a methodology proposed for solving industry-scale problems. Along with the capacity planning methodology, a set of heuristic rules was evaluated for improving concurrent ETO planning.
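
In its simplest single-period form, the selection objective described above resembles a 0/1 knapsack; this toy sketch collapses the multi-period mixed-integer program to one aggregate capacity (illustrative data, not from the study).

```python
def select_jobs(jobs, capacity):
    """Pick the subset of candidate jobs maximizing total profit within a
    capacity limit, via 0/1 knapsack dynamic programming."""
    best = [0] * (capacity + 1)          # best[c]: max profit using <= c capacity
    for profit, load in jobs:
        for c in range(capacity, load - 1, -1):   # iterate downward: 0/1, not unbounded
            best[c] = max(best[c], best[c - load] + profit)
    return best[capacity]

# three candidate jobs as (profit, capacity requirement) pairs -- made-up data
max_profit = select_jobs([(60, 10), (100, 20), (120, 30)], capacity=50)
```

The full sales-stage model adds per-period capacities and due dates, which is what pushes it into mixed-integer programming territory and motivates the heuristic rules evaluated in the study.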

Relevance:

100.00%

Publisher:

Abstract:

Biofouling, the accumulation of biomolecules, cells, organisms, and their deposits on submerged and implanted surfaces, is a ubiquitous problem across human endeavors including maritime operations, medicine, the food industry, and biotechnology. For several decades there have been substantial research efforts toward developing antifouling and fouling-release approaches to control bioaccumulation on man-made surfaces. In this work we hypothesized, investigated, and developed dynamic change of the surface area and topology of elastomers as a general approach to biofouling management. Further, we combined dynamic surface deformation of elastomers with existing antifouling and fouling-release approaches to develop multifunctional, proactive biofouling control strategies.

This research focused on developing fundamental, new, and environmentally friendly approaches to biofouling management, with emphasis on marine model systems and applications, while also providing fundamental insights into the control of infectious biofilms on biomedical devices. We used different methods (mechanical stretching, electrical actuation, and pneumatic actuation) to generate dynamic deformation of elastomer surfaces. Our initial studies showed that dynamic surface deformation is effective in detaching laboratory-grown bacterial biofilms and barnacles. Further systematic studies revealed that a threshold critical surface strain is required to debond a biofilm from the surface, and that this critical strain depends on the biofilm's mechanical properties, including adhesion energy, thickness, and modulus. To test the dynamic surface deformation approach in a natural environment, we conducted field studies (at Beaufort, NC) in natural seawater using pneumatic actuation of a silicone elastomer. The field studies confirmed that a critical substrate strain is needed to detach natural biofilm accumulated in seawater, and their results suggested that substrate modulus also affects the critical strain needed to debond biofilms. In sum, both the laboratory and field studies demonstrated that dynamic surface deformation can effectively detach various biofilms and barnacles, and therefore offers a non-toxic, environmentally friendly approach to biofouling management.

The deformable elastomer systems used in our studies are easy to fabricate and can be used as a complementary approach to existing commercial strategies for biofouling control. To this end, we aimed to develop proactive multifunctional surfaces and proposed two approaches: (i) modification of elastomers with antifouling polymers to produce multifunctional surfaces, and (ii) incorporation of silicone-oil additives into the elastomer to enhance fouling-release performance.

In approach (i), we modified poly(vinylmethylsiloxane) elastomer surfaces with zwitterionic polymers using thiol-ene click chemistry and controlled free-radical polymerization. These surfaces exhibited both fouling-resistance and triggered fouling-release functionalities. The zwitterionic polymers resisted fouling over short-term (~hours) exposure to bacteria and barnacle cyprids, and the biofilms that eventually accumulated over prolonged exposure (~days) were easily detached by applying mechanical strain to the elastomer substrate. In approach (ii), we incorporated silicone-oil additives into the deformable elastomer and studied the synergistic effect of silicone oils and surface strain on barnacle detachment. We hypothesized that incorporating a silicone-oil additive reduces the surface strain needed to detach barnacles. Our experimental results supported this hypothesis and suggested that the surface action of silicone oils plays a major role in decreasing the required strain. We also examined the effect of a change in substrate modulus and showed that stiffer substrates require less strain to detach barnacles.

In summary, this study shows that (1) dynamic surface deformation can be used as an effective, environmentally friendly approach to biofouling control; (2) stretchable elastomer surfaces modified with antifouling polymers provide a proactive, dual-mode approach to biofouling control; and (3) incorporation of silicone-oil additives into stretchable elastomers improves the fouling-release performance of dynamic surface deformation technology. Dynamic surface deformation, by itself and as a supplementary approach, can be utilized for biofouling management in biomedical, industrial, and marine applications.

Relevance:

100.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance:

100.00%

Publisher:

Abstract:

The purpose of the audit is to provide management with a report on the adequacy of the controls employed to manage this work area. Where appropriate, recommendations and comments are provided for management's consideration. Consideration is given to compliance with applicable policies and procedures, both federal and state. Economy and efficiency of operations are considered to the degree feasible, but are not primary objectives.

Relevance:

100.00%

Publisher:

Abstract:

Social-ecological systems are often highly complex, making effective governance a considerable challenge. In large, heterogeneous systems, hierarchical institutional regimes may be efficient, but effective management outcomes are dependent on stakeholder support. This support is shaped by perceptions of legitimacy, which risks being undermined where resource users are not engaged in decision-making. Although legitimacy is demonstrably critical for effective governance, less is known about the factors contributing to stakeholders’ perceptions of legitimacy or how these perceptions are socially differentiated. We quantitatively assessed stakeholder perceptions of legitimacy (indicated by support for rules) and their contributory factors among 307 commercial fishers and tourism operators in Australia’s Great Barrier Reef Marine Park. Legitimacy was most strongly associated with trust in information from governing bodies, followed by confidence in institutional performance and the equity of management outcomes. Legitimacy differed both within and among resource user groups, which emphasizes the heterogeneous nature of commonly defined stakeholder groups. Overall, tourism operators perceived higher legitimacy than did commercial fishers, which was associated with higher trust in information from management agencies. For fishers, higher levels of trust were associated with: (1) engagement in fisheries that had high subsector cohesion and positive previous experiences of interactions with governing bodies; (2) location in areas with greater proximity to sources of knowledge, resources, and decision-making; and (3) engagement in a Reef Guardian program. These findings highlight the necessity of strategies and processes to build trust among all user groups in large social-ecological systems such as the Great Barrier Reef Marine Park. 
Furthermore, the social differentiation of perceptions observed within user groups underscores the importance of targeted strategies to engage groups that may not be heard through traditional governance channels.

Relevance:

100.00%

Publisher:

Abstract:

Design of geotechnical systems is often challenging, as it requires an understanding of complex soil behaviour and its influence on the field-scale performance of geo-structures. To advance scientific knowledge and technological development in geotechnical engineering, a Scottish academic community named the Scottish Universities Geotechnics Network (SUGN) was established in 2001, comprising eight higher education institutions. The network gathers geotechnics researchers, including experimentalists as well as centrifuge, constitutive, and numerical modellers, to generate synergies for building larger collaborations and wider research dissemination in and beyond Scotland. The paper highlights the research excellence and leading work undertaken in SUGN, emphasising some of its contributions to the geotechnical research community and some of its significant research outcomes.

Relevance:

100.00%

Publisher:

Abstract:

With global markets and global competition, pressures are placed on manufacturing organizations to compress order fulfillment times, meet delivery commitments consistently, and maintain efficient operations to address cost issues. This chapter argues for a process perspective on planning, scheduling, and control that integrates organizational planning structures, information systems, and human decision makers. The chapter begins by reconsidering the gap between theory and practice, in particular for classical scheduling theory and hierarchical production planning and control. A number of key studies of industrial practice are then described and their implications noted. A recent model of scheduling practice, derived from a detailed study of real businesses, is described. Socio-technical concepts are then introduced and their implications for the design and management of planning, scheduling, and control systems are discussed. The implications of adopting a process perspective are noted, along with insights from knowledge management. An overview is presented of a methodology for the (re-)design of planning, scheduling, and control systems that integrates organizational, system, and human perspectives. The chapter closes by summarizing its most important messages.

Relevance:

100.00%

Publisher:

Abstract:

Part 6: Engineering and Implementation of Collaborative Networks