16 results for cut-to-length operations

at Digital Commons at Florida International University


Relevance:

100.00%

Publisher:

Abstract:

This research addresses the problem of cost estimation for product development in engineer-to-order (ETO) operations. An ETO operation starts the product development process with a product specification and ends with delivery of a complicated, highly customized product. ETO operations are practiced in various industries such as engineering tooling, factory plants, industrial boilers, pressure vessels, shipbuilding, bridges, and buildings. ETO views each product as a delivery item in an industrial project and needs an accurate estimate of its development cost at the bidding and/or planning stage, before any design or manufacturing activity starts.

Many ETO practitioners rely on an ad hoc approach to cost estimation, using past projects as references and adapting them to the new requirements. This process is often carried out on a case-by-case basis and in a non-procedural fashion, limiting its applicability to other industry domains and its transferability to other estimators. In addition to being time consuming, this approach usually does not lead to an accurate cost estimate; errors typically range from 30% to 50%.

This research proposes a generic cost modeling methodology for application in ETO operations across various industry domains. Using the proposed methodology, a cost estimator can develop a cost estimation model for a chosen ETO industry in a more expeditious, systematic, and accurate manner.

The development of the proposed methodology followed the meta-methodology outlined by Thomann. Deploying the methodology, cost estimation models were created in two industry domains: building construction and steel milling equipment manufacturing. The models were then applied to real cases; the resulting cost estimates were significantly more accurate than the estimates actually made for those projects, with a mean absolute error rate of 17.3%.

This research fills an important need for quick and accurate cost estimation across various ETO industries. It differs from existing approaches in that a methodology is developed for quickly customizing a cost estimation model to a chosen application domain. In addition to more accurate estimation, its major contributions are its transferability to other users and its applicability to different ETO operations.
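The reported accuracy figure can be read as a mean absolute error rate. A minimal sketch of that metric, using illustrative numbers rather than the study's data:

```python
# Mean absolute error rate of cost estimates against actual costs.
# The sample values below are illustrative, not data from the study.
def mean_abs_error_rate(estimates, actuals):
    return sum(abs(e - a) / a for e, a in zip(estimates, actuals)) / len(actuals)

# Two hypothetical projects: one over-estimated by 10%, one under-estimated by 5%.
print(round(mean_abs_error_rate([110, 95], [100, 100]), 3))  # → 0.075
```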

Relevance:

100.00%

Publisher:

Abstract:

This research focuses on developing a capacity planning methodology for emerging concurrent engineer-to-order (ETO) operations, with primary focus on capacity planning at the sales stage. This study examines the characteristics of capacity planning in a concurrent ETO operation environment, models the problem analytically, and proposes a practical capacity planning methodology for concurrent ETO operations in industry. A computer program that mimics a concurrent ETO operation environment was written to validate the proposed methodology and to test a set of rules that affect the performance of a concurrent ETO operation.

This study takes a systems engineering approach to the problem, employing systems engineering concepts and tools for modeling and analyzing the problem as well as for developing a practical solution. It depicts a concurrent ETO environment in which capacity is planned. The capacity planning problem is modeled as a mixed-integer program and then solved for smaller-sized applications to evaluate its validity and solution complexity. The objective is to select the best set of available jobs to maximize profit while retaining sufficient capacity to meet each due date expectation.

The nature of capacity planning for concurrent ETO operations differs from other operation modes, and the search for an effective solution to this problem has been an emerging research field. This study characterizes the capacity planning problem and proposes a solution approach: a mathematical model that relates work requirements to capacity over the planning horizon. The methodology is proposed for solving industry-scale problems. Along with the capacity planning methodology, a set of heuristic rules was evaluated for improving concurrent ETO planning.
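A toy version of the job-selection objective can make the formulation concrete. The sketch below brute-forces the choice over a handful of hypothetical jobs with a single aggregate resource and uniform per-period capacity, which is a deliberate simplification of the mixed-integer program described above:

```python
from itertools import combinations

# Hypothetical jobs: (name, profit, total work required, due period).
jobs = [("A", 50, 30, 2), ("B", 70, 45, 3), ("C", 40, 25, 2), ("D", 90, 60, 3)]
CAPACITY_PER_PERIOD = 50  # assumed uniform capacity over the planning horizon

def feasible(subset):
    """Cumulative work due by each period must fit within cumulative capacity."""
    horizon = max(j[3] for j in jobs)
    for t in range(1, horizon + 1):
        if sum(j[2] for j in subset if j[3] <= t) > CAPACITY_PER_PERIOD * t:
            return False
    return True

# Enumerate all job subsets and keep the most profitable feasible one.
best = max(
    (s for r in range(len(jobs) + 1) for s in combinations(jobs, r) if feasible(s)),
    key=lambda s: sum(j[1] for j in s),
)
print([j[0] for j in best], sum(j[1] for j in best))  # → ['A', 'B', 'D'] 210
```

An industry-scale model would replace this enumeration with an integer-programming solver, but the objective and the capacity-versus-due-date constraint are the same.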

Relevance:

100.00%

Publisher:

Abstract:

The increasing emphasis on mass customization, shortened product lifecycles, and synchronized supply chains, coupled with advances in information systems, is driving most firms toward make-to-order (MTO) operations. Increasing global competition, lower profit margins, and higher customer expectations force MTO firms to plan their capacity by managing effective demand. The goal of this research was to maximize the operational profits of a make-to-order operation by selectively accepting incoming customer orders and simultaneously allocating capacity for them at the sales stage.

To integrate the two decisions, a Mixed-Integer Linear Program (MILP) was formulated to aid an operations manager in an MTO environment in selecting a set of potential customer orders such that all selected orders are fulfilled by their deadlines. The proposed model combines the order acceptance/rejection decision with detailed scheduling. Experiments with the formulation indicate that for larger problem sizes, the computational time required to determine an optimal solution is prohibitive. The formulation has a block-diagonal structure and can be decomposed into one or more sub-problems (one per customer order) and a master problem by applying Dantzig-Wolfe decomposition principles. To solve the original MILP efficiently, an exact Branch-and-Price algorithm was developed, and various approximation algorithms were developed to further improve the runtime. Experiments show the efficiency of these algorithms compared to a commercial optimization solver.

The existing literature addresses the static order acceptance problem for a single-machine environment with regular capacity, with an objective of maximizing profit under a penalty for tardiness. This dissertation solves the order acceptance and capacity planning problem for a job shop environment with multiple resources, considering both regular and overtime capacity. The Branch-and-Price algorithms developed in this dissertation are fast enough to be incorporated in a decision support system that can be used on a daily basis to help make intelligent decisions in an MTO operation.
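As a concrete (and much simplified) illustration of the coupled decision, the sketch below brute-forces order acceptance on a single machine, where an accepted set is feasible exactly when earliest-due-date sequencing meets every deadline; the order data are hypothetical:

```python
from itertools import combinations

# Hypothetical orders: (name, profit, processing time, deadline).
orders = [("O1", 60, 4, 6), ("O2", 80, 5, 9), ("O3", 30, 3, 5), ("O4", 50, 6, 12)]

def on_time(subset):
    """EDD sequencing is optimal for checking deadline feasibility on one machine."""
    t = 0
    for _, _, proc, due in sorted(subset, key=lambda o: o[3]):
        t += proc
        if t > due:
            return False
    return True

# Accept the most profitable subset whose orders can all finish by their deadlines.
best = max(
    (s for r in range(len(orders) + 1) for s in combinations(orders, r) if on_time(s)),
    key=lambda s: sum(o[1] for o in s),
)
print([o[0] for o in best], sum(o[1] for o in best))  # → ['O1', 'O2'] 140
```

The dissertation's job-shop setting with multiple resources and overtime makes the feasibility check far harder, which is what motivates the Dantzig-Wolfe decomposition and Branch-and-Price machinery described above.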


Relevance:

100.00%

Publisher:

Abstract:

This research examined the factors contributing to the performance of online grocers prior to, and following, the 2000 dot-com collapse. The primary goals were to assess the relationship between a company's business model(s) and its performance in the online grocery channel, and to determine whether other company- and/or market-related factors could account for company performance.

To assess the primary goals, a case-based theory building process was utilized. A three-way cross-case analysis comprising Peapod, GroceryWorks, and Tesco examined the common profit components, the structural category (e.g., pure-play, partnership, and hybrid) profit components, and the idiosyncratic profit components related to each specific company.

Based on the analysis, it was determined that online grocery business models can be represented at three distinct but hierarchically related levels. The first level, termed the core model, represents the basic profit structure that all online grocers need in order to conduct operations. The next level, termed the structural model, represents the profit structure associated with the specific business model configuration (i.e., pure-play, partnership, or hybrid). The last level, termed the augmented model, represents the company's business model when idiosyncratic profit components are included. Of the five company-related factors, scalability, rate of expansion, and automation level were potential candidates for explaining online grocer performance; in addition, all of the market structure factors were deemed possible candidates.

The study concluded by positing an alternative hypothesis concerning the performance of online grocers. Prior to this study, the prevailing wisdom was that business models were the primary cause of online grocer performance. However, based on the core model analysis, it was hypothesized that customer relationship activities (i.e., advertising, promotions, and loyalty program tie-ins) were the real drivers of online grocer performance.

Relevance:

100.00%

Publisher:

Abstract:

Much potential for growth in hospitality firms exists in foreign countries, but expansion abroad typically bears additional risks that could be detrimental to operations. The authors explore those risks, namely currency exchange risk and country risk, and offer practical techniques to assess, manage, control, and reduce them. Deriving benefits from global opportunities requires effective management of these areas.

Relevance:

100.00%

Publisher:

Abstract:

Variable Speed Limit (VSL) strategies identify and disseminate dynamic speed limits that are determined to be appropriate based on prevailing traffic, road surface, and weather conditions. This dissertation develops and evaluates a shockwave-based VSL system that uses a heuristic switching-logic controller with specified thresholds of prevailing traffic flow conditions. The system aims to improve operations and mobility at critical bottlenecks. Before traffic breakdown occurs, the proposed VSL's goal is to prevent or postpone breakdown by decreasing the inflow and achieving a uniform distribution of speed and flow. After breakdown occurs, the VSL system aims to dampen traffic congestion by reducing the inflow to the congested area and increasing the bottleneck capacity by deactivating the VSL at the head of the congested area. The shockwave-based VSL system pushes the VSL location upstream as the congested area propagates upstream.

In addition to testing the system using infrastructure detector-based data, this dissertation investigates the use of Connected Vehicle trajectory data as input to the shockwave-based VSL system. Since field Connected Vehicle data are not available, Vehicle-to-Infrastructure communication was modeled in microscopic simulation to obtain individual vehicle trajectories. In this system, a wavelet transform is used to analyze aggregated individual-vehicle speed data to determine the locations of congestion.

The currently recommended calibration procedures for simulation models are generally based on capacity, volume, and system-performance values and do not specifically examine traffic breakdown characteristics. However, since the proposed VSL strategies are countermeasures to the impacts of breakdown conditions, considering breakdown characteristics in the calibration procedure is important for a reliable assessment. Several enhancements were proposed in this study to account for breakdown characteristics at bottleneck locations in the calibration process. Using the calibrated microscopic model, the performance of the shockwave-based VSL is compared to VSL systems with fixed VSL message sign locations. The results show that the shockwave-based VSL outperforms fixed-location VSL systems and can considerably decrease the maximum back of queue and the duration of breakdown while increasing the average speed during breakdown.
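The heuristic switching logic can be pictured as a small threshold controller. The thresholds and advisory speeds below are illustrative assumptions, not the calibrated values from the dissertation:

```python
# A minimal sketch of heuristic switching logic for a VSL controller.
# Occupancy/speed thresholds and advisory speeds are illustrative assumptions.
def vsl_speed(occupancy_pct, avg_speed_mph, posted_limit=70):
    """Return the advisory speed limit for a sign upstream of a bottleneck."""
    if occupancy_pct > 25 or avg_speed_mph < 30:  # breakdown: strong reduction
        return 40
    if occupancy_pct > 18 or avg_speed_mph < 45:  # near-breakdown: moderate reduction
        return 55
    return posted_limit                           # free flow: no restriction

print(vsl_speed(10, 65), vsl_speed(20, 50), vsl_speed(30, 25))  # → 70 55 40
```

In the shockwave-based scheme described above, this decision would be re-evaluated at each sign location as the congested region, and hence the active VSL location, moves upstream.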


Relevance:

100.00%

Publisher:

Abstract:

The Pleistocene carbonate rock Biscayne Aquifer of south Florida contains laterally extensive bioturbated oolitic zones characterized by interconnected touching-vug megapores that channelize most flow and make the aquifer extremely permeable. Standard petrophysical laboratory techniques may not be capable of accurately measuring such high permeabilities, so innovative procedures that can measure high permeabilities were applied. These fragile rocks cannot easily be cored or cut to shapes convenient for permeability measurements; instead, for the laboratory measurement, a 3D epoxy-resin printed rock core was produced from computed tomography data obtained from an outcrop sample. Permeability measurements were conducted using a viscous fluid, which permits easily observable head gradients (~2 cm over 1 m) simultaneously with low-Reynolds-number flow. For a second, independent permeability estimate, Lattice Boltzmann Method flow simulations were computed on the 3D core renderings. Agreement between the two estimates indicates that an accurate permeability was obtained that can be applied in future studies.
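For reference, the laboratory measurement rests on Darcy's law: with the head drop expressed in meters of fluid, permeability follows as k = Q·μ·L / (A·ρ·g·Δh). A minimal sketch, with illustrative numbers rather than the study's measurements:

```python
# Back-calculate permeability from Darcy's law, k = Q*mu*L / (A*rho*g*dh),
# with the head drop expressed in meters of fluid column.
# All numeric values below are illustrative assumptions, not study data.
def permeability_m2(flow_m3_per_s, area_m2, viscosity_pa_s,
                    head_drop_m, length_m, rho=1000.0, g=9.81):
    return (flow_m3_per_s * viscosity_pa_s * length_m) / (
        area_m2 * rho * g * head_drop_m)

# A viscous fluid (mu = 0.1 Pa·s) driven through a 1 m core with a 2 cm head drop.
k = permeability_m2(1e-4, 0.01, 0.1, 0.02, 1.0)
print(k)  # on the order of 5e-6 m² — very high, as expected for touching-vug rock
```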

Relevance:

100.00%

Publisher:

Abstract:

Hurricanes, earthquakes, floods, and other serious natural hazards have been attributed with causing changes in regional economic growth, income, employment, and wealth. Natural disasters are said to cause: (1) an acceleration of existing economic trends; (2) an expansion of employment and income due to recovery operations (the so-called silver lining); and (3) an alteration in the structure of regional economic activity due to changes in "intra" and "inter" regional trading patterns and technological change.

Theoretical and stylized disaster simulations (Cochrane 1975; Haas, Cochrane, and Kates 1977; Petak et al. 1982; Ellson et al. 1983, 1984; Boisvert 1992; Brookshire and McKee 1992) point toward a wide scope of possible negative and long-lasting impacts on economic activity and structure. This work examines the consequences of Hurricane Andrew on Dade County's economy. Following the work of Ellson et al. (1984), Guimaraes et al. (1993), and West and Lenze (1993; 1994), a regional econometric forecasting model (DCEFM) using a framework of "with" and "without" the hurricane is constructed and used to assess Hurricane Andrew's impact on the structure and level of economic activity in Dade County, Florida.

The simulation results show that the direct economic impact of Hurricane Andrew on Dade County was of short duration and isolated sectoral impact, generally limited to the construction, TCP (transportation, communications, and public utilities), and agricultural sectors. Regional growth and changes in income and employment reacted directly to, and within the range and direction set by, national economic activity. The simulations also lead to the conclusion that areal extent, infrastructure, and sector-specific damages, as opposed to monetary losses, are the primary determinants of a disaster's effects on employment, income, growth, and economic structure.

Relevance:

100.00%

Publisher:

Abstract:

In their discussion - Participative Budgeting and Participant Motivation: A Review of the Literature - Frederick J. Demicco, Assistant Professor, School of Hotel, Restaurant and Institutional Management, The Pennsylvania State University, and Steven J. Dempsey, Fulton F. Galer, and Martin Baker, Graduate Assistants, College of Business at Virginia Polytechnic Institute and State University, initially observe: "In recent years behavioral literature has stressed the importance of participation in goal-setting by those most directly affected by those goals. The common postulate is that greater participation by employees in the various management functions, especially the planning function, will lead to improved motivation, performance, coordination, and functional behavior. The authors analyze this postulate as it relates to the budgeting process and discuss whether or not participative budgeting has a significant positive impact on the motivations of budget participants."

In defining the concept of budgeting, the authors offer: "Budgeting is usually viewed as encompassing the preparation and adoption of a detailed financial operating plan..." Furthering that statement, they note that budgeting's focus is to influence, in a positive way, how managers plan and coordinate the activities of a property in a way that will enhance their own performance - in essence, framing an organization within its described boundaries and realizing its established goals. The authors will have you know: to control the budget is to control operations.

What parallels can be drawn between the technical methods and procedures of budgeting and managerial behavior? "In an effort to answer this question, Ronen and Livingstone have suggested that a fourth objective of budgeting exists, that of motivation," say the authors with attribution. "The managerial function of motivation is manipulative in nature." Demicco, Dempsey, Galer, and Baker attempt to quantify motivation as a psychological premise using expectancy theory, which offers empirical support, intuitive appeal, and ease of application to the budgetary process. They also present House's Path-Goal model, essentially a mathematical formula designed to gauge motivation.

The views of Argyris are also explored in particular detail. Although the Argyris study was aimed primarily at manufacturing firms, and at the effects on line supervisors of the manufacturing budgets used to control and evaluate their performance, its application is relevant to the hospitality industry. As the title suggests, other notables in the field of behavioral motivation theory and participation are also referenced. "Behavioral theory has been moving away from models of purported general applicability toward contingency models that are suited for particular situations," say the authors in closing. "It is conceivable that some time in the future, contingency models will make possible the tailoring of budget strategies to individual budget holder personalities."
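House's Path-Goal formulation, as commonly stated, scores motivation as M = IVb + P1·(IVa + Σ P2i·EVi): the intrinsic valence of the behavior itself, plus the path-instrumentality-weighted valence of accomplishment and its expectancy-weighted extrinsic outcomes. A minimal sketch with illustrative values:

```python
# House's Path-Goal motivation formula: M = IVb + P1 * (IVa + sum(P2i * EVi)).
# IVb: intrinsic valence of the behavior; IVa: intrinsic valence of accomplishment;
# P1: perceived probability that behavior leads to accomplishment;
# outcomes: (P2i, EVi) pairs - probability and valence of each extrinsic outcome.
# The numeric values below are illustrative assumptions, not values from the review.
def path_goal_motivation(iv_b, p1, iv_a, outcomes):
    return iv_b + p1 * (iv_a + sum(p2 * ev for p2, ev in outcomes))

m = path_goal_motivation(0.3, 0.8, 0.5, [(0.7, 0.9), (0.5, 0.4)])
print(round(m, 3))  # → 1.364
```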

Relevance:

40.00%

Publisher:

Abstract:

The purpose of this study was to determine the emergency department (ED) length of stay (LOS) of patients admitted to inpatient telemetry and critical care units, and to identify the factors that contribute to a prolonged ED LOS. It also examined whether ED LOS differed between patients evaluated by an ED physician, an Advanced Registered Nurse Practitioner (ARNP), or a Physician Assistant (PA).

A data collection tool was devised and used to record data obtained by retrospectively reviewing 110 patient charts from this sample. The mean ED LOS was 286.75 minutes. Multiple factors were recorded as affecting ED LOS in this sample, including age, diagnosis, consultations, multiple radiographs, pending admission orders, the nurse being unable to call report or busy, relatives at the bedside, observation or stabilization being necessary, the bed not being ready, and an infusion in progress. No significant difference in ED LOS was noted between subjects initially evaluated by a physician, an ARNP, or a PA.

Relevance:

40.00%

Publisher:

Abstract:

The elderly are at the highest risk of developing pressure ulcers, which result in prolonged hospitalization, high health care costs, increased mortality, and decreased quality of life. The burden of pressure ulcers will intensify because of a rapidly increasing elderly population in the United States (US). Poor nutrition is a major predictor of pressure ulcer formation. The purpose of this study was to examine the effects of a comprehensive, interdisciplinary nutritional protocol on: (1) pressure ulcer wound healing, (2) length of hospital stay, and (3) charges for pressure ulcer management.

Using a pre-intervention/post-intervention quasi-experimental design, the study sample comprised 100 patients 60 years or older who were admitted with, or acquired, a pressure ulcer. A pre-intervention group (n=50) received routine pressure ulcer care (standard diet, dressing changes, and equipment). A post-intervention group received routine care plus an interdisciplinary nutrition intervention (physical therapy, speech therapy, occupational therapy, and added protein and calories in the diet). Research questions were analyzed using descriptive statistics, frequencies, chi-square tests, and t-tests.

Findings indicated that the comprehensive, interdisciplinary nutritional protocol had a significant effect on the rate of wound healing in Weeks 3 and 4, total hospital length of stay (pre-intervention M=43.2 days, SD=31.70 versus M=31.77, SD=12.02 post-intervention), and pressure ulcer length of stay (pre-intervention M=25.28 days, SD=5.60 versus M=18.40 days, SD=5.27 post-intervention). Although there was no significant difference in total charges between the pre-intervention group ($727,245.00) and the post-intervention group ($702,065.00), charges for speech therapy (M=$5,885.12, SD=$332.55), prealbumin (M=$808.52, SD=$332.55), and albumin (M=$278.88, SD=$55.00) were higher in the pre-intervention group, while charges for physical therapy (M=$5,721.26, SD=$3,655.24) and occupational therapy (M=$2,544.64, SD=$1,712.86) were higher in the post-intervention group. Study findings indicate that this comprehensive nutritional intervention was effective in improving pressure ulcer wound healing and in decreasing both pressure ulcer length of stay and total hospital length of stay, while showing no significant additional charges for treatment of pressure ulcers.

Relevance:

40.00%

Publisher:

Abstract:

The purpose of this study was threefold: first, to investigate variables associated with learning and with performance as measured by the National Council Licensure Examination for Registered Nurses (NCLEX-RN); second, to validate the predictive value of the Assessment Technologies Institute (ATI) achievement exit exam; and lastly, to provide a model that could be used to predict performance on the NCLEX-RN, with implications for admission and curriculum development. The study was based on school learning theory, which holds that acquisition in school learning is a function of aptitude (pre-admission measures), opportunity to learn, and quality of instruction (program measures).

Data were from 298 graduates of an associate degree nursing program in the southeastern United States. Of the 298 graduates, 142 were Hispanic, 87 were Black non-Hispanic, 54 were White non-Hispanic, and 15 reported as Other. The graduates took the NCLEX-RN for the first time during the years 2003-2005. This study used a predictive, correlational design that relied upon retrospective data. Point-biserial correlations and chi-square analyses were used to investigate relationships between 19 selected predictor variables and the dichotomous criterion variable, NCLEX-RN outcome. The correlation and chi-square findings indicated that men did better on the NCLEX-RN than women; Blacks had the highest failure rates, followed by Hispanics; older students were more likely to pass the exam than younger students; and students who passed the exam started and completed the nursing program with a higher grade point average than those who failed.

Using logistic regression, five statistical models relating these variables to student performance on the NCLEX-RN were tested against a model adapted from Bloom's (1976) and Carroll's (1963) school learning theories. The derived model was: NCLEX-RN success = f(Nurse Entrance Test score, advanced medical-surgical nursing course grade achieved). The model demonstrates that student performance on the NCLEX-RN can be predicted by one pre-admission measure and one program measure. The ATI achievement exit exam (an outcome measure) had no predictive value for student performance on the NCLEX-RN. The derived model accurately predicted 94% of students' successful performance on the NCLEX-RN.
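The derived model has the shape of a two-predictor logistic regression. A minimal sketch of that functional form follows; the coefficients are illustrative placeholders, not the study's fitted values:

```python
import math

# P(pass NCLEX-RN) = sigmoid(b0 + b1*NET_score + b2*med_surg_grade).
# Coefficients b0, b1, b2 are illustrative assumptions, not fitted values.
def pass_probability(net_score, med_surg_grade, b0=-12.0, b1=0.08, b2=2.0):
    z = b0 + b1 * net_score + b2 * med_surg_grade
    return 1.0 / (1.0 + math.exp(-z))

# NET score of 75 and a 3.5 course grade give z = 1.0 with these coefficients.
p = pass_probability(75, 3.5)
print(round(p, 3))  # → 0.731
```

Predicted probabilities above a chosen cutoff (commonly 0.5) would be classified as "pass," which is how the 94% correct-classification rate reported above is computed.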

Relevance:

40.00%

Publisher:

Abstract:

Climate change in the Arctic is predicted to increase plant productivity through decomposition-related enhanced nutrient availability. However, the extent of the increase will depend on whether the increased nutrient availability can be sustained. To address this uncertainty, I assessed the response of plant tissue nutrients, litter decomposition rates, and soil nutrient availability to experimental climate warming manipulations (extended growing season and soil warming) over a 7-year period. Overall, the most consistent effect was year-to-year variability in the measured parameters, probably a result of large differences in weather and the timing of snowmelt. The results of this study emphasize that although plants of arctic environments are specifically adapted to low nutrient availability, they also possess a suite of traits that help reduce nutrient losses, such as slow growth, low tissue nutrient concentrations, and low tissue turnover, resulting in subtle responses to environmental change.