980 results for Multiple products
Abstract:
This article describes the use of Artificial Intelligence (AI) techniques applied to cells of a manufacturing system. Machine vision was used to identify the parts of two different products, and their positions, to be assembled on the same production line. This information is provided as input to an AI planner embedded in the manufacturing system. The initial and final states are therefore sent automatically to the planner, which generates assembly plans for a robotic cell in real time.
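As a rough illustration of the pipeline sketched above (vision output encoded as an initial state, an assembled configuration as the goal state, and a planner searching for a move sequence), the following Python sketch uses a toy breadth-first planner over pick-and-place moves. The state encoding, part names, and locations are illustrative assumptions, not the system described in the abstract.

```python
# Minimal sketch (assumption): vision output encoded as a state mapping
# part -> location, and a breadth-first planner that searches pick-and-place
# moves until the goal (assembled) state is reached.
from collections import deque

def plan_assembly(initial, goal, locations):
    """Return a list of (part, from, to) moves turning `initial` into `goal`."""
    start = tuple(sorted(initial.items()))
    target = tuple(sorted(goal.items()))
    frontier = deque([(start, [])])
    visited = {start}
    while frontier:
        state, moves = frontier.popleft()
        if state == target:
            return moves
        state_dict = dict(state)
        for part, loc in state_dict.items():
            for new_loc in locations:
                if new_loc == loc:
                    continue
                nxt = dict(state_dict)
                nxt[part] = new_loc
                key = tuple(sorted(nxt.items()))
                if key not in visited:
                    visited.add(key)
                    frontier.append((key, moves + [(part, loc, new_loc)]))
    return None  # no plan found

# Hypothetical example: two parts detected by vision on a feeder tray.
initial = {"base": "tray", "cover": "tray"}
goal = {"base": "fixture", "cover": "fixture"}
print(plan_assembly(initial, goal, ["tray", "fixture"]))
```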
Abstract:
Membrane filtration technology has been proven to be a technically sound process to improve the quality of clarified cane juice and subsequently to increase the productivity of crystallisation and the quality of sugar production. However, commercial applications have been hindered because the benefits to crystallisation and sugar quality have not outweighed the increased processing costs associated with membrane applications. An 'Integrated Sugar Production Process (ISPP) Concept Model' is proposed to recover more value from the non-sucrose streams generated by membrane processing. Pilot-scale membrane fractionation trials confirmed the technical feasibility of separating high-molecular-weight, antioxidant and reducing-sugar fractions from cane juice in forms suitable for value recovery. It was also found that up to 40% of the potassium salts in the juice can be removed by membrane application while removing a similar amount of water, with potential energy savings in subsequent evaporation. Application of the ISPP would allow the sugar industry to co-produce multiple products and high-quality mill sugar while eliminating energy-intensive refining processes.
Abstract:
LXRα is an orphan member of the nuclear hormone receptor superfamily that displays constitutive transcriptional activity. We reasoned that this activity may result from the production of an endogenous activator that is a component of intermediary metabolism. The use of metabolic inhibitors revealed that mevalonic acid biosynthesis is required for LXRα activity. Mevalonic acid is a common metabolite used by virtually all eukaryotic cells. It serves as a precursor to a large number of important molecules including farnesyl pyrophosphate, geranylgeranyl pyrophosphate, cholesterol, and oxysterols. Inhibition of LXRα could be reversed by addition of mevalonic acid and certain oxysterols but not by other products of mevalonic acid metabolism. Surprisingly, the constitutive activity of LXRα was inhibited by geranylgeraniol, a metabolite of mevalonic acid. These findings suggest that LXRα may represent a central component of a signaling pathway that is both positively and negatively regulated by multiple products of mevalonate metabolism.
Abstract:
Whether to keep products segregated (e.g., unbundled) or to integrate some or all of them (e.g., bundle) has been a problem of profound interest in areas such as portfolio theory in finance, risk capital allocation in insurance and the marketing of consumer products. Such decisions are inherently complex and depend on factors such as the underlying product values and consumer preferences, the latter frequently described using value functions, also known as utility functions in economics. In this paper, we develop decision rules for multiple products, which we generally call 'exposure units' to naturally cover manifold scenarios spanning well beyond 'products'. Our findings show, for example, that Thaler's celebrated principles of mental accounting hold as originally postulated when the values of all exposure units are positive (i.e., all are gains) or all negative (i.e., all are losses). In the case of exposure units with mixed-sign values, the decision rules are much more complex and rely on cataloguing a number of cases that grows very quickly, as the Bell number, with the number of exposure units. Consequently, we provide detailed rules for the integration and segregation decisions for up to three exposure units, and partial rules for an arbitrary number of units.
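For intuition on how quickly the mixed-sign case count grows, the sketch below computes the Bell numbers with the standard Bell-triangle recurrence; this illustrates the combinatorial quantity mentioned above and is not code from the paper.

```python
# Bell numbers via the Bell triangle: B(n) counts the ways to partition
# n exposure units into groups to be integrated, illustrating how fast
# the number of integration/segregation cases grows.
def bell_numbers(n_max):
    bells = [1]                 # B(0)
    row = [1]
    for _ in range(n_max):
        new_row = [row[-1]]
        for value in row:
            new_row.append(new_row[-1] + value)
        bells.append(new_row[0])
        row = new_row
    return bells

print(bell_numbers(10))  # [1, 1, 2, 5, 15, 52, 203, 877, 4140, 21147, 115975]
```

Already at ten exposure units there are 115,975 partitions to catalogue, which helps explain why complete rules are given only for up to three units.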
Abstract:
Some luxury goods manufacturers offer limited editions of their products, whereas some others market multiple product lines. Researchers have found that reference groups shape consumer evaluations of these product categories. Yet little empirical research has examined how reference groups affect the product line decisions of firms. Indeed, in a field setting it is quite a challenge to isolate reference group effects from contextual effects and correlated effects. In this paper, we propose a parsimonious model that allows us to study how reference groups influence firm behavior and that lends itself to experimental analysis. With the aid of the model we investigate the behavior of consumers in a laboratory setting where we can focus on the reference group effects after controlling for the contextual and correlated effects. The experimental results show that in the presence of strong reference group effects, limited editions and multiple products can help improve firms' profits. Furthermore, the trends in the purchase decisions of our participants point to the possibility that they are capable of introspecting close to two steps of thinking at the outset of the game and then learning through reinforcement mechanisms. © 2010 INFORMS.
Abstract:
Vinegar is obtained by double fermentation, alcoholic and acetic, of substances of agricultural origin; each type has a particular flavour, determined by the substrates and technology used, while maintaining a sui generis acid taste. Its technological aptitude enables the manufacture of multiple products by macerating spices, plants and other ingredients, enriching the matrix, whose chemical profile gains complexity and new sensory/functional characteristics. Fresh-pack pickling is an alternative process of preservation in vinegar, without fermentation. Using submerged-fermentation vinegars, two vinegars, a vinaigrette with additions and a pickle of sweet fruits were developed at ESAS (2009-2013), combining technological, laboratory and sensory trials. Conceived as gourmet products, they were intended to offer innovation and convenience. In addition to a long shelf life, the following stand out: 1) in the white wine vinegar with blueberry, the added value of preserving the whole fruit through the pickling effect; 2) in the sweet-and-sour vinegar of Touriga Nacional red wine with honey and spices, a balanced and contemporary sweet-and-sour note; 3) in the flavoured orange vinaigrette, aromatic complexity combined with a sensation of freshness in the mouth; 4) in the fresh-pack sweet-and-sour pear-pineapple pickle, novelty and dual use: once the fruit is consumed, the infusion can be used as table vinegar (an unusual aptitude in pickles).
Abstract:
The effective and efficient management of diversified business firms that supply multiple products and operate in multiple, dynamic markets, especially large multinational enterprises (MNEs), builds upon a number of specific governance principles. These governance principles allow the alignment of environmental characteristics, strategy and organization. Given the rising need to “learn from the world”, Doz et al., in their influential Harvard Business School Press book entitled From Global to Metanational, have proposed a new set of governance principles described under the “metanational” umbrella concept. This paper revisits the metanational, using a comparative institutional perspective; here we contrast multidivisional and metanational governance principles. A comparative institutional analysis suggests that the metanational's application potential in terms of actually improving the effectiveness and efficiency of MNE governance may be subject to more qualification than suggested by Doz et al. Senior MNE management must therefore reflect carefully before substituting metanational governance principles for the more conventional, multidivisional ones with established contributions to managerial effectiveness and efficiency.
Abstract:
The aim of this study was to develop a system of growth and yield models for thinned stands of Eucalyptus spp. and to assess growth behavior in scenarios with a 10% decrease or increase in rainfall. Two- and three-parameter Weibull and Johnson SB probability distribution functions were fitted using different methods, and the correlation between the fitted parameters and age was evaluated. Dominant height growth was examined to check whether thinned stands change their growth compared with non-thinned stands. The stand-level variables dominant height and basal area were projected, and simultaneously predicted and projected, respectively. Individual-tree equations were fitted as functions of stand-level variables in order to reduce error propagation. The R software was used to fit all the proposed models, and the fitted models were evaluated by the significance of their parameters (F-test) and by graphs of predicted versus observed values around the 1:1 line. The prognosis system was built in two ways: first using the full data set, and second with the data set restricted to age 7.5. Increases and decreases of 20% in rainfall were assessed by updating the site index function. The method of moments was the most precise way to describe the diameter distribution at every age in the eucalyptus stands for the Johnson SB and two-parameter Weibull pdfs. When the correlation of the fitted parameters with age was examined for each pdf, the shape parameters of thinned stands were no longer correlated with age, unlike those of non-thinned stands. The thinning effect was therefore accounted for in the basal area prediction and projection modeling. This result emphasizes the need to apply the parameter recovery method in order to assess differences and capture the correct future pattern for thinned and non-thinned stands. Dominant height was not influenced by thinning intensity, so the fitted Chapman-Richards model did not distinguish between thinned and non-thinned stands. All the fitted equations behaved with good precision, whether the full or the early (age-restricted) data set was used. The prognosis system using the full and/or early data set was evaluated with the parameter recovery method for the Johnson SB and Weibull pdfs, and graphical analysis and precision statistics showed appropriate results. Finally, the effects of increased or decreased rainfall on eucalyptus stand yields were examined, showing how important it is to consider this effect, since the growth pattern is strongly affected by water.
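As a small illustration of one of the diameter-distribution approaches mentioned above, the sketch below fits a two-parameter Weibull by the method of moments, solving for the shape parameter from the coefficient of variation. The sample diameters and implementation details are illustrative assumptions, not data or code from the study.

```python
# Method-of-moments fit of a two-parameter Weibull to a diameter sample:
# solve for the shape k from the coefficient of variation, then recover
# the scale from the mean. Sample diameters are made up for illustration.
import numpy as np
from scipy.special import gamma
from scipy.optimize import brentq

def weibull_mom(diameters):
    d = np.asarray(diameters, dtype=float)
    mean, var = d.mean(), d.var(ddof=1)
    cv2 = var / mean**2
    # CV^2 = Gamma(1 + 2/k) / Gamma(1 + 1/k)^2 - 1, solved numerically for k.
    f = lambda k: gamma(1 + 2.0 / k) / gamma(1 + 1.0 / k) ** 2 - 1 - cv2
    shape = brentq(f, 0.1, 50.0)
    scale = mean / gamma(1 + 1.0 / shape)
    return shape, scale

diameters_cm = [12.1, 14.3, 15.0, 16.8, 17.2, 18.5, 19.1, 20.4, 22.0, 23.5]
print(weibull_mom(diameters_cm))  # (shape, scale) of the fitted Weibull
```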
Abstract:
Since product take-back is mandated in Europe and has effects for producers worldwide, including the U.S., designing efficient forward and reverse supply chain networks is becoming essential for business viability. Centralizing production facilities may reduce costs but perhaps not environmental impacts; decentralizing a supply chain may reduce transportation-related environmental impacts but increase capital costs. Facility location strategies of centralization or decentralization are tested for companies with supply chains that both take back and manufacture products. Decentralized and centralized production systems have different effects on the environment, industry and the economy: decentralized production systems cluster suppliers within the geographical market region that the system serves, whereas centralized production systems have many dispersed suppliers that meet all market demand. The point of this research is to further the understanding of company decision-makers about the environmental impacts and costs of choosing a decentralized or centralized supply chain organizational strategy. This research explores what degree of centralization for a supply chain makes the most financial and environmental sense for siting facilities, and which factories are in the best location to handle the financial and environmental impacts of particular processing steps needed for product manufacture. Two examples of facility location for supply chains with product take-back were considered: a theoretical case involving shoe resoling, and a real-world case study on the location of operations for a company that reclaims multiple products for use as material inputs. For the theoretical example a centralized facility location strategy was optimal, whereas for the case study a decentralized strategy was best. In conclusion, it is not possible to say that a centralized or decentralized facility location strategy is in general best for a company that takes back products; each company's specific concerns, needs and supply chain details will determine which degree of centralization creates the optimal strategy for siting its facilities.
Abstract:
Component commonality - the use of the same version of a component across multiple products - is increasingly considered a promising way to offer high external variety while retaining low internal variety in operations. However, increasing commonality has both positive and negative cost effects, so optimization approaches are required to identify the optimal commonality level. Because components influence, to a greater or lesser extent, nearly every process step along the supply chain, it is not surprising that a multitude of diverging commonality problems is investigated in the literature, each with a specific algorithm designed for the commonality problem considered. This paper aims at a general framework that is flexible and efficient enough to be applied to a wide range of commonality problems. Such a procedure, based on a two-stage graph approach, is presented and tested. Finally, the flexibility of the procedure is shown by customizing the framework to account for different types of commonality problems.
Abstract:
Parallel processing is prevalent in many manufacturing and service systems. Many manufactured products are built and assembled from several components fabricated in parallel lines. An example of this manufacturing system configuration is observed at a manufacturing facility equipped to assemble and test web servers. Characteristics of a typical web server assembly line are: multiple products, job circulation, and parallel processing. The primary objective of this research was to develop analytical approximations to predict performance measures of manufacturing systems with job failures and parallel processing. The analytical formulations extend previous queueing models used in assembly manufacturing systems in that they can handle serial and different configurations of parallel processing with multiple product classes, and job circulation due to random part failures. In addition, appropriate correction terms obtained via regression analysis were added to the approximations in order to minimize the error between the analytical approximations and the simulation models. Markovian and general-type manufacturing systems were studied, with multiple product classes, job circulation due to failures, and fork-and-join systems to model parallel processing. In both the Markovian and the general case, the approximations without correction terms performed quite well for one- and two-product problem instances. However, the flow time error increased as the number of products and the net traffic intensity increased. Therefore, correction terms for single and fork-join stations were developed via regression analysis to deal with more than two products. The numerical comparisons showed that the approximations perform remarkably well when the correction factors are used. On average, the flow time error was reduced from 38.19% to 5.59% in the Markovian case, and from 26.39% to 7.23% in the general case. All the equations in the analytical formulations were implemented as a set of Matlab scripts. Using this set, operations managers of web server assembly lines, manufacturing or other service systems with similar characteristics can estimate different system performance measures and make judicious decisions - especially for setting delivery due dates, capacity planning, and bottleneck mitigation, among others.
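The sketch below gives a highly simplified, single-station flavor of the kind of approximation described: an M/M/1 flow-time estimate with job circulation driven by a failure probability, plus a hypothetical linear regression correction in traffic intensity and number of product classes. The correction form and coefficients are assumptions for illustration, not the fitted terms from the study.

```python
# Simplified single-station sketch: M/M/1 flow time with job circulation
# (a failed job revisits the station with probability p_fail), plus a
# hypothetical linear regression correction in traffic intensity and
# number of product classes. The coefficients below are illustrative only.
def flow_time_approx(arrival_rate, service_rate, p_fail, n_products,
                     beta=(0.0, 0.15, 0.02)):
    # Effective arrival rate: each job makes a geometric number of visits.
    lam_eff = arrival_rate / (1.0 - p_fail)
    rho = lam_eff / service_rate
    if rho >= 1.0:
        raise ValueError("Station is unstable (utilization >= 1).")
    w_mm1 = 1.0 / (service_rate - lam_eff)        # M/M/1 time in system
    correction = beta[0] + beta[1] * rho + beta[2] * n_products
    return w_mm1 * (1.0 + correction)

# Two product classes, 5% failure/rework probability.
print(flow_time_approx(arrival_rate=8.0, service_rate=10.0,
                       p_fail=0.05, n_products=2))
```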
Abstract:
This dissertation contributes to the rapidly growing empirical research area in the field of operations management. It contains two essays, tackling two different sets of operations management questions which are motivated by and built on field data sets from two very different industries --- air cargo logistics and retailing.
The first essay, based on a data set obtained from a world-leading third-party logistics company, develops a novel and general Bayesian hierarchical learning framework for estimating customers' spillover learning, that is, customers' learning about the quality of a service (or product) from their previous experiences with similar yet not identical services. We apply the model to the data set to study how customers' experiences from shipping on a particular route affect their future decisions about shipping not only on that route, but also on other routes serviced by the same logistics company. We find that customers indeed borrow experiences from similar but different services to update the quality beliefs that determine their future purchase decisions, and that these service quality beliefs have a significant impact on future purchasing decisions. Moreover, customers are risk averse: they are averse not only to experience variability but also to belief uncertainty (i.e., a customer's uncertainty about their own beliefs). Finally, belief uncertainty affects customers' utilities more than experience variability does.
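A minimal Gaussian sketch of the spillover idea follows: route qualities share a common component, so one experience shifts beliefs about all routes and shrinks their uncertainty. The model form and numbers are illustrative assumptions, not the hierarchical model estimated in the essay.

```python
# Minimal Gaussian sketch of "spillover" learning: route qualities share a
# common component, so an experience on one route also shifts beliefs
# about the others. All numbers are illustrative assumptions.
import numpy as np

def update_beliefs(mean, cov, route, outcome, obs_var):
    """Exact Gaussian update of the quality beliefs after one experience."""
    k = cov[:, route] / (cov[route, route] + obs_var)   # gain vector
    new_mean = mean + k * (outcome - mean[route])
    new_cov = cov - np.outer(k, cov[route, :])
    return new_mean, new_cov

n_routes = 3
prior_mean = np.zeros(n_routes)
# A shared component induces positive correlation across routes (spillover).
prior_cov = 0.5 * np.ones((n_routes, n_routes)) + 0.5 * np.eye(n_routes)

# A good experience (quality = 1.2) on route 0 raises beliefs on all routes.
mean, cov = update_beliefs(prior_mean, prior_cov, route=0,
                           outcome=1.2, obs_var=0.3)
print(mean)          # belief about route 0 rises most; the others rise too
print(np.diag(cov))  # belief uncertainty shrinks, most of all on route 0
```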
The second essay is based on a data set obtained from a large Chinese supermarket chain, which contains sales as well as both wholesale and retail prices of unpackaged perishable vegetables. Recognizing the special characteristics of this particular product category, we develop a structural estimation model in a discrete-continuous choice framework. Building on this framework, we then study an optimization model for joint pricing and inventory management of multiple products, which aims to improve the company's profit from direct sales while reducing food waste and thus improving social welfare.
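The sketch below illustrates, schematically, a joint pricing and stocking decision for a single perishable item, trading expected profit against expected waste under a price-dependent Poisson demand. The demand model and parameters are illustrative assumptions, not the structural model estimated from the supermarket data.

```python
# Schematic sketch: choose retail price and order quantity jointly for a
# perishable item. Demand is Poisson with a price-dependent mean; unsold
# units are wasted. All parameters are illustrative assumptions.
import numpy as np
from scipy.stats import poisson

def expected_profit(price, qty, wholesale, base_demand, elasticity):
    mean_demand = base_demand * np.exp(-elasticity * price)
    sales = np.arange(qty + 1)
    pmf = poisson.pmf(sales, mean_demand)
    pmf[-1] += poisson.sf(qty, mean_demand)       # demand >= qty sells qty
    expected_sales = float(np.dot(sales, pmf))
    return price * expected_sales - wholesale * qty, qty - expected_sales

# Grid search over price and order quantity; keep the most profitable pair.
best = max(
    ((p, q) + expected_profit(p, q, wholesale=1.0, base_demand=60,
                              elasticity=0.4)
     for p in np.arange(1.5, 4.01, 0.25)
     for q in range(5, 61, 5)),
    key=lambda t: t[2],
)
print(f"price={best[0]:.2f}, quantity={best[1]}, "
      f"profit={best[2]:.2f}, expected waste={best[3]:.2f}")
```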
Collectively, the studies in this dissertation provide useful modeling ideas, decision tools, insights, and guidance for firms to utilize vast sales and operations data to devise more effective business strategies.
Abstract:
Effective supplier evaluation and purchasing processes are of vital importance to business organizations, making the supplier selection problem a key issue for their success. We consider a complex supplier selection problem with multiple products in which minimum package quantities, minimum order values related to delivery costs, and discounted pricing schemes are taken into account. Our main contribution is a mixed integer linear programming (MILP) model for this supplier selection problem. The model is used to solve several examples, including three real case studies from an electronic equipment assembly company.
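A toy version of such an MILP, written with the PuLP modeling library, is sketched below: it includes package quantities, a per-supplier minimum order value, and a fixed delivery cost. The data are made up, and quantity-discount pricing (part of the model in the paper) is omitted for brevity.

```python
# Toy MILP sketch of a multi-product supplier selection problem with
# package quantities, a per-supplier minimum order value and a fixed
# delivery cost. Data are illustrative; the model in the paper also
# handles discounted pricing schemes.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, value

suppliers = ["S1", "S2"]
products = ["P1", "P2"]
demand = {"P1": 100, "P2": 60}
price = {("S1", "P1"): 2.0, ("S1", "P2"): 3.5,
         ("S2", "P1"): 2.2, ("S2", "P2"): 3.0}
pack_size = {("S1", "P1"): 10, ("S1", "P2"): 5,
             ("S2", "P1"): 20, ("S2", "P2"): 10}
delivery_cost = {"S1": 15.0, "S2": 25.0}
min_order_value = {"S1": 50.0, "S2": 80.0}
BIG_M = 1000

model = LpProblem("supplier_selection", LpMinimize)
packs = {(s, p): LpVariable(f"packs_{s}_{p}", lowBound=0, cat="Integer")
         for s in suppliers for p in products}
use = {s: LpVariable(f"use_{s}", cat="Binary") for s in suppliers}

# Ordered quantity per supplier/product is a whole number of packages.
qty = {(s, p): packs[s, p] * pack_size[s, p]
       for s in suppliers for p in products}
order_value = {s: lpSum(price[s, p] * qty[s, p] for p in products)
               for s in suppliers}

# Objective: purchase cost plus fixed delivery cost for each supplier used.
model += lpSum(order_value[s] + delivery_cost[s] * use[s] for s in suppliers)
for p in products:                              # cover demand for every product
    model += lpSum(qty[s, p] for s in suppliers) >= demand[p]
for s in suppliers:                             # link ordering to supplier use
    model += lpSum(qty[s, p] for p in products) <= BIG_M * use[s]
    model += order_value[s] >= min_order_value[s] * use[s]

model.solve()
print({s: value(use[s]) for s in suppliers})
print({k: value(v) for k, v in packs.items()})
```

Quantity discounts could be incorporated by adding binary variables that select one price break per supplier and product, at the cost of a larger model.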