991 results for MANUFACTURING STRATEGIES
Abstract:
Business intelligence (BI) is an information process that includes the activities and applications used to transform business data into valuable business information. Today's enterprises collect detailed data, which has drastically increased the amount of available business data. In order to meet changing customer needs and gain competitive advantage, businesses try to leverage this information. However, IT departments are struggling to keep up with the increased reporting needs. Therefore, a recent shift in the BI market has been towards empowering business users with self-service BI capabilities. The purpose of this study was to understand how self-service BI could help businesses meet increased reporting demands. The research problem was approached with an empirical single-case study. Qualitative data was gathered with semi-structured, theme-based interviews. The study found that the case company's BI system was mostly used for group performance reporting. Ad-hoc and business-user-driven information needs were mostly fulfilled with self-made tools and manual work. It was felt that necessary business information was not easily available. The concept of self-service BI was perceived as helpful for meeting such reporting needs. However, it was found that the available data is often too complex for an average user to fully understand. The respondents felt that in order for self-service BI to work, the data has to be simplified and described in a way that the average business user can understand. The results of the study suggest that BI programs struggle to meet all the information needs of today's businesses. The concept of self-service BI tries to resolve this problem by allowing users easy self-service access to necessary business information. However, business data is often complex and hard to understand. Self-service BI has to overcome this challenge before it can reach its potential benefits.
Abstract:
Many types of production are being transferred from the rich economies of the North to the poorer economies of the South. Such changes began in manufacturing but are now spreading to services. This paper provides estimates of their past and future impact on employment in the North. About 5 million manufacturing jobs have been lost over the past decade because of trade with low-wage economies. A similar number of service jobs may be lost to low-wage economies over the next decade. Although small compared to total employment, such losses may seriously harm certain localities or types of worker.
Abstract:
In this paper we analyse the recent evolution and determinants of real wages in Mexico's manufacturing sector, using theories based on the assumption of imperfect competition in both the product and the labour markets, especially wage-bargain theory, insider-outsider and mark-up models. We show evidence that the Mexican labour market does not behave as a traditional competitive market. The proposed explanation for this fact is that some workers benefit from advantages when compared with others, so that they can capture a greater share of the proceeds of the productive process. Also, we find that changes in the degree of competition in the market for output influence the behaviour of real wages.
Abstract:
This thesis investigates the performance of value and momentum strategies in the Swedish stock market during the 2000-2015 sample period. In addition, the performance of several value and value-momentum combination strategies is examined. The data consists of all the publicly traded companies in the Swedish stock market between 2000 and 2015. P/E, P/B, P/S, EV/EBITDA and EV/S ratios and 3-, 6- and 12-month momentum criteria are used in the portfolio formation. In addition to single selection criteria, the combination of P/E and P/B (a.k.a. the Graham number), the average ranking of the five value criteria and an EV/EBIT – 3-month momentum combination are used as portfolio-formation criteria. The stocks are divided into quintile portfolios based on each selection criterion. The portfolios are reformed once a year using April's price information and the previous year's financial information. The performance of the portfolios is examined based on average annual return, the Sharpe ratio and the Jensen alpha. The results show that the value-momentum combination is the best-performing portfolio both during the whole sample period and during the sub-period that started after the 2007 financial crisis.
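The quintile-portfolio construction and Sharpe-ratio evaluation described above can be sketched as follows. This is a minimal illustration with made-up tickers and P/E figures, not the thesis's actual data or full criteria set:

```python
import statistics

def quintile_portfolios(stocks):
    """Sort stocks ascending by a value ratio (here P/E) and split them
    into five equally sized quintile portfolios, Q1 = cheapest."""
    ranked = sorted(stocks, key=lambda s: s["pe"])
    size = len(ranked) // 5
    return [ranked[i * size:(i + 1) * size] for i in range(5)]

def sharpe_ratio(returns, risk_free=0.0):
    """Sharpe ratio: mean excess return divided by its standard deviation."""
    excess = [r - risk_free for r in returns]
    return statistics.mean(excess) / statistics.stdev(excess)

# Hypothetical universe: ten stocks with made-up P/E ratios.
stocks = [{"name": f"S{i}", "pe": pe}
          for i, pe in enumerate([5, 8, 9, 11, 12, 14, 16, 19, 25, 40])]
quintiles = quintile_portfolios(stocks)
print([s["name"] for s in quintiles[0]])   # ['S0', 'S1'] — the two cheapest
```

In the thesis the same ranking step would be repeated once a year with fresh April prices before the portfolios are reformed.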
Abstract:
Product Data Management (PDM) systems have been utilized within companies since the 1980s, mainly by large companies. This thesis presents the premise that small and medium-sized enterprises (SMEs) can also benefit from utilizing Product Data Management systems. Furthermore, the starting point for the thesis is that the existing PDM systems are either too expensive or do not properly respond to the requirements SMEs have. The aim of this study is to investigate what kinds of requirements and special features SMEs operating in the Finnish manufacturing industry have towards Product Data Management. Additionally, the target is to create a conceptual model that could fulfill the specified requirements. The research has been carried out as a qualitative case study, in which the research data was collected from ten Finnish companies operating in the manufacturing industry. The research data was formed by interviewing key personnel from the case companies. The interview data was then processed into a generic set of information system requirements and the information system concept supporting it. The commercialization of the concept is studied in the thesis from the perspective of system development. The aim was to create a conceptual model that would be economically feasible both for a company utilizing the system and for a company developing it. For this reason, the thesis has sought ways to scale the system development effort across multiple simultaneous cases. The main methods found were to utilize platform-based thinking and to generalize, or in other words abstract, the requirements of an information system. The results of the research highlight the special features Finnish manufacturing SMEs have towards PDM. The most significant of these special features is the usage of a project model to manage the order-to-delivery process.
This differs significantly from the traditional concepts of Product Data Management presented in the literature. Furthermore, as a research result, this thesis presents a conceptual model of a PDM system that would be viable for the case companies interviewed during the research. As a by-product, this research presents a synthesized model, derived from the literature, for abstracting information system requirements. In addition, the strategic importance and categorization of information systems within companies is discussed from the perspective of information system customizations.
Abstract:
This work presents a synopsis of efficient strategies used in power management for achieving the most economical power and energy consumption in multicore systems, FPGA and NoC platforms. A practical approach was taken in an effort to validate the significance of the Adaptive Power Management Algorithm (APMA) proposed for the system developed for this thesis project. This system comprises an arithmetic logic unit, up and down counters, an adder, a state machine and a multiplexer. The purpose of this project was, firstly, to develop the system to be used for the power management study; secondly, to perform an area and power synopsis of the system on various scalable technology platforms (UMC 90 nm technology at 1.2 V, UMC 90 nm technology at 1.32 V and UMC 0.18 µm technology at 1.80 V) in order to examine the differences in area and power consumption of the system across the platforms; and thirdly, to explore various strategies that can be used to reduce the system's power consumption and to propose an adaptive power management algorithm for doing so. The strategies introduced in this work comprise Dynamic Voltage and Frequency Scaling (DVFS) and task parallelism. After development, the system was run on an FPGA board (essentially a NoC platform) and on the various technology platforms listed above. The system synthesis was successfully accomplished, the simulated result analysis shows that the system meets all functional requirements, and the power consumption and area utilization were recorded and analyzed in chapter 7 of this work.
This work extensively reviewed various strategies for managing power consumption, drawing on quantitative research by many researchers and companies. It is a mixture of study analysis and experimental lab work, and it condenses and presents the basic concepts of power management strategy from quality technical papers.
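The DVFS strategy mentioned above exploits the fact that the dynamic switching power of CMOS logic scales roughly as P = C·V²·f, so lowering voltage and frequency together yields superlinear savings. A toy sketch with a made-up effective capacitance and clock rates (the 1.2 V and 1.32 V operating points echo the UMC corners above; the other figures are illustrative assumptions):

```python
def dynamic_power(c_eff, voltage, frequency):
    """Dynamic switching power of CMOS logic: P = C_eff * V^2 * f (watts)."""
    return c_eff * voltage ** 2 * frequency

# Illustrative numbers: scale from 1.32 V / 200 MHz down to 1.20 V / 100 MHz.
C_EFF = 1e-9                                   # F, hypothetical switched capacitance
p_high = dynamic_power(C_EFF, 1.32, 200e6)     # fast, high-voltage operating point
p_low = dynamic_power(C_EFF, 1.20, 100e6)      # slow, low-voltage operating point
print(f"power saved: {1 - p_low / p_high:.0%}")   # prints "power saved: 59%"
```

Halving the clock alone would save only 50%; the extra savings come entirely from the quadratic voltage term, which is why DVFS adjusts the two together.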
Abstract:
Internationalization represents a complex topic that has been researched for quite some time. However, since it continues to be an extremely current topic, its significance has not diminished, but may even have increased. Companies today face extreme pressure to enter new markets in the hope of growing, becoming more profitable, increasing market share, attracting new customers and meeting the requirements of their shareholders and stakeholders. In the increasingly global business environment of today, companies face both the challenges and the possible advantages of internationalization. Few companies are not operating internationally, and it is becoming a question of 'Why not?' rather than 'Why?' to internationalize business operations. Internationalization and the importance of strategy are discussed in this research from the viewpoint of three case companies that were interviewed about internationalization strategies. This research project is a qualitative study that answers the research question: How is a business strategy constructed for entering a new market? The sub-questions are:
• How are goals set and what indicators are used to monitor the achievement of these goals?
• What are the key characteristics of a strategy implementation process?
The research method chosen for this study is a multiple-case study. Three case companies were chosen for the interviews in order to gain in-depth data on internationalization strategies within the construction industry.
Abstract:
When compared to Latin America, Asian economies since 1980 have grown faster and have done so with relatively modest inequalities. Why? A comparison of Asia and Latin America underlines the superiority of the nationalist capitalist model of development, which has often been pursued more explicitly in Asia, over that of a dependent capitalist model, which has often been pursued in Latin America. In comparison to Latin America, the Asian model has facilitated higher and less volatile rates of economic growth and a greater political room to pursue social democratic policies. The "tap root" of these alternate pathways is relative autonomy from global constraints: states and economies in Asia have been more nationalist and autonomous than in Latin America.
Abstract:
The objective of the study is to extend the existing literature on hedging commodity price risks by investigating what kinds of hedging strategies can be used by companies that use bitumen as a raw material in their production. Five alternative swap hedging strategies in bitumen markets are empirically tested: a full hedge strategy; simple, conservative and aggressive term structure strategies; and an implied volatility strategy. The effectiveness of the alternative strategies is measured by excess returns compared to a no-hedge strategy. In addition, the downside risk of each strategy is measured with the target absolute semi-deviation. Results indicate that none of the tested strategies outperforms the no-hedge strategy in terms of excess returns across all maturities. The best-performing aggressive term structure strategy succeeds in creating positive excess returns only at short maturities. However, risk seems to increase hand-in-hand with the excess returns, so that the best-performing strategies also receive the highest risk metrics. This implies that a company willing to gain from favorable price movements must be ready to bear a greater risk. Thus, no hedging strategy superior to the others is found.
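The two evaluation measures used above, excess return over the no-hedge benchmark and target absolute semi-deviation, can be sketched as follows. This is a minimal illustration with made-up return series; the semi-deviation here follows one common definition (mean absolute shortfall below the target), which may differ in detail from the thesis's formula:

```python
def excess_return(strategy_returns, no_hedge_returns):
    """Mean per-period excess return of a strategy over the no-hedge benchmark."""
    diffs = [s - b for s, b in zip(strategy_returns, no_hedge_returns)]
    return sum(diffs) / len(diffs)

def target_semideviation(returns, target=0.0):
    """Downside risk: mean absolute shortfall below the target return."""
    shortfalls = [max(target - r, 0.0) for r in returns]
    return sum(shortfalls) / len(shortfalls)

# Hypothetical monthly returns for a hedged position vs. staying unhedged.
hedged = [0.02, -0.01, 0.03, -0.04]
unhedged = [0.01, -0.02, 0.01, -0.03]
print(excess_return(hedged, unhedged))   # ≈ 0.0075
print(target_semideviation(hedged))      # ≈ 0.0125
```

Comparing strategies on both numbers at once is what exposes the risk-return trade-off the abstract describes: a higher excess return typically arrives together with a higher semi-deviation.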
Abstract:
Organizations that provide health and social services operate in a complex and constantly changing environment. Changes occur, for example, in ageing, technology and biotechnology, and customers' expectations, as well as in the global economic situation. Organizations typically aim to adapt to these changes by introducing new organizational structures and managerial practices, such as process and lean management. Only recently has there been interest in evaluating whether organizations providing health and social services could apply modularity in order to respond to some of these changes. The concept of modularity originates from manufacturing, but is applied in many other disciplines, such as information technology and logistics. However, thus far, the literature concerning modularity in health and social services is scarce. Therefore, the purpose of this thesis is to increase understanding of modularity and the possibilities of applying it in the health and social services context. In addition, the purpose is to shed light on the viewpoints that are worth taking into account when considering the application of modularity in this context. The aim of the thesis is to analyze the way in which modular structures are applied in the health and social services context and to analyze what advantages, possible barriers and managerial concerns might occur if modularity is applied in this context. The thesis was conducted using multiple methods in order to provide a broad perspective on the topic. A systematic literature review provided solid ground for pre-understanding the topic and supported the formulation of the research questions. Theoretical reasoning provided a general overview of the special characteristics of the health and social services context and their effect on the application of modularity.
Empirical studies concentrated on managerial concerns of modularity, particularly from the perspective of health and social services for the elderly. The results of the thesis reveal that structures in products, services, processes and organizations are rather modular in health and social services. They can be decomposed into small independent units, while the challenges seem to occur especially in the compatibility of the services. It seems that health and social services managers have recognized this problem and are increasingly paying attention to it in order to enhance the flexible compatibility of services. The advantages and possible barriers of modularity are explored in this thesis, and from the theoretical perspective it could be argued that modularity is beneficial in the context of health and social services. In fact, it has the potential to alleviate several of the challenges that the health and social services context is confronting. For example, modular structures could support organizations in their challenging task of responding to customers' increasing demand for heterogeneous services. However, special characteristics of the health and social services context create barriers and pose significant challenges to the application of modularity. For example, asymmetry of information, negative externalities, uncertainty of demand and rigid regulation prevent managers from extensively drawing benefits from modularity. The results also reveal that modularity has managerial implications in health and social services. Modularity has the potential to promote and support new service development and outsourcing. The results also provide insights into network management and increase managerial understanding of different network management strategies. Standardization in health and social services is extensive due to legislation and recommendations.
Modularity provides alternative paths to take advantage of standardization while still ensuring the quality of the services. Based on this thesis, it can be concluded, both from a theoretical perspective and from the empirical results concerning modularity in health and social services, that modularity might fit well and be beneficial. However, the special characteristics of the health and social services context prevent some of the benefits of modularity and complicate its application. This thesis contributes to the academic literature on the organization and management of health and social services by describing modularity as an alternative way of organizing and managing health and social services. In addition, it contributes to the literature on modularity by exploring its applicability in the context of health and social services. It also provides a practical contribution to health and social services managers by evaluating the pros and cons of modularity when applied to health and social services.
Abstract:
The purpose of this thesis is to gather information about additive manufacturing and to design a product to be additively manufactured. The specific manufacturing method dealt with in this thesis is powder bed fusion of metals; therefore, when additive manufacturing is mentioned in this thesis, it refers to powder bed fusion of metals. The literature review focuses on the principle of powder bed fusion, the general process chain in additive manufacturing and design rules for additive manufacturing. Examples of success stories in additive manufacturing and reasons for selecting parts to be manufactured additively are also explained in the literature review. This knowledge is required to understand the experimental part of the thesis. The experimental part of the thesis is divided into two parts. Part A concentrates on finding a proper geometry for building self-supporting pipes and proper parameters for their support structures. Part B concentrates on a case study of designing a product for additive manufacturing. As a result of experimental part A, the design process of self-supporting pipes, the results of visual analysis and the results of 3D scanning are presented. As a result of experimental part B, the design process of the product is presented and compared to the original model.
Abstract:
The objective of this research is to create a current-state analysis of pulp supply chain processes, from production planning to deliveries to customers. A cross-functional flowchart is used to model these processes. These models help identify key performance indicators (KPIs), which enable examination of the supply chain's efficiency. Measuring the supply chain in its different processes reveals the changes needed in the processes that affect the whole supply chain and its efficiency and competitiveness. The structure of the pulp supply chain differs from most other supply chains: volumes of bulk products are large, product variation is small and supply forecasts are made for the year ahead. This brings different benefits but also challenges when developing the supply chain. This thesis divides the pulp supply chain into three main categories: production planning, warehousing and transportation. It provides tools for estimating the functionality of the supply chain as well as for developing the efficiency of its different functions. With a better understanding of supply chain processes and measurement, the whole supply chain structure can be developed significantly.
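KPIs of the kind discussed above are simple computations once the process data is available. A toy sketch with two common measures for the warehousing and transportation categories (the specific KPIs, field names and figures are illustrative assumptions, not taken from the thesis):

```python
def inventory_turnover(cost_of_goods_sold, average_inventory):
    """Warehousing KPI: how many times inventory turns over per period."""
    return cost_of_goods_sold / average_inventory

def on_time_delivery_rate(deliveries):
    """Transportation KPI: share of deliveries arriving on or before the promised date."""
    on_time = sum(1 for d in deliveries if d["actual_day"] <= d["promised_day"])
    return on_time / len(deliveries)

# Illustrative figures for one warehouse period and four customer deliveries.
turns = inventory_turnover(cost_of_goods_sold=1_200_000, average_inventory=300_000)
deliveries = [
    {"promised_day": 10, "actual_day": 9},
    {"promised_day": 12, "actual_day": 12},
    {"promised_day": 15, "actual_day": 17},   # late
    {"promised_day": 20, "actual_day": 19},
]
print(turns)                              # 4.0
print(on_time_delivery_rate(deliveries))  # 0.75
```

Tracking such numbers per process is what lets a cross-functional model show where a change in one function ripples into the efficiency of the whole chain.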
Abstract:
The purpose of this study is to find out how laser-based Directed Energy Deposition (DED) processes can benefit from different types of monitoring. DED is a type of additive manufacturing process in which parts are manufactured in layers using metallic powder or metallic wire. DED processes can be used to manufacture parts that are not possible to manufacture with conventional manufacturing processes, to add new geometries to existing parts, or to minimize the scrap material that would result from machining the part. The aim of this study is to find out why laser-based DED processes are monitored, how they are monitored and what devices are used for monitoring. This study has been done in the form of a literature review. During the manufacturing process, the DED process is highly sensitive to different disturbances such as fluctuations in laser absorption, powder feed rate, temperature, humidity or the reflectivity of the melt pool. These disturbances can cause fluctuations in the size of the melt pool or its temperature. Variations in the size of the melt pool affect the thickness of individual layers, which has a direct impact on the final surface quality and dimensional accuracy of the parts. By collecting data on these fluctuations and adjusting the laser power in real time, the size of the melt pool and its temperature can be kept within a specified range, which leads to significant improvements in manufacturing quality. The main areas of monitoring can be divided into monitoring of the powder feed rate, the temperature of the melt pool, the height of the melt pool and the geometry of the melt pool. Monitoring the powder feed rate is important when depositing different material compositions. Monitoring the temperature of the melt pool can give information about the microstructure and mechanical properties of the part.
Monitoring the height and the geometry of the melt pool is an important factor in achieving the desired dimensional accuracy of the part. By combining multiple different monitoring devices, the number of fluctuations that can be controlled is increased. In addition, by combining additive manufacturing with machining, the benefits of both processes could be utilized.
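The closed-loop idea described above — measure the melt pool, compare with a target, adjust laser power in real time — can be sketched as a simple proportional controller. The gain, units and target value are illustrative assumptions, not figures from the literature reviewed:

```python
def adjust_laser_power(power_w, measured_width_mm, target_width_mm, gain=50.0):
    """One proportional control step: if the melt pool is too wide, reduce
    laser power; if too narrow, increase it. Gain is in W per mm of error."""
    error = target_width_mm - measured_width_mm
    return power_w + gain * error

power = 400.0                             # W, hypothetical starting laser power
widths = [1.30, 1.22, 1.14, 1.08, 1.03]   # mm, simulated camera readings per step
for w in widths:
    power = adjust_laser_power(power, w, target_width_mm=1.00)
# power has been stepped down because every reading was above the 1.00 mm target
```

A real system would also fold in melt-pool temperature and height readings, which is why the review stresses combining several monitoring devices.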
Abstract:
The production of biodiesel through transesterification has created a surplus of glycerol on the international market. In a few years, glycerol has become an inexpensive and abundant raw material, subject to numerous plausible valorisation strategies. Glycerol hydrochlorination stands out as an economically attractive route to bio-based epichlorohydrin, an important raw material for the manufacturing of epoxy resins and plasticizers. Glycerol hydrochlorination using gaseous hydrogen chloride (HCl) was studied from a reaction engineering viewpoint. Firstly, a more general and rigorous kinetic model was derived based on a consistent reaction mechanism proposed in the literature. The model was validated with experimental data reported in the literature as well as with new data of our own. Semi-batch experiments were conducted in which the influence of the stirring speed, HCl partial pressure, catalyst concentration and temperature was thoroughly analysed and discussed. Acetic acid was used as a homogeneous catalyst for the experiments. For the first time, it was demonstrated that the liquid-phase volume undergoes a significant increase due to the accumulation of HCl in the liquid phase. Novel and relevant features concerning hydrochlorination kinetics, HCl solubility and mass transfer were investigated. An extended reaction mechanism was proposed and a new kinetic model was derived. The model was tested against the experimental data by means of regression analysis, in which kinetic and mass transfer parameters were successfully estimated. A dimensionless number, called the Catalyst Modulus, was proposed as a tool for corroborating the kinetic model. Reactive flash distillation experiments were conducted to check the commonly accepted hypothesis that removal of water should enhance the glycerol hydrochlorination kinetics. The performance of the reactive flash distillation experiments was compared to the semi-batch data previously obtained.
An unforeseen effect was observed once the water was allowed to be stripped out of the liquid phase, exposing a strong correlation between the HCl liquid uptake and the presence of water in the system. Water was revealed to play an important role also in HCl dissociation: as water was removed, the dissociation of HCl diminished, which had a retarding effect on the reaction kinetics. In order to obtain further insight into the influence of water on the hydrochlorination reaction, extra semi-batch experiments were conducted in which initial amounts of water and of the desired product were added. This study revealed the possibility of using the desired product as an ideal "solvent" for the glycerol hydrochlorination process. A co-current bubble column was used to investigate the glycerol hydrochlorination process under continuous operation. The influence of liquid flow rate, gas flow rate, temperature and catalyst concentration on the glycerol conversion and product distribution was studied. The fluid dynamics of the system showed remarkable behaviour, which was carefully investigated and described. High-speed camera images and residence time distribution experiments were used to collect relevant information about the flow conditions inside the tube. A model based on the axial dispersion concept was proposed and compared with the experimental data. The kinetic and solubility parameters estimated from the semi-batch experiments were successfully used in the description of the mass transfer and fluid dynamics of the bubble column reactor. In light of the results of the present work, the glycerol hydrochlorination reaction mechanism has finally been clarified. It has been demonstrated that reactive distillation technology may slow the glycerol hydrochlorination reaction rate under certain conditions.
Furthermore, continuous reactor technology showed a high selectivity towards monochlorohydrins, whilst semi-batch technology was demonstrated to be more efficient for the production of dichlorohydrins. Based on the novel and revealing discoveries of the present work, many insightful suggestions are made towards improving the production of α,γ-dichlorohydrin on an industrial scale.
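As a rough illustration of the kind of semi-batch kinetic modelling described above, one can integrate a simple acid-catalysed second-order rate law for glycerol consumption. This is a deliberately simplified toy, not the extended mechanism or the parameter values estimated in the thesis:

```python
def semibatch_conversion(k, c_gly0, c_hcl, c_cat, dt=1.0, steps=3600):
    """Explicit Euler integration of a toy rate law r = k * c_cat * c_gly * c_hcl.
    HCl concentration is held constant, mimicking continuous gas feed; all
    rate constants and concentrations here are hypothetical."""
    c_gly = c_gly0
    for _ in range(steps):
        rate = k * c_cat * c_gly * c_hcl
        c_gly = max(c_gly - rate * dt, 0.0)
    return 1.0 - c_gly / c_gly0   # fractional glycerol conversion

# In this toy model, more catalyst (acetic acid) gives higher conversion.
low_cat = semibatch_conversion(k=1e-4, c_gly0=2.0, c_hcl=1.0, c_cat=0.1)
high_cat = semibatch_conversion(k=1e-4, c_gly0=2.0, c_hcl=1.0, c_cat=0.5)
```

A full model of the kind the thesis develops would add HCl solubility, gas-liquid mass transfer and the liquid-volume increase noted above as coupled equations rather than fixed inputs.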