916 results for Lean Manufacturing, Make to Order Manufacturing, Time Study, Kanban, Rapid Performance Management


Relevance: 100.00%

Abstract:

As an indicator of global change and shifting balances of power, the World Economic Forum meets every September in Dalian, China. The subject in 2011 was Mastering Quality Growth, and on the agenda was the pursuit of new frontiers of growth linked to embracing disruptive innovation. With growth coming from emerging markets, and European and North American economies treading water, many firms in the West face the reality of having not just to downsize but to close manufacturing operations outright and re-open them elsewhere, where costs are lower, in order to remain competitive. There are thousands of books on "change management", yet very few of them devote much time to downsizing, preferring to talk about re-engineering or restructuring. What lessons are available from the past to achieve a positive outcome from what will inevitably be something of a human, as well as an economic, tragedy? The authors reached three fundamental conclusions from their experience and research in facility closure management at Vauxhall, UK: put your people first, keep running the business, and manage your legacy. They develop these ideas into a new business model linked to the emotions of change.

Relevance: 100.00%

Abstract:

Lean manufacturing (LM) is currently enjoying its second heyday. Companies in several industries are implementing lean practices to keep pace with the competition and achieve better results. In this article, we concentrate on how companies can improve their inventory turnover performance through the use of lean practices. According to our main proposition, firms that widely apply lean practices have higher inventory turnover than those that do not rely on LM. However, there may be significant differences in inventory turnover even among lean manufacturers, depending on their contingencies. Therefore, we also investigate how various contingency factors (production systems, order types, product types) influence the inventory turnover of lean manufacturers. We use cluster and correlation analysis to separate manufacturers based on the extent of their leanness and to examine the effect of contingencies. We acquired the data from the International Manufacturing Strategy Survey (IMSS) in ISIC sectors 28-35.
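The central metric of this study, inventory turnover, has a simple definition that can be sketched in a few lines. The plant data below are invented for illustration only, not drawn from the IMSS survey; the sketch merely shows the proposition being tested: that the "lean" group exhibits higher mean turnover.

```python
# Hypothetical sketch of the inventory-turnover comparison; the plant
# names and figures are invented, not IMSS data.

def inventory_turnover(cogs, avg_inventory):
    """Annual cost of goods sold divided by average inventory value."""
    return cogs / avg_inventory

plants = [
    {"name": "A", "lean": True,  "cogs": 120.0, "avg_inv": 10.0},
    {"name": "B", "lean": True,  "cogs":  90.0, "avg_inv":  9.0},
    {"name": "C", "lean": False, "cogs": 100.0, "avg_inv": 20.0},
    {"name": "D", "lean": False, "cogs":  80.0, "avg_inv": 16.0},
]

def mean_turnover(group):
    vals = [inventory_turnover(p["cogs"], p["avg_inv"]) for p in group]
    return sum(vals) / len(vals)

lean_mean = mean_turnover([p for p in plants if p["lean"]])
other_mean = mean_turnover([p for p in plants if not p["lean"]])
```

In the article's actual analysis the grouping is produced by cluster analysis on lean-practice scores rather than a binary flag, and the relationship is then examined with correlation analysis across the contingency factors.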

Relevance: 100.00%

Abstract:

A number of patterning methods, including conventional photolithography and E-beam lithography, have been employed to pattern devices with critical dimensions at submicrometer levels. The methods of device fabrication by lithography and multilevel processing are usually specific to the chemical and physical properties of the etchants and materials used, and require a number of processing steps. As an alternative, focused ion beam (FIB) lithography is a unique and straightforward tool for rapidly developing nanomagnetic prototype devices. This feature of FIB is critical for conducting the basic study necessary to advance the state of the art in magnetic recording. The dissertation develops a specific design of nanodevices and demonstrates FIB-fabricated, stable, and reproducible magnetic nanostructures with a critical dimension of about 10 nm. The project included the fabrication of patterned single- and multilayer magnetic media with areal densities beyond 10 terabit/in². Each block had perpendicular or longitudinal magnetic anisotropy and a single-domain structure. The purpose was to demonstrate how the ability of FIB to directly etch nanoscale patterns allows exploration (even in an academic environment) of the true physics of various types of nanostructures. Another goal of this study was the investigation of FIB-patterned magnetic media with a set of characterization tools: a Guzik V2002 spinstand, magnetic force microscopy, and scanning electron microscopy with energy-dispersive and wavelength-dispersive systems. In the course of this work, a unique prototype of a record-high-density patterned magnetic media device capable of 10 terabit/in² was built. Read/write testing was performed on the Guzik spinstand, and the readback signals were recorded and analyzed with a digital oscilloscope. A number of different configurations for writing and reading information from the magnetic medium were explored. The prototype transducers for this work were fabricated via FIB trimming of different magnetic recording heads.

Relevance: 100.00%

Abstract:

In their article "Sales Promotion In Hotels: A British Perspective," Francis Buttle, Lecturer, Department of Hotel, Restaurant, and Travel Administration, University of Massachusetts, and Ini Akpabio, Property Manager, Trusthouse Forte, Britain, initially state: "Sales promotion in hotels is in its infancy. Other industries, particularly consumer goods manufacturing, have long recognized the contribution that sales promotion can make to the cost-effective achievement of marketing objectives. Sales promotion activities in hotels have remained largely uncharted. The authors define, identify and classify these hotel sales promotion activities to understand their function and form, and to highlight any scope for improvement." The authors begin their discussion by attempting to define what the phrase sales promotion [SP] actually means. "The Institute of Sales Promotion regards sales promotions as 'adding value, usually of a temporary nature, to a product or service in order to persuade the end user to purchase that particular brand as opposed to a competitive brand,'" the authors offer. Williams, however, describes sales promotions more broadly as "short term tactical marketing tools which are used to achieve specific marketing objectives during a defined time period," Buttle and Akpabio note with attribution. "The most significant difference between these two viewpoints is that Williams does not limit his definition to activities which are targeted at the consumer," is their educated view. Much of the discussion centers on the differences within the collective marketing-promotional mix. "…it is not always easy to definitively categorize promotional activity," Buttle and Akpabio say. "For example, in personal selling, a sales promotion such as a special bonus offer may be used to close the sale; an advertisement may be sales promotional in character in that it offers discounts." Are promotion and marketing distinguishable as two separate entities? "…not only may there be conceptual confusion between components of the promotional mix, but there is sometimes a blurring of the boundaries between the elements of the marketing mix," the authors suggest. "There are several reasons why SP is particularly suitable for use in hotels: seasonality, increasing competitiveness, asset characteristics, cost characteristics, increased use of channel intermediaries, new product launches, and deal proneness." Buttle and Akpabio offer their insight on each of these segments. The authors also note that SP customer applications are not the only game in town; SP trade applications are just as essential. Bonuses, enhanced commission rates, and vouchers are but a few examples of trade SP. The research for the article was compiled from several sources, including mail surveys, telephone surveys, personal interviews, trade magazines, and newspapers, essentially in the U.K.

Relevance: 100.00%

Abstract:

The continuous evolution of integrated circuit technology has allowed thousands of transistors to be integrated on a single chip. This is due to the miniaturization process, which reduces the diameter of wires and transistors. One drawback of this process is that the circuit becomes more fragile and prone to breakage, making it more susceptible to permanent faults both during the manufacturing process and during its lifetime. Coarse-Grained Reconfigurable Architectures (CGRAs) have been used as an alternative to traditional architectures in an attempt to tolerate such faults, thanks to their intrinsic hardware redundancy and high performance. This work proposes a fault tolerance mechanism for a CGRA in order to increase the architecture's fault tolerance even under a high fault rate. The proposed mechanism was added to the scheduler, the component responsible for mapping instructions onto the architecture. Instruction mapping occurs at runtime, translating binary code without the need for recompilation. Furthermore, to allow a faster implementation, instruction mapping is performed using a greedy modulo scheduling algorithm, a software pipelining technique for loop acceleration. The results show that, even with the proposed mechanism, the time to map instructions is still on the order of microseconds, which allows the instruction mapping process to remain at runtime. In addition, the scheduler's mapping success rate was also studied. The results demonstrate that, even at fault rates above 50% in functional units and interconnection components, the scheduler was able to map instructions onto the architecture in most of the tested applications.
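The core idea of a fault-aware greedy scheduler can be sketched simply: place each instruction on the first healthy, free functional unit in the current cycle, opening a new cycle when every healthy unit is occupied. This is an illustrative toy, not the thesis's actual algorithm (which performs modulo scheduling with interconnect constraints); the unit counts and instruction names are invented.

```python
# Toy sketch (not the thesis's algorithm): greedy placement of
# instructions onto a CGRA whose functional units may be marked faulty.

def greedy_map(instructions, n_units, faulty):
    """Assign each instruction to the first healthy, free unit in the
    current cycle; open a new cycle when all healthy units are taken."""
    healthy = [u for u in range(n_units) if u not in faulty]
    if not healthy:
        raise RuntimeError("no healthy functional units left")
    schedule = []          # list of cycles; each cycle maps unit -> instr
    cycle = {}
    for instr in instructions:
        free = [u for u in healthy if u not in cycle]
        if not free:
            schedule.append(cycle)
            cycle = {}
            free = healthy
        cycle[free[0]] = instr
    if cycle:
        schedule.append(cycle)
    return schedule

# With units 1 and 3 faulty, only units 0 and 2 remain, so five
# instructions need three cycles.
sched = greedy_map(["add", "mul", "sub", "ld", "st"], n_units=4, faulty={1, 3})
```

The sketch illustrates why mapping can still succeed at fault rates above 50%: faults only shrink the pool of candidate units, stretching the schedule rather than breaking it, as long as at least one healthy unit remains.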

Relevance: 100.00%

Abstract:

The evolution of manufacturing management practices and strategies over the years has led many companies to realign their production systems in order to raise their competitiveness and operational performance. In most cases, however, these changes are made in a heterogeneous manner, which leaves the production system without a defined goal and may end up damaging the managerial strategies of the organization as a whole. Some organizations therefore adopt techniques and/or successful production practices used by other companies, believing they will be able to reproduce the same results. An efficient production system must be fully planned and appropriate to the strategic objectives of the organization. This paper aims to identify the manufacturing management strategies adopted by industries in Paraíba, Brazil, as well as the lean practices they use. A qualitative study was conducted, using the multi-case study as its methodological basis: direct observations, semi-structured interviews, and questionnaires applied to those responsible for the production sector of the participating companies. As a result, it was possible to identify the type of manufacturing management system adopted by each company: Company A uses a modern mass production system focused on productivity and low cost, while Company B uses a Lean Manufacturing system focused on quality and diversity. In both organizations it was possible to observe the application of lean practices; notably, the company that does not use LM nevertheless employed lean practices at a highly mature level of utilization.

Relevance: 100.00%

Abstract:

Deciphering the driving mechanisms of Earth system processes, including the climate dynamics expressed as paleoceanographic events, requires a complete, continuous, high-resolution stratigraphy that is very accurately dated. In this study, we construct a robust astronomically calibrated age model for the middle Eocene to early Oligocene interval (31-43 Ma) in order to permit more detailed study of the exceptional climatic events that occurred during this time, including the Middle Eocene Climatic Optimum and the Eocene/Oligocene transition. A goal of this effort is to accurately date the middle Eocene to early Oligocene composite section cored during the Pacific Equatorial Age Transect (PEAT, IODP Exp. 320/321). The stratigraphic framework for the new time scale is based on the identification of the stable long-eccentricity cycle in published and new high-resolution records encompassing bulk and benthic stable isotope, calibrated XRF core scanning, and magnetostratigraphic data from ODP Sites 171B-1052, 189-1172, 199-1218, and 207-1260, as well as IODP Sites 320-U1333 and -U1334, spanning magnetic polarity Chrons C12n to C20n. Subsequently, we applied orbital tuning of the records to the La2011 orbital solution. The resulting new time scale revises and refines the existing orbitally tuned age model and the Geomagnetic Polarity Time Scale from 31 to 43 Ma. Our newly defined absolute age for the Eocene/Oligocene boundary validates the astronomically tuned age of 33.89 Ma identified at the Massignano (Italy) global stratotype section and point. Our compilation of geochemical records of climate-controlled variability in sedimentation through the middle-to-late Eocene and early Oligocene demonstrates strong power in the eccentricity band that is readily tuned to the latest astronomical solution. Obliquity-driven cyclicity is only apparent during very-long-eccentricity-cycle minima around 35.5 Ma, 38.3 Ma, and 40.1 Ma.

Relevance: 100.00%

Abstract:

Purpose – The purpose of this paper is to present a case study regarding the deployment of a previously developed model for the integration of management systems (MSs). The case study is developed at a manufacturing site of an international enterprise, and the implementation of the model in a real business environment is aimed at assessing its feasibility. Design/methodology/approach – The case study takes into account different management system standards (MSSs) progressively implemented over the years, independently of one another. The implementation of the model was supported by the results of an investigation performed according to a structured diagnosis conducted to collect information on the organizational situation of the enterprise. Findings – The main findings are as follows: a robust integrated management system (IMS), objectively leaner, more structured, and more manageable, was found to be feasible; the study provided a holistic view of the enterprise's global management; clarification of job descriptions and of the boundaries of action and responsibility was achieved; greater efficiency in the use of resources was attained; and more coordinated management of the three pillars of sustainability (environmental, economic, and social), as well as of risks, provided confidence and added value to the company and interested parties. Originality/value – This case study is pioneering in Portugal with respect to the implementation, at the level of an industrial organization, of the previously developed model for the integration of individualized MSs. The case study provides new insights into the implementation of IMSs, including the rationalization of several resources and the elimination of several types of organizational waste, leveraging gains in efficiency. Due to its intrinsic characteristics, the model is able to support, progressively, new or revised MSSs according to the principles of Annex SL (normative) – proposals for MSSs – of the International Organization for Standardization and the International Electrotechnical Commission, which the industrial organization may adopt beyond the current ones.

Relevance: 100.00%

Abstract:

This report describes the development of an improvement project on a bus assembly line at one of the Grupo Salvador Caetano companies: CaetanoBus. Theoretically grounded in the lean philosophy, the project aimed above all at a more efficient organization of human and material resources, so as to reduce lead time and, consequently, simultaneously decrease both the waste associated with the process and the number of workers assigned to a given bus model. The methodology adopted fundamentally involved time study, line balancing, and the establishment of standard work. Other lean concepts, such as kaizen and the yamazumi chart, also shaped the line of thinking that guided the actions taken. An improvement plan based on the use of lean tools suggests productivity gains, although the end of production of the model under analysis prevented their practical confirmation.
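The time-study and balancing work described above rests on two standard quantities: takt time (available production time per unit of demand) and the theoretical minimum number of workstations. The sketch below uses invented figures, not CaetanoBus data, purely to show the arithmetic.

```python
# Minimal line-balancing sketch in the spirit of the time study above;
# the shift length, demand, and task times are invented figures.
import math

def takt_time(available_seconds, demand_units):
    """Seconds of production time available per unit demanded."""
    return available_seconds / demand_units

def min_stations(task_times, takt):
    """Theoretical minimum number of workstations: total work content
    divided by takt time, rounded up."""
    return math.ceil(sum(task_times) / takt)

takt = takt_time(available_seconds=27000, demand_units=30)  # 900 s/unit
tasks = [420, 380, 350, 410, 300, 240]   # elemental task times (s)
stations = min_stations(tasks, takt)     # 2100 s of work / 900 s takt
```

A yamazumi chart is then just these task times stacked per station and compared against the takt line, which is how overloaded operators and rebalancing opportunities become visible.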

Relevance: 100.00%

Abstract:

In an increasingly demanding and competitive market, organizations must continuously optimize their production systems. Many have found in the Lean and Kaizen philosophies and practices the answers to producing more without adding resources, and doing so faster, more effectively, and more efficiently. Engaging the company in a customer-oriented culture of continuous improvement allows value to be created at every stage, making the company more flexible and competitive. This cultural transformation, combined with the application of Lean and Kaizen tools, improves the organization's overall performance, reducing costs by attacking the inhibitors of performance: waste, paradigms, inflexibility, and variability. This work aims to show the applicability of some of these tools in the production process of a metal component for the furniture industry, as well as the gains achieved with this approach. Throughout the project, several Lean Manufacturing tools were used: workplace organization (5S), visual management, quick changeover (SMED), value stream mapping (VSM), and layout changes (using the Arena and AutoCAD software packages to study the most efficient change allowing higher productivity with fewer resources). The importance of employees in the change process is also shown, through Kaizen initiatives, the suggestion program, satisfaction surveys, and training actions, so that they feel an integral part of the organization.

Relevance: 100.00%

Abstract:

Automotive producers are aiming to make their order fulfilment processes more flexible. Opening the pipeline of planned products for dynamic allocation to dealers/customers is a significant step towards flexibility, but the behaviour of such Virtual-Build-To-Order systems is complex to predict, and their performance varies significantly as product variety levels change. This study investigates the potential for intelligent control of the pipeline feed, taking into account the current status of inventory (level and mix), the volume and mix of unsold products in the planning pipeline, and the demand profile. Five 'intelligent' methods for selecting the next product to be planned into the production pipeline are analysed using a discrete event simulation model and compared to an unintelligent random feed. The methods are tested under two conditions: first, when customers must be fulfilled with the exact product they request, and second, when customers trade off a shorter waiting time against compromise in specification. The two forms of customer behaviour have a substantial impact on the performance of the methods, and there are also significant differences between the methods themselves. When the producer has an accurate model of customer demand, methods that attempt to harmonise the mix in the system with the demand distribution are superior.

Relevance: 100.00%

Abstract:

Increasingly, the main objectives in industry are low-cost production with maximum quality and the shortest possible manufacturing time. To reach this goal, industry frequently resorts to computer numerical control (CNC) machines, since this technology makes it possible to achieve high precision and lower processing times. CNC machine tools can be applied to different machining processes, such as turning, milling, and drilling, among others. Of all these processes, milling is the most widely used because of its versatility; it is normally used to machine metallic materials such as steel and cast irons. In this work, the effects of varying four milling parameters (cutting speed, feed rate, radial depth of cut, and axial depth of cut), individually and through the interaction between some of them, on the surface roughness of a hardened steel (steel 12738) are analyzed. Two optimization methods are used for this analysis: the Taguchi method and the response surface methodology. The Taguchi method was used to reduce the number of possible combinations and, consequently, the number of trials to be performed. The response surface methodology (RSM) was used in order to compare its results with those of the Taguchi method; according to some works in the specialized literature, RSM converges more quickly to an optimum value. The Taguchi method is well known in industry, where it is used for quality control. It presents interesting concepts, such as robustness and quality loss, and is quite useful for identifying variations in the production system during the industrial process, quantifying the variation and allowing undesirable factors to be eliminated. 
With this method, an L16 orthogonal array was constructed; two levels were defined for each parameter and sixteen trials were performed. After each trial, the surface roughness of the part was measured. Based on the roughness measurements, a statistical treatment of the data was performed using analysis of variance (ANOVA) in order to determine the influence of each parameter on surface roughness. The minimum measured roughness was 1.05 μm. This study also determined the contribution of each machining parameter and of their interactions. The analysis of the F-ratio values (ANOVA) reveals that the most important factors for minimizing surface roughness are the radial depth of cut and the interaction between the radial and axial depths of cut, with contributions of about 30% and 24%, respectively. In a second stage, the same study was carried out using the response surface methodology, in order to compare the results of the two methods and determine which optimization method best minimizes roughness. Response surface methodology is based on a set of mathematical and statistical techniques useful for modeling and analyzing problems in which the response of interest is influenced by several variables and whose objective is to optimize that response. For this method only five trials were performed, unlike with Taguchi, since in just five trials roughness values lower than the average roughness of the Taguchi method were obtained; the lowest value with this method was 1.03 μm. It is therefore concluded that RSM is a more suitable optimization method than Taguchi for the trials performed: better results were obtained in a smaller number of trials, which implies less tool wear, shorter processing time, and a significant reduction in the material used.
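The "contribution" figures reported above come from a standard ANOVA decomposition: each factor's sum of squares expressed as a share of the total. The sketch below shows only that arithmetic; the sums of squares are invented numbers chosen so the resulting shares echo the ~30% and ~24% the thesis reports, not the thesis's measured values.

```python
# Illustrative percent-contribution calculation from an ANOVA table.
# The factor names mirror the study; the sums of squares are invented.

def percent_contributions(ss_by_factor):
    """Each factor's sum of squares as a percentage of the total."""
    total = sum(ss_by_factor.values())
    return {f: 100.0 * ss / total for f, ss in ss_by_factor.items()}

ss = {
    "radial depth of cut":        1.50,
    "radial x axial interaction": 1.20,
    "cutting speed":              0.80,
    "feed rate":                  0.90,
    "residual":                   0.60,
}
contrib = percent_contributions(ss)
```

In the actual study the sums of squares come from the sixteen L16 measurements, and the F-ratio (factor mean square over residual mean square) is what establishes statistical significance before the contribution percentages are interpreted.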

Relevance: 100.00%

Abstract:

Objectives: To examine temporal trends, study-design determinants, and the quality of the response rates reported in case-control studies of cancer published over the last 30 years. Methods: A review of case-control studies of cancer was conducted. The inclusion criteria were publication (i) in one of 15 major targeted journals and (ii) during one of four publication periods (1984-1986, 1995, 2005, and 2013) spanning three decades. 370 studies were selected and examined. The methodology related to subject recruitment and data collection, population characteristics, participation rates, and reasons for non-participation were extracted from these studies. Descriptive statistics were used to summarize the quality of the reported response rates (as a function of the amount of information available), temporal trends, and determinants of response rates; linear regression models were used to analyze temporal trends and determinants of participation rates. Results: Overall, the quality of the reported response rates and of the reasons for non-participation was very low, particularly for controls. Participation has declined over the last 30 years, and this decline is more marked in studies conducted after 2000. When response rates in recent studies are compared with those of studies conducted during 1971-1980, there is a greater decline among population-based controls (-17.04%, 95% CI: -23.17%, -10.91%) than among cases (-5.99%, 95% CI: -11.50%, -0.48%). The statistically significant determinants of the response rate among cases were the type of cancer examined, the geographic location of the study population, and the mode of data collection. 
The only statistically significant determinant of the response rate among hospital-based controls was their geographic location. The only statistically significant determinant of the participation rate among population-based controls was the type of respondent (subject alone or accompanied by a third party). Conclusion: The participation rate in case-control studies of cancer appears to have declined over the last 30 years, and this decline appears more marked in recent studies. To assess the true level of non-participation and its determinants, as well as the impact of non-participation on study validity, published studies need to use a standardized approach to calculate their participation rates and to report them transparently.
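The temporal-trend part of the analysis above amounts to regressing participation rate on publication period. A minimal ordinary-least-squares sketch of that idea follows; the year/rate pairs are invented for illustration and are not the review's data.

```python
# Minimal OLS trend sketch in the spirit of the review's linear
# regression analysis; the year/rate pairs below are invented.

def ols_slope(xs, ys):
    """Ordinary least-squares slope of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

years = [1985, 1995, 2005, 2013]   # the four publication periods
rates = [82.0, 78.0, 70.0, 63.0]   # hypothetical control response rates (%)
slope = ols_slope(years, rates)    # negative slope => declining participation
```

A negative slope is the quantitative expression of the review's central finding: participation falling across publication periods, more steeply among population-based controls than among cases.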
