949 results for value stream mapping, VSM, lean principle, IT process management
Abstract:
The semiconductor market is saturated with similar products and with distributors offering a similar service proposition. Co-creation processes, in which the customer collaborates in the definition and development of the product and provides information about its utility, performance and perceived value, resulting in a product that solves their real needs, are becoming a step forward in the differentiation and expansion of the value chain. The semiconductor design and manufacturing process is quite complex, requires increasingly large investments and demands complete solutions; an ecosystem is needed that supports the development of the electronic equipment based on those semiconductors. The ease of dialogue and information sharing provided by the internet, web 2.0 tools, and cloud services and applications favors the generation of ideas and the development and evaluation of products, and enables interaction between the various co-creators. Starting a co-creation process requires suitable methods and tools to interact with participants and exchange experiences, processes to integrate co-creation into the company's operations, and the development of an organization and culture that support and encourage it. Among the most effective methods are netnography, which studies the conversations of communities on the internet; collaboration with lead users, who are ahead of the market and expect a great benefit from the satisfaction of their needs or desires; innovation studies, which allow users to define and often create their own solution; and crowdsourcing, an open call that poses challenges for the community to solve in exchange for some kind of reward. The specialization of subcontractor companies in the development and manufacture of semiconductors facilitates open innovation, with different entities collaborating in the various phases of the development of the semiconductor and its ecosystem. Co-creation is currently used in the semiconductor sector to identify design and application ideas, often through innovation contests. Technical support and the evaluation of semiconductors are frequently the result of collaboration among community members, fostered and supported by the manufacturers of the product. The EBVchips program gives small and medium-sized companies access to the co-creation of semiconductors with manufacturers, in a process coordinated and sponsored by the distributor EBV. Configurable semiconductors such as FPGAs are another example of co-creation, whereby the manufacturer provides the integrated circuit and the development environment and customers create the final product by defining its features and functionality. This process is enriched with IP cores, functional design blocks that are often created by the user community.
Abstract:
The current problems surrounding the implementation of Information Technology (IT) service management in Small and Medium Enterprises (SMEs) are described. The reasons why reaching a maturity/capability level through well-known standards, or implementing good software engineering practices by means of the IT Infrastructure Library, is so difficult for SMEs to achieve are set out, along with solutions to the problems identified. The master's thesis goals are then presented in terms of purpose, research questions, research goals, objectives and scope. Finally, the thesis structure is described.
Abstract:
Today, very diverse services are offered through Information Technology (IT) systems, with increasingly specific requirements to guarantee their operation. In order to meet those guarantees, different standards and frameworks, also called recommendations or good-practice guides, have been established that allow the service provider to verify that the requirements are met and to offer both the end customer and the organization itself a quality service that generates value for all parties. By applying these recommendations and standards, companies and public administrations ensure that the telematic services they offer meet quality standards, providing the end user with a stable platform appropriate to the service delivered. These standards and frameworks address both the IT service itself and the IT service management system, so that, through the proper operation of the system that manages the service, the service can be continually improved. In particular, we focus on the most widely used standard, ISO 20000, and on the best-practice framework ITIL 2011, in order to give a clear view of the different processes, activities and functions that both define for generating value in the company through IT systems. To aid understanding of both ITIL and ISO 20000, an example implementation has been developed, first of ITIL and then of ISO 20000, on a service that is already in operation, defining for each of the five life-cycle phases used by both the standard and the framework the processes and functions necessary for its implementation and subsequent review.
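The abstract refers to the five life-cycle phases used in both the standard and the framework; in ITIL 2011 these are the phases listed in the sketch below, each shown with a few representative processes. The selection of processes is an illustrative subset chosen for this example, not the complete catalogue defined by ITIL 2011 or ISO/IEC 20000.

```python
# Illustrative only: the five ITIL 2011 life-cycle phases with a few
# representative processes each. This is a simplified selection, not the
# complete process catalogue of ITIL 2011 or ISO/IEC 20000.
ITIL_2011_LIFECYCLE = {
    "Service Strategy": ["Strategy Management", "Demand Management",
                         "Financial Management for IT Services"],
    "Service Design": ["Service Level Management", "Capacity Management",
                       "Availability Management"],
    "Service Transition": ["Change Management", "Release and Deployment Management",
                           "Service Asset and Configuration Management"],
    "Service Operation": ["Incident Management", "Problem Management",
                          "Request Fulfilment"],
    "Continual Service Improvement": ["The Seven-Step Improvement Process"],
}

if __name__ == "__main__":
    for phase, processes in ITIL_2011_LIFECYCLE.items():
        print(f"{phase}: {', '.join(processes)}")
```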
Abstract:
Oleuropein is the most abundant phenolic compound in olive leaves, and many studies have shown that it has important antimicrobial, antioxidant and anti-inflammatory properties, among others, raising interest in methods for its extraction and in its application in food, cosmetic and pharmaceutical products. The objective of this study was the extraction of oleuropein from olive leaves using a non-toxic solvent, and the subsequent application of the extracts to vegetable oils in order to assess their effect on the oils' oxidative stability. The solvent selected for the study was a mixture of ethanol and water (70:30 by mass, a condition determined in previous work), in the presence of 1% acetic acid. In a first stage, extraction experiments were carried out using maceration (type I) and ultrasound (type II) at different temperatures (20, 30, 40, 50 and 60°C). In a second stage, through maceration experiments at room temperature, the effect of the leaf:solvent ratio (1:8, 1:6 and 1:3) and the influence of the presence of acetic acid on the extraction process were studied (type III). Finally, with maceration in the presence of acetic acid, at room temperature and a leaf:solvent ratio of 1:3, sequential extractions were performed from the same raw material (type IV). The results of these experiments were expressed as oleuropein yield (RO), oleuropein content of the extracts (TO) and overall yield (RG). Analysis of experiments I and II showed that temperature had no significant influence on RO, TO or RG. In addition, the response values for maceration were slightly higher than those obtained for ultrasound-assisted extraction. In the type III experiments, broadly speaking, a positive influence of acetic acid on the studied responses was observed. It was also found that, in the presence of acid, increasing the amount of solvent in the extraction increases RO and RG and decreases TO. The type IV experiment showed that even after four sequential extractions it was still not possible to exhaust the oleuropein in the raw material. After all the hydroalcoholic extracts had been obtained, one containing approximately 19% oleuropein was selected for the study of oxidative stability in vegetable oils (olive and sunflower) using the Rancimat method. The presence of the extract increased the induction time of extra-virgin olive oil by 3 hours and that of ordinary olive oil by 2 hours. Crude and refined sunflower oils showed no improvement in oxidative stability when the extracts were added. Oxidative stability tests were also carried out by adding powdered olive leaves directly to extra-virgin and ordinary olive oils. For the extra-virgin oil, adding the leaves did not improve oxidative stability, but for the ordinary olive oil the induction time increased by more than 2 hours. The results presented in this work demonstrate that it is possible to obtain extracts containing significant levels of oleuropein using a renewable solvent, and that these extracts can be used as a natural antioxidant in olive oil, improving its oxidative stability.
Abstract:
In recent decades, the maturity of some markets, globalization and the growing bargaining power of customers have further increased the need for companies to maintain and develop their most important customers effectively. In this context, Key Account Management (KAM) programs, corporate initiatives that give special treatment to a supplier's most important customers, gain relevance. To achieve superior financial performance, a KAM program must create value for the customer so that the supplier can subsequently capture value. However, most studies emphasize value appropriation by the supplier, while few investigate value creation for the customer in KAM programs. Moreover, most research in relationship marketing still focuses heavily on the positive impacts of the relationship. It is therefore important to analyze empirically how value creation for the customer is implemented in KAM programs, identifying the main dimensions and the critical factors. The objective of this study is to analyze the process of creating value for the customer in Key Account Management (KAM) programs and to propose a model of customer value creation from the perspective of the supplier company. The analyses and the model are developed through an abductive research process, that is, a combination of the theoretical foundations on the concept of value and on KAM programs with content analysis of 22 in-depth interviews with KAM specialists, marketing/sales professionals who have worked for at least five years with KAM programs in large companies in Brazil. The proposed model explains, in an integrated and systematic way, how value is created for the customer in KAM programs along five dimensions (relationship development; understanding of value drivers; development of the value proposition; communication of the value proposition; and value measurement), four moderators (the customer's relational orientation; formalization of the KAM program for the customer; the supplier's approach, "customer-oriented" vs. "orienting the customer"; and strategic fit between supplier and customer) and three risks (failure to deliver the basic value to the customer; Key Account Manager turnover; and the customer's sense of unfairness). The study contributes to theory on the topic by including in the model a specific dimension for relationship development at the dyad level (organization-organization) and the individual level (employee-employee), and by addressing not only the positive aspects of the relationship but also the negative ones (the risks of creating value for the customer). It also contributes to practice by providing executives of companies that have KAM programs with a broader, systematic and integrative view of the various elements of customer value creation, and by recommending organizational practices that serve as guides for the decision-making of KAM program managers. Furthermore, since the empirical part of the study is based on the Brazilian context, it expands knowledge about KAM in Brazil. Finally, the study's limitations are presented together with an agenda for future research.
Abstract:
In my previous article Racial Capitalism, I examined the ways in which white individuals and predominantly white institutions derive value from non-white racial identity. This process flows from our intense social and legal preoccupation with diversity. And it results in the commodification of non-white racial identity, with negative implications for both individuals and society. This Article picks up where Racial Capitalism left off in three ways. As a foundation, it first expands the concept of racial capitalism to identity categories more generally, explaining that individual in-group members and predominantly in-group institutions — usually individuals or institutions that are white, male, straight, wealthy, and so on — can and do derive value from out-group identities. Second, the Article turns from the overarching system of identity capitalism to the myriad ways that individual out-group members actively participate in that system. In particular, I examine how out-group members leverage their out-group status to derive social and economic value for themselves. I call such out-group participants identity entrepreneurs. Identity entrepreneurship is neither inherently good nor inherently bad. Rather, it is a complicated phenomenon with both positive and negative consequences. Finally, the Article considers the appropriate response to identity entrepreneurship. We should design laws and policies to maximize both individual agency and access to information for out-group members. Such reforms would protect individual choice while making clear the consequences of identity entrepreneurship both for individual identity entrepreneurs and for the out-group as a whole. A range of legal doctrines interact with and influence identity entrepreneurship, including employment discrimination under Title VII, rights of privacy and publicity, and intellectual property. Modifying these doctrines to take account of identity entrepreneurship will further progress toward an egalitarian society in which in-group and out-group identities are valued equally.
Abstract:
Background: The harmonization of European health systems brings with it a need for tools to allow the standardized collection of information about medical care. A common coding system and standards for the description of services are needed to allow local data to be incorporated into evidence-informed policy, and to permit equity and mobility to be assessed. The aim of this project has been to design such a classification and a related tool for the coding of services for Long Term Care (DESDE-LTC), based on the European Service Mapping Schedule (ESMS). Methods: The development of DESDE-LTC followed an iterative process using nominal groups in 6 European countries. 54 researchers and stakeholders in health and social services contributed to this process. In order to classify services, we use the minimal organization unit or “Basic Stable Input of Care” (BSIC), coded by its principal function or “Main Type of Care” (MTC). The evaluation of the tool included an analysis of feasibility, consistency, ontology, inter-rater reliability, Boolean Factor Analysis, and a preliminary impact analysis (screening, scoping and appraisal). Results: DESDE-LTC includes an alpha-numerical coding system, a glossary and an assessment instrument for mapping and counting LTC. It shows high feasibility, consistency, inter-rater reliability and face, content and construct validity. DESDE-LTC is ontologically consistent. It is regarded by experts as useful and relevant for evidence-informed decision making. Conclusion: DESDE-LTC contributes to establishing a common terminology, taxonomy and coding of LTC services in a European context, and a standard procedure for data collection and international comparison.
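The abstract reports inter-rater reliability for the coding of services but does not name the statistic used; a common choice for agreement between two raters assigning categorical codes (such as Main Types of Care) is Cohen's kappa. The function below is a minimal, self-contained sketch of that statistic, offered only as an illustration, not as the evaluation procedure actually used for DESDE-LTC; the example codes are hypothetical.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical codes to the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and
    p_e is the agreement expected by chance from each rater's marginals.
    """
    assert len(rater_a) == len(rater_b) and rater_a, "need paired, non-empty ratings"
    n = len(rater_a)
    # Observed agreement: fraction of items where both raters chose the same code.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from the marginal frequencies of each rater.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return 1.0 if p_e == 1 else (p_o - p_e) / (1 - p_e)

# Hypothetical example: two raters coding six services with made-up MTC-style labels.
print(cohens_kappa(["R1", "R1", "D4", "O8", "R1", "D4"],
                   ["R1", "D4", "D4", "O8", "R1", "D4"]))
```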
Abstract:
Paper submitted to the XVIII Conference on Design of Circuits and Integrated Systems (DCIS), Ciudad Real, Spain, 2003.
Abstract:
Use of nonlinear parameter estimation techniques is now commonplace in ground water model calibration. However, there is still ample room for further development of these techniques in order to enable them to extract more information from calibration datasets, to more thoroughly explore the uncertainty associated with model predictions, and to make them easier to implement in various modeling contexts. This paper describes the use of pilot points as a methodology for spatial hydraulic property characterization. When used in conjunction with nonlinear parameter estimation software that incorporates advanced regularization functionality (such as PEST), use of pilot points can add a great deal of flexibility to the calibration process at the same time as it makes this process easier to implement. Pilot points can be used either as a substitute for zones of piecewise parameter uniformity, or in conjunction with such zones. In either case, they allow the disposition of areas of high and low hydraulic property value to be inferred through the calibration process, without the need for the modeler to guess the geometry of such areas prior to estimating the parameters that pertain to them. Pilot points and regularization can also be used as an adjunct to geostatistically based stochastic parameterization methods. Using the techniques described herein, a series of hydraulic property fields can be generated, all of which recognize the stochastic characterization of an area at the same time that they satisfy the constraints imposed on hydraulic property values by the need to ensure that model outputs match field measurements. Model predictions can then be made using all of these fields as a mechanism for exploring predictive uncertainty.
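As a rough illustration of the pilot-point idea, the sketch below spreads hydraulic conductivity values assigned at a handful of pilot points onto a regular model grid by inverse-distance weighting in log space. This is only a schematic stand-in: PEST-based workflows typically interpolate with kriging and add Tikhonov regularization terms penalizing departure from a preferred parameter condition, none of which is reproduced here; the coordinates and values are hypothetical.

```python
import numpy as np

def pilot_point_field(pilot_xy, pilot_log10_k, grid_x, grid_y, power=2.0):
    """Interpolate log10(K) from pilot points onto a grid by inverse-distance weighting.

    pilot_xy       : (n, 2) array of pilot-point coordinates
    pilot_log10_k  : (n,) array of log10 hydraulic conductivity at the pilot points
    grid_x, grid_y : 1-D coordinate vectors of the model grid
    Returns a (len(grid_y), len(grid_x)) array of K values.
    """
    gx, gy = np.meshgrid(grid_x, grid_y)
    cells = np.column_stack([gx.ravel(), gy.ravel()])
    # Distances from every grid cell to every pilot point.
    d = np.linalg.norm(cells[:, None, :] - pilot_xy[None, :, :], axis=2)
    d = np.maximum(d, 1e-6)              # avoid division by zero at a pilot point
    w = 1.0 / d ** power                 # inverse-distance weights
    log_k = (w * pilot_log10_k).sum(axis=1) / w.sum(axis=1)
    return 10.0 ** log_k.reshape(gy.shape)

# Hypothetical example: 4 pilot points whose values the inversion would adjust.
pilots = np.array([[100.0, 100.0], [400.0, 150.0], [250.0, 400.0], [450.0, 450.0]])
log_k = np.array([-3.0, -4.5, -3.8, -5.0])           # log10 K, e.g. in m/s
field = pilot_point_field(pilots, log_k, np.linspace(0, 500, 50), np.linspace(0, 500, 50))
print(field.shape, field.min(), field.max())
```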
Abstract:
Remote sensing, as a direct adjunct to field, lithologic and structural mapping, and more recently, GIS have played an important role in the study of mineralized areas. A review of the application of remote sensing in mineral resource mapping is attempted here. It involves understanding the application of remote sensing in lithologic, structural and alteration mapping. Remote sensing becomes an important tool for locating mineral deposits, in its own right, when the primary and secondary processes of mineralization result in the formation of spectral anomalies. Reconnaissance lithologic mapping is usually the first step of mineral resource mapping. This is complemented with structural mapping, as mineral deposits usually occur along or adjacent to geologic structures, and alteration mapping, as mineral deposits are commonly associated with hydrothermal alteration of the surrounding rocks. In addition to these, understanding the use of hyperspectral remote sensing is crucial, as hyperspectral data can help identify and thematically map regions of exploration interest by using the distinct absorption features of most minerals. Finally, at the exploration stage, GIS is the ideal tool for integrating and analyzing various georeferenced geoscience data to select the best sites for mineral deposits, or at least good candidates for further exploration.
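A concrete example of the spectral-anomaly idea mentioned above is band ratioing for hydrothermal alteration: clay and other hydroxyl-bearing alteration minerals absorb in the shortwave infrared, so ratios of selected bands highlight altered rock. The sketch below computes two such ratios for a multispectral cube; the band choices follow the common Landsat TM/ETM+ convention (band 5 / band 7 for clays, band 3 / band 1 for iron oxides) and are an assumption of the example, not taken from the paper, and the scene data are synthetic.

```python
import numpy as np

def alteration_ratios(cube):
    """Simple band-ratio images for alteration mapping on a (bands, rows, cols) cube.

    Assumes Landsat TM/ETM+ band ordering (band 1 at index 0, ..., band 7 at index 6).
    Returns two ratio images commonly used as alteration proxies:
      clay ratio = band5 / band7  (hydroxyl-bearing / clay alteration)
      iron ratio = band3 / band1  (iron-oxide staining)
    """
    eps = 1e-6                               # guard against division by zero
    b1, b3 = cube[0].astype(float), cube[2].astype(float)
    b5, b7 = cube[4].astype(float), cube[6].astype(float)
    clay_ratio = b5 / (b7 + eps)
    iron_ratio = b3 / (b1 + eps)
    return clay_ratio, iron_ratio

# Hypothetical 7-band scene of 200 x 200 pixels with random reflectance values.
scene = np.random.rand(7, 200, 200)
clay, iron = alteration_ratios(scene)
print(clay.shape, float(clay.mean()), float(iron.mean()))
```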
Abstract:
Firms have embraced electronic commerce as a means of doing business, either because they see it as a way to improve efficiency, grow market share, expand into new markets, or because they view it as essential for survival. Recent research in the United States provides some evidence that the market does value investments in electronic commerce. Following research that suggests that, in certain circumstances, the market values noninnovative investments as well as innovative investments in new products, we partition electronic commerce investment project announcements into innovative and noninnovative to determine whether there are excess returns associated with these types of announcements. Apart from our overall results being consistent with the United States findings that the market values investments in electronic commerce projects, we also find that noninnovative investments are perceived as more valuable to the firm than innovative investments. On average, the market expects innovative investments to earn a return commensurate with their risk. We conclude that innovative electronic commerce projects are most likely seen by the capital market as easily replicable, and consequently confer little, if any, competitive advantage. On the other hand, we conclude from the noninnovative investment results that these types of investments are seen as being compatible with a firm's assets-in-place, in particular, its information technology capabilities, a view consistent with the resource-based view of the firm.
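The excess (abnormal) returns referred to above are typically obtained with a standard event-study design: estimate a market model over a pre-announcement window, then cumulate the differences between actual and predicted returns over the event window. The sketch below shows that calculation in a minimal form; it is a generic illustration with synthetic data, not the specific estimation procedure or windows used in the study.

```python
import numpy as np

def cumulative_abnormal_return(stock_ret, market_ret, est_end, evt_start, evt_end):
    """Market-model event study on aligned daily return series.

    stock_ret, market_ret : 1-D arrays of daily returns over the same dates
    est_end               : end index (exclusive) of the estimation window
    evt_start, evt_end    : event window indices (inclusive start, exclusive end)
    """
    # Estimate alpha and beta by OLS over the estimation window.
    beta, alpha = np.polyfit(market_ret[:est_end], stock_ret[:est_end], deg=1)
    # Abnormal return = actual return minus market-model prediction.
    expected = alpha + beta * market_ret[evt_start:evt_end]
    abnormal = stock_ret[evt_start:evt_end] - expected
    return abnormal.sum()

# Hypothetical data: 120 estimation days, then a 3-day window around the announcement.
rng = np.random.default_rng(0)
mkt = rng.normal(0.0005, 0.01, 125)
stk = 0.0002 + 1.2 * mkt + rng.normal(0, 0.015, 125)
print(cumulative_abnormal_return(stk, mkt, est_end=120, evt_start=121, evt_end=124))
```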
Abstract:
As process management projects have increased in size due to globalised and company-wide initiatives, a corresponding growth in the size of process modeling projects can be observed. Despite advances in languages, tools and methodologies, several aspects of these projects have been largely ignored by the academic community. This paper makes a first contribution to a potential research agenda in this field by defining the characteristics of large-scale process modeling projects and proposing a framework of related issues. These issues are derived from a semi-structured interview and six focus groups conducted in Australia, Germany and the USA with enterprise and modeling software vendors and customers. The focus groups confirm the existence of unresolved problems in business process modeling projects. The outcomes provide a research agenda which directs researchers into further studies in global process management, process model decomposition and the overall governance of process modeling projects. It is expected that this research agenda will provide guidance to researchers and practitioners by focusing on areas of high theoretical and practical relevance.
Abstract:
Initially this paper asks two questions: In order to create and sustain competitive advantage through collaborative systems WHAT should be managed? and HOW should it be managed? It introduces the competitive business structure and reviews some of the global trends in manufacturing and business, which leads to a focus on manage processes, value propositions and extended business processes. It then goes on to develop a model of the collaborative architecture for extended enterprises and demonstrates the validity of this architecture through a case study. It concludes that, in order to create and sustain competitive advantage, collaborative systems should facilitate the management of: the collaborative architecture of the extended enterprise; the extended business processes; and the value proposition for each extended enterprise, through a meta-level management process. It also identifies areas for further research, such as better understanding of: the exact nature and interaction of multiple strategies within an enterprise; how to manage people/teams working along extended business processes; and the nature and prerequisites of the manage processes.
Abstract:
Purpose: The purpose of this paper is to describe how the application of systems thinking to designing, managing and improving business processes has resulted in a new and unique holonic-based process modeling methodology known as process orientated holonic modeling. Design/methodology/approach: The paper describes key systems thinking axioms that are built upon in an overview of the methodology; the techniques are described using an example taken from a large organization designing and manufacturing capital goods equipment operating within a complex and dynamic environment. These were produced in an 18 month project, using an action research approach, to improve quality and process efficiency. Findings: The findings of this research show that this new methodology can support process depiction and improvement in industrial sectors which are characterized by environments of high variety and low volume (e.g. projects; such as the design and manufacture of a radar system or a hybrid production process) which do not provide repetitive learning opportunities. In such circumstances, the methodology has not only been able to deliver holonic-based process diagrams but also been able to transfer strategic vision from top management to middle and operational levels without being reductionistic. Originality/value: This paper will be of interest to organizational analysts looking at large complex projects who require a methodology that does not confine them to thinking reductionistically in "task-breakdown" based approaches. The novel ideas in this paper have great impact on the way analysts should perceive organizational processes. Future research is applying the methodology in similar environments in other industries. © Emerald Group Publishing Limited.
Abstract:
A phenomenon common to almost all fields is that there is a gap between theory and practical implementation. However, this is a particular problem in knowledge management, where much of the literature consists of general principles written in the context of a ‘knowledge world’ that has few, if any, references to how to carry out knowledge management in organisations. In this chapter, we put forward the view that the best way to bridge this gap between general principles and the specific issues facing a given organisation is to link knowledge management to the organisation’s business processes. After briefly reviewing, and rejecting, alternative ways in which this gap might be bridged, the chapter goes on to explain the justification for, and the potential benefits and snags of, linking knowledge management to business processes. Successful and unsuccessful examples are presented. We concentrate especially on the issues of establishing what knowledge is relevant to an organisation at present, the need for organisational learning to cope with the inevitable change, and the additional problems posed by the growing internationalisation of operations. We conclude that linking knowledge management to business processes is the best route for organisations to follow, but that it is not the answer to all knowledge management problems, especially where different cultures and/or cultural change are involved.