942 results for product data management system


Relevance: 100.00%

Publisher:

Abstract:

The future of technology-based companies depends essentially on maximizing their performance in the relationships they establish with their markets, and on the balance between their technological and marketing competencies. In this context, the market "arm" of these companies is marketing, which acts as a maximizing function. In the Brazilian biotechnology industry, marketing is not only incipient but largely unknown, constituting a true Achilles' heel in the coupling between supply and demand. This work portrays, for the first time, the profile of the activities of the Brazilian biotechnology industry, centered on its market relationships, and seeks to assess the current state of the relation between companies' technological and marketing competencies. Its objectives are to provide essential information and an initial diagnosis on the subject, so that the actors involved can become aware of their strengths and weaknesses and, as a result, adopt positions and draw up plans for repositioning and for developing specific competencies in their market relationships. Based on a qualitative exploratory study, followed by a descriptive study of a sample of 54 companies with data from the first half of 1996, the general management and technology characteristics of the companies are described, with special attention to marketing activities. The relations between the companies' technical and marketing competencies are analyzed, the resulting implications presented, recommendations made to the industry, and questions suggested for future research. The organizations studied operate essentially in the industrial and business-to-business markets, show a technology-push orientation, and practice a product-oriented rather than market-oriented management philosophy. They perform the essential activities of the marketing function only in part. The market-technology relation shows the supremacy of technological over marketing competence, indicating that, to gain competitiveness, most companies must review their current positioning and plan and promote significant development of specific marketing competencies.

Relevance: 100.00%

Publisher:

Abstract:

The management system based on lean thinking has led to significant changes in the companies that have decided to adopt it. Frequently, those changes do not create a sustainable position coherent with the good results that are obtained. Many causes are being discussed and analyzed, not only in academia but also by lean manufacturing institutes and companies. The existing practices related to the managerial information system, and the way the lean journey is implemented, developed, and maintained, may be among such causes. This dissertation examines whether the changes generated by this type of management are being followed by the managerial information system. The development of such systems was in many cases based on mass production, whose principles go against lean manufacturing concepts. Thus, the objective of this dissertation is to verify whether the traditional managerial information system can meet the needs of companies that adopt lean manufacturing principles. Through a case study based on qualitative, exploratory research in an electronic equipment assembly company, the stage of the implementation of lean thinking concepts at the time of the field research was analyzed, as well as whether the metrics and the financial and non-financial indicators were suitable for lean principles. The aim of the study was to verify empirically whether the criticism found in the existing literature was pertinent. The data collected through the analysis of documentation, interviews with managers, and in loco observation were treated using the content analysis method. The main conclusion of the research is that, although the company in question is investing in training and applying lean principles in its production line and in some administrative activities, the current managerial information system does not demonstrate the specific results obtained with lean principles. However, how changes to the managerial information system will be implemented is yet to be determined. Currently, metrics and indicators aligned with lean management are being added to the managerial reports. As more lean tools are employed, mainly with the consolidation of more value streams, the company has already diagnosed the need for new indicators. The main office has started a diagnosis of measurement and control systems in a product line of one of its affiliates, with the goal of studying the possibility of applying so-called lean accounting in the future.

Relevance: 100.00%

Publisher:

Abstract:

This paper describes a method for the evaluation of pavement condition through artificial neural networks, using an MLP trained with the backpropagation technique. Two of the most widely used procedures for assessing pavement condition were applied: the overall severity index and the irregularity index. Tests with the model demonstrated that the simulation with the neural network gives better results than the procedures recommended by highway officials. The network may also be applied in the construction of a graphic computing environment.

Relevance: 100.00%

Publisher:

Abstract:

The aim of the work was to evaluate soil nutrient concentration at 0-5, 5-10, and 10-20 cm in maize (Zea mays L.) grown in sequence with black oats (Avena strigosa Schreb.) under a Leucaena diversifolia alley-cropping agroforestry system (AFS) and a traditional sole-crop management system (without trees, TS), following a randomized block design. The experiment was carried out at the Brazilian Association of Biodynamic Agriculture, in Botucatu, São Paulo, Brazil. The treatments were: control (C), chemical fertilizer (F), biomass of L. diversifolia alley cropping (B), and biomass of L. diversifolia alley cropping + chemical fertilizer (B+F). After 2 yr, pH, organic matter, and nutrient content tended to show higher values in the B+F, B, and F treatments in both systems. Higher values of pH, organic matter, phosphorus, potassium, calcium, magnesium, sum of bases, cation exchange capacity, percentage base saturation, boron, copper, and manganese tended to occur in the agroforestry system.

Relevance: 100.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 100.00%

Publisher:

Abstract:

This work is an action research study conducted in a consumer goods company, presenting a new model for managing inventory at the company's processors. The replacement of the company's inventory management stemmed from the need to reduce the large number of deviations in product write-offs from stock, which generated low accuracy and reliability in the processor inventory data shown by the company's ERP system. Spending on inventory adjustments could thus be reduced with the implementation of the new model, generating cost savings for the company and increasing its competitive potential in the market. In the old system adopted by the company, raw material inventory was written off automatically through transactions customized by the company. However, since the implementation of the ERP system, the automatic write-offs, based on the historical consumption of each product, were in many cases made at random, generating many errors. The new management system replaced the automatic write-off with a manual one performed when the processed product is returned to the company, thus establishing control over which lots and quantities were consumed in processing and making the stock shown by the ERP system reliable and accurate.

Relevance: 100.00%

Publisher:

Abstract:

This project aims to develop methods for classifying data in a Data Warehouse for decision-making purposes. Another goal is the reduction of the attribute set of a Data Warehouse, such that the reduced set keeps the same properties as the original one. With a reduced set we obtain a lower computational processing cost, can identify attributes that are not relevant to certain kinds of situations, and can recognize patterns in the database that support decision making. To achieve these objectives, the Rough Sets algorithm will be implemented. We chose PostgreSQL as the database management system because it is efficient, consolidated, and open source (freely distributed).
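A simplified sketch of the rough-set idea behind attribute reduction: a subset of condition attributes is a candidate reduct when objects that are indiscernible on that subset still share the same decision, i.e. the subset preserves the decision consistency of the full attribute set. The decision table below is a hypothetical toy, not the project's data.

```python
from itertools import combinations

# Toy decision table: condition attributes a, b, c and a decision value.
rows = [
    ({"a": 1, "b": 0, "c": 1}, "yes"),
    ({"a": 1, "b": 1, "c": 0}, "yes"),
    ({"a": 0, "b": 0, "c": 1}, "no"),
    ({"a": 0, "b": 1, "c": 0}, "no"),
]

def consistent(attrs):
    """True if objects indiscernible on `attrs` all share one decision."""
    seen = {}
    for cond, dec in rows:
        key = tuple(cond[a] for a in attrs)
        if seen.setdefault(key, dec) != dec:
            return False
    return True

def find_reduct(all_attrs):
    """Smallest attribute subset with the same consistency as the full set."""
    full = consistent(sorted(all_attrs))
    for k in range(1, len(all_attrs) + 1):
        for subset in combinations(sorted(all_attrs), k):
            if consistent(subset) == full:
                return subset
    return tuple(sorted(all_attrs))

reduct = find_reduct({"a", "b", "c"})
```

Here attribute `a` alone determines the decision, so `b` and `c` can be dropped without losing information; in the project this search would run over the warehouse's attribute set inside PostgreSQL.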

Relevance: 100.00%

Publisher:

Abstract:

In recent approaches to the management of the product development process (PDP), maturity levels have attracted the attention of practitioners and researchers. The CMMI model contributes to evaluating maturity levels and improving product development process management. This paper, based on the CMMI model, analyzes the practices adopted in two companies of the capital goods industry, which develop and manufacture equipment to order. It was observed that, on account of market conditioning factors and different practices adopted for PDP management, these companies are at different maturity levels: one is at the initial level of maturity while the other is at the most advanced one. It was also noted that applying the CMMI model can improve PDP management and provide guidelines for achieving higher maturity levels adequate to each company's needs.

Relevance: 100.00%

Publisher:

Abstract:

Green Supply Chain Management (GSCM) is gaining prominence in academia and business as an approach that aims to promote economic and environmental gains. GSCM is operationalized through environmental management system tools and treated as an Environmental Management System (EMS), involving Reverse Logistics, Green Purchasing, Green Sourcing, Green Design, Green Packaging, Green Operation, Green Manufacturing, Green Innovation, and Customer Awareness. The objective of this study is to map the GSCM tools and identify their practice in a consumer goods company in the Vale do Paraíba. Data were collected from the database of the company chosen as the object of study, as well as through on-site visits and interviews. The results showed that the Green Operation, Green Manufacturing, Green Innovation, and Green Sourcing tools are applied in the company, and only the Customer Awareness tool showed no practice at all. For the other tools, the company showed interest in applying them.


Relevance: 100.00%

Publisher:

Abstract:

Each plasma physics laboratory has a proprietary control and data acquisition system, usually different from one laboratory to another: each laboratory has its own way of controlling the experiment and retrieving data from the database. Fusion research relies to a great extent on international collaboration, and such proprietary systems make it difficult to follow the work remotely. The TCABR data analysis and acquisition system has been upgraded to support a joint research programme using remote participation technologies. MDSplus was chosen because it is widely used: scientists from different institutions can use the same system in different experiments on different tokamaks without needing to know how each machine treats its acquisition system and data analysis. Another important point is that MDSplus provides a library system that allows communication between different programming languages (Java, Fortran, C, C++, Python) and programs such as MATLAB, IDL, and Octave. In the case of the TCABR tokamak, interfaces (the object of this paper) were developed between the system already in use and MDSplus, instead of adopting MDSplus at all stages, from control and data acquisition to data analysis. This was done to preserve a complex system already in operation, whose full migration would otherwise take a long time. The implementation also allows new components to be added that use MDSplus fully at all stages.

Relevance: 100.00%

Publisher:

Abstract:

Background The use of the knowledge produced by the sciences to promote human health is the main goal of translational medicine. To make it feasible we need computational methods to handle the large amount of information that arises from bench to bedside, and to deal with its heterogeneity. A computational challenge that must be faced is to promote the integration of clinical, socio-demographic, and biological data. In this effort, ontologies play an essential role as a powerful artifact for knowledge representation. Chado is a modular, ontology-oriented database model that gained popularity due to its robustness and flexibility as a generic platform to store biological data; however, it lacks support for representing clinical and socio-demographic information. Results We have implemented an extension of Chado, the Clinical Module, to allow the representation of this kind of information. Our approach consists of a framework for data integration through the use of a common reference ontology. The design of this framework has four levels: a data level, to store the data; a semantic level, to integrate and standardize the data through the use of ontologies; an application level, to manage clinical databases, ontologies, and the data integration process; and a web interface level, to allow interaction between the user and the system. The Clinical Module was built on the Entity-Attribute-Value (EAV) model. We also proposed a methodology to migrate data from legacy clinical databases to the integrative framework. A Chado instance was initialized using a relational database management system. The Clinical Module was implemented and the framework was loaded using data from a factual clinical research database. Clinical and demographic data as well as biomaterial data were obtained from patients with head and neck tumors. We implemented the IPTrans tool, a complete environment for data migration, which comprises: the construction of a model to describe the legacy clinical data, based on an ontology; an Extraction, Transformation and Load (ETL) process to extract the data from the source clinical database and load it into the Clinical Module of Chado; and the development of a web tool and a Bridge Layer to adapt the web tool to Chado, as well as other applications. Conclusions Open-source computational solutions currently available for translational science do not have a model to represent biomolecular information and are not integrated with existing bioinformatics tools. On the other hand, existing genomic data models do not represent clinical patient data. A framework was developed to support translational research by integrating biomolecular information coming from different "omics" technologies with patients' clinical and socio-demographic data. This framework should present some features: flexibility, compression, and robustness. The experiments accomplished on a use case demonstrated that the proposed system meets the requirements of flexibility and robustness, leading to the desired integration. The Clinical Module can be accessed at http://dcm.ffclrp.usp.br/caib/pg=iptrans.

Relevance: 100.00%

Publisher:

Abstract:

Background Recent medical and biological technology advances have stimulated the development of new testing systems that provide huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of genomics laboratories are typically characterized by high parallelism in testing and constant procedure changes. Results This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to a human genome laboratory. We introduce the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients with updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control-flow specifications based on process algebra (ACP). The main difference between our approach and related works is that we were able to join two important aspects: 1) process scalability, achieved through a relational database implementation, and 2) correctness of processes, using process algebra. Furthermore, the software allows end users to define genetic tests without requiring any knowledge of business process notation or process algebra. Conclusions This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have proved the feasibility and shown the usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing using easy end-user interfaces.
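A much-simplified sketch of what a process-algebra-style control-flow specification buys: a testing workflow expressed as sequential composition of steps, one of which is an alternative (ACP's choice operator), against which observed runs can be checked. The step names are hypothetical, not CEGH's actual process definitions.

```python
# Spec (informally): extract_dna . (sanger + ngs) . review . report
# Each position lists the alternatives permitted at that step.
SPEC = [("extract_dna",), ("sanger", "ngs"), ("review",), ("report",)]

def allowed(trace):
    """True if `trace` is a complete run permitted by SPEC."""
    if len(trace) != len(SPEC):
        return False
    return all(step in choices for step, choices in zip(trace, SPEC))

ok = allowed(["extract_dna", "ngs", "review", "report"])
bad = allowed(["extract_dna", "review", "report"])  # sequencing skipped
```

A real ACP specification also handles parallel composition and recursion; the point here is only that runs violating the declared control flow are rejected mechanically rather than by convention.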

Relevance: 100.00%

Publisher:

Abstract:

The Assimilation in the Unstable Subspace (AUS) was introduced by Trevisan and Uboldi in 2004, and developed by Trevisan, Uboldi, and Carrassi, to minimize the analysis and forecast errors by exploiting the flow-dependent instabilities of the forecast-analysis cycle system, which may be thought of as a system forced by observations. In the AUS scheme the assimilation is obtained by confining the analysis increment to the unstable subspace of the forecast-analysis cycle system, so that it has the same structure as the dominant instabilities of the system. The unstable subspace is estimated by Breeding on the Data Assimilation System (BDAS). AUS-BDAS has already been tested in realistic models and observational configurations, including a quasi-geostrophic model and a high-dimensional, primitive-equation ocean model; the experiments include both fixed and "adaptive" observations. In these contexts, the AUS-BDAS approach greatly reduces the analysis error, with reasonable computational costs for data assimilation compared, for example, to a prohibitive full Extended Kalman Filter. This is a follow-up study in which we revisit the AUS-BDAS approach in the more basic, highly nonlinear Lorenz 1963 convective model. We run observing system simulation experiments in a perfect model setting, and also with two types of model error: random and systematic. In the different configurations examined, and in a perfect model setting, AUS once again shows better efficiency than other advanced data assimilation schemes. In the present study, we develop an iterative scheme that leads to a significant improvement of the overall assimilation performance with respect to standard AUS. In particular, it boosts the efficiency of tracking regime changes, at low computational cost. Other data assimilation schemes need estimates of ad hoc parameters, which have to be tuned for the specific model at hand. In Numerical Weather Prediction models, tuning of parameters, and in particular estimating the model error covariance matrix, may turn out to be quite difficult. Our proposed approach, instead, may be easier to implement in operational models.

Relevance: 100.00%

Publisher:

Abstract:

This article describes the experience of applying EDM (educational data mining) techniques, specifically clustering, to a course available on the Ude@ platform of the Universidad de Antioquia. The objective is to classify students' interaction patterns from the information stored in the database of the Moodle platform. To this end, reports on resource usage and self-assessment are generated, which allow analysis of students' behavior and navigation patterns during their use of the LMS (Learning Management System).
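The clustering step can be sketched with a plain k-means over per-student interaction counts. The two features (resource views and quiz attempts) and the synthetic data are hypothetical stand-ins for counts extracted from the Moodle logs.

```python
import numpy as np

# Synthetic interaction counts for two hypothetical student profiles:
# columns are (resource views, quiz attempts).
rng = np.random.default_rng(1)
low = rng.normal([5.0, 1.0], 1.0, (20, 2))    # low-engagement students
high = rng.normal([40.0, 8.0], 2.0, (20, 2))  # high-engagement students
X = np.vstack([low, high])

def kmeans(X, k, iters=20):
    # Deterministic init: k evenly spaced rows as starting centers.
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)]
    for _ in range(iters):
        # Assign each student to the nearest center (squared distance).
        dists = ((X[:, None, :] - centers) ** 2).sum(axis=2)
        labels = np.argmin(dists, axis=1)
        # Recompute each center as the mean of its cluster.
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

labels, centers = kmeans(X, 2)
```

With well-separated profiles the two clusters recover the engagement groups; on real Moodle data one would normalize the features and choose k from the data rather than fixing it at 2.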