963 results for Six sigma (Quality control standard)
Abstract:
In this article, the results of a modified SERVQUAL questionnaire (Parasuraman et al., 1991) are reported. The modifications consisted of substituting questionnaire items particularly suited to a specific service (banking) and context (county of Girona, Spain) for the original, rather general and abstract items. These modifications led to more interpretable factors which accounted for a higher percentage of item variance. The data were submitted to various structural equation models, which made it possible to conclude that the questionnaire contains items of high measurement quality with respect to five identified dimensions of service quality, which differ from those specified by Parasuraman et al. and are specific to the banking service. The two dimensions relating to the behaviour of employees have the greatest predictive power on overall quality and satisfaction ratings, which enables managers to use a low-cost, reduced version of the questionnaire to monitor quality on a regular basis. It was also found that satisfaction and overall quality were perfectly correlated, thus showing that customers do not perceive these concepts as being distinct.
Abstract:
Because of their nature, technology projects require special attention with regard to their management. Aspects such as their temporary character demand specific knowledge and skills from the project manager, enabling him or her, within a limited period of time, to make optimal use of resources and human talent in pursuit of the project's objectives.
Abstract:
Within laboratory quality control activities, the final results for a particular analyte are considered intermediate products, given the relevance assigned to quality assurance as the ultimate goal of quality management programmes. This view requires establishing comprehensive instruments for detecting events such as cross-contamination and adopting measures to prevent the analytical procedure from being affected. Objective: the main objective was to establish a system for monitoring and controlling cross-contamination in a food microbiology laboratory. Materials and methods: the methodology consisted of developing flow diagrams for the procedures controlling the populations of aerobic mesophiles and moulds arising from contamination of environments, surfaces, sterile material, and culture media. These diagrams included a decision tree, designed to trigger control actions based on tolerance intervals established as an objective tool for decisions aimed at normalising the counts of the microbial populations in question. Results: the strictest alert limits were obtained for the populations of aerobic mesophiles and moulds in the different controls, except for the environment of the media-preparation area and the controls on sterile material. Conclusion: the process developed complemented the laboratory's internal quality control system by providing an objective means of closing non-conformities caused by cross-contamination.
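Alert and action limits of the kind described in this abstract are commonly derived from historical counts. A minimal sketch in Python, assuming log-normally distributed colony counts and the usual mean ± 2/3 SD control-chart convention; the paper's own tolerance intervals may be computed differently, and the function name is illustrative:

```python
import math
import statistics

def alert_limits(counts, warn_sd=2.0, action_sd=3.0):
    """Warning and action limits from historical CFU counts.

    Hypothetical helper: counts are log10-transformed (microbial counts
    are roughly log-normal), then limits are set at mean +/- 2 SD (alert)
    and +/- 3 SD (action), the common control-chart convention.
    """
    logs = [math.log10(c) for c in counts]
    mean = statistics.fmean(logs)
    sd = statistics.stdev(logs)
    return {
        "alert": (mean - warn_sd * sd, mean + warn_sd * sd),
        "action": (mean - action_sd * sd, mean + action_sd * sd),
    }
```

A count falling between the alert and action limits would prompt closer monitoring; one beyond the action limit would trigger corrective measures via the decision tree.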
Abstract:
Using safety as a competitiveness tool will translate, in the long term, into the elimination or reduction of all the unnecessary costs generated by errors, waste, leaks, accidents, unscheduled interruptions, and the like.
Abstract:
With the growing popularity of IT solutions as a key factor in increasing competitiveness and creating value for companies, the need to invest in IT projects has risen considerably. Limited resources as an obstacle to investment have forced companies to seek methodologies for selecting and prioritising projects, ensuring that the decisions made are those aligned with corporate strategies, so as to guarantee value creation and the maximisation of benefits. This thesis provides the foundations for implementing IT Project Portfolio Management (IT PPM) as an effective methodology for managing IT-based projects, and as a tool that gives executives clear decision-making criteria. The document explains how to implement IT PPM in seven steps, analysing the processes and roles required for its successful execution. It also provides different methods and criteria for project selection and prioritisation. After the theoretical part describing IT PPM, the thesis offers a case-study analysis of a pharmaceutical company. The company already has a project management department, but the need to implement IT PPM was identified because of its broad coverage of end-to-end processes in IT projects and as a way of ensuring benefit maximisation. Combining the theoretical research and the case-study analysis, the thesis concludes with a practical definition of an approximate IT PPM model as a recommendation for implementation in the Project Management Department.
Abstract:
Remote sensing from space-borne platforms is often seen as an appealing method of monitoring components of the hydrological cycle, including river discharge, because of its spatial coverage. However, data from these platforms are often less than ideal, because the geophysical properties of interest are rarely measured directly and the measurements that are taken can be subject to significant errors. This study assimilated water levels derived from a TerraSAR-X synthetic aperture radar image and digital aerial photography with simulations from a two-dimensional hydraulic model to estimate discharge, inundation extent, depths, and velocities at the confluence of the rivers Severn and Avon, UK. An ensemble Kalman filter was used to assimilate spot-height water levels derived by intersecting shorelines from the imagery with a digital elevation model. Discharge was estimated from the ensemble of simulations using state augmentation and then compared with gauge data. Assimilating the real data reduced the error between analyzed mean water levels and levels from three gauging stations to less than 0.3 m, which is less than typically found in post-event water-mark data from the field at these scales. Measurement bias was evident, but the method still provided a means of improving estimates of discharge for high flows where gauge data are unavailable or of poor quality. Posterior estimates of discharge had standard deviations between 52.7 m3 s-1 and 63.3 m3 s-1, below 15% of the gauged flows along the reach. Therefore, assuming a roughness uncertainty of 0.03-0.05 and no model structural errors, discharge could be estimated by the EnKF with accuracy similar to that arguably expected from gauging stations during flood events. Quality control prior to assimilation, in which measurements were rejected for lying in areas of high topographic slope or close to tall vegetation and trees, was found to be essential.
The study demonstrates the potential, but also the significant limitations, of currently available imagery to reduce discharge uncertainty in ungauged or poorly gauged basins when combined with model simulations in a data assimilation framework.
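The core of the approach, a stochastic ensemble Kalman filter analysis step with state augmentation (discharge appended to the water-level state so the level observations update it through the ensemble covariance), can be sketched as follows. This is a generic textbook EnKF, not the study's actual code, and all names are illustrative:

```python
import numpy as np

def enkf_update(ensemble, obs, obs_std, H):
    """One stochastic EnKF analysis step.

    ensemble : (n_state, n_members) augmented states, e.g. water levels
               with discharge appended as the last component.
    obs      : (n_obs,) observed water levels (e.g. SAR spot heights).
    obs_std  : observation error standard deviation.
    H        : (n_obs, n_state) linear observation operator.
    """
    n_state, n_mem = ensemble.shape
    R = (obs_std ** 2) * np.eye(len(obs))
    # Ensemble anomalies and sample covariances
    A = ensemble - ensemble.mean(axis=1, keepdims=True)
    HA = H @ A
    Pxy = A @ HA.T / (n_mem - 1)
    Pyy = HA @ HA.T / (n_mem - 1) + R
    K = Pxy @ np.linalg.inv(Pyy)          # Kalman gain
    # Perturbed observations, one realisation per member
    rng = np.random.default_rng(0)
    obs_pert = obs[:, None] + rng.normal(0.0, obs_std, (len(obs), n_mem))
    return ensemble + K @ (obs_pert - H @ ensemble)
```

Because discharge is part of the state vector, the level-discharge covariance in `Pxy` lets a level observation correct the discharge estimate even though discharge itself is never observed.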
Abstract:
Standardisation of microsatellite allele profiles between laboratories is of fundamental importance to the transferability of genetic fingerprint data and the identification of clonal individuals held at multiple sites. Here we describe two methods of standardisation applied to the microsatellite fingerprinting of 429 Theobroma cacao L. trees representing 345 accessions held in the world's largest Cocoa Intermediate Quarantine facility: the use of a partial allelic ladder through the production of 46 cloned and sequenced allelic standards (AJ748464 to AJ748509), and the use of standard genotypes selected to display a diverse allelic range. Until now, a lack of accurate and transferable identification information has impeded efforts to genetically improve the cocoa crop. To address this need, a global initiative to fingerprint all international cocoa germplasm collections using a common set of 15 microsatellite markers is in progress. Data reported here have been deposited with the International Cocoa Germplasm Database and form the basis of a searchable resource for clonal identification. To our knowledge, this is the first quarantine facility to be completely genotyped using microsatellite markers for the purpose of quality control and clonal identification. Implications of the results for retrospective tracking of labelling errors are briefly explored.
Abstract:
This paper reviews the current state of development of both near-infrared (NIR) and mid-infrared (MIR) spectroscopic techniques for process monitoring, quality control, and authenticity determination in cheese processing. Infrared spectroscopy has been identified as an ideal process analytical technology tool, and recent publications have demonstrated the potential of both NIR and MIR spectroscopy, coupled with chemometric techniques, for monitoring coagulation, syneresis, and ripening, as well as for determining authenticity, composition, and sensory and rheological parameters. Recent research is reviewed and compared on the basis of the experimental design and the spectroscopic and chemometric methods employed, to assess the potential of infrared spectroscopy as a technology for improving process control and quality in cheese manufacture. Emerging research areas for these technologies, such as cheese authenticity and food chain traceability, are also discussed.
Abstract:
A recently developed capillary electrophoresis (CE)-negative-ionisation mass spectrometry (MS) method was used to profile anionic metabolites in a microbial-host co-metabolism study. Urine samples from rats receiving antibiotics (penicillin G and streptomycin sulfate) for 0, 4, or 8 days were analysed. A quality control sample was measured repeatedly to monitor the performance of the applied CE-MS method. After peak alignment, relative standard deviations (RSDs) for migration time of five representative compounds were below 0.4 %, whereas RSDs for peak area were 7.9–13.5 %. Using univariate and principal component analysis of obtained urinary metabolic profiles, groups of rats receiving different antibiotic treatment could be distinguished based on 17 discriminatory compounds, of which 15 were downregulated and 2 were upregulated upon treatment. Eleven compounds remained down- or upregulated after discontinuation of the antibiotics administration, whereas a recovery effect was observed for others. Based on accurate mass, nine compounds were putatively identified; these included the microbial-mammalian co-metabolites hippuric acid and indoxyl sulfate. Some discriminatory compounds were also observed by other analytical techniques, but CE-MS uniquely revealed ten metabolites modulated by antibiotic exposure, including aconitic acid and an oxocholic acid. This clearly demonstrates the added value of CE-MS for nontargeted profiling of small anionic metabolites in biological samples.
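The repeatability metric reported above, relative standard deviation of repeated quality control measurements, is straightforward to compute. A minimal sketch (the function name is illustrative):

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation (%) of repeated QC measurements:
    100 * sample standard deviation / mean."""
    mean = statistics.fmean(values)
    return 100.0 * statistics.stdev(values) / mean
```

Applied separately to migration times and peak areas of the QC sample's representative compounds, this yields figures directly comparable to the <0.4 % and 7.9-13.5 % values quoted.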
Abstract:
With the growing number and significance of urban meteorological networks (UMNs) across the world, it is becoming critical to establish a standard metadata protocol. Indeed, a review of existing UMNs indicates large variations in the quality, quantity, and availability of metadata containing technical information (i.e., equipment, communication methods) and network practices (i.e., quality assurance/quality control and data management procedures). Without such metadata, the utility of UMNs is greatly compromised. There is a need to bring together the currently disparate sets of guidelines to ensure informed and well-documented future deployments. This should significantly improve the quality, and therefore the applicability, of the high-resolution data available from such networks. Here, the first metadata protocol for UMNs is proposed, drawing on current recommendations for urban climate stations and identified best practice in existing networks.
Abstract:
Climate data are used in a number of applications, including climate risk management and adaptation to climate change. However, the availability of climate data, particularly throughout rural Africa, is very limited. Available weather stations are unevenly distributed and mainly located along main roads in cities and towns. This imposes severe limitations on the availability of climate information and services for the rural community where, arguably, these services are needed most. Weather station data also suffer from gaps in the time series. Satellite proxies, particularly satellite rainfall estimates, have been used as alternatives because of their availability even over remote parts of the world. However, satellite rainfall estimates also suffer from a number of critical shortcomings, including heterogeneous time series, short periods of observation, and poor accuracy, particularly at higher temporal and spatial resolutions. An attempt is made here to alleviate these problems by combining station measurements with the complete spatial coverage of satellite rainfall estimates. Rain gauge observations are merged with a locally calibrated version of the TAMSAT satellite rainfall estimates to produce over 30 years (1983 to date) of rainfall estimates over Ethiopia at a spatial resolution of 10 km and a ten-day time scale. This involves quality control of the rain gauge data, generating a locally calibrated version of the TAMSAT rainfall estimates, and combining these with rain gauge observations from the national station network. The infrared-only satellite rainfall estimates produced using the relatively simple TAMSAT algorithm performed as well as, or even better than, other satellite rainfall products that use passive microwave inputs and more sophisticated algorithms.
There is no substantial difference between the gridded-gauge and combined gauge-satellite products over the test area in Ethiopia, which has a dense station network; however, the combined product exhibits better quality over parts of the country where stations are sparsely distributed.
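A gauge-satellite merging step of the kind described can be sketched as below. The calibration here is a simple least-squares fit against collocated gauges, whereas TAMSAT's operational local calibration and merging are more sophisticated; all names are illustrative:

```python
import numpy as np

def merge_gauge_satellite(sat_grid, gauge_vals, gauge_idx):
    """Hypothetical sketch of gauge-satellite rainfall merging.

    sat_grid   : 2-D satellite rainfall field.
    gauge_vals : rainfall observed at the gauges.
    gauge_idx  : flat indices of the grid cells containing the gauges.
    """
    sat_at_gauges = sat_grid.flat[gauge_idx]
    # Local calibration: gauge ~ a * satellite + b (least squares)
    a, b = np.polyfit(sat_at_gauges, gauge_vals, 1)
    merged = a * sat_grid + b
    # Trust the gauges where they exist
    merged.flat[gauge_idx] = gauge_vals
    return np.clip(merged, 0.0, None)  # rainfall cannot be negative
```

In a denser-network setting the calibrated field would add little over gridded gauges alone, matching the paper's finding; its value lies in the gauge-sparse cells, where only the calibrated satellite estimate is available.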
Abstract:
This paper presents a novel approach to the automatic classification of very large data sets composed of terahertz pulse transient signals, highlighting their potential use in biochemical, biomedical, pharmaceutical, and security applications. Two different types of THz spectra are considered in the classification process. First, a binary classification study of poly-A and poly-C ribonucleic acid samples is performed. This is then contrasted with a difficult multi-class classification problem involving spectra from six different powder samples which, although fairly indistinguishable in the optical spectrum, possess a few discernible spectral features in the terahertz part of the spectrum. Classification is performed using a complex-valued extreme learning machine algorithm that takes into account features in both the amplitude and the phase of the recorded spectra. Classification speed and accuracy are contrasted with those achieved using a support vector machine classifier. The study systematically compares the classifier performance achieved after adopting different Gaussian kernels when separating amplitude and phase signatures. The two signatures are presented as feature vectors for both training and testing purposes. The study confirms the utility of complex-valued extreme learning machine algorithms for classification of the very large data sets generated with current terahertz imaging spectrometers. The classifier can take into consideration heterogeneous layers within an object, as would be required within a tomographic setting, and is sufficiently robust to detect patterns hidden inside noisy terahertz data sets. The proposed study opens up the opportunity to establish complex-valued extreme learning machine algorithms as new chemometric tools that will assist the wider proliferation of terahertz sensing technology for chemical sensing, quality control, security screening, and clinical diagnosis.
Furthermore, the proposed algorithm should also be very useful in other applications requiring the classification of very large data sets.
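The speed of extreme learning machines on very large data sets comes from their training procedure: random, fixed hidden weights plus a closed-form least-squares solve for the output weights. A minimal real-valued sketch of that core idea follows; the paper's classifier is complex-valued and kernel-based, so this illustrates only the basic mechanism, and all names are assumptions:

```python
import numpy as np

def elm_train(X, y, n_hidden=50, seed=0):
    """Train a basic (real-valued) extreme learning machine.

    Input weights W and biases b are random and never updated; only the
    output weights beta are fitted, via a single least-squares solve.
    """
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                        # random hidden layer
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # closed-form output fit
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta
```

Because training reduces to one linear solve, scaling to the large spectral data sets produced by terahertz imaging spectrometers is far cheaper than iterative training, which is the property the study exploits.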
Abstract:
Instrumental neutron activation analysis (INAA) was applied to assess trace element concentrations in six samples of aspirin tablets acquired in São Paulo city, Brazil. Concentrations of the elements Br, Ca, Co, Cr, Fe, K, La, Na, Sc and Zn were determined. Comparisons were made between the results obtained and published data for aspirins from foreign countries. The certified reference material INCT-MPH-2 (Mixed Polish Herbs) was analyzed for quality control of the analytical results.
Abstract:
The objective of this research was to study the current stage of the quality control programmes used, according to the statements of the interviewees, by six independent audit firms headquartered or with a branch in the city of Rio de Janeiro, in the light of a previously defined theoretical framework, in order to assess the quality of the services provided by these firms and to identify possible differences between the programmes used (Chapter I). The literature review sought to present the quality control programmes theoretically used by audit firms in other countries, mainly in the USA, so as to define a theoretical frame of reference. Some considerations on the supervision of the accounting profession in Brazil were also presented (Chapter II). Next, the methodology applied to the research was presented, with the reasons for its use justified (Chapter III). The answers obtained during the interviews were presented in tables (I to IV) and notes, and allowed a description of the general characteristics of the firms, the quality control standards they use, and final considerations about them (Chapter IV). The answers made it possible to analyse the six audit firms surveyed, mainly with respect to their quality control standards, based on the theoretical framework defined in Chapter II (Chapter V). Finally, a summary of the research, the conclusions reached, the recommendations made to the parties involved in the audit process, and suggestions for further research were presented (Chapter VI).
Abstract:
LOPES, Jose Soares Batista et al. Application of multivariable control using artificial neural networks in a debutanizer distillation column. In: INTERNATIONAL CONGRESS OF MECHANICAL ENGINEERING - COBEM, 19, 5-9 Nov. 2007, Brasilia. Anais... Brasilia, 2007.