893 results for Statistical process control
Abstract:
A cognitively based instructional program for narrative writing was developed. The effects of using cognitively based schematic planning organizers at the pre-writing stage were evaluated with subjects from the Primary, Junior and Intermediate divisions. Results indicate that the use of organizers based on problem solving significantly improved the organization and the overall quality of narrative writing for students in grades 3, 6 and 7. The improvement of the treatment group over the control group in Organization ranged from 10.7% to 22.9%. Statistical and observational data point to many implications for further research into the cognitive basis of writing and reading; for the improvement and evaluation of school writing programs; for the design of school curricula; and for the in-service education of teachers of writing.
Abstract:
Background. Case-control studies are widely used by epidemiologists to assess the impact of certain exposures on a particular disease. These exposures may be represented by several time-dependent variables, and new methods are needed to estimate their effects accurately. Logistic regression, the conventional method for analysing case-control data, does not directly account for changes in covariate values over time. By contrast, survival analysis methods such as the Cox proportional hazards model can directly incorporate time-dependent covariates representing individual exposure histories. However, this requires careful handling of the risk sets because of the over-sampling of cases, relative to controls, in case-control studies. As shown in a previous simulation study, the optimal definition of the risk sets for the analysis of case-control data remains to be elucidated, and to be studied in the presence of time-dependent variables. Objective: The general objective is to propose and study new versions of the Cox model for estimating the impact of time-varying exposures in case-control studies, and to apply them to real case-control data on lung cancer and smoking. Methods. I identified new, potentially optimal risk-set definitions (the Weighted Cox model and the Simple weighted Cox model), in which different weights were assigned to cases and controls in order to reflect the proportions of cases and non-cases in the source population. The properties of the exposure-effect estimators were studied by simulation. Different aspects of exposure were generated (intensity, duration, cumulative exposure). The generated case-control data were then analysed with different versions of the Cox model, including the existing and new risk-set definitions, as well as with conventional logistic regression for comparison. The different regression models were then applied to real case-control data on lung cancer. The estimates of the effects of various smoking variables obtained with the different methods were compared with one another, and with the simulation results. Results. The simulation results show that the estimates from the proposed weighted Cox models, especially those of the Weighted Cox model, are far less biased than the estimates from the existing Cox models that simply include or exclude the future cases from each risk set. Moreover, the estimates from the Weighted Cox model were slightly, but systematically, less biased than those from logistic regression. The application to the real data shows larger differences between the estimates from logistic regression and from the weighted Cox models for some time-dependent smoking variables. Conclusions. The results suggest that the proposed new weighted Cox model could be an attractive alternative to logistic regression for estimating the effects of time-dependent exposures in case-control studies.
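The following is a minimal, hypothetical sketch of the weighting idea described above, not the thesis's exact Weighted Cox model: controls are up-weighted by the inverse of their sampling fraction so that cases and non-cases enter the risk sets in roughly their source-population proportions, and a Cox model with a time-dependent smoking covariate is fitted on counting-process data. The simulated data, column names, weighting rule and lifelines usage are all illustrative assumptions.

```python
# Illustrative sketch only: weighted Cox fit on simulated case-control data
# with a time-dependent exposure. Not the thesis's estimator.
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(42)
n, follow_up, control_fraction = 2000, 20.0, 0.10

# 1) Source population with a smoking intensity that changes at t = 10.
smoke_early = rng.integers(0, 30, size=n).astype(float)
smoke_late = np.clip(smoke_early + rng.normal(0.0, 5.0, size=n), 0.0, None)
t_event = rng.exponential(1.0 / (0.005 * np.exp(0.04 * smoke_early)))
event_time = np.minimum(t_event, follow_up)
is_case = (t_event <= follow_up).astype(int)

# 2) Case-control sample: every case plus a 10% random sample of non-cases.
keep = (is_case == 1) | (rng.random(n) < control_fraction)
weight = np.where(is_case == 1, 1.0, 1.0 / control_fraction)

# 3) Counting-process rows (start, stop] carrying the time-dependent covariate.
rows = []
for i in np.flatnonzero(keep):
    if event_time[i] <= 10.0:
        rows.append((i, 0.0, event_time[i], smoke_early[i], is_case[i], weight[i]))
    else:
        rows.append((i, 0.0, 10.0, smoke_early[i], 0, weight[i]))
        rows.append((i, 10.0, event_time[i], smoke_late[i], is_case[i], weight[i]))
df = pd.DataFrame(rows, columns=["id", "start", "stop", "smoking", "event", "w"])

# 4) Weighted Cox fit on the case-control data (robust/sandwich standard errors
#    would be advisable in practice when weights are not integers).
ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event", start_col="start",
        stop_col="stop", weights_col="w")
ctv.print_summary()
```

The point of the weights is that each control then contributes to the partial likelihood of every risk set in proportion to the number of non-cases it represents in the source population.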
Abstract:
This paper analyzes a proposed release control methodology, WIPLOAD Control (WIPLCtrl), using a transfer line case modeled with a Markov process methodology. The performance of WIPLCtrl is compared with that of CONWIP under 13 system configurations in terms of throughput, average inventory level and average cycle time. As a supplement to the analytical model, a simulation model of the transfer line is used to observe the performance of the release control methodologies on the standard deviation of cycle time. From the analysis, we identify the system configurations in which the advantages of WIPLCtrl can be observed.
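Below is a toy discrete-event sketch, not the paper's Markov model or its exact WIPLCtrl rule: it contrasts a job-count cap (CONWIP-like) with a cap on the total expected remaining workload of jobs in the line (WIPLOAD-like) on a three-machine transfer line. The rates, caps and release logic are illustrative assumptions.

```python
# Toy comparison of a job-count cap vs. a workload cap as release rules.
import heapq
import random

def simulate(rule, cap, rates=(1.0, 0.9, 1.1), horizon=50_000.0, seed=1):
    """rule='conwip' caps the number of jobs in the line;
    rule='wipload' caps the total expected remaining work of jobs in the line."""
    random.seed(seed)
    m = len(rates)
    mean_work = [1.0 / r for r in rates]
    total_work = sum(mean_work)
    queue = [[] for _ in range(m)]       # release times of jobs waiting at each station
    busy = [None] * m                    # release time of the job in service, or None
    events = []                          # heap of (departure time, station)
    in_line, load = 0, 0.0               # job count / remaining expected work in the line
    t, last_t = 0.0, 0.0
    done, cycle_sum, wip_area = 0, 0.0, 0.0

    def start(k):                        # begin service at station k if it is idle
        if busy[k] is None and queue[k]:
            busy[k] = queue[k].pop(0)
            heapq.heappush(events, (t + random.expovariate(rates[k]), k))

    def release():                       # admit new jobs while the cap allows it
        nonlocal in_line, load
        increment = 1 if rule == "conwip" else total_work
        while (in_line if rule == "conwip" else load) + increment <= cap:
            in_line += 1
            load += total_work
            queue[0].append(t)
            start(0)

    release()
    while events and t < horizon:
        t_next, k = heapq.heappop(events)
        wip_area += in_line * (t_next - last_t)
        t = last_t = t_next
        job, busy[k] = busy[k], None
        load -= mean_work[k]             # the work at station k is now finished
        if k + 1 < m:
            queue[k + 1].append(job)
            start(k + 1)
        else:                            # the job leaves the line
            in_line -= 1
            done += 1
            cycle_sum += t - job
        start(k)
        release()

    return done / t, wip_area / t, cycle_sum / max(done, 1)

caps = {"conwip": 6, "wipload": 6 * sum(1.0 / r for r in (1.0, 0.9, 1.1))}
for rule, cap in caps.items():
    throughput, avg_wip, avg_ct = simulate(rule, cap)
    print(f"{rule:8s} throughput={throughput:.3f}  avg WIP={avg_wip:.2f}  avg cycle time={avg_ct:.2f}")
```

The two rules admit jobs identically when the line is empty, but the workload-based cap credits partially processed jobs for the work they have already completed, so the two policies diverge as jobs progress through the line.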
Abstract:
Following a theoretical framework built from several authors' work on management control systems over several decades, this study examines and tests the relationship between the development of such systems and the firm's resources and capabilities. To this end, a case study was carried out at Teleperformance Colombia (TC), a company dedicated to business process outsourcing services. The study defined two variables for assessing the development of the management control system: design and use. For each of them, indicators and questions were defined to allow observation and subsequent analysis. Likewise, the resources and capabilities most important to the development of the business were selected: innovation, organizational learning and human capital. The existence of a relationship between these and the management control system implemented at TC was then tested. The information obtained was analyzed and contrasted using statistical tests widely applied to this type of study in the social sciences. Finally, six possible relationships were analyzed, of which only the positive relationship between the use of the management control system and the human-capital resource and capability was confirmed. The remaining relationships contradicted the theoretical propositions that posited some influence of management control systems on the innovation and organizational-learning resources and capabilities.
Abstract:
A novel algorithm for solving nonlinear discrete-time optimal control problems with model-reality differences is presented. The technique uses Dynamic Integrated System Optimisation and Parameter Estimation (DISOPE), which has been designed to achieve the correct optimal solution in spite of deficiencies in the mathematical model employed in the optimisation procedure. A method based on Broyden's ideas is used for approximating the required derivative trajectories. Ways of handling constraints on both manipulated and state variables are described. Further, a method for coping with batch-to-batch dynamic variations in the process, which are common in practice, is introduced. It is shown that the iterative procedure associated with the algorithm naturally suits application to batch processes. The algorithm is successfully applied to a benchmark problem consisting of the input profile optimisation of a fed-batch fermentation process.
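As a generic illustration of the Broyden idea the abstract alludes to (not DISOPE's specific implementation), the sketch below corrects a Jacobian estimate with a rank-one update after each new evaluation, so that derivatives along a trajectory can be approximated without differentiating the model explicitly. The mapping and iterates are made up.

```python
# Generic rank-one Broyden update for approximating derivatives along iterates.
import numpy as np

def broyden_update(B, dx, df):
    """'Good' Broyden update: adjust B so that B @ dx matches the observed df."""
    dx = dx.reshape(-1, 1)
    df = df.reshape(-1, 1)
    return B + (df - B @ dx) @ dx.T / float(dx.T @ dx)

# Toy usage on f(x) = [x0**2 + x1, sin(x1)] with an identity initial guess.
f = lambda x: np.array([x[0] ** 2 + x[1], np.sin(x[1])])
x = np.array([1.0, 0.5])
B = np.eye(2)
for _ in range(5):
    x_new = x + np.array([0.1, -0.05])      # hypothetical next iterate
    B = broyden_update(B, x_new - x, f(x_new) - f(x))
    x = x_new
print(B)                                    # approximate Jacobian along the visited trajectory
```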
Abstract:
Based on integrated system optimisation and parameter estimation, a method is described for on-line steady-state optimisation which compensates for model-plant mismatch and solves a nonlinear optimisation problem by iterating on a linear-quadratic representation. The method requires real process derivatives, which are estimated using a dynamic identification technique. The utility of the method is demonstrated using a simulation of the Tennessee Eastman benchmark chemical process.
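The following is a schematic, textbook-style ISOPE/modifier-adaptation loop rather than the paper's exact algorithm: at each iteration the model-based optimisation is corrected by a modifier equal to the difference between the estimated plant gradient and the model gradient, so the set-point converges to the plant optimum despite the mismatched model. The plant, model, gradient estimator and filter gain are all illustrative assumptions (the paper estimates plant derivatives by dynamic identification, not finite differences).

```python
# Schematic ISOPE-style iteration with a gradient-mismatch modifier.
import numpy as np
from scipy.optimize import minimize

J_plant = lambda u: (u[0] - 2.0) ** 2 + 0.5 * (u[1] + 1.0) ** 2   # "real" process cost
J_model = lambda u: 1.3 * (u[0] - 1.2) ** 2 + (u[1] - 0.4) ** 2   # mismatched model

def gradient(J, u, h=1e-3):
    """Central-difference gradient; stands in for the derivative estimates."""
    g = np.zeros_like(u)
    for i in range(len(u)):
        e = np.zeros_like(u); e[i] = h
        g[i] = (J(u + e) - J(u - e)) / (2 * h)
    return g

u = np.array([0.0, 0.0])
for _ in range(30):
    lam = gradient(J_plant, u) - gradient(J_model, u)        # first-order modifier
    modified = lambda v, lam=lam: J_model(v) + lam @ v       # modified model problem
    u_star = minimize(modified, u).x
    u = u + 0.4 * (u_star - u)                               # relaxation filter
print("converged set-point:", u)                             # approx. [2, -1], the plant optimum
```

At a fixed point the modified model's stationarity condition reduces to a stationary point of the plant cost, which is why the scheme compensates for model-plant mismatch.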
Abstract:
In recent years, various efforts have been made in air traffic control (ATC) to maintain traffic safety and efficiency in the face of increasing air traffic demands. ATC is a complex process that depends to a large degree on human capabilities, and so understanding how controllers carry out their tasks is an important issue in the design and development of ATC systems. In particular, the human factor is considered to be a serious problem in ATC safety and has been identified as a causal factor in both major and minor incidents. There is, therefore, a need to analyse the mechanisms by which errors occur due to complex factors and to develop systems that can deal with these errors. From the cognitive process perspective, it is essential that system developers have an understanding of the more complex working processes that involve the cooperative work of multiple controllers. Distributed cognition is a methodological framework for analysing cognitive processes that span multiple actors mediated by technology. In this research, we attempt to analyse and model interactions that take place in en route ATC systems based on distributed cognition. We examine the functional problems in an ATC system from a human factors perspective, and conclude by identifying certain measures by which to address these problems. This research focuses on the analysis of air traffic controllers' tasks for en route ATC and modelling controllers' cognitive processes.
Abstract:
The main goal of this work was to evaluate the thermodynamic parameters of the soybean oil extraction process using ethanol as solvent. The experimental treatments were as follows: aqueous solvents with water contents varying from 0 to 13% (mass basis) and extraction temperatures varying from 50 to 100 °C. The distribution coefficients of oil at equilibrium were used to calculate the enthalpy, entropy and free-energy changes. The results indicate that the oil extraction process with ethanol is feasible and spontaneous, particularly at higher temperatures. The influence of the water level in the solvent and of temperature was also analysed using response surface methodology (RSM); the extraction yield was strongly affected by both independent variables. A joint analysis of the thermodynamic parameters and the RSM results indicates the optimal levels of solvent hydration and temperature at which to perform the extraction.
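A minimal sketch of the kind of calculation described above, assuming a van't Hoff-type treatment: ln K is regressed against 1/T to recover the enthalpy and entropy changes, and the free-energy change is taken as -RT ln K. The distribution coefficients below are made up for illustration; only the form of the computation follows the abstract.

```python
# Thermodynamic parameters from equilibrium distribution coefficients (toy values).
import numpy as np

R = 8.314                                                  # J/(mol*K)
T = np.array([50.0, 60.0, 75.0, 90.0, 100.0]) + 273.15     # extraction temperatures, K
K = np.array([0.8, 1.0, 1.4, 1.9, 2.3])                    # hypothetical distribution coefficients

slope, intercept = np.polyfit(1.0 / T, np.log(K), 1)       # ln K = -dH/(R*T) + dS/R
dH = -slope * R                                            # enthalpy change, J/mol
dS = intercept * R                                         # entropy change, J/(mol*K)
dG = -R * T * np.log(K)                                    # free-energy change at each T

print(f"dH = {dH / 1000:.1f} kJ/mol, dS = {dS:.1f} J/(mol*K)")
print("dG (kJ/mol):", np.round(dG / 1000, 2))
```

With K increasing with temperature, dH comes out positive (endothermic) and dG becomes more negative at higher temperatures, the pattern the abstract describes as feasible and spontaneous extraction.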
Abstract:
In this paper, three single control charts are proposed to monitor individual observations of a bivariate Poisson process. For a specified false-alarm risk, their control limits and average run lengths (ARLs) were determined in order to compare their performance for different types and sizes of shifts. In most cases, the single charts performed better than two separate control charts (one for each quality characteristic). A numerical example illustrates the proposed control charts.
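As a hedged illustration of the idea (not the paper's three charts), the sketch below generates bivariate Poisson counts by the usual trivariate-reduction construction, places a single Shewhart-type chart on the sum of the two counts, and estimates the in-control and out-of-control ARLs by Monte Carlo. The rates and the control limit are illustrative assumptions.

```python
# Monte-Carlo ARL estimate for a single chart on the sum of bivariate Poisson counts.
import numpy as np

rng = np.random.default_rng(7)

def sample_sum(lam1, lam2, lam0, size):
    """Bivariate Poisson via trivariate reduction: X1 = Y1 + Y0, X2 = Y2 + Y0."""
    y0 = rng.poisson(lam0, size)
    x1 = rng.poisson(lam1, size) + y0
    x2 = rng.poisson(lam2, size) + y0
    return x1 + x2

UCL = 19          # illustrative upper control limit for the single statistic X1 + X2

def arl(lam1, lam2, lam0, n=1_000_000):
    """For a Shewhart chart on i.i.d. statistics, ARL = 1 / P(signal)."""
    p_signal = np.mean(sample_sum(lam1, lam2, lam0, n) > UCL)
    return 1.0 / p_signal

print("in-control ARL:        ", round(arl(4.0, 3.0, 1.0)))
print("ARL after shift in X1: ", round(arl(6.0, 3.0, 1.0)))
```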
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
This work presents a software package developed to process solar radiation data. The software can be used in meteorological and climatic stations, and also to support solar radiation measurements in research on solar energy availability, allowing data quality control, statistical calculations and validation of models, as well as easy interchange of data.
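The abstract does not detail the software, but the following sketch shows the sort of quality-control check such a package typically applies: global irradiance samples are flagged when they fall outside physically plausible bounds. The thresholds, column names and data are illustrative assumptions.

```python
# Simple plausibility-limit quality-control flag for solar irradiance samples.
import pandas as pd

SOLAR_CONSTANT = 1367.0      # W/m^2, approximate extraterrestrial irradiance

data = pd.DataFrame({
    "timestamp": pd.date_range("2024-01-01 06:00", periods=6, freq="h"),
    "global_irradiance": [-5.0, 120.0, 480.0, 950.0, 1700.0, 300.0],   # W/m^2
})

# Basic limits: not appreciably negative, not above ~1.2x the solar constant.
lower, upper = -4.0, 1.2 * SOLAR_CONSTANT
data["qc_flag"] = (data["global_irradiance"] < lower) | (data["global_irradiance"] > upper)

print(data)
print(f"{data['qc_flag'].sum()} of {len(data)} samples flagged for review")
```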
Abstract:
Traditionally, an X-bar chart is used to control the process mean and an R chart is used to control the process variance. However, these charts are not sensitive to small changes in the process parameters. The adaptive X-bar and R charts might be considered if the aim is to detect small disturbances. Owing to the statistical character of the joint X-bar and R charts with fixed or adaptive parameters, they are not reliable in identifying the nature of a disturbance, whether it is one that shifts the process mean, increases the process variance, or leads to a combination of both effects. In practice, the speed with which the control charts detect process changes may be more important than their ability to identify the nature of the change. Under these circumstances, it seems advantageous to consider a single chart, based on only one statistic, to monitor the process mean and variance simultaneously. In this paper, we propose the adaptive non-central chi-square statistic chart. This new chart is more effective than the adaptive X-bar and R charts in detecting disturbances that shift the process mean, increase the process variance, or lead to a combination of both effects.
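A sketch of the non-adaptive core of such a chart follows: each sample of size n is summarised by the single statistic T = sum((x_i - mu0 + d*sigma0)^2) / sigma0^2, which is non-central chi-square with n degrees of freedom and non-centrality n*d^2 when the process is in control, so one limit monitors mean and variance together. The adaptive variation of sample size, sampling interval and limits described in the abstract is omitted; the values of d, n and the false-alarm rate are illustrative assumptions.

```python
# Single non-central chi-square statistic for joint mean/variance monitoring (fixed design).
import numpy as np
from scipy.stats import ncx2

mu0, sigma0 = 10.0, 2.0       # in-control mean and standard deviation
n, d = 5, 1.0                 # sample size and offset constant
alpha = 1.0 / 370.0           # in-control false-alarm rate (ARL0 about 370)

ucl = ncx2.ppf(1.0 - alpha, df=n, nc=n * d ** 2)   # single upper control limit

def t_statistic(sample):
    return np.sum((sample - mu0 + d * sigma0) ** 2) / sigma0 ** 2

rng = np.random.default_rng(3)
cases = [
    ("in control",        rng.normal(mu0, sigma0, size=n)),
    ("mean shift",        rng.normal(mu0 + 1.5 * sigma0, sigma0, size=n)),
    ("variance increase", rng.normal(mu0, 2.0 * sigma0, size=n)),
]
for name, sample in cases:
    t = t_statistic(sample)
    print(f"{name:18s} T = {t:7.2f}  signal: {t > ucl}")
```

Because both a shifted mean and an inflated variance push T upward, a single upper limit reacts to either disturbance, which is the design rationale the abstract gives for using one statistic.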