937 results for Process control -- Statistical methods


Relevance: 100.00%

Abstract:

The opportunity to produce microalgal biomass has attracted interest because of the many uses it can have, whether in bioenergy production, as a food source, or as a product of carbon dioxide biofixation. In general, large-scale production of cyanobacteria and microalgae is monitored through offline physicochemical analyses. In this context, the objective of this work was to monitor cell concentration in a raceway photobioreactor for microalgal biomass production using digital data acquisition and process control techniques, through inline acquisition of illuminance, biomass concentration, temperature and pH data. To this end, it was necessary to build a software-based sensor capable of determining microalgal biomass concentration from optical measurements of the intensity of scattered monochromatic radiation, and to develop a mathematical model of microalgal biomass production on a microcontroller, using a natural-computing algorithm to fit the model. An autonomous system for recording cultivation data was designed, built and tested during outdoor pilot-scale cultivation of Spirulina sp. LEB 18. A biomass concentration sensor based on measuring transmitted radiation was tested. In a second stage, an optical sensor of Spirulina sp. LEB 18 biomass concentration, based on measuring the intensity of radiation scattered by the cyanobacterial suspension, was designed, built and tested in a laboratory experiment under controlled conditions of illumination, temperature and biomass-suspension flow. From the light-scattering measurements, a neuro-fuzzy inference system was built, serving as a software sensor of biomass concentration in the culture.
Finally, from the biomass concentrations of the culture over time, the use of the Arduino platform for empirical modelling of the growth kinetics with the Verhulst equation was explored. Measurements from the optical sensor based on the intensity of monochromatic radiation transmitted through the suspension, used under outdoor conditions, showed low correlation between biomass concentration and radiation, even at concentrations below 0.6 g/L. When optical scattering by the culture suspension was investigated, monochromatic radiation at 530 nm scattered at 45° and 90° increased linearly with concentration, with a coefficient of determination of 0.95 in both cases. It was possible to build a software-based biomass concentration sensor using the combined scattered-radiation intensities at 45° and 135°, with a coefficient of determination of 0.99. It is feasible to perform, simultaneously on an Arduino microcontroller, inline determination of Spirulina cultivation process variables and empirical kinetic modelling of the micro-organism's growth with the Verhulst equation.
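As an illustration of the kind of empirical kinetic modelling described above, the sketch below fits the Verhulst (logistic) equation to synthetic biomass data with SciPy. The data, initial guesses and parameter values are invented for the example; the thesis itself performed the fit on an Arduino microcontroller with a natural-computing algorithm, which is not reproduced here.

```python
import numpy as np
from scipy.optimize import curve_fit

def verhulst(t, x0, xmax, mu):
    """Closed-form solution of the Verhulst equation dX/dt = mu*X*(1 - X/Xmax)."""
    return xmax / (1.0 + ((xmax - x0) / x0) * np.exp(-mu * t))

# Hypothetical cultivation data: time in days, biomass in g/L.
t = np.linspace(0, 20, 30)
rng = np.random.default_rng(0)
x_obs = verhulst(t, 0.15, 2.0, 0.35) + rng.normal(0, 0.02, t.size)

# Least-squares fit of the three kinetic parameters.
(x0_hat, xmax_hat, mu_hat), _ = curve_fit(verhulst, t, x_obs, p0=[0.1, 1.5, 0.2])
```

On a microcontroller the same model could be fitted with a population-based search (as the thesis does with natural computing) in place of SciPy's least-squares routine.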

Relevance: 100.00%

Abstract:

International audience

Relevance: 100.00%

Abstract:

Currently, decision analysis in production processes involves a level of detail at which the problem is subdivided and analysed in terms of different, conflicting points of view. Multi-criteria analysis has become an important tool supporting assertive decisions related to the production process, and it has been incorporated into various areas of production engineering by applying multi-criteria methods to the problems of the productive sector. This research presents a statistical study on the use of multi-criteria methods in the areas of Production Engineering, in which 935 papers were filtered from 20,663 publications in scientific journals, considering publication quality based on the impact factor published by the JCR between 2010 and 2015. Descriptive statistics are used to represent information about the volume of applications of the methods. Relevant results were found with respect to which advanced methods are being applied and in which areas of Production Engineering. This information may support researchers preparing a multi-criteria application, making it possible to check in which problems, and how often, other authors have used multi-criteria methods.

Relevance: 100.00%

Abstract:

When designing systems that are complex, dynamic and stochastic in nature, simulation is generally recognised as one of the best design support technologies, and a valuable aid in the strategic and tactical decision-making process. A simulation model consists of a set of rules that define how a system changes over time, given its current state. Unlike analytical models, a simulation model is not solved but run, and the changes of system state can be observed at any point in time. This provides insight into system dynamics rather than just predicting the output of a system based on specific inputs. Simulation is not a decision-making tool but a decision-support tool, allowing better-informed decisions to be made. Due to the complexity of the real world, a simulation model can only be an approximation of the target system. The essence of the art of simulation modelling is abstraction and simplification: only those characteristics that are important for the study and analysis of the target system should be included in the simulation model. The purpose of simulation is either to better understand the operation of a target system, or to make predictions about a target system's performance. It can be viewed as an artificial white room which allows one to gain insight and to test new theories and practices without disrupting the daily routine of the focal organisation. What you can expect to gain from a simulation study is very well summarised by FIRMA (2000). The idea is that if the theory that has been framed about the target system holds, and if this theory has been adequately translated into a computer model, you can answer some of the following questions:
· Which kind of behaviour can be expected under arbitrarily given parameter combinations and initial conditions?
· Which kind of behaviour will a given target system display in the future?
· Which state will the target system reach in the future?
The required accuracy of the simulation model depends very much on the type of question one is trying to answer. To answer the first question, the simulation model needs to be an explanatory model, which requires less data accuracy. In comparison, the model required to answer the latter two questions has to be predictive in nature and therefore needs highly accurate input data to achieve credible outputs. These predictions involve showing trends rather than giving precise, absolute predictions of target-system performance. The numerical results of a simulation experiment are, on their own, most often not very useful and need to be rigorously analysed with statistical methods. These results then need to be considered in the context of the real system and interpreted in a qualitative way to make meaningful recommendations or compile best-practice guidelines. One needs a good working knowledge of the behaviour of the real system to be able to fully exploit the understanding gained from simulation experiments. The goal of this chapter is to introduce the newcomer to what we think is a valuable addition to the toolset of analysts and decision makers. We give a summary of information gathered from the literature and of the experience we have gained first-hand over the last five years while obtaining a better understanding of this exciting technology. We hope this will help you avoid some pitfalls that we unwittingly encountered. Section 2 is an introduction to the different types of simulation used in Operational Research and Management Science, with a clear focus on agent-based simulation. In Section 3 we outline the theoretical background of multi-agent systems and their elements, to prepare you for Section 4, where we discuss how to develop a multi-agent simulation model. Section 5 outlines a simple example of a multi-agent system.
Section 6 provides a collection of resources for further study, and finally in Section 7 we conclude the chapter with a short summary.
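As a first taste of what the later sections develop in full, the following is a deliberately minimal agent-based simulation sketch: a set of rules applied repeatedly to the current system state. The toy rumour-spreading model and all parameters are our own illustration, not taken from the chapter.

```python
import random

class Agent:
    """An agent with a single binary state: has it heard the rumour?"""
    def __init__(self):
        self.informed = False

def step(agents, contact_prob=0.3, rng=None):
    """One rule application: each informed agent may inform a random contact."""
    rng = rng or random
    for a in agents:
        if a.informed:
            other = rng.choice(agents)
            if rng.random() < contact_prob:
                other.informed = True

def run(n_agents=100, n_steps=50, seed=42):
    rng = random.Random(seed)
    agents = [Agent() for _ in range(n_agents)]
    agents[0].informed = True          # initial condition: one informed agent
    history = []
    for _ in range(n_steps):
        step(agents, rng=rng)
        history.append(sum(a.informed for a in agents))
    return history

history = run()
```

Observing `history` over time, rather than solving for it analytically, is exactly the "run, don't solve" character of simulation described above.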

Relevance: 100.00%

Abstract:

In the early 1900s, documentalists began to take an interest in applying mathematics and statistics to bibliographic units. F. J. Coles and Nellie B. Eales made the first such study in 1917, analysing a group of document titles by country of origin (White, p. 35). In 1923, E. Wyndham Hulme was the first person to use the term "statistical bibliography", proposing the use of statistical methods to obtain parameters for understanding the process of written communication and the nature and course of a discipline's development. To that end, he began by counting a number of documents and analysing several facets of the written communication employed in them (Ferrante, p. 201). In a paper written in 1969, Alan Pritchard proposed the term "bibliometrics" to replace Hulme's "statistical bibliography", arguing that the latter term is ambiguous, not very descriptive, and easily confused with pure statistics or statistics on bibliographies. He defined bibliometrics as the application of mathematics and statistical methods to books and other documents (p. 348-349), and the term has been used ever since.

Relevance: 100.00%

Abstract:

This dissertation applies statistical methods to the evaluation of automatic summarization using data from the Text Analysis Conferences in 2008-2011. Several aspects of the evaluation framework itself are studied, including the statistical testing used to determine significant differences, the assessors, and the design of the experiment. In addition, a family of evaluation metrics is developed to predict the score an automatically generated summary would receive from a human judge and its results are demonstrated at the Text Analysis Conference. Finally, variations on the evaluation framework are studied and their relative merits considered. An over-arching theme of this dissertation is the application of standard statistical methods to data that does not conform to the usual testing assumptions.
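The dissertation's concrete tests are not reproduced in this summary; as a generic sketch of one standard remedy when data violate the usual testing assumptions, the following implements a paired permutation (approximate randomization) test on per-document score differences between two hypothetical summarization systems. All scores below are invented.

```python
import random

def paired_permutation_test(a, b, n_resamples=10000, seed=0):
    """Two-sided p-value for the mean paired score difference between two systems.

    Under the null, each per-document difference is equally likely to have
    either sign, so we randomly flip signs and count resamples at least as
    extreme as the observed mean difference.
    """
    rng = random.Random(seed)
    diffs = [x - y for x, y in zip(a, b)]
    observed = abs(sum(diffs) / len(diffs))
    hits = 0
    for _ in range(n_resamples):
        signed = [d if rng.random() < 0.5 else -d for d in diffs]
        if abs(sum(signed) / len(signed)) >= observed:
            hits += 1
    return (hits + 1) / (n_resamples + 1)   # add-one smoothing avoids p = 0

# Hypothetical per-document scores for two summarization systems.
sys_a = [0.42, 0.51, 0.38, 0.47, 0.55, 0.44, 0.49, 0.40]
sys_b = [0.39, 0.46, 0.37, 0.42, 0.50, 0.41, 0.45, 0.38]
p = paired_permutation_test(sys_a, sys_b)
```

Such resampling tests make no normality assumption, which is why they are a common choice for evaluation scores that do not conform to the usual testing assumptions.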

Relevance: 100.00%

Abstract:

This paper presents a new tuning methodology of the main controller of an internal model control structure for n×n stable multivariable processes with multiple time delays based on the centralized inverted decoupling structure. Independently of the system size, very simple general expressions for the controller elements are obtained. The realizability conditions are provided and the specification of the closed-loop requirements is explained. A diagonal filter is added to the proposed control structure in order to improve the disturbance rejection without modifying the nominal set-point response. The effectiveness of the method is illustrated through different simulation examples in comparison with other works.

Relevance: 100.00%

Abstract:

The main aim of this study was to determine the impact of innovation on productivity in service sector companies — especially those in the hospitality sector — that value the reduction of environmental impact as relevant to the innovation process. We used a structural analysis model based on the one developed by Crépon, Duguet, and Mairesse (1998). This model is known as the CDM model (an acronym of the authors' surnames). These authors developed seminal studies in the field of the relationships between innovation and productivity (see Griliches 1979; Pakes and Griliches 1980). The main advantage of the CDM model is its ability to integrate the process of innovation and business productivity from an empirical perspective.

Relevance: 100.00%

Abstract:

The employee branding process has been shown to promote and strengthen the psychological contract between employees and the organization by increasing and maximizing employees' sense of commitment and loyalty, according to Miles and Mangold. This research focuses on the impact of mentoring and helping relationships on the employee branding process, in an integrated view of human resources management and organizational behavior, based on the exchange relations of relationship marketing, from a perspective of management by competencies with a focus on people. With the introduction of the new variable (mentoring and helping relationships), this research enriches and extends the employee branding process proposed by Miles and Mangold in 2004 and 2005, and presents the construction of a diagnostic instrument for the innovative Employee Brand Effect process. The research took place in 30 organizations, with a total of 725 questionnaires, which allowed the validation and reliability of the instrument to be established and demonstrated, through statistical methods, the influence of mentoring and helping-relationship actions and of interpersonal relationships in promoting the employee branding process. If the employee branding process already improved organizational results, with this research it can be stated that the Employee Brand Effect process not only improves them but also boosts the organization's brand image through the dynamic, catalysing action of employees' interpersonal relationships, inside and outside the organization, with the introduction and promotion of mentoring and helping-relationship actions between leaders and followers.

Relevance: 100.00%

Abstract:

The aim of this paper is to present new results on H-infinity control synthesis for time-delay linear systems. We extend the use of a finite-order LTI system, called the comparison system, to H-infinity analysis and design. Unlike other control design methods available in the literature to date, the one presented here treats time-delay system control design with classical numeric routines based on the Riccati equations arising from H-infinity theory. The proposed algorithm is simple, efficient and easy to implement. Some examples illustrating state- and output-feedback design are solved and discussed in order to highlight the most relevant characteristics of the theoretical results. Moreover, a practical application involving a 3-DOF networked control system is presented.
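The paper's comparison-system construction is not reproduced here; as a minimal sketch of the kind of Riccati-based numeric routine the approach relies on, the following solves a continuous-time algebraic Riccati equation with SciPy and forms a stabilizing state-feedback gain for an arbitrary second-order plant. The plant matrices and weights are invented for illustration.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Arbitrary stable-izable example plant: x' = Ax + Bu.
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
B = np.array([[0.0],
              [1.0]])
Q = np.eye(2)          # state weighting
R = np.array([[1.0]])  # input weighting

# Stabilizing solution of A'P + PA - P B R^{-1} B' P + Q = 0.
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)          # feedback gain K = R^{-1} B' P
closed_loop_eigs = np.linalg.eigvals(A - B @ K)
```

The resulting closed-loop eigenvalues all have negative real parts, the basic certificate such Riccati-based designs provide.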

Relevance: 100.00%

Abstract:

Currently, the oil industry is one of the biggest causes of environmental pollution. The objective of this work was to reduce the concentration of copper and chromium in produced water from the oil industry. Natural sisal fiber (Agave sp.) treated with nitric acid and sodium hydroxide was used as the adsorbent. All vegetable fibers have physical and morphological properties that enable the adsorption of pollutants; the basic composition of sisal is cellulose, hemicellulose and lignin. The characterization results were typical of vegetable fibers, except for the surface area, which was practically zero. In the first stage of adsorption, the effects of temperature and time were evaluated, seeking to optimize the execution of the factorial design. The results showed that the most viable treatment was the acid treatment for five hours at 30 °C. The second phase was a factorial design using the acid treatment and the five-hour time determined in the first phase. The tests were conducted following the experimental design, and the results were analysed by statistical methods in order to optimize the main parameters that influence the process: pH, concentration (mol/L) and the ratio of fiber mass to metal-solution volume. The mass/volume ratio factor showed significant interference in the adsorption of chromium and copper. After optimization, the highest extraction percentages (98%) were obtained under the following operating conditions: pH 5-6, concentration 100 ppm, and 1 g of fiber per 50 mL of solution. The results showed that the adsorption process was efficient in removing chromium and copper using sisal fibers, although further studies are required to optimize the process.
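A hedged sketch of the factorial-design analysis described above: estimating main effects in a two-level, three-factor design (pH, concentration, fiber mass per solution volume). The removal percentages below are invented for illustration, not the study's data.

```python
import itertools
import numpy as np

factors = ["pH", "concentration", "mass_per_volume"]
# Coded levels -1 (low) / +1 (high) for each factor: full 2^3 design, 8 runs.
design = np.array(list(itertools.product([-1, 1], repeat=3)))
# Hypothetical % metal removal for each run (chosen so the mass/volume
# ratio dominates, echoing the study's qualitative finding).
removal = np.array([55.0, 80.0, 58.0, 84.0, 60.0, 88.0, 62.0, 98.0])

# Main effect of a factor = mean response at +1 minus mean response at -1.
effects = {f: removal[design[:, i] == 1].mean() - removal[design[:, i] == -1].mean()
           for i, f in enumerate(factors)}
```

Ranking the absolute effects is the usual first screening step before fitting a full regression model to the design.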

Relevance: 100.00%

Abstract:

This research develops an econometric framework to analyze time series processes with bounds. The framework is general enough that it can incorporate several different kinds of bounding information that constrain continuous-time stochastic processes between discretely-sampled observations. It applies to situations in which the process is known to remain within an interval between observations, by way of either a known constraint or through the observation of extreme realizations of the process. The main statistical technique employs the theory of maximum likelihood estimation. This approach leads to the development of the asymptotic distribution theory for the estimation of the parameters in bounded diffusion models. The results of this analysis present several implications for empirical research. The advantages are realized in the form of efficiency gains, bias reduction and in the flexibility of model specification. A bias arises in the presence of bounding information that is ignored, while it is mitigated within this framework. An efficiency gain arises, in the sense that the statistical methods make use of conditioning information, as revealed by the bounds. Further, the specification of an econometric model can be uncoupled from the restriction to the bounds, leaving the researcher free to model the process near the bound in a way that avoids bias from misspecification. One byproduct of the improvements in model specification is that the more precise model estimation exposes other sources of misspecification. Some processes reveal themselves to be unlikely candidates for a given diffusion model, once the observations are analyzed in combination with the bounding information. A closer inspection of the theoretical foundation behind diffusion models leads to a more general specification of the model. This approach is used to produce a set of algorithms to make the model computationally feasible and more widely applicable. 
Finally, the modeling framework is applied to a series of interest rates, which, for several years, have been constrained by the lower bound of zero. The estimates from a series of diffusion models suggest a substantial difference in estimation results between models that ignore bounds and the framework that takes bounding information into consideration.
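The bounded-diffusion likelihood developed in this research is not reproduced here; as a minimal illustration of maximum likelihood estimation for a discretely-sampled diffusion, the following fits an Ornstein-Uhlenbeck process, which has an exact Gaussian transition density, to simulated data. A bounded model would modify this transition density near the bound; all parameter values below are invented.

```python
import numpy as np
from scipy.optimize import minimize

def simulate_ou(kappa, theta, sigma, x0, dt, n, seed=1):
    """Exact simulation of dX = kappa*(theta - X)dt + sigma*dW at step dt."""
    rng = np.random.default_rng(seed)
    x = np.empty(n + 1)
    x[0] = x0
    for i in range(n):
        mean = theta + (x[i] - theta) * np.exp(-kappa * dt)
        var = sigma**2 * (1 - np.exp(-2 * kappa * dt)) / (2 * kappa)
        x[i + 1] = rng.normal(mean, np.sqrt(var))
    return x

def neg_log_lik(params, x, dt):
    """Negative log-likelihood from the exact Gaussian transition density."""
    kappa, theta, sigma = params
    if kappa <= 0 or sigma <= 0:
        return np.inf
    mean = theta + (x[:-1] - theta) * np.exp(-kappa * dt)
    var = sigma**2 * (1 - np.exp(-2 * kappa * dt)) / (2 * kappa)
    return 0.5 * np.sum(np.log(2 * np.pi * var) + (x[1:] - mean) ** 2 / var)

x = simulate_ou(kappa=1.5, theta=0.05, sigma=0.1, x0=0.05, dt=1 / 52, n=2000)
fit = minimize(neg_log_lik, x0=[1.0, 0.0, 0.2], args=(x, 1 / 52), method="Nelder-Mead")
kappa_hat, theta_hat, sigma_hat = fit.x
```

As the research notes for bounded models, ignoring conditioning information (here, none) or misspecifying the density near a bound would bias exactly this kind of estimator.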

Relevance: 100.00%

Abstract:

Objective: To determine differences in the baseline impedances recorded during radiofrequency renal denervation procedures in patients who underwent this procedure at the Fundación Cardioinfantil in Bogotá between 2012 and 2015. Materials and methods: Observational, analytical, retrospective study in which all baseline impedances measured during renal denervation procedures were analyzed, looking for significant differences between the segments of the treated arteries, stratified as proximal, middle or distal and superior, lateral, inferior or ostial, with follow-up of office blood pressure at three, six and twelve months. Results: 150 successful renal denervation points were evaluated, corresponding to 23 renal arteries from 11 procedures. The median age was 56 years. A linear regression model found no statistically significant difference between the impedances of any of the arterial segments or anatomical sites. Reductions in systolic blood pressure of 14 mmHg (IQR 10-33 mmHg), 21 mmHg (IQR 12-42 mmHg) and 19 mmHg (IQR 11-42 mmHg) were documented at three, six and twelve months, respectively.

Relevance: 100.00%

Abstract:

Abstract. Introduction: Professional voice use requires technique and voice-care measures to avoid damage. An inadequate body schema in the voice professional alters respiratory and vocal parameters, manifesting as dysphonia. Objective: To determine the prevalence and characteristics of dysphonia in 200 call-center operators in Bogotá, Colombia. Methods: Cross-sectional study of secondary data from a database of 200 operators of a call center in Bogotá, Colombia, who underwent breathing and voice assessments during 2003. The prevalence of dysphonia was estimated through relative frequency distributions. The study population was characterized by sociodemographic and occupational variables and by respiratory and vocal parameters, using statistical methods appropriate to the nature of these variables. The association between environmental factors, associated symptoms, vocal symptoms, Wilson's vocal profile and dysphonia was assessed with Pearson's chi-square test. Results: The prevalence of dysphonia was 73% (n = 146), with 34% showing moderate dysphonia. The vocal assessment parameters, analyzed individually (pitch, range, intensity), fell within the normal range and are related to the dysphonia prevalence results. In 95.5% of the operators, respiratory parameters were altered. Operators with dysphonia, compared with those without, more frequently reported the following environmental factors: noise (68% vs 50.9%, p = 0.03) and vapors (27.2% vs 11.3%, p = 0.02), and body and voice symptoms, respectively: neck pain (69.4% vs 54.7%, p = 0.05) and laryngeal pain (19.7% vs 7.5%, p = 0.04).
Conclusion: The prevalence of dysphonia found in this call center was high, which calls for preventive measures such as acoustic screening to follow up the most affected voice qualities, respiratory and vocal training, vocal pauses and voice-conservation measures, helping operators to better manage their vocal qualities according to how they use their voice and reducing the prevalence of dysphonia.
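As an illustration of the Pearson chi-square test used above, the sketch below tests the association between noise exposure and dysphonia using approximate counts reconstructed from the reported percentages (68% of 146 operators with dysphonia vs roughly 50% of 54 without); the exact study data are not available here, so treat the table as illustrative.

```python
import numpy as np
from scipy.stats import chi2_contingency

#                  exposed to noise   not exposed
table = np.array([[99, 47],    # with dysphonia (≈68% of 146)
                  [27, 27]])   # without dysphonia (≈50% of 54)

chi2, p, dof, expected = chi2_contingency(table)
```

With these reconstructed counts the p-value falls below 0.05, consistent with the p = 0.03 reported for noise in the abstract.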

Relevance: 100.00%

Abstract:

Introduction: Cancer is preventable in some cases if exposure to carcinogenic substances in the environment is avoided. In Colombia, Cundinamarca is one of the departments with the largest increases in the mortality rate, and in the municipality of Sibaté inhabitants have expressed concern about the increase of the disease. In global environmental health, georeferencing applied to the study of health phenomena has been used successfully, with valid results. This study proposed using geographic information tools to generate time and space analyses that would make the behavior of cancer in Sibaté visible and support hypotheses of environmental influences on case clusters. Objective: To obtain the incidence and prevalence of cancer cases among the inhabitants of Sibaté and to georeference the cases over a five-year period, based on a review of records. Methodology: Exploratory, descriptive, cross-sectional study of all cancer diagnoses between 2010 and 2014 found in the archives of the municipal Health Secretariat. Only permanent residents of the municipality diagnosed with cancer between 2010 and 2014 were included. For each case, gender, age, socioeconomic stratum, educational level, occupation and marital status were obtained. The date of diagnosis was used for the time analysis, and the home address, type of cancer and geographic coordinates for the spatial analysis. Geographic coordinates were generated with a Garmin GPS device, and maps were created with the locations of the patients' homes. The information was processed with Epi Info 7. Results: 107 cancer cases were registered at the Sibaté Health Secretariat: 66 women and 41 men. Without division by gender, 30.93% of the cases were cancers of the reproductive system, 18.56% of the digestive system and 17.53% of the integumentary system.
Two large spatial clusters were found in the studied territory: one in the Pablo Neruda neighborhood, with 12 (21.05%) cases, and one in the urban center of Sibaté, with 38 (66.67%) cases. Conclusion: It was confirmed that geographic analysis with spatio-temporal and exposure variables can be a tool for generating hypotheses about associations between cancer cases and environmental factors.