939 results for Simulation experiments


Relevance:

60.00%

Publisher:

Abstract:

Diabetes mellitus is a disease characterized by insufficient or absent production of insulin by the pancreas, or by reduced sensitivity to this hormone, which helps glucose reach the tissues and the nervous system to supply energy. Diabetes is more prevalent in developed countries due to multiple factors, including obesity, sedentary lifestyles and endocrine dysfunctions related to the pancreas. Type 1 Diabetes is a chronic, incurable disease in which the insulin-producing beta cells of the pancreas are destroyed, so exogenous insulin delivery is required to control blood glucose levels. The patient must follow a therapy of subcutaneously administered insulin, adjusted to his or her metabolic needs and lifestyle; this therapy tries to imitate the insulin profile of a healthy pancreas. Current technology makes it possible to address the development of the so-called “endocrine artificial pancreas” (EAP), which would provide accuracy, efficacy and safety in the application of insulin therapies and would give patients greater independence from their disease, which currently subjects them to constant decision making. The EAP consists of a continuous glucose sensor, an insulin infusion pump and a control algorithm that computes the insulin to be infused, using the patient's glucose levels as its main source of information. This work presents a modification of the closed-loop control method proposed in a previous project. The reference controller is composed of a Boolean basal controller and a postprandial fuzzy controller based on rules inherited from the basal controller. The postprandial controller administers 50% of the manual bolus (calculated from the amount of carbohydrates the patient is going to ingest) at the moment of the intake warning and distributes the remainder at later instants. The goal is to achieve optimal regulation of the glucose level during the postprandial period. To reduce the hyperglycemia that occurs in the postprandial period, an insulin transport is carried out: a feedforward of the basal insulin of the postprandial period, administered together with a variable percentage of the manual bolus. This percentage is linked to the patient's metabolic state prior to the intake. In addition, the knowledge base is modified to fit the controller's behaviour to the postprandial period. This project focuses on improving the previous postprandial fuzzy controller by modifying two aspects: the inference of the postprandial controller, and the addition of automatic decision making on the percentage of the manual bolus and on the transport. A fuzzy controller with a new inference, which does not inherit the characteristics of the basal controller, has been proposed and adapted to the postprandial period. A fuzzy inference has been added that modifies both the amount of insulin administered at the intake warning and the amount of basal insulin transported from the postprandial period to the manual bolus. The algorithm has been validated through simulation experiments on a population of ten synthetic patients from the UVA/Padova simulator, evaluating the results with statistical parameters and comparing them with those obtained with the previous control method.
After evaluating the results it can be concluded that the new postprandial controller, combined with the automatic decision making, achieves better glycemic control in the postprandial period, reducing the levels of hyperglycemia.
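A minimal sketch of the bolus-splitting and basal-transport idea described above may make it concrete. The function name, the glucose thresholds and the two-hour transport window are illustrative assumptions, not the controller's actual fuzzy inference; only the 50% baseline split and the transport-plus-variable-percentage scheme come from the abstract.

```python
# Hypothetical sketch of the bolus split and basal "transport"; thresholds and
# fractions are placeholders standing in for the fuzzy decision making.

def split_meal_bolus(manual_bolus_U, premeal_glucose_mgdl, basal_rate_Uh,
                     transport_window_h=2.0):
    """Return (insulin delivered at the intake warning, deferred remainder).

    manual_bolus_U: bolus computed from the announced carbohydrates.
    premeal_glucose_mgdl: pre-meal glucose, a proxy for the metabolic state.
    basal_rate_Uh: programmed basal rate over the postprandial period.
    """
    # Variable percentage of the manual bolus given up front: higher pre-meal
    # glucose -> larger up-front fraction (stand-in for the fuzzy rule base).
    if premeal_glucose_mgdl >= 180:
        upfront_fraction = 0.75
    elif premeal_glucose_mgdl >= 120:
        upfront_fraction = 0.50   # the 50% baseline split from the abstract
    else:
        upfront_fraction = 0.25

    # "Transport": bring forward part of the postprandial basal insulin and
    # deliver it together with the up-front portion of the bolus.
    transported_basal_U = basal_rate_Uh * transport_window_h

    upfront_U = upfront_fraction * manual_bolus_U + transported_basal_U
    deferred_U = (1.0 - upfront_fraction) * manual_bolus_U
    return upfront_U, deferred_U


upfront, deferred = split_meal_bolus(manual_bolus_U=6.0,
                                     premeal_glucose_mgdl=150,
                                     basal_rate_Uh=1.0)
print(f"deliver now: {upfront:.2f} U, defer: {deferred:.2f} U")
```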

Relevance:

60.00%

Publisher:

Abstract:

The relationship between the optimization of the potential function and the foldability of theoretical protein models is studied based on investigations of a 27-mer cubic-lattice protein model and a more realistic lattice model for the protein crambin. In both the simple and the more complicated systems, optimization of the energy parameters achieves significant improvements in the statistical-mechanical characteristics of the systems and leads to foldable protein models in simulation experiments. The foldability of the protein models is characterized by their statistical-mechanical properties, e.g., by the density of states and by Monte Carlo folding simulations of the models. With optimized energy parameters, a high level of consistency exists among different interactions in the native structures of the protein models, as revealed by a correlation function between the optimized energy parameters and the native structure of the model proteins. The results of this work are relevant to the design of a general potential function for folding proteins by theoretical simulations.
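For readers unfamiliar with lattice folding simulations, here is a minimal sketch of the two ingredients the abstract refers to: a contact energy summed over optimized pair parameters, and the Metropolis acceptance step of a Monte Carlo folding run. The HP-style parameter matrix and the square test chain are placeholders, not the paper's optimized parameters.

```python
import math
import random

def contact_energy(conformation, sequence, params):
    """Sum pair parameters over non-bonded nearest-neighbour lattice contacts."""
    E = 0.0
    n = len(conformation)
    for i in range(n):
        for j in range(i + 2, n):  # skip chain neighbours
            dx = [abs(a - b) for a, b in zip(conformation[i], conformation[j])]
            if sum(dx) == 1:       # unit Manhattan distance = lattice contact
                E += params[sequence[i]][sequence[j]]
    return E

def metropolis_accept(energy_old, energy_new, temperature):
    """Accept a trial conformation with the Metropolis criterion."""
    if energy_new <= energy_old:
        return True
    return random.random() < math.exp(-(energy_new - energy_old) / temperature)

# Example: a four-residue chain folded into a square, giving one H-H contact.
params = {"H": {"H": -1.0, "P": 0.0}, "P": {"H": 0.0, "P": 0.0}}
conf = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
print(contact_energy(conf, "HPPH", params))   # -1.0
```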

Relevance:

60.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance:

60.00%

Publisher:

Abstract:

In this paper we propose a new identification method, based on the residual white noise autoregressive criterion (Pukkila et al., 1990), to select the order of VARMA structures. Results from extensive simulation experiments based on different model structures, with varying numbers of observations and of component series, are used to demonstrate the performance of this new procedure. We also use economic and business data to compare the model structures selected by this order selection method with those identified in other published studies.
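The exact residual white noise autoregressive criterion of Pukkila et al. (1990) is not reproduced here, but a closely related residual-whiteness check conveys the idea: fit candidate VARMA(p, q) models and prefer the order whose residuals look most like white noise. The Ljung-Box test and the search ranges below are assumptions of this sketch.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.varmax import VARMAX
from statsmodels.stats.diagnostic import acorr_ljungbox

def select_varma_order(data: pd.DataFrame, max_p=2, max_q=2, lags=10):
    """Pick (p, q) whose worst component residual series is 'whitest'."""
    best = None
    for p in range(max_p + 1):
        for q in range(max_q + 1):
            if p == q == 0:
                continue
            res = VARMAX(data, order=(p, q)).fit(disp=False)
            resid = np.asarray(res.resid)
            # Smallest Ljung-Box p-value across components: larger values
            # mean no detectable residual autocorrelation anywhere.
            pvals = [acorr_ljungbox(resid[:, k], lags=[lags])["lb_pvalue"].iloc[0]
                     for k in range(resid.shape[1])]
            score = min(pvals)
            if best is None or score > best[0]:
                best = (score, (p, q))
    return best[1]
```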

Relevance:

60.00%

Publisher:

Abstract:

Subsequent to the influential paper of Chan, Karolyi, Longstaff and Sanders [Chan, K.C., Karolyi, G.A., Longstaff, F.A., Sanders, A.B., 1992. An empirical comparison of alternative models of the short-term interest rate. Journal of Finance 47, 1209-1227], the generalised method of moments (GMM) has been a popular technique for estimation and inference relating to continuous-time models of the short-term interest rate. GMM has been widely employed to estimate model parameters and to assess the goodness-of-fit of competing short-rate specifications. The current paper conducts a series of simulation experiments to document the bias and precision of GMM estimates of short-rate parameters, as well as the size and power of the J-test of over-identifying restrictions [Hansen, L.P., 1982. Large sample properties of generalised method of moments estimators. Econometrica 50, 1029-1054]. While the J-test appears to have appropriate size and good power in sample sizes commonly encountered in the short-rate literature, GMM estimates of the speed of mean reversion are shown to be severely biased. Consequently, it is dangerous to draw strong conclusions about the strength of mean reversion using GMM. In contrast, the parameter capturing the levels effect, which is important in differentiating between competing short-rate specifications, is estimated with little bias. © 2006 Elsevier B.V. All rights reserved.
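For concreteness, the moment conditions typically used in this GMM setting are those of the discretised CKLS model, r_{t+1} - r_t = alpha + beta*r_t + eps_{t+1} with Var(eps_{t+1}) = sigma^2 * r_t^(2*gamma) (a unit time step is assumed). The sketch below states the four standard moments and an identity-weighted objective; the weighting scheme and optimiser are left as assumptions.

```python
import numpy as np

def ckls_moments(theta, r):
    """Return the 4 x (T-1) matrix of CKLS moment contributions."""
    alpha, beta, sigma2, gamma = theta
    r_t, r_next = r[:-1], r[1:]
    eps = r_next - r_t - (alpha + beta * r_t)          # drift residual
    u = eps**2 - sigma2 * r_t**(2.0 * gamma)           # variance residual
    # E[eps]=0, E[eps*r]=0, E[u]=0, E[u*r]=0 exactly identify the 4 parameters.
    return np.vstack([eps, eps * r_t, u, u * r_t])

def gmm_objective(theta, r):
    """Identity-weighted GMM criterion g_bar' g_bar."""
    g_bar = ckls_moments(theta, r).mean(axis=1)
    return g_bar @ g_bar
```

In practice the objective would be minimised numerically (for instance with scipy.optimize.minimize) and the weighting matrix iterated to obtain the efficient two-step estimator.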

Relevance:

60.00%

Publisher:

Abstract:

In this paper, a novel approach is developed to evaluate the overall performance of a local area network and to monitor for possible intrusions. The data are obtained via the system utility 'ping', and the large volume of data collected is analyzed using statistical methods. Finally, an overall performance index is defined, and simulation experiments over three months demonstrated the effectiveness of the proposed performance index. A software package has been developed based on these ideas.
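A hypothetical sketch of the pipeline the abstract describes: collect round-trip times with the system 'ping' utility, summarise them statistically, and blend loss and latency into a single index. The Unix-style '-c' flag, the 200 ms latency budget and the equal weighting are assumptions; the paper's actual index definition is not given here.

```python
import re
import statistics
import subprocess

def sample_rtts(host: str, count: int = 10):
    """Run ping and extract the round-trip times (ms) from its output."""
    out = subprocess.run(["ping", "-c", str(count), host],
                         capture_output=True, text=True).stdout
    return [float(m) for m in re.findall(r"time=([\d.]+)", out)]

def performance_index(host: str, count: int = 10, rtt_budget_ms: float = 200.0):
    """Blend latency headroom and delivery rate into a single score in [0, 1]."""
    rtts = sample_rtts(host, count)
    loss = 1.0 - len(rtts) / count
    mean_rtt = statistics.mean(rtts) if rtts else rtt_budget_ms
    latency_score = max(0.0, 1.0 - mean_rtt / rtt_budget_ms)
    return 0.5 * latency_score + 0.5 * (1.0 - loss)
```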

Relevance:

60.00%

Publisher:

Abstract:

Computer models, or simulators, are widely used in a range of scientific fields to aid understanding of the processes involved and to make predictions. Such simulators are often computationally demanding and are thus not amenable to statistical analysis. Emulators provide a statistical approximation, or surrogate, for the simulator, accounting for the additional approximation uncertainty. This thesis develops a novel sequential screening method to reduce the set of simulator variables considered during emulation; this screening method is shown to require fewer simulator evaluations than existing approaches, and utilising the lower-dimensional active variable set simplifies subsequent emulation analysis. For random-output, or stochastic, simulators the output dispersion, and thus the variance, is typically a function of the inputs. This work extends the emulator framework to account for such heteroscedasticity by constructing two new heteroscedastic Gaussian process representations, and proposes an experimental design technique to optimally learn the model parameters. The design criterion is an extension of Fisher information to heteroscedastic variance models. Replicated observations are handled efficiently in both the design and model inference stages. Through a series of simulation experiments on both synthetic and real-world simulators, emulators inferred on optimal designs with replicated observations are shown to outperform equivalent models inferred on space-filling, replicate-free designs in terms of both model parameter uncertainty and predictive variance.
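One simple way to see why replicated observations help with stochastic simulators is the following sketch, which estimates the output variance at each replicated design point and feeds it to a Gaussian process as per-point noise. This is not the thesis's own heteroscedastic GP representation; the toy simulator and the scikit-learn model are assumptions used only for illustration.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def emulate_with_replicates(X_design, sim, n_rep=5):
    """X_design: (n, d) design points; sim: stochastic simulator f(x) -> float."""
    runs = np.array([[sim(x) for _ in range(n_rep)] for x in X_design])
    y_mean = runs.mean(axis=1)
    # Variance of each sample mean becomes input-dependent noise on the GP.
    noise = runs.var(axis=1, ddof=1) / n_rep
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=noise)
    return gp.fit(X_design, y_mean)

# Toy stochastic simulator whose noise grows with the input.
rng = np.random.default_rng(0)
sim = lambda x: np.sin(3 * x[0]) + rng.normal(scale=0.1 + 0.4 * x[0])
X = np.linspace(0, 1, 8).reshape(-1, 1)
emu = emulate_with_replicates(X, sim)
mu, sd = emu.predict(np.array([[0.5]]), return_std=True)
```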

Relevance:

60.00%

Publisher:

Abstract:

A current EPSRC project, Product Introduction Process: a Simulation in the Extended Enterprise (PIPSEE), is discussed. PIPSEE attempts to improve the execution of the product introduction process (PIP) within an extended enterprise in the aerospace sector. The modus operandi for accomplishing this has been to develop process understanding amongst a core team, spanning four different companies, through process modelling, review and improvement recommendations. In parallel, a web-based simulation capability is being used to conduct simulation experiments and to disseminate findings by training others in the lessons that have been learned. It is intended that the use of the PIPSEE simulator should encourage radical thinking about the ‘fuzzy front end’ of the PIP. This presents a topical, exciting and challenging research problem.

Relevance:

60.00%

Publisher:

Abstract:

Computerised production control developments have concentrated on Manufacturing Resources Planning (MRP II) systems. The literature suggests, however, that despite the massive investment in hardware, software and management education, successful implementation of such systems in manufacturing industries has proved difficult. This thesis reviews the development of production planning and control systems and, in particular, investigates the causes of failures in implementing MRP/MRP II systems in industrial environments, arguing that the centralised and top-down planning structure, as well as the routine operational methodology of such systems, is inherently prone to failure. The thesis reviews the control benefits of cellular manufacturing systems but concludes that in more dynamic manufacturing environments, techniques such as Kanban are inappropriate. The basic shortcomings of MRP II systems are highlighted and a new enhanced operational methodology based on distributed planning and control principles is introduced. Distributed Manufacturing Resources Planning (DMRP) was developed as a capacity-sensitive production planning and control solution for cellular manufacturing environments. The system utilises cell-based, independently operated MRP II systems, integrated into a plant-wide control system through a Local Area Network. The potential benefits of adopting the system in industrial environments are discussed, and the results of computer simulation experiments comparing the performance of the DMRP system against conventional MRP II systems are presented. The DMRP methodology is shown to offer significant potential advantages, which include ease of implementation, cost effectiveness, capacity sensitivity, shorter manufacturing lead times, lower work-in-progress levels and improved customer service.
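As a point of reference for what each cell-based MRP II system would compute locally in such a distributed arrangement, here is a sketch of the standard MRP netting calculation. The lot-for-lot ordering policy and the field names are assumptions of this sketch; DMRP's plant-wide coordination layer is not modelled.

```python
def mrp_net_requirements(gross_reqs, on_hand, scheduled_receipts):
    """Period-by-period netting: planned orders cover any shortfall (lot-for-lot)."""
    planned_orders = []
    inventory = on_hand
    for period, gross in enumerate(gross_reqs):
        inventory += scheduled_receipts.get(period, 0)  # receipts arrive first
        net = max(0, gross - inventory)                 # shortfall this period
        planned_orders.append(net)
        inventory = inventory + net - gross             # carry remainder forward
    return planned_orders

# Example: demand over six periods, 40 units on hand, one receipt in period 2.
print(mrp_net_requirements([0, 30, 25, 0, 50, 20], on_hand=40,
                           scheduled_receipts={2: 20}))   # [0, 0, 0, 0, 45, 20]
```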

Relevance:

60.00%

Publisher:

Abstract:

The compaction behaviour of powders with soft and hard components is of particular interest to the paint-processing industry. Unfortunately, at the present time, very little is known about the internal mechanisms within such systems, and suitable tests are therefore required to help in the interpretative process. The TRUBAL Distinct Element Method (DEM) program was the method of investigation used in this study. Steel (hard) and rubber (soft) particles were used in the randomly generated binary assemblies because they provide a sharp contrast in physical properties. For simplicity, isotropic compression of two-dimensional assemblies was considered initially. The assemblies were first subjected to quasi-static compaction in order to define their behaviour under equilibrium conditions. The stress-strain behaviour of the assemblies under such conditions was found to be adequately described by a second-order polynomial expansion. The structural evolution of the simulation assemblies was also similar to that observed for real powder systems. Further simulation tests were carried out to investigate the effects of particle size on the compaction behaviour of the two-dimensional binary assemblies. Later work focused on the quasi-static compaction behaviour of three-dimensional assemblies, because they represent more realistic particle systems. The compaction behaviour of the assemblies during the simulation experiments was considered in terms of percolation theory concepts, as well as more familiar macroscopic and microstructural parameters. Percolation theory, which is based on ideas from statistical physics, has been found to be useful in the interpretation of the mechanical behaviour of simple elastic lattices. The evidence of this study shows that percolation theory can also offer a useful insight into the compaction behaviour of more realistic particle assemblies.
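Since the abstract reports that the quasi-static stress-strain behaviour was adequately described by a second-order polynomial expansion, fitting and checking such a curve against simulation output is straightforward; the data values below are placeholders, not results from the study.

```python
import numpy as np

# Placeholder quasi-static compaction data: volumetric strain vs mean stress.
strain = np.array([0.00, 0.02, 0.04, 0.06, 0.08, 0.10])
stress = np.array([0.0, 1.1, 2.6, 4.5, 6.8, 9.6])   # MPa

coeffs = np.polyfit(strain, stress, deg=2)    # [a2, a1, a0] of the expansion
fitted = np.polyval(coeffs, strain)
# Coefficient of determination as a goodness-of-fit check.
r2 = 1 - np.sum((stress - fitted) ** 2) / np.sum((stress - stress.mean()) ** 2)
print(coeffs, r2)
```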

Relevance:

60.00%

Publisher:

Abstract:

This study presents some quantitative evidence from a number of simulation experiments on the accuracy of the productivity growth estimates derived from growth accounting (GA) and frontier-based methods (namely data envelopment analysis-, corrected ordinary least squares-, and stochastic frontier analysis-based Malmquist indices) under various conditions. These include the presence of technical inefficiency, measurement error, misspecification of the production function (for the GA and parametric approaches) and increased input and price volatility from one period to the next. The study finds that the frontier-based methods usually outperform GA, but the overall performance varies by experiment. Parametric approaches generally perform best when there is no functional form misspecification, but their accuracy diminishes greatly otherwise. The results also show that the deterministic approaches perform adequately even under conditions of (modest) measurement error; when measurement error becomes larger, the accuracy of all approaches (including the stochastic approaches) deteriorates rapidly, to the point that their estimates could be considered unreliable for policy purposes.
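The growth accounting baseline in these comparisons is the Solow residual: output growth minus share-weighted input growth. A two-input version with a constant capital share (an assumption of this sketch) looks as follows.

```python
def solow_residual(dlog_y, dlog_k, dlog_l, capital_share=0.3):
    """TFP growth = output growth - share-weighted input growth (log changes)."""
    return dlog_y - capital_share * dlog_k - (1 - capital_share) * dlog_l

# Example: 3% output growth, 4% capital growth, 1% labour growth.
print(solow_residual(0.03, 0.04, 0.01))   # 0.011 -> 1.1% TFP growth
```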

Relevance:

60.00%

Publisher:

Abstract:

Optimal design for parameter estimation in Gaussian process regression models with input-dependent noise is examined. The motivation stems from the area of computer experiments, where computationally demanding simulators are approximated using Gaussian process emulators to act as statistical surrogates. In the case of stochastic simulators, which produce a random output for a given set of model inputs, repeated evaluations are useful, supporting the use of replicate observations in the experimental design. The findings are also applicable to the wider context of experimental design for Gaussian process regression and kriging. Designs are proposed with the aim of minimising the variance of the Gaussian process parameter estimates. A heteroscedastic Gaussian process model is presented which allows for an experimental design technique based on an extension of Fisher information to heteroscedastic models. It is empirically shown that the error of the approximation of the parameter variance by the inverse of the Fisher information is reduced as the number of replicated points is increased. Through a series of simulation experiments on both synthetic data and a systems biology stochastic simulator, optimal designs with replicate observations are shown to outperform space-filling designs both with and without replicate observations. Guidance is provided on best practice for optimal experimental design for stochastic response models. © 2013 Elsevier Inc. All rights reserved.
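To make the design criterion concrete, consider the simplest heteroscedastic case: y_i ~ N(0, sigma_i^2) with log sigma_i^2 = theta_0 + theta_1 * x_i. The Fisher information for theta is then (1/2) * sum_i g_i g_i' with g_i = (1, x_i)', and a D-optimal design maximises its determinant. The log-linear variance model and the two candidate designs below are assumptions used only to illustrate why concentrating runs at a few sites can beat an even spread.

```python
import numpy as np

def fisher_information(x):
    """FI of (theta0, theta1) for a log-linear heteroscedastic Gaussian."""
    G = np.column_stack([np.ones_like(x), x])   # rows g_i = (1, x_i)
    return 0.5 * G.T @ G

def d_optimality(x):
    return np.linalg.det(fisher_information(x))

# Replicating the two extreme design points beats an even spread here:
spread = np.linspace(0, 1, 8)
replicated = np.array([0, 0, 0, 0, 1, 1, 1, 1], dtype=float)
print(d_optimality(spread), d_optimality(replicated))   # ~1.71 vs 4.0
```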

Relevance:

60.00%

Publisher:

Abstract:

In this paper, a methodology is presented for evaluating the information security of objects under attack that have been processed by compression methods. Two basic parameters for evaluating the information security of objects, TIME and SIZE, are chosen, and the characteristics that bear on their evaluation are analyzed and estimated. A coefficient of information security of an object is proposed as the mean of the coefficients for the parameters TIME and SIZE. From the simulation experiments carried out, the methods with the highest coefficient of information security were determined. Assessments and conclusions for future investigations are proposed.
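A hypothetical reading of the TIME/SIZE scheme in code: compress an object with several standard methods, score each method on compression time and compressed size, and average the two scores into a single coefficient, as the abstract describes. The min-ratio normalisation is an assumption; the paper does not fix one here.

```python
import bz2
import lzma
import time
import zlib

def coefficients(data: bytes):
    """Score each compression method by the mean of TIME and SIZE coefficients."""
    methods = {"zlib": zlib.compress, "bz2": bz2.compress, "lzma": lzma.compress}
    times, sizes = {}, {}
    for name, fn in methods.items():
        t0 = time.perf_counter()
        out = fn(data)
        times[name] = time.perf_counter() - t0
        sizes[name] = len(out)
    t_best, s_best = min(times.values()), min(sizes.values())
    # Each per-parameter coefficient is 1.0 for the best method, below 1.0
    # otherwise; the overall coefficient is their mean.
    return {name: 0.5 * (t_best / times[name]) + 0.5 * (s_best / sizes[name])
            for name in methods}

print(coefficients(b"example payload " * 1024))
```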

Relevance:

60.00%

Publisher:

Abstract:

Intrusion detection is a critical component of security information systems. The intrusion detection process attempts to detect malicious attacks by examining various data collected during processes on the protected system. This paper examines anomaly-based intrusion detection based on sequences of system calls. The point is to construct a model that describes normal or acceptable system activity using the classification trees approach. The created database is then utilized as a basis for distinguishing intrusive activity from legitimate activity using string metric algorithms. The major results of the simulation experiments are also presented and discussed.
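A minimal sketch in the spirit of the approach: build a database of normal fixed-length system-call windows, then score a new trace by the edit distance from each of its windows to the nearest normal window. The window length, threshold-free mean score and toy traces are assumptions; the paper's classification-tree model is omitted here.

```python
def windows(trace, k=6):
    """Set of fixed-length sliding windows over a system-call trace."""
    return {tuple(trace[i:i + k]) for i in range(len(trace) - k + 1)}

def edit_distance(a, b):
    """Classic single-row dynamic-programming Levenshtein distance."""
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1,
                                     prev + (ca != cb))
    return dp[-1]

def anomaly_score(trace, normal_db, k=6):
    """Mean distance from each window of the trace to its nearest normal window."""
    scores = [min(edit_distance(w, n) for n in normal_db)
              for w in windows(trace, k)]
    return sum(scores) / len(scores)

normal_db = windows(["open", "read", "mmap", "read", "close", "open",
                     "read", "close", "open", "mmap", "read", "close"])
print(anomaly_score(["open", "read", "exec", "socket", "connect", "write",
                     "close"], normal_db))
```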

Relevance:

60.00%

Publisher:

Abstract:

The article attempts to answer the question of whether the latest bankruptcy prediction techniques are more reliable than traditional mathematical–statistical ones in Hungary. Simulation experiments carried out on the database of the first Hungarian bankruptcy prediction model clearly show that bankruptcy models built using artificial neural networks have higher classification accuracy than models created in the 1990s based on discriminant analysis and logistic regression analysis. The article presents the main results, analyses the reasons for the differences and offers constructive proposals concerning the further development of Hungarian bankruptcy prediction.
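The kind of comparison reported can be sketched in a few lines: fit a neural network and a logistic regression to the same firm-level data and compare test-set classification accuracy. The synthetic dataset and network size below are placeholders, not the Hungarian bankruptcy database.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Placeholder data standing in for firm-level financial ratios and failure labels.
X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

logit = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
nn = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                   random_state=0).fit(X_tr, y_tr)

print("logistic regression accuracy:", logit.score(X_te, y_te))
print("neural network accuracy:    ", nn.score(X_te, y_te))
```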