914 results for DYNAMIC-ANALYSIS
Abstract:
This thesis describes the design and implementation of a new dynamic simulator called DASP, a computer program package written in standard Fortran 77 for the dynamic analysis and simulation of chemical plants. Its main uses include the investigation of a plant's response to disturbances, the determination of the optimal ranges and sensitivities of controller settings, and the simulation of the startup and shutdown of chemical plants. The design and structure of the program and a number of features incorporated into it combine to make DASP an effective tool for dynamic simulation. It is an equation-oriented dynamic simulator, but the model equations describing the user's problem are generated from an in-built model equation library. A combination of the structuring of the model subroutines, the concept of a unit module, and the use of the connection matrix of the problem given by the user has been exploited to achieve this objective. The Executive program has a structure similar to that of a CSSL-type simulator. DASP solves a system of differential equations coupled to nonlinear algebraic equations using an advanced mixed equation solver. The strategy used in formulating the model equations makes it possible to obtain the steady-state solution of the problem using the same model equations. DASP can handle state and time events in an efficient way, including modification of the flowsheet. DASP is highly portable, as has been demonstrated by running it on a number of computers with only trivial modifications; the program runs on a microcomputer with 640 kByte of memory. It is a semi-interactive program: the bulk of the input data is given in pre-prepared data files, while communication with the user is via an interactive terminal. Using the features built into the package, the user can view or modify the values of any input data, variables and parameters in the model, and modify the structure of the flowsheet of the problem during a simulation session. The program has been demonstrated and verified using a number of example problems.
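As an illustration of the mixed differential-algebraic solution strategy described above, the sketch below takes backward-Euler steps on a tiny semi-explicit DAE system in Python. It is purely illustrative: DASP itself is written in Fortran 77, and the model functions here are hypothetical, not drawn from its equation library.

```python
# Illustrative sketch (not DASP code): a semi-explicit DAE system
#   dx/dt = f(x, z, t)   (differential equations)
#   0     = g(x, z, t)   (nonlinear algebraic equations)
# advanced with an implicit step, mirroring a "mixed equation solver".
import numpy as np
from scipy.optimize import fsolve

def f(x, z, t):
    # hypothetical differential part: a tank drained through a valve
    return np.array([-0.5 * x[0] + z[0]])

def g(x, z, t):
    # hypothetical algebraic constraint coupling the flow z to the level x
    return np.array([z[0] - 0.2 * np.sqrt(max(x[0], 0.0))])

def backward_euler_step(x, z, t, h):
    """One implicit step: solve differential and algebraic residuals together."""
    n = len(x)
    def residual(y):
        xn, zn = y[:n], y[n:]
        return np.concatenate([xn - x - h * f(xn, zn, t + h),
                               g(xn, zn, t + h)])
    y = fsolve(residual, np.concatenate([x, z]))
    return y[:n], y[n:]

# Integrating to large time approaches the steady state of the very same
# model equations, mirroring DASP's reuse of one formulation for both modes.
x, z = np.array([1.0]), np.array([0.2])
for step in range(100):
    x, z = backward_euler_step(x, z, step * 0.1, 0.1)
print(x, z)
```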
Abstract:
This research focuses on the design and verification of inter-organizational controls. Instead of looking at a documentary procedure, which is the flow of documents and data among the parties, the research examines the underlying deontic purpose of the procedure, the so-called deontic process, and identifies control requirements to secure this purpose. The vision of the research is a formal theory for streamlining bureaucracy in business and government procedures. Underpinning most inter-organizational procedures are deontic relations, which concern the rights and obligations of the parties. When all parties trust each other, they are willing to fulfill their obligations and honor the counterparties' rights; thus controls may not be needed. The challenge is in cases where trust may not be assumed. In these cases, the parties need to rely on explicit controls to reduce their exposure to the risk of opportunism. However, at present there is no analytic approach or technique to determine which controls are needed for a given contracting or governance situation. The research proposes a formal method for deriving inter-organizational control requirements based on static analysis of deontic relations and dynamic analysis of deontic changes. The formal method takes a deontic process model of an inter-organizational transaction and certain domain knowledge as inputs to automatically generate control requirements that a documentary procedure needs to satisfy in order to limit fraud potentials. The deliverables of the research include a formal representation, namely Deontic Petri Nets, which combines multiple modal logics and Petri nets for modeling deontic processes; a set of control principles that represent an initial formal theory on the relationships between deontic processes and documentary procedures; and a working prototype that uses a model-checking technique to identify fraud potentials in a deontic process and generate control requirements to limit them. Fourteen scenarios of two well-known international payment procedures -- cash in advance and documentary credit -- have been used to test the prototype. The results showed that all control requirements stipulated in these procedures could be derived automatically.
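The abstract does not give the formalism in detail, but the flavor of checking a deontic process for fraud potentials can be sketched with an ordinary place/transition Petri net and brute-force reachability in Python. Everything here (place names, the two transitions, the fraud condition) is a made-up toy, not the Deontic Petri Net formalism itself, which combines modal logics with Petri nets.

```python
# Toy net: places O (buyer obliged to pay), R (seller's right to perform),
# P (paid), G (goods delivered); transitions are deontic/documentary events.
transitions = {
    "pay":     ({"O": 1}, {"P": 1}),   # discharges the payment obligation
    "deliver": ({"R": 1}, {"G": 1}),   # seller ships the goods
}

def fire(marking, pre, post):
    """Fire a transition if enabled; return the new marking or None."""
    if all(marking.get(p, 0) >= n for p, n in pre.items()):
        m = dict(marking)
        for p, n in pre.items():
            m[p] -= n
        for p, n in post.items():
            m[p] = m.get(p, 0) + n
        return m
    return None

def reachable(m0):
    """Exhaustive reachability search: the toy stand-in for model checking."""
    seen, frontier = set(), [tuple(sorted(m0.items()))]
    while frontier:
        key = frontier.pop()
        if key in seen:
            continue
        seen.add(key)
        for pre, post in transitions.values():
            m2 = fire(dict(key), pre, post)
            if m2 is not None:
                frontier.append(tuple(sorted(m2.items())))
    return [dict(k) for k in seen]

# Fraud potential: a state where goods were delivered but payment never made.
for m in reachable({"O": 1, "R": 1}):
    if m.get("G", 0) > 0 and m.get("P", 0) == 0:
        print("fraud potential:", m)
```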
Abstract:
The maintenance and evolution of software systems has become a critical task over recent years due to the diversity and high demand of features, devices and users. Understanding and analyzing how new changes impact the quality attributes of the architecture of such systems is an essential prerequisite for avoiding the deterioration of their quality during their evolution. This thesis proposes an automated approach for the variation analysis of the performance quality attribute in terms of execution time (response time). It is implemented by a framework that adopts dynamic analysis and software repository mining techniques to provide an automated way of revealing potential sources (commits and issues) of performance variation in scenarios during software system evolution. The approach defines four phases: (i) preparation: choosing the scenarios and preparing the target releases; (ii) dynamic analysis: determining the performance of scenarios and methods by computing their execution times; (iii) variation analysis: processing and comparing the dynamic analysis results for different releases; and (iv) repository mining: identifying issues and commits associated with the detected performance variation. Empirical studies were performed to evaluate the approach from different perspectives. An exploratory study analyzed the feasibility of applying the approach to systems from different domains to automatically identify source code elements with performance variation and the changes that affected such elements during an evolution. This study analyzed three systems: (i) SIGAA, a web system for academic management; (ii) ArgoUML, a UML modeling tool; and (iii) Netty, a framework for network applications. Another study performed an evolutionary analysis by applying the approach to multiple releases of Netty and of the web frameworks Wicket and Jetty. In this study, 21 releases (seven of each system) were analyzed, totaling 57 scenarios. In summary, 14 scenarios with significant performance variation were found for Netty, 13 for Wicket, and 9 for Jetty. Additionally, feedback was obtained from eight developers of these systems through an online questionnaire. Finally, in the last study, a performance regression model was developed to indicate which properties of commits are most likely to cause performance degradation. In total, 997 commits were mined, of which 103 were retrieved from degraded source code elements and 19 from optimized ones, while 875 had no impact on execution time. The number of days before the release becomes available and the day of the week proved to be the most relevant variables of performance-degrading commits in our model. The area under the Receiver Operating Characteristic (ROC) curve of the regression model is 60%, which means that using the model to decide whether a commit will cause degradation is 10% better than a random decision.
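To make the dynamic-analysis and variation-analysis phases concrete, here is a minimal Python sketch that compares one scenario's execution times across two releases and flags a significant variation. The Mann-Whitney test and the 0.05 threshold are illustrative choices, not necessarily the statistical machinery used in the thesis.

```python
# Sketch of the variation-analysis phase: given execution times of the same
# scenario measured on two releases, flag significant performance variation.
from statistics import mean
from scipy.stats import mannwhitneyu

def performance_variation(times_old, times_new, alpha=0.05):
    stat, p = mannwhitneyu(times_old, times_new)
    delta = mean(times_new) - mean(times_old)
    return {"significant": p < alpha,
            "p_value": p,
            "delta_ms": delta,
            "verdict": "degraded" if delta > 0 else "optimized"}

# hypothetical execution times (ms) of one scenario on releases N and N+1
old = [120, 118, 125, 122, 119, 121, 123, 120]
new = [140, 138, 145, 142, 139, 141, 143, 140]
print(performance_variation(old, new))
```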
Abstract:
Combining the advantages of parallel mechanisms and compliant mechanisms, a compliant parallel mechanism with two rotational DOFs (degrees of freedom) is designed to meet the requirements of a lightweight and compact pan-tilt platform. First, two commonly used design methods, direct substitution and FACT (Freedom and Constraint Topology), are applied to design the configuration of the pan-tilt system, and the similarities and differences of the two design alternatives are compared. Then, inverse kinematic analysis of the candidate mechanism is carried out using the pseudo-rigid-body model (PRBM), and the Jacobian related to its differential kinematics is derived to enable dynamic analysis of the 8R compliant mechanism. In addition, the mechanism's maximum stress within its workspace is evaluated by finite element analysis. Finally, a method to determine the joint damping of the flexure hinges is presented, aimed at exploring the effect of joint damping on actuator selection and real-time control. To the authors' knowledge, almost no existing literature addresses this issue.
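As a concrete illustration of the role the Jacobian plays here, the following Python sketch differentiates a generic pan-tilt orientation map by central finite differences. The kinematic map is a stand-in of our own, not the paper's PRBM of the 8R mechanism, where the Jacobian is derived analytically.

```python
# Finite-difference Jacobian of a generic 2-DOF pan-tilt pointing map:
# it relates joint rates (pan_dot, tilt_dot) to tip-direction rates.
import numpy as np

def pan_tilt_direction(q):
    pan, tilt = q
    # pointing direction of the platform for pan/tilt angles (rad)
    return np.array([np.cos(tilt) * np.cos(pan),
                     np.cos(tilt) * np.sin(pan),
                     np.sin(tilt)])

def numerical_jacobian(f, q, h=1e-6):
    q = np.asarray(q, dtype=float)
    J = np.zeros((len(f(q)), len(q)))
    for j in range(len(q)):
        dq = np.zeros_like(q)
        dq[j] = h
        J[:, j] = (f(q + dq) - f(q - dq)) / (2 * h)  # central difference
    return J

print(numerical_jacobian(pan_tilt_direction, [0.3, 0.2]))
```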
Abstract:
Modern software applications are becoming more dependent on database management systems (DBMSs). DBMSs are usually used as black boxes by software developers. For example, Object-Relational Mapping (ORM) is one of the most popular database abstraction approaches that developers use nowadays. Using ORM, objects in Object-Oriented languages are mapped to records in the database, and object manipulations are automatically translated to SQL queries. As a result of such conceptual abstraction, developers do not need deep knowledge of databases; however, all too often this abstraction leads to inefficient and incorrect database access code. Thus, this thesis proposes a series of approaches to improve the performance of database-centric software applications that are implemented using ORM. Our approaches focus on troubleshooting and detecting inefficient (i.e., performance problems) database accesses in the source code, and we rank the detected problems based on their severity. We first conduct an empirical study on the maintenance of ORM code in both open source and industrial applications. We find that ORM performance-related configurations are rarely tuned in practice, and there is a need for tools that can help improve/tune the performance of ORM-based applications. Thus, we propose approaches along two dimensions to help developers improve the performance of ORM-based applications: 1) helping developers write more performant ORM code; and 2) helping developers tune ORM configurations. To provide tooling support to developers, we first propose static analysis approaches to detect performance anti-patterns in the source code. We automatically rank the detected anti-pattern instances according to their performance impacts. Our study finds that by resolving the detected anti-patterns, the application performance can be improved by 34% on average. We then discuss our experience and lessons learned when integrating our anti-pattern detection tool into industrial practice. We hope our experience can help improve the industrial adoption of future research tools. However, as static analysis approaches are prone to false positives and lack runtime information, we also propose dynamic analysis approaches to further help developers improve the performance of their database access code. We propose automated approaches to detect redundant data access anti-patterns in the database access code, and our study finds that resolving such redundant data access anti-patterns can improve application performance by an average of 17%. Finally, we propose an automated approach to tune performance-related ORM configurations using both static and dynamic analysis. Our study shows that our approach can help improve application throughput by 27--138%. Through our case studies on real-world applications, we show that all of our proposed approaches can provide valuable support to developers and help improve application performance significantly.
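A concrete example of the kind of database-access anti-pattern such detectors target is the well-known N+1 query: one query fetches the parent records, and each association access then triggers a separate lazy load. The SQLAlchemy sketch below is our illustration, not code from the thesis (which studied ORM frameworks generally):

```python
# N+1 query anti-pattern and its eager-loading resolution (SQLAlchemy 1.4+).
from sqlalchemy import Column, Integer, String, Float, ForeignKey
from sqlalchemy.orm import declarative_base, relationship, joinedload

Base = declarative_base()

class Customer(Base):
    __tablename__ = "customers"
    id = Column(Integer, primary_key=True)
    name = Column(String)

class Order(Base):
    __tablename__ = "orders"
    id = Column(Integer, primary_key=True)
    amount = Column(Float)
    customer_id = Column(Integer, ForeignKey("customers.id"))
    customer = relationship(Customer)

# Anti-pattern: one query for the orders, then one lazy load per order.
def totals_slow(session):
    totals = {}
    for o in session.query(Order).all():          # 1 query
        name = o.customer.name                    # +1 query per order
        totals[name] = totals.get(name, 0) + o.amount
    return totals

# Resolution: eagerly fetch the association in the same round trip.
def totals_fast(session):
    orders = session.query(Order).options(joinedload(Order.customer)).all()
    totals = {}
    for o in orders:                              # 1 query total
        totals[o.customer.name] = totals.get(o.customer.name, 0) + o.amount
    return totals
```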
Abstract:
This paper empirically investigates volatility transmission among stock and foreign exchange markets in seven major world economies during the period July 1988 to January 2015. To this end, we first perform a static and dynamic analysis to measure the total volatility connectedness in the entire period (the system-wide approach) using a framework recently proposed by Diebold and Yilmaz (2014). Second, we make use of a dynamic analysis to evaluate the net directional connectedness for each market. To gain further insights, we examine the time-varying behaviour of net pair-wise directional connectedness during the financial turmoil periods experienced in the sample period. Our results suggest that slightly more than half of the total variance of the forecast errors is explained by shocks across markets rather than by idiosyncratic shocks. Furthermore, we find that volatility connectedness varies over time, with a surge during periods of increasing economic and financial instability.
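For readers unfamiliar with the Diebold and Yilmaz (2014) framework, the connectedness measures reduce to simple sums over a row-normalized generalized forecast-error variance decomposition matrix. The Python sketch below uses a made-up 3-market matrix; in the paper the matrix comes from a VAR fitted to the volatility series.

```python
# Diebold-Yilmaz connectedness from a row-normalized variance decomposition:
# entry D[i, j] is the share of market i's forecast-error variance due to
# shocks in market j. The matrix here is illustrative only.
import numpy as np

D = np.array([[0.60, 0.25, 0.15],
              [0.20, 0.55, 0.25],
              [0.10, 0.30, 0.60]])

N = D.shape[0]
off_diag = D - np.diag(np.diag(D))

total = 100 * off_diag.sum() / N          # system-wide total connectedness
to_others = 100 * off_diag.sum(axis=0)    # directional: "to" others
from_others = 100 * off_diag.sum(axis=1)  # directional: "from" others
net = to_others - from_others             # net directional connectedness

print(f"total connectedness: {total:.1f}%")
print("net directional:", np.round(net, 1))
```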
Abstract:
Thesis (Ph.D., Computing) -- Queen's University, 2016-09-30.
Abstract:
Numerous types of acute respiratory failure are routinely treated using non-invasive ventilatory support (NIV). Its efficacy is well documented: NIV lowers intubation and death rates in various respiratory disorders. It can be delivered by means of face masks or head helmets. Currently, the scientific community's interest in NIV helmets is mostly focused on optimising the mixing between CO2 and clean air and on improving patient comfort. To this end, fluid dynamic analysis plays a particularly important role and a two-pronged approach is frequently employed. While numerical simulations provide information about the entire flow field and different geometries, they require huge temporal and computational resources. Experiments, on the other hand, help to validate simulations and provide results with a much smaller time investment, and thus remain at the core of research in fluid dynamics. The aim of this thesis work was to develop a flow bench and to utilise it for the analysis of NIV helmets. A flow test bench and an instrumented mannequin were successfully designed, produced and put into use. Experiments were performed to characterise the helmet interface in terms of pressure drop and flow rate drop over different inlet flow rates and outlet pressure set points. Velocity measurements by means of Particle Image Velocimetry (PIV) were performed. Pressure drop and flow rate characteristics from experiments were compared with CFD data, and sufficient agreement was observed between the numerical and experimental results. PIV studies permitted qualitative and quantitative comparisons with numerical simulation data and offered a clear picture of the internal flow behaviour, aiding the identification of coherent flow features.
Abstract:
A base-cutter, represented by a four-bar linkage, was developed using the Autocad program. The normal reaction force of the profile at the contact point was determined through dynamic analysis. The dynamic equilibrium equations were based on the Newton-Euler laws. The linkage was subjected to an optimization technique that considered the peak value of the soil reaction force as the objective function to be minimized, while the link lengths and the spring constant varied through a specified range. The Sequential Quadratic Programming (SQP) algorithm was implemented in the Matlab computational environment. Results were very encouraging: the maximum value of the normal reaction force was reduced from 4,250.33 to 237.13 N, making the floating process much less disturbing to the soil and the sugarcane ratoons. Later, other variables were incorporated into the optimized mechanism and a new optimization process was carried out.
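For illustration, the optimization set-up can be reproduced in Python with SciPy's SLSQP solver (an SQP implementation, standing in for the Matlab routine used in the paper). The objective below is a made-up surrogate; in the paper the peak soil reaction force comes from the Newton-Euler dynamic analysis of the linkage.

```python
# Minimize the peak soil reaction force over two link lengths and the
# spring constant, within specified ranges, using an SQP method (SLSQP).
import numpy as np
from scipy.optimize import minimize

def peak_soil_reaction(x):
    l2, l3, k = x  # link lengths (m) and spring constant (N/m)
    # illustrative surrogate for the peak normal reaction force (N)
    return (4250.33 * np.exp(-k / 5000.0)
            + 100.0 * (l2 - 0.3) ** 2
            + 80.0 * (l3 - 0.5) ** 2
            + 200.0)

bounds = [(0.1, 0.6), (0.2, 0.9), (1000.0, 20000.0)]
res = minimize(peak_soil_reaction, x0=[0.3, 0.5, 2000.0],
               method="SLSQP", bounds=bounds)
print(res.x, res.fun)  # optimized design variables and peak force
```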
Abstract:
The spouted and fluidized bed technologies are usually employed in operations of drying, coating and granulation of particles by the chemical and pharmaceutical industries. The use of these techniques in agronomy is limited to the treatment of some species of seeds. In this work, the objective was to analyse the fluid dynamics of fluidized and spouted beds when broccoli (Brassica oleracea L. var. Italica) seeds are used, and also to verify the influence on seed germination after 60 min of seed exposure to spouting or fluidization at room temperature. The fluid dynamics was characterized by measurements of the bed pressure drop as a function of the air flow rate for different seed loads. The experimental conditions were based on the physical properties of the seeds and were limited by the apparatus dimensions. The cone-cylindrical bed was constructed in plexiglass to permit flow visualization. The values of the parameters maximum pressure drop, minimum spouting flow rate and pressure drop, and stable spout pressure drop were experimentally obtained from the fluid-dynamic analysis and were compared with the values calculated by empirical equations found in the literature. The same procedure was carried out with the fluidized bed, for which the important parameters were the air velocity and the bed pressure drop at minimum fluidization. The analysis of seed germination indicated that no damage was caused to the seeds by the spouting or fluidization processes.
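One of the empirical equations commonly used in such comparisons is the Wen and Yu (1966) correlation for minimum fluidization velocity. The Python sketch below evaluates it for a hypothetical seed; the particle properties are placeholders, not the broccoli-seed data from this work.

```python
# Wen & Yu (1966) correlation: Re_mf = sqrt(33.7^2 + 0.0408 * Ar) - 33.7,
# with Ar the Archimedes number; u_mf follows from the Reynolds number.
import math

def u_mf_wen_yu(d_p, rho_p, rho_g=1.2, mu=1.8e-5, g=9.81):
    """Minimum fluidization velocity (m/s) for particle diameter d_p (m)
    and particle density rho_p (kg/m3), with air at ambient conditions."""
    ar = d_p**3 * rho_g * (rho_p - rho_g) * g / mu**2   # Archimedes number
    re_mf = math.sqrt(33.7**2 + 0.0408 * ar) - 33.7     # Reynolds at u_mf
    return re_mf * mu / (rho_g * d_p)

# hypothetical seed: 1.5 mm diameter, 1100 kg/m3 density
print(f"u_mf = {u_mf_wen_yu(1.5e-3, 1100.0):.3f} m/s")
```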
Abstract:
This work presents an analysis of the synoptic and dynamic conditions associated with the development of the cyclone that occurred between 12 and 19 September 2008, with the objective of highlighting differences and similarities with the environment in which the Catarina event of March 2004 was embedded. The main similarities were found in the general synoptic pattern: the occurrence of a typical dipole-type blocking pattern associated with an upper-level potential vorticity anomaly; a mid-level trough tilted westward; the presence of a column of cyclonic vorticity extending from the surface up to the lower stratosphere; and, at the surface, the pattern of a high to the south of a low pressure system. Despite the similarities in the general pattern, differences occurred between the two events that influenced the intensity of the systems: Catarina occurred at lower latitudes than the September 2008 case; the typical blocking pattern associated with the September 2008 case lasted one and a half days, whereas in the Catarina event it lasted three days; the configuration of the temperature advection in the 1000-500 hPa layer favored the displacement of the September 2008 event to the east/southeast, whereas for Catarina the warm air advection east of the cyclone was practically suppressed and the geopotential height tendency became positive, patterns that prevent the eastward displacement of the system; in the September 2008 case, the reversal of the meridional potential temperature gradient on the -2.0 potential vorticity unit (PVU) surface was characterized by the incursion of an elongated region of warm air moving from the equator toward the south and of cold air moving from the south toward the equator, whereas in the Catarina case the reversal occurred through the isolation of a pocket of cold air to the north and a pocket of warm air to the south, which may have contributed to the longer duration of the blocking pattern, since dissipation in this case is hindered. Systems such as Catarina may be rare in the South Atlantic, but the same is not true of the synoptic environment in which Catarina formed. To better understand the atmospheric process that led to the formation of Catarina, it is necessary to perform numerical sensitivity experiments for the September 2008 case in order to verify the possibility of the extratropical cyclone becoming a tropical cyclone.
Abstract:
Fluid dynamic analysis is an important branch of several chemical engineering related areas, such as drying processes and chemical reactors. However, aspects concerning fluid dynamics in wastewater treatment bioreactors still require further investigation, as they highly influence process efficiency. Therefore, it is essential to evaluate the influence of biofilm on the reactor fluid dynamic behavior, through the analysis of a few important parameters, such as minimum fluidization velocity, bed expansion and porosity, and particle terminal velocity. The main objective of the present work was to investigate the fluid dynamics of an anaerobic fluidized bed reactor, having activated carbon particles as support media for biomass immobilization. Reactor performance was tested using synthetic residual water, which was prepared using the solution employed in BOD determination. The results showed that the presence of immobilized biomass increased particle density and altered the main fluid dynamic parameters investigated.
Abstract:
Mitochondrial membrane carriers containing proline and cysteine, such as adenine nucleotide translocase (ANT), are potential targets of cyclophilin D (CyP-D) and potential Ca(2+)-induced permeability transition pore (PTP) components or regulators; CyP-D, a mitochondrial peptidyl-prolyl cis-trans isomerase, is the probable target of the PTP inhibitor cyclosporine A (CsA). In the present study, the impact of proline isomerization (from trans to cis) on mitochondrial membrane carriers containing proline and cysteine was addressed using ANT as a model. For this purpose, two different approaches were used: (i) molecular dynamics (MD) analysis of ANT-Cys(56) relative mobility and (ii) light scattering techniques employing isolated rat liver mitochondria to assess both the Ca(2+)-induced ANT conformational change and mitochondrial swelling. ANT-Pro(61) isomerization increased ANT-Cys(56) relative mobility and, moreover, desensitized ANT to the prevention of this effect by ADP. In addition, Ca(2+) induced the ANT "c" conformation and opened the PTP; while the first effect was fully inhibited, the second was only attenuated by CsA or ADP. Atractyloside (ATR), in turn, stabilized the Ca(2+)-induced ANT "c" conformation, rendering the ANT conformational change and PTP opening less sensitive to inhibition by CsA or ADP. These results suggest that Ca(2+) induces the ANT "c" conformation, apparently associated with PTP opening, but requires the CyP-D peptidyl-prolyl cis-trans isomerase activity to sustain both effects.