904 results for "Thermo dynamic analysis"
Abstract:
This research focuses on the design and verification of inter-organizational controls. Instead of looking at a documentary procedure, which is the flow of documents and data among the parties, the research examines the underlying deontic purpose of the procedure, the so-called deontic process, and identifies control requirements to secure this purpose. The vision of the research is a formal theory for streamlining bureaucracy in business and government procedures. Underpinning most inter-organizational procedures are deontic relations, which concern the rights and obligations of the parties. When all parties trust each other, they are willing to fulfill their obligations and honor the counterparties' rights; thus controls may not be needed. The challenge lies in cases where trust may not be assumed. In these cases, the parties need to rely on explicit controls to reduce their exposure to the risk of opportunism. However, at present there is no analytic approach or technique to determine which controls are needed for a given contracting or governance situation. The research proposes a formal method for deriving inter-organizational control requirements based on static analysis of deontic relations and dynamic analysis of deontic changes. The formal method will take a deontic process model of an inter-organizational transaction and certain domain knowledge as inputs to automatically generate control requirements that a documentary procedure needs to satisfy in order to limit fraud potentials. The deliverables of the research include a formal representation, namely Deontic Petri Nets, that combines multiple modal logics and Petri nets for modeling deontic processes; a set of control principles that represent an initial formal theory on the relationships between deontic processes and documentary procedures; and a working prototype that uses model checking techniques to identify fraud potentials in a deontic process and generate control requirements to limit them. Fourteen scenarios from two well-known international payment procedures -- cash in advance and documentary credit -- have been used to test the prototype. The results showed that all control requirements stipulated in these procedures could be derived automatically.
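To make the idea concrete, the following is a minimal Python sketch, not the thesis's Deontic Petri Net formalism: it encodes two independent obligations (to ship and to pay) as places in a toy Petri net, enumerates the reachable markings, and flags markings in which one obligation has been fulfilled while the counter-obligation has not. All place and transition names are hypothetical.

```python
from collections import deque

# Toy deontic process: two independent obligations modelled as Petri-net places.
# Names are hypothetical; the thesis's Deontic Petri Nets add modal (deontic) logic on top.
TRANSITIONS = {
    "ship_goods":  ({"obligation_to_ship": 1}, {"goods_delivered": 1}),
    "pay_invoice": ({"obligation_to_pay": 1},  {"payment_made": 1}),
}

def enabled(marking, pre):
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

def reachable(initial):
    seen, queue = set(), deque([initial])
    while queue:
        m = queue.popleft()
        key = tuple(sorted(m.items()))
        if key in seen:
            continue
        seen.add(key)
        yield m
        for pre, post in TRANSITIONS.values():
            if enabled(m, pre):
                queue.append(fire(m, pre, post))

# "Fraud potential": a reachable state where goods were delivered but the
# counter-obligation (payment) is still unmet. A control requirement (e.g.
# payment before shipment, or a documentary credit) would rule such states out.
for m in reachable({"obligation_to_ship": 1, "obligation_to_pay": 1}):
    if m.get("goods_delivered", 0) and not m.get("payment_made", 0):
        print("fraud potential in marking:", m)
```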
Abstract:
Maintaining and evolving software systems has become a highly critical task over recent years due to the diversity and high demand of features, devices and users. Understanding and analysing how new changes impact the quality attributes of the architecture of such systems is an essential prerequisite to avoid quality deterioration during their evolution. This thesis proposes an automated approach for analysing variation in the performance quality attribute in terms of execution time (response time). It is implemented by a framework that adopts dynamic analysis and software repository mining techniques to provide an automated way of revealing potential sources -- commits and issues -- of performance variation in scenarios during software system evolution. The approach defines four phases: (i) preparation -- choosing the scenarios and preparing the target releases; (ii) dynamic analysis -- determining the performance of scenarios and methods by computing their execution times; (iii) variation analysis -- processing and comparing the dynamic analysis results across different releases; and (iv) repository mining -- identifying issues and commits associated with the detected performance variation. Empirical studies were carried out to evaluate the approach from different perspectives. An exploratory study analysed the feasibility of applying the approach to systems from different domains to automatically identify source-code elements with performance variation and the changes that affected those elements during an evolution. This study analysed three systems: (i) SIGAA -- a web system for academic management; (ii) ArgoUML -- a UML modelling tool; and (iii) Netty -- a framework for network applications. Another study performed an evolutionary analysis by applying the approach to multiple releases of Netty and of the web frameworks Wicket and Jetty. In that study, 21 releases (seven from each system) were analysed, totalling 57 scenarios. In summary, 14 scenarios with significant performance variation were found for Netty, 13 for Wicket and 9 for Jetty. Additionally, feedback was obtained from eight developers of these systems through an online form. Finally, in the last study, a performance regression model was developed to indicate the properties of commits that are most likely to cause performance degradation. In total, 997 commits were mined, of which 103 were retrieved from degraded source-code elements and 19 from optimized ones, while 875 had no impact on execution time. The number of days before the release was made available and the day of the week proved to be the most relevant variables of performance-degrading commits in our model. The area under the Receiver Operating Characteristic (ROC) curve of the regression model is 60%, which means that using the model to decide whether a commit will cause degradation or not is 10% better than a random decision.
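As an illustration of the variation-analysis phase (phase iii), the sketch below, which is not the thesis framework itself, compares per-scenario execution times measured on two releases with Welch's t-test and reports scenarios whose mean execution time changed significantly; the scenario names, timings and significance level are hypothetical.

```python
from scipy.stats import ttest_ind

# Hypothetical per-scenario execution times (ms) from repeated runs of two releases.
release_a = {"login": [120, 118, 122, 119], "search": [300, 310, 305, 298]}
release_b = {"login": [121, 119, 120, 123], "search": [355, 349, 360, 352]}

def variation_report(old, new, alpha=0.05):
    for scenario in old:
        t, p = ttest_ind(old[scenario], new[scenario], equal_var=False)  # Welch's t-test
        mean_old = sum(old[scenario]) / len(old[scenario])
        mean_new = sum(new[scenario]) / len(new[scenario])
        change = 100.0 * (mean_new - mean_old) / mean_old
        verdict = "significant" if p < alpha else "not significant"
        print(f"{scenario}: {mean_old:.1f} -> {mean_new:.1f} ms "
              f"({change:+.1f}%, p={p:.3f}, {verdict})")

variation_report(release_a, release_b)
# Flagged scenarios would then be cross-referenced with the commits and issues
# recorded between the two releases (the repository-mining phase).
```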
Abstract:
Combining the advantages of both parallel mechanisms and compliant mechanisms, a compliant parallel mechanism with two rotational DOFs (degrees of freedom) is designed to meet the requirement of a lightweight and compact pan-tilt platform. Firstly, two commonly used design methods, i.e. direct substitution and FACT (Freedom and Constraint Topology), are applied to design the configuration of the pan-tilt system, and the similarities and differences of the two design alternatives are compared. Then, inverse kinematic analysis of the candidate mechanism is carried out using the pseudo-rigid-body model (PRBM), and the Jacobian related to its differential kinematics is further derived to help the designer perform dynamic analysis of the 8R compliant mechanism. In addition, the mechanism's maximum stress within its workspace is checked by finite element analysis. Finally, a method to determine the joint damping of the flexure hinges is presented, which aims at exploring the effect of joint damping on actuator selection and real-time control. To the authors' knowledge, almost no existing literature addresses this issue.
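As a generic illustration of the differential-kinematics step, not the paper's PRBM derivation, the sketch below computes a finite-difference Jacobian of a hypothetical two-joint pan-tilt forward-kinematics map; the actual 8R compliant mechanism would use the pseudo-rigid-body model described above.

```python
import numpy as np

def forward_kinematics(q):
    """Hypothetical pan-tilt map: joint angles (pan, tilt) -> unit pointing direction."""
    pan, tilt = q
    return np.array([np.cos(tilt) * np.cos(pan),
                     np.cos(tilt) * np.sin(pan),
                     np.sin(tilt)])

def numerical_jacobian(f, q, eps=1e-6):
    """Central-difference Jacobian of f at q (rows: outputs, columns: joints)."""
    q = np.asarray(q, dtype=float)
    cols = []
    for i in range(q.size):
        dq = np.zeros_like(q)
        dq[i] = eps
        cols.append((f(q + dq) - f(q - dq)) / (2 * eps))
    return np.column_stack(cols)

J = numerical_jacobian(forward_kinematics, [0.2, 0.1])
print(J)  # 3x2 Jacobian used in differential kinematics / dynamic analysis
```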
Abstract:
Modern software applications are becoming more dependent on database management systems (DBMSs). DBMSs are usually used as black boxes by software developers. For example, Object-Relational Mapping (ORM) is one of the most popular database abstraction approaches that developers use nowadays. Using ORM, objects in Object-Oriented languages are mapped to records in the database, and object manipulations are automatically translated to SQL queries. As a result of such conceptual abstraction, developers do not need deep knowledge of databases; however, all too often this abstraction leads to inefficient and incorrect database access code. Thus, this thesis proposes a series of approaches to improve the performance of database-centric software applications that are implemented using ORM. Our approaches focus on troubleshooting and detecting inefficient (i.e., performance problems) database accesses in the source code, and we rank the detected problems based on their severity. We first conduct an empirical study on the maintenance of ORM code in both open source and industrial applications. We find that ORM performance-related configurations are rarely tuned in practice, and there is a need for tools that can help improve/tune the performance of ORM-based applications. Thus, we propose approaches along two dimensions to help developers improve the performance of ORM-based applications: 1) helping developers write more performant ORM code; and 2) helping developers configure ORM configurations. To provide tooling support to developers, we first propose static analysis approaches to detect performance anti-patterns in the source code. We automatically rank the detected anti-pattern instances according to their performance impacts. Our study finds that by resolving the detected anti-patterns, the application performance can be improved by 34% on average. We then discuss our experience and lessons learned when integrating our anti-pattern detection tool into industrial practice. We hope our experience can help improve the industrial adoption of future research tools. However, as static analysis approaches are prone to false positives and lack runtime information, we also propose dynamic analysis approaches to further help developers improve the performance of their database access code. We propose automated approaches to detect redundant data access anti-patterns in the database access code, and our study finds that resolving such redundant data access anti-patterns can improve application performance by an average of 17%. Finally, we propose an automated approach to tune performance-related ORM configurations using both static and dynamic analysis. Our study shows that our approach can help improve application throughput by 27--138%. Through our case studies on real-world applications, we show that all of our proposed approaches can provide valuable support to developers and help improve application performance significantly.
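As a self-contained illustration of the kind of redundant database access the thesis targets, the sketch below uses Python's built-in sqlite3 module to reproduce the classic "N+1 query" pattern that ORM lazy loading can generate, and the single-JOIN alternative, with query counts standing in for the performance impact. The schema and data are hypothetical, and the thesis's own detection and ranking tooling is not shown.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE author (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE book   (id INTEGER PRIMARY KEY, author_id INTEGER, title TEXT);
    INSERT INTO author VALUES (1, 'A'), (2, 'B');
    INSERT INTO book   VALUES (1, 1, 'x'), (2, 1, 'y'), (3, 2, 'z');
""")

queries = 0
def run(sql, args=()):
    """Execute one SQL statement and count it, as a stand-in for ORM-issued queries."""
    global queries
    queries += 1
    return conn.execute(sql, args).fetchall()

# Anti-pattern: one query per author (what ORM lazy loading often produces).
queries = 0
for author_id, name in run("SELECT id, name FROM author"):
    run("SELECT title FROM book WHERE author_id = ?", (author_id,))
print("lazy loading issued", queries, "queries")   # 1 + N

# Resolution: fetch everything eagerly with a single JOIN.
queries = 0
run("SELECT a.name, b.title FROM author a JOIN book b ON b.author_id = a.id")
print("eager loading issued", queries, "query")    # 1
```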
Abstract:
This paper empirically investigates volatility transmission among stock and foreign exchange markets in seven major world economies during the period July 1988 to January 2015. To this end, we first perform a static and dynamic analysis to measure the total volatility connectedness in the entire period (the system-wide approach) using a framework recently proposed by Diebold and Yilmaz (2014). Second, we make use of a dynamic analysis to evaluate the net directional connectedness for each market. To gain further insights, we examine the time-varying behaviour of net pairwise directional connectedness during the financial turmoil periods experienced in the sample period. Our results suggest that slightly more than half of the total variance of the forecast errors is explained by shocks across markets rather than by idiosyncratic shocks. Furthermore, we find that volatility connectedness varies over time, with a surge during periods of increasing economic and financial instability.
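For intuition about the connectedness measure, the following sketch shows only the final aggregation step of the Diebold and Yilmaz (2014) framework on a hypothetical three-market variance-decomposition matrix; in the actual study the decomposition comes from a generalized VAR, which is not reproduced here.

```python
import numpy as np

# Hypothetical H-step forecast-error variance decomposition matrix:
# entry [i, j] = share of market i's forecast-error variance due to shocks in market j.
# Rows sum to 1; in the Diebold-Yilmaz framework this comes from a generalized VAR.
D = np.array([
    [0.60, 0.25, 0.15],
    [0.20, 0.55, 0.25],
    [0.10, 0.30, 0.60],
])

cross = D - np.diag(np.diag(D))          # variance explained by *other* markets
total_connectedness = 100 * cross.sum() / D.sum()
to_others   = 100 * cross.sum(axis=0)    # gross directional connectedness "to others"
from_others = 100 * cross.sum(axis=1)    # gross directional connectedness "from others"
net = to_others - from_others            # net directional connectedness per market

print(f"total connectedness: {total_connectedness:.1f}%")
print("net directional connectedness:", np.round(net, 1))
```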
Abstract:
Thesis (Ph.D., Computing) -- Queen's University, 2016-09-30.
Abstract:
As a part of vital infrastructure and transportation networks, bridge structures must function safely at all times. However, due to heavier and faster moving vehicular loads and function adjustments, such as Busway accommodation, many bridges are now operating at an overload beyond their design capacity. Additionally, the huge renovation and replacement costs are difficult for infrastructure owners to undertake. Structural health monitoring (SHM) is intended to assess the condition of designated bridges and foresee probable failures, thereby monitoring their structural health. The SHM systems proposed recently incorporate Vibration-Based Damage Detection (VBDD) techniques, statistical methods and signal processing techniques, and have been regarded as efficient and economical ways to solve the problem. The recent developments in damage detection and condition assessment techniques based on VBDD and statistical methods are reviewed. The VBDD methods based on changes in natural frequencies, curvature/strain modes, modal strain energy (MSE), dynamic flexibility, and artificial neural networks (ANN) before and after damage, as well as other signal processing methods such as wavelet techniques and empirical mode decomposition (EMD) / Hilbert spectrum methods, are discussed here.
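As a minimal illustration of the simplest VBDD indicator mentioned above, the change in natural frequencies, the sketch below compares hypothetical baseline and current modal frequencies and flags drops beyond an assumed 3% threshold; both the frequencies and the threshold are placeholders, not values from the reviewed studies.

```python
baseline_hz = [2.91, 7.64, 14.80]   # hypothetical first three natural frequencies (healthy)
current_hz  = [2.88, 7.11, 14.72]   # frequencies identified from current monitoring data

for mode, (f0, f1) in enumerate(zip(baseline_hz, current_hz), start=1):
    drop = 100.0 * (f0 - f1) / f0
    flag = "  <-- possible damage" if drop > 3.0 else ""
    print(f"mode {mode}: {f0:.2f} Hz -> {f1:.2f} Hz  ({drop:+.1f}% drop){flag}")
```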
Abstract:
For the last two decades heart disease has been the highest single cause of death for the human population. With an alarming number of patients requiring heart transplants, and donations unable to satisfy the demand, treatment looks to mechanical alternatives. Rotary Ventricular Assist Devices (VADs) are miniature pumps which can be implanted alongside the heart to assist its pumping function. These constant flow devices are smaller, more efficient and promise a longer operational life than more traditional pulsatile VADs. The development of rotary VADs has focused on single pumps assisting the left ventricle only to supply blood for the body. In many patients, however, failure of both ventricles demands that an additional pulsatile device be used to support the failing right ventricle. This condition renders them hospital-bound while they wait for an unlikely heart donation. Reported attempts to use two rotary pumps to support both ventricles concurrently have warned of inherent haemodynamic instability. Poor balancing of the pumps' flow rates quickly leads to vascular congestion, increasing the risk of oedema, and to ventricular 'suckdown' occluding the inlet to the pump. This thesis introduces a novel Bi-Ventricular Assist Device (BiVAD) configuration where the pump outputs are passively balanced by vascular pressure. The BiVAD consists of two rotary pumps straddling the mechanical passive controller. Fluctuations in vascular pressure induce small deflections within both pumps, adjusting their outputs and allowing them to maintain arterial pressure. To optimise the passive controller's interaction with the circulation, the controller's dynamic response is tuned with a spring-mass-damper arrangement. This two-part study presents a comprehensive assessment of the prototype's 'viability' as a support device. Its 'viability' was considered based on its sensitivity to pathogenic haemodynamics and the ability of the passive response to maintain healthy circulation. The first part of the study is an experimental investigation in which a prototype device was designed, built and then tested in a pulsatile mock circulation loop. The BiVAD was subjected to a range of haemodynamic imbalances as well as a dynamic analysis to assess the functionality of the mechanical damper. The second part introduces the development of a numerical program to simulate human circulation supported by the passively controlled BiVAD. Both investigations showed that the prototype was able to mimic the native baroreceptor response. Simulating hypertension, poor flow balancing and subsequent ventricular failure during BiVAD support allowed the passive controller's response to be assessed. Triggered by the resulting pressure imbalance, the controller responded by passively adjusting the VAD outputs in order to maintain healthy arterial pressures. This baroreceptor-like response demonstrated the inherent stability of the auto-regulating BiVAD prototype. Simulating pulmonary hypertension in the more observable numerical model, however, revealed a serious issue with the passive response. The subsequent decrease in venous return into the left heart went unnoticed by the passive controller. Meanwhile, the coupled nature of the passive response not only decreased RVAD output to reduce pulmonary arterial pressure, but also increased LVAD output. Consequently, the LVAD increased fluid evacuation from the left ventricle (LV), and so actually accelerated the onset of LV collapse.
It was concluded that, despite the inherently stable baroreceptor-like response of the passive controller, its lack of sensitivity to venous return made it unviable in its present configuration. The study revealed a number of other important findings. Perhaps the most significant was that the reduced pulse experienced during constant flow support unbalanced the ratio of effective resistances of the two vascular circuits. Even during steady rotary support, therefore, the resulting ventricle volume imbalance increased the likelihood of suckdown. Additionally, mechanical damping of the passive controller's response successfully filtered out pressure fluctuations from residual ventricular function. Finally, the importance of recognising inertial contributions to blood flow in the atria and ventricles in a numerical simulation was highlighted. This thesis documents the first attempt to create a fully auto-regulated rotary cardiac assist device. Initial results encourage development of an inlet configuration sensitive to low flow, such as collapsible inlet cannulae. Combining this with the existing baroreceptor-like response of the passive controller will render a highly stable passively controlled BiVAD configuration. The prototype controller's passive interaction with the vasculature is a significant step towards a highly stable new generation of artificial heart.
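The passive controller's dynamic response is described above as a spring-mass-damper arrangement; the sketch below, with placeholder parameters rather than the prototype's, integrates m x'' + c x' + k x = F for a step pressure imbalance to show how the damper rate shapes overshoot and settling, which is the kind of trade-off the mechanical damper optimisation addresses.

```python
import numpy as np

def step_response(m, c, k, force, dt=1e-3, t_end=2.0):
    """Semi-implicit Euler integration of m*x'' + c*x' + k*x = force (step input at t = 0)."""
    n = int(t_end / dt)
    x, v = 0.0, 0.0
    xs = np.empty(n)
    for i in range(n):
        a = (force - c * v - k * x) / m
        v += a * dt
        x += v * dt
        xs[i] = x
    return xs

# Placeholder parameters: moving-mass, damper rate and spring rate of the passive controller.
for c in (2.0, 10.0, 40.0):                    # vary damping
    x = step_response(m=0.5, c=c, k=400.0, force=5.0)
    overshoot = 100 * (x.max() - x[-1]) / x[-1]
    print(f"c = {c:5.1f} N.s/m: overshoot {overshoot:5.1f}%, settles at {x[-1]*1000:.2f} mm")
```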
Abstract:
This paper presents a new DC-DC Multi-Output Boost (MOB) converter which can share its total output between different series output voltages for low- and high-power applications. This configuration can be utilised instead of several single-output power supplies. It is a compatible topology for a diode-clamped inverter in grid connection systems, where boosting a low rectified output voltage and series DC-link capacitors are required. To verify the proposed topology, steady-state and dynamic analyses of a MOB converter are carried out. A simple control strategy has been proposed to demonstrate the performance of the proposed topology for a double-output boost converter. The topology and its control strategy can easily be extended to offer multiple outputs. Simulation and experimental results are presented to show the validity of the control strategy for the proposed converter.
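For orientation, the sketch below applies the ideal continuous-conduction-mode boost relation V_out = V_in / (1 - D) to a hypothetical double-output case in which the boosted DC link is split across two series capacitors; the voltages are illustrative, and the MOB converter's actual switching and control scheme is not modelled.

```python
def boost_duty(v_in, v_out):
    """Ideal CCM boost relation V_out = V_in / (1 - D), solved for the duty cycle D."""
    if v_out <= v_in:
        raise ValueError("a boost stage can only step the voltage up")
    return 1.0 - v_in / v_out

# Hypothetical double-output case: a low rectified input boosted to a 400 V DC link
# that is split across two series output capacitors (e.g. for a diode-clamped inverter).
v_in = 100.0
v_total = 400.0
v_out1, v_out2 = 180.0, 220.0          # unequal series outputs, summing to v_total

d_total = boost_duty(v_in, v_total)
print(f"duty cycle for the total boosted voltage: {d_total:.2f}")      # 0.75
print(f"output split: {v_out1:.0f} V + {v_out2:.0f} V = {v_out1 + v_out2:.0f} V")
# Balancing the individual capacitor voltages is the task of the converter's
# additional switches and the proposed control strategy.
```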
Abstract:
Increased industrialisation has brought to the forefront the susceptibility of concrete columns in both buildings and bridges to vehicle impacts. Accurate vulnerability assessments are crucial in the design process due to the possibly catastrophic nature of the failures that can result. This paper reports on research undertaken to investigate the impact capacity of the columns of low- to medium-rise buildings designed according to Australian Standards. Numerical simulation techniques were used in the process and validation was done using experimental results published in the literature. The investigation thus far has confirmed the vulnerability of typical columns in five-storey buildings located in urban areas to medium-velocity car impacts; hence these columns need to be re-designed (if possible) or retrofitted. In addition, the accuracy of the simplified method presented in EN 1991 to quantify the impact damage was scrutinised. A simplified concept to assess the damage due to all collision modes was introduced. The research information will be extended to generate a common database to assess the vulnerability of columns in urban areas against the new generation of vehicles.
Abstract:
Structural health is a vital aspect of infrastructure sustainability. As a part of a vital infrastructure and transportation network, bridge structures must function safely at all times. However, due to heavier and faster moving vehicular loads and function adjustments, such as Busway accommodation, many bridges are now operating at an overload beyond their design capacity. Additionally, the huge renovation and replacement costs are a difficult burden for infrastructure owners. The structural health monitoring (SHM) systems proposed recently incorporate vibration-based damage detection techniques, statistical methods and signal processing techniques, and have been regarded as efficient and economical ways to assess bridge condition and foresee probable costly failures. In this chapter, the recent developments in damage detection and condition assessment techniques based on vibration-based damage detection and statistical methods are reviewed. The vibration-based damage detection methods based on changes in natural frequencies, curvature or strain modes, modal strain energy, dynamic flexibility, and artificial neural networks before and after damage, as well as other signal processing methods such as wavelet techniques, empirical mode decomposition and Hilbert spectrum methods, are discussed in this chapter.
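As a minimal illustration of the curvature (strain) mode approach mentioned above, the sketch below approximates modal curvature with a central second difference and locates the sensor position where the curvature change between baseline and damaged mode shapes peaks; the mode-shape values and geometry are hypothetical.

```python
import numpy as np

def curvature(mode_shape, dx):
    """Central second-difference approximation of the mode-shape curvature (interior points)."""
    phi = np.asarray(mode_shape, dtype=float)
    return (phi[:-2] - 2 * phi[1:-1] + phi[2:]) / dx**2

# Hypothetical first bending mode sampled at 9 equally spaced points along a girder.
x = np.linspace(0.0, 20.0, 9)                        # metres
healthy = np.sin(np.pi * x / 20.0)
damaged = healthy.copy()
damaged[4] *= 0.97                                   # small local stiffness loss near midspan

dx = x[1] - x[0]
delta = np.abs(curvature(damaged, dx) - curvature(healthy, dx))
worst = np.argmax(delta) + 1                         # +1: curvature is defined at interior points
print(f"largest curvature change at x = {x[worst]:.1f} m (likely damage location)")
```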
Abstract:
Increased industrialisation has brought to the forefront the susceptibility of concrete columns in both buildings and bridges to vehicle impacts. Accurate vulnerability assessments are crucial in the design process due to the possibly catastrophic nature of the failures that can result. This chapter reports on research undertaken to investigate the impact capacity of the columns of low- to medium-rise buildings designed according to the Australian standards. Numerical simulation techniques were used in the process and validation was done using experimental results published in the literature. The investigation thus far has confirmed the vulnerability of typical columns in five-storey buildings located in urban areas to medium-velocity car impacts; hence these columns need to be re-designed or retrofitted. In addition, the accuracy of the simplified method presented in EN 1991-1-7 to quantify the impact damage was scrutinised. A simplified concept to assess the damage due to all collision modes was introduced. The research information will be extended to generate a common database to assess the vulnerability of columns in urban areas against the new generation of vehicles.
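For reference, the simplified hard-impact model in EN 1991-1-7 (Annex C) estimates the equivalent interaction force as F = v_r √(k·m); the sketch below evaluates it for an illustrative medium-velocity car impact with placeholder stiffness and mass values, not the values used in this study.

```python
import math

def hard_impact_force(velocity_ms, equivalent_stiffness_n_per_m, mass_kg):
    """Simplified hard-impact interaction force, F = v_r * sqrt(k * m) (EN 1991-1-7, Annex C)."""
    return velocity_ms * math.sqrt(equivalent_stiffness_n_per_m * mass_kg)

# Illustrative medium-velocity car impact (placeholder values).
v = 40 / 3.6                 # 40 km/h expressed in m/s
k = 300e3                    # assumed equivalent vehicle stiffness, N/m
m = 1500.0                   # assumed vehicle mass, kg

F = hard_impact_force(v, k, m)
print(f"equivalent static design force ~ {F / 1e3:.0f} kN")
```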
Abstract:
This study explores the three-dimensional nonlinear dynamic responses of typical tall buildings with and without setbacks under blast loading. These 20-storey reinforced concrete buildings have been designed for normal (dead, live and wind) loads. The influence of the setbacks on the lateral load response due to blasts, in terms of peak deflections, accelerations, inter-storey drift and bending moments at critical locations (including hinge formation), was investigated. Structural response predictions were performed with a commercially available three-dimensional finite element analysis programme using non-linear direct integration time history analyses. Results obtained for buildings with different setbacks were compared and conclusions drawn. The comparisons revealed that buildings with setbacks that protect the tower part above the setback level from blast loading show a considerably better response in terms of peak displacement and inter-storey drift, when compared to buildings without setbacks. Rotational accelerations were found to depend on the periods of the rotational modes. Abrupt changes in moments and shears are experienced near the levels of the setbacks. Typical twenty-storey tall buildings with shear walls and frames that are designed for only normal loads perform reasonably well, without catastrophic collapse, when subjected to a blast that is equivalent to 500 kg TNT at a standoff distance of 10 m.
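Blast scenarios such as the 500 kg TNT charge at a 10 m standoff quoted above are usually characterised by the Hopkinson-Cranz scaled distance Z = R / W^(1/3) before peak overpressure and impulse are read from empirical charts; a minimal sketch of that first step follows (the empirical charts themselves are not reproduced).

```python
def scaled_distance(standoff_m, charge_kg_tnt):
    """Hopkinson-Cranz scaled distance Z = R / W^(1/3), in m/kg^(1/3)."""
    return standoff_m / charge_kg_tnt ** (1.0 / 3.0)

Z = scaled_distance(standoff_m=10.0, charge_kg_tnt=500.0)
print(f"scaled distance Z = {Z:.2f} m/kg^(1/3)")
# Z ~ 1.26 m/kg^(1/3): the corresponding peak overpressure and impulse are then read
# from empirical blast charts (e.g. Kingery-Bulmash) and applied as the loading
# in the nonlinear time-history analyses described above.
```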
Abstract:
Columns are one of the key load-bearing elements that are highly susceptible to vehicle impacts. The resulting severe damage to columns may lead to failures of the supporting structure that are catastrophic in nature. However, columns in existing structures are seldom designed for impact due to inadequacies of design guidelines. The impact behaviour of columns designed for gravity loads and actions other than impact is, therefore, of interest. A comprehensive investigation is conducted on reinforced concrete columns, with a particular focus on investigating the vulnerability of exposed columns and on implementing mitigation techniques under low- to medium-velocity car and truck impacts. The investigation is based on non-linear explicit computer simulations of impacted columns followed by a comprehensive validation process. The impact is simulated using force pulses generated from full-scale vehicle impact tests. A material model capable of simulating triaxial loading conditions is used in the analyses. Circular columns adequate in capacity for five- to twenty-storey buildings, designed according to Australian standards, are considered in the investigation. The crucial parameters associated with routine column designs and the different load combinations applied at the serviceability stage on the typical columns are considered in detail. Axially loaded columns are examined at the initial stage and the investigation is extended to analyse the impact behaviour under single-axis bending and biaxial bending. The impact capacity reduction under varying axial loads is also investigated. Effects of the various load combinations are quantified, and the residual capacity of the impacted columns, based on the status of the damage, and mitigation techniques are also presented. In addition, the contribution of each individual parameter to the failure load is scrutinised, and analytical equations are developed to identify the critical impulses in terms of the geometrical and material properties of the impacted column. In particular, an innovative technique was developed and introduced to improve the accuracy of the equations where other techniques fail due to the shape of the error distribution. Above all, the equations can be used to quantify the critical impulse for three consecutive points (load combinations) located on the interaction diagram for one particular column. Consequently, linear interpolation can be used to quantify the critical impulse for loading points located in between on the interaction diagram. Given a known force and impulse pair for an average impact duration, this method can be extended to assess the vulnerability of columns for a general vehicle population based on an analytical method that can be used to quantify the critical peak forces under different impact durations. Therefore, the contribution of this research is not limited to producing simplified yet rational design guidelines and equations; it also provides a comprehensive solution to quantify the impact capacity while delivering new insight to the scientific community for dealing with impacts.
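The abstract notes that critical impulses are derived for three load-combination points on the interaction diagram and that linear interpolation covers intermediate points; the sketch below shows only that interpolation step, with placeholder axial-load ratios and impulse values.

```python
import numpy as np

# Placeholder data: critical impulse (kN.s) computed at three load-combination
# points on a column's axial-load/moment interaction diagram, indexed here by
# the applied axial-load ratio N/N_u.
axial_load_ratio = np.array([0.2, 0.4, 0.6])
critical_impulse = np.array([14.0, 11.5, 8.0])    # kN.s, hypothetical

def critical_impulse_at(ratio):
    """Linear interpolation between the three computed interaction-diagram points."""
    return float(np.interp(ratio, axial_load_ratio, critical_impulse))

print(f"critical impulse at N/N_u = 0.5: {critical_impulse_at(0.5):.1f} kN.s")
```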