32 results for Thermo dynamic analysis
in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast
Abstract:
Microturbines (MTs) are among the most successfully commercialized distributed energy resources, yet their long-term effects on the distribution network have not been fully investigated, due to the complex thermo-fluid-mechanical energy conversion processes involved. This is further complicated by the fact that the parameters and internal data of MTs are not always available to the electric utility, owing to different ownerships and confidentiality concerns. To address this issue, a general modeling approach for MTs is proposed in this paper, which allows for the long-term simulation of the distribution network with multiple MTs. First, the feasibility of deriving a simplified MT model for long-term dynamic analysis of the distribution network is discussed, based on a physical understanding of the dynamic processes that occur within MTs. Then, a three-stage identification method is developed to obtain a piecewise MT model and predict electro-mechanical system behaviors with saturation. Next, assisted by an electric power flow calculation tool, a fast simulation methodology is proposed to evaluate the long-term impact of multiple MTs on the distribution network. Finally, the model is verified using Capstone C30 microturbine experiments and further applied to the dynamic simulation of a modified IEEE 37-node test feeder, with promising results.
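The kind of simplified long-term behaviour described above can be illustrated with a minimal sketch: a first-order power response with an output saturation limit. The function, time constant, and power values below are illustrative assumptions, not the paper's identified model.

```python
import numpy as np

def mt_power_response(p_set, p0, tau, p_max, t, dt=0.1):
    """First-order power response with output saturation -- a simplified
    stand-in for a microturbine's electro-mechanical dynamics.
    p_set: commanded power (kW), p0: initial power (kW),
    tau: time constant (s), p_max: saturation limit (kW)."""
    n = int(t / dt)
    p = np.empty(n)
    p[0] = p0
    for k in range(1, n):
        # first-order lag toward the setpoint
        p[k] = p[k - 1] + dt / tau * (p_set - p[k - 1])
        # saturation: the machine cannot exceed its rated output
        p[k] = min(p[k], p_max)
    return p

# step from 10 kW toward a 35 kW setpoint on a 30 kW-rated machine:
# the response rises exponentially, then clips at the rating
trace = mt_power_response(p_set=35.0, p0=10.0, tau=5.0, p_max=30.0, t=60.0)
```

A piecewise model of this shape (linear lag plus a saturated regime) is the sort of structure a staged identification procedure could fit from terminal measurements alone.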
Abstract:
Android is becoming ubiquitous and currently has the largest share of the mobile OS market, with billions of application downloads from the official app market. It has also become the platform most targeted by mobile malware, which is becoming more sophisticated in evading state-of-the-art detection approaches. Many Android malware families employ obfuscation techniques to avoid detection, which can defeat static-analysis-based approaches. Dynamic analysis, on the other hand, may be used to overcome this limitation. Hence, in this paper we propose DynaLog, a dynamic-analysis-based framework for characterizing Android applications. The framework provides the capability to analyse the behaviour of applications based on an extensive number of dynamic features. It provides an automated platform for the mass analysis and characterization of apps that is useful for quickly identifying and isolating malicious applications. The DynaLog framework leverages existing open source tools to extract and log high-level behaviours, API calls, and critical events that can be used to explore the characteristics of an application, thus providing an extensible dynamic analysis platform for detecting Android malware. DynaLog is evaluated using real malware samples and clean applications, demonstrating its capabilities for the effective analysis and detection of malicious applications.
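As a rough illustration of the feature extraction such a framework performs, the sketch below maps a run log onto a fixed binary feature vector. The feature names and log lines are hypothetical examples, not DynaLog's actual feature set or output format.

```python
# hypothetical API/event feature names (illustrative only)
FEATURES = ["sendTextMessage", "getDeviceId", "BOOT_COMPLETED",
            "DexClassLoader", "exec"]

def featurize(log_lines):
    """Map a run log (one API call or event per line) onto a fixed
    binary feature vector, as a dynamic-analysis framework might."""
    hits = set()
    for line in log_lines:
        for feat in FEATURES:
            if feat in line:
                hits.add(feat)
    return [1 if f in hits else 0 for f in FEATURES]

log = ["API: TelephonyManager.getDeviceId()",
       "EVENT: android.intent.action.BOOT_COMPLETED",
       "API: SmsManager.sendTextMessage(...)"]
vec = featurize(log)  # -> [1, 1, 1, 0, 0]
```

Vectors of this form are what a downstream classifier would consume when separating malicious from benign behaviour.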
Abstract:
In this paper we seek to shed light on the mismatch between income poverty and deprivation through a comparative and dynamic analysis of both forms of disadvantage. By extending the analysis over five waves of the ECHP, we are able to take into account the key dimensions characterizing poverty profiles over time. Our conclusions turn out to be remarkably stable across countries. While persistent income poverty measures are systematically related to both cross-sectional and longitudinal measures of deprivation, the scale of the mismatch is no less at the latter level than at the former. There is some evidence that although rates of volatility for income and deprivation measures are roughly similar, the processes of change themselves are somewhat different. Further light is shed on the underlying processes by cross-classifying the forms of deprivation. Those exposed to both types of deprivation are differentiated from others in terms of need and resource variables. Conclusions relating to the socio-demographic influences on risk levels are influenced by the choice and combination of indicators. The results of our analysis confirm the need to devote considerably more attention than heretofore to the analysis of multi-dimensional poverty dynamics.
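The cross-classification step can be sketched as a simple 2x2 contingency count over the two binary indicators; the records below are illustrative, not ECHP data.

```python
from collections import Counter

def cross_classify(records):
    """Count households in each cell of the income-poverty x deprivation
    2x2 table (illustrative data, not the ECHP sample)."""
    return Counter((r["income_poor"], r["deprived"]) for r in records)

sample = [
    {"income_poor": True,  "deprived": True},
    {"income_poor": True,  "deprived": False},
    {"income_poor": False, "deprived": True},
    {"income_poor": False, "deprived": False},
    {"income_poor": False, "deprived": False},
]
table = cross_classify(sample)
# the "mismatch": poor-but-not-deprived plus deprived-but-not-poor
mismatch = table[(True, False)] + table[(False, True)]
```

The off-diagonal cells of this table are precisely the mismatch the paper investigates; the diagonal cells identify the consistently advantaged and disadvantaged groups.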
Abstract:
This paper is concerned with the finite element simulation of debonding failures in FRP-strengthened concrete beams. A key challenge for such simulations is that common solution techniques such as the Newton-Raphson method and the arc-length method often fail to converge. This paper examines the effectiveness of using a dynamic analysis approach in such FE simulations, in which debonding failure is treated as a dynamic problem and solved using an appropriate time integration method. Numerical results are presented to show that an appropriate dynamic approach effectively overcomes the convergence problem and provides accurate predictions of test results.
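A minimal sketch of the idea, assuming a single degree of freedom and an explicit central-difference (velocity-Verlet) integrator rather than the paper's FE formulation: with sufficient damping, the dynamic solution settles to the static equilibrium that an iterative static solver may fail to reach.

```python
def central_difference(m, c, k, f, u0, v0, dt, n_steps):
    """Explicit time integration for a single DOF, m*u'' + c*u' + k*u = f.
    Treating a quasi-static problem dynamically like this sidesteps the
    Newton-Raphson convergence failures near limit points (a minimal
    sketch, not the paper's debonding model)."""
    u, v = u0, v0
    a = (f - c * v - k * u) / m
    for _ in range(n_steps):
        v_half = v + 0.5 * dt * a          # half-step velocity
        u = u + dt * v_half                # full-step displacement
        a = (f - c * v_half - k * u) / m   # acceleration from new state
        v = v_half + 0.5 * dt * a          # complete the velocity step
    return u

# with damping, the dynamic response settles to the static solution f/k = 2.0
u_end = central_difference(m=1.0, c=2.0, k=4.0, f=8.0,
                           u0=0.0, v0=0.0, dt=0.01, n_steps=5000)
```

In an FE setting the scalar stiffness would be replaced by the (possibly softening) internal force vector, which is exactly where static solvers lose convergence and a dynamic treatment keeps marching forward.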
Abstract:
This paper presents the numerical simulation of the ultimate behaviour of 85 one-way and two-way spanning laterally restrained concrete slabs of variable thickness, span, reinforcement ratio, strength, and boundary conditions, reported in the literature by different authors. The developed numerical model is described and all of its assumptions are illustrated. ABAQUS, a finite element analysis (FEA) suite, was employed, using its non-linear implicit static general analysis method. Other analysis methods, such as explicit dynamic analysis and the Riks method, are also discussed in general terms of application. The aim is to demonstrate the ability and efficacy of FEA to simulate the ultimate load behaviour of slabs considering different material properties and boundary conditions. The authors intended to present a numerical model that provides consistent predictions of the ultimate behaviour of laterally restrained slabs, which could be used as an alternative to expensive real-life testing as well as for the design and assessment of new and existing structures, respectively. The enhanced strength of laterally restrained slabs compared with conventional design-method predictions is believed to be due to compressive membrane action (CMA), an inherent phenomenon of laterally restrained concrete beams and slabs. The numerical predictions obtained from the developed model were in good agreement with the experimental results and with those obtained from the CMA method developed at Queen's University Belfast, UK.
Abstract:
As a newly invented parallel kinematic machine (PKM), the Exechon has attracted intensive attention from both academia and industry due to its conceptual high performance. Nevertheless, the dynamic behaviors of the Exechon PKM have not been thoroughly investigated because of its structural and kinematic complexity. To identify the dynamic characteristics of the Exechon PKM, an elastodynamic model is proposed in this paper using the substructure synthesis technique. The Exechon PKM is divided into a moving platform subsystem, a fixed base subsystem, and three limb subsystems according to its structural features. Differential equations of motion for the limb subsystem are derived through finite element (FE) formulations by modeling the complex limb structure as a spatial beam with corresponding geometric cross sections. Meanwhile, revolute, universal, and spherical joints are simplified into virtual lumped springs with equivalent stiffnesses and masses at their geometric centers. Differential equations of motion for the moving platform are derived with Newton's second law after treating the platform as a rigid body due to its comparatively high rigidity. After introducing the deformation compatibility conditions between the platform and the limbs, the governing differential equations of motion for the Exechon PKM are derived. The solution of the characteristic equations leads to the natural frequencies and corresponding mode shapes of the PKM at any typical configuration. To predict the dynamic behaviors quickly, an algorithm is proposed to numerically compute the distributions of natural frequencies throughout the workspace. Simulation results reveal that the lower natural frequencies are strongly position-dependent and distributed axisymmetrically due to the structural symmetry of the limbs.
Finally, a parametric analysis is carried out to identify the effects of structural, dimensional, and stiffness parameters on the system's dynamic characteristics, with the purpose of providing useful information for the optimal design and performance improvement of the Exechon PKM. The elastodynamic modeling methodology and dynamic analysis procedure can be readily extended to other overconstrained PKMs with minor modifications.
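Once the assembled stiffness and mass matrices are in hand, the natural-frequency computation reduces to the generalized eigenproblem K v = w^2 M v. A toy two-DOF sketch, with illustrative matrices rather than the Exechon model:

```python
import numpy as np

# toy 2-DOF stiffness (N/m) and mass (kg) matrices -- illustrative values,
# standing in for the assembled substructure-synthesis matrices
K = np.array([[2000.0, -1000.0],
              [-1000.0, 1000.0]])
M = np.diag([2.0, 1.0])

# K v = w^2 M v, solved via the equivalent standard problem (M^-1 K) v = w^2 v
w2 = np.sort(np.linalg.eigvals(np.linalg.solve(M, K)).real)
freqs_hz = np.sqrt(w2) / (2.0 * np.pi)   # natural frequencies in Hz
```

Sweeping the machine configuration, rebuilding K and M at each pose, and repeating this eigensolve is how a position-dependent frequency map over the workspace would be produced.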
Abstract:
N-gram analysis is an approach that investigates the structure of a program using bytes, characters, or text strings. A key issue with N-gram analysis is feature selection amidst the explosion of features that occurs when N is increased. The experiments within this paper represent programs as operational code (opcode) density histograms gained through dynamic analysis. A support vector machine is used to create a reference model, which is used to evaluate two methods of feature reduction: 'area of intersect' and 'subspace analysis using eigenvectors'. The findings show that the relationships between features are complex and that simple statistical filtering does not provide a viable approach; however, eigenvector subspace analysis produces a suitable filter.
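A minimal sketch of the pipeline, assuming toy opcode traces: build density histograms, then project onto the leading eigenvectors of the feature covariance (here via SVD of the centred data, a standard PCA formulation, not necessarily the paper's exact procedure).

```python
import numpy as np

def opcode_density(opcode_trace, vocab):
    """Normalised opcode-frequency histogram for one program run."""
    counts = np.array([opcode_trace.count(op) for op in vocab], dtype=float)
    return counts / counts.sum()

# toy traces and vocabulary (illustrative opcodes, not a real feature set)
vocab = ["mov", "push", "call", "xor", "jmp"]
runs = [["mov", "mov", "push", "call", "xor"],
        ["mov", "push", "push", "call", "jmp"],
        ["xor", "xor", "jmp", "call", "mov"]]
X = np.array([opcode_density(r, vocab) for r in runs])

# eigenvector subspace analysis: keep only the strongest directions
Xc = X - X.mean(axis=0)                      # centre the histograms
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
X_reduced = Xc @ Vt[:2].T                    # project onto top 2 eigenvectors
```

Filtering features this way compresses the exploding N-gram space before the SVM sees it, which is the role the eigenvector subspace filter plays in the study.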
Abstract:
A postbuckling blade-stiffened composite panel was loaded in uniaxial compression until failure. During loading beyond initial buckling, this panel was observed to undergo a secondary instability characterised by a dynamic mode shape change. These abrupt changes cause considerable numerical difficulties for standard path-following quasi-static solution procedures in finite element analysis. Improved methods such as the arc-length-related procedures do better at traversing certain critical points along an equilibrium path, but these procedures may also encounter difficulties in highly non-linear problems. This paper presents a robust, modified explicit dynamic analysis for the modelling of postbuckling structures. This method was shown to predict the mode-switch with good accuracy and is more efficient than standard explicit dynamic analysis. © 2003 Elsevier Science Ltd. All rights reserved.
Abstract:
The inherent difficulty of thread-based shared-memory programming has recently motivated research in high-level, task-parallel programming models. Recent advances of Task-Parallel models add implicit synchronization, where the system automatically detects and satisfies data dependencies among spawned tasks. However, dynamic dependence analysis incurs significant runtime overheads, because the runtime must track task resources and use this information to schedule tasks while avoiding conflicts and races.
We present SCOOP, a compiler that effectively integrates static and dynamic analysis in code generation. SCOOP combines context-sensitive points-to, control-flow, escape, and effect analyses to remove redundant dependence checks at runtime. Our static analysis can work in combination with existing dynamic analyses and task-parallel runtimes that use annotations to specify tasks and their memory footprints. We use our static dependence analysis to detect non-conflicting tasks and an existing dynamic analysis to handle the remaining dependencies. We evaluate the resulting hybrid dependence analysis on a set of task-parallel programs.
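The hybrid idea can be sketched with set-based task footprints: a runtime conflict check is applied to every task pair except those a static analysis has already proven independent. The task names, footprints, and the proven-independent set below are illustrative assumptions, not SCOOP's actual representation.

```python
def conflicts(task_a, task_b):
    """Two tasks conflict if either writes memory the other touches."""
    return bool(task_a["writes"] & (task_b["reads"] | task_b["writes"]) or
                task_b["writes"] & task_a["reads"])

# footprints as sets of abstract memory regions (illustrative only)
t1 = {"name": "t1", "reads": {"a"}, "writes": {"b"}}
t2 = {"name": "t2", "reads": {"b"}, "writes": {"c"}}
t3 = {"name": "t3", "reads": {"a"}, "writes": {"d"}}

# suppose static analysis proved t1 and t3 never conflict:
# their runtime dependence check can be removed entirely
statically_independent = {("t1", "t3")}

def must_check(x, y):
    pair = tuple(sorted((x["name"], y["name"])))
    return pair not in statically_independent

runtime_checks = [(x["name"], y["name"])
                  for x, y in [(t1, t2), (t1, t3), (t2, t3)]
                  if must_check(x, y)]
```

Only the pairs left in `runtime_checks` pay the dynamic-tracking overhead; the statically proven pair is scheduled without any check, which is the source of the hybrid approach's savings.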
Abstract:
This paper describes the results of non-linear elasto-plastic implicit dynamic finite element analyses that are used to predict the collapse behaviour of cold-formed steel portal frames at elevated temperatures. The collapse behaviour of a simple rigid-jointed beam idealisation and a more accurate semi-rigid jointed shell element idealisation are compared for two different fire scenarios. For the case of the shell element idealisation, the semi-rigidity of the cold-formed steel joints is explicitly taken into account through modelling of the bolt-hole elongation stiffness. In addition, the shell element idealisation is able to capture buckling of the cold-formed steel sections in the vicinity of the joints. The shell element idealisation is validated at ambient temperature against the results of full-scale tests reported in the literature. The behaviour at elevated temperatures is then considered for both the semi-rigid jointed shell and rigid-jointed beam idealisations. The inclusion of accurate joint rigidity and geometric non-linearity (second-order analysis) is shown to affect the collapse behaviour at elevated temperatures. For each fire scenario considered, the importance of base fixity in preventing an undesirable outwards collapse mechanism is demonstrated. The results demonstrate that joint rigidity and varying fire scenarios should be considered in order to achieve a conservative design.
Abstract:
N-gram analysis is an approach that investigates the structure of a program using bytes, characters, or text strings. This research uses dynamic analysis to investigate malware detection using a classification approach based on N-gram analysis. A key issue with dynamic analysis is the length of time a program has to be run to ensure a correct classification. The motivation for this research is to find the optimum subset of operational codes (opcodes) that make the best indicators of malware, and to determine how long a program has to be monitored to ensure an accurate support vector machine (SVM) classification of benign and malicious software. The experiments within this study represent programs as opcode density histograms gained through dynamic analysis over different program run periods. An SVM is used as the program classifier to determine the ability of different program run lengths to correctly detect the presence of malicious software. The findings show that malware can be detected with different program run lengths using a small number of opcodes.
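A sketch of the classification step, using scikit-learn's `SVC` as a stand-in for the paper's SVM; the histograms and the two "run lengths" below are fabricated toy values, not the study's measurements.

```python
import numpy as np
from sklearn.svm import SVC

# toy opcode-density histograms: rows are programs, columns are opcode
# frequencies; one matrix per hypothetical monitoring period
X_short = np.array([[0.60, 0.30, 0.10], [0.50, 0.40, 0.10],
                    [0.10, 0.20, 0.70], [0.20, 0.10, 0.70]])
X_long  = np.array([[0.62, 0.28, 0.10], [0.52, 0.38, 0.10],
                    [0.12, 0.18, 0.70], [0.18, 0.12, 0.70]])
y = np.array([0, 0, 1, 1])            # 0 = benign, 1 = malicious

# train one classifier per run length; large C approximates a hard margin
for run_len, X in [("short run", X_short), ("long run", X_long)]:
    clf = SVC(kernel="linear", C=100.0).fit(X, y)
    acc = clf.score(X, y)             # training accuracy on the toy set
```

Comparing accuracies across run lengths, on held-out data rather than the training set, is the experiment the abstract describes for finding the shortest monitoring period that still classifies reliably.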
Abstract:
N-gram analysis is an approach that investigates the structure of a program using bytes, characters, or text strings. This research uses dynamic analysis to investigate malware detection using a classification approach based on N-gram analysis. The motivation for this research is to find a subset of N-gram features that makes a robust indicator of malware. The experiments within this paper represent programs as N-gram density histograms gained through dynamic analysis. A support vector machine (SVM) is used as the program classifier to determine the ability of N-grams to correctly detect the presence of malicious software. The preliminary findings show that N-gram sizes of N = 3 and N = 4 present the best avenues for further analysis.
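The N-gram density-histogram representation can be sketched in a few lines; the byte trace below is an illustrative stand-in for a program's recorded execution data.

```python
from collections import Counter

def ngram_density(byte_seq, n):
    """Relative-frequency histogram of the n-grams in a byte sequence."""
    grams = [tuple(byte_seq[i:i + n]) for i in range(len(byte_seq) - n + 1)]
    total = len(grams)
    return {g: c / total for g, c in Counter(grams).items()}

# toy trace: 7 bytes -> five 3-grams, one of which repeats
trace = [0x55, 0x8B, 0xEC, 0x55, 0x8B, 0xEC, 0x90]
h3 = ngram_density(trace, 3)   # N = 3, one of the sizes the study favours
```

Histograms of this form, computed for each candidate N, are the feature vectors the SVM would be trained on when comparing N-gram sizes.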
Abstract:
This paper presents a study on concrete fracture and the associated mesh sensitivity using the finite element (FE) method with a local concrete model in both tension (Mode I) and compression. To enable the incorporation of dynamic loading, the FE model is developed using the transient dynamic analysis code LS-DYNA Explicit. A series of investigations has been conducted on typical fracture scenarios to evaluate the model performance and calibrate the relevant parameters. The K&C damage model was adopted because it is a comprehensive local concrete model that allows the user to change the crack band width, fracture energy, and rate dependency of the material. Compressive localisation in numerical modelling is also discussed in detail. Finally, an impact test specimen is modelled.