79 results for Simulation analysis


Relevance:

30.00%

Publisher:

Abstract:

Recent advances in hardware development coupled with the rapid adoption and broad applicability of cloud computing have introduced widespread heterogeneity in data centers, significantly complicating the management of cloud applications and data center resources. This paper presents the CACTOS approach to cloud infrastructure automation and optimization, which addresses heterogeneity through a combination of in-depth analysis of application behavior with insights from commercial cloud providers. The aim of the approach is threefold: to model applications and data center resources, to simulate applications and resources for planning and operation, and to optimize application deployment and resource use in an autonomic manner. The approach is based on case studies from the areas of business analytics, enterprise applications, and scientific computing.

In this paper, the overall formation stability of unmanned multi-vehicle systems under interconnection topologies is presented mathematically. A novel definition of formation error is first given, followed by the proposed formation stability hypothesis. Based on this hypothesis, a unique extension-decomposition-aggregation scheme is then employed to support the stability analysis of the overall multi-vehicle formation under a mesh topology. It is proved that the overall formation control system, consisting of N nonlinear vehicles, is not only asymptotically but also exponentially stable in the sense of Lyapunov within a neighbourhood of the desired formation. The technique is demonstrated for a mesh topology but is equally applicable to other topologies. A simulation study of the formation manoeuvre of multiple Aerosonde UAVs in 3D space is finally carried out, verifying the achieved formation stability result.

This study investigates the effects of ground heterogeneity, treating permeability as a random variable, on an intruding saltwater (SW) wedge using Monte Carlo simulations. Random permeability fields were generated using the method of Local Average Subdivision (LAS), based on a lognormal probability density function. The LAS method allows the creation of spatially correlated random fields, generated using coefficients of variation (COV) and horizontal and vertical scales of fluctuation (SOF). The numerical modelling code SUTRA was employed to solve the coupled flow and transport problem, with the well-defined 2D dispersive Henry problem used as the test case. The intruding SW wedge is characterised by two key parameters, the toe penetration length (TL) and the width of the mixing zone (WMZ). These parameters were compared to the results of a homogeneous case simulated using effective permeability values. The simulation results revealed that: (1) an increase in COV resulted in a seaward movement of TL; (2) the WMZ extended with increasing COV; (3) a general increase in horizontal and vertical SOF produced a seaward movement of TL, with the WMZ increasing slightly; and (4) as the anisotropic ratio increased, the TL intruded further inland and the WMZ reduced in size. The results show that for large values of COV, effective permeability parameters are inadequate for reproducing the effects of heterogeneity on SW intrusion.
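As an illustrative aside, a lognormal, spatially correlated permeability field of the kind sampled in such Monte Carlo studies can be sketched with a covariance-matrix (Cholesky) approach. This is a minimal 1D stand-in for LAS, not the LAS algorithm itself, and all parameter values below are hypothetical:

```python
import numpy as np

def lognormal_perm_field(n, dx, k_eff, cov, sof, seed=0):
    """Spatially correlated lognormal permeability on a 1D grid.

    The underlying Gaussian log-field has an exponential correlation
    with scale of fluctuation `sof`; `cov` is the coefficient of
    variation of permeability and `k_eff` its mean.
    """
    rng = np.random.default_rng(seed)
    x = np.arange(n) * dx
    # log-permeability variance implied by the target COV of k
    sigma2_ln = np.log(1.0 + cov**2)
    mu_ln = np.log(k_eff) - 0.5 * sigma2_ln
    # exponential covariance matrix of the Gaussian log-field
    C = sigma2_ln * np.exp(-2.0 * np.abs(x[:, None] - x[None, :]) / sof)
    L = np.linalg.cholesky(C + 1e-12 * np.eye(n))  # jitter for stability
    g = mu_ln + L @ rng.standard_normal(n)
    return np.exp(g)

k = lognormal_perm_field(n=50, dx=0.5, k_eff=1e-11, cov=0.5, sof=2.0)
```

Larger `cov` widens the spread of permeability values and larger `sof` makes neighbouring cells more alike, mirroring the two heterogeneity controls varied in the study.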

Natural ventilation is a sustainable solution to maintaining healthy and comfortable environmental conditions in buildings. However, the effective design, construction and operation of naturally ventilated buildings require a good understanding of the complex airflow patterns caused by buoyancy and wind effects. The work presented in this article employed a 3D computational fluid dynamics (CFD) analysis to investigate the environmental conditions and thermal comfort of the occupants of a highly glazed, naturally ventilated meeting room. This analysis was supported by real-time field measurements performed in an operating building and by a previously developed formal calibration methodology for reliable CFD models of indoor environments. Because creating an accurate CFD model of an occupied space in a real-life scenario requires a high level of CFD expertise, trusted experimental data and an ability to interpret model input parameters, the calibration methodology guided the development of a robust and reliable CFD model of the indoor environment. This calibrated CFD model was then used to investigate indoor environmental conditions and to evaluate thermal comfort indices for the occupants of the room. Thermal comfort expresses occupants' satisfaction with the thermal environment in buildings by defining the range of indoor thermal conditions acceptable to a majority of occupants. In this study, the thermal comfort analysis, supported by both field measurements and CFD simulation results, confirmed satisfactory and optimal room operation in terms of thermal environment for the investigated real-life scenario. © 2013 Elsevier Ltd.
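Thermal comfort indices of the kind evaluated here are typically based on Fanger's model; a minimal sketch of the standard ISO 7730 relation between the Predicted Mean Vote (PMV) and the Predicted Percentage of Dissatisfied (PPD) is:

```python
import math

def ppd(pmv):
    """Predicted Percentage of Dissatisfied (ISO 7730 / Fanger)
    as a function of the Predicted Mean Vote."""
    return 100.0 - 95.0 * math.exp(-(0.03353 * pmv**4 + 0.2179 * pmv**2))

neutral = ppd(0.0)  # even at PMV = 0, 5 % of occupants are dissatisfied
```

At PMV = 0 the PPD bottoms out at 5%, and the usual comfort band of |PMV| ≤ 0.5 corresponds to roughly 10% dissatisfied or fewer.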

The liquid structure of pyridine-acetic acid mixtures has been investigated using neutron scattering at various mole fractions of acetic acid (χHOAc = 0.33, 0.50 and 0.67) and compared to the structures of neat pyridine and acetic acid. The data have been modelled using Empirical Potential Structure Refinement (EPSR) with a ‘free proton’ reference model, which has no prejudicial weighting towards the existence of either molecular or ionised species. Analysis of the neutron scattering results shows the existence of hydrogen-bonded acetic acid chains with pyridine inclusions, rather than the formation of an ionic liquid by proton transfer.

Finite element (FE) simulations provide an inexpensive alternative to experimental material testing of new metal alloys, which is expensive; nanoindentation is particularly costly due to the equipment needed to work at such a small scale. This paper demonstrates the applicability and accuracy of FE modelling for basic material tests and proposes that the viscoplastic model may be used for nanoindentation testing. The simulations test the Young’s modulus of materials during analysis when an Abaqus VUMAT is used. The viscoplastic model is incorporated into a subroutine and is tested at the macroscopic scale against previously published results.
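A minimal sketch of the kind of check described, extracting Young's modulus as the slope of the linear region of a simulated stress-strain curve (synthetic data below, not the paper's VUMAT output):

```python
import numpy as np

def youngs_modulus(strain, stress, elastic_limit=0.002):
    """Estimate E as the least-squares slope of the initial
    (assumed linear) portion of a stress-strain curve."""
    mask = strain <= elastic_limit
    slope, _intercept = np.polyfit(strain[mask], stress[mask], 1)
    return slope

# synthetic elastic data for a nominal 200 GPa material
strain = np.linspace(0.0, 0.002, 20)
stress = 200e9 * strain
E = youngs_modulus(strain, stress)
```

Comparing the fitted slope against the modulus assigned to the material model is a quick sanity check that a user subroutine is returning the intended elastic response.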

Photovoltaic (PV) solar power generation is proven to be effective and sustainable but is currently hampered by relatively high costs and low conversion efficiency. This paper addresses both issues by presenting a low-cost and efficient temperature distribution analysis for identifying PV module mismatch faults by thermography. Mismatch faults reduce the power output and cause potential damage to PV cells. The paper first defines three fault categories in terms of fault levels, which lead to different terminal characteristics of the PV modules. The three faults are investigated analytically and experimentally, and maintenance suggestions are provided for the different fault types. The proposed methodology combines the electrical and thermal characteristics of PV cells subjected to different fault mechanisms through simulation and experimental tests. Furthermore, the fault diagnosis method can be incorporated into maximum power point tracking schemes to shift the operating point of the PV string. The developed technology improves on existing approaches by locating the faulty cell with a thermal camera, providing a remedial measure, and maximizing the power output under faulty conditions.

Heat sinks are widely used for cooling electronic devices and systems. Their thermal performance is usually determined by the material, shape, and size of the heat sink. With the assistance of computational fluid dynamics (CFD) and surrogate-based optimization, heat sinks can be designed and optimized to achieve a high level of performance. In this paper, the design and optimization of a plate-fin heat sink cooled by an impingement jet is presented. The flow and thermal fields are simulated using CFD, and the thermal resistance of the heat sink is then estimated. A Kriging surrogate model is developed to approximate the objective function (thermal resistance) as a function of the design variables. Surrogate-based optimization is implemented by adaptively adding infill points based on an integrated strategy combining the minimum predicted value, the maximum mean square error, and the expected improvement approaches. The results show the influence of the design variables on the thermal resistance and yield the optimal heat sink with the lowest thermal resistance for the given jet impingement conditions.
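The expected improvement infill criterion mentioned above has a standard closed form for minimisation; a minimal sketch, taking the surrogate's predictive mean and standard deviation at a candidate point:

```python
import math

def expected_improvement(mu, sigma, f_best):
    """Expected improvement of a candidate point for minimisation,
    given the surrogate's predictive mean `mu` and standard
    deviation `sigma`, and the best objective value so far."""
    if sigma <= 0.0:
        return 0.0  # no predictive uncertainty, no expected gain
    z = (f_best - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return (f_best - mu) * cdf + sigma * pdf
```

The criterion balances exploitation (low predicted mean) against exploration (high predictive uncertainty), which is why it is commonly combined with minimum-value and maximum-error infill strategies.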

Background: Pedigree reconstruction using genetic analysis provides a useful means to estimate fundamental population biology parameters relating to population demography, trait heritability and individual fitness when combined with other sources of data. However, there remain limitations to pedigree reconstruction in wild populations, particularly in systems where parent-offspring relationships cannot be directly observed, there is incomplete sampling of individuals, or molecular parentage inference relies on low-quality DNA from archived material. While much can still be inferred from incomplete or sparse pedigrees, it is crucial to evaluate the quality and power of the available genetic information prior to testing specific biological hypotheses. Here, we used microsatellite markers to reconstruct a multi-generation pedigree of wild Atlantic salmon (Salmo salar L.) using archived scale samples collected with a total trapping system within a river over a 10-year period. Using a simulation-based approach, we determined the optimal number of microsatellite markers for accurate parentage assignment, and evaluated the power of the resulting partial pedigree to investigate important evolutionary and quantitative genetic characteristics of salmon in the system.

Results: We show that at least 20 microsatellites (average 12 alleles/locus) are required to maximise parentage assignment and to improve the power to estimate reproductive success and heritability in this study system. We also show that 1.5-fold differences can be detected between groups simulated to have differing reproductive success, and that it is possible to detect moderate heritability values for continuous traits (h² ≈ 0.40) with more than 80% power when using 28 moderately to highly polymorphic markers.

Conclusion: The methodologies and workflow described provide a robust approach for evaluating archived samples for pedigree-based research, even where only a proportion of the total population is sampled. The results demonstrate the feasibility of pedigree-based studies to address challenging ecological and evolutionary questions in free-living populations, where genealogies can be traced only using molecular tools, and show that significant increases in pedigree assignment power can be achieved by using higher numbers of markers.
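The simulation logic behind the marker-number result, that more loci make the true parent the only Mendelian-compatible adult, can be sketched with a toy Monte Carlo. This assumes equifrequent alleles and a random mating pool; it is an illustration of the principle, not the authors' pipeline:

```python
import numpy as np

def assignment_rate(n_loci, n_alleles=12, n_adults=100, n_trials=200, seed=1):
    """Fraction of trials in which the true parent is the only
    Mendelian-compatible adult, under equifrequent alleles."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_trials):
        adults = rng.integers(0, n_alleles, size=(n_adults, n_loci, 2))
        parent = adults[0]
        # offspring inherits one random allele per locus from the parent,
        # the other allele comes from the wider population
        inherited = parent[np.arange(n_loci), rng.integers(0, 2, n_loci)]
        other = rng.integers(0, n_alleles, n_loci)
        off = np.stack([inherited, other], axis=1)
        # an adult is compatible if it shares an allele at every locus
        share = (adults[:, :, :, None] == off[None, :, None, :]).any(axis=(2, 3))
        compatible = share.all(axis=1)
        if compatible.sum() == 1 and compatible[0]:
            hits += 1
    return hits / n_trials
```

With only a few loci many unrelated adults pass the compatibility check by chance, so the unique-assignment rate is low; with 20 loci of 12 alleles each it approaches 1, matching the qualitative conclusion above.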

As a post-CMOS technology, the emerging Quantum-dot Cellular Automata (QCA) technology has various advantages; a key aspect that makes it highly desirable is its low power dissipation. One method used to analyse power dissipation in QCA circuits is bit erasure analysis, which has previously been applied to proposed QCA binary adders. However, a number of improved QCA adders have been proposed more recently that have only been evaluated in terms of area and speed. As the three key performance metrics for QCA circuits are speed, area and power, this paper presents a bit erasure analysis of these adders to determine their power dissipation. The adders analysed are the Carry Flow Adder (CFA), the Brent-Kung Adder (B-K), the Ladner-Fischer Adder (L-F) and a more recently developed area-delay efficient adder. This research allows for a more comprehensive comparison between the different QCA adder proposals. To the best of the authors' knowledge, this is the first time power dissipation analysis has been carried out on these adders.
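Bit erasure analysis rests on counting the information a logically irreversible gate destroys, with each erased bit costing at least kT ln 2 by Landauer's principle. A generic information-theoretic sketch for uniformly distributed inputs (a simplification of the QCA-specific methodology), applied to the 3-input majority gate that underlies QCA logic:

```python
import math
from collections import Counter

def bits_erased(truth_table):
    """Average information (bits) lost by a deterministic gate,
    assuming uniformly distributed inputs: H(inputs) - H(output)."""
    n = len(truth_table)
    h_in = math.log2(n)
    counts = Counter(truth_table.values())
    h_out = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return h_in - h_out

# 3-input majority gate, the basic QCA logic primitive
MAJ = {(a, b, c): int(a + b + c >= 2)
       for a in (0, 1) for b in (0, 1) for c in (0, 1)}
```

For the majority gate, three uniform input bits collapse to one balanced output bit, so two bits of information are erased per evaluation; summing such losses over a circuit gives a lower bound on its dissipation.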

This paper outlines the importance of robust interface management for facilitating finite element analysis workflows. Topological equivalences between analysis model representations are identified and maintained in an editable and accessible manner. The model and its interfaces are automatically represented using an analysis-specific cellular decomposition of the design space. Rework of boundary conditions following changes to the design geometry or the analysis idealization can be minimized by tracking interface dependencies. Utilizing this information with the Simulation Intent specified by an analyst, automated decisions can be made to process the interface information required to rebuild analysis models. Through this work automated boundary condition application is realized within multi-component, multi-resolution and multi-fidelity analysis workflows.
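One plausible sketch of the interface-dependency idea is to key boundary conditions to persistent topological interface identifiers, so that after a geometry or idealization change only orphaned conditions need analyst attention. The function, identifiers and condition names below are hypothetical illustrations, not the paper's implementation:

```python
def reapply_bcs(bc_map, new_interfaces):
    """Split boundary conditions into those whose interface survived
    a model rebuild and those that need analyst review."""
    kept = {i: bc for i, bc in bc_map.items() if i in new_interfaces}
    orphaned = {i: bc for i, bc in bc_map.items() if i not in new_interfaces}
    return kept, orphaned

# boundary conditions keyed by (hypothetical) persistent interface ids
bcs = {"face:12": "fixed", "face:47": "pressure:2e5", "face:90": "contact"}
kept, orphaned = reapply_bcs(bcs, new_interfaces={"face:12", "face:47"})
```

Conditions whose interfaces persist are re-applied automatically; the rest are surfaced for review, which is the rework-minimisation behaviour the paragraph describes.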

Milling is an important operation in many industries, such as mining and pharmaceuticals. Although the comminution process during milling has been extensively studied, the material fragmentation mechanisms in a mill are still not well understood, partly because of a lack of understanding of the local stressing and dynamic conditions inside mills under operational conditions. This paper presents a DEM simulation of particle dynamics and impact events in a centrifugal impact pin mill. The main focus is the statistical characteristics of the dominant stressing modes during the milling process. The frequency, velocity and force of the different impact events, between particles and mill components or between particles themselves, are analysed. © 2013 AIP Publishing LLC.

This paper presents the numerical simulation of the ultimate behaviour of 85 one-way and two-way spanning laterally restrained concrete slabs of variable thickness, span, reinforcement ratio, strength and boundary conditions reported in the literature by different authors. The developed numerical model was described and all of its assumptions were illustrated. ABAQUS, a finite element analysis (FEA) suite of software, was employed, using its non-linear implicit static general analysis method. Other analysis methods, such as explicit dynamic analysis and the Riks method, were also discussed in general terms of application. The aim is to demonstrate the ability and efficacy of FEA to simulate the ultimate load behaviour of slabs across different material properties and boundary conditions. The authors intended to present a numerical model that provides consistent predictions of the ultimate behaviour of laterally restrained slabs and that could be used as an alternative to expensive real-life testing, as well as for the design of new structures and the assessment of existing ones. The enhanced strength of laterally restrained slabs relative to the predictions of conventional design methods is believed to be due to compressive membrane action (CMA), an inherent phenomenon of laterally restrained concrete beams and slabs. The numerical predictions obtained from the developed model were in good agreement with the experimental results and with those obtained from the CMA method developed at Queen’s University Belfast, UK.

This paper employs a unique extension-decomposition-aggregation (EDA) scheme to solve the formation flight control problem for multiple unmanned aerial vehicles (UAVs). The corresponding decentralised longitudinal and lateral formation autopilots are designed to maintain overall formation stability when encountering changes in the formation error and topologies. The concept of the propagation layer number (PLN) is also proposed to provide an intuitive criterion for judging which type of formation topology is most suitable for minimising formation error propagation (FEP). The criterion states that the smaller the PLN of the formation, the quicker the response to the formation error; a smaller PLN also means that the resulting topology provides better prevention of FEP. Simulation studies of formation flight of multiple Aerosonde UAVs demonstrate that the designed formation controller based on the EDA strategy performs satisfactorily in maintaining overall formation stability, and the bidirectional partial-mesh topology is found to provide the best overall response to formation error propagation based on the PLN criterion.
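Assuming the PLN corresponds to the number of hops over which the formation error must propagate from the reference vehicle (a plausible reading, not necessarily the paper's exact definition), it can be computed as a breadth-first-search depth over the topology graph:

```python
from collections import deque

def propagation_layers(adj, leader):
    """Hops from the leader to the farthest vehicle (BFS depth),
    used here as a stand-in for the propagation layer number."""
    dist = {leader: 0}
    q = deque([leader])
    while q:
        v = q.popleft()
        for w in adj[v]:
            if w not in dist:
                dist[w] = dist[v] + 1
                q.append(w)
    return max(dist.values())

# hypothetical 5-UAV topologies: a chain versus a partial mesh
chain = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
mesh = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 4], 3: [1, 4], 4: [2, 3]}
```

The chain gives a depth of 4 while the partial mesh gives 2, consistent with the criterion that topologies with more cross-links propagate the formation error through fewer layers.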

Quantile normalization (QN) is a technique for microarray data processing and is the default normalization method in the Robust Multi-array Average (RMA) procedure, which was primarily designed for analysing gene expression data from Affymetrix arrays. Given the abundance of Affymetrix microarrays and the popularity of the RMA method, it is crucially important that the normalization procedure is applied appropriately. In this study we carried out simulation experiments and also analysed real microarray data to investigate the suitability of RMA when applied to datasets containing different groups of biological samples. Our experiments showed that RMA with QN does not preserve the biological signal within each group, but rather mixes the signals between the groups. We also showed that the Median Polish method in the summarization step of RMA has a similar mixing effect. RMA is one of the most widely used methods in microarray data processing and has been applied to a vast volume of data in biomedical research. The problematic behaviour of this method suggests that previous studies employing RMA could have been adversely affected, so we think it is crucially important that the research community recognizes the issue and starts to address it. The two core elements of the RMA method, quantile normalization and Median Polish, both have the undesirable effect of mixing biological signals between different sample groups, which can be detrimental to drawing valid biological conclusions and to any subsequent analyses. Based on the evidence presented here and in the literature, we recommend exercising caution when using RMA to process microarray gene expression data, particularly where there are likely to be unknown subgroups of samples.
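The distribution-forcing behaviour of QN, where every sample ends up with an identical value distribution regardless of which group it belongs to, can be sketched directly. This is plain quantile normalization applied to synthetic data, not the full RMA procedure:

```python
import numpy as np

def quantile_normalize(X):
    """Quantile-normalise the columns (samples) of X: every sample is
    mapped onto the same target distribution, the row-wise mean of
    the column-sorted values."""
    order = np.argsort(X, axis=0)
    ranks = np.argsort(order, axis=0)
    target = np.sort(X, axis=0).mean(axis=1)
    return target[ranks]

rng = np.random.default_rng(0)
# two hypothetical sample groups; group B has its second half of
# "genes" shifted upward, a group-specific distributional signal
A = rng.normal(0.0, 1.0, size=(1000, 5))
B = rng.normal(0.0, 1.0, size=(1000, 5))
B[500:] += 2.0
Xn = quantile_normalize(np.hstack([A, B]))
```

After normalization every column carries exactly the same set of values, so the distributional difference that distinguished group B is erased, which is the mixing effect the study warns about.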