902 results for Computational time


Relevance:

60.00%

Publisher:

Abstract:

A 3D model of a melt pool created by a moving arc-type heat source has been developed. The model solves the equations of turbulent fluid flow, heat transfer and electromagnetic field to demonstrate the flow behaviour and phase change in the pool. The coupled effects of buoyancy, capillary (Marangoni) and electromagnetic (Lorentz) forces are included within an unstructured finite volume mesh environment. The movement of the welding arc along the workpiece is accomplished via a moving co-ordinate system. Additionally, a method enabling movement of the weld pool surface by fluid convection is presented, whereby the mesh in the liquid region is allowed to move through a free surface. The surface grid lines move to restore equilibrium at the end of each computational time step, and interior grid points then adjust following the solution of a Laplace equation.
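The interior-point adjustment can be illustrated with a minimal sketch: a Jacobi relaxation of the discrete Laplace equation on a small structured grid, where boundary nodes stay fixed and each interior node relaxes toward the average of its neighbours. This is only illustrative; the paper works with unstructured finite-volume meshes.

```python
import numpy as np

def laplace_smooth_interior(grid, iters=500, tol=1e-10):
    """Relax interior grid-point coordinates by solving a discrete
    Laplace equation with Jacobi iteration; boundary points stay fixed.
    `grid` is an (ny, nx, 2) array of (x, y) node coordinates."""
    g = grid.astype(float).copy()
    for _ in range(iters):
        new = g.copy()
        # each interior node moves to the average of its four neighbours
        new[1:-1, 1:-1] = 0.25 * (g[:-2, 1:-1] + g[2:, 1:-1] +
                                  g[1:-1, :-2] + g[1:-1, 2:])
        converged = np.abs(new - g).max() < tol
        g = new
        if converged:
            break
    return g

# usage: a 5x5 grid with one displaced interior node relaxes back
x, y = np.meshgrid(np.linspace(0, 1, 5), np.linspace(0, 1, 5))
grid = np.stack([x, y], axis=-1)
grid[2, 2] += 0.1          # perturb the central interior node
smoothed = laplace_smooth_interior(grid)
```

With a regular fixed boundary, the harmonic solution is the undisturbed uniform grid, so the perturbed node returns to its original position.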

Relevance:

60.00%

Publisher:

Abstract:

Aircraft fuselages are complex assemblies of thousands of components, and as a result simulation models are highly idealised. In the typical design process, a coarse FE model is used to determine loads within the structure. The size of the model and the number of load cases necessitate that only linear static behaviour is considered. This paper reports on the development of a modelling approach to increase the accuracy of the global model, accounting for variations in stiffness due to non-linear structural behaviour. The strategy is based on representing a fuselage sub-section with a single non-linear element. Large portions of fuselage structure are represented by connecting these non-linear elements together to form a framework. The non-linear models are very efficient, reducing computational time significantly.

Relevance:

60.00%

Publisher:

Abstract:

In this paper, we present an investigation into using fuzzy methodologies to guide the construction of high quality feasible examination timetabling solutions. The provision of automated solutions to the examination timetabling problem is achieved through a combination of construction and improvement. The enhancement of solutions through the use of techniques such as metaheuristics is, in some cases, dependent on the quality of the solution obtained during the construction process. With a few notable exceptions, recent research has concentrated on the improvement of solutions as opposed to focusing on investigating the ‘best’ approaches to the construction phase. Addressing this issue, our approach is based on combining multiple criteria in deciding on how the construction phase should proceed. Fuzzy methods were used to combine three single construction heuristics into three different pairwise combinations of heuristics in order to guide the order in which exams were selected to be inserted into the timetable solution. In order to investigate the approach, we compared the performance of the various heuristic approaches with respect to a number of important criteria (overall cost penalty, number of skipped exams, number of iterations of a rescheduling procedure required and computational time) on twelve well-known benchmark problems. We demonstrate that the fuzzy combination of heuristics allows high quality solutions to be constructed. On one of the twelve problems we obtained lower penalty than any previously published constructive method and for all twelve we obtained lower penalty than when any of the single heuristics were used alone. Furthermore, we demonstrate that the fuzzy approach used less backtracking when constructing solutions than any of the single heuristics.
We conclude that this novel fuzzy approach is a highly effective method for heuristically constructing solutions and, as such, has particular relevance to real-world situations in which the construction of feasible solutions is often a difficult task in its own right.
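The core idea of ordering exams by a combination of two heuristic criteria can be sketched as follows. This toy example normalises two criteria (conflict degree and enrolment) to [0, 1] membership values and combines them with a simple mean; the paper's actual fuzzy rule-based aggregation is more sophisticated, and all names and data here are illustrative.

```python
def fuzzy_exam_order(exams, degree, enrolment):
    """Order exams for insertion by a combination of two construction
    heuristics: largest degree (number of conflicts) and largest
    enrolment. Each criterion is normalised to a [0, 1] membership
    value; the mean of the two memberships is an illustrative
    stand-in for the paper's fuzzy rule-based aggregation."""
    max_d = max(degree.values()) or 1
    max_e = max(enrolment.values()) or 1

    def weight(e):
        mu_d = degree[e] / max_d      # membership: 'has many conflicts'
        mu_e = enrolment[e] / max_e   # membership: 'has many students'
        return 0.5 * (mu_d + mu_e)

    # hardest-to-place exams (highest combined weight) come first
    return sorted(exams, key=weight, reverse=True)

exams = ["A", "B", "C"]
degree = {"A": 3, "B": 1, "C": 2}
enrolment = {"A": 10, "B": 40, "C": 30}
order = fuzzy_exam_order(exams, degree, enrolment)   # -> ['C', 'B', 'A']
```

Exam C scores highest because it is fairly high on both criteria, even though A has the most conflicts and B the most students.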

Relevance:

60.00%

Publisher:

Abstract:

A Design of Experiments (DoE) analysis was undertaken to generate a list of configurations for CFD numerical simulation of an aircraft crown compartment. Fitted regression models were built to predict the convective heat transfer coefficients of thermally sensitive dissipating elements located inside this compartment, namely the SEPDC and the Route G. Currently they are positioned close to the fuselage, and it is of interest to optimise the heat transfer for reliability and performance purposes. Their locations and the external fuselage surface temperature were selected as input variables for the DoE. The models fit the CFD data with values ranging from 0.878 to 0.978, and predict that the optimum locations in terms of heat transfer are when the elements are positioned as close to the crown floor as possible, where they come in direct contact with the air flow from the cabin ventilation system, and when they are positioned close to the centreline. The methodology employed allows aircraft thermal designers to optimise equipment placement in confined areas of an aircraft during the design phase. The determined models should be incorporated into global aircraft numerical models to improve accuracy and reduce model size and computational time. © 2012 Elsevier Masson SAS. All rights reserved.
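Fitting a regression model to DoE results can be sketched with ordinary least squares on a quadratic response surface. The two input variables and the synthetic data below are illustrative placeholders, not the paper's DoE design or CFD results.

```python
import numpy as np

def fit_doe_quadratic(X, y):
    """Fit a quadratic response surface
    h = b0 + b1*x1 + b2*x2 + b3*x1*x2 + b4*x1^2 + b5*x2^2
    to DoE observations by least squares."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef, A @ coef

# synthetic DoE: response depends on two coded position variables
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(20, 2))
y = 5 + 2 * X[:, 0] - 3 * X[:, 1] + 0.5 * X[:, 0] * X[:, 1]
coef, pred = fit_doe_quadratic(X, y)
```

Because the synthetic response lies exactly in the model space, the least-squares fit recovers the generating coefficients and reproduces the data.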

Relevance:

60.00%

Publisher:

Abstract:

This paper presents the finite element (FE) analysis of the consolidation of the foundation of an embankment constructed over a soft clay deposit which shows significant time-dependent behaviour and was improved with prefabricated vertical drains. To assess the capability of a simple elastic viscoplastic (EVP) model to predict the long term performance of a geotechnical structure constructed on soft soils, a well documented embankment (Leneghans) was analyzed to predict its long term behaviour characteristics. Two fully coupled two-dimensional (2D) plane strain FE analyses have been carried out. In one of these, the foundation of the embankment was modelled with a relatively simple time-dependent EVP model and in the other, for comparison purposes, the foundation soil was modelled with the elasto-plastic Modified Cam-clay (MCC) model. Details of the analyses and the results are discussed in comparison with the field performance. Predictions from the creep (EVP) model were found to be better than those from the elasto-plastic (MCC) analysis. However, the creep analysis requires an additional parameter and additional computational time and resources. © 2011 Taylor & Francis.

Relevance:

60.00%

Publisher:

Abstract:

This paper reports image analysis methods that have been developed to study the microstructural changes of non-wovens made by the hydroentanglement process. The validity of the image processing techniques has been ascertained by applying them to test images with known properties. The parameters in preprocessing of the scanning electron microscope (SEM) images used in image processing have been tested and optimized. The fibre orientation distribution is estimated using fast Fourier transform (FFT) and Hough transform (HT) methods. The results obtained using these two methods are in good agreement. The HT method is more demanding in computational time compared with the Fourier transform (FT) method. However, the advantage of the HT method is that the actual orientation of the lines can be concluded directly from the result of the transform without the need for any further computation. The distribution of the length of the straight fibre segments of the fabrics is evaluated by the HT method. The effect of curl of the fibres on the result of this evaluation is shown.
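The Hough transform's appeal for orientation analysis, noted above, is that the dominant angle can be read directly from the accumulator. A minimal sketch on a synthetic binary image (not the paper's SEM pipeline):

```python
import numpy as np

def hough_orientation(img, n_theta=180):
    """Estimate the dominant line orientation in a binary image with a
    basic Hough transform: each foreground pixel votes for (theta, rho)
    pairs via rho = x*cos(theta) + y*sin(theta); the strongest single
    peak in the accumulator gives the dominant normal angle."""
    ys, xs = np.nonzero(img)
    thetas = np.deg2rad(np.arange(n_theta))
    diag = int(np.ceil(np.hypot(*img.shape)))
    acc = np.zeros((n_theta, 2 * diag + 1), dtype=int)
    for t_idx, t in enumerate(thetas):
        rho = np.round(xs * np.cos(t) + ys * np.sin(t)).astype(int) + diag
        np.add.at(acc[t_idx], rho, 1)   # accumulate votes per rho bin
    t_best, _ = np.unravel_index(acc.argmax(), acc.shape)
    return t_best   # angle of the line's normal, in degrees

# usage: a horizontal row of pixels has a vertical normal (theta = 90 deg)
img = np.zeros((50, 50), dtype=bool)
img[25, 5:45] = True
theta = hough_orientation(img)
```

All pixels on the horizontal line vote for the same (theta = 90°, rho = 25) cell, producing an unambiguous peak; no inverse transform or further computation is needed to recover the angle.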

Relevance:

60.00%

Publisher:

Abstract:

Mineral exploration programmes around the world use data from remote sensing, geophysics and direct sampling. On a regional scale, the combination of airborne geophysics and ground-based geochemical sampling can aid geological mapping and economic minerals exploration. The fact that airborne geophysical and traditional soil-sampling data are generated at different spatial resolutions means that they are not immediately comparable due to their different sampling density. Several geostatistical techniques, including indicator cokriging and collocated cokriging, can be used to integrate different types of data into a geostatistical model. With increasing numbers of variables the inference of the cross-covariance model required for cokriging can be demanding in terms of effort and computational time. In this paper a Gaussian-based Bayesian updating approach is applied to integrate airborne radiometric data and ground-sampled geochemical soil data to maximise information generated from the soil survey, to enable more accurate geological interpretation for the exploration and development of natural resources. The Bayesian updating technique decomposes the collocated estimate into a product of two models: a prior model and a likelihood model. The prior model is built from primary information and the likelihood model is built from secondary information. The prior model is then updated with the likelihood model to build the final model. The approach allows multiple secondary variables to be simultaneously integrated into the mapping of the primary variable. The Bayesian updating approach is demonstrated using a case study from Northern Ireland where the history of mineral prospecting for precious and base metals dates from the 18th century. Vein-hosted, strata-bound and volcanogenic occurrences of mineralisation are found.
The geostatistical technique was used to improve the resolution of soil geochemistry, collected at one sample per 2 km², by integrating more closely measured airborne geophysical data from the GSNI Tellus Survey, measured over a footprint of 65 × 200 m. The directly measured geochemistry data were considered as primary data in the Bayesian approach and the airborne radiometric data were used as secondary data. The approach produced more detailed updated maps and in particular maximised information on mapped estimates of zinc, copper and lead. Greater delineation of an elongated northwest/southeast trending zone in the updated maps strengthened the potential to investigate strata-bound base metal deposits.
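The prior-times-likelihood decomposition described above reduces, in the Gaussian case, to a simple precision-weighted combination. A minimal single-location sketch (real applications work with spatially varying means and kriging variances):

```python
def bayesian_update(prior_mean, prior_var, lik_mean, lik_var):
    """Combine a Gaussian prior model (from the primary soil-geochemistry
    data) with a Gaussian likelihood model (from the secondary airborne
    radiometrics) by multiplying the two densities: precisions (inverse
    variances) add, and the updated mean is the precision-weighted mean."""
    post_prec = 1.0 / prior_var + 1.0 / lik_var
    post_var = 1.0 / post_prec
    post_mean = post_var * (prior_mean / prior_var + lik_mean / lik_var)
    return post_mean, post_var

m, v = bayesian_update(prior_mean=2.0, prior_var=1.0,
                       lik_mean=4.0, lik_var=1.0)
# with equal variances the updated mean is midway and the variance halves
```

The updated variance is always smaller than either input variance, which is why integrating the secondary radiometric data sharpens the mapped estimates.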

Relevance:

60.00%

Publisher:

Abstract:

This work proposes a novel approach to compute transonic Limit Cycle Oscillations using high-fidelity analysis. CFD-based Harmonic Balance methods have proven to be efficient tools to predict periodic phenomena. This paper’s contribution is to present a new methodology to determine the unknown frequency of oscillations, enabling HB methods to accurately capture Limit Cycle Oscillations (LCOs); this is achieved by defining a frequency-updating procedure based on a coupled CFD/CSD Harmonic Balance formulation to find the LCO condition. A pitch/plunge aerofoil and a delta wing, with their respective linear structural models, are used to validate the new method against conventional time-domain simulations. Results show consistent agreement between the proposed and time-marching methods for both LCO amplitude and frequency, while producing at least one order of magnitude reduction in computational time.

Relevance:

60.00%

Publisher:

Abstract:

We present TANC, a TAN classifier (tree-augmented naive) based on imprecise probabilities. TANC models prior near-ignorance via the Extreme Imprecise Dirichlet Model (EDM). A first contribution of this paper is the experimental comparison between EDM and the global Imprecise Dirichlet Model using the naive credal classifier (NCC), with the aim of showing that EDM is a sensible approximation of the global IDM. TANC is able to deal with missing data in a conservative manner by considering all possible completions (without assuming them to be missing-at-random), but avoiding an exponential increase of the computational time. By experiments on real data sets, we show that TANC is more reliable than the Bayesian TAN and that it provides better performance compared to previous TANs based on imprecise probabilities. Yet, TANC is sometimes outperformed by NCC because the learned TAN structures are too complex; this calls for novel algorithms for learning the TAN structures, better suited for an imprecise probability classifier.

Relevance:

60.00%

Publisher:

Abstract:

This work proposes a novel approach to compute transonic limit-cycle oscillations using high-fidelity analysis. Computational-Fluid-Dynamics based harmonic balance methods have proven to be efficient tools to predict periodic phenomena. This paper’s contribution is to present a new methodology to determine the unknown frequency of oscillations, enabling harmonic balance methods to accurately capture limit-cycle oscillations; this is achieved by defining a frequency-updating procedure based on a coupled computational-fluid-dynamics/computational-structural-dynamics harmonic balance formulation to find the limit-cycle oscillation condition. A pitch/plunge airfoil and delta wing aerodynamic and respective linear structural models are used to validate the new method against conventional time-domain simulations. Results show consistent agreement between the proposed and time-marching methods for both limit-cycle oscillation amplitude and frequency while producing at least a one-order-of-magnitude reduction in computational time.
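The idea of treating the unknown frequency as an extra harmonic-balance unknown can be sketched on a single-degree-of-freedom system. The toy below applies a one-harmonic balance to the Van der Pol oscillator and updates amplitude and frequency together with a Newton loop; it is an illustrative analogue of the coupled frequency-updating procedure, not the paper's CFD/CSD solver.

```python
import numpy as np

def hb_lco_vdp(mu=0.5, iters=20):
    """Single-harmonic harmonic-balance solution of the Van der Pol
    oscillator x'' - mu*(1 - x^2)*x' + x = 0 with ansatz x = A*cos(w*t).
    The residual is projected onto the first cos/sin harmonics and both
    projections are driven to zero by Newton iterations that update the
    amplitude A and the frequency w simultaneously."""
    th = np.linspace(0.0, 2 * np.pi, 256, endpoint=False)

    def residual(u):
        A, w = u
        x = A * np.cos(th)
        xd = -A * w * np.sin(th)
        xdd = -A * w * w * np.cos(th)
        r = xdd - mu * (1.0 - x**2) * xd + x
        # Galerkin projection of the residual onto cos(th) and sin(th)
        return np.array([np.mean(r * np.cos(th)), np.mean(r * np.sin(th))])

    u = np.array([1.8, 1.05])            # initial guess for [A, w]
    for _ in range(iters):
        f = residual(u)
        J = np.empty((2, 2))             # finite-difference Jacobian
        for j in range(2):
            du = u.copy()
            du[j] += 1e-7
            J[:, j] = (residual(du) - f) / 1e-7
        u = u - np.linalg.solve(J, f)    # coupled amplitude/frequency update
    return u

A, w = hb_lco_vdp()
```

For the Van der Pol equation the one-harmonic balance has the closed-form solution A = 2, w = 1, which the coupled Newton update recovers.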

Relevance:

60.00%

Publisher:

Abstract:

This work presents a new general purpose classifier named Averaged Extended Tree Augmented Naive Bayes (AETAN), which is based on combining the advantageous characteristics of Extended Tree Augmented Naive Bayes (ETAN) and Averaged One-Dependence Estimator (AODE) classifiers. We describe the main properties of the approach and algorithms for learning it, along with an analysis of its computational time complexity. Empirical results with numerous data sets indicate that the new approach is superior to ETAN and AODE in terms of both zero-one classification accuracy and log loss. It also compares favourably against weighted AODE and hidden Naive Bayes. The learning phase of the new approach is slower than that of its competitors, while the time complexity for the testing phase is similar. Such characteristics suggest that the new classifier is ideal in scenarios where online learning is not required.

Relevance:

60.00%

Publisher:

Abstract:

The hub location problem is an NP-hard problem that frequently arises in the design of transportation and distribution systems, postal delivery networks, and airline passenger flow. This work focuses on the Single Allocation Hub Location Problem (SAHLP). Genetic Algorithms (GAs) for the capacitated and uncapacitated variants of the SAHLP, based on new chromosome representations and crossover operators, are explored. The GAs are tested on two well-known sets of real-world problems with up to 200 nodes. The obtained results are very promising. For most of the test problems the GA obtains improved or best-known solutions and the computational time remains low. The proposed GAs can easily be extended to other variants of location problems arising in network design planning in transportation systems.
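A common SAHLP chromosome encodes, for each node, the hub it is allocated to; a hub is a node allocated to itself. The sketch below shows a one-point crossover with a simple repair step; the representation and repair rule are illustrative, not the paper's exact operators.

```python
def one_point_crossover(p1, p2, point):
    """SAHLP chromosome: position i stores the hub node assigned to
    node i, so a valid solution satisfies child[h] == h for every
    referenced hub h. One-point crossover swaps the assignment tails
    of two parents, then repairs the child: any node pointing at a hub
    that is closed in the child is promoted to a hub of its own."""
    child = p1[:point] + p2[point:]
    for i, h in enumerate(child):
        if child[h] != h:      # referenced hub is no longer open
            child[i] = i       # repair: make node i its own hub
    return child

p1 = [0, 0, 2, 2, 2]   # parent 1: hubs {0, 2}
p2 = [1, 1, 1, 4, 4]   # parent 2: hubs {1, 4}
child = one_point_crossover(p1, p2, 2)
```

Here the raw crossover leaves node 2 pointing at hub 1, which is closed in the child, so the repair opens node 2 as a hub, giving a feasible allocation.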

Relevance:

60.00%

Publisher:

Abstract:

This work investigates mathematical details and computational aspects of Metropolis-Hastings reptation quantum Monte Carlo and its variants, in addition to the Bounce method and its variants. The issues that concern us include the sensitivity of these algorithms' target densities to the position of the trial electron density along the reptile, the time-reversal symmetry of the propagators, and the length of the reptile. We calculate the ground-state energy and one-electron properties of LiH at its equilibrium geometry for all these algorithms. The importance sampling is performed with a single-determinant, large Slater-type-orbital (STO) basis set. The computer codes were written to exploit the efficiencies engineered into modern, high-performance computing software. The use of the Bounce method in the calculation of non-energy-related properties, those represented by operators that do not commute with the Hamiltonian, is novel. We found that the unmodified Bounce method gives a good ground-state energy and very good one-electron properties. We attribute this to the favourable time-reversal symmetry in its target density's Green's functions. Breaking this symmetry gives poorer results. Use of a short reptile in the Bounce method does not alter the quality of the results. This suggests that in future applications one can use a shorter reptile to cut down the computational time dramatically.

Relevance:

60.00%

Publisher:

Abstract:

The Vehicle Routing Problem (VRP) is a key tool for managing logistics systems efficiently, which can lead to improved customer satisfaction by serving more customers in a shorter time. In general terms, it involves planning the routes of a fleet of vehicles of given capacity based at one or more depots. The goal is to deliver or collect a certain quantity of goods to a set of geographically dispersed customers while respecting the capacity constraints of the vehicles. The VRP, as a class of discrete optimisation problems of great complexity, has been studied by many researchers over recent decades. Given its practical importance, researchers in the fields of computer science, operational research and industrial engineering have developed highly efficient algorithms, of exact or heuristic nature, to deal with different types of VRP. However, the approaches proposed for the VRP have often been criticised for being too focused on simplistic versions of the vehicle routing problems encountered in real applications. Consequently, researchers have recently turned to variants of the VRP that were previously considered too difficult to solve. These variants include the complex attributes and constraints observed in real cases and provide solutions that are executable in practice. These extensions of the VRP are called Multi-Attribute Vehicle Routing Problems (MAVRP). The main goal of this thesis is to study the various practical aspects of three types of multi-attribute vehicle routing problems, which are modelled herein.
In addition, since for the VRP, as for most NP-complete problems, it is difficult to solve large instances optimally within a reasonable run time, we turn to approximate, heuristic-based methods.

Relevance:

60.00%

Publisher:

Abstract:

In radiotherapy, computed tomography (CT) provides the anatomical information about the patient used for dose calculation during treatment planning. To account for the heterogeneous composition of tissues, computational techniques such as the Monte Carlo method are needed to calculate the dose accurately. Importing CT images into such a calculation requires that each voxel, expressed in Hounsfield units (HU), be converted into a physical value such as electron density (ED). This conversion is usually performed using an HU-ED calibration curve. An anomaly or artefact appearing in a CT image before calibration is liable to assign the wrong tissue to a voxel. These errors can cause a crucial loss of reliability in the dose calculation. This work aims to assign accurate values to the voxels of CT images in order to ensure the reliability of dose calculations during radiotherapy treatment planning. To this end, a study is carried out on artefacts reproduced by Monte Carlo simulation. To reduce the computational time, the simulations are parallelised and run on a supercomputer. A sensitivity study of the HU numbers in the presence of artefacts is then performed through a statistical analysis of the histograms. As the source of many artefacts, beam hardening is studied further. A review of the state of the art in beam-hardening correction is presented, followed by an explicit demonstration of an empirical correction.
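The HU-to-ED conversion step can be sketched as a piecewise-linear lookup on the calibration curve. The anchor points below (air, water, dense bone) are illustrative placeholders, not a clinical calibration.

```python
import numpy as np

def hu_to_ed(hu, hu_points, ed_points):
    """Convert CT voxel values in Hounsfield units (HU) to relative
    electron density (ED) by piecewise-linear interpolation of an
    HU-ED calibration curve (calibration points sorted by HU)."""
    return np.interp(hu, hu_points, ed_points)

# hypothetical two-segment calibration through air/water/bone anchors
hu_pts = [-1000.0, 0.0, 1500.0]
ed_pts = [0.001, 1.0, 1.85]
ed = hu_to_ed(np.array([-1000.0, 0.0, 750.0]), hu_pts, ed_pts)
```

A voxel at 750 HU falls halfway along the water-to-bone segment, so its ED is interpolated between 1.0 and 1.85; an artefact that shifts the HU value before this lookup shifts the assigned density, which is exactly the failure mode the abstract describes.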