8 results for Multiple Objective Optimization


Relevance: 80.00%

Abstract:

This paper presents a numerical study of a linear compressor cascade to investigate effective end wall profiling rules for highly loaded axial compressors. The first step of the research applies a correlation analysis to the different flow field parameters, using data mining over 600 profiling samples to quantify how variations of loss, secondary flow and passage vortex interact with each other under the influence of a profiled end wall. The result identifies the dominant role of corner separation in the control of total pressure loss, yielding the principle that only in a flow field with serious corner separation does the profiled end wall change total pressure loss, secondary flow and passage vortex in the same direction. In the second step, a multi-objective optimization of a profiled end wall is performed to reduce loss at the design point and near the stall point. The development of effective end wall profiling rules is based on the manner of secondary flow control rather than on the geometric features of the end wall. Using the optimum end wall cases from the Pareto front, a quantitative tool for analyzing secondary flow control is employed. The driving forces induced by a profiled end wall on different regions of the end wall flow are analyzed in detail and identified for their positive/negative influence in relieving corner separation, from which the effective profiling rules are further confirmed. It is found that the profiling rules for a cascade show distinct differences at the design point and near the stall point; thus loss control at different operating points is generally independent.
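The optimisation in the second step selects candidate end wall designs from a Pareto front over the two loss objectives. As a minimal sketch of that concept (not the authors' optimiser, and with made-up sample values), a non-dominated filter over two-objective samples can be written as:

```python
def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors.

    Assumes minimisation of every objective. Generic illustration only;
    the profiling samples and optimiser in the paper are not reproduced.
    """
    front = []
    for p in points:
        # p is dominated if some other point is <= in every objective
        # (and is not the same point).
        dominated = any(
            q != p and all(q[i] <= p[i] for i in range(len(p)))
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

# Two objectives: loss at the design point, loss near the stall point.
samples = [(1.0, 3.0), (2.0, 2.0), (3.0, 1.0), (2.5, 2.5)]
print(pareto_front(samples))  # the first three are mutually non-dominated
```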

Relevance: 40.00%

Abstract:

This paper describes an implementation of a method capable of integrating parametric, feature-based CAD models built in commercial software (CATIA) with the SU2 software framework. To exploit the adjoint-based methods for aerodynamic optimisation within SU2, a formulation to obtain geometric sensitivities directly from the commercial CAD parameterisation is introduced, enabling the calculation of gradients with respect to CAD-based design variables. To assess the accuracy and efficiency of the alternative approach, two aerodynamic optimisation problems are investigated: an inviscid 3D problem with multiple constraints, and a viscous 2D high-lift aerofoil problem without any constraints. Initial results show the new parameterisation obtaining reliable optima, with levels of performance similar to the software's native parameterisations. In the final paper, details of computing CAD sensitivities will be provided, including their accuracy and the linking of geometric sensitivities to aerodynamic objective functions and constraints; the impact on the robustness of the overall method will be assessed and alternative parameterisations will be included.

Relevance: 30.00%

Abstract:

This study investigates topology optimization of energy-absorbing structures in which material damage is accounted for in the optimization process. The optimization objective is to design the lightest structures that are able to absorb the required mechanical energy. A structural continuity constraint check is introduced that detects when no feasible load path remains in the finite element model, usually as a result of large-scale fracture. This ensures that designs do not fail when loaded under the conditions prescribed in the design requirements. The continuity constraint check is automated and requires no intervention from the analyst once the optimization process is initiated. Consequently, the optimization algorithm proceeds towards evolving an energy-absorbing structure with the minimum structural mass that is not susceptible to global structural failure. A method is also introduced to determine when the optimization process should halt. The method identifies when the optimization has plateaued and is no longer likely to provide improved designs if continued for further iterations. This gives the designer a rational way to determine the necessary time to run the optimization and avoid wasting computational resources on unnecessary iterations. A case study is presented to demonstrate the use of this method.
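The continuity constraint check described above amounts to asking whether the surviving (unfractured) elements still connect the loaded region to the supports. A minimal sketch of such a check, assuming a simple node-sharing adjacency and hypothetical element/node numbering rather than the authors' finite element implementation:

```python
from collections import deque

def load_path_exists(elements, load_nodes, support_nodes):
    """Breadth-first search over intact finite elements: returns True if
    at least one load path connects a loaded node to a supported node.

    `elements` is an iterable of node-index tuples for elements that have
    NOT failed; fractured elements are simply omitted from the list.
    """
    # Build node adjacency from the surviving elements.
    adj = {}
    for elem in elements:
        for a in elem:
            for b in elem:
                if a != b:
                    adj.setdefault(a, set()).add(b)
    seen, queue = set(load_nodes), deque(load_nodes)
    while queue:
        n = queue.popleft()
        if n in support_nodes:
            return True
        for m in adj.get(n, ()):
            if m not in seen:
                seen.add(m)
                queue.append(m)
    return False

# Quad elements 0-1-2-3 and 2-3-4-5; load at node 0, support at node 5.
print(load_path_exists([(0, 1, 2, 3), (2, 3, 4, 5)], {0}, {5}))  # True
print(load_path_exists([(0, 1, 2, 3)], {0}, {5}))                # False
```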

Relevance: 30.00%

Abstract:

Adjoint methods have proven to be an efficient way of calculating the gradient of an objective function with respect to a shape parameter for optimisation, with a computational cost nearly independent of the number of design variables [1]. The approach in this paper links the adjoint surface sensitivities (the gradient of the objective function with respect to surface movement) with the parametric design velocities (the movement of the surface due to a CAD parameter perturbation) in order to compute the gradient of the objective function with respect to CAD variables.
For a successful implementation of shape optimization strategies in practical industrial cases, the choice of design variables, or the parameterisation scheme used for the model to be optimized, plays a vital role. Where the goal is to base the optimization on a CAD model, the choices are to use a NURBS geometry generated from CAD modelling software, where the positions of the NURBS control points are the optimisation variables [2], or to use the feature-based CAD model with all of its construction history, preserving the design intent [3]. The main advantage of using the feature-based model is that the optimized model produced can be used directly for downstream applications, including manufacturing and process planning.
This paper presents an approach for optimization based on the feature-based CAD model, which uses the CAD parameters defining the features in the model geometry as the design variables. In order to capture the CAD surface movement with respect to a change in a design variable, the "Parametric Design Velocity" is calculated, defined as the movement of the CAD model boundary in the normal direction due to a change in the parameter value.
The approach presented here for calculating the design velocities represents an advancement in capability and robustness over that described by Robinson et al. [3]. The process can be easily integrated into most industrial optimisation workflows and is immune to the topology and labelling issues highlighted by other CAD-based optimisation processes. It considers every continuous ("real value") parameter type as an optimisation variable, and it can be adapted to work with any CAD modelling software, as long as it has an API which provides access to the values of the parameters which control the model shape and allows the model geometry to be exported. To calculate the movement of the boundary, the methodology employs finite differences on the shape of the 3D CAD model before and after the parameter perturbation. The implementation procedure involves calculating the geometric movement along a normal direction between two discrete representations of the original and perturbed geometries. Parametric design velocities can then be directly linked with adjoint surface sensitivities to extract the gradients used in a gradient-based optimization algorithm.
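The finite-difference design velocity described above can be sketched as follows, under the simplifying assumption that the original and perturbed discretisations are point-wise matched (the actual process pairs two independent discrete representations of the geometry):

```python
import numpy as np

def design_velocity(pts_orig, pts_pert, normals, d_param):
    """Finite-difference parametric design velocity at each surface point:
    the normal component of boundary movement per unit parameter change.

    Assumes the two discretisations are point-wise matched; illustrative
    sketch only, not the production implementation.
    """
    pts_orig = np.asarray(pts_orig, float)
    pts_pert = np.asarray(pts_pert, float)
    normals = np.asarray(normals, float)
    displacement = pts_pert - pts_orig                 # boundary movement
    return (displacement * normals).sum(axis=1) / d_param  # normal part / dp

# A flat patch whose points move 0.02 along the surface normal (0, 0, 1)
# when the controlling CAD parameter is perturbed by 0.1.
orig = [[0, 0, 0], [1, 0, 0], [0, 1, 0]]
pert = [[0, 0, 0.02], [1, 0, 0.02], [0, 1, 0.02]]
n = [[0, 0, 1]] * 3
print(design_velocity(orig, pert, n, 0.1))  # [0.2 0.2 0.2]
```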
A flow optimisation problem is presented, in which the power dissipation of the flow in an automotive air duct is to be reduced by changing the parameters of the CAD geometry created in CATIA V5. The flow sensitivities are computed with the continuous adjoint method for laminar and turbulent flow [4] and are combined with the parametric design velocities to compute the cost function gradients. A line-search algorithm is then used to update the design variables and proceed further with the optimisation process.
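The final gradient assembly, linking adjoint surface sensitivities with parametric design velocities, reduces to a discrete surface integral. A sketch with a simple area-weighted quadrature (function names and weighting are illustrative, not the paper's exact scheme):

```python
import numpy as np

def cad_gradient(adjoint_sens, design_vel, areas):
    """Discrete surface integral linking adjoint surface sensitivities
    (dJ per unit normal surface movement) with parametric design
    velocities to give dJ/d(parameter). Simple area weighting; a sketch
    of the chain rule, not the paper's quadrature.
    """
    return float(np.sum(np.asarray(adjoint_sens)
                        * np.asarray(design_vel)
                        * np.asarray(areas)))

# Two surface facets: sensitivity * design velocity * facet area, summed:
# 1.5*0.2*2.0 + (-0.5)*0.4*1.0 = 0.4
g = cad_gradient([1.5, -0.5], [0.2, 0.4], [2.0, 1.0])
print(g)
```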

Relevance: 30.00%

Abstract:

BACKGROUND AND OBJECTIVE: Molecular analysis by PCR of monoclonally rearranged immunoglobulin (Ig) genes can be used for diagnosis in B-cell lymphoproliferative disorders (LPD), as well as for monitoring minimal residual disease (MRD) after treatment. This technique carries a risk of false-positive results due to the "background" amplification of similar rearrangements derived from polyclonal B-cells. This problem can be resolved in advance by additional analyses that discern between polyclonal and monoclonal PCR products, such as heteroduplex analysis. A second problem is that PCR frequently fails to amplify the junction regions, mainly due to the somatic mutations frequently present in mature (post-follicular) B-cell lymphoproliferations. The use of additional targets (e.g. Ig light chain genes) can avoid this problem. DESIGN AND METHODS: We studied the specificity of heteroduplex PCR analysis of several Ig junction regions to detect monoclonal products in samples from 84 multiple myeloma (MM) patients and 24 patients with polyclonal B-cell disorders. RESULTS: Using two distinct VH consensus primers (FR3 and FR2) in combination with one JH primer, 79% of the MM cases displayed monoclonal products. The percentage of positive cases was increased by amplification of the Vlambda-Jlambda junction regions or kappa deleting element (Kde) rearrangements, using two or five pairs of consensus primers, respectively. After including these targets in the heteroduplex PCR analysis, 93% of MM cases displayed monoclonal products. None of the polyclonal samples analyzed resulted in monoclonal products. Dilution experiments showed that monoclonal rearrangements could be detected with a sensitivity of at least 10^-2 in a background of >30% polyclonal B-cells, the sensitivity increasing up to 10^-3 when the polyclonal background was

Relevance: 30.00%

Abstract:

Large-scale multiple-input multiple-output (MIMO) communication systems can bring substantial improvements in spectral efficiency and/or energy efficiency, owing to their excess degrees of freedom and huge array gain. However, large-scale MIMO is expected to be deployed with lower-cost radio frequency (RF) components, which are particularly prone to hardware impairments. Unfortunately, compensation schemes cannot remove the impact of hardware impairments completely, so a certain amount of residual impairment always remains. In this paper, we investigate the impact of residual transmit RF impairments (RTRI) on the spectral and energy efficiency of training-based point-to-point large-scale MIMO systems, and seek to determine the optimal training length and number of antennas that maximize the energy efficiency. We derive deterministic equivalents of the signal-to-interference-and-noise ratio (SINR) with zero-forcing (ZF) receivers, as well as the corresponding spectral and energy efficiency, which are shown to be accurate even for a small number of antennas. Through an iterative sequential optimization, we find that the optimal training length of systems with RTRI can be smaller than that of ideal hardware systems in the moderate SNR regime, but larger in the high SNR regime. Moreover, it is observed that RTRI can significantly decrease the optimal number of transmit and receive antennas.
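The iterative sequential optimization mentioned above optimises one design quantity at a time while holding the others fixed. A generic sketch of that procedure with a toy concave objective (the paper's actual energy-efficiency expression is not reproduced here):

```python
def coordinate_ascent(f, x0, grids, iters=10):
    """Iterative sequential (coordinate-wise) optimisation over discrete
    grids: maximise one variable at a time, holding the others fixed.
    Generic procedure sketch with a stand-in objective below.
    """
    x = list(x0)
    for _ in range(iters):
        for i, grid in enumerate(grids):
            # Best value of coordinate i with the other coordinates fixed.
            x[i] = max(grid, key=lambda v: f(*(x[:i] + [v] + x[i + 1:])))
    return x, f(*x)

# Toy concave stand-in for energy efficiency as a function of training
# length T and antenna count N (illustrative numbers only).
ee = lambda T, N: -(T - 12) ** 2 - (N - 50) ** 2
x, val = coordinate_ascent(ee, [1, 1], [range(1, 101), range(1, 201)])
print(x, val)  # [12, 50] 0
```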

Relevance: 30.00%

Abstract:

There has been increasing interest in the development of new methods using Pareto optimality to deal with multi-objective criteria (for example, accuracy and time complexity). Once one has developed an approach to a problem of interest, the question is then how to compare it with the state of the art. In machine learning, algorithms are typically evaluated by comparing their performance on different data sets by means of statistical tests. The standard tests used for this purpose can neither consider multiple performance measures jointly nor compare multiple competitors at once. The aim of this paper is to resolve these issues by developing statistical procedures that account for multiple competing measures at the same time and compare multiple algorithms altogether. In particular, we develop two tests: a frequentist procedure based on the generalized likelihood-ratio test and a Bayesian procedure based on a multinomial-Dirichlet conjugate model. We further extend them by discovering conditional independences among measures to reduce the number of parameters of such models, as the number of studied cases is usually very small in such comparisons. Data from a comparison among general-purpose classifiers is used to show a practical application of our tests.
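The Bayesian procedure rests on a multinomial-Dirichlet conjugate model over joint outcome categories. A minimal sketch with invented counts (three categories for two algorithms compared on two measures), not the paper's data or full test:

```python
import numpy as np

rng = np.random.default_rng(0)

# Counts over 20 data sets of joint outcomes for algorithms A vs B on two
# measures (e.g. accuracy and runtime): A better on both / mixed result /
# B better on both. Illustrative numbers only.
counts = np.array([11, 6, 3])

# Multinomial-Dirichlet conjugate model: a uniform Dirichlet(1, 1, 1)
# prior gives the posterior Dirichlet(counts + 1). Draw posterior samples
# of the category probabilities.
theta = rng.dirichlet(counts + 1, size=100_000)

# Posterior probability that "A better on both" is the most likely
# outcome category.
p = np.mean(theta.argmax(axis=1) == 0)
print(round(p, 3))
```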

Relevance: 30.00%

Abstract:

OBJECTIVE: To determine risk of Down syndrome (DS) in multiple relative to singleton pregnancies, and compare prenatal diagnosis rates and pregnancy outcome.

DESIGN: Population-based prevalence study based on EUROCAT congenital anomaly registries.

SETTING: Eight European countries.

POPULATION: 14.8 million births 1990-2009; 2.89% multiple births.

METHODS: DS cases included livebirths, fetal deaths from 20 weeks, and terminations of pregnancy for fetal anomaly (TOPFA). Zygosity was inferred from like/unlike sex for birth denominators, and from concordance for DS cases.

MAIN OUTCOME MEASURES: Relative risk (RR) of DS per fetus/baby from multiple versus singleton pregnancies, and per pregnancy in monozygotic/dizygotic versus singleton pregnancies. Proportion of cases prenatally diagnosed, and pregnancy outcome.

STATISTICAL ANALYSIS: Poisson and logistic regression stratified for maternal age, country and time.

RESULTS: Overall, the adjusted (adj) RR of DS for fetuses/babies from multiple versus singleton pregnancies was 0.58 (95% CI 0.53-0.62), similar for all maternal ages except for mothers over 44, for whom it was considerably lower. In 8.7% of twin pairs affected by DS, both co-twins were diagnosed with the condition. The adjRR of DS for monozygotic versus singleton pregnancies was 0.34 (95% CI 0.25-0.44), and for dizygotic versus singleton pregnancies 1.34 (95% CI 1.23-1.46). DS fetuses from multiple births were less likely to be prenatally diagnosed than singletons (adjOR 0.62, 95% CI 0.50-0.78) and, following diagnosis, less likely to result in TOPFA (adjOR 0.40, 95% CI 0.27-0.59).

CONCLUSIONS: The risk of DS per fetus/baby is lower in multiple than singleton pregnancies. These estimates can be used for genetic counselling and prenatal screening.