30 results for Multi-Objective Optimization
Abstract:
There has been increasing interest in the development of new methods using Pareto optimality to deal with multi-objective criteria (for example, accuracy and time complexity). Once one has developed an approach to a problem of interest, the question becomes how to compare it with the state of the art. In machine learning, algorithms are typically evaluated by comparing their performance on different data sets by means of statistical tests. The standard tests used for this purpose can jointly consider neither multiple performance measures nor multiple competitors at once. The aim of this paper is to resolve these issues by developing statistical procedures that account for multiple competing measures at the same time and compare multiple algorithms altogether. In particular, we develop two tests: a frequentist procedure based on the generalized likelihood-ratio test and a Bayesian procedure based on a multinomial-Dirichlet conjugate model. We further extend them by discovering conditional independences among measures to reduce the number of parameters of such models, as the number of studied cases is usually very small in such comparisons. Data from a comparison among general-purpose classifiers is used to show a practical application of our tests.
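As a concrete illustration of the Bayesian side of such comparisons, the following is a minimal sketch of a multinomial-Dirichlet conjugate model over win/tie/loss counts for two algorithms across data sets. The counts, uniform prior, and Monte Carlo sample size are illustrative assumptions, not details taken from the paper:

```python
import random

def dirichlet_sample(alphas, rng):
    """Draw one sample from a Dirichlet distribution by normalizing Gamma draws."""
    draws = [rng.gammavariate(a, 1.0) for a in alphas]
    total = sum(draws)
    return [d / total for d in draws]

def prob_a_beats_b(wins_a, ties, wins_b, prior=1.0, n_samples=20000, seed=0):
    """Posterior probability that algorithm A's win rate exceeds B's,
    under a multinomial likelihood with a conjugate Dirichlet prior
    on the (A wins, tie, B wins) outcome probabilities."""
    rng = random.Random(seed)
    alphas = [wins_a + prior, ties + prior, wins_b + prior]  # conjugate update
    hits = 0
    for _ in range(n_samples):
        p = dirichlet_sample(alphas, rng)
        if p[0] > p[2]:
            hits += 1
    return hits / n_samples
```

For instance, if A beats B on 15 data sets, ties on 3, and loses on 2, the posterior probability that A is the better algorithm is close to 1, whereas a symmetric 5-5 record yields a probability near 0.5.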
Abstract:
Traditionally, the optimization of a turbomachinery engine casing for tip clearance has involved either two-dimensional transient thermomechanical simulations or three-dimensional mechanical simulations. This paper illustrates that three-dimensional transient whole-engine thermomechanical simulations can be used within tip clearance optimizations and that the efficiency of such optimizations can be improved when a multifidelity surrogate modeling approach is employed. These simulations are employed in conjunction with a rotor suboptimization using surrogate models of rotor-dynamics performance, stress, mass and transient displacements, and an engine parameterization.
Abstract:
Background: SPARCLE is a cross-sectional survey in nine European regions, examining the relationship of the environment of children with cerebral palsy to their participation and quality of life. The objective of this report is to assess data quality, in particular heterogeneity between regions, family and item non-response, and potential for bias. Methods: 1,174 children aged 8–12 years were selected from eight population-based registers of children with cerebral palsy; one further centre recruited 75 children from multiple sources. Families were visited by trained researchers who administered psychometric questionnaires. Logistic regression was used to assess factors related to family non-response and self-completion of questionnaires by children. Results: 431/1,174 (37%) families identified from registers did not respond: 146 (12%) were not traced; of the 1,028 traced families, 250 (24%) declined to participate and 35 (3%) were not approached. Families whose disabled children could walk unaided were more likely to decline to participate. 818 children entered the study, of whom 500 (61%) self-reported their quality of life; children with low IQ, seizures or inability to walk were less likely to self-report. There was substantial heterogeneity between regions in response rates and socio-demographic characteristics of families but not in age or gender of children. Item non-response was 2% for children and ranged from 0.4% to 5% for questionnaires completed by parents. Conclusion: While the proportion of untraced families was higher than in similar surveys, the refusal rate was comparable. To reduce bias, all analyses should allow for region, walking ability, age and socio-demographic characteristics. The 75 children in the region without a population-based register are unlikely to introduce bias.
Abstract:
A multi-band antenna consisting of a microstrip patch with two U-slots is designed and tested for use in aircraft cabin wireless access points. The objective of this paper is to evaluate this antenna, which covers most of the current wireless bands from 1.7 GHz to 5.85 GHz. A specially designed wideband probe antenna is used for characterization of the field radiated from this antenna. This measurement setup allows for future investigations such as the effect of human presence in the cabin, fading effects, and the path loss between transmitter and receiver.
Abstract:
This paper investigates the construction of linear-in-the-parameters (LITP) models for multi-output regression problems. Most existing stepwise forward algorithms choose the regressor terms one by one, each time maximizing the model error reduction ratio. The drawback is that such procedures cannot guarantee a sparse model, especially under highly noisy learning conditions. The main objective of this paper is to improve the sparsity and generalization capability of a model for multi-output regression problems, while reducing the computational complexity. This is achieved by proposing a novel multi-output two-stage locally regularized model construction (MTLRMC) method using the extreme learning machine (ELM). In this new algorithm, the nonlinear parameters in each term, such as the width of the Gaussian function and the power of a polynomial term, are first determined by the ELM. An initial multi-output LITP model is then generated according to the termination criteria in the first stage. In the second stage, the significance of each selected regressor is checked and insignificant ones are replaced. The proposed method can produce an optimized compact model by using the regularized parameters. Further, to reduce the computational complexity, a proper regression context is used to allow fast implementation of the proposed method. Simulation results confirm the effectiveness of the proposed technique. © 2013 Elsevier B.V.
Abstract:
BACKGROUND AND OBJECTIVE: Human research ethics committees provide essential review of research projects to ensure the ethical conduct of human research. Several recent reports have highlighted a complex process for successful application for human research ethics committee approval, particularly for multi-centre studies. Limited resources are available for the execution of human clinical research in Australia and around the world.
METHODS: This report overviews the process of ethics approval for a National Health and Medical Research Council-funded multi-centre study in Australia, focussing on the time and resource implications of such applications in 2007 and 2008.
RESULTS: Applications were submitted to 16 hospital and two university human research ethics committees. The total time to gain final approval from each committee ranged between 13 and 77 days (median = 46 days); the entire process took 16 months to complete and the research officer's time was estimated to cost $A34 143.
CONCLUSIONS: Obstacles to timely human research ethics committee approval are reviewed, including recent, planned and potential initiatives that could improve the ethics approval of multi-centre research.
Abstract:
Insulated-gate bipolar transistor (IGBT) power modules find widespread use in numerous power conversion applications where their reliability is of significant concern. Standard IGBT modules are fabricated for general-purpose applications, while few have been designed for bespoke applications. However, the conventional design of IGBTs can be improved by multiobjective optimization techniques. This paper proposes a novel design method that considers die-attachment solder failures induced by short power cycling and baseplate solder fatigue induced by thermal cycling, which are among the major failure mechanisms of IGBTs. Thermal resistance is calculated analytically, and the plastic work design is obtained with a high-fidelity finite-element model, which has been validated experimentally. The objective of minimizing the plastic work, together with the constraint functions, is formulated with a surrogate model. The nondominated sorting genetic algorithm-II is used to search for the Pareto-optimal solutions and the best design. The result of this combination is an effective approach to optimizing the physical structure of power electronic modules, taking account of historical environmental and operational conditions in the field.
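The Pareto-optimal search at the core of NSGA-II rests on non-dominated sorting. As a minimal sketch (not the paper's implementation), the first Pareto front of a set of objective vectors can be extracted as follows, assuming minimization of all objectives; the example vectors are illustrative:

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly
    better in at least one (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset (first Pareto front) of the points."""
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

For example, among the trade-off points (1, 5), (2, 3), (3, 4), (4, 1), (5, 5), the point (3, 4) is dominated by (2, 3) and (5, 5) by (1, 5), leaving the front (1, 5), (2, 3), (4, 1). NSGA-II repeats this sorting over successive fronts and adds a crowding-distance measure to keep the population spread along the front.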
Abstract:
Heat sinks are widely used for cooling electronic devices and systems. Their thermal performance is usually determined by the material, shape, and size of the heat sink. With the assistance of computational fluid dynamics (CFD) and surrogate-based optimization, heat sinks can be designed and optimized to achieve a high level of performance. In this paper, the design and optimization of a plate-fin-type heat sink cooled by an impingement jet is presented. The flow and thermal fields are simulated using CFD, and the thermal resistance of the heat sink is then estimated. A Kriging surrogate model is developed to approximate the objective function (thermal resistance) as a function of the design variables. Surrogate-based optimization is implemented by adaptively adding infill points based on an integrated strategy combining the minimum surrogate value, maximum mean square error, and expected improvement approaches. The results show the influence of the design variables on the thermal resistance and give the optimal heat sink with the lowest thermal resistance for the given jet impingement conditions.
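The expected-improvement infill criterion has a closed form under a Kriging model's Gaussian prediction. A minimal sketch (an illustration of the standard criterion, not the paper's implementation), for minimization of the objective given the surrogate's predicted mean and standard deviation at a candidate point:

```python
import math

def expected_improvement(mu, sigma, f_min):
    """Expected improvement over the current best value f_min, for a
    minimization problem, given a Gaussian prediction N(mu, sigma^2)
    at a candidate point (e.g. from a Kriging surrogate)."""
    if sigma <= 0.0:
        # No predictive uncertainty: improvement is deterministic.
        return max(f_min - mu, 0.0)
    z = (f_min - mu) / sigma
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))     # standard normal CDF
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # standard normal PDF
    return (f_min - mu) * cdf + sigma * pdf
```

The first term rewards candidates whose predicted mean beats the incumbent; the second rewards predictive uncertainty, which is why EI-based infill balances exploitation against exploration of under-sampled regions.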
Abstract:
The proposed multi-table lookup architecture provides SDN-based, high-performance packet classification in an OpenFlow v1.1+ SDN switch. The objective of the demonstration is to show the functionality of the architecture deployed on the NetFPGA SUME Platform.
Abstract:
The design optimization of cold-formed steel portal frame buildings is considered in this paper. The objective function is based on the cost of the members for the main frame and secondary members (i.e., purlins, girts, and cladding for walls and roofs) per unit area on the plan of the building. A real-coded niching genetic algorithm is used to minimize the cost of the frame and secondary members, which are designed on the basis of the ultimate limit state. It is shown that the proposed algorithm is effective and robust in generating the optimal solution, owing to the population's diversity being maintained by the niching method. In the optimal design, the cost of purlins and side rails is shown to account for 25% of the total cost, the main frame members account for 27%, and the cladding for the walls and roofs accounts for 27%.
Abstract:
We investigate the cell coverage optimization problem for the massive multiple-input multiple-output (MIMO) uplink. By deploying tilt-adjustable antenna arrays at the base stations, cell coverage optimization can become a promising technique which is able to strike a compromise between covering cell-edge users and pilot contamination suppression. We formulate a detailed description of this optimization problem by maximizing the cell throughput, which is shown to be mainly determined by the user distribution within several key geometrical regions. Then, the formulated problem is applied to different example scenarios: for a network with hexagonal shaped cells and uniformly distributed users, we derive an analytical lower bound of the ergodic throughput in the objective cell, based on which, it is shown that the optimal choice for the cell coverage should ensure that the coverage of different cells does not overlap; for a more generic network with sectoral shaped cells and non-uniformly distributed users, we propose an analytical approximation of the ergodic throughput. After that, a practical coverage optimization algorithm is proposed, where the optimal solution can be easily obtained through a simple one-dimensional line searching within a confined searching region. Our numerical results show that the proposed coverage optimization method is able to greatly increase the system throughput in macrocells for the massive MIMO uplink transmission, compared with the traditional schemes where the cell coverage is fixed.
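The "simple one-dimensional line searching within a confined searching region" can, for a unimodal objective such as a throughput curve with a single peak over the tilt parameter, be realized with golden-section search. A minimal sketch under that unimodality assumption; the objective used in the example is a stand-in, not the paper's throughput expression:

```python
import math

def golden_section_max(f, lo, hi, tol=1e-6):
    """Maximize a unimodal function f on [lo, hi] by golden-section search,
    shrinking the bracketing interval until it is narrower than tol."""
    inv_phi = (math.sqrt(5.0) - 1.0) / 2.0  # 1/golden ratio ~ 0.618
    a, b = lo, hi
    c = b - inv_phi * (b - a)
    d = a + inv_phi * (b - a)
    while b - a > tol:
        if f(c) > f(d):
            b, d = d, c                    # maximum lies in [a, d]
            c = b - inv_phi * (b - a)
        else:
            a, c = c, d                    # maximum lies in [c, b]
            d = a + inv_phi * (b - a)
    return (a + b) / 2.0
```

Each iteration shrinks the search interval by a constant factor (about 0.618), so the number of objective evaluations grows only logarithmically with the required precision of the coverage parameter.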
Abstract:
OBJECTIVE: To demonstrate the benefit of complexity metrics such as the modulation complexity score (MCS) and monitor units (MUs) in multi-institutional audits of volumetric-modulated arc therapy (VMAT) delivery.
METHODS: 39 VMAT treatment plans were analysed using MCS and MU. A virtual phantom planning exercise was planned and independently measured using the PTW Octavius(®) phantom and seven29(®) 2D array (PTW-Freiburg GmbH, Freiburg, Germany). MCS and MU were compared with the median gamma index pass rates (2%/2 mm and 3%/3 mm) and plan quality. The treatment planning systems (TPS) were grouped by whether the VMAT modelling was specifically designed for the linear accelerator manufacturer's own treatment delivery system (Type 1) or independent of vendor for VMAT delivery (Type 2). Differences in plan complexity (MCS and MU) between TPS types were compared.
RESULTS: For Varian(®) linear accelerators (Varian(®) Medical Systems, Inc., Palo Alto, CA), MCS and MU were significantly correlated with gamma pass rates. Type 2 TPS created poorer quality, more complex plans with significantly higher MUs and MCS than Type 1 TPS. Plan quality was significantly correlated with MU for Type 2 plans. A statistically significant correlation was observed between MU and MCS for all plans (R = -0.84, p < 0.01).
CONCLUSION: MU and MCS have a role in assessing plan complexity in audits along with plan quality metrics. Plan complexity metrics give some indication of plan deliverability but should be analysed with plan quality.
ADVANCES IN KNOWLEDGE: Complexity metrics were investigated for a national rotational audit involving 34 institutions and they showed value. The metrics found that more complex plans were created for planning systems which were independent of vendor for VMAT delivery.
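The reported correlations (e.g. R = -0.84 between MU and MCS) are Pearson coefficients. A minimal computation, with illustrative data rather than the audit's measurements:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

A negative coefficient such as the audit's R = -0.84 indicates that plans with higher MU tend to have lower MCS, i.e. greater modulation complexity under that score's convention.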
Abstract:
We make a case for studying the impact of intra-node parallelism on the performance of data analytics. We identify four performance optimizations that are enabled by an increasing number of processing cores on a chip. We discuss the performance impact of these optimizations on two analytics operators and identify how these optimizations affect one another.