934 results for ROBUST DESIGN
Abstract:
The chapter investigates Shock Control Bumps (SCBs) on a Natural Laminar Flow (NLF) aerofoil, the RAE 5243, for Active Flow Control (AFC). An SCB is used to decelerate the supersonic flow on the suction/pressure sides of a transonic aerofoil, delaying shock occurrence or weakening shock strength. This AFC technique significantly reduces total drag at transonic speeds. The chapter considers SCB shape design optimisation at two boundary layer transition positions (0% and 45% of chord) using an Euler solver coupled with a viscous boundary layer model and robust Evolutionary Algorithms (EAs). The optimisation method is based on a canonical Evolution Strategy (ES) algorithm and incorporates the concepts of hierarchical topology and parallel asynchronous evaluation of candidate solutions. Two numerical test cases are considered: in the first, the transition point occurs at the leading edge; in the second, it is fixed at 45% of the wing chord. Numerical results demonstrate that an optimal SCB design can be found that significantly reduces transonic wave drag and improves the lift-to-drag (L/D) ratio compared with the baseline aerofoil design.
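The abstract does not spell out the optimiser's internals; as a rough, hedged illustration of the canonical Evolution Strategy it builds on (omitting the hierarchical topology and asynchronous parallel evaluation, and using a hypothetical stand-in objective instead of the coupled Euler/boundary-layer solver):

```python
import numpy as np

def evolution_strategy(objective, dim, mu=5, lam=20, sigma=0.1, generations=100):
    """Minimal (mu + lam) Evolution Strategy with fixed-step Gaussian mutation.

    `objective` maps a design vector (e.g. SCB shape parameters) to a
    scalar cost such as wave drag; lower is better.
    """
    rng = np.random.default_rng(0)
    parents = rng.uniform(-1.0, 1.0, size=(mu, dim))
    for _ in range(generations):
        # Each offspring is a mutated copy of a randomly chosen parent.
        idx = rng.integers(0, mu, size=lam)
        offspring = parents[idx] + sigma * rng.standard_normal((lam, dim))
        pool = np.vstack([parents, offspring])
        costs = np.array([objective(x) for x in pool])
        parents = pool[np.argsort(costs)[:mu]]  # keep the mu best
    return parents[0]

# Hypothetical placeholder objective: a real run would invoke the coupled
# Euler/boundary-layer solver to evaluate drag for a candidate SCB shape.
best = evolution_strategy(lambda x: float(np.sum(x**2)), dim=6)
print(best)
```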
Abstract:
This paper demonstrates the application of a robust form of pose estimation and scene reconstruction using data from camera images. Our results suggest that the algorithm can rival RANSAC-based pose estimation refined by bundle adjustment in terms of solution robustness, speed and accuracy, even when given poor initialisations. Simulation results show the behaviour of the algorithm in a number of novel scenarios reflective of real-world cases, demonstrating its ability to handle large observation noise and difficult reconstruction scenes. These results have a number of implications for the vision and robotics community, and show that online visual motion estimation on robotic platforms is approaching real-world feasibility.
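The paper's own estimator is not detailed in the abstract; for orientation, here is a minimal sketch of the RANSAC baseline it is compared against, where `solve_pose` and `reproj_error` are hypothetical callables (e.g. a P3P minimal solver and a per-correspondence reprojection-error test):

```python
import numpy as np

def ransac_pose(correspondences, solve_pose, reproj_error,
                iters=500, threshold=2.0, seed=0):
    """Generic RANSAC loop for camera pose estimation.

    `correspondences` is a list of 2D-3D matches; `solve_pose` fits a pose
    to a minimal sample and `reproj_error` returns the reprojection error
    of one correspondence under a candidate pose. Both callables are
    hypothetical stand-ins.
    """
    rng = np.random.default_rng(seed)
    best_pose, best_inliers = None, 0
    n = len(correspondences)
    for _ in range(iters):
        sample = [correspondences[i] for i in rng.choice(n, size=3, replace=False)]
        pose = solve_pose(sample)
        if pose is None:
            continue  # degenerate sample
        inliers = sum(reproj_error(pose, c) < threshold for c in correspondences)
        if inliers > best_inliers:
            best_pose, best_inliers = pose, inliers
    return best_pose  # typically refined afterwards by bundle adjustment
```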
Abstract:
SAP and its research partners have been developing a language for describing details of services from various viewpoints, called the Unified Service Description Language (USDL). At the time of writing, version 3.0 describes technical implementation aspects of services, as well as stakeholders, pricing, lifecycle, and availability. Work is also underway to address other business and legal aspects of services. This language is designed to be used in service portfolio management, with a repository of service descriptions being available to various stakeholders in an organisation to allow for service prioritisation, development, deployment and lifecycle management. The structure of the USDL metadata is specified using an object-oriented metamodel that conforms to UML, MOF and EMF Ecore. As such, it is amenable to code generation for implementations of repositories that store service description instances. Although Web services toolkits can be used to make these programming language objects available as a set of Web services, the practicalities of writing distributed clients against over one hundred class definitions, containing several hundred attributes, will make for very large WSDL interfaces and highly inefficient "chatty" implementations. This paper gives the high-level design for a completely model-generated repository for any version of USDL (or any other data-only metamodel), which uses the Eclipse Modelling Framework's Java code generation, along with several open source plugins, to create a robust, transactional repository running in a Java application with a relational datastore. However, the repository exposes a generated WSDL interface at a coarse granularity, suitable for distributed client code and user-interface creation. It uses heuristics to drive code generation to bridge between the Web service and EMF granularities.
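The key design point, bridging a coarse-grained service interface to a fine-grained generated model, can be sketched language-neutrally (the real system uses EMF-generated Java exposed over WSDL; the class and method names below are hypothetical):

```python
class Service:
    """Stand-in for one of the many fine-grained generated model classes."""
    def __init__(self, name, price, stakeholders):
        self.name, self.price, self.stakeholders = name, price, stakeholders

class CoarseRepositoryFacade:
    """Bundles many fine-grained attribute accesses into single coarse
    calls, the way the generated WSDL interface avoids 'chatty' clients
    that would otherwise make one remote call per attribute."""
    def __init__(self):
        self._store = {}

    def put_service_description(self, doc):
        # One coarse call carries the whole description document.
        self._store[doc["name"]] = Service(**doc)

    def get_service_description(self, name):
        s = self._store[name]
        return {"name": s.name, "price": s.price, "stakeholders": s.stakeholders}

repo = CoarseRepositoryFacade()
repo.put_service_description({"name": "billing", "price": 10.0,
                              "stakeholders": ["ops", "finance"]})
print(repo.get_service_description("billing"))
```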
Abstract:
A number of Game Strategies (GS) have been developed in past decades. They have been used in the fields of economics, engineering, computer science and biology due to their efficiency in solving design optimization problems. In addition, research in multi-objective (MO) and multidisciplinary design optimization (MDO) has focused on developing robust and efficient optimization methods that produce a set of high-quality solutions at low computational cost. In this paper, two optimization techniques are considered: the first uses multi-fidelity hierarchical Pareto optimality; the second combines two Game Strategies, Nash equilibrium and Pareto optimality. The paper shows how Game Strategies can be hybridised and coupled to Multi-Objective Evolutionary Algorithms (MOEA) to accelerate convergence and to produce a set of high-quality solutions. Numerical results obtained from both optimization methods are compared in terms of computational expense and model quality. The benefits of using Hybrid-Game Strategies are clearly demonstrated.
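The hybrid coupling itself is not given in the abstract, but the Pareto-optimality building block that an MOEA maintains can be grounded in a few lines (minimisation assumed; the objective values below are hypothetical):

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# Hypothetical two-objective trade-offs, e.g. (drag, weight) minimisation.
print(pareto_front([(1.0, 4.0), (2.0, 3.0), (3.0, 3.5), (0.5, 5.0)]))
```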
Abstract:
Selecting an appropriate design-builder is critical to the success of design-build (DB) projects. The objective of this study is to identify selection criteria for design-builders and compare their relative importance by means of a robust content analysis of 94 Requests For Proposals (RFPs) for public DB projects. These DB projects had an aggregate contract value of over US$3.5 billion and were advertised between 2000 and 2010. The study identified twenty-six selection criteria and classified them into ten categories, namely price, experience, technical approach, management approach, qualification, schedule, past performance, financial capability, responsiveness to the RFP, and legal status, in descending order of relative importance. The results showed that even though price remains the most important selection category, its relative importance declined significantly over the last decade. The categories of qualification, experience and past performance, by contrast, have become increasingly important to DB owners when selecting design-builders. Finally, the importance weighting of price in large projects was found to be significantly higher than that in small projects. This study provides a useful reference for owners in selecting their preferred design-builders.
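As a rough illustration of how relative importance might be derived from such a content analysis (a real study would also weight criteria by the points each RFP assigns; the criterion names and mini-corpus below are hypothetical):

```python
from collections import Counter

def relative_importance(rfps):
    """Rank selection criteria by how often RFPs mention them.

    `rfps` is a list of criterion-name lists, one per RFP.
    """
    counts = Counter(c for rfp in rfps for c in set(rfp))
    total = sum(counts.values())
    return sorted(((c, n / total) for c, n in counts.items()),
                  key=lambda t: t[1], reverse=True)

# Hypothetical mini-corpus of three RFPs.
print(relative_importance([["price", "experience"],
                           ["price", "schedule"],
                           ["price", "experience", "qualification"]]))
```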
Abstract:
The importance of actively managing and analyzing business processes is acknowledged more than ever in today's organizations. Business processes form an essential part of an organization, and their application areas are manifold. Most organizations keep records of the various activities that have been carried out for auditing purposes, but these records are rarely used for analysis. This paper describes the design and implementation of a process analysis tool that replays, analyzes and visualizes a variety of performance metrics using a process definition and its execution logs. Performing performance analysis on existing and planned process models offers organizations an effective way to detect bottlenecks within their processes and to make better-informed process improvement decisions. Our technique is applied to processes modeled in the YAWL language. Execution logs of process instances are compared against the corresponding YAWL process model and replayed in a robust manner, taking into account any noise in the logs. Finally, performance characteristics obtained from replaying the log in the model are projected onto the model.
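As a much-simplified sketch of log replay (the actual tool interprets full YAWL semantics; the transition-map model and traces here are hypothetical), each trace is walked through the model, deviations are tolerated as noise, and activity durations are accumulated:

```python
def replay(model, traces):
    """Replay event traces against a simple sequential process model.

    `model` maps each activity to the set of activities allowed next;
    events that do not fit the model are counted as noise rather than
    aborting the replay, loosely mirroring robust log replay.
    """
    durations, noise = {}, 0
    for trace in traces:
        prev = None
        for activity, start, end in trace:
            if prev is not None and activity not in model.get(prev, set()):
                noise += 1       # tolerate deviations instead of failing
                continue
            durations.setdefault(activity, []).append(end - start)
            prev = activity
    return durations, noise

model = {"register": {"approve"}, "approve": {"ship"}}
traces = [[("register", 0, 2), ("approve", 2, 5), ("ship", 5, 6)],
          [("register", 0, 1), ("ship", 1, 3)]]       # noisy trace
print(replay(model, traces))
```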
Abstract:
This paper investigates a mixed centralised-decentralised air traffic separation management system, which combines the best features of centralised and decentralised systems whilst ensuring the reliability of the air traffic management system during degraded conditions. To overcome communication bandwidth limits, we propose a mixed separation manager based on a robust decision (or min-max) problem posed over a reduced set of admissible flight avoidance manoeuvres (a FAM alphabet). We also present a design method for selecting an appropriate FAM alphabet for use in the mixed separation management system. Simulation studies illustrate the benefits of the proposed FAM-alphabet-based mixed separation manager.
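The min-max decision over a finite FAM alphabet reduces to choosing the manoeuvre whose worst-case cost across uncertain scenarios is smallest; a minimal sketch with a hypothetical cost table:

```python
def minmax_manoeuvre(alphabet, scenarios, cost):
    """Robust (min-max) choice: the manoeuvre minimising worst-case cost.

    `alphabet` is the reduced set of admissible flight avoidance
    manoeuvres; `cost(m, s)` scores manoeuvre m under scenario s.
    """
    return min(alphabet, key=lambda m: max(cost(m, s) for s in scenarios))

# Hypothetical 3-manoeuvre alphabet and separation-loss costs.
costs = {("climb", "s1"): 2, ("climb", "s2"): 5,
         ("turn",  "s1"): 3, ("turn",  "s2"): 3,
         ("hold",  "s1"): 1, ("hold",  "s2"): 9}
print(minmax_manoeuvre(["climb", "turn", "hold"], ["s1", "s2"],
                       lambda m, s: costs[(m, s)]))   # -> "turn"
```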
Abstract:
In this paper we present a methodology for designing experiments for efficiently estimating the parameters of models with computationally intractable likelihoods. The approach combines a commonly used methodology for robust experimental design, based on Markov chain Monte Carlo sampling, with approximate Bayesian computation (ABC) to ensure that no likelihood evaluations are required. The utility function considered for precise parameter estimation is based upon the precision of the ABC posterior distribution, which we form efficiently via the ABC rejection algorithm based on pre-computed model simulations. Our focus is on stochastic models and, in particular, we investigate the methodology for Markov process models of epidemics and macroparasite population evolution. The macroparasite example involves a multivariate process and we assess the loss of information from not observing all variables.
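As a hedged, self-contained sketch of the ABC rejection step the method relies on (the toy Poisson model below is illustrative only, not the epidemic or macroparasite models from the paper):

```python
import numpy as np

def abc_rejection(simulate, summarise, observed, prior_sample,
                  n_sims=10000, quantile=0.01, seed=0):
    """ABC rejection: keep the parameters whose simulated summary
    statistics fall closest to the observed summary, so no likelihood
    evaluations are ever required.
    """
    rng = np.random.default_rng(seed)
    thetas = np.array([prior_sample(rng) for _ in range(n_sims)])
    dists = np.array([np.linalg.norm(summarise(simulate(t, rng)) - observed)
                      for t in thetas])
    keep = dists <= np.quantile(dists, quantile)
    return thetas[keep]           # approximate posterior sample

# Toy example: infer the rate of a Poisson count from its sample mean.
obs = np.array([4.0])
post = abc_rejection(lambda t, r: r.poisson(t, size=50),
                     lambda x: np.array([x.mean()]),
                     obs,
                     lambda r: r.uniform(0, 10))
print(post.mean())
```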
Abstract:
Value Management (VM) is a proven methodology that provides a structured framework, using supporting tools and techniques, to facilitate effective decision-making in many types of projects, thus achieving 'best value' for clients. It offers an exceptionally robust approach to exploring the need and function of projects so that they align with clients' objectives. The functional analysis and creativity phases of VM are crucial, as they focus on utilising innovative thinking to understand the objectives of clients' projects and to provide value-adding solutions at the early discovery stages of projects. There is, however, a perception of VM as just another cost-cutting tool, which has overshadowed the fundamental benefits of the method, thereby limiting both its influence and its wider use in the construction industry. This paper describes findings from a series of case studies conducted at the project and corporate levels of current publicly funded infrastructure projects in Malaysia. The study aims to investigate the VM processes practised by the project client organisation and to evaluate the effects of project team involvement in VM workshops during the design stage of these projects. The focus of the study is on how issues related to 'upstream' infrastructure design, aimed at improving the 'downstream' construction process on-site, are resolved through multi-disciplinary team consideration and decision-making. Findings from the case studies indicate that the mix of disciplines of project team members at the design stage of a VM workshop has minimal influence on improving construction processes. However, the degree of interaction, institutionalised thinking, cultural dimensions and visualisation aids adopted have a significant impact on maximising creativity amongst project team members during a VM workshop. The case studies conducted for this research focused on infrastructure projects that utilise traditional VM workshops as the client's chosen VM methodology to review and develop designs. Document reviews and semi-structured interviews with project teams were used as data collection techniques for the case studies. The outcomes of this research are expected to offer alternative perspectives for construction professionals and clients to minimise constraints and strengthen strategies for implementing VM on future projects.
Abstract:
The design-build (DB) delivery system is an effective means of delivering a green construction project, and selecting an appropriate contractor is critical to project success. Moreover, the delivery of green buildings requires specific design, construction, and operation and maintenance considerations not generally encountered in the procurement of conventional buildings. Specifying clear sustainability requirements to potential contractors is particularly important in achieving sustainable project goals. However, many clients/owners either do not explicitly specify sustainability requirements or do so in a prescriptive manner during the project procurement process. This paper investigates the current state-of-the-art procurement process used in specifying the sustainability requirements of the public sector in the USA construction market by means of a robust content analysis of 40 design-build requests for proposals (RFPs). The results of the content analysis indicate that sustainability is one of the most important dimensions in the best-value evaluation of DB contractors. Clients/owners predominantly specify LEED certification levels (e.g. LEED Certified, Silver, Gold, and Platinum) for a particular facility, and include the sustainability requirements as selection criteria (with specific importance weightings) for contractor evaluation. Additionally, larger projects tend to allocate higher importance weightings to sustainability requirements. This study provides public DB clients/owners with a number of practical implications for selecting appropriate design-builders for sustainable DB projects.
Abstract:
The current approach for protecting the receiving water environment from urban stormwater pollution is the adoption of structural measures commonly referred to as Water Sensitive Urban Design (WSUD). The treatment efficiency of WSUD measures depends closely on the design of the specific treatment units. As stormwater quality can be influenced by rainfall characteristics, the selection of appropriate rainfall events for treatment design is essential to ensure the effectiveness of WSUD systems. Based on extensive field investigation of four urban residential catchments and computer modelling, this paper details a technically robust approach for selecting rainfall events for stormwater treatment design using a three-component model. The modelling outcomes indicate that selecting smaller average recurrence interval (ARI) events with high intensity and short duration as the threshold for treatment system design is the most feasible option, since these events cumulatively generate a major portion of the annual pollutant load compared with other types of rainfall events, despite producing a relatively small runoff volume. This implies that designs based on small, more frequent rainfall events rather than larger rainfall events would be appropriate in terms of treatment performance, cost-effectiveness and possible savings in the land area needed.
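The selection logic, that small, frequent, high-intensity events dominate the cumulative annual pollutant load, can be illustrated with a toy tally (event classes and load values are hypothetical):

```python
def load_fractions(events):
    """Share of annual pollutant load contributed by each event class.

    `events` maps a rainfall class (e.g. by ARI / intensity-duration)
    to a list of per-event pollutant loads.
    """
    totals = {cls: sum(loads) for cls, loads in events.items()}
    annual = sum(totals.values())
    return {cls: t / annual for cls, t in totals.items()}

# Many small high-intensity events vs. a few large, rare ones.
print(load_fractions({
    "small ARI, high intensity": [1.2] * 20,   # frequent events
    "large ARI":                 [4.0] * 2,    # rare events
}))
```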
Abstract:
Next-generation autonomous underwater vehicles (AUVs) will be required to robustly identify underwater targets for tasks such as inspection, localization, and docking. Given their often unstructured operating environments, vision offers enormous potential for underwater navigation over more traditional methods; however, unreliable target segmentation often plagues these systems. This paper addresses robust vision-based target recognition by presenting a novel scale- and rotation-invariant target design, together with a recognition routine based on self-similar landmarks, that enables robust target pose estimation with respect to a single camera. These algorithms are applied to an AUV with controllers developed for vision-based docking with the target. Experimental results show that the system performs exceptionally well on limited processing power, and demonstrate how the combined vision and controller system enables robust target identification and docking in a variety of operating conditions.
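The abstract does not give the recognition routine; as a hedged sketch of why self-similar landmarks suit scale-invariant detection, a 1-D intensity profile that repeats under rescaling correlates strongly with a scaled copy of itself (the scoring function below is illustrative, not the paper's):

```python
import numpy as np

def self_similarity(profile, ratio=0.5):
    """Score how self-similar a 1-D intensity profile is.

    A self-similar landmark satisfies I(x) ~= I(ratio * x), so the
    profile correlates strongly with a rescaled copy of itself; this
    scale-invariance is what makes such targets robust to viewing distance.
    """
    x = np.arange(len(profile))
    scaled = np.interp(ratio * x, x, profile)       # I(ratio * x)
    a = profile - profile.mean()
    b = scaled - scaled.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# A log-periodic pattern approximately repeats under halving of x,
# so it scores highly; an uncorrelated pattern would score near 0.
x = np.arange(256)
pattern = np.sign(np.sin(2 * np.pi * np.log2(x + 1)))
print(self_similarity(pattern, ratio=0.5))
```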
Abstract:
The paper introduces the design of robust current and voltage control algorithms for a grid-connected three-phase inverter interfaced to the grid through a high-bandwidth three-phase LCL filter. The algorithms are based on state feedback control, designed in a systematic approach and improved by using oversampling to deal with the issues arising from the high-bandwidth filter. An adaptive loop delay compensation method has also been adopted to minimize the adverse effects of loop delay in the digital controller and to increase the robustness of the control algorithm in the presence of parameter variations. Simulation results are presented to validate the effectiveness of the proposed algorithms.
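A hedged sketch of the two ingredients named in the abstract, discrete state feedback plus loop-delay compensation by forward-predicting the state before computing the control (all matrices and gains below are hypothetical placeholders, not the paper's design):

```python
import numpy as np

def state_feedback_step(x, u_prev, A, B, K, delay_steps=1):
    """One update of a discrete state-feedback controller with simple
    loop-delay compensation: predict the state forward by the known
    delay before applying u = -K x_predicted.
    """
    x_pred = x
    for _ in range(delay_steps):             # forward-predict through delay
        x_pred = A @ x_pred + B @ u_prev
    return -K @ x_pred

# Toy 2-state plant (loosely: filter current and capacitor voltage).
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.0], [0.1]])
K = np.array([[2.0, 1.5]])
x = np.array([[1.0], [0.0]])
u = np.zeros((1, 1))
for _ in range(5):
    u = state_feedback_step(x, u, A, B, K)
    x = A @ x + B @ u                        # state converges toward zero
print(x.ravel())
```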
Abstract:
Multiple reaction monitoring (MRM) mass spectrometry coupled with stable isotope dilution (SID) and liquid chromatography (LC) is increasingly used in biological and clinical studies for precise and reproducible quantification of peptides and proteins in complex sample matrices. Robust LC-SID-MRM-MS-based assays that can be replicated across laboratories, and ultimately in clinical laboratory settings, require standardized protocols to demonstrate that the analysis platforms are performing adequately. We developed a system suitability protocol (SSP), which employs a predigested mixture of six proteins, to facilitate performance evaluation of LC-SID-MRM-MS instrument platforms configured with nanoflow-LC systems interfaced to triple quadrupole mass spectrometers. The SSP was designed for use with low-multiplex analyses as well as high-multiplex approaches when software-driven scheduling of data acquisition is required. Performance was assessed by monitoring a range of chromatographic and mass spectrometric metrics, including peak width, chromatographic resolution, peak capacity, and the variability in peak area and analyte retention time (RT) stability. The SSP, which was evaluated in 11 laboratories on a total of 15 different instruments, enabled early diagnosis of LC and MS anomalies that indicated suboptimal LC-MRM-MS performance. The observed range in variation of each of the metrics scrutinized serves to define the criteria for optimized LC-SID-MRM-MS platforms for routine use, with pass/fail criteria for system suitability performance measures defined as peak area coefficient of variation <0.15, peak width coefficient of variation <0.15, standard deviation of RT <0.15 min (9 s), and RT drift <0.5 min (30 s). The deleterious effect of a marginally performing LC-SID-MRM-MS system on the limit of quantification (LOQ) in targeted quantitative assays illustrates the use of, and need for, an SSP to establish robust and reliable system performance. Use of an SSP helps to ensure that analyte quantification measurements can be replicated with good precision within and across multiple laboratories, and should facilitate more widespread use of MRM-MS technology by the basic biomedical and clinical laboratory research communities.
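The quoted pass/fail thresholds translate directly into a small check; a minimal sketch over replicate injections (the input numbers below are hypothetical):

```python
import numpy as np

def system_suitability(peak_areas, peak_widths, retention_times):
    """Evaluate the pass/fail criteria quoted in the abstract:
    peak-area CV < 0.15, peak-width CV < 0.15, RT standard deviation
    < 0.15 min, and RT drift (max - min) < 0.5 min.
    """
    cv = lambda v: np.std(v, ddof=1) / np.mean(v)
    rt = np.asarray(retention_times)
    checks = {
        "peak area CV < 0.15":  cv(peak_areas) < 0.15,
        "peak width CV < 0.15": cv(peak_widths) < 0.15,
        "RT SD < 0.15 min":     np.std(rt, ddof=1) < 0.15,
        "RT drift < 0.5 min":   rt.max() - rt.min() < 0.5,
    }
    return checks, all(checks.values())

# Hypothetical replicate injections of one SSP peptide.
print(system_suitability([1.0e6, 1.1e6, 0.95e6],
                         [0.20, 0.22, 0.21],
                         [12.40, 12.45, 12.38]))
```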
Abstract:
We present the treatment rationale and study design of the MetLung phase III study. This study will investigate onartuzumab (MetMAb) in combination with erlotinib, compared with erlotinib alone, as second- or third-line treatment in patients with advanced non-small-cell lung cancer (NSCLC) who are Met-positive by immunohistochemistry. Approximately 490 patients (245 per treatment arm) will receive erlotinib (150 mg orally daily) plus onartuzumab or placebo (15 mg/kg intravenously every 3 weeks) until disease progression, unacceptable toxicity, patient or physician decision to discontinue, or death. The efficacy objectives of this study are to compare overall survival (OS, the primary endpoint), progression-free survival, and response rates between the two treatment arms. In addition, safety, quality of life, pharmacokinetics, and translational research will be investigated across treatment arms. If the primary objective (OS) is achieved, this study will provide robust evidence for an alternative second- or third-line treatment option for patients with Met-positive NSCLC.