20 results for Robust design
Abstract:
Since their introduction in the 1950s, marine outfalls with diffusers have been prone to saline intrusion, a process in which seawater ingresses into the outfall. This can greatly reduce the dilution and subsequent dispersion of the discharged wastewater, sometimes resulting in serious deterioration of coastal water quality. Although engineers have long been aware of the difficulties posed by saline intrusion, they still lack satisfactory methods for its prediction and robust design methods for its alleviation. However, with recent developments in numerical methods and computer power, it has been suggested that commercially available computational fluid dynamics (CFD) software may be a useful aid in combating this phenomenon by improving understanding through synthesising likely behaviour. This document reviews current knowledge on saline intrusion and its implications, and then outlines a model-scale investigation of the process undertaken at Queen's University Belfast using both physical and CFD methods. Results are presented for a simple outfall configuration incorporating several outlets. The features observed agree with general observations from full-scale marine outfalls and quantify the intricate internal flow mechanisms associated with saline intrusion. The two-dimensional numerical model was found to represent saline intrusion, but in a qualitative manner not yet adequate for design purposes. Specific areas requiring further development were identified. The ultimate aim is to provide a reliable, practical and cost-effective means by which engineers can minimise saline intrusion through optimised outfall design.
Abstract:
This paper describes a randomised controlled trial (RCT) investigation of the added value of systemic family therapy (SFT) over individually focused cognitive behavioural therapy (CBT) for families in which one or more members has suffered trauma and been referred to a community-based psychotherapy centre. The results illustrate how an apparently robust design can be confounded by high attrition rates, a low average number of therapeutic sessions and poor protocol adherence. The paper highlights a number of general and specific lessons regarding the resources and processes involved that can act as a model for those planning to undertake studies of this type and scope. A key message is that the challenges of conducting RCTs in ‘real world’ settings should not be underestimated. The wider implications in relation to the place of RCTs within the creation of the evidence base for complex psycho-social interventions are discussed, and the current movement towards a phased mixed-methods approach, including the appropriate use of RCTs, which some might argue is a return to the original vision of evidence-based practice (EBP), is affirmed.
Abstract:
The Taguchi method was applied to investigate the optimal operating conditions in the preparation of activated carbon from palm kernel shell using four control factors: irradiation time, microwave power, concentration of phosphoric acid as the impregnation substance, and the impregnation ratio between acid and palm kernel shell. The best combination of the control factors obtained by applying the Taguchi method was a microwave power of 800 W, an irradiation time of 17 min, an impregnation ratio of 2, and an acid concentration of 85%. The noise factor (particle size of the raw material) was considered in a separate outer array and had no effect on the quality of the activated carbon, as confirmed by a t-test. Activated carbon prepared at the optimum combination of control factors had a high BET surface area of 1,473.55 m² g⁻¹ and high porosity. The adsorption equilibrium and kinetic data can satisfactorily be described by the Langmuir isotherm and a pseudo-second-order kinetic model, respectively. The maximum adsorption capacity suggested by the Langmuir model was 1,000 mg g⁻¹.
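The two models named in this abstract have standard closed forms and can be sketched directly. In the minimal Python sketch below, q_max = 1,000 mg g⁻¹ is taken from the abstract, while the Langmuir constant K_L and the pseudo-second-order rate constant k₂ are hypothetical illustrative values, not fitted parameters from the paper:

```python
# Langmuir isotherm and pseudo-second-order kinetics, as cited in the
# abstract. q_max = 1000 mg/g comes from the abstract; k_l and k2 below
# are hypothetical illustrative values, not fitted parameters.

def langmuir(c_e, q_max=1000.0, k_l=0.05):
    """Equilibrium uptake: q_e = q_max * K_L * C_e / (1 + K_L * C_e)."""
    return q_max * k_l * c_e / (1.0 + k_l * c_e)

def pseudo_second_order(t, q_e=900.0, k2=0.001):
    """Uptake at time t: q_t = k2 * q_e^2 * t / (1 + k2 * q_e * t)."""
    return k2 * q_e ** 2 * t / (1.0 + k2 * q_e * t)

for c in (1.0, 10.0, 100.0):
    print(f"C_e = {c:6.1f} mg/L -> q_e = {langmuir(c):7.2f} mg/g")
```

Both curves saturate, which is the defining feature of the models: langmuir approaches q_max as C_e grows, and pseudo_second_order approaches q_e as t grows.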
Abstract:
In this paper, we propose a system-level design approach considering voltage over-scaling (VOS) that achieves error resiliency using unequal error protection of different computation elements, while incurring minor quality degradation. Depending on user specifications and the severity of process variations/channel noise, the degree of VOS in each block of the system is adaptively tuned to ensure minimum system power while providing "just-the-right" amount of quality and robustness. This is achieved by taking into consideration block-level interactions and ensuring that, under any change of operating conditions, only the "less-crucial" computations, which contribute less to block/system output quality, are affected. The proposed approach applies unequal error protection to various blocks of a system (logic and memory) and spans multiple layers of the design hierarchy (algorithm, architecture, and circuit). The design methodology, when applied to a multimedia subsystem, shows large power benefits (up to 69% improvement in power consumption) at reasonable image quality while tolerating errors introduced by VOS, process variations, and channel noise.
Abstract:
In this paper, a multi-level wordline driver scheme is presented to improve 6T-SRAM read and write stability. The proposed wordline driver generates a shaped pulse during the read mode and a boosted wordline during the write mode. During read, the shaped pulse is held at the nominal voltage for a short period of time, whereas for the remaining access time the wordline voltage is reduced to save the power consumption of the cell. This shaped wordline pulse results in an improved read noise margin without any degradation in access time for a small wordline load. The improvement is explained by examining the dynamic and nonlinear behavior of the SRAM cell. Furthermore, during the hold mode, for a short time (depending on the size of the boosting capacitance), the wordline voltage becomes negative and charges up to zero after a specific time, which results in a lower leakage current compared to a conventional SRAM. The proposed technique results in at least a 2× improvement in read noise margin and a 3× improvement in write margin for supply voltages below 0.7 V. The leakage power for the proposed SRAM is reduced by 2% while the total power is improved by 3% in the worst-case scenario for an SRAM array. The main advantage of the proposed wordline driver is the improvement of the dynamic noise margin with less than a 2.5% penalty in area. TSMC 65 nm technology models are used for simulations.
Abstract:
In this paper, a multiloop robust control strategy is proposed based on H∞ control and a partial least squares (PLS) model (H∞_PLS) for multivariable chemical processes. It is developed especially for multivariable systems in ill-conditioned plants and for non-square systems. The advantage of PLS is that it extracts the strongest relationship between the input and the output variables in the reduced space of the latent variable model rather than in the original space of the highly dimensional variables. Without conventional decouplers, the dynamic PLS framework automatically decomposes the MIMO process into multiple single-loop systems in the PLS subspace, so that the controller design can be simplified. Since plant/model mismatch is almost inevitable in practical applications, to enhance the robustness of this control system, the controllers based on the H∞ mixed-sensitivity problem are designed in the PLS latent subspace. The feasibility and the effectiveness of the proposed approach are illustrated by the simulation results of a distillation column and a mixing-tank process. Comparisons between H∞_PLS control and conventional individual control (either H∞ control or PLS control only) are also made.
Abstract:
The scale of the Software-Defined Network (SDN) controller design problem has become apparent with the expansion of SDN deployments. Initial SDN deployments were small-scale, single-controller environments for research and use-case testing. Today, enterprise deployments requiring multiple controllers are gathering momentum, e.g. Google’s backbone network, Microsoft’s public cloud, and NTT’s edge gateway. Third-party applications are also becoming available, e.g. the HP SDN App Store. The increase in components and interfaces for the evolved SDN implementation increases the security challenges of the SDN controller design. In this work, the requirements of a secure, robust, and resilient SDN controller are identified, state-of-the-art open-source SDN controllers are analyzed with respect to the security of their design, and recommendations for security improvements are provided. This contribution highlights the gap between the potential security solutions for SDN controllers and the actual security level of current controller designs.
Abstract:
After years of emphasis on leanness and responsiveness, businesses are now experiencing their vulnerability to supply chain disturbances. Although more literature is appearing on this subject, there is a need for an integrated framework to support the analysis and design of robust food supply chains. In this chapter we present such a framework. We define the concept of robustness and classify supply chain disturbances, sources of food supply chain vulnerability, and adequate redesign principles and strategies to achieve robust supply chain performances. To test and illustrate its applicability, the research framework is applied to a meat supply chain.
Abstract:
The operation of supply chains (SCs) has for many years been focused on efficiency, leanness and responsiveness. This has resulted in reduced slack in operations, compressed cycle times, increased productivity and minimised inventory levels along the SC. Combined with tight tolerance settings for the realisation of logistics and production processes, this has led to SC performances that are frequently not robust. SCs are becoming increasingly vulnerable to disturbances, which can decrease the competitive power of the entire chain in the market. Moreover, in the case of food SCs non-robust performances may ultimately result in empty shelves in grocery stores and supermarkets.
The overall objective of this research is to contribute to Supply Chain Management (SCM) theory by developing a structured approach to assess SC vulnerability, so that robust performances of food SCs can be assured. We also aim to help companies in the food industry to evaluate their current state of vulnerability, and to improve their performance robustness through a better understanding of vulnerability issues. The following research questions (RQs) stem from these objectives:
RQ1: What are the main research challenges related to (food) SC robustness?
RQ2: What are the main elements that have to be considered in the design of robust SCs and what are the relationships between these elements?
RQ3: What is the relationship between the contextual factors of food SCs and the use of disturbance management principles?
RQ4: How to systematically assess the impact of disturbances in (food) SC processes on the robustness of (food) SC performances?
To answer these RQs we used different methodologies, both qualitative and quantitative. For each question, we conducted a literature survey to identify gaps in existing research and define the state of the art of knowledge on the related topics. For the second and third RQ, we conducted both exploration and testing on selected case studies. Finally, to obtain more detailed answers to the fourth question, we used simulation modelling and scenario analysis for vulnerability assessment.
Main findings are summarised as follows.
Based on an extensive literature review, we answered RQ1. The main research challenges were related to the need to define SC robustness more precisely, to identify and classify disturbances and their causes in the context of the specific characteristics of SCs and to make a systematic overview of (re)design strategies that may improve SC robustness. Also, we found that it is useful to be able to discriminate between varying degrees of SC vulnerability and to find a measure that quantifies the extent to which a company or SC shows robust performances when exposed to disturbances.
To address RQ2, we define SC robustness as the degree to which a SC shows an acceptable performance in (each of) its Key Performance Indicators (KPIs) during and after an unexpected event that caused a disturbance in one or more logistics processes. Based on the SCM literature, we identified the main elements needed to achieve robust performances and structured them together to form a conceptual framework for the design of robust SCs. We then explained the logic of the framework and elaborated on each of its main elements: the SC scenario, SC disturbances, SC performance, sources of food SC vulnerability, and redesign principles and strategies.
Based on three case studies, we answered RQ3. Our major findings show that the contextual factors have a consistent relationship to Disturbance Management Principles (DMPs). The product and SC environment characteristics are contextual factors that are hard to change and these characteristics initiate the use of specific DMPs as well as constrain the use of potential response actions. The process and the SC network characteristics are contextual factors that are easier to change, and they are affected by the use of the DMPs. We also found a notable relationship between the type of DMP likely to be used and the particular combination of contextual factors present in the observed SC.
To address RQ4, we presented a new method for vulnerability assessments, the VULA method. The VULA method helps to identify how much a company is underperforming on a specific Key Performance Indicator (KPI) in the case of a disturbance, how often this would happen and how long it would last. It ultimately informs the decision maker about whether process redesign is needed and what kind of redesign strategies should be used in order to increase the SC’s robustness. The VULA method is demonstrated in the context of a meat SC using discrete-event simulation. The case findings show that performance robustness can be assessed for any KPI using the VULA method.
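The abstract characterises the VULA output in terms of how much a KPI underperforms, how often, and for how long, but the method's actual metrics are not reproduced here. The following is a minimal threshold-based sketch of that idea only; the function name, the target and tolerance values, and the example KPI trace are all hypothetical:

```python
def vula_profile(kpi_series, target, tolerance):
    """Hypothetical reading of the VULA idea: scan a simulated KPI trace
    and report the worst shortfall below target, the number of
    underperformance episodes, and the longest episode (in periods)."""
    floor = target - tolerance          # acceptable lower bound on the KPI
    episodes, run, worst = [], 0, 0
    for value in kpi_series:
        if value < floor:               # disturbance pushed KPI below bound
            run += 1
            worst = max(worst, target - value)
        elif run:                       # episode just ended
            episodes.append(run)
            run = 0
    if run:                             # trace ended mid-episode
        episodes.append(run)
    return {"episodes": len(episodes),
            "longest": max(episodes, default=0),
            "worst_shortfall": worst}

# e.g. weekly service level (%) from a simulated meat SC, target 95 +/- 5
profile = vula_profile([100, 98, 80, 75, 99, 100, 60, 100], 95, 5)
print(profile)
```

In this toy trace the profile reports two underperformance episodes, the longer lasting two periods, with a worst shortfall of 35 points below target, which is the kind of "how much, how often, how long" summary a decision maker could use to judge whether redesign is needed.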
To sum up the project, all findings were incorporated within an integrated framework for designing robust SCs. The integrated framework consists of the following steps: 1) Description of the SC scenario and identification of its specific contextual factors; 2) Identification of disturbances that may affect KPIs; 3) Definition of the relevant KPIs and identification of the main disturbances through assessment of the SC performance robustness (i.e. application of the VULA method); 4) Identification of the sources of vulnerability that may (strongly) affect the robustness of performances and eventually increase the vulnerability of the SC; 5) Identification of appropriate preventive or disturbance impact reductive redesign strategies; 6) Alteration of SC scenario elements as required by the selected redesign strategies and repeat VULA method for KPIs, as defined in Step 3.
Contributions of this research are listed as follows. First, we have identified emerging research areas: SC robustness and its counterpart, vulnerability. Second, we have developed a definition of SC robustness, operationalized it, and identified and structured the relevant elements for the design of robust SCs in the form of a research framework. With this research framework, we contribute to a better understanding of the concepts of vulnerability and robustness and related issues in food SCs. Third, we identified the relationship between contextual factors of food SCs and specific DMPs used to maintain robust SC performances: characteristics of the product and the SC environment influence the selection and use of DMPs; processes and SC networks are influenced by DMPs. Fourth, we developed specific metrics for vulnerability assessments, which serve as the basis of the VULA method. The VULA method investigates different measures of the variability of both the duration of impacts from disturbances and the fluctuations in their magnitude.
With this project, we also hope to have delivered practical insights into food SC vulnerability. First, the integrated framework for the design of robust SCs can be used to guide food companies in successful disturbance management. Second, empirical findings from case studies lead to the identification of changeable characteristics of SCs that can serve as a basis for assessing where to focus efforts to manage disturbances. Third, the VULA method can help top management to get more reliable information about the “health” of the company.
The two most important research opportunities are: First, there is a need to extend and validate our findings related to the research framework and contextual factors through further case studies related to other types of (food) products and other types of SCs. Second, there is a need to further develop and test the VULA method, e.g.: to use other indicators and statistical measures for disturbance detection and SC improvement; to define the most appropriate KPI to represent the robustness of a complete SC. We hope this thesis invites other researchers to pick up these challenges and help us further improve the robustness of (food) SCs.
Abstract:
This paper reports on a design study assessing the impact of laminate manufacturing constraints on the structural performance and weight of composite stiffened panels. The study demonstrates that maximizing ply continuity results in weight penalties, while various geometric constraints related to manufacture and repair can be accommodated without significant weight penalties, potentially generating robust, flexible designs.
Abstract:
There is a requirement for better integration between design and analysis tools, which is difficult due to their different objectives, separate data representations and workflows. Currently, substantial effort is required to produce a suitable analysis model from design geometry. Robust links are required between these different representations to enable analysis attributes to be transferred between different design and analysis packages for models at various levels of fidelity.
This paper describes a novel approach for integrating design and analysis models by identifying and managing the relationships between the different representations. Three key technologies, Cellular Modeling, Virtual Topology and Equivalencing, have been employed to achieve effective simulation model management. These technologies and their implementation are discussed in detail. Prototype automated tools are introduced demonstrating how multiple simulation models can be linked and maintained to facilitate seamless integration throughout the design cycle.
Abstract:
We describe the formulation and evaluation of novel dissolving polymeric microneedle (MN) arrays for the facilitated delivery of low molecular weight, high-dose drugs. Ibuprofen sodium was used as the model drug here and was successfully formulated at approximately 50% w/w in the dry state using the copolymer poly(methylvinylether/maleic acid). These MNs were robust and effectively penetrated skin in vitro, dissolving rapidly to deliver the incorporated drug. The delivery of 1.5 mg ibuprofen sodium, the theoretical mass of ibuprofen sodium contained within the dry MN alone, was vastly exceeded, indicating extensive delivery of the drug loaded into the baseplates. Indeed, in in vitro transdermal delivery studies, approximately 33 mg (90%) of the drug initially loaded into the arrays was delivered over 24 h. Iontophoresis produced no meaningful increase in delivery. Biocompatibility studies and in vivo rat skin tolerance experiments raised no concerns. The blood plasma ibuprofen sodium concentrations achieved in rats (263 μg ml⁻¹ at the 24 h time point) were approximately 20 times greater than the human therapeutic plasma level. By simplistic extrapolation of average weights from rats to humans, a MN patch design of no greater than 10 cm² could cautiously be estimated to deliver therapeutically relevant concentrations of ibuprofen sodium in humans. This work, therefore, represents a significant progression in the exploitation of MNs for successful transdermal delivery of a much wider range of drugs.
Abstract:
The design optimization of cold-formed steel portal frame buildings is considered in this paper. The objective function is based on the cost of the members for the main frame and secondary members (i.e., purlins, girts, and cladding for walls and roofs) per unit area on the plan of the building. A real-coded niching genetic algorithm is used to minimize the cost of the frame and secondary members, which are designed on the basis of the ultimate limit state. It is shown that the proposed algorithm is effective and robust in generating the optimal solution, owing to the population's diversity being maintained by the niching method. In the optimal design, the cost of the purlins and side rails is shown to account for 25% of the total cost, the main frame members account for 27% of the total cost, and cladding for the walls and roofs accounts for 27% of the total cost.
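The mechanism the abstract relies on, a real-coded genetic algorithm whose population diversity is preserved by niching, can be sketched compactly. The sketch below does not reproduce the paper's objective (member costs per unit plan area under ultimate-limit-state design); it minimises a hypothetical one-dimensional multimodal cost, uses a simplified additive sharing penalty as the niching mechanism, and all parameter values are illustrative:

```python
import math
import random

def cost(x):
    """Toy multimodal stand-in for the frame cost function (hypothetical)."""
    return x * x + 10 * math.sin(3 * x)

def niche_counts(pop, share_radius):
    """Triangular sharing function: individuals closer than share_radius
    inflate each other's niche count, discouraging crowding."""
    return [sum(max(0.0, 1 - abs(a - b) / share_radius) for b in pop)
            for a in pop]

def niching_ga(generations=60, size=40, share_radius=0.5, seed=7):
    rng = random.Random(seed)
    pop = [rng.uniform(-5.0, 5.0) for _ in range(size)]
    best = min(pop, key=cost)
    for _ in range(generations):
        m = niche_counts(pop, share_radius)
        # Penalised cost: crowded individuals look worse, so selection
        # pressure spreads the population across niches (simplified sharing).
        shared = [cost(x) + 2.0 * (mi - 1.0) for x, mi in zip(pop, m)]
        nxt = [best]  # elitism on raw (unpenalised) cost
        while len(nxt) < size:
            i, j = rng.randrange(size), rng.randrange(size)
            p1 = pop[i] if shared[i] < shared[j] else pop[j]  # tournament
            i, j = rng.randrange(size), rng.randrange(size)
            p2 = pop[i] if shared[i] < shared[j] else pop[j]
            a = rng.random()                       # arithmetic crossover
            child = a * p1 + (1 - a) * p2
            child += rng.gauss(0.0, 0.3)           # Gaussian mutation
            nxt.append(min(max(child, -5.0), 5.0))  # clamp to bounds
        pop = nxt
        best = min(pop + [best], key=cost)
    return best

print("best design variable:", niching_ga())
```

Real-coded encoding (floats rather than bit strings) matches the continuous member-sizing variables of the original problem, and the sharing penalty is what keeps several candidate regions populated at once.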
Abstract:
An environment has been created for the optimisation of aerofoil profiles with the inclusion of small surface features. For TS-wave-dominated flows, the paper examines the consequences of adding a depression to the aerodynamic optimisation of an NLF aerofoil, and describes the geometry definition fidelity and optimisation algorithm employed in the development process. The variables that define the depression were fixed for this optimisation investigation; however, a preliminary study is presented demonstrating the sensitivity of the flow to the depression characteristics. Solutions to the optimisation problem are then presented using both gradient-based and genetic algorithm techniques. For accurate representation of the inclusion of small surface perturbations, it is concluded that a global optimisation method is required for this type of aerofoil optimisation task, owing to the nature of the response surface generated. When dealing with surface features, changes in transition onset are likely to be non-linear, so a robust optimisation algorithm is critical; for this framework, gradient-based methods alone are not suited.
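The conclusion that gradient-based methods alone are unsuited to a multimodal response surface can be illustrated on a stand-in problem. The one-dimensional function below is hypothetical (the aerofoil response surface itself is not reproducible here), and an exhaustive grid search stands in for the paper's genetic algorithm as the "global" method:

```python
import math

def response(x):
    """Hypothetical 1-D multimodal stand-in for the response surface."""
    return x * x - 10 * math.cos(2 * x)

def d_response(x):
    """Analytic derivative of response()."""
    return 2 * x + 20 * math.sin(2 * x)

def gradient_descent(x0, lr=0.01, steps=200):
    """Plain gradient descent: converges to whichever basin x0 sits in."""
    x = x0
    for _ in range(steps):
        x -= lr * d_response(x)
    return x

def grid_search(lo=-5.0, hi=5.0, n=1001):
    """Exhaustive sampling: crude but global, like the paper's GA in spirit."""
    xs = [lo + (hi - lo) * i / (n - 1) for i in range(n)]
    return min(xs, key=response)

local = gradient_descent(3.0)   # starts inside a shallow local basin
best = grid_search()            # sees the whole response surface
print(f"gradient descent from x=3.0 -> x={local:.3f}, f={response(local):.3f}")
print(f"global grid search          -> x={best:.3f}, f={response(best):.3f}")
```

Started at x = 3.0, the gradient method settles into the shallow local basin near x ≈ 3, while the global search locates the much deeper minimum near x = 0; that gap is exactly the failure mode the paper attributes to gradient-based optimisers on this class of problem.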