934 results for ROBUST DESIGN
Abstract:
This paper describes a methodology for deploying flexible dynamic configuration into embedded systems whilst preserving the reliability advantages of static systems. The methodology is based on the concept of decision points (DP) which are strategically placed to achieve fine-grained distribution of self-management logic to meet application-specific requirements. DP logic can be changed easily, and independently of the host component, enabling self-management behavior to be deferred beyond the point of system deployment. A transparent Dynamic Wrapper mechanism (DW) automatically detects and handles problems arising from the evaluation of self-management logic within each DP and ensures that the dynamic aspects of the system collapse down to statically defined default behavior to ensure safety and correctness despite failures. Dynamic context management contributes to flexibility, and removes the need for design-time binding of context providers and consumers, thus facilitating run-time composition and incremental component upgrade.
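The decision-point and Dynamic Wrapper idea can be sketched in a few lines. Everything below (the class name, the `evaluate`/`update_logic` methods, the buffer-size example) is illustrative, not the paper's actual API:

```python
# Hypothetical sketch of a decision point (DP) guarded by a Dynamic Wrapper (DW):
# dynamic self-management logic is evaluated inside a wrapper that collapses to a
# statically defined default whenever evaluation fails.

class DecisionPoint:
    def __init__(self, static_default, dynamic_logic=None):
        self.static_default = static_default   # safe behaviour fixed at design time
        self.dynamic_logic = dynamic_logic     # replaceable after deployment

    def update_logic(self, new_logic):
        """Swap in new self-management logic independently of the host component."""
        self.dynamic_logic = new_logic

    def evaluate(self, context):
        """Dynamic Wrapper: try the dynamic logic, fall back to the static default."""
        if self.dynamic_logic is None:
            return self.static_default(context)
        try:
            result = self.dynamic_logic(context)
            if result is None:                 # treat a missing decision as a failure
                raise ValueError("dynamic logic returned no decision")
            return result
        except Exception:
            return self.static_default(context)

# Usage: a DP that picks a buffer size; when the dynamic logic fails
# (missing context key), the wrapper collapses to the static default.
dp = DecisionPoint(static_default=lambda ctx: 64)
dp.update_logic(lambda ctx: ctx["load"] * 16)
print(dp.evaluate({"load": 8}))   # → 128 (dynamic logic applies)
print(dp.evaluate({}))            # → 64 (KeyError, static default)
```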
Abstract:
After years of emphasis on leanness and responsiveness, businesses are now experiencing their vulnerability to supply chain disturbances. Although more literature is appearing on this subject, there is a need for an integrated framework to support the analysis and design of robust food supply chains. In this chapter we present such a framework. We define the concept of robustness and classify supply chain disturbances, sources of food supply chain vulnerability, and adequate redesign principles and strategies to achieve robust supply chain performances. To test and illustrate its applicability, the research framework is applied to a meat supply chain.
Abstract:
The operation of supply chains (SCs) has for many years been focused on efficiency, leanness and responsiveness. This has resulted in reduced slack in operations, compressed cycle times, increased productivity and minimised inventory levels along the SC. Combined with tight tolerance settings for the realisation of logistics and production processes, this has led to SC performances that are frequently not robust. SCs are becoming increasingly vulnerable to disturbances, which can decrease the competitive power of the entire chain in the market. Moreover, in the case of food SCs non-robust performances may ultimately result in empty shelves in grocery stores and supermarkets.
The overall objective of this research is to contribute to Supply Chain Management (SCM) theory by developing a structured approach to assess SC vulnerability, so that robust performances of food SCs can be assured. We also aim to help companies in the food industry to evaluate their current state of vulnerability, and to improve their performance robustness through a better understanding of vulnerability issues. The following research questions (RQs) stem from these objectives:
RQ1: What are the main research challenges related to (food) SC robustness?
RQ2: What are the main elements that have to be considered in the design of robust SCs and what are the relationships between these elements?
RQ3: What is the relationship between the contextual factors of food SCs and the use of disturbance management principles?
RQ4: How to systematically assess the impact of disturbances in (food) SC processes on the robustness of (food) SC performances?
To answer these RQs we used different methodologies, both qualitative and quantitative. For each question, we conducted a literature survey to identify gaps in existing research and to establish the state of the art on the related topics. For the second and third RQs, we conducted both exploration and testing on selected case studies. Finally, to obtain more detailed answers to the fourth question, we used simulation modelling and scenario analysis for vulnerability assessment.
Main findings are summarised as follows.
Based on an extensive literature review, we answered RQ1. The main research challenges were related to the need to define SC robustness more precisely, to identify and classify disturbances and their causes in the context of the specific characteristics of SCs and to make a systematic overview of (re)design strategies that may improve SC robustness. Also, we found that it is useful to be able to discriminate between varying degrees of SC vulnerability and to find a measure that quantifies the extent to which a company or SC shows robust performances when exposed to disturbances.
To address RQ2, we define SC robustness as the degree to which a SC shows an acceptable performance in (each of) its Key Performance Indicators (KPIs) during and after an unexpected event that caused a disturbance in one or more logistics processes. Based on the SCM literature we identified the main elements needed to achieve robust performances and structured them together to form a conceptual framework for the design of robust SCs. We then explain the logic of the framework and elaborate on each of its main elements: the SC scenario, SC disturbances, SC performance, sources of food SC vulnerability, and redesign principles and strategies.
Based on three case studies, we answered RQ3. Our major findings show that the contextual factors have a consistent relationship to Disturbance Management Principles (DMPs). The product and SC environment characteristics are contextual factors that are hard to change and these characteristics initiate the use of specific DMPs as well as constrain the use of potential response actions. The process and the SC network characteristics are contextual factors that are easier to change, and they are affected by the use of the DMPs. We also found a notable relationship between the type of DMP likely to be used and the particular combination of contextual factors present in the observed SC.
To address RQ4, we presented a new method for vulnerability assessments, the VULA method. The VULA method helps to identify how much a company is underperforming on a specific Key Performance Indicator (KPI) in the case of a disturbance, how often this would happen and how long it would last. It ultimately informs the decision maker about whether process redesign is needed and what kind of redesign strategies should be used in order to increase the SC’s robustness. The VULA method is demonstrated in the context of a meat SC using discrete-event simulation. The case findings show that performance robustness can be assessed for any KPI using the VULA method.
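The kind of statistics the VULA method reports for a single KPI can be illustrated with a short sketch. The function name, the threshold convention and the example service-level series are assumptions for illustration, not the thesis's implementation:

```python
# Illustrative sketch: summarise how far a KPI drops below an acceptable level
# during disturbances, how often this happens, and how long each episode lasts.

def vulnerability_profile(kpi_series, acceptable_level):
    """Summarise underperformance episodes in a per-period KPI time series."""
    episodes = []            # (duration, worst_shortfall) per disturbance episode
    duration, worst = 0, 0.0
    for value in kpi_series:
        if value < acceptable_level:
            duration += 1
            worst = max(worst, acceptable_level - value)
        elif duration:       # episode just ended: record it and reset
            episodes.append((duration, worst))
            duration, worst = 0, 0.0
    if duration:             # close an episode still open at the end of the series
        episodes.append((duration, worst))
    return {
        "episodes": len(episodes),
        "max_duration": max((d for d, _ in episodes), default=0),
        "max_shortfall": max((s for _, s in episodes), default=0.0),
    }

# Example: weekly service level (%), acceptable level 95%.
profile = vulnerability_profile([97, 96, 93, 90, 96, 94, 97], acceptable_level=95)
print(profile)  # 2 episodes; the worst lasts 2 weeks with a 5-point shortfall
```

Such a profile is what would inform the decision maker whether a process redesign is warranted for that KPI.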
To sum up the project, all findings were incorporated within an integrated framework for designing robust SCs. The integrated framework consists of the following steps: 1) Description of the SC scenario and identification of its specific contextual factors; 2) Identification of disturbances that may affect KPIs; 3) Definition of the relevant KPIs and identification of the main disturbances through assessment of the SC performance robustness (i.e. application of the VULA method); 4) Identification of the sources of vulnerability that may (strongly) affect the robustness of performances and eventually increase the vulnerability of the SC; 5) Identification of appropriate preventive or impact-reducing redesign strategies; 6) Alteration of SC scenario elements as required by the selected redesign strategies, repeating the VULA method for the KPIs defined in Step 3.
Contributions of this research are listed as follows. First, we have identified emerging research areas - SC robustness, and its counterpart, vulnerability. Second, we have developed a definition of SC robustness, operationalized it, and identified and structured the relevant elements for the design of robust SCs in the form of a research framework. With this research framework, we contribute to a better understanding of the concepts of vulnerability and robustness and related issues in food SCs. Third, we identified the relationship between contextual factors of food SCs and specific DMPs used to maintain robust SC performances: characteristics of the product and the SC environment influence the selection and use of DMPs; processes and SC networks are influenced by DMPs. Fourth, we developed specific metrics for vulnerability assessments, which serve as a basis of a VULA method. The VULA method investigates different measures of the variability of both the duration of impacts from disturbances and the fluctuations in their magnitude.
With this project, we also hope to have delivered practical insights into food SC vulnerability. First, the integrated framework for the design of robust SCs can be used to guide food companies in successful disturbance management. Second, empirical findings from case studies lead to the identification of changeable characteristics of SCs that can serve as a basis for assessing where to focus efforts to manage disturbances. Third, the VULA method can help top management to get more reliable information about the “health” of the company.
The two most important research opportunities are: First, there is a need to extend and validate our findings related to the research framework and contextual factors through further case studies related to other types of (food) products and other types of SCs. Second, there is a need to further develop and test the VULA method, e.g.: to use other indicators and statistical measures for disturbance detection and SC improvement; to define the most appropriate KPI to represent the robustness of a complete SC. We hope this thesis invites other researchers to pick up these challenges and help us further improve the robustness of (food) SCs.
Abstract:
This paper reports on a design study assessing the impact of laminate manufacturing constraints on the structural performance and weight of composite stiffened panels. The study demonstrates that maximizing ply continuity results in weight penalties, while various geometric constraints related to manufacture and repair can be accommodated without significant weight penalties, potentially generating robust flexible designs.
Abstract:
There is a requirement for better integration between design and analysis tools, which is difficult due to their different objectives, separate data representations and workflows. Currently, substantial effort is required to produce a suitable analysis model from design geometry. Robust links are required between these different representations to enable analysis attributes to be transferred between different design and analysis packages for models at various levels of fidelity.
This paper describes a novel approach for integrating design and analysis models by identifying and managing the relationships between the different representations. Three key technologies, Cellular Modeling, Virtual Topology and Equivalencing, have been employed to achieve effective simulation model management. These technologies and their implementation are discussed in detail. Prototype automated tools are introduced demonstrating how multiple simulation models can be linked and maintained to facilitate seamless integration throughout the design cycle.
Abstract:
We describe the formulation and evaluation of novel dissolving polymeric microneedle (MN) arrays for the facilitated delivery of low molecular weight, high dose drugs. Ibuprofen sodium was used as the model drug here and was successfully formulated at approximately 50% w/w in the dry state using the copolymer poly(methylvinylether/maleic acid). These MNs were robust and effectively penetrated skin in vitro, dissolving rapidly to deliver the incorporated drug. The delivery of 1.5 mg ibuprofen sodium, the theoretical mass of ibuprofen sodium contained within the dry MNs alone, was vastly exceeded, indicating extensive delivery of the drug loaded into the baseplates. Indeed, in in vitro transdermal delivery studies, approximately 33 mg (90%) of the drug initially loaded into the arrays was delivered over 24 h. Iontophoresis produced no meaningful increase in delivery. Biocompatibility studies and in vivo rat skin tolerance experiments raised no concerns. The blood plasma ibuprofen sodium concentrations achieved in rats (263 μg ml⁻¹ at the 24 h time point) were approximately 20 times greater than the human therapeutic plasma level. By simplistic extrapolation of average weights from rats to humans, a MN patch design of no greater than 10 cm² could cautiously be estimated to deliver therapeutically relevant concentrations of ibuprofen sodium in humans. This work, therefore, represents a significant progression in the exploitation of MNs for successful transdermal delivery of a much wider range of drugs.
Abstract:
The design optimization of cold-formed steel portal frame buildings is considered in this paper. The objective function is based on the cost of the members for the main frame and secondary members (i.e., purlins, girts, and cladding for walls and roofs) per unit area on the plan of the building. A real-coded niching genetic algorithm is used to minimize the cost of the frame and secondary members, which are designed on the basis of the ultimate limit state. It is shown that the proposed algorithm is effective and robust in generating the optimal solution, owing to the population's diversity being maintained by the niching method. In the optimal design, the cost of purlins and side rails is shown to account for 25% of the total cost, the main frame members for 27%, and the cladding for the walls and roofs for 27%.
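A real-coded GA with niching by fitness sharing, the general technique named above, can be sketched as follows. This is a generic one-dimensional illustration with an assumed test function, not the authors' frame cost model (cost minimisation maps to maximising the negated cost):

```python
# Minimal sketch of a real-coded genetic algorithm with niching by fitness
# sharing: crowded individuals have their fitness divided by a niche count,
# which maintains population diversity across the search space.

import random

def sharing_factor(a, b, sigma=0.2):
    """Triangular sharing kernel: 1 at zero distance, 0 beyond the niche radius."""
    d = abs(a - b)
    return 1 - d / sigma if d < sigma else 0.0

def shared_fitness(pop, raw):
    """Divide raw fitness by the niche count (the self-term keeps it nonzero)."""
    return [raw[i] / sum(sharing_factor(pop[i], x) for x in pop)
            for i in range(len(pop))]

def niching_ga(fitness, lo, hi, pop_size=40, generations=60):
    pop = [random.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        raw = [fitness(x) for x in pop]
        shared = shared_fitness(pop, raw)
        def pick():  # binary tournament on shared fitness
            i, j = random.randrange(pop_size), random.randrange(pop_size)
            return pop[i] if shared[i] > shared[j] else pop[j]
        children = []
        for _ in range(pop_size):
            p, q = pick(), pick()
            c = p + random.uniform(-0.5, 1.5) * (q - p)   # blend-style crossover
            c += random.gauss(0, 0.01 * (hi - lo))        # small Gaussian mutation
            children.append(min(hi, max(lo, c)))          # clamp to the domain
        pop = children
    return pop

# Usage on a simple assumed test function with its peak at x = 1.
pop = niching_ga(lambda x: -(x - 1) ** 2 + 1, lo=0.0, hi=2.0)
```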
Abstract:
An environment has been created for the optimisation of aerofoil profiles with the inclusion of small surface features. For TS-wave-dominated flows, the paper examines the consequences of adding a depression on the aerodynamic optimisation of an NLF aerofoil, and describes the geometry definition fidelity and optimisation algorithm employed in the development process. The variables that define the depression have been fixed for this optimisation investigation; however, a preliminary study is presented demonstrating the sensitivity of the flow to the depression characteristics. Solutions to the optimisation problem are then presented using both gradient-based and genetic algorithm techniques. For accurate representation of the inclusion of small surface perturbations, it is concluded that a global optimisation method is required for this type of aerofoil optimisation task, owing to the nature of the response surface generated. When dealing with surface features, changes in transition onset are likely to be non-linear, so a robust optimisation algorithm is critical, suggesting that gradient-based methods alone are not suited to this framework.
Abstract:
Motivated by the need for designing efficient and robust fully-distributed computation in highly dynamic networks such as Peer-to-Peer (P2P) networks, we study distributed protocols for constructing and maintaining dynamic network topologies with good expansion properties. Our goal is to maintain a sparse (bounded-degree) expander topology despite heavy {\em churn} (i.e., nodes joining and leaving the network continuously over time). We assume that the churn is controlled by an adversary that has complete knowledge and control of what nodes join and leave and at what time, and has unlimited computational power, but is oblivious to the random choices made by the algorithm. Our main contribution is a randomized distributed protocol that guarantees with high probability the maintenance of a {\em constant} degree graph with {\em high expansion} even under {\em continuous high adversarial} churn. Our protocol can tolerate a churn rate of up to $O(n/\mathrm{poly}\log(n))$ per round (where $n$ is the stable network size). Our protocol is efficient, lightweight, and scalable, and it incurs only $O(\mathrm{poly}\log(n))$ overhead for topology maintenance: only polylogarithmic (in $n$) bits need to be processed and sent by each node per round, and any node's computation cost per round is also polylogarithmic. The given protocol is a fundamental ingredient needed for the design of efficient fully-distributed algorithms for solving fundamental distributed computing problems such as agreement, leader election, search, and storage in highly dynamic P2P networks, and it enables fast and scalable algorithms for these problems that can tolerate a large amount of churn.
Abstract:
Physically Unclonable Functions (PUFs) exploit inherent manufacturing variations and present a promising solution for hardware security. They can be used for key storage, authentication and ID generation. Low-power cryptographic design is also very important for security applications. However, digital PUF designs proposed to date, such as Arbiter PUFs and RO PUFs, are not very efficient: they are difficult to implement on Field Programmable Gate Arrays (FPGAs) or consume many FPGA hardware resources. In previous work, a new and efficient PUF identification generator was presented for FPGAs, designed to fit in a single slice per response bit by using a 1-bit PUF identification generator cell formed as a hard macro. In this work, we propose an ultra-compact PUF identification generator design, implemented on ten low-cost Xilinx Spartan-6 FPGA LX9 microboards. The resource utilization is only 2.23%, which, to the best of the authors' knowledge, makes this the most compact and robust FPGA-based PUF identification generator design reported to date. The design delivers stable uniqueness of around 50% and good reliability of between 85% and 100%.
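The uniqueness and reliability figures quoted above are conventionally computed as inter-device and intra-device Hamming distances. The sketch below illustrates those two standard metrics on made-up 8-bit responses; all names and bit patterns are assumptions for illustration:

```python
# Standard PUF quality metrics: uniqueness (average pairwise inter-device
# Hamming distance, ideally ~50%) and reliability (agreement of repeated
# responses from one device under noise, ideally ~100%).

from itertools import combinations

def hamming_pct(a, b):
    """Percentage of differing bits between two equal-length response strings."""
    return 100.0 * sum(x != y for x, y in zip(a, b)) / len(a)

def uniqueness(responses):
    """Average pairwise Hamming distance across devices (ideal: 50%)."""
    pairs = list(combinations(responses, 2))
    return sum(hamming_pct(a, b) for a, b in pairs) / len(pairs)

def reliability(reference, repeated):
    """100% minus the average intra-device distance over re-reads (ideal: 100%)."""
    return 100.0 - sum(hamming_pct(reference, r) for r in repeated) / len(repeated)

# Example: 8-bit IDs from three devices, plus noisy re-reads of device 0.
ids = ["10110010", "01101100", "11010101"]
rereads = ["10110010", "10110011"]      # one flipped bit in the second re-read
print(uniqueness(ids))                  # inter-device average; near 50% is ideal
print(reliability(ids[0], rereads))     # → 93.75
```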
Abstract:
Electrochemical water splitting for hydrogen generation has attracted increasing attention due to energy and environmental issues. It remains a major challenge to design an efficient, robust and inexpensive electrocatalyst that achieves preferable catalytic performance. Herein, a novel three-dimensional (3D) electrocatalyst was prepared by decorating nanostructured biological-material-derived carbon nanofibers with in situ generated cobalt-based nanospheres (denoted CNF@Co) through a facile approach. The interconnected porous 3D networks of the resulting CNF@Co catalyst provide abundant channels and interfaces, which remarkably favor both mass transfer and oxygen evolution. The as-prepared CNF@Co shows excellent electrocatalytic activity towards the oxygen evolution reaction, with an onset potential of about 0.445 V vs. Ag/AgCl. It needs an overpotential of only 314 mV to achieve a current density of 10 mA/cm² in 1.0 M KOH. Furthermore, the CNF@Co catalyst exhibits excellent stability towards water oxidation, even outperforming commercial IrO₂ and RuO₂ catalysts.
Abstract:
This paper builds on previous work to show how holistic and iterative design optimisation tools can be used to produce a commercially viable product that reduces a costly assembly to a single moulded structure. An assembly consisting of a structural metallic support and a compression-moulded outer shell undergoes design optimisation and analysis to remove the support from the assembly process in favour of a structural moulding. The support is analysed and a sheet moulding compound (SMC) alternative is presented; this is then combined into a manufacturable shell design, which is assessed for viability as an alternative to the original.
Alongside this, a robust material selection system is implemented that removes user bias towards particular materials. The system builds on work set out by the Cambridge Material Selector and by Boothroyd and Dewhurst, using a selection of applicable materials currently available for the compression moulding process. This material selection process has been linked into the design and analysis stage via scripts for use in the finite element environment, building towards an analysis toolkit intended to develop and enhance the manufacturability of design studies.
Abstract:
This paper presents a new encryption scheme implemented at the physical layer of wireless networks employing orthogonal frequency-division multiplexing (OFDM). The new scheme obfuscates the subcarriers by randomly reserving several subcarriers for dummy data and resequences the training symbol by a new secure sequence. Subcarrier obfuscation renders the OFDM transmission more secure and random, while training symbol resequencing protects the entire physical layer packet, but does not affect the normal functions of synchronization and channel estimation of legitimate users while preventing eavesdroppers from performing these functions. The security analysis shows the system is robust to various attacks by analyzing the search space using an exhaustive key search. Our scheme is shown to have a better performance in terms of search space, key rate and complexity in comparison with other OFDM physical layer encryption schemes. The scheme offers options for users to customize the security level and key rate according to the hardware resource. Its low complexity nature also makes the scheme suitable for resource limited devices. Details of practical design considerations are highlighted by applying the approach to an IEEE 802.11 OFDM system case study.
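The subcarrier-obfuscation idea can be illustrated with a toy sketch in which a shared key seeds a PRNG that selects the dummy subcarriers. All details below (function names, a Python PRNG standing in for a proper keyed PRF, the 64-subcarrier layout) are assumptions for illustration, not the paper's scheme:

```python
# Toy sketch of subcarrier obfuscation: key holders derive the same
# pseudo-random set of reserved dummy subcarriers, interleave real symbols
# around them on transmit, and discard them on receive.

import random

def dummy_subcarrier_map(key, n_subcarriers=64, n_dummy=8):
    """Derive the key-dependent set of subcarriers reserved for dummy data."""
    rng = random.Random(key)           # stand-in for a proper keyed KDF/PRF
    return set(rng.sample(range(n_subcarriers), n_dummy))

def obfuscate(symbols, key, n_subcarriers=64, n_dummy=8):
    """Place real symbols on non-dummy subcarriers, random data on the rest."""
    dummies = dummy_subcarrier_map(key, n_subcarriers, n_dummy)
    rng, it = random.Random(key ^ 1), iter(symbols)
    return [rng.random() if k in dummies else next(it)
            for k in range(n_subcarriers)]

def deobfuscate(frame, key, n_subcarriers=64, n_dummy=8):
    """A legitimate receiver discards the dummy subcarriers it can locate."""
    dummies = dummy_subcarrier_map(key, n_subcarriers, n_dummy)
    return [frame[k] for k in range(n_subcarriers) if k not in dummies]

symbols = [float(i) for i in range(56)]          # 64 - 8 real payload symbols
frame = obfuscate(symbols, key=0xC0FFEE)
assert deobfuscate(frame, key=0xC0FFEE) == symbols   # round-trips with the key
```

An eavesdropper without the key does not know which subcarriers carry dummy data, which is the intuition behind the enlarged search space the paper analyses (the real scheme additionally resequences the training symbol, which this sketch omits).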
Abstract:
The potential of human adenovirus vectors as vehicles for gene transfer with clinical applications in vaccination, cancer treatment and in many monogenic and acquired diseases has been demonstrated in several studies and clinical trials. However, the clinical use of these vectors can be limited by pre-existing humoral and cellular anti-capsid immunity. One way to circumvent this bottleneck while keeping the advantages of using adenovirus vectors is using non-human viruses such as Canine Adenovirus type 2 (CAV-2). Moreover, CAV-2 vectors present attractive features to develop potential treatment of neurodegenerative and ocular disorders. While the interest in CAV-2 vectors increases, scalable and robust production processes are required to meet the need for preclinical and possibly clinical uses.(...)
Abstract:
The report addresses the question of what the preferences of broadband consumers on the Portuguese telecommunications market are. A triple-play bundle is investigated. The discrete choice analysis adopted in the study is based on 110 responses, mainly from NOVA students. The data for the analysis was collected via a manually designed online survey. The results show that the price attribute is relatively the most important one, while the television attribute is overlooked in the decision-making process. The main effects examined in the research are robust. In addition, "extras" components are tested in terms of users' preferences.