884 results for G520 Systems Design Methodologies


Relevance:

40.00%

Publisher:

Abstract:

This paper describes the design and development cycle of a 3D biochip separator and the modelling analysis of flow behaviour in the biochip microchannel features. The focus is on identifying the difference between 2D and 3D implementations as well as developing basic forms of 3D microfluidic separators. Five variants based around the device are proposed and analysed. These include three variations of the branch channels (circular, rectangular, disc) and two variations of the main channel (solid and concentric). Ignoring the initial transient behaviour and assuming steady-state flow has been established, the efficiencies of the flow between the main and side channels for the different designs are analysed and compared with regard to relevant biomicrofluidic laws or effects (bifurcation law, Fahraeus effect, cell-free phenomenon, bending channel effect and laminar flow behaviour). The modelling results identify flow features in microchannels, a constriction, and bifurcations, and show detailed differences in flow fields between the various designs. The manufacturing process using injection moulding for the initial base case design is also presented and discussed. The work reported here is supported as part of the UK-funded 3D-MINTEGRATION project. © 2010 IEEE.
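The bifurcation law invoked above can be illustrated with a minimal sketch, assuming steady-state laminar Poiseuille flow in circular channels; the channel dimensions, viscosity, and function names below are invented for illustration and are not taken from the paper.

```python
import math

def hydraulic_resistance(length_m, radius_m, viscosity_pa_s=1e-3):
    """Poiseuille resistance of a circular microchannel: R = 8*mu*L / (pi*r^4)."""
    return 8 * viscosity_pa_s * length_m / (math.pi * radius_m ** 4)

def bifurcation_split(q_in, r_branch_a, r_branch_b):
    """At a bifurcation whose branches share an outlet pressure, flow divides
    in proportion to branch conductance (parallel-resistor analogy)."""
    g_a, g_b = 1.0 / r_branch_a, 1.0 / r_branch_b
    q_a = q_in * g_a / (g_a + g_b)
    return q_a, q_in - q_a

# A side branch with half the radius has 16x the resistance (r^4 scaling),
# so it draws only 1/17 of the inlet flow.
r_main = hydraulic_resistance(1e-3, 50e-6)
r_side = hydraulic_resistance(1e-3, 25e-6)
q_main, q_side = bifurcation_split(1.0, r_main, r_side)
```

The strong r⁴ dependence is one reason small geometric differences between branch-channel variants can produce markedly different flow splits.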

Relevance:

40.00%

Publisher:

Abstract:

Design optimisation of compressor systems is a computationally expensive problem due to the large number of variables, complicated design space and expense of the analysis tools. One approach to reduce the expense of the process and make it achievable in industrial timescales is to employ multi-fidelity techniques, which utilise more rapid tools in conjunction with the highest fidelity analyses. The complexity of the compressor design landscape is such that the starting point for these optimisations can influence the achievable results; these starting points are often existing (optimised) compressor designs, which form a limited set in terms of both quantity and diversity of the design. To facilitate the multi-fidelity optimisation procedure, a compressor synthesis code was developed which allowed the performance attributes (e.g. stage loadings, inlet conditions) to be stipulated, enabling the generation of a variety of compressors covering a range of both design topology and quality to act as seeding geometries for the optimisation procedures. Analysis of the performance of the multi-fidelity optimisation system when restricting its exploration space to topologically different areas of the design space indicated little advantage over allowing the system to search the design space itself. However, comparing results from optimisations started from seed designs with different aerodynamic qualities indicated that an improved performance could be achieved by starting an optimisation from a higher quality point, and thus that the choice of starting point did affect the final outcome of the optimisations. Both investigations indicated that the performance gains through the optimisation were largely defined by the early exploration of the design space where the multi-fidelity speedup could be exploited, and thus extending this region is likely to have the greatest effect on performance of the optimisation system.
© 2013 by the American Institute of Aeronautics and Astronautics, Inc. All rights reserved.
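The multi-fidelity idea described above (rapid tools screening many candidates before the expensive analysis is invoked) can be sketched as follows; the objective function, fidelity gap, and seeding are invented placeholders, not the paper's compressor models.

```python
import random

# "Expensive" high-fidelity truth vs. a cheap, slightly biased approximation.
def high_fidelity(x):
    return (x - 0.3) ** 2

def low_fidelity(x):
    return (x - 0.3) ** 2 + 0.05 * x

def multi_fidelity_search(seeds, n_promote=3):
    """Screen every seed design with the rapid tool, then pass only the
    shortlist to the highest-fidelity analysis."""
    screened = sorted(seeds, key=low_fidelity)
    return min(screened[:n_promote], key=high_fidelity)

# Seed geometries spanning a range of design quality.
random.seed(0)
seeds = [random.uniform(0.0, 1.0) for _ in range(50)]
best = multi_fidelity_search(seeds)
```

The early low-fidelity screening is where the speedup is realised, which mirrors the paper's observation that extending the cheap exploration phase has the greatest effect on overall optimiser performance.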

Relevance:

40.00%

Publisher:

Abstract:

Design optimisation of compressor systems is a computationally expensive problem due to the large number of variables, complicated design space and expense of the analysis tools. One approach to reduce the expense of the process and make it achievable in industrial timescales is to employ multi-fidelity techniques, which utilise more rapid tools in conjunction with the highest fidelity analyses. The complexity of the compressor design landscape is such that the starting point for these optimisations can influence the achievable results; these starting points are often existing (optimised) compressor designs, which form a limited set in terms of both quantity and diversity of the design. To facilitate the multi-fidelity optimisation procedure, a compressor synthesis code was developed which allowed the performance attributes (e.g. stage loadings, inlet conditions) to be stipulated, enabling the generation of a variety of compressors covering a range of both design topology and quality to act as seeding geometries for the optimisation procedures. Analysis of the performance of the multi-fidelity optimisation system when restricting its exploration space to topologically different areas of the design space indicated little advantage over allowing the system to search the design space itself. However, comparing results from optimisations started from seed designs with different aerodynamic qualities indicated that an improved performance could be achieved by starting an optimisation from a higher quality point, and thus that the choice of starting point did affect the final outcome of the optimisations. Both investigations indicated that the performance gains through the optimisation were largely defined by the early exploration of the design space where the multi-fidelity speedup could be exploited, and thus extending this region is likely to have the greatest effect on performance of the optimisation system. © 2012 AIAA.

Relevance:

40.00%

Publisher:

Abstract:

Due to concerns about environmental protection and resource utilization, product lifecycle management for end-of-life (EOL) has received increasing attention in many industrial sectors including manufacturing, maintenance/repair, and recycling/refurbishing of the product. To support these functions, crucial issues are studied to realize a product recovery management system (PRMS), including: (1) an architecture design for EOL services, such as remanufacturing and recycling; (2) a product data model required for EOL activity based on international standards; and (3) an infrastructure for information acquisition and mapping to product lifecycle information. The presented work is illustrated via a realistic scenario. © 2008 Elsevier B.V. All rights reserved.

Relevance:

40.00%

Publisher:

Abstract:

New robotics is an approach to robotics that, in contrast to traditional robotics, employs ideas and principles from biology. While in the traditional approach there are generally accepted methods (e.g., from control theory), designing agents in the new robotics approach is still largely considered an art. In recent years, we have been developing a set of heuristics, or design principles, that on the one hand capture theoretical insights about intelligent (adaptive) behavior, and on the other provide guidance in actually designing and building systems. In this article we provide an overview of all the principles but focus on the principles of ecological balance, which concerns the relation between environment, morphology, materials, and control, and sensory-motor coordination, which concerns self-generated sensory stimulation as the agent interacts with the environment and which is a key to the development of high-level intelligence. As we argue, artificial evolution together with morphogenesis is not only "nice to have" but is in fact a necessary tool for designing embodied agents.

Relevance:

40.00%

Publisher:

Abstract:

Malicious software (malware) has significantly increased in both number and effectiveness over the past years. Until 2006, such software was mostly used to disrupt network infrastructures or to show off coders' skills. Nowadays, malware constitutes a very important source of economic profit and is very difficult to detect. Thousands of novel variants are released every day, and modern obfuscation techniques are used to ensure that signature-based anti-malware systems cannot detect such threats. This tendency has also appeared on mobile devices, with Android being the most targeted platform. To counteract this phenomenon, the scientific community has developed many approaches that attempt to increase the resilience of anti-malware systems. Most of these approaches rely on machine learning and have also become very popular in commercial applications. However, attackers are now knowledgeable about these systems and have started preparing their countermeasures. This has led to an arms race between attackers and developers. Novel systems are progressively built to tackle attacks that get more and more sophisticated. For this reason, developers increasingly need to anticipate the attackers' moves. This means that defense systems should be built proactively, i.e., by introducing security design principles into their development. The main goal of this work is to show that such a proactive approach can be employed in a number of case studies. To do so, I adopted a global methodology that can be divided into two steps: first, understanding the vulnerabilities of current state-of-the-art systems (this anticipates the attackers' moves); then, developing novel systems that are robust to these attacks, or suggesting research guidelines with which current systems can be improved. This work presents two main case studies, concerning the detection of PDF and Android malware.
The idea is to show that a proactive approach can be applied both in the x86 and the mobile world. The contributions provided in these two case studies are manifold. With respect to PDF files, I first develop novel attacks that can empirically and optimally evade current state-of-the-art detectors. Then, I propose possible solutions with which it is possible to increase the robustness of such detectors against known and novel attacks. With respect to the Android case study, I first show how current signature-based tools and academically developed systems are weak against empirical obfuscation attacks, which can be easily employed without particular knowledge of the targeted systems. Then, I examine a possible strategy to build a machine learning detector that is robust against both empirical obfuscation and optimal attacks. Finally, I show how proactive approaches can also be employed to develop systems that are not aimed at detecting malware, such as mobile fingerprinting systems. In particular, I propose a methodology to build a powerful mobile fingerprinting system and examine possible attacks with which users might be able to evade it, thus preserving their privacy. To provide the aforementioned contributions, I co-developed (in cooperation with researchers at PRALab and Ruhr-Universität Bochum) various systems: a library to perform optimal attacks against machine learning systems (AdversariaLib), a framework for automatically obfuscating Android applications, a system for the robust detection of JavaScript malware inside PDF files (LuxOR), a robust machine learning system for the detection of Android malware, and a system to fingerprint mobile devices. I also contributed to the development of Android PRAGuard, a dataset containing numerous empirical obfuscation attacks against the Android platform. Finally, I entirely developed Slayer NEO, an evolution of a previous system for the detection of PDF malware.
The results attained by using the aforementioned tools show that it is possible to proactively build systems that predict possible evasion attacks. This suggests that a proactive approach is crucial to build systems that provide concrete security against general and evasion attacks.
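As one illustration of the kind of evasion studied in this line of work, the sketch below greedily zeroes out the highest-weight features of a sample to push a linear malware score below the decision threshold; the weights, the sample, and the `evade` helper are hypothetical and are not AdversariaLib's actual API.

```python
def score(x, w, b):
    """Linear decision function: positive means 'malicious'."""
    return sum(wi * xi for wi, xi in zip(w, x)) + b

def evade(x, w, b, budget):
    """Greedily remove the features that contribute most to the malicious
    score, stopping when the score turns negative or the budget runs out."""
    x = list(x)
    order = sorted(range(len(w)), key=lambda i: -w[i] * x[i])
    for i in order:
        if score(x, w, b) < 0 or budget == 0:
            break
        if w[i] * x[i] > 0:  # only modifications that lower the score
            x[i] = 0
            budget -= 1
    return x

w = [2.0, 1.5, -0.5, 0.2]   # hypothetical learned weights
b = -1.0
sample = [1, 1, 1, 1]        # scores 2.2 -> classified malicious
adv = evade(sample, w, b, budget=2)
```

A robust detector, in this framing, is one for which no small-budget modification of this kind can flip the decision, which is what the proactive design approach tries to guarantee in advance.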

Relevance:

40.00%

Publisher:

Abstract:

Eckerdal, A., Ratcliffe, M., McCartney, R., Moström, J.E., and Zander, C. "Can Graduating Students Design Software Systems?" Proc. 37th SIGCSE Technical Symposium on Computer Science Education, 2006.

Relevance:

40.00%

Publisher:

Abstract:

In the last decade, we have witnessed the emergence of large, warehouse-scale data centres which have enabled new internet-based software applications such as cloud computing, search engines, social media, e-government etc. Such data centres consist of large collections of servers interconnected using short-reach (up to a few hundred meters) optical interconnect. Today, transceivers for these applications achieve up to 100Gb/s by multiplexing 10x 10Gb/s or 4x 25Gb/s channels. In the near future, however, data centre operators have expressed a need for optical links which can support 400Gb/s up to 1Tb/s. The crucial challenge is to achieve this in the same footprint (same transceiver module) and with similar power consumption as today's technology. Straightforward scaling of the currently used space or wavelength division multiplexing may be difficult to achieve: indeed, a 1Tb/s transceiver would require integration of 40 VCSELs (vertical cavity surface emitting laser diodes, widely used for short-reach optical interconnect), 40 photodiodes and the electronics operating at 25Gb/s in the same module as today's 100Gb/s transceiver. Pushing the bit rate on such links beyond today's commercially available 100Gb/s/fibre will require new generations of VCSELs and their driver and receiver electronics. This work looks into a number of state-of-the-art technologies, investigates their performance constraints, and recommends different sets of designs, specifically targeting multilevel modulation formats. Several methods to extend the bandwidth using deep submicron (65nm and 28nm) CMOS technology are explored in this work, while also maintaining a focus upon reducing power consumption and chip area. The techniques used were pre-emphasis on the rising and falling edges of the signal and bandwidth extension by inductive peaking and different local feedback techniques.
These techniques have been applied to a transmitter and receiver developed for advanced modulation formats such as PAM-4 (4-level pulse amplitude modulation). Such a modulation format increases the throughput per individual channel, which helps to overcome the challenges mentioned above in realizing 400Gb/s to 1Tb/s transceivers.
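The PAM-4 format mentioned above can be sketched in a few lines: each symbol carries two bits on one of four amplitude levels, doubling throughput per channel relative to binary NRZ signalling. The Gray mapping and level values below are a common convention, assumed here purely for illustration.

```python
# Gray-coded PAM-4: adjacent amplitude levels differ by exactly one bit,
# which limits the bit errors caused by a single-level decision error.
GRAY_PAM4 = {(0, 0): -3, (0, 1): -1, (1, 1): +1, (1, 0): +3}

def pam4_encode(bits):
    """Consume bits two at a time and emit one amplitude level per pair."""
    assert len(bits) % 2 == 0, "PAM-4 consumes bits in pairs"
    return [GRAY_PAM4[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

bits = [0, 0, 0, 1, 1, 1, 1, 0]
symbols = pam4_encode(bits)   # 8 bits become 4 symbols
```

Because four levels must fit in the same voltage swing as two, the per-level eye opening shrinks, which is why the bandwidth-extension and pre-emphasis techniques above matter for PAM-4 transceivers.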

Relevance:

40.00%

Publisher:

Abstract:

Given the flood of data and its time-criticality in the organizational context, decision makers are challenged to choose an appropriate decision alternative in a given situation. In particular, operational actors face the challenge of making business-critical decisions in a short time and at high frequency. The construct of Situation Awareness (SA) has been established in cognitive psychology as a valid basis for understanding the behavior and decision making of human beings in complex and dynamic systems. SA gives decision makers the ability to make informed, time-critical decisions and thereby improve the performance of the respective business process. This research paper leverages SA as the starting point for a design science project on Operational Business Intelligence and Analytics systems and suggests a first version of design principles.

Relevance:

40.00%

Publisher:

Abstract:

We study the implications of the effectuation concept for socio-technical artifact design as part of the design science research (DSR) process in information systems (IS). Effectuation logic is the opposite of causal logic. Effectuation does not focus on causes to achieve a particular effect, but on the possibilities that can be achieved with extant means and resources. Viewing socio-technical IS DSR through an effectuation lens highlights the possibility to design the future even without set goals. We suggest that effectuation may be a useful perspective for design in dynamic social contexts leading to a more differentiated view on the instantiation of mid-range artifacts for specific local application contexts. Design science researchers can draw on this paper's conclusions to view their DSR projects through a fresh lens and to reexamine their research design and execution. The paper also offers avenues for future research to develop more concrete application possibilities of effectuation in socio-technical IS DSR and, thus, enrich the discourse.