984 results for reliability engineering
Abstract:
Many examples of emergent behavior can be observed in self-organizing physical and biological systems, which prove to be robust, stable, and adaptable. Such behaviors are often based on very simple mechanisms and rules, but creating them artificially is a challenging task that does not fit traditional software engineering. In this article, we propose a hybrid approach that combines strategies from Genetic Programming and agent software engineering, and demonstrate that this approach effectively yields an emergent design for given problems.
Abstract:
Genetic Programming can be used effectively to create emergent behavior for a group of autonomous agents. In the process we call Offline Emergence Engineering, the behavior is first bred in a Genetic Programming environment and then deployed to the agents in the real environment. In this article we briefly describe our approach, introduce an extended behavioral rule syntax, and discuss the impact of the expressiveness of the behavioral description on generation success, comparing two scenarios: the election problem and the distributed critical section problem. We evaluate the results and formulate criteria for the applicability of our approach.
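The abstract describes breeding behavior offline and deploying only the fittest rule set; the authors' actual rule syntax and fitness functions are not given. As a loose, hypothetical sketch of such an offline-breeding loop (the rule representation, the toy fitness proxy for the election problem, and all parameters are invented for illustration):

```python
import random

# Hypothetical sketch of offline emergence breeding: candidate behaviors
# are condition-action rule lists evolved in simulation; only the fittest
# rule set would be deployed to the real agents.

ACTIONS = ["wait", "claim_leader", "yield"]

def random_rule():
    # (observed-message-count threshold, action) pair
    return (random.randint(0, 5), random.choice(ACTIONS))

def fitness(rules):
    # Stand-in fitness: reward patience before claiming leadership
    # (a toy proxy for the election problem, not the authors' measure).
    score = 0
    for threshold, action in rules:
        if action == "claim_leader":
            score += threshold
        elif action == "yield":
            score += 1
    return score

def evolve(pop_size=30, rule_count=4, generations=50):
    population = [[random_rule() for _ in range(rule_count)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]   # truncation selection
        children = []
        for parent in survivors:
            child = list(parent)
            child[random.randrange(rule_count)] = random_rule()  # mutation
            children.append(child)
        population = survivors + children
    return max(population, key=fitness)

best = evolve()   # the rule set that would be deployed to the agents
```

The point of the sketch is the separation the abstract emphasizes: evolution happens entirely offline, and only the resulting static rule set reaches the agents.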
Abstract:
Enterprise Modeling (EM) is currently used either as a technique to represent and understand the structure and behavior of the enterprise, or as a technique to analyze business processes, and in many cases as a supporting technique for business process reengineering. However, EM architectures and methods for Enterprise Engineering can also be used to support new management techniques such as SIX SIGMA, because these techniques need a clear, transparent, and integrated definition and description of the business activities of the enterprise in order to build up, optimize, and operate a successful enterprise. The main goal of SIX SIGMA is to optimize the performance of processes. A still open question is: what are adequate quality criteria and methods to ensure such performance, and what must we do to achieve quality governance? This paper describes a method that combines an Enterprise Engineering method with a SIX SIGMA strategy to reach Quality Governance.
Abstract:
Among organic materials, spirobifluorene derivatives represent a very attractive class of materials for electronic devices. These compounds have high melting points, glass transition temperatures, and morphological stability, which makes them suitable for organic electronic applications. In addition, some spirobifluorenes can form porous supramolecular associations with significant volumes available for the inclusion of guests. These spirobifluorene-based molecular associations are noteworthy because they are purely molecular analogues of zeolites and other microporous solids, with potential applications in separation, catalysis, sensing, and other areas.
Abstract:
The increasing interconnection of information and communication systems leads to ever greater complexity and thus also to a further increase in security vulnerabilities. Classical protection mechanisms such as firewall systems and anti-malware solutions have long ceased to offer adequate protection against intrusion attempts into IT infrastructures. Intrusion Detection Systems (IDS) have established themselves as a highly effective instrument for protecting against cyber attacks. Such systems collect and analyze information from network components and hosts in order to detect unusual behavior and security violations automatically. While signature-based approaches can only detect already known attack patterns, anomaly-based IDS are also able to recognize new, previously unknown attacks (zero-day attacks) at an early stage. The core problem of intrusion detection systems, however, lies in optimally processing the enormous volume of network data and in developing an adaptive detection model that works in real time. To address these challenges, this dissertation provides a framework consisting of two main parts. The first part, called OptiFilter, uses a dynamic queuing concept to process the continuously arriving network data, continuously assembles network connections, and exports structured input data for the IDS. The second part is an adaptive classifier comprising a classifier model based on the Enhanced Growing Hierarchical Self-Organizing Map (EGHSOM), a model of normal network behavior (NNB), and an update model. In OptiFilter, tcpdump and SNMP traps are used to continuously aggregate network packets and host events. These aggregated network packets and host events are then analyzed further and converted into connection vectors.
To improve the detection rate of the adaptive classifier, the artificial neural network GHSOM is investigated intensively and substantially extended. Different approaches are proposed and discussed in this dissertation: a classification-confidence margin threshold is defined to uncover unknown malicious connections; the stability of the growth topology is increased through novel approaches for initializing the weight vectors and for strengthening the winner neurons; and a self-adaptive procedure is introduced so that the model can be updated continuously. In addition, the main task of the NNB model is to examine further the unknown connections detected by the EGHSOM and to check whether they are normal. However, the network traffic changes constantly due to the concept-drift phenomenon, which produces non-stationary network data in real time. This phenomenon is controlled better by the update model: the EGHSOM model can detect new anomalies effectively, and the NNB model adapts optimally to changes in the network data. In the experimental evaluation, the framework showed promising results. In the first experiment, the framework was evaluated in offline mode: OptiFilter was assessed with offline, synthetic, and realistic data, and the adaptive classifier was evaluated using 10-fold cross-validation to estimate its accuracy. In the second experiment, the framework was installed on a 1 to 10 GB network link and evaluated online in real time. OptiFilter successfully converted the enormous volume of network data into structured connection vectors, and the adaptive classifier classified them precisely.
A comparative study between the developed framework and other well-known IDS approaches shows that the proposed IDS framework outperforms all the others. This can be attributed to the following key points: the processing of the collected network data, the achievement of the best performance (such as overall accuracy), the detection of unknown connections, and the development of a real-time intrusion detection model.
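The classification-confidence margin threshold is described above only at a high level; the EGHSOM implementation itself is not given. As a loose, hypothetical illustration of the idea (the prototype vectors, labels, and the MARGIN value are all invented), a connection vector is matched against trained prototypes, and anything too far from every prototype is flagged as unknown:

```python
import math

# Illustrative sketch, not the dissertation's implementation: connections
# whose distance to the best-matching prototype exceeds a confidence
# margin are handed to a normal-behavior model for further inspection.

PROTOTYPES = {            # hypothetical trained map units -> class label
    (0.1, 0.2, 0.0): "normal",
    (0.9, 0.8, 0.7): "attack",
}
MARGIN = 0.5              # hypothetical confidence margin threshold

def classify(vector):
    def dist(p):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, vector)))
    best = min(PROTOTYPES, key=dist)
    if dist(best) > MARGIN:
        return "unknown"  # low confidence: defer to the NNB model
    return PROTOTYPES[best]

print(classify((0.1, 0.25, 0.05)))   # close to a "normal" prototype
print(classify((0.5, 0.5, 2.0)))     # far from every prototype
```

This mirrors the two-stage division of labor in the abstract: the classifier commits only within its confidence margin, and everything else is treated as potentially novel.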
Abstract:
Since no physical system can ever be completely isolated from its environment, the study of open quantum systems is pivotal to reliably and accurately controlling complex quantum systems. In practice, the reliability of the control field needs to be confirmed via certification of the target evolution, while accuracy requires the derivation of high-fidelity control schemes in the presence of decoherence. In the first part of this thesis, an algebraic framework is presented that determines the minimal requirements for the unique characterisation of arbitrary unitary gates in open quantum systems, independent of the particular physical implementation of the employed quantum device. To this end, a set of theorems is devised that can be used to assess whether a given set of input states on a quantum channel is sufficient to judge whether a desired unitary gate is realised. This makes it possible to determine the minimal input for such a task, which proves, quite remarkably, to be independent of system size. These results elucidate the fundamental limits regarding certification and tomography of open quantum systems. Combining these insights with state-of-the-art Monte Carlo process certification techniques permits a significant improvement in scaling when certifying arbitrary unitary gates. This improvement is not restricted to quantum information devices whose basic information carrier is the qubit; it also extends to systems whose fundamental informational entities can be of arbitrary dimensionality, the so-called qudits. The second part of this thesis concerns the impact of these findings from the point of view of Optimal Control Theory (OCT). OCT for quantum systems uses concepts from engineering, such as feedback and optimisation, to engineer constructive and destructive interferences in order to steer a physical process in a desired direction.
It turns out that the aforementioned mathematical findings allow novel optimisation functionals to be deduced that significantly reduce not only the memory required by numerical control algorithms but also the total CPU time required to obtain a certain fidelity for the optimised process. The thesis concludes by discussing two problems of fundamental interest in quantum information processing from the point of view of optimal control: the preparation of pure states and the implementation of unitary gates in open quantum systems. For both cases, specific physical examples are considered: for the former, the vibrational cooling of molecules via optical pumping; for the latter, a superconducting phase qudit implementation. In particular, it is illustrated how features of the environment can be exploited to reach the desired targets.
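The abstract does not state the thesis's certification functionals. As a generic, hedged illustration of what "how close is the implemented gate to the target unitary" can mean numerically, one standard figure of merit is the normalised Hilbert-Schmidt overlap F = |Tr(U†V)|² / d² (not claimed to be the thesis's protocol; the matrices below are toy examples):

```python
import cmath

# Hedged sketch: compare an implemented gate V against a target unitary U
# via F = |Tr(U† V)|^2 / d^2.  Matrices are lists of rows of complex numbers.

def dagger(u):
    return [[u[j][i].conjugate() for j in range(len(u))]
            for i in range(len(u[0]))]

def matmul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def overlap_fidelity(u, v):
    d = len(u)
    tr = sum(matmul(dagger(u), v)[i][i] for i in range(d))
    return abs(tr) ** 2 / d ** 2

I = [[1, 0], [0, 1]]
eps = 0.1   # small phase error away from the identity
R = [[cmath.exp(1j * eps), 0], [0, cmath.exp(-1j * eps)]]

print(overlap_fidelity(I, I))   # 1.0 for a perfect match
print(overlap_fidelity(I, R))   # slightly below 1.0
```

In an open system the comparison runs over channels rather than unitaries, which is exactly where the input-state results summarized above reduce the certification cost.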
Abstract:
The challenge of reducing carbon emissions and achieving the emission targets set for 2050 has become a key element of each country's energy strategy. The automotive industry, as an important contributor to these energy requirements, is conducting research to meet both energy requirements and customer requirements. Modern energy requirements call for clean, green, and renewable sources; customers require vehicles that are economical, reliable, and long-lived. Given increasing market requirements and a growing customer base, EVs and PHEVs are becoming more and more important for automotive manufacturers. EVs and PHEVs typically have two key parts: the battery package and the power electronics composed of critical components. A rechargeable battery is an important element for achieving cost competitiveness; it is mainly used to store energy and to provide continuous energy to drive an electric motor. To recharge the battery and drive the electric motor, the power electronics group is the essential bridge that converts between the different energy types required by both. In modern power electronics there are many different topologies, such as non-isolated and isolated power converters, that can be used for charging the battery. One of the most widely used converter topologies is the multiphase interleaved power converter, primarily because of its prominent advantages: it is frequently employed to obtain optimal dynamic response, high efficiency, and compact converter size. Many detailed investigations of its topology, control strategy, and devices have been carried out. In this thesis, the core research is to investigate issues and optimization approaches in building the magnetic component. The work starts with an introduction to the reasons for developing EVs and PHEVs and an overview of different possible topologies with respect to specific application requirements.
Because of its low component count, high reliability, high efficiency, and lack of special safety requirements, the non-isolated multiphase interleaved converter is selected as the basic research topology of the funded W-charge project, in order to investigate its advantages and the potential benefits of using optimized magnetic components. All of the proposed aspects and approaches are then investigated and analyzed in detail to verify the constraints and advantages of using integrated coupled inductors. Furthermore, a digital controller concept and a novel tapped-inductor topology are proposed for multiphase power converters and electric vehicle applications.
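As a numerical aside on the ripple-cancellation advantage of interleaving mentioned above (idealised triangular phase currents and an assumed duty cycle, not the thesis's converter design): summing N identical ripple waveforms shifted by T/N partially cancels the combined ripple.

```python
import math

# Toy model: N triangular inductor-current ripples (zero mean, unit
# peak-to-peak) phase-shifted by 1/N of the switching period and summed.
# Parameters (duty = 0.3, unit period) are illustrative assumptions.

def phase_current(t, duty=0.3, period=1.0, ripple=1.0):
    t = t % period
    if t < duty * period:                               # rising segment
        return ripple * (t / (duty * period) - 0.5)
    return ripple * (0.5 - (t - duty * period) / ((1 - duty) * period))

def total_ripple(n_phases, samples=1000):
    vals = [sum(phase_current(t / samples + k / n_phases)
                for k in range(n_phases))
            for t in range(samples)]
    return max(vals) - min(vals)                        # peak-to-peak

for n in (1, 2, 4):
    print(n, round(total_ripple(n), 3))
```

Running this shows the summed peak-to-peak ripple shrinking as the phase count rises, which is one of the reasons the multiphase interleaved topology achieves compact filters and good dynamic response.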
Abstract:
A revolution in earthmoving, a $100 billion industry, can be achieved with three components: the GPS location system, sensors and computers in bulldozers, and SITE CONTROLLER, a central computer system that maintains design data and directs operations. The first two components are widely available; I built SITE CONTROLLER to complete the triangle and describe it here. SITE CONTROLLER assists civil engineers in the design, estimation, and construction of earthworks, including hazardous waste site remediation. The core of SITE CONTROLLER is a site modelling system that represents existing and prospective terrain shapes, roads, hydrology, etc. Around this core are analysis, simulation, and vehicle control tools. Integrating these modules into one program enables civil engineers and contractors to use a single interface and database throughout the life of a project.
Abstract:
The electronics industry is encountering thermal challenges and opportunities at length scales comparable to or much smaller than one micrometer. Examples include nanoscale phonon hotspots in transistors and the increasing temperature rise in on-chip interconnects. Millimeter-scale hotspots on microprocessors, resulting from varying rates of power consumption, are being addressed using two-phase microchannel heat sinks. Nanoscale thermal data storage technology has also received much attention recently. This paper provides an overview of these topics with a focus on related research at Stanford University.
Abstract:
Conventional floating gate non-volatile memories (NVMs) present critical issues for device scalability beyond the sub-90 nm node, such as gate length and tunnel oxide thickness reduction. Nanocrystalline germanium (nc-Ge) quantum dot flash memories are a fully CMOS-compatible technology based on discrete isolated charge storage nodules, with the potential to push the scalability of conventional NVMs further. Quantum dot memories offer lower operating voltages than conventional floating-gate (FG) Flash memories because their thinner tunnel dielectrics allow higher tunneling probabilities. The isolated charge nodules suppress charge loss through lateral paths, thereby achieving superior charge retention time. Despite the considerable effort devoted to the study of nanocrystal Flash memories, the charge storage mechanism remains obscure. Interfacial defects of the nanocrystals appear to play a role in charge storage in recent studies, although storage in the nanocrystal conduction band by quantum confinement was reported earlier. In this work, a single transistor memory structure with a threshold voltage shift Vth exceeding ~1.5 V, corresponding to interface charge trapping in nc-Ge and operating at 0.96 MV/cm, is presented. The trapping effect is eliminated when nc-Ge is synthesized in forming gas, thus excluding the possibility of quantum confinement and Coulomb blockade effects. Through discharging kinetics, the model of deep level trap charge storage is confirmed. The trap energy level depends on the matrix which confines the nc-Ge.
Abstract:
The release of growth factors from tissue engineering scaffolds provides signals that influence the migration, differentiation, and proliferation of cells. The incorporation of a drug delivery platform capable of tunable release will give tissue engineers greater versatility in the direction of tissue regeneration. We have prepared a novel composite of two biomaterials with proven track records, apatite and poly(lactic-co-glycolic acid) (PLGA), as a drug delivery platform with promising controlled release properties. These composites have been tested in the delivery of a model protein, bovine serum albumin (BSA), as well as therapeutic proteins, recombinant human bone morphogenetic protein-2 (rhBMP-2) and rhBMP-6. The controlled release strategy is based on the use of a polymer with acidic degradation products to control the dissolution of the basic apatitic component, resulting in protein release. Therefore, any parameter that affects either polymer degradation or apatite dissolution can be used to control protein release. We have modified the protein release profile systematically by varying the polymer molecular weight, polymer hydrophobicity, apatite loading, apatite particle size, and other material and processing parameters. Biologically active rhBMP-2 was released from these composite microparticles over 100 days, in contrast to conventional collagen sponge carriers, which were depleted in approximately 2 weeks. The released rhBMP-2 was able to induce elevated alkaline phosphatase and osteocalcin expression in pluripotent murine embryonic fibroblasts. To augment tissue engineering scaffolds with tunable and sustained protein release capabilities, these composite microparticles can be dispersed in the scaffolds in different combinations to obtain a superposition of the release profiles. We have loaded rhBMP-2 into composite microparticles with a fast release profile, and rhBMP-6 into slow-releasing composite microparticles.
An equal mixture of these two sets of composite particles was then injected into a collagen sponge, allowing for dual release of the proteins from the collagenous scaffold. The ability of these BMP-loaded scaffolds to induce osteoblastic differentiation in vitro and ectopic bone formation in a rat model is being investigated. We anticipate that these apatite-polymer composite microparticles can be extended to the delivery of other signalling molecules, and can be incorporated into other types of tissue engineering scaffolds.
Abstract:
Developments in mammalian cell culture and recombinant technology have allowed for the production of recombinant proteins for use as human therapeutics. Mammalian cell culture is typically operated at the physiological temperature of 37 °C. However, recent research has shown that using low-temperature conditions (30-33 °C) as a platform for cell culture results in changes in cell characteristics, such as increased specific productivity and extended periods of cell viability, that can potentially improve the production of recombinant proteins. Furthermore, many recent reports have focused on low-temperature culture of Chinese hamster ovary (CHO) cells, one of the principal cell lines used in the industrial production of recombinant proteins. Exposure to low ambient temperatures exerts an external stress on all living cells and elicits a cellular response. This cold-stress response has been observed in bacteria, plants, and mammals, and is regulated at the gene level. The genes and molecular mechanisms involved in the cold-stress response in prokaryotes and plants have been well studied, and various reports detail the modification of cold-stress genes to improve the characteristics of bacterial or plant cells at low temperatures. However, there is very limited information on mammalian cold-stress genes or the related pathways governing the mammalian cold-stress response. This project seeks to investigate and characterise cold-stress genes that are differentially expressed during low-temperature culture of CHO cells, and to relate them to the various changes in cell characteristics observed in such cultures. The gene information can then be used to modify CHO cell lines for improved performance in the production of recombinant proteins.