4 results for work processes.
at Universitätsbibliothek Kassel, Universität Kassel, Germany
Abstract:
The corporate environment is increasingly shaped by dynamic change and turbulence. The globalization of markets and ever faster environmental change force companies to continually adapt their strategic behavior to new conditions. Forward-looking thinking and action are becoming ever more necessary, since in many industries products and services are replaced by new ones at considerably shorter intervals than before. To keep the corporate environment in view and to be informed, currently and proactively, about the plans, performance and competencies of competitors and customers, about market changes, technical innovations, etc., an intelligent, systematic approach to supplying environmental information is needed. This dissertation deals with the development, implementation and evaluation of a software tool and a systematic work process intended to improve the supply of environmental (external) information to managers and employees, so that emerging opportunities and/or risks can be recognized early. The focus of the work is mainly on the phase of information provision (acquisition, processing and presentation) and on the retrieval of environmental information by managers and employees. The software tool and work processes developed are implemented and evaluated in three companies under study. A written survey captures the situation before the introduction of the software tool and work processes and again one year afterwards. To complement the results of the written survey, guideline-based individual interviews are also conducted. Dedicated analyses of system accesses shed light on the extent and frequency of use over time.
Through successive optimization of the software and work processes, a revised software tool, adapted work processes and an introduction procedure are finally presented.
Abstract:
The progress in microsystem technology and nanotechnology places extended requirements on the fabrication processes. The trend is moving towards structuring within the nanometer scale on the one hand, and towards fabrication of structures with high aspect ratio (the ratio of vertical to lateral dimensions) and large depths on the 100 µm scale on the other hand. Current procedures for the microstructuring of silicon are wet chemical etching and dry or plasma etching. A modern plasma etching technique for the structuring of silicon is the so-called "gas chopping" etching technique (also called "time-multiplexed etching"). In this technique, passivation cycles, which prevent lateral underetching of the sidewalls, and etching cycles, which etch preferentially in the vertical direction because of the sidewall passivation, alternate constantly throughout the etching process. To this end, a CHF3/CH4 plasma, which generates CF monomers, is employed during the passivation cycle, and an SF6/Ar plasma, which generates fluorine radicals and ions, is employed during the etching cycle. Depending on the requirements on the etched profile, the durations of the individual passivation and etching cycles range from a few seconds up to several minutes. The profiles achieved with this etching process depend crucially on the flux of reactants (CF monomers during the passivation cycle, and ions and fluorine radicals during the etching cycle) to the bottom of the profile, especially for profiles with high aspect ratio. With regard to the predictability of the etching processes, knowledge of the fundamental effects taking place during a gas chopping etching process, and of their impact on the resulting profile, is required. For this purpose, a model for the description of the profile evolution of such etching processes is proposed in this work, which considers the reactions (etching or deposition) at the sample surface on a phenomenological basis.
Furthermore, the reactant transport inside the etching trench is modelled, based on angular distribution functions and on absorption probabilities at the sidewalls and bottom of the trench. A comparison of the simulated profiles with corresponding experimental profiles reveals that the proposed model reproduces the experimental profiles, provided that the angular distribution functions and absorption probabilities employed in the model are in agreement with data found in the literature. Therefore, the model developed in this work is an adequate description of the effects taking place during a gas chopping plasma etching process.
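The transport picture described above can be illustrated with a small Monte Carlo sketch (not the author's simulation code): reactants enter a trench of unit width with cosine-distributed arrival angles, stick to a sidewall with an assumed probability on each hit, and are otherwise diffusely re-emitted; the fraction reaching the bottom falls as the aspect ratio grows. Function name, geometry and parameter values are illustrative assumptions.

```python
import math
import random

def bottom_flux_fraction(aspect_ratio, sticking_prob, n_particles=20000, seed=1):
    """Monte Carlo estimate of the fraction of reactants entering a trench of
    unit width and depth `aspect_ratio` (cosine-distributed arrival angles)
    that reach the trench bottom. Each sidewall hit absorbs the particle with
    probability `sticking_prob`, otherwise re-emits it diffusely (cosine law)."""
    rng = random.Random(seed)
    depth = float(aspect_ratio)
    reached = 0
    for _ in range(n_particles):
        x, y = rng.random(), 0.0                 # entry point across the opening
        theta = math.asin(2 * rng.random() - 1)  # cosine-weighted angle from vertical
        dx, dy = math.sin(theta), math.cos(theta)
        while True:
            # distance along (dx, dy) to each boundary of the trench
            t_wall = ((1 - x) / dx if dx > 0 else x / -dx) if dx != 0 else math.inf
            t_bottom = (depth - y) / dy if dy > 0 else math.inf
            t_top = y / -dy if dy < 0 else math.inf
            t = min(t_wall, t_bottom, t_top)
            if t == t_bottom:
                reached += 1                     # reached the trench bottom
                break
            if t == t_top:
                break                            # escaped back out of the opening
            if rng.random() < sticking_prob:
                break                            # absorbed at the sidewall
            # diffuse re-emission from the sidewall toward the trench interior
            x, y = x + t * dx, y + t * dy
            phi = math.asin(2 * rng.random() - 1)
            inward = 1.0 if x < 0.5 else -1.0    # wall normal points into the trench
            dx, dy = inward * math.cos(phi), math.sin(phi)
    return reached / n_particles
```

Even this toy geometry reproduces the qualitative trend the abstract relies on: for a fixed sticking probability, deeper trenches starve the bottom of reactants, which is why high-aspect-ratio profiles are the critical test case for the model.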
Abstract:
During recent years, quantum information processing and the study of N-qubit quantum systems have attracted a lot of interest, both in theory and experiment. Apart from the promise of performing efficient quantum information protocols, such as quantum key distribution, teleportation or quantum computation, however, these investigations also revealed a great deal of difficulties which still need to be resolved in practice. Quantum information protocols rely on the application of unitary and non-unitary quantum operations that act on a given set of quantum mechanical two-state systems (qubits) to form (entangled) states, in which the information is encoded. The overall system of qubits is often referred to as a quantum register. Today, entanglement in a quantum register is known to be the key resource for many protocols of quantum computation and quantum information theory. However, despite the successful demonstration of several protocols, such as teleportation or quantum key distribution, there are still many open questions of how entanglement affects the efficiency of quantum algorithms or how it can be protected against noisy environments. To facilitate the simulation of such N-qubit quantum systems and the analysis of their entanglement properties, we have developed the Feynman program. The program package provides all necessary tools in order to define and to deal with quantum registers, quantum gates and quantum operations. Using an interactive and easily extensible design within the framework of the computer algebra system Maple, the Feynman program is a powerful toolbox not only for teaching the basic and more advanced concepts of quantum information but also for studying their physical realization in the future. To this end, the Feynman program implements a selection of algebraic separability criteria for bipartite and multipartite mixed states as well as the most frequently used entanglement measures from the literature.
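The register-and-gates workflow mentioned above can be sketched in a few lines of plain linear algebra (a NumPy illustration; the Feynman program itself works inside Maple, so the names below are not its API). A Hadamard gate followed by a CNOT turns the product state |00> into a maximally entangled Bell state:

```python
import numpy as np

# Single- and two-qubit gates as unitary matrices (illustrative sketch)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                 # control = qubit 0, target = qubit 1
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Two-qubit register initialized to |00> (basis ordering |00>,|01>,|10>,|11>)
reg = np.zeros(4)
reg[0] = 1.0

# H on qubit 0, then CNOT: yields the Bell state (|00> + |11>)/sqrt(2)
bell = CNOT @ (np.kron(H, I2) @ reg)
```

This is the simplest example of the circuit picture the abstract refers to: gates are unitaries acting on the register's state vector, and entanglement is created by the two-qubit gate.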
Additionally, the program supports the work with quantum operations and their associated (Jamiolkowski) dual states. Based on the implementation of several popular decoherence models, we provide tools especially for the quantitative analysis of quantum operations. As an application of the developed tools, we further present two case studies in which the entanglement of two atomic processes is investigated. In particular, we have studied the change of the electron-ion spin entanglement in atomic photoionization and the photon-photon polarization entanglement in the two-photon decay of hydrogen. The results show that both processes are, in principle, suitable for the creation and control of entanglement. Apart from process-specific parameters like initial atom polarization, it is mainly the process geometry which offers a simple and effective instrument to adjust the final-state entanglement. Finally, for the case of the two-photon decay of hydrogen-like systems, we study the difference between nonlocal quantum correlations, as given by the violation of the Bell inequality, and the concurrence as a true entanglement measure.
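One of the entanglement measures referred to above, the Wootters concurrence of a two-qubit state, is compact enough to sketch directly (a NumPy illustration under the standard definition, not the Feynman program's Maple implementation): the concurrence of a density matrix rho is max(0, l1 - l2 - l3 - l4), where the l_i are the square roots of the eigenvalues of rho * (sy x sy) rho* (sy x sy) in decreasing order.

```python
import numpy as np

def concurrence(rho):
    """Wootters concurrence of a two-qubit density matrix rho (4x4).
    Returns 0 for separable states and 1 for maximally entangled states."""
    sy = np.array([[0, -1j], [1j, 0]])
    Y = np.kron(sy, sy)                       # spin-flip operator sigma_y (x) sigma_y
    rho_tilde = Y @ rho.conj() @ Y            # spin-flipped state
    eigs = np.linalg.eigvals(rho @ rho_tilde) # real, non-negative up to rounding
    lam = np.sort(np.sqrt(np.abs(eigs)))[::-1]
    return float(max(0.0, lam[0] - lam[1] - lam[2] - lam[3]))

# Bell state (|00> + |11>)/sqrt(2): maximally entangled, concurrence 1
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho_bell = np.outer(phi, phi.conj())

# Product state |00>: separable, concurrence 0
psi = np.zeros(4)
psi[0] = 1.0
rho_prod = np.outer(psi, psi.conj())
```

Because it is defined for mixed states, the same routine applies to the reduced two-photon polarization states discussed in the case studies, which is what makes concurrence a convenient counterpart to the Bell-inequality violation.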
Abstract:
In this work, fabrication processes for daylight-guiding systems based on micromirror arrays are developed, evaluated and optimized. Two different approaches are used: first, nanoimprint lithography is used to fabricate large-area micromirrors by means of Substrate Conformal Imprint Lithography (SCIL); secondly, a new lithography technique is developed using a novel bi-layered photomask to fabricate large-area micromirror arrays. The experimental results show a reproducible, stable process with high yield that consumes less material, time, cost and effort.