11 results for Order systems

at Universitätsbibliothek Kassel, Universität Kassel, Germany


Relevance:

30.00%

Publisher:

Abstract:

In rural areas of the Mekong Countries, the problem of supplying electricity to rural communities is particularly pressing. Supplying power to these areas requires facilities that are not economically viable. However, government programs are under way to provide this service, which is vital to community well-being. A national priority of the Mekong Countries is to provide electrical power to people in rural areas within normal budgetary constraints. Electricity must be introduced into rural areas in a way that maximizes the technical, economic and social benefits. Another consideration is the source of electrical generation and its effects on the natural environment. The main research purpose is to implement field tests, monitoring and evaluation of the PV-Diesel Hybrid System (PVHS) at the Energy Park of the School of Renewable Energy Technology (SERT), in order to test the PVHS under the meteorological conditions of the Mekong Countries and to develop a simulation software called RES, which studies the technical and economic performance of rural electrification options. This software must be easy for energy planners of rural electrification projects to use and understand, so that the technical and economic performance of the PVHS can be evaluated with RES on the basis of the renewable energy potential for rural electrification of the Mekong Countries. Finally, this project aims to give guidance for the possible use of PVHS applications in this region, particularly in regard to their technical and economic sustainability. PVHS should be promoted according to the principles of proper design and adequate follow-up maintenance, so that a high number of satisfied users is achieved. PVHS is not the only possible technology for rural electrification, but for the Mekong Countries it is one of the most appropriate choices. Other renewable energy options such as wind, biomass and hydro power should be studied in future work.
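The abstract does not describe the internals of RES, so the following is only a hypothetical sketch of the kind of techno-economic comparison such a tool performs: a simple levelized-cost-of-electricity (LCOE) calculation contrasting a diesel-only system with a PV-diesel hybrid. All figures, the 8% discount rate and the 20-year lifetime are invented for illustration and are not from the study.

```python
def crf(rate, years):
    """Capital recovery factor for annualising an upfront investment."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def lcoe(capex, annual_om, annual_fuel, annual_kwh, rate=0.08, years=20):
    """Levelised cost of electricity in currency units per kWh."""
    return (capex * crf(rate, years) + annual_om + annual_fuel) / annual_kwh

# Hypothetical numbers for a small village system (NOT from the study):
diesel_only = lcoe(capex=15_000, annual_om=1_500, annual_fuel=9_000,
                   annual_kwh=30_000)
pv_diesel = lcoe(capex=55_000, annual_om=1_800, annual_fuel=3_000,
                 annual_kwh=30_000)
print(f"diesel only: {diesel_only:.3f} per kWh, "
      f"PV-diesel hybrid: {pv_diesel:.3f} per kWh")
```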

Relevance:

30.00%

Publisher:

Abstract:

This report gives a detailed discussion of the system, algorithms, and techniques that we applied in order to solve the Web Service Challenges (WSC) of the years 2006 and 2007. These international contests are focused on semantic web service composition. In each challenge of the contests, a repository of web services is given. The input and output parameters of the services in the repository are annotated with semantic concepts. A query to a semantic composition engine contains a set of available input concepts and a set of wanted output concepts. In order to employ an offered service for a requested role, the concepts of the input parameters of the offered operations must be more general than requested (contravariance). In contrast, the concepts of the output parameters of the offered service must be more specific than requested (covariance). The engine should respond to a query by providing a valid composition as fast as possible. We discuss three different methods for web service composition: an uninformed search in the form of an IDDFS algorithm, a greedy informed search based on heuristic functions, and a multi-objective genetic algorithm.
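As a rough illustration of the matching rules described above, and not of the authors' actual WSC engines, here is a minimal Python sketch of greedy forward chaining over a made-up concept taxonomy. The taxonomy, the service names and the greedy heuristic are illustrative assumptions only.

```python
from dataclasses import dataclass

# Hypothetical toy concept taxonomy: child -> parent (None marks the root).
PARENT = {
    "Thing": None,
    "Location": "Thing", "City": "Location",
    "Document": "Thing", "Ticket": "Document", "TrainTicket": "Ticket",
    "Person": "Thing", "Customer": "Person",
}

def ancestors(concept):
    """All concepts equal to or more general than `concept`."""
    while concept is not None:
        yield concept
        concept = PARENT[concept]

def is_more_general_or_equal(general, specific):
    return general in set(ancestors(specific))

@dataclass(frozen=True)
class Service:
    name: str
    inputs: frozenset   # concepts the service expects
    outputs: frozenset  # concepts the service produces

def callable_with(service, known):
    # contravariance: every declared input must be more general than
    # (or equal to) some concept we already have
    return all(any(is_more_general_or_equal(i, k) for k in known)
               for i in service.inputs)

def satisfied(wanted, known):
    # covariance: a wanted concept is satisfied by an equal or more specific one
    return all(any(is_more_general_or_equal(w, k) for k in known) for w in wanted)

def greedy_compose(repository, provided, wanted):
    """Tiny greedy forward chaining; no backtracking, no optimality claim."""
    known, plan = set(provided), []
    while not satisfied(wanted, known):
        candidates = [s for s in repository
                      if s not in plan and callable_with(s, known)]
        if not candidates:
            return None                      # no composition found
        # greedy choice: the service contributing the most new concepts
        best = max(candidates, key=lambda s: len(s.outputs - known))
        plan.append(best)
        known |= best.outputs
    return plan

if __name__ == "__main__":
    repo = [
        Service("FindCustomer", frozenset({"Person"}), frozenset({"Customer"})),
        Service("BookTrain", frozenset({"Customer", "City"}),
                frozenset({"TrainTicket"})),
    ]
    plan = greedy_compose(repo, provided={"Person", "City"}, wanted={"Ticket"})
    print([s.name for s in plan] if plan else "no composition")
```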

Relevance:

30.00%

Publisher:

Abstract:

With this document, we provide a compilation of in-depth discussions on some of the most current security issues in distributed systems. The six contributions were collected and presented at the 1st Kassel Student Workshop on Security in Distributed Systems (KaSWoSDS’08). We are pleased to present a collection of papers not only shedding light on the theoretical aspects of their topics, but also accompanied by elaborate practical examples. In Chapter 1, Stephan Opfer discusses viruses, one of the oldest threats to system security. For years there has been an arms race between virus producers and anti-virus software providers, with no end in sight. Stefan Triller demonstrates how malicious code can be injected into a target process using a buffer overflow in Chapter 2. Websites usually store their data and user information in databases. Like buffer overflows, the possibility of performing SQL injection attacks targeting such databases is left open by unwary programmers. Stephan Scheuermann gives us a deeper insight into the mechanisms behind such attacks in Chapter 3. Cross-site scripting (XSS) is a method to insert malicious code into websites viewed by other users; code can be injected into other websites via XSS attacks in order to spy on the data of internet users. Michael Blumenstein explains this issue in Chapter 4. Spoofing, in turn, subsumes all methods that directly involve taking on a false identity. In Chapter 5, Till Amma shows us different ways in which this can be done and how it can be prevented. Last but not least, cryptographic methods are used to encode confidential data in such a way that even if it falls into the wrong hands, it cannot be decoded. Over the centuries, many different ciphers have been developed, applied, and finally broken. Ilhan Glogic sketches this history in Chapter 6.
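As a small, self-contained illustration of the SQL injection issue mentioned for Chapter 3 (not taken from the workshop papers), the following Python/sqlite3 sketch contrasts a string-built query with a parameterised one; the table, the data and the payload are made up.

```python
import sqlite3

# Hypothetical toy table; the point is the contrast between string-built SQL
# and a parameterised query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret'), ('bob', 'hunter2')")

user_input = "alice' OR '1'='1"          # a classic injection payload

# Vulnerable: the payload becomes part of the SQL statement itself
rows_vulnerable = conn.execute(
    f"SELECT name, secret FROM users WHERE name = '{user_input}'").fetchall()

# Safe: the driver passes the value separately, so it is only ever data
rows_safe = conn.execute(
    "SELECT name, secret FROM users WHERE name = ?", (user_input,)).fetchall()

print("vulnerable query returned:", rows_vulnerable)   # leaks every row
print("parameterised query returned:", rows_safe)      # returns nothing
```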

Relevance:

30.00%

Publisher:

Abstract:

This thesis deals with restarting automata and extensions of restarting automata. Restarting automata are a device for recognizing formal languages. They are motivated by the linguistic technique of analysis by reduction and were introduced in 1995 by Jancar, Mráz, Plátek and Vogel. A restarting automaton consists of a finite control, a read/write window of fixed size and a flexible tape. Initially, the tape contains the input together with tape delimiter symbols. The computation of a restarting automaton proceeds in so-called cycles. Each cycle begins at the left end of the tape in the initial state, performs a local rewrite on the tape, and ends with a restart, which moves the read/write window back to the left end of the tape and re-enters the initial state. This thesis is mainly concerned with two extensions of restarting automata: CD-systems of restarting automata and non-forgetting restarting automata. Non-forgetting restarting automata may finish a cycle in an arbitrary state, and CD-systems of restarting automata consist of a set of restarting automata that process the input together, their cooperation being governed by a mode of operation similar to that of CD-grammar systems. For both extensions it turns out that the deterministic models are more powerful than deterministic standard restarting automata. It is shown that CD-systems of restarting automata can in many cases be simulated by non-forgetting restarting automata, and conversely, non-forgetting restarting automata can also be simulated by CD-systems of restarting automata. Furthermore, restarting automata and non-forgetting restarting automata are studied that are nondeterministic but never make errors. It turns out that these automata can be simulated by deterministic (non-forgetting) restarting automata if they start a new cycle immediately after the rewrite step, or if they can move their window both to the left and to the right. Moreover, all (non-forgetting) restarting automata that may make errors, but detect them after finitely many cycles, can be simulated by (non-forgetting) restarting automata that make no errors. Another important result states that deterministic monotone non-forgetting restarting automata with auxiliary symbols that end their cycle immediately after the rewrite step recognize exactly the deterministic context-free languages, whereas deterministic monotone non-forgetting restarting automata with auxiliary symbols without this restriction recognize strictly more, namely the left-to-right regular languages. This separates, for the first time, restarting automata with auxiliary symbols that end their cycle immediately after the rewrite step from restarting automata of the same type without this restriction. It is particularly noteworthy that both types of automata characterize well-known language classes.
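Purely as a toy illustration of analysis by reduction, and not an example taken from the thesis, the following Python sketch mimics a restarting automaton for the language { a^n b^n : n >= 0 }: each cycle starts at the left end, performs one local rewrite (deleting the factor "ab" at the a/b boundary) and restarts; the word is accepted once the tape is empty. The shape check with a regular expression stands in for the automaton's left-to-right scan.

```python
import re

def accepts_anbn(word: str) -> bool:
    """Toy analysis by reduction for { a^n b^n }: one deletion per cycle."""
    tape = word
    while True:
        if tape == "":                      # empty tape: accept
            return True
        # scanning phase of the cycle: the tape must have the shape a...ab...b
        if not re.fullmatch(r"a+b+", tape):
            return False                    # no applicable rewrite: reject
        # local rewrite at the a/b boundary, then restart at the left end
        boundary = tape.index("b")
        tape = tape[:boundary - 1] + tape[boundary + 1:]

if __name__ == "__main__":
    for w in ["", "ab", "aabb", "aaabbb", "aab", "abab", "ba"]:
        print(repr(w), accepts_anbn(w))
```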

Relevance:

30.00%

Publisher:

Abstract:

In database marketing, the behavior of customers is analyzed by studying the transactions they have performed. In order to get a global picture of the behavior of a customer, the customer's individual transactions have to be composed together. In On-Line Analytical Processing, this operation is known as reverse pivoting. During the ongoing data analysis process, reverse pivoting has to be repeated several times, usually requiring an implementation in SQL. In this paper, we present a construction of conceptual scales for reverse pivoting in Conceptual Information Systems, and also discuss the visualization. The construction allows the reuse of previously created queries without reprogramming and offers a visualization of the results by line diagrams.
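To make the operation itself concrete (the paper's contribution is the conceptual-scale construction, not the code below), here is a hypothetical reverse-pivoting step in Python/pandas: single transactions are composed into one row per customer. The table and column names are invented for illustration.

```python
import pandas as pd

# Hypothetical transaction data; in the paper this step would be driven by a
# conceptual scale rather than written by hand.
transactions = pd.DataFrame({
    "customer": ["c1", "c1", "c2", "c2", "c2", "c3"],
    "product":  ["book", "dvd", "book", "book", "cd", "dvd"],
    "amount":   [12.0, 19.0, 12.0, 15.0, 9.0, 19.0],
})

# Reverse pivoting: compose a customer's single transactions into one row
# per customer, e.g. total spending per product category.
customer_profile = transactions.pivot_table(
    index="customer", columns="product", values="amount",
    aggfunc="sum", fill_value=0.0)

print(customer_profile)
```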

Relevance:

30.00%

Publisher:

Abstract:

Research on transition-metal nanoalloy clusters composed of a few atoms is fascinating because of their unusual properties, which arise from the interplay among structure, chemical order and magnetism. Such nanoalloy clusters can be used to construct nanometer devices for technological applications by manipulating their remarkable magnetic, chemical and optical properties. Determining the nanoscopic features exhibited by magnetic alloy clusters requires a systematic global and local exploration of their potential-energy surface in order to identify all the relevant, energetically low-lying magnetic isomers. In this thesis the sampling of the potential-energy surface has been performed by employing state-of-the-art spin-polarized density-functional theory in combination with graph theory and the basin-hopping global optimization technique. This combination is vital for a quantitative analysis of the quantum mechanical energetics. The first approach, i.e., spin-polarized density-functional theory together with the graph theory method, is applied to study the Fe$_m$Rh$_n$ and Co$_m$Pd$_n$ clusters having $N = m+n \leq 8$ atoms. We carried out a thorough and systematic sampling of the potential-energy surface by taking into account all possible initial cluster topologies, all different distributions of the two kinds of atoms within the cluster, the entire concentration range between the pure limits, and different initial magnetic configurations such as ferro- and anti-ferromagnetic coupling. The remarkable magnetic properties shown by FeRh and CoPd nanoclusters are attributed to the extremely reduced coordination number together with the charge transfer from 3$d$ to 4$d$ elements. The second approach, i.e., spin-polarized density-functional theory together with the basin-hopping method, is applied to study the small Fe$_6$, Fe$_3$Rh$_3$ and Rh$_6$ and the larger Fe$_{13}$, Fe$_6$Rh$_7$ and Rh$_{13}$ clusters as illustrative benchmark systems. This method is able to identify the true ground-state structures of Fe$_6$ and Fe$_3$Rh$_3$, which were not obtained by using the first approach. However, both approaches predict a similar cluster for the ground state of Rh$_6$. Moreover, the computational time taken by this approach is found to be significantly lower than that of the first approach. The ground-state structure of the Fe$_{13}$ cluster is found to be icosahedral, whereas the Rh$_{13}$ and Fe$_6$Rh$_7$ isomers relax into cage-like and layered structures, respectively. All the clusters display a remarkable variety of structural and magnetic behaviors. It is observed that isomers of similar shape, differing from each other only by small distortions, can exhibit quite different magnetic moments. This has been interpreted as a probable artifact of the spin-rotational symmetry breaking introduced by the spin-polarized GGA. The possibility of combining spin-polarized density-functional theory with other global optimization techniques such as the minima-hopping method could be the next step in this direction. This combination is expected to be an ideal sampling approach with the advantage of efficiently avoiding the search over irrelevant regions of the potential-energy surface.
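As a loose stand-in for the global sampling step, with a toy Lennard-Jones cluster in place of the spin-polarized DFT energy surface actually used in the thesis, the following sketch runs SciPy's basin-hopping on a 6-atom cluster. The potential, atom count and optimizer settings are arbitrary illustration values.

```python
import numpy as np
from scipy.optimize import basinhopping

# Toy stand-in: 6-atom Lennard-Jones cluster instead of a DFT energy surface.
N = 6

def lj_energy(flat_coords):
    """Total Lennard-Jones energy (epsilon = sigma = 1) of N atoms."""
    x = flat_coords.reshape(N, 3)
    energy = 0.0
    for i in range(N):
        for j in range(i + 1, N):
            r2 = np.sum((x[i] - x[j]) ** 2)
            inv6 = 1.0 / r2 ** 3
            energy += 4.0 * (inv6 ** 2 - inv6)
    return energy

rng = np.random.default_rng(0)
x0 = rng.uniform(-1.0, 1.0, size=3 * N)          # random starting geometry

# Basin hopping: random perturbation + local minimisation + accept/reject
result = basinhopping(lj_energy, x0, niter=200,
                      minimizer_kwargs={"method": "L-BFGS-B"})

# The known LJ6 global minimum (an octahedron) lies at about -12.71.
print("lowest energy found:", result.fun)
```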

Relevance:

30.00%

Publisher:

Abstract:

Food safety management systems (FSMSs) and the scrutiny of the food safety practices intended for adoption at the firm level both offer strategic value to the dried fig sector. This study tests the hypothesis that export orientation is a major motivating force for the adoption of food safety systems in Turkish dried fig firms. Data were obtained from 91 dried fig firms located in Aydin, Turkey. Interviews were carried out with the firms' managers/owners using a face-to-face questionnaire from May to August of 2010. While 36.3 percent of the interviewed firms had adopted one or more systems, the rest had no certification. A binomial logistic econometric model was employed. The parameters that influenced the adoption decision included contractual agreements with other firms, implementation of good practices by the dried fig farmers, export orientation and the cost-benefit ratio. Interestingly, the rest of the indicators employed had no statistically significant effect on adoption behaviour. This paper focuses on the export orientation parameter directly in order to test the validity of the main research hypothesis. The estimated marginal effect suggests that when dried fig firms are export-oriented, the probability that these firms will adopt food safety systems goes up by 39.5 percent. This was the largest marginal effect among all the values obtained and thus supports the hypothesis that export orientation is a major motivator for the adoption of food safety systems in Turkish dried fig firms.
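For readers unfamiliar with the model class, here is a hypothetical sketch of a binomial logit with average marginal effects in Python/statsmodels. The data are synthetic and the coefficients are invented, so the numbers have nothing to do with the 91 surveyed firms.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic stand-in data (NOT the survey data): 1 = firm adopted a food
# safety system, explained by export orientation and a cost-benefit score.
rng = np.random.default_rng(42)
n = 91
export_oriented = rng.integers(0, 2, n)
cost_benefit = rng.normal(0.0, 1.0, n)
logit_p = -1.0 + 1.8 * export_oriented + 0.6 * cost_benefit
adopted = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

X = sm.add_constant(np.column_stack([export_oriented, cost_benefit]))
model = sm.Logit(adopted, X).fit(disp=False)

# Average marginal effects, analogous to the "+39.5 percent" reading of the
# export-orientation variable reported in the paper.
print(model.get_margeff(at="overall").summary())
```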

Relevance:

30.00%

Publisher:

Abstract:

With China's rapid economic development during the last decades, the national demand for livestock products has quadrupled within the last 20 years. Most of that increase in demand has been met by subsidized industrialized production systems, while millions of smallholders, who still provide the larger share of livestock products in the country, have been neglected. Fostering those systems would help China to lower its strong urban migration streams, enhance the livelihood of the poorer rural population and provide environmentally safe livestock products which have a good chance of satisfying customers' demand for ecological food. Despite their importance, China's smallholder livestock keepers have not yet gained appropriate attention from governmental authorities and researchers. However, a profound analysis of those systems is required so that adequate support can lead to better resource utilization and productivity in the sector. To this aim, this pilot study analyzes smallholder livestock production systems in Xishuangbanna, located in southern China. The area is bordered by Laos and Myanmar and geographically counts as a tropical region. Its climate is characterized by dry, temperate winters and hot summers with monsoon rains from May to October. While the region is flat in the south, at about 500 m above sea level, foothills of the Himalayas reach into the north of Xishuangbanna, where the highest peak reaches 2400 m asl. Except for one larger city, Jinghong, Xishuangbanna is mainly covered by tropical rainforest, areas under agricultural cultivation and villages. The major income is generated through inner-Chinese tourism and agricultural production. Intensive rubber plantations are characteristic of the lowland plains, while small-scale traditional farms are scattered across the montane regions. In order to determine the current state and future prospects of smallholder livestock production in that region, this study analyzed the current status of the smallholder livestock sector in the Naban River National Nature Reserve (NRNNR), an area which is largely representative of the whole prefecture. It covers an area of about 50 square kilometers and reaches from 470 up to 2400 m asl. About 5500 inhabitants of different ethnic origins live in 24 villages. All data were collected between October 2007 and May 2010. Three major objectives were addressed in the study: (1) classifying existing pig production systems and exploring respective pathways for development; (2) quantifying the performance of pig breeding systems to identify bottlenecks for production; and (3) analyzing past and current buffalo utilization to determine the chances and opportunities of buffalo keeping in the future. In order to classify the different pig production systems, a baseline survey (n=204, stratified cluster sampling) was carried out to collect data on livestock species, numbers, management practices, cultivated plant species and field sizes as well as socio-economic characteristics. Sampling was clustered at the village level by altitude and ethnic affiliation, resulting in 13 clusters, in each of which 13-17 farms were interviewed. Categorical Principal Component Analysis (CatPCA) and a two-step clustering algorithm were applied to identify the determining farm characteristics and to sort the recorded households into classes of livestock production types.
The variables keep_sow_yes/no, TLU_pig, TLU_buffalo, size_of_corn_fields, altitude_class, size_of_tea_plantation and size_of_rubber_field were found to be major determinants for the characterization of the recorded farms. All farms practice extensive or semi-intensive livestock production; pigs and buffaloes are the predominant livestock species, while chickens and aquaculture are present but play subordinate roles for livelihoods. All pig raisers rely on a single local breed, which is known as the Small Ear Pig (SMEP) in the region. Three major production systems were identified: livestock-corn based (LB; 41%), rubber based (RB; 39%) and pig based (PB; 20%) systems. RB farms earn a high income from rubber and fatten 1.9 ±1.80 pigs per household (HH), often using pig feed purchased at markets. PB farms own similarly sized rubber plantations and raise 4.7 ±2.77 pigs per HH, with fodder mainly being cultivated and collected in the forest. LB farms grow corn, rice and tea and keep 4.6 ±3.32 pigs per HH, also fed with collected and cultivated fodder. Only 29% of all pigs were marketed (LB: 20%; RB: 42%; PB: 25%), and average annual mortality was 4.0 ±4.52 pigs per farm (LB: 4.6 ±3.68; RB: 1.9 ±2.14; PB: 7.1 ±10.82). Pig feed mainly consists of banana pseudostem, corn and rice husks and is prepared in batches about two to three times per week. Such fodder may be sufficient in energy content but lacks an adequate amount of protein. Pigs therefore suffer from malnutrition, which becomes most critical in the time before the harvest season around October. Farmers reported high occurrences of gastrointestinal parasites in carcasses, and pig stables were often wet and filled with manure. Deficits in nutritional and hygienic management are major limits to development and should be the first issues addressed to improve productivity. SME pork was found to be known and preferred by local customers in town and by richer lowland farmers. However, high prices and the lack of availability of SME pork at local wet markets limited purchases. If major management constraints are overcome, pig breeders (PB and LB farms) could increase the share of pigs marketed for town markets and provide fatteners to richer RB farmers. RB farmers are interested in fattening pigs for home consumption but do not show any motivation for commercial pig raising. To determine the productivity of input factors in pig production, reproductive performance, feed quality and quantity as well as the weight development of pigs under current management were recorded. The data collection included a progeny history survey covering 184 sows and 437 farrows, bi-weekly weighing of 114 pigs over a 16-month time span on 21 farms (10 LB and 11 PB), as well as the daily recording of the quality and quantity of feed given to a defined number of pigs on the same 21 farms. Feed samples of all recorded ingredients were analyzed for their respective nutrient content. Since no literature values on the digestibility of banana pseudostem – a major ingredient of traditional pig feed in NRNNR – were found, a cross-sectional digestibility trial with 2x4 pigs was conducted at a station in the research area.
With the aid of the PRY Herd Life Model, all data were utilized to determine the systems' current (status quo = SQ) output and the productivity of the input factor "feed" in terms of saleable live weight per kg DM feed intake and the monetary value of output per kg DM feed intake. Two improvement scenarios were simulated, assuming (1) that farmers adopt a culling management that generates the highest output per unit of input (Scenario 1; SC I) and (2) that, through improved feeding, selected reproduction parameters improve by 30% (SC II). Daily weight gain averaged 55 ± 56 g per day between day 200 and 600. The average feed energy content of the traditional feed mix was 14.92 MJ ME. Age at first farrowing averaged 14.5 ± 4.34 months, and the subsequent inter-farrowing interval was 11.4 ± 2.73 months. Litter size was 5.8 piglets and weaning age was 4.3 ± 0.99 months. 18% of piglets died before weaning. Simulating pig production at its actual status shows that the monetary return on inputs (ROI) is negative (1:0.67), but improves (1:1.2) when culling management is optimized so that the highest output is gained per unit of feed input. If, in addition, better feeding, controlled mating and better resale prices at fixed dates are simulated, ROI further increases to 1:2.45, 1:2.69, 1:2.7 and 1:3.15 for the four respective grower groups. Those findings show the potential of pork production if basic measures of improvement are applied. Future exploration of the environment, including climate, market seasons and culture, is required before implementing the recommended measures, to ensure the sustainable development of a more effective and resource-conserving pork production in the future. The two studies have shown that the production of local SME pigs plays an important role on traditional farms in NRNNR, but basic constraints are limiting their productivity. However, relatively simple measures are sufficient to achieve a notable improvement. There is also a demand for more SME pork on local markets and, once basic constraints have been overcome, pig farmers could turn into more commercial producers and provide pork to local markets. In this way, environmentally safe meat can be offered to sensitive consumers, while farmers increase their income and lower the risk of external shocks through a more diverse income-generating strategy. Buffaloes were found to be the second most important livestock species on NRNNR farms. While they were a core resource of mixed smallholder farms in the past, the expansion of rubber tree plantations and agricultural mechanization have decreased swamp buffalo numbers today. The third study seeks to predict the future utilization of buffaloes on different farm types in NRNNR by analyzing the dynamics of the buffalo population and land-use changes over time, and by calculating the labor required for keeping buffaloes in view of the traction power that can be utilized for field preparation. The use of buffaloes for field work and the recent development of the regional buffalo population were analyzed through interviews with 184 farmers in 2007/2008 and discussions with 62 buffalo keepers in 2009. While pig based farms (PB; n=37) have abandoned buffalo keeping, 11% of the rubber based farms (RB; n=71) and 100% of the livestock-corn based farms (LB; n=76) kept buffaloes in 2008. Herd size was 2.5 ±1.80 buffaloes (n=84) in early 2008 and 2.2 ±1.69 (n=62) in 2009.
Field work on the farmers' own land was the main reason for keeping buffaloes (87.3%), but lending work buffaloes to neighbors (79.0%) was also important. Other purposes were the transport of goods (16.1%), buffalo trade (11.3%) and meat consumption (6.4%). Buffalo care required 6.2 ±3.00 working hours daily, while the annual working time of a buffalo was 294 ±216.6 hours. The area ploughed with buffaloes remained constant during the past 10 years despite an expansion of the land cropped per farm. Further rapid replacement of buffaloes by tractors is expected in the near future. While the work economy is drastically improved by the use of tractors, buffaloes can still provide a cheap work force and serve as a buffer against economic shocks on poorer farms. Especially poor farms, which lack alternative assets that could quickly be liquidated in times of urgent need for cash, should not abandon buffalo keeping. Livestock was found to be a major part of small mixed farms in NRNNR. The general productivity was low in both analyzed species, buffaloes and pigs. The productivity of pigs can be improved through basic adjustments in feeding, reproductive and hygienic management, and with external support pig production could be further commercialized to provide pork and weaners to local markets and fattening farms. Buffalo production is relatively time intensive and will in the future only be of importance to very poor farms and to farms that cultivate very small terraces on steep slopes. These should be encouraged to continue keeping buffaloes. With such measures, livestock production in NRNNR has a good chance of staying competitive in the future.
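The farm classification described above used Categorical Principal Component Analysis and a two-step clustering algorithm; purely to illustrate that step, the following sketch approximates it with one-hot encoding, PCA and k-means on a hypothetical mini-table that reuses the reported variable names. It is a stand-in, not the procedure actually applied in the study, and the values are invented.

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Hypothetical mini survey table using the variable names reported above.
farms = pd.DataFrame({
    "keep_sow":              ["yes", "no", "yes", "no", "yes", "no"],
    "TLU_pig":               [0.8, 0.2, 1.1, 0.1, 0.9, 0.3],
    "TLU_buffalo":           [1.5, 0.0, 1.2, 0.0, 1.8, 0.0],
    "size_of_corn_fields":   [0.6, 0.1, 0.8, 0.0, 0.7, 0.1],
    "size_of_rubber_fields": [0.0, 1.5, 0.1, 2.0, 0.0, 1.8],
    "altitude_class":        ["high", "low", "high", "low", "high", "low"],
})

# One-hot encode the categorical variables, then standardise everything.
X = pd.get_dummies(farms, columns=["keep_sow", "altitude_class"])
X = StandardScaler().fit_transform(X)

# CatPCA + two-step clustering is approximated here by PCA + k-means.
scores = PCA(n_components=2).fit_transform(X)
farms["cluster"] = KMeans(n_clusters=2, n_init=10).fit_predict(scores)
print(farms)
```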

Relevance:

30.00%

Publisher:

Abstract:

The ongoing growth of the World Wide Web, catalyzed by the increasing possibility of ubiquitous access via a variety of devices, continues to strengthen its role as our prevalent information and communication medium. However, although tools like search engines facilitate retrieval, the task of finally making sense of Web content is still often left to human interpretation. The vision of supporting both humans and machines in such knowledge-based activities led to the development of different systems which allow Web resources to be structured by metadata annotations. Interestingly, two major approaches which gained a considerable amount of attention address the problem from nearly opposite directions: On the one hand, the idea of the Semantic Web suggests formalizing the knowledge within a particular domain by means of the "top-down" approach of defining ontologies. On the other hand, Social Annotation Systems, as part of the so-called Web 2.0 movement, implement a "bottom-up" style of categorization using arbitrary keywords. Experience as well as research into the characteristics of both kinds of systems has shown that their strengths and weaknesses are largely inverse: While Social Annotation suffers from problems like ambiguity or lack of precision, ontologies were especially designed to eliminate those. Ontologies, in turn, suffer from a knowledge acquisition bottleneck, which is successfully overcome by the large user populations of Social Annotation Systems. Instead of regarding them as competing paradigms, the obvious potential synergies of a combination of both have motivated approaches to "bridge the gap" between them. These were fostered by the evidence of emergent semantics, i.e., the self-organized evolution of implicit conceptual structures, within Social Annotation data. While several techniques to exploit the emergent patterns have been proposed, a systematic analysis - especially regarding paradigms from the field of ontology learning - is still largely missing. This also includes a deeper understanding of the circumstances which affect the evolution processes. This work aims to address this gap by providing an in-depth study of methods and influencing factors to capture emergent semantics from Social Annotation Systems. We focus hereby on the acquisition of lexical semantics from the underlying networks of keywords, users and resources. Structured along different ontology learning tasks, we use a methodology of semantic grounding to characterize and evaluate the semantic relations captured by different methods. In all cases, our studies are based on datasets from several Social Annotation Systems. Specifically, we first analyze semantic relatedness among keywords, and identify measures which detect different notions of relatedness. These constitute the input of concept learning algorithms, which then focus on the discovery of synonymous and ambiguous keywords. Hereby, we assess the usefulness of various clustering techniques. As a prerequisite to inducing hierarchical relationships, our next step is to study measures which quantify the level of generality of a particular keyword. We find that comparatively simple measures can approximate the generality information encoded in reference taxonomies. These insights are used to inform the final task, namely the creation of concept hierarchies. For this purpose, generality-based algorithms exhibit advantages compared to clustering approaches.
In order to complement the identification of suitable methods to capture semantic structures, we analyze as a next step several factors which influence their emergence. Empirical evidence is provided that the amount of available data plays a crucial role in determining keyword meanings. From a different perspective, we examine pragmatic aspects by considering different annotation patterns among users. Based on a broad distinction between "categorizers" and "describers", we find that the latter produce more accurate results. This suggests a causal link between pragmatic and semantic aspects of keyword annotation. As a special kind of usage pattern, we then have a look at system abuse and spam. While observing a mixed picture, we suggest that a case-by-case decision should be taken instead of disregarding spammers as a matter of principle. Finally, we discuss a set of applications which operationalize the results of our studies to enhance both Social Annotation and semantic systems. These comprise on the one hand tools which foster the emergence of semantics, and on the other hand applications which exploit the socially induced relations to improve, e.g., searching, browsing, or user profiling facilities. In summary, the contributions of this work highlight viable methods and crucial aspects for designing enhanced knowledge-based services of a Social Semantic Web.
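As an illustration of one simple relatedness measure of the kind examined in such studies (not necessarily one of the measures used in this work), the following sketch computes cosine similarity between keywords in their resource context on a made-up toy folksonomy of (user, tag, resource) assignments.

```python
import numpy as np

# Hypothetical toy folksonomy: (user, tag, resource) assignments.
assignments = [
    ("u1", "python", "r1"), ("u1", "programming", "r1"),
    ("u2", "python", "r1"), ("u2", "snake", "r2"),
    ("u3", "programming", "r3"), ("u3", "java", "r3"),
    ("u1", "java", "r4"), ("u2", "programming", "r4"),
]

tags = sorted({t for _, t, _ in assignments})
resources = sorted({r for _, _, r in assignments})

# Tag-resource count matrix: how often a tag was assigned to a resource.
M = np.zeros((len(tags), len(resources)))
for _, t, r in assignments:
    M[tags.index(t), resources.index(r)] += 1

# Cosine similarity in the resource context as one simple relatedness measure.
norms = np.linalg.norm(M, axis=1, keepdims=True)
similarity = (M / norms) @ (M / norms).T

i, j = tags.index("python"), tags.index("programming")
print(f"relatedness(python, programming) = {similarity[i, j]:.2f}")
```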

Relevance:

30.00%

Publisher:

Abstract:

We are currently at the cusp of a revolution in quantum technology that relies not just on the passive use of quantum effects, but on their active control. At the forefront of this revolution is the implementation of a quantum computer. Encoding information in quantum states as "qubits" makes it possible to use entanglement and quantum superposition to perform calculations that are infeasible on classical computers. The fundamental challenge in the realization of quantum computers is to avoid decoherence – the loss of quantum properties – due to unwanted interaction with the environment. This thesis addresses the problem of implementing entangling two-qubit quantum gates that are robust with respect to both decoherence and classical noise. It covers three aspects: the use of efficient numerical tools for the simulation and optimal control of open and closed quantum systems, the role of advanced optimization functionals in facilitating robustness, and the application of these techniques to two of the leading implementations of quantum computation, trapped atoms and superconducting circuits. After a review of the theoretical and numerical foundations, the central part of the thesis starts with the idea of using ensemble optimization to achieve robustness with respect to both classical fluctuations in the system parameters and decoherence. For the example of a controlled phase gate implemented with trapped Rydberg atoms, this approach is demonstrated to yield a gate that is at least one order of magnitude more robust than the best known analytic scheme. Moreover, this robustness is maintained even for gate durations significantly shorter than those obtained in the analytic scheme. Superconducting circuits are a particularly promising architecture for the implementation of a quantum computer. Their flexibility is demonstrated by performing optimizations for both diagonal and non-diagonal quantum gates. In order to achieve robustness with respect to decoherence, it is essential to implement quantum gates in the shortest possible amount of time. This may be facilitated by using an optimization functional that targets an arbitrary perfect entangler, based on a geometric theory of two-qubit gates. For the example of superconducting qubits, it is shown that this approach leads to significantly shorter gate durations, higher fidelities, and faster convergence than optimization towards specific two-qubit gates. Performing the optimization in Liouville space in order to properly take decoherence into account poses significant numerical challenges, as the dimension scales quadratically compared to Hilbert space. However, it can be shown that for a unitary target, the optimization only requires propagation of at most three states, instead of a full basis of Liouville space. Both for the example of trapped Rydberg atoms and for superconducting qubits, the successful optimization of quantum gates is demonstrated, at a numerical cost significantly lower than was previously thought possible. Together, the results of this thesis point towards a comprehensive framework for the optimization of robust quantum gates, paving the way for the future realization of quantum computers.
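To make the idea of pulse-level gate optimization concrete, here is a toy sketch that optimizes a piecewise-constant control field for a single-qubit X gate by minimizing the gate infidelity with SciPy. The drift Hamiltonian, slice count and time step are arbitrary assumptions; the thesis itself optimizes two-qubit entangling gates with far more sophisticated functionals and propagation schemes.

```python
import numpy as np
from scipy.linalg import expm
from scipy.optimize import minimize

# Toy pulse optimisation: piecewise-constant sigma_x control on top of a
# fixed sigma_z drift, gate infidelity minimised with L-BFGS-B.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
target = sx                              # target gate: X (NOT)
n_slices, dt, drift = 20, 0.1, 0.5 * sz

def propagator(controls):
    """Time-ordered product of the slice propagators."""
    U = np.eye(2, dtype=complex)
    for u in controls:
        U = expm(-1j * dt * (drift + u * sx)) @ U
    return U

def infidelity(controls):
    """Phase-insensitive gate infidelity 1 - |Tr(target^dag U)|^2 / d^2."""
    U = propagator(controls)
    return 1.0 - abs(np.trace(target.conj().T @ U)) ** 2 / 4.0

result = minimize(infidelity, x0=np.full(n_slices, 0.1), method="L-BFGS-B")
print("gate infidelity after optimisation:", result.fun)
```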

Relevance:

30.00%

Publisher:

Abstract:

Since no physical system can ever be completely isolated from its environment, the study of open quantum systems is pivotal to reliably and accurately controlling complex quantum systems. In practice, the reliability of the control field needs to be confirmed via certification of the target evolution, while accuracy requires the derivation of high-fidelity control schemes in the presence of decoherence. In the first part of this thesis an algebraic framework is presented that makes it possible to determine the minimal requirements for the unique characterisation of arbitrary unitary gates in open quantum systems, independent of the particular physical implementation of the employed quantum device. To this end, a set of theorems is devised that can be used to assess whether a given set of input states on a quantum channel is sufficient to judge whether a desired unitary gate is realised. This makes it possible to determine the minimal input for such a task, which proves, quite remarkably, to be independent of system size. These results elucidate the fundamental limits regarding certification and tomography of open quantum systems. The combination of these insights with state-of-the-art Monte Carlo process certification techniques permits a significant improvement in scaling when certifying arbitrary unitary gates. This improvement is not restricted to quantum information devices where the basic information carrier is the qubit, but also extends to systems where the fundamental informational entities can be of arbitrary dimensionality, the so-called qudits. The second part of this thesis concerns the impact of these findings from the point of view of Optimal Control Theory (OCT). OCT for quantum systems utilises concepts from engineering such as feedback and optimisation to engineer constructive and destructive interferences in order to steer a physical process in a desired direction. It turns out that the aforementioned mathematical findings allow novel optimisation functionals to be deduced that significantly reduce not only the memory required by numerical control algorithms but also the total CPU time required to obtain a certain fidelity for the optimised process. The thesis concludes by discussing two problems of fundamental interest in quantum information processing from the point of view of optimal control - the preparation of pure states and the implementation of unitary gates in open quantum systems. For both cases specific physical examples are considered: for the former, the vibrational cooling of molecules via optical pumping, and for the latter, a superconducting phase qudit implementation. In particular, it is illustrated how features of the environment can be exploited to reach the desired targets.
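As a rough illustration of Monte Carlo style certification (not the reduced input-state sets derived in the thesis), the following sketch estimates the average fidelity of a noisy gate implementation by averaging state fidelities over random pure inputs. The Hadamard target and the depolarising noise strength are arbitrary choices made for this example.

```python
import numpy as np

# Toy Monte Carlo certification sketch: average state fidelity between a
# noisy implementation and the ideal unitary, sampled over random inputs.
rng = np.random.default_rng(7)

def random_state(dim):
    psi = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return psi / np.linalg.norm(psi)

def noisy_implementation(rho, U, p=0.02):
    """Target unitary followed by a depolarising channel of strength p."""
    rho_u = U @ rho @ U.conj().T
    return (1 - p) * rho_u + p * np.eye(len(rho)) / len(rho)

dim = 2
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

fidelities = []
for _ in range(500):
    psi = random_state(dim)
    rho_out = noisy_implementation(np.outer(psi, psi.conj()), hadamard)
    ideal = hadamard @ psi
    fidelities.append(np.real(ideal.conj() @ rho_out @ ideal))

# For this channel the average is about 1 - p/2, i.e. roughly 0.99.
print("estimated average fidelity:", np.mean(fidelities))
```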