941 results for end user computing application streaming horizon workspace portalvmware view


Relevance:

30.00%

Publisher:

Abstract:

Sonar signal processing comprises a large number of signal processing algorithms for implementing functions such as target detection, localisation, classification, tracking and parameter estimation. Current implementations of these functions rely on conventional techniques largely based on Fourier methods, which are primarily meant for stationary signals. Interestingly, the signals received by sonar sensors are often non-stationary, so processing methods capable of handling this non-stationarity will fare better than Fourier-transform-based methods. Time-frequency methods (TFMs) are among the best DSP tools for non-stationary signal processing, allowing signals to be analyzed in the time and frequency domains simultaneously. However, apart from the STFT, TFMs have been largely limited to academic research because of the complexity of the algorithms and the limitations of computing power. With the availability of fast processors, many applications of TFMs have been reported in speech processing, image processing and biomedical applications, but not many in sonar processing. A structured effort to fill this lacuna by exploring the potential of TFMs in sonar applications is the net outcome of this thesis. To this end, four TFMs have been explored in detail, viz. the Wavelet Transform, the Fractional Fourier Transform, the Wigner-Ville Distribution and the Ambiguity Function, and their potential for implementing five major sonar functions has been demonstrated with very promising results. What has been conclusively brought out in this thesis is that there is no "one best TFM" for all applications, but there is "one best TFM" for each application. Accordingly, the TFM has to be adapted and tailored in many ways in order to develop specific algorithms for each application.
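
As a concrete illustration of the kind of time-frequency analysis discussed above, the following sketch (not taken from the thesis, and using the general-purpose SciPy library rather than any sonar-specific toolchain) computes the STFT spectrogram of a linear chirp, a simple non-stationary test signal; the sampling rate and window length are illustrative values only.

# A minimal sketch of time-frequency analysis of a non-stationary signal.
import numpy as np
from scipy.signal import chirp, stft

fs = 8000.0                      # sampling rate in Hz (illustrative value)
t = np.arange(0, 1.0, 1.0 / fs)  # one second of data
# Linear chirp sweeping 100 Hz -> 2000 Hz: a simple non-stationary test signal.
x = chirp(t, f0=100.0, t1=1.0, f1=2000.0, method="linear")

# Short-time Fourier transform: window length trades time vs. frequency resolution.
f, tau, Z = stft(x, fs=fs, nperseg=256, noverlap=192)
print(Z.shape)  # (frequency bins, time frames)

# The ridge of the spectrogram tracks the instantaneous frequency of the chirp.
ridge = f[np.argmax(np.abs(Z), axis=0)]
print(ridge[:5], ridge[-5:])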

Relevance:

30.00%

Publisher:

Abstract:

Polymer supports and polymeric complexes are highly versatile and are successfully employed as efficient reagents, substrates and catalysts. Recently, there has been growing interest in the synthesis of tailor-made polymer supports and functionalized polymers for the preparation of metal complexes for various applications. They combine properties arising from the macromolecular structure with the reactivity of the functional group. An interesting feature of functional polymers is their affinity towards metal ions. Therefore, the synthesis, characterization and application of such polymeric complexes are of great scientific and analytical importance. In this investigation, three series of polymeric complexes of transition metal ions are prepared from three Schiff bases. All the complexes and polymeric Schiff bases were characterized by analytical, spectral and thermal methods. The thesis consists of six chapters. The first chapter contains an introduction and a brief review of applications of polymer supports, polymer-supported ligands and complexes. The second chapter gives details of the reagents and instruments used and the procedures adopted for the preparation of the ligands and complexes. The third chapter explains the methods employed for characterization and discusses the results. The fourth chapter gives a detailed study of metal ion removal using the ligands, whereas the fifth chapter describes the development of the Cu(II) ion sensor electrode. The sixth chapter is the summary of the thesis, and the references are presented at the end.

Relevance:

30.00%

Publisher:

Abstract:

The most common and conventional method for removing turbidity from water is coagulation with alum or iron salts, settling of the precipitate in suitably designed clarifiers, and subsequent filtration. However, the sludge produced is bulky, difficult to dewater and accumulates in dumping grounds, causing environmental problems. Synthetic polymers such as polyacrylamide and polyethylene oxide have been investigated for their ability to remove turbidity. They overcome many of the disadvantages of the conventional methods, but are cost-effective only when rapid flocculation and a reduction in sludge volume are demanded. In view of this situation, it was felt that more easily available and eco-friendly materials should be developed for removing turbidity from water. The results of our studies in this direction are presented in this thesis. The thesis comprises nine chapters, with a common bibliography at the end. Chapter 1 gives an introduction to the nature of the turbidity and colour usually present in water. Chapter 2 discusses the nature and availability of the principal material used in these studies, namely chitosan. Chapters 3 to 8, which deal with the actual experimental work, are each subdivided into (a) introduction, (b) materials and methods, (c) results and discussion and (d) conclusions. Chapter 9 summarises the entire work so as to put the results and conclusions into proper perspective.

Relevance:

30.00%

Publisher:

Abstract:

Owing to advances in mobile devices and wireless networks, mobile cloud computing, which combines mobile computing and cloud computing, has gained momentum since 2009. The characteristics of mobile devices and wireless networks make the implementation of mobile cloud computing more complicated than for fixed clouds and give rise to a number of major issues. One of the key issues in mobile cloud computing is the end-to-end delay in servicing a request. Data caching is a technique widely used in wired and wireless networks to improve data access efficiency. In this paper we explore the possibility of a cooperative caching approach to enhance data access efficiency in mobile cloud computing. The proposed approach is based on cloudlets, one of the architectures designed for mobile cloud computing.
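
To make the cooperative caching idea concrete, the following minimal sketch shows how a request could be served from the local cloudlet cache, a neighbouring cloudlet, or the distant cloud, in that order of preference; the class and method names are hypothetical and are not taken from the paper.

# A hypothetical sketch of cloudlet-based cooperative caching.
class Cloudlet:
    def __init__(self, name, neighbours=None):
        self.name = name
        self.cache = {}                      # local cache: key -> data
        self.neighbours = neighbours or []   # cooperating cloudlets

    def get(self, key, cloud):
        # 1. Local cache hit: lowest latency.
        if key in self.cache:
            return self.cache[key], "local"
        # 2. Cooperative hit: ask neighbouring cloudlets before going to the cloud.
        for n in self.neighbours:
            if key in n.cache:
                self.cache[key] = n.cache[key]   # replicate locally for next time
                return self.cache[key], f"neighbour:{n.name}"
        # 3. Miss everywhere: fetch from the distant cloud (highest delay).
        data = cloud[key]
        self.cache[key] = data
        return data, "cloud"

cloud = {"video42": b"..."}
a, b = Cloudlet("A"), Cloudlet("B")
a.neighbours, b.neighbours = [b], [a]
b.cache["video42"] = cloud["video42"]
print(a.get("video42", cloud)[1])   # -> neighbour:B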

Relevance:

30.00%

Publisher:

Abstract:

This article describes the introduction and operation of a server-based computing infrastructure in a university library. It presents the so-called MetaFrame-DV concept of the Universitätsbibliothek Kassel, which the library's information management initiated, designed and implemented over the last four years. Application servers are no longer used only for selected services such as the CD-ROM offering; instead, all of the roughly 200 staff and functional workstations are now maintained server-side through a Citrix MetaFrame installation. Particular attention is paid to the configuration, the practical administration and the daily working conditions at the library staff workstations.

Relevance:

30.00%

Publisher:

Abstract:

At the Institut für Mikrostrukturtechnologie und Analytik, a new technique was developed that has opened up new applications and methods of micro- and nanostructuring on the basis of a novel process. NANOJET goes beyond passive scanning probe microscopy towards a versatile, active machining tool on the micro- and nanometre scale. NANOJET (NANOstructuring Downstream PlasmaJET) is an active atomic force microscopy probe: radicals (chemically active particles with an unpaired valence electron) stream out of the end of an ultra-thin, hollow atomic force microscope tip. This makes it possible not only to scan a sample surface passively in the usual way, but also to modify it simultaneously and in situ through chemical reactions. Material removal is achieved by a chemical etching reaction. In this work, photoresist was used as the substrate for most of the etching experiments. Ground-state fluorine and oxygen atoms were identified as being responsible for etching the resist. Experiments, supplemented by literature data, confirmed the assumption that oxygen radicals, assisted by fluorine radicals, are responsible for the high etch rates achieved. Adding fluorine to an oxygen plasma lowers the activation energy of the etching reaction compared with the use of pure oxygen. Subsequently, a structuring method was demonstrated in which "shaped capillaries" (microstructured apertures) were employed; the apertures were fabricated by an electrochemical etch-stop process. The typical size of the structures etched using the shaped capillaries corresponded to the capillary openings. A Monte Carlo simulation program was developed that modelled the transport of the reactive particles in the long transport tube. Both the transmission of the particles through the transport tube and the capillary and their angular distribution after leaving the capillary were calculated. The aspect ratio of the tubes has a very strong influence: with increasing aspect ratio, the transmission decreased exponentially. The experimental infrastructure thus created was also used to treat and examine biological objects. To this end, a new methodology was developed that allows a three-dimensional view of the cell interior. This was achieved by the controlled removal of material from the cell membrane, carried out with oxygen radicals transported through a hollow tip to the site of the reaction in a localized manner. A piezoresistive cantilever served as the sensor in the atomic force microscope used for imaging. The developed method makes it possible, for the first time, to open cells gently and to further examine the organelles inside. As a proof of further possible uses of the NANOJET process, bone material was also treated. The results of these experiments clearly show that the process can be applied to a wide variety of biological materials, opening up a broad range of applications in biology and medicine.
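
The following is a simplified free-molecular-flow Monte Carlo sketch of the kind of transmission calculation described above: particles are launched into a cylindrical tube and re-emitted diffusely (cosine law) at every wall collision until they leave through one end. The geometry, sampling model and parameter values are assumptions for illustration and are not the thesis code.

# Simplified Monte Carlo estimate of particle transmission through a cylindrical tube.
import numpy as np

rng = np.random.default_rng(0)

def cosine_direction(normal, tangent1, tangent2):
    # Sample a direction from Lambert's cosine law around `normal`.
    u, phi = rng.random(), 2.0 * np.pi * rng.random()
    sin_t, cos_t = np.sqrt(u), np.sqrt(1.0 - u)
    return cos_t * normal + sin_t * (np.cos(phi) * tangent1 + np.sin(phi) * tangent2)

def transmission(aspect_ratio, n_particles=5000, radius=1.0):
    length, passed = aspect_ratio * 2.0 * radius, 0   # aspect ratio = length / diameter
    for _ in range(n_particles):
        # Start uniformly on the entrance aperture (z = 0) with cosine-law emission.
        r, ang = radius * np.sqrt(rng.random()), 2.0 * np.pi * rng.random()
        pos = np.array([r * np.cos(ang), r * np.sin(ang), 0.0])
        d = cosine_direction(np.array([0.0, 0.0, 1.0]),
                             np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]))
        while True:
            # Flight distances to the wall, the exit plane and the entrance plane.
            a, b = d[0] ** 2 + d[1] ** 2, 2.0 * (pos[0] * d[0] + pos[1] * d[1])
            c = pos[0] ** 2 + pos[1] ** 2 - radius ** 2
            s_wall = np.inf if a == 0 else (-b + np.sqrt(max(b * b - 4 * a * c, 0.0))) / (2 * a)
            s_exit = (length - pos[2]) / d[2] if d[2] > 0 else np.inf
            s_back = -pos[2] / d[2] if d[2] < 0 else np.inf
            s = min(s_wall, s_exit, s_back)
            pos = pos + s * d
            if s == s_exit:          # particle leaves through the far end
                passed += 1
                break
            if s == s_back:          # particle is scattered back out of the entrance
                break
            # Diffuse (cosine-law) re-emission from the tube wall.
            n = np.array([-pos[0], -pos[1], 0.0]) / radius
            t1 = np.array([0.0, 0.0, 1.0])
            d = cosine_direction(n, t1, np.cross(n, t1))
    return passed / n_particles

for ar in (1, 5, 20):
    print(ar, transmission(ar))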

Relevance:

30.00%

Publisher:

Abstract:

Owing to its advantages with respect to durability and structural safety, external prestressing has been the standard construction method for box-girder bridges in Germany since 1998. The use of replaceable external tendons is expected to further improve the robustness of bridges and thereby extend their service life. Despite the better corrosion protection compared with internal bonded prestressing, damage cannot be ruled out entirely. To exploit the advantages of external prestressing, the tendon forces therefore have to be monitored periodically, e.g. during the main inspection of the structure. For monitoring cable forces in cable-stayed bridges, vibration measurement methods have proven to be economical and powerful. Transferring the method to the case of external prestressing, where the vibrating lengths are shorter, required additional investigations concerning the effective vibrating length, the boundary conditions and the effective bending stiffness. In the present work, a model-updating procedure, based on the iterative adjustment of a finite element model to the identified natural frequencies and mode shapes of the tendon, was used to determine the tendon forces. This procedure allows the aforementioned parameters (vibrating length, boundary conditions and effective bending stiffness) to be taken into account when identifying the effective tendon forces. Furthermore, any tendon configuration can be modelled, e.g. different cross-sections in the anchorage and deviator zones. For the determination of the tendon forces, a dedicated method was developed, based on the particular dynamic properties of the tendons, in which the aforementioned parameters are corrected independently within each iteration step, which contributes to the robustness of the identification procedure. The developed procedure was implemented in a user-friendly program system. The results obtained were compared with those of the general identification program UPDATE_g2, and very good agreement was found. The newly developed procedure reduces the required computing time to about 30% (from roughly 100 s to 30 s) and is therefore well suited for immediate on-site evaluation. The parameter identification procedures were applied to the tendons of a total of six bridges (four different prestressing systems); in total, 340 tendons were tested. The deviation between the tendon forces identified by vibration measurements and the measured forces (verified for one bridge by a lift-off test) or the applied forces was smaller than 3%. Furthermore, the effects of external influences due to temperature fluctuations and traffic were investigated during the measurements. Particular situations that arose in practical application could largely be captured by using the model-updating procedure. In summary, the use of this procedure considerably increases the accuracy compared with previous vibration measurement methods and also extends the range of application to special cases (e.g. an unintended contact point along the tendon).
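
As a rough illustration of the underlying physics only (not the model-updating procedure of the thesis, which additionally identifies effective length, boundary conditions and bending stiffness), the tension of an ideal taut string without bending stiffness follows directly from its first natural frequency; the numbers below are purely illustrative.

# Zeroth-order tendon force estimate from the first natural frequency (taut-string model).
def taut_string_tension(f1_hz, free_length_m, mass_per_metre):
    # T = 4 * m * L^2 * f1^2 for an ideal string without bending stiffness.
    return 4.0 * mass_per_metre * free_length_m ** 2 * f1_hz ** 2

# Illustrative numbers only: a 15 m free tendon length, 25 kg/m, f1 = 12 Hz.
T = taut_string_tension(12.0, 15.0, 25.0)
print(f"estimated tendon force: {T / 1e3:.0f} kN")   # about 3240 kN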

Relevance:

30.00%

Publisher:

Abstract:

Land use is a crucial link between human activities and the natural environment and one of the main driving forces of global environmental change. Large parts of the terrestrial land surface are used for agriculture, forestry, settlements and infrastructure. Given the importance of land use, it is essential to understand the multitude of influential factors and the resulting land-use patterns. An essential methodology to study and quantify such interactions is provided by land-use models. By applying land-use models, it is possible to analyze the complex structure of linkages and feedbacks and to determine the relevance of driving forces. Modeling land use and land-use changes has a long tradition. On the regional scale in particular, a variety of models for different regions and research questions has been created. Modeling capabilities grow with steady advances in computer technology, which are driven on the one hand by increasing computing power and on the other hand by new methods in software development, e.g. object- and component-oriented architectures. In this thesis, SITE (Simulation of Terrestrial Environments), a novel framework for integrated regional land-use modeling, is introduced and discussed. Particular features of SITE are its notably extended capability to integrate models and the strict separation of application and implementation. These features enable efficient development, testing and use of integrated land-use models. On the system side, SITE provides generic data structures (grids, grid cells, attributes, etc.) and takes over responsibility for their administration. By means of a scripting language (Python) that has been extended with language features specific to land-use modeling, these data structures can be utilized and manipulated by modeling applications. The scripting language interpreter is embedded in SITE. The integration of sub-models can be achieved via the scripting language or by using a generic interface provided by SITE. Furthermore, functionalities important for land-use modeling, such as model calibration, model tests and analysis support for simulation results, have been integrated into the generic framework. During the implementation of SITE, specific emphasis was placed on expandability, maintainability and usability. Along with the modeling framework, a land-use model for the analysis of the stability of tropical rainforest margins was developed in the context of the collaborative research project STORMA (SFB 552). In a research area in Central Sulawesi, Indonesia, socio-environmental impacts of land-use changes were examined. SITE was used to simulate land-use dynamics in the historical period from 1981 to 2002. Analogously, a scenario that did not consider migration in the population dynamics was analyzed. For the calculation of crop yields and trace gas emissions, the DAYCENT agro-ecosystem model was integrated. In this case study, it could be shown that land-use changes in the Indonesian research area were mainly characterized by the expansion of agricultural areas at the expense of natural forest. For this reason, the situation had to be interpreted as unsustainable, even though increased agricultural use implied economic improvements and higher farmers' incomes. Because of the importance of model calibration, it was explicitly addressed in the SITE architecture through the introduction of a specific component.
The calibration functionality can be used by all SITE applications and enables largely automated model calibration. Calibration in SITE is understood as a process that finds an optimal, or at least adequate, solution for a set of arbitrarily selectable model parameters with respect to an objective function. In SITE, an objective function is typically a map comparison algorithm capable of comparing a simulation result to a reference map. Several map optimization and map comparison methodologies are available and can be combined. The STORMA land-use model was calibrated using a genetic algorithm for optimization and the figure of merit map comparison measure as the objective function. The time period for the calibration ranged from 1981 to 2002, and respective reference land-use maps were compiled for this period. It could be shown that an efficient automated model calibration with SITE is possible. Nevertheless, the selection of the calibration parameters required detailed knowledge about the underlying land-use model and cannot be automated. In another case study, decreases in crop yields and the resulting losses in income from coffee cultivation were analyzed and quantified under the assumption of four different deforestation scenarios. For this task, an empirical model describing the dependence of bee pollination, and the resulting coffee fruit set, on the distance to the closest natural forest was integrated. Land-use simulations showed that, depending on the magnitude and location of ongoing forest conversion, pollination services are expected to decline continuously. This results in a reduction of coffee yields of up to 18% and a loss of net revenues per hectare of up to 14%. However, the study also showed that ecological and economic values can be preserved if patches of natural vegetation are conserved in the agricultural landscape.
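
As an illustration of the calibration objective mentioned above, the following hedged sketch computes a simplified, binary-change version of the figure of merit map comparison measure (agreement between observed and simulated change). It is not the SITE implementation, and the tiny example grids are invented.

# Simplified figure of merit: hits / (hits + misses + false alarms + wrong-class hits).
import numpy as np

def figure_of_merit(initial, reference, simulated):
    initial, reference, simulated = map(np.asarray, (initial, reference, simulated))
    obs_change = reference != initial        # cells that really changed
    sim_change = simulated != initial        # cells the model changed
    hits = np.sum(obs_change & sim_change & (reference == simulated))
    misses = np.sum(obs_change & ~sim_change)
    false_alarms = np.sum(~obs_change & sim_change)
    wrong_change = np.sum(obs_change & sim_change & (reference != simulated))
    return hits / (hits + misses + false_alarms + wrong_change)

# Tiny illustrative grids: 0 = forest, 1 = agriculture.
initial   = [[0, 0, 0], [0, 0, 0], [0, 0, 0]]
reference = [[1, 1, 0], [0, 1, 0], [0, 0, 0]]
simulated = [[1, 0, 0], [0, 1, 1], [0, 0, 0]]
print(figure_of_merit(initial, reference, simulated))  # 2 hits, 1 miss, 1 false alarm -> 0.5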

Relevance:

30.00%

Publisher:

Abstract:

Let E be a number field and G be a finite group. Let A be any O_E-order of full rank in the group algebra E[G] and X be a (left) A-lattice. We give a necessary and sufficient condition for X to be free of given rank d over A. In the case that the Wedderburn decomposition E[G] \cong \oplus_x M_x is explicitly computable and each M_x is in fact a matrix ring over a field, this leads to an algorithm that either gives elements \alpha_1,...,\alpha_d \in X such that X = A\alpha_1 \oplus ... \oplus A\alpha_d or determines that no such elements exist. Let L/K be a finite Galois extension of number fields with Galois group G such that E is a subfield of K and put d = [K : E]. The algorithm can be applied to certain Galois modules that arise naturally in this situation. For example, one can take X to be O_L, the ring of algebraic integers of L, and A to be the associated order A(E[G];O_L) \subseteq E[G]. The application of the algorithm to this special situation is implemented in Magma under certain extra hypotheses when K = E = \IQ.
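
For orientation, a standard example of an explicitly computable Wedderburn decomposition of the kind assumed above (a textbook fact, not taken from this work) is the rational group algebra of the symmetric group S_3:

\IQ[S_3] \cong \IQ \oplus \IQ \oplus M_2(\IQ),

with one factor for each of the trivial, sign and two-dimensional irreducible representations; every factor is a matrix ring over a field, so the hypothesis of the algorithm is satisfied in this case.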

Relevance:

30.00%

Publisher:

Abstract:

With this document, we provide a compilation of in-depth discussions on some of the most current security issues in distributed systems. The six contributions were collected and presented at the 1st Kassel Student Workshop on Security in Distributed Systems (KaSWoSDS’08). We are pleased to present a collection of papers that not only shed light on the theoretical aspects of their topics, but are also accompanied by elaborate practical examples. In Chapter 1, Stephan Opfer discusses viruses, one of the oldest threats to system security; for years there has been an arms race between virus producers and anti-virus software providers, with no end in sight. In Chapter 2, Stefan Triller demonstrates how malicious code can be injected into a target process using a buffer overflow. Websites usually store their data and user information in databases. As with buffer overflows, the possibility of SQL injection attacks targeting such databases is often left open by unwary programmers; Stephan Scheuermann gives us a deeper insight into the mechanisms behind such attacks in Chapter 3. Cross-site scripting (XSS) is a method for inserting malicious code into websites viewed by other users; Michael Blumenstein explains this issue in Chapter 4. While code can be injected into other websites via XSS attacks in order to spy on the data of internet users, spoofing subsumes all methods that directly involve taking on a false identity. In Chapter 5, Till Amma shows us different ways in which this can be done and how it is prevented. Last but not least, cryptographic methods are used to encode confidential data in such a way that, even if it falls into the wrong hands, the culprits cannot decode it. Over the centuries, many different ciphers have been developed, applied, and finally broken. Ilhan Glogic sketches this history in Chapter 6.
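
As a generic illustration of the SQL injection problem discussed above (not an example from the workshop papers), the following sketch uses Python's built-in sqlite3 module to contrast a vulnerable string-concatenated query with a parameterized one; the table and values are invented.

# SQL injection: vulnerable string concatenation vs. a parameterized query.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cr3t')")

user_input = "alice' OR '1'='1"   # attacker-controlled value

# Vulnerable: string concatenation lets the input rewrite the query,
# so the WHERE condition is always true and every row is returned.
unsafe = f"SELECT secret FROM users WHERE name = '{user_input}'"
print(conn.execute(unsafe).fetchall())     # leaks all secrets

# Safe: a parameterized query treats the input strictly as data.
safe = "SELECT secret FROM users WHERE name = ?"
print(conn.execute(safe, (user_input,)).fetchall())   # no rows match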

Relevance:

30.00%

Publisher:

Abstract:

In connection with the (revived) demand for considering applications in the teaching of mathematics, various schemata or lists of criteria have been developed since the end of the sixties, which set up requirements about closeness to the real world or about the type of mathematics being used, and which have made it possible to analyze the available applications in their light. After having stated the problem (in section 1), we present (in section 2) a sketch of some of the best known of these, and of some earlier schemata, although we are not aiming for a complete picture. Then (in section 3) we distinguish among different dimensions in the analysis of applications. With this as a basis, we develop (in section 4) our own suggestion for categorizing types of applications and conceptions for application-oriented mathematics instruction. Then (in section 5) we illustrate our schemata with some examples of performed evaluations. Finally (in section 6), we present some preliminary first results of the analysis of teaching conceptions.

Relevance:

30.00%

Publisher:

Abstract:

Context awareness, dynamic reconfiguration at runtime and heterogeneity are key characteristics of future distributed systems, particularly in ubiquitous and mobile computing scenarios. The main contributions of this dissertation are theoretical as well as architectural concepts facilitating information exchange and fusion in heterogeneous and dynamic distributed environments. Our main focus is on bridging the heterogeneity issues and, at the same time, considering uncertain, imprecise and unreliable sensor information in information fusion and reasoning approaches. A domain ontology is used to establish a common vocabulary for the exchanged information. We thereby explicitly support different representations for the same kind of information and provide Inter-Representation Operations that convert between them. Special account is taken of the conversion of associated meta-data that express uncertainty and impreciseness. The Unscented Transformation, for example, is applied to propagate Gaussian normal distributions across highly non-linear Inter-Representation Operations. Uncertain sensor information is fused using the Dempster-Shafer Theory of Evidence as it allows explicit modelling of partial and complete ignorance. We also show how to incorporate the Dempster-Shafer Theory of Evidence into probabilistic reasoning schemes such as Hidden Markov Models in order to be able to consider the uncertainty of sensor information when deriving high-level information from low-level data. For all these concepts we provide architectural support as a guideline for developers of innovative information exchange and fusion infrastructures that are particularly targeted at heterogeneous dynamic environments. Two case studies serve as proof of concept. The first case study focuses on heterogeneous autonomous robots that have to spontaneously form a cooperative team in order to achieve a common goal. The second case study is concerned with an approach for user activity recognition which serves as baseline for a context-aware adaptive application. Both case studies demonstrate the viability and strengths of the proposed solution and emphasize that the Dempster-Shafer Theory of Evidence should be preferred to pure probability theory in applications involving non-linear Inter-Representation Operations.
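
For readers unfamiliar with the Dempster-Shafer Theory of Evidence referred to above, the following minimal sketch implements Dempster's rule of combination for two mass functions; the sensor values and the frozenset-based encoding are illustrative assumptions, not the dissertation's implementation.

# Dempster's rule of combination for two mass functions over a frame of discernment.
from itertools import product

def combine(m1, m2):
    # Mass functions are dicts mapping frozensets of hypotheses to masses.
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb           # mass falling on the empty set
    # Normalize by the non-conflicting mass (Dempster's normalization).
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two sensors report on the user's activity; the set {'walk', 'run'} models
# partial ignorance explicitly ("moving, but unsure how").
s1 = {frozenset({"walk"}): 0.6, frozenset({"walk", "run"}): 0.4}
s2 = {frozenset({"run"}): 0.3, frozenset({"walk", "run"}): 0.7}
print(combine(s1, s2))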

Relevance:

30.00%

Publisher:

Abstract:

A novel method for adapting human-machine interfaces to individual operators is presented. By applying abstractions of evolutionary mechanisms such as selection, recombination and mutation in the EOGUI methodology (Evolutionary Optimization of Graphical User Interfaces), a computer-aided implementation of the method for graphical user interfaces, especially for industrial processes, is provided. The evolutionary optimization takes into account both objective, i.e. measurable, quantities such as selection frequencies and selection times, and the operators' subjective impressions captured through online questionnaires. In this way, the visualization of systems is adapted to the needs and preferences of individual operators. In this work, the operator can choose, from four user interfaces with different levels of abstraction for the example process MIPS (MIschungsProzess-Simulation, a mixing-process simulation), the objects that best support him or her in controlling the process. The EOGUI algorithm selects these objects, modifies them if necessary and combines them into a new graphical user interface adapted to the operator. Using the MIPS process, experiments were carried out with the EOGUI methodology in order to verify the applicability, acceptance and effectiveness of the method for the control of industrial processes. The investigations largely show that the developed methodology for the evolutionary optimization of human-machine interfaces does indeed adapt industrial process visualizations to the individual operator and improves process control.
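
A highly simplified, hypothetical sketch of such an evolutionary loop is given below: candidate interfaces are sets of display objects, scored by a weighted blend of an objective measure and a subjective rating, then recombined and mutated. The object names, weights and dummy scoring functions are illustrative only and are not taken from EOGUI.

# Toy evolutionary loop over candidate interface layouts.
import random
random.seed(1)

OBJECTS = ["trend_plot", "bar_gauge", "value_table", "alarm_list", "schematic", "pie_chart"]

def fitness(ui, objective_score, subjective_score, w_obj=0.6):
    # Weighted blend of measured usage efficiency and questionnaire rating.
    return w_obj * objective_score(ui) + (1.0 - w_obj) * subjective_score(ui)

def recombine(parent_a, parent_b):
    # The child inherits a random mix of its parents' display objects.
    pool = list(set(parent_a) | set(parent_b))
    return random.sample(pool, k=min(4, len(pool)))

def mutate(ui, rate=0.2):
    # Occasionally swap one object for an unused one.
    unused = [o for o in OBJECTS if o not in ui]
    if unused and random.random() < rate:
        ui = ui[:-1] + [random.choice(unused)]
    return ui

# Dummy scores stand in for logged selection times and online questionnaires.
objective = lambda ui: sum(len(name) for name in ui) / 40.0
subjective = lambda ui: random.random()

population = [random.sample(OBJECTS, 4) for _ in range(6)]
for generation in range(5):
    population.sort(key=lambda ui: fitness(ui, objective, subjective), reverse=True)
    parents = population[:2]
    population = parents + [mutate(recombine(*parents)) for _ in range(4)]
print("fittest interface:", population[0])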

Relevance:

30.00%

Publisher:

Abstract:

In the vision of Mark Weiser on ubiquitous computing, computers are disappearing from the focus of the users and are seamlessly interacting with other computers and users in order to provide information and services. This shift away from direct computer interaction requires applications to interact in another way, without bothering the user. Context is the information which can be used to characterize the situation of persons, locations, or other objects relevant to the applications. Context-aware applications are capable of monitoring and exploiting knowledge about external operating conditions. These applications can adapt their behaviour based on the retrieved information and thus replace, at least to a certain extent, the missing user interactions. Context awareness can be assumed to be an important ingredient for applications in ubiquitous computing environments. However, context management in ubiquitous computing environments must reflect the specific characteristics of these environments, for example distribution, mobility, resource-constrained devices, and heterogeneity of context sources. Modern mobile devices are equipped with fast processors, sufficient memory, and several sensors, such as a Global Positioning System (GPS) sensor, a light sensor, or an accelerometer. Since many applications in ubiquitous computing environments can exploit context information for enhancing their service to the user, these devices are highly useful for context-aware applications in ubiquitous computing environments. Additionally, context reasoners and external context providers can be incorporated. It is possible that several context sensors, reasoners and context providers offer the same type of information. However, the information providers can differ in the quality levels (e.g. accuracy), representations (e.g. a position represented as coordinates or as an address) and costs (such as battery consumption) of the offered information. In order to simplify the development of context-aware applications, developers should be able to access context information transparently, without bothering with the underlying context accessing techniques and distribution aspects. They should rather be able to express which kind of information they require, which quality criteria this information should fulfil, and how much the provision of this information should cost (not only monetary cost but also energy or performance usage). For this purpose, application developers as well as developers of context providers need a common language and vocabulary to specify which information they require and which they provide, respectively. These descriptions and criteria have to be matched. For such a matching, it is likely that a transformation of the provided information is needed to fulfil the criteria of the context-aware application. As it is possible that more than one provider fulfils the criteria, a selection process is required. In this process the system has to trade off the provided quality of context and the costs of the context provider against the quality of context requested by the context consumer. This selection allows context sources to be turned on only if required. Explicitly selecting context services, and thereby dynamically activating and deactivating the local context providers, has the advantage that resource consumption is reduced, since unused context sensors in particular are deactivated.
One promising solution is a middleware providing appropriate support in consideration of the principles of service-oriented computing, such as loose coupling, abstraction, reusability, and discoverability of context providers. This allows us to abstract context sensors, context reasoners and also external context providers as context services. In this thesis we present our solution, consisting of a context model and ontology, a context offer and query language, a comprehensive matching and mediation process, and a selection service. Especially the matching and mediation process and the selection service differ from existing work. The matching and mediation process allows an autonomous establishment of mediation processes in order to transfer information from an offered representation into a requested representation. In contrast to other approaches, the selection service does not select only one service for a service request; rather, it selects a set of services in order to fulfil all requests, which also facilitates the sharing of services. The approach is extensively reviewed with regard to the different requirements, and a set of demonstrators shows its usability in real-world scenarios.
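
The following much-simplified sketch illustrates the offer/query matching, mediation and selection idea described above; the field names, the single conversion step standing in for an Inter-Representation Operation, and the cost-based selection rule are assumptions for illustration, not the actual offer and query language of the middleware.

# Toy matching, mediation and selection of context offers against a context query.
from dataclasses import dataclass

@dataclass
class ContextOffer:
    provider: str
    info_type: str        # e.g. "position"
    representation: str   # e.g. "wgs84" or "address"
    accuracy_m: float     # offered quality of context (error in metres)
    cost: float           # e.g. relative battery drain

@dataclass
class ContextQuery:
    info_type: str
    representation: str
    max_accuracy_m: float # required quality of context
    max_cost: float

# A single converter stands in for an Inter-Representation Operation; converting
# between representations degrades the accuracy meta-data.
CONVERTERS = {("wgs84", "address"): lambda accuracy: accuracy + 15.0}

def matches(offer, query):
    if offer.info_type != query.info_type or offer.cost > query.max_cost:
        return None
    accuracy = offer.accuracy_m
    if offer.representation != query.representation:
        convert = CONVERTERS.get((offer.representation, query.representation))
        if convert is None:
            return None              # no mediation path available
        accuracy = convert(accuracy) # mediate into the requested representation
    return accuracy if accuracy <= query.max_accuracy_m else None

def select(offers, query):
    # Among all matching offers, pick the cheapest one.
    candidates = [(o, matches(o, query)) for o in offers]
    candidates = [(o, acc) for o, acc in candidates if acc is not None]
    return min(candidates, key=lambda pair: pair[0].cost, default=None)

offers = [
    ContextOffer("gps",  "position", "wgs84",   5.0,   0.8),
    ContextOffer("cell", "position", "address", 300.0, 0.1),
]
query = ContextQuery("position", "address", max_accuracy_m=50.0, max_cost=1.0)
print(select(offers, query))  # the GPS offer, mediated into an address representation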