868 results for end user modes of operation
Abstract:
Nowadays, communication environments are characterized by a myriad of competing and complementary technologies that aim to provide a ubiquitous connectivity service. Next Generation Networks need to hide this heterogeneity by providing a new abstraction level, while simultaneously being aware of the underlying technologies to deliver richer service experiences to the end-user. Moreover, the increasing interest in group-based multimedia services, together with their ever-growing resource demands and network dynamics, has been driving research towards more scalable and flexible network control approaches. The work developed in this Thesis enables such abstraction and exploits the prevailing heterogeneity in favor of context-aware network management and adaptation. In this scope, we introduce a novel hierarchical control framework with self-management capabilities that enables the concept of Abstract Multiparty Trees (AMTs) to ease the control of multiparty content distribution throughout heterogeneous networks. A thorough evaluation of the proposed multiparty transport control framework was performed in the scope of this Thesis, assessing its benefits in terms of network selection, delivery tree reconfiguration and resource savings. Moreover, we developed an analytical study to highlight the scalability of the AMT concept as well as its flexibility in large-scale networks and group sizes. To prove the feasibility and easy deployment of the proposed control framework, we implemented a proof-of-concept demonstrator that comprehends the main control procedures conceptually introduced. Its outcomes highlight the good performance of the multiparty content distribution tree control, including its local and global reconfiguration. In order to endow the AMT concept with the ability to guarantee the best service experience for the end-user, we integrate two additional QoE enhancement approaches into the control framework. The first employs the concept of Network Coding to improve the robustness of the multiparty content delivery, aiming at mitigating the impact of possible packet losses on the end-user service perception. The second approach relies on a machine learning scheme to autonomously determine at each node the expected QoE towards a certain destination. This knowledge is then used by different QoE-aware network management schemes that, jointly, maximize the overall users' QoE. The performance and scalability of the control procedures developed, aided by the context- and QoE-aware mechanisms, show the advantages of the AMT concept and the proposed hierarchical control strategy for multiparty content distribution with enhanced service experience. Moreover, we also prove the feasibility of the solution in a practical environment, and provide future research directions that benefit the evolved control framework and make it commercially feasible.
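As an aside, the robustness idea behind the Network Coding approach mentioned above can be illustrated with a minimal XOR sketch (my own illustration, not the Thesis' coding scheme; packet contents and names are assumptions): sending two packets plus their XOR lets a receiver rebuild any single lost packet.

```python
# Illustrative sketch of XOR-based network coding for loss robustness.
# Not the Thesis' actual scheme; packet names and sizes are assumptions.

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Sender: transmit p1, p2 and a redundant coded packet p1 XOR p2.
p1 = b"multiparty-chunk-1"
p2 = b"multiparty-chunk-2"
coded = xor_bytes(p1, p2)

# Receiver: p2 was lost in transit, but it can be rebuilt from p1 and the coded packet.
received = {"p1": p1, "coded": coded}          # "p2" missing
recovered_p2 = xor_bytes(received["p1"], received["coded"])
assert recovered_p2 == p2
```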
Abstract:
Perturbations of asymptotically Anti-de Sitter (AdS) spacetimes are often considered by imposing field-vanishing boundary conditions (BCs) at the AdS boundary. Such BCs, of Dirichlet type, imply a vanishing energy flux at the boundary, but the converse is, generically, not true. Regarding AdS as a gravitational box, we consider vanishing energy flux (VEF) BCs as a more fundamental physical requirement and we show that these BCs can lead to a new branch of modes. As a concrete example, we consider Maxwell perturbations on Kerr-AdS black holes in the Teukolsky formalism, but our formulation also applies to other spin fields. Imposing VEF BCs, we find a set of two Robin BCs, even for Schwarzschild-AdS black holes. The Robin BCs on the Teukolsky variables can be used to study quasinormal modes, superradiant instabilities and vector clouds. As a first application, we consider here the quasinormal modes of Schwarzschild-AdS black holes. We find that one of the Robin BCs yields the quasinormal spectrum reported in the literature, while the other one unveils a new branch of the quasinormal spectrum.
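For orientation, a Robin boundary condition fixes a linear combination of the field and its derivative at the boundary; schematically (notation assumed here for illustration, not taken from the paper),

$$\alpha\,\psi\big|_{r=r_{\partial}} + \beta\,\partial_r \psi\big|_{r=r_{\partial}} = 0, \qquad \alpha,\beta \in \mathbb{R},$$

which reduces to the Dirichlet case for $\beta=0$ and to the Neumann case for $\alpha=0$; the VEF requirement selects two such specific combinations for the Teukolsky variables.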
Abstract:
The broad capabilities of current mobile devices have paved the way for Mobile Crowd Sensing (MCS) applications. The success of this emerging paradigm strongly depends on the quality of the received data which, in turn, is contingent on mass user participation; the broader the participation, the more useful these systems become. However, there is an ongoing trend that tries to integrate MCS applications with emerging computing paradigms such as cloud computing. The intuition is that such a transition can significantly improve the overall efficiency while at the same time offering stronger security and privacy-preserving mechanisms for the end-user. In this position paper, we dwell on the underpinnings of incorporating cloud computing techniques to handle the vast amount of data collected in MCS applications. That is, we present a list of core system, security and privacy requirements that must be met if such a transition is to be successful. To this end, we first address several competing challenges not previously considered in the literature, such as the scarce energy resources of battery-powered mobile devices as well as their limited computational resources, which often prevent the use of computationally heavy cryptographic operations and thus offer only limited security services to the end-user. Finally, we present a use case scenario as a comprehensive example. Based on our findings, we posit open issues and challenges, and discuss possible ways to address them, so that security and privacy do not hinder the migration of MCS systems to the cloud.
Abstract:
The current ubiquitous network access and increase in network bandwidth are driving the sales of mobile location-aware user devices and, consequently, the development of context-aware applications, namely location-based services. The goal of this project is to provide consumers of location-based services with a richer end-user experience by means of service composition, personalization, device adaptation and continuity of service. Our approach relies on a multi-agent system composed of proxy agents that act as mediators and providers of personalization meta-services, device adaptation and continuity of service for consumers of pre-existing location-based services. These proxy agents, which have Web services interfaces to ensure a high level of interoperability, perform service composition and take into consideration the preferences of the users and the limitations of the user devices, making the usage of different types of devices seamless for the end-user. To validate and evaluate the performance of this approach, use cases were defined, tests were conducted and the gathered results demonstrated that the initial goals were successfully fulfilled.
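A rough sketch of the kind of adaptation such a proxy agent performs is given below; it is an illustration only, with all class and field names invented here rather than taken from the project.

```python
# Illustrative sketch: a proxy agent adapting a location-based service result
# to the user's device and preferences. Names and fields are hypothetical.
from dataclasses import dataclass

@dataclass
class DeviceProfile:
    screen_width: int          # pixels
    supports_images: bool

@dataclass
class UserPreferences:
    language: str
    max_results: int

def adapt_poi_list(pois: list[dict], device: DeviceProfile, prefs: UserPreferences) -> list[dict]:
    """Personalize and adapt: truncate to the preferred count and strip media on small devices."""
    adapted = []
    for poi in pois[: prefs.max_results]:
        entry = {"name": poi["name"], "lang": prefs.language}
        if device.supports_images and device.screen_width >= 320:
            entry["thumbnail"] = poi.get("thumbnail")
        adapted.append(entry)
    return adapted

# The same upstream result is delivered differently to two devices.
pois = [{"name": "Museum", "thumbnail": "museum.jpg"}, {"name": "Park", "thumbnail": "park.jpg"}]
print(adapt_poi_list(pois, DeviceProfile(240, False), UserPreferences("pt", 1)))
print(adapt_poi_list(pois, DeviceProfile(1080, True), UserPreferences("en", 2)))
```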
Abstract:
This dissertation was carried out in collaboration with the Monteiro, Ribas business group, with the main objective of auditing the management of the industrial waste produced by its factories located on Estrada da Circunvalação, in Porto. To meet this objective, a survey of the legal obligations concerning waste was first carried out and recommended practices for internal waste management were gathered. For each factory, the waste produced was identified and its path was analysed, considering its origin, the places and modes of containment at the source, the modes of internal transport, the places and modes of preliminary storage, and also the quantities produced, the carriers, the final operators and the final management operations, the latter four items referring to the year 2013. The audit was then carried out in the different units, checking compliance with the legal requirements and with good waste management practice. The main non-conformities detected, common to the various plants, were the lack of a defined place/container for the containment of some waste, the missing or insufficient identification of containers/containment areas, the lack of retention basins for hazardous liquid waste, the fact that during internal transport only hazardous waste is covered, and the fact that hazardous liquid waste is transported neither on mobile retention basins nor with the material needed to absorb spills. For each waste stream and for each industrial unit, corrective and/or improvement measures were proposed where applicable. Regarding preliminary storage, the main non-conformity detected was that all four storage yards held hazardous waste at the time of the audits, which is not adequate. Corrective and/or improvement measures were proposed for each yard. As a global proposal, taking economic and safety factors into account, it was suggested that only the hazardous waste yard should store this type of waste, which requires the internal transport procedures to be improved so that this waste is taken directly to the hazardous waste yard. Accordingly, two of the yards should undergo some remodelling, namely being covered and closed, even if not completely, and the hazardous waste yard should be closed, keeping openings for ventilation, and should be equipped with spill containment kits, safety data sheets and emergency procedures; in addition, because the spill containment system is small compared with the total storage capacity, the use of retention basins is advised for some of the hazardous liquid waste containers. Throughout this process, and as a consequence of the audit, some situations considered non-compliant were corrected. Appropriate work instructions were also prepared and will be made available later. A process evaluation methodology was also developed as a working basis for reducing the waste generated. The stage chosen for its application was an auxiliary stage of the production process of Monteiro, Ribas - Revestimentos, S.A - the cleaning of vats with solvents, in order to try to minimise the solvent waste produced in this operation.
Since the factory already performs this operation taking prevention and reuse measures into account, recycling is currently the only way to try to minimise solvent waste. Two options were therefore studied, namely the acquisition of a solvent regeneration unit and the contracting of an operator that would regenerate the solvent waste and return the regenerated solvent. The first option could allow a reduction of about 95% in the production of solvent waste and in the purchase of pure solvent, with an estimated annual saving of about **** € and a payback period of about 16 months; the second could lead to a significant reduction, of about 65%, in the purchase of pure solvent and to an annual saving of about **** €.
Abstract:
If you want to know whether a property is true or not in a specific algebraic structure, you need to test that property on the given structure. This can be done by hand, which can be cumbersome and error-prone. In addition, the time consumed in testing depends on the size of the structure to which the property is applied. We present an implementation of a system for finding counterexamples and testing properties of models of first-order theories. This system is supposed to provide a convenient and paperless environment for researchers and students investigating or studying such models, and algebraic structures in particular. To implement a first-order theory in the system, a suitable first-order language and some axioms are required. The components of a language are given by a collection of variables, a set of predicate symbols, and a set of operation symbols. Variables and operation symbols are used to build terms. Terms, predicate symbols, and the usual logical connectives are used to build formulas. A first-order theory now consists of a language together with a set of closed formulas, i.e. formulas without free occurrences of variables. The set of formulas is also called the axioms of the theory. The system uses several different formats to allow the user to specify languages, to define axioms and theories and to create models. Besides the obvious operations and tests on these structures, we have introduced the notion of a functor between classes of models in order to generate more complex models from given ones automatically. As an example, we will use the system to create several lattice structures starting from a model of the theory of pre-orders.
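As a flavour of what such property testing involves (a minimal sketch, not the system's actual implementation or input format), a closed axiom such as transitivity can be checked on a finite model by exhausting all variable assignments:

```python
# Minimal sketch (not the described system's code): brute-force checking of a
# closed first-order axiom on a finite model, here transitivity of a relation.
from itertools import product

def is_transitive(universe, leq) -> bool:
    """Check  forall x,y,z: (x<=y and y<=z) -> x<=z  by exhausting the finite universe."""
    return all(not (leq(x, y) and leq(y, z)) or leq(x, z)
               for x, y, z in product(universe, repeat=3))

def counterexamples(universe, leq):
    """Return the tuples witnessing a violation, if any."""
    return [(x, y, z) for x, y, z in product(universe, repeat=3)
            if leq(x, y) and leq(y, z) and not leq(x, z)]

# A pre-order model on {1, 2, 3, 4}: divisibility.
universe = [1, 2, 3, 4]
divides = lambda a, b: b % a == 0
print(is_transitive(universe, divides))        # True
print(counterexamples(universe, divides))      # []
```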
Abstract:
Functionalized microcantilevers offer an ideal platform for nano- and micro-mechanics and for the development of highly sensitive (bio)sensors. The operating principle relies on physicochemical events taking place on the functionalized side of the microcantilever, which induce a surface stress difference between the two sides of the cantilever and cause a vertical deflection of the lever. However, the factors and interfacial phenomena that govern the nature and magnitude of the surface stress are still poorly understood. To shed light on this phenomenon, the first part of this thesis studies the response of microcantilevers that are gold-coated and functionalized with an electroactive self-assembled monolayer (SAM). The formation of a ferrocenylundecanethiol (FcC11SH) SAM on the gold surface of a microcantilever is the model used to better understand electrochemically induced surface stress. The results obtained show that a redox transformation of the FcC11SH SAM creates a surface stress that results in a vertical deflection of the microcantilever. Depending on the flexibility of the microcantilever, this deflection can vary from a few nanometres to a few micrometres. The oxidation of this FcC11SH SAM in a perchlorate-ion environment generates a compressive surface stress change. The results indicate that the deflection of the microcantilever is due to a lateral tension arising from molecular reorientation and expansion during charge transfer and anion pairing. To verify this hypothesis, the same experiments were repeated with microcantilevers covered with a mixed SAM, in which the electroactive ferrocene groups are isolated by inactive alkylthiols. When a potential is applied, a current is detected but the microcantilever shows no deflection. These results confirm that the deflection of the microcantilever is due to a lateral pressure originating from the ferrocenium, which reorganizes and exerts pressure on its neighbours, rather than from anion pairing. The amplitude of the vertical deflection of the microcantilever depends on the molecular structure of the SAM and on the type of anion used in the electrochemical reaction. In the next part of the thesis, electrochemistry and surface plasmon resonance spectroscopy were combined to arrive at a description of the adsorption and aggregation of n-alkyl sulfates at the FcC11SAu/electrolyte interface. At all solution concentrations, the surfactant molecules are stacked perpendicular to the electrode surface as an interdigitated condensed monolayer. However, the density of the specifically adsorbed film was found to be affected by the state of organization of the surfactants in solution. At low concentration, where the surfactant molecules are present as solvated monomers, the monomers can easily adapt to the changing surface concentration of ferrocenium during the potential sweep. However, when the molecules are present in solution as micelles, a lower surfactant density was found, owing to their inability to respond effectively to the dynamically generated ferrocenium surface.
Abstract:
This thesis deals with restart automata and extensions of restart automata. Restart automata are a tool for recognizing formal languages. They are motivated by the linguistic method of analysis by reduction and were introduced in 1995 by Jancar, Mráz, Plátek and Vogel. Restart automata consist of a finite control, a read/write window of fixed size and a flexible tape. Initially, this tape contains the input as well as tape delimiter symbols. The computation of a restart automaton proceeds in so-called cycles. Each cycle begins at the left end of the tape in the start state, performs a local rewriting on the tape, and ends with a restart, in which the read/write window is moved back to the left end and the start state is re-entered. This thesis is mainly concerned with two extensions of restart automata: CD systems of restart automata and non-forgetting restart automata. Non-forgetting restart automata may end a cycle in an arbitrary state, and CD systems of restart automata consist of a set of restart automata that process the input together, their cooperation being governed by a mode of operation, similar to CD grammar systems. For both extensions it turns out that the deterministic models are more powerful than deterministic standard restart automata. It is shown that CD systems of restart automata can in many cases be simulated by non-forgetting restart automata and, conversely, non-forgetting restart automata can also be simulated by CD systems of restart automata. Furthermore, restart automata and non-forgetting restart automata are studied that are nondeterministic but make no errors. It turns out that these automata can be simulated by deterministic (non-forgetting) restart automata if they start a new cycle immediately after the rewriting, or if they can move their window to the left and to the right. Moreover, all (non-forgetting) restart automata that may make errors, but detect them after finitely many cycles, can be simulated by (non-forgetting) restart automata that make no errors. Another important result states that deterministic monotone non-forgetting restart automata with auxiliary symbols that end their cycle immediately after the rewriting step recognize exactly the deterministic context-free languages, whereas deterministic monotone non-forgetting restart automata with auxiliary symbols without this restriction recognize strictly more, namely the left-right regular languages. This separates, for the first time, restart automata with auxiliary symbols that end their cycle immediately after the rewriting step from restart automata of the same type without this restriction. It is particularly noteworthy that both types of automata characterize well-known language classes.
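To make the cycle structure concrete, the following sketch (my own illustration, not an automaton from the thesis) mimics a restart-automaton-style recognizer for { a^n b^n }: each cycle scans from the left end, performs one local rewriting, and restarts, and the word is accepted once the tape is empty.

```python
# Informal illustration of restart-automaton cycles for { a^n b^n | n >= 0 }:
# scan from the left, perform one local rewriting (delete "ab" at the a/b
# boundary), restart from the left end, accept when the tape is empty.
import re

def accepts_anbn(word: str) -> bool:
    tape = word
    while True:
        if tape == "":                        # accepting configuration
            return True
        if not re.fullmatch(r"a+b+", tape):   # the scan detects a malformed tape: reject
            return False
        i = tape.index("ab")                  # window positioned at the a/b boundary
        tape = tape[:i] + tape[i + 2:]        # rewriting step, then restart

print(accepts_anbn("aabb"))   # True
print(accepts_anbn("aab"))    # False
```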
Abstract:
In recent years, progress in the area of mobile telecommunications has changed our way of life, in the private as well as the business domain. Mobile and wireless networks have ever-increasing bit rates, mobile network operators provide more and more services, and at the same time costs for the usage of mobile services and bit rates are decreasing. However, mobile services today still lack functions that seamlessly integrate into users' everyday life. That is, service attributes such as context-awareness and personalisation are often either proprietary, limited or not available at all. In order to overcome this deficiency, telecommunications companies are heavily engaged in the research and development of service platforms for networks beyond 3G for the provisioning of innovative mobile services. These service platforms are to support such service attributes and to provide basic service-independent functions such as billing, identity management, context management, user profile management, etc. Instead of developing their own solutions, developers of end-user services such as innovative messaging services or location-based services can utilise the platform-side functions for their own purposes. In doing so, the platform-side support for such functions takes complexity, development time and development costs away from service developers. Context-awareness and personalisation are two of the most important aspects of service platforms in telecommunications environments; their combination can also be described as situation-dependent personalisation of services. The support for this feature requires several processing steps. The focus of this doctoral thesis is on the processing step in which the user's current context is matched against situation-dependent user preferences in order to find the preferences that apply to the user's current situation. To achieve this, however, a user profile management system and corresponding functionality are required; these parts are also covered by this thesis. Altogether, this thesis provides the following contributions. The first part of the contribution is mainly architecture-oriented. First and foremost, we provide a user profile management system that addresses the specific requirements of service platforms in telecommunications environments. In particular, the user profile management system has to deal with situation-specific user preferences and with user information for various services. In order to structure the user information, we also propose a user profile structure and the corresponding user profile ontology as part of an ontology infrastructure in a service platform. The second part of the contribution is the selection mechanism for finding matching situation-dependent user preferences for the personalisation of services. This functionality is provided as a sub-module of the user profile management system. Contrary to existing solutions, our selection mechanism is based on ontology reasoning. This mechanism is evaluated in terms of runtime performance and supported functionality compared to other approaches. The results of the evaluation show the benefits and the drawbacks of ontology modelling and ontology reasoning in practical applications.
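A drastically simplified sketch of this selection step is given below. The thesis performs the matching by ontology reasoning, whereas the sketch uses plain attribute matching purely for illustration, and all context keys and preference values are assumptions.

```python
# Simplified illustration of matching the current context against
# situation-dependent preference entries (plain attribute matching, not the
# ontology reasoning used in the thesis; keys and values are invented).

current_context = {"location": "office", "time_of_day": "morning", "device": "phone"}

situation_preferences = [
    {"condition": {"location": "office"}, "preference": {"ringtone": "silent"}},
    {"condition": {"location": "home", "time_of_day": "evening"}, "preference": {"ringtone": "loud"}},
    {"condition": {}, "preference": {"ringtone": "default"}},   # fallback, matches anything
]

def select_preferences(context: dict, entries: list[dict]) -> dict:
    """Return the preference of the most specific entry whose condition is satisfied."""
    matching = [e for e in entries
                if all(context.get(k) == v for k, v in e["condition"].items())]
    best = max(matching, key=lambda e: len(e["condition"]))      # most specific wins
    return best["preference"]

print(select_preferences(current_context, situation_preferences))  # {'ringtone': 'silent'}
```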
Abstract:
Wavelength division multiplexing (WDM) networks have been adopted as a near-future solution for the broadband Internet. In previous work we proposed a new architecture, named enhanced grooming (G+), that extends the capabilities of traditional optical routes (lightpaths). In this paper, we compare the operational expenditures incurred by routing a set of demands using lightpaths with those incurred using lighttours. The comparison is done by solving an integer linear programming (ILP) problem based on a path formulation. Results show that, under the assumption of single-hop routing, almost 15% of the operational cost can be saved with our architecture. In multi-hop routing the operational cost is reduced by 7.1%, and at the same time the ratio of operational cost to number of optical-electro-optical conversions is reduced for our architecture. This means that ISPs could provide the same satisfaction in terms of delay to the end-user with a lower investment in the network architecture.
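To make the path-formulation idea concrete, here is a toy ILP sketch using the PuLP library; the demands, candidate paths and costs are invented for illustration, and a realistic formulation (including the paper's) would add wavelength and capacity constraints omitted here.

```python
# Toy path-formulation ILP sketch (illustrative only, not the paper's model).
from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpBinary, PULP_CBC_CMD

# candidate paths per demand: (identifier, operational cost) -- made-up numbers
paths = {
    "d1": [("d1_p1", 5), ("d1_p2", 7)],
    "d2": [("d2_p1", 4), ("d2_p2", 6)],
}

prob = LpProblem("path_formulation", LpMinimize)
x = {pid: LpVariable(pid, cat=LpBinary) for d in paths for pid, _ in paths[d]}

# objective: total operational cost of the selected paths
prob += lpSum(cost * x[pid] for d in paths for pid, cost in paths[d])

# each demand must be routed over exactly one of its candidate paths
for d, plist in paths.items():
    prob += lpSum(x[pid] for pid, _ in plist) == 1

prob.solve(PULP_CBC_CMD(msg=False))
print({pid: var.value() for pid, var in x.items()})   # d1_p1 and d2_p1 selected
```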
Abstract:
The end of the Cold War meant not only the triumph of capitalism and liberal democracy, but also a significant change in the international system, which became less centralized and more regionalized as a consequence of the proximity of, and interdependence among, its actors (not only States), allowing the formation of Regional Security Complexes (RSCs). RSCs are an effective way of relating to and approaching the international arena, since through their processes of securitization and desecuritization they manage to achieve specific objectives. On this basis, both the European Union (EU) and the Southern African Development Community (SADC) initiated several securitization processes related to regional integration; one example is the removal of controls at their internal borders, or the free movement of persons, since they considered that, if it did not become a reality, it would generate political threats (their influence and capacity to act were threatened), economic threats (in terms of their competitiveness and basic levels of welfare) and societal threats (in terms of the identity of the community as indispensable for integration) that would put the very existence of their RSCs at risk. Accordingly, the EU created the Schengen Area, the product of a securitization process running from the early 1980s to the mid-1990s, while the SADC has been immersed in such a securitization process from 1992 to the present and awaits the ratification of the Protocol on the Facilitation of Movement of Persons as a first step towards removing controls at its internal borders. Although both the EU and the SADC considered that failing to allow the free movement of persons would put their integration, and therefore their RSCs, at risk, the SADC has not achieved it. This makes it essential to analyse their securitization processes in greater depth in order to identify their shortcomings in comparison with the EU's success. The analysis is based on Barry Buzan's Security Complex Theory, as set out in Security: A New Framework for Analysis (1998) by Barry Buzan, Ole Waever and Jaap de Wilde, and is divided into each of the stages of the securitization process: the identification of an existential threat to a referent object through a speech act, the acceptance of the threat by a relevant audience, and the emergency actions taken to confront the existential threats; recognizing the differences and similarities between a successful securitization process and one that has not yet been successful.
Abstract:
BCI systems require correct classification of signals interpreted from the brain for useful operation. To this end, this paper investigates a method proposed in [1] to correctly classify a series of images presented to a group of subjects in [2]. We show that it is possible to use the proposed methods to correctly recognise the original stimuli presented to a subject from analysis of their EEG. Additionally, we use a verification set to show that the trained classification method can be applied to a different set of data. We go on to investigate the issue of invariance in EEG signals, that is, whether the brain representation of similar stimuli is recognisable across different subjects. Finally, we consider the usefulness of the investigated methods for an improved BCI system and discuss how they could potentially lead to great improvements in ease of use for the end user by offering an alternative, more intuitive control-based mode of operation.
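As a rough illustration of the train/verification workflow mentioned above (not the classification method of [1]; the synthetic features, classifier choice and split are assumptions), a sketch might look like this:

```python
# Illustrative train/verify sketch with synthetic data standing in for EEG
# features; classifier and feature shapes are assumptions, not the paper's setup.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n_trials, n_features, n_stimuli = 300, 32, 4

# synthetic "EEG features": each stimulus class gets a slightly shifted distribution
labels = rng.integers(0, n_stimuli, size=n_trials)
features = rng.normal(size=(n_trials, n_features)) + labels[:, None] * 0.5

# hold out a verification set to check that the trained classifier generalises
X_train, X_verify, y_train, y_verify = train_test_split(
    features, labels, test_size=0.3, random_state=0)

clf = LinearDiscriminantAnalysis().fit(X_train, y_train)
print("verification accuracy:", clf.score(X_verify, y_verify))
```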
Abstract:
The Group on Earth Observations System of Systems, GEOSS, is a co-ordinated initiative by many nations to address the needs for earth-system information expressed by the 2002 World Summit on Sustainable Development. We discuss the role of earth-system modelling and data assimilation in transforming earth-system observations into the predictive and status-assessment products required by GEOSS, across many areas of socio-economic interest. First we review recent gains in the predictive skill of operational global earth-system models, on time-scales of days to several seasons. We then discuss recent work to develop from the global predictions a diverse set of end-user applications which can meet GEOSS requirements for information of socio-economic benefit; examples include forecasts of coastal storm surges, floods in large river basins, seasonal crop yield forecasts and seasonal lead-time alerts for malaria epidemics. We note ongoing efforts to extend operational earth-system modelling and assimilation capabilities to atmospheric composition, in support of improved services for air-quality forecasts and for treaty assessment. We next sketch likely GEOSS observational requirements in the coming decades. In concluding, we reflect on the cost of earth observations relative to the modest cost of transforming the observations into information of socio-economic value.
Abstract:
Social networking mediated by web sites is a relatively new phenomenon and, as with all technological innovations, there continues to be a period of both technical and social adjustment, as the services are fitted to people's behaviours and people adjust their practices in the light of the affordances provided by the technology. Social networking benefits strongly from large-scale availability: users gain greater benefit from social networking services when more of their friends are using them. This applies in social terms, but also in eLearning and professional networks. The network effect provides one explanation for the popularity of internet-based social networking sites (SNS), because the number of connections between people which can be maintained by using them is greatly increased in comparison to the networks available before the internet. The ability of users to determine how much they trust information available to them from contacts within their social network is important in almost all modes of use. As sources of information on a range of topics, from academic to shopping advice, the level of trust which a user can put in other nodes is a key aspect of the utility of the system.
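The network-effect argument can be made quantitative with a standard back-of-the-envelope observation (an aside, not part of the original text): among $n$ users the number of possible pairwise connections grows quadratically,

$$\binom{n}{2} = \frac{n(n-1)}{2},$$

so doubling the user base roughly quadruples the number of connections that can, in principle, be maintained.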
Abstract:
The Stochastic Diffusion Search (SDS) was developed as a solution to the best-fit search problem. Thus, as a special case, it is capable of solving the transform-invariant pattern recognition problem. SDS is efficient and, although inherently probabilistic, produces very reliable solutions in widely ranging search conditions. However, to date a systematic formal investigation of its properties has not been carried out. This thesis addresses this problem. The thesis reports results pertaining to the global convergence of SDS as well as characterising its time complexity. However, the main emphasis of the work is on the resource allocation aspect of the Stochastic Diffusion Search operations. The thesis introduces a novel model of the algorithm, generalising an Ehrenfest Urn Model from statistical physics. This approach makes it possible to obtain a thorough characterisation of the response of the algorithm in terms of the parameters describing the search conditions in the case of a unique best-fit pattern in the search space. This model is further generalised in order to account for different search conditions: two solutions in the search space, and search for a unique solution in a noisy search space. An approximate solution in the case of two alternative solutions is also proposed and compared with the predictions of the extended Ehrenfest Urn model. The analysis performed enabled a quantitative characterisation of the Stochastic Diffusion Search in terms of exploration and exploitation of the search space. It appeared that SDS is biased towards the latter mode of operation. This novel perspective on the Stochastic Diffusion Search led to an investigation of extensions of the standard SDS which would strike a different balance between these two modes of search space processing. Thus, two novel algorithms were derived from the standard Stochastic Diffusion Search, 'context-free' and 'context-sensitive' SDS, and their properties were analysed with respect to resource allocation. It appeared that they shared some of the desired features of their predecessor but also possessed some properties not present in the classic SDS. The theory developed in the thesis is illustrated throughout with carefully chosen simulations of a best-fit search for a string pattern, a simple but representative domain enabling careful control of search conditions.
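For readers unfamiliar with SDS, the following minimal sketch (my own illustration; the agent count, iteration count and halting rule are assumptions) shows the test/diffusion cycle of the standard algorithm on the best-fit string-search domain used throughout the thesis.

```python
# Minimal sketch of standard Stochastic Diffusion Search locating the best-fit
# position of a pattern in a text. Illustrative only; parameters are assumptions.
import random
random.seed(1)

text, pattern = "xxoxhellaxxhelloxx", "hello"
n_agents, n_iters = 100, 60
positions = range(len(text) - len(pattern) + 1)

hyps = [random.choice(positions) for _ in range(n_agents)]   # initial hypotheses
active = [False] * n_agents

for _ in range(n_iters):
    # Test phase: each agent checks one randomly chosen component of its hypothesis.
    for i, h in enumerate(hyps):
        j = random.randrange(len(pattern))
        active[i] = (text[h + j] == pattern[j])
    # Diffusion phase: an inactive agent polls a random agent and copies its
    # hypothesis if that agent is active, otherwise picks a fresh random hypothesis.
    for i in range(n_agents):
        if not active[i]:
            k = random.randrange(n_agents)
            hyps[i] = hyps[k] if active[k] else random.choice(positions)

best = max(set(hyps), key=hyps.count)
print("largest cluster at position", best)   # expected: 11, the start of "hello"
```

Agents clustering on the exact match at position 11 never fail their tests, while the partial match "hella" only retains agents intermittently, so resources (agents) concentrate on the best fit, which is the allocation behaviour the thesis analyses.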