Abstract:
Recently, morphometric measurements of the ascending aorta have been performed with ECG-gated multidetector computed tomography (MDCT) to support the development of future novel transcatheter therapies (TCT); nevertheless, the variability of such measurements remains unknown. Thirty patients referred for ECG-gated CT thoracic angiography were evaluated. Continuous reformations of the ascending aorta, perpendicular to the centerline, were obtained automatically with a commercially available computer-aided diagnosis (CAD) tool. Measurements of the maximal diameter were then made with the CAD and manually by two observers, separately. Measurements were repeated one month later. The Bland-Altman method, Spearman coefficients, and a Wilcoxon signed-rank test were used to evaluate the variability, the correlation, and the differences between observers. The interobserver variability for maximal diameter between the two observers was up to 1.2 mm with limits of agreement [-1.5, +0.9] mm, whereas the intraobserver limits were [-1.2, +1.0] mm for the first observer and [-0.8, +0.8] mm for the second observer. The intraobserver CAD variability was 0.8 mm. The correlation between observers and the CAD was good (0.980-0.986); however, significant differences do exist (P<0.001). The maximum variability observed was 1.2 mm and should be considered in reports of measurements of the ascending aorta. The CAD is as reproducible as an experienced reader.
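For readers unfamiliar with the Bland-Altman method used above, the following is a minimal sketch of how a bias and 95% limits of agreement can be computed for paired measurements; the sample readings and the 1.96 multiplier are illustrative assumptions, not the study's actual data or exact protocol.

```python
import numpy as np

def bland_altman_limits(obs1, obs2):
    """Bland-Altman bias and 95% limits of agreement for paired measurements."""
    obs1, obs2 = np.asarray(obs1, dtype=float), np.asarray(obs2, dtype=float)
    diff = obs1 - obs2              # paired differences (e.g., observer 1 - observer 2, in mm)
    bias = diff.mean()              # mean difference (systematic offset)
    sd = diff.std(ddof=1)           # sample standard deviation of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical maximal-diameter readings (mm) from two observers on the same scans
observer1 = [34.1, 36.5, 31.2, 40.3, 33.8]
observer2 = [34.6, 36.0, 31.9, 40.0, 34.2]
bias, (low, high) = bland_altman_limits(observer1, observer2)
print(f"bias = {bias:.2f} mm, limits of agreement = [{low:.2f}, {high:.2f}] mm")
```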
Abstract:
This thesis examines the coordination of the systems development process in a contemporary software-producing organization. The thesis consists of a series of empirical studies in which the actions, conceptions, and artifacts of practitioners are analyzed using a theory-building case study research approach. The three phases of the thesis provide empirical observations on different aspects of systems development. The first phase examines the role of architecture in coordination and cost estimation in a multi-site environment. The second phase involves two studies on the evolving requirement understanding process and how to measure this process. The third phase summarizes the first two phases and concentrates on the role of methods and how practitioners work with them. All the phases provide evidence that current systems development method approaches are too naïve in addressing the complexity of the real world. In practice, development is influenced by opportunity and other contingent factors. The systems development process is not coordinated using the phases and tasks defined in methods as a universal mechanism for managing the process, as most method approaches assume. Instead, the studies suggest that managing the systems development process happens through coordinating development activities, using methods as tools. These studies contribute to systems development methods by emphasizing the support of communication and collaboration between systems development participants. Methods should not describe development activities and phases at a detailed level, but should provide higher-level guidance for practitioners on how to act in different systems development environments.
Abstract:
Modern societies depend more and more on computer systems, and there is therefore increasing pressure on development teams to produce high-quality software. Many companies use quality models, suites of programs that analyse and evaluate the quality of other programs, but building quality models is difficult because several questions remain unanswered in the literature. We studied quality modelling practices in a large company and identified three dimensions where additional research is desirable: support for the subjectivity of quality, techniques for tracking quality as software evolves, and the composition of quality across different levels of abstraction. Concerning subjectivity, we proposed the use of Bayesian models because they are able to handle ambiguous data. We applied our models to the problem of detecting design defects. In a study of two open source systems, we found that our approach outperforms the rule-based techniques described in the state of the art. To support software evolution, we considered the scores produced by a quality model as signals that can be analysed using data mining techniques to identify patterns of quality evolution. We studied how design defects appear in and disappear from software. Software is typically designed as a hierarchy of components, but quality models do not take this organisation into account. In the last part of the dissertation, we present a two-level quality model. These models have three parts: a component-level model, a model that evaluates the importance of each component, and a model that evaluates the quality of a composite by combining the quality of its components. The approach was tested on the prediction of change-prone classes from the quality of their methods. We found that our two-level models allow a better identification of change-prone classes. Finally, we applied our two-level models to the evaluation of the navigability of websites from the quality of their pages. Our models were able to distinguish between very high-quality sites and randomly selected sites. Throughout the dissertation, we present not only theoretical problems and their solutions, but also experiments conducted to demonstrate the advantages and limitations of our solutions. Our results indicate that the state of the art can be improved along the three dimensions presented. In particular, our work on quality composition and importance modelling is the first to target this problem. We believe that our two-level models are an interesting starting point for more in-depth research.
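As a rough illustration of the composition idea described above (a component-level model, an importance model, and a composition model), the sketch below aggregates hypothetical method-level quality scores into a class-level score by importance weighting; the weighting rule and all names and numbers are illustrative assumptions, not the dissertation's actual models.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Component:
    name: str
    quality: float     # component-level quality score in [0, 1]
    importance: float  # estimated relative importance of the component

def composite_quality(components: List[Component]) -> float:
    """Combine component-level scores into a composite-level score,
    weighting each component by its estimated importance."""
    total_weight = sum(c.importance for c in components)
    if total_weight == 0:
        return 0.0
    return sum(c.quality * c.importance for c in components) / total_weight

# Hypothetical methods of a class, scored by a method-level quality model
methods = [
    Component("parse", quality=0.4, importance=0.7),
    Component("render", quality=0.9, importance=0.2),
    Component("toString", quality=0.8, importance=0.1),
]
print(f"class-level quality ~ {composite_quality(methods):.2f}")
```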
Abstract:
Software is in constant evolution, requiring continuous maintenance and development. It undergoes changes throughout its life, whether through the addition of new features or the correction of bugs in the code. As software evolves, its architecture tends to degrade over time and becomes less adaptable to new user requirements; it becomes more complex and harder to maintain. In some cases, developers prefer to redesign the architecture from scratch rather than extend its lifetime, which leads to a significant increase in development and maintenance costs. Consequently, developers must understand the factors that lead to architecture degradation in order to take proactive measures that facilitate future changes and slow down this degradation. Architecture degradation occurs when developers who do not understand the original design of the software make changes to it. On the one hand, making changes without understanding their impact can lead to the introduction of bugs and to the premature retirement of the software. On the other hand, developers who lack the knowledge and/or experience to solve a design problem may introduce design defects. These defects make software harder to maintain and evolve. Developers therefore need mechanisms to understand the impact of a change on the rest of the software, and tools to detect design defects so that they can be corrected. In this thesis, we propose three main contributions. The first contribution concerns the assessment of the degradation of software architectures. This assessment uses a diagram-matching technique, applied to diagrams such as class diagrams, to identify structural changes between several versions of a software architecture. This step requires the identification of class renamings; consequently, the first step of our approach is to identify class renamings during the evolution of the software architecture. The second step matches several versions of an architecture to identify its stable parts and the parts that are degrading. We propose bit-vector and clustering algorithms to analyse the correspondence between several versions of an architecture. The third step measures the degradation of the architecture during the evolution of the software. We propose a set of metrics, computed on the stable parts of the software, to evaluate this degradation. The second contribution relates to change impact analysis. In this context, we present a new metaphor, inspired by seismology, to identify the impact of changes. Our approach treats a change to a class as an earthquake that propagates through the software along a long chain of intermediary classes.
Our approach combines the analysis of structural dependencies between classes with the analysis of their history (co-change relationships) in order to measure the magnitude of change propagation in the software, i.e., how a change propagates from the modified class to other classes of the software. The third contribution concerns the detection of design defects. We propose a metaphor inspired by the natural immune system. Like any living creature, a system design is exposed to diseases, which are design defects; detection approaches are defence mechanisms for system designs. A natural immune system can detect similar pathogens with good precision. This precision inspired a family of classification algorithms, called artificial immune systems (AIS), which we use to detect design defects. The different contributions were evaluated on open source object-oriented systems, and the results obtained allow us to draw the following conclusions:
• The Tunnel Triplets Metric (TTM) and Common Triplets Metric (CTM) provide developers with good indications of architecture degradation. A decrease in TTM indicates that the original design of the architecture has degraded. A stable TTM indicates the stability of the original design, meaning that the system is adapting to new user requirements.
• Seismology is an interesting metaphor for change impact analysis. Indeed, changes propagate through systems like earthquakes. The impact of a change is strongest around the changed class and decreases progressively with the distance to that class. Our approach helps developers identify the impact of a change.
• The immune system is an interesting metaphor for design defect detection. The experimental results showed that the precision and recall of our approach are comparable to or better than those of existing approaches.
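A minimal sketch of the seismology-style change impact idea summarised above, under the assumption that impact attenuates with dependency-graph distance and is weighted by historical co-change strength; the attenuation rule, dependency graph, and co-change values are illustrative, not the thesis's actual algorithm or data.

```python
from collections import deque

def change_impact(dep_graph, cochange, epicenter, attenuation=0.5):
    """Estimate how a change to `epicenter` propagates through a class
    dependency graph, attenuating with graph distance and weighted by
    historical co-change strength (values in [0, 1])."""
    impact = {epicenter: 1.0}
    queue = deque([(epicenter, 0)])
    while queue:
        cls, dist = queue.popleft()
        for neighbour in dep_graph.get(cls, ()):
            if neighbour in impact:
                continue
            coupling = cochange.get(frozenset((cls, neighbour)), 0.0)
            impact[neighbour] = (attenuation ** (dist + 1)) * coupling
            queue.append((neighbour, dist + 1))
    return impact

# Hypothetical dependency graph and co-change strengths
deps = {"Order": ["Invoice", "Customer"], "Invoice": ["Tax"], "Customer": [], "Tax": []}
history = {frozenset(("Order", "Invoice")): 0.8,
           frozenset(("Order", "Customer")): 0.3,
           frozenset(("Invoice", "Tax")): 0.6}
print(change_impact(deps, history, "Order"))
```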
Abstract:
Software is in constant evolution, requiring continuous maintenance and development. It undergoes changes throughout its life, whether through the addition of new features or the correction of bugs. As software evolves, its architecture tends to degrade and becomes less adaptable to new user requirements. Indeed, the architecture becomes more complex and harder to maintain because of the many dependencies between artifacts. Consequently, developers must understand the dependencies between software artifacts in order to take proactive measures that facilitate future changes and slow down the degradation of software architectures. On the one hand, maintaining software without understanding the dependencies between its artifacts can lead to the introduction of faults. On the other hand, when developers lack knowledge about the impact of their maintenance activities, they can introduce design defects, which have a negative impact on the evolution of the software. Thus, developers need mechanisms to understand how a change to one artifact impacts the rest of the software. In this thesis, we propose three main contributions: the specification of two new change patterns and their use to provide developers with useful information about co-change dependencies; the specification of the relationship between artifact evolution patterns and faults; and the discovery of the relationship between anti-pattern dependencies and the fault-proneness of the different components of a software system.
Abstract:
Software systems have become increasingly widespread and important in our society, so there is a constant need for high-quality software. One of the most widely used techniques to improve software quality is refactoring, which improves the structure of a program while preserving its external behaviour. If applied properly, refactoring promises to improve the understandability, maintainability, and extensibility of the software while improving programmer productivity. In general, refactoring can be applied at the specification, design, or code level. This thesis focuses on automating the refactoring recommendation process at the code level, in two main steps: 1) detecting the code fragments that should be improved (e.g., design defects), and 2) identifying the refactoring solutions to apply. For the first step, we translate regularities that can be found in examples of design defects into detection rules; we use a genetic algorithm to automatically generate these rules from defect examples. For the second step, we introduce an approach based on heuristic search. The process consists of finding the optimal sequence of refactoring operations that improves software quality by minimising the number of defects while prioritising the most critical instances. In addition, we explore other objectives to optimise: the number of changes required to apply the refactoring solution, semantics preservation, and consistency with the change history. Reducing the number of changes keeps the solution as close as possible to the initial design. Semantics preservation ensures that the restructured program remains semantically coherent. Furthermore, we use the change history to suggest new refactorings in similar contexts. Finally, we introduce a multi-objective approach to improve software quality attributes (flexibility, maintainability, etc.) and to fix "bad" design practices (design defects) while introducing "good" design practices (design patterns).
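As a hedged illustration of the first step (generating detection rules from defect examples with a genetic algorithm), the sketch below evolves thresholds for a simple rule of the form "LOC > t1 and methods > t2" against a handful of hypothetical labelled examples; the rule shape, metrics, fitness function, and GA parameters are assumptions made for illustration, not the thesis's actual encoding.

```python
import random

# Hypothetical labelled examples: (lines_of_code, number_of_methods, is_defect)
EXAMPLES = [(950, 42, True), (120, 8, False), (700, 35, True),
            (300, 12, False), (1100, 50, True), (80, 5, False)]

def fitness(rule):
    """F1-score of the rule 'loc > t_loc and methods > t_m' on the examples."""
    t_loc, t_m = rule
    tp = fp = fn = 0
    for loc, methods, is_defect in EXAMPLES:
        flagged = loc > t_loc and methods > t_m
        tp += flagged and is_defect
        fp += flagged and not is_defect
        fn += (not flagged) and is_defect
    return 2 * tp / (2 * tp + fp + fn) if tp else 0.0

def evolve(pop_size=20, generations=50):
    pop = [(random.uniform(0, 1500), random.uniform(0, 60)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                    # selection: keep the fittest half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = (random.choice((a[0], b[0])),          # crossover of the two thresholds
                     random.choice((a[1], b[1])))
            child = (child[0] + random.gauss(0, 20),        # small Gaussian mutation
                     child[1] + random.gauss(0, 2))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(f"best rule: LOC > {best[0]:.0f} and methods > {best[1]:.0f} (F1 = {fitness(best):.2f})")
```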
Abstract:
The present study focuses on the stability of the coast, the exploitation of coastal resources, human activities within the study area that extends from Fort Cochin in the north to Thottappally in the south in central Kerala State and its hinterlands, the socio-economic problems of the coastal community, and the environmental issues that have arisen in the recent past due to human activities. The objective of the study is to critically analyse the coastal zone and the prevailing situation and to propose a comprehensive management plan for the sustainable development of the region under study. The thesis covers varied aspects of coastal uses such as fisheries, tourism, land use, and water resources. To critically examine the above scenarios, the ILWIS (Integrated Land and Water Information Systems) GIS software has been used. A satellite image of the area has been used for coastline change detection and land use patterns. The outcome of the present study will be beneficial to the various stakeholders within the coastal region and its hinterlands. Furthermore, this study should find applications in similar or near-similar situations in Southeast Asia where identical scenarios are noticeable.
Abstract:
The goal of this work was to develop a query processing system using software agents. The Open Agent Architecture framework is used for system development. The system supports queries in both Hindi and Malayalam, two prominent regional languages of India. Natural language processing techniques are used to extract the meaning of the plain query, and information from the database is returned to the user in his or her native language. The system architecture is designed in a structured way so that it can be adapted to other regional languages of India. The system can be used effectively in application areas such as e-governance, agriculture, rural health, education, national resource planning, disaster management, and information kiosks, where people from all walks of life are involved.
Abstract:
The thesis is the outcome of experimental and theoretical investigations on novel feeding techniques for bandwidth enhancement of microstrip patches. The new feeding techniques provide bandwidth enhancement without deteriorating the radiation characteristics of the antenna. The antenna is analysed using the Finite Difference Time Domain (FDTD) method. The predicted results are compared with the experimental results and excellent agreement is observed. The results are also verified using IE3D simulation software. The antenna is suitable for personal and broadband communications.
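For readers unfamiliar with the FDTD method named above, here is a minimal one-dimensional Yee-scheme sketch in normalised units; it only illustrates the alternating E/H field updates and is not the thesis's (three-dimensional) antenna model.

```python
import numpy as np

# Minimal 1-D FDTD (Yee scheme) in free space with normalised units:
# E and H are updated alternately; a Gaussian pulse is injected as a soft source.
nx, nt = 200, 300          # spatial cells and time steps
ez = np.zeros(nx)          # electric field
hy = np.zeros(nx)          # magnetic field
for t in range(nt):
    # update H from the spatial derivative of E (Courant number 0.5)
    hy[:-1] += 0.5 * (ez[1:] - ez[:-1])
    # update E from the spatial derivative of H
    ez[1:] += 0.5 * (hy[1:] - hy[:-1])
    # soft Gaussian source at the centre of the grid
    ez[nx // 2] += np.exp(-((t - 30) ** 2) / 100.0)
print(f"peak |Ez| after {nt} steps: {np.abs(ez).max():.3f}")
```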
Abstract:
Ship recycling has been considered the best means to dispose of an obsolete ship. The current state of the art of technology, combined with the demands for sustainable development from the global maritime industrial sector, has transformed the erstwhile 'ship breaking' scrap business into a modern industry that undertakes the dismantling of ships and the recycling/reuse of the dismantled products in a supply chain of the pre-owned product market, following the principles of recycling. Industries will have to formulate a set of best practices and blend them with their engineering activities to produce better quality products, improve productivity, and achieve improved performance related to sustainable development. Improved performance by industries from a sustainable development perspective is accomplished only by implementing the 4E principles, i.e., eco-friendliness, engineering efficiency, energy conservation, and ergonomics, in their core operations. The present study has carried out a comprehensive investigation into various ship recycling operations in order to formulate a set of best practices.

Being the ultimate life cycle stage of a ship, ship recycling activities incorporate certain commercial procedures well in advance to facilitate the objectives of dismantling and recycling/reusing various parts of the vessel. Thorough knowledge of these background procedures is essential for examining and understanding the industrial business operations associated with ship recycling. As a first step, the practices followed in merchant shipping operations regarding the decision on decommissioning have been examined and made available in the thesis. A brief description of the positioning methods and important preparations for the most feasible ship recycling method, i.e., the beach method, has been provided as part of the outline of the background information. Available sources of guidelines, codes, and rules and regulations for ship recycling have been compiled and included in the discussion.

A very brief summary of practices in major ship recycling destinations has been prepared and listed to provide an overview of global ship recycling activities. The present status of ship recycling, treated as a full-fledged engineering industry, has been brought out to establish the need for looking into the development of best practices. The major engineering attributes of the ship as a unique engineering product and the factors significantly influencing her life cycle stage operations have been studied and added to the information base on ship recycling. The role of the ship recycling industry as an important player in global sustainable development efforts has been reviewed by analysing the benefits of ship recycling. A brief synopsis of the state of the art of ship recycling in major international ship recycling centres has also been incorporated into the backdrop knowledge base on ship recycling processes.

Publications available in this field have been reviewed and classified into five subject categories, viz., infrastructure for recycling yards and methods of dismantling; rules regarding ship recycling activities; environmental and safety aspects of ship recycling; the role of naval architects and ship classification societies; and the application of information technology and demand forecasting. The inferences from the literature survey have been summarised and recorded.
Noticeable observations among the inferences include the need to create a comprehensive knowledge base on ship recycling and implement it effectively in the industry, and the insignificant involvement of naval architects and shipbuilding engineers in the ship recycling industry. These two important inferences, and the message conveyed by them, have been addressed with due importance in the subsequent part of the present study.

As part of the study, the importance of demand forecasting in ship recycling has been introduced and presented. A sample input of ship recycling data for the implementation of computer-based methods of demand forecasting has been presented in this section of the thesis. The interdisciplinary nature of the engineering processes involved in ship recycling has been identified as one of the important features of this industry. The present study has identified more than a dozen major stakeholders in ship recycling, each with their own interests and roles. It has also been observed that most ship recycling activities are carried out in South East Asian countries, where beach-based ship recycling is done in yards without proper infrastructure support. A model of beach-based ship recycling has been developed, and the roles, responsibilities, and mutual interactions of the elements of the system have been documented as part of the study.

Subsequently, the need for a broad knowledge base on ship recycling activities, as pointed out by the literature survey, has been addressed. The information base and sources of expertise required to build a broad knowledge base on ship recycling operations have been identified and tabulated. Eleven important ship recycling processes have been identified, and the steps involved in these processes have been examined and addressed in detail. Based on these findings, a detailed sequential disassembly process plan for ship recycling has been prepared and charted.

Having established the need for best practices in ship recycling, the present study identifies the development of a user-friendly expert system for the ship recycling process as one of the constituents of the proposed best practices. A user-friendly expert system has been developed for beach-based ship recycling processes and is named the Ship Recycling Recommender (SRR). Two important functions of SRR, one for the 'Administrators', the stakeholders at the helm of ship recycling affairs, and one for the 'Users', the stakeholders who execute the actual dismantling, have been presented by highlighting the steps involved in the execution of the software. The important outputs generated, i.e., recommended practices for ship dismantling processes and safe handling information on materials present onboard, have been presented with the help of ship recycling reports generated by the expert system. A brief account of the necessity of having a ship recycling work content estimation as part of the best practices has been presented in the study; this is supported by a detailed work estimation schedule included as one of the appendices.

As mentioned earlier, a definite lack of involvement of naval architects has been observed in the development of methodologies for improving the status of the ship recycling industry. The present study has put forward a holistic approach that reviews the status of ship recycling not simply as an end-of-life activity for all 'time expired' vessels, but as a focal point for integrating all life cycle activities.
A new engineering design philosophy targeting the sustainable development of the marine industrial domain, named design for ship recycling, has been identified, formulated, and presented. A new model of the ship life cycle has been proposed by adding a few stages to the traditional life cycle after analysing their critical role in accomplishing a clean and safe end of life and partial dismantling of ships. Two applications of design for ship recycling, viz., the recyclability of ships and their products and the allotment of a Green Safety Index for ships, have been presented as part of the implementation of the philosophy in actual practice.
Abstract:
The shift from print to digital information has a high impact on all components of the academic library system in India, especially the users, services, and staff. Though information is considered an important resource, the use of ICT tools to collect and disseminate information has proceeded at a slow pace in the majority of university libraries. This may be due to various factors such as insufficient funds, inadequate staff trained in handling computers and software packages, and administrative concerns. In Kerala, automation has been initiated in almost all university libraries using library automation software and is at different stages of completion. Not many studies have been conducted on the effects of information communication technologies on the professional activities of library professionals in the universities of Kerala. It is important to evaluate whether progress in ICT has had any impact on the library profession in these highest educational institutions. The aim of the study is to assess whether developments in information communication technologies have any influence on library professionals' professional development and their need for further education and training, and to evaluate their skills in handling developments in ICT. The total population of the study is 252, comprising the permanently employed professional library staff in the central libraries and departmental libraries on the main campuses of the universities under study; this is almost a census study of the defined population. The questionnaire method was adopted for the collection of data, supplemented by interviews with librarians to gather additional information. Library professionals have a positive approach towards ICT applications and services in libraries, but the majority do not have the opportunity to develop their skills and competencies in their work environment. To develop competitive personnel in a technologically advanced world, high priority must be given by university administrators and library associations to developing competence in ICT applications, library management, and soft skills among library professionals. Library science schools and teaching departments across the country have to take significant steps to revise the library science curriculum and incorporate significant changes to meet the demands and challenges of the library science profession.
Abstract:
The purpose of this paper is to describe the design and development of a digital library at Cochin University of Science and Technology (CUSAT), India, using the DSpace open source software. The study covers the structure, contents, and usage of the CUSAT digital library. Design/methodology/approach – This paper examines the possibilities of applying open source software in libraries. An evaluative approach is carried out to explore the features of the CUSAT digital library. The Google Analytics service is employed to measure the use of the digital library by users across the world. Findings – CUSAT has successfully applied the DSpace open source software for building a digital library. The digital library has had visits from 78 countries, with the major share from India. The distribution of documents in the digital library is uneven: past exam question papers constitute the major part of the collection, while the number of research papers, articles, and rare documents is comparatively small. Originality/value – The study is the first of its type that tries to understand digital library design and development using the DSpace open source software in a university environment, with a focus on the analysis of the distribution of items and on measuring value through usage statistics from the Google Analytics service. The digital library model can be useful for designing similar systems.
Abstract:
The Central Library of Cochin University of Science and Technology (CUSAT) had been automated with proprietary software (Adlib Library) since 2000. After 11 years, in 2011, the university authorities decided to shift to Koha, an open source integrated library management system (ILMS), for automating the library housekeeping operations. In this context, this study attempts to share the experiences in cataloging with both types of software. The features of the cataloging modules of both software packages are analysed on the basis of certain checkpoints. It is found that the cataloging module of Koha is almost on par with that of the proven proprietary software that has been in the market for the past 25 years. Some suggestions made by this study may be incorporated for the further development and perfection of Koha.
Abstract:
The basic concepts of digital signal processing are taught to students in engineering and science, with the focus of the course on linear, time-invariant systems. The question of what happens when the system is governed by a quadratic or cubic equation remains unanswered in the vast majority of the literature on signal processing. Light was shed on this problem when John V Mathews and Giovanni L Sicuranza published the book Polynomial Signal Processing. This book opened up an unseen vista of polynomial systems for signal and image processing. It presented the theory and implementation of both adaptive and non-adaptive FIR and IIR quadratic systems, which offer improved performance over conventional linear systems. The theory of quadratic systems is a largely unexplored area of research that involves computationally intensive work. Once the area of research was selected, the next issue was the choice of software tool to carry out the work. Conventional languages like C and C++ were eliminated early, as they are not interpreted and lack good-quality plotting libraries. MATLAB proved to be very slow, as did SCILAB and Octave. The search for a language for scientific computing that was as fast as C, but with a good-quality plotting library, ended in Python, a distant relative of LISP, which proved to be ideal for scientific computing. An account of the use of Python, its scientific computing package scipy, and the plotting library pylab is given in the appendix. Initially, the work focused on designing predictors that exploit the polynomial nonlinearities inherent in speech generation mechanisms. Soon the work moved into medical image processing, which offered more potential for the use of quadratic methods. The major focus in this area is on quadratic edge detection methods for retinal images and fingerprints, as well as de-noising raw MRI signals.
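To make the notion of a quadratic FIR system concrete, the following is a minimal sketch of a second-order (quadratic) Volterra filter, y[n] = Σ_i h1[i] x[n-i] + Σ_{i,j} h2[i,j] x[n-i] x[n-j]; the kernels and input signal are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

def quadratic_fir(x, h1, h2):
    """Output of a second-order (quadratic) Volterra FIR system:
    y[n] = sum_i h1[i] x[n-i] + sum_{i,j} h2[i,j] x[n-i] x[n-j]."""
    x = np.asarray(x, dtype=float)
    M = len(h1)
    y = np.zeros_like(x)
    for n in range(len(x)):
        # vector of the M most recent samples, zero-padded at the start
        window = np.array([x[n - k] if n - k >= 0 else 0.0 for k in range(M)])
        y[n] = h1 @ window + window @ h2 @ window
    return y

# Hypothetical kernels: a short linear part and a small quadratic part
h1 = np.array([0.5, 0.3, 0.1])
h2 = 0.05 * np.eye(3)
x = np.sin(2 * np.pi * 0.05 * np.arange(64))
y = quadratic_fir(x, h1, h2)
print(y[:5])
```

Setting h2 to zero reduces the system to an ordinary linear FIR filter, which is the baseline against which quadratic methods are compared.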
Abstract:
Self-adaptive software provides a profound solution for adapting applications to changing contexts in dynamic and heterogeneous environments. Having emerged from Autonomic Computing, it incorporates fully autonomous decision making based on predefined structural and behavioural models. The most common approach to architectural runtime adaptation is the MAPE-K adaptation loop, which implements an external adaptation manager without manual user control. However, it has turned out that adaptation behaviour lacks acceptance if it does not correspond to a user's expectations, particularly in Ubiquitous Computing scenarios with user interaction. Adaptations can be irritating and distracting if they are not appropriate for a certain situation. In general, uncertainty during development and at run-time causes problems when users are outside the adaptation loop. In a literature study, we analyse publications on self-adaptive software research. The results show a discrepancy between the motivated application domains, the maturity of examples, and the quality of evaluations on the one hand and the provided solutions on the other hand. Only a few publications analysed the impact of their work on the user, but many employ user-oriented examples for motivation and demonstration. To incorporate the user within the adaptation loop and to deal with uncertainty, our proposed solutions enable user participation for interactive self-adaptive software while at the same time maintaining the benefits of intelligent autonomous behaviour. We define three dimensions of user participation, namely temporal, behavioural, and structural user participation. This dissertation contributes solutions for user participation in the temporal and behavioural dimensions. The temporal dimension addresses the moment of adaptation, which is classically determined by the self-adaptive system. We provide mechanisms allowing users to influence or to define the moment of adaptation. With our solution, users can have full control over the moment of adaptation, or the self-adaptive software considers the user's situation more appropriately. The behavioural dimension addresses the actual adaptation logic and the resulting run-time behaviour. Application behaviour is established during development and does not necessarily match run-time expectations. Our contributions are three distinct solutions which allow users to make changes to the application's run-time behaviour: dynamic utility functions, fuzzy-based reasoning, and learning-based reasoning. The foundation of our work is a notification and feedback solution that improves the intelligibility and controllability of self-adaptive applications by implementing bi-directional communication between the self-adaptive software and the user. The different mechanisms from the temporal and behavioural participation dimensions require the notification and feedback solution to inform users about adaptation actions and to provide a mechanism to influence adaptations. Case studies show the feasibility of the developed solutions. Moreover, an extensive user study with 62 participants was conducted to evaluate the impact of notifications before and after adaptations. Although the study revealed no preference for a particular notification design, participants clearly appreciated intelligibility and controllability over autonomous adaptations.
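A minimal sketch of a MAPE-K style loop in which the user stays inside the loop by confirming or vetoing a planned adaptation, in the spirit of the user participation described above; the context variable, adaptation rule, and confirmation callback are illustrative assumptions, not the dissertation's actual framework.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Knowledge:
    """Shared knowledge of the MAPE-K loop (monitored context, planned action)."""
    brightness: float = 0.5          # illustrative monitored context value
    plan: Optional[str] = None

def monitor(sensors, k: Knowledge):
    k.brightness = sensors["ambient_light"]

def analyze(k: Knowledge) -> bool:
    return k.brightness < 0.3        # adaptation needed when it gets dark

def plan(k: Knowledge):
    k.plan = "switch UI to dark theme"

def execute(k: Knowledge, ask_user):
    # user participation: the adaptation is only applied if the user confirms,
    # instead of being executed fully autonomously by the adaptation manager
    if ask_user(f"Apply adaptation '{k.plan}'?"):
        print(f"executing: {k.plan}")
    else:
        print("adaptation vetoed by user")

def mape_k_iteration(sensors, k: Knowledge, ask_user):
    monitor(sensors, k)
    if analyze(k):
        plan(k)
        execute(k, ask_user)

knowledge = Knowledge()
mape_k_iteration({"ambient_light": 0.2}, knowledge, ask_user=lambda prompt: True)
```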