10 results for Model driven architecture (MDA) initiative

at Universitätsbibliothek Kassel, Universität Kassel, Germany


Relevance:

100.00%

Publisher:

Abstract:

Wireless sensor networks (WSNs) differ from conventional distributed systems in many aspects. The resource limitation of sensor nodes, the ad-hoc communication and topology of the network, coupled with an unpredictable deployment environment, are difficult non-functional constraints that must be carefully taken into account when developing software systems for a WSN. Thus, more research needs to be done on designing, implementing and maintaining software for WSNs. This thesis aims to contribute to research in this area by presenting an approach to WSN application development that improves the reusability, flexibility, and maintainability of the software. Firstly, we present a programming model and software architecture aimed at describing WSN applications independently of the underlying operating system and hardware. The proposed architecture is described and realized using the Model-Driven Architecture (MDA) standard in order to achieve satisfactory levels of encapsulation and abstraction when programming sensor nodes. In addition, we study different non-functional constraints of WSN applications and propose two approaches to optimize the application so that it satisfies these constraints. A real prototype framework was built to demonstrate the solutions developed in the thesis. The framework implements the programming model and the multi-layered software architecture as components. A graphical interface, code generation components and supporting tools were also included to help developers design, implement, optimize, and test the WSN software. Finally, we evaluate and critically assess the proposed concepts. Two case studies are provided to support the evaluation. The first case study, a framework evaluation, is designed to assess the ease with which novice and intermediate users can develop correct and power-efficient WSN applications, the portability level achieved by developing applications at a high level of abstraction, and the estimated overhead caused by using the framework in terms of the footprint and executable code size of the application. In the second case study, we discuss the design, implementation and optimization of a real-world application named TempSense, in which a sensor network is used to monitor the temperature within an area.
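
To make the MDA idea in this abstract concrete, the following is a minimal Python sketch, not taken from the thesis, of mapping one platform-independent description of a periodic sensing task onto platform-specific code stubs; the names (SensingTask, TEMPLATES, generate) and the target templates are hypothetical.

# Hypothetical sketch, not the thesis framework: a platform-independent model of a
# periodic sensing task (PIM level) and a toy generator that emits platform-specific stubs.
from dataclasses import dataclass

@dataclass
class SensingTask:                 # platform-independent description of the task
    sensor: str                    # e.g. "temperature"
    period_ms: int                 # sampling period in milliseconds
    sink_address: int              # node id that receives the readings

TEMPLATES = {                      # invented platform-specific code templates
    "contiki": "etimer_set(&t, {period_ms}); /* read {sensor}, unicast to node {sink_address} */",
    "tinyos":  "call Timer.startPeriodic({period_ms}); // read {sensor}, send to node {sink_address}",
}

def generate(task: SensingTask, platform: str) -> str:
    """Map the platform-independent task description onto one target platform."""
    return TEMPLATES[platform].format(**task.__dict__)

if __name__ == "__main__":
    task = SensingTask(sensor="temperature", period_ms=5000, sink_address=1)
    for platform in TEMPLATES:
        print(platform, "->", generate(task, platform))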

Relevance:

100.00%

Publisher:

Abstract:

Genetic programming is known to provide good solutions for many problems, such as the evolution of network protocols and distributed algorithms. In such cases it is usually a hardwired module of a design framework that assists the engineer in optimizing specific aspects of the system to be developed, and it provides its results in a fixed format through an internal interface. In this paper we show how the utility of genetic programming can be increased remarkably by isolating it as a component and integrating it into the model-driven software development process. Our genetic programming framework produces XMI-encoded UML models that can easily be loaded into widely available modeling tools, which in turn possess code generation as well as additional analysis and test capabilities. We use the evolution of a distributed election algorithm as an example to illustrate how genetic programming can be combined with model-driven development. This example clearly illustrates the advantages of our approach: the generation of source code in different programming languages.
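
As a hedged illustration of the idea described above (evolving a candidate solution and exporting it as an XMI-encoded model rather than through an internal interface), here is a small self-contained Python sketch; the operation set, fitness function and XMI structure are invented for illustration and are not the paper's framework.

# Minimal sketch under assumptions: a genetic search whose best individual is
# exported as a tiny XMI-like UML activity, so that external modeling tools
# could pick it up instead of a framework-internal result format.
import random
import xml.etree.ElementTree as ET

OPS = ["send(id)", "recv(id)", "compare(id,max)", "forward(max)"]

def fitness(program):   # toy objective: prefer short programs ending in forward(max)
    return (program[-1] == "forward(max)") + 1.0 / len(program)

def evolve(generations=50, pop_size=20):
    pop = [[random.choice(OPS) for _ in range(random.randint(2, 6))] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        pop = parents + [random.choice(parents)[:] for _ in range(pop_size - len(parents))]
        for child in pop[len(parents):]:               # point mutation on the offspring
            child[random.randrange(len(child))] = random.choice(OPS)
    return max(pop, key=fitness)

def to_xmi(program):
    """Serialize the evolved individual as a minimal XMI-like XML document."""
    root = ET.Element("xmi:XMI", {"xmlns:xmi": "http://www.omg.org/XMI"})
    activity = ET.SubElement(root, "uml:Activity",
                             {"xmlns:uml": "http://www.omg.org/UML", "name": "ElectionCandidate"})
    for i, op in enumerate(program):
        ET.SubElement(activity, "node", {"xmi:id": f"n{i}", "name": op})
    return ET.tostring(root, encoding="unicode")

print(to_xmi(evolve()))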

Relevance:

100.00%

Publisher:

Abstract:

To reduce costs, many companies outsource services that do not belong to their core competences to external service providers; this process is known as outsourcing. The resulting dependencies on the external service providers are contractually regulated by means of Service Level Agreements (SLAs). The task of Service Level Management (SLM) is to monitor and ensure compliance with the contractually fixed quality-of-service parameters. Automated processing therefore requires a formal specification of SLAs. Since the market has produced a large number of different SLM tools, proprietary SLA formats and missing specification methods cause problems in practice. This results in tool dependence and limited reusability of SLAs that have already been specified. In this thesis, an approach for platform-independent service level management is developed. The goal is to unify the modeling so that different management approaches can be integrated and a separation between the problem domain and the technology domain is achieved. In addition, platform independence gives the created models a high temporal stability. Further goals of the thesis are to guarantee the reusability of modeled SLAs and to provide a process-oriented modeling methodology. The automated establishment of modeled SLAs is of decisive relevance for practical use. To reach these goals, the principles of the Model Driven Architecture (MDA) are applied to the problem domain of service level management. The central idea of the thesis is the definition of SLA patterns, which are configuration-independent abstractions of Service Level Agreements. These SLA patterns correspond to the Platform Independent Model (PIM) of the MDA. A suitable model transformation generates from an SLA pattern an SLA instance that contains all necessary configuration information and is already available in the format of the target platform. An SLA instance thus corresponds to the Platform Specific Model (PSM) of the MDA. The establishment of the SLA instances and the resulting configuration of the management system correspond to the Platform Specific Code (PSC) of the MDA. After this step, the management system is able to monitor the quality-of-service parameters agreed in the SLA on its own. As part of this thesis, a UML extension was defined that allows SLA patterns to be modeled with a UML tool. The modeling can be done purely graphically or with the additional use of the Object Constraint Language (OCL). For the practical realization of the approach, a management architecture was developed and implemented as a prototype. The overall approach was evaluated by means of a case study.
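
To illustrate the PIM-to-PSM idea sketched in this abstract, the following is a minimal Python sketch under stated assumptions; the class SlaPattern, the transform function and both target formats are hypothetical and do not reflect the prototype described in the thesis.

# Illustrative sketch only: an SLA pattern as a configuration-independent template
# (PIM level) and a model transformation that produces a configured, tool-specific
# SLA instance (PSM level) in the format of an assumed target platform.
from dataclasses import dataclass

@dataclass
class SlaPattern:                       # PIM: no concrete thresholds or parties yet
    service: str
    metric: str                         # e.g. "availability"
    comparator: str                     # e.g. ">="

def transform(pattern: SlaPattern, provider: str, threshold: float, target_format: str) -> str:
    """Generate a configured SLA instance in the format of the target platform."""
    if target_format == "xml":          # hypothetical monitoring-tool format
        return (f'<sla service="{pattern.service}" provider="{provider}">'
                f'<objective metric="{pattern.metric}" op="{pattern.comparator}" value="{threshold}"/></sla>')
    if target_format == "ini":          # second hypothetical target platform
        return f"[{pattern.service}]\nprovider={provider}\n{pattern.metric} {pattern.comparator} {threshold}\n"
    raise ValueError(f"unsupported target platform: {target_format}")

availability = SlaPattern(service="mail-hosting", metric="availability", comparator=">=")
print(transform(availability, provider="ACME GmbH", threshold=99.5, target_format="xml"))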

Relevance:

100.00%

Publisher:

Abstract:

Fujaba is an Open Source UML CASE tool project started at the software engineering group of Paderborn University in 1997. In 2002, Fujaba was redesigned and became the Fujaba Tool Suite with a plug-in architecture that allows developers to add functionality easily while retaining full control over their contributions.

Multiple application domains: Fujaba has followed the model-driven development philosophy right from its beginning in 1997. In the early days, Fujaba had a special focus on code generation from UML diagrams, resulting in a visual programming language with a special emphasis on rules that manipulate object structures. Today, at least six rather independent tool versions are under development in Paderborn, Kassel, and Darmstadt, supporting (1) reengineering, (2) embedded real-time systems, (3) education, (4) specification of distributed control systems, (5) integration with the ECLIPSE platform, and (6) MOF-based integration of system (re-)engineering tools.

International community: To our knowledge, quite a number of research groups have also chosen Fujaba as a platform for UML- and MDA-related research activities. In addition, many Fujaba users send requests for more functionality and extensions. The 8th International Fujaba Days therefore aimed at bringing together Fujaba developers and Fujaba users from all over the world to present their ideas and projects and to discuss them with each other and with the Fujaba core development team.

Relevance:

100.00%

Publisher:

Abstract:

The core of this thesis is the modeling of complex web applications with the Story-Driven Modeling approach. The goal is to develop the complete application solely by specifying models; writing source code by hand is not necessary. The thesis presents both the line of research that enables this kind of web application modeling and the resulting outcomes. To support the development process, a model-driven software development process is introduced that covers the modeling of a web application from requirements elicitation to the final generation of source code from the specified models. Tool support for the defined process is provided within the Fujaba Tool Suite; in the course of this work, the existing tool suite was extended by all tools required to support the process. In addition, the existing Fujaba tools were extended so that, beyond the classical modeling of complex Java applications, web applications can be generated as well. Besides the detailed description of the development process, the resulting web applications and their specific properties are described in detail. For generating these applications, the thesis introduces and describes workflow diagrams as a new diagram type. These diagrams capture the intended user workflow of the application during requirements analysis and constitute a dedicated development artifact in the subsequent development. Based on the workflow diagrams, the graphical user interface of the web application is described and a runtime system is initialized that controls the application according to the flows captured in the workflow diagram. This runtime system was developed as part of this thesis and anchored in the process support. All necessary changes, adaptations and extensions to existing parts of the Fujaba Tool Suite are described in detail with respect to the creation of client-side data models of a web application, together with the prerequisites they have to meet. In this context, it is also described how graph transformations can be used to implement business logic on the client side of a web application and how data model changes can be synchronized between different clients. Overall, this thesis shows a way to apply the existing Story-Driven Modeling approach to the generation of web applications. With the approach described here, web browsers are at the same time turned into a new class of graph rewriting engines, as graph transformations are delivered to and executed within the browser's Ajax engine.
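
The role of graph transformations for client-side business logic can be illustrated with a small, invented match-and-rewrite rule; this is a Python sketch of the general technique (pattern matching over an object graph followed by a rewrite), not the JavaScript/Ajax runtime developed in the thesis, and all node kinds and attributes are hypothetical.

# Sketch under assumptions: a graph transformation rule on a client-side object
# graph, in the "match a pattern, then rewrite it" style of Story-Driven Modeling.
class Node:
    def __init__(self, kind, **attrs):
        self.kind, self.attrs, self.edges = kind, attrs, []

def match_open_orders(graph):
    """Pattern: a Customer connected to an Order whose state is 'open'."""
    for customer in (n for n in graph if n.kind == "Customer"):
        for order in (e for e in customer.edges
                      if e.kind == "Order" and e.attrs.get("state") == "open"):
            yield customer, order

def rule_confirm_orders(graph):
    """Rewrite: set matched orders to 'confirmed' and attach a new Invoice node."""
    for customer, order in list(match_open_orders(graph)):
        order.attrs["state"] = "confirmed"
        invoice = Node("Invoice", total=order.attrs.get("total", 0))
        order.edges.append(invoice)
        graph.append(invoice)

graph = [Node("Customer", name="Alice")]
graph.append(Node("Order", state="open", total=42))
graph[0].edges.append(graph[1])
rule_confirm_orders(graph)
print(graph[1].attrs["state"], len(graph))   # confirmed 3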

Relevance:

40.00%

Publisher:

Abstract:

Facing the double menace of climate change and water crisis, poor communities encounter ever more severe challenges in ensuring agricultural productivity and food security. Communities therefore have to manage these challenges by adopting a comprehensive approach that not only enhances water resource management but also adapts agricultural activities to climate variability. Implemented by the Global Environment Facility's Small Grants Programme, the Community Water Initiative (CWI) has adopted a distinctive approach to support demand-driven, innovative, low-cost and community-based water resource management for food security. Experience from CWI shows that a comprehensive, locally adapted approach that integrates water resource management, poverty reduction, climate adaptation and community empowerment provides a good model for sustainable development in poor rural areas.

Relevance:

30.00%

Publisher:

Abstract:

Land use is a crucial link between human activities and the natural environment and one of the main driving forces of global environmental change. Large parts of the terrestrial land surface are used for agriculture, forestry, settlements and infrastructure. Given the importance of land use, it is essential to understand the multitude of influential factors and the resulting land-use patterns. An essential methodology to study and quantify such interactions is provided by land-use models. By applying land-use models, it is possible to analyze the complex structure of linkages and feedbacks and to determine the relevance of driving forces. Modeling land use and land-use change has a long tradition. In particular on the regional scale, a variety of models for different regions and research questions has been created. Modeling capabilities grow with steady advances in computer technology, which are driven on the one hand by increasing computing power and on the other hand by new methods in software development, e.g. object- and component-oriented architectures. In this thesis, SITE (Simulation of Terrestrial Environments), a novel framework for integrated regional land-use modeling, is introduced and discussed. Particular features of SITE are its notably extended capability to integrate models and its strict separation of application and implementation. These features enable the efficient development, testing and use of integrated land-use models. On the system side, SITE provides generic data structures (grid, grid cells, attributes, etc.) and takes over responsibility for their administration. By means of a scripting language (Python) that has been extended with language features specific to land-use modeling, these data structures can be utilized and manipulated by modeling applications. The scripting-language interpreter is embedded in SITE. The integration of sub-models can be achieved via the scripting language or by using a generic interface provided by SITE. Furthermore, functionalities important for land-use modeling, such as model calibration, model tests and analysis support for simulation results, have been integrated into the generic framework. During the implementation of SITE, specific emphasis was placed on expandability, maintainability and usability. Along with the modeling framework, a land-use model for the analysis of the stability of tropical rainforest margins was developed in the context of the collaborative research project STORMA (SFB 552). In a research area in Central Sulawesi, Indonesia, the socio-environmental impacts of land-use changes were examined. SITE was used to simulate land-use dynamics in the historical period from 1981 to 2002. Analogously, a scenario that did not consider migration in the population dynamics was analyzed. For the calculation of crop yields and trace gas emissions, the DAYCENT agro-ecosystem model was integrated. In this case study, it could be shown that land-use changes in the Indonesian research area were mainly characterized by the expansion of agricultural areas at the expense of natural forest. For this reason, the situation had to be interpreted as unsustainable even though increased agricultural use implied economic improvements and higher farmers' incomes. Due to the importance of model calibration, it was explicitly addressed in the SITE architecture through the introduction of a specific component.
The calibration functionality can be used by all SITE applications and enables largely automated model calibration. Calibration in SITE is understood as a process that finds an optimal or at least adequate solution for a set of arbitrarily selectable model parameters with respect to an objective function. In SITE, an objective function is typically a map comparison algorithm capable of comparing a simulation result to a reference map. Several map optimization and map comparison methodologies are available and can be combined. The STORMA land-use model was calibrated using a genetic algorithm for optimization and the figure-of-merit map comparison measure as the objective function. The time period for the calibration ranged from 1981 to 2002; for this period, the respective reference land-use maps were compiled. It could be shown that an efficient automated model calibration with SITE is possible. Nevertheless, the selection of the calibration parameters required detailed knowledge about the underlying land-use model and cannot be automated. In another case study, decreases in crop yields and the resulting losses in income from coffee cultivation were analyzed and quantified under the assumption of four different deforestation scenarios. For this task, an empirical model describing the dependence of bee pollination and the resulting coffee fruit set on the distance to the closest natural forest was integrated. Land-use simulations showed that, depending on the magnitude and location of ongoing forest conversion, pollination services are expected to decline continuously. This results in a reduction of coffee yields of up to 18% and a loss of net revenues per hectare of up to 14%. However, the study also showed that ecological and economic values can be preserved if patches of natural vegetation are conserved in the agricultural landscape.
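
As a hedged sketch of the automated calibration described above, the following toy Python example searches two model parameters with a simple evolutionary loop and uses a figure-of-merit style map comparison (hits / (hits + misses + false alarms)) as the objective function; the model, the maps and the parameter names are invented and do not come from SITE or the STORMA model.

# Sketch under assumptions: calibrate two free parameters of a toy land-use model
# against a reference change map, with a figure-of-merit objective and a simple
# evolutionary search in place of a full genetic algorithm.
import random

REFERENCE = [[1, 1, 0], [1, 0, 0], [0, 0, 0]]     # toy reference map: 1 = converted cell

def simulate(p_convert, neighbour_weight):
    """Stand-in for a land-use model run: returns a simulated 3x3 change map."""
    random.seed(0)                                 # keep the toy model deterministic
    return [[1 if random.random() < p_convert + neighbour_weight * (r + c == 0) else 0
             for c in range(3)] for r in range(3)]

def figure_of_merit(sim, ref):
    hits = sum(s and r for row_s, row_r in zip(sim, ref) for s, r in zip(row_s, row_r))
    misses = sum((not s) and r for row_s, row_r in zip(sim, ref) for s, r in zip(row_s, row_r))
    false_alarms = sum(s and (not r) for row_s, row_r in zip(sim, ref) for s, r in zip(row_s, row_r))
    return hits / max(hits + misses + false_alarms, 1)

def calibrate(generations=30, pop_size=12):
    pop = [(random.random(), random.random()) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: figure_of_merit(simulate(*p), REFERENCE), reverse=True)
        parents = pop[: pop_size // 3]
        pop = parents + [(p[0] + random.gauss(0, 0.1), p[1] + random.gauss(0, 0.1))
                         for p in random.choices(parents, k=pop_size - len(parents))]
    return pop[0]

print("calibrated parameters:", calibrate())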

Relevance:

30.00%

Publisher:

Abstract:

The core of this thesis is research into methods, techniques and tools for fault localization in model-based software development processes. First, a novel model-based software development process that I co-developed, the so-called Fujaba Process, is presented. This process is driven by use case scenarios that are formalized by special collaboration diagrams. The further artifacts of the process, up to the finished application, are also modeled with UML diagram types; no programming at source-code level is necessary. Tool support for the presented process is provided by the Fujaba CASE tool. Large parts of the tool support for the Fujaba Process, including the tool support for testing and debugging, were developed as part of this thesis. The first part of the thesis explains the Fujaba Process in detail and reports our experiences with using the process in industrial projects and in teaching. The second part describes the test generation developed in this thesis, which has become an important part of the Fujaba Process: executable test cases are generated from the formalized use case scenarios. The underlying concept, the concrete technical realization and the practical experiences with the developed test generation are presented. The last part deals with debugging in the Fujaba Process. Several concepts and techniques developed in this thesis are presented that simplify fault localization during application development. Care was taken that debugging, like all other steps in the Fujaba Process, takes place exclusively at the model level. Among other things, techniques for the stepwise execution of models, an object browser and a debugger that allows the backwards execution of programs (back-in-time debugging) are presented. All described concepts were implemented and evaluated as plug-ins for the Eclipse version of Fujaba, Fujaba4Eclipse. During the implementation of the plug-ins, attention was paid to a tight integration with Fujaba on the one hand and with Eclipse on the other. In summary, a development process is presented, together with the possibility of identifying faults in it with automated tests and then localizing and finally fixing these faults in the program by means of special debugging techniques, with the complete process taking place at the model level. For the testing and debugging techniques, plug-ins for Fujaba4Eclipse were developed in this thesis that support the developer as well as possible in the corresponding activity.
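
To give a flavour of scenario-based test generation of the kind described above, here is a small Python sketch under assumptions of my own: an invented use-case scenario, given as (action, expected state) steps, is turned into an executable unittest test case; neither the scenario format nor the generator corresponds to the Fujaba tooling.

# Hypothetical sketch: generate an executable test case from a formalized scenario.
import unittest

SCENARIO = [                       # invented example scenario
    ("insert_coin", {"credit": 1}),
    ("insert_coin", {"credit": 2}),
    ("press_button", {"credit": 0, "dispensed": 1}),
]

class VendingMachine:              # toy system under test
    def __init__(self):
        self.state = {"credit": 0, "dispensed": 0}
    def insert_coin(self):
        self.state["credit"] += 1
    def press_button(self):
        if self.state["credit"] >= 2:
            self.state["credit"] -= 2
            self.state["dispensed"] += 1

def make_test(scenario):
    """Build a unittest.TestCase that replays the scenario and checks every step."""
    def test(self):
        machine = VendingMachine()
        for action, expected in scenario:
            getattr(machine, action)()
            for key, value in expected.items():
                self.assertEqual(machine.state[key], value)
    return type("GeneratedScenarioTest", (unittest.TestCase,), {"test_scenario": test})

GeneratedScenarioTest = make_test(SCENARIO)

if __name__ == "__main__":
    unittest.main(argv=["generated-tests"], exit=False)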

Relevance:

30.00%

Publisher:

Abstract:

The interaction of short intense laser pulses with atoms and molecules produces a multitude of highly nonlinear processes that require a non-perturbative treatment. The detailed study of these highly nonlinear processes by numerically solving the time-dependent Schrödinger equation becomes a daunting task when the number of degrees of freedom is large. The coupling between the electronic and nuclear degrees of freedom further aggravates the computational problem. In the present work we show that the time-dependent Hartree (TDH) approximation, which neglects correlation effects, gives an unreliable description of the system dynamics both in the absence and in the presence of an external field. A theoretical framework is required that treats the electrons and nuclei on an equal footing and fully quantum mechanically. To address this issue we discuss two approaches, namely multicomponent density functional theory (MCDFT) and the multiconfiguration time-dependent Hartree (MCTDH) method, which go beyond the TDH approximation and describe the correlated electron-nuclear dynamics accurately. In the MCDFT framework, where the time-dependent electronic and nuclear densities are the basic variables, we discuss an algorithm to calculate the exact Kohn-Sham (KS) potentials for small model systems. By simulating the photodissociation process in a model hydrogen molecular ion, we show that the exact KS potentials contain all the many-body effects and give an insight into the system dynamics. In the MCTDH approach, the wave function is expanded as a sum of products of single-particle functions (SPFs). The MCTDH method is able to describe the electron-nuclear correlation effects, as the SPFs and the expansion coefficients evolve in time and give an accurate description of the system dynamics. We show that the MCTDH method is suitable for studying a variety of processes such as the fragmentation of molecules, high-order harmonic generation, the two-center interference effect, and the lochfrass effect. We discuss these phenomena in a model hydrogen molecular ion and a model hydrogen molecule. The inclusion of absorbing boundaries in the mean-field approximation and its consequences are discussed using the model hydrogen molecular ion. To this end, two types of calculations are considered: (i) a variational approach with a complex absorbing potential included in the full many-particle Hamiltonian and (ii) an approach in the spirit of time-dependent density functional theory (TDDFT), including complex absorbing potentials in the single-particle equations. It is elucidated that for small grids the TDDFT approach is superior to the variational approach.
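
For reference, the expansion mentioned above can be written down explicitly; the following LaTeX lines give the standard MCTDH ansatz and the single-configuration TDH special case in the usual textbook notation (not quoted from the thesis).

% Standard MCTDH ansatz: a sum of Hartree products of time-dependent
% single-particle functions (SPFs) with time-dependent coefficients.
\Psi(q_1,\dots,q_f,t)
  = \sum_{j_1=1}^{n_1} \cdots \sum_{j_f=1}^{n_f}
    A_{j_1 \dots j_f}(t) \, \prod_{\kappa=1}^{f} \varphi_{j_\kappa}^{(\kappa)}(q_\kappa, t)

% The time-dependent Hartree (TDH) approximation is the uncorrelated
% single-configuration special case, i.e. a single Hartree product:
\Psi_{\mathrm{TDH}}(q_1,\dots,q_f,t) \approx a(t) \, \prod_{\kappa=1}^{f} \varphi^{(\kappa)}(q_\kappa, t)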

Relevance:

30.00%

Publisher:

Abstract:

The objective of this study was to develop an internet-based seminar framework applicable to landscape architecture education. This process was accompanied by various aims. The basic expectation was to keep the main characteristics of landscape architecture education in the online format as well. On top of that, four further objectives were anticipated: (1) training of competences for virtual team work, (2) fostering intercultural competence, (3) creation of equal opportunities for education through internet-based open access and (4) synergy effects and learning processes across institutional boundaries. This work started with the hypothesis that these four expected advantages would compensate for the additional organisational effort caused by the online delivery of the seminars and thus lead to a sustainable integration of this new learning mode into landscape architecture curricula. This rationale was followed by a presentation of four areas of knowledge to which the seminar development was directly related: (1) landscape architecture as a subject and its pedagogy, (2) general learning theories, (3) developments in the ICT sector and (4) wider societal driving forces such as global citizenship and the increase of open educational resources. The research design took the shape of a pedagogical action research cycle. This approach was constructive: the author herself teaches international landscape architecture students, so that the model could be applied directly in practice. Seven online seminars were implemented in the period from 2008 to 2013, and this experience represents the core of this study. The seminars were conducted with varying themes while their pedagogy, organisation and technological tools remained largely identical. The research design is further based on three levels of observation: (1) the seminar design on the basis of theory and methods from the learning sciences, in particular educational constructivism, (2) the seminar evaluation and (3) the evaluation of the seminars' long-term impact. The seminar model itself basically consists of four elements: (1) the taxonomy of learning objectives, (2) ICT tools and their application and pedagogy, (3) process models and (4) the case study framework. The seminar framework was followed by the presentation of the evaluation findings. The major findings of this study can be summed up as follows: Implementing online seminars across educational and national boundaries was possible both in terms of organisation and technology. In particular, a high level of cultural diversity among the seminar participants was definitely achieved. However, there were also obvious obstacles. These were primarily competing study commitments and incompatible schedules among the students attending from different academic programmes, partly even in different time zones. Both factors had a negative impact on individual and working-group performance. With respect to the technical framework it can be concluded that the majority of the participants were able to use the tools either directly without any problem or after overcoming some smaller problems. The seminar wiki was also used intensively for completing the seminar assignments. However, too little truly collaborative text production was observed, which could be improved by changing the requirements for the collaborative task. Two different process models were applied for guiding the collaboration of the small groups, and both were in general successful.
However, it needs to be said that even though the students were able to follow the collaborative task and to co-construct and compare case studies, most of them were not able to synthesize the knowledge they had compiled. This means that the area of consideration often remained on the level of the case, and further reflection, generalisation and critique were largely missing. This shows that the seminar model needs to find better ways of triggering knowledge building and critical reflection. It was also suggested to adopt a more differentiated group-building strategy in future seminars. A comparison of pre- and post-seminar concept maps showed that an increase of factual and conceptual knowledge on the individual level was widely recognizable. The evaluation of the case studies (the major seminar output) also revealed that the students had developed in both the factual and the conceptual knowledge domain. Their self-assessment with respect to individual learning development showed that the highest consensus was achieved in the field of subject-specific knowledge. The participants were much more doubtful with regard to their progress in generic competences such as analysis, communication and organisation. However, 50% of the participants confirmed that they perceived individual development in all competence areas the survey had asked for. Have the additional four targets been met? Concerning the competences for working in a virtual team, it can be concluded that the vast majority was able to use the internet-based tools and to work with them in a target-oriented way. However, there were obvious differences regarding the intensity and activity of participation, because of both external and personal factors. A very positive aspect is the achievement of a high cultural diversity supporting the participants' intercultural competence. Learning from group members was obviously a success factor for the working groups. Regarding the possibilities for better accessibility of educational opportunities, it became clear that a significant number of participants were not able to go abroad during their studies for financial or personal reasons. They confirmed that the online seminar was to some extent a compensation for not having been abroad to study. Inter-institutional learning and synergy were achieved insofar as many teachers from different countries contributed individual lectures. However, those teachers hardly ever followed more than one session. Therefore, the learning effect remained largely within the seminar learning group. Looking back at the research design, it can be said that the pedagogical action research cycle was an appropriate and valuable approach that allowed for strong interaction between theory and practice. However, some more external evaluation from peers, in particular regarding the participants' products, would have been valuable.