952 results for OpenFlow, SDN, Software-Defined Networking, Cloud
Abstract:
Today, cloud computing is the next stage in the development of the information-oriented society in the field of information technology. Great attention is paid to cloud computing in general, but the lack of scientific consideration of its components leads to the problem that not all aspects are well examined. This thesis is an attempt to consider Platform as a Service (a technology for providing a development environment through the Internet) from divergent angles. Technical characteristics, costs, time, estimation of effectiveness, risks, applicable strategies, the migration process, advantages and disadvantages, and the future of the approach are examined to obtain an overall picture of cloud platforms. A literature study was used to examine Platform as a Service, the characteristics of existing cloud platforms were explored, and a model of a typical software development company was developed to create a scenario of migration to cloud technologies. The research showed that, despite all their virtues in reducing costs and time, cloud platforms face some significant obstacles to adoption. Privacy, security and insufficient legislation prevent the concept from becoming widespread.
Abstract:
The purpose of this thesis is to investigate projects funded in the European 7th Framework Information and Communication Technology work programme. The research has been limited to the theme "Pervasive and trusted network and service infrastructure", and the aim is to find out which topics research will concentrate on in the future. The thesis provides important information for the Department of Information Technology at Lappeenranta University of Technology. First, the thesis investigates the requirements for the projects funded under the "Pervasive and trusted network and service infrastructure" programme in 2007. Second, the funded projects are listed in tables and the most important keywords are gathered. Finally, based on keyword occurrences, a vision of the most important future topics is defined. According to the keyword analysis, wireless networks will play an important role in the future, and core networks will be implemented with fiber technology to ensure fast data transfer. Software development favors Service Oriented Architecture (SOA) and open source solutions. Interoperability and ensuring privacy will play a key role in the future. 3D in all its forms and content delivery are important topics as well. When all the projects were compared, the most important issue was discovered to be SOA, which leads the way to cloud computing.
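As a rough illustration of the keyword-occurrence analysis described above, the sketch below counts how often each keyword appears across a set of project descriptions and ranks the topics by frequency. The project descriptions and keyword list are hypothetical placeholders, not the actual FP7 data used in the thesis.

```python
from collections import Counter

# Illustrative sketch of the keyword-occurrence analysis; the project
# descriptions and keyword list are hypothetical placeholders, not the
# actual FP7 project data used in the thesis.
projects = [
    "scalable SOA platform for a trusted service infrastructure",
    "fiber-based core network for fast data transfer",
    "open source SOA toolkit for wireless networks",
]
keywords = ["soa", "wireless", "fiber", "open source", "privacy", "3d"]

counts = Counter()
for description in projects:
    text = description.lower()
    for keyword in keywords:
        if keyword in text:
            counts[keyword] += 1

# Rank future topics by how many funded projects mention them.
for keyword, count in counts.most_common():
    print(f"{keyword}: {count} project(s)")
```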
Abstract:
The goal of this study is to develop managerial recommendations for international vendors and system integrators that offer Software as a Service for enterprise information systems on the Russian market. These recommendations can be used to develop marketing, sales, new product, and service level agreement strategies. For this purpose, factors affecting SaaS adoption were determined and their influence on the intention to adopt was examined.
Abstract:
Cloud computing enables on-demand network access to shared resources (e.g., computation, networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort. Cloud computing refers both to the applications delivered as services over the Internet and to the hardware and system software in the data centers. Software as a Service (SaaS) is part of cloud computing; it is one of the cloud service models. SaaS is software deployed as a hosted service and accessed over the Internet. In SaaS, the consumer uses the provider's applications running in the cloud. SaaS separates the possession and ownership of software from its use. The applications can be accessed from any device through a thin client interface. A typical SaaS application is used with a web browser on a monthly pricing basis. In this thesis, the characteristics of cloud computing and SaaS are presented, and a few implementation platforms for SaaS are discussed. Then, four different SaaS implementation cases and one transformation case are examined. The pros and cons of SaaS are studied based on literature references and an analysis of the SaaS implementations and the transformation case. The analysis is done both from the customer's and the service provider's point of view. In addition, the pros and cons of on-premises software are listed. The purpose of this thesis is to find out when SaaS should be utilized and when it is better to choose traditional on-premises software. The qualities of SaaS bring many benefits to both the customer and the provider. A customer should utilize SaaS when it provides cost savings, ease of use, and scalability over on-premises software. SaaS is reasonable when the customer does not need tailoring but only a simple, general-purpose service, and the application supports the customer's core business. A provider should utilize SaaS when it offers cost savings, scalability, faster development, and a wider customer base compared with on-premises software. It is wise to choose SaaS when the application is cheap, aimed at the mass market, needs frequent updating, needs high-performance computing, needs to store large amounts of data, or derives some other direct value from the cloud infrastructure.
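The trade-off between monthly SaaS pricing and an upfront on-premises license can be made concrete with a simple cumulative cost comparison. The sketch below is illustrative only; all figures are assumptions rather than numbers from the thesis.

```python
# Illustrative cumulative cost comparison between monthly SaaS pricing
# and an upfront on-premises license. All figures are hypothetical
# assumptions, not numbers from the thesis.

def saas_cost(months: int, monthly_fee_per_user: float, users: int) -> float:
    """Cumulative cost of a monthly-priced SaaS subscription."""
    return months * monthly_fee_per_user * users

def on_premises_cost(months: int, license_fee: float, monthly_upkeep: float) -> float:
    """Upfront license plus ongoing hardware and administration upkeep."""
    return license_fee + months * monthly_upkeep

for months in (12, 24, 36, 48):
    saas = saas_cost(months, monthly_fee_per_user=20.0, users=50)
    on_prem = on_premises_cost(months, license_fee=30000.0, monthly_upkeep=500.0)
    cheaper = "SaaS" if saas < on_prem else "on-premises"
    print(f"{months} months: SaaS {saas:.0f} vs on-premises {on_prem:.0f} -> {cheaper}")
```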
Abstract:
Formal software development processes and well-defined development methodologies are nowadays seen as the definitive way to produce high-quality software within time limits and budgets. The variety of such high-level methodologies is huge, ranging from rigorous process frameworks like CMMI and RUP to more lightweight agile methodologies. The need for managing this variety, and the fact that practically every software development organization has its own unique set of development processes and methods, have created the profession of software process engineer. Different kinds of informal and formal software process modeling languages are essential tools for process engineers. These are used to define processes in a way which allows easy management of processes, for example process dissemination, process tailoring and process enactment. The process modeling languages are usually used as a tool for process engineering, where the main focus is on the processes themselves. This dissertation has a different emphasis: it analyses modern software development process modeling from the software developers' point of view. The goal of the dissertation is to investigate whether software process modeling and software process models aid software developers in their day-to-day work, and what the main mechanisms for this are. The focus of the work is on the Software Process Engineering Metamodel (SPEM) framework, which is currently one of the most influential process modeling notations in software engineering. The research theme is elaborated through six scientific articles which represent the dissertation research done on process modeling over an approximately five-year period. The research follows the classical engineering research discipline, where the current situation is analyzed, a potentially better solution is developed, and finally its implications are analyzed. The research applies a variety of research techniques, ranging from literature surveys to qualitative studies done among software practitioners. The key finding of the dissertation is that software process modeling notations and techniques are usually developed in process engineering terms. As a consequence, the connection between the process models and actual development work is loose. In addition, modeling standards like SPEM are partially incomplete when it comes to pragmatic process modeling needs, like lightweight modeling and combining pre-defined process components. This leads to a situation where the full potential of process modeling techniques for aiding daily development activities cannot be achieved. Despite these difficulties, the dissertation shows that it is possible to use modeling standards like SPEM to aid software developers in their work. The dissertation presents a lightweight modeling technique which software development teams can use to quickly analyze their work practices in a more objective manner. The dissertation also shows how process modeling can be used to compare different software development situations more easily and to analyze their differences in a systematic way. Models also help to share this knowledge with others. A qualitative study done among Finnish software practitioners verifies the conclusions of the other studies in the dissertation. Although processes and development methodologies are seen as an essential part of software development, process modeling techniques are rarely used during daily development work. However, the potential of these techniques intrigues the practitioners. In conclusion, the dissertation shows that process modeling techniques, most commonly used as tools for process engineers, can also be used as tools for organizing daily software development work. This work presents theoretical solutions for bringing process modeling closer to ground-level software development activities. These theories are proven feasible through several case studies where the modeling techniques are used, for example, to find differences in the work methods of the members of a software team and to share process knowledge with a wider audience.
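One way to picture the lightweight, comparison-oriented modeling described above is to represent each team's process as a plain mapping from activities to performing roles and diff the two. The sketch below uses a deliberately simplified structure, not the actual SPEM metamodel.

```python
# A deliberately simplified sketch of comparing two teams' development
# processes. The structure (activity -> performing role) is an
# assumption for illustration, not the actual SPEM metamodel.

team_a = {
    "write code": "developer",
    "review code": "peer developer",
    "run unit tests": "developer",
    "deploy": "release engineer",
}
team_b = {
    "write code": "developer",
    "run unit tests": "tester",
    "deploy": "developer",
}

# Activities that only one of the teams performs.
only_a = sorted(set(team_a) - set(team_b))
only_b = sorted(set(team_b) - set(team_a))

# Shared activities assigned to different roles.
role_differences = {
    activity: (team_a[activity], team_b[activity])
    for activity in set(team_a) & set(team_b)
    if team_a[activity] != team_b[activity]
}

print("Only team A:", only_a)
print("Only team B:", only_b)
print("Role differences:", role_differences)
```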
Abstract:
Scrum is an agile project management approach that has been widely practiced in software development projects. It has been shown to increase quality, productivity, customer satisfaction, transparency and team morale, among other benefits of its implementation. The concept of scrum is based on the concepts of incremental innovation strategies, lean manufacturing, kaizen, iterative development and so on, and is usually contrasted with linear development models such as the waterfall method in the software industry. Traditional approaches to project management, such as the waterfall method, imply intensive upfront planning and approval of the entire project. Such approaches work well in well-defined, stable environments where all the specifications of the project are known at the beginning. However, in uncertain environments, when a project requires continuous development and the incorporation of new requirements, they do not tend to work well. The scrum framework was inspired by Nonaka's article about new product development and was later adopted by software development practitioners. This research explores the conditions for and benefits of applying the scrum framework beyond software development projects. There are currently only a few case studies on scrum implementation in non-software projects, but there is a noticeable trend towards it in the scrum practitioners' community. The research is based on a real-life-context multiple case study analysis of three different non-software projects. The results of the research showed that, in order to succeed with scrum, projects need to satisfy certain conditions, both necessary and sufficient. Among them, the key factors are uncertainty of the project environment, outcomes that are not well defined, commitment of the scrum teams, and management support. The top advantages of scrum implementation identified in the present research include improved transparency, accountability, team morale, communication, cooperation and collaboration. Further research is advised in order to validate these findings on a larger sample and to focus on more specific areas of scrum project management implementation.
Abstract:
The goal of this work has been to study and build cloud service identity and access management into a business service for companies and organizations. The starting point has been the use of existing identity management software products as part of the service product under development. The purpose of the work has been to determine whether identity and access management can be produced and offered profitably as a cloud service. The research approach has been constructive, and several methods were used in triangulation in order to obtain a clear picture of the nature and needs of the business. The methods included a survey study and a PESTE analysis. In addition, a business plan and a description of the service business were produced using Osterwalder's canvas method. Each part of the research has been essential in defining the characteristics of the service product, because the goal has been to bring as reliable and easy-to-use a product as possible to the rapidly growing cloud service market. As a result of the research, a model has been defined for a secure service platform that suits the needs of the service business and scales well as the use of cloud services grows sharply. The calculations in the business plan identified the minimum numbers of users required for a profitable business. In addition, the structure of the service was found to help companies and organizations protect the usernames and passwords of cloud services against misuse, which is necessary and topical for all organizations that are considering the use of cloud services and want to do so securely. As a result of the research, it has been possible to determine whether the business is profitable or not, as well as the business elements the service requires.
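The kind of break-even reasoning the business plan performs, finding the minimum user count at which the service becomes profitable, can be sketched as follows. All figures are hypothetical assumptions, not values from the thesis.

```python
import math

# Illustrative break-even calculation: the minimum number of users at
# which the identity and access management service becomes profitable.
# All figures are hypothetical assumptions.

fixed_monthly_cost = 12000.0   # platform, licenses, operations (assumed)
revenue_per_user = 4.0         # monthly subscription fee per user (assumed)
cost_per_user = 1.5            # marginal monthly cost per user (assumed)

margin_per_user = revenue_per_user - cost_per_user
break_even_users = math.ceil(fixed_monthly_cost / margin_per_user)
print(f"Minimum users for a profitable service: {break_even_users}")
```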
Abstract:
Cloud computing, despite its success and promise, presents issues for businesses migrating their legacy applications to the cloud. In this research, legacy-to-cloud migration issues are reviewed based on literature findings and an experience report. Solutions are applied to the Tieto Open Application Suite (TOAS) software development platform running on cloud infrastructure. It is observed that the migration strategy heavily affects the migration approach. For TOAS, a strategy of redesigning the applications for the cloud is suggested. Common migration-driven application-level modifications include adaptation to service-oriented architecture, load balancing, and runtime and technology changes. A cloud platform such as TOAS might introduce additional needs. Decision making on the migration strategy is found to be an issue to be solved case by case; the use of assistive decision-making tools is suggested.
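One form an assistive decision-making tool of the kind suggested above could take is a weighted-scoring matrix over candidate migration strategies. The strategies, criteria, weights, and scores below are illustrative assumptions, not values from the TOAS case.

```python
# A sketch of one possible assistive decision-making tool: a weighted
# scoring matrix over candidate migration strategies. The strategies,
# criteria, weights, and scores are illustrative assumptions.

criteria_weights = {"effort": 0.3, "cloud fit": 0.4, "risk": 0.3}

# Scores per strategy on a 1..5 scale (higher is better).
strategies = {
    "rehost (lift and shift)": {"effort": 5, "cloud fit": 2, "risk": 4},
    "redesign for SOA":        {"effort": 2, "cloud fit": 5, "risk": 3},
    "replace with SaaS":       {"effort": 4, "cloud fit": 4, "risk": 2},
}

for name, scores in strategies.items():
    total = sum(weight * scores[criterion]
                for criterion, weight in criteria_weights.items())
    print(f"{name}: {total:.2f}")
```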
Abstract:
Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
The purpose of this study was to find out how a software company can successfully expand its business to the Danish software market through a distribution channel. The study was commissioned by a Finnish software company and was conducted using a qualitative research method: analyzing the external and internal business environment and interviewing Danish ICT organizations and M-Files personnel. The interviews were semi-structured and designed to collect comprehensive information on the existing ICT and software market in Denmark. The research used three external and internal analysis frameworks: PEST analysis (market level), Porter's Five Forces analysis (industry-level competition) and SWOT analysis (company level). Distribution channel theory served as a basis for understanding why and what kinds of distribution channels the case company uses, and what kinds of channels companies in the target market use. Channel strategy and design were integrated into the industry-level analysis. The empirical findings revealed that Denmark has a very business-friendly ICT environment. Several organizations have ranked Denmark's information and communication technology as the best in the world. Denmark's ICT and software markets are relatively small compared to many other countries in Europe. The Danish software market is centralized: the largest software clusters are in the largest cities of Copenhagen, Aarhus, Odense and Aalborg. In these clusters, software companies are most likely to find suitable resellers. The following growing trends are clearly seen in the software market: mobile and wireless applications, outsourcing, security solutions, cloud computing, social business solutions and e-business solutions. When expanding a software business to the Danish market, it is important to take these trends into account. In Denmark, distribution channels vary depending on the product or service. For many, a natural distribution channel is a local partner or the Internet. In the public sector, solutions are purchased through a public procurement process. In the private sector, the buying process is more straightforward: Danish companies buy software from reliable suppliers, which means that they usually buy software directly from big software vendors or local partners. Some customers prefer to use professional consulting companies. These consulting companies can strongly influence the selection of the supplier and products, and in this light, consulting companies can be important partners for software companies. Even though competition in ECM and DMS solutions is fierce, the Danish market offers opportunities for foreign companies. Penetration of the Danish market through the reseller channel requires advanced solutions and objective selection criteria for channel partners. Based on the findings, Danish companies are interested in advanced and efficient software solutions. Interest in M-Files solutions was clearly observed, and the company has an excellent opportunity to expand its business to the Danish market through the reseller channel. Since the research explored the Danish ICT and software market, the results of the study may offer valuable information to other software companies expanding their business to the Danish market as well.
Abstract:
The research in this Master's thesis project concerns Big Data transfer over parallel data links, and my main objective was to assist the Saint-Petersburg National Research University ITMO research team in accomplishing this project and applying Green IT methods to the data transfer system. The goal of the team is to transfer Big Data using parallel data links with the SDN OpenFlow approach. My task as a team member was to compare existing data transfer applications in order to determine which achieves the highest transfer speed under which conditions, and to explain the reasons. In the context of this thesis, a comparison was made between five different utilities: Fast Data Transfer (FDT), BBCP, BBFTP, GridFTP, and FTS3. A number of scripts were developed to create random binary data (incompressible, to ensure a fair comparison between the utilities), execute the utilities with specified parameters, create log files with results and system parameters, and plot graphs to compare the results. Transferring such an enormous variety of data can take a long time, and hence the need arises to reduce energy consumption to make the transfers greener. In the context of the Green IT approach, our team used a cloud computing infrastructure called OpenStack: it is more efficient to allocate a specific amount of hardware resources to test different scenarios than to use all the resources of our testbed. Testing our implementation on the OpenStack infrastructure showed that the virtual channel does not carry any other traffic, so we can achieve the highest possible throughput. After receiving the final results, we are in a position to identify which utilities produce faster data transfer in different scenarios with specific TCP parameters, and we can use them on real network data links.
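A minimal sketch of the benchmarking scripts described above might look like the following: write a file of random (hence effectively incompressible) bytes, then time a transfer command on it. The `cp` command here is only a placeholder; the real scripts invoked FDT, BBCP, BBFTP, GridFTP, and FTS3 with utility-specific parameters.

```python
import os
import subprocess
import time

# Sketch of the benchmarking flow: generate incompressible test data,
# then time a transfer command on it. The "cp" command is a placeholder
# for the actual transfer utilities (FDT, BBCP, BBFTP, GridFTP, FTS3).

def make_incompressible_file(path: str, size_bytes: int, chunk: int = 1 << 20) -> None:
    """Fill a file with os.urandom data so compression cannot distort results."""
    with open(path, "wb") as f:
        remaining = size_bytes
        while remaining > 0:
            n = min(chunk, remaining)
            f.write(os.urandom(n))
            remaining -= n

def time_transfer(command: list[str]) -> float:
    """Run a transfer command and return the elapsed wall-clock seconds."""
    start = time.time()
    subprocess.run(command, check=True)
    return time.time() - start

size_mib = 100
make_incompressible_file("testdata.bin", size_mib * 1024 * 1024)
elapsed = time_transfer(["cp", "testdata.bin", "/tmp/testdata.bin"])
print(f"Throughput: {size_mib / elapsed:.1f} MiB/s")
```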
Abstract:
Software quality has become an important research subject, not only in the Information and Communication Technology spheres but also in other industries at large where software is applied. Software quality is not a happenstance; it is defined, planned, and built into the software product throughout the Software Development Life Cycle. The research objective of this study is to investigate the roles of the human and organizational factors that influence software quality construction. The study employs Straussian grounded theory. The empirical data has been collected from 13 software companies and includes 40 interviews. The results of the study suggest that tools, infrastructure and other resources have a positive impact on software quality, but that the human factors involved in the software development processes determine the quality of the products developed. On the other hand, the methods of development were found to have little effect on software quality. The research suggests that software quality construction is an information-intensive process whereby organizational structures, the mode of operation, and the information flow within the company variably affect software quality. The results also suggest that software development managers influence the productivity of developers and the quality of the software products. Several challenges of software testing that affect software quality are also brought to light. The findings of this research are expected to benefit the academic community and software practitioners by providing insight into the issues pertaining to software quality construction undertakings.
Abstract:
The manufacturing industry has always faced the challenge of improving production efficiency, product quality and innovation ability, and has struggled to adopt cost-effective manufacturing systems. In recent years, cloud computing has emerged as one of the major enablers for the manufacturing industry. By combining cloud computing and other advanced manufacturing technologies such as the Internet of Things, service-oriented architecture (SOA), networked manufacturing (NM) and the manufacturing grid (MGrid) with existing manufacturing models and enterprise information technologies, a new paradigm called cloud manufacturing has been proposed in the recent literature. This study presents the concepts and ideas of cloud computing and cloud manufacturing. The concept, architecture, core enabling technologies, and typical characteristics of cloud manufacturing are discussed, as well as the difference and relationship between cloud computing and cloud manufacturing. The research is based on mixed qualitative and quantitative methods and a case study. The case is a prototype cloud manufacturing solution: a software platform developed in cooperation between ATR Soft Oy and the SW Company China office. This study tries to understand the practical impacts and challenges that derive from cloud manufacturing. The main conclusion of this study is that cloud manufacturing is an approach for achieving the transformation from traditional production-oriented manufacturing to next-generation service-oriented manufacturing. Many manufacturing enterprises already use a form of cloud computing in their existing network infrastructure to increase the flexibility of their supply chains and reduce resource consumption, and the study finds that the shift from cloud computing to cloud manufacturing is feasible. Meanwhile, the study points out that the related theory, methodology and applications of cloud manufacturing systems are far from mature; it is still an open field where many new technologies need to be studied.
Abstract:
Human beings have always strived to preserve their memories and spread their ideas. In the beginning this was always done through human interpretations, such as telling stories and creating sculptures. Later, technological progress made it possible to create a recording of a phenomenon: first as an analogue recording onto a physical object, and later digitally, as a sequence of bits to be interpreted by a computer. By the end of the 20th century, technological advances had made it feasible to distribute media content over a computer network instead of on physical objects, thus enabling the concept of digital media distribution. Many digital media distribution systems already exist, and their continued, and in many cases increasing, usage is an indicator of the high interest in their future enhancement and enrichment. By looking at these digital media distribution systems, we have identified three main areas of possible improvement: network structure and coordination, transport of content over the network, and the encoding used for the content. In this thesis, our aim is to show that improvements in performance, efficiency and availability can be made in conjunction with improvements in software quality and reliability through the use of formal methods: mathematical approaches to reasoning about software that allow us to prove its correctness, together with other desirable properties. We envision a complete media distribution system based on a distributed architecture, such as peer-to-peer networking, in which different parts of the system have been formally modelled and verified. Starting with the network itself, we show how it can be formally constructed and modularised in the Event-B formalism, such that we can separate the modelling of one node from the modelling of the network itself. We also show how the piece selection algorithm in the BitTorrent peer-to-peer transfer protocol can be adapted for on-demand media streaming, and how this can be modelled in Event-B. Furthermore, we show how modelling one peer in Event-B can give results similar to simulating an entire network of peers. Going further, we introduce a formal specification language for content transfer algorithms, and show that having such a language can make these algorithms easier to understand. We also show how generating Event-B code from this language can result in less complexity compared to creating the models from written specifications. We also consider the decoding part of a media distribution system by showing how video decoding can be done in parallel. This is based on formally defined dependencies between frames and blocks in a video sequence; we have shown that this step, too, can be performed in a way that is mathematically proven correct. Our modelling and proving in this thesis is for the most part tool-based. This demonstrates the advance of formal methods as well as their increased reliability, and thus advocates for their more widespread usage in the future.
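The adaptation of BitTorrent-style piece selection for on-demand streaming can be illustrated, by analogy only, with the sketch below (the thesis models this in Event-B, not Python): pieces inside a window ahead of the playback position are fetched in order, while the classic rarest-first rule applies elsewhere. The window size and availability data are hypothetical.

```python
import random

# Analogy-only sketch of piece selection adapted for streaming: fetch
# pieces near the playback position in order; elsewhere, fall back to
# the classic rarest-first rule. Window size and availability are
# hypothetical illustration values.

def select_piece(playback_pos, missing, availability, window=8):
    """Pick the next piece to request from the set of missing pieces."""
    urgent = [p for p in sorted(missing)
              if playback_pos <= p < playback_pos + window]
    if urgent:
        return urgent[0]  # in-order fetching keeps playback smooth
    # Rarest-first among the remaining pieces, with a random tie-break.
    return min(missing, key=lambda p: (availability[p], random.random()))

pieces = range(32)
availability = {p: random.randint(1, 10) for p in pieces}  # peers per piece
missing = set(pieces)
downloaded = set()
playback_pos = 0
order = []

while missing:
    piece = select_piece(playback_pos, missing, availability)
    missing.discard(piece)
    downloaded.add(piece)
    order.append(piece)
    # Playback advances through the contiguous downloaded prefix.
    while playback_pos in downloaded:
        playback_pos += 1

print("Request order:", order)
```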