931 results for Service Programming Environment
Abstract:
The development of correct programs is a core problem in computer science. Although formal verification methods for establishing correctness with mathematical rigor are available, programmers often find these difficult to put into practice. One hurdle is deriving the loop invariants and proving that the code maintains them. So-called correct-by-construction methods aim to alleviate this issue by integrating verification into the programming workflow. Invariant-based programming is a practical correct-by-construction method in which the programmer first establishes the invariant structure, and then incrementally extends the program in steps of adding code and proving after each addition that the code is consistent with the invariants. In this way, the program is kept internally consistent throughout its development, and the construction of the correctness arguments (proofs) becomes an integral part of the programming workflow. A characteristic of the approach is that programs are described as invariant diagrams, a graphical notation similar to the state charts familiar to programmers. Invariant-based programming is a new method that has not yet been evaluated in large-scale studies. The most important prerequisite for feasibility on a larger scale is a high degree of automation. The goal of the Socos project has been to build tools to assist the construction and verification of programs using the method. This thesis describes the implementation and evaluation of a prototype tool in the context of the Socos project. The tool supports the drawing of the diagrams, automatic derivation and discharging of verification conditions, and interactive proofs. It is used to develop programs that are correct by construction. The tool consists of a diagrammatic environment connected to a verification condition generator and an existing state-of-the-art theorem prover.
Its core is a semantics for translating diagrams into verification conditions, which are sent to the underlying theorem prover. We describe a concrete method for 1) deriving sufficient conditions for total correctness of an invariant diagram; 2) sending the conditions to the theorem prover for simplification; and 3) reporting the results of the simplification to the programmer in a way that is consistent with the invariant-based programming workflow and that allows errors in the program specification to be efficiently detected. The tool uses an efficient automatic proof strategy to prove as many conditions as possible automatically and lets the remaining conditions be proved interactively. The tool is based on the verification system PVS and uses the SMT (Satisfiability Modulo Theories) solver Yices as a catch-all decision procedure. Conditions that were not discharged automatically may be proved interactively using the PVS proof assistant. The programming workflow is very similar to the process by which a mathematical theory is developed inside a computer-supported theorem prover environment such as PVS. The programmer reduces a large verification problem with the aid of the tool into a set of smaller problems (lemmas), and can substantially improve the degree of proof automation by developing specialized background theories and proof strategies to support the specification and verification of a specific class of programs. We demonstrate this workflow by describing in detail the construction of a verified sorting algorithm. Tool-supported verification often has little to no presence in computer science (CS) curricula. Furthermore, program verification is frequently introduced as an advanced and purely theoretical topic that is not connected to the workflow taught in the early and practically oriented programming courses.
Our hypothesis is that verification could be introduced early in CS education, and that verification tools could be used in the classroom to support the teaching of formal methods. A prototype of Socos has been used in a course at Åbo Akademi University targeted at first- and second-year undergraduate students. We evaluate the use of Socos in the course as part of a case study carried out in 2007.
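The core idea of keeping a program consistent with its invariants at every incremental step can be illustrated with a small runtime sketch (in Python, which the thesis does not use; Socos discharges such conditions statically via PVS and Yices rather than with runtime assertions). Here the loop invariant of an insertion sort is checked as an assertion on every iteration:

```python
# Runtime check of a loop invariant for insertion sort -- an illustrative
# stand-in for the verification conditions a tool like Socos proves statically.

def is_sorted(xs):
    """True if xs is in non-decreasing order."""
    return all(xs[i] <= xs[i + 1] for i in range(len(xs) - 1))

def insertion_sort(a):
    a = list(a)
    for i in range(1, len(a)):
        # Invariant: the prefix a[0..i) is sorted before each iteration.
        assert is_sorted(a[:i])
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    # Postcondition: the whole array is sorted.
    assert is_sorted(a)
    return a

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```

In the invariant-based workflow the analogous conditions are written first and proved once, before the code ever runs.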
Abstract:
Wireless sensor networks and their applications have been widely researched and implemented in both commercial and non-commercial areas. The usage of wireless sensor networks has expanded from military applications to everyday life. From a monitoring perspective, wireless sensor network applications range from home, farm field and habitat monitoring to structural monitoring of buildings. As the usage boundaries of wireless sensor networks and their applications expand, there is ongoing research on topics such as network lifetime, security of sensor nodes, and extending the applications with modern scenarios such as web services. The main focus of this thesis work is to study and implement a monitoring application for an infrastructure-based sensor network and to expand its usability as a web service to facilitate mobile clients. The developed application collects and monitors information from wireless sensor nodes, enabling remote monitoring of a home or office environment for a user.
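As a rough illustration of the collection-and-monitoring backend described above (a hypothetical sketch, not the thesis's implementation), a minimal in-memory store could hold the latest reading per sensor node and serialize it as JSON for a web service layer to return to mobile clients:

```python
import json
import time

class SensorStore:
    """Latest reading per node; a web service layer (hypothetical) would
    return to_json() to mobile clients polling for the home/office state."""

    def __init__(self):
        self.latest = {}

    def record(self, node_id, temperature, timestamp=None):
        # Overwrite the previous reading: monitoring clients only need
        # the most recent state of each node.
        self.latest[node_id] = {
            "temperature": temperature,
            "timestamp": timestamp if timestamp is not None else time.time(),
        }

    def to_json(self):
        return json.dumps(self.latest)
```

A real deployment would add persistence and authentication; the point here is only the shape of the node-to-web-service data path.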
Abstract:
The forthcoming media revolution of exchanging paper documents for digital media in construction engineering requires new tools to be developed. The basis of this bachelor's thesis was to explore the preliminary possibilities of exporting imagery from Building Information Modelling (BIM) software to a mobile phone on a construction yard. This was done by producing a Web Service which uses the design software's Application Programming Interface to interact with a structures model in order to produce the requested imagery. While mobile phones were found lacking as client devices, because of limited processing power and small displays, the implementation showed that the Tekla Structures API can be used to automatically produce various types of imagery, and Web Services can be used to transfer this data to the client. Before further development, the needs of the contractor, the benefits for the master builder and inspector, and the full potential of the BIM software need to be mapped out with surveys.
Abstract:
The costs of health care are increasing and, at the same time, the population is aging. This leads health care organizations to focus more on home-based care services. This thesis focuses on the home care organization of the South Karelian District of Social and Health Services (Eksote), which was established in 2010: how its operation is organized and managed, and which types of problems are faced in the daily operation of home care. The thesis examines home care services through an extensive interview study, process mapping and statistical data analysis. To understand the nature of the services and their special environment, theoretical models such as service management and performance measurement, service processes and service design are introduced. This study is conducted from an external researcher's point of view and should be used as a discussion opener. The outcome of this thesis is an upper-level development path for Eksote home care. The organization should evaluate and build a service offering, then productize home care services, modularize the products and identify similarities. Service processes should be mapped to generate efficiency for repeating tasks. Units should be reasonably sized and geographically located to facilitate management and operation. All this can be done by recognizing the different types of service products: runners, repeaters and strangers. Furthermore, the organization should not hide behind medical issues, and should understand the legislative, medical and operational frameworks in health care.
Abstract:
The main objective of this Master's Thesis was to examine the interrelations of service quality and relationship quality (customer satisfaction, trust and commitment), and to find out whether they are antecedents of customer loyalty in a business-to-business context. The literature review revealed some research gaps concerning these focal concepts, which should be studied more closely. The theoretical basis for this research was collected for evaluating a strategic increase of customers' perceptions of service quality and relationship quality as well as customer loyalty in a business-to-business environment, and it was tested empirically on a sample of 164 corporate customers who responded to an Internet-based survey. The measures used in the survey were first assessed using confirmatory factor analysis (CFA), and the hypothesized relationships were then verified using structural equation modeling (SEM) in LISREL 8.80. Support was found for half of the hypothesized construct relations. The results of the research confirm the direct influence of trust and commitment on customer loyalty. Also, service quality turned out to have an indirect impact on customer loyalty through trust. No support, however, was found for the proposed impact of customer satisfaction on loyalty in this case. The research provides managerially relevant and actionable results that may help service providers execute more specific customer relationship quality strategies that lead to higher customer loyalty.
Abstract:
This paper presents the development of a two-dimensional interactive software environment for structural analysis and optimization based on object-oriented programming using the C++ language. The main feature of the software is the effective integration of several computational tools into graphical user interfaces implemented in the Windows-98 and Windows-NT operating systems. The interfaces simplify data specification in the simulation and optimization of two-dimensional linear elastic problems. NURBS have been used in the software modules to represent geometric and graphical data. Extensions to the analysis of three-dimensional problems have been implemented and are also discussed in this paper.
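The NURBS representation mentioned above is built on B-spline basis functions. As a language-neutral illustration (sketched here in Python rather than the paper's C++), the standard Cox-de Boor recursion evaluates one such basis function:

```python
def bspline_basis(i, p, t, knots):
    """Cox-de Boor recursion: value of the i-th degree-p B-spline basis
    function at parameter t, over the given knot vector.
    Uses the half-open convention knots[i] <= t < knots[i+1] at degree 0."""
    if p == 0:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left = 0.0
    d = knots[i + p] - knots[i]
    if d > 0:
        left = (t - knots[i]) / d * bspline_basis(i, p - 1, t, knots)
    right = 0.0
    d = knots[i + p + 1] - knots[i + 1]
    if d > 0:
        right = (knots[i + p + 1] - t) / d * bspline_basis(i + 1, p - 1, t, knots)
    return left + right
```

A NURBS curve additionally weights these basis functions per control point and renormalizes; the recursion above is the common core of both.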
Abstract:
The role of contract manufacturing and subcontracting has been seen in black and white from a product and service point of view: it used to be seen either as a product or as a service. This thesis discusses the product-service system, an offering combining products and services. Theory was built from two perspectives: service productization via Business Model generation, and product servitization via the New Service Development process. The target of the case study was to point out new ways of service thinking and ways of changing the business environment in contract manufacturing, especially from a customer satisfaction and profitability point of view. The case study follows the phases of the New Service Development process. First, ideas were collected from the literature and via sales management interviews. Then a service offering and a tool for service requirement evaluation were created. Finally, the financial results of example service scenarios were calculated. It is recommended to take the service offering into internal use and to further develop it into a modular service model. It is also recommended to take the created customer service requirement evaluation tool into use, both for capturing customer service needs and for communicating them internally.
Abstract:
The trend of concentrating on core competencies leads to outsourcing of non-core activities. One such activity is logistics, where the responsibility is given to third-party service providers. This means the service provider acts as an intermediary between the buyer and the end customer. This thesis concentrates on depicting the operational environment of one such service provider, Swissport Finland Ltd, and the improvement of their checked-baggage irregularity service. The tools used for this work were service blueprinting, an illustrative method for service mapping, and failure modes and effects analysis (FMEA). The theoretical part of the thesis offers a framework for using these tools for logistics services, while the empirical part consists of a study mostly qualitative in nature. The action research method was used for the service improvement research. According to the results of this study, the combination of service blueprinting and FMEA can be used successfully for irregularity service improvement. The most important result was an enhanced irregularity process that has been found to alleviate earlier problems.
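FMEA conventionally ranks failure modes by a Risk Priority Number, RPN = severity × occurrence × detection, with each factor rated on a 1-10 scale. A minimal sketch of that ranking (the baggage-handling failure modes and ratings below are hypothetical, not taken from the thesis):

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number: each factor rated 1-10, higher = worse."""
    for v in (severity, occurrence, detection):
        if not 1 <= v <= 10:
            raise ValueError("ratings must be in 1..10")
    return severity * occurrence * detection

# Hypothetical failure modes for a baggage irregularity process:
# (description, severity, occurrence, detection)
failure_modes = [
    ("bag tag unreadable", 7, 5, 4),          # RPN 140
    ("wrong flight loaded", 9, 2, 3),         # RPN 54
    ("delivery address missing", 6, 4, 6),    # RPN 144
]

# Address the highest-RPN failure modes first.
ranked = sorted(failure_modes, key=lambda m: rpn(*m[1:]), reverse=True)
```

Combined with a service blueprint, each failure mode can be tied to the process step where it originates.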
Abstract:
Video transcoding refers to the process of converting a digital video from one format into another. It is a compute-intensive operation. Therefore, transcoding a large number of simultaneous video streams requires a large amount of computing resources. Moreover, to handle different load conditions in a cost-efficient manner, the video transcoding service should be dynamically scalable. Infrastructure as a Service (IaaS) clouds currently offer computing resources, such as virtual machines, under the pay-per-use business model. Thus IaaS clouds can be leveraged to provide a cost-efficient, dynamically scalable video transcoding service. To use computing resources efficiently in a cloud computing environment, cost-efficient virtual machine provisioning is required to avoid over-utilization and under-utilization of virtual machines. This thesis presents proactive virtual machine resource allocation and de-allocation algorithms for video transcoding in cloud computing. Since users' requests for videos may change at different times, a check is required to see if the current computing resources are adequate for the video requests. Therefore, work on admission control is also provided. In addition to admission control, temporal resolution reduction is used to avoid jitter in a video. Furthermore, in a cloud computing environment such as Amazon EC2, computing resources are more expensive than storage resources. Therefore, to avoid repetition of transcoding operations, a transcoded video needs to be stored for a certain time. Storing all videos for the same amount of time is also not cost-efficient, because popular transcoded videos have a high access rate while unpopular transcoded videos are rarely accessed. This thesis provides a cost-efficient computation and storage trade-off strategy, which stores videos in the video repository as long as it is cost-efficient to store them.
This thesis also proposes video segmentation strategies for bit rate reduction and spatial resolution reduction video transcoding. The evaluation of the proposed strategies is performed using a message passing interface (MPI) based video transcoder, which uses a coarse-grained parallel processing approach where the video is segmented at the group-of-pictures level.
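The computation-storage trade-off described above can be reduced to a per-period comparison: keep a transcoded copy while the expected cost of re-transcoding it on demand exceeds the cost of storing it. A minimal sketch under that assumption (the decision rule and the numbers are illustrative, not the thesis's exact strategy):

```python
def keep_in_storage(access_rate, transcode_cost, storage_cost_per_period):
    """Keep the transcoded copy while the expected re-transcoding cost
    (accesses per period x compute cost per transcode) exceeds the
    storage cost for the same period."""
    return access_rate * transcode_cost > storage_cost_per_period

# Popular video: ~10 accesses/month; transcoding costs 0.50, storage 0.10/month.
assert keep_in_storage(10, 0.50, 0.10)       # cheaper to keep it stored

# Unpopular video: ~0.1 accesses/month.
assert not keep_in_storage(0.1, 0.50, 0.10)  # cheaper to re-transcode on demand
```

Re-evaluating this rule periodically, as access rates decay, yields a per-video retention time instead of a fixed one.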
Abstract:
Today the lean philosophy has gathered a lot of popularity and interest in many industries. This customer-oriented philosophy helps to understand the customer's value creation, which can be used to improve efficiency. A comprehensive study of lean and lean methods in the service industry was carried out in this research. In the theoretical part, the lean philosophy is studied at different levels, which helps to understand its diversity. To support lean, this research also presents the basic concepts of process management. Lastly, the theoretical part presents a development model to support process development in a systematic way. The empirical part of the study was carried out by taking experimental measurements during the service center's product return process and analyzing this data. The measurements were used to map out factors that have a negative influence on the process flow. Several development propositions were discussed to remove these factors. Problems mainly occur due to challenges in controlling customers and due to the lack of responsibility and continuous improvement at the operational level. The development propositions concern such factors as changes in the service center's physical environment, standardization of work tasks, and training. These factors will remove waste from the product return process and support the idea of continuous improvement.
Abstract:
As computer networks grow larger and more complex, there is a need for a new, simpler kind of approach to configuring them. Software Defined Networking (SDN) takes the control plane away from individual nodes and centralizes network control by utilizing flow-based traffic management. In this thesis, the suitability of SDN for a small ISP (Internet Service Provider) network is considered as an alternative to the current traditional core network and access network OSSs (Operations Support Systems), mainly to simplify network management but also to see what else SDN would offer for such an environment. This is done by combining information learned from a theoretical study on the matter with a more practical experiment: an SDN network simulation using the Mininet simulation software and the OpenDaylight SDN controller software. Although the simulation shows that SDN is able to provide the functionality needed for the network, the immaturity of the technology suggests that for a small ISP network there is no need to utilize SDN just yet. For when SDN becomes more commonplace, a brief transition plan is introduced.
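The flow-based traffic management at the heart of SDN can be illustrated with a toy flow table: the controller installs match/action rules, and the switch applies the first rule that matches a packet. This is a simplified model for intuition only, not the OpenFlow wire protocol or the OpenDaylight API:

```python
# Toy SDN flow table: ordered match/action rules installed by a controller.
# An empty match dict matches everything (the table-miss entry).
flow_table = [
    ({"dst_ip": "10.0.0.2"}, "port2"),
    ({"dst_ip": "10.0.0.3", "tcp_dst": 80}, "port3"),
    ({}, "drop"),  # table-miss: unmatched traffic is dropped
]

def apply_flow_table(packet, table):
    """Return the action of the first rule whose every match field
    equals the corresponding packet field."""
    for match, action in table:
        if all(packet.get(k) == v for k, v in match.items()):
            return action
    return "drop"
```

In a real deployment the controller would push such rules to switches over a southbound protocol such as OpenFlow, with priorities, counters and timeouts per rule.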
Abstract:
The goal of this work is to develop a service product for ABB that can be offered to power plant customers. The new service product must align with ABB's new strategy. The service offers customers the performance of the mandatory measures defined by the Energy Efficiency Act, which entered into force on 1 January 2015. In this work, information is collected, processed and analyzed to support decision-making in the productization process of a service aimed at power plant customers. To develop the service product, ABB's current service products, competences and reference projects, the Energy Efficiency Act, the energy efficiency potential of power plants, and various energy audit models are studied. To support decision-making, an energy analysis of a power plant is carried out as a reference project, in which the power plant is modeled with the ipsePRO simulation software. The model and test runs are used to study the optimization of the power plant's minimum load. The market study examines the impact of legislation, the current market situation, potential customers, competitors, and ABB's possibilities to operate in the field, by means of a SWOT analysis. Based on the results of the study, a decision is made to productize a service product for power plants that includes all measures for fulfilling the requirements of the Energy Efficiency Act with respect to acting as the responsible person for the company's energy audit and carrying out the energy audit and site audits. In addition, during this work the Energy Authority granted ABB the qualification to act as the responsible person for a company's energy audit, which is a prerequisite for offering the service.
Abstract:
The significance and impact of services in the modern global economy have grown, and for decades there has been demand in the academic community of international business for further research into better understanding the internationalisation of services. Theories based on the internationalisation of manufacturing firms have long been questioned for their applicability to services. This study aims at contributing to the understanding of the internationalisation of services by examining how market selection decisions are made for new service products within the existing markets of a multinational financial service provider. The study focused on the factors influencing market selection, and was conducted as a case study of a multinational financial service firm and two of its new service products. Two directors responsible for the development and internationalisation of the case service products were interviewed in guided semi-structured interviews based on themes adopted from the literature review and the resulting theoretical framework. The main empirical findings of the study suggest that the most significant factors influencing market selection for new service products within a multinational financial service firm's existing markets are: commitment to the new service products by both the management and the rest of the product-related organisation; the capability and competence of the local country organisations to adopt new services; market potential, which combines market size, market structure and competitive environment; product fit to the market requirements; and enabling partnerships. Based on the empirical findings, this study suggests a framework of factors influencing market selection for new service products, and proposes further research issues and methods to test and extend the findings of this research.
Abstract:
The pipeline for macro- and microarray analyses (PMmA) is a set of scripts with a web interface developed to analyze DNA array data generated by array image quantification software. PMmA is designed for use with single- or double-color array data and to work as a pipeline in five classes (data format, normalization, data analysis, clustering, and array maps). It can also be used as a plugin in the BioArray Software Environment, an open-source database for array analysis, or used in a local version of the web service. All scripts in PMmA were developed in the PERL programming language, and statistical analysis functions were implemented in the R statistical language. Consequently, our package is platform-independent software. Our algorithms can correctly select almost 90% of the differentially expressed genes, showing a superior performance compared to other methods of analysis. The pipeline software has been applied to public macroarray data of 1536 expressed sequence tags of sugarcane exposed to cold for 3 to 48 h. PMmA identified thirty cold-responsive genes previously unidentified in this public dataset. Fourteen genes were up-regulated, two had a variable expression, and the other fourteen were down-regulated in the treatments. These new findings certainly were a consequence of using a superior statistical analysis approach, since the original study did not take into account the dependence of data variability on the average signal intensity of each gene. The web interface, supplementary information, and the package source code are available, free, to non-commercial users at http://ipe.cbmeg.unicamp.br/pub/PMmA.
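The intensity dependence the original study overlooked is commonly made visible with the MA transform of two-color array data, where M captures differential expression and A the average spot intensity. A minimal sketch of that transform (illustrative only, not PMmA's Perl/R implementation):

```python
import math

def ma_transform(red, green):
    """Per-spot MA values for a two-color array:
    M = log2(R/G)       -- log-ratio (differential expression)
    A = 0.5*log2(R*G)   -- average log-intensity
    Plotting M against A reveals intensity-dependent bias and variance,
    which intensity-aware normalization then corrects."""
    return [(math.log2(r / g), 0.5 * math.log2(r * g))
            for r, g in zip(red, green)]

spots = ma_transform([1024.0, 256.0], [256.0, 256.0])
# First spot: M = 2 (4-fold up-regulated), A = 9.
# Second spot: M = 0 (unchanged), A = 8.
```

Modeling variability as a function of A, rather than assuming it constant, is what lets an analysis like PMmA's avoid calling low-intensity noise differentially expressed.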
Abstract:
Corporate support functions are increasingly being concentrated into service centers, and Service Management principles guide companies in the transition. Service Financial Management is an integral part of supporting the strategic positioning of the service center. The main goal of this thesis is to create a step-by-step plan to improve and automate the service charging processes of the finance service function of the case company. Automating the collection of service transaction data for reporting is expected to improve efficiency, reliability and transparency. Interviews with finance service managers are held to define the current processes and areas for improvement. These form the basis for a development roadmap that takes place in two phases: the first phase is to create an environment where automation is possible, and the second phase is the automation of each finance service. Benchmarking interviews are held with the service centers of three other companies to discover best practices. The service charging processes of the studied companies are found to be incompatible, and suggestions for process automation cannot be inferred from them. Some implications of Service Financial Management decisions for the strategy of the service center are identified. The bundling of services, and charging for them inside or outside the goal-setting frame of the business unit, can be used to support the strategic choice and customer acceptance of the service center.