977 results for service composition testing


Relevance:

80.00%

Publisher:

Abstract:

Web services are loosely coupled applications that use XML documents as a way of integrating distinct systems on the internet. Such documents are used in standards such as SOAP, WSDL and UDDI, which establish, respectively, integrated patterns for the representation of messages, the description of services, and the publication of services, thus facilitating interoperability between heterogeneous systems. Often a single service does not meet the users' needs, so new systems can be designed from the composition of two or more services. This is the design goal behind Service-Oriented Architecture. In parallel with this scenario, we have the PEWS (Predicate Path-Expressions for Web Services) language, which specifies behavioural specifications of composite web service interfaces. The development of the PEWS language is divided into two parts: front-end and back-end. From a PEWS program, the front-end performs lexical, syntactic, and semantic analysis and finally generates XML code. The function of the back-end is to execute the PEWS composition. This master's dissertation aims to: (i) reformulate the proposed architecture for the runtime system of the language, and (ii) implement the back-end for PEWS by using .NET Framework tools to execute PEWS programs using the Windows Workflow Foundation.
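A miniature front-end in the spirit described above can be sketched as follows. This is a hypothetical illustration, not the actual PEWS implementation: the grammar (a `.` sequence operator and a `||` parallel operator) and the names `tokenize`, `parse`, and `to_xml` are invented for the example.

```python
# Hypothetical sketch of a composition-language front-end: lex a tiny
# expression, build a composition tree, and emit XML. The grammar and
# XML vocabulary are illustrative assumptions, not the real PEWS ones.
import re

def tokenize(src):
    # Split an expression like "a . b || c" into service names and operators.
    return re.findall(r'\w+|\.|\|\|', src)

def parse(tokens):
    # Left-associative parse: operand (op operand)*
    node = ('svc', tokens[0])
    i = 1
    while i < len(tokens):
        op = 'seq' if tokens[i] == '.' else 'par'
        node = (op, node, ('svc', tokens[i + 1]))
        i += 2
    return node

def to_xml(node):
    # Emit a nested XML representation of the composition tree.
    if node[0] == 'svc':
        return f'<invoke service="{node[1]}"/>'
    tag = 'sequence' if node[0] == 'seq' else 'parallel'
    return f'<{tag}>{to_xml(node[1])}{to_xml(node[2])}</{tag}>'

xml = to_xml(parse(tokenize('login . search || pay')))
```

The back-end would then be the component that takes such XML and executes it on a workflow engine.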

Relevance:

80.00%

Publisher:

Abstract:

The increase in the interconnection capabilities of devices of all kinds is driving a revolution in the provision of services, both in quantity and in diversity. This evolution has highlighted the need for unprecedented technological development, where the forecast of devices interconnected and interoperating with one another and with people reaches the order of billions. This idea of an interconnected world of things has led to a vision that has been called the Internet of Things: a world where things of any kind can interact with other things, including those belonging to networks with constrained resources. This in turn leads to the creation of composed services that exceed the sum of their parts. Beyond its technological relevance, this new vision connects with that of the Smart City, a concept that draws on the convergence of energy, transport, and information and communication technologies to define a way to achieve sustainable and competitive growth, improving quality of life and opening the governance of cities to citizen participation. 
In this line of development, this Final Degree Dissertation proposes a way to virtualize the services offered by the variety of devices that are acquiring the ability to interoperate in a network. To this end, it relies on a service-oriented middleware layer, nSOM, developed at EUITT. On top of this architecture, the goals of the project are the design and development of a service gateway that makes the resources offered by a sensor network accessible from the web; the design and development of a device and service registry in accordance with the reference architecture proposal for the Internet of Things; and the study and design of a framework for the composition of orchestrated services in constrained networks. To achieve these goals, a state-of-the-art study is first conducted, establishing the background on the technologies for interoperation between things: the principles of wireless sensor and actuator networks, the architectures for machine-to-machine communications and the Internet of Things, and the Web of Things vision. The relevant network and service technologies are then covered, ending with a brief review of service composition technologies. A detailed description of the nSOM architecture and of the design proposed for this project follows. Finally, a scenario is proposed in which a series of validation tests is carried out.
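The device and service registry mentioned above can be illustrated with a minimal sketch. The `Registry` class and its `register`/`lookup` methods are assumptions made for the example; they do not reproduce the actual nSOM interfaces.

```python
# Hypothetical sketch of a device & service registry: devices announce the
# services they expose, and a gateway looks up providers of a service.
class Registry:
    def __init__(self):
        self._devices = {}

    def register(self, device_id, services):
        # Associate a device with the set of services it exposes.
        self._devices[device_id] = set(services)

    def lookup(self, service):
        # Return all device ids offering the requested service.
        return sorted(d for d, s in self._devices.items() if service in s)

reg = Registry()
reg.register('sensor-1', ['temperature', 'humidity'])
reg.register('sensor-2', ['temperature'])
found = reg.lookup('temperature')
```

A web gateway would expose `lookup` results as web resources, hiding the constrained network behind it.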

Relevance:

80.00%

Publisher:

Abstract:

With the quick advance of web service technologies, end-users can conduct various on-line tasks, such as shopping on-line. Usually, end-users compose a set of services to accomplish a task and need to enter values into the services to invoke the composition. Quite often, users revisit websites and use services to perform recurring tasks, and they are required to enter the same information into various web services to do so. However, repetitively typing the same information into services is tedious and can negatively impact the user experience. Recent studies have proposed several approaches to help users fill in values for services automatically. However, prior studies suffer from the following drawbacks: (1) limited support for collecting and analyzing user inputs; (2) poor accuracy when filling values into services; (3) designs that do not consider service composition. To overcome these drawbacks, we need to maximize the reuse of previous user inputs across services and end-users. In this thesis, we introduce approaches that spare end-users from entering the same information into repetitive on-line tasks. More specifically, we improve the process of filling out services in four aspects. First, we investigate the characteristics of input parameters and propose an ontology-based approach to automatically categorize parameters and fill values into the categorized input parameters. Second, we propose a comprehensive framework that incorporates user contexts and usage patterns into the process of filling values into services. Third, we propose an approach for maximizing value propagation among services and end-users by linking semantically related parameters and similar end-users together. 
Last, we propose a ranking-based framework that ranks a list of previous user inputs for an input parameter to save a user from unnecessary data entries. Our framework learns and analyzes the interactions between user inputs and input parameters to rank user inputs for input parameters under different contexts.
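The ranking idea in the last point can be sketched as follows. The scoring rule (counting past uses, with exact-context reuse weighted double) is an illustrative assumption, not the thesis's learned model.

```python
# Hypothetical sketch: rank previously entered values for an input parameter
# by how often they were reused, preferring values used in the same context.
from collections import Counter

def rank_inputs(history, parameter, context):
    # history: list of (parameter, value, context) triples from past tasks.
    scores = Counter()
    for p, value, ctx in history:
        if p == parameter:
            # Assumption: exact-context reuse counts double.
            scores[value] += 2 if ctx == context else 1
    return [v for v, _ in scores.most_common()]

history = [
    ('city', 'Kingston', 'travel'),
    ('city', 'Kingston', 'shopping'),
    ('city', 'Toronto', 'travel'),
    ('city', 'Toronto', 'travel'),
]
ranked = rank_inputs(history, 'city', 'travel')
```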

Relevance:

80.00%

Publisher:

Abstract:

"FAA-H-8083-3"--Cover.

Relevance:

80.00%

Publisher:

Abstract:

Graph reduction machines are a traditional technique for implementing functional programming languages. They run programs by transforming graphs through the successive application of reduction rules. Web service composition enables the creation of new web services from existing ones. BPEL is a workflow-based language for creating web service compositions, and it is the industrial and academic standard for this kind of language. As it is designed to compose web services, using BPEL in a scenario where multiple technologies are needed is problematic: when operations other than web services must be performed to implement the business logic of a company, part of the work is done on an ad hoc basis. Allowing heterogeneous operations to be part of the same workflow may help improve the implementation of business processes in a principled way. This work uses a simple variation of the BPEL language for creating compositions containing not only web service operations but also big data tasks and user-defined operations. We define an extensible graph reduction machine that allows the evaluation of BPEL programs, implement this machine as a proof of concept, and present some experimental results.
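A single reduction rule of such a machine can be sketched as follows. The node forms (`task`, `seq`, `done`) are invented for the illustration and do not follow the BPEL variant defined in the work.

```python
# Hypothetical sketch of graph reduction over a workflow graph: reduction
# rules rewrite nodes until the whole graph collapses to a 'done' node.
def reduce_node(node, run_task):
    # Rule 1: task(name) -> done([result of running name])
    if node[0] == 'task':
        return ('done', [run_task(node[1])])
    # Rule 2: seq(done xs, done ys) -> done(xs ++ ys)
    if node[0] == 'seq':
        left = reduce_node(node[1], run_task)
        right = reduce_node(node[2], run_task)
        return ('done', left[1] + right[1])
    return node

graph = ('seq', ('task', 'fetch'),
                ('seq', ('task', 'transform'), ('task', 'store')))
result = reduce_node(graph, lambda name: name.upper())
```

An extensible machine in this style would add new node forms (e.g. a big data task) by adding new reduction rules, without touching the existing ones.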


Relevance:

80.00%

Publisher:

Abstract:

Abstract: The structural build-up of fresh cement-based materials has a great impact on their structural performance after casting. Accordingly, the mixture design should be tailored to adapt the kinetics of build-up to the application at hand. The rate of structural build-up of cement-based suspensions at rest is a complex phenomenon affected by both physical and chemical structuration processes. The structuration kinetics are strongly dependent on the mixture's composition, the testing parameters, and the shear history. Accurate measurements of build-up rely on the efficiency of the applied pre-shear regime in achieving an initially well-dispersed state, as well as on the applied stress during the liquid-solid transition. Studying the physical and chemical mechanisms of build-up of cement suspensions at rest can enhance the fundamental understanding of this phenomenon and therefore allow better control of the rheological and time-dependent properties of cement-based materials. The research focused on the use of dynamic rheology to investigate the kinetics of structural build-up of fresh cement pastes. The research program was conducted in three phases. The first phase was devoted to evaluating the dispersing efficiency of various disruptive shear techniques. The investigated shearing profiles included rotational shear, oscillatory shear, and a combination of both. The initial and final states of the suspension's structure, before and after disruption, were determined by applying a small-amplitude oscillatory shear (SAOS). The difference between the viscoelastic values before and after disruption was used to express the degree of dispersion, and an efficient technique to disperse concentrated cement suspensions was developed. The second phase aimed to establish a rheometric approach to dissociate and monitor the individual physical and chemical mechanisms of build-up of cement paste. 
In this regard, non-destructive dynamic rheometry was used to investigate the evolution of both the storage modulus and the phase angle of inert calcium carbonate and cement suspensions. Two independent build-up indices were proposed. The structural build-up of various cement suspensions made with different cement contents, silica fume replacement percentages, and high-range water-reducer dosages was evaluated using the proposed indices. These indices were then compared to the well-known thixotropic index (Athix). Furthermore, the proposed indices were correlated to the decay in lateral pressure determined for various cement pastes cast in a pressure column. The proposed pre-shearing protocol and build-up indices (phases 1 and 2) were then used to investigate the effect of mixture parameters on the kinetics of structural build-up in phase 3. The investigated mixture parameters included cement content and fineness, alkali sulfate content, and the temperature of the cement suspension. Zeta potential, calorimetric, and spectrometric measurements were performed to explore the corresponding microstructural changes in cement suspensions, such as inter-particle cohesion, the rate of Brownian flocculation, and the nucleation rate. A model linking the build-up indices and the microstructural characteristics was developed to predict the build-up behaviour of cement-based suspensions. The obtained results showed that oscillatory shear may have a greater effect on dispersing concentrated cement suspensions than rotational shear. Furthermore, the increase in induced shear strain was found to enhance the breakdown of the suspension's structure up to a critical point, after which thickening effects dominate. An effective dispersing method is then proposed. 
It consists of applying a rotational shear around the transitional value between the linear and non-linear variations of apparent viscosity with shear rate, followed by an oscillatory shear at the crossover shear strain and a high angular frequency of 100 rad/s. Investigating the evolution of the viscoelastic properties of inert calcite-based and cement suspensions allowed establishing two independent build-up indices. The first (the percolation time) represents the rest time needed to form the elastic network. The second (the rigidification rate) describes the increase in the stress-bearing capacity of the formed network due to cement hydration. In addition, the results showed that combining the percolation time and the rigidification rate can provide deeper insight into the structuration process of cement suspensions. Furthermore, these indices were found to be well correlated to the decay in the lateral pressure of cement suspensions. The variation of the proposed build-up indices with mixture parameters showed that the percolation time is most likely controlled by the frequency of Brownian collisions, the distance between dispersed particles, and the intensity of cohesion between cement particles. On the other hand, a higher rigidification rate can be secured by increasing the number of contact points per unit volume of paste, the nucleation rate of cement hydrates, and the intensity of inter-particle cohesion.
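As a rough illustration of how the two indices described above could be extracted from a storage-modulus time series, consider the following sketch. The threshold rule for the percolation time and the end-to-end slope for the rigidification rate are simplifying assumptions, not the thesis's protocol, and the data below are made up.

```python
# Hypothetical sketch: estimate a percolation time (first time the storage
# modulus G' exceeds a threshold) and a rigidification rate (mean slope of
# G' after percolation) from a measured time series.
def build_up_indices(times, storage_moduli, percolation_threshold):
    # Percolation time: the elastic network is assumed formed once G'
    # first reaches the threshold.
    t_perc = next(t for t, g in zip(times, storage_moduli)
                  if g >= percolation_threshold)
    # Rigidification rate: end-to-end slope of G' after percolation (Pa/s).
    post = [(t, g) for t, g in zip(times, storage_moduli) if t >= t_perc]
    (t0, g0), (t1, g1) = post[0], post[-1]
    return t_perc, (g1 - g0) / (t1 - t0)

times = [0, 60, 120, 180, 240]      # rest time, s (made-up data)
g_prime = [5, 20, 120, 320, 520]    # storage modulus G', Pa (made-up data)
t_perc, rate = build_up_indices(times, g_prime, percolation_threshold=100)
```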

Relevance:

40.00%

Publisher:

Abstract:

This paper studies the effects of service offshoring on the skill composition of labor demand, using novel comparable data for nine Western European countries between 1990 and 2004. The empirical analysis delivers three main results. First, service offshoring is skill-biased, because it increases the demand for high and medium skilled labor and decreases the demand for low skilled labor. Second, the effects of service offshoring are similar to those of material offshoring, both qualitatively and quantitatively. Third, the economic magnitude of these effects is not large.

Relevance:

40.00%

Publisher:

Abstract:

OBJECTIVES: To determine 1) HIV testing practices in a 1400-bed university hospital where local HIV prevalence is 0.4% and 2) the effect on testing practices of national HIV testing guidelines, revised in March 2010, recommending Physician-Initiated Counselling and Testing (PICT). METHODS: Using 2 hospital databases, we determined the number of HIV tests performed by selected clinical services, and the number of patients tested as a percentage of the number seen per service ('testing rate'). To explore the effect of the revised national guidelines, we examined testing rates for two years pre- and two years post-PICT guideline publication. RESULTS: Combining the clinical services, 253,178 patients were seen and 9,183 tests were performed (of which 80 tested positive, 0.9%) in the four-year study period. The emergency department (ED) performed the second highest number of tests, but had the lowest testing rates (0.9-1.1%). Of inpatient services, neurology and psychiatry had higher testing rates than internal medicine (19.7% and 9.6% versus 8%, respectively). There was no significant increase in testing rates, either globally or in the majority of the clinical services examined, and no increase in new HIV diagnoses post-PICT recommendations. CONCLUSIONS: Using a simple two-database tool, we observe no global improvement in HIV testing rates in our hospital following new national guidelines but do identify services where testing practices merit improvement. This study may show the limit of PICT strategies based on physician risk assessment, compared to the opt-out approach.

Relevance:

40.00%

Publisher:

Abstract:

Performance and load testing of applications is a very important part of the production process today, and web applications are tested more and more. The need for performance and load testing is clear: correctly performed tests, and the corrective actions that follow them, guarantee that the environment under test works both now and in the future. However, testing large numbers of users manually is very difficult, and testing a fragmented environment, such as a service-based web application environment, is a challenge. The topic of this work is to evaluate tools and methods for testing heavy industrial web applications. The goal is to find testing methods that can reliably simulate large numbers of users, and to evaluate the effect of different connections and protocols on the performance of a web application.
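Simulating many concurrent users, as discussed above, can be sketched as follows. The `fake_request` function is a stand-in assumption; a real load test would issue HTTP requests against the system under test and record response times per user.

```python
# Minimal sketch of a load test: run many simulated users concurrently and
# collect their responses plus the total elapsed time.
from concurrent.futures import ThreadPoolExecutor
import time

def fake_request(user_id):
    # Stand-in for a real HTTP request; sleeps to mimic network latency.
    time.sleep(0.01)
    return 200  # stand-in for an HTTP status code

def run_load_test(n_users):
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=n_users) as pool:
        statuses = list(pool.map(fake_request, range(n_users)))
    elapsed = time.perf_counter() - start
    return statuses, elapsed

statuses, elapsed = run_load_test(50)
```

Because the 50 simulated users run in parallel, the elapsed time stays far below the 0.5 s that 50 sequential requests would take.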

Relevance:

40.00%

Publisher:

Abstract:

The concept of service-oriented architecture has been extensively explored in software engineering, because it produces architectures made up of several interconnected modules that are easy to reuse when building new systems. This approach to design would be impossible without interconnection mechanisms such as REST (Representational State Transfer) services, which allow module communication while minimizing coupling. However, this low coupling brings disadvantages, such as a lack of transparency, which makes it difficult to systematically create tests without knowledge of the inner workings of a system. In this article, we present an automatic error detection system for REST services, based on a statistical analysis of the responses produced by multiple service invocations. Thus, a service can be systematically tested without knowing its full specification. The method can find errors in REST services that could not be identified by traditional testing methods, and it provides limited testing coverage for services whose response format is unknown. It can also be useful as a complement to other testing mechanisms.
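The statistical idea described above can be sketched as follows. The baseline chosen here (the most common status code and the most common set of response fields) is an illustrative assumption, not the article's actual analysis.

```python
# Hypothetical sketch: invoke a REST service many times, treat the most
# frequent status code and response-field set as the baseline, and flag
# responses that deviate from it as suspected errors.
from collections import Counter

def detect_anomalies(responses):
    # responses: list of (status_code, body_dict) pairs from repeated calls.
    status_mode, _ = Counter(s for s, _ in responses).most_common(1)[0]
    field_sets = Counter(frozenset(b) for _, b in responses)
    common_fields, _ = field_sets.most_common(1)[0]
    return [i for i, (s, b) in enumerate(responses)
            if s != status_mode or frozenset(b) != common_fields]

responses = [
    (200, {'id': 1, 'name': 'a'}),
    (200, {'id': 2, 'name': 'b'}),
    (500, {'error': 'boom'}),      # deviates in both status and fields
    (200, {'id': 3, 'name': 'c'}),
]
suspects = detect_anomalies(responses)
```

Note that no specification of the service is needed: the baseline is learned entirely from the observed responses, which is what gives the method coverage over services with unknown response formats.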

Relevance:

40.00%

Publisher:

Abstract:

Automated Teller Machines (ATMs) are sensitive self-service systems that require significant investment in security and testing. ATM certifications are testing processes for machines that integrate software components from different vendors, performed before their deployment for public use. This project originated from the need to optimize the certification process in an ATM manufacturing company. The process identifies compatibility problems between software components through testing. It is composed of a huge number of manual user tasks, which makes the process very expensive and error-prone. Moreover, it is not possible to fully automate the process, as it requires human intervention to manipulate ATM peripherals. This project presented important challenges for the development team. First, this is a critical process, as all ATM operations rely on the software under test. Second, the context of use of ATM applications is vastly different from that of ordinary software. Third, an ATM's useful lifetime is beyond 15 years, and both new and old models need to be supported. Fourth, the know-how for efficient testing depends on each specialist and is not explicitly documented. Fifth, the huge number of tests and their importance imply the need for user efficiency and accuracy. All these factors led us to conclude that, besides the technical challenges, the usability of the intended software solution was critical to the project's success. This business context is the motivation of this Master's Thesis project. Our proposal focused on the development process applied. By combining user-centered design (UCD) with agile development, we ensured both the high priority of usability and the early mitigation of software development risks caused by the technology constraints. We performed 23 development iterations and were ultimately able to deliver a working solution on time and in line with users' expectations. 
The evaluation of the project was carried out through usability tests, in which 4 real users participated in different tests in the real context of use. The results were positive according to several metrics: error rate, efficiency, effectiveness, and user satisfaction. We discuss the problems found, the benefits, and the lessons learned in the process. Finally, we measured the expected project benefits by comparing the effort required by the current and the new process (once the new software tool is adopted). The savings corresponded to 40% less effort (man-hours) per certification. Future work includes additional evaluation of product usability in a real scenario (with customers) and the measurement of benefits in terms of quality improvement.

Relevance:

40.00%

Publisher:

Abstract:

A revision of a similar publication, AMS-16.