907 results for web processing service (WPS)
Abstract:
Dynamic composition of services provides the ability to build complex distributed applications at run time by combining existing services, thus coping with a large variety of complex requirements that cannot be met by individual services alone. However, with the increasing number of available services that differ in granularity (amount of functionality provided) and quality, selecting the best combination of services becomes very complex. In response, this paper addresses the challenges of service selection and makes a twofold contribution. First, a rich representation of compositional planning knowledge is provided, allowing the expression of multiple decompositions of tasks at arbitrary levels of granularity. Second, two distinct search space reduction techniques are introduced; applying them prior to service selection yields a significant improvement in selection performance in terms of execution time, as demonstrated by experimental results.
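The abstract does not detail the reduction techniques themselves, but a minimal sketch of the general idea, pruning candidate services that are dominated on every quality dimension before exhaustively scoring combinations, might look like the following (the task names, services, and quality figures are invented for illustration):

    from itertools import product

    # Hypothetical candidate services per task, as (name, cost, response_time_ms).
    candidates = {
        "book_flight": [("f1", 100, 300), ("f2", 120, 250), ("f3", 150, 400)],
        "book_hotel":  [("h1", 80, 200), ("h2", 90, 500), ("h3", 70, 250)],
    }

    def prune_dominated(services):
        """Drop any service that another service beats on both cost and time."""
        return [s for s in services
                if not any(o != s and o[1] <= s[1] and o[2] <= s[2]
                           for o in services)]

    # Search-space reduction prior to selection: f3 and h2 are dominated.
    reduced = {task: prune_dominated(svcs) for task, svcs in candidates.items()}

    # Exhaustive selection over the reduced space: minimise total cost + time.
    best = min(product(*reduced.values()),
               key=lambda combo: sum(c + t for _, c, t in combo))
    print(best)  # (('f2', 120, 250), ('h1', 80, 200))

Because a dominated service can never appear in an optimal combination under a monotone objective, this kind of reduction shrinks the search space without excluding the optimum.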
Abstract:
Smart water metering technologies for residential buildings offer, in principle, great opportunities for sustainable urban water management. However, much of this potential is as yet unrealized. Although several ICT solutions have already been deployed aiming at optimal operations on the water utility side (e.g. real-time control of water networks, dynamic pump scheduling, etc.), little work has been done to date on the consumer side. This paper presents a web-based platform targeting primarily the household end user. The platform enables consumers to monitor, in real time, the water demand of their household, providing feedback not only on the total water consumption and relevant costs but also on the efficiency (or otherwise) of specific indoor and outdoor uses. Targeting the reduction of consumption, the provided feedback is combined with notifications about possible leakages/bursts and customised suggestions to improve the efficiency of existing household uses. It also enables various comparisons, with past consumption or even with that of similar households, aiming to further motivate the householder to become an active player in the water efficiency challenge. The issue of enhancing the platform's functionality with energy time series is also discussed in view of recent advances in smart metering and the concept of "smart cities". The paper presents a prototype of this web-based application and critically discusses first testing results and insights. It also presents the way in which the platform communicates with central databases at the water utility level. It is suggested that such developments are closing the gap between technology availability and usefulness to end users and could help both the uptake of smart metering and awareness raising, potentially leading to significant reductions in urban water consumption. The work has received funding from the European Union FP7 Programme through the iWIDGET Project, under grant agreement no. 318272.
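The abstract does not specify how leakage notifications are computed; one common heuristic, shown here as a hypothetical sketch rather than the platform's actual rule, flags a possible leak when metered flow never drops to zero during night hours:

    # Hypothetical night-time readings in litres/hour (01:00-06:00).
    night_readings_lph = [2.1, 1.9, 2.0, 2.2, 1.8, 2.0]

    def possible_leak(readings, threshold_lph=0.5):
        """Flag a leak when every night-time reading stays above the threshold."""
        return all(r > threshold_lph for r in readings)

    if possible_leak(night_readings_lph):
        print("Notify household: continuous night-time flow suggests a leak.")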
Abstract:
The online services industry was characterized by a high volume of mergers and acquisitions in the period from 2005 to 2015. The market leaders, Apple, Google, and Microsoft, incorporated this form of inorganic growth into their corporate strategies. This thesis examines the merger and acquisition activities of these three companies, focusing on two main aspects. First, it aims to address a gap in the academic literature concerning the connection between these companies' corporate strategies and their merger and acquisition decisions. Second, it aims to estimate possible future developments in the sector. Case studies were developed through qualitative content analysis of the companies' publications, market analysis reports, and other third-party material. The results trace the strategic positioning of Apple, Google, and Microsoft within the online services market between 2005 and 2015. The recurring mergers and acquisitions are analyzed with respect to these companies' corporate strategies and their responsiveness to the activities of their competitors. The results reveal aggressive merger and acquisition activity in strategic groups shared by the three companies, especially in the market for mobile communication devices and communication services.
Abstract:
The adoption of alarm management software is essential in organizations, especially in hospital and security contexts, given the speed with which alarms must be processed in the critical environments where they are generated. In recent years, there has been a strong push for guidelines recommending the use of alarm management software, so that organizations are prepared to handle problematic situations and to provide a quality service. The phenomenon of ubiquitous computing, driven by the massive use of the Web and of mobile devices, has significantly changed the way people communicate and share information. Organizations that develop alarm management systems have consequently become aware of the need to invest resources in migrating their desktop applications to the Web and to mobile devices. Connexall is one of the most widely adopted alarm management solutions on the market; however, it lacks software applications focused on the Web and on mobile devices. The goal of this master's project is therefore to develop two alarm management applications integrated with Connexall through a Web service: Active Alarm Client Plus for Android and Device Assignment Client for the Web. This project is intended to broaden the range of computing devices on which Connexall can be used, promoting ubiquitous access to and sharing of information in the alarm management context.
Abstract:
This master's dissertation investigated the performance and quality of web sites. The goal of the research is to propose an integrated model for evaluating digital information services on educational web sites. The research universe comprised eighteen Brazilian universities offering graduate courses, at the master's and doctoral levels, in the field of Production Engineering. The methodology adopted was descriptive and exploratory research, using systematic observation and focus groups for data collection, with dependent and independent variables, through the application of two research instruments. The analysis protocol was the instrument adopted for the evaluation and collection of the qualitative results, while the analysis grid was applied for the evaluation and collection of the quantitative results. The qualitative results identified a lack of standardization across web sites with respect to content, information hierarchy, and the design of colors and typography. The absence of accessibility for people with hearing and visual impairments was also observed, as well as the lack of media convergence and assistive technologies. The language of the sites was also evaluated, and all presented Portuguese as their only language. The overall result is shown in graphs and tables ranking the universities, with the grade "Good" predominating. For the quantitative results, statistical analysis was used to obtain descriptive and inferential results relating the dependent and independent variables. As a category of analysis of the services of the evaluated sites, scores and a weighted general index were computed. These results served as the basis for ranking the universities according to the existence or absence of service information on their web sites. In the inferential analysis, the independent variables (level, CAPES concept, and age of the program) were tested for correlation or association with the characteristics, called service categories. For this analysis, the statistical methods used were Spearman's coefficient and Fisher's exact test. Only the category "disciplines of the master's program" showed significance with the independent variable CAPES concept. The main conclusion of this study was the absence of standardization regarding the subjective aspects (design, information hierarchy, navigability, and content precision) and the absence of accessibility and convergence. As for the quantitative aspects, the information services offered by the web sites of the evaluated universities still do not present satisfactory and comprehensive quality. There is a perceived absence of strategies, adoption of web tools, institutional marketing techniques, and services that would make the sites more interactive and navigable, with aggregate value.
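For readers unfamiliar with the two statistical tests named above, a small example shows how Spearman's coefficient and Fisher's exact test are typically computed; the data values are invented, not the dissertation's:

    from scipy.stats import spearmanr, fisher_exact

    # Hypothetical data: CAPES concept per program vs. web-site service score.
    capes_concept = [3, 4, 5, 4, 6, 5]
    service_score = [12, 15, 20, 14, 25, 18]

    rho, p_value = spearmanr(capes_concept, service_score)
    print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")

    # Fisher's exact test on a 2x2 table: service information present/absent
    # versus program level (master's only vs. master's + doctorate).
    table = [[7, 2],
             [3, 6]]
    odds_ratio, p_value = fisher_exact(table)
    print(f"Fisher exact p = {p_value:.3f}")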
Abstract:
The progress of the Internet and telecommunications has been changing the concepts of Information Technology (IT), especially with regard to outsourced services, through which organizations seek cost cuts and a better focus on their business. Along with the development of such outsourcing, a new model named Cloud Computing (CC) evolved, which proposes migrating both data processing and information storage to the Internet. Among the key points of Cloud Computing are cost cuts, benefits, risks, and changes to IT paradigms. Nonetheless, the adoption of this model creates difficulties for decision-making by IT managers, mainly with regard to which solutions may go to the cloud and which service providers are most appropriate to the organization's reality. The overall aim of this research is to apply the AHP (Analytic Hierarchy Process) method to decision-making in Cloud Computing. The methodology used was exploratory, with a case study applied to a nationwide organization (Federation of Industries of RN). Data collection was performed through two structured questionnaires answered electronically by IT technicians and the company's Board of Directors. The analysis of the data was carried out in a qualitative and comparative way, using Web-HIPRE, a software implementation of the AHP method. The results confirmed the importance of applying the AHP method to decision-making on the adoption of Cloud Computing, mainly because, at the time the research was carried out, the studied company already showed interest in and need for adopting CC, given the internal problems with infrastructure and availability of information that the company faces. The organization sought to adopt CC but was in doubt about which cloud model and which service provider would better meet its real necessities. The application of AHP then worked as a guiding tool for choosing the best alternative, pointing to the Hybrid Cloud as the ideal choice for starting off in Cloud Computing, considering the following aspects: the Infrastructure as a Service (IaaS) layer (processing and storage) should stay partly in the Public Cloud and partly in the Private Cloud; the Platform as a Service (PaaS) layer (software development and testing) showed a preference for the Private Cloud; and the Software as a Service (SaaS) layer was divided, with e-mail going to the Public Cloud and applications to the Private Cloud. The research also identified the important factors when hiring a Cloud Computing provider.
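As an illustration of the AHP mechanics the study relies on, the sketch below derives a priority vector from a pairwise comparison matrix using the common geometric-mean approximation; the criteria and judgments are hypothetical, not those of the case study:

    import numpy as np

    criteria = ["cost", "security", "availability"]  # hypothetical criteria

    # Pairwise comparison matrix on Saaty's 1-9 scale: A[i][j] states how much
    # more important criterion i is than criterion j.
    A = np.array([[1.0, 3.0, 2.0],
                  [1/3, 1.0, 1/2],
                  [1/2, 2.0, 1.0]])

    # Priority vector: normalised geometric means of the rows.
    gm = np.prod(A, axis=1) ** (1 / A.shape[1])
    weights = gm / gm.sum()
    for name, w in zip(criteria, weights):
        print(f"{name}: {w:.3f}")

In a full AHP run, the alternatives (e.g. public, private, and hybrid cloud) are scored the same way under each criterion and the weighted scores are aggregated to rank them.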
Abstract:
The popularization of the Internet has stimulated the appearance of search engines whose objective is to aid users in the process of researching information on the Web. However, it is common for users to make queries and receive results that do not satisfy their initial needs. The Information Retrieval in Context (IRiX) technique allows information related to a specific theme to be associated with the initial user query, enabling better results. This study presents a prototype of a search engine based on contexts built from linguistic groupings and on relationships defined by the user. The context information can be shared with other software and tool users, with the objective of promoting the socialization of contexts.
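A minimal sketch of the underlying idea, enriching a query with terms from an active context, might look as follows; the context vocabulary is invented for illustration, as the prototype's actual mechanism is not described in the abstract:

    # Invented context vocabulary for illustration.
    contexts = {
        "jaguar_animal": ["wildlife", "felidae", "habitat"],
        "jaguar_car": ["automobile", "dealer", "engine"],
    }

    def expand_query(query, active_context):
        """Append the active context's terms to the user's query."""
        return query + " " + " ".join(contexts[active_context])

    print(expand_query("jaguar speed", "jaguar_animal"))
    # -> jaguar speed wildlife felidae habitat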
Abstract:
The use of Geographic Information Systems (GIS) has become very important in fields where detailed and precise study of earth surface features is required. Environmental protection is one such application, requiring GIS tools for analysis and decision-making by managers and the enrolled community of protected areas. In this specific field, a remaining challenge is to build a GIS that can be dynamically fed with data, allowing researchers and other agents to retrieve current and up-to-date information. In some cases, data are acquired in several ways and come from different sources. To address this problem, tools were implemented that include a model for spatial data treatment on the Web. The research issues involved start with the feeding and processing of environmental control data collected in loco, such as biotic and geological variables, and finish with the presentation of all information on the Web. For this dynamic processing, tools were developed that make MapServer more flexible and dynamic, allowing data uploading by the appropriate users. Furthermore, a module that uses interpolation to support spatial data analysis was also developed. A complex application that validated this research feeds the system with data coming from coral reef regions located in the northeast of Brazil. The system was implemented following the interactivity concept provided by the AJAX model, and resulted in a substantial contribution to efficient access to information, constituting an essential mechanism for controlling events in environmental monitoring.
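The abstract does not name the interpolation method used by the module; inverse distance weighting (IDW) is shown below as one common choice for scattered point samples such as in-loco environmental measurements (the sample data are hypothetical):

    import math

    # Hypothetical in-loco samples: (x, y, measured_value).
    samples = [(0.0, 0.0, 10.0), (1.0, 0.0, 20.0), (0.0, 1.0, 15.0)]

    def idw(x, y, points, power=2.0):
        """Estimate the value at (x, y) as a distance-weighted mean of samples."""
        num = den = 0.0
        for px, py, v in points:
            d = math.hypot(x - px, y - py)
            if d == 0.0:
                return v  # exact hit on a sample point
            w = 1.0 / d ** power
            num += w * v
            den += w
        return num / den

    print(idw(0.5, 0.5, samples))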
Abstract:
The control of industrial processes has become increasingly complex due to the variety of factory devices, quality requirements, and market competition. Such complexity requires a large amount of data to be handled by the three levels of process control: field devices, control systems, and management software. Using data effectively at each of these levels is extremely important to industry. Many of today's industrial computer systems consist of distributed software systems written in a wide variety of programming languages and developed for specific platforms, so ever more companies make significant investments to maintain or even rewrite their systems for different platforms. Furthermore, it is rare for a software system to work in complete isolation. In industrial automation it is common for software to have to interact with other systems on different machines, even systems written in different languages. Thus, interoperability is not just a long-term challenge, but also a requirement of current industrial software production. This work proposes a middleware solution for communication over web services and presents a use case applying the developed solution to an integrated system for industrial data capture, allowing such data to be available in a simplified and platform-independent way across the network.
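As a minimal sketch of the idea, not the thesis's actual middleware, the following exposes captured field data over HTTP/JSON so that clients on any platform and in any language can consume it (the endpoint and readings are invented):

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Hypothetical latest readings captured from field devices.
    READINGS = {"line_1": {"temperature_c": 72.5, "pressure_kpa": 101.3}}

    class DataHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Serve the captured data as JSON, readable from any platform.
            body = json.dumps(READINGS).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("localhost", 8080), DataHandler).serve_forever()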
Abstract:
Fiber reinforced epoxy composites are used in a wide variety of applications in the aerospace field. These materials have high specific moduli and high specific strength, and their properties can be tailored to application requirements. In order to screen for optimum material behavior, the effects of external environments on the mechanical properties during usage must be clearly understood. Environmental action, such as high moisture concentration, high temperatures, corrosive fluids, or ultraviolet (UV) radiation, can affect the performance of advanced composites during service. These factors can limit the applications of composites by deteriorating the mechanical properties over a period of time. Property deterioration is attributed to chemical and/or physical damage in the polymer matrix, loss of adhesion at the fiber/resin interface, and/or reduction of fiber strength and stiffness. The dynamic elastic properties are important characteristics of glass fiber reinforced composites (GFRC): they control the damping behavior of composite structures and are also an ideal tool for monitoring the development of GFRC mechanical properties during processing or service. One of the most used tests is vibration damping. In this work, the measurement consisted of recording the vibration decay of a rectangular plate excited by a controlled mechanism, in order to identify the elastic and damping properties of the material under test. The frequency and amplitude were measured by accelerometers and computed using a digital method. The present studies were performed to explore the relations between the dynamic mechanical properties, the damping test, and the influence of high moisture concentration on glass fiber reinforced composites (plain weave). The results show that E' decreased with increasing exposure time for glass fiber/epoxy composite specimens exposed at 80 degrees C and 90% RH. The E' values found were 26.7, 26.7, 25.4, 24.7 and 24.7 GPa for 0, 15, 30, 45 and 60 days of exposure, respectively. (c) 2005 Springer Science + Business Media, Inc.
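The abstract does not give the formulas used, but the logarithmic decrement is the standard way to extract damping from a recorded free-vibration decay; a small worked sketch with hypothetical peak amplitudes:

    import math

    # Hypothetical successive peak amplitudes from a recorded decay.
    peaks = [1.00, 0.82, 0.67, 0.55, 0.45]

    # Logarithmic decrement over n cycles: delta = (1/n) * ln(A_0 / A_n)
    n = len(peaks) - 1
    delta = math.log(peaks[0] / peaks[-1]) / n

    # Damping ratio: zeta = delta / sqrt(4*pi^2 + delta^2)
    zeta = delta / math.sqrt(4 * math.pi**2 + delta**2)
    print(f"log decrement = {delta:.3f}, damping ratio = {zeta:.4f}")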
Abstract:
This thesis presents πSOD-M (Policy-based Service Oriented Development Methodology), a methodology for modeling reliable service-based applications using policies. It proposes a model-driven method with: (i) a set of meta-models for representing non-functional constraints associated with service-based applications, from a use case model to a service composition model; (ii) a platform providing guidelines for expressing the composition and the policies; (iii) model-to-model and model-to-text transformation rules for semi-automating the implementation of reliable service-based applications; and (iv) an environment that implements these meta-models and rules and enables the application of πSOD-M. This thesis also presents a classification and nomenclature for the non-functional requirements of developing service-oriented applications. Our approach is intended to add value to the development of service-oriented applications that have quality requirements. This work uses concepts from the areas of service-oriented development, non-functional requirements design, and model-driven development to propose a solution that minimizes the problem of reliable service modeling. Some examples are developed as proofs of concept.
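As a hypothetical miniature of a model-to-text transformation in the spirit of item (iii), the sketch below turns a non-functional policy in a toy model into a generated policy stub; all names and the generated syntax are invented for illustration:

    # Toy model with one non-functional policy attached to a service.
    model = {
        "service": "PaymentService",
        "policies": [{"type": "retry", "max_attempts": 3}],
    }

    def to_text(m):
        """Generate a policy stub (invented syntax) from the model."""
        lines = [f"service {m['service']} {{"]
        for p in m["policies"]:
            args = ", ".join(f"{k}={v}" for k, v in p.items() if k != "type")
            lines.append(f"    policy {p['type']}({args});")
        lines.append("}")
        return "\n".join(lines)

    print(to_text(model))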
Abstract:
Recently, the focus given to Web Services and Semantic Web technologies has led to several research projects taking different approaches to the Web service composition issue. Meanwhile, creating an environment that allows the specification of an abstract business process that is automatically implemented by a composite service in a dynamic way is still considered an open problem. WSDL and BPEL, provided by industry, support only manual service composition because they lack the semantics needed for Web services to be discovered, selected, and combined by software agents. Service ontologies provided by the Semantic Web enrich the syntactic descriptions of Web services to facilitate the automation of tasks such as discovery and composition. This work presents WebFlowAH, an environment for specifying and executing, ad hoc, Web service-based business processes. WebFlowAH employs a common domain ontology to describe both Web services and business processes. It allows processes to be specified in terms of user goals or desires expressed using the concepts of this common domain ontology. This approach allows processes to be specified in an abstract, high-level way, unburdening the user from the underlying details needed to effectively run the process workflow.
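A minimal sketch of ontology-driven discovery in the sense described, where services and goals share concepts from a common domain ontology, might look as follows (the registry and concepts are invented):

    # Invented service registry annotated with ontology concepts.
    services = {
        "FlightBooker": {"inputs": {"City", "Date"}, "output": "FlightTicket"},
        "HotelBooker":  {"inputs": {"City", "Date"}, "output": "HotelBooking"},
    }

    def discover(goal_concept, known_concepts, registry):
        """Return services whose inputs are covered and whose output meets the goal."""
        return [name for name, s in registry.items()
                if s["inputs"] <= known_concepts and s["output"] == goal_concept]

    print(discover("FlightTicket", {"City", "Date"}, services))  # ['FlightBooker']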
Abstract:
The visualization of three-dimensional (3D) images is increasingly used in medicine, helping physicians diagnose disease. Advances in the scanners used to acquire these 3D exams, such as computerized tomography (CT) and magnetic resonance imaging (MRI), enable the generation of images with higher resolutions and, thus, much larger files. Currently, rendering these images is computationally expensive, demanding a high-end computer for the task. Direct remote access to these images through the Internet is also inefficient, since all images have to be transferred to the user's equipment before the 3D visualization process can start. With these problems in mind, this work proposes and analyzes a solution for the remote rendering of 3D medical images, called Remote Rendering 3D (RR3D). In RR3D, the whole rendering process is performed on a server, or a cluster of servers, with high computational power, and only the resulting image is transferred to the client, while still allowing the client to perform operations such as rotation and zoom. The solution was developed using web services written in Java and an architecture based on the scientific visualization package ParaView, the ParaViewWeb framework, and the PACS server DCM4CHEE. The solution was tested in two scenarios, in which the rendering process was performed by a server with graphics hardware (GPUs) and by a server without GPUs. In the scenario without GPUs, the solution was executed in parallel with varying numbers of cores (processing units) dedicated to it. In order to compare our solution to other medical visualization applications, a third scenario was used in which the rendering process was done locally. In all three scenarios, the solution was tested at different network speeds. The solution satisfactorily solved the problem of the delay in transferring DICOM files, while allowing the use of low-end computers, and even tablets and smartphones, as clients for visualizing the exams.
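A hypothetical client-side sketch of the remote-rendering idea, in which only camera parameters go up and a small 2D image comes back so the DICOM volume never leaves the server; the endpoint and request fields below are invented, while the thesis's actual server side uses ParaView, ParaViewWeb, and DCM4CHEE:

    import requests

    def render_remote(server_url, study_id, azimuth, elevation, zoom):
        """Ask the server to render one view; only a small image comes back."""
        resp = requests.post(f"{server_url}/render", json={
            "study": study_id,  # the DICOM volume stays on the server
            "camera": {"azimuth": azimuth, "elevation": elevation, "zoom": zoom},
        })
        resp.raise_for_status()
        return resp.content  # rendered PNG bytes

    # png = render_remote("http://render-cluster:8080", "CT-0042", 30, 15, 1.2)
    # open("view.png", "wb").write(png)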