849 results for Service Programming Model


Relevance:

30.00%

Publisher:

Abstract:

Service compositions put together loosely coupled component services to perform more complex, higher-level, or cross-organizational tasks in a platform-independent manner. Quality-of-Service (QoS) properties, such as execution time, availability, or cost, are critical for their usability, and permissible boundaries for their values are defined in Service Level Agreements (SLAs). We propose a method whereby constraints that model SLA conformance and violation are derived at any given point of the execution of a service composition. These constraints are generated from the structure of the composition and the properties of the component services, which can be either known in advance or empirically measured. Violation of these constraints means that the corresponding scenario is infeasible, while satisfaction yields values for the constrained variables (start and end times of activities, or numbers of loop iterations) that make the scenario possible. These results can be used to perform optimized service matching or to trigger preventive adaptation or healing.
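As a concrete illustration, the sketch below derives conformance and violation conditions at a checkpoint of a purely sequential composition from per-service execution-time bounds and an SLA deadline. The data layout and function are invented for the example and only cover the sequential case; the method described in the abstract handles richer composition structures.

```python
# Illustrative sketch: classifying a scenario at a checkpoint of a purely
# sequential composition. Names and the representation are hypothetical.

def classify_scenario(elapsed, remaining_bounds, deadline):
    """remaining_bounds: list of (min_t, max_t) per pending component service."""
    best_case = elapsed + sum(lo for lo, _ in remaining_bounds)
    worst_case = elapsed + sum(hi for _, hi in remaining_bounds)
    if best_case > deadline:
        return "violation"    # even the fastest completion misses the SLA
    if worst_case <= deadline:
        return "conformance"  # every possible completion satisfies the SLA
    return "at-risk"          # satisfiable: trigger preventive adaptation

# Example: 3 pending services, 2.0 s already consumed, 10 s SLA deadline.
print(classify_scenario(2.0, [(1.0, 4.0), (0.5, 2.0), (1.0, 3.0)], 10.0))
```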

Relevance:

30.00%

Publisher:

Abstract:

The term "Logic Programming" refers to a variety of computer languages and execution models which are based on the traditional concept of Symbolic Logic. The expressive power of these languages offers promise to be of great assistance in facing the programming challenges of present and future symbolic processing applications in Artificial Intelligence, Knowledge-based systems, and many other areas of computing. The sequential execution speed of logic programs has been greatly improved since the advent of the first interpreters. However, higher inference speeds are still required in order to meet the demands of applications such as those contemplated for next generation computer systems. The execution of logic programs in parallel is currently considered a promising strategy for attaining such inference speeds. Logic Programming in turn appears as a suitable programming paradigm for parallel architectures because of the many opportunities for parallel execution present in the implementation of logic programs. This dissertation presents an efficient parallel execution model for logic programs. The model is described from the source language level down to an "Abstract Machine" level suitable for direct implementation on existing parallel systems or for the design of special purpose parallel architectures. Few assumptions are made at the source language level and therefore the techniques developed and the general Abstract Machine design are applicable to a variety of logic (and also functional) languages. These techniques offer efficient solutions to several areas of parallel Logic Programming implementation previously considered problematic or a source of considerable overhead, such as the detection and handling of variable binding conflicts in AND-Parallelism, the specification of control and management of the execution tree, the treatment of distributed backtracking, and goal scheduling and memory management issues, etc. A parallel Abstract Machine design is offered, specifying data areas, operation, and a suitable instruction set. This design is based on extending to a parallel environment the techniques introduced by the Warren Abstract Machine, which have already made very fast and space efficient sequential systems a reality. Therefore, the model herein presented is capable of retaining sequential execution speed similar to that of high performance sequential systems, while extracting additional gains in speed by efficiently implementing parallel execution. These claims are supported by simulations of the Abstract Machine on sample programs.

Relevance:

30.00%

Publisher:

Abstract:

The opportunities offered by high performance computing hold significant promise for enhancing the performance of real-time flood forecasting systems. In this paper, a real-time framework for probabilistic flood forecasting through data assimilation is presented. The distributed rainfall-runoff real-time interactive basin simulator (RIBS) model is selected to simulate the hydrological process in the basin. Although the RIBS model is deterministic, it is run in a probabilistic way using the results of a calibration performed in previous work by the authors, which identified the probability distribution functions that best characterise the most relevant model parameters. Adaptive techniques improve flood forecasts because the model can be adapted to observations in real time as new information becomes available. The new adaptive forecast model, based on genetic programming as a data assimilation technique, is compared with the previously developed flood forecast model based on the calibration results. Both models are probabilistic in that they generate an ensemble of hydrographs, taking into account the different uncertainties inherent in any forecast process. The Manzanares River basin was selected as a case study; the process is computationally intensive, as it requires the simulation of many replicas of the ensemble in real time.
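A hedged sketch of the ensemble mechanics described above: model parameters are drawn from calibrated probability distributions and one hydrograph is simulated per replica. A toy linear-reservoir model and an assumed log-normal parameter stand in for RIBS and its calibrated distributions.

```python
# Illustrative sketch of a probabilistic ensemble forecast: sample calibrated
# parameters, simulate one hydrograph per replica, summarise the spread.
import random

def linear_reservoir(rainfall, k):
    """Toy rainfall-runoff model: discharge from a linear reservoir."""
    storage, discharge = 0.0, []
    for r in rainfall:
        storage += r
        q = storage / k          # outflow proportional to storage
        storage -= q
        discharge.append(q)
    return discharge

rainfall = [0.0, 5.0, 12.0, 8.0, 2.0, 0.0]
ensemble = []
for _ in range(100):                      # 100 replicas of the ensemble
    k = random.lognormvariate(1.0, 0.3)   # assumed calibrated distribution
    ensemble.append(linear_reservoir(rainfall, k))

peaks = sorted(max(h) for h in ensemble)
print("peak discharge, 5th-95th percentile:", peaks[4], peaks[94])
```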

Relevance:

30.00%

Publisher:

Abstract:

Corrosion of reinforcing steel in concrete due to chloride ingress is one of the main causes of the deterioration of reinforced concrete structures. The structures most affected by such corrosion are buildings in marine zones and structures exposed to de-icing salts, such as highways and bridges. The process is accompanied by an increase in the volume of the corrosion products at the rebar-concrete interface. Depending on the level of oxidation, iron can expand to as much as six times its original volume. This increase in volume exerts tensile stresses in the surrounding concrete, which result in cracking and spalling of the concrete cover if the concrete tensile strength is exceeded. The mechanism by which steel embedded in concrete corrodes in the presence of chlorides is the local breakdown of the passive layer formed under the highly alkaline conditions of the concrete. It is assumed that corrosion initiates when a critical chloride content reaches the rebar surface. The mathematical formulation idealizes the corrosion sequence as a two-stage process: an initiation stage, during which chloride ions penetrate to the reinforcing steel surface and depassivate it, and a propagation stage, in which active corrosion takes place until cracking of the concrete cover occurs. The aim of this research is to develop computer tools to evaluate the duration of the service life of reinforced concrete structures, considering both the initiation and propagation periods. Such tools must offer a friendly interface so that they can be used by researchers whose background is not in numerical simulation. For the evaluation of the initiation period, different tools have been developed. Program TavProbabilidade provides the means to carry out a probability analysis of a chloride ingress model; such a tool is necessary because of the lack of data and the general uncertainties associated with the phenomenon of chloride diffusion. It differs from the deterministic approach in that it computes not just one chloride profile at a certain age, but a range of chloride profiles, each with its probability of occurrence. Program TavProbabilidade_Fiabilidade carries out reliability analyses of the initiation period. It takes into account the critical value of the chloride concentration at the steel that causes breakdown of the passive layer and the beginning of the propagation stage. It differs from the deterministic analysis in that it does not predict whether corrosion is going to begin or not, but quantifies the probability of corrosion initiation. Program TavDif_1D was created to perform a one-dimensional deterministic analysis of the chloride diffusion process by the finite element method (FEM), which numerically solves Fick's second law. Despite the various one-dimensional FEM solvers already developed, the decision to create a new code (TavDif_1D) was taken because of the need for a solver with a friendly pre- and post-processing interface, according to the needs of IETCC. An innovative tool was also developed, with a systematic method devised to compare the ability of different 1D models to predict the actual evolution of chloride ingress based on experimental measurements, and also to quantify the degree of agreement of the models with each other. For the evaluation of the entire service life of the structure, a computer program has been developed using the finite element method to couple the two service-life periods, initiation and propagation.
The 2D program (TavDif_2D) allows the complementary use of two external programs under a single friendly interface:
• GMSH, a finite element mesh generator and post-processing viewer;
• OOFEM, a finite element solver.
TavDif_2D is responsible for deciding, at each time step, when and where to start applying the boundary conditions of the fracture mechanics module, as a function of the chloride concentration and the corrosion parameters (Icorr, etc.). It is also responsible for verifying the presence and the degree of fracture in each element, in order to update the diffusion coefficient as a function of the crack width. The advantages of the FEM with the interface provided by the tool are:
• the flexibility to input data such as material properties and boundary conditions as time-dependent functions;
• the flexibility to predict the chloride concentration profile for different geometries;
• the possibility of coupling chloride diffusion (initiation stage) with chemical and mechanical behavior (propagation stage).
The OOFEM code had to be modified to accept temperature, humidity, and time-dependent values for the material properties, which is necessary to adequately describe the environmental variations. A 3-D simulation has been performed to simulate the behavior of a beam under both the action of the external load and the internal load caused by the corrosion products, using embedded-fracture elements, in order to plot the curve of the deflection of the central region of the beam versus the external load and compare it with the experimental data.
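For reference, below is a minimal explicit finite-difference sketch of the one-dimensional equation these tools solve, Fick's second law ∂C/∂t = D ∂²C/∂x², with a constant surface chloride concentration and a zero-flux condition at depth. All values are illustrative assumptions, and the actual programs use the finite element method rather than finite differences.

```python
# Minimal explicit finite-difference sketch of 1D chloride diffusion
# (Fick's second law, dC/dt = D * d2C/dx2). Values are illustrative.

D = 1e-12            # apparent diffusion coefficient, m^2/s (assumed)
L, nx = 0.05, 51     # 5 cm concrete cover discretised into 51 nodes
dx = L / (nx - 1)
dt = 0.4 * dx * dx / D          # stable explicit time step (factor <= 0.5)
C = [0.0] * nx
C[0] = 0.5                      # surface chloride content (assumed units)

t, horizon = 0.0, 10 * 365 * 24 * 3600.0   # simulate 10 years
while t < horizon:
    Cn = C[:]
    for i in range(1, nx - 1):
        Cn[i] = C[i] + D * dt / (dx * dx) * (C[i + 1] - 2 * C[i] + C[i - 1])
    Cn[0], Cn[-1] = 0.5, Cn[-2]  # Dirichlet at surface, zero flux at depth
    C, t = Cn, t + dt

print("chloride content at 20 mm depth after 10 years:", C[20])
```

Comparing the value at the rebar depth against a critical chloride threshold is what turns such a profile into an estimate of the initiation period.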

Relevance:

30.00%

Publisher:

Abstract:

The aim of this study was to evaluate the sustainability of farm irrigation systems in the Cébalat district in northern Tunisia. It addressed the challenging topic of sustainable agriculture through a bio-economic approach linking a biophysical model to an economic optimisation model. A crop growth simulation model (CropSyst) was used to build a database to determine the relationships between agricultural practices, crop yields and environmental effects (salt accumulation in soil and leaching of nitrates) in a context of high climatic variability. The database was then fed into a recursive stochastic model set for a 10-year plan that allowed analysis of the effects of cropping patterns on farm income, salt accumulation and nitrate leaching. We assumed that the long-term sustainability of soil productivity might be in conflict with short-term farm profitability. Assuming a discount rate of 10% (the base scenario), the model closely reproduced the current system and made it possible to predict the degradation of soil quality due to long-term salt accumulation. The results showed more salt accumulation in the soil for the base scenario than for the alternative scenario (discount rate of 0%), induced by the application of a higher quantity of water per hectare in the alternative scenario than in the base scenario. The results also showed that nitrogen leaching is very low for both discount rates and all climate scenarios. In conclusion, the results show that the difference in farm income between the alternative and base scenarios increases over time, reaching 45% after 10 years.
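A toy illustration of why the discount rate drives the result: the net present value of a 10-year income stream is compared at 10% and 0%. The income figures are invented; an "intensive" pattern earns more now but declines as salt accumulates, while a "conserving" pattern earns a stable income.

```python
# Toy sketch: the discount rate decides which cropping pattern "wins".
# Income streams are invented for illustration only.

def npv(incomes, rate):
    """Net present value of a yearly income stream at a given discount rate."""
    return sum(y / (1 + rate) ** t for t, y in enumerate(incomes))

intensive = [120 - 10 * t for t in range(10)]   # declining with salinity
conserving = [80] * 10                          # stable, soil-preserving

for rate in (0.10, 0.0):
    print(f"rate {rate:.0%}: intensive {npv(intensive, rate):6.1f}, "
          f"conserving {npv(conserving, rate):6.1f}")
# At 10% the intensive pattern has the higher NPV; at 0% conservation wins,
# which mirrors the short-term-profit vs long-term-sustainability conflict.
```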

Relevance:

30.00%

Publisher:

Abstract:

End-user development (EUD) is much hyped, and its impact has outstripped even the most optimistic forecasts. Even so, the vision of end users programming their own solutions has not yet materialized. This will continue to be so unless we, in both industry and the research community, set ourselves the ambitious challenge of devising, end to end, an end-user application development model for a new age of EUD tools. We have embarked on this venture, and this paper presents the main insights and outcomes of our research and development efforts as part of a number of successful EU research projects. Our proposal aims not only to reshape software engineering to meet the needs of EUD but also to refashion its components as solution building blocks instead of programs and software developments. This way, end users will really be empowered to build solutions based on artefacts akin to their expertise and their understanding of ideal solutions.

Relevance:

30.00%

Publisher:

Abstract:

One of the main challenges facing next generation Cloud platform services is the need to simultaneously achieve ease of programming, consistency, and high scalability. Big Data applications have so far focused on batch processing. The next step for Big Data is to move to the online world. This shift will raise the requirements for transactional guarantees. CumuloNimbo is a new EC-funded project led by Universidad Politécnica de Madrid (UPM) that addresses these issues via a highly scalable multi-tier transactional platform as a service (PaaS) that bridges the gap between OLTP and Big Data applications.

Relevance:

30.00%

Publisher:

Abstract:

The IP Multimedia Subsystem (IMS) is intended to provide multimedia services to users through an IP-based control plane. The current IMS service invocation mechanism, however, requires the Serving-Call Session Control Function (S-CSCF) to invoke each Application Server (AS) sequentially according to the user's service subscription profile, which results in a heavy load on the S-CSCF and a long session set-up delay. To solve this issue, this paper proposes a linear chained service invocation mechanism that invokes the ASs consecutively. By checking all the initial Filter Criteria (iFC) at once and adding the addresses of all involved ASs to the "Route" header, this new approach enables multiple services to be invoked as a linear chain during a session. We model the service invocation mechanisms as Jackson networks, which are validated through simulations. The analytic results verify that the linear chained service invocation mechanism can effectively reduce the session set-up delay of the service layer and decrease the load on the S-CSCF.
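A back-of-the-envelope sketch of this comparison, treating each node as an M/M/1 queue in a Jackson network (mean sojourn time 1/(μ − λ)): sequential invocation routes the session set-up message back through the S-CSCF after every AS, multiplying its arrival rate, while the linear chain traverses the S-CSCF once. All rates are invented for illustration, not taken from the paper's evaluation.

```python
# Illustrative M/M/1 / Jackson-network sketch: sequential vs chained
# invocation. All numbers are assumptions for the example.

def sojourn(mu, lam):
    """Mean M/M/1 sojourn time, 1 / (mu - lambda)."""
    assert lam < mu, "queue must be stable"
    return 1.0 / (mu - lam)

n_as = 4                         # ASs involved in the session (assumed)
lam = 50.0                       # session arrivals per second (assumed)
mu_cscf, mu_as = 400.0, 300.0    # service rates (assumed)

# Sequential: the S-CSCF is visited n_as + 1 times, so its effective
# arrival rate is (n_as + 1) * lam in the Jackson network.
seq_delay = (n_as + 1) * sojourn(mu_cscf, (n_as + 1) * lam) \
            + n_as * sojourn(mu_as, lam)
# Linear chain: one S-CSCF visit, then ASs forward along the Route header.
chain_delay = sojourn(mu_cscf, lam) + n_as * sojourn(mu_as, lam)

print(f"sequential set-up delay: {seq_delay * 1e3:.1f} ms")
print(f"chained set-up delay   : {chain_delay * 1e3:.1f} ms")
```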

Relevance:

30.00%

Publisher:

Abstract:

Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as the Internet of Things (IoT) and the Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who apply this Internet-oriented approach need a solid understanding of specific platforms and web technologies. To ease this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). The methodology aims to enable the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at the hardware level. The feasibility of ROOD is demonstrated by building an adaptive health monitoring service for a Smart Gym.
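A toy sketch of the model-driven idea: a declarative, platform-independent model of a smart-space resource is transformed into device-level code by a model-to-text template. The model fields and the generated class are invented for illustration and are not ROOD's actual metamodel or tooling.

```python
# Toy model-to-text transformation in the MDA spirit: a declarative device
# model is turned into executable code. Everything here is hypothetical.

device_model = {                  # platform-independent model (PIM)
    "name": "GymHeartRateSensor",
    "kind": "sensor",
    "unit": "bpm",
    "sample_period_s": 1,
}

TEMPLATE = '''\
class {name}:
    """Auto-generated web-enabled {kind} ({unit})."""
    SAMPLE_PERIOD_S = {sample_period_s}

    def read(self):
        raise NotImplementedError("bind to hardware driver here")
'''

code = TEMPLATE.format(**device_model)   # model-to-text transformation
print(code)
exec(code)                               # the generated class is now usable
```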

Relevance:

30.00%

Publisher:

Abstract:

The demand for new services, the emergence of new business models, insufficient innovation, the underestimation of customer loyalty, and reluctance to adopt new management practices are evidence of the deficiencies in, and the lack of research on, the relations between patients and dental clinics. In this article we propose the structure of a Relationship Marketing (RM) model for the dental clinic that integrates information from SERVQUAL, Customer Loyalty (CL) and RM activities, and combines the perspectives of dentist and patient. A first pilot study on dentists showed that they recognize the value of retaining their best patients but do not carry out RM actions to keep them; that they keep databases of patients, but these are not sophisticated enough to serve as RM tools; and that they perceive that patients value "Assurance" and "Empathy" (two dimensions of service quality). Finally, the dentists indicated that a loyal patient does not necessarily pay more for the service. The proposed model will be validated using Fuzzy Logic simulation, and the ultimate goal of this research line is to contribute a new definition of CL.

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a highly automated mechanism for easily building an undo facility into a new or existing system. Our proposal is based on the observation that, for a large set of operations, it is not necessary to store in-memory object states or executed system commands in order to undo an action; storing the input data is enough. This strategy greatly simplifies the design of the undo process and encapsulates most of the required functionality in a framework similar to the many existing object-oriented programming frameworks.
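A minimal sketch of the input-data strategy, assuming a toy text document: each operation logs only its input, and its inverse is derived from that input alone, with no object snapshot. The class and operations are hypothetical, not the paper's actual framework API.

```python
# Minimal sketch of input-data undo: log only the operation's input and
# derive the inverse from it. Names and operations are hypothetical.

class TextDocument:
    def __init__(self):
        self.text = ""
        self._log = []                       # (operation, input data) pairs

    def insert(self, pos, s):
        self.text = self.text[:pos] + s + self.text[pos:]
        self._log.append(("insert", (pos, s)))  # input suffices to undo

    def undo(self):
        op, data = self._log.pop()
        if op == "insert":                   # inverse derived from the input
            pos, s = data
            self.text = self.text[:pos] + self.text[pos + len(s):]

doc = TextDocument()
doc.insert(0, "hello ")
doc.insert(6, "world")
doc.undo()
print(doc.text)   # "hello " -- no object snapshot was ever stored
```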

Relevance:

30.00%

Publisher:

Abstract:

As with all forms of transport, the safety of air travel is of paramount importance. With the projected increases in European air traffic in the next decade and beyond, it is clear that the risk of accidents needs to be assessed and carefully monitored on a continuing basis. This thesis is aimed at the development of a comprehensive collision risk model as a method of assessing the European en-route risk, due to all causes and across all dimensions within the airspace. The major constraint in developing appropriate monitoring methodologies and tools to assess the level of safety in en-route airspaces, where controllers monitor air traffic by means of radar surveillance and provide aircraft with tactical instructions, lies in the estimation of the operational risk. The operational risk estimate normally relies on incident reports provided by the air navigation service providers (ANSPs). This thesis proposes a new and innovative approach to assessing the aircraft safety level based exclusively on the processing and analysis of radar tracks. The proposed methodology has been designed to complement the information collected in the accident and incident databases, thereby providing robust information on air traffic factors and safety metrics inferred from an automatic, in-depth assessment of all proximate events.
The 3-D CRM methodology is implemented in a prototype tool in MATLAB that automatically analyses the radar tracks and flight plans recorded by the Radar Data Processing systems (RDP) and identifies and analyses all proximate events (conflicts, potential conflicts and potential collisions) within a given time span and volume of airspace. The 3-D CRM prototype is currently being adapted and integrated into Aena's performance monitoring tool (PERSEO) to complement the ATM accident and incident databases, enhance monitoring, and provide evidence of the levels of safety.
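An illustrative sketch of the basic check behind such a tool: for each radar sweep, flag aircraft pairs whose horizontal and vertical separations both fall below the en-route minima. The 5 NM / 1000 ft values and the data layout are assumptions for the example; the 3-D CRM prototype works on full recorded trajectories and classifies events further.

```python
# Toy proximity-event detector over one radar sweep. Separation minima and
# the track representation are assumptions for illustration.
from itertools import combinations
from math import hypot

H_MIN_NM, V_MIN_FT = 5.0, 1000.0   # assumed en-route separation minima

def proximate_events(tracks):
    """tracks: {callsign: (x_nm, y_nm, altitude_ft)} for one radar sweep."""
    events = []
    for (a, pa), (b, pb) in combinations(tracks.items(), 2):
        horiz = hypot(pa[0] - pb[0], pa[1] - pb[1])
        vert = abs(pa[2] - pb[2])
        if horiz < H_MIN_NM and vert < V_MIN_FT:
            events.append((a, b, round(horiz, 2), vert))
    return events

sweep = {"IBE123": (10.0, 20.0, 35000), "RYR456": (13.0, 21.0, 35400)}
print(proximate_events(sweep))   # 3.16 NM / 400 ft -> flagged as proximate
```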

Relevance:

30.00%

Publisher:

Abstract:

This thesis is developed in the framework of the distributed execution of mobile services and contributes to the definition and development of the concept of the prosumer user. The prosumer user is characterized by using his or her mobile phone to create, provide and execute services. This new user model contributes to the advancement of the information society, as the prosumer is transformed from a producer of content into a producer of services (the latter consisting of content plus the logic to access, process and represent it). The overall goal of this thesis is to provide a model for the creation, distribution and execution of services for the mobile environment that enables non-programmers (prosumer users) who are experts in a given domain to create and execute their own applications and services. For this purpose I define, develop and implement methodologies, processes, algorithms and mechanisms, adaptable to specific domains, to build environments for the distributed execution of mobile services for prosumer users. The provision of creation tools adapted to non-expert users is a current trend being pursued in different research works. However, no service development methodology has yet been proposed that involves the prosumer user in the process of design, development, implementation and validation of services. This thesis studies the most innovative methodologies and technologies related to co-creation and relies on this analysis to define and validate a methodological approach that enables the user to be responsible for creating final services. Since mobile prosumer environments are a particular case of environments for the distributed execution of mobile services, this thesis investigates techniques for service adaptation, distribution and coordination and for resource access, identifying as requirements the problems posed by such environments and the characteristics of the participating users. I contribute to service adaptation by defining a variability model that supports the interdependence between user personalization decisions, incorporating guidance and error detection mechanisms. Service distribution is implemented using decomposition techniques based on SPQR trees, quantifying the impact of separating any service into distinct domains. Considering the communication level for the coordination of distributed service executions, I have identified several problems, such as link losses, connections, disconnections and the discovery of participants, which I solve using dissemination techniques based on publish-subscribe communication models and gossip algorithms. To achieve flexible distributed service execution in mobile environments, I support adaptation to changes in the availability of resources, while providing a communication infrastructure for uniform and efficient access to resources. Experimental validations have been conducted to assess the feasibility of the proposed solutions, defining relevant application scenarios (the new intelligent universe, the prosumerization of services in hospital environments, and emergencies in the web of things).
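A toy sketch of push-gossip dissemination, the family of algorithms mentioned above for coordinating distributed service execution: each informed node forwards the message to a few random peers per round, informing all N nodes in roughly log(N) rounds. The network model is invented for illustration, not the thesis implementation.

```python
# Toy push-gossip dissemination: informed nodes forward to `fanout` random
# peers each round until everyone is informed. Model is illustrative only.
import random

def gossip_rounds(nodes, fanout=3, seed_node=0, max_rounds=50):
    informed = {seed_node}
    rounds = 0
    while len(informed) < len(nodes) and rounds < max_rounds:
        rounds += 1
        for _ in list(informed):
            informed.update(random.sample(nodes, fanout))  # push to peers
    return rounds

nodes = list(range(100))
print("rounds to inform all 100 nodes:", gossip_rounds(nodes))  # ~log(N)
```

The appeal of this scheme in the mobile setting is that it needs no global membership view and tolerates the disconnections and participant churn listed above.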

Relevance:

30.00%

Publisher:

Abstract:

The use of semantic and Linked Data technologies for Enterprise Application Integration (EAI) has been increasing in recent years. Linked Data and Semantic Web technologies such as the Resource Description Framework (RDF) data model provide several key advantages over the current de facto Web Service and XML based integration approaches. The flexibility provided by representing the data in the more versatile RDF model using ontologies makes it possible to avoid complex schema transformations, makes data more accessible using Web standards, and prevents the formation of data silos. These three benefits represent an edge for Linked Data-based EAI. However, work still has to be done so that these technologies can cope with the particularities of EAI scenarios in terms such as data control, ownership, consistency, or accuracy. The first part of the paper provides an introduction to Enterprise Application Integration using Linked Data and to the requirements that EAI imposes on Linked Data technologies, focusing on one of the problems that arise in this scenario, the coreference problem, and presents a coreference service that supports the use of Linked Data in EAI systems. The proposed solution introduces the use of a context that aggregates a set of related identities and mappings from the identities to different resources that reside in distinct applications and provide different views or aspects of the same entity. A detailed architecture of the Coreference Service is presented, explaining how it can be used to manage the contexts, identities, resources, and applications to which they relate. The paper shows how the proposed service can be utilized in an EAI scenario using an example involving a dashboard that integrates data from different systems, together with the proposed workflow for registering and resolving identities. As most enterprise applications are driven by business processes and involve legacy data, the proposed approach can be easily incorporated into enterprise applications.
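A toy in-memory sketch of the context idea: a context aggregates the identities under which one entity is known and maps each identity to the resources representing it in distinct applications. The API and identifiers are hypothetical, for illustration only, not the paper's Coreference Service interface.

```python
# Toy coreference registry: contexts group identities, and each identity
# maps to resources in distinct applications. Names are hypothetical.

class CoreferenceService:
    def __init__(self):
        self.contexts = {}      # context id -> {identity: [(app, resource)]}

    def register(self, context, identity, application, resource_uri):
        ctx = self.contexts.setdefault(context, {})
        ctx.setdefault(identity, []).append((application, resource_uri))

    def resolve(self, context, identity):
        """All resources, across applications, coreferent with this identity."""
        ctx = self.contexts.get(context, {})
        if identity not in ctx:
            return []
        return [r for resources in ctx.values() for r in resources]

svc = CoreferenceService()
svc.register("ctx:customer-42", "crm:cust/42", "CRM", "http://crm/cust/42")
svc.register("ctx:customer-42", "erp:party/7", "ERP", "http://erp/party/7")
print(svc.resolve("ctx:customer-42", "crm:cust/42"))  # both views of the entity
```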

Relevance:

30.00%

Publisher:

Abstract:

Stochastic model updating must be considered for quantifying the uncertainties inherent in real-world engineering structures. In this way, the statistical properties of structural parameters, rather than deterministic values, can be sought, indicating the parameter variability. However, the implementation of stochastic model updating is much more complicated than that of deterministic methods, particularly with regard to theoretical complexity and low computational efficiency. This study proposes a simple and cost-efficient method that decomposes the stochastic updating process into a series of deterministic ones with the aid of response surface models and Monte Carlo simulation. The response surface models are used as surrogates for the original FE models in the interest of programming simplicity, fast response computation and easy inverse optimization. Monte Carlo simulation is adopted to generate samples from the assumed or measured probability distributions of the responses. Each sample corresponds to an individual deterministic inverse process that predicts deterministic parameter values. The parameter means and variances can then be statistically estimated from the parameter predictions over all the samples. Meanwhile, the analysis-of-variance approach is employed to evaluate the significance of the parameter variability. The proposed method is demonstrated first on a numerical beam and then on a set of nominally identical steel plates tested in the laboratory. Compared with existing stochastic model updating methods, the proposed method provides similar accuracy, while its primary merits are its simple implementation and its cost efficiency in response computation and inverse optimization.
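A toy sketch of the proposed decomposition on a one-parameter problem: a quadratic response surface fitted to a few "FE runs" serves as the surrogate, Monte Carlo draws samples of the measured response, each sample is inverted deterministically on the surrogate, and the parameter mean and variance are estimated from all predictions. The toy response f(p) = 2√p and all numbers are invented for illustration.

```python
# Toy decomposition of stochastic updating into deterministic inversions via
# a response surface surrogate and Monte Carlo sampling. Numbers illustrative.
import random
import numpy as np

# 1) A few "FE runs" at design points (toy response f(p) = 2*sqrt(p) stands
#    in for the finite element model).
design = np.array([1.0, 2.0, 3.0, 4.0])
runs = 2.0 * np.sqrt(design)

# 2) Quadratic response surface fitted by least squares (the surrogate).
coeffs = np.polyfit(design, runs, 2)
surface = lambda p: np.polyval(coeffs, p)

# 3) Monte Carlo over the measured response distribution; each sample is
#    inverted deterministically on the cheap surrogate (no FE call in loop).
def invert(target, lo=0.5, hi=5.0, steps=60):
    for _ in range(steps):               # bisection; surface is monotone here
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if surface(mid) < target else (lo, mid)
    return 0.5 * (lo + hi)

samples = [invert(random.gauss(2.83, 0.10)) for _ in range(2000)]
print(f"parameter mean {np.mean(samples):.3f}, "
      f"variance {np.var(samples, ddof=1):.4f}")   # true parameter is ~2.0
```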