996 results for Execution context


Relevance:

70.00%

Publisher:

Abstract:

The dynamicity and heterogeneity that characterize pervasive environments raise new challenges in the design of mobile middleware: pervasive environments exhibit a degree of heterogeneity, variability, and dynamicity that conventional middleware solutions are not able to manage adequately. Originally designed for use in a relatively static context, such middleware systems tend to hide low-level details to provide applications with a transparent view of the underlying execution platform. In mobile environments, however, the context is extremely dynamic and cannot be managed by a priori assumptions. Novel middleware should therefore support mobile computing applications in the task of adapting their behavior to frequent changes in the execution context, that is, it should become context-aware. In particular, this thesis has identified the following key requirements for novel context-aware middleware that existing solutions do not yet fulfil. (i) Middleware solutions should support interoperability between possibly unknown entities by providing expressive representation models that describe interacting entities, their operating conditions, and the surrounding world, i.e., their context, with unambiguous semantics. (ii) Middleware solutions should support distributed applications in the task of reconfiguring and adapting their behavior/results to ongoing context changes. (iii) Context-aware middleware support should be deployable on heterogeneous devices under variable operating conditions, such as different user needs, application requirements, available connectivity and device computational capabilities, as well as changing environmental conditions. Our main claim is that the adoption of semantic metadata to represent context information and context-dependent adaptation strategies makes it possible to build context-aware middleware suitable for all dynamically available portable devices. Semantic metadata provide powerful knowledge representation means to model even complex context information, and support automated reasoning to infer additional and/or more complex knowledge from the available context data. In addition, we suggest that, by adopting proper configuration and deployment strategies, semantic support features can be provided to differentiated users and devices according to their specific needs and current context. This thesis has investigated novel design guidelines and implementation options for semantic-based context-aware middleware solutions targeted at pervasive environments. These guidelines have been applied to different application areas within pervasive computing that would particularly benefit from the exploitation of context. Common to all applications is the key role of context in enabling mobile users to personalize applications based on their needs and current situation. The main contributions of this thesis are (i) the definition of a metadata model to represent and reason about context, (ii) the definition of a model for the design and development of context-aware middleware based on semantic metadata, (iii) the design of three novel middleware architectures and the development of a prototype implementation for each of these architectures, and (iv) the proposal of a viable approach to the portability issues raised by the adoption of semantic support services in pervasive applications.
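
As a rough illustration of the claim about semantic metadata, the following minimal Python sketch represents context as subject-predicate-object triples and derives extra knowledge with one transitive-containment rule; the class and predicate names (ContextStore, isIn, partOf) are hypothetical and not the thesis's actual metadata model.

# Minimal sketch: context as semantic triples plus one simple inference rule.
# The vocabulary (isIn, partOf) is hypothetical, not the thesis's model.

class ContextStore:
    def __init__(self):
        self.triples = set()          # (subject, predicate, object)

    def add(self, s, p, o):
        self.triples.add((s, p, o))

    def query(self, s=None, p=None, o=None):
        return [(ts, tp, to) for (ts, tp, to) in self.triples
                if (s is None or ts == s)
                and (p is None or tp == p)
                and (o is None or to == o)]

    def infer_location(self):
        # Rule: if X isIn Y and Y partOf Z, then X isIn Z (transitive containment).
        changed = True
        while changed:
            changed = False
            for (x, _, y) in self.query(p="isIn"):
                for (_, _, z) in self.query(s=y, p="partOf"):
                    if (x, "isIn", z) not in self.triples:
                        self.add(x, "isIn", z)
                        changed = True

ctx = ContextStore()
ctx.add("userDevice", "isIn", "room12")
ctx.add("room12", "partOf", "buildingA")
ctx.infer_location()
print(ctx.query(s="userDevice", p="isIn"))   # also contains buildingA after reasoning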

Relevance:

70.00%

Publisher:

Abstract:

The continuous advancements and enhancements of wireless systems are enabling new compelling scenarios where mobile services can adapt according to the current execution context, represented by the computational resources available at the local device, the current physical location, the people in physical proximity, and so forth. Such services, called context-aware services, require the timely delivery of all relevant information describing the current context, and that introduces several unsolved complexities, spanning from low-level context data transmission up to context data storage and replication in the mobile system. In addition, to ensure correct and scalable context provisioning, it is crucial to integrate and interoperate with different wireless technologies (WiFi, Bluetooth, etc.) and modes (infrastructure-based and ad-hoc), and to use decentralized solutions to store and replicate context data on mobile devices. These challenges call for novel middleware solutions, here called Context Data Distribution Infrastructures (CDDIs), capable of delivering relevant context data to mobile devices while hiding all the issues introduced by data distribution in heterogeneous and large-scale mobile settings. This dissertation thoroughly analyzes CDDIs for mobile systems, with the main goal of achieving a holistic approach to the design of this type of middleware. We discuss the main functions needed for context data distribution in large mobile systems, and we argue that quality-based contracts between context consumers and the CDDI must be precisely defined and strictly respected in order to reconfigure the main middleware components at runtime. We present the design and implementation of our proposals, in both simulation-based and real-world scenarios, along with an extensive evaluation that confirms the technical soundness of the proposed CDDI solutions. Finally, we consider three highly heterogeneous scenarios, namely disaster areas, smart campuses, and smart cities, to highlight the wide technical validity of our analysis and solutions under different network deployments and quality constraints.
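
To make the notion of a quality-based contract more concrete, here is a minimal Python sketch in which a context consumer registers a contract (maximum staleness, minimum delivery period) with a CDDI that reacts when the contract cannot be honored; the field names and the reconfiguration hook are illustrative assumptions, not the dissertation's actual middleware API.

# Minimal sketch of a quality-based contract between a context consumer and a
# CDDI; the contract fields and the reconfiguration policy are illustrative only.
import time
from dataclasses import dataclass

@dataclass
class QoCContract:
    max_staleness_s: float    # oldest acceptable context data
    min_period_s: float       # minimum spacing between deliveries

class ContextConsumer:
    def __init__(self, name, contract):
        self.name, self.contract = name, contract
    def deliver(self, item, produced_at):
        staleness = time.time() - produced_at
        if staleness > self.contract.max_staleness_s:
            raise RuntimeError(f"{self.name}: contract violated, data too stale")
        print(f"{self.name} <- {item} (staleness {staleness:.2f}s)")

class CDDI:
    def __init__(self):
        self.subscriptions = []       # pairs [consumer, last_delivery_time]
    def subscribe(self, consumer):
        self.subscriptions.append([consumer, 0.0])
    def publish(self, item, produced_at):
        now = time.time()
        for sub in self.subscriptions:
            consumer, last = sub
            if now - last >= consumer.contract.min_period_s:
                try:
                    consumer.deliver(item, produced_at)
                    sub[1] = now
                except RuntimeError:
                    # A real CDDI would reconfigure here, e.g. switch to a fresher
                    # source or a different replication strategy.
                    print(f"reconfiguring delivery for {consumer.name}")

cddi = CDDI()
cddi.subscribe(ContextConsumer("navApp", QoCContract(max_staleness_s=2.0, min_period_s=0.5)))
cddi.publish({"location": "41.9,12.5"}, produced_at=time.time())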

Relevance:

70.00%

Publisher:

Abstract:

Software must be constantly adapted to changing requirements. The time scale, abstraction level and granularity of adaptations may vary from short-term, fine-grained adaptation to long-term, coarse-grained evolution. Fine-grained, dynamic and context-dependent adaptations can be particularly difficult to realize in long-lived, large-scale software systems. We argue that, in order to effectively and efficiently deploy such changes, adaptive applications must be built on an infrastructure that is not just model-driven, but is both model-centric and context-aware. Specifically, this means that high-level, causally-connected models of the application and the software infrastructure itself should be available at run-time, and that changes may need to be scoped to the run-time execution context. We first review the dimensions of software adaptation and evolution, and then we show how model-centric design can address the adaptation needs of a variety of applications that span these dimensions. We demonstrate through concrete examples how model-centric and context-aware designs work at the level of application interface, programming language and runtime. We then propose a research agenda for a model-centric development environment that supports dynamic software adaptation and evolution.
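
The following minimal Python sketch illustrates the two ingredients named above, a causally connected run-time model and an adaptation scoped to an execution context; the classes (ComponentModel, Renderer) and the context-manager scoping are illustrative assumptions, not the authors' infrastructure.

# Minimal sketch of a causally connected run-time model: the model object both
# describes and controls the running component, and a change can be scoped to a
# particular execution context. All names are illustrative.
from contextlib import contextmanager

class ComponentModel:
    def __init__(self, component, attrs):
        self.component, self.attrs = component, dict(attrs)
    def set(self, key, value):
        self.attrs[key] = value
        setattr(self.component, key, value)   # causal connection: model -> running code

class Renderer:
    quality = "high"
    def render(self):
        return f"rendering at {self.quality} quality"

@contextmanager
def scoped_change(model, key, value):
    old = model.attrs[key]
    model.set(key, value)
    try:
        yield
    finally:
        model.set(key, old)          # the adaptation does not leak outside the scope

renderer = Renderer()
model = ComponentModel(renderer, {"quality": renderer.quality})

print(renderer.render())                         # high quality
with scoped_change(model, "quality", "low"):     # e.g. low-battery execution context
    print(renderer.render())                     # low quality, only inside this scope
print(renderer.render())                         # high quality again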

Relevance:

70.00%

Publisher:

Abstract:

Context-dependent behavior is becoming increasingly important for a wide range of application domains, from pervasive computing to common business applications. Unfortunately, mainstream programming languages do not provide mechanisms that enable software entities to adapt their behavior dynamically to the current execution context. This leads developers to adopt convoluted designs to achieve the necessary runtime flexibility. We propose a new programming technique called Context-oriented Programming (COP) which addresses this problem. COP treats context explicitly, and provides mechanisms to dynamically adapt behavior in reaction to changes in context, even after system deployment at runtime. In this paper we lay the foundations of COP, show how dynamic layer activation enables multi-dimensional dispatch, illustrate the application of COP by examples in several language extensions, and demonstrate that COP is largely independent of other commitments to programming style.
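
The core COP idea can be approximated in plain Python, as in the sketch below: behavioral variations live in layers, and a layer is activated dynamically for the extent of a block. This is only an illustration of the concept, not one of the paper's actual language extensions.

# Minimal sketch of the COP idea in plain Python: layered behavioral variations
# plus dynamic layer activation scoped to a block. Illustrative only.
from contextlib import contextmanager

active_layers = []

@contextmanager
def with_layer(layer):
    active_layers.append(layer)
    try:
        yield
    finally:
        active_layers.pop()

class Person:
    def __init__(self, name, employer):
        self.name, self.employer = name, employer
    def display(self):
        text = self.name
        if "employment" in active_layers:      # layer-dependent variation
            text += f", works at {self.employer}"
        return text

p = Person("Ada", "Analytical Engines Ltd.")
print(p.display())                 # base behavior: "Ada"
with with_layer("employment"):     # context-dependent behavior, activated at runtime
    print(p.display())             # "Ada, works at Analytical Engines Ltd."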

Relevance:

60.00%

Publisher:

Abstract:

Sensor network nodes exhibit characteristics of both embedded systems and general-purpose systems. A sensor network operating system is a kind of embedded operating system, but unlike a typical embedded operating system, a sensor network operating system may not be real-time, and it is constrained by tight memory and energy limitations. Most sensor network operating systems are based on an event-driven approach, which is efficient in terms of time and space and does not require a separate stack for each execution context. With this model, however, it is difficult to implement long-running tasks, such as cryptographic operations. Thread-based computation requires a separate stack for each execution context and is less efficient in terms of time and space. In this paper, we propose a thread-based execution model that uses only a fixed number of stacks: the number of stacks at each priority level is fixed. This minimizes the stack requirement of a multi-threading environment while still providing ease of programming. We give an implementation of this model in Contiki OS by completely separating the thread implementation from the protothread implementation, and we have tested our OS by implementing a clock synchronization protocol on top of it.
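
The scheduling policy described above can be sketched as follows; this Python simulation only illustrates the idea of a fixed pool of stacks per priority level (a task runs only when a slot at its level is free) and is not the paper's actual Contiki-based implementation.

# Minimal sketch of the proposed policy: each priority level owns a fixed pool of
# stacks, and a task runs only when a stack slot at its level is free.
import threading, queue

STACKS_PER_PRIORITY = {0: 1, 1: 2}    # e.g. one stack for high, two for low priority

class FixedStackScheduler:
    def __init__(self, stacks_per_priority):
        self.slots = {p: threading.Semaphore(n) for p, n in stacks_per_priority.items()}
        self.tasks = queue.Queue()
    def submit(self, priority, fn):
        self.tasks.put((priority, fn))
    def run(self):
        while not self.tasks.empty():
            priority, fn = self.tasks.get()
            self.slots[priority].acquire()        # wait for a free stack at this level
            def worker(fn=fn, priority=priority):
                try:
                    fn()                          # long-running task keeps its own stack
                finally:
                    self.slots[priority].release()
            threading.Thread(target=worker).start()

sched = FixedStackScheduler(STACKS_PER_PRIORITY)
sched.submit(1, lambda: print("sensor sampling"))
sched.submit(0, lambda: print("long-running crypto operation"))
sched.run()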

Relevance:

60.00%

Publisher:

Abstract:

Software Transactional Memory (STM) systems have poor performance under high-contention scenarios. Since many transactions compete for the same data, most of them are aborted, wasting processor runtime. Contention management policies are typically used to avoid this, but they are passive approaches: they wait for an abort to happen before taking action. More proactive approaches have emerged that try to predict when a transaction is likely to abort so its execution can be delayed. Such techniques are limited, as they do not replace the doomed transaction with another or, when they do, they rely on the operating system for that, having little or no control over which transaction should run. In this paper we propose LUTS, a Lightweight User-Level Transaction Scheduler, which is based on an execution context record mechanism. Unlike other techniques, LUTS provides the means for selecting another transaction to run in parallel, thus improving system throughput. Moreover, it avoids most of the issues caused by pseudo-parallelism, as it only launches as many system-level threads as there are available processor cores. We discuss the LUTS design and present three conflict-avoidance heuristics built around LUTS's scheduling capabilities. Experimental results, conducted with the STMBench7 and STAMP benchmark suites, show the efficiency of LUTS when running high-contention applications and how the conflict-avoidance heuristics can improve STM performance even further. In fact, our transaction scheduling techniques are capable of improving program performance even in overloaded scenarios. © 2011 Springer-Verlag.
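
A minimal sketch of the general idea behind a user-level transaction scheduler with conflict avoidance is shown below; the heuristic (skip transactions whose data accesses overlap with those currently running) and all names are illustrative assumptions, not the actual LUTS design.

# Minimal sketch: before dispatching, a conflict-avoidance heuristic delays
# transactions whose accesses overlap with running ones and picks another
# pending transaction instead. Illustrative only.
from collections import deque

class Transaction:
    def __init__(self, name, accesses):
        self.name, self.accesses = name, set(accesses)

class UserLevelScheduler:
    def __init__(self):
        self.pending = deque()
        self.running_accesses = set()
    def submit(self, tx):
        self.pending.append(tx)
    def pick_next(self):
        # Heuristic: prefer a transaction with no data overlap with running ones.
        for _ in range(len(self.pending)):
            tx = self.pending.popleft()
            if tx.accesses.isdisjoint(self.running_accesses):
                return tx
            self.pending.append(tx)      # likely to conflict: delay it
        return self.pending.popleft() if self.pending else None

sched = UserLevelScheduler()
sched.running_accesses = {"account:42"}
sched.submit(Transaction("t1", ["account:42"]))      # would conflict
sched.submit(Transaction("t2", ["account:7"]))       # independent
print(sched.pick_next().name)                        # t2 is scheduled first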

Relevance:

60.00%

Publisher:

Abstract:

This thesis is based on the study of the two-body, two-point boundary-value problem, initially developed by Lambert, from whom it takes its name. In the past, Lambert's Problem was used for orbit determination from astronomical observations of celestial bodies. Currently, it is in continuous use in orbit determination, planetary and interplanetary missions, space rendezvous and interception, and even orbit corrections. Given its great importance, it was decided to investigate its solution and its applications in current space missions. The open research field is very wide, so it is necessary to determine specific and realistic objectives, within the execution context of a Thesis, that show clearly enough the potential of the results provided in this work, and even allow them to be extended to other areas of application. As a result of this analysis, the main aim of the thesis focuses on the development of algorithms to solve Lambert's Problem that can be applied very efficiently in the real missions where it appears. In all these developments, special consideration has been given to the efficiency of the required computation compared to currently existing methods, highlighting how to avoid the loss of precision inherent in this type of algorithm and the possibility of applying any iterative method involving the use of derivatives of any order. In pursuit of these objectives, several solutions to Lambert's Problem are developed, all based on the resolution of transcendental equations, with which the following main contributions of this work are reached: • A completely different generic way to obtain the various equations for solving Lambert's Problem by analytical development, from scratch, starting from the known elementary equations of the conics (geometric and temporal), providing in all of them formulas for the calculation of derivatives of any order. • A unified view of the most relevant existing equations, showing their equivalence with variants of the equations developed here. • The derivation of a new variant of the equation, the major achievement of this Thesis, which outperforms all the others in efficiency (both in computational cost and in accuracy). • A study of the sensitivity of the solution to variations in the initial data, and of how to apply the results to real cases of trajectory optimization. • In addition, from the results it is possible to deduce many properties used in the literature to simplify the problem, in particular the invariance property, which leads to the Simplified Transformed Problem.
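
For concreteness, one classical transcendental relation of the kind such solvers work with is Lagrange's form of the time-of-flight equation for the elliptic case; the notation below is the standard textbook one, not necessarily the thesis's, and branch and sign conventions vary.

\sqrt{\mu}\,(t_2 - t_1) = a^{3/2}\left[(\alpha - \sin\alpha) - (\beta - \sin\beta)\right],
\qquad
\sin^2\frac{\alpha}{2} = \frac{s}{2a}, \quad
\sin^2\frac{\beta}{2} = \frac{s - c}{2a}, \quad
s = \frac{r_1 + r_2 + c}{2}

Here r_1 and r_2 are the radii of the two endpoints, c the chord joining them, a the semi-major axis of the transfer ellipse, and \mu the gravitational parameter; a Lambert solver iterates on one unknown (here a) until the computed time of flight matches the prescribed one.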

Relevance:

30.00%

Publisher:

Abstract:

Business Process Management describes a holistic management approach for the systematic design, modeling, execution, validation, monitoring, and improvement of organizational business processes. Traditionally, most attention within this community has been given to control-flow aspects, i.e., the ordering and sequencing of business activities, often in isolation from the context in which these activities occur. In this paper, we propose an approach that allows executable process models to be integrated with Geographic Information Systems. This approach enables process models to take geospatial and other geographic aspects into account in an explicit manner during both the modeling phase and the execution phase. We contribute a structured modeling methodology, based on the well-known Business Process Model and Notation standard, which is formalized by means of a mapping to executable Colored Petri nets. We illustrate the feasibility of our approach by means of a sustainability-focused case example of a process with important ecological concerns.
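
As a rough illustration of making geospatial context explicit in an executable process, the Python sketch below guards a task with a point-in-polygon check; it is an illustrative assumption, not the paper's BPMN extension or its Colored Petri net mapping.

# Minimal sketch: a process activity whose executability depends on an explicit
# geospatial condition (a point-in-polygon test). Illustrative only.

def point_in_polygon(point, polygon):
    """Ray-casting test: is (x, y) inside the polygon given as a list of vertices?"""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

class GeoTask:
    def __init__(self, name, allowed_area):
        self.name, self.allowed_area = name, allowed_area
    def execute(self, case):
        if not point_in_polygon(case["location"], self.allowed_area):
            raise RuntimeError(f"{self.name}: location outside permitted area")
        print(f"executing {self.name} at {case['location']}")

protected_zone = [(0, 0), (10, 0), (10, 10), (0, 10)]    # hypothetical permitted area
task = GeoTask("soil sampling", protected_zone)
task.execute({"location": (3, 4)})                        # allowed: inside the zone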

Relevance:

30.00%

Publisher:

Abstract:

In recent years, concerns over litigation and the trend towards close monitoring of academic activity have seen the effective hijacking of research ethics by university managers and bureaucrats. This can effectively curtail cutting-edge research, as perceived ‘safe’ research strategies are encouraged. However, ethics is about more than research governance. Ultimately, it seeks to avoid harm and to increase benefits to society. Rural development debate is fairly quiet on the question of ethics, leaving guidance to professional bodies. This study draws on empirical research that examined the lives of migrant communities in Northern Ireland. This context of increasingly diverse rural development actors provides a backdrop for the way in which the researcher navigates ethical issues as they unfold in the field. The analysis seeks to relocate ethics from being an annoying bureaucratic requirement to one where it is inherent to rigorous and professional research and practice. It reveals how attention to professional ethics can contribute to effective, situated and reflexive practice, thus transforming ethics into an asset for professional researchers.

Relevance:

30.00%

Publisher:

Abstract:

In the domain of aerospace aftermarkets, which often involves long supply chains that feed into the maintenance of aircraft, contracts are used to establish agreements between aircraft operators and maintenance suppliers. However, violations at the bottom of the supply chain (part suppliers) can easily cascade to the top (aircraft operators), making it difficult to determine the source of the violation and to address it. In this context, we have developed a global monitoring architecture that ensures the detection of norm violations and generates explanations for the origin of violations. In this paper, we describe the implementation and deployment of a global monitor in the aerospace domain of [8] and show how it generates explanations for violations within the maintenance supply chain. We show how these explanations can be used not only to detect violations at runtime, but also to uncover potential problems in contracts before their deployment, thus improving them.
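
A minimal Python sketch of the monitoring idea follows: norms are checked per supply-chain tier and a top-level violation is explained by tracing the cascade upstream; the norms, parties, and data are hypothetical, not the deployed architecture of [8].

# Minimal sketch: a global monitor checks norms at each tier of a supply chain
# and explains a top-level violation by tracing its upstream origin. Illustrative only.

class Norm:
    def __init__(self, party, description, check):
        self.party, self.description, self.check = party, description, check

class GlobalMonitor:
    def __init__(self, norms, dependencies):
        self.norms = norms                    # norms per party
        self.dependencies = dependencies      # party -> upstream party it depends on
    def explain(self, party, state):
        violations = [n.description for n in self.norms.get(party, []) if not n.check(state)]
        if not violations:
            return []
        upstream = self.dependencies.get(party)
        cause = self.explain(upstream, state) if upstream else []
        return [(party, violations)] + cause

state = {"part_delivery_days": 12, "maintenance_turnaround_days": 20}
norms = {
    "operator":   [Norm("operator", "aircraft returned within 15 days",
                        lambda s: s["maintenance_turnaround_days"] <= 15)],
    "maintainer": [Norm("maintainer", "spare part delivered within 7 days",
                        lambda s: s["part_delivery_days"] <= 7)],
}
monitor = GlobalMonitor(norms, dependencies={"operator": "maintainer"})
for party, violated in monitor.explain("operator", state):
    print(party, "violated:", violated)   # traces the operator-level delay upstream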

Relevance:

30.00%

Publisher:

Abstract:

Ubiquitous Computing promises seamless access to a wide range of applications and Internet-based services from anywhere, at any time, and using any device. In this scenario, new challenges for the practice of software development arise: applications and services must keep a coherent behavior and a proper appearance, and must adapt to a plethora of contextual usage requirements and hardware aspects. In particular, due to its interactive nature, the interface content of Web applications must adapt to a large diversity of devices and contexts. In order to overcome such obstacles, this work introduces an innovative methodology for content adaptation of Web 2.0 interfaces. The basis of our work is to combine static adaptation - the implementation of static Web interfaces - with dynamic adaptation - the alteration, at execution time, of static interfaces so as to adapt them to different contexts of use. In hybrid fashion, our methodology benefits from the advantages of both adaptation strategies, static and dynamic. Along these lines, we designed and implemented UbiCon, a framework over which we tested our concepts through a case study and a development experiment. Our results show that the hybrid methodology over UbiCon leads to broader and more accessible interfaces and to faster and less costly software development. We believe that the UbiCon hybrid methodology can foster more efficient and accurate interface engineering in industry and academia.
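
The hybrid strategy can be pictured with the minimal Python sketch below: a statically authored interface variant is selected per device class and then altered at execution time from the current context; the template fields and adaptation rules are illustrative assumptions, not the actual UbiCon framework.

# Minimal sketch of the hybrid idea: static adaptation picks a pre-built interface
# variant, dynamic adaptation adjusts it from the current usage context. Illustrative only.

STATIC_TEMPLATES = {
    "desktop": {"columns": 3, "image_quality": "high", "font_px": 14},
    "mobile":  {"columns": 1, "image_quality": "medium", "font_px": 16},
}

def dynamic_adaptation(ui, context):
    """Alter the static interface at execution time based on the current context."""
    ui = dict(ui)
    if context.get("bandwidth_kbps", 10_000) < 500:
        ui["image_quality"] = "low"              # degrade images on slow links
    if context.get("ambient_light") == "bright":
        ui["font_px"] += 2                       # larger text outdoors
    return ui

def render(device_class, context):
    static_ui = STATIC_TEMPLATES[device_class]      # static adaptation
    return dynamic_adaptation(static_ui, context)   # dynamic adaptation

print(render("mobile", {"bandwidth_kbps": 200, "ambient_light": "bright"}))
# {'columns': 1, 'image_quality': 'low', 'font_px': 18}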

Relevance:

30.00%

Publisher:

Abstract:

This dissertation examines the global technological and environmental history of copper smelting and the conflict that developed between historic preservation and environmental remediation at major copper smelting sites in the United States after their productive periods ended. Part I of the dissertation is a synthetic overview of the history of copper smelting and its environmental impact. After reviewing the basic metallurgy of copper ores, the dissertation contains successive chapters on the history of copper smelting to 1640, culminating in the so-called German, or Continental, processing system; on the emergence of the rival Welsh system during the British industrial revolution; and on the growth of American dominance in copper production in the late 19th and early 20th centuries. The latter chapter focuses, in particular, on three of the most important early American copper districts: Michigan’s Keweenaw Peninsula, Tennessee’s Copper Basin, and Butte-Anaconda, Montana. As these three districts went into decline and ultimately out of production, they left a rich industrial heritage and significant waste and pollution problems generated by increasingly sophisticated technologies capable of commercially processing steadily growing volumes of decreasingly rich ores. Part II of the dissertation looks at the conflict between historic preservation and environmental remediation that emerged locally and nationally in copper districts as they went into decline and eventually ceased production. Locally, former copper mining communities often split between those who wished to commemorate a region’s past importance and develop heritage tourism, and local developers who wished to clear up and clean out old industrial sites for other purposes. Nationally, Congress passed laws in the 1960s and 1970s mandating the preservation of historical resources (National Historic Preservation Act) and laws mandating the cleanup of contaminated landscapes (CERCLA, or Superfund), objectives that were sometimes in conflict, especially in the case of copper smelting sites. The dissertation devotes individual chapters to the conflicts that developed between environmental remediation, particularly involving the Environmental Protection Agency, and the heritage movement in the Tennessee, Montana, and Michigan copper districts. A concluding chapter provides a broad model to illustrate the relationship between industrial decline, federal environmental remediation activities, and the growth of heritage consciousness in former copper mining and smelting areas, analyzes why the outcome varied in the three areas, and suggests methods for dealing with heritage-remediation issues in ways that minimize conflict and maximize heritage preservation.

Relevance:

30.00%

Publisher:

Abstract:

The present study investigated the role of the right posterior parietal cortex (PPC) in the triggering of memory-guided saccades by means of double-pulse transcranial magnetic stimulation (dTMS). Shortly before saccade onset, dTMS was applied with different interstimulus intervals (ISI; 35, 50, 65, or 80 ms). For contralateral saccades, dTMS significantly decreased saccadic latency with an ISI of 80 ms and increased saccadic gain with ISIs of 65 and 80 ms. Together with the findings of a previous study on frontal eye field (FEF) stimulation, the present results demonstrate similarities and differences between the two regions in the execution of memory-guided saccades. First, dTMS facilitates saccade triggering in both regions, but the timing is different. Second, dTMS over the PPC provokes a hypermetria of contralateral memory-guided saccades that was not observed during FEF stimulation. The results are discussed within the context of recent neurophysiological findings in monkeys.