855 results for Scenario Programming, Markup Language, End User Programming
Abstract:
Wireless sensor networks have been identified as one of the key technologies for the 21st century. They consist of tiny devices with limited processing and power capabilities, called motes, that can be deployed in large numbers to provide useful sensing capabilities. Even though they are flexible and easy to deploy, a number of considerations concerning their fault tolerance, energy conservation and re-programmability need to be addressed before any substantial conclusions can be drawn about the effectiveness of this technology. In order to overcome these limitations, we propose a middleware solution. The proposed scheme is built on two main methods. The first method involves the creation of a flexible communication protocol based on technologies such as mobile code/agents and Linda-like tuple spaces. In this way, every node of the wireless sensor network produces and processes data based on what is best both for itself and for the group to which it belongs. The second method incorporates the above protocol into a middleware that aims to bridge the gap between the application layer and low-level constructs such as the physical layer of the wireless sensor network. A fault-tolerant platform for deploying and monitoring applications in real time offers a number of possibilities to the end user, while also giving them the freedom to experiment with various parameters so that the deployed applications run in an energy-efficient manner inside the network. The proposed scheme is evaluated through a number of trials that aim to test its merits under real-time conditions and to compare its effectiveness against other similar approaches. Finally, parameters which determine the characteristics of the proposed scheme are also examined.
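The abstract above mentions coordination through Linda-like tuple spaces. As a rough illustration of that coordination style only (not the paper's actual middleware), the minimal Python sketch below shows the classic out/rd/in primitives; the class, method names and tuple fields are hypothetical.

    import threading

    class TupleSpace:
        """Minimal Linda-like tuple space: motes coordinate by writing and
        pattern-matching tuples instead of addressing each other directly."""

        def __init__(self):
            self._tuples = []
            self._cond = threading.Condition()

        def out(self, tup):
            # Publish a tuple, e.g. ("temp", node_id, 23.5), to the shared space.
            with self._cond:
                self._tuples.append(tup)
                self._cond.notify_all()

        def _match(self, pattern, tup):
            # None acts as a wildcard field in the pattern.
            return len(pattern) == len(tup) and all(
                p is None or p == t for p, t in zip(pattern, tup))

        def rd(self, pattern):
            # Blocking read: return a matching tuple without removing it.
            with self._cond:
                while True:
                    for t in self._tuples:
                        if self._match(pattern, t):
                            return t
                    self._cond.wait()

        def take(self, pattern):
            # Blocking 'in' operation: remove and return a matching tuple.
            with self._cond:
                while True:
                    for t in self._tuples:
                        if self._match(pattern, t):
                            self._tuples.remove(t)
                            return t
                    self._cond.wait()

    space = TupleSpace()
    space.out(("temp", 7, 23.5))           # mote 7 publishes a reading
    print(space.rd(("temp", None, None)))  # another agent reads any temperature tuple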
Abstract:
As mobile devices become increasingly diverse and continue to shrink in size and weight, their portability is enhanced but, unfortunately, their usability tends to suffer. Ultimately, the usability of mobile technologies determines their future success in terms of end-user acceptance and, thereafter, adoption and social impact. Widespread acceptance will not, however, be achieved if users' interaction with mobile technology amounts to a negative experience. Mobile user interfaces need to be designed to meet the functional and sensory needs of users. Social and Organizational Impacts of Emerging Mobile Devices: Evaluating Use focuses on human-computer interaction research and innovation in the design, evaluation, and use of innovative handheld, mobile, and wearable technologies, in order to broaden the overall body of knowledge regarding such issues. It aims to provide an international forum for researchers, educators, and practitioners to advance knowledge and practice in all facets of design and evaluation of human interaction with mobile technologies.
Abstract:
Wireless sensor networks have been identified as one of the key technologies for the 21st century. In order to overcome their limitations, such as fault tolerance and energy conservation, we propose a middleware solution, In-Motes. In-Motes is a fault-tolerant platform for deploying and monitoring applications in real time that offers a number of possibilities to the end user, while also giving them the freedom to experiment with various parameters so that the deployed applications run in an energy-efficient manner inside the network. The proposed scheme is evaluated through the In-Motes EYE application, which aims to test its merits under real-time conditions. In-Motes EYE is an agent-based, real-time In-Motes application developed for sensing acceleration variations in an environment. The application was tested in a road-like prototype area for a period of four months.
Abstract:
Models are central tools for modern scientists and decision makers, and there are many existing frameworks to support their creation, execution and composition. Many frameworks are based on proprietary interfaces and do not lend themselves to the integration of models from diverse disciplines. Web-based systems, or systems based on web services, such as Taverna and Kepler, allow composition of models based on standard web service technologies. At the same time, the Open Geospatial Consortium has been developing its own service stack, which includes the Web Processing Service, designed to facilitate the execution of geospatial processing, including complex environmental models. The current Open Geospatial Consortium service stack employs Extensible Markup Language as a default data exchange standard, and widely used encodings such as JavaScript Object Notation can often only be used when incorporated with Extensible Markup Language. Similarly, no successful engagement of the Web Processing Service standard with the well-supported technologies of Simple Object Access Protocol and Web Services Description Language has been seen. In this paper we propose a pure Simple Object Access Protocol/Web Services Description Language processing service which addresses some of the issues with the Web Processing Service specification and brings us closer to achieving a degree of interoperability between geospatial models, and thus to realising the vision of a useful 'model web'.
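To make the SOAP-based approach concrete, the sketch below builds a bare SOAP 1.1 request envelope for a hypothetical "RunModel" operation using only the Python standard library; the operation name, service namespace and parameters are illustrative assumptions, not the service described in the abstract.

    import xml.etree.ElementTree as ET

    SOAP_ENV = "http://schemas.xmlsoap.org/soap/envelope/"   # SOAP 1.1 envelope namespace
    SVC_NS = "http://example.org/geoprocessing"              # hypothetical service namespace

    def build_soap_request(model_id, params):
        """Return a SOAP request (bytes) invoking a hypothetical RunModel operation."""
        ET.register_namespace("soap", SOAP_ENV)
        envelope = ET.Element(f"{{{SOAP_ENV}}}Envelope")
        body = ET.SubElement(envelope, f"{{{SOAP_ENV}}}Body")
        run = ET.SubElement(body, f"{{{SVC_NS}}}RunModel")
        ET.SubElement(run, f"{{{SVC_NS}}}ModelId").text = model_id
        for name, value in params.items():
            p = ET.SubElement(run, f"{{{SVC_NS}}}Parameter", attrib={"name": name})
            p.text = str(value)
        return ET.tostring(envelope, encoding="utf-8", xml_declaration=True)

    print(build_soap_request("rainfall-runoff", {"timestep": 3600}).decode())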
Abstract:
The use of spreadsheets has become routine in all aspects of business, with usage growing across a range of functional areas and a continuing trend towards end-user spreadsheet development. However, several studies have raised concerns about the accuracy of spreadsheet models in general, and of end-user developed applications in particular, raising the risk element for users. High error rates have been discovered, even though the users/developers were confident that their spreadsheets were correct. The lack of an easy-to-use, context-sensitive validation methodology has been highlighted as a significant contributor to the problems of accuracy. This paper describes experiences in using a practical, contingency factor-based methodology for validation of spreadsheet-based decision support systems (DSS). Because the end user is often both the system developer and a stakeholder, the contingency factor-based validation methodology may need to be used in more than one way. The methodology can also be extended to encompass other DSS.
Abstract:
In this article we envision factors and trends that shape the next generation of environmental monitoring systems. One key factor in this respect is the combined effect of end-user needs and the general development of IT services and their availability. Currently, an environmental (monitoring) system is assumed to be reactive: it delivers measurement data and computational results only if the user explicitly asks for them, either by query or by subscription. There is a temptation to automate this by simply pushing data to end users. This, however, easily leads to an "advertisement strategy", where data is pushed to end users regardless of their needs. Under this strategy, the sheer amount of received data obscures the individual messages; any "automatic" service, regardless of its fitness, overruns a system that requires the user's initiative. The foreseeable problem is that, unless there is overall management, each new environmental service is going to compete for end users' attention and thus inadvertently hinder the use of existing services. As the main contribution we investigate the nature of proactive environmental systems and how they should be designed to avoid the aforementioned problem. We also discuss how semantics, participatory sensing, uncertainty management, and situational awareness link to proactive environmental systems. We illustrate our proposals with some real-life examples.
Abstract:
The Semantic Web relies on carefully structured, well-defined data to allow machines to communicate and understand one another. In many domains (e.g. geospatial) the data being described contain some uncertainty, often due to incomplete knowledge; meaningful processing of these data requires the uncertainties to be carefully analysed and integrated into the process chain. Currently, within the Semantic Web there is no standard mechanism for interoperable description and exchange of uncertain information, which renders the automated processing of such information implausible, particularly where error must be considered and captured as it propagates through a processing sequence. In particular we adopt a Bayesian perspective and focus on the case where the inputs/outputs are naturally treated as random variables. This paper discusses a solution to the problem in the form of the Uncertainty Markup Language (UncertML). UncertML is a conceptual model, realised as an XML schema, that allows uncertainty to be quantified in a variety of ways, i.e. as realisations, statistics and probability distributions. UncertML is based upon a soft-typed XML schema design that provides a generic framework from which any statistic or distribution may be created. Making extensive use of Geography Markup Language (GML) dictionaries, UncertML provides a collection of definitions for common uncertainty types. Containing both written descriptions and mathematical functions encoded as MathML, the definitions within these dictionaries provide a robust mechanism for defining any statistic or distribution and can easily be extended. Uniform Resource Identifiers (URIs) are used to introduce semantics to the soft-typed elements by linking to these dictionary definitions. The INTAMAP (INTeroperability and Automated MAPping) project provides a use case for UncertML. This paper demonstrates how observation errors can be quantified using UncertML and wrapped within an Observations & Measurements (O&M) Observation. The interpolation service uses the information within these observations to influence the prediction outcome. The output uncertainties may be encoded in a variety of UncertML types, e.g. a series of marginal Gaussian distributions, a set of statistics such as the first three marginal moments, or a set of realisations from a Monte Carlo treatment. Quantifying and propagating uncertainty in this way allows such interpolation results to be consumed by other services. This could form part of a risk management chain or a decision support system, and ultimately paves the way for complex data processing chains in the Semantic Web.
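As a rough illustration of the soft-typed design described above (not the actual UncertML schema), the Python sketch below models a statistic whose meaning comes from a dictionary URI rather than from a hard-coded element type; all class names and URIs here are hypothetical.

    from dataclasses import dataclass, field

    @dataclass
    class Statistic:
        """Soft-typed uncertainty element: the 'definition' URI, not the class name,
        says what kind of statistic this is (mirroring the dictionary-based design)."""
        definition: str            # URI of a dictionary entry describing the statistic
        value: float
        attrs: dict = field(default_factory=dict)

    @dataclass
    class DistributionEstimate:
        """A distribution described purely by its defining URI plus named parameters."""
        definition: str
        parameters: dict

    # Hypothetical dictionary URIs; real UncertML definitions live in GML dictionaries.
    mean = Statistic(definition="http://example.org/dict/statistics#mean", value=12.3)
    variance = Statistic(definition="http://example.org/dict/statistics#variance", value=2.5)
    gaussian = DistributionEstimate(
        definition="http://example.org/dict/distributions#gaussian",
        parameters={"mean": mean.value, "variance": variance.value},
    )
    print(gaussian)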
Abstract:
Over recent years, hub-and-spoke distribution techniques have attracted widespread research attention. Despite there being a growing body of literature in this area, there is less focus on the spoke terminal as a key component in the overall service received by the end user. Current literature is highly geared towards discussing bulk optimization of freight units rather than the more discrete and individualistic profile characteristics of shared-user less-than-truckload (LTL) freight. In this paper, a literature review is presented to examine the role hub-and-spoke systems play in meeting multi-profile customer demands, particularly in developing sectors with more sophisticated needs, such as retail. The paper also looks at the use of simulation technology as a suitable tool for analyzing spoke-terminal operations within developing hub-and-spoke systems.
Abstract:
We introduce ReDites, a system for real-time event detection, tracking, monitoring and visualisation. It is designed to assist information analysts in understanding and exploring complex events as they unfold in the world. Events are automatically detected from the Twitter stream; those categorised as security-relevant are then tracked, geolocated, summarised and visualised for the end user. Furthermore, the system tracks changes in emotions over events, signalling possible flashpoints or abatement. We demonstrate the capabilities of ReDites using an extended use case from the September 2013 Westgate shooting incident. Through an evaluation of system latencies, we also show that enriched events are made available for users to explore within seconds of the event occurring.
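The processing chain described above (detect, filter for security relevance, enrich, present) can be pictured as a simple streaming pipeline. The sketch below is an illustrative skeleton only, with hypothetical placeholder stage functions rather than the ReDites implementation.

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class Event:
        text: str
        security_relevant: bool = False
        location: Optional[str] = None
        summary: Optional[str] = None
        emotions: dict = field(default_factory=dict)

    # Hypothetical stage functions standing in for the real detection/enrichment components.
    def detect_events(tweets):
        return [Event(text=t) for t in tweets if "breaking" in t.lower()]

    def classify(event):
        event.security_relevant = any(w in event.text.lower() for w in ("attack", "shooting"))
        return event

    def enrich(event):
        event.location = "unknown"                     # geolocation placeholder
        event.summary = event.text[:80]                # extractive summary placeholder
        event.emotions = {"fear": 0.0, "anger": 0.0}   # emotion-tracking placeholder
        return event

    def pipeline(tweets):
        for ev in detect_events(tweets):
            ev = classify(ev)
            if ev.security_relevant:
                yield enrich(ev)

    for ev in pipeline(["BREAKING: reports of a shooting at the mall"]):
        print(ev.summary, ev.location)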
Abstract:
The world is connected by a core network of long-haul optical communication systems that link countries and continents, enabling long-distance phone calls, data-center communications, and the Internet. The demands on information rates have been constantly driven up by applications such as online gaming, high-definition video, and cloud computing. All over the world, end-user connection speeds are being increased by replacing conventional digital subscriber line (DSL) and asymmetric DSL (ADSL) with fiber to the home. Clearly, the capacity of the core network must also increase proportionally.
Abstract:
Energy service companies (ESCOs) are faced with a range of challenges and opportunities associated with the rapidly changing and flexible requirements of energy customers (end users) and rapid improvements in technologies associated with energy and ICT. These opportunities for innovation include better prediction of energy demand, transparency of data to the end user, flexible and time-dependent energy pricing, and a range of novel finance models. The liberalisation of energy markets across the world has led to a very small price differential between suppliers on the unit cost of energy. Energy companies are therefore looking to add additional layers of value using service models borrowed from the manufacturing industry. This opens a range of new product and service offerings to energy markets and consumers, and has implications for the overall efficiency, utility and price of energy provision.
Conceptual Model and Security Requirements for DRM Techniques Used for e-Learning Objects Protection
Abstract:
This paper deals with the security problems of DRM-protected e-learning content. After a short review of the main DRM systems and methods used in e-learning, the participants in DRM schemes (e-learning object author, content creator, content publisher, license creator and end user) are examined. A conceptual model of the security-related processes of DRM implementation is then proposed, and subsequently refined to reflect some particularities of DRM protection for e-learning objects. A methodical approach is used to describe the security-related motives, responsibilities and goals of the main participants involved in the DRM system. Taken together with the process model, these security properties are used to establish a list of requirements to fulfil and to enable formal verification of real DRM systems' compliance with these requirements.
Abstract:
A real-time adaptive resource allocation algorithm that considers the end user's Quality of Experience (QoE) in the context of video streaming services is presented in this work. An objective no-reference quality metric, namely Pause Intensity (PI), is used to control the priority of resource allocation to users during the scheduling process. An online adjustment has been introduced to adaptively set the scheduler's parameter and maintain a desired trade-off between fairness and efficiency. The correlation between the data rates (i.e. video code rates) demanded by users and the data rates allocated by the scheduler is taken into account as well. The final allocated rates are determined based on the channel status, the distribution of PI values among users, and the scheduling policy adopted. Furthermore, since the user's capability varies as environmental conditions change, the rate adaptation mechanism for video streaming is considered and its interaction with the scheduling process under the same PI metric is studied. The feasibility of implementing this algorithm is examined and the result is compared with the most common existing scheduling methods.
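To illustrate the kind of fairness/efficiency trade-off the abstract describes (not the paper's actual algorithm), the sketch below splits a fixed rate budget by weighting each user's channel-supported rate with their pause intensity, with a tunable exponent controlling the balance; the weighting formula and all variable names are assumptions.

    def allocate_rates(channel_rates, pause_intensity, demanded_rates, budget, alpha=1.0):
        """Split a total rate 'budget' among users.

        channel_rates[i]: rate user i's channel could support
        pause_intensity[i]: PI in [0, 1]; higher means worse playback experience
        demanded_rates[i]: video code rate requested by user i (caps the allocation)
        alpha: trade-off knob; 0 ignores PI (pure efficiency), larger values favour
               users currently suffering more pauses (fairer in QoE terms)
        """
        weights = [c * ((pi + 1e-9) ** alpha)
                   for c, pi in zip(channel_rates, pause_intensity)]
        total = sum(weights)
        raw = [budget * w / total for w in weights]
        # Never allocate more than the user demands or the channel supports.
        return [min(r, d, c) for r, d, c in zip(raw, demanded_rates, channel_rates)]

    rates = allocate_rates(
        channel_rates=[8.0, 4.0, 2.0],      # Mbit/s each channel could support
        pause_intensity=[0.1, 0.6, 0.9],    # users 2 and 3 are stalling more
        demanded_rates=[5.0, 3.0, 2.5],
        budget=6.0,
    )
    print(rates)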
Abstract:
The Arnamagnæan Institute, principally in the form of the present writer, has been involved in a number of projects to do with the digitisation, electronic description and text-encoding of medieval manuscripts. Several of these projects were dealt with in a previous article 'The view from the North: Some Scandinavian digitisation projects', NCD review, 4 (2004), pp. 22-30. This paper looks in some depth at two others, MASTER and CHLT. The Arnamagnæan Institute is a teaching and research institute within the Faculty of Humanities at the University of Copenhagen. It is named after the Icelandic scholar and antiquarian Árni Magnússon (1663-1730), secretary of the Royal Danish Archives and Professor of Danish Antiquities at the University of Copenhagen, who in the course of his lifetime built up what is arguably the single most important collection of early Scandinavian manuscripts in the world, some 2,500 manuscript items, the earliest dating from the 12th century. The majority of these are from Iceland, but the collection also contains important Norwegian, Danish and Swedish manuscripts, along with approximately 100 manuscripts of continental provenance. In addition to the manuscripts proper, there are collections of original charters and apographa: 776 Norwegian (including Faroese, Shetlandic and Orcadian) charters and 2895 copies, 1571 Danish charters and 1372 copies, and 1345 Icelandic charters and 5942 copies. When he died in 1730, Árni Magnússon bequeathed his collection to the University of Copenhagen. The original collection has subsequently been augmented through individual purchases and gifts and the acquisition of a number of smaller collections, bringing the total to nearly 3000 manuscript items, which, with the charters and apographa, comprise over half a million pages.