965 results for End-user enrichments
Abstract:
Relationships between organizations can be characterized by cooperation, conflict, and change. In this dissertation we study cooperation between organizations by investigating how norms in relationships can enhance innovativeness and subsequently impact relationship performance. We do so by incorporating both the beneficial aspects of long-term relationships and the “dark side” factors that may decrease innovativeness. This provides a balanced assessment of the factors that increase and decrease relationship performance. Next, we study conflict between organizations by taking a network view of conflict, which helps explain why organizations react to conflict. We find that stakeholders have an effect on channel conflict responsiveness. Finally, we study change through an organization’s ability to successfully add an Internet channel to its distribution system in order to sell its products or services directly to the end-user. We find that an Internet channel is best implemented by organizations that are flexible, and we identify several circumstances under which this flexibility is highest.
Abstract:
The INTAMAP FP6 project has developed an interoperable framework for real-time automatic mapping of critical environmental variables by extending spatial statistical methods and employing open, web-based data exchange protocols and visualisation tools. This paper gives an overview of the underlying problem and of the project, and discusses which problems it has solved and which open problems are most relevant to address next. The interpolation problem that INTAMAP solves is the generic problem of spatial interpolation of environmental variables without user interaction, based on measurements of, for example, PM10, rainfall, or gamma dose rate, at arbitrary locations or over a regular grid covering the area of interest. It deals with the varying spatial resolution of measurements, the interpolation of averages over larger areas, and the provision of information on the interpolation error to the end-user. In addition, monitoring network optimisation is addressed in a non-automatic context.
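The project itself extends proper spatial statistical methods; the fragment below is only a minimal inverse-distance-weighting sketch of the generic task described above, interpolating scattered measurements onto a regular grid without user interaction. The function name, grid size, and PM10-like sample values are illustrative assumptions, not part of INTAMAP.

```python
# Minimal sketch: inverse-distance-weighted interpolation of point
# measurements onto a regular grid. This is NOT the INTAMAP method
# (the project relies on proper spatial statistics); it only illustrates
# the generic "interpolate without user interaction" task.
import numpy as np

def idw_grid(xy, values, grid_x, grid_y, power=2.0, eps=1e-12):
    """Interpolate scattered (x, y) measurements onto a regular grid."""
    gx, gy = np.meshgrid(grid_x, grid_y)                 # target grid
    targets = np.column_stack([gx.ravel(), gy.ravel()])  # (M, 2)
    d = np.linalg.norm(targets[:, None, :] - xy[None, :, :], axis=2)
    w = 1.0 / (d ** power + eps)                         # inverse-distance weights
    z = (w @ values) / w.sum(axis=1)                     # weighted average per grid cell
    return z.reshape(gy.shape)

# Illustrative data: five PM10-like measurements at arbitrary locations.
obs_xy = np.array([[0.1, 0.2], [0.8, 0.3], [0.5, 0.9], [0.2, 0.7], [0.9, 0.8]])
obs_val = np.array([21.0, 35.0, 28.0, 19.0, 40.0])
surface = idw_grid(obs_xy, obs_val, np.linspace(0, 1, 50), np.linspace(0, 1, 50))
print(surface.shape)  # (50, 50) interpolated field
```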
Abstract:
Wireless sensor networks have been identified as one of the key technologies for the 21st century. They consist of tiny devices with limited processing and power capabilities, called motes, that can be deployed in large numbers to provide useful sensing capabilities. Although they are flexible and easy to deploy, a number of considerations regarding fault tolerance, energy conservation, and re-programmability need to be addressed before we can draw any substantial conclusions about the effectiveness of this technology. In order to overcome these limitations, we propose a middleware solution. The proposed scheme is based on two main methods. The first method involves the creation of a flexible communication protocol based on technologies such as mobile code/agents and Linda-like tuple spaces. In this way, every node of the wireless sensor network produces and processes data based on what is best both for itself and for the group it belongs to. The second method incorporates this protocol into a middleware that aims to bridge the gap between the application layer and low-level constructs such as the physical layer of the wireless sensor network. A fault-tolerant platform for deploying and monitoring applications in real time offers a number of possibilities to the end user, while giving them the freedom to experiment with various parameters so that the deployed applications run in an energy-efficient manner inside the network. The proposed scheme is evaluated through a number of trials aiming to test its merits under real-time conditions and to identify its effectiveness against other similar approaches. Finally, the parameters which determine the characteristics of the proposed scheme are also examined.
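A minimal sketch of the Linda-like tuple space mentioned above may help make the coordination style concrete: nodes publish tuples with out(), and other nodes read or consume matching tuples with rd() and in(), using None as a wildcard. This is an illustrative stand-in written for this summary, not the proposed middleware itself.

```python
# Minimal sketch of a Linda-like tuple space: out() publishes a tuple,
# rd() reads a matching tuple without removing it, in_() removes it.
# None acts as a wildcard in patterns. Illustrative stand-in only.
import threading

class TupleSpace:
    def __init__(self):
        self._tuples = []
        self._cond = threading.Condition()

    def out(self, tup):
        with self._cond:
            self._tuples.append(tuple(tup))
            self._cond.notify_all()

    def _match(self, pattern, tup):
        return len(pattern) == len(tup) and all(
            p is None or p == v for p, v in zip(pattern, tup))

    def _take(self, pattern, remove):
        with self._cond:
            while True:
                for t in self._tuples:
                    if self._match(pattern, t):
                        if remove:
                            self._tuples.remove(t)
                        return t
                self._cond.wait()   # block until a matching tuple arrives

    def rd(self, pattern):
        return self._take(pattern, remove=False)

    def in_(self, pattern):
        return self._take(pattern, remove=True)

# Usage: one mote publishes a reading, another node consumes it.
ts = TupleSpace()
ts.out(("temperature", "mote-7", 23.5))
print(ts.in_(("temperature", None, None)))  # ('temperature', 'mote-7', 23.5)
```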
Abstract:
As mobile devices become increasingly diverse and continue to shrink in size and weight, their portability is enhanced but, unfortunately, their usability tends to suffer. Ultimately, the usability of mobile technologies determines their future success in terms of end-user acceptance and, thereafter, adoption and social impact. Widespread acceptance will not, however, be achieved if users’ interaction with mobile technology amounts to a negative experience. Mobile user interfaces need to be designed to meet the functional and sensory needs of users. Social and Organizational Impacts of Emerging Mobile Devices: Evaluating Use focuses on human-computer interaction related to the innovation and research in the design, evaluation, and use of innovative handheld, mobile, and wearable technologies in order to broaden the overall body of knowledge regarding such issues. It aims to provide an international forum for researchers, educators, and practitioners to advance knowledge and practice in all facets of design and evaluation of human interaction with mobile technologies.
Abstract:
Wireless sensor networks have been identified as one of the key technologies for the 21st century. In order to overcome their limitations, such as fault tolerance and energy conservation, we propose a middleware solution, In-Motes. In-Motes is a fault-tolerant platform for deploying and monitoring applications in real time; it offers a number of possibilities to the end user, while giving them the freedom to experiment with various parameters so that the deployed applications run in an energy-efficient manner inside the network. The proposed scheme is evaluated through In-Motes EYE, an agent-based real-time In-Motes application developed for sensing acceleration variations in an environment, in order to test its merits under real-time conditions. The application was tested in a prototype road-like area for a period of four months.
Abstract:
The use of spreadsheets has become routine in all aspects of business, with usage growing across a range of functional areas and a continuing trend towards end-user spreadsheet development. However, several studies have raised concerns about the accuracy of spreadsheet models in general, and of end-user developed applications in particular, raising the risk element for users. High error rates have been discovered, even though the users/developers were confident that their spreadsheets were correct. The lack of an easy-to-use, context-sensitive validation methodology has been highlighted as a significant contributor to the problems of accuracy. This paper describes experiences in using a practical, contingency factor-based methodology for the validation of spreadsheet-based decision support systems (DSS). Because the end user is often both the system developer and a stakeholder, the contingency factor-based validation methodology may need to be used in more than one way. The methodology can also be extended to encompass other DSS.
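As an illustration of the contingency-factor idea, the sketch below scores a few hypothetical factors of a spreadsheet DSS and maps the total to a recommended level of validation rigour. The factor names, weights, and thresholds are assumptions made for illustration; they are not taken from the paper's methodology.

```python
# Illustrative sketch only: map hypothetical contingency factors of a
# spreadsheet DSS to a recommended level of validation rigour. Factor
# names, weights, and thresholds are assumptions, not the paper's list.
FACTOR_WEIGHTS = {
    "decision_impact": 3,       # consequence of a wrong answer
    "model_complexity": 2,      # formulas, links, macros
    "developer_experience": 2,  # end-user developer's skill (reverse-scored)
    "reuse_by_others": 1,       # will other stakeholders rely on it?
}

def validation_level(scores):
    """scores: dict of factor -> 1 (low) .. 5 (high)."""
    total = 0
    for factor, weight in FACTOR_WEIGHTS.items():
        s = scores.get(factor, 3)
        if factor == "developer_experience":
            s = 6 - s   # more experience lowers the required rigour
        total += weight * s
    if total >= 30:
        return "full independent validation (separate tester, test oracle)"
    if total >= 20:
        return "structured peer review plus cell-by-cell inspection"
    return "self-check against documented test cases"

print(validation_level({"decision_impact": 5, "model_complexity": 4,
                        "developer_experience": 2, "reuse_by_others": 3}))
```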
Abstract:
In this article we envision the factors and trends that shape the next generation of environmental monitoring systems. One key factor in this respect is the combined effect of end-user needs and the general development of IT services and their availability. Currently, an environmental (monitoring) system is assumed to be reactive: it delivers measurement data and computational results only if the user explicitly asks for them, either by query or by subscription. There is a temptation to automate this by simply pushing data to end-users. This, however, easily leads to an "advertisement strategy", where data is pushed to end-users regardless of their needs. Under this strategy, the sheer amount of received data obscures the individual messages; any "automatic" service, regardless of its fitness, overruns a system that requires the user's initiative. The foreseeable problem is that, unless there is some overall management, each new environmental service will compete for end-users' attention and thus inadvertently hinder the use of existing services. As the main contribution, we investigate the nature of proactive environmental systems and how they should be designed to avoid this problem. We also discuss how semantics, participatory sensing, uncertainty management, and situational awareness link to proactive environmental systems. We illustrate our proposals with some real-life examples.
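A small sketch may make the reactive/proactive distinction concrete: instead of pushing every new observation (the "advertisement strategy"), a proactive gate pushes only observations that match a user's declared context. The variables, thresholds, and user profile below are illustrative assumptions, not the article's actual design.

```python
# Sketch of the "proactive" idea: push only when an observation is relevant
# to a user's declared context, rather than pushing everything. Variables,
# thresholds, and profiles are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Observation:
    variable: str      # e.g. "pm10"
    value: float
    region: str        # e.g. "city-centre"

@dataclass
class UserProfile:
    regions: set       # regions the user cares about
    thresholds: dict   # variable -> alert threshold

def should_notify(obs: Observation, user: UserProfile) -> bool:
    """Proactive gate: notify only if the observation lies in the user's
    area of interest AND exceeds the user's threshold for that variable."""
    if obs.region not in user.regions:
        return False
    limit = user.thresholds.get(obs.variable)
    return limit is not None and obs.value >= limit

commuter = UserProfile(regions={"city-centre"}, thresholds={"pm10": 50.0})
print(should_notify(Observation("pm10", 63.0, "city-centre"), commuter))  # True
print(should_notify(Observation("pm10", 12.0, "city-centre"), commuter))  # False
```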
Abstract:
Over recent years, hub-and-spoke distribution techniques have attracted widespread research attention. Despite a growing body of literature in this area, there is less focus on the spoke-terminal element of the hub-and-spoke system as a key component in the overall service received by the end-user. The current literature is highly geared towards bulk optimization of freight units rather than the more discrete and individualistic profile characteristics of shared-user less-than-truckload (LTL) freight. In this paper, a literature review is presented examining the role hub-and-spoke systems play in meeting multi-profile customer demands, particularly in developing sectors with more sophisticated needs, such as retail. The paper also considers the use of simulation technology as a suitable tool for analyzing spoke-terminal operations within developing hub-and-spoke systems.
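In the spirit of the simulation approach mentioned above, the sketch below models a spoke terminal with a limited number of dock doors serving arriving LTL vehicles as a small discrete-event simulation (using the simpy library). Arrival rates, dock counts, and handling times are illustrative assumptions, not data from the paper.

```python
# Small discrete-event sketch of a spoke terminal, written with simpy.
# LTL vehicles arrive at random, queue for one of a few dock doors, and
# are cross-docked. All rates and times are illustrative assumptions.
import random
import simpy

DOCK_DOORS = 2          # unloading bays at the spoke terminal
MEAN_ARRIVAL = 15.0     # minutes between LTL vehicle arrivals
MEAN_HANDLING = 25.0    # minutes to cross-dock one vehicle's freight

def vehicle(env, docks, waits):
    arrived = env.now
    with docks.request() as req:      # queue for a free dock door
        yield req
        waits.append(env.now - arrived)
        yield env.timeout(random.expovariate(1.0 / MEAN_HANDLING))

def arrivals(env, docks, waits):
    while True:
        yield env.timeout(random.expovariate(1.0 / MEAN_ARRIVAL))
        env.process(vehicle(env, docks, waits))

random.seed(42)
env = simpy.Environment()
docks = simpy.Resource(env, capacity=DOCK_DOORS)
waits = []
env.process(arrivals(env, docks, waits))
env.run(until=8 * 60)                 # simulate one 8-hour shift
print(f"vehicles handled: {len(waits)}, mean wait: {sum(waits) / len(waits):.1f} min")
```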
Abstract:
Linked Data semantic sources, in particular DBpedia, can be used to answer many user queries. PowerAqua is an open multi-ontology Question Answering (QA) system for the Semantic Web (SW). However, the emergence of Linked Data, characterized by its openness, heterogeneity and scale, introduces a new dimension to the Semantic Web scenario, in which exploiting the relevant information to extract answers to Natural Language (NL) user queries is a major challenge. In this paper we discuss the issues and lessons learned from our experience of integrating PowerAqua as a front-end for DBpedia and a subset of Linked Data sources. As such, we go one step beyond the state of the art in end-user interfaces for Linked Data by introducing the mapping and fusion techniques needed to translate a user query across multiple sources. Our first informal experiments probe whether, in fact, it is feasible to obtain answers to user queries by composing information across semantic sources and Linked Data, even in its current form, where the strength of Linked Data is more a by-product of its size than of its quality. We believe our experiences can be extrapolated to a variety of end-user applications that wish to scale, open up, exploit and re-use what is possibly the greatest wealth of data about everything in the history of Artificial Intelligence. © 2010 Springer-Verlag.
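To make the mapping problem concrete: a natural-language question such as "Which films did Stanley Kubrick direct?" ultimately has to be translated into a structured query over a source like DBpedia. The fragment below issues such a query by hand with SPARQLWrapper; it illustrates only the target form of that mapping, not PowerAqua's actual mapping and fusion pipeline, and endpoint availability may vary.

```python
# Hand-written SPARQL query over DBpedia via SPARQLWrapper. This shows the
# kind of structured query a natural-language question must be mapped to;
# it is not PowerAqua's pipeline. Results depend on the live DBpedia service.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://dbpedia.org/sparql")
sparql.setQuery("""
    PREFIX dbo: <http://dbpedia.org/ontology/>
    PREFIX dbr: <http://dbpedia.org/resource/>
    SELECT DISTINCT ?film WHERE {
        ?film dbo:director dbr:Stanley_Kubrick .
    } LIMIT 10
""")
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for binding in results["results"]["bindings"]:
    print(binding["film"]["value"])   # e.g. http://dbpedia.org/resource/...
```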
Abstract:
We introduce ReDites, a system for real-time event detection, tracking, monitoring and visualisation. It is designed to assist information analysts in understanding and exploring complex events as they unfold in the world. Events are automatically detected from the Twitter stream. Those categorised as security-relevant are then tracked, geolocated, summarised and visualised for the end-user. Furthermore, the system tracks changes in emotion over events, signalling possible flashpoints or abatement. We demonstrate the capabilities of ReDites using an extended use case from the September 2013 Westgate shooting incident. Through an evaluation of system latencies, we also show that enriched events are made available for users to explore within seconds of the event occurring.
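As a toy illustration of the first detection stage, the sketch below flags a term as "bursting" when its frequency in the current time window greatly exceeds its recent average. It is not the ReDites pipeline (which adds tracking, geolocation, summarisation, and emotion analysis), and the posts and thresholds shown are invented for illustration.

```python
# Toy first-stage event detection: a term "bursts" when its count in the
# current window greatly exceeds its recent average. Not the ReDites
# pipeline; posts and thresholds are illustrative.
from collections import Counter, deque

class BurstDetector:
    def __init__(self, history_windows=6, burst_ratio=3.0, min_count=5):
        self.history = deque(maxlen=history_windows)  # past window counts
        self.burst_ratio = burst_ratio
        self.min_count = min_count

    def process_window(self, posts):
        """posts: list of token lists for one time window. Returns bursting terms."""
        counts = Counter(tok for post in posts for tok in post)
        bursts = []
        for term, c in counts.items():
            past = [w.get(term, 0) for w in self.history]
            baseline = (sum(past) / len(past)) if past else 0.0
            if c >= self.min_count and c > self.burst_ratio * max(baseline, 1.0):
                bursts.append(term)
        self.history.append(counts)
        return bursts

detector = BurstDetector()
quiet = [["morning", "coffee"], ["traffic", "update"]]
spike = [["explosion", "downtown"]] * 8 + [["breaking", "explosion"]] * 4
print(detector.process_window(quiet))   # [] (nothing bursting yet)
print(detector.process_window(spike))   # ['explosion', 'downtown']
```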
Abstract:
The world is connected by a core network of long-haul optical communication systems that link countries and continents, enabling long-distance phone calls, data-center communications, and the Internet. The demands on information rates have been constantly driven up by applications such as online gaming, high-definition video, and cloud computing. All over the world, end-user connection speeds are being increased by replacing conventional digital subscriber line (DSL) and asymmetric DSL (ADSL) with fiber to the home. Clearly, the capacity of the core network must also increase proportionally. © 1991-2012 IEEE.
Abstract:
Energy service companies (ESCOs) are faced with a range of challenges and opportunities associated with the rapidly changing and flexible requirements of energy customers (end users) and with rapid improvements in the technologies associated with energy and ICT. These opportunities for innovation include better prediction of energy demand, transparency of data to the end user, flexible and time-dependent energy pricing, and a range of novel finance models. The liberalisation of energy markets across the world has led to a very small price differential between suppliers on the unit cost of energy. Energy companies are therefore looking to add additional layers of value using service models borrowed from the manufacturing industry. This opens a range of new product and service offerings to energy markets and consumers, and has implications for the overall efficiency, utility and price of energy provision.
Conceptual Model and Security Requirements for DRM Techniques Used for e-Learning Objects Protection
Abstract:
This paper deals with the security problems of DRM-protected e-learning content. After a short review of the main DRM systems and methods used in e-learning, an examination is made of the participants in DRM schemes (the e-learning object author, content creator, content publisher, license creator and end user). A conceptual model of the security-related processes of DRM implementation is then proposed, and subsequently refined to reflect particularities in the DRM protection of e-learning objects. A methodical approach is used to describe the security-related motives, responsibilities and goals of the main participants involved in the DRM system. Taken together with the process model, these security properties are used to establish a list of requirements to fulfil and to enable formal verification of real DRM systems' compliance with these requirements.
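A small sketch of the participant roles named above, together with a checklist-style compliance check, may illustrate how such a requirements list could be checked mechanically. The roles follow the abstract; the example requirements are placeholders, not the paper's actual list.

```python
# Sketch: the abstract's participant roles as a small data model, plus a
# checklist-style compliance check. Example requirements are placeholders.
from dataclasses import dataclass, field
from enum import Enum

class Role(Enum):
    OBJECT_AUTHOR = "e-learning object author"
    CONTENT_CREATOR = "content creator"
    CONTENT_PUBLISHER = "content publisher"
    LICENSE_CREATOR = "license creator"
    END_USER = "end user"

@dataclass
class Requirement:
    description: str
    responsible: Role

@dataclass
class DRMSystemModel:
    satisfied: set = field(default_factory=set)   # requirement descriptions met

    def check(self, requirements):
        """Return the requirements this DRM system does not yet satisfy."""
        return [r for r in requirements if r.description not in self.satisfied]

requirements = [
    Requirement("license bound to authenticated end user", Role.LICENSE_CREATOR),
    Requirement("content encrypted between publisher and end user", Role.CONTENT_PUBLISHER),
    Requirement("authorship metadata preserved in the protected object", Role.OBJECT_AUTHOR),
]

system = DRMSystemModel(satisfied={"license bound to authenticated end user"})
for missing in system.check(requirements):
    print(f"unmet ({missing.responsible.value}): {missing.description}")
```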
Abstract:
A real-time adaptive resource allocation algorithm that considers the end user's Quality of Experience (QoE) in the context of video streaming services is presented in this work. An objective no-reference quality metric, namely Pause Intensity (PI), is used to control the priority of resource allocation to users during the scheduling process. An online adjustment has been introduced to adaptively set the scheduler's parameter and maintain a desired trade-off between fairness and efficiency. The correlation between the data rates (i.e. video code rates) demanded by users and the data rates allocated by the scheduler is taken into account as well. The final allocated rates are determined based on the channel status, the distribution of PI values among users, and the scheduling policy adopted. Furthermore, since the user's capability varies as the environmental conditions change, the rate adaptation mechanism for video streaming is considered and its interaction with the scheduling process under the same PI metric is studied. The feasibility of implementing this algorithm is examined and the result is compared with the most common existing scheduling methods.
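A simplified stand-in for such a PI-aware scheduler is sketched below: in each round, resource blocks are assigned to the user with the highest product of achievable channel rate and pause intensity, capped by the rate the user's stream actually demands, with an exponent controlling the fairness/efficiency trade-off. All parameter values are illustrative; this is not the paper's algorithm.

```python
# Simplified stand-in for a PI-aware scheduler: resource blocks go to the
# user with the highest channel-rate x PI priority until that user's
# demanded video rate is met. Parameters are illustrative only.
import numpy as np

def allocate_blocks(channel_rate, pause_intensity, demanded_rate, n_blocks, alpha=1.0):
    """
    channel_rate[u][b]: rate user u would get on resource block b
    pause_intensity[u]: current PI of user u (higher = worse QoE, more priority)
    demanded_rate[u]:   video code rate requested by user u
    alpha:              exponent trading fairness against efficiency
    Returns (block -> user assignment, achieved rate per user).
    """
    n_users = len(pause_intensity)
    achieved = np.zeros(n_users)
    assignment = {}
    for b in range(n_blocks):
        # Priority: channel quality weighted by QoE need, zeroed once demand is met.
        need = demanded_rate > achieved
        priority = channel_rate[:, b] * (pause_intensity ** alpha) * need
        if priority.max() <= 0:          # everyone already satisfied on this block
            continue
        u = int(np.argmax(priority))
        assignment[b] = u
        achieved[u] += channel_rate[u, b]
    return assignment, achieved

rng = np.random.default_rng(0)
rates = rng.uniform(0.5, 2.0, size=(3, 10))      # 3 users, 10 blocks (Mbit/s per block)
pi = np.array([0.6, 0.2, 0.4])                   # user 0 is pausing the most
demand = np.array([6.0, 6.0, 6.0])
assign, achieved = allocate_blocks(rates, pi, demand, n_blocks=10)
print(achieved)   # user 0 is served first despite similar channel conditions
```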